[Eeglablist] EEGlab function to trim datasets
mmiyakoshi at ucsd.edu
Tue Feb 18 17:10:19 PST 2014
By the way, did you try turning the memory mapping option on? It slows down
the process but should address the memory issue. Let me know if it does not.
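For reference, the memory mapping option can be toggled from the command line with EEGLAB's pop_editoptions; a minimal sketch (the option name follows eeg_options.m, and the exact behavior may vary by EEGLAB version):

```matlab
% Hedged sketch: enable EEGLAB's memory-mapping option so dataset arrays
% are backed by memory-mapped files instead of being held fully in RAM.
% Assumes EEGLAB is installed and on the MATLAB path.
pop_editoptions('option_memmapdata', 1);   % trade some speed for lower RAM use
eeglab redraw;                             % restart/redraw so the option takes effect
```

The same setting is reachable through the GUI under File > Memory and other options.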
2014-02-12 14:26 GMT-08:00 Brent A. Field <bfield at princeton.edu>:
> I'm running a computationally intensive analysis on a high-end cluster,
> but have been confounded because I periodically hit a Matlab Out of Memory
> error. The input data has a high sampling rate, has a lot of channels, and
> is collected over a long period. These can be simplified at later stages,
> but the first step requires inputting roughly a 4 GB continuous data file,
> with memory requirements far beyond that to actually run the analysis. I
> won't go into it, but there is a reason why the data needs to be processed
> as one piece.
> There are other tricks I can try, but one obvious one is simply to slim
> down the input. And fortunately there is some fat I can trim out of the
> input datasets. But it seems that EEGlab deals with this by offering the
> option to mark data in continuous files as irrelevant, not by making a new
> copy of the dataset that contains only the data of interest. Is this
> correct?
> Obviously I can write my own function to remove irrelevant sections from
> the dataset, but I wanted to check first that there wasn't already some
> function out there that chops data out of EEG datasets.
> Thanks for any thoughts on this matter!
> Brent Field
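For what it's worth, EEGLAB's pop_select can do exactly this: it returns a new EEG structure containing only the retained data, rather than just marking spans as rejected. A minimal sketch (the filenames and time windows below are illustrative, not from the original post):

```matlab
% Hedged sketch: trim a continuous EEGLAB dataset down to the segments of
% interest with pop_select, producing a new, smaller EEG structure.
% Filenames and time windows are placeholders for illustration only.
EEG = pop_loadset('filename', 'bigdata.set');        % load the full dataset
EEG = pop_select(EEG, 'time', [0 600; 1200 1800]);   % keep only these windows (seconds)
% ...or remove an unwanted span instead of listing what to keep:
% EEG = pop_select(EEG, 'notime', [600 1200]);
EEG = pop_saveset(EEG, 'filename', 'bigdata_trimmed.set');
```

pop_select can likewise drop channels ('channel' / 'nochannel'), which may help further if some channels are not needed at the first processing step.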
> Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html
> To unsubscribe, send an empty email to
> eeglablist-unsubscribe at sccn.ucsd.edu
> For digest mode, send an email with the subject "set digest mime" to
> eeglablist-request at sccn.ucsd.edu
Swartz Center for Computational Neuroscience
Institute for Neural Computation, University of California San Diego