[Eeglablist] EEGlab function to trim datasets
Brent A. Field
bfield at princeton.edu
Wed Feb 12 14:26:10 PST 2014
I'm running a computationally intensive analysis on a high-end cluster, but have been stymied because I periodically hit a Matlab Out of Memory error. The input data has a high sampling rate, has a lot of channels, and is collected over a long period. These can be simplified at later stages, but the first step requires inputting roughly a 4 GB continuous data file, with a memory requirement far beyond that to actually run the analysis. I won't go into it, but there is a reason why the data needs to be processed as one block.
There are other tricks I can try, but one obvious one is just to slim down the input. And fortunately there is some fat I can trim out of the input datasets. But it seems that EEGlab deals with this by offering the option to mark data in continuous files as irrelevant, not by making a new copy of the dataset that contains only the data of interest. Is this correct?
Obviously I can write my own function to remove irrelevant sections from the dataset, but I just wanted to check that there wasn't already some function out there first that chopped data out of EEG datasets.
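For reference, the closest thing I've found so far is pop_select, which looks like it may physically excise data rather than just mark it. A sketch of what I have in mind, in case I'm misreading the docs (the filename, channel labels, time range, and sampling rate below are all placeholders):

```matlab
% Assumes EEGLAB is initialized and on the Matlab path.
EEG = pop_loadset('filename', 'mydata.set');          % load the continuous dataset
EEG = pop_select(EEG, 'notime', [0 60]);              % remove the first 60 s outright
EEG = pop_select(EEG, 'nochannel', {'EXG7','EXG8'});  % drop channels I don't need
EEG = pop_resample(EEG, 256);                         % downsample to shrink it further
EEG = pop_saveset(EEG, 'filename', 'mydata_trimmed.set');
```

If pop_select really does delete the excluded samples (inserting boundary events at the cuts) rather than flagging them, that would solve my problem; I just haven't been able to confirm that from the documentation.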
Thanks for any thoughts on this matter!
Brent Field