[Eeglablist] RAM usage when loading and saving (certain) EEGLAB files
Tarik S Bel-Bahar
tarikbelbahar at gmail.com
Tue Apr 25 13:20:25 PDT 2017
Thanks for your reply Armand. Sorry, I'm not an expert in matlab.
Overall it seems like you don't have a "block" but rather you're trying to
speed up loading/saving.
We might hear from some other users who are more savvy about these issues.
A few last notes below.
1. True, I don't see how a good export from mff to .set/.fdt could be
messing you up. I guess it depends on exactly how the mff is saved (I assume
it's all in matlab, so one can review the saving code). I'll assume you're using
the eeglab MFF import function, but you might be using something else.
2. That should not be critical, just a suggestion based on my preferences.
4. Just try doing some steps from the GUI, and review eegh, which spits
out the commands that were just run.
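For instance (a minimal sketch; the dataset name and path here are hypothetical), after performing a step through the GUI you can dump the equivalent commands:

```matlab
% After performing steps in the EEGLAB GUI, print their command-line equivalents:
eegh

% Typical output looks something like (filenames hypothetical):
% EEG = pop_loadset('filename', 'mysubject.set', 'filepath', '/data/eeg/');
% [ALLEEG, EEG, CURRENTSET] = eeg_store(ALLEEG, EEG, CURRENTSET);
```

You can then copy those printed lines into your own scripts.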
It's unusual for an eeglab user to not have worked with the GUI at all, as it
is required for the online tutorial. [If you haven't had a chance to,
there's a lot to learn about the EEG structure and scripting in the online
tutorials.]
*To explore different saving options, you want to check the available flags
within the functions that you are using (open NAMEofFUNCTION, help
NAMEofFUNCTION, or doc NAMEofFUNCTION). The larger files might be hitting
some limit and/or saving in a different way due to their size. Alternatively,
within the functions, for example pop_saveset, find the line that actually
saves the file, where one could add a different matlab flag. See also the
online matlab documentation on -v7.3 saving, etc.
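As a sketch of that last point (the file name is hypothetical, and the exact save line inside pop_saveset may differ across EEGLAB versions):

```matlab
% Inspect the flags a function accepts:
help pop_saveset
open pop_saveset   % locate the low-level save() call inside the function

% MATLAB's default -v7 MAT format cannot store variables larger than 2 GB;
% the -v7.3 (HDF5-based) format removes that limit, though it can be slower
% and produce larger files:
save('bigdata.set', 'EEG', '-mat', '-v7.3');
```

If the large files are silently falling back to a different storage format because of their size, that could explain why only those files behave differently.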
*Check the single/double precision of the files that are giving you
problems. A conversion to double precision might be occurring, but I'm not sure
it would impact your loading problem.
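A quick way to check (a sketch, assuming a recent EEGLAB version with the standard memory options):

```matlab
% Check the precision and memory footprint of the data matrix:
class(EEG.data)   % returns 'single' or 'double'
whos EEG          % shows bytes used in the workspace

% EEGLAB can be told to keep data in single precision:
pop_editoptions('option_single', 1);

% Or convert an already-loaded dataset back to single:
EEG.data = single(EEG.data);
```

A double-precision copy alone doubles memory use, so this is worth ruling out before looking for leaks elsewhere.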
*Try a colleague's PC with lower RAM, and let it sit for a while. Of course
you could try a range of setups.
On Tue, Apr 25, 2017 at 3:57 PM, Armand Mensen <research.mensen at gmail.com> wrote:
> Thanks for the response Tarik!
> In response to your suggestions:
> 1. Indeed the ".raw" format seems to run much smoother, as it's what I used to
> do. But this requires access to a computer with NetStation all the time to
> convert, and loses some basic meta-data, so that's why I preferred to
> convert directly from the mff, and then only have to store the mff files as
> a backup plus the converted EEGLAB file. However, the conversion from mff,
> despite also being relatively slow, does run well and I can process the
> data afterwards. If this is specific to the mff file conversion, how could
> it be that even once converted to fdt/set, it still has some properties
> that make it slow to load afterwards?
> 2. I really like the fdt/set way of saving things, because throughout my
> analysis I add a bunch of custom information within the .set file and like
> to retain the ability to load just the set file to access this meta
> information without having to load all the actual data. Note that this is
> not happening yet when I have the problem with the memory.
> 3. Indeed, the conversion from mff seems to work, and I can explore all
> the data without any noticeable problem with the data (except for the crazy
> memory usage).
> 4. I always work from the command line and never the GUI, so only ever
> have a single EEG struct in the workspace at any time.
> Re the "naive suggestions"
> - How could I explore different matlab saving options given that I only
> use the native EEGLAB pop_saveset() and pop_loadset() functions?
> - All single precision, saved and loaded. Could it be there is a double
> conversion happening somewhere in the loading that therefore expands all
> the data? Or is there just some needless copying of the data such that it
> uses 4x the memory?
> - Unfortunately I don't have access to multiple 64GB RAM PCs, so I
> can't really attempt the conversion etc. on another PC... but I might try a
> colleague's PC if it comes down to that.
> Hope that extra info provided can help me figure this out.
> Thanks again!
> On 25 April 2017 at 21:40, Tarik S Bel-Bahar <tarikbelbahar at gmail.com> wrote:
>> Hello Armand, this may be something about your setup, though you have
>> other large files that don't seem to have this problem.
>> Also, it's not clear what "slower" means in your case. If it's a matter
>> of ~10 to 20 minutes to load for a giant file, that might be appropriate.
>> I'd recommend you check in what other circumstances (with matlab)
>> similar "leaks" occur, so as to make sure it's not something about your PCs.
>> I've worked with ~5GB, 500 Hz files from ~7-hour recordings and have had
>> no issues except slightly slower loading for multi-GB files (with adequate
>> RAM and CPU).
>> Here are some notes below that might be of use to you. Please let the
>> list know if you achieve clarity on this topic.
>> You may also google eeglablist and your topic for previous similar posts
>> about memory usage or memory "leaks".
>> 1. You may want to export the files as .raw if you have not, and then
>> import them.
>> 2. You may want to not use .fdt/.set but just .set [change this in the eeglab options].
>> 3. There may be issues with straight loading of mff format into matlab,
>> though it seems you converted them correctly first.
>> Try step 1 and see if it makes any difference.
>> 4. You may want to check whether you've changed the eeglab options to hold
>> only the current file in memory, rather than all files. Check out all the
>> eeglab options you can set from the GUI.
>> naive extra suggestions:
>> this might have to do with the format that matlab saved the files in
>> (e.g., the -v7.3 flag).
>> this might be due to single versus double precision.
>> this might have to do with a unique setup on your PCs.