[Eeglablist] RAM usage when loading and saving (certain) EEGLAB files
Armand Mensen
research.mensen at gmail.com
Tue Apr 25 12:57:22 PDT 2017
Thanks for the response, Tarik!
In response to your suggestions:
1. Indeed ".raw" format seems to run much smoother as its what I used to
do. But this requires access to a computer with NetStation all the time to
convert and loses some basic meta-data, so that's why I preferred to
convert directly from the mff, and then only have to store the mff files as
a backup and the converted EEGLAB file. However, the conversion from mff,
despite also being relatively slow, does run well and I can process the
data afterwards. If this is specific to the mff file conversion, how could
it be that even once converted to fdt/set, it still has some properties
that make it slow to load afterwards.
2. I really like the fdt/set way of saving things, because throughout my
analysis I add a bunch of custom information within the .set file and like to
retain the ability to load just the .set file to access this meta-information
without having to load all the actual data. Note that this metadata-only
loading is not what I am doing at the point where the memory problem occurs.
3. Indeed, the conversion from mff seems to work, and I can explore all the
data without any noticeable problems (apart from the crazy memory usage).
4. I always work from the command line and never the GUI, so I only ever have
a single EEG struct in the workspace at any one time.
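For what it's worth, this is roughly how I have been checking the load time and
what actually ends up in memory. The file name is just a placeholder, and the
'loadmode' flag is what I believe pop_loadset accepts for reading only the
header, so correct me if that part is wrong:

    % time a full load and inspect what MATLAB is actually holding
    tic;
    EEG = pop_loadset('filename', 'myfile.set', 'filepath', pwd);
    fprintf('full load took %.1f s\n', toc);
    whos EEG   % total bytes of the struct, including EEG.data
    fprintf('EEG.data is %s: %d chan x %d pnts x %d trials\n', ...
        class(EEG.data), EEG.nbchan, EEG.pnts, EEG.trials);

    % header / meta-information only, without reading the .fdt (if supported)
    EEGinfo = pop_loadset('filename', 'myfile.set', 'loadmode', 'info');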
Re the "naive suggestions"
- How could I explore different MATLAB saving options, given that I only use
the native EEGLAB pop_saveset() and pop_loadset() functions?
- Everything is saved and loaded in single precision. Could there be a
conversion to double happening somewhere during loading that expands all the
data? Or is there some needless copying of the data, such that it uses 4x the
memory? (The check I have in mind is sketched just after this list.)
- Unfortunately, I don't have access to several PCs with 64 GB of RAM, so I
can't really attempt the conversion etc. on another machine... but I might try
a colleague's PC if it comes down to that.
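To make the precision point concrete, this is the kind of check I had in mind.
The file name is again a placeholder, and the header read at the end is just
my understanding of how MAT-files label their own save format (a .set file is
an ordinary MAT-file):

    % precision of what is actually sitting in the workspace
    fprintf('in memory: %s\n', class(EEG.data));

    % rough expected footprint: 4 bytes per sample for single, 8 for double
    nsamp = double(EEG.nbchan) * double(EEG.pnts) * double(EEG.trials);
    fprintf('expected: %.1f GB (single) vs %.1f GB (double)\n', ...
        nsamp * 4 / 1e9, nsamp * 8 / 1e9);

    % what the .set file itself contains, and in which class
    whos('-file', 'myfile.set')

    % the text header of a MAT-file states its format, e.g. "MATLAB 7.3 MAT-file"
    fid = fopen('myfile.set', 'r');
    disp(fread(fid, [1 116], '*char'));
    fclose(fid);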
I hope this extra info helps to figure it out.
Thanks again!
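P.S. In case it is useful to anyone reading along: I believe the EEGLAB options
Tarik mentions below (one .set file versus a .set/.fdt pair, single precision,
keeping only the current dataset in memory) can also be set from the command
line with pop_editoptions, roughly as follows. The option names are the ones in
my EEGLAB version, so they may well differ elsewhere:

    % save header and data together in one .set file instead of .set + .fdt
    pop_editoptions('option_savetwofiles', 0);
    % keep the data in single precision
    pop_editoptions('option_single', 1);
    % keep at most the current dataset in memory and store the rest on disk
    pop_editoptions('option_storedisk', 1);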
On 25 April 2017 at 21:40, Tarik S Bel-Bahar <tarikbelbahar at gmail.com>
wrote:
> Hello Armand, this may be something about your setup, though you have large
> files that don't seem to have this problem.
> Also, it's not clear what "slower" means in your case. If it's a matter of
> ~10 to 20 minutes to load a giant file, that might be expected.
>
> I'd recommend you check in what other circumstances (within MATLAB) similar
> "leaks" occur, so as to make sure it's not something about your PC's setup.
>
> I've worked with ~5 GB, 500 Hz files from ~7-hour recordings and have had no
> issues beyond slightly slower loading for multi-GB files (with adequate RAM
> and CPU).
> Here are some notes that might be of use to you. Please let the list know if
> you achieve clarity on this topic.
> You may also Google "eeglablist" together with your topic to find previous
> similar posts about memory usage or memory "leaks".
>
> 1. You may want to export the files as .raw, if you have not already, and
> then import them.
> 2. You may want to use just .set rather than .fdt/.set [a setting in the
> EEGLAB options].
> 3. There may be issues with loading the mff format straight into MATLAB,
> though it seems you converted the files correctly first.
> Try step 1 and see whether it makes any difference.
> 4. You may want to check whether you've changed the EEGLAB options to hold
> only the current dataset in memory, rather than all of them. Check out all
> the EEGLAB options you can set from the GUI.
>
> Naive extra suggestions:
> - This might have to do with the format MATLAB saved the files in (e.g., the
> -v7.3 flag).
> - This might be due to single versus double precision.
> - This might have to do with something unique to the setup of your PCs.