[Eeglablist] RAM usage when loading and saving (certain) EEGLAB files
Tarik S Bel-Bahar
tarikbelbahar at gmail.com
Tue Apr 25 12:40:51 PDT 2017
Hello Armand, this may be something about your setup, though you do have large
files that don't seem to have this problem.
Also, it's not clear what "slower" means in your case. If it's a matter of
~10 to 20 minutes to load a very large file, that might be expected.
I'd recommend checking in what other circumstances (within MATLAB) similar
"leaks" occur, to make sure it's not something about your PCs' setup.
I've worked with ~5 GB, 500 Hz files from ~7-hour recordings and have had no
issues except slightly slower loading for multi-GB files (with adequate RAM
and CPU).
Here are some notes below that might be of use to you. Please let the list
know if you achieve clarity on this topic.
You may also google eeglablist and your topic for previous similar posts
about memory usage or memory "leaks".
1. You may want to export the files as .raw, if you have not already, and then
import them (see the import sketch after this list).
2. You may want to save datasets as a single .set file rather than .set plus
.fdt [change this in the EEGLAB memory options; see the options sketch after
this list].
3. There may be issues with loading the .mff format straight into MATLAB,
though it sounds like you converted the files correctly first.
Try step 1 and see if it makes any difference.
4. You may want to check whether you've set the EEGLAB options to hold only
the current dataset in memory, rather than all of them (also covered in the
options sketch after this list). Check out all the EEGLAB options you can set
from the GUI.
naive extra suggestions:
This might have to do with the MAT-file format MATLAB saved the files in
(e.g., the -v7.3 flag); see the header-check sketch below.
This might be due to single versus double precision data storage (see the
precision sketch below).
This might have to do with something unique to your PCs' setup.
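On the 7.3 flag: a single-file .set (no separate .fdt) is just a MAT file, so
you can check which MAT-file version it was written in by reading its text
header. A minimal sketch, with a placeholder file name:

setfile = 'subject01.set';               % hypothetical single-file dataset
fid = fopen(setfile, 'r');
header = fread(fid, [1 116], '*char');   % MAT files begin with a descriptive text header
fclose(fid);
disp(header)                             % e.g. "MATLAB 5.0 MAT-file ..." vs "MATLAB 7.3 MAT-file ..."

Files written in the -v7.3 (HDF5) format can be noticeably slower to load and
save than -v7 files, so this is worth ruling out.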
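And on precision: EEG.data stored as double takes twice the RAM and disk of
single. A minimal sketch of checking and converting one dataset (EEGLAB can
also do this globally via its single-precision memory option; the file names
are placeholders):

EEG = pop_loadset('filename', 'subject01.set');      % load a dataset to inspect
info = whos('EEG');
fprintf('EEG struct uses %.1f MB\n', info.bytes/1e6);
if isa(EEG.data, 'double')
    EEG.data = single(EEG.data);                     % halve the in-memory footprint
end
EEG = pop_saveset(EEG, 'filename', 'subject01_single.set');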