[Eeglablist] more memory

Jim Kroger jkroger at nmsu.edu
Tue Jan 31 09:38:45 PST 2006


We had a lot of trouble loading data when we began using EEGLAB. My 
student ran empirical tests on Linux, polling the "free" command 
periodically (see the sketch just below), to understand why our Opteron 
machine with 6 GB of RAM had such trouble with our 2 GB, 1.5 hour data 
files collected on a BioSemi system. We set swap to 12 GB, and she observed 
that loading a 2 GB file (128 channels) took several hours and at times 
occupied as much as 13 GB (RAM plus swap). Once the file had loaded, the 
machine settled back to using approximately 3-4 GB.
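
For the record, here is the kind of polling I mean, as a minimal sketch. 
It assumes Linux and MATLAB's system() shell escape; run it in a second 
session while the load runs elsewhere (a plain shell loop around "free" 
works just as well):

    % Poll system memory once a minute for 1.5 hours (Linux only).
    for k = 1:90
        [status, out] = system('free -m');    % RAM and swap usage, in MB
        fprintf('%s\n%s\n', datestr(now), out);
        pause(60);                            % wait between samples
    end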
We speculated that translating our .bdf files into EEGLAB's internal 
format (which keeps two copies of the data) is what demands so much 
memory; the rough arithmetic below bears that out. As soon as the load 
exceeds available RAM and begins to use swap, performance slows to a 
crawl, and loading can take overnight.
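
The numbers are at least consistent with a back-of-envelope calculation. 
This is only a sketch; the 1024 Hz sampling rate is an assumption on my 
part (it is what makes a 128-channel, 1.5 hour BDF file come out near 
2 GB):

    nchan = 128; srate = 1024; secs = 1.5*3600;  % assumed recording layout
    on_disk    = nchan*srate*secs*3 / 2^30       % 24-bit BDF ints: ~2.0 GB
    one_copy   = nchan*srate*secs*8 / 2^30       % one copy as doubles: ~5.3 GB
    two_copies = 2*one_copy                      % EEGLAB's duplicate: ~10.5 GB

Two double-precision copies plus the raw file buffer during conversion 
gets you into the neighborhood of the 13 GB peak we observed, and a single 
copy at lower precision, plus MATLAB's own overhead, is in the 3-4 GB 
range we saw afterward.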
Chopping 15 minutes of data off the file brought it down to a size that 
loads in a couple of minutes using RAM alone. EEGLAB can combine multiple 
segments (.set files), so one strategy is to load the recording in smaller 
pieces; a sketch of this follows below. The catch is that if you process 
the merged result, save it, and later reopen it, you face the same memory 
problem again. Another student is working on a utility that loads and 
saves data in segments, to circumvent the memory issue permanently.
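
In the meantime, here is a minimal sketch of the segment-loading strategy. 
It assumes the BIOSIG-based importer (pop_biosig) and its 'blockrange' 
option, which reads only the given time range (in seconds) of the file; 
the file names are placeholders:

    % Import a long BDF file in two halves, then merge and save.
    [ALLEEG, EEG, CURRENTSET] = eeglab;   % start EEGLAB, set up structures
    EEG1 = pop_biosig('subject01.bdf', 'blockrange', [0 2700]);    % 1st 45 min
    EEG2 = pop_biosig('subject01.bdf', 'blockrange', [2700 5400]); % 2nd 45 min
    EEG  = pop_mergeset(EEG1, EEG2);      % concatenate the two datasets
    EEG  = pop_saveset(EEG, 'filename', 'subject01_merged.set');

Note that the merge itself still needs memory for both halves at once, 
and, as said above, reopening the saved result brings back the original 
problem.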

So in short, it's not surprising that many people report trouble loading 
data that, judging by the machines and datasets they describe, seems like 
it should load easily. In the end you need more RAM, or a very large swap 
partition, in which case loads will be slow.

All of this refers to Linux; I don't know how closely these issues carry 
over to other OSs.

Jim



At 09:11 AM 1/30/2006, Stefano Seri wrote:
>Still on the same topic of memory allocation. This time the variation on 
>the theme is OS X.
>Can't quite come to terms with the fact that in spite of a Mac quad (2 
>dual-core PowerPCs at 2.5 GHz) and 5 GB RAM, under Matlab 7.1 EDF files 
>greater than 200 MB can't be read into EEGLab.
>Can't believe that the OS cannot find approximately 800 MB of contiguous 
>memory to read the file into.
>
>Any feedback on this?
>
>Stefano Seri


--------------------------------------------
Jim Kroger
NMSU Psychology MSC 3452
220 Science Hall, Williams Street
Las Cruces,  NM 88003-8001
USA
http://www.psych.nmsu.edu/~jkroger/lab/index.html
Tel:  (505) 646 2243
Fax: (505) 646 6212
--------------------------------------------




