[Eeglablist] Using large datasets in EEGLab
elinmel at stud.ntnu.no
Wed Nov 23 09:17:56 PST 2005
I've run into the same problem ("Out of Memory" error message) when
trying to run ICA on a file with 128 channels. I then "asked" Matlab
about the problem, and got the answer that I could use the PACK
command. My problem is that I don't know where to use this command: is
it in Matlab (it didn't work by typing it...) or is it in Eeglab? Does
anyone know how to do this?
What I did find in Eeglab was a function under "File" named "Maximize
memory". I tried this, but it didn't help!
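(For what it's worth, `pack` is a Matlab command, not an Eeglab menu item: it is typed at the Matlab command prompt and consolidates fragmented workspace memory by writing variables to disk and reloading them. A minimal sketch, assuming you run it from the ">>" prompt with Eeglab's variables in the base workspace:)

```matlab
% Type these at the Matlab command prompt (">>"), not in the Eeglab GUI.
clear functions   % optionally free memory held by loaded functions
pack              % consolidate workspace memory (saves vars to disk, reloads)
```

Note that `pack` only defragments memory already in use; it cannot make Matlab address more RAM than the OS gives it, so it may not cure a genuine out-of-memory condition.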
Is there anyone out there who knows how much memory is needed to run ICA
on 128- and 256-channel data? The datasets are from 120,000 kB to
400,000 kB. I'm using Windows XP on a 1.5 GHz portable PC with 1.2 GB of RAM.
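(A rough back-of-the-envelope check is possible here. Eeglab's ICA works on the data in double precision, i.e. 8 bytes per sample, and the algorithm needs working copies on top of the raw data. A sketch, with hypothetical recording parameters and an assumed ~3x overhead factor:)

```matlab
% Rough memory estimate for ICA -- the sampling rate, duration, and the
% 3x overhead factor below are illustrative assumptions, not measurements.
nchans  = 128;
srate   = 250;                  % hypothetical sampling rate (Hz)
minutes = 30;                   % hypothetical recording length
npnts   = srate * 60 * minutes;
bytes   = nchans * npnts * 8;   % one double-precision copy of the data
fprintf('One copy: %.0f MB; with ~3x working copies: %.0f MB\n', ...
        bytes/2^20, 3*bytes/2^20);
```

By that arithmetic, a 400,000 kB dataset already approaches or exceeds 1.2 GB of RAM once working copies are counted, which would explain the failure on this machine.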
Quoting Patrick Craston <pc52 at kent.ac.uk>:
> I am trying to load and merge rather large datasets (about 100 MB)
> using EEGLab. Currently I have 4 datasets that are to be merged into
> one, but this number will increase with the progress of my EEG
> experiment.
> The problem is I get an 'Out of Memory' error message whenever I try
> to merge the datasets.
> I am using Windows XP Pro on a 3.2GHz PC with 1GB of RAM.
> Does anyone know of a solution? Surely Matlab should be able to
> handle files of that size.
> Or can EEGLab be configured to compress data during merging, for
> instance using PACK?
> Thanks for your help.
> Patrick Craston
> Computing Lab, University of Kent, Canterbury, CT2 7NF, UK
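(One workaround worth noting for the merging question above: doing the merge from the Matlab command line with Eeglab's `pop_mergeset` function, rather than through the menu, avoids some of the duplicate dataset copies the GUI keeps. A sketch, assuming the four datasets are already loaded into the standard ALLEEG structure array; the indices are illustrative:)

```matlab
% Merge datasets 1-4 (hypothetical indices) from the command line.
% ALLEEG is Eeglab's array of loaded datasets; EEG is the current set.
EEG = pop_mergeset(ALLEEG, [1 2 3 4]);
[ALLEEG, EEG, CURRENTSET] = eeg_store(ALLEEG, EEG);  % store the merged set
eeglab redraw                                         % refresh the GUI
```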