[Eeglablist] Using large datasets in EEGLab

Patrick Craston pc52 at kent.ac.uk
Sat Nov 19 06:21:47 PST 2005


Hi,

I am trying to load and merge rather large datasets (about 100 MB each) 
using EEGLAB. Currently I have four datasets to merge into one, but this 
number will increase as my EEG experiments progress.

The problem is that I get an 'Out of memory' error whenever I try to 
merge the datasets.
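
For reference, the merge step I am running looks roughly like the 
sketch below (the file names and path are only placeholders for my 
actual datasets):

  % load the four individual datasets into ALLEEG
  [ALLEEG EEG CURRENTSET ALLCOM] = eeglab;   % start EEGLAB
  for k = 1:4
      % placeholder file names and path
      EEG = pop_loadset(sprintf('subject%d.set', k), 'C:\eegdata\');
      [ALLEEG EEG CURRENTSET] = eeg_store(ALLEEG, EEG);
  end

  % merge all four datasets into a single one -- this is the call
  % that produces the 'Out of memory' error
  EEG = pop_mergeset(ALLEEG, 1:4, 0);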

I am using Windows XP Pro on a 3.2 GHz PC with 1 GB of RAM.


Does anyone know of a solution? Surely MATLAB should be able to handle 
files of that size.

Or can EEGLAB be configured to compress the data during merging, for 
instance by using MATLAB's PACK command?
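
What I had in mind was something along these lines, run from the 
MATLAB command line before calling the merge (the variable name is 
just an example):

  clear EEG_unused   % example name: remove anything not needed for the merge
  pack               % consolidate MATLAB workspace memory (no actual compression)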


Thanks for your help.

Patrick

-- 
Patrick Craston
Computing Lab, University of Kent, Canterbury, CT2 7NF, UK
http://www.cs.kent.ac.uk/~pc52


