[Eeglablist] Out of memory when running ICA

Stephen Politzer-Ahles politzerahless at gmail.com
Tue Jul 26 16:44:48 PDT 2011


Hello,

I am getting an "out of memory" error when running ICA decomposition (both
with runica and binica) on some epoched data. This is confusing me because I
didn't get this error when running ICA recently on data that was nearly
identical and larger. To be specific: a few months ago I successfully ran
ICA on some epoched Neuroscan data that I had imported into EEGLAB (each
dataset consisted of around 120-160 sweeps of 3 seconds each, at a sampling
rate of 1000 Hz, with 36 channels, thus about 3000 frames per epoch and
usually about 50-60 MB per dataset). Today I started running an identical
analysis, on the same computer, using different Neuroscan epoched data (the
epoched files were actually obtained from the same continuous (.cnt) data as
the previous ones, just selected on different event markers) and processed
with the same protocol. The new datasets are actually a bit smaller (fewer
trials per dataset), yet I got the "out of memory" error. ICA still works on
the old data, so it can't just be a problem with this computer. Can anyone
think of anything other than dataset size and number of channels that might
affect the memory required for ICA and cause this problem?
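For reference, here is a back-of-the-envelope estimate of the raw data matrix size for the datasets described above. This is only a sketch, written in Python for illustration (EEGLAB itself runs in MATLAB), and it assumes the data are held in memory as double precision (8 bytes per sample) during ICA, which is typical but not guaranteed; the channel, frame, and epoch counts are the approximate figures from the message:

```python
def ica_memory_mb(n_channels, frames_per_epoch, n_epochs, bytes_per_sample=8):
    """Approximate size in MB of the continuous data matrix handed to ICA.

    Assumes one value per channel per frame; bytes_per_sample=8 corresponds
    to double precision, 4 to single precision.
    """
    return n_channels * frames_per_epoch * n_epochs * bytes_per_sample / 1e6

# Hypothetical numbers from the message: 36 channels, ~3000 frames/epoch,
# ~150 epochs.
print(f"double precision: ~{ica_memory_mb(36, 3000, 150):.0f} MB")
print(f"single precision: ~{ica_memory_mb(36, 3000, 150, 4):.0f} MB")
```

The single-precision figure (~65 MB) roughly matches the stated 50-60 MB per dataset on disk, while ICA in double precision needs about twice that for the data alone, before any working copies the algorithm makes, so differences in precision or in temporary copies, not just trial count, can change the peak memory footprint.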

Thank you,
Stephen Politzer-Ahles

-- 
Stephen Politzer-Ahles
University of Kansas
Linguistics Department
http://www.linguistics.ku.edu/