[Eeglablist] amount of data required for an ICA decomposition of 128 channels

nishant seth nishant.sth at gmail.com
Wed Jun 8 23:30:40 PDT 2011


Hi,

I'm running ICA on a dataset with 128 electrodes. Because I have RAM
constraints (8 GB), I wanted to know how much data is required for a
'good' ICA decomposition.

I know the questions I'm asking don't have precise answers, but a rough
guess from anyone with experience in EEG analysis would be better than my
own guess.

The wiki chapter on decomposing data using ICA (
http://sccn.ucsd.edu/wiki/Chapter_09:_Decomposing_Data_Using_ICA) has the
following to say about this issue:
"As a general rule, finding N stable components (from N-channel data)
typically requires more than kN^2 data sample points (at each channel),
where N^2 is the number of weights in the unmixing matrix that ICA is trying
to learn and k is a multiplier. In our experience, the value of k increases
as the number of channels increases. In our example using 32 channels, we
have 30800 data points, giving 30800/32^2 = 30 points per weight. However,
to find 256 components, it appears that even 30 points per weight is not
enough data."
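For what it's worth, the kN^2 rule above can be turned into a rough estimate of recording length and memory footprint. This is only a sketch of the arithmetic: the values of k and the sampling rate below are assumptions (k = 30 matches the 32-channel example, but the wiki says k grows with channel count, so treat the result as a lower bound).

```python
# Back-of-envelope estimate from the k*N^2 heuristic quoted above.
# k = 30 and fs = 250 Hz are assumptions, not recommendations.

def min_samples(n_channels: int, k: int) -> int:
    """Minimum data points per channel suggested by the k*N^2 rule."""
    return k * n_channels ** 2

N = 128    # number of electrodes
K = 30     # multiplier; likely an underestimate at 128 channels
FS = 250   # assumed sampling rate in Hz

samples = min_samples(N, K)            # 30 * 128^2 = 491,520 points
minutes = samples / FS / 60            # recording length at 250 Hz
ram_gb = samples * N * 8 / 1024 ** 3   # double-precision data matrix

print(f"{samples} samples ≈ {minutes:.1f} min; matrix ≈ {ram_gb:.2f} GB")
```

At these assumed values this comes to about 33 minutes of data, and the raw data matrix alone is under 0.5 GB in double precision, so the 8 GB limit is more likely to be hit by ICA's working copies than by the data itself.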

What would the rough value of k be for 128 channels?

Regards,
Nishant Seth

- - - - - - - - - - - - - - -
Research Associate,
Multimedia Lab, I.I.T. Delhi
New Delhi, India