[Eeglablist] ICA decomposition with 128 versus 64 channels?

Joseph Dien jdien at ku.edu
Wed Jan 18 17:37:13 PST 2006


My lab has a 128-channel system and runs all the data through ICA
without any problems.  For a typical ERP experiment, our 2 GB of RAM
has been sufficient to run the full single-trial dataset (it takes a
couple of hours on a 2 GHz G5); a rough back-of-envelope sizing
sketch is below.
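
Roughly speaking, Infomax ICA operates on an n_channels x n_samples
double-precision matrix, so memory scales with both the montage size
and the amount of data.  The numbers below are placeholders, not
figures from any particular study, and the few lines of Python are
just for the arithmetic:

n_channels = 128        # placeholder montage size
n_trials = 500          # placeholder number of single trials
n_samples = 256         # placeholder samples per epoch (1 s at 256 Hz)
bytes_per_double = 8    # ICA works on double-precision data

data_bytes = n_channels * n_trials * n_samples * bytes_per_double
weight_bytes = n_channels * n_channels * bytes_per_double  # unmixing matrix

print("data matrix : %.1f MB" % (data_bytes / 2.0**20))    # ~125 MB here
print("ICA weights : %.3f MB" % (weight_bytes / 2.0**20))  # ~0.125 MB

# Going from 64 to 128 channels doubles the data matrix and quadruples
# the (still tiny) weight matrix; in practice a few working copies of
# the data matrix are what actually fill RAM during the decomposition.
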
As for whether the extra channels are worth it, that depends on what
your goals are.  For example, we do a lot of source modeling, and for
that purpose the extra channels are quite helpful.  A couple of
relevant papers on other applications:

Srinivasan, R., Nunez, P. L., Tucker, D. M., Silberstein, R. B., &
Cadusch, P. J. (1996). Spatial sampling and filtering of EEG with
spline Laplacians to estimate cortical potentials. Brain Topography,
8(4), 355-366.

Fletcher, E. M., Kussmaul, C. L., & Mangun, G. R. (1996). Estimation
of interpolation errors in scalp topographic mapping.
Electroencephalography and Clinical Neurophysiology, 98(5), 422-434.
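
One more thought on the question below about having to mix 128
signals down to 64 before ICA: the usual trick is not to average or
throw away electrodes but to project the recording onto its leading
principal components and run ICA in that reduced space (runica has a
'pca' option for this, if I remember right).  A minimal numpy sketch
with made-up array sizes, just to show the idea:

import numpy as np

# Fake data, shapes only: 128 channels by 60000 time points.
X = np.random.randn(128, 60000)

X = X - X.mean(axis=1, keepdims=True)    # remove each channel's mean
U, s, Vt = np.linalg.svd(X, full_matrices=False)

k = 64                                   # reduced spatial dimension
X_reduced = U[:, :k].T.dot(X)            # 64 x 60000, what ICA would see
back_proj = U[:, :k]                     # maps 64-D results back to
                                         # 128-channel scalp maps
retained = (s[:k] ** 2).sum() / (s ** 2).sum()
print("shape fed to ICA:", X_reduced.shape)
print("variance retained: %.3f" % retained)

Back-projecting through the retained components still gives you full
128-channel scalp maps for the resulting sources, so the denser
montage isn't wasted even if the decomposition itself runs in 64
dimensions.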

Cheers!

Joe


On Jan 18, 2006, at 5:41 PM, Matthew Belmonte wrote:

> I'm in the process of putting together a proposal for an EEG
> facility, and would like to select hardware with EEGLAB processing
> in mind.  I'm approaching this from perhaps a bit of a dated
> perspective: in 1996 I was using only 16 channels and homebrewed
> software for time-frequency analysis, and I've spent the intervening
> decade working exclusively with fMRI.
>
> I've heard from one EEGLAB user that 128 channels don't confer much
> advantage over 64, since inputs must be spatially downsampled in
> order to be processed practically on typical computing hardware, and
> since the independent components of interest (those from neural
> sources) don't become much cleaner with 128 inputs as compared to
> 64.  (The tradeoff of spatial resolution and SNR to electrode
> application time also is a consideration; we'd be recording from
> autistic children and couldn't afford any great deal of time spent
> fiddling.)
>
> I'd like to hear from EEGLAB users (and developers!) with experience
> at 128 and/or 64 channels:  Do you find a 64-channel system
> adequate?  What improvement in data quality has moving to 128
> channels given you?  If I loaded up a GNU/Linux system with the most
> RAM that I could get (16GB on an IBM IntelliStation), would it be
> able to handle an ICA decomposition of 128-channel data without
> thrashing, or would I be doubling my investment in amplifiers only
> to have to mix down 128 signals to 64 before ICA?  And, even if it
> would be computationally practical, would it be scientifically
> useful enough to justify the extra preparation time?
>
> Many thanks
>
> Matthew Belmonte <mkb30 at cam.ac.uk>
> _______________________________________________
> eeglablist mailing list eeglablist at sccn.ucsd.edu
> Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html
> To unsubscribe, send an empty email to
> eeglablist-unsubscribe at sccn.ucsd.edu

------------------------------------------------------------------------------

Joseph Dien
Assistant Professor of Psychology
Department of Psychology
419 Fraser Hall
1415 Jayhawk Blvd
University of Kansas
Lawrence, KS 66045-7556
E-mail: jdien at ku.edu
Office: 785-864-9822 (note: no voicemail)
Fax: 785-864-5696
http://people.ku.edu/~jdien/Dien.html



