[Eeglablist] ICA time estimate
Scott Makeig
smakeig at ucsd.edu
Tue Jan 11 08:13:21 PST 2005
Christian --
I am surprised this decomposition ran at all -- was it swapping to disk
(--> many times slower)?
Here, we would:
(1) carefully preprocess the data, removing any 'bad' channels and
'paroxysmal' (i.e., non-spatially-stereotyped) artifact data;
(2) use PCA (within runica) to reduce the dimensionality to 100-150(?);
(3) use the binary form of runica() -- called from Matlab as binica().
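The steps above might be sketched in EEGLAB as follows; note that the channel indices and the PCA dimension here are hypothetical placeholders, to be chosen from inspection of your own data:

```matlab
% Hypothetical sketch, assuming EEGLAB is running and the 275-channel
% dataset is loaded into the EEG structure.
badchans = [12 87 203];                        % placeholder 'bad' channel list
EEG = pop_select(EEG, 'nochannel', badchans);  % (1) remove bad channels
% ... also reject paroxysmal (non-stereotyped) artifact data here ...
EEG = pop_runica(EEG, 'icatype', 'binica', ... % (3) binary infomax (binica)
                 'pca', 138);                  % (2) PCA-reduce dimensionality
```

The 'pca' key is passed through pop_runica() to the underlying runica/binica routine, so the ICA unmixing is learned in the reduced 138-dimensional space.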
This is the way we currently perform decompositions of our 256-channel
EEG data -- at least until we get an 8GB, 64-bit (Opteron/Linux) compute
engine (soon).
To find 275 components requires learning 275^2 weights -- and seems to
require >> 275^2 time points (maybe 50x or more?). Reducing the
dimensionality by half, to 138, will reduce the number of weights
learned by 4x and may reduce the amount of data required by more than that.
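A quick back-of-the-envelope check of that scaling argument (the 50x points-per-weight figure is the rough estimate from the text, not a measured value):

```matlab
% Weight counts for full-rank vs. PCA-reduced decompositions.
n_full = 275^2;          % 75625 weights for all 275 components
n_pca  = 138^2;          % 19044 weights after PCA reduction to 138
ratio  = n_full / n_pca  % ~3.97, i.e. roughly 4x fewer weights to learn
% At ~50 time points per weight (a rough rule of thumb), that is
% ~3.8e6 points for the full decomposition vs. ~0.95e6 when reduced.
```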
I'd be happy to hear what use you make of the results...
Scott Makeig
Christian Bikle wrote:
> I started an ICA on a ~480 MB data set which was acquired on a
> 275-channel system. The Linux machine processing the ICA only has 1 GB
> of RAM and a 2.8GHz processor. This process was running for over 2
> weeks (I was on vacation during this time) until a co-worker shut down
> the process (DOH!). I was wondering if anyone had suggestions to
> accelerate this process before starting again?
>
> Christian Bikle
> Psychologist
> Laboratory of Brain and Cognition
> National Institute of Mental Health
> Building 10/4C214
> Work: 301-451-8509
>
>
> _______________________________________________
> eeglablist mailing list eeglablist at sccn.ucsd.edu
> http://sccn.ucsd.edu/mailman/listinfo/eeglablist
> Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html
> To unsubscribe, send an empty email to
> eeglablist-unsubscribe at sccn.ucsd.edu