[Eeglablist] Poor ICA decomposition

Kelly Michaelis kcmichaelis at gmail.com
Fri Oct 26 07:47:15 PDT 2018


Hi Arno, Marius and Makoto,

Thank you all for the helpful advice!

To answer Arno's question about other ICA algorithms: I tried runica, and I
found that AMICA produced better results. I also tried re-running it with
the bad channels simply removed rather than interpolated before ICA, and
that didn't make a noticeable difference.

As for the amount of data: I have two recording sessions per subject, and
each session is split into two .set files (I reapplied saline to lower
impedances halfway through the session). Each .set file is submitted to ICA
independently. Each file contains about 30 minutes of data, but once it is
cut into 1-s epochs and cleaned before ICA, it is trimmed considerably, so
I end up with roughly data = 128 x 250 x 2052 (channels x samples x
epochs), depending on the number of epochs removed. Perhaps this is part of
why things don't look as clean as they could?
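
For reference, I checked my numbers against the rule of thumb from Makoto's
pipeline page, roughly (number of channels)^2 x 20-30 data points; the
multiplier k below is that heuristic, not a hard requirement:

    % Rough data-quantity sanity check for ICA
    nchans  = 128;                 % channels submitted to ICA
    srate   = 250;                 % sampling rate in Hz after resampling
    nepochs = 2052;                % surviving 1-s epochs
    npoints = srate * nepochs;     % total data points = 513,000
    k       = 30;                  % heuristic multiplier (20-30 often quoted)
    fprintf('have %d points, heuristic wants >= %d\n', npoints, k * nchans^2);

So I seem to be just at the edge of that guideline (513,000 vs. 491,520
points), which may be part of the answer.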

My data are fairly noisy - lots of EMG and non-repetitive artifacts - but
since this is my first major EEG study, I didn't have much to compare my
ICA decompositions against except online tutorials. After hearing from you
all, perhaps what I'm getting isn't abnormal for 128 channels and somewhat
noisy data. Following Makoto's advice, I have been checking the component
activations in the scroll data for each file, and they look fine.

I've already gone through half the subjects and rejected artifactual
components (about 3-10 components per file - this may be too many? I'm
careful to reject only things that are clearly blinks, cardiac artifacts,
saccades, or what look like single bad channels or EMG). This
screenshot <https://georgetown.box.com/s/78olhxp6wx5k0x3q8u02nhypsd1cgfnc>
shows another subject with a few components pulled up: #3, which looks
clearly brain-related, and #27, an EMG artifact that I might reject along
with the clear blink/cardiac components (1, 2, 7, 9, 14).
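
For concreteness, the removal step I'm using is essentially this (the
component indices are just the ones from that subject):

    % Remove the marked artifact ICs and back-project the remaining ones
    badcomps = [1 2 7 9 14];               % blink/cardiac ICs for this subject
    EEG = pop_subcomp(EEG, badcomps, 0);   % 0 = skip the confirmation plot
    EEG = eeg_checkset(EEG);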

If this still looks poor, I can re-run everything through ICA and try the
1.5 Hz high-pass filter, reducing the number of components, and dropping
the extra channel (FYI, when referencing, I did use the method where I add
a dummy channel, re-reference, then remove the dummy channel - so maybe
that step makes this unnecessary?).
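
For anyone searching the archive later, the dummy-channel re-referencing I
mean is essentially the recipe from Makoto's pipeline page, applied to
continuous data:

    % Full-rank average reference: append a zero-filled channel,
    % average-reference, then drop it again.
    EEG.nbchan = EEG.nbchan + 1;
    EEG.data(end+1,:) = zeros(1, EEG.pnts);          % the dummy (zero) channel
    EEG.chanlocs(1,EEG.nbchan).labels = 'initialReference';
    EEG = pop_reref(EEG, []);                        % average reference
    EEG = pop_select(EEG, 'nochannel', {'initialReference'});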

I'm curious to know what you all think. Thank you!

Kelly





On Thu, Oct 25, 2018 at 2:47 PM Makoto Miyakoshi <mmiyakoshi at ucsd.edu>
wrote:

> Dear Kelly,
>
> I like Marius's advice: try a 1.5 Hz high-pass. Also, your data look fine
> to me.
>
> See the pictures shown here. If you screw up copying the weight matrices,
> you'll NEVER know it by looking at the scalp maps. You MUST check
> EEG.icaact, the ICA activations, in the scroll data.
>
> https://sccn.ucsd.edu/wiki/Makoto's_preprocessing_pipeline#Tips_for_copying_ICA_results_to_other_datasets_.2806.2F26.2F2018_updated.29
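>
> In code, the copying and checking look roughly like this (a minimal
> sketch; EEGepoched and EEGcont are just illustrative names for the dataset
> that went through ICA and the one you are copying to):
>
>     % Copy the ICA results, then recompute and inspect the activations
>     EEGcont.icaweights  = EEGepoched.icaweights;
>     EEGcont.icasphere   = EEGepoched.icasphere;
>     EEGcont.icachansind = EEGepoched.icachansind;
>     EEGcont.icawinv     = [];                  % let EEGLAB recompute this
>     EEGcont = eeg_checkset(EEGcont, 'ica');    % rebuilds icawinv, checks fields
>     % Scroll the component activations (Plot > Component activations
>     % (scroll) in the GUI), or get them directly:
>     icaact = eeg_getdatact(EEGcont, 'component', ...
>                            1:size(EEGcont.icaweights, 1));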
>
> > I got one suggestion to reduce the number of components down to
> something like 64, but this article by Fiorenzo, Delorme, Makeig recommends
> against that.
>
> I have been very worried that Fiorenzo's message is widely misunderstood.
> He did not say that using PCA is unconditionally bad. Instead, his message
> was: do not use PCA merely to save time. PCA is a lossy compression, so
> you do lose information. Also, usually the first 20 or so principal
> components explain >95% of the variance, but most of that variance here is
> artifacts (eye blinks, etc.). Although keeping 95% of the variance is a
> widely used engineering practice, it does not hold for EEG.
>
> Using PCA is TOTALLY VALID if you know you need to reduce the data
> dimension. There is no evidence, for example, that channel rejection is
> better than PCA dimension reduction.
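>
> For example, a sketch of rank-aware reduction (the rank call is a rough
> way to estimate the effective dimension; for AMICA the equivalent is the
> 'pcakeep' argument):
>
>     % Estimate the effective rank of the (channels x samples) data and
>     % ask ICA to keep only that many dimensions.
>     dataRank = rank(double(EEG.data(:, :)));
>     EEG = pop_runica(EEG, 'icatype', 'runica', 'pca', dataRank);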
>
> Makoto
>
>
>
> On Tue, Oct 16, 2018 at 8:18 AM Kelly Michaelis <kcmichaelis at gmail.com>
> wrote:
>
>> Hi everyone,
>>
>> I'm wondering if anyone can help shed some light on why I'm getting such
>> poor ICA decompositions and what to do about them. I've tried a number of
>> pipelines and methods, and each one is about this bad (the link below has
>> pictures of the scalp maps from two files). I'm using a 128-channel EGI
>> system. Here is my pipeline (a rough code sketch follows the list):
>>
>> 1. Import, low-pass filter at 40 Hz, resample to 250 Hz, high-pass
>> filter at 1 Hz
>> 2. Remove bad channels and interpolate them, then re-reference to the
>> average reference
>> 3. Epoch to 1-s epochs, remove bad epochs using joint probability
>> 4. Run AMICA, using PCA ('pcakeep') to reduce the number of components
>> to 128 minus the number of interpolated channels
>> 5. Load the raw data; filter, resample, remove bad channels,
>> interpolate, and re-reference as above
>> 6. Apply the ICA weights to the continuous, pre-processed data
>> 7. Do component rejection
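>>
>> Here is the sketch of steps 1-4 (epoching uses eeg_regepochs to cut
>> regular 1-s dummy epochs; the 3-SD joint-probability thresholds, and the
>> badChanLocs, outdir, and numInterp variables, stand in for my actual
>> values):
>>
>>     EEG = pop_eegfiltnew(EEG, [], 40);   % 1. low-pass at 40 Hz
>>     EEG = pop_resample(EEG, 250);        %    resample to 250 Hz
>>     EEG = pop_eegfiltnew(EEG, 1, []);    %    high-pass at 1 Hz
>>     EEG = pop_interp(EEG, badChanLocs, 'spherical');  % 2. interpolate
>>     EEG = pop_reref(EEG, []);            %    average reference
>>     EEG = eeg_regepochs(EEG, 'recurrence', 1, ...
>>                         'limits', [0 1]);             % 3. 1-s epochs
>>     EEG = pop_jointprob(EEG, 1, 1:EEG.nbchan, ...
>>                         3, 3, 0, 1);     %    reject epochs at 3 SD
>>     runamica15(EEG.data(:, :), 'outdir', outdir, ...  % 4. AMICA with
>>                'pcakeep', 128 - numInterp);           %    PCA reduction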
>>
>> What am I missing? Does anyone see any glaring errors here? My data are a
>> bit on the noisy side, and while I do capture things like blinks and
>> cardiac artifacts pretty clearly, the artifacts load on a lot of
>> components, and I'm not getting many clear brain components. I got one
>> suggestion to reduce the number of components down to something like 64,
>> but the article by Fiorenzo, Delorme, and Makeig recommends against that.
>>
>> Any ideas?
>>
>> Thanks,
>> Kelly
>>
>> Scalp maps:
>> https://georgetown.box.com/s/1dv1n5fhv1uqgn1qc59lmssnh1387sud
>>
>
>
>
> --
> Makoto Miyakoshi
> Swartz Center for Computational Neuroscience
> Institute for Neural Computation, University of California San Diego
>