[Eeglablist] Poor ICA decomposition
Scott Makeig
smakeig at ucsd.edu
Sun Nov 4 17:42:30 PST 2018
Kelly -
Your 10-26 screenshot of IC maps does look good. I suggest you try
downloading and applying the ICLabel plug-in, which may identify several ICs
accounting for EMG activity in neck/scalp muscles (etc.). Again, ICA
attempts to separate your data into distinct *sources of information.* Each
scalp/neck muscle has its own source of information to contribute, and
individual-channel noise may be distinctive enough to 'win' an IC filter,
etc.
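For example, a minimal sketch of applying ICLabel once it is installed
(iclabel() and the EEG.etc.ic_classification field are what the plug-in
provides; treating 'Muscle' as column 2 follows ICLabel's documented class
order):

    EEG = iclabel(EEG);  % adds per-IC class probabilities
    probs = EEG.etc.ic_classification.ICLabel.classifications;
    % columns: Brain, Muscle, Eye, Heart, Line Noise, Channel Noise, Other
    [~, maxClass] = max(probs, [], 2);
    muscleICs = find(maxClass == 2)  % ICs most probably EMG
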
I do suggest you experiment with decomposing more data, not just the data
epochs - though still removing data segments contaminated by movement
noise. After all, most of the effective EEG sources are also active between
trials, and more (clean) data is better.
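One way to do that segment removal on the continuous data is pop_rejcont
(a sketch; the parameter values shown are the function's illustrative
defaults, not a recommendation for your data):

    % reject contiguous noisy regions by 20-40 Hz power thresholding
    [EEG, rejRegions] = pop_rejcont(EEG, 'elecrange', 1:EEG.nbchan, ...
        'freqlimit', [20 40], 'threshold', 10, 'epochlength', 0.5, ...
        'contiguous', 4, 'addlength', 0.25, 'taper', 'hamming');
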
Another option you might explore is to apply multi-model AMICA to the whole
dataset (both sessions -- IF the electrode positions on the head do not
change between them). See this paper for examples:
Hsu, S.H., Pion-Tonachini, L., Palmer, J., Miyakoshi, M., Makeig, S. and
Jung, T.P., 2018. Modeling brain dynamic state changes with adaptive
mixture independent component analysis. *NeuroImage*, *183*, pp.47-61.
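A minimal sketch of a multi-model AMICA run on the merged sessions (the
AMICA plug-in provides runamica15 and loadmodout15; the variable names
EEG1/EEG2 and the three-model choice are illustrative):

    EEG = pop_mergeset(EEG1, EEG2);  % both sessions, same montage assumed
    runamica15(EEG.data(:,:), 'num_models', 3, 'outdir', 'amicaout');
    mod = loadmodout15('amicaout');  % load results
    EEG.icaweights = mod.W(:,:,1);   % copy, e.g., model 1 back into EEG
    EEG.icasphere  = mod.S(1:size(mod.W,1), :);
    EEG = eeg_checkset(EEG, 'ica');
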
Re PCA vs. lower channel count: the problem with using PCA for dimension
reduction is that all the independent sources present in the original
(e.g.) 128 channels load onto the retained PCA subspace - in fact, PCA
attempts to include as many ICs as possible in each PC. Omitting channels
also leaves projections of all the independent sources in the data, though
somewhat less markedly (in principle). It could be a worthwhile experiment
to try the following: stochastically remove (various combinations of)
channels, perform ICA decomposition, and then cluster the resulting ICs to
look for, e.g., brain IC clusters of interest - though exploring this would
be a small research project in itself.
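A rough sketch of that experiment (the number of runs, subset size, and
file names are all illustrative):

    % repeatedly drop a random channel subset, decompose, and save;
    % the saved sets could then be clustered with EEGLAB STUDY tools
    % (std_preclust / pop_clust) to look for stable brain IC clusters
    for run = 1:10
        dropIdx = randperm(EEG.nbchan, 16);  % drop 16 random channels
        EEGsub  = pop_select(EEG, 'nochannel', dropIdx);
        EEGsub  = pop_runica(EEGsub, 'icatype', 'runica', 'extended', 1);
        pop_saveset(EEGsub, 'filename', sprintf('subsample%02d.set', run));
    end
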
Scott Makeig
On Fri, Oct 26, 2018 at 7:53 AM Kelly Michaelis <kcmichaelis at gmail.com>
wrote:
> Hi Arno, Marius and Makoto,
>
> Thank you all for the helpful advice!
>
> To answer Arno's question about other ICA algorithms, I tried runica, and
> I found AMICA produced better results. I also tried re-running it with the
> bad channels simply removed (not interpolated) before ICA, and found that
> it didn't make much difference.
>
> As for the amount of data - I have two recording sessions per subject, and
> each session is split into two .set files (I reapplied saline to lower
> impedances halfway through the session). Each .set file is submitted to ICA
> independently. Each file is about 30 minutes of data, but once that gets
> epoched into 1-sec epochs before ICA, it's trimmed considerably, so that
> before ICA I end up with (data = 128x250x2052), depending on the number of
> epochs removed. So perhaps this is part of why things don't look as pretty
> as they could?
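>
> For scale, a back-of-envelope check against the common heuristic of at
> least k x channels^2 data points, with k around 20-30 (a rule of thumb,
> not a hard requirement):
>
>     nchans = 128; pnts = 250; trials = 2052;
>     totalPts  = pnts * trials        % 513,000 sample points
>     kAchieved = totalPts / nchans^2  % ~31 -> just above the usual minimum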
>
> My data are fairly noisy - lots of EMG and non-repetitive artifacts - but,
> this being my first major EEG study, I didn't have much to compare my ICA
> decompositions against except online tutorials. After hearing from you all,
> perhaps what I'm getting isn't all that abnormal for 128 channels and
> somewhat noisy data. Following Makoto's advice, I have been checking the
> component scroll data for each file, and it looks fine.
>
> I've already gone through half the subjects and rejected artifactual
> components (I'm rejecting about 3-10 components per file - maybe too
> many? I'm careful to reject only things that are clearly blinks, cardiac
> artifacts, saccades, or what look like single bad channels or EMG). This
> screenshot <https://georgetown.box.com/s/78olhxp6wx5k0x3q8u02nhypsd1cgfnc>
> shows another subject with a few components pulled up: #3, which seems
> clearly brain-related, and #27, which looks like an EMG artifact that I
> might reject along with the clear blink/cardiac components (1, 2, 7, 9, 14).
>
> If this still looks poor, I can re-run everything through ICA and try the
> 1.5Hz filter, reducing the number of components, and subtracting the extra
> channel (FYI, when referencing, I did use the method where I add a dummy
> channel, re-reference, then remove the dummy channel - so maybe this is
> unnecessary?).
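>
> For reference, the dummy-channel step I used looks like this (a sketch for
> continuous data, following the approach described above):
>
>     EEG.nbchan = EEG.nbchan + 1;
>     EEG.data(end+1,:) = zeros(1, EEG.pnts);
>     EEG.chanlocs(end+1).labels = 'initialReference';
>     EEG = pop_reref(EEG, []);
>     EEG = pop_select(EEG, 'nochannel', {'initialReference'});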
>
> I'm curious to know what you all think. Thank you!
>
> Kelly
>
> On Thu, Oct 25, 2018 at 2:47 PM Makoto Miyakoshi <mmiyakoshi at ucsd.edu>
> wrote:
>
>> Dear Kelly,
>>
>> I like Marius's advice. Try a 1.5Hz high-pass. Also, your data look fine
>> to me.
>>
>> See the pictures shown here. If you make a mistake when copying weight
>> matrices, you'll NEVER detect it by looking at the scalp maps. You MUST
>> check EEG.icaact, the ICA activations, in the scroll data.
>>
>> https://sccn.ucsd.edu/wiki/Makoto's_preprocessing_pipeline#Tips_for_copying_ICA_results_to_other_datasets_.2806.2F26.2F2018_updated.29
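>>
>> A minimal sketch of that check (pop_eegplot with its first option flag
>> set to 0 plots component activations rather than channel data):
>>
>>     EEG = eeg_checkset(EEG, 'ica');  % (re)compute EEG.icaact if empty
>>     pop_eegplot(EEG, 0, 1, 1);       % 0 = scroll the IC activations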
>>
>> > I got one suggestion to reduce the number of components down to
>> > something like 64, but this article by Fiorenzo, Delorme, Makeig
>> > recommends against that.
>>
>> I have been very worried that Fiorenzo's message has been widely
>> misunderstood. He did not say that using PCA is unconditionally bad.
>> Instead, his message was: do not use PCA merely to save time. PCA is a
>> lossy compression, so you do lose information. Also, the first 20 or so
>> principal components usually explain >95% of the data variance, but most
>> of that variance is artifact (eye blinks, etc.). Although 'keeping 95% of
>> the variance' is a widely used engineering practice, it does not hold for
>> EEG.
>>
>> Using PCA is TOTALLY VALID if you know you need to reduce the data
>> dimensionality. There is no evidence, for example, that channel rejection
>> is better than PCA dimension reduction.
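>>
>> If you do need to reduce the dimension, it can be requested directly (a
>> sketch; 64 retained dimensions is illustrative):
>>
>>     EEG = pop_runica(EEG, 'icatype', 'runica', 'pca', 64);
>>     % or, with the AMICA plug-in:
>>     runamica15(EEG.data(:,:), 'pcakeep', 64, 'outdir', 'amicaout');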
>>
>> Makoto
>>
>>
>>
>> On Tue, Oct 16, 2018 at 8:18 AM Kelly Michaelis <kcmichaelis at gmail.com>
>> wrote:
>>
>>> Hi everyone,
>>>
>>> I'm wondering if anyone can help shed some light on why I'm getting such
>>> poor ICA decomposition and what to do about it. I've tried a number of
>>> pipelines and methods, and each one is about this bad (the link below has
>>> pictures of the scalp maps from two files). I'm using a 128-channel EGI
>>> system. Here is my pipeline (a rough code sketch of steps 1-4 follows
>>> the list):
>>>
>>> 1. Import, low pass filter at 40Hz, resample to 250Hz, high pass filter
>>> at 1Hz
>>> 2. Remove bad channels and interpolate, then re-reference to average ref
>>> 3. Epoch to 1s epochs, remove bad epochs using joint probability
>>> 4. Run AMICA, using the 'pcakeep' option to reduce the number of
>>> components to 128 minus the number of interpolated channels
>>> 5. Load raw data, filter same as above, resample, remove bad chans,
>>> interpolate, re-reference
>>> 6. Apply ICA weights to continuous, pre-processed data
>>> 7. Do component rejection
>>>
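>>> A rough code sketch of steps 1-4 (the function names are standard
>>> EEGLAB/AMICA; the parameter values and the bad-channel list are
>>> illustrative):
>>>
>>>     EEG = pop_eegfiltnew(EEG, [], 40);  % 1. low-pass at 40 Hz
>>>     EEG = pop_resample(EEG, 250);       %    resample to 250 Hz
>>>     EEG = pop_eegfiltnew(EEG, 1, []);   %    high-pass at 1 Hz
>>>     badChans = [17 43 92];              % 2. hypothetical bad channels
>>>     EEG = pop_interp(EEG, badChans, 'spherical');
>>>     EEG = pop_reref(EEG, []);           %    average reference
>>>     EEG = eeg_regepochs(EEG, 'recurrence', 1, 'limits', [0 1]); % 3.
>>>     EEG = pop_jointprob(EEG, 1, 1:EEG.nbchan, 5, 5, 0, 1);
>>>     runamica15(EEG.data(:,:), 'outdir', 'amicaout', ...         % 4.
>>>         'pcakeep', EEG.nbchan - numel(badChans));
>>>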
>>> What am I missing? Does anyone see any glaring errors here? My data are
>>> a bit on the noisy side, and while I do capture things like blinks and
>>> cardiac artifacts pretty clearly, the artifacts load onto a lot of
>>> components, and I'm not getting many clear brain components. I got one
>>> suggestion to reduce the number of components down to something like 64,
>>> but the article by Artoni, Delorme, and Makeig recommends against that.
>>>
>>> Any ideas?
>>>
>>> Thanks,
>>> Kelly
>>>
>>> Scalp maps:
>>> https://georgetown.box.com/s/1dv1n5fhv1uqgn1qc59lmssnh1387sud
>>>
>>
>>
>>
>> --
>> Makoto Miyakoshi
>> Swartz Center for Computational Neuroscience
>> Institute for Neural Computation, University of California San Diego
>>
--
Scott Makeig, Research Scientist and Director, Swartz Center for
Computational Neuroscience, Institute for Neural Computation, University of
California San Diego, La Jolla CA 92093-0961, http://sccn.ucsd.edu/~scott