[Eeglablist] Artifact removal validation
Stefan Debener
s.debener at uke.uni-hamburg.de
Wed Jun 20 01:52:21 PDT 2007
Hi Borna,
you mention that ICA gives a different set of components when applied
repeatedly to the same data. In my experience, this indicates that (a)
you don't have enough data points relative to the number of channels,
and/or (b) the data are not well recorded or pre-processed. With a
reasonable recording & pre-processing, an average data quality, and a
reasonable length of your data, you will find virtually identical and
thus very robust ICs (most weights of dipolar ICs correlate >.95). If
you do not get this, something is wrong.
So, I would recommend fixing this first: try a high-pass filter, remove
noisy, drifting or clipping channels (or better avoid recording those!),
make sure you don't have electrolyte bridges in your data, remove
non-stereotyped artifact periods (gross movements, cable movement,
swallowing, etc.), and, importantly, ensure that the cap does not change
its position on the head while recording the data (ICA assumes spatial
stationarity!!). If this is all ok, then run extended infomax ICA. You
will see a number of dipolar ICs (between about 5 and 25) which are very
robust.
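Not an EEGLAB recipe, but the robustness check described above can be sketched in Python with NumPy. The two "runs" here are simulated weight matrices that differ only by permutation, sign flips, and a little noise — all names and numbers are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_chan, n_ics = 32, 32

# Hypothetical best case: two decompositions identical up to
# permutation and sign, plus a small amount of estimation noise.
run1 = rng.standard_normal((n_ics, n_chan))        # IC weights, run 1
perm = rng.permutation(n_ics)
signs = rng.choice([-1.0, 1.0], size=n_ics)
run2 = signs[:, None] * run1[perm] + 0.01 * rng.standard_normal((n_ics, n_chan))

# Match each run-1 IC to the run-2 IC whose weight vector correlates best.
corr = np.corrcoef(run1, run2)[:n_ics, n_ics:]     # cross-run correlation matrix
best_match = np.abs(corr).argmax(axis=1)
best_r = np.abs(corr)[np.arange(n_ics), best_match]

print("min |r| over matched ICs:", best_r.min())   # robust solution: > .95
```

On real data the two weight matrices would come from repeated ICA decompositions of the same dataset; if the matched |r| values fall well below .95, revisit the preprocessing steps above.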
To identify those ICs reflecting artifacts, it is good to have some a
priori knowledge about EEG artifacts. ICA easily pulls out eye blinks
and eye movements, which can be fully automatically removed by
correlating the IC weights with a template (e.g., the map of the first
eye blink in your data will correlate best with the IC reflecting eye
blinks). Try looking at maps of artifacts on a single-trial level;
that's very informative. An example of using ICA to fully automatically
remove artifacts is given in Debener et al. (2007), NeuroImage, 34,
587-597. However, Arno's NeuroImage paper suggests much more flexible,
fully implemented approaches in this regard. Another artifact that
is very common is the ECG artifact. Depending on your subject and your
electrode layout, you will get one IC (sometimes two) showing a nice ECG
R peak, thus reflecting the electrical heart activity volume-conducted
to the EEG sensors (the second, if it shows up, usually reflects the
rotational part of the R peak vector: the heart wrings itself out to
pump blood; it does not pump by simple compression). With some
experience, this artifact is so easy to spot that I cannot see much that
is "subjective" in IC identification (compared to any other artifact
processing method, such as thresholding at ±50 µV, or any other
arbitrarily chosen value, like the number of principal components that
account for something).
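The template-correlation idea can be sketched as follows. This is not EEGLAB code; the mixing matrix `A`, the blink IC index, and the noise levels are all fabricated for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_chan, n_ics = 32, 32

# Hypothetical setup: IC scalp maps are the columns of the mixing matrix A.
A = rng.standard_normal((n_chan, n_ics))
blink_ic = 4                                  # pretend IC 4 is the blink component

# Template: the scalp map at the first detected blink peak, faked here as
# the blink IC's map plus some background activity.
template = A[:, blink_ic] + 0.2 * rng.standard_normal(n_chan)

# Correlate the template with every IC map; the blink IC should win clearly.
r = np.array([np.corrcoef(template, A[:, k])[0, 1] for k in range(n_ics)])
detected = int(np.abs(r).argmax())
print("detected blink IC:", detected)
```

In practice the template would be the measured scalp map at a blink peak, and the winning |r| is typically far above the correlations with the remaining ICs, which is what makes the detection automatic.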
Validation 1: If you want to compare the performance of ICA to any other
artifact correction algorithm, then you should look at the single-trial
performance, not the averaged data. One could for instance define eye
blink peak latencies, and then compare the uncorrected eye blink maps
with the corrected maps at the same latencies. I did this for a few
datasets, and ICA in my tests easily outperformed regression approaches.
That is, you can expect to find a higher residual correlation following
regression than following ICA eye blink removal. Of course, this will be
obtained only if your ICA solution is robust, and thus depends to some
extent on your preprocessing. I'd love to see a paper published using
this approach ...
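The single-trial comparison could look like this in NumPy. The "leaky" correction below is a toy stand-in for an incomplete removal method, not an implementation of regression-based correction; all data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
n_chan, n_blinks = 32, 50

# Synthetic data: each blink-peak sample = blink topography * amplitude + brain EEG.
blink_map = rng.standard_normal(n_chan)
amps = 50 + 10 * rng.standard_normal(n_blinks)
brain = rng.standard_normal((n_blinks, n_chan))
uncorrected = amps[:, None] * blink_map + brain

# Idealised "ICA" correction removes the blink subspace entirely; the
# leaky method (for illustration only) leaves 20% of it in the data.
ica_corrected = uncorrected - amps[:, None] * blink_map
leaky_corrected = uncorrected - 0.8 * amps[:, None] * blink_map

def residual_r(corrected):
    # Mean |correlation| between the blink map and the corrected maps
    # at the blink peak latencies: lower = less residual artifact.
    rs = [np.corrcoef(blink_map, trial)[0, 1] for trial in corrected]
    return float(np.mean(np.abs(rs)))

print("ICA residual:  ", residual_r(ica_corrected))
print("leaky residual:", residual_r(leaky_corrected))
```

On real data the blink peak latencies would be defined from the uncorrected EOG/EEG, and the same residual measure applied to each correction method under comparison.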
Validation 2: Try to compare results obtained with ICA preprocessing
against any other signal processing routine, for instance with regard
to some well-known neurocognitive effect. I virtually always find that, if the ICA
decomposition is meaningful and robust across subjects, ICA performs
better than channel-based data processing. ICA yields nice links
between behavior and EEG single trial amplitudes (e.g. Debener et al.,
2005, J Neurosci), ICA components show systematic trial-by-trial
correlations with the fMRI BOLD signal (some papers about to be
published soon, e.g. Scheeringa et al., Int J Psychophysiol), and AEP
source localization is easy even at the single-subject level when based
on ICA preprocessing (e.g. Hine & Debener, 2007, Clin Neurophysiol).
Validation 3: However, I fully agree that many open issues with regard
to ICA validation persist. More likely than not, the whole approach can
be further improved and made "less subjective" than it is right now.
Hope this helps,
Stefan
Borna Noureddin wrote:
> Hello,
>
> I have been working through the most recent tutorial on artifact
> rejection, and have a question about validation. How can I validate or
> evaluate the performance of ICA for artifact removal?
>
> The tutorial gives some steps for identifying artifact components, but
> they are quite subjective. This is further compounded by the fact that,
> given the same set of data, running ICA gives a different set of
> components each time. So, even with an acceptable validation method,
> it's unclear how I can objectively and reliably apply that method.
>
> Is there, for example, a "post-artifact-removal" version of the tutorial
> dataset that can be compared with the original tutorial dataset provided
> with EEGLAB? And if so, is there documentation about how precisely the
> artifacts were removed?
>
> Thanks,
> Borna Noureddin
>
>
> _______________________________________________
> eeglablist mailing list eeglablist at sccn.ucsd.edu
> Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html