[Eeglablist] artifact rejection for abnormal trends

Joshua Hartshorne jkhartshorne at gmail.com
Wed Aug 22 11:10:47 PDT 2012


I am using ICA to correct for blinks. Prior to running ICA, I do a bit of
minimal data cleaning so that the ICA decomposition is reasonably good: I
reject and replace bad channels (kurtosis, threshold = 4) and then reject
epochs showing an abnormal linear trend with a slope greater than 150
(which I realize is quite generous). A sketch of the pipeline follows.
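
(For concreteness, here is a hedged sketch of that pipeline in
EEGLAB-style MATLAB. The pop_ function arguments shown, including the
minR value of 0.3, are assumptions from memory and may differ across
EEGLAB versions, so treat this as illustrative rather than my exact
script.)

    origlocs = EEG.chanlocs;                       % keep the full montage
    % Detect and drop bad channels by kurtosis (z threshold = 4):
    [EEG, badchans] = pop_rejchan(EEG, 'elec', 1:EEG.nbchan, ...
        'threshold', 4, 'norm', 'on', 'measure', 'kurt');
    EEG = pop_interp(EEG, origlocs, 'spherical');  % replace rejected channels
    % Flag epochs with an abnormal linear trend (slope limit 150 over the
    % whole epoch); the minR of 0.3 is a placeholder -- it is the option
    % I am asking about below:
    [EEG, rejidx] = pop_rejtrend(EEG, 1, 1:EEG.nbchan, EEG.pnts, 150, 0.3, 0, 0);
    EEG = pop_runica(EEG, 'extended', 1);          % then run ICA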

What I don't entirely understand is the minR option that has to be set for
the abnormal trend detection. The documentation says, cryptically, that it
is the "minimal linear regression R-square value to allow in the data".
What is being regressed? Is it µV against time? That is, is minR a measure
of how linear the trend is (1 = perfectly linear during the epoch)?
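
(My current guess, sketched below in plain MATLAB rather than EEGLAB's
actual code: fit a least-squares line to the channel data within the
epoch, and flag the epoch only if the slope exceeds the limit AND the
fit's R-square is at least minR. The variable names data and srate are
hypothetical.)

    % Hypothetical trend test for one channel of one epoch -- my guess,
    % not EEGLAB's actual implementation:
    x = (0:numel(data)-1)' / srate;          % time axis in seconds
    p = polyfit(x, data(:), 1);              % least-squares line; p(1) = slope
    res = data(:) - polyval(p, x);           % residuals from the linear fit
    R2 = 1 - sum(res.^2) / sum((data(:) - mean(data(:))).^2);
    reject = abs(p(1)) > 150 && R2 >= minR;  % steep AND genuinely linear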

Thank you. Josh Hartshorne