[Eeglablist] Unstable and inconsistent validation in SIFT
Tim Mullen
mullen.tim at gmail.com
Tue Aug 20 11:42:21 PDT 2013
Stability here is a key factor. If the model is unstable, the
autocorrelation and multivariate portmanteau tests are not meaningful, as
the basic assumptions underlying those tests are violated.
Any statistical results derived from an unstable model should be rejected.
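
For reference, the stability criterion amounts to checking that every
eigenvalue of the VAR companion matrix lies strictly inside the unit
circle. A minimal numpy sketch of that check (a generic illustration of
the idea, not SIFT's MATLAB implementation) might look like:

import numpy as np

def var_is_stable(A_list):
    """Stability check for a VAR(p) model
    X[t] = A_1 X[t-1] + ... + A_p X[t-p] + E[t].

    A_list holds the p coefficient matrices, each (M, M). The model is
    stable iff all eigenvalues of the companion matrix have modulus < 1.
    """
    p = len(A_list)
    M = A_list[0].shape[0]
    companion = np.zeros((M * p, M * p))
    companion[:M, :] = np.hstack(A_list)               # top block row: [A_1 ... A_p]
    companion[M:, :M * (p - 1)] = np.eye(M * (p - 1))  # shifted identity blocks
    moduli = np.abs(np.linalg.eigvals(companion))
    return bool(np.all(moduli < 1.0)), float(moduli.max())

A window would then be flagged unstable whenever the largest eigenvalue
modulus reaches or exceeds 1.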
On the other hand, it's possible that, because you have a short time window
(and high model order), the residual cross-correlations are poorly
estimated at larger time lags, leading to bias in the whiteness tests.
How many cross-correlation time lags are being used for the whiteness tests
here?
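
And just as a rough back-of-the-envelope check on the window/order
trade-off, using the numbers from your message below (the number of
channels/components M isn't stated, so the value here is purely
illustrative):

# Rough data-vs-parameter bookkeeping (values from the message below;
# M is a hypothetical channel/component count, since it wasn't given).
srate, win_sec, epochs, p = 250, 0.35, 60, 21
M = 16                                     # illustrative only

win_samples = int(round(win_sec * srate))  # 88 samples per epoch
total_samples = win_samples * epochs       # about 5280 pooled samples
n_params = p * M * M                       # 5376 VAR coefficients for M = 16

print(total_samples / n_params)            # roughly 1 sample per parameter

Anything near (or below) a handful of samples per coefficient makes both
the stability and whiteness estimates fragile, which would be consistent
with the behaviour you describe.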
Tim
On Tue, Aug 20, 2013 at 4:58 AM, boglyas <kamanbalazs at gmail.com> wrote:
> Dear list,
>
> I'm writing about two things:
>
> - I'm using SIFT for my study on different subject groups. After trying
> out numerous parameter sets, a 0.35-second window length, 0.01-second step
> size, and a model order of 21 proved to be the best option. It works
> really well on the majority of the datasets (all datasets are: 250 Hz
> sampling rate, 60 epochs, 1250 frames per epoch, 5-second-long trials:
> 1 second of 'nothing', 1 second of preparation [a cross appeared on the
> screen], and 3 seconds of motor imagery [according to arrow direction:
> right hand, left hand, legs]), but for one of the datasets it gives really
> strange validation results.
>
> The validation figure is the following:
> http://i.imgur.com/AKbCojc.png
>
> 100% whiteness, but there are some windows that are inconsistent and
> unstable (for the other datasets, validation gives 100% stability and
> ~85% consistency).
>
> I tried about 10 different parameter sets, and it seems that whenever the
> model fit is good (high whiteness), the model is unstable or inconsistent.
>
> What's your opinion: can I use this dataset with the fitted model if the
> number of unstable/inconsistent windows is low? Or is that mathematically
> incorrect, and will it have a significant effect on the results for the
> whole recording? Should I keep trying to find parameters that give correct
> validation?
>
> - My second question would be:
>
> Is it possible that the validation figure is correct but the results are
> false? Presumably, if the results seem to be nonsense, the problem lies
> with the original EEG file (noise, etc.); I just want to be sure by asking
> you.
>
> Thanks in advance,
>
> Balazs Kaman
>
--
--------- αντίληψη -----------