[Eeglablist] paradox in whitening

Makoto Miyakoshi mmiyakoshi at ucsd.edu
Mon Mar 18 17:23:37 PDT 2013


Dear Mo,

That's not consistent with Delorme et al. (2012) in PLoS ONE. I wonder why,
too. Did you use the same data to test the two different weight matrices?

Makoto



2013/3/18 Mo Khalili <mokhal at yahoo.co.uk>

> Hi,
>
> I have a general question about the performance of ICA algorithms.
> As I understand it, FastICA estimates sources by maximizing their
> non-Gaussianity, while Infomax estimates components by minimizing the
> mutual information between sources. What puzzles me is this: when I
> calculated the mutual information between ICs, the average mutual
> information between pairs of ICs from FastICA was lower than for the
> ICs estimated by Infomax. Additionally, when I calculated the
> correlation between pairs of estimated ICs, the average absolute
> correlation between ICs from FastICA was significantly lower than for
> Infomax.
> In my view this suggests that the whitening (sphering) step in Infomax
> differs from the one in FastICA and does not perform the same way.
> However, when I look at the code of FastICA and Infomax, both use the
> same method (and functions) for whitening.
> I would appreciate any help in understanding the reason.
>
> Regards,
> Mo
>
>
>
>
>
> _______________________________________________
> Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html
> To unsubscribe, send an empty email to
> eeglablist-unsubscribe at sccn.ucsd.edu
> For digest mode, send an email with the subject "set digest mime" to
> eeglablist-request at sccn.ucsd.edu
>
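For anyone who wants to reproduce the comparison described in the quoted
message, here is a minimal sketch of the two metrics (mean absolute pairwise
correlation, and a simple histogram-based mutual information estimate between
IC pairs). The synthetic data and all names are illustrative assumptions; the
actual ICs would come from your FastICA/Infomax (runica) decompositions, as
component-by-sample matrices.

```python
import numpy as np

def mean_abs_pairwise_corr(ics):
    """Mean |correlation| over all distinct pairs of rows (ICs) of `ics`."""
    r = np.corrcoef(ics)                      # components x components
    iu = np.triu_indices_from(r, k=1)         # off-diagonal upper triangle
    return np.abs(r[iu]).mean()

def mutual_info(x, y, bins=32):
    """Plug-in MI estimate (in nats) from a 2-D histogram of two ICs."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()                          # joint probability
    px = pxy.sum(axis=1)                      # marginals
    py = pxy.sum(axis=0)
    nz = pxy > 0                              # avoid log(0)
    return (pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])).sum()

def mean_pairwise_mi(ics, bins=32):
    """Average MI over all distinct pairs of rows (ICs) of `ics`."""
    n = ics.shape[0]
    vals = [mutual_info(ics[i], ics[j], bins)
            for i in range(n) for j in range(i + 1, n)]
    return float(np.mean(vals))

# Hypothetical stand-ins for ICs from two algorithms (4 ICs, 5000 samples);
# replace with the component activations from each decomposition.
rng = np.random.default_rng(0)
ics_fastica = rng.standard_normal((4, 5000))
ics_infomax = rng.standard_normal((4, 5000))
print(mean_abs_pairwise_corr(ics_fastica), mean_pairwise_mi(ics_fastica))
print(mean_abs_pairwise_corr(ics_infomax), mean_pairwise_mi(ics_infomax))
```

Note that the histogram MI estimator is positively biased for finite samples,
so the same number of bins and samples should be used for both decompositions
when comparing them.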



-- 
Makoto Miyakoshi
JSPS Postdoctoral Fellow for Research Abroad
Swartz Center for Computational Neuroscience
Institute for Neural Computation, University of California San Diego

