[Eeglablist] general solution for memory problems?
Andreas Romeyke
romeyke at cbs.mpg.de
Thu May 31 23:54:57 PDT 2007
Hi Brian,
Brian Murphy wrote:
> But, presumably this could be automated. So, perhaps something like this
> would work:
> - determine the largest contiguous piece of RAM available to Matlab (I
> understand from previous posts that it is contiguous RAM rather than
> total RAM that matters), let's call this A(vailable)
> - estimate the total memory needed for the final imported data (in my
> experience, about three times the raw filesize) - F(inal)
> - then the temporary space available for conversion T(emp) is A-F
> - then determine how big a fraction of the input file can be dealt with
> at a time - divide the Temporary space by the peak conversion factor C
> (again, in my experience, more than 10 times the original filesize is
> needed in RAM - what is other people's experience?): T/C
> - then split the time-course of the original file (of size O) into
> O/(T/C) pieces.
> - load these pieces in
> - merge them
> - discard the pieces
>
> Any comments? I am new to Matlab and EEGLAB, so much or all of what I've
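The arithmetic in your list can be sketched roughly like this (the 3x and 10x factors come from your own estimates; the function name, variable names and error handling are just my invention, not anything EEGLAB provides):

```python
import math

def plan_chunks(available_bytes, raw_file_bytes,
                final_factor=3.0,    # F is roughly 3 x raw filesize (your estimate)
                peak_factor=10.0):   # peak conversion needs roughly 10 x raw size
    """Estimate into how many pieces to split an EEG file so the
    conversion fits into the available contiguous memory A."""
    final = final_factor * raw_file_bytes      # F: memory for the imported data
    temp = available_bytes - final             # T = A - F: space left for conversion
    if temp <= 0:
        raise MemoryError("not even enough memory for the final imported data")
    piece_bytes = temp / peak_factor           # raw bytes convertible at once: T/C
    n_pieces = math.ceil(raw_file_bytes / piece_bytes)   # O / (T/C) pieces
    return n_pieces, piece_bytes
```

For example, with 1 GB of contiguous memory and a 100 MB raw file this gives 2 pieces of up to 70 MB raw data each.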
Under Linux my workaround is to add swapspace temporarily.
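For example, something along these lines (the commands need root, so this sketch only prints the plan instead of executing it; the file path and the 2 GB size are arbitrary examples):

```shell
SWAPFILE=/tmp/eeglab_extra_swap   # example location for the temporary swap file
SIZE_MB=2048                      # 2 GB of extra swap, adjust as needed

# Build the command plan; run these by hand (as root) if they look right.
PLAN=$(cat <<EOF
dd if=/dev/zero of=$SWAPFILE bs=1M count=$SIZE_MB
chmod 600 $SWAPFILE
mkswap $SWAPFILE
swapon $SWAPFILE
# ... run the EEGLAB import here, then release the extra swap again:
swapoff $SWAPFILE
rm $SWAPFILE
EOF
)
echo "$PLAN"
```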
In the near(?) future I expect that EEGlab will also run under GNU
Octave (http://www.octave.org). In my experience, Octave does not
suffer from this kind of memory problem the way Matlab does, I think.
Bye Andreas
--
Software Developer / Dipl. Inform. (FH)
Max Planck Institute for Human Cognitive and Brain Sciences
Department of Psychology
Stephanstr. 1a, 04103 Leipzig, Germany