[Eeglablist] Out of memory when loading a CNT
Christopher Wilke
wilk0242 at umn.edu
Wed Jun 18 15:07:55 PDT 2008
Hi Min-
Your problem is likely due to the way the different operating systems
manage memory. In Matlab, the largest variable you can create is limited
by the largest contiguous block of free memory, not by the total amount
of RAM. So even though your Windows machine has 2.5 GB of RAM, the
largest variable Matlab can actually allocate may be much, much smaller,
particularly if you have several other processes running in the background.
You can check the free contiguous blocks of memory at the Matlab prompt
by typing: system_dependent memstats
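As a minimal sketch of how you might diagnose this before importing the CNT file (assuming 32-bit Matlab on Windows, where these commands are available; the exact report format varies by Matlab version):

```matlab
% Print a report of free memory blocks, including the largest
% contiguous block (Windows-only Matlab command):
system_dependent memstats

% The `memory` function (Windows only, R2008a and later) reports the
% largest single array Matlab could allocate right now:
m = memory;
fprintf('Largest possible array: %.0f MB\n', m.MaxPossibleArrayBytes / 2^20);

% Consolidating the workspace can sometimes recover contiguous space:
pack
```

If the largest possible array is still smaller than the dataset, clearing unused variables, restarting Matlab, or importing the file in smaller chunks are the usual workarounds.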
There is a helpful web tutorial dealing with large datasets in Matlab on
the Matlab Central File Exchange
(http://www.mathworks.com/matlabcentral/fileexchange/loadFile.do?objectId=9060&objectType=file#).
Hope this helps.
Best regards,
Chris
------------------------------------------------------------------------
Subject: [Eeglablist] Out of memory when loading a CNT
From: Min Bao <ustcbm at yahoo.com.cn>
Date: Wed, 18 Jun 2008 11:44:51 +0800 (CST)
To: eeglablist at sccn.ucsd.edu
Hi experts,
My CNT file is less than 400 MB (32-bit). On an iMac with 1 GB of RAM,
the CNT file could be imported successfully. However, on my PC with
1 GB of RAM, the import ran out of memory, and when I switched to
another PC with 2.5 GB of RAM the same thing happened. So I am confused:
why would a PC with 2.5 GB of RAM work worse than an iMac with only
1 GB of RAM?
Thanks a lot for any help!
Min