[Eeglablist] readbdf - big data set - convert to single?
Davide Baldo
davidebaldo84 at gmail.com
Thu Aug 30 00:12:55 PDT 2012
Thanks a lot for your answers!
@Chris: When I convert, the EEG data (DAT.Record) are still double, and the
following command takes a lot of time and RAM to run. With the conversion
to single precision it becomes much faster:
if rem(Mode,2)==0 % Autocalib
DAT.Record=[ones(RecLen*length(Records),1) DAT.Record]*EDF.Calib;
end;
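(For anyone curious about the magnitude of the difference: below is a rough NumPy analogue of that autocalibration step. The array sizes are made up for illustration, and this is only a sketch, not the actual readbdf code.)

```python
import numpy as np

# Illustrative stand-ins for RecLen*length(Records) samples and 64 channels;
# these sizes are hypothetical, not taken from readbdf.
n_samples, n_channels = 100_000, 64

rec = np.random.randn(n_samples, n_channels)          # float64, like DAT.Record
calib = np.random.randn(n_channels + 1, n_channels)   # stand-in for EDF.Calib

# The autocalibration step: prepend a column of ones, multiply by the matrix
calibrated = np.hstack([np.ones((n_samples, 1)), rec]) @ calib

# Converting to single precision halves the memory footprint
rec32 = rec.astype(np.float32)
print(rec.nbytes // rec32.nbytes)   # 2
```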
I need to load the entire set of data (to run ICA, for example). I have no
problem loading with the conversion to single precision, but I would like
to know if anyone thinks that the EEG data should not be converted from
double to single precision (I have had no problems so far).
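As a quick sanity check, here is a sketch of how small the double-to-single round-trip error is, using simulated values at a typical microvolt scale (the amplitude range is an assumption, not real EEG data):

```python
import numpy as np

# Simulated values at an EEG-like microvolt scale (assumption: roughly +/-150 uV)
x = np.random.randn(1_000_000) * 50.0     # float64
x32 = x.astype(np.float32)

# float32 carries about 7 significant digits, so the absolute error at this
# scale is on the order of 1e-5 uV, far below any EEG effect of interest
max_abs_err = np.abs(x - x32.astype(np.float64)).max()
print(max_abs_err < 1e-3)   # True
```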
@Andrew: I agree with you, 512Hz would be enough. But I prefer to record at
2048Hz and resample the data via software after importing them into
Matlab (that is why I need to load them first).
The Converter option sounds interesting, but could you please explain why
files become smaller after removing the DC offset? I believe that even
without the offset the data would still be represented in double precision,
so each sample would still need 8 bytes.
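For scale, a back-of-the-envelope RAM estimate (channel count and sampling rate are from this thread; the 10-minute recording length is a hypothetical value):

```python
# Rough RAM estimate: 64 channels at 2048 Hz (from the thread),
# over a hypothetical 10-minute recording
channels, fs, minutes = 64, 2048, 10
samples = channels * fs * 60 * minutes       # 78,643,200 samples

gb_double = samples * 8 / 1024**3            # float64: 8 bytes per sample
gb_single = samples * 4 / 1024**3            # float32: 4 bytes per sample
print(round(gb_double, 2), round(gb_single, 2))   # 0.59 0.29
```

So even before intermediate copies, a recording of that length consumes over half a gigabyte at double precision, and Matlab operations that copy the array can easily multiply that.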
(Yes, I am using a quad-core 64-bit laptop with 4GB of RAM, even though I
have the impression that it is hard to take advantage of the four
processors unless you use parallel processing.)
Thanks again!
Davide.
On Wed, Aug 29, 2012 at 8:09 PM, ANDREW HILL <andrewhill at ucla.edu> wrote:
> I would suggest downsampling first - 512 Hz (2 ms resolution in an ERP)
> should be fine, if that's sufficient for your needs.
>
> That'll speed things up remarkably :) You can use "Decimator" from
> Biosemi to do this before you import, if you want.
>
> Also, I tend to use "Converter" to move BDF files into EDF+ format before
> importing into Matlab. This essentially applies a 0.016 Hz highpass
> filter, discarding the DC offset and making much smaller files.
>
> Lastly, your CPU and operating system both need to be 64-bit versions (as
> does your Matlab) if you want to ever be able to use more than 2GB of RAM -
> I'm assuming you are running out of RAM even with a lot of it and 64-bit,
> but just in case that's another gotcha.
>
>
>
> Best,
>
> andrew
>
>
>
>
> On Today 12:49 AM, davidebaldo84 at gmail.com wrote:
>
> Dear all,
>
> Lately I have been working with an EEG dataset containing 64 channels
> (sampling rate: 2048Hz, Biosemi data).
> When importing the data into Matlab (via pop_readbdf function) I
> experience two problems:
>
> 1. It takes a lot of time to load, and afterwards the PC becomes extremely
> slow (I assume this is because of the huge amount of RAM needed).
> 2. Sometimes the PC runs out of memory.
>
>
> Thus I have modified the readbdf function, converting the EEG data from
> double to single precision (each value occupies 4 bytes instead of 8 bytes):
>
> ...
> (line 100) catch,
> warning backtrace off;
> warning('Warning: file might be incomplete');
> Records(nrec:end) = [];
> DAT.Record(RecLen*length(Records)+1:end,:) = [];
> S(nrec:end,:) = [];
> break;
> end;
>
> (line 109) end;
>
> %%%%%%% START DAVIDE MODIFICATION %%%%%%%%
>
> DAT.Record = single(DAT.Record);   % <-- CONVERTING THE EEG DATA FROM DOUBLE TO SINGLE
>
> %%%%%%% END DAVIDE MODIFICATION %%%%%%%%
>
>
> if rem(Mode,2)==0 % Autocalib
> DAT.Record=[ones(RecLen*length(Records),1) DAT.Record]*EDF.Calib;
> end;
>
> DAT.Record=DAT.Record';
>
> ...
>
>
> This way the data importing is much faster. The question is: do you think
> I could run into any problems by converting the EEG data from double to
> single precision?
>
>
> Thanks a lot,
>
>
> Davide.
>