[Eeglablist] readbdf - big data set - convert to single?

Davide Baldo davidebaldo84 at gmail.com
Wed Aug 29 00:49:50 PDT 2012


Dear all,

Lately I have been working with an EEG dataset containing 64 channels
(sampling rate: 2048 Hz, Biosemi data).
When importing the data into Matlab (via the pop_readbdf function) I run into
two problems:

1. It takes a long time to load, and afterwards the PC becomes extremely
slow (I assume because of the huge amount of RAM needed; see the rough
estimate below).
2. Sometimes the PC runs out of memory.
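
To give an idea of the size involved, here is a rough estimate of the memory
needed for the raw data matrix (the one-hour recording length below is only an
assumption, not a property of my actual files):

% Back-of-the-envelope RAM estimate for the raw data matrix.
% The recording length is hypothetical -- adjust it to your file.
nChan   = 64;       % channels in the dataset described above
srate   = 2048;     % sampling rate in Hz
minutes = 60;       % assumed recording length
bytesPerDouble = 8;

nSamples = srate * 60 * minutes;
gb = nChan * nSamples * bytesPerDouble / 2^30;
fprintf('approx. %.1f GB as double, %.1f GB as single\n', gb, gb/2);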


Thus I have modified the readbdf function, converting the EEG data from
double to single precision (each value occupies 4 bytes instead of 8 bytes):

   ...
    catch,                                       % (line 100 of readbdf.m)
        warning backtrace off;
        warning('Warning: file might be incomplete');
        Records(nrec:end) = [];
        DAT.Record(RecLen*length(Records)+1:end,:) = [];
        S(nrec:end,:) = [];
        break;
    end;

end;                                             % (line 109 of readbdf.m)

%%%%%%% START DAVIDE MODIFICATION %%%%%%%%

DAT.Record = single(DAT.Record);   % convert the EEG data from double to single precision

%%%%%%% END DAVIDE MODIFICATION %%%%%%%%


if rem(Mode,2)==0                  % Autocalib
    DAT.Record = [ones(RecLen*length(Records),1) DAT.Record]*EDF.Calib;
end;

DAT.Record = DAT.Record';

...
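
To get a feeling for how much accuracy the conversion actually costs, one could
compare the single copy against the original double data before committing to
the change. This is only a sketch: the random matrix below stands in for the
real DAT.Record.

% Sketch: quantify the rounding error introduced by the double-to-single
% conversion. 'dataDouble' is a placeholder for DAT.Record loaded in double
% precision; substitute the real data to test on an actual file.
dataDouble = randn(10000, 64) * 100;
dataSingle = single(dataDouble);

err       = abs(double(dataSingle(:)) - dataDouble(:));
maxAbsErr = max(err);
maxRelErr = max(err ./ max(abs(dataDouble(:)), eps));
fprintf('max abs error: %g   max rel error: %g\n', maxAbsErr, maxRelErr);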


This way the data import is much faster. My question is: do you think
converting the EEG data from double to single precision could cause any
problems?


Thanks a lot,


Davide.

