[Eeglablist] datasets larger than 2GB

Stefan Debener s.debener at uke.uni-hamburg.de
Mon Oct 16 22:12:46 PDT 2006


Hi Doerte,
An alternative is to import the data for a subset of channels at a time, and 
later combine the low-pass filtered and downsampled datasets. We do 
this with 5 kHz data recorded inside the scanner and it works 
fine. The advantage compared to block-wise reading is that you don't 
have to worry about any potential boundary problems when recombining 
the single channels - every dataset carries the complete event information...
Best,
Stefan



Doerte K. Spring wrote:

>Dear EEGlab List,
>
>I know that at the moment there is no possibility to save datasets larger than 2^31
>bytes, which equals about 2.1 GB.
>Until now we did not have enough RAM to handle and import such big BDF files, but after
>upgrading our workstations we are able to import those huge files. However, we can't
>save them as a .set file afterwards, because they are too large for EEGLAB. Even 2.5 GB
>is too large for the HDF5WRITE option.
>Our goal is to keep the correct continuous dataset information, especially the
>latencies. If we crop the dataset into several small chunks, the resulting
>boundary events in EEGLAB are very disruptive when addressing latencies, for
>example.
>
>Is there a quick solution for saving such big files in EEGLAB? Or is there
>perhaps a better workaround that I could use instead?
>
>Many thanks in advance,
>
>Doerte
>_______________________________________________
>eeglablist mailing list eeglablist at sccn.ucsd.edu
>Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html
>To unsubscribe, send an empty email to eeglablist-unsubscribe at sccn.ucsd.edu





