[Eeglablist] datasets larger than 2GB
Doerte K. Spring
doerte at spring-west.net
Mon Oct 16 08:53:14 PDT 2006
Dear EEGlab List,
I know that at the moment there is no possibility to save datasets larger than
2^31 bytes, which equals about 2.1 GB.
Until now, we did not have enough RAM to handle and import such big BDF files,
but after upgrading our workstations we are able to import these huge files. We
cannot save them as a .set file afterwards, however, because they are too large
for EEGLAB. Even 2.5 GB is too large for the HDF5WRITE option.
Our goal is to keep the continuous dataset information intact, especially the
event latencies. If we crop the dataset into several small chunks, the
resulting boundary events in EEGLAB make addressing latencies very cumbersome,
for example.
Is there a quick way to save such big files in EEGLAB? Or is there perhaps a
better workaround that I could use instead?
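For reference, the kind of call we would need is something like the sketch
below. It bypasses pop_saveset and writes the EEG structure straight to a
MAT-file using MATLAB's HDF5-based '-v7.3' format, which is not subject to the
2^31-byte limit. This is only a sketch: it assumes a MATLAB release that
supports the '-v7.3' save flag, and the file names are placeholders.

% Sketch only: save the imported EEGLAB EEG structure with MATLAB's
% HDF5-based MAT-file format instead of a .set file.
% Assumes a MATLAB release that supports the '-v7.3' flag;
% file names below are hypothetical.
EEG = pop_biosig('huge_recording.bdf');          % import the large BDF file
save('huge_recording_v73.mat', 'EEG', '-v7.3');  % HDF5-based, allows > 2 GB

% Restoring the dataset in a later session:
tmp = load('huge_recording_v73.mat');
EEG = tmp.EEG;

The obvious drawback is that such files are not listed by EEGLAB's own
load-dataset menu, so one has to reload and re-register the structure by hand.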
Many thanks in advance,
Doerte