[Eeglablist] reading only half of large .cnt file
mmiyakoshi at ucsd.edu
Wed Oct 29 13:27:23 PDT 2014
This looks like an issue that the reporter has already investigated quite
thoroughly (thank you, Kathrin). Could you look into it?
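For what it's worth, the reported numbers (17,000,000 samples total, only about
8,000,000 read, file size about 8 GB) are consistent with a 32-bit overflow of a
byte offset somewhere in the read path: an offset past 4 GB wraps around, and the
sample count derived from it comes out at roughly half. A minimal sketch of that
arithmetic, using hypothetical header and channel sizes (these are NOT the actual
loadcnt.m variables):

```python
# Hypothetical illustration of a 32-bit offset wraparound for a .cnt
# file larger than 4 GB.  All sizes below are made-up round numbers.

HEADER_BYTES = 10_000        # hypothetical total header size
BYTES_PER_SAMPLE = 500       # hypothetical: 125 channels x 4 bytes each
TRUE_SAMPLES = 17_000_000

true_offset = HEADER_BYTES + TRUE_SAMPLES * BYTES_PER_SAMPLE   # ~8.5 GB
stored_offset = true_offset % 2**32      # what a 32-bit field can hold

# Sample count computed from the wrapped offset: roughly half the truth.
nums_wrong = (stored_offset - HEADER_BYTES) // BYTES_PER_SAMPLE
print(nums_wrong)            # about 8.4 million, not 17 million

# Possible workaround sketch: unwrap the offset against the real file
# size (in practice, file_size = os.path.getsize('data.cnt')).
file_size = true_offset
offset = stored_offset
while offset + 2**32 <= file_size:
    offset += 2**32
nums_fixed = (offset - HEADER_BYTES) // BYTES_PER_SAMPLE
print(nums_fixed)            # recovers 17,000,000
```

Whether the wrap happens in the file header as written by Scan 4.5 or in how
loadcnt.m computes nums would need checking against the actual code, but the
ratio of samples read to samples expected matches this failure mode.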
On Thu, Oct 23, 2014 at 4:06 PM, Kathrin Müsch <kathrin.muesch at gmail.com> wrote:
> Dear all,
> I am trying to load a large 32-bit .cnt file acquired with Neuroscan’s
> Scan 4.5 software (5000 Hz sampling rate), but only about half of my
> data is loaded. The data file is indeed very large (8 GB).
> The total number of samples in the datafile is read out correctly
> (h.numsamples in line 259 = 17,000,000), but only 8,000,000 samples are read
> (because of the specification of “nums” in line 327). I also get an error
> message when I try to manually set sample1 and ldnsamples to the desired
> values because it falsely assumes that my last sample is 8,000,000.
> Is this a bug in the script because nums (line 327) is set to an incorrect
> value or do I need to find a workaround? I also tried to use the memmapfile
> option without any luck.
> Any help would be appreciated,
> Kathrin Müsch, Ph.D.
> Department of Psychology
> University of Toronto
> Toronto, Canada
> Eeglablist page: http://sccn.ucsd.edu/eeglab/eeglabmail.html
Swartz Center for Computational Neuroscience
Institute for Neural Computation, University of California San Diego