[Eeglablist] Decimating/processing a large file

Robin Cash robin.cash at monash.edu
Fri Feb 22 04:29:18 PST 2019


Hi,
I have several large files (~7 GB each) that I am having trouble processing
and was wondering whether I could get some advice.

I managed to convert these files from .cnt to .set, but am unable to
decimate them (to reduce file size) as I get the error below.

"Calling input name from the indexing overload subasgn is not supported.
(Error occurred in function subasgn() at line 47).”

I don't think it is a RAM limitation, as the PC has 64 GB of RAM.

This happens after changing the EEGLAB memory options, converting to a .set
file, and attempting to decimate.
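
For reference, this is roughly what I changed (a minimal sketch; the option
names are the ones I found in eeg_options.m on my install, so please
double-check them on yours):

    % Memory-related EEGLAB options toggled before loading the data.
    % Option names taken from eeg_options.m; verify them on your installation.
    pop_editoptions('option_single',     1, ...  % store EEG.data as single precision
                    'option_memmapdata', 1, ...  % memory-map EEG.data to a file on disk
                    'option_storedisk',  1);     % keep at most one dataset in memory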

Are there any workarounds? I was thinking I could convert to .mat and then
use decimate.m, resample.m, or downsample.m, and then convert back to .set.
However, I noticed that some of MATLAB's decimation functions include
anti-aliasing filters that might differ from what EEGLAB uses in
pop_resample. A rough sketch of what I have in mind is below.
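
To make the idea concrete, here is a sketch of the manual route (the file
name and target rate are just examples; MATLAB's resample() applies its own
anti-aliasing FIR filter, which is exactly the part I'm not sure matches
pop_resample, and event latencies would still need fixing by hand):

    % Rough sketch: downsample EEG.data channel-by-channel with resample().
    % Note: pop_resample would also update event latencies; this sketch does not.
    EEG     = pop_loadset('filename', 'subject01.set');  % hypothetical file name
    newRate = 250;                                        % example target rate (Hz)
    [p, q]  = rat(newRate / EEG.srate);

    nChans  = size(EEG.data, 1);
    newPnts = ceil(size(EEG.data, 2) * p / q);
    newData = zeros(nChans, newPnts, 'single');
    for c = 1:nChans
        newData(c, :) = resample(double(EEG.data(c, :)), p, q);
    end

    EEG.data  = newData;
    EEG.srate = newRate;
    EEG.pnts  = newPnts;
    EEG.xmax  = EEG.xmin + (EEG.pnts - 1) / EEG.srate;
    EEG       = eeg_checkset(EEG);
    EEG       = pop_saveset(EEG, 'filename', 'subject01_ds.set');  % hypothetical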

I could probably do this in Neuroscan too, but there are too many files to
process each one manually; ideally I'd script it, something like the loop
below.
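
In case it helps, this is the kind of batch loop I was hoping to run once
the decimation error is sorted (folder paths are hypothetical, and it
assumes pop_resample works on these files):

    % Hypothetical batch loop over .set files using EEGLAB's own pop_resample,
    % so the anti-aliasing filtering matches what the GUI would apply.
    inDir  = '/data/raw_set';           % hypothetical input folder
    outDir = '/data/downsampled_set';   % hypothetical output folder
    files  = dir(fullfile(inDir, '*.set'));

    for k = 1:numel(files)
        EEG = pop_loadset('filename', files(k).name, 'filepath', inDir);
        EEG = pop_resample(EEG, 250);   % example target rate (Hz)
        EEG = eeg_checkset(EEG);
        pop_saveset(EEG, 'filename', files(k).name, 'filepath', outDir);
    end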

Any advice would be much appreciated. Presumably others have had this issue
before and found a good workaround.

Many thanks!