<html xmlns:v="urn:schemas-microsoft-com:vml" xmlns:o="urn:schemas-microsoft-com:office:office" xmlns:w="urn:schemas-microsoft-com:office:word" xmlns:m="http://schemas.microsoft.com/office/2004/12/omml" xmlns="http://www.w3.org/TR/REC-html40">
<head>
<meta http-equiv=Content-Type content="text/html; charset=utf-8">
<meta name=Generator content="Microsoft Word 12 (filtered medium)">
<style>
<!--
/* Font Definitions */
@font-face
{font-family:"Cambria Math";
panose-1:2 4 5 3 5 4 6 3 2 4;}
@font-face
{font-family:Calibri;
panose-1:2 15 5 2 2 2 4 3 2 4;}
/* Style Definitions */
p.MsoNormal, li.MsoNormal, div.MsoNormal
{margin:0in;
margin-bottom:.0001pt;
font-size:11.0pt;
font-family:"Calibri","sans-serif";}
a:link, span.MsoHyperlink
{mso-style-priority:99;
color:blue;
text-decoration:underline;}
a:visited, span.MsoHyperlinkFollowed
{mso-style-priority:99;
color:purple;
text-decoration:underline;}
span.EmailStyle17
{mso-style-type:personal-compose;
font-family:"Calibri","sans-serif";
color:windowtext;}
.MsoChpDefault
{mso-style-type:export-only;}
@page Section1
{size:8.5in 11.0in;
margin:1.0in 1.0in 1.0in 1.0in;}
div.Section1
{page:Section1;}
-->
</style>
<!--[if gte mso 9]><xml>
<o:shapedefaults v:ext="edit" spidmax="1026" />
</xml><![endif]--><!--[if gte mso 9]><xml>
<o:shapelayout v:ext="edit">
<o:idmap v:ext="edit" data="1" />
</o:shapelayout></xml><![endif]-->
</head>
<body lang=EN-US link=blue vlink=purple>
<div class=Section1>
<p class=MsoNormal>I’ve been looking (so far in vain) for a precedent
or algorithm for determining a reasonable contiguity threshold for time-frequency
analysis. My understanding is that the zero-masking done by newtimef() is
on a pixel-by-pixel (resel-by-resel?) basis, showing only those pixels where the
deviation from baseline is statistically significant. I’m wondering
whether anyone is using the added constraint of a contiguity threshold to
mask even those pixels with statistically significant deviations, by requiring
multiple consecutive (in time and/or frequency) significant pixels
before a time-frequency perturbation is considered “significant.”
Two alternatives I’m imagining are: (1) a theoretically determined threshold
in terms of time and frequency (e.g., the perturbation must span 50 ms and/or 5 Hz) that
is independent of the temporal/frequency resolution, and (2) an algorithmically
determined threshold based on the dimensions of the
time-frequency matrix (e.g., the joint probability of N consecutive
time bins and N consecutive frequency bins is less than alpha). Any comments,
recommendations, and/or references would be greatly appreciated.<o:p></o:p></p>
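<p class=MsoNormal>For concreteness, here is a rough sketch (in Python/SciPy, since I don’t have EEGLAB code for this) of what I mean by alternative (2), followed by the masking step. The run length N comes from requiring alpha^N to fall below a joint alpha, under the admittedly unrealistic assumption that neighboring bins are independent; the mask then keeps only significant pixels that belong to a sufficiently large contiguous cluster. The function names and the scipy.ndimage approach are my own illustration, not anything built into newtimef().</p>

```python
import math

import numpy as np
from scipy import ndimage


def min_run_length(alpha_pixel, alpha_joint):
    """Smallest N such that alpha_pixel**N <= alpha_joint.

    Under the (strong, and surely violated) assumption that neighboring
    time/frequency bins are independent, alpha_pixel**N is the joint
    probability of N consecutive bins all being spuriously significant.
    """
    return math.ceil(math.log(alpha_joint) / math.log(alpha_pixel))


def contiguity_mask(sig, min_size):
    """Keep significant pixels only when they lie in a contiguous cluster.

    sig      : boolean (freqs x times) matrix of per-pixel significance
    min_size : minimum number of contiguous significant pixels to retain
    """
    labels, n_clusters = ndimage.label(sig)  # 4-connectivity by default
    keep = np.zeros_like(sig, dtype=bool)
    for i in range(1, n_clusters + 1):
        cluster = labels == i
        if cluster.sum() >= min_size:
            keep |= cluster
    return keep


# Example: with per-pixel alpha = 0.05, requiring a joint probability
# below 0.001 implies at least 3 consecutive significant bins.
n = min_run_length(0.05, 0.001)

# Toy significance matrix: one 6-pixel cluster plus one isolated pixel;
# the isolated pixel is masked out, the cluster survives.
sig = np.zeros((10, 10), dtype=bool)
sig[2:4, 2:5] = True   # 2x3 contiguous cluster
sig[8, 8] = True       # lone significant pixel
masked = contiguity_mask(sig, min_size=n)
```

<p class=MsoNormal>Obviously the independence assumption overstates how stringent this really is, which is part of why I’m asking about precedents.</p>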
<p class=MsoNormal><o:p> </o:p></p>
<p class=MsoNormal>Thanks!<o:p></o:p></p>
<p class=MsoNormal>Paul<o:p></o:p></p>
<p class=MsoNormal><o:p> </o:p></p>
</div>
</body>
</html>