IsTwoTailed option in y_GRF_Threshold

First of all, thanks for all the MATLAB scripts you have made - they are really helpful.

I'm using y_GRF_Threshold.m to do FSL-style cluster-extent thresholding in MATLAB. As far as I know, y_GRF_Threshold.m is a MATLAB version of "easythresh" in FSL. One interesting thing about the function is that it has an option for one/two-tailed thresholding; I had never heard of a one/two-tailed approach in cluster-extent thresholding before. If the option is turned on, the function uses the cluster-defining threshold (i.e., the voxel-wise height p value) divided by 2, and it also uses half of the corrected cluster-level p value. It then calculates the cluster extent size corresponding to that halved corrected p value.

Is this the same process that easythresh uses? That is, does easythresh have a two-tailed option as well? One-tailed and two-tailed thresholding could produce big differences in the results, I think... However, I don't really understand the reason for dividing the cluster-level p value by two. As far as I understand, we use the distribution of the maximum cluster size in only one direction (i.e., we ask whether the cluster extent is bigger, not smaller, than expected at p = .05 given the primary, cluster-defining p value). So I'm curious to hear your understanding of this. Any comments would be really appreciated!


Hi Wani,

Thanks a lot for your interest in and testing of my code!

You are correct: y_GRF_Threshold.m does FSL-style cluster-extent thresholding. And there is NO "two-tailed" mode in easythresh.

I added the "two-tailed" option because when people compare AD and controls, they are usually interested in BOTH the increased and the decreased activity in AD compared with controls (i.e., a "TWO-TAILED" statistical analysis).

In FSL, I guess many people will first look only at AD>controls, setting Z>2.3 and cluster-level p<0.05, and then look at AD<controls (or reverse the sign of the contrast), again with Z>2.3 and cluster-level p<0.05. However, taken together, those two analyses only ensure p<0.1.
(In addition, as noted in my code: FSL's Z>2.3 corresponds to a one-tailed VoxelPThreshold = 0.0107; |Z|>2.3 corresponds to a two-tailed VoxelPThreshold = 0.0214.)
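That Z-to-p correspondence is easy to check numerically. A quick standard-library sketch (Python rather than MATLAB, and the function name is mine, not from the script):

```python
from math import erfc, sqrt

def z_to_one_tailed_p(z):
    """Upper-tail p value for a standard normal Z score: P(Z > z)."""
    return 0.5 * erfc(z / sqrt(2.0))

one_tailed = z_to_one_tailed_p(2.3)   # P(Z > 2.3)
two_tailed = 2.0 * one_tailed         # P(|Z| > 2.3)
print(round(one_tailed, 4), round(two_tailed, 4))  # 0.0107 0.0214
```

In MATLAB the same check is `1 - normcdf(2.3)` for the one-tailed value.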

Thus, I decided to add a "two-tailed" option (e.g., with VoxelPThreshold = 0.01 and cluster-level p<0.05):
1. Look at AD>controls: set Z>2.576 (two-tailed p<0.01) and cluster-level p<0.025.
2. Look at AD<controls: set Z<-2.576 (two-tailed p<0.01) and cluster-level p<0.025.
3. Adding 1 and 2 together ensures the total p<0.05.
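The per-tail bookkeeping in those steps can be sketched as follows (a Python illustration using only the standard library; `per_tail_thresholds` is a hypothetical helper, not part of the script):

```python
from statistics import NormalDist

def per_tail_thresholds(voxel_p_two_tailed, cluster_p_total):
    """Split a two-tailed analysis into two one-tailed passes.

    Each tail gets half the voxel-level p (so the |Z| cut-off matches the
    two-tailed height threshold) and half the cluster-level p (a
    Bonferroni-style split over the two tails).
    """
    p_tail = voxel_p_two_tailed / 2.0              # e.g. 0.01 -> 0.005 per tail
    z_height = NormalDist().inv_cdf(1.0 - p_tail)  # Z cut-off for that tail
    cluster_p_tail = cluster_p_total / 2.0         # e.g. 0.05 -> 0.025 per tail
    return z_height, cluster_p_tail

z, cp = per_tail_thresholds(0.01, 0.05)
print(round(z, 3), cp)  # 2.576 0.025
```

Running the two one-tailed passes at these settings bounds the total family-wise error at 0.025 + 0.025 = 0.05.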

However, dividing by 2 (using p<0.025) works like a Bonferroni correction, i.e., it is a little strict. I don't know whether this is the most appropriate way, but I do think some kind of correction is needed if you look at both AD>controls and AD<controls.

Given that it is a little strict (0.05/2, like a Bonferroni correction), in the upcoming version of REST (where y_GRF_Threshold will be released as rest_GRF_Threshold), I am making "One-Tailed" the default and letting users choose whether to use "Two-Tailed". If they click the "Two-Tailed" button, a pop-up window will explain what procedure will be applied.

Best,

Chao-Gan

Hi Chao-Gan.

Thanks for the reply, and I'm sorry for the delayed response. I wanted to reply sooner, but it took some time to think a bit more about this. I now understand why you divide the cluster-defining p value by 2, but I don't quite see why the cluster-level p value needs to be divided by two as well. You divided it because you are testing twice (so multiple comparisons again), but I don't think we need to test twice: we could test + and - together at once. So if we want to use .01 as the primary threshold with two-tailed thresholding, in my opinion we should use |Z| > 2.576 and cluster-level p < .05. Then, with the resulting cluster extent threshold, we can get activation and deactivation at once.

However, because I don't usually use FSL, I don't know how FSL users apply thresholds (especially with easythresh). If people usually test each contrast separately, we might need some kind of multiple-comparison correction again - but I'm not really sure.

By the way, have you confirmed that the cluster extent sizes from your scripts match the outputs from FSL and AlphaSim? I just want to be sure about the validity of the method. If they match, that's awesome. Thanks, and I look forward to hearing from you again.

Hi Wani,

Initially, I also wanted to threshold the map at |Z| > 2.576 and then threshold the clusters by size at cluster-level p<0.05.

However, after carefully reading some GRF papers (e.g., K. Friston, K. Worsley, R. Frackowiak, J. Mazziotta, and A. Evans. Assessing the significance of focal activations using their spatial extent. Human Brain Mapping, 1:214-220, 1994), I am not sure the cluster sizes derived from the equations generalize to a two-tailed version. The equations were developed for a one-tailed test, e.g., E[N] = S*Phi(-u) (the expected number of suprathreshold voxels), and may need some modification before the cluster sizes can be used in a two-tailed setting.
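As a small numeric illustration of that one-tailed expectation E[N] = S*Phi(-u), here is a standard-library Python sketch (the search-volume size S and threshold u below are made-up illustration values, not from any dataset):

```python
from math import erfc, sqrt

def expected_suprathreshold_voxels(S, u):
    """E[N] = S * Phi(-u): the expected number of voxels exceeding the
    one-tailed height threshold u under the null, for a search volume
    of S voxels (Friston et al., 1994 notation from the text)."""
    phi_minus_u = 0.5 * erfc(u / sqrt(2.0))  # Phi(-u) = P(Z > u)
    return S * phi_minus_u

# Hypothetical example: a 100000-voxel search volume thresholded at u = 2.3.
print(round(expected_suprathreshold_voxels(100000, 2.3)))  # 1072
```

Because Phi(-u) only counts the upper tail, a two-tailed analysis does not drop straight out of this formula, which is the concern raised above.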

This is why I used cluster-level p/2, as a conservative approach. Perhaps you could also work through the equations and derive a version suitable for a two-tailed test.

The y_GRF_Threshold script (now available as rest_GRF_Threshold in REST) gives the same cluster sizes as FSL. The AlphaSim program is almost the same as AFNI's, with slight differences because it is a random-simulation test. If you could perform some outside tests, that would be very helpful. :)

Best,

Hi, Dr. Yan,

During GRF correction, could you provide another option, i.e., estimating the smoothness (FWHM) from the original images rather than from the statistical images? My results show that estimating it from the statistical images is too strict to yield reasonable results.

Best wishes!

Hi,

Thanks for your post. As pointed out in several previous posts (some in Chinese), I am aware of this issue. I plan to revise the Statistical Analysis Module thoroughly in May. The smoothness will then be estimated from the residual files and passed to the GRF correction.

Following the thorough revision of the Statistical Analysis Module, there will be a new course on Statistical Analysis.

I hope I will have enough time this month and that my schedule doesn't get messed up. :)

Best,

Chao-Gan