Abstract:
[en] In this thesis we demonstrate that direct measurement and comparison across subjects of the surface area of the cerebral cortex at a fine scale is possible using mass-conservative interpolation methods. We present a framework for analyses of cortical surface area, as well as of any other measurement distributed across the cortex that is areal by nature, including cortical gray matter volume. The method consists of the construction of a mesh representation of the cortex, registration to a common coordinate system and, crucially, interpolation using a pycnophylactic (mass-preserving) method. Statistical analysis of surface area is performed on power-transformed data to address lognormality, and inference uses permutation methods, which can provide exact control of false positives while making only weak assumptions about the data. We further report results on approximate permutation methods that are more flexible with respect to the experimental design and nuisance variables, conducting detailed simulations to identify the best method for settings typical of imaging studies. We present a generic framework for permutation inference for complex general linear models (GLMs) when the errors are exchangeable and/or have a symmetric distribution, and show that, even in the presence of nuisance effects, these permutation inferences are powerful. We also demonstrate how inference on GLM parameters, originally intended for independent data, can be used in certain special but useful cases in which independence is violated. Finally, we show how permutation methods can be applied to combination analyses, such as those that include multiple imaging modalities, multiple acquisitions of the same modality, or simply multiple hypotheses on the same data. For this we use synchronised permutations, which allow the flexible integration of imaging data with different spatial resolutions, surface- and/or volume-based representations of the brain, as well as non-imaging data.
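To illustrate the kind of permutation inference referred to above, the following is a minimal sketch of a two-sample permutation test on a group difference. It is illustrative only, not the thesis implementation: the simple two-group design, function name, and Monte Carlo scheme are assumptions for the example.

```python
import numpy as np

def permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sample permutation test on the difference of means.

    Exchangeability of the observations under the null hypothesis
    justifies shuffling the group labels; the p-value is the
    proportion of permuted statistics at least as extreme as the
    observed one (the identity permutation is included, so p > 0).
    """
    rng = np.random.default_rng(seed)
    data = np.concatenate([a, b])
    n_a = len(a)
    observed = a.mean() - b.mean()
    count = 1  # the identity permutation counts as one
    for _ in range(n_perm - 1):
        perm = rng.permutation(data)
        stat = perm[:n_a].mean() - perm[n_a:].mean()
        if abs(stat) >= abs(observed):
            count += 1
    return observed, count / n_perm
```

Because the p-value is computed from the data's own permutation distribution, no parametric assumption on the error distribution is needed, which is what makes the control of false positives exact under exchangeability.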
For the problem of joint inference, we propose a modification of the Non-Parametric Combination (NPC) methodology such that, instead of a two-phase algorithm with large data-storage requirements, the inference can be performed in a single phase, with more reasonable computational demands. We also evaluate various combining methods and identify those that provide the best control of error rates together with the highest power. We show that one of these, the method of Tippett, provides a link between correction for the multiplicity of tests and their combination.