Computational protocol: Functional neuroanatomy of auditory scene analysis in Alzheimer's disease

Protocol publication

[…] Brain imaging data were analysed using statistical parametric mapping software (SPM8; http://www.fil.ion.ucl.ac.uk/spm). In initial image pre-processing, the EPI functional series for each participant was realigned using the first image as a reference, and images were unwarped incorporating field-map distortion information. The DARTEL toolbox was used to spatially normalise all individual functional images to a group mean template image in Montreal Neurological Institute (MNI) standard stereotactic space. To construct this group brain template, each individual's T1-weighted MR image was first co-registered to their EPI series and segmented using DARTEL tools (New Segment), and the resulting segments were then used to estimate a group template aligned to MNI space. Functional images were smoothed using a 6 mm full-width-at-half-maximum Gaussian smoothing kernel. For the purpose of rendering statistical parametric functional maps, a study-specific mean structural brain image template was created by warping all bias-corrected native-space whole-brain images to the final DARTEL template and averaging the warped brain images.

Pre-processed functional images were entered into a first-level design matrix incorporating the five experimental conditions (NS, NI, RS, RI and the baseline silence condition), modelled as separate regressors convolved with the standard haemodynamic response function, together with six head-movement regressors generated from the realignment process. For each participant, first-level t-test contrast images were generated for the main effects of auditory stimulation [(NS + NI + RS + RI) − silence], identification of own name [(NS + NI) − (RS + RI)] and segregation of auditory foreground from background [(NS + RS) − (NI + RI)]. In the absence of a specific output task during scanning, we use ‘identification’ here to indicate specific processing of own-name identity in relation to an acoustically similar perceptual baseline. In addition, contrast images were generated for the interaction of identification and segregation processes [(NS − RS) − (NI − RI)]: we argue that this interaction captures the computational process that supports the cocktail party effect proper. Both ‘forward’ and ‘reverse’ contrasts were assessed in each case. Contrast images for each participant were entered into a second-level random-effects analysis in which effects within each experimental group, and between the healthy control and AD groups, were assessed using voxel-wise t-test contrasts.

Contrasts were assessed at a peak-voxel statistical significance threshold of p < 0.05 after family-wise error (FWE) correction for multiple voxel-wise comparisons within two anatomical small volumes of interest specified by our prior hypotheses. These regional volumes were created using MRIcron (http://www.mccauslandcenter.sc.edu/mricro/mricron/) and comprised the temporo-parietal junction (including superior temporal and adjacent inferior parietal cortex posterior to Heschl's gyrus and supramarginal gyrus; the putative substrate for auditory scene analysis) and the superior temporal gyrus anterior and lateral to Heschl's gyrus (the putative substrate for name identity coding). For the purpose of assessing overall auditory stimulation, a combined regional volume additionally including Heschl's gyrus was used for the contrast [(NS + NI + RS + RI) − silence]. […]
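To make the smoothing step concrete, the following is a minimal Python sketch (not the SPM8 implementation used in the protocol) of 6 mm full-width-at-half-maximum Gaussian smoothing applied to a normalised 4D EPI series with nibabel and SciPy. The file names are hypothetical placeholders.

    import numpy as np
    import nibabel as nib
    from scipy.ndimage import gaussian_filter

    FWHM_MM = 6.0                                  # smoothing kernel width from the protocol

    img = nib.load("wepi_4d.nii")                  # hypothetical DARTEL-normalised EPI series
    data = img.get_fdata()
    voxel_sizes = img.header.get_zooms()[:3]       # voxel dimensions in mm (x, y, z)

    # Convert FWHM in mm to a per-axis Gaussian sigma in voxels:
    # sigma = FWHM / (2 * sqrt(2 * ln 2) * voxel_size)
    sigmas = [FWHM_MM / (np.sqrt(8 * np.log(2)) * vs) for vs in voxel_sizes]

    smoothed = np.empty_like(data)
    for t in range(data.shape[3]):                 # smooth each volume in space only, not time
        smoothed[..., t] = gaussian_filter(data[..., t], sigma=sigmas)

    nib.save(nib.Nifti1Image(smoothed, img.affine, img.header), "swepi_4d.nii")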
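The first-level model can be illustrated in the same spirit. The sketch below assumes hypothetical block timings, repetition time and motion parameters, and uses a standard double-gamma approximation to the canonical haemodynamic response rather than SPM's exact basis function. It shows how the five condition regressors and six realignment parameters would be assembled into a design matrix, and how the four t-contrast vectors map onto the comparisons listed above (the sound-versus-silence contrast is written in its balanced form).

    import numpy as np
    from scipy.stats import gamma

    TR, N_SCANS = 3.0, 200                          # hypothetical acquisition parameters
    CONDITIONS = ["NS", "NI", "RS", "RI", "silence"]

    def canonical_hrf(tr, duration=32.0):
        """Double-gamma approximation to the canonical HRF, sampled at the TR."""
        t = np.arange(0, duration, tr)
        return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

    def condition_regressor(onsets, durations, tr, n_scans):
        """Boxcar for one condition convolved with the canonical HRF."""
        box = np.zeros(n_scans)
        for on, dur in zip(onsets, durations):
            box[int(on / tr): int((on + dur) / tr)] = 1.0
        return np.convolve(box, canonical_hrf(tr))[:n_scans]

    # events[c] = (onsets in s, durations in s) for condition c -- placeholder timings
    events = {c: (np.arange(i * 30, N_SCANS * TR, 150), np.full(4, 30.0))
              for i, c in enumerate(CONDITIONS)}
    motion = np.random.randn(N_SCANS, 6)            # stand-in for the six realignment parameters

    X = np.column_stack(
        [condition_regressor(*events[c], TR, N_SCANS) for c in CONDITIONS]
        + [motion] + [np.ones(N_SCANS)]             # movement regressors + constant term
    )

    # Contrast vectors over columns [NS, NI, RS, RI, silence, 6 x motion, constant]
    pad = np.zeros(7)
    contrasts = {
        "sound_vs_silence":             np.r_[1,  1,  1,  1, -4, pad],
        "own_name_identification":      np.r_[1,  1, -1, -1,  0, pad],
        "foreground_segregation":       np.r_[1, -1,  1, -1,  0, pad],
        "identification_x_segregation": np.r_[1, -1, -1,  1,  0, pad],
    }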
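Finally, a hedged sketch of the second-level comparison within an anatomical small volume. SPM derives FWE-corrected peak-voxel p-values from random field theory; the stand-in below instead applies a conservative Bonferroni correction over the voxels inside the mask, purely to illustrate the logic of small-volume correction. File names, group sizes and the mask image are placeholders.

    import numpy as np
    import nibabel as nib
    from scipy import stats

    # Hypothetical ROI image, e.g. a temporo-parietal junction mask drawn in MRIcron
    mask = nib.load("tpj_mask.nii").get_fdata() > 0

    def load_contrasts(paths):
        """Stack one first-level contrast image per participant: (subjects x voxels in mask)."""
        return np.vstack([nib.load(p).get_fdata()[mask] for p in paths])

    controls = load_contrasts([f"con_control_{i:02d}.nii" for i in range(1, 21)])  # placeholder lists
    patients = load_contrasts([f"con_AD_{i:02d}.nii" for i in range(1, 21)])

    t, p = stats.ttest_ind(controls, patients, axis=0)      # voxel-wise two-sample t-test
    p_one_sided = np.where(t > 0, p / 2.0, 1.0 - p / 2.0)   # test the controls > AD direction

    n_voxels = int(mask.sum())
    survives = p_one_sided * n_voxels < 0.05                # Bonferroni within the small volume
    print(f"{int(survives.sum())} of {n_voxels} voxels survive correction")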

Pipeline specifications

Software tools: SPM, MRIcron
Applications: Magnetic resonance imaging, Functional magnetic resonance imaging
Organisms: Homo sapiens
Diseases: Alzheimer Disease