Computational protocol: Reduced preference for social rewards in a novel tablet-based task in young children with Autism Spectrum Disorders

Similar protocols

Protocol publication

[…] All methods were carried out in accordance with the relevant guidelines and regulations specified by the Research Ethics Committee. The child was seated in a comfortable chair in front of an iPad positioned on a support inclined at 45 degrees to the table. The experimenter, seated facing the child, started the training trial by saying “[Child’s name], look!” and pressed one of the two buttons on the screen with the index finger while naming the button color. The aim of this training trial was to ensure that the child grasped the contingency between the button and the image. When the stimulus (either a slide or a guitar) appeared on the screen, the experimenter verbally labeled the object. The experimenter then pressed the other button, naming its color, and labelled the other image on the screen. Next, the experimenter invited the child to do the same, saying “Now your turn!”. If the child did not touch the screen after the experimenter’s verbal prompt, the experimenter physically prompted the child to touch the screen button once, leaving the last trial of the pilot session to be completed independently by the child. If the child did not press the button independently during the last pilot trial, the test phase was not conducted, on the assumption that the child did not have adequate fine motor and/or cognitive skills to perform the task. If the child performed the pilot trial independently at least once after the modeling, the test trial was conducted without giving the child any further verbal prompt.

Each image was presented for 3 seconds immediately after the child touched the button on the tablet screen. A screensaver then followed for 3 seconds, after which the two buttons were presented again at a random position on the screen.
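The trial sequence described above (button press, 3-second image, 3-second screensaver, then re-randomized button positions) can be sketched as follows. The two-position layout, label names, and `get_press` handler are illustrative assumptions; the protocol does not specify the implementation.

```python
import random

IMAGE_SECONDS = 3        # image shown for 3 s after the button press
SCREENSAVER_SECONDS = 3  # screensaver shown before the next trial

def randomize_buttons(positions=("left", "right")):
    """Assign the social/nonsocial buttons to fresh random positions on
    each trial (hypothetical two-position layout)."""
    shuffled = list(positions)
    random.shuffle(shuffled)
    return dict(zip(("social", "nonsocial"), shuffled))

def run_trial(get_press):
    """One test trial: randomize the layout, wait for a press, then the
    image and screensaver would be displayed. `get_press` stands in for
    the touch handler and returns 'social' or 'nonsocial'."""
    layout = randomize_buttons()
    choice = get_press(layout)
    # display image for IMAGE_SECONDS, then screensaver for
    # SCREENSAVER_SECONDS (rendering omitted in this sketch)
    return {"choice": choice, "layout": layout}
```

Re-randomizing the layout on every trial is what decouples button preference from location preference, as the protocol notes below.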
The variable spatial positioning of the buttons ensured that the toddlers were expressing a preference for a specific button rather than for a specific spatial location. The number and identity (social/nonsocial) of the button presses were recorded. In addition, video footage of the session was recorded using both the tablet webcam (close-up of the child’s face) and a second external camera (recording the child’s face, the tablet, and the experimenter’s face). Before commencing the main experiment, a separate block of 4 trials, identical to the test trials but using two different black-and-white images, was administered to familiarize the child with the task. The examiner maintained a neutral face throughout the test trials so as not to bias social engagement.

Task understanding was coded from the video footage for all children by the experimenter according to the following criteria: a) active searching (through visual exploration of the screen and/or pointing gestures) for the button corresponding to the chosen image; b) vocalizations related to the button color-image association. In a randomly chosen subgroup of n = 15 children from this sample, these behaviours were blind-coded by two independent coders, who were found to be mutually reliable (Cohen’s Kappa = 0.85). Furthermore, child behaviour directed both to the image on the screen and to the examiner was analysed through video coding. The following behaviours were manually coded from the video footage for all trials: eye contact with the experimenter, smiles directed to the image, other facial expressions directed to the image, pointing gestures to the image, and vocalizations during image presentation. Inter-rater reliability for coding each of these behaviours ranged from 0.85 to 1.

Five children (4 ASD and 1 TD aged 14 months) did not pass the pilot phase and were not administered the test phase, whilst the performance developmental quotient (PDQ) subscore of the GMDS was not available for one TD child.
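The Cohen’s Kappa statistic used for the inter-rater reliability figures above can be computed with a short stand-alone function; the coder labels in the sketch are illustrative, not the protocol’s actual coding scheme.

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's Kappa for two coders' paired categorical judgments:
    (observed agreement - chance agreement) / (1 - chance agreement)."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement from each coder's marginal category frequencies
    freq_a = Counter(coder_a)
    freq_b = Counter(coder_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n**2
    return (observed - expected) / (1 - expected)
```

Kappa is 1 for perfect agreement and 0 when agreement is no better than chance, so the reported values of 0.85 and above indicate strong inter-rater reliability.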
Additionally, one child with ASD did not complete the control task (with scrambled images) and was excluded from the relevant analysis. [...] Statistical analyses were conducted using IBM SPSS Statistics 20 and R (http://www.r-project.org/). Relative preference for social images was computed as a ratio ranging from 0 to 1 (number of social button touches / total number of button touches). Only presses on the buttons were counted in the denominator; random touches to the screen off the buttons were excluded. Between-group comparisons of relative preference for social images, as well as correlation analyses with behavioural responses, were examined in n = 58 children (n = 21 children with ASD and n = 37 TD children). All analyses were adjusted for PDQ and gender, since TD children had significantly higher PDQ than children with ASD and the male-to-female ratio was not matched across the two groups. One-tailed p values are presented for all inferential statistics, in keeping with the directional nature of the hypotheses. Since the behavioural variables showed significant deviation from normality, Spearman rank correlation (controlling for PDQ and gender) was used to assess the relationship between image preference and behavioural responses in the ASD and TD groups separately. […]
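The preference ratio defined above can be sketched in a few lines; the touch-log labels (`"social"`, `"nonsocial"`, and a catch-all for off-button touches) are a hypothetical format, not taken from the protocol.

```python
def social_preference(touches):
    """Relative preference for social images: social button presses
    divided by all button presses (a ratio from 0 to 1). Off-button
    touches are excluded from the denominator, per the protocol."""
    presses = [t for t in touches if t in ("social", "nonsocial")]
    if not presses:
        return None  # no valid button presses recorded
    return sum(t == "social" for t in presses) / len(presses)
```

For example, a child who pressed the social button three times and the nonsocial button once would score 0.75, while a stray touch off both buttons would leave the ratio unchanged.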

Pipeline specifications

Software tools SPSS, R
Application Miscellaneous
Organisms Homo sapiens