Soon, pediatricians and other health care providers will be able to install an app on their smartphone or tablet that analyzes a toddler's visual gaze to determine whether the child may be on the autism spectrum. Eventually, parents and others will be able to download it onto their own mobile devices and do the screening themselves. This new research out of Duke University has the potential to expand the reach of early screening, and therefore get possibly affected children in for detailed clinical evaluations faster.
Autism spectrum disorder (ASD) is a set of neurodevelopmental conditions characterized by a broad range of challenges with social and communication skills, varying degrees of repetitive and stereotyped behaviors, and different types of sensitivities to the environment. Autism is common: as of 2016, an estimated 1 in 54 children were affected.
In a laboratory setting, autism can be detected in children as young as six months by observing visual responses to social cues. It turns out that how young children visually engage with others in social settings affects how brain circuits responsible for social interactions develop. When these circuits don’t develop as they should, it can lead to increased challenges for successful social engagement and communication later on. By identifying potential autism early, interventions aimed at facilitating social interactions can begin to mitigate the effects of reduced social attention and the consequences it has on the development of the child.
Dr. Geraldine Dawson, one of the paper’s senior authors and the William Cleland Distinguished Professor of Psychiatry and Behavioral Sciences, and Director of the Duke Institute for Brain Sciences and the Duke Center for Autism and Brain Development, emphasized three major issues addressed by this work. First, the traditional questionnaires used to screen for autism are prone to false positives: about 50% of toddlers the questionnaires flag as possibly having autism will in fact not have it. Second, there are typically long waits to gain access to clinical experts capable of fully evaluating a child and making a definitive diagnosis, and in many parts of the country that access is limited or unavailable. To make matters worse, because of the false positive rate of the traditional screening methods, many parents put children on waitlists for testing they don’t actually need, further adding to the backlog. On the flip side, because the questionnaires tend to be long and cumbersome, some pediatricians don’t screen at all, potentially delaying access to treatments for affected children.
The third issue relates to equity of care. As Prof. Dawson explained, “The questionnaire does not perform well with families of color or families with lower educational backgrounds. So the challenge for us was how do you develop a tool that directly observes the child’s behavior, and assess whether the child is displaying autism symptoms?”
In effect, the app the Duke team is developing turns what has until now been a qualitative, subjective screening assessment with a significant amount of inherent error into a far more accurate, quantitative, and objective evaluation that relatively soon could be used by any family doctor or pediatrician. This is particularly valuable in, say, a rural setting far from a large medical center, where access to specialists may be limited. The specialized equipment and expertise needed to measure visual gaze in children would normally be confined to a clinical laboratory, which is neither widely available nor feasible for routine screening. In fact, the scientists had these constraints in mind from the beginning, working with clinicians on the front line to make sure the research applies to real-world scenarios, not just controlled lab environments.
How Engineering is Transforming Diagnosis
In a broader context, this work is just one example of the transformational contributions engineering tools are beginning to make to the study of behavior and of neurodevelopmental and neuropsychiatric disorders. Until recently, behavioral work has relied almost exclusively on subjective observations by other humans: a trained expert observes the child and subjectively scores them on a numerical scale as they perform different motor, social, and cognitive tasks.
To be sure, such expert-intensive observational screening continues to be the cornerstone that provides patients and their families with validated clinical diagnoses and access to therapeutic resources. But there exists the opportunity to do much better. There is a wide gap between the observational, subjective reporting methods of behavioral evaluations and the quantitative methods and tools that engineering and related fields can bring to bear. Technology has an important role to play here.
In the case of the Duke team’s work, Prof. Dawson joined forces with Prof. Guillermo Sapiro, the James B. Duke Distinguished Professor of Electrical and Computer Engineering, and Professor of Computer Science. Prof. Sapiro is an expert in computer vision, computer graphics, medical imaging, and machine learning. All this heavy-duty engineering is what enabled the development and testing of the algorithms and tools that make up the app.
It is what Prof. Dawson calls ‘quantitative phenotyping’. A phenotype is a set of observable and measurable characteristics and traits in an individual that result from complex interactions between genetic makeup and the environment.
For example, in the current study the researchers measured visual gaze while toddlers watched specially constructed video clips on an iPhone or iPad. But in related work, they used the same app to measure the time it took autistic children to turn their heads when their name was called out. They were able to do so with much greater sensitivity and resolution than human observers. This simple test is a valuable clinical indicator, since the reaction time a child takes to turn toward someone calling them is correlated with other behavioral and communication challenges linked to autism.
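To make the head-turn measurement concrete, here is a minimal sketch of how a reaction time might be extracted from per-frame head-pose estimates produced by a computer-vision tracker. The input format, the 25-degree yaw threshold, and the function name are all hypothetical illustrations, not the Duke team's actual pipeline.

```python
def head_turn_latency(frames, call_time, yaw_threshold=25.0):
    """Return seconds from the name call to the first frame where the
    head has rotated past yaw_threshold degrees, or None if no turn
    is detected. `frames` is a list of (timestamp_sec, yaw_deg) pairs
    from a hypothetical head-pose tracker."""
    for t, yaw in frames:
        if t >= call_time and abs(yaw) >= yaw_threshold:
            return t - call_time
    return None

# Illustrative frames sampled at 10 Hz: the head is still for the
# first second, the name is called at t = 1.0 s, and the head then
# rotates 5 degrees per frame, crossing 25 degrees at t = 1.5 s.
frames = [(i / 10, 0.0) for i in range(10)] + \
         [(1.0 + i / 10, 5.0 * i) for i in range(10)]
print(head_turn_latency(frames, call_time=1.0))  # 0.5
```

The point is not the trivial threshold logic but the resolution: frame-accurate timestamps give latencies in tenths of a second, finer than a human observer with a stopwatch could reliably score.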
Eventually, the goal is not to measure a single feature associated with a potential diagnosis of autism, but to integrate multiple features — such as the time it takes a child to turn their head when their name is called, in addition to visual gaze analysis. Such tools will offer potentially powerful diagnostic and predictive methods that don’t exist today. As Prof. Dawson put it: “If you combine gaze to social versus non-social information to gaze patterns in a conversation, your ability to accurately predict Autism increases. Imagine doing that with multiple more features that can be derived from computer vision.”
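The idea of combining features can be sketched as a simple logistic model over two hypothetical computer-vision measures. The feature names and weights below are illustrative placeholders, not fitted clinical values or the researchers' actual model.

```python
import math

def combined_screen_score(name_call_latency_s, social_gaze_fraction):
    """Combine two hypothetical features into a single 0-1 score via
    a logistic function. Longer name-call latency and a smaller
    fraction of gaze on social stimuli both raise the score.
    Weights are made-up illustrations, not clinical parameters."""
    z = 1.5 * name_call_latency_s - 4.0 * social_gaze_fraction + 1.0
    return 1.0 / (1.0 + math.exp(-z))

# Two illustrative profiles: quick orienting with mostly social gaze,
# versus slow orienting with mostly non-social gaze.
low = combined_screen_score(name_call_latency_s=0.4, social_gaze_fraction=0.8)
high = combined_screen_score(name_call_latency_s=2.5, social_gaze_fraction=0.2)
print(low < high)  # True: the combined score orders the two profiles
```

In practice such weights would be learned from labeled data, and many more computer-vision features would enter the model; the sketch only shows why combining features can separate cases that a single measure might not.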
Importantly, these kinds of analysis tools have a critical role to play not just in evaluating an individual’s state in the present, but also in tracking changes and progress (or regression) in a patient over time. Tracking trends in measured variables is of tremendous clinical value. It can uncover changes due not just to the progression of autism (or other neurodevelopmental or psychiatric disorders), but also to factors such as how age, learning, or therapeutic approaches are affecting and interacting with the evolving symptoms and disorder.
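Trend tracking of this kind reduces, at its simplest, to fitting a slope to repeated measurements across visits. The sketch below does that with an ordinary least-squares line over hypothetical, made-up session data; real clinical trend analysis would of course involve far more care.

```python
def trend_slope(sessions):
    """Least-squares slope of (age_months, measurement) pairs:
    a positive slope means the measured variable is increasing
    over time. `sessions` needs at least two distinct ages."""
    n = len(sessions)
    mean_x = sum(x for x, _ in sessions) / n
    mean_y = sum(y for _, y in sessions) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in sessions)
    den = sum((x - mean_x) ** 2 for x, _ in sessions)
    return num / den

# Hypothetical social-gaze percentages measured at 18, 21, and
# 24 months, e.g. while an intervention is under way.
visits = [(18, 40.0), (21, 46.0), (24, 52.0)]
print(trend_slope(visits))  # 2.0 percentage points per month
```

A clinician could read the sign and magnitude of such a slope as a coarse indicator of whether an intervention is associated with improvement, stability, or regression.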
Still, Prof. Dawson thinks it will likely be a few years before the work is sufficiently validated and ready for widespread distribution among health care providers, and even longer before parents and others will be able to download the app to their smartphones or tablets. The team’s cautious approach is not without merit. Part of the reason it will take so long is to ensure that the algorithms and analysis methods perform properly across different cultural and socioeconomic groups.
The wait will be worth it though. As engineering methods, tools, and ways of thinking further contribute to neuroscience, how autism and other neurodevelopmental and psychiatric disorders are defined, diagnosed, and eventually treated will likely be very different from how they are today.