Therapeutic interventions in VR for children with autism
Teaching social skills to children with autism.
We designed and developed a multi-player virtual reality game using the HTC Vive to teach social skills to children with autism. We targeted four social skills: proximity, volume, body orientation, and time talking. Proximity is knowing how far away to stand from a person during a conversation. Volume is how loudly or softly you speak. Body orientation is facing the person you are talking to. Time talking is sharing conversation time equally with your partner.
This is an ongoing project under the 'Social and Technological Research Group' at UCI. The project was started in November 2016.
We are a team of five to six people led by a PhD student. I am one of the two designers and developers on the team. I also helped with user research, running the study, and writing the academic paper.
Here is a link to the CHI 2018 paper.
In earlier research, students created a curriculum in the form of cards to teach social skills to autistic young adults. The poster below shows the skills around which the curriculum is designed. Mobile applications were designed and deployed to teach children about proximity (the distance they should maintain during conversations) and prosody (the volume and pitch of voice they should use). Taking this work further, we decided to build similar games in virtual reality to teach social skills to children with autism. Individuals with autism are often uncomfortable approaching people or learning such skills face-to-face with therapists. VR also provides added flexibility for information visualization. We therefore decided to build a VR solution that would be fun, engaging, and comfortable to learn in.
To conduct user research, we visited a special needs school near LA to interact with the young adults and teachers. Participants were asked to do a card sort, choosing the top 8 cards of the social curriculum that they thought were the most useful. Based on the results, we built the game around a proximity compass, a volume meter, and a friendship balance. We interpreted the friendship balance as a time-talking feature that would show each user how much they talked relative to their partner. The volume meter measured how loudly the users spoke, and the proximity compass measured how close or far they stood from the other person.
We started by creating body-based armor for proximity in Minecraft; Vivecraft allowed us to build the mods in VR. In the images below, we created a shield and a suit of armor. However, due to Minecraft's limitations, we quickly shifted to Unity. We began with a one-player game in which the child had to visit people such as a mother, a teacher, and a stranger, and decide how close to get to each person. As shown in the images below, the player had red, green, and blue concentric circles around them, signifying personal and social space. The player's red circle must not meet the other person's red circle; a beep would trigger if this rule was broken, prompting the player to step back. Later, we decided to pursue the study with a multi-player game in order to achieve collaborative and cooperative game play among the participants.
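The red-circle rule above can be sketched in a few lines. This is a minimal illustration, not the game's Unity code; the radius value is an assumption made up for the example.

```python
import math

# Assumed personal-space (red circle) radius in meters -- illustrative only.
RED_RADIUS = 0.5

def red_circles_overlap(player_pos, other_pos, red_radius=RED_RADIUS):
    """Return True when the two players' red circles intersect.

    Positions are (x, z) floor coordinates; the circles meet when the
    centers are closer than the sum of the two red radii.
    """
    dx = player_pos[0] - other_pos[0]
    dz = player_pos[1] - other_pos[1]
    return math.hypot(dx, dz) < 2 * red_radius

def proximity_beep(player_pos, other_pos):
    # In the game, a beep prompted the player to step back; here we just report it.
    return "beep" if red_circles_overlap(player_pos, other_pos) else "ok"
```

In the actual game this check would run every frame against the tracked headset positions; the sketch only shows the geometric rule.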
Here are a few sketches we came up with for proximity, body orientation, and time talking in a multi-player game. For proximity, players must stay in the blue circles and must not enter their partner's red circle. For time talking, players should try to fill the bar equally, thus achieving a balance in their friendship. The latter image shows how we initially imagined the main screen would look.
We developed the game for two players. Players would see the proximity circles and stand at a socially acceptable distance. Scoring was collaborative: players lost points on entering each other's red circles and gained points by staying in each other's blue circles. Players were alerted if they talked too loudly or too softly. As shown in the image to the right (top left of the image), the bar on top indicated how much each player talked: the orange bar was the first player, the purple bar the second. The players needed to achieve a balance by meeting in the middle of the bar. A player's volume was indicated by a speaker that changed to gray, green, or red for silent, good, and loud respectively.
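The speaker indicator reduces to a simple threshold mapping. The thresholds below are assumptions for illustration; the text does not specify the values the game used.

```python
# Assumed normalized microphone-level thresholds -- illustrative only.
SILENT_THRESHOLD = 0.1  # below this the speaker icon shows gray
LOUD_THRESHOLD = 0.7    # above this the speaker icon shows red

def speaker_color(level):
    """Map a normalized microphone level (0..1) to the speaker icon color."""
    if level < SILENT_THRESHOLD:
        return "gray"   # silent
    if level > LOUD_THRESHOLD:
        return "red"    # too loud
    return "green"      # good volume
```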
We removed the text in later iterations after user testing revealed that it did not attract much attention. We introduced a life system like traditional games: the hearts in the image below indicated the collaborative lives of both players. Players lost lives while in a red circle and gained lives in blue circles. We also combined body orientation and proximity into one solution by offsetting the concentric circles.
We tested the game three times with young adults with autism who were interning at CHOC hospital in Santa Ana, and also with middle school students. The users enjoyed being in the system and understood the visual cues; however, they were less inclined to react and correct themselves. The city theme of the game was also distracting, as participants focused more on exploring the city than on the social cues, so we removed the theme in favor of a blank space. They also paid no attention to the hearts as lives, so we removed those in the next iteration. The volume speaker was too small to grab attention. To get users to react, we created stronger visual cues in the final game.
The images below show the game solution for proximity and body orientation. Instead of hearts as lives, we introduced red, yellow, and white screens with textual cues to prompt players to react. In the image below, the top images show the players in each other's blue circles, which is considered a good distance during conversation. The image also shows the players too far away from each other, when their blue circles do not touch; they are then given a textual cue to step closer.
The images below show the players too close to each other. If they are in each other's yellow circles, they are given a warning that they are too close and the environment is tinted yellow. If they are in each other's red circles, they are given a textual cue to step away, along with a red tint alerting them that they are far too close. The circles are thus present for users to glance at if required but do not demand attention, whereas the text screens demand attention and elicit an immediate response.
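The final feedback design maps distance bands to a screen tint and a textual cue. A minimal sketch, assuming hypothetical band radii (the actual radii are not given in the text):

```python
# Assumed band radii in meters -- illustrative only.
RED_BAND = 0.5     # inside each other's red circles
YELLOW_BAND = 1.0  # inside each other's yellow circles
BLUE_BAND = 2.0    # blue circles still touching

def proximity_feedback(distance):
    """Map the distance between players to a (tint, textual cue) pair."""
    if distance < RED_BAND:
        return ("red", "Step away")      # far too close: red tint + cue
    if distance < YELLOW_BAND:
        return ("yellow", "Too close")   # warning: yellow tint
    if distance <= BLUE_BAND:
        return ("none", "")              # good distance: no screen cue
    return ("white", "Step closer")      # blue circles no longer touch
```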
The images below show the game solution for volume and time talking. The speaker's gray, green, and red states signify silent, good, and loud, indicating the player's volume. The speakers grow in size to capture the user's attention. The time-talking bar on top indicates how much the players have talked: the orange bar is the first player, the purple bar the second. The players need to achieve a balance by meeting in the middle of the bar.
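The time-talking bar is essentially each player's share of the total talk time, with the bars meeting in the middle when that share is one half. A sketch of that computation (function names and the balance tolerance are illustrative):

```python
def talk_bar_position(p1_seconds, p2_seconds):
    """Return player 1's share of the total talk time, in [0, 1].

    0.5 means the orange and purple bars meet in the middle, i.e. a
    balanced conversation; values near 0 or 1 mean one player dominates.
    """
    total = p1_seconds + p2_seconds
    if total == 0:
        return 0.5  # nobody has spoken yet: show the bar balanced
    return p1_seconds / total

def is_balanced(p1_seconds, p2_seconds, tolerance=0.1):
    """True when the conversation is within the assumed tolerance of 50/50."""
    return abs(talk_bar_position(p1_seconds, p2_seconds) - 0.5) <= tolerance
```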
Given below is an image of the full screen that a user sees.
We conducted the study with 11 participants. A research assistant took the role of player 2 and the participant was player 1. We first ran participants through a baseline treatment without any visual feedback and analyzed the data to see whether they struggled with proximity and volume, and thus whether they qualified for the study. They ran through five such sessions, each lasting a minute. With the selected candidates, we then ran sessions with all visuals (both proximity and volume/time talking), sessions with only proximity, and sessions with only volume/time talking. There were 15 such one-minute sessions, conducted in randomized order. Given below is a picture of all four session types: one without any visual feedback, one with volume and time-talking feedback, one with only proximity feedback, and one with all feedback.
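A randomized schedule like the one above can be generated straightforwardly. This sketch assumes the 15 sessions were split evenly, five per condition; the text only states that there were 15 sessions in randomized order.

```python
import random

# The three feedback conditions run after baseline screening.
CONDITIONS = ["all_visuals", "proximity_only", "volume_time_talking"]

def make_schedule(sessions_per_condition=5, seed=None):
    """Return a shuffled list of condition labels, one per session."""
    rng = random.Random(seed)  # seeding makes the order reproducible per participant
    schedule = CONDITIONS * sessions_per_condition
    rng.shuffle(schedule)
    return schedule
```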
We observed a positive, learnable response to the proximity feedback: participants immediately responded and corrected themselves after seeing the visual cues. A few participants watched the time-talking bar and worked to make the colors meet in the middle. We did not observe an immediate response to the volume feedback with most participants. Additionally, the participants were very free and comfortable while interacting, which is generally not observed in children with autism during face-to-face interactions. Given below is a picture of the study in session.
After the sessions we interviewed the participants and their mothers about their experience. We then transcribed the interviews and created codes and categories from them. Given below is an image of the process.
Our findings were grouped into 3 main categories.
Comfort through Familiarity: Mothers consistently described immersive VR as comfortable for their children compared to their children's engagement in face-to-face interactions. The children were very free and comfortable while interacting in the system. One of the mothers said:
Comfort through a Reduced Sensory World: Many mothers picked up on the sparseness of the system in terms of sensory information, and on the fact that we had used only visual cues. This helped create comfort and a feeling of "normal" for the children, as they did not feel burdened by sensory information. A quote from one mother:
Ease of use leads to access to social interactions: This escape from the full sensory experience of physical face-to-face interaction allowed for a rich social interaction. Parents recognized and commented on the unique opportunity to practice communication skills.
Using logs generated by the Unity code and the HTC Vive VR system, we captured each user's distance from the avatar, volume, and duration of talking. The logs enabled statistical comparisons between conditions; we used a repeated-measures ANOVA.
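A per-session measure like percent-correct proximity can be derived from such logs by checking each sample against the acceptable distance band. The field layout and the band limits below are assumptions for illustration, not the study's actual parameters.

```python
# Assumed socially acceptable distance band in meters -- illustrative only.
GOOD_MIN, GOOD_MAX = 0.5, 2.0

def percent_correct_proximity(distance_samples):
    """Percent of logged distance samples where the player stood at a good distance.

    `distance_samples` is a list of per-frame player-to-avatar distances
    pulled from the session log.
    """
    if not distance_samples:
        return 0.0
    correct = sum(1 for d in distance_samples if GOOD_MIN <= d <= GOOD_MAX)
    return 100.0 * correct / len(distance_samples)
```

Per-session percentages computed this way would then feed into the repeated-measures ANOVA comparing conditions.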
Of the three variables we measured, only proximity yielded statistically significant differences between the proximity condition and the baseline condition (p=.03). This finding was replicated in the comparison between baseline and the combination condition (p=.01). This means that as a group, the participants displayed a significantly higher and more stable percentage of correct proximity when they received visualizations of proximity. To the best of our knowledge, this is the first clear demonstration of the impact of nonverbal communication visualizations in an immersive environment for children with autism.
The volume condition yielded no observable change compared to baseline (p=.5). The combination condition (volume and proximity) was also not effective with this sample (p=.6). As a group, volume remained stable across sessions and conditions, yet at the individual level, three participants improved in the intervention condition, three performed well in both baseline and intervention, and three showed reduced performance in the intervention condition.