Understanding how technology is transforming children’s interaction with the world
Dr Andrew Manches is a Chancellor's Fellow at the University of Edinburgh. This is his overview of the research session he delivered, one of nine at CMC 2014.
View the presentation: Research 7
Children’s interaction with digital media
Children’s interaction with digital media is evolving: no longer are they limited to pressing buttons on the TV to change channels, or mastering the mouse to access more interactive media on a computer. One way to describe how interaction with media is changing is in terms of ‘embodiment’: the relationship between how children move their hands (and bodies) and the resulting change to digital media.
More ‘disembodied’ technologies
For devices like the mouse or keyboard, the relationship between children's hand actions and what happens on screen is relatively indirect, or 'disembodied'. Small hand movements across the table, along with one-finger button presses, more or less define the limits of physical expression. The side image, taken from the paper 'How a computer sees us', caricatures the limits of interaction offered by devices such as the mouse. Perhaps unsurprisingly, children find the mouse difficult to control until around five or six years of age.
The last few years have witnessed quite a change in how we can manipulate digital media, through touch-screen interfaces such as the iPad. The change has been significant for children, with tablets becoming increasingly common in homes (and schools). Manipulating digital media directly with one or more fingers is easier, and has opened these devices to much younger children, as reflected in the Joan Ganz Cooney Center statistic that 58% of learning apps were categorised for toddlers/preschoolers. In one of our own projects, we are investigating how the iPad influences parent–child interaction with 'zero to three' year olds.
Touch-screen devices are not limited to tablets. Young children often play with mobile phones at home, and with electronic boards at school (and possibly tabletop computers soon). These devices offer more embodied interaction: swiping a finger across the screen to simulate cutting a rope; sliding objects across the screen as if they were cards on a table. Yet such movements are still somewhat limited when set against the full range of actions children make every day with their hands and bodies. Bret Victor critiques the interaction limits of touch-screen devices with the expression 'Pictures Under Glass' and argues for more embodied forms of interaction.
More ‘embodied’ technologies
We have already glimpsed the potential of technologies that can respond to a much greater variety of physical actions, in the form of gesture-recognition devices such as the Nintendo Wii or Microsoft Kinect. A golf swing on an on-screen fairway can be produced by a similar swing in the living room. It is likely that this more expressive form of gesture-based interaction will be increasingly exploited in screen devices, from tablets to the TV. Yet interaction is still relatively indirect when you consider where your hands are in relation to what happens on screen. In contrast, ‘tangible technology’ is an umbrella term for devices with a strong relationship between physical interaction and digital media.
With tangible technology, children can manipulate digital media as easily as they might wooden blocks: for example, a ball that changes colour when children squeeze it, sending a message to the teacher about how they feel about a class task; or blocks that play different sounds when pushed together or pulled apart. Children can even manipulate on-screen media through their interactions with physical toys. These emerging technologies are more embodied because children’s manipulation of digital media is much more closely aligned with how they physically interact with the world.
If children already find touch-screen interaction easy, what is the benefit of more embodied forms of interaction with technology? One answer may simply be that they are more aesthetically appealing and engaging. Studies tend to confirm this, although they too often fail to take novelty effects into account: children generally show a preference for new things, at least for a while. Another possibility is that more embodied technologies have a unique potential to tap into the way children think and learn.
Research over the last twenty years has provided strong evidence that cognition is embodied. That is to say, the way we think is intrinsically linked to our previous sensory experiences and physical actions in the world. For example, when we think about something as abstract as mathematics, we often draw upon early physical experiences such as walking along a line step by step, sorting groups of blocks, or even writing out certain sums.
The reason we understand more about the embodied nature of cognition is that we have new methods of examining how we think. One comes from brain imaging; another from gesture research. Ask someone to tell a story or explain a maths problem, and you will often find they gesture. Research has shown that the primary function of gesturing is to help the speaker with their own thinking, although gesturing helps the listener too.
Gesture research is important, not only as evidence for recent theories of embodied cognition, but because it provides a window into how different physical experiences have influenced the way we think. In one of our projects, we are looking at how 110 children aged 5–8 years gesture when explaining number relationships. Many children used gestures that appeared to simulate pointing along a number line, or manipulating groups of blocks. We are using these findings to explore how giving children more experience with blocks or number lines could help them learn.
Embodied Cognition and Technology
If children’s physical actions with materials are important for how they learn, this matters for how we design new learning technologies. First, we can consider how certain disembodied technologies, like the mouse, may limit important physical experiences. For example, how do you bring two groups of blocks together when adding with a mouse? Secondly, we can think about how more embodied technologies can encourage particular actions that support learning. For example, gesture-recognition devices can prompt children to make large gestures to move groups of on-screen blocks together. In one project in the US, researchers used the Nintendo Wii to help children explore the concept of ratio by moving their hands at different distances from the table. Tangible technologies can use digital effects to encourage children to move, swap, stack, build and generally explore physical materials in particular ways.
Embodied Technologies and Children’s Media
There may seem to be quite a gulf between designing new forms of digital interaction for learning and designing children’s media. But there is a key message: new forms of digital interaction are creating new opportunities for designing new media experiences. We have seen how a device such as the iPad can transform an industry. One of our projects is investigating the potential of another form of digital interaction: physical toys linked to computer games (e.g. Skylanders).
As embodied technologies become more ubiquitous (and accessible to designers), they too will open new doors: media that children can interact with by moving their bodies; media that children can manipulate through simple interaction with the physical materials around them. These new, more embodied technologies are likely to bring children more immersive and engaging experiences; but significantly, they also offer a powerful way to support the way children think and learn.
- O’Sullivan, D. and Igoe, T., Physical Computing: Sensing and Controlling the Physical World with Computers. Course Technology, 2004.
- Donker, A. and Reitsma, P., ‘Young children’s ability to use a computer mouse’. Computers & Education, 2007, 48(4), pp. 602–617.
- Shuler, C., Levine, Z. and Ree, J., iLearn II: An Analysis of the Education Category of Apple’s App Store. New York: The Joan Ganz Cooney Center at Sesame Workshop, 2012.
- Victor, B., ‘A Brief Rant on the Future of Interaction Design’, 2011 [accessed 8th July 2014]. Available from: http://worrydream.com/ABriefRantOnTheFutureOfInteractionDesign/.
- Balaam, M. et al., ‘Exploring affective technologies for the classroom with the Subtle Stone’, in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. ACM, 2010.
- Schiettecatte, B. and Vanderdonckt, J., ‘AudioCubes: a distributed cube tangible interface based on interaction range for sound design’, in Proceedings of the 2nd International Conference on Tangible and Embedded Interaction. ACM, 2008.
- Xie, L., Antle, A.N. and Motamedi, N., ‘Are tangibles more fun? Comparing children’s enjoyment and engagement using physical, graphical and tangible user interfaces’, in Proceedings of the 2nd International Conference on Tangible and Embedded Interaction. ACM: Bonn, Germany, 2008.
- Abrahamson, A., ‘From Gesture to Design: Building Cognitively Ergonomic Learning Tools’, International Society for Gesture Studies. Northwestern University, 2007.