This screen awaits a classroom full of brand-new kindergarten students on their first day of school. These 5-year-olds may not yet be able to identify letters or numbers, but their iPads are ready for them to sign into their math accounts and get started…

ISTE Standard for Students 1.1.d states, “Students understand the fundamental concepts of technology operations, demonstrate the ability to choose, use and troubleshoot current technologies and are able to transfer their knowledge to explore emerging technologies” (ISTE, 2020). How can teachers best support their youngest students in developing digital literacy skills so they can become proficient and independent on their devices?

Choosing apps that suit not only the content you are covering but also the developmental stage of your students is the first place to start. The TPACK model suggests that choosing a tool means finding the right balance of pedagogical knowledge of the age group you are working with, content knowledge of the subject matter, and technological knowledge of what tools are available and how to use them, all within the larger context of the classroom climate and the individuals in it (Koehler, 2012). One key factor when working with very young children is their ability to perform touchscreen gestures and to follow the prompts that signal them. “Four- to 6-year-old children can perform single-finger touchscreen gestures and follow non-textual prompting techniques, while some 7- and 8-year-old children can also perform multi-finger gestures and follow textual instructions” (Yadav et al., 2020). This indicates that apps such as Google Earth, which rely on multi-finger gestures like pinching to zoom, may require significant support for kindergarten students. It also suggests that most students can select objects independently, and that even sliding or long-pressing an object can be picked up by kindergartners with only minimal support. In the classroom, this does not mean that complex apps are out of reach for our youngest students, just that they will need much more guidance and support when starting out.

Modeling technology use is a highly effective strategy for teaching students how to use an app or device. “Adults reported that this modeling is effective; the director described how some children have watched their teachers and learned how to use the Google voice assistant” (Chordia, Yip, & Hiniker, 2019). The younger the student, the more vital a model is. “Children under 3 are unable to interpret these [tablet] prompts and should be guided by an adult model” (Hiniker et al., 2015). In fact, even on-screen models that showed a visual of a hand performing the desired gesture were largely ineffective until children were eight or older (Yadav et al., 2020). Modeling an app in a classroom could mean projecting your own tablet and demonstrating what you want students to do before they receive their devices. You could also ask a student to model their work for others: as a student discovers a tool feature, ask them to share what they learned with a classmate.

Joint media engagement, or guided learning, is another common strategy for content acquisition in early education. We see this strategy often, for example when an adult reads a book with a child and asks questions along the way. When translating this practice to a touch screen, communication may shift to be less about words and more about actions: “…Touchscreens change the pedagogical interaction between children and teachers. In particular, that while communication continues to take place throughout the interaction, the mode of communication changes to focus on children’s actions on the iPad compared to the more verbally‐driven shared book reading interactions” (Samuelsson, Price, & Jewitt, 2022). This study also found that while student communication shifted from talking to gestures, teacher interaction did not necessarily change. This means teachers should pay close attention to their students’ screens and gestures so they are ready to prompt them with next steps or questions. In practice, this could mean using a real-time screen viewer or setting up one-on-one or small-group stations where the teacher can watch as students use the apps. These approaches let the teacher ask questions, comment on what the children see in real time, and prompt students to hypothesize what might happen if they select a different item.

Collaboration and teamwork are other ways to encourage meaning-making in the classroom. Karno and Hatcher (2020) investigated what happened when young children, ages 2-8, were offered a multi-touch table loaded with a variety of games and puzzles; with very limited adult support, the children were allowed to explore the table together. “The multi-touch table delivered a singular environment in which common goals were built into the activities, providing an opportunity for individual and collective agency as well as social learning… The multi-touch table offered an environment in which children began with the building of personal efficacy, and ending in a collective agency” (Karno & Hatcher, 2020). Having students work together to explore or work through puzzles and games is a great way to offer collaboration and let students develop their technical skills with each other as support. This strategy is already built into educational technology tools like Code.org, whose curriculum includes pair programming (Code.org, 2014), and the same model can be used in any app or subject. In the classroom, this could look like providing one device for every two children and asking them to complete a puzzle or a challenging problem together (even using the pair programming strategy of taking turns as driver and navigator). It could also look like asking all students to complete the same level or problem on their own devices while allowing them to interact as much as they like.

A final method is to allow students to explore and play independently within an app. Play is an important part of child development and has long been incorporated into kindergarten and preschool classrooms (Saracho & Spodek, 1995). One example of using play to support touch-screen use in the classroom is games. When students were presented with puzzles and games that gave audio and visual feedback on their success, they were motivated to do well. “Celebratory-type actions often accompanied the completion of app goals. A child giggled and said ‘Look what I made!’ after creating a virtual insect. The built-in app voice reward, ‘Good job’ that occurred as a puzzle was completed often elicited raising hands, high fives, and jumping up and down” (Karno & Hatcher, 2020). When students can see and hear that they have completed their task, they get feedback that they were successful, which encourages them to keep going in the game.

When I begin the year with my kindergarten class, I start with the names of the parts of the device, building our shared vocabulary. We explore where the cameras are, where the iPad plugs in, and maybe a few buttons like power and volume. Over the next few weeks we work on only one or two apps: the camera and a drawing app. I begin by modeling the camera app on the big screen, then students work in pairs to go on a scavenger hunt for colors or shapes around the classroom. They LOVE to take selfies and pose with their friends! Next, we dive into a drawing app after I demo a few steps on the board, explaining my thinking as I go: “I will choose the plus symbol because that means I want to add something.” While the students are working, I monitor all their screens, frequently asking them to explain what tools they are using or what might happen if they switch to a different pen. I also ask a few students to show the rest of the class how they could add shapes to their drawings by putting their screens on the big TV and letting them teach the class. Finally, it’s time. We know how to turn on the device, we know how to swipe between screens to find a different selection of apps, and we know that when we tap on a picture, something will happen! Let’s get into that math app!

Tap anywhere to begin…

Resources

Chordia, I., Yip, J., & Hiniker, A. (2019). Intentional technology use in early childhood education. Proceedings of the ACM on Human-Computer Interaction, 3, 1-22. doi:10.1145/3359180

Code.org. (2014, September 11). Pair Programming [Video]. YouTube. https://www.youtube.com/watch?v=vgkahOzFH2Q&feature=youtu.be

Hiniker, A., Sobel, K., Suh, H., Irish, I., Kim, D., & Kientz, J. (2015). Touchscreen prompts for preschoolers: Designing developmentally appropriate techniques for teaching young children to perform gestures. In Proceedings of the 14th International Conference on Interaction Design and Children (IDC ’15). doi:10.1145/2771839.2771851

International Society for Technology in Education. (2020). ISTE standards: Coaches. ISTE. https://www.iste.org/standards/iste-standards-for-coaches

Karno, D., & Hatcher, B. (2020). Building computer supported collaborative learning environments in early childhood classrooms. Educational Technology Research & Development, 68(1), 249-267. doi:10.1007/s11423-019-09686-z

Koehler, M. (2012, September 24). TPACK explained. tpack.org. http://www.tpack.org/

Samuelsson, R., Price, S., & Jewitt, C. (2022). How pedagogical relations in early years settings are reconfigured by interactive touchscreens. British Journal of Educational Technology, 53(1), 58-76. doi:10.1111/bjet.13152

Saracho, O. N., & Spodek, B. (1995). Children’s play and early childhood education: Insights from history and theory. Journal of Education, 177(3), 129. doi:10.1177/002205749517700308

Yadav, S., Chakraborty, P., Kaul, A., Pooja, Gupta, B., & Garg, A. (2020). Ability of children to perform touchscreen gestures and follow prompting techniques when using mobile apps. Clinical and Experimental Pediatrics, 63(6), 232-236. doi:10.3345/cep.2019.00997

One Reply to “Tap Anywhere to Begin”

  1. Excellent Post, Samatha! Fascinating topic!

    I love how you mentioned that modelling a workflow could be a great way for kids to learn. It is so interesting that kids can follow a model of interaction with Google Assistant. I wonder how kids’ learning will change when the A.I. gets more and more advanced. I wonder if parents will even be needed 😛

    I also like how you mentioned how kids could teach and collaborate to learn an app. You said that the modes of communication between teacher and student change while on an iPad (to something less verbal). Were the researchers/papers generally positive or negative about this new paradigm? Do you sense that they find it a “lesser” communication?

    It would be exciting to see kids learn an app through pair programming. The navigator’s job is typically to think about long-term goals. I wonder what kids would come up with? Could they see the bigger picture? The results of such a study would be fascinating! 😀
