Can Computer Coding Be Used to Teach Values? This Scholar Thinks So

Book cover of Beyond Coding by Marina Umaschi Bers

In the new book Beyond Coding: How Children Learn Human Values Through Programming, Tufts professor Marina Umaschi Bers argues that schools can teach computer coding in ways that develop character as well as technical skills. Umaschi Bers, the director of Tufts' interdisciplinary DevTech Research Group, wrote the book during the pandemic. She says that the disruptions of Covid-19 were a stark reminder of how vital it is that technology be approached with respect for connection, community, and generosity.

Hess: Your new book tackles a topic that may strike some readers as surprising—namely, how children learn values through computer coding. Can you explain what you have in mind?

Bers: Any time we engage in an activity, we are expressing our values, knowingly or unknowingly. The activity of coding is no different. In the book, I suggest 10 different values—such as persistence, curiosity, and generosity—that play out while learning how to program projects that are personally meaningful to the coder. I use the metaphor of a “palette of virtues” to describe these values. The palette of virtues, much like the painter’s palette, is dynamic and can change according to context and goal. For example, a team coding environment invites children to be generous with each other by sharing technical knowledge and solving problems together, and when things in the code do not work, it creates opportunities to learn to be patient and to persevere. Learning how to code, with problem solving at the center of the activity, provides a wonderful opportunity to explore moral and personal values, since it has not only cognitive dimensions but also social and emotional aspects.

Hess: How did you come to this as a focal point in your work?

Photo of Marina Umaschi Bers

Bers: My doctoral thesis, back in 2001 at the MIT Media Lab, was already focused on how new technologies could help children explore and express personal and moral values. I created Zora, a three-dimensional virtual city in which children design characters and then tell stories to those characters. Those stories and interactions were guided by personal and moral values. Over time, I realized that we can explore values in an intentional and purposeful way not only through a specially designed tech environment, such as Zora, but also more generally, through the activity of creating with, and through, technologies.

Hess: In the book, you draw a direct link between programming and moral growth. Can you say a bit about this link?

Bers: It all goes back to intentionality. What is the intention of a teacher who brings robotics to her kindergarten students? Is it only to teach technical skills so children grow up to become engineers? Or is it also to engage them in developing social skills and positive attitudes and to support their emotional growth? The coding “playground,” as I call this learning environment, provides wonderful opportunities for both. For example, children learn how to invent new games and respect rules, how to solve social conflicts, and how to keep trying when things do not work out. At the same time, it also engages children in computational thinking, problem solving, engineering, and programming.

Hess: You’ve written previously about the difference between creative “playgrounds” and limited “playpens” when it comes to learning. In this book, you suggest that coding can be a dynamic playground. What do you mean by this?

Bers: Playgrounds invite fantasy play and require conflict resolution with little adult supervision. In contrast, playpens convey a lack of freedom to experiment, a lack of autonomy for exploration, a lack of creative opportunities, and a lack of risk taking. Playpens are confined, limited spaces with few options for children to make their own choices. Although playpens are safer, playgrounds provide infinite possibilities for growth and learning. In the playground, the child learns about the social world by negotiating for their favorite toys in the sandbox, about their own emotions when they struggle to keep up with others on the monkey bars, and about moral choices and consequences when they face the dilemma of waiting politely for their turn on the swing or cutting the line. In the playground, the child encounters the multiple dimensions of human development, but she does so in a safe space, a place where she can make mistakes and try again. Programming languages such as KIBO and ScratchJr are coding playgrounds in which children have the freedom to create projects that express their thinking and communicate who they are and what they love. For example, they can make an interactive birthday card for mom or dad with ScratchJr or program a KIBO robot to dance the salsa.

Hess: You suggest that it’s useful to think about coding as if students are learning a second language. What does this mean for how we think about and teach computer skills?

Bers: I propose that learning computer programming allows children to gain an alternative form of literacy that is essential in the 21st century. However, my rationale for supporting the introduction of computer science and computational thinking starting in kindergarten is not the creation of the future workforce but of the future citizenry. Most people know that reading and writing are tools for interpretation and have the potential to be tools of power. Echoing Brazilian educator Paulo Freire, I see literacy as a tool for critical comprehension, for understanding the world, and for actively changing it. The same is true of coding. Without understanding the fundamentals of what an algorithm is and how it works, people might not understand why and how certain data are displayed, and they risk becoming illiterate in an information age in which so much of what we consume is managed by algorithms.

Hess: In 2019, you launched the Beyond STEM program, working with kindergarten teachers and school administrators in Boston and Buenos Aires. Can you say a bit about that work and what’s ahead for it?

Bers: Through this project, we brought KIBO, an age-appropriate robot that can be programmed with wooden blocks instead of screens, to kindergartners and their teachers in religious and secular schools in Buenos Aires and Boston. Together, we explored questions such as: How can we teach human values, representing both universal and particular perspectives, through computer programming? Can coding support character development? How does a robotics-based program not only promote the acquisition of technological skills but also help children become better citizens and human beings? Can coding serve as a bridge by providing another language for getting to know others who are different from us? Teachers attending the robotics training came with different levels of KIBO knowledge, but they all left with technical skills and a palette of virtues that allowed them to bring robotics lessons to their early-childhood classrooms. Each school created a story with its KIBO robots that highlighted the school’s values and shared it with the other participating schools in its area, as well as with its international counterparts, through a virtual portal. In the future, we hope to continue this work by providing more opportunities—face-to-face workshops and virtual interactions—for people of different cultures, ethnicities, religions, countries, and languages to get to know each other by engaging in the shared, universal language of coding.

Hess: OK, final thought. If you had one piece of advice for educators and parents hoping to help kids engage with technology in a generous, moral fashion, what would it be?

Bers: My advice is to look at your own palette of virtues first and ask yourself these hard questions: Are your own values exercised or displayed in your use of technology? Are you consistent in what you believe is good and bad in your own behaviors? For most of us, the answer is usually “no.” For example, we care about family time, but our phone sits with us during meals. We highly value confidentiality, but we forward emails. Our children are constantly watching us, so before we worry about them, we need to look at our own behaviors. Because like it or not, we are role models. And we must be intentional about what kind of modeling we are doing.

Frederick Hess is director of education policy studies at the American Enterprise Institute and an executive editor of Education Next.

This post originally appeared on Rick Hess Straight Up.
