The “Trolley Problem” — the classic ethics exercise that challenges students to consider whose life should be saved from a runaway trolley — gets an update in the artificial intelligence (AI) ethics class taught in the Montour School District in Coraopolis, Pennsylvania. In the class, a self-driving vehicle is substituted for the trolley. Students must decide how they would program the vehicle to respond if its brakes fail: continue straight ahead and risk plowing into pedestrians, or swerve into a wall, stopping the car but killing its passengers.
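In code, the dilemma the students debate can be boiled down to a few lines. The sketch below is purely illustrative, with an invented function name and a deliberately crude “count the lives” rule; real autonomous-vehicle software involves far more sensing, uncertainty, and regulation:

```python
# A minimal, hypothetical sketch of the classroom dilemma. The rule
# below (minimize lives at risk) is one contestable policy, not how
# production self-driving systems are actually programmed.

def choose_action(pedestrians_ahead: int, passengers: int) -> str:
    """Decide the car's maneuver after the brakes have failed."""
    if pedestrians_ahead > passengers:
        return "swerve_into_wall"   # stops the car, killing the passengers
    return "continue_straight"      # risks hitting the pedestrians

print(choose_action(pedestrians_ahead=3, passengers=1))  # swerve_into_wall
```

The point of the exercise is that someone has to write that “if” statement, and students quickly discover how hard it is to defend any version of it.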

Along with examining a variety of ethical issues associated with AI, all 850 students in grades five to eight at Montour’s David E. Williams Middle School are introduced to coding and building autonomous robots, computer programming and data literacy, as well as careers in robotics and automation. In addition, they examine the presence of AI in their daily lives — from personal assistants (think Alexa, Siri, and Google Assistant) to predictive analytics (the power behind Amazon and Netflix) to image filters like Snapchat — as well as the intersection of AI with the arts.

The decision to create a middle school AI curriculum came about “because we asked what more we can do for our kids to be future ready,” says Justin Aglio, director of academic achievement and district innovation for Montour, a 3,000-student school system bordering the city of Pittsburgh. And after only one full semester of operation, the district plans to expand the AI curriculum to its elementary school and high school in the next school year.

The presence of AI — often described as programming that gives devices the ability to identify and recognize patterns, predict behavior, make simple decisions and learn — has been growing in schools, from personalized learning assistants to responsive, social robots in special education classrooms, to school management and security systems.

But the move to teach K-12 students about AI concepts, to encourage them to see themselves as not just consumers but also creators of AI tools, and to prepare students for increased human-computer interaction and workforce changes, has been much slower, says learning innovations advocate Tom Vander Ark. A former school superintendent and the first executive director of education for the Bill & Melinda Gates Foundation, Vander Ark is among education advisors who have been calling for greater awareness of and instruction around AI in K-12 classrooms over the past several years. “It’s the most important change force on earth,” says Vander Ark. “Nothing more profound is shaping the lives and livelihood of young people than AI and its related technologies.”

Create, Not Replace

Various economic forecasts echo the claim: The World Economic Forum predicts that the growth of AI could create 58 million new jobs in the next few years. Economic Modeling Specialists International predicts that jobs in STEM fields will grow by 13 percent between now and 2027. Most respondents in a Pew survey of business and technology experts predicted advances in AI and robotics “pervading nearly every aspect of daily life by the year 2025,” from manufacturing, transportation, and communications, to who delivers the pizzas to our homes, reads our X-rays at the doctor’s office, and takes care of household chores.

Former Google executive Kai-Fu Lee made headlines earlier this year when he predicted that 40 percent of the world’s jobs will be displaced over the next 20 years by machines capable of automating tasks, affecting both blue-collar and white-collar professions.

“One of the biggest fears I’m hearing from educators is that technology and artificial intelligence will replace teachers in the next decade or two,” says Michelle Zimmerman, a veteran K-12 educator and author of the new book Teaching AI: Exploring New Frontiers for Learning.

Published by the International Society for Technology in Education (ISTE), the book looks at AI’s potential for teaching STEM, project-based learning, design thinking, and more. It also serves as a companion to the ISTE online course, Artificial Intelligence Explorations and Their Practical Use in Schools.

“The more I’ve looked into AI and technology, the more I’m convinced that the human component of who we are as people needs to be forefront and central” when it comes to AI in the classroom, Zimmerman says.

Pittsburgh’s Carnegie Mellon University computer science professor Dave Touretzky offers a similar view. “People have high hopes for AI-based tutoring systems that can deliver personalized instruction at low cost, but we’ll just have to wait and see how good they are in practice,” he said via email. Without the “general intelligence, common sense, and self-awareness” that humans possess, computers “cannot be as good as a human teacher.”

Currently, “we are very far” from the point where computers are capable of exhibiting what is referred to as “artificial general intelligence,” or AGI, Touretzky says. “The best chess-playing programs in the world do not know that they are playing a game, or that the pieces have specific names and shapes for historical reasons. Netflix can recommend movies to you, but it can’t explain what a ‘movie’ is or have a discussion about who was the best Batman.”

Still, economic competitiveness is a vital reason for U.S. schools to begin teaching students about AI, he says: “China has already announced its intention to become the world leader in AI by 2030, and the Chinese government has ordered that every Chinese schoolchild should be learning about AI.” Russian President Vladimir Putin has likewise weighed in on the topic, saying in 2017 that the nation that leads in AI “will be the ruler of the world.”

That should serve as a reminder that “America needs to keep up,” Touretzky says.

A Home in CTE

That view was echoed by a community advisory committee in Maryland’s Washington County Public Schools. Made up of local businesses, industry, and higher education partners, the committee offers input on program relevancy, curricular needs, and labor forecasts. The district will launch a new Artificial Intelligence and Cloud Computing CTE program in 2020 as part of its “expanded, technology-related, high-value career pathways for students.”

“We’re always looking at labor market needs, program needs, and student interests,” says Cody Pine, the school system’s supervisor of career and technology education and enrichment. The program will bring together networking, security, analytics, and management, and students can simultaneously earn college credits. A capstone project will have students do some cloud-based mechatronics (a specialized form of engineering), artificial intelligence, and robotics work.

“We couldn’t really find any other secondary schools that had a program like this, so we went to the post-secondary level and the state Department of Education for assistance,” Pine says.

Partners for Support

When developing Montour’s program, Aglio, a visiting fellow at Carnegie Mellon University, also turned to higher education and industry partners for assistance.

This allowed the program to launch “with very minimal expenses,” he says. Carnegie Mellon, a national leader in the study of AI, developed the software used in the district’s AI curriculum, and the private company READY AI provided the AI-powered toy robots used in the autonomous robotics class.

The district has also partnered with the university to pilot math software that sends real-time information about student work to the classroom teacher through smart glasses, wearable computer displays that overlay information on what the wearer sees. The teacher sees a smiling emoji over the heads of students who are progressing well and a frowning emoji over the heads of students who are struggling; tapping an emoji shows which problem that student is working on.

“Through artificial intelligence, the teacher gets data sent back to her computer and an algorithm of what she needs to re-teach in a more one-to-one environment,” Aglio says.
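The underlying software is Carnegie Mellon’s and isn’t detailed here, but the triage logic the pilot describes can be sketched in a few lines. Everything below, from the field names to the error-rate threshold, is a hypothetical stand-in:

```python
# A hypothetical sketch of the emoji-overlay logic described above.
# The StudentStatus fields and the 30 percent threshold are invented;
# the actual Carnegie Mellon pilot software is not public here.

from dataclasses import dataclass

@dataclass
class StudentStatus:
    name: str
    current_problem: int      # which exercise the student is on
    recent_error_rate: float  # fraction of recent attempts answered wrong

def emoji_for(status: StudentStatus) -> str:
    """Map live student work to the overlay shown in the teacher's glasses."""
    return "🙂" if status.recent_error_rate < 0.30 else "🙁"

for s in (StudentStatus("Ada", 4, 0.10), StudentStatus("Ben", 2, 0.60)):
    print(s.name, emoji_for(s), "on problem", s.current_problem)
```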

In addition, the district is working with the MIT Media Lab in Cambridge, Massachusetts, to develop and teach the AI ethics course. Plans call for the final curriculum to be open-sourced, available to everyone.

Among the topics Montour students examined: computer scientist Joy Buolamwini’s research showing that the algorithms used in facial recognition software typically do a much worse job at correctly identifying people of color. That’s a problem that can come into play as school systems look to install facial recognition systems for campus security.
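Buolamwini’s finding comes down to a simple measurement: the same system can be accurate for one group and unreliable for another. The sketch below uses made-up records to show how such a disparity is computed; it illustrates the audit pattern, not her data:

```python
# A sketch of a basic fairness audit: compare recognition accuracy
# across demographic groups. The records are fabricated for illustration.

from collections import defaultdict

# (demographic group, whether the system identified the face correctly)
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

hits, totals = defaultdict(int), defaultdict(int)
for group, correct in results:
    totals[group] += 1
    hits[group] += correct  # True counts as 1

for group in totals:
    print(group, f"accuracy = {hits[group] / totals[group]:.0%}")
# group_a: 75%, group_b: 25% -- the kind of gap district buyers
# should ask facial recognition vendors to disclose.
```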

To provide more school and educator support of and resources for AI instruction, several organizations have been conducting outreach with symposiums and workshops. Along with commissioning Teaching AI and offering the online course, ISTE has created a webinar series around careers in AI and an online community where educators can share resources and exchange ideas. Much of that work was supported by a General Motors Foundation grant, because the automaker saw in its own business how foundational AI was to transforming its work and workforce needs, says Joseph South, chief learning officer for ISTE.

Willy Kjellstrom, who supports classroom teachers with technology integration in Virginia’s Albemarle County Public Schools, says the ISTE course provided “a broad understanding of different ways I might approach using AI with different teachers.”

He recently worked with a teacher and students trying to predict when their school’s beehives might experience problems. He introduced them to the concept of neural networks, computer algorithms modeled loosely on the human brain and designed to recognize patterns, and to how those patterns could be used to anticipate the timing of hive activity.
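A classroom version of that beehive project can be surprisingly small. The toy example below trains a tiny neural network with scikit-learn on invented sensor readings; a real class would substitute its own hive data and features:

```python
# A toy beehive-health predictor: a small neural network learns to
# separate healthy days from problem days. All readings are invented.

from sklearn.neural_network import MLPClassifier

# Each row: [hive temperature (deg C), humidity (%), daily weight change (g)]
X = [
    [34.5, 60, 120], [34.8, 58, 90], [35.0, 62, 150],      # healthy days
    [30.1, 80, -200], [29.5, 85, -350], [31.0, 78, -100],  # problem days
]
y = [0, 0, 0, 1, 1, 1]  # 0 = healthy, 1 = hive in trouble

model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
model.fit(X, y)

print(model.predict([[30.0, 82, -250]]))  # likely [1]: flags a problem day
```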

“I like that AI can be used as a tool for analyzing and building upon problems,” Kjellstrom says. “That’s a valuable skill.”

After taking the ISTE course, Suzy Brooks, the instructional technology director for Mashpee Public Schools in Massachusetts, has helped develop a project where high school students are designing virtual assistant-like learning tools for elementary-level students.

The older students visit the elementary classrooms, sit in on lessons, and look for opportunities where these tools could be useful to classroom operation, Brooks says. “Are there things a teacher could use that could be housed on an iPad or on YouTube or on Google Drive if a student needed to review a certain lesson, like how to quote sources for a project, or how to get a Bee-Bot (robot) to do what they want?”

This approach allows students to think about how a lesson can be enhanced, a void can be filled, or relearning accomplished, and the process needed to accomplish that, she says. “That’s hugely important.”

Another educator support effort is being spearheaded by the Association for the Advancement of Artificial Intelligence and the Computer Science Teachers Association. Last year, the groups launched a teacher working group to define what students in various grade levels should know about AI, and what they should be able to do.

Touretzky is chair of the working group and ultimately envisions teachers across different content areas incorporating “bits of AI all over the curriculum. AI can fit into various STEM courses, naturally, but it can be appropriate in other places as well,” he says. “For example, when using Google Translate in a language class, it makes sense to take a moment to consider how this service works, and what its limitations are.”

Tough Questions

Along with issues related to bias and the underrepresentation of women and people of color in the world of computing, privacy is another pressing concern related to AI. It was a topic central to the presentation, “AI Demystified: Facts, Fiction & the Future of Learning,” made by leaders from Pennsylvania’s Penn Manor School District at NSBA’s 2019 Annual Conference.

The potential trade-offs associated with AI-enhanced surveillance technology that can detect threatening movements or behaviors, conduct facial recognition scanning, or scour social media activity for concerning posts or conversations are important issues that school leaders need to discuss, says Penn Manor Superintendent Michael Leichliter.

Questions must be asked about “how much information is too much and how intrusive should we be outside of the brick-and-mortar school and the schoolyard,” Leichliter says.

Even something as unexpected as anti-plagiarism software can present an ethical AI challenge, says Charlie Reisinger, Penn Manor’s technology director. He notes that it is the users of such software — including middle and high school students — who “feed the algorithm” that allows those tools to improve and become more effective. Lately, questions have emerged over who owns that data, Reisinger says. “Does a student have some level of ownership? Is it ethical for a company to build its tool based on student work?”

These are tough issues that educators and school leaders must grapple with, but that “does not mean we turn off the technology or hinder its introduction” in schools, says Nicol Turner Lee, a fellow in the Brookings Institution’s Center for Technology Innovation.

It does, however, mean that we “elevate the need to get the right products and services into the schools so they can complement and be additive” to the learning experience, Turner Lee says.

Along with the important ethical issues that students and educators must weigh as they engage with AI, they need to learn about the good that comes from this technology, which they are already living with and which will only continue to grow, Touretzky says. Students need to be educated so they can become informed citizens, he adds, “and they should be thinking about how they can have careers in this field.”

Michelle Healy is associate editor of American School Board Journal.

Reprinted with permission from American School Board Journal, June 2019. Copyright 2019 National School Boards Association. All rights reserved.