(Boston) Amazon senior executive Rohit Prasad had an urgent message for students [in grades roughly equivalent to Grades 9 and 10] from Dearborn STEM Academy, a public school located in the Roxbury neighborhood of Boston.
He had come to the school one recent morning to observe an Amazon-sponsored artificial intelligence (AI) lesson, which teaches students to program simple tasks for Alexa, Amazon’s voice-activated virtual assistant. He assured the Dearborn students that there would soon be millions of new jobs in the field of artificial intelligence.
“We need to create talent for the next generation,” Mr. Prasad, Alexa’s chief scientist, told the class. “That’s why we educate students about AI from an early age.”
A few miles away, Sally Kornbluth, president of the Massachusetts Institute of Technology (MIT), was delivering a more sobering message about AI to local school students who had gathered at Boston’s Kennedy Library complex for a workshop on the risks and regulation of AI.
“AI is such a powerful new technology that for it to work well in society, it really needs rules,” said Ms. Kornbluth. “We have to make sure it doesn’t cause any harm.”
AI mastery
The two events, held the same day – one encouraging work in artificial intelligence, the other warning against deploying the technology too hastily – reflect the broader debate currently raging in the United States over the promise and potential dangers of artificial intelligence.
The two workshops for students were organized by the MIT initiative on “responsible AI” whose donors are Amazon, Google and Microsoft. They focused on a question that has preoccupied school districts across the country this year: How should schools prepare students to navigate a world where, according to leading AI developers, the rise in power of AI-powered tools seems entirely inevitable?
Teaching AI in schools is not new. Computer science and civics classes now regularly include exercises on the societal consequences of facial recognition and other automated systems.
But the push to teach AI has grown stronger this year, after reports about ChatGPT – a new chatbot capable of producing human-like written assignments and sometimes fabricating misinformation – began to spread in schools.
Today, “AI mastery” is a new buzzword in education. Schools are scrambling to find the resources to teach it. Some universities, tech companies, and nonprofits offer ready-made programs.
Train and prevent
Courses are on the rise even as schools grapple with a fundamental question: should they teach students how to program and use AI tools, in order to train them in the technical skills employers are looking for? Or should students learn to anticipate and mitigate the adverse effects of AI?
Cynthia Breazeal, an MIT professor who leads the university’s Responsible AI for Social Empowerment and Education initiative, said her program aims to help schools do both.
“We want students to be informed and responsible users and designers of these technologies,” said Ms. Breazeal, whose group organized the AI workshops for schools.
We want to make them informed and responsible citizens about the rapid evolution of AI and the many ways it influences our personal and professional lives.
Cynthia Breazeal, professor at MIT
(For the sake of transparency: the author of this text was recently awarded a fellowship under the Knight Science Journalism program at MIT.)
The Boston workshops were part of an “AI Day” organized by Ms. Breazeal’s program, which has attracted several thousand students worldwide. The event provided a glimpse of the different approaches schools are taking to teaching AI.
At Dearborn STEM, Hilah Barbot, senior product manager at Amazon Future Engineer, the company’s computer science education program, led a lesson on voice AI for students. The lessons were developed by MIT in conjunction with Amazon’s program, which provides coding and other training for elementary and secondary schools. The company awarded more than $2 million in grants to MIT for this project.
First, Ms. Barbot explained the jargon of voice AI. She walked students through “utterances,” the phrases consumers say to prompt Alexa to respond.
Students then programmed simple tasks for Alexa, such as telling jokes. Jada Reed, a ninth grader, programmed Alexa to answer questions about Japanese manga characters. “I think it’s really cool to be able to train it to do different things,” she said.
Leading tools
Ms. Breazeal said it was important for students to have access to professional software tools from leading technology companies. “We give them skills and insights into how they can work with AI to do things that matter to them,” she said.
Some Dearborn students, who had already built and programmed robots in school, said they enjoyed learning to code a different technology: voice-activated assistant robots. Alexa uses a series of artificial intelligence techniques, including automatic speech recognition.
At least a few students also raised concerns about privacy and other aspects of AI-assisted tools.
“Did you know there is a conspiracy theory that Alexa listens to your conversations to show you ads?” asked Eboni Maxwell, a 14-year-old student.
“I’m not afraid of it listening,” replied Laniya Sanders, another student the same age. Still, Laniya said she avoids using voice assistants because, in her words, “I want to do it myself.”
A few miles away, at the Edward M. Kennedy Institute for the United States Senate, an educational center that houses a life-size replica of the US Senate Chamber, dozens of students from the Warren Prescott School in Charlestown, Massachusetts, were exploring a different topic: AI policy and safety rules.
Playing the role of senators from different states, the students participated in a mock hearing during which they debated the provisions of a hypothetical AI security bill.
Some students wanted to ban companies and police departments from using AI to target people based on data such as their race or ethnicity. Others wanted to require schools and hospitals to assess the fairness of AI systems before deploying them.
The exercise was not unknown to the students. Nancy Arsenault, an English and civics teacher at Warren Prescott, said she often asks her students to think about how digital tools affect them and the people they care about.
“Even though the students love the technology, they are acutely aware that they don’t want unbridled AI,” she said. “They want limits.”
This article was originally published in the New York Times.