5 GenAI principles for K-12 education

Imagine this: A new technology has arrived, drawing enormous public discussion. Among the questions is how it might be used in schools. Advocates call for its widespread adoption, saying it will revolutionize schooling and motivate students. Others are wary, fearing it will make kids lazy.

The year is 1922; the leading advocate is Thomas Edison, who has a special place in my heart for founding what became General Electric (GE) in my hometown of Schenectady, NY. He promised that his invention, the motion picture, “is destined to revolutionize our educational system”—even replacing textbooks.

A century later, it’s safe to say that Edison’s revolution didn’t play out exactly as he pictured (nor justify his critics’ worst fears), but it surely had profound effects. Movies didn’t replace the written word—nothing can—but they put new tools at teachers’ fingertips. Like waves of technological innovation that followed, movies also presented schools with crucial choices about their responsible use in ways that both benefit and protect students.

Introducing generative AI

And now comes generative AI (GenAI), best recognized as the chatbots that have exploded into popular awareness. Once again, schools will be a place for crucial decisions. Companies like ours—the largest provider of learning solutions to K-12 schools—have a role here, and I believe we should publicly state principles to keep students and teachers at the center of GenAI development.

Half of educators say they are currently using generative AI, and it is saving them time. Recent studies suggest that teachers spend over 50% of their time on non-teaching tasks—imagine what could be possible if they spent more of that time directly connected to students and teaching.

That idea, I believe, only scratches the surface of AI’s potential benefits. AI tools can enhance teachers’ productivity by helping them plan lessons and activities, convert text into presentations, and create summaries of texts, to name just a few tasks. AI tools can also enhance students’ literacy learning with personalized learning experiences—such as providing teachers with suggested feedback and revision guidance on student writing.

It’s exciting, but AI will earn the trust of schools, teachers, families, and education leaders only if it’s used with wisdom, guidelines, and safeguards that ensure it genuinely supports teachers, benefits students, and never compromises children’s privacy or safety. That’s why we’re outlining five recommended principles we believe should guide the responsible adoption of AI technologies in K-12 schools.

Keep teachers at the center

The teacher-student relationship is crucial. We believe in a “high-tech, high-touch” approach in which technology should support, not mediate, this connection. Teachers are closest to the educational experience, and their voices must also inform the development of new technologies intended to serve them.

Teachers will need support and professional development to build “artificial intelligence literacy” to effectively leverage the technology in the classroom. Most educators (76%) identify a need for education on ethical AI usage and its integration into the classroom.

Uphold student privacy, safety, and well-being

Protecting student privacy and data is non-negotiable. Existing federal laws provide strong protections that must apply to the new uses that may be associated with GenAI. Many state laws also protect children’s and students’ privacy, and third-party organizations must uphold and promote data privacy and student safety.

Lawmakers should ensure that existing laws and regulations properly account for and clarify how these levers can be used or applied to GenAI.

Ensure responsible and ethical use

Families need to understand how GenAI is being used in schools—without being overwhelmed by information that’s too detailed or technical to be useful. Federal and state policymakers should work with AI experts to determine appropriate disclosure requirements and provide guidance for how districts and schools can access the information they need about the GenAI systems they choose to use.

Encourage continuous evaluation and improvement

Systemic integration of AI into education technology and practice requires analysis of which strategies work, for whom, and why. Creating a culture of ongoing evaluation and improvement will ensure the technologies genuinely support teaching and learning. Even these trials must include guardrails to protect student privacy, safety, and well-being.

Prioritize accessibility and inclusivity

As classrooms become more diverse in demographics and learning needs, GenAI tools can equip teachers with personalized approaches, recommendations, and supplemental materials to meet each student’s needs. As new bias, equity, and accessibility considerations emerge with the use of GenAI, regulations need to evolve.

Our schools, like our society, face the task of defining guardrails for a field that’s evolving with astonishing speed. Policymakers and companies like ours must put empathy, safety, and privacy at the forefront to maximize the benefit these technologies can bring to teaching and learning.

Jack Lynch is CEO of HMH.
