Teaching and Learning blog

Explore insights, trends, and research that impact teaching, learning, and leading.


  • [Image: Students in a lecture hall, all looking down at their phones]

    AI in the classroom? A tech journalist breaks down the buzz

    By Patrick Golden

    Last year, technology writer and editor Sage Lazzaro had an “aha” moment that told her AI had become truly buzzworthy.

    “I was out at a restaurant and overheard a table of teachers seated next to me asking, ‘What are we going to do about ChatGPT?’ It was unheard of a year before to hear people in casual conversation talking about AI,” she said.

    Lazzaro, whose writing has appeared in Fortune, VentureBeat, and Wired, among other publications, has covered AI for a decade, long before it rocketed into orbit as a cultural and business phenomenon.

    At the Pearson Ed.Tech Symposium 2024, a virtual event held this October, the veteran tech journalist shared her insights on the potential impact of AI on education and other fields with an audience of over 1,000 curious educators.

    An intriguing, yet cloudy future

    Educators in the U.S. and beyond are eager to understand how burgeoning AI tools will impact the classroom, students, and the future of the teaching profession.

    “I don’t think there's a golden answer to that question because it's still so early,” said Lazzaro, adding that there’s even confusion around defining AI.

    To some, AI is ChatGPT or the human-like robots dreamed up in Hollywood blockbusters. But those are just use cases, Lazzaro explained; AI itself is an umbrella term for techniques that enable computers to complete tasks without being explicitly programmed to do so.

    That opens AI to a universe of use cases.

    Lazzaro highlighted some that have recently led to groundbreaking discoveries, particularly in science and medicine. The 2024 Nobel Prize in Chemistry was awarded to three scientists for using AI to design new proteins and predict protein structures, work that could help researchers develop life-saving drugs, such as cancer treatments, in a fraction of the time typically required.

    Lazzaro also sees other potential benefits of AI, such as performing monotonous tasks that most people would gladly hand off. Professionals, including educators, could offload tedious duties in favor of more interesting, fulfilling endeavors, thus changing the relationship between humans and work for the better.

    Is AI head-of-the-class ready?

    As educators ponder their role in an AI-driven future, Lazzaro sees a potential parallel to how the workforce has repeatedly adapted to other technological breakthroughs.

    “While it’s very early, I think AI is going to drastically change the jobs we do and how we do them,” she said. “Look at the Information Age. Most of us work jobs now that didn't exist 30 years ago.”

    Educators must also navigate the intersection of AI and pedagogy, given the hurdles the technology presents.

    “I think you should approach AI with curiosity, but also skepticism,” said Lazzaro. “It's important for educators to be aware of ethical considerations and be an active part of discussions around when and how AI is used in schools.”

    AI tools are far from a panacea in their present form. They can be quirky, unpredictable, and unreliable. Current generative AI models can “hallucinate,” fabricating information that doesn’t exist or producing misinformation that appears plausible, especially to an untrained eye.

    What’s more, AI is trained on large data sets that may include biases, likely unintentional, against certain populations, Lazzaro cautioned.

    With AI’s wrinkles yet to be ironed out, Lazzaro suggested educators limit AI use to specific tasks, such as fuel for brainstorming sessions or as a launching point for developing lessons.

    She also advised educators to be wary of AI-detection software that claims to identify work, such as writing assignments, as AI-generated rather than student-generated.

    “I see stories all the time from students who say they got a failing grade or are facing disciplinary action for using ChatGPT to write an assignment that they wrote themselves,” she said. “There are lots of studies showing that these detectors aren't accurate, especially for students for whom English isn't their first language.”

    And what about concerns that AI will ultimately siphon off jobs in education? Lazzaro offered a straightforward approach: be human.

    “The best advice I would give is to stay flexible, open, and aware of these changes, but also lean on the attributes that make someone a strong professional or job candidate today, or in any environment,” she said. “Take initiative, be reliable, be organized — the types of things that go far and that make us human. We’ll still go far in the future no matter what the job landscape looks like with AI.” 


    In October, tech journalist Sage Lazzaro was featured in the Future Forward session at Pearson’s inaugural Ed.Tech Symposium. In this session, Lazzaro offers viewers her perspective on the current and future state of AI, based on her long tenure on the AI beat.

  • [Image: Students in a classroom setting using laptops and typing on keypads]

    Kimberly Bryant: Fighting for education equity in an AI-driven world

    By Patrick Golden

    "I didn’t come here to make you feel comfortable about AI; I came here to challenge you," Kimberly Bryant said to an audience of more than 1,000 educators during the opening keynote presentation of the Pearson ED.Tech Symposium 2024.

    As an electrical engineer, social activist, and educator, Bryant sees promise and the potential for peril in this rapidly evolving technology — especially when it comes to education.

    The Silicon Valley veteran now pours her passion into expanding equity and opportunity in AI and other technologies. Among her other endeavors, she’s the founder and CEO of Black Innovation Lab by Ascend Ventures and the founder of Black Girls CODE, a nonprofit organization focused on providing technology and computer programming education to African American girls.

    “Technology is not equally accessible to all, and as we advance into the age of AI, this divide becomes more pronounced,” she said during the virtual event, Pearson’s first symposium focused exclusively on AI technology in education.

    Bryant pointed to another technological revolution, the arrival of the printing press in the 15th century, as an example of an invention that democratized access to information while also having the power to deepen social divides.

    “I think we’re living in this moment of rapid disruption, and what we do next with AI and education will either accelerate us toward a future of equity and empowerment, or it could possibly leave an entire community behind,” she said.

    The dangers of the digital divide

    Bryant cited the COVID-19 pandemic as stark evidence of the country’s digital divide. Students with broadband internet access and tech devices continued their learning, while those without were left behind.

    Federal data from the 2020-2021 school year found that in Florida, only 66% of schools reported having high-speed internet connections, compared to 99% in Kentucky, Maryland, North Carolina, Virginia, and West Virginia.

    Bryant is also troubled by a recent UNESCO report finding that fewer than 10% of the 450 schools and universities surveyed have developed institutional policies or formal guidance on the use of generative AI applications in the classroom.

    Another UNESCO report found that 90% of online higher education materials come from just two regions — North America and the European Union (EU) — limiting the global diversity of knowledge. Bryant cautioned that without intentional efforts, AI could further narrow students’ perspectives and misrepresent marginalized communities.

    Leaning into AI done right

    Bryant remains cautiously optimistic about the future of AI in driving social equity. She provided examples of institutions and organizations that she believes are leveraging the technology with social responsibility at the heart of their efforts.

    While UNESCO found most institutions of higher education have yet to adopt meaningful AI policies, Bryant praised the University of California (UC) for taking the initiative to create a broad working group that oversees how the system responsibly integrates AI into its academics.

    And AI is already flexing its muscle to positively influence education, she said, via personalized learning platforms that tailor instruction to students’ needs in real time and help close achievement gaps.

    Entrepreneurs like Kate Kallot are high on Bryant’s list, too. The MIT-trained computer scientist heads up Amini, an organization that deploys AI to help African communities predict the impacts of climate change. Kallot earned a spot on TIME’s 2023 list of the 100 most influential people in AI.

    Then there’s Arkangel Ai CEO José Zea. He and his team have deployed a no-code health platform with which healthcare professionals can use plug-and-play AI algorithms to improve patient retention, therapy success rates, and patient engagement. One of Arkangel Ai’s initiatives addresses high maternal mortality rates in the U.S., particularly among African American and other minority women.

    AI: Not a neutral technology

    While AI promises greater efficiency and access, it’s not neutral, said Bryant. It’s trained on biased data that can perpetuate and amplify societal inequalities.

    “If we don’t put some safeguards in place in our academic institutions, I think the risk of what can happen with an AI-powered learning tool that consistently underrepresents or misrepresents marginalized communities is real.”

    With AI, it’s not about the technology itself; it’s about who controls it and who has access to it, she said. If the large language models that drive AI systems embed ingrained biases into the algorithms that guide students and their learning journeys, the consequences can be devastating.

    Bryant highlighted AI-powered textbooks and curricula that show racially biased outcomes or underrepresent marginalized communities in illustrations and examples.

    She also called for greater racial diversity among the developers, educators, and policymakers who design and implement AI systems; without that diversity, she said, AI will reflect the biases of its creators and reinforce inequality. Bryant stressed, too, the importance of teaching students not just to use AI to answer questions, but to critically engage with it, question its role, and ensure it serves as a tool for progress rather than harm.

    A call to action for “Generation AI”

    Bryant left the educators attending the Pearson Ed.Tech Symposium with a mission, calling them the foundation of everything that matters as “Generation AI” students are shaped into global citizens.

    “Unlike in previous (technology) revolutions, we have an opportunity to act with a little bit of foresight and guide this technology in ways that empower and don’t exclude,” she said.

    That’s something, Bryant said, that won’t happen organically or by chance.

    “It’s going to happen because educators like yourselves guide our students, not just to use AI, but to wield it responsibly. We need to train students to question the biases of the tool and to demand fairness in the answers it provides,” she said. “Teach them to ask the right questions — in life, in AI, in the classroom, in their paths as young adults. Let’s get it right."

  • [Image: Student with dry erase marker in hand, writing on a presentation board in front of the class]

    MyLab Math: Purpose-built to meet students where they are on their unique learning journeys

    By Patrick Golden

    Located in the heart of downtown Indianapolis, Indiana University – Purdue University Indianapolis (IUPUI) is a vibrant higher learning institution, enrolling a diverse student body that includes more than 16,000 undergraduates.

    At IUPUI, math faculty have long trusted MyLab Math from Pearson, a dynamic platform that’s driving improved performance and high satisfaction for students and faculty alike. They describe it as an integral part of the math curriculum, purpose-built to adapt effectively to individual students and their unique learning needs.

    The adoption and success of MyLab Math at IUPUI go hand in hand with Pearson’s commitment to serving as a dedicated partner every step of the way.