Teaching and Learning blog

Explore insights, trends, and research that impact teaching, learning, and leading.


  • Maximize the Power of Revel
    By Liz Lebold

    Imagine being able to give your students a tool that puts them in the driver’s seat of their individual learning journeys.

    Revel is an innovative teaching and learning platform that transforms passive education into an interactive experience. By assigning coursework within Revel, you’ll inspire students and elevate learning outcomes. You’ll also make your life easier.

    5 Ways Revel Assignments Make Instructors’ Lives Easier

  • Group of individuals sitting in a computer lab while an instructor discusses information on a device.
    Three simple ways to use AI to empower teaching and learning
    By Nada Alnounou

    Artificial intelligence (AI) has been a hot topic for the past few years, and there are mixed feelings about it.

    Some people fear its potential for misuse and academic dishonesty. However, conversations about AI in higher education have broadened to encompass this technology’s tremendous ability to positively transform teaching and learning. Our job as educators is to bridge the daunting gap of the unknown and help our students learn how to use this new tool. Instead of shying away from this incredibly useful resource, we should be teaching students ethical and appropriate ways to use AI.

    Here are a few ways that AI can be used to enhance and empower classroom instruction.

  • Group of students sitting around a table, reviewing content on a laptop screen.
    Voices of Innovation: A Q&A Series on Generative AI – Part 7
    By Pearson Voices of Innovation Series

    Using technology to improve teaching and learning is in Pearson’s DNA. As the first major higher education publisher to integrate generative AI study tools into its proprietary academic content, Pearson is excited to be harnessing the power of AI to drive transformative outcomes for learners. We are focused on creating tools that combine the power of AI with trusted Pearson content to provide students with a simplified study experience that delivers on-demand, personalized support, whenever and wherever they need it.

    In this multi-part blog series, you’ll have a chance to hear about AI innovations from Pearson team members, faculty, and students who have been involved with the development and rollout of Pearson’s AI-powered study tools.

  • Student with a dry-erase marker in hand, writing on a presentation board in front of the class
    MyLab Math: Purpose-built to meet students where they are on their unique learning journeys
    By Patrick Golden

    Located in the heart of downtown Indianapolis, Indiana University – Purdue University Indianapolis (IUPUI) is a vibrant higher learning institution, enrolling a diverse student body that includes more than 16,000 undergraduates.

    At IUPUI, Math faculty have long trusted and adopted MyLab Math from Pearson, a dynamic platform that’s driving improved performance and high satisfaction for students and faculty alike. It’s described as an integral part of the math curriculum, purpose-built to effectively adapt to individual students and their unique learning needs.

    The adoption and success of MyLab Math at IUPUI goes hand in hand with Pearson’s commitment to going above and beyond as a dedicated partner every step of the way.

  • Group of individuals sitting around a conference table discussing content on a digital tablet.
    Voices of Innovation: A Q&A Series on Generative AI – Part 6
    By Pearson Voices of Innovation Series

    Using technology to improve teaching and learning is in Pearson’s DNA. As the first major higher education publisher to integrate generative AI study tools into its proprietary academic content, Pearson is excited to be harnessing the power of AI to drive transformative outcomes for learners. We are focused on creating tools that combine the power of AI with trusted Pearson content to provide students with a simplified study experience that delivers on-demand, personalized support, whenever and wherever they need it.

    In this multi-part blog series, you’ll have a chance to hear about AI innovations from Pearson team members, faculty, and students who have been involved with the development and rollout of Pearson’s AI-powered study tools.

  • Close-up of a row of students listening to an instructor while writing down information
    A Quantum Leap toward success: An instructor spotlight on Amy Pope
    By Kristin Marang

    Amy Pope is an award-winning senior lecturer in physics and astronomy at Clemson University. A Clemson alumna herself, Amy has her bachelor’s, master’s, and doctoral degrees in physics, and has devoted the last 22 years to teaching physics at her alma mater.

    Clemson has “a large focus on teaching and making sure that students have the number one engagement experience in their classes,” Amy explains. That focus is part of what makes Clemson stand out, along with its “fun, close-knit community.”

    Amy shares Clemson’s commitment to delivering engaging learning experiences, while also making learning affordable. As Amy describes it, “excessive cost is certainly a barrier to student success.” With that in mind, it’s a priority for Amy to use an affordable, effective learning platform, tied to a physics textbook she can trust.

  • College-aged student with paper and pencil, writing in front of a laptop computer.
    Voices of Innovation: A Q&A Series on Generative AI - Part 5
    By Pearson Voices of Innovation Series

    Using technology to improve teaching and learning is in Pearson’s DNA. As the first major higher education publisher to integrate generative AI study tools into its proprietary academic content, Pearson is excited to be harnessing the power of AI to drive transformative outcomes for learners. We are focused on creating tools that combine the power of AI with trusted Pearson content to provide students with a simplified study experience that delivers on-demand and personalized support whenever and wherever they need it.

    In this multi-part blog series, you’ll have a chance to hear about AI innovations from Pearson team members, faculty, and students who have been involved with the development and rollout of Pearson’s AI-powered study tools.

  • An instructor standing in front of a room of young adult students
    The Impact of Mastering at Clemson University: A Spotlight on Professor John Cummings
    By Kristin Marang

    John Cummings is a senior lecturer in the Department of Biological Science at Clemson University. He’s an award-winning educator who has furthered the use of innovative technology at Clemson. He also gets shout-outs from Clemson students on TigerNet message boards. According to one commenter, “John Cummings for Anatomy and Physiology is pretty awesome.”

    John currently teaches Human Anatomy and Physiology I and II, and a lot of the nursing students in his classes go on to take the MCAT. Many of those students report back to John that the work they did in his A&P class had a positive impact on their MCAT scores.

    “I know from unsolicited note cards and emails that I’m getting from students,” John explains, “that their highest portion on the MCAT was the bio because of what happened in Anatomy and that they’re feeling really positive. And that’s because of the foundation that’s there. What we’re teaching them to do is be independent learners.”

    Improving student performance

    John was an early adopter of Pearson’s Mastering® A&P digital learning platform and has a vivid memory of the first time his Pearson sales rep did a Mastering demo for him. “I was on board immediately,” he recalls, “because it saved me from having to do a lot of the stuff that I had been doing in putting together my own things, [with the] automatic grading and all of that sort of stuff. It’s one of those things that can really sell itself.”

    From that first demo, John says that Mastering appealed to him “because it’s conveniently accessible.” He felt it would add elements to his courses, benefiting him and his students. “If it doesn’t do something for my students,” he adds, “I'm not going to maintain it in my class.”

    John kept a critical eye on progress as he implemented Mastering into his courses at Clemson, and the impact was obvious from the first semester. “There was a tremendous improvement in student performance that first year,” he explains.

    Instructors are able to see how much time a student spends on a question in Mastering, an insight that John feels gives him a sense of which students are truly answering a question versus letting a quick Google search find the answer for them. “Initially, when I first adopted it,” he recalls, “the people who significantly worked through the Mastering assignments saw about a seven percent increase in performance on examinations.”

  • A group of people looking at a laptop computer.
    Common practices in open science
    By Jeanne Ellis Ormrod

    In recent years, researchers have increasingly advocated for open science (some scholars capitalize it as Open Science). The open science movement consists of several practices that make research reports more transparent, such that details regarding methodologies, collected data, and data analysis procedures are fully disclosed and open to public inspection and critique.

    Here are five common ways in which researchers can enhance the transparency — and ultimately also the credibility — of their research projects. These points are described in greater depth in my book Practical Research: Design and Process, 13th Edition:

    • Preregistration of a study: Researchers post an online description of what they will do in an upcoming research project. Among other things, such a description usually includes the overarching research question(s) to be addressed, specific hypotheses to be tested, specific methods to be used to collect data, and specific statistical procedures and other strategies to be used in analyzing the data. (See Chapter 5.)
    • Results-blind reviewing: Before a project is conducted, outside reviewers examine a preregistered research proposal to determine the apparent soundness of a researcher’s main question(s), rationale, a priori hypotheses, study design, and planned data-collection and data-analysis strategies. (See Chapter 13.)
    • Registered report: When a proposed project has been both (a) preregistered and (b) results-blind reviewed, an editor of an academic journal commits to publishing the final research report in that journal — regardless of whether the results are statistically significant or in some other way especially noteworthy — as long as the project has actually been carried out as originally proposed or else modified in reasonable, well-documented ways. (Again, see Chapter 13.)
    • Open access to specific data collection and analysis procedures: Researchers make the details of their data-collection and data-analysis strategies readily available to interested scholars; for example, they might share the questionnaires administered, specific statistical procedures conducted, or particular coding schemes used to find patterns in participants’ interview responses. (Again, see Chapter 13.)
    • Open access to raw data: Especially when qualitative data has been collected (e.g., when people have been interviewed or members of a particular sociocultural group have been carefully observed as they’ve gone about their daily activities), the actual verbal and/or nonverbal behaviors might be presented in an appendix or supplemental materials. There’s an important caveat here: A researcher can give other people open access to their raw data only when either (a) responses remain strictly anonymous or (b) participants have given explicit written permission for their identities to be disclosed. (Once again, see Chapter 13; also, see the section “Ensuring Participants’ Rights and Well-Being” in Chapter 4.) A brief sketch of what de-identifying raw data might look like follows this list.
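
    To make that caveat concrete, here is a minimal, purely illustrative sketch in Python of one way a researcher might strip direct identifiers and assign random participant codes to interview records before depositing them in a public repository. The field names and records are hypothetical, and this is only one possible approach rather than a procedure prescribed in the book.

    import csv
    import secrets

    # Hypothetical raw interview records; "name" and "email" are direct identifiers.
    raw_records = [
        {"name": "Jordan Smith", "email": "js@example.edu",
         "response": "I study best in groups."},
        {"name": "Casey Lee", "email": "cl@example.edu",
         "response": "Office hours helped the most."},
    ]

    IDENTIFYING_FIELDS = {"name", "email"}

    def anonymize(records):
        """Replace direct identifiers with random participant codes."""
        shareable = []
        for record in records:
            # Random code, not derived from the participant's identity.
            participant_id = "P-" + secrets.token_hex(4)
            cleaned = {k: v for k, v in record.items() if k not in IDENTIFYING_FIELDS}
            shareable.append({"participant_id": participant_id, **cleaned})
        return shareable

    # Write the de-identified records to a CSV suitable for open sharing.
    with open("open_data.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["participant_id", "response"])
        writer.writeheader()
        writer.writerows(anonymize(raw_records))

    Using random codes (rather than, say, hashes of names) means the original identities cannot be recovered from the shared file, which is the point of the anonymity condition described above.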

    When striving for transparency, researchers don’t necessarily do all of these things for a particular research project. For example, when a researcher is building on another researcher’s work and uses a questionnaire that the other researcher has created, publishing the questionnaire might violate the other researcher’s intellectual property rights. And preregistration of a project isn’t always logistically feasible. For instance, many action research projects may be intentionally ill-defined at the very beginning, at least in part because initial data collection may lead to new questions to be addressed and further data collection may be necessary.

    The bottom line here is this: I recommend open science practices when they can enhance the credibility of a research project without jeopardizing either (a) the privacy and well-being of the people being studied in the project or (b) the degree to which the project can thoroughly address the questions that underlie and drive the project.