Educators have been waiting for the promise of AI in education to arrive since the late 1960s.
Sure, we’ve had some successful, albeit narrow, applications. Math and chess are classic examples. We’ve even seen universities use AI to identify students who may need additional attention to stay enrolled.
Now, however, AI’s capabilities are exploding, and the field’s innovation curve looks like a hockey stick pointing upward. What hasn’t changed is the nature of human learning. People still learn best through stories.
So can we use AI to get back to a narrative-centered learning experience?
Dr. Jeremy Roschelle, Executive Director of Learning Sciences Research at Digital Promise, joined the Enrollment Growth University podcast to discuss the next generation of artificial intelligence for educational storytelling and what they’re hoping to build with their $20 million grant from the National Science Foundation.
The Importance of Storytelling in Education
Instructors trying to teach tens or hundreds of students simultaneously don’t have time to make slightly different variant story experiences for all their students to be part of.
But students are engaged when they feel they belong. Of course, belonging is a powerful human experience, and stories are something that makes us feel like we belong — or that we don’t.
Imagine that we are recontextualizing a science lab experience to discover something that’s fundamental to a problem. Maybe a somewhat-familiar disease is spreading, and people have to address some of the scientific questions in different communities.
As educators, we want to vary the story of the disease’s spread with different groups of students to let them follow what they are really interested in while still keeping the curricular emphasis. That’s the goal of the course. It’s problem-based learning, learning in teams, and learning through collaboration.
A sense of culture and socialization is intrinsic to story, and students find that motivating. Instead of breaking that sense of culture and socialization with an instructor-driven, one-size-fits-all narrative, how can we keep it going a bit longer?
Where are we now in terms of AI in education?
The capabilities of AI are suddenly exploding. We all see this happening every day in our lives.
We’re talking to home assistants. We’re talking to our watches. We’re used to bots that complete our sentences or suggest people we ought to talk with. The underlying capabilities, the number of patents, and the number of researchers are all expanding wildly right now.
Within the next five years, that’s going to mean new types of applications for education, not just those narrow ones that we’ve seen for the past 30 or so years.
AI Progress from A/B Sequencing to Narrative Synthesis
We’ve seen a lot of machine learning applications already, and that will likely continue. These applications look for patterns or associations between two things. For example, they can make course sequencing recommendations easier when the choices are simple A/B variables. Platforms can gather enough data that instructors may already be receiving sensible recommendations from a courseware planning assistant.
But currently, customizing the narrative-centered experience is rare because it’s too hard, too expensive, and too difficult to scale. The only place you see it is with massively talented faculty members who can weave their students into a story.
Not every faculty member is going to have that talent, though, so Jeremy and his team are trying to break through to a learning experience that cannot be created today, something that cannot be reduced to an A or B choice.
What does AI-based storytelling look like?
It’s important to convey inquiry-based science to students. Science isn’t just facts; it’s a process with a certain quality.
Imagine a group of students engaging with the idea that a month from now, we are going on a trip to Mars together. We’re going there for a scientific project where we want to collect some data on the Martian soil, and we need to plan this out. What instruments are we going to bring? Where are we going to collect our samples? How are we going to analyze them?
We may have different interests about what we want to look at on Mars, so we’re going to divide into small groups. We’re all going on this big ship to Mars, so let’s figure out what you want to do when you get there. Then, let’s spend a month of our class time learning about the fundamental physics or chemistry or whatever it is that you need to know.
A month from today, the mission starts. We’re going to get on that rocket ship. We’re going to go on this simulated mission to Mars. We’re going to collect our data, and we’re going to come back to class and talk about what we found when we got to Mars.
Okay, we can’t produce that kind of simulated experience today because it’s too expensive and complicated. But in five years… well… are you on board?
Where to Start Your AI Focus
Start by reading about the state of AI. Understand what the technologies can do: accept voice input, make sense of a sketch, detect patterns, and synthesize constructive actions or text.
Complement that with some readings about human-centered AI, sometimes called responsible AI or ethical AI. We have issues of bias. Tracking some of those issues and becoming aware of them is critical.
Then, develop the ability to look under the hood because there are superficial promises being made that if something has AI in it, it’s good. But maybe the team that built that thing really hasn’t gone particularly deep, so kick the tires before you buy.
In short, read about the fundamentals, think about the ethics, and build a team that can really kick tires.
This post is based on a podcast interview with Dr. Jeremy Roschelle, Executive Director of Learning Sciences Research at Digital Promise. To hear this episode and many more like it, you can subscribe to Enrollment Growth University.
If you don’t use iTunes, you can listen to every episode here.