Like nearly every other industry, higher ed is starting to feel the effects of AI. What do we need to know about the future of AI in higher education?
J Scott Christianson, Assistant Teaching Professor at the University of Missouri, joined the Enrollment Growth University podcast to talk about why we should be both bullish and bearish about artificial intelligence and machine learning coming to higher ed.
Why Higher Ed Should Be Bullish on AI and Machine Learning
Machine learning, a subset of AI, is really good at pattern recognition and optimizing processes. Higher ed could use a lot of optimization.
When classes don’t meet their minimum enrollment, for example, they get cancelled, and students are left scrambling for alternatives. AI can optimize things through demand forecasting: looking at the supply of students, where each one is in their academic plan, and then planning and scheduling courses to best meet those students’ needs. It’s a more efficient approach than the department-by-department system we have now.
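As a minimal sketch of the idea, the forecast can be as simple as counting how many students still need each course and converting that demand into section counts. All the data and the section capacity below are hypothetical, just to illustrate the shape of the calculation:

```python
from collections import Counter

# Hypothetical data: each student's remaining required courses.
student_plans = {
    "s1": ["CALC2", "STAT1"],
    "s2": ["CALC2", "CHEM1"],
    "s3": ["STAT1"],
    "s4": ["CALC2"],
}

SECTION_CAPACITY = 2  # hypothetical seats per section


def forecast_sections(plans, capacity):
    """Count demand for each course, then round up to whole sections."""
    demand = Counter(course for courses in plans.values() for course in courses)
    # Ceiling division: demand of 3 with capacity 2 needs 2 sections.
    return {course: -(-n // capacity) for course, n in demand.items()}


print(forecast_sections(student_plans, SECTION_CAPACITY))
# → {'CALC2': 2, 'STAT1': 1, 'CHEM1': 1}
```

A real system would forecast from historical enrollment and degree-audit data rather than a fixed list, but the core move is the same: aggregate demand across the whole institution instead of guessing department by department.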
AI could also help with scheduling meetings and balancing faculty members’ teaching loads with service and research.
“There’s lots of things that are common in business as well as in higher education that will help us out a lot,” Scott said. “Just freeing us up from these mundane tasks.”
All of that feels safe and benign. It’s easy to get excited about. But …
Why Higher Ed Should Be Bearish on AI and Machine Learning
Higher ed should be cautious about AI in terms of over-relying on data alone to replace human decision making, especially when it comes to enrollment and financial aid decisions.
We might look at a process, decide it’s paper intensive, and conclude that a machine-learning algorithm would be a great way to streamline things. It could chew on the data and tell us which students should be admitted or granted financial aid.
“A lot of people that have experimented with this were well intentioned,” Scott said, “but unfortunately the way these algorithms work is … they’re ethically neutral. And if you train them with data that has some bias in it, you are introducing a bias into that decision making by the machine learning algorithm.”
Humans have biases as well, of course, and some of those people who experimented with these technologies were trying to eliminate human biases. But we have to be cautious about that.
“Where I’m optimistic is turning the tables around and giving tools to students who might be able to determine what institution is going to be the best fit,” Scott said.
Rather than institutions using AI to figure out which students would be successful, prospective students could use the technology to determine where they would be most successful, thus putting the power in their hands.
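One way this student-side tool could work is a simple weighted fit score: the student states priorities, each institution is described by the same attributes, and the tool ranks schools by the weighted match. The attribute names, weights, and institution data below are all hypothetical:

```python
# Hypothetical student priorities (weights sum to 1).
student_prefs = {"small_classes": 0.5, "research": 0.2, "cost": 0.3}

# Hypothetical institution attributes on a 0-1 scale
# (for cost, higher means more affordable).
institutions = {
    "U1": {"small_classes": 0.9, "research": 0.4, "cost": 0.3},
    "U2": {"small_classes": 0.3, "research": 0.9, "cost": 0.8},
}


def fit_score(prefs, attrs):
    """Weight each attribute by how much the student cares about it."""
    return sum(weight * attrs[key] for key, weight in prefs.items())


best = max(institutions, key=lambda u: fit_score(student_prefs, institutions[u]))
print(best)  # → U1 (small classes dominate this student's weighting)
```

The same scoring logic that institutions might use to rank applicants works in reverse here, which is exactly the table-turning Scott describes: the student sets the weights, so the student holds the power.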
Using AI in Higher Ed for Recruitment
Let’s say you’re a new PhD looking for a job at a university. You go on LinkedIn and find somebody at the school you want to work for, maybe an alum you know, and you want to reach out cold. How do you know what that person is like? How do you know what they’ll respond to?
“There is a company called Crystal Knows,” Scott told us. “You can sign up for a free account to test it out. If you’re connected with somebody on LinkedIn, it will actually grab the information from LinkedIn as well as other places (so you can) find that person and try to analyze what that person is like.”
Institutional Review Board for AI?
When we cross over from these mundane tasks such as scheduling into something involving personality, we have to be cautious and raise some red flags.
“There needs to be something like an institutional review board for AI,” Scott said.
People who want to do research need to go before an IRB to tell them what data they plan to collect, how they’ll get it, what they’ll do with it, and what the consequences to the subject will be. We may need to do something similar with AI. At the very least, we need to hold a discussion around what is appropriate and what is not at the institutional, state, and federal levels.
“This is kind of a Wild West right now,” Scott said. “I saw one prediction that said that AI growth is going to continue at about a 50% year-over-year growth rate. And that’s just incorporating AI into existing products, not new AIs. So I think that’s something we’re going to have to be very cautious of and have a discussion around.”
Next-Steps Advice for Pursuing Student-centric AI
There’s a lot of hype and marketing around AI. It’s the latest thing, and everyone wants to add it to their product. So it’s important to get educated on the processes AI is actually good at optimizing and the patterns it’s actually good at recognizing.
“There are all sorts of things behind the scenes that could help as well,” Scott said. “Energy management on our campuses, that’s data intensive that could also be helped with AI. Our lawyers that work for the university could also be helped because it helps them scan through lots and lots of documents. So I think just doing an inventory of how are you already using AI or what AI is already built into the products you have is an important first step.”