Sugata Mitra’s keynote yesterday at EDUCAUSE 2016 may have alienated half the room he meant to excite. His speech, which positioned technology front and center as the new hero of education, roused the IT portion of the crowd, but it may have inadvertently led every faculty member in attendance to question their role in the tech-driven future he described so passionately, or whether they would have one at all. While it’s true that self-motivated exploration is an important aspect of education, for anyone other than the wholly self-motivated autodidact, it is not a replacement for teaching.
What Mitra speaks so passionately about is technology’s ability to extend faculty expertise, not replace it. But I’m concerned that this essential nuance may have been lost on an IT crowd, myself included, that is all too eager to jump aboard the newest shiny object, the newest panacea in education. The limited use case Mitra explored was how mere access to the Internet can empower disadvantaged self-learners around the world, even without local teachers.
But what are the students accessing? Programs, curricula, and games written and created by these very teachers. While technology may help scale education in a way that makes traditional teachers less visible in the process, we mustn’t move to replace them. We can’t. No matter which platforms are developed and how technology changes, someone has to create the content and guide the process. The assumption that we can automate everything is a false one, especially at the undergraduate and graduate levels, because it caters only to the incredibly self-motivated autodidacts among us and ignores the majority of learners. Technology can help shape and amplify the pathways of that exploration, but it is no replacement for content and experience.
This is also to say nothing of the still notable impact of a college degree on one’s economic prospects and civic participation. As the newly released Primer on the College Student Journey report asserted, “…evidence for the United States indicates that the rate of return on investments in attending higher education has been higher in recent decades than it has ever been in the past.” It goes on to state that:
- Those with a 4-year college degree make, on average, $21,000 more annually than their counterparts with only a high school diploma
- College grads reinforce the social benefits of higher ed by spending more time in unpaid volunteer service
- College participation and graduation are linked to better health and greater civic activity
This research shows that the benefits of the college experience are perhaps more relevant than ever before, and that experience simply cannot be circumvented through self-teaching in a scalable or meaningful way.
There’s no question that technology offers exciting opportunities for the content-acquisition aspect of education, and technology should undoubtedly be embraced in higher ed. But turning to technology as a replacement for the higher ed institution walks a dangerous line, one that ignores all of the ancillary benefits of education, benefits that cannot be quantified quite so easily. As recent findings in fields such as social psychology and behavioral economics show, we are social, emotional creatures who integrate our humanity with our intelligence and learning rather than separating them. The nature of inquiry and learning is complex, and we don’t progress as a society by letting people navigate their education alone; we progress by allowing each generation of learners to stand on the shoulders of giants.
As with numerous pairings in life, technology and the higher ed institution should be viewed in partnership with each other rather than in opposition to each other. After all, couldn’t society do with a little less “us vs. them” sentiment these days? Same team, Sugata.