I remember the first time that someone told me that my work could not “scale” and, as a result, was not going to advance any entrepreneurial interests I harbored.
I had just graduated from college and was working a lot of odd jobs: working for a tutoring company here, editing college application essays there, and patching together some part-time work over in the on-campus writing center. I proposed a business idea to one of the private tutoring businesses I worked for: to create an online writing tutor program that would offer students prompts, questions, and outlines to help them write their college application essays. At the time, I had little awareness of online writing centers and what they could do for students. I just remember thinking, “A lot more students could have access to the kinds of prompts and questions I often give to private tutoring clients. Why not create a system where I could interact with more people online?”
That’s when I heard the feedback: that idea won’t scale. Because I had built personalized feedback (from me) into the online writing program concept, the concern was that while I might manage the labor of responding to dozens of students, I could never manage the labor of responding to thousands.
“Is there a way to automate this?” I was asked.
My post-college self was frankly appalled at the idea that someone would consider automation as a solution to giving students feedback on their writing (little did I know just how far the conversation on automated written response would go, with research-writing bots, news-article bots, and of course automated essay evaluation). I did not go any further with the plan, unable to conceive of a way to serve more people without actually working with them directly.
Looking back on this conversation, I can see how it could have been a turning point in my career. What if I had leaned into the idea of working with a private business owner to build automated scripts that would generate prompts for “robo-teachers” to help students write college application essays? After all, most application essays follow clear formulas. An algorithm conceivably would have been quite easy to build, and perhaps (optimistically) access to that algorithm might have helped students who couldn’t afford the tutors, the textbooks, and the training get through yet another hoop in the long and complicated college admissions process for elite schools today.
But I turned away from that option, pursuing a PhD and imagining higher education as a place where the concept of scaling education, of innovating learning, of giving students access would be handled with responsibility and empathy. The jury is still very much out on what “innovating learning” means, of course, but one thing is clear: higher education is still figuring out what it means to do the work of giving students access to new ideas in ways that leverage new technology responsibly and empathetically.
Right now, the concept of “innovation” has often been minimized to tool or software adoption. Innovation can mean stocking iPads in a classroom. Innovation can mean adopting automated grading software (much like what I had resisted as a recent college graduate). Innovation can mean using virtual reality headsets for learning. Innovation can mean giving students their own web domains to create their own personal websites.
None of these uses of innovation are wrong. But none of them are quite right either, because they are narrow notions of “innovation,” conceived only as the adoption of tools, not as the critical thinking or awareness of the whole, networked environment of which those tools are a part. Tool adoption as a means of communicating “innovation” is, to me, the thing that instructors in particular tend to recoil from, to find exceedingly at odds with their pedagogy. And yet, it is the form of “innovation” that most often makes its way visibly into the news or into university press coverage.
I should say that, as someone who has designed online writing courses, who has websites with resources about teaching with technology, and whose whole profession right now is centered in critical approaches to bringing digital technology into classes, I believe strongly that we need “innovation” in higher education’s teaching mission.
But here’s how I define innovation: innovation is the ability to leverage available tools, materials, and technologies to improve student experience. Does that mean technological adoption is always the answer? No. It means weighing the available options, picking up the choices that work to improve learning, but REJECTING the choices that advance surveillance states, increase inequity, and bar access. Adopting pedagogies, practices, and, yes, tools that are attuned to the needs of students in a world with changing material conditions and greater interconnectedness is what will make education truly innovative.
The role of technology in education has a pernicious history, a trend that writers like Audrey Watters and Chris Gilliard have researched and written about extensively. We need to take seriously that “innovation” can also be perverted into tracking student behaviors (Saint Louis University installed Echo Dots into students’ dorm rooms), collecting students’ intellectual property and data (Jesse Stommel and Sean Michael Morris have detailed how plagiarism detection programs like Turnitin collect students’ written data) and throttling information, thereby excluding students from the education they deserve (through what Chris smartly refers to as digital redlining). These are just a few examples of “innovation,” warped into practices that, to put it mildly, simply do not advance a university’s core mission to educate.
But I don’t think we should fall back on nostalgic thinking about the glories of educational practice in the days before educational technology. What perversions of “innovation” really show us is not how bad technology is, but how badly technology can be weaponized within education.
For example, I actually think there is a time and place for automation in higher education, even though automation can be weaponized terribly against students. Indeed, our students could learn a lot from interacting with artificial intelligence, though perhaps not in the ways that entrepreneurs might think. That is, there is no replacing a human facilitator (relationships are key to effective learning, a point proven by the researchers behind How Learning Works, among many, many others), but there perhaps is room for collaboration between an instructor and a machine, particularly if the machine itself becomes an object for students to investigate, unpack, and interrogate. To dial into this example even further, students could submit written assignments to automated graders, but then have to analyze and explain why the automated grader gave them particular kinds of feedback. That would encourage students to break down what goes into an algorithm, to identify why algorithms are designed in particular ways, and to discern why algorithms privilege some factors over others.
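To make the exercise concrete: a classroom version of such a grader could be a deliberately transparent toy, not a real product. The sketch below is a hypothetical example of what students might build and then interrogate; every feature, weight, and threshold in it is an invented assumption, chosen precisely so students can ask why those values and not others.

```python
# A toy, fully transparent "automated grader" for classroom interrogation.
# Every scoring rule here is an explicit, questionable design choice:
# Why reward length? Why privilege vocabulary variety? Students can read
# the weights directly and debate what this algorithm values in writing.

def grade_essay(text: str) -> dict:
    words = text.split()
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]

    # Feature 1: length, capped at 500 words (privileges longer essays).
    length_score = min(len(words) / 500, 1.0) if words else 0.0

    # Feature 2: vocabulary variety (type-token ratio rewards varied word choice,
    # which quietly penalizes deliberate repetition used for rhetorical effect).
    variety_score = len(set(w.lower() for w in words)) / len(words) if words else 0.0

    # Feature 3: average sentence length, normalized against 20 words
    # (penalizes short, punchy sentences regardless of their craft).
    avg_len = len(words) / len(sentences) if sentences else 0.0
    sentence_score = min(avg_len / 20, 1.0)

    # The weights encode the designer's values; changing them changes the grade.
    weights = {"length": 0.4, "variety": 0.3, "sentence_rhythm": 0.3}
    features = {"length": length_score, "variety": variety_score,
                "sentence_rhythm": sentence_score}
    total = sum(weights[k] * features[k] for k in weights)

    return {
        "score": round(total * 100, 1),
        "features": features,
        "weights": weights,  # exposed so students can ask: why these numbers?
    }
```

Because the grader returns its features and weights alongside the score, students can trace exactly why a given essay earned its grade and then argue with the design, which is the actual learning objective.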
Automating learning experiences is not bad in and of itself; it’s bad when it’s divorced from interpersonal relationships, released unmoored from educational contexts, and separated from authentic learning outcomes. And we could really say that this is true for a lot of educational technology: educational technology solutions are bad when they ignore the people and the goals that they are supposed to serve.
I perhaps remain that optimistic person who went off to a PhD program thinking that continued training would help her solve big problems and help her help others enjoy learning. And I also remain that recent college graduate, unsure of how to approach education in light of entrepreneurial calls to disrupt the interpersonal relationships that are so critical to becoming successful learners and writers. But I have also become a professional hardened by and concerned with the ways that education seems to continue amplifying inequities in our country. At my core, I’m hopeful. I’m hopeful that there will be more of us (and there are plenty of us out there) who do not want to ignore the need to “scale” education, who do not want to ignore the critical and central role that digital, networked devices – and their effects – play in student learning, but who also want to bring the “education” back into “education technology,” and the actual, real kind of change that matters back into “innovation.”