Algorithms Are Only As Intelligent As the Humans Behind Them.

Automated learning pathways are currently a hot topic in education because they have the potential to personalize learning and provide timely intervention for students. With automated pathways, also known as adaptive learning, instruction changes based on a student’s current level of understanding. In some cases, adaptive learning is teacher-directed: teachers push new content to their students based on the learners’ previous performance. In other cases, the pathways are “system-generated,” so the technology “automatically” adapts to meet the needs of individual students. The latter option is far more controversial because it raises the question: Are algorithms replacing teachers? What if the system “misunderstands” and doesn’t provide students with the proper support and intervention?

These are legitimate concerns. However, before we jump into the debate, we should consider what “automatically” means when we talk about automated learning pathways. Automated means that someone (a human) deliberately programmed a next step for students based on their performance outcomes. A group of people programs a series of pathways for students. These pathways are essentially “if/then” statements—otherwise known as algorithms: if the student receives x on an assessment, release y.
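To make the idea concrete, here is a minimal sketch of such an if/then pathway rule. The function name, the passing threshold, and the resource labels are all hypothetical illustrations, not how any particular adaptive platform actually works:

```python
# A hypothetical if/then pathway rule: if the student's assessment
# score falls below a threshold, release a remedial resource;
# otherwise release the next unit. All names and values are illustrative.

def next_step(score, passing_threshold=70):
    """Return the content to release based on an assessment score."""
    if score < passing_threshold:
        return "remedial_lesson"  # y: targeted support
    return "next_unit"            # y: advance along the pathway

print(next_step(55))  # a failing score triggers remediation
print(next_step(85))  # a passing score advances the student
```

Even this toy version makes the article’s point visible: a human chose the threshold and decided what gets released, and those choices are the algorithm.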

Here the human component becomes even more critical. How granular are the programmers in their approaches to intervention? Do they break down each competency into a series of possible errors to provide targeted remedial support? The best adaptive programs design new lessons specifically for a particular problem area. For example, in English, if a student doesn’t receive a particular score on the rubric line for topic sentences, that student receives resources that specifically support writing effective topic sentences.

However, most adaptive programs “tag” existing content pages and external resources from the vast World Wide Web and push them to students when they do poorly on a topic. In a math example, students might receive 10 videos and 25 lessons tagged “distributive law.” Students are expected to sift through this content when they’re already feeling overwhelmed. Granted, learning to sift through online information is a critical 21st-century skill, but it’s not the best approach for targeted remediation. It’s the difference between identifying that

  1. The student failed the distributive law test and must need remediation in how to use the distributive law to solve equations.

OR, that

  1. The student doesn’t understand how to collect like terms and has had difficulty using the distributive law to solve an equation. So, the student must need remediation in how to collect like terms if he or she is to understand the distributive law.

The latter approach is far more targeted; it helps students set specific, measurable goals for improvement and encourages a growth mindset. Such targeted instruction requires tremendous thought and planning up front, but the value for students is immense.
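The two diagnostic approaches above can be sketched side by side. The skill names, score fields, and cutoff below are hypothetical, chosen only to mirror the article’s distributive-law example:

```python
# Contrast between a coarse rule (remediate the whole topic) and a
# targeted rule that checks a prerequisite sub-skill first.
# All field names and cutoffs are illustrative assumptions.

def coarse_remediation(results):
    """Flag the whole topic whenever the test is failed."""
    if not results["passed_distributive_law_test"]:
        return "remediate: distributive law"
    return "advance"

def targeted_remediation(results):
    """Check the prerequisite sub-skill before flagging the topic."""
    if not results["passed_distributive_law_test"]:
        if results["collecting_like_terms_score"] < 70:  # hypothetical cutoff
            return "remediate: collecting like terms"
        return "remediate: distributive law"
    return "advance"

student = {"passed_distributive_law_test": False,
           "collecting_like_terms_score": 40}
print(coarse_remediation(student))    # sends the student back to the topic
print(targeted_remediation(student))  # pinpoints the prerequisite gap
```

The targeted version encodes the extra up-front analysis the article describes: someone had to decide that collecting like terms is a prerequisite of the distributive law and build that into the pathway.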

Adaptive learning pathways don’t eliminate teachers from education. Careful, thoughtful educators are more important than ever—both in the classroom and behind the scenes. When we think about technology creating adaptive pathways for students, we should remember that each pathway was programmed by someone. So who are the people behind your curriculum?

One thought on “Algorithms Are Only As Intelligent As the Humans Behind Them.”

  1. This is an important article introducing a vital subject which, I hope, will lead to more reflection, experimentation and debate.

    In the field of learning we too often find “solutions” (methods, technological applications) that make logical sense in an abstract framework but utterly fail in concrete reality. There are, in my opinion, two main reasons for this failure.

    1. Institutional inertia: whereas any serious methodological breakthrough requires reexamining the underlying theses (what are we teaching? why? with what expected results?), the tendency is always simply to suppose that we have already answered those questions and just drop the innovation on top of the existing system.
    2. The culture factor: learning is ALWAYS cultural (i.e. social) and NEVER purely individual (this is the worst illusion, if not delusion, we suffer from). It ALWAYS involves the way groups of people along multiple horizons — society, experts and authorities, learners, interest groups, etc. — account for the world itself and specific areas of knowledge within both the dominant worldview and the acceptable variants on that dominant view.

    Culture = commonly perceived and/or held or remembered assumptions, beliefs, associations and stories. As the article clearly demonstrates, there are already multiple components of any supposed knowledge set (itself a cultural construct rather than a simple datum), but there are also social and psychological components that give expected “units of knowledge” (e.g. “topic sentence”, “distributive law”) their structure. One of the functions of the traditional school or university — as opposed to online ones — is to provide a physical environment for random non-academic social exchange that ensures some form of culture will organically grow and provide the framework for giving meaning to what students are learning.

    When we begin thinking seriously about learning culture (including multiple horizons) we can begin productively addressing the issues of how technology and, more particularly, algorithmic logic and adaptive strategies can be built into an effective learning process.
