The CEO of Duolingo, Luis von Ahn, “didn’t expect the blowback” to his announcement that the company is using AI in lieu of human contractors.
The outrage stems from AI replacing jobs, of course. Von Ahn clearly thinks that this is the issue that has upset users. Though to be clear, AI can replace you, but not him:
“I’m not going to claim CEOs are that special,” he says. “It’s just somebody has to tell others . . . ‘This is where we’re going.’ And AI is not particularly good at that yet.”
Anyway, AI can replace you, you’re mad about it, blah blah blah. People weren’t always happy when teaching machines threatened to replace teachers nearly 100 years ago, either. Skinner himself believed that teaching machines and programmed instruction could outperform some teachers.
But, believe it or not, the rest of von Ahn’s statement has far deeper problems.
...
The best curriculum design seems simple. But so does the worst curriculum design. Drowning can look a lot like swimming.
Bad curriculum design conveys the wrong information. Bill Heward gives examples: a worksheet where the components of compound words should be connected with a line, but the matches are color-coded – not necessarily an exercise in learning compound words, just in matching colors. Or a fill-in-the-blank with a word bank, where each blank leaves a space for every letter of the missing word – which teaches nothing to a kid who can count letters.
These examples look, at first glance, like reasonable exercises. On closer examination, they may not be teaching, and could even interfere with learning.
(Pictured: a compound-word worksheet whose matches are color-coded, so it can be completed by simply matching colors.)
(Pictured: a fill-in-the-blank where each blank is followed by the number of letters in the missing word; the letters in each word-bank entry can simply be counted.)
While Heward wrote his article 22 years ago, the above examples were found this week.
Zig Engelmann used the term “faultless communication” to describe how he designed lessons. How can a teacher introduce a concept in a way that cannot be misunderstood?
And we start to see how hard it can be to teach a concept like “red.” Theory of Instruction starts with a sequence that looks easy enough: positive examples that each vary in their irrelevant details, negative examples, and test examples. But handing a child two red crayons and saying “red” can lead to the following errors:
Does “red” mean something wrapped in paper? Does “red” mean something which is handed to you? Engelmann viewed these potential misunderstandings as the fault of the teacher, not the student.
Suddenly it’s abstract: teach the concepts glert and not-glert with necessary features (A) and extraneous features (B, C, & D). Red is a color, but being a crayon (B), being in the classroom (C), and being held by the teacher (D) are not qualities of red. In Figure 2.2, example 2 may be a red ball on the playground (not a crayon, not in the classroom, not held by the teacher) – and so on.
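Engelmann’s own analysis is prose and diagrams, but the logic lends itself to a toy sketch. Below is a minimal illustration in Python – my own construction, not anything from Theory of Instruction – that represents each example as a set of features and flags any extraneous feature a learner could key on instead of the essential one. The feature names and the consistent_misinterpretations helper are invented for the illustration.

```python
def consistent_misinterpretations(positives, negatives, essential):
    """Return extraneous features present in every positive example and
    absent from every negative example. Anything returned here is a fault
    in the communication: a learner could key on it instead of `essential`
    and still get every example 'right'."""
    shared = set.intersection(*positives) - {essential}
    return {f for f in shared if all(f not in neg for neg in negatives)}

# Handing the child two red crayons and saying "red":
crayon_only = [
    {"red", "crayon", "wrapped_in_paper", "handed_to_you"},
    {"red", "crayon", "wrapped_in_paper", "handed_to_you"},
]
print(consistent_misinterpretations(crayon_only, negatives=[], essential="red"))
# prints (in some order): {'crayon', 'wrapped_in_paper', 'handed_to_you'}
# "red" might mean any of these, as far as the learner can tell.

# Vary the extraneous features across positives (a red ball on the playground,
# a red shirt at home) and add a negative that keeps them (a blue crayon in the
# classroom, held by the teacher):
varied_positives = [
    {"red", "crayon", "classroom", "held_by_teacher"},
    {"red", "ball", "playground"},
    {"red", "shirt", "home"},
]
negatives = [{"blue", "crayon", "classroom", "held_by_teacher"}]
print(consistent_misinterpretations(varied_positives, negatives, essential="red"))
# prints: set()
```

The empty result is faultless communication in miniature: once the extraneous features vary across the positives and reappear in the negatives, only redness itself survives as an explanation for the label.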
...
Good curriculum design is hard. It’s not the solution to all ills, but it’s important. Here is something that von Ahn had to say in his AI statement:
To teach well, we need to create a massive amount of content, and doing that manually doesn’t scale. One of the best decisions we made recently was replacing a slow, manual content creation process with one powered by AI.
[...]
However, we can’t wait until the technology is 100% perfect. We’d rather move with urgency and take occasional small hits on quality than move slowly and miss the moment.
This is an alarming statement for a learner to read. But another curriculum designer might have something to say about it.
Bob Mager, no less an instructional genius than Zig but comparatively laconic, insists that the teaching relate directly to the goal. That is, if you want to check whether a person can tie their shoe, you don’t assign them an essay – you watch them perform. Start by analyzing the goal.
What should a successful language app do? It should create a fluent language speaker – one who speaks accurately and quickly. And it should get the learner there as quickly and efficiently as possible.
But we all know: that’s not what a successful app does.
It is not the goal.
...
Von Ahn stresses the amount of content over the quality. The food is terrible, and such small portions. Who is clamoring for a trough of slop?
The goal analysis tells us. The goal of Duolingo is not to create a fluent speaker. It’s 2025 and we all know what makes for a successful app. The goal is engagement. The goal is to sell ads and subscriptions. The goal is daily active users.
Creating a fluent speaker would, ironically, cause the user to leave the app.
On the other hand, an unending stream of content would convince a learner that they should return in perpetuity – they never reach the end. They never reach their goal.
But von Ahn will reach his.