37 Comments

"We know that it is possible to deliberately create high growth cultures, and we know that software makes it easier to scale them. But there are many open questions. (If there is interest, I could do a write-up of my understanding of this design problem.)" πŸ™‹β€β™‚οΈ


This is very interesting, Henrik; I think a lot of us would like to know more. Screen time is such a big issue in education, and I am wondering how this will fit into that conversation.

I could see Simulated Intelligence (SI; I refuse to call them AI) doing a sort of rote teaching, but most of teaching, to me, is about modeling from the teacher, so I have a fair bit of skepticism. The emptiness, the flatness, of the large language models is definitely something we wouldn't want emulated. I feel like we already have too much of 'seeming alive without actually being alive'.


Much of tech has the unfortunate fate of being bastardized. For instance, most of the bandwidth on the internet goes (or went) to Netflix. I agree with you and your other readers: these AI tutors can help with personalization of learning - "if" they are used properly. For instance, AI helping them with individual projects. Like your article on geniuses: most invested time in their own projects, possibly out of boredom. An AI tutor, which can shine a light on the darkness around their own research projects - outside of schoolwork - might just create other J.S. Mills.

Apr 4, 2023 · Liked by Henrik Karlsson

This is great. I think there is a big difference between tutoring and the milieu you talk about. I think about Mike Piazza getting batting practice from Ted Williams, or Picasso himself getting instruction from his father, the artist/professor. It seems like we have a ways to go before we replace that type of human interaction and training, where you learn by seeing it done live.


I think the best tutors inspire their tutees. I wonder if AI tutors will be able to make mini sub-cultures in each of their sessions, making learning personalized and purposeful for kids who’d otherwise not care.


One challenge will be to create AI tutors that don't feel like tutors, so that students can learn at a variable pace and aren't put off by a perceived dictatorial style of tutoring. I'm not as worried about cultural acceptance; if it's effective, it should only take a generation or so to become commonplace.


Thank you, Henrik, for this. I have been teaching and mentoring kids from rural areas and under-resourced urban areas for the last 10 years, and I can vouch for the importance of learning cultures. I am interested to know your thoughts on how to create and scale high-growth cultures deliberately.


I work at an agile learning centre with self-directed young people and work on a programme where we help them set goals. Broadly, we ask them to look at their past, their present, the future they want, the journey to get there, and how they will define success. I honestly believe that LLMs are a powerful tool, but in the context of your argument about culture, as russett's comment gets at, they are no different from Google.

You are right in your assumption that culture is the bottleneck, but I think a very specific aspect of culture is going to prove to be the problem, and that is school itself. The reason Khan Academy has so little impact in schools, and the reason your assumption is true that if AI and LLMs were "introduced in schools, I doubt most children would leverage these systems to grow significantly faster," is that schools erode our intrinsic motivation.

All the young people who come to our community from school have to go through a process of deschooling, where they essentially play and reject any notion of anything that looks like formal learning (the time this takes varies depending on how negative their experience of school was). But when they come out the other side, they are able to flourish, set their own goals, and work towards them on their own terms. As you note, teachers will become less useful; we call ourselves mentors or facilitators, and that probably is the future of the profession.

And when young people are in that space, with plenty of intrinsic motivation, the tools will actually prove useful and have impact. Just this week I took a tool we use to help think about goals, called a learning sprint, and applied ChatGPT to it.

You start with a goal, and then you note down all the possibilities you can think of within that goal. Say the goal is the Romans: you write down anything you know about the Romans, and anywhere you could learn about the Romans - the What to Learn and the How to Learn. Then, out of those possibilities, you build a story - a question, a project proposal, or a maker project, depending on the nature of the goal. Lots of entertainment themes might come up - gladiators, plays, colosseums, lions - and so the story might be: How were the Romans entertained?

Then you create a task list of how you would go about answering that question, and you have a few weeks' worth of research/making/writing/thinking to delve into.

ChatGPT can be used to help at multiple stages. You can prompt it to help with the What to Learn and the How to Learn, creating a broader range of possibilities to draw on when you create your story. You could put all the possibilities you have listed into it and ask it to point out the themes, to help you craft your story. You could ask it to help as you create the task list. The sub-task of researching Roman plays can be seen as a broader task that can be narrowed down into more specific tasks: who are the famous Roman playwrights? Find age-appropriate translations of plays, and translations in picture-book format. Which plays were most popular at the time, and which are most read now? These can all be researched through the interface, all the while building on the previous questions as you narrow down to what you want/need.
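
To make that narrowing-down step concrete, here is a rough sketch of how it could even be scripted rather than typed into the chat window. To be clear, this is only an illustration: the model name, the prompts, and the narrow_down helper are my own assumptions, not something from the actual programme, where we simply do this conversationally in the interface.

```python
# Rough sketch: scripting one step of a "learning sprint" with an LLM.
# Assumes the OpenAI Python client (`pip install openai`) and an API key in
# the OPENAI_API_KEY environment variable; the prompts and the model name
# are illustrative, not part of the actual programme.
from openai import OpenAI

client = OpenAI()

def narrow_down(goal: str, sub_task: str) -> str:
    """Ask the model to break a broad research task into child-friendly questions."""
    prompt = (
        f"A child is doing a learning sprint on the goal: {goal}.\n"
        f"Break the task '{sub_task}' into 4-6 smaller research questions "
        f"an eight-year-old could tackle one at a time."
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # any chat model would do; this is just an example
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(narrow_down("the Romans", "Researching Roman plays"))
```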

These are all tasks that a facilitator is there to help you with as you progress, but you could outsource them to an LLM if the facilitator were not there. You could take the learning sprint tool and work on it with me for an hour, then go home and, now that you know how to use a learning sprint, work on your other three goals in your own time with ChatGPT to assist you. Or a facilitator could theoretically do all of these things with you but might not really know enough about the Romans, and so you could use an LLM in the presence of a facilitator to get the best of both worlds.

But as I note, this relies on the intrinsic motivation of an eight-year-old walking up to you and going, "I need to learn about the Romans. Will you help me?" Young people have had the ability to think up these goals and then go out and find the information for at least the last twenty years, so in that sense it is no different from Google. In some ways I like to think of it as a more powerful Google: Google is a tool that releases self-directed learners into the world and allows the whole world to permeate back at them, and LLMs are just the same but much more powerful and intuitive, as they do the heavy lifting of the searching and filtering for you. But as you recognise, the culture is causing the bottleneck somewhere, and it was doing so even when we had Google to find the answers to almost everything. I believe it is the way school indoctrinates us in our relationship with learning, taking away intrinsic motivation and leaving us reliant on outside forces to motivate us, tell us, teach us, and grade us.

The key learning objective really is knowing how to use the software and knowing how to ask the right questions; that is the practice: what questions are going to give me the answers I want? You tweeted about recursive lists to find further authors to read when delving into a topic. An eight-year-old is not going to know to ask that. But our job as mentors is to help them develop with these tools so that those types of questions will seem logical and second nature to them as they get older. And when a tween says "I want to learn to be brave," as one did to me yesterday, it is about knowing that an LLM is probably not going to be able to help as much as a facilitator who knows you intimately. That exploration is best done back in the real world with those meat sacks we call people.


“But soon we will be able to spend less precious human time on basic tutoring; instead, the emotional labor we do to support each other can be invested in a more leveraged way.”

I have a feeling that part of the power of tutoring comes from having contact with an adult who is really excited about understanding the world, and who also believes you can understand it too. That is, the process of teaching has a role in instilling culture; it’s not merely a complement to it. If my hypothesis is right, there will still be a place for humans doing basic tutoring. Maybe human tutors can serve solely as advisers and guides, but I’m not sure this is enough.


Interesting words, and I think the thesis that culture is a limiting factor in converting AI access to growth/learning is a correct one. I feel confused by a few things, though:

- If AI progress continues the way it has been, what's the point in scaling such high-growth cultures? We're already at a point where LLMs perform knowledge tasks better than a large portion of the population could. Is there a benefit to people outside of some small portion of the population learning to use LLMs for growth in this manner?

- It seems like we're moving towards a world where having a broad knowledge base internalized is less valuable. In particular, we're moving towards a world where we can access knowledge "just in time," the same way we currently manage logistics (or at least did, prior to pandemic-related supply-chain issues). In a few years, it may be that your obsessions with esoteric questions are just a new variant of entertainment, the new way people like you "distract themselves and wriggle out of work".

I suspect that a key cultural shift will be that people move from "just Google it" to "just ask ChatGPT" -- and once that happens, once a new generation grows up with LLMs and is as fluent at prompting them as millennials are at searching Google, and once AI companies make LLMs easier to use, what's the difference between the world we inhabit and the one you worry we won't?


Your talk about exceptional people reminded me of chapter 9 of Atomic Habits (the chess prodigies).

They were tutored to play chess from childhood, and I guess their becoming great at chess was only possible with their parents' guidance.

That's why I think that if a kid wants to excel at something, they either need to discover, from a young age, something they're obsessed with, or be highly directed by their guardians. I guess this is something that also happened with Tiger Woods. He grabbed a golf club when he was 2 years old. By age 12, he already had his 10k hours in (I think; don't quote me on that math).

This was a great read! I was thinking about AI and school, but I had a much darker vision than yours, because there are companies marketing AI not as a tutor but as something that just does your homework. Your text gave me hope for a better future! Thanks!


"Can we figure out ways to scale access to high-growth cultures? Are there ways for more people to grow up in cultures that approximate in richness that J.S. Mill, Pascal, and Bertrand Russel had access to?"

I think this is the key question right here, and I think on some level the answer is replicating it in microcosm in a way that stays tight-knit across geographical barriers, and then trying to grow and spread that seed from there.


I’m definitely interested in reading about your understanding of the high-growth-cultures design problem (or scaling problem). I guess you are talking about designing in-school cultures (or learning-pod cultures, homeschool co-op cultures, or even adult study groups and intellectual circles), but when I read “high growth cultures” I immediately think about the cultural groups in America that exhibit disproportionate academic success: Jewish Americans, East Asian immigrants (plus other successful minorities described in The Triple Package book, https://books.google.com/books?id=4F6MAgAAQBAJ&pg=PT3), and the American elites (what Matthew Stewart calls “the 9.9 percent”, https://www.theatlantic.com/magazine/archive/2018/06/the-birth-of-a-new-american-aristocracy/559130/).

There seems to be a neo-strict-school trend (https://www.economist.com/britain/2023/01/16/why-super-strict-classrooms-are-in-vogue-in-britain) trying to “scale” the East Asian disciplinarian style in schools (because, I guess, it is easier than scaling the other styles).

The Triple Package book aims to explain why some groups “seize on education as a route to upward mobility”. It argues that education and hard work are not a good explanation for success; they are a “dependent variable”. Some of the key motivators described in the book are a constant sense of insecurity and a feeling of not being good enough. So I see a design problem there (although perhaps not the one you had in mind), and even if it were feasible, I’m not sure whether it is a good idea to scale insecurity.


To me, good teaching always approximates the father-to-son (mother-to-daughter) or master-to-apprentice dynamic. It is about experienced humans passing on their humanity, or parts of it, rather than simply abstract facts or skills.

I expect to see reports coming out showing that simulated intelligence can teach this or that better based on this or that metric; this isn't news. We have known for a long time that we can build things to hit metrics, and that the purpose-built machine is often fragile in unexpected and critical ways. That is not a fragility we want for our children.

To me, the most interesting thing about machine tutors is what we can learn about humans and human learning from the machine tutors' failure modes, but I worry that no one will be watching to catch the failures before they cascade into something truly nasty.


"We will tend the bull" - what does this mean?
