The learning facilitators they mention are the key to understanding all of this. The school still needs humans in the room to maintain discipline and make sure the kids actually engage with the AI. But roles that used to be teaching jobs have been redefined as “learning facilitators”, and apparently former teachers have rejoined the school under that title.
Like a lot of automation, the main selling point is deskilling roles, reducing pay, and making people more easily replaceable (you don’t need a teaching qualification to be a “learning facilitator” to the AI), while producing a worse service that is just good enough as long as it’s wrapped in hard-to-verify claims and assumptions about what education actually is. Of course, it also means a new middleman parasite siphoning off funds that used to flow to staff.
They could just have the kids read actual books designed by actual pedagogic experts, books that actually teach when you study them.
Now nobody knows whether the “AI” is even teaching real things, whether it only uses properly vetted material, or whether the structure it proposes makes any sense.
Yes, teachers are fallible, but they are also human and can emotionally understand what is going on during learning in a way a trained algorithm just cannot. It also means there has to be a clearly defined “goal” of knowledge and competencies, and the algorithm can only fill the holes, rather than encourage students to seek knowledge beyond the established set.
Also I am skeptical how much of this is even “AI” in the sense of needing a machine-learning approach, rather than just regular computer tests that report which “level” has been reached in each category and where there is still room to improve. Chances are, this could be done with an Excel sheet.
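For what it’s worth, the non-ML version of this really is just a lookup table. A minimal sketch of a per-category “level” report (all names and thresholds here are invented for illustration, not taken from the article):

```python
# Hypothetical sketch: the "AI" described could be a plain rule-based
# report over per-category test scores -- no machine learning involved.
# The level labels and thresholds are made up for illustration.

LEVELS = [(90, "advanced"), (70, "proficient"), (50, "developing")]

def level_for(score: int) -> str:
    """Map a percentage score to a level label (fallback: 'needs support')."""
    for threshold, label in LEVELS:
        if score >= threshold:
            return label
    return "needs support"

def progress_report(scores: dict[str, int]) -> dict[str, str]:
    """Per-category level report: which level each category has reached."""
    return {category: level_for(score) for category, score in scores.items()}

report = progress_report({"algebra": 85, "reading": 42, "spelling": 71})
# {'algebra': 'proficient', 'reading': 'needs support', 'spelling': 'proficient'}
```

Which is to say: a handful of IF statements, exactly the kind of thing a spreadsheet does.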
Aren’t there laws about who gets to teach kids? I know there are strictures on teacher-to-student ratios, but how can those exist without a written definition of what a teacher is?
So the thing is, this is a private school charging the sort of fees that attract really good teachers and use them as a selling point, so I don’t actually think being cheap is the goal here. I think some idiot thinks this is actually a good idea.
Unfortunately this trend is happening in the States even without the AI buzzwords (though those are there too). You give every kid a tablet with educational apps that feed into a curriculum algorithm. Teachers are told by the algorithm which student needs help with what; basically they become facilitators to the app. Then you also have “student summarizers” that “analyze” a student’s written or audio submission and flatten it down to some uniform stats.
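A minimal sketch of what such a curriculum algorithm plausibly amounts to (the 60% threshold and all names here are invented, not from any actual product):

```python
# Hypothetical sketch of the "curriculum algo" loop described above:
# per-student, per-topic mastery scores come in from the apps, and the
# teacher is handed a list of (student, topic) pairs flagged for help.
# The threshold and all names are invented for illustration.

HELP_THRESHOLD = 0.6  # flag anything below 60% mastery

def flag_for_help(mastery: dict[str, dict[str, float]]) -> list[tuple[str, str]]:
    """Return (student, topic) pairs the app would tell the teacher to help with."""
    flags = []
    for student, topics in mastery.items():
        for topic, score in topics.items():
            if score < HELP_THRESHOLD:
                flags.append((student, topic))
    return flags

flags = flag_for_help({
    "ana": {"fractions": 0.45, "reading": 0.9},
    "ben": {"fractions": 0.8, "reading": 0.55},
})
# [('ana', 'fractions'), ('ben', 'reading')]
```

Again: a threshold comparison in a loop, with the teacher reduced to acting on its output.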
In some areas of the USA, teaching degrees aren’t required to actually teach. I hope I don’t see this world-wide.
There is a lot of benefit to be had, though. It will likely suck at first, and I think the tendency to outsource this kind of thing is idiotic. The government needs to be both the AI administrator AND the company, because AI is extremely privacy-invasive and should never be commercialized in any capacity involving kids. I don’t support even the school having full access to a child’s prompting. I say this because I have intimate knowledge of what kind of information can be accessed this way and how invasive it is; I only run my own open source models on my own offline hardware. The only person within a school with full access to a child’s prompting should be someone bound to confidentiality and a Hippocratic-style oath, like a licensed psychiatrist with no obligations or bias towards the school’s petty interests.
The education system is largely antiquated presently. I’m all for supporting my community with living wage jobs. Our reductionist culture is a big part of why we are falling apart. When we are presented with efficiency improvements, we are too stupid to adapt, and too stupid to use them as a resource. We flush out that newly created value instead of investing it immediately within ourselves.
The world has changed from the era when a traditional teacher was relevant. Audiovisual information is our primary form of communication. With readily available video, it is criminal to continue live lecturing and presentation of static information. There is no chance that the live presentation of information is anywhere near the quality of a polished and edited video. There is very little chance that any given lecturer is truly the best at presenting such information. That also glosses over the fact that there is an enormous range of personalities and functional thought processes; it is extremely unlikely that any given teacher connects well with each individual student. We have had readily available video communication for over a decade. Some university professors readily use the medium and offer class time as more of a workshop or lab environment. Most primary schools lack this kind of adoption of technology, complexity, and efficiency needed to keep up with the changing world. In truth, we also lack any requirement for a teacher to be a lifelong learner.
I expect much the same Luddism with AI. With teaching kids, this is pushing AI to the point where it needs serious supervision to be effective. Maintaining a child’s autonomy and right to privacy is absolutely critical for the future of society as a whole. However, the ability for AI to adapt to any functional thought and help with individualized problem solving is something that no teacher is capable of with more than one student at a time.
Most of us had to persist through our frustration in order to learn. AI can directly and individually address that frustration and find a solution. It is not always correct, but it is in the same realm of accuracy as an above average teacher. Maybe you too were aware of just how many teachers did not even know the subjects they were tasked with teaching in primary school, I certainly was.
With readily available video, it is criminal to continue live lecturing and presentation of static information. There is no chance that the live presentation of information is anywhere near the quality of a polished and edited video. There is very little chance that any given lecturer is truly the best at presenting such information.
christ
However, the ability for AI to adapt to any functional thought and help with individualized problem solving is something that no teacher is capable of with more than one student at a time.
it doesn’t do this
It is not always correct, but it is in the same realm of accuracy as an above average teacher. Maybe you too were aware of just how many teachers did not even know the subjects they were tasked with teaching in primary school, I certainly was.
I’m sorry your teachers sucked badly enough that you could replace them with a prerecorded video and a statistical language model that’s notorious for generating confident, dangerous lies. I don’t think most kids should have that kind of experience in school though, and if they currently are, maybe we should do what it takes (funding, regulation, strikes) not to go in that direction.
The thing is, technology could absolutely play a huge role in advancing education, allowing students to approach material at their own pace and (algorithmically, not black-box bullshit) adjusting problem sets to maximize what students get out of them.
But the point of that is to free the actual teacher to spend their time one-on-one with students in the areas where they need extra attention. It’s not to replace teachers with some unreliable bullshit machine.
(It should also probably be only part of the schedule. Various group settings have a bunch of value in a bunch of contexts, both for the material and for social stuff.) But you could absolutely enhance learning.
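A minimal sketch of what a transparent, non-black-box adjustment rule could look like, borrowing the staircase idea from adaptive testing (everything here, names and numbers alike, is invented for illustration):

```python
# Hypothetical sketch of a transparent difficulty-adjustment rule:
# step difficulty up after a streak of correct answers, step it down
# after a miss. Nothing here is a black box -- a teacher (or parent)
# can read the whole policy in a few lines.

class Staircase:
    def __init__(self, levels: int = 10, up_after: int = 2):
        self.level = 0             # current difficulty (0 = easiest)
        self.max_level = levels - 1
        self.up_after = up_after   # correct answers needed to step up
        self.streak = 0

    def record(self, correct: bool) -> int:
        """Update difficulty from one answer; return the new level."""
        if correct:
            self.streak += 1
            if self.streak >= self.up_after:
                self.level = min(self.level + 1, self.max_level)
                self.streak = 0
        else:
            self.level = max(self.level - 1, 0)
            self.streak = 0
        return self.level

s = Staircase()
for answer in [True, True, True, False, True, True]:
    s.record(answer)
# the student climbs while they succeed and drops back after a miss
```

The entire policy is auditable, which is exactly the property an LLM-driven version can’t offer.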
AI can directly and individually address that frustration and find a solution.
No, it can’t.
We have had readily available video communication for over a decade.
We’ve been using “video communication” to teach for half a century at least; the Open University enrolled its first students in 1971. All the advantages of editing together the best performances from a top-notch professor, moving beyond the blackboard to animation, and so on were obvious in the 1980s, when Caltech did exactly that and made a whole TV series to teach physics students and, even more importantly, their teachers. Adding a new technology that spouts bullshit without regard for factual accuracy is necessarily, inevitably, a backward step.
the kicker here:
this GCSE costs £27,000 per student per year
they’re paying this much not to have teachers
I’m trying to find any detail on what they actually do in this thing and I can’t find anything. As far as I can tell the AI does ??shit?? and the “learning facilitators” are the “teachers”.
Seems like an awesome way to get tech millionaires with weird ideas about education from reading too much Ayn Rand to cough up 27 grand a year to educate their unfortunate kids.
I’m trying to work out what the fuck is up with this school. Like that’s a private school fee, sure.
But in what world did this look like a good idea?
for comparison, there’s a school that does an online GCSE at £5k/yr full rate, popular with diplomats and expats, but for about half the students it’s paid for by the local council as disability support (kids who can’t attend a physical school for some reason). I predict everyone involved would shit if they tried this AI nonsense on them.
Hahahaha! That is by far the stupidest embrace of “AI” possible.
I’d say I can’t wait for it to fail, but whenever I say that things tend to become intractably lodged in the culture, so. Great success!
Here’s a thought: take that money per kid per year and pay a teacher a decent salary for a class of 15 kids.
Of all the awful and bad reasons to homeschool, “my government forces my kids to learn parroted bullshit” is probably the most annoyingly valid.
There’s absolutely no way this could possibly go wrong ever.
Something tells me they’re not just slapping chatGPT on the school computers and telling kids to go at it; surely one of the parents would have been up-to-date enough to know it’s a scam otherwise. At the very least, surely the students will start to get upset that they’re getting made fun of for the “facts” they’re learning from chatGPT, complain to their parents, and cause the school to get sued.
It seems like a very stupid scam to try to teach rich kids with chatGPT, which is why I’m wondering if they’re using something else. They could be acting as a testbed for a new AI designed specifically for teaching. I wouldn’t put it past rich people to use their kids as guinea pigs if it meant they could save or make money elsewhere.
Unfortunately the article doesn’t mention what kind of AI they’re using though.
Something tells me they’re not just slapping chatGPT on the school computers and telling kids to go at it; surely one of the parents would have been up-to-date enough to know it’s a scam otherwise.
If people with money had that much good sense, the world would be a well-nigh unfathomably different place…
I extremely much want the details on how this all works.
The UK has the worst priorities.
This isn’t the UK government or UK public education policy, to be fair on the UK. It’s a £27,000-per-year private school in London - the sort that helps ram the possibly-not-so-bright kids of the wealthy through their GCSEs and A-Levels.
The US hates robots. It’ll never catch on.
They hate ATMs, automated drive through kiosks, hitchhiking robots, and electric cars.
I actually don’t get the general hate for AI here.
To be clear, I don’t think AI-assisted learning in this specific way is a great idea, but for people who were taught to be sceptical of what LLMs produce, learning with LLMs can actually be kind of great. You can ask it very specific or even “dumb” questions and it will explain in great detail and spend way more “time” with your individual questions than a teacher normally could, leading to you being able to learn at your own pace. Also, from my experience as a student current LLMs are better at explaining than the average teacher is.
Overall, I think our education system is largely outdated and schools of the future won’t look like one teacher explaining and 25 people more or less not listening, and I suspect a more individualised approach with supportive teacher roles will lead to better education overall.
I actually don’t get the general hate for AI here.
Try harder.
you’re identical to the other poster in both tone and content so either you’re using the same LLM to write your posts or you’re otherwise extremely familiar with each other. either way we don’t particularly need you here, but before you go:
Also, from my experience as a student current LLMs are better at explaining than the average teacher is.
Overall, I think our education system is largely outdated and schools of the future won’t look like one teacher explaining and 25 people more or less not listening
one day you’ll hopefully grow old enough to realize how lazy this shit looks to other people
You can ask it very specific or even “dumb” questions and it will explain in great detail and spend way more “time” with your individual questions than a teacher normally could, leading to you being able to learn at your own pace.
Yeah, but there’s zero guarantee that it explains anything factual to you; it might just make shit up, either fully or partially. LLMs are absolutely not suited for this, which their use as search engines and fact checkers has already shown. Teaching a couple of generations possibly false things, and teaching them to rely on the word of machines that are just pretending to know things, is just a terrible idea.
VR-assisted teaching could be cool though, as there are a lot of additional tools that could be used in a digital space. But I don’t see this becoming the norm, considering how expensive HMDs alone already are, let alone all the other equipment needed. People in many places already struggle to afford the needed school books and general school supplies like bags and pencils, and teachers everywhere are almost always out of budget for their classes too.