Jan. 27, 2026
Ep 171: AI In Education: Tech Barriers, Government Plans & Replacing Teachers?!
Book a free demo for MathsZoo now and see how we can help your school smash maths! www.mathszoo.org
Join our free WhatsApp community for Q&A submissions, polls on future episodes & links to the podcast first: https://chat.whatsapp.com/HB7n1PNGdGL5STACssEH1s
Leave us a review and share this episode with someone you think might enjoy it! It really helps us out.
Follow us on Instagram: www.instagram.com/teachsleeprepeatpodcast
Follow us on TikTok: www.tiktok.com/teachsleeprepeatpodcast
1
00:00:00,040 --> 00:00:02,080
Hello everyone, welcome back to
another episode of Teach Sleep
2
00:00:02,080 --> 00:00:03,000
Repeat.
My name is Dylan.
3
00:00:03,160 --> 00:00:05,280
And my name's Hayden.
And before we start, Dylan,
4
00:00:05,320 --> 00:00:07,360
you'll see your mouth going.
You always pick on me at the
5
00:00:07,360 --> 00:00:09,000
start of the episode, so I
thought I'd pick on you.
6
00:00:09,120 --> 00:00:10,600
You're wearing a Gym Shark
hoodie right now, and you
7
00:00:10,600 --> 00:00:12,360
haven't been to the gym in about
10 years.
8
00:00:13,120 --> 00:00:15,160
Is there a reason for that?
I'm also not a shark.
9
00:00:15,960 --> 00:00:17,480
Well, you know, that was a bit
more obvious.
10
00:00:17,680 --> 00:00:20,800
I don't actually, but you have
not been to the gym
11
00:00:20,800 --> 00:00:22,840
in a long time.
Just came back to me and I am a
12
00:00:22,840 --> 00:00:27,040
shark.
Like in clothing I can realise
13
00:00:27,200 --> 00:00:28,880
Not even a shark who can
talk.
14
00:00:28,920 --> 00:00:31,880
Just a shark.
Not me in shark form.
15
00:00:32,080 --> 00:00:35,800
A literal shark.
Imagine at this point I realise
16
00:00:35,800 --> 00:00:38,320
a shark let me in the house, a
shark got me a drink, a shark
17
00:00:38,320 --> 00:00:40,360
took me upstairs, turned the
lights on.
18
00:00:40,480 --> 00:00:42,400
Not only that, I'm just doing
dumbbells the whole time as
19
00:00:42,400 --> 00:00:43,240
well.
So what do you mean?
20
00:00:43,240 --> 00:00:44,200
Yeah?
I'm a gym shark.
21
00:00:44,200 --> 00:00:45,000
Do you know what?
Do you know what?
22
00:00:45,000 --> 00:00:47,320
Bloody you take it all back?
Bloody you take it all back.
23
00:00:47,480 --> 00:00:49,720
Also, sorry, very brave coming
from a man with some kind of
24
00:00:49,720 --> 00:00:51,840
foreign language written on
their top that could say
25
00:00:51,840 --> 00:00:52,560
anything.
Yeah.
26
00:00:52,600 --> 00:00:54,280
Well, do you know what it says?
No.
27
00:00:54,280 --> 00:00:57,160
Well, I'll tell you then.
In that case, it says this
28
00:00:57,160 --> 00:01:00,320
podcast is great and you
wouldn't even know.
29
00:01:00,720 --> 00:01:03,360
You know, there's things of
people who get tattoos and it
30
00:01:03,360 --> 00:01:05,800
turns out it says something
completely different, yeah, that's you
31
00:01:05,800 --> 00:01:06,640
right now.
So funny.
32
00:01:06,720 --> 00:01:08,840
It says something along the lines
of teachers suck.
33
00:01:09,120 --> 00:01:10,080
Look what he's saying to you
guys.
34
00:01:10,080 --> 00:01:12,440
It might do or it might.
It might just say T-shirt, it
35
00:01:12,440 --> 00:01:14,520
might just say extra large.
I don't know.
36
00:01:14,720 --> 00:01:18,440
Have you seen the flip version
where there's pictures of people
37
00:01:18,440 --> 00:01:21,160
in like Asian countries and it
just says like smile?
38
00:01:21,360 --> 00:01:24,400
Yeah, or again tattoos.
They get something tattooed in
39
00:01:24,400 --> 00:01:26,560
English on their arm
and it just doesn't make sense.
40
00:01:26,560 --> 00:01:30,240
Like 'loading', in their language
it means something really profound about
41
00:01:30,280 --> 00:01:32,120
building up, but in English it's just
loading.
42
00:01:32,240 --> 00:01:35,600
Yeah, yeah, does not have the
same profound nature.
43
00:01:36,360 --> 00:01:38,440
We took a bit of a detour there,
but anyway, today's episode is
44
00:01:38,520 --> 00:01:40,320
all about artificial
intelligence.
45
00:01:40,320 --> 00:01:42,480
Now something we talked about
before, but the reason we want
46
00:01:42,480 --> 00:01:44,560
to talk about it today is
because we've just got back from
47
00:01:44,560 --> 00:01:48,880
Bett 2026 and I would say the
biggest thing on my mind when I
48
00:01:48,880 --> 00:01:52,680
finished was, sure, excitement.
There's lots of really good use
49
00:01:52,680 --> 00:01:54,680
cases for how artificial
intelligence can now help
50
00:01:54,680 --> 00:01:58,080
education, but also I, I can't
shake off and I've never been
51
00:01:58,080 --> 00:02:01,080
able to in education,
the reality of actually
52
00:02:01,080 --> 00:02:03,360
implementing it and how
realistic it will be and what
53
00:02:03,360 --> 00:02:05,200
will actually happen then in the
future.
54
00:02:05,200 --> 00:02:07,640
So do you want to lay a bit of
context down, a little bit of
55
00:02:07,640 --> 00:02:09,240
our thoughts and what we even
did there?
56
00:02:09,240 --> 00:02:10,440
Who did we hear?
Yeah, yeah, sure.
57
00:02:10,440 --> 00:02:13,240
So like just to sort of jump on
what you said, my first thought
58
00:02:13,440 --> 00:02:15,360
at Bett was, is there anything
but AI?
59
00:02:15,600 --> 00:02:18,360
Because there was so much stuff
and even last year we went and
60
00:02:18,360 --> 00:02:20,440
we thought, oh, there's a lot of AI
stands popping up now?
61
00:02:20,440 --> 00:02:22,760
Because it was kind of, you
know, the rise of AI, but it
62
00:02:22,760 --> 00:02:26,000
felt like genuinely everything
was about AI there, wasn't it?
63
00:02:26,000 --> 00:02:28,360
Like all of the because we
booked in for a few talks and in
64
00:02:28,360 --> 00:02:31,240
the arena, which is really cool.
And yeah, I looked at loads of
65
00:02:31,240 --> 00:02:33,240
stands and it was just,
everything has AI at the end of
66
00:02:33,240 --> 00:02:35,240
the name AI in the name
somewhere to be like, hey, look
67
00:02:35,280 --> 00:02:36,880
at us, we're using AI, come and
check us out.
68
00:02:36,880 --> 00:02:40,800
So that I think that's important
context in terms of clearly it's
69
00:02:40,800 --> 00:02:42,640
on everyone's minds.
So one of the one of the biggest
70
00:02:42,640 --> 00:02:45,640
things that happened was Bridget
Phillipson came out and did a
71
00:02:45,640 --> 00:02:47,280
talk and we were in the arena at
the time and.
72
00:02:47,400 --> 00:02:49,320
Education Secretary for anyone
listening who might not be in
73
00:02:49,320 --> 00:02:51,600
the UK.
Thank you for the clarification, and
74
00:02:51,600 --> 00:02:52,920
she.
Hated talking by the way.
75
00:02:52,920 --> 00:02:55,040
He's a host of Teach Sleep Repeat,
everyone listening outside of
76
00:02:55,040 --> 00:02:57,080
the normal listeners.
What we might do is every
77
00:02:57,080 --> 00:02:58,800
sentence, if you could.
A sentence is just a group of
78
00:02:58,800 --> 00:03:00,520
words put together to make a
phrase that makes sense on its
79
00:03:00,520 --> 00:03:01,560
own. If you
could just summarise what I'm
80
00:03:01,560 --> 00:03:02,600
saying each time, that'd be
great.
81
00:03:02,600 --> 00:03:04,960
Saying is when you're talking.
You've done it in a certain
82
00:03:04,960 --> 00:03:06,800
tense.
This is horrendous.
83
00:03:06,840 --> 00:03:08,640
I'd be impressed if you can go
on, right.
84
00:03:08,640 --> 00:03:10,920
So yeah, Bridget Phillipson came
out and did quite a long
85
00:03:10,920 --> 00:03:16,320
talk and a lot of it really was
centred around AI and how she
86
00:03:16,320 --> 00:03:17,360
thinks.
And the government, I guess
87
00:03:17,360 --> 00:03:20,760
think that AI can be a huge
player in solving a lot of the
88
00:03:20,760 --> 00:03:23,520
problems in education.
It's quite a wide open
89
00:03:23,520 --> 00:03:24,960
point.
And if you want to sort of nail
90
00:03:24,960 --> 00:03:27,880
down and start anywhere in this.
Yeah, it, it was good.
91
00:03:27,880 --> 00:03:30,720
Do you know what I'll say
genuinely, Bridget Phillipson, in
92
00:03:30,720 --> 00:03:35,640
my time as a teacher, I think
it's the most I've seen
93
00:03:35,760 --> 00:03:37,320
passion come through.
Yeah, for sure.
94
00:03:37,320 --> 00:03:42,200
And genuine desire for
implementing technology in a way
95
00:03:42,200 --> 00:03:44,760
that's going to help teachers
and just passion for education
96
00:03:44,760 --> 00:03:47,720
in general and specifically
around social mobility and SEND
97
00:03:48,160 --> 00:03:49,840
that comes through in everything
she says.
98
00:03:49,840 --> 00:03:52,920
So like I, I was very, I, I feel
very grateful.
99
00:03:52,960 --> 00:03:54,920
You know, I've seen the merry go
round of education secretaries
100
00:03:54,920 --> 00:03:56,480
over the whole time that I've
been a teacher.
101
00:03:57,000 --> 00:03:59,720
And I do think that her message
is on point and I do think she's
102
00:03:59,720 --> 00:04:01,560
saying the right thing.
So that's, that's a really good,
103
00:04:01,560 --> 00:04:02,680
solid start.
Genuinely.
104
00:04:03,320 --> 00:04:06,800
I, I just feel like sometimes
when she was talking about what
105
00:04:06,800 --> 00:04:09,080
they're doing and implementing,
she announced like 6 or 7
106
00:04:09,080 --> 00:04:10,560
different things.
Yeah, it turned into a bit of a
107
00:04:10,560 --> 00:04:12,400
party political broadcast
really, which, which I
108
00:04:12,400 --> 00:04:13,640
understand.
Obviously she's going to do
109
00:04:13,640 --> 00:04:14,680
that.
It's a good place to announce
110
00:04:14,680 --> 00:04:16,760
these things.
But it, it felt like, yeah,
111
00:04:16,760 --> 00:04:18,120
they're putting money in the
right place.
112
00:04:18,279 --> 00:04:20,680
It's not enough money.
I don't personally think, no.
113
00:04:20,920 --> 00:04:23,880
And the big push about how
technology can transform
114
00:04:23,880 --> 00:04:26,720
education and help teachers, she
was very, very, very clear it's
115
00:04:26,720 --> 00:04:28,760
not about replacing teachers.
We'll come on to that later
116
00:04:28,760 --> 00:04:31,920
because I'm not sure in general
what will happen, but she was
117
00:04:31,920 --> 00:04:34,800
very, very clear about how it's
helping teachers and students,
118
00:04:34,800 --> 00:04:37,000
right, and kids to learn
more.
119
00:04:37,480 --> 00:04:41,720
My worry with it is that the
actual physical hardware and
120
00:04:41,720 --> 00:04:45,680
technology, I've got an iPad here,
in classrooms does not remotely
121
00:04:45,680 --> 00:04:48,800
match what the vision is for
using it to help children and
122
00:04:48,800 --> 00:04:52,600
teachers alike.
For sure, that to me is the
123
00:04:52,600 --> 00:04:55,360
thing that screamed out, that I
just want to get on the
124
00:04:55,360 --> 00:04:57,640
roof. I want to, I want to
start a podcast, you know, where
125
00:04:57,640 --> 00:05:00,040
I've got people listening and
say, if only we had a
126
00:05:00,040 --> 00:05:02,720
platform to talk to the people and
just say, this is the issue.
127
00:05:03,000 --> 00:05:05,480
Do you think then, because I've
been really thinking about this
128
00:05:05,480 --> 00:05:09,240
since. At first, I was very
much on the side of one line of
129
00:05:09,240 --> 00:05:11,320
thought, which is schools don't
have enough money to get the
130
00:05:11,320 --> 00:05:13,400
tech, to get the iPads, to get
the computers, the laptops,
131
00:05:13,400 --> 00:05:16,440
whatever it is, to be able to
harness the full power of all of
132
00:05:16,440 --> 00:05:17,760
this tech, whether it's AI or
not.
133
00:05:18,560 --> 00:05:22,080
And that's where I started,
and I'll share my sort of
134
00:05:22,080 --> 00:05:23,560
thoughts now.
But is that what you think?
135
00:05:23,560 --> 00:05:25,120
Do you feel like the problem is
simply this?
136
00:05:25,600 --> 00:05:28,560
Schools aren't being given
enough money to invest in tech?
137
00:05:28,760 --> 00:05:30,200
That's it.
There's multiple things going
138
00:05:30,200 --> 00:05:32,400
on.
Lots of schools have enough
139
00:05:32,400 --> 00:05:34,680
tech, so where do they get the
money from?
140
00:05:34,680 --> 00:05:36,720
Yeah, right.
They got the money daily, right?
141
00:05:36,960 --> 00:05:38,960
I think lots of schools are
making decisions, maybe to spend
142
00:05:38,960 --> 00:05:41,600
money elsewhere.
I think that's also because of
143
00:05:41,600 --> 00:05:44,200
necessity half the time, right?
We've got a bunch of money.
144
00:05:44,200 --> 00:05:47,400
What are we going to spend it on?
OK, maybe cover, because we need
145
00:05:47,400 --> 00:05:48,880
to actually cover our teachers
properly.
146
00:05:49,000 --> 00:05:51,080
Or maybe I'm going to spend it
on playground equipment because
147
00:05:51,080 --> 00:05:53,160
PE is very important and
physical health is super
148
00:05:53,160 --> 00:05:55,080
important.
And maybe in my catchment area,
149
00:05:55,080 --> 00:05:56,800
actually we're really behind on
a certain thing.
150
00:05:56,920 --> 00:05:58,280
So I'm going to boost some money
into that.
151
00:05:58,560 --> 00:06:02,800
All these decisions are fine and
make sense, but unless you have
152
00:06:02,800 --> 00:06:06,720
a real clear tech pot that is
for tech, money's going to
153
00:06:06,720 --> 00:06:08,520
haemorrhage out of that.
So I don't think it's just
154
00:06:08,520 --> 00:06:10,200
simply saying the government
aren't giving enough money.
155
00:06:10,320 --> 00:06:15,840
I think it's a whole system of
we need to simply have a set of
156
00:06:15,840 --> 00:06:18,280
money and funding going into
schools to make sure tech is up
157
00:06:18,280 --> 00:06:21,320
to date and it's ring fenced and
it's extra, and it's
158
00:06:21,320 --> 00:06:23,600
important to have because it
feels like, again, it's just
159
00:06:23,600 --> 00:06:25,960
another thing that the
government can say is really
160
00:06:25,960 --> 00:06:27,720
important and will really
transform things.
161
00:06:27,720 --> 00:06:29,120
Can you please just make sure
you do it?
162
00:06:29,440 --> 00:06:32,280
It's a, oh, wait, I've got 30
Chromebooks between 120
163
00:06:32,280 --> 00:06:35,000
kids per year group, and four of
the Chromebooks don't work.
164
00:06:35,000 --> 00:06:36,600
At any one point.
It's like, OK, cool.
165
00:06:36,920 --> 00:06:40,360
Let's look at all of the really
good tech options for children
166
00:06:40,360 --> 00:06:43,720
that can help their learning.
We need a better ratio than
167
00:06:43,720 --> 00:06:45,120
that for it to actually be
effective.
168
00:06:45,200 --> 00:06:48,440
Yeah.
So we have to, we have to make
169
00:06:48,440 --> 00:06:51,640
sure that schools are getting
the technology they need because
170
00:06:51,640 --> 00:06:54,760
I can imagine a school in a
rural area looking at what the
171
00:06:54,760 --> 00:06:57,800
government are saying and
nothing will cut through because
172
00:06:57,800 --> 00:07:00,160
the first thing that school in
the rural area needs is the
173
00:07:00,160 --> 00:07:02,560
funding to have the technology
to be able to access it.
174
00:07:03,040 --> 00:07:04,960
That has to come first.
Yeah, I fully agree.
175
00:07:04,960 --> 00:07:09,720
There's a bit of a catch 22 as
well in terms of the tech when
176
00:07:09,720 --> 00:07:12,080
when schools don't have enough
tech, sorry, don't have enough
177
00:07:12,080 --> 00:07:14,640
devices to use the tech.
Sometimes that can actually make
178
00:07:14,720 --> 00:07:17,160
the tech they're trying to use
genuinely a nuisance.
179
00:07:17,160 --> 00:07:21,360
And then, you know, it's like,
unless you have the full set of
180
00:07:21,360 --> 00:07:24,440
equipment, the tech is actually
less effective.
181
00:07:24,600 --> 00:07:26,520
It's annoying because like you
said, like things are not
182
00:07:26,520 --> 00:07:28,640
working, or one class has got to
go on a rota now, and then
183
00:07:28,640 --> 00:07:31,080
it's just annoying to
use and that just isn't good
184
00:07:31,080 --> 00:07:32,880
because these are great tech
solutions that people are then
185
00:07:33,240 --> 00:07:34,480
going, Oh, this is rubbish.
Doesn't work.
186
00:07:34,480 --> 00:07:37,160
It's not that it doesn't work,
it's just that you don't have
187
00:07:37,160 --> 00:07:39,040
the right equipment to use it.
So that's one point.
188
00:07:39,120 --> 00:07:41,880
And then secondly, yeah,
basically I agree with what you
189
00:07:41,880 --> 00:07:43,920
said about the tech pot of
money.
190
00:07:44,280 --> 00:07:45,840
It has to, it has, it's the only
way.
191
00:07:46,040 --> 00:07:48,360
Because I look at when you look
at some schools where you say,
192
00:07:48,360 --> 00:07:50,480
well, they've managed to do the
tech, they've managed to get all
193
00:07:50,480 --> 00:07:52,080
of the equipment in their
school.
194
00:07:52,760 --> 00:07:54,440
So therefore clearly all schools
can do it.
195
00:07:54,440 --> 00:07:56,560
I hate that generalisation
because I'm like, we don't know.
196
00:07:56,560 --> 00:07:58,440
You don't know what that school
prioritised.
197
00:07:58,440 --> 00:08:01,360
Yeah, they might have cut back
on other things that when you
198
00:08:01,360 --> 00:08:03,040
find out about what they cut
back on, you think, oh, you
199
00:08:03,040 --> 00:08:04,960
can't cut that.
That's a significant thing.
200
00:08:04,960 --> 00:08:06,720
You're just looking at the fact
that they have the equipment.
201
00:08:07,040 --> 00:08:09,880
So it has to be regardless of
what schools already have or
202
00:08:09,880 --> 00:08:12,480
don't have right now.
I feel like there has to be a
203
00:08:12,480 --> 00:08:15,280
top down approach of we are
going to equip all of our
204
00:08:15,280 --> 00:08:18,480
schools with enough equipment so
that all of this stuff we're
205
00:08:18,480 --> 00:08:21,040
saying about tech can actually
be implemented properly because
206
00:08:21,040 --> 00:08:23,640
one without the other doesn't
really make any sense.
207
00:08:23,640 --> 00:08:25,520
And that's when it comes back
around to, and we've mentioned
208
00:08:25,520 --> 00:08:27,320
this before in the past, lip
service.
209
00:08:27,480 --> 00:08:31,720
When people, governments,
whatever policy makers say stuff
210
00:08:31,880 --> 00:08:34,440
and it sounds great, but
there's a glaringly obvious
211
00:08:34,440 --> 00:08:36,400
problem that could be solved
that isn't solved.
212
00:08:36,400 --> 00:08:38,200
That's when I go, oh, OK, this,
this is meaningless.
213
00:08:38,200 --> 00:08:40,280
And this is just lip service.
You're, you're not actually,
214
00:08:40,400 --> 00:08:42,200
you're not actually trying to
solve this problem because, you
215
00:08:42,200 --> 00:08:46,280
know, you know, these schools
have no equipment and you're
216
00:08:46,280 --> 00:08:49,280
saying tech is the solution and
then turning a blind eye and walking away.
217
00:08:49,320 --> 00:08:51,280
Yeah, that's just not good to
me.
218
00:08:51,360 --> 00:08:53,680
We can solve teacher burnout
because it's going to do lots of
219
00:08:53,680 --> 00:08:55,240
marking for you.
Isn't that fantastic?
220
00:08:55,760 --> 00:08:57,440
Oh, really?
Wait, can we have the thing to
221
00:08:57,440 --> 00:08:59,080
do it on?
No, no, no.
222
00:08:59,160 --> 00:09:00,480
OK, so it's not.
Get it from your existing
223
00:09:00,480 --> 00:09:02,040
budgets.
We don't have it, our existing
224
00:09:02,040 --> 00:09:03,200
budgets are already completely
stretched.
225
00:09:03,200 --> 00:09:06,000
And if we would have to
fire a TA to do this and we don't
226
00:09:06,000 --> 00:09:08,120
really want to do that, there's an
ethical consideration. The
227
00:09:08,120 --> 00:09:10,560
other school did it.
Yeah, OK, that's up to them.
228
00:09:10,560 --> 00:09:12,040
Like right.
Can we all just have the same
229
00:09:12,040 --> 00:09:14,960
pot for this thing?
Can you just say here how much
230
00:09:14,960 --> 00:09:15,640
you need?
Here you go.
231
00:09:15,640 --> 00:09:17,680
Here's an iPad for every kid.
That is such a good point you
232
00:09:17,680 --> 00:09:19,480
made about how when you're
discussing something in
233
00:09:19,480 --> 00:09:21,640
particular, it's easy to make
comparisons between schools.
234
00:09:21,800 --> 00:09:23,440
But schools are so much more
complex than that.
235
00:09:23,440 --> 00:09:27,920
Like you said that school with
one iPad per kid very much may
236
00:09:27,920 --> 00:09:30,800
well not have any support staff.
They they may have made that
237
00:09:30,800 --> 00:09:31,520
decision.
Right.
238
00:09:31,640 --> 00:09:33,440
And I'm not saying that's the
right way to go either, because
239
00:09:33,440 --> 00:09:35,840
I think people can sometimes
think, oh, well, you're saying
240
00:09:35,840 --> 00:09:37,680
you want tech, more tech in the
classroom.
241
00:09:37,680 --> 00:09:39,720
That means it's at this expense,
that's bad, you're terrible.
242
00:09:39,920 --> 00:09:42,040
No, it's not what I'm saying.
I'm just saying that as a
243
00:09:42,040 --> 00:09:44,240
baseline, we need to have that.
There's there's something else
244
00:09:44,240 --> 00:09:47,520
that really jumped out at me
that I find fascinating is the
245
00:09:47,520 --> 00:09:51,280
political problem.
Like I genuinely understand how
246
00:09:51,280 --> 00:09:54,080
it's easy for us to sit here and
say, well, why don't you just do
247
00:09:54,080 --> 00:09:54,880
that?
Isn't that easy?
248
00:09:54,880 --> 00:09:56,800
And to have a go at politicians:
why don't you just give everyone
249
00:09:56,800 --> 00:09:59,680
an iPad, etcetera.
They must be constantly walking
250
00:09:59,680 --> 00:10:03,600
a tightrope of appeasing people.
At the end of the day, they're
251
00:10:03,600 --> 00:10:06,040
there to serve the people.
And there is a significant
252
00:10:06,040 --> 00:10:10,520
portion of society and it's
growing who look at tech and say
253
00:10:10,520 --> 00:10:13,200
tech in schools and screens in
schools equals bad.
254
00:10:13,640 --> 00:10:16,360
So in the one breath you've got
the government saying how
255
00:10:16,360 --> 00:10:18,160
important tech is for the
classroom.
256
00:10:18,640 --> 00:10:20,560
I agree, by the way, because
they're going to enter the real
257
00:10:20,560 --> 00:10:23,760
world, which is so tech heavy.
We have to make sure that
258
00:10:23,760 --> 00:10:26,440
they're learning how to actually
use these devices properly and
259
00:10:26,440 --> 00:10:28,200
also just like get the most out
of them, right?
260
00:10:28,560 --> 00:10:31,720
But then at the same time, we're
explaining how it's so important
261
00:10:31,720 --> 00:10:32,960
to do that.
Tech's really important.
262
00:10:32,960 --> 00:10:34,640
It's going to save teachers time
in the classroom.
263
00:10:34,640 --> 00:10:36,880
Kids need to use it and and use
it well.
264
00:10:37,480 --> 00:10:39,920
At the same time, you've got
people saying no screens in the
265
00:10:39,920 --> 00:10:42,400
classroom, no technology; if my
kid has any technology in the
266
00:10:42,400 --> 00:10:44,000
classroom, I'll be taking them
out of that school.
267
00:10:44,000 --> 00:10:46,760
Then that's awful for them and
you've got the government
268
00:10:46,760 --> 00:10:49,040
releasing advice on screen
time, etcetera.
269
00:10:49,040 --> 00:10:50,960
In the early years, they're
going to be releasing, as
270
00:10:50,960 --> 00:10:52,880
Bridget was saying, they're
going to extend that and give
271
00:10:53,120 --> 00:10:55,600
advice for infants, give advice
for primary, give advice for
272
00:10:55,600 --> 00:10:58,360
secondary, so that that's going
to be given, right?
273
00:10:58,680 --> 00:11:00,720
So it'd be really, really
interesting to see how this
274
00:11:00,720 --> 00:11:04,040
tightrope is kind of walked
because my genuine opinion is we
275
00:11:04,040 --> 00:11:07,040
need way more tech in schools.
But I do think there'll be a big
276
00:11:07,040 --> 00:11:09,960
push back from a big,
big section of society.
277
00:11:10,040 --> 00:11:11,000
Like how do you think about
that?
278
00:11:11,160 --> 00:11:13,840
I do wonder if that guidance
they're going to bring out, you
279
00:11:13,840 --> 00:11:15,720
know, the, the screen time
guidance, which is starting with
280
00:11:15,720 --> 00:11:17,440
sort of early years and then
they're going to, she said.
281
00:11:17,440 --> 00:11:18,280
They're going to work their way
out.
282
00:11:19,240 --> 00:11:21,800
It could be a really big force
for good, you know, yeah, I
283
00:11:21,800 --> 00:11:24,280
think it could be the
government's way of very nicely
284
00:11:24,280 --> 00:11:27,640
saying to parents and carers,
whoever, who are
285
00:11:27,640 --> 00:11:29,840
sort of anti screen time
completely and can't see the
286
00:11:29,840 --> 00:11:33,160
nuance between YouTube algorithm
brain rot and actual educational
287
00:11:33,160 --> 00:11:34,600
software that's really helpful
like MathsZoo.
288
00:11:34,800 --> 00:11:36,640
Like MathsZoo, you know, can't see
the difference just.
289
00:11:36,640 --> 00:11:38,760
Randomly pick one out there.
Yeah, he just thought of one off the
290
00:11:38,760 --> 00:11:39,240
top of
your head.
291
00:11:39,240 --> 00:11:41,360
Yeah, massive.
Yeah, exactly.
292
00:11:41,360 --> 00:11:44,000
Demo down below.
But the way you're clever,
293
00:11:44,000 --> 00:11:46,400
you're not into this.
So but what, what that guidance
294
00:11:46,480 --> 00:11:49,480
might end up doing is actually
saying, Oh yeah, a little nod to
295
00:11:49,480 --> 00:11:51,280
it, yeah, too much screen time
is bad.
296
00:11:51,720 --> 00:11:55,600
But and then really from a top
down policy makers perspective,
297
00:11:55,800 --> 00:11:59,080
just say, but you do just need
to stop putting it all in the
298
00:11:59,080 --> 00:12:00,960
same category though, because
this is actually good.
299
00:12:01,200 --> 00:12:03,160
Government is saying this is OK,
this is fine.
300
00:12:03,160 --> 00:12:04,440
Can we please stop panicking
about it?
301
00:12:04,760 --> 00:12:07,160
And yeah, this stuff that we all
think is bad is, funnily
302
00:12:07,160 --> 00:12:10,880
enough, bad for kids.
Strangely enough, yes, it's
303
00:12:10,880 --> 00:12:13,240
unsupervised, isn't it?
At the hands of the tech giants,
304
00:12:13,240 --> 00:12:14,640
yeah.
Your attention because I think
305
00:12:14,640 --> 00:12:17,240
people worried that this
guidance is going to come out
306
00:12:17,240 --> 00:12:19,560
and just be like pandering to
parents and carers who think
307
00:12:19,560 --> 00:12:21,800
this and say Oh yeah, screen
time's terrible.
308
00:12:21,800 --> 00:12:23,800
We're going to start banning
screens in schools now.
309
00:12:23,800 --> 00:12:26,400
I think people are worrying it's
going to be that. Like, I'd be
310
00:12:26,400 --> 00:12:27,760
very surprised
if that's
311
00:12:27,760 --> 00:12:29,280
what it was.
And also this is the thing,
312
00:12:29,280 --> 00:12:30,920
there's no nuances.
This is what I mean about.
313
00:12:30,920 --> 00:12:32,760
It's a tightrope because I get
what you're saying, right?
314
00:12:33,000 --> 00:12:35,320
It could be, it could be a
blessing in disguise where it
315
00:12:35,320 --> 00:12:37,240
brings everyone together like
it's lovely and roses.
316
00:12:37,400 --> 00:12:39,320
What will what will actually
happen is both sides will think
317
00:12:39,320 --> 00:12:41,560
it doesn't go far enough for
their side and both sides will
318
00:12:41,560 --> 00:12:43,240
hate it.
It's like the guidance that was
319
00:12:43,240 --> 00:12:45,160
released about earlier screen
time, right?
320
00:12:45,520 --> 00:12:49,800
You had some people who said it,
it should completely say no
321
00:12:49,800 --> 00:12:53,160
screen time ever for children.
That's ridiculous, isn't it? Awful.
322
00:12:53,320 --> 00:12:55,800
And it said actually it compared
between 5 hours a day and 45
323
00:12:55,800 --> 00:12:58,360
minutes a day, even 45 minutes a
day is awful and they weren't
324
00:12:58,360 --> 00:13:00,320
happy right.
Then the other side, you had
325
00:13:00,320 --> 00:13:03,320
parents of children with special
educational needs, who
326
00:13:03,320 --> 00:13:07,360
say that the screen time is a
real device they can use to help
327
00:13:07,360 --> 00:13:08,920
their children actually get
through the day.
328
00:13:09,360 --> 00:13:10,920
And they were annoyed because
they were saying you're not
329
00:13:10,920 --> 00:13:12,440
thinking about SEND children in
this.
330
00:13:12,720 --> 00:13:14,760
And I just look at them,
everyone in this situation, I
331
00:13:14,760 --> 00:13:16,600
just go it's a guidance
document.
332
00:13:17,120 --> 00:13:20,480
Do you know what guidance is?
Guidance isn't for every single
333
00:13:20,480 --> 00:13:22,400
person in the world to follow
this all of the time.
334
00:13:22,560 --> 00:13:25,800
This is just some advice.
What do you want the government
335
00:13:25,800 --> 00:13:28,160
to do in this situation?
Because if they wholly side with
336
00:13:28,160 --> 00:13:30,120
you, they're going to alienate
99% of people.
337
00:13:30,360 --> 00:13:33,200
This isn't for an individual,
this is guidance.
338
00:13:33,520 --> 00:13:35,720
It's best practice for most
children.
339
00:13:35,960 --> 00:13:37,320
It's not hitting everything at
once.
340
00:13:37,600 --> 00:13:41,800
And I just worry that as we
extend this up, anyone who isn't
341
00:13:41,800 --> 00:13:45,120
on the exact one percentile of
what the guidance says will
342
00:13:45,120 --> 00:13:46,360
think it's too far the other
way.
343
00:13:46,400 --> 00:13:49,320
Or not quite right, and it will
end up making more people angry
344
00:13:49,480 --> 00:13:51,920
than actually making them feel
better about it.
345
00:13:51,920 --> 00:13:53,840
And I just think that's what I
mean when I say it's a
346
00:13:53,840 --> 00:13:55,840
tightrope.
And I think what the government
347
00:13:55,840 --> 00:13:59,040
has to do and what I would love
to see more of from politicians
348
00:13:59,040 --> 00:14:02,920
in general is have an ideology
that's backed up in evidence and
349
00:14:02,920 --> 00:14:05,080
research and mean something and
go with it.
350
00:14:05,480 --> 00:14:08,200
And then spend their time trying
to argue and convince people why
351
00:14:08,200 --> 00:14:11,080
that's the right thing to do
rather than trying to pander to
352
00:14:11,080 --> 00:14:12,360
everyone and make everyone feel
happy.
353
00:14:12,520 --> 00:14:15,120
You will never succeed at that.
So why don't you just go for
354
00:14:15,120 --> 00:14:17,200
something you actually believe
in and try and convince people
355
00:14:17,200 --> 00:14:19,360
of it?
You got voted in for a reason.
356
00:14:19,560 --> 00:14:22,400
Just go forward and do it and
explain why it's a force for
357
00:14:22,400 --> 00:14:23,440
good.
And you know what?
358
00:14:23,440 --> 00:14:25,360
If it doesn't work in the long
run, you'll be out of office
359
00:14:25,360 --> 00:14:27,000
next time.
Yeah, definitely, definitely.
360
00:14:27,200 --> 00:14:28,960
I like your point about
guidance.
361
00:14:28,960 --> 00:14:30,520
Can't please everyone.
It's true because it has to be
362
00:14:30,520 --> 00:14:32,400
for the average person, like,
like there's always.
363
00:14:32,440 --> 00:14:34,160
Even that sounds bad.
You mean although people listen
364
00:14:34,160 --> 00:14:36,000
to you, right?
Very quickly because you didn't
365
00:14:36,000 --> 00:14:38,240
mean it in a bad way.
But even you saying the average
366
00:14:38,240 --> 00:14:40,160
person made me think of, oh, am
I not an average person?
367
00:14:40,720 --> 00:14:41,640
Oh, my God.
Yeah.
368
00:14:41,640 --> 00:14:43,080
No, I mean that, you know,
automatically.
369
00:14:43,080 --> 00:14:44,960
Yeah, No, I know what you mean.
But there'll be people listening
370
00:14:44,960 --> 00:14:46,840
thinking, oh, is my child not an
average child, just because.
371
00:14:46,840 --> 00:14:47,960
They've got this.
Yeah, it's not.
372
00:14:48,240 --> 00:14:51,000
What, you mean it's just, yeah,
I don't know how to explain it a
373
00:14:51,000 --> 00:14:52,640
different way.
The graph I'm thinking of in my
374
00:14:52,640 --> 00:14:55,320
head in terms of like, everyone
is in this graph somewhere and
375
00:14:55,320 --> 00:14:57,360
there's always outliers in every
graph ever.
376
00:14:57,360 --> 00:14:58,080
Yeah.
Ever.
377
00:14:58,200 --> 00:14:59,720
Yeah.
Like when you've got X&Y this
378
00:14:59,720 --> 00:15:02,120
way, there is going to be
someone, yeah, who's right up
379
00:15:02,120 --> 00:15:03,000
here.
And there's gonna be someone
380
00:15:03,000 --> 00:15:05,080
who's right down here.
And they are outliers to where
381
00:15:05,080 --> 00:15:08,040
the general bulk of people are.
And guidance generally comes in
382
00:15:08,040 --> 00:15:09,880
to say, OK, we've got the big
bulk of people here.
383
00:15:09,880 --> 00:15:12,320
Yeah, this is generally
guidance, but we're aware that
384
00:15:12,320 --> 00:15:14,480
this might not be suitable for
every single individual.
385
00:15:14,480 --> 00:15:16,800
And the fact that people don't
understand that I'm like, I
386
00:15:16,800 --> 00:15:18,920
don't, yeah, you can't win.
There is no win because you
387
00:15:18,920 --> 00:15:20,000
can't please every single
person.
388
00:15:20,360 --> 00:15:21,760
So let's talk. That was
Bridget Phillipson.
389
00:15:21,760 --> 00:15:23,960
So she did, yeah, a
really interesting chat, very AI
390
00:15:23,960 --> 00:15:25,600
focused.
Very tech focused, very
391
00:15:25,600 --> 00:15:27,000
interesting in your opinion on
the whole.
392
00:15:27,000 --> 00:15:30,920
Then you finish that chat, you
know the government's stance and
393
00:15:30,920 --> 00:15:32,920
stuff.
How would you rate it?
394
00:15:32,920 --> 00:15:34,800
What do you feel about is it?
Is it a solid position?
395
00:15:34,800 --> 00:15:36,040
Do you agree with it in general?
What?
396
00:15:36,040 --> 00:15:37,880
What's your feeling?
Overall, quite positive.
397
00:15:37,880 --> 00:15:39,400
To be honest with you.
I, I didn't really feel
398
00:15:39,400 --> 00:15:41,040
negative.
I, I, I felt like all the things
399
00:15:41,040 --> 00:15:44,520
she was saying, it was a fairly
strong position. I think she
400
00:15:44,520 --> 00:15:47,440
could have pandered more to Oh
yeah, no, tech is bad in schools
401
00:15:47,440 --> 00:15:49,080
because of screen time.
She didn't do that at all.
402
00:15:49,080 --> 00:15:51,520
I felt like she was very much
saying, no, actually tech is
403
00:15:51,520 --> 00:15:53,520
really good and we're actually
we're going to push more.
404
00:15:53,520 --> 00:15:55,480
I think you know, there was and
you find the right thing
405
00:15:55,520 --> 00:15:56,520
allocated.
Yeah, absolutely.
406
00:15:56,520 --> 00:15:58,200
I think it is the right thing
because like you said, we live
407
00:15:58,200 --> 00:16:00,000
in a tech heavy world.
It's only going to get more
408
00:16:00,000 --> 00:16:01,600
techie.
I don't know what world people
409
00:16:01,600 --> 00:16:03,240
are living in where they think
that we're suddenly going to
410
00:16:03,240 --> 00:16:05,840
regress back in terms of tech
and how much is in our lives.
411
00:16:05,840 --> 00:16:07,560
It's not happening.
It's just not happening.
412
00:16:08,000 --> 00:16:12,160
And we can use it for so much
good, so much good, not just in
413
00:16:12,160 --> 00:16:13,840
education across, you know, the
world.
414
00:16:13,880 --> 00:16:16,960
You see, you see loads of good
uses of AI and tech in, in the
415
00:16:16,960 --> 00:16:20,880
world of medicine and, you know,
and in education, obviously it
416
00:16:20,880 --> 00:16:23,760
can be used for good as well.
Like, I don't know, I don't know
417
00:16:23,760 --> 00:16:25,360
how it's so black and white to
me.
418
00:16:25,600 --> 00:16:28,160
So of course it was a good move.
There's so much in my life as an
419
00:16:28,160 --> 00:16:32,160
adult that tech has made better
that I would hate it to have a
420
00:16:32,160 --> 00:16:34,640
lobby group in that certain area
go no, no, it's too much.
421
00:16:34,800 --> 00:16:35,600
Yeah, yeah, Yeah.
Well.
422
00:16:35,840 --> 00:16:38,400
Oh, actually, that's made
my screening more accurate
423
00:16:38,400 --> 00:16:39,920
now for my health, actually.
Exactly.
424
00:16:39,920 --> 00:16:42,360
So that sounds quite good.
Yeah, I get it.
425
00:16:42,360 --> 00:16:43,680
With children, we want to
protect children.
426
00:16:43,800 --> 00:16:45,320
That's that's our number one aim
as educators.
427
00:16:45,320 --> 00:16:47,840
We always say if you were to ask
me my role as a teacher, my
428
00:16:47,840 --> 00:16:49,440
number one thing was always
safety of children.
429
00:16:49,440 --> 00:16:52,240
So I, I please don't think I'm
not taking that seriously, but I
430
00:16:52,240 --> 00:16:56,240
just think that you, you, that's
why I think you.
431
00:16:56,240 --> 00:16:58,560
Can do it, Yeah, yeah.
That's why I think it's like,
432
00:16:58,560 --> 00:17:01,080
without sounding common control
tech, it's not actually that
433
00:17:01,080 --> 00:17:05,480
hard to ensure the tech is safe.
Like in in all seriousness, is
434
00:17:05,480 --> 00:17:08,720
doing 10 minutes of your
practice on a maths app in
435
00:17:08,720 --> 00:17:11,359
school with your teacher, like,
actually watching over you and
436
00:17:11,359 --> 00:17:13,319
monitoring it and setting it for
you and then when you finish
437
00:17:13,319 --> 00:17:15,599
they give you some intervention.
Are you are you?
438
00:17:15,680 --> 00:17:17,599
Do you?
Are you really going to try and
439
00:17:17,599 --> 00:17:18,560
convince me
that that's bad?
440
00:17:18,560 --> 00:17:21,920
Yeah, yeah, genuinely.
Or that it's no better
441
00:17:21,920 --> 00:17:25,480
than having a sheet in front
of them like and it's so easy to
442
00:17:25,480 --> 00:17:27,960
say. Number one, the two
things to me now with everything in
443
00:17:27,960 --> 00:17:29,720
tech, especially now we're
building a maths platform, right.
444
00:17:29,920 --> 00:17:31,560
Is does it save the teacher
time?
445
00:17:31,760 --> 00:17:33,520
That's, like, a separate
thing that's just great.
446
00:17:33,520 --> 00:17:34,920
Anyway, does it save the teacher
time?
447
00:17:34,920 --> 00:17:35,680
Yes.
Cool.
448
00:17:35,680 --> 00:17:37,960
They get instant live gap
analysis, for example, with like
449
00:17:37,960 --> 00:17:40,040
with MathsZoo or instant generation
of multiple questions.
450
00:17:40,040 --> 00:17:41,640
I don't know.
Or the ability to see an
451
00:17:41,640 --> 00:17:43,240
intervention group.
I'm not trying, I promise, to
452
00:17:43,240 --> 00:17:44,440
turn this into a sales pitch,
but it is.
453
00:17:44,520 --> 00:17:46,520
It is.
Doing all the things that the
454
00:17:46,520 --> 00:17:49,600
tech should be good for.
And then what, you're I'm going
455
00:17:49,600 --> 00:17:51,680
to tee you up here because a
lot of people say that's great,
456
00:17:51,680 --> 00:17:52,800
but it's at the expense of the
kids.
457
00:17:52,960 --> 00:17:55,720
Yeah, and it's not because if it
can save the teachers time and
458
00:17:55,720 --> 00:17:58,080
it's also literally better for
the kids because it's more, it's
459
00:17:58,080 --> 00:17:59,400
more personalised, it's more
adaptive.
460
00:17:59,400 --> 00:18:03,120
It can do things that one person
with pen and paper just
461
00:18:03,120 --> 00:18:04,920
can't do quick enough.
You might be able to do it if
462
00:18:04,920 --> 00:18:06,600
you have three hours, but you
don't have three hours.
463
00:18:06,600 --> 00:18:09,120
You're in the middle of a
lesson, like if it's better for
464
00:18:09,120 --> 00:18:13,400
the children and it's better and
faster and gives deeper
465
00:18:13,400 --> 00:18:15,400
analysis, stuff that you could
never get as a teacher and saves
466
00:18:15,400 --> 00:18:17,680
you time.
It's a no brainer.
467
00:18:17,880 --> 00:18:20,680
So the only barrier at that
point is the kids haven't got
468
00:18:20,680 --> 00:18:23,600
the equipment.
So that's why full circle
469
00:18:23,600 --> 00:18:25,480
finishing this conversation
about Bridget Phillipson
470
00:18:25,480 --> 00:18:28,920
and her chat.
That's why to me it's, it's been
471
00:18:28,920 --> 00:18:30,960
out like this stuff's been
around for like 20 years.
472
00:18:30,960 --> 00:18:32,640
Do you know what I mean?
Like iPads, It's not like it's
473
00:18:32,640 --> 00:18:33,960
brand new and it's like, oh,
we're working on it.
474
00:18:33,960 --> 00:18:35,640
We're going to start getting
that funding together for this
475
00:18:35,640 --> 00:18:37,560
new tech, for this new equipment
in schools.
476
00:18:38,200 --> 00:18:40,280
No, no, no, no, no, no, no.
Computers have been around for
477
00:18:40,280 --> 00:18:43,480
like 30 years in schools. iPads
have honestly been around for at
478
00:18:43,480 --> 00:18:45,520
least 15 years in schools.
Like what are we talking about?
479
00:18:45,520 --> 00:18:47,720
Come on.
It's it's been so long now that
480
00:18:48,040 --> 00:18:50,200
there isn't really much excuse.
It's just OK, let's just let's
481
00:18:50,200 --> 00:18:52,200
just get it done.
Yeah, get it done, get it done.
482
00:18:52,480 --> 00:18:54,640
And then from there the kids
will benefit.
483
00:18:54,640 --> 00:18:55,760
Absolutely.
I completely agree.
484
00:18:55,760 --> 00:18:58,080
I'm I'm very excited by it.
I just think that has to come
485
00:18:58,080 --> 00:19:00,560
first.
It just has to second thing then
486
00:19:00,560 --> 00:19:02,360
before we go on to our main bulk.
So what we'll do
487
00:19:02,360 --> 00:19:06,040
later is we'll talk from a more,
I suppose, practical point of
488
00:19:06,040 --> 00:19:09,720
view from teaching how useful AI
is for different jobs
489
00:19:09,720 --> 00:19:11,920
around the classroom.
What, what, whether it's a good
490
00:19:11,920 --> 00:19:13,360
thing or a bad thing, we'll come on to
that.
491
00:19:13,360 --> 00:19:16,480
But one thing I wanted to talk
about very quickly was what I
492
00:19:16,480 --> 00:19:19,880
found one of the most
captivating talks I've actually
493
00:19:19,880 --> 00:19:22,640
listened to in a long time where
I was really like really on the
494
00:19:22,640 --> 00:19:24,440
edge of my seat listening to
every word they were saying.
495
00:19:24,960 --> 00:19:28,640
And it was Amol Rajan, who
hosts University Challenge.
496
00:19:28,640 --> 00:19:32,000
Yes, an amazing journalist, does
BBC as well.
497
00:19:32,000 --> 00:19:35,360
And he's got a podcast at the minute
called Radical where he's going
498
00:19:35,360 --> 00:19:37,960
in depth about people who like
being radical in certain things.
499
00:19:37,960 --> 00:19:39,560
There was one about foster carers
recently.
500
00:19:39,880 --> 00:19:41,920
He's got Jonathan Haidt coming
on who did The Anxious
501
00:19:41,920 --> 00:19:44,120
Generation, talking about
smartphones, etcetera.
502
00:19:44,120 --> 00:19:47,480
So I'm really interested and you
can just tell Amol really goes
503
00:19:47,480 --> 00:19:49,760
into depth about the things he's
talking to someone about.
504
00:19:50,200 --> 00:19:52,600
And he's such a powerful
interviewer because he knows it
505
00:19:52,600 --> 00:19:55,560
really well, but understands
that there's an expert over the
506
00:19:55,560 --> 00:19:56,760
way.
And he was talking to Hannah
507
00:19:56,760 --> 00:19:58,840
Fry.
Everyone knows Hannah Fry, Fry
508
00:19:58,840 --> 00:20:00,400
squared.
She's you're a big fan.
509
00:20:00,560 --> 00:20:02,400
Huge.
fan. Just going back for a second
510
00:20:02,400 --> 00:20:04,360
just gassing up Hannah Fry.
She's awesome.
511
00:20:04,760 --> 00:20:07,200
I remember probably around the
same same time we found it,
512
00:20:07,200 --> 00:20:08,600
there was a YouTube channel
called Numberphile.
513
00:20:08,680 --> 00:20:10,720
Still is Numberphile if you're
into maths.
514
00:20:11,440 --> 00:20:12,920
Which we are.
Did you and you just?
515
00:20:13,160 --> 00:20:14,840
You just like nerding out about
maths things.
516
00:20:14,840 --> 00:20:17,240
It's just cool math stuff.
Numberphile is a great channel
517
00:20:17,240 --> 00:20:19,560
and Hannah Fry, I think she
still might do, to be fair, but
518
00:20:19,560 --> 00:20:22,560
there was certainly a period
of time, like 7, 8, 9, 10 years ago,
519
00:20:22,720 --> 00:20:24,960
where she did loads of videos
with Numberphile.
520
00:20:24,960 --> 00:20:27,320
And that is originally probably
about 10 years ago how I got
521
00:20:27,320 --> 00:20:29,160
introduced to her and I remember
watching the video thinking
522
00:20:29,440 --> 00:20:31,240
she's awesome.
I want to watch her explain more
523
00:20:31,240 --> 00:20:33,480
stuff.
Also just fun fact, she does not
524
00:20:33,480 --> 00:20:34,680
age.
She does.
525
00:20:34,760 --> 00:20:36,720
She does not age.
I saw her on stage with
526
00:20:36,720 --> 00:20:40,640
Amol doing this talk and I was
like, you look exactly, exactly
527
00:20:40,640 --> 00:20:42,600
the same. Paul
Rudd and Hannah Fry having the
528
00:20:42,600 --> 00:20:43,920
same diet in your samples?
Go.
529
00:20:44,080 --> 00:20:46,680
Seriously, go pull up
a thumbnail from 10 years ago on
530
00:20:46,680 --> 00:20:48,320
YouTube.
She has not aged at all, it's
531
00:20:48,320 --> 00:20:50,600
mental.
Whereas you do, I don't.
532
00:20:50,600 --> 00:20:53,040
Actually I don't I will look
like this hopefully until I'm
533
00:20:53,040 --> 00:20:55,160
70.
But the, the, the chat they had
534
00:20:55,160 --> 00:20:56,800
was really interesting.
It was all about AI, it was all
535
00:20:56,800 --> 00:20:59,880
about artificial intelligence.
And it kind of it was, it was
536
00:20:59,920 --> 00:21:02,360
like embedded in education, but
talked about like wider society
537
00:21:02,360 --> 00:21:04,840
as well.
And something I found so
538
00:21:04,840 --> 00:21:07,960
interesting and it really got me
thinking was they were talking
539
00:21:07,960 --> 00:21:10,760
about how AI will transform
education, right?
540
00:21:11,000 --> 00:21:13,560
So whether we like it or not, AI
is here to stay, what's it going
541
00:21:13,560 --> 00:21:16,640
to do?
And they spoke about how
542
00:21:16,640 --> 00:21:19,920
currently AI, for example,
there's a lot of worry in in
543
00:21:19,920 --> 00:21:21,320
university for students,
etcetera.
544
00:21:21,320 --> 00:21:24,880
Sixth form, how AI is writing
essays for the children.
545
00:21:25,320 --> 00:21:26,840
OK.
And it's like, cool, I've got a
546
00:21:26,840 --> 00:21:29,560
shortcut now I can just write an
essay, get it done, get it
547
00:21:29,560 --> 00:21:31,280
graded.
There's my B, Boom, I've got it.
548
00:21:31,840 --> 00:21:34,360
And it really highlighted this,
this issue and this thought
549
00:21:34,360 --> 00:21:36,520
about how what is the point of
learning?
550
00:21:36,880 --> 00:21:40,360
And AI has really put a
microscope on it because for
551
00:21:40,360 --> 00:21:43,560
those children, the point of
learning was to get to the grade
552
00:21:43,560 --> 00:21:46,000
at the end.
And that is something I think is
553
00:21:46,000 --> 00:21:49,240
so problematic in education all
the way through that What's the
554
00:21:49,240 --> 00:21:51,000
point of learning to pass the
test?
555
00:21:51,160 --> 00:21:53,680
What's the point in Year 6 to make
sure the SATs results are good at
556
00:21:53,680 --> 00:21:55,080
the end?
What's the point in secondary
557
00:21:55,080 --> 00:21:56,960
school to get your GCSE for your
qualifications?
558
00:21:57,200 --> 00:22:01,040
I understand in isolation why
those things are important, but
559
00:22:01,040 --> 00:22:04,040
it became the whole reason for
education for a lot of teachers.
560
00:22:04,040 --> 00:22:05,160
I've got to get them through
this test.
561
00:22:05,160 --> 00:22:07,360
I'm just doing it for the test.
Doing it for the test.
562
00:22:07,720 --> 00:22:11,400
AI comes along and makes the end
product something you can just
563
00:22:11,400 --> 00:22:14,880
do in five seconds.
It solves the problem if you
564
00:22:14,880 --> 00:22:16,720
think about it, AI solving that
problem.
565
00:22:16,720 --> 00:22:18,520
I've written an essay for you.
I've saved you 6 hours.
566
00:22:18,520 --> 00:22:20,480
Isn't that what everyone says AI is for?
Isn't that fantastic?
567
00:22:20,920 --> 00:22:24,960
And for me, what it's
highlighted is how education
568
00:22:24,960 --> 00:22:26,640
needs a revamp.
Whether we've got AI or not.
569
00:22:27,080 --> 00:22:30,960
Education needs to be recentred
so that the act of learning is the
570
00:22:30,960 --> 00:22:34,680
point of learning.
Why else do we learn if not for
571
00:22:34,680 --> 00:22:38,680
the journey and the struggle and
trying something and getting it
572
00:22:38,680 --> 00:22:41,080
wrong and finding something out
and seeing that it works and
573
00:22:41,080 --> 00:22:44,280
spotting patterns?
The act of doing that, much like
574
00:22:44,280 --> 00:22:47,320
problem solving in maths, the
act of doing something you don't
575
00:22:47,320 --> 00:22:49,240
know how to get to the end, but
you're going to try and get
576
00:22:49,240 --> 00:22:51,600
there anyway based on what
you've learned with someone, a
577
00:22:51,600 --> 00:22:54,280
teacher normally alongside you,
prompting you in the right
578
00:22:54,280 --> 00:22:56,880
direction.
That's the point of learning.
579
00:22:57,240 --> 00:23:00,640
And I found it fascinating
how they spoke about how AI is
580
00:23:00,640 --> 00:23:04,320
taking that away and it almost
doesn't matter anymore.
581
00:23:04,320 --> 00:23:07,840
And I'm interested to your
points about, we talk about AI
582
00:23:07,840 --> 00:23:09,880
and tech and how it's so
important that it should be
583
00:23:09,880 --> 00:23:11,320
implemented in education
correctly.
584
00:23:11,880 --> 00:23:14,520
Do you think that's a problem
and an issue then that it could
585
00:23:14,520 --> 00:23:16,920
end up being that
we're not really focusing on
586
00:23:16,920 --> 00:23:18,760
what's important.
Oh, big time, big time.
587
00:23:18,760 --> 00:23:21,800
I think Hannah, she went
really like meta with this and
588
00:23:21,800 --> 00:23:23,680
she was talking about almost
like the whole world and
589
00:23:23,680 --> 00:23:27,480
like even like lifespans and she
was like, and I agree:
590
00:23:27,480 --> 00:23:30,800
We really just need to challenge
the status quo completely of
591
00:23:30,800 --> 00:23:33,040
like how life works now.
She was like, AI is going to be
592
00:23:33,040 --> 00:23:37,160
so disruptive and such a fast
moving tech that we already need
593
00:23:37,160 --> 00:23:40,240
to stop looking at life as, Oh
yeah, you go to school, you work
594
00:23:40,240 --> 00:23:42,360
for X amount of years, you get
these grades so that you can get
595
00:23:42,360 --> 00:23:44,880
this job and you work that job
for 40 years so that you can
596
00:23:44,880 --> 00:23:46,720
retire and have a nice bit of
time before you die.
597
00:23:47,160 --> 00:23:50,040
That is what she said.
And and she was like, that's
598
00:23:50,040 --> 00:23:51,520
just not going to be a thing
anymore.
599
00:23:51,520 --> 00:23:53,520
Like we're already seeing the
disruption of people having to
600
00:23:53,520 --> 00:23:56,160
retrain after five years in a
job because, oh, hey, AI can do
601
00:23:56,160 --> 00:23:58,000
it now.
It's happened in the short span
602
00:23:58,160 --> 00:24:00,400
of AI going from this little
gimmicky thing to where it is
603
00:24:00,400 --> 00:24:03,360
now of in that time, it's
already eradicated certain jobs
604
00:24:03,960 --> 00:24:07,000
or or at least huge, you know,
chunks of certain jobs, and
605
00:24:07,440 --> 00:24:09,760
that's just fast.
It just fascinates me like it.
606
00:24:09,840 --> 00:24:13,120
Not only does the education,
you know, structure of education
607
00:24:13,120 --> 00:24:15,800
just need to be completely
dismantled, but all of life.
608
00:24:15,920 --> 00:24:16,400
It's just.
Like.
609
00:24:16,720 --> 00:24:19,000
We need to start asking, what are we
preparing children for?
610
00:24:19,000 --> 00:24:20,720
But ultimately it's for the
workforce generally.
611
00:24:20,720 --> 00:24:22,840
That's what we know people and
society say about education.
612
00:24:23,120 --> 00:24:25,200
Well, the workforce is changing
rapidly, and the workforce is
613
00:24:25,200 --> 00:24:26,640
going to be nothing like it used
to be.
614
00:24:26,800 --> 00:24:29,160
So actually, what are we
preparing kids for?
615
00:24:29,160 --> 00:24:30,240
Yeah.
And it kind of went back to that
616
00:24:30,240 --> 00:24:32,840
question of what are we doing?
Yeah, just getting essays done
617
00:24:32,840 --> 00:24:34,800
quicker and getting grades isn't
even useful anyway.
618
00:24:34,800 --> 00:24:36,400
That's the irony of it.
It's like, you know, exactly.
619
00:24:36,400 --> 00:24:38,680
It's not even that useful now.
It's solving a problem that
620
00:24:38,680 --> 00:24:41,400
doesn't need to be a problem.
We've made it a problem.
621
00:24:41,400 --> 00:24:44,800
So here's the beauty of it.
It's solving a problem for the
622
00:24:44,800 --> 00:24:47,640
workforce of yesterday.
Yeah, yeah, yeah.
623
00:24:47,640 --> 00:24:49,480
It doesn't even solve a real
problem anymore.
624
00:24:49,480 --> 00:24:51,880
No, it doesn't make sense.
So everything's outdated.
625
00:24:51,880 --> 00:24:53,880
All needs to be restructured and
rethought.
626
00:24:53,880 --> 00:24:55,520
I think it's a huge problem.
I don't think there's gonna be any
627
00:24:55,520 --> 00:24:57,840
quick fixes to this at all.
What I found really interesting
628
00:24:57,840 --> 00:25:01,080
as well on the back of that was
how, you know, it was a big
629
00:25:01,080 --> 00:25:02,160
talk.
That's why I loved it.
630
00:25:02,160 --> 00:25:04,000
I think that's why I was
captivated, because it was about
631
00:25:04,000 --> 00:25:06,480
big, big, big things.
And it started with education.
632
00:25:06,480 --> 00:25:09,960
But you very quickly found out
that actually education is life
633
00:25:09,960 --> 00:25:12,360
is everything, right?
And they spoke about how again,
634
00:25:12,360 --> 00:25:15,000
this is a backdated view, much
like what you said about go to
635
00:25:15,040 --> 00:25:17,040
school, get a qualification,
get a job, retire.
636
00:25:17,440 --> 00:25:19,880
There's also a backdated view
that you kind of don't have to
637
00:25:19,880 --> 00:25:23,080
keep learning new things.
And this idea of, like, learning's
638
00:25:23,080 --> 00:25:25,400
for school belongs back then.
Actually, with the advent of
639
00:25:25,400 --> 00:25:28,120
AI, she was saying, Amol
640
00:25:28,120 --> 00:25:29,960
was saying both of them are
saying how they're more curious
641
00:25:29,960 --> 00:25:32,560
and find out more things than
ever because they can use AI to
642
00:25:32,560 --> 00:25:35,080
superpower things, to superpower
research to find things out.
643
00:25:35,560 --> 00:25:37,080
And I was like, yes, that's what
we need as well.
644
00:25:37,200 --> 00:25:40,120
We need an entire population
who's thirsty for knowledge,
645
00:25:40,480 --> 00:25:43,360
right?
And unfortunately in school
646
00:25:43,360 --> 00:25:46,240
right now, I don't think we're
feeding that thirst for
647
00:25:46,240 --> 00:25:48,600
knowledge enough.
I think we're feeding a need for
648
00:25:48,600 --> 00:25:51,560
to pass an exam feeling, a need
for to write an essay.
649
00:25:52,400 --> 00:25:54,800
AI is going to come along and do
it even quicker and get us even
650
00:25:54,800 --> 00:25:58,360
further from creating a
generation of children who want
651
00:25:58,360 --> 00:25:59,960
to learn for the sake of
learning.
652
00:26:00,240 --> 00:26:03,280
And what that will do is in the
short term, cool grades might be
653
00:26:03,280 --> 00:26:05,920
okay for a bit when we're just
marking an essay, but it's going
654
00:26:05,920 --> 00:26:08,440
to be a double whammy because
like you said, the
655
00:26:08,440 --> 00:26:10,440
qualifications won't matter
anymore because the jobs won't
656
00:26:10,440 --> 00:26:12,320
be there.
But also then when they're
657
00:26:12,320 --> 00:26:15,160
adults, they'll have no
experience in the act of
658
00:26:15,160 --> 00:26:18,640
learning or no desire to want to
find out more because they just
659
00:26:18,640 --> 00:26:21,520
think a bot can do it for them.
And that's where we have to.
660
00:26:21,520 --> 00:26:25,880
We have to teach children at
some point in their journey
661
00:26:25,880 --> 00:26:31,320
through education how to use AI
to not do the work for them
662
00:26:31,320 --> 00:26:35,280
necessarily, but to help them
with their learning and the
663
00:26:35,280 --> 00:26:39,120
journey themselves and make AI
work for them rather than just
664
00:26:39,360 --> 00:26:41,360
exporting all of the output.
Couldn't agree more.
665
00:26:41,360 --> 00:26:43,600
To superpower their critical
thinking skills?
666
00:26:43,600 --> 00:26:45,520
Exactly.
To superpower their enjoyment of
667
00:26:45,520 --> 00:26:47,360
learning a particular subject
that they've just found out.
668
00:26:47,360 --> 00:26:49,160
But how long have we said that
it feels like
669
00:26:49,160 --> 00:26:50,440
AI is just putting a microscope
on it?
670
00:26:50,960 --> 00:26:52,920
It's like, oh, AI's come along
and can do the thing we've
671
00:26:52,920 --> 00:26:55,680
always thought is pointless and
stupid even quicker, so kids
672
00:26:55,680 --> 00:26:57,200
have to do less.
It's like, cool.
673
00:26:57,200 --> 00:27:01,720
The problem remains.
As a teacher, I want to inspire
674
00:27:01,720 --> 00:27:04,440
children, OK?
I want to inspire them to want
675
00:27:04,440 --> 00:27:07,520
to find things out.
My job I always thought of when
676
00:27:07,520 --> 00:27:11,040
I was in front of my class was I
want to trick them into thinking
677
00:27:11,040 --> 00:27:12,480
they're deciding what comes
next.
678
00:27:12,800 --> 00:27:15,720
I want to make them think that
they're on their own journey,
679
00:27:15,920 --> 00:27:18,400
which they do as much as
possible, but I'm curating it.
680
00:27:18,680 --> 00:27:21,560
My job as a curator and my job
is to inspire, to make them
681
00:27:21,560 --> 00:27:24,040
think something.
I want them to think of the next
682
00:27:24,040 --> 00:27:26,600
thing in the progression without
them thinking I've made them do
683
00:27:26,600 --> 00:27:27,120
it.
Yeah, yeah.
684
00:27:27,120 --> 00:27:28,560
Or without you just telling
them, Yeah, because it's
685
00:27:28,680 --> 00:27:30,440
obviously, you know, you're
thinking, you're asking the
686
00:27:30,440 --> 00:27:31,720
questions like, yeah, what would
happen here?
687
00:27:31,720 --> 00:27:34,600
You know exactly what happened.
You're curating a curious mind.
688
00:27:34,840 --> 00:27:36,520
That's what you're doing.
Yeah, that's what you want to
689
00:27:36,520 --> 00:27:38,200
do.
And how boring and droll are the
690
00:27:38,200 --> 00:27:40,600
lessons where you just stand at
the front and just tell them
691
00:27:40,600 --> 00:27:42,160
things?
Yeah, if I wanted to.
692
00:27:42,160 --> 00:27:44,920
There you go.
There's AI. AI can do that if we
693
00:27:44,920 --> 00:27:46,360
want to.
We just get it on a voice note
694
00:27:46,360 --> 00:27:48,680
app, read out some facts to the
kids.
695
00:27:48,680 --> 00:27:51,360
Yeah, what's that going to do?
I want to inspire them.
696
00:27:51,360 --> 00:27:54,680
That's why I think there is
still that difference between
697
00:27:54,680 --> 00:27:58,160
the teacher delivering a
curriculum, inspiring children
698
00:27:58,160 --> 00:28:00,320
and what AI can currently do
right now.
699
00:28:00,320 --> 00:28:02,440
I think there's still a big, big
disconnect.
700
00:28:02,600 --> 00:28:06,120
We have to focus more on what we
as humans can do, human to
701
00:28:06,120 --> 00:28:07,400
human.
Well, that was a really good
702
00:28:07,400 --> 00:28:10,920
example of maybe a part of the
class and perhaps the AI could
703
00:28:10,920 --> 00:28:13,560
be used for if it's literally
just here are facts and you need
704
00:28:13,560 --> 00:28:15,760
to know them.
And there's no reason why a bot
705
00:28:15,760 --> 00:28:17,680
(it didn't even have to be AI)
couldn't just give you those
706
00:28:17,680 --> 00:28:19,520
facts.
And in the olden days, they'd
707
00:28:19,520 --> 00:28:21,840
call it a book: give a child a
book to read by themselves and
708
00:28:21,840 --> 00:28:24,240
find out for themselves.
But let's maybe we can go
709
00:28:24,240 --> 00:28:27,320
through a few things then in the
world of education that we think
710
00:28:27,320 --> 00:28:31,960
and rank them as to how good
we think AI is at doing that
711
00:28:31,960 --> 00:28:33,720
job.
So in other words, how it almost
712
00:28:33,880 --> 00:28:36,240
kind of coincides with how
likely we think then AI is to
713
00:28:36,240 --> 00:28:38,040
replace that job, because the
better it is, the more
714
00:28:38,040 --> 00:28:38,920
likely.
This is good.
715
00:28:39,040 --> 00:28:40,080
It's good.
So we'll give it a score out of
716
00:28:40,080 --> 00:28:43,880
10 and we'll say basically how
useful is AI for this?
717
00:28:43,880 --> 00:28:45,680
Yeah, yeah, yeah.
So I'll start you off then you
718
00:28:45,680 --> 00:28:47,880
can go with this one.
So written lesson plans.
719
00:28:48,640 --> 00:28:52,800
OK.
I think a solid 8. A solid 8?
720
00:28:52,800 --> 00:28:54,640
Explain.
Because again, there's going to
721
00:28:54,640 --> 00:28:56,240
be context to all of these by
the way, people here, right?
722
00:28:56,240 --> 00:28:58,840
And go what?
Either way, the solid context
723
00:28:58,840 --> 00:29:02,000
here is that AI now, especially
large language models can be
724
00:29:02,000 --> 00:29:04,800
trained on documents you upload.
You actually might be uploading
725
00:29:04,800 --> 00:29:06,440
documents from your curriculum.
OK.
726
00:29:06,760 --> 00:29:09,360
Also very important, the prompts
you put into it.
727
00:29:09,720 --> 00:29:12,120
What I'm not saying, and
I think I'll talk for both of
728
00:29:12,120 --> 00:29:12,480
us
now, is this:
729
00:29:12,560 --> 00:29:14,680
I don't think either of us are
just saying press a button and I
730
00:29:14,680 --> 00:29:16,840
hope it comes.
We have to have the
731
00:29:17,280 --> 00:29:19,520
understanding here that what
we're doing is prompting AI to
732
00:29:19,520 --> 00:29:21,560
do this for us.
So this is only as good as your
733
00:29:21,560 --> 00:29:23,960
prompts at any point.
But if your prompts are good,
734
00:29:24,360 --> 00:29:27,640
especially I'm thinking of
medium to long term planning, If
735
00:29:27,640 --> 00:29:31,080
you want AI to organise your
learning in a structured order
736
00:29:31,280 --> 00:29:33,600
and you're putting in the
curriculum for it, I think it'd
737
00:29:33,600 --> 00:29:35,880
be very good at putting that
into a table for you and mapping
738
00:29:35,880 --> 00:29:38,880
out across the year that kind of
grunt work that you could sit
739
00:29:38,880 --> 00:29:40,480
there and do it on Excel if you
wanted to.
740
00:29:40,960 --> 00:29:42,400
AI can do that in 10 seconds for
you.
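(A minimal sketch of the kind of prompt workflow being described here, assuming the OpenAI Python SDK; the model name, objectives and table columns are placeholder examples, not anything from the episode, and the output would still need checking and editing by the teacher.)

```python
# Sketch: ask an LLM to map curriculum objectives into a medium-term plan table.
# Assumes the OpenAI Python SDK; model name and objectives are placeholders.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

curriculum_objectives = """
- Read, write and order numbers to 1,000
- Add and subtract numbers with up to three digits
- Recall multiplication facts for the 3, 4 and 8 times tables
"""

prompt = (
    "You are helping a primary teacher with medium-term planning.\n"
    "Organise the objectives below into a table with columns "
    "Week, Objective, Suggested Activity, covering a 6-week half term. "
    "Keep the progression logical (easier concepts first).\n\n"
    f"Objectives:\n{curriculum_objectives}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
# The draft plan: the grunt work is done, the teacher still reviews and edits it.
print(response.choices[0].message.content)
```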
741
00:29:42,400 --> 00:29:44,760
I think that's fantastic.
The actual content itself, the
742
00:29:44,760 --> 00:29:48,360
reason why it's not more than an
8 for me is because I think
743
00:29:48,360 --> 00:29:51,720
personally, the content will
always come from something else.
744
00:29:51,720 --> 00:29:53,680
Even if I was curating it, I'd
be getting my content from
745
00:29:53,680 --> 00:29:55,040
somewhere else.
I'd be reading through the
746
00:29:55,040 --> 00:29:58,920
content and deciding what to do.
AI, I'd be giving it that, right?
747
00:29:59,000 --> 00:30:01,440
So it's still, it's still.
Don't just think that I'm hoping
748
00:30:01,440 --> 00:30:03,680
it does the right thing.
And the reason it's not 9 or 10
749
00:30:03,680 --> 00:30:05,800
is because afterwards I'd be
spending time going through what
750
00:30:05,800 --> 00:30:08,280
it did and editing it.
But I think in general, for
751
00:30:08,280 --> 00:30:10,400
written lesson plans, I think
written lesson plans on the
752
00:30:10,400 --> 00:30:11,840
whole, especially when you're
more experienced, are an
753
00:30:11,840 --> 00:30:12,800
absolute waste of time.
Yeah, yeah.
754
00:30:12,800 --> 00:30:15,240
I was going to say, what about
in the classic situation where
755
00:30:15,480 --> 00:30:18,000
you've already planned your
lesson in terms of what, however
756
00:30:18,000 --> 00:30:19,920
you do that, whether it's a
PowerPoint or in your mind or
757
00:30:19,920 --> 00:30:22,120
it's practical, whatever.
And then the school policy is,
758
00:30:22,120 --> 00:30:24,480
Oh, no, you have to have it
written down on this, on this
759
00:30:24,480 --> 00:30:26,400
planning format.
You've already done the work.
760
00:30:26,400 --> 00:30:29,160
There's nothing to gain for you
in the situation of writing it
761
00:30:29,160 --> 00:30:30,680
down.
It's just for the score: in that
762
00:30:30,680 --> 00:30:33,240
situation, it's the 10 out of 10
because I'm like, I don't even
763
00:30:33,240 --> 00:30:35,840
want to do this anyway.
So if someone else and if AI can
764
00:30:35,840 --> 00:30:38,360
just take what I've already done
and and just put it into some
765
00:30:38,360 --> 00:30:40,160
lovely written lesson for
someone else, great.
766
00:30:40,360 --> 00:30:42,080
So here's my big worry
for AI.
767
00:30:42,080 --> 00:30:44,800
Not worry, but I can see this
happening already is people
768
00:30:44,800 --> 00:30:48,440
become obsessed with AI and try
to shoehorn AI into every single
769
00:30:48,440 --> 00:30:51,360
part of their team's life.
So let's say you're the head of
770
00:30:51,360 --> 00:30:54,080
a school or a trust and you
think I need to get AI and
771
00:30:54,080 --> 00:30:56,080
everything for productivity?
We must get it in everything.
772
00:30:56,280 --> 00:30:58,960
Sometimes I think it adds steps
that you don't need right?
773
00:30:58,960 --> 00:31:01,640
So whilst okay, sure in your
context there right of your
774
00:31:01,640 --> 00:31:04,640
written plan, you've already got
your smart notebook or your or
775
00:31:04,640 --> 00:31:07,280
your PowerPoint for your lesson.
You're ready to teach, you're as
776
00:31:07,280 --> 00:31:10,120
ready as you can be, but then
your school goes,
777
00:31:10,120 --> 00:31:11,800
No, no, I need a written plan
just put it into AI.
778
00:31:11,800 --> 00:31:15,640
It'd be so quick.
It was like, OK, any second I
779
00:31:15,640 --> 00:31:20,120
put into making that is a second
I didn't need to spend. Yeah, but
780
00:31:20,120 --> 00:31:23,120
they'll go, look at how long it
would have taken you if we hadn't
781
00:31:23,120 --> 00:31:25,440
had AI.
It's gone from one hour to 10
782
00:31:25,440 --> 00:31:26,800
minutes.
Isn't that fantastic?
783
00:31:27,120 --> 00:31:29,240
I really need people to
understand here and just be
784
00:31:29,240 --> 00:31:32,080
nuanced and think about stuff
moment to moment because
785
00:31:32,080 --> 00:31:35,080
actually, no, you haven't saved
50 minutes, OK?
786
00:31:35,360 --> 00:31:37,000
You've wasted 10 minutes.
Yeah, definitely.
787
00:31:37,000 --> 00:31:38,600
It's like Amazon Black Friday.
No, you haven't.
788
00:31:38,600 --> 00:31:40,360
You haven't saved £200 on that
Hoover you don't need.
789
00:31:40,360 --> 00:31:43,120
OK, from £400 to
£200.
790
00:31:43,240 --> 00:31:44,960
You've wasted £200 because you
didn't.
791
00:31:44,960 --> 00:31:45,920
Need it?
Yeah, and you won't get about
792
00:31:45,920 --> 00:31:46,480
anyway.
Exactly.
793
00:31:46,480 --> 00:31:47,440
Just flip it.
Exactly.
794
00:31:47,440 --> 00:31:49,160
Yeah, Yeah.
You've no longer wasted 60
795
00:31:49,160 --> 00:31:50,320
minutes.
You've wasted 10 minutes.
796
00:31:50,320 --> 00:31:51,960
That's the way to do it.
No, you've saved 50.
797
00:31:52,040 --> 00:31:54,200
Exactly.
So thinking outside the box is
798
00:31:54,200 --> 00:31:56,320
important, isn't it, here?
In general, it's just like, is the
799
00:31:56,320 --> 00:31:58,640
system broken, is the system
the problem, not can we make the
800
00:31:58,640 --> 00:32:00,000
system quicker?
Exactly right.
801
00:32:00,400 --> 00:32:03,360
I'm going to give you one:
communication with parents.
802
00:32:04,120 --> 00:32:08,080
So letters, newsletters, notes
in general.
803
00:32:08,080 --> 00:32:09,960
Maybe as a teacher you want to
e-mail parents.
804
00:32:11,480 --> 00:32:14,440
Quite probably like a three or
four for me.
805
00:32:14,440 --> 00:32:16,920
Yeah, because I think the, I
think, I think AI can do some of
806
00:32:16,920 --> 00:32:20,360
the grunt work in terms of
speeding up your notes perhaps
807
00:32:20,360 --> 00:32:22,400
into a formal letter.
That's quite good.
808
00:32:22,520 --> 00:32:25,120
There's no reason.
But like, realistically, all of
809
00:32:25,120 --> 00:32:28,920
the content absolutely has to be
created and created by you.
810
00:32:28,920 --> 00:32:31,640
You know what I mean?
You're talking to a parent about
811
00:32:32,120 --> 00:32:33,920
their child.
Like every child is unique.
812
00:32:33,920 --> 00:32:36,400
It's a different situation.
It's not like a curriculum
813
00:32:36,400 --> 00:32:38,280
document.
You can't just upload the child
814
00:32:38,400 --> 00:32:41,640
into a thing and say... Like, a
national curriculum
815
00:32:41,640 --> 00:32:43,600
document is one thing and it's
objective, you know?
816
00:32:43,760 --> 00:32:45,200
GDPR.
Yeah, exactly.
817
00:32:45,200 --> 00:32:47,920
But a child is like a child.
Every child's experience and
818
00:32:47,920 --> 00:32:49,240
life is different.
I can't really.
819
00:32:49,360 --> 00:32:51,640
I just need to talk to the
parent and have a nuanced
820
00:32:51,640 --> 00:32:53,160
conversation.
Like I don't really think AI.
821
00:32:53,200 --> 00:32:54,800
Hold that thought.
Yeah, because I'm going to give
822
00:32:54,800 --> 00:32:57,880
you another one straight away.
Report writing that one.
823
00:32:57,880 --> 00:33:01,160
Why is it different?
I kind of said it already that
824
00:33:01,160 --> 00:33:03,240
report writing is.
Nuanced.
825
00:33:03,240 --> 00:33:04,480
Every single child is.
Different, true.
826
00:33:04,480 --> 00:33:06,800
I guess I'm really putting
emphasis on report writing on on
827
00:33:06,800 --> 00:33:09,040
the writing bit.
Communication with parents to me
828
00:33:09,040 --> 00:33:11,920
is like phone calls and like you
said, maybe notes or or chatting
829
00:33:11,920 --> 00:33:14,320
after school.
Like it's kind of like AI is.
830
00:33:14,320 --> 00:33:15,600
I can't see it replacing much of
that.
831
00:33:15,720 --> 00:33:17,920
OK, so for communication,
let's be really clear then
832
00:33:17,920 --> 00:33:20,480
because that's very interesting
for communication with parents.
833
00:33:20,480 --> 00:33:24,160
What you're saying is actually
you should be doing in person
834
00:33:24,160 --> 00:33:26,760
stuff more.
Yeah, realistically and I, I
835
00:33:26,760 --> 00:33:29,520
don't think AI should be taking
that away from the experience of
836
00:33:29,520 --> 00:33:31,520
school and the relationship
between parents and.
837
00:33:31,560 --> 00:33:32,040
School.
OK, Very.
838
00:33:32,080 --> 00:33:33,560
Interesting.
Whereas report writing is
839
00:33:33,560 --> 00:33:35,560
traditionally a piece of paper
with writing on it.
840
00:33:35,720 --> 00:33:39,800
And to me, I mean, I've done it
for years, there's, there's just
841
00:33:39,800 --> 00:33:44,280
no benefit to me writing out the
paragraphs myself when I can
842
00:33:44,280 --> 00:33:45,920
just put all of my bullet points
that I would already have
843
00:33:45,920 --> 00:33:47,520
written.
So in both situations, AI or not
844
00:33:47,520 --> 00:33:49,840
AI, I'm writing some bullet
points out about a child and the
845
00:33:49,840 --> 00:33:51,320
things I want to get across to
their parent.
846
00:33:51,600 --> 00:33:55,440
Then I'm spending either 15-20
minutes curating a lovely
847
00:33:55,440 --> 00:33:59,440
paragraph that's written really
nicely, or I can spend 5 seconds
848
00:33:59,440 --> 00:34:01,160
putting into AI and saying this
is my writing style.
849
00:34:01,160 --> 00:34:03,840
Can you just, can you just put
these notes into a paragraph and
850
00:34:03,840 --> 00:34:06,720
then proofreading it, always
doing little bits of edits, but
851
00:34:06,720 --> 00:34:09,600
the time saved is astronomical.
When I started using AI for
852
00:34:09,600 --> 00:34:12,040
reports, not only were they just
better because I wasn't getting
853
00:34:12,040 --> 00:34:14,159
the fatigue and starting to just
be like, oh man, I've been
854
00:34:14,159 --> 00:34:17,400
writing reports for 17 hours.
Genuinely because I'm just, I'm
855
00:34:17,400 --> 00:34:19,719
so fatigued that and it's all in
my own time because no one ever
856
00:34:19,719 --> 00:34:23,520
gets bloody report writing time
nearly or never enough that I'm
857
00:34:23,520 --> 00:34:26,440
just copying paragraphs from
previous kids because they're
858
00:34:26,440 --> 00:34:28,040
similar.
I'll just change, tweak a word
859
00:34:28,040 --> 00:34:30,080
or two.
Oh, I'm just using a Bank of
860
00:34:30,080 --> 00:34:32,120
stuff from five years ago, from
10 years ago.
861
00:34:32,120 --> 00:34:35,040
Like like we all pretend that
that was fine.
862
00:34:35,400 --> 00:34:38,800
Maybe it was, but we can't then
also say, oh, AI is terrible.
863
00:34:38,960 --> 00:34:41,639
Actually, AI was making more
unique paragraphs than ever
864
00:34:41,639 --> 00:34:43,800
before.
It was still mimicking my style
865
00:34:43,800 --> 00:34:45,840
and I and it didn't have to copy
and paste anything from any
866
00:34:45,840 --> 00:34:47,600
other child.
I could give it unique notes
867
00:34:47,960 --> 00:34:50,880
every single time about each
child, and all it did was just
868
00:34:50,880 --> 00:34:52,679
formulate those notes into a
paragraph for someone else to
869
00:34:52,679 --> 00:34:55,719
read in a professional manner.
So like the sheer amount of
870
00:34:55,719 --> 00:34:58,440
grunt work that had to go into
report writing for you bumps it
871
00:34:58,440 --> 00:35:00,080
up from your three.
So what would you give out of
872
00:35:00,080 --> 00:35:01,440
10?
For you, I'd probably still give
873
00:35:01,440 --> 00:35:05,560
it like 7 because ultimately,
yeah, it would be a 10 if the
874
00:35:05,560 --> 00:35:08,120
note-taking part didn't exist.
Even 7...
875
00:35:08,120 --> 00:35:09,440
Probably a bit generous to be
honest with you.
876
00:35:09,440 --> 00:35:10,760
Because that is, I'm going to
give it five.
877
00:35:10,920 --> 00:35:13,200
I'm going down because I think
half, half of the importance
878
00:35:13,200 --> 00:35:16,880
there is is just from you, the
teacher, your understanding of
879
00:35:16,880 --> 00:35:18,400
the child, nothing to do with
anybody else.
880
00:35:18,800 --> 00:35:21,120
That bit is so important, even
though it only takes 5% of the
881
00:35:21,120 --> 00:35:23,640
time now.
Yeah, very, very important.
882
00:35:23,640 --> 00:35:26,840
So AI cannot replace that
bit, but it can save you the time
883
00:35:26,840 --> 00:35:29,000
spent on the writing.
So I'm going to very quickly on
884
00:35:29,000 --> 00:35:30,680
this now, whilst we're talking
about this, I'm going to ask
885
00:35:30,680 --> 00:35:32,080
you, would you use AI for this
thing?
886
00:35:32,080 --> 00:35:33,160
It's all to do with this.
OK.
887
00:35:33,200 --> 00:35:35,360
OK.
Would you use AI to to when
888
00:35:35,360 --> 00:35:37,160
sending an e-mail to a parent?
Yeah.
889
00:35:37,440 --> 00:35:39,560
Would you use AI when sending
out a letter to a parent?
890
00:35:39,560 --> 00:35:41,160
Yeah.
Would you use AI when writing
891
00:35:41,160 --> 00:35:42,040
reports?
Yep.
892
00:35:42,280 --> 00:35:44,480
Would you use AI when writing an
EHCP plan?
893
00:35:45,760 --> 00:35:47,560
Yeah.
Interesting.
894
00:35:47,560 --> 00:35:49,040
Yeah, yeah, yeah.
What made you slow down?
895
00:35:49,360 --> 00:35:52,360
Because I was thinking that's
more sensitive and again, like a
896
00:35:52,360 --> 00:35:53,720
bit more nuanced, but actually
do.
897
00:35:53,720 --> 00:35:56,840
You think it's worth the time on
notes, or are you really
898
00:35:56,840 --> 00:36:00,080
separating it out?
I put the grunt, I put the
899
00:36:00,080 --> 00:36:01,920
actual actually.
Yeah, I think because you have
900
00:36:01,920 --> 00:36:04,680
to put so much, so much careful
consideration of thoughts into
901
00:36:04,680 --> 00:36:06,800
the kind of your note taking
process when writing the EHCP in
902
00:36:06,800 --> 00:36:08,920
the first place, that kind of
becomes the EHCP.
903
00:36:08,920 --> 00:36:10,480
I'd probably avoid
it. So there might actually not
904
00:36:10,480 --> 00:36:12,840
be that much difference there
because it's so, it's so from
905
00:36:12,840 --> 00:36:14,920
you. Trying to see where maybe
your line is basically like, you
906
00:36:14,920 --> 00:36:18,240
know, like, you know, a letter
out, a letter is just very
907
00:36:18,240 --> 00:36:20,080
generic, isn't it,
down to like a very
908
00:36:20,080 --> 00:36:23,080
specific legal document of the
plan for the EHCP that we're gonna
909
00:36:23,080 --> 00:36:24,000
implement for?
The child.
910
00:36:24,000 --> 00:36:27,040
Yeah, yeah, yeah.
There's got to... Whether I'm
911
00:36:27,040 --> 00:36:29,360
right or wrong, there's got to
be a line somewhere.
912
00:36:29,360 --> 00:36:33,360
In my opinion, I still probably.
I think I would, you know, in
913
00:36:33,360 --> 00:36:35,840
what, in what respect there
purely if there was just some
914
00:36:35,840 --> 00:36:38,280
crazy format I had to fill out
and I was like, I've done all
915
00:36:38,280 --> 00:36:39,960
the work and I just wanted to
turn into that format.
916
00:36:39,960 --> 00:36:41,400
I'd absolutely use that.
And just be really clear.
917
00:36:41,400 --> 00:36:43,040
I want to put a disclaimer in
here because I know me and you
918
00:36:43,040 --> 00:36:46,400
know this and this is like deep
within our brains, we do
919
00:36:46,400 --> 00:36:49,320
not advocate and would never
actually put children's names in,
920
00:36:49,320 --> 00:36:51,560
and we would not breach GDPR by
putting children's names and
921
00:36:51,560 --> 00:36:54,840
data into things like ChatGPT.
You can use blanks.
922
00:36:54,840 --> 00:36:57,120
You can get it to structure
sentences without using
923
00:36:57,120 --> 00:36:58,240
children's names, and you
shouldn't do it.
924
00:36:58,240 --> 00:37:00,960
We should say that because it is
very, very, very, very, very,
925
00:37:00,960 --> 00:37:03,080
very important.
Like there are people out there
926
00:37:03,080 --> 00:37:05,840
who are lovely and don't mean
any harm whatsoever, who won't
927
00:37:05,840 --> 00:37:08,320
even think about it.
They're not being malicious, but
928
00:37:09,160 --> 00:37:11,080
it's bad and you should not do
that.
929
00:37:11,080 --> 00:37:14,440
So if you're doing your reports,
don't put the name, just put literally
930
00:37:14,440 --> 00:37:16,880
like a string of letters or
something in when you're putting
931
00:37:16,880 --> 00:37:18,600
it into the thing, you still
give the description, the bullet
932
00:37:18,600 --> 00:37:21,160
points and then just manually
replace it, put the child's name in
933
00:37:21,160 --> 00:37:22,480
yourself.
So that's how that's how to do
934
00:37:22,480 --> 00:37:23,560
it.
Just to be really clear, in case
935
00:37:23,560 --> 00:37:25,440
someone's.
Also that means, by the way, it
936
00:37:25,440 --> 00:37:27,800
means that I built in my editing
and reviewing whilst I was
937
00:37:27,800 --> 00:37:28,720
putting name in.
Yeah.
938
00:37:28,720 --> 00:37:30,200
And I did one of the.
I was like, oh, it's an extra
939
00:37:30,200 --> 00:37:31,600
thing.
Not really because I read it.
940
00:37:31,600 --> 00:37:33,640
I always read them, Yeah.
And any time it said name, I'd
941
00:37:33,640 --> 00:37:35,200
write their name in, then I'd
keep reading it.
942
00:37:35,200 --> 00:37:37,000
And it may be something else.
I don't quite like that word.
943
00:37:37,000 --> 00:37:39,120
I think it's not quite right.
I added that, put the name in,
944
00:37:39,120 --> 00:37:41,120
put the name in it.
It worked quite well.
945
00:37:41,120 --> 00:37:43,200
Yeah, it was absolutely fine.
It really was still a huge time,
946
00:37:43,200 --> 00:37:45,840
save for unique reports.
I'm going to give you 1 now.
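(A rough sketch of the placeholder workflow described above, assuming the OpenAI Python SDK; the notes and the name "Alex" are invented examples. The point is that the real name never leaves your machine, and swapping it back in doubles as the proofread pass.)

```python
# Sketch: turn report notes into a paragraph without sending the child's name
# to the model. Assumes the OpenAI Python SDK; the notes and name are made up.
from openai import OpenAI

client = OpenAI()

PLACEHOLDER = "CHILD"  # a neutral token the model sees instead of the real name

notes = [
    f"{PLACEHOLDER} has grown in confidence in maths this year",
    f"{PLACEHOLDER} needs reminders to check written work for punctuation",
    f"{PLACEHOLDER} is a kind and supportive member of the class",
]

prompt = (
    "Write a warm, professional end-of-year report paragraph in British English, "
    f"referring to the pupil only as '{PLACEHOLDER}', based on these notes:\n- "
    + "\n- ".join(notes)
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
draft = response.choices[0].message.content

# The real name only ever exists locally; reading through while swapping it in
# is the built-in editing and reviewing step mentioned in the episode.
final_report = draft.replace(PLACEHOLDER, "Alex")  # hypothetical name
print(final_report)
```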
947
00:37:45,840 --> 00:37:49,120
So going back to the context of
rating out of 10, how useful AI
948
00:37:49,160 --> 00:37:52,080
is for something in the
classroom, lesson hooks.
949
00:37:52,160 --> 00:37:53,480
So that bit at the start of the
lesson.
950
00:37:53,680 --> 00:37:56,800
Yeah, I think this is where it
can come into its own depending
951
00:37:56,800 --> 00:37:58,680
on what you're doing.
Like it can be 10 out of 10.
952
00:37:58,680 --> 00:38:03,240
Genuinely, I've, I've not for
every lesson ever, but you might
953
00:38:03,240 --> 00:38:04,600
decide actually, do you know
what?
954
00:38:04,960 --> 00:38:07,600
I, we did some writing once
before we were doing some
955
00:38:07,600 --> 00:38:11,680
character description.
Tell me the hardest thing to get
956
00:38:11,680 --> 00:38:14,200
kids to believe is real when
they're doing writing.
957
00:38:14,200 --> 00:38:15,560
It's that
there's a real purpose behind
958
00:38:15,560 --> 00:38:17,640
it, right?
How much better is children's
959
00:38:17,640 --> 00:38:19,880
writing when there's purpose?
That's why, you know, when
960
00:38:19,880 --> 00:38:21,760
you're writing a letter, you try
and write that to someone
961
00:38:21,760 --> 00:38:23,480
important.
You say I'm going to send it to
962
00:38:23,480 --> 00:38:26,040
David Attenborough, I'm going to
send it to the Prime Minister,
963
00:38:26,040 --> 00:38:28,160
I'm going to send it to Father
Christmas, whatever it might be.
964
00:38:28,400 --> 00:38:31,640
So purpose is massive in writing
because you get children engaged
965
00:38:31,640 --> 00:38:34,480
and want to do it.
So we were doing some writing
966
00:38:34,480 --> 00:38:37,440
before we did some character
descriptions, quite dry, quite
967
00:38:37,440 --> 00:38:40,240
boring, genuinely, like unless
they've got a really vivid
968
00:38:40,240 --> 00:38:42,040
imagination that some children
don't have.
969
00:38:42,600 --> 00:38:46,320
So AI... I remember at the front of
the room we were doing some
970
00:38:46,320 --> 00:38:49,200
shared writing together.
We were talking about how to use
971
00:38:49,200 --> 00:38:50,760
different descriptive words,
etcetera.
972
00:38:50,760 --> 00:38:53,840
What can make the, the, the
monster thing they're doing,
973
00:38:53,840 --> 00:38:56,200
monsters or aliens or something,
really stand out?
974
00:38:57,120 --> 00:39:01,840
And I then put that into image
generation and it made the image
975
00:39:01,960 --> 00:39:04,320
and they saw it and they
immediately then could link
976
00:39:04,320 --> 00:39:07,720
between the words they had
chosen and what it looks like.
977
00:39:07,960 --> 00:39:10,240
And then after that, we went
back, we identified all the
978
00:39:10,240 --> 00:39:13,720
adjectives in the sentence, for
example, and we said, OK, that
979
00:39:13,720 --> 00:39:17,520
that alien, it's, it's too big.
It's too big.
980
00:39:17,520 --> 00:39:19,360
Did we describe?
Oh, you said it's enormous.
981
00:39:19,760 --> 00:39:22,840
Rub it out, change it, went to
something else.
982
00:39:23,200 --> 00:39:24,760
It's green.
I don't think Green's that good.
983
00:39:24,760 --> 00:39:26,720
I think in the in that
environment, I think it'd be
984
00:39:26,720 --> 00:39:28,800
better to be like purpley or
something, whatever it might be.
985
00:39:29,200 --> 00:39:31,400
Go through, scrub out the
adjectives and swap them.
986
00:39:31,400 --> 00:39:35,280
Whatever reason to maybe change
the theme or to change the feel
987
00:39:35,280 --> 00:39:37,760
of the monster to make it from a
cuddly monster to a scary
988
00:39:37,760 --> 00:39:40,600
monster.
So the kids were having purpose
989
00:39:40,600 --> 00:39:43,000
because they knew it was about
to be regenerated and we were
990
00:39:43,000 --> 00:39:45,760
talking about the power of words
in writing.
991
00:39:45,920 --> 00:39:49,240
It became tangible for them.
That hook was 10 out of 10 for
992
00:39:49,240 --> 00:39:52,120
my kids because suddenly they
were thinking every word they
993
00:39:52,120 --> 00:39:55,240
wrote in their paper.
They were thinking right when
994
00:39:55,320 --> 00:39:57,480
Mr. Price generates this image,
I said to them, I'm going to
995
00:39:57,480 --> 00:40:00,960
generate all your animals. When
Mr. Price generates this, is
996
00:40:00,960 --> 00:40:03,160
that the right word to use?
Or actually, I really want it to
997
00:40:03,160 --> 00:40:04,680
look like this.
What's a better word?
998
00:40:04,680 --> 00:40:06,160
Let's get the thesaurus.
Let's look it up.
999
00:40:07,000 --> 00:40:09,080
There was so much purpose.
Their writing was so much
1000
00:40:09,080 --> 00:40:10,960
better.
It was meaningful.
1001
00:40:11,160 --> 00:40:14,080
And we said every word that goes
in your page will be learned and
1002
00:40:14,080 --> 00:40:16,520
be shown.
And it was 10 out of 10.
1003
00:40:16,640 --> 00:40:20,240
That hook worked so well.
And that was using AI.
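(A minimal sketch of that character-description hook, assuming the OpenAI Python SDK's image endpoint; the adjectives are invented stand-ins for whatever the class actually chose in the shared writing.)

```python
# Sketch: regenerate an image from the adjectives the class has chosen, so every
# word change becomes visible. Assumes the OpenAI Python SDK's image endpoint;
# the adjectives are invented examples.
from openai import OpenAI

client = OpenAI()

adjectives = ["enormous", "purpley", "scaly", "friendly"]  # gathered during shared writing

prompt = (
    "A child-friendly cartoon illustration of an alien that is "
    + ", ".join(adjectives)
    + ", no text in the image"
)

result = client.images.generate(model="dall-e-3", prompt=prompt, size="1024x1024")
# Show this on the board, then scrub out an adjective, swap it and regenerate.
print(result.data[0].url)
```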
1004
00:40:20,640 --> 00:40:22,600
That's it.
Do you reckon you're skewed
1005
00:40:22,600 --> 00:40:25,480
slightly towards English in
terms of like where hooks are
1006
00:40:25,480 --> 00:40:27,320
really good?
Yeah, because it depends.
1007
00:40:27,320 --> 00:40:31,120
It depends because writing, I've
also before asked it like what
1008
00:40:31,160 --> 00:40:32,960
AI can do is give you lots of
ideas.
1009
00:40:32,960 --> 00:40:36,040
Yeah, true, lots of ideas.
So for example, what I always
1010
00:40:36,040 --> 00:40:39,320
say is never ask it for a hook,
never ask it for an idea.
1011
00:40:39,960 --> 00:40:43,640
You might say, right, I'm doing
a lesson this afternoon on, you
1012
00:40:43,640 --> 00:40:47,600
know, encampments or something,
or Roman history or, or how, you
1013
00:40:47,600 --> 00:40:50,160
know, the Anglo Saxons came over
to Britain, etcetera.
1014
00:40:50,160 --> 00:40:51,400
I'm amazed you didn't say rivers,
but yeah.
1015
00:40:51,400 --> 00:40:52,240
Rivers.
Rivers are.
1016
00:40:52,240 --> 00:40:54,080
I'm really trying to avoid
saying rivers usually.
1017
00:40:54,520 --> 00:40:55,640
Really.
I'm doing a lesson on rivers.
1018
00:40:56,200 --> 00:40:58,400
How can I hook the children to
get them interested in this
1019
00:40:58,400 --> 00:41:00,160
story?
So you can say to them like, but
1020
00:41:00,160 --> 00:41:01,960
you can say give me 10 ideas.
Yeah, definitely.
1021
00:41:01,960 --> 00:41:04,120
And it'll generate 10: 7 will be
nonsense.
1022
00:41:04,160 --> 00:41:06,240
Yeah, 2 will be all right, and 1
might be really good.
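(A small sketch of the "give me 10 ideas" prompt being described, again assuming the OpenAI Python SDK; the topic, year group and constraints are placeholders.)

```python
# Sketch: ask for a batch of hook ideas and expect to keep only one or two.
# Assumes the OpenAI Python SDK; topic and constraints are placeholders.
from openai import OpenAI

client = OpenAI()

prompt = (
    "I'm teaching a Year 4 geography lesson on rivers this afternoon. "
    "Give me 10 short ideas for a 5-minute lesson hook that needs no special "
    "equipment and works in a normal classroom. Number them 1 to 10."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
# As in the episode: most will be nonsense, a couple all right, maybe one really good.
print(response.choices[0].message.content)
```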
1023
00:41:06,240 --> 00:41:08,040
Yeah, yeah, yeah.
So it's just about ideas and it
1024
00:41:08,040 --> 00:41:11,600
can give you ideas.
And I found again, I'll say this
1025
00:41:11,600 --> 00:41:15,800
again, AI for a lot of people is
stifling creativity.
1026
00:41:16,520 --> 00:41:20,440
For me, I became more creative
because it gave me more ideas.
1027
00:41:20,600 --> 00:41:22,520
I did stuff at the start of
lessons that I would not have
1028
00:41:22,520 --> 00:41:24,040
done before.
I always thought to myself,
1029
00:41:24,040 --> 00:41:26,880
what's a really good hook?
And Oh my God, I just scoffed my
1030
00:41:26,880 --> 00:41:28,160
lunch down.
I'm on duty for a bit.
1031
00:41:28,160 --> 00:41:29,040
I've got a lesson this
afternoon.
1032
00:41:29,040 --> 00:41:32,800
I can't think for two seconds.
Why don't you try starting off
1033
00:41:32,800 --> 00:41:35,520
by doing this?
Yes, we're going to start off
1034
00:41:35,520 --> 00:41:37,440
with a little story.
Wouldn't people argue that is
1035
00:41:37,440 --> 00:41:39,240
stifling your creativity though
because you didn't do it?
1036
00:41:39,280 --> 00:41:41,080
No, it's not stifling my
creativity because the option is
1037
00:41:41,200 --> 00:41:44,960
I'm not creative at all or I'm
choosing, I'm making my lesson
1038
00:41:44,960 --> 00:41:46,680
more creative.
So that's what that's what I
1039
00:41:46,680 --> 00:41:48,560
mean.
Like I without it, I would not
1040
00:41:48,560 --> 00:41:51,120
have done anything.
So by definition, I was more
1041
00:41:51,120 --> 00:41:53,600
creative.
But if I'm not, I'm not this.
1042
00:41:53,720 --> 00:41:56,320
I'm a teacher.
I'm there to give children, you
1043
00:41:56,320 --> 00:42:00,000
know, option opportunities in a
lesson to be more hooked and
1044
00:42:00,000 --> 00:42:02,720
engaged by something.
It's a more creative lesson.
1045
00:42:03,360 --> 00:42:06,480
Am I personally going through
the creative process of sat down
1046
00:42:06,480 --> 00:42:08,880
thinking about, no, I don't have
time to do it, but it's made my
1047
00:42:08,880 --> 00:42:10,520
lesson more creative.
It's made my children more
1048
00:42:10,520 --> 00:42:12,760
engaged.
I'm giving more to the kids as a
1049
00:42:12,760 --> 00:42:14,880
result, just because I've used
this as a tool.
1050
00:42:15,120 --> 00:42:18,160
I think for hooks, by all means,
use your own hooks that you've
1051
00:42:18,160 --> 00:42:19,800
used in your whole career.
There are some lessons.
1052
00:42:19,800 --> 00:42:22,400
I know what I'm gonna do all the
time when I'm doing instructions
1053
00:42:22,400 --> 00:42:24,200
for the first time.
We make a sandwich.
1054
00:42:24,200 --> 00:42:27,000
Yeah, yeah, of course.
And we go, the kids say, put the
1055
00:42:27,000 --> 00:42:28,440
bread on top, I'll put on top of
my head.
1056
00:42:28,440 --> 00:42:29,920
You're not being specific
enough.
1057
00:42:29,920 --> 00:42:32,360
All that kind of stuff.
This stuff I will always do, but
1058
00:42:32,440 --> 00:42:34,600
for lessons where you want
something just whack it into AI.
1059
00:42:34,680 --> 00:42:36,640
The Egyptian lesson years ago
and we used to introduce
1060
00:42:36,640 --> 00:42:38,480
Egyptians and we'd get we'd get
the toilet paper out, dress up
1061
00:42:38,480 --> 00:42:40,720
as mummy, they'd get the sand
out and they'd dig for treasure
1062
00:42:41,040 --> 00:42:42,960
just to get them hooked into the
idea of.
1063
00:42:43,120 --> 00:42:45,200
Dig for treasure, that classic
historical language.
1064
00:42:45,880 --> 00:42:47,960
Dig for treasure.
Kids dress up as mummy.
1065
00:42:48,080 --> 00:42:50,720
Dress up as archaeologists.
No one's taking primary learning
1066
00:42:50,720 --> 00:42:53,560
seriously.
But it was a hook, you know, and
1067
00:42:53,560 --> 00:42:55,520
the thought of like replacing
that with that was the best hook
1068
00:42:55,520 --> 00:42:56,120
for that.
Yeah.
1069
00:42:56,280 --> 00:42:58,160
I mean, that that didn't need
AI, but you're absolutely right.
1070
00:42:58,160 --> 00:43:00,360
You know what AI might have been
instrumental in thinking of that
1071
00:43:00,360 --> 00:43:01,960
hook in the 1st place.
Do you know from that meta level
1072
00:43:01,960 --> 00:43:04,800
like, yeah, exactly.
As well as actually being used
1073
00:43:04,920 --> 00:43:06,880
like in your English example?
I want to do another one with
1074
00:43:06,880 --> 00:43:09,320
you because it's kind of linked.
I think it's close enough.
1075
00:43:09,320 --> 00:43:11,640
Anyway, that was lesson hooks.
What about in the main part of
1076
00:43:11,640 --> 00:43:13,600
the lesson, like the actual
worksheet?
1077
00:43:14,760 --> 00:43:18,440
Oh, this really depends.
It can genuinely be 0 out of 10;
1078
00:43:19,080 --> 00:43:21,360
it can genuinely be absolute
trash nonsense.
1079
00:43:21,640 --> 00:43:24,640
It can be quite useful, right?
So let me be really specific
1080
00:43:24,640 --> 00:43:27,680
here, give you some examples.
When I've taught GPS objectives
1081
00:43:27,680 --> 00:43:32,760
in the past and I've wanted ten
questions to fill a gap with a
1082
00:43:32,760 --> 00:43:36,240
preposition or something, or
choosing between, sorry, 'an' and
1083
00:43:36,240 --> 00:43:39,400
'a', right, or choosing a
determiner to go in the gap or
1084
00:43:39,640 --> 00:43:42,480
anything like that kind of
nonsense that I hate anyway.
1085
00:43:42,840 --> 00:43:45,080
I don't want to spend 10 minutes
putting a sheet together for
1086
00:43:45,080 --> 00:43:47,240
that.
So AI was very good at
1087
00:43:47,360 --> 00:43:52,240
generating 10, 15, 20 sentences
with the preposition missing and
1088
00:43:52,240 --> 00:43:55,160
maybe even in brackets 3 to
choose from that kind of
1089
00:43:55,160 --> 00:43:57,000
generation of worksheets.
Fantastic.
1090
00:43:57,400 --> 00:44:00,000
AI generated loads of times
table practise for me.
1091
00:44:00,400 --> 00:44:03,080
And I said only do the three and two
times tables, mix up the
1092
00:44:03,080 --> 00:44:04,600
presentation.
It could handle that.
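(A hedged sketch of that kind of cloze-procedure and times-table worksheet generation, assuming the OpenAI Python SDK; the objective, year group and question counts are placeholders, and the output still needs a teacher's check before it goes in front of children.)

```python
# Sketch: generate rote, fill-in-the-gap practice questions (cloze sentences and
# times tables). Assumes the OpenAI Python SDK; objective and counts are placeholders.
from openai import OpenAI

client = OpenAI()

prompt = (
    "Write 15 sentences for Year 5 pupils, each with one preposition missing, "
    "shown as a blank. After each blank, give three options in brackets, only "
    "one of which is correct. Then write 20 mixed questions from the 2 and 3 "
    "times tables, varying the presentation (e.g. '3 x 7 =', '__ x 3 = 21')."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": prompt}],
)
# Paste into the worksheet template after checking the answers are actually right.
print(response.choices[0].message.content)
```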
1093
00:44:04,680 --> 00:44:07,120
Yeah, yeah, right.
So in terms of that kind of
1094
00:44:07,120 --> 00:44:10,880
rote, repetitive fill in the box
text level stuff.
1095
00:44:10,880 --> 00:44:12,520
Cloze procedure.
Exactly.
1096
00:44:12,520 --> 00:44:13,880
Cloze procedure, that's the
word.
1097
00:44:14,040 --> 00:44:16,240
I found it was very good at
that, especially if you give it
1098
00:44:16,240 --> 00:44:20,440
the right prompts and told it
what to do outside of that when
1099
00:44:20,440 --> 00:44:21,920
it gets a bit further.
I don't know.
1100
00:44:21,920 --> 00:44:23,440
We've played around with
different things, haven't we?
1101
00:44:23,440 --> 00:44:26,120
And I'm not going to sit and
name all companies etcetera, but
1102
00:44:26,400 --> 00:44:28,040
but it can be hit and miss,
right?
1103
00:44:28,200 --> 00:44:29,280
Yeah.
I think if we sort of band it
1104
00:44:29,280 --> 00:44:31,200
together then with maybe lesson
generation in general, like slide
1105
00:44:31,200 --> 00:44:32,960
generation.
So like your work, basically the
1106
00:44:32,960 --> 00:44:35,080
content that you might put in on
a sheet for the kids to work
1107
00:44:35,080 --> 00:44:37,120
through, all the content that
might be on your slides when
1108
00:44:37,120 --> 00:44:39,320
you're going through.
But very quick, I think that's,
1109
00:44:39,720 --> 00:44:41,920
I think that's, I think it's
completely separate in terms of
1110
00:44:42,240 --> 00:44:45,360
even the worksheets.
Yeah, like I think even that
1111
00:44:45,360 --> 00:44:47,680
even those alone is such a wide
range.
1112
00:44:47,680 --> 00:44:48,360
Oh, fair enough.
Yeah, yeah.
1113
00:44:48,360 --> 00:44:50,680
But in terms of the no, because
you're basically saying no
1114
00:44:50,680 --> 00:44:53,120
worksheet's fine.
It's not like half the time like
1115
00:44:53,160 --> 00:44:55,840
if you, if you want a registered
worksheet, like for example,
1116
00:44:55,840 --> 00:44:58,680
even in a maths objective, you
know when you create maths
1117
00:44:58,680 --> 00:45:02,040
questions where in a row there's
a reason why you put them in a
1118
00:45:02,040 --> 00:45:06,240
row.
Like you know, 7 * 8, 70 * 8, 71 * 8,
1119
00:45:06,240 --> 00:45:10,360
71 * 9 and all that, you
know, individual, you could sit
1120
00:45:10,360 --> 00:45:13,240
and work them out, but every
single jump there's a reason why
1121
00:45:13,240 --> 00:45:14,760
you've jumped it and there's
reasoning built in.
1122
00:45:15,120 --> 00:45:17,080
I find it really struggles of
anything like that.
1123
00:45:17,440 --> 00:45:20,680
And if you want a real quality
task for your children, I think
1124
00:45:20,680 --> 00:45:23,120
it's better to create it
yourself sometimes, or get it
1125
00:45:23,120 --> 00:45:25,200
from a real trusted source where
a specialist has done that.
1126
00:45:25,720 --> 00:45:27,560
That's all I'll say about that,
but please crack.
1127
00:45:27,560 --> 00:45:29,800
Oh no, no, I was sort of
banding them together because I
1128
00:45:29,800 --> 00:45:32,120
also, I agree by the way, like
with worksheet generation, go
1129
00:45:32,120 --> 00:45:34,120
back to that for a second.
Even things like I think about
1130
00:45:34,120 --> 00:45:36,600
some of those sort of rich
afternoon lessons we do
1131
00:45:36,600 --> 00:45:40,400
sometimes maybe in science or or
history or something where it's
1132
00:45:40,400 --> 00:45:43,240
not actually in that lesson,
just about like question,
1133
00:45:43,240 --> 00:45:45,080
answer, question, answer,
knowledge, knowledge, knowledge.
1134
00:45:45,080 --> 00:45:47,120
Sometimes it's like a we're
going to go out and we're going
1135
00:45:47,120 --> 00:45:49,800
to stick some leaves in and
we're going to come back and we
1136
00:45:49,800 --> 00:45:51,440
analyse them and we're going to
label them as stuff.
1137
00:45:51,720 --> 00:45:53,880
I felt like AI was rubbish
at doing that.
1138
00:45:53,880 --> 00:45:56,560
It was rubbish at creating that,
thinking of that idea and making
1139
00:45:56,560 --> 00:45:58,280
a worksheet for me because I
definitely tried it lots of
1140
00:45:58,280 --> 00:46:01,080
times and it was always very,
very, very boring stuff.
1141
00:46:01,080 --> 00:46:04,040
So like, yeah, I agree, sort of
three out of 10, it's not the
1142
00:46:04,040 --> 00:46:06,280
best at doing that.
Then with slide generation, this
1143
00:46:06,280 --> 00:46:08,720
one is fascinating because I
feel like it's the it's the
1144
00:46:08,720 --> 00:46:10,200
thing people are trying to crack
at the moment.
1145
00:46:11,480 --> 00:46:14,000
And maybe it has been cracked,
but I'm sort of yet to see it in
1146
00:46:14,000 --> 00:46:16,400
all honesty, that there's lots
of bits of software out there
1147
00:46:16,400 --> 00:46:19,880
that say, oh, just put in some
prompts and or tell it what your
1148
00:46:20,160 --> 00:46:22,720
curriculum is and it will make a
whole lesson for you at the
1149
00:46:22,720 --> 00:46:26,640
PowerPoint.
And I've tried a few and I'll be
1150
00:46:26,640 --> 00:46:29,160
honest, I would not use them in
my class.
1151
00:46:29,360 --> 00:46:32,120
I would not use them.
Like even you could argue or
1152
00:46:32,120 --> 00:46:34,560
maybe it just gives you a
starting point, a base, a base
1153
00:46:34,560 --> 00:46:37,760
point to sort of adapt from.
Even though it was, it was, I
1154
00:46:38,080 --> 00:46:39,400
was clutching at straws to be
honest with you.
1155
00:46:39,400 --> 00:46:43,000
I was thinking, I have to edit
this so much that I might as
1156
00:46:43,000 --> 00:46:45,240
well just start from scratch.
Like this isn't really saved me
1157
00:46:45,240 --> 00:46:47,080
any time.
And if I was to use it, it's
1158
00:46:47,240 --> 00:46:50,760
very very sub optimal lesson in
terms of the resources being put
1159
00:46:50,760 --> 00:46:53,120
in front of kids.
That's just my opinion, I'm sure
1160
00:46:53,120 --> 00:46:54,640
people disagree.
A lot of people are enjoying
1161
00:46:54,640 --> 00:46:56,480
using these resources because
I'm sure it does save them time,
1162
00:46:57,160 --> 00:47:01,640
but at what cost?
So this is I would say we can
1163
00:47:01,640 --> 00:47:04,440
probably round off in terms of
the talk here, because I think
1164
00:47:04,440 --> 00:47:10,560
this is the perfect example of
how saving teachers time and
1165
00:47:10,560 --> 00:47:14,320
school staff time is genuinely
at the forefront of everything
1166
00:47:14,320 --> 00:47:16,320
that we should be using with
tech, right?
1167
00:47:17,400 --> 00:47:20,240
But it needs to be on a level
with it being quality for the
1168
00:47:20,400 --> 00:47:22,440
for the for the students, right?
For the children.
1169
00:47:22,920 --> 00:47:26,880
It has to be equal.
It has to be because I feel like
1170
00:47:26,880 --> 00:47:29,880
a lot of the shortcuts currently
for creating worksheets or
1171
00:47:29,880 --> 00:47:32,640
creating slides.
I'm not saying it won't happen
1172
00:47:32,640 --> 00:47:34,200
in the future.
I think we could get to a point
1173
00:47:34,200 --> 00:47:37,680
where it is very useful, but
currently it's at the detriment
1174
00:47:37,680 --> 00:47:41,120
of the quality of what's being
produced because AI can produce
1175
00:47:41,120 --> 00:47:42,720
slop.
We know that it's still at this
1176
00:47:42,720 --> 00:47:45,240
stage getting better, but we
know it can produce rubbish.
1177
00:47:45,760 --> 00:47:48,680
So when you look at these, I, I
look at it, sometimes I think
1178
00:47:48,680 --> 00:47:52,040
that would not wash in my school
if, if I made, if I made that
1179
00:47:52,040 --> 00:47:53,680
slide, I said, here's the lesson
guys, for everyone.
1180
00:47:53,680 --> 00:47:56,040
That would not pass as a good
enough lesson.
1181
00:47:56,480 --> 00:47:59,320
Not only sometimes in the
generation of lessons that we
1182
00:47:59,320 --> 00:48:02,080
practise, we, we looked at and
tried, not only sometimes were
1183
00:48:02,080 --> 00:48:05,160
the facts just wrong, which was
a whole different conversation
1184
00:48:05,160 --> 00:48:06,680
about how it's scraping the
Internet.
1185
00:48:06,880 --> 00:48:08,600
Sometimes the facts were
literally wrong.
1186
00:48:08,760 --> 00:48:11,480
And as a teacher who's not a
specialist in geography and
1187
00:48:11,480 --> 00:48:14,200
rivers, it could have it could
have scraped something off
1188
00:48:14,200 --> 00:48:16,760
that sounds right.
What, am I gonna research
1189
00:48:16,760 --> 00:48:18,160
everything?
I might as well have just done the research
1190
00:48:18,160 --> 00:48:20,240
myself.
Literally, just the picture that was
1191
00:48:20,240 --> 00:48:22,440
wrong as well.
It was it was a historical
1192
00:48:22,440 --> 00:48:23,680
person.
It was really important that
1193
00:48:23,680 --> 00:48:26,160
this picture was accurate.
Just pulled up like a picture of
1194
00:48:26,760 --> 00:48:29,160
different people and I was like,
oh, this is terrible.
1195
00:48:29,160 --> 00:48:30,200
This is.
Not good.
1196
00:48:30,200 --> 00:48:32,440
It's like it should have been
Winston Churchill, but it was
1197
00:48:32,440 --> 00:48:33,840
bloody, you know?
Margaret Thatcher.
1198
00:48:33,920 --> 00:48:35,880
Yeah, it's something.
I don't think Margaret Thatcher
1199
00:48:35,880 --> 00:48:37,720
was involved in World War 2.
Yeah, or it was someone that
1200
00:48:37,720 --> 00:48:39,600
just looked a bit like Winston
Churchill, and it's like this.
1201
00:48:39,600 --> 00:48:41,320
Oh, like it's clearly scoured the
Internet and thought this was
1202
00:48:41,320 --> 00:48:43,320
this.
But it's not like this is not
1203
00:48:43,400 --> 00:48:44,680
good.
And it's not going to pass
1204
00:48:44,680 --> 00:48:46,680
schools.
Everyone who works in this
1205
00:48:46,680 --> 00:48:49,040
school, we talk about it so much
how schools are obsessed with
1206
00:48:49,040 --> 00:48:51,920
uniformity as well sometimes and
slides having to look the same
1207
00:48:51,920 --> 00:48:53,200
and following the same
structure.
1208
00:48:53,760 --> 00:48:56,080
There's no structure.
There was no structure to those
1209
00:48:56,080 --> 00:48:57,400
lessons.
It was just like a story.
1210
00:48:57,400 --> 00:48:59,960
And it's like, cool if we've got
absolutely no standards in what
1211
00:48:59,960 --> 00:49:03,760
lessons look like, maybe that's
just... So, I think it'll get there.
1212
00:49:03,920 --> 00:49:05,880
In all honesty, with the great
AI changing, I do think we'll
1213
00:49:05,880 --> 00:49:07,360
get there.
It's just that right now I'd say
1214
00:49:07,360 --> 00:49:09,640
it's not the best at doing that
job and it still needs a
1215
00:49:09,640 --> 00:49:10,880
teacher's touch.
Let's get to the big one,
1216
00:49:10,880 --> 00:49:12,000
because I really want to ask you
this one.
1217
00:49:12,920 --> 00:49:17,000
Teaching out of 10, AI replacing
teaching.
1218
00:49:17,280 --> 00:49:21,480
Right.
Right now, on the whole, as me as a
1219
00:49:21,480 --> 00:49:26,320
teacher in the classroom, one. 1,
not 0, but.
1220
00:49:26,320 --> 00:49:30,880
One, no, no, not 0, because you
just have to be honest that
1221
00:49:31,560 --> 00:49:33,600
there are things that AI is
better than me at.
1222
00:49:33,760 --> 00:49:37,320
Yeah, of course there is.
Hannah Frye said anything that
1223
00:49:37,320 --> 00:49:40,200
can be done sat at a computer
will be replaced by AI.
1224
00:49:40,240 --> 00:49:42,880
Yeah, there is some element of a
teacher's job sat behind a
1225
00:49:42,880 --> 00:49:43,960
computer.
We wouldn't have a bloody
1226
00:49:43,960 --> 00:49:46,440
podcast at all.
How often we talked about how
1227
00:49:46,440 --> 00:49:48,400
we're not in this repetitive
point in the class.
1228
00:49:48,720 --> 00:49:50,880
So let's take that away.
Yeah, brilliant.
1229
00:49:50,880 --> 00:49:53,760
That's maybe it's a 2.
But the teaching bit, you are
1230
00:49:53,760 --> 00:49:56,120
standing in front of kids and
you are delivering content to
1231
00:49:56,120 --> 00:49:58,080
them and you are teaching and
you are adapting on the spot and
1232
00:49:58,080 --> 00:49:59,680
you're doing all eight of the
Teachers' Standards.
1233
00:49:59,920 --> 00:50:02,960
That bit of teaching, pastoral
right now.
1234
00:50:03,000 --> 00:50:05,480
Pastoral, OK, all of it.
Yeah, right now is 1 out of 10,
1235
00:50:05,480 --> 00:50:05,920
isn't it?
Yeah.
1236
00:50:06,240 --> 00:50:08,240
What do you think about the
future?
1237
00:50:08,240 --> 00:50:10,320
Because this is where I think we
have had lots of little
1238
00:50:10,320 --> 00:50:12,040
discussion about.
Let's air this out now.
1239
00:50:12,040 --> 00:50:13,960
All right, all right.
Where do you think we're going?
1240
00:50:14,880 --> 00:50:19,880
It is to me, and this is not me
saying this is good to me,
1241
00:50:20,080 --> 00:50:26,160
inevitable, that the job of a
teacher will be replaced by
1242
00:50:26,160 --> 00:50:29,240
artificial intelligence at some
point in the future.
1243
00:50:29,520 --> 00:50:35,720
Are we talking robots or AI
actually doing the entire job,
1244
00:50:36,080 --> 00:50:38,080
teaching the kids, the pastoral
care, all of it.
1245
00:50:38,280 --> 00:50:41,240
Listen, I I don't like this
prediction.
1246
00:50:41,760 --> 00:50:43,120
I don't think this prediction is
good.
1247
00:50:43,160 --> 00:50:46,000
That's fine.
But I do think there will be a
1248
00:50:46,000 --> 00:50:49,920
point where it will be a choice
by governments or whoever is in
1249
00:50:49,920 --> 00:50:55,280
charge to spend more money to
get a real life human in front
1250
00:50:55,280 --> 00:50:57,360
of the class.
And it'll only be as long as the
1251
00:50:57,360 --> 00:51:01,280
resolve is there for humans to
have humans with other humans,
1252
00:51:01,880 --> 00:51:04,280
that that will happen.
I think it will get to a point
1253
00:51:04,280 --> 00:51:08,640
where artificial intelligence is
better at everything that we can
1254
00:51:08,640 --> 00:51:11,280
possibly do.
It's it's literally better at
1255
00:51:11,280 --> 00:51:13,280
building in the same way.
Do you know, different people
1256
00:51:13,280 --> 00:51:14,560
have different strengths, Right?
Yeah.
1257
00:51:14,640 --> 00:51:17,080
And if you lined up 100 people,
you could line them up in the
1258
00:51:17,080 --> 00:51:19,720
order of how emotionally
intelligent they are and how
1259
00:51:20,000 --> 00:51:22,880
empathetic they are and how how
kind they are and how tall they
1260
00:51:22,880 --> 00:51:25,040
are and how angry they get.
And it'll be all different
1261
00:51:25,040 --> 00:51:27,280
orders.
There will come a point where
1262
00:51:27,280 --> 00:51:30,400
artificial intelligence will be
at the top 1% of every single
1263
00:51:30,720 --> 00:51:32,600
one of them, right?
In the same way me and you are
1264
00:51:32,600 --> 00:51:34,760
different.
Artificial intelligence will
1265
00:51:34,760 --> 00:51:37,360
learn what makes us different
and better and worse and be
1266
00:51:37,360 --> 00:51:39,360
better at everything than
humans.
1267
00:51:39,360 --> 00:51:41,240
That will happen.
The intelligence will get to
1268
00:51:41,240 --> 00:51:44,880
such a high level, it will
skyrocket up where it gets to
1269
00:51:44,880 --> 00:51:47,200
the point where it knows
everything and can do everything
1270
00:51:47,200 --> 00:51:49,640
better. And people always say,
You'll never replace a human's
1271
00:51:49,640 --> 00:51:51,880
heart and a human's touch.
I was like, yeah, well, do you
1272
00:51:51,880 --> 00:51:54,000
know what?
There's, I can think of a few
1273
00:51:54,000 --> 00:51:58,400
human emotions and intelligences
that I have that I think you
1274
00:51:58,400 --> 00:52:00,840
could learn to be better at than
me.
1275
00:52:00,960 --> 00:52:04,080
Yeah, yeah, yeah.
So I'm just saying I don't like
1276
00:52:04,080 --> 00:52:07,320
this fact and I don't think it's
necessarily a good thing, but I
1277
00:52:07,320 --> 00:52:10,560
just think that humans are a
bunch of chemicals, chemical
1278
00:52:10,560 --> 00:52:13,240
reactions, neurons.
True.
1279
00:52:13,320 --> 00:52:17,200
There will be a point where we
can replicate that in a robot,
1280
00:52:17,200 --> 00:52:18,520
an artificial.
Empathy for you.
1281
00:52:18,520 --> 00:52:20,520
You never learned that one
that's that's low and that's
1282
00:52:20,520 --> 00:52:22,240
easy.
AI does that now better than
1283
00:52:22,400 --> 00:52:24,800
you.
Literally, that's my point.
1284
00:52:24,800 --> 00:52:26,280
And so I think it will replace
it.
1285
00:52:26,280 --> 00:52:29,840
And I think it's going to come
down to a cost evaluation.
1286
00:52:29,880 --> 00:52:32,240
And eventually there'll be so
much pressure that they'll drop
1287
00:52:32,240 --> 00:52:35,440
you like, look, why, why can we
sit here and say, and in the
1288
00:52:35,440 --> 00:52:38,640
chat people said all the white
collar jobs are going to go, you
1289
00:52:38,640 --> 00:52:39,760
know, all all the different
jobs.
1290
00:52:39,760 --> 00:52:42,040
They're like, you know, lawyers.
There'll be, there'll be an AI bot
1291
00:52:42,040 --> 00:52:44,440
that knows the law inside out and
within 3 seconds can work out
1292
00:52:44,440 --> 00:52:46,880
whether something's right or wrong.
It can look at all of
1293
00:52:46,880 --> 00:52:49,160
the log of every single
conviction ever.
1294
00:52:49,360 --> 00:52:51,840
Get rid of any kind of bias and
just judge someone immediately,
1295
00:52:51,960 --> 00:52:54,240
right.
Better than a group of 12 random
1296
00:52:54,240 --> 00:52:55,600
people from the whole country
can.
1297
00:52:56,120 --> 00:52:58,520
Than what we are, of course it's
better, but it's only going to
1298
00:52:58,520 --> 00:53:01,680
be the resolve of the humans in
the system to say, no, we need
1299
00:53:01,680 --> 00:53:03,560
that human connection.
They need to think of a really
1300
00:53:03,560 --> 00:53:06,040
good argument why we do it
because eventually it will just
1301
00:53:06,040 --> 00:53:09,680
be, well, we're doing it just
'cause that's why.
1302
00:53:09,680 --> 00:53:12,160
And I think there'll come a
point where it's like they're
1303
00:53:12,160 --> 00:53:14,280
just better off you go.
And I think AI will take over the
1304
00:53:14,280 --> 00:53:15,520
world.
God, that's mad, isn't it?
1305
00:53:15,520 --> 00:53:17,840
I, I got a couple of things to
say because I think your, your
1306
00:53:17,960 --> 00:53:19,920
argument is very compelling,
especially when there's infinite
1307
00:53:19,920 --> 00:53:21,440
time involved.
I mean, I'm sort of like, OK,
1308
00:53:21,440 --> 00:53:24,200
maybe in 3000 years that is.
Definitely, I think sooner, but.
1309
00:53:24,240 --> 00:53:26,280
For you think, yeah, for sure.
I think our, our understanding
1310
00:53:26,280 --> 00:53:30,720
of our differences mostly lies
in the timing, but even with the
1311
00:53:30,720 --> 00:53:33,000
government, I, you know, I
always see things on the online
1312
00:53:33,000 --> 00:53:35,920
of being like, oh, this country
is now going to use an AI member
1313
00:53:35,920 --> 00:53:37,600
of cabinet and stuff like this.
And I'm like it.
1314
00:53:37,800 --> 00:53:40,600
It does make sense, probably
long term, for things like
1315
00:53:40,600 --> 00:53:43,520
governing bodies to be a bit
more objective and, you know,
1316
00:53:43,520 --> 00:53:45,640
less likely to be corrupted if
possible.
1317
00:53:45,800 --> 00:53:49,440
So let's hire in the Mark
Zuckerberg and Sam Altman
1318
00:53:49,480 --> 00:53:50,680
company.
I'm not really sure.
1319
00:53:50,680 --> 00:53:52,920
Yeah, don't get me wrong.
I think it's, I don't mean it's
1320
00:53:52,920 --> 00:53:55,240
going to get rid of it, right.
I think it's going to hyper
1321
00:53:55,240 --> 00:53:58,440
concentrate it to a very few
very, very rich people and
1322
00:53:58,440 --> 00:54:01,000
democracy will become a farce.
That's what I think.
1323
00:54:01,200 --> 00:54:04,520
Potentially, but for the same
reason that that's dumb — in
1324
00:54:04,520 --> 00:54:06,200
terms of, like, the corruption
coming from a different way —
1325
00:54:06,200 --> 00:54:07,920
I do think the same thing is
probably going to happen in
1326
00:54:07,920 --> 00:54:10,960
teaching in terms of, OK, let's
get in these two companies
1327
00:54:10,960 --> 00:54:13,800
now that control precisely what
our kids are being taught,
1328
00:54:13,800 --> 00:54:16,080
rather than just, you know, the
pick of all of humanity where
1329
00:54:16,080 --> 00:54:17,960
you get all sorts of teachers
and all sorts of different
1330
00:54:17,960 --> 00:54:20,600
backgrounds, and
children get access to all these
1331
00:54:20,600 --> 00:54:22,840
different teachers across their
career in school.
1332
00:54:23,120 --> 00:54:25,720
I think that paired with the
fact that I think the human
1333
00:54:25,720 --> 00:54:28,080
resolve, we need to give it more
credit than maybe you're
1334
00:54:28,080 --> 00:54:30,480
giving it.
I really really really think
1335
00:54:30,480 --> 00:54:33,720
that even in 100 years when we
have the most unreal AI robots
1336
00:54:33,720 --> 00:54:35,960
that can be in the top 1% at
everything, we're still just
1337
00:54:35,960 --> 00:54:39,080
going to want 4 year olds who
don't understand anything about
1338
00:54:39,080 --> 00:54:42,000
the world yet in front of
another flesh person.
1339
00:54:42,040 --> 00:54:45,200
You know, not an AI person.
I think, I think we'll always
1340
00:54:45,200 --> 00:54:47,320
want that.
I want that too. Want is great —
1341
00:54:47,680 --> 00:54:49,960
it doesn't make something happen.
I, I there are lots of people
1342
00:54:49,960 --> 00:54:52,440
who didn't want self-service
checkouts at Tesco because they
1343
00:54:52,440 --> 00:54:54,680
wanted the human interaction and
going to talk to someone.
1344
00:54:54,840 --> 00:54:57,120
They didn't want one person in
charge of 6 checkouts.
1345
00:54:57,120 --> 00:54:59,200
They wanted one person per
person so they can talk to them.
1346
00:54:59,520 --> 00:55:01,520
Don't matter what you want
because at the end of the day
1347
00:55:01,520 --> 00:55:03,280
it's better for the company,
it's better for money.
1348
00:55:03,280 --> 00:55:05,000
You're going to make money.
We're going to get more people
1349
00:55:05,000 --> 00:55:06,160
through the door.
We're going to get more money
1350
00:55:06,160 --> 00:55:09,360
through the tills if we do it
this way. Want is a great thing.
1351
00:55:09,560 --> 00:55:11,680
I want it too, yeah.
It's not going to happen.
1352
00:55:12,040 --> 00:55:13,520
Do you know what one situation I
could imagine?
1353
00:55:13,520 --> 00:55:15,240
I think you probably told me
this probably where it's come
1354
00:55:15,280 --> 00:55:17,680
from is firstly bigger class
sizes.
1355
00:55:17,680 --> 00:55:19,880
Yeah, because I think that's,
that's the first easy way to cut
1356
00:55:19,880 --> 00:55:21,960
costs of.
OK, look, all right, we have 50%
1357
00:55:21,960 --> 00:55:23,880
real teachers, but 50% AI
teachers and we can.
1358
00:55:23,960 --> 00:55:25,040
It's happening in America right
now.
1359
00:55:25,120 --> 00:55:27,840
There's a school that does that.
We were talking to Lee, Mr P, at BETT,
1360
00:55:27,840 --> 00:55:32,880
and he was saying how in America
there's basically like 100 kids
1361
00:55:32,880 --> 00:55:35,040
in a room being taught something
and then they go off to their
1362
00:55:35,040 --> 00:55:36,840
one to one individual AI tutor.
Right.
1363
00:55:37,160 --> 00:55:38,840
It's happening.
Yeah, yeah, yeah, that's — yeah, a
1364
00:55:38,840 --> 00:55:40,720
very similar example to what I was
thinking of. Imagine like a big
1365
00:55:40,720 --> 00:55:44,800
room where it's almost flipped
like the the teacher is the AI
1366
00:55:44,800 --> 00:55:47,200
robot that's basically just sort
of delivering facts.
1367
00:55:47,200 --> 00:55:49,040
It's not really doing the
empathy thing — even if it's
1368
00:55:49,040 --> 00:55:51,160
really good at it, we're not
actually saying it's
1369
00:55:51,160 --> 00:55:52,240
going to provide the human
touch.
1370
00:55:52,760 --> 00:55:55,240
And then the teaching assistants
are basically just some adults
1371
00:55:55,240 --> 00:55:57,960
dotted around who can do that
human to human interaction that
1372
00:55:57,960 --> 00:56:00,160
I don't think we'll ever, ever,
ever not need.
1373
00:56:00,160 --> 00:56:02,120
I don't think we'll ever get to
a point where we don't actually
1374
00:56:02,120 --> 00:56:04,000
want or need that as a society,
no matter the cost.
1375
00:56:04,480 --> 00:56:07,280
But I can imagine cutting
costs of, OK, yeah, the
1376
00:56:07,280 --> 00:56:09,400
one at the front delivering the
knowledge — like, that can
1377
00:56:09,400 --> 00:56:10,800
be a robot.
And there's just a few adults
1378
00:56:10,800 --> 00:56:12,120
around, just totally different
roles.
1379
00:56:12,120 --> 00:56:13,320
It's not even teaching
assistants anymore.
1380
00:56:13,320 --> 00:56:15,800
It's just the human to human
connection people.
1381
00:56:16,120 --> 00:56:18,760
I don't actually disagree with
anything you're saying.
1382
00:56:19,040 --> 00:56:21,520
I just, I'm just saying I don't
think that will happen.
1383
00:56:22,000 --> 00:56:26,480
Like, like — it's almost as if
the two, the two like parts
1384
00:56:26,480 --> 00:56:28,840
we're having here are:
But I think this is important
1385
00:56:28,840 --> 00:56:30,040
and we should still have that
thing.
1386
00:56:30,400 --> 00:56:33,640
And I'm kind of saying, yeah,
but we won't. Like, I don't, I
1387
00:56:33,640 --> 00:56:36,000
don't disagree with you.
I don't disagree with you at
1388
00:56:36,000 --> 00:56:37,880
all.
I just don't think it will
1389
00:56:37,880 --> 00:56:39,960
happen.
Is it better to have one to one
1390
00:56:40,600 --> 00:56:42,920
scanners at Tesco so you can
talk to someone as you go through
1391
00:56:42,920 --> 00:56:43,800
it?
I've got a problem with this.
1392
00:56:43,800 --> 00:56:45,360
I can talk to you right away.
It's more immediate.
1393
00:56:45,640 --> 00:56:47,600
Is that better for the consumer?
Yes.
1394
00:56:47,800 --> 00:56:51,000
Is it better for the shareholder
of the company? No, because they
1395
00:56:51,000 --> 00:56:52,440
can make more money doing
something else.
1396
00:56:52,520 --> 00:56:55,840
Yeah, yeah, yeah.
The big tech giants will be a
1397
00:56:55,840 --> 00:56:58,200
conglomerate.
Everything will go through them.
1398
00:56:58,320 --> 00:57:00,920
They will take over education as
well as everything else they're
1399
00:57:00,920 --> 00:57:03,480
taking over.
And they will cut, cut, cut, cut
1400
00:57:03,480 --> 00:57:04,560
because there'll only be a few of them
left.
1401
00:57:04,760 --> 00:57:07,600
They will take everything.
It won't be a good thing, but it
1402
00:57:07,600 --> 00:57:10,600
will happen.
And that, that would — give it 30
1403
00:57:10,600 --> 00:57:14,880
years.
Society will look like nothing
1404
00:57:14,880 --> 00:57:17,480
you have ever seen before.
And we're going to have to
1405
00:57:17,480 --> 00:57:19,600
relearn everything, which is
what we started off with.
1406
00:57:19,600 --> 00:57:20,760
The conversation with Hannah
Fry.
1407
00:57:21,160 --> 00:57:23,840
I'm not saying that it's
necessarily a good thing.
1408
00:57:24,080 --> 00:57:27,280
I'm just saying it's a thing.
I mean, yeah, we'll have to
1409
00:57:27,280 --> 00:57:29,360
agree to disagree a little bit.
I don't even really disagree.
1410
00:57:29,360 --> 00:57:30,680
I think that's probably what's
going to happen.
1411
00:57:30,680 --> 00:57:33,720
But I like to think that.
I like to think that human
1412
00:57:33,720 --> 00:57:36,720
resolve will come through a
little bit stronger and, no
1413
00:57:36,720 --> 00:57:39,200
matter how good AI is, we won't
replace, even if it's just
1414
00:57:39,200 --> 00:57:42,000
a tiny bit, that tiny bit of
what teachers bring.
1415
00:57:42,280 --> 00:57:44,440
Still, now we've got people
saying never use a screen ever.
1416
00:57:44,600 --> 00:57:46,680
We're like, oh my God.
Yeah, exactly right, yeah.
1417
00:57:48,200 --> 00:57:49,800
I
wonder how those people are
1418
00:57:49,800 --> 00:57:51,760
going to feel when they, when
they realise that their kid's teacher is
1419
00:57:51,760 --> 00:57:54,880
an AI robot with a massive
screen on its head.
1420
00:57:56,760 --> 00:57:59,000
Those screens, you say?
No technology? You
1421
00:58:00,880 --> 00:58:04,240
know what will happen is
something, some other crazy tech
1422
00:58:04,240 --> 00:58:06,280
development will happen in the
next 10 years that will just be
1423
00:58:06,280 --> 00:58:08,280
wildly different again in the
same way AI disrupted
1424
00:58:08,280 --> 00:58:09,600
everything.
And it'll be something we can't
1425
00:58:09,600 --> 00:58:11,400
even comprehend right now.
And it'll be like, no, it was
1426
00:58:11,400 --> 00:58:13,320
never going to be AI robots.
It's going to be this other
1427
00:58:13,320 --> 00:58:17,560
insane, crazy, futuristic thing.
Joe, we should do at some point
1428
00:58:17,560 --> 00:58:20,680
in the next year, release a
podcast episode that is wholly AI
1429
00:58:21,160 --> 00:58:24,160
and just see who notices.
See who notices.
1430
00:58:24,160 --> 00:58:25,560
Yeah.
And it'd be crazy, wouldn't it,
1431
00:58:25,680 --> 00:58:27,800
if it was this one.
Bye.
1432
00:58:28,400 --> 00:58:28,720
See ya.
Yeah, or again tattoos.
They get something tattooed in
39
00:01:24,400 --> 00:01:26,560
English on their on their arm
and just doesn't make sense.
40
00:01:26,560 --> 00:01:30,240
Loading like in their language
it means really profound about
41
00:01:30,280 --> 00:01:32,120
building up in English is just
loading.
42
00:01:32,240 --> 00:01:35,600
Yeah, yeah, does not have the
same profound nature.
43
00:01:36,360 --> 00:01:38,440
We took a bit of a detour there,
but anyway, today's episode is
44
00:01:38,520 --> 00:01:40,320
all about artificial
intelligence.
45
00:01:40,320 --> 00:01:42,480
Now something we talked about
before, but the reason we want
46
00:01:42,480 --> 00:01:44,560
to talk about it today is
because we've just got back from
47
00:01:44,560 --> 00:01:48,880
BET 2026 and I would say the
biggest thing on my mind when I
48
00:01:48,880 --> 00:01:52,680
finished was sure excitement.
There's lots of real good use
49
00:01:52,680 --> 00:01:54,680
cases for artificial
intelligence now can help
50
00:01:54,680 --> 00:01:58,080
education, but also I, I can't
shake off and I've never been
51
00:01:58,080 --> 00:02:01,080
able to do this in education.
The reality of actually
52
00:02:01,080 --> 00:02:03,360
implementing it and how
realistic it will be and what
53
00:02:03,360 --> 00:02:05,200
will actually happen then in the
future.
54
00:02:05,200 --> 00:02:07,640
So do you want to lay a bit of
context down, a little bit of
55
00:02:07,640 --> 00:02:09,240
our thoughts and what we even
did there?
56
00:02:09,240 --> 00:02:10,440
Who did we hear?
Yeah, yeah, sure.
57
00:02:10,440 --> 00:02:13,240
So like just to sort of jump on
what you said, my first thought
58
00:02:13,440 --> 00:02:15,360
at bet was, is there anything
but AI?
59
00:02:15,600 --> 00:02:18,360
Because there was so much stuff
and even last year we went and
60
00:02:18,360 --> 00:02:20,440
we thought, oh, has a lot of AI
stands popping up now?
61
00:02:20,440 --> 00:02:22,760
Because it was kind of, you
know, the rise of AI, but it
62
00:02:22,760 --> 00:02:26,000
felt like genuinely everything
was about AI there, wasn't it?
63
00:02:26,000 --> 00:02:28,360
Like all of the because we
booked in for a few talks and in
64
00:02:28,360 --> 00:02:31,240
the arena, which is really cool.
And yeah, I looked at loads of
65
00:02:31,240 --> 00:02:33,240
stand and it was just this
everything has AI at the end of
66
00:02:33,240 --> 00:02:35,240
the name AI in the name
somewhere to be like, hey, look
67
00:02:35,280 --> 00:02:36,880
us, we're using AI come and
check us out.
68
00:02:36,880 --> 00:02:40,800
So that I think that's important
context in terms of clearly it's
69
00:02:40,800 --> 00:02:42,640
on everyone's minds.
So one of the one of the biggest
70
00:02:42,640 --> 00:02:45,640
things that happened was Bridget
Phillipson came out and did a
71
00:02:45,640 --> 00:02:47,280
talk and we were in the arena at
the time and.
72
00:02:47,400 --> 00:02:49,320
Education Secretary for anyone
listening who might not be in
73
00:02:49,320 --> 00:02:51,600
the UK, thank.
You for the clarification and
74
00:02:51,600 --> 00:02:52,920
she.
Hated talking by the way.
75
00:02:52,920 --> 00:02:55,040
He's a host of TC Repeat
everyone listening outside of
76
00:02:55,040 --> 00:02:57,080
the normal listeners.
What we might do is every
77
00:02:57,080 --> 00:02:58,800
sentence, if you could.
Just sentence is just a group of
78
00:02:58,800 --> 00:03:00,520
words put together to make a
phrase that makes sense on its
79
00:03:00,520 --> 00:03:01,560
own if you.
Could just summarise what I'm
80
00:03:01,560 --> 00:03:02,600
saying each time, that'd be
great.
81
00:03:02,600 --> 00:03:04,960
Saying is when you're talking.
You've done it in in a certain
82
00:03:04,960 --> 00:03:06,800
tense.
This is horrendous.
83
00:03:06,840 --> 00:03:08,640
I'd be impressed if you can go
on, right.
84
00:03:08,640 --> 00:03:10,920
So yeah, Bridget Phillips came
out and and did did quite a long
85
00:03:10,920 --> 00:03:16,320
talk and a lot of it really was
centred around AI and how she
86
00:03:16,320 --> 00:03:17,360
thinks.
And the government, I guess
87
00:03:17,360 --> 00:03:20,760
think that AI can be a huge
player in solving a lot of the
88
00:03:20,760 --> 00:03:23,520
problems in education.
It's it's quite a wide open
89
00:03:23,520 --> 00:03:24,960
point.
And if you want to sort of nail
90
00:03:24,960 --> 00:03:27,880
down and start anywhere in this.
Yeah, it, it was good.
91
00:03:27,880 --> 00:03:30,720
Do you know what I'll say
genuinely, Bridget Phillips, in
92
00:03:30,720 --> 00:03:35,640
my time as a teacher, I think
it's the most I've I've seen
93
00:03:35,760 --> 00:03:37,320
passion come through.
Yeah, for sure.
94
00:03:37,320 --> 00:03:42,200
And genuine desire for
implementing technology in a way
95
00:03:42,200 --> 00:03:44,760
that's going to help teachers
and just passion for education
96
00:03:44,760 --> 00:03:47,720
in general and specifically
around social mobility and SEND
97
00:03:48,160 --> 00:03:49,840
that comes through in everything
she says.
98
00:03:49,840 --> 00:03:52,920
So like I, I was very, I, I feel
very grateful.
99
00:03:52,960 --> 00:03:54,920
You know, I've seen the merry go
round of education secretaries
100
00:03:54,920 --> 00:03:56,480
over the whole time that I've
been a teacher.
101
00:03:57,000 --> 00:03:59,720
And I do think that her message
is on point and I do think she's
102
00:03:59,720 --> 00:04:01,560
saying the right thing.
So that's, that's a really good,
103
00:04:01,560 --> 00:04:02,680
solid start.
Genuinely.
104
00:04:03,320 --> 00:04:06,800
I, I just feel like sometimes
when she was talking about what
105
00:04:06,800 --> 00:04:09,080
they're doing and implementing,
she announced like 6 or 7
106
00:04:09,080 --> 00:04:10,560
different things.
Yeah, it turned into a bit of a
107
00:04:10,560 --> 00:04:12,400
party political broadcast
really, which, which I
108
00:04:12,400 --> 00:04:13,640
understand.
Obviously she's going to do
109
00:04:13,640 --> 00:04:14,680
that.
It's a good place to announce
110
00:04:14,680 --> 00:04:16,760
these things.
But it, it felt like, yeah,
111
00:04:16,760 --> 00:04:18,120
they're putting money in the
right place.
112
00:04:18,279 --> 00:04:20,680
It's not enough money.
I don't personally think, no.
113
00:04:20,920 --> 00:04:23,880
And the big push about how
technology can transform
114
00:04:23,880 --> 00:04:26,720
education and help teachers, She
was very, very, very clear it's
115
00:04:26,720 --> 00:04:28,760
not about replacing teachers.
We'll come on to that later
116
00:04:28,760 --> 00:04:31,920
because I'm not sure in general
what will happen, but she was
117
00:04:31,920 --> 00:04:34,800
very, very clear about how it's
helping teachers and students,
118
00:04:34,800 --> 00:04:37,000
right and and kids to learn
more.
119
00:04:37,480 --> 00:04:41,720
My worry with it is that the
actual physical hardware and
120
00:04:41,720 --> 00:04:45,680
technology I've got an iPad here
in classrooms does not remotely
121
00:04:45,680 --> 00:04:48,800
match what the vision is for
using it to help children and
122
00:04:48,800 --> 00:04:52,600
teachers alike.
For sure that that to me is the
123
00:04:52,600 --> 00:04:55,360
thing that screamed out that I
I'll just want to get on the
124
00:04:55,360 --> 00:04:57,640
roof to I want to get I want to
start a podcast, you know, where
125
00:04:57,640 --> 00:05:00,040
I've got people listening and
say only if only we had a
126
00:05:00,040 --> 00:05:02,720
platform to sort the people and
just say this is the issue.
127
00:05:03,000 --> 00:05:05,480
Do you think then, because I've
been really thinking about this
128
00:05:05,480 --> 00:05:09,240
since at the first, I was very
much on the side of one line of
129
00:05:09,240 --> 00:05:11,320
thought, which is schools don't
have enough money to get the
130
00:05:11,320 --> 00:05:13,400
tech, to get the iPads, to get
the computers, the laptops,
131
00:05:13,400 --> 00:05:16,440
whatever it is, to be able to
harness the full power of all of
132
00:05:16,440 --> 00:05:17,760
this tech, whether it's AI or
not.
133
00:05:18,560 --> 00:05:22,080
And I've that's where I started
and I'll share my sort of
134
00:05:22,080 --> 00:05:23,560
thoughts now.
But is that what you think?
135
00:05:23,560 --> 00:05:25,120
Do you feel like the problem is
simply this?
136
00:05:25,600 --> 00:05:28,560
Schools aren't being given
enough money to invest in tech?
137
00:05:28,760 --> 00:05:30,200
That's it.
There's multiple things going
138
00:05:30,200 --> 00:05:32,400
on.
Lots of schools have enough
139
00:05:32,400 --> 00:05:34,680
tech, so where do they get the
money from?
140
00:05:34,680 --> 00:05:36,720
Yeah, right.
They got the money daily, right?
141
00:05:36,960 --> 00:05:38,960
I think lots of schools are
making decisions, maybe to spend
142
00:05:38,960 --> 00:05:41,600
money elsewhere.
I think that's also because of
143
00:05:41,600 --> 00:05:44,200
necessity half the time, right?
We've got a bunch of money.
144
00:05:44,200 --> 00:05:47,400
What we're going to spend on,
OK, maybe cover because we need
145
00:05:47,400 --> 00:05:48,880
to actually cover our teachers
properly.
146
00:05:49,000 --> 00:05:51,080
Or maybe I'm going to spend it
on playground equipment because
147
00:05:51,080 --> 00:05:53,160
PE is very important and
physical health is super
148
00:05:53,160 --> 00:05:55,080
important.
And maybe in my catchment area,
149
00:05:55,080 --> 00:05:56,800
actually we're really behind on
a certain thing.
150
00:05:56,920 --> 00:05:58,280
So I'm going to boost some money
into that.
151
00:05:58,560 --> 00:06:02,800
All these decisions are fine and
makes sense, but unless you have
152
00:06:02,800 --> 00:06:06,720
a real clear tech part that is
for tech, money's going to
153
00:06:06,720 --> 00:06:08,520
haemorrhage out of that.
So I don't think it's just
154
00:06:08,520 --> 00:06:10,200
simply saying the government
aren't giving enough money.
155
00:06:10,320 --> 00:06:15,840
I think it's a whole system of
we need to simply have a set of
156
00:06:15,840 --> 00:06:18,280
money and funding going into
schools to make sure tech is up
157
00:06:18,280 --> 00:06:21,320
to date and it's ring fenced and
it's extra and it's, and it's
158
00:06:21,320 --> 00:06:23,600
important to have because it
feels like, again, it's just
159
00:06:23,600 --> 00:06:25,960
another thing that the
government can say is really
160
00:06:25,960 --> 00:06:27,720
important and will really
transform things.
161
00:06:27,720 --> 00:06:29,120
Can you please just make sure
you do it?
162
00:06:29,440 --> 00:06:32,280
It's a, oh, wait, I've got 30
Chromebooks between 100 and 20
163
00:06:32,280 --> 00:06:35,000
kids per year group, and four of
the Chromebooks don't work.
164
00:06:35,000 --> 00:06:36,600
At one point.
It's like, OK, cool.
165
00:06:36,920 --> 00:06:40,360
Let's look at all of the really
good tech options for children
166
00:06:40,360 --> 00:06:43,720
that can help their learning.
We need more of a ratio than
167
00:06:43,720 --> 00:06:45,120
that for it to actually be
effective.
168
00:06:45,200 --> 00:06:48,440
Yeah.
So we have to, we have to make
169
00:06:48,440 --> 00:06:51,640
sure that schools are getting
the technology they need because
170
00:06:51,640 --> 00:06:54,760
I can imagine a school in a
rural area looking at what the
171
00:06:54,760 --> 00:06:57,800
government are saying and
nothing will cut through because
172
00:06:57,800 --> 00:07:00,160
the first thing that score in
the rural area needs is the
173
00:07:00,160 --> 00:07:02,560
funding to have the technology
to be able to access it.
174
00:07:03,040 --> 00:07:04,960
That has to come first.
Yeah, I fully agree.
175
00:07:04,960 --> 00:07:09,720
There's a bit of a catch 22 as
well in terms of the tech when
176
00:07:09,720 --> 00:07:12,080
when schools don't have enough
tech, sorry, don't have enough
177
00:07:12,080 --> 00:07:14,640
devices to use the tech.
Sometimes that can actually make
178
00:07:14,720 --> 00:07:17,160
the tech they're trying to use
genuinely A nuisance.
179
00:07:17,160 --> 00:07:21,360
And then, you know, it's like,
unless you have the full set of
180
00:07:21,360 --> 00:07:24,440
equipment, the tech is actually
less effective.
181
00:07:24,600 --> 00:07:26,520
It's annoying because like you
said, like things are not
182
00:07:26,520 --> 00:07:28,640
working on one class they've got
going on a rota now and then
183
00:07:28,640 --> 00:07:31,080
they're just it's annoying to
use and that just isn't good
184
00:07:31,080 --> 00:07:32,880
because this is great tech
solutions that people are then
185
00:07:33,240 --> 00:07:34,480
going, Oh, this is rubbish.
Doesn't work.
186
00:07:34,480 --> 00:07:37,160
It's not that it doesn't work,
it's just that you don't have
187
00:07:37,160 --> 00:07:39,040
the right equipment to use it.
So that's one point.
188
00:07:39,120 --> 00:07:41,880
And then secondly, yeah, I kind
of basically I agree what you
189
00:07:41,880 --> 00:07:43,920
said about the tech pot of
money.
190
00:07:44,280 --> 00:07:45,840
It has to, it has, it's the only
way.
191
00:07:46,040 --> 00:07:48,360
Because I look at when you look
at some schools where you say,
192
00:07:48,360 --> 00:07:50,480
well, they've managed to do the
tech, they've managed to get all
193
00:07:50,480 --> 00:07:52,080
of the equipment in their
school.
194
00:07:52,760 --> 00:07:54,440
So therefore clearly all schools
can do it.
195
00:07:54,440 --> 00:07:56,560
I hate that generalisation
because I'm like, we don't know.
196
00:07:56,560 --> 00:07:58,440
You don't know what that school
prioritised.
197
00:07:58,440 --> 00:08:01,360
Yeah, they might have cut back
on other things that when you
198
00:08:01,360 --> 00:08:03,040
find out about what they cut
back on, you think, oh, you
199
00:08:03,040 --> 00:08:04,960
can't cut that.
That's a significant thing.
200
00:08:04,960 --> 00:08:06,720
You're just looking at the fact
that they have the equipment.
201
00:08:07,040 --> 00:08:09,880
So it has to be regardless of
what schools already have or
202
00:08:09,880 --> 00:08:12,480
don't have right now.
I feel like there has to be a
203
00:08:12,480 --> 00:08:15,280
top down approach of we are
going to equip all of our
204
00:08:15,280 --> 00:08:18,480
schools with enough equipment so
that all of this stuff we're
205
00:08:18,480 --> 00:08:21,040
saying about tech can actually
be implemented properly because
206
00:08:21,040 --> 00:08:23,640
one without the other doesn't
really make any sense.
207
00:08:23,640 --> 00:08:25,520
And that's when it comes back
around to, and we've mentioned
208
00:08:25,520 --> 00:08:27,320
this before in the past, lip
service.
209
00:08:27,480 --> 00:08:31,720
When people, governments,
whatever policy makers say stuff
210
00:08:31,880 --> 00:08:34,440
and it sounds great.
There's a glaringly obvious
211
00:08:34,440 --> 00:08:36,400
problem that could be solved
that isn't solved.
212
00:08:36,400 --> 00:08:38,200
That's when I go, oh, OK, this,
this is meaningless.
213
00:08:38,200 --> 00:08:40,280
And this is just lip service.
You're, you're not actually,
214
00:08:40,400 --> 00:08:42,200
you're not actually trying to
solve this problem because, you
215
00:08:42,200 --> 00:08:46,280
know, you know, these schools
have no equipment and you're
216
00:08:46,280 --> 00:08:49,280
saying tech is the solution and
then going blind eye walk away.
217
00:08:49,320 --> 00:08:51,280
Yeah, that's just not good to
me.
218
00:08:51,360 --> 00:08:53,680
We can solve teacher burnout
because it's going to do lots of
219
00:08:53,680 --> 00:08:55,240
marking for you.
Isn't that fantastic?
220
00:08:55,760 --> 00:08:57,440
Oh, really?
Wait, can we have the thing to
221
00:08:57,440 --> 00:08:59,080
do on?
No, no, no.
222
00:08:59,160 --> 00:09:00,480
OK, so it's not.
Get it from your existing
223
00:09:00,480 --> 00:09:02,040
budgets.
We don't have our existing
224
00:09:02,040 --> 00:09:03,200
budgets already completely
stretched.
225
00:09:03,200 --> 00:09:06,000
And if we if we would have to
fire ATA to do this and we don't
226
00:09:06,000 --> 00:09:08,120
really want to do that, it's
ethical to consideration the
227
00:09:08,120 --> 00:09:10,560
other school did it.
Yeah, OK, that's up to them.
228
00:09:10,560 --> 00:09:12,040
Like right.
Can we all just have the same
229
00:09:12,040 --> 00:09:14,960
part for this thing?
Can you just say here how much
230
00:09:14,960 --> 00:09:15,640
you need?
Here you go.
231
00:09:15,640 --> 00:09:17,680
Here's an iPad for every kid.
That is such a good point you
232
00:09:17,680 --> 00:09:19,480
made about how when you're
discussing something in
233
00:09:19,480 --> 00:09:21,640
particular, it's easy to make
comparisons between schools.
234
00:09:21,800 --> 00:09:23,440
But schools are so much more
complex than that.
235
00:09:23,440 --> 00:09:27,920
Like you said that school with
one iPad per kid very much may
236
00:09:27,920 --> 00:09:30,800
well not have any support stuff.
They they may have made that
237
00:09:30,800 --> 00:09:31,520
decision.
Right.
238
00:09:31,640 --> 00:09:33,440
And I'm not saying that's the
right way to go either, because
239
00:09:33,440 --> 00:09:35,840
I think people can sometimes
think, oh, well, you're saying
240
00:09:35,840 --> 00:09:37,680
one tech, more tech in the
classroom.
241
00:09:37,680 --> 00:09:39,720
That means at this expense
that's bad, you're terrible.
242
00:09:39,920 --> 00:09:42,040
No, it's not what I'm saying.
I'm just saying that as a
243
00:09:42,040 --> 00:09:44,240
baseline, we need to have that.
There's there's something else
244
00:09:44,240 --> 00:09:47,520
that really jumped out at me
that I find fascinating is the
245
00:09:47,520 --> 00:09:51,280
political problem.
Like I genuinely understand how
246
00:09:51,280 --> 00:09:54,080
it's easy for us to sit here and
say, well, why don't you just do
247
00:09:54,080 --> 00:09:54,880
that?
Isn't that easy.
248
00:09:54,880 --> 00:09:56,800
If you don't go on politicians,
why don't you just give everyone
249
00:09:56,800 --> 00:09:59,680
an iPad, etcetera.
They must be constantly walking
250
00:09:59,680 --> 00:10:03,600
a rope of appeasing people.
At the end of the day, they're
251
00:10:03,600 --> 00:10:06,040
there to serve the people.
And there is a significant
252
00:10:06,040 --> 00:10:10,520
portion of society and it's
growing who look at tech and say
253
00:10:10,520 --> 00:10:13,200
tech in schools and screens in
schools equals bad.
254
00:10:13,640 --> 00:10:16,360
So in the one breath you've got
the government saying how
255
00:10:16,360 --> 00:10:18,160
important tech is for the
classroom.
256
00:10:18,640 --> 00:10:20,560
I agree, by the way, because
they're going to enter the real
257
00:10:20,560 --> 00:10:23,760
world, which is so tech heavy.
We have to make sure that
258
00:10:23,760 --> 00:10:26,440
they're learning how to actually
use these devices properly and
259
00:10:26,440 --> 00:10:28,200
also just like get the most out
of them, right?
260
00:10:28,560 --> 00:10:31,720
But then at the same time, we're
explaining how it's so important
261
00:10:31,720 --> 00:10:32,960
to do that.
Tech's really important.
262
00:10:32,960 --> 00:10:34,640
It's going to save teachers time
in the classroom.
263
00:10:34,640 --> 00:10:36,880
Kids need to use it and and use
it well.
264
00:10:37,480 --> 00:10:39,920
At the same time, you've got
people saying no screens in the
265
00:10:39,920 --> 00:10:42,400
classroom, technology, what my
kid has any technology in the
266
00:10:42,400 --> 00:10:44,000
classroom, I'll be taking them
out of that school.
267
00:10:44,000 --> 00:10:46,760
Then that's awful for them and
you've got the government
268
00:10:46,760 --> 00:10:49,040
releasing a device on screen
time, etcetera.
269
00:10:49,040 --> 00:10:50,960
In the early years, they're
going to be releasing, as
270
00:10:50,960 --> 00:10:52,880
Bridget was saying, they're
going to extend that and give
271
00:10:53,120 --> 00:10:55,600
advice for infants, give advice
for primary, give advice for
272
00:10:55,600 --> 00:10:58,360
secondary, so that that's going
to be given, right?
273
00:10:58,680 --> 00:11:00,720
So it'd be really, really
interesting to see how this
274
00:11:00,720 --> 00:11:04,040
tightrope is kind of walked
because my genuine opinion is we
275
00:11:04,040 --> 00:11:07,040
need way more tech in schools.
But I do think there'll be a big
276
00:11:07,040 --> 00:11:09,960
push back from a, from a big,
big section of societies.
277
00:11:10,040 --> 00:11:11,000
Like how do you think about
that?
278
00:11:11,160 --> 00:11:13,840
I do wonder if that guidance
they're going to bring out, you
279
00:11:13,840 --> 00:11:15,720
know, the, the screen time
guidance, which I starting with
280
00:11:15,720 --> 00:11:17,440
sort of early years and then
they're going to, she said.
281
00:11:17,440 --> 00:11:18,280
They're going to work their way
out.
282
00:11:19,240 --> 00:11:21,800
It could be a really big force
for good, you know, yeah, I
283
00:11:21,800 --> 00:11:24,280
think it could be the
government's way of very nicely
284
00:11:24,280 --> 00:11:27,640
saying to parents and people who
are more carers, whoever who are
285
00:11:27,640 --> 00:11:29,840
sort of anti screen time
completely and can't see the
286
00:11:29,840 --> 00:11:33,160
nuance between YouTube algorithm
brain rot and actual educational
287
00:11:33,160 --> 00:11:34,600
software that's really helpful
like Mapsu.
288
00:11:34,800 --> 00:11:36,640
Like Mapsu, you know can't see
the difference just.
289
00:11:36,640 --> 00:11:38,760
Randomly pick one out there.
Yeah, he just thought of one on
290
00:11:38,760 --> 00:11:39,240
top of.
Your head.
291
00:11:39,240 --> 00:11:41,360
Yeah, massive.
Yeah, exactly.
292
00:11:41,360 --> 00:11:44,000
Demo down below.
But the way you're clever,
293
00:11:44,000 --> 00:11:46,400
you're not into this.
So but what, what that guidance
294
00:11:46,480 --> 00:11:49,480
might end up doing is actually
saying, Oh yeah, a little nod to
295
00:11:49,480 --> 00:11:51,280
it, yet all too much screen time
is bad.
296
00:11:51,720 --> 00:11:55,600
But and then really from a top
down policy makers perspective,
297
00:11:55,800 --> 00:11:59,080
just say, but you do just need
to stop putting it all in the
298
00:11:59,080 --> 00:12:00,960
same category though, because
this is actually good.
299
00:12:01,200 --> 00:12:03,160
Government is saying this is OK,
this is fine.
300
00:12:03,160 --> 00:12:04,440
Can we please stop panicking
about it?
301
00:12:04,760 --> 00:12:07,160
And yeah, this stuff that we all
think is bad is is, funnily
302
00:12:07,160 --> 00:12:10,880
enough, bad for kids.
Strangely enough, yes, it's
303
00:12:10,880 --> 00:12:13,240
unsupervised, isn't it?
At the hands of the tech giants,
304
00:12:13,240 --> 00:12:14,640
yeah.
Your attention because I think
305
00:12:14,640 --> 00:12:17,240
people worried that this
guidance is going to come out
306
00:12:17,240 --> 00:12:19,560
and just be like pandering to
parents and carers who think
307
00:12:19,560 --> 00:12:21,800
this and say Oh yeah, screen
times terrible.
308
00:12:21,800 --> 00:12:23,800
We're going to start banning
screens in schools now.
309
00:12:23,800 --> 00:12:26,400
I think people worrying, it's
going to be that, like, I'd be
310
00:12:26,400 --> 00:12:27,760
very surprised.
It's not, it's not if that's
311
00:12:27,760 --> 00:12:29,280
what it was.
And also this is the thing,
312
00:12:29,280 --> 00:12:30,920
there's no nuances.
This is what I mean about.
313
00:12:30,920 --> 00:12:32,760
It's a tightrope because I get
what you're saying, right?
314
00:12:33,000 --> 00:12:35,320
It could be, it could be a
blessing in disguise where it
315
00:12:35,320 --> 00:12:37,240
brings everyone together like
it's lovely and roses.
316
00:12:37,400 --> 00:12:39,320
What will what will actually
happen is both sides will think
317
00:12:39,320 --> 00:12:41,560
it doesn't go far enough for
their side and both sides will
318
00:12:41,560 --> 00:12:43,240
hate it.
It's like the guidance that was
319
00:12:43,240 --> 00:12:45,160
released about earlier screen
time, right?
320
00:12:45,520 --> 00:12:49,800
You had some people who said it,
it should completely say no
321
00:12:49,800 --> 00:12:53,160
screen time ever for children.
That's ridiculous, isn't awful.
322
00:12:53,320 --> 00:12:55,800
And it said actually it compared
between 5 hours a day and 45
323
00:12:55,800 --> 00:12:58,360
minutes a day, even 45 minutes a
day is awful and they weren't
324
00:12:58,360 --> 00:13:00,320
happy right.
Then the other side, you had
325
00:13:00,320 --> 00:13:03,320
children of parents of special
educational needs, children who
326
00:13:03,320 --> 00:13:07,360
say that the screen time is is a
real device they can use to help
327
00:13:07,360 --> 00:13:08,920
their children actually get
through the day.
328
00:13:09,360 --> 00:13:10,920
And they were annoyed because
they were saying you're not
329
00:13:10,920 --> 00:13:12,440
thinking about SEND children in
this.
330
00:13:12,720 --> 00:13:14,760
And I just look at them,
everyone in this situation, I
331
00:13:14,760 --> 00:13:16,600
just go it's a guidance
document.
332
00:13:17,120 --> 00:13:20,480
Do you know what guidance is?
Guidance isn't every single
333
00:13:20,480 --> 00:13:22,400
person in the world to follow
this all of the time.
334
00:13:22,560 --> 00:13:25,800
This is just some advice.
What do you want the government
335
00:13:25,800 --> 00:13:28,160
to do in this situation?
Because if they wholly side with
336
00:13:28,160 --> 00:13:30,120
you, they're going to alienate
99% of people.
337
00:13:30,360 --> 00:13:33,200
This isn't for an individual,
this is guidance.
338
00:13:33,520 --> 00:13:35,720
It's best practise for most
children.
339
00:13:35,960 --> 00:13:37,320
It's not hitting everything at
once.
340
00:13:37,600 --> 00:13:41,800
And I just worry that as we
extend this up, anyone who isn't
341
00:13:41,800 --> 00:13:45,120
on the exact one percentile of
what the guidance says will
342
00:13:45,120 --> 00:13:46,360
think it's too far the other
way.
343
00:13:46,400 --> 00:13:49,320
I'm not quite right and it will
end up making more people angry
344
00:13:49,480 --> 00:13:51,920
than actually making them feel
better about it.
345
00:13:51,920 --> 00:13:53,840
And I just think that's what I
mean when I say it's a
346
00:13:53,840 --> 00:13:55,840
tightrope.
And I think what the government
347
00:13:55,840 --> 00:13:59,040
has to do and what I would love
to see more of from politicians
348
00:13:59,040 --> 00:14:02,920
in general is have an ideology
that's backed up in evidence and
349
00:14:02,920 --> 00:14:05,080
research and mean something and
go with it.
350
00:14:05,480 --> 00:14:08,200
And then spend their time trying
to argue and convince people why
351
00:14:08,200 --> 00:14:11,080
that's the right thing to do
rather than trying to pander to
352
00:14:11,080 --> 00:14:12,360
everyone and make everyone feel
happy.
353
00:14:12,520 --> 00:14:15,120
You will never succeed at that.
So why don't you just go for
354
00:14:15,120 --> 00:14:17,200
something you actually believe
in and try and convince people
355
00:14:17,200 --> 00:14:19,360
of it?
You got voted in for a reason.
356
00:14:19,560 --> 00:14:22,400
Just go forward and do it and
explain why it's a force for
357
00:14:22,400 --> 00:14:23,440
good.
And you know what?
358
00:14:23,440 --> 00:14:25,360
If it doesn't work in the long
run, you'll be out of office
359
00:14:25,360 --> 00:14:27,000
next time.
Yeah, definitely, definitely.
360
00:14:27,200 --> 00:14:28,960
I like your point about
guidance.
361
00:14:28,960 --> 00:14:30,520
Can't please everyone.
It's true because it has to be
362
00:14:30,520 --> 00:14:32,400
for the average person, like,
like there's always.
363
00:14:32,440 --> 00:14:34,160
Even that sounds bad.
You mean although people listen
364
00:14:34,160 --> 00:14:36,000
to you, right?
Very quickly because you didn't
365
00:14:36,000 --> 00:14:38,240
mean it in a bad way.
But even you saying the average
366
00:14:38,240 --> 00:14:40,160
person made me think of, oh, am
I not an average person?
367
00:14:40,720 --> 00:14:41,640
Oh, my God.
Yeah.
368
00:14:41,640 --> 00:14:43,080
No, I mean that, you know,
automatically.
369
00:14:43,080 --> 00:14:44,960
Yeah, No, I know what you mean.
But there'll be people listening
370
00:14:44,960 --> 00:14:46,840
about as my child, not an
average child, just because.
371
00:14:46,840 --> 00:14:47,960
They've got this.
Yeah, it's not.
372
00:14:48,240 --> 00:14:51,000
What, you mean it's just, yeah,
I don't know how to explain it a
373
00:14:51,000 --> 00:14:52,640
different way.
The graph I'm thinking of in my
374
00:14:52,640 --> 00:14:55,320
head in terms of like, everyone
is in this graph somewhere and
375
00:14:55,320 --> 00:14:57,360
there's always outliers in every
graph ever.
376
00:14:57,360 --> 00:14:58,080
Yeah.
Ever.
377
00:14:58,200 --> 00:14:59,720
Yeah.
Like when you've got X&Y this
378
00:14:59,720 --> 00:15:02,120
way, there is going to be
someone, yeah, who's right up
379
00:15:02,120 --> 00:15:03,000
here.
And there's gonna be someone
380
00:15:03,000 --> 00:15:05,080
who's right down here.
And they are outliers to where
381
00:15:05,080 --> 00:15:08,040
the general bulk of people are.
And guidance generally comes in
382
00:15:08,040 --> 00:15:09,880
to say, OK, we've got the big
bulk of people here.
383
00:15:09,880 --> 00:15:12,320
Yeah, this is generally
guidance, but we're aware that
384
00:15:12,320 --> 00:15:14,480
this might not be suitable for
every single individual.
385
00:15:14,480 --> 00:15:16,800
And the fact that people don't
understand that I'm like, I
386
00:15:16,800 --> 00:15:18,920
don't, yeah, you can't win.
There is no win because you
387
00:15:18,920 --> 00:15:20,000
can't please every single
person.
388
00:15:20,360 --> 00:15:21,760
So let's talk that that was
Bridget Phillipson.
389
00:15:21,760 --> 00:15:23,960
So she did, Yeah.
Really interesting chat, very AI
390
00:15:23,960 --> 00:15:25,600
focused.
Very tech focused, very
391
00:15:25,600 --> 00:15:27,000
interesting in your opinion on
the whole.
392
00:15:27,000 --> 00:15:30,920
Then you finish that chat, you
know the government starts and
393
00:15:30,920 --> 00:15:32,920
stuff.
How would you rate it?
394
00:15:32,920 --> 00:15:34,800
What do you feel about is it?
Is it a solid position?
395
00:15:34,800 --> 00:15:36,040
Do you agree with it in general?
What?
396
00:15:36,040 --> 00:15:37,880
What's your feeling?
Overall, quite positive.
397
00:15:37,880 --> 00:15:39,400
To be honest with you.
I, I didn't really feel
398
00:15:39,400 --> 00:15:41,040
negative.
I, I, I felt like all the things
399
00:15:41,040 --> 00:15:44,520
she was saying, it was a fairly
strong position of I think she
400
00:15:44,520 --> 00:15:47,440
could have pandered more to Oh
yeah, no, tech is bad in schools
401
00:15:47,440 --> 00:15:49,080
because of screen time.
She didn't do that at all.
402
00:15:49,080 --> 00:15:51,520
I felt like she was very much
saying, no, actually tech is
403
00:15:51,520 --> 00:15:53,520
really good and we're actually
we're going to push more.
404
00:15:53,520 --> 00:15:55,480
I think you know, there was and
you find the right thing
405
00:15:55,520 --> 00:15:56,520
allocated.
Yeah, absolutely.
406
00:15:56,520 --> 00:15:58,200
I think it is the right thing
because like you said, we live
407
00:15:58,200 --> 00:16:00,000
in a tech heavy world.
It's only going to get more
408
00:16:00,000 --> 00:16:01,600
techie.
I don't know what world people
409
00:16:01,600 --> 00:16:03,240
are living in where they think
that we're suddenly going to
410
00:16:03,240 --> 00:16:05,840
regress back in terms of tech
and how much is in our lives.
411
00:16:05,840 --> 00:16:07,560
It's not happening.
It's just not happening.
412
00:16:08,000 --> 00:16:12,160
And we can use it to for so much
good, so much good, not just in
413
00:16:12,160 --> 00:16:13,840
education across, you know, the
world.
414
00:16:13,880 --> 00:16:16,960
You see, you see loads of good
uses of AI and tech in, in the
415
00:16:16,960 --> 00:16:20,880
world of medicine and, you know,
and in education, obviously it
416
00:16:20,880 --> 00:16:23,760
can be used for good as well.
Like, I don't know, I don't know
417
00:16:23,760 --> 00:16:25,360
how it's so black and white to
me.
418
00:16:25,600 --> 00:16:28,160
So of course it was a good move.
There's so much in my life as an
419
00:16:28,160 --> 00:16:32,160
adult that tech has made better
that I would hate it to have a
420
00:16:32,160 --> 00:16:34,640
lobby group in that certain area
go no, no, it's too much.
421
00:16:34,800 --> 00:16:35,600
Yeah, yeah, Yeah.
Well.
422
00:16:35,840 --> 00:16:38,400
Oh, actually, that's made me
have my screening more accurate
423
00:16:38,400 --> 00:16:39,920
now for my health, actually.
Exactly.
424
00:16:39,920 --> 00:16:42,360
So that sounds quite good.
Yeah, I get it.
425
00:16:42,360 --> 00:16:43,680
With children, we want to
protect children.
426
00:16:43,800 --> 00:16:45,320
That's that's our number one aim
as educators.
427
00:16:45,320 --> 00:16:47,840
We always say if you were to ask
me my role as a teacher, my
428
00:16:47,840 --> 00:16:49,440
number one thing was always
safety of children.
429
00:16:49,440 --> 00:16:52,240
So I, I please don't think I'm
not taking that seriously, but I
430
00:16:52,240 --> 00:16:56,240
just think that you, you, that's
why I think you.
431
00:16:56,240 --> 00:16:58,560
Can do it, Yeah, yeah.
That's why I think it's like,
432
00:16:58,560 --> 00:17:01,080
without sounding common control
tech, it's not actually that
433
00:17:01,080 --> 00:17:05,480
hard to ensure the tech is safe.
Like in in all seriousness, is
434
00:17:05,480 --> 00:17:08,720
doing 10 minutes of your
practise on a maths app in
435
00:17:08,720 --> 00:17:11,359
school with your teachers like
actually watching over you and
436
00:17:11,359 --> 00:17:13,319
monitoring it and setting it for
you and then when you finish
437
00:17:13,319 --> 00:17:15,599
they give you some intervention.
Are you are you?
438
00:17:15,680 --> 00:17:17,599
Do you?
Do you really going to try and
439
00:17:17,599 --> 00:17:18,560
convince me?
That that's bad.
440
00:17:18,560 --> 00:17:21,920
Yeah, yeah, genuinely.
Or that or that it's no better
441
00:17:21,920 --> 00:17:25,480
than having a a sheet in front
of them like and it's so easy to
442
00:17:25,480 --> 00:17:27,960
say what number one, the two
things to me, everything now of
443
00:17:27,960 --> 00:17:29,720
tech, especially now we're
building a mass platform, right.
444
00:17:29,920 --> 00:17:31,560
Is does it save the teacher
time?
445
00:17:31,760 --> 00:17:33,520
That's what it's like a separate
thing that's just great.
446
00:17:33,520 --> 00:17:34,920
Anyway, does it save the teacher
time?
447
00:17:34,920 --> 00:17:35,680
Yes.
Cool.
448
00:17:35,680 --> 00:17:37,960
They get instant live gap
analysis, for example, with like
449
00:17:37,960 --> 00:17:40,040
with Mapsu or instant generation
of multiple questions.
450
00:17:40,040 --> 00:17:41,640
I don't know.
Or the ability to see an
451
00:17:41,640 --> 00:17:43,240
intervention group.
I'm not trying, I promise.
452
00:17:43,240 --> 00:17:44,440
I'm turning to a sales pitch,
but it is.
453
00:17:44,520 --> 00:17:46,520
It is.
Doing all the things that the
454
00:17:46,520 --> 00:17:49,600
tech should be good for.
And then what, you're I'm going
455
00:17:49,600 --> 00:17:51,680
to tear you up here because a
lot of people say that's great,
456
00:17:51,680 --> 00:17:52,800
but it's at the expense of the
kids.
457
00:17:52,960 --> 00:17:55,720
Yeah, and it's not because if it
can save the teachers time and
458
00:17:55,720 --> 00:17:58,080
it's also literally better for
the kids because it's more, it's
459
00:17:58,080 --> 00:17:59,400
more personalised, it's more
adaptive.
460
00:17:59,400 --> 00:18:03,120
It can do things that one person
with a with pen and paper just
461
00:18:03,120 --> 00:18:04,920
can't do quick enough.
You might be able to do it if
462
00:18:04,920 --> 00:18:06,600
you have three hours, but you
don't have three hours.
463
00:18:06,600 --> 00:18:09,120
You're in the middle of a
lesson, like if it's better for
464
00:18:09,120 --> 00:18:13,400
the children and it's better and
faster and gives more deeper
465
00:18:13,400 --> 00:18:15,400
analysis, stuff that you could
never get as a teacher and saves
466
00:18:15,400 --> 00:18:17,680
you time.
It's a no brainer.
467
00:18:17,880 --> 00:18:20,680
So the only barrier at that
point is the kids haven't got
468
00:18:20,680 --> 00:18:23,600
the, the, the equipment.
So that's why full circle
469
00:18:23,600 --> 00:18:25,480
finishing this conversation
about Bridget Phillips and her
470
00:18:25,480 --> 00:18:28,920
and her chat.
That's why to me it's, it's been
471
00:18:28,920 --> 00:18:30,960
out like this stuff's been
around for like 20 years.
472
00:18:30,960 --> 00:18:32,640
Do you know what I mean?
Like iPads, It's not like it's
473
00:18:32,640 --> 00:18:33,960
brand new and it's like, oh,
we're working on it.
474
00:18:33,960 --> 00:18:35,640
We're going to start getting
that funding together for this
475
00:18:35,640 --> 00:18:37,560
new tech, for this new equipment
in schools.
476
00:18:38,200 --> 00:18:40,280
No, no, no, no, no, no, no.
Computers have been around for
477
00:18:40,280 --> 00:18:43,480
like 30 years in schools. iPads
have honestly been around for at
478
00:18:43,480 --> 00:18:45,520
least 15 years in schools.
Like what are we talking about?
479
00:18:45,520 --> 00:18:47,720
Come on.
It's it's been so long now that
480
00:18:48,040 --> 00:18:50,200
there isn't really much excuse.
It's just OK, let's just let's
481
00:18:50,200 --> 00:18:52,200
just get it done.
Yeah, get it done, get it done.
482
00:18:52,480 --> 00:18:54,640
And then from there the kids
will benefit.
483
00:18:54,640 --> 00:18:55,760
Absolutely.
I completely agree.
484
00:18:55,760 --> 00:18:58,080
I'm I'm very excited by it.
I just think that has to come
485
00:18:58,080 --> 00:19:00,560
first.
It just has to second thing then
486
00:19:00,560 --> 00:19:02,360
before we go on to our.
So the main bulk, what we'll do
487
00:19:02,360 --> 00:19:06,040
later is we'll talk from a more,
I suppose, practical point of
488
00:19:06,040 --> 00:19:09,720
view from teaching how useful AI
is for different jobs around,
489
00:19:09,720 --> 00:19:11,920
around the classroom.
What, what, whether it's a good
490
00:19:11,920 --> 00:19:13,360
thing or a bad thing, come on to
that.
491
00:19:13,360 --> 00:19:16,480
But one thing I wanted to talk
about very quickly was what I
492
00:19:16,480 --> 00:19:19,880
found one of the most
captivating talks I've actually
493
00:19:19,880 --> 00:19:22,640
listened to in a long time where
I was really like really on the
494
00:19:22,640 --> 00:19:24,440
edge of my seat listening to
every word they were saying.
495
00:19:24,960 --> 00:19:28,640
And it was Amor Rajam from who
hosts University Challenge.
496
00:19:28,640 --> 00:19:32,000
Yes, an amazing journalist, does
BBC as well.
497
00:19:32,000 --> 00:19:35,360
And he's got a podcast a minute
called Radical where he's going
498
00:19:35,360 --> 00:19:37,960
in depth about people who like
being radical in certain things.
499
00:19:37,960 --> 00:19:39,560
That was one about foster carers
recently.
500
00:19:39,880 --> 00:19:41,920
He's got Jonathan Height coming
on who did the anxious
501
00:19:41,920 --> 00:19:44,120
generation talking about
smartphones, etcetera.
502
00:19:44,120 --> 00:19:47,480
So I'm really interested and you
can just tell Amor really goes
503
00:19:47,480 --> 00:19:49,760
into depth about the things he's
talking to someone about.
504
00:19:50,200 --> 00:19:52,600
And he's such a powerful
interviewer because he knows it
505
00:19:52,600 --> 00:19:55,560
really well, but understands
that there's an expert over the
506
00:19:55,560 --> 00:19:56,760
way.
And he was talking to Hannah
507
00:19:56,760 --> 00:19:58,840
Fry.
Everyone knows Hannah Fry, Fry
508
00:19:58,840 --> 00:20:00,400
squared.
She's you're a big fan.
509
00:20:00,560 --> 00:20:02,400
Huge.
Fan just going back for a second
510
00:20:02,400 --> 00:20:04,360
just gassing up Hannah Fryer.
She's awesome.
511
00:20:04,760 --> 00:20:07,200
I remember probably around the
same same time we found it,
512
00:20:07,200 --> 00:20:08,600
there was a YouTube channel
called Number File.
513
00:20:08,680 --> 00:20:10,720
Still is number file if you're
into maths.
514
00:20:11,440 --> 00:20:12,920
Which we are.
Did you and you just?
515
00:20:13,160 --> 00:20:14,840
You just like nerding out about
maths things.
516
00:20:14,840 --> 00:20:17,240
It's just cool math stuff.
Number file is a great channel
517
00:20:17,240 --> 00:20:19,560
and Hannah Fry, she think she
still might do to be there but
518
00:20:19,560 --> 00:20:22,560
she there was certainly a period
of time like 78910 years ago
519
00:20:22,720 --> 00:20:24,960
where she did loads of videos
with number file.
520
00:20:24,960 --> 00:20:27,320
And that is originally probably
about 10 years ago how I got
521
00:20:27,320 --> 00:20:29,160
introduced to her and I remember
watching the video thinking
522
00:20:29,440 --> 00:20:31,240
she's awesome.
I want to watch her explain more
523
00:20:31,240 --> 00:20:33,480
stuff.
Also just fun fact, she does not
524
00:20:33,480 --> 00:20:34,680
age.
She does.
525
00:20:34,760 --> 00:20:36,720
I was not age.
I saw her on stage with with
526
00:20:36,720 --> 00:20:40,640
ammo doing this talk and I was
like, you look exactly, exactly
527
00:20:40,640 --> 00:20:42,600
the same, Paul.
Rudd and Hannah Fryer having the
528
00:20:42,600 --> 00:20:43,920
same diet in your samples?
Go.
529
00:20:44,080 --> 00:20:46,680
Seriously, go pull up a pull up
a thumbnail from 10 years ago on
530
00:20:46,680 --> 00:20:48,320
YouTube.
She has not aged at all as
531
00:20:48,320 --> 00:20:50,600
mental.
Whereas you do, I don't.
532
00:20:50,600 --> 00:20:53,040
Actually I don't I will look
like this hopefully until I'm
533
00:20:53,040 --> 00:20:55,160
70.
But the, the, the chat they had
534
00:20:55,160 --> 00:20:56,800
was really interesting.
It was all about AI, it was all
535
00:20:56,800 --> 00:20:59,880
about artificial intelligence.
And it kind of it was, it was
536
00:20:59,920 --> 00:21:02,360
like embedded in education, but
talked about like wider society
537
00:21:02,360 --> 00:21:04,840
as well.
And something I found so
538
00:21:04,840 --> 00:21:07,960
interesting and it really got me
thinking was they were talking
539
00:21:07,960 --> 00:21:10,760
about how AI will transform
education, right?
540
00:21:11,000 --> 00:21:13,560
So whether we like it on our AI
is here to stay, what's it going
541
00:21:13,560 --> 00:21:16,640
to do?
And they spoke about how
542
00:21:16,640 --> 00:21:19,920
currently AI, for example,
there's a lot of worry in in
543
00:21:19,920 --> 00:21:21,320
university for students,
etcetera.
544
00:21:21,320 --> 00:21:24,880
Sixth form, how AI is writing
essays for the children.
545
00:21:25,320 --> 00:21:26,840
OK.
And it's like, cool, I've got a
546
00:21:26,840 --> 00:21:29,560
shortcut now I can just write an
essay, get it done, get it
547
00:21:29,560 --> 00:21:31,280
graded.
There's my B, Boom, I've got it.
548
00:21:31,840 --> 00:21:34,360
And it really highlighted this,
this issue and this thought
549
00:21:34,360 --> 00:21:36,520
about how what is the point of
learning?
550
00:21:36,880 --> 00:21:40,360
And AI has really put a
microscope on it because for
551
00:21:40,360 --> 00:21:43,560
those children, the point of
learning was to get to the grade
552
00:21:43,560 --> 00:21:46,000
at the end.
And that is something I think is
553
00:21:46,000 --> 00:21:49,240
so problematic in education all
the way through. What's the
554
00:21:49,240 --> 00:21:51,000
point of learning? To pass the
test?
555
00:21:51,160 --> 00:21:53,680
What's the point in Year 6? To make
sure the SATs results are good at
556
00:21:53,680 --> 00:21:55,080
the end?
What's the point in secondary
557
00:21:55,080 --> 00:21:56,960
school to get your GCSE for your
qualifications?
558
00:21:57,200 --> 00:22:01,040
I understand in isolation why
those things are important, but
559
00:22:01,040 --> 00:22:04,040
it became the whole reason for
education for a lot of teachers.
560
00:22:04,040 --> 00:22:05,160
I've got to get them through
this test.
561
00:22:05,160 --> 00:22:07,360
I'm just doing it for the test.
Doing it for the test.
562
00:22:07,720 --> 00:22:11,400
AI comes along and makes the end
product something you can just
563
00:22:11,400 --> 00:22:14,880
do in five seconds.
It solves the problem if you
564
00:22:14,880 --> 00:22:16,720
think about it. AI is solving that
problem.
565
00:22:16,720 --> 00:22:18,520
I've written an essay for you.
I've saved you 6 hours.
566
00:22:18,520 --> 00:22:20,480
Isn't that what everyone says AI is for?
Isn't that fantastic?
567
00:22:20,920 --> 00:22:24,960
And for me, what it's
highlighted is how education
568
00:22:24,960 --> 00:22:26,640
needs a revamp.
Whether we've got AI or not.
569
00:22:27,080 --> 00:22:30,960
Education needs to be reoriented
so the act of learning is the
570
00:22:30,960 --> 00:22:34,680
point of learning.
Why else do we learn if not for
571
00:22:34,680 --> 00:22:38,680
the journey and the struggle and
trying something and getting it
572
00:22:38,680 --> 00:22:41,080
wrong and finding something out
and seeing that it works and
573
00:22:41,080 --> 00:22:44,280
spotting patterns?
The act of doing that, much like
574
00:22:44,280 --> 00:22:47,320
problem solving in maths, the
act of doing something you don't
575
00:22:47,320 --> 00:22:49,240
know how to get to the end, but
you're going to try and get
576
00:22:49,240 --> 00:22:51,600
there anyway based on what
you've learned with someone, a
577
00:22:51,600 --> 00:22:54,280
teacher normally alongside you,
prompting you in the right
578
00:22:54,280 --> 00:22:56,880
direction.
That's the point of learning.
579
00:22:57,240 --> 00:23:00,640
And it I found it fascinating
how they spoke about how AI is
580
00:23:00,640 --> 00:23:04,320
taking that away and it almost
doesn't matter anymore.
581
00:23:04,320 --> 00:23:07,840
And I'm interested in your
point: we talk about AI
582
00:23:07,840 --> 00:23:09,880
in tech and how it's so
important that it should be
583
00:23:09,880 --> 00:23:11,320
implemented in education
correctly.
584
00:23:11,880 --> 00:23:14,520
Do you think that's a problem
and an issue then that it could
585
00:23:14,520 --> 00:23:16,920
end up being?
We're not really focusing on
586
00:23:16,920 --> 00:23:18,760
what's important.
Oh, big time, big time.
587
00:23:18,760 --> 00:23:21,800
I think I think Hannah, she went
really like meta with this and
588
00:23:21,800 --> 00:23:23,680
she was talking about almost
like the whole world and just,
589
00:23:23,680 --> 00:23:27,480
like even like lifespans. And she
said, and I agree:
590
00:23:27,480 --> 00:23:30,800
We really just need to challenge
the status quo completely of
591
00:23:30,800 --> 00:23:33,040
like how life works now.
She was like, AI is going to be
592
00:23:33,040 --> 00:23:37,160
so disruptive and such a fast
moving tech that we already need
593
00:23:37,160 --> 00:23:40,240
to stop looking at life as, Oh
yeah, you go to school, you work
594
00:23:40,240 --> 00:23:42,360
for X amount of years, you get
these grades so that you can get
595
00:23:42,360 --> 00:23:44,880
this job and you work that job
for 40 years so that you can
596
00:23:44,880 --> 00:23:46,720
retire and have a nice bit of
time before you die.
597
00:23:47,160 --> 00:23:50,040
That is what she said.
And and she was like, that's
598
00:23:50,040 --> 00:23:51,520
just not going to be a thing
anymore.
599
00:23:51,520 --> 00:23:53,520
Like we're already seeing the
disruption of people having to
600
00:23:53,520 --> 00:23:56,160
retrain after five years in a
job because, oh, AI can do
601
00:23:56,160 --> 00:23:58,000
it now.
It's happened in the short span
602
00:23:58,160 --> 00:24:00,400
of AI going from this little
gimmicky thing to where it is
603
00:24:00,400 --> 00:24:03,360
now. In that time, it's
already eradicated certain jobs
604
00:24:03,960 --> 00:24:07,000
or or at least huge, you know,
chunks of certain jobs, and
605
00:24:07,440 --> 00:24:09,760
that's just fast.
It just fascinates me, like.
606
00:24:09,840 --> 00:24:13,120
Not only does the, you know,
structure of education
607
00:24:13,120 --> 00:24:15,800
need to be completely
dismantled, but all of life.
608
00:24:15,920 --> 00:24:16,400
It's just.
Like.
609
00:24:16,720 --> 00:24:19,000
We need to start asking: what are we
preparing children for?
610
00:24:19,000 --> 00:24:20,720
But ultimately it's for the
workforce generally.
611
00:24:20,720 --> 00:24:22,840
That's what we know people and
society say about education.
612
00:24:23,120 --> 00:24:25,200
Well, the workforce is changing
rapidly, and the workforce is
613
00:24:25,200 --> 00:24:26,640
going to be nothing like it used
to be.
614
00:24:26,800 --> 00:24:29,160
So actually, what are we
preparing kids for?
615
00:24:29,160 --> 00:24:30,240
Yeah.
And it kind of went back to that
616
00:24:30,240 --> 00:24:32,840
question of what are we doing?
Yeah, just getting essays done
617
00:24:32,840 --> 00:24:34,800
quicker and getting grades isn't
even useful anyway.
618
00:24:34,800 --> 00:24:36,400
That's the irony of it.
It's like, you know, exactly.
619
00:24:36,400 --> 00:24:38,680
It's not even that useful now.
It's solving a problem that
620
00:24:38,680 --> 00:24:41,400
doesn't need to be a problem.
We've made it a problem.
621
00:24:41,400 --> 00:24:44,800
So here's the beauty of it.
It's solving a problem for the
622
00:24:44,800 --> 00:24:47,640
workforce of yesterday.
Yeah, yeah, yeah.
623
00:24:47,640 --> 00:24:49,480
It doesn't even solve a real
problem anymore.
624
00:24:49,480 --> 00:24:51,880
No, it doesn't make sense.
So everything's outdated.
625
00:24:51,880 --> 00:24:53,880
It all needs to be restructured and
rethought.
626
00:24:53,880 --> 00:24:55,520
I think it's a huge problem.
I don't think there's gonna be any
627
00:24:55,520 --> 00:24:57,840
quick fixes to this at all.
What I found really interesting
628
00:24:57,840 --> 00:25:01,080
as well on the back of that was
how, you know, it was a big
629
00:25:01,080 --> 00:25:02,160
talk.
That's why I loved it.
630
00:25:02,160 --> 00:25:04,000
I think that's why I was
captivated, because it was about
631
00:25:04,000 --> 00:25:06,480
big, big, big things.
And it started with education.
632
00:25:06,480 --> 00:25:09,960
But you very quickly found out
that actually education is life
633
00:25:09,960 --> 00:25:12,360
is everything, right?
And they spoke about how again,
634
00:25:12,360 --> 00:25:15,000
this is an outdated view, much
like what you said about go to
635
00:25:15,040 --> 00:25:17,040
go to school, get qualification,
get a job, retire.
636
00:25:17,440 --> 00:25:19,880
There's also an outdated view
that you kind of don't have to
637
00:25:19,880 --> 00:25:23,080
keep learning new things.
And this idea of like, learning is
638
00:25:23,080 --> 00:25:25,400
for school belongs back then.
Actually, with the advent of
639
00:25:25,400 --> 00:25:28,120
AI,
she was saying, Amor
640
00:25:28,120 --> 00:25:29,960
was saying both of them are
saying how they're more curious
641
00:25:29,960 --> 00:25:32,560
and find out more things than
ever because they can use AI to
642
00:25:32,560 --> 00:25:35,080
superpower things, to superpower
research to find things out.
643
00:25:35,560 --> 00:25:37,080
And I was like, yes, that's what
we need as well.
644
00:25:37,200 --> 00:25:40,120
We need an entire population
who's thirsty for knowledge,
645
00:25:40,480 --> 00:25:43,360
right?
And unfortunately in school
646
00:25:43,360 --> 00:25:46,240
right now, I don't think we're
feeding that thirst for
647
00:25:46,240 --> 00:25:48,600
knowledge enough.
I think we're feeding a need
648
00:25:48,600 --> 00:25:51,560
to pass an exam, feeding a need
to write an essay.
649
00:25:52,400 --> 00:25:54,800
AI is going to come along and do
it even quicker and get us even
650
00:25:54,800 --> 00:25:58,360
further from creating a
generation of children who want
651
00:25:58,360 --> 00:25:59,960
to learn for the sake of
learning.
652
00:26:00,240 --> 00:26:03,280
And what that will do is in the
short term, cool grades might be
653
00:26:03,280 --> 00:26:05,920
okay for a bit when we're just
marking an essay, but it's going
654
00:26:05,920 --> 00:26:08,440
to be a double whammy because
like you said, the
655
00:26:08,440 --> 00:26:10,440
qualifications won't matter
anymore because the jobs won't
656
00:26:10,440 --> 00:26:12,320
be there.
But also then when they're
657
00:26:12,320 --> 00:26:15,160
adults, they'll have no
experience in the act of
658
00:26:15,160 --> 00:26:18,640
learning or no desire to want to
find out more because they just
659
00:26:18,640 --> 00:26:21,520
think a bot can do it for them.
And that's where we have to.
660
00:26:21,520 --> 00:26:25,880
We have to teach children at
some point in their journey
661
00:26:25,880 --> 00:26:31,320
through education how to use AI
to not do the work for them
662
00:26:31,320 --> 00:26:35,280
necessarily, but to help them
with their learning and the
663
00:26:35,280 --> 00:26:39,120
journey themselves and make AI
work for them rather than just
664
00:26:39,360 --> 00:26:41,360
outsourcing all of the output.
Couldn't agree more.
665
00:26:41,360 --> 00:26:43,600
To superpower their critical
thinking skills?
666
00:26:43,600 --> 00:26:45,520
Exactly.
To superpower their enjoyment of
667
00:26:45,520 --> 00:26:47,360
learning a particular subject
that they've just found out.
668
00:26:47,360 --> 00:26:49,160
But how long have we said that?
It feels like
669
00:26:49,160 --> 00:26:50,440
AI is just putting a microscope
on it?
670
00:26:50,960 --> 00:26:52,920
It's like, oh, AI's come along
and can do the thing we've
671
00:26:52,920 --> 00:26:55,680
always thought is pointless and
stupid even quicker, so kids
672
00:26:55,680 --> 00:26:57,200
have to do less.
It's like, cool.
673
00:26:57,200 --> 00:27:01,720
The problem remains.
As a teacher, I want to inspire
674
00:27:01,720 --> 00:27:04,440
children, OK?
I want to inspire them to want
675
00:27:04,440 --> 00:27:07,520
to find things out.
My job I always thought of when
676
00:27:07,520 --> 00:27:11,040
I was in front of my class was I
want to trick them into thinking
677
00:27:11,040 --> 00:27:12,480
they're deciding what comes
next.
678
00:27:12,800 --> 00:27:15,720
I want to make them think that
they're on their own journey,
679
00:27:15,920 --> 00:27:18,400
which they do as much as
possible, but I'm curating it.
680
00:27:18,680 --> 00:27:21,560
My job is as a curator, and my job
is to inspire, to make them
681
00:27:21,560 --> 00:27:24,040
think something.
I want them to think of the next
682
00:27:24,040 --> 00:27:26,600
thing in the progression without
them thinking I've made them do
683
00:27:26,600 --> 00:27:27,120
it.
Yeah, yeah.
684
00:27:27,120 --> 00:27:28,560
Or without you just telling
them, Yeah, because it's
685
00:27:28,680 --> 00:27:30,440
obviously, you know, you're
thinking, you're asking the
686
00:27:30,440 --> 00:27:31,720
questions like, yeah, what would
happen here?
687
00:27:31,720 --> 00:27:34,600
You know exactly what'll happen.
You're curating a curious mind.
688
00:27:34,840 --> 00:27:36,520
That's what you're doing.
Yeah, that's what you want to
689
00:27:36,520 --> 00:27:38,200
do.
And how boring and dull are the
690
00:27:38,200 --> 00:27:40,600
lessons where you just stand at
the front and just tell them
691
00:27:40,600 --> 00:27:42,160
things?
Yeah, if I wanted to.
692
00:27:42,160 --> 00:27:44,920
There you go.
There's AI. AI can do that if we
693
00:27:44,920 --> 00:27:46,360
want to.
We just get it on a voice note
694
00:27:46,360 --> 00:27:48,680
app, read out some facts to the
kids.
695
00:27:48,680 --> 00:27:51,360
Yeah, what's that going to do?
I want to inspire them.
696
00:27:51,360 --> 00:27:54,680
That's why I think there is
still that difference between
697
00:27:54,680 --> 00:27:58,160
the teacher delivering a
curriculum, inspiring children
698
00:27:58,160 --> 00:28:00,320
and what AI can currently do
right now.
699
00:28:00,320 --> 00:28:02,440
I think there's still a big, big
disconnect.
700
00:28:02,600 --> 00:28:06,120
We have to focus more on what we
as humans can do, human to
701
00:28:06,120 --> 00:28:07,400
human.
Well, that was a really good
702
00:28:07,400 --> 00:28:10,920
example of maybe a part of the
class that perhaps AI could
703
00:28:10,920 --> 00:28:13,560
be used for if it's literally
just here are facts and you need
704
00:28:13,560 --> 00:28:15,760
to know them.
And there's no reason why a bot
705
00:28:15,760 --> 00:28:17,680
(it didn't even have to be AI)
couldn't just give you those
706
00:28:17,680 --> 00:28:19,520
facts.
And in the olden days, they'd
707
00:28:19,520 --> 00:28:21,840
call it a book, give a child a
book to read by themselves
708
00:28:21,840 --> 00:28:24,240
and find things out for themselves.
But let's maybe we can go
709
00:28:24,240 --> 00:28:27,320
through a few things then in the
world of education that we think
710
00:28:27,320 --> 00:28:31,960
and rank them as to how good
we think AI is at doing that
711
00:28:31,960 --> 00:28:33,720
job.
So in other words, how it almost
712
00:28:33,880 --> 00:28:36,240
kind of coincides with how
likely we think then AI is to
713
00:28:36,240 --> 00:28:38,040
replace that job, because the
better it is, the more
714
00:28:38,040 --> 00:28:38,920
likely.
This is good.
715
00:28:39,040 --> 00:28:40,080
It's good.
So we'll give it a score out of
716
00:28:40,080 --> 00:28:43,880
10 and we'll say basically how
useful is AI for this?
717
00:28:43,880 --> 00:28:45,680
Yeah, yeah, yeah.
So I'll start you off then you
718
00:28:45,680 --> 00:28:47,880
can go with this one.
So written lesson plans.
719
00:28:48,640 --> 00:28:52,800
OK.
I think a solid 8. A solid 8?
720
00:28:52,800 --> 00:28:54,640
Explain.
Because again, there's going to
721
00:28:54,640 --> 00:28:56,240
be context to all of these by
the way. People will hear, right,
722
00:28:56,240 --> 00:28:58,840
and go, what?
Either way, the solid context
723
00:28:58,840 --> 00:29:02,000
here is that AI now, especially
large language models can be
724
00:29:02,000 --> 00:29:04,800
trained on documents you upload.
You might actually be uploading
725
00:29:04,800 --> 00:29:06,440
documents from your curriculum.
OK.
726
00:29:06,760 --> 00:29:09,360
Also very important, the prompts
you put into it.
727
00:29:09,720 --> 00:29:12,120
What I'm not saying, and I think
I'll talk for both of
728
00:29:12,120 --> 00:29:12,480
us now:
729
00:29:12,560 --> 00:29:14,680
I don't think either of us are
just saying press a button and
730
00:29:14,680 --> 00:29:16,840
hope it comes.
We have to have the
731
00:29:17,280 --> 00:29:19,520
understanding here that what
we're doing is prompting AI to
732
00:29:19,520 --> 00:29:21,560
do this for us.
So this is only as good as your
733
00:29:21,560 --> 00:29:23,960
prompts at any point.
But if your prompts are good,
734
00:29:24,360 --> 00:29:27,640
especially I'm thinking of
medium to long term planning, If
735
00:29:27,640 --> 00:29:31,080
you want AI to organise your
learning in a structured order
736
00:29:31,280 --> 00:29:33,600
and you're uploading the
curriculum for it, I think it'd
737
00:29:33,600 --> 00:29:35,880
be very good at putting that
into a table for you and mapping
738
00:29:35,880 --> 00:29:38,880
out across the year that kind of
grunt work that you could sit
739
00:29:38,880 --> 00:29:40,480
there and do it on Excel if you
wanted to.
740
00:29:40,960 --> 00:29:42,400
AI can do that in 10 seconds for
you.
741
00:29:42,400 --> 00:29:44,760
I think that's fantastic.
The actual content itself, the
742
00:29:44,760 --> 00:29:48,360
reason why it's not more than an
8 for me is because I think
743
00:29:48,360 --> 00:29:51,720
personally, the content will
always come from something else.
744
00:29:51,720 --> 00:29:53,680
Even if I was curating it, I'd
be getting my content from
745
00:29:53,680 --> 00:29:55,040
somewhere else.
I'd be reading through the
746
00:29:55,040 --> 00:29:58,920
content and deciding what to do.
With AI, I'd be giving it that, right?
747
00:29:59,000 --> 00:30:01,440
So it's still, it's still.
Don't just think that I'm hoping
748
00:30:01,440 --> 00:30:03,680
it does the right thing.
And the reason it's not 9 or 10
749
00:30:03,680 --> 00:30:05,800
is because afterwards I'd be
spending time going through what
750
00:30:05,800 --> 00:30:08,280
it did and editing it.
But I think in general, for
751
00:30:08,280 --> 00:30:10,400
written lesson plans, I think
written lesson plans on the
752
00:30:10,400 --> 00:30:11,840
whole, especially when you're
more experienced, are an
753
00:30:11,840 --> 00:30:12,800
absolute waste of time.
Yeah, yeah.
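To make that planning grunt work concrete, here is a minimal sketch of the kind of prompt being described, assuming the OpenAI Python SDK; the model name and the example objectives are placeholders, and any school-approved tool that accepts a pasted prompt would do the same job.

```python
# Hypothetical sketch: turn a list of curriculum objectives into a
# term-by-term medium-term plan, the "grunt work" described above.
# Assumes the OpenAI Python SDK (openai>=1.0) and an OPENAI_API_KEY
# in the environment; the model name and objectives are placeholders.
from openai import OpenAI

objectives = [
    "Round any number to the nearest 10, 100 or 1,000",
    "Add and subtract numbers with up to 4 digits",
    "Recall multiplication facts up to 12 x 12",
]

prompt = (
    "You are helping a UK primary teacher with medium-term planning.\n"
    "Organise the objectives below into a table with columns: Week, "
    "Objective, Suggested small steps. Spread them across a 6-week "
    "half term in a sensible teaching order. Do not invent objectives.\n\n"
    + "\n".join(f"- {o}" for o in objectives)
)

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)  # paste into your own format and edit
```

As they say above, the output is only as good as the prompt and the curriculum you give it, and you would still read through and edit the result.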
754
00:30:12,800 --> 00:30:15,240
I was going to say, what about
in the classic situation where
755
00:30:15,480 --> 00:30:18,000
you've already planned your
lesson in terms of what, however
756
00:30:18,000 --> 00:30:19,920
you do that, whether it's a
PowerPoint or in your mind or
757
00:30:19,920 --> 00:30:22,120
it's practical, whatever.
And then the school policy is,
758
00:30:22,120 --> 00:30:24,480
Oh, no, you have to have it
written down on this, on this
759
00:30:24,480 --> 00:30:26,400
planning format.
You've already done the work.
760
00:30:26,400 --> 00:30:29,160
There's nothing to gain for you
in the situation of writing it
761
00:30:29,160 --> 00:30:30,680
down.
It's just for the score in that
762
00:30:30,680 --> 00:30:33,240
situation, it's the 10 out of 10
because I'm like, I don't even
763
00:30:33,240 --> 00:30:35,840
want to do this anyway.
So if someone else and if AI can
764
00:30:35,840 --> 00:30:38,360
just take what I've already done
and and just put it into some
765
00:30:38,360 --> 00:30:40,160
lovely written lesson for
someone else, great.
766
00:30:40,360 --> 00:30:42,080
So here's here's my big worry
for AI.
767
00:30:42,080 --> 00:30:44,800
Not worry, but I can see this
happening already is people
768
00:30:44,800 --> 00:30:48,440
become obsessed with AI and try
to shoehorn AI into every single
769
00:30:48,440 --> 00:30:51,360
part of their team's life.
So let's say you're the head of
770
00:30:51,360 --> 00:30:54,080
a school or a trust and you
think I need to get AI in
771
00:30:54,080 --> 00:30:56,080
everything for productivity?
We must get it in everything.
772
00:30:56,280 --> 00:30:58,960
Sometimes I think it adds steps
that you don't need right?
773
00:30:58,960 --> 00:31:01,640
So whilst okay, sure in your
context there right of your
774
00:31:01,640 --> 00:31:04,640
written plan, you've already got
your smart notebook or your or
775
00:31:04,640 --> 00:31:07,280
your PowerPoint for your lesson.
You're ready to teach, you're as
776
00:31:07,280 --> 00:31:10,120
ready as you can be, but then
your school goes,
777
00:31:10,120 --> 00:31:11,800
no, no, I need a written plan.
Just put it into AI.
778
00:31:11,800 --> 00:31:15,640
It'd be so quick.
It was like, OK, any second I
779
00:31:15,640 --> 00:31:20,120
put into making that is a second
I didn't need to spend. Yeah, but
780
00:31:20,120 --> 00:31:23,120
they'll go, look how long it
would have taken you if we hadn't
781
00:31:23,120 --> 00:31:25,440
had AI.
It's gone from one hour to 10
782
00:31:25,440 --> 00:31:26,800
minutes.
Isn't that fantastic?
783
00:31:27,120 --> 00:31:29,240
I really need people to
understand here and just be
784
00:31:29,240 --> 00:31:32,080
nuanced and think about stuff
moment to moment because
785
00:31:32,080 --> 00:31:35,080
actually, no, you haven't saved
50 minutes, OK?
786
00:31:35,360 --> 00:31:37,000
You've wasted 10 minutes.
Yeah, definitely.
787
00:31:37,000 --> 00:31:38,600
It's like Amazon Black Friday.
No, you haven't.
788
00:31:38,600 --> 00:31:40,360
You haven't saved £200 on that
Hoover.
789
00:31:40,360 --> 00:31:43,120
You don't need it, OK? From £400 to
£200.
790
00:31:43,240 --> 00:31:44,960
You've wasted £200 because you
didn't.
791
00:31:44,960 --> 00:31:45,920
Need it?
Yeah, and you won't get about
792
00:31:45,920 --> 00:31:46,480
anyway.
Exactly.
793
00:31:46,480 --> 00:31:47,440
Just flip it.
Exactly.
794
00:31:47,440 --> 00:31:49,160
Yeah, Yeah.
You've no longer wasted 60
795
00:31:49,160 --> 00:31:50,320
minutes.
You've wasted 10 minutes.
796
00:31:50,320 --> 00:31:51,960
That's the way to do it.
No, you've saved 50.
797
00:31:52,040 --> 00:31:54,200
Exactly.
So thinking outside the box is
798
00:31:54,200 --> 00:31:56,320
important, isn't it.
In general, just, like, is the
799
00:31:56,320 --> 00:31:58,640
system broken? Is the system
the problem, not can we make the
800
00:31:58,640 --> 00:32:00,000
system quicker?
Exactly right.
801
00:32:00,400 --> 00:32:03,360
I'm going to give you, I'm going
to go: communication with parents.
802
00:32:04,120 --> 00:32:08,080
So letters, newsletters, notes
in general.
803
00:32:08,080 --> 00:32:09,960
Maybe as a teacher you want to
e-mail parents.
804
00:32:11,480 --> 00:32:14,440
Quite probably like a three or
four for me.
805
00:32:14,440 --> 00:32:16,920
Yeah, because I think the, I
think, I think AI can do some of
806
00:32:16,920 --> 00:32:20,360
the grunt work in terms of
speeding up your notes perhaps
807
00:32:20,360 --> 00:32:22,400
into a formal letter.
That's quite good.
808
00:32:22,520 --> 00:32:25,120
There's no reason.
But like, realistically, all of
809
00:32:25,120 --> 00:32:28,920
the content absolutely has to be
created and created by you.
810
00:32:28,920 --> 00:32:31,640
You know what I mean?
You're talking to a parent about
811
00:32:32,120 --> 00:32:33,920
their child.
Like every child is unique.
812
00:32:33,920 --> 00:32:36,400
It's a different situation.
It's not like a curriculum
813
00:32:36,400 --> 00:32:38,280
document.
You can't just upload the child
814
00:32:38,400 --> 00:32:41,640
into a thing and say, here.
Like, a national curriculum
815
00:32:41,640 --> 00:32:43,600
document is one thing and it's
objective, you know?
816
00:32:43,760 --> 00:32:45,200
GDPR.
Yeah, exactly.
817
00:32:45,200 --> 00:32:47,920
But a child is like a child.
Every child's experience and
818
00:32:47,920 --> 00:32:49,240
life is different.
I can't really.
819
00:32:49,360 --> 00:32:51,640
I just need to talk to the
parent and have a nuanced
820
00:32:51,640 --> 00:32:53,160
conversation.
Like I don't really think AI.
821
00:32:53,200 --> 00:32:54,800
Hold that thought.
Yeah, because I'm going to give
822
00:32:54,800 --> 00:32:57,880
you another one straight away.
Report writing that one.
823
00:32:57,880 --> 00:33:01,160
Why is it different?
I kind of said it already that
824
00:33:01,160 --> 00:33:03,240
report writing is.
Nuanced.
825
00:33:03,240 --> 00:33:04,480
Every single child is.
Different, true.
826
00:33:04,480 --> 00:33:06,800
I guess I'm really putting
emphasis in report writing on
827
00:33:06,800 --> 00:33:09,040
the writing bit.
Communication with parents to me
828
00:33:09,040 --> 00:33:11,920
is like phone calls and like you
said, maybe notes or or chatting
829
00:33:11,920 --> 00:33:14,320
after school.
Like it's kind of like AI is.
830
00:33:14,320 --> 00:33:15,600
I can't see it replacing much of
that.
831
00:33:15,720 --> 00:33:17,920
OK, so you so for communicate,
let's be really clear then
832
00:33:17,920 --> 00:33:20,480
because that's very interesting
for communication with parents.
833
00:33:20,480 --> 00:33:24,160
What you're saying is actually
you should be doing in person
834
00:33:24,160 --> 00:33:26,760
stuff more.
Yeah, realistically and I, I
835
00:33:26,760 --> 00:33:29,520
don't think AI should be taking
that away from the experience of
836
00:33:29,520 --> 00:33:31,520
school and the relationship
between parents and.
837
00:33:31,560 --> 00:33:32,040
School.
OK, Very.
838
00:33:32,080 --> 00:33:33,560
Interesting.
Whereas report writing is
839
00:33:33,560 --> 00:33:35,560
traditionally a piece of paper
with writing on it.
840
00:33:35,720 --> 00:33:39,800
And to me, I mean, I've done it
for years, there's, there's just
841
00:33:39,800 --> 00:33:44,280
no benefit to me writing out the
paragraphs myself when I can
842
00:33:44,280 --> 00:33:45,920
just put all of my bullet points
that I would already have
843
00:33:45,920 --> 00:33:47,520
written.
So in both situations, AI or not
844
00:33:47,520 --> 00:33:49,840
AI, I'm writing some bullet
points out about a child and the
845
00:33:49,840 --> 00:33:51,320
things I want to get across to
their parent.
846
00:33:51,600 --> 00:33:55,440
Then I'm spending either 15 to 20
minutes curating a lovely
847
00:33:55,440 --> 00:33:59,440
paragraph that's written really
nicely, or I can spend 5 seconds
848
00:33:59,440 --> 00:34:01,160
putting into AI and saying this
is my writing style.
849
00:34:01,160 --> 00:34:03,840
Can you just, can you just put
these notes into a paragraph and
850
00:34:03,840 --> 00:34:06,720
then proofreading it, always
doing little bits of edits, but
851
00:34:06,720 --> 00:34:09,600
the time saved is astronomical.
When I started using AI for
852
00:34:09,600 --> 00:34:12,040
reports, not only were they just
better because I wasn't getting
853
00:34:12,040 --> 00:34:14,159
the fatigue and starting to just
be like, oh man, I've been
854
00:34:14,159 --> 00:34:17,400
writing reports for 17 hours.
Genuinely because I'm just, I'm
855
00:34:17,400 --> 00:34:19,719
so fatigued that and it's all in
my own time because no one ever
856
00:34:19,719 --> 00:34:23,520
gets bloody report writing time
or nearly never enough, that I'm
857
00:34:23,520 --> 00:34:26,440
just copying paragraphs from
previous kids because they're
858
00:34:26,440 --> 00:34:28,040
similar.
I'll just change, tweak a word
859
00:34:28,040 --> 00:34:30,080
or two.
Oh, I'm just using a Bank of
860
00:34:30,080 --> 00:34:32,120
stuff from five years ago, from
10 years ago.
861
00:34:32,120 --> 00:34:35,040
Like like we all pretend that
that was fine.
862
00:34:35,400 --> 00:34:38,800
Maybe it was, but we can't then
also say, oh, AI is terrible.
863
00:34:38,960 --> 00:34:41,639
Actually, AI was making more
unique paragraphs than ever
864
00:34:41,639 --> 00:34:43,800
before.
It was still mimicking my style
865
00:34:43,800 --> 00:34:45,840
and it didn't have to copy
and paste anything from any
866
00:34:45,840 --> 00:34:47,600
other child.
I could give it unique notes
867
00:34:47,960 --> 00:34:50,880
every single time about each
child, and all it did was just
868
00:34:50,880 --> 00:34:52,679
formulate those notes into a
paragraph for someone else to
869
00:34:52,679 --> 00:34:55,719
read in a professional manner.
So like the sheer amount of
870
00:34:55,719 --> 00:34:58,440
grunt work that had to go into
report writing for you bumps it
871
00:34:58,440 --> 00:35:00,080
up from your three.
So what would you give out of
872
00:35:00,080 --> 00:35:01,440
10?
For you, I'd probably still give
873
00:35:01,440 --> 00:35:05,560
it like 7 because ultimately,
yeah, it would be a 10 if the
874
00:35:05,560 --> 00:35:08,120
note-taking didn't exist.
Even 7 is
875
00:35:08,120 --> 00:35:09,440
probably a bit generous to be
honest with you.
876
00:35:09,440 --> 00:35:10,760
Because that is, I'm going to
give it five.
877
00:35:10,920 --> 00:35:13,200
I'm going down because I think
half, half of the importance
878
00:35:13,200 --> 00:35:16,880
there is just from you, the
teacher, your understanding of
879
00:35:16,880 --> 00:35:18,400
the child, nothing to do with
anybody else.
880
00:35:18,800 --> 00:35:21,120
That bit is so important, even
though it only takes 5% of the
881
00:35:21,120 --> 00:35:23,640
time now.
Yeah, very, very important.
882
00:35:23,640 --> 00:35:26,840
So AI cannot replace that
bit, but it can replace the time
883
00:35:26,840 --> 00:35:29,000
saved with writing.
So I'm going to very quickly on
884
00:35:29,000 --> 00:35:30,680
this now, whilst we're talking
about this, I'm going to ask
885
00:35:30,680 --> 00:35:32,080
you, would you use AI for this
thing?
886
00:35:32,080 --> 00:35:33,160
It's all to do with this.
OK.
887
00:35:33,200 --> 00:35:35,360
OK.
Would you use AI when
888
00:35:35,360 --> 00:35:37,160
sending an e-mail to a parent?
Yeah.
889
00:35:37,440 --> 00:35:39,560
Would you use AI when sending
out a letter to a parent?
890
00:35:39,560 --> 00:35:41,160
Yeah.
Would you use AI when writing
891
00:35:41,160 --> 00:35:42,040
reports?
Yep.
892
00:35:42,280 --> 00:35:44,480
Would you use AI when writing
an EHCP plan?
893
00:35:45,760 --> 00:35:47,560
Yeah.
Interesting.
894
00:35:47,560 --> 00:35:49,040
Yeah, yeah, yeah.
What made you slow down?
895
00:35:49,360 --> 00:35:52,360
Because I was thinking that's
more sensitive and again, like a
896
00:35:52,360 --> 00:35:53,720
bit more nuanced. But actually, do
897
00:35:53,720 --> 00:35:56,840
you think it's worth the time,
the notes? Or are you really
898
00:35:56,840 --> 00:36:00,080
separating it out?
I'd put the grunt work in, the
899
00:36:00,080 --> 00:36:01,920
actual. Actually,
Yeah, I think because you have
900
00:36:01,920 --> 00:36:04,680
to put so much, so much careful
consideration of thought into
901
00:36:04,680 --> 00:36:06,800
the kind of your note-taking
process when writing the EHCP in
902
00:36:06,800 --> 00:36:08,920
the first place, that kind of
becomes the EHCP.
903
00:36:08,920 --> 00:36:10,480
I'd probably avoid it.
So there might actually not
904
00:36:10,480 --> 00:36:12,840
be that much difference there
because it's so, it's so.
905
00:36:12,840 --> 00:36:14,920
I'm trying to see where maybe
your line is basically like, you
906
00:36:14,920 --> 00:36:18,240
know, like, you know, a letter
out, a letter is just very
907
00:36:18,240 --> 00:36:20,080
generic, isn't it,
down to like a very
908
00:36:20,080 --> 00:36:23,080
specific legal document, the
plan for the EHCP that we're gonna
909
00:36:23,080 --> 00:36:24,000
implement for
the child.
910
00:36:24,000 --> 00:36:27,040
Yeah, yeah, yeah.
There's got to be, whether I'm
911
00:36:27,040 --> 00:36:29,360
right or wrong, there's got to
be a line somewhere.
912
00:36:29,360 --> 00:36:33,360
In my opinion, I still probably.
I think I would, you know, in
913
00:36:33,360 --> 00:36:35,840
what, in what respect there
purely if there was just some
914
00:36:35,840 --> 00:36:38,280
crazy format I had to fill out
and I was like, I've done all
915
00:36:38,280 --> 00:36:39,960
the work and I just want it
turned into that format.
916
00:36:39,960 --> 00:36:41,400
I'd absolutely use that.
And just be really clear.
917
00:36:41,400 --> 00:36:43,040
I want to put a disclaimer in
here because I know me and you
918
00:36:43,040 --> 00:36:46,400
know this and this is like deep
within our brains, we do
919
00:36:46,400 --> 00:36:49,320
not advocate and would never
actually put children's names
920
00:36:49,320 --> 00:36:51,560
and we would not breach GDPR by
putting children's names and
921
00:36:51,560 --> 00:36:54,840
data into things like ChatGPT.
You can use blanks.
922
00:36:54,840 --> 00:36:57,120
You can get it to structure
sentences without using
923
00:36:57,120 --> 00:36:58,240
children's names, and you
shouldn't do it.
924
00:36:58,240 --> 00:37:00,960
We should say that because it is
very, very, very, very, very,
925
00:37:00,960 --> 00:37:03,080
very important.
Like there are people out there
926
00:37:03,080 --> 00:37:05,840
who are lovely and don't mean
any harm whatsoever, who won't
927
00:37:05,840 --> 00:37:08,320
even think about it.
They're not being malicious, but
928
00:37:09,160 --> 00:37:11,080
it's bad and you should not do
that.
929
00:37:11,080 --> 00:37:14,440
So if you're doing your reports,
don't put the name, just put literally
930
00:37:14,440 --> 00:37:16,880
like a string of letters or
something in when you're putting
931
00:37:16,880 --> 00:37:18,600
it into the thing, you still
give the description, the bullet
932
00:37:18,600 --> 00:37:21,160
points and then just manually
replace it, put the child's name in
933
00:37:21,160 --> 00:37:22,480
yourself.
So that's how that's how to do
934
00:37:22,480 --> 00:37:23,560
it.
Just to be really clear, in case
935
00:37:23,560 --> 00:37:25,440
someone's.
Also that means, by the way, it
936
00:37:25,440 --> 00:37:27,800
means that I built in my editing
and reviewing whilst I was
937
00:37:27,800 --> 00:37:28,720
putting the name in.
Yeah.
938
00:37:28,720 --> 00:37:30,200
And I did one of the.
I was like, oh, it's an extra
939
00:37:30,200 --> 00:37:31,600
thing.
Not really because I read it.
940
00:37:31,600 --> 00:37:33,640
I always read them, Yeah.
And any time it said name, I'd
941
00:37:33,640 --> 00:37:35,200
write their name in, then I'd
keep reading it.
942
00:37:35,200 --> 00:37:37,000
And it may be something else.
I don't quite like that word.
943
00:37:37,000 --> 00:37:39,120
I think it's not quite right.
I added that, put the name in,
944
00:37:39,120 --> 00:37:41,120
put the name in it.
It worked quite well.
945
00:37:41,120 --> 00:37:43,200
Yeah, it was absolutely fine.
It really was still a huge time,
946
00:37:43,200 --> 00:37:45,840
save for unique reports.
I'm going to give you 1 now.
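Before the next one, here is a minimal sketch of the placeholder workflow just described, in plain Python with a made-up stand-in token; no real names or data ever go into the AI tool, and the name swap happens locally on your own machine while you proofread.

```python
# Hypothetical sketch of the GDPR-safe report workflow described above:
# notes go in with a stand-in token, never a child's name, and the real
# name is only added locally during the read-through and edit.

notes = [
    "CHILD_A has grown in confidence with written methods for division",
    "still needs reminders to check answers for reasonableness",
    "a kind and supportive member of the class",
]

# This is the text you would paste into the AI tool, names stripped out.
prompt = (
    "Write one warm, professional report paragraph in my usual style "
    "from these notes. Refer to the child only as CHILD_A:\n"
    + "\n".join(f"- {n}" for n in notes)
)
print(prompt)

# Imagine `draft` is the paragraph the tool gives back; the name swap
# happens on your own machine, as part of reading and editing it.
draft = "CHILD_A has grown in confidence with written methods for division..."
final = draft.replace("CHILD_A", "Alex")  # placeholder name for the example
print(final)
```

As they say, the reading back through and swapping the name in is where the editing and reviewing gets built in anyway.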
947
00:37:45,840 --> 00:37:49,120
So going back to the context of
rating out of 10, how useful AI
948
00:37:49,160 --> 00:37:52,080
is for something in the
classroom, lesson hooks.
949
00:37:52,160 --> 00:37:53,480
So that bit at the start of the
lesson.
950
00:37:53,680 --> 00:37:56,800
Yeah, I think this is where it
can come into its own depending
951
00:37:56,800 --> 00:37:58,680
on what you're doing.
Like it can be 10 out of 10.
952
00:37:58,680 --> 00:38:03,240
Genuinely, I've, I've not for
every lesson ever, but you might
953
00:38:03,240 --> 00:38:04,600
decide actually, do you know
what?
954
00:38:04,960 --> 00:38:07,600
I, we did some writing once
before we were doing some
955
00:38:07,600 --> 00:38:11,680
character description.
Tell me the hardest thing to get
956
00:38:11,680 --> 00:38:14,200
kids to believe is real when
they're doing writing is that
957
00:38:14,200 --> 00:38:15,560
there's a real purpose behind
958
00:38:15,560 --> 00:38:17,640
it, right?
How much better is children's
959
00:38:17,640 --> 00:38:19,880
writing when there's purpose?
That's why, you know, when
960
00:38:19,880 --> 00:38:21,760
you're writing a letter, you try
and write that to someone
961
00:38:21,760 --> 00:38:23,480
important.
You say I'm going to send it to
962
00:38:23,480 --> 00:38:26,040
David Attenborough, I'm going to
send it to the Prime Minister,
963
00:38:26,040 --> 00:38:28,160
I'm going to send it to Father
Christmas, whatever it might be.
964
00:38:28,400 --> 00:38:31,640
So purpose is massive in writing
because you get children engaged
965
00:38:31,640 --> 00:38:34,480
and want to do it.
So we were doing some writing
966
00:38:34,480 --> 00:38:37,440
before we did some character
descriptions, quite dry, quite
967
00:38:37,440 --> 00:38:40,240
boring, genuinely, like unless
they've got a really vivid
968
00:38:40,240 --> 00:38:42,040
imagination that some children
don't have.
969
00:38:42,600 --> 00:38:46,320
So, AI. I remember at the front of
the room we were doing some
970
00:38:46,320 --> 00:38:49,200
shared writing together.
We were talking about how to use
971
00:38:49,200 --> 00:38:50,760
different descriptive words,
etcetera.
972
00:38:50,760 --> 00:38:53,840
What can make the, the, the
monster thing they're doing,
973
00:38:53,840 --> 00:38:56,200
monsters or aliens or something,
really stand out?
974
00:38:57,120 --> 00:39:01,840
And I then put that into image
generation and it made the image
975
00:39:01,960 --> 00:39:04,320
and they saw it and they
immediately then could link
976
00:39:04,320 --> 00:39:07,720
between the words they had
chosen and what it looks like.
977
00:39:07,960 --> 00:39:10,240
And then after that, we went
back, we identified all the
978
00:39:10,240 --> 00:39:13,720
adjectives in the sentence, for
example, and we said, OK, that
979
00:39:13,720 --> 00:39:17,520
that alien, it's, it's too big.
It's too big.
980
00:39:17,520 --> 00:39:19,360
How did we describe it?
Oh, you said it's enormous.
981
00:39:19,760 --> 00:39:22,840
Rubbed it out, changed it, went to
something else.
982
00:39:23,200 --> 00:39:24,760
It's green.
I don't think Green's that good.
983
00:39:24,760 --> 00:39:26,720
I think in the in that
environment, I think it'd be
984
00:39:26,720 --> 00:39:28,800
better to be like purpley or
something, whatever it might be.
985
00:39:29,200 --> 00:39:31,400
Go through, scrub out the
adjectives and swap them.
986
00:39:31,400 --> 00:39:35,280
Whatever reason to maybe change
the theme or to change the feel
987
00:39:35,280 --> 00:39:37,760
of the monster to make it from a
cuddly monster to a scary
988
00:39:37,760 --> 00:39:40,600
monster.
So the kids were having purpose
989
00:39:40,600 --> 00:39:43,000
because they knew it was about
to be regenerated and we were
990
00:39:43,000 --> 00:39:45,760
talking about the power of words
in writing.
991
00:39:45,920 --> 00:39:49,240
It became tangible for them.
That hook was 10 out of 10 for
992
00:39:49,240 --> 00:39:52,120
my kids because suddenly they
were thinking every word they
993
00:39:52,120 --> 00:39:55,240
wrote in their paper.
They were thinking right when
994
00:39:55,320 --> 00:39:57,480
Mr. Price generates this image,
I said to them, going to
995
00:39:57,480 --> 00:40:00,960
generate all your animals When
Mr. Price generates this, is
996
00:40:00,960 --> 00:40:03,160
that the right word to use?
Or actually, I really want it to
997
00:40:03,160 --> 00:40:04,680
look like this.
What's a better word?
998
00:40:04,680 --> 00:40:06,160
Let's get the thesaurus.
Let's look it up.
999
00:40:07,000 --> 00:40:09,080
There was so much purpose.
Their writing was so much
1000
00:40:09,080 --> 00:40:10,960
better.
It was meaningful.
1001
00:40:11,160 --> 00:40:14,080
And we said every word that goes
in your page will be learned and
1002
00:40:14,080 --> 00:40:16,520
be shown.
And it was 10 out of 10.
1003
00:40:16,640 --> 00:40:20,240
That hook worked so well.
And that was using AI.
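For anyone wanting to try something similar, here is a minimal sketch of that regenerate-the-image hook, assuming the OpenAI Python SDK and its image endpoint; the model name and the example sentences are placeholders, and any school-approved image generator would do the same job.

```python
# Hypothetical sketch of the regenerate-the-monster hook described above.
# The class's shared sentence is the prompt; swap an adjective, run it
# again, and compare the two pictures. Assumes the OpenAI Python SDK
# (openai>=1.0) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def generate_monster(sentence: str) -> str:
    """Return a URL for an image generated from the class's description."""
    result = client.images.generate(
        model="dall-e-3",          # placeholder model name
        prompt=f"A children's book illustration of: {sentence}",
        n=1,
        size="1024x1024",
    )
    return result.data[0].url

# First shared-writing attempt, then the version after scrubbing out
# and swapping the adjectives with the class.
print(generate_monster("an enormous, green, cuddly monster with three eyes"))
print(generate_monster("a tiny, purple, terrifying monster with three eyes"))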
1004
00:40:20,640 --> 00:40:22,600
That's it.
Do you reckon you're skewed
1005
00:40:22,600 --> 00:40:25,480
slightly towards English in
terms of like where hooks are
1006
00:40:25,480 --> 00:40:27,320
really good?
Yeah, because it depends.
1007
00:40:27,320 --> 00:40:31,120
It depends, because with writing, I've
also asked it before. Like, what
1008
00:40:31,160 --> 00:40:32,960
AI can do is give you lots of
ideas.
1009
00:40:32,960 --> 00:40:36,040
Yeah, true, lots of ideas.
So for example, what I always
1010
00:40:36,040 --> 00:40:39,320
say is never ask it for just one
hook, never just one idea.
1011
00:40:39,960 --> 00:40:43,640
You might say, right, I'm doing
a lesson this afternoon on, you
1012
00:40:43,640 --> 00:40:47,600
know, encampments or something,
or Roman history or, or how, you
1013
00:40:47,600 --> 00:40:50,160
know, the Anglo Saxons came over
to Britain, etcetera.
1014
00:40:50,160 --> 00:40:51,400
I'm amazed you didn't say rivers,
but yeah.
1015
00:40:51,400 --> 00:40:52,240
Rivers.
Rivers are.
1016
00:40:52,240 --> 00:40:54,080
I'm really trying to avoid
saying rivers usually.
1017
00:40:54,520 --> 00:40:55,640
Really.
I'm doing a lesson on rivers.
1018
00:40:56,200 --> 00:40:58,400
How can I hook the children to
get them interested in this
1019
00:40:58,400 --> 00:41:00,160
story?
So you can say to them like, but
1020
00:41:00,160 --> 00:41:01,960
you can say give me 10 ideas.
Yeah, definitely.
1021
00:41:01,960 --> 00:41:04,120
And it'll generate 10; 7 will be
nonsense.
1022
00:41:04,160 --> 00:41:06,240
Yeah, 2 will be all right, and 1
might be really good.
1023
00:41:06,240 --> 00:41:08,040
Yeah, yeah, yeah.
So it's just about ideas and it
1024
00:41:08,040 --> 00:41:11,600
can give you ideas.
And I found again, I'll say this
1025
00:41:11,600 --> 00:41:15,800
again, AI for a lot of people is
stifling creativity.
1026
00:41:16,520 --> 00:41:20,440
For me, I became more creative
because it gave me more ideas.
1027
00:41:20,600 --> 00:41:22,520
I did stuff at the start of
lessons that I would not have
1028
00:41:22,520 --> 00:41:24,040
done before.
I always thought to myself,
1029
00:41:24,040 --> 00:41:26,880
what's a really good hook?
And Oh my God, I just scoffed my
1030
00:41:26,880 --> 00:41:28,160
lunch down.
I'm on duty for a bit.
1031
00:41:28,160 --> 00:41:29,040
I've got a lesson this
afternoon.
1032
00:41:29,040 --> 00:41:32,800
I can't think for two seconds.
Why don't you try starting off
1033
00:41:32,800 --> 00:41:35,520
by doing this?
Yes, we're going to start off
1034
00:41:35,520 --> 00:41:37,440
with a little story.
Wouldn't people argue that is
1035
00:41:37,440 --> 00:41:39,240
stifling your creativity though
because you didn't do it?
1036
00:41:39,280 --> 00:41:41,080
No, it's not stifling my
creativity because the option is
1037
00:41:41,200 --> 00:41:44,960
I'm not creative at all or I'm
choosing, I'm making my lesson
1038
00:41:44,960 --> 00:41:46,680
more creative.
So that's what that's what I
1039
00:41:46,680 --> 00:41:48,560
mean.
Like I without it, I would not
1040
00:41:48,560 --> 00:41:51,120
have done anything.
So by definition, I was more
1041
00:41:51,120 --> 00:41:53,600
creative.
But if I'm not, I'm not this.
1042
00:41:53,720 --> 00:41:56,320
I'm a teacher.
I'm there to give children, you
1043
00:41:56,320 --> 00:42:00,000
know, options, opportunities in a
lesson to be more hooked and
1044
00:42:00,000 --> 00:42:02,720
engaged by something.
It's a more creative lesson.
1045
00:42:03,360 --> 00:42:06,480
Am I personally going through
the creative process of sitting down
1046
00:42:06,480 --> 00:42:08,880
thinking about it? No, I don't have
time to do it, but it's made my
1047
00:42:08,880 --> 00:42:10,520
lesson more creative.
It's made my children more
1048
00:42:10,520 --> 00:42:12,760
engaged.
I'm giving more to the kids as a
1049
00:42:12,760 --> 00:42:14,880
result, just because I've used
this as a tool.
1050
00:42:15,120 --> 00:42:18,160
I think for hooks, by all means,
use your own hooks that you've
1051
00:42:18,160 --> 00:42:19,800
used in your whole career.
There are some lessons.
1052
00:42:19,800 --> 00:42:22,400
I know what I'm gonna do all the
time when I'm doing instructions
1053
00:42:22,400 --> 00:42:24,200
for the first time.
We make a sandwich.
1054
00:42:24,200 --> 00:42:27,000
Yeah, yeah, of course.
And we go, the kids say, put the
1055
00:42:27,000 --> 00:42:28,440
bread on top, I'll put it on top of
my head.
1056
00:42:28,440 --> 00:42:29,920
You're not being specific
enough.
1057
00:42:29,920 --> 00:42:32,360
All that kind of stuff.
This stuff I will always do, but
1058
00:42:32,440 --> 00:42:34,600
for lessons where you want
something just whack it into AI.
1059
00:42:34,680 --> 00:42:36,640
The Egyptian lesson years ago
and we used to introduce
1060
00:42:36,640 --> 00:42:38,480
Egyptians and we'd get
the toilet paper out, dress up
1061
00:42:38,480 --> 00:42:40,720
as mummies, they'd get the sand
out and they'd dig for treasure
1062
00:42:41,040 --> 00:42:42,960
just to get them hooked into the
idea of.
1063
00:42:43,120 --> 00:42:45,200
Dig for treasure, that classic
historical language.
1064
00:42:45,880 --> 00:42:47,960
Dig for treasure.
Kids dress up as mummy.
1065
00:42:48,080 --> 00:42:50,720
Dress up as archaeologists.
No one's taking primary learning
1066
00:42:50,720 --> 00:42:53,560
seriously.
But it was a hook, you know, and
1067
00:42:53,560 --> 00:42:55,520
the thought of like replacing
that with that was the best hook
1068
00:42:55,520 --> 00:42:56,120
for that.
Yeah.
1069
00:42:56,280 --> 00:42:58,160
I mean, that didn't need
AI, but you're absolutely right.
1070
00:42:58,160 --> 00:43:00,360
You know what AI might have been
instrumental in thinking of that
1071
00:43:00,360 --> 00:43:01,960
hook in the first place.
Do you know from that meta level
1072
00:43:01,960 --> 00:43:04,800
like, yeah, exactly.
As well as actually being used
1073
00:43:04,920 --> 00:43:06,880
like in your English example?
I want to do another one with
1074
00:43:06,880 --> 00:43:09,320
you because it's kind of linked.
I think it's close enough.
1075
00:43:09,320 --> 00:43:11,640
Anyway, that was lesson hooks.
What about in the main part of
1076
00:43:11,640 --> 00:43:13,600
the lesson, like the actual
worksheet?
1077
00:43:14,760 --> 00:43:18,440
Oh, this really depends.
It can genuinely be 0 out of 10,
1078
00:43:19,080 --> 00:43:21,360
it can genuinely be absolute
trash nonsense.
1079
00:43:21,640 --> 00:43:24,640
It can be quite useful, right?
So let me be really specific
1080
00:43:24,640 --> 00:43:27,680
here, give you some examples.
When I've taught GPS objectives
1081
00:43:27,680 --> 00:43:32,760
in the past and I've wanted ten
questions to fill a gap with a
1082
00:43:32,760 --> 00:43:36,240
preposition or something, or
choosing between, sorry, 'an'
1083
00:43:36,240 --> 00:43:39,400
and 'a', right, or choosing a
determiner to go in the gap or
1084
00:43:39,640 --> 00:43:42,480
anything like that kind of
nonsense that I hate anyway.
1085
00:43:42,840 --> 00:43:45,080
I don't want to spend 10 minutes
putting a sheet together for
1086
00:43:45,080 --> 00:43:47,240
that.
So AI was very good at
1087
00:43:47,360 --> 00:43:52,240
generating 10, 15, 20 sentences
with the preposition missing and
1088
00:43:52,240 --> 00:43:55,160
maybe even, in brackets, 3 to
choose from. That kind of
1089
00:43:55,160 --> 00:43:57,000
generation of worksheets.
Fantastic.
1090
00:43:57,400 --> 00:44:00,000
AI generated loads of times
table practice for me.
1091
00:44:00,400 --> 00:44:03,080
And I said only do three and two
times tables, mix up the
1092
00:44:03,080 --> 00:44:04,600
presentation.
It could handle that.
1093
00:44:04,680 --> 00:44:07,120
Yeah, yeah, right.
So in terms of that kind of
1094
00:44:07,120 --> 00:44:10,880
rote, repetitive fill in the box
text level stuff.
1095
00:44:10,880 --> 00:44:12,520
Cloze procedure.
Exactly.
1096
00:44:12,520 --> 00:44:13,880
Cloze procedure, that's the
word.
1097
00:44:14,040 --> 00:44:16,240
I found it was very good at
that, especially if you give it
1098
00:44:16,240 --> 00:44:20,440
the right prompts and told it
what to do. Outside of that, when
1099
00:44:20,440 --> 00:44:21,920
it gets a bit further.
I don't know.
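For the rote, fill-in-the-box end of worksheet generation described above, here is a minimal sketch in plain Python rather than an AI tool, assuming the 2 and 3 times tables example; the question count and formats are placeholders.

```python
# Hypothetical sketch of the "rote, repetitive, fill-in-the-box" worksheet
# generation described above: only the 2 and 3 times tables, with the
# presentation mixed up so the missing number moves around.
import random

def times_table_questions(tables=(2, 3), count=15):
    """Return a list of mixed-format times-table questions as strings."""
    questions = []
    for _ in range(count):
        a = random.choice(tables)
        b = random.randint(1, 12)
        style = random.choice(["product", "factor"])
        if style == "product":
            questions.append(f"{a} x {b} = ___")       # e.g. 3 x 7 = ___
        else:
            questions.append(f"{a} x ___ = {a * b}")   # e.g. 3 x ___ = 21
    return questions

for i, q in enumerate(times_table_questions(), start=1):
    print(f"{i}. {q}")
```

The point in the conversation stands either way: this kind of mechanical generation is where the tools shine, and the carefully sequenced reasoning-style questions are where they still struggle.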
1100
00:44:21,920 --> 00:44:23,440
We've played around with
different things, haven't we?
1101
00:44:23,440 --> 00:44:26,120
And I'm not going to sit and
name all companies etcetera, but
1102
00:44:26,400 --> 00:44:28,040
but it can be hit and miss,
right?
1103
00:44:28,200 --> 00:44:29,280
Yeah.
I think if we sort of band it
1104
00:44:29,280 --> 00:44:31,200
together then with maybe lesson
generation in general, like slide
1105
00:44:31,200 --> 00:44:32,960
generation.
So like your work, basically the
1106
00:44:32,960 --> 00:44:35,080
content that you might put in on
a sheet for the kids to work
1107
00:44:35,080 --> 00:44:37,120
through, all the content that
might be on your slides when
1108
00:44:37,120 --> 00:44:39,320
you're going through.
But very quick, I think that's,
1109
00:44:39,720 --> 00:44:41,920
I think that's, I think it's
completely separate in terms of
1110
00:44:42,240 --> 00:44:45,360
even the worksheets.
Yeah, like I think even that
1111
00:44:45,360 --> 00:44:47,680
even those alone is such a wide
range.
1112
00:44:47,680 --> 00:44:48,360
Oh, fair enough.
Yeah, yeah.
1113
00:44:48,360 --> 00:44:50,680
But in terms of the no, because
you're basically saying no
1114
00:44:50,680 --> 00:44:53,120
worksheet's fine.
It's not like half the time like
1115
00:44:53,160 --> 00:44:55,840
if you, if you want a reasoning
worksheet, like for example,
1116
00:44:55,840 --> 00:44:58,680
even in a maths objective, you
know when you create maths
1117
00:44:58,680 --> 00:45:02,040
questions where in a row there's
a reason why you put them in a
1118
00:45:02,040 --> 00:45:06,240
row.
Like you know, 7 * 870 * 871 *
1119
00:45:06,240 --> 00:45:10,360
871 * 9 and and all that, you
know, individual, you could sit
1120
00:45:10,360 --> 00:45:13,240
and work them out, but every
single jump there's a reason why
1121
00:45:13,240 --> 00:45:14,760
you've jumped it and there's
reasoning built in.
1122
00:45:15,120 --> 00:45:17,080
I find it really struggles with
anything like that.
1123
00:45:17,440 --> 00:45:20,680
And if you want a real quality
task for your children, I think
1124
00:45:20,680 --> 00:45:23,120
it's better to create it
yourself sometimes, or get it
1125
00:45:23,120 --> 00:45:25,200
from a real trusted source where
a specialist has done that.
1126
00:45:25,720 --> 00:45:27,560
That's all I'll say about that,
but please crack on.
1127
00:45:27,560 --> 00:45:29,800
No, no, I sort of
banded them together because I
1128
00:45:29,800 --> 00:45:32,120
also, I agree by the way, like
with worksheet generation, go
1129
00:45:32,120 --> 00:45:34,120
back to that for a second.
Even things like I think about
1130
00:45:34,120 --> 00:45:36,600
some of those sort of rich
afternoon lessons we do
1131
00:45:36,600 --> 00:45:40,400
sometimes maybe in science or
history or something where it's
1132
00:45:40,400 --> 00:45:43,240
not actually in that lesson,
just about like question,
1133
00:45:43,240 --> 00:45:45,080
answer, question, answer,
knowledge, knowledge, knowledge.
1134
00:45:45,080 --> 00:45:47,120
Sometimes it's like a we're
going to go out and we're going
1135
00:45:47,120 --> 00:45:49,800
to stick some leaves in and
we're going to come back and we
1136
00:45:49,800 --> 00:45:51,440
analyse them and we're going to
label them and stuff.
1137
00:45:51,720 --> 00:45:53,880
I felt like AI was rubbish
at doing that.
1138
00:45:53,880 --> 00:45:56,560
It was rubbish at creating that,
thinking of that idea and making
1139
00:45:56,560 --> 00:45:58,280
a worksheet for me because I
definitely tried it lots of
1140
00:45:58,280 --> 00:46:01,080
times and it was always very,
very, very boring stuff.
1141
00:46:01,080 --> 00:46:04,040
So like, yeah, I agree, sort of
three out of 10, it's not the
1142
00:46:04,040 --> 00:46:06,280
best at doing that.
Then with slide generation, this
1143
00:46:06,280 --> 00:46:08,720
one is fascinating because I
feel like it's the
1144
00:46:08,720 --> 00:46:10,200
thing people are trying to crack
at the moment.
1145
00:46:11,480 --> 00:46:14,000
And maybe it has been cracked,
but I'm sort of yet to see it in
1146
00:46:14,000 --> 00:46:16,400
all honesty, that there's lots
of bits of software out there
1147
00:46:16,400 --> 00:46:19,880
that say, oh, just put in some
prompts, or tell it what your
1148
00:46:20,160 --> 00:46:22,720
curriculum is and it will make a
whole lesson for you, a whole
1149
00:46:22,720 --> 00:46:26,640
PowerPoint.
And I've tried a few and I'll be
1150
00:46:26,640 --> 00:46:29,160
honest, I would not use them in
my class.
1151
00:46:29,360 --> 00:46:32,120
I would not use them.
Like even you could argue or
1152
00:46:32,120 --> 00:46:34,560
maybe it just gives you a
starting point, a base, a base
1153
00:46:34,560 --> 00:46:37,760
point to sort of adapt from.
Even though it was, it was, I
1154
00:46:38,080 --> 00:46:39,400
was clutching at straws to be
honest with you.
1155
00:46:39,400 --> 00:46:43,000
I was thinking, I have to edit
this so much that I might as
1156
00:46:43,000 --> 00:46:45,240
well just start from scratch.
Like this hasn't really saved me
1157
00:46:45,240 --> 00:46:47,080
any time.
And if I was to use it, it's
1158
00:46:47,240 --> 00:46:50,760
a very, very suboptimal lesson in
terms of the resources being put
1159
00:46:50,760 --> 00:46:53,120
in front of kids.
That's just my opinion, I'm sure
1160
00:46:53,120 --> 00:46:54,640
people disagree.
A lot of people are enjoying
1161
00:46:54,640 --> 00:46:56,480
using these resources because
I'm sure it does save them time,
1162
00:46:57,160 --> 00:47:01,640
but at what cost?
So this is, I would say, where we can
1163
00:47:01,640 --> 00:47:04,440
probably round off in terms of
the talk here, because I think
1164
00:47:04,440 --> 00:47:10,560
this is the perfect example of
how saving teachers time and
1165
00:47:10,560 --> 00:47:14,320
school staff time is genuinely
at the forefront of everything
1166
00:47:14,320 --> 00:47:16,320
that we should be using with
tech, right?
1167
00:47:17,400 --> 00:47:20,240
But it needs to be on a level
with it being quality for the
1168
00:47:20,400 --> 00:47:22,440
for the for the students, right?
For the children.
1169
00:47:22,920 --> 00:47:26,880
It has to be equal.
It has to be because I feel like
1170
00:47:26,880 --> 00:47:29,880
a lot of the shortcuts currently
for creating worksheets or
1171
00:47:29,880 --> 00:47:32,640
creating slides.
I'm not saying it won't happen
1172
00:47:32,640 --> 00:47:34,200
in the future.
I think we could get to a point
1173
00:47:34,200 --> 00:47:37,680
where it is very useful, but
currently it's at the detriment
1174
00:47:37,680 --> 00:47:41,120
of the quality of what's being
produced because AI can produce
1175
00:47:41,120 --> 00:47:42,720
slop.
We know that. It's still at this
1176
00:47:42,720 --> 00:47:45,240
stage getting better, but we
know it can produce rubbish.
1177
00:47:45,760 --> 00:47:48,680
So when you look at these, I, I
look at it, sometimes I think
1178
00:47:48,680 --> 00:47:52,040
that would not wash in my school
if, if I made, if I made that
1179
00:47:52,040 --> 00:47:53,680
slide, I said, here's the lesson
guys, for everyone.
1180
00:47:53,680 --> 00:47:56,040
That would not pass as a good
enough lesson.
1181
00:47:56,480 --> 00:47:59,320
Not only sometimes in the
generation of lessons that we
1182
00:47:59,320 --> 00:48:02,080
practised with, we looked at and
tried, not only sometimes were
1183
00:48:02,080 --> 00:48:05,160
the facts just wrong, which was
a whole different conversation
1184
00:48:05,160 --> 00:48:06,680
about how it's scraping the
Internet.
1185
00:48:06,880 --> 00:48:08,600
Sometimes the facts were
literally wrong.
1186
00:48:08,760 --> 00:48:11,480
And as a teacher who's not a
specialist in geography and
1187
00:48:11,480 --> 00:48:14,200
rivers, it could
have scraped something off
1188
00:48:14,200 --> 00:48:16,760
that sounds right.
What am I gonna do, research
1189
00:48:16,760 --> 00:48:18,160
everything?
I might as well have done the research
1190
00:48:18,160 --> 00:48:20,240
myself.
Literally, there was a picture that was
1191
00:48:20,240 --> 00:48:22,440
wrong as well.
It was a historical
1192
00:48:22,440 --> 00:48:23,680
person.
It was really important that
1193
00:48:23,680 --> 00:48:26,160
this picture was accurate.
Just pulled up like a picture of
1194
00:48:26,760 --> 00:48:29,160
different people and I was like,
oh, this is terrible.
1195
00:48:29,160 --> 00:48:30,200
This is.
Not good.
1196
00:48:30,200 --> 00:48:32,440
It's like it should have been
Winston Churchill, but it was
1197
00:48:32,440 --> 00:48:33,840
bloody, you know?
Margaret Thatcher.
1198
00:48:33,920 --> 00:48:35,880
Yeah, it's something.
I don't think Margaret Thatcher
1199
00:48:35,880 --> 00:48:37,720
was involved in World War 2.
Yeah, or it was someone that
1200
00:48:37,720 --> 00:48:39,600
just looked a bit like Winston
Churchill, and it's like this.
1201
00:48:39,600 --> 00:48:41,320
Oh, like it's clearly scoured the
Internet and thought this was
1202
00:48:41,320 --> 00:48:43,320
this.
But it's not like this is not
1203
00:48:43,400 --> 00:48:44,680
good.
And it's not going to pass in
1204
00:48:44,680 --> 00:48:46,680
schools.
Everyone who works in a
1205
00:48:46,680 --> 00:48:49,040
school, we talk about it so much
how schools are obsessed with
1206
00:48:49,040 --> 00:48:51,920
uniformity as well sometimes and
slides having to look the same
1207
00:48:51,920 --> 00:48:53,200
and following the same
structure.
1208
00:48:53,760 --> 00:48:56,080
There's no structure.
There was no structure to those
1209
00:48:56,080 --> 00:48:57,400
lessons.
It was just like a story.
1210
00:48:57,400 --> 00:48:59,960
And it's like, cool if we've got
absolutely no standards in what
1211
00:48:59,960 --> 00:49:03,760
lessons look like, maybe that's just it.
So, I think it'll get there.
1212
00:49:03,920 --> 00:49:05,880
In all honesty, with the rate
AI is changing, I do think we'll
1213
00:49:05,880 --> 00:49:07,360
get there.
It's just that right now I'd say
1214
00:49:07,360 --> 00:49:09,640
it's not the best at doing that
job and it still needs a
1215
00:49:09,640 --> 00:49:10,880
teacher's touch.
Let's get to the big one,
1216
00:49:10,880 --> 00:49:12,000
because I really want to ask you
this one.
1217
00:49:12,920 --> 00:49:17,000
Teaching out of 10, AI replacing
teaching.
1218
00:49:17,280 --> 00:49:21,480
Right.
Right now, as a whole, as me as a
1219
00:49:21,480 --> 00:49:26,320
teacher in the classroom: one. 1,
not 0, but.
1220
00:49:26,320 --> 00:49:30,880
One, no, no, not 0, because you
just have to be honest that
1221
00:49:31,560 --> 00:49:33,600
there are things that AI is
better than me at.
1222
00:49:33,760 --> 00:49:37,320
Yeah, of course there is.
Hannah Fry said anything that
1223
00:49:37,320 --> 00:49:40,200
can be done sat at a computer
will be replaced by AI.
1224
00:49:40,240 --> 00:49:42,880
Yeah, it is some element of a
teacher's job sat behind a
1225
00:49:42,880 --> 00:49:43,960
computer.
We wouldn't have a bloody
1226
00:49:43,960 --> 00:49:46,440
podcast at all.
How often we talked about how
1227
00:49:46,440 --> 00:49:48,400
we're not in this repetitive
point in the class.
1228
00:49:48,720 --> 00:49:50,880
So let's take that away.
Yeah, brilliant.
1229
00:49:50,880 --> 00:49:53,760
That's maybe it's a 2.
But the teaching bit, you are
1230
00:49:53,760 --> 00:49:56,120
standing in front of kids and
you are delivering content to
1231
00:49:56,120 --> 00:49:58,080
them and you are teaching and
you are adapting on the spot and
1232
00:49:58,080 --> 00:49:59,680
you're doing all eight of the
Teachers' Standards.
1233
00:49:59,920 --> 00:50:02,960
That bit of teaching, pastoral
right now.
1234
00:50:03,000 --> 00:50:05,480
Pastoral, OK, all of it.
Yeah, right now is 1 out of 10,
1235
00:50:05,480 --> 00:50:05,920
isn't it?
Yeah.
1236
00:50:06,240 --> 00:50:08,240
What do you think about the
future?
1237
00:50:08,240 --> 00:50:10,320
Because this is where I think we
have had lots of little
1238
00:50:10,320 --> 00:50:12,040
discussion about.
Let's air this out now.
1239
00:50:12,040 --> 00:50:13,960
All right, all right.
Where do you think we're going?
1240
00:50:14,880 --> 00:50:19,880
It is to me, and this is not me
saying this is good to me,
1241
00:50:20,080 --> 00:50:26,160
inevitable, that the job of a
teacher will be replaced by
1242
00:50:26,160 --> 00:50:29,240
artificial intelligence at some
point in the future.
1243
00:50:29,520 --> 00:50:35,720
Are we talking robots or AI
actually doing the entire job,
1244
00:50:36,080 --> 00:50:38,080
teaching the kids, the pastoral
care, all of it?
1245
00:50:38,280 --> 00:50:41,240
Listen, I I don't like this
prediction.
1246
00:50:41,760 --> 00:50:43,120
I don't think this prediction is
good.
1247
00:50:43,160 --> 00:50:46,000
That's fine.
But I do think there will be a
1248
00:50:46,000 --> 00:50:49,920
point where it will be a choice
by governments or whoever is in
1249
00:50:49,920 --> 00:50:55,280
charge to spend more money to
get a real life human in front
1250
00:50:55,280 --> 00:50:57,360
of the class.
And it'll only be as long as the
1251
00:50:57,360 --> 00:51:01,280
resolve is there for humans to
have humans with other humans,
1252
00:51:01,880 --> 00:51:04,280
that that will happen.
I think it will get to a point
1253
00:51:04,280 --> 00:51:08,640
where artificial intelligence is
better at everything that we can
1254
00:51:08,640 --> 00:51:11,280
possibly do.
It's it's literally better at
1255
00:51:11,280 --> 00:51:13,280
building in the same way.
Do you know, different people
1256
00:51:13,280 --> 00:51:14,560
have different strengths, Right?
Yeah.
1257
00:51:14,640 --> 00:51:17,080
And if you lined up 100 people,
you could line them up in the
1258
00:51:17,080 --> 00:51:19,720
order of how emotionally
intelligent they are and how
1259
00:51:20,000 --> 00:51:22,880
empathetic they are and how how
kind they are and how tall they
1260
00:51:22,880 --> 00:51:25,040
are and how angry they get.
And it'll be all different
1261
00:51:25,040 --> 00:51:27,280
orders.
There will come a point where
1262
00:51:27,280 --> 00:51:30,400
artificial intelligence will be
at the top 1% of every single
1263
00:51:30,720 --> 00:51:32,600
one of them, right?
In the same way me and you are
1264
00:51:32,600 --> 00:51:34,760
different.
Artificial intelligence will
1265
00:51:34,760 --> 00:51:37,360
learn what makes us different
and better and worse and be
1266
00:51:37,360 --> 00:51:39,360
better at everything than
humans.
1267
00:51:39,360 --> 00:51:41,240
That will happen.
The intelligence will get to
1268
00:51:41,240 --> 00:51:44,880
such a high level, it will
skyrocket up where it gets to
1269
00:51:44,880 --> 00:51:47,200
the point where it knows
everything and can do everything
1270
00:51:47,200 --> 00:51:49,640
better. And people always say,
You'll never replace a human's
1271
00:51:49,640 --> 00:51:51,880
heart and a human's touch.
I was like, yeah, well, do you
1272
00:51:51,880 --> 00:51:54,000
know what?
There's, I can think of a few
1273
00:51:54,000 --> 00:51:58,400
human emotions and intelligences
that I have that I think you
1274
00:51:58,400 --> 00:52:00,840
could learn to be better at than
me.
1275
00:52:00,960 --> 00:52:04,080
Yeah, yeah, yeah.
So I'm just saying I don't like
1276
00:52:04,080 --> 00:52:07,320
this fact and I don't think it's
necessarily a good thing, but I
1277
00:52:07,320 --> 00:52:10,560
just think that humans are a
bunch of chemicals, chemical
1278
00:52:10,560 --> 00:52:13,240
reactions, neurons. That's all that's
there.
1279
00:52:13,320 --> 00:52:17,200
There will be a point where we
can replicate that in a robot,
1280
00:52:17,200 --> 00:52:18,520
an artificial.
Empathy for you.
1281
00:52:18,520 --> 00:52:20,520
You never learned that one
that's that's low and that's
1282
00:52:20,520 --> 00:52:22,240
easy.
AI does that now better than
1283
00:52:22,400 --> 00:52:24,800
you.
Literally, that's my point.
1284
00:52:24,800 --> 00:52:26,280
And so I think it will replace
it.
1285
00:52:26,280 --> 00:52:29,840
And I think it's going to come
down to a cost evaluation.
1286
00:52:29,880 --> 00:52:32,240
And eventually there'll be so
much pressure that they'll drop
1287
00:52:32,240 --> 00:52:35,440
it. Like, look, why, why can we
sit here and say, and in the
1288
00:52:35,440 --> 00:52:38,640
chat people said all the white
collar jobs are going to go, you
1289
00:52:38,640 --> 00:52:39,760
know, all all the different
jobs.
1290
00:52:39,760 --> 00:52:42,040
They're like, you know, lawyers.
They'll be, they'll be an AI bot
1291
00:52:42,040 --> 00:52:44,440
that knows the law inside out and
within 3 seconds can work out whether
1292
00:52:44,440 --> 00:52:46,880
something's right or wrong.
It can look at all of
1293
00:52:46,880 --> 00:52:49,160
the log of every single
conviction ever.
1294
00:52:49,360 --> 00:52:51,840
Get rid of any kind of bias and
just judge someone immediately,
1295
00:52:51,960 --> 00:52:54,240
right.
Better than a group of 12 random
1296
00:52:54,240 --> 00:52:55,600
people from the whole country
can.
1297
00:52:56,120 --> 00:52:58,520
Than what we are? Of course it's
better, but it's only going to
1298
00:52:58,520 --> 00:53:01,680
be the resolve of the humans in
the system to say, no, we need
1299
00:53:01,680 --> 00:53:03,560
that human connection.
They need to think of a really
1300
00:53:03,560 --> 00:53:06,040
good argument for why we do it
because eventually it will just
1301
00:53:06,040 --> 00:53:09,680
be, well, we're doing it just
'cause that's why.
1302
00:53:09,680 --> 00:53:12,160
And I think there'll come a
point where it's like they're
1303
00:53:12,160 --> 00:53:14,280
just better, off you go.
And I think AI will take over the
1304
00:53:14,280 --> 00:53:15,520
world.
God, that's mad, isn't it?
1305
00:53:15,520 --> 00:53:17,840
I, I got a couple of things to
say because I think your, your
1306
00:53:17,960 --> 00:53:19,920
argument is very compelling,
especially when there's infinite
1307
00:53:19,920 --> 00:53:21,440
time involved.
I mean, I'm sort of like, OK,
1308
00:53:21,440 --> 00:53:24,200
maybe in 3000 years that is.
Definitely, I think sooner, but.
1309
00:53:24,240 --> 00:53:26,280
For you think, yeah, for sure.
I think our, our understanding
1310
00:53:26,280 --> 00:53:30,720
of our differences mostly lies
in the timing, but even with the
1311
00:53:30,720 --> 00:53:33,000
government, I, you know, I
always see things online
1312
00:53:33,000 --> 00:53:35,920
being like, oh, this country
is now going to use an AI member
1313
00:53:35,920 --> 00:53:37,600
of cabinet and stuff like this.
And I'm like,
1314
00:53:37,800 --> 00:53:40,600
It does make sense, probably
long term for things like
1315
00:53:40,600 --> 00:53:43,520
governing bodies to be a bit
more objective and, you know,
1316
00:53:43,520 --> 00:53:45,640
less likely to be corrupted if
possible.
1317
00:53:45,800 --> 00:53:49,440
So let's hire in the Mark
Zuckerberg and Sam Altman
1318
00:53:49,480 --> 00:53:50,680
company.
I'm not really sure.
1319
00:53:50,680 --> 00:53:52,920
Yeah, don't get me wrong.
I think it's, I don't mean it's
1320
00:53:52,920 --> 00:53:55,240
going to get rid of it, right.
I think it's going to hyper
1321
00:53:55,240 --> 00:53:58,440
concentrate it to a very few
very, very rich people and
1322
00:53:58,440 --> 00:54:01,000
democracy will become a farce.
That's what I think.
1323
00:54:01,200 --> 00:54:04,520
Potentially, but for the same
reason that that that's dumb in
1324
00:54:04,520 --> 00:54:06,200
terms of like the corruption
from a different direction.
1325
00:54:06,200 --> 00:54:07,920
I do think the same thing is
probably going to happen in
1326
00:54:07,920 --> 00:54:10,960
teaching in terms of, OK, let's
get in these two companies
1327
00:54:10,960 --> 00:54:13,800
now that control precisely what
our kids are being taught,
1328
00:54:13,800 --> 00:54:16,080
rather than just, you know, the
pick of all of humanity where
1329
00:54:16,080 --> 00:54:17,960
you get all sorts of teachers
and all sorts of different
1330
00:54:17,960 --> 00:54:20,600
backgrounds, and
children get access to all these
1331
00:54:20,600 --> 00:54:22,840
different teachers across their
career in school.
1332
00:54:23,120 --> 00:54:25,720
I think that paired with the
fact that I think the human
1333
00:54:25,720 --> 00:54:28,080
resolve, we need to give it more
credit than maybe you're
1334
00:54:28,080 --> 00:54:30,480
giving it.
I really really really think
1335
00:54:30,480 --> 00:54:33,720
that even in 100 years when we
have the most unreal AI robots
1336
00:54:33,720 --> 00:54:35,960
that can be the top 1% at
everything, we're still just
1337
00:54:35,960 --> 00:54:39,080
going to want 4 year olds who
don't understand anything about
1338
00:54:39,080 --> 00:54:42,000
the world yet in front of
another flesh person.
1339
00:54:42,040 --> 00:54:45,200
You know, not an AI person.
I think, I think we'll always
1340
00:54:45,200 --> 00:54:47,320
want that.
I want that. Want is great.
1341
00:54:47,680 --> 00:54:49,960
It doesn't make something happen.
I, I there are lots of people
1342
00:54:49,960 --> 00:54:52,440
who didn't want self-service
checkouts at Tesco because they
1343
00:54:52,440 --> 00:54:54,680
wanted the human interaction and
going to talk to someone.
1344
00:54:54,840 --> 00:54:57,120
They didn't want one person in
charge of 6 checkouts.
1345
00:54:57,120 --> 00:54:59,200
They wanted one person per
person so they can talk to them.
1346
00:54:59,520 --> 00:55:01,520
Don't matter what you want
because at the end of the day
1347
00:55:01,520 --> 00:55:03,280
it's better for the company,
it's better for money.
1348
00:55:03,280 --> 00:55:05,000
You're going to make money.
We're going to get more people
1349
00:55:05,000 --> 00:55:06,160
through the door.
We're going to get more money
1350
00:55:06,160 --> 00:55:09,360
through the tills if we do it
this way. Want is a great thing.
1351
00:55:09,560 --> 00:55:11,680
I want it too, yeah.
It's not going to happen.
1352
00:55:12,040 --> 00:55:13,520
Do you know what one situation I
could imagine?
1353
00:55:13,520 --> 00:55:15,240
I think you probably told me
this probably where it's come
1354
00:55:15,280 --> 00:55:17,680
from is firstly bigger class
sizes.
1355
00:55:17,680 --> 00:55:19,880
Yeah, because I think that's,
that's the first easy way to cut
1356
00:55:19,880 --> 00:55:21,960
costs:
OK, look, all right, we have 50%
1357
00:55:21,960 --> 00:55:23,880
real teachers, but 50% AI
teachers and we can.
1358
00:55:23,960 --> 00:55:25,040
It's happening in America right
now.
1359
00:55:25,120 --> 00:55:27,840
There's a school that does that.
We were talking to Lee, Mr P,
1360
00:55:27,840 --> 00:55:32,880
and he was saying how in America
they have basically like 100 kids
1361
00:55:32,880 --> 00:55:35,040
in a room being taught something
and then they go off to their
1362
00:55:35,040 --> 00:55:36,840
one to one individual AI tutor.
Right.
1363
00:55:37,160 --> 00:55:38,840
It's happening.
Yeah, yeah, yeah, that's yeah,
1364
00:55:38,840 --> 00:55:40,720
very similar example I was
thinking of imagine like a big
1365
00:55:40,720 --> 00:55:44,800
room where it's almost flipped
like the the teacher is the AI
1366
00:55:44,800 --> 00:55:47,200
robot that's basically just sort
of delivering facts.
1367
00:55:47,200 --> 00:55:49,040
It's not really doing the
empathy thing. Even if it's
1368
00:55:49,040 --> 00:55:51,160
really good at it,
we're not saying it's
1369
00:55:51,160 --> 00:55:52,240
going to provide the human
touch.
1370
00:55:52,760 --> 00:55:55,240
And then the teaching assistants
are basically just some adults
1371
00:55:55,240 --> 00:55:57,960
dotted around who can do that
human to human interaction that
1372
00:55:57,960 --> 00:56:00,160
I don't think we'll ever, ever,
ever not need.
1373
00:56:00,160 --> 00:56:02,120
I don't think we'll ever get to
a point where we don't actually
1374
00:56:02,120 --> 00:56:04,000
want or need that as a society,
no matter the cost.
1375
00:56:04,480 --> 00:56:07,280
But I can imagine cutting
costs of, OK, yeah, the, the
1376
00:56:07,280 --> 00:56:09,400
one at the front delivering the
knowledge is like that that can
1377
00:56:09,400 --> 00:56:10,800
be a robot.
And there's just a few adults
1378
00:56:10,800 --> 00:56:12,120
around, just totally different
roles.
1379
00:56:12,120 --> 00:56:13,320
It's not even teaching
assistants anymore.
1380
00:56:13,320 --> 00:56:15,800
It's just the human to human
connection people.
1381
00:56:16,120 --> 00:56:18,760
I don't actually disagree with
anything you're saying.
1382
00:56:19,040 --> 00:56:21,520
I just, I'm just saying I don't
think that will happen.
1383
00:56:22,000 --> 00:56:26,480
Like, like, it's almost as if
the two, the two, like, arguments
1384
00:54:26,480 --> 00:54:28,840
we're having here are:
But I think this is important
1385
00:56:28,840 --> 00:56:30,040
and we should still have that
thing.
1386
00:56:30,400 --> 00:56:33,640
And I'm kind of saying, yeah,
but we won't. Like, I don't, I
1387
00:56:33,640 --> 00:56:36,000
don't disagree with you.
I don't disagree with you at
1388
00:56:36,000 --> 00:56:37,880
all.
I just don't think it will
1389
00:56:37,880 --> 00:56:39,960
happen.
Is it better to have one to one
1390
00:56:40,600 --> 00:56:42,920
scanners at Tesco so you can
talk to someone as you go through
1391
00:56:42,920 --> 00:56:43,800
it?
I've got a problem with this.
1392
00:56:43,800 --> 00:56:45,360
I can talk to you right away.
It's more immediate.
1393
00:56:45,640 --> 00:56:47,600
Is that better for the consumer?
Yes.
1394
00:56:47,800 --> 00:56:51,000
Is it better for the shareholder
of the company? No, because they
1395
00:56:51,000 --> 00:56:52,440
can make more money doing
something else.
1396
00:56:52,520 --> 00:56:55,840
Yeah, yeah, yeah.
The big tech giants will be a
1397
00:56:55,840 --> 00:56:58,200
conglomerate.
Everything will go through them.
1398
00:56:58,320 --> 00:57:00,920
They will take over education as
well as everything else they're
1399
00:57:00,920 --> 00:57:03,480
taking over.
And they will cut, cut, cut, cut
1400
00:57:03,480 --> 00:57:04,560
because there'll only be a few of them
left.
1401
00:57:04,760 --> 00:57:07,600
They will take everything.
It won't be a good thing, but it
1402
00:57:07,600 --> 00:57:10,600
will happen.
And that that would give it 30
1403
00:57:10,600 --> 00:57:14,880
years.
Society will look like nothing
1404
00:57:14,880 --> 00:57:17,480
you have ever seen before.
And we're going to have to
1405
00:57:17,480 --> 00:57:19,600
relearn everything, which is
what we started off with.
1406
00:57:19,600 --> 00:57:20,760
The conversation with Hannah
Fry.
1407
00:57:21,160 --> 00:57:23,840
I'm not saying that it's
necessarily a good thing.
1408
00:57:24,080 --> 00:57:27,280
I'm just saying it's a thing.
I mean, yeah, we'll have to
1409
00:57:27,280 --> 00:57:29,360
agree to disagree a little bit.
I don't fully disagree.
1410
00:57:29,360 --> 00:57:30,680
I think that's probably what's
going to happen.
1411
00:57:30,680 --> 00:57:33,720
But I like to think that.
I like to think that human
1412
00:57:33,720 --> 00:57:36,720
resolve will come through a
little bit stronger and we'll, no
1413
00:57:36,720 --> 00:57:39,200
matter how good AI is, not
replace even a tiny bit, just
1414
00:57:39,200 --> 00:57:42,000
a tiny bit, of
what teachers bring.
1415
00:57:42,280 --> 00:57:44,440
Still, now we've got people
saying never use a screen ever.
1416
00:57:44,600 --> 00:57:46,680
We're like, oh my God.
Yeah, exactly right, yeah.
1417
00:57:48,200 --> 00:57:49,800
I wonder how those people are
1418
00:57:49,800 --> 00:57:51,760
going to feel when they when
they realise that the kids' teacher is
1419
00:57:51,760 --> 00:57:54,880
an AI robot with a massive
screen on its head.
1420
00:57:56,760 --> 00:57:59,000
Those screens, you say?
No technology. You
1421
00:58:00,880 --> 00:58:04,240
know what'll happen? It'll be
something, some other crazy tech
1422
00:58:04,240 --> 00:58:06,280
development will happen in the
next 10 years that will just be
1423
00:58:06,280 --> 00:58:08,280
wildly different again in the
same way AI disrupted
1424
00:58:08,280 --> 00:58:09,600
everything.
And it'll be something we can't
1425
00:58:09,600 --> 00:58:11,400
even comprehend right now.
And it'll be like, no, it was
1426
00:58:11,400 --> 00:58:13,320
never going to be AI robots.
It's going to be this other
1427
00:58:13,320 --> 00:58:17,560
insane, crazy, futuristic thing.
Joe, we should do at some point
1428
00:58:17,560 --> 00:58:20,680
in the next year, release a
podcast episode that is wholly AI
1429
00:58:21,160 --> 00:58:24,160
and just see who notices.
See who notices.
1430
00:58:24,160 --> 00:58:25,560
Yeah.
And it'd be crazy, wouldn't it,
1431
00:58:25,680 --> 00:58:27,800
if it was this one.
Bye.
1432
00:58:28,400 --> 00:58:28,720
See ya.