In this podcast episode, we discuss an interesting study that suggests preschoolers may prefer learning from a competent robot over an incompetent human. Plus, we tackle some community questions about handling stakeholders who ignore UXR feedback, coping with lost Zoom recordings, and navigating antagonistic CX relationships.
#podcast #UXresearch #technology #preschoolers #robots #humaninteraction #customerexperience #userfeedback #Zoom #lostrecordings
Recorded live on April 13th, 2023, hosted by Nick Roome, with guest Barry Kirby.
Check out the latest from our sister podcast - 1202 The Human Factors Podcast - on Naturalistic Decision Making - An interview with Rob Hutton:
News:
It Came From:
Follow us:
Thank you to our Human Factors Cast Honorary Staff Patreons:
Support us:
Human Factors Cast Socials:
Reference:
Feedback:
Disclaimer: Human Factors Cast may earn an affiliate commission when you buy through the links here.
Mentioned in this episode:
1202 - The Human Factors Podcast
Listen here: https://www.1202podcast.com
Listen to Human Factors Minute
Step into the world of Human Factors and UX with the Human Factors Minute podcast! Each episode is like a mini-crash course in all things related to the field, packed with valuable insights and information in just one minute. From organizations and conferences to theories, models, and tools, we've got you covered. Whether you're a practitioner, student or just a curious mind, this podcast is the perfect way to stay ahead of the curve and impress your colleagues with your knowledge. Tune in on the 10th, 20th, and last day of every month for a new and interesting tidbit related to Human Factors. Join us as we explore the field and discover how fun and engaging learning about Human Factors can be! https://www.humanfactorsminute.com https://feeds.captivate.fm/human-factors-minute/
Let us know what you want to hear about next week by voting in our latest "Choose the News" poll!
[00:00:00] Nick Roome: What I'm about to say will amaze you. We have done 280 shows of this show thing that we do. We've done 280 of these. We're recording this episode live on April 13th, 2023. I had to double-check that cuz I made some updates to the automation. This is Human Factors Cast. I'm your host. I'm gonna say it: Barry. Nick. Nick Boic fan. Banana. Banana fan. Fa Fifi Mok. Nick Roome. I'm joined today by Barry Kirby.
[00:00:29] Barry Kirby: Oh, that's brilliant. Thank you to the person who put that in the notes. Great to be here. That's just made my night.
[00:00:35] Nick Roome: We've got a great show for you tonight. The snark is in full effect in these show notes, that's for sure. If you are detecting an attitude at the top, that is certainly what's going on here. We'll be discussing how preschoolers prefer to learn from a competent robot rather than incompetent humans, such as the ones that are hosting this podcast right now.
After that, we'll be answering some more questions from the community, such as: what do you do when stakeholders ignore your research and feedback? And do you have any ideas on what to do when you've lost your data, in the sense of a recording? And lastly, we'll talk about antagonistic customer experience and UX relationships.
I love it. But first, I've got some programming notes for you. If you are unaware, we have some conference coverage coming out for you literally as we speak. There's some interviews out now with the student design finalists. All of this is from the Human Factors and Ergonomics in Health Care Symposium that was just earlier last month, plus some other friends of the pod talking about cool topics like medical challenges and human factors.
We also catch up with Joe Keebler and Terra Cohen about the latest from the conference and with Faran and Tony about the journal. So lots of cool stuff to look forward to. They're all in your feed right now. Go listen to 'em if you're interested in all that stuff. And there's one more to come.
We're just waiting on some final approvals before we actually release it. So look for that soon. Barry, what is going on over at 1202? I must know.
[00:02:00] Barry Kirby: On 1202, we talked about naturalistic decision making, or NDM. This is with a friend and ex-colleague of mine, Rob Hutton, and he's done an awful lot of work. He's lived out in the States, he's worked out in the States, and came back to the UK. And like I said, I worked with him in my previous job.
He gives a real insight into naturalistic decision making, how it came about and how it's used. And actually it was quite eye-opening for me. It's not something I've had a great amount of experience with before, and it's certainly one of them where I then went away, downloaded a few papers and started reading them. So thoroughly recommended. And I wonder if there's any of these that I've not recommended. I'm pretty sure I say thoroughly recommended for all of them, but then I'm obviously biased, so yeah. Thoroughly recommended. Go have a listen.
[00:02:43] Nick Roome: Yeah, it's better than this episode. This episode sucks. I'm gonna make the naturalistic decision to go and move us into the next part of the show. That's right, this is the part of the show all about human factors news. Barry, what is the story this week?
[00:02:56] Barry Kirby: So this week we're talking about how preschoolers prefer to learn from a competent robot over an incompetent human. In a recent study, preschoolers aged three and five were tested to determine whether they prefer learning from a competent robot or an incompetent human. During the study, the children watched a video of a young woman and a robot called NAO that sat side by side, pointing to familiar objects between them.
The robot labeled the objects accurately whilst the woman labeled them incorrectly. The children were then presented with unfamiliar objects that were labeled with nonsense words by both the robot and the woman, and were asked which name referred to which object. The five-year-olds were twice as likely to choose the robot's correct labeling over the woman's, whilst the three-year-olds didn't really show a preference. The study also found that the morphology of the robot didn't affect the children's trust strategies. However, the study also found that the three-year-olds struggled to distinguish between biological organs and mechanical gears, while the five-year-olds were more likely to assume that only mechanical parts belonged inside the robot. The study is important to help us understand how technology could effectively facilitate learning as we continue to live in an increasingly technology-rich environment. So Nick, what are your thoughts on a younger Nick spending his years being taught by a walking, talking ChatGPT app?
[00:04:17] Nick Roome: Given the amount of hallucination that ChatGPT has, not great. But I put in here: the kids are AI-ight. Anyway, bad joke. I think in a lot of ways adults would rather learn from competent AI robots than from incompetent humans, or deal with competent robots over incompetent humans.
I think we have a naturalistic preference towards competency. My first thought on this immediately goes to bias in algorithms. If we're teaching, then how do we accommodate for these things in terms of racial bias and gender bias built into these algorithms from the get-go? I will say, assuming those issues are solved, there are some other higher-order things that I am a little concerned about, curious about. There's the whole social interaction piece. What happens when we take away that human element of teaching? Is it going to be effective in the long term? You might be able to learn things, but is it going to stick? Is it going to be effective for not just this cohort, but future cohorts, et cetera?
There's also this interesting question where the teacher is seen in most cases as an authority figure; they are the adult in the room, literally. So what happens when the adult in the room is the robot in the room and they don't have the same level of awareness? Here in the States, we have unique problems facing our school children, such as bullets hitting them when they're sitting in their classrooms. If a robot is teaching, are they going to have that level of awareness to get the kids to safety? It's terrible to even think down that route, but there are situations. Let's take it back a notch and say natural disasters: earthquakes, tornadoes, fires, you name it. Wherever you're geographically located, there's probably something. What's going to happen in a classroom setting where a robot may not have those environmental cues, or I guess the sway with the children, to lead them out of that environment safely?
Then the last thing, I guess, is: in the classroom setting, does this then change the dynamic of what the students are learning? Is this just limited to a classroom, or can this be applied to homeschooling too? Can you put them in a room and will they learn with an AI in that room? I don't know. Presumably, if it's competent. But how do you then match that learning to the students?
We talked about this in a different episode, but there's a whole interesting conversation about what AI can do to enhance learning. And I just go back to that conversation because there's the sort of tailored approach that you can take for students. And, I promise my thoughts are almost done: if you incorporate that AI and bring it to a teacher who has awareness of a larger classroom, that could actually be really effective for addressing most of the class's educational needs. And so for that reason, I'm excited about it. Barry, what do you think?
[00:07:41] Barry Kirby: Big, deep breath. So if education was just about the delivery of information and the uptake of information by the young eager minds sat in the classroom, then a robot could probably do it. In fact, it could be a lot more accurate, because it would have all the facts available. If everything was there, you'd know what would happen; you'd know that every class would be delivered in exactly the same way, because it would be programmed to. Brilliant. But what is it that a teacher does over and above the facts and the lesson plans and things like that? They can inspire, they can motivate, they can give you an enthusiasm or a passion for a subject.
I don't see how a robot could do that. How does a robot motivate you? How does it get you excited about a topic? Maybe you have an art teacher who's passionate about the art, or a math teacher who's passionate about doing that, and that passion can ignite something in you.
But then I flipped that and said: actually, were all of my teachers that passionate about their subject, that I was ignited? Clearly not, because my foreign languages are a bit rubbish, and certain things didn't hit my aptitude. So actually, is it just a few teachers that ignite you with a passion?
And therefore, does it matter too much? I do think for younger students, possibly, that's not where this is most beneficial. You hit upon it, I think: for older students, secondary schools, sixth form, university, you could use that robot in a way that could be hugely beneficial. Not perhaps at the front of the classroom, but what about as the assistant? What about one of these robots per small cohort of children, a small cohort of students, or one to one? So a true cobot, human-machine teaming idea. But this leads me on, and I'm just gonna get my soapbox out and metaphorically stand on it.
I think this highlights a bigger issue, and I will declare: we home educate our children, because I think that the school system is somewhat broken. I think if we get robots in at that age to do what is being professed, then it will further destroy individuality and creativity within children.
The moment we have a classroom-based approach, and we've driven classroom-based education younger and younger, down to nursery and things like that, it's really a Victorian approach to education. It's made to teach by rote. It's made to keep children busy while parents go to work, but also to just chuck loads of information at them. And then that drive around the need to read has got younger and younger, which is actually unhelpful in children's development. We try to get them to sit down in a classroom for however many hours a day, which is not helpful. And if we then replace the teachers who do that with robots, then where are the kids gonna go to get outside, to get their hands dirty and their feet wet, to learn about what it is to be in society, to learn how to interact with other kids, to learn how to live?
So I think that if we are pushing this type of technology at this age, and that's where I'm hypercritical of the article itself, then we're just gonna reinforce an antiquated approach. And there's a really cool TED talk about this, which I can talk about post-show. But fundamentally, yes, I think robots have a place, cobots have a place, in older education elements. But please keep it away from preschoolers, or from young schoolers, would be my thing. I'm just gonna go and put my soapbox away. Did we get any social thoughts, anything worthwhile to bring up?
[00:11:12] Nick Roome: Yeah, we got a couple, actually, and I'm gonna bring up some here live. Alex is watching along with us. They ask: how will they interact with the creative thought of children as well? Then consider what analytics the robot is reporting back, and when do we dive into consent for that feedback being recorded? So there's an interesting question on consent. Barry, I know some folks commented on your Twitter and Facebook. Do you wanna go over some of those?
[00:11:36] Barry Kirby: So David Godwin Leah commented and said anything will be better than homeschooling. I think that was a bit of a dig at me, cause I've known David for a long time and he knows we homeschool. And then my friend Duncan on Facebook also said: yes, we could use hunter-killer teams for truancy. I don't think that's quite what they're looking for, but it's cool.
[00:11:55] Nick Roome: Yeah. Before we get into some of these more interesting questions, I want to just recap the study and talk about it from a methodological standpoint. I think it's important to cover. Basically, what they're looking at, and you said this in the blurb, is a humanoid robot, NAO (is that how it's pronounced, "now"?), and a truck-shaped robot named Cosmo. So there's two different types of robots. Now, I understand, Barry, that you may have met one of these robots.
[00:12:26] Barry Kirby: I've met NAO, we've had a conversation. Oh, in fact, NAO's danced for me.
[00:12:30] Nick Roome: Oh
[00:12:31] Barry Kirby: One of the partnerships I've got is with Cardiff University here in Wales, in the UK. Part of their team is the Centre for Artificial Intelligence, Robotics and Human Machine Systems, which is not easy to say, so they call it IROHMS.
And they've got NAO and a couple of other robots as well, where they've been engaging with how human-machine teaming works. In fact, one of the projects that I'm trying to get off the ground at the moment, and we've talked about climate ergonomics before, is the use of climate ergonomics in a command-and-control situation with NAO alongside human operators, to use as a cobot to ask questions and things like that. The other things that they've done there are within medical settings as well, so within hospital settings, to be able to take notes and do things like that. So yeah, it's interesting, because it's actually quite small. It probably comes up to maybe your waist, maybe a bit more. So it's actually quite a cute humanoid robot.
[00:13:22] Nick Roome: Is it child sized?
[00:13:24] Barry Kirby: Yeah, it's that sort of thing. And when it's articulating, talking and stuff like that, I think it's quite easy to slip into that idea that you're having an engagement with it, and it's meaningful. So yeah, it's quite cool.
[00:13:39] Nick Roome: That's cool. So just to recap what's going on here: the children had to decide between labels that were offered by the robot and by the human for unfamiliar objects. And the three-year-olds ended up showing no preference, but the five-year-olds preferred the robot because it was more competent.
When you look at the methodology of this, there's some interesting pieces that I wanna specifically call out. This wasn't done in a classroom setting; this was done over Zoom. So you're also taking that into consideration. This wasn't tested in a classroom setting, and I can imagine that'd be a future study that they'd wanna do. Basically, they used a comparative approach looking at the two different groups, and human morphology was controlled for by the different shapes of these robots, right? By using Cosmo versus a NAO, one truck-shaped and one humanoid, and that didn't matter. It's a really unique study in a lot of ways because it's comparing a human speaker versus a robot.
And that's not something that we've seen a whole lot, especially when it comes to children. For me, there's a couple things that I think about. One of them I already brought up here was the Zoom meetings. But the other thing is that this is just a report on the scientific article. This is not the actual article; we didn't have access to that. So the sample size, we don't know what it is. We don't know any of the further details. But I think this opens up a great discussion for human factors and what this actually means for education, learning, children's issues, all those topics.
Barry, where do you wanna start?
[00:15:14] Barry Kirby: So, I dunno. What's interesting about this is firstly, again, that the study didn't really investigate the quality of output. Yes, we had the human doing some stuff wrong and the robot doing it right on purpose, but there was no real investigation into which mode the students retained information better from, if that makes any sense. They pushed it out into which they perceived as being more competent. But actually, that's very subjective. There's no real objective measure of quality of input. I think there's some bit here, and I've already mentioned it and I think you mentioned it too.
It'd be really neat to go and actually work out where the quality is in other age groups, and how different age groups would utilize this sort of technology. So not just opening it up to being the person at the front delivering that traditional teacher role, but what other roles could bring something there. And this was supported by one of the social thoughts that we had from Mark Jones on Facebook, who thought certainly older students would be there, though he's not sure about primary ages. He said: I'm sure there's probably already been some sort of tests taken already. That's a very cynical approach there, Mark, but possibly not wrong. Nick, where do you wanna go?
[00:16:35] Nick Roome: You can think about, as you were mentioning, what if you did this in addition to a teacher? What if you put an aide in the classroom, put the robot in as a teacher's aide, and basically incorporate a lot of these features that the children find appealing, the things that work about a robot teaching children?
And this is John Putty on Facebook: Absolutely, yes. In your discussion, it's important to separate the intellectual element of the robot from the physical embodiment. My three- and five-year-old kids already interact with Alexa totally normally. For them, it's not some new tech; it's just normal. Now, couple Alexa with a large language model, and you have an intelligent personal assistant who can learn whether your kid needs this explained in a different way. It actually gets so much more inclusive, as each kid gets the support they need. In terms of physical manifestation, kids can in some ways choose what they want, but it could be augmented. Sorry, I'm losing my train of thought here. There's massive cybersecurity and safeguarding implications, though: imagine hacking the brains of an entire generation by hacking their AI assistants before they're out of nappies.
Yes. We have some more thoughts on cybersecurity later, but that's a great point, John, and I think there's a lot to chew on there. You're right: if we raise these children with it just being a normal part of life, then you can think of a robot just being in the classroom as an aide. And in fact, I know some teachers have these devices in their classrooms so that they can ask questions live. If a student has something, they can teach them how to use that technology and use the resources that are available to them.
And I think you're right: once you start incorporating those large language models, and, as I mentioned earlier with the artificial intelligence, being able to design specific coursework for a student based on their needs, then I think this becomes insanely powerful. Because not only could you do it with in-classroom learning, but you could do it, like I said, with coursework: target questions that are challenging to them, but also give them questions that are just at their level that they could answer and feel confident about. And you get this right mix of confidence so that it feels challenging, but also rewarding to do. If you match that to the student's level, then it's just going to be this perfect, awesome combination. Yes, the cybersecurity issues are there; we can talk about those later. But I think that's a great social thought.
[00:19:15] Barry Kirby: I think the point that he raises is something: our kids are already using Alexa. Our kids are already using basically automated, technology-based tools that are in the home. In fact, if you're in our house, they're everywhere. So this is really no different. We see a step change, and this is a cultural, digital-native versus digital-immigrant, next-gen sort of issue: we see this as a big step change, that we're going into something cool, but for the kids of today, the young adults of tomorrow, this is what they're growing up with. Their phones already give them all this information at their fingertips.
Having spoken to my kids and some other younger people, they almost don't see AI as actually a massive thing. ChatGPT is just something you tap a question into and it gives you some answers. It's a better Google. It's not necessarily something amazingly clever; it's just what they expect technology to do. So that's interesting. For them, this is probably just gonna be expected, isn't it? It's just gonna be that thing. John, I thought that was a really good interaction there.
[00:20:20] Nick Roome: Yeah, I think there's some interesting questions around whether or not this will change the way in which children perceive and interact with technology. I don't think that's necessarily true if they grow up with it like you were mentioning, right? I think there's a lot to be said about growing up with a smart device in your home that you can ask questions, or if they're growing up with a robotic teacher's aide, that's not gonna be very different for these children who are growing up in this environment.
Like I was mentioning earlier, this also highlights the importance of competency. We have this natural inclination to go towards competency rather than incompetency, and I think that's highlighted by this. Children are not stupid. They are learning how the world works, but there's this natural inclination, even from a young age, to go towards the person who knows what they're doing. And that's comforting to me in a lot of ways. So that's good. And then if you think about the future of education and what this means from that perspective, there's a lot of really interesting points.
Do you want to mention this social thought here?
[00:21:35] Barry Kirby: Firstly, before I mention that, I think there's something around the way that we put courses together as well. The idea that we can support students on their individual learning pathway is absolutely key. So if you've got an overall module, you can then tailor it, by the use of cobots, to give individual support, to rephrase things in a way that is particular to that student's individual learning needs. It's really good.
And Susie Broadband made the comment that in the age of cobots, it's probably actually good practice getting them used to it for the future world of work, much as they started to introduce computers back when she was in junior school. It is the future. The key, as with all tech, is to work out which tasks they are better at than humans and which ones humans are better at. So this goes back to almost what we said at the beginning: is the robot able to do some of these things? What is it a teacher does that is just human, that a robot probably at this stage couldn't do, though it might be able to in the future? Which might also bring in the other social thought that's just coming through on LinkedIn.
[00:22:45] Nick Roome: Yeah, I was just gonna bring that up.
[00:22:47] Barry Kirby: Look, it's like...
[00:22:49] Nick Roome: Yeah. Emily on LinkedIn, some great points raised: how would these robots approach working with learners with learning difficulties? Also, how would they be able to replace the emotional support teaching staff provide? Both very good questions.
Let's talk about the learning difficulties first, because I feel like that's the easier of the two. I mentioned it before: the artificial intelligence being responsive to a student's needs. I can imagine in a classroom setting, if you had those with learning difficulties, you might set them in a different classroom setting; there are special education classrooms that are geared more towards those with learning difficulties. But we also need to think about it from the other perspective, of advanced education settings, where you're beyond your peers and you need more stimulation to get you to the next level and keep you engaged. Otherwise it's just not going to be great for you. And so I think this is where that adaptive learning comes in, and I guess I could plug the other episode that we did on this, which was, let me find it, episode 223, on how AI can improve learning. I think that discussion captures a lot of that question.
Now, the second part of this question is: how would they be able to replace the emotional support that teaching staff provide? That is something that I struggle with, because if you were to just put a robot in the classroom, then you lose a lot of that. I think we're still a long way out from having that emotional relationship. Maybe not, because we talked about this too, emotional relationships with AI and robots, but...
[00:24:29] Barry Kirby: Talk about that?
[00:24:30] Nick Roome: Let's not talk about that one. But I think we are still a long way out before we all feel that same level of emotional connection to an AI or a robot that we do with humans. I thought there was another social thought on this one too.
Barry, what are your thoughts?
[00:24:45] Barry Kirby: The first point that she raised, around working with learners with learning difficulties, I think we can broaden out to just understanding that different people learn in different ways. So if you've got the message that's being delivered, then each student could potentially have this cobot that understands what the specific needs are for each student. Somebody might have particularly strong learning difficulties; people could be deaf or blind, so physical issues that they need to deal with. They might need some sort of conversion, some sort of translation. It could almost be that one-stop hub for supporting each student in their particular need, in order to be able to get the message across. It might be the case that they need the problem broken down in a different way, or the lesson delivered in a different way, or just broken down into smaller chunks for them to learn. And it could help do that.
And it could help do that. So I think there's a lot to do with that. But I do think as you said, the, that emotional support piece is firstly, do we want could it happen? I think it probably will at some point. But is that something we actually want? Do we want to be able to give AI the same level of emotional connection that we do with a human being?
That's something that we need to probably dig into at some point as a society is. Do we want to be giving, I dunno, Terminator, the same the same hugs as we do our children? I'm not sold on.
[00:26:14] Nick Roome: We do have some more social thoughts on this very topic, so I feel right now is a good time to bring them in. Chris on Twitter says: I find it hard to see a robot taking on the emotional support role teachers provide. I see AI and other similar technologies like calculators; it's an assistive tool. I don't know, do you agree with that, Barry?
[00:26:33] Barry Kirby: Yeah, I think definitely we need to be looking at them as part of the toolkit at the moment. They've definitely got a use; there's definitely potential there. But there are definite gaps at the moment. Just to follow that, Glenys sends in a similar thing, in that she highlights that little ones need nurturing to be encouraged to learn, and they're keen to learn if they get a cuddle now and then. If they make a mistake or mess up, then that's fine. She's not sure a robot could provide or replace that human touch. People need people, both large and small. That's quite prophetic.
[00:27:09] Nick Roome: As a counter-argument, robots could be programmed to simulate that empathy. Is it true empathy? It may or may not be, probably not if it's a robot doing it, but is it what the human needs in that moment, and does it make sense to do? It could allow those human teachers to focus on providing that personalized attention to the students, where the human teacher would be doing the warmth, the human emotional piece, and the robot could be teaching. That could be another piece of it too, right? Where you almost have the human going over and investigating. How do those roles play together? It's a very interesting question. I just don't know; we don't have enough information.
[00:27:49] Barry Kirby: Elsa came in with a comment, again on Facebook. We assumed here that robots wouldn't be able to be as enthusiastic as teachers, but she comes back and says they may actually be more enthusiastic than some teachers are, that's for sure. Personally, she thinks robots would benefit older children more, so let's say secondary school age. Her children, aged eight and four, would love the robots, she's sure, but she doesn't think they'll get the reassurance and encouragement they get from actual teachers. But then again, as she says, she's never been taught by a robot to know any different, which is actually a really fair point. We've actually been in this situation. How many courses do we go through now that are just recorded, web-delivered lessons? Your health and safety stuff, all that sort of stuff: it's just delivered by basically a recorded webinar. We still absorb that information. We do it, we get bored by it. Maybe this would be a more exciting way of delivering some of that stuff.
[00:28:42] Nick Roome: Yeah. I want you to act as the best teacher possible. Like, seriously though, because you're right, the social post is right. I think there are a lot of questions about how we engage students, and can we create the perfect teacher using AI and robotics and put that into every classroom, one that will react to the students' needs?
To me, that's actually really exciting if you think about it from that perspective: how can we get the perfect teacher? The perfect teacher doesn't exist, because it's largely dependent on the student-teacher relationship. There might be a good match somewhere, but everyone's going to have different teaching styles.
Everyone's going to have different learning needs. And if you could come up with a teacher that somehow investigates the needs of the classroom and teaches to the center of those needs, to the median, then you're getting at most of those, and the rest of the issues or difficulties can be picked up around the edges. I don't know.
It's just really exciting to me. Really exciting.
[00:29:54] Barry Kirby: But then, almost taking that one step further: yes, they could teach the median, but actually they could teach the individual. If they know that student A prefers it this way and student B prefers it that way, they could truly adapt the lesson as they go through, to be reactive to what happens.
It's really interesting. But then are we getting away from one of my big criticisms, I guess, with the way that we educate? Now, I've got lots of criticisms about the way we educate nowadays, but we educate in large groups. We educate classes of 30 here in the UK because that's the most cost-efficient way of doing it.
We have a teacher, and they might have a couple of teaching assistants if they've got a school that can afford them, and we have to teach to the middle of the road. And in fact, there was some research done that showed that out of an entire day's worth of school, six or eight hours, there was, I think, about an hour and 40 minutes of actual education; everything else was just looking after children, childcare.
And so if we had this ability to make it a bit more individualistic, would they be more motivated? Elsa has also just come up in the chat and said, actually, the perfect teacher for one person wouldn't be the perfect teacher for another individual. So would it be the same for the robot?
I think you're absolutely right. Unless that robot could almost multiplex between them all, but then you might as well have one-on-one teaching, or at least smaller groups. The other bit that we haven't touched upon, which we've edged around, is resourcing.
One of the biggest issues we've got here in the UK is getting teachers into schools. And we've actually got a lack of male teachers; we have more female teachers and very few male teachers, particularly, I think, in primary education. So, to try to put a positive spin on this, would this be a way of solving that teaching crisis? You have one or two human teachers setting the curriculum, setting the educational standards, setting what the lessons are gonna be, and then they have their drones go and deploy them to the,
[00:32:00] Nick Roome: You'd have the other argument, of course, which would be that the robots took our teaching jobs away. But you're right, there are so many available jobs, and there's a need for more education. The more students you have per teacher, the more the quality degrades for each of those students.
And so the more educators you have, robotic or human, as long as they're the thing that's going to be best for the student, I think that's a net positive. Then of course, you're always going to need a human in the classroom, if anything, to babysit the robot, or to be that level of awareness if there is anything environmentally going on and they need to get to safety or something like that.
I think you'll always need an adult human in the classroom, but will they need the same level of training as educators do now? I don't necessarily think so, if you're introducing a robot that is the ultimate teaching machine in the classroom, right? So there are some interesting for-and-against questions here.
Yes, it might take away educators' jobs, but you can still have people who are passionate about it without needing to go through all the training. Certification, certainly, because you're dealing with young children, but training maybe not so much, and that might be a net positive thing, I think.
[00:33:22] Barry Kirby: Yeah, and I think fundamentally it's a bit like AI, when we talked about the influence of ChatGPT and said this could almost reform the way we're thinking about what we're doing. This could be a catalyst for us to rethink education, because there are different teaching styles, and certainly it's something we've looked at previously. The school system is what they call, is it FTVM? Fixed time, variable mastery. So you spend a certain amount of time in education, and you come out with a grading of how much you've learned in that time. What this would allow us to do is, yes, you've still got fixed time, but actually personalize the education.
It allows you, if they finish things really quickly, to maybe cram more stuff in, or do less and focus on getting the basics right. It should allow us to really shake up education and think about what education is. I will bring in, I think, the last social thought that we've had.
Because if I don't, Amanda will be upset with me. Where we talked about whether children would still prefer learning if the human was equally competent, she highlights that at primary age, unless robots are helping them climb trees, helping break out the arts and crafts supplies, experimenting with Newton's laws of motion on rope swings, helping them understand how plants grow or how tadpoles turn into frogs, what use would there be? But at secondary age, she had teachers who pretty much read from a script and might as well have been robots. The good ones are inspiring and spark curiosity, and she's not sure that's a robot's bag either. Exam revision, maybe, really good. So again, I think we're identifying roles that they could do, and possibly do really well, but also roles that at the moment we think they would struggle with.
[00:35:03] Nick Roome: This has been a fantastic discussion so far. We're getting to the point of the show where I might need to move us along, but I think we can each open up one more can of worms and then move on. Barry, open up your last can of worms.
[00:35:15] Barry Kirby: So for me, the last can is actually something somebody else raised. Alex raised what happens around the social bit, sorry, the data protection element of this. If we've got robots in each of these rooms, and they're analyzing the children to work out what the best teaching mechanisms are and delivering that information, where is all that information going, and how is it held? I'd like to live in a light and fluffy world where everybody does everything for a positive reason and there are no malicious actors, but we know that there are loads of malicious actors.
So be that actually malicious or just commercial, what is the unintended consequence of having everybody taught the same stuff the same way? Because the curriculum is gonna be whatever the curriculum is. You talked right at the top of the show about how ethics play into this.
Where is the bias? There's an argument that we're assuming here the teachers aren't biased. So what is the unintended consequence? It would be fun to do this, but I think it will also spin off things that we weren't necessarily thinking about.
And somebody's gonna make a lot of money.
[00:36:28] Nick Roome: Yeah, somebody will. I think we've gotta treat that data, the children's learning data, very similarly to how we treat medical data: protect it, all that stuff. The last can of worms that I would like to open up is, does this use of robotic teaching assistants or teachers create a dependency for the children, potentially, having access to this technological solution?
And does that hinder them from being able to learn in environments without technology? If you can ask your teacher anything, and it can access the internet and get you a response, and it can deliver that information in a way that's completely personalized to you, does that then ruin learning for children in other environments, where you're basing something off of somebody else's knowledge and they may not know it to be true?
And what is truth? That's the last thing I'll open up. Excessive reliance on these robot teachers could really leave a student unable to adapt to non-technological learning environments. The other side of that argument would be that you could integrate that approach and have both the traditional learning methods and the technological methods in a classroom setting, so that they feel like they can learn in any setting, right? So if a student were to ask the robot, what is the answer to this question,
it can say, instead of just going straight to Google, let's actually think about that logically, and encourage them to think about things from a logical perspective. I think that's the right approach. But it's just an interesting question that I think about, this reliance on technology.
Because when was the last time you went a day without your phone? For better or for worse, it's a part of our lives now. And is learning with that technology going to be part of our lives too? I would say yes. There are so many things that we didn't get to; maybe we'll get to some of it in the post-show.
But thank you to everyone this week for selecting our topic, especially our patrons. We wanna thank our friends over at Concordia University for our news story this week. Special thank you to all of you who commented with the #HFCSocialThoughts hashtag to share your thoughts with us. You can always use this hashtag; if we like your comment, we may read it on the show like we did tonight.
If you wanna follow along, we do post the links to all the original articles in our weekly roundups on our blog. You can also join us in our Discord for more discussion on these stories and other things too. We're gonna take a quick break, and we'll be back to see what's going on in the Human Factors community right after this. Yes, huge thank you, as always, to our patrons. We especially wanna thank our Human Factors Cast All Access patron Michelle Tripp. Patrons like you truly keep this show running and operational. We keep going because of you. So let's talk about that, let's talk about All Access. What does that mean? Well, welcome to the Human Factors...
This is the dumb commercial read. Welcome to the Human Factors Cast live show. Today we'll be talking about our Human Factors Cast All Access and Human Factors Cast VIP tiers. For just $20 a month, you could become, look, okay, I'm gonna go off script here. A buck gets you in the door. Let me just be clear.
A buck gets you in the door, really helps the show. But for $20 a month, you can become part of our exclusive community. Get recognized on our show with a shoutout every week. That's right. We'll be shouting out your name loud and clear every single week. Plus, you'll get access to all of our premium learning resources through Human Factors Cast Academy.
We're talking eBooks, webinars, courses, all of which are currently in development. But if you're looking for something even more exclusive, there's our Human Factors Cast VIP tier, and that's for you. For $50 a month, you get to choose any topic you want and host your own discussion with yours truly, myself, and Barry Kirby; we will sit there and we will talk about whatever you want.
Human Factors Cast related or not Human Factors Cast related. It's your own hour-long podcast that we will sit there and do for you. So yeah, you heard that right: you can host a show with us. So become part of the Human Factors Cast community and get access to all the exclusive content and benefits that we have to offer.
Just remember, these tiers are for people who like to be recognized and appreciated, but also who have an extra 20 or 50 dollars lying around each month. So there's that. All right, let's switch gears and get to the next part of the show. The camera's bouncing around everywhere. All right, yes, let's switch gears and get to It Came From. This is where we search all over the internet to bring you topics the community is talking about. If you find any of these answers useful, give us a like wherever you're watching or listening to help other people find these questions and answers.
We got a couple tonight that, really, you'll see. All right, this first one here is from Responsible Fruit One on the UX Research subreddit. As a UX researcher, I get asked for feedback by product managers, but some stakeholders ignore my suggestions. How can I handle this situation without conceding and having problems fall back on UX research?
Barry, what do you think?
[00:41:47] Barry Kirby: Stakeholders will hold stakes. For me, this is part of the job. You can do your technical stuff, you can do the research, you can have some amazing insights, but you've gotta go and prove that what you're doing is good stuff and that it'll add value.
Just going in and saying, please, I've got this insight, it might help, isn't going to cut it. We're a rare breed; there's not that many of us around, and sometimes it can be quite easy to railroad what we say. So sometimes you've gotta be loud and proud about it.
If just being loud isn't enough, go be a pain. Go and get in somebody's face, but in a nice way, obviously. I'm not advocating, well, sometimes maybe, but no, do it in a nice, helpful way. Go and show value; show that you're not just being loud for the sake of it, but that you actually can change the product for the better.
If all else fails and your advice wasn't taken, and you can then prove that because your advice wasn't taken it wasn't very successful, learn the art of saying I told you so without actually saying I told you so. It doesn't actually help in the grand scheme of things, but it does make you feel that little bit better afterwards.
Nick, what do you think? How would you solve it?
[00:42:59] Nick Roome: Are you saying that you would rather hold the steak?
[00:43:02] Barry Kirby: Maybe.
[00:43:04] Nick Roome: I put in my thoughts here. This question again, really? Who picks these? I do, I pick 'em. But look, here's the thing, I'm gonna keep beating this drum: you get them involved, you get them as part of the process, you get them to understand your perspective, et cetera, et cetera.
I'm tired of answering this question; it keeps coming up. If that doesn't work for you, quit. Stop doing work, just get paid for sitting there. I don't care. Is that bad advice? I don't know.
[00:43:38] Barry Kirby: I don't think it is, because all you can do is your best, right? If you can say you've diligently done your research, you've come with your recommendations, conclusions, whatever it is, you've passed them on to the right people, and you've given them the right amount of effort, then it's not your fault.
You've done the best that you can possibly do. Yeah.
[00:44:00] Nick Roome: There's the other side of the coin: just do the bare minimum of work that you need to not get fired, because then you're still getting paid the same amount that you would've gotten if you were overperforming and overachieving. And if you're at this point, where all these people have asked this question before, and this person too, if you've tried your best, if you've literally done your due diligence, then why would you put in an ounce of effort beyond the minimum that's required of you, if you know this is not going to change the culture? It's just a larger question.
[00:44:33] Barry Kirby: personal pride
[00:44:35] Nick Roome: Yeah.
[00:44:36] Barry Kirby: boils down.
[00:44:37] Nick Roome: All right, let's get into this next one here: Zoom lost my recording, any ideas? This is by Laro Stars on the UX Research subreddit. Hey there, I lost a recording from a Zoom interview that I really need.
Zoom says they can see it, but it's not accessible to me. Any ideas on how I can recover it? Has anyone experienced this before? Barry, I wanna focus on the losing-participant-research-data part of this, not the Zoom technical piece.
[00:45:03] Barry Kirby: Okay, so in a bid that doesn't help this person whatsoever: backups on your backups. It does happen, it absolutely does happen. How many times have you, Nick, helped me out of a hole with my 1202 recordings, if nothing else? But I've been there where you've had a recording, or you even think you've got it written down in a notebook, and you can't find the notebook.
It happens. But I now have the mantra that if I'm doing something, I have two recording devices and notes. I might not make the notes, but somebody will be delegated to make notes of the main things. So that's basically my whole backup-on-your-backup type approach.
You take two recordings: things like Dictaphones or whatever if you're doing physical interviews, or if you're doing Zoom, try to have a local device hanging off it as well. Whatever platform you're using, getting an audio output isn't too difficult now. It used to be quite hard, whereas now almost any Bluetooth device would do it. Just do a double backup.
However, if you're in the situation where you've lost that piece, I would do effectively a cognitive walkthrough. Replay that whole interview the way that you did it, step yourself through it, almost try and relive the experience, and then try and narrate it to yourself.
So do a recording with a Dictaphone or some sort of device, and just walk yourself through what happened in that interview to try and recreate your stuff. If you took no notes at all, nothing else, then walk it through. Even if the notes you took are just doodles, the doodles have a way of helping you remember what happened.
So do a walkthrough while looking at the notes, and try to remember what you were doing when you created the doodle or whatever it is; that will help you recreate it. And then learn for next time. Nick, how would you do it? What would you play with?
[00:46:49] Nick Roome: Losing data sucks. It's happened to me before; it's happened to a lot of us. This specific question sounds more like a technical limitation with Zoom, and I think your double backup of everything is a little excessive, personally. I used to double back up this show and have everybody do local recordings, and that was way too much every week.
So now we just wing it, and if it messes up on us, then it messes up on us, and that's it. But really, what we're looking at here is having control over the situation, and I think that's really what's key here. So you've relied on a recording from Zoom; maybe use technology that's local to your computer that you can record with, right?
As I say here, record on a third-party source. So there's that, but there's also the fact that you have control over how you record. And cloud recordings are bad for many reasons; I don't like them. If they're your only option, fine, but you have control over that. If you lose control over things, then it's not gonna be great.
There's other ways that you can back this thing up too. Now, there's great AI tools that will summarize your entire meeting like paragraph by paragraph, which are great. And you can implement those in your workflow process. So if you don't have the recording, you at least have the AI notes. And then what's even better than AI notes is a human sitting there right by your side as a dedicated note taker.
So in the event that the interview does become a corrupted file, or you can't get access to the recording, you at least have that dedicated note taker's notes. And that's what you're looking at in terms of options. I don't know. Zoom sucks. I'm sorry, I know those are fighting words, but all right.
Let's get into this last one here: antagonistic CX relationship. That's customer experience, for any of you who are unfamiliar. This is by Akane, that's a long name, on the UX Research subreddit. They write: as a member of the UX research team, I find our relationship with the customer experience team to be toxic.
How can I improve this situation? What are some tips to deal with the customer experience team's negative attitude and manipulative behavior towards UX research? Has anyone been in a similar situation, and how did you deal with it?
[00:49:04] Barry Kirby: Oh, you mentioned it was a CX team. Can we stop making up crappy team names? Experience begins with an E; there's an E right at the front of it. CX? Oh, anyway. Fundamentally, we've done this one on a number of occasions: get the human factors flag. Make a flag, put human factors all over it, or ergonomics, or UX in your case. Fly the flag; the flag comes with a pole.
Do what you want with that. But no, the serious answer: it's a relationship. It's a relationship like you have with a client, like you have with superiors, like you have with other teams. You've gotta work at them. And yeah, you can have some really bad relationships here, but just saying that they're toxic, great, that gives you a challenge. You've gotta work with it. Biscuits, cake, wine can work well, but maybe some sort of joint team bonding: go to the pub, or an escape room, or some sort of challenging thing, or go play golf. I've never really played golf, but I've played golf once, and I did that because I couldn't get hold of the people I wanted to in any other way, and I did it as a work thing. But yeah, it takes effort with people; it doesn't just happen. Nick, how would you solve this in a slightly less frivolous way than I have?
[00:50:16] Nick Roome: I think this is probably one of the best pieces of advice that I've ever given: battle royale.
[00:50:20] Barry Kirby: Okay.
[00:50:22] Nick Roome: All right. Now it's time for One More Thing. No, look, it's exactly what I said for the first one. This question, again. Barry, you've said it multiple times on the show before: we are the glue that holds everything together in a lot of ways. And they are an internal stakeholder; you need to play nice with them.
And if they're being toxic to you, then understand why. I think a lot of your advice, Barry, is sound; I think that's the way to go. You've gotta build a relationship with them in the same way that you build rapport with your users, and you need to do it. It's just a part of the job. And dealing with terrible people can be a part of the job too. Not always, fortunately; not for me now anyway, but in the past.
But I'm gonna give the same advice as for the first question: if it doesn't work, quit. Stop doing work, just get paid for doing the littlest amount. I don't know, I don't care. Just battle royale, that's the way. All right, let's get into One More Thing.
Barry, what's your one more thing this week?
[00:51:18] Barry Kirby: So we've just been through two national holidays. We had bank holiday Friday and bank holiday Monday off for Easter, so a good four-day weekend. However, everybody in my house was ill except me, so wife ill, kids ill, and it was a really weird experience, because I didn't wanna hang around in the house too much, because I didn't wanna catch the lurgy that's going around.
But I also couldn't get stuck into any sort of jobs or anything, because if there was any sort of DIY, I thought about getting the chainsaw out and doing some things like that, and I was like, actually, if I have an accident, then I've got no backup and no real support. I couldn't really go out anywhere very much or anything like that.
And so I spent a good couple of the days wandering around slightly aimlessly, literally, because I didn't wanna do any work. I actually made a conscious effort to not do any work; I normally would just go and pick up a couple of tasks, but I made a conscious effort to take a break.
But the weather was quite nice, so I wanted to be outside, and I was wandering about. Yeah, it was just a really weird experience of not being able to do stuff. Because I didn't wanna be too far away, I ended up not doing very much and just felt very aimless, but I don't think that was necessarily a bad thing. What about you?
[00:52:25] Nick Roome: I just imagine that meme of the guy just standing around waiting,
[00:52:28] Barry Kirby: Yeah, that was,
[00:52:29] Nick Roome: plastered all over
[00:52:30] Barry Kirby: yeah, that was
[00:52:31] Nick Roome: Oh man. Okay. Y'all ready for a Love Is Blind update. Oh, this show, man. This show is insane. Holy moly. Okay. I've been mentioning this every week and just when you think it couldn't get crazier, it gets crazier.
They drop new episodes every Friday, and this upcoming Friday is the last. This is the most insane season that they've ever had, and I can't even get into some of the spoilers, because it's been out a week and I know some people are waiting to binge the whole thing. But let me just say, the twists and turns of this season are insane.
There are people who are, geez, how do I say this without spoiling it? There are people making really poor decisions on camera. They know they're on camera, and they're supposedly committed to people, and it's just dumb.
It's really dumb and just fun to watch. So there's your Love is Blind update.
[00:53:27] Barry Kirby: So if I watch, because I've never watched any of them, do I need to go and watch a previous series first, or should I just go straight into this one?
[00:53:34] Nick Roome: Just go straight into season five, man. You're good to go. The only thing that you need to know about this show is the premise: there's a group of men and a group of women, and they date each other without seeing each other. They get into these, what they call, pods. And then for, I think, two weeks or something, they spend all day dating each other and talking to each other and falling in love with each other.
And once they've found their match, somebody proposes, and then they say, okay, yes, I'd like to marry you without ever seeing you. And then they go off and get integrated into their real lives together: work and school and finances and family and friends and all of it. And it's just insane to see.
And then they decide at the end of it whether or not they wanna get married.
This season is really good, and oh, this weekend they're doing a live reunion, which is something completely new. So anything can happen; it's live. It's like these shows: we're recording this live, anything could happen.
Anything, like me ending the episode. So that's it for today, everyone. If you liked this episode and enjoyed some of the discussion about AI and robotics teaching, I'll encourage you to go listen to episode 223, How Can Artificial Intelligence Improve Learning? That was a great conversation. Comment wherever you're listening with what you think of the news story this week.
For more in-depth discussion, you can always join us on our Discord community. Visit our official website, sign up for our newsletter, and stay up to date with all the latest human factors news. If you like what you hear and you wanna support the show, there's a couple things you can do. One, right now, wherever you're at, you can stop what you're doing.
Leave us a five-star review; that's free, you can do that without giving us any money. I love that. Two, and you can also do this without giving us any money: tell your friends about the show. That really helps us grow, and I can't tell you how many people have found the show because of somebody else telling them about it. If you do have money and wanna support us that way, just a buck gets you in the door with Patreon. It's a wonderful community of folks in our Patreon; you get access to a million different things. We have so much stuff, and we're always trying to give back to our patrons, because they truly support the show. And with that, I wanna thank Mr.
Barry Kirby for being on the show today. Where can our listeners go and find you if they wanna talk about learning about robots?
[00:55:47] Barry Kirby: That was very boring. You'll have to try that again.
[00:55:49] Nick Roome: Okay, okay, let me try again. Where can our listeners go and find you if they wanna talk about your experience with naturalistic decision making?
[00:55:55] Barry Kirby: Okay, that's certainly better. If you wanna come and get in touch with me and talk on my socials, then find me on Twitter and a load of other socials as well, just Barry Kirby. If you wanna come and listen to me interview amazing people from the human factors community, and for the human factors community, then come and listen to me on 1202 The Human Factors Podcast, which is at 1202podcast.com.
[00:56:16] Nick Roome: As for me, I've been your host, Nick Roome. If you wanna talk trash television and the Love Is Blind finale, you can find me on our Discord and across social media at Nick underscore Roome. Thanks again for tuning in to Human Factors Cast. Until next time,
[00:56:28] Barry Kirby: It depends,
Managing Director
A human factors practitioner, based in Wales, UK. MD of K Sharp, Fellow of the CIEHF and a bit of a gadget geek.
If you're new to the podcast, check out some of our favorite episodes here!