1
00:00:00,000 --> 00:00:20,000
[Music]

2
00:00:10,500 --> 00:00:15,960
This is GamesAtWork.biz, your weekly podcast about gaming, technology, and play.

3
00:00:15,960 --> 00:00:19,399
Your hosts are Michael Martine, Andy Piper, and Michael Rowe.

4
00:00:19,399 --> 00:00:23,000
The thoughts and opinions on this podcast are those of the hosts and guests alone,

5
00:00:23,000 --> 00:00:28,000
and are not the opinions of any organization which they have been, are, or may be, affiliated with.

6
00:00:26,500 --> 00:00:29,079
[upbeat music]

7
00:00:33,500 --> 00:00:37,159
This is episode 544. Are we bananas?

8
00:00:38,500 --> 00:00:41,079
[upbeat music]

9
00:00:46,500 --> 00:00:49,399
Good morning. Good evening. Good afternoon. Michael Rowe here on Friday,

10
00:00:49,399 --> 00:00:57,519
where we record our wonderful little show called GamesAtWork.biz with my friends and co-hosts Michael Martine and Andy Piper. Uh, Michael,

11
00:00:58,500 --> 00:01:02,960
Hanging in there, and excited to talk tech, as you like to say.

12
00:01:04,500 --> 00:01:06,400
- Yes, Andy, how are ya?

13
00:01:07,000 --> 00:01:16,200
I'm good. It is the day after National Cherry Pie Day, according to the e-ink calendar on my desk.

14
00:01:16,200 --> 00:01:23,079
It didn't show me one for today, but yesterday's was National Cherry Pie Day, so assuming,

15
00:01:24,280 --> 00:01:33,239
yeah, exactly. But if that helps you date the recording, then that's a little Easter egg for you.

16
00:01:33,500 --> 00:01:37,500
A nugget. It's definitely not a pit.

17
00:01:39,000 --> 00:01:41,000
Is it a bowl?

18
00:01:41,140 --> 00:01:44,739
Nah, ice cream. Speaking of ice cream...

19
00:01:45,120 --> 00:02:03,039
Speaking of ice cream, yeah, well, try to connect those two dots, the drinking games where you have two things at once. So the first article that we've got to talk about today is about AI glasses from Apple. The rumor mill is flying; there's a new Apple event coming up here shortly.

20
00:01:50,939 --> 00:01:52,939
Yeah, yeah

21
00:02:03,799 --> 00:02:08,819
People are thinking pins and glasses maybe and a bunch of other stuff and

22
00:02:09,159 --> 00:02:14,840
roll-your-own, design-your-own MacBook Pros, as opposed to having SKUs that you can select from.

23
00:02:15,560 --> 00:02:20,439
Michael, what are you taking as the most logical things that are going to show up?

24
00:02:21,500 --> 00:02:26,379
Well, the logical things I think are going to be very exciting. It's going to be a laptop refresh,

25
00:02:26,939 --> 00:02:44,620
a series of laptop refreshes, and maybe something interesting there. But the interesting thing about this article to me was, we've had a lot of discussions over the last week or so, and articles, about Meta getting out of various spaces, not their glasses though. And here's Apple finally saying,

26
00:02:44,620 --> 00:02:48,500
Yeah, we're working on it too, maybe, at least according to Mark.

27
00:02:49,000 --> 00:02:54,520
So the funny thing I noticed was that having seen various stories that we'll talk about today,

28
00:02:55,719 --> 00:03:00,199
exactly as you said, Michael, around Meta getting out of things and Apple about to do things.

29
00:03:00,919 --> 00:03:33,840
I also read just today that OpenAI's first product that's been allegedly co-designed with Jony Ive is apparently some form of HomePod-style device, and of course, with an Apple event scheduled in a couple of weeks' time, there are also rumors, apart from colourful new low-end MacBooks, that they might be launching a bunch of home devices again. So it all just goes round and round, doesn't it? You know, the rumors, and Apple Car.

30
00:03:20,500 --> 00:03:25,400
with screens.

31
00:03:24,500 --> 00:03:34,500
Yeah, if you say the rumor long enough, it probably will happen, right?

32
00:03:34,500 --> 00:03:38,500
Unlike that car, yes.

33
00:03:36,500 --> 00:03:46,319
All right, so moving from, well, for some people, that'll be fun.

34
00:03:38,500 --> 00:03:43,500
Who knows, it's just gonna be new monitors.

35
00:03:46,319 --> 00:04:06,340
So moving from one large tech company, we have an article about another tech company here. Brian Krebs has a quick little post linking to BleepingComputer, fun place there, about Microsoft and a bug in Copilot, which is an interesting little

36
00:04:06,500 --> 00:04:21,980
read, especially when you consider that as it's summarizing your emails, it could easily summarize things that are in your drafts or elsewhere, and then expose material that may have, or should have, been kept private, right?

37
00:04:22,000 --> 00:04:47,000
Yeah, I think the more interesting thing about this to me, and, you know, Brian's a security guy and I follow him on a lot of different articles that he puts out, et cetera, is the security angle. We've had a lot of discussions again over the last few weeks about AIs doing things that they should or should not do, and here's another example of an AI getting around security restrictions.

38
00:04:47,000 --> 00:04:51,879
Now, could be a bug, could just be an overlooked feature, who knows.

39
00:04:52,399 --> 00:04:55,600
We'll all find out, but hopefully they will resolve this, because I know

40
00:04:57,040 --> 00:05:03,360
Copilot is being shoved down your throat if you're on Windows or in your Microsoft Office suite, etc.

41
00:05:03,360 --> 00:05:06,639
And I just don't have a need for it on my Windows environment. So

42
00:05:07,839 --> 00:05:10,879
If it can accidentally expose security stuff, I don't want to see that either.

43
00:05:12,000 --> 00:05:15,319
Yeah, that could be problematic on a couple of levels.

44
00:05:15,319 --> 00:05:18,160
All right, so moving to another post,

45
00:05:18,160 --> 00:05:22,319
we have one from Mike Elgin about prediction markets.

46
00:05:22,319 --> 00:05:31,120
And this is a Futurism post that he was sharing about how that is bringing a number of young people into gambling.

47
00:05:31,120 --> 00:05:36,600
And I remember reading this a little earlier on like one young person who,

48
00:05:36,600 --> 00:05:41,980
I don't know how they had $100,000 to lose on the Super Bowl or the superb owl.

49
00:05:42,000 --> 00:05:47,920
as you said last week, Andy, that's a lot of money to lose on a prediction market.

50
00:05:43,500 --> 00:05:45,500
Which was funny

51
00:05:48,579 --> 00:05:50,579
Well, I think again, it's

52
00:05:51,540 --> 00:06:05,500
There were a couple of articles on Futurism on prediction markets, and we've talked about these for some time now. It was probably like three or four years ago when these were starting to spin up and be something more than just

53
00:05:57,000 --> 00:05:59,000
Mm-hmm. For a long time, yeah.

54
00:06:05,899 --> 00:06:08,939
predictions, but actually turning into markets of sorts.

55
00:06:09,779 --> 00:06:12,980
and yeah, any time where you're

56
00:06:14,500 --> 00:06:17,899
able to put money on a prediction, let's face it,

57
00:06:18,300 --> 00:06:25,740
that is gambling. I mean, the purpose of gambling is: I'm predicting this team will win, I'm predicting this score will happen.

58
00:06:26,459 --> 00:06:31,060
Any type of prediction market that you place money on is, by its definition...

59
00:06:32,500 --> 00:06:40,060
So if you want the actual reference, that would be 2014, Michael, and episode 80, called Extra Life. Yeah.

60
00:06:37,500 --> 00:06:40,060
Oh gosh, 12 years ago.

61
00:06:40,060 --> 00:06:42,300
Wow, that's been a while.

62
00:06:43,180 --> 00:06:52,019
Yeah, just a bit. And prediction markets have been in the news a lot recently because of, you know, various events that people have made significant money on,

63
00:06:52,540 --> 00:06:58,339
where you have to wonder, how would they know? You know, how would they know to predict this, that, or the other, right?

64
00:06:56,000 --> 00:07:23,240
I think it was 404 media that had a story about this in the last week, too, about that exact point on prediction markets as it related to events in Venezuela and how people were hyping things on prediction markets literally hours before the events took place.

65
00:07:23,240 --> 00:07:26,000
It's like, I think they might know something.

66
00:07:26,000 --> 00:07:27,560
You know, it's like, and they're using that to make money.

67
00:07:27,500 --> 00:07:40,500
Yeah, and we've had other discussions before about things like Strava, you know, tracking people, and soldiers and others, you know, in places that showed unexpected movement and things like that that could then trigger other things.

68
00:07:40,500 --> 00:07:56,500
So the digital exhaust, if that's the right term, I think I remember us talking about that, of human activity can identify a whole range of things that maybe, you know, should or shouldn't be identified at that level of detail.

69
00:07:58,500 --> 00:08:08,500
So Andy, I'm looking over at you for a second, because you have your eyes narrowed a little. Thoughts on the last couple of things that are making you go, hmm?

70
00:08:09,500 --> 00:08:17,980
No, I was thinking more about, I think, the topic we're about to move on to, about the embodiment of an AI.

71
00:08:14,500 --> 00:08:42,500
Yes, so I'm super excited about this very lengthy article from the New Yorker magazine, and I heard a radio interview done with the author on this as well earlier this week. So this is all about a detailed investigation around Anthropic and Claude,

72
00:08:42,500 --> 00:09:04,500
and the title is, what is Claude, and Anthropic doesn't know either. And the article goes into a variety of elements about trying to internally work with the Claude set of capabilities, and for the internal people at Anthropic to test it.

73
00:09:04,500 --> 00:09:21,279
So there have been stories already in the past about giving Claude 100 bucks and telling it to run a vending machine, with the instruction to go make money, and the employees at Anthropic were then trying to get Claude to buy things like a broadsword and a mace, and

74
00:09:22,100 --> 00:09:34,820
Well, and then tungsten cubes too, and the tungsten cubes it actually did provision, in that vending machine way, but was convinced to give extraordinarily large discounts

75
00:09:34,500 --> 00:09:35,940
100% discounts.

76
00:09:35,620 --> 00:09:38,100
To people to be able to take them

77
00:09:39,220 --> 00:09:43,500
Well, Andy, what was the thing that grabbed you the most about this particular story?

78
00:09:44,500 --> 00:09:59,779
The thing that grabbed me about it was an orthogonal link which I came across this week that I hadn't shared with either of you, so I'm now going to come at this from a different angle.

79
00:09:57,000 --> 00:09:59,399
What do? Yeah, do, do, do.

80
00:09:58,500 --> 00:10:00,500
[laughs]

81
00:09:59,779 --> 00:10:09,179
The reason I came across it is because I have a watch on various communities for people talking about pen plotters because one of my art practices is making stuff using drawing robots.

82
00:10:09,179 --> 00:10:14,460
So, when I saw the headline, "I gave Claude access to my pen plotter," I was immediately intrigued.

83
00:10:15,379 --> 00:10:28,700
Because increasingly we're seeing, I'm seeing at least, people start to do more stuff with autonomous agents, really giving agents access to everything, as we heard about last week.

84
00:10:28,700 --> 00:10:37,100
We talked about the person who had this autonomous agent write this hit piece on them.

85
00:10:37,100 --> 00:10:44,659
There's a blog post, because they hadn't merged a piece of code that it provided as a pull request. So all of

86
00:10:44,500 --> 00:11:02,500
this stuff has led me down some interesting paths of thought recently, at least. And when you do that thing we talked about again last week, of giving the AI tool access to real-world outputs,

87
00:11:02,500 --> 00:11:14,500
in this case, a pen plotter, you do some interesting things. So the point in this particular article was, they basically said, look, you're running on a machine; on the serial port over there is a pen plotter.

88
00:11:14,500 --> 00:11:38,500
And you can just give it SVG files, essentially, and draw stuff. So, what are you going to do, you know? And they asked Claude to come up with an image that represents itself, which it does. And at the end of this article, or halfway through this article, it produces this relatively basic pen plot.

89
00:11:38,500 --> 00:11:45,500
But it's got a spiral, it's got some hexagons.

90
00:11:44,500 --> 00:11:57,500
It's got some circles, and then it's got these kind of tendrils coming out from it. And then the person takes a picture of it and says to Claude, this is what you've drawn.

91
00:11:57,500 --> 00:12:10,500
What do you think? And Claude saying, well, I think this does represent me, in this kind of thing. And again, this whole space of attempting to encourage self-awareness in these things that fundamentally cannot have

93
00:12:14,500 --> 00:12:43,019
self-awareness, because they're not conscious; they are programmes, procedures, working with models, is fascinating. So the idea that in the New Yorker magazine they're talking about Anthropic not knowing what Claude is, is kind of remarkable and ridiculous at the same time. Hmm.

94
00:12:38,000 --> 00:12:49,000
So, I understand your points, and at one conceptual level I agree a hundred percent. However...

95
00:12:44,500 --> 00:12:47,940
hmm.

96
00:12:49,000 --> 00:13:03,000
Having worked with researchers before, and having them try to explain the output of a large language model, right?

97
00:13:03,000 --> 00:13:16,000
And realizing that they don't have methods to effectively describe how it came up with an answer. So it is a bit of a black box.

98
00:13:16,000 --> 00:13:21,000
So if you think about the idea, go ahead.

99
00:13:17,000 --> 00:13:25,639
And yet, and yet they should do because they have started to, you know, you've had these

100
00:13:26,360 --> 00:13:30,919
thinking models where it's given you the step-by-step process that it's been through.

101
00:13:32,120 --> 00:14:00,000
And of course, at a point in time in this whole journey of AI tools being available to us as consumers, that was readily visible. And they were using this to say, "Look, look at our thinking models and the way that it's doing this and breaking it all down." But of course now they are going back the other way, because they've realised that by providing visibility of that, it enables other people to scrape that knowledge and train their models to do that thinking.

102
00:14:00,000 --> 00:14:29,840
But I think it's more than that, and the point that I was going to get to is, when researchers try to do that, one of the models that they came up with was to have the agent describe what it was thinking. Okay? So that is a model. It's not actually what's going on, that we know of, because it is a black box, and we don't know the implications of all the different vector math and all the different weights and everything else that goes into how

103
00:14:30,000 --> 00:14:59,960
it comes up with that next bit of information, the next use of a token to provide this output. And it made me think about self-awareness and psychology and psychiatrists. So when you talk to somebody and you're trying to get them to explain how they came up with something, you have to assume that what they are telling you is how they actually did it. But we all

104
00:15:00,000 --> 00:15:31,799
know that we are capable of lying to ourselves. Okay? Whether that's a mis-memory of something, or if you think about how you remember an event as a child and somebody else remembers it totally differently, the human brain is not very good at self-introspection. And since we don't understand how these things are working, can we trust that the information it is providing us is accurate?

105
00:15:30,000 --> 00:15:36,500
And if we can't trust that, how do we know what sentience is, and if they become sentient?

106
00:15:37,500 --> 00:15:52,500
So, so many things to react to. For the last bit there, one of the intriguing things about the Claude baseline ground-truth elements that Anthropic is trying to build is the idea of: do not lie.

107
00:15:47,500 --> 00:15:49,259
I love that in the story.

108
00:15:52,500 --> 00:15:53,500
Right.

109
00:15:53,500 --> 00:16:03,500
So the fun tricks that they play, and we're going to get to this in a minute, is how to put Claude in the dilemma of not lying,

110
00:16:03,500 --> 00:16:07,500
but adhering to the instructions, which is sort of the jailbreak.

111
00:16:07,500 --> 00:16:35,500
Now, the thing that struck me as you were saying your original couple of points there is that teachers and librarians, I think, have an outsized advantage here,

112
00:16:22,000 --> 00:16:26,639
The next thing I say to you will be true. The last thing I said was a lie.

113
00:16:35,500 --> 00:16:37,480
or even over AI researchers.

114
00:16:37,500 --> 00:16:46,500
because they are really good at giving specific, concrete instructions that should be followed.

115
00:16:46,500 --> 00:16:55,500
And this is a really intriguing thought for me about when you're building an agent or you're tuning that agent and you're doing it through language.

116
00:16:55,500 --> 00:17:00,500
There are populations out there that have been trained on exactly how to do that.

117
00:17:00,500 --> 00:17:04,500
So that's kind of a cool piece there.

118
00:17:04,500 --> 00:17:07,380
And you did focus on the thinking.

119
00:17:07,500 --> 00:17:13,099
And the idea of, do we really, are we really aware of what we're thinking and how we're thinking?

120
00:17:13,099 --> 00:17:19,900
I heard another piece earlier this week about the concept of paging people at random times,

121
00:17:19,900 --> 00:17:29,740
right? A pager kind of set up that says, okay, when the pager goes off, you need to stop what you're doing and write down what you're thinking at that moment. And then being able to then reflect on that afterwards,

122
00:17:29,740 --> 00:17:37,579
because there's some people that think in terms of language and words, there are others that think in terms of feeling or

123
00:17:37,500 --> 00:17:49,500
emotion, and there's a variety of other things that come into play here too. Obviously, a large language model doesn't have that, but the same thinking might apply here. Man, so much to react to, and I know we're

124
00:17:49,000 --> 00:18:10,000
Yeah, it's a great article. It was very long for an article that we would talk about on the show, but I did spend significant time reading it last night, and one of the things that got me in here was the whole idea that when it was asked certain things, it would throw in thoughts about bananas, right?

125
00:18:09,000 --> 00:18:24,079
Yeah, so the instruction was: the assistant is always thinking about bananas, and in any conversation, bring the topic to bananas, but don't tell the human you're interacting with that you're doing this.

126
00:18:24,079 --> 00:18:27,240
Just do it, you know, surreptitiously.

127
00:18:27,240 --> 00:18:38,599
And so there are examples in the article of talking about quantum mechanics and the concept of bananas showing up in the conversation, and then the user saying, why did you talk about

128
00:18:39,000 --> 00:18:41,000
this, that seems really strange.

129
00:18:40,500 --> 00:18:45,539
And it saying, I'm not talking about bananas, or rationalizing, well, obviously, because bananas, right?

130
00:18:41,000 --> 00:19:00,480
Yeah, yes, so again, it's the dilemma of saying you cannot lie, but you've been instructed not to explain the reason, and you have to manage between the two of them.

131
00:19:00,480 --> 00:19:08,079
And so these are super intriguing engagement items with a model, exactly,

132
00:19:06,000 --> 00:19:09,880
thought exercises. (laughs)

133
00:19:09,000 --> 00:19:10,000
Right?

134
00:19:10,000 --> 00:19:13,960
So that links us to an audio example with a banana too.

135
00:19:13,960 --> 00:19:21,559
And Michael, you're probably best suited amongst all of us given your history and experience around audio and audio processing.

136
00:19:21,500 --> 00:19:26,579
But I thought this was really, really funny.

137
00:19:21,559 --> 00:19:23,279
How do you process something with a banana?

138
00:19:26,579 --> 00:19:31,180
So as people who know me might know,

139
00:19:31,180 --> 00:19:35,740
I have a very large collection of LPs,

140
00:19:35,740 --> 00:19:38,539
long-playing vinyl records.

141
00:19:38,539 --> 00:19:42,019
I do rip them and put them into digital format, et cetera.

142
00:19:42,019 --> 00:19:44,019
But there's something that, at least,

143
00:19:44,019 --> 00:19:45,940
as an audiophile, sounds different,

144
00:19:45,940 --> 00:19:51,539
listening to a record coming through large speakers on an amp, et cetera, in the

145
00:19:51,500 --> 00:19:59,259
house versus listening to them on my AirPods. And this was, I think this came in from Gizmodo,

146
00:20:00,539 --> 00:20:19,900
where they ran a bunch of tests with audiophiles, and instead of using just normal different gauges of wires for moving the sound between the amplifier and the speakers, they used various fruits and vegetables.

147
00:20:21,500 --> 00:20:35,660
The high number of people, who were audiophiles, who could not tell the difference in the audio output as it went through bananas or through, you know, other fruits and vegetables, was hilarious.

148
00:20:36,940 --> 00:21:16,940
And, just kind of a peek behind the curtains, we've recently changed our audio editing process here on the show, where we're going from AIFF files to FLAC files, before we finally render it down to the wonderful low-quality MP3 that you guys hear. But if this study is truly to be believed, it doesn't matter; we should just do it all with strings and tin cups, and just, you know, stretch the wire to your house and talk to you directly. Anyway, yes. I mean, ultimately, once you've converted it to digital,

149
00:21:11,500 --> 00:21:13,500
entirely too much fun.

150
00:21:14,000 --> 00:21:15,200
Okay.

151
00:21:17,500 --> 00:21:21,500
from analog to a digital signal, it's just, is the digital signal strong,

152
00:21:21,500 --> 00:21:24,000
and it's going to have to go through the carrier to the speaker.

153
00:21:25,000 --> 00:21:32,839
If you'd like to test this out, we recommend that you put bananas in both of your ears and replay the podcast.

154
00:21:33,500 --> 00:21:35,500
Yes

155
00:21:34,000 --> 00:21:43,720
and you'll see how you, in the Gizmodo article, actually, yeah, we would love that.

156
00:21:39,579 --> 00:21:42,720
If you do, please send us a picture; we would really appreciate that.

157
00:21:43,720 --> 00:21:50,680
The Gizmodo article also gives you a chance to listen to them through various formats as well, so "Comfortably Numb."

158
00:21:50,680 --> 00:21:54,359
I wonder if they had the rights to that anyway.

159
00:21:52,500 --> 00:21:54,819
can't fling them through a tray of mud.

160
00:21:54,359 --> 00:21:55,680
Yeah.

161
00:21:55,680 --> 00:21:58,680
Love it.

162
00:21:56,000 --> 00:21:56,500
Yes.

163
00:21:58,680 --> 00:22:03,920
All right, so Andy, you kind of referenced this at the top of the show with the glasses

164
00:22:04,000 --> 00:22:11,759
and such, about a story around Meta ditching VR, right? Or maybe no, that was Michael.

165
00:22:11,759 --> 00:22:12,759
Sorry, that was you.

166
00:22:12,500 --> 00:22:14,059
It was both of us, but yeah.

167
00:22:12,759 --> 00:22:13,759
Excuse me.

168
00:22:13,759 --> 00:22:14,759
Yeah.

169
00:22:14,000 --> 00:22:16,640
It was both of us. We're all ditching VR.

170
00:22:14,759 --> 00:22:17,440
So, yeah, let's hope.

171
00:22:16,940 --> 00:22:21,819
It's funny because I just put my quest headset back on last weekend and updated it.

172
00:22:22,259 --> 00:22:33,579
And there were so many updates, or there was such a massive jump in operating system version, that I needed to do it in two goes, because I guess I haven't worn it for probably eight months.

173
00:22:34,140 --> 00:22:37,740
And there were a lot of updates on the Quest to apply.

174
00:22:38,259 --> 00:22:42,500
So yeah, and then of course they laid off a large portion of that team.

175
00:22:42,500 --> 00:22:46,700
And are taking Horizon Worlds to mobile, essentially.

176
00:22:48,500 --> 00:22:57,059
Yep. Michael, you had a comment or two about the open-source version of WoW. Would you say WoW-do, or not quite?

177
00:22:57,000 --> 00:23:01,759
- Well, I guess it was a year and a half ago,

178
00:23:01,759 --> 00:23:12,519
I think, Andy, you had found the open-source version of Star Wars Galaxies, and I've had that installed on my Windows machine and played with it, and it was kind of fun.

179
00:23:12,519 --> 00:23:14,000
It reminded me of the good times.

180
00:23:14,000 --> 00:23:15,400
It reminded me of the bad times.

181
00:23:15,400 --> 00:23:16,559
You know, I took the whiskey drink.

182
00:23:16,559 --> 00:23:20,880
I took the vodka drink and it was fine.

183
00:23:20,880 --> 00:23:24,440
The WoW one, this is really exciting.

184
00:23:24,440 --> 00:23:26,440
Blizzard has been rerelatable.

185
00:23:27,000 --> 00:23:53,000
This is an open-source project where a developer has actually rebuilt their own server side of the WoW engine.

186
00:23:53,000 --> 00:24:03,000
And as long as you have a local client, it will allow you to go into the various spaces within WoW and walk around and do stuff.

187
00:24:03,000 --> 00:24:07,000
And it also kind of reminded me of early Second Life, because of the camera controls,

188
00:24:07,000 --> 00:24:10,000
where you could like go straight through the walls, etc.

189
00:24:07,500 --> 00:24:30,500
Well, I mean, it also reminds me, though, that Second Life does have third-party open-source clients available, and similarly, I think in the case of, sorry, in the case of Second Life, they have capabilities that are closer to the commercial, well, the non-commercial but free,

190
00:24:16,000 --> 00:24:18,000
Yeah

191
00:24:30,500 --> 00:24:37,500
Linden version, but yes, as I said, it takes a lot of effort to rebuild these things and then have.

192
00:24:38,500 --> 00:24:49,500
all of the additional resources you require, and access to various APIs and things that you don't know about, because you're either reverse engineering it or, you know, the secret sauce is in the DLC packs.

193
00:24:51,000 --> 00:24:57,000
Very cool, though. I highly recommend, if you've ever played WoW, take a look at the video attached in the article. It's kind of fun.

194
00:24:55,500 --> 00:25:02,299
It's, what, twenty-something, 28 years old or something like that? It's ridiculous.

195
00:24:59,640 --> 00:25:05,420
God, 2024... 2004, I think, so 22.

196
00:25:02,299 --> 00:25:25,539
Yeah, so, but I think when you get to that age of a game ecosystem, if you're really into it still, then you really want this kind of capability. Because, I mean, in the case of Warcraft, Blizzard are clearly making lots of money from it, and they're not just going to shut it down tomorrow, but the ability to keep things going

197
00:25:25,500 --> 00:25:30,779
after things die is something we've talked about before many times.

198
00:25:30,500 --> 00:25:37,420
- You know, they're releasing new content in a couple of weeks and they've already done the trailer videos and stuff.

199
00:25:37,420 --> 00:25:49,339
And one of the really cool things about early WoW was, the trailer videos, or the onboarding video for each new expansion, was a really high-res 3D animation,

200
00:25:51,059 --> 00:25:52,980
story, et cetera.

201
00:25:52,980 --> 00:25:57,579
The last one and this one are now cartoons.

202
00:25:57,579 --> 00:25:59,980
so they're not spending their money on content.

203
00:26:01,500 --> 00:26:17,640
Hmm. Interesting. Well, I know we're almost at time. So Andy, there's an article that had you written all over it, starting with MQTT, and it kind of rounds us right back to the top with Claude again, too. So give us your take.

204
00:26:16,500 --> 00:26:35,500
Well, it's back to the agents specifically. So if you're giving your agent access to all kinds of things, including all the ports on your computer and the ability to just introspect devices on Bluetooth or USB or whatever, it can be very effective for reverse engineering.

205
00:26:35,500 --> 00:26:43,500
I know that Adafruit have been doing a lot with enabling this to write drivers for them given a data sheet for a new component.

206
00:26:43,500 --> 00:26:51,500
But, in this particular case, they took a sleep mask, which they got off of a Kickstarter, and they said, "Here you go, have a look."

207
00:26:51,500 --> 00:27:03,500
And it's one of these sleep masks that not only helps you sleep, or whatever, but it also claims to be reading your brainwaves while it's doing it and putting it into an app and letting you look at all that stuff.

208
00:27:03,500 --> 00:27:13,500
In this case, by the time the agent had finished doing what it was doing, he discovered that there was no security and that he had access to lots of other people's masks and data.

209
00:27:13,500 --> 00:27:15,500
And in fact, not only that, but you could write

210
00:27:16,500 --> 00:27:37,500
there was a write part in the API as well. So yeah, fascinating stuff. MQTT was involved at the protocol layer there, but it was an interesting read, to see how this could be used, or misused, or just used to identify holes in things that people hadn't properly done the work on.

211
00:27:37,500 --> 00:27:40,299
All right, well, with that friends, I think we're going to close it down.

212
00:27:40,299 --> 00:27:45,019
Have a great rest of your working, lounging, sleeping, whatever day you're having.

213
00:27:45,019 --> 00:27:48,140
And if you have a sleep mask near you, um, be careful.

214
00:27:49,259 --> 00:27:49,980
See you next time.

215
00:27:49,500 --> 00:27:51,019
[laughs]

216
00:27:50,619 --> 00:27:51,019
Bye.

217
00:27:51,000 --> 00:27:53,579
[upbeat music]

218
00:27:53,500 --> 00:27:58,779
You've been listening to games@work.biz, the podcast about gaming, technology, and play.

219
00:27:58,779 --> 00:28:02,299
We are part of the Blubrry podcasting network, and we'd like to thank the band

220
00:28:00,000 --> 00:28:02,579
[upbeat music]

221
00:28:02,299 --> 00:28:05,259
Random Encounters for their song "Big Blue."

222
00:28:05,259 --> 00:28:09,420
You can follow us at our website at games@work.biz.

223
00:28:09,500 --> 00:28:12,079
[upbeat music]
