1
00:00:00,000 --> 00:00:20,000
[Music]

2
00:00:10,500 --> 00:00:15,780
This is GamesAtWork.biz, your weekly podcast about gaming, technology, and play.

3
00:00:15,780 --> 00:00:19,140
Your hosts are Michael Martine, Andy Piper, and Michael Rowe.

4
00:00:19,140 --> 00:00:22,740
The thoughts and opinions on this podcast are those of the hosts and guests alone,

5
00:00:22,740 --> 00:00:28,339
and are not the opinions of any organization which they have been, are, or may be, affiliated with.

6
00:00:32,820 --> 00:00:34,980
This is episode 547.

7
00:00:35,379 --> 00:00:37,480
B-tastic!

8
00:00:38,500 --> 00:00:41,079
[upbeat music]

9
00:00:46,000 --> 00:00:48,439
Good morning. Good evening. Good afternoon, Michael Rowe here.

10
00:00:48,439 --> 00:01:03,640
It's Friday afternoon. I've got screens in front of me, a giant microphone, and a friend named Michael on the other side of the screen to talk tech with you and for you about interesting topics at GamesAtWork.biz. Michael?

11
00:01:05,500 --> 00:01:16,000
Excited to be here on a lovely day, and all kinds of excited to share lots of stories here about some things that have studs to them.

12
00:01:17,500 --> 00:01:31,739
Ooh. Does that make us studs?

13
00:01:18,239 --> 00:01:19,579
How about that.

14
00:01:20,640 --> 00:01:21,780
No.

15
00:01:23,359 --> 00:01:30,980
No, not at all, but we're going to start with an article from Ars Technica, don't you think?

16
00:01:31,739 --> 00:01:44,219
because first I'm going to shout out our co-host, Andy Piper, who's off this week, and I think will be off next week. And I hope he's having an awesome time spelunking in the fjords,

17
00:01:45,420 --> 00:01:47,420
Because you don't know where he's at.

18
00:01:47,500 --> 00:01:49,500
No one knows where he's at.

19
00:01:49,500 --> 00:01:51,500
But he's having a good time.

20
00:01:51,500 --> 00:01:53,500
Now, what are we going to talk about next?

21
00:01:55,000 --> 00:01:56,000
All right.

22
00:01:56,000 --> 00:02:07,340
So we're going to go under the sea just a little bit with a story about Meta and about Moltbook.

23
00:02:07,340 --> 00:02:14,879
So there was an article earlier this week, actually several articles about this particular topic, where Meta acquired Moltbook.

24
00:02:14,879 --> 00:02:17,060
And we've talked about that a little bit before.

25
00:02:17,060 --> 00:02:24,860
So we won't go into great detail, but the fun thing about this in my mind is when you consider the voracious appetite of

26
00:02:25,000 --> 00:02:41,079
LLMs, including Llama, from our friends over at Meta. It kind of makes sense that you might want to have something like Moltbook, where agents are talking to one another, as a potential source of additional information that you might incorporate into your large language model.

27
00:02:41,960 --> 00:02:52,199
There's that and you might want to have them sell advertisements to agents that may decide to buy them or tell their humans that they should buy them too. So good fun stuff.

28
00:02:52,500 --> 00:02:55,879
I actually had an interesting thought about this the other day.

29
00:02:55,879 --> 00:02:58,680
And so as you say,

30
00:02:58,680 --> 00:03:03,680
Moltbook has these agents talking to each other.

31
00:03:05,460 --> 00:03:18,960
And the idea is, is there value in that conversation that I can leverage if I'm building an agentic workflow to understand how other agents might be doing things and can I reverse engineer some knowledge?

32
00:03:18,960 --> 00:03:21,800
So maybe that's an acquisition target for that reason.

33
00:03:22,500 --> 00:03:28,840
So we're going to think about what happens when you take a deep research document,

34
00:03:30,460 --> 00:03:35,879
you run an AI on it and you say summarize this to help me make a decision.

35
00:03:35,879 --> 00:03:37,280
So it does that.

36
00:03:37,280 --> 00:03:40,800
And you probably used AI to generate that deep research.

37
00:03:40,800 --> 00:03:45,560
So then you have a summary of AI-summarized deep research.

38
00:03:45,560 --> 00:03:52,479
And then you say, take this summarized information and let me create something from it.

39
00:03:52,500 --> 00:03:57,120
Like maybe I'm creating a presentation or a keynote or whatever.

40
00:03:57,120 --> 00:04:03,340
And then you have a summary of the summary of AI generated research.

41
00:04:03,340 --> 00:04:16,819
And then you go off and you take that information and you use it to feed into an agentic tool to go off and run an exercise of some sort which is going to summarize that data and create content.

42
00:04:16,819 --> 00:04:22,399
And you get down to, do you remember the days when photocopiers were really bad?

43
00:04:23,500 --> 00:04:41,500
And you would take a news story or an article from a physical magazine, make a photocopy, and then make another photocopy of that photocopy, and eventually your teacher would do a handout of a photocopy of a photocopy and it was this black mess.

44
00:04:28,000 --> 00:04:31,360
Sure it would degrade with every copy, of course.

45
00:04:40,500 --> 00:04:44,500
or a mimeograph, that was even better.

46
00:04:41,500 --> 00:04:52,500
Well, those were loads of fun. I actually got to work on that as one of my little student things as a kid in high school and got to love the mimeograph.

47
00:04:52,500 --> 00:05:11,500
But yeah, I mean, at what level do all these acquisitions of second- and third-level AI tools add value versus subtract value, because they're over-summarizing existing content?

48
00:05:12,500 --> 00:05:21,259
Well, I don't know, is this Moltbook summarizing content, or was it really more agents essentially interacting with other agents, right?

49
00:05:18,500 --> 00:05:22,180
Well, it's the agents interacting with other agents,

50
00:05:21,259 --> 00:05:25,819
So, sure.

51
00:05:22,180 --> 00:05:26,279
and Moltbook was capturing that conversation.

52
00:05:26,279 --> 00:05:34,980
Well, that conversation is driven based off of a summarization of something that turns into a set of actions.

53
00:05:34,980 --> 00:05:45,860
So, it's actions based off of summarization that turn into a string of actions that you're then going to take and feed into an LLM of some sort to learn from.

54
00:05:46,000 --> 00:05:51,800
Yeah, well, I mean it makes sense because that's a meta set of conversations.

55
00:05:51,500 --> 00:05:54,339
Ahh, you got it, it's meta.

56
00:05:51,800 --> 00:05:55,160
So why not, right?

57
00:05:54,339 --> 00:05:58,439
[laughing]

58
00:05:55,160 --> 00:05:58,680
It's on brand, right?

59
00:05:58,439 --> 00:05:59,480
It's yup, yup.

60
00:05:58,680 --> 00:06:00,199
It's really on brand.

61
00:06:00,199 --> 00:06:16,040
I mean, this reminds me also a little bit of the topic we had some years ago where you had a Google Home device talking to another Google Home device, or to another voice assistant, and they would talk back and forth and not understand each other.

62
00:06:10,000 --> 00:06:13,480
Siri, and they had these arguments.

63
00:06:16,000 --> 00:06:24,000
And it was so entertaining. Maybe we'll find that for the show notes, and if you want to see that YouTube video, it was good fun.

64
00:06:20,000 --> 00:06:39,000
Mm. Well, and especially if we remember the stories from, I guess, two or maybe three weeks ago, where they talked about how a lot of the conversations where agents looked like they were going off to try to, you know, do things that they weren't supposed to do were actually fed by prompts from people.

65
00:06:42,500 --> 00:06:43,980
- Yeah, exactly, all right.

66
00:06:43,980 --> 00:06:48,980
So moving on to another interesting discussion,

67
00:06:44,000 --> 00:06:46,000
Agents, they're people just like us.

68
00:06:51,540 --> 00:07:00,819
We're great followers of Mike Elgan, and one of the things that he'd shared was a Futurism post about a study here,

69
00:07:00,819 --> 00:07:05,300
the title is "Study Finds That Execs Are Already Outsourcing Their Thinking to AI."

70
00:07:05,300 --> 00:07:12,480
And this is a common discussion point here, that people are losing their own cognitive abilities.

71
00:07:13,500 --> 00:07:38,500
And we've seen some of that over time. You know, think about the phone numbers you have memorized. You used to have the phone numbers of most of the important people in your life memorized, because doing that on a rotary dial meant you had to have them handy, and now you can just say, you know, "Hey, wake word, please call so-and-so," and it'll call them.

72
00:07:38,500 --> 00:07:42,500
So it's not entirely, uh,

73
00:07:42,500 --> 00:07:52,899
surprising here that there's yet another article on this and they call it cognitive debt, um, you know, by outsourcing work to AIs.

74
00:07:53,000 --> 00:08:00,360
So, yeah, I guess kind of the differentiating thought here might be.

75
00:08:00,360 --> 00:08:07,120
If you think about executives in a corporation and what their role is supposed to be, right?

76
00:08:07,120 --> 00:08:14,079
It's supposed to be strategic thought and setting directions for others to go do something.

77
00:08:14,079 --> 00:08:22,959
If you have an executive who's doing all this stuff, they're not performing executive activities, right? They're supposed to have people to do that

78
00:08:23,000 --> 00:08:27,079
For them. And in this case, they're saying I've got agents to do that for them.

79
00:08:28,279 --> 00:08:38,519
The question is, when it comes to decision making, that is, I think, where the problem could lie.

80
00:08:38,519 --> 00:08:46,120
If you're outsourcing the decision, that's a problem. If you're saying, go get me this information,

81
00:08:47,000 --> 00:08:51,639
and I will then make a decision off of it. I think that's an appropriate executive function.

82
00:08:53,000 --> 00:08:59,320
And looking at the article, to me it didn't go deep enough to explain that part of it.

83
00:08:58,500 --> 00:09:01,500
Yeah, no, of course not. Right.

84
00:09:02,360 --> 00:09:08,600
And I don't think an executive asking something else or someone else to do

85
00:09:09,639 --> 00:09:14,279
research for them is a bad thing if they're performing their executive function.

86
00:09:14,000 --> 00:09:22,500
Exactly, and that's the question: how do you guide your AI counterpart to go and collect information for you,

87
00:09:22,500 --> 00:09:30,399
validate it, show where it came from, evaluate the trustworthiness or truthiness of those elements.

88
00:09:30,399 --> 00:09:34,700
Michael and I, before we hit record in the pre-show, were talking about what I'll be doing this weekend,

89
00:09:34,700 --> 00:09:42,500
and I'll be doing a keynote address for a statewide hackathon called "Smathhacks" 2026,

90
00:09:42,500 --> 00:09:43,980
I've been involved with this community now.

91
00:09:44,000 --> 00:10:03,000
I've been involved for a number of years, and one of the things that I'll be talking about is the skills that are needed now in 2026 to really be able to interoperate with and leverage AI, and where those skills are best found and nurtured to allow people to move into the future.

92
00:10:03,000 --> 00:10:33,000
So what you just explained, Michael, makes a ton of sense about the researching, right? So can you find the insights by using an AI agent that can quickly go and

93
00:10:15,000 --> 00:10:32,000
understand and ingest a wide variety of sources, so that the human can then evaluate the appropriateness, the truthiness, the nature of those sources, and then direct the actions to be able to go and do something with that.

94
00:10:32,000 --> 00:10:44,000
It might be to create some wireframes, it might be to create code directly, it might be to do a refinement of the research, or to do something compelling to convince others that this is a good idea,

95
00:10:44,000 --> 00:10:54,000
that it's worth investing in. So it's a variety of those things, and that, I think, is going to be the really intriguing thing to delve into a little bit deeper here, for sure.

96
00:10:53,500 --> 00:11:01,759
Yeah, one thought, kind of riffing off of that, that I had, and I was talking to a colleague about this, is when you think about

97
00:11:03,220 --> 00:11:10,679
where the bottleneck has historically been on change, and I'll talk about corporate change.

98
00:11:11,419 --> 00:11:16,460
It could be how long does it take you to implement your new ERP system, right?

99
00:11:17,539 --> 00:11:23,320
It may be how long does it take you to go from idea to MVP

100
00:11:23,500 --> 00:11:35,259
to shipping to the market, et cetera. And what's going to happen, if it's not already happening, is that the bottleneck is going to shift,

101
00:11:36,299 --> 00:11:47,340
because maybe it took you six months to write the application MVP, and now that happens in a weekend. That doesn't mean you're going to ship on Monday,

102
00:11:49,019 --> 00:11:53,659
because there are other decisions and other things. Now, instead of testing

103
00:11:53,500 --> 00:12:12,500
50 features in your new product, you've got a thousand features, because it becomes so trivial to create features that you just shove them all into 1.0, right? And so you get into this shift of where the bottleneck is.

104
00:12:12,500 --> 00:12:23,500
And again, back to the executive function, the ability to understand the complexity, et cetera. When you get into the thought process of, I've just got a thousand

105
00:12:23,500 --> 00:12:52,500
decisions to make, to say this is shippable or not, how do you have the capacity and the cognitive ability to make those 1,000 decisions with the same level, or appropriate level, of care and understanding and knowledge that you used to have, when you used to be able to make those 1,000 decisions over the course of five releases in four years, right? And now you have a month.

106
00:12:54,500 --> 00:13:05,500
So you're shifting where the bottlenecks are going to be, and how do we make people skilled enough to be able to address those new bottlenecks that we're going to create?

107
00:13:07,500 --> 00:13:21,500
Yeah, I love that. Michael, that is really, really insightful, and I think that leads really nicely into the next story here, too, which is the MacStories article about Apple's 50th anniversary.

108
00:13:21,500 --> 00:13:34,500
And one of the things that Steve Jobs was really famous for talking about was, and there's a link to it right here, so you can see the monologue and listen to it if you'd like to, the intersection of technology and the liberal arts.

109
00:13:34,500 --> 00:13:37,500
And this features prominently in the remarks

110
00:13:37,500 --> 00:13:51,500
I'll be giving this weekend as well, because my belief is that certainly you have to have the necessary science-level understanding in order to do what you need to do in a technology world, for sure.

111
00:13:51,500 --> 00:14:00,500
But you also need to have the level of creativity, empathy, humanity to know what's important to people.

112
00:14:00,500 --> 00:14:07,500
Unless you're on Moltbook, in which case you're talking agent to agent, and that's a different story. But of the thousand things,

113
00:14:08,500 --> 00:14:14,500
which are the most important? And Steve Jobs also talked about saying a thousand nos to every yes.

114
00:14:14,500 --> 00:14:17,500
So just because you can doesn't mean you should.

115
00:14:17,500 --> 00:14:23,500
And these are the nuances that are going to be really really important now in 2026.

116
00:14:23,500 --> 00:14:30,500
And I think for a long time still in the future because the bottlenecks are going to shift left.

117
00:14:30,500 --> 00:14:36,500
And as they shift left, earlier in the cycle, it's those ideas that are going to be the non-commodities,

118
00:14:37,500 --> 00:14:44,500
the uniqueness that has to be determined.

119
00:14:40,500 --> 00:14:42,360
Yeah, I don't know if they're all going to shift left.

120
00:14:42,360 --> 00:14:46,120
I think they're going to shift left and right.

121
00:14:46,120 --> 00:14:48,820
And so right now, the bottleneck historically,

122
00:14:48,820 --> 00:14:54,139
I'll just talk about software development, was in the middle: the writing and the testing of the code.

123
00:14:54,139 --> 00:14:55,779
You could come up with ideas pretty quick,

124
00:14:55,779 --> 00:15:01,899
and you could do the final packaging and a little bit of marketing material pretty quick on the other end.

125
00:15:01,899 --> 00:15:09,419
But now, the middle is going to shrink so dramatically that both of those ends are going to be more important.

126
00:15:09,419 --> 00:15:10,580
because if any--

127
00:15:10,500 --> 00:15:14,500
anybody can write a full-featured to-do app,

128
00:15:14,500 --> 00:15:16,700
there's no way to distinguish yourself in the markets.

129
00:15:16,700 --> 00:15:22,500
You're going to spend a heck of a lot more time on the marketing side, right?

130
00:15:20,500 --> 00:15:22,500
Keep calculator.

131
00:15:22,500 --> 00:15:24,500
Typically, you got it.

132
00:15:24,500 --> 00:15:30,500
What was it, the fart machine, right, on the early iPhone apps?

133
00:15:30,500 --> 00:15:38,500
Or on the front end on market research and feature definition and user profiles.

134
00:15:38,500 --> 00:15:39,500
and all this stuff.

135
00:15:40,500 --> 00:16:10,500
I think, you know, Jobs's comment was right, and you're hitting it too: we need to understand humanity and the humanities, from an education perspective, to make sure that we're building the right thing, or doing the right thing, for the right reasons and the right people, right, so that there's actually the right market out there. Otherwise, it could be really good slop, but it's still slop. Yep.

136
00:16:03,500 --> 00:16:05,500
Product

137
00:16:07,100 --> 00:16:20,220
It's product-market fit, you know, to talk about it in those terms, and that is going to be super crucial and remains so. But the iterations and testing for that are going to be something that can be done a whole heck of a lot faster, right?

138
00:16:20,220 --> 00:16:25,200
Because you can really evaluate, okay, well, what if it was this, what if it was this, what if it was this, what if it was this, and

139
00:16:24,500 --> 00:16:28,299
And so, so that's a really good question.

140
00:16:28,299 --> 00:16:29,779
So it gets faster and faster and faster,

141
00:16:29,779 --> 00:16:31,899
faster, faster, faster, faster, faster, faster, right?

142
00:16:31,899 --> 00:16:35,799
And we say that in the future because of AI and robotics,

143
00:16:35,799 --> 00:16:38,580
we're going to have all this free time.

144
00:16:38,580 --> 00:16:44,100
Are we really? Or are we going to see exactly what we're already seeing?

145
00:16:44,100 --> 00:16:49,179
And we see this today with developers who are using AI coding tools.

146
00:16:49,179 --> 00:16:53,100
And instead of just putting out a release two or three times a year,

147
00:16:53,100 --> 00:16:54,299
They're now cranking them out.

148
00:16:54,500 --> 00:17:00,820
instantly, because they can, and you end up in a situation where you hit burnout a lot faster.

149
00:17:01,220 --> 00:17:11,299
So that takes us right back to the humanities. You've got to look at this holistically. It's not just about cranking out new features, and not just about releasing new products to market.

150
00:17:11,900 --> 00:17:18,460
It's not just because you can go faster and have more and more iterations

151
00:17:19,220 --> 00:17:21,599
That may not be the right thing to do

152
00:17:24,500 --> 00:17:31,539
Yes, just another great example: we know that once a year Apple's going to come out with a new iPhone.

153
00:17:32,500 --> 00:17:34,839
What really? Are you sure? Maybe?

154
00:17:32,779 --> 00:17:42,460
Right, usually in September, right. But they might come out with a mid-cycle release in, let's say, March, and call it an "e" device.

155
00:17:43,140 --> 00:17:47,140
But for all practical purposes, once a year they come out with a new iPhone.

156
00:17:48,140 --> 00:17:52,140
Imagine what would happen to their revenue cycle

157
00:17:54,500 --> 00:17:56,579
If they didn't have that

158
00:17:57,259 --> 00:18:04,660
artificial once-a-year release cycle, but instead just started cranking out new phones as fast as they could, because they can.

159
00:18:08,220 --> 00:18:10,579
I'll just leave that out there. So what else have we got?

160
00:18:11,000 --> 00:18:40,940
Oh, so I guess we're not done with Apple yet. What we have is a really cool hack, if you will, of a case that looks an awful lot like the little Lego computer brick, so it's a little two-by-two sloped computer shape that you can put around the Mac mini, and the way it's been set up allows you to have your Mac mini on your table, and you

161
00:18:41,000 --> 00:18:51,339
will have this really cool little device. And the way it's been set up, you can charge your phone on it, because that's been set in place too, and that's kind of cool.

162
00:18:51,500 --> 00:19:10,500
I think this guy actually was the one who developed the little two-by-two Lego piece with an actual computer in it a couple of years ago, and so this is using his same designs for that, but scaling it up to put a Mac mini in it.

163
00:19:00,500 --> 00:19:04,920
Oh, really? Maybe.

164
00:19:10,299 --> 00:19:14,900
Oh, that's funny. We'll have to find that, if that's in fact the case.

165
00:19:13,500 --> 00:19:21,500
Yeah, I'm pretty sure I remember seeing that.

166
00:19:21,500 --> 00:19:38,500
I know he's basing it off the classic two-by-two sloped computer brick Lego piece, but I think this might have been the same guy who did that with an actual Lego piece, and if not, it still has the same look.

167
00:19:26,000 --> 00:19:28,000
Right. Mm-hmm.

168
00:19:35,000 --> 00:19:39,559
Yeah, and the touch screen is just such a nice touch there too, right?

169
00:19:38,500 --> 00:19:46,500
Yeah, I saw that, I was trying to figure out what he's actually showing on it, though.

170
00:19:42,440 --> 00:19:47,799
Well, what he's showing is an image that is just like the sticker,

171
00:19:46,500 --> 00:19:51,500
The one that was on the chip. Yeah.

172
00:19:47,799 --> 00:19:52,759
not really a sticker; it was what was on the actual little computer two-by-two sloped brick,

173
00:19:52,759 --> 00:20:01,319
but you know, it is a touch screen that you could use for other controls and there are other companies out there that have created things that are like that. Yeah, yeah.

174
00:19:59,000 --> 00:20:01,000
Kind of like a stream deck.

175
00:20:01,000 --> 00:20:02,579
You can make it into your own little stream deck.

176
00:20:02,599 --> 00:20:04,839
Yeah, I mean, that kind of was what

177
00:20:05,000 --> 00:20:26,000
I've forgotten the name of it, but that's exactly what I was thinking. This lets us move a little bit more into the Lego space, where we're kind of picking up on that studs piece, and Michael, you saw an article come across your feeds from Duke about someone who actually went to go work for Lego, didn't you?

178
00:20:26,000 --> 00:20:28,000
Yeah, so this is interesting

179
00:20:28,640 --> 00:20:34,400
in the alumni newsletter crap that they send you all the time asking for money, right?

180
00:20:34,400 --> 00:20:40,799
I think all universities do that. I saw this article, and it really caught me. It's actually from December, and

181
00:20:41,279 --> 00:20:43,279
it's talking about a

182
00:20:44,119 --> 00:20:49,200
student from Duke who actually went to Duke originally for the ballet program and

183
00:20:49,759 --> 00:20:51,759
then ended up switching into

184
00:20:52,720 --> 00:20:55,720
the sciences with chemistry.

185
00:20:56,000 --> 00:21:11,000
chemistry classes, and how that led her into the path of STEM and Teach for America, and ultimately connecting up with the Lego Education program.

186
00:21:11,000 --> 00:21:24,000
And I just thought it was really kind of cool how, you know, our original paths don't always go down where you think they're going to be and how Lego to me was always a way of connecting things.

187
00:21:24,000 --> 00:21:25,960
my love of Star Trek with

188
00:21:26,000 --> 00:21:40,960
my love of building Legos, because I used to make phasers and stuff out of it. But it was a really interesting kind of journey for this student. And it fit really well into a lot of our discussions we've been having around Lego lately. I think it's cool.

189
00:21:41,500 --> 00:21:56,539
Yeah, I like that too. And the fact that it's all about Lego education, which is where she wound up going, I think that's also really cool because Lego has been a key part of education through the robotics programs and through a variety of other things as well.

190
00:21:57,819 --> 00:22:06,220
On last week's show, we talked a little bit about the new Lego brick, and I've been playing around with it just a little bit since then, but I haven't built...

191
00:22:05,500 --> 00:22:07,839
Well, they are only about this big, right? They are little.

192
00:22:07,420 --> 00:22:11,420
Yeah, and one of the articles that turned up was the

193
00:22:11,500 --> 00:22:25,500
the fact that you can't replace the battery in it, which is sort of a standard thing that we've seen around phones for, oh, I don't know, like 20 years that you can't replace batteries and things.

194
00:22:25,500 --> 00:22:37,500
And it's funny because it bit me, because I had the Lego brick here at my desk and I was going to show it to some people, and I started shaking it and twisting it and moving it around and sure enough the battery had expired.

195
00:22:37,500 --> 00:22:41,500
Lego themselves suggest that you keep it on the charger if you're not actively using it.

196
00:22:41,500 --> 00:23:02,500
And it's been recharged, so we're good, but the battery is really small, and trying to pry it open to replace it or something feels, on a lot of levels, just kind of silly.

197
00:22:43,500 --> 00:22:45,920
So is it fully dead or are we able to recharge it?

198
00:22:49,099 --> 00:22:49,940
Okay, good.

199
00:22:53,099 --> 00:22:54,619
Of course, it's a brick.

200
00:23:01,500 --> 00:23:10,940
It would also be kind of dangerous, because kids can eat bricks as it is, and if you start peeling a brick apart into smaller pieces, that could be a safety hazard.

201
00:23:12,000 --> 00:23:18,240
Yeah, good point, Michael, too. So you don't want to have, necessarily, excuse me,

202
00:23:18,240 --> 00:23:23,599
like a button battery or something like that in there that could, you know, cause other issues.

203
00:23:23,599 --> 00:23:41,440
And that brings to mind also, who are these things for, right? You know, who are these sets for? And this is not the adult-fans-of-Lego kind of structure; there are plenty of sets that are really geared toward that. You're not going to spend a thousand dollars on a Death Star set for

204
00:23:42,000 --> 00:23:48,480
you know, junior, who's only six years old. And there's a, there's a lovely, you know, maybe,

205
00:23:46,000 --> 00:23:48,000
Some people might, but-

206
00:23:48,480 --> 00:24:10,400
maybe some people. I don't know who those people are. But there's a lovely video that we'll include in the show notes. You can take a look at it, about one of the Lego fans who had his kids kind of be exposed to this. And the sheer delight from the kids is just fantastic. I mean,

207
00:24:10,400 --> 00:24:12,039
it does add an awful lot to

208
00:24:12,000 --> 00:24:40,759
the play experience, although there are some out there saying, "Well, it's stunting the imagination, because now you're providing this extra stimulus to it," kind of a little bit like the phone conversation we had earlier. But at the same time, the fact that it unlocks some interesting new experiences, and that when you have more than one brick, how one gun turret, as it's shooting, pew pew, and another one that's nearby can sense that and go,

209
00:24:40,759 --> 00:24:42,000
Oh, I'm going to explode.

210
00:24:42,000 --> 00:24:55,400
Now, over here, those are kind of cool things and I would say that as we go forward in time, that will provide for new play experiences and for people to do more stuff with them.

211
00:24:55,400 --> 00:25:04,599
Now, along those lines there, there's a couple of other articles that we've listed out here too about finding other uses for the Smartbrick already.

212
00:25:04,599 --> 00:25:12,099
So, there's an example video of someone who's taken the little RFID piece and put it into a...

213
00:25:12,000 --> 00:25:23,000
lightsaber, and then putting the brick on said lightsaber, and moving it around and getting the experiences and the sounds related to that, is super deluxe cool.

214
00:25:23,000 --> 00:25:42,000
There are probably other things you can do from a sensor perspective on it, and we had one last point from Mastodon about how cloning the bricks is also possible, because they do comply with the appropriate ISO standards that allow for this sort of thing.

215
00:25:42,000 --> 00:25:48,000
The hacker community is on it as we suggested they were, right?

216
00:25:43,000 --> 00:25:51,240
Yeah. Of course they are. Of course they were. I mean, that's the fun thing. I mean, let's face it,

217
00:25:52,599 --> 00:26:01,480
growing up, for me, before having my first computer, I mean, Legos were your hacking tool,

218
00:26:01,480 --> 00:26:13,000
right? You built things that, I mean, you couldn't actually hack things with them, but you could pretend, right? And you learned, you know, how to break things down and how to build things up based off of components and features.

219
00:26:13,000 --> 00:26:39,000
You figured out how to put things together to represent something else. And, I mean, go back to the Lego computer design with the Mac mini in it, right? You're repurposing something, in a way, or making something to make it look like something else, and possibly even behave differently than originally designed, and that's the fun of hacking.

220
00:26:38,000 --> 00:26:43,200
Yeah. And tinkering and playing, which kind of brings us right back to the beginning of all this,

221
00:26:43,200 --> 00:26:48,880
you learn by playing with things, experiencing them, even at work.

222
00:26:46,500 --> 00:26:48,500
Even at work.

223
00:26:50,720 --> 00:27:07,279
So with that, friends, and Andy, we miss ya, we're looking forward to being back at full strength with you again soon. And if you've got some Lego brick stories, or you've got some other things that you'd like to hear us chat about, you can find us at gamesatwork.biz, and you can find us on the socials as well,

224
00:27:08,160 --> 00:27:15,119
we'd love to hear about what you're playing with at work or otherwise. Thanks a lot, everybody,

225
00:27:15,119 --> 00:27:16,720
and we'll see you next time.

226
00:27:15,500 --> 00:27:17,500
See ya!

227
00:27:16,000 --> 00:27:18,579
[upbeat music]

228
00:27:18,500 --> 00:27:24,140
You've been listening to GamesAtWork.biz, the podcast about gaming, technology, and play.

229
00:27:24,140 --> 00:27:27,539
We are part of the Blubrry podcasting network, and we'd like to thank the band,

230
00:27:25,000 --> 00:27:27,579
[upbeat music]

231
00:27:27,539 --> 00:27:30,660
Random Encounters for their song, Big Blue.

232
00:27:30,660 --> 00:27:34,299
You can follow us at our website at gamesatwork.biz.

233
00:27:34,500 --> 00:27:37,079
[upbeat music]

234
00:27:56,500 --> 00:27:58,740
[clapping]
