1
00:00:00,000 --> 00:00:20,000
[Music]

2
00:00:10,500 --> 00:00:15,800
This is Games@Work.biz, your weekly podcast about gaming, technology, and play.

3
00:00:15,800 --> 00:00:19,199
Your hosts are Michael Martine, Andy Piper, and Michael Rowe.

4
00:00:19,199 --> 00:00:22,800
The thoughts and opinions on this podcast are those of the hosts and guests alone,

5
00:00:22,800 --> 00:00:28,399
and are not the opinions of any organization which they have been, are, or may be, affiliated with.

6
00:00:32,799 --> 00:00:38,299
This is episode 5-4-8, The Uncomfortable Valley.

7
00:00:38,500 --> 00:00:41,079
[upbeat music]

8
00:00:47,000 --> 00:01:07,879
Hello and welcome again to another edition of Games@Work.biz. This is Michael Martine, one of your two Michael co-hosts, and not one of your Andy co-hosts; Andy is not with us this week, off on other engagements. But Michael, you and I are going to hold down the fort. We're going to have all kinds of good fun stuff to talk about. Aren't we?

9
00:01:08,500 --> 00:01:20,500
Yes, and I am also not one of your Andy co-hosts because there's only one Andy co-host as you said, and this Michael co-host is happy to be here. Michael, take it away. Let's have some fun.

10
00:01:14,500 --> 00:01:38,340
Yeah, let's do. So the very first article for this week is coming from Fast Company and I used to subscribe to this magazine once upon a time and I enjoyed seeing it all the time and they still have great articles that come through and this particular one was super funny for me to read about.

11
00:01:38,340 --> 00:01:44,420
It's by Rebecca Heilweil, and that kind of rhymes there, doesn't it, and it talks about

12
00:01:44,500 --> 00:01:59,620
Microsoft Teams and the emoji faces on Microsoft Teams. So Michael, you know, we've both experienced Teams, and yeah.

13
00:01:54,000 --> 00:01:59,599
I did, just barely, for a very short period of time before I switched companies.

14
00:01:59,599 --> 00:02:10,759
My current company does not use Teams, but I do have a customer who uses Teams, and I keep trying to force them on to Zoom.

15
00:02:07,500 --> 00:02:21,900
>> Well, good luck with that, because one of the reasons you might want to force them onto Zoom is that the emojis are a little on the weird side.

16
00:02:10,759 --> 00:02:11,680
Now we'll know why.

17
00:02:19,500 --> 00:02:21,500
Creepy

18
00:02:21,900 --> 00:02:37,420
So, the article is titled "The Uncomfortable Valley," and we have talked about different valleys here before, and I love the word "uncomfortable," because these emojis are not uncanny; they don't remind you of a...

19
00:02:37,500 --> 00:03:05,699
humanoid or a robot that's like a human but not quite. But it's a little uncomfortable because the emojis are animated and they do some stuff. So aside, maybe, even from, like, the smiley face, once you start going a little bit beyond that, those emojis do things that are not exactly what you'd really want to convey in many a business meeting, and that's kind of what Rebecca is writing about in her article.

20
00:03:05,699 --> 00:03:07,539
that the emojis are supposed to convey.

21
00:03:07,500 --> 00:03:23,500
And to do so effectively and efficiently with a more or less common thought. But the manner in which these emojis, and the animation behind them, come across is sometimes maybe a little less businesslike than one might expect.

22
00:03:25,500 --> 00:03:29,259
- Yeah, it is a funny story.

23
00:03:29,259 --> 00:03:37,419
And the examples that the author gives going through the uncanny valley discussion, et cetera,

24
00:03:37,419 --> 00:03:45,860
brought back nice little smiles and conversations that we had years ago in the 3D internet days.

25
00:03:47,659 --> 00:03:52,659
And I do find it interesting that,

26
00:03:56,300 --> 00:04:00,840
there's this, there's this corporate move right now to be playful.

27
00:04:02,580 --> 00:04:04,460
And I,

28
00:04:03,500 --> 00:04:07,020
don't say that with disdain; that's what we've been talking about for years.

29
00:04:07,580 --> 00:04:14,460
well, there's a difference to me between being playful and being professional.

30
00:04:16,620 --> 00:04:22,579
You can be professionally playful with, like, gaming technology, things like that.

31
00:04:22,579 --> 00:04:25,500
But in the end.

32
00:04:23,000 --> 00:04:25,959
Or playfully professional, with the stress on playful.

33
00:04:25,500 --> 00:04:34,000
And I actually posted about this on Mastodon, about being a text-based person living in an emoji world.

34
00:04:34,000 --> 00:04:44,500
We've got this environment, at least my current environment, where I work, that loves to put emojis in everything.

35
00:04:36,000 --> 00:04:38,000
[Laughs]

36
00:04:44,500 --> 00:04:48,500
And I don't need emojis in my documentation.

37
00:04:48,500 --> 00:04:51,500
I don't need emojis in my status reports.

38
00:04:51,500 --> 00:04:55,480
I don't need cute little icons of things and stuff.

39
00:04:55,500 --> 00:05:02,500
When I'm trying to consume information, the emojis provide no additional value, right?

40
00:05:02,500 --> 00:05:15,500
And so when I look at this, it's another one of these. Yes, the little faces are cute if you're, like, saying yay, you did great, right?

41
00:05:04,000 --> 00:05:06,000
Mm-hmm

42
00:05:15,500 --> 00:05:20,500
Or ha, ha, ha, I'm smiling or laughing, but you don't need the animation.

44
00:05:21,500 --> 00:05:25,500
That is an additional piece of cognitive load that you as a user have

45
00:05:26,500 --> 00:05:30,500
to deal with when it pops up on the screen.

46
00:05:30,500 --> 00:05:35,500
And if you're in the middle of, let's say you're in the middle of a finance audit.

47
00:05:35,500 --> 00:05:37,500
Right?

48
00:05:37,500 --> 00:05:42,500
And you're on Teams, and suddenly this cute little thing comes up.

49
00:05:42,500 --> 00:05:43,500
Let's yeee!

50
00:05:43,500 --> 00:05:44,500
No.

51
00:05:44,500 --> 00:05:45,500
No.

52
00:05:45,500 --> 00:05:46,500
I don't need that crap.

53
00:05:46,500 --> 00:05:48,500
I don't need that crap.

54
00:05:48,500 --> 00:05:50,500
So, yours is not working.

55
00:05:50,500 --> 00:05:53,500
Are you trying to give things?

56
00:05:50,500 --> 00:05:59,980
No, no, I'm just playing. You know, so what Michael's referring to is, I'm doing the reactions in macOS, right?

57
00:05:53,500 --> 00:05:55,579
[laughing]

58
00:05:56,459 --> 00:05:57,379
Reactions.

59
00:05:59,980 --> 00:06:08,459
You know, so sometimes you get some birthday balloons or you get some fireworks or some other things and there's certainly been articles and other topics.

60
00:06:00,459 --> 00:06:01,459
And they're not doing.

61
00:06:08,459 --> 00:06:18,579
I think we've talked about it, too, that there are times when that might not be the appropriate emotion or animation that you might want to convey in a particular experience.

62
00:06:18,579 --> 00:06:20,339
So I get it.

63
00:06:20,500 --> 00:06:37,500
I understand your point, Michael. If I'm reading serious documentation about how to operate some software or hardware or what have you, I don't need it to be cutesy. I need it to convey the information and allow me to do what I need to do, right?

64
00:06:35,000 --> 00:06:37,160
Exactly. Exactly.

65
00:06:37,500 --> 00:06:49,500
You know, it's not like at the end of a Monty Python, you know, the crawl where they, you know, claim that there were no llamas or coconuts or sparrows, you know, that were injured as a part of the making of the film.

66
00:06:50,500 --> 00:06:53,060
Exactly. Exactly. You got it.

67
00:06:51,500 --> 00:07:01,500
Yeah, yeah, so I'm with you. And that to me was why I was attracted to this article. I knew you were going to have a point of view on it.

68
00:06:59,000 --> 00:07:01,000
of course

69
00:07:01,500 --> 00:07:13,500
Because the idea of being professionally playful or playfully professional or, you know, being on the right part of that spectrum for the context of where you are is kind of important.

70
00:07:13,500 --> 00:07:22,500
And when you're dealing with multiple nationalities and you're dealing with all kinds of, you know, other norms and people.

71
00:07:21,500 --> 00:07:45,500
You want the message sent to equal the message received. And that's what this article is really talking about. Like the angry face, for example, where you might want to professionally, playfully convey that, oh, I don't like that particular set of numbers in that region of the country or the world for the sales figures.

72
00:07:45,500 --> 00:07:50,439
The fact that the emoji shakes, and kind of looks super-deluxe angry, might

73
00:07:50,500 --> 00:07:55,339
overstate how you feel, and that's not what you're trying to send.

74
00:07:54,000 --> 00:08:04,920
And also, I mean, those cultural differences, too, could play into misrepresentation or misunderstanding of what you're actually sending.

75
00:08:04,920 --> 00:08:10,199
So context is kind of important, don't you think?

76
00:08:09,000 --> 00:08:20,680
It kind of is. So, uncomfortable valley, I love the term, and thanks, I'm really, really grateful for that. Now, thinking about some other weird things: we've talked about some weird Nintendo

77
00:08:21,480 --> 00:08:34,519
experiences, like the Alarmo, was one of the things we talked about recently. And there's an article from The Verge, although there's been several articles on this particular front, about a flower toy, a talking flower toy

78
00:08:35,080 --> 00:08:58,879
that remains playful, and it will tell you various and sundry things as you see it. But it also looks a little bit weird, and it will interrupt your day, in the way I'm reading it, in a playful, fun kind of way, to tell you the time and maybe say something a little bit like, gosh, the Talking Moose would have done back in the day.

79
00:09:01,000 --> 00:09:12,919
Yeah, okay. So I get to be the grumpy old man this episode. And yeah, so this one I actually get,

80
00:09:05,500 --> 00:09:09,500
Oh my gosh, you don't have to... don't, don't feel like you need to play the role if you don't want to.

81
00:09:13,879 --> 00:09:24,360
because this is a consumer device for home and you could probably have it like in your kid's room or in a playroom or something. And you can have some fun with it and you can disable it.

82
00:09:24,500 --> 00:09:27,600
Yes, right

83
00:09:25,159 --> 00:09:31,000
You can tell it not to do things, too. And it's not a chat device.

84
00:09:31,000 --> 00:09:36,039
It's not designed for full interaction where it's sucking up the data in your house.

85
00:09:35,000 --> 00:09:38,240
Right, there's no large language model for this, right?

86
00:09:37,639 --> 00:09:44,919
So this one, actually... I would never have it, but I get it, and I'm okay with it.

87
00:09:46,000 --> 00:09:53,360
Okay, I see, you sidestepped that nicely. Now, how about this next one, about this camera idea?

88
00:09:46,200 --> 00:09:47,639
See I wasn't the grumpy old man.

89
00:09:53,360 --> 00:09:59,919
You know, how does this one grab you? I mean, it doesn't have a large language model to it, but it will help create a

90
00:10:00,399 --> 00:10:03,200
gamified version of your day and send you on

91
00:10:03,759 --> 00:10:08,240
quests to go and collect information and pictures and a whole lot more.

92
00:10:09,500 --> 00:10:27,120
Yeah, so this is from Hackster.io, and it's a device that a guy developed that gamifies taking environmental pictures, as I'd put it, of things around you.

93
00:10:27,120 --> 00:10:37,600
And I was actually listening to a story the other day about Pokemon Go.

94
00:10:35,500 --> 00:10:37,500
Oh yeah.

95
00:10:37,500 --> 00:10:39,500
And about pizzas?

96
00:10:37,600 --> 00:10:39,600
I know you, you, you, you.

97
00:10:39,500 --> 00:10:50,620
You were a fan of the Monopoly game and the whole idea of checking in places and, you know,

98
00:10:43,500 --> 00:10:45,500
Oh yeah, that was fun.

99
00:10:50,620 --> 00:11:34,139
we ultimately discovered the main reason for each of these is data collection: mapping out popular areas and finding out where people are. And so this is an individual doing it, it's kind of for fun, and I think those

100
00:11:09,500 --> 00:11:17,500
other games may have started for fun but tended to be a data suck in the long run.

101
00:11:17,500 --> 00:11:39,500
I hope this stays a one-off fun little project for a guy who just wants to hack around and do these things, and I'll leave it at that, because, you know, we all know I've gotten sensitive about my data over time, and I think it's important that people understand

102
00:11:23,500 --> 00:11:25,580
[laughing]

103
00:11:39,500 --> 00:11:50,419
the nuances of data collection. What can and cannot be collected, right? And what it can and cannot be used for. It's funny, I've listened to

104
00:11:52,340 --> 00:11:55,299
the TWiT network for 20 years.

105
00:11:56,000 --> 00:11:58,399
Gosh, they've been around that long.

106
00:11:57,620 --> 00:11:59,620
Yeah, and

107
00:12:00,059 --> 00:12:11,460
For the longest time, Leo Laporte, who's kind of the founder of that and does a lot of the shows there, would say, I've got nothing to hide, I'm an open book.

108
00:12:09,500 --> 00:12:31,600
I've got no problem with data privacy. And he's finally starting to switch his tune, recognizing that just because you've got nothing to hide doesn't mean you want everything consumed and known, right?

109
00:12:31,600 --> 00:12:39,500
And so I think we as tech people and people who enjoy technology.

110
00:12:39,500 --> 00:12:59,799
and learning new things and playing on the leading edge, need to kind of reassess our assumptions when we play with something new. Don't just assume everything's always going to be used for good, right?

111
00:12:59,799 --> 00:13:04,320
Or not used for nefarious purposes.

112
00:13:04,320 --> 00:13:09,480
And so I found it interesting that he was recognizing that.

113
00:13:09,500 --> 00:13:23,779
And yeah, I want to get your perspective, because I know... are you still doing, was it Foursquare, and checking in places?

114
00:13:20,000 --> 00:13:25,960
They have something called Swarm, which is a derivative of Foursquare.

115
00:13:23,779 --> 00:13:26,720
That was it, Swarm. The new version was... yeah, yeah.

116
00:13:25,960 --> 00:13:26,960
Yeah.

117
00:13:26,960 --> 00:13:34,360
Yeah, so there's certainly a lot of thoughts about what data you're putting out where.

118
00:13:34,360 --> 00:13:35,360
Who owns it?

119
00:13:35,360 --> 00:13:36,360
Who has control over it?

120
00:13:36,360 --> 00:13:37,360
Where could it go?

121
00:13:37,360 --> 00:13:40,279
What are the derivative works that come as a result?

122
00:13:40,279 --> 00:13:44,240
And this works whether you're playing with Google Maps.

123
00:13:44,240 --> 00:13:49,480
Even Apple Maps now has frequented places, and there's ways of storing or not storing that.

124
00:13:50,840 --> 00:14:00,720
Foursquare and Swarm have their own data structures and the like, and certainly Pokemon Go and a variety of other situations.

125
00:14:00,720 --> 00:14:02,759
Waze is another lovely example.

126
00:14:02,759 --> 00:14:50,000
They're all about harvesting data and providing a degree of entertainment to people, or some other element of value, because why would people do it if they weren't getting some value out of it? In order to then have a deeper hash of information that could be used and repurposed for things like... I thought you were going to talk about the article about pizza delivery optimization, you know. And for mapping, that's what many of these are harvesting data for: to keep up to speed with what's changing with the roads and what is now the most efficient, effective direction to go get somewhere, crowdsourcing that.

127
00:14:44,500 --> 00:15:06,500
Oh, I remember having, when I did a lot of work with the automotive industry, one of the big things was, you know, looking at the sensor data coming off the tires of the car, that you could then sell that information to the city to say, hey, you've got problems with potholes in this area and you ought to consider going out and sending a repair truck and fixing that.

128
00:15:06,500 --> 00:15:14,019
Yes. Exactly. Exactly. So it's a super interesting area about how do you,

129
00:15:14,019 --> 00:15:28,580
how could you monetize data that people are emitting? And the digital contrails of how we walk through life: many people don't consider at all what they're emitting, and some, you know, some do.

130
00:15:27,500 --> 00:15:38,840
You should assume that if a new technology can collect telemetry data, they're going to monetize it.

131
00:15:38,840 --> 00:15:52,500
And if your data is being monetized and you're getting no incremental value over just the toy or the game that you're playing, are you okay with it?

132
00:15:52,500 --> 00:15:59,019
Right, I mean, the phrase was: if you're not paying for something, you're paying for it with your data, right?

133
00:15:59,019 --> 00:16:02,139
And that's something we've talked about for eons.

134
00:16:03,000 --> 00:16:06,679
Yeah, but we have, but your average user,

135
00:16:06,679 --> 00:16:10,080
I don't think, understands what that means.

136
00:16:09,000 --> 00:16:20,360
Of course not, of course not. And the terms and conditions are deliberately so dense and impenetrable that you can't really fathom that unless you've had that level of experience.

137
00:16:21,000 --> 00:16:28,960
And we know that, you know, the smart individuals who listen to our podcast are well aware of all of this.

138
00:16:29,500 --> 00:16:39,340
Well, that's one of the reasons why, you know, all of our listeners are above average and super, you know, super clever when it comes to this kind of stuff.

139
00:16:39,340 --> 00:16:49,419
So, speaking of super clever, I think Andy actually was the one who shared this, and this tickled my fancy on a couple of different angles, right?

140
00:16:44,500 --> 00:16:51,480
Yes, I knew it would as soon as I saw it. I was like this was targeted for you specifically

141
00:16:49,419 --> 00:16:59,500
Well, yeah, so here's the deal. In an agentic world where you're using agents to accomplish your tasks,

142
00:17:00,500 --> 00:17:04,500
There's a need for a visual representation of them.

143
00:17:04,500 --> 00:17:08,500
And if you're in a development environment,

144
00:17:07,000 --> 00:17:08,200
Okey dokey!

145
00:17:08,500 --> 00:17:10,500
well, I mean, yeah, that's what we talked about.

146
00:17:10,200 --> 00:17:12,039
We talked about that two weeks ago.

147
00:17:10,500 --> 00:17:13,500
The other time a couple of weeks ago.

148
00:17:13,500 --> 00:17:18,500
There are ways for an agent to say,

149
00:17:18,500 --> 00:17:21,500
"Oh, I've received a task and I'm doing something with it."

150
00:17:21,500 --> 00:17:24,500
And in many cases, it's a chat experience back and forth, right?

151
00:17:24,500 --> 00:17:27,500
So you ask for something in the chat experience.

152
00:17:27,500 --> 00:17:29,460
So you get a working kind of...

153
00:17:29,500 --> 00:17:34,500
response, and then at some point there's then, okay, I've done the thing you've asked for.

154
00:17:34,500 --> 00:17:44,500
Now, what I've been looking at more and more are the design patterns of how do you orchestrate across a multiplicity of agents?

155
00:17:44,500 --> 00:17:56,500
that are independently operating on certain activities, and might need to jointly use their inputs and outputs from one another to do something together, right?

156
00:17:56,500 --> 00:17:59,500
An architect agent might need to do something with a user experience agent,

157
00:17:59,500 --> 00:18:07,500
a database administrator agent, to all come up with a solution that's going to work properly.

158
00:18:07,500 --> 00:18:20,500
And this pixel friends or pixel agents, pixel-agents from Pablo de Luca, is a really neat way of having a representation of multiple agents.

159
00:18:20,500 --> 00:18:29,500
Yes, using 8-bit characters, to show that there's work going on, and that an agent might be busy.

160
00:18:29,500 --> 00:18:33,500
One might be finishing up a piece of work, and another one might be ready to start.

161
00:18:33,500 --> 00:18:42,500
And if there needs to be sub-agents, for those of us who've been playing in this space for a long time, not just an agent, but a daemon,

162
00:18:42,500 --> 00:18:48,500
that you can do this kind of work and show it and see what's happening.

163
00:18:48,500 --> 00:18:58,500
And more easily as the human being in the center, orchestrate, recognize, see when things are done, inspect what has been completed, pass off a task to another agent.

164
00:18:59,500 --> 00:19:03,500
and watch it going in its way. So I love this.

165
00:19:03,000 --> 00:19:15,720
So, this is the ultimate management by walking around in a pixelated world because you could sit at your desk and watch all your agents go working and you could walk around without walking around.

166
00:19:15,720 --> 00:19:18,279
And I used to love doing management by walking around.

167
00:19:18,279 --> 00:19:31,279
I think that's a very effective way to do management when all your team is co-located and physically stuck, right, stuck in a room together because it's social, it's interactive.

168
00:19:22,500 --> 00:19:24,500
Yep, very HP of you by the way.

169
00:19:33,000 --> 00:19:37,839
Management by walking around, "Voyeur Edition."

170
00:19:36,000 --> 00:19:38,259
[laughing]

171
00:19:37,839 --> 00:19:40,400
[laughing]

172
00:19:40,400 --> 00:19:41,799
because all your agents are working.

173
00:19:41,799 --> 00:19:50,200
Matter of fact, there was a thing announced last week or this week that is an agent orchestrator

174
00:19:50,640 --> 00:19:51,480
for the Mac.

175
00:19:51,480 --> 00:20:03,160
It's an open-source project called OSARIS, and I'll put a link in the show notes for it, that I just started playing with.

176
00:20:03,000 --> 00:20:32,920
It doesn't give you the UI, but I'm wondering if you could hook it up to this, because I've actually taken some skills that somebody defined to improve your code reviews and stuff. And they understand, like, the current rules around SwiftData, the current rules around Swift concurrency, the current rules, I mean, like last week's, not the ones from when the models were built.

177
00:20:34,000 --> 00:20:54,000
And I've been using those, and adjusting and creating local models based off of other coding models, so that I can have my Ollama server hook up to my Xcode environment with current information, to do things like: review this code, I'm having a problem here, can you give me some pointers?

178
00:20:55,000 --> 00:21:02,000
And so this is an orchestrator that has skills and capabilities to do things like access.

179
00:21:04,000 --> 00:21:29,000
So it's on that edge now where it's open source, but is it freeware? There's a difference between those two things, right? If it's open source, that's one thing. If it's freeware, they're getting money somehow.

180
00:21:17,500 --> 00:21:19,960
And are you paying for it with your data or not?

181
00:21:30,000 --> 00:21:32,960
So yeah, very, very interesting.

182
00:21:33,000 --> 00:21:49,000
Interesting set of things. And as soon as I saw this from Andy, I was like, yep, this is for you, Michael. It's the pixelated 8-bit world, and the number of games that we played with pixelated things...

183
00:21:51,000 --> 00:21:53,000
It just fits so perfectly for this.

184
00:21:53,500 --> 00:22:41,500
It does. I mean, it leaves you in, not even the uncomfortable valley, it leaves you in a comfortable valley, because it gives you a user experience like a comfy chair. But it gives you that opportunity to really kind of see what's going on. And what I also like about this is the design pattern of multiple threads, which could even be the same agent when you think about it. But you can represent the same agent with multiple 8-bit characters, right? So the first agent is acting as that architect agent. The second one is doing documentation. The third one is doing user experience creation.

185
00:22:05,500 --> 00:22:07,500
Like a fuzzy sweater.

186
00:22:09,740 --> 00:22:12,700
Oh no, not the comfy chair! Not the comfy chair!

187
00:22:41,500 --> 00:22:52,500
And they're all the same agent underneath, right? Still the same LLM, etc. But you can now split them apart and see, okay, I've got three threads that are working concurrently.

188
00:22:45,000 --> 00:22:50,000
Yeah. Well.

189
00:22:53,500 --> 00:23:01,500
One may finish before the others, and then you can go and inspect what's happened. And then, now...

190
00:22:57,000 --> 00:23:12,000
I think the interesting thing here, and I know we're going to run out of time, is there are a lot of paid services that do this exact thing, they just don't have that cute, playful nature sitting on top that allows you to understand what's going on.

191
00:23:11,000 --> 00:23:14,559
And they're not open source, so there you go, right?

192
00:23:12,000 --> 00:23:17,000
Well, yeah, that's true.

193
00:23:14,559 --> 00:23:16,640
[laughs]

194
00:23:18,680 --> 00:23:21,640
Excuse me, yeah, so you're right,

195
00:23:21,640 --> 00:23:23,440
we're gonna run out of time if we don't move along.

196
00:23:23,440 --> 00:23:26,519
And if we had all the time in the world,

197
00:23:26,519 --> 00:23:29,680
because there were agents that could take over the jobs,

198
00:23:29,680 --> 00:23:34,720
then we could have time to talk about this stuff for hours and hours on end.

199
00:23:34,720 --> 00:23:35,920
And the Washington Post,

200
00:23:35,920 --> 00:23:41,039
and there were several other articles that came around in the last week about what Anthropic is

201
00:23:41,000 --> 00:23:52,000
thinking are the jobs that are most and least exposed to having AI be able to handle the preponderance of the work that is going on.

202
00:23:52,000 --> 00:24:06,000
So it wasn't exactly surprising to me where things were fitting on here: plumbers and electricians, hairdressers, where those roles are fitting versus web designers and others like that.

203
00:24:05,000 --> 00:24:11,319
The only thing that I would add to this and it's a key section of the article is that

204
00:24:11,960 --> 00:24:18,359
their analysis, and if you go back historically this may be a common pattern, is that

205
00:24:19,960 --> 00:24:27,720
86% of the most vulnerable workers are women based off of what's being automated.

206
00:24:27,720 --> 00:24:33,240
And that is a very interesting statistic and I'd love to learn more about that

207
00:24:35,000 --> 00:24:40,359
through additional stories about this, because that seems very, very bad.

208
00:24:43,500 --> 00:24:48,779
Yep, I'll be interested to see what you turn up, Michael. I think that's an astute observation.

209
00:24:50,539 --> 00:24:59,339
So, rounding out our topics for this week, we had kind of a fun-ish one. I mean, it's not really fun when you think about it, but it was an intriguing way...

210
00:24:57,500 --> 00:25:00,000
[laughs]

211
00:24:59,900 --> 00:25:12,380
Yeah, exactly, it was an intriguing way of leveraging large language models and AI here. And this is from The Verge; the title, which was great clickbait, is "ChatGPT did not cure a dog's cancer."

212
00:25:13,500 --> 00:25:16,859
What would you say was your main takeaway from this, Michael?

213
00:25:18,000 --> 00:25:23,000
- Yeah, I mean, the clickbait title aside,

214
00:25:24,359 --> 00:25:29,000
my take is, ChatGPT doesn't solve anything.

215
00:25:29,000 --> 00:25:40,920
It provides you with data, with patterns that doctors and scientists and researchers and other people might not have found, and then they can go do something with them, right?

216
00:25:40,920 --> 00:25:43,160
And so from that perspective,

217
00:25:43,160 --> 00:25:46,759
you're leveraging it to brainstorm treatment ideas,

218
00:25:46,759 --> 00:25:48,279
I think that's a good thing, right?

219
00:25:48,000 --> 00:25:59,000
I think it's an interesting way of doing it. And if that helps them uncover anything, that's great. But yeah, I don't think ChatGPT cures anything.

220
00:26:01,000 --> 00:26:53,880
I liked the reference to the protein folding from the AlphaFold example, too, and that was intriguing for me, just because I've been running it on my machines for... yeah, so I've still been doing that for a really long time, and it's interesting to see how this is continuously moving forward, and I have to imagine that that's going to be a lovely, lovely

221
00:26:14,000 --> 00:26:16,000
Are you still running Folding@home?

222
00:26:31,799 --> 00:26:47,400
workload to put on quantum computers as we go into the future, because that way you're not doing the bulldozer version of, okay, you find the contents of the sphere by slicing off one slice of the sphere, and then a second slice, and then a third, go ch-ch-ch-ch all the way down.

223
00:26:48,359 --> 00:26:55,000
So that was kind of cool. All right, so take us home, Michael, with this Devo link,

224
00:26:55,000 --> 00:26:59,640
you said you had something interesting to share, and I'm all -- I'm all excited.

225
00:26:57,000 --> 00:27:00,500
I got two angles, actually, one that just came up.

226
00:27:00,500 --> 00:27:06,259
If you remember, what was the very, very first Kickstarter I backed?

227
00:27:01,000 --> 00:27:06,839
Oh my gosh, Michael, I don't keep track of that.

228
00:27:06,259 --> 00:27:09,380
It was a Devo documentary that never happened.

229
00:27:08,839 --> 00:27:13,240
Oh, yeah, yeah. That's right. That's right. Yes.

230
00:27:09,380 --> 00:27:23,759
The guy took everybody's money and disappeared, basically, because it was going to be the authorized Devo documentary, and then Devo said no, after he took everybody's money.

231
00:27:23,759 --> 00:27:24,900
Anyway, so that was the first angle.

232
00:27:24,900 --> 00:27:26,900
So this is... Devo has a

233
00:27:27,000 --> 00:27:58,960
YouTube channel called De-Evolutionary Times, where they go in and they give background and information about the videos and the albums that they created over the years. And I think this was filmed, I want to say, within the last couple of years; it may even be, like, last year. Anyway, episode six is on the album "Oh, No! It's Devo," and that album has a wonderful history for me.

234
00:27:58,000 --> 00:28:26,799
In high school is when this album came out, and a friend of mine had gotten tickets to go see them live at the Fox Theater in Atlanta. In high school I lived about a hundred and twenty miles south of Atlanta, and my parents said, no, you can't go. I'm like, but his dad's taking him! So his dad got to go see the concert with him instead of me. However, the interesting thing is how it ties in with this show.

235
00:28:27,000 --> 00:28:51,000
What Devo did for that concert tour is they started using computer-generated animation and integrated it with their music system, so that they could control what was on giant video screens behind them down to the millisecond.

236
00:28:51,000 --> 00:28:58,000
And so there's videos out there, and the videos don't look that good because of the way they did it.

237
00:28:57,000 --> 00:28:58,000
They finally rendered them out.

238
00:28:58,000 --> 00:29:00,000
There are sync problems in it.

239
00:29:00,000 --> 00:29:07,000
But during the concert there's a point where, as Devo did in a lot of things, they would kind of march in place, singing, facing forward.

240
00:29:07,000 --> 00:29:18,000
And Mark Mothersbaugh, who's the lead singer, would point up over his right shoulder at the exact time when a certain thing would happen in the video.

241
00:29:18,000 --> 00:29:26,000
And all of that was computer-generated graphics happening in real time, based off of the integration between the computers and their keyboards

242
00:29:27,000 --> 00:29:30,000
that tied into the sound system and into the video system.

243
00:29:30,000 --> 00:29:42,000
So this is, like, the earliest example of a major musical act doing some really cool computer-generated graphics in real time in video.

244
00:29:42,000 --> 00:29:45,000
It's a fantastic 16-minute video.

245
00:29:45,000 --> 00:29:54,000
Highly recommend you watch it all the way through and just see how we've gotten from that, because it's, like, wireframes and stuff, to what you can do now.

246
00:29:54,000 --> 00:29:56,920
If you've ever gone to, like... when I went to Moogfest

247
00:29:57,000 --> 00:30:17,880
a couple of years ago, when it was still here in Durham, there are a lot of bands that do this, electronic bands, that will do music integrated with video content, etc., in real time, and they're basically using a Mac and a keyboard, right? And that's it. And this is, it's just really, really cool,

248
00:30:17,880 --> 00:30:19,000
highly recommend watching.

249
00:30:19,500 --> 00:30:27,759
Awesome, awesome. All right, well, you'll find that in the show notes, along with all the other links from what we talked about today. So if you're like, hey, what was that thing?

250
00:30:27,759 --> 00:30:31,299
you know, now you can go to the website and you'll see what the thing is,

251
00:30:29,000 --> 00:30:30,000
That was the thing.

252
00:30:31,900 --> 00:30:45,180
because it'll be there. All right, well, thanks, everyone, for joining us for this edition. You know where to find us: Games@Work.biz. Drop us a link or an idea, and we'll be happy to incorporate it in our show

253
00:30:45,279 --> 00:30:49,079
next time, right here on Games@Work dot

254
00:30:49,000 --> 00:30:51,400
biz. See ya.

255
00:30:50,299 --> 00:30:52,299
See ya, everybody.

256
00:30:50,500 --> 00:30:53,079
[upbeat music]

257
00:30:53,000 --> 00:30:58,599
You've been listening to Games@Work.biz, the podcast about gaming, technology, and play.

258
00:30:58,599 --> 00:31:02,039
We are part of the Blubrry podcasting network, and we'd like to thank the band

259
00:30:59,500 --> 00:31:02,079
[upbeat music]

260
00:31:02,039 --> 00:31:05,160
Random Encounters for their song, Big Blue.

261
00:31:05,160 --> 00:31:08,799
You can follow us at our website at Games@Work.biz.

262
00:31:09,000 --> 00:31:11,579
[upbeat music]
