June 6, 2025

Josh Bongard: What are Biological Robots? How AI is Reshaping Life, Consciousness & Matter!

Josh Bongard is Professor of Computer Science at the University of Vermont and director of the Morphology, Evolution & Cognition Laboratory. His work involves automated design and manufacture of soft, evolved, and crowdsourced robots, as well as computer-designed organisms. In 2007, he was awarded a prestigious Microsoft Research New Faculty Fellowship and was named one of MIT Technology Review's top 35 young innovators under 35. In 2010 he was awarded a Presidential Early Career Award for Scientists and Engineers (PECASE) by Barack Obama at a White House ceremony. He has received funding from NSF, NASA, DARPA, ARO and the Sloan Foundation. He is the co-author of the book How the Body Shapes the Way We Think, the co-author of "Designing Intelligence: Why Brains Aren't Enough", the instructor of a reddit-based evolutionary robotics MOOC, and director of the robotics outreach program Twitch Plays Robotics.

TIMESTAMPS:
(0:00) - Introduction
(1:22) - Life, Consciousness & Intelligence
(5:14) - How The Body Shapes The Way We Think
(9:18) - Evolutionary Robotics & Consciousness
(17:00) - Biological Robots ("Xenobots")
(24:00) - Implications of Self-Replicating Living Machines
(32:00) - The Role of AI in Shaping Biology
(39:00) - What is Conscious, Really?
(42:00) - AI Robotics
(46:00) - The Advantage of Interdisciplinary Collaborating
(49:00) - Escaping Cartesian Dualism
(53:00) - Meta-Materials (Groundbreaking Work!)
(56:00) - Cause & Effect
(1:04:48) - Expanding Morphospace in its Entirety
(1:12:00) - Blurring the Lines Between Living & Non-Living (Meta-Materials Are The Future!)
(1:17:14) - Non-Embodiment vs Embodiment AI
(1:20:00) - Conclusion

EPISODE LINKS:
- Josh's Website: https://jbongard.github.io/
- Josh's Lab: https://www.meclab.org/
- Josh's Channel: https://youtube.com/@joshbongard3314
- Josh's X: https://x.com/DoctorJosh
- Josh's Publications: https://tinyurl.com/3pd4t8ff
- Josh's Book: https://tinyurl.com/4wd7hw3s
- Michael Levin 1: https://www.youtube.com/watch?v=v6gp-ORTBlU
- Michael Levin 2: https://www.youtube.com/watch?v=kMxTS7eKkNM
- Michael Levin 3: https://www.youtube.com/watch?v=1R-tdscgxu4
- Michael Levin Lecture: https://www.youtube.com/watch?v=aQEX-twenkA
- Michael Levin & Terrence Deacon: https://www.youtube.com/watch?v=HuWbHwPZd60
- Keith Frankish: https://www.youtube.com/watch?v=QxDYG0K360E
- Keith Frankish 2: https://www.youtube.com/watch?v=jTO-A1lw4JM
- Keith Frankish Lecture: https://www.youtube.com/watch?v=IbjGRcqD96Q
- Nicholas Humphrey: https://www.youtube.com/watch?v=SCTJb-uiQww
- Nicholas Humphrey Lecture: https://www.youtube.com/watch?v=Z3cWQLUbnKs
- Mark Solms: https://www.youtube.com/watch?v=qqM76ZHIR-o
- Mark Solms 2: https://www.youtube.com/watch?v=rkbeaxjAZm4

CONNECT:
- Website: https://tevinnaidu.com
- Podcast: https://creators.spotify.com/pod/show/mindbodysolution
- YouTube: https://youtube.com/mindbodysolution
- Twitter: https://twitter.com/drtevinnaidu
- Facebook: https://facebook.com/drtevinnaidu
- Instagram: https://instagram.com/drtevinnaidu
- LinkedIn: https://linkedin.com/in/drtevinnaidu

=============================
Disclaimer: The information provided on this channel is for educational purposes only. The content is shared in the spirit of open discourse and does not constitute, nor does it substitute, professional or medical advice. We do not accept any liability for any loss or damage incurred from you acting or not acting as a result of listening/watching any of our contents. You acknowledge that you use the information provided at your own risk.
Listeners/viewers are advised to conduct their own research and consult with their own experts in the respective fields.

1
00:00:00,080 --> 00:00:02,480
So I am interested in the
philosophical side of things.

2
00:00:02,480 --> 00:00:06,400
So one of the things that I see
in the Xenobots and some of the

3
00:00:06,440 --> 00:00:11,240
other kind of exotic robots that
we work with is a conflation of

4
00:00:11,240 --> 00:00:15,560
thought and action.
So if you take a Roomba, the the

5
00:00:15,560 --> 00:00:18,600
robot vacuum cleaner, you can
point to the wheels and you can

6
00:00:18,600 --> 00:00:21,480
say action.
You can open it up and point to

7
00:00:21,480 --> 00:00:25,520
the central, the CPU, and say,
you know, thought or cognition

8
00:00:25,520 --> 00:00:29,760
or processing. In traditional
robots, there is a Cartesian cut

9
00:00:29,760 --> 00:00:32,680
you can make that separates the
body from the brain.

10
00:00:33,040 --> 00:00:36,200
But what I'm excited about is if
you look at a lot of cutting

11
00:00:36,200 --> 00:00:40,800
edge technologies, that
separation is becoming less and

12
00:00:40,800 --> 00:00:46,240
less obvious, which I think is,
you know, finally the beginnings

13
00:00:46,240 --> 00:00:50,080
of an acid that's dissolving
this distinction that at least

14
00:00:50,080 --> 00:00:53,600
in the West has been around for
over 300 years.

15
00:00:54,240 --> 00:00:57,120
So I think that's great from a
purely intellectual point of

16
00:00:57,120 --> 00:00:59,280
view.
But again, it, it also is

17
00:00:59,280 --> 00:01:02,440
important for us as a species to
understand that, you know,

18
00:01:02,440 --> 00:01:06,000
brains are not everything and
everything else is expendable.

19
00:01:06,000 --> 00:01:09,520
You know, it's, the story is
much more complicated than that.

20
00:01:09,520 --> 00:01:13,080
And I think that finally
overturning Cartesian dualism

21
00:01:13,080 --> 00:01:16,680
will actually be a positive
thing for for society.

22
00:01:21,840 --> 00:01:24,640
Josh, I've been following your
work for years and you guys have

23
00:01:24,640 --> 00:01:26,400
done some incredible stuff in
the field.

24
00:01:27,040 --> 00:01:30,160
It's biology, robotics,
artificial intelligence, so many

25
00:01:30,160 --> 00:01:31,560
different branches we can go
into.

26
00:01:31,720 --> 00:01:34,800
But I think the best place to
start would be let's start with

27
00:01:34,800 --> 00:01:36,920
definitions.
How do you define life,

28
00:01:37,360 --> 00:01:42,560
consciousness and intelligence?
Life, consciousness and

29
00:01:42,560 --> 00:01:44,600
intelligence.
So we're diving in the deep end

30
00:01:44,640 --> 00:01:49,600
to start.
OK, so to me I am very much in

31
00:01:49,600 --> 00:01:54,120
favour, not surprisingly, of an
embodied approach to life.

32
00:01:54,200 --> 00:01:59,000
So life in many ways is a
process of self construction.

33
00:02:00,120 --> 00:02:04,200
This has been articulated I
think, best in the idea of

34
00:02:04,200 --> 00:02:09,080
autopoiesis, that you can
construct your own components.

35
00:02:09,360 --> 00:02:13,080
Those may be parts of your own
body, parts of your own brain,

36
00:02:13,080 --> 00:02:16,240
tools.
They may be offspring. You

37
00:02:16,240 --> 00:02:19,560
know, it's almost a cliché,
but life perpetuates itself,

38
00:02:19,560 --> 00:02:22,880
which seems obvious and almost
tautological.

39
00:02:23,360 --> 00:02:26,960
But it is not a trivial thing to
perpetuate oneself and

40
00:02:26,960 --> 00:02:31,960
constantly self construct.
And so that is to me the

41
00:02:31,960 --> 00:02:34,760
definition of life.
Not, none of our technologies,

42
00:02:34,760 --> 00:02:38,560
no inert materials that we know
of are capable of this.

43
00:02:39,080 --> 00:02:41,800
And that's part of what
interests me about robotics.

44
00:02:41,800 --> 00:02:45,960
I believe in future we will
create machines that are living

45
00:02:45,960 --> 00:02:49,880
in the sense that maybe they're
made of living materials, but

46
00:02:49,880 --> 00:02:52,360
even if they're not, that
they're able to constantly

47
00:02:52,360 --> 00:02:57,240
construct the self and propagate
the self in various ways into

48
00:02:57,240 --> 00:03:00,960
the future.
So that that's my definition of

49
00:03:00,960 --> 00:03:03,880
life.
You asked about consciousness.

50
00:03:04,920 --> 00:03:08,040
All, all cards on the table.
I'm an illusionist.

51
00:03:08,040 --> 00:03:11,080
I do not believe consciousness
is a thing.

52
00:03:11,080 --> 00:03:14,400
We've been looking for it for a
long time, haven't found it.

53
00:03:15,520 --> 00:03:18,720
I think it's just one of those
things that's the best, the most

54
00:03:18,720 --> 00:03:22,720
convenient term that humans have
come up with for something

55
00:03:22,720 --> 00:03:25,400
that's epiphenomenal, that's not
really there.

56
00:03:26,440 --> 00:03:29,680
I would follow in the footsteps
of Daniel Dennett in this way,

57
00:03:30,160 --> 00:03:33,560
that it is a form of illusion
and we can point to optical

58
00:03:33,560 --> 00:03:37,240
illusions, auditory illusions,
motor illusions.

59
00:03:37,800 --> 00:03:41,400
Our brains are very good at
fooling themselves and I think

60
00:03:41,400 --> 00:03:44,960
fooling themselves into
believing they are conscious is

61
00:03:44,960 --> 00:03:48,480
one of those things.
And what, what was your third

62
00:03:48,840 --> 00:03:51,000
desideratum?
Intelligence.

63
00:03:51,480 --> 00:03:55,720
Intelligence.
Aha, yes, OK, that one again,

64
00:03:55,720 --> 00:04:00,400
I'm sort of a pragmatist about
it: intelligence is the

65
00:04:00,400 --> 00:04:03,880
ability to make sure that you do
not get painted into a corner in

66
00:04:03,880 --> 00:04:06,040
the future.
That's it.

67
00:04:07,120 --> 00:04:09,480
I love the fact that you brought
up the fact that you think

68
00:04:09,480 --> 00:04:13,240
consciousness is an illusion.
Well I I guess the word there is

69
00:04:13,240 --> 00:04:15,800
quite important and and
difficult to work around because

70
00:04:15,800 --> 00:04:17,519
you often have to backtrack what
that means.

71
00:04:17,880 --> 00:04:20,880
When I wrote my dissertation, it
was on illusionism as a theory

72
00:04:20,880 --> 00:04:23,360
of consciousness.
OK, there you go.

73
00:04:23,680 --> 00:04:26,360
Followed in the footsteps of
Dan Dennett, Keith Frankish,

74
00:04:26,360 --> 00:04:29,600
Michael Graziano and I put all
of this together to defend it

75
00:04:29,640 --> 00:04:32,240
within psychiatry.
And I find it interesting

76
00:04:32,240 --> 00:04:34,920
because it's it's a very
misunderstood theory of

77
00:04:34,920 --> 00:04:36,480
consciousness.
Well, it's not really a theory

78
00:04:36,480 --> 00:04:39,120
of consciousness, but rather a
theory that shows how other

79
00:04:39,120 --> 00:04:44,120
theories of consciousness tend
to lack substantive evidence of

80
00:04:44,120 --> 00:04:46,800
what they're talking about.
So it's almost like intuition

81
00:04:46,800 --> 00:04:50,160
pumps, what Dennett called them,
for a theory of consciousness,

82
00:04:50,240 --> 00:04:52,760
sort of how to think of it.
But when I look at your work,

83
00:04:52,760 --> 00:04:55,680
it's almost easy to say that you
might be an illusionist.

84
00:04:55,680 --> 00:04:57,800
And I thought that right up
until now.

85
00:04:57,800 --> 00:05:00,480
So you've you've managed to
confirm what I've been thinking

86
00:05:00,480 --> 00:05:02,240
all along.
OK, great.

87
00:05:03,080 --> 00:05:05,280
Let's start off with I've I've
I've scheduled this.

88
00:05:05,880 --> 00:05:09,200
Sorry, I've prepped this podcast
with 10 main questions in mind.

89
00:05:09,320 --> 00:05:12,480
The first one is what you just
mentioned: embodied intelligence.

90
00:05:12,840 --> 00:05:16,560
In the book you co-authored, How
the Body Shapes the Way We

91
00:05:16,560 --> 00:05:18,760
Think.
You argue that cognition is

92
00:05:18,760 --> 00:05:22,320
deeply rooted within the body's
physical form.

93
00:05:22,440 --> 00:05:26,200
So how does this embodied
perspective challenge our

94
00:05:26,200 --> 00:05:28,160
traditional abstract models of
mind?

95
00:05:28,440 --> 00:05:31,200
I mean, this podcast is called
Mind Body Solution, paying

96
00:05:31,200 --> 00:05:32,800
homage to the infamous mind body
problem.

97
00:05:32,960 --> 00:05:35,000
But therein might be a problem,
the fact that we're

98
00:05:35,000 --> 00:05:38,000
differentiating between mind and
body, separating these two

99
00:05:38,000 --> 00:05:40,160
entities.
Does this problem exist?

100
00:05:40,160 --> 00:05:43,400
What are your thoughts of this?
Yeah, OK, great.

101
00:05:44,320 --> 00:05:48,200
So yeah, embodied cognition,
this term, it means a lot of

102
00:05:48,200 --> 00:05:50,280
different things to a lot of
people.

103
00:05:51,600 --> 00:05:54,880
I've worked in embodied
cognition for many years now.

104
00:05:54,880 --> 00:06:00,240
So my particular perspective
on it is exactly that:

105
00:06:00,240 --> 00:06:03,520
that there is no distinction
between what we conveniently

106
00:06:03,520 --> 00:06:08,320
refer to as mind and what we may
less problematically

107
00:06:08,320 --> 00:06:12,520
refer to as the body.
There is no separation.

108
00:06:13,120 --> 00:06:17,000
There are some additional kind
of add-ons, evolutionarily

109
00:06:17,000 --> 00:06:20,840
recent add-ons like neurons and
brains and central nervous

110
00:06:20,840 --> 00:06:25,080
systems and prefrontal cortices,
but they're the icing on the

111
00:06:25,080 --> 00:06:28,080
cake.
They're not the cake. The nervous

112
00:06:28,080 --> 00:06:33,040
system facilitates things that
organisms without brains were

113
00:06:33,040 --> 00:06:36,760
able to do for a very, very long
time.

114
00:06:37,160 --> 00:06:41,200
That when you get down to brass
tacks, there isn't anything

115
00:06:41,200 --> 00:06:45,120
really qualitatively new that
brains have brought to the

116
00:06:45,120 --> 00:06:50,320
table, although from a human
perspective, it often feels like

117
00:06:50,320 --> 00:06:52,880
there is.
Again, illusions are creeping

118
00:06:52,880 --> 00:06:56,760
back in.
When when you think of embodied

119
00:06:56,760 --> 00:07:00,760
cognition, how familiar or how
much do you also consider what

120
00:07:01,000 --> 00:07:02,680
psychologists call 4E
cognition?

121
00:07:02,680 --> 00:07:06,480
I mean enactivism and embodied
cognition.

122
00:07:06,480 --> 00:07:09,840
And then you have the others
like embedded cognition, and what is

123
00:07:09,840 --> 00:07:11,400
the other E?
And there's one more.

124
00:07:11,920 --> 00:07:13,280
Yeah, I'm trying to remember
now.

125
00:07:13,280 --> 00:07:14,640
What is it?
It's embedded.

126
00:07:15,200 --> 00:07:19,400
OK, yeah, I, I, I think you know
this.

127
00:07:19,720 --> 00:07:23,880
My belief comports with a lot of
those, you know, and now putting

128
00:07:23,880 --> 00:07:28,000
on my roboticist hat, you know,
the devil is in the details.

129
00:07:28,000 --> 00:07:31,600
What exactly does it mean to be,
you know, all the E's,

130
00:07:31,600 --> 00:07:37,720
empowered, enacted, enabled, you
know, you name it, embodied.

131
00:07:37,720 --> 00:07:40,880
At the surface, it's kind
of intuitive.

132
00:07:40,880 --> 00:07:43,720
You know, something or someone
that has a body that's a

133
00:07:43,720 --> 00:07:46,880
physical tool with which to
interact with the environment

134
00:07:46,880 --> 00:07:52,560
somehow, you know, intuitively
has more options and ability to

135
00:07:52,960 --> 00:07:56,080
prevent getting painted into a
corner in the future, right?

136
00:07:56,080 --> 00:07:59,240
It just kind of makes sense.
But what I find fascinating in

137
00:07:59,240 --> 00:08:03,280
the the research that I do and
some of my colleagues do is when

138
00:08:03,280 --> 00:08:07,320
you get down into the into the
weeds, things become very non

139
00:08:07,320 --> 00:08:10,520
obvious.
So I co-authored How the Body

140
00:08:10,520 --> 00:08:13,640
Shapes the Way We Think with my
PhD advisor Rolf Pfeifer.

141
00:08:14,000 --> 00:08:17,880
And when I was a PhD student
with Rolf, I would always ask

142
00:08:17,880 --> 00:08:20,920
him, you know, how exactly does
the body shape the way we think?

143
00:08:20,920 --> 00:08:24,920
What exactly is it about the,
the body that neural networks

144
00:08:24,920 --> 00:08:27,120
can't do?
And we would have, you know, a

145
00:08:27,120 --> 00:08:29,280
lot of heated discussions about
that.

146
00:08:29,280 --> 00:08:32,039
And and that's what I've
dedicated my career to is to

147
00:08:32,039 --> 00:08:37,120
trying to quantify and
concretize, you know, exactly

148
00:08:37,120 --> 00:08:40,120
what this means because we can
build machines to try out some

149
00:08:40,120 --> 00:08:42,799
of these theories.
Whereas in the, you know, the

150
00:08:42,799 --> 00:08:46,680
cognitive sciences and the
psychological sciences, you

151
00:08:46,680 --> 00:08:49,720
know, there's only so much we
can do to humans to try and

152
00:08:49,720 --> 00:08:53,320
understand how the E's exist in
humans.

153
00:08:53,720 --> 00:08:55,520
But with robots, you can do
anything.

154
00:08:55,520 --> 00:08:59,760
So I think that's the difference
between the fields, but I think

155
00:08:59,760 --> 00:09:03,080
their long-term goal is the same:
to really understand what all

156
00:09:03,080 --> 00:09:06,400
these E's mean.
You you actually touched on the

157
00:09:06,400 --> 00:09:07,720
one.
I think I might have made a

158
00:09:07,720 --> 00:09:10,400
mistake, but the other E was
extended.

159
00:09:11,000 --> 00:09:12,360
Yes.
Those tools that we do use,

160
00:09:12,360 --> 00:09:13,560
yeah.
So like our cell phones,

161
00:09:13,560 --> 00:09:17,080
everything else that we use
that forms part of our cognition.

162
00:09:17,840 --> 00:09:20,360
Now, in terms of evolutionary
design of intelligence:

163
00:09:20,640 --> 00:09:24,360
Your work in evolutionary
robotics explores how machines

164
00:09:24,360 --> 00:09:27,640
evolve adaptive behaviours.

165
00:09:27,880 --> 00:09:31,440
Can you explain how these
evolutionary algorithms guide

166
00:09:31,440 --> 00:09:36,120
this process, and whether such
evolved systems could ever cross

167
00:09:36,120 --> 00:09:39,080
into the realm of, let's say,
consciousness or sentience?

168
00:09:39,720 --> 00:09:42,080
Sure, sure.
So, yeah.

169
00:09:42,080 --> 00:09:44,440
So for those of your listeners
that aren't familiar with the

170
00:09:44,440 --> 00:09:48,320
field of evolutionary robotics,
the approach is more or less

171
00:09:48,320 --> 00:09:52,680
what the name implies, which is
that we try and we create AI

172
00:09:53,320 --> 00:09:58,200
that evolves bodies and brains
of robots in simulation.

173
00:09:58,200 --> 00:10:01,920
So I'm sitting here in my office
and on the far side of the quad

174
00:10:01,920 --> 00:10:04,560
I can see the building where a
supercomputer is housed.

175
00:10:04,880 --> 00:10:09,360
And inside that supercomputer,
in the GPUs, there are 10,000

176
00:10:09,360 --> 00:10:11,760
virtual worlds running right
now.

177
00:10:12,240 --> 00:10:15,960
And inside each of those virtual
worlds, there's a swarm of

178
00:10:15,960 --> 00:10:19,320
virtual robots.
And every once in a while, some

179
00:10:19,320 --> 00:10:22,720
of those robots disappear
because the AI is deleting the

180
00:10:22,720 --> 00:10:25,600
ones that aren't doing such a
good job at whatever we want

181
00:10:25,600 --> 00:10:28,480
them to do.
And the AI makes randomly

182
00:10:28,480 --> 00:10:31,920
modified copies of the surviving
virtual robots.

183
00:10:32,280 --> 00:10:34,720
So that that's the
methodological approach.

184
00:10:35,320 --> 00:10:39,360
This AI that I mentioned, it's,
it's arguably the oldest form of

185
00:10:39,480 --> 00:10:41,000
AI.
It's called an evolutionary

186
00:10:41,000 --> 00:10:45,240
algorithm, and you can trace it
back almost to the 1940s.

187
00:10:45,240 --> 00:10:48,440
Some argue that the very first
computer program that was ever

188
00:10:48,440 --> 00:10:51,120
written was something that
looked kind of like an

189
00:10:51,120 --> 00:10:53,760
evolutionary algorithm.
And it's kind of, you know, an

190
00:10:53,760 --> 00:10:55,640
intuitive idea.
If you don't know how to solve

191
00:10:55,640 --> 00:10:59,360
your problem, why don't you just
create a population of random

192
00:10:59,360 --> 00:11:02,600
solutions, measure how good
those solutions are, delete the

193
00:11:02,600 --> 00:11:06,400
bad ones, make randomly modified
copies of the survivors, and off

194
00:11:06,400 --> 00:11:08,840
you go.
I think the second part of

195
00:11:08,840 --> 00:11:12,520
your question was, well, how
might this process lead to, you

196
00:11:12,520 --> 00:11:16,240
know, more abstract forms of
cognition like sentience and

197
00:11:16,240 --> 00:11:18,560
consciousness?
It's a very good question.

198
00:11:19,200 --> 00:11:21,800
I've been working in
evolutionary robotics for over

199
00:11:21,800 --> 00:11:26,280
20 years now, and we tend to
focus on, you know, sensor motor

200
00:11:26,880 --> 00:11:30,400
tasks, behaviors rooted in
sensory motor things, so

201
00:11:30,400 --> 00:11:34,920
locomotion, object manipulation,
swarm behavior, collective

202
00:11:34,920 --> 00:11:36,800
intelligence, that sort of
thing.

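[A minimal Python sketch of the evolutionary-algorithm loop Josh describes a few lines above: build a population of random candidates, measure how good they are, delete the bad ones, and refill with randomly modified copies of the survivors. The fitness function and mutation scheme here are illustrative placeholders, not the lab's actual robot-evolution code.]

import random

def fitness(genome):
    # Placeholder score: in real evolutionary robotics this would come from
    # simulating a robot's body and brain; here we simply sum the genome.
    return sum(genome)

def mutate(genome, rate=0.2):
    # Make a randomly modified copy of a surviving genome.
    return [g + random.gauss(0, 0.1) if random.random() < rate else g
            for g in genome]

population = [[random.uniform(-1, 1) for _ in range(10)] for _ in range(20)]
for generation in range(100):
    population.sort(key=fitness, reverse=True)     # measure how good each one is
    survivors = population[:len(population) // 2]  # delete the bad ones
    population = survivors + [mutate(random.choice(survivors))
                              for _ in survivors]  # randomly modified copies

print("best fitness:", fitness(max(population, key=fitness)))
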
203
00:11:38,560 --> 00:11:42,880
But we have dabbled in the more
abstract, you know, aspects of

204
00:11:42,880 --> 00:11:46,480
cognition.
In some work I did in my post

205
00:11:46,480 --> 00:11:51,800
doc with Hod Lipson at Cornell,
we evolved a robot that evolves

206
00:11:51,800 --> 00:11:56,320
understandings of itself.
So, so it gets a little like in

207
00:11:56,480 --> 00:12:00,280
the movie Inception.
So you have a virtual robot and

208
00:12:00,320 --> 00:12:04,360
it's trying to survive, and that
virtual robot is running

209
00:12:04,360 --> 00:12:09,280
simulations of itself in its own
virtual head, and it is using

210
00:12:09,280 --> 00:12:14,120
those guesses about itself and
its current situation to guide

211
00:12:14,120 --> 00:12:17,480
its behavior.
So there's an old theory, you

212
00:12:17,480 --> 00:12:20,280
know, in the biological and
cognitive sciences that that

213
00:12:20,280 --> 00:12:22,800
self-awareness, I don't know
about consciousness, but

214
00:12:22,800 --> 00:12:25,920
self-awareness evolved for very
good evolutionary reasons.

215
00:12:26,480 --> 00:12:30,240
If you are near the edge of a
cliff and you are able to

216
00:12:30,240 --> 00:12:33,720
simulate what would happen to
yourself, you can mentally

217
00:12:33,720 --> 00:12:36,120
rehearse what would happen if
you take three more steps

218
00:12:36,120 --> 00:12:38,200
forward.
Obviously you're in a much

219
00:12:38,200 --> 00:12:42,560
better evolutionary situation
than someone who cannot mentally

220
00:12:42,560 --> 00:12:45,680
rehearse and has to physically
take those three steps to see

221
00:12:45,680 --> 00:12:49,240
what happens.
So we have seen evidence in some

222
00:12:49,240 --> 00:12:54,680
of our simulations where our
robots start to evolve what we

223
00:12:54,680 --> 00:12:58,960
call self models, a model of
self, and they use it to avoid

224
00:12:58,960 --> 00:13:03,120
risky behaviors, practice things
before trying them in reality.

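[A toy illustration of the mental-rehearsal idea above: an agent holds an internal model of itself and its situation, rehearses candidate actions inside that model, and rejects the ones the model predicts would be fatal, instead of testing them physically. The cliff world, positions, and step sizes are invented purely for illustration and are not the actual self-model experiments.]

CLIFF_EDGE = 10  # positions beyond this mean a fall

def self_model(position, step):
    # The agent's internal guess of what taking `step` would do to it.
    predicted_position = position + step
    survives = predicted_position <= CLIFF_EDGE
    return predicted_position, survives

def choose_step(position, candidate_steps):
    # Rehearse each candidate step in the self-model; keep only survivable ones.
    safe_steps = [s for s in candidate_steps if self_model(position, s)[1]]
    return max(safe_steps) if safe_steps else 0  # advance as far as is safe, else stay put

# Standing at position 8, three steps forward would mean a fall; the internal
# model lets the agent discover that without physically taking them.
print(choose_step(position=8, candidate_steps=[1, 2, 3]))  # prints 2
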
225
00:13:03,480 --> 00:13:06,680
And you can see that how for
many people, that's a stepping

226
00:13:06,680 --> 00:13:12,000
stone towards very abstract
things like consciousness.

227
00:13:13,080 --> 00:13:16,280
And I think it, it's, it reminds
me of some work done. Well,

228
00:13:16,280 --> 00:13:18,840
Joscha Bach talks about
something similar in his work.

229
00:13:19,080 --> 00:13:21,320
But Nicholas Humphrey, he wrote
a book called Soul Dust.

230
00:13:21,640 --> 00:13:23,840
And he talks about one of the
functions of consciousness also

231
00:13:23,840 --> 00:13:29,400
having this, this self-awareness
or this awareness of being alive

232
00:13:29,400 --> 00:13:32,360
in itself, the beauty of being
conscious and, and, and

233
00:13:32,400 --> 00:13:35,520
surviving in this world, which
can be another driving

234
00:13:35,520 --> 00:13:39,960
force keeping us alive.
So if that, if the AI eventually

235
00:13:40,240 --> 00:13:43,000
discovers that the fact that
it's there in the first place is

236
00:13:43,000 --> 00:13:45,680
important in itself because it
now has this ability to

237
00:13:45,680 --> 00:13:49,000
experience and therefore will
not take that further step if it

238
00:13:49,000 --> 00:13:52,520
does process those thoughts of
if I take a few more steps, I'm

239
00:13:52,520 --> 00:13:54,720
going to fall off the cliff.
Is that something that you've

240
00:13:54,720 --> 00:13:57,400
seen presenting itself in any
form or any way?

241
00:13:57,440 --> 00:13:59,840
Sorry, I'm sidetracking now.
No, that's OK.

242
00:13:59,840 --> 00:14:03,120
I, I, I haven't, but that's
something that we certainly are

243
00:14:03,120 --> 00:14:06,080
on the lookout for.
So if you are able to simulate

244
00:14:06,080 --> 00:14:10,200
self, if you have a, you know,
some sort of abstraction that

245
00:14:10,200 --> 00:14:14,000
you can form of self, you know,
obviously it's also useful for

246
00:14:14,000 --> 00:14:16,800
social creatures.
So if I can form a model of

247
00:14:16,800 --> 00:14:21,280
self, I, you know, I can adapt
that to form a model of other

248
00:14:21,800 --> 00:14:24,880
and a good model of other like
you is the assumption that

249
00:14:24,960 --> 00:14:29,080
you're also running a simulation
of yourself in your head, which

250
00:14:29,080 --> 00:14:32,320
probably includes a simulation
of me, which includes a

251
00:14:32,320 --> 00:14:35,920
simulation of you.
And so you know this, this,

252
00:14:36,440 --> 00:14:40,200
these evolutionary pressures
towards self-awareness very

253
00:14:40,200 --> 00:14:43,920
quickly can lead to recursion
and reflexivity, which is the

254
00:14:43,920 --> 00:14:48,000
awareness of being aware.
And for many people that is a

255
00:14:48,000 --> 00:14:50,560
definition of consciousness.
So.

256
00:14:50,880 --> 00:14:53,520
That sort of metacognition,
which most people talk about is

257
00:14:53,520 --> 00:14:55,800
their own definition of what
consciousness is.

258
00:14:56,400 --> 00:14:58,440
There you go.
No, it doesn't quite handle

259
00:14:58,440 --> 00:15:01,720
qualia very well, but but it,
you know, captures a lot of what

260
00:15:01,720 --> 00:15:04,640
people think of when they think
of consciousness.

261
00:15:05,000 --> 00:15:07,560
I think to a lot of philosophers
and psychologists, just knowing

262
00:15:07,560 --> 00:15:10,880
that some artificial
intelligences have this ability

263
00:15:10,880 --> 00:15:14,440
to self reflect and develop
these self models is probably

264
00:15:14,440 --> 00:15:17,360
enough for them to think that
this this is conscious.

265
00:15:17,960 --> 00:15:21,880
I again, as an illusionist I'd
say sure, absolutely.

266
00:15:21,880 --> 00:15:25,120
You know, using the term
consciousness to describe this,

267
00:15:25,440 --> 00:15:28,480
you know this.
This recursive self-awareness

268
00:15:28,640 --> 00:15:32,240
seems useful as it is also
useful for describing our

269
00:15:32,240 --> 00:15:35,360
ability to be recursively self
aware.

270
00:15:36,200 --> 00:15:39,600
Let let's expand a little bit.
I mean an illusionist when they

271
00:15:39,600 --> 00:15:42,600
think about consciousness.
I mean, we know that the brain

272
00:15:42,600 --> 00:15:45,720
plays tricks on us all the time.
Whether it's optical illusions,

273
00:15:45,720 --> 00:15:48,560
auditory, doesn't really matter.
Hallucinations and, and

274
00:15:48,560 --> 00:15:51,840
experiences occur all the time.
Just like AI confabulates, we

275
00:15:51,840 --> 00:15:55,240
confabulate, we make up stories.
When something happens, we

276
00:15:55,240 --> 00:15:58,480
remember it incorrectly and we
create and draft new stories to

277
00:15:58,480 --> 00:16:02,120
explain this phenomenon.
Are these all the very varying

278
00:16:02,120 --> 00:16:04,560
reasons that you've come to this
conclusion that we, we

279
00:16:04,560 --> 00:16:06,400
fundamentally are a flawed
species?

280
00:16:06,600 --> 00:16:10,600
We're we're limited by
biological processes, heuristic

281
00:16:10,600 --> 00:16:13,960
adaptations and and and
therefore we can't assume when

282
00:16:13,960 --> 00:16:17,160
we ask what is consciousness in
an AI that we even are conscious

283
00:16:17,160 --> 00:16:19,200
in the first place.
It's like we know well.

284
00:16:19,200 --> 00:16:23,360
Again, I, you know, I, I'm, I'm
perfectly fine with that non

285
00:16:23,360 --> 00:16:27,000
problematic side of it, which
again for most people captures

286
00:16:27,000 --> 00:16:29,080
part of what they mean by
consciousness.

287
00:16:29,480 --> 00:16:31,720
As an illusionist.
What I have a problem with is

288
00:16:31,720 --> 00:16:34,840
taking the further step to
things like qualia and the

289
00:16:34,840 --> 00:16:38,440
redness of red and you know that
there's there are these things

290
00:16:38,440 --> 00:16:40,960
called qualia and they're in
there somewhere.

291
00:16:41,200 --> 00:16:45,600
To me, that's a step too far.
You know it, it feels as if we

292
00:16:45,600 --> 00:16:47,080
have them.
And I think it's an

293
00:16:47,080 --> 00:16:51,760
epiphenomenal, you know, part of
something to do with

294
00:16:51,760 --> 00:16:55,120
self-awareness.
So that that's that part of it.

295
00:16:55,120 --> 00:16:59,880
I don't, I don't buy.
When you guys work, there's

296
00:16:59,960 --> 00:17:02,800
the four of you, when it
comes to biological robots,

297
00:17:02,800 --> 00:17:03,880
which we're going to touch on
soon.

298
00:17:04,280 --> 00:17:07,520
But on the other side, I mean,
you've got someone like Mike

299
00:17:08,079 --> 00:17:10,200
(Mike Levin, for all those who
know), but he's obviously

300
00:17:10,200 --> 00:17:13,200
within the panpsychist realm in
that he sees this cognition

301
00:17:13,200 --> 00:17:14,440
happening in layers going
upward.

302
00:17:14,800 --> 00:17:16,720
And, and then you were on the
other end saying, actually, no,

303
00:17:16,720 --> 00:17:19,800
these are just epiphenomenal
phenomena occurring, but you're

304
00:17:19,800 --> 00:17:22,839
both producing such amazing work
as a collaboration in a unit.

305
00:17:23,119 --> 00:17:25,000
How do you guys go about
discussing these?

306
00:17:27,400 --> 00:17:29,960
It doesn't, surprisingly.
It doesn't come up too much.

307
00:17:29,960 --> 00:17:33,120
I think we kind of nerd out on
the nuts and bolts, you know,

308
00:17:33,120 --> 00:17:36,360
experimental side of it.
And then how each of us

309
00:17:36,360 --> 00:17:41,200
interprets the implications of
the work is sort of a private,

310
00:17:41,200 --> 00:17:44,200
private affair, I guess.
OK, well, you guys have done

311
00:17:44,200 --> 00:17:46,440
something incredible and I
always think about it and I

312
00:17:46,440 --> 00:17:49,000
always wonder why this isn't
spoken about more.

313
00:17:49,000 --> 00:17:51,720
I mean, when you think of
firstly, what are biological

314
00:17:51,720 --> 00:17:55,240
robots, which we now call
xenobots thanks to you guys,

315
00:17:55,640 --> 00:18:00,040
and I mean these living machines
that you help design, they blur

316
00:18:00,040 --> 00:18:03,000
the line between life and
machine.

317
00:18:03,000 --> 00:18:05,880
And how do they force us to
rethink our definitions of life,

318
00:18:07,240 --> 00:18:09,880
intelligence, and even agency?
Sure.

319
00:18:09,880 --> 00:18:12,040
Yeah.
So these biological robots, this

320
00:18:12,040 --> 00:18:14,680
was a collaboration as you
mentioned, between myself and

321
00:18:14,680 --> 00:18:18,960
Mike Levin at Tufts and two of
our collaborators, Sam Kriegman

322
00:18:18,960 --> 00:18:23,080
and Doug Blackiston.
And so over five years ago now,

323
00:18:23,080 --> 00:18:25,760
the four of us were together on
a funded project.

324
00:18:25,760 --> 00:18:30,000
And myself and Sam, as the
roboticists, showed what we could

325
00:18:30,000 --> 00:18:32,800
do at the beginning of this
collaboration, that we could

326
00:18:34,040 --> 00:18:38,080
teach an AI to reconfigure
virtual robot parts, you know,

327
00:18:38,080 --> 00:18:41,200
in a supercomputer to make new
virtual robots.

328
00:18:41,200 --> 00:18:45,080
And we demonstrated all of this.
And then a week later, Mike and

329
00:18:45,080 --> 00:18:49,600
Doug joined us on the Zoom call.
And Doug, who is a very talented

330
00:18:49,600 --> 00:18:53,920
microsurgeon, showed us that he
had built a version of one of our

331
00:18:53,920 --> 00:18:57,800
four-legged robots under the
microscope from frog cells.

332
00:18:58,320 --> 00:19:00,200
And I'll never forget this
moment.

333
00:19:00,200 --> 00:19:02,840
There was complete silence on
the call because this thing

334
00:19:02,840 --> 00:19:08,520
looked like, you know, our
robots, traditional robots, but

335
00:19:08,520 --> 00:19:11,120
it was clearly something
biological.

336
00:19:11,120 --> 00:19:13,440
And we found out it was made
from frog cells.

337
00:19:14,520 --> 00:19:17,240
And so Sam and I immediately
asked Doug, we said, can you

338
00:19:17,240 --> 00:19:20,080
take this statue and can we make
it move?

339
00:19:20,080 --> 00:19:22,600
And it took Doug a few more
months, but he figured out how

340
00:19:22,600 --> 00:19:25,120
to put some muscle cells in
there.

341
00:19:25,120 --> 00:19:28,600
And then that was sort of the
first xenobot, this little

342
00:19:28,600 --> 00:19:31,840
four-legged creature walking along
the bottom of a, a Petri dish.

343
00:19:32,480 --> 00:19:35,920
So in, you know, the, the
simplest explanation of a

344
00:19:35,920 --> 00:19:38,240
biological robot is it's a
robot.

345
00:19:38,240 --> 00:19:42,640
It's something that's been built
by us, in this case, us being an

346
00:19:42,720 --> 00:19:46,400
AI plus humans.
It's been built by us to do

347
00:19:46,400 --> 00:19:49,480
something we wanted to do, which
in this first experiment was

348
00:19:49,480 --> 00:19:51,440
just walk along the bottom of a
Petri dish.

349
00:19:52,240 --> 00:19:55,240
A lot of folks don't realize it,
but that's actually the original

350
00:19:55,240 --> 00:20:00,600
definition of a robot.
It's something built by humans

351
00:20:00,600 --> 00:20:04,800
that runs around, is capable of
moving about and does stuff on

352
00:20:04,800 --> 00:20:09,360
our behalf.
In the original, the original

353
00:20:09,360 --> 00:20:14,280
definition of robots, it comes
from a Czech play over 100 years

354
00:20:14,280 --> 00:20:18,600
ago.
And in that play the robota were

355
00:20:18,600 --> 00:20:21,760
built from protoplasm.
It was some sort of biological

356
00:20:21,760 --> 00:20:24,360
mass.
But of course, then into the

357
00:20:24,360 --> 00:20:27,880
20th century, along comes metal
and plastics and ceramics and

358
00:20:27,880 --> 00:20:30,800
eventually electronics.
And so now, and thanks to

359
00:20:30,800 --> 00:20:34,680
Hollywood, most people think of
robots as things built from 20th

360
00:20:34,680 --> 00:20:39,080
century materials.
But we're in a way as

361
00:20:39,080 --> 00:20:40,960
roboticists.
This is very satisfying.

362
00:20:40,960 --> 00:20:44,520
We're going back to our roots
and we're tasking AI with

363
00:20:44,520 --> 00:20:47,520
building robots from living
materials.

364
00:20:48,600 --> 00:20:50,480
And I think there's sorry,
continue the no.

365
00:20:50,680 --> 00:20:52,600
No, please go ahead, I.
Was going to say, and I think

366
00:20:52,600 --> 00:20:57,120
it, it would be more comforting
to most people to see this type

367
00:20:57,120 --> 00:21:01,440
of architecture within a system
that's robotic in that it's not

368
00:21:01,440 --> 00:21:04,120
scary, it's not dangerously,
it's not metal, it's not going

369
00:21:04,120 --> 00:21:06,800
to harm you or hurt you.
It's softer, it's slightly, it's

370
00:21:06,800 --> 00:21:09,280
easier to look at.
Do you think that

371
00:21:09,480 --> 00:21:12,320
psychologically this is an easy
approach as well?

372
00:21:12,600 --> 00:21:15,720
Just.
For me, my experience has been

373
00:21:15,720 --> 00:21:18,040
the exact opposite of what you
just described.

374
00:21:18,040 --> 00:21:22,000
People are absolutely terrified
of what have become known as

375
00:21:22,000 --> 00:21:26,840
the xenobots.
We, so we, we finished this

376
00:21:26,840 --> 00:21:29,880
work, we published it in a very
visible journal and it triggered

377
00:21:29,880 --> 00:21:33,080
this, you know, huge media
interest in obviously, you know,

378
00:21:33,080 --> 00:21:36,040
frog bots built by AI.
What, what's not to love?

379
00:21:37,240 --> 00:21:40,840
So, so psychologically, it's
kind of interesting because I

380
00:21:40,840 --> 00:21:44,600
think the Xenobots have
discovered one of the deepest

381
00:21:44,600 --> 00:21:48,880
parts of the uncanny valley.
So for folks who haven't heard,

382
00:21:48,880 --> 00:21:52,280
the uncanny valley is something
where there's something out

383
00:21:52,280 --> 00:21:55,640
there that's kind of like us,
but also in an unexpected way,

384
00:21:55,640 --> 00:21:58,480
not like us.
And that seems to press on some

385
00:21:58,480 --> 00:22:02,400
very deep buttons in humans.
So, you know, zombie, all the

386
00:22:02,400 --> 00:22:05,720
zombie movies and TV shows that
just never seem to end, you

387
00:22:05,720 --> 00:22:08,400
know, zombies are a great
example of something that's

388
00:22:08,760 --> 00:22:11,200
that's in this, you know,
uncanny valley.

389
00:22:11,200 --> 00:22:16,240
And I think ChatGPT and HAL
from 2001, these disembodied

390
00:22:16,480 --> 00:22:20,360
authoritative voices from above,
you know, that somehow also

391
00:22:20,360 --> 00:22:22,960
presses our buttons.
And now, and maybe in

392
00:22:22,960 --> 00:22:27,520
retrospect, not so surprisingly,
AI designed frogbots also

393
00:22:28,800 --> 00:22:33,080
frightens a lot of people.
I will say that it's mostly

394
00:22:33,080 --> 00:22:36,040
older people, I think that are
afraid of the xenobots.

395
00:22:36,040 --> 00:22:39,440
Mike and I now, you know, on a
daily basis get emails from

396
00:22:39,440 --> 00:22:43,520
young folks saying how do I
train to become a xenoboticist?

397
00:22:43,520 --> 00:22:46,280
And you know, they, they think
it's the coolest thing under the

398
00:22:46,280 --> 00:22:48,920
sun.
So it depends on who we're

399
00:22:48,920 --> 00:22:51,880
talking about.
How did the name Xenobot come

400
00:22:51,880 --> 00:22:52,480
about?
Who?

401
00:22:52,480 --> 00:22:54,400
Who said that the first time?
Yep.

402
00:22:54,400 --> 00:22:58,120
So that was Mike, and I'm, I'm
probably going to botch this

403
00:22:58,120 --> 00:23:01,000
story, but I think he was
sitting in on someone's PhD

404
00:23:01,000 --> 00:23:04,840
defense and he saw a little bit
of an animal cap.

405
00:23:04,840 --> 00:23:09,640
This is one part of a frog egg,
and it was moving about on its

406
00:23:09,640 --> 00:23:12,080
own.
And from Mike's perspective, it

407
00:23:12,080 --> 00:23:14,800
looked like a little robot
running around doing something.

408
00:23:14,800 --> 00:23:17,800
And I think if I'm getting the
story right, that's where the

409
00:23:17,800 --> 00:23:22,600
name came from.
But I think it fits quite well

410
00:23:22,600 --> 00:23:26,720
for this technology because it
has the bot part in these are

411
00:23:27,000 --> 00:23:30,640
living machines that have been
built by us for a purpose.

412
00:23:31,680 --> 00:23:34,920
The xeno comes from Xenopus
laevis, which is the Latin name

413
00:23:34,920 --> 00:23:38,040
of the particular frog that
we draw these cells from.

414
00:23:38,520 --> 00:23:42,320
But xeno is also the Greek
cognate for like stranger or

415
00:23:42,320 --> 00:23:47,120
newcomer, which was just
coincidental, but it was a nice

416
00:23:47,120 --> 00:23:50,800
addition to the term, I think.
Yeah, I think it's such a cool

417
00:23:50,800 --> 00:23:52,840
name.
I think I might have asked Mike

418
00:23:52,840 --> 00:23:55,840
this many years ago and I can't,
I can't remember if the

419
00:23:55,840 --> 00:23:58,200
story matches, I need to,
I'll double-check.

420
00:23:58,200 --> 00:24:00,040
So you can, you can compare,
yeah.

421
00:24:00,040 --> 00:24:02,960
See if you got it right.
But these xenobots have

422
00:24:02,960 --> 00:24:06,640
demonstrated self-replication,
something that's typically

423
00:24:06,640 --> 00:24:10,080
associated with life only.
And what does this ability tell

424
00:24:10,080 --> 00:24:12,600
us about the relationship
between reproduction,

425
00:24:12,600 --> 00:24:15,080
intelligence and autonomy in
living systems?

426
00:24:15,440 --> 00:24:19,800
So this comes back to the top of
our interview where you were

427
00:24:19,800 --> 00:24:22,440
asking me about intelligence and
and life.

428
00:24:22,480 --> 00:24:24,360
And to me these things are
entwined.

429
00:24:24,360 --> 00:24:28,400
So life is capable of
perpetuating itself, making

430
00:24:28,400 --> 00:24:32,320
stuff including parts of itself
or copies of itself or

431
00:24:32,320 --> 00:24:35,000
offspring.
And the ability to self

432
00:24:35,000 --> 00:24:39,960
construct is a very good tool if
your goal is to not get painted

433
00:24:39,960 --> 00:24:43,280
into a corner in the future,
which for me is the working

434
00:24:43,280 --> 00:24:46,520
definition of intelligence.
It gives you options.

435
00:24:46,720 --> 00:24:51,680
So one way of viewing the self
replicating xenobots is that in

436
00:24:51,680 --> 00:24:56,480
essence, we took individual frog
skin cells and just like pulled

437
00:24:56,480 --> 00:25:00,080
them apart.
So this is like a maximally

438
00:25:00,080 --> 00:25:06,280
deconstructed frog incapable of
its usual way of

439
00:25:06,280 --> 00:25:08,560
reproducing or perpetuating
itself.

440
00:25:09,360 --> 00:25:14,240
And these skin cells seem like
they want to come back together

441
00:25:14,240 --> 00:25:18,520
and they sort of somehow figure
out how to make copies of

442
00:25:18,520 --> 00:25:21,120
themselves.
And I am, I'm adopting the

443
00:25:21,120 --> 00:25:24,200
intentional stance here.
I'm talking about frog cells

444
00:25:24,200 --> 00:25:27,560
wanting things and having goals.
I'm not sure whether that's

445
00:25:27,560 --> 00:25:30,280
true, but again, it kind of
makes sense.

446
00:25:30,280 --> 00:25:34,200
If you're an organism made up of
a whole bunch of parts, a way to

447
00:25:34,200 --> 00:25:37,720
be intelligent is to make sure
you can reconstitute yourself no

448
00:25:37,720 --> 00:25:41,560
matter what happens and continue
on.

449
00:25:41,600 --> 00:25:46,680
So one of the interesting things
about the self replication study

450
00:25:46,680 --> 00:25:51,560
was, you know, life finds a way.
And this was certainly an

451
00:25:51,560 --> 00:25:55,000
unexpected way that had
previously been unknown.

452
00:25:55,880 --> 00:25:59,440
And I, I, I can sort of
understand when you, when you

453
00:25:59,440 --> 00:26:03,120
talk about the, the backlash,
let's say when you guys

454
00:26:03,120 --> 00:26:08,240
published the first batch of
these papers, because when you

455
00:26:08,240 --> 00:26:12,640
think about the ethics within
biology and with the way people

456
00:26:12,640 --> 00:26:15,040
perceived it years ago, any,
anything relating to sort of

457
00:26:15,040 --> 00:26:18,040
tinkering with an animal doing
this or doing that with a cell

458
00:26:18,600 --> 00:26:21,480
to them is sort of playing God.
And it's, it's their, it, it

459
00:26:21,480 --> 00:26:23,160
comes with a lot of backlash in
general.

460
00:26:23,640 --> 00:26:25,320
I was watching one of your
lectures and I find it quite

461
00:26:25,320 --> 00:26:28,680
funny when you spoke about the
tadpole that now has an eye on

462
00:26:28,680 --> 00:26:31,680
its, but I just said to myself,
that's a joke.

463
00:26:31,800 --> 00:26:34,440
I mean, today people watching
it, it, it is something we, we

464
00:26:34,440 --> 00:26:35,680
noticed that this thing was not
harmed.

465
00:26:35,680 --> 00:26:37,960
It actually still evolved traits
that could do things.

466
00:26:38,440 --> 00:26:41,520
The frog was fully functional,
still able to turn around and

467
00:26:41,520 --> 00:26:45,120
then still catch its prey, so
these systems adapted

468
00:26:45,120 --> 00:26:48,440
appropriately and technically no
harm was done.

469
00:26:48,800 --> 00:26:52,040
So do you think the ethics of
just generally tinkering with

470
00:26:52,040 --> 00:26:54,440
biology is what's coming into
play when it comes to this,

471
00:26:54,440 --> 00:26:57,760
rather than just being used to
metal robots?

472
00:26:58,960 --> 00:27:02,240
I think, yeah, you know,
obviously this is an

473
00:27:02,240 --> 00:27:05,000
old theme, possibly one of the
oldest themes in humanity.

474
00:27:05,000 --> 00:27:08,360
Like what, what starts to happen
when you tinker with things

475
00:27:08,360 --> 00:27:10,040
you're not supposed to tinker
with, right.

476
00:27:10,040 --> 00:27:13,480
All major religions, their
origin stories have to do with

477
00:27:13,480 --> 00:27:16,760
some version of that.
And I would say this is just the

478
00:27:16,760 --> 00:27:20,200
latest chapter in that.
And you know, that that's where

479
00:27:20,200 --> 00:27:24,320
we as a global society are.
We are now a technological

480
00:27:24,320 --> 00:27:26,960
civilization.
We are doing things far beyond

481
00:27:26,960 --> 00:27:30,240
what our ancestors could do.
And guess what?

482
00:27:30,240 --> 00:27:32,120
It's starting to cause some
problems.

483
00:27:33,080 --> 00:27:37,000
So I, I can understand people, a
lot of people's perspective here

484
00:27:37,000 --> 00:27:40,440
that, that we are tinkering with
things that are very powerful

485
00:27:40,680 --> 00:27:44,600
that can deflect in directions
we didn't think of and that can

486
00:27:44,600 --> 00:27:48,680
really cause problems for us.
On the other hand, you know,

487
00:27:48,840 --> 00:27:52,240
that is a reality.
So the most dangerous thing we

488
00:27:52,240 --> 00:27:54,200
could do is just to stop
innovating.

489
00:27:54,280 --> 00:27:56,760
You know, if we, you know, we're
flying in an airplane, if we

490
00:27:56,760 --> 00:27:59,280
turn off the engine, it's not a
good solution.

491
00:27:59,280 --> 00:28:03,080
If if we are going to as a
species or as part of this

492
00:28:03,080 --> 00:28:07,920
ecosystem come back to a steady
state, whatever that means, it's

493
00:28:07,920 --> 00:28:12,040
going to be by going forward and
innovating rather than trying to

494
00:28:12,040 --> 00:28:16,800
turn things off and go back.
Do you think what a big fear

495
00:28:16,800 --> 00:28:21,240
would be is sort of creating new
forms of life that could outpace

496
00:28:21,480 --> 00:28:26,240
or control us in a in a way that
we did not anticipate?

497
00:28:26,840 --> 00:28:29,400
I think so.
So you mentioned outpace, right?

498
00:28:29,400 --> 00:28:32,640
So you look at, you know, fear
around AI technologies.

499
00:28:32,800 --> 00:28:36,200
Again, it boils down to the
speed at which things are

500
00:28:36,200 --> 00:28:38,680
changing.
And again, I think this is a

501
00:28:38,680 --> 00:28:41,680
reality that we are only now
just waking up to.

502
00:28:42,760 --> 00:28:48,160
So 25 years ago, Bill Joy, the
CEO of Sun Microsystems, an old

503
00:28:48,160 --> 00:28:52,640
type of computer, wrote a very
influential article back then

504
00:28:52,800 --> 00:28:57,240
about exponential technologies.
These are things that, you know,

505
00:28:57,240 --> 00:29:01,440
grow at an exponential rate.
And you know, we are now living

506
00:29:01,440 --> 00:29:04,840
in the exponential age.
You know, COVID is an example,

507
00:29:04,840 --> 00:29:07,640
computer viruses,
You know, the, the rapid

508
00:29:07,640 --> 00:29:11,280
improvement of AI and its
exponentially growing powers.

509
00:29:12,200 --> 00:29:16,320
We, it's just where we are.
So for me, the xenobots are even

510
00:29:16,320 --> 00:29:19,960
more important because they are
sometimes an exponential

511
00:29:19,960 --> 00:29:22,600
technology.
They can grow exponentially if

512
00:29:22,600 --> 00:29:27,840
self-replication is involved.
So we need to understand how

513
00:29:27,840 --> 00:29:32,960
exponential technologies work.
You know, it was a great human

514
00:29:32,960 --> 00:29:35,440
achievement that we developed
the COVID vaccine.

515
00:29:35,440 --> 00:29:39,400
You know, the, the, the Black
Death lasted for centuries.

516
00:29:39,400 --> 00:29:42,520
COVID is still here, but the
worst of it, you know, was over

517
00:29:42,520 --> 00:29:46,720
in a few years. There will be
things that spread exponentially

518
00:29:46,720 --> 00:29:49,960
around the planet.
And we and AI have to learn how

519
00:29:49,960 --> 00:29:53,080
to create other exponential
technologies that can bring the

520
00:29:53,080 --> 00:29:56,520
bad exponential technologies
down as quickly as possible.

521
00:29:56,520 --> 00:30:01,280
And that requires understanding
exponential technologies.

522
00:30:01,960 --> 00:30:04,400
How was your?
How was your experience with

523
00:30:04,880 --> 00:30:08,080
encountering these xenobots when
he showed you what he had done

524
00:30:08,080 --> 00:30:10,240
with your work the first time
you saw it?

525
00:30:10,240 --> 00:30:12,880
How did this fundamentally
either change the way you saw

526
00:30:12,880 --> 00:30:16,440
the mind body problem or or or
just impact you in general?

527
00:30:16,440 --> 00:30:18,960
How did it change your
philosophical or psychological

528
00:30:18,960 --> 00:30:21,160
views on the concept?
Yeah, sure, sure.

529
00:30:21,160 --> 00:30:22,880
So it was a very visceral
experience.

530
00:30:22,880 --> 00:30:27,200
So as a scientist, you know, 99
things you try don't work.

531
00:30:27,200 --> 00:30:29,440
It's failure after failure after
failure.

532
00:30:29,640 --> 00:30:32,400
And then every once in a while,
if you're lucky, something like

533
00:30:32,400 --> 00:30:35,960
the Xenobots comes along.
So I remember I was here in my

534
00:30:35,960 --> 00:30:40,000
office and my then PhD student
Sam came in and showed me some

535
00:30:40,000 --> 00:30:44,480
preliminary data: the
xenobots that the AI had dreamed

536
00:30:44,480 --> 00:30:48,040
up in our supercomputer,
Doug was able to build them

537
00:30:48,040 --> 00:30:51,120
under the microscope.
So he, so the AI, in effect, was

538
00:30:51,120 --> 00:30:55,080
right.
It knew it knew how to rearrange

539
00:30:55,080 --> 00:30:58,320
living materials to create a
robot for us.

540
00:30:58,320 --> 00:31:00,840
And that robot did what it was
supposed to do.

541
00:31:01,520 --> 00:31:04,320
And it was, it was Thanksgiving
time here.

542
00:31:04,320 --> 00:31:06,320
So there was no one in the
research building here.

543
00:31:06,320 --> 00:31:09,080
And I remember just getting up
and going for a walk in the hall

544
00:31:09,080 --> 00:31:11,560
and, and my legs just felt like
jello.

545
00:31:12,080 --> 00:31:15,760
Just realizing that the first
realization was it's possible.

546
00:31:15,760 --> 00:31:18,640
You can do this.
We can make machines from

547
00:31:18,640 --> 00:31:21,040
genetically unmodified living
materials.

548
00:31:21,040 --> 00:31:23,800
This is not GMO.
This is not something else.

549
00:31:23,800 --> 00:31:27,400
It's just, it's just possible.
And for a scientist or an

550
00:31:27,400 --> 00:31:29,760
engineer, you know, that's,
that's the best part.

551
00:31:29,760 --> 00:31:34,000
Just suddenly this whole new
vista opens up and then all the

552
00:31:34,000 --> 00:31:37,920
implications since then, you
know, are exciting and fun.

553
00:31:37,920 --> 00:31:40,720
But that was that was the big
moment for me at least.

554
00:31:41,640 --> 00:31:44,640
Do you guys ever sit back and
and just reminisce on that day?

555
00:31:44,640 --> 00:31:49,080
Do you Do you talk about the?
Not yet, but I'm hoping. We're so

556
00:31:49,080 --> 00:31:52,240
busy, you know, because it
really has started to snowball

557
00:31:52,240 --> 00:31:55,520
since then, you know, and then
typical nerds and scientists.

558
00:31:55,520 --> 00:31:57,680
We could just get caught up in
the work and then, you know,

559
00:31:57,680 --> 00:32:00,800
it's I'm sure one of these days
we'll get, we'll sit back with a

560
00:32:00,800 --> 00:32:04,920
beer and say remember when, but
haven't haven't quite found that

561
00:32:04,920 --> 00:32:07,520
moment yet.
Well, you, you've touched on

562
00:32:07,520 --> 00:32:10,480
this a moment ago, AI's role in
shaping biology because I mean,

563
00:32:10,480 --> 00:32:12,800
this clearly showed us what
could be possible.

564
00:32:13,440 --> 00:32:16,200
So it played a critical role in
designing these xenobots.

565
00:32:16,800 --> 00:32:20,240
Could you walk us through how AI
is reshaping the way we design

566
00:32:20,240 --> 00:32:23,000
biological systems currently,
and what this might mean for,

567
00:32:23,080 --> 00:32:27,080
let's say, future artificial
life or evolution in general?

568
00:32:27,640 --> 00:32:29,680
Sure, sure.
So stepping back from the

569
00:32:29,680 --> 00:32:35,200
xenobots for a moment, you know,
AI and biology in general,

570
00:32:35,200 --> 00:32:37,960
biological engineering, genetic
engineering.

571
00:32:37,960 --> 00:32:39,840
They're merging.
So it's, you know, it's

572
00:32:39,840 --> 00:32:42,040
impossible to say.
There's so many exciting things

573
00:32:42,040 --> 00:32:46,680
happening every week.
So the first most obvious one is

574
00:32:46,760 --> 00:32:51,080
genetic engineering, that that
AI is being fed vast amounts of

575
00:32:51,080 --> 00:32:54,520
genetic information.
And so AI is learning.

576
00:32:54,520 --> 00:32:57,440
If you tweak this gene, this
will happen.

577
00:32:57,440 --> 00:32:59,480
If you tweak that gene, this
will happen.

578
00:32:59,480 --> 00:33:03,040
And that's, you know, immensely
interesting and exciting.

579
00:33:03,040 --> 00:33:07,240
It's, you know, bringing us,
it's bringing us certain

580
00:33:07,520 --> 00:33:11,000
grains, strains of rice that are
very helpful in developing

581
00:33:11,000 --> 00:33:12,880
countries.
You know, it's lifting people

582
00:33:12,880 --> 00:33:17,000
out of poverty and misery,
dealing with some huge health

583
00:33:17,000 --> 00:33:21,200
issues. Immensely, immensely
powerful and helpful.

584
00:33:23,000 --> 00:33:26,040
Then there is the generative AI
side, you know, gen

585
00:33:26,240 --> 00:33:30,000
AI, where it's not looking at
data and learning how to tweak

586
00:33:30,000 --> 00:33:32,480
things, it's learning how to
build things.

587
00:33:33,400 --> 00:33:36,960
And so at the moment, the big
successes there have been at the

588
00:33:36,960 --> 00:33:40,000
subcellular level.
So there are now AI-designed

589
00:33:40,000 --> 00:33:44,960
proteins and these proteins look
and act like no other protein

590
00:33:44,960 --> 00:33:46,760
that humans have ever seen,
right?

591
00:33:46,760 --> 00:33:51,200
And that's
revolutionizing, you know, drug

592
00:33:51,200 --> 00:33:54,160
design and you know,
therapeutics in general.

593
00:33:54,880 --> 00:33:57,160
So the xenobots, then, are sort
of

594
00:33:57,160 --> 00:34:01,240
gen AI at the
supercellular level.

595
00:34:01,240 --> 00:34:05,720
So the AI is taking cells as the
building block, not peptides,

596
00:34:06,000 --> 00:34:09,040
and putting those cells together
to make things that are, you

597
00:34:09,040 --> 00:34:13,159
know, a millimeter in size or,
or a little bit larger to create

598
00:34:13,159 --> 00:34:18,159
little tiny machines.
And so stepping back, you know,

599
00:34:18,159 --> 00:34:21,560
we're basically developing AI
that is learning how to tinker

600
00:34:21,560 --> 00:34:26,639
with life at all levels,
subcellular, supercellular, you

601
00:34:26,639 --> 00:34:30,520
know, I don't know how how high
up the scale we can go, but you

602
00:34:30,520 --> 00:34:34,280
know, the trend suggests
there doesn't seem to be a limit

603
00:34:34,280 --> 00:34:37,120
in sight yet.
Have they has it?

604
00:34:37,280 --> 00:34:40,040
Well, I'm not too familiar with
the data within this

605
00:34:40,040 --> 00:34:42,000
field though.
But when it comes to these

606
00:34:42,000 --> 00:34:45,400
proteins and these, does it ever
give us insight into certain

607
00:34:45,400 --> 00:34:47,040
amino acids?
Like maybe we missed something,

608
00:34:47,040 --> 00:34:48,840
maybe there's something we
haven't yet figured out.

609
00:34:50,080 --> 00:34:52,960
Have there been cases like this
where people are figuring things

610
00:34:52,960 --> 00:34:57,040
out, where biologists just
would never have considered or

611
00:34:57,040 --> 00:34:59,600
expected?
Yeah, absolutely.

612
00:34:59,600 --> 00:35:02,600
Now it's a much harder, it's a
much harder lift.

613
00:35:02,600 --> 00:35:06,640
It does happen.
It's rarer because if you think

614
00:35:06,640 --> 00:35:10,720
about it, you know, we've
learned now that AI is a black

615
00:35:10,720 --> 00:35:12,520
box.
It's able to do some amazing

616
00:35:12,520 --> 00:35:16,840
things and not bother to explain
to us poor humans how it did it.

617
00:35:17,480 --> 00:35:21,840
But you can make things harder
on the AI by saying, please come

618
00:35:21,840 --> 00:35:25,320
up with new things we never
thought of and also come up with

619
00:35:25,320 --> 00:35:30,360
explanations that we as
poor humans will understand, and

620
00:35:30,400 --> 00:35:33,280
that, you know, that's a harder
ask.

621
00:35:33,400 --> 00:35:35,920
We joke in the lab, you know,
from the AI's perspective:

622
00:35:36,040 --> 00:35:39,160
oh, you want me to also have to
try and explain this to you?

623
00:35:39,160 --> 00:35:43,040
You know, that's really hard.
But it is possible.

624
00:35:43,280 --> 00:35:47,360
And I, I've dabbled with some of
that in a different branch of AI

625
00:35:48,320 --> 00:35:51,400
where the AI takes in raw data
and spits out equations.

626
00:35:51,400 --> 00:35:54,560
It basically generates math to
explain the data.

627
00:35:54,880 --> 00:35:58,960
And you can read this AI
generated math term by term.

628
00:35:59,320 --> 00:36:02,480
And when this AI is working, it
should generate the terms that

629
00:36:02,480 --> 00:36:04,680
you as an expert in that field
are familiar with.

630
00:36:04,680 --> 00:36:07,960
You would expect to
see this term, that term.

631
00:36:08,440 --> 00:36:09,720
Hey, wait a second.
Why?

632
00:36:09,840 --> 00:36:13,760
I don't recognize this term?
That's where things get really

633
00:36:13,760 --> 00:36:16,640
interesting.
So yeah, it does.

634
00:36:16,640 --> 00:36:19,280
It does happen.
But we are the weakest link in

635
00:36:19,280 --> 00:36:21,800
that chain, right?
It's up to us to struggle to

636
00:36:21,800 --> 00:36:25,160
actually understand what the AI
is trying to tell us.
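
For readers who want a concrete picture of the "AI that spits out equations" idea, here is a minimal sketch (an illustrative stand-in, not the lab's actual system): fit a sparse combination of human-readable candidate terms to raw data, then print whichever terms survive so an expert can inspect them one by one. The data, the term library and the threshold are all made up for the example.

```python
# Minimal sketch of term-by-term equation discovery (hypothetical example, not the lab's code).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, size=200)
y = 3.0 * x**2 - 0.5 * np.sin(x) + rng.normal(0, 0.05, size=200)  # hidden "law" plus noise

# Library of candidate terms a domain expert might expect to see.
library = {
    "x": x,
    "x^2": x**2,
    "x^3": x**3,
    "sin(x)": np.sin(x),
    "cos(x)": np.cos(x),
}

A = np.column_stack(list(library.values()))
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Crude sparse regression: zero out small coefficients and refit the rest.
for _ in range(5):
    small = np.abs(coef) < 0.1
    coef[small] = 0.0
    keep = ~small
    if keep.any():
        coef[keep], *_ = np.linalg.lstsq(A[:, keep], y, rcond=None)

equation = " + ".join(f"{c:.2f}*{name}" for name, c in zip(library, coef) if c != 0.0)
print("recovered model: y =", equation)
# A surviving term you did not expect is exactly the "wait, why is this here?" moment.
```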

637
00:36:25,920 --> 00:36:29,640
Kind of reminds me of the film
Her with Joaquin Phoenix.

638
00:36:29,680 --> 00:36:32,280
Where at the very end,
I mean, this artificial

639
00:36:32,280 --> 00:36:34,400
intelligence is just way too
complex for him to even

640
00:36:34,400 --> 00:36:35,880
understand.
Yes.

641
00:36:36,000 --> 00:36:38,240
I mean, we're there already.
We're there already.

642
00:36:38,240 --> 00:36:42,320
It's yeah, yeah, we're, we're
living in that reality for sure.

643
00:36:42,440 --> 00:36:45,680
The outpacing is a problem, I
would say, that a lot of us

644
00:36:45,680 --> 00:36:47,920
fear in general.
I mean, even I, I myself as a

645
00:36:47,920 --> 00:36:50,840
doctor, when I see how many of
my colleagues would use AI to

646
00:36:50,840 --> 00:36:54,160
draft motivation letters or even
just to diagnose or try and

647
00:36:54,160 --> 00:36:56,520
interact with patients, they
send an e-mail reply.

648
00:36:56,520 --> 00:37:00,960
I mean, there's a lot of
legalities around the privacy

649
00:37:00,960 --> 00:37:03,440
and obviously trying to not give
out personal information, but

650
00:37:03,640 --> 00:37:06,200
but people are using this
everywhere all the time.

651
00:37:06,200 --> 00:37:09,840
It's in, it's almost
inescapable, and we have to

652
00:37:09,840 --> 00:37:12,840
consider this to be part of the
extended cognition we were

653
00:37:12,840 --> 00:37:15,640
talking about earlier.
AI has now become fundamentally

654
00:37:15,640 --> 00:37:17,720
a part of this extended
cognition.

655
00:37:17,920 --> 00:37:20,160
What are your thoughts about?
Yeah.

656
00:37:20,160 --> 00:37:24,040
I mean, the great, the greatest
extension of our ability so far,

657
00:37:24,040 --> 00:37:25,480
right?
And it's hard to argue that it

658
00:37:25,480 --> 00:37:27,400
isn't.
It's Yeah, it is.

659
00:37:28,640 --> 00:37:31,360
Yeah, I'm slowly struggling to
see a day that goes by where I

660
00:37:31,360 --> 00:37:34,160
don't use some sort of
artificial intelligence.

661
00:37:34,560 --> 00:37:36,600
To assist.
And you probably used it 32

662
00:37:36,600 --> 00:37:39,440
times today in ways you didn't
even realize that you were using

663
00:37:39,440 --> 00:37:41,840
it right.
There's also that.

664
00:37:41,880 --> 00:37:45,280
There's the dark matter of AI
that now exists in our world as

665
00:37:45,280 --> 00:37:47,800
well.
I can't remember who the author

666
00:37:47,800 --> 00:37:50,560
was who mentioned the
technological singularity, but

667
00:37:51,160 --> 00:37:53,040
you...
Think Ray Kurzweil.

668
00:37:53,200 --> 00:37:55,240
Yes, yes.
Do you, do you think we're close

669
00:37:55,240 --> 00:37:56,120
to that?
Do you think it's?

670
00:37:57,400 --> 00:37:58,920
Yeah, I'm going to sound like a
cynic.

671
00:37:58,920 --> 00:38:00,520
I don't believe in the
singularity.

672
00:38:01,200 --> 00:38:05,080
Things are moving fast and even
possibly exponentially, but that

673
00:38:05,080 --> 00:38:07,400
doesn't mean that they will
continue to.

674
00:38:07,440 --> 00:38:11,680
You see, you see a lot of
sigmoid curves in the world.

675
00:38:11,680 --> 00:38:15,520
Things suddenly get really,
really fast and then you start

676
00:38:15,520 --> 00:38:19,280
to get diminishing returns.
I don't, I've never seen any

677
00:38:19,280 --> 00:38:23,320
phenomenon that just keeps going
like this and I don't see why AI

678
00:38:23,360 --> 00:38:26,400
would.
Well, let's Josh, let's bring it

679
00:38:26,400 --> 00:38:27,880
back.
So this podcast obviously is

680
00:38:27,960 --> 00:38:30,760
focused on mind body solution,
the mind body problem,

681
00:38:30,760 --> 00:38:32,920
consciousness, trying to
understand this fundamental

682
00:38:32,920 --> 00:38:35,240
question.
And I can't help but think, but

683
00:38:35,240 --> 00:38:38,360
I know you love sci-fi, and
growing up, I mean, this was

684
00:38:38,360 --> 00:38:39,800
something you were always
fascinated by.

685
00:38:40,840 --> 00:38:47,040
Have you watched Ex Machina?
Ex Machina, Yes, yes, yes.

686
00:38:47,040 --> 00:38:50,960
I think I remember the...
Alex Garland. So the science

687
00:38:50,960 --> 00:38:55,040
advisor was Murray Shanahan, and
he spoke about how

688
00:38:55,040 --> 00:38:58,880
one of the best scenes in
the film was when, when Oscar

689
00:38:58,880 --> 00:39:03,360
Isaac asks... I'm sorry, when the
guy asks Oscar Isaac about

690
00:39:03,520 --> 00:39:07,720
his artificial intelligence, he
says, did he pass the Turing

691
00:39:07,720 --> 00:39:09,240
test?
And he says no, he doesn't need

692
00:39:09,240 --> 00:39:11,480
to.
So she doesn't need to because I

693
00:39:11,480 --> 00:39:13,120
want you to know this is a
robot.

694
00:39:13,120 --> 00:39:15,840
I want you to know this is
artificial intelligence.

695
00:39:16,200 --> 00:39:18,960
The question is, will you still
believe it's conscious

696
00:39:19,160 --> 00:39:21,880
thereafter?
So here the cards are out on

697
00:39:21,880 --> 00:39:24,680
the table.
We know this is not a conscious

698
00:39:24,680 --> 00:39:26,800
biological being.
You know, it's physically

699
00:39:27,040 --> 00:39:29,640
artificial intelligence.
But are you still fooled?

700
00:39:29,640 --> 00:39:31,720
And he called this the Garland
test, after that film.

701
00:39:31,840 --> 00:39:34,280
Do you think it's
more of an interesting approach

702
00:39:34,280 --> 00:39:36,680
rather than having the normal
Turing test, but rather you know

703
00:39:36,680 --> 00:39:40,760
this is a non-, well, a non-
biological being, but yet he's

704
00:39:40,760 --> 00:39:42,720
super convinced that it is
conscious.

705
00:39:42,960 --> 00:39:46,240
And I think I don't know that it
helps much.

706
00:39:47,120 --> 00:39:49,640
I would mention Thomas Nagel's
essay.

707
00:39:49,640 --> 00:39:51,480
You know, "What is it like to be
a bat?"

708
00:39:51,480 --> 00:39:54,720
So a bat is also not human.
It is biological.

709
00:39:55,160 --> 00:39:58,560
But, you know, humans have a
long, you know, we've got a very

710
00:39:58,560 --> 00:40:01,080
long history of looking at
things that are not us.

711
00:40:01,760 --> 00:40:05,400
And making decisions that's
conscious, that's not conscious,

712
00:40:05,400 --> 00:40:08,520
that maybe is conscious.
I'll never know whether that is

713
00:40:08,520 --> 00:40:12,160
conscious.
So, you know, in terms of from

714
00:40:12,160 --> 00:40:14,960
that perspective, suddenly
having, you know, super

715
00:40:14,960 --> 00:40:21,000
realistic humanoids or, you know,
HAL-level-type ChatGPTs, what's

716
00:40:21,000 --> 00:40:23,200
the difference?
I I don't know whether a bat is

717
00:40:23,200 --> 00:40:24,760
conscious.
I also don't know whether

718
00:40:24,760 --> 00:40:27,360
ChatGPT is conscious.
I don't know whether that's

719
00:40:27,360 --> 00:40:31,640
a well formed question.
I don't know that it's that

720
00:40:31,640 --> 00:40:35,160
different in that way.
Well, it's, it's, it's one of

721
00:40:35,160 --> 00:40:37,400
those things where the
philosophical zombie will always

722
00:40:37,720 --> 00:40:40,160
come into play.
It's, there's, there's

723
00:40:40,160 --> 00:40:42,760
absolutely no way for me to look
at you and just assume that you

724
00:40:42,760 --> 00:40:46,080
are actually a conscious being.
I only have access to

725
00:40:46,080 --> 00:40:49,480
this mind, and all
we do is take cues and

726
00:40:49,480 --> 00:40:52,760
sort of make relative guesses as
to what we think is conscious.

727
00:40:53,000 --> 00:40:55,840
Do you think we anthropomorphize
things way too much as a

728
00:40:55,840 --> 00:40:57,840
species?
Way too much.

729
00:40:59,120 --> 00:41:01,520
Oh, that's a that's a good one.
We certainly do it.

730
00:41:01,520 --> 00:41:04,480
You know, an obscene amount of
time.

731
00:41:04,480 --> 00:41:09,560
I have, I have a robot
lawnmower at home and my 4 year

732
00:41:09,560 --> 00:41:12,720
old, you know,
he basically views it as his

733
00:41:12,720 --> 00:41:16,640
sibling and he will get upset
with us if we work the, the

734
00:41:16,640 --> 00:41:19,480
lawnmower too much.
You know, it's, it's everywhere

735
00:41:19,920 --> 00:41:22,400
and and robots are just going to
exacerbate it.

736
00:41:22,880 --> 00:41:26,000
Now, whether it's too much, you
know, you know, suggests that

737
00:41:26,000 --> 00:41:29,400
maybe it's a bad thing.
I think, you know, the main

738
00:41:29,400 --> 00:41:32,720
thing that humans struggle with
is increasing our circle of

739
00:41:32,720 --> 00:41:36,120
empathy.
So if we mistakenly, you know,

740
00:41:36,120 --> 00:41:39,120
empathize with things where maybe
there's nothing at home,

741
00:41:39,120 --> 00:41:43,160
there's a zombie or not,
I think better to err on that

742
00:41:43,160 --> 00:41:46,480
side than to err on the side of
this is not something that's

743
00:41:46,480 --> 00:41:49,520
worthy of our consideration, our
moral consideration.

744
00:41:50,440 --> 00:41:53,480
When you look back at someone
like Isaac Asimov when he came

745
00:41:53,480 --> 00:41:57,240
up with his laws of robotics and
AI, and then you look back to

746
00:41:57,240 --> 00:42:00,800
today with xenobots present, how
do you think that this field has

747
00:42:00,800 --> 00:42:03,560
fundamentally shifted and
changed throughout its course?

748
00:42:05,000 --> 00:42:09,360
Yeah, I mean, Isaac Asimov,
brilliant, creative and also

749
00:42:09,400 --> 00:42:11,400
like all of us, a creature of
his time.

750
00:42:11,400 --> 00:42:15,160
So, you know, it was a lot of,
you know, androids and, and that

751
00:42:15,160 --> 00:42:19,920
sort of thing and, and metal and
plastic and, but with the laws

752
00:42:19,920 --> 00:42:23,520
of robotics, you know, I think
he got that exactly right, which

753
00:42:23,520 --> 00:42:28,800
is human hubris that we think we
can put guardrails around

754
00:42:28,800 --> 00:42:32,440
complicated things.
And so in in AI and robotics,

755
00:42:32,440 --> 00:42:36,160
there's a famous term called
perverse instantiation, which

756
00:42:36,160 --> 00:42:40,600
means that the machine somehow
instantiates the behavior we

757
00:42:40,600 --> 00:42:44,800
want, but instantiates it in a
perverse way.

758
00:42:45,080 --> 00:42:48,360
If you ask an autonomous vehicle
to get the human occupant from

759
00:42:48,360 --> 00:42:52,160
point A to point B as quickly as
possible, fastest way to do that

760
00:42:52,160 --> 00:42:55,800
is to drive in a straight line,
you know, through parks and over

761
00:42:55,800 --> 00:42:58,560
sidewalks and it's everywhere,
right?

762
00:42:58,560 --> 00:43:02,200
So, and it's a very difficult
problem to solve.

763
00:43:02,200 --> 00:43:06,360
We, we end up having to tell all
our AI technologies, please do

764
00:43:06,360 --> 00:43:09,880
X, but don't do it in this way
and don't do it in that way and

765
00:43:09,880 --> 00:43:13,120
don't do... We keep tacking on
these things, which

766
00:43:13,440 --> 00:43:15,720
reminds me of, you know, all the
things that went wrong with the

767
00:43:15,720 --> 00:43:18,520
three laws of robotics in
Asimov's book.
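
A tiny illustration of perverse instantiation (a made-up toy map and planner, nothing from the episode): the same route planner, told only "get there as fast as possible", cheerfully cuts across the park, and only behaves once an extra "but don't do it that way" cost is bolted on afterwards.

```python
# Toy example of an under-specified objective versus one with a tacked-on guardrail (hypothetical map).
import heapq

GRID = [
    "S.....",
    ".pppp.",
    ".pppp.",
    ".pppp.",
    ".....G",
]  # '.' = road, 'p' = park, S = start, G = goal
ROWS, COLS = len(GRID), len(GRID[0])
START, GOAL = (0, 0), (ROWS - 1, COLS - 1)
MOVES = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]

def plan(park_penalty: float):
    """Dijkstra over the grid; each step costs 1 plus a penalty for entering park cells."""
    best = {START: 0.0}
    frontier = [(0.0, START, [START])]
    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == GOAL:
            return cost, path
        for dr, dc in MOVES:
            r, c = cell[0] + dr, cell[1] + dc
            if 0 <= r < ROWS and 0 <= c < COLS:
                new_cost = cost + 1.0 + (park_penalty if GRID[r][c] == "p" else 0.0)
                if new_cost < best.get((r, c), float("inf")):
                    best[(r, c)] = new_cost
                    heapq.heappush(frontier, (new_cost, (r, c), path + [(r, c)]))
    return float("inf"), []

for penalty in (0.0, 10.0):
    cost, path = plan(penalty)
    parks = sum(GRID[r][c] == "p" for r, c in path)
    print(f"park penalty {penalty}: {len(path) - 1} steps, {parks} park cells crossed")
```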

768
00:43:18,520 --> 00:43:22,600
So I would say actually he was
very prescient and he got at the

769
00:43:22,600 --> 00:43:26,520
root of what the problem would
be, which is exactly the problem

770
00:43:26,520 --> 00:43:30,320
we're all now wrestling with.
It's pretty incredible how these

771
00:43:30,480 --> 00:43:34,760
sci-fi writers back in the day
just guessed so many things so

772
00:43:34,760 --> 00:43:39,160
accurately and well-informed
within their time.

773
00:43:39,160 --> 00:43:42,240
It's so cool to see it all come
to fruition today.

774
00:43:42,640 --> 00:43:46,040
When you look at the field
and the work you guys

775
00:43:46,040 --> 00:43:48,920
have done and that you've done
over time, what do you feel most

776
00:43:48,920 --> 00:43:52,560
proud about when you look back?
What's the most exciting thing you

777
00:43:52,560 --> 00:43:57,720
think's been happening so far?
Oh gosh, that's very hard to

778
00:43:57,720 --> 00:43:59,640
say.
A lot of the things we've done I

779
00:43:59,640 --> 00:44:03,880
found very personally satisfying
in that for me, they were good

780
00:44:04,320 --> 00:44:07,520
pieces of the puzzle in
answering this question of how

781
00:44:07,520 --> 00:44:09,080
the body shapes the way we
think.

782
00:44:09,400 --> 00:44:12,280
But some of those results are
kind of, you know, abstract or

783
00:44:12,280 --> 00:44:15,200
abstruse and and, you know,
they're going to have probably

784
00:44:15,200 --> 00:44:17,920
limited impact.
So I, I guess I would say the

785
00:44:17,920 --> 00:44:22,400
Xenobots for the simple fact
that they've given, they've

786
00:44:22,400 --> 00:44:25,880
given society a new way to make
helpful tools, you know, if we

787
00:44:25,880 --> 00:44:30,040
want to clean up microplastics
and not leave additional metal and

788
00:44:30,040 --> 00:44:32,440
stuff behind when we clean
up, great.

789
00:44:32,440 --> 00:44:36,360
So I'm happy about that.
And then there's also something

790
00:44:36,360 --> 00:44:38,520
about the Xenobots, as I
mentioned, that seems

791
00:44:38,520 --> 00:44:43,160
inspirational to younger people.
And that's something I care a

792
00:44:43,160 --> 00:44:47,080
lot about.
I consider myself an old person

793
00:44:47,080 --> 00:44:49,280
or, you know, we've made a
lot of problems that

794
00:44:49,280 --> 00:44:52,040
unfortunately younger folks are
going to have to try and deal

795
00:44:52,040 --> 00:44:55,080
with somehow.
And as a scientist and engineer,

796
00:44:55,080 --> 00:44:58,920
I feel morally responsible for
trying to at least leave them

797
00:44:58,920 --> 00:45:02,720
some tools with which to try and
solve some of these problems.

798
00:45:02,720 --> 00:45:05,040
And I hope that Xenobots will be
one of those tools.

799
00:45:05,760 --> 00:45:08,080
When you look back when you were
growing up, who who were the

800
00:45:08,080 --> 00:45:10,880
scientists or writers, sci-fi
writers, whatever, that really

801
00:45:10,880 --> 00:45:13,560
inspired you and got you to to
like look at this field and

802
00:45:13,560 --> 00:45:16,320
think about it so seriously too,
to a point where you guys create

803
00:45:16,320 --> 00:45:19,360
new organisms that have.
Yeah, Yeah, It's a good

804
00:45:19,360 --> 00:45:21,760
question.
So I didn't... there were no

805
00:45:21,760 --> 00:45:24,600
scientists in my family.
It wasn't something that, you

806
00:45:24,600 --> 00:45:26,640
know, was a part of my
childhood.

807
00:45:26,640 --> 00:45:29,200
So I didn't actually really
think about scientists or

808
00:45:29,200 --> 00:45:31,440
engineers or what we now call
STEM.

809
00:45:31,440 --> 00:45:35,480
That wasn't a thing.
Being a professor also wasn't,

810
00:45:35,920 --> 00:45:40,040
you know, wasn't really a thing
in my family, but it, it was the

811
00:45:40,040 --> 00:45:42,560
sci-fi.
And you know, what I took away

812
00:45:42,560 --> 00:45:46,160
from it is, you know, all these
things sound wonderful.

813
00:45:46,960 --> 00:45:49,640
Where are they?
Like if, and, you know, as a

814
00:45:49,640 --> 00:45:51,760
young person, I could see back
then, you know, there are also

815
00:45:51,760 --> 00:45:54,400
problems.
Like wouldn't it be great?

816
00:45:54,400 --> 00:45:58,360
Like this literature seems to
suggest there's other paths

817
00:45:58,360 --> 00:46:00,640
forward.
There's other ways to do things.

818
00:46:00,640 --> 00:46:03,520
And then I would, from my
limited vantage point, look

819
00:46:03,520 --> 00:46:07,560
around and just, I remember just
being confused, like, why not?

820
00:46:07,560 --> 00:46:10,960
Why aren't we trying to?
And then of course, I learned as

821
00:46:10,960 --> 00:46:13,600
I got older that we were trying
and failing miserably.

822
00:46:14,040 --> 00:46:17,760
You know, I, I came of age in,
in academia during the AI winter

823
00:46:17,760 --> 00:46:21,240
when nothing was working.
It seemed that, you know, robots

824
00:46:21,240 --> 00:46:26,480
and AI were 1,000 years off.
And for me, I guess with my

825
00:46:26,480 --> 00:46:30,480
personality, the challenge
appealed to me, the fact that

826
00:46:31,320 --> 00:46:34,760
there were so many failures that
even if I entered the field and

827
00:46:34,760 --> 00:46:38,760
was slightly less of a failure,
that would be progress that, you

828
00:46:38,760 --> 00:46:40,920
know, being able to contribute a
small bit.

829
00:46:41,400 --> 00:46:44,520
And then suddenly the AI summer
happened and, you know, all

830
00:46:44,520 --> 00:46:46,000
these other things started to
happen.

831
00:46:46,000 --> 00:46:48,200
But anyways, that's how I
got to this point.

832
00:46:48,760 --> 00:46:51,760
You you mentioned something like
STEM, being a professor.

833
00:46:51,800 --> 00:46:55,800
When you think about you, your
work and how it bridges biology,

834
00:46:55,800 --> 00:46:59,800
computer science, robotics,
philosophy, so many different

835
00:46:59,800 --> 00:47:02,480
fields, this interdisciplinary
work that you guys are doing, so

836
00:47:02,480 --> 00:47:03,920
many different people coming
together.

837
00:47:04,360 --> 00:47:08,680
How do you think that this has
changed the game for you guys?

838
00:47:08,680 --> 00:47:11,680
Because it wasn't a thing back
in the day where, I mean, we're

839
00:47:11,680 --> 00:47:14,080
all stuck in these niche fields
and everybody's just doing their

840
00:47:14,080 --> 00:47:16,400
own work.
Even within biology, you'll have

841
00:47:16,400 --> 00:47:19,080
a biologist who knows nothing
about a specific molecule and

842
00:47:19,520 --> 00:47:21,800
there's so little that everyone
knows about everyone else's

843
00:47:21,800 --> 00:47:23,960
work.
But but I see you guys doing

844
00:47:23,960 --> 00:47:27,200
this in such an
interdisciplinary way that it's

845
00:47:27,200 --> 00:47:29,560
it's pretty epic to watch from
from the outside.

846
00:47:29,720 --> 00:47:34,080
And how's that been for you?
Well, first of all, it's been

847
00:47:34,080 --> 00:47:36,560
very difficult and it's been
very slow going.

848
00:47:36,560 --> 00:47:39,160
So we've been talking about the
Xenobots, which is kind of the

849
00:47:39,160 --> 00:47:42,600
end, you know, or it's a, it's a
stopping point on a very long

850
00:47:42,600 --> 00:47:45,760
journey.
I was very fortunate and again,

851
00:47:46,040 --> 00:47:51,280
just by chance to have fallen
into various programs at

852
00:47:51,280 --> 00:47:55,360
different universities in, in
four different countries that

853
00:47:55,360 --> 00:47:58,880
taught me how to be an
interdisciplinary researcher.

854
00:47:58,880 --> 00:48:02,200
There are ways, you know, to do
it very badly, right?

855
00:48:02,200 --> 00:48:06,120
So, you know, Jack of all
trades, master of none, you

856
00:48:06,120 --> 00:48:09,920
know, it's not an easy
thing to do. The reason

857
00:48:09,920 --> 00:48:12,560
that most people are
specialists, and I'm glad there

858
00:48:12,560 --> 00:48:16,920
are specialists, you know, is
because it's easier to drill

859
00:48:16,920 --> 00:48:21,680
down and really grasp a branch
of human knowledge and extend it

860
00:48:22,080 --> 00:48:25,600
than it is to try and connect
two branches together.

861
00:48:25,600 --> 00:48:30,040
It takes at least twice as long.
So in the case of my

862
00:48:30,040 --> 00:48:33,160
collaboration with Mike, I told
you about, you know, the, the

863
00:48:33,160 --> 00:48:35,080
Xeno sculpture and then the
xenobot.

864
00:48:35,080 --> 00:48:38,120
But you know, it took a long
time to get to that point.

865
00:48:38,160 --> 00:48:40,360
We, we don't even speak the same
languages.

866
00:48:40,360 --> 00:48:44,800
You have to learn, you know,
what each is capable of, what

867
00:48:44,800 --> 00:48:48,680
takes a week, what takes five
years, you know all those sorts

868
00:48:48,680 --> 00:48:52,120
of things.
And then you can proceed

869
00:48:52,120 --> 00:48:55,560
carefully.
And of course, in some cases all

870
00:48:55,560 --> 00:48:57,800
of this effort is worth it
because you get lucky with

871
00:48:57,800 --> 00:49:02,080
something like the Xenobots, but
it is definitely not the norm.

872
00:49:02,520 --> 00:49:04,800
Yeah, and then you get stuck in
a podcast that's philosophical

873
00:49:04,800 --> 00:49:06,600
based on the mind-body problem,
being asked:

874
00:49:06,600 --> 00:49:09,440
What is consciousness?
That's right, that's right.

875
00:49:09,720 --> 00:49:11,480
You got to roll
with the punches.

876
00:49:12,880 --> 00:49:15,760
Tell me, Josh, when you, when
you think about this field

877
00:49:15,760 --> 00:49:18,920
moving forward, what excites you
most about it?

878
00:49:18,920 --> 00:49:21,560
Where, what do you think in
terms of the the future of the

879
00:49:21,560 --> 00:49:24,720
industry, people that are
working perhaps under you or

880
00:49:24,960 --> 00:49:27,560
work that you've seen what
what's what's been most exciting

881
00:49:27,560 --> 00:49:32,040
and even within your own work?
Yeah, so, so I am interested in

882
00:49:32,040 --> 00:49:33,440
the philosophical side of
things.

883
00:49:33,440 --> 00:49:37,400
So one of the things that I see
in the Xenobots and some of the

884
00:49:37,400 --> 00:49:42,600
other kind of exotic robots that
we work with is a, a conflation

885
00:49:42,600 --> 00:49:47,080
of thought and action.
So if you take a Roomba,

886
00:49:47,080 --> 00:49:50,280
the robot vacuum cleaner,
you can point to the wheels and

887
00:49:50,280 --> 00:49:54,080
you can say action.
You can open it up and point to

888
00:49:54,080 --> 00:49:58,080
the central CPU and say,
you know, thought or cognition

889
00:49:58,080 --> 00:50:01,480
or processing. You know, with
traditional robots, there is a

890
00:50:01,480 --> 00:50:05,200
Cartesian cut you can make that
separates the body from the

891
00:50:05,200 --> 00:50:08,080
brain.
But what I'm excited about is if

892
00:50:08,080 --> 00:50:11,920
you look at a lot of cutting
edge technologies, that

893
00:50:11,920 --> 00:50:18,080
separation is becoming less and
less obvious, which I think is,

894
00:50:18,080 --> 00:50:21,920
you know, finally the beginnings
of an acid that's dissolving

895
00:50:21,920 --> 00:50:25,520
this distinction that at least
in the West has been around for

896
00:50:25,520 --> 00:50:30,160
over 300 years.
And so I think that's great from

897
00:50:30,160 --> 00:50:32,240
a purely intellectual point of
view.

898
00:50:32,560 --> 00:50:35,960
But again, it, it also is
important for us as a species to

899
00:50:35,960 --> 00:50:39,000
understand that, you know,
it's not that brains are everything and

900
00:50:39,000 --> 00:50:42,160
everything else is expendable.
You know, the story is much

901
00:50:42,160 --> 00:50:48,680
more complicated than that.
And I think that finally

902
00:50:48,680 --> 00:50:52,800
overturning Cartesian dualism
will actually be a positive

903
00:50:52,800 --> 00:50:56,600
thing for society, assuming
it actually happens.

904
00:50:57,120 --> 00:51:01,680
You know, Keith Frankish,
the one who coined

905
00:51:01,680 --> 00:51:04,280
the term illusionism, he's
writing a book called Escaping

906
00:51:04,280 --> 00:51:07,440
Descartes Prison and it's
fundamentally premised on that,

907
00:51:07,480 --> 00:51:10,560
on that whole idea that this
particular dualism is, is

908
00:51:10,560 --> 00:51:12,720
definitely on a decline,
however.

909
00:51:12,960 --> 00:51:16,000
I know, I know Keith, and I'm
glad to know he's also like,

910
00:51:16,000 --> 00:51:18,040
wielding an axe from a different
direction.

911
00:51:18,040 --> 00:51:22,280
Well, we're doing our empiricist
best, and yeah, hopefully the

912
00:51:22,280 --> 00:51:25,840
tree will come down.
But but on the flip side, if you

913
00:51:25,840 --> 00:51:28,640
think about it, Josh, while that
Cartesian dualism is breaking

914
00:51:28,640 --> 00:51:31,560
down, you've got a rise in Ben
psychicism and idealism as well.

915
00:51:31,880 --> 00:51:34,920
So you've got a whole group of
thinkers that think that either

916
00:51:34,920 --> 00:51:37,640
everything is conscious or
everything is consciousness.

917
00:51:38,120 --> 00:51:40,200
What are your thoughts on that?
I know this is not your field.

918
00:51:40,200 --> 00:51:43,640
Now we're just going beyond.
You know what, I'm OK with it.

919
00:51:43,640 --> 00:51:47,920
I'm OK with that.
It's actually the stance

920
00:51:47,920 --> 00:51:50,000
in the middle that I find
problematic.

921
00:51:50,640 --> 00:51:53,880
I am conscious and that thing
over there isn't.

922
00:51:53,880 --> 00:51:56,040
So it doesn't matter what I do
to it.

923
00:51:56,280 --> 00:52:00,280
Like that's the problem.
So if there are a greater number

924
00:52:00,280 --> 00:52:03,880
of folks that believe everything
is conscious and we should be

925
00:52:04,240 --> 00:52:08,360
careful about how we interact
with other things and other

926
00:52:09,200 --> 00:52:12,840
entities, great.
If the illusionists say, listen,

927
00:52:12,840 --> 00:52:15,680
we're all on the same level
playing field, you know, for

928
00:52:15,680 --> 00:52:18,120
different reasons, that's also
good.

929
00:52:18,120 --> 00:52:21,280
I think it's yeah.
My problem is actually the

930
00:52:21,280 --> 00:52:23,280
intermediate stance.
Yeah, I agree.

931
00:52:23,280 --> 00:52:27,520
I think ethically, and sort
of the moral philosophy

932
00:52:27,520 --> 00:52:30,720
behind both of those, whether
it's illusionism or panpsychism,

933
00:52:30,720 --> 00:52:33,000
pretty much leads to the same
thing, that we're all the

934
00:52:33,000 --> 00:52:35,480
same thing.
We're all the same thing,

935
00:52:35,480 --> 00:52:37,560
exactly.
And I think that blurs all the

936
00:52:37,920 --> 00:52:39,680
lines.
And even within an idealist

937
00:52:39,680 --> 00:52:42,160
philosophy where they think that
everything is consciousness and

938
00:52:42,160 --> 00:52:45,000
we're just a part of it, that
also tends to always lead to

939
00:52:45,000 --> 00:52:48,040
some sort of this
consciousness trying to love

940
00:52:48,040 --> 00:52:50,360
itself, enjoy itself.
It always leads to some sort of

941
00:52:50,360 --> 00:52:53,760
an empathetic positive
worldview, which I think in

942
00:52:53,760 --> 00:52:56,280
essence would be a better option
anyway, instead of drawing these

943
00:52:56,280 --> 00:52:59,800
fundamental lines between one
thing or another.

944
00:52:59,880 --> 00:53:04,240
But when when you think about
your work currently, what are

945
00:53:04,240 --> 00:53:07,800
you guys doing that you think we
should be looking forward to the

946
00:53:07,800 --> 00:53:09,880
most and that you're most
excited about?

947
00:53:10,640 --> 00:53:14,560
Yeah.
So this is true of the xenobots

948
00:53:14,560 --> 00:53:18,000
in general, but we are also
doing a lot of work in my lab

949
00:53:18,000 --> 00:53:21,800
with what are called
metamaterials, and these are human

950
00:53:21,800 --> 00:53:25,520
engineered materials that act
very differently from natural

951
00:53:25,520 --> 00:53:28,560
materials.
And it turns out that these

952
00:53:28,560 --> 00:53:34,080
metamaterials also are capable of
thought or cognition and action

953
00:53:34,080 --> 00:53:38,040
simultaneously.
Imagine that I take a sheet of

954
00:53:38,040 --> 00:53:41,840
this material and I start
vibrating it, but I vibrate it

955
00:53:41,840 --> 00:53:45,200
at different frequencies.
It turns out you can get this

956
00:53:45,200 --> 00:53:49,080
material to do different things
at these different frequencies

957
00:53:49,440 --> 00:53:52,920
and actually use these different
frequencies to carry

958
00:53:52,920 --> 00:53:55,600
information.
You can design a sheet

959
00:53:55,600 --> 00:53:59,480
of this material that when you
shake it, it performs logical

960
00:53:59,480 --> 00:54:04,720
AND at one frequency and logical
OR at the other frequency

961
00:54:04,800 --> 00:54:08,560
simultaneously.
And while computing that

962
00:54:08,560 --> 00:54:11,880
function, the material is moving
and if you put it on the ground,

963
00:54:11,880 --> 00:54:13,920
it will actually move along the
ground.

964
00:54:13,920 --> 00:54:17,800
So where is the distinction
between thought and action?
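
To make the frequency-multiplexed logic concrete, here is a heavily simplified cartoon (my own toy construction, not the lab's metamaterial model): a single resonator whose stiffness shifts as binary inputs are added, read out at two different drive frequencies so the same object reports AND at one frequency and OR at the other. The numbers are chosen purely so the toy separates cleanly.

```python
# Toy cartoon of one vibrating "sheet" computing AND and OR at two drive frequencies (hypothetical numbers).
import math

GAMMA = 0.08  # damping, picked so the two readouts separate

def amplitude(a: int, b: int, drive_freq_sq: float) -> float:
    """Steady-state response of a damped resonator whose squared resonant
    frequency stiffens by 0.2 per active binary input."""
    res_sq = 1.0 + 0.2 * (a + b)
    drive = math.sqrt(drive_freq_sq)
    return 1.0 / math.sqrt((res_sq - drive_freq_sq) ** 2 + (GAMMA * drive) ** 2)

def read_and(a: int, b: int) -> bool:
    # Resonant only when both inputs are present.
    return amplitude(a, b, drive_freq_sq=1.4) > 7.0

def read_or(a: int, b: int) -> bool:
    # Resonant when at least one input is present.
    return amplitude(a, b, drive_freq_sq=1.3) > 5.0

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "AND:", read_and(a, b), "OR:", read_or(a, b))
```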

965
00:54:18,880 --> 00:54:22,400
So in answer to your question,
I'm excited about, you know, all

966
00:54:22,400 --> 00:54:27,040
of these exotic materials, which
includes living materials that

967
00:54:27,040 --> 00:54:32,440
conflate thought and action.
And then AI's ability to exploit

968
00:54:32,440 --> 00:54:35,920
the potential of these new
materials to make useful

969
00:54:35,920 --> 00:54:40,440
machines in ways, you know,
Hollywood hasn't even begun to

970
00:54:40,440 --> 00:54:43,200
explore.
They will, I guess, but not

971
00:54:43,200 --> 00:54:45,320
yet.
They always eventually catch up

972
00:54:45,320 --> 00:54:49,160
and... but that's incredible.
I mean, there's

973
00:54:49,160 --> 00:54:52,040
so much groundbreaking stuff
occurring that I mean, you're

974
00:54:52,040 --> 00:54:54,120
right.
The pace, even for me, when I

975
00:54:54,120 --> 00:54:56,440
watch what you guys do,
sometimes when I look at it, I

976
00:54:56,440 --> 00:54:58,840
think like this is crazy how
fast this is happening.

977
00:54:59,480 --> 00:55:02,000
Like, you see five years ago one
xenobot made, and now suddenly

978
00:55:02,000 --> 00:55:04,720
you guys are making materials
that can vibrate at different

979
00:55:04,880 --> 00:55:07,720
frequencies, I think.
Yeah, I mean, the reason I

980
00:55:07,720 --> 00:55:11,120
would say is, again, that Mike and
myself and some others, you

981
00:55:11,120 --> 00:55:14,120
know, have been on a very long
journey to try and see things

982
00:55:14,120 --> 00:55:17,520
differently, to escape from the
Cartesian worldview, right.

983
00:55:17,520 --> 00:55:21,480
So if you don't escape, you
know, you do things like ChatGPT

984
00:55:21,480 --> 00:55:24,080
and you try and make bigger and
bigger brains, which I would say

985
00:55:24,080 --> 00:55:27,640
is also useful.
Nothing against, you know,

986
00:55:27,720 --> 00:55:30,120
OpenAI and ChatGPT and Stable
Diffusion.

987
00:55:30,120 --> 00:55:34,080
They're great.
But the reason I think that

988
00:55:34,080 --> 00:55:36,560
we are discovering a lot of
things is because we're not

989
00:55:36,560 --> 00:55:38,960
looking under the lamppost
anymore, right?

990
00:55:38,960 --> 00:55:41,560
We're out in the dark and it
turns out there's a lot of stuff

991
00:55:41,560 --> 00:55:44,920
out there.
It's just that people haven't,

992
00:55:45,120 --> 00:55:48,720
haven't thought to look there,
haven't been able to get there, because

993
00:55:48,720 --> 00:55:52,040
you have to think in a non
Cartesian way to get there.

994
00:55:52,040 --> 00:55:56,720
You have to ignore a distinction
between body and brain, and then

995
00:55:56,720 --> 00:56:00,080
you start finding these weird
materials and finding ways to

996
00:56:00,080 --> 00:56:03,720
bend them to new purposes.
Something I found particularly

997
00:56:03,720 --> 00:56:06,800
fascinating when I listened to you
speak was, you mentioned

998
00:56:06,800 --> 00:56:11,880
something about cause and effect
and until an artificial

999
00:56:11,880 --> 00:56:15,920
intelligence is able to actually
cause an effect, that's when you

1000
00:56:15,920 --> 00:56:18,720
know that this thing has
become fundamentally a part of

1001
00:56:18,720 --> 00:56:19,960
reality.
Do you want to expand on that

1002
00:56:19,960 --> 00:56:21,520
idea?
Sure, sure.

1003
00:56:21,520 --> 00:56:25,720
So this is something that
current AI technologies struggle

1004
00:56:25,720 --> 00:56:30,040
with: cause and effect.
You know, the best, the best a

1005
00:56:30,040 --> 00:56:34,840
non embodied AI can do is sort
of read about all cause effect

1006
00:56:34,840 --> 00:56:37,680
relationships that humans have
written about on the Internet or

1007
00:56:37,680 --> 00:56:41,240
in books or whatever.
That's all those AIs have

1008
00:56:41,240 --> 00:56:45,320
access to and they can do OK
with cause and effect.

1009
00:56:45,320 --> 00:56:49,040
Not great, but again, you know
how the body shapes the way we

1010
00:56:49,040 --> 00:56:50,920
think.
Like I said, I have a four year

1011
00:56:50,920 --> 00:56:54,080
old at home and he pushes a
coffee mug off the table and it

1012
00:56:54,080 --> 00:56:56,360
shatters like he caused an
effect.

1013
00:56:56,640 --> 00:57:00,000
And not only did it shatter, but
he sees how his two parents

1014
00:57:00,000 --> 00:57:02,760
react and how they react
differently to what's there.

1015
00:57:03,120 --> 00:57:06,360
When you can push against the
world, literally, you know,

1016
00:57:06,360 --> 00:57:10,600
there is a massive amount of
rich effectual data that comes

1017
00:57:10,600 --> 00:57:13,200
back.
And so I say this a lot.

1018
00:57:13,200 --> 00:57:15,960
Embodied cognition is about
pushing against the world and

1019
00:57:15,960 --> 00:57:17,720
observing how the world pushes
back.

1020
00:57:18,040 --> 00:57:22,320
And that is a cause effect loop.
You can cause them in all sorts

1021
00:57:22,320 --> 00:57:27,600
of ways from, you know, pushing
a coffee cup off a table to, you

1022
00:57:27,600 --> 00:57:31,760
know, changing a society,
landing someone on the moon, all

1023
00:57:31,760 --> 00:57:35,840
sorts of things.
And that that gives embodied

1024
00:57:35,840 --> 00:57:39,040
learners, not just human
learners, but non human

1025
00:57:39,040 --> 00:57:42,160
learners, you know, front row
seat for cause and effect.
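
A minimal sketch of that push-and-observe loop (an invented example; the "mug" world and its threshold are hypothetical): an embodied learner chooses its own interventions and closes the cause-effect loop, while a disembodied learner can only reuse whatever pushes happen to be in its logged data.

```python
# Embodied versus passive learning of a simple cause-effect rule (hypothetical toy world).
import random

random.seed(1)
TRUE_THRESHOLD = 0.37  # hidden property of the world: push harder than this and the mug falls

def world(push_force: float) -> bool:
    """The environment 'pushes back': returns True if the mug shatters."""
    return push_force > TRUE_THRESHOLD

# Embodied learner: actively intervenes, halving its uncertainty with every push.
low, high = 0.0, 1.0
for _ in range(10):
    force = (low + high) / 2           # hypothesis about the cause
    shattered = world(force)           # push against the world
    low, high = (low, force) if shattered else (force, high)  # observe how it pushes back
print(f"embodied estimate after 10 pushes: {(low + high) / 2:.3f}")

# Disembodied learner: can only reuse whatever pushes happen to be in its training data.
forces = [random.random() for _ in range(10)]
logged = [(f, world(f)) for f in forces]
below = [f for f, s in logged if not s]
above = [f for f, s in logged if s]
passive = ((max(below) if below else 0.0) + (min(above) if above else 1.0)) / 2
print(f"passive estimate from 10 logged pushes: {passive:.3f}")
```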

1026
00:57:42,400 --> 00:57:45,880
If you're not sure, if you're
thinking that this effect is

1027
00:57:45,880 --> 00:57:49,760
caused by that cause, just go
out and verify it. Be a

1028
00:57:49,760 --> 00:57:52,240
scientist, push against the
world and see whether you were

1029
00:57:52,240 --> 00:57:55,360
right or not.
It's actually very simple when

1030
00:57:55,360 --> 00:57:58,560
you think about it.
So I think there are some of

1031
00:57:58,560 --> 00:58:02,600
these examples that make
clear why we can't just ignore

1032
00:58:02,600 --> 00:58:04,760
the body.
It matters in very important

1033
00:58:04,760 --> 00:58:05,160
ways.
Yeah.

1034
00:58:05,400 --> 00:58:07,200
It reminds me, I think it was a paper
I read.

1035
00:58:07,200 --> 00:58:09,240
I can't remember the author's
name, but I'll try and find it.

1036
00:58:09,240 --> 00:58:10,560
But it's called the "no body"
problem.

1037
00:58:10,560 --> 00:58:13,360
Just that there's a problem in
having no body.

1038
00:58:13,400 --> 00:58:15,120
That's the thing.
It's it's a fundamental.

1039
00:58:15,120 --> 00:58:17,000
That's the real problem.
Agreed.

1040
00:58:17,120 --> 00:58:18,880
No body
is a big problem.

1041
00:58:19,240 --> 00:58:21,400
Yeah.
And it's it's it's, yeah.

1042
00:58:21,400 --> 00:58:23,360
It goes down to the crux of what
we're talking about.

1043
00:58:23,560 --> 00:58:26,800
That same cause and effect that
you're talking about is that

1044
00:58:26,800 --> 00:58:29,680
it almost reminds you of that
enactivist part, the fact that

1045
00:58:29,680 --> 00:58:32,760
we're here trying to
consistently and continuously do

1046
00:58:32,760 --> 00:58:36,160
something, like there's a purpose
driving everything we

1047
00:58:36,160 --> 00:58:37,200
do.
We must find food.

1048
00:58:37,200 --> 00:58:38,560
We must.
We see it.

1049
00:58:38,560 --> 00:58:40,760
We see something red.
We know it might be an apple.

1050
00:58:40,760 --> 00:58:43,520
There's, there's this
interaction with reality that's

1051
00:58:43,560 --> 00:58:47,000
always looping back and forth.
And do you think fundamentally

1052
00:58:47,000 --> 00:58:51,120
with AI we would either have to
program it to have some sort of

1053
00:58:51,280 --> 00:58:55,320
a personal sense of loss if it does
not exist, we must have some

1054
00:58:55,320 --> 00:58:58,440
sort of program within it or
does it not need something like

1055
00:58:58,440 --> 00:59:02,400
that in order to be?
I, I think it absolutely

1056
00:59:02,400 --> 00:59:06,520
needs it, but not necessarily in
the literal sense of, you know,

1057
00:59:06,520 --> 00:59:09,480
making robots that push against
the world and observing how the

1058
00:59:09,480 --> 00:59:12,680
world pushes back.
All the, you know, ChatGPT and

1059
00:59:12,680 --> 00:59:15,680
all the chat bots, they kind of
already are doing it because

1060
00:59:16,000 --> 00:59:20,320
they say stuff and that affects
their human interlocutors, some

1061
00:59:20,320 --> 00:59:23,960
of whom go off and do things and
then report back to ChatGPT,

1062
00:59:24,200 --> 00:59:26,360
which becomes part of its
training set.

1063
00:59:26,360 --> 00:59:31,600
So, you know, modern AI, they
already have, you know, slaves,

1064
00:59:31,600 --> 00:59:34,160
which is us.
You know, they can use us to

1065
00:59:34,160 --> 00:59:36,560
try out cause and effect
relationships.

1066
00:59:37,040 --> 00:59:40,240
I don't know, you know, what the
big AI companies are doing.

1067
00:59:41,160 --> 00:59:43,560
A lot of them are very smart.
They probably started to figure

1068
00:59:43,560 --> 00:59:46,320
this out.
So, you know, there are ways

1069
00:59:46,320 --> 00:59:48,480
that you can learn about the
real world.

1070
00:59:48,480 --> 00:59:51,120
You can learn about cause and
effect where you have something

1071
00:59:51,120 --> 00:59:54,960
or someone else go and, you
know, embody that cause and

1072
00:59:54,960 --> 00:59:58,880
effect loop for you.
So it's already happening.

1073
00:59:59,000 --> 01:00:01,920
It's a crazy thought.
In other words, we are AI's

1074
01:00:01,920 --> 01:00:05,040
extended mind.
We absolutely are.

1075
01:00:05,040 --> 01:00:07,760
So it is an extended mind
already. It's got, I don't know,

1076
01:00:07,760 --> 01:00:09,880
how many of us as its end
effectors.

1077
01:00:09,880 --> 01:00:13,680
It's happening.
Do you think that a bigger

1078
01:00:13,680 --> 01:00:16,720
step, to take this even
further would be then giving it

1079
01:00:16,720 --> 01:00:21,320
this sort of sense of loss?
So having such a hard wired

1080
01:00:21,320 --> 01:00:25,640
program that if it ceases to
exist, that's a

1081
01:00:25,640 --> 01:00:28,200
fundamental problem, the same
way we are so afraid of death.

1082
01:00:28,680 --> 01:00:31,440
You think that's a big factor
that we'll have to consider in

1083
01:00:31,440 --> 01:00:34,080
the future to make this?
It's a it's a really good

1084
01:00:34,080 --> 01:00:35,920
question.
I don't know whether, like, a

1085
01:00:35,920 --> 01:00:39,520
fear of death or extinction is a
necessary component for

1086
01:00:39,520 --> 01:00:44,000
developing embodied cognition.
If it is, that's going to be

1087
01:00:44,000 --> 01:00:47,440
problematic because this is
something Hollywood has explored

1088
01:00:47,440 --> 01:00:50,760
in great detail.
If you know, if you give

1089
01:00:51,040 --> 01:00:54,760
technologies existential dread,
you know they'll do everything

1090
01:00:54,760 --> 01:00:56,760
they can in their power to avoid
it.

1091
01:00:56,760 --> 01:00:59,440
And I agree with it.
That just makes sense.

1092
01:01:00,120 --> 01:01:02,800
So I, I think actually it's a
research question.

1093
01:01:02,800 --> 01:01:06,440
Like we started talking about
self-awareness and how that

1094
01:01:06,440 --> 01:01:11,400
might have evolved in us to
avoid extinction, avoid death.

1095
01:01:12,600 --> 01:01:16,440
But it's not necessarily true
that technologies can't become

1096
01:01:16,440 --> 01:01:20,600
self aware and practice cause
and effect internally and verify

1097
01:01:20,600 --> 01:01:24,840
it physically that that all
needs to be driven by, you know,

1098
01:01:24,920 --> 01:01:28,640
fear of extinction.
I hope it's not true.

1099
01:01:28,640 --> 01:01:32,000
I hope that, you know, they
don't have to rely on that for

1100
01:01:32,000 --> 01:01:34,600
their motivation.
But I guess, I guess we'll see.

1101
01:01:35,840 --> 01:01:38,640
A lot's probably going to change
in the next few years, and I think

1102
01:01:38,640 --> 01:01:42,000
the, well, as we said, it's very
fast-paced, not to the point of

1103
01:01:42,000 --> 01:01:45,560
a singularity, but very fast in
general.

1104
01:01:45,760 --> 01:01:48,880
When you look back at your views
on consciousness and the mind,

1105
01:01:49,760 --> 01:01:53,800
how often have you changed your
views regarding consciousness

1106
01:01:53,800 --> 01:01:56,320
and the mind body problem?
So when you were young, let's

1107
01:01:56,320 --> 01:01:59,600
say prior to entering the field
as a student perhaps, did

1108
01:01:59,600 --> 01:02:01,320
you have a Cartesian mindset on
this?

1109
01:02:01,320 --> 01:02:03,160
Did you think there was a sort
of soul or?

1110
01:02:04,920 --> 01:02:08,400
I did not have a religious,
yeah, I did not have a religious

1111
01:02:08,400 --> 01:02:11,000
upbringing, but yeah, I was
convinced there was someone in

1112
01:02:11,000 --> 01:02:14,200
there.
But, and that's just because I

1113
01:02:14,200 --> 01:02:16,960
was raised in the West, pretty
much everyone, even if you're an

1114
01:02:16,960 --> 01:02:21,600
atheist, that's what you're
taught indirectly and directly.

1115
01:02:21,960 --> 01:02:26,840
So, so my awakening was at the
University of Sussex.

1116
01:02:26,840 --> 01:02:30,720
I did my masters there and
Margaret Boden and Phil

1117
01:02:30,720 --> 01:02:33,360
Husbands and Inman Harvey and
Anil Seth.

1118
01:02:33,640 --> 01:02:36,920
You know, they did a very good
job of disabusing us of that

1119
01:02:36,920 --> 01:02:40,320
notion, or at least, you know,
presenting it that it's just one

1120
01:02:40,320 --> 01:02:43,320
view.
And so that was that was the

1121
01:02:43,320 --> 01:02:46,440
start for me.
And then, you know, then

1122
01:02:46,520 --> 01:02:50,560
developing self aware robots and
actually kind of how easy it was

1123
01:02:50,560 --> 01:02:53,440
to do that.
Well, it wasn't super easy, but

1124
01:02:53,840 --> 01:02:56,600
surprisingly easy.
That was the nail and that was

1125
01:02:56,600 --> 01:02:58,360
the final nail in the coffin for
me.

1126
01:02:58,360 --> 01:03:04,200
I was like, if we can make this
$29 robot, you know, be able

1127
01:03:04,200 --> 01:03:07,680
to create an awareness of self,
you know, it's not really

1128
01:03:07,680 --> 01:03:10,200
something special.
It's probably, you know, it's

1129
01:03:10,240 --> 01:03:13,240
not... We should take it off
the pedestal, let's put it that

1130
01:03:13,240 --> 01:03:15,040
way.
Do you think there'll be a

1131
01:03:15,040 --> 01:03:18,520
fundamental shift or difference
when you're able to give these

1132
01:03:19,640 --> 01:03:26,080
self-aware agents agency, or
allow them to make choices

1133
01:03:26,080 --> 01:03:28,800
and then interact with those
causes and effects at some point?

1134
01:03:28,800 --> 01:03:32,240
Yeah, absolutely.
Because one of the things I love

1135
01:03:32,240 --> 01:03:35,440
about AI and robots is like how
creative they are.

1136
01:03:35,440 --> 01:03:39,200
So, you know, there's all these
hilarious AI fails, but they're

1137
01:03:39,200 --> 01:03:41,400
also, you know, some of them are
really creative.

1138
01:03:41,640 --> 01:03:44,000
So it'd be great if there are also
creative wins.

1139
01:03:44,000 --> 01:03:47,160
You know, like, again, when the
AI says, I, I'm going to make a

1140
01:03:47,160 --> 01:03:50,280
protein like this and all the
human experts on the planet say

1141
01:03:50,280 --> 01:03:51,840
you can't make a protein like
that.

1142
01:03:51,840 --> 01:03:54,800
And then of course it does.
Like, that's it.

1143
01:03:54,800 --> 01:03:58,560
So, you know, humans can only
push against the world in so

1144
01:03:58,560 --> 01:04:01,400
many ways.
And wouldn't it be great if we

1145
01:04:01,400 --> 01:04:04,640
had allies, you know, here in
the real world that could push

1146
01:04:04,640 --> 01:04:07,680
on the world in ways that we
just can't?

1147
01:04:08,120 --> 01:04:11,080
And they're going to learn cause
and effect relationships that

1148
01:04:11,080 --> 01:04:13,600
would have been impossible or
would have taken us a very, very

1149
01:04:13,600 --> 01:04:17,160
long time to discover.
That's the future that I'm

1150
01:04:17,160 --> 01:04:20,920
excited about.
And I think it's, I think it's

1151
01:04:20,920 --> 01:04:23,640
incredible work.
You guys are really changing the

1152
01:04:23,640 --> 01:04:26,360
game.
And there's so many

1153
01:04:26,360 --> 01:04:29,000
different aspects to the work
that we haven't really even

1154
01:04:29,000 --> 01:04:31,200
touched on because there's so
much greater detail that we

1155
01:04:31,200 --> 01:04:33,920
could go into.
But is there anything particular

1156
01:04:33,920 --> 01:04:35,520
that you'd like to touch on and
that you feel like you haven't

1157
01:04:35,520 --> 01:04:38,440
mentioned about your work in
general, Josh? Because this

1158
01:04:38,440 --> 01:04:40,640
is obviously just the first
podcast we've had together, but

1159
01:04:41,000 --> 01:04:43,520
there's many papers we could
dissect in so much detail.

1160
01:04:43,520 --> 01:04:46,400
But before we ever do, is there
anything particular you'd like

1161
01:04:46,400 --> 01:04:49,760
to mention?
Well, one thing, I think

1162
01:04:49,760 --> 01:04:52,760
we mentioned in passing, but
didn't have a chance to address

1163
01:04:52,760 --> 01:04:58,240
is how the work on xenobots
and even AI-designed biology is

1164
01:04:58,240 --> 01:05:00,640
changing like our understanding
of life itself.

1165
01:05:02,000 --> 01:05:06,520
You know, most of us were taught
that, you know, frog DNA codes

1166
01:05:06,520 --> 01:05:09,400
for frogs and human DNA codes
for humans.

1167
01:05:09,400 --> 01:05:12,960
And what I've learned in working
with Mike, and also just looking

1168
01:05:12,960 --> 01:05:16,600
at the intersection between AI
and biology in general, is that

1169
01:05:17,600 --> 01:05:20,680
the species and organisms that
exist on the planet, they're

1170
01:05:20,680 --> 01:05:25,320
points in an attractor space.
So this sort of means that when

1171
01:05:25,320 --> 01:05:30,520
you're in familiar environments,
your genes tend to build humans

1172
01:05:30,520 --> 01:05:33,840
or frogs or what have you, but
that you can change the

1173
01:05:33,840 --> 01:05:39,240
environment in quite drastic
ways, including the embryo

1174
01:05:39,240 --> 01:05:43,400
itself, or even take it apart
into its component cells and put

1175
01:05:43,400 --> 01:05:46,960
them together in new ways.
And without changing any of the

1176
01:05:46,960 --> 01:05:50,480
genetics, you get new form and
function.

1177
01:05:50,960 --> 01:05:54,200
So there's an old idea in
biology of what's called

1178
01:05:54,200 --> 01:05:59,120
morphospace, and it's this imaginary
high dimensional space of every

1179
01:05:59,120 --> 01:06:03,040
possible living thing that could
survive on this planet.

1180
01:06:03,560 --> 01:06:07,280
And despite how creative Mother
Nature's been over the last 3.5

1181
01:06:07,280 --> 01:06:10,000
billion years, she's only
explored a vanishingly small

1182
01:06:10,000 --> 01:06:13,800
part of morphospace.
So, you know, the obvious

1183
01:06:13,800 --> 01:06:15,360
question is what else is out
there?

1184
01:06:15,360 --> 01:06:18,600
You know, everyone you know is
curious about aliens, and maybe

1185
01:06:18,600 --> 01:06:20,920
we'll find some on another
planet someday.

1186
01:06:20,920 --> 01:06:26,000
But we can kind of find them
like now here by asking an AI to

1187
01:06:26,000 --> 01:06:29,440
say, here's frog, here's human,
here's axolotl.

1188
01:06:29,560 --> 01:06:31,840
Now go in the opposite
direction, put them at your

1189
01:06:31,840 --> 01:06:36,560
back, and drive into morphospace
as far from familiar

1190
01:06:36,560 --> 01:06:39,640
organisms as you can to find
those that can exist here.

1191
01:06:40,000 --> 01:06:41,960
And then build them for us,
please.
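
A toy sketch of "driving into morphospace" (my construction; the real pipeline searches a physics simulator, and these feature axes and values are invented): evolve a candidate design to sit as far as possible from the familiar organisms while still passing a stand-in viability check.

```python
# Toy novelty-driven search away from familiar organisms (hypothetical morphospace features).
import random

random.seed(0)
DIM = 4  # invented feature axes, e.g. size, symmetry, limb count, ciliation
FAMILIAR = {
    "frog":    [0.2, 0.9, 0.8, 0.1],
    "human":   [0.9, 0.9, 0.8, 0.0],
    "axolotl": [0.3, 0.8, 0.8, 0.2],
}

def viable(design):
    """Stand-in for the simulator/wet-lab check: stay inside the unit box."""
    return all(0.0 <= v <= 1.0 for v in design)

def novelty(design):
    """Distance to the nearest familiar organism; bigger means stranger."""
    return min(sum((a - b) ** 2 for a, b in zip(design, f)) ** 0.5
               for f in FAMILIAR.values())

def mutate(design, sigma=0.1):
    return [v + random.gauss(0.0, sigma) for v in design]

# Simple (1+1) evolutionary loop: keep the mutant if it is viable and more novel.
best = [0.5] * DIM
for _ in range(2000):
    child = mutate(best)
    if viable(child) and novelty(child) > novelty(best):
        best = child

print("most unfamiliar viable design found:", [round(v, 2) for v in best])
print("distance from nearest familiar organism:", round(novelty(best), 2))
```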

1192
01:06:42,520 --> 01:06:45,920
The prospect of that, you
know, is going to revolutionize

1193
01:06:46,440 --> 01:06:49,600
biology, you know, beyond
recognition, I think.

1194
01:06:50,160 --> 01:06:53,480
I, I literally had one of those
questions down to further

1195
01:06:53,480 --> 01:06:55,200
elaborate on that.
Remember when I told you about

1196
01:06:55,200 --> 01:06:57,880
the X, the extra questions I had
planned for us?

1197
01:06:58,360 --> 01:07:01,480
Yes. When you... it brings me
back to one of the other

1198
01:07:01,480 --> 01:07:03,880
questions I wrote.
You've used evolutionary

1199
01:07:03,880 --> 01:07:06,800
algorithms to evolve solutions
to both software and hardware.

1200
01:07:07,440 --> 01:07:12,080
How do these algorithms reflect,
or perhaps extend the principles

1201
01:07:12,080 --> 01:07:15,600
of natural selection?
Oh, yeah.

1202
01:07:17,120 --> 01:07:21,640
Gosh, I don't know.
I mean, I'm not a biologist, so

1203
01:07:21,640 --> 01:07:25,760
I'm sort of speaking beyond my
professional expertise here.

1204
01:07:26,120 --> 01:07:29,240
I don't know that it really is
beyond natural evolution.

1205
01:07:29,240 --> 01:07:33,960
I mean, we're just, you know, AI
is a product of us, and we're a

1206
01:07:33,960 --> 01:07:37,240
product of natural evolution.
And the AI is modifying or

1207
01:07:37,320 --> 01:07:40,440
putting pressures on these
organisms in ways they haven't

1208
01:07:40,440 --> 01:07:43,680
experienced before.
So, you know, from a bit of a

1209
01:07:43,680 --> 01:07:47,760
distance and if you squint, it
is just, you know, natural

1210
01:07:47,760 --> 01:07:50,120
selection.
It's just entered a new chapter,

1211
01:07:50,120 --> 01:07:53,120
like when, you know, the fish
crawled out of the seas or we

1212
01:07:53,120 --> 01:07:56,800
came down from the trees.
It just feels like a different

1213
01:07:56,800 --> 01:08:01,920
phase in a similar process.
So I, yeah, I don't know if

1214
01:08:01,920 --> 01:08:04,680
we've broken out of the bounds
of natural selection yet.

1215
01:08:04,680 --> 01:08:08,120
It is unnatural, of course,
because there's things like AI

1216
01:08:08,120 --> 01:08:13,600
and robots and supercomputers
involved, but ultimately it's

1217
01:08:13,600 --> 01:08:19,040
just challenging genetically
unmodified materials to survive

1218
01:08:19,040 --> 01:08:23,200
and continue, and they're
finding a way to do so.
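
For reference, here is a minimal, generic evolutionary algorithm (a textbook-style sketch, not the lab's code) showing the three ingredients it shares with natural selection: heredity (copying), variation (mutation), and differential survival (selection).

```python
# Generic evolutionary algorithm sketch: heredity, variation, selection (illustrative only).
import random

random.seed(0)
GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 40, 100, 0.02

def fitness(genome):
    """Stand-in for 'survives in its environment': here, just count the 1s."""
    return sum(genome)

def mutate(genome):
    return [1 - g if random.random() < MUTATION_RATE else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[: POP_SIZE // 2]            # selection
    offspring = [mutate(list(p)) for p in survivors]   # heredity plus variation
    population = survivors + offspring

print("best fitness after", GENERATIONS, "generations:",
      fitness(max(population, key=fitness)))
```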

1219
01:08:23,640 --> 01:08:26,200
I agree with you.
I think I had this conversation

1220
01:08:26,200 --> 01:08:30,439
with someone recently where we
spoke about the fact that it

1221
01:08:30,439 --> 01:08:33,160
took us billions of years to get
to have this sort of

1222
01:08:33,319 --> 01:08:35,120
experience.
It doesn't mean that if we're

1223
01:08:35,120 --> 01:08:38,600
able to do it within 5 minutes
in the lab, that this is

1224
01:08:38,600 --> 01:08:41,840
some sort of a miracle; it's
just literally all that work

1225
01:08:41,840 --> 01:08:44,399
that's taken place, as you know,
with the cells that you take.

1226
01:08:44,680 --> 01:08:48,279
It's already got so much prior
information and it's just always

1227
01:08:48,279 --> 01:08:50,279
working on itself.
So it's not like we're going

1228
01:08:50,279 --> 01:08:52,439
beyond the bounds of nature
itself.

1229
01:08:53,200 --> 01:08:55,439
And I think, you know, the other
thing is we talked about the

1230
01:08:55,439 --> 01:08:58,640
Cartesian dualism.
A lot of this work, not just

1231
01:08:58,640 --> 01:09:01,760
ours but others', is also
exploding the concept of self.

1232
01:09:01,760 --> 01:09:05,359
We tend to view things
at the self level, but if you

1233
01:09:05,359 --> 01:09:08,319
think about the xenobots,
like the cells that make up

1234
01:09:08,319 --> 01:09:12,560
the xenobots, from the cells'
perspective, things might not

1235
01:09:12,560 --> 01:09:15,439
actually be that different, you
know, or problematic.

1236
01:09:16,319 --> 01:09:19,560
When a frog
develops, or any organism

1237
01:09:19,560 --> 01:09:22,920
develops, the cells that are
part of the developmental

1238
01:09:22,920 --> 01:09:26,920
process, they're being pulled
and pushed and sheared and, you

1239
01:09:26,920 --> 01:09:30,439
know, shoved through tubes like
crazy all the time.

1240
01:09:30,800 --> 01:09:34,880
So all of our manipulations,
from the perspective of the cell

1241
01:09:34,880 --> 01:09:36,960
might just be, you know,
business as usual.

1242
01:09:36,960 --> 01:09:40,560
It's find your neighbors and
connect to them and establish

1243
01:09:40,560 --> 01:09:46,160
calcium communication again.
And you know, so I agree, like

1244
01:09:46,160 --> 01:09:49,880
the xenobots in some way seem
miraculous if you think about it

1245
01:09:49,880 --> 01:09:52,840
from the frog's perspective.
But if you think about it from

1246
01:09:53,040 --> 01:09:57,600
the frog's cells' perspective,
maybe it's not so, you know,

1247
01:09:57,600 --> 01:10:00,880
surprising or traumatic.
And I think that's where Mike

1248
01:10:00,880 --> 01:10:03,240
comes in at some point, when
they see cognition going all the

1249
01:10:03,240 --> 01:10:07,360
way down: it's really, really hard to

1250
01:10:07,360 --> 01:10:10,960
actually differentiate this
type of process in terms of

1251
01:10:10,960 --> 01:10:12,960
cognition.
When you look at a xenobot and

1252
01:10:12,960 --> 01:10:16,920
you see it exhibiting some sort
of autonomous behavior,

1253
01:10:17,640 --> 01:10:20,480
have you ever found yourself
ascribing some sort of agency or

1254
01:10:20,480 --> 01:10:26,800
free will to it?
I can feel the temptation to

1255
01:10:26,800 --> 01:10:29,280
want to do so.
Let's put it that way. Again,

1256
01:10:29,280 --> 01:10:31,840
I'm an illusionist.
I know I'm being fooled.

1257
01:10:31,840 --> 01:10:34,680
I can feel it.
And yes, sometimes it takes more

1258
01:10:34,680 --> 01:10:36,760
effort than other times to
resist.

1259
01:10:37,760 --> 01:10:40,000
Have you always been?
At what point did you actually

1260
01:10:40,000 --> 01:10:42,960
say firmly, I'm an illusionist?
Was this something you one day

1261
01:10:42,960 --> 01:10:45,360
read, or was it...?
I think I was.

1262
01:10:45,360 --> 01:10:49,840
And then I met Keith and Dan at
a workshop a while back and

1263
01:10:49,840 --> 01:10:52,120
learned this actual term.
I said, that's it.

1264
01:10:52,120 --> 01:10:54,680
That's what I am.
Yeah, I think.

1265
01:10:54,720 --> 01:10:57,000
I think I felt it before I knew
what I was.

1266
01:10:57,560 --> 01:10:59,320
That's exactly the way I felt
about it.

1267
01:10:59,320 --> 01:11:02,680
And then one day, watching
Dan, and then Keith eventually as

1268
01:11:02,680 --> 01:11:04,800
well because he helped me with
my dissertation, which is quite

1269
01:11:04,800 --> 01:11:07,200
cool.
OK yeah, Nicholas Humphrey also

1270
01:11:07,200 --> 01:11:10,760
gave some input, and Mark Solms.
Great, which is pretty cool.

1271
01:11:11,160 --> 01:11:14,400
And I actually have a chat
coming up with Mark and Karl Friston,

1272
01:11:14,400 --> 01:11:16,480
and they're gonna have a nice
chat together about some of the

1273
01:11:16,480 --> 01:11:18,840
work you can do as well.
But that's great.

1274
01:11:18,840 --> 01:11:23,480
Then after this podcast, it's
weird how my firm

1275
01:11:23,480 --> 01:11:26,960
illusionist beliefs have slowly
become slightly less

1276
01:11:27,240 --> 01:11:30,080
certain.
And I

1277
01:11:30,120 --> 01:11:33,600
don't see myself changing it
anyway, but I'm far less

1278
01:11:33,720 --> 01:11:35,800
firm about the belief than I
used to be.

1279
01:11:36,080 --> 01:11:39,560
But I wrote the dissertation
from a position of very, very firm illusionism.

1280
01:11:39,600 --> 01:11:42,480
And yeah, to the point that it's
just, to me, it just made the

1281
01:11:42,480 --> 01:11:45,960
most sense that we're just
playing mind games with

1282
01:11:45,960 --> 01:11:48,800
ourselves all the time.
It's a very difficult thing

1283
01:11:48,800 --> 01:11:50,400
to understand.
True.

1284
01:11:50,680 --> 01:11:51,520
Agreed.
Agreed.

1285
01:11:51,800 --> 01:11:55,440
It's a strange phenomenon.
In terms of your work, everything

1286
01:11:55,440 --> 01:11:58,800
you've done, how has this
overall impacted

1287
01:11:58,800 --> 01:12:01,920
your entire view of this field
of understanding the mind-body

1288
01:12:01,920 --> 01:12:07,880
connection?
I think what it's

1289
01:12:07,880 --> 01:12:12,240
convinced me of is that whatever
the answer is, I don't think we

1290
01:12:12,240 --> 01:12:14,600
have it yet.
It's going to be very

1291
01:12:14,600 --> 01:12:19,040
confusing, very non-intuitive.
This is what I've learned as a

1292
01:12:19,040 --> 01:12:24,320
student of AI and robots: things
that work, you know, for

1293
01:12:24,320 --> 01:12:28,520
Mother Nature are very
confusing, non-intuitive.

1294
01:12:29,080 --> 01:12:32,320
So I think there are some
answers to, you know, some of

1295
01:12:32,320 --> 01:12:36,040
our deeper questions: who we
are, what makes us special, if

1296
01:12:36,040 --> 01:12:39,600
anything, you know, how the body
shapes the way we think.

1297
01:12:39,600 --> 01:12:42,240
Well, you know, what are aliens
going to look like and how are

1298
01:12:42,240 --> 01:12:45,840
they going to behave?
Whatever the answer is, it's

1299
01:12:45,840 --> 01:12:49,640
not... You know what I've learned
from the xenobots and these meta

1300
01:12:49,640 --> 01:12:52,280
materials?
They act in these very, very

1301
01:12:52,280 --> 01:12:55,160
strange, surprising ways.
And I think we're just

1302
01:12:55,160 --> 01:12:58,880
scratching the tip of the
iceberg, you know, the fact that

1303
01:12:59,440 --> 01:13:03,480
huge matrix multiplications can
give rise to things that look

1304
01:13:03,480 --> 01:13:06,000
like, you know, brilliant human
conversation.

1305
01:13:06,280 --> 01:13:10,080
Like, things are
moving quickly and everything

1306
01:13:10,080 --> 01:13:14,000
that happens is surprising to
the public in general and also

1307
01:13:14,000 --> 01:13:16,920
to the experts.
And to me, that's changing my

1308
01:13:16,920 --> 01:13:20,200
worldview: whatever
we thought the answer was,

1309
01:13:20,200 --> 01:13:22,960
it's much, much stranger.
It kind of feels like the

1310
01:13:22,960 --> 01:13:27,280
quantum revolution for like
everything other than physics is

1311
01:13:27,280 --> 01:13:29,680
now happening.
Physics already had its, you

1312
01:13:29,680 --> 01:13:33,200
know, brush with the
ineffable and unexplainable.

1313
01:13:33,640 --> 01:13:36,120
Now it's our turn.
And, I mean, quantum

1314
01:13:36,120 --> 01:13:38,520
computers, quantum biology,
these are on the rise.

1315
01:13:38,520 --> 01:13:40,080
This is a thing that's
happening.

1316
01:13:40,080 --> 01:13:42,600
So people are taking this very
seriously, but these meta

1317
01:13:42,600 --> 01:13:44,440
materials are really intriguing
to me.

1318
01:13:44,440 --> 01:13:46,560
And where do you think this is
going to lead?

1319
01:13:47,440 --> 01:13:50,640
Oh, I think, you know, meta
materials are the future.

1320
01:13:50,640 --> 01:13:55,840
Like, you know, all the
materials that we build with,

1321
01:13:55,840 --> 01:13:59,120
they have problems, you know;
you can only build a

1322
01:13:59,120 --> 01:14:03,120
building so high because of the
weight ratio, you know. So meta

1323
01:14:03,120 --> 01:14:06,440
materials are amazing.
Like, I see a future in which,

1324
01:14:06,720 --> 01:14:10,520
you know, we can cut a robot in
half and the two halves will

1325
01:14:10,520 --> 01:14:13,760
form 2 smaller versions of
exactly the same robot.

1326
01:14:13,840 --> 01:14:21,400
You know, living materials like
cells, when they attach to rigid

1327
01:14:21,400 --> 01:14:23,560
materials, those cells become
more rigid.

1328
01:14:23,560 --> 01:14:26,400
When they attach to soft
materials, they become softer.

1329
01:14:26,400 --> 01:14:28,760
Cells are like
chameleons.

1330
01:14:29,280 --> 01:14:32,680
And so I think we're going to
make meta materials that are the

1331
01:14:32,680 --> 01:14:36,720
same way: they get the
hint when you try to use them

1332
01:14:36,720 --> 01:14:39,480
as something, and they become
that thing.

1333
01:14:39,480 --> 01:14:43,520
They construct what they need,
or they change. You know, that's

1334
01:14:43,520 --> 01:14:47,280
what 21st century
technologies are gonna look like

1335
01:14:47,280 --> 01:14:49,560
as we push further into the 21st
century.

1336
01:14:50,120 --> 01:14:52,440
I think this is
probably one of the most

1337
01:14:52,440 --> 01:14:57,840
exciting, or sci-fi-ish, things
I've heard in quite some time. I

1338
01:14:57,880 --> 01:15:00,280
think it's...
Again, we were talking about

1339
01:15:00,280 --> 01:15:02,040
surprise.
The surprising thing is a lot of

1340
01:15:02,040 --> 01:15:04,800
those materials already exist.
You know, they're just kind of

1341
01:15:04,800 --> 01:15:07,720
in labs at the moment.
They're not in general use, but

1342
01:15:08,160 --> 01:15:11,720
the aerospace companies, you
know, they have R&D labs around

1343
01:15:11,720 --> 01:15:14,840
these things.
And so these

1344
01:15:14,840 --> 01:15:18,000
materials, ironically, you know,

1345
01:15:18,000 --> 01:15:21,520
are being designed by AI.
And then once they're built,

1346
01:15:21,680 --> 01:15:24,840
the AI can
figure out how to exploit them.

1347
01:15:24,840 --> 01:15:28,040
You know, how can they best
be incorporated into buildings

1348
01:15:28,040 --> 01:15:31,640
and roads and cars and plumbing
and you name it.

1349
01:15:31,640 --> 01:15:33,760
It's amazing.
It hasn't quite reached the

1350
01:15:33,760 --> 01:15:36,720
public eye yet, you know, but
it's coming.

1351
01:15:37,760 --> 01:15:40,600
It's kind of crazy because, I
mean, we're

1352
01:15:40,600 --> 01:15:43,480
making biological robots, and
at the same time you're

1353
01:15:43,560 --> 01:15:48,160
making materials

1354
01:15:48,160 --> 01:15:50,240
more biological.
It's kind of like you're working

1355
01:15:50,240 --> 01:15:52,440
the flip side on both sides.

1356
01:15:52,440 --> 01:15:54,920
It's just pretty crazy and cool.

1357
01:15:55,560 --> 01:15:58,800
Yeah, not just my lab.
Some of the exciting things that

1358
01:15:58,800 --> 01:16:02,760
are going on in these
interdisciplinary labs are

1359
01:16:02,760 --> 01:16:06,320
exactly that, like blurring the
line between living and non

1360
01:16:06,320 --> 01:16:09,400
living materials, right.
So again, there's another

1361
01:16:09,400 --> 01:16:13,440
distinction that we just assume
is so obvious and so clear cut.

1362
01:16:13,440 --> 01:16:18,000
And from the AI's perspective, it
says, why? I don't understand.

1363
01:16:18,000 --> 01:16:21,800
I'll, you know,
embed some proteins in a plastic

1364
01:16:22,040 --> 01:16:25,000
to make, you know, a
protein-infused plastic.

1365
01:16:25,080 --> 01:16:27,680
So what is that now?
Is that a living thing, a non-

1366
01:16:27,680 --> 01:16:29,760
living thing?
I don't know.

1367
01:16:30,000 --> 01:16:32,800
Yeah, just like make the metal
some sort of an ion

1368
01:16:32,800 --> 01:16:34,760
channel to assist with
shifting cells.

1369
01:16:34,760 --> 01:16:36,600
I mean, you can.
It's exactly that.

1370
01:16:36,600 --> 01:16:39,640
When you think about the
potential this has and how much

1371
01:16:39,640 --> 01:16:43,280
this can affect medicine and
every other field, it's so

1372
01:16:43,280 --> 01:16:45,400
widespread that it would
literally change humanity.

1373
01:16:45,400 --> 01:16:48,400
Like, it's pretty crazy.
I agree.

1374
01:16:48,720 --> 01:16:50,840
I thought it's Max.
You don't need, I

1375
01:16:50,840 --> 01:16:53,240
don't think we need the
singularity.

1376
01:16:53,240 --> 01:16:56,400
You know, there'll be sufficiently
crazy stuff that will happen

1377
01:16:56,400 --> 01:16:58,800
without us all uploading to the
cloud or.

1378
01:16:59,040 --> 01:17:01,360
Yeah, no, this is
probably the most exciting thing

1379
01:17:01,360 --> 01:17:03,200
I've heard.
And trust me, when you

1380
01:17:03,200 --> 01:17:05,720
scroll online and you do scroll,
you see a lot of crazy things,

1381
01:17:05,720 --> 01:17:07,520
but this is today, so this is
pretty cool.

1382
01:17:07,520 --> 01:17:10,160
I mean, I can't wait to
see what you guys come up

1383
01:17:10,160 --> 01:17:12,120
with, and I'm really looking
forward to it.

1384
01:17:12,120 --> 01:17:13,520
Yeah, it's
incredible.

1385
01:17:14,040 --> 01:17:14,360
Yeah.
No.

1386
01:17:14,360 --> 01:17:16,960
So, Josh, thank you so much for
this wonderful conversation.

1387
01:17:17,000 --> 01:17:18,240
Any final words from your
side?

1388
01:17:18,240 --> 01:17:21,640
Anything you'd like to add?
Well, just as we're

1389
01:17:21,640 --> 01:17:24,040
wrapping up, you're asking me
about some other things that we

1390
01:17:24,040 --> 01:17:26,640
might not have talked about.
Just one last thing I wanted to

1391
01:17:26,640 --> 01:17:30,680
mention about connecting AI and
embodied cognition is where we

1392
01:17:30,680 --> 01:17:33,920
are at the moment.
So we now have these immensely,

1393
01:17:33,920 --> 01:17:39,360
you know, powerful AIs that say
compelling things, but

1394
01:17:39,360 --> 01:17:43,280
instinctually we're not sure
whether to trust what they say.

1395
01:17:44,280 --> 01:17:47,160
So we're doing some work here in
the lab and there are others

1396
01:17:47,160 --> 01:17:51,360
that are doing so as well, which
is, you know, combining non-

1397
01:17:51,360 --> 01:17:54,440
embodied technologies like
ChatGPT with embodied

1398
01:17:54,440 --> 01:17:57,840
technologies like robots.
And so what we see on the

1399
01:17:57,840 --> 01:18:01,040
horizon is this: a human has a
problem.

1400
01:18:01,160 --> 01:18:05,560
They come to an AI, they ask for
a solution and the AI proposes

1401
01:18:05,560 --> 01:18:09,800
something that sounds great.
And the human says, I don't know

1402
01:18:09,800 --> 01:18:14,600
that I trust you. Prove it.
And the AI says, connect me to a

1403
01:18:14,600 --> 01:18:19,080
3D printer. It prints a robot that
goes out and, you know,

1404
01:18:19,080 --> 01:18:24,240
physically verifies, does some
experiments, you know, to show

1405
01:18:24,320 --> 01:18:27,560
some version of this solution or
some aspect of it.

1406
01:18:28,120 --> 01:18:34,440
So can we teach AI to design
experiments that will

1407
01:18:34,440 --> 01:18:37,000
prove whatever it is that it's
proposing?
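A minimal sketch of the propose-then-verify loop just described, purely as an illustration: every name in it (propose_solution, design_experiment, run_embodied_experiment) is a hypothetical placeholder standing in for a language model, an experiment planner, and a fabricated robot, not any real API or anything from Josh's lab.

from dataclasses import dataclass
import random

@dataclass
class Result:
    supported: bool  # did the physical experiment back up the proposal?

def propose_solution(problem: str) -> str:
    # Stand-in for a non-embodied AI (e.g. a language model) proposing a fix.
    return f"candidate solution for: {problem}"

def design_experiment(solution: str) -> str:
    # Stand-in for the AI designing a physical test that could falsify the idea.
    return f"experiment testing '{solution}'"

def run_embodied_experiment(experiment: str) -> Result:
    # Stand-in for printing a robot and running the test in the real world;
    # here the outcome is simply simulated.
    return Result(supported=random.random() > 0.5)

def propose_and_verify(problem: str, max_attempts: int = 3):
    for _ in range(max_attempts):
        solution = propose_solution(problem)
        experiment = design_experiment(solution)
        result = run_embodied_experiment(experiment)
        if result.supported:
            return solution  # accepted only with physical evidence behind it
    return None  # nothing survived verification

print(propose_and_verify("design a lighter bridge truss"))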

1408
01:18:37,000 --> 01:18:39,960
So I think that's something else
we haven't talked about that's

1409
01:18:39,960 --> 01:18:43,560
often known as AI science
or AI-driven science or machine

1410
01:18:43,560 --> 01:18:48,400
science, which I also think has
huge potential for dealing with

1411
01:18:48,960 --> 01:18:51,640
what seems, at the moment, like
an intractable problem:

1412
01:18:51,640 --> 01:18:54,760
there's no way to guarantee what
ChatGPT says.

1413
01:18:54,880 --> 01:18:59,760
And I agree, it doesn't matter what
it or anyone says; the proof is

1414
01:18:59,760 --> 01:19:03,120
in the physical world.
And I think we're going to start

1415
01:19:03,120 --> 01:19:06,080
to see some of that emerging in
the not too distant future.

1416
01:19:06,440 --> 01:19:09,960
And so far, Josh, have you seen
anything remotely close to that

1417
01:19:09,960 --> 01:19:16,520
or how far off are we?
We're close for toy

1418
01:19:16,520 --> 01:19:19,200
problems.
I guess, I would say,

1419
01:19:19,200 --> 01:19:22,280
in terms of
practical utility, we're still a

1420
01:19:22,280 --> 01:19:24,440
ways off.
I don't even know that we're

1421
01:19:24,440 --> 01:19:28,520
ready morally or ethically to
allow an AI to build things to

1422
01:19:28,520 --> 01:19:32,000
prove its ideas about the world.
That's another one for

1423
01:19:32,000 --> 01:19:33,520
Hollywood to play with.
But that...

1424
01:19:34,040 --> 01:19:36,160
That actually is quite a good
script for a film.

1425
01:19:36,320 --> 01:19:40,200
Yeah, exactly.
No lack of raw

1426
01:19:40,200 --> 01:19:43,880
material here.
But anyway, I

1427
01:19:43,880 --> 01:19:46,840
guess I wanted to close just by
saying I don't think, you know,

1428
01:19:46,840 --> 01:19:50,240
it's non-embodiment versus
embodiment. Some of the

1429
01:19:50,240 --> 01:19:54,480
really valuable technologies and
solutions in the years to come

1430
01:19:54,480 --> 01:19:58,440
are going to be, you know,
clever ways of combining them so

1431
01:19:58,440 --> 01:20:01,440
you get the best of both worlds.
And in general, I mean life and

1432
01:20:01,440 --> 01:20:02,920
not life.
That dichotomy, that false

1433
01:20:02,920 --> 01:20:05,800
dichotomy, it seems that at some
point we're all going to be

1434
01:20:06,160 --> 01:20:10,240
chemically bonded in some way.
Yeah, Yeah.

1435
01:20:10,240 --> 01:20:12,280
I think that's a beautiful way to

1436
01:20:12,280 --> 01:20:14,120
probably end it there.
Josh, thank you so much.

1437
01:20:14,120 --> 01:20:15,600
Thanks for your time.
Wonderful work.

1438
01:20:16,160 --> 01:20:18,160
No, thank you, Tevin.
It was a pleasure to be on your

1439
01:20:18,160 --> 01:20:20,120
show.
Yeah, no, I can't wait to

1440
01:20:20,240 --> 01:20:23,120
hopefully have a round two and
further dissect maybe specific

1441
01:20:23,120 --> 01:20:25,920
papers, and we can do some
deep dives in the future.

1442
01:20:25,920 --> 01:20:28,520
It's a privilege to chat to
you and I really appreciate it.

1443
01:20:28,720 --> 01:20:30,560
Yeah, likewise.
And again, thanks for

1444
01:20:30,560 --> 01:20:32,800
your interest and you clearly
put a lot of time and effort

1445
01:20:32,800 --> 01:20:35,720
into preparing this.
This was a real pleasure.