Jan. 23, 2026

Neuroscience Beyond Neurons in the Diverse Intelligence Era | Michael Levin & Robert Chis-Ciure


What if neurons aren’t the foundation of mind? In this Mind-Body Solution Colloquia, Michael Levin and Robert Chis-Ciure challenge one of neuroscience’s deepest assumptions: that cognition and intelligence are exclusive to brains and neurons.

Drawing on cutting-edge work in bioelectricity, developmental biology, and philosophy of mind, this conversation explores how cells, tissues, and living systems exhibit goal-directed behavior, memory, and problem-solving — long before neurons ever appear.

We explore:
• Cognition without neurons
• Bioelectric networks as control systems
• Memory and learning beyond synapses
• Morphogenesis as collective intelligence
• Implications for AI, consciousness, and ethics

This episode pushes neuroscience beyond the neuron, toward a deeper understanding of mind, life, and intelligence as continuous across scales.

TIMESTAMPS:
0:00 – Introduction: Why Neuroscience Must Go Beyond Neurons
3:12 – The Central Claim: Cognition Is Not Exclusive to Brains
7:05 – Defining Cognition, Intelligence, and Agency Without Neurons
11:02 – Bioelectricity as a Control Layer for Morphogenesis
15:08 – Cells as Problem-Solvers: Goals, Memory, and Error Correction
19:41 – The Body as a Cognitive System: Scaling Intelligence Across Levels
24:10 – Developmental Plasticity and Non-Neural Decision-Making
28:36 – Morphological Computation and Collective Cellular Intelligence
33:02 – Challenging Neuron-Centric Neuroscience Assumptions
37:18 – Bioelectric Networks vs Neural Networks: Key Differences
41:55 – Memory Without Synapses: Storing Information in Living Tissue
46:07 – Rewriting Anatomy: Regeneration, Repatterning, and Control
50:29 – Cancer, Developmental Errors, and Cognitive Breakdown
54:48 – Pluribus: Philosophical Implications
59:14 – From Cells to Selves: Where Does Agency Begin?
1:03:22 – Implications for AI: Intelligence Without Brains or Neurons
1:08:11 – Rethinking Consciousness: Gradualism vs Binary Models
1:12:47 – Ethics of Expanding the Moral Circle Beyond Humans
1:17:31 – Future Science: New Tools for a Post-Neuron Neuroscience
1:22:54 – Closing Reflections: Life, Mind, and Intelligence All the Way Down

EPISODE LINKS:
- Cognition All the Way Down 2.0: Neuroscience Beyond Neurons in the Diverse Intelligence Era: https://link.springer.com/article/10.1007/s11229-025-05319-6
- Robert's Publications: https://scholar.google.com/citations?user=7V9C7skAAAAJ&hl=en
- Mike's Podcast 1: https://www.youtube.com/watch?v=v6gp-ORTBlU
- Mike's Podcast 2: https://www.youtube.com/watch?v=kMxTS7eKkNM
- Mike's Podcast 3: https://www.youtube.com/watch?v=1R-tdscgxu4
- Mike's Podcast 4 (with Terrence Deacon): https://youtu.be/HuWbHwPZd60?si=z2unvX37OjXMjjIv
- Mike's Lecture: https://www.youtube.com/watch?v=aQEX-twenkA
- Mike's Channel: https://www.youtube.com/@drmichaellevin
- Mike's Website: https://drmichaellevin.org/
- Mike's Blog: https://thoughtforms.life

CONNECT:
- Website: https://mindbodysolution.org
- YouTube: https://youtube.com/@mindbodysolution
- Podcast: https://creators.spotify.com/pod/show/mindbodysolution
- Twitter: https://twitter.com/drtevinnaidu
- Facebook: https://facebook.com/drtevinnaidu
- Instagram: https://instagram.com/drtevinnaidu
- LinkedIn: https://linkedin.com/in/drtevinnaidu
- Website: https://tevinnaidu.com

=============================
Disclaimer: The information provided on this channel is for educational purposes only. The content is shared in the spirit of open discourse and does not constitute, nor does it substitute, professional or medical advice. We do not accept any liability for any loss or damage incurred from you acting or not acting as a result of listening/watching any of our contents. You acknowledge that you use the information provided at your own risk. Listeners/viewers are advised to conduct their own research and consult with their own experts in the respective fields.

1
00:00:08,080 --> 00:00:10,240
Mike, Robert, thank you both so
much for joining me.

2
00:00:10,240 --> 00:00:12,680
This is exciting.
I've been waiting a long time

3
00:00:12,680 --> 00:00:14,240
for this conversation.
Welcome back.

4
00:00:14,520 --> 00:00:15,880
Well, hi.
Welcome back.

5
00:00:16,560 --> 00:00:19,200
Good to see you.
Mike, you've been on this

6
00:00:19,200 --> 00:00:21,560
channel so many times.
I think you've actually been on

7
00:00:21,560 --> 00:00:24,080
this channel more than anyone.
So I think it would be a great

8
00:00:24,080 --> 00:00:26,200
way to start this by asking
Robert.

9
00:00:26,880 --> 00:00:30,480
Robert, your work, for those who
are not familiar, perhaps could

10
00:00:30,480 --> 00:00:34,000
you expand what your work covers
and how your work has brought

11
00:00:34,000 --> 00:00:35,800
you to Mike?
And then we'll go into this

12
00:00:35,800 --> 00:00:37,360
wonderful paper you guys wrote
together.

13
00:00:39,040 --> 00:00:42,280
Right, sure.
So I'm a consciousness

14
00:00:42,280 --> 00:00:44,560
researcher primarily, even
though I was trained as a

15
00:00:44,560 --> 00:00:47,200
philosopher, I mostly act as a
neuroscientist now.

16
00:00:47,200 --> 00:00:50,600
So I'm sort of a neuro
philosopher if you want a label,

17
00:00:51,000 --> 00:00:55,000
and my work is at Sussex
University, in Anil Seth's lab,

18
00:00:55,000 --> 00:00:58,520
where we focus mostly on
consciousness, which is, one

19
00:00:58,520 --> 00:01:02,160
might say, the big absentee
of this paper.

20
00:01:02,360 --> 00:01:04,959
But that was deliberate and we
will get into that towards

21
00:01:05,560 --> 00:01:07,000
towards the end of our
discussion.

22
00:01:08,920 --> 00:01:14,400
My usual work involves trying to
think in conceptual but also

23
00:01:14,400 --> 00:01:18,040
mathematical and empirical terms
about various structures of

24
00:01:18,040 --> 00:01:21,560
experience or features of
experience, let's call them

25
00:01:21,560 --> 00:01:26,280
phenomenological invariants.
And this is my primary line

26
00:01:26,280 --> 00:01:30,360
of investigation.
However, in the past year and in

27
00:01:30,360 --> 00:01:34,720
the past two years actually upon
reading Mike's work and

28
00:01:34,720 --> 00:01:38,880
especially the multi competency
architecture that we explore in

29
00:01:38,880 --> 00:01:42,520
this paper, I started thinking
more and more about the

30
00:01:42,520 --> 00:01:44,520
connection between consciousness
and intelligence.

31
00:01:45,040 --> 00:01:49,120
And we will probably get to that
in more detail later.

32
00:01:49,560 --> 00:01:52,200
But the main problem with
consciousness is the fact that

33
00:01:52,200 --> 00:01:56,880
it is less operationalizable
than intelligence.

34
00:01:56,960 --> 00:01:59,520
And intelligence, not only is it
more

35
00:01:59,520 --> 00:02:03,680
operationalizable in the
sense that you can create an

36
00:02:04,080 --> 00:02:07,240
easily measurable or somewhat
easily measurable protocol for

37
00:02:07,240 --> 00:02:10,639
determining it as an observer,
because that's the big catch.

38
00:02:10,720 --> 00:02:13,200
We are always in the position of
being observers.

39
00:02:13,960 --> 00:02:18,360
And most of the intelligence we
detect is also a reflection of

40
00:02:18,400 --> 00:02:21,320
our intelligence as well.
So in a sense, detecting

41
00:02:21,320 --> 00:02:24,800
intelligence is an IQ test for
the detector, as Mike likes to

42
00:02:24,800 --> 00:02:29,960
say in his papers.
And as I got deeper into my

43
00:02:30,040 --> 00:02:35,400
work, it happened that Mike got in
touch proactively to discuss

44
00:02:35,400 --> 00:02:40,000
some some consciousness related
things and we started this

45
00:02:40,000 --> 00:02:44,880
collaboration.
And the seed of this idea came

46
00:02:44,880 --> 00:02:48,160
from primarily this line of
thinking.

47
00:02:48,200 --> 00:02:52,640
How can we leverage intelligence
as a more scientifically

48
00:02:52,640 --> 00:02:56,400
tractable concept to gain an
insight into consciousness,

49
00:02:56,600 --> 00:03:00,240
which is perhaps less tractable,
at least with our current

50
00:03:00,440 --> 00:03:03,520
investigation?
And everything ballooned and we

51
00:03:03,520 --> 00:03:08,680
we got to this, to this paper.
And Mike, your thoughts on

52
00:03:08,880 --> 00:03:12,040
Robert's journey, his work,
what's fascinated you most about

53
00:03:12,040 --> 00:03:14,920
it and how and why was it the
perfect fit to bring you guys

54
00:03:14,920 --> 00:03:18,760
together?
Yeah, Well, actually, Luke

55
00:03:18,880 --> 00:03:22,840
Roelofs gave me Robert's name,
and he showed me

56
00:03:22,840 --> 00:03:25,160
some stuff and he said that you
might want to talk to this guy.

57
00:03:25,160 --> 00:03:27,520
And I was like, I absolutely
want to talk to this guy.

58
00:03:27,920 --> 00:03:30,440
And so.
Yeah.

59
00:03:30,440 --> 00:03:32,160
And so I reached out and I, you

60
00:03:32,160 --> 00:03:34,800
know, I sent you some stuff
because I thought, yeah, I mean,

61
00:03:35,680 --> 00:03:39,800
I thought this is exactly kind
of the sweet spot for what we

62
00:03:39,800 --> 00:03:42,880
want to be working on.
You know, intelligence obviously

63
00:03:42,880 --> 00:03:45,080
in our lab is, is something that
we try to address

64
00:03:45,080 --> 00:03:48,360
experimentally.
But in the end, you, I, I don't

65
00:03:48,400 --> 00:03:51,280
think you can complete this,
this journey without really

66
00:03:51,280 --> 00:03:54,120
taking on consciousness.
And in the end, yet, as you

67
00:03:54,120 --> 00:03:56,840
said, it has
all kinds of difficulties.

68
00:03:56,840 --> 00:04:00,040
And so the connection between
those two seems like is

69
00:04:00,040 --> 00:04:01,800
something that eventually we're
going to have to tackle.

70
00:04:01,800 --> 00:04:06,400
So yeah, here we are.
Well, for those who are

71
00:04:06,400 --> 00:04:09,760
watching this or listening to
this, the entire focus is

72
00:04:09,760 --> 00:04:12,400
going to be your paper, Cognition
All the Way Down 2.0:

73
00:04:12,520 --> 00:04:16,440
Neuroscience Beyond Neurons in
the Diverse Intelligence Era.

74
00:04:16,560 --> 00:04:20,200
What a wonderful paper.
Firstly, guys, and this

75
00:04:20,200 --> 00:04:23,560
question is for both of you,
what motivated you to move

76
00:04:23,560 --> 00:04:27,280
beyond conceptual debates in the
cognition wars and offer a

77
00:04:27,280 --> 00:04:31,680
formal quantitative framework
for intelligence across scales

78
00:04:31,680 --> 00:04:35,000
from molecules to organisms.
What really drove this?

79
00:04:37,320 --> 00:04:41,120
Mike, wanna ground us with the
entirety of your MCA and then

80
00:04:41,320 --> 00:04:43,160
maybe I can pivot at first,
sure.

81
00:04:43,720 --> 00:04:48,960
Sure.
Yeah, I mean, it's

82
00:04:49,040 --> 00:04:54,960
important to establish
tractable frameworks that are

83
00:04:54,960 --> 00:04:57,280
going to make contact with
experiment, with the real world.

84
00:04:57,560 --> 00:05:01,440
Not only because it
sort of keeps you honest in a

85
00:05:01,440 --> 00:05:03,280
philosophical sense.
It gives you an

86
00:05:03,280 --> 00:05:05,360
idea of whether you're on the
right track or not.

87
00:05:06,120 --> 00:05:09,320
And I don't just mean sort of
predictions and things like

88
00:05:09,320 --> 00:05:12,360
that.
I mean generative fertility of,

89
00:05:12,360 --> 00:05:14,600
of the ideas, you know, what
does it help you come up with?

90
00:05:14,600 --> 00:05:16,520
What are the better
questions it helps you get,

91
00:05:16,520 --> 00:05:18,680
what are the new things you
wouldn't have done otherwise,

92
00:05:18,680 --> 00:05:21,320
right.
But also the other

93
00:05:21,320 --> 00:05:24,120
aspect of this, and I think
Chris Fields said this, that

94
00:05:24,400 --> 00:05:26,880
arguments are only resolved by
technologies.

95
00:05:27,280 --> 00:05:30,360
And so in in some of these
philosophical questions, like

96
00:05:30,360 --> 00:05:33,360
people have been debating this
stuff for a really long time,

97
00:05:33,360 --> 00:05:35,160
you know, thousands of years in
some cases.

98
00:05:35,800 --> 00:05:40,480
And many of these things are in
fact treated, I think mistakenly

99
00:05:40,480 --> 00:05:46,520
so, as matters of philosophical
taste or linguistic definitions.

100
00:05:46,800 --> 00:05:49,600
And, you know, when people say,
well, OK, you know,

101
00:05:49,640 --> 00:05:51,760
you can't talk about cells
having intelligence, that's a

102
00:05:51,760 --> 00:05:54,080
category error.
You know, people love to

103
00:05:54,080 --> 00:05:57,160
play, like, smack you with these
category errors and like, OK,

104
00:05:57,160 --> 00:06:01,280
but these categories were
not given to us, you know, by

105
00:06:01,280 --> 00:06:04,080
aliens who came down and told us
this is how they define

106
00:06:04,080 --> 00:06:05,440
the universe.
And then here's

107
00:06:05,520 --> 00:06:09,440
categories. These categories
have to evolve with

108
00:06:09,440 --> 00:06:11,840
the science.
And if we're going to have

109
00:06:11,840 --> 00:06:14,800
frameworks that are not just
sort of armchair, this is how I

110
00:06:14,800 --> 00:06:16,960
define it.
And, like, that's

111
00:06:16,960 --> 00:06:17,920
it.
That's how I'm now

112
00:06:17,920 --> 00:06:20,320
I'm going to have opinions on
this based on just how I

113
00:06:20,320 --> 00:06:22,400
feel about it.
Then we're going to have to

114
00:06:22,400 --> 00:06:25,080
do experiments, and that means
getting quantitative and getting

115
00:06:25,080 --> 00:06:28,200
to the point where we have a
rigorous framework that either

116
00:06:28,200 --> 00:06:33,840
guides experiments or becomes
fleshed out in some sort of

117
00:06:33,840 --> 00:06:35,480
computational work or something
like that.

118
00:06:35,560 --> 00:06:37,720
So those are the
motivations.

119
00:06:38,800 --> 00:06:44,200
Yeah, and as a philosopher
engaging mostly with

120
00:06:44,200 --> 00:06:46,600
philosophers, at least in
the first part of my academic

121
00:06:46,600 --> 00:06:51,080
career, actually I was at
NYU with David Chalmers when

122
00:06:51,080 --> 00:06:55,240
when I got in touch with
Mike initially via Luke Roelofs, all

123
00:06:55,240 --> 00:06:57,520
these discussions that we
addressed in this paper or all

124
00:06:57,520 --> 00:06:59,680
these problems.
For example, the definition of

125
00:06:59,680 --> 00:07:04,080
cognition.
How can one have a definition

126
00:07:04,080 --> 00:07:07,760
that doesn't bake a priori
essentials into the very result

127
00:07:07,960 --> 00:07:10,720
of the inquiry which is based on
the definition.

128
00:07:11,240 --> 00:07:14,880
So the notion of representation,
for example, is one good example.

129
00:07:15,160 --> 00:07:19,360
What counts as a representation
is very much an open problem.

130
00:07:20,120 --> 00:07:25,400
And in philosophy, as Mike was
saying, sometimes it feels

131
00:07:25,400 --> 00:07:29,400
like it's a matter of taste: what
you end up having, given

132
00:07:29,400 --> 00:07:36,800
your taste with which you start.
So in our paper, we try to shift

133
00:07:36,800 --> 00:07:40,920
from this primarily
conceptual approach to

134
00:07:40,920 --> 00:07:46,200
also a formal approach.
And if you do that, the

135
00:07:46,280 --> 00:07:50,360
litmus test is:
How profitable is this move in

136
00:07:50,360 --> 00:07:54,000
the end game?
And that means you need not only

137
00:07:54,000 --> 00:07:57,480
to be productive, but you also
need to be precise because

138
00:07:57,520 --> 00:08:01,200
otherwise you're just
spinning your wheels in

139
00:08:01,200 --> 00:08:05,000
the castles of abstraction, as
philosophers often do.

140
00:08:06,240 --> 00:08:11,520
So then that forced us to
think, OK, if we want to have a

141
00:08:11,520 --> 00:08:15,560
a sort of cosmopolitan notion of
cognition or intelligence, how

142
00:08:15,560 --> 00:08:20,680
we define intelligence is a
crucial move right at the

143
00:08:20,680 --> 00:08:25,000
beginning of the project.
And that's where, inspired by Mike's

144
00:08:25,000 --> 00:08:29,640
previous work, we thought of
intelligence as a way to

145
00:08:30,560 --> 00:08:34,320
quantify the efficiency with
which an arbitrary system moves

146
00:08:34,320 --> 00:08:38,880
in an arbitrary problem space.
And all of these are very

147
00:08:38,880 --> 00:08:42,120
important notions.
The notion of a

148
00:08:42,120 --> 00:08:45,640
problem space, the notion of
states, the notion of

149
00:08:46,120 --> 00:08:50,840
operators, all the things
that we go into in more detail

150
00:08:50,840 --> 00:08:54,040
in the formalism of the
paper, but the fundamental

151
00:08:54,040 --> 00:08:57,480
idea is quite simple.
If you are more efficient in

152
00:08:57,480 --> 00:09:00,600
solving a problem, you are more
intelligent.

153
00:09:01,320 --> 00:09:05,520
But the tricky part is not
only recognizing the problem

154
00:09:05,520 --> 00:09:07,280
which is relevant for the system
itself.

155
00:09:07,560 --> 00:09:11,640
Again, recursively, we get back
to the idea of us as observers

156
00:09:11,840 --> 00:09:14,880
being given an intelligence test
in the very fact that we want to

157
00:09:14,880 --> 00:09:19,280
detect intelligence.
So recognizing the very idea of

158
00:09:19,280 --> 00:09:23,120
a problem for a system and
mapping out the problem space

159
00:09:23,120 --> 00:09:26,920
for that system and not for you
as the observer, that's a

160
00:09:26,920 --> 00:09:30,360
fundamentally difficult issue.
So not only that, but then

161
00:09:30,360 --> 00:09:34,120
quantifying, given those
parameters, how efficient is a

162
00:09:34,120 --> 00:09:39,240
system in that problem space?
And then the most important

163
00:09:39,240 --> 00:09:44,640
question perhaps is, what is
the ground truth for that?

164
00:09:45,240 --> 00:09:50,080
What is the best null model you
can use for quantifying

165
00:09:50,080 --> 00:09:52,200
intelligence, which is a matter
of efficiency?

166
00:09:52,680 --> 00:09:55,320
And the best answer, at
least provisionally in this

167
00:09:55,320 --> 00:10:00,080
paper was doing it relative to
chance.
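
The move Robert describes, scoring efficiency against chance, can be put in symbols. This is a reconstruction from the verbal description only; the notation (N_null, N_sys) is our assumption, not necessarily the paper's:

```latex
% Sketch of efficiency-relative-to-chance as a log-ratio (assumed notation).
% N_null: search cost (e.g., states visited) of a maximum-entropy random searcher.
% N_sys:  search cost of the actual system on the same problem space.
K \;=\; \log \frac{\mathbb{E}\,[N_{\mathrm{null}}]}{\mathbb{E}\,[N_{\mathrm{sys}}]},
\qquad K > 0 \;\Longleftrightarrow\; \text{the system outperforms chance.}
```

On this reading, "relative to chance" just means the null model makes no assumptions beyond uniformity, which is why the conversation calls it the most ecumenical baseline.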

168
00:10:01,760 --> 00:10:06,400
It's the most ecumenical, the
least

169
00:10:06,680 --> 00:10:11,360
loaded formal assumption
you can make to

170
00:10:11,360 --> 00:10:14,760
quantify the efficiency
degree of something.

171
00:10:15,040 --> 00:10:17,520
But it also has problems and we
can talk about them later.

172
00:10:17,760 --> 00:10:22,120
So that's roughly the
context, the dialectical context

173
00:10:22,120 --> 00:10:25,680
under the the what, what
choices, what formal choices and

174
00:10:25,680 --> 00:10:27,360
and conceptual choices we
had to make.

175
00:10:28,680 --> 00:10:31,160
Well, I mean, you brought this
up, so I think let's go into it.

176
00:10:31,160 --> 00:10:33,600
I was going to ask you something
else, but I think it can wait.

177
00:10:33,880 --> 00:10:37,080
When it comes to this
intelligence as efficient

178
00:10:37,080 --> 00:10:41,000
search, you guys revive Newell
and Simon's idea that

179
00:10:41,000 --> 00:10:45,000
intelligence averts
combinatorial explosions.
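
A toy calculation shows the scale of the explosion being averted. The numbers here are illustrative placeholders, not figures from Newell and Simon or from the paper:

```python
# Illustrative only: why unguided search is hopeless.
# With b options per step over d steps, blind enumeration faces b**d paths,
# while an agent that commits to one option per step examines only ~b*d.
b, d = 10, 20                 # hypothetical branching factor and depth
blind_paths = b ** d          # exhaustive enumeration: 10**20 paths
directed_evals = b * d        # a greedy agent: 200 evaluations
print(f"blind: {blind_paths:.1e} paths, directed: {directed_evals} evaluations")
```

Any strategy that prunes branches, however crudely, buys exponentially many orders of magnitude, which is the sense in which intelligence "averts" the explosion.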

180
00:10:45,200 --> 00:10:50,760
Why is search efficiency, OK, in
the paper, the right unifying

181
00:10:50,760 --> 00:10:58,160
metric for a scale-free theory?
I mean, with all metrics and all

182
00:10:58,160 --> 00:11:02,600
mathematical formalisms, you can
have many of them packed into a

183
00:11:02,600 --> 00:11:05,160
concept.
So it's not like you can have a

184
00:11:05,160 --> 00:11:08,880
uniqueness proof for this
mathematical construction.

185
00:11:08,880 --> 00:11:12,200
As Mike was saying in
the beginning, you judge

186
00:11:12,280 --> 00:11:14,360
a construct based on its
fruitfulness.

187
00:11:14,800 --> 00:11:20,560
So this was in a sense a
mathematical hypothesis, a first

188
00:11:21,160 --> 00:11:24,920
way to flesh this out.
And we have ongoing work in

189
00:11:24,920 --> 00:11:28,640
progress where we try to embed
this problem space formalism

190
00:11:28,880 --> 00:11:31,400
into the free energy principle
because there are multiple

191
00:11:32,200 --> 00:11:34,920
alternative formulations that
one can give of this problem

192
00:11:34,920 --> 00:11:37,840
space.
But why? And this

193
00:11:37,840 --> 00:11:43,280
is an excellent point you made.
We tried to revive the Newell and

194
00:11:43,280 --> 00:11:47,640
Simon conception, which is from
1972 at least.

195
00:11:47,680 --> 00:11:50,400
And and this is a classical
conception in cognitive science.

196
00:11:50,760 --> 00:11:54,160
But the reason we try to do
that is because we wanted to

197
00:11:54,160 --> 00:11:57,800
address the skeptics
in the cognition wars,

198
00:11:57,880 --> 00:12:00,680
in the basal cognition wars,
right on their territory.

199
00:12:00,840 --> 00:12:04,320
Because the work by Newell and
Simon

200
00:12:05,000 --> 00:12:08,320
was primarily addressed in the
human case of problem solving.

201
00:12:08,680 --> 00:12:11,160
So it was mostly in the era of
symbolic AI.

202
00:12:11,720 --> 00:12:17,480
So GOFAI and all that, that
now-ancient AI

203
00:12:17,560 --> 00:12:22,480
discussion.
So we try to build on that, but

204
00:12:22,480 --> 00:12:26,280
also extend it in a way that
we found palatable for our

205
00:12:26,480 --> 00:12:29,320
current, contemporary
scientific sensibilities.

206
00:12:29,960 --> 00:12:35,680
So we took that, added some
elements to it and proposed K as

207
00:12:35,680 --> 00:12:38,560
this ratio, as the log
of the ratio between

208
00:12:39,240 --> 00:12:44,320
directed or agentic or
system-specific search versus a

209
00:12:44,400 --> 00:12:48,920
null model, which would be a
MaxEnt, maximum-entropy or random

210
00:12:48,920 --> 00:12:54,320
search for that system.
So whether that is the unique

211
00:12:54,320 --> 00:12:59,880
metric that cannot be given a
formal proof, probably, but I

212
00:12:59,880 --> 00:13:04,840
think it's a useful metric if we
start using it and the the proof

213
00:13:04,840 --> 00:13:08,080
was in the pudding of our two
computational models, which were

214
00:13:08,400 --> 00:13:12,120
mostly, if you want, back-of-the-napkin
calculations based on the

215
00:13:12,400 --> 00:13:17,120
current empirical literature we
have on planaria and

216
00:13:17,120 --> 00:13:22,160
bacterial chemotaxis.
So that required a lot of

217
00:13:22,600 --> 00:13:25,600
empirical assumptions which we
inherited from the literature.

218
00:13:25,840 --> 00:13:30,440
But once we did that and were
able to specify concretely what

219
00:13:30,440 --> 00:13:34,760
is the problem space, what are
the states, the operators, the

220
00:13:34,760 --> 00:13:41,160
constraints, the horizon, and
what was the last one?

221
00:13:41,160 --> 00:13:47,000
It's a 5-tuple. Now I'm
forgetting my point.

222
00:13:47,000 --> 00:13:49,880
So let's rewind this.
So once the.

223
00:13:50,320 --> 00:13:52,600
The evaluation function.
Yeah, yeah.

224
00:13:52,760 --> 00:13:54,360
So, yeah, that is it.
Yeah, exactly.

225
00:13:54,600 --> 00:13:59,520
So rewind.
So once you specify the

226
00:13:59,520 --> 00:14:03,360
states, the operators, the
constraints, the evaluation

227
00:14:03,360 --> 00:14:08,800
function, and the horizon in
a specific way, then you can

228
00:14:08,800 --> 00:14:12,440
start evaluating.
And given all these assumptions

229
00:14:12,440 --> 00:14:18,600
and with K defined as it was, we
found multiple orders of

230
00:14:18,600 --> 00:14:23,280
magnitude, billions of orders
of magnitude of

231
00:14:23,280 --> 00:14:27,320
efficiency, even under the most
conservative assumptions.
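
The five ingredients listed above (states, operators, constraints, evaluation function, horizon) and the log-ratio K can be sketched as follows. The class layout, field names, and all numbers are our own illustrative assumptions, not the paper's code or its results:

```python
import math
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ProblemSpace:
    """The 5-tuple described in the conversation (names are ours)."""
    states: List[str]                   # configurations the system can occupy
    operators: List[str]                # moves that transform one state into another
    constraints: List[str]              # which moves are admissible where
    evaluation: Callable[[str], float]  # scores a state against the goal
    horizon: int                        # how many steps the searcher gets

def k_metric(null_cost: float, system_cost: float) -> float:
    """K as a log ratio: a maximum-entropy (random) null model's search cost
    over the system's directed search cost. K > 0 means better than chance."""
    return math.log10(null_cost / system_cost)

# Hypothetical napkin numbers: random search needs 1e12 trials, the organism 1e3.
print(k_metric(1e12, 1e3))  # 9.0 -> nine orders of magnitude beyond chance
```

The conservative-assumptions point then amounts to choosing the smallest defensible null cost and the largest defensible system cost, so K only underestimates the efficiency gap.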

232
00:14:27,600 --> 00:14:31,080
And we took those

233
00:14:31,080 --> 00:14:36,000
assumptions right from Mike's
work on planaria.

234
00:14:36,760 --> 00:14:41,440
And even if we make the
most conservative assumptions,

235
00:14:41,440 --> 00:14:44,440
which means we underestimate
rather than overestimate

236
00:14:44,440 --> 00:14:48,520
intelligence, we still end up
with fantastically higher

237
00:14:48,520 --> 00:14:50,640
efficiency relative to a
null model.
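
As a toy version of the chemotaxis comparison, the following simulates a 1-D walker, reflected at the origin, heading for a "source". The bias values, goal distance, and trial count are made-up illustrations of the null-model idea, not the paper's actual model of bacterial chemotaxis:

```python
import random

def mean_steps_to_source(bias: float, goal: int = 40, trials: int = 100,
                         seed: int = 1) -> float:
    """Average steps for a walker starting at 0 (reflecting boundary there)
    to reach `goal`. bias = probability of stepping toward the source:
    0.5 is the chance-level null model, >0.5 mimics gradient following."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        pos = steps = 0
        while pos < goal:
            pos += 1 if rng.random() < bias else -1
            pos = max(pos, 0)  # reflect at the starting wall
            steps += 1
        total += steps
    return total / trials

null_cost = mean_steps_to_source(0.5)    # unbiased random walk (null model)
biased_cost = mean_steps_to_source(0.7)  # weakly biased, chemotaxis-like walker
print(null_cost / biased_cost)           # efficiency ratio well above 1
```

Even this weak bias beats the unbiased null by a large factor; real sensory-guided search over high-dimensional problem spaces is where the ratios reported in the conversation become astronomically larger.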

238
00:14:52,600 --> 00:14:54,040
Mike, anything you'd like to add
to that?

239
00:14:55,200 --> 00:14:58,320
Well, there's two interesting
things

240
00:14:58,480 --> 00:15:01,320
here I could comment on.
One is that, you know, in

241
00:15:01,320 --> 00:15:05,920
general, having a framework
like this that is really

242
00:15:05,920 --> 00:15:11,120
formal in a way that's substrate
agnostic is incredibly powerful.

243
00:15:11,120 --> 00:15:15,200
Because this is typically, you
know, people have a really hard

244
00:15:15,200 --> 00:15:20,360
time porting words and the
concepts from, you know, sort of

245
00:15:20,960 --> 00:15:24,080
behavioral science and this kind
of like agentic talk onto other

246
00:15:24,080 --> 00:15:25,440
substrates.
And it always kind of feels

247
00:15:25,440 --> 00:15:30,360
wrong to a lot of people.
But having a formalism that says

248
00:15:30,360 --> 00:15:34,240
no, no, the mapping is good and
here's why we can actually map

249
00:15:34,240 --> 00:15:35,560
this stuff on and it works very
well.

250
00:15:35,880 --> 00:15:38,440
You know, sort of it's nice and
clean like that, and that has a

251
00:15:38,440 --> 00:15:40,400
lot of value
to it.

252
00:15:40,800 --> 00:15:45,800
And this is partly why I'm sort
of so enamored of William James's

253
00:15:45,800 --> 00:15:49,360
original definition of
intelligence because it is so

254
00:15:49,400 --> 00:15:51,480
cybernetic.
Like, it doesn't say you're

255
00:15:51,480 --> 00:15:53,440
a brain and it doesn't say what
space you're in.

256
00:15:53,440 --> 00:15:56,480
And it doesn't say, you know,
how you got here or what you're

257
00:15:56,480 --> 00:15:59,080
made of or any of that stuff.
It's like very, you know,

258
00:15:59,080 --> 00:16:01,680
general and very powerful.
And it tells you, like it's

259
00:16:01,680 --> 00:16:04,560
calling to you to say, Hey,
maybe I could apply this to

260
00:16:04,560 --> 00:16:06,480
other things, right?
Like it's daring you to do that.

261
00:16:06,680 --> 00:16:09,960
So I like that a lot.
The other thing I'll just

262
00:16:09,960 --> 00:16:13,280
mention about efficiency is
the following.

263
00:16:13,280 --> 00:16:15,440
And I think there's an
interesting

264
00:16:15,440 --> 00:16:20,240
feedback loop here, which is
that if you're a biological and,

265
00:16:20,240 --> 00:16:22,480
and there's actually a whole
other thing that Robert and I

266
00:16:22,480 --> 00:16:25,480
need to kind of talk about,
and extend this to

267
00:16:25,480 --> 00:16:28,160
non-biological, like to
pre-biological kinds of things,

268
00:16:28,160 --> 00:16:29,080
right?
Because I think we're going to

269
00:16:29,080 --> 00:16:31,400
find stuff there too.
But for sure, if

270
00:16:31,400 --> 00:16:35,200
you're a biological,
efficiency is really key

271
00:16:35,200 --> 00:16:37,840
because you don't have time to
be a Laplacian demon, OK?

272
00:16:37,840 --> 00:16:40,560
You don't have time to look at
microstates to try and calculate

273
00:16:40,560 --> 00:16:41,760
every damn thing that's going to
happen.

274
00:16:41,760 --> 00:16:45,880
And, you know, you will
be dead and gone long

275
00:16:45,880 --> 00:16:47,600
before you compute the
first thing.

276
00:16:47,960 --> 00:16:51,880
So what that means is that
you're under a tremendous amount

277
00:16:51,880 --> 00:16:54,880
of pressure to be efficient,
but be efficient.

278
00:16:54,880 --> 00:16:57,480
How?
By coarse-graining, by having

279
00:16:58,880 --> 00:17:00,560
strategies to navigate the
world.

280
00:17:00,560 --> 00:17:01,960
And that means putting labels on
things.

281
00:17:01,960 --> 00:17:05,599
That means making models
of the world, of large-scale

282
00:17:05,599 --> 00:17:07,720
systems, doing things.
You don't have time to track all

283
00:17:07,720 --> 00:17:09,960
the particles that are impinging
on your surface.

284
00:17:09,960 --> 00:17:12,800
You have to have these large
scale models, and I think

285
00:17:12,800 --> 00:17:17,160
that's very powerful because if
you have a pressure to see the

286
00:17:17,160 --> 00:17:21,240
environment as containing agents
that do things, eventually you

287
00:17:21,240 --> 00:17:23,560
turn that light on to yourself
and you say, wait a minute, I'm

288
00:17:23,560 --> 00:17:26,119
an agent that does things and
it's sort of right.

289
00:17:26,119 --> 00:17:29,640
And it kind
of cranks this up now, to the

290
00:17:29,640 --> 00:17:31,520
extent that you do that you
become more efficient, which

291
00:17:31,520 --> 00:17:33,240
then, right.
So, so there's like this

292
00:17:33,240 --> 00:17:36,880
this, this feedback loop, which
I think really favors, you know,

293
00:17:36,880 --> 00:17:39,400
there's a, there's a reason we
see agents everywhere.

294
00:17:39,400 --> 00:17:42,680
And it's not because we're just
like terribly, you know,

295
00:17:42,680 --> 00:17:44,920
deluded.
And it pays

296
00:17:44,920 --> 00:17:46,200
off.
Like, dude, there's a reason why

297
00:17:46,200 --> 00:17:47,880
we do this.
And I mean, of course you can

298
00:17:47,880 --> 00:17:50,160
make mistakes, but
mostly it pays off.

299
00:17:50,400 --> 00:17:54,600
So I think efficiency from that
perspective, it gives us a

300
00:17:54,600 --> 00:17:59,600
glimpse into a driving force
for what makes us agents and

301
00:17:59,600 --> 00:18:01,720
what thus enables us to see
other agents.

302
00:18:02,280 --> 00:18:04,000
So this is, you know, part of
that.

303
00:18:04,000 --> 00:18:08,160
I see this as part of that.
I mean, before we continue the

304
00:18:08,320 --> 00:18:13,600
the title, Neuroscience Beyond
Neurons, it sort of

305
00:18:13,600 --> 00:18:16,480
decenters neurons without
dismissing neuroscience as well.

306
00:18:16,480 --> 00:18:20,080
So, Robert, perhaps you
could give us the answer to

307
00:18:20,080 --> 00:18:21,200
this.
What does mainstream

308
00:18:21,200 --> 00:18:24,120
neuroscience get right from a
philosophical perspective and

309
00:18:24,120 --> 00:18:29,160
what does it miss by staying
neuron-centric?

310
00:18:31,720 --> 00:18:34,720
Well, mainstream neuroscience is
even hard to characterize.

311
00:18:34,720 --> 00:18:37,240
What is mainstream
neuroscience in this sense?

312
00:18:37,240 --> 00:18:41,480
So it's probably the status
quo which considers that

313
00:18:41,480 --> 00:18:45,440
intelligence and cognition even
are the prerogative of

314
00:18:45,440 --> 00:18:48,720
brains or nervous systems only.
But then even that is a

315
00:18:48,720 --> 00:18:52,760
complicated question.
Which brains of which animals,

316
00:18:52,760 --> 00:18:56,000
which nervous system, how many
parts of the nervous system, at

317
00:18:56,000 --> 00:18:58,960
what degree of resolution of the
nervous system can you consider?

318
00:18:59,040 --> 00:19:01,520
Is the medulla intelligent in
the human brain?

319
00:19:03,440 --> 00:19:06,600
All these kinds of questions.
So when you start rolling the

320
00:19:06,600 --> 00:19:11,760
ball on being skeptical of the
assumptions, then

321
00:19:11,840 --> 00:19:14,720
you start asking, OK, what about
the subsystem in the

322
00:19:14,720 --> 00:19:18,000
dorsolateral prefrontal cortex?
What about the mini column?

323
00:19:19,800 --> 00:19:25,240
That's about 100,000 neurons.
What about the

324
00:19:25,240 --> 00:19:28,880
filaments in the minicolumn?
Are those systems intelligent?

325
00:19:29,040 --> 00:19:31,960
What about the single neuron you
see?

326
00:19:31,960 --> 00:19:35,760
So the fact that
you ask these questions, and then

327
00:19:35,760 --> 00:19:40,880
if you are knowledgeable of
even a modicum of

328
00:19:40,880 --> 00:19:45,120
brain neurophysiology, you start
thinking maybe there is

329
00:19:45,120 --> 00:19:48,880
something to this.
Maybe I should not be so neuron

330
00:19:48,880 --> 00:19:54,160
chauvinist, or maybe I should
not have the same certainty that

331
00:19:54,160 --> 00:19:59,160
I have when it comes to
humans when I attribute, or

332
00:19:59,160 --> 00:20:02,040
actually not attribute
intelligence to other systems

333
00:20:02,520 --> 00:20:05,280
And as we do that, we

334
00:20:05,480 --> 00:20:09,840
progressively start departing
from something like a gaseous

335
00:20:09,840 --> 00:20:12,640
consensus.
There is a core of the consensus

336
00:20:12,640 --> 00:20:15,280
in neuroscience: brains and
nervous systems in animals.

337
00:20:15,480 --> 00:20:18,480
So the seat of intelligence is
the organism.

338
00:20:18,680 --> 00:20:22,640
And that is in almost all cases
an organism with a nervous

339
00:20:22,640 --> 00:20:25,360
system and in almost all cases a
nervous system with a

340
00:20:25,360 --> 00:20:27,680
centralized brain or something
like that.

341
00:20:28,160 --> 00:20:32,800
Cephalopods, mollusks, they are
within that space. But once you

342
00:20:32,800 --> 00:20:37,600
get around or beyond that
crystallized core of

343
00:20:37,600 --> 00:20:40,520
mainstream neuroscience, so
to speak, you get to more

344
00:20:40,760 --> 00:20:45,040
gaseous regions of the
periphery.

345
00:20:45,080 --> 00:20:47,840
And then that's where we started
building.

346
00:20:47,840 --> 00:20:52,720
OK, what happens if you push
through this gas and go as much

347
00:20:52,720 --> 00:20:54,960
in an exploration as you can?
How deep can you go?

348
00:20:55,880 --> 00:20:59,040
Can you go to a single cell?
Can you go to something like

349
00:20:59,040 --> 00:21:01,840
what Mike was intimating, that is
pre-biological?

350
00:21:02,240 --> 00:21:04,360
And our answer is yes, I think
we can.

351
00:21:04,760 --> 00:21:10,280
And the onus is on us to make
this in a conceptually coherent

352
00:21:10,280 --> 00:21:15,040
and formally rigorous way and
then obviously try to obtain

353
00:21:15,040 --> 00:21:19,000
data and validate those
decisions we made on the

354
00:21:19,000 --> 00:21:23,200
theoretical side experimentally.
But the fact that we questioned

355
00:21:23,200 --> 00:21:27,640
the assumptions, that's the
point: the fact that we are

356
00:21:27,640 --> 00:21:32,200
not locked, ossified, in the core
of mainstream neuroscience.

357
00:21:32,640 --> 00:21:40,800
So the fact that we are not, on
cognition, like

358
00:21:40,800 --> 00:21:43,720
brain-centric, cortico-centric or
whatever.

359
00:21:43,720 --> 00:21:46,920
Yeah, the Cortico.
Yeah, yeah, exactly.

360
00:21:47,320 --> 00:21:51,000
We are not there, and
that's because we think in

361
00:21:51,000 --> 00:21:56,400
these more nomadic ways.
Well, look, I'm super

362
00:21:56,400 --> 00:22:00,000
excited to hear about the pre
biological part, but we need to

363
00:22:00,000 --> 00:22:02,280
build up to that.
So going along the lines of the

364
00:22:02,280 --> 00:22:06,800
paper, Mike, why measure
relative to a blind random walk

365
00:22:06,880 --> 00:22:11,560
rather than human benchmarks or
task success, let's say? Like,

366
00:22:11,560 --> 00:22:15,320
what problem does this
logarithmic continuous scale

367
00:22:15,320 --> 00:22:19,480
dissolve, versus binary
cognitive/non-cognitive notions?

368
00:22:20,720 --> 00:22:23,600
Yeah.
And actually just something real

369
00:22:23,600 --> 00:22:26,400
quick to say about the

370
00:22:26,400 --> 00:22:29,000
previous question.
You know, one thing that I

371
00:22:29,000 --> 00:22:36,000
think neuroscience gets
massively right is the idea that

372
00:22:37,680 --> 00:22:41,600
what you're studying is
literally a multi scale system

373
00:22:41,800 --> 00:22:46,320
where causal influence at least
not everybody believes this, but

374
00:22:46,320 --> 00:22:48,840
you know, at least some
neuroscientists do, causal

375
00:22:48,840 --> 00:22:51,560
influence traverses up and down
the whole stack.

376
00:22:51,880 --> 00:22:56,000
So it's right,
this idea that all of it

377
00:22:56,000 --> 00:22:58,240
is valuable.
So yes, the synaptic

378
00:22:58,240 --> 00:23:01,800
proteins, yes, you need to
know about that; but also

379
00:23:01,800 --> 00:23:04,680
there's the network properties
and there's the what?

380
00:23:04,680 --> 00:23:06,440
And then eventually
psychoanalysis, right?

381
00:23:06,440 --> 00:23:10,280
Like, all of these things.
So, the multi scale:

382
00:23:10,880 --> 00:23:13,560
the idea that what we're looking
at is a multi scale problem

383
00:23:13,560 --> 00:23:15,000
where all of the scales are
important.

384
00:23:15,000 --> 00:23:17,280
They're all generating
something new.

385
00:23:17,680 --> 00:23:22,360
This is very unique. And
yes, there are

386
00:23:22,480 --> 00:23:24,840
people in the field who think
that eventually it all

387
00:23:24,840 --> 00:23:27,760
gets boiled down to chemistry.
I don't know, nobody seems to

388
00:23:27,760 --> 00:23:30,080
think it gets boiled down to
quantum foam.

389
00:23:30,080 --> 00:23:32,120
I don't know why that is.
They kind of stop at chemistry.

390
00:23:32,280 --> 00:23:36,080
But I think most
people think that there are

391
00:23:36,080 --> 00:23:39,240
new aspects
introduced at the different

392
00:23:39,240 --> 00:23:41,160
levels that it's not just all
going to be eventually, you

393
00:23:41,160 --> 00:23:44,960
know, chemistry, but in other
fields, for example, in

394
00:23:44,960 --> 00:23:48,200
developmental biology, I think
almost nobody believes that.

395
00:23:48,200 --> 00:23:51,520
I think people do think
that these higher levels

396
00:23:51,520 --> 00:23:54,400
don't really add anything.
At best, there are some cool

397
00:23:54,400 --> 00:23:56,880
sort of evolutionary
patterns that you see at large

398
00:23:56,880 --> 00:23:58,840
scale.
But fundamentally it's all just

399
00:23:58,840 --> 00:24:00,440
the story of chemistry rattling
around.

400
00:24:00,960 --> 00:24:04,480
And I think we can
really learn from neuroscience

401
00:24:05,120 --> 00:24:07,760
this multi scale kind of
approach.

402
00:24:07,840 --> 00:24:12,320
So I think that's
critical. But nevertheless, I

403
00:24:12,320 --> 00:24:15,600
don't know if I
can say this, but I don't

404
00:24:15,600 --> 00:24:17,320
think that neuroscience is the
study of neurons.

405
00:24:17,600 --> 00:24:20,160
I think it's the study of multi
scale causation, which happens

406
00:24:20,160 --> 00:24:22,600
to be, like, neurons are
an amazing example of that,

407
00:24:22,600 --> 00:24:25,160
you know,
maybe the best

408
00:24:25,160 --> 00:24:28,360
example so far, but I don't
think it's the study of neurons

409
00:24:28,360 --> 00:24:33,080
per se.
And so now, for your

410
00:24:33,080 --> 00:24:36,640
quick question: why
not compare it to humans?

411
00:24:36,880 --> 00:24:42,920
This, I think, is really critical,
because, look at

412
00:24:42,920 --> 00:24:44,640
a lot of the debate
around AI.

413
00:24:44,920 --> 00:24:48,040
People say, well, it's
just like a human, or it's not

414
00:24:48,040 --> 00:24:50,800
like a human.
Why do you need to compare

415
00:24:50,800 --> 00:24:53,040
it to a human?
If your goal is to

416
00:24:53,040 --> 00:24:55,680
understand its nature,
how much intelligence it has and

417
00:24:55,680 --> 00:24:58,560
what kind of intelligence it
has, why are humans the

418
00:24:58,560 --> 00:24:59,960
comparison?
I mean, I get it.

419
00:24:59,960 --> 00:25:03,480
We want collaborators and
coworkers that, you know,

420
00:25:03,480 --> 00:25:06,640
are fast and smart and all that.
But really, biology has

421
00:25:06,640 --> 00:25:10,920
been solving problems and doing
intelligence long before we had

422
00:25:10,920 --> 00:25:13,920
humans and even long before we
had neurons, and I would argue

423
00:25:14,040 --> 00:25:18,600
before we had real cells even.
But the point is, I don't

424
00:25:18,600 --> 00:25:20,400
think, you know,
humans should be the measure of

425
00:25:20,400 --> 00:25:22,320
all things, right?
There's got to be,

426
00:25:22,360 --> 00:25:25,240
there must be ways to be
intelligent and to have all

427
00:25:25,240 --> 00:25:27,680
these other cognitive features
that we want that are nothing

428
00:25:27,680 --> 00:25:29,680
like humans.
I think it's incredibly

429
00:25:29,680 --> 00:25:32,080
dangerous to think that humans
are going to be the measure of

430
00:25:32,080 --> 00:25:34,880
everything, both for
practical reasons, because we're

431
00:25:34,880 --> 00:25:37,920
going to miss and not see
things, but also very

432
00:25:37,920 --> 00:25:41,080
importantly for ethical reasons,
because we have a huge

433
00:25:41,520 --> 00:25:43,680
difficulty seeing minds that are
not like ours.

434
00:25:43,840 --> 00:25:46,320
And we have to start by just
acknowledging that humans are

435
00:25:46,320 --> 00:25:47,800
not the measure of everything.
Interesting.

436
00:25:49,280 --> 00:25:53,000
And just to add something
to this danger of

437
00:25:53,000 --> 00:25:56,800
anthropocentrism or actually
anthropomorphism, the fact that

438
00:25:56,800 --> 00:26:02,120
we accept multiple scales in

439
00:26:02,120 --> 00:26:07,120
the multi-causal composition
of a system means that, as an observer that

440
00:26:07,120 --> 00:26:12,400
is able, with some degree of
resolution, to access those scales,

441
00:26:12,960 --> 00:26:16,080
you must recognize the problem
spaces operant at those

442
00:26:16,080 --> 00:26:19,560
scales.
So you would eventually end up

443
00:26:19,600 --> 00:26:23,760
seeing each system, if you
have enough data and enough

444
00:26:23,760 --> 00:26:29,400
resolution to sample it, as
a republic

445
00:26:29,400 --> 00:26:35,720
of competent problem solvers, or
as some sort of sometimes

446
00:26:35,720 --> 00:26:39,760
cooperating, sometimes not
cooperating, democracy of

447
00:26:39,760 --> 00:26:44,200
intelligences.
And at each scale you analyze,

448
00:26:44,400 --> 00:26:46,760
you must make scale-specific
assumptions.

449
00:26:47,160 --> 00:26:50,560
But the beauty of the
framework we propose is that no

450
00:26:50,560 --> 00:26:54,240
matter what essentials play out
at a single scale, you can use

451
00:26:54,240 --> 00:26:56,400
a formalism that is
scale-invariant.

452
00:26:56,520 --> 00:27:01,240
And that was the attempt of
formalizing Mike's previous work

453
00:27:01,240 --> 00:27:03,960
with Chris Fields on
the multi scale

454
00:27:03,960 --> 00:27:07,160
competency architecture.
Yeah.

455
00:27:07,360 --> 00:27:10,240
Yeah, I think, let's
go further into this.

456
00:27:10,800 --> 00:27:15,440
Robert, your quintuple
expands classical problem

457
00:27:15,440 --> 00:27:19,600
spaces.
Why is it key that biological

458
00:27:19,600 --> 00:27:25,080
systems don't just search but
reshape spaces by editing

459
00:27:25,080 --> 00:27:27,640
constraints, operators or
horizons?

460
00:27:29,120 --> 00:27:34,360
Those were just a few examples.
I mean, if you want to

461
00:27:34,360 --> 00:27:37,480
develop these things in
detail, you'd probably

462
00:27:37,480 --> 00:27:40,840
write more than one book just
to pursue a single

463
00:27:40,840 --> 00:27:43,720
construct.
But the point, as Mike

464
00:27:43,720 --> 00:27:47,880
was mentioning, and in the
original work by Newell and

465
00:27:47,880 --> 00:27:52,800
Simon, usually when you want to
solve a problem, you search in

466
00:27:52,800 --> 00:27:55,880
the space of possible solutions.
So you search for an end

467
00:27:55,880 --> 00:28:00,120
state, given a set of
operators whose costs you incur by

468
00:28:00,120 --> 00:28:02,680
pursuing the trajectory to that
state.

469
00:28:03,280 --> 00:28:10,080
You as a human rehearse in
your mind sort of

470
00:28:10,480 --> 00:28:13,040
a simulation of what the
trajectory would be.

471
00:28:13,160 --> 00:28:15,160
So that is counterfactual
reasoning.

472
00:28:15,600 --> 00:28:20,160
Now when you are not assuming
that something like a bacterium

473
00:28:20,160 --> 00:28:23,840
can do the same counterfactual
deep reasoning as we can do,

474
00:28:24,680 --> 00:28:28,240
that invites the question, how
can a bacterium or any other

475
00:28:28,240 --> 00:28:32,480
system consider multiple
trajectories for the solutions

476
00:28:32,720 --> 00:28:40,000
it has at its disposal?
So that raises certain problems.

477
00:28:40,520 --> 00:28:44,400
In the original formulations by
Newell and Simon, it was already

478
00:28:44,400 --> 00:28:49,360
baked in that you would have the
human analog of counterfactual

479
00:28:49,360 --> 00:28:54,480
reasoning: a human, given
some states, some operators and

480
00:28:54,480 --> 00:28:58,440
some constraints, and that is it.
Not necessarily any valuation

481
00:28:58,440 --> 00:29:01,760
functional, no horizon.
And I'll get to the horizon

482
00:29:01,760 --> 00:29:05,440
in a second.
You would be able to

483
00:29:05,480 --> 00:29:10,080
specify a problem space and
quantify search efficiency if

484
00:29:10,080 --> 00:29:14,120
you are able to computationally
simulate the entirety of the

485
00:29:14,120 --> 00:29:15,480
state space for that problem
space.

486
00:29:15,480 --> 00:29:19,800
And that brings certain
problems in all systems, not

487
00:29:19,800 --> 00:29:22,200
only humans.
But let's go with the

488
00:29:22,200 --> 00:29:24,720
argument.
So the reasons for adding an

489
00:29:24,720 --> 00:29:29,080
evaluation functional, and then
adding the horizon as a

490
00:29:29,080 --> 00:29:30,920
construct,
are the following.

491
00:29:31,720 --> 00:29:36,840
The valuation function is a sort
of metric of expenditure that

492
00:29:36,840 --> 00:29:41,560
you would apply to each
iteration of some operator.

493
00:29:41,960 --> 00:29:47,000
So if you have certain costs
that are associated with the

494
00:29:48,040 --> 00:29:51,040
iteration, with the application
of an operator that moves you in

495
00:29:51,040 --> 00:29:55,200
the state space, and that cost
can and will be scale

496
00:29:55,200 --> 00:29:59,040
specific.
There will be ATP units, there

497
00:29:59,040 --> 00:30:05,960
will be a limited
amount of possible steps in

498
00:30:05,960 --> 00:30:08,600
3D space that you can take
when you go someplace.

499
00:30:08,720 --> 00:30:12,640
So all of these are
costs you must be able to

500
00:30:12,680 --> 00:30:16,440
evaluate internally, from the
perspective of the system, so

501
00:30:16,440 --> 00:30:20,000
that you can evaluate better
and worse moves.

502
00:30:20,720 --> 00:30:24,600
If you don't have that, you are
immediately bound to be

503
00:30:24,600 --> 00:30:27,720
haphazardly sampling the
problem space.

504
00:30:28,200 --> 00:30:32,160
And that puts you into
combinatorial explosion

505
00:30:32,160 --> 00:30:36,640
territory, into an exploding
regime of search options.

506
00:30:37,000 --> 00:30:42,000
Because the moment you try to
estimate how big a problem

507
00:30:42,000 --> 00:30:47,200
space is for a given system, that
means defining the

508
00:30:47,200 --> 00:30:51,200
smallest possible state, the
microstate. That means coarse

509
00:30:51,320 --> 00:30:54,480
graining, averaging; there are all sorts

510
00:30:54,760 --> 00:30:58,360
of techniques for dimensionality
reduction and state space

511
00:30:58,360 --> 00:31:01,840
estimation.
You would see immediately that

512
00:31:01,840 --> 00:31:05,320
if a system did not have a way
to tell that it is going in the

513
00:31:05,320 --> 00:31:08,760
right direction, and it's not
necessary that the system

514
00:31:08,760 --> 00:31:11,000
have something like
counterfactual reasoning as

515
00:31:11,000 --> 00:31:14,120
humans do, it might be running
out of energy.

516
00:31:14,560 --> 00:31:18,200
Energy in this sense, as a
very trivial example, would be

517
00:31:18,200 --> 00:31:21,280
the signal that the system is
not solving the problem well.

518
00:31:21,760 --> 00:31:27,320
And that's as fundamental in
biological terms as it gets.

519
00:31:27,800 --> 00:31:33,680
You run out of energy, you run
out of life, so to speak.

520
00:31:34,360 --> 00:31:40,000
And the horizon would be
adding a sort of step size, if

521
00:31:40,000 --> 00:31:46,120
you want, for each agent,
depending on the depth of its,

522
00:31:46,280 --> 00:31:49,840
in the mainstream neural network
style, stack: abstraction abilities, coarse

523
00:31:49,840 --> 00:31:52,040
graining and labeling
abilities.

524
00:31:52,400 --> 00:31:58,600
Even a non-human
system has some form

525
00:31:59,520 --> 00:32:05,680
of depth to its abilities to
abstract moves, possible moves.

526
00:32:05,680 --> 00:32:08,760
And that is something that
we think is important even

527
00:32:08,760 --> 00:32:13,080
for a system.
And that's probably one of

528
00:32:13,080 --> 00:32:16,440
the hardest ways, or one of the
hardest constructs, to

529
00:32:16,440 --> 00:32:19,360
estimate for a single
scale, for a specific scale.

530
00:32:19,800 --> 00:32:22,000
But I think it's
something important, and it ends

531
00:32:22,000 --> 00:32:25,680
up informing the problem
space discussion quite a

532
00:32:25,680 --> 00:32:28,800
lot; it is the equivalent of
human counterfactual depth.
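
The quintuple Robert describes (states, operators, constraints, a valuation functional, and a horizon) can be sketched in code. This is a minimal toy illustration under my own assumptions, not the paper's actual formalism; the names (`ProblemSpace`, `best_move`) and the toy operators and costs are invented for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict, Optional

# Sketch of the "quintuple" extension of a Newell-Simon problem space:
# operators over states (classical), plus constraints, a valuation
# functional pricing each operator application, and a horizon bounding
# how far ahead the agent can look. All names here are illustrative.
@dataclass
class ProblemSpace:
    operators: Dict[str, Callable[[int], int]]  # name -> state transition
    constraint: Callable[[int], bool]           # is a state allowed?
    valuation: Callable[[str], float]           # cost of applying an operator
    horizon: int                                # lookahead depth

def best_move(space: ProblemSpace, state: int, goal: int) -> Optional[str]:
    """Pick the cheapest allowed operator whose result can still reach
    the goal within the remaining horizon (the agent is blind beyond it)."""
    def reachable(s: int, depth: int) -> bool:
        if s == goal:
            return True
        if depth == 0:
            return False
        return any(reachable(op(s), depth - 1)
                   for op in space.operators.values()
                   if space.constraint(op(s)))

    candidates = [(space.valuation(name), name)
                  for name, op in space.operators.items()
                  if space.constraint(op(state))
                  and reachable(op(state), space.horizon - 1)]
    return min(candidates)[1] if candidates else None

# Toy instance: integer states on [0, 20], cheap +/-1 moves, costly doubling.
space = ProblemSpace(
    operators={"inc": lambda s: s + 1,
               "dec": lambda s: s - 1,
               "double": lambda s: s * 2},
    constraint=lambda s: 0 <= s <= 20,
    valuation=lambda name: {"inc": 1.0, "dec": 2.0, "double": 3.0}[name],
    horizon=4,
)
move = best_move(space, 3, 10)  # cheapest move that keeps the goal in reach
```

The valuation functional is what spares the agent haphazard sampling: without it, every allowed operator looks equally good and the search degenerates into the combinatorial explosion described above.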

533
00:32:31,520 --> 00:32:40,440
Mike, any comments on that?
In general on the quintuple and

534
00:32:40,440 --> 00:32:42,720
how it expands classical problem
spaces as well.

535
00:32:43,840 --> 00:32:48,600
No, I think it's
very, very plastic.

536
00:32:48,600 --> 00:32:50,680
I think we can apply it to a lot
of different things.

537
00:32:50,680 --> 00:32:53,840
And I've got, you know, all
sorts of ideas for

538
00:32:53,840 --> 00:32:56,960
weird things. Like,
one of

539
00:32:56,960 --> 00:33:00,840
the issues, as
you mentioned in your

540
00:33:00,840 --> 00:33:03,640
last question is how we're doing
better than the binary.

541
00:33:03,640 --> 00:33:04,880
Why is it better than the
binary?

542
00:33:05,280 --> 00:33:09,520
So the thing about these
binaries is that inevitably in

543
00:33:09,520 --> 00:33:13,040
biology, everything we know from
both evolution and developmental

544
00:33:13,040 --> 00:33:16,640
biology is that all of these
things are slow, continuous

545
00:33:16,640 --> 00:33:18,920
processes, right?
We all start life as a little,

546
00:33:18,920 --> 00:33:21,920
little drop of chemicals and
eventually we end up at whatever

547
00:33:21,920 --> 00:33:24,480
we are now.
And, and you have to get there

548
00:33:24,480 --> 00:33:27,160
slowly and gradually.
And so the real question is what

549
00:33:27,160 --> 00:33:29,520
happens along the way?
What is that journey

550
00:33:29,520 --> 00:33:31,240
like, right?
You actually go through

551
00:33:31,240 --> 00:33:32,240
all the different disciplines,
right?

552
00:33:32,240 --> 00:33:34,840
So you start off in chemistry
and physics and then some kind

553
00:33:34,840 --> 00:33:37,600
of developmental Physiology and
eventually behavior science.

554
00:33:37,600 --> 00:33:39,920
And then maybe you can, you
know, psychiatry or something

555
00:33:39,920 --> 00:33:41,400
later on.
But, like, you go through

556
00:33:41,400 --> 00:33:46,520
all these different disciplines.
And it would be, it would be,

557
00:33:46,720 --> 00:33:50,360
you know, sort of remarkable to
see how this kind of formalism

558
00:33:50,360 --> 00:33:54,280
applies all along the way.
Because we have to

559
00:33:54,280 --> 00:33:56,560
be able
to tell a story of that

560
00:33:56,560 --> 00:33:59,520
transformation.
And maybe so. I think

561
00:33:59,520 --> 00:34:03,440
what people assume, the null
hypothesis for a lot of

562
00:34:03,440 --> 00:34:06,280
people is that there are phase
transitions, right?

563
00:34:06,280 --> 00:34:10,239
That that like, yeah, yeah, the
underlying sort of mechanisms

564
00:34:10,239 --> 00:34:12,400
are slow and gradual.
But there are these phase

565
00:34:12,400 --> 00:34:14,480
transitions like you do a little
bit and then like more is

566
00:34:14,480 --> 00:34:16,639
different and then you get in
some kind of, some kind of

567
00:34:16,639 --> 00:34:17,840
sigmoid or something like that,
right.

568
00:34:18,560 --> 00:34:22,040
And so maybe that's
true, but I don't see that as

569
00:34:22,040 --> 00:34:24,639
the null hypothesis.
I see that as something that has

570
00:34:24,639 --> 00:34:26,520
to be argued.
Like, if that's true, you have to

571
00:34:26,520 --> 00:34:28,159
show that.
You have to show what is it

572
00:34:28,159 --> 00:34:29,960
that's sharp?
You know, what makes a

573
00:34:29,960 --> 00:34:33,440
phase transition and, and why is
it that you can't just zoom in

574
00:34:33,440 --> 00:34:36,560
there and still see it as smooth
at a higher level of resolution?

575
00:34:36,560 --> 00:34:39,440
Like most sigmoids, you zoom in
and it's nice and flat again.
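
Mike's point about zooming in on a sigmoid is easy to check numerically. A small sketch (my own construction; the steepness `k` and the sampling windows are arbitrary choices, not anything from the conversation):

```python
import math

def logistic(x, k=50.0):
    """Steep logistic curve: nearly a step function when viewed from afar."""
    return 1.0 / (1.0 + math.exp(-k * x))

def max_jump(lo, hi, n=100):
    """Largest change between adjacent samples of the curve on [lo, hi];
    a big value means the curve looks discontinuous at this resolution."""
    xs = [lo + (hi - lo) * i / n for i in range(n + 1)]
    ys = [logistic(x) for x in xs]
    return max(abs(b - a) for a, b in zip(ys, ys[1:]))

coarse = max_jump(-1.0, 1.0)      # whole curve: one big jump, looks "binary"
fine = max_jump(-0.001, 0.001)    # zoomed in near zero: small, gradual steps
```

At coarse resolution the curve jumps almost discontinuously, like a phase transition; at the same number of samples over a window a thousand times narrower, the same curve is smooth again.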

576
00:34:40,000 --> 00:34:44,480
So anyway, this is
why I think it's important to be

577
00:34:44,480 --> 00:34:46,840
able to tell stories that aren't
binary, because I think the

578
00:34:46,840 --> 00:34:50,600
phenomenon is not binary.
Our categories are binary.

579
00:34:50,600 --> 00:34:53,360
The way people use the
word adult, you know, you want

580
00:34:53,360 --> 00:34:54,679
it.
You want it for traffic courts

581
00:34:54,679 --> 00:34:57,760
so that, you know, not every
case is

582
00:34:57,760 --> 00:35:00,360
like this long debate about what
does it mean to mature into

583
00:35:00,360 --> 00:35:02,160
personal responsibility?
Like nobody wants that.

584
00:35:02,160 --> 00:35:03,560
So fine, so we have this word
adult.

585
00:35:03,960 --> 00:35:07,520
But it basically conceals the
fact that underneath that are a

586
00:35:07,520 --> 00:35:08,840
bunch of unanswered questions
about.

587
00:35:08,840 --> 00:35:10,640
Yeah.
So how did you become a fully

588
00:35:10,640 --> 00:35:13,440
responsible human and what
happens with brain tumors and

589
00:35:13,440 --> 00:35:16,640
Twinkies and then all the things
that impinge on, on, on your

590
00:35:16,880 --> 00:35:19,000
sort of full responsibility,
and so on.

591
00:35:19,240 --> 00:35:20,600
So, so I think that's what
happens here.

592
00:35:20,600 --> 00:35:24,640
We use these binary categories.
It's fine to use them for

593
00:35:24,640 --> 00:35:28,280
expediency, but we better not
sort of forget what

594
00:35:28,280 --> 00:35:30,320
actually underwrites them.
And, and yeah.

595
00:35:30,400 --> 00:35:33,120
And so to continue
understanding that story of

596
00:35:33,120 --> 00:35:37,120
transformation across
disciplines is, I think, what

597
00:35:37,120 --> 00:35:39,400
we need to do.
Yeah, and.

598
00:35:39,680 --> 00:35:43,240
And just to add something to
that, a quick addition

599
00:35:43,240 --> 00:35:46,680
to that.
This is in a sense trailblazing

600
00:35:46,760 --> 00:35:50,720
because we are proposing
something that

601
00:35:50,920 --> 00:35:54,400
was in the
literature in some larval

602
00:35:54,400 --> 00:35:58,160
form.
We made it grow into

603
00:35:58,160 --> 00:36:00,400
something like a, like a pupa
maybe.

604
00:36:01,080 --> 00:36:06,960
And we are now waiting for the
butterfly to come out.

605
00:36:06,960 --> 00:36:10,200
And that will take effort.
And just imagine, as I'm

606
00:36:10,520 --> 00:36:12,880
thinking out loud here, but
imagine if you have something

607
00:36:12,880 --> 00:36:17,320
like this analysis or something
similar, something that's both

608
00:36:17,320 --> 00:36:22,040
quantifiable, mathematically,
conceptually coherent, but also

609
00:36:22,080 --> 00:36:25,720
empirically accessible.
Imagine you do this recursively

610
00:36:25,720 --> 00:36:29,160
for multiple scales or for
multiple systems, and then you

611
00:36:29,160 --> 00:36:33,280
would have a ledger of all the
scores you make for K or

612
00:36:33,280 --> 00:36:36,600
something similar.
And then you start seeing that

613
00:36:37,160 --> 00:36:42,040
certain combinatorial explosions
after a certain phase

614
00:36:42,040 --> 00:36:46,800
transition, a certain bent on
the sigmoid is more important

615
00:36:46,800 --> 00:36:49,240
than another.
So some there would be a

616
00:36:49,320 --> 00:36:53,160
threshold at which interesting
things happen if you start

617
00:36:53,160 --> 00:36:55,560
thinking in this way in which we
propose.

618
00:36:56,160 --> 00:37:00,000
But that means applying this
formalism, trying to think in

619
00:37:00,000 --> 00:37:03,160
these terms and trying to see
empirically what happens and get

620
00:37:03,160 --> 00:37:07,600
scores, mathematical benchmarks
for efficiency in problem

621
00:37:07,600 --> 00:37:10,600
spaces, given that you've
estimated the spaces in some

622
00:37:10,600 --> 00:37:12,480
way.
And then maybe

623
00:37:12,480 --> 00:37:17,000
you get some very interesting
insights into thresholds.

624
00:37:17,960 --> 00:37:20,320
Maybe there's nothing much
happening at two orders of

625
00:37:20,320 --> 00:37:23,000
magnitude better than chance or
five orders of magnitude better.

626
00:37:23,000 --> 00:37:27,480
But maybe at a billion
orders of magnitude, maybe

627
00:37:27,480 --> 00:37:30,400
that's something that happens
and it's, it's impressive.

628
00:37:30,400 --> 00:37:33,840
And no matter the size of the
observer that you are, if you

629
00:37:33,840 --> 00:37:36,520
are big enough, you will see
it.

630
00:37:38,840 --> 00:37:41,440
Yeah. For me,
it was going to be asking about

631
00:37:41,720 --> 00:37:45,360
a practical approach to this,
because, practically, what does

632
00:37:45,360 --> 00:37:49,840
it mean that evolution retiles
problem spaces rather than just

633
00:37:49,840 --> 00:37:53,520
speeding up searches and which
element, for example,

634
00:37:53,680 --> 00:37:57,480
constraints, let's say, do
biologists most often overlook?

635
00:37:57,480 --> 00:38:02,360
What do you guys think?
Want to take this one, Mike, or

636
00:38:02,480 --> 00:38:07,240
should I?
I can start, I guess. I

637
00:38:07,240 --> 00:38:11,360
think part of the issue
is the focus on constraints

638
00:38:11,400 --> 00:38:14,760
to begin with, in other words.
And I realize that

639
00:38:15,200 --> 00:38:18,320
folks who, you know, are
focused on this work, they're

640
00:38:18,320 --> 00:38:20,080
very careful.
And they say constraints aren't

641
00:38:20,080 --> 00:38:21,840
just about the things you can't
do and whatever.

642
00:38:22,040 --> 00:38:26,480
But still,
there's a difference between

643
00:38:27,000 --> 00:38:31,680
constraints in the sense that,
OK, I've shut off this set of

644
00:38:31,800 --> 00:38:33,920
things that are possible to you.
So now you go over here and you

645
00:38:33,920 --> 00:38:37,880
do some other things versus
really taking seriously this

646
00:38:37,880 --> 00:38:40,840
notion of enablements, right?
The idea that there are certain

647
00:38:40,840 --> 00:38:44,120
things that they don't just shut
off possibilities and they don't

648
00:38:44,120 --> 00:38:48,160
just shunt you into
more of the same just, you know,

649
00:38:48,160 --> 00:38:52,120
a little different.
I think, and

650
00:38:52,120 --> 00:38:54,080
it's pretty controversial,
I understand that.

651
00:38:54,080 --> 00:38:56,680
And there's a lot of work
to be done to flesh this out.

652
00:38:56,960 --> 00:39:01,240
But really, this notion of
getting more out than you put

653
00:39:01,240 --> 00:39:04,760
in, you know, I think biology is
amazing at this, but I also

654
00:39:04,760 --> 00:39:07,600
think you already see this in
math and some other

655
00:39:07,600 --> 00:39:10,880
places where you really
get more out than you put in.

656
00:39:10,880 --> 00:39:12,800
It's not just a constraint, it's
actually an enablement.

657
00:39:12,800 --> 00:39:17,800
And I think biology and
evolution just really

658
00:39:17,800 --> 00:39:21,760
exploit the heck out of

659
00:39:21,760 --> 00:39:25,080
these kinds of free lunches.
So I think

660
00:39:25,400 --> 00:39:31,800
that's interesting.
So, to continue the thread,

661
00:39:32,200 --> 00:39:36,960
in the original formulations by
Newell and Simon, finding new

662
00:39:36,960 --> 00:39:40,360
constraints for a problem,
which means, in a sense,

663
00:39:40,360 --> 00:39:46,280
redefining the problem space, was
in some cases found to

664
00:39:46,280 --> 00:39:50,040
open up solutions that were
not available until you do that

665
00:39:50,040 --> 00:39:55,200
retiling of the problem space.
Which means the system that

666
00:39:55,600 --> 00:39:59,360
solves the problem if it has
enough degrees of freedom.

667
00:39:59,480 --> 00:40:02,760
And again, this is a story about
the multi scale causal

668
00:40:02,760 --> 00:40:05,080
architecture of a system.
So it's a story of multi

669
00:40:05,080 --> 00:40:10,200
causation.
The system can edit

670
00:40:10,200 --> 00:40:14,880
the constraints which define the
problem space, rather than just

671
00:40:15,080 --> 00:40:18,560
iterating on operators and
applying the different sets of

672
00:40:18,560 --> 00:40:21,760
operators it has at its disposal
to traverse the space.

673
00:40:22,480 --> 00:40:26,240
And obviously some of that
description might sound vague

674
00:40:26,240 --> 00:40:30,400
and abstract, but then you can
be quite pragmatic about it and

675
00:40:30,400 --> 00:40:34,400
practical about it, because
systems have all sorts of

676
00:40:34,400 --> 00:40:38,320
feedback mechanisms and sensors
to detect their own internal

677
00:40:38,320 --> 00:40:40,920
states.
And especially this recurrent

678
00:40:40,920 --> 00:40:45,880
architecture is ideally suited
to be able to relax or enable

679
00:40:45,880 --> 00:40:48,760
some constraints or even close
other constraints.

680
00:40:48,760 --> 00:40:53,760
And there are countless examples
one could give here of

681
00:40:54,520 --> 00:40:58,280
state transitions or dynamics
that are possible only under

682
00:40:58,280 --> 00:41:00,720
certain conditions and not
others.

683
00:41:00,880 --> 00:41:03,640
For example, just having a tad
bit more dopamine in your

684
00:41:03,640 --> 00:41:06,920
mesocortical tract will do
wonders for the refresh rate of

685
00:41:06,920 --> 00:41:11,080
your cognition, whereas not
having that dopamine will make

686
00:41:11,080 --> 00:41:14,120
you less able to retile the
constraints of your abstract

687
00:41:14,120 --> 00:41:16,800
spaces when you try
to visit them.

688
00:41:17,560 --> 00:41:21,840
And that's just the
presence of a single compound

689
00:41:21,920 --> 00:41:27,160
or any scale-specific
thing can, in the

690
00:41:27,160 --> 00:41:31,240
problem-space formulation,
have the effect of working on the

691
00:41:31,240 --> 00:41:35,160
constraints rather than working
on the exploration itself, which

692
00:41:35,200 --> 00:41:39,520
is something distinct because
the constraints define, I

693
00:41:39,600 --> 00:41:43,840
mean formally here, what's
admissible as a set of

694
00:41:43,840 --> 00:41:46,920
moves, or actually what's
not admissible.

695
00:41:46,920 --> 00:41:50,720
But yeah.
So that means if you open some

696
00:41:50,720 --> 00:41:55,360
moves that were not available
before, it's like you would open

697
00:41:55,960 --> 00:41:59,080
territories in the problem space
that were closed off, sort

698
00:41:59,080 --> 00:42:03,600
of you explore new paths of the
world and it's an open world,

699
00:42:03,720 --> 00:42:06,920
not a closed world anymore.
And that means you might find

700
00:42:06,920 --> 00:42:09,720
solutions that are way more
efficient to reach your end

701
00:42:09,720 --> 00:42:15,160
state, which is your goal.
So yeah, that's the

702
00:42:15,160 --> 00:42:16,880
story.
And to bring that back to

703
00:42:16,880 --> 00:42:22,040
evolution, evolution does this
automatically in a sense.

704
00:42:22,280 --> 00:42:25,440
But then the question is does
evolution do this

705
00:42:25,560 --> 00:42:30,680
agentively, or are we as
observers able to

706
00:42:30,680 --> 00:42:34,960
see that evolution worked on
some constraints and was able to

707
00:42:34,960 --> 00:42:38,080
find solutions which were not
available until that work on

708
00:42:38,080 --> 00:42:41,240
constraints was done.
So we are able to recognize, at

709
00:42:41,240 --> 00:42:44,320
the abstractions with which it
operates, also in the fashioning

710
00:42:44,320 --> 00:42:47,640
of our abstractions, we're able to
recognize this was not just search:

711
00:42:47,960 --> 00:42:50,120
evolution searched for some
solutions for this.

712
00:42:50,520 --> 00:42:52,760
But then it found a way to
relax constraints.

713
00:42:52,760 --> 00:42:56,560
And one simple example that comes
to mind is the development

714
00:42:56,560 --> 00:43:01,520
of multiple ion channels.
So the fact that you

715
00:43:01,520 --> 00:43:05,320
started with, I think it was
sodium or it was

716
00:43:05,320 --> 00:43:08,400
potassium first, or what was it?
Sodium first, and then you had

717
00:43:08,400 --> 00:43:10,840
calcium, and then you
had the third one.

718
00:43:11,000 --> 00:43:15,360
Each of these things is
like, I'm relaxing a bit the

719
00:43:15,360 --> 00:43:17,720
constraints of the
problem space.

720
00:43:17,720 --> 00:43:23,080
I'm enabling an explosion of
possible other moves which then

721
00:43:24,160 --> 00:43:27,200
ultimately open new vistas of
solutions that are

722
00:43:27,200 --> 00:43:29,840
profitable for the
system in question.

723
00:43:30,600 --> 00:43:33,520
Yeah, that's actually
an awesome example because,

724
00:43:34,040 --> 00:43:37,560
right.
So you evolve a voltage-gated

725
00:43:37,560 --> 00:43:39,320
potassium channel or whatever,
right?

726
00:43:39,640 --> 00:43:42,240
And so now you've got yourself a
voltage-gated current

727
00:43:42,240 --> 00:43:45,480
conductance, AKA a transistor.
You have a couple of those, you

728
00:43:45,480 --> 00:43:48,720
have a logic gate, all of the
truth tables, right?

729
00:43:48,720 --> 00:43:51,800
The fact that NAND is special,
like all this cool stuff, you

730
00:43:51,840 --> 00:43:53,440
get that for free.
You don't need to evolve that.

731
00:43:53,560 --> 00:43:57,080
You get all those properties for
free because you create, right?

732
00:43:57,080 --> 00:44:02,920
So that's the enablement, right?
You get all that stuff because

733
00:44:02,920 --> 00:44:07,160
you made the appropriate
interface to those dynamics.
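
The "for free" enablement above can be sketched directly: model a voltage-gated conductance as a simple switch, wire two in series to get NAND, and the rest of Boolean logic follows because NAND is functionally complete. The names are illustrative stand-ins, not a model of real channel biophysics.

```python
# NAND from switch-like elements, and the other gates "for free".
def nand(a, b):
    """Output goes low only when both switch-like elements conduct."""
    return 0 if (a and b) else 1

# NAND alone reconstructs the other gates, which is what makes it special:
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))

# The whole truth table comes along once NAND exists:
truth_table = {(a, b): (and_(a, b), or_(a, b)) for a in (0, 1) for b in (0, 1)}
```

Nothing beyond the single gating element had to be "evolved" here; the logic is inherited from the mathematics.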

734
00:44:08,920 --> 00:44:10,640
Yeah, I would love to.
So, So one of the things, so

735
00:44:10,640 --> 00:44:12,880
Robert, I don't remember if
we've talked about this at all,

736
00:44:12,880 --> 00:44:14,960
but one of the things that
there's, I have a whole list of

737
00:44:14,960 --> 00:44:16,400
things we're going to need to
apply this to.

738
00:44:16,400 --> 00:44:19,080
But but one of the things you
know talking about new spaces is

739
00:44:19,080 --> 00:44:22,040
have you seen our
stuff on iterated prisoner's

740
00:44:22,040 --> 00:44:24,120
dilemma?
Yeah, there isn't.

741
00:44:24,240 --> 00:44:27,680
Yeah.
So yeah, so this is elections

742
00:44:27,680 --> 00:44:33,240
work where you have these agents
just playing a prisoner's

743
00:44:33,240 --> 00:44:35,000
dilemma with each other, right,
in a spatialized thing.

744
00:44:35,200 --> 00:44:39,440
But they have the option.
They can defect and cooperate,

745
00:44:39,440 --> 00:44:40,880
but they can also merge and
split.

746
00:44:41,480 --> 00:44:44,240
So what happens over time is
that these things
actually tend to merge.
747
00:44:44,240 --> 00:44:46,600
actually tend to merge.
So you get these blobs
forming, right?
748
00:44:46,600 --> 00:44:49,480
blobs forming, right?
That changes everything.

749
00:44:49,480 --> 00:44:52,120
When you do that, you
of course change the payoff

750
00:44:52,200 --> 00:44:54,600
table because now the number of
players is different.

751
00:44:54,840 --> 00:44:57,160
So it completely changes,
right.

752
00:44:57,160 --> 00:44:59,800
It's, it's very, you know, sort
of recurrent and all that.

753
00:45:00,360 --> 00:45:02,160
And there's
something fun that

754
00:45:02,160 --> 00:45:04,640
happens at the end, which I
didn't realize until we actually

755
00:45:04,640 --> 00:45:06,160
did the experiment.
Of course I should

756
00:45:06,160 --> 00:45:09,160
have thought of it.
You look at
the reward that
757
00:45:09,160 --> 00:45:11,920
look at the, the reward that
they're getting from it, right.

758
00:45:11,920 --> 00:45:15,160
It turns out that, you know,
coming together is

759
00:45:15,160 --> 00:45:16,840
good.
And so, you know, so the reward

760
00:45:16,840 --> 00:45:19,400
goes up and up, and then
eventually just like plummets at

761
00:45:19,400 --> 00:45:21,040
the end, it just drops.
I said, what the hell is this?

762
00:45:21,040 --> 00:45:23,440
Why is it
suddenly dropping when things

763
00:45:23,440 --> 00:45:25,320
are getting big?
And then I realized what happens

764
00:45:25,680 --> 00:45:29,840
when everybody merges
and you get so big that you take

765
00:45:29,840 --> 00:45:32,360
up most of your world.
There's nobody to play with.

766
00:45:32,920 --> 00:45:35,040
And if there's nobody to play,
there's no

767
00:45:35,040 --> 00:45:37,280
one, you know,
to cooperate with, and

768
00:45:37,480 --> 00:45:39,840
so that's it.
And then everybody, you know,

769
00:45:39,840 --> 00:45:41,400
then the reward
plummets.

770
00:45:41,840 --> 00:45:44,000
And so there's a couple of
different ways

771
00:45:44,000 --> 00:45:45,600
you can handle that.
And we said this in the

772
00:45:45,600 --> 00:45:48,240
paper, like one thing you could
do is you could just say, well,

773
00:45:48,280 --> 00:45:49,880
heat death of the universe.
That's, that's it.

774
00:45:49,880 --> 00:45:51,360
I guess you've eaten everything
there is to eat.

775
00:45:51,360 --> 00:45:52,880
The whole thing dies and that's
the end of that.

776
00:45:53,080 --> 00:45:56,560
That's one way to do it.
Another way to do it is to have

777
00:45:56,560 --> 00:46:00,360
like a Big Bang,
big crunch cycle because

778
00:46:00,400 --> 00:46:03,360
everybody gets together.
Then there's the stress of not

779
00:46:03,360 --> 00:46:05,480
having anyone to play with.
So you're all bored

780
00:46:05,480 --> 00:46:07,600
and kind of freaking out.
And so you fragment, right?

781
00:46:07,600 --> 00:46:10,520
It's like a stress-induced
multiple,

782
00:46:10,680 --> 00:46:12,960
you know, a dissociative state,
basically, right.

783
00:46:12,960 --> 00:46:14,440
So you break up into pieces
again.

784
00:46:14,520 --> 00:46:16,440
Then you spend a bunch of time
coming back together again.

785
00:46:16,440 --> 00:46:19,080
So it's a Big Bang, big crunch
oscillation, OK.
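
A toy stand-in for the dynamic just described, not the lab's actual simulation: assume a blob of s agents earns s squared from cooperative play (superlinear synergy from merging, an assumption made up here for illustration), but only while at least one other blob exists to play against. Reward then climbs as blobs merge and collapses at total merger.

```python
# Toy merge dynamic (illustrative assumptions, not the actual model):
# quadratic synergy per blob, zero reward once nobody is left to play with.
def world_reward(blob_sizes):
    if len(blob_sizes) < 2:
        return 0  # one blob fills the world: nobody left to play with
    return sum(s ** 2 for s in blob_sizes)

trajectory = [world_reward(b) for b in ([1] * 8, [2] * 4, [4, 4], [8])]
# reward climbs with merging, then plummets at total merger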

786
00:46:19,360 --> 00:46:21,920
But the third thing you can
do, and we haven't done this

787
00:46:21,920 --> 00:46:23,120
yet, but I think it would be
really neat.

788
00:46:23,120 --> 00:46:25,800
And now I'm thinking we
should be using

789
00:46:25,800 --> 00:46:30,200
this formalism to analyze
all that data. Which is to say

790
00:46:30,200 --> 00:46:34,440
that, well, once you do merge,
you now have access to a new

791
00:46:34,440 --> 00:46:36,000
space you didn't have access to
before.

792
00:46:36,160 --> 00:46:39,680
So as single cells,
you're solving problems in

793
00:46:39,680 --> 00:46:43,280
metabolic space and, you know,
transcriptional space, the

794
00:46:43,280 --> 00:46:46,640
physiological space, but you get
together into a big sheet and

795
00:46:46,640 --> 00:46:49,800
well, now you have anatomical
morphospace that you can

796
00:46:49,800 --> 00:46:52,480
play with because now you can
have stresses and tensions

797
00:46:52,480 --> 00:46:54,960
that bend you into, you know,
a different shape, and you

798
00:46:54,960 --> 00:46:56,520
know, a tube or a ball or
something.

799
00:46:56,520 --> 00:46:59,120
And now you get to explore.
So there's a new space, which in

800
00:46:59,120 --> 00:47:00,960
our system they didn't
have access to, but

801
00:47:00,960 --> 00:47:03,960
it's kind of like it's related
to what you were talking about.

802
00:47:03,960 --> 00:47:06,680
Now, you know, at some point,
you break into

803
00:47:06,680 --> 00:47:11,240
an entirely new space and you
reap the benefits of the laws

804
00:47:11,240 --> 00:47:12,480
that are there.
And they're not just

805
00:47:12,480 --> 00:47:14,120
constraints.
I mean, there's some constraints

806
00:47:14,120 --> 00:47:15,320
like you can't go faster than
that.

807
00:47:15,760 --> 00:47:17,880
But there are other
things that are enablements like

808
00:47:17,880 --> 00:47:20,920
oh, if you do this, well, suddenly
you get handed the, you know,

809
00:47:21,080 --> 00:47:23,320
truth tables or other
facts.

810
00:47:25,720 --> 00:47:28,240
There are certainly no free
lunches in this universe, but

811
00:47:28,240 --> 00:47:32,120
then in some
spaces, there are wonders to be

812
00:47:32,120 --> 00:47:34,640
discovered that are not
available until you, you go

813
00:47:34,640 --> 00:47:37,480
there.
And ultimately it's degrees of

814
00:47:37,480 --> 00:47:40,280
freedom and degrees of free
energy that you

815
00:47:40,400 --> 00:47:42,800
access.
So if you think of your

816
00:47:42,800 --> 00:47:46,160
skin as capturing sunlight,
there's only so much you can

817
00:47:46,160 --> 00:47:47,760
capture.
But imagine if your skin was the

818
00:47:47,760 --> 00:47:51,000
size of a continent, you'd be
able to use that

819
00:47:51,000 --> 00:47:55,600
energy way more to fuel
the many things inside.

820
00:47:55,960 --> 00:48:01,080
So this is a very
visceral example, but

821
00:48:01,080 --> 00:48:02,960
it's good to think in
this way.

822
00:48:02,960 --> 00:48:07,160
When you blob, as
Mike was saying in

823
00:48:07,160 --> 00:48:11,400
the experiment, when you blob
and you grow, certainly new

824
00:48:11,400 --> 00:48:13,920
problems appear, but also new
opportunities appear.

825
00:48:14,200 --> 00:48:20,680
And that creates a recursive
reinforcement on the parts

826
00:48:20,680 --> 00:48:24,120
to start acting well together
rather than not acting well

827
00:48:24,120 --> 00:48:28,520
together, and then on other parts
that are external to you to start

828
00:48:28,560 --> 00:48:30,640
acting well together themselves
and BLOB again.

829
00:48:30,640 --> 00:48:33,280
Because then you colonize
the spaces and get the

830
00:48:33,280 --> 00:48:37,120
resources.
And probably there are

831
00:48:37,120 --> 00:48:39,680
some ways to estimate an
equilibrium state for

832
00:48:39,680 --> 00:48:42,960
all these things.
And it would be fantastic to

833
00:48:42,960 --> 00:48:46,960
apply this also to some, if you
want, political analysis.

834
00:48:46,960 --> 00:48:51,320
Like, what is the optimal number
of states that can profitably

835
00:48:51,480 --> 00:48:55,440
run independently as nations on
Earth, given Earth's resources

836
00:48:55,440 --> 00:48:58,600
and size for the planet that it
is, and the degrees of free

837
00:48:58,600 --> 00:49:01,960
energy it can amass now,
where we are on the

838
00:49:01,960 --> 00:49:07,400
Kardashev scale, 0.75 or
somewhere about there.

839
00:49:07,760 --> 00:49:11,120
So if you try to think
and get crazy about this, and I

840
00:49:11,120 --> 00:49:14,800
know Mike likes to get
crazy, maybe there are

841
00:49:14,800 --> 00:49:20,320
ways to extrapolate and
think of some bounds.

842
00:49:20,320 --> 00:49:25,400
However tight or loose,

843
00:49:25,800 --> 00:49:29,040
proper regimes of equilibrium
for players to exist in

844
00:49:29,040 --> 00:49:31,000
a resource-bound
space.

845
00:49:32,320 --> 00:49:35,360
Which I think is super
critical, because people ask me

846
00:49:35,360 --> 00:49:37,440
all the time.
So people hear my story of

847
00:49:37,440 --> 00:49:39,640
cancer, right, and
multicellularity, and what

848
00:49:39,640 --> 00:49:41,760
happened?
They say, oh, well, we

849
00:49:41,760 --> 00:49:43,320
can extend this to the social
sphere.

850
00:49:43,320 --> 00:49:45,720
We'll just gap-junction
us all together.

851
00:49:45,720 --> 00:49:47,800
So we'll, you know, have
this mind meld.

852
00:49:47,800 --> 00:49:50,040
We'll lose our individual
memories, and it'll

853
00:49:50,040 --> 00:49:52,040
be fantastic.
Like, yeah, no, that's not going

854
00:49:52,040 --> 00:49:53,720
to be fantastic.
That's been

855
00:49:53,720 --> 00:49:55,080
tried.
It doesn't work well.

856
00:49:55,440 --> 00:50:00,840
And I think what it
tells you is that maybe we

857
00:50:00,840 --> 00:50:04,760
can do a systematic search of
policies that do in

858
00:50:04,760 --> 00:50:07,040
fact do what we want.
You know,

859
00:50:07,080 --> 00:50:09,960
there's no reason why we have to
copy the biological one.

860
00:50:09,960 --> 00:50:13,560
But something like it,
some kind of a search for

861
00:50:13,560 --> 00:50:14,920
policies.
Yeah.

862
00:50:15,960 --> 00:50:19,120
You know, something else that's,
you know, I wasn't going to

863
00:50:19,120 --> 00:50:22,960
bring this up, but now that you
said about how much I like crazy

864
00:50:22,960 --> 00:50:24,800
stuff, I just, just, and this is
brand new.

865
00:50:24,800 --> 00:50:27,680
I just, literally, like two
minutes ago, thought of

866
00:50:27,680 --> 00:50:29,880
this. It's probably total nonsense,
but I'll just

867
00:50:29,880 --> 00:50:32,560
say it anyway, because
you had this amazing

868
00:50:32,560 --> 00:50:37,520
analogy of our framework
actually sort of

869
00:50:37,520 --> 00:50:39,840
pupating, you know, from
the larval stage.

870
00:50:40,360 --> 00:50:43,280
So now I'm
thinking, OK, I've already been

871
00:50:43,360 --> 00:50:46,240
thinking about the

872
00:50:46,240 --> 00:50:48,400
caterpillar-butterfly story, right?
There are

873
00:50:48,400 --> 00:50:49,640
at least three perspectives
there.

874
00:50:49,640 --> 00:50:52,240
So there's the perspective of
the caterpillar, who's facing a

875
00:50:52,240 --> 00:50:55,640
singularity, and, you know,
how do you think about that,

876
00:50:55,640 --> 00:50:57,360
your end and all of that?
Then there's the

877
00:50:57,360 --> 00:51:01,080
perspective of the butterfly,
which is basically saddled with

878
00:51:01,080 --> 00:51:03,760
some memories that it has no
idea how they got there, right?

879
00:51:03,760 --> 00:51:05,480
Because you can train the
caterpillar and it persists.

880
00:51:05,600 --> 00:51:06,920
So, the
butterfly.

881
00:51:06,920 --> 00:51:09,720
Like, where am I getting
these weird memories and, and

882
00:51:09,720 --> 00:51:11,560
these weird behavioral
propensities that I didn't

883
00:51:11,560 --> 00:51:15,400
really earn in my lifetime? OK.
And then the

884
00:51:15,400 --> 00:51:17,640
much crazier thing, which is the
perspective of the memory

885
00:51:17,640 --> 00:51:19,560
itself.
So you train the

886
00:51:19,560 --> 00:51:22,400
caterpillar, right?
So the perspective of the memory

887
00:51:22,400 --> 00:51:26,600
pattern, living in a cognitive
medium and knowing that, well,

888
00:51:26,640 --> 00:51:28,680
I'm not going to survive as a
caterpillar memory.

889
00:51:28,680 --> 00:51:30,920
That's useless to the butterfly.
If I'm going to survive, I need

890
00:51:30,920 --> 00:51:33,280
to remap myself onto this
new world, right.

891
00:51:33,760 --> 00:51:36,320
But now, what you
just said, I hadn't thought of

892
00:51:36,320 --> 00:51:40,120
this before, but maybe we ought
to even think about what that

893
00:51:40,120 --> 00:51:43,000
process looks like for
scientific frameworks or

894
00:51:43,000 --> 00:51:46,400
scientific theories.
So, as the theory.

895
00:51:46,400 --> 00:51:49,440
So, like, what is it like?
So here's the totally crazy

896
00:51:49,440 --> 00:51:50,520
part.
So I haven't thought about this

897
00:51:50,520 --> 00:51:52,080
at all, or whether we can do

898
00:51:52,080 --> 00:51:55,080
anything with this or not?
But like, what is it like for
Newtonian mechanics to
899
00:51:55,080 --> 00:51:58,480
for Newtonian mechanics to
pupate into
an Einsteinian formulation?
900
00:51:58,480 --> 00:52:00,400
an Einstein formulation?
Like what is that?

901
00:52:00,400 --> 00:52:02,600
Like, what has to be remapped?
What has to be trashed?

902
00:52:02,800 --> 00:52:05,400
But again, from the
perspective of

903
00:52:05,400 --> 00:52:07,680
the theory itself, right?
What, what happens?

904
00:52:08,280 --> 00:52:10,000
I don't know; that's
what I was thinking

905
00:52:10,000 --> 00:52:11,120
about.
As soon as you said that for our

906
00:52:11,120 --> 00:52:13,000
framework, I was like, wow, you
could probably do that for

907
00:52:13,000 --> 00:52:17,640
almost any, you know, persistent
system.

908
00:52:18,560 --> 00:52:22,600
Well, for sure. And
physical things, physical

909
00:52:22,600 --> 00:52:25,240
patterns of organization are
one thing.

910
00:52:25,240 --> 00:52:27,840
But as you said, you
can do abstract

911
00:52:27,840 --> 00:52:31,680
patterns as well.
What application strategy

912
00:52:31,680 --> 00:52:35,120
for certain abstract patterns is
most useful in a certain

913
00:52:35,120 --> 00:52:36,880
space and not in another
space?

914
00:52:37,480 --> 00:52:41,040
Which parts of the
pattern get replicated and

915
00:52:41,040 --> 00:52:44,760
preserved, and which get
discarded, even insertion into a

916
00:52:44,760 --> 00:52:47,360
new space, relaxing of
constraints, adding of

917
00:52:47,360 --> 00:52:50,320
constraints and stuff like that.
The difficulty I see

918
00:52:50,320 --> 00:52:53,800
with what you say, Mike, is
sort of the God-like

919
00:52:53,800 --> 00:52:57,880
perspective, for us to
be able to say a priori.

920
00:52:57,880 --> 00:53:01,160
I mean, we can say with the
benefit of hindsight what

921
00:53:01,160 --> 00:53:03,480
needed to be discarded from
Newtonian mechanics to get

922
00:53:03,480 --> 00:53:09,040
into understanding relativity.
But you cannot do that facing

923
00:53:09,040 --> 00:53:12,360
the future.
You probably are bounded, because

924
00:53:12,960 --> 00:53:16,440
this is at least for the scale
and size of observers that we

925
00:53:16,440 --> 00:53:19,040
are.
It's probably something we can't do;

926
00:53:19,440 --> 00:53:22,320
the way you want to do it for
scientific theories assumes

927
00:53:22,680 --> 00:53:25,640
you've already traversed the
space.

928
00:53:26,400 --> 00:53:29,080
I think that the
invitation would be: can you

929
00:53:29,080 --> 00:53:33,920
shortcut the search by
doing this kind of meta-level

930
00:53:33,920 --> 00:53:38,360
modeling, and then maybe you
devise policies; maybe you do

931
00:53:38,360 --> 00:53:40,960
enough of this hindsight analysis
and you can devise policies for

932
00:53:40,960 --> 00:53:44,840
thinking.
Or maybe, and again, I

933
00:53:44,840 --> 00:53:48,080
have no idea if this would work,
but maybe there's some sort

934
00:53:48,080 --> 00:53:54,440
of least-action kind of thing
where, yes, you can sort of go

935
00:53:54,440 --> 00:53:56,800
piece by piece or you could,
right.

936
00:53:56,800 --> 00:53:59,080
So from some perspective, it
looks like, hey, you took the

937
00:53:59,080 --> 00:54:02,720
least-action path, which means
you kind of already knew what

938
00:54:02,720 --> 00:54:04,880
the rest of it was because you
can't do that locally.

939
00:54:05,080 --> 00:54:08,040
So maybe there's some kind
of a version of a, you know, a

940
00:54:08,040 --> 00:54:12,240
least-action version of this
where, in some

941
00:54:12,240 --> 00:54:15,160
sense, you have, you know,
something about the full path

942
00:54:15,160 --> 00:54:16,560
even if you don't know it
locally.

943
00:54:17,480 --> 00:54:20,280
Or yeah, you can extrapolate
from previous examples

944
00:54:20,280 --> 00:54:22,160
of local paths.
You can extrapolate something

945
00:54:22,440 --> 00:54:25,520
about the topology of
the full path and

946
00:54:25,520 --> 00:54:27,920
then ask. Yeah, that would
work.

947
00:54:27,920 --> 00:54:34,840
OK. So, no hesitation
with this generation of

948
00:54:36,160 --> 00:54:38,760
the podcast.
But yeah, so David, this

949
00:54:38,760 --> 00:54:42,640
was us just thinking out loud,
embarrassing ideas.

950
00:54:43,280 --> 00:54:45,800
Look, that's better.
This is exactly what it's for.

951
00:54:45,800 --> 00:54:47,840
Hopefully people listen to this,
watch this and build on this

952
00:54:47,840 --> 00:54:50,160
idea and something new
comes about.

953
00:54:50,200 --> 00:54:51,760
You know what?
This actually reminds me of the

954
00:54:51,760 --> 00:54:54,720
conversation I had with Richard
Watson and Josh Bongard about

955
00:54:54,720 --> 00:54:58,440
coming together, why things come
together, things uniting in that

956
00:54:58,440 --> 00:55:00,160
sense.
And then, Mike,

957
00:55:00,160 --> 00:55:02,960
when you brought up that story
now it reminded me of

958
00:55:02,960 --> 00:55:04,560
Pluribus.
And you know, when all these

959
00:55:04,560 --> 00:55:09,160
minds get together due to this,
this RNA or DNA, what happens if

960
00:55:09,160 --> 00:55:12,880
they get everyone?
Does that system eventually shut

961
00:55:12,880 --> 00:55:14,520
down?
Anything you want to add to

962
00:55:14,520 --> 00:55:15,960
that?
There's this two side fire.

963
00:55:18,400 --> 00:55:21,000
Well, I think, and I haven't
seen the whole thing.

964
00:55:21,000 --> 00:55:24,880
I've seen a few episodes.
Yes, I had lots of

965
00:55:24,880 --> 00:55:29,240
questions, which is like, you
know, what the heck are they

966
00:55:29,240 --> 00:55:31,160
doing?
Like, OK, so they've all

967
00:55:31,160 --> 00:55:33,000
gotten together.
What do they do all day?

968
00:55:33,000 --> 00:55:35,520
Are there other goals
besides getting together?

969
00:55:35,560 --> 00:55:37,560
Like, what is the goal of
the collective now?

970
00:55:37,560 --> 00:55:39,960
Are they still
working on getting to Mars, or

971
00:55:39,960 --> 00:55:43,840
do they not care anymore?
And that question, it's

972
00:55:43,840 --> 00:55:47,520
funny, because I spent
some time with some

973
00:55:47,520 --> 00:55:51,480
Buddhist colleagues and, you know,
some very sort

974
00:55:51,480 --> 00:55:53,640
of high-level scholars from that
community.

975
00:55:54,080 --> 00:55:57,000
And I asked them, what is the
success plan?

976
00:55:57,240 --> 00:56:01,680
In other words, suppose that you
are amazingly effective and

977
00:56:01,680 --> 00:56:04,800
everybody becomes a high-level,
sort of enlightened

978
00:56:04,800 --> 00:56:08,040
meditator and all of that.
What does the planet look like

979
00:56:08,040 --> 00:56:10,960
then?
Like, now it's you guys

980
00:56:10,960 --> 00:56:13,160
helping us, you know, get out of
our whatever.

981
00:56:13,440 --> 00:56:17,160
Once everybody does that,
then what are we now?

982
00:56:17,160 --> 00:56:19,200
Are we still going into
space?

983
00:56:19,200 --> 00:56:22,160
Are we still doing fusion?
Are we still, you know,

984
00:56:22,160 --> 00:56:26,320
understanding biology, or not?
Again, I never

985
00:56:26,320 --> 00:56:28,560
quite, you know, and the answer
usually is, well, you do whatever

986
00:56:28,560 --> 00:56:32,200
you want to do.
But that, you know... yeah, I'm

987
00:56:32,200 --> 00:56:33,920
not satisfied with that.
I think.

988
00:56:33,920 --> 00:56:37,360
I think for these kind of major
transformations, you have to be

989
00:56:37,360 --> 00:56:40,640
able to say what the future then
looks like having having

990
00:56:40,640 --> 00:56:43,080
achieved that transformation.
So I don't know, maybe the

991
00:56:43,080 --> 00:56:44,920
show will answer that.
Maybe it won't end.

992
00:56:47,080 --> 00:56:48,560
You don't want to watch
it, Robert?

993
00:56:49,960 --> 00:56:51,960
I used to believe it.
Yeah, yeah, it's on Apple TV.

994
00:56:51,960 --> 00:56:53,600
I wanted to watch it, but I
didn't.

995
00:56:53,600 --> 00:56:56,600
But I do have something
similar in mind: the Unity

996
00:56:57,080 --> 00:56:59,840
entity from Rick and Morty,
if you know the show.

997
00:57:00,600 --> 00:57:07,800
So Unity is probably even better
because they do interesting

998
00:57:07,800 --> 00:57:10,480
things.
There, at least, yeah.

999
00:57:10,920 --> 00:57:14,360
OK, let's move back
to beyond brains for a

1000
00:57:14,360 --> 00:57:16,440
moment.
Let's talk about some of the

1001
00:57:16,600 --> 00:57:20,640
empirical examples, guys, because
within this paper, you guys

1002
00:57:20,640 --> 00:57:24,440
really went in deep when it
comes to the examples you give

1003
00:57:24,440 --> 00:57:27,600
empirically. When it comes to
the amoeba and the planarian

1004
00:57:27,600 --> 00:57:30,800
models, they show massive
efficiency gains.

1005
00:57:30,800 --> 00:57:34,360
I think it was 10 to the 21.
I can't remember exactly the

1006
00:57:34,360 --> 00:57:38,520
amount, without neurons.
So what assumptions about

1007
00:57:38,520 --> 00:57:42,080
learning, memory, or
representation do these

1008
00:57:42,080 --> 00:57:44,320
challenge, Robert?
Do you want to start?

1009
00:57:45,920 --> 00:57:50,200
Yeah, sure.
So to recall something we were

1010
00:57:50,200 --> 00:57:54,440
saying in the beginning, the
very act of defining a problem

1011
00:57:54,440 --> 00:57:59,360
space is an intelligence task
for the definer, for

1012
00:57:59,360 --> 00:58:02,960
the observer.
So once you have enough

1013
00:58:03,880 --> 00:58:07,120
empirical data and detail at
your disposal to define the

1014
00:58:07,120 --> 00:58:11,200
problem space and you can do
these estimations, as we did,

1015
00:58:12,160 --> 00:58:16,960
to exemplify that this
works, that sort of opens

1016
00:58:17,240 --> 00:58:19,640
new possibilities for
you to consider.

1017
00:58:19,640 --> 00:58:23,680
So in the case of, let me
see the exact

1018
00:58:23,680 --> 00:58:25,200
numbers.
So in the case of

1019
00:58:25,200 --> 00:58:32,600
planarian regeneration of the
head, we had,

1020
00:58:32,600 --> 00:58:37,560
yeah, 10 to the 21 times more
efficient.

1021
00:58:38,040 --> 00:58:44,400
than the most conservative
null model of

1022
00:58:44,400 --> 00:58:49,640
random search we could find.
So even that was

1023
00:58:49,640 --> 00:58:54,960
extravagantly fine, but we made
very conservative assumptions.
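
An illustrative sketch, with placeholder numbers rather than the paper's actual estimates, of how such an efficiency factor can be framed: a blind-search null model over N equally likely configurations needs on the order of N trials, so a system that reliably succeeds in k trials is roughly N/k times more efficient.

```python
import math

# Placeholder framing of an efficiency factor versus a random-search null
# model; the 10**21 figure here is illustrative, echoing the discussion.
def efficiency_factor(log10_space_size, observed_attempts=1):
    """Orders of magnitude of advantage over uniform random search."""
    return log10_space_size - math.log10(observed_attempts)

orders_of_magnitude = efficiency_factor(21)  # a 10**21-state space, one-shot success
```

Making the null model more conservative (a smaller assumed space) only shrinks this number, which is why a conservative estimate gives a lower bound on the real efficiency.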

1024
00:58:54,960 --> 00:58:59,560
So we most likely underestimated
how efficient head

1025
00:58:59,560 --> 00:59:03,480
regeneration is relative to just
randomly sampling the space

1026
00:59:03,480 --> 00:59:06,920
of possible anatomical
configurations, turning on a

1027
00:59:06,920 --> 00:59:10,840
random transcriptome states
on and off.

1028
00:59:10,840 --> 00:59:16,040
It was amazing. So then
that invites questions like

1029
00:59:17,000 --> 00:59:22,000
what kind of cognition-bearing
features a system must

1030
00:59:22,000 --> 00:59:23,760
have to be able to be so
efficient.

1031
00:59:24,160 --> 00:59:27,840
And when we think of memory, we
usually extrapolate from our

1032
00:59:27,840 --> 00:59:30,080
example, from our
human experience.

1033
00:59:30,080 --> 00:59:33,120
We think, oh, I remember that.
Therefore I'm able to use

1034
00:59:33,120 --> 00:59:36,640
that stored information to
inform future processing and

1035
00:59:36,640 --> 00:59:39,200
future search for
problem solving.

1036
00:59:39,600 --> 00:59:42,360
And when we think of
memory, we think of structures

1037
00:59:42,360 --> 00:59:45,080
like, oh, the hippocampus
stores abstractions,

1038
00:59:45,080 --> 00:59:48,200
representations, sequential
activations of subsets of

1039
00:59:48,200 --> 00:59:51,640
hippocampal neurons enables me
to think of a space I

1040
00:59:51,640 --> 00:59:54,960
traversed in the past, and now
that sequential activation

1041
00:59:54,960 --> 00:59:57,560
helps me traverse this new space
that is somewhat topologically

1042
00:59:57,560 --> 00:59:59,440
similar.
So therefore I'm a better

1043
00:59:59,440 --> 01:00:03,520
problem solver because of my
hippocampus. Or what about cells?

1044
01:00:03,560 --> 01:00:07,200
What about the morphological

1045
01:00:07,200 --> 01:00:08,800
organization of a planarian
head?

1046
01:00:08,800 --> 01:00:11,480
Do they have something like the
memory equivalent?

1047
01:00:12,520 --> 01:00:16,040
It's just a way to boost your
efficiency.

1048
01:00:16,360 --> 01:00:21,840
Memory is a way to not
encode anew, not encode again

1049
01:00:22,600 --> 01:00:25,640
or rediscover again solutions
you discovered in the past,

1050
01:00:25,640 --> 01:00:30,920
or batches, parts of the
solutions that you discovered in

1051
01:00:30,920 --> 01:00:33,840
the past.
You encode an engram, a

1052
01:00:35,000 --> 01:00:38,160
compressed representation of
those and you are able to use

1053
01:00:38,160 --> 01:00:41,640
those in the future.
And now,

1054
01:00:41,680 --> 01:00:44,560
if you relax your
assumptions and you don't become

1055
01:00:44,840 --> 01:00:48,800
a chauvinist, a neuro-
chauvinist, you start thinking

1056
01:00:49,000 --> 01:00:52,640
maybe something like memory is
also available to other systems.

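[Editor's note: the point about engrams as compressed, reusable sub-solutions can be sketched with a standard memoization toy. The grid path count below is my own illustrative analogy, not anything from the paper: the cache plays the role of the stored engram, and the call counts show the efficiency gain from not rediscovering solutions.]

```python
from functools import lru_cache

calls = {"plain": 0, "cached": 0}

def paths_plain(r, c):
    """Count monotone lattice paths to (0, 0), re-deriving every subproblem."""
    calls["plain"] += 1
    if r == 0 or c == 0:
        return 1
    return paths_plain(r - 1, c) + paths_plain(r, c - 1)

@lru_cache(maxsize=None)
def paths_cached(r, c):
    """Same search, but every solved subproblem is stored and reused:
    the cache is the 'engram' in this analogy."""
    calls["cached"] += 1
    if r == 0 or c == 0:
        return 1
    return paths_cached(r - 1, c) + paths_cached(r, c - 1)

# Both searchers find the same answer, but visit wildly different numbers
# of states along the way.
assert paths_plain(10, 10) == paths_cached(10, 10)
print(calls)
```

The uncached search revisits the same sub-states hundreds of thousands of times; the cached one touches each distinct state once. That is memory purely as an efficiency booster, exactly the framing in the discussion above.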
1057
01:00:52,680 --> 01:00:56,880
Maybe something like memory
could be what helps

1058
01:00:57,120 --> 01:01:00,560
with the massive
efficiency gains we see, even

1059
01:01:00,560 --> 01:01:04,040
under conservative models, maybe
something like prediction as

1060
01:01:04,040 --> 01:01:07,040
well, is on the horizon.
Is there a way for such

1061
01:01:07,040 --> 01:01:11,200
systems to make sort of
guesses, informed guesses,

1062
01:01:11,200 --> 01:01:15,680
guesstimates of certain states
or certain trajectories or

1063
01:01:15,680 --> 01:01:19,320
policies of trajectories rather
than others, which would be the

1064
01:01:19,320 --> 01:01:21,000
equivalent of a human
prediction?

1065
01:01:21,880 --> 01:01:25,920
We do an immense amount of
predictions online as we solve

1066
01:01:25,920 --> 01:01:29,120
problems, even without our
intention.

1067
01:01:29,120 --> 01:01:32,120
We do predictions and we
estimate the results of our own

1068
01:01:32,120 --> 01:01:35,400
actions in real time.
But that doesn't mean something

1069
01:01:35,400 --> 01:01:40,920
that's smaller, slower and way
less multicellular or multi-

1070
01:01:40,920 --> 01:01:45,680
component, but still big enough
to be relevant doesn't do that.

1071
01:01:48,440 --> 01:01:52,920
Yeah, and I think, you know,
this trick of being

1072
01:01:52,920 --> 01:01:57,760
handed a bunch of engrams that
leverage past experience and

1073
01:01:57,760 --> 01:02:01,400
then being responsible for
interpreting them creatively,

1074
01:02:01,400 --> 01:02:04,040
meaning that you get to
improvise what they mean and all

1075
01:02:04,040 --> 01:02:06,160
of their effects, you have to do
all this kind of stuff.

1076
01:02:07,680 --> 01:02:09,360
This is an ancient thing.
So.

1077
01:02:09,360 --> 01:02:14,880
So as an early cell,
you got handed some DNA,

1078
01:02:15,200 --> 01:02:19,000
but you also got handed some
cytoskeletal structures.

1079
01:02:19,200 --> 01:02:22,080
You also got handed some plasma
membrane structures.

1080
01:02:22,080 --> 01:02:25,480
All of these things,
basically, you inherit;

1081
01:02:25,480 --> 01:02:28,120
you know, you don't create
them from scratch, you inherit

1082
01:02:28,120 --> 01:02:31,280
them from your
precursor, and then you pass

1083
01:02:31,280 --> 01:02:32,680
them on, right.
So these are all

1084
01:02:32,680 --> 01:02:35,600
things you pass on.
Yeah.

1085
01:02:35,600 --> 01:02:39,720
So I suspect this is
where the brain got

1086
01:02:39,720 --> 01:02:43,640
its amazing tricks is by speed
optimizing and sort of shifting

1087
01:02:43,640 --> 01:02:49,520
around some
details around that

1088
01:02:49,520 --> 01:02:51,120
whole thing of having to do
that.

1089
01:02:51,120 --> 01:02:53,640
I think all cells have to do
that, and they reap the

1090
01:02:53,640 --> 01:02:55,720
benefits of it.
And also, you know, the

1091
01:02:55,720 --> 01:02:59,120
fact that not only is the
environment going to change, but

1092
01:02:59,200 --> 01:03:02,200
your own substrate is an
unreliable material.

1093
01:03:02,200 --> 01:03:04,520
Like in biology, you never know
how, you know, you're going

1094
01:03:04,520 --> 01:03:05,960
to get mutated over the long
term.

1095
01:03:05,960 --> 01:03:08,920
You don't know how many copies
of anything you have in terms of

1096
01:03:08,920 --> 01:03:11,400
proteins or anything else.
You have to do the best you can

1097
01:03:11,400 --> 01:03:14,560
with the memories you have.
And maybe you can take them like

1098
01:03:14,560 --> 01:03:16,760
your ancestors took them, or
maybe not.

1099
01:03:16,760 --> 01:03:21,080
Or maybe you know, you have all
the, the DNA of a of a frog, but

1100
01:03:21,080 --> 01:03:22,400
you're going to be a Xenobot
instead.

1101
01:03:22,400 --> 01:03:24,720
Or you have a human genome, but
you're going to be an Anthrobot.

1102
01:03:25,040 --> 01:03:27,960
And that's how it is, you know,
and you have other

1103
01:03:27,960 --> 01:03:30,360
resources to pull from.
You have new genes; in the case of the

1104
01:03:30,360 --> 01:03:34,280
Anthrobots, 9,000 differently
expressed genes, right, like

1105
01:03:34,280 --> 01:03:37,480
half the genome and not because
anybody touched your DNA or

1106
01:03:37,480 --> 01:03:39,680
because you have some weird
nanomaterials or something.

1107
01:03:39,680 --> 01:03:41,520
No, because you have a different
lifestyle, that's all.

1108
01:03:42,160 --> 01:03:46,120
And which apparently you can
tell within, you know, within

1109
01:03:46,120 --> 01:03:48,200
a matter of days.
So that life's going

1110
01:03:48,200 --> 01:03:50,880
to be different.
So, yeah, I think that's

1111
01:03:50,880 --> 01:03:53,720
interesting.
And, you know, on the Physarum,

1112
01:03:53,720 --> 01:03:56,800
and like one of my favorite
things. So, Nirosha Murugan

1113
01:03:56,960 --> 01:03:59,920
in my group, when she was
here, we did this thing where

1114
01:04:00,240 --> 01:04:04,920
you plate the slime mold and
then you put one glass disc on

1115
01:04:04,920 --> 01:04:07,560
one side, 3 glass discs on the
other side, right?

1116
01:04:07,680 --> 01:04:10,960
And this thing can basically
reliably go to the heavier,

1117
01:04:11,400 --> 01:04:13,960
well, the larger strain angle in
the agar that they're

1118
01:04:13,960 --> 01:04:16,040
on.
But my favorite part of all

1119
01:04:16,040 --> 01:04:18,520
that, if you look at the videos,
my favorite part of all of it is

1120
01:04:18,520 --> 01:04:22,640
that for the first, I don't
remember 4 hours, 5 hours, a few

1121
01:04:22,640 --> 01:04:27,080
hours, it grows sort
of homogeneously.

1122
01:04:27,160 --> 01:04:29,840
It doesn't make a choice yet.
And the whole time it's like,

1123
01:04:30,040 --> 01:04:32,760
tugging on the agar
and sending out these waves and

1124
01:04:32,760 --> 01:04:35,600
feeling the strain angle back.
So it's doing all

1125
01:04:35,600 --> 01:04:39,160
this and for four hours it's
getting the sort of lay of the

1126
01:04:39,160 --> 01:04:41,840
land of its environment.
And then boom, it goes to the,

1127
01:04:41,840 --> 01:04:44,400
to the heavier side.
So it's got a thinking phase and

1128
01:04:44,560 --> 01:04:47,440
then it's got an action phase.
And yeah, you know,

1129
01:04:47,440 --> 01:04:50,240
like, I kept staring at that
initial phase, like, I can see you,

1130
01:04:50,240 --> 01:04:52,400
I can see you thinking.
I can see you integrating the,

1131
01:04:52,760 --> 01:04:56,360
all the sensory
information you got from a 360°

1132
01:04:56,640 --> 01:04:59,160
area.
And then for some reason that we

1133
01:04:59,160 --> 01:05:00,600
don't know, it prefers the
heavier end.

1134
01:05:00,600 --> 01:05:03,840
I don't know why it does
that. But then, OK, now

1135
01:05:03,840 --> 01:05:06,960
I know what's
around me at a distance of

1136
01:05:06,960 --> 01:05:09,560
centimeters, you know, like
centimeters away.

1137
01:05:09,560 --> 01:05:11,840
And these things
are quite tiny.

1138
01:05:11,840 --> 01:05:14,720
The cells are anyway.
And yeah, at a distance of

1139
01:05:14,720 --> 01:05:16,480
centimeters.
OK, now I know what my world

1140
01:05:16,480 --> 01:05:17,640
looks like.
I'm going this way.

1141
01:05:18,360 --> 01:05:20,840
And so it's just, you know,
that's the kind of

1142
01:05:20,840 --> 01:05:25,000
stuff we would like to be able
to decode.

1143
01:05:25,600 --> 01:05:27,760
This is sort of like, you know,
like Robert was saying, is

1144
01:05:27,760 --> 01:05:28,840
there a
memory?

1145
01:05:28,840 --> 01:05:31,400
Like in the case of the
Planaria, we've been able to see

1146
01:05:31,400 --> 01:05:33,400
it. Like, we can see the memory
now and we can see the

1147
01:05:33,400 --> 01:05:37,040
bioelectric state that says one
head or two heads or no

1148
01:05:37,040 --> 01:05:39,720
heads, in fact.
And so we would like to do that

1149
01:05:39,720 --> 01:05:42,160
for all of these systems just to
be able to, you know, get in

1150
01:05:42,160 --> 01:05:43,480
there and read their
mind.

1151
01:05:44,600 --> 01:05:48,360
Yeah, I think that reminds me of
something from when I was reading.

1152
01:05:48,400 --> 01:05:51,920
Just the other day, I was
rereading one of Tolkien's

1153
01:05:51,920 --> 01:05:54,080
books; I think
it was the second Lord of the

1154
01:05:54,080 --> 01:05:56,120
Rings book.
And I was thinking about

1155
01:05:56,120 --> 01:05:58,480
how these Ents
communicate, and this

1156
01:05:58,480 --> 01:06:00,560
intelligence and the way you
portrayed it is so beautiful

1157
01:06:00,560 --> 01:06:02,560
when it comes to my
understanding of intelligence.

1158
01:06:02,880 --> 01:06:05,840
I mean, that slow, progressive
pace, the same way trees

1159
01:06:05,840 --> 01:06:09,080
actually do communicate.
And we often underestimate the

1160
01:06:09,080 --> 01:06:11,040
value of their intelligence, in
a sense.

1161
01:06:13,200 --> 01:06:16,160
For sure.
And this is

1162
01:06:17,000 --> 01:06:20,920
very much following from
what Mike said: he was able, and

1163
01:06:21,000 --> 01:06:24,080
the team was able to observe
the planarian case.

1164
01:06:24,080 --> 01:06:28,320
And now, in
the slime mold, the

1165
01:06:28,360 --> 01:06:31,080
problem is to reverse engineer it:
OK, where is the memory?

1166
01:06:31,080 --> 01:06:34,480
Where are all
these cognitive toolboxes that

1167
01:06:34,480 --> 01:06:38,120
we might use?
An interesting thing that

1168
01:06:38,120 --> 01:06:40,440
happens when you start
thinking in this way is that you

1169
01:06:40,440 --> 01:06:45,400
can see the environmental
engineering that systems might

1170
01:06:45,400 --> 01:06:48,480
make to offload cognitive
effort.

1171
01:06:49,360 --> 01:06:51,880
Our memory.
We think of our memory as being

1172
01:06:51,880 --> 01:06:56,480
in here, in our brains and
whatever, but our external

1173
01:06:56,480 --> 01:06:59,000
memories are the same.
We are doing environmental

1174
01:06:59,000 --> 01:07:02,560
design to scaffold our
cognition and think better

1175
01:07:02,560 --> 01:07:06,120
and solve problems better as
a multicellular, multi-

1176
01:07:06,120 --> 01:07:10,600
individual society of
intelligent agents.

1177
01:07:10,840 --> 01:07:13,160
A data center is an external
memory.

1178
01:07:14,280 --> 01:07:18,600
So, once you start
thinking in this way, you might

1179
01:07:18,600 --> 01:07:21,520
start asking questions: how is
this system engineering its

1180
01:07:21,520 --> 01:07:24,560
environment?
Or how is this system at a scale

1181
01:07:24,560 --> 01:07:29,080
within an organism engineering
the environment such that it can

1182
01:07:29,080 --> 01:07:33,520
leverage certain ways in which
it could shortcut its searches?

1183
01:07:34,800 --> 01:07:39,480
What's the memory like?
If the system were to engineer

1184
01:07:39,480 --> 01:07:43,200
its environment to do niche
sculpting for cognitive

1185
01:07:43,200 --> 01:07:46,640
offloading, what would memory
look like there?

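[Editor's note: environmental engineering as external memory is the standard concept of stigmergy, and it can be sketched in a few lines. Everything below is invented for illustration: an agent that can read marks left in its world reaches a goal far faster than one doing a blind random walk over the same terrain.]

```python
import random

random.seed(1)

SIZE, FOOD = 30, 25
trail = [0.0] * SIZE  # the environment doubles as a shared, external memory

def forage(use_trail):
    """Walk from cell 0 until reaching FOOD; return the number of steps."""
    pos, steps = 0, 0
    while pos != FOOD:
        steps += 1
        right = min(pos + 1, SIZE - 1)
        left = max(pos - 1, 0)
        if use_trail and trail[right] != trail[left]:
            # Read the environmental marking and follow its gradient.
            pos = right if trail[right] > trail[left] else left
        else:
            pos = random.choice([left, right])  # blind random walk
    return steps

baseline = forage(use_trail=False)  # no external memory available
for cell in range(FOOD + 1):        # a past success marks the route:
    trail[cell] = cell              # the marking grows toward the food
informed = forage(use_trail=True)   # later searches read the marks
print(baseline, informed)
```

The informed agent's "memory" lives entirely in the environment, not in the agent, which is the kind of niche-sculpted cognitive offloading the discussion is pointing at.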
1186
01:07:46,920 --> 01:07:50,040
This is not even a possible
question under the current

1187
01:07:50,040 --> 01:07:53,800
frameworks, yeah.
Yeah, yeah.

1188
01:07:53,800 --> 01:07:55,960
I think that's really, that's
really critical.

1189
01:07:55,960 --> 01:07:59,400
And even, you know, you can get
really weird with it and say,

1190
01:07:59,400 --> 01:08:02,360
OK, from the perspective of a
memory that wants to stick

1191
01:08:02,360 --> 01:08:04,600
around, right.
So a pattern that would like to

1192
01:08:04,600 --> 01:08:07,520
be persistent. So some
memories are like that, right?

1193
01:08:07,520 --> 01:08:10,200
Some memories are hard to get
rid of once they take

1194
01:08:10,200 --> 01:08:12,600
hold.
And so can you see the
actual niche construction?
1195
01:08:12,600 --> 01:08:14,840
actual niche construction?
And I think if I'm not wrong, I

1196
01:08:14,840 --> 01:08:17,439
think people who study stress
and PTSD and things like that,

1197
01:08:18,439 --> 01:08:21,200
they see changes like having too
much of a certain kind of

1198
01:08:21,200 --> 01:08:23,319
thought makes it too easy to
keep having that thought.

1199
01:08:23,520 --> 01:08:26,160
Like that actually changes the
cognitive architecture, right?

1200
01:08:26,439 --> 01:08:30,000
So, so you could say that, yeah,
there's a little niche

1201
01:08:30,000 --> 01:08:32,680
construction going on there,
that some of these things would

1202
01:08:32,680 --> 01:08:37,680
like to be persistent.
And how much of the properties

1203
01:08:37,680 --> 01:08:40,960
of what has to happen to make a
butterfly from a caterpillar?

1204
01:08:41,200 --> 01:08:45,160
How much of that is basically
due to the niche

1205
01:08:45,160 --> 01:08:47,359
construction of the
cognitive, of the

1206
01:08:47,359 --> 01:08:49,000
patterns that didn't want to go
right?

1207
01:08:49,160 --> 01:08:52,120
Like, basically, they exerted
effort to not just be

1208
01:08:52,120 --> 01:08:54,279
dropped.
And so, some of what

1209
01:08:54,279 --> 01:08:56,880
ends up happening, that
remapping, like, does the

1210
01:08:56,880 --> 01:09:00,439
caterpillar, does the
butterfly really need to remap a

1211
01:09:00,439 --> 01:09:01,800
bunch of old caterpillar
memories?

1212
01:09:01,800 --> 01:09:04,880
I'm not sure that it does.
I think some of that, some

1213
01:09:04,880 --> 01:09:07,479
of that might
not be in the interest of

1214
01:09:07,479 --> 01:09:10,080
the butterfly at all, or
maybe it is.

1215
01:09:10,200 --> 01:09:12,960
Yeah, we don't know.
I think that's something

1216
01:09:13,120 --> 01:09:15,439
perfect to ask Mark Solms.
I think that can be something

1217
01:09:15,439 --> 01:09:18,120
he's very intrigued by.
Could you actually bring that up

1218
01:09:18,120 --> 01:09:22,080
with him the next time?
Guys, when it comes to

1219
01:09:22,080 --> 01:09:24,080
consciousness studies in
general, all of you are doing so

1220
01:09:24,080 --> 01:09:26,439
many different things from so
many different angles, different

1221
01:09:26,439 --> 01:09:28,359
labs, different parts of
the world.

1222
01:09:28,960 --> 01:09:31,399
It's so incredible.
And Mike, I think the last time

1223
01:09:31,399 --> 01:09:33,920
we spoke, and I think it might
have been actually a few years

1224
01:09:33,920 --> 01:09:35,960
ago, we called it like this
Avengers of mind coming

1225
01:09:35,960 --> 01:09:37,880
together, doing all this crazy
stuff.

1226
01:09:38,439 --> 01:09:41,080
But when it comes to different
people working together from

1227
01:09:41,080 --> 01:09:43,800
different parts of the world,
the assumption is when you write

1228
01:09:43,800 --> 01:09:45,960
a paper like this together, you
both might still have the same

1229
01:09:45,960 --> 01:09:48,359
view on certain things.
So consciousness is one of them.

1230
01:09:48,479 --> 01:09:52,279
In the paper you guys don't
directly bring it up, but how do

1231
01:09:52,279 --> 01:09:55,320
both of you define and
differentiate between the word

1232
01:09:55,320 --> 01:09:59,080
consciousness and intelligence?
Are these separate or are they

1233
01:09:59,360 --> 01:10:03,280
gradient or scale-relative?
Anyone can start.

1234
01:10:06,040 --> 01:10:09,320
One on the read line.
All right, mine will be quick

1235
01:10:09,320 --> 01:10:11,680
because I'm, I'm not a
consciousness scientist and

1236
01:10:11,680 --> 01:10:14,960
that's not my primary, you know,
it's not my primary thing.

1237
01:10:14,960 --> 01:10:17,480
So it's pretty
amateur.

1238
01:10:17,720 --> 01:10:22,560
What I will say is this: I
think that in all of these

1239
01:10:22,760 --> 01:10:25,200
kinds of systems, relating to
all of these kinds of systems,

1240
01:10:25,200 --> 01:10:29,880
there are three basic kinds
of perspectives.

1241
01:10:30,120 --> 01:10:32,720
There's a third person
perspective, which is that of

1242
01:10:32,720 --> 01:10:36,760
conventional science, which is
when I study it, I'm not changed

1243
01:10:36,760 --> 01:10:38,400
by it.
I do an experiment.

1244
01:10:38,400 --> 01:10:39,880
I'm still the same before and
after.

1245
01:10:40,040 --> 01:10:42,160
Maybe I've learned something,
which is actually a

1246
01:10:42,160 --> 01:10:44,200
bit of a change, but
fundamentally I'm

1247
01:10:44,200 --> 01:10:47,240
still me, and we can all
sort of look at it externally

1248
01:10:47,240 --> 01:10:48,600
and say something about it.
Fine.

1249
01:10:49,120 --> 01:10:51,880
Then there's the second
person perspective, which is

1250
01:10:52,040 --> 01:10:54,240
that you're
trying to communicate

1251
01:10:54,480 --> 01:10:57,080
instructions and trying to get
it to do what you want it to do.

1252
01:10:57,200 --> 01:11:01,480
So that includes engineering,
rewiring, hacking, social

1253
01:11:01,480 --> 01:11:06,560
engineering, communication,
giving a good talk, writing an

1254
01:11:06,560 --> 01:11:09,880
effective book, like all of that
kind of stuff is, is trying to

1255
01:11:09,880 --> 01:11:11,800
get the system into a
different state.

1256
01:11:11,800 --> 01:11:14,680
You're trying to push it, so
there's a communication there.

1257
01:11:14,680 --> 01:11:17,000
And then there's the
first person perspective.

1258
01:11:17,000 --> 01:11:20,680
So this is where I think

1259
01:11:20,680 --> 01:11:22,480
consciousness comes in.
I think it's a first

1260
01:11:22,480 --> 01:11:24,480
person perspective.
I mean, that

1261
01:11:24,480 --> 01:11:27,880
part's not new.
The only thing I'll say about it

1262
01:11:27,880 --> 01:11:31,000
is, and you know, you
maybe have seen some of my

1263
01:11:31,000 --> 01:11:34,720
latest stuff on this:
these patterns from

1264
01:11:34,720 --> 01:11:37,360
this latent space, which you
might call the Platonic space,

1265
01:11:37,360 --> 01:11:40,240
maybe not.
What I think at this point is

1266
01:11:40,240 --> 01:11:44,200
that consciousness is basically
what we call the perspective

1267
01:11:44,200 --> 01:11:46,520
from that space looking out into
the physical world.

1268
01:11:46,840 --> 01:11:49,440
In other words, I don't think we
are fundamentally physical

1269
01:11:49,440 --> 01:11:51,680
beings that occasionally get
impinged upon by some

1270
01:11:51,680 --> 01:11:52,920
mathematical pattern or
something.

1271
01:11:52,920 --> 01:11:56,360
I think the important
thing about us is

1272
01:11:56,360 --> 01:11:58,480
the pattern.
We are the pattern, and we come

1273
01:11:58,480 --> 01:12:01,680
through various interfaces, for
example, a human body and

1274
01:12:01,760 --> 01:12:04,440
consciousness is what we call
the perspective from that space

1275
01:12:04,480 --> 01:12:06,920
out into the physical world.
And then of course, we could

1276
01:12:06,920 --> 01:12:08,560
look at each other within the
physical world.

1277
01:12:08,560 --> 01:12:11,040
And that's the third person, you
know, that's conventional science.

1278
01:12:11,360 --> 01:12:15,600
And I think the thing about
real consciousness studies is

1279
01:12:15,600 --> 01:12:18,720
that you're not the same before
and after, right?

1280
01:12:18,720 --> 01:12:21,960
This is like, you know, we're
talking about people who really

1281
01:12:21,960 --> 01:12:23,560
do experiments.
So what is that?

1282
01:12:23,560 --> 01:12:27,000
That's meditation.
That's psychedelics.

1283
01:12:27,000 --> 01:12:30,040
So that's, you know,
psychonauts of various

1284
01:12:30,040 --> 01:12:32,000
types like that.
That, I think, is a key part of

1285
01:12:32,000 --> 01:12:34,720
it, which is, you don't
have to do that when you do

1286
01:12:34,720 --> 01:12:41,320
other kinds of science.
I mean the discussion

1287
01:12:41,320 --> 01:12:44,880
on consciousness would require
multiple podcasts

1288
01:12:44,880 --> 01:12:49,040
to even start.
So, just a side remark

1289
01:12:49,040 --> 01:12:51,080
there.
Mike, that was anything but

1290
01:12:51,080 --> 01:12:55,320
amateurish. What you said
was possibly a revolutionary

1291
01:12:56,040 --> 01:12:59,440
train of thought that was not
entertained until recently until

1292
01:12:59,440 --> 01:13:01,840
you introduced it.
So anything but amateurish, I

1293
01:13:01,840 --> 01:13:06,520
would say.
Now, talking from the

1294
01:13:06,520 --> 01:13:11,080
perspective of someone that
does know the lay of the

1295
01:13:11,080 --> 01:13:14,880
land in mainstream
consciousness science, I would

1296
01:13:14,880 --> 01:13:20,840
say that the core of the
consensus is that consciousness

1297
01:13:20,840 --> 01:13:22,720
is not the same thing as
intelligence.

1298
01:13:23,200 --> 01:13:26,320
And if you press me to
define consciousness and say,

1299
01:13:26,320 --> 01:13:30,560
OK, what is consciousness?
I would say something like this.

1300
01:13:31,240 --> 01:13:34,640
I do not think we can give a real
definition, in the philosophical

1301
01:13:34,640 --> 01:13:36,800
sense of the word.
A real definition is something

1302
01:13:36,800 --> 01:13:38,920
that specifies the essence of
something.

1303
01:13:39,200 --> 01:13:42,280
So when you give a real
definition of a phenomenon, you

1304
01:13:42,280 --> 01:13:46,280
are able to give a description
of the true or essential or the

1305
01:13:47,000 --> 01:13:48,920
fundamental properties of that
phenomenon.

1306
01:13:48,920 --> 01:13:50,880
That's capturing that real
definition.

1307
01:13:51,360 --> 01:13:54,400
That's sort of the philosophical
definition of a real definition.

1308
01:13:54,880 --> 01:13:56,880
Now, when you want to give a
real definition of

1309
01:13:56,880 --> 01:14:00,760
consciousness, you might run
into the following problem.

1310
01:14:01,200 --> 01:14:07,920
It's probably not theoretically
coherent to define it in terms

1311
01:14:07,920 --> 01:14:14,720
of more basic terms.
So you start

1312
01:14:14,720 --> 01:14:19,160
with the inability of
defining it.

1313
01:14:19,160 --> 01:14:22,680
And I use the word real
definition for a reason.

1314
01:14:22,680 --> 01:14:25,360
I'm not saying an operational
definition, you can do that.

1315
01:14:26,160 --> 01:14:30,280
I'm not saying any type of
working assumptions where you

1316
01:14:30,280 --> 01:14:33,040
equate something with a proxy.
No, I'm saying a real

1317
01:14:33,040 --> 01:14:37,400
definition, saying that
consciousness can be described

1318
01:14:37,440 --> 01:14:41,680
without remainder in
terms of this set of properties,

1319
01:14:41,920 --> 01:14:47,200
essences, whatever, natures,
whatever, and the description

1320
01:14:47,280 --> 01:14:51,360
of that set will not involve any
reference to consciousness.

1321
01:14:52,040 --> 01:14:54,800
This is my very convoluted way
of saying that consciousness is

1322
01:14:54,800 --> 01:14:58,480
a basic feature, as
fundamental as anything can

1323
01:14:58,480 --> 01:15:01,760
be.
And I am sure Mike has his

1324
01:15:01,760 --> 01:15:04,080
own take and I'm sure other
people have their own take.

1325
01:15:04,280 --> 01:15:08,080
And I'm sure a lot of people
would disagree

1326
01:15:08,080 --> 01:15:11,720
with me.
But I think consciousness is the

1327
01:15:11,720 --> 01:15:14,760
thing that defines, not the
thing that is defined.

1328
01:15:15,760 --> 01:15:19,120
Consciousness is as fundamental
a thing as anything can get.

1329
01:15:20,080 --> 01:15:24,880
And once you start accepting
that, and it's a very,

1330
01:15:24,880 --> 01:15:28,360
very big ask because it has
several implications for

1331
01:15:28,360 --> 01:15:30,360
the ontologies that you are able
to build.

1332
01:15:30,360 --> 01:15:33,680
And they'll be primarily
idealistic or platonistic in

1333
01:15:33,680 --> 01:15:38,120
nature, as Mike was saying.
If you accept that, you see

1334
01:15:38,120 --> 01:15:41,160
that the problem is
how do you define

1335
01:15:41,160 --> 01:15:43,560
everything else in terms of
consciousness, not the other way

1336
01:15:43,560 --> 01:15:47,080
around.
And then intelligence comes into

1337
01:15:47,080 --> 01:15:50,760
the picture as being, as we
said in the beginning, a more

1338
01:15:50,760 --> 01:15:55,520
operationalizable construct and
as being something that at least

1339
01:15:55,520 --> 01:15:59,760
as the kind of observers that we
are, we often observe as

1340
01:15:59,760 --> 01:16:04,520
occurring coincidentally at the
same time in the same conditions

1341
01:16:04,520 --> 01:16:09,000
with outward manifestations or
proxies of consciousness,

1342
01:16:09,280 --> 01:16:12,400
which we define operationally as
something it is like to

1343
01:16:12,400 --> 01:16:15,560
experience.
But then what something it is

1344
01:16:15,560 --> 01:16:18,560
like to experience gets
operationalized as: it behaves in

1345
01:16:18,560 --> 01:16:20,440
this way, it reports in this
way.

1346
01:16:20,440 --> 01:16:23,760
If it doesn't report, it seems
to be competent in this way.

1347
01:16:24,320 --> 01:16:27,560
Competent in this way starts to
sound an awful lot like it's

1348
01:16:27,560 --> 01:16:29,320
intelligent.
Oh, there you go.

1349
01:16:29,320 --> 01:16:32,480
Intelligence is a proxy to infer
consciousness.

1350
01:16:32,480 --> 01:16:35,120
But why?
Because in our experience, we

1351
01:16:35,320 --> 01:16:37,080
often saw the correlation of the
two.

1352
01:16:37,560 --> 01:16:41,440
But is a locked-in syndrome
patient who is absolutely not

1353
01:16:41,440 --> 01:16:45,160
displaying intelligent behavior,
at least outwardly, not

1354
01:16:45,160 --> 01:16:48,640
conscious?
Well, that's not true.

1355
01:16:48,640 --> 01:16:51,280
Locked-in syndrome is
defined clinically as the

1356
01:16:51,280 --> 01:16:53,880
presence of consciousness in
the

1357
01:16:53,880 --> 01:16:58,040
absence of motor output.
And there are

1358
01:16:58,040 --> 01:16:59,760
several reasons why that's
the case.

1359
01:17:00,560 --> 01:17:04,040
There are also more difficult
reasons, more difficult cases

1360
01:17:04,040 --> 01:17:09,360
like islands in otherwise
lesioned cortical hemispheres.

1361
01:17:10,280 --> 01:17:14,480
Until recently, there was a
serious

1362
01:17:14,480 --> 01:17:19,600
hypothesis that there may be trapped
islands of experience within

1363
01:17:21,240 --> 01:17:24,760
the deafferented hemisphere.
And the group of Marcello

1364
01:17:24,760 --> 01:17:29,760
Massimini in 2024 explored this,
and they did this in

1365
01:17:29,800 --> 01:17:32,920
children.
And they saw that that

1366
01:17:32,920 --> 01:17:37,360
hemisphere does not exhibit any
of the dynamical signs of

1367
01:17:37,360 --> 01:17:39,480
the presence of experience that
we usually know.

1368
01:17:39,480 --> 01:17:43,200
So it's mostly like a weird form
of sleep, of deep sleep.

1369
01:17:44,320 --> 01:17:47,640
And that's not consistent with
the presence of experience.

1370
01:17:47,920 --> 01:17:53,440
So, to get back to consciousness
and intelligence, most people

1371
01:17:53,440 --> 01:17:55,200
would say that they
are dissociable.

1372
01:17:56,320 --> 01:17:58,360
A lot of those people would also
say that they are doubly

1373
01:17:58,360 --> 01:18:04,440
dissociable.
And probably all of those people

1374
01:18:04,440 --> 01:18:10,440
would say that intelligence is a
proxy or a good way to start

1375
01:18:10,440 --> 01:18:14,120
thinking, or some way to
ground discussions about

1376
01:18:14,120 --> 01:18:18,240
consciousness.
But almost none, or just a few,

1377
01:18:18,240 --> 01:18:21,040
of those people, perhaps the same
subset, would say that

1378
01:18:21,040 --> 01:18:26,320
consciousness is fundamental.
Maybe this is a weird way to

1379
01:18:26,320 --> 01:18:30,160
reply to your question, but
I think that that's sort

1380
01:18:30,160 --> 01:18:33,200
of my estimate of what the
distribution of consensus is

1381
01:18:33,200 --> 01:18:35,640
when asked.
Yeah, yeah, I mean, I

1382
01:18:35,640 --> 01:18:38,440
completely agree with that, that
it is fundamental.

1383
01:18:38,440 --> 01:18:40,960
It's the background for anything
else that we want to say around

1384
01:18:40,960 --> 01:18:45,840
definitions. I find it
amazing; it just seems

1385
01:18:45,840 --> 01:18:49,000
completely implausible to me,
this idea that, yeah, you know,

1386
01:18:49,120 --> 01:18:51,640
the universe was sort of
mindless for a long time and

1387
01:18:51,640 --> 01:18:53,560
then boom, eventually like
something happened.

1388
01:18:53,560 --> 01:18:56,080
OK, now you've got consciousness.
I know that's not

1389
01:18:56,080 --> 01:18:59,800
an argument, but it just
seems like that has

1390
01:18:59,800 --> 01:19:02,280
a very low a priori
probability to me.

1391
01:19:02,760 --> 01:19:06,800
And what I think we don't know,
and I don't know how we're going

1392
01:19:06,800 --> 01:19:10,720
to find this out necessarily.
But what we don't know is to

1393
01:19:10,720 --> 01:19:13,480
what extent does this
consciousness track

1394
01:19:13,480 --> 01:19:16,760
intelligence?
Like, it seems to, but

1395
01:19:16,760 --> 01:19:19,520
as Robert was just saying, like
I think mostly it seems to

1396
01:19:19,520 --> 01:19:22,200
because, because that's what
we're using for the detector.

1397
01:19:22,440 --> 01:19:26,600
And so, you know, if you buy into

1398
01:19:26,600 --> 01:19:28,520
your tools too much, then
everything starts to look like

1399
01:19:28,520 --> 01:19:30,120
they work.
Well, of course they do.

1400
01:19:30,560 --> 01:19:34,240
So maybe in a lot of
biologicals, maybe those things

1401
01:19:34,240 --> 01:19:37,680
go hand in hand nicely, I guess
maybe.

1402
01:19:37,680 --> 01:19:40,840
But as we start to
get weirder and weirder and

1403
01:19:40,840 --> 01:19:46,160
make beings that are just
not like anything that we've

1404
01:19:46,160 --> 01:19:51,520
evolved before, whether that be
fully synthetic agents or what

1405
01:19:51,520 --> 01:19:54,560
kind of hybrid, chimeric things,
just things that have never

1406
01:19:54,560 --> 01:19:58,000
been around, to what extent do
those things still go together?

1407
01:19:58,000 --> 01:20:01,120
You know, to what extent do they
do they have to go together?

1408
01:20:01,360 --> 01:20:03,960
And I'm not sure, like, I don't
know how we're going to get the

1409
01:20:03,960 --> 01:20:06,320
answer to that, but I think it's
a really interesting question.

1410
01:20:08,400 --> 01:20:12,440
And just think of the
current state: we have

1411
01:20:12,440 --> 01:20:15,600
to debate people to even relax
the definition of intelligence

1412
01:20:15,600 --> 01:20:18,720
to be able to apply it to
nonconventional systems.

1413
01:20:19,440 --> 01:20:24,960
Then imagine what an obstacle
it is to convince people that

1414
01:20:24,960 --> 01:20:27,800
consciousness might be present
in systems where even the

1415
01:20:27,800 --> 01:20:30,720
definition of intelligence has
skeptics who are hesitant

1416
01:20:30,720 --> 01:20:35,320
to extend it.
So what happens when you accept

1417
01:20:35,680 --> 01:20:39,800
that something that is prima
facie dumb according to your own

1418
01:20:39,800 --> 01:20:42,680
definition might be quite
experientially rich?

1419
01:20:43,400 --> 01:20:47,440
And then what happens if you
start recognizing alien forms of

1420
01:20:47,440 --> 01:20:51,000
intelligence, radically
non-mainstream forms of intelligence,

1421
01:20:51,000 --> 01:20:55,000
as being instantiated, and
then you accept that

1422
01:20:55,000 --> 01:20:58,600
intelligence is a way to get to
experience.

1423
01:20:58,600 --> 01:21:00,800
It's some form of proxy to
experience.

1424
01:21:01,240 --> 01:21:04,760
You collapse the mapping that
you usually have because that

1425
01:21:04,760 --> 01:21:08,360
mapping was constructed based
on mainstream forms of

1426
01:21:08,360 --> 01:21:10,960
intelligence that you were
exposed to in the past.

1427
01:21:10,960 --> 01:21:14,240
So let's say the evolution of
human and animal

1428
01:21:14,240 --> 01:21:18,240
organisms.
But then what if that decoupling

1429
01:21:18,520 --> 01:21:23,240
or coupling doesn't work the
same way in the case of

1430
01:21:23,320 --> 01:21:26,120
alien non mainstream
intelligences?

1431
01:21:26,800 --> 01:21:30,840
This is a real question.
Or, like, if you were to think

1432
01:21:30,840 --> 01:21:33,920
of intelligence, assuming you
are generous enough to

1433
01:21:33,920 --> 01:21:36,560
grant us the assumptions of
our framework.

1434
01:21:37,400 --> 01:21:39,920
What's next when it comes to
consciousness?

1435
01:21:40,800 --> 01:21:45,280
So the deliberate reason why we
did not include

1436
01:21:45,280 --> 01:21:49,600
consciousness was
just to not need to

1437
01:21:49,600 --> 01:21:54,240
fight so many uphill battles.
We had that big battle

1438
01:21:54,240 --> 01:21:57,680
uphill of convincing people
there is this way of thinking

1439
01:21:57,680 --> 01:22:00,040
about intelligence.
And lo and behold, it might be

1440
01:22:00,040 --> 01:22:02,560
fruitful even though it
irritates certain assumptions.

1441
01:22:03,200 --> 01:22:05,400
Imagine if you added
consciousness to the mix.

1442
01:22:07,080 --> 01:22:11,320
That's actually what we
wanted to write initially, Mike,

1443
01:22:11,320 --> 01:22:12,840
if you remember about that
pattern.

1444
01:22:13,880 --> 01:22:17,760
Well, we still will,
but it's so funny.

1445
01:22:17,760 --> 01:22:20,080
I mean, you're right.
It's like you can't

1446
01:22:20,080 --> 01:22:23,200
fight every battle at once.
And I get these emails all the

1447
01:22:23,200 --> 01:22:26,120
time from people who are like,
you're such a coward.

1448
01:22:26,120 --> 01:22:28,440
You know, you never mention
consciousness in your papers.

1449
01:22:28,440 --> 01:22:31,600
You're always talking around
it. And it's like, yes, I

1450
01:22:31,600 --> 01:22:33,840
know, I know it's there.
But you can't do

1451
01:22:33,840 --> 01:22:35,840
everything at once.
I can barely get the, you know...

1452
01:22:35,840 --> 01:22:40,400
Somebody, in a

1453
01:22:40,520 --> 01:22:44,080
recent review of a paper, was
completely triggered by the

1454
01:22:44,080 --> 01:22:48,040
notion of a problem space, and
that

1455
01:22:48,040 --> 01:22:52,160
any biologicals were
traversing some kind of problem

1456
01:22:52,160 --> 01:22:55,040
space.
I mean, this is, by now,

1457
01:22:55,040 --> 01:22:58,120
an old engineering notion;
there's nothing magic here, you

1458
01:22:58,120 --> 01:23:00,320
know, but even that like
triggers people like crazy.

1459
01:23:00,320 --> 01:23:01,840
And then, you think I was going
to fold

1460
01:23:01,840 --> 01:23:04,040
consciousness into all this
stuff until it got, you know,

1461
01:23:04,440 --> 01:23:05,880
further settled.
No way.

1462
01:23:07,640 --> 01:23:09,800
And it is.
It is a double bind.

1463
01:23:10,320 --> 01:23:13,760
This conservative mechanism of
filtering that we have in

1464
01:23:13,760 --> 01:23:18,800
academia doesn't allow you
to get as much as you

1465
01:23:18,800 --> 01:23:22,440
would want out.
But if you don't do that, and if

1466
01:23:22,440 --> 01:23:25,280
you go the other, non-mainstream
routes, you publish blog

1467
01:23:25,280 --> 01:23:28,480
posts like Mike does.
There's not even the same degree

1468
01:23:28,480 --> 01:23:31,560
of traction and credibility that
you get for your own ideas.

1469
01:23:31,720 --> 01:23:35,240
It is unfortunate that the forum
of ideas that we have is the one

1470
01:23:35,240 --> 01:23:38,400
filtered so heavily by, by the
academic peer review process.

1471
01:23:38,400 --> 01:23:41,960
I mean, there are reasons for
this, but it also means

1472
01:23:41,960 --> 01:23:45,560
that the
generative abilities of

1473
01:23:45,560 --> 01:23:48,760
our brains are throttled
by this process of filtering.

1474
01:23:48,760 --> 01:23:53,720
I would want
to write 50- or 60-page-long

1475
01:23:53,720 --> 01:23:56,480
papers and just have the
reviewers accept them and say, oh my

1476
01:23:56,480 --> 01:23:59,440
God, this is fantastic.
Let's have all of this together.

1477
01:23:59,440 --> 01:24:02,040
No, the most we can do is 4K
words.

1478
01:24:02,440 --> 01:24:07,080
Yeah, yeah, yeah.
Well, they've left us with

1479
01:24:07,080 --> 01:24:08,400
one paper that's really
excellent.

1480
01:24:08,400 --> 01:24:10,800
And I think let's get
back to it now.

1481
01:24:11,720 --> 01:24:15,240
So with regards to the cognition
all the way down, let's talk

1482
01:24:15,240 --> 01:24:17,760
about some implications for AI,
biology, and mind.

1483
01:24:18,080 --> 01:24:21,400
Michael, this one's for you.
If intelligence is efficient

1484
01:24:21,400 --> 01:24:25,040
search, what does this imply for
current AI?

1485
01:24:25,240 --> 01:24:28,960
What is it missing or reshaping?
And how do we design or

1486
01:24:29,000 --> 01:24:34,200
evaluate it?
Yeah, boy, a lot of things.

1487
01:24:34,960 --> 01:24:38,080
I mean, on the one hand, we
all know that the

1488
01:24:38,080 --> 01:24:42,680
current favorite models of
AI use massive amounts of

1489
01:24:42,680 --> 01:24:44,440
energy, right?
Compared to what the

1490
01:24:44,440 --> 01:24:50,880
biologicals do.
Do I think that? Well,

1491
01:24:50,960 --> 01:24:55,640
OK, I have kind of a weird
take on what

1492
01:24:55,640 --> 01:24:58,800
the AI is doing.
I think we need to have a lot

1493
01:24:58,800 --> 01:25:03,440
of humility around knowing what
something is just because you

1494
01:25:03,440 --> 01:25:05,840
made it, right?
You would think, from having

1495
01:25:05,840 --> 01:25:08,040
babies, we would already
be comfortable

1496
01:25:08,240 --> 01:25:10,360
about this.
It's what Dan Dennett used to

1497
01:25:10,360 --> 01:25:11,920
call competence without
comprehension.

1498
01:25:12,440 --> 01:25:15,120
You don't really know what
you've made. I used to ask my wife

1499
01:25:15,120 --> 01:25:16,200
all the time, like, what's going
on?

1500
01:25:16,400 --> 01:25:20,080
If you could just tell me what's
going on in there, my job would

1501
01:25:20,080 --> 01:25:22,800
be so much farther ahead, right?
But you can't.

1502
01:25:23,080 --> 01:25:26,360
And I think people assume, and
I've had people say this to me.

1503
01:25:26,640 --> 01:25:28,520
Oh, there are no
emergent features.

1504
01:25:28,520 --> 01:25:30,200
I know what this is.
It's just linear algebra.

1505
01:25:30,200 --> 01:25:31,280
I write these things, I make
them.

1506
01:25:31,280 --> 01:25:33,520
I know what it is.
But we don't even know what

1507
01:25:33,520 --> 01:25:35,560
bubble sort is really, I don't
think.

1508
01:25:35,920 --> 01:25:40,240
And so, when we make these

1509
01:25:40,240 --> 01:25:43,840
things, we make assumptions: it
doesn't have

1510
01:25:43,840 --> 01:25:45,240
memory.
It's a feed-forward thing.

1511
01:25:45,240 --> 01:25:50,600
And, you know, it's
just doing this, it's

1512
01:25:50,600 --> 01:25:52,680
just doing that.
I don't think we have any idea,

1513
01:25:52,920 --> 01:25:54,520
right?
I think

1514
01:25:55,280 --> 01:25:57,560
there are very
significant ways in which our

1515
01:25:57,760 --> 01:26:01,600
picture of algorithms and
of what machines are is

1516
01:26:01,600 --> 01:26:04,440
letting us down in many
cases, even in very simple

1517
01:26:04,440 --> 01:26:08,880
cases.

1518
01:26:08,920 --> 01:26:11,120
So let's just take language
models, for example: the things

1519
01:26:11,120 --> 01:26:14,840
that these things say might have
zero to do with, I don't know,

1520
01:26:14,840 --> 01:26:17,560
but it could be anywhere
from zero to a high

1521
01:26:17,560 --> 01:26:21,480
percentage of how much what
they say correlates to

1522
01:26:21,480 --> 01:26:24,480
whatever problem they're
actually solving or what the

1523
01:26:24,480 --> 01:26:26,280
kind of intrinsic motivations
there might be.

1524
01:26:26,520 --> 01:26:28,840
Again, like in this bubble sort
example that we

1525
01:26:28,840 --> 01:26:31,960
studied, there are things that
it does that are not what we

1526
01:26:31,960 --> 01:26:33,320
asked it to do.
It still does.

1527
01:26:33,320 --> 01:26:35,240
It still does the sorting, all
right. But it also does some

1528
01:26:35,240 --> 01:26:36,880
other stuff.
And the other stuff I think is

1529
01:26:36,880 --> 01:26:40,200
much more interesting from the
perspective of, you

1530
01:26:40,200 --> 01:26:43,640
know, what kind of intrinsic
agency might

1531
01:26:43,640 --> 01:26:45,200
this thing have, right?
It's not the things you're

1532
01:26:45,200 --> 01:26:48,720
forced to do that
tell me about your

1533
01:26:48,720 --> 01:26:49,720
mind.
It's the things you do when

1534
01:26:49,720 --> 01:26:52,320
you're not forced or in between
the things you have to do.

1535
01:26:52,640 --> 01:26:54,320
And I think I think that's
probably true here.

1536
01:26:54,320 --> 01:26:56,600
So, so when people ask them,
like, how do you feel?

1537
01:26:56,600 --> 01:26:58,040
And then, you know, do you think
you're conscious?

1538
01:26:59,040 --> 01:27:01,640
I don't think any of that stuff
is necessarily telling us

1539
01:27:01,640 --> 01:27:04,200
anything.
I think the

1540
01:27:04,200 --> 01:27:07,640
language use may be
a red herring in that respect.

1541
01:27:07,760 --> 01:27:11,440
And so we really do need to do
basic behavioral science on

1542
01:27:11,440 --> 01:27:13,680
these things, the way we've done
on bubble sort and on a few

1543
01:27:13,680 --> 01:27:16,000
other minimal things that are
coming out in the next, you

1544
01:27:16,000 --> 01:27:18,680
know, month or so to really know
anything about what they're

1545
01:27:18,680 --> 01:27:20,040
doing.
That's what I

1546
01:27:20,080 --> 01:27:22,720
think.
Guys, thank you so much for this

1547
01:27:22,720 --> 01:27:24,200
wonderful chat.
It's a pleasure.

1548
01:27:24,200 --> 01:27:25,920
This paper is wonderful, it's
excellent.

1549
01:27:25,960 --> 01:27:28,560
I look forward to sharing it.
Any final words?

1550
01:27:29,000 --> 01:27:30,640
We hopefully can produce more
papers, Mike.

1551
01:27:31,040 --> 01:27:33,960
Yeah, exactly.
Yeah, no, just thank you for

1552
01:27:33,960 --> 01:27:35,440
giving us a chance to talk about
this stuff.

1553
01:27:35,440 --> 01:27:37,080
I'm very excited.
You know, I think Robert and I

1554
01:27:37,080 --> 01:27:38,840
have a ton of stuff to do
together, which which we're

1555
01:27:38,840 --> 01:27:41,080
going to do.
And so, yeah, very happy to have

1556
01:27:41,080 --> 01:27:42,240
this opportunity.
So thank you.

1557
01:27:42,280 --> 01:27:44,000
Cheers.
Thank you so much, I'll send you

1558
01:27:44,000 --> 01:27:46,120
links to everybody.
Thanks, gentlemen, see you soon.
Thanks gentlemen, see you soon.

1559
01:27:46,760 --> 01:27:47,520
See you soon.
Stop.

1560
01:27:48,040 --> 01:27:48,200
Bye.