Sept. 14, 2025

What is Fear? Memory, Emotion & the Malleability of the Mind | Daniela Schiller


What if our memories weren't fixed, but flexible? And what if the key to understanding fear, emotion, and consciousness lies in how the brain constantly reshapes experience?

In this episode of Mind-Body Solution, Dr Tevin Naidu speaks with Prof Daniela Schiller, a world-renowned neuroscientist and director of the Laboratory of Affective Neuroscience at Mount Sinai. Prof Schiller's groundbreaking work explores how we form, change, and even reimagine emotional memories, from the neuroscience of fear to the dynamic nature of consciousness and identity.

Together, we dive into:
- How fear memories are formed in the brain
- Why emotional responses are flexible, not hardwired
- The ethics of modifying traumatic memories
- Social navigation: how the brain maps human relationships
- The Human Affectome: a bold framework linking emotion and consciousness
- Can machines ever be conscious? Free will, probabilities, and neuroscience
- Memory as liberation: how to live with multiple stories of the self

Prof Schiller is not only a world-leading neuroscientist with work published in Nature, Neuron, Nature Neuroscience, and PNAS; she's also a Fulbright Fellow, Kavli Frontiers of Science Fellow, two-time Moth StorySLAM winner, and drummer for the rock band "The Amygdaloids".

TIMESTAMPS:
(0:00) – Introduction: Daniela Schiller on the Science of Emotion & Memory
(0:39) – From Animal Models to Human Fear Studies: Schiller's Journey
(2:00) – What Happens in the Brain When a Fear Memory is Formed?
(3:15) – Flexibility of Emotional Responses: Why Fear Is Not Hardwired
(4:51) – Computational Psychiatry & the Brain as an Algorithmic System
(6:00) – From Circuits to Consciousness: Can Neuroscience Explain Subjective Experience?
(7:11) – The Human Affectome: A Framework Linking Emotion and Consciousness
(9:13) – What Is Consciousness? Felt Experience as the Core of Mind
(11:04) – Social Navigation: Mapping Human Relationships in the Brain
(14:02) – How Social Media Distorts Real Interaction and Social Space
(18:15) – Ethics of Modifying Traumatic Memories: Liberation or Risk?
(21:27) – Are Emotions Brain Events, Bodily Events, or Psychological Phenomena?
(23:16) – The 4E Approach: Embedded, Embodied, Enactive, and Extended Cognition
(24:00) – Bringing Philosophy Into Neuroscience: The Human Affectome Project
(27:03) – Exciting Advances: Intracranial Recordings, VR, and Naturalistic Neuroscience
(33:11) – Can Artificial Intelligence or Machines Ever Be Conscious?
(36:26) – Free Will and Probabilities: Neuroscience Meets Philosophy
(41:12) – Overcoming Fear as Liberation: Redefining Memory and Identity
(46:09) – Living With Multiple Stories: Memory, Authenticity, and Self-Creation
(1:02:24) – Future Directions: Reconsolidation, Social Space, and the Human Affectome

EPISODE LINKS:
- Daniela's Website: https://profiles.mountsinai.org/daniela-schiller
- Daniela's Lab: https://labs.neuroscience.mssm.edu/project/schiller-lab/
- Daniela's Publications: https://www.ncbi.nlm.nih.gov/myncbi/daniela.schiller.2/bibliography/public/

CONNECT:
- Website: https://tevinnaidu.com
- Podcast: https://creators.spotify.com/pod/show/mindbodysolution
- YouTube: https://youtube.com/mindbodysolution
- Twitter: https://twitter.com/drtevinnaidu
- Facebook: https://facebook.com/drtevinnaidu
- Instagram: https://instagram.com/drtevinnaidu
- LinkedIn: https://linkedin.com/in/drtevinnaidu

=============================
Disclaimer: The information provided on this channel is for educational purposes only. The content is shared in the spirit of open discourse and does not constitute, nor does it substitute, professional or medical advice. We do not accept any liability for any loss or damage incurred from you acting or not acting as a result of listening to or watching any of our content. You acknowledge that you use the information provided at your own risk. Listeners/viewers are advised to conduct their own research and consult with their own experts in the respective fields.

1
00:00:05,280 --> 00:00:06,800
Daniela, thanks so much for
joining me.

2
00:00:06,800 --> 00:00:10,120
I am a huge fan of your work,
been following it for years, and

3
00:00:10,480 --> 00:00:12,400
it's a pleasure and privilege to
host you today.

4
00:00:12,480 --> 00:00:15,880
So yeah, thanks for joining me.
Yeah, thanks for having me.

5
00:00:16,440 --> 00:00:19,040
My pleasure.
I thought I'd break this episode

6
00:00:19,040 --> 00:00:23,120
into five parts, the first part
being Foundations, and within

7
00:00:23,120 --> 00:00:26,240
this I labeled it the Science of
Emotion and Memory.

8
00:00:26,920 --> 00:00:28,480
So I think let's get started
with that.

9
00:00:28,480 --> 00:00:30,760
And thereafter we'll move on to
things like from circuits to

10
00:00:30,760 --> 00:00:34,960
consciousness and beyond.
So the first question I have is

11
00:00:35,080 --> 00:00:39,640
you've devoted your life and
career to studying the

12
00:00:39,640 --> 00:00:42,200
neural mechanisms of emotional
learning and memory.

13
00:00:43,080 --> 00:00:46,760
Could you share perhaps how your
journey from animal models of

14
00:00:46,760 --> 00:00:50,760
schizophrenia symptoms to human
studies of fear have shaped the

15
00:00:50,760 --> 00:00:56,280
questions that you ask today?
Yeah, it's been a long journey.

16
00:00:56,840 --> 00:01:02,640
It started really from an interest
in the philosophy of mind and

17
00:01:02,720 --> 00:01:06,360
philosophy of science and
psychobiology.

18
00:01:06,600 --> 00:01:09,280
I don't know, the brain basis of
behaviour.

19
00:01:09,280 --> 00:01:13,280
That's what I
studied in my undergrad and

20
00:01:13,280 --> 00:01:16,040
that kind of naturally led to
cognitive neuroscience.

21
00:01:16,040 --> 00:01:19,040
So I did start with animal
models of schizophrenia, but

22
00:01:19,040 --> 00:01:22,440
that was an animal model of
emotional learning.

23
00:01:22,440 --> 00:01:25,560
So it already incorporated the
fear learning component.

24
00:01:26,480 --> 00:01:31,480
And then I switched to humans
and I can't say it was really

25
00:01:31,480 --> 00:01:34,040
planned; it's just the way it
went.

26
00:01:34,040 --> 00:01:37,440
I guess I just persisted with
the topics that seemed

27
00:01:37,560 --> 00:01:40,520
interesting to me.
And it always revolved around

28
00:01:40,600 --> 00:01:43,800
emotion, in particular fear.
OK.

29
00:01:44,240 --> 00:01:49,120
And when you think about a
fear memory and when it's first

30
00:01:49,120 --> 00:01:52,040
formed in the brain, what
would you say is actually

31
00:01:52,040 --> 00:01:55,640
happening when this occurs,
let's say at a neural level and

32
00:01:55,640 --> 00:01:59,720
thereafter we'll slowly branch
out into what repercussions these

33
00:01:59,720 --> 00:02:02,720
have on a social level?
Yeah.

34
00:02:02,720 --> 00:02:06,480
So the, I think the most
fundamental thing is the

35
00:02:06,560 --> 00:02:11,280
organism identifying something
that is important and relevant

36
00:02:11,720 --> 00:02:15,040
in the environment to the
organism, and that can be in many

37
00:02:15,040 --> 00:02:18,280
domains.
So in this sense, you can put

38
00:02:18,280 --> 00:02:21,040
all sorts of emotions.
In the case of fear, it's a

39
00:02:21,160 --> 00:02:24,680
threat to the survival of the
organism, something threatening.

40
00:02:25,200 --> 00:02:29,320
So the organism is identifying
that and then the brain gets

41
00:02:29,320 --> 00:02:32,320
into a mode.
It can be a predictive mode

42
00:02:32,440 --> 00:02:36,280
because it's preparatory, so it
prepares the brain, or the

43
00:02:36,280 --> 00:02:39,160
organism to address that
concern.

44
00:02:40,960 --> 00:02:43,840
One of your central findings is
that emotional responses are

45
00:02:43,840 --> 00:02:46,640
flexible.
So when somebody encounters

46
00:02:46,640 --> 00:02:49,720
something fear-driven, let's say
something that leads to a post

47
00:02:49,720 --> 00:02:51,720
traumatic stress type of
disorder.

48
00:02:51,720 --> 00:02:53,200
I know you're not a
psychiatrist, so just a

49
00:02:53,200 --> 00:02:54,440
disclaimer for everyone out
there.

50
00:02:55,680 --> 00:03:00,560
How do you think that plasticity
of the brain and reducing this,

51
00:03:01,920 --> 00:03:06,880
let's say, emotional
non-hardwired feeling, that it just

52
00:03:06,880 --> 00:03:08,520
doesn't make sense.
But let's say the fact that it's

53
00:03:08,520 --> 00:03:11,840
not hardwired.
How does someone then apply this

54
00:03:12,080 --> 00:03:16,080
from a behavioral standpoint?
So because in your work you

55
00:03:16,080 --> 00:03:19,040
focus primarily on fear and the
fact that it's such a formative

56
00:03:19,040 --> 00:03:21,800
experience, but thereafter you
also work on memory and how we

57
00:03:21,800 --> 00:03:23,720
can change this, how malleable
we are.

58
00:03:23,920 --> 00:03:26,800
So it's a perfect blend, I think
of these two different

59
00:03:26,800 --> 00:03:29,480
components and how it can change
our experience of reality.

60
00:03:29,680 --> 00:03:31,800
Could you perhaps unpack that?
Sorry, it's a bit of a long

61
00:03:31,800 --> 00:03:34,800
winded question.
Yeah, no, it's something that's

62
00:03:34,800 --> 00:03:37,800
really fascinating that we see
in the course of evolution.

63
00:03:37,800 --> 00:03:39,960
So it starts from reflexive
responses.

64
00:03:39,960 --> 00:03:43,680
These are hardwired.
There's just, like, a very innate

65
00:03:43,680 --> 00:03:46,760
repertoire of responses.
There is, let's say, you know,

66
00:03:46,760 --> 00:03:49,800
threat or kind of predator.
And then the organism has a set

67
00:03:49,800 --> 00:03:54,080
of responses, sometimes just
like one particular response to

68
00:03:54,120 --> 00:03:56,680
that situation.
And with evolution, we kind of

69
00:03:56,680 --> 00:04:03,440
evolved to first learn that,
but to build on that additional,

70
00:04:03,440 --> 00:04:07,720
for example, stimuli that are,
as I mentioned, predictive.

71
00:04:08,280 --> 00:04:10,280
So now you can prepare ahead of
time.

72
00:04:10,280 --> 00:04:12,760
Now, you don't need to wait for
the encounter itself.

73
00:04:12,760 --> 00:04:16,880
So you have these preparatory
responses, and then the environment

74
00:04:16,880 --> 00:04:19,480
is changing.
So you do need to update.

75
00:04:20,000 --> 00:04:22,079
You move from one environment to
another.

76
00:04:22,079 --> 00:04:26,240
Conditions change.
And so that's a very important

77
00:04:26,240 --> 00:04:29,400
capacity to flexibly modulate
what it is that you learned.

78
00:04:29,400 --> 00:04:33,320
Because otherwise every bad
experience that you have will

79
00:04:33,320 --> 00:04:34,680
kind of change your life
forever.

80
00:04:34,680 --> 00:04:38,160
And we don't want to have that.
So we need that flexibility.

81
00:04:39,280 --> 00:04:42,720
So we have this repertoire of
responses that we can modify

82
00:04:43,200 --> 00:04:46,560
flexibly, you know, with time
and with condition and with

83
00:04:46,560 --> 00:04:50,440
additional information.
When I was doing my

84
00:04:50,560 --> 00:04:53,480
dissertation, I wrote a lot
about computational psychiatry,

85
00:04:54,120 --> 00:04:57,080
a lot of work from Karl Friston,
working on prior information,

86
00:04:57,080 --> 00:05:00,200
Bayesian brains, inference.
When you look at those

87
00:05:00,200 --> 00:05:02,880
models, is that something you
incorporate quite a bit in your

88
00:05:02,880 --> 00:05:05,800
work when you're working with
emotional, affective science,

89
00:05:05,800 --> 00:05:08,320
neuroscience or are you
approaching this from a

90
00:05:08,320 --> 00:05:12,040
different angle?
Yeah, it's actually very close.

91
00:05:12,440 --> 00:05:16,840
So I myself, by training, am not
a computational neuroscientist,

92
00:05:16,840 --> 00:05:20,480
but I do incorporate
computational models and

93
00:05:20,480 --> 00:05:26,120
computational thinking and I
consider emotional behaviour as

94
00:05:26,560 --> 00:05:28,480
algorithmic
processes.

95
00:05:28,480 --> 00:05:31,720
So it's all about what the brain
is computing and how we execute

96
00:05:31,720 --> 00:05:34,320
it.
And we could use computational

97
00:05:34,320 --> 00:05:38,000
models to make predictions or to
define very accurately what are

98
00:05:38,000 --> 00:05:41,240
the components of the
learning and what the brain is

99
00:05:41,400 --> 00:05:44,040
processing.
When I was going through

100
00:05:44,040 --> 00:05:47,320
your work, something that I had
the urge to ask you at some

101
00:05:47,320 --> 00:05:51,040
point was within this from
circuits to consciousness.

102
00:05:51,040 --> 00:05:54,200
As I said, Part 2 would be that
in neuroscience we're

103
00:05:54,200 --> 00:05:56,200
uncovering so much over the past
few decades.

104
00:05:56,200 --> 00:05:57,600
We've figured out so many
things.

105
00:05:57,600 --> 00:05:59,400
There are a few people trying to
figure out the connectome, and

106
00:05:59,400 --> 00:06:01,560
there are a few people trying to
figure out what consciousness

107
00:06:01,560 --> 00:06:05,480
is.
How close do you think that we

108
00:06:05,480 --> 00:06:08,440
are to connecting these
mechanisms, neuroscience's

109
00:06:08,920 --> 00:06:12,600
understanding of emotional
control and mechanisms of what

110
00:06:12,600 --> 00:06:15,640
we call subjective experience?
Do you think we're able to

111
00:06:15,640 --> 00:06:19,160
encapsulate that within a
neuroscientific approach?

112
00:06:21,040 --> 00:06:26,280
Wow, you know, the biggest
question of the field, but...

113
00:06:26,640 --> 00:06:28,640
On this podcast, at some point we
have to touch on it.

114
00:06:28,640 --> 00:06:30,920
The mind-body problem is the...
Yeah.

115
00:06:30,920 --> 00:06:35,280
So I do have kind of an
optimistic response, but it's

116
00:06:35,280 --> 00:06:39,040
very, very early on.
But what I can say is that we

117
00:06:39,040 --> 00:06:43,280
have made an effort to construct
what we call the Human Affectome.

118
00:06:43,880 --> 00:06:49,480
And this is actually an
algorithmic collection

119
00:06:49,880 --> 00:06:54,200
of all the components of
affective phenomena in humans.

120
00:06:55,320 --> 00:06:59,680
So it's not a model or, you
know, a theory, it's actually a

121
00:06:59,680 --> 00:07:03,800
framework that just
provides kind of a skeleton to

122
00:07:03,800 --> 00:07:07,560
incorporate all the theories or
actually a platform to create

123
00:07:07,560 --> 00:07:10,120
new theories.
But it gives this fundamental

124
00:07:10,120 --> 00:07:15,600
description of what is an
organism, what is the purpose of

125
00:07:15,600 --> 00:07:17,920
the organism?
And in this way what is the

126
00:07:17,920 --> 00:07:24,240
purpose of affect or emotion.
And by answering this, we can

127
00:07:24,240 --> 00:07:28,520
incorporate many, many theories
and explanations from the

128
00:07:29,640 --> 00:07:31,920
neurobiological level to
consciousness.

129
00:07:31,920 --> 00:07:35,760
So it does include feelings and
how feelings are incorporated

130
00:07:35,760 --> 00:07:39,600
into it.
And considering feelings,

131
00:07:39,600 --> 00:07:42,880
feelings are the content of
consciousness.

132
00:07:43,840 --> 00:07:48,480
So what we do is investigate
the content of consciousness and

133
00:07:48,480 --> 00:07:52,440
the structure of that
information.

134
00:07:52,440 --> 00:07:55,880
And from this we can learn
something about consciousness,

135
00:07:55,880 --> 00:07:58,440
you know, why it exists and how
it is organized.

136
00:07:58,440 --> 00:08:02,640
So I think actually by
incorporating affect into

137
00:08:02,640 --> 00:08:08,040
consciousness can lead us a step
further to understanding the

138
00:08:08,040 --> 00:08:11,720
link between the body, the brain
and the mind.

139
00:08:12,200 --> 00:08:15,440
Daniela, are you familiar with
the work done by Professor Mark

140
00:08:15,440 --> 00:08:19,560
Solms here in Cape Town at UCT,
the University of Cape Town?

141
00:08:21,320 --> 00:08:23,480
Just by name, but what
exactly?

142
00:08:24,280 --> 00:08:26,360
So it's interesting because what
you're talking about with

143
00:08:26,400 --> 00:08:29,920
affective science combining with
consciousness research,

144
00:08:29,920 --> 00:08:33,080
his theory of consciousness, he
calls it the felt uncertainty

145
00:08:33,080 --> 00:08:35,919
principle.
He works closely with

146
00:08:35,919 --> 00:08:39,760
people like Karl Friston, etcetera.
But he wrote a book

147
00:08:39,760 --> 00:08:42,400
called The Hidden Spring, and
it's pretty cool because

148
00:08:42,400 --> 00:08:45,880
basically what he tries to say
is that feeling or affect is

149
00:08:45,880 --> 00:08:50,800
fundamentally what needs to be
within consciousness, trying to

150
00:08:50,800 --> 00:08:52,120
understand what consciousness
is.

151
00:08:52,120 --> 00:08:55,320
So he tries to take it back to
the brainstem as the fundamental

152
00:08:55,320 --> 00:08:58,440
source of this feeling because
everything that we experience

153
00:08:58,440 --> 00:09:02,160
has to be felt.
And so affect forms a

154
00:09:02,160 --> 00:09:04,560
fundamental basis of what
consciousness is, according to

155
00:09:04,560 --> 00:09:06,800
his theory of consciousness.
I just thought I'd bring that

156
00:09:06,800 --> 00:09:08,360
up.
Sorry, it's just a sidetrack.

157
00:09:08,600 --> 00:09:12,320
But with that being said, if
someone asks you, Daniela, what

158
00:09:12,320 --> 00:09:14,880
is consciousness, how do you
respond to that?

159
00:09:19,480 --> 00:09:23,000
Yeah, I think it's felt
experience.

160
00:09:23,840 --> 00:09:27,000
It's, you know, subjectively
felt, it's felt, right.

161
00:09:27,680 --> 00:09:31,400
It's felt experience.
That's kind of, I think, the

162
00:09:31,400 --> 00:09:35,280
main way to incorporate it.
And I completely agree with that

163
00:09:36,360 --> 00:09:42,440
approach because in a way, every
felt experience has valence.

164
00:09:44,400 --> 00:09:49,320
And in this sense, affect is
incorporated because there's no,

165
00:09:49,520 --> 00:09:53,400
I mean, we tend to separate
cognition and affect as if, you

166
00:09:53,400 --> 00:09:57,160
know, perception is affect-free,
but it's not.

167
00:09:57,160 --> 00:10:00,920
It's like every moment,
every thought, every felt

168
00:10:00,920 --> 00:10:05,360
experience is valenced, even if
it's, you know, close to 0 or,

169
00:10:05,360 --> 00:10:08,840
you know, neutral or something.
It's kind of, you know, so

170
00:10:08,840 --> 00:10:10,760
that's a level
of valence.

171
00:10:11,440 --> 00:10:16,920
I think these, you know, these
domains are very, very

172
00:10:16,920 --> 00:10:23,000
interconnected and I think a
great deal of what slowed us

173
00:10:23,000 --> 00:10:27,000
down is this division of
fields and domains and then

174
00:10:27,000 --> 00:10:30,400
being studied separately.
It's the cortical

175
00:10:30,400 --> 00:10:33,640
fallacy, as some people
refer to it, just this

176
00:10:33,640 --> 00:10:36,800
obsession with the fact that
the visual system has dominated

177
00:10:37,000 --> 00:10:40,120
us for so long.
We must somehow incorporate this

178
00:10:40,120 --> 00:10:41,520
into every theory of
consciousness.

179
00:10:41,520 --> 00:10:43,960
But you're right, I think that
feeling, that

180
00:10:43,960 --> 00:10:47,040
felt experience, that
subjective experience has to be

181
00:10:47,040 --> 00:10:51,000
incorporated into any theory of
consciousness within these

182
00:10:51,000 --> 00:10:53,960
systems.
I mean us as feeling individuals

183
00:10:53,960 --> 00:10:56,520
or beings.
It places us in an intriguing

184
00:10:57,600 --> 00:10:59,400
world.
We're a social species.

185
00:10:59,680 --> 00:11:01,720
We have to interact, we have to
engage.

186
00:11:01,720 --> 00:11:03,720
Social navigation becomes so
important.

187
00:11:03,720 --> 00:11:06,960
And your lab has worked quite a
bit on the social space.

188
00:11:07,040 --> 00:11:12,000
Can you say
exactly what social navigation

189
00:11:12,000 --> 00:11:15,320
means for most people and how we
can understand these complex

190
00:11:15,320 --> 00:11:17,880
human relationships within the
social space?

191
00:11:19,520 --> 00:11:23,920
So I think the easiest way to
begin to understand it is to think

192
00:11:23,920 --> 00:11:28,160
about spatial navigation, right?
We walk in space and we do have

193
00:11:28,160 --> 00:11:32,040
the machinery in the brain to
map the physical environment

194
00:11:32,040 --> 00:11:33,880
where we are in a particular
place.

195
00:11:33,880 --> 00:11:36,320
So there are neurons, place
cells, you know, that fire in a

196
00:11:36,320 --> 00:11:40,400
particular location or grid
cells that fire regularly in a

197
00:11:40,840 --> 00:11:44,040
pattern that creates kind of a
grid of the environment.

198
00:11:44,560 --> 00:11:49,600
And one idea is that it's not a
machinery dedicated only to the

199
00:11:49,600 --> 00:11:54,360
physical space, but to organise
information more generally.

200
00:11:54,440 --> 00:11:57,920
And in this sense it can also be
abstract information; it can be

201
00:11:57,920 --> 00:12:00,760
other sensory modalities.
For example, you can navigate in

202
00:12:00,760 --> 00:12:03,760
auditory space or olfactory
space whenever you have

203
00:12:03,760 --> 00:12:08,200
dimensions and social space also
has dimensions.

204
00:12:08,200 --> 00:12:12,480
So it's actually an excellent
case of navigation because if we

205
00:12:12,480 --> 00:12:17,200
take fundamental dimensions like
power and affiliation, which is

206
00:12:17,360 --> 00:12:21,080
you know, how much we get close
to each other and dominance,

207
00:12:21,480 --> 00:12:24,320
power relationships.
These two components are

208
00:12:24,320 --> 00:12:28,040
fundamental; you see it
across species and in many

209
00:12:28,040 --> 00:12:30,480
psychological theories that
describe relationships.

210
00:12:30,960 --> 00:12:36,120
So whenever we interact, first
of all, interaction is

211
00:12:36,120 --> 00:12:38,720
required.
It's not a snapshot of your

212
00:12:38,720 --> 00:12:41,440
network or you know Facebook
friends or something.

213
00:12:41,840 --> 00:12:47,600
When we interact, we establish
that relationship or location of

214
00:12:47,720 --> 00:12:51,720
you relative to me, for example,
on these two dimensions.

215
00:12:51,720 --> 00:12:55,120
And as the interactions
continue, then there is a path

216
00:12:55,120 --> 00:12:59,280
created because people move in
relative power and affiliation.

217
00:12:59,760 --> 00:13:03,720
So if you model it like that,
you can now have coordinates

218
00:13:03,840 --> 00:13:06,640
like real coordinates and you
can have geometric structures

219
00:13:06,640 --> 00:13:09,360
and vectors and angles to the
social space.

220
00:13:10,040 --> 00:13:13,600
And we theorized that, but then
we found that the brain is

221
00:13:13,600 --> 00:13:16,680
indeed tracking it; we could
see changes in the brain that

222
00:13:16,680 --> 00:13:21,320
track with the coordinates.
And that I think also relates to

223
00:13:21,720 --> 00:13:25,640
survival and well-being
because social others are part

224
00:13:25,640 --> 00:13:30,760
of the environment, and
the organism's work is

225
00:13:31,560 --> 00:13:34,960
existing in an environment and
interacting with the environment

226
00:13:35,600 --> 00:13:37,760
to make sense of the environment
and to survive.

227
00:13:38,640 --> 00:13:42,240
Well, you briefly touched on
this, but I mean the fact that

228
00:13:42,240 --> 00:13:43,920
this is a two-dimensional
experience you and I are

229
00:13:43,920 --> 00:13:47,520
experiencing right now with each
other, we're moving closer and

230
00:13:47,520 --> 00:13:50,920
closer to that being a permanent
version of reality.

231
00:13:50,920 --> 00:13:54,040
With the way social media works
today, how people are constantly

232
00:13:54,040 --> 00:13:57,600
stuck behind a screen, engaging
online, how do you think this is

233
00:13:57,600 --> 00:14:02,000
changing or perhaps causing
malfunctioning responses because

234
00:14:02,000 --> 00:14:04,440
of these new environments?
How is this impacting our

235
00:14:04,440 --> 00:14:06,080
brains?
Obviously I'm not talking from a

236
00:14:06,080 --> 00:14:07,480
mental health perspective
because you're not a

237
00:14:07,480 --> 00:14:10,320
psychiatrist, but actually from
the neuroscientific research,

238
00:14:10,320 --> 00:14:14,520
what's happening?
Yeah, I think there's a

239
00:14:15,200 --> 00:14:18,520
misconception of what our
relationship is.

240
00:14:19,560 --> 00:14:26,200
So if you, let's say, just read
posts and post them yourself

241
00:14:26,200 --> 00:14:29,600
and you feel, let's say your
status is changing, that's a

242
00:14:29,600 --> 00:14:31,480
whole different thing.
These are not social

243
00:14:31,480 --> 00:14:35,320
interactions and there's no
movement there.

244
00:14:35,320 --> 00:14:38,800
There's movement of maybe
something else, like, you know, one-

245
00:14:38,800 --> 00:14:43,000
dimensional liking status or
something like that.

246
00:14:43,680 --> 00:14:51,880
But for a social space to be
represented and for navigation

247
00:14:51,880 --> 00:14:55,880
in social space to occur, there
has to be actual interactions,

248
00:14:56,360 --> 00:14:59,680
like one-on-one interactions
that are not a one time

249
00:14:59,680 --> 00:15:04,360
interaction.
So in this sense, for example,

250
00:15:04,360 --> 00:15:07,760
let's say I know that someone is
really powerful.

251
00:15:07,800 --> 00:15:14,160
I don't know, the president. So
that president is not in

252
00:15:14,160 --> 00:15:17,000
my social space at the moment
because we didn't interact.

253
00:15:17,000 --> 00:15:21,320
And actually, when we interact,
that person, although I expected

254
00:15:21,320 --> 00:15:24,840
that person to have a lot of
power, wouldn't necessarily, you

255
00:15:24,840 --> 00:15:27,040
know, have it, maybe like bodyguards
and all that.

256
00:15:28,640 --> 00:15:33,080
So the interaction itself
defines the location in the

257
00:15:33,080 --> 00:15:36,960
dimensions.
So I think that

258
00:15:36,960 --> 00:15:42,000
with social media, there's an
illusion of interaction, and it

259
00:15:42,040 --> 00:15:45,400
it doesn't have the benefits and
it's not encoded in the same

260
00:15:45,400 --> 00:15:47,600
way.
So I don't have direct empirical

261
00:15:47,600 --> 00:15:51,080
evidence, but from what I see
from my experiments, the

262
00:15:51,080 --> 00:15:55,760
tracking by the brain machinery
of navigation, which is the

263
00:15:55,760 --> 00:15:59,760
hippocampus and related regions
only occurs when you actually

264
00:15:59,760 --> 00:16:03,880
interact, not when you hear
social information or are

265
00:16:03,880 --> 00:16:07,160
being passively engaged.
I think that's one of the things

266
00:16:07,160 --> 00:16:10,720
that surprises people most is
when you show them the

267
00:16:10,720 --> 00:16:13,640
neuroscience behind what's
different from an interaction in

268
00:16:13,640 --> 00:16:15,560
person versus what we
experience.

269
00:16:15,840 --> 00:16:18,920
I think there were a few studies
done years ago that show how we

270
00:16:18,920 --> 00:16:20,360
even perceive people
differently.

271
00:16:20,360 --> 00:16:23,360
If someone puts on an outfit,
let's say they put on a

272
00:16:23,360 --> 00:16:27,840
costume of a beggar on the street,
how much less certain

273
00:16:27,840 --> 00:16:30,840
parts of their brain react in
response to even seeing them.

274
00:16:30,840 --> 00:16:34,600
So just a mere change of an
outfit can dehumanize a person.

275
00:16:34,880 --> 00:16:36,840
So there are so many small
things

276
00:16:36,840 --> 00:16:38,600
about reality that we don't
really understand.

277
00:16:38,640 --> 00:16:41,560
And because of our heuristics,
certain adaptations, processing

278
00:16:41,560 --> 00:16:44,680
power being very limited.
It's quite scary to consider the

279
00:16:44,680 --> 00:16:48,880
fact that as a social species,
we're becoming far less social,

280
00:16:49,120 --> 00:16:51,320
and yet we think we're more
social than ever.

281
00:16:51,680 --> 00:16:56,240
Does that concern you?
Yes, it's very concerning

282
00:16:56,240 --> 00:17:01,080
because the illusion is very
convincing.

283
00:17:04,040 --> 00:17:06,720
It's like, if you think about
it, we have relationships with

284
00:17:06,720 --> 00:17:11,440
ourselves, right?
And we can quite easily create

285
00:17:11,520 --> 00:17:15,240
mental representations of others
in our brain and have

286
00:17:15,240 --> 00:17:17,800
interactions with that mental
representation.

287
00:17:18,440 --> 00:17:22,240
So we do have it in real life
because I do represent you in

288
00:17:22,240 --> 00:17:24,839
certain ways, like you said,
depending on your outfit and how

289
00:17:24,839 --> 00:17:27,040
you behave and my prior
knowledge.

290
00:17:28,079 --> 00:17:31,040
But I constantly update this
based on the interaction.

291
00:17:31,440 --> 00:17:34,000
And ideally, you know, I'm not
captive to stereotypes and so

292
00:17:34,000 --> 00:17:36,840
forth.
But this is all augmented in

293
00:17:37,200 --> 00:17:41,000
online interactions because of
the limited information that you

294
00:17:41,000 --> 00:17:44,480
have.
So you complement it a lot with

295
00:17:44,640 --> 00:17:47,920
your own mental model and this
is what you have interaction

296
00:17:47,920 --> 00:17:50,920
with, you know, just like a
whole bunch of fictional

297
00:17:50,920 --> 00:17:54,480
characters that you created in
your mind and you have

298
00:17:54,480 --> 00:17:58,160
interactions with them and you
also, you don't have a lot of

299
00:17:58,160 --> 00:18:01,920
information of how they perceive
you, which is also a very

300
00:18:01,920 --> 00:18:04,840
important input that you need to
understand a relationship.

301
00:18:06,760 --> 00:18:10,920
Daniela, when it comes to species
interacting, we know that a lot

302
00:18:10,920 --> 00:18:13,640
of interactions can lead to
permanent outcomes.

303
00:18:13,840 --> 00:18:16,360
So if someone gives you a
traumatic childhood experience

304
00:18:16,360 --> 00:18:18,680
growing up, you'll always
remember that experience as a

305
00:18:18,680 --> 00:18:21,760
fundamental, life-changing one.
We briefly touched on the fact

306
00:18:21,760 --> 00:18:25,360
that memory and malleability are
very much interlinked.

307
00:18:26,280 --> 00:18:30,280
Do you think that someone risks,
let's say you change a very

308
00:18:30,280 --> 00:18:34,360
traumatic memory and we sort of
use some sort of a strategy

309
00:18:34,360 --> 00:18:36,960
within psychiatry, whether
it's myself as a doctor

310
00:18:36,960 --> 00:18:39,640
trying to work on someone with
CBT, dialectical behavioral

311
00:18:39,640 --> 00:18:43,120
therapy, whatever.
Does that risk the person almost

312
00:18:43,120 --> 00:18:47,240
losing out on a core memory that
perhaps would have otherwise led

313
00:18:47,240 --> 00:18:49,080
them down a different
philosophical path?

314
00:18:49,240 --> 00:18:52,200
How do you see these
people changing memories?

315
00:18:52,200 --> 00:18:54,360
That's a pretty deep
question.

316
00:18:56,440 --> 00:18:58,960
It's a very complicated ethical
issue.

317
00:19:01,040 --> 00:19:06,040
So, yeah, we kind of, I
think we all agree that

318
00:19:06,040 --> 00:19:08,840
memories shape us and make
us who we are.

319
00:19:08,840 --> 00:19:14,840
And there is the concept of
growth from trauma, that people

320
00:19:14,840 --> 00:19:17,600
become something they never
imagined they would be.

321
00:19:18,520 --> 00:19:24,440
So that's a kind of positive
value that arises from it.

322
00:19:24,440 --> 00:19:27,280
But then you wouldn't want
to have trauma just because of

323
00:19:27,280 --> 00:19:31,640
that, right?
So the thing about changing

324
00:19:31,640 --> 00:19:37,600
memories is bringing them to the
adaptive range.

325
00:19:38,320 --> 00:19:41,360
The talk about modifying
memories in the context of

326
00:19:41,360 --> 00:19:46,120
trauma is only when the memory
makes you function less well.

327
00:19:46,120 --> 00:19:48,480
It's like some people,
right?

328
00:19:48,480 --> 00:19:52,200
They suffer and they can't work
and it ruins their social

329
00:19:52,200 --> 00:19:56,520
relationships.
It has a very serious price.

330
00:19:56,520 --> 00:20:00,120
And in this sense, you want to
modify that memory such that you

331
00:20:00,120 --> 00:20:04,160
could function with the memory.
So it's not about erasing, it's

332
00:20:04,160 --> 00:20:09,120
about living with it in a way
that wouldn't, you know,

333
00:20:09,120 --> 00:20:11,600
interfere with your
daily function.

334
00:20:11,920 --> 00:20:14,600
And in many cases, you could
remember the content.

335
00:20:14,600 --> 00:20:17,840
So it's not about erasing the
content of the event.

336
00:20:18,200 --> 00:20:22,240
It's actually making sense of
the content and then having the

337
00:20:22,240 --> 00:20:28,120
emotion a bit,
not, I would say, disconnected,

338
00:20:28,120 --> 00:20:32,160
but tolerable when you remember,
because all of these processes

339
00:20:32,160 --> 00:20:35,120
are the ones that interfere
with the experience.

340
00:20:36,120 --> 00:20:39,560
Of course, one can take it into
"let's shape people's

341
00:20:39,560 --> 00:20:46,200
memories and modify them," and it
could go to these, you know,

342
00:20:46,200 --> 00:20:49,240
terrible scenarios.
But I think this can happen with

343
00:20:49,280 --> 00:20:53,240
with every science. And also, I
don't think we're there yet

344
00:20:53,760 --> 00:20:57,880
because it's very subtle what
we managed to understand and to

345
00:20:57,880 --> 00:21:00,800
modify.
Yeah, I think, I

346
00:21:00,800 --> 00:21:03,160
think that's more for a Black
Mirror episode at this point.

347
00:21:03,160 --> 00:21:05,800
It's very science
fictiony to a point where

348
00:21:05,800 --> 00:21:08,360
people can postulate as
much as they want, but the tech

349
00:21:08,360 --> 00:21:11,920
isn't there.
But I mean, if we think about

350
00:21:11,920 --> 00:21:14,160
that differently, we can think
of certain medications that do

351
00:21:14,160 --> 00:21:17,960
that within my
field, let's say, and yet we

352
00:21:17,960 --> 00:21:20,480
still don't understand the basis
of how these things work.

353
00:21:20,640 --> 00:21:23,320
So, so it is one of those fields
where it's very touch

354
00:21:23,320 --> 00:21:25,360
and go.
Daniela, when you look at

355
00:21:25,360 --> 00:21:28,480
someone experiencing something,
whether it's a felt experience,

356
00:21:28,480 --> 00:21:31,800
a subjective core experience, do
you find yourself seeing that

357
00:21:31,800 --> 00:21:35,960
more as a brain event
or a bodily event?

358
00:21:36,280 --> 00:21:40,000
Or do you find that to be more
of a psychological or a

359
00:21:40,280 --> 00:21:45,920
non-physical event?
All of the above, like I

360
00:21:45,920 --> 00:21:49,360
don't separate it at all.
And also I think it's incredibly

361
00:21:49,360 --> 00:21:54,280
important not to break it down
because as I mentioned early on,

362
00:21:54,880 --> 00:21:58,240
it's always important to
remember who is

363
00:21:58,240 --> 00:22:03,000
experiencing: the entity
that is experiencing is an

364
00:22:03,000 --> 00:22:05,760
organism.
So the organism is not just

365
00:22:05,760 --> 00:22:10,480
neural pathways or a nervous
system, it's the body,

366
00:22:10,480 --> 00:22:11,960
right?
It's the entire thing.

367
00:22:12,720 --> 00:22:16,680
You have the sensory information
coming in as input to the body.

368
00:22:17,120 --> 00:22:21,000
And in addition to that, the
organism is embedded in the

369
00:22:21,000 --> 00:22:23,560
environment.
So it really matters where the

370
00:22:23,560 --> 00:22:28,600
organism is because it changes,
you know, the relative survival

371
00:22:28,600 --> 00:22:31,520
ratio, or whatever
the organism will even compute.

372
00:22:32,280 --> 00:22:36,120
Also, the senses are important.
Different animals have different

373
00:22:36,120 --> 00:22:40,800
senses, so even if they're in
the same environment, they will

374
00:22:40,800 --> 00:22:43,080
each have a different
environment depending on what

375
00:22:43,080 --> 00:22:46,160
they sense.
And in addition to that, to have

376
00:22:46,160 --> 00:22:50,520
an environment, the organism has
to operate, it has to interact

377
00:22:50,520 --> 00:22:52,920
with the environment, has to
perceive it, which is an action

378
00:22:52,920 --> 00:22:58,960
in itself, and create the
environment such that the

379
00:22:58,960 --> 00:23:02,160
organism could interact with it.
So you see it's a very

380
00:23:02,480 --> 00:23:06,200
iterative, convoluted process
that incorporates all of these.

381
00:23:06,480 --> 00:23:10,320
There's the organism in its
entirety and its

382
00:23:10,400 --> 00:23:11,840
interaction with the
environment.

383
00:23:11,880 --> 00:23:16,120
All of these are the experience.
Yes, it's very similar to the

384
00:23:16,120 --> 00:23:18,120
approach taken by 4E
cognitive science.

385
00:23:19,120 --> 00:23:20,280
Yes, exactly.
Yes.

386
00:23:20,520 --> 00:23:21,840
Embedded.
Embodied.

387
00:23:21,840 --> 00:23:24,040
Yes.
And enacted and extended. The

388
00:23:24,040 --> 00:23:25,440
mere fact that we've got a cell
phone.

389
00:23:26,360 --> 00:23:28,160
So it forms part of who we are
at this point.

390
00:23:28,160 --> 00:23:30,280
And without our cell phones,
we're actually a lot dumber than

391
00:23:30,280 --> 00:23:33,000
we think we are.
And we use them every day.

392
00:23:33,000 --> 00:23:35,840
Are there any parts, Daniela,
when you guys are

393
00:23:35,840 --> 00:23:37,880
working in the lab, when you
guys are doing your research,

394
00:23:38,160 --> 00:23:40,720
how often do you guys ponder the
philosophical questions?

395
00:23:40,720 --> 00:23:42,240
I mean like the what is
consciousness?

396
00:23:42,240 --> 00:23:44,560
What is free will?
Do you guys ever sit down and

397
00:23:44,560 --> 00:23:46,760
discuss this?
Is it something that comes up or

398
00:23:46,760 --> 00:23:49,320
is that just for the couch
philosophers at home?

399
00:23:51,280 --> 00:23:53,960
So surprisingly, we do
it quite a lot.

400
00:23:54,480 --> 00:23:58,400
And so because we did write the
Human Affectome.

401
00:23:58,400 --> 00:24:03,280
The Human Affectome is an exercise
in incorporating philosophy into

402
00:24:03,280 --> 00:24:06,040
science.
And it's also a really nice

403
00:24:06,040 --> 00:24:10,240
example of how philosophy really
helps you organize

404
00:24:10,240 --> 00:24:13,840
what it is that you learn, even
organize the field, and

405
00:24:13,840 --> 00:24:16,920
especially putting all the
researchers on the same

406
00:24:17,400 --> 00:24:20,440
platform, and
also giving them a joint language

407
00:24:20,440 --> 00:24:24,080
such that they will be able to
communicate.

408
00:24:26,000 --> 00:24:35,400
And the experience we had
in our paper is that we had a

409
00:24:36,800 --> 00:24:40,040
table that spans kind of all
the components of

410
00:24:40,040 --> 00:24:42,800
assumptions that go into
scientific research like

411
00:24:42,800 --> 00:24:47,480
metaphysical assumptions,
pragmatic considerations, also

412
00:24:47,960 --> 00:24:50,320
theoretical virtues.
What is it that you yourself

413
00:24:50,320 --> 00:24:52,920
perceive as a good theory.
All of these made, you know,

414
00:24:52,920 --> 00:24:57,680
something like 7 rows, and we put
the affective field into that.

415
00:24:58,000 --> 00:25:02,040
So the exercise we started doing
with the graduate students is

416
00:25:02,280 --> 00:25:05,720
to have them fill out this table,
basically identify their

417
00:25:05,720 --> 00:25:07,880
different assumptions,
philosophical assumptions, you

418
00:25:07,880 --> 00:25:10,360
know, mechanistic
operationalization, construct

419
00:25:11,280 --> 00:25:14,560
pragmatic, and do that in
their own field, in their own

420
00:25:14,560 --> 00:25:19,760
thesis topic.
And that was a really wonderful

421
00:25:19,760 --> 00:25:23,280
experience because it's a simple
exercise, but it kind of

422
00:25:23,280 --> 00:25:26,600
fundamentally changes how you
think of your research.

423
00:25:26,600 --> 00:25:29,440
It really helps you organize.
It also helps you communicate

424
00:25:30,120 --> 00:25:35,480
and also makes you aware of many
things you took for granted or

425
00:25:35,960 --> 00:25:38,280
didn't think about, or that,
once you think about them, really

426
00:25:38,280 --> 00:25:41,560
enlighten your understanding of
what you do.

427
00:25:41,560 --> 00:25:46,040
So I'm, like, a really
big fan of incorporating

428
00:25:46,040 --> 00:25:49,000
philosophy into science.
And this year we're going to do

429
00:25:49,000 --> 00:25:53,920
the course again and maybe we'll
write something to share, you

430
00:25:53,920 --> 00:25:56,760
know, the syllabus with
other people if they're

431
00:25:56,760 --> 00:25:58,840
interested.
I think that's a brilliant

432
00:25:58,840 --> 00:26:00,880
approach.
It's, I think it's very

433
00:26:00,880 --> 00:26:04,000
underrated within science when
well, when you do that, you

434
00:26:04,000 --> 00:26:06,920
realize the normativity and
the amount of biases and

435
00:26:06,920 --> 00:26:09,560
fallacies we have when we think
about our own field.

436
00:26:09,920 --> 00:26:12,280
And you're right, it
gives us that common language to

437
00:26:12,280 --> 00:26:15,080
sort of dissect what we're
talking about, how we're

438
00:26:15,080 --> 00:26:18,000
discussing it, why we have the
similar biases that we might have

439
00:26:18,400 --> 00:26:20,400
and then move beyond
it.

440
00:26:20,400 --> 00:26:23,160
So I do think it's almost
like a tool for science, and

441
00:26:23,160 --> 00:26:27,240
it can only augment it.
Yes, like once you do it, you

442
00:26:27,240 --> 00:26:29,040
can't believe you didn't do it
before.

443
00:26:29,560 --> 00:26:32,560
And it's like, how did we even
survive without it?

444
00:26:32,560 --> 00:26:36,560
So I, I do hope we will do it
more and more and we'll be more

445
00:26:36,560 --> 00:26:40,520
aware of it.
Another nice benefit is that

446
00:26:40,760 --> 00:26:44,000
sometimes when you think there are
competing theories, or I don't

447
00:26:44,000 --> 00:26:47,680
know, maybe you have like a
nemesis theory, you actually

448
00:26:47,680 --> 00:26:50,440
don't argue at all because you
really study different things.

449
00:26:50,440 --> 00:26:52,520
So there's no competition
whatsoever.

450
00:26:52,720 --> 00:26:55,120
You're actually really
complementing each other.

451
00:26:55,120 --> 00:26:58,520
So that really changes also the
kind of social dynamics in the

452
00:26:58,520 --> 00:27:00,720
field.
Speaking of those social

453
00:27:00,720 --> 00:27:03,120
dynamics, when you look at the
field of neuroscience right now

454
00:27:03,120 --> 00:27:06,240
and your work and the work
people like you are doing in the

455
00:27:06,240 --> 00:27:08,920
field, is there anything in
particular that excites you

456
00:27:08,920 --> 00:27:11,280
right now?
It's 2025, there's so much

457
00:27:11,280 --> 00:27:13,560
going on.
We're seeing exponential growth in

458
00:27:13,560 --> 00:27:16,000
neuroscience research.
What excites you the most?

459
00:27:18,320 --> 00:27:27,200
Wow, there's a lot. So, well,
one thing is that, from my

460
00:27:27,200 --> 00:27:29,760
experience, something we started
doing, but people have been

461
00:27:29,760 --> 00:27:35,960
doing for quite a while now is
studying the human brain with

462
00:27:35,960 --> 00:27:39,120
intracerebral recordings
because we have access with

463
00:27:39,360 --> 00:27:44,480
epilepsy patients while
they're just, you know,

464
00:27:44,480 --> 00:27:47,560
waiting for it.
So epilepsy patients, they will

465
00:27:47,560 --> 00:27:50,120
come to the hospital and kind of
stay there for about a week.

466
00:27:50,360 --> 00:27:54,080
They will have electrodes
implanted and wait for a

467
00:27:54,080 --> 00:27:56,520
seizure to happen.
And this helps the neurosurgeon

468
00:27:56,520 --> 00:28:01,000
identify the source of the
seizure and then map it and

469
00:28:01,000 --> 00:28:04,200
target it in an invasive
procedure later on.

470
00:28:04,600 --> 00:28:07,600
But for a few days, they're just
there with electrodes in their

471
00:28:07,600 --> 00:28:10,600
brain.
And so they volunteered to

472
00:28:10,600 --> 00:28:12,800
do some studies.
And that gives us really

473
00:28:12,800 --> 00:28:17,280
unparalleled access to the human
brain, something that we could

474
00:28:17,280 --> 00:28:22,480
previously only do with animals.
But the added value of doing it

475
00:28:22,480 --> 00:28:24,760
in humans is that now you can do
things like the social

476
00:28:24,760 --> 00:28:28,920
navigation and you can look at
just them talking, you know,

477
00:28:28,920 --> 00:28:31,000
natural language, even
interacting.

478
00:28:32,160 --> 00:28:35,160
So this, this is just kind of,
you don't see a lot of that.

479
00:28:35,160 --> 00:28:38,760
It's the beginning, but this is
very exciting and you can

480
00:28:38,760 --> 00:28:44,320
combine that or do it separately
with virtual reality and

481
00:28:44,320 --> 00:28:48,400
augmented reality.
So experiments become more and

482
00:28:48,400 --> 00:28:52,240
more naturalistic and closer to
the real-life

483
00:28:52,240 --> 00:28:55,240
experience, which is very
important, especially in

484
00:28:55,240 --> 00:28:59,200
relation to trauma.
For example, we had a study

485
00:28:59,200 --> 00:29:03,240
where we asked people with PTSD
to listen to a recording that

486
00:29:03,240 --> 00:29:07,920
describes their own personal
trauma versus a regular memory.

487
00:29:08,440 --> 00:29:11,880
So you can imagine someone is in
the fMRI scanner and we can look

488
00:29:11,880 --> 00:29:14,600
at their brain while they
listen to something like that,

489
00:29:14,640 --> 00:29:17,640
you know, someone just talking,
but describing a personal

490
00:29:17,640 --> 00:29:22,000
experience, then we can compare
it to a non-personal experience.

491
00:29:22,480 --> 00:29:25,840
And it's very,
very naturalistic and it's their

492
00:29:25,840 --> 00:29:27,800
own personal memory.
And this is something you

493
00:29:27,920 --> 00:29:31,480
couldn't study before because
what we do, usually we bring

494
00:29:31,480 --> 00:29:34,040
everybody to the lab and they
all have the same experience,

495
00:29:34,040 --> 00:29:37,280
which is very controlled, like
looking at the stimulus on a

496
00:29:37,280 --> 00:29:40,200
computer, making that stimulus
scary.

497
00:29:40,520 --> 00:29:44,120
But it's all very controlled and
organized.

498
00:29:44,600 --> 00:29:46,840
But really what you're
interested in is the personal

499
00:29:46,840 --> 00:29:49,840
trauma.
So now we can begin to see how

500
00:29:49,840 --> 00:29:54,640
it gives us access to that.
And because we have these

501
00:29:54,640 --> 00:29:58,000
sophisticated analytical
methods and machine learning, and

502
00:29:58,000 --> 00:30:01,880
we have ways now to manage
complicated data and massive

503
00:30:01,880 --> 00:30:06,280
data, and we're just beginning
to see the use of it in

504
00:30:06,280 --> 00:30:08,840
science.
So I think that's, that's really

505
00:30:09,640 --> 00:30:11,960
exciting.
That's why I said it like, you

506
00:30:11,960 --> 00:30:15,560
know, that, because I can
just imagine, you know, like

507
00:30:16,160 --> 00:30:18,960
many, many studies now for
the next 10 years.

508
00:30:19,440 --> 00:30:23,640
Back when you started,
did you ever think that AI would

509
00:30:23,640 --> 00:30:26,760
begin to assist neuroscience in
the way it has in the last few

510
00:30:26,760 --> 00:30:31,240
years?
No, not at all.

511
00:30:31,240 --> 00:30:36,160
It's like, I didn't
even imagine that. It was really

512
00:30:36,160 --> 00:30:41,320
just the plain old science
with control conditions and

513
00:30:41,400 --> 00:30:46,160
simple conditions, and clean, you
know, stripped of anything else.

514
00:30:46,680 --> 00:30:49,440
It's kind of interesting because
there's a phenomenon of interest

515
00:30:49,760 --> 00:30:53,920
and what you do in one approach
is to kind of strip

516
00:30:53,920 --> 00:30:57,640
everything out of it such that
you can isolate it and look at

517
00:30:57,640 --> 00:31:01,120
it.
And now what we do is really

518
00:31:01,640 --> 00:31:04,080
tuck it into the mess of
life.

519
00:31:04,640 --> 00:31:09,480
But if we do manage to track it
or we believe it exists, like

520
00:31:09,480 --> 00:31:12,760
for example, that computation
that I mentioned, like geometric

521
00:31:12,760 --> 00:31:16,080
structure of social navigation,
we should be able to track it

522
00:31:16,080 --> 00:31:20,680
out of all the mess.
So it actually, if it exists, it

523
00:31:20,680 --> 00:31:24,120
should arise with all of
this noise because it's ordering

524
00:31:24,120 --> 00:31:27,240
the noise.
So I find it very, very

525
00:31:27,240 --> 00:31:30,080
compelling.
And I still think that there's

526
00:31:30,080 --> 00:31:32,840
room for both, right?
There's like the classic way and

527
00:31:32,840 --> 00:31:36,480
kind of the new way.
I also see many students, which

528
00:31:36,480 --> 00:31:41,160
is very nice to see that they
combine advisors.

529
00:31:41,440 --> 00:31:44,760
So they have like advisors that
do like more naturalistic stuff.

530
00:31:44,760 --> 00:31:48,360
And then the more kind of
conservative advisors that do

531
00:31:48,360 --> 00:31:51,920
the more organized stuff because
they themselves see that you do

532
00:31:51,920 --> 00:31:56,880
need both, kind of: you
need the very clear and

533
00:31:56,880 --> 00:32:01,880
organized analytical thinking of
an experimental design, but also

534
00:32:01,880 --> 00:32:06,120
the flexibility and, I guess,
somewhat the creativity of

535
00:32:06,120 --> 00:32:08,200
incorporating the
experience.

536
00:32:08,920 --> 00:32:12,480
Another thing that is important
is that we used to think that

537
00:32:12,480 --> 00:32:15,000
when you translate animal work to
human work,

538
00:32:15,040 --> 00:32:19,520
it has to be exactly the same.
And that had a lot of problems

539
00:32:19,520 --> 00:32:23,040
because it's very hard for it to
be exactly the same.

540
00:32:23,040 --> 00:32:26,040
So even with fear conditioning,
where you have a stimulus paired

541
00:32:26,040 --> 00:32:28,400
with, let's say, an electric
shock, and you do it in animals

542
00:32:28,400 --> 00:32:31,720
and in humans, still humans, you
know, they have expectations.

543
00:32:31,720 --> 00:32:33,720
They are really influenced by
the context.

544
00:32:33,720 --> 00:32:37,800
They kind of overthink.
They're really influenced by the

545
00:32:37,800 --> 00:32:41,440
instructions that you give them.
So it will never be identical.

546
00:32:41,960 --> 00:32:46,320
But I think if there's a
principle of, let's say,

547
00:32:46,320 --> 00:32:53,000
navigation, you could find it in
humans in a whole different way,

548
00:32:53,160 --> 00:32:55,480
but it will still be exactly the
same computation.

549
00:32:55,480 --> 00:32:59,400
You just arrive at it from the
human experience.

550
00:32:59,720 --> 00:33:03,080
So it doesn't have to be exactly
identical, as long as what

551
00:33:03,080 --> 00:33:05,480
you really isolate is the
computation itself or the

552
00:33:05,480 --> 00:33:07,360
representation that you're
trying to capture.

553
00:33:09,160 --> 00:33:12,560
When we think of these, I mean
it's these soft skills that sort

554
00:33:12,560 --> 00:33:16,200
of separate us from
machines at this point.

555
00:33:16,800 --> 00:33:19,640
When you think about how far
it's come and where it's going,

556
00:33:20,000 --> 00:33:22,320
do you think we'll ever reach a
point where some sort of an

557
00:33:22,320 --> 00:33:26,200
electrical or silicon system can
reach the complexity of a brain?

558
00:33:26,200 --> 00:33:29,760
I mean, it's 2% of our
body's mass, and yet 20% of our

559
00:33:29,760 --> 00:33:33,720
body's energy is consumed.
And at what point will a system

560
00:33:33,720 --> 00:33:37,080
be able to do that at a more
efficient rate and sort of

561
00:33:37,080 --> 00:33:39,880
produce these experiences
similar to us?

562
00:33:39,880 --> 00:33:43,080
Do you think that's possible
firstly, and and what are your

563
00:33:43,080 --> 00:33:44,960
thoughts on when that might
happen if so?

564
00:33:47,280 --> 00:33:49,840
Yeah.
Well, there are the technical

565
00:33:49,840 --> 00:33:54,000
aspects of it that, you know,
the current systems, they just

566
00:33:54,000 --> 00:33:57,960
get heated, and that's a big
problem.

567
00:33:57,960 --> 00:34:02,080
I know that there's, you know,
nanophotonics, it's supposed to

568
00:34:02,080 --> 00:34:07,080
be much more effective in terms
of, you know, solving this

569
00:34:07,080 --> 00:34:11,960
temperature problem.
So maybe something will be

570
00:34:11,960 --> 00:34:14,400
there.
But in terms of, I don't know,

571
00:34:14,400 --> 00:34:16,840
if you're asking about something
that is more similar to the

572
00:34:16,840 --> 00:34:22,080
brain, it might have to be, you
know, with organic matter.

573
00:34:22,280 --> 00:34:26,600
Or if we want, if you want to
talk about conscious machines,

574
00:34:28,199 --> 00:34:31,760
as long as they're not embedded
in the environment and don't have to

575
00:34:31,760 --> 00:34:35,159
produce their own material like
an organism,

576
00:34:36,080 --> 00:34:40,080
there will always be kind of
this fundamental barrier between

577
00:34:40,080 --> 00:34:42,280
considering what is conscious
and what is not.

578
00:34:42,920 --> 00:34:46,840
Yeah, I think that's what some people
call it mortal computation.

579
00:34:46,840 --> 00:34:49,080
That's sort of the fact that we
will die, and having to sort of

580
00:34:49,080 --> 00:34:52,960
do things
within this universe to survive

581
00:34:52,960 --> 00:34:56,080
and keep ourselves alive and
thrive is a fundamental part of

582
00:34:56,080 --> 00:35:00,480
being a conscious being at
this point. And most systems

583
00:35:00,480 --> 00:35:03,160
don't have that, or most
mechanistic ones don't.

584
00:35:04,360 --> 00:35:05,760
Yeah.
I mean, you could say that

585
00:35:05,760 --> 00:35:10,200
this is like the perfect
question of purpose that I

586
00:35:10,200 --> 00:35:13,680
mentioned early on.
This is what the organism is

587
00:35:13,680 --> 00:35:18,640
doing and this is where you find
kind of consciousness and

588
00:35:18,640 --> 00:35:20,640
affect and the content of
consciousness.

589
00:35:21,240 --> 00:35:25,080
It's exactly for that, you know,
to exist, to reproduce your

590
00:35:25,080 --> 00:35:28,120
own material, to be independent,
a separate unit from the

591
00:35:28,120 --> 00:35:30,600
environment, but interacting
with the environment and

592
00:35:30,600 --> 00:35:35,520
whatever you do is for the sake
of continuing to be, right?

593
00:35:35,560 --> 00:35:38,080
And for this, you have
representations of the

594
00:35:38,080 --> 00:35:43,200
environment and you can
have abstraction, which is this

595
00:35:43,400 --> 00:35:47,040
added ability that we can find,
you know, more confidently in

596
00:35:47,040 --> 00:35:52,240
humans, maybe in other animals
as well, which really expands

597
00:35:52,240 --> 00:35:55,360
your field of relevance of
what you can interact with in

598
00:35:55,360 --> 00:35:57,920
the environment.
These are kind of all levels

599
00:35:57,960 --> 00:36:03,680
of evolution.
But so, this is

600
00:36:03,680 --> 00:36:07,600
where you find consciousness.
It's like, for that purpose,

601
00:36:07,640 --> 00:36:10,760
you know, you could say.
So if we don't have it in a

602
00:36:10,760 --> 00:36:16,880
machine, then I think it will be
difficult to conclude that

603
00:36:16,880 --> 00:36:20,120
it's similar to a conscious
organism.

604
00:36:20,240 --> 00:36:22,720
But I don't know, it's a
really complicated debate.

605
00:36:22,840 --> 00:36:24,520
So.
Yeah, no, it's one of those

606
00:36:24,520 --> 00:36:26,920
things that keep me up at night.
One of the other things

607
00:36:26,920 --> 00:36:29,280
would be free will,
which the channel explores

608
00:36:29,280 --> 00:36:31,840
quite thoroughly.
What are your thoughts on that

609
00:36:31,840 --> 00:36:34,640
from a neuroscience perspective,
do you believe?

610
00:36:34,720 --> 00:36:36,560
I mean, I know it's a
complicated question and even

611
00:36:36,560 --> 00:36:39,400
asking whether we have free will
is quite simplistic.

612
00:36:40,320 --> 00:36:44,480
But when you think about will,
freedom of choice, and just

613
00:36:44,480 --> 00:36:46,720
having this, what do you think
about free will?

614
00:36:51,040 --> 00:36:53,400
It's funny, it's like everybody,
you know, comes with their own

615
00:36:53,400 --> 00:36:56,960
real solution to, like, the
most complicated problem.

616
00:36:57,240 --> 00:36:59,560
The problem is that if you
think you have a solution, you

617
00:36:59,560 --> 00:37:01,480
probably don't understand the
problem, right?

618
00:37:02,080 --> 00:37:06,440
So you're always risking it.
But I think there's like

619
00:37:07,320 --> 00:37:13,600
something to do with
probabilities.

620
00:37:13,760 --> 00:37:18,840
I guess it's a weird
thing to say, but there's a

621
00:37:18,840 --> 00:37:22,400
problem with determinism, right?
That one thing leads to

622
00:37:22,400 --> 00:37:24,600
another and therefore we
don't have free will.

623
00:37:24,600 --> 00:37:26,000
It's kind of all
determined.

624
00:37:26,760 --> 00:37:31,600
But we do find situations in the
world where they call it like

625
00:37:31,600 --> 00:37:36,360
the land of equal probabilities,
where things are equal, you know,

626
00:37:36,360 --> 00:37:38,320
but you still need to make
a choice.

627
00:37:38,840 --> 00:37:44,280
So yeah, I'm just like, I'm
wondering if the fact that we

628
00:37:44,280 --> 00:37:50,240
make a choice from an equal
landscape of probabilities is

629
00:37:50,240 --> 00:37:56,240
the sense of free will.
So it's a sort of, yeah.

630
00:37:56,240 --> 00:37:59,120
But it's really just like
my science fiction theory,

631
00:37:59,200 --> 00:38:01,560
right?
That's where I stand.

632
00:38:01,560 --> 00:38:04,920
But I think I would be
curious to see where it goes in

633
00:38:04,920 --> 00:38:08,880
terms of choosing from that.
That's kind of the

634
00:38:08,880 --> 00:38:11,720
essence of free will, right?
The fact that you have a choice.

635
00:38:12,360 --> 00:38:15,680
But growing up in Israel, did
you find that you always wanted

636
00:38:15,680 --> 00:38:17,360
to be a neuroscientist?
Was that something that was

637
00:38:17,360 --> 00:38:20,200
always on your mind, or was that
something that just happened

638
00:38:20,200 --> 00:38:23,040
while delving into the
philosophy of mind over time?

639
00:38:25,680 --> 00:38:28,280
I think, I mean, I don't know
who is the person that really

640
00:38:28,280 --> 00:38:30,960
plans their future.
I don't think that person exists, really.

641
00:38:30,960 --> 00:38:35,760
It's just in retrospect that
sometimes it seems like

642
00:38:35,760 --> 00:38:38,600
everything was planned because
one thing builds on another.

643
00:38:38,600 --> 00:38:42,360
It's like just so nicely
crafted, but it's like

644
00:38:42,440 --> 00:38:46,280
absolutely not.
It's like you really take it day

645
00:38:46,280 --> 00:38:48,920
by day.
At each stage I had no

646
00:38:48,920 --> 00:38:53,760
idea if and how I would even
survive the next, and whether it

647
00:38:53,760 --> 00:38:59,920
would lead to anything else.
Also, it's not like I had

648
00:38:59,920 --> 00:39:03,600
like a worldview of "I want to
study this and that." Just, like,

649
00:39:03,600 --> 00:39:05,880
every moment I was interested in
something.

650
00:39:05,880 --> 00:39:10,600
And I think because you
follow your passion, then you're

651
00:39:10,600 --> 00:39:13,200
consistent.
You see, you're not consistent

652
00:39:13,200 --> 00:39:16,360
at first.
It's like you just do whatever

653
00:39:16,360 --> 00:39:18,160
at the moment is the most
important to you.

654
00:39:18,160 --> 00:39:22,680
And because it's you, and you
do have some passions

655
00:39:22,680 --> 00:39:25,080
and interests, then one
thing leads to another and

656
00:39:25,080 --> 00:39:27,880
it actually becomes a very
coherent path.

657
00:39:28,560 --> 00:39:32,240
But yeah, no, I mean, I was
really interested in

658
00:39:32,240 --> 00:39:36,040
astronomy and astrophysics.
That was my passion as a

659
00:39:36,360 --> 00:39:38,560
child.
And I always thought I would go

660
00:39:38,560 --> 00:39:42,920
in that direction.
There was no science

661
00:39:43,400 --> 00:39:46,760
like a space program or something
when I grew up.

662
00:39:46,760 --> 00:39:50,280
So I just, you know, I ended up,
I found myself sort of in the,

663
00:39:50,320 --> 00:39:53,040
the field of neuroscience, which
was like the next best thing,

664
00:39:53,040 --> 00:39:55,040
you know, in terms of being
complicated.

665
00:39:56,520 --> 00:39:58,240
Yeah.
In retrospect, it does reflect

666
00:39:58,240 --> 00:40:00,200
what I was always curious
about.

667
00:40:00,320 --> 00:40:05,560
But if you just
follow your passion, you'll find

668
00:40:05,560 --> 00:40:08,400
your path, as opposed to planning
your path ahead of time.

669
00:40:09,160 --> 00:40:11,320
We're exactly the same in that,
because when I was younger,

670
00:40:11,320 --> 00:40:12,520
that's the same thing that
happened to me.

671
00:40:12,520 --> 00:40:13,960
I wanted to be an
astrophysicist.

672
00:40:14,440 --> 00:40:16,560
And then eventually I found
myself going down the route of

673
00:40:16,560 --> 00:40:19,160
mental health and medicine.
And then, because I felt like the

674
00:40:19,160 --> 00:40:21,800
mind was the thing, studying the brain
would be the closest thing to

675
00:40:21,800 --> 00:40:24,360
that, which is strange,
but yet it is.

676
00:40:24,560 --> 00:40:26,440
It's fundamentally a universe in
itself.

677
00:40:27,080 --> 00:40:28,520
Yeah.
I mean, I can.

678
00:40:28,840 --> 00:40:31,280
I can.
I think I did have one, at least

679
00:40:31,280 --> 00:40:36,680
one philosophical approach that
that I was aware of early on,

680
00:40:36,680 --> 00:40:41,840
which is how I treated fear
because my interest in fear was

681
00:40:41,840 --> 00:40:49,120
related to freedom, because
I just felt it's very limiting.

682
00:40:50,080 --> 00:40:54,040
You're trapped in your fear and
also trapped in memories.

683
00:40:54,600 --> 00:40:59,240
So I found or thought that
overcoming fear is really about

684
00:40:59,240 --> 00:41:04,320
liberation, kind of
just removing obstacles.

685
00:41:05,360 --> 00:41:09,360
That's how I treated
fear, you know. Speaking of free

686
00:41:09,360 --> 00:41:11,880
will, you know, you don't want
something to dictate.

687
00:41:12,320 --> 00:41:15,160
It's like something external
that dictates whatever you do.

688
00:41:15,600 --> 00:41:19,520
So it's really about freedom.
And, and has that changed over

689
00:41:19,520 --> 00:41:22,160
the years or do you still find
that sort of liberation when you

690
00:41:22,160 --> 00:41:24,960
talk about breaking away from
fear and like helping people

691
00:41:24,960 --> 00:41:26,520
with that with your work and
research?

692
00:41:28,000 --> 00:41:32,760
Yes, it even expanded because
now it goes to everything.

693
00:41:33,040 --> 00:41:35,560
It's about our emotions and
about our memories.

694
00:41:36,080 --> 00:41:39,120
So I did have a fundamental
change in how I think about

695
00:41:39,120 --> 00:41:43,600
memories because, you know, you
grew up thinking that memories

696
00:41:43,600 --> 00:41:46,600
are who you are and you don't
even doubt them.

697
00:41:46,600 --> 00:41:49,200
It's just like a story that
you're stuck with.

698
00:41:49,760 --> 00:41:51,960
And it's just like repeating,
you know, you always have this

699
00:41:51,960 --> 00:41:55,400
like memory in mind, and
it's who you are and it's your

700
00:41:55,400 --> 00:41:58,720
life experience.
And now it's like it's nothing,

701
00:41:59,200 --> 00:42:03,880
nothing at all because first of
all, it's just a choice.

702
00:42:03,880 --> 00:42:06,400
You know, whatever you remember
is a choice and how you remember

703
00:42:06,400 --> 00:42:09,400
it is a choice.
And, and also it's a

704
00:42:09,400 --> 00:42:12,520
possibility.
It's not, you're not entirely

705
00:42:12,520 --> 00:42:15,760
sure that this is what happened.
So in a way, just like you

706
00:42:15,760 --> 00:42:18,800
predict the future, you
almost like predict

707
00:42:18,800 --> 00:42:20,720
the past.
You can have a hypothesis about

708
00:42:20,720 --> 00:42:24,080
the past.
So it's an option now.

709
00:42:25,560 --> 00:42:28,800
And so it's not
self-defining anymore.

710
00:42:29,280 --> 00:42:37,720
And also the fact that emotions
are are information in a way and

711
00:42:37,720 --> 00:42:41,000
also they could be flexible.
Then you also start having a

712
00:42:41,000 --> 00:42:45,360
relationship with your memory
and with your emotion as opposed

713
00:42:45,360 --> 00:42:48,200
to just operating at every given
moment.

714
00:42:48,200 --> 00:42:50,360
If you have a memory, then it
gives you information.

715
00:42:50,360 --> 00:42:51,760
It's like, why do I remember it
now?

716
00:42:52,040 --> 00:42:54,680
It actually tells you a lot
about your situation now as

717
00:42:54,680 --> 00:42:57,280
opposed to what actually
happened before.

718
00:42:57,840 --> 00:43:00,600
That's the most important
information that it gives you.

719
00:43:01,240 --> 00:43:03,520
And emotions:
It's like if you're afraid, it's

720
00:43:03,520 --> 00:43:06,120
like, because people can say,
yeah, but I can't do it because

721
00:43:06,120 --> 00:43:07,680
I'm afraid.
And that's it.

722
00:43:07,680 --> 00:43:09,960
You know, it's like, no, but
it's not.

723
00:43:10,520 --> 00:43:13,120
You can still have a
relationship with that fear,

724
00:43:13,120 --> 00:43:15,400
right?
You can, you don't necessarily

725
00:43:15,400 --> 00:43:18,400
have to feel like that or you
can do something despite that.

726
00:43:18,400 --> 00:43:23,480
So all of these self-defining
aspects of your life

727
00:43:23,480 --> 00:43:27,840
are not defining anymore.
And in a way, you become this

728
00:43:30,320 --> 00:43:35,000
free spirit, that kind of
liberated creature that you kind

729
00:43:35,000 --> 00:43:39,320
of create yourself every moment
in a way by interacting with

730
00:43:39,320 --> 00:43:43,000
these entities.
Yeah, it's

731
00:43:43,000 --> 00:43:45,040
crazy, because when people
talk about

732
00:43:45,040 --> 00:43:48,080
prior information dictating the
posterior outcomes or

733
00:43:48,080 --> 00:43:51,160
conclusions, yeah, what you're
saying is almost like it's

734
00:43:51,160 --> 00:43:53,440
actually, we have these
posterior conclusions that we

735
00:43:53,440 --> 00:43:55,920
keep making that actually
dictate the next priors.

736
00:43:55,920 --> 00:43:57,640
So this choice goes all the way

737
00:43:57,640 --> 00:43:59,320
down.
It never stops.

738
00:43:59,920 --> 00:44:02,200
And every, at every point we
want to stop and actually ask

739
00:44:02,200 --> 00:44:04,400
another question.
We can fundamentally change the

740
00:44:04,400 --> 00:44:06,680
conclusion in an instant,
actually.

741
00:44:07,080 --> 00:44:09,080
Or maybe not in an instant,
but over time at least.

742
00:44:10,160 --> 00:44:14,760
Yeah.
And, and also another thought I

743
00:44:14,760 --> 00:44:19,480
had that is also a more recent
insight of sorts is that so

744
00:44:19,480 --> 00:44:22,240
first you, you have the story of
your life, right?

745
00:44:22,640 --> 00:44:25,400
And then the next stage is like,
well, maybe it's not the story

746
00:44:25,400 --> 00:44:27,600
of my life, right?
You can start to doubt the story

747
00:44:27,600 --> 00:44:30,920
and you kind of see that it's
modifiable and it's flexible.

748
00:44:30,920 --> 00:44:35,080
It's not the absolute truth.
And you really learn about the

749
00:44:35,080 --> 00:44:37,640
moment and who you are now and
what, what you need.

750
00:44:39,240 --> 00:44:43,000
And then the next stage is, is
not committing to any story.

751
00:44:43,000 --> 00:44:46,280
It's actually living with
multiple stories.

752
00:44:46,800 --> 00:44:51,240
And I have kind of a, you know,
pet theory, like with the

753
00:44:51,240 --> 00:44:55,720
consciousness and free will, that
it's actually very adaptive to

754
00:44:55,720 --> 00:44:59,120
have a wide range of options.
So the more stories you have

755
00:44:59,120 --> 00:45:02,960
about yourself or the
possibilities, the better.

756
00:45:03,640 --> 00:45:07,600
Because if you just, you know,
commit to one storyline, it's very

757
00:45:07,600 --> 00:45:10,400
restrictive.
If like two or three, it's fine.

758
00:45:10,400 --> 00:45:13,920
But it's like at any
given moment, you

759
00:45:13,920 --> 00:45:18,200
treat everything as like
possibilities and, and one is

760
00:45:18,200 --> 00:45:23,240
like more probable or more
coherent or makes sense or

761
00:45:23,240 --> 00:45:26,080
something.
But you're kind of very flexible

762
00:45:26,080 --> 00:45:27,680
with that too.
Yeah.

763
00:45:27,680 --> 00:45:30,080
So, and that makes sense because
when you think of certain

764
00:45:30,200 --> 00:45:32,800
theories that were told back in
the day, when you think about

765
00:45:32,800 --> 00:45:36,120
certain narcissists or
pathological liars and you see

766
00:45:36,120 --> 00:45:39,040
how prior to the invention of
proof or like photography,

767
00:45:39,040 --> 00:45:41,280
videography, you could get away
with so much.

768
00:45:41,680 --> 00:45:45,240
And, and it was often a trait
that helped them evolutionarily to

769
00:45:45,240 --> 00:45:47,240
get through lots of things.
You could lie your way through

770
00:45:47,240 --> 00:45:48,800
anything.
There was no form of evidence

771
00:45:48,800 --> 00:45:50,480
out there.
People weren't keeping track.

772
00:45:50,800 --> 00:45:53,800
And then over time, as the as
digital media progressed, as

773
00:45:53,800 --> 00:45:58,240
this proof, as we have notes and
prior information, now it's

774
00:45:58,240 --> 00:45:59,880
actually not working as
much.

775
00:45:59,880 --> 00:46:03,360
But you can tell how having that
ability to almost lie on demand,

776
00:46:03,640 --> 00:46:06,280
being able to be malleable as a
character, be a chameleon, do

777
00:46:06,280 --> 00:46:09,280
whatever you need to do, would
actually benefit you in so many

778
00:46:09,280 --> 00:46:11,680
ways.
Because you can literally drop

779
00:46:11,680 --> 00:46:13,840
into a different country,
pretend like you're from there.

780
00:46:15,120 --> 00:46:17,200
Yeah, yeah.
But but there's a there's a

781
00:46:17,200 --> 00:46:19,520
caveat.
So I'm glad you you gave that

782
00:46:19,520 --> 00:46:24,280
example, because it's not
what I meant, so I'm glad

783
00:46:24,280 --> 00:46:26,840
you went there so that now I
can clarify.

784
00:46:28,200 --> 00:46:33,200
So it's not about inventing your
life as you go.

785
00:46:33,200 --> 00:46:38,000
It's it's actually, I believe
that this is how you become the

786
00:46:38,000 --> 00:46:42,360
most authentic and true to
yourself.

787
00:46:42,360 --> 00:46:46,760
You see, I mean, there's still a
you there that because there's

788
00:46:46,760 --> 00:46:50,040
still the end, the you that have
the interaction with the memory.

789
00:46:50,360 --> 00:46:54,720
The only thing
I'm saying is that it

790
00:46:54,720 --> 00:47:01,080
is not forced upon
you anymore or it becomes

791
00:47:01,080 --> 00:47:05,840
information rather than a
ready-made thing, because you never

792
00:47:05,840 --> 00:47:09,040
chose it in a way if you just
have your storylines like my

793
00:47:09,040 --> 00:47:11,800
childhood was like this and this
one did this to me.

794
00:47:11,800 --> 00:47:17,400
And then that's what I am now.
Now it's like, you know, well,

795
00:47:17,400 --> 00:47:20,400
maybe not, or maybe I can look
at it differently, or maybe now

796
00:47:20,400 --> 00:47:25,640
I have more information or maybe
I can find a way not to be

797
00:47:25,640 --> 00:47:27,760
afraid.
But you, you see, you just like

798
00:47:27,760 --> 00:47:32,080
become more like an, an artist
with a lot of material.

799
00:47:32,440 --> 00:47:33,720
Yeah, it's kind of like a
refresh.

800
00:47:34,800 --> 00:47:36,720
It's kind of like reframing
because it's almost like

801
00:47:36,720 --> 00:47:40,440
uncaging yourself and
remembering a lot more of what has

802
00:47:40,440 --> 00:47:42,600
actually happened to you.
Let's say the day that you had a

803
00:47:42,600 --> 00:47:45,120
traumatic experience, there might
have been the most beautiful

804
00:47:45,120 --> 00:47:46,640
sunset that occurred the same
night.

805
00:47:47,000 --> 00:47:49,720
And you could always rewire that
into a different thought, like,

806
00:47:49,720 --> 00:47:54,520
OK, that day was more about this
beautiful sunset. Or am I close

807
00:47:54,520 --> 00:47:56,000
to what you're talking
about?

808
00:47:56,000 --> 00:47:59,760
Or yeah, because what I want to
say is that it's not less true.

809
00:48:00,080 --> 00:48:03,520
Yeah, so, so unlike the the
confabulating person, I mean,

810
00:48:03,520 --> 00:48:07,360
that person doesn't even have
a stable self, right?

811
00:48:07,360 --> 00:48:08,920
It's like you don't know who
that person is.

812
00:48:08,920 --> 00:48:10,480
They're just like moment by
moment.

813
00:48:11,400 --> 00:48:16,840
Actually they are, they're
like changing by the moment

814
00:48:16,840 --> 00:48:19,680
depending on the moment.
And and I'm talking about

815
00:48:19,680 --> 00:48:26,760
actually resisting that.
You see, because it's like you,

816
00:48:27,480 --> 00:48:30,840
you're not your emotions, and your
emotions are not something that

817
00:48:31,600 --> 00:48:34,280
are just there.
It's like I have this emotion.

818
00:48:34,280 --> 00:48:36,600
You see, you sort of free
yourself and now you have like a

819
00:48:36,600 --> 00:48:40,600
lot of choice, or at least you
interact with it as opposed to

820
00:48:40,920 --> 00:48:43,880
it's like floating.
It's almost like you're detaching

821
00:48:43,880 --> 00:48:46,720
from yourself, but you're
detaching from that moment

822
00:48:46,880 --> 00:48:50,440
rather. So, unlike the
confabulator, the psychopath,

823
00:48:50,440 --> 00:48:53,040
you know, you're not becoming
new people each time, but

824
00:48:53,040 --> 00:48:55,480
you're, you're aligning your
stories to parts that you kind

825
00:48:55,480 --> 00:48:58,320
of want to go on and that have
actually happened.

826
00:48:58,360 --> 00:49:00,880
And you're just navigating
through that territory rather

827
00:49:00,880 --> 00:49:02,480
than the one that brought you
down.

828
00:49:03,960 --> 00:49:07,040
Yeah, yeah, yeah.
I think I like to think about

829
00:49:07,040 --> 00:49:09,240
it.
I like the artist example

830
00:49:09,240 --> 00:49:13,200
because it's like you just have
more material to work with to

831
00:49:13,200 --> 00:49:17,840
shape your life.
Has it ever been a concern

832
00:49:17,840 --> 00:49:21,480
that you get so caught up in the
artwork you forget what the

833
00:49:21,480 --> 00:49:23,120
canvas looked like when it
started?

834
00:49:28,200 --> 00:49:34,560
I think not, because I think
this process actually makes your

835
00:49:35,120 --> 00:49:41,640
core stronger because you, you
access the core, which is the

836
00:49:41,640 --> 00:49:44,480
the thing, you know, that it
interacts with all the things

837
00:49:44,480 --> 00:49:50,320
and consider them, you know,
And also, a

838
00:49:50,320 --> 00:49:55,360
lot of it sounds very cerebral,
but it's not

839
00:49:55,360 --> 00:50:05,720
entirely like that.
It's really about, you know,

840
00:50:05,720 --> 00:50:09,960
like music, when you sort of
improvise, and you have

841
00:50:09,960 --> 00:50:13,520
like a way to kind of, so you
listen and then you go in a

842
00:50:13,520 --> 00:50:16,360
certain way, but but you keep
listening and then you go in

843
00:50:16,360 --> 00:50:19,720
another way.
So.

844
00:50:20,000 --> 00:50:21,800
Like jazz in that respect.

845
00:50:21,800 --> 00:50:24,040
Yeah, yeah.
It's more like that.

846
00:50:24,880 --> 00:50:28,400
Tell me, Daniela, within
this field —

847
00:50:28,400 --> 00:50:30,400
neuroscience is such a diverse
field, a lot of people don't

848
00:50:30,400 --> 00:50:32,080
realize it.
I mean, it forms the basis of so

849
00:50:32,080 --> 00:50:33,920
many different things.
It can be economics, it can go

850
00:50:33,920 --> 00:50:36,240
down to the subatomic level.
It can go into chemistry,

851
00:50:36,240 --> 00:50:39,440
whatever.
Which parts about neuroscience

852
00:50:39,440 --> 00:50:45,320
or the brain are you most
passionate about right now and

853
00:50:45,520 --> 00:50:48,120
what are you most looking
forward to in the future of

854
00:50:48,200 --> 00:50:56,160
neuroscience?
I'm kind of, I'm excited about,

855
00:50:56,160 --> 00:50:59,640
well, the mind.
So I would say mental actions

856
00:50:59,960 --> 00:51:05,760
because I'm kind of leaning
toward a view that really merges

857
00:51:05,840 --> 00:51:08,600
cognition and affect.
So I wouldn't say I'm interested

858
00:51:08,600 --> 00:51:10,480
in affect.
It's, it's really about this

859
00:51:10,480 --> 00:51:14,400
subjective experience or felt
experience and you have these

860
00:51:14,400 --> 00:51:17,680
different components.
So it's really all of the

861
00:51:17,680 --> 00:51:19,800
above.
So there's the neurobiological,

862
00:51:19,800 --> 00:51:24,760
there's literally, even before
that, the very physical, you

863
00:51:24,760 --> 00:51:29,240
know, network of an organism,
which is like how units interact

864
00:51:29,240 --> 00:51:32,840
and produce materials, so
it's like these

865
00:51:32,840 --> 00:51:36,560
very basic systems.
And then on top of that, you

866
00:51:36,560 --> 00:51:39,320
start to have the mental
actions, which is perception,

867
00:51:39,320 --> 00:51:43,960
attention, sensory information.
It's like the components and

868
00:51:43,960 --> 00:51:49,200
layered on it is affect, which
is the experience of the entire

869
00:51:49,200 --> 00:51:52,840
organism.
So in a way the cognitive

870
00:51:53,280 --> 00:51:56,080
processes that we studied
separately until now are the

871
00:51:56,080 --> 00:52:01,480
components that comprise the
agent as a whole.

872
00:52:01,760 --> 00:52:05,040
And the agent as a whole has
this like global computation

873
00:52:05,200 --> 00:52:07,800
that can then constrain the
cognitive aspect.

874
00:52:07,800 --> 00:52:12,960
So for example, if you have the
concern of threat, if you go

875
00:52:12,960 --> 00:52:16,640
back to fear, as an organism it
will constrain all of your

876
00:52:16,640 --> 00:52:18,360
processes.
So you will perceive things

877
00:52:18,360 --> 00:52:20,520
differently.
You will be attuned to different

878
00:52:20,520 --> 00:52:23,280
types of information in a
different way.

879
00:52:23,280 --> 00:52:26,000
All of your body and
processes are adjusted.

880
00:52:26,680 --> 00:52:29,400
So you can think about it as
like these global computations

881
00:52:29,400 --> 00:52:32,400
versus local computations.
And they're, they're all

882
00:52:32,400 --> 00:52:37,240
interacting.
So I would say I'm looking

883
00:52:37,240 --> 00:52:43,640
forward to to working in that
more global interactive space

884
00:52:43,640 --> 00:52:47,960
where the different researchers
are communicating because

885
00:52:47,960 --> 00:52:50,480
they're all in the same context,
you know, of the Organism in

886
00:52:50,480 --> 00:52:54,720
this particular affective states
and the affective states are,

887
00:52:55,440 --> 00:52:57,920
are inseparable.
They're not like this additional

888
00:52:57,920 --> 00:53:00,480
process, you know, sprinkled on
a cognitive process.

889
00:53:00,480 --> 00:53:05,400
It's just considering the
entire organism as a computation

890
00:53:05,400 --> 00:53:09,880
that has the agent as the
carrier of the computation.

891
00:53:10,400 --> 00:53:13,240
Yeah. And when you
think about this felt

892
00:53:13,240 --> 00:53:15,600
experience, I mean this feeling
of reality.

893
00:53:15,600 --> 00:53:18,640
And I mean, because it's easy to
think of someone thinking or

894
00:53:18,640 --> 00:53:21,360
having a thought, but feeling, I
mean, we're always doing it.

895
00:53:22,640 --> 00:53:24,640
We feel, we feel our way through
everything.

896
00:53:24,720 --> 00:53:28,200
Which other species do you think
if it was not a human brain

897
00:53:28,680 --> 00:53:31,800
that you were studying now, what
would be the next best brain

898
00:53:31,800 --> 00:53:34,000
you'd love to to sort of get
your hands on?

899
00:53:34,000 --> 00:53:38,080
You could possibly want to study
that you find intrigues

900
00:53:38,080 --> 00:53:42,920
you most, except for a human.
Yeah, I think there's actually

901
00:53:42,920 --> 00:53:46,960
more that we don't know.
I I read this book like, you

902
00:53:46,960 --> 00:53:51,680
know, many people read it, by
Ed Yong, An Immense World.

903
00:53:51,680 --> 00:53:57,880
It's mind-boggling and it's
even a little bit scary

904
00:53:58,480 --> 00:54:05,880
what you what you kind of find
in these like different other

905
00:54:05,880 --> 00:54:09,760
options of consciousness.
Even I read like this, this

906
00:54:09,760 --> 00:54:12,280
little article that, like,
some people really like

907
00:54:12,280 --> 00:54:14,240
cockroaches and they think
they're like way more

908
00:54:14,240 --> 00:54:17,120
intelligent and they can like
even look at you, you know.

909
00:54:18,440 --> 00:54:25,880
So I think we don't know a lot,
but it's like from what we know,

910
00:54:25,880 --> 00:54:29,040
I'm really curious about
actually the very social

911
00:54:29,160 --> 00:54:35,040
species, like elephants I think
are just like amazing and

912
00:54:35,040 --> 00:54:41,720
dolphins and, well, octopuses,
which we now also know a lot more about.

913
00:54:42,840 --> 00:54:46,720
So, yeah, all the things that
you can have like a relationship

914
00:54:46,720 --> 00:54:51,280
with and you feel there's
someone there with dogs,

915
00:54:51,280 --> 00:54:57,000
obviously.
So yeah, this would be my first

916
00:54:57,000 --> 00:54:58,640
choice.
It will be an elephant, I think.

917
00:54:59,200 --> 00:55:03,280
And I think that's, that's one
of the nice things about having

918
00:55:03,280 --> 00:55:07,240
feeling or affect as a
more core principle within this

919
00:55:07,240 --> 00:55:09,960
framework you're talking about
is that it sort of allows

920
00:55:09,960 --> 00:55:13,240
us to have more species on
the hierarchy, let's say

921
00:55:13,240 --> 00:55:16,480
evolutionarily, because often,
with anything without a cortex,

922
00:55:16,480 --> 00:55:18,480
we're sort of done with it.
But actually, when you think

923
00:55:18,480 --> 00:55:21,360
about it, a lot of species can
feel their way through reality

924
00:55:21,360 --> 00:55:23,880
without this, without a
prefrontal cortex, without any

925
00:55:23,880 --> 00:55:27,040
cortical aspects as well.
So even though we might not

926
00:55:27,040 --> 00:55:29,240
understand the way an octopus
works, anything without a

927
00:55:29,240 --> 00:55:33,440
cortex is still feeling
its way through reality.

928
00:55:33,440 --> 00:55:36,760
So we're giving more things
consciousness with that approach

929
00:55:36,760 --> 00:55:39,440
in essence.
Yeah.

930
00:55:39,840 --> 00:55:43,720
And I wouldn't necessarily
separate thoughts and feelings,

931
00:55:44,440 --> 00:55:47,160
everything.
When you have a thought, it's a

932
00:55:47,160 --> 00:55:51,520
felt state.
So it's a felt experience.

933
00:55:52,080 --> 00:55:58,320
So yeah, I think in this way,
the more I dived into affect,

934
00:55:58,760 --> 00:56:03,840
the more it seems to actually
encompass all of

935
00:56:03,840 --> 00:56:07,000
cognition.
So it became,

936
00:56:08,120 --> 00:56:11,760
you know, more and more kind of
inclusive of everything and,

937
00:56:11,760 --> 00:56:15,880
and became something else, not
another process, but that global

938
00:56:15,880 --> 00:56:19,680
process that I, I mentioned.
Daniela, if you had to

939
00:56:19,680 --> 00:56:23,320
have a sort of a Mount Rushmore
favorite neuroscientist, who

940
00:56:23,320 --> 00:56:24,840
would they be?
Who inspired you most?

941
00:56:24,840 --> 00:56:27,200
Or who do you recommend people
check out?

942
00:56:27,200 --> 00:56:34,200
Except for yourself, of course.
Oh, I had.

943
00:56:37,160 --> 00:56:39,320
Jeez, you caught me there.
I have to think about it.

944
00:56:39,560 --> 00:56:42,400
Oh, no pressure. Anyone who
inspires you, people you think

945
00:56:42,400 --> 00:56:44,280
about, people who maybe got you
into it.

946
00:56:45,320 --> 00:56:47,000
I know one of mine is Oliver
Sacks.

947
00:56:47,040 --> 00:56:49,320
He's someone I really love and
actually like.

948
00:56:49,880 --> 00:56:51,840
Probably one of the reasons why
I do this podcast.

949
00:56:51,840 --> 00:56:53,440
I don't know if you're familiar
with Oliver.

950
00:56:53,800 --> 00:56:58,560
I'm assuming you are.
Yeah, I mean, he's amazing.

951
00:56:59,040 --> 00:57:03,480
He is a person that sees
the humanity in people.

952
00:57:06,840 --> 00:57:10,880
It's just like I feel like I'm
inspired on a daily basis from,

953
00:57:10,880 --> 00:57:15,080
from many, many also, you know,
from physicists and

954
00:57:15,080 --> 00:57:18,120
mathematicians, so.
Brought in that let's make it

955
00:57:18,120 --> 00:57:21,720
who are the scientists or
philosophers who've inspired

956
00:57:21,720 --> 00:57:25,200
you or shaped your worldview to
a point where they they really

957
00:57:25,400 --> 00:57:26,760
shaped your career and who you
are?

958
00:57:32,680 --> 00:57:34,960
I I have to think about it.
It's fine.

959
00:57:36,520 --> 00:57:43,160
There are a great many and it's
kind of hard to choose, but it's

960
00:57:43,160 --> 00:57:46,600
like I feel I have an
answer for that, but I want

961
00:57:46,600 --> 00:57:48,280
to think about it like more
deeply.

962
00:57:48,680 --> 00:57:52,160
That's completely fine.
Any recommended reading,

963
00:57:52,160 --> 00:57:54,440
though, Daniela, do you think
that if someone's looking to

964
00:57:54,440 --> 00:57:56,760
get into this field, do you
think that you can think of any

965
00:57:56,760 --> 00:57:59,600
books that you'd recommend
people who want to fall in love

966
00:57:59,600 --> 00:58:02,400
with neuroscience, or just
fall in love with the

967
00:58:02,400 --> 00:58:05,480
mind in general.
Is that something you also have?

968
00:58:05,480 --> 00:58:09,080
Some thoughts?
Well, I think Oliver Sacks

969
00:58:09,080 --> 00:58:12,080
definitely will be top of the
list.

970
00:58:12,680 --> 00:58:21,720
And well, I really look up to
my postdoc mentor, Joseph

971
00:58:21,720 --> 00:58:24,480
LeDoux, who is like one of the
founding fathers of the

972
00:58:24,480 --> 00:58:29,560
emotional brain.
And yeah, I think, well, he

973
00:58:29,560 --> 00:58:33,720
would definitely be a person
that inspired me a lot because

974
00:58:34,240 --> 00:58:42,640
he's very poetic about how he
views the the brain and emotion.

975
00:58:42,640 --> 00:58:46,360
He keeps evolving.
He has an amazing way of

976
00:58:46,480 --> 00:58:49,280
expressing himself.
He's also a musician.

977
00:58:49,280 --> 00:58:54,880
So it's very pleasant to
read what he writes, almost like

978
00:58:54,880 --> 00:58:57,080
listening to music.
There's this like

979
00:58:57,080 --> 00:59:02,360
lightness, but extreme depth
and I think he did a lot.

980
00:59:02,360 --> 00:59:06,840
He's pretty much one of the, you
know, first few that

981
00:59:06,840 --> 00:59:11,800
started shaping the field of the
emotional brain and emotional

982
00:59:12,760 --> 00:59:16,360
neuroscience and brought us
where we are now.

983
00:59:17,640 --> 00:59:19,960
So yeah, I would say I would
recommend him.

984
00:59:19,960 --> 00:59:23,040
He's he has several books, very
recent ones.

985
00:59:23,040 --> 00:59:29,640
He's also working on a memoir.
So I think there's more to read.

986
00:59:29,640 --> 00:59:32,000
And now he's, he's really
dealing with consciousness.

987
00:59:32,000 --> 00:59:35,200
So for him also emotion led him
to to consciousness.

988
00:59:35,640 --> 00:59:38,960
So I would start with him and I
would say he's he's definitely

989
00:59:39,040 --> 00:59:41,160
one of the scientists that
inspired me most.

990
00:59:41,840 --> 00:59:44,120
Yes.
And Daniela, just to round off:

991
00:59:44,120 --> 00:59:48,400
what work should we look out
for from your lab in the

992
00:59:48,400 --> 00:59:50,840
future that's really exciting
you at the moment.

993
00:59:51,200 --> 00:59:53,720
And and then from there, we'll
slowly round off.

994
00:59:55,880 --> 00:59:59,080
I'm excited about where the
social space will take us in

995
00:59:59,080 --> 01:00:03,240
terms of the neural mechanisms,
if you can dig more and more to

996
01:00:03,240 --> 01:00:06,880
find neurons and how they encode
this navigation in abstract

997
01:00:06,880 --> 01:00:08,720
space.
So we're heading there.

998
01:00:10,200 --> 01:00:14,920
And also more into naturalistic
experiences of threat and

999
01:00:14,920 --> 01:00:19,440
fear and trauma.
And for that, I'm going to

1000
01:00:19,440 --> 01:00:24,120
use language models and more
sophisticated machine learning

1001
01:00:24,120 --> 01:00:28,840
based analysis to analyse
naturalistic behaviour,

1002
01:00:29,320 --> 01:00:33,720
especially related to fear.
Daniela, was there anything

1003
01:00:33,720 --> 01:00:37,640
about your work in general that
you feel you've always

1004
01:00:37,640 --> 01:00:40,280
wanted to talk about but never
got the chance to really express

1005
01:00:40,280 --> 01:00:42,240
and tell people about the
excitement, that perhaps you'd

1006
01:00:42,240 --> 01:00:45,840
like to? Or do you feel there's
something I might not have asked about

1007
01:00:46,120 --> 01:00:47,760
that's really cool and people
should know about.

1008
01:00:54,920 --> 01:00:58,760
Yeah, I think the
reconsolidation aspect, we

1009
01:00:58,760 --> 01:01:01,760
didn't mention the word,
but you know, modifying memories

1010
01:01:02,320 --> 01:01:04,720
and, and navigating social
space.

1011
01:01:05,360 --> 01:01:08,480
I think now I'm also starting
to merge them, to

1012
01:01:08,480 --> 01:01:12,920
see how the affect is
incorporated into the social

1013
01:01:12,920 --> 01:01:16,880
space or social behaviour is a
form of affective experience

1014
01:01:16,880 --> 01:01:18,960
that needs to be modified and
updated.

1015
01:01:19,480 --> 01:01:24,200
So it's, it's about identifying
these like core, core memories

1016
01:01:24,600 --> 01:01:28,680
and then finding a way to modify
them and then track the change.

1017
01:01:29,680 --> 01:01:33,520
And I think the Human Affectome,
which we published just

1018
01:01:33,520 --> 01:01:37,680
recently, I'm very excited about
that and want to see where

1019
01:01:37,680 --> 01:01:41,360
it will lead us and I hope it
will be useful for the field.

1020
01:01:42,200 --> 01:01:44,160
I'll definitely put a link to
that below as well.

1021
01:01:44,360 --> 01:01:47,280
Thank you so much, Danielle.
I really appreciate your time.

1022
01:01:48,400 --> 01:01:50,800
Before we end, is there anything
about the brain?

1023
01:01:51,000 --> 01:01:53,800
If you if you were to conclude
this, what is the one thing

1024
01:01:53,800 --> 01:01:56,720
you'd like to tell people about
the brain that they should know

1025
01:01:56,720 --> 01:02:01,600
and remember at the end of this
conversation, or the human

1026
01:02:01,600 --> 01:02:03,680
experience with your knowledge
in mind?

1027
01:02:06,080 --> 01:02:09,800
Yeah, I would say the most
interesting aspect that people

1028
01:02:09,800 --> 01:02:14,520
should be aware of is the
malleability of it, the

1029
01:02:14,520 --> 01:02:20,000
flexibility of it and the degree
of choice that we have.

1030
01:02:20,800 --> 01:02:26,640
So like breaking free from the
self defining memories and

1031
01:02:26,640 --> 01:02:32,800
emotional patterns and starting
to interact with them to find

1032
01:02:32,800 --> 01:02:36,240
who you are underneath.
Beautiful.

1033
01:02:36,240 --> 01:02:38,440
Thanks so much, Daniela.
I really appreciate your time.

1034
01:02:39,000 --> 01:02:41,240
This was an absolute pleasure
and you keep up the great work.

1035
01:02:41,240 --> 01:02:43,840
You guys are incredible and it's
always a pleasure to watch you

1036
01:02:43,840 --> 01:02:46,720
guys from the outside and see
the incredible work that's being

1037
01:02:46,720 --> 01:02:48,840
put out.
So keep it up and thank you for

1038
01:02:48,840 --> 01:02:50,720
my side.
Yeah, thank you.

1039
01:02:50,760 --> 01:02:52,280
Thank you.
It was a great conversation.