1 of 18

The Japanese Society of Neurology (JSN) COI Disclosure
Name of Lead Presenter: Naoya Arakawa

There are no companies or other organizations with a conflict of interest requiring disclosure in relation to this presentation.

(Form 4-D)

For the case where there is no conflict of interest requiring disclosure


2 of 18

Computational Implementation of Frontal Lobe Functions

ARAKAWA, Naoya

Symposium 02: Frontal Lobe Function in the Age of Artificial Intelligence (S-02-5)

2025-05-21

The 66th Annual Meeting of the Japanese Society of Neurology


3 of 18

Self-Introduction

ARAKAWA, Naoya

  • Background: software engineering and philosophy (Ph.D. in Philosophy of Language)
  • Current Interest: biologically plausible cognitive modeling
  • Affiliations:
    • The Whole Brain Architecture Initiative (WBAI): an NPO that supports research activities aiming to create human-like AI by mimicking the brain
    • The special interest group (SIG) for AGI (artificial general intelligence) of the Japanese Society for Artificial Intelligence.


4 of 18

Today’s Topic

Computational Implementation of Frontal Lobe Functions?

There are many functions…

Due to time limitations,

I’ll focus on

Fluid Intelligence


5 of 18

Recap: What was Fluid Intelligence?

Crystallized and Fluid Intelligence

Cattell (1963):

  • Crystallized Intelligence: ‘skilled judgment habits’
  • Fluid Intelligence: ‘adaptation to new situations, where crystallized skills are of no particular advantage.’

In other words:

  • Crystallized Intelligence: learned skills
  • Fluid Intelligence: ability to cope with situations beyond learned skills


6 of 18

Coping with Situations beyond Learned Skills?

What do you do when you encounter a situation where your learned skills don’t work?


You try!

  • You may try one of your learned or innate skills.
  • So ‘trying’, or ‘searching’, is the essence of Fluid Intelligence! (See the sketch below.)
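
A minimal sketch of this ‘trying’ loop in Python (my illustration, not a model from the talk; the skill list, the toy goal, and the works test are all hypothetical):

# A minimal sketch: 'trying' as search over a repertoire of skills.
def fluid_attempt(situation, skills, works):
    """Try learned/innate skills one by one until one copes with the situation."""
    for skill in skills:                 # search over the available repertoire
        outcome = skill(situation)       # try the skill
        if works(outcome):               # does it cope with the new situation?
            return skill, outcome        # success: remember what worked
    return None, None                    # nothing worked: a finer-grained search is needed

# Toy usage: reach 9 from 3 by trying simple, already-available operations.
skills = [lambda x: x + 1, lambda x: x * 2, lambda x: x ** 2]
found, result = fluid_attempt(3, skills, works=lambda y: y == 9)
print(result)                            # 9 (the 'square' skill happened to work)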


7 of 18

The ‘modern’ AI and Fluid Intelligence

  • The ‘modern’ AI (2010s onward) is basically Deep Learning, i.e., mechanisms that learn from data
  • Its skills are all learned!
  • So, it is basically all Crystallized Intelligence, i.e.,�no Fluid Intelligence!
  • Of course, we should note the word basically here.


8 of 18

Fluid AI?

  • Once upon a time (before the 2010s), AI (good old-fashioned AI, or GOFAI) hardly learned.
  • What it did was ‘search’.
  • So, GOFAI was all Fluid Intelligence and no Crystallized Intelligence!
  • But the same holds for the ‘modern’ AI:
    • Many ‘agentic’ deep learning systems incorporate Reinforcement Learning with search mechanisms, e.g., DQN (Deep RL, 2013), ChatGPT, … (see the sketch below)
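
A minimal sketch of that combination (my simplification, not DQN itself): a tabular action-value store stands in for the deep network (the crystallized part), and epsilon-greedy exploration supplies the search (the fluid part).

# Minimal sketch: learned values (crystallized) plus exploration (fluid search).
import random

Q = {}   # learned action values: (state, action) -> float; tabular stand-in for a deep net

def act(state, actions, epsilon=0.1):
    """Mostly exploit what has been learned, but keep searching with probability epsilon."""
    if random.random() < epsilon:
        return random.choice(actions)                              # fluid: try something new
    return max(actions, key=lambda a: Q.get((state, a), 0.0))      # crystallized: use learned values

def learn(state, action, reward, next_state, actions, alpha=0.5, gamma=0.9):
    """One Q-learning update: experience gained by searching is consolidated into a learned skill."""
    best_next = max((Q.get((next_state, a), 0.0) for a in actions), default=0.0)
    old = Q.get((state, action), 0.0)
    Q[(state, action)] = old + alpha * (reward + gamma * best_next - old)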


9 of 18

Fluid Intelligence, Dumb / Smart

  • The essence of Fluid Intelligence was search.
  • Are all searching systems smart?
  • Apparently not… e.g., blind search is not.
  • Fluid Intelligence tasks require more.
    • Raven’s Progressive Matrices
    • ARC by François Chollet (for AI)


10 of 18

Example: Raven’s Progressive Matrix


[Figure: an example Raven’s Progressive Matrices item (Type II); image CC BY-SA 3.0, Life of Riley @ Wikimedia]

  • Required functions
    • Finding relations
    • Remembering sequences (sequence memory)
    • Forming hypotheses (on relations)
    • Remembering successful and failed hypotheses
  • Constraint satisfaction via searching hypotheses (see the sketch below)
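
A toy sketch of that hypothesis search (my own illustration, not an actual RPM solver; cells are reduced to shape counts and the candidate relations are invented for the example):

# Toy sketch: constraint satisfaction by searching over relation hypotheses.
matrix = [[1, 2, 3],
          [2, 3, 4],
          [3, 4, None]]                             # the bottom-right cell is missing

hypotheses = {                                      # candidate row relations (illustrative)
    "constant": lambda a, b: b == a,
    "plus_one": lambda a, b: b == a + 1,
    "double":   lambda a, b: b == a * 2,
}

failed = []                                         # remembering failed hypotheses
complete_rows = [row for row in matrix if None not in row]
for name, rel in hypotheses.items():                # search over hypotheses
    if all(rel(row[i], row[i + 1]) for row in complete_rows for i in range(2)):
        answer = next(x for x in range(10)          # hypothesis satisfies the constraints:
                      if rel(matrix[2][1], x))      # use it to predict the missing cell
        print(name, "->", answer)                   # plus_one -> 5
        break
    failed.append(name)                             # keep track so it is not retried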


11 of 18

Quick Overview of Frontal Lobe Functions

  • General world model – prediction of the environment
  • Prospective memory (requires sequence memory)
  • Language generation
  • Metacognition
  • Executive functions / Goal management
    • Attention/Suppression
    • Motor execution
    • Problem solving – fluid if it goes beyond learned skills
      • Planning
      • Rule discovery (see the sketch after this list)
        • WCST
        • Visual analogy tasks – RPM, ARC, …
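
A minimal sketch of the rule-discovery item above (my illustration; matching to key cards is elided, only the hypothesized sorting dimension matters here):

# Minimal sketch of WCST-style rule discovery: win-stay / lose-shift over sorting rules.
DIMENSIONS = ["color", "shape", "number"]

def wcst_agent(trials):
    """trials: the hidden correct dimension per trial; the agent only gets right/wrong feedback."""
    hypothesis = 0                                    # index into DIMENSIONS
    n_correct = 0
    for true_dim in trials:
        if DIMENSIONS[hypothesis] == true_dim:        # examiner says 'correct'
            n_correct += 1                            # win-stay: keep the hypothesis
        else:                                         # 'incorrect'
            hypothesis = (hypothesis + 1) % len(DIMENSIONS)   # lose-shift: try the next rule
    return n_correct

# Toy run: the hidden rule is 'shape' for 5 trials, then switches to 'number' for 5 more.
print(wcst_agent(["shape"] * 5 + ["number"] * 5))     # 8 correct (one error per rule change)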


12 of 18

Where in the Frontal Lobe can we find subfunctions of Fluid Intelligence?

  • Finding relations
    • Imagining possible transformations (neocortex – BG-driven?)
    • Requires Working Memory for comparison (where is WM in the brain?)
  • Sequence memory for keeping track
    • with or without the Hippocampus?
  • Forming and keeping track of hypotheses
    • Predictive evaluation of hypotheses
    • Remembering correct/incorrect sequences


13 of 18

Neural Model of Fluid Intelligence?
or

Biologically plausible cognitive model?

  • Functional requirements
  • Structural requirements
    • Structurally similar to the Brain
    • Connectomic similarity


Modeling Fluid Intelligence while keeping the whole brain architecture in mind…


14 of 18

Q&As on Deep Learning

Q: Does PFC perform deep learning?

A: Yes, if we define deep learning as 'learning by hierarchical neural networks' and admit a hierarchy among PFC areas.

Q: Is deep learning sufficient to realize prefrontal-type general intelligence?

A: No. By definition, what deep learning achieves is crystallized and doesn't include fluid intelligence; we need extra architecture for fluid intelligence.

Q: Is the current AI ‘human-like’?

A: No, in that it requires huge amounts of computation and data compared to human child development (it does everything by learning). Yes, in that the LLMs in use are, in total, a collection of human writings.


15 of 18

Q&As on Self-Consciousness

Q: Does the sophistication of deep learning lead to the metacognition or self-consciousness that humans have?

A: I'd say No. A 'pure' deep learning system is a collection of rather passive (crystallized) prediction models. To get functions beyond that, we need extra structure.

Q: Will AGI/SGI become self-conscious beings?

A: Yes, if we define 'self-conscious' as 'the capability of an agent to monitor its internal workings' and implement such a mechanism in AGI.


16 of 18

Neuroscience and AGI

  • Does neuroscience help in developing AGI?
  • Ten years ago we believed so.
  • Now we are unsure (in the Tsunami of generative AI).
    • The current AI may realize AGI without the help of neuroscience…
  • Surely, neuroscience helps in developing human-like AI.
  • But human-like AI is dangerous (as humans are)!
  • Still, human-like AI can help us understand humans (e.g., pathological models).
  • Human-like AI, or biologically plausible cognitive models: structurally isomorphic, or ‘Brain-Morphic’


17 of 18

Advertisement

  • We at the Whole Brain Architecture Initiative are compiling a brain anatomy/connectome database to build human-like AI.
  • We invite you to contribute brain-morphic models of intelligence based on our connectome database!
  • https://sites.google.com/wba-initiative.org/braes/


18 of 18

Thank you for your attention!

Questions are welcome!
