How We Learn
The human brain is a fascinating organ, and we count on it to learn. This post reflects on my reading of “How We Learn” by Stanislas Dehaene.
There are two learning modes: in the active mode, we form and test our own hypotheses; in the receptive mode, we absorb what others transmit to us without personally verifying it. We need to balance the two.
For example, students must be attentive and trust their teachers’ knowledge, but they must also think critically about their own learning. Teachers, in turn, should encourage students’ curiosity to drive active engagement.
A teacher can be a schoolteacher, a parent, a trainer, a doctor, or a government. We want to support our students’ learning in both modes. We cannot filter information one way and label every disagreement as misinformation in order to dominate the learning path; that is what happened during the pandemic, and it divided people.
The four pillars that help us learn, and learn fast, are as follows:
- Attention: amplifying the information we focus on
- Active engagement: giving curiosity free rein, which encourages our brain to test new hypotheses
- Error feedback: comparing our predictions with reality and correcting our sense-making model
- Consolidation: integrating what we have learned until it becomes automatic
About AI
Many people wonder whether artificial intelligence (AI) will take over their jobs. If you are one of them, you might want to know what AI is and what AI is still missing as a learner before judging your job security.
What is AI – AI simulates human intelligence by learning. It uses artificial neural networks (ANNs), a subset of machine learning and the heart of deep learning algorithms, which train on data and improve the accuracy of their predictions as they learn.
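To make “trains on data” more concrete, here is a minimal sketch of a tiny artificial neural network written with plain NumPy. It is my own illustration, not an example from the book: the XOR task, network size, learning rate, and step count are all arbitrary assumptions chosen only to keep the code short. The network repeatedly predicts, compares its prediction with the correct answer, and adjusts its weights to shrink the error, which is the same error-feedback idea listed among the four pillars.

```python
# A tiny artificial neural network learning XOR, using only NumPy.
# All numbers here (layer sizes, learning rate, steps) are illustrative.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # training inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # correct answers (XOR)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)  # hidden-layer weights
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # output-layer weights
lr = 0.5                                       # learning rate

for step in range(10_000):
    # Forward pass: compute a prediction from the current weights.
    hidden = sigmoid(X @ W1 + b1)
    pred = sigmoid(hidden @ W2 + b2)

    # Error feedback: compare the prediction with reality.
    error = pred - y

    # Backward pass: nudge every weight to reduce the error (gradient descent).
    grad_out = error * pred * (1 - pred)
    grad_hid = (grad_out @ W2.T) * hidden * (1 - hidden)
    W2 -= lr * (hidden.T @ grad_out)
    b2 -= lr * grad_out.sum(axis=0)
    W1 -= lr * (X.T @ grad_hid)
    b1 -= lr * grad_hid.sum(axis=0)

print(np.round(pred, 2))  # after training, close to [[0], [1], [1], [0]]
```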
An AI example – When you purchase a book on the Amazon website, you will see a list of “Books You Might Like.” That list comes from an AI prediction learned from your past searches and book orders.
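Below is a toy sketch of the idea behind such recommendations: suggest books that often appear in the same orders as the books you already bought. The order data, the book titles, and the books_you_might_like helper are all made up for illustration; this simple co-purchase count is not Amazon’s actual system, which is far more sophisticated.

```python
# A made-up co-purchase table: each set is one past customer order.
from collections import Counter
from itertools import combinations

past_orders = [
    {"How We Learn", "Thinking, Fast and Slow"},
    {"How We Learn", "The Brain That Changes Itself"},
    {"Thinking, Fast and Slow", "The Brain That Changes Itself"},
    {"How We Learn", "Thinking, Fast and Slow", "Deep Work"},
]

# Count how often each pair of books shows up in the same order.
co_bought = Counter()
for order in past_orders:
    for a, b in combinations(sorted(order), 2):
        co_bought[(a, b)] += 1
        co_bought[(b, a)] += 1

def books_you_might_like(book, top_n=3):
    """Suggest the books most often bought together with `book`."""
    scores = Counter({other: n for (b, other), n in co_bought.items() if b == book})
    return [title for title, _ in scores.most_common(top_n)]

print(books_you_might_like("How We Learn"))
# ['Thinking, Fast and Slow', 'The Brain That Changes Itself', 'Deep Work']
```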
What is AI missing – The book “How We Learn” explains this well in Chapter 2, “Why Our Brain Learns Better Than Current Machines.” A summary is as follows –
- Learning abstract concepts
Humans call upon general powers of reasoning and abstraction to question their beliefs.
Artificial neural networks neglect an essential point: human learning is not just the setting of a pattern-recognition filter but the forming of an abstract model of the world.
- Data-efficient learning
Human learning makes the most of the least amount of data; it is data efficient. Machines are data-hungry.
- Social learning
Humans learn a great deal from their fellow humans through language. This ability remains beyond the reach of current neural networks.
- One-trial learning
Humans can integrate new information into an existing network of knowledge, often after a single exposure; to learn is to succeed in inserting new knowledge into an existing network. Current neural networks cannot do this.
- Systematicity and the language of thought
Our brain can fluidly conceive formulas in a kind of mental language and can represent sets of symbols that combine according to a complex, tree-like (arborescent) syntax. Current neural networks remain largely unable to represent this range of abstract phrases, formulas, rules, and theories.
- Composition
Our brain’s ability to compose previously learned skills, that is, to recombine them to solve new problems, remains beyond the reach of current neural models.