Published: January 6, 2023

When it comes to similarities between the brain and deep learning, what's really striking is that everything that was actually bio-inspired (e.g. sigmoid/tanh activations, spiking NNs, Hebbian learning, etc.) has been dropped, while...

...everything that has durably outperformed (backprop, ReLU, dropout, multi-head attention, MixUp, separable convs, BatchNorm, LayerNorm, and many others) makes no sense biologically and has basically been developed by trying a bunch of things and keeping what worked empirically
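To make the list above concrete, here is a minimal, framework-free sketch of three of the techniques the thread names (ReLU, inverted dropout, layer normalization). The function names and simplifications are mine; real implementations (e.g. in PyTorch) are vectorized and include learnable parameters that are omitted here.

```python
import math
import random

def relu(x):
    # Rectified linear unit: max(0, v) elementwise.
    return [max(0.0, v) for v in x]

def dropout(x, p, training=True, rng=random):
    # Inverted dropout: during training, zero each unit with probability p
    # and scale survivors by 1/(1-p) so the expected activation is unchanged.
    if not training or p == 0.0:
        return list(x)
    scale = 1.0 / (1.0 - p)
    return [v * scale if rng.random() >= p else 0.0 for v in x]

def layer_norm(x, eps=1e-5):
    # Normalize a feature vector to zero mean and unit variance
    # (without the learnable gain/bias of a full LayerNorm).
    mean = sum(x) / len(x)
    var = sum((v - mean) ** 2 for v in x) / len(x)
    return [(v - mean) / math.sqrt(var + eps) for v in x]
```

None of these operations has a clear biological counterpart, yet each is a workhorse of modern architectures, which is exactly the thread's point.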

There's an important lesson here, and it isn't just "modern deep learning has nothing in common with the brain and wasn't inspired by it"

Running tons of experiments while having very few priors about what the solution should look like is tremendously more effective than coming up with fancy theories about how the brain really works and repeatedly trying to prove those theories. Numenta also comes to mind here...

The second bitter lesson

@fchollet I've always thought that AI is turning out to create a "new intelligence," not "just" replicating human intelligence in a more efficient way, since our brain is not only complex but weird, with irrationalities, contradictions, etc.

@fchollet “2nd bitter lesson” Needs a better name, like “victory of the frequentists”
