There’s a trade-off between time and computing power. If you want it done quickly, you need more oomph - but if you’re ready to wait, you can get away with less.
It takes enormous compute and energy to train AI models to even a fraction of human intelligence.
But our biological compute engine has been training over billions of years of evolution, so it got away with far less intensive energy use by spreading the investment out over time.
And just as ML models, once trained, need far less compute to run, evolved biological compute works well without much compute at runtime - which is why we’re able to think without consuming millions of calories.
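The training-versus-inference asymmetry can be made concrete with a rough back-of-envelope sketch. It uses the common heuristics that training a transformer costs roughly 6·N·D FLOPs (N parameters, D training tokens) while generating one token costs roughly 2·N FLOPs; the specific model size and token count below are hypothetical, chosen only for illustration.

```python
# Back-of-envelope: one-time training cost vs. recurring inference cost.
# Heuristics: training ~ 6 * N * D FLOPs, inference ~ 2 * N FLOPs per token.
# N and D are hypothetical values for illustration, not from the original text.

N = 70e9    # model parameters (hypothetical ~70B-parameter model)
D = 1.4e12  # training tokens (hypothetical)

train_flops = 6 * N * D           # huge one-time cost, paid up front
infer_flops_per_token = 2 * N     # much smaller recurring cost at runtime

# Tokens you could generate for the price of training once:
tokens_equivalent = train_flops / infer_flops_per_token

print(f"Training:  {train_flops:.2e} FLOPs (one-time)")
print(f"Inference: {infer_flops_per_token:.2e} FLOPs per token")
print(f"Training cost ≈ generating {tokens_equivalent:.2e} tokens")
```

Under these heuristics the ratio simplifies to 3·D regardless of model size: the one-time training bill equals the cost of generating three times the training corpus. The analogy to evolution is that the expensive part happened long before runtime.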
This is all conjecture of course, but the analogies are quite fascinating.