A human brain requires less data to be trained on a given task than a neural net, or does it?

We often say humans are more efficient learners than current machines, because they can reach the same level of generalization with far fewer examples.

For example, a human takes 20-30 hours of driving lessons to become an acceptable driver. Self-driving cars require billions of frames of training data and are still not considered fully reliable.

Yes, BUT. By the age of 18, most humans have seen cars, ridden in them, and watched people drive. Neural networks (NNs) haven't. Humans are born with a sense of objectness, i.e. they recognize what makes an object an object and what its properties are; NNs are not.

The human brain is the result of 3B+ years of evolution. Should that count as training data? If so, are we still the more efficient learners?

On the other side, it's also worth considering the DNA bottleneck. Of the ~3B base pairs composing our DNA, less than 2% codes for proteins; at 2 bits per base pair, that's on the order of 8 megabytes. Most models nowadays are bigger than that.
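The back-of-envelope arithmetic behind that figure can be sketched like this; the exact megabyte count depends on which coding fraction you assume (estimates of protein-coding DNA usually range from 1% to 2%):

```python
base_pairs = 3_000_000_000   # ~3B base pairs in the human genome
bits_per_bp = 2              # 4 possible bases -> log2(4) = 2 bits each

genome_mb = base_pairs * bits_per_bp / 8 / 1e6   # whole genome in megabytes
print(f"whole genome: {genome_mb:.0f} MB")

# Protein-coding DNA is commonly estimated at 1-2% of the genome.
for coding_fraction in (0.01, 0.02):
    print(f"{coding_fraction:.0%} coding -> {genome_mb * coding_fraction:.1f} MB")
```

So the coding portion lands somewhere between ~7.5 MB and ~15 MB, which is where the "~8 megabytes" figure comes from.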

According to OpenAI's paper, GPT-3 has 175B parameters stored in float16. At 2 bytes per parameter, the weights alone take about 350 GB (roughly 326 GiB).
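The size estimate is a one-line calculation, shown here in both decimal gigabytes and binary gibibytes (the two conventions are often mixed up, which is likely where slightly different figures for GPT-3's size come from):

```python
params = 175_000_000_000   # GPT-3 parameter count
bytes_per_param = 2        # float16 = 16 bits = 2 bytes

total_bytes = params * bytes_per_param
print(f"{total_bytes / 1e9:.0f} GB")     # decimal: 350 GB
print(f"{total_bytes / 2**30:.0f} GiB")  # binary: ~326 GiB
```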

Now I'm confused: I don't know whether I should compare the brain to the Tesla V100 GPUs loading and running the model, to the model with its trained parameters, or to the model's empty architecture.

At least I know from the movie Lucy that a superhuman can fit on a 2014 USB key, which is definitely well under 350 GB.

#intelligence #agi #ai #artificialintelligence #humanvsai