How much innate knowledge can the genome encode?
In a recent debate between Gary Marcus and
Yoshua Bengio about the future of Artificial Intelligence, the question came up
of how much information the genome can encode. This relates to the idea of how
much innate or prior “knowledge” human beings are really born with, versus what
we learn through experience. This is a hot topic in AI these days, as people
debate how much prior knowledge needs to be pre-wired into AI systems in order
to get them to achieve something more akin to natural intelligence.
Bengio (like Yann LeCun) argues for putting as little prior knowledge into the system as we can get away with – mainly in the form of meta-learning rules, rather than specific details about specific things in the environment – so that the system that emerges through deep learning from the data supplied to it will be maximally capable of generalisation. (In his view, more detailed priors give a more specialised but also more limited and possibly more biased machine.) Marcus argues for more…