Comments on Thinking Complete: How hard is implementing intelligence?
Blog author: Richard Ngo (http://www.blogger.com/profile/04825733481608403399)

Comment by Anonymous (https://www.blogger.com/profile/06541834786941509566), 2021-12-01:

Thanks for another interesting post.

You argue that because the brain is very robust to noise and trauma, there should be very few hyperparameters that need tweaking once we know the algorithm. I am not so convinced. It is hard to say which parameters are crucial to development and continued functioning and which aren't. For example, the balance between inhibition and excitation seems to be crucial, as does the correct functioning of a huge class of different neurotransmitters and receptors, not to mention critical periods for development.

On the hardware front: yes, the brain uses about as much energy as a lightbulb and our computers have far more FLOPS, but our largest deep learning models still have far fewer than 150 trillion parameters (mapping parameters to synapse count; the figure should be even higher if you count glial cells too), and they are nowhere near as parallelized or sparse, meaning there is a whole class of algorithms that researchers are disincentivized from testing.
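The parameter-count gap mentioned above can be made concrete with a rough back-of-envelope sketch. All figures below are assumptions for illustration (a ~150 trillion synapse estimate, a GPT-3-scale 175-billion-parameter model, and the commonly cited ~20 W brain power draw), not measurements:

```python
# Back-of-envelope comparison of brain scale vs. a large 2021-era model.
# All numbers are rough, commonly cited estimates, not precise measurements.
synapses_human_brain = 150e12      # ~150 trillion synapses (estimate from the comment)
params_large_model = 1.75e11       # GPT-3 scale: ~175 billion parameters
brain_power_watts = 20             # often-cited "lightbulb-scale" power draw

# If one synapse loosely corresponds to one parameter, how far off are we?
ratio = synapses_human_brain / params_large_model
print(f"Synapses per large-model parameter: ~{ratio:.0f}x")

# Parameters the brain "runs" per watt, under the same loose mapping.
params_per_watt = synapses_human_brain / brain_power_watts
print(f"Synapses per watt: ~{params_per_watt:.1e}")
```

Under these assumed figures the brain still holds roughly three orders of magnitude more "parameters" than the largest dense models of 2021, which is the gap the comment is pointing at.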