Science

Out of context: Reply #712

  • monospaced

    Meanwhile ... the U.S. government is throwing $100 million at figuring out how to reverse-engineer the brain.

    http://www.scientificamerican.co…

    • we don't need to copy it to make something that works. the human brain has its biological limitations, it's all fluids etc., which we don't need to carry over to AI (drgs)
    • more importantly, there might not be any algorithm in the brain to be found in the first place (see previous post) (monospaced)
    • for simple propagation networks we already have something that outperforms human vision, and it's based on copying the brain's abstraction by layers (rough sketch of the layered idea after this list) (drgs)
    • https://upload.wikim…
      vs
      http://www.cs.uwyo.e…
      (drgs)
    • yeah, recognition systems require tons of input to store and process, and they're getting better, totally understood. (monospaced)
    • the argument is that the brain doesn't necessarily do any of that (see previous post from scarabin). all quite interesting (monospaced)
    • I think this is all so fascinating, and I appreciate the discussion. (monospaced)
    • the difference currently is that AI learning is supervised (or "labeled", i.e. you have a sample input and a corresponding target), and you learn the association (drgs)
    • technically it means that in a NN you try to predict a target from the input, then compute the error versus the real target, and propagate the error backward, top to bottom (see the backprop sketch after this list) (drgs)
    • the human brain learns the other way around, in unsupervised fashion, from bottom to top, i.e. you start by recognizing some basic patterns on lower layers (drgs)
    • the first couple of years of a baby's life are basically unsupervised learning: they can recognize and separate things from each other, but don't know what they are (see the clustering sketch after this list) (drgs)
    • in fact, if you look around, 90% of the crap around you is stuff you've sort of seen before and can sort of describe with language, but you don't know what it is (drgs)
    • AI systems skip this "dark" knowledge and are only trained to recognize a limited number of very specific things (drgs)
    • later in life, babies only need to label (learn language for) the things they are already familiar with, and then you can build up abstraction etc. (drgs)
    • lack of abstraction = autism (drgs)
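
A rough sketch of the "abstraction by layers" idea from the third bullet, assuming a PyTorch-style stack of convolutional layers where each layer builds features on top of the previous layer's output. The layer sizes and the 10 output categories are arbitrary choices for illustration, not the network from the linked comparison.

```python
import torch
import torch.nn as nn

# a small stack of convolutional layers: each layer builds its features on
# top of the previous layer's output -- the "abstraction by layers" idea
layers = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),   # low level: edges, blobs
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),  # mid level: simple parts
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(32, 64, kernel_size=3, padding=1),  # higher level: object-ish patterns
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(64, 10),                            # final guess over 10 made-up categories
)

x = torch.randn(1, 3, 64, 64)  # one fake RGB image
print(layers(x).shape)         # torch.Size([1, 10])
```

Running it on a random fake image only confirms the shapes line up; the point is the shape of the stack, low-level filters feeding progressively more abstract ones.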
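
A minimal sketch of the supervised setup described above: predict a target from the input, measure the error against the real target, and push it backward through the layers. Everything here (the XOR-style toy data, the layer sizes, the learning rate) is made up for illustration; it is just plain NumPy backpropagation, not anything from the linked articles.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy "labeled" data: inputs x with known targets y (an XOR-style problem)
x = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# one hidden layer of 4 units, weights initialised small and random
w1 = rng.normal(scale=0.5, size=(2, 4))
w2 = rng.normal(scale=0.5, size=(4, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    # forward pass: predict a target from the input
    h = sigmoid(x @ w1)
    pred = sigmoid(h @ w2)

    # error versus the real target
    err = pred - y

    # backward pass: propagate the error top to bottom and nudge the weights
    d_pred = err * pred * (1 - pred)
    d_h = (d_pred @ w2.T) * h * (1 - h)
    w2 -= 1.0 * (h.T @ d_pred)
    w1 -= 1.0 * (x.T @ d_h)

print(np.round(pred, 2))  # predictions should drift toward the targets [0, 1, 1, 0]
```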
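
And a toy sketch of the bottom-up contrast: first separate things from each other with no labels at all, then attach a name to each group afterwards, the way the baby analogy describes. The clustering method (a bare-bones k-means) and the three 2-D blobs are assumptions of mine, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# unlabeled "sensory" data: three blobs nobody has named yet
data = np.vstack([
    rng.normal(loc=[0, 0], scale=0.3, size=(50, 2)),
    rng.normal(loc=[3, 0], scale=0.3, size=(50, 2)),
    rng.normal(loc=[0, 3], scale=0.3, size=(50, 2)),
])

# bare-bones k-means: separate the data into k groups, no labels involved
# (simplified init: one starting point taken from each blob)
k = 3
centers = data[[0, 50, 100]].copy()
for _ in range(20):
    dists = np.linalg.norm(data[:, None] - centers[None], axis=2)
    assign = dists.argmin(axis=1)
    centers = np.array([data[assign == i].mean(axis=0) for i in range(k)])

# "labeling" comes later and is cheap: one named example per already-formed group
# (the names and example points are hypothetical, just to show the idea)
named_examples = {"cup": [0.1, 0.1], "ball": [3.1, -0.1], "box": [0.2, 2.9]}
cluster_name = {}
for name, point in named_examples.items():
    d = np.linalg.norm(centers - np.array(point), axis=1)
    cluster_name[int(d.argmin())] = name

print({i: cluster_name.get(i, "?") for i in range(k)})  # each cluster picks up a name
```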
