We've been taught that our brains are made up of neurons, which transmit electrical signals among themselves. That's true, but the model of a neuron either firing or not firing has led us to think of them as binary switches, with the work of the brain depending on many cells together to decode those firings.
When talking about how neurons work, we usually end up with the sum-up-inputs-and-spit-out-spike idea. In this model, the dendrites are just a device for collecting inputs. Activating each input alone makes a small change to the neuron's voltage. Sum up enough of these small changes, from all across the dendrites, and the neuron will spit out a spike from its body, down its axon, to serve as an input to other neurons.
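The sum-and-threshold idea above can be sketched in a few lines of code. This is a minimal illustration, not anything from the article itself; the weights, threshold, and input values are made-up numbers chosen so the behavior is easy to see.

```python
# A toy "point neuron": sum weighted inputs, spike if the total
# crosses a threshold. All numbers here are invented for illustration.

def point_neuron(inputs, weights, threshold):
    """Return 1 (spike) if the weighted sum of inputs reaches threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0

weights = [0.4, 0.4, 0.4]
# One input alone nudges the voltage a little, but not enough to spike.
print(point_neuron([1, 0, 0], weights, threshold=1.0))  # 0
# Enough inputs together push the sum over threshold: the neuron fires.
print(point_neuron([1, 1, 1], weights, threshold=1.0))  # 1
```

Note that where each input lands on the dendrites makes no difference here; only the total matters. That locationless summing is exactly the assumption the article goes on to challenge.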
It’s a handy mental model for thinking about neurons. It forms the basis for all artificial neural networks. It’s wrong.
Those dendrites are not just bits of wire: they also have their own apparatus for making spikes. If enough inputs are activated in the same small bit of dendrite, then the sum of those simultaneous inputs will be bigger than the sum of each input acting alone.
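That superlinear summation can be sketched as well. Here each dendritic branch gets its own local threshold: if the inputs clustered on that branch cross it together, the branch fires a local "dendritic spike" and contributes more than the inputs would passively. The local threshold and spike boost are made-up numbers, a sketch of the idea rather than a biophysical model.

```python
# Toy dendritic branch: inputs clustered on the same branch can trigger
# a local dendritic spike, so together they count for more than the sum
# of each acting alone. Threshold and boost values are invented.

def branch_response(branch_inputs, local_threshold=1.0, spike_boost=3.0):
    """Sum inputs on one branch; add a boost if they jointly cross
    the branch's local spike threshold."""
    total = sum(branch_inputs)
    return total + spike_boost if total >= local_threshold else total

# Two 0.6 inputs on separate branches: each is below the local
# threshold, so the responses just add linearly.
apart = branch_response([0.6]) + branch_response([0.6])   # 1.2
# The same two inputs on ONE branch: a local dendritic spike fires,
# and the combined response is bigger than the sum of the parts.
together = branch_response([0.6, 0.6])                    # 4.2
print(apart, together)
```

The point the article makes is that each branch acts like a little computing unit of its own, not just a wire feeding the cell body.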
The image above shows a neuron on the left, and a flow chart of how it works on the right. The full explanation is much longer than I can summarize here, but it suggests why human brains are so much more powerful than any artificial intelligence we've come up with yet. Read the whole thing at Medium. -via Metafilter