For decades, neurons have been conceptualized as points in larger networks where inputs from other neurons are summed linearly to yield a simple binary outcome, such as a ‘0’ or a ‘1’. Over the same period, however, neuroscientists have grown increasingly dissatisfied with this overly simplistic description, since the function of a neuron is not a simple addition of its inputs. For example, synapses can form at many places on a neuron, which should influence how inputs are integrated. David Beniaguev at the Hebrew University of Jerusalem investigated the complexity of neurons’ input/output (I/O) mapping using recent advances in machine learning.
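To make the classic "point neuron" abstraction concrete, here is a minimal sketch of it: inputs are summed linearly with synaptic weights and passed through a hard threshold, producing a binary output. The weights and threshold values are illustrative assumptions, not taken from the article.

```python
def point_neuron(inputs, weights, threshold=1.0):
    """Return 1 if the weighted sum of inputs crosses the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: three presynaptic inputs with hypothetical synaptic weights.
print(point_neuron([1, 0, 1], [0.6, 0.9, 0.5]))  # combined drive 1.1 >= 1.0 -> 1
print(point_neuron([0, 1, 0], [0.6, 0.9, 0.5]))  # drive 0.9 < 1.0 -> 0
```

This is exactly the model the passage criticizes: it ignores where on the dendritic tree each synapse forms, treating all inputs as arriving at a single point.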
The article can be found here: