Machine Learning

Neuromorphic computers: spiking architecture

Information:

Synopsis

Spiking networks, like the brain, have very low power density and operate at very low frequency. In contrast with conventional deep-learning hardware, computing is done in memory: the synapse stores the weight and performs the learning in place.

Learning takes place in the synapse through spike-timing-dependent plasticity (STDP), where the timing between the spikes drives the weight change. The synapse is connected to a pre-synaptic neuron and a post-synaptic neuron, and the input spikes from both neurons affect the weight of the synapse. The delay between the spikes determines how the synapse learns: the synapse is depressed when the pre-to-post voltage (and the resulting current from the pre-synaptic to the post-synaptic neuron) is negative, and excited when the pre-to-post voltage is positive.

A feedforward network creates a perceptron: multiple synapses connect to a single post-synaptic neuron. The perceptron is self-learning; the artificial neurons learn like biological neurons, and an artificial neuron can memorize objects.

Recurrent networks represent feedback systems. In a recurrent network all neurons talk to each other, and the network also has an inhibitor.
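
The timing-dependent update described above can be written as a simple pair-based rule. The sketch below is only illustrative: the learning rates A_PLUS and A_MINUS, the time constant TAU, and the exponential form are the standard pair-based STDP model assumed here, not parameters or equations taken from the synopsis.

    # Minimal sketch of pair-based STDP (all parameter values are assumptions).
    # The sign of the timing difference between the pre- and post-synaptic
    # spikes decides whether the weight is potentiated (excited) or depressed.
    import math

    A_PLUS = 0.01    # assumed learning rate for potentiation
    A_MINUS = 0.012  # assumed learning rate for depression
    TAU = 20.0       # assumed time constant in milliseconds

    def stdp_delta_w(t_pre, t_post):
        """Weight change for one pre/post spike pair, times in milliseconds."""
        dt = t_post - t_pre
        if dt > 0:
            # pre fires before post: positive pre-to-post pairing -> excitation
            return A_PLUS * math.exp(-dt / TAU)
        else:
            # post fires before pre: negative pairing -> depression
            return -A_MINUS * math.exp(dt / TAU)

    # Example: a pre spike at 10 ms followed by a post spike at 15 ms potentiates.
    print(stdp_delta_w(10.0, 15.0))   # small positive weight change
    print(stdp_delta_w(15.0, 10.0))   # small negative weight change (depression)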
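
The feedforward perceptron can be sketched in the same spirit: several synapses converge on one post-synaptic neuron, and a simplified coincidence-based rule (potentiate inputs that spike when the neuron fires, depress the others) lets the neuron memorize a pattern. The threshold, learning rates, and the pattern itself are assumed values for illustration, not details from the synopsis.

    # Minimal sketch of a self-learning spiking perceptron: many input synapses
    # converge on one post-synaptic neuron. All numbers here are assumptions.
    N_INPUTS = 8
    weights = [0.5] * N_INPUTS          # one weight per synapse
    THRESHOLD = 2.0                     # assumed firing threshold of the post neuron

    def step(input_spikes):
        """One time step: integrate weighted input spikes and fire above threshold."""
        potential = sum(w for w, s in zip(weights, input_spikes) if s)
        return potential >= THRESHOLD

    def learn(pattern, epochs=50):
        """Repeatedly present a binary spike pattern; STDP-like updates memorize it."""
        for _ in range(epochs):
            fired = step(pattern)
            for i, s in enumerate(pattern):
                if s and fired:
                    weights[i] = min(1.0, weights[i] + 0.05)   # pre and post spike together: potentiate
                elif not s and fired:
                    weights[i] = max(0.0, weights[i] - 0.05)   # post fired without this input: depress

    pattern = [1, 1, 0, 1, 0, 0, 1, 0]
    learn(pattern)
    print(step(pattern))                      # the memorized pattern now makes the neuron fire
    print(step([0, 0, 1, 0, 1, 1, 0, 1]))     # a different pattern does not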
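
Finally, a rough sketch of the recurrent (feedback) network: every neuron connects to every other neuron, and one unit acts as an inhibitor by sending negative weights back to the rest. The network size, weights, and update rule are assumptions chosen only to show the all-to-all connectivity with inhibitory feedback.

    # Minimal sketch of a recurrent network with a single inhibitory unit.
    N = 4                                   # excitatory neurons (assumed)
    INHIB = N                               # index of the inhibitory unit

    # all-to-all recurrent weights; W[j][i] is the weight from neuron j to neuron i
    W = [[0.0 if i == j else 0.3 for i in range(N + 1)] for j in range(N + 1)]

    # the inhibitor receives excitation from everyone and feeds back negative weights
    for i in range(N):
        W[i][INHIB] = 0.5                   # excitatory neuron -> inhibitor
        W[INHIB][i] = -0.8                  # inhibitor -> excitatory neuron
    W[INHIB][INHIB] = 0.0

    def update(state):
        """One synchronous step: each neuron sums its recurrent input and thresholds it."""
        return [1 if sum(W[j][i] * state[j] for j in range(N + 1)) > 0.5 else 0
                for i in range(N + 1)]

    state = [1, 0, 0, 1, 0]                 # an initial activity pattern (assumed)
    for _ in range(3):
        state = update(state)
        print(state)                        # inhibition damps the recurrent activity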