by Laura Bella Naumann, Henning Sprekeler
Hebbian plasticity, a mechanism believed to underlie learning and memory, detects and further enhances correlated neural activity. Because this constitutes an unstable positive feedback loop, it requires additional homeostatic control. Computational work suggests that in recurrent networks, the homeostatic mechanisms observed in experiments are too slow to compensate for the instabilities arising from Hebbian plasticity and need to be complemented by rapid compensatory processes. We suggest presynaptic inhibition as a candidate mechanism that rapidly provides stability by compensating the recurrent excitation induced by Hebbian changes. Presynaptic inhibition is mediated by presynaptic GABA receptors that effectively and reversibly attenuate transmitter release. Activation of these receptors can be triggered by excess network activity, providing a stabilising negative feedback loop that weakens recurrent interactions on sub-second timescales. We study the stabilising effect of presynaptic inhibition in recurrent networks, in which presynaptic inhibition is implemented as a multiplicative reduction of recurrent synaptic weights in response to increasing inhibitory activity. We show that networks with presynaptic inhibition display a gradual increase of firing rates with growing excitatory weights, in contrast to traditional excitatory-inhibitory networks. This alleviates the positive feedback loop between Hebbian plasticity and network activity and thereby allows homeostasis to act on timescales similar to those observed in experiments. Our results generalise to spiking networks with a biophysically more detailed implementation of the presynaptic inhibition mechanism. In conclusion, presynaptic inhibition provides a powerful compensatory mechanism that rapidly reduces effective recurrent interactions and thereby stabilises Hebbian learning.
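The core idea, a multiplicative reduction of recurrent excitatory weights driven by inhibitory activity, can be illustrated with a minimal two-population rate model. This is a sketch under illustrative assumptions, not the paper's implementation: all weights, the time step, and the release factor p = 1 / (1 + beta * r_I) are hypothetical choices made here for demonstration.

```python
def simulate_rates(w_ee, w_ei=1.0, w_ie=1.0, w_ii=0.5,
                   beta=0.5, inp=1.0, dt=0.01, steps=5000):
    """Two-population (E, I) rate model in which presynaptic inhibition
    scales recurrent excitation by a release factor p = 1/(1 + beta*r_i).
    All parameter values are illustrative assumptions."""
    r_e, r_i = 0.0, 0.0
    for _ in range(steps):
        # Presynaptic inhibition: effective release falls as the
        # inhibitory rate r_i rises, shrinking the recurrent gain p*w_ee.
        p = 1.0 / (1.0 + beta * r_i)
        # Rectified-linear rate dynamics, Euler-integrated.
        dr_e = -r_e + max(0.0, p * w_ee * r_e - w_ei * r_i + inp)
        dr_i = -r_i + max(0.0, w_ie * r_e - w_ii * r_i + inp)
        r_e += dt * dr_e
        r_i += dt * dr_i
    return r_e, r_i

# Steady-state excitatory rates for growing excitatory weights: the
# rates increase gradually rather than diverging.
rates = [simulate_rates(w_ee)[0] for w_ee in (0.5, 1.0, 1.5, 2.0)]
```

With these toy parameters the excitatory rate grows smoothly with w_ee, whereas the same network at fixed weights (p = 1) would become unstable at w_ee = 2, since the effective recurrent gain would exceed what the inhibitory loop can balance.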