Mechanistic explanation of neuroplasticity using equivalent circuits

This paper introduces a comprehensive mechanistic model of a neuron with plasticity that explains the biophysical correlates of memory for a single neuron. This means clarifying how information input as time-varying signals is processed, stored, and subsequently recalled. Moreover, the model addresses two additional, long-standing, specific biological problems: the integration of Hebbian and homeostatic plasticity, and the identification of a concise learning rule for synapses. In this study, a biologically accurate Hodgkin-Huxley-style electric-circuit equivalent is derived through a one-to-one mapping from the known properties of ion channels. The dynamics of the synaptic cleft, which is often overlooked, is found to be essential in this process. Analysis of the model reveals a simple and succinct learning rule, indicating that the neuron functions as an internal-feedback adaptive filter, which is commonly used in signal processing. Simulation results confirm the circuit's functionality, stability, and convergence, demonstrating that even a single neuron without external feedback can function as a potent signal processor. The article is interdisciplinary and spans a broad range of subjects within the realm of biophysics, including neurobiology, electronics, and signal processing.

Significance statement

Mechanistic neuron models with plasticity are crucial for understanding the complexities of the brain and the processes behind learning and memory. These models provide a way to study how individual neurons and synapses in the brain change over time in response to stimuli, allowing for a more nuanced understanding of neuronal circuits and assemblies. Plasticity is a key aspect of these models, as it represents the ability of the brain to modify its connections and functions in response to experiences. By incorporating plasticity into these models, researchers can explore how changes at the synaptic level contribute to higher-level changes in behavior and cognition. Thus, these models are essential for advancing our understanding of the brain and its functions.

PhySH 2023 terms: Neuroplasticity, Learning, Memory, Synapses.
MeSH 2023 terms: Neuronal Plasticity [G11.561.638], Association learning [F02.463.425.069.296], Memory [F02.463.425.540], Synaptic transmission [G02.111.820.850]


Introduction
How does the brain remember? This classical question has recently received considerable attention focusing on the central nervous system's handling of coordinated synaptic changes, mandating a cell-wide coherent explanation of multisynaptic plasticity. However, such explanations have encountered difficulties despite massive research efforts by experimental and theoretical methods. This paper suggests that these problems can be resolved by mapping the neuron to an equivalent circuit and then showing that this circuit implements an adaptive filter.
Experimental methods typically investigate neurons' responses to stimuli and various biological manipulations such as ion channel blocking and genetic modifications. Since the first discovery of synaptic plasticity [5], experiments have revealed a diversity of overlapping and interacting plasticity mechanisms [26,46]. A limitation of experiments in vitro is that crucial parameters such as temperature, membrane potential, and calcium concentration often transcend their physiological ranges. On the other hand, experiments in vivo are degraded by external disturbances, such as irrelevant signals from connected neurons.
The theoretical approach is to study how plasticity should work. In this case, the principal obstacle is to find biologically plausible mechanisms that match the theory. A significant theoretical contribution showed that classical Hebbian plasticity alone leads to the saturation of synaptic weights and the ensuing loss of information [38,3]. Subsequent experiments demonstrated the existence of additional, homeostatic mechanisms that prevent distortion and stabilize synaptic plasticity [50,49]. Here, "homeostasis" refers to the neuron's retaining a stable internal environment despite changes in external conditions.
Because of the challenges posed by the diversity of plasticity mechanisms and the scarcity of biologically plausible models, the timing and integration of homeostatic and Hebbian plasticity is an open issue [23]. Therefore, a novel approach is chosen here, modeling a neuron as an electric-circuit equivalent in the spirit of Hodgkin and Huxley's seminal axon model [20] while strictly adhering to known properties of neuronal ion channels to ensure biological veracity. The resulting circuit can be interpreted mechanistically as a modified Least Mean Square (LMS) adaptive filter, a versatile device well-known in the field of signal processing [19,18]. This interpretation takes advantage of the rich theory developed for adaptive filters. It explains precisely and quantitatively how the neuron modifies its synapses in orchestration to store time-variable functions or signals as required by procedural memory.
That the proposed model is mechanistic is essential here, because mechanistic models are superior to empirical or phenomenological models in providing a more comprehensive and detailed understanding of the underlying processes and mechanisms that give rise to the observed phenomena [8].

Organization of this paper
The paper's main topic is a derivation of the equivalent circuit and adaptive filter model from established knowledge about neuronal ion channels. For a mechanistic model, it is imperative to select an appropriate level of description that is adequately detailed yet not overly complex to provide a functional explanation and address the three specific problems under consideration. To achieve this, the paper first reviews the established function of inhibitory and excitatory synaptic ion channels to a level that allows for a direct translation into an electric network. By this conversion, insights from a century of experience with electronic circuits can be leveraged, along with the ability to identify circuit patterns or "motifs." The approach is conservative in not assuming the existence of as-yet-undiscovered biological mechanisms.
The paper's main conclusion is that a single neuron can be abstractly characterized as an adaptive filter, a powerful and fundamental component in signal processing. The basic principles of adaptive filters are, therefore, briefly reviewed. An adaptive filter's function, in its fullest generality, is to determine how a reference input is expressible in terms of a given set of input components.
Two experiments are performed to further support the claim that the neuron operates as an adaptive filter. The first experiment demonstrates the circuit's ability to approximate an inhibitory signal y(t) by appropriately weighting excitatory inputs x_k(t). The second experiment confirms that action potentials function as clock pulses ("strobes"), triggering synaptic weight changes.
The results section presents the convergence and stability outcomes of the experiments diagrammatically, followed by an explanation of how the model, in its adaptive filter capacity, addresses the three specific issues concerning memory, Hebbian-homeostatic plasticity, and the synaptic learning rule. Subsequently, the discussion section introduces related work and explores some implications of viewing the neuron as an adaptive filter.
The investigation spans a time frame from milliseconds to minutes, encompassing short-term plasticity (STP) and early long-term plasticity (LTP) while excluding late LTP due to its reliance on nuclear processes and its consolidating function.
The neuron model introduced here lays the groundwork for a more complex mechanistic model that examines neuron populations and their coding mechanisms. However, creating a model encompassing large networks of neurons requires sophisticated signal processing techniques, such as wavelet decomposition and the concept of sparsity. These aspects are beyond the scope of the current paper but are detailed in a separate study [34]. To summarize that study briefly, it demonstrates how populations of neurons conforming to the adaptive-filter model discussed here can effectively transmit, process, and store information. This is achieved through an invariance property, which can be geometrically characterized as a convex cone. The adaptive filtering characteristics of these neurons enable them to perform signal processing tasks compactly and efficiently. An algebra of convex cones can abstractly describe these operations, providing the populations with a robust computational framework akin to a "programming language" for neurons.
In summary, this article models a neuron's primary biochemical information processing pathways as equivalent electric circuits, reviews the adaptive filter concept, and employs it to describe the neuron's overall function. The model's adequacy is demonstrated through two simulation experiments, substantiating the neuron's capacity to operate as an adaptive filter. These results support the proposed model's validity and potential for advancing research in this field, demonstrated by its application as a foundation for a mechanistic model of neuron populations [34].

Overall structure of a neuron
This subsection provides a detailed description of the structure and function of a neuron, highlighting its key components, synaptic types, and their roles in signal transmission and plasticity.
The target neuron is a generic, glutamatergic neuron equipped with AMPA (α-amino-3-hydroxy-5-methyl-4-isoxazolepropionic acid) receptors (AMPARs) and NMDA (N-methyl-D-aspartate) receptors (NMDARs). This kind of neuron has been extensively studied and is representative of a substantial fraction of neurons in the central nervous system (CNS) [48], typical examples of which are the hippocampal neurons where plasticity was first demonstrated [5]. The primary components of a neuron include the dendrites, which receive inputs from presynaptic neurons; the soma, which aggregates the contributions from dendrites; and the axon, which transmits the result to other neurons (fig. 1). Axons can branch into axon collaterals carrying identical signals. Synapses, the contact points between axons and dendrites, are of two types: inhibitory and excitatory. They convert incoming stochastically rate-coded sequences of action potentials (APs), or more tersely, PFM (pulse-frequency modulated) spiketrains [35], into postsynaptic currents that alter the membrane potential, the voltage difference between the neuron's interior and exterior. At the axon initial segment (AIS), this potential is converted back into a spiketrain for output via the axon.
Ultrastructural studies reveal that inhibitory synapses are typically situated proximally to the soma, directly on a dendrite or the soma itself. In contrast, excitatory synapses tend to be positioned more distally, connecting with dendrites via spines, small protrusions on the dendrites. These spines are associated with plasticity, indicating that excitatory synapses are generally plastic, whereas inhibitory synapses are non-plastic. These principles are occasionally referred to as Gray's rules [15,37,48,17]. This subsection focuses on the functioning of an inhibitory synapse in response to an action potential, including the involvement of neurotransmitters and receptors. The subsection presents the corresponding equivalent electric-circuit model that reflects this process, inspired by Hodgkin and Huxley's axon model [20].

Inhibitory synapse mapping
The model adopts Hodgkin and Huxley's view of gated ion channels as voltage-controlled conductances (fig. 3A). Because of their similarity to ideal field-effect transistors (FETs), the schematic uses modern transistor symbols (fig. 3B, C).
When action potentials reach the axon terminal (fig. 2), the membrane potential depolarizes (increases), causing voltage-gated calcium channels (Ca_V) to open (1). The calcium ion influx triggers the release of the neurotransmitter γ-aminobutyric acid (GABA) from nearby vesicles (2) into the synaptic cleft. GABA binds to GABA type A receptors (GABAAR) on the postsynaptic neuron, opening the receptor channel to chloride ions (3) [43]. These ions are negatively charged and hyperpolarize (reduce) the membrane potential. A direct translation of these biological processes into a circuit equivalent for the inhibitory synapse is shown in fig. 4.

Fig. 3: Gated ion channels as transistors, involving the two gating inputs G1 and G2. For the NMDAR, these are the glutamate concentration and the postsynaptic membrane potential, respectively, whereas for the AMPAR, they are the glutamate concentration and the number of AMPARs.
The GABAAR is defined by the equation I_DS = γ V_DS V_G, where the constant γ = γ_GABAAR is the transistor's gain, and I_DS and V_DS are the channel current and voltage, respectively. V_G is the gate voltage representing the GABA concentration.
A single transistor is chosen to represent the entire population of GABAARs at one synapse. Overall, the circuit inverts an incoming train of positive voltage pulses to negative current pulses and filters them through a lowpass filter before integrating them into the membrane potential.
The resistor R_z represents the transport processes that circulate the chloride back out of the cell. The signal is filtered on its way to the soma by a lowpass filter R_i C_i composed of the spino-dendritic axial resistance R_i and capacitance C_i. The series capacitance C_h compactly represents the homeostatic machinery that maintains the neuron's mean internal potential at a biologically comfortable level. The inhibitory synapse's output into the dendrite or soma is a negative current pulse, the inhibitory postsynaptic current (IPSC): because the driving potential of chloride ions is negative, a positive pulse input at AP leads to a negative current pulse at IPSC.

Excitatory synapse mapping
This subsection examines the functioning of an excitatory synapse, including the roles of calcium ions, glutamate, AMPARs, NMDARs, synaptic plasticity, and the translation of these biological processes into an equivalent electric-circuit model. The function of an excitatory synapse (fig. 5) is similar to that of an inhibitory synapse, but the plasticity associated with spines adds complexity to the model. After lowpass filtering, excitatory input pulses increase the postsynaptic membrane potential, and the synapse's gain is modified depending on the input's magnitude and the current membrane potential.
In more detail, the arriving action potential enables calcium ions to enter the presynaptic terminal (1) and trigger the release of the neurotransmitter glutamate (2). Glutamate binds to AMPARs on the postsynaptic neuron, opening the channels to positively charged sodium ions (3), depolarizing the membrane potential. In addition, glutamate affects NMDA receptors involved in the neuron's plasticity. The NMDA receptor is distinguished by its gating mechanism [21,48]. While the binding of glutamate is essential, it alone is insufficient to open the channel. A magnesium ion is a gatekeeper that blocks the channel in a graded relation to the neuron's membrane potential [36,30]. Depolarization of the neuron removes this magnesium block (4), enabling calcium to flow through the NMDA receptor channel (5). This calcium influx regulates the number of AMPA receptors constituting the synaptic weight through a cascade of downstream reactions [37,21].
Fig. 6 shows an electric-circuit equivalent for the excitatory synapse. Again, this is a direct translation of the biochemical processes of the excitatory synapse in fig. 5. A voltage pulse (1), representing the presynaptic action potential and corresponding glutamate release (2), gates injection of a positive current through the AMPAR. This current (3) undergoes lowpass filtering en route to the soma with a cutoff frequency of f_c = 1/(2π R_e C_e), which may vary substantially across different synapses within the same neuron. The culmination of this process is an excitatory postsynaptic current (EPSC) that is integrated with other EPSCs and inhibitory postsynaptic currents (IPSCs) by the membrane capacitance C_m of the soma and proximal dendrites. The ensuing (AC, alternating current) membrane potential v_m is electrotonically propagated throughout the cell (4).
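For concreteness, the cutoff relation can be evaluated numerically. The component values below are illustrative assumptions for one synapse, not the values of table 1:

```python
import math

# Illustrative (assumed) component values for one excitatory synapse's
# dendritic lowpass filter; actual values vary across synapses.
R_e = 100e6    # spino-dendritic axial resistance, 100 MOhm (assumption)
C_e = 50e-12   # lumped dendritic capacitance, 50 pF (assumption)

# First-order RC lowpass cutoff, f_c = 1/(2*pi*R_e*C_e)
f_c = 1.0 / (2.0 * math.pi * R_e * C_e)
print(f"{f_c:.1f} Hz")  # about 32 Hz for these values
```

Varying R_e or C_e by a factor of a few moves the cutoff over the 1-100 Hz range, which is the sense in which the cutoff "may vary substantially across different synapses."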
The calcium concentration in the synaptic cleft is depleted when a presynaptic action pulse arrives, because the pulse causes calcium channels to open, consuming some of the synaptic cleft's calcium content [6,7]. This activity-dependent reduction of [Ca2+]_e in the synaptic cleft reduces the driving force for calcium entry through the NMDAR, which plays a crucial role in the mechanism underlying synaptic enhancement [4].
The NMDAR, modeled here as a dual-gate transistor, senses the membrane potential with one gate, whereas the other ("strobe") recognizes glutamate activation, enabling synaptic weight modification. The cleft acts as a calcium buffer and lowpass filters the calcium-encoded signal with the time constant R_s C_s.

The voltage spanning the capacitor C_w corresponds to the synaptic weight, defined by the number of AMPARs, and hence is invariably non-negative. A diode in parallel with C_w ensures this property.

The NMDAR's critical function is to act as a multiplier of the synaptic input and the feedback, represented by the external calcium concentration [Ca2+]_e and the membrane potential v_m, respectively. C_h1, C_h2, and R_h represent homeostasis of the calcium pathway.
The complete equivalent circuit for the neuron can be formed by combining the circuits illustrated in fig. 4 and fig. 6. However, before proceeding to show that the complete circuit implements an adaptive filter, the next subsection offers a brief review of such filters.

Internal structure and operation of an adaptive filter
This subsection concisely reviews the fundamental adaptive filter (fig. 7), which can be thought of as a procedure or algorithm. Its principal function is to find weights w_1, w_2, ..., w_n such that the weighted sum Σ w_k x_k of candidate or component signals x_1, x_2, ..., x_n approximates a reference signal y. The component signals may originate from different sources or be derived from a single input x using a delay line or a filter bank as a signal decomposer.
The adaptive filter can be interpreted in different ways, depending on one's perspective. On the one hand, a biologist might see it as a system that maintains a balance between excitatory and inhibitory inputs [39,11,47]. Notably, this differs from homeostasis because the inhibitory-excitatory balance adjusts synaptic weights so that the current weighted excitatory inputs match the inhibitory inputs as well as possible. This is non-trivial because these inputs are vectors, and the balancing operation can be described as minimizing a vector difference.
On the other hand, a physicist might view the filter as performing a wavelet transform of the signal y using wavelets x k [29], with the weights serving as transform coefficients.
Depending on how the filter is connected, it can perform a variety of essential signal processing tasks, such as model creation, inverse model creation, prediction, and interference cancellation [19]. Particularly relevant for biological systems is a configuration suggested to address the sensorimotor association problem, or the process by which the brain learns which neuron is connected to which muscle [33].
Algorithm 1: The basic LMS algorithm.

The Least Mean Squares (LMS) algorithm (algorithm 1) [18], also known as the Widrow-Hoff LMS rule, is a method for updating the weights of an adaptive filter. It operates in discrete time steps t = t_1, t_2, ..., where at each step it calculates the error feedback z, which is the difference between the weighted sum of the input signals x_k and the reference signal y. Then, it updates all the weights w_k by subtracting the associated feedback corrections, which are calculated as ∆w_k = ε z x_k, where ε is a learning rate. This learning rate is a positive constant, and its selection involves a balance between the convergence speed and stability against noise.
The convergence of the adaptive filter can be understood intuitively as follows: Suppose that some weight w_j is slightly too large and that the corresponding input x_j is positive. Then the error z will also tend to be positive and will be fed back to cause a reduction of the weight w_j by ε z x_j. A similar argument can be made when instead w_j is too small or x_j is negative. Proving the convergence of the weights formally can be difficult in general, but the LMS rule has proven to be robust in practical applications [19].
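The update loop of the basic LMS rule can be sketched in a few lines. The toy signals, learning rate, and step count below are illustrative assumptions:

```python
import numpy as np

def lms(x, y, eps=0.05):
    """Basic LMS (algorithm 1): at each step compute the error feedback
    z = sum_k w_k x_k - y and update every weight by w_k -= eps * z * x_k."""
    n_steps, n = x.shape
    w = np.zeros(n)
    for t in range(n_steps):
        z = w @ x[t] - y[t]      # error feedback
        w -= eps * z * x[t]      # simultaneous update of all weights
    return w

# Toy check: y is exactly 2*x_1 + 0.5*x_2, so the weights should converge there.
rng = np.random.default_rng(0)
x = rng.standard_normal((5000, 2))
y = 2.0 * x[:, 0] + 0.5 * x[:, 1]
w = lms(x, y)
```

Because the reference is exactly representable and the inputs are independent, the weights settle at the unique solution; with correlated or redundant inputs they would instead drift within a subspace, as discussed later.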

Understanding the neuron as an adaptive filter
Here, it is established that the neuron's equivalent circuit operates as an adaptive filter, suggesting that the neuron also embodies this functionality.

The neuron's equivalent circuit as an adaptive filter
Interpreting the neuron as an adaptive filter is greatly simplified by modeling the neuron as an equivalent electric circuit. The combination of the synapse circuits in fig. 4 and fig. 6 into a circuit equivalent for the neuron is shown in fig. 8. This circuit converts the spiketrain input to membrane potential. The subsequent output conversion of the membrane potential to an output spiketrain and the application of an activation function ϕ(z) are omitted here because a mechanistic model for them has been presented elsewhere [35] and does not directly influence the input conversion.
The side-by-side comparison of the adaptive filter in fig. 7 and the neuron model in fig. 8 shows detailed agreement, indicating that both the circuit and, by extension, the neuron implement a modified LMS algorithm (algorithm 2). The match between the circuit and the adaptive filter is corroborated below by illustrating how the circuit realizes the summation operations, error feedback, and weight updates. Furthermore, an explanation is provided for the scenario where component inputs are redundant or linearly dependent, a common condition for biological neurons.
Algorithm 2: The modified LMS algorithm, full neuron version with activation function included. The inputs are assumed to already be lowpass filtered.

Summation: Comparing fig. 7 and fig. 8, it is evident that the summation operations in the adaptive filter align with the addition of currents in the neuron's equivalent circuit, as Kirchhoff's current law dictates. This law states that the sum of currents entering a junction must equal the sum of currents leaving it, mirroring the summation process in the adaptive filter.
Error feedback: A rapid error feedback signal, labeled z in fig. 7 and fig. 6, is essential for the functioning of the adaptive filter, as is visible in algorithms 1 and 2. This feedback is provided by the membrane potential v_m created by the total of the IPSC and EPSC currents passing through the impedance consisting of the membrane resistance R_m in parallel with the membrane capacitance C_m. The feedback signal accesses all synapses within the neuron via their connections to the soma. The lowpass filtering by R_m C_m introduces a decay or "forget" factor λ, 0 ≤ λ < 1, on line 2 of algorithm 2, slightly generalizing upon algorithm 1, which would have λ = 0.

Rongala et al. [42] have proposed that the membrane capacitance and resistance function as a lowpass filter, stabilizing external feedback in recurrent neural networks. This function is equally applicable to single neurons with internal feedback. In the diagram in fig. 8, this lowpass filter is represented by R_m C_m, and its impact is encapsulated in the decay factor λ. Notably, this parameter is essential but was not included in the original formulation of LMS learning.
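One way to relate the decay factor to the membrane time constant is through a simple exponential discretization with time step dt. The step size and component values below are assumptions for illustration; the paper does not specify a discretization:

```python
import math

# Relating the forget factor lambda to the membrane time constant R_m * C_m
# under an assumed discrete-time step dt (all values illustrative).
R_m = 100e6    # membrane resistance, 100 MOhm (assumption)
C_m = 100e-12  # membrane capacitance, 100 pF (assumption)
dt = 1e-3      # discrete time step, 1 ms (assumption)

tau = R_m * C_m              # membrane time constant, here 10 ms
lam = math.exp(-dt / tau)    # per-step decay of the filtered error
print(round(lam, 3))         # 0.905
```

Under this reading, λ = 0 (algorithm 1) corresponds to a vanishing membrane time constant, while a long time constant pushes λ toward 1 and gives the error feedback a longer memory.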
In the biological neuron, the propagation of the z signal is nearly instantaneous due to its electrotonic conduction through the cytosol. Incidentally, it is remarkable how Nature has elegantly arranged for feed-forward by current and feedback by voltage, thereby avoiding interference between the two signals. Notably, the model does not require the postsynaptic neuron's generation of an action potential to adjust synaptic weights; the membrane potential provides the feedback. This is important because otherwise, a neuron with zero synaptic weights would have difficulty leaving this state.
Weight updates: The adaptive filter updates its weights w_k by the product of the inputs x_k and the error feedback z. The update uses a clever trick that stands out when viewing the involved circuitry, i.e., the plasticity circuitry of the excitatory synapse in fig. 6. The weight w is a charge held by the capacitor C_w. The product of the input x and error z should update this weight. However, whereas the error is readily available in the circuit as the membrane potential v_m, the signal x on the glutamate pathway is PFM encoded and is unusable for the update in this form. Although lowpass filtered in the dendrite and soma, it is directly summed into the membrane potential and is unavailable separately. Fortunately, a lowpass-filtered version of x is available as the calcium concentration [Ca2+]_e in the synaptic cleft. Thanks to this additional copy of x, the NMDAR transistor in the circuit and the ion channel in the neuron can crucially "compute" (pass a charge proportional to) the weight update by multiplying the calcium concentration representing x with the membrane potential v_m representing z. Experiment 1, described below, validates this process.
Redundant and linearly dependent candidate inputs: Decomposing a signal x into components in engineering contexts relies on techniques such as a bandpass-filter bank or a Fast Fourier Transform. These methods ensure orthogonality, or at least linear independence, of the components x_k. This independence is a critical requirement to guarantee the uniqueness of the weights. However, such a systematic decomposition is unfeasible from a biological perspective, so identical reference inputs may give rise to different synaptic weights. In the case of redundant component inputs, weights will converge (settle) towards a linear subspace rather than a specific point. Correlated component inputs can slow the convergence of the original LMS algorithm because all weights are updated simultaneously, which may lead to overshooting and oscillations. Here, evolution has provided an elegant solution for neurons, because each synapse is updated individually and asynchronously by its own glutamate strobe signal (fig. 6), as demonstrated in experiment 2 (cf. the "for" statement in algorithm 1 with the "when" statement in algorithm 2).
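The per-synapse, asynchronous update can be sketched as a single step function. The exact pseudocode of algorithm 2 is not reproduced in this text, so the form below (leaky error feedback with forget factor λ, updates only for strobed synapses, weights clipped at zero as the diode across C_w enforces) is an assumed reconstruction; the toy run's signals and rates are also assumptions:

```python
import numpy as np

def modified_lms_step(w, x, y, z_prev, eps=0.02, lam=0.5, active=None):
    """One step of an assumed modified LMS rule: the error feedback is
    lowpass filtered with forget factor lam, only synapses whose glutamate
    strobe fired (indices in `active`) are updated, and weights are clipped
    at zero (non-negative, like the number of AMPARs)."""
    z = lam * z_prev + (w @ x - y)           # leaky error feedback
    for k in (range(len(w)) if active is None else active):
        w[k] = max(0.0, w[k] - eps * z * x[k])   # asynchronous update
    return w, z

# Toy run: y depends only on x_0; each synapse strobes independently.
rng = np.random.default_rng(1)
w, z = np.zeros(2), 0.0
for _ in range(30000):
    x = np.abs(rng.standard_normal(2))       # non-negative, rate-like inputs
    y = 1.5 * x[0]
    strobed = [k for k in range(2) if rng.random() < 0.5]
    w, z = modified_lms_step(w, x, y, z, active=strobed)
```

Even though the two inputs are correlated here, the weights settle near (1.5, 0), and the clipping keeps them non-negative throughout, mirroring the behavior claimed for the circuit.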

Implications of the neuron operating as an adaptive filter
The neuron behaving as an adaptive filter allows us to address the three key concerns in the introduction: the process of memory storage and retrieval, the unification of Hebbian and homeostatic plasticity, and the establishment of a universal rule for synaptic plasticity. The proposed solutions to these problems are presented in the results section below. More generally, the adaptive filter provides a valuable conceptual model for understanding neuron populations and facilitates a succinct mathematical representation of these [34].
The following subsection conducts a series of experiments that confirm the functioning of the circuit as an adaptive filter.

Experiment design
Two sets of experiments were carried out to explore and validate model properties.
In the first experiment, the stability and convergence of the model were examined. The neuron model comprised one inhibitory synapse and two excitatory synapses (n = 2 in fig. 8). The task of the circuit was to determine the weights w_1 and w_2 so that the weighted sum of spiketrains 2 and 3 corresponded to spiketrain 1. The inputs were Pulse Frequency Modulated (PFM) spiketrains, effectively inhomogeneous Poisson processes, modulated by sine waves with a modulation depth of 67% (fig. 9). The modulations for the first experiment were 1 Hz and 2 Hz for spiketrains 2 and 3, respectively. The reference input, spiketrain 1, began with a modulation of 1 Hz but switched to 2 Hz after 150 s, ensuring a large number of spike arrivals and NMDA activation episodes.
The second experiment aimed to study the model's behavior in the presence of redundant input. In this experiment, a sine wave of 1 Hz modulated the inhibitory input, and a wave of 2 Hz modulated the first excitatory input (x_1). The remaining five excitatory synapses x_2, ..., x_6 (n = 6 in fig. 8) received redundant input. During the first run, these inputs were synchronized, receiving the same spiketrain modulated at 1 Hz. In the second run, spiketrains 3-7 were modulated by 1 Hz but generated independently, mimicking the behavior of biological neurons and making them asynchronous. The component values used in these experiments are provided in table 1, and they roughly align with physiological values, except for the gain parameters γ, which are unknown. In particular, γ_NMDAR is a lumped parameter that can be tuned to adjust the learning rate ε over a wide range. This gain was chosen for demonstration purposes to illustrate the circuit function optimally and exaggerate learning speed and amplitude. Learning is typically considerably slower in biological neurons, but regardless, the circuit is robust and not overly sensitive to parameter variations. The experiments were conducted using an electronic-circuit simulator [27].
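A sine-modulated PFM spiketrain of the kind used as input here can be generated as an inhomogeneous Poisson process. The baseline rate, time step, and Bernoulli approximation below are assumptions for illustration; the paper does not state the baseline firing rate:

```python
import numpy as np

def pfm_spiketrain(rate0, f_mod, depth, duration, dt=1e-3, seed=0):
    """Generate a PFM spiketrain as an inhomogeneous Poisson process with
    sine-modulated rate r(t) = rate0 * (1 + depth * sin(2*pi*f_mod*t)).
    Spikes are drawn per time bin with P(spike) ~ r(t)*dt (valid for
    r(t)*dt << 1); rate0 and dt are illustrative assumptions."""
    rng = np.random.default_rng(seed)
    t = np.arange(0.0, duration, dt)
    rate = rate0 * (1.0 + depth * np.sin(2.0 * np.pi * f_mod * t))
    spikes = rng.random(t.size) < rate * dt
    return t[spikes]   # spike times in seconds

# e.g. a 1 Hz-modulated train, 50 spikes/s baseline, 67% depth, 10 s long
train = pfm_spiketrain(rate0=50.0, f_mod=1.0, depth=0.67, duration=10.0)
```

With a depth of 67%, the instantaneous rate stays strictly positive, so the modulation never silences the train entirely, matching the experiment's description of the inputs.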

Experiment results
The first experiment demonstrates the convergence of weights w_1 and w_2. Initially, with the inhibitory input signal y modulated by a sine wave of 1 Hz, the ratio w_2/w_1 approaches zero, as it should. This is because the input signal x_1 is also modulated by a sine wave of 1 Hz, coinciding with the reference input, while the input signal x_2 is modulated by a sine wave of 2 Hz, which is orthogonal to y. However, after 150 s, the modulation of y changes to 2 Hz, which instead coincides with the input signal x_2. This time, the inverse ratio w_1/w_2 approaches zero. Fig. 10A depicts this convergence for two different values of NMDAR gain γ_NMDAR; low and high gain correspond to 3×10^−5 A/V² and 10^−4 A/V², respectively. The diagram shows that the circuit strives to enhance the weight of the excitatory input that aligns in frequency with the inhibitory input, while concurrently decreasing the weight of the other excitatory input that doesn't match in frequency.

Table 1: Model parameters. The γ parameters denote ion channel (transistor) gains. E_Na = 67 mV.
The second experiment shows what happens for multiple redundant excitatory inputs. In the first case, all the excitatory inputs are identical, so all strobe pulses are synchronous (dashed traces). In the second case, the same sine wave of 2 Hz modulates the excitatory inputs, but otherwise, they are independent, so the strobe pulses are asynchronous (solid traces). The experiment shows faster convergence for asynchronous strobes.

Solutions to the three specific problems considered
This paper has suggested that a neuron functions and can be conceptualized as an adaptive filter with internal feedback. Such a neuron model enables straightforward solutions, presented below, to the three problems posed in the introduction.

1. How does the neuron manage memory?
According to the adaptive-filter model, memories are stored as synaptic weights. More precisely, if the information is input in the form of an inhibitory signal y(t), the memory is formed in the neuron by adapting weights w_k so that y is balanced by the weighted sum Σ_k w_k x_k of the excitatory signals x_k(t).
In principle, memory is subsequently retrieved whenever signals y and x_k are received by the neuron, and it computes and outputs the prediction error z, which relies on the weights. Alternatively, memory can be recalled by temporarily holding y at zero, whereby the neuron will output the approximation Σ_k w_k x_k ≈ y assembled by the same linear combination of excitatory signals.
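This storage-and-recall cycle can be illustrated with the basic LMS update on plain sine components. The signals, learning rate, and durations are illustrative assumptions, abstracting away the spiketrain encoding:

```python
import numpy as np

# Storage: adapt weights so that the excitatory components x_k balance the
# stored inhibitory signal y. Recall: hold y at zero, so the output is the
# learned combination sum_k w_k x_k. (Toy signals; assumptions throughout.)
t = np.arange(0.0, 60.0, 1e-2)
x = np.stack([np.sin(2 * np.pi * 1.0 * t),      # 1 Hz component
              np.sin(2 * np.pi * 2.0 * t)], 1)  # 2 Hz component
y = 0.8 * x[:, 0]                               # signal to be stored

w = np.zeros(2)
for i in range(t.size):                         # storage phase (LMS)
    z = w @ x[i] - y[i]                         # prediction error
    w -= 0.05 * z * x[i]

recalled = x @ w                                # recall phase: y held at zero
```

After storage, `recalled` closely reproduces the stored signal, and the weight on the orthogonal 2 Hz component stays near zero.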
2. Is there a universal synaptic learning rule?
The synaptic learning rule can be expressed as a variation of the Least Mean Squares (LMS) learning rule, modified to allow asynchronous weight updates, lowpass filtering of the feedback error, and the constraint that weights cannot be negative (cf. line 3 of algorithm 2):

    w_k ← max(0, w_k + ε z x_k),    (1)

where k indicates synapse k, w = (w_1, ..., w_n)^T is the vector of synaptic weights (numbers of AMPARs), z represents the lowpass-filtered membrane potential v_m (error feedback), and x = (x_1, ..., x_n)^T is the vector of local synaptic cleft calcium concentrations [Ca2+]_e (excitatory input). The learning rate ε depends on several biological parameters but is perhaps most directly controlled by the gain γ of the NMDAR. This rule is applicable for an arbitrary number of asynchronous inputs and is triggered by a spike arrival at excitatory input k.
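As a concrete reading of the rule, a single asynchronous update could be sketched as follows. The function name, the default parameter values, and the explicit form of the decay term are assumptions for illustration, not the paper's algorithm listing.

```python
def update_weight(w_k, z, x_k, eps=1e-3, decay=1e-4):
    """One spike-triggered update of synapse k (illustrative sketch).

    w_k   : synaptic weight (number of AMPARs), kept non-negative
    z     : lowpass-filtered membrane potential v_m (error feedback)
    x_k   : local synaptic-cleft [Ca2+]_e deviation (excitatory input)
    eps   : learning rate, set by the NMDAR gain
    decay : small weight decay (assumed form of the rule's decay factor)
    """
    w_k = (1.0 - decay) * w_k + eps * z * x_k   # modified LMS step
    return max(w_k, 0.0)                        # weights cannot be negative

# Example: a positive error coinciding with a strong input increases w_k;
# an update that would drive w_k negative is clamped at zero.
w = update_weight(1.0, z=0.5, x_k=2.0, eps=0.1, decay=0.0)   # -> 1.1
```

Because the update fires only on a spike arrival at synapse k, each synapse adjusts at its own input rate, which is what "asynchronous" means here.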

3. How do homeostatic and Hebbian plasticity balance?
The Hebbian-homeostatic balance emerges from the synaptic learning rule (1), which inherently provides stability and subsumes both Hebbian and homeostatic plasticity. The NMDAR directly implements the multiplication z x_k, and because the parameters x_k and z describe signed deviations from the steady-state averages (homeostatic equilibria), the modified LMS rule offers automatic stabilization. In the case of spike-timing-dependent plasticity (STDP), the model indicates that "backwash" membrane-potential fluctuations caused by the output spike bias weight changes in favor of input spikes preceding output spikes.

Discussion
The neuron uses membrane-potential feedback during adaptation to adjust the excitatory synapse weights. This adjustment strives to balance inhibitory and excitatory input. Alternatively, the process can be described as the neuron's attempt to predict the inhibitory input from the excitatory input; the membrane potential encodes the prediction error [44]. Signal processing and control theory often refer to the prediction error by the fundamental concept of innovation [22]. It has frequently been discussed in neuroscience under different names, including novelty [25], unexpectedness [2], decorrelation [10], surprise [12], and saliency [51]. The critical operation for the plasticity of the neuron is the multiplication of the prediction-error feedback z, represented by the membrane potential v_m, with the excitatory input x, available from the synaptic cleft's external calcium concentration [Ca2+]_e. Given the existence of this non-linear multiplication mechanism, linear mechanisms can adjust a suitable homeostatic equilibrium or zero offset (x_0, z_0) by processes involving voltage-gated calcium channels (z x_0) and metabotropic glutamate receptors (z_0 x).
A significant difference between a neuron and a classical adaptive filter is that the neuron's weights cannot be negative. This is not a limitation, because feeding a candidate signal x together with its negation (−x) achieves the same effect as a signed weight [9]. Incorporating such negations could be a function of the numerous local inhibitory neurons in the central nervous system. Somewhat unexpectedly, the restriction to non-negative weights proves to be an advantage, as it enhances the expressive capabilities of neuron populations [34].
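The negation trick can be demonstrated numerically. Here the inverted copy of x is mixed in by hand, standing in for a local inhibitory interneuron; the target signal, learning rate, and duration are illustrative.

```python
import numpy as np

# A target y = -0.5*x would need a negative weight on x alone. Supplying
# both x and -x lets two clamped, non-negative weights realize it:
# the effective signed coefficient is w[0] - w[1].
t = np.arange(0.0, 30.0, 1e-3)
x = np.sin(2 * np.pi * t)
y = -0.5 * x

w, eps = np.zeros(2), 0.02                   # weights for [x, -x]
for i in range(t.size):
    u = np.array([x[i], -x[i]])              # candidate input and its negation
    z = y[i] - w @ u                         # prediction error
    w = np.maximum(w + eps * z * u, 0.0)     # non-negative LMS update

# w[0] stays pinned near zero while w[1] converges to about 0.5,
# so w[0] - w[1] acts as the signed weight -0.5.
```

The clamp never has to be violated: whichever of the pair matches the required sign absorbs the weight, and the other stays at zero.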
Two salient features distinguish the proposed model: the explicit dynamics of the synaptic cleft, and the dual-purpose utilization of glutamate both for direct information transfer and as a strobe signal that facilitates weight adjustment. The necessity for a strobe input arises because, if NMDARs were continuously active, the weights would be diluted toward zero, resulting in information loss. It is crucial for plasticity that weights change only when there is meaningful input, that is, when activated by glutamate [21].
The circuit equivalent assumes that NMDARs operate at the same speed as AMPARs. In reality, NMDARs are slower and produce a burst of openings when triggered by glutamate, effectively performing lowpass filtering. The model does not explicitly incorporate this property because the lowpass-filtered calcium input already accounts for the slowdown.
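The effective smoothing described above is equivalent to a first-order lowpass stage, which can be sketched in a few lines (the time constant and step size below are arbitrary illustrative values, not fitted NMDAR kinetics):

```python
import numpy as np

# First-order (one-pole) lowpass filter: the kind of smoothing a burst of
# NMDAR openings effectively applies to its gating signal. tau is illustrative.
def lowpass(signal, dt=1e-3, tau=0.05):
    a = dt / tau                      # discrete smoothing factor (dt << tau)
    out = np.empty_like(signal)
    acc = 0.0
    for i, s in enumerate(signal):
        acc += a * (s - acc)          # exponential moving average
        out[i] = acc
    return out

# A single 1 ms spike is spread into a decaying transient of roughly
# tau's duration, instead of acting instantaneously.
```

This is why the slow NMDAR kinetics need no separate circuit element: the same exponential smoothing is already applied to the calcium input.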
Several researchers have put forth adaptive filters as models for neuronal circuits in the cerebellum, utilizing external feedback [13,53,41]. Nevertheless, low-latency feedback is pivotal for the performance of an adaptive filter, as it sets the maximum signal frequency content. External feedback is slower than internal feedback by several orders of magnitude (for pyramidal neurons, see, for example, [32,1]).
The idea of a neuron functioning as a self-contained adaptive filter has been hypothesized [33,28]. However, to our knowledge, the model presented here is the first wholly mechanistic model based exclusively on the known properties of ion channels.
For the studied GABAAR-AMPAR-NMDAR neurons, the model assumes that signals are conveyed by minor deviations from equilibrium. The general approach of modeling neurons by circuit equivalents can certainly also be applied in more general cases involving large deviations and steep changes in ion channel conductance depending on the operating conditions of the neuron, but in such cases it will most likely be harder to find as simple an abstraction as (1).
A central prediction of the model is that the learning rate ε is directly related to the gain of the external-calcium-to-AMPAR cascade, reflected by the lumped parameter γ_NMDAR. Two interrelated experiments on real neurons could test this prediction. The first would test whether such a parameter is conceivable, e.g., by modifying the most convenient and accessible factor influencing the learning rate. The second would more exhaustively attempt to identify the factors affecting the gain and their interrelations.
Conducting a sensitivity analysis to measure the factors influencing the learning rate is challenging because the above lumped-parameter gain of the NMDA receptor summarizes this sensitivity. Many factors influence this parameter, providing neurons with multiple adjustment methods. This adaptability is advantageous for the neuron, as it can select the most beneficial adjustment method. However, this complexity and the compensatory nature of these factors result in a broad operating range for each factor, making it hard to pinpoint parameter values.
When the circuits in fig. 4 and fig. 6 are interpreted from an electrical engineering perspective, it appears that evolution has crafted an elegant, robust, and minimalist solution. From a pure signal-processing standpoint, the stability of neuronal functions strongly suggests the existence of feedback. The loop delay in this feedback must be short, pointing towards electrotonic propagation. The membrane potential is the sole feasible choice because the output spikes are too infrequent to provide swift feedback.
The neuron appears to employ the biochemical equivalent of alternating-current (AC) signals for signaling, while the direct-current (DC) level is maintained by homeostasis to ensure an appropriate metabolic level. It is challenging to conceive of a more compact arrangement of components capable of achieving such a sophisticated signal-processing function. Evolution has yielded an exquisite solution, employing current summation for feed-forward and voltage for feedback. The dual purpose of the glutamate pulse, serving as both PFM-encoded input and strobe, is ingenious.
Widrow and Hoff [52] initially introduced the abstract, high-level neuron model ADALINE (ADAptive LInear NEuron), drawing inspiration from the McCulloch-Pitts neuron model [31]. This work predates the experimental discovery of ion channels by several years. Regrettably, Widrow and Hoff eventually abandoned ADALINE as a neuron model. Nevertheless, it became the foundation of the adaptive filter, which experienced dramatic advancements within the signal-processing domain.
The LMS learning rule is known under various names in different contexts. In the field of artificial neural networks, it is often referred to as the "delta rule," whereas in statistical learning theory it is known as the "covariance rule" [45]. These names all refer to the same concept: an iterative method for adjusting the weights of a learning model to minimize the mean square of the error z between the model's prediction, the weighted sum of the x_k, and the actual data y. Our model is a mechanistic explanation of a modified LMS or covariance rule with asynchronous updates, restricted to non-negative weights and including a decay factor. Other major self-stabilizing learning rules are the Bienenstock-Cooper-Munro (BCM) rule [3] and the Oja rule [38]. However, these rules are theoretical constructs and, to the best of the author's knowledge, lack mechanistic explanations.
Compared to biological neurons, the model exhibits several characteristics typical of biological neurons but not commonly found in other neuron models, at least not mechanistic ones:
1. It possesses the ability to record time-variable functions.
2. It can learn without risking instability. This and the previous feature align with two of the three fundamental properties we initially aimed to achieve, as outlined in the introduction.
3. It can "bootstrap" from a state where all synapse weights are zero, a capacity that is difficult for neurons relying on output spikes for plasticity.
Several neuronal features have been discussed and speculatively related to plasticity, including electrical effects of the spine neck [16], intraspine action potentials [40], and shunting of synaptic currents by simultaneously active synapses on a single spine [24]. As for spine-neck effects, which passive filters can characterize, they benefit the neuron by increasing the diversity of synapse filter characteristics. However, the proposed model is generally independent of such exotic features. Standard features of ion channels are entirely satisfactory for explaining all aspects of the model. Neither are exotic features deleterious to the model, which, in its capacity as an adaptive filter, is robust against noise.
The presented model does not include the process by which the neuron converts the membrane potential into the output spiketrain, including the activation function, because this output process has been comprehensively and mechanistically addressed in a recent publication [35]. The current paper completes the picture of the neuron by providing a mechanistic explanation of the input process, the conversion from spike trains back to internal potential, including the plasticity.

Conclusions
Neuroscience research in many fields depends on detailed mechanistic knowledge of how neurons decode, process, store, and encode information. Examples of such fields are neural implants, interoception, and artificial intelligence, but progress in these fields has been hampered by empirical and oversimplified neuron models.
This paper provides a complete, state-of-the-art mechanistic model of a neuron's signal processing, including plasticity, in the milliseconds-to-minutes range. The model explains at the ion-channel level how neurons convert input spiketrains to an internal potential, including the adjustment of their synaptic weights. Crucial components of the model are the inclusion of synaptic cleft dynamics, the arrangement of internal feedback, and the multiple functions of the glutamate neurotransmitter. It is shown that memory recording can be identified with the weight adjustments of an adaptive filter. The neuron strives to balance the inhibitory and excitatory inputs. After adaptation, it can be regarded as an inhibitory-input predictor, delivering the prediction error as output.
The mechanistic abstraction of the neuron as an adaptive filter constitutes an essential link to the realm of conceptual spaces [14], interposed between the cognitive and biological levels. It reduces the need for spiking-level simulations and simplifies the understanding of large assemblies and networks of neurons, as elaborated in depth in [34].

Figure 1: Structure of a generic neuron. This image overviews the essential parts of a neuron, with details of inhibitory and excitatory synapses explored in fig. 2 and fig. 5, respectively.

Figure 2: Inhibitory synapse schematic. An action potential arriving at (1) causes the release of GABA at (2), which then activates GABAAR, allowing a chloride current at (3), traveling as an IPSC to the soma.

Figure 4: Inhibitory synapse circuit equivalent. This circuit implements the inhibitory synapse described in fig. 2 by modeling neurotransmitters as electrical currents. The transistor represents GABA-gated ion channels. Locations indicated by circled numbers 1-3 correspond to identically marked locations in fig. 2. The driving potential of chloride ions is negative, so a positive pulse input at AP will lead to a negative current pulse at IPSC. The capacitor C_h models the cellular machinery retaining homeostatic equilibrium.

Figure 5: Excitatory synapse schematic. An action potential arriving at (1) causes the release of glutamate at (2), which then activates AMPARs, allowing a sodium current at (3). This current travels to the soma and proximal dendrites, where it is lowpass filtered and fed back as the membrane voltage (4). At (5), this voltage and glutamate gate the NMDAR. Experiments have demonstrated the activity-dependence of the synaptic cleft's calcium concentration [Ca2+]_e [6,7].

Figure 7: Structure of an adaptive filter [18]. A box with input x_k and tunable weight w_k computes the product x_k w_k and corresponds to the excitatory synapse in fig. 6. The feedback z is crucial for adjusting the weights. The image is a graph representation of algorithm 1.

Figure 8: The neuron equivalent circuit. The IPSC and EPSC blocks are defined in fig. 4 and fig. 6, respectively. Labels y, x_1, ..., x_n denote unfiltered action potential (spiketrain) inputs, which are lowpass filtered in the synapse blocks. The subsequent conversion of v_m back to an output spiketrain and the application of an activation function are omitted.

Figure 9: Input signals. A. Initial section of the 1 Hz PFM spiketrain used as input. The overlaid sine wave shows the modulation signal. B. Detail. The spike width is 1 ms.

Figure 10: Convergence and stability of weights. A. Simulations show the convergence of the weights for two different values of the NMDAR gain γ (dashed trace for γ = 3·10⁻⁵ AV⁻² and solid for γ = 10⁻⁴ AV⁻²). Modulation of the reference signal changed from 1 Hz to 2 Hz at t = 150 s. B. Convergence for redundant inputs. The upper two traces show the sum w_2 + w_3 + ... + w_6, whereas the lower traces show w_1. Weights cannot be negative. Dashed and solid traces are shown for synchronous and asynchronous spiketrains, respectively.