Neural network models are an invaluable tool for understanding brain function, since they make it possible to connect the cellular and circuit levels with behaviour. Neural networks usually comprise a huge number of parameters, which must be chosen carefully so that the network reproduces anatomical, behavioural and neurophysiological data. These parameters are usually fitted with off-the-shelf optimization algorithms that iteratively change the network parameters and simulate the network to evaluate the changes and improve the fit. Here we propose to invert the fitting process by proceeding from the network dynamics towards the network parameters. Firing state transitions are chosen according to the transition graph followed by an agent solving a given behavioural task. Then, a system of linear equations is constructed from the network firing states and membrane potentials, in such a way that the consistency of the system is guaranteed. This uncouples the activity features of the model, such as its neurons' firing rates and correlations, from its connectivity features and from the task-solving algorithm implemented by the network, allowing these three levels to be fitted separately. We employed the method to probe the structure-function relationship in a stimulus sequence memory task, finding solution networks where commonly employed optimization algorithms failed. The constructed networks showed reciprocity and correlated firing patterns that recapitulated experimental observations. We argue that the proposed method is a complementary and much-needed alternative to the way neural networks are constructed to model brain function.
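As a toy illustration of this inversion (our own sketch; the binary threshold neuron model, the choice of target potentials and the least-squares solver are assumptions for illustration, not the paper's exact construction), one can prescribe a sequence of firing states and solve a linear system for the weights of a network that replays it:

```python
import numpy as np

# Toy sketch: prescribe a trajectory of binary firing states, then
# solve a linear system for weights W such that a threshold network
# x(t+1) = step(W @ x(t)) reproduces exactly that trajectory.

n_neurons = 8
# A propagating-wave trajectory: state t activates neurons t and t+1.
states = np.zeros((7, n_neurons))
for t in range(7):
    states[t, t] = states[t, t + 1] = 1.0

X = states[:-1]  # presynaptic firing states at each step
# Target membrane potentials: positive where a neuron should fire at
# the next step, negative otherwise (the +/-1 margins are arbitrary).
U = np.where(states[1:] == 1, 1.0, -1.0)

# One linear problem per neuron, stacked as X @ W.T = U. Here X has
# full row rank, so the system is consistent and lstsq is exact.
W = np.linalg.lstsq(X, U, rcond=None)[0].T

# The constructed network replays the prescribed transitions.
replay = (X @ W.T > 0).astype(float)
assert np.array_equal(replay, states[1:])
```

Note that because the system is underdetermined (more weights than constraints), many weight matrices satisfy it; this is where connectivity features could, in principle, be fitted independently of the prescribed activity.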
Introduction

Understanding brain function requires the construction of physiological models that explain experimental data, which encompass behavioural outcomes, anatomical features, neuronal biophysics and coding properties, among others 1,2. Many kinds of physiological models have been proposed throughout history, each with its own merits. Among them, neural network models are well poised to connect all levels of analysis, from the behavioural to the molecular, and are a natural choice since neurons are the functional units of the brain. Yet, constructing neural networks that are suitable models is not an easy task. Network parameters can be hand-designed following experimental data, or chosen randomly when experimental data are not available or as a means of reaching more general conclusions. However, this approach may fall short given the complexity of nervous systems. To tackle this issue, theorists have employed optimization methods that define the network parameters in such a way that a loss function is minimized. The loss function must encompass relevant aspects of the model, such as its performance in one or several tasks, structural constraints such as Dale's principle, or a connectivity with a certain degree of sparseness 3. Optimization methods are widely used in artificial intelligence (AI), and ...