Shaping low-rank recurrent neural networks with biological learning rules

We study how different plasticity rules affect the low-dimensional representations and dynamics of low-rank recurrent neural networks, and employ simulation-based inference within a teacher-student framework to identify plasticity rules that solve common neuroscience tasks. The low-dimensional representations of gradient-descent-trained teacher networks characterise the target activity and are used to infer biologically plausible plasticity rules that imprint these low-dimensional representations onto the student networks.
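To make the setup concrete, below is a minimal, illustrative sketch (not the project's actual code) of the forward model such a pipeline requires: a rank-1 teacher network, a student whose recurrent weights evolve under a parameterised local plasticity rule, and low-dimensional summary statistics of the resulting activity that a simulation-based inference method could compare against the teacher's. The network size, the three-parameter plasticity rule, and the PCA-based summaries are all assumptions made for illustration.

```python
# Illustrative sketch of a low-rank teacher-student forward model.
# Sizes, time constants, the plasticity-rule parameterisation and the
# summary statistics are assumptions, not the project's actual choices.
import numpy as np

rng = np.random.default_rng(0)
N, T, dt, tau = 200, 400, 0.1, 1.0

# Fixed rank-1 teacher connectivity J = m n^T / N.
m, n = rng.normal(size=N), rng.normal(size=N)
J_teacher = np.outer(m, n) / N

def simulate(J, inputs):
    """Euler-integrate tau dx/dt = -x + J tanh(x) + u; return the rates."""
    x, rates = np.zeros(N), []
    for u in inputs:
        x = x + dt / tau * (-x + J @ np.tanh(x) + u)
        rates.append(np.tanh(x))
    return np.array(rates)                      # shape (T, N)

def summary_stats(rates, k=2):
    """Low-dimensional summary: variance explained by the top-k PCs."""
    centred = rates - rates.mean(axis=0)
    s = np.linalg.svd(centred, compute_uv=False)
    return (s ** 2 / np.sum(s ** 2))[:k]

def student_forward(theta, inputs, epochs=10, eta=1e-3):
    """Student whose weights follow a simple parameterised local rule:
    dJ_ij = eta * (theta0 * <r_i r_j> + theta1 * <r_j> - theta2 * J_ij)."""
    J = rng.normal(scale=0.1 / np.sqrt(N), size=(N, N))
    for _ in range(epochs):
        r = simulate(J, inputs)
        corr = r.T @ r / T                      # time-averaged r_i r_j
        pre = np.tile(r.mean(axis=0), (N, 1))   # time-averaged r_j
        J += eta * (theta[0] * corr + theta[1] * pre - theta[2] * J)
    return summary_stats(simulate(J, inputs))

# Compare teacher and student summaries for one candidate rule; an SBI
# toolbox would repeat this over many sampled rule parameters theta.
inputs = rng.normal(scale=0.5, size=(T, N))
target = summary_stats(simulate(J_teacher, inputs))
candidate = student_forward(np.array([1.0, 0.1, 0.5]), inputs)
print("teacher summary:", target)
print("student summary:", candidate)
```

In an inference loop, the mismatch between the teacher's and student's summaries would drive the posterior over the plasticity-rule parameters theta.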

You can see the poster that Pablo presented at the Bernstein Conference 2024 here: poster, and the presentation we gave at a group retreat in April 2024 here: presentation.