Shaping low-rank recurrent neural networks with biological learning rules
We study how different plasticity rules affect the low-dimensional representations and dynamics of low-rank recurrent neural networks, and employ simulation-based inference within a teacher-student framework to identify plasticity rules that solve common neuroscience tasks. The low-dimensional representations of gradient-descent-trained teacher networks serve as summary statistics of activity for inferring biologically plausible plasticity rules that imprint these representations onto the student networks.
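A low-rank RNN constrains the connectivity matrix to a sum of a few outer products of vectors, which confines the recurrent dynamics to a low-dimensional subspace spanned by those vectors. The following is a minimal rank-one sketch of this idea; all parameters (network size, time constant, the factor-of-two overlap between the connectivity vectors) are illustrative choices for this example, not the settings used in the project:

```python
import numpy as np

# Minimal rank-1 recurrent network: J = m n^T / N.
# All parameters below are illustrative, not those of the actual teacher/student networks.
rng = np.random.default_rng(0)
N, T, dt, tau = 200, 500, 0.1, 1.0

m = rng.normal(size=N)      # left connectivity vector (output direction)
n = 2.0 * m                 # right connectivity vector; the overlap with m
                            # gives an effective gain > 1, so activity self-sustains
J = np.outer(m, n) / N      # rank-1 connectivity matrix

x = 0.1 * m                 # initial state along m
kappa = np.empty(T)         # latent variable kappa_t = n . phi(x_t) / N
for t in range(T):
    phi = np.tanh(x)
    kappa[t] = n @ phi / N
    # Euler step of  tau dx/dt = -x + J phi(x);  note J phi = m * kappa
    x = x + dt / tau * (-x + m * kappa[t])

# Because the recurrent input is always proportional to m, the state x
# stays confined to the one-dimensional subspace spanned by m.
alignment = abs(x @ m) / (np.linalg.norm(x) * np.linalg.norm(m))
```

The scalar `kappa` is the network's entire recurrent state in this rank-one case: tracking it (rather than all `N` units) is what makes the low-dimensional representation a compact characterisation of network activity.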
The poster Pablo presented at the Bernstein Conference 2024 is available here: poster, and the presentation we gave at a group retreat in April 2024 is here: presentation 1.
Here is the cover of Pablo’s thesis (please contact me or Pablo directly if you would like to read it), and, in case someone claims otherwise, here is a screenshot of the Overleaf file history.