My research is interdisciplinary, straddling branches of mathematics, neuroscience, network theory, and machine learning.
I am mostly occupied with studying neural networks and their structure–dynamics–function relationships in the context of reservoir computing and biological brain networks.
Recurrent Neural Network
Reservoir Computing
Network Structure
Dynamics
Applied Topology
Heterosynaptic Plasticity
Biological Connectomes
My main current PhD work (which began with a collaboration) focuses on structure–dynamics–function relationships in recurrent neural networks, mainly the random networks used in reservoir computing. I am interested in how the dynamics generated inside random, frozen RNNs can be harnessed for useful computation. We analyse how different network architectures (generic and biological) perform on computational tasks, seeking topological signatures of effective and/or efficient networks. This includes using connectome-inspired networks in a "NeuroAI" sense.
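The reservoir-computing idea above can be sketched with a minimal echo state network: a random, frozen recurrent reservoir is driven by an input signal, and only a linear readout is trained on its states. All parameters here (reservoir size, spectral radius, the sine-prediction task) are illustrative choices, not the specifics of my work.

```python
import numpy as np

rng = np.random.default_rng(0)
n_res = 100            # reservoir size (illustrative)
spectral_radius = 0.9  # keep below 1 for the echo state property

# Fixed (never trained) reservoir and input weights.
W = rng.standard_normal((n_res, n_res))
W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, size=n_res)

# Drive the frozen reservoir with the input and collect its states.
u = np.sin(0.2 * np.arange(1000))
states = np.zeros((len(u), n_res))
x = np.zeros(n_res)
for t, u_t in enumerate(u):
    x = np.tanh(W @ x + W_in * u_t)
    states[t] = x

# Train only the linear readout (ridge regression) to predict u[t+1].
washout = 100  # discard the initial transient
X, y = states[washout:-1], u[washout + 1:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)

pred = X @ W_out
rmse = float(np.sqrt(np.mean((pred - y) ** 2)))
print("one-step prediction RMSE:", rmse)
```

The point of the construction is that all the "computation" lives in the fixed random dynamics; learning touches only `W_out`, which is what makes the structure of the reservoir itself the interesting variable.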
I'm part of a collaboration with Bristol, using mathematical modelling to study synaptic maturation and network development.
I’ve also worked on how deficits in short-term plasticity (e.g., in psychosis) affect decision-making and working memory, primarily using attractor-based dynamical systems. See an example.
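Attractor-based models of working memory rest on multistability: a recurrent population with a sigmoidal gain can hold either a low- or a high-activity stable state, so its current state acts as a memory trace. A minimal one-unit sketch (with hypothetical parameter values, not those of any specific model of mine):

```python
import numpy as np

def f(x):
    # Sigmoidal gain with threshold 2 and steep slope (illustrative values).
    return 1.0 / (1.0 + np.exp(-4.0 * (x - 2.0)))

def run(x0, w=4.0, dt=0.01, steps=5000):
    # Euler-integrate the rate equation dx/dt = -x + w * f(x).
    x = x0
    for _ in range(steps):
        x += dt * (-x + w * f(x))
    return x

low = run(0.0)   # settles near the low-activity attractor
high = run(3.0)  # settles near the high-activity attractor
print(low, high)
```

Starting below threshold decays to a near-zero state, while starting above it converges to a high-rate state: two coexisting attractors, and weakening the recurrent drive `w` (as a crude stand-in for a plasticity deficit) collapses the bistability.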
I use topological data analysis, such as persistent homology, to investigate large-scale patterns in neurocognitive datasets, especially working memory, processing speed, and cognitive structure.
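In practice persistent homology is computed with a TDA library, but the degree-0 case (connected components) is simple enough to sketch by hand: it coincides with single-linkage clustering, so the barcode falls out of Kruskal's algorithm on the minimum spanning tree. The two-cluster point cloud below is a toy illustration, not real data.

```python
import numpy as np
from itertools import combinations

def h0_persistence(points):
    """Finite (birth, death) bars of H0 for the Vietoris-Rips filtration.

    Every component is born at scale 0; a component dies at the scale of the
    MST edge that merges it into another, found via union-find (Kruskal).
    """
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (float(np.linalg.norm(points[i] - points[j])), i, j)
        for i, j in combinations(range(n), 2)
    )
    bars = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            bars.append((0.0, d))  # one component dies at this merge scale
    return bars  # n - 1 finite bars; one component persists to infinity

# Two well-separated Gaussian clusters: one long bar flags the 2-cluster structure.
rng = np.random.default_rng(1)
cloud = np.vstack([rng.normal(0, 0.1, (20, 2)), rng.normal((5, 5), 0.1, (20, 2))])
deaths = sorted(d for _, d in h0_persistence(cloud))
print("longest finite bar:", deaths[-1])
```

The single long bar (roughly the gap between the clusters) is the "large-scale pattern": short bars are noise, long bars are structure, and the same reading carries over to higher-dimensional features computed by proper TDA software.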
I have research interests in assessment theory within mathematics education. See an example here.