M1/M2 Internship: Topology of neural connectomes
Behavior and decision-making are determined by physical processes taking place in the complex environment of the brain. Experimental techniques have reached the point where it is now possible to map the complete wiring diagram (the anatomical connectome) of the brain of simple model organisms at the level of single synapses [1,2,3], and to control the activity of individual neurons in live animals and observe the resulting behavior [4]. Together, this offers the opportunity to reverse engineer the physical basis of behavior.
This project aims to employ computational, statistical, and machine learning methods to characterize the circuitry of the neuronal networks of small animals and to investigate the influence of physical constraints on their topology. The project comprises two subparts, which may be investigated individually or together, as time allows and depending on the interests of the student:
- In search of biological neural network distances. Comparing networks is an open technical challenge: the multiplicity of existing tools does not provide a definitive answer [5]. For instance, comparing networks of different sizes is a non-trivial problem. Network patterns, or motifs [6], are attractive structural features since their definition is a priori independent of network size. We recently developed an inferential method that extracts statistically significant microcircuits from connectomes. This part of the project focuses on the comparison of circuit motifs as a proxy for network distances. The student would characterize and compare inferred sets of network motifs by exploring how dynamical models [7] are affected by the motifs' topological properties, e.g. their symmetry. A potential extension would investigate how a distribution of microcircuits affects the mesoscopic, or modular, structure of a network [8].
- Generative modeling of biologically plausible brain networks from learnt latent embeddings. This axis aims to develop new generative network models that embed nodes in a metric space where distances determine link probabilities through a kernel learnt from data. This will involve generalizing latent space graph models [9] by learning the distance kernels that define the connection probabilities between neurons, and possibly extending them to non-Euclidean embedding spaces. In a second step, we aim to link these embeddings to higher-order circuit motifs, to the real 3D spatial embedding of neurons, and to other biophysical constraints acting on the neural network.
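To give a concrete flavor of the motif-comparison idea, the sketch below counts two classic 3-node motifs (feed-forward loops and 3-cycles, in the sense of Milo et al. [6]) in a toy directed graph and compares the counts to a density-matched null ensemble via z-scores. This is a minimal illustration, not the lab's actual inference method; the toy random graph and the simple null model are assumptions made for the example.

```python
import itertools
import random
import statistics

def count_ffl_and_cycles(edges, nodes):
    """Census of two 3-node motifs in a directed graph given as an edge set.

    Counts induced 3-node subgraphs with exactly three edges and no
    reciprocal pair: the feed-forward loop (a->b, a->c, b->c) and the
    3-cycle (a->b->c->a).
    """
    ffl = cyc = 0
    for trio in itertools.combinations(nodes, 3):
        sub = {(u, v) for u, v in itertools.permutations(trio, 2) if (u, v) in edges}
        if len(sub) != 3 or any((v, u) in sub for u, v in sub):
            continue  # keep only 3-edge patterns with no reciprocal links
        # With 3 edges and no reciprocals, the out-degree signature
        # distinguishes the two motifs: [0,1,2] = FFL, [1,1,1] = cycle.
        outdeg = sorted(sum(1 for s, _ in sub if s == v) for v in trio)
        if outdeg == [0, 1, 2]:
            ffl += 1
        elif outdeg == [1, 1, 1]:
            cyc += 1
    return ffl, cyc

# Toy directed "connectome": 30 neurons, 90 random synapses.
random.seed(0)
n, m = 30, 90
pairs = [(i, j) for i in range(n) for j in range(n) if i != j]
edges = set(random.sample(pairs, m))

obs = count_ffl_and_cycles(edges, range(n))
# Null ensemble: random graphs with the same number of nodes and edges.
null = [count_ffl_and_cycles(set(random.sample(pairs, m)), range(n))
        for _ in range(20)]

# Over-representation relative to the null is the basic motif signal.
mu = [statistics.mean(c[i] for c in null) for i in (0, 1)]
sd = [statistics.pstdev(c[i] for c in null) for i in (0, 1)]
z = [(obs[i] - mu[i]) / (sd[i] or 1.0) for i in (0, 1)]
```

A real analysis would use degree-preserving randomization and the full census of connected 3- (and higher-) node subgraphs, but the structure is the same: observed counts, a null ensemble, and a significance score per motif.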
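The latent-space idea can be sketched as follows, assuming for illustration a fixed logistic distance kernel in the spirit of Hoff et al. [9]: nodes receive Gaussian positions in R^d and an edge appears with probability sigmoid(alpha − distance), so nearby nodes connect more often. In the project, this hand-picked kernel would instead be learnt from connectome data; the function name and parameters below are hypothetical choices for the example.

```python
import math
import random

def sample_latent_space_graph(n, dim=2, alpha=1.0, seed=0):
    """Sample an undirected graph from a simple latent space model.

    Each node i gets a Gaussian position z_i in R^dim, and an edge (i, j)
    appears independently with probability
        p_ij = 1 / (1 + exp(||z_i - z_j|| - alpha)),
    i.e. a logistic kernel of the latent distance: closer nodes link more
    often, and alpha controls the overall density.
    """
    rng = random.Random(seed)
    z = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    edges = set()
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(z[i], z[j])
            p = 1.0 / (1.0 + math.exp(d - alpha))
            if rng.random() < p:
                edges.add((i, j))
    return z, edges

positions, edges = sample_latent_space_graph(50, dim=2, alpha=1.0, seed=0)
```

Generalizing this model means replacing the logistic kernel with a parametrized function fit to data, and replacing Euclidean `math.dist` with a non-Euclidean (e.g. hyperbolic) metric, while keeping the same sampling scheme.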
The internship may lead to a PhD scholarship in the lab to continue the project.
[1] Winding et al., "The connectome of an insect brain." Science 379: eadd9330 (2023).
[2] Witvliet et al., "Connectomes across development reveal principles of brain maturation." Nature 596: 257–261 (2021).
[3] Dorkenwald et al., "Neuronal wiring diagram of an adult brain." bioRxiv, https://doi.org/10.1101/2023.06.27.546656 (2023).
[4] Jovanic et al., "Competitive Disinhibition Mediates Behavioral Choice and Sequences in Drosophila." Cell 167: 858–870 (2016).
[5] Hartle et al., "Network comparison and the within-ensemble graph distance." Proc. R. Soc. A 476: 20190744 (2020).
[6] Milo et al., "Network motifs: simple building blocks of complex networks." Science 298: 824–827 (2002).
[7] Riascos, "Dissimilarity between synchronization processes on networks." arXiv:2309.13163 (2023).
[8] Yamamoto et al., "Modular architecture facilitates noise-driven control of synchrony in neuronal networks." Science Advances 9: eade1755 (2023).
[9] Hoff et al., "Latent Space Approaches to Social Network Analysis." JASA 97: 1090–1098 (2002).
Scientific or technical background required for the work program
The successful intern should have training in one of the following fields: statistical or condensed matter physics, applied mathematics, or statistics.
Some fluency in Python and numerical simulations is expected.