Distinguished Lecture Series: Sparsity, modularity, and structural plasticity in deep neural networks, March 26th, 2024, Prof. Constantine Dovrolis (The Cyprus Institute and Georgia Tech)
Title: Sparsity, Modularity, and Structural Plasticity in Deep Neural Networks
Speaker: Constantine Dovrolis (The Cyprus Institute and Georgia Tech)
Location: STEP C “KEEK” Building - “Vassilios Dougalis” Meeting Room (Foundation for Research and Technology-Hellas)
Date: March 26th, 2024, 11:00-12:30

Host: Maria Papadopouli

Abstract: There is a growing overlap between Machine Learning, Neuroscience, and Network Theory. These three disciplines form a fertile inter-disciplinary cycle: a) inspiration from neuroscience leads to novel machine learning models, and deep neural networks in particular; b) these networks can be better understood and designed using network theory; and c) machine learning and network theory provide new modeling tools to understand the brain’s structure and function, closing the cycle. In this talk, we will “tour” this cross-disciplinary research agenda by focusing on three recent works: a) the design of sparse neural networks that can learn fast and generalize well (PHEW, ICML 2021), b) the use of structural adaptation for continual learning (NISPA, ICML 2022), and c) the emergence of hierarchical modularity in neural networks (Neural Sculpting, NeurIPS 2023).
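
For readers unfamiliar with the terminology, the sketch below illustrates what “sparsity” means in this setting: a network in which most weights are zeroed out by a binary mask. It is a minimal, generic example of unstructured magnitude-based sparsification in NumPy; it is not the PHEW, NISPA, or Neural Sculpting methods mentioned in the abstract, and the layer sizes, density, and function names are illustrative assumptions only.

    import numpy as np

    def sparsify(weights, density=0.1):
        """Keep only the largest-magnitude fraction `density` of weights,
        zeroing the rest (unstructured, magnitude-based sparsity)."""
        k = max(1, int(density * weights.size))
        threshold = np.sort(np.abs(weights), axis=None)[-k]   # k-th largest magnitude
        mask = (np.abs(weights) >= threshold).astype(weights.dtype)
        return weights * mask, mask

    rng = np.random.default_rng(0)
    # A toy two-layer network, 784 -> 128 -> 10; sizes are illustrative, not from the talk.
    w1, mask1 = sparsify(rng.standard_normal((784, 128)), density=0.1)
    w2, mask2 = sparsify(rng.standard_normal((128, 10)), density=0.1)

    def forward(x):
        h = np.maximum(x @ w1, 0.0)   # ReLU hidden layer over the sparse weights
        return h @ w2

    x = rng.standard_normal((32, 784))          # a batch of 32 dummy inputs
    print("nonzero fraction in w1:", np.count_nonzero(w1) / w1.size)
    print("output shape:", forward(x).shape)    # (32, 10)

Structural plasticity, in this framing, corresponds to updating such a mask during training, dropping some connections and growing others, rather than fixing it once up front.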

Bio: Dr. Constantine Dovrolis has been the Director of the Computation-based Science and Technology Research Center (CaSToRC) at The Cyprus Institute (CyI) since January 1, 2023. He is also a Professor in the School of Computer Science at the Georgia Institute of Technology (Georgia Tech). He is a graduate of the Technical University of Crete (Engr.Dipl. 1995), the University of Rochester (M.S. 1996), and the University of Wisconsin-Madison (Ph.D. 2000). His research is highly inter-disciplinary, combining Network Theory, Data Mining, and Machine Learning. Together with his collaborators and students, he has published in a wide range of scientific disciplines, including climate science, biology, and neuroscience. More recently, his group has been focusing on neuro-inspired architectures for machine learning based on what is currently known about the structure and function of brain networks. According to Google Scholar, his publications have received more than 15,000 citations, with an h-index of 56. His research has been sponsored by US agencies such as NSF, NIH, DOE, and DARPA, and by companies such as Google, Microsoft, and Cisco. He has published in diverse peer-reviewed conferences and journals, including the International Conference on Machine Learning (ICML), the ACM SIGKDD conference, PLOS Computational Biology, Network Neuroscience, Climate Dynamics, the Journal of Computational Social Networks, and others.