Neuroscience Gateway - Software Dissemination, Large Scale Modeling, Data Analysis on Supercomputers

Satellite Workshop - 2023 Society for Neuroscience Annual Meeting

Date and Time: 11/11/2023, 08:30 AM - 12:30 PM

Location: George Washington University, B1220, Science and Engineering Hall, 800 22nd St NW, D.C.

https://www.seas.gwu.edu/science-and-engineering-hall

Please enter the Science and Engineering Hall via the 23rd Street entrance.

 

Registration is free but required due to limited space. Please register at: https://na.eventscloud.com/ereg/newreg.php?eventid=774279&

 

Organizers: Amit Majumdar, Subhashini Sivagnanam, Kenneth Yoshimoto, San Diego Supercomputer Center, University of California San Diego

Ted Carnevale, Neuroscience Department, Yale School of Medicine

Workshop theme: The Neuroscience Gateway (NSG), funded by both the National Institutes of Health (NIH) and the National Science Foundation (NSF), serves the neuroscience community by providing easy, open, and free access to a large number of neuroscience software packages and tools on supercomputing resources, academic cloud computing resources, and associated storage resources located at national academic supercomputer centers in the US. NSG can be used by neuroscientists from any country. NSG eliminates administrative and technical barriers and enables neuroscientists to carry out research that requires large-scale modeling and data processing. Another function of NSG is to disseminate neuroscience modeling and data processing software on computing resources so that the broader neuroscience community can easily access and use that software. This workshop will bring together experts from the broader neuroscience software developer and user community, whether or not they are directly involved with NSG. Researchers and experts from agencies such as the NIH, the neuroscience software developer community, and the neuroscience user community will share their research and experiences within the context of large-scale modeling and data processing in neuroscience.

AGENDA

8:30 AM - 8:45 AM

Welcome

8:45 AM - 9:15 AM

Title: Human Neocortical Neurosolver: A Software Tool for Cell and Circuit Level Interpretation of MEG/EEG signals

Authors: Stephanie Jones, Nicholas Tolley

Affiliation: Dept of Neuroscience, Brown University

Abstract: MEG/EEG signals are correlated with nearly all healthy and pathological brain functions. However, it is still extremely difficult to infer their underlying cellular- and circuit-level origins. This limits the translation of MEG/EEG signals into novel principles of information processing, or into new treatment modalities for pathologies. To address this limitation, we built the Human Neocortical Neurosolver (HNN): an open-source software tool to help researchers and clinicians without formal computational modeling or coding experience interpret the neural origin of their human MEG/EEG data. HNN provides a graphical user interface (GUI) to an anatomically and biophysically detailed model of a neocortical circuit, with layer-specific thalamocortical and cortico-cortical drives. Tutorials are provided to teach users how to begin to study the cell- and circuit-level origin of sensory event-related potentials (ERPs) and low-frequency rhythms. Once users have an understanding of the basic workflows and tutorials in the HNN GUI, those familiar with Python can work in the HNN-core Pythonic interface. We will give a didactic overview of the background and development of HNN and describe current and planned resources to use HNN through the Neuroscience Gateway Portal.
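For attendees curious about the Pythonic workflow mentioned above, the sketch below outlines what a minimal HNN-core simulation roughly looks like: instantiate the detailed neocortical model, attach a layer-specific evoked drive, and simulate the resulting current dipole. This is an illustrative sketch based on the publicly documented hnn-core API, not material from the talk, and the drive parameters are placeholder values.

    # Illustrative sketch (not from the talk): a minimal HNN-core run.
    # Assumes the publicly documented hnn-core Python API; drive parameters are placeholders.
    from hnn_core import jones_2009_model, simulate_dipole

    net = jones_2009_model()  # anatomically and biophysically detailed neocortical column
    net.add_evoked_drive(     # layer-specific proximal drive onto pyramidal cells
        'evprox1', mu=20.0, sigma=3.0, numspikes=1,
        weights_ampa={'L2_pyramidal': 0.01, 'L5_pyramidal': 0.01},
        location='proximal')
    dipoles = simulate_dipole(net, tstop=170.0, n_trials=1)  # simulated dipole, comparable to an MEG/EEG source signal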

9:15 AM - 10:00 AM

Title: NIH BRAIN Initiative (Part I): The convergence of modeling and data science in the NIH BRAIN Initiative

Authors: Grace C.Y. Peng (a), Susan N. Wright (b), Hermon Gebrehiwet (c)

Affiliations: NIH BRAIN Initiative: (a) National Institute of Biomedical Imaging and Bioengineering (NIBIB), (b) National Institute on Drug Abuse (NIDA), (c) National Institute of Neurological Disorders and Stroke (NINDS)

Abstract: Since 2014, the NIH BRAIN Initiative has been promoting cutting-edge technology to accelerate the understanding of brain function as a dynamic, integrative system. These technologies include quantitative and computing technologies that place model-driven experiment design at their core. All investigators performing research to uncover brain circuit mechanisms are required to deliver a predictive model that is quantitative or conceptual. This new community of modelers is realizing the need to utilize rigorous data science methods and tools to link neural processes across multiple scales of neural structure and time. The NIH BRAIN Initiative's flagship brain circuit program is striving to understand behaviors ranging from perception to executive function. Each of these projects incorporates modeling efforts covering scales from cellular/molecular to inter-regional networks, and a range of modeling approaches from biophysical to analytical and numerical, with many programs spanning multiple scales of study. The data science efforts required for these projects also present many challenges and opportunities for harmonizing data management practices.

Title: NIH BRAIN Initiative (Part II): Data science across the NIH BRAIN initiative brain circuits projects

Authors: Manuel Schottdorf (1), Guoqiang Yu (2), Edgar Y. Walker (3)

Affiliations: NIH U19 Data Science Consortium: (1) Princeton University, (2) Virginia Tech, (3) University of Washington

Abstract: The rise of large scientific collaborations in neuroscience requires systematic, scalable, and reliable data management. How this is best done in practice remains an open question. To address this, we conducted a data science survey among currently active U19 grants funded through the NIH BRAIN Initiative. The survey was answered by both data science liaisons and Principal Investigators, speaking for ~500 researchers across 21 nationwide collaborations. We describe the tools, technologies, and methods currently in use and identify several shortcomings of current data science practice. Building on this survey, we develop plans and propose policies to improve data collection, use, publication, and re-use in the neuroscience community.

10:00 AM - 10:30 AM

Title: Modeling the development of the visual system: computational challenges vs high-performance computing

 

Authors: Ruben A Tikidji-Hamburyan and Matthew T. Colonnese

 

Affiliation: Department of Pharmacology and Physiology, The George Washington University, Washington D.C., United States

 

Abstract: During the early period of development, neurons and networks gradually mature toward what will be a fully functional visual system in adulthood. At early ages, neurons are slow, with prolonged dynamics on timescales of up to seconds. Networks are imprecisely connected and need refinement. To refine projections, the thalamus and cortex extract positional information encoded in the correlated activity of retinal ganglion cells over the first two weeks of postnatal development. Modeling this developmental process poses substantial computational challenges. Here, we will discuss the use of high-performance computing, and the Neuroscience Gateway specifically, to meet these challenges, from fitting a single neuron model to computing the information lost under different biological conditions. We will review various techniques for model construction, simulation, and assessment of model quality and biological relevance. We will show how to use principal component analysis of single-neuron parameters to assess fitting quality, distributed computation to infer information timescales, and the biology of the studied networks to distribute load on a parallel HPC cluster or multicore CPU.
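As one concrete illustration of the parameter-analysis step described above, the sketch below applies principal component analysis to a collection of fitted single-neuron parameter vectors to judge how tightly the fits cluster. It is an illustrative sketch on synthetic data using NumPy and scikit-learn, not the speakers' actual pipeline; the array shapes and summaries are assumptions made for the example.

    # Illustrative sketch (not the speakers' code): PCA over fitted single-neuron
    # parameter vectors as a quick check of fitting quality. Data are synthetic.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    params = rng.normal(size=(200, 12))  # hypothetical: 200 fits of a 12-parameter neuron model

    z = StandardScaler().fit_transform(params)  # put heterogeneous parameters on a common scale
    pca = PCA().fit(z)
    scores = pca.transform(z)

    # Tight clustering along the leading components suggests the fits converge to a consistent
    # region of parameter space; wide scatter flags poorly constrained parameters.
    print("explained variance (first 3 PCs):", np.round(pca.explained_variance_ratio_[:3], 3))
    print("spread along PC1/PC2:", np.round(scores[:, :2].std(axis=0), 3))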

 

10:30 AM - 11:00 AM BREAK

 

11:00 AM - 11:30 AM

Title: NIC: An integrated neuroinformatics tool for studying brain interaction dynamics in neurological disorders

 

Authors: Katrina Prantzalos (1), Dipak Upadhaya (1), Shafiabadi N (1), Gurski N (1), Fernandez-Baca Vaca G (1), Yoshimoto K (2), Sivagnanam S (2), Majumdar A (2), Sahoo SS (1)

 

Affiliations: (1) Case Western Reserve University, University Hospitals Cleveland Medical Center; (2) San Diego Supercomputer Center, University of California San Diego

 

Abstract: High-fidelity brain recordings from both scalp and intracranial electrodes are widely used to study brain interaction dynamics, including in neurological disorders such as epilepsy. However, large-scale analysis of these neurophysiological signal data, such as the electroencephalogram (EEG), requires multiple data pre-processing steps and, for subsequent analysis, the use of specialized methods such as graph network analysis or algebraic topology algorithms. In this talk, we describe the Neuro-Integrative Connectivity (NIC) tool, which features integrated support for signal data processing, analysis, and the use of machine learning models to characterize brain interactions in epilepsy. The NIC tool features a modular architecture that supports extensibility, integration of new functionalities such as machine learning interpretability methods, and ease of maintenance. The NIC tool is available through the NSG portal.
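To make the graph-network step concrete, the sketch below builds a simple functional connectivity graph from multichannel EEG by thresholding pairwise channel correlations and computing basic graph summaries. It is an illustrative sketch on synthetic data using NumPy and NetworkX, not the NIC implementation; the channel count, correlation measure, and threshold are assumptions made for the example.

    # Illustrative sketch (not the NIC implementation): a functional connectivity graph
    # from multichannel EEG via thresholded pairwise correlations. Data are synthetic.
    import numpy as np
    import networkx as nx

    rng = np.random.default_rng(1)
    n_channels, n_samples = 16, 2000
    common = rng.normal(size=n_samples)                            # shared source so channels correlate
    eeg = 0.6 * common + rng.normal(size=(n_channels, n_samples))  # hypothetical EEG (channels x samples)

    corr = np.corrcoef(eeg)   # channel-by-channel correlation matrix
    threshold = 0.2           # assumed edge threshold

    graph = nx.Graph()
    graph.add_nodes_from(range(n_channels))
    for i in range(n_channels):
        for j in range(i + 1, n_channels):
            if abs(corr[i, j]) >= threshold:
                graph.add_edge(i, j, weight=abs(corr[i, j]))

    # Basic graph-theoretic summaries of the interaction structure
    print("edges:", graph.number_of_edges())
    print("mean degree:", np.mean([deg for _, deg in graph.degree()]))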

 

11:30 AM - 12:00 PM

Title: Whole brain connectome models mapping brain structure to function to study lifespan aging and neurological diseases

 

Authors: Arpan Banerjee (1), Suman Saha (1), Samuel Berkins (1), Anagh Pathak (1), Neeraj Kumar (1), Priyanka Chakraborty (1), Varun Madan Mohan (1), Anirudh Vattikonda (1), Amit Naskar (1), Dipanjan Roy (1,2)

 

Affiliations:

1. National Brain Research Centre, NH 8 Manesar, Gurugram 122052, India

2. Indian Institute of Technology Jodhpur, NH 62, Jodhpur, India

 

Abstract: A significant development in recent times has been the establishment of whole-brain computational models (WBMs) to understand brain dynamics at multiple spatiotemporal scales. In addition to explaining whole-brain dynamics across scales based on biologically realistic anatomical connectivity, WBMs have been pivotal in providing fundamental insights into structure-function relations in brain mapping, an often-cherished goal of neuroscientific exploration. I will present a pipeline developed by our group that can take individual subject-specific tractography data as input to generate the functional dynamics observed in electroencephalography (EEG)/magnetoencephalography (MEG) and fMRI recordings. Further, I will demonstrate how key factors in human lifespan aging, such as synaptic scaling and conduction speeds, can explain the neurocompensatory mechanisms that preserve functional integration during healthy aging, using a WBM constrained by the subject-specific connectome. Using biophysically realistic extensions that capture multi-scale neurotransmitter-neuroelectric interactions, we demonstrate how GABA/glutamate relationships vary over lifespan aging. Subsequently, we show how such implementations of WBMs can extract conduction speed as the most relevant patient-specific clinical marker of cognitive impairment in multiple sclerosis. Using controlled experimental paradigms such as auditory state rhythms and language discrimination tasks, we show how brain hemispheric lateralization of speech/tonal input processing can also be explained by WBMs. Together, we outline the various ways in which WBMs, a.k.a. virtual brains, sculpted from the idiosyncrasies of subject-specific connectomes, contribute to basic neuroscientific explorations involving resting-state and task-based experimental paradigms and also generate predictive markers for clinical practice.
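For readers unfamiliar with whole-brain models, the sketch below shows the basic idea in toy form: regional phase oscillators coupled through a structural connectivity matrix of the kind derived from tractography, with a global coupling parameter that can in principle be tuned against empirical functional data. It is an illustrative Kuramoto-style sketch in NumPy, not the group's pipeline; the connectome, intrinsic frequencies, and coupling value are all assumptions made for the example.

    # Illustrative sketch (not the speakers' pipeline): a toy whole-brain model in which
    # regional oscillators are coupled through a structural connectivity matrix.
    import numpy as np

    rng = np.random.default_rng(2)
    n_regions = 10
    connectome = rng.random((n_regions, n_regions))  # hypothetical structural connectivity (tractography-like)
    connectome = (connectome + connectome.T) / 2     # symmetric weights
    np.fill_diagonal(connectome, 0.0)

    omega = rng.normal(10.0, 1.0, n_regions) * 2 * np.pi  # intrinsic frequencies (~10 Hz, alpha band)
    coupling = 0.5                                         # global coupling strength (free parameter)
    dt, t_max = 1e-3, 2.0

    theta = rng.uniform(0, 2 * np.pi, n_regions)
    order = []
    for _ in range(int(t_max / dt)):                       # forward-Euler integration
        phase_diff = theta[None, :] - theta[:, None]
        dtheta = omega + coupling * (connectome * np.sin(phase_diff)).sum(axis=1)
        theta = theta + dt * dtheta
        order.append(abs(np.exp(1j * theta).mean()))       # Kuramoto order parameter (network synchrony)

    print("mean synchrony over the last 0.5 s:", np.mean(order[-500:]))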

 

 

12:00 PM - 12:30 PM

Title: The role of computational models in experimental research: predictive vs. explanatory power

 

Authors: Sherif M. Elbasiouny (1,2) and Mohamed H. Mousa (1)

 

Affiliations:

1. Department of Neuroscience, Cell Biology, and Physiology, Boonshoft School of Medicine and College of Science and Mathematics, Wright State University, Dayton, OH, United States

2. Department of Biomedical, Industrial, and Human Factors Engineering, College of Engineering and Computer Science, Wright State University, Dayton, OH, United States

 

Abstract: Computational models provide a means to study biological systems at multiple levels. When combined with experiments, computational models can be used to explain experimental data based on an underlying hypothesis, or the data can be used to generate new testable hypotheses. In this talk, we show examples from our work on the predictive vs. explanatory power of computational models, illustrating the strengths and limitations of each approach and discussing some challenges involved in this simulation/experiment co-development approach.