
    Neuron Modeling and Simulation

    PARTICIPANTS
    James M. Bower, Caltech
    Thomas M. Bartol, Terrence J. Sejnowski, Salk Institute for Biological Studies
    David Beeman, University of Colorado, Boulder
    Joel Stiles, Miriam Salpeter, Cornell University

    The growing interest in neuron modeling parallels the increasing experimental evidence that the nervous system is extremely complex. In fact, modeling is as essential as laboratory experimentation to understanding structure-function relationships in the brain. The models are approaching a complexity matching that of the nervous system itself, and an advanced computational infrastructure is therefore essential for progress in neuroscience. In the NPACI Neuroscience thrust area, both widely used and newly developed neuron modeling systems are being extended and linked to large-scale, high-performance capabilities.

    "Neuron modeling efforts are no longer innocent of physiology, as many naïve models were," said Mark Ellisman of UC San Diego, leader of the NPACI Neuroscience thrust area. "In our neuron modeling project, we are demonstrating that the neuron modeling approaches are flexible, extensible, and scalable to high-performance computing. MCell and GENESIS are two activities that make precisely this point, and we hope to support other modeling systems, like NEURON, that can show our community how larger questions can be approached." In so doing, the hope is that investigators already using such models can take advantage of the national computational infrastructure.


    Figure 1. MCell Neuromuscular Junction
    Simulation of a synapse between a nerve cell (not shown) and a mouse sternomastoid muscle cell. The neurotransmitter acetylcholine diffuses from a synaptic vesicle to activate receptors (cloud of dots) on the muscle membrane. The snapshot, at 300 microseconds, shows peak activation. Visualization from MCell by Tom Bartol and Joel Stiles.

    MCELL

    MCell is a general Monte Carlo simulator of cellular microphysiology developed by Tom Bartol, a postdoctoral researcher in the laboratory of Terrence Sejnowski at the Salk Institute for Biological Studies, and Joel R. Stiles, a senior research associate in the Department of Neurobiology and Behavior at Cornell University. MCell's computational algorithms use Monte Carlo methods developed in collaboration with Miriam Salpeter and Edwin Salpeter at Cornell.

    Bartol and Stiles explained that biological structures like neurons show tremendous complexity and diversity at the subcellular level. For example, a single cubic millimeter of cerebral cortex may contain on the order of 5 billion interdigitated synapses of different shapes and sizes. They noted that subcellular communication is based on a wide variety of chemical signaling pathways. A process like synaptic transmission, for example, encompasses neurotransmitter and neuromodulator molecules, proteins involved with exo- and endocytosis, receptor proteins, transport proteins, and oxidative and hydrolytic enzymes.

    "MCell incorporates high-resolution ultrastructure into models of ligand diffusion and signaling," Bartol said. The first event during an MCell simulation is the release of ligand molecules from a structure such as a synaptic vesicle. This is followed by ligand diffusion within spaces defined by the user (such as pre- or post-synaptic membranes) and reactions between ligands and effectors (neurotransmitters or receptors and enzymes). The ligands and effectors, reaction mechanisms, and surfaces on which reactions take place are all specified by the modeler using the MCell Model Description Language, used to build the simulation objects. MCell then carries out the simulation, using Monte Carlo (random-number based) algorithms, for a specified number of iterations. Different numerical and imaging results can be produced.

    "Our code has been extensively optimized," Bartol said, "so the simulation time required does not depend on the complexity of the surfaces on which the reactions take place." MCell handles a range of physical and temporal scales, permitting the simulation of the chemistry and, soon, the electrochemistry of cells.

    NPACI support is enabling the code to be ported to the Cray T3E and, shortly, to the IBM teraflops system. "We will have plenty of use for the high-end machinery," Stiles said. "For example, we are working with Miriam Salpeter at Cornell and Ellisman and Maryann Martone at UC San Diego on a 3-D reconstruction of a mouse neuromuscular junction, a complex structure illustrating a range of neurological processes. In simulating a structure of this complexity, we have learned that the complexity itself produces high variability in the post-synaptic response" (Figure 1).

    Also central to MCell is a collaboration with Jack Dongarra, of the University of Tennessee and Oak Ridge National Laboratory, and Henri Casanova, now at UC San Diego. They have developed NetSolve, an alternative to Dongarra's Parallel Virtual Machine that turns a loosely associated collection of machines into a fault-tolerant, client-server compute cluster. NetSolve is being integrated with the AppLeS scheduler developed by Fran Berman at UC San Diego and connected with the Network Weather Service software developed by Rich Wolski of the University of Tennessee. An initial test of MCell on a 40-machine NetSolve cluster demonstrated the need for a distributed file-caching mechanism, which will allow NetSolve to support larger MCell runs.
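
    What makes MCell a good fit for such a cluster is that independent Monte Carlo trials can be farmed out with no communication between them. A minimal sketch of that master-worker pattern, with Python's multiprocessing standing in for NetSolve (whose actual API is not shown here) and run_trial as a hypothetical stub for a full MCell run:

        import random
        from multiprocessing import Pool

        # Farming out independent Monte Carlo trials, master-worker style.
        # Plain multiprocessing stands in for NetSolve's client-server
        # machinery; run_trial is a hypothetical stub for a full MCell run.
        def run_trial(seed):
            """One independent trial; returns, say, the number of ligands bound."""
            rng = random.Random(seed)
            return sum(rng.random() < 0.3 for _ in range(1000))

        if __name__ == "__main__":
            with Pool(processes=8) as pool:                # 8 local workers
                results = pool.map(run_trial, range(100))  # 100 seeded trials
            print(f"mean bound: {sum(results) / len(results):.1f} of 1000")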

    A daylong workshop on MCell held at SDSC in June 1999 drew 12 participants from labs around the world. "Now that the core of MCell is hardened code, we are seeking funding to further develop its capabilities and features and to add a graphical user interface, perhaps coupled to IBM's Data Explorer software," Bartol said. "That will make MCell even more powerful and easier to use."


    REFERENCE

    J. Bower and D. Beeman, The Book of GENESIS, 2nd ed. (Springer-Verlag, 1998).


    Figure 2. The Book of GENESIS
    The book contains tutorials on neuron modeling as well as an introduction to using GENESIS for research. Cover art by Erika Oller.

    GENESIS

    While MCell is a relatively new development, GENESIS (short for GEneral NEural SImulation System) was first released in 1990 and is in wide use today. Developed in the lab of James M. Bower at Caltech, GENESIS is open-source software containing contributions from groups and individuals around the world. David Beeman of the University of Colorado, Boulder, and Michael Hucka of Caltech coordinate the ongoing development of the system. GENESIS models realistic systems ranging from subcellular biophysical mechanisms to single neurons to large-scale networks and whole neural systems. The GENESIS User Group numbers about 600 members at 284 sites in 26 countries. "GENESIS supplies a community-wide quantitative basis for describing, comparing, and estimating the significance of the vast array of laboratory experiments focused on nervous systems," Bower said.

    GENESIS is written in C and uses a "building block" approach. Simulations are constructed of modules and objects that receive inputs, perform calculations, and generate outputs. For example, single-neuron models are constructed of small compartments that are linked to objects representing variable conductance ion channels. The compartments can be linked to form multi-compartment neurons at any level of morphological complexity. The neurons themselves can be linked in circuits for network simulations. The object-oriented features allow exchange or reuse of whole models or individual components.
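
    A toy version of the building-block idea makes the structure concrete. In the sketch below (illustrative Python; GENESIS itself is written in C, is driven by its own script language, and offers sturdier integrators than the forward Euler used here), each passive compartment is an RC circuit, and compartments couple through an axial resistance:

        # Two passive compartments, each an RC circuit, coupled through an
        # axial resistance. A toy in Python, not GENESIS code; all parameter
        # values are assumptions chosen for plausibility.
        CM = 1e-10    # membrane capacitance per compartment, F
        RM = 1e8      # membrane (leak) resistance, ohm
        RA = 1e6      # axial resistance between compartments, ohm
        EM = -0.065   # leak reversal potential, V
        DT = 1e-5     # time step, s

        class Compartment:
            def __init__(self):
                self.v = EM           # membrane potential, V
                self.inject = 0.0     # injected current, A
                self.neighbors = []   # axially coupled compartments

            def dvdt(self):
                i_leak = (EM - self.v) / RM
                i_axial = sum((n.v - self.v) / RA for n in self.neighbors)
                return (i_leak + i_axial + self.inject) / CM

        soma, dend = Compartment(), Compartment()
        soma.neighbors.append(dend)
        dend.neighbors.append(soma)
        soma.inject = 1e-10           # 0.1-nA current step into the soma

        for _ in range(int(0.05 / DT)):              # simulate 50 ms
            dv = [c.dvdt() for c in (soma, dend)]    # evaluate, then update
            for c, d in zip((soma, dend), dv):
                c.v += d * DT                        # forward Euler step
        print(f"soma: {soma.v * 1000:.1f} mV, dendrite: {dend.v * 1000:.1f} mV")

    Adding an object representing a voltage-gated channel to a compartment's current sum, and wiring compartments into circuits, follows the same pattern.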

    Bower, Beeman, and contributors have built and nurtured GENESIS, in the process bringing out two editions of The Book of GENESIS (Springer-Verlag, 1994, 1998; Figure 2). The book is both a manual and a report of major modeling work done with the system. "Because so many labs are using GENESIS, what we have built is not only a modeling platform but also a community of investigators able to make their ideas about neural function intercomparable and quantifiable," Bower said. "The system is neutral with respect to any particular theory of brain function. Investigators begin with as detailed a description as possible of the relevant neuroanatomy. This biological realism allows known data to be used as constraints for model parameters."

    More than 45 educational institutions have used GENESIS and, to date, more than 90 articles reporting work with it have appeared, exclusive of studies published by the Caltech group. A valuable GENESIS asset is a collection of simulation object libraries containing building blocks already constructed by numerous investigators, such as voltage- and synaptically activated channels, individual gates, and synapses of several types. "Device objects" such as pulse and spike generators, voltage-clamp circuitry, and spike-frequency measurements are also available.
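
    In the same toy style as the sketch above, a "device object" is simply an object whose output can be wired to a model input. The following imitates a pulse generator (a sketch in the spirit of GENESIS's pulse-generator device, not its actual implementation):

        class PulseGen:
            """A square current pulse whose output can be wired to any model
            input, in the spirit of GENESIS's pulse-generator device object
            (a toy imitation, not the actual C implementation)."""

            def __init__(self, level, delay, width):
                self.level = level   # pulse amplitude, A
                self.delay = delay   # pulse onset, s
                self.width = width   # pulse duration, s

            def output(self, t):
                on = self.delay <= t < self.delay + self.width
                return self.level if on else 0.0

        pulse = PulseGen(level=1e-10, delay=0.010, width=0.020)  # 0.1 nA, 10-30 ms
        for t_ms in (5, 15, 35):
            print(f"t = {t_ms} ms: I = {pulse.output(t_ms / 1000):.1e} A")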

    Another component fosters model sharing among neuroscientists at different sites. It allows researchers to access databases of existing neural models from the Internet and, using a Java-based browser tool, to retrieve new modeling components or elements from remote databases. "This tool encourages a collaborative approach to studying the nervous system," Bower said. "It makes information already accumulated within our system, and by neurobiologists around the world, more available and accessible.

    "Because the neurons are made up of more abstract neural components, they can be altered to represent the physiology of actual neurons anywhere in the brain or nervous system," Bower added. The parallel version, PGENESIS, is now being ported to the SUN HPC 10000 system under an NPACI Strategic Applications Collaboration (see p. 18). "This is important because it will make it possible to study much larger neural systems or to run more extensive parameter searches," he said.


    NEURON TUTORIALS

    Another widely used modeling system, called NEURON, was developed at Duke University by Michael Hines, now at Yale. Recently, with NSF funding, a five-day workshop on NEURON was given by Hines, Theodore Carnevale of Yale University, Bill Lytton of the University of Wisconsin, and Sejnowski. Ultimately, Ellisman hopes that NEURON can be made broadly available on NPACI resources. "The availability of the main modeling systems will advance the process of building a computational infrastructure for neuroscience," he said. --MM
