Computational Science & Engineering Research Institute


Solving complex science and engineering problems
using high performance computers
In a fluidized, dilute granular flow, grains behave like gas molecules.
Applying the Boltzmann kinetic-collisional model to a granular flow.
S. Dartevelle, Ph.D. student, Geological & Mining Engineering & Sciences.
NASA Goddard Space Flight Center Beowulf Cluster.
64 2-processor PC's connected by a 100Mb/sec switch.
P. Merkey, Research Ass't Prof., Computer Science Dept.
Evolution of a non-evaporating fuel spray.
Colors represent droplet radii in microns.
F. Tanner, Assoc. Prof., Mathematical Sciences Dept.
NSF Major Research Instrumentation Sun Enterprise.
12 Sun UltraSPARC processors and a small disk farm.
S. Seidel, Assoc. Prof., Computer Science Dept.

The CS&E Research Institute was created in the fall of 2002.

The discipline of Computational Science & Engineering is the application of computational technologies (computer hardware, software, networks, etc.) to current problems in science and engineering. Typically, these problems are in areas of national interest, including national economic and security interests, as well as areas of state-wide economic interest. Examples of specific problems being studied by the CS&E faculty at MTU are described below.

The CS&E Research Institute was created to:

The creation of the CS&ERI was approved by:

A complete description of the CS&E Research Institute can be found in the CS&ERI proposal.

Phillip Merkey is the director of the CS&E Research Institute.


What Is CS&ERI Research All About?

How is CS&E research different from traditional science and engineering research?


What Makes CS&E Computers Different From The One On Your Desktop?

In most cases, computers used in CS&E research are "parallel" computers made up of tens or hundreds of ordinary computers, all connected by a network that allows them to "talk" to each other while they work together on solving a single problem.
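
To make this concrete, here is a minimal sketch of the kind of program that runs on such a machine. It uses MPI (the Message Passing Interface), a standard library for this style of computing; the page does not say which tools MTU's clusters use, so treat this as an illustration rather than a description of any particular project. Every computer runs its own copy of the same program, computes a piece of the answer, and then all the copies "talk" over the network to combine their results.

    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv)
    {
        int rank, size;
        double local, total;

        MPI_Init(&argc, &argv);                    /* join the parallel computation */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);      /* which computer am I?          */
        MPI_Comm_size(MPI_COMM_WORLD, &size);      /* how many computers are there? */

        /* Each PC computes its own piece of the answer (a stand-in
           for real work, such as simulating part of a physical domain). */
        local = (double)(rank + 1);

        /* The PCs "talk" over the network to combine their pieces. */
        MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);

        if (rank == 0)
            printf("combined result from %d computers: %f\n", size, total);

        MPI_Finalize();
        return 0;
    }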


I'm still confused.

Don't most researchers at MTU use computers? How is CS&E research different from what these researchers do?

We all use computers for word processing, web browsing, and email. None of that begins to tax our computers. When was the last time that your own computer spent more than 20 seconds actually computing something? The only time most of us have to wait for our computers is when large amounts of data (e.g., pictures, movies, web pages) are being moved from one place to another or while a large software package is being "loaded". Desktop computers today are so powerful that for most day-to-day tasks (except 3-D game playing) they spend most of their time doing nothing at all. Most of the computers on our desks spend most of their time waiting for us to do something!

In the laboratory, a researcher might use a computer to control the inputs to an experimental device or to automatically collect data during an experiment. Afterwards, a computer is used to analyze the data that have been collected and to display the results using commercially available software. Sometimes the researcher writes a program to analyze the data in a way particularly suited to the science or engineering problem at hand. From the computer's point of view, none of the work it is required to do is at all taxing. The computer typically spends very little time doing this work; most of its time is spent waiting for the laboratory devices or for the researcher. Even though a computer and its software are important laboratory tools, the particular computer is not critically important to the work being done. Any reasonably modern PC can usually do these jobs without breaking a sweat.

On the other hand, many CS&E researchers do not work in a traditional laboratory. They do not do experiments or make field observations. All of their research is done inside the computer. Their work is based on mathematical models of physical phenomena, and much of their research effort consists of turning those mathematical models into computer programs. Their computer programs simulate cloud formation, the flow of pollutants through the soil, the pressures on an aircraft wing, or the evolution of black holes. All of these "experiments" are artificial but, if carefully designed models are used, they mimic the real world so accurately and in so much detail that the results reveal what would actually happen in the real world. It is computational scientists who tell us that global warming is (or is not) happening. It is computational scientists who design new drugs without ever using a test tube. Many important problems today cannot be solved by traditional laboratory studies. The largest and most powerful computers are the best, and often the only, tools scientists and engineers have for solving these problems.
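
As a toy illustration of how a mathematical model becomes a computer program, the sketch below simulates heat spreading along a thin rod using the one-dimensional heat equation. It is vastly simpler than any of the models mentioned above, and the grid size and parameters are invented for illustration, but the pattern is the same: discretize a physical law, then step it forward in time.

    #include <stdio.h>

    #define N     64      /* grid points along the rod */
    #define STEPS 1000    /* time steps to simulate    */

    int main(void)
    {
        double u[N] = {0.0}, unew[N];
        double r = 0.25;  /* dt * alpha / dx^2; kept below 0.5 for stability */
        int i, t;

        u[N / 2] = 1.0;   /* initial condition: a hot spot in the middle */

        for (t = 0; t < STEPS; t++) {
            for (i = 1; i < N - 1; i++)   /* the discretized heat equation */
                unew[i] = u[i] + r * (u[i - 1] - 2.0 * u[i] + u[i + 1]);
            unew[0] = unew[N - 1] = 0.0;  /* ends of the rod held cold */
            for (i = 0; i < N; i++)
                u[i] = unew[i];
        }

        printf("temperature at the center after %d steps: %f\n", STEPS, u[N / 2]);
        return 0;
    }

Real CS&E codes apply the same idea to three-dimensional domains with millions of grid points, which is why they need parallel computers.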

OK, now I understand that CS&E research is focused on science and engineering problems that can only be solved using computers ...

... but there are so many different units at MTU with "computer" in their names. What are the differences between ... ?


The Computational Science & Engineering Ph.D. program

One of the primary goals of the CS&E Research Institute is to foster graduate research in computational science and engineering. This is being accomplished through the CS&E Ph.D. program.


Current CS&E Projects

All of the computer equipment shown here is housed and maintained in the Center for Experimental Computation on the 2nd floor of Fisher Hall.

NASA: Earth and Space Sciences Support for NASA High Performance Computing and Communication Program,
P. Merkey (PI), S. Seidel (Co-PI), $758,000.
Technical description:

Popular description:

NASA cluster: A cluster built from 64 ordinary PC's. Cluster interconnection network: The 100Mb/sec Ethernet switch that connects the PC's in this cluster.
The computer: You are looking at the "business ends" of 64 dual-processor PC's. Note that there is only one keyboard and monitor. All of these PC's are connected to form a single, parallel computer. They are connected by an ordinary Ethernet switch (right) that is just like the one that connects your desktop PC to all other PC's in your local network.

This "Beowulf" cluster was provided to MTU by NASA Goddard Space Flight Center to support parallel computing projects in the CS&ERI.


Hewlett-Packard: UPC Technology Development Project,
S. Seidel (PI) and Co-PIs P. Merkey and C. Wallace, $179,000.
Technical description: Over the past two years, Drs. Steven Seidel, Phillip Merkey, and Charles Wallace have received funding from Hewlett-Packard to pursue four projects centered on the development of the Unified Parallel C programming language. UPC is a new language for high performance computing on distributed shared memory architectures. One of this year's projects continues the work begun last year on the MuPC run time system for UPC. New this year are projects to formally specify the UPC memory consistency model, to examine programmability and usability aspects of UPC, and to produce a specification for UPC collective communication operations.

Popular description: Programming parallel computers is a more difficult task than programming ordinary computers because a parallel program has to control many computers at once. Many familiar programming languages used for scientific computing, such as C and Fortran, have been modified so that they can be used to program parallel computers. One of the newest parallel programming languages is UPC, which stands for Unified Parallel C. This language is currently being studied by research groups at UC Berkeley, George Washington University, and MTU. There is current interest in UPC because it appears to have a well-designed collection of commands that allows programmers to more easily write parallel programs to solve complex problems.
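
To give a flavor of what UPC code looks like, here is a minimal sketch (the array name and sizes are invented for illustration). A single shared array is spread across all of the machines, and the upc_forall loop tells each machine to work on the elements stored in its own memory; the language, rather than the programmer, handles the communication.

    #include <upc.h>
    #include <stdio.h>

    #define PER_THREAD 100

    /* One logical array, physically spread across all of the machines
       ("threads" in UPC terminology). */
    shared double a[PER_THREAD * THREADS];

    int main(void)
    {
        int i;

        /* Each thread initializes only the elements that live in its
           own memory; the affinity expression &a[i] arranges this. */
        upc_forall(i = 0; i < PER_THREAD * THREADS; i++; &a[i])
            a[i] = 2.0 * i;

        upc_barrier;   /* wait until every thread has finished its share */

        if (MYTHREAD == 0)   /* thread 0 can read any element, wherever it lives */
            printf("a[5] = %f\n", a[5]);

        return 0;
    }

Compare this with the MPI sketch earlier on this page: in UPC there are no explicit messages, which is one reason the language is considered easier to program.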

One of the current CS&ERI projects is to develop a portable version of the UPC programming language that can run on any Linux-based computer system. This version is called MuPC (roughly standing for "Michigan Tech UPC"). One of the reasons for developing MuPC is to help "spread the word" about UPC, so MuPC is freely available to the CS&E research community in the U.S.

The Cray T3E: These three cabinets contain 60 processors and lots of wiring to connect them all. Inside: This is the inside of one of the cabinets. All of those cables are used to interconnect the processors and to keep them in step with one another.
The computer: In association with this work, the CS&ERI has received a Cray T3E supercomputer. This machine originally cost about $1.5M. It is a few years old now, but current models of this line of supercomputers remain among the fastest machines in the world.

This system was provided to MTU to support UPC-related research. The CS&ERI has made this system available to researchers at many U.S. institutions, including the UPC research groups at Berkeley and at George Washington University.


NSF: Sandu's CAREER and ITR funding:
Sandu, et al., $450,000.
Technical description:

Popular description:

The newest CS&E cluster: 20 dual-processor 2GHz PC's. Two interconnection networks: The processors are connected with standard Ethernet (top) and by a very fast Myrinet fiber network (bottom).
The computer: This computer is a brand new "Beowulf" cluster. It is similar to the cluster used for the NASA project (two frames above), but it is built using the latest technology. Each PC is about 1.5" thick and houses two 2-gigahertz processors. There are 16 of these PC's (in groups of 8) above and below the two switches that connect them. These switches are shown on the right. The top switch is a standard Ethernet switch like the one used for the NASA cluster. The bottom switch uses very fast fiber-optic technology (that's why the wires are thinner) to connect the computers.

This system was acquired with the NSF Major Research Instrumentation grant described below.


NSF: Computational Facilities for MTU's CS&E Program
S. Seidel (PI) and Co-PIs C. Friedrich, J. Jaszczak, A. Mayer, $260,000.
Technical description:
This is an equipment acquisition grant to support research and training in computationally intensive projects in several disciplines. Participating researchers come from the departments of physics, mechanical engineering-engineering mechanics, metallurgy and materials engineering, geological engineering and sciences, and computer science; the list of participating departments has since grown to include mathematical sciences. The computational system acquired in this project is a primary tool used in their work. Other research projects that use this facility are in the areas of computational fluid dynamics, materials science, groundwater flow, and interprocessor communication. This computational facility is being used to expand the scale of problems that these projects address.

Popular description:
Over the past 15 years, computing at MTU has evolved from a centrally located, centrally administered "computer center" to a completely distributed, self-administered web of desktop PC's. This trend put computers on our desks, but in doing so it divided the available computational power into very small pieces. As explained above, these PC's are more than adequate for day-to-day tasks, but certain researchers require much greater computational power. For many years, medium- and large-scale computers were not available at MTU because all computing had been "distributed" to the desktop. This NSF grant was one of the first to reverse this trend and make larger computers available to researchers at MTU again.

This multi-year grant was obtained several years ago. The first piece of equipment acquired was the Sun Enterprise 4500, described below. This computer is freely available to scientists and engineers for computational work that is too large for their desktop machines. During the past several years, faculty and students from the College of Engineering and the College of Sciences and Arts have used this computer for a wide range of work, including:

  • The study of volcanic processes
    Student: Sebastien Dartevelle, Ph.D. student, Geological and Mining Engineering and Sciences
    Advisor: Bill Rose
  • The study of volcanic processes
    Student: Song Guo, CS&E Ph.D. student, Geological and Mining Engineering and Sciences
    Advisors: Gregg Bluth and Bill Rose
  • Groundwater remediation studies
    Student: Mark Erickson, M.S. student, Geological and Mining Engineering and Sciences
    Advisor: Alex Mayer
  • Project title needed
    Student: Hanyi Li, CS&E Ph.D. student, Geological and Mining Engineering and Sciences
    Advisor: Judith Budd
  • Project title needed
    Student: Krista Stalsberg-Zarling, CS&E Ph.D. student, Mathematical Sciences
    Advisor: Franz Tanner
  • Discrete atomic modeling
    Investigator: Jong Lee, Materials Science and Engineering
  • Other regular users of cse0.cse.mtu.edu?
    Student:
    Advisor:

Sun Enterprise 4500: 12 server-class Sun processors.
The computer: This is a 12-processor Sun Enterprise 4500. Unlike the two "Beowulf" clusters above, all connections between processors are contained within the cabinet.
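
Because the processors in this machine share one memory, it can be programmed differently from the clusters above: threads simply read and write the same arrays instead of sending messages. The sketch below uses POSIX threads to illustrate the idea; this is an assumed programming model chosen for illustration, not a description of how researchers actually program this machine.

    #include <pthread.h>
    #include <stdio.h>

    #define NTHREADS 12          /* one thread per processor */
    #define N        1200

    double data[N];              /* one array visible to all 12 processors */
    double partial[NTHREADS];

    void *sum_part(void *arg)
    {
        int id = *(int *)arg;
        int lo = id * (N / NTHREADS), hi = lo + N / NTHREADS;
        double s = 0.0;
        for (int i = lo; i < hi; i++)
            s += data[i];
        partial[id] = s;         /* no messages: just write to shared memory */
        return NULL;
    }

    int main(void)
    {
        pthread_t threads[NTHREADS];
        int ids[NTHREADS];
        double total = 0.0;

        for (int i = 0; i < N; i++)
            data[i] = 1.0;

        for (int i = 0; i < NTHREADS; i++) {
            ids[i] = i;
            pthread_create(&threads[i], NULL, sum_part, &ids[i]);
        }
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(threads[i], NULL);

        for (int i = 0; i < NTHREADS; i++)
            total += partial[i];
        printf("total = %f\n", total);
        return 0;
    }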

This system was acquired with an NSF Major Research Instrumentation grant to support CS&E research at MTU.


Tanner

Collaborator and Sponsor: Helsinki University of Technology (HUT), Finland
Principal Investigators: Franz X. Tanner (MTU) and Martti Larmi (HUT)
Ph.D. Student: Krista Stalsberg-Zarling

Abstract: The simulation of spray combustion requires the modeling of turbulent, reacting multiphase flows, where the gas phase is described with the time-dependent, compressible, Reynolds/Favre-averaged conservation equations for mass, species, momentum and energy, together with the equations for the turbulence model. The liquid phase is governed by a stochastic formulation of the discrete droplet model, where the liquid and gas phases are treated separately. The liquid-gas interaction, as well as the species creation and depletion due to chemical reactions, are coupled with appropriate source terms in the gas conservation equations. One of the main problems in the modeling and simulation of two-phase flows, using the discrete droplet model approach, is the intrinsic dependence of the liquid-gas coupling on the spatial resolution of the gas phase equations. The objective of this research project is the development of a general method which reduces or eliminates such discretization dependencies.
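
For readers who want one concrete equation: the Favre-averaged continuity equation for the gas phase, with a source term coupling it to the spray, typically takes the form below. This is a representative textbook form, written in standard notation, and not necessarily the exact formulation used in this project.

    \frac{\partial \bar{\rho}}{\partial t}
      + \frac{\partial}{\partial x_j}\!\left( \bar{\rho}\,\tilde{u}_j \right)
      = \dot{\rho}^{\,s}

Here \bar{\rho} is the mean gas density, \tilde{u}_j the Favre (density-weighted) mean velocity, and \dot{\rho}^{\,s} the rate at which the spray adds mass to the gas; the momentum, energy, and species equations carry analogous source terms.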


Michalek


Mayer


Gregg/Rose/Guo

This figure shows a numerically simulated volcanic plume/cloud 115 minutes after eruption. The simulation was done using ATHAM (Active Tracer High Resolution Atmospheric Model).


Gockenbach?


Jaszczak?


Hansmann?


Perger?


Others?


Current CS&E Ph.D. Students

Students in the CS&E Ph.D. program are members of their "home" department. Their advisor, their funding, their office, etc. all come from their home department. Yet these students are all in the same degree program and must meet the same set of CS&E degree requirements. This results in graduates who understand their own science or engineering research area and who know how to apply the most advanced computational techniques to solve problems in that area. No other Ph.D. program at MTU is designed to meet these goals and no other Ph.D. program at MTU permits this level of interdisciplinary flexibility.


Current CS&E Faculty and Students


Links to projects that receive support from the CS&E Research Institute


Other CS&ERI Activities

During the past year several visiting CS&E researchers have given talks at MTU:


What's On the Horizon for the CS&E Research Institute?