Introduction

A Beowulf cluster is a private network of computers (usually Alpha or Intel boxes) running a stripped-down version of Linux. Acting together, with a parallel programming API like MPI or PVM, the cluster can function like a single massively parallel computer. The strength of a Beowulf cluster is its cost-effectiveness: the use of commodity off-the-shelf (COTS) hardware and software greatly reduces the price compared to a specialized and proprietary machine such as a Cray T3E. The weakness is the relatively high latency and low bandwidth of communication between computing nodes. If your parallel application is more compute-intensive than communication-intensive, you may get more bang for your buck with a Beowulf cluster.

Our Beowulf cluster at MTU is called Ecgtheow (Beowulf's father in the epic). It was originally built by the Beowulf group within USRA/CESDIS to support the HPCC/ESS project at the NASA Goddard Space Flight Center (GSFC). When Dr. Merkey joined the CS department in the spring of 2000, he brought Ecgtheow with him. It will be used to support research, development, and testing of Beowulf system software being developed all over the world, and to support education and research within the Computational Science and Engineering (CS&E) program here at Michigan Tech.

A picture of Ecgtheow and some of its specifications are available.

The most popular programming model for Beowulf is the message passing model. In this model, each node in the cluster runs standard serial code (C or Fortran), usually called a process. These processes then coordinate (synchronize or exchange data) through a message passing library. The most popular message passing libraries are MPI (the Message Passing Interface) and PVM (Parallel Virtual Machine), both of which are supported on almost all currently available parallel platforms. In addition, Beowulf clusters support a wide range of experimental programming models: lightweight message passing systems, coarse-grained multithreaded models, and distributed shared memory models. These are of academic interest, but those interested in application development are encouraged to look at MPI, sketched below.
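To give a flavor of the model, here is a minimal sketch of an MPI program in C. The rank-gathering pattern and the compile/run commands named in the comments are illustrative, not specific to Ecgtheow's software; details vary by MPI installation. Every node runs the same executable as a process; each nonzero-rank process sends an integer to process 0, which receives and prints it.

    /* Minimal message passing sketch with MPI: every process
     * learns its rank, nonzero ranks send their rank to process 0,
     * and process 0 prints what it receives. Typically compiled
     * with mpicc and launched with mpirun -np 4 ./a.out (command
     * names vary by installation). */
    #include <stdio.h>
    #include <mpi.h>

    int main(int argc, char *argv[])
    {
        int rank, size;

        MPI_Init(&argc, &argv);                 /* start the MPI runtime */
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's id */
        MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total process count */

        if (rank == 0) {
            int i, value;
            MPI_Status status;
            printf("process 0 of %d waiting for messages\n", size);
            for (i = 1; i < size; i++) {
                /* receive one integer from any sender, tag 0 */
                MPI_Recv(&value, 1, MPI_INT, MPI_ANY_SOURCE, 0,
                         MPI_COMM_WORLD, &status);
                printf("received %d from process %d\n",
                       value, status.MPI_SOURCE);
            }
        } else {
            /* send my rank to process 0, tag 0 */
            MPI_Send(&rank, 1, MPI_INT, 0, 0, MPI_COMM_WORLD);
        }

        MPI_Finalize();                         /* shut down cleanly */
        return 0;
    }

The point to notice is that there is no shared memory: the only way data moves between processes is through explicit send and receive calls, which is exactly the communication whose latency and bandwidth limit a Beowulf cluster.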

If you would like more information on the availability of Ecgtheow, or on the applicability of parallel programming to your particular computational requirements, please send mail to Phil Merkey.