
HiPERiSM - High Performance Algorism Consulting

Course HC6: MPI Training for Parallel Computing


Prerequisites:

This course is intended for programmers familiar with serial or vector programming in either Fortran or C. No prior experience with distributed memory parallel (DMP) programming paradigms is assumed. Participants who have serial or vector code they wish to port to DMP systems are encouraged (but not required) to bring it for discussion.

Objectives:

This training course introduces all the important MPI-1 constructs and teaches a methodology for writing parallel code with MPI. The course includes examples in both Fortran and C, and detailed case studies demonstrate how MPI is applied in production models. On completion of the course, participants will be able to write MPI code (or libraries) that is independent of the underlying architecture.
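
The flavor of the material in Chapters 2 and 3 (environment and point-to-point communication) is suggested by the minimal C sketch below. It is not taken from the course workbook: the compiler wrapper name (mpicc), the launch command, and the message contents are illustrative assumptions only.

/* Minimal MPI-1 sketch: environment calls plus one blocking
 * point-to-point exchange.  Compile with an MPI wrapper such as
 * "mpicc" and run with, e.g., "mpirun -np 2 ./a.out" (names assumed). */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char *argv[])
{
    int rank, size, token;
    MPI_Status status;

    MPI_Init(&argc, &argv);                 /* enter the MPI environment  */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);   /* this process's rank        */
    MPI_Comm_size(MPI_COMM_WORLD, &size);   /* total number of processes  */

    if (size >= 2) {
        if (rank == 0) {
            token = 42;
            /* blocking send to rank 1, message tag 0 */
            MPI_Send(&token, 1, MPI_INT, 1, 0, MPI_COMM_WORLD);
        } else if (rank == 1) {
            /* matching blocking receive from rank 0 */
            MPI_Recv(&token, 1, MPI_INT, 0, 0, MPI_COMM_WORLD, &status);
            printf("rank 1 received %d from rank 0\n", token);
        }
    }

    MPI_Finalize();                         /* leave the MPI environment  */
    return 0;
}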

Duration:

4 days organized as follows:

Day  Period  Chapter  Topic
1    AM      1        Introduction
             2        Environment
             3        Point-to-point communication
1    PM               Examples
2    AM      4        Collective communication; Examples
2    PM      5        Data types
             8        Case studies I: dot product and matrix-vector product
3    AM      6        Communicators and groups; Examples
3    PM      7        Topology; Examples
4    AM      9        Case studies II: the Stommel Ocean Model
4    PM      10       Case studies III: the Princeton Ocean Model

Format: 

The course is delivered in a workbook format intended for use in one of three ways:

  1. Classroom presentation,
  2. Self-paced study,
  3. As a reference.

For options (1) and (2) the course workbook is accompanied by a syllabus.

The workbook includes all source code in both Fortran and C, along with the sample input, output, and makefiles needed to compile and execute every program discussed in the text.

Review of Sections:

This MPI training workbook (in development) is arranged into ten chapters, a bibliography, and an appendix, as follows.

  1. Introduction.
  2. Environment.
  3. Point-to-point communication.
  4. Collective communication.
  5. Data types.
  6. Communicators and groups.
  7. Topology.
  8. Case Studies I: dot and matrix-vector product (see the sketch after this list).
  9. Case Studies II: The Stommel Ocean Model.
  10. Case Studies III: The Princeton Ocean Model.
  11. Bibliography.
  12. Appendix A: MPI procedure cross-reference.
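
As a taste of Case Studies I, the C sketch below computes a parallel dot product with a collective reduction. It is not the workbook's code: the block decomposition, vector length N, and the choice of unit vectors are illustrative assumptions.

/* Sketch of a parallel dot product in the spirit of Case Studies I:
 * each process sums its own block, then MPI_Allreduce combines the
 * partial sums on every process. */
#include <stdio.h>
#include <mpi.h>

#define N 1000000               /* global vector length (assumed)       */

int main(int argc, char *argv[])
{
    int rank, size, i, nlocal;
    double local = 0.0, global = 0.0;

    MPI_Init(&argc, &argv);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    nlocal = N / size;          /* assume N divides evenly, for brevity */

    /* partial sum over this process's block of x and y;
     * here x(i) = y(i) = 1, so the exact answer is N */
    for (i = 0; i < nlocal; i++)
        local += 1.0 * 1.0;

    /* combine the partial sums across all processes */
    MPI_Allreduce(&local, &global, 1, MPI_DOUBLE, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        printf("dot product = %f (expected %d)\n", global, N);

    MPI_Finalize();
    return 0;
}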


HiPERiSM Consulting, LLC, (919) 484-9803 (Voice)

(919) 806-2813 (Facsimile)