Abstract
Parallel performance analysis tools must be tested to confirm that they perform their task correctly, which comprises at least three aspects. First, it must be ensured that the tools neither alter the semantics nor distort the run-time behavior of the application under investigation. Next, it must be verified that the tools collect the correct performance data as required by their specification. Finally, it must be checked that the tools perform their intended tasks and detect relevant performance problems. Focusing on this last (correctness) aspect, testing can be done using synthetic test functions with controllable performance properties, possibly complemented by real-world applications with known performance behavior. A systematic test suite can be built from synthetic test functions and other components, possibly with the help of tools that assist the user in assembling the pieces into executable test programs. Clearly, such a test suite would be highly useful to builders of performance analysis tools, yet, surprisingly, no systematic effort to provide one has been undertaken so far. In this paper we describe the APART Test Suite (ATS) for checking the correctness (in the above sense) of parallel performance analysis tools. In particular, we describe a collection of synthetic test functions that allows one to easily construct both simple and more complex test programs with desired performance properties. We briefly report on experience with MPI and OpenMP performance tools applied to the test cases generated by ATS.
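To make the idea of "synthetic test functions with controllable performance properties" concrete, the following C/MPI sketch illustrates the kind of pattern such a suite might contain. It is not code from ATS; the `late_sender` function and its delay parameter are hypothetical, showing a classic late-sender inefficiency whose severity is tunable so that a performance tool can be checked against a known, adjustable amount of waiting time.

```c
/*
 * Illustrative sketch only -- not taken from the ATS distribution.
 * A "late sender" pattern: rank 0 delays before sending, so the
 * matching MPI_Recv on rank 1 waits for roughly delay_sec seconds.
 * Requires at least two MPI processes.
 */
#include <mpi.h>
#include <unistd.h>

void late_sender(double delay_sec, MPI_Comm comm)
{
    int rank, buf = 0;
    MPI_Comm_rank(comm, &rank);

    if (rank == 0) {
        usleep((useconds_t)(delay_sec * 1e6));   /* controllable delay */
        MPI_Send(&buf, 1, MPI_INT, 1, 0, comm);
    } else if (rank == 1) {
        MPI_Recv(&buf, 1, MPI_INT, 0, 0, comm, MPI_STATUS_IGNORE);
    }
    MPI_Barrier(comm);   /* keep ranks in step between test patterns */
}

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);
    late_sender(1.0, MPI_COMM_WORLD);   /* ~1 s of waiting time on rank 1 */
    MPI_Finalize();
    return 0;
}
```

Compiled with an MPI compiler wrapper and run on two or more processes, a correct analysis tool would be expected to attribute approximately the configured delay as waiting time in the receive on rank 1, which is the kind of known, checkable behavior the abstract describes.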
| Original language | English |
| --- | --- |
| Pages (from-to) | 1465-1480 |
| Number of pages | 16 |
| Journal | Concurrency and Computation: Practice and Experience |
| Volume | 19 |
| Issue number | 11 |
| DOIs | |
| State | Published - 10 Aug 2007 |
Keywords
- Automatic performance analysis
- Parallel applications
- Parallel performance analysis
- Parallel programming
- Performance analysis tools