SC22 Proceedings

The International Conference for High Performance Computing, Networking, Storage, and Analysis

Birds of a Feather Archive

Use Cases for SPEC HPC Benchmarks


Authors: Robert Henschel (Indiana University), Junjie Li (Texas Advanced Computing Center (TACC)), Mathew Colgrove (NVIDIA Corporation), Dossay Oryspayev (Brookhaven National Laboratory), Ron Lieberman (Advanced Micro Devices (AMD) Inc)

Abstract: The Standard Performance Evaluation Corporation (SPEC) is a non-profit corporation formed to establish, maintain, and endorse standardized benchmarks and tools to evaluate performance and energy efficiency for the newest generation of computing systems. The SPEC High Performance Group (HPG) focuses specifically on developing industry-standard benchmarks for HPC systems and has a track record of producing high-quality benchmarks serving both academia and industry. This BoF invites HPC center operators, developers, and researchers to discuss their experiences using application benchmarks and to learn about the roadmap for future SPEC benchmark development.

Long Description: The SPEC High Performance Group (HPG) has been developing benchmarks for high performance computing systems for more than 25 years. These benchmarks have tracked the cutting edge of hardware and software: the SPEC OMP2001 benchmark was released just as the OpenMP programming paradigm came into wide use, and SPEC HPG was the first group to cover the relevant vendor-neutral offloading technologies with the SPEC Accel benchmark, which supported offloading in OpenCL, OpenMP, and OpenACC. Most recently, SPEC HPG released the SPEChpc 2021 benchmark suite, which covers multiple programming models, supports hybrid execution on CPUs and GPUs, and scales from a single server to thousands of nodes and GPUs.

SPEC benchmarks are widely used in industry and academia, as evidenced by the group's diverse membership drawn from both. The benchmarks are used to validate new compilers, benchmark new HPC systems, conduct acceptance testing, and verify performance after critical hardware or software updates. Key to their success is that SPEC provides not just the infrastructure to develop a benchmark but also facilitates its release, maintenance, and support, ensuring that a benchmark remains usable for a long time and is well supported even on cutting-edge hardware platforms. The result repository is another key feature: every result is peer reviewed to ensure that the hardware, software, and operating environment of a benchmark run are appropriately documented. This review makes SPEC benchmark results broadly usable, from hardware vendors' marketing material to academic papers comparing compiler optimization techniques. SPEC HPG benchmarks are freely available to all research institutions and non-profit organizations.

In this BoF we bring together a group that has participated in developing SPEC HPG benchmarks and uses them in day-to-day work. After a brief introduction to SPEC and the SPEC High Performance Group, the BoF will focus on real-world usage scenarios: representatives from HPC centers will give brief overviews of how they use the benchmarks, and these lightning talks will be used to engage the audience. While this is our first BoF proposal for this conference series, we have successfully conducted tutorials at this and other conferences, and based on that experience we expect the BoF to be very interactive.


