Days Held

Monday–Wednesday, November 14–16, 2022

The Student Cluster Competition (SCC) was developed in 2007 to provide an immersive high performance computing experience to undergraduate and high school students.

With sponsorship from hardware and software vendor partners, student teams design and build small clusters, learn scientific applications, apply optimization techniques for their chosen architectures, and compete in a non-stop, 48-hour challenge at the SC conference to complete real-world scientific workloads, showing off their HPC knowledge for conference attendees and judges.

Reproducibility Challenge

One of the applications presented to the student teams is the Reproducibility Challenge, in which students attempt to reproduce results from an accepted paper from the prior year’s Technical Program.

Students have the opportunity to interact directly with the paper’s authors as they attempt to reproduce specific results and conclusions from the paper. As part of this challenge, each student team writes a reproducibility report detailing their experience in reproducing the results from the paper. Authors of the most highly rated reproducibility reports may be invited to submit their reports to a reproducibility special issue.

Meet the SCC Teams

The SCC received many high-quality team submissions this year, and we’re pleased to announce that the following ten teams will travel to Dallas to compete in the SCC.

  • ETH Zürich, Switzerland
  • Friedrich-Alexander-Universität Erlangen-Nürnberg, Germany
  • MIT, Boston University and Northeastern University, United States
  • Nanyang Technological University, Singapore
  • National Tsing Hua University, Taiwan
  • Prairie View A&M University and California State University Channel Islands, United States
  • Purdue University and Indiana University, United States
  • Texas A&M University, United States
  • University of California San Diego, United States
  • University of Texas at Austin, United States

Competing Virtually:

  • Southeast University, China
  • Sun Yat-sen University, China
  • Zhejiang University, China

Meet the teams competing in the IndySCC

Teams & Process

Teams are composed of six students, an advisor, and vendor partners. The advisor provides guidance and recommendations, the vendor provides the resources (hardware and software), and the students provide the skill and enthusiasm. Students work with their advisors to craft a proposal that describes the team, the suggested hardware, and their approach to the competition. The SCC committee reviews each proposal and provides comments for all submissions. The requirements for teams are described more completely below.

The hardware requirements for team clusters are that they can run the competition’s applications and exercises and that they can operate without exceeding an announced power limit. This year the competition includes a dynamic power limit: at times the power available to each team for their competition hardware may be as high as 4000 watts (but will usually be lower) and may be as low as 1500 watts (but will usually be higher). Hardware requirements are described more completely in the “SCC Rules” below.

Support Provided

Selected teams receive full conference registration for each team member and one advisor. Each team is also provided with seven single-occupancy hotel rooms for the students and advisor. As the competition is part of the Students@SC program, students can also participate in Mentor–Protégé Matching and the Job Fair. Travel to the conference and per diem are not provided.

Thank You SCC Supporters



For more information about SCC in past years, including team profiles, photos, winners, and more:


SCC Mystery Application

The SCC is looking for scientific applications from the HPC community that could be used as the SCC Mystery Application. If you have a scientific application that you think would be a great fit for the competition, please complete the form via the button below.

Each submission must list an application owner who will:

  • be responsible for answering questions from the SCC teams.
  • prepare test and input decks for the competition.
  • be available to serve as a judge during SC22.

The application should not have export control restrictions.

The application must have up-to-date documentation.

Submissions and selections must be kept confidential until the beginning of the SCC, when the selected mystery application will be revealed.

The primary owner of the selected app receives complimentary SC22 registration.

Applications Open March 1–May 31, 2022:

Submit a Mystery App

SCC Benchmarks & Applications

Full Set of Benchmarks and Applications


Teams will run a set of benchmarks selected from a suite that includes the following:


LINPACK Benchmark

The LINPACK benchmark is a measure of a computer’s floating-point rate of execution. It is determined by running a program that solves a dense system of linear equations, and it is used by the TOP500 as a tool to rank systems by performance. The benchmark allows the user to scale the size of the problem and to optimize the software in order to achieve the best performance for a given machine. This performance does not reflect the overall performance of a given system, as no single number ever can. It does, however, reflect the performance of a dedicated system for solving a dense system of linear equations. Since the problem is very regular, the performance achieved is quite high, and the performance numbers give a good indication of peak performance.
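As a rough illustration of how an HPL (LINPACK) score is derived, the benchmark credits a run with a fixed operation count for solving a dense N×N system, then divides by the wall-clock time. The sketch below is not the benchmark itself; the numbers in the example run are made up.

```python
# Hypothetical illustration: turning an HPL run's problem size and runtime
# into a GFLOP/s score. HPL credits a dense N x N solve with
# 2/3*N^3 + 2*N^2 floating-point operations regardless of implementation.

def hpl_gflops(n: int, seconds: float) -> float:
    """Return the GFLOP/s rate credited for solving a dense N x N system."""
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    return flops / seconds / 1e9

# Example: a made-up run that solves an N = 100,000 system in 1,000 seconds
rate = hpl_gflops(100_000, 1_000.0)
print(f"{rate:.1f} GFLOP/s")  # prints about 666.7 GFLOP/s for these numbers
```

Scaling N up increases the credited FLOP count cubically while memory use grows only quadratically, which is why teams tune the problem size to fill memory within their power budget.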


HPCG Benchmark

The High Performance Conjugate Gradients (HPCG) Benchmark project is an effort to create a new metric for ranking HPC systems. HPCG is intended as a complement to the High Performance LINPACK (HPL) benchmark, currently used to rank the TOP500 computing systems. The computational and data access patterns of HPL are still representative of some important scalable applications, but not all. HPCG is designed to exercise computational and data access patterns that more closely match a different and broad set of important applications, and to give incentive to computer system designers to invest in capabilities that will have impact on the collective performance of these applications.
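The method at the heart of HPCG can be sketched in a few lines. The toy below (not the HPCG code itself) runs the conjugate gradient iteration on a tiny dense symmetric positive-definite system in pure Python; HPCG applies the same iteration to a large sparse 3D problem, which is why its performance is dominated by memory bandwidth rather than peak FLOP rate.

```python
# Sketch of the conjugate gradient (CG) iteration underlying HPCG,
# on a tiny symmetric positive-definite system, in pure Python.

def matvec(a, x):
    return [sum(aij * xj for aij, xj in zip(row, x)) for row in a]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def conjugate_gradient(a, b, tol=1e-10, max_iter=100):
    x = [0.0] * len(b)
    r = b[:]                       # residual r = b - A x (x = 0 initially)
    p = r[:]
    rs = dot(r, r)
    for _ in range(max_iter):
        ap = matvec(a, p)
        alpha = rs / dot(p, ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = dot(r, r)
        if rs_new < tol:           # tol compares the squared residual norm
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

# A small SPD system: A = [[4, 1], [1, 3]], b = [1, 2]
x = conjugate_gradient([[4.0, 1.0], [1.0, 3.0]], [1.0, 2.0])
print(x)  # close to [1/11, 7/11] = [0.0909..., 0.6363...]
```

Each iteration is one sparse matrix-vector product plus a few vector operations, so the arithmetic intensity is low by design.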


IO500 Benchmark

The IO500 benchmark is a benchmark suite for high-performance I/O. It harnesses existing and trusted open-source benchmarks such as IOR and MDTest and bundles execution rules and multiple workloads in order to evaluate and analyze storage systems under various I/O patterns. The IO500 benchmark is designed to establish the performance boundaries of HPC storage for data and metadata operations, under what are commonly observed to be both easy and difficult I/O patterns, from multiple concurrent clients. Moreover, there is a phase that scans for previously created files matching certain conditions, using a (possibly file-system-specific) parallel find utility, to evaluate the speed of namespace traversal and file-attribute retrieval. The final score used to rank submissions in the list is a combined score across all the executed benchmarks.


AI Benchmark: MLPerf Inference with RetinaNet

This year the SCC is introducing an AI benchmark to the suite of benchmarks teams will run. In this inaugural year, the AI benchmark is optional and worth bonus points: teams are encouraged to run the MLPerf inference benchmark for RetinaNet (object detection).






PHASTA

PHASTA is an open-source stabilized finite element computational fluid dynamics (CFD) flow solver. It can solve either the compressible or the incompressible Navier–Stokes equations. Laminar, transitional, and turbulent flows occur in either case, and PHASTA’s primary applications are to turbulent flows. The unsteady, three-dimensional nature of LES and DNS has driven PHASTA to be developed for extreme-scale HPC, where it has strongly scaled to 3 million processes and been applied on a broad variety of platforms. PHASTA discretizes space with kth-order hierarchical basis polynomials and the finite element method, leaving a system of non-linear ordinary differential equations, which are further turned into non-linear algebraic equations by applying a time integrator. To solve these linearized equations, PHASTA has a native implementation of the generalized minimum residual (GMRES) method with a block-diagonal preconditioner, but it can also be linked to PETSc to gain access to other equation solvers and preconditioners. Finally, one key feature of PHASTA is its use of unstructured grids.

Thanks to Ken Jansen, University of Colorado.



LAMMPS

LAMMPS is a classical molecular dynamics (MD) code with a focus on materials modeling. MD models the interactions of atoms and molecules without explicitly including electrons. Instead, an empirical potential is used to relate atom positions to forces, which are then used to evolve the system through time using Newton’s equations of motion. Models in LAMMPS range from simple pair-wise interactions (e.g., Lennard-Jones) and many-body interactions (e.g., Tersoff), through the reactive force field ReaxFF, to state-of-the-art machine-learning potentials (e.g., SNAP and PACE). LAMMPS is written in C++ and uses domain decomposition with MPI for parallel computation. Additionally, multithreading support is provided via the Kokkos performance portability library, as well as by native programming models (e.g., CUDA, HIP, and OpenMP).

Thanks to Stan Moore, Sandia National Laboratories.
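The core MD loop the description above refers to can be sketched in a few lines. This toy (not LAMMPS itself) evolves two particles in 1D under a Lennard-Jones pair potential with velocity-Verlet integration, in reduced units (epsilon = sigma = mass = 1); the starting positions are arbitrary illustration values.

```python
# Toy MD illustration: an empirical pair potential (Lennard-Jones) gives
# forces from positions, and velocity-Verlet integration advances Newton's
# equations of motion. Two particles in 1D, reduced units.

def lj_force(r):
    """Force on the right-hand particle at separation r (repulsive if r small).

    Derived from U(r) = 4*(r^-12 - r^-6): F = -dU/dr = 24*(2*r^-13 - r^-7).
    """
    return 24.0 * (2.0 * r**-13 - r**-7)

def velocity_verlet(x1, x2, v1, v2, dt, steps):
    f = lj_force(x2 - x1)
    for _ in range(steps):
        v1 -= 0.5 * dt * f          # half-step velocity update
        v2 += 0.5 * dt * f
        x1 += dt * v1               # full-step position update
        x2 += dt * v2
        f = lj_force(x2 - x1)       # recompute force at new positions
        v1 -= 0.5 * dt * f          # second half-step velocity update
        v2 += 0.5 * dt * f
    return x1, x2, v1, v2

# Start slightly inside the LJ minimum at r = 2^(1/6) ~ 1.122, at rest
x1, x2, v1, v2 = velocity_verlet(0.0, 1.1, 0.0, 0.0, dt=0.001, steps=1000)
print(x2 - x1)  # the pair oscillates around the potential minimum
```

A production MD code does exactly this, but for millions of atoms, with neighbor lists, a spatial domain decomposition across MPI ranks, and thread or GPU parallelism within each domain.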


Mystery Application
At the start of the competition, teams will be given the code and datasets for a mystery application. Students will be expected to build, optimize, and run this mystery application entirely at the competition.


Reproducibility Challenge

We are excited to announce that the SC22 Reproducibility Challenge Committee has selected the SC21 paper “Productivity, Portability, Performance: Data-Centric Python” by Alexandros Nikolaos Ziogas, Timo Schneider, Tal Ben-Nun, Alexandru Calotoiu, Tiziano De Matteis, Johannes de Fine Licht, Luca Lavarini, and Torsten Hoefler from ETH Zurich, Switzerland, to serve as the Student Cluster Competition (SCC) benchmark for this year’s Reproducibility Challenge.

Python is popular in the scientific community due to its portability and productivity. This paper explores its suitability as an HPC language, focusing on Python’s performance characteristics and ways in which it can be made more suitable for HPC. To that end, it puts to the test several frameworks that accelerate the execution of numerical workloads in Python on different hardware architectures: Numba, Pythran, CuPy, and DaCe (the authors’ own framework; more information is available in the DaCe and NPBench repositories and in the paper presentation). The main artifact is NPBench, a collection of scientific benchmarks written in Python targeting those frameworks. This year’s Reproducibility Challenge will ask the students to work with Python as an HPC language and compete to achieve the best performance using the Python toolchain of their choice.
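The pattern the NPBench-style benchmarks follow is simple: express a numerical kernel in Python, then time it and compare against accelerated variants. The stdlib-only sketch below shows that pattern with a naive matrix multiply as the baseline a framework like Numba or DaCe would need to beat; the kernel and sizes here are made up for illustration.

```python
# A minimal, stdlib-only sketch of the benchmark-harness pattern:
# define a numerical kernel, then time it with best-of-N repetitions.
import timeit

def matmul(a, b):
    """Naive pure-Python matrix multiply: the baseline an accelerator must beat."""
    n, k, m = len(a), len(b), len(b[0])
    return [[sum(a[i][p] * b[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

a = [[float(i + j) for j in range(64)] for i in range(64)]
b = [[float(i - j) for j in range(64)] for i in range(64)]

# Best of 3 repetitions, as benchmark harnesses typically report
best = min(timeit.repeat(lambda: matmul(a, b), number=1, repeat=3))
print(f"naive 64x64 matmul: {best * 1e3:.2f} ms")
```

The interesting part of the challenge is how large the gap is between this interpreted baseline and a JIT-compiled or GPU-offloaded version of the same source, and which toolchain closes it best on the team's hardware.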

SCC Team Applications

Recommendations for Preparing Your SCC Team Application

Teams are encouraged to include diverse participation including new participants and under-represented groups. To encourage new participants and help new teams participate, “new participant” points will be applied to Team Application evaluations.

Teams must qualify for at least ten (10) “new participant” points in order to be eligible to participate in SCC22, as follows:

  • Two (2) points for each first-time SC SCC participant.
  • One (1) point for each second-time SC SCC participant.
  • Eight (8) points if this will be the team’s first time in any of the major Student Cluster Competitions (SC, ASC and ISC, considering only 2019 and later).
  • Three (3) points if this will be the team’s second time in any of the major Student Cluster Competitions (SC, ASC and ISC, considering only 2019 and later).

A full description of the “new participant” points, including some examples, is provided at the Student Cluster Competition website.

Teams will be evaluated according to their responses to the following questions in the Team Application Form:


Strength of Team

Explain why this team would do well in the SCC. Comment on the team’s technical expertise, interdisciplinary and/or scientific background, interest in HPC, and previous experience in similar competitions.


  • What HPC and/or computational science experience do team members have?
  • How is the team interdisciplinary? What non-computing or domain science disciplines are represented?
  • How do the members combine to form a team with a broad range of experience relevant to the competition?
  • Have any team members participated before?
  • How will HPC help team members in their academic careers? List specific reasons for wanting to participate in the competition.
  • Introduce the advisor and advisor’s background.


Strength of Hardware and Software Approach

Explain the hardware and software architecture and discuss its feasibility.


  • Describe the hardware and software architecture in detail.
  • Explain why this architecture will be successful.
  • Explain the strategy for running applications and/or optimization during the competition.
  • How is the architecture suited for the competition applications?
  • How will you manage your system’s administration and application workflow?


Strength of Approach for Software and Cloud Administration

Teams will be given a fixed cloud budget for testing and development and a separate fixed cloud budget for the competition.


  • Provide a detailed description of the software architecture you plan to deploy.
  • How do you plan to manage your cloud resources?
  • Include details on how you will manage the system and your strategy for running the applications.


Strength of Vendor/Institution Relationship

Describe in detail what support will be given by the vendor and supporting institution before and during the competition (e.g., training, travel expenses).

Virtual-only teams that do not have a vendor partner should answer this question with regard to the support their institution will provide.

For in-person teams:

Discuss how the hardware will be provided by the vendor or supporting institution:

  • Will the vendor and/or institution provide hardware and fund equipment shipping costs as well as fund other travel and travel-related expenses that are not covered by the competition?
  • Will the vendor and/or institution provide resources for pre-competition preparation?
  • If the reviewers have questions about the architecture, who can answer those questions from the vendor and team?
  • If the team is planning to use new architecture that is not generally available, what is the backup plan if this new architecture is not released or available by the start of the competition?

For all teams:

What other support will the team receive from the vendors and/or the institution?


Strength of Diversity

Describe efforts to broaden participation by under-represented communities in team selection. Because the definition of “under-represented” varies from country to country, authors should include a discussion of the ways in which their team includes under-represented communities in their geographic region. In the United States, for example, some salient issues regarding diversity within advanced computing include the under-representation of women as well as of racial and ethnic minorities.


  • Does the team include meaningful contributions by groups that are traditionally under-represented in the country of the sponsoring institution?
  • What efforts were made during the team selection process to approach under-represented communities?

If you feel your country or location does not have issues with under-representation in computing, we encourage you to think more deeply or do some research to learn about this issue. Failure to respond to this question, or responding only in terms of the team’s technical breadth of skills, will result in lost points.


Team Preparation

Discuss how your team will prepare to use HPC resources and/or learn computational science.


  • What relevant courses are available, and which have team members attended?
  • What HPC resources will be used to investigate the applications before vendor hardware arrives? (Please list)
  • What is the team’s method for preparing for the competition?
  • Do team members have specific roles or duties?
  • Provide details on the advisor’s background and the training schedule you have planned to prepare for the competition.


Team Educational Goals

Describe what you hope to learn during the course of your preparation and at the competition.

SCC Rules

General Rules and Hardware, Software, and Team Requirements

The full rules are published on the Student Cluster Competition website. Some key details are as follows.

Violation of any rule may result in a team’s disqualification from the competition, or point penalization, at the discretion of the SCC committee. Any unethical conduct not otherwise covered in these rules will also be penalized at the discretion of the SCC Committee.

The following violations will result in immediate disqualification:

  • Having anyone other than the 6 registered team members working on the team’s cloud resources during competition hours.
  • Any communication between your cloud resource and a network other than the approved cloud networks.

All decisions are at the sole discretion of the SCC committee, and SCC committee decisions concerning the rules in a given situation are final.


Safety First

Equipment configurations, booth layout, and booth occupancy are always subject to safety as first consideration. If a task cannot be done safely, then it is unacceptable. When in doubt, ask an SCC supervisor or team liaison.



  • Advisors are required to be staff, faculty or graduate students of the team’s educational institution(s) or sponsoring HPC center.
  • Student Team Members must:
    • Be enrolled in a university or high school
    • Be at least 18 years old by the beginning of the SCC (Monday November 14, 2022)
    • Not have received a bachelor’s degree or equivalent before the beginning of the competition
  • Teams are encouraged to include diverse participation including new participants and under-represented groups. To encourage new participants and help new teams participate, “new participant” points will be applied to Team Application evaluations, as described in the full rules on the Student Cluster Competition website.
  • Teams must qualify for at least ten (10) “new participant” points in order to be eligible to participate in SCC22.
  • Once the competition begins, the six team members must work on the competition tasks with no external assistance – advisors, vendor partners and other supporters must not help the team in any way (other than to occasionally deliver coffee, snacks, etc). Outsourcing of competition tasks to either paid services or unpaid volunteers is not permitted.
  • Teams must conduct themselves professionally and adhere to the SC Code of Conduct. Students must compete fairly and ethically.



Detailed hardware requirements including specifications for the PDU provided are on the Student Cluster Competition website.

  • All hardware used must be commercially available at the time of the start of the competition.
  • All competition hardware must be powered through the provided PDU and must stay within the power limit applicable at the time. Other systems (such as laptops and monitors) may be powered from separate non-competition power sources provided by the conference.
  • The competition has a dynamic power limit: the power available to each team for their competition hardware at times during the competition may be as high as 4000 watts (but will usually be lower) and may be as low as 1500 watts (but will usually be higher).
  • Teams must ensure that their hardware consumes no more than 1500 watts while idle, and that it can run the applications and benchmarks without consuming more than 4000 watts.
  • No changes to the physical configuration are permitted after the start of the competition.



  • All system software (operating system, drivers, filesystems, compilers, etc) used in the competition must be publicly or commercially available at the start of the competition.
  • System software must not be modified after the benchmarking period.
  • The benchmark and application executables used in the competition must be built by the team members from open source implementations. Executables may be built in advance by the team members, but teams must provide the URL of the source package (for tarballs etc) or commit hash (for git etc repos). Teams should also be prepared to demonstrate building and running the executable if requested.
  • Teams may study and tune the code used in the benchmarks and applications. Any modifications to the source code made by the team must be shared with the SCC committee.



  • A network drop will be provided for outgoing connections only. Teams will NOT be permitted to access their clusters from outside the local network.
  • Competition hardware may be connected via wired connections only – wireless access is not permitted.
  • Free wireless access for laptops will be available throughout the convention center via SCinet.



  • Teams are responsible for obtaining their cluster hardware and transporting it to the convention center. (Team vendor partners are encouraged to help their teams with this).
  • Teams are responsible for their own travel arrangements to and from the conference, and for daily expenses such as meals (Team vendor partners are encouraged to help their teams with this).


Mandatory Events

  • All participants must attend the safety briefing before any unpacking or assembling of hardware, and before participating in the Benchmarking component or any computing tasks in the competition.
  • All students must attend the Students@SC Orientation.
  • At least one student competitor from each team must attend the daily committee-and-teams stand-up meeting.
  • For each team participating in-person, there must be at least 2 student competitors in their team booth at all times while the exhibition floor is open (except during mandatory events scheduled elsewhere).
  • For each team participating only in the virtual component, at least 2 student competitors must be in the competition Zoom session throughout the competition.

SCC Team Posters

Guidelines, Requirements, and Evaluation Criteria

Teams (both in-person and virtual-only) will present posters as a part of the competition. The posters will be printed on the walls of the SCC booth.


Submission Deadline

Due to lead times required for printing, the final print-ready version of the poster must be submitted not later than 2 September 2022. Failure to submit your final poster by the deadline will result in no points being awarded for the poster component of the competition.


Poster Printing Requirements

  • PDF format
  • Portrait orientation, 36 inches wide by 48 inches high
  • Graphics, other than photos, should be vector images (e.g. SVG)
  • Photos/raster images should be prepared at high enough resolution to not become overly pixelated when printed
  • Fonts should be vector-based (e.g. TrueType fonts) and embedded in the PDF
  • Upload your poster PDF to: SC Submission System


Poster Evaluation Criteria


Poster Content

Introduction to Team and School

What is the composition of the team? How does team member diversity (skillset, majors, cultural background, etc) strengthen the team? Are all team members making valuable contributions to the success of the team?

Description of Hardware and Software

Which technologies were selected (hardware, software, interconnect, etc) and why are they a good choice for this specific competition? In addition to your physical cluster, discuss your cloud configuration as well.

Preparation and Strategy

What are your optimization strategies for the applications? What are your team/time management strategies? How are you planning to leverage your hardware and cloud resources? How are you planning to handle the dynamic power limit? How does your preparation and planning support your success in the competition?

Poster Presentation

Is the poster in compliance with the printing requirements? Is the poster visually appealing? Is the text well-written and error-free? Does the poster effectively communicate the content using both text and images? Does the technical content demonstrate a deep understanding of the team and systems?

SCC Cloud Component

Cloud Resources and Rules

Thanks to generous support from Microsoft Azure and Oracle Cloud Infrastructure (OCI), SCC teams have access to two kinds of cloud-based HPC resources, as well as the physical cluster that each in-person team will bring to SC22:

  • Each team will have a fixed-size cluster in Oracle Cloud Infrastructure. This cluster will have 4 nodes of BM.Optimized3.36 (Intel Ice Lake). The virtual-only teams will also have a BM.GPU4.8 node (with 8 NVIDIA A100 GPUs) in their Oracle cluster, as they will be using it in place of the physical cluster that each in-person team brings.
  • Each team will get access to their own Azure CycleCloud installation to deploy and manage cloud clusters, with a selection of VM types based on the latest Intel and AMD CPUs and AMD and NVIDIA GPUs available in Azure. Teams will be given a budget (announced at the competition kick-off on Monday, November 14) to use during the competition.

Teams will have access to each cloud-based resource for one week, shortly before the competition, to practice managing the resources and building and running applications.

Applications will be assigned to clouds as follows:

  • Teams may run PHASTA and LAMMPS on either OCI or their physical clusters, but not on Azure.
  • Teams may run the Reproducibility Challenge tasks and the Mystery Application on either Azure or, for in-person teams, their physical cluster, but not on OCI. (Note that this implies virtual-only teams must use Azure for these applications.)

See the Student Cluster Competition website for more details about the SCC Cloud Component.

Helpful Webinars

Join free webinars on various Student Cluster Competition and IndySCC topics. Webinars are recorded and published for those who find the timezone a challenge.

Register for the webinars, or download slides and view on YouTube once they occur.

Students@SC Webinars


SC Pro Tip


Wear comfortable shoes! Convention centers are big places, and you’ll walk a lot.
