1 of 22

Launching the High Performance Software Foundation

2024-05-13, ISC-HPC 2024 BOF session

Linux Foundation Project Formation Team

2 of 22

Today we launch the

High Performance Software Foundation

HPSF aims to build, promote, and advance a portable core software stack for HPC by increasing adoption, lowering barriers to contribution, and supporting development efforts.


3 of 22

The world is investing billions in open source software for supercomputers

  • ECP alone has built 80+ software projects that run portably across GPUs
    • Programming models
    • Tools
    • Math libraries
    • Applications
    • Data visualization
    • Packaging
  • Europe and Japan are beginning their own major software efforts


4 of 22

HPSF aims to bring the high performance software stack to the mainstream

  • Today’s HPC accelerators will quickly make their way to clouds and personal computers
  • The impact of supercomputer software isn’t limited to supercomputers
  • Leverage know-how developed on flagship HPC systems in industry
  • We need to lower barriers to use

[Diagram: the HPC software stack spanning HPC systems, clouds and servers, and personal machines]

5 of 22

HPSF Goals

  1. Provide a neutral home for key HPC projects to enable collaboration among government, industry, and academia

  2. Promote use of HPSF projects

  3. Ensure that HPC software is accessible and reliable by providing CI and turn-key builds

  4. Ensure that HPC software is secure and cloud-ready through collaborations with CNCF and OpenSSF

  5. Sponsor events and training to grow a diverse, skilled workforce for software in the HPSF ecosystem

5

6 of 22

HPSF Structure

Governing Board (GB)
  • GB Committees: Marketing, Budget & Finance, Events & Training

Technical Advisory Council (TAC)
  • Working Groups: CI & Testing, Facility Engagement, Software Stacks, Safety and Security, End Users, Benchmarking
  • Technical Projects: Spack, Kokkos, Viskores, HPCToolkit, Apptainer, E4S

Collaborations

7 of 22

Members

[Member logos shown by tier: Premier, General, Associate]

8 of 22

Joining HPSF as a Member Organization

  • Join the Linux Foundation first
    • Non-profit and academic orgs join LF as associate members
  • Join HPSF at one of three levels:
    • Premier: $175k per year
    • General: $2.5k–$50k depending on organization size
    • Associate: for non-profit/academic orgs only
  • Non-profit/academic orgs can join LF as associate and HPSF as General or Premier


9 of 22

Membership and Participation Levels

Premier: $175,000/year
  • LF membership required (Associate LF membership is OK for academic/gov't orgs)
  • Governing Board seat: Yes
  • TAC seat: Yes
  • Outreach Committee: Yes
  • Two-year minimum commitment

General: sliding scale (see below)
  • Governing Board: one elected seat per every 5 General members
  • TAC seat: No, but can attend; a TAC seat may be earned on behalf of supported technical projects that hit project development milestones
  • Outreach Committee: Yes (non-voting)

Associate: no fee
  • Governing Board seat: No
  • TAC seat: No, but can attend
  • Outreach Committee: Yes
  • Limited to LF associate members (academic, nonprofit, and government entity organizations)

Paid levels also require Linux Foundation membership (Silver, if not already a member).

General Annual Fee Scale
  • $50,000: more than 5,000 employees
  • $30,000: 2,000–4,999 employees
  • $20,000: 500–1,999 employees
  • $10,000: 100–499 employees
  • $2,500: fewer than 100 employees

Please note that membership in the Linux Foundation is required to join High Performance Software Foundation as a member. Technical participation in any of the projects supported by the High Performance Software Foundation does not require membership in either the Linux Foundation or High Performance Software Foundation.

10 of 22

Representation

Governing Board (will grow)
  • One seat for each Premier Member
  • One for every 5 General Members, up to 25% of the board
  • One for every 5 Technical Projects, up to 25% of the board

Technical Advisory Council (TAC)
  • One seat for each Premier Member
  • One for every Graduated or Incubating project (TAC projects have moved beyond the sandbox level)

11 of 22

Prospective Working Groups

CI & Testing

  • Develop recipes, containers, and GitHub Actions for multi-architecture CI
  • Potentially provide resources to run CI on accelerator architectures

Software Stacks

  • Provide tested software stacks for various system architectures
  • Ensure interoperability of HPSF projects

End Users

  • End user engagement and requirement gathering

Facility Engagement

  • Facility software stack deployment
  • Testing of software stacks

Safety and Security

  • Supply chain security for HPC
  • Best practices for memory safety and avoiding other vulnerabilities during development

Benchmarking

  • Common benchmark infrastructure to ensure performance of HPC software
  • Feedback to hardware vendors on performance issues and concerns


12 of 22

Intended collaborations within LF

  • HPC in the cloud: cloud-based CI and continuous testing
  • Supply chain security: SLSA, build farm hardening
  • Accelerated computing: resource-aware libraries, HPC-optimized networking, use cases and benchmarks
  • Support UXL standard interfaces: ensure interoperability
  • Ensure HPC projects interoperate: improve the software stack


13 of 22

Technical Projects – Who should join:

  • Focused on the High Performance Computing ecosystem
  • Need a neutral home to facilitate multi-institutional collaborations
  • Providing vendor-neutral solutions to computational needs in engineering and science
  • Committed to building an open developer and user community


14 of 22

What is Spack?

  • Package management and build tool for HPC
  • A repository of nearly 8,000 package recipes

Who uses Spack?

  • Labs, academia, industry
  • End users, developers, admins, user support staff, scientists

What are you looking for?

  • Engagement with OpenSSF, CNCF, other LF organizations
  • Engagement with facilities, clouds, CI providers
  • Broader community!

Who supports Spack?

  • LLNL, SNL, other institutions
  • Community contributors
  • Kitware
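
A minimal sketch of typical Spack usage, assuming Spack is already installed and on PATH (the package names are real recipes from Spack's built-in repository; the exact versions and compilers shown are illustrative):

```shell
# Install a package, letting Spack build it and its dependencies from source
spack install hdf5

# Spack's spec syntax: @ selects a version, +/~ toggle variants, % picks a compiler
spack install hdf5@1.14 +mpi %gcc

# List installed packages
spack find

# Make a package available in the current shell environment
spack load hdf5
```

These commands are not runnable here without a Spack installation; see the Spack documentation for the authoritative CLI reference.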


15 of 22

What is Kokkos?

  • C++ performance portability programming model
  • Scientific libraries (BLAS, sparse, FFT, …)
  • Tools


Who supports Kokkos?

  • SNL, ORNL, CEA, ANL, LBNL
  • Contributions from major hardware vendors

Who uses Kokkos?

  • More than 2,000 developers at over 150 institutions
  • Diverse set of apps and libraries (science, engineering, data analytics)

Looking for:

  • Developer help
  • Engagement with academia for Kokkos classes and research

16 of 22

What is HPCToolkit?

  • Tools for measurement and analysis of application performance on CPUs and GPUs
    • Parallel programming models
      • MPI, OpenMP, Kokkos, RAJA, CUDA, HIP, SYCL, thread libraries, and more
    • Languages: C, C++, Fortran, Python
    • Platforms
      • CPUs: ARM, Power, x86_64
      • GPUs: AMD, Intel, NVIDIA


Who uses HPCToolkit?

  • National laboratories, companies, universities

Looking for

  • Collaborations: development, research, vendor engagement

Measurement Modalities

  • CPU: asynchronous sampling, function interception
  • GPU: API tracing, PC sampling, HW counters, binary rewriting
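
A sketch of HPCToolkit's measure-analyze workflow, assuming HPCToolkit is installed (the sampling event and application name are illustrative):

```shell
# 1. Measure: run the application under hpcrun with a sampling event
hpcrun -e CYCLES ./myapp          # writes hpctoolkit-myapp-measurements/

# 2. Recover program structure from the application's binaries
hpcstruct hpctoolkit-myapp-measurements

# 3. Attribute measurements back to source, producing a performance database
hpcprof hpctoolkit-myapp-measurements

# 4. Explore the resulting database in the GUI
hpcviewer hpctoolkit-myapp-database
```

This is illustrative only; consult the HPCToolkit manual for supported events and options on a given platform.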

17 of 22

What is E4S?

  • Curated, hardened software distribution of scientific packages
  • Built on Spack
  • Buildcaches, containers
  • “Frank” build farm at UO

Who supports E4S?

  • U. Oregon
  • DOE

Who uses E4S?

  • Users, centers in DOE
  • DOD
  • Adaptive computing
  • Application teams

Looking for:

  • Collaborations with application teams, developers
  • Engagement on CI
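
One common way to consume E4S is through its public Spack build cache, so packages install as prebuilt binaries rather than from source. A hedged sketch, assuming Spack is installed (the mirror URL is the one commonly documented for E4S; verify against current E4S docs):

```shell
# Add the E4S build cache as a binary mirror
spack mirror add E4S https://cache.e4s.io

# Trust the signing keys for the cache
spack buildcache keys --install --trust

# Packages now install from prebuilt binaries when a matching build exists
spack install kokkos
```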


18 of 22

What is Viskores?

  • HPC Visualization Software for Exascale-Era Processors
  • Focus on shared-memory parallelism on CPUs and GPUs

Who supports Viskores?

  • ORNL, Kitware

Who uses Viskores?

  • Large-scale HPC application teams
  • In-situ Visualization frameworks (Ascent)

Looking for:

  • Community engagement
  • Broader user base


(Formerly VTK-m)

19 of 22

What is Apptainer?

  • Widely used container tool for HPC
  • Focus on portability and security
  • SIF format supports:
    • easy sharing across HPC machines
    • scalable container launch with, e.g., Slurm
  • Interoperable with Docker, OCI
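
A brief sketch of the Docker/OCI interoperability noted above, assuming Apptainer is installed (the image and hostname are illustrative):

```shell
# Pull a Docker Hub image and convert it to a single SIF file
apptainer pull docker://ubuntu:24.04        # produces ubuntu_24.04.sif

# Run a command inside the container
apptainer exec ubuntu_24.04.sif cat /etc/os-release

# SIF images are plain files, so they copy easily between HPC machines
scp ubuntu_24.04.sif login.cluster.example:/scratch/$USER/
```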

Who uses Apptainer?

  • Industry, labs, academia
  • Large HPC centers
  • Companies

Who supports Apptainer?

  • CIQ, Inc.
  • Community contributors


20 of 22

Join HPSF!


  • If you are interested in joining, visit hpsf.io

  • The LFX signup link is live for members.

  • Contact us if you want to join as a project!

21 of 22

Additional LF-provided details follow


22 of 22

Scientific simulations critically rely on hundreds of open source packages


[Diagram: dependency graph of ARES, a turbulent fluid simulation: 30 LLNL proprietary, 12 LLNL open source, and 71 external open source packages]

Other sophisticated applications have similar dependency graphs:
  • WarpX laser-plasma accelerator model
  • ExaWind turbine model