The purpose of this project is to provide a skeleton or sample Academy Software Foundation (ASWF) project reflecting the best practices that have been established by the Technical Advisory Committee (TAC). More detailed documentation can be found in the Technical Advisory Committee repository.
The project should have a clearly defined mission statement, labeled as such. The mission statement should be located near the top of the README.md file in the home directory of the project repository, as well as on the home page of the project web site. Projects can use their mission statement to define what the project does, what problems it solves, what is outside the scope of the project, and what its long term goals are. The Linux Foundation blog post Why and How to Set an Open Source Strategy shares some ideas about what a mission statement for an open source project can look like.
A simple mission statement for this ASWF Sample Project could be:
The ASWF Sample Project aims to capture and document the best practices that will help open source projects meet the requirements for ASWF acceptance to Incubation Stage, followed by project graduation to Adopted Stage. By encouraging and documenting common practices and infrastructure between ASWF projects, the ASWF Sample Project should help to avoid duplicated effort and facilitate contributing to multiple ASWF projects.
The OpenEXR project demonstrates a practical example of a mission statement for an ASWF project.
The process for submitting a project for ASWF membership is documented in the Project Contribution Proposal Template. Once a project has been accepted as an Incubation Stage project, the next step is to move the project towards the Adopted Stage, as documented in the Project Lifecycle Document. A major part of the Project Graduation requirements is covered by meeting the "passing" criteria for the Best Practices badge defined by the Linux Foundation's Core Infrastructure Initiative (CII).
Much of the information and files in this sample project are closely related to the CII badge requirements, or demonstrate the preferred way to implement these requirements in an ASWF project.
ASWF projects are hosted on GitHub. Once a project has been accepted as an Incubation Stage project, its repository will be moved under the Academy Software Foundation organization, which is managed by the Linux Foundation Release Engineering team.
ASWF projects should choose an explicit Open Source Initiative approved open source license; the Choose a License site can help pick one. Existing projects will typically want to stick with their existing license, as relicensing can be a complex process. It is preferable to select an existing, unmodified, standard open source license, since this simplifies the process of getting legal approval for use of the project within commercial organizations and allows the use of metadata to identify the project license.
If you are starting a new project, the ASWF recommends the use of the Apache License 2.0 for code assets, the Creative Commons Attribution 4.0 International License for non-code / documentation assets, and a Community Data License Agreement for datasets.
A copy of the license should be in the root directory of your repository and should be called LICENSE. If you are using a standard open source license you should also tag your GitHub project with that license type. This can be done at project creation time.
Source files in your project should use Software Package Data eXchange (SPDX) identifiers to specify the project license, for instance in a C++ file:
// SPDX-License-Identifier: Apache-2.0
// Copyright Contributors to the PROJECT Project.
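The same identifiers can be added to other file types using that language's comment syntax; for example, a minimal sketch for a shell script (PROJECT is a placeholder for your project name):

```bash
#!/usr/bin/env bash
# SPDX-License-Identifier: Apache-2.0
# Copyright Contributors to the PROJECT Project.
```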
More details about the licensing and contribution requirements for ASWF projects can be found in contributing.md in the ASWF TAC repository.
A Contributor License Agreement (CLA) requires an individual or company to submit a legal document before a contribution can be accepted by the project. This is typically the case when a project starts outside the ASWF and the initial project corporate sponsor enforces this requirement. The ASWF OpenColorIO project documents their CLA Requirement and uses the EasyCLA system to keep a record of signed CLA agreements, as well as to verify that commits to the project come from contributors who have signed these agreements. If an ASWF project requires the use of CLAs, it is encouraged to use this system.
The Developer Certificate of Origin (DCO) is a lighter weight approach that allows a contributor to state that the code being submitted originated from the developer, or that the developer has permission to submit the code. Typically the developer needs to use the git commit -s option to sign all commits with their email address. The DCO GitHub app should be added to the GitHub project to verify that all commits included in a pull request are signed.
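For example, a minimal sketch of creating a signed-off commit (the name and email shown are placeholders, taken from your git configuration):

```bash
# configure the identity used for the Signed-off-by trailer (placeholder values)
git config user.name "Jane Developer"
git config user.email "jane@example.com"

# -s / --signoff appends a "Signed-off-by:" trailer to the commit message
git commit -s -m "Fix typo in README"
# the resulting commit message ends with:
#   Signed-off-by: Jane Developer <jane@example.com>
```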
GitHub allows your project repository name to use letters [a-z], numbers [0-9], hyphens or underscores. But RFC 952 and RFC 1123 specify that hostnames can only use letters, numbers and hyphens (ignoring for now internationalized domain names). Since it may be desirable to have network resources refer to the project name (such as the name of the project website), it is thus preferable to avoid using underscore characters in a project name.
If the project has a custom logo, consider providing a vector version of your logo in the repository, preferably in the Scalable Vector Graphics (SVG) format. A vector logo is much more flexible than one which has been rasterized to a fixed resolution image file format such as Portable Network Graphics (PNG). The ASWF Landscape uses SVG logos to represent notable open source projects in the industry. Projects hosted by the ASWF can leverage Linux Foundation Creative Services for developing a logo for the project.
Your project should have a README.md file in the project home directory, identifying the project and providing enough information to orient new users towards information and resources relevant to the project. The preferred format for in-tree documentation files which are likely to be viewed via the GitHub web interface is Markdown text.
ASWF Projects are required to designate a Technical Steering Committee (TSC) which is responsible for the technical oversight of the project, and adopt a Project Charter, for which a template is provided in the ASWF TAC repository. The TSC should meet regularly, and should keep public meeting minutes in the project repository. The TSC is responsible for setting the design, development, testing, release and support procedures for the project in close collaboration with the project community: close collaboration and alignment of processes between the TSCs of the various ASWF projects is encouraged but not mandatory.
A suggested directory structure for the TSC-related documents based on the OpenVDB project is the following:
project
├── LICENSE
├── README.md
└── tsc
    ├── charter.md
    ├── meetings
    │   └── yyyy-mm-dd.md
    └── process
        ├── codereview.md
        ├── deprecation.md
        ├── release.md
        └── security.md
Best practice for a project is to publish an agenda for every scheduled meeting ahead of time (preferably the day before) and then publish the notes after the meeting. Using the tsc directory structure above, the best place for meeting notes to live is under tsc/meetings, using the yyyy-mm-dd.md naming convention.
Agendas can be compiled in a number of different ways, including:
- Directly in the meeting notes document itself, shared as a pull request.
- In a shared Google Doc, which can then be converted with a tool like Docs to Markdown to produce clean Markdown.
- As a GitHub issue, compiled from other GitHub issues.
The project should offer clearly identified communication channels to facilitate interaction between developers, users, or developers and users. The project should define the best combination of mailing lists and/or instant message channels based on the needs of its community. The ASWF provides infrastructure that projects are encouraged to use.
The ASWF Mailing List Server uses the groups.io service to host both general ASWF mailing lists (referenced from the main ASWF site) as well as project specific mailing lists. Links to project specific mailing lists should be clearly listed on the project website, a typical set of lists could be:
- project-announce for general announcements such as new releases
- project-devel for developer-oriented discussion
- project-user for user-oriented discussion
A project may also want to offer an "official" Instant Message server as a complement or alternative to the mailing lists. The ASWF provides an ASWF Slack instance with channels for both general ASWF and project-specific topics. The ASWF Slack instance offers a simple onboarding interface where users can self-invite, and is discoverable from the home page of the main ASWF web site. The project web site should also prominently link to its Instant Message server.
The Linux Foundation Release Engineering team can help with the creation of project mailing lists and Slack channels.
The project must specify a versioning mechanism, and it is suggested that Semantic Versioning be used for consistency with other ASWF projects. The procedure for tagging and creating a release should be documented and should be automated as much as possible. In this sample project this is documented in tsc/process/release.md.
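For illustration only (the actual procedure should be documented in tsc/process/release.md), a minimal sketch of tagging a Semantic Versioning release with git, using a hypothetical version number:

```bash
# create an annotated tag following the MAJOR.MINOR.PATCH convention
git tag -a v1.2.0 -m "Release v1.2.0"

# push the tag so that CI / release automation can pick it up
git push origin v1.2.0
```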
The project should include an up to date list of key contributors. This could take an ad hoc form such as the OpenColorIO COMMITTERS.md file, and/or leverage the GitHub CODEOWNERS mechanism such as in the OpenVDB CODEOWNERS file which allows code review of pull requests to be automatically requested from owners of modified code.
Consider hosting the project web site from the GitHub repository, as this keeps all project-related content in a single repository and can help keep web site updates in sync with project updates. GitHub provides simple web hosting support which can be enabled in the GitHub Pages section of the GitHub Project Settings. For this sample project a stub index.md file was created in the docs/ directory, and the project web site can be accessed at https://AcademySoftwareFoundation.github.io/aswf-sample-project/ (custom project DNS domains are supported and encouraged). GitHub Pages supports HTTPS access, and the Enforce HTTPS setting should be used to redirect HTTP access to HTTPS.
Projects with more extensive website requirements may wish to use a separate GitHub repository to maintain their assets, and can make use of different site infrastructure.
The VFX Reference Platform is a set of tool and library versions to be used as a common target platform for building software for the VFX industry. It is updated on an annual basis. ASWF projects are typically used in software environments that adhere to the VFX Reference Platforms, and will often run inside applications (in house or commercial) that are built according to the specification. ASWF projects should have a statement as to which versions of the VFX Reference Platform are supported, and should include those versions of the platform in their build and test environments.
The aswf-docker project provides an environment to build Docker containers that can be used to build and test projects in a VFX Reference Platform compliant environment. It generates Docker containers which are published to the aswf Docker hub repository.
Azure Pipelines supports building inside Docker containers on Linux and Windows. Building inside a container insulates the build from the software environment of the build agent and allows complete control over the toolchain and dependencies. ASWF projects are encouraged to make use of the ASWF build containers as much as possible when setting up their CI environment. Containers are not supported on the macOS platform; Azure Pipelines documents the current versions of packages pre-installed on macOS 10.13 and macOS 10.14 build agents, and projects can use pre-installed package management systems such as Homebrew or Miniconda to manage dependencies on that platform.
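For local experimentation, a hedged sketch of pulling and entering one of the ASWF build containers; the image name and tag below are assumptions, so check the aswf Docker Hub repository for the images that are actually published:

```bash
# pull a VFX Reference Platform build container (image name and tag are examples only)
docker pull aswf/ci-base:2021

# start an interactive shell with the current project mounted inside the container
docker run -it --rm -v "$(pwd)":/project -w /project aswf/ci-base:2021 bash
```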
ASWF projects typically use CMake as a build tool to help support multiple platforms and leverage CMake modules that help with resolving the dependencies on packages and libraries used by projects. The CMake documentation includes a basic tutorial, and a useful resource for in depth CMake information is the book Professional CMake: A Practical Guide by Craig Scott.
An Introduction to Modern CMake is another interesting source of guidance.
CMake can also be used to run the project test suite with CTest, and can send test suite results to the CDash dashboard.
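As a reminder of the basic workflow, a minimal sketch of an out-of-source configure, build and test cycle (the build directory name is arbitrary):

```bash
# configure into a separate build directory (CMake 3.13+ command line syntax)
cmake -S . -B build

# build, then run the test suite with CTest from the build directory
cmake --build build
cd build && ctest
```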
CPack can be used to generate installation packages in a variety of generic and OS-specific package formats, such as .rpm, .deb or .tar.gz for Linux, .msi or .zip for Windows, and .dmg for macOS.
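For example, a minimal sketch of generating a package with CPack from an existing build directory; the generator to use depends on the target platform:

```bash
cd build
# generate a .tar.gz package; other generators include ZIP, RPM, DEB, WIX (.msi) and DragNDrop (.dmg)
cpack -G TGZ
```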
ASWF projects currently use Azure Pipelines from Microsoft (a component of the Azure DevOps service) to provide cross platform Continuous Integration functionality. Azure Pipelines provides Windows, macOS and Linux build agents free of charge for open source projects, and although GPU equipped build agents for running test suites that require GPU acceleration are not currently available, it is possible to add a custom build agent to the Agent Pool for your project.
The free build agents provided by Microsoft are documented under Microsoft-hosted agents and are configured with a significant amount of pre-installed software (see links to specific agents in the documentation for a list of available packages). Typically ASWF projects will want to control the specific versions of tools and libraries used to build and test, so additional mechanisms such as containers and package managers can be used to configure the desired build environment.
Azure Pipelines typically looks for a file called azure-pipelines.yml in the root directory of the GitHub project, which specifies the build / test / release instructions to be executed by the CI pipeline.
Once a project becomes an official ASWF project and moves its repository under the AcademySoftwareFoundation GitHub organization, it will be using the ASWF Azure Pipelines build project which is managed by the Linux Foundation Release Engineering team. But you will first want to get Azure Pipelines builds for your project running under your own login.
To run a build of your GitHub project using Azure Pipelines, you will first need to create an Azure DevOps account (assuming you don't already have one, and assuming you already have a GitHub account) and configure authentication in Azure DevOps and GitHub, which can only be done from the web interface:
- You can log in to Azure DevOps either with a Microsoft account or with your GitHub credentials. You can create a Microsoft Account if you don't already have one and log in to Azure DevOps using those credentials, or alternatively use "Start free with GitHub". This will create a default Azure DevOps organization based on your username.
- At the upper right corner of the Azure DevOps screen, click on the icon representing your account and select "Azure DevOps Profile". Under "User Settings -> Security" select "Personal Access Token". Select "+ New Token" to create a new token, name it "cli_access", under the "Organization" drop down select "All Accessible Organizations", and under "Scope" select "Full Access". Click on "Create" to create the token, make sure to record the value of the token in a safe place. This process is documented under Authenticate access with personal access tokens. This Personal Access Token (PAT) will let you authenticate CLI access to Azure DevOps.
- Similarly you will need to create a GitHub Personal Access Token to allow Azure DevOps to connect to your GitHub project. As per the Build GitHub Repositories documentation, under the section "Using a PAT", your GitHub PAT should have the repo, admin:repo_hook, read:user, and user:email permissions. You can call the PAT "AzureDevOpsPAT" for instance. As with the Azure DevOps PAT, make sure to record this token safely.
From this point on we will configure Azure DevOps from the CLI as much as possible. For more general documentation see Azure DevOps CLI. Each section is assumed to depend on the previous one, to avoid repeating commands such as setting environment variables. The syntax shown is for bash, but the az command line tool works similarly on Windows.
- Install the Azure CLI on your local system (example commands in the sketch below):
  - On Windows using an MSI installer
  - On macOS using the Homebrew package manager
  - On Linux using apt for Debian/Ubuntu or yum for RHEL/Fedora/CentOS
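For example, a hedged sketch of installing the CLI with a package manager; check the Azure CLI documentation for the current instructions on your platform:

```bash
# macOS, using Homebrew
brew update && brew install azure-cli

# Debian/Ubuntu, using the install script published by Microsoft
curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash

# RHEL/Fedora/CentOS: add the Microsoft package repository first, then
sudo yum install azure-cli
```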
- Install the Azure DevOps extension for the Azure CLI:
az extension add --name azure-devops
- Create an Azure DevOps project. The name doesn't matter too much, but it would probably help if it matched the name of your GitHub repository. The AZURE_DEVOPS_EXT_PAT environment variable is used to provide the Azure DevOps PAT you previously generated to the command line tools.
export AZURE_DEVOPS_EXT_PAT=YOUR_AZDEVOPS_PAT
az devops configure --defaults organization=https://dev.azure.com/AZDEVOPS_ORG_NAME
az devops project create --name AZDEVOPS_PROJECT_NAME --source-control git --visibility public
az devops configure --defaults project=AZDEVOPS_PROJECT_NAME
- Create a Service Connection which will link your Azure DevOps project to your GitHub account, using both your Azure DevOps PAT and your GitHub PAT for authentication.
export AZURE_DEVOPS_EXT_GITHUB_PAT=YOUR_GITHUB_PAT
az devops service-endpoint github create --github-url https://github.com/GITHUB_ACCOUNT/GITHUB_PROJECT/settings --name GITHUB_PROJECT.connection
- Create a Pipeline which will link the Azure DevOps project to your GitHub project. This assumes the existence of an azure-pipelines.yml configuration file in the root directory of the GitHub project. The --branch master command line option specifies that the build pipeline will be triggered by any commits to the master branch in the GitHub repository. You first need to retrieve the ID for the service connection you just created and use that to create the build pipeline:
export CONNECTION_ID=$(az devops service-endpoint list --query "[?name=='GITHUB_PROJECT.connection'].id" -o tsv)
az pipelines create --name GITHUB_PROJECT.ci --repository GITHUB_USER/GITHUB_PROJECT --branch master --repository-type github --service-connection $CONNECTION_ID --skip-first-run --yml-path /azure-pipelines.yml
As an alternative to the Azure CLI, the Azure DevOps Python API can be accessed directly to automate Azure DevOps tasks.
The configuration from the previous section should have created a pipeline called GITHUB_PROJECT.ci, which by default compiles and runs a small C++ test program on Windows, macOS and Linux. The --skip-first-run command line option prevented a build from getting kicked off when the pipeline was created, but any commit to the master branch in the GitHub repository will kick off an automatic build, which you can monitor in the Azure DevOps web interface.
You can launch a build manually from the command line:
export AZURE_DEVOPS_EXT_PAT=YOUR_AZDEVOPS_PAT
az pipelines build queue --definition-name GITHUB_PROJECT.ci
To meet the CII badge requirements, the project must have an automated test suite, and must have a policy that new tests must be added to the test suite when major new functionality is added to the project. There are several tools that can help create, run and monitor the results of a test suite; this sample project demonstrates trivially simple testing using CTest, with uploading of test results to CDash.
In this simple example the tests are specified in src/CMakeLists.txt. One test just runs the resulting binary to make sure it doesn't crash on startup, and one test looks for the expected Hello, World! output.
# does the application run
add_test (HelloRuns hello)
# does it print what you expect
add_test (HelloPrints hello)
set_tests_properties (HelloPrints PROPERTIES PASS_REGULAR_EXPRESSION "Hello, World!")
The test suite can be run directly from the build directory:
cd build
ctest .
which should output something like:
Test project /path/to/build/directory
Start 1: HelloRuns
1/2 Test #1: HelloRuns ........................ Passed 0.01 sec
Start 2: HelloPrints
2/2 Test #2: HelloPrints ...................... Passed 0.00 sec
100% tests passed, 0 tests failed out of 2
Total Test time (real) = 0.01 sec
CDash is an open source web-based testing server developed by Kitware, who are also responsible for CMake. CTest has integration with CDash and can automatically upload test suite results to CDash for display and analysis.
First, create a free account on my.cdash.org. Then, once you are logged in, scroll to the bottom of the browser window to the Administration section and select "Start a new project". Use the name of your GitHub project as the CDash project name, and select:
- Public Dashboard
- Authenticate Submission
and save your changes with "Update Project".
Next create an authentication token for this CDash project which will be used by CTest to authenticate uploads of test results. Name this token the same as your GitHub / CDash project, create the token, and copy it to a safe location since you will not be able to access it again from the CDash interface.
Then add the CDash token that was created as a secret variable called CTEST_CDASH_AUTH_TOKEN in the Azure Pipelines pipeline definition (assuming you named your pipeline GITHUB_PROJECT.ci as per the section on Azure DevOps CLI configuration).
export AZURE_DEVOPS_EXT_PAT=YOUR_AZDEVOPS_PAT
az pipelines variable create --name CTEST_CDASH_AUTH_TOKEN --value YOUR_CDASH_TOKEN --secret true --allow-override true --pipeline-name GITHUB_PROJECT.ci
By default, secrets associated with a build pipeline are not made available to pull request builds from forks, which will cause the automatic gating test builds for pull requests to fail, since these are built from a separate fork.
Unfortunately there is currently no simple way to allow secrets access from fork builds via the Azure CLI, so instead you need to follow these steps in the GUI:
- In the sidebar of your Azure DevOps project, select Pipelines / Pipelines
- Select the build pipeline, called GITHUB_PROJECT.ci
- Click on the Edit button at the top right corner of the screen
- Select the ... drop down menu and pick the Triggers option
- Select the Pull request validation trigger
- Tick the Make secrets available to builds of forks option
- Click on the Save & queue drop down menu and select Save
Allowing fork builds access to secrets can be considered a security issue, since the fork / PR could be adding code to compromise the secret. An alternative approach is to skip sections of the build that require access to secrets stored in environment variables, and whenever possible use Service Connections instead. In this project conditional code in CTestScript.cmake is used to prevent trying to upload test results if the access token environment variable is not set.
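The actual check in this project lives in CTestScript.cmake, but the same idea expressed as a bash build step would look roughly like this (a sketch only):

```bash
# skip the CDash upload when the secret is not available (e.g. in a fork / PR build)
if [ -z "${CTEST_CDASH_AUTH_TOKEN:-}" ]; then
  echo "CTEST_CDASH_AUTH_TOKEN not set, skipping test result upload"
else
  ctest --verbose -S ../CTestScript.cmake
fi
```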
The configuration that allows CTest to upload to CDash is found in the file CTestConfig.cmake, and the CTest script that will get run is in CTestScript.cmake.
The CI pipeline definition YAML file azure-pipelines.yml must define the following environment variables before it can call CTest:
- CTEST_SOURCE_DIRECTORY: the path to the current location of the project on the build agent
- CTEST_BUILD_NAME: an identifier for the current build
- CTEST_SITE: an identifier for the build agent
After the build is complete, the build agents should then execute:
ctest --verbose -S ../CTestScript.cmake
to run the CTest script, and you should then be able to view the test results on the CDash dashboard.
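Pulled together, a hedged sketch of what a build agent might effectively run; the variable values are placeholders that azure-pipelines.yml would supply:

```bash
# identify the source checkout, the build and the agent for CDash (placeholder values)
export CTEST_SOURCE_DIRECTORY="$(pwd)"
export CTEST_BUILD_NAME="linux-gcc-release"
export CTEST_SITE="azure-pipelines-agent"

# run the CTest script from the build directory; it runs the tests and submits results to CDash
cd build
ctest --verbose -S ../CTestScript.cmake
```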
For static analysis, SonarCloud is one option, but there are lots of others, such as Clang-Tidy, Cppcheck, Infer, LGTM and PVS-Studio.
The ASWF provides an instance of the JIRA ticketing system for the use of its member projects. You will need to create a Linux Foundation ID to use this system. The native GitHub Issues mechanism in the project GitHub repository is also available. The TSC should define and document which ticketing system (or combination thereof) should be used and for what purpose.
Projects which produce C++ libraries to be consumed by applications should make it simple to control the namespace in which externally visible symbols are scoped. A typical Digital Content Creation (DCC) application may link directly against a specific version of a library, but through its plugin API may end up pulling in different versions as well. As a general rule ASWF C++ libraries must support running two instances of different versions in the same application.
An example of this can be found in the top-level CMakeLists.txt CMake project file of the OpenColorIO Project:
set(OCIO_NAMESPACE OpenColorIO CACHE STRING "Specify the master OCIO C++ namespace: Options include OpenColorIO OpenColorIO_<YOURFACILITY> etc.")
This creates the OCIO_NAMESPACE CMake cached variable with the default value of OpenColorIO; this value can be overridden on the CMake command line with an option such as -DOCIO_NAMESPACE=MyCustomOpenColorIOBranch.
It can be desirable to use nested namespaces that include the ABI version and build type of the library in the namespace, as demonstrated in OpenColorABI.h.in, which gets processed by CMake using the configure_file() CMake command to generate the OpenColorABI.h C++ header file (hence the use of '@' for variable substitution by CMake rather than '##' for token pasting in the C preprocessor):
#define OCIO_VERSION_NS v@OpenColorIO_VERSION_MAJOR@_@OpenColorIO_VERSION_MINOR@@OpenColorIO_VERSION_RELEASE_TYPE@
#define OCIO_NAMESPACE_ENTER namespace OCIO_NAMESPACE { namespace OCIO_VERSION_NS
#define OCIO_NAMESPACE_EXIT using namespace OCIO_VERSION_NS; }
#define OCIO_NAMESPACE_USING using namespace OCIO_NAMESPACE;
The CMake symbols OpenColorIO_VERSION_MAJOR and OpenColorIO_VERSION_MINOR are defined as 2 and 0 respectively in the top-level CMakeLists.txt by using the VERSION keyword of the project() command:
project(OpenColorIO
VERSION 2.0.0
LANGUAGES CXX C)
and OpenColorIO_VERSION_RELEASE_TYPE here would be dev:
set(OpenColorIO_VERSION_RELEASE_TYPE "dev")
It is considered best practice in ASWF projects to define the project version once in a top level CMakeLists.txt CMake project file and to use configure_file() to add it to the source code files that require it.
Using the macros:
OCIO_NAMESPACE_ENTER {
some code;
}
OCIO_NAMESPACE_EXIT
is equivalent to:
namespace OpenColorIO {
namespace v2_0dev {
some code;
}
using namespace v2_0dev;
}
assuming that the base OCIO_NAMESPACE CMake cache variable has not been overridden on the command line.
The same versioned ABI namespace should also be used to set the SONAME CMake Property which is used to set the SONAME / ABI version of shared libraries built from the project.
An alternative approach is to use a non-nested namespace that appends the project version to the project namespace, and use C++ namespace aliases in client code to transparently access the versioned namespace. For instance, assuming the CMake infrastructure described previously to generate the main project include file:
namespace MyAswfProject_v2_1 { }
namespace MAP = MyAswfProject_v2_1;
allows client code to refer to API functions as MAP::foo() without having to worry about the specific version, while still allowing global renaming of the library namespace from the CMake command line.
This project has enabled repolinter from the TODO Group. This tool checks your repository for common setup errors. By default it runs weekly at 00:00 GMT on Sunday as a GitHub Action; notifications for the success or failure of this job are controlled by each user's notification options.
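To try the same checks locally, a hedged sketch using the repolinter command line tool (assuming Node.js and npm are available; the exact invocation may differ between repolinter versions):

```bash
# install the repolinter CLI and lint the current repository
npm install -g repolinter
repolinter lint .
```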