# Run studies on GitHub Actions
## Trigger a study run

Studies are run on-demand using GitHub's `workflow_dispatch` trigger.
- Go to the repository's **Actions** tab.
- Select the **Run Studies** workflow in the left sidebar.
- Click **Run workflow**.
- Fill in the inputs:

  | Input | Default | Description |
  |---|---|---|
  | Time budget | `2h` | Time budget per study (e.g., `30m`, `1h`, `2h`). |
  | Studies | `all` | Comma-separated study IDs, or `all` for every study in `config/studies/`. |

- Click the green **Run workflow** button.
### Example: run specific studies

To run only the format comparison and AVIF speed sweep with a 1-hour budget:

- Time budget: `1h`
- Studies: `format-comparison, avif-speed-sweep`
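The same dispatch can also be triggered from the terminal with the GitHub CLI. The workflow file name (`run-studies.yml`) and the input keys (`studies`, `time_budget`) below are assumptions; check the workflow's `on: workflow_dispatch: inputs:` block for the actual names:

```shell
# Hypothetical workflow file and input names -- verify against .github/workflows/.
STUDIES="format-comparison,avif-speed-sweep"
BUDGET="1h"
gh workflow run run-studies.yml -f studies="$STUDIES" -f time_budget="$BUDGET"
```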
## Monitor progress

Click into the workflow run to see the job graph:
- Build Container Image — builds the dev container (~5–10 min, cached after first run)
- Prepare Study Matrix — computes which studies to run
- Fetch Dataset — downloads `div2k-valid` (~2 min)
- Study: `<name>` — one job per study, running in parallel; runs the pipeline (encode + measure) and analysis.
- Comparison: <name> — one job per study, runs after the corresponding Study job; re-encodes images using interpolated quality settings and generates visual comparison figures. Studies without comparison targets produce no output and upload no artifact.
- Generate Report — combines all results into an interactive report
- Deploy Report — publishes to GitHub Pages
- Create Release — tags the commit and creates a GitHub Release
Each job’s logs are visible in real time. Study jobs show encoding progress including image count and time remaining.
## Access results

### Interactive report

After the workflow completes, the report is available at:

`https://<username>.github.io/<repository>/report/`

The report includes interactive Plotly visualizations for all completed studies.
### GitHub Release

Each successful run creates a GitHub Release with:
- Release notes — summary of studies, tool versions, and key findings
- CSV files — per-study statistics for independent re-analysis
Find releases under the repository's Releases tab. The release tag follows the pattern `study-YYYYMMDD-HHMMSS`.
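For scripting against releases, a tag in that format can be reproduced or validated with standard shell tools (whether the workflow timestamps in UTC is an assumption here):

```shell
# Build a tag matching study-YYYYMMDD-HHMMSS, then validate it with a regex.
TAG="study-$(date -u +%Y%m%d-%H%M%S)"
echo "$TAG"
echo "$TAG" | grep -Eq '^study-[0-9]{8}-[0-9]{6}$' && echo "tag format OK"
```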
### Workflow artifacts

Raw data is available as workflow artifacts (90-day retention):
- `study-results-<id>` — metrics JSON and analysis outputs per study
- `report` — the generated HTML report
- `release-materials` — release notes and CSV assets
Download artifacts from the workflow run’s Summary page.
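Artifacts can also be fetched with the GitHub CLI. The study ID below (`format-comparison`) is just an example; `gh run download` prompts you to pick a run when no run ID is given:

```shell
# Artifact names follow the study-results-<study-id> pattern described above.
STUDY_ID="format-comparison"            # example study ID
ARTIFACT="study-results-${STUDY_ID}"
gh run download --name "$ARTIFACT"      # prompts to select a run if no ID is given
```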
## Run studies in a fork

To run studies in your own fork:
- **Enable GitHub Actions:** Go to the fork's Actions tab and enable workflows.
- **Configure GitHub Pages:** Go to Settings → Pages → Source, select Deploy from a branch, and choose the `gh-pages` branch with the `/ (root)` folder.
- **Allow GHCR access:** The workflow pushes a container image to GitHub Container Registry. This works automatically for public forks.
- **Trigger the workflow:** Follow the same steps as above.
The report will be deployed to your fork’s GitHub Pages URL.
## Customize studies for CI

You can add or modify study configs before triggering a run:
- Create or edit a study JSON in `config/studies/` (see Create a custom study).
- Commit and push to your fork.
- Trigger the workflow with your study ID.
The workflow picks up all study configs from `config/studies/` when `studies` is set to `all`, or you can specify your custom study ID directly.
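The study schema itself is defined by the repository (see Create a custom study); purely as a sketch of where a custom config lives, a minimal hypothetical `config/studies/my-custom-study.json` might look like:

```json
{
  "id": "my-custom-study",
  "dataset": "div2k-valid",
  "description": "Illustrative only; field names here are assumptions, not the real schema"
}
```

On the next dispatch, setting `studies` to `my-custom-study` (or `all`) picks it up.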
## Resource considerations

| Constraint | Limit | Impact |
|---|---|---|
| Job timeout | 6 hours | Limits per-study time budget |
| Runner disk | 14 GB SSD | Limits concurrent encoded artifacts |
| Runner CPU | 4 cores | Affects encoding speed (quality metrics unaffected) |
| Runner RAM | 16 GB | Sufficient for all current studies |
| Concurrent jobs | 20 | Up to 20 studies can run in parallel |
| Artifact retention | 90 days | Raw metrics available for 90 days |
Tip: For the DIV2K validation dataset (100 images), a 2-hour budget typically processes 20–60 images depending on the number of encoder configurations. Increase the budget for more comprehensive results, up to the 6-hour job limit.
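As a sanity check on that tip, the expected image count is roughly the budget divided by (number of encoder configurations × per-image time). Both figures below are hypothetical placeholders, not measurements from this project:

```shell
# Rough estimate: images processed ~= budget / (configs * per-image seconds).
BUDGET_S=$((2 * 3600))   # 2-hour budget, in seconds
CONFIGS=6                # hypothetical number of encoder configurations
PER_IMAGE_S=40           # hypothetical encode+measure seconds per image per config
echo $(( BUDGET_S / (CONFIGS * PER_IMAGE_S) ))   # prints 30
```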
## See also

- Public research with GitHub Actions — why and how this infrastructure works
- Create a custom study — define new studies to run on CI
- Run the pipeline — run studies locally instead