Org Coding Hours is a GitHub Action that aggregates per-contributor coding hours across one or more repositories. It uses the git-hours utility to estimate how many hours each contributor has spent (based on commit timestamps) and produces JSON summary reports. Optionally, it can also commit these metrics to a dedicated branch for archival or further processing. This action is ideal for tracking contributor effort across multiple projects in an organization, whether for open-source volunteer tracking or internal metrics.
Key features and benefits:
- Aggregate commit hours across repos – Analyze one repository or an entire org (supports wildcards like `my-org/*`). The action outputs a combined organization-wide report as well as per-repository breakdowns.
- Works with private repos – Private repositories are supported. The action will use the provided `GITHUB_TOKEN` (or a supplied PAT) to authenticate `git` clones via HTTPS for private repositories.
- Includes git-hours – The Docker image bundles a prebuilt `git-hours` binary. The CLI expects this binary to be present and fails fast if it is missing.
- Flexible output – Use the JSON reports directly (e.g. for further processing or archival), or commit them to a dedicated metrics branch for later use, such as building dashboards in a separate workflow.
- Deterministic and automated releases – This repository follows semantic versioning for tags (e.g. `v7`, `v7.0.0`). Releases are automated via GitHub Actions: when a new version is prepared, a Git tag is created and a GitHub Release is published using the GitHub CLI with `--generate-notes` to auto-generate the changelog. Dependencies are locked via `packages.lock.json`, and CI restores in locked mode for repeatable builds. (See Release Process for details.)
The CLI runs on Windows, macOS, and Linux. Regardless of platform, ensure the following is available:
- `git` – The repository doesn't commit a `git` binary. During packaging, the build workflows run a script that downloads a pinned static `git` binary and places it under `tools/git/` for inclusion in the CLI's NuGet package.
This action supports the following inputs:
| Input Name | Required? | Default | Description |
|---|---|---|---|
| `repos` | Yes | (none) | List of repositories to process, in `owner/repo` format. Separate multiple entries with spaces or newlines. Supports wildcards (e.g. `my-org/*` for all repositories in an organization). Each repository listed will be cloned and analyzed. |
| `window_start` | No | (none) | Optional start date (`YYYY-MM-DD`) for the reporting window. Commits before this date will be ignored. If not set, the default is effectively "30 days ago" (as determined by the `git-hours` tool). Use this to limit the metrics to a recent timeframe (e.g. quarterly reports). |
| `metrics_branch` | No | (none) | Optional branch where JSON report snapshots should be committed. Requires `github_token` and a GitHub Actions context to push. If provided, the action will commit the contents of the `reports/` directory to this branch and create it if needed. (Tip: use a dedicated branch like `metrics` to keep data separate from code.) |
| `git_hours_version` | No | `v0.1.2` | Version tag of the `git-hours` CLI to use. By default, a known stable version is included. You can override this to use a specific release of `git-hours`. |
Note: All inputs are strings.
This action produces two outputs that can be consumed in subsequent steps or jobs:
- `aggregated_report` – The file path (within the workspace) to the aggregated JSON report. If multiple repositories were processed, this points to the combined report (summing all repos). If only one repository was processed, this points to that repository's JSON file (so you don't have to handle two cases).
- `repo_slug` – A URL/filename-safe identifier derived from the `repos` input. Slashes (`/`) within each `owner/repo` entry are replaced with underscores, and the whitespace separating entries becomes a hyphen. This is useful for naming artifacts or distinguishing outputs for different repo sets. For example, if `repos: foo/bar baz/qux`, the `repo_slug` will be `foo_bar-baz_qux`. If a single repo `my-org/my-repo` is processed, `repo_slug` will be `my-org_my-repo`.
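As a rough sketch (not the action's actual implementation), the slug derivation implied by the examples above can be reproduced in shell:

```shell
# Hypothetical re-implementation of the repo_slug derivation, inferred from
# the documented examples: slashes become underscores, and the whitespace
# between entries becomes a hyphen.
repos="foo/bar baz/qux"
slug=$(printf '%s' "$repos" | tr '/' '_' | tr ' \n' '--')
echo "$slug"
```

which prints `foo_bar-baz_qux`, matching the example above.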
In addition to outputs, the action writes files to the workspace in a structured way:
```
reports/
├─ git-hours-aggregated-YYYY-MM-DD.json    # Aggregated report (all repos combined)
├─ git-hours-<repo_slug1>-YYYY-MM-DD.json  # Individual repo report (one per repo)
├─ git-hours-<repo_slug2>-YYYY-MM-DD.json
└─ ...                                     # one JSON file for each repository
```
Each JSON report (per repo or aggregated) contains a "total" object with total hours and commits, and then one entry per contributor (keyed by email or username) with their own hours and commit count.
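An illustrative sketch of that shape (the exact field names inside each entry are assumptions based on the description above, not the tool's published schema):

```json
{
  "total": { "hours": 12.5, "commits": 42 },
  "alice@example.com": { "hours": 8.0, "commits": 30 },
  "bob@example.com": { "hours": 4.5, "commits": 12 }
}
```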
To use this action in a workflow, reference it by its repository and version tag. For example, to run a report across multiple repositories and save the JSON outputs as an artifact:
```yaml
name: Organization Coding Hours Report
on:
  workflow_dispatch:
    inputs:
      repos:
        description: "Space-separated list of repositories (owner/name format)"
        required: true
      window_start:
        description: "Optional start date (YYYY-MM-DD)"
        required: false
permissions:
  contents: write # required for pushing to the metrics branch
  # pages: write  # (only if publishing a site in a separate job)
jobs:
  report:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run Org Coding Hours Action
        uses: LabVIEW-Community-CI-CD/org-coding-hours-action@v7
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          repos: ${{ github.event.inputs.repos }}
          window_start: ${{ github.event.inputs.window_start }}
          # metrics_branch: metrics # (optional) enable branch push for JSON
      - name: Upload JSON reports
        uses: actions/upload-artifact@v4
        with:
          name: coding-hours-json
          path: reports/
```

Build the CLI and execute it on your machine. The tool reads its configuration from environment variables:
```shell
dotnet publish OrgCodingHoursCLI/OrgCodingHoursCLI.csproj -c Release -o cli
export REPOS="my-org/repo1 my-org/repo2"
export WINDOW_START="2024-01-01"   # optional
export METRICS_BRANCH="metrics"    # optional
export GITHUB_TOKEN="ghp_..."      # needed for private repos
./cli/OrgCodingHoursCLI
```

You can also invoke the published executable from a workflow step:
```yaml
jobs:
  cli-example:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Build CLI
        run: dotnet publish OrgCodingHoursCLI/OrgCodingHoursCLI.csproj -c Release -o cli
      - name: Run CLI
        run: |
          export REPOS="my-org/repo1 my-org/repo2"
          export WINDOW_START="2024-01-01"
          export METRICS_BRANCH="metrics"
          ./cli/OrgCodingHoursCLI
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

Pull the published image and provide the required environment variables:
```shell
docker run --rm \
  -e REPOS="my-org/repo1 my-org/repo2" \
  -e GITHUB_TOKEN="$GITHUB_TOKEN" \
  -e WINDOW_START="2024-01-01" \
  -e METRICS_BRANCH="metrics" \
  ghcr.io/labview-community-ci-cd/org-coding-hours-action:v0.1.0
```

Replace `v0.1.0` with the desired release tag.
```yaml
jobs:
  docker-example:
    runs-on: ubuntu-latest
    steps:
      - name: Org Coding Hours via Docker
        uses: docker://ghcr.io/labview-community-ci-cd/org-coding-hours-action:v0.1.0
        env:
          REPOS: my-org/repo1 my-org/repo2
          WINDOW_START: "2024-01-01"
          METRICS_BRANCH: metrics
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

The Docker image includes `git-hours` built from its latest tagged release. To use another revision, rebuild the Docker image with a different `GIT_HOURS_VERSION` build argument or provide your own `git-hours` binary.
This action only produces JSON reports. If you want to publish a static site or dashboard, add steps in your workflow to build and deploy it (for example, by using a separate job with GitHub Pages). The metrics_branch input can push the reports/ directory to a branch for later consumption, or you can upload the reports as an artifact for another job to use.
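For example, a follow-up job could download the artifact from the earlier workflow example and publish it. This is only a sketch: the `coding-hours-json` artifact name comes from that example, and the site-build step is a placeholder you would replace with your own tooling.

```yaml
  # Sketch of a follow-up job consuming the uploaded reports artifact.
  publish:
    needs: report
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: coding-hours-json
          path: reports
      # ...build your dashboard from reports/*.json, then deploy
      # (e.g. with actions/upload-pages-artifact and actions/deploy-pages)
```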
This project uses semantic versioning for its action releases (e.g. v1.0.0, v2.1.3). The release process is designed to be deterministic and fully automated for consistency and reliability:
For details on how the CLI is built and tested, how the packaged executable feeds into the Docker image, and how releases stay reproducible, see docs/ci.md.
- Version bumps: New versions are tagged when meaningful changes are introduced (features, fixes, etc.). The versioning follows semantic rules: MAJOR.MINOR.PATCH. Breaking changes or major new features increase the major version, new functionality without breaking existing usage increases the minor version, and patches/bugfixes increment the patch number.
- Automated GitHub Releases: Merging changes to
mainruns the release workflow, which creates a signed Git tag (e.g.v7.0.0) and publishes a GitHub Release using the GitHub CLI with the--generate-notesflag. Release notes (changelog) are automatically compiled from commit messages and PR descriptions since the last release. - Changelog generation: By using
gh release create --generate-notes, we ensure the changelog is generated deterministically from the history. GitHub will include summaries of changes (for example, PR titles, commit messages like “feat: ...” or “fix: ...”) in the release notes. This removes manual steps and potential human error from the release documentation. You can view the history of changes for each release in the Releases section of the repository, which will contain these auto-generated notes. - Deterministic outputs: Every release of the action is a specific tagged commit, so your workflows should reference the action with a version tag (for example,
uses: LabVIEW-Community-CI-CD/org-coding-hours-action@v7). Using pinned versions guarantees that your CI runs are repeatable and aren’t unexpectedly changed by new updates. (You can always upgrade to a newer version intentionally by updating the tag.)
(For contributors: once your pull request is merged, tagging and releasing happen automatically. Please follow conventional commit guidelines (using feat:, fix:, etc.) to help generate clear release notes.)
- Runner requirements: This action runs inside a Docker container and thus requires a Linux runner (e.g., `ubuntu-latest`). Ensure your workflow uses an appropriate runner; Windows and macOS runners are not supported for container actions.
- Authentication and permissions: If you are analyzing private repositories, make sure the job's `GITHUB_TOKEN` has access to those repos. In an organization, the default token usually has access to org repositories, but in some cases (forked repositories, or when using a fine-grained PAT) you may need to supply a Personal Access Token with the `repo` scope and pass it to the action (e.g., via an input or as the `GITHUB_TOKEN` env override). The action automatically uses the `GITHUB_TOKEN` environment variable for `git clone` authentication. If using the branch-push feature (`metrics_branch`), the token must have write permission to contents. On forked repositories, GitHub's default token has read-only permissions, so you'll need to explicitly enable workflow permissions or use your own PAT.
- Graceful failure behavior: The action is designed to fail fast if something goes wrong (it exits with an error if any repository cannot be cloned or if the `git-hours` tool encounters an issue). This marks the step as failed, preventing later steps from using incomplete data. If no commits are found within the `window_start` range (resulting in zero hours), the JSON reports are still generated (with totals of 0 hours) rather than causing a failure. In other words, an "empty" result is a successful run; the absence of a `reports/` directory would indicate a failure earlier in the process. To handle a "no data" scenario more gracefully, add a check in your workflow after the action step that verifies the `reports/` directory exists (and contains the expected files) before trying to upload or use them.
- Pinned tool versions: The action pins the `git-hours` tool version by default (`v0.1.2`) to ensure consistent behavior. You can override `git_hours_version` if a new version of the tool is released and you want to try it, but note that the Docker image must include or install that version for the change to take effect.
- Performance considerations: Analyzing many repositories can take several minutes, especially with large commit histories. Consider narrowing the `window_start` or running the action on a scheduled workflow (e.g., weekly) for long-term tracking.
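The "verify reports exist" check suggested above can be sketched as a shell step. The glob assumes the `reports/git-hours-*.json` naming scheme shown earlier:

```shell
# Count the JSON reports the action produced before later steps consume them.
count=$(ls reports/git-hours-*.json 2>/dev/null | wc -l)
echo "found $count report file(s)"
# To make this a hard guard in CI, fail when nothing was produced:
# [ "$count" -gt 0 ] || { echo "no reports generated" >&2; exit 1; }
```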