Development

Below are the details to set up a development environment and run tests.

Versioning

This library adheres to Semantic Versioning. Releases are automated using Release Please, which analyzes commit messages to determine version bumps.

Processes

Conventional Commit Messages

This repository uses Conventional Commits to structure commit messages. This standard is essential for the automated release process managed by Release Please, which parses your Git history to create GitHub and PyPI releases.
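As an illustration only (this is a simplified pattern, not the exact grammar Release Please implements), a conventional commit header such as `feat(toolbox-core): add retry support` breaks down into a type, an optional scope, an optional breaking-change marker, and a description:

```python
import re

# Hedged sketch: a simplified Conventional Commits header pattern.
# Release Please's actual parser handles more of the spec (footers,
# multi-line bodies, BREAKING CHANGE notes).
HEADER = re.compile(
    r"^(?P<type>\w+)"          # feat, fix, chore, docs, ...
    r"(?:\((?P<scope>[^)]*)\))?"  # optional scope, e.g. (toolbox-core)
    r"(?P<breaking>!)?"        # optional breaking-change marker
    r": (?P<desc>.+)$"         # description after ": "
)

def parse_header(header: str):
    """Return the header's parts as a dict, or None if it doesn't conform."""
    match = HEADER.match(header)
    return match.groupdict() if match else None
```

For example, `parse_header("update readme")` returns `None`, which is the kind of non-conforming message the automated release tooling cannot classify.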

Install

  1. Clone the repository:
    git clone https://github.com/googleapis/mcp-toolbox-sdk-python
  2. Navigate to the specific package directory you wish to work on (e.g., toolbox-core, toolbox-adk, etc.):
    cd mcp-toolbox-sdk-python/packages/<PACKAGE_NAME>
  3. Install the package in editable mode, so changes are reflected without reinstall:
    pip install -e .

Important

(For non-toolbox-core packages) If you are working on an orchestration package (e.g., toolbox-adk or toolbox-langchain), pip may download the latest released version of toolbox-core as a dependency. To ensure you are testing against your local changes, install toolbox-core in editable mode after installing the target package:

    pip install -e ../toolbox-core

Tip

Using the -e option lets you make changes to the SDK code and have them reflected immediately without reinstalling the package.

Testing

Local Tests

To run tests locally for a specific package, ensure you have the necessary dependencies.

  1. Navigate to the package directory:
    cd mcp-toolbox-sdk-python/packages/<PACKAGE_NAME>
  2. Install the SDK and its test dependencies:
    pip install -e .[test]

Important

(For non-toolbox-core packages) If testing an orchestration package, install the local toolbox-core in editable mode after installing the test dependencies so that it overrides the published package:

    pip install -e ../toolbox-core

  3. Ensure your Toolbox service is running and accessible (if running integration tests).
  4. Run tests:
    pytest

Tip

You can run specific test files or modules:

pytest tests/test_client.py

Authentication in Local Tests

Integration tests involving authentication rely on the TOOLBOX_URL, TOOLBOX_VERSION, and GOOGLE_CLOUD_PROJECT environment variables. For local runs, you may need to mock or set up dummy authentication tokens. These tests generally leverage authentication methods from toolbox-core; refer to packages/toolbox-core/tests/conftest.py for examples.
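A minimal sketch (an illustration, not the repo's actual conftest.py) of reading those environment variables with dummy fallbacks so auth-dependent tests can still be collected locally:

```python
import os

# Hedged sketch: the real fixtures live in
# packages/toolbox-core/tests/conftest.py. The fallback values below
# are dummies for local runs, not real endpoints or projects.
def auth_test_config() -> dict:
    return {
        "toolbox_url": os.environ.get("TOOLBOX_URL", "http://127.0.0.1:5000"),
        "toolbox_version": os.environ.get("TOOLBOX_VERSION", "dev"),
        "project": os.environ.get("GOOGLE_CLOUD_PROJECT", "dummy-project"),
    }
```

In CI these variables are provided by the pipeline; locally, exporting them before running pytest (or relying on fallbacks like the ones above) keeps the auth-related tests runnable.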

Code Coverage

Tests are configured with pytest-cov to measure code coverage. Ensure your changes maintain or improve coverage.

Linting and Type Checking

The repository enforces code style and type adherence using black, isort, and mypy. To run these checks locally:

  1. Install test dependencies as described in the Local Tests section.
  2. Run the linters and type checker from the specific package directory:
    black --check .
    isort --check .
    MYPYPATH='./src' mypy --install-types --non-interactive --cache-dir=.mypy_cache/ -p <PACKAGE_MODULE_NAME>

Note

Replace <PACKAGE_MODULE_NAME> with the Python module name, e.g., toolbox_core, toolbox_adk, or toolbox_langchain.

CI and Validation Pipelines

This repository splits CI responsibilities between GitHub Actions and Google Cloud Build.

  • GitHub Actions: These are linting and type checks that run when a pull request is opened or synchronized.
  • Cloud Build: Live integration tests that run on PR creation and interact with live GCP instances.

Triggering Validations

Different triggers must be applied depending on the target pipeline:

  • Cloud Build (Integration Tests): To authorize external test builds against GCP environments, a repository maintainer must add a comment containing /gcbrun directly on the pull request thread.
  • GitHub Actions (Linting/Types): While these run automatically for trusted contributors, running them on pull requests from external forks requires a repository maintainer to apply the tests: run label to the pull request.

Contribution Process

For instructions regarding Contributor License Agreements (CLA), Code of Conduct, and general Code Review practices, please refer directly to CONTRIBUTING.md.

Releases & Pipelines

This project uses release-please for automated releases. The release pipeline produces several published Python packages.

Support

Check the existing GitHub Issues for any concerns.

Reporting Security Issues

For security-related concerns, please report them via g.co/vulnz.