# GitLab CI template for Python
This project implements a GitLab CI/CD template to build, test and analyse your Python projects.
## Usage

This template can be used either as a CI/CD component or with the legacy `include:project` syntax.

### Use as a CI/CD component

Add the following to your `.gitlab-ci.yml`:
```yaml
include:
  # 1: include the component
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    # 2: set/override component inputs
    inputs:
      image: registry.hub.docker.com/library/python:3.12-slim
      pytest-enabled: true
```
### Use as a CI/CD template (legacy)

Add the following to your `.gitlab-ci.yml`:
```yaml
include:
  # 1: include the template
  - project: 'to-be-continuous/python'
    ref: '7.7.1'
    file: '/templates/gitlab-ci-python.yml'

variables:
  # 2: set/override template variables
  PYTHON_IMAGE: registry.hub.docker.com/library/python:3.12-slim
  PYTEST_ENABLED: "true"
```
## Global configuration

The Python template uses some global configuration parameters throughout all jobs.
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `image` / `PYTHON_IMAGE` | The Docker image used to run Python. ⚠️ set the version required by your project | `registry.hub.docker.com/library/python:3-slim` |
| `project-dir` / `PYTHON_PROJECT_DIR` | Python project root directory | `.` |
| `build-system` / `PYTHON_BUILD_SYSTEM` | Python build system used to install dependencies, build and package the project (see below) | `auto` (auto-detect) |
| `PIP_INDEX_URL` | Python repository URL | _none_ |
| `PIP_EXTRA_INDEX_URL` | Extra Python repository URL | _none_ |
| `pip-opts` / `PIP_OPTS` | pip extra options | _none_ |
| `extra-deps` / `PYTHON_EXTRA_DEPS` | Python extra sets of dependencies to install (Setuptools or Poetry only) | _none_ |
| `reqs-file` / `PYTHON_REQS_FILE` | Main requirements file, relative to `$PYTHON_PROJECT_DIR` (Requirements Files build system only) | `requirements.txt` |
| `extra-reqs-files` / `PYTHON_EXTRA_REQS_FILES` | Extra dev requirements file(s) to install, relative to `$PYTHON_PROJECT_DIR` | `requirements-dev.txt` |
| `py-publish-job-tags` / `PY_PUBLISH_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
The cache policy also takes care of the pip cache, so that Python dependencies are not downloaded over and over again.
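As an illustration, here is a sketch of overriding the global configuration for a hypothetical monorepo where the Python project lives in a `backend/` subdirectory (the directory name and Python version are assumptions, not defaults):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      # assumption: the Python project lives in backend/
      project-dir: "backend"
      # pin the Python version required by the project
      image: registry.hub.docker.com/library/python:3.12-slim
```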
## Multi build-system support

The Python template supports the most popular dependency management & build systems.

By default it tries to auto-detect the build system used by the project (based on the presence of `pyproject.toml`, `setup.py` and/or `requirements.txt`), but the build system can also be set explicitly with the `$PYTHON_BUILD_SYSTEM` variable:
| Value | Build System (scope) |
| --- | --- |
| _none_ (default) or `auto` | The template tries to auto-detect the actual build system |
| `setuptools` | Setuptools (dependencies, build & packaging) |
| `poetry` | Poetry (dependencies, build, test & packaging) |
| `uv` | uv (dependencies, build, test & packaging) |
| `pipenv` | Pipenv (dependencies only) |
| `reqfile` | Requirements Files (dependencies only) |
⚠️ You can explicitly pin the build tool version by including a version identifier in the `$PYTHON_BUILD_SYSTEM` variable. For example `PYTHON_BUILD_SYSTEM="poetry==1.1.15"`.
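As a minimal sketch, the pinned build tool can be set with the template variable syntax (the Poetry version shown is just an example, not a recommendation):

```yaml
variables:
  # pin both the build system and its version (example version)
  PYTHON_BUILD_SYSTEM: "poetry==1.1.15"
```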
## Jobs

### `py-package` job

This job builds your Python project's distribution packages.
It is bound to the `build` stage; it is disabled by default and can be enabled by setting `$PYTHON_PACKAGE_ENABLED` to `true`.
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `py-package-job-tags` / `PY_PACKAGE_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
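A minimal sketch of enabling the packaging job, using the template variable syntax (the equivalent `inputs:` form works too):

```yaml
variables:
  PYTHON_PACKAGE_ENABLED: "true"
```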
### Lint jobs

#### `py-lint` job

This job is disabled by default and performs code analysis based on the pylint Python lib.
It is activated by setting `$PYLINT_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `pylint-enabled` / `PYLINT_ENABLED` | Set to `true` to enable the pylint job | _none_ (disabled) |
| `pylint-args` / `PYLINT_ARGS` | Additional pylint CLI options | _none_ |
| `pylint-files` / `PYLINT_FILES` | Files or directories to analyse | _none_ (by default analyses all found Python source files) |
| `py-lint-job-tags` / `PY_LINT_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
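For instance, a sketch enabling pylint with extra CLI options (the disabled message ID is purely illustrative, not a recommendation of this template):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      pylint-enabled: true
      # illustrative: silence pylint's missing-module-docstring message
      pylint-args: "--disable=C0114"
```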
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-lint.codeclimate.json` | Code Climate | GitLab integration |
| `$PYTHON_PROJECT_DIR/reports/py-lint.parseable.txt` | parseable | SonarQube integration |
### Test jobs

The Python template features four alternative test jobs:

- `py-unittest`, which performs tests based on the unittest Python lib,
- `py-pytest`, which performs tests based on the pytest Python lib,
- `py-nosetests`, which performs tests based on the nose Python lib,
- or `py-compile`, which performs byte code generation to check the syntax if no tests are available.
#### `py-unittest` job

This job is disabled by default and performs tests based on the unittest Python lib.
It is activated by setting `$UNITTEST_ENABLED` to `true`.
In order to produce JUnit test reports, the tests are executed with the xmlrunner module.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `unittest-enabled` / `UNITTEST_ENABLED` | Set to `true` to enable the unittest job | _none_ (disabled) |
| `unittest-args` / `UNITTEST_ARGS` | Additional xmlrunner/unittest CLI options | _none_ |
| `py-unittest-job-tags` / `PY_UNITTEST_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
ℹ️ use a `.coveragerc` file at the root of your Python project to control the coverage settings.

Example:

```ini
[run]
# enables branch coverage
branch = True
# list of directories/packages to cover
source =
    module_1
    module_2
```
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | xUnit test report(s) | GitLab integration & SonarQube integration |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | Cobertura XML coverage report | GitLab integration & SonarQube integration |
#### `py-pytest` job

This job is disabled by default and performs tests based on the pytest Python lib.
It is activated by setting `$PYTEST_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `pytest-enabled` / `PYTEST_ENABLED` | Set to `true` to enable the pytest job | _none_ (disabled) |
| `pytest-args` / `PYTEST_ARGS` | Additional pytest or pytest-cov CLI options | _none_ |
| `py-pytest-job-tags` / `PY_PYTEST_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
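For example, a sketch enabling pytest with extra options (the pytest flags below are illustrative, not defaults of this template):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      pytest-enabled: true
      # illustrative pytest options: stop after the first failure, verbose output
      pytest-args: "--maxfail=1 -v"
```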
ℹ️ use a `.coveragerc` file at the root of your Python project to control the coverage settings.

Example:

```ini
[run]
# enables branch coverage
branch = True
# list of directories/packages to cover
source =
    module_1
    module_2
```
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | xUnit test report(s) | GitLab integration & SonarQube integration |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | Cobertura XML coverage report | GitLab integration & SonarQube integration |
#### `py-nosetests` job

This job is disabled by default and performs tests based on the nose Python lib.
It is activated by setting `$NOSETESTS_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `nosetests-enabled` / `NOSETESTS_ENABLED` | Set to `true` to enable the nose job | _none_ (disabled) |
| `nosetests-args` / `NOSETESTS_ARGS` | Additional nose CLI options | _none_ |
| `py-nosetests-job-tags` / `PY_NOSETESTS_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
By default, coverage is measured on all the project directories. You can restrict it to your packages by setting the `$NOSE_COVER_PACKAGE` variable.
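A sketch of enabling nose tests with coverage restricted to one package (the package name `my_package` is a placeholder):

```yaml
variables:
  NOSETESTS_ENABLED: "true"
  # assumption: my_package is the package to restrict coverage to
  NOSE_COVER_PACKAGE: "my_package"
```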
ℹ️ use a `.coveragerc` file at the root of your Python project to control the coverage settings.
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | xUnit test report(s) | GitLab integration & SonarQube integration |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | Cobertura XML coverage report | GitLab integration & SonarQube integration |
#### `py-compile` job

This job is a fallback, run when no unit test is set up (`$UNITTEST_ENABLED`, `$PYTEST_ENABLED` and `$NOSETESTS_ENABLED` are not set); it performs a `compileall`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `compile-args` / `PYTHON_COMPILE_ARGS` | compileall CLI options | `*` |
| `py-compile-job-tags` / `PY_COMPILE_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
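A sketch of narrowing the byte-compilation scope (the `src` directory is an assumption about your layout, not a template default):

```yaml
variables:
  # assumption: sources live in src/ — restrict byte-compilation to that directory
  PYTHON_COMPILE_ARGS: "src"
```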
### `py-bandit` job (SAST)

This job is disabled by default and performs a Bandit analysis.
It is bound to the `test` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `bandit-enabled` / `BANDIT_ENABLED` | Set to `true` to enable Bandit analysis | _none_ (disabled) |
| `bandit-args` / `BANDIT_ARGS` | Additional Bandit CLI options | `--recursive .` |
| `py-bandit-job-tags` / `PY_BANDIT_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
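For instance, a sketch enabling Bandit with narrowed options (the `src` directory and the skipped `B101` assert check are illustrative assumptions):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      bandit-enabled: true
      # illustrative: scan src/ only and skip Bandit's assert_used check (B101)
      bandit-args: "--recursive src --skip B101"
```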
In addition to a textual report in the console, this job produces the following reports, kept for one day and only available for download by users with the Developer role or higher:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-bandit.bandit.csv` | CSV | SonarQube integration (this report is only generated if the SonarQube template is detected) |
| `$PYTHON_PROJECT_DIR/reports/py-bandit.bandit.json` | JSON | DefectDojo integration (this report is only generated if the DefectDojo template is detected) |
### `py-trivy` job (dependency check)

This job performs a dependency check analysis using Trivy.

⚠️ This job is enabled by default since version 7.0.0.

It is bound to the `test` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `trivy-disabled` / `PYTHON_TRIVY_DISABLED` | Set to `true` to disable the Trivy job | _none_ (enabled) |
| `trivy-dist-url` / `PYTHON_TRIVY_DIST_URL` | URL of the `tar.gz` package for linux_amd64 of Trivy to use (e.g. `https://github.com/aquasecurity/trivy/releases/download/v0.51.1/trivy_0.51.1_Linux-64bit.tar.gz`). When unset, the latest version is used | _none_ |
| `trivy-args` / `PYTHON_TRIVY_ARGS` | Additional Trivy CLI options | `--vuln-type library` |
| `py-trivy-job-tags` / `PY_TRIVY_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
Other Trivy parameters shall be configured using Trivy environment variables. Examples:

- `TRIVY_SEVERITY`: severities of security issues to be displayed (comma-separated values among `UNKNOWN`, `LOW`, `MEDIUM`, `HIGH`, `CRITICAL`)
- `TRIVY_SERVER`: server address (enables client/server mode)
- `TRIVY_DB_REPOSITORY`: OCI repository to retrieve the Trivy database from
- ...
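A sketch of setting one of these Trivy environment variables from the pipeline (the severity threshold chosen here is only an example):

```yaml
variables:
  # only report high and critical vulnerabilities
  TRIVY_SEVERITY: "HIGH,CRITICAL"
```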
⚠️ if you're using Trivy in multiple templates with different parameter values (e.g. a different `TRIVY_SEVERITY` threshold in the Python and, say, the Docker templates), then it is recommended to pass the configuration as CLI options using the `trivy-args` input / `PYTHON_TRIVY_ARGS` variable.
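In that multi-template scenario, a sketch of scoping the severity threshold to the Python template only (the added `--severity` value is illustrative; `--vuln-type library` repeats the default so it is not lost when overriding):

```yaml
variables:
  # keep the default scope and add a severity threshold scoped to this template only
  PYTHON_TRIVY_ARGS: "--vuln-type library --severity HIGH,CRITICAL"
```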
In addition to a textual report in the console, this job produces the following reports, kept for one day and only available for download by users with the Developer role or higher:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-trivy.trivy.json` | JSON | DefectDojo integration (this report is only generated if the DefectDojo template is detected) |
### `py-sbom` job

This job generates an SBOM file listing all dependencies, using syft.
It is bound to the `test` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `sbom-disabled` / `PYTHON_SBOM_DISABLED` | Set to `true` to disable this job | _none_ |
| `sbom-syft-url` / `PYTHON_SBOM_SYFT_URL` | URL of the `tar.gz` package for linux_amd64 of Syft to use (e.g. `https://github.com/anchore/syft/releases/download/v0.62.3/syft_0.62.3_linux_amd64.tar.gz`). When unset, the latest version is used | _none_ |
| `sbom-name` / `PYTHON_SBOM_NAME` | Component name of the emitted SBOM | `$CI_PROJECT_PATH/$PYTHON_PROJECT_DIR` |
| `sbom-opts` / `PYTHON_SBOM_OPTS` | Options for syft used for SBOM analysis | `--override-default-catalogers python-package-cataloger` |
| `py-sbom-job-tags` / `PY_SBOM_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
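For reproducible builds you may want to pin the Syft version instead of always using the latest; a sketch reusing the example URL from the table above:

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      # pin Syft to a fixed release (example version from the table above)
      sbom-syft-url: "https://github.com/anchore/syft/releases/download/v0.62.3/syft_0.62.3_linux_amd64.tar.gz"
```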
In addition to logs in the console, this job produces the following reports, kept for one week:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-sbom.cyclonedx.json` | CycloneDX JSON | Security & Compliance integration |
### `py-black` job

This job is disabled by default and runs black on the repository. It is bound to the `build` stage.
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `black-enabled` / `PYTHON_BLACK_ENABLED` | Set to `true` to enable the black job | _none_ (disabled) |
| `py-black-job-tags` / `PY_BLACK_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
### `py-isort` job

This job is disabled by default and runs isort on the repository. It is bound to the `build` stage.
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `isort-enabled` / `PYTHON_ISORT_ENABLED` | Set to `true` to enable the isort job | _none_ (disabled) |
| `py-isort-job-tags` / `PY_ISORT_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
### `py-ruff` job

This job is disabled by default and runs Ruff on the repository. It is bound to the `build` stage.
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `ruff-enabled` / `RUFF_ENABLED` | Set to `true` to enable the ruff job | _none_ (disabled) |
| `ruff-args` / `RUFF_ARGS` | Additional Ruff Linter CLI options | _none_ |
| `py-ruff-job-tags` / `PY_RUFF_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
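For example, a sketch enabling Ruff with a restricted rule set (the selected rule prefixes are illustrative, not template defaults):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      ruff-enabled: true
      # illustrative: restrict Ruff to pycodestyle (E) and Pyflakes (F) rules
      ruff-args: "--select E,F"
```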
⚠️ Ruff can replace isort, Bandit, Pylint and much more.
In addition to logs in the console, this job produces the following reports, kept for one week:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-ruff.gitlab.json` | GitLab | GitLab integration |
| `$PYTHON_PROJECT_DIR/reports/py-ruff.native.json` | JSON | SonarQube integration (this report is only generated if the SonarQube template is detected) |
### `py-ruff-format` job

This job is disabled by default and runs Ruff's formatter on the repository. It is bound to the `build` stage.

| Input / Variable | Description | Default value |
| --- | --- | --- |
| `ruff-format-enabled` / `RUFF_FORMAT_ENABLED` | Set to `true` to enable the ruff-format job | _none_ (disabled) |
⚠️ Ruff can replace Black and much more.
### `py-mypy` job

This job is disabled by default and performs code analysis based on mypy.
It is activated by setting `$MYPY_ENABLED` to `true`.
It is bound to the `build` stage, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `mypy-enabled` / `MYPY_ENABLED` | Set to `true` to enable the mypy job | _none_ (disabled) |
| `mypy-args` / `MYPY_ARGS` | Additional mypy CLI options | _none_ |
| `mypy-files` / `MYPY_FILES` | Files or directories to analyse | _none_ (by default analyses all found Python source files) |
| `py-mypy-job-tags` / `PY_MYPY_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
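For instance, a sketch enabling mypy with an extra option (the flag shown is illustrative, not a template default):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      mypy-enabled: true
      # illustrative: don't fail on untyped third-party imports
      mypy-args: "--ignore-missing-imports"
```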
In addition to a textual report in the console, this job produces the following reports, kept for one day:
| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-mypy.codeclimate.json` | Code Climate | GitLab integration |
| `$PYTHON_PROJECT_DIR/reports/py-mypy.console.txt` | mypy console output | SonarQube integration |
### SonarQube analysis

If you're using the SonarQube template to analyse your Python code, here is a sample `sonar-project.properties` file:
```properties
# see: https://docs.sonarsource.com/sonarqube-server/latest/analyzing-source-code/test-coverage/python-test-coverage/
# set your source directory(ies) here (relative to the sonar-project.properties file)
sonar.sources=.
# exclude unwanted directories and files from being analysed
sonar.exclusions=**/test_*.py
# set your tests directory(ies) here (relative to the sonar-project.properties file)
sonar.tests=.
sonar.test.inclusions=**/test_*.py
# tests report: xUnit format
sonar.python.xunit.reportPath=reports/TEST-*.xml
# coverage report: Cobertura format
sonar.python.coverage.reportPaths=reports/py-coverage.cobertura.xml
# pylint: parseable format (if enabled)
sonar.python.pylint.reportPaths=reports/py-lint.parseable.txt
# Bandit: CSV format (if enabled)
sonar.python.bandit.reportPaths=reports/py-bandit.bandit.csv
# Ruff: JSON format (if enabled)
sonar.python.ruff.reportPaths=reports/py-ruff.native.json
# mypy: console output (if enabled)
sonar.python.mypy.reportPaths=reports/py-mypy.console.txt
```
### `py-release` job

This job is disabled by default and performs a complete release of your Python code:

- increases the Python project version,
- commits the changes and creates a Git tag with the new version number,
- builds the Python packages,
- publishes the built packages to a PyPI compatible repository (GitLab packages by default).

The Python template supports three packaging systems:

- Poetry: uses Poetry-specific version, build and publish commands.
- uv: uses uv's own version, build and publish commands.
- Setuptools: uses bump-my-version for version management, build as the package builder and Twine to publish.

The release job is bound to the `publish` stage, appears only on production and integration branches, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `release-enabled` / `PYTHON_RELEASE_ENABLED` | Set to `true` to enable the release job | _none_ (disabled) |
| `auto-release-enabled` / `PYTHON_AUTO_RELEASE_ENABLED` | When set, the job starts automatically on the production branch. When not set (default), the job is manual. Note that this behavior also depends on `release-enabled` being set. | _none_ (disabled) |
| `release-next` / `PYTHON_RELEASE_NEXT` | The part of the version to increase (one of: `major`, `minor`, `patch`) | `minor` |
| `semrel-release-disabled` / `PYTHON_SEMREL_RELEASE_DISABLED` | Set to `true` to disable semantic-release integration | _none_ (disabled) |
| `GIT_USERNAME` | Git username for Git push operations (see below) | _none_ |
| 🔒 `GIT_PASSWORD` | Git password for Git push operations (see below) | _none_ |
| 🔒 `GIT_PRIVATE_KEY` | SSH key for Git push operations (see below) | _none_ |
| `release-commit-message` / `PYTHON_RELEASE_COMMIT_MESSAGE` | The Git commit message to use on the release commit. This is templated using the Python Format String Syntax. Available in the template context are `current_version` and `new_version`. | `chore(python-release): {current_version} → {new_version}` |
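A minimal sketch of enabling the release job and overriding the bumped version part (the choice of `patch` is only an example):

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    inputs:
      release-enabled: true
      # bump the patch part instead of the default minor
      release-next: "patch"
```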
When the `py-release` job is enabled, the `py-publish` job is automatically enabled too.
### `py-publish` job

This job is disabled by default and publishes the built packages to a PyPI compatible repository (GitLab packages by default).

The Python template supports three packaging systems:

- Poetry: uses Poetry-specific version, build and publish commands.
- uv: uses uv's own version, build and publish commands.
- Setuptools: uses bump-my-version for version management, build as the package builder and Twine to publish.

The publish job is bound to the `publish` stage, is executed on a Git tag matching the semantic versioning pattern, and uses the following variables:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `publish-enabled` / `PYTHON_PUBLISH_ENABLED` | Set to `true` to enable the publish job | _none_ (disabled) |
| `repository-url` / `PYTHON_REPOSITORY_URL` | Target PyPI repository to publish packages to | _GitLab project's PyPI packages repository_ |
| `PYTHON_REPOSITORY_USERNAME` | Target PyPI repository username credential | `gitlab-ci-token` |
| 🔒 `PYTHON_REPOSITORY_PASSWORD` | Target PyPI repository password credential | `$CI_JOB_TOKEN` |
| `py-release-job-tags` / `PY_RELEASE_JOB_TAGS` | Tags to be used for selecting runners for the job | `[]` |
#### Setuptools tip

If you're using a Setuptools configuration, then you will have to write a `.bumpversion.toml` or `pyproject.toml` file.

Example of `.bumpversion.toml` file:
```toml
[tool.bumpversion]
current_version = "0.0.0"
```
Example of `pyproject.toml` file:

```toml
[project]
name = "project-name"

[build-system]
requires = ["setuptools"]
build-backend = "setuptools.build_meta"

[tool.bumpversion]
current_version = "0.0.0"

[[tool.bumpversion.files]]
filename = "project-name/__init__.py"
```
#### semantic-release integration

If you activate the `semantic-release-info` job from the semantic-release template, the `py-release` job will rely on the generated next version info.
Thus, a release is performed only if a next semantic release is present.

You should disable the `semantic-release` job (it's the `py-release` job that performs the release, so only the `semantic-release-info` job is needed) by setting `SEMREL_RELEASE_DISABLED` to `true`.

Finally, the semantic-release integration can be disabled with the `PYTHON_SEMREL_RELEASE_DISABLED` variable.
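The semantic-release setup described above can be sketched as follows (assuming the semantic-release template is also included in your pipeline):

```yaml
variables:
  # let py-release perform the release; only semantic-release-info is needed
  SEMREL_RELEASE_DISABLED: "true"
  PYTHON_RELEASE_ENABLED: "true"
```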
### Git authentication

A Python release involves some Git push operations.
You can use either an SSH key or user/password credentials.
#### Using an SSH key

We recommend using a project deploy key with write access to your project.
The key should not have a passphrase (see how to generate a new SSH key pair).

Specify 🔒 `$GIT_PRIVATE_KEY` as a secret project variable containing the private part of the deploy key:

```
-----BEGIN OPENSSH PRIVATE KEY-----
blablabla
-----END OPENSSH PRIVATE KEY-----
```

The template handles both classic and file variables.
#### Using user/password credentials

Simply specify 🔒 `$GIT_USERNAME` and 🔒 `$GIT_PASSWORD` as secret project variables.

Note that the password should be an access token (preferably a project or group access token) with `write_repository` scope and the `Maintainer` role.
### Pip repositories

When depending on Python packages published in GitLab's packages registry, it can be useful to configure a group-level package registry. Such a repository requires authenticated access.

To do so, simply set `PIP_INDEX_URL` and use the CI job token:
```yaml
variables:
  PIP_INDEX_URL: "${CI_SERVER_PROTOCOL}://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}:${CI_SERVER_PORT}/api/v4/groups/<group-id>/-/packages/pypi/simple"
```
In a corporate environment, you may face two repositories: the corporate proxy-cache and the project repository.
Simply use both `PIP_INDEX_URL` and `PIP_EXTRA_INDEX_URL`:
```yaml
variables:
  PIP_INDEX_URL: "https://cache.corp/repository/pypi/simple"
  PIP_EXTRA_INDEX_URL: "${CI_SERVER_PROTOCOL}://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}:${CI_SERVER_PORT}/api/v4/groups/<group-id>/-/packages/pypi/simple"
```
## Variants

The Python template can be used in conjunction with template variants to cover specific cases.
### Vault variant

This variant allows delegating your secrets management to a Vault server.

#### Configuration

In order to be able to communicate with the Vault server, the variant requires the following additional configuration parameters:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `TBC_VAULT_IMAGE` | The Vault Secrets Provider image to use (can be overridden) | `registry.gitlab.com/to-be-continuous/tools/vault-secrets-provider:latest` |
| `vault-base-url` / `VAULT_BASE_URL` | The Vault server base API URL | _must be defined_ |
| `vault-oidc-aud` / `VAULT_OIDC_AUD` | The `aud` claim for the JWT | `$CI_SERVER_URL` |
| 🔒 `VAULT_ROLE_ID` | The AppRole RoleID | _none_ |
| 🔒 `VAULT_SECRET_ID` | The AppRole SecretID | _none_ |
By default, the variant authenticates using a JWT ID token. To use AppRole instead, `VAULT_ROLE_ID` and `VAULT_SECRET_ID` should be defined as secret project variables.
#### Usage

You may then retrieve any of your secret(s) from Vault using the following syntax:

```
@url@http://vault-secrets-provider/api/secrets/{secret_path}?field={field}
```
With:

| Parameter | Description |
| --- | --- |
| `secret_path` (path parameter) | your secret location in the Vault server |
| `field` (query parameter) | parameter to access a single basic field from the secret JSON payload |
#### Example

```yaml
include:
  # main component
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
  # Vault variant
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python-vault@7.7.1
    inputs:
      vault-base-url: "https://vault.acme.host/v1"
      # audience claim for JWT
      vault-oidc-aud: "https://vault.acme.host"

variables:
  # Secrets managed by Vault
  GIT_PASSWORD: "@url@http://vault-secrets-provider/api/secrets/b7ecb6ebabc231/git/semantic-release?field=group-access-token"
  GIT_PRIVATE_KEY: "@url@http://vault-secrets-provider/api/secrets/b7ecb6ebabc231/git/semantic-release?field=private-key"
  PYTHON_REPOSITORY_PASSWORD: "@url@http://vault-secrets-provider/api/secrets/b7ecb6ebabc231/pip-repo/repository?field=password"
```
### Google Cloud variant

This variant allows using the Python Google Cloud clients. It follows the "Authenticate for using client libraries" recommendation, with Application Default Credentials (ADC) and OIDC impersonation through Workload Identity Federation.

Requirements before using this variant with the Python Google Cloud clients:

- you must have a Workload Identity Federation pool,
- you must have a Service Account with enough permissions to run your Python job,
- optionally, you can define `GOOGLE_CLOUD_PROJECT` as a template variable to set the default Google project.
#### Configuration

The variant requires the following additional configuration parameters:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `gcp-oidc-aud` / `GCP_OIDC_AUD` | The `aud` claim for the JWT token (only required for OIDC authentication) | `$CI_SERVER_URL` |
| `gcp-oidc-provider` / `GCP_OIDC_PROVIDER` | Default Workload Identity Provider associated with GitLab to authenticate with OpenID Connect | _none_ |
| `gcp-oidc-account` / `GCP_OIDC_ACCOUNT` | Default Service Account to impersonate with OpenID Connect authentication | _none_ |
#### Example

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    # 2: set/override component inputs
    inputs:
      image: registry.hub.docker.com/library/python:3.12-slim
      pytest-enabled: true
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python-gcp@7.7.1
    inputs:
      # common OIDC config for non-prod envs
      gcp-oidc-provider: "projects/<gcp_nonprod_proj_id>/locations/global/workloadIdentityPools/<pool_id>/providers/<provider_id>"
      gcp-oidc-account: "<name>@<gcp_nonprod_proj_id>.iam.gserviceaccount.com"
```
### AWS CodeArtifact variant

This variant allows using PyPI packages from AWS CodeArtifact.

It authenticates with AWS CodeArtifact, then retrieves and sets the following environment variables:

- `CODEARTIFACT_AUTH_TOKEN`: the AWS CodeArtifact authentication token,
- `CODEARTIFACT_REPOSITORY_ENDPOINT`: the AWS CodeArtifact repository endpoint,
- `CODEARTIFACT_URL`: the formatted URL for the AWS CodeArtifact repository.

Most importantly, the variant sets pip's `global.index-url` to the CodeArtifact URL.

The variant supports two authentication methods:

- federated authentication using OpenID Connect (the recommended method),
- or basic authentication with an AWS access key ID & secret access key.

⚠️ when using this variant, you must have created the CodeArtifact repository beforehand.
#### Configuration

The variant requires the following additional configuration parameters:
| Input / Variable | Description | Default value |
| --- | --- | --- |
| `TBC_AWS_PROVIDER_IMAGE` | The AWS Auth Provider image to use (can be overridden) | `registry.gitlab.com/to-be-continuous/tools/aws-auth-provider:latest` |
| `aws-region` / `AWS_REGION` | Default region (where the CodeArtifact repository is located) | _none_ |
| `aws-codeartifact-domain` / `AWS_CODEARTIFACT_DOMAIN` | The CodeArtifact domain name | _none_ |
| `aws-codeartifact-domain-owner` / `AWS_CODEARTIFACT_DOMAIN_OWNER` | The CodeArtifact domain owner account ID | _none_ |
| `aws-codeartifact-repository` / `AWS_CODEARTIFACT_REPOSITORY` | The CodeArtifact repository name | _none_ |
#### OIDC authentication config

This is the recommended authentication method. In order to use it, first carefully follow GitLab's documentation, then set the required configuration.

| Input / Variable | Description | Default value |
| --- | --- | --- |
| `aws-oidc-aud` / `AWS_OIDC_AUD` | The `aud` claim for the JWT token | `$CI_SERVER_URL` |
| `aws-oidc-role-arn` / `AWS_OIDC_ROLE_ARN` | Default IAM Role ARN associated with GitLab | _none_ |
#### Basic authentication config

| Variable | Description | Default value |
| --- | --- | --- |
| 🔒 `AWS_ACCESS_KEY_ID` | Default access key ID | _none_ (disabled) |
| 🔒 `AWS_SECRET_ACCESS_KEY` | Default secret access key | _none_ (disabled) |
#### Example

```yaml
include:
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python@7.7.1
    # 2: set/override component inputs
    inputs:
      image: registry.hub.docker.com/library/python:3.12-slim
      pytest-enabled: true
  - component: $CI_SERVER_FQDN/to-be-continuous/python/gitlab-ci-python-aws-codeartifact@7.7.1
    inputs:
      aws-region: "us-east-1"
      aws-codeartifact-domain: "acme"
      aws-codeartifact-domain-owner: "123456789012"
      aws-codeartifact-repository: "my-repo"
      # common OIDC config for non-prod envs
      aws-oidc-role-arn: "arn:aws:iam::123456789012:role/gitlab-ci"
```