GitLab CI template for Python

This project implements a GitLab CI/CD template to build, test and analyse your Python projects.

Usage

In order to include this template in your project, add the following to your .gitlab-ci.yml file:

```yaml
include:
  - project: 'to-be-continuous/python'
    ref: '6.1.3'
    file: '/templates/gitlab-ci-python.yml'
```

Global configuration

The Python template uses the following global configuration throughout all jobs.

| Name | Description | Default value |
| --- | --- | --- |
| `PYTHON_IMAGE` | The Docker image used to run Python ⚠️ set the version required by your project | `registry.hub.docker.com/library/python:3` |
| `PYTHON_PROJECT_DIR` | Python project root directory | `.` |
| `PYTHON_BUILD_SYSTEM` | Python build system to use to install dependencies, build and package the project (see below) | none (auto-detect) |
| `PIP_INDEX_URL` | Python repository URL | none |
| `PIP_EXTRA_INDEX_URL` | Extra Python repository URL | none |
| `PIP_OPTS` | pip extra options | none |
| `PYTHON_EXTRA_DEPS` | Python extra sets of dependencies to install (Setuptools or Poetry only) | none |
| `PYTHON_REQS_FILE` | Main requirements file, relative to `$PYTHON_PROJECT_DIR` (Requirements Files build system only) | `requirements.txt` |
| `PYTHON_EXTRA_REQS_FILES` | Extra dev requirements file(s) to install, relative to `$PYTHON_PROJECT_DIR` | `requirements-dev.txt` |
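As an example, these variables can be overridden from your .gitlab-ci.yml (the image tag and pip option below are illustrative values, not defaults):

```yaml
variables:
  # pin the Python version required by the project (3.11 is an example)
  PYTHON_IMAGE: "registry.hub.docker.com/library/python:3.11"
  # extra pip options (illustrative)
  PIP_OPTS: "--timeout 60"
```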

The template also implements a cache policy for the pip cache (so that Python dependencies are not downloaded over and over again).

Multi build-system support

The Python template supports the most popular dependency management & build systems.

By default it tries to auto-detect the build system used by the project (based on the presence of pyproject.toml and/or setup.py and/or requirements.txt), but the build system might also be set explicitly using the $PYTHON_BUILD_SYSTEM variable:

| Value | Build System (scope) |
| --- | --- |
| none (default) or `auto` | The template tries to auto-detect the actual build system |
| `setuptools` | Setuptools (dependencies, build & packaging) |
| `poetry` | Poetry (dependencies, build, test & packaging) |
| `pipenv` | Pipenv (dependencies only) |
| `reqfile` | Requirements Files (dependencies only) |

⚠️ You can explicitly select the build tool version by setting the $PYTHON_BUILD_SYSTEM variable with a version identifier. For example PYTHON_BUILD_SYSTEM="poetry==1.1.15".
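For instance, to force Poetry with a pinned version instead of relying on auto-detection:

```yaml
variables:
  # explicit build system with a pinned tool version
  PYTHON_BUILD_SYSTEM: "poetry==1.1.15"
```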

Jobs

py-package job

This job allows building your Python project distribution packages.

It is bound to the build stage, is disabled by default, and can be enabled by setting $PYTHON_PACKAGE_ENABLED to true.
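To enable it:

```yaml
variables:
  PYTHON_PACKAGE_ENABLED: "true"
```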

Lint jobs

py-pylint job

This job is disabled by default and performs code analysis based on pylint Python lib. It is activated by setting $PYLINT_ENABLED to true.

It is bound to the build stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `PYLINT_ARGS` | Additional pylint CLI options | none |
| `PYLINT_FILES` | Files or directories to analyse | none (by default analyses all found Python source files) |
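A possible configuration (the score threshold and `my_package` directory are illustrative assumptions, not defaults):

```yaml
variables:
  PYLINT_ENABLED: "true"
  # fail the job below a given pylint score (illustrative)
  PYLINT_ARGS: "--fail-under=9"
  # restrict analysis to one package (my_package is hypothetical)
  PYLINT_FILES: "my_package"
```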

In addition to a textual report in the console, this job produces the following reports, kept for one day:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-lint.codeclimate.json` | Code Climate | GitLab integration |
| `$PYTHON_PROJECT_DIR/reports/py-lint.parseable.txt` | parseable | SonarQube integration |

Test jobs

The Python template features four alternative test jobs:

  - py-unittest that performs tests based on the unittest Python lib,
  - or py-pytest that performs tests based on the pytest Python lib,
  - or py-nosetest that performs tests based on the nose Python lib,
  - or py-compile that performs byte code generation to check syntax if no tests are available.

py-unittest job

This job is disabled by default and performs tests based on unittest Python lib. It is activated by setting $UNITTEST_ENABLED to true.

In order to produce JUnit test reports, the tests are executed with the xmlrunner module.

It is bound to the build stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `UNITTEST_ARGS` | Additional xmlrunner/unittest CLI options | none |
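For example (the extra verbosity option is illustrative):

```yaml
variables:
  UNITTEST_ENABLED: "true"
  # increase test output verbosity (illustrative)
  UNITTEST_ARGS: "-v"
```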

ℹ️ use a .coveragerc file at the root of your Python project to control the coverage settings.

Example:

```ini
[run]
# enables branch coverage
branch = True
# list of directories/packages to cover
source =
    module_1
    module_2
```

In addition to a textual report in the console, this job produces the following reports, kept for one day:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | xUnit test report(s) | GitLab integration & SonarQube integration |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | Cobertura XML coverage report | GitLab integration & SonarQube integration |

py-pytest job

This job is disabled by default and performs tests based on pytest Python lib. It is activated by setting $PYTEST_ENABLED to true.

It is bound to the build stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `PYTEST_ARGS` | Additional pytest or pytest-cov CLI options | none |
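For example (the keyword filter is an illustrative value):

```yaml
variables:
  PYTEST_ENABLED: "true"
  # deselect tests marked/named "slow" (illustrative)
  PYTEST_ARGS: "-k 'not slow'"
```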

ℹ️ use a .coveragerc file at the root of your Python project to control the coverage settings.

Example:

```ini
[run]
# enables branch coverage
branch = True
# list of directories/packages to cover
source =
    module_1
    module_2
```

In addition to a textual report in the console, this job produces the following reports, kept for one day:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | xUnit test report(s) | GitLab integration & SonarQube integration |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | Cobertura XML coverage report | GitLab integration & SonarQube integration |

py-nosetest job

This job is disabled by default and performs tests based on nose Python lib. It is activated by setting $NOSETESTS_ENABLED to true.

It is bound to the build stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `NOSETESTS_ARGS` | Additional nose CLI options | none |

By default, coverage will be run on all the project directories. You can restrict it to your packages by setting the $NOSE_COVER_PACKAGE variable.
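For example (`my_package` is a hypothetical package name):

```yaml
variables:
  NOSETESTS_ENABLED: "true"
  # restrict coverage to your own package (my_package is hypothetical)
  NOSE_COVER_PACKAGE: "my_package"
```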

ℹ️ use a .coveragerc file at the root of your Python project to control the coverage settings.

In addition to a textual report in the console, this job produces the following reports, kept for one day:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/TEST-*.xml` | xUnit test report(s) | GitLab integration & SonarQube integration |
| `$PYTHON_PROJECT_DIR/reports/py-coverage.cobertura.xml` | Cobertura XML coverage report | GitLab integration & SonarQube integration |

py-compile job

This job is a fallback if no unit test job has been set up ($UNITTEST_ENABLED, $PYTEST_ENABLED and $NOSETESTS_ENABLED are not set), and performs a compileall.

It is bound to the build stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `PYTHON_COMPILE_ARGS` | compileall CLI options | `*` |

py-bandit job (SAST)

This job is disabled by default and performs a Bandit analysis.

It is bound to the test stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `BANDIT_ENABLED` | Set to `true` to enable Bandit analysis | none (disabled) |
| `BANDIT_ARGS` | Additional Bandit CLI options | `--recursive .` |
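For example (the `src` directory and skipped test id are illustrative assumptions):

```yaml
variables:
  BANDIT_ENABLED: "true"
  # scan only the sources directory and skip the assert_used check (illustrative)
  BANDIT_ARGS: "--recursive src --skip B101"
```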

In addition to a textual report in the console, this job produces the following reports, kept for one day:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-bandit.bandit.csv` | CSV | SonarQube integration (generated only if the SonarQube template is detected) |
| `$PYTHON_PROJECT_DIR/reports/py-bandit.bandit.json` | JSON | DefectDojo integration (generated only if the DefectDojo template is detected) |

py-safety job (dependency check)

This job is disabled by default and performs a dependency check analysis using Safety.

It is bound to the test stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `SAFETY_ENABLED` | Set to `true` to enable the Safety job | none (disabled) |
| `SAFETY_ARGS` | Additional Safety CLI options | `--full-report` |
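For example (the ignored vulnerability id is an illustrative value):

```yaml
variables:
  SAFETY_ENABLED: "true"
  # ignore a specific vulnerability id (39462 is illustrative)
  SAFETY_ARGS: "--full-report --ignore=39462"
```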

py-trivy job (dependency check)

This job is disabled by default and performs a dependency check analysis using Trivy.

It is bound to the test stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `PYTHON_TRIVY_ENABLED` | Set to `true` to enable the Trivy job | none (disabled) |
| `PYTHON_TRIVY_ARGS` | Additional Trivy CLI options | `--vuln-type library` |
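For example (the severity filter is an illustrative addition to the default options):

```yaml
variables:
  PYTHON_TRIVY_ENABLED: "true"
  # only report high/critical vulnerabilities (illustrative)
  PYTHON_TRIVY_ARGS: "--vuln-type library --severity HIGH,CRITICAL"
```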

In addition to a textual report in the console, this job produces the following reports, kept for one day:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-trivy.trivy.json` | JSON | DefectDojo integration (generated only if the DefectDojo template is detected) |

py-sbom job

This job generates an SBOM (Software Bill of Materials) file listing all dependencies, using Syft.

It is bound to the test stage, and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `PYTHON_SBOM_DISABLED` | Set to `true` to disable this job | none |
| `PYTHON_SBOM_SYFT_URL` | URL of the `linux_amd64` tar.gz package of Syft to use (ex: https://github.com/anchore/syft/releases/download/v0.62.3/syft_0.62.3_linux_amd64.tar.gz); when unset, the latest version will be used | none |
| `PYTHON_SBOM_OPTS` | Syft options used for SBOM analysis | `--catalogers python-index-cataloger` |
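For example, to pin the Syft version instead of downloading the latest release:

```yaml
variables:
  # pinned Syft release (URL taken from the example above)
  PYTHON_SBOM_SYFT_URL: "https://github.com/anchore/syft/releases/download/v0.62.3/syft_0.62.3_linux_amd64.tar.gz"
```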

In addition to logs in the console, this job produces the following reports, kept for one week:

| Report | Format | Usage |
| --- | --- | --- |
| `$PYTHON_PROJECT_DIR/reports/py-sbom.cyclonedx.json` | CycloneDX JSON | Security & Compliance integration |

SonarQube analysis

If you're using the SonarQube template to analyse your Python code, here is a sample sonar-project.properties file:

```properties
# see: https://docs.sonarqube.org/latest/analyzing-source-code/test-coverage/python-test-coverage/
# set your source directory(ies) here (relative to the sonar-project.properties file)
sonar.sources=.
# exclude unwanted directories and files from being analysed
sonar.exclusions=**/test_*.py

# set your tests directory(ies) here (relative to the sonar-project.properties file)
sonar.tests=.
sonar.test.inclusions=**/test_*.py

# tests report: xUnit format
sonar.python.xunit.reportPath=reports/TEST-*.xml
# coverage report: Cobertura format
sonar.python.coverage.reportPaths=reports/py-coverage.cobertura.xml
# pylint: parseable format (if enabled)
sonar.python.pylint.reportPaths=reports/py-lint.parseable.txt
# Bandit: CSV format (if enabled)
sonar.python.bandit.reportPaths=reports/py-bandit.bandit.csv
```


py-release job

This job is disabled by default and allows you to perform a complete release of your Python code:

  1. increase the Python project version,
  2. Git commit changes and create a Git tag with the new version number,
  3. build the Python packages,
  4. publish the built packages to a PyPI compatible repository (GitLab packages by default).

The Python template supports two packaging systems: Setuptools and Poetry.

The release job is bound to the publish stage, appears only on production and integration branches and uses the following variables:

| Name | Description | Default value |
| --- | --- | --- |
| `PYTHON_RELEASE_ENABLED` | Set to `true` to enable the release job | none (disabled) |
| `PYTHON_RELEASE_NEXT` | The part of the version to increase (one of: `major`, `minor`, `patch`) | `minor` |
| `PYTHON_SEMREL_RELEASE_DISABLED` | Set to `true` to disable semantic-release integration | none (disabled) |
| `GIT_USERNAME` | Git username for Git push operations (see below) | none |
| 🔒 `GIT_PASSWORD` | Git password for Git push operations (see below) | none |
| 🔒 `GIT_PRIVATE_KEY` | SSH key for Git push operations (see below) | none |
| `PYTHON_REPOSITORY_URL` | Target PyPI repository to publish packages | GitLab project's PyPI packages repository |
| `PYTHON_REPOSITORY_USERNAME` | Target PyPI repository username credential | `gitlab-ci-token` |
| 🔒 `PYTHON_REPOSITORY_PASSWORD` | Target PyPI repository password credential | `$CI_JOB_TOKEN` |
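For example, to enable the release job and bump the major version part instead of the default minor:

```yaml
variables:
  PYTHON_RELEASE_ENABLED: "true"
  # bump the major part on release (illustrative choice)
  PYTHON_RELEASE_NEXT: "major"
```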

Setuptools tip

If you're using a setup.cfg declarative file for your project's Setuptools configuration, then you will have to write a .bumpversion.cfg file to work around a bug that prevents Bumpversion from updating the project version in your setup.cfg file.

Example of .bumpversion.cfg file:

```ini
[bumpversion]
# same version as in your setup.cfg
current_version = 0.5.0

[bumpversion:file:setup.cfg]
# any additional config here
# see: https://github.com/peritus/bumpversion#file-specific-configuration
```

semantic-release integration

If you activate the semantic-release-info job from the semantic-release template, the py-release job will rely on the generated next version info. Thus, a release will be performed only if a next semantic release is present.

You should disable the semantic-release job by setting SEMREL_RELEASE_DISABLED to true (the py-release job performs the release, so only the semantic-release-info job is needed).

Finally, the semantic-release integration can be disabled with the PYTHON_SEMREL_RELEASE_DISABLED variable.

Git authentication

A Python release involves some Git push operations.

You can either use a SSH key or user/password credentials.

Using a SSH key

We recommend using a project deploy key with write access to your project.

The key should not have a passphrase (see how to generate a new SSH key pair).

Specify 🔒 $GIT_PRIVATE_KEY as secret project variable with the private part of the deploy key.

```
-----BEGIN OPENSSH PRIVATE KEY-----
blablabla
-----END OPENSSH PRIVATE KEY-----
```

The template handles both classic variable and file variable.

Using user/password credentials

Simply specify 🔒 $GIT_USERNAME and 🔒 $GIT_PASSWORD as secret project variables.

Note that the password should be an access token (preferably a project or group access token) with write_repository scope and Maintainer role.

Pip repositories

When depending on Python packages published in GitLab's packages registry, it can be useful to configure a group-level package registry as your pip index. Such a repository requires authenticated access.

To do so, simply set the PIP_INDEX_URL and use the CI job token.

```yaml
variables:
  PIP_INDEX_URL: "${CI_SERVER_PROTOCOL}://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}:${CI_SERVER_PORT}/api/v4/groups/<group-id>/-/packages/pypi/simple"
```

In a corporate environment, you may be faced with two repositories: the corporate proxy-cache and the project repository.

Simply use both PIP_INDEX_URL and PIP_EXTRA_INDEX_URL.

```yaml
variables:
  PIP_INDEX_URL: "https://cache.corp/repository/pypi/simple"
  PIP_EXTRA_INDEX_URL: "${CI_SERVER_PROTOCOL}://gitlab-ci-token:${CI_JOB_TOKEN}@${CI_SERVER_HOST}:${CI_SERVER_PORT}/api/v4/groups/<group-id>/-/packages/pypi/simple"
```