# GitLab CI template for Python

This project implements a generic GitLab CI template for Python.

It provides several features, usable in different modes (by configuration).
## Usage

In order to include this template in your project, add the following to your `.gitlab-ci.yml`:

```yaml
include:
  - project: 'to-be-continuous/python'
    ref: '4.0.1'
    file: '/templates/gitlab-ci-python.yml'
```
## Global configuration

The Python template uses the following global configuration, shared by all jobs:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `PYTHON_IMAGE` | The Docker image used to run Python ⚠️ set the version required by your project | `python:3` |
| `PYTHON_PROJECT_DIR` | Python project root directory | `.` |
| `PYTHON_BUILD_SYSTEM` | Python build-system to use to install dependencies, build and package the project (see below) | none (auto-detect) |
| `PIP_INDEX_URL` | Python repository URL | none |
| `PIP_OPTS` | pip extra options | none |
| `PYTHON_EXTRA_DEPS` | Python extra sets of dependencies to install (for Setuptools or Poetry only) | none |
| `PYTHON_REQS_FILE` | Main requirements file, relative to `$PYTHON_PROJECT_DIR` (for Requirements Files build-system only) | `requirements.txt` |
| `PYTHON_EXTRA_REQS_FILES` | Extra dev requirements file(s) to install, relative to `$PYTHON_PROJECT_DIR` | `requirements-dev.txt` |

The cache policy also does what is necessary to manage the pip cache, so that Python dependencies are not downloaded over and over again.
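For example, a project might pin the Python version and point pip at a private index by overriding these variables in its `.gitlab-ci.yml` (the image tag, index URL and pip option below are illustrative placeholders):

```yaml
variables:
  # pin the Python version required by the project (illustrative tag)
  PYTHON_IMAGE: "python:3.11"
  # resolve dependencies from a private index (illustrative URL)
  PIP_INDEX_URL: "https://pypi.example.com/simple"
  # extra pip options (illustrative)
  PIP_OPTS: "--timeout 60"
```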
## Multi build-system support

The Python template supports the most popular dependency management & build systems.

By default it tries to auto-detect the build system used by the project (based on the presence of `pyproject.toml`, `setup.py` and/or `requirements.txt`), but the build system might also be set explicitly using the `$PYTHON_BUILD_SYSTEM` variable:

| Value | Build System (scope) |
| ----- | -------------------- |
| none (default) or `auto` | The template tries to auto-detect the actual build system |
| `setuptools` | Setuptools (dependencies, build & packaging) |
| `poetry` | Poetry (dependencies, build, test & packaging) |
| `pipenv` | Pipenv (dependencies only) |
| `reqfile` | Requirements Files (dependencies only) |
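For instance, if auto-detection would be ambiguous (say a project contains both a `setup.py` and a `requirements.txt`), the build system can be forced explicitly:

```yaml
variables:
  PYTHON_BUILD_SYSTEM: "setuptools"
```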
## Jobs

### `py-package` job

This job builds your Python project distribution packages.

It is bound to the `build` stage. It is disabled by default and can be enabled by setting `$PYTHON_PACKAGE_ENABLED` to `true`.
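For example, in your `.gitlab-ci.yml`:

```yaml
variables:
  PYTHON_PACKAGE_ENABLED: "true"
```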
### Lint jobs

#### `py-pylint` job

This job is disabled by default and performs code analysis based on the pylint Python lib.
It is activated by setting `$PYLINT_ENABLED` to `true`.

It is bound to the `build` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `PYLINT_ARGS` | Additional pylint CLI options | none |
| `PYLINT_FILES` | Files or directories to analyse | none (by default analyses all found Python source files) |

This job produces the following artifacts, kept for one day:

- Code quality JSON report in Code Climate format.
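A minimal configuration enabling the job and narrowing its scope (the directory and disabled check are illustrative):

```yaml
variables:
  PYLINT_ENABLED: "true"
  # restrict analysis to a directory (illustrative path)
  PYLINT_FILES: "src"
  # extra pylint options (illustrative)
  PYLINT_ARGS: "--disable=missing-module-docstring"
```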
### Test jobs

The Python template features four alternative test jobs:

- `py-unittest`, which performs tests based on the unittest Python lib,
- or `py-pytest`, which performs tests based on the pytest Python lib,
- or `py-nosetest`, which performs tests based on the nose Python lib,
- or `py-compile`, which performs byte code generation to check the syntax if no tests are available.
#### `py-unittest` job

This job is disabled by default and performs tests based on the unittest Python lib.
It is activated by setting `$UNITTEST_ENABLED` to `true`.

In order to produce JUnit test reports, the tests are executed with the xmlrunner module.

It is bound to the `build` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `UNITTEST_ARGS` | Additional xmlrunner/unittest CLI options | none |

This job produces the following artifacts, kept for one day:

- JUnit test report (using the xmlrunner module),
- code coverage report (Cobertura XML format).

⚠️ Create a `.coveragerc` file at the root of your Python project to control the coverage settings. Example:

```ini
[run]
# enables branch coverage
branch = True

# list of directories/packages to cover
source =
    module_1
    module_2
```
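Enabling the job could then look like this (the extra unittest option is illustrative):

```yaml
variables:
  UNITTEST_ENABLED: "true"
  # illustrative: increase test output verbosity
  UNITTEST_ARGS: "-v"
```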
#### `py-pytest` job

This job is disabled by default and performs tests based on the pytest Python lib.
It is activated by setting `$PYTEST_ENABLED` to `true`.

It is bound to the `build` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `PYTEST_ARGS` | Additional pytest or pytest-cov CLI options | none |

This job produces the following artifacts, kept for one day:

- JUnit test report (with the `--junit-xml` argument),
- code coverage report (Cobertura XML format).

⚠️ Create a `.coveragerc` file at the root of your Python project to control the coverage settings. Example:

```ini
[run]
# enables branch coverage
branch = True

# list of directories/packages to cover
source =
    module_1
    module_2
```
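Enabling the job, here with an illustrative pytest-cov option restricting coverage to one package (replace `my_package` with your own):

```yaml
variables:
  PYTEST_ENABLED: "true"
  # illustrative: measure coverage on a single package
  PYTEST_ARGS: "--cov=my_package"
```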
#### `py-nosetest` job

This job is disabled by default and performs tests based on the nose Python lib.
It is activated by setting `$NOSETESTS_ENABLED` to `true`.

It is bound to the `build` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `NOSETESTS_ARGS` | Additional nose CLI options | none |

By default, coverage is computed on the whole directory. You can restrict it to your packages by setting the `NOSE_COVER_PACKAGE` variable.

This job produces the following artifacts, kept for one day:

- JUnit test report (with the `--with-xunit` argument),
- code coverage report (Cobertura XML format + HTML report).

⚠️ Create a `.coveragerc` file at the root of your Python project or use nose CLI options to control the coverage settings.
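For example (the package name is an illustrative placeholder):

```yaml
variables:
  NOSETESTS_ENABLED: "true"
  # restrict coverage to your package (illustrative name)
  NOSE_COVER_PACKAGE: "my_package"
```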
#### `py-compile` job

This job is a fallback if no unit test is set up (`$UNITTEST_ENABLED`, `$PYTEST_ENABLED` and `$NOSETESTS_ENABLED` are not set), and performs a `compileall`.

It is bound to the `build` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `PYTHON_COMPILE_ARGS` | compileall CLI options | `*` |
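A possible override of the compileall options (the quiet flag and directory are illustrative):

```yaml
variables:
  # illustrative: compile only the src directory, quietly
  PYTHON_COMPILE_ARGS: "-q src"
```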
### SonarQube analysis

If you're using the SonarQube template to analyse your Python code, here is a sample `sonar-project.properties` file:

```properties
# see: https://docs.sonarqube.org/latest/analysis/languages/python/
# set your source directory(ies) here (relative to the sonar-project.properties file)
sonar.sources=.
# exclude unwanted directories and files from being analysed
sonar.exclusions=**/test_*.py
# set your tests directory(ies) here (relative to the sonar-project.properties file)
sonar.tests=.
sonar.test.inclusions=**/test_*.py
# tests report: generic format
sonar.python.xunit.reportPath=reports/unittest/TEST-*.xml
# coverage report: Cobertura XML format
sonar.python.coverage.reportPaths=reports/coverage.xml
```
### `py-bandit` job (SAST)

This job is disabled by default and performs a Bandit analysis.

It is bound to the `test` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `BANDIT_ENABLED` | Set to `true` to enable Bandit analysis | none (disabled) |
| `BANDIT_ARGS` | Additional Bandit CLI options | `--recursive .` |

This job outputs a textual report in the console, and in case of failure also exports a JSON report in the `reports/` directory (relative to the project root dir).
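For example (the skipped check is illustrative):

```yaml
variables:
  BANDIT_ENABLED: "true"
  # illustrative: keep the default recursive scan but skip the assert_used check
  BANDIT_ARGS: "--recursive . --skip B101"
```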
### `py-safety` job (dependency check)

This job is disabled by default and performs a dependency check analysis using Safety.

It is bound to the `test` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `SAFETY_ENABLED` | Set to `true` to enable the Safety job | none (disabled) |
| `SAFETY_ARGS` | Additional Safety CLI options | `--full-report` |

This job outputs a textual report in the console, and in case of failure also exports a JSON report in the `reports/` directory (relative to the project root dir).
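For example (the ignored vulnerability ID is an illustrative placeholder, and `--ignore` is assumed to be available in your Safety version):

```yaml
variables:
  SAFETY_ENABLED: "true"
  # illustrative: ignore a vulnerability that has been reviewed and accepted
  SAFETY_ARGS: "--full-report --ignore 12345"
```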
### `py-trivy` job (dependency check)

This job is disabled by default and performs a dependency check analysis using Trivy.

It is bound to the `test` stage, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `PYTHON_TRIVY_ENABLED` | Set to `true` to enable the Trivy job | none (disabled) |
| `PYTHON_TRIVY_ARGS` | Additional Trivy CLI options | `--vuln-type library` |

This job outputs a textual report in the console, and in case of failure also exports a JSON report in the `reports/` directory (relative to the project root dir).
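For example (the severity filter is an illustrative extra option):

```yaml
variables:
  PYTHON_TRIVY_ENABLED: "true"
  # illustrative: keep the default scope and only report high/critical findings
  PYTHON_TRIVY_ARGS: "--vuln-type library --severity HIGH,CRITICAL"
```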
### `py-release` job

This job is disabled by default and performs a complete release of your Python code:

- increase the Python project version,
- commit the changes and create a Git tag with the new version number,
- build the Python packages,
- publish the built packages to a PyPI-compatible repository (GitLab packages by default).

The Python template supports two packaging systems:

- Poetry: uses Poetry-specific version, build and publish commands.
- Setuptools: uses Bumpversion for version management, build as the package builder and Twine to publish.

The release job is bound to the `publish` stage, appears only on production and integration branches, and uses the following variables:

| Name | Description | Default value |
| ---- | ----------- | ------------- |
| `PYTHON_RELEASE_ENABLED` | Set to `true` to enable the release job | none (disabled) |
| `PYTHON_RELEASE_NEXT` | The part of the version to increase (one of: `major`, `minor`, `patch`) | `minor` |
| `PYTHON_SEMREL_RELEASE_DISABLED` | Set to `true` to disable the semantic-release integration | none (disabled) |
| `GIT_USERNAME` | Git username for Git push operations (see below) | none |
| 🔒 `GIT_PASSWORD` | Git password for Git push operations (see below) | none |
| 🔒 `GIT_PRIVATE_KEY` | SSH key for Git push operations (see below) | none |
| `PYTHON_REPOSITORY_URL` | Target PyPI repository to publish packages to | GitLab project's PyPI packages repository |
| `PYTHON_REPOSITORY_USERNAME` | Target PyPI repository username credential | `gitlab-ci-token` |
| 🔒 `PYTHON_REPOSITORY_PASSWORD` | Target PyPI repository password credential | `$CI_JOB_TOKEN` |
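Enabling the release job, for example with patch-level version increments:

```yaml
variables:
  PYTHON_RELEASE_ENABLED: "true"
  # bump the patch part of the version on each release
  PYTHON_RELEASE_NEXT: "patch"
```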
#### Setuptools tip

If you're using a declarative `setup.cfg` file for your project's Setuptools configuration, then you will have to write a `.bumpversion.cfg` file to work around a bug that prevents Bumpversion from updating the project version in your `setup.cfg` file.

Example of `.bumpversion.cfg` file:

```ini
[bumpversion]
# same version as in your setup.cfg
current_version = 0.5.0

[bumpversion:file:setup.cfg]
# any additional config here
# see: https://github.com/peritus/bumpversion#file-specific-configuration
```
#### `semantic-release` integration

If you activate the `semantic-release-info` job from the semantic-release template, the `py-release` job will rely on the next version it computes.
As a consequence, a release is performed only if a next semantic release version is available.

You should disable the `semantic-release` job by setting `SEMREL_RELEASE_DISABLED` to `true`, since it's the `py-release` job that performs the release and only the `semantic-release-info` job is needed.

Finally, the semantic-release integration can be disabled altogether with the `PYTHON_SEMREL_RELEASE_DISABLED` variable.
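A typical setup combining the two templates could then declare:

```yaml
variables:
  # let py-release perform the release; only semantic-release-info is needed
  SEMREL_RELEASE_DISABLED: "true"
  PYTHON_RELEASE_ENABLED: "true"
```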
#### Git authentication

A Python release involves some Git push operations.
You can either use an SSH key or user/password credentials.

##### Using an SSH key

We recommend using a project deploy key with write access to your project.
The key should not have a passphrase (see how to generate a new SSH key pair).

Specify 🔒 `$GIT_PRIVATE_KEY` as a secret project variable containing the private part of the deploy key.

```
-----BEGIN OPENSSH PRIVATE KEY-----
blablabla
-----END OPENSSH PRIVATE KEY-----
```

The template handles both classic variables and file variables.

##### Using user/password credentials

Simply specify 🔒 `$GIT_USERNAME` and 🔒 `$GIT_PASSWORD` as secret project variables.

Note that the password should be an access token (preferably a project or group access token) with `read_repository` and `write_repository` scopes.