    GitLab CI template for Python

    This project implements a generic GitLab CI template for Python.

    It provides several features, usable in different modes (selected by configuration), following the to-be-continuous recommendations.

    Usage

    In order to include this template in your project, add the following to your .gitlab-ci.yml:

    include:
      - project: 'to-be-continuous/python'
        ref: '1.2.2'
        file: '/templates/gitlab-ci-python.yml'

    Global configuration

    The Python template uses some global configuration that applies to all jobs.

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | PYTHON_IMAGE | The Docker image used to run Python. It is highly recommended to pin the specific version your project needs. | python:3 |
    | PIP_INDEX_URL | Python repository URL | none |
    | PYTHON_PROJECT_DIR | Python project root directory | . |
    | REQUIREMENTS_FILE | Path to the requirements file (relative to $PYTHON_PROJECT_DIR) | requirements.txt |
    | PIP_OPTS | pip extra options | none |

    The cache policy also declares the .cache/pip directory as cached, so that Python dependencies are not downloaded over and over again.
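
    As a sketch, these variables can be overridden in your .gitlab-ci.yml right after the include. The values below are illustrative only; the image tag and index URL are placeholders to adapt to your project.

    variables:
      # pin the exact Python version your project targets
      PYTHON_IMAGE: "python:3.10-slim"
      # optional: use an internal package index (placeholder URL)
      PIP_INDEX_URL: "https://pypi.example.org/simple"
      # project sources live at the repository root (the default)
      PYTHON_PROJECT_DIR: "."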

    The default configuration follows the standard Python project structure.

    Poetry support

    The Python template supports Poetry as packaging and dependency management tool.

    If pyproject.toml and poetry.lock files are detected at the root of your project structure, the requirements will automatically be generated from Poetry.

    ⚠️ As stated in the Poetry documentation, you should commit the poetry.lock file to your project repository so that everyone working on the project is locked to the same versions of dependencies.

    Jobs

    Lint jobs

    py-pylint job

    This job is disabled by default and performs code analysis based on pylint Python lib. It is activated by setting $PYLINT_ENABLED.

    It is bound to the build stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | PYLINT_ARGS | Additional pylint CLI options | none |
    | PYLINT_FILES | Files or directories to analyse | none (by default, all Python source files found are analysed) |

    This job produces the following artifacts, kept for one day:

    • Code Quality JSON report, in Code Climate format.
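
    A minimal enabling sketch for your .gitlab-ci.yml; the extra options and the package name (my_package) are examples only, not required values:

    variables:
      # enable the py-pylint job
      PYLINT_ENABLED: "true"
      # example extra options and an example package to analyse
      PYLINT_ARGS: "--disable=missing-module-docstring"
      PYLINT_FILES: "my_package"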

    Test jobs

    The Python template features four alternative test jobs:

    • py-unittest that performs tests based on unittest Python lib,
    • or py-pytest that performs tests based on pytest Python lib,
    • or py-nosetest that performs tests based on nose Python lib,
    • or py-compile that performs byte code generation to check syntax if no tests are available.

    py-unittest job

    This job is disabled by default and performs tests based on unittest Python lib. It is activated by setting $UNITTEST_ENABLED.

    In order to produce JUnit test reports, the tests are executed with the xmlrunner module.

    It is bound to the build stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | TEST_REQUIREMENTS_FILE | Path to the test requirements file (relative to $PYTHON_PROJECT_DIR) | test-requirements.txt |
    | UNITTEST_ARGS | Additional xmlrunner/unittest CLI options | none |

    This job produces the following artifacts, kept for one day:

    • JUnit test report (using the xmlrunner module)
    • code coverage report (cobertura xml format).

    ⚠️ create a .coveragerc file at the root of your Python project to control the coverage settings.

    Example:

    [run]
    # enables branch coverage
    branch = True
    # list of directories/packages to cover
    source = 
        module_1
        module_2
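
    A minimal enabling sketch for this job; the alternative test requirements path is only an example:

    variables:
      # enable the py-unittest job
      UNITTEST_ENABLED: "true"
      # optional: non-default test requirements file (example path)
      TEST_REQUIREMENTS_FILE: "requirements-dev.txt"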

    py-pytest job

    This job is disabled by default and performs tests based on pytest Python lib. It is activated by setting $PYTEST_ENABLED.

    It is bound to the build stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | TEST_REQUIREMENTS_FILE | Path to the test requirements file (relative to $PYTHON_PROJECT_DIR) | test-requirements.txt |
    | PYTEST_ARGS | Additional pytest or pytest-cov CLI options | none |

    This job produces the following artifacts, kept for one day:

    • JUnit test report (with the --junit-xml argument)
    • code coverage report (cobertura xml format).

    ⚠️ create a .coveragerc file at the root of your Python project to control the coverage settings.

    Example:

    [run]
    # enables branch coverage
    branch = True
    # list of directories/packages to cover
    source = 
        module_1
        module_2
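
    A minimal enabling sketch for this job; the extra options are examples and my_package is a placeholder package name:

    variables:
      # enable the py-pytest job
      PYTEST_ENABLED: "true"
      # example extra options: verbose output, coverage limited to a placeholder package
      PYTEST_ARGS: "-v --cov=my_package"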

    py-nosetest job

    This job is disabled by default and performs tests based on nose Python lib. It is activated by setting $NOSETESTS_ENABLED.

    It is bound to the build stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | TEST_REQUIREMENTS_FILE | Path to the test requirements file (relative to $PYTHON_PROJECT_DIR) | test-requirements.txt |
    | NOSETESTS_ARGS | Additional nose CLI options | none |

    By default, coverage is measured on the whole directory. You can restrict it to your own packages by setting the NOSE_COVER_PACKAGE variable (see the nose documentation for more info).

    This job produces the following artifacts, kept for one day:

    • JUnit test report (with the --with-xunit argument)
    • code coverage report (cobertura xml format + html report).

    ⚠️ create a .coveragerc file at the root of your Python project or use nose CLI options to control the coverage settings.
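
    A minimal enabling sketch for this job; the package name is a placeholder:

    variables:
      # enable the py-nosetest job
      NOSETESTS_ENABLED: "true"
      # restrict coverage to your own package(s)
      NOSE_COVER_PACKAGE: "my_package"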

    py-compile job

    This job is a fallback if no unit test job has been set up ($UNITTEST_ENABLED, $PYTEST_ENABLED and $NOSETESTS_ENABLED are not set), and performs a compileall.

    It is bound to the build stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | PYTHON_COMPILE_ARGS | compileall CLI options | * |

    SonarQube analysis

    If you're using the SonarQube template to analyse your Python code, here is a sample sonar-project.properties file:

    # see: https://docs.sonarqube.org/latest/analysis/languages/python/
    # set your source directory(ies) here (relative to the sonar-project.properties file)
    sonar.sources=.
    # exclude unwanted directories and files from being analysed
    sonar.exclusions=**/test_*.py
    
    # set your tests directory(ies) here (relative to the sonar-project.properties file)
    sonar.tests=.
    sonar.test.inclusions=**/test_*.py
    
    # unit tests report: xunit format
    sonar.python.xunit.reportPath=reports/unittest/TEST-*.xml
    # coverage report: Cobertura XML format
    sonar.python.coverage.reportPaths=reports/coverage.xml


    py-bandit job (SAST)

    This job is disabled by default and performs a Bandit analysis.

    It is bound to the test stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | BANDIT_ENABLED | Variable to enable Bandit analysis | none (disabled) |
    | BANDIT_ARGS | Additional Bandit CLI options | --recursive . |

    This job outputs a textual report in the console, and in case of failure also exports a JSON report in the reports/ directory (relative to project root dir).
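
    For example (the skipped check is only an illustration):

    variables:
      # enable the py-bandit job
      BANDIT_ENABLED: "true"
      # example: keep the recursive scan but skip the assert_used check (B101)
      BANDIT_ARGS: "--recursive --skip B101 ."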

    py-safety job (dependency check)

    This job is disabled by default and performs a dependency check analysis using Safety.

    It is bound to the test stage, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | SAFETY_ENABLED | Variable to enable Safety job | none (disabled) |
    | SAFETY_ARGS | Additional Safety CLI options | --full-report |

    This job outputs a textual report in the console, and in case of failure also exports a JSON report in the reports/ directory (relative to project root dir).
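
    For example (the ignored vulnerability id is a placeholder):

    variables:
      # enable the py-safety job
      SAFETY_ENABLED: "true"
      # example: keep the full report and ignore one (placeholder) vulnerability id
      SAFETY_ARGS: "--full-report --ignore 12345"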

    Publish jobs

    py-release job

    This job is disabled by default and performs automatic tagging of your Python code.

    • The bumpversion Python library is used for version management.
    • The job looks for an existing .bumpversion.cfg file at the project root. If found, it is used as the bumpversion configuration. Otherwise, the $RELEASE_VERSION_PART variable and setup.py are used instead.
    • Creating a Git tag involves an authenticated and authorized Git user.

    Don't use your personal password! Use an access token with write_repository rights instead. If you have a generic account, add it to the project and generate an access token from that account.

    It is bound to the publish stage, applies only to the master branch, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | RELEASE_VERSION_PART | The part of the version to increase (one of: major, minor, patch) | minor |
    | RELEASE_USERNAME | Username credential for git push | none (disabled) |
    | RELEASE_ACCESS_TOKEN | Password credential for git push | none |
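
    A possible setup (sketch only; the username is a placeholder): enable the job by defining the credentials, and keep the token out of the repository by declaring it as a masked/protected CI/CD variable in the project settings rather than in the file.

    variables:
      # bump the patch part instead of the default minor
      RELEASE_VERSION_PART: "patch"
      # user associated with the access token (placeholder name)
      RELEASE_USERNAME: "release-bot"
      # RELEASE_ACCESS_TOKEN should be declared as a masked CI/CD project variable,
      # not committed here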

    py-publish job

    This job is disabled by default and performs packaging and publication of your Python code.

    It is bound to the publish stage, applies only to Git tags, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | TWINE_REPOSITORY_URL | Where to publish your Python project | none (disabled) |
    | TWINE_USERNAME | Username credential to publish to $TWINE_REPOSITORY_URL | none (disabled) |
    | TWINE_PASSWORD | Password credential to publish to $TWINE_REPOSITORY_URL | none |
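
    As an illustrative sketch, publishing to your project's GitLab package registry might look like the following; double-check the exact registry URL and credentials in the GitLab and twine documentation, and provide TWINE_PASSWORD as a masked CI/CD variable rather than in the file.

    variables:
      # example: the project's GitLab PyPI package registry
      TWINE_REPOSITORY_URL: "${CI_API_V4_URL}/projects/${CI_PROJECT_ID}/packages/pypi"
      TWINE_USERNAME: "gitlab-ci-token"
      # TWINE_PASSWORD (e.g. ${CI_JOB_TOKEN}) should be provided as a masked CI/CD variable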

    More info:

    If you want to automatically create a tag and publish your Python package, please have a look here.

    py-docs job

    This job is disabled by default and generates documentation for your Python code using Sphinx. The documentation will be available as a GitLab artifact.

    It is bound to the publish stage, applies only to Git tags, and uses the following variables:

    | Name | Description | Default value |
    | ---- | ----------- | ------------- |
    | DOCS_ENABLED | Variable to enable the docs job | none (disabled) |
    | DOCS_REQUIREMENTS_FILE | Python dependencies for documentation generation (relative to $PYTHON_PROJECT_DIR) | docs-requirements.txt |
    | DOCS_DIRECTORY | Directory containing the docs source | docs |
    | DOCS_BUILD_DIR | Output build directory for the documentation | public |
    | DOCS_MAKE_ARGS | Arguments of the make command | html BUILDDIR=${DOCS_BUILD_DIR} |
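
    A minimal enabling sketch for this job; the docs directory value is only an example of overriding the default:

    variables:
      # enable the py-docs job
      DOCS_ENABLED: "true"
      # optional: non-default docs source directory (example value)
      DOCS_DIRECTORY: "documentation"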