Commit c9787621 authored by Pierre Smeyers

Merge branch '14-full_integration_poetry' into 'master'

feat: full integration of poetry

Closes #14

See merge request to-be-continuous/python!12
parents 5ba4ef56 10a8150e
@@ -226,18 +226,17 @@ It is bound to the `test` stage, and uses the following variables:
This job outputs a **textual report** in the console, and in case of failure also exports a JSON report in the `reports/`
directory _(relative to project root dir)_.
### Package jobs
#### `py-package` job
This job is **disabled by default** and performs the packaging of your Python code.
It is bound to the `package-build` stage, applies only on git tags, and uses the following variables:
| Name | description | default value |
| --------------- | ---------------------------------------------------- | ------------- |
| `PYTHON_FORCE_PACKAGE` | Set to `true` to force the packaging even if not on a tag-related event | _none_ (disabled) |
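For example, a project that also wants packages built outside tag pipelines can force the job from its own `.gitlab-ci.yml`. The sketch below is illustrative only; the `include` project path, `ref` and template file name are assumptions to adapt to your setup:

```yaml
include:
  # assumed location and ref of this template; adjust to your environment
  - project: "to-be-continuous/python"
    ref: "master"
    file: "templates/gitlab-ci-python.yml"

variables:
  # run the py-package job even when the pipeline was not triggered by a tag
  PYTHON_FORCE_PACKAGE: "true"
```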
### Publish jobs
@@ -281,17 +280,7 @@ If you want to automatically create tag and publish your Python package, please
#### `py-docs` job
This job is no longer supported in this version of the template. It may come back later with a more generic and configurable implementation.
Previously, this job was **disabled by default** and generated documentation for your Python code using [Sphinx](http://www.sphinx-doc.org/en/master/), exposing the result as a GitLab artifact; it was bound to the `publish` stage, applied only on tags and used the following variables:
| Name | description | default value |
| ------------------------ | -------------------------------------------------------------------------------------- | --------------------------------- |
| `DOCS_ENABLED` | Set to `true` to enable pages job | _none_ (disabled) |
| `DOCS_REQUIREMENTS_FILE` | Python dependencies for documentation generation _(relative to `$PYTHON_PROJECT_DIR`)_ | `docs-requirements.txt` |
| `DOCS_DIRECTORY` | Directory containing docs source | `docs` |
| `DOCS_BUILD_DIR` | Output build directory for documentation | `public` |
| `DOCS_MAKE_ARGS` | Args of make command | `html BUILDDIR=${DOCS_BUILD_DIR}` |
## GitLab compatibility
@@ -215,38 +215,6 @@
"mandatory": true "mandatory": true
} }
] ]
},
{
"id": "docs",
"name": "Docs",
"description": "Documentation generation of your python code using [Sphinx](http://www.sphinx-doc.org/en/master/). Documentation will be available through a gitlab artifact.",
"enable_with": "DOCS_ENABLED",
"variables": [
{
"name": "DOCS_REQUIREMENTS_FILE",
"description": "Python dependencies for documentation generation _(relative to `$PYTHON_PROJECT_DIR`)_",
"default": "docs-requirements.txt",
"advanced": true
},
{
"name": "DOCS_DIRECTORY",
"description": "Directory containing docs source",
"default": "docs",
"advanced": true
},
{
"name": "DOCS_BUILD_DIR",
"description": "Output build directory for documentation",
"default": "public",
"advanced": true
},
{
"name": "DOCS_MAKE_ARGS",
"description": "Args of make command",
"default": "html BUILDDIR=${DOCS_BUILD_DIR}",
"advanced": true
}
]
}
]
}
@@ -69,90 +69,9 @@ variables:
fi
}
function install_test_requirements() {
if [[ -f "pyproject.toml" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
if [[ ! -f "poetry.lock" ]]; then
log_error "Poetry detected but \\e[33;1mpoetry.lock\\e[0m file not found: you shall commit it with your project files"
exit 1
fi
log_info "--- Poetry detected: generating \\e[33;1m${TEST_REQUIREMENTS_FILE}\\e[0m from poetry.lock"
pip install poetry
poetry export --without-hashes ${PYTHON_POETRY_EXTRAS:+--extras "$PYTHON_POETRY_EXTRAS"} --dev -f requirements.txt --output "${TEST_REQUIREMENTS_FILE}"
fi
if [[ -f "${TEST_REQUIREMENTS_FILE}" ]]; then
log_info "--- installing from ${TEST_REQUIREMENTS_FILE} file"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} -r "${TEST_REQUIREMENTS_FILE}"
else
log_info "--- no test requirements file found from env or file ${TEST_REQUIREMENTS_FILE} does not exist"
fi
}
function install_requirements() {
if [[ -f "pyproject.toml" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
if [[ ! -f "poetry.lock" ]]; then
log_error "Poetry detected but \\e[33;1mpoetry.lock\\e[0m file not found: you shall commit it with your project files"
exit 1
fi
log_info "--- Poetry detected: generating \\e[33;1m${REQUIREMENTS_FILE}\\e[0m from poetry.lock"
pip install poetry
poetry export --without-hashes ${PYTHON_POETRY_EXTRAS:+--extras "$PYTHON_POETRY_EXTRAS"} -f requirements.txt --output "${REQUIREMENTS_FILE}"
fi
if [[ -f "${REQUIREMENTS_FILE}" ]]; then
log_info "--- installing from ${REQUIREMENTS_FILE} file"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} -r "${REQUIREMENTS_FILE}"
elif [[ -f "${SETUP_PY_DIR}/setup.py" ]]; then
log_info "--- installing from ${SETUP_PY_DIR}/setup.py file"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} "${SETUP_PY_DIR}/"
else
log_info "--- no requirements or setup.py file found from env or file ${REQUIREMENTS_FILE} - ${SETUP_PY_DIR}/setup.py does not exist"
fi
}
function install_doc_requirements() {
if [[ -f "pyproject.toml" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
if [[ ! -f "poetry.lock" ]]; then
log_error "Poetry detected but \\e[33;1mpoetry.lock\\e[0m file not found: you shall commit it with your project files"
exit 1
fi
log_info "--- Poetry detected: generating \\e[33;1m${TEST_REQUIREMENTS_FILE}\\e[0m from poetry.lock"
pip install poetry
poetry export --without-hashes ${PYTHON_POETRY_EXTRAS:+--extras "$PYTHON_POETRY_EXTRAS"} -f requirements.txt --output "${DOCS_REQUIREMENTS_FILE}"
fi
if [[ -f "${DOCS_REQUIREMENTS_FILE}" ]]; then
log_info "--- installing from ${DOCS_REQUIREMENTS_FILE} file"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} -r "${DOCS_REQUIREMENTS_FILE}"
elif [[ -f "${SETUP_PY_DIR}/setup.py" ]]; then
log_info "--- installing from ${SETUP_PY_DIR}/setup.py file"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} "${SETUP_PY_DIR}/"
else
log_info "--- no doc requirements file found from env or file ${DOCS_REQUIREMENTS_FILE} - ${SETUP_PY_DIR}/setup.py does not exist"
fi
}
function release_args() {
if [[ -f ".bumpversion.cfg" ]]; then
log_info "--- .bumpversion.cfg file found "
export bumpversion_args="${RELEASE_VERSION_PART} --verbose"
else
log_info "--- No .bumpversion.cfg file found "
if [[ -f "setup.py" ]]; then
log_info "--- Getting current version of setup.py file "
current_version=$(python setup.py --version)
export bumpversion_args=" --verbose --current-version ${current_version} --tag --tag-name {new_version} --commit ${RELEASE_VERSION_PART} setup.py"
else
log_warn "--- No setup.py file found. Cannot perform release."
fi
fi
log_info "--- Release args: ${bumpversion_args}"
}
function install_ca_certs() {
certs=$1
@@ -289,6 +208,113 @@ variables:
log_info "... done"
}
# install requirements
# arg1: 'build' (build only) or 'test' (build + test)
function install_requirements() {
target=$1
if [[ -f "pyproject.toml" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
if [[ ! -f "poetry.lock" ]]; then
log_error "Poetry detected but \\e[33;1mpoetry.lock\\e[0m file not found: you shall commit it with your project files"
exit 1
fi
pip install -U poetry
if [[ "$target" == "build" ]]; then
log_info "--- Poetry detected: install build only requirements"
poetry install --no-dev ${PYTHON_POETRY_EXTRAS:+--extras "$PYTHON_POETRY_EXTRAS"}
else
log_info "--- Poetry detected: install build and dev requirements"
poetry install ${PYTHON_POETRY_EXTRAS:+--extras "$PYTHON_POETRY_EXTRAS"}
fi
elif [[ -f "${REQUIREMENTS_FILE}" ]]; then
log_info "--- installing build requirements from \\e[33;1m${REQUIREMENTS_FILE}\\e[0m"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} -r "${REQUIREMENTS_FILE}"
if [[ "$target" == "test" ]] && [[ -f "${TEST_REQUIREMENTS_FILE}" ]]; then
log_info "--- installing test requirements from \\e[33;1m${TEST_REQUIREMENTS_FILE}\\e[0m"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} -r "${TEST_REQUIREMENTS_FILE}"
fi
elif [[ -f "${SETUP_PY_DIR}/setup.py" ]]; then
log_info "--- installing requirements from \\e[33;1m${SETUP_PY_DIR}/setup.py\\e[0m"
# shellcheck disable=SC2086
pip install ${PIP_OPTS} "${SETUP_PY_DIR}/"
else
log_info "--- no dependency management tool, nor requirements file nor setup.py file found: skip install dependencies"
fi
}
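# Illustrative usage note (not part of the template): a job script typically calls
#   install_requirements build   # runtime dependencies only
#   install_requirements test    # runtime plus dev/test dependencies
# With a pyproject.toml and a committed poetry.lock, both variants delegate to "poetry install".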
function _run() {
if [[ -f "poetry.lock" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
if ! command -v poetry > /dev/null
then
pip install -U poetry
fi
poetry run "$@"
else
"$@"
fi
}
function _python() {
_run python "$@"
}
function _pip() {
_run pip "$@"
}
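# Illustrative usage note (not part of the template): these wrappers keep job scripts
# agnostic of the dependency manager; when a poetry.lock is present (and Poetry is not
# disabled) the command is run through "poetry run", otherwise it is executed directly:
#   _pip install -U pytest
#   _python -m pytest
#   _run coverage report -m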
function _package(){
if [[ -f "poetry.lock" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
pip install -U poetry
poetry build
else
python setup.py sdist bdist_wheel
fi
}
function _publish() {
if [[ -f "poetry.lock" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
pip install -U poetry
poetry config repositories.user_defined "$TWINE_REPOSITORY_URL"
poetry publish --username "$TWINE_USERNAME" --password "$TWINE_PASSWORD" --repository user_defined
else
pip install -U twine setuptools
pip list
twine upload --verbose dist/*.tar.gz
twine upload --verbose dist/*.whl
fi
}
function _release() {
if [[ -f "poetry.lock" ]] && [[ "${PYTHON_POETRY_DISABLED}" != "true" ]]; then
pip install -U poetry
poetry version "${RELEASE_VERSION_PART}"
else
pip install -U bumpversion
release_args
bumpversion "${bumpversion_args}"
fi
}
function release_args() {
if [[ -f ".bumpversion.cfg" ]]; then
log_info "--- .bumpversion.cfg file found "
export bumpversion_args="${RELEASE_VERSION_PART} --verbose"
else
log_info "--- No .bumpversion.cfg file found "
if [[ -f "setup.py" ]]; then
log_info "--- Getting current version of setup.py file "
current_version=$(python setup.py --version)
export bumpversion_args=" --verbose --current-version ${current_version} --tag --tag-name {new_version} --commit ${RELEASE_VERSION_PART} setup.py"
else
log_warn "--- No setup.py file found. Cannot perform release."
fi
fi
log_info "--- Release args: ${bumpversion_args}"
}
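# Illustrative note (not part of the template): with RELEASE_VERSION_PART=minor (an example
# value) and no .bumpversion.cfg, the computed command is roughly:
#   bumpversion --verbose --current-version <version from setup.py> --tag --tag-name {new_version} --commit minor setup.py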
function get_latest_template_version() {
tag_json=$(wget -T 5 -q -O - "$CI_API_V4_URL/projects/to-be-continuous%2F$1/repository/tags?per_page=1" || echo "")
echo "$tag_json" | sed -rn 's/^.*"name":"([^"]*)".*$/\1/p'
@@ -347,20 +373,20 @@ py-lint:
extends: .python-base
stage: build
script:
- mkdir -p reports
- chmod o+rwx reports
- install_requirements build
- _pip install -U pylint_gitlab
- |
if ! _run pylint --ignore=.cache --output-format=text ${PYLINT_ARGS} ${PYLINT_FILES:-$(find -type f -name "*.py")}
then
# failed: also generate codeclimate report
_run pylint --ignore=.cache --output-format=pylint_gitlab.GitlabCodeClimateReporter ${PYLINT_ARGS} ${PYLINT_FILES:-$(find -type f -name "*.py")} > reports/pylint-codeclimate.json
exit 1
else
# success: generate empty codeclimate report (required by GitLab :( )
echo "[]" > reports/pylint-codeclimate.json
fi
artifacts:
@@ -387,8 +413,8 @@ py-compile:
extends: .python-base
stage: build
script:
- install_requirements build
- _python -m compileall $PYTHON_COMPILE_ARGS
rules:
# exclude merge requests
- if: $CI_MERGE_REQUEST_ID
@@ -405,15 +431,14 @@ py-unittest:
script:
- mkdir -p reports
- chmod o+rwx reports
- install_requirements test
# code coverage
- _pip install -U coverage
# JUnit XML report
- _pip install -U unittest-xml-reporting
- _run coverage run -m xmlrunner discover -o "reports/" $UNITTEST_ARGS
- _run coverage report -m
- _run coverage xml -o "reports/coverage.xml"
coverage: /^TOTAL.+?(\d+\%)$/
artifacts:
name: "$CI_JOB_NAME artifacts from $CI_PROJECT_NAME on $CI_COMMIT_REF_SLUG"
@@ -436,12 +461,11 @@ py-pytest:
extends: .python-base
stage: build
script:
- mkdir -p reports
- chmod o+rwx reports
- install_requirements test
- _pip install -U pytest pytest-cov coverage
- _python -m pytest --junit-xml=reports/TEST-pytests.xml --cov --cov-report term --cov-report xml:reports/coverage.xml ${PYTEST_ARGS}
coverage: /^TOTAL.+?(\d+\%)$/
artifacts:
name: "$CI_JOB_NAME artifacts from $CI_PROJECT_NAME on $CI_COMMIT_REF_SLUG"
@@ -464,11 +488,10 @@ py-nosetests:
extends: .python-base
stage: build
script:
- mkdir -p reports
- chmod o+rwx reports
- install_requirements test
- _run nosetests --with-xunit --xunit-file=reports/TEST-nosetests.xml --with-coverage --cover-erase --cover-xml --cover-xml-file=reports/coverage.xml --cover-html --cover-html-dir=reports/coverage ${NOSETESTS_ARGS}
coverage: /^TOTAL.+?(\d+\%)$/
artifacts:
name: "$CI_JOB_NAME artifacts from $CI_PROJECT_NAME on $CI_COMMIT_REF_SLUG"
@@ -494,14 +517,14 @@ py-bandit:
# force no dependencies
dependencies: []
script:
- mkdir -p reports
- chmod o+rwx reports
- _pip install -U bandit
- |
if ! _run bandit ${TRACE+--verbose} ${BANDIT_ARGS}
then
# failed: also generate JSON report
_run bandit ${TRACE+--verbose} --format json --output reports/bandit.json ${BANDIT_ARGS}
exit 1
fi
artifacts:
@@ -531,14 +554,15 @@ py-safety:
# force no dependencies
dependencies: []
script:
- mkdir -p reports
- chmod o+rwx reports
- install_requirements build
- |
if ! _pip freeze | safety check --stdin ${SAFETY_ARGS}
then
# failed: also generate JSON report
_pip freeze | safety check --stdin --json --output reports/safety.json ${SAFETY_ARGS}
exit 1
fi
artifacts:
@@ -559,10 +583,8 @@ py-safety:
- if: '$SAFETY_ENABLED == "true"'
when: manual
allow_failure: true
###############################################################################################
# package stage #
###############################################################################################
# (on tag creation): create packages as artifacts
@@ -570,7 +592,7 @@ py-package:
extends: .python-base
stage: package-build
script:
- _package
artifacts:
paths:
- $PYTHON_PROJECT_DIR/dist/*.tar.gz
@@ -580,6 +602,7 @@ py-package:
- if: '$CI_COMMIT_TAG'
- if: '$PYTHON_FORCE_PACKAGE == "true"'
###############################################################################################
# publish stage #
###############################################################################################
@@ -591,30 +614,12 @@ py-publish:
script:
- assert_defined "$TWINE_USERNAME" 'Missing required env $TWINE_USERNAME'
- assert_defined "$TWINE_PASSWORD" 'Missing required env $TWINE_PASSWORD'
- _publish
rules:
# on tags with $TWINE_USERNAME set
- if: '$TWINE_USERNAME && $CI_COMMIT_TAG'
# (on tag creation): generates the documentation
py-docs:
extends: .python-base
stage: publish
script:
- install_doc_requirements
- pip install -U sphinx
- cd ${DOCS_DIRECTORY}
- make ${DOCS_MAKE_ARGS}
artifacts:
name: "$CI_JOB_NAME artifacts from $CI_PROJECT_NAME on $CI_COMMIT_REF_SLUG"
paths:
- $DOCS_BUILD_DIR
rules:
# on tags with $DOCS_ENABLED set
- if: '$DOCS_ENABLED == "true" && $CI_COMMIT_TAG'
# (manual from master branch): triggers a release (tag creation)
py-release:
@@ -624,9 +629,7 @@ py-release:
- git config --global user.email '$GITLAB_USER_EMAIL'
- git config --global user.name '$GITLAB_USER_LOGIN'
- git checkout -B $CI_BUILD_REF_NAME
- _release
- git_url_base=`echo ${CI_REPOSITORY_URL} | cut -d\@ -f2`
- git push https://${RELEASE_USERNAME}:${RELEASE_ACCESS_TOKEN}@${git_url_base} --tags
- git push https://${RELEASE_USERNAME}:${RELEASE_ACCESS_TOKEN}@${git_url_base} $CI_BUILD_REF_NAME