docker-compose / add8662
Import upstream version 1.29.2 (Debian Janitor, 2 years ago)
68 changed file(s) with 2537 addition(s) and 1230 deletion(s).
00 Change log
11 ==========
2
3 1.29.2 (2021-05-10)
4 -------------------
5
6 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/59?closed=1)
7
8 ### Miscellaneous
9
10 - Remove advertisement for `docker compose` in the `up` command to avoid annoyance
11
12 - Bump `py` to `1.10.0` in `requirements-indirect.txt`
13
14 1.29.1 (2021-04-13)
15 -------------------
16
17 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/58?closed=1)
18
19 ### Bugs
20
21 - Fix for invalid handler warning on Windows builds
22
23 - Fix config hash to trigger container recreation on IPC mode updates
24
25 - Fix conversion map for `placement.max_replicas_per_node`
26
27 - Remove extra scan suggestion on build
28
29 1.29.0 (2021-04-06)
30 -------------------
31
32 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/56?closed=1)
33
34 ### Features
35
36 - Add profile filter to `docker-compose config`
37
38 - Add a `depends_on` condition to wait for successful service completion
39
40 ### Miscellaneous
41
42 - Add image scan message on build
43
44 - Update warning message for `--no-ansi` to mention `--ansi never` as alternative
45
46 - Bump docker-py to 5.0.0
47
48 - Bump PyYAML to 5.4.1
49
50 - Bump python-dotenv to 0.17.0
51
52 1.28.6 (2021-03-23)
53 -------------------
54
55 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/57?closed=1)
56
57 ### Bugs
58
59 - Make `--env-file` relative to the current working directory and error out for invalid paths. Environment file paths set with `--env-file` are relative to the current working directory, while the default `.env` file is located in the project directory, which by default is the base directory of the Compose file.
60
61 - Fix missing service property `storage_opt` by updating the compose schema
62
63 - Fix build `extra_hosts` list format
64
65 - Remove extra error message on `exec`
66
67 ### Miscellaneous
68
69 - Add `compose.yml` and `compose.yaml` to default filename list
70
71 1.28.5 (2021-02-25)
72 -------------------
73
74 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/55?closed=1)
75
76 ### Bugs
77
78 - Fix OpenSSL version mismatch error when shelling out to the ssh client (via bump to docker-py 4.4.4 which contains the fix)
79
80 - Add missing build flags to the native builder: `platform`, `isolation` and `extra_hosts`
81
82 - Remove info message on native build
83
84 - Avoid fetching logs when service logging driver is set to 'none'
85
86 1.28.4 (2021-02-18)
87 -------------------
88
89 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/54?closed=1)
90
91 ### Bugs
92
93 - Fix SSH port parsing by bumping docker-py to 4.4.3
94
95 ### Miscellaneous
96
97 - Bump Python to 3.7.10
98
99 1.28.3 (2021-02-17)
100 -------------------
101
102 [List of PRs / issues for this release](https://github.com/docker/compose/milestone/53?closed=1)
103
104 ### Bugs
105
106 - Fix SSH hostname parsing when it contains a leading 's' or 'h', and remove the quiet option that was hiding the error (via docker-py bump to 4.4.2)
107
108 - Fix key error for '--no-log-prefix' option
109
110 - Fix incorrect CLI environment variable name for service profiles: `COMPOSE_PROFILES` instead of `COMPOSE_PROFILE`
111
112 - Fix fish completion
113
114 ### Miscellaneous
115
116 - Bump cryptography to 3.3.2
117
118 - Remove log driver filter
119
120 1.28.2 (2021-01-26)
121 -------------------
122
123 ### Miscellaneous
124
125 - CI setup update
126
127 1.28.1 (2021-01-25)
128 -------------------
129
130 ### Bugs
131
132 - Revert to Python 3.7 bump for Linux static builds
133
134 - Add bash completion for `docker-compose logs|up --no-log-prefix`
135
136 1.28.0 (2021-01-20)
137 -------------------
138
139 ### Features
140
141 - Support for Nvidia GPUs via device requests
142
143 - Support for service profiles
144
145 - Change the SSH connection approach to match the Docker CLI by shelling out to the local SSH client (the old paramiko-based behaviour can be re-enabled by setting the `COMPOSE_PARAMIKO_SSH` environment variable)
146
147 - Add flag to disable log prefix
148
149 - Add flag for ansi output control
150
151 ### Bugs
152
153 - Make `parallel_pull=True` by default
154
155 - Bring back warning for configs in non-swarm mode
156
157 - Take `--file` into account when defining `project_dir`
158
159 - On `compose up`, attach only to services we read logs from
160
161 ### Miscellaneous
162
163 - Make COMPOSE_DOCKER_CLI_BUILD=1 the default
164
165 - Add usage metrics
166
167 - Sync schema with COMPOSE specification
168
169 - Improve failure report for missing mandatory environment variables
170
171 - Bump attrs to 20.3.0
172
173 - Bump more_itertools to 8.6.0
174
175 - Bump cryptography to 3.2.1
176
177 - Bump cffi to 1.14.4
178
179 - Bump virtualenv to 20.2.2
180
181 - Bump bcrypt to 3.2.0
182
183 - Bump gitpython to 3.1.11
184
185 - Bump docker-py to 4.4.1
186
187 - Bump Python to 3.9
188
189 - Linux: bump Debian base image from stretch to buster (required for Python 3.9)
190
191 - macOS: OpenSSL 1.1.1g to 1.1.1h, Python 3.7.7 to 3.9.0
192
193 - Bump pyinstaller 4.1
194
195 - Loosen the restriction on base images to the latest minor version
196
197 - Updates of READMEs
198
2199
3200 1.27.4 (2020-09-24)
4201 -------------------
0 ARG DOCKER_VERSION=19.03.8
1 ARG PYTHON_VERSION=3.7.7
2 ARG BUILD_ALPINE_VERSION=3.11
0 ARG DOCKER_VERSION=19.03
1 ARG PYTHON_VERSION=3.7.10
2
3 ARG BUILD_ALPINE_VERSION=3.12
4 ARG BUILD_CENTOS_VERSION=7
35 ARG BUILD_DEBIAN_VERSION=slim-stretch
4 ARG RUNTIME_ALPINE_VERSION=3.11.5
5 ARG RUNTIME_DEBIAN_VERSION=stretch-20200414-slim
66
7 ARG BUILD_PLATFORM=alpine
7 ARG RUNTIME_ALPINE_VERSION=3.12
8 ARG RUNTIME_CENTOS_VERSION=7
9 ARG RUNTIME_DEBIAN_VERSION=stretch-slim
10
11 ARG DISTRO=alpine
812
913 FROM docker:${DOCKER_VERSION} AS docker-cli
1014
3943 openssl \
4044 zlib1g-dev
4145
42 FROM build-${BUILD_PLATFORM} AS build
46 FROM centos:${BUILD_CENTOS_VERSION} AS build-centos
47 RUN yum install -y \
48 gcc \
49 git \
50 libffi-devel \
51 make \
52 openssl \
53 openssl-devel
54 WORKDIR /tmp/python3/
55 ARG PYTHON_VERSION
56 RUN curl -L https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz | tar xzf - \
57 && cd Python-${PYTHON_VERSION} \
58 && ./configure --enable-optimizations --enable-shared --prefix=/usr LDFLAGS="-Wl,-rpath /usr/lib" \
59 && make altinstall
60 RUN alternatives --install /usr/bin/python python /usr/bin/python2.7 50
61 RUN alternatives --install /usr/bin/python python /usr/bin/python$(echo "${PYTHON_VERSION%.*}") 60
62 RUN curl https://bootstrap.pypa.io/get-pip.py | python -
63
64 FROM build-${DISTRO} AS build
65 ENTRYPOINT ["sh", "/usr/local/bin/docker-compose-entrypoint.sh"]
66 WORKDIR /code/
4367 COPY docker-compose-entrypoint.sh /usr/local/bin/
44 ENTRYPOINT ["sh", "/usr/local/bin/docker-compose-entrypoint.sh"]
4568 COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
46 WORKDIR /code/
47 # FIXME(chris-crone): virtualenv 16.3.0 breaks build, force 16.2.0 until fixed
48 RUN pip install virtualenv==20.0.30
49 RUN pip install tox==3.19.0
50
69 RUN pip install \
70 virtualenv==20.4.0 \
71 tox==3.21.2
72 COPY requirements-dev.txt .
5173 COPY requirements-indirect.txt .
5274 COPY requirements.txt .
53 COPY requirements-dev.txt .
75 RUN pip install -r requirements.txt -r requirements-indirect.txt -r requirements-dev.txt
5476 COPY .pre-commit-config.yaml .
5577 COPY tox.ini .
5678 COPY setup.py .
5779 COPY README.md .
5880 COPY compose compose/
59 RUN tox --notest
81 RUN tox -e py37 --notest
6082 COPY . .
6183 ARG GIT_COMMIT=unknown
6284 ENV DOCKER_COMPOSE_GITSHA=$GIT_COMMIT
6385 RUN script/build/linux-entrypoint
6486
87 FROM scratch AS bin
88 ARG TARGETARCH
89 ARG TARGETOS
90 COPY --from=build /usr/local/bin/docker-compose /docker-compose-${TARGETOS}-${TARGETARCH}
91
6592 FROM alpine:${RUNTIME_ALPINE_VERSION} AS runtime-alpine
6693 FROM debian:${RUNTIME_DEBIAN_VERSION} AS runtime-debian
67 FROM runtime-${BUILD_PLATFORM} AS runtime
94 FROM centos:${RUNTIME_CENTOS_VERSION} AS runtime-centos
95 FROM runtime-${DISTRO} AS runtime
6896 COPY docker-compose-entrypoint.sh /usr/local/bin/
6997 ENTRYPOINT ["sh", "/usr/local/bin/docker-compose-entrypoint.sh"]
7098 COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
00 #!groovy
11
2 def dockerVersions = ['19.03.8']
2 def dockerVersions = ['19.03.13']
33 def baseImages = ['alpine', 'debian']
44 def pythonVersions = ['py37']
55
1212 timeout(time: 2, unit: 'HOURS')
1313 timestamps()
1414 }
15 environment {
16 DOCKER_BUILDKIT="1"
17 }
1518
1619 stages {
1720 stage('Build test images') {
1922 parallel {
2023 stage('alpine') {
2124 agent {
22 label 'ubuntu && amd64 && !zfs'
25 label 'ubuntu-2004 && amd64 && !zfs && cgroup1'
2326 }
2427 steps {
2528 buildImage('alpine')
2730 }
2831 stage('debian') {
2932 agent {
30 label 'ubuntu && amd64 && !zfs'
33 label 'ubuntu-2004 && amd64 && !zfs && cgroup1'
3134 }
3235 steps {
3336 buildImage('debian')
5861
5962 def buildImage(baseImage) {
6063 def scmvar = checkout(scm)
61 def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
64 def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
6265 image = docker.image(imageName)
6366
6467 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
6871 ansiColor('xterm') {
6972 sh """docker build -t ${imageName} \\
7073 --target build \\
71 --build-arg BUILD_PLATFORM="${baseImage}" \\
74 --build-arg DISTRO="${baseImage}" \\
7275 --build-arg GIT_COMMIT="${scmvar.GIT_COMMIT}" \\
7376 .\\
7477 """
8386 def runTests(dockerVersion, pythonVersion, baseImage) {
8487 return {
8588 stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
86 node("ubuntu && amd64 && !zfs") {
89 node("ubuntu-2004 && amd64 && !zfs && cgroup1") {
8790 def scmvar = checkout(scm)
88 def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
91 def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
8992 def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()
9093 echo "Using local system's storage driver: ${storageDriver}"
9194 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
9598 --privileged \\
9699 --volume="\$(pwd)/.git:/code/.git" \\
97100 --volume="/var/run/docker.sock:/var/run/docker.sock" \\
101 --volume="\${DOCKER_CONFIG}/config.json:/root/.docker/config.json" \\
102 -e "DOCKER_TLS_CERTDIR=" \\
98103 -e "TAG=${imageName}" \\
99104 -e "STORAGE_DRIVER=${storageDriver}" \\
100105 -e "DOCKER_VERSIONS=${dockerVersion}" \\
0 TAG = "docker-compose:alpine-$(shell git rev-parse --short HEAD)"
1 GIT_VOLUME = "--volume=$(shell pwd)/.git:/code/.git"
2
3 DOCKERFILE ?="Dockerfile"
4 DOCKER_BUILD_TARGET ?="build"
5
6 UNAME_S := $(shell uname -s)
7 ifeq ($(UNAME_S),Linux)
8 BUILD_SCRIPT = linux
9 endif
10 ifeq ($(UNAME_S),Darwin)
11 BUILD_SCRIPT = osx
12 endif
13
14 COMPOSE_SPEC_SCHEMA_PATH = "compose/config/compose_spec.json"
15 COMPOSE_SPEC_RAW_URL = "https://raw.githubusercontent.com/compose-spec/compose-spec/master/schema/compose-spec.json"
16
17 all: cli
18
19 cli: download-compose-spec ## Compile the cli
20 ./script/build/$(BUILD_SCRIPT)
21
22 download-compose-spec: ## Download the compose-spec schema from its repo
23 curl -so $(COMPOSE_SPEC_SCHEMA_PATH) $(COMPOSE_SPEC_RAW_URL)
24
25 cache-clear: ## Clear the builder cache
26 @docker builder prune --force --filter type=exec.cachemount --filter=unused-for=24h
27
28 base-image: ## Builds base image
29 docker build -f $(DOCKERFILE) -t $(TAG) --target $(DOCKER_BUILD_TARGET) .
30
31 lint: base-image ## Run linter
32 docker run --rm \
33 --tty \
34 $(GIT_VOLUME) \
35 $(TAG) \
36 tox -e pre-commit
37
38 test-unit: base-image ## Run tests
39 docker run --rm \
40 --tty \
41 $(GIT_VOLUME) \
42 $(TAG) \
43 pytest -v tests/unit/
44
45 test: ## Run all tests
46 ./script/test/default
47
48 pre-commit: lint test-unit cli
49
50 help: ## Show help
51 @echo Please specify a build target. The choices are:
52 @grep -E '^[0-9a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'
53
54 FORCE:
55
56 .PHONY: all cli download-compose-spec cache-clear base-image lint test-unit test pre-commit help
00 Docker Compose
11 ==============
2 [![Build Status](https://ci-next.docker.com/public/buildStatus/icon?job=compose/master)](https://ci-next.docker.com/public/job/compose/job/master/)
3
24 ![Docker Compose](logo.png?raw=true "Docker Compose Logo")
35
4 Compose is a tool for defining and running multi-container Docker applications.
5 With Compose, you use a Compose file to configure your application's services.
6 Then, using a single command, you create and start all the services
7 from your configuration. To learn more about all the features of Compose
8 see [the list of features](https://github.com/docker/docker.github.io/blob/master/compose/index.md#features).
6 Docker Compose is a tool for running multi-container applications on Docker
7 defined using the [Compose file format](https://compose-spec.io).
8 A Compose file is used to define how one or more containers that make up
9 your application are configured.
10 Once you have a Compose file, you can create and start your application with a
11 single command: `docker-compose up`.
912
10 Compose is great for development, testing, and staging environments, as well as
11 CI workflows. You can learn more about each case in
12 [Common Use Cases](https://github.com/docker/docker.github.io/blob/master/compose/index.md#common-use-cases).
13 Compose files can be used to deploy applications locally, or to the cloud on
14 [Amazon ECS](https://aws.amazon.com/ecs) or
15 [Microsoft ACI](https://azure.microsoft.com/services/container-instances/) using
16 the Docker CLI. You can read more about how to do this:
17 - [Compose for Amazon ECS](https://docs.docker.com/engine/context/ecs-integration/)
18 - [Compose for Microsoft ACI](https://docs.docker.com/engine/context/aci-integration/)
1319
14 Using Compose is basically a three-step process.
20 Where to get Docker Compose
21 ----------------------------
1522
23 ### Windows and macOS
24
25 Docker Compose is included in
26 [Docker Desktop](https://www.docker.com/products/docker-desktop)
27 for Windows and macOS.
28
29 ### Linux
30
31 You can download Docker Compose binaries from the
32 [release page](https://github.com/docker/compose/releases) on this repository.
33
34 ### Using pip
35
36 If your platform is not supported, you can download Docker Compose using `pip`:
37
38 ```console
39 pip install docker-compose
40 ```
41
42 > **Note:** Docker Compose requires Python 3.6 or later.
43
44 Quick Start
45 -----------
46
47 Using Docker Compose is basically a three-step process:
1648 1. Define your app's environment with a `Dockerfile` so it can be
17 reproduced anywhere.
49 reproduced anywhere.
1850 2. Define the services that make up your app in `docker-compose.yml` so
19 they can be run together in an isolated environment.
20 3. Lastly, run `docker-compose up` and Compose will start and run your entire app.
51 they can be run together in an isolated environment.
52 3. Lastly, run `docker-compose up` and Compose will start and run your entire
53 app.
2154
22 A `docker-compose.yml` looks like this:
55 A Compose file looks like this:
2356
24 version: '2'
57 ```yaml
58 services:
59 web:
60 build: .
61 ports:
62 - "5000:5000"
63 volumes:
64 - .:/code
65 redis:
66 image: redis
67 ```
2568
26 services:
27 web:
28 build: .
29 ports:
30 - "5000:5000"
31 volumes:
32 - .:/code
33 redis:
34 image: redis
69 You can find examples of Compose applications in our
70 [Awesome Compose repository](https://github.com/docker/awesome-compose).
3571
36 For more information about the Compose file, see the
37 [Compose file reference](https://github.com/docker/docker.github.io/blob/master/compose/compose-file/compose-versioning.md).
38
39 Compose has commands for managing the whole lifecycle of your application:
40
41 * Start, stop and rebuild services
42 * View the status of running services
43 * Stream the log output of running services
44 * Run a one-off command on a service
45
46 Installation and documentation
47 ------------------------------
48
49 - Full documentation is available on [Docker's website](https://docs.docker.com/compose/).
50 - Code repository for Compose is on [GitHub](https://github.com/docker/compose).
51 - If you find any problems please fill out an [issue](https://github.com/docker/compose/issues/new/choose). Thank you!
72 For more information about the Compose format, see the
73 [Compose file reference](https://docs.docker.com/compose/compose-file/).
5274
5375 Contributing
5476 ------------
5577
56 [![Build Status](https://ci-next.docker.com/public/buildStatus/icon?job=compose/master)](https://ci-next.docker.com/public/job/compose/job/master/)
78 Want to help develop Docker Compose? Check out our
79 [contributing documentation](https://github.com/docker/compose/blob/master/CONTRIBUTING.md).
5780
58 Want to help build Compose? Check out our [contributing documentation](https://github.com/docker/compose/blob/master/CONTRIBUTING.md).
81 If you find an issue, please report it on the
82 [issue tracker](https://github.com/docker/compose/issues/new/choose).
5983
6084 Releasing
6185 ---------
00 #!groovy
11
2 def dockerVersions = ['19.03.8', '18.09.9']
2 def dockerVersions = ['19.03.13', '18.09.9']
33 def baseImages = ['alpine', 'debian']
44 def pythonVersions = ['py37']
55
1212 timeout(time: 2, unit: 'HOURS')
1313 timestamps()
1414 }
15 environment {
16 DOCKER_BUILDKIT="1"
17 }
1518
1619 stages {
1720 stage('Build test images') {
1922 parallel {
2023 stage('alpine') {
2124 agent {
22 label 'linux && docker && ubuntu-2004'
25 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
2326 }
2427 steps {
2528 buildImage('alpine')
2730 }
2831 stage('debian') {
2932 agent {
30 label 'linux && docker && ubuntu-2004'
33 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
3134 }
3235 steps {
3336 buildImage('debian')
3740 }
3841 stage('Test') {
3942 agent {
40 label 'linux && docker && ubuntu-2004'
43 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
4144 }
4245 steps {
4346 // TODO use declarative 1.5.0 `matrix` once available on CI
5760 }
5861 stage('Generate Changelog') {
5962 agent {
60 label 'linux && docker && ubuntu-2004'
63 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
6164 }
6265 steps {
6366 checkout scm
8083 steps {
8184 checkout scm
8285 sh './script/setup/osx'
83 sh 'tox -e py37 -- tests/unit'
86 sh 'tox -e py39 -- tests/unit'
8487 sh './script/build/osx'
8588 dir ('dist') {
8689 checksum('docker-compose-Darwin-x86_64')
9497 }
9598 stage('linux binary') {
9699 agent {
97 label 'linux && docker && ubuntu-2004'
100 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
98101 }
99102 steps {
100103 checkout scm
113116 label 'windows-python'
114117 }
115118 environment {
116 PATH = "$PATH;C:\\Python37;C:\\Python37\\Scripts"
117 }
118 steps {
119 checkout scm
120 bat 'tox.exe -e py37 -- tests/unit'
119 PATH = "C:\\Python39;C:\\Python39\\Scripts;$PATH"
120 }
121 steps {
122 checkout scm
123 bat 'tox.exe -e py39 -- tests/unit'
121124 powershell '.\\script\\build\\windows.ps1'
122125 dir ('dist') {
123126 checksum('docker-compose-Windows-x86_64.exe')
130133 }
131134 stage('alpine image') {
132135 agent {
133 label 'linux && docker && ubuntu-2004'
136 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
134137 }
135138 steps {
136139 buildRuntimeImage('alpine')
138141 }
139142 stage('debian image') {
140143 agent {
141 label 'linux && docker && ubuntu-2004'
144 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
142145 }
143146 steps {
144147 buildRuntimeImage('debian')
153156 parallel {
154157 stage('Pushing images') {
155158 agent {
156 label 'linux && docker && ubuntu-2004'
159 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
157160 }
158161 steps {
159162 pushRuntimeImage('alpine')
162165 }
163166 stage('Creating Github Release') {
164167 agent {
165 label 'linux && docker && ubuntu-2004'
168 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
166169 }
167170 environment {
168171 GITHUB_TOKEN = credentials('github-release-token')
194197 }
195198 stage('Publishing Python packages') {
196199 agent {
197 label 'linux && docker && ubuntu-2004'
200 label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
198201 }
199202 environment {
200203 PYPIRC = credentials('pypirc-docker-dsg-cibot')
218221
219222 def buildImage(baseImage) {
220223 def scmvar = checkout(scm)
221 def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
224 def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
222225 image = docker.image(imageName)
223226
224227 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
228231 ansiColor('xterm') {
229232 sh """docker build -t ${imageName} \\
230233 --target build \\
231 --build-arg BUILD_PLATFORM="${baseImage}" \\
234 --build-arg DISTRO="${baseImage}" \\
232235 --build-arg GIT_COMMIT="${scmvar.GIT_COMMIT}" \\
233236 .\\
234237 """
243246 def runTests(dockerVersion, pythonVersion, baseImage) {
244247 return {
245248 stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
246 node("linux && docker && ubuntu-2004") {
249 node("linux && docker && ubuntu-2004 && amd64 && cgroup1") {
247250 def scmvar = checkout(scm)
248 def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
251 def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
249252 def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()
250253 echo "Using local system's storage driver: ${storageDriver}"
251254 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
255258 --privileged \\
256259 --volume="\$(pwd)/.git:/code/.git" \\
257260 --volume="/var/run/docker.sock:/var/run/docker.sock" \\
261 --volume="\${DOCKER_CONFIG}/config.json:/root/.docker/config.json" \\
262 -e "DOCKER_TLS_CERTDIR=" \\
258263 -e "TAG=${imageName}" \\
259264 -e "STORAGE_DRIVER=${storageDriver}" \\
260265 -e "DOCKER_VERSIONS=${dockerVersion}" \\
275280 def imageName = "docker/compose:${baseImage}-${env.BRANCH_NAME}"
276281 ansiColor('xterm') {
277282 sh """docker build -t ${imageName} \\
278 --build-arg BUILD_PLATFORM="${baseImage}" \\
283 --build-arg DISTRO="${baseImage}" \\
279284 --build-arg GIT_COMMIT="${scmvar.GIT_COMMIT.take(7)}" \\
280285 .
281286 """
0 __version__ = '1.27.4'
0 __version__ = '1.29.2'
0 import enum
1 import os
2
03 from ..const import IS_WINDOWS_PLATFORM
14
25 NAMES = [
912 'cyan',
1013 'white'
1114 ]
15
16
17 @enum.unique
18 class AnsiMode(enum.Enum):
19 """Enumeration for when to output ANSI colors."""
20 NEVER = "never"
21 ALWAYS = "always"
22 AUTO = "auto"
23
24 def use_ansi_codes(self, stream):
25 if self is AnsiMode.ALWAYS:
26 return True
27 if self is AnsiMode.NEVER or os.environ.get('CLICOLOR') == '0':
28 return False
29 return stream.isatty()
1230
1331
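The hunk above adds an `AnsiMode` enum to `compose/cli/colors.py`. As a minimal self-contained sketch of its decision logic (copied from the hunk; the `CLICOLOR` opt-out and the `isatty()` fallback for `auto` are the interesting parts):

```python
import enum
import io
import os


@enum.unique
class AnsiMode(enum.Enum):
    """When to output ANSI colors, as in the diff above."""
    NEVER = "never"
    ALWAYS = "always"
    AUTO = "auto"

    def use_ansi_codes(self, stream):
        if self is AnsiMode.ALWAYS:
            return True
        if self is AnsiMode.NEVER or os.environ.get('CLICOLOR') == '0':
            return False
        return stream.isatty()


# A StringIO is not a TTY, so AUTO disables colors when output is piped.
assert AnsiMode.ALWAYS.use_ansi_codes(io.StringIO()) is True
assert AnsiMode.NEVER.use_ansi_codes(io.StringIO()) is False
assert AnsiMode.AUTO.use_ansi_codes(io.StringIO()) is False
```

This is what backs the new `--ansi never|always|auto` flag mentioned in the 1.28.0 changelog.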
1432 def get_pairs():
3434
3535 def project_from_options(project_dir, options, additional_options=None):
3636 additional_options = additional_options or {}
37 override_dir = options.get('--project-directory')
37 override_dir = get_project_dir(options)
3838 environment_file = options.get('--env-file')
3939 environment = Environment.from_env_file(override_dir or project_dir, environment_file)
4040 environment.silent = options.get('COMMAND', None) in SILENT_COMMANDS
5858
5959 return get_project(
6060 project_dir,
61 get_config_path_from_options(project_dir, options, environment),
61 get_config_path_from_options(options, environment),
6262 project_name=options.get('--project-name'),
6363 verbose=options.get('--verbose'),
6464 context=context,
6565 environment=environment,
6666 override_dir=override_dir,
6767 interpolate=(not additional_options.get('--no-interpolate')),
68 environment_file=environment_file
68 environment_file=environment_file,
69 enabled_profiles=get_profiles_from_options(options, environment)
6970 )
7071
7172
8586 parallel.GlobalLimit.set_global_limit(parallel_limit)
8687
8788
89 def get_project_dir(options):
90 override_dir = None
91 files = get_config_path_from_options(options, os.environ)
92 if files:
93 if files[0] == '-':
94 return '.'
95 override_dir = os.path.dirname(files[0])
96 return options.get('--project-directory') or override_dir
97
98
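The new `get_project_dir` helper above is the fix for "Take `--file` into account when defining `project_dir`". A sketch of its behaviour, with one simplification: the real helper resolves the file list via `get_config_path_from_options()`, while this standalone version reads a hypothetical pre-resolved `'--file'` list directly:

```python
import os


def get_project_dir(options):
    # Mirrors the helper above: --project-directory always wins; otherwise
    # the directory of the first compose file becomes the project dir.
    override_dir = None
    files = options.get('--file') or []
    if files:
        if files[0] == '-':  # compose file piped in on stdin
            return '.'
        override_dir = os.path.dirname(files[0])
    return options.get('--project-directory') or override_dir


assert get_project_dir({'--file': ['deploy/docker-compose.yml']}) == 'deploy'
assert get_project_dir({'--file': ['-']}) == '.'
assert get_project_dir({'--file': ['a.yml'], '--project-directory': '/srv'}) == '/srv'
```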
8899 def get_config_from_options(base_dir, options, additional_options=None):
89100 additional_options = additional_options or {}
90 override_dir = options.get('--project-directory')
101 override_dir = get_project_dir(options)
91102 environment_file = options.get('--env-file')
92103 environment = Environment.from_env_file(override_dir or base_dir, environment_file)
93 config_path = get_config_path_from_options(
94 base_dir, options, environment
95 )
104 config_path = get_config_path_from_options(options, environment)
96105 return config.load(
97106 config.find(base_dir, config_path, environment, override_dir),
98107 not additional_options.get('--no-interpolate')
99108 )
100109
101110
102 def get_config_path_from_options(base_dir, options, environment):
111 def get_config_path_from_options(options, environment):
103112 def unicode_paths(paths):
104113 return [p.decode('utf-8') if isinstance(p, bytes) else p for p in paths]
105114
114123 return None
115124
116125
126 def get_profiles_from_options(options, environment):
127 profile_option = options.get('--profile')
128 if profile_option:
129 return profile_option
130
131 profiles = environment.get('COMPOSE_PROFILES')
132 if profiles:
133 return profiles.split(',')
134
135 return []
136
137
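The profile lookup added above is small enough to exercise directly. This standalone copy (plain dicts stand in for the docopt options and the `Environment` object) shows the precedence: CLI `--profile` flags beat the `COMPOSE_PROFILES` environment variable:

```python
def get_profiles_from_options(options, environment):
    # Copied from the hunk above: --profile flags first, then the
    # comma-separated COMPOSE_PROFILES variable, then no profiles.
    profile_option = options.get('--profile')
    if profile_option:
        return profile_option

    profiles = environment.get('COMPOSE_PROFILES')
    if profiles:
        return profiles.split(',')

    return []


assert get_profiles_from_options({'--profile': ['debug']},
                                 {'COMPOSE_PROFILES': 'web'}) == ['debug']
assert get_profiles_from_options({}, {'COMPOSE_PROFILES': 'web,db'}) == ['web', 'db']
assert get_profiles_from_options({}, {}) == []
```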
117138 def get_project(project_dir, config_path=None, project_name=None, verbose=False,
118139 context=None, environment=None, override_dir=None,
119 interpolate=True, environment_file=None):
140 interpolate=True, environment_file=None, enabled_profiles=None):
120141 if not environment:
121142 environment = Environment.from_env_file(project_dir)
122143 config_details = config.find(project_dir, config_path, environment, override_dir)
138159 client,
139160 environment.get('DOCKER_DEFAULT_PLATFORM'),
140161 execution_context_labels(config_details, environment_file),
162 enabled_profiles,
141163 )
142164
143165
165165 kwargs['credstore_env'] = {
166166 'LD_LIBRARY_PATH': environment.get('LD_LIBRARY_PATH_ORIG'),
167167 }
168
169 client = APIClient(**kwargs)
168 use_paramiko_ssh = int(environment.get('COMPOSE_PARAMIKO_SSH', 0))
169 client = APIClient(use_ssh_client=not use_paramiko_ssh, **kwargs)
170170 client._original_base_url = kwargs.get('base_url')
171171
172172 return client
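The client change above turns on `use_ssh_client` unless the user opts back into paramiko via `COMPOSE_PARAMIKO_SSH`. A tiny sketch of the toggle (a plain dict stands in for the `Environment` object, and the hypothetical `use_ssh_client_flag` name is introduced here only for illustration):

```python
def use_ssh_client_flag(environment):
    # Mirrors the hunk above: shell out to the local ssh client by default;
    # fall back to paramiko only when COMPOSE_PARAMIKO_SSH is set to a
    # non-zero value.
    use_paramiko_ssh = int(environment.get('COMPOSE_PARAMIKO_SSH', 0))
    return not use_paramiko_ssh


assert use_ssh_client_flag({}) is True
assert use_ssh_client_flag({'COMPOSE_PARAMIKO_SSH': '1'}) is False
```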
1616 self.command_class = command_class
1717 self.options = options
1818
19 @classmethod
20 def get_command_and_options(cls, doc_entity, argv, options):
21 command_help = getdoc(doc_entity)
22 opt = docopt_full_help(command_help, argv, **options)
23 command = opt['COMMAND']
24 return command_help, opt, command
25
1926 def parse(self, argv):
20 command_help = getdoc(self.command_class)
21 options = docopt_full_help(command_help, argv, **self.options)
22 command = options['COMMAND']
27 command_help, options, command = DocoptDispatcher.get_command_and_options(
28 self.command_class, argv, self.options)
2329
2430 if command is None:
2531 raise SystemExit(command_help)
1515
1616 class LogPresenter:
1717
18 def __init__(self, prefix_width, color_func):
18 def __init__(self, prefix_width, color_func, keep_prefix=True):
1919 self.prefix_width = prefix_width
2020 self.color_func = color_func
21 self.keep_prefix = keep_prefix
2122
2223 def present(self, container, line):
23 prefix = container.name_without_project.ljust(self.prefix_width)
24 return '{prefix} {line}'.format(
25 prefix=self.color_func(prefix + ' |'),
26 line=line)
27
28
29 def build_log_presenters(service_names, monochrome):
24 to_log = '{line}'.format(line=line)
25
26 if self.keep_prefix:
27 prefix = container.name_without_project.ljust(self.prefix_width)
28 to_log = '{prefix} '.format(prefix=self.color_func(prefix + ' |')) + to_log
29
30 return to_log
31
32
33 def build_log_presenters(service_names, monochrome, keep_prefix=True):
3034 """Return an iterable of functions.
3135
3236 Each function can be used to format the logs output of a container.
3741 return text
3842
3943 for color_func in cycle([no_color] if monochrome else colors.rainbow()):
40 yield LogPresenter(prefix_width, color_func)
44 yield LogPresenter(prefix_width, color_func, keep_prefix)
4145
4246
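The `keep_prefix` flag threaded through `LogPresenter` above implements the new `--no-log-prefix` option. A condensed sketch of the presenter (simplified to take the container name as a string rather than a `Container` object):

```python
class LogPresenter:
    """Format a container's log line, optionally without the name prefix."""

    def __init__(self, prefix_width, color_func, keep_prefix=True):
        self.prefix_width = prefix_width
        self.color_func = color_func
        self.keep_prefix = keep_prefix

    def present(self, name, line):
        # With --no-log-prefix the raw line passes through untouched.
        if not self.keep_prefix:
            return line
        prefix = name.ljust(self.prefix_width)
        return '{prefix} {line}'.format(
            prefix=self.color_func(prefix + ' |'), line=line)


plain = LogPresenter(5, lambda s: s, keep_prefix=False)
assert plain.present('web', 'ready\n') == 'ready\n'

prefixed = LogPresenter(5, lambda s: s)
assert prefixed.present('web', 'ready\n') == 'web   | ready\n'
```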
4347 def max_name_width(service_names, max_index_width=3):
153157
154158
155159 def tail_container_logs(container, presenter, queue, log_args):
156 generator = get_log_generator(container)
157
158160 try:
159 for item in generator(container, log_args):
161 for item in build_log_generator(container, log_args):
160162 queue.put(QueueItem.new(presenter.present(container, item)))
161163 except Exception as e:
162164 queue.put(QueueItem.exception(e))
164166 if log_args.get('follow'):
165167 queue.put(QueueItem.new(presenter.color_func(wait_on_exit(container))))
166168 queue.put(QueueItem.stop(container.name))
167
168
169 def get_log_generator(container):
170 if container.has_api_logs:
171 return build_log_generator
172 return build_no_log_generator
173
174
175 def build_no_log_generator(container, log_args):
176 """Return a generator that prints a warning about logs and waits for
177 container to exit.
178 """
179 yield "WARNING: no logs are available with the '{}' log driver\n".format(
180 container.log_driver)
181169
182170
183171 def build_log_generator(container, log_args):
11 import functools
22 import json
33 import logging
4 import os
54 import pipes
65 import re
76 import subprocess
2322 from ..config.environment import Environment
2423 from ..config.serialize import serialize_config
2524 from ..config.types import VolumeSpec
25 from ..const import IS_LINUX_PLATFORM
2626 from ..const import IS_WINDOWS_PLATFORM
2727 from ..errors import StreamParseError
28 from ..metrics.decorator import metrics
29 from ..parallel import ParallelStreamWriter
2830 from ..progress_stream import StreamOutputError
2931 from ..project import get_image_digests
3032 from ..project import MissingDigests
3739 from ..service import ImageType
3840 from ..service import NeedsBuildError
3941 from ..service import OperationFailedError
42 from ..utils import filter_attached_for_up
43 from .colors import AnsiMode
4044 from .command import get_config_from_options
45 from .command import get_project_dir
4146 from .command import project_from_options
4247 from .docopt_command import DocoptDispatcher
4348 from .docopt_command import get_handler
5055 from .utils import get_version_info
5156 from .utils import human_readable_file_size
5257 from .utils import yesno
58 from compose.metrics.client import MetricsCommand
59 from compose.metrics.client import Status
5360
5461
5562 if not IS_WINDOWS_PLATFORM:
5663 from dockerpty.pty import PseudoTerminal, RunOperation, ExecOperation
5764
5865 log = logging.getLogger(__name__)
59 console_handler = logging.StreamHandler(sys.stderr)
60
61
62 def main():
66
67
68 def main(): # noqa: C901
6369 signals.ignore_sigpipe()
70 command = None
6471 try:
65 command = dispatch()
66 command()
72 _, opts, command = DocoptDispatcher.get_command_and_options(
73 TopLevelCommand,
74 get_filtered_args(sys.argv[1:]),
75 {'options_first': True, 'version': get_version_info('compose')})
76 except Exception:
77 pass
78 try:
79 command_func = dispatch()
80 command_func()
81 if not IS_LINUX_PLATFORM and command == 'help':
82 print("\nDocker Compose is now in the Docker CLI, try `docker compose` help")
6783 except (KeyboardInterrupt, signals.ShutdownException):
68 log.error("Aborting.")
69 sys.exit(1)
84 exit_with_metrics(command, "Aborting.", status=Status.CANCELED)
7085 except (UserError, NoSuchService, ConfigurationError,
7186 ProjectError, OperationFailedError) as e:
72 log.error(e.msg)
73 sys.exit(1)
87 exit_with_metrics(command, e.msg, status=Status.FAILURE)
7488 except BuildError as e:
7589 reason = ""
7690 if e.reason:
7791 reason = " : " + e.reason
78 log.error("Service '{}' failed to build{}".format(e.service.name, reason))
79 sys.exit(1)
92 exit_with_metrics(command,
93 "Service '{}' failed to build{}".format(e.service.name, reason),
94 status=Status.FAILURE)
8095 except StreamOutputError as e:
81 log.error(e)
82 sys.exit(1)
96 exit_with_metrics(command, e, status=Status.FAILURE)
8397 except NeedsBuildError as e:
84 log.error("Service '{}' needs to be built, but --no-build was passed.".format(e.service.name))
85 sys.exit(1)
98 exit_with_metrics(command,
99 "Service '{}' needs to be built, but --no-build was passed.".format(
100 e.service.name), status=Status.FAILURE)
86101 except NoSuchCommand as e:
87102 commands = "\n".join(parse_doc_section("commands:", getdoc(e.supercommand)))
88 log.error("No such command: %s\n\n%s", e.command, commands)
89 sys.exit(1)
103 if not IS_LINUX_PLATFORM:
104 commands += "\n\nDocker Compose is now in the Docker CLI, try `docker compose`"
105 exit_with_metrics("", log_msg="No such command: {}\n\n{}".format(
106 e.command, commands), status=Status.FAILURE)
90107 except (errors.ConnectionError, StreamParseError):
91 sys.exit(1)
108 exit_with_metrics(command, status=Status.FAILURE)
109 except SystemExit as e:
110 status = Status.SUCCESS
111 if len(sys.argv) > 1 and '--help' not in sys.argv:
112 status = Status.FAILURE
113
114 if command and len(sys.argv) >= 3 and sys.argv[2] == '--help':
115 command = '--help ' + command
116
117 if not command and len(sys.argv) >= 2 and sys.argv[1] == '--help':
118 command = '--help'
119
120 msg = e.args[0] if len(e.args) else ""
121 code = 0
122 if isinstance(e.code, int):
123 code = e.code
124
125 if not IS_LINUX_PLATFORM and not command:
126 msg += "\n\nDocker Compose is now in the Docker CLI, try `docker compose`"
127
128 exit_with_metrics(command, log_msg=msg, status=status,
129 exit_code=code)
130
131
132 def get_filtered_args(args):
133 if args[0] in ('-h', '--help'):
134 return []
135 if args[0] == '--version':
136 return ['version']
137
138
139 def exit_with_metrics(command, log_msg=None, status=Status.SUCCESS, exit_code=1):
140 if log_msg and command != 'exec':
141 if not exit_code:
142 log.info(log_msg)
143 else:
144 log.error(log_msg)
145
146 MetricsCommand(command, status=status).send_metrics()
147 sys.exit(exit_code)
92148
93149
94150 def dispatch():
95 setup_logging()
151 console_stream = sys.stderr
152 console_handler = logging.StreamHandler(console_stream)
153 setup_logging(console_handler)
96154 dispatcher = DocoptDispatcher(
97155 TopLevelCommand,
98156 {'options_first': True, 'version': get_version_info('compose')})
99157
100158 options, handler, command_options = dispatcher.parse(sys.argv[1:])
159
160 ansi_mode = AnsiMode.AUTO
161 try:
162 if options.get("--ansi"):
163 ansi_mode = AnsiMode(options.get("--ansi"))
164 except ValueError:
165 raise UserError(
166 'Invalid value for --ansi: {}. Expected one of {}.'.format(
167 options.get("--ansi"),
168 ', '.join(m.value for m in AnsiMode)
169 )
170 )
171 if options.get("--no-ansi"):
172 if options.get("--ansi"):
173 raise UserError("--no-ansi and --ansi cannot be combined.")
174 log.warning('--no-ansi option is deprecated and will be removed in future versions. '
175 'Use `--ansi never` instead.')
176 ansi_mode = AnsiMode.NEVER
177
101178 setup_console_handler(console_handler,
102179 options.get('--verbose'),
103 set_no_color_if_clicolor(options.get('--no-ansi')),
180 ansi_mode.use_ansi_codes(console_handler.stream),
104181 options.get("--log-level"))
105 setup_parallel_logger(set_no_color_if_clicolor(options.get('--no-ansi')))
106 if options.get('--no-ansi'):
182 setup_parallel_logger(ansi_mode)
183 if ansi_mode is AnsiMode.NEVER:
107184 command_options['--no-color'] = True
108185 return functools.partial(perform_command, options, handler, command_options)
109186
125202 handler(command, command_options)
126203
127204
128 def setup_logging():
205 def setup_logging(console_handler):
129206 root_logger = logging.getLogger()
130207 root_logger.addHandler(console_handler)
131208 root_logger.setLevel(logging.DEBUG)
132209
133 # Disable requests logging
210 # Disable requests and docker-py logging
211 logging.getLogger("urllib3").propagate = False
134212 logging.getLogger("requests").propagate = False
135
136
137 def setup_parallel_logger(noansi):
138 if noansi:
139 import compose.parallel
140 compose.parallel.ParallelStreamWriter.set_noansi()
141
142
143 def setup_console_handler(handler, verbose, noansi=False, level=None):
144 if handler.stream.isatty() and noansi is False:
213 logging.getLogger("docker").propagate = False
214
215
216 def setup_parallel_logger(ansi_mode):
217 ParallelStreamWriter.set_default_ansi_mode(ansi_mode)
218
219
220 def setup_console_handler(handler, verbose, use_console_formatter=True, level=None):
221 if use_console_formatter:
145222 format_class = ConsoleWarningFormatter
146223 else:
147224 format_class = logging.Formatter
181258 """Define and run multi-container applications with Docker.
182259
183260 Usage:
184 docker-compose [-f <arg>...] [options] [--] [COMMAND] [ARGS...]
261 docker-compose [-f <arg>...] [--profile <name>...] [options] [--] [COMMAND] [ARGS...]
185262 docker-compose -h|--help
186263
187264 Options:
189266 (default: docker-compose.yml)
190267 -p, --project-name NAME Specify an alternate project name
191268 (default: directory name)
269 --profile NAME Specify a profile to enable
192270 -c, --context NAME Specify a context name
193271 --verbose Show more output
194272 --log-level LEVEL Set log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
195 --no-ansi Do not print ANSI control characters
273 --ansi (never|always|auto) Control when to print ANSI control characters
274 --no-ansi Do not print ANSI control characters (DEPRECATED)
196275 -v, --version Print version and exit
197276 -H, --host HOST Daemon socket to connect to
198277
213292 build Build or rebuild services
214293 config Validate and view the Compose file
215294 create Create services
216 down Stop and remove containers, networks, images, and volumes
295 down Stop and remove resources
217296 events Receive real time events from containers
218297 exec Execute a command in a running container
219298 help Get help on a command
243322
244323 @property
245324 def project_dir(self):
246 return self.toplevel_options.get('--project-directory') or '.'
325 return get_project_dir(self.toplevel_options)
247326
248327 @property
249328 def toplevel_environment(self):
250329 environment_file = self.toplevel_options.get('--env-file')
251330 return Environment.from_env_file(self.project_dir, environment_file)
252331
332 @metrics()
253333 def build(self, options):
254334 """
255335 Build or rebuild services.
269349 --no-rm Do not remove intermediate containers after a successful build.
270350 --parallel Build images in parallel.
271351 --progress string Set type of progress output (auto, plain, tty).
272 EXPERIMENTAL flag for native builder.
273 To enable, run with COMPOSE_DOCKER_CLI_BUILD=1)
274352 --pull Always attempt to pull a newer version of the image.
275353 -q, --quiet Don't print anything to STDOUT
276354 """
284362 )
285363 build_args = resolve_build_args(build_args, self.toplevel_environment)
286364
287 native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD')
365 native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD', True)
288366
289367 self.project.build(
290368 service_names=options['SERVICE'],
301379 progress=options.get('--progress'),
302380 )
303381
382 @metrics()
304383 def config(self, options):
305384 """
306385 Validate and view the Compose file.
312391 --no-interpolate Don't interpolate environment variables.
313392 -q, --quiet Only validate the configuration, don't print
314393 anything.
394 --profiles Print the profile names, one per line.
315395 --services Print the service names, one per line.
316396 --volumes Print the volume names, one per line.
317397 --hash="*" Print the service config hash, one per line.
329409 image_digests = image_digests_for_project(self.project)
330410
331411 if options['--quiet']:
412 return
413
414 if options['--profiles']:
415 profiles = set()
416 for service in compose_config.services:
417 if 'profiles' in service:
418 for profile in service['profiles']:
419 profiles.add(profile)
420 print('\n'.join(sorted(profiles)))
332421 return
333422
334423 if options['--services']:
350439
351440 print(serialize_config(compose_config, image_digests, not options['--no-interpolate']))
352441
442 @metrics()
353443 def create(self, options):
354444 """
355445 Creates containers for a service.
378468 do_build=build_action_from_opts(options),
379469 )
380470
471 @metrics()
381472 def down(self, options):
382473 """
383474 Stops containers and removes containers, networks, volumes, and images
429520 Options:
430521 --json Output events as a stream of json objects
431522 """
523
432524 def format_event(event):
433525 attributes = ["%s=%s" % item for item in event['attributes'].items()]
434526 return ("{time} {type} {action} {id} ({attrs})").format(
445537 print(formatter(event))
446538 sys.stdout.flush()
447539
540 @metrics("exec")
448541 def exec_command(self, options):
449542 """
450543 Execute a command in a running container
521614 sys.exit(exit_code)
522615
523616 @classmethod
617 @metrics()
524618 def help(cls, options):
525619 """
526620 Get help on a command.
534628
535629 print(getdoc(subject))
536630
631 @metrics()
537632 def images(self, options):
538633 """
539634 List images used by the created containers.
588683 ])
589684 print(Formatter.table(headers, rows))
590685
686 @metrics()
591687 def kill(self, options):
592688 """
593689 Force stop service containers.
602698
603699 self.project.kill(service_names=options['SERVICE'], signal=signal)
604700
701 @metrics()
605702 def logs(self, options):
606703 """
607704 View output from containers.
609706 Usage: logs [options] [--] [SERVICE...]
610707
611708 Options:
612 --no-color Produce monochrome output.
613 -f, --follow Follow log output.
614 -t, --timestamps Show timestamps.
615 --tail="all" Number of lines to show from the end of the logs
616 for each container.
709 --no-color Produce monochrome output.
710 -f, --follow Follow log output.
711 -t, --timestamps Show timestamps.
712 --tail="all" Number of lines to show from the end of the logs
713 for each container.
714 --no-log-prefix Don't print prefix in logs.
617715 """
618716 containers = self.project.containers(service_names=options['SERVICE'], stopped=True)
619717
632730 log_printer_from_project(
633731 self.project,
634732 containers,
635 set_no_color_if_clicolor(options['--no-color']),
733 options['--no-color'],
636734 log_args,
637 event_stream=self.project.events(service_names=options['SERVICE'])).run()
638
735 event_stream=self.project.events(service_names=options['SERVICE']),
736 keep_prefix=not options['--no-log-prefix']).run()
737
738 @metrics()
639739 def pause(self, options):
640740 """
641741 Pause services.
645745 containers = self.project.pause(service_names=options['SERVICE'])
646746 exit_if(not containers, 'No containers to pause', 1)
647747
748 @metrics()
648749 def port(self, options):
649750 """
650751 Print the public port for a port binding.
666767 options['PRIVATE_PORT'],
667768 protocol=options.get('--protocol') or 'tcp') or '')
668769
770 @metrics()
669771 def ps(self, options):
670772 """
671773 List containers.
722824 ])
723825 print(Formatter.table(headers, rows))
724826
827 @metrics()
725828 def pull(self, options):
726829 """
727830 Pulls images for services defined in a Compose file, but does not start the containers.
745848 include_deps=options.get('--include-deps'),
746849 )
747850
851 @metrics()
748852 def push(self, options):
749853 """
750854 Pushes images for services.
759863 ignore_push_failures=options.get('--ignore-push-failures')
760864 )
761865
866 @metrics()
762867 def rm(self, options):
763868 """
764869 Removes stopped service containers.
803908 else:
804909 print("No stopped containers")
805910
911 @metrics()
806912 def run(self, options):
807913 """
808914 Run a one-off command on a service.
863969 self.toplevel_options, self.toplevel_environment
864970 )
865971
972 @metrics()
866973 def scale(self, options):
867974 """
868975 Set number of containers to run for a service.
891998 for service_name, num in parse_scale_args(options['SERVICE=NUM']).items():
892999 self.project.get_service(service_name).scale(num, timeout=timeout)
8931000
1001 @metrics()
8941002 def start(self, options):
8951003 """
8961004 Start existing containers.
9001008 containers = self.project.start(service_names=options['SERVICE'])
9011009 exit_if(not containers, 'No containers to start', 1)
9021010
1011 @metrics()
9031012 def stop(self, options):
9041013 """
9051014 Stop running containers without removing them.
9151024 timeout = timeout_from_opts(options)
9161025 self.project.stop(service_names=options['SERVICE'], timeout=timeout)
9171026
1027 @metrics()
9181028 def restart(self, options):
9191029 """
9201030 Restart running containers.
9291039 containers = self.project.restart(service_names=options['SERVICE'], timeout=timeout)
9301040 exit_if(not containers, 'No containers to restart', 1)
9311041
1042 @metrics()
9321043 def top(self, options):
9331044 """
9341045 Display the running processes
9561067 print(container.name)
9571068 print(Formatter.table(headers, rows))
9581069
1070 @metrics()
9591071 def unpause(self, options):
9601072 """
9611073 Unpause services.
9651077 containers = self.project.unpause(service_names=options['SERVICE'])
9661078 exit_if(not containers, 'No containers to unpause', 1)
9671079
1080 @metrics()
9681081 def up(self, options):
9691082 """
9701083 Builds, (re)creates, starts, and attaches to containers for a service.
10161129 container. Implies --abort-on-container-exit.
10171130 --scale SERVICE=NUM Scale SERVICE to NUM instances. Overrides the
10181131 `scale` setting in the Compose file if present.
1132 --no-log-prefix Don't print prefix in logs.
10191133 """
10201134 start_deps = not options['--no-deps']
10211135 always_recreate_deps = options['--always-recreate-deps']
10271141 detached = options.get('--detach')
10281142 no_start = options.get('--no-start')
10291143 attach_dependencies = options.get('--attach-dependencies')
1144 keep_prefix = not options.get('--no-log-prefix')
10301145
10311146 if detached and (cascade_stop or exit_value_from or attach_dependencies):
10321147 raise UserError(
10411156 for excluded in [x for x in opts if options.get(x) and no_start]:
10421157 raise UserError('--no-start and {} cannot be combined.'.format(excluded))
10431158
1044 native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD')
1159 native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD', True)
10451160
10461161 with up_shutdown_context(self.project, service_names, timeout, detached):
10471162 warn_for_swarm_mode(self.project.client)
10631178 renew_anonymous_volumes=options.get('--renew-anon-volumes'),
10641179 silent=options.get('--quiet-pull'),
10651180 cli=native_builder,
1181 attach_dependencies=attach_dependencies,
10661182 )
10671183
10681184 try:
10901206 log_printer = log_printer_from_project(
10911207 self.project,
10921208 attached_containers,
1093 set_no_color_if_clicolor(options['--no-color']),
1209 options['--no-color'],
10941210 {'follow': True},
10951211 cascade_stop,
1096 event_stream=self.project.events(service_names=service_names))
1212 event_stream=self.project.events(service_names=service_names),
1213 keep_prefix=keep_prefix)
10971214 print("Attaching to", list_containers(log_printer.containers))
10981215 cascade_starter = log_printer.run()
10991216
11111228 sys.exit(exit_code)
11121229
11131230 @classmethod
1231 @metrics()
11141232 def version(cls, options):
11151233 """
11161234 Show version information and quit.
13751493
13761494
13771495 def log_printer_from_project(
1378 project,
1379 containers,
1380 monochrome,
1381 log_args,
1382 cascade_stop=False,
1383 event_stream=None,
1496 project,
1497 containers,
1498 monochrome,
1499 log_args,
1500 cascade_stop=False,
1501 event_stream=None,
1502 keep_prefix=True,
13841503 ):
13851504 return LogPrinter(
1386 containers,
1387 build_log_presenters(project.service_names, monochrome),
1505 [c for c in containers if c.log_driver not in (None, 'none')],
1506 build_log_presenters(project.service_names, monochrome, keep_prefix),
13881507 event_stream or project.events(),
13891508 cascade_stop=cascade_stop,
13901509 log_args=log_args)
13911510
13921511
13931512 def filter_attached_containers(containers, service_names, attach_dependencies=False):
1394 if attach_dependencies or not service_names:
1395 return containers
1396
1397 return [
1398 container
1399 for container in containers if container.service in service_names
1400 ]
1513 return filter_attached_for_up(
1514 containers,
1515 service_names,
1516 attach_dependencies,
1517 lambda container: container.service)
14011518
14021519
14031520 @contextlib.contextmanager
15731690 "To deploy your application across the swarm, "
15741691 "use `docker stack deploy`.\n"
15751692 )
1576
1577
1578 def set_no_color_if_clicolor(no_color_flag):
1579 return no_color_flag or os.environ.get('CLICOLOR') == "0"
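The `main.py` hunks above replace the boolean `--no-ansi` flag with a three-valued `--ansi` option that is validated against an `AnsiMode` enum and consulted before wiring up the console handler. As a rough standalone sketch of that pattern (the real enum lives in `compose.cli.colors`; the `AnsiMode` class and `parse_ansi_option` helper below are simplified stand-ins, not the shipped implementation):

```python
from enum import Enum


class AnsiMode(Enum):
    """Simplified stand-in for compose.cli.colors.AnsiMode."""
    NEVER = "never"
    ALWAYS = "always"
    AUTO = "auto"

    def use_ansi_codes(self, stream):
        # ALWAYS and NEVER are unconditional; AUTO emits ANSI control
        # codes only when the output stream is a real terminal.
        if self is AnsiMode.ALWAYS:
            return True
        if self is AnsiMode.NEVER:
            return False
        return stream.isatty()


def parse_ansi_option(value):
    """Validate an --ansi value, mirroring the check in dispatch() above."""
    try:
        return AnsiMode(value)
    except ValueError:
        raise ValueError(
            'Invalid value for --ansi: {}. Expected one of {}.'.format(
                value, ', '.join(m.value for m in AnsiMode)))
```

With this sketch, `parse_ansi_option("never")` yields `AnsiMode.NEVER`, and under `AUTO` a non-tty stream (for example an in-memory `io.StringIO()`) falls back to plain, uncolored output, which is why the deprecated `--no-ansi` can be mapped onto `AnsiMode.NEVER` without a separate code path.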
0 {
1 "$schema": "http://json-schema.org/draft/2019-09/schema#",
2 "id": "compose_spec.json",
3 "type": "object",
4 "title": "Compose Specification",
5 "description": "The Compose file is a YAML file defining a multi-container application.",
6
7 "properties": {
8 "version": {
9 "type": "string",
10 "description": "Version of the Compose specification used. Tools not implementing the required version MUST reject the configuration file."
11 },
12
13 "services": {
14 "id": "#/properties/services",
15 "type": "object",
16 "patternProperties": {
17 "^[a-zA-Z0-9._-]+$": {
18 "$ref": "#/definitions/service"
19 }
20 },
21 "additionalProperties": false
22 },
23
24 "networks": {
25 "id": "#/properties/networks",
26 "type": "object",
27 "patternProperties": {
28 "^[a-zA-Z0-9._-]+$": {
29 "$ref": "#/definitions/network"
30 }
31 }
32 },
33
34 "volumes": {
35 "id": "#/properties/volumes",
36 "type": "object",
37 "patternProperties": {
38 "^[a-zA-Z0-9._-]+$": {
39 "$ref": "#/definitions/volume"
40 }
41 },
42 "additionalProperties": false
43 },
44
45 "secrets": {
46 "id": "#/properties/secrets",
47 "type": "object",
48 "patternProperties": {
49 "^[a-zA-Z0-9._-]+$": {
50 "$ref": "#/definitions/secret"
51 }
52 },
53 "additionalProperties": false
54 },
55
56 "configs": {
57 "id": "#/properties/configs",
58 "type": "object",
59 "patternProperties": {
60 "^[a-zA-Z0-9._-]+$": {
61 "$ref": "#/definitions/config"
62 }
63 },
64 "additionalProperties": false
65 }
66 },
67
68 "patternProperties": {"^x-": {}},
69 "additionalProperties": false,
70
71 "definitions": {
72
73 "service": {
74 "id": "#/definitions/service",
75 "type": "object",
76
77 "properties": {
78 "deploy": {"$ref": "#/definitions/deployment"},
79 "build": {
80 "oneOf": [
81 {"type": "string"},
82 {
83 "type": "object",
84 "properties": {
85 "context": {"type": "string"},
86 "dockerfile": {"type": "string"},
87 "args": {"$ref": "#/definitions/list_or_dict"},
88 "labels": {"$ref": "#/definitions/list_or_dict"},
89 "cache_from": {"type": "array", "items": {"type": "string"}},
90 "network": {"type": "string"},
91 "target": {"type": "string"},
92 "shm_size": {"type": ["integer", "string"]},
93 "extra_hosts": {"$ref": "#/definitions/list_or_dict"},
94 "isolation": {"type": "string"}
95 },
96 "additionalProperties": false,
97 "patternProperties": {"^x-": {}}
98 }
99 ]
100 },
101 "blkio_config": {
102 "type": "object",
103 "properties": {
104 "device_read_bps": {
105 "type": "array",
106 "items": {"$ref": "#/definitions/blkio_limit"}
107 },
108 "device_read_iops": {
109 "type": "array",
110 "items": {"$ref": "#/definitions/blkio_limit"}
111 },
112 "device_write_bps": {
113 "type": "array",
114 "items": {"$ref": "#/definitions/blkio_limit"}
115 },
116 "device_write_iops": {
117 "type": "array",
118 "items": {"$ref": "#/definitions/blkio_limit"}
119 },
120 "weight": {"type": "integer"},
121 "weight_device": {
122 "type": "array",
123 "items": {"$ref": "#/definitions/blkio_weight"}
124 }
125 },
126 "additionalProperties": false
127 },
128 "cap_add": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
129 "cap_drop": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
130 "cgroup_parent": {"type": "string"},
131 "command": {
132 "oneOf": [
133 {"type": "string"},
134 {"type": "array", "items": {"type": "string"}}
135 ]
136 },
137 "configs": {
138 "type": "array",
139 "items": {
140 "oneOf": [
141 {"type": "string"},
142 {
143 "type": "object",
144 "properties": {
145 "source": {"type": "string"},
146 "target": {"type": "string"},
147 "uid": {"type": "string"},
148 "gid": {"type": "string"},
149 "mode": {"type": "number"}
150 },
151 "additionalProperties": false,
152 "patternProperties": {"^x-": {}}
153 }
154 ]
155 }
156 },
157 "container_name": {"type": "string"},
158 "cpu_count": {"type": "integer", "minimum": 0},
159 "cpu_percent": {"type": "integer", "minimum": 0, "maximum": 100},
160 "cpu_shares": {"type": ["number", "string"]},
161 "cpu_quota": {"type": ["number", "string"]},
162 "cpu_period": {"type": ["number", "string"]},
163 "cpu_rt_period": {"type": ["number", "string"]},
164 "cpu_rt_runtime": {"type": ["number", "string"]},
165 "cpus": {"type": ["number", "string"]},
166 "cpuset": {"type": "string"},
167 "credential_spec": {
168 "type": "object",
169 "properties": {
170 "config": {"type": "string"},
171 "file": {"type": "string"},
172 "registry": {"type": "string"}
173 },
174 "additionalProperties": false,
175 "patternProperties": {"^x-": {}}
176 },
177 "depends_on": {
178 "oneOf": [
179 {"$ref": "#/definitions/list_of_strings"},
180 {
181 "type": "object",
182 "additionalProperties": false,
183 "patternProperties": {
184 "^[a-zA-Z0-9._-]+$": {
185 "type": "object",
186 "additionalProperties": false,
187 "properties": {
188 "condition": {
189 "type": "string",
190 "enum": ["service_started", "service_healthy", "service_completed_successfully"]
191 }
192 },
193 "required": ["condition"]
194 }
195 }
196 }
197 ]
198 },
199 "device_cgroup_rules": {"$ref": "#/definitions/list_of_strings"},
200 "devices": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
201 "dns": {"$ref": "#/definitions/string_or_list"},
202 "dns_opt": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
203 "dns_search": {"$ref": "#/definitions/string_or_list"},
204 "domainname": {"type": "string"},
205 "entrypoint": {
206 "oneOf": [
207 {"type": "string"},
208 {"type": "array", "items": {"type": "string"}}
209 ]
210 },
211 "env_file": {"$ref": "#/definitions/string_or_list"},
212 "environment": {"$ref": "#/definitions/list_or_dict"},
213
214 "expose": {
215 "type": "array",
216 "items": {
217 "type": ["string", "number"],
218 "format": "expose"
219 },
220 "uniqueItems": true
221 },
222 "extends": {
223 "oneOf": [
224 {"type": "string"},
225 {
226 "type": "object",
227
228 "properties": {
229 "service": {"type": "string"},
230 "file": {"type": "string"}
231 },
232 "required": ["service"],
233 "additionalProperties": false
234 }
235 ]
236 },
237 "external_links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
238 "extra_hosts": {"$ref": "#/definitions/list_or_dict"},
239 "group_add": {
240 "type": "array",
241 "items": {
242 "type": ["string", "number"]
243 },
244 "uniqueItems": true
245 },
246 "healthcheck": {"$ref": "#/definitions/healthcheck"},
247 "hostname": {"type": "string"},
248 "image": {"type": "string"},
249 "init": {"type": "boolean"},
250 "ipc": {"type": "string"},
251 "isolation": {"type": "string"},
252 "labels": {"$ref": "#/definitions/list_or_dict"},
253 "links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
254 "logging": {
255 "type": "object",
256
257 "properties": {
258 "driver": {"type": "string"},
259 "options": {
260 "type": "object",
261 "patternProperties": {
262 "^.+$": {"type": ["string", "number", "null"]}
263 }
264 }
265 },
266 "additionalProperties": false,
267 "patternProperties": {"^x-": {}}
268 },
269 "mac_address": {"type": "string"},
270 "mem_limit": {"type": ["number", "string"]},
271 "mem_reservation": {"type": ["string", "integer"]},
272 "mem_swappiness": {"type": "integer"},
273 "memswap_limit": {"type": ["number", "string"]},
274 "network_mode": {"type": "string"},
275 "networks": {
276 "oneOf": [
277 {"$ref": "#/definitions/list_of_strings"},
278 {
279 "type": "object",
280 "patternProperties": {
281 "^[a-zA-Z0-9._-]+$": {
282 "oneOf": [
283 {
284 "type": "object",
285 "properties": {
286 "aliases": {"$ref": "#/definitions/list_of_strings"},
287 "ipv4_address": {"type": "string"},
288 "ipv6_address": {"type": "string"},
289 "link_local_ips": {"$ref": "#/definitions/list_of_strings"},
290 "priority": {"type": "number"}
291 },
292 "additionalProperties": false,
293 "patternProperties": {"^x-": {}}
294 },
295 {"type": "null"}
296 ]
297 }
298 },
299 "additionalProperties": false
300 }
301 ]
302 },
303 "oom_kill_disable": {"type": "boolean"},
304 "oom_score_adj": {"type": "integer", "minimum": -1000, "maximum": 1000},
305 "pid": {"type": ["string", "null"]},
306 "pids_limit": {"type": ["number", "string"]},
307 "platform": {"type": "string"},
308 "ports": {
309 "type": "array",
310 "items": {
311 "oneOf": [
312 {"type": "number", "format": "ports"},
313 {"type": "string", "format": "ports"},
314 {
315 "type": "object",
316 "properties": {
317 "mode": {"type": "string"},
318 "target": {"type": "integer"},
319 "published": {"type": "integer"},
320 "protocol": {"type": "string"}
321 },
322 "additionalProperties": false,
323 "patternProperties": {"^x-": {}}
324 }
325 ]
326 },
327 "uniqueItems": true
328 },
329 "privileged": {"type": "boolean"},
330 "profiles": {"$ref": "#/definitions/list_of_strings"},
331 "pull_policy": {"type": "string", "enum": [
332 "always", "never", "if_not_present", "build"
333 ]},
334 "read_only": {"type": "boolean"},
335 "restart": {"type": "string"},
336 "runtime": {
337 "type": "string"
338 },
339 "scale": {
340 "type": "integer"
341 },
342 "security_opt": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
343 "shm_size": {"type": ["number", "string"]},
344 "secrets": {
345 "type": "array",
346 "items": {
347 "oneOf": [
348 {"type": "string"},
349 {
350 "type": "object",
351 "properties": {
352 "source": {"type": "string"},
353 "target": {"type": "string"},
354 "uid": {"type": "string"},
355 "gid": {"type": "string"},
356 "mode": {"type": "number"}
357 },
358 "additionalProperties": false,
359 "patternProperties": {"^x-": {}}
360 }
361 ]
362 }
363 },
364 "sysctls": {"$ref": "#/definitions/list_or_dict"},
365 "stdin_open": {"type": "boolean"},
366 "stop_grace_period": {"type": "string", "format": "duration"},
367 "stop_signal": {"type": "string"},
368 "storage_opt": {"type": "object"},
369 "tmpfs": {"$ref": "#/definitions/string_or_list"},
370 "tty": {"type": "boolean"},
371 "ulimits": {
372 "type": "object",
373 "patternProperties": {
374 "^[a-z]+$": {
375 "oneOf": [
376 {"type": "integer"},
377 {
378 "type": "object",
379 "properties": {
380 "hard": {"type": "integer"},
381 "soft": {"type": "integer"}
382 },
383 "required": ["soft", "hard"],
384 "additionalProperties": false,
385 "patternProperties": {"^x-": {}}
386 }
387 ]
388 }
389 }
390 },
391 "user": {"type": "string"},
392 "userns_mode": {"type": "string"},
393 "volumes": {
394 "type": "array",
395 "items": {
396 "oneOf": [
397 {"type": "string"},
398 {
399 "type": "object",
400 "required": ["type"],
401 "properties": {
402 "type": {"type": "string"},
403 "source": {"type": "string"},
404 "target": {"type": "string"},
405 "read_only": {"type": "boolean"},
406 "consistency": {"type": "string"},
407 "bind": {
408 "type": "object",
409 "properties": {
410 "propagation": {"type": "string"}
411 },
412 "additionalProperties": false,
413 "patternProperties": {"^x-": {}}
414 },
415 "volume": {
416 "type": "object",
417 "properties": {
418 "nocopy": {"type": "boolean"}
419 },
420 "additionalProperties": false,
421 "patternProperties": {"^x-": {}}
422 },
423 "tmpfs": {
424 "type": "object",
425 "properties": {
426 "size": {
427 "type": "integer",
428 "minimum": 0
429 }
430 },
431 "additionalProperties": false,
432 "patternProperties": {"^x-": {}}
433 }
434 },
435 "additionalProperties": false,
436 "patternProperties": {"^x-": {}}
437 }
438 ]
439 },
440 "uniqueItems": true
441 },
442 "volumes_from": {
443 "type": "array",
444 "items": {"type": "string"},
445 "uniqueItems": true
446 },
447 "working_dir": {"type": "string"}
448 },
449 "patternProperties": {"^x-": {}},
450 "additionalProperties": false
451 },
452
453 "healthcheck": {
454 "id": "#/definitions/healthcheck",
455 "type": "object",
456 "properties": {
457 "disable": {"type": "boolean"},
458 "interval": {"type": "string", "format": "duration"},
459 "retries": {"type": "number"},
460 "test": {
461 "oneOf": [
462 {"type": "string"},
463 {"type": "array", "items": {"type": "string"}}
464 ]
465 },
466 "timeout": {"type": "string", "format": "duration"},
467 "start_period": {"type": "string", "format": "duration"}
468 },
469 "additionalProperties": false,
470 "patternProperties": {"^x-": {}}
471 },
472 "deployment": {
473 "id": "#/definitions/deployment",
474 "type": ["object", "null"],
475 "properties": {
476 "mode": {"type": "string"},
477 "endpoint_mode": {"type": "string"},
478 "replicas": {"type": "integer"},
479 "labels": {"$ref": "#/definitions/list_or_dict"},
480 "rollback_config": {
481 "type": "object",
482 "properties": {
483 "parallelism": {"type": "integer"},
484 "delay": {"type": "string", "format": "duration"},
485 "failure_action": {"type": "string"},
486 "monitor": {"type": "string", "format": "duration"},
487 "max_failure_ratio": {"type": "number"},
488 "order": {"type": "string", "enum": [
489 "start-first", "stop-first"
490 ]}
491 },
492 "additionalProperties": false,
493 "patternProperties": {"^x-": {}}
494 },
495 "update_config": {
496 "type": "object",
497 "properties": {
498 "parallelism": {"type": "integer"},
499 "delay": {"type": "string", "format": "duration"},
500 "failure_action": {"type": "string"},
501 "monitor": {"type": "string", "format": "duration"},
502 "max_failure_ratio": {"type": "number"},
503 "order": {"type": "string", "enum": [
504 "start-first", "stop-first"
505 ]}
506 },
507 "additionalProperties": false,
508 "patternProperties": {"^x-": {}}
509 },
510 "resources": {
511 "type": "object",
512 "properties": {
513 "limits": {
514 "type": "object",
515 "properties": {
516 "cpus": {"type": ["number", "string"]},
517 "memory": {"type": "string"}
518 },
519 "additionalProperties": false,
520 "patternProperties": {"^x-": {}}
521 },
522 "reservations": {
523 "type": "object",
524 "properties": {
525 "cpus": {"type": ["number", "string"]},
526 "memory": {"type": "string"},
527 "generic_resources": {"$ref": "#/definitions/generic_resources"},
528 "devices": {"$ref": "#/definitions/devices"}
529 },
530 "additionalProperties": false,
531 "patternProperties": {"^x-": {}}
532 }
533 },
534 "additionalProperties": false,
535 "patternProperties": {"^x-": {}}
536 },
537 "restart_policy": {
538 "type": "object",
539 "properties": {
540 "condition": {"type": "string"},
541 "delay": {"type": "string", "format": "duration"},
542 "max_attempts": {"type": "integer"},
543 "window": {"type": "string", "format": "duration"}
544 },
545 "additionalProperties": false,
546 "patternProperties": {"^x-": {}}
547 },
548 "placement": {
549 "type": "object",
550 "properties": {
551 "constraints": {"type": "array", "items": {"type": "string"}},
552 "preferences": {
553 "type": "array",
554 "items": {
555 "type": "object",
556 "properties": {
557 "spread": {"type": "string"}
558 },
559 "additionalProperties": false,
560 "patternProperties": {"^x-": {}}
561 }
562 },
563 "max_replicas_per_node": {"type": "integer"}
564 },
565 "additionalProperties": false,
566 "patternProperties": {"^x-": {}}
567 }
568 },
569 "additionalProperties": false,
570 "patternProperties": {"^x-": {}}
571 },
572
573 "generic_resources": {
574 "id": "#/definitions/generic_resources",
575 "type": "array",
576 "items": {
577 "type": "object",
578 "properties": {
579 "discrete_resource_spec": {
580 "type": "object",
581 "properties": {
582 "kind": {"type": "string"},
583 "value": {"type": "number"}
584 },
585 "additionalProperties": false,
586 "patternProperties": {"^x-": {}}
587 }
588 },
589 "additionalProperties": false,
590 "patternProperties": {"^x-": {}}
591 }
592 },
593
594 "devices": {
595 "id": "#/definitions/devices",
596 "type": "array",
597 "items": {
598 "type": "object",
599 "properties": {
600 "capabilities": {"$ref": "#/definitions/list_of_strings"},
601 "count": {"type": ["string", "integer"]},
602 "device_ids": {"$ref": "#/definitions/list_of_strings"},
603 "driver": {"type": "string"},
604 "options": {"$ref": "#/definitions/list_or_dict"}
605 },
606 "additionalProperties": false,
607 "patternProperties": {"^x-": {}}
608 }
609 },
610
611 "network": {
612 "id": "#/definitions/network",
613 "type": ["object", "null"],
614 "properties": {
615 "name": {"type": "string"},
616 "driver": {"type": "string"},
617 "driver_opts": {
618 "type": "object",
619 "patternProperties": {
620 "^.+$": {"type": ["string", "number"]}
621 }
622 },
623 "ipam": {
624 "type": "object",
625 "properties": {
626 "driver": {"type": "string"},
627 "config": {
628 "type": "array",
629 "items": {
630 "type": "object",
631 "properties": {
632 "subnet": {"type": "string", "format": "subnet_ip_address"},
633 "ip_range": {"type": "string"},
634 "gateway": {"type": "string"},
635 "aux_addresses": {
636 "type": "object",
637 "additionalProperties": false,
638 "patternProperties": {"^.+$": {"type": "string"}}
639 }
640 },
641 "additionalProperties": false,
642 "patternProperties": {"^x-": {}}
643 }
644 },
645 "options": {
646 "type": "object",
647 "additionalProperties": false,
648 "patternProperties": {"^.+$": {"type": "string"}}
649 }
650 },
651 "additionalProperties": false,
652 "patternProperties": {"^x-": {}}
653 },
654 "external": {
655 "type": ["boolean", "object"],
656 "properties": {
657 "name": {
658 "deprecated": true,
659 "type": "string"
660 }
661 },
662 "additionalProperties": false,
663 "patternProperties": {"^x-": {}}
664 },
665 "internal": {"type": "boolean"},
666 "enable_ipv6": {"type": "boolean"},
667 "attachable": {"type": "boolean"},
668 "labels": {"$ref": "#/definitions/list_or_dict"}
669 },
670 "additionalProperties": false,
671 "patternProperties": {"^x-": {}}
672 },
673
674 "volume": {
675 "id": "#/definitions/volume",
676 "type": ["object", "null"],
677 "properties": {
678 "name": {"type": "string"},
679 "driver": {"type": "string"},
680 "driver_opts": {
681 "type": "object",
682 "patternProperties": {
683 "^.+$": {"type": ["string", "number"]}
684 }
685 },
686 "external": {
687 "type": ["boolean", "object"],
688 "properties": {
689 "name": {
690 "deprecated": true,
691 "type": "string"
692 }
693 },
694 "additionalProperties": false,
695 "patternProperties": {"^x-": {}}
696 },
697 "labels": {"$ref": "#/definitions/list_or_dict"}
698 },
699 "additionalProperties": false,
700 "patternProperties": {"^x-": {}}
701 },
702
703 "secret": {
704 "id": "#/definitions/secret",
705 "type": "object",
706 "properties": {
707 "name": {"type": "string"},
708 "file": {"type": "string"},
709 "external": {
710 "type": ["boolean", "object"],
711 "properties": {
712 "name": {"type": "string"}
713 }
714 },
715 "labels": {"$ref": "#/definitions/list_or_dict"},
716 "driver": {"type": "string"},
717 "driver_opts": {
718 "type": "object",
719 "patternProperties": {
720 "^.+$": {"type": ["string", "number"]}
721 }
722 },
723 "template_driver": {"type": "string"}
724 },
725 "additionalProperties": false,
726 "patternProperties": {"^x-": {}}
727 },
728
729 "config": {
730 "id": "#/definitions/config",
731 "type": "object",
732 "properties": {
733 "name": {"type": "string"},
734 "file": {"type": "string"},
735 "external": {
736 "type": ["boolean", "object"],
737 "properties": {
738 "name": {
739 "deprecated": true,
740 "type": "string"
741 }
742 }
743 },
744 "labels": {"$ref": "#/definitions/list_or_dict"},
745 "template_driver": {"type": "string"}
746 },
747 "additionalProperties": false,
748 "patternProperties": {"^x-": {}}
749 },
750
751 "string_or_list": {
752 "oneOf": [
753 {"type": "string"},
754 {"$ref": "#/definitions/list_of_strings"}
755 ]
756 },
757
758 "list_of_strings": {
759 "type": "array",
760 "items": {"type": "string"},
761 "uniqueItems": true
762 },
763
764 "list_or_dict": {
765 "oneOf": [
766 {
767 "type": "object",
768 "patternProperties": {
769 ".+": {
770 "type": ["string", "number", "null"]
771 }
772 },
773 "additionalProperties": false
774 },
775 {"type": "array", "items": {"type": "string"}, "uniqueItems": true}
776 ]
777 },
778
779 "blkio_limit": {
780 "type": "object",
781 "properties": {
782 "path": {"type": "string"},
783 "rate": {"type": ["integer", "string"]}
784 },
785 "additionalProperties": false
786 },
787 "blkio_weight": {
788 "type": "object",
789 "properties": {
790 "path": {"type": "string"},
791 "weight": {"type": "integer"}
792 },
793 "additionalProperties": false
794 },
795
796 "constraints": {
797 "service": {
798 "id": "#/definitions/constraints/service",
799 "anyOf": [
800 {"required": ["build"]},
801 {"required": ["image"]}
802 ],
803 "properties": {
804 "build": {
805 "required": ["context"]
806 }
807 }
808 }
809 }
810 }
811 }
99 from operator import itemgetter
1010
1111 import yaml
12 from cached_property import cached_property
12
13 try:
14 from functools import cached_property
15 except ImportError:
16 from cached_property import cached_property
1317
1418 from . import types
1519 from ..const import COMPOSE_SPEC as VERSION
132136 'logging',
133137 'network_mode',
134138 'platform',
139 'profiles',
135140 'scale',
136141 'stop_grace_period',
137142 ]
147152 SUPPORTED_FILENAMES = [
148153 'docker-compose.yml',
149154 'docker-compose.yaml',
155 'compose.yml',
156 'compose.yaml',
150157 ]
151158
152 DEFAULT_OVERRIDE_FILENAMES = ('docker-compose.override.yml', 'docker-compose.override.yaml')
159 DEFAULT_OVERRIDE_FILENAMES = ('docker-compose.override.yml',
160 'docker-compose.override.yaml',
161 'compose.override.yml',
162 'compose.override.yaml')
153163
154164
155165 log = logging.getLogger(__name__)
302312 if filenames:
303313 filenames = [os.path.join(base_dir, f) for f in filenames]
304314 else:
315 # search for compose files in the base dir and its parents
305316 filenames = get_default_config_files(base_dir)
317 if not filenames and not override_dir:
318 # none found in base_dir and no override_dir defined
319 raise ComposeFileNotFound(SUPPORTED_FILENAMES)
320 if not filenames:
321 # search for compose files in the project directory and its parents
322 filenames = get_default_config_files(override_dir)
323 if not filenames:
324 raise ComposeFileNotFound(SUPPORTED_FILENAMES)
306325
307326 log.debug("Using configuration files: {}".format(",".join(filenames)))
308327 return ConfigDetails(
333352 (candidates, path) = find_candidates_in_parent_dirs(SUPPORTED_FILENAMES, base_dir)
334353
335354 if not candidates:
336 raise ComposeFileNotFound(SUPPORTED_FILENAMES)
355 return None
337356
338357 winner = candidates[0]
339358
370389 return find_candidates_in_parent_dirs(filenames, parent_dir)
371390
372391 return (candidates, path)
392
393
394 def check_swarm_only_config(service_dicts):
395 warning_template = (
396 "Some services ({services}) use the '{key}' key, which will be ignored. "
397 "Compose does not support '{key}' configuration - use "
398 "`docker stack deploy` to deploy to a swarm."
399 )
400 key = 'configs'
401 services = [s for s in service_dicts if s.get(key)]
402 if services:
403 log.warning(
404 warning_template.format(
405 services=", ".join(sorted(s['name'] for s in services)),
406 key=key
407 )
408 )
373409
374410
375411 def load(config_details, interpolate=True):
408444 for service_dict in service_dicts:
409445 match_named_volumes(service_dict, volumes)
410446
447 check_swarm_only_config(service_dicts)
448
411449 return Config(main_file.config_version, main_file.version,
412450 service_dicts, volumes, networks, secrets, configs)
413451
535573 config_file.version,
536574 config,
537575 section,
538 environment
539 )
576 environment)
540577 else:
541578 return config
542579
10461083
10471084 for field in [
10481085 'cap_add', 'cap_drop', 'expose', 'external_links',
1049 'volumes_from', 'device_cgroup_rules',
1086 'volumes_from', 'device_cgroup_rules', 'profiles',
10501087 ]:
10511088 md.merge_field(field, merge_unique_items_lists, default=[])
10521089
11651202 md.merge_scalar('cpus')
11661203 md.merge_scalar('memory')
11671204 md.merge_sequence('generic_resources', types.GenericResource.parse)
1205 md.merge_field('devices', merge_unique_objects_lists, default=[])
11681206 return dict(md)
11691207
11701208
+0
-773
compose/config/config_schema_compose_spec.json
0 {
1 "$schema": "http://json-schema.org/draft/2019-09/schema#",
2 "id": "config_schema_compose_spec.json",
3 "type": "object",
4 "title": "Compose Specification",
5 "description": "The Compose file is a YAML file defining a multi-container application.",
6 "properties": {
7 "version": {
8 "type": "string",
9 "description": "Version of the Compose specification used. Tools not implementing required version MUST reject the configuration file."
10 },
11 "services": {
12 "id": "#/properties/services",
13 "type": "object",
14 "patternProperties": {
15 "^[a-zA-Z0-9._-]+$": {
16 "$ref": "#/definitions/service"
17 }
18 },
19 "additionalProperties": false
20 },
21 "networks": {
22 "id": "#/properties/networks",
23 "type": "object",
24 "patternProperties": {
25 "^[a-zA-Z0-9._-]+$": {
26 "$ref": "#/definitions/network"
27 }
28 }
29 },
30 "volumes": {
31 "id": "#/properties/volumes",
32 "type": "object",
33 "patternProperties": {
34 "^[a-zA-Z0-9._-]+$": {
35 "$ref": "#/definitions/volume"
36 }
37 },
38 "additionalProperties": false
39 },
40 "secrets": {
41 "id": "#/properties/secrets",
42 "type": "object",
43 "patternProperties": {
44 "^[a-zA-Z0-9._-]+$": {
45 "$ref": "#/definitions/secret"
46 }
47 },
48 "additionalProperties": false
49 },
50 "configs": {
51 "id": "#/properties/configs",
52 "type": "object",
53 "patternProperties": {
54 "^[a-zA-Z0-9._-]+$": {
55 "$ref": "#/definitions/config"
56 }
57 },
58 "additionalProperties": false
59 }
60 },
61 "patternProperties": {"^x-": {}},
62 "additionalProperties": false,
63 "definitions": {
64 "service": {
65 "id": "#/definitions/service",
66 "type": "object",
67 "properties": {
68 "deploy": {"$ref": "#/definitions/deployment"},
69 "build": {
70 "oneOf": [
71 {"type": "string"},
72 {
73 "type": "object",
74 "properties": {
75 "context": {"type": "string"},
76 "dockerfile": {"type": "string"},
77 "args": {"$ref": "#/definitions/list_or_dict"},
78 "labels": {"$ref": "#/definitions/list_or_dict"},
79 "cache_from": {"$ref": "#/definitions/list_of_strings"},
80 "network": {"type": "string"},
81 "target": {"type": "string"},
82 "shm_size": {"type": ["integer", "string"]},
83 "extra_hosts": {"$ref": "#/definitions/list_or_dict"},
84 "isolation": {"type": "string"}
85 },
86 "additionalProperties": false,
87 "patternProperties": {"^x-": {}}
88 }
89 ]
90 },
91 "blkio_config": {
92 "type": "object",
93 "properties": {
94 "device_read_bps": {
95 "type": "array",
96 "items": {"$ref": "#/definitions/blkio_limit"}
97 },
98 "device_read_iops": {
99 "type": "array",
100 "items": {"$ref": "#/definitions/blkio_limit"}
101 },
102 "device_write_bps": {
103 "type": "array",
104 "items": {"$ref": "#/definitions/blkio_limit"}
105 },
106 "device_write_iops": {
107 "type": "array",
108 "items": {"$ref": "#/definitions/blkio_limit"}
109 },
110 "weight": {"type": "integer"},
111 "weight_device": {
112 "type": "array",
113 "items": {"$ref": "#/definitions/blkio_weight"}
114 }
115 },
116 "additionalProperties": false
117 },
118 "cap_add": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
119 "cap_drop": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
120 "cgroup_parent": {"type": "string"},
121 "command": {
122 "oneOf": [
123 {"type": "string"},
124 {"type": "array", "items": {"type": "string"}}
125 ]
126 },
127 "configs": {
128 "type": "array",
129 "items": {
130 "oneOf": [
131 {"type": "string"},
132 {
133 "type": "object",
134 "properties": {
135 "source": {"type": "string"},
136 "target": {"type": "string"},
137 "uid": {"type": "string"},
138 "gid": {"type": "string"},
139 "mode": {"type": "number"}
140 },
141 "additionalProperties": false,
142 "patternProperties": {"^x-": {}}
143 }
144 ]
145 }
146 },
147 "container_name": {"type": "string"},
148 "cpu_count": {"type": "integer", "minimum": 0},
149 "cpu_percent": {"type": "integer", "minimum": 0, "maximum": 100},
150 "cpu_shares": {"type": ["number", "string"]},
151 "cpu_quota": {"type": ["number", "string"]},
152 "cpu_period": {"type": ["number", "string"]},
153 "cpu_rt_period": {"type": ["number", "string"]},
154 "cpu_rt_runtime": {"type": ["number", "string"]},
155 "cpus": {"type": ["number", "string"]},
156 "cpuset": {"type": "string"},
157 "credential_spec": {
158 "type": "object",
159 "properties": {
160 "config": {"type": "string"},
161 "file": {"type": "string"},
162 "registry": {"type": "string"}
163 },
164 "additionalProperties": false,
165 "patternProperties": {"^x-": {}}
166 },
167 "depends_on": {
168 "oneOf": [
169 {"$ref": "#/definitions/list_of_strings"},
170 {
171 "type": "object",
172 "additionalProperties": false,
173 "patternProperties": {
174 "^[a-zA-Z0-9._-]+$": {
175 "type": "object",
176 "additionalProperties": false,
177 "properties": {
178 "condition": {
179 "type": "string",
180 "enum": ["service_started", "service_healthy"]
181 }
182 },
183 "required": ["condition"]
184 }
185 }
186 }
187 ]
188 },
189 "device_cgroup_rules": {"$ref": "#/definitions/list_of_strings"},
190 "devices": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
191 "dns": {"$ref": "#/definitions/string_or_list"},
192
193 "dns_opt": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
194 "dns_search": {"$ref": "#/definitions/string_or_list"},
195 "domainname": {"type": "string"},
196 "entrypoint": {
197 "oneOf": [
198 {"type": "string"},
199 {"type": "array", "items": {"type": "string"}}
200 ]
201 },
202 "env_file": {"$ref": "#/definitions/string_or_list"},
203 "environment": {"$ref": "#/definitions/list_or_dict"},
204
205 "expose": {
206 "type": "array",
207 "items": {
208 "type": ["string", "number"],
209 "format": "expose"
210 },
211 "uniqueItems": true
212 },
213
214 "extends": {
215 "oneOf": [
216 {"type": "string"},
217 {
218 "type": "object",
219 "properties": {
220 "service": {"type": "string"},
221 "file": {"type": "string"}
222 },
223 "required": ["service"],
224 "additionalProperties": false
225 }
226 ]
227 },
228 "external_links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
229 "extra_hosts": {"$ref": "#/definitions/list_or_dict"},
230 "group_add": {
231 "type": "array",
232 "items": {
233 "type": ["string", "number"]
234 },
235 "uniqueItems": true
236 },
237 "healthcheck": {"$ref": "#/definitions/healthcheck"},
238 "hostname": {"type": "string"},
239 "image": {"type": "string"},
240 "init": {"type": "boolean"},
241 "ipc": {"type": "string"},
242 "isolation": {"type": "string"},
243 "labels": {"$ref": "#/definitions/list_or_dict"},
244 "links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
245 "logging": {
246 "type": "object",
247 "properties": {
248 "driver": {"type": "string"},
249 "options": {
250 "type": "object",
251 "patternProperties": {
252 "^.+$": {"type": ["string", "number", "null"]}
253 }
254 }
255 },
256 "additionalProperties": false,
257 "patternProperties": {"^x-": {}}
258 },
259 "mac_address": {"type": "string"},
260 "mem_limit": {"type": "string"},
261 "mem_reservation": {"type": ["string", "integer"]},
262 "mem_swappiness": {"type": "integer"},
263 "memswap_limit": {"type": ["number", "string"]},
264 "network_mode": {"type": "string"},
265 "networks": {
266 "oneOf": [
267 {"$ref": "#/definitions/list_of_strings"},
268 {
269 "type": "object",
270 "patternProperties": {
271 "^[a-zA-Z0-9._-]+$": {
272 "oneOf": [
273 {
274 "type": "object",
275 "properties": {
276 "aliases": {"$ref": "#/definitions/list_of_strings"},
277 "ipv4_address": {"type": "string"},
278 "ipv6_address": {"type": "string"},
279 "link_local_ips": {"$ref": "#/definitions/list_of_strings"},
280 "priority": {"type": "number"}
281 },
282 "additionalProperties": false,
283 "patternProperties": {"^x-": {}}
284 },
285 {"type": "null"}
286 ]
287 }
288 },
289 "additionalProperties": false
290 }
291 ]
292 },
293 "oom_kill_disable": {"type": "boolean"},
294 "oom_score_adj": {"type": "integer", "minimum": -1000, "maximum": 1000},
295 "pid": {"type": ["string", "null"]},
296 "pids_limit": {"type": ["number", "string"]},
297 "platform": {"type": "string"},
298 "ports": {
299 "type": "array",
300 "items": {
301 "oneOf": [
302 {"type": "number", "format": "ports"},
303 {"type": "string", "format": "ports"},
304 {
305 "type": "object",
306 "properties": {
307 "mode": {"type": "string"},
308 "target": {"type": "integer"},
309 "published": {"type": "integer"},
310 "protocol": {"type": "string"}
311 },
312 "additionalProperties": false,
313 "patternProperties": {"^x-": {}}
314 }
315 ]
316 },
317 "uniqueItems": true
318 },
319 "privileged": {"type": "boolean"},
320 "pull_policy": {"type": "string", "enum": [
321 "always", "never", "if_not_present"
322 ]},
323 "read_only": {"type": "boolean"},
324 "restart": {"type": "string"},
325 "runtime": {
326 "deprecated": true,
327 "type": "string"
328 },
329 "scale": {
330 "type": "integer"
331 },
332 "security_opt": {"type": "array", "items": {"type": "string"}, "uniqueItems": true},
333 "shm_size": {"type": ["number", "string"]},
334 "secrets": {
335 "type": "array",
336 "items": {
337 "oneOf": [
338 {"type": "string"},
339 {
340 "type": "object",
341 "properties": {
342 "source": {"type": "string"},
343 "target": {"type": "string"},
344 "uid": {"type": "string"},
345 "gid": {"type": "string"},
346 "mode": {"type": "number"}
347 },
348 "additionalProperties": false,
349 "patternProperties": {"^x-": {}}
350 }
351 ]
352 }
353 },
354 "sysctls": {"$ref": "#/definitions/list_or_dict"},
355 "stdin_open": {"type": "boolean"},
356 "stop_grace_period": {"type": "string", "format": "duration"},
357 "stop_signal": {"type": "string"},
358 "tmpfs": {"$ref": "#/definitions/string_or_list"},
359 "tty": {"type": "boolean"},
360 "ulimits": {
361 "type": "object",
362 "patternProperties": {
363 "^[a-z]+$": {
364 "oneOf": [
365 {"type": "integer"},
366 {
367 "type": "object",
368 "properties": {
369 "hard": {"type": "integer"},
370 "soft": {"type": "integer"}
371 },
372 "required": ["soft", "hard"],
373 "additionalProperties": false,
374 "patternProperties": {"^x-": {}}
375 }
376 ]
377 }
378 }
379 },
380 "user": {"type": "string"},
381 "userns_mode": {"type": "string"},
382 "volumes": {
383 "type": "array",
384 "items": {
385 "oneOf": [
386 {"type": "string"},
387 {
388 "type": "object",
389 "required": ["type"],
390 "properties": {
391 "type": {"type": "string"},
392 "source": {"type": "string"},
393 "target": {"type": "string"},
394 "read_only": {"type": "boolean"},
395 "consistency": {"type": "string"},
396 "bind": {
397 "type": "object",
398 "properties": {
399 "propagation": {"type": "string"}
400 },
401 "additionalProperties": false,
402 "patternProperties": {"^x-": {}}
403 },
404 "volume": {
405 "type": "object",
406 "properties": {
407 "nocopy": {"type": "boolean"}
408 },
409 "additionalProperties": false,
410 "patternProperties": {"^x-": {}}
411 },
412 "tmpfs": {
413 "type": "object",
414 "properties": {
415 "size": {
416 "type": "integer",
417 "minimum": 0
418 }
419 },
420 "additionalProperties": false,
421 "patternProperties": {"^x-": {}}
422 }
423 },
424 "additionalProperties": false,
425 "patternProperties": {"^x-": {}}
426 }
427 ],
428 "uniqueItems": true
429 }
430 },
431 "volumes_from": {
432 "type": "array",
433 "items": {"type": "string"},
434 "uniqueItems": true
435 },
436 "working_dir": {"type": "string"}
437 },
438 "patternProperties": {"^x-": {}},
439 "additionalProperties": false
440 },
441
442 "healthcheck": {
443 "id": "#/definitions/healthcheck",
444 "type": "object",
445 "properties": {
446 "disable": {"type": "boolean"},
447 "interval": {"type": "string", "format": "duration"},
448 "retries": {"type": "number"},
449 "test": {
450 "oneOf": [
451 {"type": "string"},
452 {"type": "array", "items": {"type": "string"}}
453 ]
454 },
455 "timeout": {"type": "string", "format": "duration"},
456 "start_period": {"type": "string", "format": "duration"}
457 },
458 "additionalProperties": false,
459 "patternProperties": {"^x-": {}}
460 },
461 "deployment": {
462 "id": "#/definitions/deployment",
463 "type": ["object", "null"],
464 "properties": {
465 "mode": {"type": "string"},
466 "endpoint_mode": {"type": "string"},
467 "replicas": {"type": "integer"},
468 "labels": {"$ref": "#/definitions/list_or_dict"},
469 "rollback_config": {
470 "type": "object",
471 "properties": {
472 "parallelism": {"type": "integer"},
473 "delay": {"type": "string", "format": "duration"},
474 "failure_action": {"type": "string"},
475 "monitor": {"type": "string", "format": "duration"},
476 "max_failure_ratio": {"type": "number"},
477 "order": {"type": "string", "enum": [
478 "start-first", "stop-first"
479 ]}
480 },
481 "additionalProperties": false,
482 "patternProperties": {"^x-": {}}
483 },
484 "update_config": {
485 "type": "object",
486 "properties": {
487 "parallelism": {"type": "integer"},
488 "delay": {"type": "string", "format": "duration"},
489 "failure_action": {"type": "string"},
490 "monitor": {"type": "string", "format": "duration"},
491 "max_failure_ratio": {"type": "number"},
492 "order": {"type": "string", "enum": [
493 "start-first", "stop-first"
494 ]}
495 },
496 "additionalProperties": false,
497 "patternProperties": {"^x-": {}}
498 },
499 "resources": {
500 "type": "object",
501 "properties": {
502 "limits": {
503 "type": "object",
504 "properties": {
505 "cpus": {"type": ["number", "string"]},
506 "memory": {"type": "string"}
507 },
508 "additionalProperties": false,
509 "patternProperties": {"^x-": {}}
510 },
511 "reservations": {
512 "type": "object",
513 "properties": {
514 "cpus": {"type": ["number", "string"]},
515 "memory": {"type": "string"},
516 "generic_resources": {"$ref": "#/definitions/generic_resources"}
517 },
518 "additionalProperties": false,
519 "patternProperties": {"^x-": {}}
520 }
521 },
522 "additionalProperties": false,
523 "patternProperties": {"^x-": {}}
524 },
525 "restart_policy": {
526 "type": "object",
527 "properties": {
528 "condition": {"type": "string"},
529 "delay": {"type": "string", "format": "duration"},
530 "max_attempts": {"type": "integer"},
531 "window": {"type": "string", "format": "duration"}
532 },
533 "additionalProperties": false,
534 "patternProperties": {"^x-": {}}
535 },
536 "placement": {
537 "type": "object",
538 "properties": {
539 "constraints": {"type": "array", "items": {"type": "string"}},
540 "preferences": {
541 "type": "array",
542 "items": {
543 "type": "object",
544 "properties": {
545 "spread": {"type": "string"}
546 },
547 "additionalProperties": false,
548 "patternProperties": {"^x-": {}}
549 }
550 },
551 "max_replicas_per_node": {"type": "integer"}
552 },
553 "additionalProperties": false,
554 "patternProperties": {"^x-": {}}
555 }
556 },
557 "additionalProperties": false,
558 "patternProperties": {"^x-": {}}
559 },
560 "generic_resources": {
561 "id": "#/definitions/generic_resources",
562 "type": "array",
563 "items": {
564 "type": "object",
565 "properties": {
566 "discrete_resource_spec": {
567 "type": "object",
568 "properties": {
569 "kind": {"type": "string"},
570 "value": {"type": "number"}
571 },
572 "additionalProperties": false,
573 "patternProperties": {"^x-": {}}
574 }
575 },
576 "additionalProperties": false,
577 "patternProperties": {"^x-": {}}
578 }
579 },
580 "network": {
581 "id": "#/definitions/network",
582 "type": ["object", "null"],
583 "properties": {
584 "name": {"type": "string"},
585 "driver": {"type": "string"},
586 "driver_opts": {
587 "type": "object",
588 "patternProperties": {
589 "^.+$": {"type": ["string", "number"]}
590 }
591 },
592 "ipam": {
593 "type": "object",
594 "properties": {
595 "driver": {"type": "string"},
596 "config": {
597 "type": "array",
598 "items": {
599 "type": "object",
600 "properties": {
601 "subnet": {"type": "string", "format": "subnet_ip_address"},
602 "ip_range": {"type": "string"},
603 "gateway": {"type": "string"},
604 "aux_addresses": {
605 "type": "object",
606 "additionalProperties": false,
607 "patternProperties": {"^.+$": {"type": "string"}}
608 }
609 }
610 },
611 "additionalProperties": false,
612 "patternProperties": {"^x-": {}}
613 },
614 "options": {
615 "type": "object",
616 "additionalProperties": false,
617 "patternProperties": {"^.+$": {"type": "string"}}
618 }
619 },
620 "additionalProperties": false,
621 "patternProperties": {"^x-": {}}
622 },
623 "external": {
624 "type": ["boolean", "object"],
625 "properties": {
626 "name": {
627 "deprecated": true,
628 "type": "string"
629 }
630 },
631 "additionalProperties": false,
632 "patternProperties": {"^x-": {}}
633 },
634 "internal": {"type": "boolean"},
635 "enable_ipv6": {"type": "boolean"},
636 "attachable": {"type": "boolean"},
637 "labels": {"$ref": "#/definitions/list_or_dict"}
638 },
639 "additionalProperties": false,
640 "patternProperties": {"^x-": {}}
641 },
642 "volume": {
643 "id": "#/definitions/volume",
644 "type": ["object", "null"],
645 "properties": {
646 "name": {"type": "string"},
647 "driver": {"type": "string"},
648 "driver_opts": {
649 "type": "object",
650 "patternProperties": {
651 "^.+$": {"type": ["string", "number"]}
652 }
653 },
654 "external": {
655 "type": ["boolean", "object"],
656 "properties": {
657 "name": {
658 "deprecated": true,
659 "type": "string"
660 }
661 },
662 "additionalProperties": false,
663 "patternProperties": {"^x-": {}}
664 },
665 "labels": {"$ref": "#/definitions/list_or_dict"}
666 },
667 "additionalProperties": false,
668 "patternProperties": {"^x-": {}}
669 },
670 "secret": {
671 "id": "#/definitions/secret",
672 "type": "object",
673 "properties": {
674 "name": {"type": "string"},
675 "file": {"type": "string"},
676 "external": {
677 "type": ["boolean", "object"],
678 "properties": {
679 "name": {"type": "string"}
680 }
681 },
682 "labels": {"$ref": "#/definitions/list_or_dict"},
683 "driver": {"type": "string"},
684 "driver_opts": {
685 "type": "object",
686 "patternProperties": {
687 "^.+$": {"type": ["string", "number"]}
688 }
689 },
690 "template_driver": {"type": "string"}
691 },
692 "additionalProperties": false,
693 "patternProperties": {"^x-": {}}
694 },
695 "config": {
696 "id": "#/definitions/config",
697 "type": "object",
698 "properties": {
699 "name": {"type": "string"},
700 "file": {"type": "string"},
701 "external": {
702 "type": ["boolean", "object"],
703 "properties": {
704 "name": {
705 "deprecated": true,
706 "type": "string"
707 }
708 }
709 },
710 "labels": {"$ref": "#/definitions/list_or_dict"},
711 "template_driver": {"type": "string"}
712 },
713 "additionalProperties": false,
714 "patternProperties": {"^x-": {}}
715 },
716 "string_or_list": {
717 "oneOf": [
718 {"type": "string"},
719 {"$ref": "#/definitions/list_of_strings"}
720 ]
721 },
722 "list_of_strings": {
723 "type": "array",
724 "items": {"type": "string"},
725 "uniqueItems": true
726 },
727 "list_or_dict": {
728 "oneOf": [
729 {
730 "type": "object",
731 "patternProperties": {
732 ".+": {
733 "type": ["string", "number", "null"]
734 }
735 },
736 "additionalProperties": false
737 },
738 {"type": "array", "items": {"type": "string"}, "uniqueItems": true}
739 ]
740 },
741 "blkio_limit": {
742 "type": "object",
743 "properties": {
744 "path": {"type": "string"},
745 "rate": {"type": ["integer", "string"]}
746 },
747 "additionalProperties": false
748 },
749 "blkio_weight": {
750 "type": "object",
751 "properties": {
752 "path": {"type": "string"},
753 "weight": {"type": "integer"}
754 },
755 "additionalProperties": false
756 },
757 "constraints": {
758 "service": {
759 "id": "#/definitions/constraints/service",
760 "anyOf": [
761 {"required": ["build"]},
762 {"required": ["image"]}
763 ],
764 "properties": {
765 "build": {
766 "required": ["context"]
767 }
768 }
769 }
770 }
771 }
772 }
5353 if base_dir is None:
5454 return result
5555 if env_file:
56 env_file_path = os.path.join(base_dir, env_file)
57 else:
58 env_file_path = os.path.join(base_dir, '.env')
56 env_file_path = os.path.join(os.getcwd(), env_file)
57 return cls(env_vars_from_file(env_file_path))
58
59 env_file_path = os.path.join(base_dir, '.env')
5960 try:
6061 return cls(env_vars_from_file(env_file_path))
6162 except EnvFileNotFound:
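The hunk above changes how the environment file path is resolved: an explicit `--env-file` is now joined against the current working directory, while the default `.env` is still looked up in the project (base) directory. A minimal standalone sketch of that rule, using a hypothetical `resolve_env_file` helper rather than Compose's actual class method:

```python
import os

def resolve_env_file(base_dir, env_file=None):
    """Resolve the env file path per the new rules: an explicit
    --env-file is relative to the current working directory, while the
    default .env is looked up in the project (base) directory."""
    if env_file:
        return os.path.join(os.getcwd(), env_file)
    return os.path.join(base_dir, '.env')
```

This matches the 1.28.6 changelog entry making `--env-file` relative to the working directory.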
112113 )
113114 return super().get(key, *args, **kwargs)
114115
115 def get_boolean(self, key):
116 def get_boolean(self, key, default=False):
116117 # Convert a value to a boolean using "common sense" rules.
117118 # Unset, empty, "0" and "false" (i-case) yield False.
118119 # All other values yield True.
119120 value = self.get(key)
120121 if not value:
121 return False
122 return default
122123 if value.lower() in ['0', 'false']:
123124 return False
124125 return True
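The `get_boolean` change above adds a `default` parameter that is returned for unset or empty values instead of a hard-coded `False`. The coercion rules can be sketched as a standalone function (a hypothetical re-statement, not the Compose `Environment` class itself):

```python
def get_boolean(env, key, default=False):
    """Coerce an environment value to bool using the "common sense" rules:
    unset or empty values yield the supplied default; "0" and "false"
    (case-insensitive) yield False; all other values yield True."""
    value = env.get(key)
    if not value:
        return default
    return value.lower() not in ('0', 'false')
```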
110110 var, _, err = braced.partition(':?')
111111 result = mapping.get(var)
112112 if not result:
113 err = err or var
113114 raise UnsetRequiredSubstitution(err)
114115 return result
115116 elif '?' == sep:
116117 var, _, err = braced.partition('?')
117118 if var in mapping:
118119 return mapping.get(var)
120 err = err or var
119121 raise UnsetRequiredSubstitution(err)
120122
121123 # Modified from python2.7/string.py
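The `err = err or var` lines added above make `${VAR:?}` and `${VAR?}` substitutions report the variable name when no custom error message was supplied. A self-contained sketch of the `:?` branch (a hypothetical `required` helper raising `ValueError` in place of `UnsetRequiredSubstitution`):

```python
def required(mapping, braced):
    """Resolve a ${VAR:?err}-style substitution: return the value if set
    and non-empty, otherwise raise with `err`, falling back to the
    variable name when no message was given (the diff's fix)."""
    var, _, err = braced.partition(':?')
    result = mapping.get(var)
    if not result:
        raise ValueError(err or var)
    return result
```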
240242 service_path('healthcheck', 'disable'): to_boolean,
241243 service_path('deploy', 'labels', PATH_JOKER): to_str,
242244 service_path('deploy', 'replicas'): to_int,
245 service_path('deploy', 'placement', 'max_replicas_per_node'): to_int,
243246 service_path('deploy', 'resources', 'limits', "cpus"): to_float,
244247 service_path('deploy', 'update_config', 'parallelism'): to_int,
245248 service_path('deploy', 'update_config', 'max_failure_ratio'): to_float,
501501
502502
503503 def load_jsonschema(version):
504 suffix = "compose_spec"
504 name = "compose_spec"
505505 if version == V1:
506 suffix = "v1"
506 name = "config_schema_v1"
507507
508508 filename = os.path.join(
509509 get_schema_path(),
510 "config_schema_{}.json".format(suffix))
510 "{}.json".format(name))
511511
512512 if not os.path.exists(filename):
513513 raise ConfigurationError(
44 DEFAULT_TIMEOUT = 10
55 HTTP_TIMEOUT = 60
66 IS_WINDOWS_PLATFORM = (sys.platform == "win32")
7 IS_LINUX_PLATFORM = (sys.platform == "linux")
78 LABEL_CONTAINER_NUMBER = 'com.docker.compose.container-number'
89 LABEL_ONE_OFF = 'com.docker.compose.oneoff'
910 LABEL_PROJECT = 'com.docker.compose.project'
186186 return self.get('HostConfig.LogConfig.Type')
187187
188188 @property
189 def has_api_logs(self):
190 log_type = self.log_driver
191 return not log_type or log_type in ('json-file', 'journald', 'local')
192
193 @property
194189 def human_readable_health_status(self):
195190 """ Generate UP status string with up time and health
196191 """
203198 return status_string
204199
205200 def attach_log_stream(self):
206 """A log stream can only be attached if the container uses a
207 json-file, journald or local log driver.
208 """
209 if self.has_api_logs:
210 self.log_stream = self.attach(stdout=True, stderr=True, stream=True)
201 self.log_stream = self.attach(stdout=True, stderr=True, stream=True)
211202
212203 def get(self, key):
213204 """Return a value from the container or None if the value is not set.
2626 service_name
2727 )
2828 )
29
30
31 class CompletedUnsuccessfully(Exception):
32 def __init__(self, container_id, exit_code):
33 self.msg = 'Container "{}" exited with code {}.'.format(container_id, exit_code)
(New empty file)
0 import os
1 from enum import Enum
2
3 import requests
4 from docker import ContextAPI
5 from docker.transport import UnixHTTPAdapter
6
7 from compose.const import IS_WINDOWS_PLATFORM
8
9 if IS_WINDOWS_PLATFORM:
10 from docker.transport import NpipeHTTPAdapter
11
12
13 class Status(Enum):
14 SUCCESS = "success"
15 FAILURE = "failure"
16 CANCELED = "canceled"
17
18
19 class MetricsSource:
20 CLI = "docker-compose"
21
22
23 if IS_WINDOWS_PLATFORM:
24 METRICS_SOCKET_FILE = 'npipe://\\\\.\\pipe\\docker_cli'
25 else:
26 METRICS_SOCKET_FILE = 'http+unix:///var/run/docker-cli.sock'
27
28
29 class MetricsCommand(requests.Session):
30 """
31 Representation of a command invocation reported to the metrics endpoint.
32 """
33
34 def __init__(self, command,
35 context_type=None, status=Status.SUCCESS,
36 source=MetricsSource.CLI, uri=None):
37 super().__init__()
38 self.command = ("compose " + command).strip() if command else "compose --help"
39 self.context = context_type or ContextAPI.get_current_context().context_type or 'moby'
40 self.source = source
41 self.status = status.value
42 self.uri = uri or os.environ.get("METRICS_SOCKET_FILE", METRICS_SOCKET_FILE)
43 if IS_WINDOWS_PLATFORM:
44 self.mount("http+unix://", NpipeHTTPAdapter(self.uri))
45 else:
46 self.mount("http+unix://", UnixHTTPAdapter(self.uri))
47
48 def send_metrics(self):
49 try:
50 return self.post("http+unix://localhost/usage",
51 json=self.to_map(),
52 timeout=.05,
53 headers={'Content-Type': 'application/json'})
54 except Exception as e:
55 return e
56
57 def to_map(self):
58 return {
59 'command': self.command,
60 'context': self.context,
61 'source': self.source,
62 'status': self.status,
63 }
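The payload `MetricsCommand` posts is small; the only non-obvious part is how the command name is derived in `__init__`. A standalone sketch of that derivation (without the HTTP session):

```python
def metric_command_name(command):
    # Mirrors MetricsCommand.__init__ above: prefix with "compose",
    # strip stray whitespace, and fall back to "compose --help" when
    # no command was given.
    return ("compose " + command).strip() if command else "compose --help"

print(metric_command_name("up -d"))  # compose up -d
print(metric_command_name(None))     # compose --help
print(metric_command_name(""))       # compose --help
```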
0 import functools
1
2 from compose.metrics.client import MetricsCommand
3 from compose.metrics.client import Status
4
5
6 class metrics:
7 def __init__(self, command_name=None):
8 self.command_name = command_name
9
10 def __call__(self, fn):
11 @functools.wraps(fn,
12 assigned=functools.WRAPPER_ASSIGNMENTS,
13 updated=functools.WRAPPER_UPDATES)
14 def wrapper(*args, **kwargs):
15 if not self.command_name:
16 self.command_name = fn.__name__
17 result = fn(*args, **kwargs)
18 MetricsCommand(self.command_name, status=Status.SUCCESS).send_metrics()
19 return result
20 return wrapper
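The decorator above defaults the metric name to the wrapped function's name and reports success after the call returns. A simplified, self-contained sketch of the same pattern (function-based rather than class-based, recording to a list instead of posting to the metrics socket):

```python
import functools

sent = []  # stand-in for MetricsCommand(...).send_metrics()

def metrics(command_name=None):
    # Simplified sketch of the class-based decorator above.
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            # Default the metric name to the handler's function name.
            sent.append(command_name or fn.__name__)
            return result
        return wrapper
    return decorate

@metrics()
def up():
    return 0

up()
print(sent)  # ['up']
```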
1010 from docker.errors import APIError
1111 from docker.errors import ImageNotFound
1212
13 from compose.cli.colors import AnsiMode
1314 from compose.cli.colors import green
1415 from compose.cli.colors import red
1516 from compose.cli.signals import ShutdownException
1617 from compose.const import PARALLEL_LIMIT
18 from compose.errors import CompletedUnsuccessfully
1719 from compose.errors import HealthCheckFailed
1820 from compose.errors import NoHealthCheckConfigured
1921 from compose.errors import OperationFailedError
5961 elif isinstance(exception, APIError):
6062 errors[get_name(obj)] = exception.explanation
6163 writer.write(msg, get_name(obj), 'error', red)
62 elif isinstance(exception, (OperationFailedError, HealthCheckFailed, NoHealthCheckConfigured)):
64 elif isinstance(exception, (OperationFailedError, HealthCheckFailed, NoHealthCheckConfigured,
65 CompletedUnsuccessfully)):
6366 errors[get_name(obj)] = exception.msg
6467 writer.write(msg, get_name(obj), 'error', red)
6568 elif isinstance(exception, UpstreamError):
8285 objects = list(objects)
8386 stream = sys.stderr
8487
85 if ParallelStreamWriter.instance:
86 writer = ParallelStreamWriter.instance
87 else:
88 writer = ParallelStreamWriter(stream)
88 writer = ParallelStreamWriter.get_or_assign_instance(ParallelStreamWriter(stream))
8989
9090 for obj in objects:
9191 writer.add_object(msg, get_name(obj))
242242 'not processing'.format(obj)
243243 )
244244 results.put((obj, None, e))
245 except CompletedUnsuccessfully as e:
246 log.debug(
247 'Service(s) upstream of {} did not complete successfully - '
248 'not processing'.format(obj)
249 )
250 results.put((obj, None, e))
245251
246252 if state.is_done():
247253 results.put(STOP)
258264 to jump to the correct line, and write over the line.
259265 """
260266
261 noansi = False
262 lock = Lock()
267 default_ansi_mode = AnsiMode.AUTO
268 write_lock = Lock()
269
263270 instance = None
271 instance_lock = Lock()
264272
265273 @classmethod
266 def set_noansi(cls, value=True):
267 cls.noansi = value
268
269 def __init__(self, stream):
274 def get_instance(cls):
275 return cls.instance
276
277 @classmethod
278 def get_or_assign_instance(cls, writer):
279 cls.instance_lock.acquire()
280 try:
281 if cls.instance is None:
282 cls.instance = writer
283 return cls.instance
284 finally:
285 cls.instance_lock.release()
286
287 @classmethod
288 def set_default_ansi_mode(cls, ansi_mode):
289 cls.default_ansi_mode = ansi_mode
290
291 def __init__(self, stream, ansi_mode=None):
292 if ansi_mode is None:
293 ansi_mode = self.default_ansi_mode
270294 self.stream = stream
295 self.use_ansi_codes = ansi_mode.use_ansi_codes(stream)
271296 self.lines = []
272297 self.width = 0
273 ParallelStreamWriter.instance = self
274298
275299 def add_object(self, msg, obj_index):
276300 if msg is None:
284308 return self._write_noansi(msg, obj_index, '')
285309
286310 def _write_ansi(self, msg, obj_index, status):
287 self.lock.acquire()
311 self.write_lock.acquire()
288312 position = self.lines.index(msg + obj_index)
289313 diff = len(self.lines) - position
290314 # move up
296320 # move back down
297321 self.stream.write("%c[%dB" % (27, diff))
298322 self.stream.flush()
299 self.lock.release()
323 self.write_lock.release()
300324
301325 def _write_noansi(self, msg, obj_index, status):
302326 self.stream.write(
309333 def write(self, msg, obj_index, status, color_func):
310334 if msg is None:
311335 return
312 if self.noansi:
336 if self.use_ansi_codes:
337 self._write_ansi(msg, obj_index, color_func(status))
338 else:
313339 self._write_noansi(msg, obj_index, status)
314 else:
315 self._write_ansi(msg, obj_index, color_func(status))
316
317
318 def get_stream_writer():
319 instance = ParallelStreamWriter.instance
320 if instance is None:
321 raise RuntimeError('ParallelStreamWriter has not yet been instantiated')
322 return instance
323340
324341
325342 def parallel_operation(containers, operation, options, message):
3838 from .service import ServiceIpcMode
3939 from .service import ServiceNetworkMode
4040 from .service import ServicePidMode
41 from .utils import filter_attached_for_up
4142 from .utils import microseconds_from_time_nano
4243 from .utils import truncate_string
4344 from .volume import ProjectVolumes
6768 """
6869 A collection of services.
6970 """
70 def __init__(self, name, services, client, networks=None, volumes=None, config_version=None):
71 def __init__(self, name, services, client, networks=None, volumes=None, config_version=None,
72 enabled_profiles=None):
7173 self.name = name
7274 self.services = services
7375 self.client = client
7476 self.volumes = volumes or ProjectVolumes({})
7577 self.networks = networks or ProjectNetworks({}, False)
7678 self.config_version = config_version
79 self.enabled_profiles = enabled_profiles or []
7780
7881 def labels(self, one_off=OneOffFilter.exclude, legacy=False):
7982 name = self.name
8588 return labels
8689
8790 @classmethod
88 def from_config(cls, name, config_data, client, default_platform=None, extra_labels=None):
91 def from_config(cls, name, config_data, client, default_platform=None, extra_labels=None,
92 enabled_profiles=None):
8993 """
9094 Construct a Project from a config.Config object.
9195 """
97101 networks,
98102 use_networking)
99103 volumes = ProjectVolumes.from_config(name, config_data, client)
100 project = cls(name, [], client, project_networks, volumes, config_data.version)
104 project = cls(name, [], client, project_networks, volumes, config_data.version, enabled_profiles)
101105
102106 for service_dict in config_data.services:
103107 service_dict = dict(service_dict)
127131 config_data.secrets)
128132
129133 service_dict['scale'] = project.get_service_scale(service_dict)
130
134 service_dict['device_requests'] = project.get_device_requests(service_dict)
131135 service_dict = translate_credential_spec_to_security_opt(service_dict)
132136 service_dict, ignored_keys = translate_deploy_keys_to_container_config(
133137 service_dict
184188 if name not in valid_names:
185189 raise NoSuchService(name)
186190
187 def get_services(self, service_names=None, include_deps=False):
191 def get_services(self, service_names=None, include_deps=False, auto_enable_profiles=True):
188192 """
189193 Returns a list of this project's services filtered
190194 by the provided list of names, or all services if service_names is None
197201 reordering as needed to resolve dependencies.
198202
199203 Raises NoSuchService if any of the named services do not exist.
204
205 Raises ConfigurationError if any depended-on service is not enabled by the active profiles.
200206 """
207 # create a copy so we can *locally* add auto-enabled profiles later
208 enabled_profiles = self.enabled_profiles.copy()
209
201210 if service_names is None or len(service_names) == 0:
202 service_names = self.service_names
211 auto_enable_profiles = False
212 service_names = [
213 service.name
214 for service in self.services
215 if service.enabled_for_profiles(enabled_profiles)
216 ]
203217
204218 unsorted = [self.get_service(name) for name in service_names]
205219 services = [s for s in self.services if s in unsorted]
206220
221 if auto_enable_profiles:
222 # enable profiles of explicitly targeted services
223 for service in services:
224 for profile in service.get_profiles():
225 if profile not in enabled_profiles:
226 enabled_profiles.append(profile)
227
207228 if include_deps:
208 services = reduce(self._inject_deps, services, [])
229 services = reduce(
230 lambda acc, s: self._inject_deps(acc, s, enabled_profiles),
231 services,
232 []
233 )
209234
210235 uniques = []
211236 [uniques.append(s) for s in services if s not in uniques]
330355 max_replicas))
331356 return scale
332357
358 def get_device_requests(self, service_dict):
359 deploy_dict = service_dict.get('deploy', None)
360 if not deploy_dict:
361 return
362
363 resources = deploy_dict.get('resources', None)
364 if not resources or not resources.get('reservations', None):
365 return
366 devices = resources['reservations'].get('devices')
367 if not devices:
368 return
369
370 for dev in devices:
371 count = dev.get("count", -1)
372 if not isinstance(count, int):
373 if count != "all":
374 raise ConfigurationError(
375 'Invalid value "{}" for devices count '
376 '(expected integer or "all")'.format(dev["count"]))
377 dev["count"] = -1
378
379 if 'capabilities' in dev:
380 dev['capabilities'] = [dev['capabilities']]
381 return devices
382
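`get_device_requests` above normalizes the `count` field of device reservations: the literal string `"all"` becomes `-1` (the sentinel docker-py passes to the engine for "all devices"), and any other non-integer value is rejected. A standalone sketch of that normalization:

```python
def normalize_device_count(count):
    # Sketch of the count handling in get_device_requests above:
    # integers pass through, "all" maps to -1, anything else is an error.
    if isinstance(count, int):
        return count
    if count == "all":
        return -1
    raise ValueError('Invalid value "{}" for devices count '
                     '(expected integer or "all")'.format(count))

print(normalize_device_count("all"))  # -1
print(normalize_device_count(2))      # 2
```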
333383 def start(self, service_names=None, **options):
334384 containers = []
335385
411461 self.remove_images(remove_image_type)
412462
413463 def remove_images(self, remove_image_type):
414 for service in self.get_services():
464 for service in self.services:
415465 service.remove_image(remove_image_type)
416466
417467 def restart(self, service_names=None, **options):
468 # filter service_names by enabled profiles
469 service_names = [s.name for s in self.get_services(service_names)]
418470 containers = self.containers(service_names, stopped=True)
419471
420472 parallel.parallel_execute(
437489 log.info('%s uses an image, skipping' % service.name)
438490
439491 if cli:
440 log.warning("Native build is an experimental feature and could change at any time")
441492 if parallel_build:
442493 log.warning("Flag '--parallel' is ignored when building with "
443494 "COMPOSE_DOCKER_CLI_BUILD=1")
593644 silent=False,
594645 cli=False,
595646 one_off=False,
647 attach_dependencies=False,
596648 override_options=None,
597649 ):
598
599 if cli:
600 log.warning("Native build is an experimental feature and could change at any time")
601650
602651 self.initialize()
603652 if not ignore_orphans:
619668 one_off=service_names if one_off else [],
620669 )
621670
671 services_to_attach = filter_attached_for_up(
672 services,
673 service_names,
674 attach_dependencies,
675 lambda service: service.name)
676
622677 def do(service):
623
624678 return service.execute_convergence_plan(
625679 plans[service.name],
626680 timeout=timeout,
627 detached=detached,
681 detached=detached or (service not in services_to_attach),
628682 scale_override=scale_override.get(service.name),
629683 rescale=rescale,
630684 start=start,
694748
695749 return plans
696750
697 def pull(self, service_names=None, ignore_pull_failures=False, parallel_pull=False, silent=False,
751 def pull(self, service_names=None, ignore_pull_failures=False, parallel_pull=True, silent=False,
698752 include_deps=False):
699753 services = self.get_services(service_names, include_deps)
700754
728782 return
729783
730784 try:
731 writer = parallel.get_stream_writer()
785 writer = parallel.ParallelStreamWriter.get_instance()
786 if writer is None:
787 raise RuntimeError('ParallelStreamWriter has not yet been instantiated')
732788 for event in strm:
733789 if 'status' not in event:
734790 continue
829885 )
830886 )
831887
832 def _inject_deps(self, acc, service):
888 def _inject_deps(self, acc, service, enabled_profiles):
833889 dep_names = service.get_dependency_names()
834890
835891 if len(dep_names) > 0:
836892 dep_services = self.get_services(
837893 service_names=list(set(dep_names)),
838 include_deps=True
839 )
894 include_deps=True,
895 auto_enable_profiles=False
896 )
897
898 for dep in dep_services:
899 if not dep.enabled_for_profiles(enabled_profiles):
900 raise ConfigurationError(
901 'Service "{dep_name}" was pulled in as a dependency of '
902 'service "{service_name}" but is not enabled by the '
903 'active profiles. '
904 'You may fix this by adding a common profile to '
905 '"{dep_name}" and "{service_name}".'
906 .format(dep_name=dep.name, service_name=service.name)
907 )
840908 else:
841909 dep_services = []
842910
00 import enum
11 import itertools
2 import json
32 import logging
43 import os
54 import re
4443 from .const import NANOCPUS_SCALE
4544 from .const import WINDOWS_LONGPATH_PREFIX
4645 from .container import Container
46 from .errors import CompletedUnsuccessfully
4747 from .errors import HealthCheckFailed
4848 from .errors import NoHealthCheckConfigured
4949 from .errors import OperationFailedError
7676 'cpuset',
7777 'device_cgroup_rules',
7878 'devices',
79 'device_requests',
7980 'dns',
8081 'dns_search',
8182 'dns_opt',
110111
111112 CONDITION_STARTED = 'service_started'
112113 CONDITION_HEALTHY = 'service_healthy'
114 CONDITION_COMPLETED_SUCCESSFULLY = 'service_completed_successfully'
113115
114116
115117 class BuildError(Exception):
710712 'image_id': image_id(),
711713 'links': self.get_link_names(),
712714 'net': self.network_mode.id,
715 'ipc_mode': self.ipc_mode.mode,
713716 'networks': self.networks,
714717 'secrets': self.secrets,
715718 'volumes_from': [
716719 (v.source.name, v.mode)
717720 for v in self.volumes_from if isinstance(v.source, Service)
718 ],
721 ]
719722 }
720723
721724 def get_dependency_names(self):
751754 configs[svc] = lambda s: True
752755 elif config['condition'] == CONDITION_HEALTHY:
753756 configs[svc] = lambda s: s.is_healthy()
757 elif config['condition'] == CONDITION_COMPLETED_SUCCESSFULLY:
758 configs[svc] = lambda s: s.is_completed_successfully()
754759 else:
755760 # The config schema already prevents this, but it might be
756761 # bypassed if Compose is called programmatically.
10151020 privileged=options.get('privileged', False),
10161021 network_mode=self.network_mode.mode,
10171022 devices=options.get('devices'),
1023 device_requests=options.get('device_requests'),
10181024 dns=options.get('dns'),
10191025 dns_opt=options.get('dns_opt'),
10201026 dns_search=options.get('dns_search'),
11001106 'Impossible to perform platform-targeted builds for API version < 1.35'
11011107 )
11021108
1103 builder = self.client if not cli else _CLIBuilder(progress)
1104 build_output = builder.build(
1109 builder = _ClientBuilder(self.client) if not cli else _CLIBuilder(progress)
1110 return builder.build(
1111 service=self,
11051112 path=path,
11061113 tag=self.image_name,
11071114 rm=rm,
11221129 gzip=gzip,
11231130 isolation=build_opts.get('isolation', self.options.get('isolation', None)),
11241131 platform=self.platform,
1125 )
1126
1127 try:
1128 all_events = list(stream_output(build_output, output_stream))
1129 except StreamOutputError as e:
1130 raise BuildError(self, str(e))
1131
1132 # Ensure the HTTP connection is not reused for another
1133 # streaming command, as the Docker daemon can sometimes
1134 # complain about it
1135 self.client.close()
1136
1137 image_id = None
1138
1139 for event in all_events:
1140 if 'stream' in event:
1141 match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', ''))
1142 if match:
1143 image_id = match.group(1)
1144
1145 if image_id is None:
1146 raise BuildError(self, event if all_events else 'Unknown')
1147
1148 return image_id
1132 output_stream=output_stream)
11491133
11501134 def get_cache_from(self, build_opts):
11511135 cache_from = build_opts.get('cache_from', None)
13011285 raise HealthCheckFailed(ctnr.short_id)
13021286 return result
13031287
1288 def is_completed_successfully(self):
1289 """ Check that all containers for this service have completed successfully.
1290 Returns False if at least one container has not exited, and
1291 raises a CompletedUnsuccessfully exception if at least one container
1292 exited with a non-zero exit code.
1293 """
1294 result = True
1295 for ctnr in self.containers(stopped=True):
1296 ctnr.inspect()
1297 if ctnr.get('State.Status') != 'exited':
1298 result = False
1299 elif ctnr.exit_code != 0:
1300 raise CompletedUnsuccessfully(ctnr.short_id, ctnr.exit_code)
1301 return result
1302
13041303 def _parse_proxy_config(self):
13051304 client = self.client
13061305 if 'proxies' not in client._general_configs:
13251324 result[permitted[k]] = result[permitted[k].lower()] = v
13261325
13271326 return result
1327
1328 def get_profiles(self):
1329 if 'profiles' not in self.options:
1330 return []
1331
1332 return self.options.get('profiles')
1333
1334 def enabled_for_profiles(self, enabled_profiles):
1335 # if service has no profiles specified it is always enabled
1336 if 'profiles' not in self.options:
1337 return True
1338
1339 service_profiles = self.options.get('profiles')
1340 for profile in enabled_profiles:
1341 if profile in service_profiles:
1342 return True
1343
1344 return False
13281345
13291346
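`enabled_for_profiles` above is the core of the new `--profile` support: a service with no `profiles` key is always enabled, otherwise at least one of its declared profiles must be active. A standalone sketch of that check:

```python
def enabled_for_profiles(service_options, enabled_profiles):
    # Sketch of Service.enabled_for_profiles above: services that declare
    # no profiles are always enabled; otherwise any overlap with the
    # active profiles enables the service.
    if 'profiles' not in service_options:
        return True
    return any(p in service_options['profiles'] for p in enabled_profiles)

print(enabled_for_profiles({}, ['debug']))                       # True
print(enabled_for_profiles({'profiles': ['debug']}, ['debug']))  # True
print(enabled_for_profiles({'profiles': ['debug']}, []))         # False
```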
13301347 def short_id_alias_exists(container, network):
17691786 return path
17701787
17711788
1772 class _CLIBuilder:
1773 def __init__(self, progress):
1774 self._progress = progress
1775
1776 def build(self, path, tag=None, quiet=False, fileobj=None,
1789 class _ClientBuilder:
1790 def __init__(self, client):
1791 self.client = client
1792
1793 def build(self, service, path, tag=None, quiet=False, fileobj=None,
17771794 nocache=False, rm=False, timeout=None,
17781795 custom_context=False, encoding=None, pull=False,
17791796 forcerm=False, dockerfile=None, container_limits=None,
17801797 decode=False, buildargs=None, gzip=False, shmsize=None,
17811798 labels=None, cache_from=None, target=None, network_mode=None,
17821799 squash=None, extra_hosts=None, platform=None, isolation=None,
1783 use_config_proxy=True):
1800 use_config_proxy=True, output_stream=sys.stdout):
1801 build_output = self.client.build(
1802 path=path,
1803 tag=tag,
1804 nocache=nocache,
1805 rm=rm,
1806 pull=pull,
1807 forcerm=forcerm,
1808 dockerfile=dockerfile,
1809 labels=labels,
1810 cache_from=cache_from,
1811 buildargs=buildargs,
1812 network_mode=network_mode,
1813 target=target,
1814 shmsize=shmsize,
1815 extra_hosts=extra_hosts,
1816 container_limits=container_limits,
1817 gzip=gzip,
1818 isolation=isolation,
1819 platform=platform)
1820
1821 try:
1822 all_events = list(stream_output(build_output, output_stream))
1823 except StreamOutputError as e:
1824 raise BuildError(service, str(e))
1825
1826 # Ensure the HTTP connection is not reused for another
1827 # streaming command, as the Docker daemon can sometimes
1828 # complain about it
1829 self.client.close()
1830
1831 image_id = None
1832
1833 for event in all_events:
1834 if 'stream' in event:
1835 match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', ''))
1836 if match:
1837 image_id = match.group(1)
1838
1839 if image_id is None:
1840 raise BuildError(service, event if all_events else 'Unknown')
1841
1842 return image_id
1843
1844
1845 class _CLIBuilder:
1846 def __init__(self, progress):
1847 self._progress = progress
1848
1849 def build(self, service, path, tag=None, quiet=False, fileobj=None,
1850 nocache=False, rm=False, timeout=None,
1851 custom_context=False, encoding=None, pull=False,
1852 forcerm=False, dockerfile=None, container_limits=None,
1853 decode=False, buildargs=None, gzip=False, shmsize=None,
1854 labels=None, cache_from=None, target=None, network_mode=None,
1855 squash=None, extra_hosts=None, platform=None, isolation=None,
1856 use_config_proxy=True, output_stream=sys.stdout):
17841857 """
17851858 Args:
1859 service (Service): Service being built (used for error reporting)
17861860 path (str): Path to the directory containing the Dockerfile
17871861 buildargs (dict): A dictionary of build arguments
17881862 cache_from (:py:class:`list`): A list of images used for build
18311905 configuration file (``~/.docker/config.json`` by default)
18321906 contains a proxy configuration, the corresponding environment
18331907 variables will be set in the container being built.
1908 output_stream (writer): stream to use for build logs
18341909 Returns:
18351910 A generator for the build output.
18361911 """
1837 if dockerfile:
1912 if dockerfile and os.path.isdir(path):
18381913 dockerfile = os.path.join(path, dockerfile)
18391914 iidfile = tempfile.mktemp()
18401915
18521927 command_builder.add_arg("--tag", tag)
18531928 command_builder.add_arg("--target", target)
18541929 command_builder.add_arg("--iidfile", iidfile)
1930 command_builder.add_arg("--platform", platform)
1931 command_builder.add_arg("--isolation", isolation)
1932
1933 if extra_hosts:
1934 if isinstance(extra_hosts, dict):
1935 extra_hosts = ["{}:{}".format(host, ip) for host, ip in extra_hosts.items()]
1936 for host in extra_hosts:
1937 command_builder.add_arg("--add-host", "{}".format(host))
1938
18551939 args = command_builder.build([path])
18561940
1857 magic_word = "Successfully built "
1858 appear = False
1859 with subprocess.Popen(args, stdout=subprocess.PIPE,
1941 with subprocess.Popen(args, stdout=output_stream, stderr=sys.stderr,
18601942 universal_newlines=True) as p:
1861 while True:
1862 line = p.stdout.readline()
1863 if not line:
1864 break
1865 if line.startswith(magic_word):
1866 appear = True
1867 yield json.dumps({"stream": line})
1868
18691943 p.communicate()
18701944 if p.returncode != 0:
1871 raise StreamOutputError()
1945 raise BuildError(service, "Build failed")
18721946
18731947 with open(iidfile) as f:
18741948 line = f.readline()
18751949 image_id = line.split(":")[1].strip()
18761950 os.remove(iidfile)
18771951
1878 # In case of `DOCKER_BUILDKIT=1`
1879 # there is no success message already present in the output.
1880 # Since that's the way `Service::build` gets the `image_id`
1881 # it has to be added `manually`
1882 if not appear:
1883 yield json.dumps({"stream": "{}{}\n".format(magic_word, image_id)})
1952 return image_id
18841953
18851954
18861955 class _CommandBuilder:
173173 if len(s) > max_chars:
174174 return s[:max_chars - 2] + '...'
175175 return s
176
177
178 def filter_attached_for_up(items, service_names, attach_dependencies=False,
179 item_to_service_name=lambda x: x):
180 """Choose which services to attach when doing docker-compose up.
181 It may be used with containers, services, or any other entities
182 that map to service names; the mapping is provided by
183 item_to_service_name."""
184 if attach_dependencies or not service_names:
185 return items
186
187 return [
188 item
189 for item in items if item_to_service_name(item) in service_names
190 ]
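Since `filter_attached_for_up` is a pure function, its behavior is easy to check in isolation. Restating it standalone for a quick demonstration: with explicit service names and `attach_dependencies` left off, only the named services stay attached.

```python
def filter_attached_for_up(items, service_names, attach_dependencies=False,
                           item_to_service_name=lambda x: x):
    # Copy of the helper above for a standalone demonstration.
    if attach_dependencies or not service_names:
        return items
    return [item for item in items
            if item_to_service_name(item) in service_names]

# `docker-compose up web`: db is pulled in as a dependency but runs detached.
print(filter_attached_for_up(['web', 'db'], ['web']))  # ['web']
# `docker-compose up` with no names attaches everything.
print(filter_attached_for_up(['web', 'db'], []))       # ['web', 'db']
```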
137137 ;;
138138 esac
139139
140 COMPREPLY=( $( compgen -W "--hash --help --no-interpolate --quiet -q --resolve-image-digests --services --volumes" -- "$cur" ) )
140 COMPREPLY=( $( compgen -W "--hash --help --no-interpolate --profiles --quiet -q --resolve-image-digests --services --volumes" -- "$cur" ) )
141141 }
142142
143143
163163 _filedir "y?(a)ml"
164164 return
165165 ;;
166 --ansi)
167 COMPREPLY=( $( compgen -W "never always auto" -- "$cur" ) )
168 return
169 ;;
166170 --log-level)
167171 COMPREPLY=( $( compgen -W "debug info warning error critical" -- "$cur" ) )
168172 return
169173 ;;
174 --profile)
175 COMPREPLY=( $( compgen -W "$(__docker_compose_q config --profiles)" -- "$cur" ) )
176 return
177 ;;
170178 --project-directory)
171179 _filedir -d
172180 return
289297
290298 case "$cur" in
291299 -*)
292 COMPREPLY=( $( compgen -W "--follow -f --help --no-color --tail --timestamps -t" -- "$cur" ) )
300 COMPREPLY=( $( compgen -W "--follow -f --help --no-color --no-log-prefix --tail --timestamps -t" -- "$cur" ) )
293301 ;;
294302 *)
295303 __docker_compose_complete_services
544552
545553 case "$cur" in
546554 -*)
547 COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) )
555 COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-log-prefix --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) )
548556 ;;
549557 *)
550558 __docker_compose_complete_services
613621 --tlskey
614622 "
615623
616 # These options are require special treatment when searching the command.
624 # These options require special treatment when searching the command.
617625 local top_level_options_with_args="
626 --ansi
618627 --log-level
628 --profile
619629 "
620630
621631 COMPREPLY=()
2020 complete -c docker-compose -l tlskey -r -d 'Path to TLS key file'
2121 complete -c docker-compose -l tlsverify -d 'Use TLS and verify the remote'
2222 complete -c docker-compose -l skip-hostname-check -d "Don't check the daemon's hostname against the name specified in the client certificate (for example if your docker host is an IP address)"
23 complete -c docker-compose -l no-ansi -d 'Do not print ANSI control characters'
24 complete -c docker-compose -l ansi -a 'never always auto' -d 'Control when to print ANSI control characters'
2325 complete -c docker-compose -s h -l help -d 'Print usage'
2426 complete -c docker-compose -s v -l version -d 'Print version and exit'
341341 '--verbose[Show more output]' \
342342 '--log-level=[Set log level]:level:(DEBUG INFO WARNING ERROR CRITICAL)' \
343343 '--no-ansi[Do not print ANSI control characters]' \
344 '--ansi=[Control when to print ANSI control characters]:when:(never always auto)' \
344345 '(-H --host)'{-H,--host}'[Daemon socket to connect to]:host:' \
345346 '--tls[Use TLS; implied by --tlsverify]' \
346347 '--tlscacert=[Trust certs signed only by this CA]:ca path:' \
2222 'DATA'
2323 ),
2424 (
25 'compose/config/config_schema_compose_spec.json',
26 'compose/config/config_schema_compose_spec.json',
25 'compose/config/compose_spec.json',
26 'compose/config/compose_spec.json',
2727 'DATA'
2828 ),
2929 (
3131 'DATA'
3232 ),
3333 (
34 'compose/config/config_schema_compose_spec.json',
35 'compose/config/config_schema_compose_spec.json',
34 'compose/config/compose_spec.json',
35 'compose/config/compose_spec.json',
3636 'DATA'
3737 ),
3838 (
0 pyinstaller==3.6
0 pyinstaller==4.1
00 Click==7.1.2
1 coverage==5.2.1
1 coverage==5.5
22 ddt==1.4.1
33 flake8==3.8.3
4 gitpython==3.1.7
4 gitpython==3.1.11
55 mock==3.0.5
66 pytest==6.0.1; python_version >= '3.5'
77 pytest==4.6.5; python_version < '3.5'
00 altgraph==0.17
11 appdirs==1.4.4
2 attrs==20.1.0
3 bcrypt==3.1.7
4 cffi==1.14.1
5 cryptography==3.0
2 attrs==20.3.0
3 bcrypt==3.2.0
4 cffi==1.14.4
5 cryptography==3.3.2
66 distlib==0.3.1
77 entrypoints==0.3
88 filelock==3.0.12
99 gitdb2==4.0.2
1010 mccabe==0.6.1
11 more-itertools==8.4.0; python_version >= '3.5'
11 more-itertools==8.6.0; python_version >= '3.5'
1212 more-itertools==5.0.0; python_version < '3.5'
13 packaging==20.4
13 packaging==20.9
1414 pluggy==0.13.1
15 py==1.9.0
15 py==1.10.0
1616 pycodestyle==2.6.0
1717 pycparser==2.20
1818 pyflakes==2.2.0
2222 smmap==3.0.4
2323 smmap2==3.0.1
2424 toml==0.10.1
25 tox==3.19.0
26 virtualenv==20.0.30
25 tox==3.21.2
26 virtualenv==20.4.0
2727 wcwidth==0.2.5
00 backports.shutil_get_terminal_size==1.0.0
1 cached-property==1.5.1
1 cached-property==1.5.1; python_version < '3.8'
22 certifi==2020.6.20
33 chardet==3.0.4
44 colorama==0.4.3; sys_platform == 'win32'
55 distro==1.5.0
6 docker==4.3.1
6 docker==5.0.0
77 docker-pycreds==0.4.0
88 dockerpty==0.4.1
99 docopt==0.6.2
1111 ipaddress==1.0.23
1212 jsonschema==3.2.0
1313 paramiko==2.7.1
14 pypiwin32==219; sys_platform == 'win32' and python_version < '3.6'
15 pypiwin32==223; sys_platform == 'win32' and python_version >= '3.6'
1614 PySocks==1.7.1
17 python-dotenv==0.14.0
18 PyYAML==5.3.1
15 python-dotenv==0.17.0
16 pywin32==227; sys_platform == 'win32'
17 PyYAML==5.4.1
1918 requests==2.24.0
2019 texttable==1.6.2
2120 urllib3==1.25.10; python_version == '3.3'
44 ./script/clean
55
66 DOCKER_COMPOSE_GITSHA="$(script/build/write-git-sha)"
7 TAG="docker/compose:tmp-glibc-linux-binary-${DOCKER_COMPOSE_GITSHA}"
87
9 docker build -t "${TAG}" . \
10 --build-arg BUILD_PLATFORM=debian \
11 --build-arg GIT_COMMIT="${DOCKER_COMPOSE_GITSHA}"
12 TMP_CONTAINER=$(docker create "${TAG}")
13 mkdir -p dist
8 docker build . \
9 --target bin \
10 --build-arg DISTRO=debian \
11 --build-arg GIT_COMMIT="${DOCKER_COMPOSE_GITSHA}" \
12 --output dist/
1413 ARCH=$(uname -m)
15 docker cp "${TMP_CONTAINER}":/usr/local/bin/docker-compose "dist/docker-compose-Linux-${ARCH}"
16 docker container rm -f "${TMP_CONTAINER}"
17 docker image rm -f "${TAG}"
14 # Ensure that we output the binary with the same name as we did before
15 mv dist/docker-compose-linux-amd64 "dist/docker-compose-Linux-${ARCH}"
2323 git clone --single-branch --branch develop https://github.com/pyinstaller/pyinstaller.git /tmp/pyinstaller
2424 cd /tmp/pyinstaller/bootloader
2525 # Checkout commit corresponding to version in requirements-build
26 git checkout v3.6
26 git checkout v4.1
2727 "${VENV}"/bin/python3 ./waf configure --no-lsb all
2828 "${VENV}"/bin/pip3 install ..
2929 cd "${CODE_PATH}"
1212 DOCKER_COMPOSE_GITSHA="$(script/build/write-git-sha)"
1313 docker build -t "${IMAGE}:${TAG}" . \
1414 --target build \
15 --build-arg BUILD_PLATFORM="debian" \
15 --build-arg DISTRO="debian" \
1616 --build-arg GIT_COMMIT="${DOCKER_COMPOSE_GITSHA}"
1717 docker tag "${IMAGE}":"${TAG}" "${IMAGE}":latest
55 #
66 # http://git-scm.com/download/win
77 #
8 # 2. Install Python 3.7.x:
8 # 2. Install Python 3.9.x:
99 #
1010 # https://www.python.org/downloads/
1111 #
12 # 3. Append ";C:\Python37;C:\Python37\Scripts" to the "Path" environment variable:
12 # 3. Append ";C:\Python39;C:\Python39\Scripts" to the "Path" environment variable:
1313 #
1414 # https://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/sysdm_advancd_environmnt_addchange_variable.mspx?mfr=true
1515 #
1616 # 4. In Powershell, run the following commands:
1717 #
18 # $ pip install 'virtualenv==20.0.30'
18 # $ pip install 'virtualenv==20.2.2'
1919 # $ Set-ExecutionPolicy -Scope CurrentUser RemoteSigned
2020 #
2121 # 5. Clone the repository:
3838 Get-ChildItem -Recurse -Include *.pyc | foreach ($_) { Remove-Item $_.FullName }
3939
4040 # Create virtualenv
41 virtualenv -p C:\Python37\python.exe .\venv
41 virtualenv -p C:\Python39\python.exe .\venv
4242
4343 # pip and pyinstaller generate lots of warnings, so we need to ignore them
4444 $ErrorActionPreference = "Continue"
1414
1515 set -e
1616
17 VERSION="1.27.4"
17 VERSION="1.29.2"
1818 IMAGE="docker/compose:$VERSION"
1919
2020
4343 if [ -n "$COMPOSE_PROJECT_NAME" ]; then
4444 COMPOSE_OPTIONS="-e COMPOSE_PROJECT_NAME $COMPOSE_OPTIONS"
4545 fi
46 # TODO: also check --file argument
4746 if [ -n "$compose_dir" ]; then
4847 VOLUMES="$VOLUMES -v $compose_dir:$compose_dir"
4948 fi
5049 if [ -n "$HOME" ]; then
5150 VOLUMES="$VOLUMES -v $HOME:$HOME -e HOME" # Pass in HOME to share docker.config and allow ~/-relative paths to work.
5251 fi
52 i=$#
53 while [ $i -gt 0 ]; do
54 arg=$1
55 i=$((i - 1))
56 shift
57
58 case "$arg" in
59 -f|--file)
60 value=$1
61 i=$((i - 1))
62 shift
63 set -- "$@" "$arg" "$value"
64
65 file_dir=$(realpath "$(dirname "$value")")
66 VOLUMES="$VOLUMES -v $file_dir:$file_dir"
67 ;;
68 *) set -- "$@" "$arg" ;;
69 esac
70 done
71
72 # Set up environment variables for compose config and context
73 ENV_OPTIONS=$(printenv | sed -E "/^PATH=.*/d; s/^/-e /g; s/=.*//g; s/\n/ /g")
5374
5475 # Only allocate tty if we detect one
5576 if [ -t 0 ] && [ -t 1 ]; then
6687 fi
6788
6889 # shellcheck disable=SC2086
69 exec docker run --rm $DOCKER_RUN_OPTIONS $DOCKER_ADDR $COMPOSE_OPTIONS $VOLUMES -w "$(pwd)" $IMAGE "$@"
90 exec docker run --rm $DOCKER_RUN_OPTIONS $DOCKER_ADDR $COMPOSE_OPTIONS $ENV_OPTIONS $VOLUMES -w "$(pwd)" $IMAGE "$@"
1212 SDK_SHA1=dd228a335194e3392f1904ce49aff1b1da26ca62
1313 fi
1414
15 OPENSSL_VERSION=1.1.1g
15 OPENSSL_VERSION=1.1.1h
1616 OPENSSL_URL=https://www.openssl.org/source/openssl-${OPENSSL_VERSION}.tar.gz
17 OPENSSL_SHA1=b213a293f2127ec3e323fb3cfc0c9807664fd997
17 OPENSSL_SHA1=8d0d099e8973ec851368c8c775e05e1eadca1794
1818
19 PYTHON_VERSION=3.7.7
19 PYTHON_VERSION=3.9.0
2020 PYTHON_URL=https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz
21 PYTHON_SHA1=8e9968663a214aea29659ba9dfa959e8a7d82b39
21 PYTHON_SHA1=5744a10ba989d2badacbab3c00cdcb83c83106c7
2222
2323 #
2424 # Install prerequisites.
3535 brew install python3
3636 fi
3737 if ! [ -x "$(command -v virtualenv)" ]; then
38 pip3 install virtualenv==20.0.30
38 pip3 install virtualenv==20.2.2
3939 fi
4040
4141 #
2020 DOCKER_VERSIONS=$($get_versions -n 2 recent)
2121 fi
2222
23
2423 BUILD_NUMBER=${BUILD_NUMBER-$USER}
2524 PY_TEST_VERSIONS=${PY_TEST_VERSIONS:-py37}
2625
3837
3938 trap "on_exit" EXIT
4039
41 repo="dockerswarm/dind"
42
4340 docker run \
4441 -d \
4542 --name "$daemon_container" \
4643 --privileged \
4744 --volume="/var/lib/docker" \
48 "$repo:$version" \
45 -e "DOCKER_TLS_CERTDIR=" \
46 "docker:$version-dind" \
4947 dockerd -H tcp://0.0.0.0:2375 $DOCKER_DAEMON_ARGS \
5048 2>&1 | tail -n 10
49
50 docker exec "$daemon_container" sh -c "apk add --no-cache git"
51
52 # copy docker config from host for authentication with Docker Hub
53 docker exec "$daemon_container" sh -c "mkdir /root/.docker"
54 docker cp /root/.docker/config.json "$daemon_container":/root/.docker/config.json
55 docker exec "$daemon_container" sh -c "chmod 644 /root/.docker/config.json"
5156
5257 docker run \
5358 --rm \
2424
2525
2626 install_requires = [
27 'cached-property >= 1.2.0, < 2',
2827 'docopt >= 0.6.1, < 1',
2928 'PyYAML >= 3.10, < 6',
3029 'requests >= 2.20.0, < 3',
3130 'texttable >= 0.9.0, < 2',
3231 'websocket-client >= 0.32.0, < 1',
3332 'distro >= 1.5.0, < 2',
34 'docker[ssh] >= 4.3.1, < 5',
33 'docker[ssh] >= 5',
3534 'dockerpty >= 0.4.1, < 1',
3635 'jsonschema >= 2.5.1, < 4',
3736 'python-dotenv >= 0.13.0, < 1',
4948
5049 extras_require = {
5150 ':python_version < "3.5"': ['backports.ssl_match_hostname >= 3.5, < 4'],
51 ':python_version < "3.8"': ['cached-property >= 1.2.0, < 2'],
5252 ':sys_platform == "win32"': ['colorama >= 0.4, < 1'],
5353 'socks': ['PySocks >= 1.5.6, != 1.5.7, < 2'],
5454 'tests': tests_require,
101101 'Programming Language :: Python :: 3.4',
102102 'Programming Language :: Python :: 3.6',
103103 'Programming Language :: Python :: 3.7',
104 'Programming Language :: Python :: 3.8',
105 'Programming Language :: Python :: 3.9',
104106 ],
105107 )
5757 }
5858
5959
60 def start_process(base_dir, options):
60 def start_process(base_dir, options, executable=None, env=None):
61 executable = executable or DOCKER_COMPOSE_EXECUTABLE
6162 proc = subprocess.Popen(
62 [DOCKER_COMPOSE_EXECUTABLE] + options,
63 [executable] + options,
6364 stdin=subprocess.PIPE,
6465 stdout=subprocess.PIPE,
6566 stderr=subprocess.PIPE,
66 cwd=base_dir)
67 cwd=base_dir,
68 env=env,
69 )
6770 print("Running process: %s" % proc.pid)
6871 return proc
6972
7780 return ProcessResult(stdout.decode('utf-8'), stderr.decode('utf-8'))
7881
7982
80 def dispatch(base_dir, options, project_options=None, returncode=0, stdin=None):
83 def dispatch(base_dir, options,
84 project_options=None, returncode=0, stdin=None, executable=None, env=None):
8185 project_options = project_options or []
82 proc = start_process(base_dir, project_options + options)
86 proc = start_process(base_dir, project_options + options, executable=executable, env=env)
8387 return wait_on_process(proc, returncode=returncode, stdin=stdin)
8488
8589
231235
232236 result = self.dispatch(['-H=tcp://doesnotexist:8000', 'ps'], returncode=1)
233237 assert "Couldn't connect to Docker daemon" in result.stderr
238
239 def test_config_list_profiles(self):
240 self.base_dir = 'tests/fixtures/config-profiles'
241 result = self.dispatch(['config', '--profiles'])
242 assert set(result.stdout.rstrip().split('\n')) == {'debug', 'frontend', 'gui'}
234243
235244 def test_config_list_services(self):
236245 self.base_dir = 'tests/fixtures/v2-full'
782791 assert BUILD_CACHE_TEXT not in result.stdout
783792 assert BUILD_PULL_TEXT in result.stdout
784793
794 @mock.patch.dict(os.environ)
785795 def test_build_log_level(self):
796 os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0'
797 os.environ['DOCKER_BUILDKIT'] = '0'
798 self.test_env_file_relative_to_compose_file()
786799 self.base_dir = 'tests/fixtures/simple-dockerfile'
787800 result = self.dispatch(['--log-level', 'warning', 'build', 'simple'])
788801 assert result.stderr == ''
844857 for c in self.project.client.containers(all=True):
845858 self.addCleanup(self.project.client.remove_container, c, force=True)
846859
860 @mock.patch.dict(os.environ)
847861 def test_build_shm_size_build_option(self):
862 os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0'
848863 pull_busybox(self.client)
849864 self.base_dir = 'tests/fixtures/build-shm-size'
850865 result = self.dispatch(['build', '--no-cache'], None)
851866 assert 'shm_size: 96' in result.stdout
852867
868 @mock.patch.dict(os.environ)
853869 def test_build_memory_build_option(self):
870 os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0'
854871 pull_busybox(self.client)
855872 self.base_dir = 'tests/fixtures/build-memory'
856873 result = self.dispatch(['build', '--no-cache', '--memory', '96m', 'service'], None)
17171734
17181735 shareable_mode_container = self.project.get_service('shareable').containers()[0]
17191736 assert shareable_mode_container.get('HostConfig.IpcMode') == 'shareable'
1737
1738 def test_profiles_up_with_no_profile(self):
1739 self.base_dir = 'tests/fixtures/profiles'
1740 self.dispatch(['up'])
1741
1742 containers = self.project.containers(stopped=True)
1743 service_names = [c.service for c in containers]
1744
1745 assert 'foo' in service_names
1746 assert len(containers) == 1
1747
1748 def test_profiles_up_with_profile(self):
1749 self.base_dir = 'tests/fixtures/profiles'
1750 self.dispatch(['--profile', 'test', 'up'])
1751
1752 containers = self.project.containers(stopped=True)
1753 service_names = [c.service for c in containers]
1754
1755 assert 'foo' in service_names
1756 assert 'bar' in service_names
1757 assert 'baz' in service_names
1758 assert len(containers) == 3
1759
1760 def test_profiles_up_invalid_dependency(self):
1761 self.base_dir = 'tests/fixtures/profiles'
1762 result = self.dispatch(['--profile', 'debug', 'up'], returncode=1)
1763
1764 assert ('Service "bar" was pulled in as a dependency of service "zot" '
1765 'but is not enabled by the active profiles.') in result.stderr
1766
1767 def test_profiles_up_with_multiple_profiles(self):
1768 self.base_dir = 'tests/fixtures/profiles'
1769 self.dispatch(['--profile', 'debug', '--profile', 'test', 'up'])
1770
1771 containers = self.project.containers(stopped=True)
1772 service_names = [c.service for c in containers]
1773
1774 assert 'foo' in service_names
1775 assert 'bar' in service_names
1776 assert 'baz' in service_names
1777 assert 'zot' in service_names
1778 assert len(containers) == 4
1779
1780 def test_profiles_up_with_profile_enabled_by_service(self):
1781 self.base_dir = 'tests/fixtures/profiles'
1782 self.dispatch(['up', 'bar'])
1783
1784 containers = self.project.containers(stopped=True)
1785 service_names = [c.service for c in containers]
1786
1787 assert 'bar' in service_names
1788 assert len(containers) == 1
1789
1790 def test_profiles_up_with_dependency_and_profile_enabled_by_service(self):
1791 self.base_dir = 'tests/fixtures/profiles'
1792 self.dispatch(['up', 'baz'])
1793
1794 containers = self.project.containers(stopped=True)
1795 service_names = [c.service for c in containers]
1796
1797 assert 'bar' in service_names
1798 assert 'baz' in service_names
1799 assert len(containers) == 2
1800
1801 def test_profiles_up_with_invalid_dependency_for_target_service(self):
1802 self.base_dir = 'tests/fixtures/profiles'
1803 result = self.dispatch(['up', 'zot'], returncode=1)
1804
1805 assert ('Service "bar" was pulled in as a dependency of service "zot" '
1806 'but is not enabled by the active profiles.') in result.stderr
1807
1808 def test_profiles_up_with_profile_for_dependency(self):
1809 self.base_dir = 'tests/fixtures/profiles'
1810 self.dispatch(['--profile', 'test', 'up', 'zot'])
1811
1812 containers = self.project.containers(stopped=True)
1813 service_names = [c.service for c in containers]
1814
1815 assert 'bar' in service_names
1816 assert 'zot' in service_names
1817 assert len(containers) == 2
1818
1819 def test_profiles_up_with_merged_profiles(self):
1820 self.base_dir = 'tests/fixtures/profiles'
1821 self.dispatch(['-f', 'docker-compose.yml', '-f', 'merge-profiles.yml', 'up', 'zot'])
1822
1823 containers = self.project.containers(stopped=True)
1824 service_names = [c.service for c in containers]
1825
1826 assert 'bar' in service_names
1827 assert 'zot' in service_names
1828 assert len(containers) == 2
17201829
17211830 def test_exec_without_tty(self):
17221831 self.base_dir = 'tests/fixtures/links-composefile'
30333142 another = self.project.get_service('--log-service')
30343143 assert len(service.containers()) == 1
30353144 assert len(another.containers()) == 1
3145
3146 def test_up_no_log_prefix(self):
3147 self.base_dir = 'tests/fixtures/echo-services'
3148 result = self.dispatch(['up', '--no-log-prefix'])
3149
3150 assert 'simple' in result.stdout
3151 assert 'another' in result.stdout
3152 assert 'exited with code 0' in result.stdout
0 version: '3.8'
1 services:
2 frontend:
3 image: frontend
4 profiles: ["frontend", "gui"]
5 phpmyadmin:
6 image: phpmyadmin
7 depends_on:
8 - db
9 profiles:
10 - debug
11 backend:
12 image: backend
13 db:
14 image: mysql
0 version: "3"
1 services:
2 foo:
3 image: busybox:1.31.0-uclibc
4 bar:
5 image: busybox:1.31.0-uclibc
6 profiles:
7 - test
8 baz:
9 image: busybox:1.31.0-uclibc
10 depends_on:
11 - bar
12 profiles:
13 - test
14 zot:
15 image: busybox:1.31.0-uclibc
16 depends_on:
17 - bar
18 profiles:
19 - debug
0 version: "3"
1 services:
2 bar:
3 profiles:
4 - debug
00 import tempfile
11
2 import pytest
23 from ddt import data
34 from ddt import ddt
45
78 from compose.cli.command import get_project
89 from compose.cli.command import project_from_options
910 from compose.config.environment import Environment
11 from compose.config.errors import EnvFileNotFound
1012 from tests.integration.testcases import DockerClientTestCase
1113
1214
5456 class EnvironmentOverrideFileTest(DockerClientTestCase):
5557 def test_env_file_override(self):
5658 base_dir = 'tests/fixtures/env-file-override'
59 # '--env-file' paths are relative to the current working dir
60 env = Environment.from_env_file(base_dir, base_dir+'/.env.override')
5761 dispatch(base_dir, ['--env-file', '.env.override', 'up'])
5862 project = get_project(project_dir=base_dir,
5963 config_path=['docker-compose.yml'],
60 environment=Environment.from_env_file(base_dir, '.env.override'),
64 environment=env,
6165 override_dir=base_dir)
6266 containers = project.containers(stopped=True)
6367 assert len(containers) == 1
6468 assert "WHEREAMI=override" in containers[0].get('Config.Env')
6569 assert "DEFAULT_CONF_LOADED=true" in containers[0].get('Config.Env')
6670 dispatch(base_dir, ['--env-file', '.env.override', 'down'], None)
71
72 def test_env_file_not_found_error(self):
73 base_dir = 'tests/fixtures/env-file-override'
74 with pytest.raises(EnvFileNotFound) as excinfo:
75 Environment.from_env_file(base_dir, '.env.override')
76
77 assert "Couldn't find env file" in excinfo.exconly()
78
79 def test_dot_env_file(self):
80 base_dir = 'tests/fixtures/env-file-override'
81 # '.env' is relative to the project_dir (base_dir)
82 env = Environment.from_env_file(base_dir, None)
83 dispatch(base_dir, ['up'])
84 project = get_project(project_dir=base_dir,
85 config_path=['docker-compose.yml'],
86 environment=env,
87 override_dir=base_dir)
88 containers = project.containers(stopped=True)
89 assert len(containers) == 1
90 assert "WHEREAMI=default" in containers[0].get('Config.Env')
91 dispatch(base_dir, ['down'], None)
0 import logging
1 import os
2 import socket
3 from http.server import BaseHTTPRequestHandler
4 from http.server import HTTPServer
5 from threading import Thread
6
7 import requests
8 from docker.transport import UnixHTTPAdapter
9
10 from tests.acceptance.cli_test import dispatch
11 from tests.integration.testcases import DockerClientTestCase
12
13
14 TEST_SOCKET_FILE = '/tmp/test-metrics-docker-cli.sock'
15
16
17 class MetricsTest(DockerClientTestCase):
18 test_session = requests.sessions.Session()
19 test_env = None
20 base_dir = 'tests/fixtures/v3-full'
21
22 @classmethod
23 def setUpClass(cls):
24 super().setUpClass()
25 MetricsTest.test_session.mount("http+unix://", UnixHTTPAdapter(TEST_SOCKET_FILE))
26 MetricsTest.test_env = os.environ.copy()
27 MetricsTest.test_env['METRICS_SOCKET_FILE'] = TEST_SOCKET_FILE
28 MetricsServer().start()
29
30 @classmethod
31 def test_metrics_help(cls):
32 # the root `docker-compose` command is treated as `--help`
33 dispatch(cls.base_dir, [], env=MetricsTest.test_env)
34 assert cls.get_content() == \
35 b'{"command": "compose --help", "context": "moby", ' \
36 b'"source": "docker-compose", "status": "success"}'
37 dispatch(cls.base_dir, ['help', 'run'], env=MetricsTest.test_env)
38 assert cls.get_content() == \
39 b'{"command": "compose help", "context": "moby", ' \
40 b'"source": "docker-compose", "status": "success"}'
41 dispatch(cls.base_dir, ['--help'], env=MetricsTest.test_env)
42 assert cls.get_content() == \
43 b'{"command": "compose --help", "context": "moby", ' \
44 b'"source": "docker-compose", "status": "success"}'
45 dispatch(cls.base_dir, ['run', '--help'], env=MetricsTest.test_env)
46 assert cls.get_content() == \
47 b'{"command": "compose --help run", "context": "moby", ' \
48 b'"source": "docker-compose", "status": "success"}'
49 dispatch(cls.base_dir, ['up', '--help', 'extra_args'], env=MetricsTest.test_env)
50 assert cls.get_content() == \
51 b'{"command": "compose --help up", "context": "moby", ' \
52 b'"source": "docker-compose", "status": "success"}'
53
54 @classmethod
55 def test_metrics_simple_commands(cls):
56 dispatch(cls.base_dir, ['ps'], env=MetricsTest.test_env)
57 assert cls.get_content() == \
58 b'{"command": "compose ps", "context": "moby", ' \
59 b'"source": "docker-compose", "status": "success"}'
60 dispatch(cls.base_dir, ['version'], env=MetricsTest.test_env)
61 assert cls.get_content() == \
62 b'{"command": "compose version", "context": "moby", ' \
63 b'"source": "docker-compose", "status": "success"}'
64 dispatch(cls.base_dir, ['version', '--yyy'], env=MetricsTest.test_env)
65 assert cls.get_content() == \
66 b'{"command": "compose version", "context": "moby", ' \
67 b'"source": "docker-compose", "status": "failure"}'
68
69 @staticmethod
70 def get_content():
71 resp = MetricsTest.test_session.get("http+unix://localhost")
72 print(resp.content)
73 return resp.content
74
75
76 def start_server(uri=TEST_SOCKET_FILE):
77 try:
78 os.remove(uri)
79 except OSError:
80 pass
81 httpd = HTTPServer(uri, MetricsHTTPRequestHandler, False)
82 sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
83 sock.bind(TEST_SOCKET_FILE)
84 sock.listen(0)
85 httpd.socket = sock
86 print('Serving on ', uri)
87 httpd.serve_forever()
88 sock.shutdown(socket.SHUT_RDWR)
89 sock.close()
90 os.remove(uri)
91
92
93 class MetricsServer:
94 @classmethod
95 def start(cls):
96 t = Thread(target=start_server, daemon=True)
97 t.start()
98
99
100 class MetricsHTTPRequestHandler(BaseHTTPRequestHandler):
101 usages = []
102
103 def do_GET(self):
104 self.client_address = ('',) # avoid exception in BaseHTTPServer.py log_message()
105 self.send_response(200)
106 self.end_headers()
107 for u in MetricsHTTPRequestHandler.usages:
108 self.wfile.write(u)
109 MetricsHTTPRequestHandler.usages = []
110
111 def do_POST(self):
112 self.client_address = ('',) # avoid exception in BaseHTTPServer.py log_message()
113 content_length = int(self.headers['Content-Length'])
114 body = self.rfile.read(content_length)
115 print(body)
116 MetricsHTTPRequestHandler.usages.append(body)
117 self.send_response(200)
118 self.end_headers()
119
120
121 if __name__ == '__main__':
122 logging.getLogger("urllib3").propagate = False
123 logging.getLogger("requests").propagate = False
124 start_server()
2424 from compose.const import LABEL_PROJECT
2525 from compose.const import LABEL_SERVICE
2626 from compose.container import Container
27 from compose.errors import CompletedUnsuccessfully
2728 from compose.errors import HealthCheckFailed
2829 from compose.errors import NoHealthCheckConfigured
2930 from compose.project import Project
18981899 with pytest.raises(NoHealthCheckConfigured):
18991900 svc1.is_healthy()
19001901
1902 def test_project_up_completed_successfully_dependency(self):
1903 config_dict = {
1904 'version': '2.1',
1905 'services': {
1906 'svc1': {
1907 'image': BUSYBOX_IMAGE_WITH_TAG,
1908 'command': 'true'
1909 },
1910 'svc2': {
1911 'image': BUSYBOX_IMAGE_WITH_TAG,
1912 'command': 'top',
1913 'depends_on': {
1914 'svc1': {'condition': 'service_completed_successfully'},
1915 }
1916 }
1917 }
1918 }
1919 config_data = load_config(config_dict)
1920 project = Project.from_config(
1921 name='composetest', config_data=config_data, client=self.client
1922 )
1923 project.up()
1924
1925 svc1 = project.get_service('svc1')
1926 svc2 = project.get_service('svc2')
1927
1928 assert 'svc1' in svc2.get_dependency_names()
1929 assert svc2.containers()[0].is_running
1930 assert len(svc1.containers()) == 0
1931 assert svc1.is_completed_successfully()
1932
1933 def test_project_up_completed_unsuccessfully_dependency(self):
1934 config_dict = {
1935 'version': '2.1',
1936 'services': {
1937 'svc1': {
1938 'image': BUSYBOX_IMAGE_WITH_TAG,
1939 'command': 'false'
1940 },
1941 'svc2': {
1942 'image': BUSYBOX_IMAGE_WITH_TAG,
1943 'command': 'top',
1944 'depends_on': {
1945 'svc1': {'condition': 'service_completed_successfully'},
1946 }
1947 }
1948 }
1949 }
1950 config_data = load_config(config_dict)
1951 project = Project.from_config(
1952 name='composetest', config_data=config_data, client=self.client
1953 )
1954 with pytest.raises(ProjectError):
1955 project.up()
1956
1957 svc1 = project.get_service('svc1')
1958 svc2 = project.get_service('svc2')
1959 assert 'svc1' in svc2.get_dependency_names()
1960 assert len(svc2.containers()) == 0
1961 with pytest.raises(CompletedUnsuccessfully):
1962 svc1.is_completed_successfully()
1963
1964 def test_project_up_completed_differently_dependencies(self):
1965 config_dict = {
1966 'version': '2.1',
1967 'services': {
1968 'svc1': {
1969 'image': BUSYBOX_IMAGE_WITH_TAG,
1970 'command': 'true'
1971 },
1972 'svc2': {
1973 'image': BUSYBOX_IMAGE_WITH_TAG,
1974 'command': 'false'
1975 },
1976 'svc3': {
1977 'image': BUSYBOX_IMAGE_WITH_TAG,
1978 'command': 'top',
1979 'depends_on': {
1980 'svc1': {'condition': 'service_completed_successfully'},
1981 'svc2': {'condition': 'service_completed_successfully'},
1982 }
1983 }
1984 }
1985 }
1986 config_data = load_config(config_dict)
1987 project = Project.from_config(
1988 name='composetest', config_data=config_data, client=self.client
1989 )
1990 with pytest.raises(ProjectError):
1991 project.up()
1992
1993 svc1 = project.get_service('svc1')
1994 svc2 = project.get_service('svc2')
1995 svc3 = project.get_service('svc3')
1996 assert ['svc1', 'svc2'] == svc3.get_dependency_names()
1997 assert svc1.is_completed_successfully()
1998 assert len(svc3.containers()) == 0
1999 with pytest.raises(CompletedUnsuccessfully):
2000 svc2.is_completed_successfully()
2001
19012002 def test_project_up_seccomp_profile(self):
19022003 seccomp_data = {
19032004 'defaultAction': 'SCMP_ACT_ALLOW',
947947 with open(os.path.join(base_dir, 'Dockerfile'), 'w') as f:
948948 f.write("FROM busybox\n")
949949
950 service = self.create_service('web', build={'context': base_dir})
950 service = self.create_service('web',
951 build={'context': base_dir},
952 environment={
953 'COMPOSE_DOCKER_CLI_BUILD': '0',
954 'DOCKER_BUILDKIT': '0',
955 })
951956 service.build()
952957 self.addCleanup(self.client.remove_image, service.image_name)
953958
963968 service = self.create_service('web',
964969 build={'context': base_dir},
965970 environment={
966 'COMPOSE_DOCKER_CLI_BUILD': '1',
967971 'DOCKER_BUILDKIT': '1',
968972 })
969973 service.build(cli=True)
10141018 web = self.create_service('web',
10151019 build={'context': base_dir},
10161020 environment={
1017 'COMPOSE_DOCKER_CLI_BUILD': '1',
10181021 'DOCKER_BUILDKIT': '1',
10191022 })
10201023 project = Project('composetest', [web], self.client)
6060
6161 @classmethod
6262 def tearDownClass(cls):
63 cls.client.close()
6364 del cls.client
6465
6566 def tearDown(self):
0 import os
1
2 import pytest
3
4 from compose.cli.colors import AnsiMode
5 from tests import mock
6
7
8 @pytest.fixture
9 def tty_stream():
10 stream = mock.Mock()
11 stream.isatty.return_value = True
12 return stream
13
14
15 @pytest.fixture
16 def non_tty_stream():
17 stream = mock.Mock()
18 stream.isatty.return_value = False
19 return stream
20
21
22 class TestAnsiModeTestCase:
23
24 @mock.patch.dict(os.environ)
25 def test_ansi_mode_never(self, tty_stream, non_tty_stream):
26 if "CLICOLOR" in os.environ:
27 del os.environ["CLICOLOR"]
28 assert not AnsiMode.NEVER.use_ansi_codes(tty_stream)
29 assert not AnsiMode.NEVER.use_ansi_codes(non_tty_stream)
30
31 os.environ["CLICOLOR"] = "0"
32 assert not AnsiMode.NEVER.use_ansi_codes(tty_stream)
33 assert not AnsiMode.NEVER.use_ansi_codes(non_tty_stream)
34
35 @mock.patch.dict(os.environ)
36 def test_ansi_mode_always(self, tty_stream, non_tty_stream):
37 if "CLICOLOR" in os.environ:
38 del os.environ["CLICOLOR"]
39 assert AnsiMode.ALWAYS.use_ansi_codes(tty_stream)
40 assert AnsiMode.ALWAYS.use_ansi_codes(non_tty_stream)
41
42 os.environ["CLICOLOR"] = "0"
43 assert AnsiMode.ALWAYS.use_ansi_codes(tty_stream)
44 assert AnsiMode.ALWAYS.use_ansi_codes(non_tty_stream)
45
46 @mock.patch.dict(os.environ)
47 def test_ansi_mode_auto(self, tty_stream, non_tty_stream):
48 if "CLICOLOR" in os.environ:
49 del os.environ["CLICOLOR"]
50 assert AnsiMode.AUTO.use_ansi_codes(tty_stream)
51 assert not AnsiMode.AUTO.use_ansi_codes(non_tty_stream)
52
53 os.environ["CLICOLOR"] = "0"
54 assert not AnsiMode.AUTO.use_ansi_codes(tty_stream)
55 assert not AnsiMode.AUTO.use_ansi_codes(non_tty_stream)
1313 paths = ['one.yml', 'two.yml']
1414 opts = {'--file': paths}
1515 environment = Environment.from_env_file('.')
16 assert get_config_path_from_options('.', opts, environment) == paths
16 assert get_config_path_from_options(opts, environment) == paths
1717
1818 def test_single_path_from_env(self):
1919 with mock.patch.dict(os.environ):
2020 os.environ['COMPOSE_FILE'] = 'one.yml'
2121 environment = Environment.from_env_file('.')
22 assert get_config_path_from_options('.', {}, environment) == ['one.yml']
22 assert get_config_path_from_options({}, environment) == ['one.yml']
2323
2424 @pytest.mark.skipif(IS_WINDOWS_PLATFORM, reason='posix separator')
2525 def test_multiple_path_from_env(self):
2626 with mock.patch.dict(os.environ):
2727 os.environ['COMPOSE_FILE'] = 'one.yml:two.yml'
2828 environment = Environment.from_env_file('.')
29 assert get_config_path_from_options(
30 '.', {}, environment
31 ) == ['one.yml', 'two.yml']
29 assert get_config_path_from_options({}, environment) == ['one.yml', 'two.yml']
3230
3331 @pytest.mark.skipif(not IS_WINDOWS_PLATFORM, reason='windows separator')
3432 def test_multiple_path_from_env_windows(self):
3533 with mock.patch.dict(os.environ):
3634 os.environ['COMPOSE_FILE'] = 'one.yml;two.yml'
3735 environment = Environment.from_env_file('.')
38 assert get_config_path_from_options(
39 '.', {}, environment
40 ) == ['one.yml', 'two.yml']
36 assert get_config_path_from_options({}, environment) == ['one.yml', 'two.yml']
4137
4238 def test_multiple_path_from_env_custom_separator(self):
4339 with mock.patch.dict(os.environ):
4440 os.environ['COMPOSE_PATH_SEPARATOR'] = '^'
4541 os.environ['COMPOSE_FILE'] = 'c:\\one.yml^.\\semi;colon.yml'
4642 environment = Environment.from_env_file('.')
47 assert get_config_path_from_options(
48 '.', {}, environment
49 ) == ['c:\\one.yml', '.\\semi;colon.yml']
43 assert get_config_path_from_options({}, environment) == ['c:\\one.yml', '.\\semi;colon.yml']
5044
5145 def test_no_path(self):
5246 environment = Environment.from_env_file('.')
53 assert not get_config_path_from_options('.', {}, environment)
47 assert not get_config_path_from_options({}, environment)
5448
5549 def test_unicode_path_from_options(self):
5650 paths = [b'\xe5\xb0\xb1\xe5\x90\x83\xe9\xa5\xad/docker-compose.yml']
5751 opts = {'--file': paths}
5852 environment = Environment.from_env_file('.')
59 assert get_config_path_from_options(
60 '.', opts, environment
61 ) == ['就吃饭/docker-compose.yml']
53 assert get_config_path_from_options(opts, environment) == ['就吃饭/docker-compose.yml']
77
88 from compose.cli.log_printer import build_log_generator
99 from compose.cli.log_printer import build_log_presenters
10 from compose.cli.log_printer import build_no_log_generator
1110 from compose.cli.log_printer import consume_queue
1211 from compose.cli.log_printer import QueueItem
1312 from compose.cli.log_printer import wait_on_exit
7271 mock_container.name, status_code,
7372 )
7473 assert expected in wait_on_exit(mock_container)
75
76
77 def test_build_no_log_generator(mock_container):
78 mock_container.has_api_logs = False
79 mock_container.log_driver = 'none'
80 output, = build_no_log_generator(mock_container, None)
81 assert "WARNING: no logs are available with the 'none' log driver\n" in output
82 assert "exited with code" not in output
8374
8475
8576 class TestBuildLogGenerator:
136136
137137 class TestSetupConsoleHandlerTestCase:
138138
139 def test_with_tty_verbose(self, logging_handler):
139 def test_with_console_formatter_verbose(self, logging_handler):
140140 setup_console_handler(logging_handler, True)
141141 assert type(logging_handler.formatter) == ConsoleWarningFormatter
142142 assert '%(name)s' in logging_handler.formatter._fmt
143143 assert '%(funcName)s' in logging_handler.formatter._fmt
144144
145 def test_with_tty_not_verbose(self, logging_handler):
145 def test_with_console_formatter_not_verbose(self, logging_handler):
146146 setup_console_handler(logging_handler, False)
147147 assert type(logging_handler.formatter) == ConsoleWarningFormatter
148148 assert '%(name)s' not in logging_handler.formatter._fmt
149149 assert '%(funcName)s' not in logging_handler.formatter._fmt
150150
151 def test_with_not_a_tty(self, logging_handler):
152 logging_handler.stream.isatty.return_value = False
153 setup_console_handler(logging_handler, False)
151 def test_without_console_formatter(self, logging_handler):
152 setup_console_handler(logging_handler, False, use_console_formatter=False)
154153 assert type(logging_handler.formatter) == logging.Formatter
155154
156155
237237 )
238238 )
239239
240 assert 'Invalid top-level property "web"' in excinfo.exconly()
240 assert "compose.config.errors.ConfigurationError: " \
241 "The Compose file 'filename.yml' is invalid because:\n" \
242 "'web' does not match any of the regexes: '^x-'" in excinfo.exconly()
241243 assert VERSION_EXPLANATION in excinfo.exconly()
242244
243245 def test_named_volume_config_empty(self):
666668
667669 assert 'Invalid service name \'mong\\o\'' in excinfo.exconly()
668670
669 def test_config_duplicate_cache_from_values_validation_error(self):
671 def test_config_duplicate_cache_from_values_no_validation_error(self):
670672 with pytest.raises(ConfigurationError) as exc:
671673 config.load(
672674 build_config_details({
678680 })
679681 )
680682
681 assert 'build.cache_from contains non-unique items' in exc.exconly()
683 assert 'build.cache_from contains non-unique items' not in exc.exconly()
682684
683685 def test_load_with_multiple_files_v1(self):
684686 base_file = config.ConfigFile(
23942396 'image': 'busybox',
23952397 'depends_on': {
23962398 'app1': {'condition': 'service_started'},
2397 'app2': {'condition': 'service_healthy'}
2399 'app2': {'condition': 'service_healthy'},
2400 'app3': {'condition': 'service_completed_successfully'}
23982401 }
23992402 }
24002403 override = {}
24062409 'image': 'busybox',
24072410 'depends_on': {
24082411 'app1': {'condition': 'service_started'},
2409 'app2': {'condition': 'service_healthy'}
2412 'app2': {'condition': 'service_healthy'},
2413 'app3': {'condition': 'service_completed_successfully'}
24102414 }
24112415 }
24122416 override = {
2413 'depends_on': ['app3']
2417 'depends_on': ['app4']
24142418 }
24152419
24162420 actual = config.merge_service_dicts(base, override, VERSION)
24192423 'depends_on': {
24202424 'app1': {'condition': 'service_started'},
24212425 'app2': {'condition': 'service_healthy'},
2422 'app3': {'condition': 'service_started'}
2426 'app3': {'condition': 'service_completed_successfully'},
2427 'app4': {'condition': 'service_started'},
24232428 }
24242429 }
24252430
35643569 @mock.patch.dict(os.environ)
35653570 def test_config_file_with_options_environment_file(self):
35663571 project_dir = 'tests/fixtures/default-env-file'
3572 # env-file paths are relative to the current working dir
3573 env = Environment.from_env_file(project_dir, project_dir + '/.env2')
35673574 service_dicts = config.load(
35683575 config.find(
3569 project_dir, None, Environment.from_env_file(project_dir, '.env2')
3576 project_dir, None, env
35703577 )
35713578 ).services
35723579
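This matches the 1.28.6 changelog note: an explicit `--env-file` is resolved against the current working directory, while the default `.env` is looked up in the project directory. A minimal sketch of that resolution rule, where `resolve_env_file` is a hypothetical helper, not Compose's actual implementation:

```python
import os


def resolve_env_file(project_dir, env_file=None):
    # An explicit --env-file path is taken relative to the current
    # working directory; the default .env lives in the project dir.
    if env_file is not None:
        return os.path.join(os.getcwd(), env_file)
    return os.path.join(project_dir, '.env')
```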
     files = [
         'docker-compose.yml',
         'docker-compose.yaml',
+        'compose.yml',
+        'compose.yaml',
     ]
 
     def test_get_config_path_default_file_in_basedir(self):
             base_dir = tempfile.mkdtemp(dir=project_dir)
         else:
             base_dir = project_dir
-        filename, = config.get_default_config_files(base_dir)
-        return os.path.basename(filename)
+        filenames = config.get_default_config_files(base_dir)
+        if not filenames:
+            raise config.ComposeFileNotFound(config.SUPPORTED_FILENAMES)
+        return os.path.basename(filenames[0])
     finally:
         shutil.rmtree(project_dir)
 
     with pytest.raises(UnsetRequiredSubstitution) as e:
         defaults_interpolator("not ok ${BAZ?}")
 
-    assert e.value.err == ''
+    assert e.value.err == 'BAZ'
 
 
 def test_interpolate_mixed_separators(defaults_interpolator):
         container = Container(None, self.container_dict, has_been_inspected=True)
         assert container.short_id == self.container_id[:12]
 
-    def test_has_api_logs(self):
-        container_dict = {
-            'HostConfig': {
-                'LogConfig': {
-                    'Type': 'json-file'
-                }
-            }
-        }
-
-        container = Container(None, container_dict, has_been_inspected=True)
-        assert container.has_api_logs is True
-
-        container_dict['HostConfig']['LogConfig']['Type'] = 'none'
-        container = Container(None, container_dict, has_been_inspected=True)
-        assert container.has_api_logs is False
-
-        container_dict['HostConfig']['LogConfig']['Type'] = 'syslog'
-        container = Container(None, container_dict, has_been_inspected=True)
-        assert container.has_api_logs is False
-
-        container_dict['HostConfig']['LogConfig']['Type'] = 'journald'
-        container = Container(None, container_dict, has_been_inspected=True)
-        assert container.has_api_logs is True
-
-        container_dict['HostConfig']['LogConfig']['Type'] = 'foobar'
-        container = Container(None, container_dict, has_been_inspected=True)
-        assert container.has_api_logs is False
-
 
 class GetContainerNameTestCase(unittest.TestCase):
 
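The removed test encoded which logging drivers allow reading logs back through the API: `json-file` and `journald` do, while `none`, `syslog`, and unknown drivers do not. That rule can be restated as a standalone predicate; `has_api_logs` here is a hypothetical re-statement of the behavior the deleted test asserted, not the `Container` property itself:

```python
def has_api_logs(container_dict):
    # Per the deleted assertions: only json-file and journald let the
    # daemon serve logs back over the API; none/syslog/unknown do not.
    log_type = (container_dict.get('HostConfig', {})
                .get('LogConfig', {})
                .get('Type'))
    return log_type in ('json-file', 'journald')
```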
+import unittest
+
+from compose.metrics.client import MetricsCommand
+from compose.metrics.client import Status
+
+
+class MetricsTest(unittest.TestCase):
+    @classmethod
+    def test_metrics(cls):
+        assert MetricsCommand('up', 'moby').to_map() == {
+            'command': 'compose up',
+            'context': 'moby',
+            'status': 'success',
+            'source': 'docker-compose',
+        }
+
+        assert MetricsCommand('down', 'local').to_map() == {
+            'command': 'compose down',
+            'context': 'local',
+            'status': 'success',
+            'source': 'docker-compose',
+        }
+
+        assert MetricsCommand('help', 'aci', Status.FAILURE).to_map() == {
+            'command': 'compose help',
+            'context': 'aci',
+            'status': 'failure',
+            'source': 'docker-compose',
+        }
+
+        assert MetricsCommand('run', 'ecs').to_map() == {
+            'command': 'compose run',
+            'context': 'ecs',
+            'status': 'success',
+            'source': 'docker-compose',
+        }
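The four assertions above pin down the shape of `to_map()`: a `compose `-prefixed command, the context name, a status defaulting to success, and a fixed `docker-compose` source tag. A minimal stand-in consistent with them (a sketch, not the real `compose.metrics.client` classes):

```python
from enum import Enum


class Status(Enum):
    SUCCESS = 'success'
    FAILURE = 'failure'


class MetricsCommand:
    def __init__(self, command, context, status=Status.SUCCESS):
        self.command = command
        self.context = context
        self.status = status

    def to_map(self):
        # Every event is tagged with the docker-compose source.
        return {
            'command': 'compose ' + self.command,
            'context': self.context,
            'status': self.status.value,
            'source': 'docker-compose',
        }
```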
 
 from docker.errors import APIError
 
+from compose.cli.colors import AnsiMode
 from compose.parallel import GlobalLimit
 from compose.parallel import parallel_execute
 from compose.parallel import parallel_execute_iter
 
 def test_parallel_execute_ansi(capsys):
     ParallelStreamWriter.instance = None
-    ParallelStreamWriter.set_noansi(value=False)
+    ParallelStreamWriter.set_default_ansi_mode(AnsiMode.ALWAYS)
     results, errors = parallel_execute(
         objects=["something", "something more"],
         func=lambda x: x,
 
 def test_parallel_execute_noansi(capsys):
     ParallelStreamWriter.instance = None
-    ParallelStreamWriter.set_noansi()
+    ParallelStreamWriter.set_default_ansi_mode(AnsiMode.NEVER)
     results, errors = parallel_execute(
         objects=["something", "something more"],
         func=lambda x: x,
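This hunk replaces the boolean `set_noansi` toggle with an `AnsiMode` enum, matching the `--ansi never` CLI option mentioned in the 1.29.0 changelog. A plausible sketch of such an enum follows; the `AUTO` member and the `use_ansi_codes` helper are assumptions for illustration, not the exact `compose.cli.colors` API:

```python
import sys
from enum import Enum


class AnsiMode(Enum):
    ALWAYS = 'always'
    NEVER = 'never'
    AUTO = 'auto'  # assumption: decide based on whether output is a TTY

    def use_ansi_codes(self, stream=None):
        # Decide whether ANSI escape sequences should be emitted.
        if self is AnsiMode.ALWAYS:
            return True
        if self is AnsiMode.NEVER:
            return False
        stream = stream if stream is not None else sys.stdout
        return stream.isatty()
```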
         assert service.options['environment'] == environment
 
         assert opts['labels'][LABEL_CONFIG_HASH] == \
-            '689149e6041a85f6fb4945a2146a497ed43c8a5cbd8991753d875b165f1b4de4'
+            '6da0f3ec0d5adf901de304bdc7e0ee44ec5dd7adb08aebc20fe0dd791d4ee5a8'
         assert opts['environment'] == ['also=real']
 
     def test_get_container_create_options_sets_affinity_with_binds(self):
         config_dict = service.config_dict()
         expected = {
             'image_id': 'abcd',
+            'ipc_mode': None,
             'options': {'image': 'example.com/foo'},
             'links': [('one', 'one')],
             'net': 'other',
         config_dict = service.config_dict()
         expected = {
             'image_id': 'abcd',
+            'ipc_mode': None,
             'options': {'image': 'example.com/foo'},
             'links': [],
             'networks': {},
 [tox]
-envlist = py37,pre-commit
+envlist = py37,py39,pre-commit
 
 [testenv]
 usedevelop=True
 [flake8]
 max-line-length = 105
 # Set this high for now
-max-complexity = 11
+max-complexity = 12
 exclude = compose/packages
 
 [pytest]