Import upstream version 1.29.2
Debian Janitor
2 years ago
Change log
==========

1.29.2 (2021-05-10)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/59?closed=1)

### Miscellaneous

- Remove advertisement for `docker compose` in the `up` command to avoid annoyance

- Bump `py` to `1.10.0` in `requirements-indirect.txt`

1.29.1 (2021-04-13)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/58?closed=1)

### Bugs

- Fix for invalid handler warning on Windows builds

- Fix config hash to trigger container recreation on IPC mode updates

- Fix conversion map for `placement.max_replicas_per_node`

- Remove extra scan suggestion on build

1.29.0 (2021-04-06)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/56?closed=1)

### Features

- Add profile filter to `docker-compose config`

- Add a `depends_on` condition to wait for successful service completion

### Miscellaneous

- Add image scan message on build

- Update warning message for `--no-ansi` to mention `--ansi never` as alternative

- Bump docker-py to 5.0.0

- Bump PyYAML to 5.4.1

- Bump python-dotenv to 0.17.0

1.28.6 (2021-03-23)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/57?closed=1)

### Bugs

- Make `--env-file` relative to the current working directory and error out for invalid paths. Paths set with `--env-file` are resolved relative to the current working directory, while the default `.env` file is located in the project directory, which by default is the base directory of the Compose file.

- Fix missing service property `storage_opt` by updating the compose schema

- Fix build `extra_hosts` list format

- Remove extra error message on `exec`

### Miscellaneous

- Add `compose.yml` and `compose.yaml` to default filename list

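The `--env-file` change above can be sketched as follows; `resolve_env_file` is an illustrative helper written for this note, not compose's actual API:

```python
import os

def resolve_env_file(cwd, project_dir, env_file=None):
    """Sketch of the 1.28.6 rule: an explicit --env-file is resolved
    against the current working directory (and must exist), while the
    default .env is looked up in the project directory."""
    if env_file is not None:
        path = os.path.normpath(os.path.join(cwd, env_file))
        if not os.path.exists(path):
            # "error out for invalid paths"
            raise IOError('env file not found: {}'.format(path))
        return path
    return os.path.join(project_dir, '.env')
```
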
1.28.5 (2021-02-25)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/55?closed=1)

### Bugs

- Fix OpenSSL version mismatch error when shelling out to the ssh client (via bump to docker-py 4.4.4, which contains the fix)

- Add missing build flags to the native builder: `platform`, `isolation` and `extra_hosts`

- Remove info message on native build

- Avoid fetching logs when the service logging driver is set to `none`

1.28.4 (2021-02-18)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/54?closed=1)

### Bugs

- Fix SSH port parsing by bumping docker-py to 4.4.3

### Miscellaneous

- Bump Python to 3.7.10

1.28.3 (2021-02-17)
-------------------

[List of PRs / issues for this release](https://github.com/docker/compose/milestone/53?closed=1)

### Bugs

- Fix SSH hostname parsing when it contains a leading `s`/`h`, and remove the quiet option that was hiding the error (via docker-py bump to 4.4.2)

- Fix key error for `--no-log-prefix` option

- Fix incorrect CLI environment variable name for service profiles: `COMPOSE_PROFILES` instead of `COMPOSE_PROFILE`

- Fix fish completion

### Miscellaneous

- Bump cryptography to 3.3.2

- Remove log driver filter

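The `COMPOSE_PROFILES` fix above boils down to a simple precedence rule, restated here as a standalone sketch (`get_profiles` is an illustrative name, not compose's real function):

```python
def get_profiles(cli_profiles, environ):
    """Resolve active service profiles: explicit --profile flags win;
    otherwise fall back to the comma-separated COMPOSE_PROFILES
    environment variable (note the trailing S fixed in 1.28.3)."""
    if cli_profiles:
        return cli_profiles
    profiles = environ.get('COMPOSE_PROFILES')
    if profiles:
        return profiles.split(',')
    return []
```
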
1.28.2 (2021-01-26)
-------------------

### Miscellaneous

- CI setup update

1.28.1 (2021-01-25)
-------------------

### Bugs

- Revert to Python 3.7 for Linux static builds

- Add bash completion for `docker-compose logs|up --no-log-prefix`

1.28.0 (2021-01-20)
-------------------

### Features

- Support for Nvidia GPUs via device requests

- Support for service profiles

- Change the SSH connection approach to match the Docker CLI's, by shelling out to the local SSH client (the old behaviour can be restored by setting the `COMPOSE_PARAMIKO_SSH` environment variable)

- Add flag to disable log prefix

- Add flag for ANSI output control

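The new ANSI flag accepts `never`, `always` and `auto`; its resolution can be sketched like this (a standalone approximation of compose's behaviour, including the conventional `CLICOLOR=0` opt-out):

```python
def use_ansi_codes(mode, stream, environ):
    """Decide whether to emit ANSI escapes for the given output stream."""
    if mode == 'always':
        return True
    if mode == 'never' or environ.get('CLICOLOR') == '0':
        return False
    # auto: color only when writing to a terminal
    return stream.isatty()
```
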
### Bugs

- Make `parallel_pull=True` by default

- Bring back the warning for configs in non-swarm mode

- Take `--file` into account when defining `project_dir`

- On `compose up`, attach only to services we read logs from

### Miscellaneous

- Make `COMPOSE_DOCKER_CLI_BUILD=1` the default

- Add usage metrics

- Sync schema with the Compose specification

- Improve failure report for missing mandatory environment variables

- Bump attrs to 20.3.0

- Bump more_itertools to 8.6.0

- Bump cryptography to 3.2.1

- Bump cffi to 1.14.4

- Bump virtualenv to 20.2.2

- Bump bcrypt to 3.2.0

- Bump gitpython to 3.1.11

- Bump docker-py to 4.4.1

- Bump Python to 3.9

- Linux: bump the Debian base image from stretch to buster (required for Python 3.9)

- macOS: bump OpenSSL 1.1.1g to 1.1.1h and Python 3.7.7 to 3.9.0

- Bump PyInstaller to 4.1

- Loosen the restriction on base images to the latest minor version

- README updates

1.27.4 (2020-09-24)
-------------------
```diff
-ARG DOCKER_VERSION=19.03.8
-ARG PYTHON_VERSION=3.7.7
-ARG BUILD_ALPINE_VERSION=3.11
+ARG DOCKER_VERSION=19.03
+ARG PYTHON_VERSION=3.7.10
+
+ARG BUILD_ALPINE_VERSION=3.12
+ARG BUILD_CENTOS_VERSION=7
 ARG BUILD_DEBIAN_VERSION=slim-stretch
-ARG RUNTIME_ALPINE_VERSION=3.11.5
-ARG RUNTIME_DEBIAN_VERSION=stretch-20200414-slim

-ARG BUILD_PLATFORM=alpine
+ARG RUNTIME_ALPINE_VERSION=3.12
+ARG RUNTIME_CENTOS_VERSION=7
+ARG RUNTIME_DEBIAN_VERSION=stretch-slim
+
+ARG DISTRO=alpine

 FROM docker:${DOCKER_VERSION} AS docker-cli

@@
     openssl \
     zlib1g-dev

-FROM build-${BUILD_PLATFORM} AS build
+FROM centos:${BUILD_CENTOS_VERSION} AS build-centos
+RUN yum install -y \
+    gcc \
+    git \
+    libffi-devel \
+    make \
+    openssl \
+    openssl-devel
+WORKDIR /tmp/python3/
+ARG PYTHON_VERSION
+RUN curl -L https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz | tar xzf - \
+    && cd Python-${PYTHON_VERSION} \
+    && ./configure --enable-optimizations --enable-shared --prefix=/usr LDFLAGS="-Wl,-rpath /usr/lib" \
+    && make altinstall
+RUN alternatives --install /usr/bin/python python /usr/bin/python2.7 50
+RUN alternatives --install /usr/bin/python python /usr/bin/python$(echo "${PYTHON_VERSION%.*}") 60
+RUN curl https://bootstrap.pypa.io/get-pip.py | python -
+
+FROM build-${DISTRO} AS build
+ENTRYPOINT ["sh", "/usr/local/bin/docker-compose-entrypoint.sh"]
+WORKDIR /code/
 COPY docker-compose-entrypoint.sh /usr/local/bin/
-ENTRYPOINT ["sh", "/usr/local/bin/docker-compose-entrypoint.sh"]
 COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
-WORKDIR /code/
-# FIXME(chris-crone): virtualenv 16.3.0 breaks build, force 16.2.0 until fixed
-RUN pip install virtualenv==20.0.30
-RUN pip install tox==3.19.0
-
+RUN pip install \
+    virtualenv==20.4.0 \
+    tox==3.21.2
+COPY requirements-dev.txt .
 COPY requirements-indirect.txt .
 COPY requirements.txt .
-COPY requirements-dev.txt .
+RUN pip install -r requirements.txt -r requirements-indirect.txt -r requirements-dev.txt
 COPY .pre-commit-config.yaml .
 COPY tox.ini .
 COPY setup.py .
 COPY README.md .
 COPY compose compose/
-RUN tox --notest
+RUN tox -e py37 --notest
 COPY . .
 ARG GIT_COMMIT=unknown
 ENV DOCKER_COMPOSE_GITSHA=$GIT_COMMIT
 RUN script/build/linux-entrypoint

+FROM scratch AS bin
+ARG TARGETARCH
+ARG TARGETOS
+COPY --from=build /usr/local/bin/docker-compose /docker-compose-${TARGETOS}-${TARGETARCH}
+
 FROM alpine:${RUNTIME_ALPINE_VERSION} AS runtime-alpine
 FROM debian:${RUNTIME_DEBIAN_VERSION} AS runtime-debian
-FROM runtime-${BUILD_PLATFORM} AS runtime
+FROM centos:${RUNTIME_CENTOS_VERSION} AS runtime-centos
+FROM runtime-${DISTRO} AS runtime
 COPY docker-compose-entrypoint.sh /usr/local/bin/
 ENTRYPOINT ["sh", "/usr/local/bin/docker-compose-entrypoint.sh"]
 COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
```
```diff
 #!groovy

-def dockerVersions = ['19.03.8']
+def dockerVersions = ['19.03.13']
 def baseImages = ['alpine', 'debian']
 def pythonVersions = ['py37']

@@
         timeout(time: 2, unit: 'HOURS')
         timestamps()
     }
+    environment {
+        DOCKER_BUILDKIT="1"
+    }

     stages {
         stage('Build test images') {
@@
             parallel {
                 stage('alpine') {
                     agent {
-                        label 'ubuntu && amd64 && !zfs'
+                        label 'ubuntu-2004 && amd64 && !zfs && cgroup1'
                     }
                     steps {
                         buildImage('alpine')
@@
                 }
                 stage('debian') {
                     agent {
-                        label 'ubuntu && amd64 && !zfs'
+                        label 'ubuntu-2004 && amd64 && !zfs && cgroup1'
                     }
                     steps {
                         buildImage('debian')
@@

 def buildImage(baseImage) {
     def scmvar = checkout(scm)
-    def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+    def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
     image = docker.image(imageName)

     withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@
         ansiColor('xterm') {
             sh """docker build -t ${imageName} \\
                 --target build \\
-                --build-arg BUILD_PLATFORM="${baseImage}" \\
+                --build-arg DISTRO="${baseImage}" \\
                 --build-arg GIT_COMMIT="${scmvar.GIT_COMMIT}" \\
                 .\\
             """
@@
 def runTests(dockerVersion, pythonVersion, baseImage) {
     return {
         stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
-            node("ubuntu && amd64 && !zfs") {
+            node("ubuntu-2004 && amd64 && !zfs && cgroup1") {
                 def scmvar = checkout(scm)
-                def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+                def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
                 def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()
                 echo "Using local system's storage driver: ${storageDriver}"
                 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@
                     --privileged \\
                     --volume="\$(pwd)/.git:/code/.git" \\
                     --volume="/var/run/docker.sock:/var/run/docker.sock" \\
+                    --volume="\${DOCKER_CONFIG}/config.json:/root/.docker/config.json" \\
+                    -e "DOCKER_TLS_CERTDIR=" \\
                     -e "TAG=${imageName}" \\
                     -e "STORAGE_DRIVER=${storageDriver}" \\
                     -e "DOCKER_VERSIONS=${dockerVersion}" \\
```
```makefile
TAG = "docker-compose:alpine-$(shell git rev-parse --short HEAD)"
GIT_VOLUME = "--volume=$(shell pwd)/.git:/code/.git"

DOCKERFILE ?="Dockerfile"
DOCKER_BUILD_TARGET ?="build"

UNAME_S := $(shell uname -s)
ifeq ($(UNAME_S),Linux)
	BUILD_SCRIPT = linux
endif
ifeq ($(UNAME_S),Darwin)
	BUILD_SCRIPT = osx
endif

COMPOSE_SPEC_SCHEMA_PATH = "compose/config/compose_spec.json"
COMPOSE_SPEC_RAW_URL = "https://raw.githubusercontent.com/compose-spec/compose-spec/master/schema/compose-spec.json"

all: cli

cli: download-compose-spec ## Compile the cli
	./script/build/$(BUILD_SCRIPT)

download-compose-spec: ## Download the compose-spec schema from its repo
	curl -so $(COMPOSE_SPEC_SCHEMA_PATH) $(COMPOSE_SPEC_RAW_URL)

cache-clear: ## Clear the builder cache
	@docker builder prune --force --filter type=exec.cachemount --filter=unused-for=24h

base-image: ## Builds base image
	docker build -f $(DOCKERFILE) -t $(TAG) --target $(DOCKER_BUILD_TARGET) .

lint: base-image ## Run linter
	docker run --rm \
		--tty \
		$(GIT_VOLUME) \
		$(TAG) \
		tox -e pre-commit

test-unit: base-image ## Run tests
	docker run --rm \
		--tty \
		$(GIT_VOLUME) \
		$(TAG) \
		pytest -v tests/unit/

test: ## Run all tests
	./script/test/default

pre-commit: lint test-unit cli

help: ## Show help
	@echo Please specify a build target. The choices are:
	@grep -E '^[0-9a-zA-Z_-]+:.*?## .*$$' $(MAKEFILE_LIST) | awk 'BEGIN {FS = ":.*?## "}; {printf "\033[36m%-30s\033[0m %s\n", $$1, $$2}'

FORCE:

.PHONY: all cli download-compose-spec cache-clear base-image lint test-unit test pre-commit help
```
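The `help` target above extracts `target: ## description` pairs with grep and awk to make the Makefile self-documenting. The same idea in Python, as a sketch for illustration:

```python
import re

def makefile_help(makefile_text):
    """Collect self-documenting `target: ... ## description` pairs,
    mirroring what the Makefile's help target does with grep/awk."""
    pattern = re.compile(r'^([0-9a-zA-Z_-]+):.*?## (.*)$', re.MULTILINE)
    return dict(pattern.findall(makefile_text))
```
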
````diff
 Docker Compose
 ==============
-[![Build Status](https://ci-next.docker.com/public/buildStatus/icon?job=compose/master)](https://ci-next.docker.com/public/job/compose/job/master/)
-
 ![Docker Compose](logo.png?raw=true "Docker Compose Logo")

-Compose is a tool for defining and running multi-container Docker applications.
-With Compose, you use a Compose file to configure your application's services.
-Then, using a single command, you create and start all the services
-from your configuration. To learn more about all the features of Compose
-see [the list of features](https://github.com/docker/docker.github.io/blob/master/compose/index.md#features).
+Docker Compose is a tool for running multi-container applications on Docker
+defined using the [Compose file format](https://compose-spec.io).
+A Compose file is used to define how the one or more containers that make up
+your application are configured.
+Once you have a Compose file, you can create and start your application with a
+single command: `docker-compose up`.

-Compose is great for development, testing, and staging environments, as well as
-CI workflows. You can learn more about each case in
-[Common Use Cases](https://github.com/docker/docker.github.io/blob/master/compose/index.md#common-use-cases).
+Compose files can be used to deploy applications locally, or to the cloud on
+[Amazon ECS](https://aws.amazon.com/ecs) or
+[Microsoft ACI](https://azure.microsoft.com/services/container-instances/) using
+the Docker CLI. You can read more about how to do this:
+- [Compose for Amazon ECS](https://docs.docker.com/engine/context/ecs-integration/)
+- [Compose for Microsoft ACI](https://docs.docker.com/engine/context/aci-integration/)

-Using Compose is basically a three-step process.
+Where to get Docker Compose
+----------------------------

+### Windows and macOS
+
+Docker Compose is included in
+[Docker Desktop](https://www.docker.com/products/docker-desktop)
+for Windows and macOS.
+
+### Linux
+
+You can download Docker Compose binaries from the
+[release page](https://github.com/docker/compose/releases) on this repository.
+
+### Using pip
+
+If your platform is not supported, you can download Docker Compose using `pip`:
+
+```console
+pip install docker-compose
+```
+
+> **Note:** Docker Compose requires Python 3.6 or later.
+
+Quick Start
+-----------
+
+Using Docker Compose is basically a three-step process:
 1. Define your app's environment with a `Dockerfile` so it can be
-reproduced anywhere.
+   reproduced anywhere.
 2. Define the services that make up your app in `docker-compose.yml` so
-they can be run together in an isolated environment.
-3. Lastly, run `docker-compose up` and Compose will start and run your entire app.
+   they can be run together in an isolated environment.
+3. Lastly, run `docker-compose up` and Compose will start and run your entire
+   app.

-A `docker-compose.yml` looks like this:
+A Compose file looks like this:

-    version: '2'
+```yaml
+services:
+  web:
+    build: .
+    ports:
+      - "5000:5000"
+    volumes:
+      - .:/code
+  redis:
+    image: redis
+```

-    services:
-      web:
-        build: .
-        ports:
-          - "5000:5000"
-        volumes:
-          - .:/code
-      redis:
-        image: redis
+You can find examples of Compose applications in our
+[Awesome Compose repository](https://github.com/docker/awesome-compose).

-For more information about the Compose file, see the
-[Compose file reference](https://github.com/docker/docker.github.io/blob/master/compose/compose-file/compose-versioning.md).
-
-Compose has commands for managing the whole lifecycle of your application:
-
- * Start, stop and rebuild services
- * View the status of running services
- * Stream the log output of running services
- * Run a one-off command on a service
-
-Installation and documentation
-------------------------------
-
-- Full documentation is available on [Docker's website](https://docs.docker.com/compose/).
-- Code repository for Compose is on [GitHub](https://github.com/docker/compose).
-- If you find any problems please fill out an [issue](https://github.com/docker/compose/issues/new/choose). Thank you!
+For more information about the Compose format, see the
+[Compose file reference](https://docs.docker.com/compose/compose-file/).

 Contributing
 ------------

-[![Build Status](https://ci-next.docker.com/public/buildStatus/icon?job=compose/master)](https://ci-next.docker.com/public/job/compose/job/master/)
+Want to help develop Docker Compose? Check out our
+[contributing documentation](https://github.com/docker/compose/blob/master/CONTRIBUTING.md).

-Want to help build Compose? Check out our [contributing documentation](https://github.com/docker/compose/blob/master/CONTRIBUTING.md).
+If you find an issue, please report it on the
+[issue tracker](https://github.com/docker/compose/issues/new/choose).

 Releasing
 ---------
````
```diff
 #!groovy

-def dockerVersions = ['19.03.8', '18.09.9']
+def dockerVersions = ['19.03.13', '18.09.9']
 def baseImages = ['alpine', 'debian']
 def pythonVersions = ['py37']

@@
         timeout(time: 2, unit: 'HOURS')
         timestamps()
     }
+    environment {
+        DOCKER_BUILDKIT="1"
+    }

     stages {
         stage('Build test images') {
@@
             parallel {
                 stage('alpine') {
                     agent {
-                        label 'linux && docker && ubuntu-2004'
+                        label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                     }
                     steps {
                         buildImage('alpine')
@@
                 }
                 stage('debian') {
                     agent {
-                        label 'linux && docker && ubuntu-2004'
+                        label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                     }
                     steps {
                         buildImage('debian')
@@
         }
         stage('Test') {
             agent {
-                label 'linux && docker && ubuntu-2004'
+                label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
             }
             steps {
                 // TODO use declarative 1.5.0 `matrix` once available on CI
@@
         }
         stage('Generate Changelog') {
             agent {
-                label 'linux && docker && ubuntu-2004'
+                label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
             }
             steps {
                 checkout scm
@@
             steps {
                 checkout scm
                 sh './script/setup/osx'
-                sh 'tox -e py37 -- tests/unit'
+                sh 'tox -e py39 -- tests/unit'
                 sh './script/build/osx'
                 dir ('dist') {
                     checksum('docker-compose-Darwin-x86_64')
@@
         }
         stage('linux binary') {
             agent {
-                label 'linux && docker && ubuntu-2004'
+                label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
             }
             steps {
                 checkout scm
@@
                 label 'windows-python'
             }
             environment {
-                PATH = "$PATH;C:\\Python37;C:\\Python37\\Scripts"
-            }
-            steps {
-                checkout scm
-                bat 'tox.exe -e py37 -- tests/unit'
+                PATH = "C:\\Python39;C:\\Python39\\Scripts;$PATH"
+            }
+            steps {
+                checkout scm
+                bat 'tox.exe -e py39 -- tests/unit'
                 powershell '.\\script\\build\\windows.ps1'
                 dir ('dist') {
                     checksum('docker-compose-Windows-x86_64.exe')
@@
         }
         stage('alpine image') {
             agent {
-                label 'linux && docker && ubuntu-2004'
+                label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
             }
             steps {
                 buildRuntimeImage('alpine')
@@
         }
         stage('debian image') {
             agent {
-                label 'linux && docker && ubuntu-2004'
+                label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
             }
             steps {
                 buildRuntimeImage('debian')
@@
             parallel {
                 stage('Pushing images') {
                     agent {
-                        label 'linux && docker && ubuntu-2004'
+                        label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                     }
                     steps {
                         pushRuntimeImage('alpine')
@@
                 }
                 stage('Creating Github Release') {
                     agent {
-                        label 'linux && docker && ubuntu-2004'
+                        label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                    }
                     environment {
                         GITHUB_TOKEN = credentials('github-release-token')
@@
                 }
                 stage('Publishing Python packages') {
                     agent {
-                        label 'linux && docker && ubuntu-2004'
+                        label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                     }
                     environment {
                         PYPIRC = credentials('pypirc-docker-dsg-cibot')
@@

 def buildImage(baseImage) {
     def scmvar = checkout(scm)
-    def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+    def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
     image = docker.image(imageName)

     withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@
         ansiColor('xterm') {
             sh """docker build -t ${imageName} \\
                 --target build \\
-                --build-arg BUILD_PLATFORM="${baseImage}" \\
+                --build-arg DISTRO="${baseImage}" \\
                 --build-arg GIT_COMMIT="${scmvar.GIT_COMMIT}" \\
                 .\\
             """
@@
 def runTests(dockerVersion, pythonVersion, baseImage) {
     return {
         stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
-            node("linux && docker && ubuntu-2004") {
+            node("linux && docker && ubuntu-2004 && amd64 && cgroup1") {
                 def scmvar = checkout(scm)
-                def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+                def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
                 def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()
                 echo "Using local system's storage driver: ${storageDriver}"
                 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@
                     --privileged \\
                     --volume="\$(pwd)/.git:/code/.git" \\
                     --volume="/var/run/docker.sock:/var/run/docker.sock" \\
+                    --volume="\${DOCKER_CONFIG}/config.json:/root/.docker/config.json" \\
+                    -e "DOCKER_TLS_CERTDIR=" \\
                     -e "TAG=${imageName}" \\
                     -e "STORAGE_DRIVER=${storageDriver}" \\
                     -e "DOCKER_VERSIONS=${dockerVersion}" \\
@@
                 def imageName = "docker/compose:${baseImage}-${env.BRANCH_NAME}"
                 ansiColor('xterm') {
                     sh """docker build -t ${imageName} \\
-                        --build-arg BUILD_PLATFORM="${baseImage}" \\
+                        --build-arg DISTRO="${baseImage}" \\
                         --build-arg GIT_COMMIT="${scmvar.GIT_COMMIT.take(7)}" \\
                         .
                     """
```
```diff
+import enum
+import os
+
 from ..const import IS_WINDOWS_PLATFORM

 NAMES = [
@@
     'cyan',
     'white'
 ]
+
+
+@enum.unique
+class AnsiMode(enum.Enum):
+    """Enumeration for when to output ANSI colors."""
+    NEVER = "never"
+    ALWAYS = "always"
+    AUTO = "auto"
+
+    def use_ansi_codes(self, stream):
+        if self is AnsiMode.ALWAYS:
+            return True
+        if self is AnsiMode.NEVER or os.environ.get('CLICOLOR') == '0':
+            return False
+        return stream.isatty()


 def get_pairs():
```
```diff
 def project_from_options(project_dir, options, additional_options=None):
     additional_options = additional_options or {}
-    override_dir = options.get('--project-directory')
+    override_dir = get_project_dir(options)
     environment_file = options.get('--env-file')
     environment = Environment.from_env_file(override_dir or project_dir, environment_file)
     environment.silent = options.get('COMMAND', None) in SILENT_COMMANDS
@@
     return get_project(
         project_dir,
-        get_config_path_from_options(project_dir, options, environment),
+        get_config_path_from_options(options, environment),
         project_name=options.get('--project-name'),
         verbose=options.get('--verbose'),
         context=context,
         environment=environment,
         override_dir=override_dir,
         interpolate=(not additional_options.get('--no-interpolate')),
-        environment_file=environment_file
+        environment_file=environment_file,
+        enabled_profiles=get_profiles_from_options(options, environment)
     )
@@
     parallel.GlobalLimit.set_global_limit(parallel_limit)


+def get_project_dir(options):
+    override_dir = None
+    files = get_config_path_from_options(options, os.environ)
+    if files:
+        if files[0] == '-':
+            return '.'
+        override_dir = os.path.dirname(files[0])
+    return options.get('--project-directory') or override_dir
+
+
 def get_config_from_options(base_dir, options, additional_options=None):
     additional_options = additional_options or {}
-    override_dir = options.get('--project-directory')
+    override_dir = get_project_dir(options)
     environment_file = options.get('--env-file')
     environment = Environment.from_env_file(override_dir or base_dir, environment_file)
-    config_path = get_config_path_from_options(
-        base_dir, options, environment
-    )
+    config_path = get_config_path_from_options(options, environment)
     return config.load(
         config.find(base_dir, config_path, environment, override_dir),
         not additional_options.get('--no-interpolate')
     )


-def get_config_path_from_options(base_dir, options, environment):
+def get_config_path_from_options(options, environment):
     def unicode_paths(paths):
         return [p.decode('utf-8') if isinstance(p, bytes) else p for p in paths]

@@
     return None


+def get_profiles_from_options(options, environment):
+    profile_option = options.get('--profile')
+    if profile_option:
+        return profile_option
+
+    profiles = environment.get('COMPOSE_PROFILES')
+    if profiles:
+        return profiles.split(',')
+
+    return []
+
+
 def get_project(project_dir, config_path=None, project_name=None, verbose=False,
                 context=None, environment=None, override_dir=None,
-                interpolate=True, environment_file=None):
+                interpolate=True, environment_file=None, enabled_profiles=None):
     if not environment:
         environment = Environment.from_env_file(project_dir)
     config_details = config.find(project_dir, config_path, environment, override_dir)
@@
         client,
         environment.get('DOCKER_DEFAULT_PLATFORM'),
         execution_context_labels(config_details, environment_file),
+        enabled_profiles,
     )
```
```diff
     kwargs['credstore_env'] = {
         'LD_LIBRARY_PATH': environment.get('LD_LIBRARY_PATH_ORIG'),
     }
-
-    client = APIClient(**kwargs)
+    use_paramiko_ssh = int(environment.get('COMPOSE_PARAMIKO_SSH', 0))
+    client = APIClient(use_ssh_client=not use_paramiko_ssh, **kwargs)
     client._original_base_url = kwargs.get('base_url')

     return client
```
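The toggle above reads: shell out to the system `ssh` client unless `COMPOSE_PARAMIKO_SSH` is set to a non-zero value. A standalone sketch of that decision (the helper name is illustrative, not compose's API):

```python
def ssh_client_kwargs(environ):
    """Default to docker-py's use_ssh_client=True (system ssh binary);
    setting COMPOSE_PARAMIKO_SSH=1 restores the old paramiko transport."""
    use_paramiko_ssh = int(environ.get('COMPOSE_PARAMIKO_SSH', 0))
    return {'use_ssh_client': not use_paramiko_ssh}
```
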
```diff
         self.command_class = command_class
         self.options = options

+    @classmethod
+    def get_command_and_options(cls, doc_entity, argv, options):
+        command_help = getdoc(doc_entity)
+        opt = docopt_full_help(command_help, argv, **options)
+        command = opt['COMMAND']
+        return command_help, opt, command
+
     def parse(self, argv):
-        command_help = getdoc(self.command_class)
-        options = docopt_full_help(command_help, argv, **self.options)
-        command = options['COMMAND']
+        command_help, options, command = DocoptDispatcher.get_command_and_options(
+            self.command_class, argv, self.options)

         if command is None:
             raise SystemExit(command_help)
```
15 | 15 | |
16 | 16 | class LogPresenter: |
17 | 17 | |
18 | def __init__(self, prefix_width, color_func): | |
18 | def __init__(self, prefix_width, color_func, keep_prefix=True): | |
19 | 19 | self.prefix_width = prefix_width |
20 | 20 | self.color_func = color_func |
21 | self.keep_prefix = keep_prefix | |
21 | 22 | |
22 | 23 | def present(self, container, line): |
23 | prefix = container.name_without_project.ljust(self.prefix_width) | |
24 | return '{prefix} {line}'.format( | |
25 | prefix=self.color_func(prefix + ' |'), | |
26 | line=line) | |
27 | ||
28 | ||
29 | def build_log_presenters(service_names, monochrome): | |
24 | to_log = '{line}'.format(line=line) | |
25 | ||
26 | if self.keep_prefix: | |
27 | prefix = container.name_without_project.ljust(self.prefix_width) | |
28 | to_log = '{prefix} '.format(prefix=self.color_func(prefix + ' |')) + to_log | |
29 | ||
30 | return to_log | |
31 | ||
32 | ||
33 | def build_log_presenters(service_names, monochrome, keep_prefix=True): | |
30 | 34 | """Return an iterable of functions. |
31 | 35 | |
32 | 36 | Each function can be used to format the logs output of a container. |
37 | 41 | return text |
38 | 42 | |
39 | 43 | for color_func in cycle([no_color] if monochrome else colors.rainbow()): |
40 | yield LogPresenter(prefix_width, color_func) | |
44 | yield LogPresenter(prefix_width, color_func, keep_prefix) | |
41 | 45 | |
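The reworked `present` above makes the service-name prefix optional via `keep_prefix` (wired up to the new `--no-log-prefix` flag). A self-contained sketch of the same formatting, with a bare string standing in for the container's `name_without_project`:

```python
class PrefixPresenter:
    """Format a log line, optionally prefixed with a padded service name."""

    def __init__(self, prefix_width, color_func, keep_prefix=True):
        self.prefix_width = prefix_width
        self.color_func = color_func
        self.keep_prefix = keep_prefix

    def present(self, name, line):
        to_log = line
        if self.keep_prefix:
            prefix = name.ljust(self.prefix_width)
            to_log = self.color_func(prefix + ' |') + ' ' + to_log
        return to_log
```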
42 | 46 | |
43 | 47 | def max_name_width(service_names, max_index_width=3): |
153 | 157 | |
154 | 158 | |
155 | 159 | def tail_container_logs(container, presenter, queue, log_args): |
156 | generator = get_log_generator(container) | |
157 | ||
158 | 160 | try: |
159 | for item in generator(container, log_args): | |
161 | for item in build_log_generator(container, log_args): | |
160 | 162 | queue.put(QueueItem.new(presenter.present(container, item))) |
161 | 163 | except Exception as e: |
162 | 164 | queue.put(QueueItem.exception(e)) |
164 | 166 | if log_args.get('follow'): |
165 | 167 | queue.put(QueueItem.new(presenter.color_func(wait_on_exit(container)))) |
166 | 168 | queue.put(QueueItem.stop(container.name)) |
167 | ||
168 | ||
169 | def get_log_generator(container): | |
170 | if container.has_api_logs: | |
171 | return build_log_generator | |
172 | return build_no_log_generator | |
173 | ||
174 | ||
175 | def build_no_log_generator(container, log_args): | |
176 | """Return a generator that prints a warning about logs and waits for | |
177 | container to exit. | |
178 | """ | |
179 | yield "WARNING: no logs are available with the '{}' log driver\n".format( | |
180 | container.log_driver) | |
181 | 169 | |
182 | 170 | |
183 | 171 | def build_log_generator(container, log_args): |
1 | 1 | import functools |
2 | 2 | import json |
3 | 3 | import logging |
4 | import os | |
5 | 4 | import pipes |
6 | 5 | import re |
7 | 6 | import subprocess |
23 | 22 | from ..config.environment import Environment |
24 | 23 | from ..config.serialize import serialize_config |
25 | 24 | from ..config.types import VolumeSpec |
25 | from ..const import IS_LINUX_PLATFORM | |
26 | 26 | from ..const import IS_WINDOWS_PLATFORM |
27 | 27 | from ..errors import StreamParseError |
28 | from ..metrics.decorator import metrics | |
29 | from ..parallel import ParallelStreamWriter | |
28 | 30 | from ..progress_stream import StreamOutputError |
29 | 31 | from ..project import get_image_digests |
30 | 32 | from ..project import MissingDigests |
37 | 39 | from ..service import ImageType |
38 | 40 | from ..service import NeedsBuildError |
39 | 41 | from ..service import OperationFailedError |
42 | from ..utils import filter_attached_for_up | |
43 | from .colors import AnsiMode | |
40 | 44 | from .command import get_config_from_options |
45 | from .command import get_project_dir | |
41 | 46 | from .command import project_from_options |
42 | 47 | from .docopt_command import DocoptDispatcher |
43 | 48 | from .docopt_command import get_handler |
50 | 55 | from .utils import get_version_info |
51 | 56 | from .utils import human_readable_file_size |
52 | 57 | from .utils import yesno |
58 | from compose.metrics.client import MetricsCommand | |
59 | from compose.metrics.client import Status | |
53 | 60 | |
54 | 61 | |
55 | 62 | if not IS_WINDOWS_PLATFORM: |
56 | 63 | from dockerpty.pty import PseudoTerminal, RunOperation, ExecOperation |
57 | 64 | |
58 | 65 | log = logging.getLogger(__name__) |
59 | console_handler = logging.StreamHandler(sys.stderr) | |
60 | ||
61 | ||
62 | def main(): | |
66 | ||
67 | ||
68 | def main(): # noqa: C901 | |
63 | 69 | signals.ignore_sigpipe() |
70 | command = None | |
64 | 71 | try: |
65 | command = dispatch() | |
66 | command() | |
72 | _, opts, command = DocoptDispatcher.get_command_and_options( | |
73 | TopLevelCommand, | |
74 | get_filtered_args(sys.argv[1:]), | |
75 | {'options_first': True, 'version': get_version_info('compose')}) | |
76 | except Exception: | |
77 | pass | |
78 | try: | |
79 | command_func = dispatch() | |
80 | command_func() | |
81 | if not IS_LINUX_PLATFORM and command == 'help': | |
82 | print("\nDocker Compose is now in the Docker CLI, try `docker compose help`") | 
67 | 83 | except (KeyboardInterrupt, signals.ShutdownException): |
68 | log.error("Aborting.") | |
69 | sys.exit(1) | |
84 | exit_with_metrics(command, "Aborting.", status=Status.CANCELED) | |
70 | 85 | except (UserError, NoSuchService, ConfigurationError, |
71 | 86 | ProjectError, OperationFailedError) as e: |
72 | log.error(e.msg) | |
73 | sys.exit(1) | |
87 | exit_with_metrics(command, e.msg, status=Status.FAILURE) | |
74 | 88 | except BuildError as e: |
75 | 89 | reason = "" |
76 | 90 | if e.reason: |
77 | 91 | reason = " : " + e.reason |
78 | log.error("Service '{}' failed to build{}".format(e.service.name, reason)) | |
79 | sys.exit(1) | |
92 | exit_with_metrics(command, | |
93 | "Service '{}' failed to build{}".format(e.service.name, reason), | |
94 | status=Status.FAILURE) | |
80 | 95 | except StreamOutputError as e: |
81 | log.error(e) | |
82 | sys.exit(1) | |
96 | exit_with_metrics(command, e, status=Status.FAILURE) | |
83 | 97 | except NeedsBuildError as e: |
84 | log.error("Service '{}' needs to be built, but --no-build was passed.".format(e.service.name)) | |
85 | sys.exit(1) | |
98 | exit_with_metrics(command, | |
99 | "Service '{}' needs to be built, but --no-build was passed.".format( | |
100 | e.service.name), status=Status.FAILURE) | |
86 | 101 | except NoSuchCommand as e: |
87 | 102 | commands = "\n".join(parse_doc_section("commands:", getdoc(e.supercommand))) |
88 | log.error("No such command: %s\n\n%s", e.command, commands) | |
89 | sys.exit(1) | |
103 | if not IS_LINUX_PLATFORM: | |
104 | commands += "\n\nDocker Compose is now in the Docker CLI, try `docker compose`" | |
105 | exit_with_metrics("", log_msg="No such command: {}\n\n{}".format( | |
106 | e.command, commands), status=Status.FAILURE) | |
90 | 107 | except (errors.ConnectionError, StreamParseError): |
91 | sys.exit(1) | |
108 | exit_with_metrics(command, status=Status.FAILURE) | |
109 | except SystemExit as e: | |
110 | status = Status.SUCCESS | |
111 | if len(sys.argv) > 1 and '--help' not in sys.argv: | |
112 | status = Status.FAILURE | |
113 | ||
114 | if command and len(sys.argv) >= 3 and sys.argv[2] == '--help': | |
115 | command = '--help ' + command | |
116 | ||
117 | if not command and len(sys.argv) >= 2 and sys.argv[1] == '--help': | |
118 | command = '--help' | |
119 | ||
120 | msg = e.args[0] if len(e.args) else "" | |
121 | code = 0 | |
122 | if isinstance(e.code, int): | |
123 | code = e.code | |
124 | ||
125 | if not IS_LINUX_PLATFORM and not command: | |
126 | msg += "\n\nDocker Compose is now in the Docker CLI, try `docker compose`" | |
127 | ||
128 | exit_with_metrics(command, log_msg=msg, status=status, | |
129 | exit_code=code) | |
130 | ||
131 | ||
132 | def get_filtered_args(args): | |
133 | if args[0] in ('-h', '--help'): | |
134 | return [] | |
135 | if args[0] == '--version': | |
136 | return ['version'] | |
137 | ||
138 | ||
139 | def exit_with_metrics(command, log_msg=None, status=Status.SUCCESS, exit_code=1): | |
140 | if log_msg and command != 'exec': | |
141 | if not exit_code: | |
142 | log.info(log_msg) | |
143 | else: | |
144 | log.error(log_msg) | |
145 | ||
146 | MetricsCommand(command, status=status).send_metrics() | |
147 | sys.exit(exit_code) | |
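`main` now funnels every failure path through `exit_with_metrics`, which logs at the appropriate level, emits one metrics event, and exits in a single call. A reduced sketch of the pattern — the metrics sender here is a hypothetical stand-in for `MetricsCommand(...).send_metrics()`, and `SystemExit` is raised directly so the flow is testable:

```python
import logging

log = logging.getLogger(__name__)


def exit_with_metrics(command, send_metrics, log_msg=None, exit_code=1):
    """Log the message, emit one metrics event, then exit with the given code."""
    if log_msg:
        if not exit_code:
            log.info(log_msg)   # success messages go to INFO
        else:
            log.error(log_msg)  # failures go to ERROR
    send_metrics(command)       # stand-in for MetricsCommand(command, ...).send_metrics()
    raise SystemExit(exit_code)
```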
92 | 148 | |
93 | 149 | |
94 | 150 | def dispatch(): |
95 | setup_logging() | |
151 | console_stream = sys.stderr | |
152 | console_handler = logging.StreamHandler(console_stream) | |
153 | setup_logging(console_handler) | |
96 | 154 | dispatcher = DocoptDispatcher( |
97 | 155 | TopLevelCommand, |
98 | 156 | {'options_first': True, 'version': get_version_info('compose')}) |
99 | 157 | |
100 | 158 | options, handler, command_options = dispatcher.parse(sys.argv[1:]) |
159 | ||
160 | ansi_mode = AnsiMode.AUTO | |
161 | try: | |
162 | if options.get("--ansi"): | |
163 | ansi_mode = AnsiMode(options.get("--ansi")) | |
164 | except ValueError: | |
165 | raise UserError( | |
166 | 'Invalid value for --ansi: {}. Expected one of {}.'.format( | |
167 | options.get("--ansi"), | |
168 | ', '.join(m.value for m in AnsiMode) | |
169 | ) | |
170 | ) | |
171 | if options.get("--no-ansi"): | |
172 | if options.get("--ansi"): | |
173 | raise UserError("--no-ansi and --ansi cannot be combined.") | |
174 | log.warning('--no-ansi option is deprecated and will be removed in future versions. ' | |
175 | 'Use `--ansi never` instead.') | |
176 | ansi_mode = AnsiMode.NEVER | |
177 | ||
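The `--ansi`/`--no-ansi` handling above parses the enum, rejects the conflicting combination, and maps the deprecated flag onto `AnsiMode.NEVER`. A sketch of that resolution, with a local `AnsiMode` enum standing in for `compose.cli.colors.AnsiMode`:

```python
import enum


class AnsiMode(enum.Enum):
    NEVER = "never"
    ALWAYS = "always"
    AUTO = "auto"


def resolve_ansi_mode(options):
    """Resolve --ansi / --no-ansi into one AnsiMode, rejecting conflicts."""
    mode = AnsiMode.AUTO
    if options.get("--ansi"):
        try:
            mode = AnsiMode(options["--ansi"])
        except ValueError:
            raise ValueError("Invalid value for --ansi: {}. Expected one of {}.".format(
                options["--ansi"], ", ".join(m.value for m in AnsiMode)))
    if options.get("--no-ansi"):
        if options.get("--ansi"):
            raise ValueError("--no-ansi and --ansi cannot be combined.")
        mode = AnsiMode.NEVER  # deprecated flag maps onto the new enum
    return mode
```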
101 | 178 | setup_console_handler(console_handler, |
102 | 179 | options.get('--verbose'), |
103 | set_no_color_if_clicolor(options.get('--no-ansi')), | |
180 | ansi_mode.use_ansi_codes(console_handler.stream), | |
104 | 181 | options.get("--log-level")) |
105 | setup_parallel_logger(set_no_color_if_clicolor(options.get('--no-ansi'))) | |
106 | if options.get('--no-ansi'): | |
182 | setup_parallel_logger(ansi_mode) | |
183 | if ansi_mode is AnsiMode.NEVER: | |
107 | 184 | command_options['--no-color'] = True |
108 | 185 | return functools.partial(perform_command, options, handler, command_options) |
109 | 186 | |
125 | 202 | handler(command, command_options) |
126 | 203 | |
127 | 204 | |
128 | def setup_logging(): | |
205 | def setup_logging(console_handler): | |
129 | 206 | root_logger = logging.getLogger() |
130 | 207 | root_logger.addHandler(console_handler) |
131 | 208 | root_logger.setLevel(logging.DEBUG) |
132 | 209 | |
133 | # Disable requests logging | |
210 | # Disable requests and docker-py logging | |
211 | logging.getLogger("urllib3").propagate = False | |
134 | 212 | logging.getLogger("requests").propagate = False |
135 | ||
136 | ||
137 | def setup_parallel_logger(noansi): | |
138 | if noansi: | |
139 | import compose.parallel | |
140 | compose.parallel.ParallelStreamWriter.set_noansi() | |
141 | ||
142 | ||
143 | def setup_console_handler(handler, verbose, noansi=False, level=None): | |
144 | if handler.stream.isatty() and noansi is False: | |
213 | logging.getLogger("docker").propagate = False | |
214 | ||
215 | ||
216 | def setup_parallel_logger(ansi_mode): | |
217 | ParallelStreamWriter.set_default_ansi_mode(ansi_mode) | |
218 | ||
219 | ||
220 | def setup_console_handler(handler, verbose, use_console_formatter=True, level=None): | |
221 | if use_console_formatter: | |
145 | 222 | format_class = ConsoleWarningFormatter |
146 | 223 | else: |
147 | 224 | format_class = logging.Formatter |
181 | 258 | """Define and run multi-container applications with Docker. |
182 | 259 | |
183 | 260 | Usage: |
184 | docker-compose [-f <arg>...] [options] [--] [COMMAND] [ARGS...] | |
261 | docker-compose [-f <arg>...] [--profile <name>...] [options] [--] [COMMAND] [ARGS...] | |
185 | 262 | docker-compose -h|--help |
186 | 263 | |
187 | 264 | Options: |
189 | 266 | (default: docker-compose.yml) |
190 | 267 | -p, --project-name NAME Specify an alternate project name |
191 | 268 | (default: directory name) |
269 | --profile NAME Specify a profile to enable | |
192 | 270 | -c, --context NAME Specify a context name |
193 | 271 | --verbose Show more output |
194 | 272 | --log-level LEVEL Set log level (DEBUG, INFO, WARNING, ERROR, CRITICAL) |
195 | --no-ansi Do not print ANSI control characters | |
273 | --ansi (never|always|auto) Control when to print ANSI control characters | |
274 | --no-ansi Do not print ANSI control characters (DEPRECATED) | |
196 | 275 | -v, --version Print version and exit |
197 | 276 | -H, --host HOST Daemon socket to connect to |
198 | 277 | |
213 | 292 | build Build or rebuild services |
214 | 293 | config Validate and view the Compose file |
215 | 294 | create Create services |
216 | down Stop and remove containers, networks, images, and volumes | |
295 | down Stop and remove resources | |
217 | 296 | events Receive real time events from containers |
218 | 297 | exec Execute a command in a running container |
219 | 298 | help Get help on a command |
243 | 322 | |
244 | 323 | @property |
245 | 324 | def project_dir(self): |
246 | return self.toplevel_options.get('--project-directory') or '.' | |
325 | return get_project_dir(self.toplevel_options) | |
247 | 326 | |
248 | 327 | @property |
249 | 328 | def toplevel_environment(self): |
250 | 329 | environment_file = self.toplevel_options.get('--env-file') |
251 | 330 | return Environment.from_env_file(self.project_dir, environment_file) |
252 | 331 | |
332 | @metrics() | |
253 | 333 | def build(self, options): |
254 | 334 | """ |
255 | 335 | Build or rebuild services. |
269 | 349 | --no-rm Do not remove intermediate containers after a successful build. |
270 | 350 | --parallel Build images in parallel. |
271 | 351 | --progress string Set type of progress output (auto, plain, tty). |
272 | EXPERIMENTAL flag for native builder. | |
273 | To enable, run with COMPOSE_DOCKER_CLI_BUILD=1) | |
274 | 352 | --pull Always attempt to pull a newer version of the image. |
275 | 353 | -q, --quiet Don't print anything to STDOUT |
276 | 354 | """ |
284 | 362 | ) |
285 | 363 | build_args = resolve_build_args(build_args, self.toplevel_environment) |
286 | 364 | |
287 | native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD') | |
365 | native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD', True) | |
288 | 366 | |
289 | 367 | self.project.build( |
290 | 368 | service_names=options['SERVICE'], |
301 | 379 | progress=options.get('--progress'), |
302 | 380 | ) |
303 | 381 | |
382 | @metrics() | |
304 | 383 | def config(self, options): |
305 | 384 | """ |
306 | 385 | Validate and view the Compose file. |
312 | 391 | --no-interpolate Don't interpolate environment variables. |
313 | 392 | -q, --quiet Only validate the configuration, don't print |
314 | 393 | anything. |
394 | --profiles Print the profile names, one per line. | |
315 | 395 | --services Print the service names, one per line. |
316 | 396 | --volumes Print the volume names, one per line. |
317 | 397 | --hash="*" Print the service config hash, one per line. |
329 | 409 | image_digests = image_digests_for_project(self.project) |
330 | 410 | |
331 | 411 | if options['--quiet']: |
412 | return | |
413 | ||
414 | if options['--profiles']: | |
415 | profiles = set() | |
416 | for service in compose_config.services: | |
417 | if 'profiles' in service: | |
418 | for profile in service['profiles']: | |
419 | profiles.add(profile) | |
420 | print('\n'.join(sorted(profiles))) | |
332 | 421 | return |
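The new `config --profiles` branch above gathers every profile declared across the service definitions and prints the distinct names sorted. The collection step, sketched with plain dicts standing in for the parsed service configs:

```python
def collect_profiles(services):
    """Return the distinct profile names declared across service definitions."""
    profiles = set()
    for service in services:
        for profile in service.get('profiles', []):
            profiles.add(profile)
    return sorted(profiles)
```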
333 | 422 | |
334 | 423 | if options['--services']: |
350 | 439 | |
351 | 440 | print(serialize_config(compose_config, image_digests, not options['--no-interpolate'])) |
352 | 441 | |
442 | @metrics() | |
353 | 443 | def create(self, options): |
354 | 444 | """ |
355 | 445 | Creates containers for a service. |
378 | 468 | do_build=build_action_from_opts(options), |
379 | 469 | ) |
380 | 470 | |
471 | @metrics() | |
381 | 472 | def down(self, options): |
382 | 473 | """ |
383 | 474 | Stops containers and removes containers, networks, volumes, and images |
429 | 520 | Options: |
430 | 521 | --json Output events as a stream of json objects |
431 | 522 | """ |
523 | ||
432 | 524 | def format_event(event): |
433 | 525 | attributes = ["%s=%s" % item for item in event['attributes'].items()] |
434 | 526 | return ("{time} {type} {action} {id} ({attrs})").format( |
445 | 537 | print(formatter(event)) |
446 | 538 | sys.stdout.flush() |
447 | 539 | |
540 | @metrics("exec") | |
448 | 541 | def exec_command(self, options): |
449 | 542 | """ |
450 | 543 | Execute a command in a running container |
521 | 614 | sys.exit(exit_code) |
522 | 615 | |
523 | 616 | @classmethod |
617 | @metrics() | |
524 | 618 | def help(cls, options): |
525 | 619 | """ |
526 | 620 | Get help on a command. |
534 | 628 | |
535 | 629 | print(getdoc(subject)) |
536 | 630 | |
631 | @metrics() | |
537 | 632 | def images(self, options): |
538 | 633 | """ |
539 | 634 | List images used by the created containers. |
588 | 683 | ]) |
589 | 684 | print(Formatter.table(headers, rows)) |
590 | 685 | |
686 | @metrics() | |
591 | 687 | def kill(self, options): |
592 | 688 | """ |
593 | 689 | Force stop service containers. |
602 | 698 | |
603 | 699 | self.project.kill(service_names=options['SERVICE'], signal=signal) |
604 | 700 | |
701 | @metrics() | |
605 | 702 | def logs(self, options): |
606 | 703 | """ |
607 | 704 | View output from containers. |
609 | 706 | Usage: logs [options] [--] [SERVICE...] |
610 | 707 | |
611 | 708 | Options: |
612 | --no-color Produce monochrome output. | |
613 | -f, --follow Follow log output. | |
614 | -t, --timestamps Show timestamps. | |
615 | --tail="all" Number of lines to show from the end of the logs | |
616 | for each container. | |
709 | --no-color Produce monochrome output. | |
710 | -f, --follow Follow log output. | |
711 | -t, --timestamps Show timestamps. | |
712 | --tail="all" Number of lines to show from the end of the logs | |
713 | for each container. | |
714 | --no-log-prefix Don't print prefix in logs. | |
617 | 715 | """ |
618 | 716 | containers = self.project.containers(service_names=options['SERVICE'], stopped=True) |
619 | 717 | |
632 | 730 | log_printer_from_project( |
633 | 731 | self.project, |
634 | 732 | containers, |
635 | set_no_color_if_clicolor(options['--no-color']), | |
733 | options['--no-color'], | |
636 | 734 | log_args, |
637 | event_stream=self.project.events(service_names=options['SERVICE'])).run() | |
638 | ||
735 | event_stream=self.project.events(service_names=options['SERVICE']), | |
736 | keep_prefix=not options['--no-log-prefix']).run() | |
737 | ||
738 | @metrics() | |
639 | 739 | def pause(self, options): |
640 | 740 | """ |
641 | 741 | Pause services. |
645 | 745 | containers = self.project.pause(service_names=options['SERVICE']) |
646 | 746 | exit_if(not containers, 'No containers to pause', 1) |
647 | 747 | |
748 | @metrics() | |
648 | 749 | def port(self, options): |
649 | 750 | """ |
650 | 751 | Print the public port for a port binding. |
666 | 767 | options['PRIVATE_PORT'], |
667 | 768 | protocol=options.get('--protocol') or 'tcp') or '') |
668 | 769 | |
770 | @metrics() | |
669 | 771 | def ps(self, options): |
670 | 772 | """ |
671 | 773 | List containers. |
722 | 824 | ]) |
723 | 825 | print(Formatter.table(headers, rows)) |
724 | 826 | |
827 | @metrics() | |
725 | 828 | def pull(self, options): |
726 | 829 | """ |
727 | 830 | Pulls images for services defined in a Compose file, but does not start the containers. |
745 | 848 | include_deps=options.get('--include-deps'), |
746 | 849 | ) |
747 | 850 | |
851 | @metrics() | |
748 | 852 | def push(self, options): |
749 | 853 | """ |
750 | 854 | Pushes images for services. |
759 | 863 | ignore_push_failures=options.get('--ignore-push-failures') |
760 | 864 | ) |
761 | 865 | |
866 | @metrics() | |
762 | 867 | def rm(self, options): |
763 | 868 | """ |
764 | 869 | Removes stopped service containers. |
803 | 908 | else: |
804 | 909 | print("No stopped containers") |
805 | 910 | |
911 | @metrics() | |
806 | 912 | def run(self, options): |
807 | 913 | """ |
808 | 914 | Run a one-off command on a service. |
863 | 969 | self.toplevel_options, self.toplevel_environment |
864 | 970 | ) |
865 | 971 | |
972 | @metrics() | |
866 | 973 | def scale(self, options): |
867 | 974 | """ |
868 | 975 | Set number of containers to run for a service. |
891 | 998 | for service_name, num in parse_scale_args(options['SERVICE=NUM']).items(): |
892 | 999 | self.project.get_service(service_name).scale(num, timeout=timeout) |
893 | 1000 | |
1001 | @metrics() | |
894 | 1002 | def start(self, options): |
895 | 1003 | """ |
896 | 1004 | Start existing containers. |
900 | 1008 | containers = self.project.start(service_names=options['SERVICE']) |
901 | 1009 | exit_if(not containers, 'No containers to start', 1) |
902 | 1010 | |
1011 | @metrics() | |
903 | 1012 | def stop(self, options): |
904 | 1013 | """ |
905 | 1014 | Stop running containers without removing them. |
915 | 1024 | timeout = timeout_from_opts(options) |
916 | 1025 | self.project.stop(service_names=options['SERVICE'], timeout=timeout) |
917 | 1026 | |
1027 | @metrics() | |
918 | 1028 | def restart(self, options): |
919 | 1029 | """ |
920 | 1030 | Restart running containers. |
929 | 1039 | containers = self.project.restart(service_names=options['SERVICE'], timeout=timeout) |
930 | 1040 | exit_if(not containers, 'No containers to restart', 1) |
931 | 1041 | |
1042 | @metrics() | |
932 | 1043 | def top(self, options): |
933 | 1044 | """ |
934 | 1045 | Display the running processes |
956 | 1067 | print(container.name) |
957 | 1068 | print(Formatter.table(headers, rows)) |
958 | 1069 | |
1070 | @metrics() | |
959 | 1071 | def unpause(self, options): |
960 | 1072 | """ |
961 | 1073 | Unpause services. |
965 | 1077 | containers = self.project.unpause(service_names=options['SERVICE']) |
966 | 1078 | exit_if(not containers, 'No containers to unpause', 1) |
967 | 1079 | |
1080 | @metrics() | |
968 | 1081 | def up(self, options): |
969 | 1082 | """ |
970 | 1083 | Builds, (re)creates, starts, and attaches to containers for a service. |
1016 | 1129 | container. Implies --abort-on-container-exit. |
1017 | 1130 | --scale SERVICE=NUM Scale SERVICE to NUM instances. Overrides the |
1018 | 1131 | `scale` setting in the Compose file if present. |
1132 | --no-log-prefix Don't print prefix in logs. | |
1019 | 1133 | """ |
1020 | 1134 | start_deps = not options['--no-deps'] |
1021 | 1135 | always_recreate_deps = options['--always-recreate-deps'] |
1027 | 1141 | detached = options.get('--detach') |
1028 | 1142 | no_start = options.get('--no-start') |
1029 | 1143 | attach_dependencies = options.get('--attach-dependencies') |
1144 | keep_prefix = not options.get('--no-log-prefix') | |
1030 | 1145 | |
1031 | 1146 | if detached and (cascade_stop or exit_value_from or attach_dependencies): |
1032 | 1147 | raise UserError( |
1041 | 1156 | for excluded in [x for x in opts if options.get(x) and no_start]: |
1042 | 1157 | raise UserError('--no-start and {} cannot be combined.'.format(excluded)) |
1043 | 1158 | |
1044 | native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD') | |
1159 | native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD', True) | |
1045 | 1160 | |
1046 | 1161 | with up_shutdown_context(self.project, service_names, timeout, detached): |
1047 | 1162 | warn_for_swarm_mode(self.project.client) |
1063 | 1178 | renew_anonymous_volumes=options.get('--renew-anon-volumes'), |
1064 | 1179 | silent=options.get('--quiet-pull'), |
1065 | 1180 | cli=native_builder, |
1181 | attach_dependencies=attach_dependencies, | |
1066 | 1182 | ) |
1067 | 1183 | |
1068 | 1184 | try: |
1090 | 1206 | log_printer = log_printer_from_project( |
1091 | 1207 | self.project, |
1092 | 1208 | attached_containers, |
1093 | set_no_color_if_clicolor(options['--no-color']), | |
1209 | options['--no-color'], | |
1094 | 1210 | {'follow': True}, |
1095 | 1211 | cascade_stop, |
1096 | event_stream=self.project.events(service_names=service_names)) | |
1212 | event_stream=self.project.events(service_names=service_names), | |
1213 | keep_prefix=keep_prefix) | |
1097 | 1214 | print("Attaching to", list_containers(log_printer.containers)) |
1098 | 1215 | cascade_starter = log_printer.run() |
1099 | 1216 | |
1111 | 1228 | sys.exit(exit_code) |
1112 | 1229 | |
1113 | 1230 | @classmethod |
1231 | @metrics() | |
1114 | 1232 | def version(cls, options): |
1115 | 1233 | """ |
1116 | 1234 | Show version information and quit. |
1375 | 1493 | |
1376 | 1494 | |
1377 | 1495 | def log_printer_from_project( |
1378 | project, | |
1379 | containers, | |
1380 | monochrome, | |
1381 | log_args, | |
1382 | cascade_stop=False, | |
1383 | event_stream=None, | |
1496 | project, | |
1497 | containers, | |
1498 | monochrome, | |
1499 | log_args, | |
1500 | cascade_stop=False, | |
1501 | event_stream=None, | |
1502 | keep_prefix=True, | |
1384 | 1503 | ): |
1385 | 1504 | return LogPrinter( |
1386 | containers, | |
1387 | build_log_presenters(project.service_names, monochrome), | |
1505 | [c for c in containers if c.log_driver not in (None, 'none')], | |
1506 | build_log_presenters(project.service_names, monochrome, keep_prefix), | |
1388 | 1507 | event_stream or project.events(), |
1389 | 1508 | cascade_stop=cascade_stop, |
1390 | 1509 | log_args=log_args) |
1391 | 1510 | |
1392 | 1511 | |
1393 | 1512 | def filter_attached_containers(containers, service_names, attach_dependencies=False): |
1394 | if attach_dependencies or not service_names: | |
1395 | return containers | |
1396 | ||
1397 | return [ | |
1398 | container | |
1399 | for container in containers if container.service in service_names | |
1400 | ] | |
1513 | return filter_attached_for_up( | |
1514 | containers, | |
1515 | service_names, | |
1516 | attach_dependencies, | |
1517 | lambda container: container.service) | |
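`filter_attached_containers` now delegates to a shared `filter_attached_for_up` helper; the underlying selection logic, visible in the removed inline version, is a service-name filter with an attach-everything escape hatch. A sketch under that reading:

```python
def filter_attached(containers, service_names, attach_dependencies, get_service):
    """Keep every container when attaching dependencies or when no services
    were named; otherwise keep only containers of the requested services."""
    if attach_dependencies or not service_names:
        return containers
    return [c for c in containers if get_service(c) in service_names]
```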
1401 | 1518 | |
1402 | 1519 | |
1403 | 1520 | @contextlib.contextmanager |
1573 | 1690 | "To deploy your application across the swarm, " |
1574 | 1691 | "use `docker stack deploy`.\n" |
1575 | 1692 | ) |
1576 | ||
1577 | ||
1578 | def set_no_color_if_clicolor(no_color_flag): | |
1579 | return no_color_flag or os.environ.get('CLICOLOR') == "0" |
0 | { | |
1 | "$schema": "http://json-schema.org/draft/2019-09/schema#", | |
2 | "id": "compose_spec.json", | |
3 | "type": "object", | |
4 | "title": "Compose Specification", | |
5 | "description": "The Compose file is a YAML file defining a multi-container application.", | 
6 | ||
7 | "properties": { | |
8 | "version": { | |
9 | "type": "string", | |
10 | "description": "Version of the Compose specification used. Tools not implementing required version MUST reject the configuration file." | |
11 | }, | |
12 | ||
13 | "services": { | |
14 | "id": "#/properties/services", | |
15 | "type": "object", | |
16 | "patternProperties": { | |
17 | "^[a-zA-Z0-9._-]+$": { | |
18 | "$ref": "#/definitions/service" | |
19 | } | |
20 | }, | |
21 | "additionalProperties": false | |
22 | }, | |
23 | ||
24 | "networks": { | |
25 | "id": "#/properties/networks", | |
26 | "type": "object", | |
27 | "patternProperties": { | |
28 | "^[a-zA-Z0-9._-]+$": { | |
29 | "$ref": "#/definitions/network" | |
30 | } | |
31 | } | |
32 | }, | |
33 | ||
34 | "volumes": { | |
35 | "id": "#/properties/volumes", | |
36 | "type": "object", | |
37 | "patternProperties": { | |
38 | "^[a-zA-Z0-9._-]+$": { | |
39 | "$ref": "#/definitions/volume" | |
40 | } | |
41 | }, | |
42 | "additionalProperties": false | |
43 | }, | |
44 | ||
45 | "secrets": { | |
46 | "id": "#/properties/secrets", | |
47 | "type": "object", | |
48 | "patternProperties": { | |
49 | "^[a-zA-Z0-9._-]+$": { | |
50 | "$ref": "#/definitions/secret" | |
51 | } | |
52 | }, | |
53 | "additionalProperties": false | |
54 | }, | |
55 | ||
56 | "configs": { | |
57 | "id": "#/properties/configs", | |
58 | "type": "object", | |
59 | "patternProperties": { | |
60 | "^[a-zA-Z0-9._-]+$": { | |
61 | "$ref": "#/definitions/config" | |
62 | } | |
63 | }, | |
64 | "additionalProperties": false | |
65 | } | |
66 | }, | |
67 | ||
68 | "patternProperties": {"^x-": {}}, | |
69 | "additionalProperties": false, | |
70 | ||
71 | "definitions": { | |
72 | ||
73 | "service": { | |
74 | "id": "#/definitions/service", | |
75 | "type": "object", | |
76 | ||
77 | "properties": { | |
78 | "deploy": {"$ref": "#/definitions/deployment"}, | |
79 | "build": { | |
80 | "oneOf": [ | |
81 | {"type": "string"}, | |
82 | { | |
83 | "type": "object", | |
84 | "properties": { | |
85 | "context": {"type": "string"}, | |
86 | "dockerfile": {"type": "string"}, | |
87 | "args": {"$ref": "#/definitions/list_or_dict"}, | |
88 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
89 | "cache_from": {"type": "array", "items": {"type": "string"}}, | |
90 | "network": {"type": "string"}, | |
91 | "target": {"type": "string"}, | |
92 | "shm_size": {"type": ["integer", "string"]}, | |
93 | "extra_hosts": {"$ref": "#/definitions/list_or_dict"}, | |
94 | "isolation": {"type": "string"} | |
95 | }, | |
96 | "additionalProperties": false, | |
97 | "patternProperties": {"^x-": {}} | |
98 | } | |
99 | ] | |
100 | }, | |
101 | "blkio_config": { | |
102 | "type": "object", | |
103 | "properties": { | |
104 | "device_read_bps": { | |
105 | "type": "array", | |
106 | "items": {"$ref": "#/definitions/blkio_limit"} | |
107 | }, | |
108 | "device_read_iops": { | |
109 | "type": "array", | |
110 | "items": {"$ref": "#/definitions/blkio_limit"} | |
111 | }, | |
112 | "device_write_bps": { | |
113 | "type": "array", | |
114 | "items": {"$ref": "#/definitions/blkio_limit"} | |
115 | }, | |
116 | "device_write_iops": { | |
117 | "type": "array", | |
118 | "items": {"$ref": "#/definitions/blkio_limit"} | |
119 | }, | |
120 | "weight": {"type": "integer"}, | |
121 | "weight_device": { | |
122 | "type": "array", | |
123 | "items": {"$ref": "#/definitions/blkio_weight"} | |
124 | } | |
125 | }, | |
126 | "additionalProperties": false | |
127 | }, | |
128 | "cap_add": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
129 | "cap_drop": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
130 | "cgroup_parent": {"type": "string"}, | |
131 | "command": { | |
132 | "oneOf": [ | |
133 | {"type": "string"}, | |
134 | {"type": "array", "items": {"type": "string"}} | |
135 | ] | |
136 | }, | |
137 | "configs": { | |
138 | "type": "array", | |
139 | "items": { | |
140 | "oneOf": [ | |
141 | {"type": "string"}, | |
142 | { | |
143 | "type": "object", | |
144 | "properties": { | |
145 | "source": {"type": "string"}, | |
146 | "target": {"type": "string"}, | |
147 | "uid": {"type": "string"}, | |
148 | "gid": {"type": "string"}, | |
149 | "mode": {"type": "number"} | |
150 | }, | |
151 | "additionalProperties": false, | |
152 | "patternProperties": {"^x-": {}} | |
153 | } | |
154 | ] | |
155 | } | |
156 | }, | |
157 | "container_name": {"type": "string"}, | |
158 | "cpu_count": {"type": "integer", "minimum": 0}, | |
159 | "cpu_percent": {"type": "integer", "minimum": 0, "maximum": 100}, | |
160 | "cpu_shares": {"type": ["number", "string"]}, | |
161 | "cpu_quota": {"type": ["number", "string"]}, | |
162 | "cpu_period": {"type": ["number", "string"]}, | |
163 | "cpu_rt_period": {"type": ["number", "string"]}, | |
164 | "cpu_rt_runtime": {"type": ["number", "string"]}, | |
165 | "cpus": {"type": ["number", "string"]}, | |
166 | "cpuset": {"type": "string"}, | |
167 | "credential_spec": { | |
168 | "type": "object", | |
169 | "properties": { | |
170 | "config": {"type": "string"}, | |
171 | "file": {"type": "string"}, | |
172 | "registry": {"type": "string"} | |
173 | }, | |
174 | "additionalProperties": false, | |
175 | "patternProperties": {"^x-": {}} | |
176 | }, | |
177 | "depends_on": { | |
178 | "oneOf": [ | |
179 | {"$ref": "#/definitions/list_of_strings"}, | |
180 | { | |
181 | "type": "object", | |
182 | "additionalProperties": false, | |
183 | "patternProperties": { | |
184 | "^[a-zA-Z0-9._-]+$": { | |
185 | "type": "object", | |
186 | "additionalProperties": false, | |
187 | "properties": { | |
188 | "condition": { | |
189 | "type": "string", | |
190 | "enum": ["service_started", "service_healthy", "service_completed_successfully"] | |
191 | } | |
192 | }, | |
193 | "required": ["condition"] | |
194 | } | |
195 | } | |
196 | } | |
197 | ] | |
198 | }, | |
199 | "device_cgroup_rules": {"$ref": "#/definitions/list_of_strings"}, | |
200 | "devices": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
201 | "dns": {"$ref": "#/definitions/string_or_list"}, | |
202 | "dns_opt": {"type": "array","items": {"type": "string"}, "uniqueItems": true}, | |
203 | "dns_search": {"$ref": "#/definitions/string_or_list"}, | |
204 | "domainname": {"type": "string"}, | |
205 | "entrypoint": { | |
206 | "oneOf": [ | |
207 | {"type": "string"}, | |
208 | {"type": "array", "items": {"type": "string"}} | |
209 | ] | |
210 | }, | |
211 | "env_file": {"$ref": "#/definitions/string_or_list"}, | |
212 | "environment": {"$ref": "#/definitions/list_or_dict"}, | |
213 | ||
214 | "expose": { | |
215 | "type": "array", | |
216 | "items": { | |
217 | "type": ["string", "number"], | |
218 | "format": "expose" | |
219 | }, | |
220 | "uniqueItems": true | |
221 | }, | |
222 | "extends": { | |
223 | "oneOf": [ | |
224 | {"type": "string"}, | |
225 | { | |
226 | "type": "object", | |
227 | ||
228 | "properties": { | |
229 | "service": {"type": "string"}, | |
230 | "file": {"type": "string"} | |
231 | }, | |
232 | "required": ["service"], | |
233 | "additionalProperties": false | |
234 | } | |
235 | ] | |
236 | }, | |
237 | "external_links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
238 | "extra_hosts": {"$ref": "#/definitions/list_or_dict"}, | |
239 | "group_add": { | |
240 | "type": "array", | |
241 | "items": { | |
242 | "type": ["string", "number"] | |
243 | }, | |
244 | "uniqueItems": true | |
245 | }, | |
246 | "healthcheck": {"$ref": "#/definitions/healthcheck"}, | |
247 | "hostname": {"type": "string"}, | |
248 | "image": {"type": "string"}, | |
249 | "init": {"type": "boolean"}, | |
250 | "ipc": {"type": "string"}, | |
251 | "isolation": {"type": "string"}, | |
252 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
253 | "links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
254 | "logging": { | |
255 | "type": "object", | |
256 | ||
257 | "properties": { | |
258 | "driver": {"type": "string"}, | |
259 | "options": { | |
260 | "type": "object", | |
261 | "patternProperties": { | |
262 | "^.+$": {"type": ["string", "number", "null"]} | |
263 | } | |
264 | } | |
265 | }, | |
266 | "additionalProperties": false, | |
267 | "patternProperties": {"^x-": {}} | |
268 | }, | |
269 | "mac_address": {"type": "string"}, | |
270 | "mem_limit": {"type": ["number", "string"]}, | |
271 | "mem_reservation": {"type": ["string", "integer"]}, | |
272 | "mem_swappiness": {"type": "integer"}, | |
273 | "memswap_limit": {"type": ["number", "string"]}, | |
274 | "network_mode": {"type": "string"}, | |
275 | "networks": { | |
276 | "oneOf": [ | |
277 | {"$ref": "#/definitions/list_of_strings"}, | |
278 | { | |
279 | "type": "object", | |
280 | "patternProperties": { | |
281 | "^[a-zA-Z0-9._-]+$": { | |
282 | "oneOf": [ | |
283 | { | |
284 | "type": "object", | |
285 | "properties": { | |
286 | "aliases": {"$ref": "#/definitions/list_of_strings"}, | |
287 | "ipv4_address": {"type": "string"}, | |
288 | "ipv6_address": {"type": "string"}, | |
289 | "link_local_ips": {"$ref": "#/definitions/list_of_strings"}, | |
290 | "priority": {"type": "number"} | |
291 | }, | |
292 | "additionalProperties": false, | |
293 | "patternProperties": {"^x-": {}} | |
294 | }, | |
295 | {"type": "null"} | |
296 | ] | |
297 | } | |
298 | }, | |
299 | "additionalProperties": false | |
300 | } | |
301 | ] | |
302 | }, | |
303 | "oom_kill_disable": {"type": "boolean"}, | |
304 | "oom_score_adj": {"type": "integer", "minimum": -1000, "maximum": 1000}, | |
305 | "pid": {"type": ["string", "null"]}, | |
306 | "pids_limit": {"type": ["number", "string"]}, | |
307 | "platform": {"type": "string"}, | |
308 | "ports": { | |
309 | "type": "array", | |
310 | "items": { | |
311 | "oneOf": [ | |
312 | {"type": "number", "format": "ports"}, | |
313 | {"type": "string", "format": "ports"}, | |
314 | { | |
315 | "type": "object", | |
316 | "properties": { | |
317 | "mode": {"type": "string"}, | |
318 | "target": {"type": "integer"}, | |
319 | "published": {"type": "integer"}, | |
320 | "protocol": {"type": "string"} | |
321 | }, | |
322 | "additionalProperties": false, | |
323 | "patternProperties": {"^x-": {}} | |
324 | } | |
325 | ] | |
326 | }, | |
327 | "uniqueItems": true | |
328 | }, | |
329 | "privileged": {"type": "boolean"}, | |
330 | "profiles": {"$ref": "#/definitions/list_of_strings"}, | |
331 | "pull_policy": {"type": "string", "enum": [ | |
332 | "always", "never", "if_not_present", "build" | |
333 | ]}, | |
334 | "read_only": {"type": "boolean"}, | |
335 | "restart": {"type": "string"}, | |
336 | "runtime": { | |
337 | "type": "string" | |
338 | }, | |
339 | "scale": { | |
340 | "type": "integer" | |
341 | }, | |
342 | "security_opt": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
343 | "shm_size": {"type": ["number", "string"]}, | |
344 | "secrets": { | |
345 | "type": "array", | |
346 | "items": { | |
347 | "oneOf": [ | |
348 | {"type": "string"}, | |
349 | { | |
350 | "type": "object", | |
351 | "properties": { | |
352 | "source": {"type": "string"}, | |
353 | "target": {"type": "string"}, | |
354 | "uid": {"type": "string"}, | |
355 | "gid": {"type": "string"}, | |
356 | "mode": {"type": "number"} | |
357 | }, | |
358 | "additionalProperties": false, | |
359 | "patternProperties": {"^x-": {}} | |
360 | } | |
361 | ] | |
362 | } | |
363 | }, | |
364 | "sysctls": {"$ref": "#/definitions/list_or_dict"}, | |
365 | "stdin_open": {"type": "boolean"}, | |
366 | "stop_grace_period": {"type": "string", "format": "duration"}, | |
367 | "stop_signal": {"type": "string"}, | |
368 | "storage_opt": {"type": "object"}, | |
369 | "tmpfs": {"$ref": "#/definitions/string_or_list"}, | |
370 | "tty": {"type": "boolean"}, | |
371 | "ulimits": { | |
372 | "type": "object", | |
373 | "patternProperties": { | |
374 | "^[a-z]+$": { | |
375 | "oneOf": [ | |
376 | {"type": "integer"}, | |
377 | { | |
378 | "type": "object", | |
379 | "properties": { | |
380 | "hard": {"type": "integer"}, | |
381 | "soft": {"type": "integer"} | |
382 | }, | |
383 | "required": ["soft", "hard"], | |
384 | "additionalProperties": false, | |
385 | "patternProperties": {"^x-": {}} | |
386 | } | |
387 | ] | |
388 | } | |
389 | } | |
390 | }, | |
391 | "user": {"type": "string"}, | |
392 | "userns_mode": {"type": "string"}, | |
393 | "volumes": { | |
394 | "type": "array", | |
395 | "items": { | |
396 | "oneOf": [ | |
397 | {"type": "string"}, | |
398 | { | |
399 | "type": "object", | |
400 | "required": ["type"], | |
401 | "properties": { | |
402 | "type": {"type": "string"}, | |
403 | "source": {"type": "string"}, | |
404 | "target": {"type": "string"}, | |
405 | "read_only": {"type": "boolean"}, | |
406 | "consistency": {"type": "string"}, | |
407 | "bind": { | |
408 | "type": "object", | |
409 | "properties": { | |
410 | "propagation": {"type": "string"} | |
411 | }, | |
412 | "additionalProperties": false, | |
413 | "patternProperties": {"^x-": {}} | |
414 | }, | |
415 | "volume": { | |
416 | "type": "object", | |
417 | "properties": { | |
418 | "nocopy": {"type": "boolean"} | |
419 | }, | |
420 | "additionalProperties": false, | |
421 | "patternProperties": {"^x-": {}} | |
422 | }, | |
423 | "tmpfs": { | |
424 | "type": "object", | |
425 | "properties": { | |
426 | "size": { | |
427 | "type": "integer", | |
428 | "minimum": 0 | |
429 | } | |
430 | }, | |
431 | "additionalProperties": false, | |
432 | "patternProperties": {"^x-": {}} | |
433 | } | |
434 | }, | |
435 | "additionalProperties": false, | |
436 | "patternProperties": {"^x-": {}} | |
437 | } | |
438 | ] | |
439 | }, | |
440 | "uniqueItems": true | |
441 | }, | |
442 | "volumes_from": { | |
443 | "type": "array", | |
444 | "items": {"type": "string"}, | |
445 | "uniqueItems": true | |
446 | }, | |
447 | "working_dir": {"type": "string"} | |
448 | }, | |
449 | "patternProperties": {"^x-": {}}, | |
450 | "additionalProperties": false | |
451 | }, | |
452 | ||
453 | "healthcheck": { | |
454 | "id": "#/definitions/healthcheck", | |
455 | "type": "object", | |
456 | "properties": { | |
457 | "disable": {"type": "boolean"}, | |
458 | "interval": {"type": "string", "format": "duration"}, | |
459 | "retries": {"type": "number"}, | |
460 | "test": { | |
461 | "oneOf": [ | |
462 | {"type": "string"}, | |
463 | {"type": "array", "items": {"type": "string"}} | |
464 | ] | |
465 | }, | |
466 | "timeout": {"type": "string", "format": "duration"}, | |
467 | "start_period": {"type": "string", "format": "duration"} | |
468 | }, | |
469 | "additionalProperties": false, | |
470 | "patternProperties": {"^x-": {}} | |
471 | }, | |
472 | "deployment": { | |
473 | "id": "#/definitions/deployment", | |
474 | "type": ["object", "null"], | |
475 | "properties": { | |
476 | "mode": {"type": "string"}, | |
477 | "endpoint_mode": {"type": "string"}, | |
478 | "replicas": {"type": "integer"}, | |
479 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
480 | "rollback_config": { | |
481 | "type": "object", | |
482 | "properties": { | |
483 | "parallelism": {"type": "integer"}, | |
484 | "delay": {"type": "string", "format": "duration"}, | |
485 | "failure_action": {"type": "string"}, | |
486 | "monitor": {"type": "string", "format": "duration"}, | |
487 | "max_failure_ratio": {"type": "number"}, | |
488 | "order": {"type": "string", "enum": [ | |
489 | "start-first", "stop-first" | |
490 | ]} | |
491 | }, | |
492 | "additionalProperties": false, | |
493 | "patternProperties": {"^x-": {}} | |
494 | }, | |
495 | "update_config": { | |
496 | "type": "object", | |
497 | "properties": { | |
498 | "parallelism": {"type": "integer"}, | |
499 | "delay": {"type": "string", "format": "duration"}, | |
500 | "failure_action": {"type": "string"}, | |
501 | "monitor": {"type": "string", "format": "duration"}, | |
502 | "max_failure_ratio": {"type": "number"}, | |
503 | "order": {"type": "string", "enum": [ | |
504 | "start-first", "stop-first" | |
505 | ]} | |
506 | }, | |
507 | "additionalProperties": false, | |
508 | "patternProperties": {"^x-": {}} | |
509 | }, | |
510 | "resources": { | |
511 | "type": "object", | |
512 | "properties": { | |
513 | "limits": { | |
514 | "type": "object", | |
515 | "properties": { | |
516 | "cpus": {"type": ["number", "string"]}, | |
517 | "memory": {"type": "string"} | |
518 | }, | |
519 | "additionalProperties": false, | |
520 | "patternProperties": {"^x-": {}} | |
521 | }, | |
522 | "reservations": { | |
523 | "type": "object", | |
524 | "properties": { | |
525 | "cpus": {"type": ["number", "string"]}, | |
526 | "memory": {"type": "string"}, | |
527 | "generic_resources": {"$ref": "#/definitions/generic_resources"}, | |
528 | "devices": {"$ref": "#/definitions/devices"} | |
529 | }, | |
530 | "additionalProperties": false, | |
531 | "patternProperties": {"^x-": {}} | |
532 | } | |
533 | }, | |
534 | "additionalProperties": false, | |
535 | "patternProperties": {"^x-": {}} | |
536 | }, | |
537 | "restart_policy": { | |
538 | "type": "object", | |
539 | "properties": { | |
540 | "condition": {"type": "string"}, | |
541 | "delay": {"type": "string", "format": "duration"}, | |
542 | "max_attempts": {"type": "integer"}, | |
543 | "window": {"type": "string", "format": "duration"} | |
544 | }, | |
545 | "additionalProperties": false, | |
546 | "patternProperties": {"^x-": {}} | |
547 | }, | |
548 | "placement": { | |
549 | "type": "object", | |
550 | "properties": { | |
551 | "constraints": {"type": "array", "items": {"type": "string"}}, | |
552 | "preferences": { | |
553 | "type": "array", | |
554 | "items": { | |
555 | "type": "object", | |
556 | "properties": { | |
557 | "spread": {"type": "string"} | |
558 | }, | |
559 | "additionalProperties": false, | |
560 | "patternProperties": {"^x-": {}} | |
561 | } | |
562 | }, | |
563 | "max_replicas_per_node": {"type": "integer"} | |
564 | }, | |
565 | "additionalProperties": false, | |
566 | "patternProperties": {"^x-": {}} | |
567 | } | |
568 | }, | |
569 | "additionalProperties": false, | |
570 | "patternProperties": {"^x-": {}} | |
571 | }, | |
572 | ||
573 | "generic_resources": { | |
574 | "id": "#/definitions/generic_resources", | |
575 | "type": "array", | |
576 | "items": { | |
577 | "type": "object", | |
578 | "properties": { | |
579 | "discrete_resource_spec": { | |
580 | "type": "object", | |
581 | "properties": { | |
582 | "kind": {"type": "string"}, | |
583 | "value": {"type": "number"} | |
584 | }, | |
585 | "additionalProperties": false, | |
586 | "patternProperties": {"^x-": {}} | |
587 | } | |
588 | }, | |
589 | "additionalProperties": false, | |
590 | "patternProperties": {"^x-": {}} | |
591 | } | |
592 | }, | |
593 | ||
594 | "devices": { | |
595 | "id": "#/definitions/devices", | |
596 | "type": "array", | |
597 | "items": { | |
598 | "type": "object", | |
599 | "properties": { | |
600 | "capabilities": {"$ref": "#/definitions/list_of_strings"}, | |
601 | "count": {"type": ["string", "integer"]}, | |
602 | "device_ids": {"$ref": "#/definitions/list_of_strings"}, | |
603 | "driver":{"type": "string"}, | |
604 | "options":{"$ref": "#/definitions/list_or_dict"} | |
605 | }, | |
606 | "additionalProperties": false, | |
607 | "patternProperties": {"^x-": {}} | |
608 | } | |
609 | }, | |
610 | ||
611 | "network": { | |
612 | "id": "#/definitions/network", | |
613 | "type": ["object", "null"], | |
614 | "properties": { | |
615 | "name": {"type": "string"}, | |
616 | "driver": {"type": "string"}, | |
617 | "driver_opts": { | |
618 | "type": "object", | |
619 | "patternProperties": { | |
620 | "^.+$": {"type": ["string", "number"]} | |
621 | } | |
622 | }, | |
623 | "ipam": { | |
624 | "type": "object", | |
625 | "properties": { | |
626 | "driver": {"type": "string"}, | |
627 | "config": { | |
628 | "type": "array", | |
629 | "items": { | |
630 | "type": "object", | |
631 | "properties": { | |
632 | "subnet": {"type": "string", "format": "subnet_ip_address"}, | |
633 | "ip_range": {"type": "string"}, | |
634 | "gateway": {"type": "string"}, | |
635 | "aux_addresses": { | |
636 | "type": "object", | |
637 | "additionalProperties": false, | |
638 | "patternProperties": {"^.+$": {"type": "string"}} | |
639 | } | |
640 | }, | |
641 | "additionalProperties": false, | |
642 | "patternProperties": {"^x-": {}} | |
643 | } | |
644 | }, | |
645 | "options": { | |
646 | "type": "object", | |
647 | "additionalProperties": false, | |
648 | "patternProperties": {"^.+$": {"type": "string"}} | |
649 | } | |
650 | }, | |
651 | "additionalProperties": false, | |
652 | "patternProperties": {"^x-": {}} | |
653 | }, | |
654 | "external": { | |
655 | "type": ["boolean", "object"], | |
656 | "properties": { | |
657 | "name": { | |
658 | "deprecated": true, | |
659 | "type": "string" | |
660 | } | |
661 | }, | |
662 | "additionalProperties": false, | |
663 | "patternProperties": {"^x-": {}} | |
664 | }, | |
665 | "internal": {"type": "boolean"}, | |
666 | "enable_ipv6": {"type": "boolean"}, | |
667 | "attachable": {"type": "boolean"}, | |
668 | "labels": {"$ref": "#/definitions/list_or_dict"} | |
669 | }, | |
670 | "additionalProperties": false, | |
671 | "patternProperties": {"^x-": {}} | |
672 | }, | |
673 | ||
674 | "volume": { | |
675 | "id": "#/definitions/volume", | |
676 | "type": ["object", "null"], | |
677 | "properties": { | |
678 | "name": {"type": "string"}, | |
679 | "driver": {"type": "string"}, | |
680 | "driver_opts": { | |
681 | "type": "object", | |
682 | "patternProperties": { | |
683 | "^.+$": {"type": ["string", "number"]} | |
684 | } | |
685 | }, | |
686 | "external": { | |
687 | "type": ["boolean", "object"], | |
688 | "properties": { | |
689 | "name": { | |
690 | "deprecated": true, | |
691 | "type": "string" | |
692 | } | |
693 | }, | |
694 | "additionalProperties": false, | |
695 | "patternProperties": {"^x-": {}} | |
696 | }, | |
697 | "labels": {"$ref": "#/definitions/list_or_dict"} | |
698 | }, | |
699 | "additionalProperties": false, | |
700 | "patternProperties": {"^x-": {}} | |
701 | }, | |
702 | ||
703 | "secret": { | |
704 | "id": "#/definitions/secret", | |
705 | "type": "object", | |
706 | "properties": { | |
707 | "name": {"type": "string"}, | |
708 | "file": {"type": "string"}, | |
709 | "external": { | |
710 | "type": ["boolean", "object"], | |
711 | "properties": { | |
712 | "name": {"type": "string"} | |
713 | } | |
714 | }, | |
715 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
716 | "driver": {"type": "string"}, | |
717 | "driver_opts": { | |
718 | "type": "object", | |
719 | "patternProperties": { | |
720 | "^.+$": {"type": ["string", "number"]} | |
721 | } | |
722 | }, | |
723 | "template_driver": {"type": "string"} | |
724 | }, | |
725 | "additionalProperties": false, | |
726 | "patternProperties": {"^x-": {}} | |
727 | }, | |
728 | ||
729 | "config": { | |
730 | "id": "#/definitions/config", | |
731 | "type": "object", | |
732 | "properties": { | |
733 | "name": {"type": "string"}, | |
734 | "file": {"type": "string"}, | |
735 | "external": { | |
736 | "type": ["boolean", "object"], | |
737 | "properties": { | |
738 | "name": { | |
739 | "deprecated": true, | |
740 | "type": "string" | |
741 | } | |
742 | } | |
743 | }, | |
744 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
745 | "template_driver": {"type": "string"} | |
746 | }, | |
747 | "additionalProperties": false, | |
748 | "patternProperties": {"^x-": {}} | |
749 | }, | |
750 | ||
751 | "string_or_list": { | |
752 | "oneOf": [ | |
753 | {"type": "string"}, | |
754 | {"$ref": "#/definitions/list_of_strings"} | |
755 | ] | |
756 | }, | |
757 | ||
758 | "list_of_strings": { | |
759 | "type": "array", | |
760 | "items": {"type": "string"}, | |
761 | "uniqueItems": true | |
762 | }, | |
763 | ||
764 | "list_or_dict": { | |
765 | "oneOf": [ | |
766 | { | |
767 | "type": "object", | |
768 | "patternProperties": { | |
769 | ".+": { | |
770 | "type": ["string", "number", "null"] | |
771 | } | |
772 | }, | |
773 | "additionalProperties": false | |
774 | }, | |
775 | {"type": "array", "items": {"type": "string"}, "uniqueItems": true} | |
776 | ] | |
777 | }, | |
778 | ||
779 | "blkio_limit": { | |
780 | "type": "object", | |
781 | "properties": { | |
782 | "path": {"type": "string"}, | |
783 | "rate": {"type": ["integer", "string"]} | |
784 | }, | |
785 | "additionalProperties": false | |
786 | }, | |
787 | "blkio_weight": { | |
788 | "type": "object", | |
789 | "properties": { | |
790 | "path": {"type": "string"}, | |
791 | "weight": {"type": "integer"} | |
792 | }, | |
793 | "additionalProperties": false | |
794 | }, | |
795 | ||
796 | "constraints": { | |
797 | "service": { | |
798 | "id": "#/definitions/constraints/service", | |
799 | "anyOf": [ | |
800 | {"required": ["build"]}, | |
801 | {"required": ["image"]} | |
802 | ], | |
803 | "properties": { | |
804 | "build": { | |
805 | "required": ["context"] | |
806 | } | |
807 | } | |
808 | } | |
809 | } | |
810 | } | |
811 | } |
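The schema file above defines `depends_on` (new-file lines 177–198) as either a unique list of service names or an object mapping service names to a required `condition`, whose enum now includes `service_completed_successfully` (the 1.29.0 changelog's "wait for successful service completion"). A stdlib-only sketch of that sub-schema's constraints, hand-copied from the definition above rather than loaded from the real file:

```python
import re

# Mirrors the enum in the depends_on definition, including the
# service_completed_successfully value added in 1.29.0.
VALID_CONDITIONS = {
    "service_started",
    "service_healthy",
    "service_completed_successfully",
}
# Mirrors the "^[a-zA-Z0-9._-]+$" patternProperties key.
SERVICE_NAME = re.compile(r"^[a-zA-Z0-9._-]+$")

def check_depends_on(value):
    """Return True if `value` matches the schema's depends_on shape."""
    if isinstance(value, list):
        # The list_of_strings branch: unique strings only.
        return (all(isinstance(v, str) for v in value)
                and len(set(value)) == len(value))
    if isinstance(value, dict):
        # The object branch: condition is required and nothing else is allowed.
        return all(
            SERVICE_NAME.match(name) is not None
            and isinstance(spec, dict)
            and set(spec) == {"condition"}
            and spec["condition"] in VALID_CONDITIONS
            for name, spec in value.items()
        )
    return False

print(check_depends_on({"db": {"condition": "service_completed_successfully"}}))  # True
print(check_depends_on({"db": {"condition": "service_restarted"}}))               # False
```

This is only an illustrative checker; the real validation goes through the full JSON schema, not a hand-rolled function.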
9 | 9 | from operator import itemgetter |
10 | 10 | |
11 | 11 | import yaml |
12 | from cached_property import cached_property | |
12 | ||
13 | try: | |
14 | from functools import cached_property | |
15 | except ImportError: | |
16 | from cached_property import cached_property | |
13 | 17 | |
14 | 18 | from . import types |
15 | 19 | from ..const import COMPOSE_SPEC as VERSION |
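The import hunk above replaces the unconditional `cached_property` backport import with a try/except that prefers the stdlib `functools.cached_property` (Python 3.8+). A self-contained sketch of that pattern and the compute-once behavior it buys; `ConfigFile` here is a hypothetical stand-in, not the real compose class:

```python
try:
    from functools import cached_property  # stdlib, Python 3.8+
except ImportError:  # older interpreters fall back to the PyPI backport
    from cached_property import cached_property

class ConfigFile:
    """Hypothetical example class; not compose's ConfigFile."""

    def __init__(self, raw):
        self.raw = raw
        self.parse_count = 0

    @cached_property
    def version(self):
        # Body runs once; the result is then stored on the instance
        # and later attribute lookups skip this method entirely.
        self.parse_count += 1
        return self.raw.get("version")

cf = ConfigFile({"version": "3.9"})
print(cf.version, cf.version, cf.parse_count)  # 3.9 3.9 1
```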
132 | 136 | 'logging', |
133 | 137 | 'network_mode', |
134 | 138 | 'platform', |
139 | 'profiles', | |
135 | 140 | 'scale', |
136 | 141 | 'stop_grace_period', |
137 | 142 | ] |
147 | 152 | SUPPORTED_FILENAMES = [ |
148 | 153 | 'docker-compose.yml', |
149 | 154 | 'docker-compose.yaml', |
155 | 'compose.yml', | |
156 | 'compose.yaml', | |
150 | 157 | ] |
151 | 158 | |
152 | DEFAULT_OVERRIDE_FILENAMES = ('docker-compose.override.yml', 'docker-compose.override.yaml') | |
159 | DEFAULT_OVERRIDE_FILENAMES = ('docker-compose.override.yml', | |
160 | 'docker-compose.override.yaml', | |
161 | 'compose.override.yml', | |
162 | 'compose.override.yaml') | |
153 | 163 | |
154 | 164 | |
155 | 165 | log = logging.getLogger(__name__) |
302 | 312 | if filenames: |
303 | 313 | filenames = [os.path.join(base_dir, f) for f in filenames] |
304 | 314 | else: |
315 | # search for compose files in the base dir and its parents | |
305 | 316 | filenames = get_default_config_files(base_dir) |
317 | if not filenames and not override_dir: | |
318 | # none found in base_dir and no override_dir defined | |
319 | raise ComposeFileNotFound(SUPPORTED_FILENAMES) | |
320 | if not filenames: | |
321 | # search for compose files in the project directory and its parents | |
322 | filenames = get_default_config_files(override_dir) | |
323 | if not filenames: | |
324 | raise ComposeFileNotFound(SUPPORTED_FILENAMES) | |
306 | 325 | |
307 | 326 | log.debug("Using configuration files: {}".format(",".join(filenames))) |
308 | 327 | return ConfigDetails( |
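The hunk above makes `get_default_config_files` try `base_dir` first and only fall back to `override_dir` (the project directory) when nothing is found there, raising `ComposeFileNotFound` otherwise. A rough stdlib-only sketch of the underlying parent-directory search over the extended `SUPPORTED_FILENAMES` list; `find_config_files` is a hypothetical helper, not the real implementation:

```python
import os
import tempfile

SUPPORTED_FILENAMES = [
    'docker-compose.yml',
    'docker-compose.yaml',
    'compose.yml',
    'compose.yaml',
]

def find_config_files(start_dir):
    """Walk from start_dir toward the filesystem root and return the
    supported compose filenames present in the first directory that
    contains any of them (empty list if none are found)."""
    current = os.path.abspath(start_dir)
    while True:
        found = [os.path.join(current, name)
                 for name in SUPPORTED_FILENAMES
                 if os.path.exists(os.path.join(current, name))]
        if found:
            return found
        parent = os.path.dirname(current)
        if parent == current:  # reached the root without a match
            return []
        current = parent

with tempfile.TemporaryDirectory() as root:
    nested = os.path.join(root, "app", "src")
    os.makedirs(nested)
    open(os.path.join(root, "app", "compose.yml"), "w").close()
    result = find_config_files(nested)  # finds ../compose.yml from app/src
```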
333 | 352 | (candidates, path) = find_candidates_in_parent_dirs(SUPPORTED_FILENAMES, base_dir) |
334 | 353 | |
335 | 354 | if not candidates: |
336 | raise ComposeFileNotFound(SUPPORTED_FILENAMES) | |
355 | return None | |
337 | 356 | |
338 | 357 | winner = candidates[0] |
339 | 358 | |
370 | 389 | return find_candidates_in_parent_dirs(filenames, parent_dir) |
371 | 390 | |
372 | 391 | return (candidates, path) |
392 | ||
393 | ||
394 | def check_swarm_only_config(service_dicts): | |
395 | warning_template = ( | |
396 | "Some services ({services}) use the '{key}' key, which will be ignored. " | |
397 | "Compose does not support '{key}' configuration - use " | |
398 | "`docker stack deploy` to deploy to a swarm." | |
399 | ) | |
400 | key = 'configs' | |
401 | services = [s for s in service_dicts if s.get(key)] | |
402 | if services: | |
403 | log.warning( | |
404 | warning_template.format( | |
405 | services=", ".join(sorted(s['name'] for s in services)), | |
406 | key=key | |
407 | ) | |
408 | ) | |
373 | 409 | |
374 | 410 | |
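The new `check_swarm_only_config` above warns only when some service actually sets the swarm-only `configs` key, naming the offending services in sorted order. The same filtering logic, isolated into a sketch that returns the message instead of logging it (hypothetical function name, for easy inspection):

```python
def swarm_only_warning(service_dicts, key='configs'):
    """Return the swarm-only warning text, or None when no service uses `key`."""
    warning_template = (
        "Some services ({services}) use the '{key}' key, which will be ignored. "
        "Compose does not support '{key}' configuration - use "
        "`docker stack deploy` to deploy to a swarm."
    )
    services = [s for s in service_dicts if s.get(key)]
    if not services:
        return None
    return warning_template.format(
        services=", ".join(sorted(s['name'] for s in services)),
        key=key,
    )

msg = swarm_only_warning([
    {'name': 'web', 'configs': ['site.conf']},
    {'name': 'db'},
])
print(msg)  # names only 'web', since 'db' does not set the key
```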
375 | 411 | def load(config_details, interpolate=True): |
408 | 444 | for service_dict in service_dicts: |
409 | 445 | match_named_volumes(service_dict, volumes) |
410 | 446 | |
447 | check_swarm_only_config(service_dicts) | |
448 | ||
411 | 449 | return Config(main_file.config_version, main_file.version, |
412 | 450 | service_dicts, volumes, networks, secrets, configs) |
413 | 451 | |
535 | 573 | config_file.version, |
536 | 574 | config, |
537 | 575 | section, |
538 | environment | |
539 | ) | |
576 | environment) | |
540 | 577 | else: |
541 | 578 | return config |
542 | 579 | |
1046 | 1083 | |
1047 | 1084 | for field in [ |
1048 | 1085 | 'cap_add', 'cap_drop', 'expose', 'external_links', |
1049 | 'volumes_from', 'device_cgroup_rules', | |
1086 | 'volumes_from', 'device_cgroup_rules', 'profiles', | |
1050 | 1087 | ]: |
1051 | 1088 | md.merge_field(field, merge_unique_items_lists, default=[]) |
1052 | 1089 | |
1165 | 1202 | md.merge_scalar('cpus') |
1166 | 1203 | md.merge_scalar('memory') |
1167 | 1204 | md.merge_sequence('generic_resources', types.GenericResource.parse) |
1205 | md.merge_field('devices', merge_unique_objects_lists, default=[]) | |
1168 | 1206 | return dict(md) |
1169 | 1207 | |
1170 | 1208 |
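The merge hunks above route the new `profiles` field through `merge_unique_items_lists` and reservation `devices` through `merge_unique_objects_lists`. The helpers' bodies are not part of this diff, so the sketch below is an assumption about their behavior: a duplicate-free union of the base and override lists, sorted for determinism.

```python
def merge_unique_items_lists(base, override):
    # Union of both lists with duplicates dropped; sorting is an
    # assumption here - the hunk above does not show the helper's body.
    return sorted(set(base + override))

merged = merge_unique_items_lists(['debug', 'frontend'], ['frontend', 'metrics'])
print(merged)  # ['debug', 'frontend', 'metrics']
```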
0 | { | |
1 | "$schema": "http://json-schema.org/draft/2019-09/schema#", | |
2 | "id": "config_schema_compose_spec.json", | |
3 | "type": "object", | |
4 | "title": "Compose Specification", | |
5 | "description": "The Compose file is a YAML file defining a multi-containers based application.", | |
6 | "properties": { | |
7 | "version": { | |
8 | "type": "string", | |
9 | "description": "Version of the Compose specification used. Tools not implementing required version MUST reject the configuration file." | |
10 | }, | |
11 | "services": { | |
12 | "id": "#/properties/services", | |
13 | "type": "object", | |
14 | "patternProperties": { | |
15 | "^[a-zA-Z0-9._-]+$": { | |
16 | "$ref": "#/definitions/service" | |
17 | } | |
18 | }, | |
19 | "additionalProperties": false | |
20 | }, | |
21 | "networks": { | |
22 | "id": "#/properties/networks", | |
23 | "type": "object", | |
24 | "patternProperties": { | |
25 | "^[a-zA-Z0-9._-]+$": { | |
26 | "$ref": "#/definitions/network" | |
27 | } | |
28 | } | |
29 | }, | |
30 | "volumes": { | |
31 | "id": "#/properties/volumes", | |
32 | "type": "object", | |
33 | "patternProperties": { | |
34 | "^[a-zA-Z0-9._-]+$": { | |
35 | "$ref": "#/definitions/volume" | |
36 | } | |
37 | }, | |
38 | "additionalProperties": false | |
39 | }, | |
40 | "secrets": { | |
41 | "id": "#/properties/secrets", | |
42 | "type": "object", | |
43 | "patternProperties": { | |
44 | "^[a-zA-Z0-9._-]+$": { | |
45 | "$ref": "#/definitions/secret" | |
46 | } | |
47 | }, | |
48 | "additionalProperties": false | |
49 | }, | |
50 | "configs": { | |
51 | "id": "#/properties/configs", | |
52 | "type": "object", | |
53 | "patternProperties": { | |
54 | "^[a-zA-Z0-9._-]+$": { | |
55 | "$ref": "#/definitions/config" | |
56 | } | |
57 | }, | |
58 | "additionalProperties": false | |
59 | } | |
60 | }, | |
61 | "patternProperties": {"^x-": {}}, | |
62 | "additionalProperties": false, | |
63 | "definitions": { | |
64 | "service": { | |
65 | "id": "#/definitions/service", | |
66 | "type": "object", | |
67 | "properties": { | |
68 | "deploy": {"$ref": "#/definitions/deployment"}, | |
69 | "build": { | |
70 | "oneOf": [ | |
71 | {"type": "string"}, | |
72 | { | |
73 | "type": "object", | |
74 | "properties": { | |
75 | "context": {"type": "string"}, | |
76 | "dockerfile": {"type": "string"}, | |
77 | "args": {"$ref": "#/definitions/list_or_dict"}, | |
78 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
79 | "cache_from": {"$ref": "#/definitions/list_of_strings"}, | |
80 | "network": {"type": "string"}, | |
81 | "target": {"type": "string"}, | |
82 | "shm_size": {"type": ["integer", "string"]}, | |
83 | "extra_hosts": {"$ref": "#/definitions/list_or_dict"}, | |
84 | "isolation": {"type": "string"} | |
85 | }, | |
86 | "additionalProperties": false, | |
87 | "patternProperties": {"^x-": {}} | |
88 | } | |
89 | ] | |
90 | }, | |
91 | "blkio_config": { | |
92 | "type": "object", | |
93 | "properties": { | |
94 | "device_read_bps": { | |
95 | "type": "array", | |
96 | "items": {"$ref": "#/definitions/blkio_limit"} | |
97 | }, | |
98 | "device_read_iops": { | |
99 | "type": "array", | |
100 | "items": {"$ref": "#/definitions/blkio_limit"} | |
101 | }, | |
102 | "device_write_bps": { | |
103 | "type": "array", | |
104 | "items": {"$ref": "#/definitions/blkio_limit"} | |
105 | }, | |
106 | "device_write_iops": { | |
107 | "type": "array", | |
108 | "items": {"$ref": "#/definitions/blkio_limit"} | |
109 | }, | |
110 | "weight": {"type": "integer"}, | |
111 | "weight_device": { | |
112 | "type": "array", | |
113 | "items": {"$ref": "#/definitions/blkio_weight"} | |
114 | } | |
115 | }, | |
116 | "additionalProperties": false | |
117 | }, | |
118 | "cap_add": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
119 | "cap_drop": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
120 | "cgroup_parent": {"type": "string"}, | |
121 | "command": { | |
122 | "oneOf": [ | |
123 | {"type": "string"}, | |
124 | {"type": "array", "items": {"type": "string"}} | |
125 | ] | |
126 | }, | |
127 | "configs": { | |
128 | "type": "array", | |
129 | "items": { | |
130 | "oneOf": [ | |
131 | {"type": "string"}, | |
132 | { | |
133 | "type": "object", | |
134 | "properties": { | |
135 | "source": {"type": "string"}, | |
136 | "target": {"type": "string"}, | |
137 | "uid": {"type": "string"}, | |
138 | "gid": {"type": "string"}, | |
139 | "mode": {"type": "number"} | |
140 | }, | |
141 | "additionalProperties": false, | |
142 | "patternProperties": {"^x-": {}} | |
143 | } | |
144 | ] | |
145 | } | |
146 | }, | |
147 | "container_name": {"type": "string"}, | |
148 | "cpu_count": {"type": "integer", "minimum": 0}, | |
149 | "cpu_percent": {"type": "integer", "minimum": 0, "maximum": 100}, | |
150 | "cpu_shares": {"type": ["number", "string"]}, | |
151 | "cpu_quota": {"type": ["number", "string"]}, | |
152 | "cpu_period": {"type": ["number", "string"]}, | |
153 | "cpu_rt_period": {"type": ["number", "string"]}, | |
154 | "cpu_rt_runtime": {"type": ["number", "string"]}, | |
155 | "cpus": {"type": ["number", "string"]}, | |
156 | "cpuset": {"type": "string"}, | |
157 | "credential_spec": { | |
158 | "type": "object", | |
159 | "properties": { | |
160 | "config": {"type": "string"}, | |
161 | "file": {"type": "string"}, | |
162 | "registry": {"type": "string"} | |
163 | }, | |
164 | "additionalProperties": false, | |
165 | "patternProperties": {"^x-": {}} | |
166 | }, | |
167 | "depends_on": { | |
168 | "oneOf": [ | |
169 | {"$ref": "#/definitions/list_of_strings"}, | |
170 | { | |
171 | "type": "object", | |
172 | "additionalProperties": false, | |
173 | "patternProperties": { | |
174 | "^[a-zA-Z0-9._-]+$": { | |
175 | "type": "object", | |
176 | "additionalProperties": false, | |
177 | "properties": { | |
178 | "condition": { | |
179 | "type": "string", | |
180 | "enum": ["service_started", "service_healthy"] | |
181 | } | |
182 | }, | |
183 | "required": ["condition"] | |
184 | } | |
185 | } | |
186 | } | |
187 | ] | |
188 | }, | |
189 | "device_cgroup_rules": {"$ref": "#/definitions/list_of_strings"}, | |
190 | "devices": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
191 | "dns": {"$ref": "#/definitions/string_or_list"}, | |
192 | ||
193 | "dns_opt": {"type": "array","items": {"type": "string"}, "uniqueItems": true}, | |
194 | "dns_search": {"$ref": "#/definitions/string_or_list"}, | |
195 | "domainname": {"type": "string"}, | |
196 | "entrypoint": { | |
197 | "oneOf": [ | |
198 | {"type": "string"}, | |
199 | {"type": "array", "items": {"type": "string"}} | |
200 | ] | |
201 | }, | |
202 | "env_file": {"$ref": "#/definitions/string_or_list"}, | |
203 | "environment": {"$ref": "#/definitions/list_or_dict"}, | |
204 | ||
205 | "expose": { | |
206 | "type": "array", | |
207 | "items": { | |
208 | "type": ["string", "number"], | |
209 | "format": "expose" | |
210 | }, | |
211 | "uniqueItems": true | |
212 | }, | |
213 | ||
214 | "extends": { | |
215 | "oneOf": [ | |
216 | {"type": "string"}, | |
217 | { | |
218 | "type": "object", | |
219 | "properties": { | |
220 | "service": {"type": "string"}, | |
221 | "file": {"type": "string"} | |
222 | }, | |
223 | "required": ["service"], | |
224 | "additionalProperties": false | |
225 | } | |
226 | ] | |
227 | }, | |
228 | "external_links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
229 | "extra_hosts": {"$ref": "#/definitions/list_or_dict"}, | |
230 | "group_add": { | |
231 | "type": "array", | |
232 | "items": { | |
233 | "type": ["string", "number"] | |
234 | }, | |
235 | "uniqueItems": true | |
236 | }, | |
237 | "healthcheck": {"$ref": "#/definitions/healthcheck"}, | |
238 | "hostname": {"type": "string"}, | |
239 | "image": {"type": "string"}, | |
240 | "init": {"type": "boolean"}, | |
241 | "ipc": {"type": "string"}, | |
242 | "isolation": {"type": "string"}, | |
243 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
244 | "links": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
245 | "logging": { | |
246 | "type": "object", | |
247 | "properties": { | |
248 | "driver": {"type": "string"}, | |
249 | "options": { | |
250 | "type": "object", | |
251 | "patternProperties": { | |
252 | "^.+$": {"type": ["string", "number", "null"]} | |
253 | } | |
254 | } | |
255 | }, | |
256 | "additionalProperties": false, | |
257 | "patternProperties": {"^x-": {}} | |
258 | }, | |
259 | "mac_address": {"type": "string"}, | |
260 | "mem_limit": {"type": "string"}, | |
261 | "mem_reservation": {"type": ["string", "integer"]}, | |
262 | "mem_swappiness": {"type": "integer"}, | |
263 | "memswap_limit": {"type": ["number", "string"]}, | |
264 | "network_mode": {"type": "string"}, | |
265 | "networks": { | |
266 | "oneOf": [ | |
267 | {"$ref": "#/definitions/list_of_strings"}, | |
268 | { | |
269 | "type": "object", | |
270 | "patternProperties": { | |
271 | "^[a-zA-Z0-9._-]+$": { | |
272 | "oneOf": [ | |
273 | { | |
274 | "type": "object", | |
275 | "properties": { | |
276 | "aliases": {"$ref": "#/definitions/list_of_strings"}, | |
277 | "ipv4_address": {"type": "string"}, | |
278 | "ipv6_address": {"type": "string"}, | |
279 | "link_local_ips": {"$ref": "#/definitions/list_of_strings"}, | |
280 | "priority": {"type": "number"} | |
281 | }, | |
282 | "additionalProperties": false, | |
283 | "patternProperties": {"^x-": {}} | |
284 | }, | |
285 | {"type": "null"} | |
286 | ] | |
287 | } | |
288 | }, | |
289 | "additionalProperties": false | |
290 | } | |
291 | ] | |
292 | }, | |
293 | "oom_kill_disable": {"type": "boolean"}, | |
294 | "oom_score_adj": {"type": "integer", "minimum": -1000, "maximum": 1000}, | |
295 | "pid": {"type": ["string", "null"]}, | |
296 | "pids_limit": {"type": ["number", "string"]}, | |
297 | "platform": {"type": "string"}, | |
298 | "ports": { | |
299 | "type": "array", | |
300 | "items": { | |
301 | "oneOf": [ | |
302 | {"type": "number", "format": "ports"}, | |
303 | {"type": "string", "format": "ports"}, | |
304 | { | |
305 | "type": "object", | |
306 | "properties": { | |
307 | "mode": {"type": "string"}, | |
308 | "target": {"type": "integer"}, | |
309 | "published": {"type": "integer"}, | |
310 | "protocol": {"type": "string"} | |
311 | }, | |
312 | "additionalProperties": false, | |
313 | "patternProperties": {"^x-": {}} | |
314 | } | |
315 | ] | |
316 | }, | |
317 | "uniqueItems": true | |
318 | }, | |
319 | "privileged": {"type": "boolean"}, | |
320 | "pull_policy": {"type": "string", "enum": [ | |
321 | "always", "never", "if_not_present" | |
322 | ]}, | |
323 | "read_only": {"type": "boolean"}, | |
324 | "restart": {"type": "string"}, | |
325 | "runtime": { | |
326 | "deprecated": true, | |
327 | "type": "string" | |
328 | }, | |
329 | "scale": { | |
330 | "type": "integer" | |
331 | }, | |
332 | "security_opt": {"type": "array", "items": {"type": "string"}, "uniqueItems": true}, | |
333 | "shm_size": {"type": ["number", "string"]}, | |
334 | "secrets": { | |
335 | "type": "array", | |
336 | "items": { | |
337 | "oneOf": [ | |
338 | {"type": "string"}, | |
339 | { | |
340 | "type": "object", | |
341 | "properties": { | |
342 | "source": {"type": "string"}, | |
343 | "target": {"type": "string"}, | |
344 | "uid": {"type": "string"}, | |
345 | "gid": {"type": "string"}, | |
346 | "mode": {"type": "number"} | |
347 | }, | |
348 | "additionalProperties": false, | |
349 | "patternProperties": {"^x-": {}} | |
350 | } | |
351 | ] | |
352 | } | |
353 | }, | |
354 | "sysctls": {"$ref": "#/definitions/list_or_dict"}, | |
355 | "stdin_open": {"type": "boolean"}, | |
356 | "stop_grace_period": {"type": "string", "format": "duration"}, | |
357 | "stop_signal": {"type": "string"}, | |
358 | "tmpfs": {"$ref": "#/definitions/string_or_list"}, | |
359 | "tty": {"type": "boolean"}, | |
360 | "ulimits": { | |
361 | "type": "object", | |
362 | "patternProperties": { | |
363 | "^[a-z]+$": { | |
364 | "oneOf": [ | |
365 | {"type": "integer"}, | |
366 | { | |
367 | "type": "object", | |
368 | "properties": { | |
369 | "hard": {"type": "integer"}, | |
370 | "soft": {"type": "integer"} | |
371 | }, | |
372 | "required": ["soft", "hard"], | |
373 | "additionalProperties": false, | |
374 | "patternProperties": {"^x-": {}} | |
375 | } | |
376 | ] | |
377 | } | |
378 | } | |
379 | }, | |
380 | "user": {"type": "string"}, | |
381 | "userns_mode": {"type": "string"}, | |
382 | "volumes": { | |
383 | "type": "array", | |
384 | "items": { | |
385 | "oneOf": [ | |
386 | {"type": "string"}, | |
387 | { | |
388 | "type": "object", | |
389 | "required": ["type"], | |
390 | "properties": { | |
391 | "type": {"type": "string"}, | |
392 | "source": {"type": "string"}, | |
393 | "target": {"type": "string"}, | |
394 | "read_only": {"type": "boolean"}, | |
395 | "consistency": {"type": "string"}, | |
396 | "bind": { | |
397 | "type": "object", | |
398 | "properties": { | |
399 | "propagation": {"type": "string"} | |
400 | }, | |
401 | "additionalProperties": false, | |
402 | "patternProperties": {"^x-": {}} | |
403 | }, | |
404 | "volume": { | |
405 | "type": "object", | |
406 | "properties": { | |
407 | "nocopy": {"type": "boolean"} | |
408 | }, | |
409 | "additionalProperties": false, | |
410 | "patternProperties": {"^x-": {}} | |
411 | }, | |
412 | "tmpfs": { | |
413 | "type": "object", | |
414 | "properties": { | |
415 | "size": { | |
416 | "type": "integer", | |
417 | "minimum": 0 | |
418 | } | |
419 | }, | |
420 | "additionalProperties": false, | |
421 | "patternProperties": {"^x-": {}} | |
422 | } | |
423 | }, | |
424 | "additionalProperties": false, | |
425 | "patternProperties": {"^x-": {}} | |
426 | } | |
427 | ], | |
428 | "uniqueItems": true | |
429 | } | |
430 | }, | |
431 | "volumes_from": { | |
432 | "type": "array", | |
433 | "items": {"type": "string"}, | |
434 | "uniqueItems": true | |
435 | }, | |
436 | "working_dir": {"type": "string"} | |
437 | }, | |
438 | "patternProperties": {"^x-": {}}, | |
439 | "additionalProperties": false | |
440 | }, | |
441 | ||
442 | "healthcheck": { | |
443 | "id": "#/definitions/healthcheck", | |
444 | "type": "object", | |
445 | "properties": { | |
446 | "disable": {"type": "boolean"}, | |
447 | "interval": {"type": "string", "format": "duration"}, | |
448 | "retries": {"type": "number"}, | |
449 | "test": { | |
450 | "oneOf": [ | |
451 | {"type": "string"}, | |
452 | {"type": "array", "items": {"type": "string"}} | |
453 | ] | |
454 | }, | |
455 | "timeout": {"type": "string", "format": "duration"}, | |
456 | "start_period": {"type": "string", "format": "duration"} | |
457 | }, | |
458 | "additionalProperties": false, | |
459 | "patternProperties": {"^x-": {}} | |
460 | }, | |
461 | "deployment": { | |
462 | "id": "#/definitions/deployment", | |
463 | "type": ["object", "null"], | |
464 | "properties": { | |
465 | "mode": {"type": "string"}, | |
466 | "endpoint_mode": {"type": "string"}, | |
467 | "replicas": {"type": "integer"}, | |
468 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
469 | "rollback_config": { | |
470 | "type": "object", | |
471 | "properties": { | |
472 | "parallelism": {"type": "integer"}, | |
473 | "delay": {"type": "string", "format": "duration"}, | |
474 | "failure_action": {"type": "string"}, | |
475 | "monitor": {"type": "string", "format": "duration"}, | |
476 | "max_failure_ratio": {"type": "number"}, | |
477 | "order": {"type": "string", "enum": [ | |
478 | "start-first", "stop-first" | |
479 | ]} | |
480 | }, | |
481 | "additionalProperties": false, | |
482 | "patternProperties": {"^x-": {}} | |
483 | }, | |
484 | "update_config": { | |
485 | "type": "object", | |
486 | "properties": { | |
487 | "parallelism": {"type": "integer"}, | |
488 | "delay": {"type": "string", "format": "duration"}, | |
489 | "failure_action": {"type": "string"}, | |
490 | "monitor": {"type": "string", "format": "duration"}, | |
491 | "max_failure_ratio": {"type": "number"}, | |
492 | "order": {"type": "string", "enum": [ | |
493 | "start-first", "stop-first" | |
494 | ]} | |
495 | }, | |
496 | "additionalProperties": false, | |
497 | "patternProperties": {"^x-": {}} | |
498 | }, | |
499 | "resources": { | |
500 | "type": "object", | |
501 | "properties": { | |
502 | "limits": { | |
503 | "type": "object", | |
504 | "properties": { | |
505 | "cpus": {"type": ["number", "string"]}, | |
506 | "memory": {"type": "string"} | |
507 | }, | |
508 | "additionalProperties": false, | |
509 | "patternProperties": {"^x-": {}} | |
510 | }, | |
511 | "reservations": { | |
512 | "type": "object", | |
513 | "properties": { | |
514 | "cpus": {"type": ["number", "string"]}, | |
515 | "memory": {"type": "string"}, | |
516 | "generic_resources": {"$ref": "#/definitions/generic_resources"} | |
517 | }, | |
518 | "additionalProperties": false, | |
519 | "patternProperties": {"^x-": {}} | |
520 | } | |
521 | }, | |
522 | "additionalProperties": false, | |
523 | "patternProperties": {"^x-": {}} | |
524 | }, | |
525 | "restart_policy": { | |
526 | "type": "object", | |
527 | "properties": { | |
528 | "condition": {"type": "string"}, | |
529 | "delay": {"type": "string", "format": "duration"}, | |
530 | "max_attempts": {"type": "integer"}, | |
531 | "window": {"type": "string", "format": "duration"} | |
532 | }, | |
533 | "additionalProperties": false, | |
534 | "patternProperties": {"^x-": {}} | |
535 | }, | |
536 | "placement": { | |
537 | "type": "object", | |
538 | "properties": { | |
539 | "constraints": {"type": "array", "items": {"type": "string"}}, | |
540 | "preferences": { | |
541 | "type": "array", | |
542 | "items": { | |
543 | "type": "object", | |
544 | "properties": { | |
545 | "spread": {"type": "string"} | |
546 | }, | |
547 | "additionalProperties": false, | |
548 | "patternProperties": {"^x-": {}} | |
549 | } | |
550 | }, | |
551 | "max_replicas_per_node": {"type": "integer"} | |
552 | }, | |
553 | "additionalProperties": false, | |
554 | "patternProperties": {"^x-": {}} | |
555 | } | |
556 | }, | |
557 | "additionalProperties": false, | |
558 | "patternProperties": {"^x-": {}} | |
559 | }, | |
560 | "generic_resources": { | |
561 | "id": "#/definitions/generic_resources", | |
562 | "type": "array", | |
563 | "items": { | |
564 | "type": "object", | |
565 | "properties": { | |
566 | "discrete_resource_spec": { | |
567 | "type": "object", | |
568 | "properties": { | |
569 | "kind": {"type": "string"}, | |
570 | "value": {"type": "number"} | |
571 | }, | |
572 | "additionalProperties": false, | |
573 | "patternProperties": {"^x-": {}} | |
574 | } | |
575 | }, | |
576 | "additionalProperties": false, | |
577 | "patternProperties": {"^x-": {}} | |
578 | } | |
579 | }, | |
580 | "network": { | |
581 | "id": "#/definitions/network", | |
582 | "type": ["object", "null"], | |
583 | "properties": { | |
584 | "name": {"type": "string"}, | |
585 | "driver": {"type": "string"}, | |
586 | "driver_opts": { | |
587 | "type": "object", | |
588 | "patternProperties": { | |
589 | "^.+$": {"type": ["string", "number"]} | |
590 | } | |
591 | }, | |
592 | "ipam": { | |
593 | "type": "object", | |
594 | "properties": { | |
595 | "driver": {"type": "string"}, | |
596 | "config": { | |
597 | "type": "array", | |
598 | "items": { | |
599 | "type": "object", | |
600 | "properties": { | |
601 | "subnet": {"type": "string", "format": "subnet_ip_address"}, | |
602 | "ip_range": {"type": "string"}, | |
603 | "gateway": {"type": "string"}, | |
604 | "aux_addresses": { | |
605 | "type": "object", | |
606 | "additionalProperties": false, | |
607 | "patternProperties": {"^.+$": {"type": "string"}} | |
608 | } | |
609 | } | |
610 | }, | |
611 | "additionalProperties": false, | |
612 | "patternProperties": {"^x-": {}} | |
613 | }, | |
614 | "options": { | |
615 | "type": "object", | |
616 | "additionalProperties": false, | |
617 | "patternProperties": {"^.+$": {"type": "string"}} | |
618 | } | |
619 | }, | |
620 | "additionalProperties": false, | |
621 | "patternProperties": {"^x-": {}} | |
622 | }, | |
623 | "external": { | |
624 | "type": ["boolean", "object"], | |
625 | "properties": { | |
626 | "name": { | |
627 | "deprecated": true, | |
628 | "type": "string" | |
629 | } | |
630 | }, | |
631 | "additionalProperties": false, | |
632 | "patternProperties": {"^x-": {}} | |
633 | }, | |
634 | "internal": {"type": "boolean"}, | |
635 | "enable_ipv6": {"type": "boolean"}, | |
636 | "attachable": {"type": "boolean"}, | |
637 | "labels": {"$ref": "#/definitions/list_or_dict"} | |
638 | }, | |
639 | "additionalProperties": false, | |
640 | "patternProperties": {"^x-": {}} | |
641 | }, | |
642 | "volume": { | |
643 | "id": "#/definitions/volume", | |
644 | "type": ["object", "null"], | |
645 | "properties": { | |
646 | "name": {"type": "string"}, | |
647 | "driver": {"type": "string"}, | |
648 | "driver_opts": { | |
649 | "type": "object", | |
650 | "patternProperties": { | |
651 | "^.+$": {"type": ["string", "number"]} | |
652 | } | |
653 | }, | |
654 | "external": { | |
655 | "type": ["boolean", "object"], | |
656 | "properties": { | |
657 | "name": { | |
658 | "deprecated": true, | |
659 | "type": "string" | |
660 | } | |
661 | }, | |
662 | "additionalProperties": false, | |
663 | "patternProperties": {"^x-": {}} | |
664 | }, | |
665 | "labels": {"$ref": "#/definitions/list_or_dict"} | |
666 | }, | |
667 | "additionalProperties": false, | |
668 | "patternProperties": {"^x-": {}} | |
669 | }, | |
670 | "secret": { | |
671 | "id": "#/definitions/secret", | |
672 | "type": "object", | |
673 | "properties": { | |
674 | "name": {"type": "string"}, | |
675 | "file": {"type": "string"}, | |
676 | "external": { | |
677 | "type": ["boolean", "object"], | |
678 | "properties": { | |
679 | "name": {"type": "string"} | |
680 | } | |
681 | }, | |
682 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
683 | "driver": {"type": "string"}, | |
684 | "driver_opts": { | |
685 | "type": "object", | |
686 | "patternProperties": { | |
687 | "^.+$": {"type": ["string", "number"]} | |
688 | } | |
689 | }, | |
690 | "template_driver": {"type": "string"} | |
691 | }, | |
692 | "additionalProperties": false, | |
693 | "patternProperties": {"^x-": {}} | |
694 | }, | |
695 | "config": { | |
696 | "id": "#/definitions/config", | |
697 | "type": "object", | |
698 | "properties": { | |
699 | "name": {"type": "string"}, | |
700 | "file": {"type": "string"}, | |
701 | "external": { | |
702 | "type": ["boolean", "object"], | |
703 | "properties": { | |
704 | "name": { | |
705 | "deprecated": true, | |
706 | "type": "string" | |
707 | } | |
708 | } | |
709 | }, | |
710 | "labels": {"$ref": "#/definitions/list_or_dict"}, | |
711 | "template_driver": {"type": "string"} | |
712 | }, | |
713 | "additionalProperties": false, | |
714 | "patternProperties": {"^x-": {}} | |
715 | }, | |
716 | "string_or_list": { | |
717 | "oneOf": [ | |
718 | {"type": "string"}, | |
719 | {"$ref": "#/definitions/list_of_strings"} | |
720 | ] | |
721 | }, | |
722 | "list_of_strings": { | |
723 | "type": "array", | |
724 | "items": {"type": "string"}, | |
725 | "uniqueItems": true | |
726 | }, | |
727 | "list_or_dict": { | |
728 | "oneOf": [ | |
729 | { | |
730 | "type": "object", | |
731 | "patternProperties": { | |
732 | ".+": { | |
733 | "type": ["string", "number", "null"] | |
734 | } | |
735 | }, | |
736 | "additionalProperties": false | |
737 | }, | |
738 | {"type": "array", "items": {"type": "string"}, "uniqueItems": true} | |
739 | ] | |
740 | }, | |
741 | "blkio_limit": { | |
742 | "type": "object", | |
743 | "properties": { | |
744 | "path": {"type": "string"}, | |
745 | "rate": {"type": ["integer", "string"]} | |
746 | }, | |
747 | "additionalProperties": false | |
748 | }, | |
749 | "blkio_weight": { | |
750 | "type": "object", | |
751 | "properties": { | |
752 | "path": {"type": "string"}, | |
753 | "weight": {"type": "integer"} | |
754 | }, | |
755 | "additionalProperties": false | |
756 | }, | |
757 | "constraints": { | |
758 | "service": { | |
759 | "id": "#/definitions/constraints/service", | |
760 | "anyOf": [ | |
761 | {"required": ["build"]}, | |
762 | {"required": ["image"]} | |
763 | ], | |
764 | "properties": { | |
765 | "build": { | |
766 | "required": ["context"] | |
767 | } | |
768 | } | |
769 | } | |
770 | } | |
771 | } | |
772 | } |
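The long-form `depends_on` rules above (service-name pattern, a required `condition` key, a closed enum) can be mirrored in a few lines of plain Python. This is an illustrative re-implementation for the two conditions listed in this schema file, not the validator Compose actually uses (which runs the file through `jsonschema`):

```python
import re

# Hand-rolled mirror of the long-form depends_on schema above:
# keys must match the service-name pattern, each value must be an
# object with exactly a "condition" key, and the condition must be
# one of the enumerated strings.
SERVICE_NAME = re.compile(r'^[a-zA-Z0-9._-]+$')
CONDITIONS = {'service_started', 'service_healthy'}

def depends_on_errors(depends_on):
    errors = []
    for name, spec in depends_on.items():
        if not SERVICE_NAME.match(name):
            errors.append('invalid service name: {}'.format(name))
        elif not isinstance(spec, dict) or set(spec) != {'condition'}:
            errors.append('{}: expected a single "condition" key'.format(name))
        elif spec['condition'] not in CONDITIONS:
            errors.append('{}: unsupported condition {!r}'.format(name, spec['condition']))
    return errors
```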
53 | 53 | if base_dir is None: |
54 | 54 | return result |
55 | 55 | if env_file: |
56 | env_file_path = os.path.join(base_dir, env_file) | |
57 | else: | |
58 | env_file_path = os.path.join(base_dir, '.env') | |
56 | env_file_path = os.path.join(os.getcwd(), env_file) | |
57 | return cls(env_vars_from_file(env_file_path)) | |
58 | ||
59 | env_file_path = os.path.join(base_dir, '.env') | |
59 | 60 | try: |
60 | 61 | return cls(env_vars_from_file(env_file_path)) |
61 | 62 | except EnvFileNotFound: |
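The hunk above implements the 1.29.x behaviour change from the changelog: an explicit `--env-file` is resolved against the current working directory, while the default `.env` stays relative to the project's base directory. A standalone sketch of just the path-resolution rule (function name is illustrative):

```python
import os

def resolve_env_file(base_dir, env_file=None):
    # Explicit --env-file: relative to the current working directory.
    if env_file:
        return os.path.join(os.getcwd(), env_file)
    # Default: .env in the project's base directory.
    return os.path.join(base_dir, '.env')
```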
112 | 113 | ) |
113 | 114 | return super().get(key, *args, **kwargs) |
114 | 115 | |
115 | def get_boolean(self, key): | |
116 | def get_boolean(self, key, default=False): | |
116 | 117 | # Convert a value to a boolean using "common sense" rules. |
117 | 118 | # Unset and empty values yield the given default; "0" and
118 | 119 | # "false" (case-insensitive) yield False. All other values yield True.
119 | 120 | value = self.get(key) |
120 | 121 | if not value: |
121 | return False | |
122 | return default | |
122 | 123 | if value.lower() in ['0', 'false']: |
123 | 124 | return False |
124 | 125 | return True |
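The new `default` parameter only changes the unset/empty case; `"0"` and `"false"` still win over any default. A dict-backed stand-in (not the real `Environment` class) showing the resulting truth table:

```python
class EnvStandIn(dict):
    def get_boolean(self, key, default=False):
        value = self.get(key)
        if not value:  # unset or empty -> caller's default
            return default
        return value.lower() not in ('0', 'false')

env = EnvStandIn({'VERBOSE': '1', 'NO_COLOR': 'False'})
```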
110 | 110 | var, _, err = braced.partition(':?') |
111 | 111 | result = mapping.get(var) |
112 | 112 | if not result: |
113 | err = err or var | |
113 | 114 | raise UnsetRequiredSubstitution(err) |
114 | 115 | return result |
115 | 116 | elif '?' == sep: |
116 | 117 | var, _, err = braced.partition('?') |
117 | 118 | if var in mapping: |
118 | 119 | return mapping.get(var) |
120 | err = err or var | |
119 | 121 | raise UnsetRequiredSubstitution(err) |
120 | 122 | |
121 | 123 | # Modified from python2.7/string.py |
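The two `err = err or var` additions above mean an empty message in `${VAR:?}` or `${VAR?}` falls back to the variable name itself. A minimal reproduction of just that rule (the real code lives inside Compose's `Template` subclass):

```python
class UnsetRequiredSubstitution(Exception):
    pass

def required_value(braced, mapping):
    # Handles only the "${VAR:?err}" and "${VAR?err}" forms.
    if ':?' in braced:
        var, _, err = braced.partition(':?')
        result = mapping.get(var)
        if not result:
            raise UnsetRequiredSubstitution(err or var)
        return result
    var, _, err = braced.partition('?')
    if var in mapping:
        return mapping[var]
    raise UnsetRequiredSubstitution(err or var)
```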
240 | 242 | service_path('healthcheck', 'disable'): to_boolean, |
241 | 243 | service_path('deploy', 'labels', PATH_JOKER): to_str, |
242 | 244 | service_path('deploy', 'replicas'): to_int, |
245 | service_path('deploy', 'placement', 'max_replicas_per_node'): to_int, | |
243 | 246 | service_path('deploy', 'resources', 'limits', "cpus"): to_float, |
244 | 247 | service_path('deploy', 'update_config', 'parallelism'): to_int, |
245 | 248 | service_path('deploy', 'update_config', 'max_failure_ratio'): to_float, |
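Values that pass through environment substitution arrive as strings, so each schema-typed field needs a converter in this map; the new `max_replicas_per_node` entry reuses the `to_int` pattern. A sketch of what such a converter does (illustrative, not Compose's exact helper):

```python
def to_int(value):
    # Leave unset and already-typed values alone; coerce strings.
    if value is None or isinstance(value, int):
        return value
    return int(value)
```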
501 | 501 | |
502 | 502 | |
503 | 503 | def load_jsonschema(version): |
504 | suffix = "compose_spec" | |
504 | name = "compose_spec" | |
505 | 505 | if version == V1: |
506 | suffix = "v1" | |
506 | name = "config_schema_v1" | |
507 | 507 | |
508 | 508 | filename = os.path.join( |
509 | 509 | get_schema_path(), |
510 | "config_schema_{}.json".format(suffix)) | |
510 | "{}.json".format(name)) | |
511 | 511 | |
512 | 512 | if not os.path.exists(filename): |
513 | 513 | raise ConfigurationError( |
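After this rename, only legacy V1 files keep the `config_schema_` prefix; every other version resolves to the merged `compose_spec.json`. The lookup reduced to a pure function (the `V1` constant here is a stand-in for Compose's real one):

```python
import os

V1 = '1'  # stand-in for the constant imported from compose.const

def schema_filename(schema_dir, version):
    name = 'compose_spec'
    if version == V1:
        name = 'config_schema_v1'
    return os.path.join(schema_dir, '{}.json'.format(name))
```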
4 | 4 | DEFAULT_TIMEOUT = 10 |
5 | 5 | HTTP_TIMEOUT = 60 |
6 | 6 | IS_WINDOWS_PLATFORM = (sys.platform == "win32") |
7 | IS_LINUX_PLATFORM = (sys.platform == "linux") | |
7 | 8 | LABEL_CONTAINER_NUMBER = 'com.docker.compose.container-number' |
8 | 9 | LABEL_ONE_OFF = 'com.docker.compose.oneoff' |
9 | 10 | LABEL_PROJECT = 'com.docker.compose.project' |
186 | 186 | return self.get('HostConfig.LogConfig.Type') |
187 | 187 | |
188 | 188 | @property |
189 | def has_api_logs(self): | |
190 | log_type = self.log_driver | |
191 | return not log_type or log_type in ('json-file', 'journald', 'local') | |
192 | ||
193 | @property | |
194 | 189 | def human_readable_health_status(self): |
195 | 190 | """ Generate UP status string with up time and health |
196 | 191 | """ |
203 | 198 | return status_string |
204 | 199 | |
205 | 200 | def attach_log_stream(self): |
206 | """A log stream can only be attached if the container uses a | |
207 | json-file, journald or local log driver. | |
208 | """ | |
209 | if self.has_api_logs: | |
210 | self.log_stream = self.attach(stdout=True, stderr=True, stream=True) | |
201 | self.log_stream = self.attach(stdout=True, stderr=True, stream=True) | |
211 | 202 | |
212 | 203 | def get(self, key): |
213 | 204 | """Return a value from the container or None if the value is not set. |
26 | 26 | service_name |
27 | 27 | ) |
28 | 28 | ) |
29 | ||
30 | ||
31 | class CompletedUnsuccessfully(Exception): | |
32 | def __init__(self, container_id, exit_code): | |
33 | self.msg = 'Container "{}" exited with code {}.'.format(container_id, exit_code) |
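`CompletedUnsuccessfully` backs the 1.29.0 `depends_on` condition that waits for successful service completion: a dependency that exits non-zero aborts the services waiting on it. A runnable sketch of the check that would raise it (the guard function is hypothetical):

```python
class CompletedUnsuccessfully(Exception):
    def __init__(self, container_id, exit_code):
        self.msg = 'Container "{}" exited with code {}.'.format(container_id, exit_code)

def assert_completed_ok(container_id, exit_code):
    # Hypothetical guard: dependencies that finished with a non-zero
    # exit code abort the services depending on them.
    if exit_code != 0:
        raise CompletedUnsuccessfully(container_id, exit_code)
    return True
```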
0 | import os | |
1 | from enum import Enum | |
2 | ||
3 | import requests | |
4 | from docker import ContextAPI | |
5 | from docker.transport import UnixHTTPAdapter | |
6 | ||
7 | from compose.const import IS_WINDOWS_PLATFORM | |
8 | ||
9 | if IS_WINDOWS_PLATFORM: | |
10 | from docker.transport import NpipeHTTPAdapter | |
11 | ||
12 | ||
13 | class Status(Enum): | |
14 | SUCCESS = "success" | |
15 | FAILURE = "failure" | |
16 | CANCELED = "canceled" | |
17 | ||
18 | ||
19 | class MetricsSource: | |
20 | CLI = "docker-compose" | |
21 | ||
22 | ||
23 | if IS_WINDOWS_PLATFORM: | |
24 | METRICS_SOCKET_FILE = 'npipe://\\\\.\\pipe\\docker_cli' | |
25 | else: | |
26 | METRICS_SOCKET_FILE = 'http+unix:///var/run/docker-cli.sock' | |
27 | ||
28 | ||
29 | class MetricsCommand(requests.Session): | |
30 | """ | |
31 | Representation of a command in the metrics. | |
32 | """ | |
33 | ||
34 | def __init__(self, command, | |
35 | context_type=None, status=Status.SUCCESS, | |
36 | source=MetricsSource.CLI, uri=None): | |
37 | super().__init__() | |
38 | self.command = ("compose " + command).strip() if command else "compose --help" | |
39 | self.context = context_type or ContextAPI.get_current_context().context_type or 'moby' | |
40 | self.source = source | |
41 | self.status = status.value | |
42 | self.uri = uri or os.environ.get("METRICS_SOCKET_FILE", METRICS_SOCKET_FILE) | |
43 | if IS_WINDOWS_PLATFORM: | |
44 | self.mount("http+unix://", NpipeHTTPAdapter(self.uri)) | |
45 | else: | |
46 | self.mount("http+unix://", UnixHTTPAdapter(self.uri)) | |
47 | ||
48 | def send_metrics(self): | |
49 | try: | |
50 | return self.post("http+unix://localhost/usage", | |
51 | json=self.to_map(), | |
52 | timeout=.05, | |
53 | headers={'Content-Type': 'application/json'}) | |
54 | except Exception as e: | |
55 | return e | |
56 | ||
57 | def to_map(self): | |
58 | return { | |
59 | 'command': self.command, | |
60 | 'context': self.context, | |
61 | 'source': self.source, | |
62 | 'status': self.status, | |
63 | } |
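The JSON body posted to the local metrics socket is small; reproduced here as a standalone function so the shape is visible without opening a socket (field values mirror `to_map()` and the normalisation in `__init__` above):

```python
def metrics_payload(command, context='moby', status='success'):
    # Same normalisation as MetricsCommand.__init__: prefix with
    # "compose", fall back to "compose --help" for an empty command.
    return {
        'command': ('compose ' + command).strip() if command else 'compose --help',
        'context': context,
        'source': 'docker-compose',
        'status': status,
    }
```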
0 | import functools | |
1 | ||
2 | from compose.metrics.client import MetricsCommand | |
3 | from compose.metrics.client import Status | |
4 | ||
5 | ||
6 | class metrics: | |
7 | def __init__(self, command_name=None): | |
8 | self.command_name = command_name | |
9 | ||
10 | def __call__(self, fn): | |
11 | @functools.wraps(fn, | |
12 | assigned=functools.WRAPPER_ASSIGNMENTS, | |
13 | updated=functools.WRAPPER_UPDATES) | |
14 | def wrapper(*args, **kwargs): | |
15 | if not self.command_name: | |
16 | self.command_name = fn.__name__ | |
17 | result = fn(*args, **kwargs) | |
18 | MetricsCommand(self.command_name, status=Status.SUCCESS).send_metrics() | |
19 | return result | |
20 | return wrapper |
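Applied to a command function, the decorator defaults the metric name to the function's own name. A runnable sketch with `send_metrics()` replaced by a list append so it works without a Docker CLI socket:

```python
import functools

class metrics_stub:
    """Same shape as the decorator above, with the network call stubbed."""
    def __init__(self, command_name=None):
        self.command_name = command_name
        self.sent = []

    def __call__(self, fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if not self.command_name:
                self.command_name = fn.__name__
            result = fn(*args, **kwargs)
            self.sent.append(self.command_name)  # stands in for send_metrics()
            return result
        return wrapper

m = metrics_stub()

@m
def up():
    return 'started'
```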
10 | 10 | from docker.errors import APIError |
11 | 11 | from docker.errors import ImageNotFound |
12 | 12 | |
13 | from compose.cli.colors import AnsiMode | |
13 | 14 | from compose.cli.colors import green |
14 | 15 | from compose.cli.colors import red |
15 | 16 | from compose.cli.signals import ShutdownException |
16 | 17 | from compose.const import PARALLEL_LIMIT |
18 | from compose.errors import CompletedUnsuccessfully | |
17 | 19 | from compose.errors import HealthCheckFailed |
18 | 20 | from compose.errors import NoHealthCheckConfigured |
19 | 21 | from compose.errors import OperationFailedError |
59 | 61 | elif isinstance(exception, APIError): |
60 | 62 | errors[get_name(obj)] = exception.explanation |
61 | 63 | writer.write(msg, get_name(obj), 'error', red) |
62 | elif isinstance(exception, (OperationFailedError, HealthCheckFailed, NoHealthCheckConfigured)): | |
64 | elif isinstance(exception, (OperationFailedError, HealthCheckFailed, NoHealthCheckConfigured, | |
65 | CompletedUnsuccessfully)): | |
63 | 66 | errors[get_name(obj)] = exception.msg |
64 | 67 | writer.write(msg, get_name(obj), 'error', red) |
65 | 68 | elif isinstance(exception, UpstreamError): |
82 | 85 | objects = list(objects) |
83 | 86 | stream = sys.stderr |
84 | 87 | |
85 | if ParallelStreamWriter.instance: | |
86 | writer = ParallelStreamWriter.instance | |
87 | else: | |
88 | writer = ParallelStreamWriter(stream) | |
88 | writer = ParallelStreamWriter.get_or_assign_instance(ParallelStreamWriter(stream)) | |
89 | 89 | |
90 | 90 | for obj in objects: |
91 | 91 | writer.add_object(msg, get_name(obj)) |
242 | 242 | 'not processing'.format(obj) |
243 | 243 | ) |
244 | 244 | results.put((obj, None, e)) |
245 | except CompletedUnsuccessfully as e: | |
246 | log.debug( | |
247 | 'Service(s) upstream of {} did not complete successfully - ' | |
248 | 'not processing'.format(obj) | |
249 | ) | |
250 | results.put((obj, None, e)) | |
245 | 251 | |
246 | 252 | if state.is_done(): |
247 | 253 | results.put(STOP) |
258 | 264 | to jump to the correct line, and write over the line. |
259 | 265 | """ |
260 | 266 | |
261 | noansi = False | |
262 | lock = Lock() | |
267 | default_ansi_mode = AnsiMode.AUTO | |
268 | write_lock = Lock() | |
269 | ||
263 | 270 | instance = None |
271 | instance_lock = Lock() | |
264 | 272 | |
265 | 273 | @classmethod |
266 | def set_noansi(cls, value=True): | |
267 | cls.noansi = value | |
268 | ||
269 | def __init__(self, stream): | |
274 | def get_instance(cls): | |
275 | return cls.instance | |
276 | ||
277 | @classmethod | |
278 | def get_or_assign_instance(cls, writer): | |
279 | cls.instance_lock.acquire() | |
280 | try: | |
281 | if cls.instance is None: | |
282 | cls.instance = writer | |
283 | return cls.instance | |
284 | finally: | |
285 | cls.instance_lock.release() | |
286 | ||
287 | @classmethod | |
288 | def set_default_ansi_mode(cls, ansi_mode): | |
289 | cls.default_ansi_mode = ansi_mode | |
290 | ||
291 | def __init__(self, stream, ansi_mode=None): | |
292 | if ansi_mode is None: | |
293 | ansi_mode = self.default_ansi_mode | |
270 | 294 | self.stream = stream |
295 | self.use_ansi_codes = ansi_mode.use_ansi_codes(stream) | |
271 | 296 | self.lines = [] |
272 | 297 | self.width = 0 |
273 | ParallelStreamWriter.instance = self | |
274 | 298 | |
275 | 299 | def add_object(self, msg, obj_index): |
276 | 300 | if msg is None: |
284 | 308 | return self._write_noansi(msg, obj_index, '') |
285 | 309 | |
286 | 310 | def _write_ansi(self, msg, obj_index, status): |
287 | self.lock.acquire() | |
311 | self.write_lock.acquire() | |
288 | 312 | position = self.lines.index(msg + obj_index) |
289 | 313 | diff = len(self.lines) - position |
290 | 314 | # move up |
296 | 320 | # move back down |
297 | 321 | self.stream.write("%c[%dB" % (27, diff)) |
298 | 322 | self.stream.flush() |
299 | self.lock.release() | |
323 | self.write_lock.release() | |
300 | 324 | |
301 | 325 | def _write_noansi(self, msg, obj_index, status): |
302 | 326 | self.stream.write( |
309 | 333 | def write(self, msg, obj_index, status, color_func): |
310 | 334 | if msg is None: |
311 | 335 | return |
312 | if self.noansi: | |
336 | if self.use_ansi_codes: | |
337 | self._write_ansi(msg, obj_index, color_func(status)) | |
338 | else: | |
313 | 339 | self._write_noansi(msg, obj_index, status) |
314 | else: | |
315 | self._write_ansi(msg, obj_index, color_func(status)) | |
316 | ||
317 | ||
318 | def get_stream_writer(): | |
319 | instance = ParallelStreamWriter.instance | |
320 | if instance is None: | |
321 | raise RuntimeError('ParallelStreamWriter has not yet been instantiated') | |
322 | return instance | |
323 | 340 | |
324 | 341 | |
325 | 342 | def parallel_operation(containers, operation, options, message): |
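The singleton change above replaces ad-hoc checks of `ParallelStreamWriter.instance` with a lock-guarded assign-or-return. The pattern in isolation:

```python
import threading

class SingletonWriter:
    instance = None
    instance_lock = threading.Lock()

    @classmethod
    def get_or_assign_instance(cls, writer):
        # First caller's writer wins; later callers get the existing one.
        with cls.instance_lock:
            if cls.instance is None:
                cls.instance = writer
            return cls.instance

first = SingletonWriter()
second = SingletonWriter()
```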
38 | 38 | from .service import ServiceIpcMode |
39 | 39 | from .service import ServiceNetworkMode |
40 | 40 | from .service import ServicePidMode |
41 | from .utils import filter_attached_for_up | |
41 | 42 | from .utils import microseconds_from_time_nano |
42 | 43 | from .utils import truncate_string |
43 | 44 | from .volume import ProjectVolumes |
67 | 68 | """ |
68 | 69 | A collection of services. |
69 | 70 | """ |
70 | def __init__(self, name, services, client, networks=None, volumes=None, config_version=None): | |
71 | def __init__(self, name, services, client, networks=None, volumes=None, config_version=None, | |
72 | enabled_profiles=None): | |
71 | 73 | self.name = name |
72 | 74 | self.services = services |
73 | 75 | self.client = client |
74 | 76 | self.volumes = volumes or ProjectVolumes({}) |
75 | 77 | self.networks = networks or ProjectNetworks({}, False) |
76 | 78 | self.config_version = config_version |
79 | self.enabled_profiles = enabled_profiles or [] | |
77 | 80 | |
78 | 81 | def labels(self, one_off=OneOffFilter.exclude, legacy=False): |
79 | 82 | name = self.name |
85 | 88 | return labels |
86 | 89 | |
87 | 90 | @classmethod |
88 | def from_config(cls, name, config_data, client, default_platform=None, extra_labels=None): | |
91 | def from_config(cls, name, config_data, client, default_platform=None, extra_labels=None, | |
92 | enabled_profiles=None): | |
89 | 93 | """ |
90 | 94 | Construct a Project from a config.Config object. |
91 | 95 | """ |
97 | 101 | networks, |
98 | 102 | use_networking) |
99 | 103 | volumes = ProjectVolumes.from_config(name, config_data, client) |
100 | project = cls(name, [], client, project_networks, volumes, config_data.version) | |
104 | project = cls(name, [], client, project_networks, volumes, config_data.version, enabled_profiles) | |
101 | 105 | |
102 | 106 | for service_dict in config_data.services: |
103 | 107 | service_dict = dict(service_dict) |
127 | 131 | config_data.secrets) |
128 | 132 | |
129 | 133 | service_dict['scale'] = project.get_service_scale(service_dict) |
130 | ||
134 | service_dict['device_requests'] = project.get_device_requests(service_dict) | |
131 | 135 | service_dict = translate_credential_spec_to_security_opt(service_dict) |
132 | 136 | service_dict, ignored_keys = translate_deploy_keys_to_container_config( |
133 | 137 | service_dict |
184 | 188 | if name not in valid_names: |
185 | 189 | raise NoSuchService(name) |
186 | 190 | |
187 | def get_services(self, service_names=None, include_deps=False): | |
191 | def get_services(self, service_names=None, include_deps=False, auto_enable_profiles=True): | |
188 | 192 | """ |
189 | 193 | Returns a list of this project's services filtered |
190 | 194 | by the provided list of names, or all services if service_names is None |
197 | 201 | reordering as needed to resolve dependencies. |
198 | 202 | |
199 | 203 | Raises NoSuchService if any of the named services do not exist. |
204 | ||
205 | Raises ConfigurationError if any depended-on service is not enabled by the active profiles | |
200 | 206 | """ |
207 | # create a copy so we can *locally* add auto-enabled profiles later | |
208 | enabled_profiles = self.enabled_profiles.copy() | |
209 | ||
201 | 210 | if service_names is None or len(service_names) == 0: |
202 | service_names = self.service_names | |
211 | auto_enable_profiles = False | |
212 | service_names = [ | |
213 | service.name | |
214 | for service in self.services | |
215 | if service.enabled_for_profiles(enabled_profiles) | |
216 | ] | |
203 | 217 | |
204 | 218 | unsorted = [self.get_service(name) for name in service_names] |
205 | 219 | services = [s for s in self.services if s in unsorted] |
206 | 220 | |
221 | if auto_enable_profiles: | |
222 | # enable profiles of explicitly targeted services | |
223 | for service in services: | |
224 | for profile in service.get_profiles(): | |
225 | if profile not in enabled_profiles: | |
226 | enabled_profiles.append(profile) | |
227 | ||
207 | 228 | if include_deps: |
208 | services = reduce(self._inject_deps, services, []) | |
229 | services = reduce( | |
230 | lambda acc, s: self._inject_deps(acc, s, enabled_profiles), | |
231 | services, | |
232 | [] | |
233 | ) | |
209 | 234 | |
210 | 235 | uniques = [] |
211 | 236 | [uniques.append(s) for s in services if s not in uniques] |
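The hunk above gates service selection on profiles and auto-enables the profiles of explicitly targeted services. A minimal standalone sketch of that selection logic (hypothetical `Service`/`select_services` stand-ins, not compose's real classes; unlike `Project.get_services`, this sketch mutates the passed-in profile list directly):

```python
class Service:
    def __init__(self, name, profiles=None):
        self.name = name
        self.profiles = profiles or []

    def enabled_for_profiles(self, enabled):
        # a service without profiles is always enabled
        return not self.profiles or any(p in self.profiles for p in enabled)


def select_services(services, names, enabled_profiles, auto_enable=True):
    if not names:
        # no explicit selection: take only profile-enabled services,
        # and never auto-enable anything
        return [s for s in services if s.enabled_for_profiles(enabled_profiles)]
    selected = [s for s in services if s.name in names]
    if auto_enable:
        # explicitly targeted services pull their own profiles in
        for s in selected:
            for p in s.profiles:
                if p not in enabled_profiles:
                    enabled_profiles.append(p)
    return selected


services = [Service("web"), Service("db"), Service("debugger", ["debug"])]
profiles = []
assert [s.name for s in select_services(services, [], profiles)] == ["web", "db"]
assert [s.name for s in select_services(services, ["debugger"], profiles)] == ["debugger"]
assert profiles == ["debug"]  # profile auto-enabled by explicit selection
```

This mirrors why `auto_enable_profiles` is forced to `False` when no names are given: only an explicit `up debugger` should switch a profile on.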
330 | 355 | max_replicas)) |
331 | 356 | return scale |
332 | 357 | |
358 | def get_device_requests(self, service_dict): | |
359 | deploy_dict = service_dict.get('deploy', None) | |
360 | if not deploy_dict: | |
361 | return | |
362 | ||
363 | resources = deploy_dict.get('resources', None) | |
364 | if not resources or not resources.get('reservations', None): | |
365 | return | |
366 | devices = resources['reservations'].get('devices') | |
367 | if not devices: | |
368 | return | |
369 | ||
370 | for dev in devices: | |
371 | count = dev.get("count", -1) | |
372 | if not isinstance(count, int): | |
373 | if count != "all": | |
374 | raise ConfigurationError( | |
375 | 'Invalid value "{}" for devices count'.format(dev["count"]), | |
376 | '(expected integer or "all")') | |
377 | dev["count"] = -1 | |
378 | ||
379 | if 'capabilities' in dev: | |
380 | dev['capabilities'] = [dev['capabilities']] | |
381 | return devices | |
382 | ||
333 | 383 | def start(self, service_names=None, **options): |
334 | 384 | containers = [] |
335 | 385 | |
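The `get_device_requests` hunk above normalizes device counts: `"all"` maps to `-1`, any other non-integer is rejected, and a single capabilities list is wrapped in an outer list. A hedged sketch of just that normalization step (mirrors the diff's logic; not docker-py's API):

```python
class ConfigurationError(Exception):
    pass


def normalize_devices(devices):
    for dev in devices:
        count = dev.get("count", -1)
        if not isinstance(count, int):
            if count != "all":
                raise ConfigurationError(
                    'Invalid value "{}" for devices count '
                    '(expected integer or "all")'.format(count))
            # "all" is encoded as -1 for the Docker API
            dev["count"] = -1
        if "capabilities" in dev:
            # the API expects a list of capability lists
            dev["capabilities"] = [dev["capabilities"]]
    return devices


devs = normalize_devices([{"count": "all", "capabilities": ["gpu"]}])
assert devs == [{"count": -1, "capabilities": [["gpu"]]}]
```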
411 | 461 | self.remove_images(remove_image_type) |
412 | 462 | |
413 | 463 | def remove_images(self, remove_image_type): |
414 | for service in self.get_services(): | |
464 | for service in self.services: | |
415 | 465 | service.remove_image(remove_image_type) |
416 | 466 | |
417 | 467 | def restart(self, service_names=None, **options): |
468 | # filter service_names by enabled profiles | |
469 | service_names = [s.name for s in self.get_services(service_names)] | |
418 | 470 | containers = self.containers(service_names, stopped=True) |
419 | 471 | |
420 | 472 | parallel.parallel_execute( |
437 | 489 | log.info('%s uses an image, skipping' % service.name) |
438 | 490 | |
439 | 491 | if cli: |
440 | log.warning("Native build is an experimental feature and could change at any time") | |
441 | 492 | if parallel_build: |
442 | 493 | log.warning("Flag '--parallel' is ignored when building with " |
443 | 494 | "COMPOSE_DOCKER_CLI_BUILD=1") |
593 | 644 | silent=False, |
594 | 645 | cli=False, |
595 | 646 | one_off=False, |
647 | attach_dependencies=False, | |
596 | 648 | override_options=None, |
597 | 649 | ): |
598 | ||
599 | if cli: | |
600 | log.warning("Native build is an experimental feature and could change at any time") | |
601 | 650 | |
602 | 651 | self.initialize() |
603 | 652 | if not ignore_orphans: |
619 | 668 | one_off=service_names if one_off else [], |
620 | 669 | ) |
621 | 670 | |
671 | services_to_attach = filter_attached_for_up( | |
672 | services, | |
673 | service_names, | |
674 | attach_dependencies, | |
675 | lambda service: service.name) | |
676 | ||
622 | 677 | def do(service): |
623 | ||
624 | 678 | return service.execute_convergence_plan( |
625 | 679 | plans[service.name], |
626 | 680 | timeout=timeout, |
627 | detached=detached, | |
681 | detached=detached or (service not in services_to_attach), | |
628 | 682 | scale_override=scale_override.get(service.name), |
629 | 683 | rescale=rescale, |
630 | 684 | start=start, |
694 | 748 | |
695 | 749 | return plans |
696 | 750 | |
697 | def pull(self, service_names=None, ignore_pull_failures=False, parallel_pull=False, silent=False, | |
751 | def pull(self, service_names=None, ignore_pull_failures=False, parallel_pull=True, silent=False, | |
698 | 752 | include_deps=False): |
699 | 753 | services = self.get_services(service_names, include_deps) |
700 | 754 | |
728 | 782 | return |
729 | 783 | |
730 | 784 | try: |
731 | writer = parallel.get_stream_writer() | |
785 | writer = parallel.ParallelStreamWriter.get_instance() | |
786 | if writer is None: | |
787 | raise RuntimeError('ParallelStreamWriter has not yet been instantiated') | |
732 | 788 | for event in strm: |
733 | 789 | if 'status' not in event: |
734 | 790 | continue |
829 | 885 | ) |
830 | 886 | ) |
831 | 887 | |
832 | def _inject_deps(self, acc, service): | |
888 | def _inject_deps(self, acc, service, enabled_profiles): | |
833 | 889 | dep_names = service.get_dependency_names() |
834 | 890 | |
835 | 891 | if len(dep_names) > 0: |
836 | 892 | dep_services = self.get_services( |
837 | 893 | service_names=list(set(dep_names)), |
838 | include_deps=True | |
839 | ) | |
894 | include_deps=True, | |
895 | auto_enable_profiles=False | |
896 | ) | |
897 | ||
898 | for dep in dep_services: | |
899 | if not dep.enabled_for_profiles(enabled_profiles): | |
900 | raise ConfigurationError( | |
901 | 'Service "{dep_name}" was pulled in as a dependency of ' | |
902 | 'service "{service_name}" but is not enabled by the ' | |
903 | 'active profiles. ' | |
904 | 'You may fix this by adding a common profile to ' | |
905 | '"{dep_name}" and "{service_name}".' | |
906 | .format(dep_name=dep.name, service_name=service.name) | |
907 | ) | |
840 | 908 | else: |
841 | 909 | dep_services = [] |
842 | 910 |

0 | 0 | import enum |
1 | 1 | import itertools |
2 | import json | |
3 | 2 | import logging |
4 | 3 | import os |
5 | 4 | import re |
44 | 43 | from .const import NANOCPUS_SCALE |
45 | 44 | from .const import WINDOWS_LONGPATH_PREFIX |
46 | 45 | from .container import Container |
46 | from .errors import CompletedUnsuccessfully | |
47 | 47 | from .errors import HealthCheckFailed |
48 | 48 | from .errors import NoHealthCheckConfigured |
49 | 49 | from .errors import OperationFailedError |
76 | 76 | 'cpuset', |
77 | 77 | 'device_cgroup_rules', |
78 | 78 | 'devices', |
79 | 'device_requests', | |
79 | 80 | 'dns', |
80 | 81 | 'dns_search', |
81 | 82 | 'dns_opt', |
110 | 111 | |
111 | 112 | CONDITION_STARTED = 'service_started' |
112 | 113 | CONDITION_HEALTHY = 'service_healthy' |
114 | CONDITION_COMPLETED_SUCCESSFULLY = 'service_completed_successfully' | |
113 | 115 | |
114 | 116 | |
115 | 117 | class BuildError(Exception): |
710 | 712 | 'image_id': image_id(), |
711 | 713 | 'links': self.get_link_names(), |
712 | 714 | 'net': self.network_mode.id, |
715 | 'ipc_mode': self.ipc_mode.mode, | |
713 | 716 | 'networks': self.networks, |
714 | 717 | 'secrets': self.secrets, |
715 | 718 | 'volumes_from': [ |
716 | 719 | (v.source.name, v.mode) |
717 | 720 | for v in self.volumes_from if isinstance(v.source, Service) |
718 | ], | |
721 | ] | |
719 | 722 | } |
720 | 723 | |
721 | 724 | def get_dependency_names(self): |
751 | 754 | configs[svc] = lambda s: True |
752 | 755 | elif config['condition'] == CONDITION_HEALTHY: |
753 | 756 | configs[svc] = lambda s: s.is_healthy() |
757 | elif config['condition'] == CONDITION_COMPLETED_SUCCESSFULLY: | |
758 | configs[svc] = lambda s: s.is_completed_successfully() | |
754 | 759 | else: |
755 | 760 | # The config schema already prevents this, but it might be |
756 | 761 | # bypassed if Compose is called programmatically. |
1015 | 1020 | privileged=options.get('privileged', False), |
1016 | 1021 | network_mode=self.network_mode.mode, |
1017 | 1022 | devices=options.get('devices'), |
1023 | device_requests=options.get('device_requests'), | |
1018 | 1024 | dns=options.get('dns'), |
1019 | 1025 | dns_opt=options.get('dns_opt'), |
1020 | 1026 | dns_search=options.get('dns_search'), |
1100 | 1106 | 'Impossible to perform platform-targeted builds for API version < 1.35' |
1101 | 1107 | ) |
1102 | 1108 | |
1103 | builder = self.client if not cli else _CLIBuilder(progress) | |
1104 | build_output = builder.build( | |
1109 | builder = _ClientBuilder(self.client) if not cli else _CLIBuilder(progress) | |
1110 | return builder.build( | |
1111 | service=self, | |
1105 | 1112 | path=path, |
1106 | 1113 | tag=self.image_name, |
1107 | 1114 | rm=rm, |
1122 | 1129 | gzip=gzip, |
1123 | 1130 | isolation=build_opts.get('isolation', self.options.get('isolation', None)), |
1124 | 1131 | platform=self.platform, |
1125 | ) | |
1126 | ||
1127 | try: | |
1128 | all_events = list(stream_output(build_output, output_stream)) | |
1129 | except StreamOutputError as e: | |
1130 | raise BuildError(self, str(e)) | |
1131 | ||
1132 | # Ensure the HTTP connection is not reused for another | |
1133 | # streaming command, as the Docker daemon can sometimes | |
1134 | # complain about it | |
1135 | self.client.close() | |
1136 | ||
1137 | image_id = None | |
1138 | ||
1139 | for event in all_events: | |
1140 | if 'stream' in event: | |
1141 | match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', '')) | |
1142 | if match: | |
1143 | image_id = match.group(1) | |
1144 | ||
1145 | if image_id is None: | |
1146 | raise BuildError(self, event if all_events else 'Unknown') | |
1147 | ||
1148 | return image_id | |
1132 | output_stream=output_stream) | |
1149 | 1133 | |
1150 | 1134 | def get_cache_from(self, build_opts): |
1151 | 1135 | cache_from = build_opts.get('cache_from', None) |
1301 | 1285 | raise HealthCheckFailed(ctnr.short_id) |
1302 | 1286 | return result |
1303 | 1287 | |
1288 | def is_completed_successfully(self): | |
 1289 | """ Check that all containers for this service have completed successfully. | 
 1290 | Returns False if at least one container has not exited, and | 
 1291 | raises a CompletedUnsuccessfully exception if at least one container | 
 1292 | exited with a non-zero exit code. | 
1293 | """ | |
1294 | result = True | |
1295 | for ctnr in self.containers(stopped=True): | |
1296 | ctnr.inspect() | |
1297 | if ctnr.get('State.Status') != 'exited': | |
1298 | result = False | |
1299 | elif ctnr.exit_code != 0: | |
1300 | raise CompletedUnsuccessfully(ctnr.short_id, ctnr.exit_code) | |
1301 | return result | |
1302 | ||
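The new `service_completed_successfully` condition above slots into the existing `depends_on` condition dispatch. A simplified sketch of that mapping (a hypothetical `FakeService` stands in for compose's `Service`):

```python
CONDITION_STARTED = 'service_started'
CONDITION_HEALTHY = 'service_healthy'
CONDITION_COMPLETED_SUCCESSFULLY = 'service_completed_successfully'


def readiness_check(condition):
    # map each depends_on condition to a predicate over the dependency
    checks = {
        CONDITION_STARTED: lambda s: True,
        CONDITION_HEALTHY: lambda s: s.is_healthy(),
        CONDITION_COMPLETED_SUCCESSFULLY: lambda s: s.is_completed_successfully(),
    }
    try:
        return checks[condition]
    except KeyError:
        raise ValueError('Invalid condition: %s' % condition)


class FakeService:
    def is_healthy(self):
        return True

    def is_completed_successfully(self):
        return False


svc = FakeService()
assert readiness_check(CONDITION_STARTED)(svc) is True
assert readiness_check(CONDITION_COMPLETED_SUCCESSFULLY)(svc) is False
```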
1304 | 1303 | def _parse_proxy_config(self): |
1305 | 1304 | client = self.client |
1306 | 1305 | if 'proxies' not in client._general_configs: |
1325 | 1324 | result[permitted[k]] = result[permitted[k].lower()] = v |
1326 | 1325 | |
1327 | 1326 | return result |
1327 | ||
1328 | def get_profiles(self): | |
1329 | if 'profiles' not in self.options: | |
1330 | return [] | |
1331 | ||
1332 | return self.options.get('profiles') | |
1333 | ||
1334 | def enabled_for_profiles(self, enabled_profiles): | |
 1335 | # if the service has no profiles specified, it is always enabled | 
1336 | if 'profiles' not in self.options: | |
1337 | return True | |
1338 | ||
1339 | service_profiles = self.options.get('profiles') | |
1340 | for profile in enabled_profiles: | |
1341 | if profile in service_profiles: | |
1342 | return True | |
1343 | ||
1344 | return False | |
1328 | 1345 | |
1329 | 1346 | |
1330 | 1347 | def short_id_alias_exists(container, network): |
1769 | 1786 | return path |
1770 | 1787 | |
1771 | 1788 | |
1772 | class _CLIBuilder: | |
1773 | def __init__(self, progress): | |
1774 | self._progress = progress | |
1775 | ||
1776 | def build(self, path, tag=None, quiet=False, fileobj=None, | |
1789 | class _ClientBuilder: | |
1790 | def __init__(self, client): | |
1791 | self.client = client | |
1792 | ||
1793 | def build(self, service, path, tag=None, quiet=False, fileobj=None, | |
1777 | 1794 | nocache=False, rm=False, timeout=None, |
1778 | 1795 | custom_context=False, encoding=None, pull=False, |
1779 | 1796 | forcerm=False, dockerfile=None, container_limits=None, |
1780 | 1797 | decode=False, buildargs=None, gzip=False, shmsize=None, |
1781 | 1798 | labels=None, cache_from=None, target=None, network_mode=None, |
1782 | 1799 | squash=None, extra_hosts=None, platform=None, isolation=None, |
1783 | use_config_proxy=True): | |
1800 | use_config_proxy=True, output_stream=sys.stdout): | |
1801 | build_output = self.client.build( | |
1802 | path=path, | |
1803 | tag=tag, | |
1804 | nocache=nocache, | |
1805 | rm=rm, | |
1806 | pull=pull, | |
1807 | forcerm=forcerm, | |
1808 | dockerfile=dockerfile, | |
1809 | labels=labels, | |
1810 | cache_from=cache_from, | |
1811 | buildargs=buildargs, | |
1812 | network_mode=network_mode, | |
1813 | target=target, | |
1814 | shmsize=shmsize, | |
1815 | extra_hosts=extra_hosts, | |
1816 | container_limits=container_limits, | |
1817 | gzip=gzip, | |
1818 | isolation=isolation, | |
1819 | platform=platform) | |
1820 | ||
1821 | try: | |
1822 | all_events = list(stream_output(build_output, output_stream)) | |
1823 | except StreamOutputError as e: | |
1824 | raise BuildError(service, str(e)) | |
1825 | ||
1826 | # Ensure the HTTP connection is not reused for another | |
1827 | # streaming command, as the Docker daemon can sometimes | |
1828 | # complain about it | |
1829 | self.client.close() | |
1830 | ||
1831 | image_id = None | |
1832 | ||
1833 | for event in all_events: | |
1834 | if 'stream' in event: | |
1835 | match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', '')) | |
1836 | if match: | |
1837 | image_id = match.group(1) | |
1838 | ||
1839 | if image_id is None: | |
1840 | raise BuildError(service, event if all_events else 'Unknown') | |
1841 | ||
1842 | return image_id | |
1843 | ||
1844 | ||
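`_ClientBuilder` above recovers the built image ID by scanning the streamed build events for the daemon's "Successfully built" line. The extraction step in isolation:

```python
import re


def extract_image_id(events):
    # scan 'stream' events; the last "Successfully built <id>" wins
    image_id = None
    for event in events:
        if 'stream' in event:
            match = re.search(r'Successfully built ([0-9a-f]+)',
                              event.get('stream', ''))
            if match:
                image_id = match.group(1)
    return image_id


events = [
    {'stream': 'Step 1/2 : FROM alpine\n'},
    {'status': 'pulling'},
    {'stream': 'Successfully built abc123def\n'},
]
assert extract_image_id(events) == 'abc123def'
```

This fragility under BuildKit (which emits no such line) is exactly why `_CLIBuilder` relies on `--iidfile` instead.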
1845 | class _CLIBuilder: | |
1846 | def __init__(self, progress): | |
1847 | self._progress = progress | |
1848 | ||
1849 | def build(self, service, path, tag=None, quiet=False, fileobj=None, | |
1850 | nocache=False, rm=False, timeout=None, | |
1851 | custom_context=False, encoding=None, pull=False, | |
1852 | forcerm=False, dockerfile=None, container_limits=None, | |
1853 | decode=False, buildargs=None, gzip=False, shmsize=None, | |
1854 | labels=None, cache_from=None, target=None, network_mode=None, | |
1855 | squash=None, extra_hosts=None, platform=None, isolation=None, | |
1856 | use_config_proxy=True, output_stream=sys.stdout): | |
1784 | 1857 | """ |
1785 | 1858 | Args: |
 1859 | service (Service): Service to be built | 
1786 | 1860 | path (str): Path to the directory containing the Dockerfile |
1787 | 1861 | buildargs (dict): A dictionary of build arguments |
1788 | 1862 | cache_from (:py:class:`list`): A list of images used for build |
1831 | 1905 | configuration file (``~/.docker/config.json`` by default) |
1832 | 1906 | contains a proxy configuration, the corresponding environment |
1833 | 1907 | variables will be set in the container being built. |
1908 | output_stream (writer): stream to use for build logs | |
1834 | 1909 | Returns: |
1835 | 1910 | A generator for the build output. |
1836 | 1911 | """ |
1837 | if dockerfile: | |
1912 | if dockerfile and os.path.isdir(path): | |
1838 | 1913 | dockerfile = os.path.join(path, dockerfile) |
1839 | 1914 | iidfile = tempfile.mktemp() |
1840 | 1915 | |
1852 | 1927 | command_builder.add_arg("--tag", tag) |
1853 | 1928 | command_builder.add_arg("--target", target) |
1854 | 1929 | command_builder.add_arg("--iidfile", iidfile) |
1930 | command_builder.add_arg("--platform", platform) | |
1931 | command_builder.add_arg("--isolation", isolation) | |
1932 | ||
1933 | if extra_hosts: | |
1934 | if isinstance(extra_hosts, dict): | |
1935 | extra_hosts = ["{}:{}".format(host, ip) for host, ip in extra_hosts.items()] | |
1936 | for host in extra_hosts: | |
1937 | command_builder.add_arg("--add-host", "{}".format(host)) | |
1938 | ||
1855 | 1939 | args = command_builder.build([path]) |
1856 | 1940 | |
1857 | magic_word = "Successfully built " | |
1858 | appear = False | |
1859 | with subprocess.Popen(args, stdout=subprocess.PIPE, | |
1941 | with subprocess.Popen(args, stdout=output_stream, stderr=sys.stderr, | |
1860 | 1942 | universal_newlines=True) as p: |
1861 | while True: | |
1862 | line = p.stdout.readline() | |
1863 | if not line: | |
1864 | break | |
1865 | if line.startswith(magic_word): | |
1866 | appear = True | |
1867 | yield json.dumps({"stream": line}) | |
1868 | ||
1869 | 1943 | p.communicate() |
1870 | 1944 | if p.returncode != 0: |
1871 | raise StreamOutputError() | |
1945 | raise BuildError(service, "Build failed") | |
1872 | 1946 | |
1873 | 1947 | with open(iidfile) as f: |
1874 | 1948 | line = f.readline() |
1875 | 1949 | image_id = line.split(":")[1].strip() |
1876 | 1950 | os.remove(iidfile) |
1877 | 1951 | |
1878 | # In case of `DOCKER_BUILDKIT=1` | |
1879 | # there is no success message already present in the output. | |
1880 | # Since that's the way `Service::build` gets the `image_id` | |
1881 | # it has to be added `manually` | |
1882 | if not appear: | |
1883 | yield json.dumps({"stream": "{}{}\n".format(magic_word, image_id)}) | |
1952 | return image_id | |
1884 | 1953 | |
1885 | 1954 | |
1886 | 1955 | class _CommandBuilder: |
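The `--iidfile` mechanism that `_CLIBuilder` now uses works by having the docker CLI write the built image ID to a temp file, which is read back after the build. A small sketch of the read-back step (the write is simulated here, standing in for what `docker build --iidfile <path>` would produce):

```python
import os
import tempfile


def read_image_id(iidfile):
    # the iidfile contains e.g. "sha256:abc123..."
    with open(iidfile) as f:
        line = f.readline()
    image_id = line.split(":")[1].strip()
    os.remove(iidfile)
    return image_id


# Simulate the CLI writing the iidfile:
fd, iidfile = tempfile.mkstemp()
with os.fdopen(fd, "w") as f:
    f.write("sha256:deadbeefcafe\n")

assert read_image_id(iidfile) == "deadbeefcafe"
assert not os.path.exists(iidfile)  # temp file is cleaned up
```

Unlike parsing build output for a success message, this works identically with and without `DOCKER_BUILDKIT=1`.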
173 | 173 | if len(s) > max_chars: |
174 | 174 | return s[:max_chars - 2] + '...' |
175 | 175 | return s |
176 | ||
177 | ||
178 | def filter_attached_for_up(items, service_names, attach_dependencies=False, | |
179 | item_to_service_name=lambda x: x): | |
 180 | """This function contains the logic for choosing which services to | 
 181 | attach when doing docker-compose up. It may be used with containers, | 
 182 | services, or any other entities that map to service names; | 
 183 | this mapping is provided by item_to_service_name.""" | 
184 | if attach_dependencies or not service_names: | |
185 | return items | |
186 | ||
187 | return [ | |
188 | item | |
189 | for item in items if item_to_service_name(item) in service_names | |
190 | ] |
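A usage sketch of `filter_attached_for_up` above, with plain strings as items so the default identity `item_to_service_name` applies:

```python
def filter_attached_for_up(items, service_names, attach_dependencies=False,
                           item_to_service_name=lambda x: x):
    if attach_dependencies or not service_names:
        return items
    return [item for item in items
            if item_to_service_name(item) in service_names]


services = ['web', 'db', 'cache']
# `docker-compose up web` attaches only to web (deps may still start detached)
assert filter_attached_for_up(services, ['web']) == ['web']
# --attach-dependencies attaches to everything that starts
assert filter_attached_for_up(services, ['web'], attach_dependencies=True) == services
# `docker-compose up` with no names attaches to all services
assert filter_attached_for_up(services, []) == services
```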
137 | 137 | ;; |
138 | 138 | esac |
139 | 139 | |
140 | COMPREPLY=( $( compgen -W "--hash --help --no-interpolate --quiet -q --resolve-image-digests --services --volumes" -- "$cur" ) ) | |
140 | COMPREPLY=( $( compgen -W "--hash --help --no-interpolate --profiles --quiet -q --resolve-image-digests --services --volumes" -- "$cur" ) ) | |
141 | 141 | } |
142 | 142 | |
143 | 143 | |
163 | 163 | _filedir "y?(a)ml" |
164 | 164 | return |
165 | 165 | ;; |
166 | --ansi) | |
167 | COMPREPLY=( $( compgen -W "never always auto" -- "$cur" ) ) | |
168 | return | |
169 | ;; | |
166 | 170 | --log-level) |
167 | 171 | COMPREPLY=( $( compgen -W "debug info warning error critical" -- "$cur" ) ) |
168 | 172 | return |
169 | 173 | ;; |
174 | --profile) | |
175 | COMPREPLY=( $( compgen -W "$(__docker_compose_q config --profiles)" -- "$cur" ) ) | |
176 | return | |
177 | ;; | |
170 | 178 | --project-directory) |
171 | 179 | _filedir -d |
172 | 180 | return |
289 | 297 | |
290 | 298 | case "$cur" in |
291 | 299 | -*) |
292 | COMPREPLY=( $( compgen -W "--follow -f --help --no-color --tail --timestamps -t" -- "$cur" ) ) | |
300 | COMPREPLY=( $( compgen -W "--follow -f --help --no-color --no-log-prefix --tail --timestamps -t" -- "$cur" ) ) | |
293 | 301 | ;; |
294 | 302 | *) |
295 | 303 | __docker_compose_complete_services |
544 | 552 | |
545 | 553 | case "$cur" in |
546 | 554 | -*) |
547 | COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) ) | |
555 | COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-log-prefix --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) ) | |
548 | 556 | ;; |
549 | 557 | *) |
550 | 558 | __docker_compose_complete_services |
613 | 621 | --tlskey |
614 | 622 | " |
615 | 623 | |
616 | # These options are require special treatment when searching the command. | |
624 | # These options require special treatment when searching the command. | |
617 | 625 | local top_level_options_with_args=" |
626 | --ansi | |
618 | 627 | --log-level |
628 | --profile | |
619 | 629 | " |
620 | 630 | |
621 | 631 | COMPREPLY=() |
20 | 20 | complete -c docker-compose -l tlskey -r -d 'Path to TLS key file' |
21 | 21 | complete -c docker-compose -l tlsverify -d 'Use TLS and verify the remote' |
22 | 22 | complete -c docker-compose -l skip-hostname-check -d "Don't check the daemon's hostname against the name specified in the client certificate (for example if your docker host is an IP address)" |
23 | complete -c docker-compose -l no-ansi -d 'Do not print ANSI control characters' | |
24 | complete -c docker-compose -l ansi -a 'never always auto' -d 'Control when to print ANSI control characters' | |
23 | 25 | complete -c docker-compose -s h -l help -d 'Print usage' |
24 | 26 | complete -c docker-compose -s v -l version -d 'Print version and exit' |
341 | 341 | '--verbose[Show more output]' \ |
342 | 342 | '--log-level=[Set log level]:level:(DEBUG INFO WARNING ERROR CRITICAL)' \ |
343 | 343 | '--no-ansi[Do not print ANSI control characters]' \ |
344 | '--ansi=[Control when to print ANSI control characters]:when:(never always auto)' \ | |
344 | 345 | '(-H --host)'{-H,--host}'[Daemon socket to connect to]:host:' \ |
345 | 346 | '--tls[Use TLS; implied by --tlsverify]' \ |
346 | 347 | '--tlscacert=[Trust certs signed only by this CA]:ca path:' \ |
22 | 22 | 'DATA' |
23 | 23 | ), |
24 | 24 | ( |
25 | 'compose/config/config_schema_compose_spec.json', | |
26 | 'compose/config/config_schema_compose_spec.json', | |
25 | 'compose/config/compose_spec.json', | |
26 | 'compose/config/compose_spec.json', | |
27 | 27 | 'DATA' |
28 | 28 | ), |
29 | 29 | ( |
31 | 31 | 'DATA' |
32 | 32 | ), |
33 | 33 | ( |
34 | 'compose/config/config_schema_compose_spec.json', | |
35 | 'compose/config/config_schema_compose_spec.json', | |
34 | 'compose/config/compose_spec.json', | |
35 | 'compose/config/compose_spec.json', | |
36 | 36 | 'DATA' |
37 | 37 | ), |
38 | 38 | ( |
0 | 0 | Click==7.1.2 |
1 | coverage==5.2.1 | |
1 | coverage==5.5 | |
2 | 2 | ddt==1.4.1 |
3 | 3 | flake8==3.8.3 |
4 | gitpython==3.1.7 | |
4 | gitpython==3.1.11 | |
5 | 5 | mock==3.0.5 |
6 | 6 | pytest==6.0.1; python_version >= '3.5' |
7 | 7 | pytest==4.6.5; python_version < '3.5' |
0 | 0 | altgraph==0.17 |
1 | 1 | appdirs==1.4.4 |
2 | attrs==20.1.0 | |
3 | bcrypt==3.1.7 | |
4 | cffi==1.14.1 | |
5 | cryptography==3.0 | |
2 | attrs==20.3.0 | |
3 | bcrypt==3.2.0 | |
4 | cffi==1.14.4 | |
5 | cryptography==3.3.2 | |
6 | 6 | distlib==0.3.1 |
7 | 7 | entrypoints==0.3 |
8 | 8 | filelock==3.0.12 |
9 | 9 | gitdb2==4.0.2 |
10 | 10 | mccabe==0.6.1 |
11 | more-itertools==8.4.0; python_version >= '3.5' | |
11 | more-itertools==8.6.0; python_version >= '3.5' | |
12 | 12 | more-itertools==5.0.0; python_version < '3.5' |
13 | packaging==20.4 | |
13 | packaging==20.9 | |
14 | 14 | pluggy==0.13.1 |
15 | py==1.9.0 | |
15 | py==1.10.0 | |
16 | 16 | pycodestyle==2.6.0 |
17 | 17 | pycparser==2.20 |
18 | 18 | pyflakes==2.2.0 |
22 | 22 | smmap==3.0.4 |
23 | 23 | smmap2==3.0.1 |
24 | 24 | toml==0.10.1 |
25 | tox==3.19.0 | |
26 | virtualenv==20.0.30 | |
25 | tox==3.21.2 | |
26 | virtualenv==20.4.0 | |
27 | 27 | wcwidth==0.2.5 |
0 | 0 | backports.shutil_get_terminal_size==1.0.0 |
1 | cached-property==1.5.1 | |
1 | cached-property==1.5.1; python_version < '3.8' | |
2 | 2 | certifi==2020.6.20 |
3 | 3 | chardet==3.0.4 |
4 | 4 | colorama==0.4.3; sys_platform == 'win32' |
5 | 5 | distro==1.5.0 |
6 | docker==4.3.1 | |
6 | docker==5.0.0 | |
7 | 7 | docker-pycreds==0.4.0 |
8 | 8 | dockerpty==0.4.1 |
9 | 9 | docopt==0.6.2 |
11 | 11 | ipaddress==1.0.23 |
12 | 12 | jsonschema==3.2.0 |
13 | 13 | paramiko==2.7.1 |
14 | pypiwin32==219; sys_platform == 'win32' and python_version < '3.6' | |
15 | pypiwin32==223; sys_platform == 'win32' and python_version >= '3.6' | |
16 | 14 | PySocks==1.7.1 |
17 | python-dotenv==0.14.0 | |
18 | PyYAML==5.3.1 | |
15 | python-dotenv==0.17.0 | |
16 | pywin32==227; sys_platform == 'win32' | |
17 | PyYAML==5.4.1 | |
19 | 18 | requests==2.24.0 |
20 | 19 | texttable==1.6.2 |
21 | 20 | urllib3==1.25.10; python_version == '3.3' |
4 | 4 | ./script/clean |
5 | 5 | |
6 | 6 | DOCKER_COMPOSE_GITSHA="$(script/build/write-git-sha)" |
7 | TAG="docker/compose:tmp-glibc-linux-binary-${DOCKER_COMPOSE_GITSHA}" | |
8 | 7 | |
9 | docker build -t "${TAG}" . \ | |
10 | --build-arg BUILD_PLATFORM=debian \ | |
11 | --build-arg GIT_COMMIT="${DOCKER_COMPOSE_GITSHA}" | |
12 | TMP_CONTAINER=$(docker create "${TAG}") | |
13 | mkdir -p dist | |
8 | docker build . \ | |
9 | --target bin \ | |
10 | --build-arg DISTRO=debian \ | |
11 | --build-arg GIT_COMMIT="${DOCKER_COMPOSE_GITSHA}" \ | |
12 | --output dist/ | |
14 | 13 | ARCH=$(uname -m) |
15 | docker cp "${TMP_CONTAINER}":/usr/local/bin/docker-compose "dist/docker-compose-Linux-${ARCH}" | |
16 | docker container rm -f "${TMP_CONTAINER}" | |
17 | docker image rm -f "${TAG}" | |
14 | # Ensure that we output the binary with the same name as we did before | |
15 | mv dist/docker-compose-linux-amd64 "dist/docker-compose-Linux-${ARCH}" |
23 | 23 | git clone --single-branch --branch develop https://github.com/pyinstaller/pyinstaller.git /tmp/pyinstaller |
24 | 24 | cd /tmp/pyinstaller/bootloader |
25 | 25 | # Checkout commit corresponding to version in requirements-build |
26 | git checkout v3.6 | |
26 | git checkout v4.1 | |
27 | 27 | "${VENV}"/bin/python3 ./waf configure --no-lsb all |
28 | 28 | "${VENV}"/bin/pip3 install .. |
29 | 29 | cd "${CODE_PATH}" |
12 | 12 | DOCKER_COMPOSE_GITSHA="$(script/build/write-git-sha)" |
13 | 13 | docker build -t "${IMAGE}:${TAG}" . \ |
14 | 14 | --target build \ |
15 | --build-arg BUILD_PLATFORM="debian" \ | |
15 | --build-arg DISTRO="debian" \ | |
16 | 16 | --build-arg GIT_COMMIT="${DOCKER_COMPOSE_GITSHA}" |
17 | 17 | docker tag "${IMAGE}":"${TAG}" "${IMAGE}":latest |
5 | 5 | # |
6 | 6 | # http://git-scm.com/download/win |
7 | 7 | # |
8 | # 2. Install Python 3.7.x: | |
8 | # 2. Install Python 3.9.x: | |
9 | 9 | # |
10 | 10 | # https://www.python.org/downloads/ |
11 | 11 | # |
12 | # 3. Append ";C:\Python37;C:\Python37\Scripts" to the "Path" environment variable: | |
12 | # 3. Append ";C:\Python39;C:\Python39\Scripts" to the "Path" environment variable: | |
13 | 13 | # |
14 | 14 | # https://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/sysdm_advancd_environmnt_addchange_variable.mspx?mfr=true |
15 | 15 | # |
16 | 16 | # 4. In Powershell, run the following commands: |
17 | 17 | # |
18 | # $ pip install 'virtualenv==20.0.30' | |
18 | # $ pip install 'virtualenv==20.2.2' | |
19 | 19 | # $ Set-ExecutionPolicy -Scope CurrentUser RemoteSigned |
20 | 20 | # |
21 | 21 | # 5. Clone the repository: |
38 | 38 | Get-ChildItem -Recurse -Include *.pyc | foreach ($_) { Remove-Item $_.FullName } |
39 | 39 | |
40 | 40 | # Create virtualenv |
41 | virtualenv -p C:\Python37\python.exe .\venv | |
41 | virtualenv -p C:\Python39\python.exe .\venv | |
42 | 42 | |
43 | 43 | # pip and pyinstaller generate lots of warnings, so we need to ignore them |
44 | 44 | $ErrorActionPreference = "Continue" |
14 | 14 | |
15 | 15 | set -e |
16 | 16 | |
17 | VERSION="1.27.4" | |
17 | VERSION="1.29.2" | |
18 | 18 | IMAGE="docker/compose:$VERSION" |
19 | 19 | |
20 | 20 | |
43 | 43 | if [ -n "$COMPOSE_PROJECT_NAME" ]; then |
44 | 44 | COMPOSE_OPTIONS="-e COMPOSE_PROJECT_NAME $COMPOSE_OPTIONS" |
45 | 45 | fi |
46 | # TODO: also check --file argument | |
47 | 46 | if [ -n "$compose_dir" ]; then |
48 | 47 | VOLUMES="$VOLUMES -v $compose_dir:$compose_dir" |
49 | 48 | fi |
50 | 49 | if [ -n "$HOME" ]; then |
51 | 50 | VOLUMES="$VOLUMES -v $HOME:$HOME -e HOME" # Pass in HOME to share docker.config and allow ~/-relative paths to work. |
52 | 51 | fi |
52 | i=$# | |
53 | while [ $i -gt 0 ]; do | |
54 | arg=$1 | |
55 | i=$((i - 1)) | |
56 | shift | |
57 | ||
58 | case "$arg" in | |
59 | -f|--file) | |
60 | value=$1 | |
61 | i=$((i - 1)) | |
62 | shift | |
63 | set -- "$@" "$arg" "$value" | |
64 | ||
65 | file_dir=$(realpath "$(dirname "$value")") | |
66 | VOLUMES="$VOLUMES -v $file_dir:$file_dir" | |
67 | ;; | |
68 | *) set -- "$@" "$arg" ;; | |
69 | esac | |
70 | done | |
71 | ||
72 | # Setup environment variables for compose config and context | |
73 | ENV_OPTIONS=$(printenv | sed -E "/^PATH=.*/d; s/^/-e /g; s/=.*//g; s/\n/ /g") | |
53 | 74 | |
54 | 75 | # Only allocate tty if we detect one |
55 | 76 | if [ -t 0 ] && [ -t 1 ]; then |
66 | 87 | fi |
67 | 88 | |
68 | 89 | # shellcheck disable=SC2086 |
69 | exec docker run --rm $DOCKER_RUN_OPTIONS $DOCKER_ADDR $COMPOSE_OPTIONS $VOLUMES -w "$(pwd)" $IMAGE "$@" | |
90 | exec docker run --rm $DOCKER_RUN_OPTIONS $DOCKER_ADDR $COMPOSE_OPTIONS $ENV_OPTIONS $VOLUMES -w "$(pwd)" $IMAGE "$@" |
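The `ENV_OPTIONS` pipeline above (`printenv | sed -E "/^PATH=.*/d; s/^/-e /g; s/=.*//g; s/\n/ /g"`) turns every host environment variable except `PATH` into a `-e NAME` flag, so `docker run` forwards each variable's value into the container. The same transformation rendered in Python, as a readable equivalent rather than what the script executes:

```python
def env_options(environ):
    # equivalent of: printenv | sed -E "/^PATH=.*/d; s/^/-e /g; s/=.*//g"
    # drop PATH (the container has its own), keep only the names
    return " ".join("-e " + name for name in environ if name != "PATH")


opts = env_options({"HOME": "/root", "PATH": "/usr/bin", "TERM": "xterm"})
assert opts == "-e HOME -e TERM"
```

Passing `-e NAME` with no `=value` tells `docker run` to read the value from its own environment, which is why the names alone suffice.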
12 | 12 | SDK_SHA1=dd228a335194e3392f1904ce49aff1b1da26ca62 |
13 | 13 | fi |
14 | 14 | |
15 | OPENSSL_VERSION=1.1.1g | |
15 | OPENSSL_VERSION=1.1.1h | |
16 | 16 | OPENSSL_URL=https://www.openssl.org/source/openssl-${OPENSSL_VERSION}.tar.gz |
17 | OPENSSL_SHA1=b213a293f2127ec3e323fb3cfc0c9807664fd997 | |
17 | OPENSSL_SHA1=8d0d099e8973ec851368c8c775e05e1eadca1794 | |
18 | 18 | |
19 | PYTHON_VERSION=3.7.7 | |
19 | PYTHON_VERSION=3.9.0 | |
20 | 20 | PYTHON_URL=https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz |
21 | PYTHON_SHA1=8e9968663a214aea29659ba9dfa959e8a7d82b39 | |
21 | PYTHON_SHA1=5744a10ba989d2badacbab3c00cdcb83c83106c7 | |
22 | 22 | |
23 | 23 | # |
24 | 24 | # Install prerequisites. |
35 | 35 | brew install python3 |
36 | 36 | fi |
37 | 37 | if ! [ -x "$(command -v virtualenv)" ]; then |
38 | pip3 install virtualenv==20.0.30 | |
38 | pip3 install virtualenv==20.2.2 | |
39 | 39 | fi |
40 | 40 | |
41 | 41 | # |
20 | 20 | DOCKER_VERSIONS=$($get_versions -n 2 recent) |
21 | 21 | fi |
22 | 22 | |
23 | ||
24 | 23 | BUILD_NUMBER=${BUILD_NUMBER-$USER} |
25 | 24 | PY_TEST_VERSIONS=${PY_TEST_VERSIONS:-py37} |
26 | 25 | |
38 | 37 | |
39 | 38 | trap "on_exit" EXIT |
40 | 39 | |
41 | repo="dockerswarm/dind" | |
42 | ||
43 | 40 | docker run \ |
44 | 41 | -d \ |
45 | 42 | --name "$daemon_container" \ |
46 | 43 | --privileged \ |
47 | 44 | --volume="/var/lib/docker" \ |
48 | "$repo:$version" \ | |
45 | -e "DOCKER_TLS_CERTDIR=" \ | |
46 | "docker:$version-dind" \ | |
49 | 47 | dockerd -H tcp://0.0.0.0:2375 $DOCKER_DAEMON_ARGS \ |
50 | 48 | 2>&1 | tail -n 10 |
49 | ||
50 | docker exec "$daemon_container" sh -c "apk add --no-cache git" | |
51 | ||
52 | # copy docker config from host for authentication with Docker Hub | |
53 | docker exec "$daemon_container" sh -c "mkdir /root/.docker" | |
54 | docker cp /root/.docker/config.json $daemon_container:/root/.docker/config.json | |
55 | docker exec "$daemon_container" sh -c "chmod 644 /root/.docker/config.json" | |
51 | 56 | |
52 | 57 | docker run \ |
53 | 58 | --rm \ |
24 | 24 | |
25 | 25 | |
26 | 26 | install_requires = [ |
27 | 'cached-property >= 1.2.0, < 2', | |
28 | 27 | 'docopt >= 0.6.1, < 1', |
29 | 28 | 'PyYAML >= 3.10, < 6', |
30 | 29 | 'requests >= 2.20.0, < 3', |
31 | 30 | 'texttable >= 0.9.0, < 2', |
32 | 31 | 'websocket-client >= 0.32.0, < 1', |
33 | 32 | 'distro >= 1.5.0, < 2', |
34 | 'docker[ssh] >= 4.3.1, < 5', | |
33 | 'docker[ssh] >= 5', | |
35 | 34 | 'dockerpty >= 0.4.1, < 1', |
36 | 35 | 'jsonschema >= 2.5.1, < 4', |
37 | 36 | 'python-dotenv >= 0.13.0, < 1', |
49 | 48 | |
50 | 49 | extras_require = { |
51 | 50 | ':python_version < "3.5"': ['backports.ssl_match_hostname >= 3.5, < 4'], |
51 | ':python_version < "3.8"': ['cached-property >= 1.2.0, < 2'], | |
52 | 52 | ':sys_platform == "win32"': ['colorama >= 0.4, < 1'], |
53 | 53 | 'socks': ['PySocks >= 1.5.6, != 1.5.7, < 2'], |
54 | 54 | 'tests': tests_require, |
101 | 101 | 'Programming Language :: Python :: 3.4', |
102 | 102 | 'Programming Language :: Python :: 3.6', |
103 | 103 | 'Programming Language :: Python :: 3.7', |
104 | 'Programming Language :: Python :: 3.8', | |
105 | 'Programming Language :: Python :: 3.9', | |
104 | 106 | ], |
105 | 107 | ) |
57 | 57 | } |
58 | 58 | |
59 | 59 | |
60 | def start_process(base_dir, options): | |
60 | def start_process(base_dir, options, executable=None, env=None): | |
61 | executable = executable or DOCKER_COMPOSE_EXECUTABLE | |
61 | 62 | proc = subprocess.Popen( |
62 | [DOCKER_COMPOSE_EXECUTABLE] + options, | |
63 | [executable] + options, | |
63 | 64 | stdin=subprocess.PIPE, |
64 | 65 | stdout=subprocess.PIPE, |
65 | 66 | stderr=subprocess.PIPE, |
66 | cwd=base_dir) | |
67 | cwd=base_dir, | |
68 | env=env, | |
69 | ) | |
67 | 70 | print("Running process: %s" % proc.pid) |
68 | 71 | return proc |
69 | 72 | |
77 | 80 | return ProcessResult(stdout.decode('utf-8'), stderr.decode('utf-8')) |
78 | 81 | |
79 | 82 | |
80 | def dispatch(base_dir, options, project_options=None, returncode=0, stdin=None): | |
83 | def dispatch(base_dir, options, | |
84 | project_options=None, returncode=0, stdin=None, executable=None, env=None): | |
81 | 85 | project_options = project_options or [] |
82 | proc = start_process(base_dir, project_options + options) | |
86 | proc = start_process(base_dir, project_options + options, executable=executable, env=env) | |
83 | 87 | return wait_on_process(proc, returncode=returncode, stdin=stdin) |
84 | 88 | |
85 | 89 | |
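The `start_process`/`dispatch` changes above add optional `executable` and `env` overrides; a minimal standalone sketch of the same pattern, with `echo` standing in for `DOCKER_COMPOSE_EXECUTABLE` so it runs without compose installed:

```python
import subprocess

# "echo" is a stand-in default; the real helper uses DOCKER_COMPOSE_EXECUTABLE.
DEFAULT_EXECUTABLE = "echo"

def start_process(base_dir, options, executable=None, env=None):
    # Fall back to the default binary when no override is supplied.
    executable = executable or DEFAULT_EXECUTABLE
    return subprocess.Popen(
        [executable] + options,
        stdin=subprocess.PIPE,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        cwd=base_dir,
        env=env,  # env=None inherits the parent's environment
    )

proc = start_process(".", ["hello"])
stdout, _ = proc.communicate()
```

Passing `env=None` keeps the old behavior, which is why existing callers of `dispatch` need no changes.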
231 | 235 | |
232 | 236 | result = self.dispatch(['-H=tcp://doesnotexist:8000', 'ps'], returncode=1) |
233 | 237 | assert "Couldn't connect to Docker daemon" in result.stderr |
238 | ||
239 | def test_config_list_profiles(self): | |
240 | self.base_dir = 'tests/fixtures/config-profiles' | |
241 | result = self.dispatch(['config', '--profiles']) | |
242 | assert set(result.stdout.rstrip().split('\n')) == {'debug', 'frontend', 'gui'} | |
234 | 243 | |
235 | 244 | def test_config_list_services(self): |
236 | 245 | self.base_dir = 'tests/fixtures/v2-full' |
782 | 791 | assert BUILD_CACHE_TEXT not in result.stdout |
783 | 792 | assert BUILD_PULL_TEXT in result.stdout |
784 | 793 | |
794 | @mock.patch.dict(os.environ) | |
785 | 795 | def test_build_log_level(self): |
796 | os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0' | |
797 | os.environ['DOCKER_BUILDKIT'] = '0' | |
798 | self.test_env_file_relative_to_compose_file() | |
786 | 799 | self.base_dir = 'tests/fixtures/simple-dockerfile' |
787 | 800 | result = self.dispatch(['--log-level', 'warning', 'build', 'simple']) |
788 | 801 | assert result.stderr == '' |
844 | 857 | for c in self.project.client.containers(all=True): |
845 | 858 | self.addCleanup(self.project.client.remove_container, c, force=True) |
846 | 859 | |
860 | @mock.patch.dict(os.environ) | |
847 | 861 | def test_build_shm_size_build_option(self): |
862 | os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0' | |
848 | 863 | pull_busybox(self.client) |
849 | 864 | self.base_dir = 'tests/fixtures/build-shm-size' |
850 | 865 | result = self.dispatch(['build', '--no-cache'], None) |
851 | 866 | assert 'shm_size: 96' in result.stdout |
852 | 867 | |
868 | @mock.patch.dict(os.environ) | |
853 | 869 | def test_build_memory_build_option(self): |
870 | os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0' | |
854 | 871 | pull_busybox(self.client) |
855 | 872 | self.base_dir = 'tests/fixtures/build-memory' |
856 | 873 | result = self.dispatch(['build', '--no-cache', '--memory', '96m', 'service'], None) |
1717 | 1734 | |
1718 | 1735 | shareable_mode_container = self.project.get_service('shareable').containers()[0] |
1719 | 1736 | assert shareable_mode_container.get('HostConfig.IpcMode') == 'shareable' |
1737 | ||
1738 | def test_profiles_up_with_no_profile(self): | |
1739 | self.base_dir = 'tests/fixtures/profiles' | |
1740 | self.dispatch(['up']) | |
1741 | ||
1742 | containers = self.project.containers(stopped=True) | |
1743 | service_names = [c.service for c in containers] | |
1744 | ||
1745 | assert 'foo' in service_names | |
1746 | assert len(containers) == 1 | |
1747 | ||
1748 | def test_profiles_up_with_profile(self): | |
1749 | self.base_dir = 'tests/fixtures/profiles' | |
1750 | self.dispatch(['--profile', 'test', 'up']) | |
1751 | ||
1752 | containers = self.project.containers(stopped=True) | |
1753 | service_names = [c.service for c in containers] | |
1754 | ||
1755 | assert 'foo' in service_names | |
1756 | assert 'bar' in service_names | |
1757 | assert 'baz' in service_names | |
1758 | assert len(containers) == 3 | |
1759 | ||
1760 | def test_profiles_up_invalid_dependency(self): | |
1761 | self.base_dir = 'tests/fixtures/profiles' | |
1762 | result = self.dispatch(['--profile', 'debug', 'up'], returncode=1) | |
1763 | ||
1764 | assert ('Service "bar" was pulled in as a dependency of service "zot" ' | |
1765 | 'but is not enabled by the active profiles.') in result.stderr | |
1766 | ||
1767 | def test_profiles_up_with_multiple_profiles(self): | |
1768 | self.base_dir = 'tests/fixtures/profiles' | |
1769 | self.dispatch(['--profile', 'debug', '--profile', 'test', 'up']) | |
1770 | ||
1771 | containers = self.project.containers(stopped=True) | |
1772 | service_names = [c.service for c in containers] | |
1773 | ||
1774 | assert 'foo' in service_names | |
1775 | assert 'bar' in service_names | |
1776 | assert 'baz' in service_names | |
1777 | assert 'zot' in service_names | |
1778 | assert len(containers) == 4 | |
1779 | ||
1780 | def test_profiles_up_with_profile_enabled_by_service(self): | |
1781 | self.base_dir = 'tests/fixtures/profiles' | |
1782 | self.dispatch(['up', 'bar']) | |
1783 | ||
1784 | containers = self.project.containers(stopped=True) | |
1785 | service_names = [c.service for c in containers] | |
1786 | ||
1787 | assert 'bar' in service_names | |
1788 | assert len(containers) == 1 | |
1789 | ||
1790 | def test_profiles_up_with_dependency_and_profile_enabled_by_service(self): | |
1791 | self.base_dir = 'tests/fixtures/profiles' | |
1792 | self.dispatch(['up', 'baz']) | |
1793 | ||
1794 | containers = self.project.containers(stopped=True) | |
1795 | service_names = [c.service for c in containers] | |
1796 | ||
1797 | assert 'bar' in service_names | |
1798 | assert 'baz' in service_names | |
1799 | assert len(containers) == 2 | |
1800 | ||
1801 | def test_profiles_up_with_invalid_dependency_for_target_service(self): | |
1802 | self.base_dir = 'tests/fixtures/profiles' | |
1803 | result = self.dispatch(['up', 'zot'], returncode=1) | |
1804 | ||
1805 | assert ('Service "bar" was pulled in as a dependency of service "zot" ' | |
1806 | 'but is not enabled by the active profiles.') in result.stderr | |
1807 | ||
1808 | def test_profiles_up_with_profile_for_dependency(self): | |
1809 | self.base_dir = 'tests/fixtures/profiles' | |
1810 | self.dispatch(['--profile', 'test', 'up', 'zot']) | |
1811 | ||
1812 | containers = self.project.containers(stopped=True) | |
1813 | service_names = [c.service for c in containers] | |
1814 | ||
1815 | assert 'bar' in service_names | |
1816 | assert 'zot' in service_names | |
1817 | assert len(containers) == 2 | |
1818 | ||
1819 | def test_profiles_up_with_merged_profiles(self): | |
1820 | self.base_dir = 'tests/fixtures/profiles' | |
1821 | self.dispatch(['-f', 'docker-compose.yml', '-f', 'merge-profiles.yml', 'up', 'zot']) | |
1822 | ||
1823 | containers = self.project.containers(stopped=True) | |
1824 | service_names = [c.service for c in containers] | |
1825 | ||
1826 | assert 'bar' in service_names | |
1827 | assert 'zot' in service_names | |
1828 | assert len(containers) == 2 | |
1720 | 1829 | |
1721 | 1830 | def test_exec_without_tty(self): |
1722 | 1831 | self.base_dir = 'tests/fixtures/links-composefile' |
3033 | 3142 | another = self.project.get_service('--log-service') |
3034 | 3143 | assert len(service.containers()) == 1 |
3035 | 3144 | assert len(another.containers()) == 1 |
3145 | ||
3146 | def test_up_no_log_prefix(self): | |
3147 | self.base_dir = 'tests/fixtures/echo-services' | |
3148 | result = self.dispatch(['up', '--no-log-prefix']) | |
3149 | ||
3150 | assert 'simple' in result.stdout | |
3151 | assert 'another' in result.stdout | |
3152 | assert 'exited with code 0' in result.stdout | |
3153 | assert 'exited with code 0' in result.stdout |
0 | version: '3.8' | |
1 | services: | |
2 | frontend: | |
3 | image: frontend | |
4 | profiles: ["frontend", "gui"] | |
5 | phpmyadmin: | |
6 | image: phpmyadmin | |
7 | depends_on: | |
8 | - db | |
9 | profiles: | |
10 | - debug | |
11 | backend: | |
12 | image: backend | |
13 | db: | |
14 | image: mysql |
0 | WHEREAMI=default |
0 | version: "3" | |
1 | services: | |
2 | foo: | |
3 | image: busybox:1.31.0-uclibc | |
4 | bar: | |
5 | image: busybox:1.31.0-uclibc | |
6 | profiles: | |
7 | - test | |
8 | baz: | |
9 | image: busybox:1.31.0-uclibc | |
10 | depends_on: | |
11 | - bar | |
12 | profiles: | |
13 | - test | |
14 | zot: | |
15 | image: busybox:1.31.0-uclibc | |
16 | depends_on: | |
17 | - bar | |
18 | profiles: | |
19 | - debug |
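The fixture above drives the `test_profiles_up_*` cases; a minimal sketch of the selection rule those tests assert (a hypothetical helper, not the real compose API): a service without `profiles` is always enabled, one with `profiles` runs only when at least one of them is active.

```python
# Service -> profiles mapping, mirroring the fixture above.
SERVICES = {
    'foo': [],
    'bar': ['test'],
    'baz': ['test'],
    'zot': ['debug'],
}

def enabled_services(active_profiles):
    # A service is enabled if it declares no profiles, or if any of
    # its profiles is in the active set.
    active = set(active_profiles)
    return sorted(
        name for name, profiles in SERVICES.items()
        if not profiles or active.intersection(profiles)
    )
```

Note this sketch does not model the dependency check: `zot` depends on `bar`, so `--profile debug` alone fails in the tests because `bar` is not enabled.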
0 | 0 | import tempfile |
1 | 1 | |
2 | import pytest | |
2 | 3 | from ddt import data |
3 | 4 | from ddt import ddt |
4 | 5 | |
7 | 8 | from compose.cli.command import get_project |
8 | 9 | from compose.cli.command import project_from_options |
9 | 10 | from compose.config.environment import Environment |
11 | from compose.config.errors import EnvFileNotFound | |
10 | 12 | from tests.integration.testcases import DockerClientTestCase |
11 | 13 | |
12 | 14 | |
54 | 56 | class EnvironmentOverrideFileTest(DockerClientTestCase): |
55 | 57 | def test_env_file_override(self): |
56 | 58 | base_dir = 'tests/fixtures/env-file-override' |
59 | # '--env-file' paths are relative to the current working dir | |
60 | env = Environment.from_env_file(base_dir, base_dir+'/.env.override') | |
57 | 61 | dispatch(base_dir, ['--env-file', '.env.override', 'up']) |
58 | 62 | project = get_project(project_dir=base_dir, |
59 | 63 | config_path=['docker-compose.yml'], |
60 | environment=Environment.from_env_file(base_dir, '.env.override'), | |
64 | environment=env, | |
61 | 65 | override_dir=base_dir) |
62 | 66 | containers = project.containers(stopped=True) |
63 | 67 | assert len(containers) == 1 |
64 | 68 | assert "WHEREAMI=override" in containers[0].get('Config.Env') |
65 | 69 | assert "DEFAULT_CONF_LOADED=true" in containers[0].get('Config.Env') |
66 | 70 | dispatch(base_dir, ['--env-file', '.env.override', 'down'], None) |
71 | ||
72 | def test_env_file_not_found_error(self): | |
73 | base_dir = 'tests/fixtures/env-file-override' | |
74 | with pytest.raises(EnvFileNotFound) as excinfo: | |
75 | Environment.from_env_file(base_dir, '.env.override') | |
76 | ||
77 | assert "Couldn't find env file" in excinfo.exconly() | |
78 | ||
79 | def test_dot_env_file(self): | |
80 | base_dir = 'tests/fixtures/env-file-override' | |
81 | # '.env' is relative to the project_dir (base_dir) | |
82 | env = Environment.from_env_file(base_dir, None) | |
83 | dispatch(base_dir, ['up']) | |
84 | project = get_project(project_dir=base_dir, | |
85 | config_path=['docker-compose.yml'], | |
86 | environment=env, | |
87 | override_dir=base_dir) | |
88 | containers = project.containers(stopped=True) | |
89 | assert len(containers) == 1 | |
90 | assert "WHEREAMI=default" in containers[0].get('Config.Env') | |
91 | dispatch(base_dir, ['down'], None) |
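These tests depend on the path rule introduced in 1.29: an explicit `--env-file` resolves against the current working directory, while the default `.env` lives in the project directory. A minimal sketch of that rule (`resolve_env_file` is a hypothetical helper, not compose's API):

```python
import os

def resolve_env_file(project_dir, env_file=None):
    # Explicit --env-file: relative to the current working directory.
    if env_file is not None:
        return os.path.join(os.getcwd(), env_file)
    # Default: .env in the project directory.
    return os.path.join(project_dir, '.env')
```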
0 | import logging | |
1 | import os | |
2 | import socket | |
3 | from http.server import BaseHTTPRequestHandler | |
4 | from http.server import HTTPServer | |
5 | from threading import Thread | |
6 | ||
7 | import requests | |
8 | from docker.transport import UnixHTTPAdapter | |
9 | ||
10 | from tests.acceptance.cli_test import dispatch | |
11 | from tests.integration.testcases import DockerClientTestCase | |
12 | ||
13 | ||
14 | TEST_SOCKET_FILE = '/tmp/test-metrics-docker-cli.sock' | |
15 | ||
16 | ||
17 | class MetricsTest(DockerClientTestCase): | |
18 | test_session = requests.sessions.Session() | |
19 | test_env = None | |
20 | base_dir = 'tests/fixtures/v3-full' | |
21 | ||
22 | @classmethod | |
23 | def setUpClass(cls): | |
24 | super().setUpClass() | |
25 | MetricsTest.test_session.mount("http+unix://", UnixHTTPAdapter(TEST_SOCKET_FILE)) | |
26 | MetricsTest.test_env = os.environ.copy() | |
27 | MetricsTest.test_env['METRICS_SOCKET_FILE'] = TEST_SOCKET_FILE | |
28 | MetricsServer().start() | |
29 | ||
30 | @classmethod | |
31 | def test_metrics_help(cls): | |
32 | # a bare `docker-compose` command is treated as `--help` | |
33 | dispatch(cls.base_dir, [], env=MetricsTest.test_env) | |
34 | assert cls.get_content() == \ | |
35 | b'{"command": "compose --help", "context": "moby", ' \ | |
36 | b'"source": "docker-compose", "status": "success"}' | |
37 | dispatch(cls.base_dir, ['help', 'run'], env=MetricsTest.test_env) | |
38 | assert cls.get_content() == \ | |
39 | b'{"command": "compose help", "context": "moby", ' \ | |
40 | b'"source": "docker-compose", "status": "success"}' | |
41 | dispatch(cls.base_dir, ['--help'], env=MetricsTest.test_env) | |
42 | assert cls.get_content() == \ | |
43 | b'{"command": "compose --help", "context": "moby", ' \ | |
44 | b'"source": "docker-compose", "status": "success"}' | |
45 | dispatch(cls.base_dir, ['run', '--help'], env=MetricsTest.test_env) | |
46 | assert cls.get_content() == \ | |
47 | b'{"command": "compose --help run", "context": "moby", ' \ | |
48 | b'"source": "docker-compose", "status": "success"}' | |
49 | dispatch(cls.base_dir, ['up', '--help', 'extra_args'], env=MetricsTest.test_env) | |
50 | assert cls.get_content() == \ | |
51 | b'{"command": "compose --help up", "context": "moby", ' \ | |
52 | b'"source": "docker-compose", "status": "success"}' | |
53 | ||
54 | @classmethod | |
55 | def test_metrics_simple_commands(cls): | |
56 | dispatch(cls.base_dir, ['ps'], env=MetricsTest.test_env) | |
57 | assert cls.get_content() == \ | |
58 | b'{"command": "compose ps", "context": "moby", ' \ | |
59 | b'"source": "docker-compose", "status": "success"}' | |
60 | dispatch(cls.base_dir, ['version'], env=MetricsTest.test_env) | |
61 | assert cls.get_content() == \ | |
62 | b'{"command": "compose version", "context": "moby", ' \ | |
63 | b'"source": "docker-compose", "status": "success"}' | |
64 | dispatch(cls.base_dir, ['version', '--yyy'], env=MetricsTest.test_env) | |
65 | assert cls.get_content() == \ | |
66 | b'{"command": "compose version", "context": "moby", ' \ | |
67 | b'"source": "docker-compose", "status": "failure"}' | |
68 | ||
69 | @staticmethod | |
70 | def get_content(): | |
71 | resp = MetricsTest.test_session.get("http+unix://localhost") | |
72 | print(resp.content) | |
73 | return resp.content | |
74 | ||
75 | ||
76 | def start_server(uri=TEST_SOCKET_FILE): | |
77 | try: | |
78 | os.remove(uri) | |
79 | except OSError: | |
80 | pass | |
81 | httpd = HTTPServer(uri, MetricsHTTPRequestHandler, False) | |
82 | sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) | |
83 | sock.bind(TEST_SOCKET_FILE) | |
84 | sock.listen(0) | |
85 | httpd.socket = sock | |
86 | print('Serving on ', uri) | |
87 | httpd.serve_forever() | |
88 | sock.shutdown(socket.SHUT_RDWR) | |
89 | sock.close() | |
90 | os.remove(uri) | |
91 | ||
92 | ||
93 | class MetricsServer: | |
94 | @classmethod | |
95 | def start(cls): | |
96 | t = Thread(target=start_server, daemon=True) | |
97 | t.start() | |
98 | ||
99 | ||
100 | class MetricsHTTPRequestHandler(BaseHTTPRequestHandler): | |
101 | usages = [] | |
102 | ||
103 | def do_GET(self): | |
104 | self.client_address = ('',) # avoid exception in BaseHTTPRequestHandler.log_message() | |
105 | self.send_response(200) | |
106 | self.end_headers() | |
107 | for u in MetricsHTTPRequestHandler.usages: | |
108 | self.wfile.write(u) | |
109 | MetricsHTTPRequestHandler.usages = [] | |
110 | ||
111 | def do_POST(self): | |
112 | self.client_address = ('',) # avoid exception in BaseHTTPRequestHandler.log_message() | |
113 | content_length = int(self.headers['Content-Length']) | |
114 | body = self.rfile.read(content_length) | |
115 | print(body) | |
116 | MetricsHTTPRequestHandler.usages.append(body) | |
117 | self.send_response(200) | |
118 | self.end_headers() | |
119 | ||
120 | ||
121 | if __name__ == '__main__': | |
122 | logging.getLogger("urllib3").propagate = False | |
123 | logging.getLogger("requests").propagate = False | |
124 | start_server() |
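The metrics fixture above swaps a Unix socket into an `HTTPServer` via `bind_and_activate=False`; a self-contained sketch of that trick (the socket path and handler are arbitrary examples):

```python
import os
import socket
from http.server import BaseHTTPRequestHandler, HTTPServer
from threading import Thread

SOCKET_PATH = '/tmp/example-metrics.sock'  # hypothetical socket path

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.client_address = ('',)  # AF_UNIX peers have no (host, port)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b'ok')

def serve():
    try:
        os.remove(SOCKET_PATH)
    except OSError:
        pass
    # bind_and_activate=False skips TCP binding so we can substitute
    # our own AF_UNIX socket, the same trick start_server() uses above.
    httpd = HTTPServer(('localhost', 0), Handler, bind_and_activate=False)
    sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    sock.bind(SOCKET_PATH)
    sock.listen(1)
    httpd.socket = sock
    httpd.serve_forever()

Thread(target=serve, daemon=True).start()
```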
24 | 24 | from compose.const import LABEL_PROJECT |
25 | 25 | from compose.const import LABEL_SERVICE |
26 | 26 | from compose.container import Container |
27 | from compose.errors import CompletedUnsuccessfully | |
27 | 28 | from compose.errors import HealthCheckFailed |
28 | 29 | from compose.errors import NoHealthCheckConfigured |
29 | 30 | from compose.project import Project |
1898 | 1899 | with pytest.raises(NoHealthCheckConfigured): |
1899 | 1900 | svc1.is_healthy() |
1900 | 1901 | |
1902 | def test_project_up_completed_successfully_dependency(self): | |
1903 | config_dict = { | |
1904 | 'version': '2.1', | |
1905 | 'services': { | |
1906 | 'svc1': { | |
1907 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1908 | 'command': 'true' | |
1909 | }, | |
1910 | 'svc2': { | |
1911 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1912 | 'command': 'top', | |
1913 | 'depends_on': { | |
1914 | 'svc1': {'condition': 'service_completed_successfully'}, | |
1915 | } | |
1916 | } | |
1917 | } | |
1918 | } | |
1919 | config_data = load_config(config_dict) | |
1920 | project = Project.from_config( | |
1921 | name='composetest', config_data=config_data, client=self.client | |
1922 | ) | |
1923 | project.up() | |
1924 | ||
1925 | svc1 = project.get_service('svc1') | |
1926 | svc2 = project.get_service('svc2') | |
1927 | ||
1928 | assert 'svc1' in svc2.get_dependency_names() | |
1929 | assert svc2.containers()[0].is_running | |
1930 | assert len(svc1.containers()) == 0 | |
1931 | assert svc1.is_completed_successfully() | |
1932 | ||
1933 | def test_project_up_completed_unsuccessfully_dependency(self): | |
1934 | config_dict = { | |
1935 | 'version': '2.1', | |
1936 | 'services': { | |
1937 | 'svc1': { | |
1938 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1939 | 'command': 'false' | |
1940 | }, | |
1941 | 'svc2': { | |
1942 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1943 | 'command': 'top', | |
1944 | 'depends_on': { | |
1945 | 'svc1': {'condition': 'service_completed_successfully'}, | |
1946 | } | |
1947 | } | |
1948 | } | |
1949 | } | |
1950 | config_data = load_config(config_dict) | |
1951 | project = Project.from_config( | |
1952 | name='composetest', config_data=config_data, client=self.client | |
1953 | ) | |
1954 | with pytest.raises(ProjectError): | |
1955 | project.up() | |
1956 | ||
1957 | svc1 = project.get_service('svc1') | |
1958 | svc2 = project.get_service('svc2') | |
1959 | assert 'svc1' in svc2.get_dependency_names() | |
1960 | assert len(svc2.containers()) == 0 | |
1961 | with pytest.raises(CompletedUnsuccessfully): | |
1962 | svc1.is_completed_successfully() | |
1963 | ||
1964 | def test_project_up_completed_differently_dependencies(self): | |
1965 | config_dict = { | |
1966 | 'version': '2.1', | |
1967 | 'services': { | |
1968 | 'svc1': { | |
1969 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1970 | 'command': 'true' | |
1971 | }, | |
1972 | 'svc2': { | |
1973 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1974 | 'command': 'false' | |
1975 | }, | |
1976 | 'svc3': { | |
1977 | 'image': BUSYBOX_IMAGE_WITH_TAG, | |
1978 | 'command': 'top', | |
1979 | 'depends_on': { | |
1980 | 'svc1': {'condition': 'service_completed_successfully'}, | |
1981 | 'svc2': {'condition': 'service_completed_successfully'}, | |
1982 | } | |
1983 | } | |
1984 | } | |
1985 | } | |
1986 | config_data = load_config(config_dict) | |
1987 | project = Project.from_config( | |
1988 | name='composetest', config_data=config_data, client=self.client | |
1989 | ) | |
1990 | with pytest.raises(ProjectError): | |
1991 | project.up() | |
1992 | ||
1993 | svc1 = project.get_service('svc1') | |
1994 | svc2 = project.get_service('svc2') | |
1995 | svc3 = project.get_service('svc3') | |
1996 | assert ['svc1', 'svc2'] == svc3.get_dependency_names() | |
1997 | assert svc1.is_completed_successfully() | |
1998 | assert len(svc3.containers()) == 0 | |
1999 | with pytest.raises(CompletedUnsuccessfully): | |
2000 | svc2.is_completed_successfully() | |
2001 | ||
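The `service_completed_successfully` condition these tests exercise corresponds to the following Compose file fragment (a minimal sketch; the image and commands are placeholders):

```yaml
# svc2 starts only after svc1 has exited with code 0
# (compose file format 2.1+).
version: "2.1"
services:
  svc1:
    image: busybox
    command: "true"
  svc2:
    image: busybox
    command: top
    depends_on:
      svc1:
        condition: service_completed_successfully
```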
1901 | 2002 | def test_project_up_seccomp_profile(self): |
1902 | 2003 | seccomp_data = { |
1903 | 2004 | 'defaultAction': 'SCMP_ACT_ALLOW', |
947 | 947 | with open(os.path.join(base_dir, 'Dockerfile'), 'w') as f: |
948 | 948 | f.write("FROM busybox\n") |
949 | 949 | |
950 | service = self.create_service('web', build={'context': base_dir}) | |
950 | service = self.create_service('web', | |
951 | build={'context': base_dir}, | |
952 | environment={ | |
953 | 'COMPOSE_DOCKER_CLI_BUILD': '0', | |
954 | 'DOCKER_BUILDKIT': '0', | |
955 | }) | |
951 | 956 | service.build() |
952 | 957 | self.addCleanup(self.client.remove_image, service.image_name) |
953 | 958 | |
963 | 968 | service = self.create_service('web', |
964 | 969 | build={'context': base_dir}, |
965 | 970 | environment={ |
966 | 'COMPOSE_DOCKER_CLI_BUILD': '1', | |
967 | 971 | 'DOCKER_BUILDKIT': '1', |
968 | 972 | }) |
969 | 973 | service.build(cli=True) |
1014 | 1018 | web = self.create_service('web', |
1015 | 1019 | build={'context': base_dir}, |
1016 | 1020 | environment={ |
1017 | 'COMPOSE_DOCKER_CLI_BUILD': '1', | |
1018 | 1021 | 'DOCKER_BUILDKIT': '1', |
1019 | 1022 | }) |
1020 | 1023 | project = Project('composetest', [web], self.client) |
60 | 60 | |
61 | 61 | @classmethod |
62 | 62 | def tearDownClass(cls): |
63 | cls.client.close() | |
63 | 64 | del cls.client |
64 | 65 | |
65 | 66 | def tearDown(self): |
0 | import os | |
1 | ||
2 | import pytest | |
3 | ||
4 | from compose.cli.colors import AnsiMode | |
5 | from tests import mock | |
6 | ||
7 | ||
8 | @pytest.fixture | |
9 | def tty_stream(): | |
10 | stream = mock.Mock() | |
11 | stream.isatty.return_value = True | |
12 | return stream | |
13 | ||
14 | ||
15 | @pytest.fixture | |
16 | def non_tty_stream(): | |
17 | stream = mock.Mock() | |
18 | stream.isatty.return_value = False | |
19 | return stream | |
20 | ||
21 | ||
22 | class TestAnsiModeTestCase: | |
23 | ||
24 | @mock.patch.dict(os.environ) | |
25 | def test_ansi_mode_never(self, tty_stream, non_tty_stream): | |
26 | if "CLICOLOR" in os.environ: | |
27 | del os.environ["CLICOLOR"] | |
28 | assert not AnsiMode.NEVER.use_ansi_codes(tty_stream) | |
29 | assert not AnsiMode.NEVER.use_ansi_codes(non_tty_stream) | |
30 | ||
31 | os.environ["CLICOLOR"] = "0" | |
32 | assert not AnsiMode.NEVER.use_ansi_codes(tty_stream) | |
33 | assert not AnsiMode.NEVER.use_ansi_codes(non_tty_stream) | |
34 | ||
35 | @mock.patch.dict(os.environ) | |
36 | def test_ansi_mode_always(self, tty_stream, non_tty_stream): | |
37 | if "CLICOLOR" in os.environ: | |
38 | del os.environ["CLICOLOR"] | |
39 | assert AnsiMode.ALWAYS.use_ansi_codes(tty_stream) | |
40 | assert AnsiMode.ALWAYS.use_ansi_codes(non_tty_stream) | |
41 | ||
42 | os.environ["CLICOLOR"] = "0" | |
43 | assert AnsiMode.ALWAYS.use_ansi_codes(tty_stream) | |
44 | assert AnsiMode.ALWAYS.use_ansi_codes(non_tty_stream) | |
45 | ||
46 | @mock.patch.dict(os.environ) | |
47 | def test_ansi_mode_auto(self, tty_stream, non_tty_stream): | |
48 | if "CLICOLOR" in os.environ: | |
49 | del os.environ["CLICOLOR"] | |
50 | assert AnsiMode.AUTO.use_ansi_codes(tty_stream) | |
51 | assert not AnsiMode.AUTO.use_ansi_codes(non_tty_stream) | |
52 | ||
53 | os.environ["CLICOLOR"] = "0" | |
54 | assert not AnsiMode.AUTO.use_ansi_codes(tty_stream) | |
55 | assert not AnsiMode.AUTO.use_ansi_codes(non_tty_stream) |
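A sketch of the tri-state decision these tests pin down; the enum name mirrors `compose.cli.colors.AnsiMode`, but the logic here is inferred from the assertions rather than copied from compose:

```python
import os
from enum import Enum

class AnsiMode(Enum):
    NEVER = 'never'
    ALWAYS = 'always'
    AUTO = 'auto'

    def use_ansi_codes(self, stream):
        if self is AnsiMode.NEVER:
            return False
        if self is AnsiMode.ALWAYS:
            return True
        # AUTO honors the CLICOLOR=0 convention, then falls back to
        # a TTY check on the output stream.
        if os.environ.get('CLICOLOR') == '0':
            return False
        return stream.isatty()
```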
13 | 13 | paths = ['one.yml', 'two.yml'] |
14 | 14 | opts = {'--file': paths} |
15 | 15 | environment = Environment.from_env_file('.') |
16 | assert get_config_path_from_options('.', opts, environment) == paths | |
16 | assert get_config_path_from_options(opts, environment) == paths | |
17 | 17 | |
18 | 18 | def test_single_path_from_env(self): |
19 | 19 | with mock.patch.dict(os.environ): |
20 | 20 | os.environ['COMPOSE_FILE'] = 'one.yml' |
21 | 21 | environment = Environment.from_env_file('.') |
22 | assert get_config_path_from_options('.', {}, environment) == ['one.yml'] | |
22 | assert get_config_path_from_options({}, environment) == ['one.yml'] | |
23 | 23 | |
24 | 24 | @pytest.mark.skipif(IS_WINDOWS_PLATFORM, reason='posix separator') |
25 | 25 | def test_multiple_path_from_env(self): |
26 | 26 | with mock.patch.dict(os.environ): |
27 | 27 | os.environ['COMPOSE_FILE'] = 'one.yml:two.yml' |
28 | 28 | environment = Environment.from_env_file('.') |
29 | assert get_config_path_from_options( | |
30 | '.', {}, environment | |
31 | ) == ['one.yml', 'two.yml'] | |
29 | assert get_config_path_from_options({}, environment) == ['one.yml', 'two.yml'] | |
32 | 30 | |
33 | 31 | @pytest.mark.skipif(not IS_WINDOWS_PLATFORM, reason='windows separator') |
34 | 32 | def test_multiple_path_from_env_windows(self): |
35 | 33 | with mock.patch.dict(os.environ): |
36 | 34 | os.environ['COMPOSE_FILE'] = 'one.yml;two.yml' |
37 | 35 | environment = Environment.from_env_file('.') |
38 | assert get_config_path_from_options( | |
39 | '.', {}, environment | |
40 | ) == ['one.yml', 'two.yml'] | |
36 | assert get_config_path_from_options({}, environment) == ['one.yml', 'two.yml'] | |
41 | 37 | |
42 | 38 | def test_multiple_path_from_env_custom_separator(self): |
43 | 39 | with mock.patch.dict(os.environ): |
44 | 40 | os.environ['COMPOSE_PATH_SEPARATOR'] = '^' |
45 | 41 | os.environ['COMPOSE_FILE'] = 'c:\\one.yml^.\\semi;colon.yml' |
46 | 42 | environment = Environment.from_env_file('.') |
47 | assert get_config_path_from_options( | |
48 | '.', {}, environment | |
49 | ) == ['c:\\one.yml', '.\\semi;colon.yml'] | |
43 | assert get_config_path_from_options({}, environment) == ['c:\\one.yml', '.\\semi;colon.yml'] | |
50 | 44 | |
51 | 45 | def test_no_path(self): |
52 | 46 | environment = Environment.from_env_file('.') |
53 | assert not get_config_path_from_options('.', {}, environment) | |
47 | assert not get_config_path_from_options({}, environment) | |
54 | 48 | |
55 | 49 | def test_unicode_path_from_options(self): |
56 | 50 | paths = [b'\xe5\xb0\xb1\xe5\x90\x83\xe9\xa5\xad/docker-compose.yml'] |
57 | 51 | opts = {'--file': paths} |
58 | 52 | environment = Environment.from_env_file('.') |
59 | assert get_config_path_from_options( | |
60 | '.', opts, environment | |
61 | ) == ['就吃饭/docker-compose.yml'] | |
53 | assert get_config_path_from_options(opts, environment) == ['就吃饭/docker-compose.yml'] |
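Signature change aside, the `COMPOSE_FILE`/`COMPOSE_PATH_SEPARATOR` behavior these tests cover can be sketched as follows (a hypothetical re-implementation, not compose's code):

```python
import os

def config_paths_from_env(environ):
    # COMPOSE_FILE may hold several paths joined by
    # COMPOSE_PATH_SEPARATOR, which defaults to the platform's
    # path separator (':' on POSIX, ';' on Windows).
    compose_file = environ.get('COMPOSE_FILE')
    if not compose_file:
        return None
    separator = environ.get('COMPOSE_PATH_SEPARATOR', os.pathsep)
    return compose_file.split(separator)
```

A custom separator lets paths contain the platform separator, as in the `semi;colon.yml` case above.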
7 | 7 | |
8 | 8 | from compose.cli.log_printer import build_log_generator |
9 | 9 | from compose.cli.log_printer import build_log_presenters |
10 | from compose.cli.log_printer import build_no_log_generator | |
11 | 10 | from compose.cli.log_printer import consume_queue |
12 | 11 | from compose.cli.log_printer import QueueItem |
13 | 12 | from compose.cli.log_printer import wait_on_exit |
72 | 71 | mock_container.name, status_code, |
73 | 72 | ) |
74 | 73 | assert expected in wait_on_exit(mock_container) |
75 | ||
76 | ||
77 | def test_build_no_log_generator(mock_container): | |
78 | mock_container.has_api_logs = False | |
79 | mock_container.log_driver = 'none' | |
80 | output, = build_no_log_generator(mock_container, None) | |
81 | assert "WARNING: no logs are available with the 'none' log driver\n" in output | |
82 | assert "exited with code" not in output | |
83 | 74 | |
84 | 75 | |
85 | 76 | class TestBuildLogGenerator: |
136 | 136 | |
137 | 137 | class TestSetupConsoleHandlerTestCase: |
138 | 138 | |
139 | def test_with_tty_verbose(self, logging_handler): | |
139 | def test_with_console_formatter_verbose(self, logging_handler): | |
140 | 140 | setup_console_handler(logging_handler, True) |
141 | 141 | assert type(logging_handler.formatter) == ConsoleWarningFormatter |
142 | 142 | assert '%(name)s' in logging_handler.formatter._fmt |
143 | 143 | assert '%(funcName)s' in logging_handler.formatter._fmt |
144 | 144 | |
145 | def test_with_tty_not_verbose(self, logging_handler): | |
145 | def test_with_console_formatter_not_verbose(self, logging_handler): | |
146 | 146 | setup_console_handler(logging_handler, False) |
147 | 147 | assert type(logging_handler.formatter) == ConsoleWarningFormatter |
148 | 148 | assert '%(name)s' not in logging_handler.formatter._fmt |
149 | 149 | assert '%(funcName)s' not in logging_handler.formatter._fmt |
150 | 150 | |
151 | def test_with_not_a_tty(self, logging_handler): | |
152 | logging_handler.stream.isatty.return_value = False | |
153 | setup_console_handler(logging_handler, False) | |
151 | def test_without_console_formatter(self, logging_handler): | |
152 | setup_console_handler(logging_handler, False, use_console_formatter=False) | |
154 | 153 | assert type(logging_handler.formatter) == logging.Formatter |
155 | 154 | |
156 | 155 |
237 | 237 | ) |
238 | 238 | ) |
239 | 239 | |
240 | assert 'Invalid top-level property "web"' in excinfo.exconly() | |
240 | assert "compose.config.errors.ConfigurationError: " \ | |
241 | "The Compose file 'filename.yml' is invalid because:\n" \ | |
242 | "'web' does not match any of the regexes: '^x-'" in excinfo.exconly() | |
241 | 243 | assert VERSION_EXPLANATION in excinfo.exconly() |
242 | 244 | |
243 | 245 | def test_named_volume_config_empty(self): |
666 | 668 | |
667 | 669 | assert 'Invalid service name \'mong\\o\'' in excinfo.exconly() |
668 | 670 | |
669 | def test_config_duplicate_cache_from_values_validation_error(self): | |
671 | def test_config_duplicate_cache_from_values_no_validation_error(self): | |
670 | 672 | with pytest.raises(ConfigurationError) as exc: |
671 | 673 | config.load( |
672 | 674 | build_config_details({ |
678 | 680 | }) |
679 | 681 | ) |
680 | 682 | |
681 | assert 'build.cache_from contains non-unique items' in exc.exconly() | |
683 | assert 'build.cache_from contains non-unique items' not in exc.exconly() | |
682 | 684 | |
683 | 685 | def test_load_with_multiple_files_v1(self): |
684 | 686 | base_file = config.ConfigFile( |
2394 | 2396 | 'image': 'busybox', |
2395 | 2397 | 'depends_on': { |
2396 | 2398 | 'app1': {'condition': 'service_started'}, |
2397 | 'app2': {'condition': 'service_healthy'} | |
2399 | 'app2': {'condition': 'service_healthy'}, | |
2400 | 'app3': {'condition': 'service_completed_successfully'} | |
2398 | 2401 | } |
2399 | 2402 | } |
2400 | 2403 | override = {} |
2406 | 2409 | 'image': 'busybox', |
2407 | 2410 | 'depends_on': { |
2408 | 2411 | 'app1': {'condition': 'service_started'}, |
2409 | 'app2': {'condition': 'service_healthy'} | |
2412 | 'app2': {'condition': 'service_healthy'}, | |
2413 | 'app3': {'condition': 'service_completed_successfully'} | |
2410 | 2414 | } |
2411 | 2415 | } |
2412 | 2416 | override = { |
2413 | 'depends_on': ['app3'] | |
2417 | 'depends_on': ['app4'] | |
2414 | 2418 | } |
2415 | 2419 | |
2416 | 2420 | actual = config.merge_service_dicts(base, override, VERSION) |
2419 | 2423 | 'depends_on': { |
2420 | 2424 | 'app1': {'condition': 'service_started'}, |
2421 | 2425 | 'app2': {'condition': 'service_healthy'}, |
2422 | 'app3': {'condition': 'service_started'} | |
2426 | 'app3': {'condition': 'service_completed_successfully'}, | |
2427 | 'app4': {'condition': 'service_started'}, | |
2423 | 2428 | } |
2424 | 2429 | } |
2425 | 2430 | |
3564 | 3569 | @mock.patch.dict(os.environ) |
3565 | 3570 | def test_config_file_with_options_environment_file(self): |
3566 | 3571 | project_dir = 'tests/fixtures/default-env-file' |
3572 | # an explicit env file path is relative to the current working dir | |
3573 | env = Environment.from_env_file(project_dir, project_dir + '/.env2') | |
3567 | 3574 | service_dicts = config.load( |
3568 | 3575 | config.find( |
3569 | project_dir, None, Environment.from_env_file(project_dir, '.env2') | |
3576 | project_dir, None, env | |
3570 | 3577 | ) |
3571 | 3578 | ).services |
3572 | 3579 | |
5230 | 5237 | files = [ |
5231 | 5238 | 'docker-compose.yml', |
5232 | 5239 | 'docker-compose.yaml', |
5240 | 'compose.yml', | |
5241 | 'compose.yaml', | |
5233 | 5242 | ] |
5234 | 5243 | |
5235 | 5244 | def test_get_config_path_default_file_in_basedir(self): |
5263 | 5272 | base_dir = tempfile.mkdtemp(dir=project_dir) |
5264 | 5273 | else: |
5265 | 5274 | base_dir = project_dir |
5266 | filename, = config.get_default_config_files(base_dir) | |
5267 | return os.path.basename(filename) | |
5275 | filenames = config.get_default_config_files(base_dir) | |
5276 | if not filenames: | |
5277 | raise config.ComposeFileNotFound(config.SUPPORTED_FILENAMES) | |
5278 | return os.path.basename(filenames[0]) | |
5268 | 5279 | finally: |
5269 | 5280 | shutil.rmtree(project_dir) |
5270 | 5281 |
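The two hunks above extend the default filename set with `compose.yml` / `compose.yaml` and make the lookup return a list instead of exactly one match. A self-contained sketch of that discovery logic (names are illustrative; compose's real `get_default_config_files` differs in detail):

```python
import os
import tempfile

# Supported default filenames, in lookup order, per the updated test.
SUPPORTED_FILENAMES = [
    'docker-compose.yml',
    'docker-compose.yaml',
    'compose.yml',
    'compose.yaml',
]


def find_default_config_files(base_dir):
    """Return all supported config files present in base_dir (possibly empty)."""
    return [
        os.path.join(base_dir, name)
        for name in SUPPORTED_FILENAMES
        if os.path.exists(os.path.join(base_dir, name))
    ]


with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, 'compose.yaml'), 'w').close()
    found = find_default_config_files(d)
    basenames = [os.path.basename(f) for f in found]
```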
415 | 415 | with pytest.raises(UnsetRequiredSubstitution) as e: |
416 | 416 | defaults_interpolator("not ok ${BAZ?}") |
417 | 417 | |
418 | assert e.value.err == '' | |
418 | assert e.value.err == 'BAZ' | |
419 | 419 | |
420 | 420 | |
421 | 421 | def test_interpolate_mixed_separators(defaults_interpolator): |
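The fixed assertion checks that the error raised for an unset `${BAZ?}` substitution carries the variable name, not an empty string. A minimal stand-in for compose's interpolation showing that behavior (illustrative, not the real implementation):

```python
import re


class UnsetRequiredSubstitution(Exception):
    """Raised when a ${VAR?} substitution has no value; carries the name."""
    def __init__(self, var_name):
        super().__init__('required variable {} is unset'.format(var_name))
        self.err = var_name


def interpolate(template, mapping):
    # Only handles the ${VAR?} "required" form, enough for this sketch.
    def repl(match):
        name = match.group(1)
        if name not in mapping:
            raise UnsetRequiredSubstitution(name)
        return mapping[name]
    return re.sub(r'\$\{(\w+)\?\}', repl, template)


try:
    interpolate('not ok ${BAZ?}', {})
    raised = None
except UnsetRequiredSubstitution as e:
    raised = e
```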
220 | 220 | container = Container(None, self.container_dict, has_been_inspected=True) |
221 | 221 | assert container.short_id == self.container_id[:12] |
222 | 222 | |
223 | def test_has_api_logs(self): | |
224 | container_dict = { | |
225 | 'HostConfig': { | |
226 | 'LogConfig': { | |
227 | 'Type': 'json-file' | |
228 | } | |
229 | } | |
230 | } | |
231 | ||
232 | container = Container(None, container_dict, has_been_inspected=True) | |
233 | assert container.has_api_logs is True | |
234 | ||
235 | container_dict['HostConfig']['LogConfig']['Type'] = 'none' | |
236 | container = Container(None, container_dict, has_been_inspected=True) | |
237 | assert container.has_api_logs is False | |
238 | ||
239 | container_dict['HostConfig']['LogConfig']['Type'] = 'syslog' | |
240 | container = Container(None, container_dict, has_been_inspected=True) | |
241 | assert container.has_api_logs is False | |
242 | ||
243 | container_dict['HostConfig']['LogConfig']['Type'] = 'journald' | |
244 | container = Container(None, container_dict, has_been_inspected=True) | |
245 | assert container.has_api_logs is True | |
246 | ||
247 | container_dict['HostConfig']['LogConfig']['Type'] = 'foobar' | |
248 | container = Container(None, container_dict, has_been_inspected=True) | |
249 | assert container.has_api_logs is False | |
250 | ||
251 | 223 | |
252 | 224 | class GetContainerNameTestCase(unittest.TestCase): |
253 | 225 |
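The removed `test_has_api_logs` encoded which Docker log drivers support reading logs back through the API: `json-file` and `journald` do, everything else does not. A minimal stand-in for that check (illustrative; the real logic lives on compose's `Container` class):

```python
# Log drivers that support `docker logs` via the API, per the removed test.
API_LOG_DRIVERS = {'json-file', 'journald'}


def has_api_logs(container_dict):
    log_type = (
        container_dict.get('HostConfig', {})
        .get('LogConfig', {})
        .get('Type')
    )
    return log_type in API_LOG_DRIVERS


results = {
    t: has_api_logs({'HostConfig': {'LogConfig': {'Type': t}}})
    for t in ('json-file', 'none', 'syslog', 'journald', 'foobar')
}
```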
0 | import unittest | |
1 | ||
2 | from compose.metrics.client import MetricsCommand | |
3 | from compose.metrics.client import Status | |
4 | ||
5 | ||
6 | class MetricsTest(unittest.TestCase): | |
7 | @classmethod | |
8 | def test_metrics(cls): | |
9 | assert MetricsCommand('up', 'moby').to_map() == { | |
10 | 'command': 'compose up', | |
11 | 'context': 'moby', | |
12 | 'status': 'success', | |
13 | 'source': 'docker-compose', | |
14 | } | |
15 | ||
16 | assert MetricsCommand('down', 'local').to_map() == { | |
17 | 'command': 'compose down', | |
18 | 'context': 'local', | |
19 | 'status': 'success', | |
20 | 'source': 'docker-compose', | |
21 | } | |
22 | ||
23 | assert MetricsCommand('help', 'aci', Status.FAILURE).to_map() == { | |
24 | 'command': 'compose help', | |
25 | 'context': 'aci', | |
26 | 'status': 'failure', | |
27 | 'source': 'docker-compose', | |
28 | } | |
29 | ||
30 | assert MetricsCommand('run', 'ecs').to_map() == { | |
31 | 'command': 'compose run', | |
32 | 'context': 'ecs', | |
33 | 'status': 'success', | |
34 | 'source': 'docker-compose', | |
35 | } |
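The new test file pins the payload shape of `MetricsCommand.to_map()`. A minimal sketch of a class producing that shape (the real `compose.metrics.client.MetricsCommand` does more; this only mirrors the mapping the assertions check):

```python
from dataclasses import dataclass


@dataclass
class MetricsCommandSketch:
    """Illustrative stand-in producing the payload asserted in the test."""
    command: str
    context: str
    status: str = 'success'

    def to_map(self):
        return {
            'command': 'compose ' + self.command,
            'context': self.context,
            'status': self.status,
            'source': 'docker-compose',
        }


payload = MetricsCommandSketch('up', 'moby').to_map()
```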
2 | 2 | |
3 | 3 | from docker.errors import APIError |
4 | 4 | |
5 | from compose.cli.colors import AnsiMode | |
5 | 6 | from compose.parallel import GlobalLimit |
6 | 7 | from compose.parallel import parallel_execute |
7 | 8 | from compose.parallel import parallel_execute_iter |
155 | 156 | |
156 | 157 | def test_parallel_execute_ansi(capsys): |
157 | 158 | ParallelStreamWriter.instance = None |
158 | ParallelStreamWriter.set_noansi(value=False) | |
159 | ParallelStreamWriter.set_default_ansi_mode(AnsiMode.ALWAYS) | |
159 | 160 | results, errors = parallel_execute( |
160 | 161 | objects=["something", "something more"], |
161 | 162 | func=lambda x: x, |
171 | 172 | |
172 | 173 | def test_parallel_execute_noansi(capsys): |
173 | 174 | ParallelStreamWriter.instance = None |
174 | ParallelStreamWriter.set_noansi() | |
175 | ParallelStreamWriter.set_default_ansi_mode(AnsiMode.NEVER) | |
175 | 176 | results, errors = parallel_execute( |
176 | 177 | objects=["something", "something more"], |
177 | 178 | func=lambda x: x, |
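These hunks replace the boolean `set_noansi()` toggle with the three-way `AnsiMode` from the `--ansi` flag mentioned in the 1.29.0 changelog. A sketch of such an enum, assuming `ALWAYS`/`NEVER` force the behavior and `AUTO` falls back to a TTY check (illustrative stand-in for `compose.cli.colors.AnsiMode`):

```python
import enum
import io


class AnsiMode(enum.Enum):
    NEVER = 'never'
    ALWAYS = 'always'
    AUTO = 'auto'

    def use_ansi_codes(self, stream):
        # ALWAYS and NEVER override; AUTO defers to the stream's TTY status.
        if self is AnsiMode.ALWAYS:
            return True
        if self is AnsiMode.NEVER:
            return False
        return stream.isatty()


non_tty = io.StringIO()  # StringIO.isatty() is False
```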
329 | 329 | assert service.options['environment'] == environment |
330 | 330 | |
331 | 331 | assert opts['labels'][LABEL_CONFIG_HASH] == \ |
332 | '689149e6041a85f6fb4945a2146a497ed43c8a5cbd8991753d875b165f1b4de4' | |
332 | '6da0f3ec0d5adf901de304bdc7e0ee44ec5dd7adb08aebc20fe0dd791d4ee5a8' | |
333 | 333 | assert opts['environment'] == ['also=real'] |
334 | 334 | |
335 | 335 | def test_get_container_create_options_sets_affinity_with_binds(self): |
699 | 699 | config_dict = service.config_dict() |
700 | 700 | expected = { |
701 | 701 | 'image_id': 'abcd', |
702 | 'ipc_mode': None, | |
702 | 703 | 'options': {'image': 'example.com/foo'}, |
703 | 704 | 'links': [('one', 'one')], |
704 | 705 | 'net': 'other', |
722 | 723 | config_dict = service.config_dict() |
723 | 724 | expected = { |
724 | 725 | 'image_id': 'abcd', |
726 | 'ipc_mode': None, | |
725 | 727 | 'options': {'image': 'example.com/foo'}, |
726 | 728 | 'links': [], |
727 | 729 | 'networks': {}, |
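Why the expected hash constant changed above: the config hash is a digest over the service's `config_dict`, so adding a new key (`ipc_mode`, even when `None`) changes the digest and triggers container recreation, which is the 1.29.1 IPC-mode fix. A minimal sketch of such a hash (compose's real serialization may differ):

```python
import hashlib
import json


def config_hash(config_dict):
    """Stable digest over a config dict; any new key changes the result."""
    serialized = json.dumps(config_dict, sort_keys=True)
    return hashlib.sha256(serialized.encode('utf-8')).hexdigest()


without_ipc = {'image_id': 'abcd', 'options': {'image': 'example.com/foo'}}
with_ipc = dict(without_ipc, ipc_mode=None)
```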