New upstream snapshot.
Debian Janitor
-Liam Beguin <liambeguin _at_ gmail.com>
-Ram Rachum <ram _at_ rachum.com>
-Alba Mendez <me _at_ alba.sh>
-Robert Westman <robert _at_ byteflux.io>

Portions derived from other open source works and are clearly marked.
Please see the online documentation for the latest changelog:
https://github.com/gitpython-developers/GitPython/blob/main/doc/source/changes.rst
# How to contribute

The following is a short step-by-step rundown of what one typically would do to contribute.

- [fork this project](https://github.com/gitpython-developers/GitPython/fork) on GitHub.
- For setting up the environment to run the self tests, please run `init-tests-after-clone.sh`.
- Please try to **write a test that fails unless the contribution is present.**
- Try to avoid massive commits and prefer to take small steps, with one commit for each.
- Feel free to add yourself to the AUTHORS file.
- Create a pull request.
Metadata-Version: 1.2
Name: GitPython
Version: 3.1.20
Summary: Python Git Library
Home-page: https://github.com/gitpython-developers/GitPython
Author: Sebastian Thiel, Michael Trier
Classifier: Operating System :: POSIX
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: MacOS :: MacOS X
Classifier: Typing :: Typed
Classifier: Programming Language :: Python
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.7
Classifier: Programming Language :: Python :: 3.8
Classifier: Programming Language :: Python :: 3.9
Classifier: Programming Language :: Python :: 3.10
Requires-Python: >=3.7
MANIFEST.in
README.md
VERSION
pyproject.toml
requirements.txt
setup.py
test-requirements.txt
git/db.py
git/diff.py
git/exc.py
git/py.typed
git/remote.py
git/types.py
git/util.py
git/index/__init__.py
git/index/base.py
include AUTHORS
include CHANGES
include CONTRIBUTING.md
include LICENSE
include README.md
include VERSION
include requirements.txt
include test-requirements.txt
include git/py.typed

recursive-include doc *
recursive-exclude test *
![Python package](https://github.com/gitpython-developers/GitPython/workflows/Python%20package/badge.svg)
[![Documentation Status](https://readthedocs.org/projects/gitpython/badge/?version=stable)](https://readthedocs.org/projects/gitpython/?badge=stable)
[![Packaging status](https://repology.org/badge/tiny-repos/python:gitpython.svg)](https://repology.org/metapackage/python:gitpython/versions)

## [Gitoxide](https://github.com/Byron/gitoxide): A peek into the future…

I started working on GitPython in 2009, back in the days when Python was 'my thing' and I had great plans with it.

It provides abstractions of git objects for easy access of repository data, and additionally
allows you to access the git repository more directly using either a pure python implementation,
or the faster, but more resource intensive _git command_ implementation.

The object database implementation is optimized for handling large quantities of objects and large datasets,
which is achieved by using low-level structures and data streaming.

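As a minimal sketch of the object-model access described above: the snippet below probes for both the `git` binary and the GitPython package before using them, so it degrades gracefully where either is absent, and it creates a throwaway repository rather than assuming one exists.

```python
# Hedged sketch of GitPython's object-model entry point.
# Both the git binary and the GitPython package are probed before use.
import shutil
import subprocess
import tempfile

demo_ok = False
if shutil.which("git"):                      # is a git binary on PATH?
    workdir = tempfile.mkdtemp()
    subprocess.run(["git", "init", workdir], check=True, capture_output=True)
    try:
        from git import Repo                 # GitPython's main entry point
        repo = Repo(workdir)                 # wrap the repository on disk
        demo_ok = not repo.bare              # a working-tree repo is not bare
    except ImportError:                      # GitPython not installed
        demo_ok = False
```

From a `Repo` instance, attributes such as `repo.head`, `repo.branches`, and `repo.commit(...)` expose the object model the paragraph above describes.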
### DEVELOPMENT STATUS

This project is in **maintenance mode**, which means that

- …there will be no feature development, unless these are contributed
- …there will be no bug fixes, unless they are relevant to the safety of users, or contributed
- …issues will be responded to with waiting times of up to a month

The project is open to contributions of all kinds, as well as new maintainers.

### REQUIREMENTS

If it is not in your `PATH`, you can help GitPython find it by setting
the `GIT_PYTHON_GIT_EXECUTABLE=<path/to/git>` environment variable.

- Git (1.7.x or newer)
- Python >= 3.7

The dependencies are listed in `./requirements.txt` and `./test-requirements.txt`.
The installer takes care of installing them for you.
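The environment variable mentioned above can also be set from Python, provided it happens before the first `import git`; the path below is a hypothetical example, not a recommendation.

```python
import os

# Point GitPython at a specific git binary (hypothetical path) before the
# first `import git`; GitPython reads this variable when it initializes.
os.environ["GIT_PYTHON_GIT_EXECUTABLE"] = "/usr/local/bin/git"

# A subsequent `import git` (or `git.refresh()`) would use that executable.
```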

### RUNNING TESTS

_Important_: Right after cloning this repository, please be sure to have executed
the `./init-tests-after-clone.sh` script in the repository root. Otherwise
you will encounter test failures.

On _Windows_, make sure you have `git-daemon` in your PATH. For MINGW-git, the `git-daemon.exe`
exists in `Git\mingw64\libexec\git-core\`; CYGWIN has no daemon, but should get along fine
with MINGW's.

Ensure testing libraries are installed.
In the root directory, run: `pip install -r test-requirements.txt`

To lint, run: `flake8`

To typecheck, run: `mypy -p git`

To test, run: `pytest`

Configuration for flake8 is in the ./.flake8 file.

Configurations for mypy, pytest and coverage.py are in ./pyproject.toml.

The same linting and testing will also be performed against different supported python versions
upon submitting a pull request (or on each push if you have a fork with a "main" branch and actions enabled).

### Contributions

### INFRASTRUCTURE

- [User Documentation](http://gitpython.readthedocs.org)
- [Questions and Answers](http://stackexchange.com/filters/167317/gitpython)
  - Please post on stackoverflow and use the `gitpython` tag
- [Issue Tracker](https://github.com/gitpython-developers/GitPython/issues)
  - Post reproducible bugs and feature requests as a new issue.
    Please be sure to provide the following information if posting bugs:
    - GitPython version (e.g. `import git; git.__version__`)
    - Python version (e.g. `python --version`)
    - The encountered stack-trace, if applicable
    - Enough information to allow reproducing the issue
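A small sketch of collecting the information requested above for a bug report; the GitPython import is guarded so the snippet also runs where the package is absent.

```python
import platform

# GitPython version, if available (guarded: the package may not be installed).
try:
    import git
    gitpython_version = git.__version__
except ImportError:
    gitpython_version = "not installed"

python_version = platform.python_version()   # e.g. "3.9.7"
report = f"GitPython {gitpython_version} on Python {python_version}"
```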

### How to make a new release

- Update/verify the **version** in the `VERSION` file
- Update/verify that the `doc/source/changes.rst` changelog file was updated
- Commit everything
- Run `git tag -s <version>` to tag the version in Git
- Run `make release`
- Close the milestone mentioned in the _changelog_ and create a new one. _Do not reuse milestones by renaming them_.
- Set the upcoming version in the `VERSION` file, usually by
  incrementing the patch level, and possibly by appending `-dev`. Probably you
  want to `git push` once more.
134 | 150 | |

### Projects using GitPython

- [PyDriller](https://github.com/ishepard/pydriller)
- [Kivy Designer](https://github.com/kivy/kivy-designer)
- [Prowl](https://github.com/nettitude/Prowl)
- [Python Taint](https://github.com/python-security/pyt)
- [Buster](https://github.com/axitkhurana/buster)
- [git-ftp](https://github.com/ezyang/git-ftp)
- [Git-Pandas](https://github.com/wdm0006/git-pandas)
- [PyGitUp](https://github.com/msiemens/PyGitUp)
- [PyJFuzz](https://github.com/mseclab/PyJFuzz)
- [Loki](https://github.com/Neo23x0/Loki)
- [Omniwallet](https://github.com/OmniLayer/omniwallet)
- [GitViper](https://github.com/BeayemX/GitViper)
- [Git Gud](https://github.com/bthayer2365/git-gud)

### LICENSE

New BSD License. See the LICENSE file.

[contributing]: https://github.com/gitpython-developers/GitPython/blob/master/CONTRIBUTING.md
python-git (3.1.20+git20210905.1.5da76e8-1) UNRELEASED; urgency=low

  * New upstream snapshot.

 -- Debian Janitor <janitor@jelmer.uk>  Thu, 09 Sep 2021 00:30:40 -0000

python-git (3.1.14-1) unstable; urgency=medium

  * New upstream version 3.1.14
Changelog
=========

3.1.20
======

* This is the second typed release with a lot of improvements under the hood.
* Tracking issue: https://github.com/gitpython-developers/GitPython/issues/1095

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/52?closed=1


3.1.19 (YANKED)
===============

* This is the second typed release with a lot of improvements under the hood.
* Tracking issue: https://github.com/gitpython-developers/GitPython/issues/1095

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/51?closed=1

3.1.18
======

* Drop support for python 3.5 to reduce the maintenance burden on typing. Lower patch levels of python 3.5 would break, too.

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/50?closed=1

3.1.17
======

* Fix issues from 3.1.16 (see https://github.com/gitpython-developers/GitPython/issues/1238)
* Fix issues from 3.1.15 (see https://github.com/gitpython-developers/GitPython/issues/1223)
* Add more static typing information

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/49?closed=1

3.1.16 (YANKED)
===============

* Fix issues from 3.1.15 (see https://github.com/gitpython-developers/GitPython/issues/1223)
* Add more static typing information

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/48?closed=1

3.1.15 (YANKED)
===============

* Add deprecation warning for python 3.5

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/47?closed=1

3.1.14
======

* git.Commit objects now have a ``replace`` method that will return a
  copy of the commit with modified attributes.
* Add python 3.9 support
* Drop python 3.4 support

See the following for details:
https://github.com/gitpython-developers/gitpython/milestone/46?closed=1

3.1.13
======
# All configuration values have a default; values that are commented out
# serve to show the default.

import sys
import os

# If your extensions are in another directory, add it here. If the directory
# is relative to the documentation root, use os.path.abspath to make it
# built documents.
#
# The short X.Y version.
with open(os.path.join(os.path.dirname(__file__), "..", "..", 'VERSION')) as fd:
    VERSION = fd.readline().strip()
version = VERSION
# The full version, including alpha/beta/rc tags.
# Grouping the document tree into LaTeX files. List of tuples
# (source start file, target name, title, author, document class [howto/manual]).
latex_documents = [
    ('index', 'GitPython.tex', r'GitPython Documentation',
     r'Michael Trier', 'manual'),
]

# The name of an image file (relative to this directory) to place at the top of
Requirements
============

* `Python`_ >= 3.7
* `Git`_ 1.7.0 or newer
  It should also work with older versions, but it may be that some operations
  involving remotes will not work as expected.
* `GitDB`_ - a pure python git database implementation
* `typing_extensions`_ >= 3.7.3.4 (if python < 3.10)

.. _Python: https://www.python.org
.. _Git: https://git-scm.com/
.. _GitDB: https://pypi.python.org/pypi/gitdb
.. _typing_extensions: https://pypi.org/project/typing-extensions/

Installing GitPython
====================
---------------------------

GitPython is not suited for long-running processes (like daemons) as it tends to
leak system resources. It was written in a time when destructors (as implemented
in the `__del__` method) still ran deterministically.

In case you still want to use it in such a context, you will want to search the

GitPython provides object model access to your git repository. This tutorial is composed of multiple sections, most of which explain a real-life use case.

All code presented here originated from `test_docs.py <https://github.com/gitpython-developers/GitPython/blob/main/test/test_docs.py>`_ to assure correctness. Knowing this should also allow you to more easily run the code for your own testing purposes; all you need is a developer installation of git-python.

Meet the Repo type
******************
# the BSD License: http://www.opensource.org/licenses/bsd-license.php
# flake8: noqa
#@PydevCodeAnalysisIgnore
from git.exc import *  # @NoMove @IgnorePep8
import inspect
import os
import sys
import os.path as osp

from typing import Optional
from git.types import PathLike

__version__ = '3.1.20'


#{ Initialization
def _init_externals() -> None:
    """Initialize external projects by putting them into the path"""
    if __version__ == '3.1.20' and 'PYOXIDIZER' not in os.environ:
        sys.path.insert(1, osp.join(osp.dirname(__file__), 'ext', 'gitdb'))

    try:

#} END initialization


#################
_init_externals()
#################

#{ Imports

try:
    from git.config import GitConfigParser  # @NoMove @IgnorePep8
    from git.objects import *  # @NoMove @IgnorePep8
#{ Initialize git executable path
GIT_OK = None


def refresh(path: Optional[PathLike] = None) -> None:
    """Convenience method for setting the git executable path."""
    global GIT_OK
    GIT_OK = False
    GIT_OK = True
#} END initialize git executable path


#################
try:
    refresh()
    PIPE
)
import subprocess
import threading
from textwrap import dedent

from git.compat import (
    is_win,
)
from git.exc import CommandError
from git.util import is_cygwin_git, cygpath, expand_path, remove_password_if_present

from .exc import (
    GitCommandError,
    stream_copy,
)

# typing ---------------------------------------------------------------------------

from typing import (Any, AnyStr, BinaryIO, Callable, Dict, IO, Iterator, List, Mapping,
                    Sequence, TYPE_CHECKING, TextIO, Tuple, Union, cast, overload)

from git.types import PathLike, Literal, TBD

if TYPE_CHECKING:
    from git.repo.base import Repo
    from git.diff import DiffIndex


# ---------------------------------------------------------------------------------

execute_kwargs = {'istream', 'with_extended_output',
                  'with_exceptions', 'as_process', 'stdout_as_string',
                  'output_stream', 'with_stdout', 'kill_after_timeout',
# Documentation
## @{

def handle_process_output(process: Union[subprocess.Popen, 'Git.AutoInterrupt'],
                          stdout_handler: Union[None,
                                                Callable[[AnyStr], None],
                                                Callable[[List[AnyStr]], None],
                                                Callable[[bytes, 'Repo', 'DiffIndex'], None]],
                          stderr_handler: Union[None,
                                                Callable[[AnyStr], None],
                                                Callable[[List[AnyStr]], None]],
                          finalizer: Union[None,
                                           Callable[[Union[subprocess.Popen, 'Git.AutoInterrupt']], None]] = None,
                          decode_streams: bool = True) -> None:
    """Registers for notifications to learn that process output is ready to read, and dispatches lines to
    the respective line handlers.
    This function returns once the finalizer returns

    or if decoding must happen later (i.e. for Diffs).
    """
    # Use 2 "pump" threads and wait for both to finish.
    def pump_stream(cmdline: str, name: str, stream: Union[BinaryIO, TextIO], is_decode: bool,
                    handler: Union[None, Callable[[Union[bytes, str]], None]]) -> None:
        try:
            for line in stream:
                if handler:
                    if is_decode:
                        assert isinstance(line, bytes)
                        line_str = line.decode(defenc)
                        handler(line_str)
                    else:
                        handler(line)
        except Exception as ex:
            log.error("Pumping %r of cmd(%s) failed due to: %r", name, remove_password_if_present(cmdline), ex)
            raise CommandError(['<%s-pump>' % name] + remove_password_if_present(cmdline), ex) from ex
        finally:
            stream.close()

    for name, stream, handler in pumps:
        t = threading.Thread(target=pump_stream,
                             args=(cmdline, name, stream, decode_streams, handler))
        t.daemon = True
        t.start()
        threads.append(t)

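The "two pump threads" pattern above can be sketched with the stdlib alone; `pump` here is a simplified stand-in for `pump_stream` (no decoding options or error wrapping), and the child process is spawned with the current interpreter so the example is self-contained.

```python
import subprocess
import sys
import threading

def pump(stream, sink):
    # Drain one pipe on its own thread so stdout and stderr cannot deadlock
    # each other when either pipe buffer fills up.
    for line in stream:
        sink.append(line.decode().rstrip("\r\n"))
    stream.close()

proc = subprocess.Popen(
    [sys.executable, "-c",
     "import sys; print('out'); print('err', file=sys.stderr)"],
    stdout=subprocess.PIPE, stderr=subprocess.PIPE)
out, err = [], []
threads = [threading.Thread(target=pump, args=(proc.stdout, out), daemon=True),
           threading.Thread(target=pump, args=(proc.stderr, err), daemon=True)]
for t in threads:
    t.start()
for t in threads:
    t.join()
status = proc.wait()
```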

    if finalizer:
        return finalizer(process)
    else:
        return None


def dashify(string: str) -> str:
    return string.replace('_', '-')


def slots_to_dict(self: object, exclude: Sequence[str] = ()) -> Dict[str, Any]:
    return {s: getattr(self, s) for s in self.__slots__ if s not in exclude}


def dict_to_slots_and__excluded_are_none(self: object, d: Mapping[str, Any], excluded: Sequence[str] = ()) -> None:
    for k, v in d.items():
        setattr(self, k, v)
    for k in excluded:
        setattr(self, k, None)


## CREATE_NEW_PROCESS_GROUP is needed to allow killing it afterwards,
# see https://docs.python.org/3/library/subprocess.html#subprocess.Popen.send_signal
PROC_CREATIONFLAGS = (CREATE_NO_WINDOW | subprocess.CREATE_NEW_PROCESS_GROUP  # type: ignore[attr-defined]
                      if is_win else 0)  # mypy error if not windows

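The two slots helpers above can be exercised with a tiny `__slots__` class. Their bodies are restated here so the example runs standalone; `Point` is a made-up class for illustration only.

```python
from typing import Any, Dict, Mapping, Sequence

def slots_to_dict(obj: object, exclude: Sequence[str] = ()) -> Dict[str, Any]:
    # Snapshot every __slots__ attribute that is not excluded.
    return {s: getattr(obj, s) for s in obj.__slots__ if s not in exclude}

def dict_to_slots_and__excluded_are_none(obj: object, d: Mapping[str, Any],
                                         excluded: Sequence[str] = ()) -> None:
    # Restore snapshotted attributes; excluded slots are reset to None.
    for k, v in d.items():
        setattr(obj, k, v)
    for k in excluded:
        setattr(obj, k, None)

class Point:
    __slots__ = ("x", "y")

p = Point()
p.x, p.y = 1, 2
state = slots_to_dict(p, exclude=("y",))     # {'x': 1}
q = Point()
dict_to_slots_and__excluded_are_none(q, state, excluded=("y",))
```

This pickling round-trip is exactly what `Git.__getstate__`/`Git.__setstate__` below use to skip cached process handles.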
class Git(LazyMixin):

    _excluded_ = ('cat_file_all', 'cat_file_header', '_version_info')

    def __getstate__(self) -> Dict[str, Any]:
        return slots_to_dict(self, exclude=self._excluded_)

    def __setstate__(self, d: Dict[str, Any]) -> None:
        dict_to_slots_and__excluded_are_none(self, d, excluded=self._excluded_)

    # CONFIGURATION
    # the top level __init__

    @classmethod
    def refresh(cls, path: Union[None, PathLike] = None) -> bool:
        """This gets called by the refresh function (see the top level
        __init__).
        """
        # - a GitCommandNotFound error is spawned by ourselves
        # - a PermissionError is spawned if the git executable provided
        #   cannot be executed for whatever reason

        has_git = False
        try:
            cls().version()

        return has_git

    @classmethod
    def is_cygwin(cls) -> bool:
        return is_cygwin_git(cls.GIT_PYTHON_GIT_EXECUTABLE)

    @overload
    @classmethod
    def polish_url(cls, url: str, is_cygwin: Literal[False] = ...) -> str:
        ...

    @overload
    @classmethod
    def polish_url(cls, url: str, is_cygwin: Union[None, bool] = None) -> str:
        ...

    @classmethod
    def polish_url(cls, url: str, is_cygwin: Union[None, bool] = None) -> PathLike:
        if is_cygwin is None:
            is_cygwin = cls.is_cygwin()

        if url.startswith('~'):
            url = os.path.expanduser(url)
        url = url.replace("\\\\", "\\").replace("\\", "/")
        return url

    class AutoInterrupt(object):

        __slots__ = ("proc", "args")

        def __init__(self, proc: Union[None, subprocess.Popen], args: Any) -> None:
            self.proc = proc
            self.args = args

        def __del__(self) -> None:
            if self.proc is None:
                return

            # did the process finish already so we have a return code ?
            try:
                if proc.poll() is not None:
                    return None
            except OSError as ex:
                log.info("Ignored error after process had died: %r", ex)

            # can be that nothing really exists anymore ...
            if os is None or getattr(os, 'kill', None) is None:
                return None

            # try to kill it
            try:
                call(("TASKKILL /F /T /PID %s 2>nul 1>nul" % str(proc.pid)), shell=True)
            # END exception handling

        def __getattr__(self, attr: str) -> Any:
            return getattr(self.proc, attr)
386 | def wait(self, stderr=b''): # TODO: Bad choice to mimic `proc.wait()` but with different args. | |
422 | # TODO: Bad choice to mimic `proc.wait()` but with different args. | |
423 | def wait(self, stderr: Union[None, str, bytes] = b'') -> int: | |
387 | 424 | """Wait for the process and return its status code. |
388 | 425 | |
389 | 426 | :param stderr: Previously read value of stderr, in case stderr is already closed. |
390 | 427 | :warn: may deadlock if output or error pipes are used and not handled separately. |
391 | 428 | :raise GitCommandError: if the return status is not 0""" |
392 | 429 | if stderr is None: |
393 | stderr = b'' | |
394 | stderr = force_bytes(data=stderr, encoding='utf-8') | |
395 | ||
396 | status = self.proc.wait() | |
397 | ||
398 | def read_all_from_possibly_closed_stream(stream): | |
399 | try: | |
400 | return stderr + force_bytes(stream.read()) | |
401 | except ValueError: | |
402 | return stderr or b'' | |
403 | ||
404 | if status != 0: | |
405 | errstr = read_all_from_possibly_closed_stream(self.proc.stderr) | |
406 | log.debug('AutoInterrupt wait stderr: %r' % (errstr,)) | |
407 | raise GitCommandError(self.args, status, errstr) | |
430 | stderr = b'' | 
431 | stderr_b = force_bytes(data=stderr, encoding='utf-8') | |
432 | ||
433 | if self.proc is not None: | |
434 | status = self.proc.wait() | |
435 | ||
436 | def read_all_from_possibly_closed_stream(stream: Union[IO[bytes], None]) -> bytes: | |
437 | if stream: | |
438 | try: | |
439 | return stderr_b + force_bytes(stream.read()) | |
440 | except ValueError: | |
441 | return stderr_b or b'' | |
442 | else: | |
443 | return stderr_b or b'' | |
444 | ||
445 | if status != 0: | |
446 | errstr = read_all_from_possibly_closed_stream(self.proc.stderr) | |
447 | log.debug('AutoInterrupt wait stderr: %r' % (errstr,)) | |
448 | raise GitCommandError(remove_password_if_present(self.args), status, errstr) | |
408 | 449 | # END status handling |
409 | 450 | return status |
451 | ||
410 | 452 | # END auto interrupt |
411 | 453 | |
412 | 454 | class CatFileContentStream(object): |
418 | 460 | If not all data is read to the end of the object's lifetime, we read the | 
419 | 461 | rest to assure the underlying stream continues to work""" |
420 | 462 | |
421 | __slots__ = ('_stream', '_nbr', '_size') | |
422 | ||
423 | def __init__(self, size, stream): | |
463 | __slots__: Tuple[str, ...] = ('_stream', '_nbr', '_size') | |
464 | ||
465 | def __init__(self, size: int, stream: IO[bytes]) -> None: | |
424 | 466 | self._stream = stream |
425 | 467 | self._size = size |
426 | 468 | self._nbr = 0 # num bytes read |
431 | 473 | stream.read(1) |
432 | 474 | # END handle empty streams |
433 | 475 | |
434 | def read(self, size=-1): | |
476 | def read(self, size: int = -1) -> bytes: | |
435 | 477 | bytes_left = self._size - self._nbr |
436 | 478 | if bytes_left == 0: |
437 | 479 | return b'' |
451 | 493 | # END finish reading |
452 | 494 | return data |
453 | 495 | |
454 | def readline(self, size=-1): | |
496 | def readline(self, size: int = -1) -> bytes: | |
455 | 497 | if self._nbr == self._size: |
456 | 498 | return b'' |
457 | 499 | |
473 | 515 | |
474 | 516 | return data |
475 | 517 | |
476 | def readlines(self, size=-1): | |
518 | def readlines(self, size: int = -1) -> List[bytes]: | |
477 | 519 | if self._nbr == self._size: |
478 | 520 | return [] |
479 | 521 | |
494 | 536 | return out |
495 | 537 | |
496 | 538 | # skipcq: PYL-E0301 |
497 | def __iter__(self): | |
539 | def __iter__(self) -> 'Git.CatFileContentStream': | |
498 | 540 | return self |
499 | ||
500 | def __next__(self): | |
501 | return self.next() | |
502 | ||
503 | def next(self): | |
541 | ||
542 | def __next__(self) -> bytes: | |
543 | return self.next() | 
544 | ||
545 | def next(self) -> bytes: | |
504 | 546 | line = self.readline() |
505 | 547 | if not line: |
506 | 548 | raise StopIteration |
507 | 549 | |
508 | 550 | return line |
509 | 551 | |
510 | def __del__(self): | |
552 | def __del__(self) -> None: | |
511 | 553 | bytes_left = self._size - self._nbr |
512 | 554 | if bytes_left: |
513 | 555 | # read and discard - seeking is impossible within a stream |
515 | 557 | self._stream.read(bytes_left + 1) |
516 | 558 | # END handle incomplete read |
517 | 559 | |
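The typed iterator methods above follow the standard protocol: `__iter__` returns `self`, and `__next__` delegates to `readline`, raising `StopIteration` once the declared size is consumed. A minimal self-contained sketch of that protocol over a fixed-size byte stream, loosely modeled on `CatFileContentStream` (the class name and the simplified bookkeeping here are illustrative, not GitPython's API):

```python
import io

class SizedLineStream:
    """Sketch of an iterator over the first `size` bytes of a stream,
    yielding lines - loosely modeled on Git.CatFileContentStream."""

    def __init__(self, size: int, stream: io.BufferedIOBase) -> None:
        self._stream = stream
        self._size = size
        self._nbr = 0  # num bytes read

    def readline(self) -> bytes:
        if self._nbr >= self._size:
            return b''
        line = self._stream.readline()
        # never hand out more than the declared object size
        line = line[:self._size - self._nbr]
        self._nbr += len(line)
        return line

    def __iter__(self) -> 'SizedLineStream':
        return self

    def __next__(self) -> bytes:
        line = self.readline()
        if not line:
            raise StopIteration
        return line
```

Capping every read at `self._size - self._nbr` is what lets the real class sit on top of a shared `git cat-file --batch` pipe without consuming the next object's data.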
518 | def __init__(self, working_dir=None): | |
560 | def __init__(self, working_dir: Union[None, PathLike] = None): | |
519 | 561 | """Initialize this instance with: |
520 | 562 | |
521 | 563 | :param working_dir: |
525 | 567 | .git directory in case of bare repositories.""" |
526 | 568 | super(Git, self).__init__() |
527 | 569 | self._working_dir = expand_path(working_dir) |
528 | self._git_options = () | |
529 | self._persistent_git_options = [] | |
570 | self._git_options: Union[List[str], Tuple[str, ...]] = () | |
571 | self._persistent_git_options: List[str] = [] | |
530 | 572 | |
531 | 573 | # Extra environment variables to pass to git commands |
532 | self._environment = {} | |
574 | self._environment: Dict[str, str] = {} | |
533 | 575 | |
534 | 576 | # cached command slots |
535 | self.cat_file_header = None | |
536 | self.cat_file_all = None | |
537 | ||
538 | def __getattr__(self, name): | |
577 | self.cat_file_header: Union[None, TBD] = None | |
578 | self.cat_file_all: Union[None, TBD] = None | |
579 | ||
580 | def __getattr__(self, name: str) -> Any: | |
539 | 581 | """A convenience method as it allows to call the command as if it was |
540 | 582 | an object. |
541 | 583 | :return: Callable object that will execute call _call_process with your arguments.""" |
543 | 585 | return LazyMixin.__getattr__(self, name) |
544 | 586 | return lambda *args, **kwargs: self._call_process(name, *args, **kwargs) |
545 | 587 | |
546 | def set_persistent_git_options(self, **kwargs): | |
588 | def set_persistent_git_options(self, **kwargs: Any) -> None: | |
547 | 589 | """Specify command line options to the git executable |
548 | 590 | for subsequent subcommand calls |
549 | 591 | |
557 | 599 | self._persistent_git_options = self.transform_kwargs( |
558 | 600 | split_single_char_options=True, **kwargs) |
559 | 601 | |
560 | def _set_cache_(self, attr): | |
602 | def _set_cache_(self, attr: str) -> None: | |
561 | 603 | if attr == '_version_info': |
562 | 604 | # We only use the first 4 numbers, as everything else could be strings in fact (on windows) |
563 | version_numbers = self._call_process('version').split(' ')[2] | |
564 | self._version_info = tuple(int(n) for n in version_numbers.split('.')[:4] if n.isdigit()) | |
605 | process_version = self._call_process('version')  # uses the default *args and **kwargs, so a str is returned | 
606 | version_numbers = process_version.split(' ')[2] | |
607 | ||
608 | self._version_info = cast(Tuple[int, int, int, int], | |
609 | tuple(int(n) for n in version_numbers.split('.')[:4] if n.isdigit()) | |
610 | ) | |
565 | 611 | else: |
566 | 612 | super(Git, self)._set_cache_(attr) |
567 | 613 | # END handle version info |
568 | 614 | |
569 | @property | |
570 | def working_dir(self): | |
615 | @property | 
616 | def working_dir(self) -> Union[None, PathLike]: | |
571 | 617 | """:return: Git directory we are working on""" |
572 | 618 | return self._working_dir |
573 | 619 | |
574 | @property | |
575 | def version_info(self): | |
620 | @property | 
621 | def version_info(self) -> Tuple[int, int, int, int]: | |
576 | 622 | """ |
577 | 623 | :return: tuple(int, int, int, int) tuple with integers representing the major, minor |
578 | 624 | and additional version numbers as parsed from git version. |
579 | 625 | This value is generated on demand and is cached""" |
580 | 626 | return self._version_info |
581 | 627 | |
582 | def execute(self, command, | |
583 | istream=None, | |
584 | with_extended_output=False, | |
585 | with_exceptions=True, | |
586 | as_process=False, | |
587 | output_stream=None, | |
588 | stdout_as_string=True, | |
589 | kill_after_timeout=None, | |
590 | with_stdout=True, | |
591 | universal_newlines=False, | |
592 | shell=None, | |
593 | env=None, | |
594 | max_chunk_size=io.DEFAULT_BUFFER_SIZE, | |
595 | **subprocess_kwargs | |
596 | ): | |
628 | @overload | 
629 | def execute(self, | |
630 | command: Union[str, Sequence[Any]], | |
631 | *, | |
632 | as_process: Literal[True] | |
633 | ) -> 'AutoInterrupt': | |
634 | ... | |
635 | ||
636 | @overload | 
637 | def execute(self, | |
638 | command: Union[str, Sequence[Any]], | |
639 | *, | |
640 | as_process: Literal[False] = False, | |
641 | stdout_as_string: Literal[True] | |
642 | ) -> Union[str, Tuple[int, str, str]]: | |
643 | ... | |
644 | ||
645 | @overload | 
646 | def execute(self, | |
647 | command: Union[str, Sequence[Any]], | |
648 | *, | |
649 | as_process: Literal[False] = False, | |
650 | stdout_as_string: Literal[False] = False | |
651 | ) -> Union[bytes, Tuple[int, bytes, str]]: | |
652 | ... | |
653 | ||
654 | @overload | 
655 | def execute(self, | |
656 | command: Union[str, Sequence[Any]], | |
657 | *, | |
658 | with_extended_output: Literal[False], | |
659 | as_process: Literal[False], | |
660 | stdout_as_string: Literal[True] | |
661 | ) -> str: | |
662 | ... | |
663 | ||
664 | @overload | 
665 | def execute(self, | |
666 | command: Union[str, Sequence[Any]], | |
667 | *, | |
668 | with_extended_output: Literal[False], | |
669 | as_process: Literal[False], | |
670 | stdout_as_string: Literal[False] | |
671 | ) -> bytes: | |
672 | ... | |
673 | ||
674 | def execute(self, | |
675 | command: Union[str, Sequence[Any]], | |
676 | istream: Union[None, BinaryIO] = None, | |
677 | with_extended_output: bool = False, | |
678 | with_exceptions: bool = True, | |
679 | as_process: bool = False, | |
680 | output_stream: Union[None, BinaryIO] = None, | |
681 | stdout_as_string: bool = True, | |
682 | kill_after_timeout: Union[None, int] = None, | |
683 | with_stdout: bool = True, | |
684 | universal_newlines: bool = False, | |
685 | shell: Union[None, bool] = None, | |
686 | env: Union[None, Mapping[str, str]] = None, | |
687 | max_chunk_size: int = io.DEFAULT_BUFFER_SIZE, | |
688 | **subprocess_kwargs: Any | |
689 | ) -> Union[str, bytes, Tuple[int, Union[str, bytes], str], AutoInterrupt]: | |
597 | 690 | """Handles executing the command on the shell and consumes and returns |
598 | 691 | the returned information (stdout) |
599 | 692 | |
637 | 730 | |
638 | 731 | :param env: |
639 | 732 | A dictionary of environment variables to be passed to `subprocess.Popen`. |
640 | ||
733 | ||
641 | 734 | :param max_chunk_size: |
642 | 735 | Maximum number of bytes in one chunk of data passed to the output_stream in |
643 | 736 | one invocation of write() method. If the given number is not positive then |
681 | 774 | :note: |
682 | 775 | If you add additional keyword arguments to the signature of this method, |
683 | 776 | you must update the execute_kwargs tuple housed in this module.""" |
777 | # Remove password for the command if present | |
778 | redacted_command = remove_password_if_present(command) | |
684 | 779 | if self.GIT_PYTHON_TRACE and (self.GIT_PYTHON_TRACE != 'full' or as_process): |
685 | log.info(' '.join(command)) | |
780 | log.info(' '.join(redacted_command)) | |
686 | 781 | |
687 | 782 | # Allow the user to have the command executed in their working dir. |
688 | cwd = self._working_dir or os.getcwd() | |
783 | try: | |
784 | cwd = self._working_dir or os.getcwd() # type: Union[None, str] | |
785 | if not os.access(str(cwd), os.X_OK): | |
786 | cwd = None | |
787 | except FileNotFoundError: | |
788 | cwd = None | |
689 | 789 | |
690 | 790 | # Start the process |
691 | 791 | inline_env = env |
703 | 803 | if is_win: |
704 | 804 | cmd_not_found_exception = OSError |
705 | 805 | if kill_after_timeout: |
706 | raise GitCommandError(command, '"kill_after_timeout" feature is not supported on Windows.') | |
806 | raise GitCommandError(redacted_command, '"kill_after_timeout" feature is not supported on Windows.') | |
707 | 807 | else: |
708 | if sys.version_info[0] > 2: | |
709 | cmd_not_found_exception = FileNotFoundError # NOQA # exists, flake8 unknown @UndefinedVariable | |
710 | else: | |
711 | cmd_not_found_exception = OSError | |
808 | cmd_not_found_exception = FileNotFoundError # NOQA # exists, flake8 unknown @UndefinedVariable | |
712 | 809 | # end handle |
713 | 810 | |
714 | 811 | stdout_sink = (PIPE |
718 | 815 | if istream: |
719 | 816 | istream_ok = "<valid stream>" |
720 | 817 | log.debug("Popen(%s, cwd=%s, universal_newlines=%s, shell=%s, istream=%s)", |
721 | command, cwd, universal_newlines, shell, istream_ok) | |
818 | redacted_command, cwd, universal_newlines, shell, istream_ok) | |
722 | 819 | try: |
723 | 820 | proc = Popen(command, |
724 | 821 | env=env, |
733 | 830 | creationflags=PROC_CREATIONFLAGS, |
734 | 831 | **subprocess_kwargs |
735 | 832 | ) |
833 | ||
736 | 834 | except cmd_not_found_exception as err: |
737 | raise GitCommandNotFound(command, err) from err | |
835 | raise GitCommandNotFound(redacted_command, err) from err | |
836 | else: | |
837 | # replace with a typeguard for Popen[bytes]? | |
838 | proc.stdout = cast(BinaryIO, proc.stdout) | |
839 | proc.stderr = cast(BinaryIO, proc.stderr) | |
738 | 840 | |
739 | 841 | if as_process: |
740 | 842 | return self.AutoInterrupt(proc, command) |
741 | 843 | |
742 | def _kill_process(pid): | |
844 | def _kill_process(pid: int) -> None: | |
743 | 845 | """ Callback method to kill a process. """ |
744 | 846 | p = Popen(['ps', '--ppid', str(pid)], stdout=PIPE, |
745 | 847 | creationflags=PROC_CREATIONFLAGS) |
746 | 848 | child_pids = [] |
747 | for line in p.stdout: | |
748 | if len(line.split()) > 0: | |
749 | local_pid = (line.split())[0] | |
750 | if local_pid.isdigit(): | |
751 | child_pids.append(int(local_pid)) | |
849 | if p.stdout is not None: | |
850 | for line in p.stdout: | |
851 | if len(line.split()) > 0: | |
852 | local_pid = (line.split())[0] | |
853 | if local_pid.isdigit(): | |
854 | child_pids.append(int(local_pid)) | |
752 | 855 | try: |
753 | 856 | # Windows does not have SIGKILL, so use SIGTERM instead |
754 | 857 | sig = getattr(signal, 'SIGKILL', signal.SIGTERM) |
772 | 875 | |
773 | 876 | # Wait for the process to return |
774 | 877 | status = 0 |
775 | stdout_value = b'' | |
776 | stderr_value = b'' | |
878 | stdout_value: Union[str, bytes] = b'' | |
879 | stderr_value: Union[str, bytes] = b'' | |
777 | 880 | newline = "\n" if universal_newlines else b"\n" |
778 | 881 | try: |
779 | 882 | if output_stream is None: |
782 | 885 | stdout_value, stderr_value = proc.communicate() |
783 | 886 | if kill_after_timeout: |
784 | 887 | watchdog.cancel() |
785 | if kill_check.isSet(): | |
888 | if kill_check.is_set(): | |
786 | 889 | stderr_value = ('Timeout: the command "%s" did not complete in %d ' |
787 | 'secs.' % (" ".join(command), kill_after_timeout)) | |
890 | 'secs.' % (" ".join(redacted_command), kill_after_timeout)) | |
788 | 891 | if not universal_newlines: |
789 | 892 | stderr_value = stderr_value.encode(defenc) |
790 | 893 | # strip trailing "\n" |
791 | if stdout_value.endswith(newline): | |
894 | if stdout_value.endswith(newline): # type: ignore | |
792 | 895 | stdout_value = stdout_value[:-1] |
793 | if stderr_value.endswith(newline): | |
896 | if stderr_value.endswith(newline): # type: ignore | |
794 | 897 | stderr_value = stderr_value[:-1] |
898 | ||
795 | 899 | status = proc.returncode |
796 | 900 | else: |
797 | 901 | max_chunk_size = max_chunk_size if max_chunk_size and max_chunk_size > 0 else io.DEFAULT_BUFFER_SIZE |
799 | 903 | stdout_value = proc.stdout.read() |
800 | 904 | stderr_value = proc.stderr.read() |
801 | 905 | # strip trailing "\n" |
802 | if stderr_value.endswith(newline): | |
906 | if stderr_value.endswith(newline): # type: ignore | |
803 | 907 | stderr_value = stderr_value[:-1] |
804 | 908 | status = proc.wait() |
805 | 909 | # END stdout handling |
808 | 912 | proc.stderr.close() |
809 | 913 | |
810 | 914 | if self.GIT_PYTHON_TRACE == 'full': |
811 | cmdstr = " ".join(command) | |
812 | ||
813 | def as_text(stdout_value): | |
915 | cmdstr = " ".join(redacted_command) | |
916 | ||
917 | def as_text(stdout_value: Union[bytes, str]) -> str: | |
814 | 918 | return not output_stream and safe_decode(stdout_value) or '<OUTPUT_STREAM>' |
815 | 919 | # end |
816 | 920 | |
824 | 928 | # END handle debug printing |
825 | 929 | |
826 | 930 | if with_exceptions and status != 0: |
827 | raise GitCommandError(command, status, stderr_value, stdout_value) | |
931 | raise GitCommandError(redacted_command, status, stderr_value, stdout_value) | |
828 | 932 | |
829 | 933 | if isinstance(stdout_value, bytes) and stdout_as_string: # could also be output_stream |
830 | 934 | stdout_value = safe_decode(stdout_value) |
835 | 939 | else: |
836 | 940 | return stdout_value |
837 | 941 | |
838 | def environment(self): | |
942 | def environment(self) -> Dict[str, str]: | |
839 | 943 | return self._environment |
840 | 944 | |
841 | def update_environment(self, **kwargs): | |
945 | def update_environment(self, **kwargs: Any) -> Dict[str, Union[str, None]]: | |
842 | 946 | """ |
843 | 947 | Set environment variables for future git invocations. Return all changed |
844 | 948 | values in a format that can be passed back into this function to revert |
865 | 969 | return old_env |
866 | 970 | |
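`update_environment` is designed so its return value can be fed straight back in to undo the change: keys that were absent come back as `None`, and `None` deletes a key on revert. A self-contained sketch of that revert pattern on a plain dict (the function here is a hypothetical mirror of the documented behaviour, not GitPython's method):

```python
from typing import Any, Dict, Union

def update_environment(env: Dict[str, str], **kwargs: Any) -> Dict[str, Union[str, None]]:
    """Apply kwargs to env and return the previous values. A value of
    None deletes the key; the returned mapping restores the old state
    when passed back into this function."""
    old_env: Dict[str, Union[str, None]] = {}
    for key, value in kwargs.items():
        old_env[key] = env.get(key)  # None if the key was absent
        if value is None:
            env.pop(key, None)
        else:
            env[key] = value
    return old_env
```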
867 | 971 | @contextmanager |
868 | def custom_environment(self, **kwargs): | |
972 | def custom_environment(self, **kwargs: Any) -> Iterator[None]: | |
869 | 973 | """ |
870 | 974 | A context manager around the above ``update_environment`` method to restore the |
871 | 975 | environment back to its previous state after operation. |
883 | 987 | finally: |
884 | 988 | self.update_environment(**old_env) |
885 | 989 | |
886 | def transform_kwarg(self, name, value, split_single_char_options): | |
990 | def transform_kwarg(self, name: str, value: Any, split_single_char_options: bool) -> List[str]: | |
887 | 991 | if len(name) == 1: |
888 | 992 | if value is True: |
889 | 993 | return ["-%s" % name] |
899 | 1003 | return ["--%s=%s" % (dashify(name), value)] |
900 | 1004 | return [] |
901 | 1005 | |
902 | def transform_kwargs(self, split_single_char_options=True, **kwargs): | |
1006 | def transform_kwargs(self, split_single_char_options: bool = True, **kwargs: Any) -> List[str]: | |
903 | 1007 | """Transforms Python style kwargs into git command line options.""" |
904 | 1008 | args = [] |
905 | kwargs = OrderedDict(sorted(kwargs.items(), key=lambda x: x[0])) | |
906 | 1009 | for k, v in kwargs.items(): |
907 | 1010 | if isinstance(v, (list, tuple)): |
908 | 1011 | for value in v: |
912 | 1015 | return args |
913 | 1016 | |
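`transform_kwargs` (note the dropped `OrderedDict` sort: Python 3.7+ dicts keep insertion order, so options are now emitted in the order given) delegates each pair to `transform_kwarg`. A runnable sketch mirroring the conversion rules visible in the diff, where `dashify` replaces underscores with dashes:

```python
from typing import Any, List

def dashify(string: str) -> str:
    return string.replace('_', '-')

def transform_kwarg(name: str, value: Any, split_single_char_options: bool) -> List[str]:
    # Mirrors Git.transform_kwarg: short options get a single dash,
    # long options a double dash; True means a bare flag, and
    # False/None suppress the option entirely.
    if len(name) == 1:
        if value is True:
            return ["-%s" % name]
        elif value not in (False, None):
            if split_single_char_options:
                return ["-%s" % name, "%s" % value]
            return ["-%s%s" % (name, value)]
    else:
        if value is True:
            return ["--%s" % dashify(name)]
        elif value is not False and value is not None:
            return ["--%s=%s" % (dashify(name), value)]
    return []
```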
914 | 1017 | @classmethod |
915 | def __unpack_args(cls, arg_list): | |
916 | if not isinstance(arg_list, (list, tuple)): | |
917 | return [str(arg_list)] | |
1018 | def __unpack_args(cls, arg_list: Sequence[str]) -> List[str]: | |
918 | 1019 | |
919 | 1020 | outlist = [] |
920 | for arg in arg_list: | |
921 | if isinstance(arg_list, (list, tuple)): | |
1021 | if isinstance(arg_list, (list, tuple)): | |
1022 | for arg in arg_list: | |
922 | 1023 | outlist.extend(cls.__unpack_args(arg)) |
923 | # END recursion | |
924 | else: | |
925 | outlist.append(str(arg)) | |
926 | # END for each arg | |
1024 | else: | |
1025 | outlist.append(str(arg_list)) | |
1026 | ||
927 | 1027 | return outlist |
928 | 1028 | |
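The reworked `__unpack_args` moves the list/tuple check inside the loop's place so the recursion has a single base case instead of an early return. A standalone sketch of the same flattening:

```python
from typing import Any, List

def unpack_args(arg_list: Any) -> List[str]:
    """Recursively flatten nested lists/tuples into a flat list of
    strings, stringifying scalars - as Git.__unpack_args now does."""
    outlist: List[str] = []
    if isinstance(arg_list, (list, tuple)):
        for arg in arg_list:
            outlist.extend(unpack_args(arg))
    else:
        outlist.append(str(arg_list))
    return outlist
```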
929 | def __call__(self, **kwargs): | |
1029 | def __call__(self, **kwargs: Any) -> 'Git': | |
930 | 1030 | """Specify command line options to the git executable |
931 | 1031 | for a subcommand call |
932 | 1032 | |
942 | 1042 | split_single_char_options=True, **kwargs) |
943 | 1043 | return self |
944 | 1044 | |
945 | def _call_process(self, method, *args, **kwargs): | |
1045 | @overload | |
1046 | def _call_process(self, method: str, *args: None, **kwargs: None | |
1047 | ) -> str: | |
1048 | ... # if no args given, execute called with all defaults | |
1049 | ||
1050 | @overload | |
1051 | def _call_process(self, method: str, | |
1052 | istream: int, | |
1053 | as_process: Literal[True], | |
1054 | *args: Any, **kwargs: Any | |
1055 | ) -> 'Git.AutoInterrupt': ... | |
1056 | ||
1057 | @overload | |
1058 | def _call_process(self, method: str, *args: Any, **kwargs: Any | |
1059 | ) -> Union[str, bytes, Tuple[int, Union[str, bytes], str], 'Git.AutoInterrupt']: | |
1060 | ... | |
1061 | ||
1062 | def _call_process(self, method: str, *args: Any, **kwargs: Any | |
1063 | ) -> Union[str, bytes, Tuple[int, Union[str, bytes], str], 'Git.AutoInterrupt']: | |
946 | 1064 | """Run the given git command with the specified arguments and return |
947 | 1065 | the result as a String |
948 | 1066 | |
959 | 1077 | It contains key-values for the following: |
960 | 1078 | - the :meth:`execute()` kwds, as listed in :var:`execute_kwargs`; |
961 | 1079 | - "command options" to be converted by :meth:`transform_kwargs()`; |
962 | - the `'insert_kwargs_after'` key which its value must match one of ``*args``, | |
963 | and any cmd-options will be appended after the matched arg. | |
1080 | - the `'insert_kwargs_after'` key, whose value must match one of ``*args``; | 
1081 | any cmd-options will be appended after the matched arg. | 
964 | 1082 | |
965 | 1083 | Examples:: |
966 | 1084 | |
970 | 1088 | |
971 | 1089 | git rev-list max-count 10 --header master |
972 | 1090 | |
973 | :return: Same as ``execute``""" | |
1091 | :return: Same as ``execute``. | 
1092 | If no args are given, the ``execute`` defaults are used (esp. as_process = False, | 
1093 | stdout_as_string = True) and a str is returned """ | 
974 | 1094 | # Handle optional arguments prior to calling transform_kwargs |
975 | 1095 | # otherwise these'll end up in args, which is bad. |
976 | 1096 | exec_kwargs = {k: v for k, v in kwargs.items() if k in execute_kwargs} |
979 | 1099 | insert_after_this_arg = opts_kwargs.pop('insert_kwargs_after', None) |
980 | 1100 | |
981 | 1101 | # Prepare the argument list |
1102 | ||
982 | 1103 | opt_args = self.transform_kwargs(**opts_kwargs) |
983 | 1104 | ext_args = self.__unpack_args([a for a in args if a is not None]) |
984 | 1105 | |
985 | 1106 | if insert_after_this_arg is None: |
986 | args = opt_args + ext_args | |
1107 | args_list = opt_args + ext_args | |
987 | 1108 | else: |
988 | 1109 | try: |
989 | 1110 | index = ext_args.index(insert_after_this_arg) |
991 | 1112 | raise ValueError("Couldn't find argument '%s' in args %s to insert cmd options after" |
992 | 1113 | % (insert_after_this_arg, str(ext_args))) from err |
993 | 1114 | # end handle error |
994 | args = ext_args[:index + 1] + opt_args + ext_args[index + 1:] | |
1115 | args_list = ext_args[:index + 1] + opt_args + ext_args[index + 1:] | |
995 | 1116 | # end handle opts_kwargs |
996 | 1117 | |
997 | 1118 | call = [self.GIT_PYTHON_GIT_EXECUTABLE] |
1005 | 1126 | self._git_options = () |
1006 | 1127 | |
1007 | 1128 | call.append(dashify(method)) |
1008 | call.extend(args) | |
1129 | call.extend(args_list) | |
1009 | 1130 | |
1010 | 1131 | return self.execute(call, **exec_kwargs) |
1011 | 1132 | |
1012 | def _parse_object_header(self, header_line): | |
1133 | def _parse_object_header(self, header_line: str) -> Tuple[str, str, int]: | |
1013 | 1134 | """ |
1014 | 1135 | :param header_line: |
1015 | 1136 | <hex_sha> type_string size_as_int |
1031 | 1152 | raise ValueError("Failed to parse header: %r" % header_line) |
1032 | 1153 | return (tokens[0], tokens[1], int(tokens[2])) |
1033 | 1154 | |
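`_parse_object_header` splits a `git cat-file --batch-check` header line of the form `<hexsha> <type> <size>` into a typed tuple. A self-contained sketch reduced to the tokenizing step shown at the end of the method (the real method also raises more specific errors for missing refs):

```python
from typing import Tuple

def parse_object_header(header_line: str) -> Tuple[str, str, int]:
    """Parse '<hexsha> <type_string> <size_as_int>' into a tuple."""
    tokens = header_line.split()
    if len(tokens) != 3:
        raise ValueError("Failed to parse header: %r" % header_line)
    return (tokens[0], tokens[1], int(tokens[2]))
```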
1034 | def _prepare_ref(self, ref): | |
1155 | def _prepare_ref(self, ref: AnyStr) -> bytes: | |
1035 | 1156 | # required for command to separate refs on stdin, as bytes |
1036 | refstr = ref | |
1037 | 1157 | if isinstance(ref, bytes): |
1038 | 1158 | # Assume 40 bytes hexsha - bin-to-ascii for some reason returns bytes, not text |
1039 | refstr = ref.decode('ascii') | |
1159 | refstr: str = ref.decode('ascii') | |
1040 | 1160 | elif not isinstance(ref, str): |
1041 | 1161 | refstr = str(ref) # could be ref-object |
1162 | else: | |
1163 | refstr = ref | |
1042 | 1164 | |
1043 | 1165 | if not refstr.endswith("\n"): |
1044 | 1166 | refstr += "\n" |
1045 | 1167 | return refstr.encode(defenc) |
1046 | 1168 | |
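`_prepare_ref` normalizes a ref given as bytes, str, or a ref-like object into a newline-terminated byte string for `git cat-file`'s stdin. A standalone sketch of the same logic (hard-coding utf-8 where the real method uses the module-level `defenc`):

```python
from typing import Any

def prepare_ref(ref: Any) -> bytes:
    """Return the ref as bytes, newline-terminated, ready to feed to
    a persistent `git cat-file --batch` process."""
    if isinstance(ref, bytes):
        # assume an ascii hexsha was handed in as bytes
        refstr = ref.decode('ascii')
    elif not isinstance(ref, str):
        refstr = str(ref)  # could be a ref object
    else:
        refstr = ref
    if not refstr.endswith("\n"):
        refstr += "\n"
    return refstr.encode('utf-8')
```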
1047 | def _get_persistent_cmd(self, attr_name, cmd_name, *args, **kwargs): | |
1169 | def _get_persistent_cmd(self, attr_name: str, cmd_name: str, *args: Any, **kwargs: Any | |
1170 | ) -> 'Git.AutoInterrupt': | |
1048 | 1171 | cur_val = getattr(self, attr_name) |
1049 | 1172 | if cur_val is not None: |
1050 | 1173 | return cur_val |
1054 | 1177 | |
1055 | 1178 | cmd = self._call_process(cmd_name, *args, **options) |
1056 | 1179 | setattr(self, attr_name, cmd) |
1180 | cmd = cast('Git.AutoInterrupt', cmd) | |
1057 | 1181 | return cmd |
1058 | 1182 | |
1059 | def __get_object_header(self, cmd, ref): | |
1060 | cmd.stdin.write(self._prepare_ref(ref)) | |
1061 | cmd.stdin.flush() | |
1062 | return self._parse_object_header(cmd.stdout.readline()) | |
1063 | ||
1064 | def get_object_header(self, ref): | |
1183 | def __get_object_header(self, cmd: 'Git.AutoInterrupt', ref: AnyStr) -> Tuple[str, str, int]: | |
1184 | if cmd.stdin and cmd.stdout: | |
1185 | cmd.stdin.write(self._prepare_ref(ref)) | |
1186 | cmd.stdin.flush() | |
1187 | return self._parse_object_header(cmd.stdout.readline()) | |
1188 | else: | |
1189 | raise ValueError("cmd stdin was empty") | |
1190 | ||
1191 | def get_object_header(self, ref: str) -> Tuple[str, str, int]: | |
1065 | 1192 | """ Use this method to quickly examine the type and size of the object behind |
1066 | 1193 | the given ref. |
1067 | 1194 | |
1072 | 1199 | cmd = self._get_persistent_cmd("cat_file_header", "cat_file", batch_check=True) |
1073 | 1200 | return self.__get_object_header(cmd, ref) |
1074 | 1201 | |
1075 | def get_object_data(self, ref): | |
1202 | def get_object_data(self, ref: str) -> Tuple[str, str, int, bytes]: | |
1076 | 1203 | """ As get_object_header, but returns object data as well |
1077 | 1204 | :return: (hexsha, type_string, size_as_int,data_string) |
1078 | 1205 | :note: not threadsafe""" |
1081 | 1208 | del(stream) |
1082 | 1209 | return (hexsha, typename, size, data) |
1083 | 1210 | |
1084 | def stream_object_data(self, ref): | |
1211 | def stream_object_data(self, ref: str) -> Tuple[str, str, int, 'Git.CatFileContentStream']: | |
1085 | 1212 | """ As get_object_header, but returns the data as a stream |
1086 | 1213 | |
1087 | 1214 | :return: (hexsha, type_string, size_as_int, stream) |
1088 | 1215 | :note: This method is not threadsafe, you need one independent Command instance per thread to be safe !""" |
1089 | 1216 | cmd = self._get_persistent_cmd("cat_file_all", "cat_file", batch=True) |
1090 | 1217 | hexsha, typename, size = self.__get_object_header(cmd, ref) |
1091 | return (hexsha, typename, size, self.CatFileContentStream(size, cmd.stdout)) | |
1092 | ||
1093 | def clear_cache(self): | |
1218 | cmd_stdout = cmd.stdout if cmd.stdout is not None else io.BytesIO() | |
1219 | return (hexsha, typename, size, self.CatFileContentStream(size, cmd_stdout)) | |
1220 | ||
1221 | def clear_cache(self) -> 'Git': | |
1094 | 1222 | """Clear all kinds of internal caches to release resources. |
1095 | 1223 | |
1096 | 1224 | Currently persistent commands will be interrupted. |
10 | 10 | import os |
11 | 11 | import sys |
12 | 12 | |
13 | ||
14 | 13 | from gitdb.utils.encoding import ( |
15 | 14 | force_bytes, # @UnusedImport |
16 | 15 | force_text # @UnusedImport |
17 | 16 | ) |
18 | 17 | |
18 | # typing -------------------------------------------------------------------- | |
19 | 19 | |
20 | is_win = (os.name == 'nt') | |
20 | from typing import ( | |
21 | Any, | |
22 | AnyStr, | |
23 | Dict, | |
24 | IO, | |
25 | Optional, | |
26 | Tuple, | |
27 | Type, | |
28 | Union, | |
29 | overload, | |
30 | ) | |
31 | # --------------------------------------------------------------------------- | |
32 | ||
33 | ||
34 | is_win: bool = (os.name == 'nt') | |
21 | 35 | is_posix = (os.name == 'posix') |
22 | 36 | is_darwin = (os.name == 'darwin') |
23 | 37 | defenc = sys.getfilesystemencoding() |
24 | 38 | |
25 | 39 | |
26 | def safe_decode(s): | |
40 | @overload | |
41 | def safe_decode(s: None) -> None: ... | |
42 | ||
43 | ||
44 | @overload | |
45 | def safe_decode(s: AnyStr) -> str: ... | |
46 | ||
47 | ||
48 | def safe_decode(s: Union[AnyStr, None]) -> Optional[str]: | |
27 | 49 | """Safely decodes a binary string to unicode""" |
28 | 50 | if isinstance(s, str): |
29 | 51 | return s |
30 | 52 | elif isinstance(s, bytes): |
31 | 53 | return s.decode(defenc, 'surrogateescape') |
32 | elif s is not None: | |
54 | elif s is None: | |
55 | return None | |
56 | else: | |
33 | 57 | raise TypeError('Expected bytes or text, but got %r' % (s,)) |
34 | 58 | |
35 | 59 | |
36 | def safe_encode(s): | |
37 | """Safely decodes a binary string to unicode""" | |
60 | @overload | |
61 | def safe_encode(s: None) -> None: ... | |
62 | ||
63 | ||
64 | @overload | |
65 | def safe_encode(s: AnyStr) -> bytes: ... | |
66 | ||
67 | ||
68 | def safe_encode(s: Optional[AnyStr]) -> Optional[bytes]: | |
69 | """Safely encodes a binary string to unicode""" | |
38 | 70 | if isinstance(s, str): |
39 | 71 | return s.encode(defenc) |
40 | 72 | elif isinstance(s, bytes): |
41 | 73 | return s |
42 | elif s is not None: | |
74 | elif s is None: | |
75 | return None | |
76 | else: | |
43 | 77 | raise TypeError('Expected bytes or text, but got %r' % (s,)) |
44 | 78 | |
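The paired `@overload` stubs above give `safe_decode`/`safe_encode` a `None`-propagating signature, and the rewritten bodies now return `None` explicitly instead of falling through. A self-contained sketch of the decode side (hard-coding utf-8 where the module uses `defenc`):

```python
from typing import AnyStr, Optional

def safe_decode(s: Optional[AnyStr]) -> Optional[str]:
    """Decode bytes to str with surrogateescape, pass str through,
    and map None to None - mirroring git.compat.safe_decode."""
    if isinstance(s, str):
        return s
    elif isinstance(s, bytes):
        # surrogateescape round-trips undecodable bytes losslessly
        return s.decode('utf-8', 'surrogateescape')
    elif s is None:
        return None
    else:
        raise TypeError('Expected bytes or text, but got %r' % (s,))
```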
45 | 79 | |
46 | def win_encode(s): | |
80 | @overload | |
81 | def win_encode(s: None) -> None: ... | |
82 | ||
83 | ||
84 | @overload | |
85 | def win_encode(s: AnyStr) -> bytes: ... | |
86 | ||
87 | ||
88 | def win_encode(s: Optional[AnyStr]) -> Optional[bytes]: | |
47 | 89 | """Encode unicodes for process arguments on Windows.""" |
48 | 90 | if isinstance(s, str): |
49 | 91 | return s.encode(locale.getpreferredencoding(False)) |
51 | 93 | return s |
52 | 94 | elif s is not None: |
53 | 95 | raise TypeError('Expected bytes or text, but got %r' % (s,)) |
54 | ||
55 | ||
56 | def with_metaclass(meta, *bases): | |
57 | """copied from https://github.com/Byron/bcore/blob/master/src/python/butility/future.py#L15""" | |
58 | class metaclass(meta): | |
59 | __call__ = type.__call__ | |
60 | __init__ = type.__init__ | |
61 | ||
62 | def __new__(cls, name, nbases, d): | |
63 | if nbases is None: | |
64 | return type.__new__(cls, name, (), d) | |
65 | return meta(name, bases, d) | |
66 | return metaclass(meta.__name__ + 'Helper', None, {}) | |
96 | return None |
5 | 5 | """Module containing module parser implementation able to properly read and write |
6 | 6 | configuration files""" |
7 | 7 | |
8 | import sys | |
8 | 9 | import abc |
9 | 10 | from functools import wraps |
10 | 11 | import inspect |
11 | from io import IOBase | |
12 | from io import BufferedReader, IOBase | |
12 | 13 | import logging |
13 | 14 | import os |
14 | 15 | import re |
15 | 16 | import fnmatch |
16 | from collections import OrderedDict | |
17 | 17 | |
18 | 18 | from git.compat import ( |
19 | 19 | defenc, |
20 | 20 | force_text, |
21 | with_metaclass, | |
22 | 21 | is_win, |
23 | 22 | ) |
23 | ||
24 | 24 | from git.util import LockFile |
25 | 25 | |
26 | 26 | import os.path as osp |
27 | 27 | |
28 | 28 | import configparser as cp |
29 | 29 | |
30 | # typing------------------------------------------------------- | |
31 | ||
32 | from typing import (Any, Callable, Generic, IO, List, Dict, Sequence, | |
33 | TYPE_CHECKING, Tuple, TypeVar, Union, cast, overload) | |
34 | ||
35 | from git.types import Lit_config_levels, ConfigLevels_Tup, PathLike, assert_never, _T | |
36 | ||
37 | if TYPE_CHECKING: | |
38 | from git.repo.base import Repo | |
39 | from io import BytesIO | |
40 | ||
41 | T_ConfigParser = TypeVar('T_ConfigParser', bound='GitConfigParser') | |
42 | T_OMD_value = TypeVar('T_OMD_value', str, bytes, int, float, bool) | |
43 | ||
44 | if sys.version_info[:3] < (3, 7, 2): | |
45 | # typing.OrderedDict not added until py 3.7.2 | 
46 | from collections import OrderedDict | |
47 | OrderedDict_OMD = OrderedDict | |
48 | else: | |
49 | from typing import OrderedDict | |
50 | OrderedDict_OMD = OrderedDict[str, List[T_OMD_value]] # type: ignore[assignment, misc] | |
51 | ||
52 | # ------------------------------------------------------------- | |
30 | 53 | |
31 | 54 | __all__ = ('GitConfigParser', 'SectionConstraint') |
32 | 55 | |
36 | 59 | |
37 | 60 | # invariants |
38 | 61 | # represents the configuration level of a configuration file |
39 | CONFIG_LEVELS = ("system", "user", "global", "repository") | |
62 | ||
63 | ||
64 | CONFIG_LEVELS: ConfigLevels_Tup = ("system", "user", "global", "repository") | |
65 | ||
40 | 66 | |
41 | 67 | # Section pattern to detect conditional includes. |
42 | 68 | # https://git-scm.com/docs/git-config#_conditional_includes |
44 | 70 | |
45 | 71 | |
46 | 72 | class MetaParserBuilder(abc.ABCMeta): |
47 | ||
48 | 73 | """Utlity class wrapping base-class methods into decorators that assure read-only properties""" |
49 | def __new__(cls, name, bases, clsdict): | |
74 | def __new__(cls, name: str, bases: Tuple, clsdict: Dict[str, Any]) -> 'MetaParserBuilder': | |
50 | 75 | """ |
51 | 76 | Equip all base-class methods with a needs_values decorator, and all non-const methods |
52 | 77 | with a set_dirty_and_flush_changes decorator in addition to that.""" |
72 | 97 | return new_type |
73 | 98 | |
74 | 99 | |
75 | def needs_values(func): | |
100 | def needs_values(func: Callable[..., _T]) -> Callable[..., _T]: | |
76 | 101 | """Returns method assuring we read values (on demand) before we try to access them""" |
77 | 102 | |
78 | 103 | @wraps(func) |
79 | def assure_data_present(self, *args, **kwargs): | |
104 | def assure_data_present(self: 'GitConfigParser', *args: Any, **kwargs: Any) -> _T: | |
80 | 105 | self.read() |
81 | 106 | return func(self, *args, **kwargs) |
82 | 107 | # END wrapper method |
83 | 108 | return assure_data_present |
84 | 109 | |
85 | 110 | |
86 | def set_dirty_and_flush_changes(non_const_func): | |
111 | def set_dirty_and_flush_changes(non_const_func: Callable[..., _T]) -> Callable[..., _T]: | |
87 | 112 | """Return method that checks whether given non constant function may be called. |
88 | 113 | If so, the instance will be set dirty. |
89 | 114 | Additionally, we flush the changes right to disk""" |
90 | 115 | |
91 | def flush_changes(self, *args, **kwargs): | |
116 | def flush_changes(self: 'GitConfigParser', *args: Any, **kwargs: Any) -> _T: | |
92 | 117 | rval = non_const_func(self, *args, **kwargs) |
93 | 118 | self._dirty = True |
94 | 119 | self.write() |
98 | 123 | return flush_changes |
99 | 124 | |
100 | 125 | |
101 | class SectionConstraint(object): | |
126 | class SectionConstraint(Generic[T_ConfigParser]): | |
102 | 127 | |
103 | 128 | """Constrains a ConfigParser to only option commands which are constrained to |
104 | 129 | always use the section we have been initialized with. |
111 | 136 | _valid_attrs_ = ("get_value", "set_value", "get", "set", "getint", "getfloat", "getboolean", "has_option", |
112 | 137 | "remove_section", "remove_option", "options") |
113 | 138 | |
114 | def __init__(self, config, section): | |
139 | def __init__(self, config: T_ConfigParser, section: str) -> None: | |
115 | 140 | self._config = config |
116 | 141 | self._section_name = section |
117 | 142 | |
118 | def __del__(self): | |
143 | def __del__(self) -> None: | |
119 | 144 | # Yes, for some reason, we have to call it explicitly for it to work in PY3 ! |
120 | 145 | # Apparently __del__ doesn't get called anymore if refcount becomes 0
121 | 146 | # Ridiculous ... . |
122 | 147 | self._config.release() |
123 | 148 | |
124 | def __getattr__(self, attr): | |
149 | def __getattr__(self, attr: str) -> Any: | |
125 | 150 | if attr in self._valid_attrs_: |
126 | 151 | return lambda *args, **kwargs: self._call_config(attr, *args, **kwargs) |
127 | 152 | return super(SectionConstraint, self).__getattribute__(attr) |
128 | 153 | |
129 | def _call_config(self, method, *args, **kwargs): | |
154 | def _call_config(self, method: str, *args: Any, **kwargs: Any) -> Any: | |
130 | 155 | """Call the configuration at the given method which must take a section name |
131 | 156 | as first argument""" |
132 | 157 | return getattr(self._config, method)(self._section_name, *args, **kwargs) |
133 | 158 | |
134 | 159 | @property |
135 | def config(self): | |
160 | def config(self) -> T_ConfigParser: | |
136 | 161 | """return: Configparser instance we constrain""" |
137 | 162 | return self._config |
138 | 163 | |
139 | def release(self): | |
164 | def release(self) -> None: | |
140 | 165 | """Equivalent to GitConfigParser.release(), which is called on our underlying parser instance""" |
141 | 166 | return self._config.release() |
142 | 167 | |
143 | def __enter__(self): | |
168 | def __enter__(self) -> 'SectionConstraint[T_ConfigParser]': | |
144 | 169 | self._config.__enter__() |
145 | 170 | return self |
146 | 171 | |
147 | def __exit__(self, exception_type, exception_value, traceback): | |
172 | def __exit__(self, exception_type: str, exception_value: str, traceback: str) -> None: | |
148 | 173 | self._config.__exit__(exception_type, exception_value, traceback) |
149 | 174 | |
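`SectionConstraint` delegates a whitelist of parser methods, injecting its section name as the first argument. A minimal sketch of that delegation (simplified: the real class is `Generic` over the parser type and also manages the parser's lock in `__del__`/`release`; `ToyParser` is illustrative):

```python
from typing import Any

class SectionConstraint:
    """Expose only whitelisted parser methods, pre-bound to one section."""

    _valid_attrs_ = ("get_value", "set_value", "has_option")

    def __init__(self, config: Any, section: str) -> None:
        self._config = config
        self._section_name = section

    def __getattr__(self, attr: str) -> Any:
        if attr in self._valid_attrs_:
            # inject the section name as the first positional argument
            return lambda *args, **kwargs: getattr(self._config, attr)(
                self._section_name, *args, **kwargs)
        return super().__getattribute__(attr)

class ToyParser:
    def get_value(self, section: str, option: str) -> str:
        return f"{section}.{option}"

user = SectionConstraint(ToyParser(), "user")
assert user.get_value("email") == "user.email"  # section injected for us
```

Parameterizing the real class as `SectionConstraint[T_ConfigParser]` lets the `config` property return the concrete parser type instead of a bare `GitConfigParser`.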
150 | 175 | |
151 | class _OMD(OrderedDict): | |
176 | class _OMD(OrderedDict_OMD): | |
152 | 177 | """Ordered multi-dict.""" |
153 | 178 | |
154 | def __setitem__(self, key, value): | |
179 | def __setitem__(self, key: str, value: _T) -> None: | |
155 | 180 | super(_OMD, self).__setitem__(key, [value]) |
156 | 181 | |
157 | def add(self, key, value): | |
182 | def add(self, key: str, value: Any) -> None: | |
183 | if key not in self: | |
184 | super(_OMD, self).__setitem__(key, [value]) | |
185 | return None | |
186 | super(_OMD, self).__getitem__(key).append(value) | |
187 | ||
188 | def setall(self, key: str, values: List[_T]) -> None: | |
189 | super(_OMD, self).__setitem__(key, values) | |
190 | ||
191 | def __getitem__(self, key: str) -> Any: | |
192 | return super(_OMD, self).__getitem__(key)[-1] | |
193 | ||
194 | def getlast(self, key: str) -> Any: | |
195 | return super(_OMD, self).__getitem__(key)[-1] | |
196 | ||
197 | def setlast(self, key: str, value: Any) -> None: | |
158 | 198 | if key not in self: |
159 | 199 | super(_OMD, self).__setitem__(key, [value]) |
160 | 200 | return |
161 | 201 | |
162 | super(_OMD, self).__getitem__(key).append(value) | |
163 | ||
164 | def setall(self, key, values): | |
165 | super(_OMD, self).__setitem__(key, values) | |
166 | ||
167 | def __getitem__(self, key): | |
168 | return super(_OMD, self).__getitem__(key)[-1] | |
169 | ||
170 | def getlast(self, key): | |
171 | return super(_OMD, self).__getitem__(key)[-1] | |
172 | ||
173 | def setlast(self, key, value): | |
174 | if key not in self: | |
175 | super(_OMD, self).__setitem__(key, [value]) | |
176 | return | |
177 | ||
178 | 202 | prior = super(_OMD, self).__getitem__(key) |
179 | 203 | prior[-1] = value |
180 | 204 | |
181 | def get(self, key, default=None): | |
205 | def get(self, key: str, default: Union[_T, None] = None) -> Union[_T, None]: | |
182 | 206 | return super(_OMD, self).get(key, [default])[-1] |
183 | 207 | |
184 | def getall(self, key): | |
208 | def getall(self, key: str) -> List[_T]: | |
185 | 209 | return super(_OMD, self).__getitem__(key) |
186 | 210 | |
187 | def items(self): | |
211 | def items(self) -> List[Tuple[str, _T]]: # type: ignore[override] | |
188 | 212 | """List of (key, last value for key).""" |
189 | 213 | return [(k, self[k]) for k in self] |
190 | 214 | |
191 | def items_all(self): | |
215 | def items_all(self) -> List[Tuple[str, List[_T]]]: | |
192 | 216 | """List of (key, list of values for key).""" |
193 | 217 | return [(k, self.getall(k)) for k in self] |
194 | 218 | |
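The `_OMD` class above is an ordered multi-dict: every key maps to a list of values, and plain item access returns the most recently added one, matching how git resolves repeated config options. A behavioural sketch with only the core methods:

```python
from collections import OrderedDict
from typing import Any, List

class OMD(OrderedDict):
    """Ordered multi-dict sketch: each key holds a list of values."""

    def __setitem__(self, key: str, value: Any) -> None:
        super().__setitem__(key, [value])        # assignment resets history

    def add(self, key: str, value: Any) -> None:
        if key not in self:
            super().__setitem__(key, [value])
        else:
            super().__getitem__(key).append(value)

    def __getitem__(self, key: str) -> Any:
        return super().__getitem__(key)[-1]      # last value wins

    def getall(self, key: str) -> List[Any]:
        return super().__getitem__(key)

omd = OMD()
omd["core.editor"] = "vim"
omd.add("core.editor", "nano")
assert omd["core.editor"] == "nano"                  # last value wins
assert omd.getall("core.editor") == ["vim", "nano"]  # full history kept
```

The diff's version-gated `OrderedDict_OMD` base exists only so this subclass can carry `[str, List[T_OMD_value]]` type parameters on Pythons that support them.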
195 | 219 | |
196 | def get_config_path(config_level): | |
220 | def get_config_path(config_level: Lit_config_levels) -> str: | |
197 | 221 | |
198 | 222 | # we do not support an absolute path of the gitconfig on windows , |
199 | 223 | # use the global config instead |
209 | 233 | return osp.normpath(osp.expanduser("~/.gitconfig")) |
210 | 234 | elif config_level == "repository": |
211 | 235 | raise ValueError("No repo to get repository configuration from. Use Repo._get_config_path") |
212 | ||
213 | raise ValueError("Invalid configuration level: %r" % config_level) | |
214 | ||
215 | ||
216 | class GitConfigParser(with_metaclass(MetaParserBuilder, cp.RawConfigParser, object)): | |
236 | else: | |
237 | # Should not reach here. Will raise ValueError if it does. Static typing will warn about missing elifs |
238 | assert_never(config_level, # type: ignore[unreachable] | |
239 | ValueError(f"Invalid configuration level: {config_level!r}")) | |
240 | ||
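The `assert_never` call replacing the bare `raise ValueError` turns the final `else` into an exhaustiveness check: mypy verifies every member of the `Literal` was handled. A simplified stand-in for the helper from `git.types` (this sketch assumes Python 3.8+ for `typing.Literal`; the real helper's signature differs slightly):

```python
from typing import Literal, NoReturn

ConfigLevel = Literal["system", "user", "global", "repository"]

def assert_never(value: NoReturn, exc: Exception) -> NoReturn:
    """Statically, mypy only accepts this call if every Literal member was
    handled above it; at runtime an unexpected value still raises."""
    raise exc

def config_scope(level: ConfigLevel) -> str:
    if level == "repository":
        return "per-repository"
    elif level in ("system", "user", "global"):
        return "machine/user-wide"
    else:
        # unreachable for well-typed callers; defensive at runtime
        assert_never(level,  # type: ignore[arg-type]
                     ValueError(f"Invalid configuration level: {level!r}"))

assert config_scope("repository") == "per-repository"
```

If a new level were later added to the `Literal` without a matching branch, the type checker would flag the `assert_never` call rather than letting the gap slip through silently.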
241 | ||
242 | class GitConfigParser(cp.RawConfigParser, metaclass=MetaParserBuilder): | |
217 | 243 | |
218 | 244 | """Implements specifics required to read git style configuration files. |
219 | 245 | |
251 | 277 | # list of RawConfigParser methods able to change the instance |
252 | 278 | _mutating_methods_ = ("add_section", "remove_section", "remove_option", "set") |
253 | 279 | |
254 | def __init__(self, file_or_files=None, read_only=True, merge_includes=True, config_level=None, repo=None): | |
280 | def __init__(self, file_or_files: Union[None, PathLike, 'BytesIO', Sequence[Union[PathLike, 'BytesIO']]] = None, | |
281 | read_only: bool = True, merge_includes: bool = True, | |
282 | config_level: Union[Lit_config_levels, None] = None, | |
283 | repo: Union['Repo', None] = None) -> None: | |
255 | 284 | """Initialize a configuration reader to read the given file_or_files and to |
256 | 285 | possibly allow changes to it by setting read_only False |
257 | 286 | |
271 | 300 | |
272 | 301 | """ |
273 | 302 | cp.RawConfigParser.__init__(self, dict_type=_OMD) |
303 | self._dict: Callable[..., _OMD] # type: ignore # mypy/typeshed bug? | |
304 | self._defaults: _OMD | |
305 | self._sections: _OMD # type: ignore # mypy/typeshed bug? | |
274 | 306 | |
275 | 307 | # Used in python 3, needs to stay in sync with sections for underlying implementation to work |
276 | 308 | if not hasattr(self, '_proxies'): |
277 | 309 | self._proxies = self._dict() |
278 | 310 | |
279 | 311 | if file_or_files is not None: |
280 | self._file_or_files = file_or_files | |
312 | self._file_or_files: Union[PathLike, 'BytesIO', Sequence[Union[PathLike, 'BytesIO']]] = file_or_files | |
281 | 313 | else: |
282 | 314 | if config_level is None: |
283 | 315 | if read_only: |
284 | self._file_or_files = [get_config_path(f) for f in CONFIG_LEVELS if f != 'repository'] | |
316 | self._file_or_files = [get_config_path(cast(Lit_config_levels, f)) | |
317 | for f in CONFIG_LEVELS | |
318 | if f != 'repository'] | |
285 | 319 | else: |
286 | 320 | raise ValueError("No configuration level or configuration files specified") |
287 | 321 | else: |
292 | 326 | self._is_initialized = False |
293 | 327 | self._merge_includes = merge_includes |
294 | 328 | self._repo = repo |
295 | self._lock = None | |
329 | self._lock: Union['LockFile', None] = None | |
296 | 330 | self._acquire_lock() |
297 | 331 | |
298 | def _acquire_lock(self): | |
332 | def _acquire_lock(self) -> None: | |
299 | 333 | if not self._read_only: |
300 | 334 | if not self._lock: |
301 | if isinstance(self._file_or_files, (tuple, list)): | |
335 | if isinstance(self._file_or_files, (str, os.PathLike)): | |
336 | file_or_files = self._file_or_files | |
337 | elif isinstance(self._file_or_files, (tuple, list, Sequence)): | |
302 | 338 | raise ValueError( |
303 | 339 | "Write-ConfigParsers can operate on a single file only, multiple files have been passed") |
304 | # END single file check | |
305 | ||
306 | file_or_files = self._file_or_files | |
307 | if not isinstance(self._file_or_files, str): | |
340 | else: | |
308 | 341 | file_or_files = self._file_or_files.name |
342 | ||
309 | 343 | # END get filename from handle/stream |
310 | 344 | # initialize lock base - we want to write |
311 | 345 | self._lock = self.t_lock(file_or_files) |
314 | 348 | self._lock._obtain_lock() |
315 | 349 | # END read-only check |
316 | 350 | |
317 | def __del__(self): | |
351 | def __del__(self) -> None: | |
318 | 352 | """Write pending changes if required and release locks""" |
319 | 353 | # NOTE: only consistent in PY2 |
320 | 354 | self.release() |
321 | 355 | |
322 | def __enter__(self): | |
356 | def __enter__(self) -> 'GitConfigParser': | |
323 | 357 | self._acquire_lock() |
324 | 358 | return self |
325 | 359 | |
326 | def __exit__(self, exception_type, exception_value, traceback): | |
360 | def __exit__(self, *args: Any) -> None: | |
327 | 361 | self.release() |
328 | 362 | |
329 | def release(self): | |
363 | def release(self) -> None: | |
330 | 364 | """Flush changes and release the configuration write lock. This instance must not be used anymore afterwards. |
331 | 365 | In Python 3, it's required to explicitly release locks and flush changes, as __del__ is not called |
332 | 366 | deterministically anymore.""" |
346 | 380 | # Usually when shutting down the interpreter, don't know how to fix this
347 | 381 | pass |
348 | 382 | finally: |
349 | self._lock._release_lock() | |
350 | ||
351 | def optionxform(self, optionstr): | |
383 | if self._lock is not None: | |
384 | self._lock._release_lock() | |
385 | ||
386 | def optionxform(self, optionstr: str) -> str: | |
352 | 387 | """Do not transform options in any way when writing""" |
353 | 388 | return optionstr |
354 | 389 | |
355 | def _read(self, fp, fpname): | |
390 | def _read(self, fp: Union[BufferedReader, IO[bytes]], fpname: str) -> None: | |
356 | 391 | """A direct copy of the py2.4 version of the super class's _read method |
357 | 392 | to assure it uses ordered dicts. Had to change one line to make it work. |
358 | 393 | |
368 | 403 | is_multi_line = False |
369 | 404 | e = None # None, or an exception |
370 | 405 | |
371 | def string_decode(v): | |
406 | def string_decode(v: str) -> str: | |
372 | 407 | if v[-1] == '\\': |
373 | 408 | v = v[:-1] |
374 | 409 | # end cut trailing escapes to prevent decode error |
393 | 428 | # is it a section header? |
394 | 429 | mo = self.SECTCRE.match(line.strip()) |
395 | 430 | if not is_multi_line and mo: |
396 | sectname = mo.group('header').strip() | |
431 | sectname: str = mo.group('header').strip() | |
397 | 432 | if sectname in self._sections: |
398 | 433 | cursect = self._sections[sectname] |
399 | 434 | elif sectname == cp.DEFAULTSECT: |
450 | 485 | if e: |
451 | 486 | raise e |
452 | 487 | |
453 | def _has_includes(self): | |
488 | def _has_includes(self) -> Union[bool, int]: | |
454 | 489 | return self._merge_includes and len(self._included_paths()) |
455 | 490 | |
456 | def _included_paths(self): | |
457 | """Return all paths that must be included to configuration. | |
491 | def _included_paths(self) -> List[Tuple[str, str]]: | |
492 | """Return a list of all paths that must be included in the configuration, |
493 | as tuples of (option, value). |
458 | 494 | """ |
459 | 495 | paths = [] |
460 | 496 | |
487 | 523 | ), |
488 | 524 | value |
489 | 525 | ) |
490 | ||
491 | if fnmatch.fnmatchcase(self._repo.git_dir, value): | |
492 | paths += self.items(section) | |
526 | if self._repo.git_dir: | |
527 | if fnmatch.fnmatchcase(str(self._repo.git_dir), value): | |
528 | paths += self.items(section) | |
493 | 529 | |
494 | 530 | elif keyword == "onbranch": |
495 | 531 | try: |
503 | 539 | |
504 | 540 | return paths |
505 | 541 | |
506 | def read(self): | |
542 | def read(self) -> None: # type: ignore[override] | |
507 | 543 | """Reads the data stored in the files we have been initialized with. It will |
508 | 544 | ignore files that cannot be read, possibly leaving an empty configuration |
509 | 545 | |
510 | 546 | :return: Nothing |
511 | 547 | :raise IOError: if a file cannot be handled""" |
512 | 548 | if self._is_initialized: |
513 | return | |
549 | return None | |
514 | 550 | self._is_initialized = True |
515 | 551 | |
516 | if not isinstance(self._file_or_files, (tuple, list)): | |
552 | files_to_read: List[Union[PathLike, IO]] = [""] | |
553 | if isinstance(self._file_or_files, (str, os.PathLike)): | |
554 | # for str or Path, as str is a type of Sequence | |
517 | 555 | files_to_read = [self._file_or_files] |
518 | else: | |
556 | elif not isinstance(self._file_or_files, (tuple, list, Sequence)): | |
557 | # could merge with the isinstance check above once the runtime type is known |
558 | files_to_read = [self._file_or_files] | |
559 | else: # for lists or tuples | |
519 | 560 | files_to_read = list(self._file_or_files) |
520 | 561 | # end assure we have a copy of the paths to handle |
521 | 562 | |
523 | 564 | num_read_include_files = 0 |
524 | 565 | while files_to_read: |
525 | 566 | file_path = files_to_read.pop(0) |
526 | fp = file_path | |
527 | 567 | file_ok = False |
528 | 568 | |
529 | if hasattr(fp, "seek"): | |
530 | self._read(fp, fp.name) | |
569 | if hasattr(file_path, "seek"): | |
570 | # must be a file-object |
571 | file_path = cast(IO[bytes], file_path) # replace with assert to narrow type, once sure | |
572 | self._read(file_path, file_path.name) | |
531 | 573 | else: |
532 | 574 | # assume a path if it is not a file-object |
575 | file_path = cast(PathLike, file_path) | |
533 | 576 | try: |
534 | 577 | with open(file_path, 'rb') as fp: |
535 | 578 | file_ok = True |
547 | 590 | if not file_ok: |
548 | 591 | continue |
549 | 592 | # end ignore relative paths if we don't know the configuration file path |
593 | file_path = cast(PathLike, file_path) | |
550 | 594 | assert osp.isabs(file_path), "Need absolute paths to be sure our cycle checks will work" |
551 | 595 | include_path = osp.join(osp.dirname(file_path), include_path) |
552 | 596 | # end make include path absolute |
567 | 611 | self._merge_includes = False |
568 | 612 | # end |
569 | 613 | |
570 | def _write(self, fp): | |
614 | def _write(self, fp: IO) -> None: | |
571 | 615 | """Write an .ini-format representation of the configuration state in |
572 | 616 | git compatible format""" |
573 | def write_section(name, section_dict): | |
617 | def write_section(name: str, section_dict: _OMD) -> None: | |
574 | 618 | fp.write(("[%s]\n" % name).encode(defenc)) |
619 | ||
620 | values: Sequence[str] # runtime only gets str in tests, but should be whatever _OMD stores | |
621 | v: str | |
575 | 622 | for (key, values) in section_dict.items_all(): |
576 | 623 | if key == "__name__": |
577 | 624 | continue |
583 | 630 | |
584 | 631 | if self._defaults: |
585 | 632 | write_section(cp.DEFAULTSECT, self._defaults) |
633 | value: _OMD | |
634 | ||
586 | 635 | for name, value in self._sections.items(): |
587 | 636 | write_section(name, value) |
588 | 637 | |
589 | def items(self, section_name): | |
638 | def items(self, section_name: str) -> List[Tuple[str, str]]: # type: ignore[override] | |
590 | 639 | """:return: list((option, value), ...) pairs of all items in the given section""" |
591 | 640 | return [(k, v) for k, v in super(GitConfigParser, self).items(section_name) if k != '__name__'] |
592 | 641 | |
593 | def items_all(self, section_name): | |
642 | def items_all(self, section_name: str) -> List[Tuple[str, List[str]]]: | |
594 | 643 | """:return: list((option, [values...]), ...) pairs of all items in the given section""" |
595 | 644 | rv = _OMD(self._defaults) |
596 | 645 | |
607 | 656 | return rv.items_all() |
608 | 657 | |
609 | 658 | @needs_values |
610 | def write(self): | |
659 | def write(self) -> None: | |
611 | 660 | """Write changes to our file, if there are changes at all |
612 | 661 | |
613 | 662 | :raise IOError: if this is a read-only writer instance or if we could not obtain |
614 | 663 | a file lock""" |
615 | 664 | self._assure_writable("write") |
616 | 665 | if not self._dirty: |
617 | return | |
666 | return None | |
618 | 667 | |
619 | 668 | if isinstance(self._file_or_files, (list, tuple)): |
620 | 669 | raise AssertionError("Cannot write back if there is not exactly a single file to write to, have %i files" |
624 | 673 | if self._has_includes(): |
625 | 674 | log.debug("Skipping write-back of configuration file as include files were merged in." + |
626 | 675 | "Set merge_includes=False to prevent this.") |
627 | return | |
676 | return None | |
628 | 677 | # end |
629 | 678 | |
630 | 679 | fp = self._file_or_files |
631 | 680 | |
632 | 681 | # we have a physical file on disk, so get a lock |
633 | is_file_lock = isinstance(fp, (str, IOBase)) | |
634 | if is_file_lock: | |
682 | is_file_lock = isinstance(fp, (str, os.PathLike, IOBase)) # can't use Pathlike until 3.5 dropped | |
683 | if is_file_lock and self._lock is not None: # else raise Error? | |
635 | 684 | self._lock._obtain_lock() |
685 | ||
636 | 686 | if not hasattr(fp, "seek"): |
637 | with open(self._file_or_files, "wb") as fp: | |
638 | self._write(fp) | |
687 | fp = cast(PathLike, fp) | |
688 | with open(fp, "wb") as fp_open: | |
689 | self._write(fp_open) | |
639 | 690 | else: |
691 | fp = cast('BytesIO', fp) | |
640 | 692 | fp.seek(0) |
641 | 693 | # make sure we do not overwrite into an existing file |
642 | 694 | if hasattr(fp, 'truncate'): |
643 | 695 | fp.truncate() |
644 | 696 | self._write(fp) |
645 | 697 | |
646 | def _assure_writable(self, method_name): | |
698 | def _assure_writable(self, method_name: str) -> None: | |
647 | 699 | if self.read_only: |
648 | 700 | raise IOError("Cannot execute non-constant method %s.%s" % (self, method_name)) |
649 | 701 | |
650 | def add_section(self, section): | |
702 | def add_section(self, section: str) -> None: | |
651 | 703 | """Assures added options will stay in order""" |
652 | 704 | return super(GitConfigParser, self).add_section(section) |
653 | 705 | |
654 | 706 | @property |
655 | def read_only(self): | |
707 | def read_only(self) -> bool: | |
656 | 708 | """:return: True if this instance may change the configuration file""" |
657 | 709 | return self._read_only |
658 | 710 | |
659 | def get_value(self, section, option, default=None): | |
711 | @overload | |
712 | def get_value(self, section: str, option: str, default: None = None) -> Union[int, float, str, bool]: ... | |
713 | ||
714 | @overload | |
715 | def get_value(self, section: str, option: str, default: str) -> str: ... | |
716 | ||
717 | @overload | |
718 | def get_value(self, section: str, option: str, default: float) -> float: ... | |
719 | ||
720 | def get_value(self, section: str, option: str, default: Union[int, float, str, bool, None] = None | |
721 | ) -> Union[int, float, str, bool]: | |
722 | # can default or return type include bool? | |
660 | 723 | """Get an option's value. |
661 | 724 | |
662 | 725 | If multiple values are specified for this option in the section, the |
678 | 741 | |
679 | 742 | return self._string_to_value(valuestr) |
680 | 743 | |
681 | def get_values(self, section, option, default=None): | |
744 | def get_values(self, section: str, option: str, default: Union[int, float, str, bool, None] = None | |
745 | ) -> List[Union[int, float, str, bool]]: | |
682 | 746 | """Get an option's values. |
683 | 747 | |
684 | 748 | If multiple values are specified for this option in the section, all are |
700 | 764 | |
701 | 765 | return [self._string_to_value(valuestr) for valuestr in lst] |
702 | 766 | |
703 | def _string_to_value(self, valuestr): | |
767 | def _string_to_value(self, valuestr: str) -> Union[int, float, str, bool]: | |
704 | 768 | types = (int, float) |
705 | 769 | for numtype in types: |
706 | 770 | try: |
707 | 771 | val = numtype(valuestr) |
708 | ||
709 | 772 | # truncated value ? |
710 | 773 | if val != float(valuestr): |
711 | 774 | continue |
712 | ||
713 | 775 | return val |
714 | 776 | except (ValueError, TypeError): |
715 | 777 | continue |
729 | 791 | |
730 | 792 | return valuestr |
731 | 793 | |
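`_string_to_value` (typed above, with part of its body elided by the diff context) coerces config strings by trying int, then float while rejecting lossy conversions, then boolean words, and finally falling back to the raw string. A self-contained sketch; the exact boolean spellings accepted here are illustrative, as git recognises several:

```python
from typing import Union

def string_to_value(valuestr: str) -> Union[int, float, str, bool]:
    """Coerce a git config string: int first, then float (skipping lossy
    conversions), then boolean words, else keep the raw string."""
    for numtype in (int, float):
        try:
            val = numtype(valuestr)
            if val != float(valuestr):  # truncated value? keep trying
                continue
            return val
        except (ValueError, TypeError):
            continue
    if valuestr.lower() in ("true", "yes", "on"):
        return True
    if valuestr.lower() in ("false", "no", "off"):
        return False
    return valuestr

assert string_to_value("42") == 42
assert string_to_value("1.5") == 1.5
assert string_to_value("yes") is True
assert string_to_value("vim") == "vim"
```

This ordering explains the typed return of `get_value`: the union `Union[int, float, str, bool]` is exactly the set of types the coercion can produce.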
732 | def _value_to_string(self, value): | |
794 | def _value_to_string(self, value: Union[str, bytes, int, float, bool]) -> str: | |
733 | 795 | if isinstance(value, (int, float, bool)): |
734 | 796 | return str(value) |
735 | 797 | return force_text(value) |
736 | 798 | |
737 | 799 | @needs_values |
738 | 800 | @set_dirty_and_flush_changes |
739 | def set_value(self, section, option, value): | |
801 | def set_value(self, section: str, option: str, value: Union[str, bytes, int, float, bool]) -> 'GitConfigParser': | |
740 | 802 | """Sets the given option in section to the given value. |
741 | 803 | It will create the section if required, and will not throw as opposed to the default |
742 | 804 | ConfigParser 'set' method. |
754 | 816 | |
755 | 817 | @needs_values |
756 | 818 | @set_dirty_and_flush_changes |
757 | def add_value(self, section, option, value): | |
819 | def add_value(self, section: str, option: str, value: Union[str, bytes, int, float, bool]) -> 'GitConfigParser': | |
758 | 820 | """Adds a value for the given option in section. |
759 | 821 | It will create the section if required, and will not throw as opposed to the default |
760 | 822 | ConfigParser 'set' method. The value becomes the new value of the option as returned |
771 | 833 | self._sections[section].add(option, self._value_to_string(value)) |
772 | 834 | return self |
773 | 835 | |
774 | def rename_section(self, section, new_name): | |
836 | def rename_section(self, section: str, new_name: str) -> 'GitConfigParser': | |
775 | 837 | """rename the given section to new_name |
776 | 838 | :raise ValueError: if section doesn't exist
777 | 839 | :raise ValueError: if a section with new_name does already exist |
6 | 6 | from gitdb.db import GitDB # @UnusedImport |
7 | 7 | from gitdb.db import LooseObjectDB |
8 | 8 | |
9 | from .exc import ( | |
10 | GitCommandError, | |
11 | BadObject | |
12 | ) | |
9 | from gitdb.exc import BadObject | |
10 | from git.exc import GitCommandError | |
11 | ||
12 | # typing------------------------------------------------- | |
13 | ||
14 | from typing import TYPE_CHECKING | |
15 | from git.types import PathLike | |
16 | ||
17 | if TYPE_CHECKING: | |
18 | from git.cmd import Git | |
13 | 19 | |
14 | 20 | |
21 | # -------------------------------------------------------- | |
22 | ||
15 | 23 | __all__ = ('GitCmdObjectDB', 'GitDB') |
16 | ||
17 | # class GitCmdObjectDB(CompoundDB, ObjectDBW): | |
18 | 24 | |
19 | 25 | |
20 | 26 | class GitCmdObjectDB(LooseObjectDB): |
27 | 33 | have packs and the other implementations |
28 | 34 | """ |
29 | 35 | |
30 | def __init__(self, root_path, git): | |
36 | def __init__(self, root_path: PathLike, git: 'Git') -> None: | |
31 | 37 | """Initialize this instance with the root and a git command""" |
32 | 38 | super(GitCmdObjectDB, self).__init__(root_path) |
33 | 39 | self._git = git |
34 | 40 | |
35 | def info(self, sha): | |
36 | hexsha, typename, size = self._git.get_object_header(bin_to_hex(sha)) | |
41 | def info(self, binsha: bytes) -> OInfo: | |
42 | hexsha, typename, size = self._git.get_object_header(bin_to_hex(binsha)) | |
37 | 43 | return OInfo(hex_to_bin(hexsha), typename, size) |
38 | 44 | |
39 | def stream(self, sha): | |
45 | def stream(self, binsha: bytes) -> OStream: | |
40 | 46 | """For now, all lookup is done by git itself""" |
41 | hexsha, typename, size, stream = self._git.stream_object_data(bin_to_hex(sha)) | |
47 | hexsha, typename, size, stream = self._git.stream_object_data(bin_to_hex(binsha)) | |
42 | 48 | return OStream(hex_to_bin(hexsha), typename, size, stream) |
43 | 49 | |
44 | 50 | # { Interface |
45 | 51 | |
46 | def partial_to_complete_sha_hex(self, partial_hexsha): | |
52 | def partial_to_complete_sha_hex(self, partial_hexsha: str) -> bytes: | |
47 | 53 | """:return: Full binary 20 byte sha from the given partial hexsha |
48 | 54 | :raise AmbiguousObjectName: |
49 | 55 | :raise BadObject: |
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | ||
5 | 6 | import re |
6 | ||
7 | 7 | from git.cmd import handle_process_output |
8 | 8 | from git.compat import defenc |
9 | 9 | from git.util import finalize_process, hex_to_bin |
12 | 12 | from .objects.util import mode_str_to_int |
13 | 13 | |
14 | 14 | |
15 | # typing ------------------------------------------------------------------ | |
16 | ||
17 | from typing import Any, Iterator, List, Match, Optional, Tuple, Type, TypeVar, Union, TYPE_CHECKING, cast | |
18 | from git.types import PathLike, Literal | |
19 | ||
20 | if TYPE_CHECKING: | |
21 | from .objects.tree import Tree | |
22 | from .objects import Commit | |
23 | from git.repo.base import Repo | |
24 | from git.objects.base import IndexObject | |
25 | from subprocess import Popen | |
26 | from git import Git | |
27 | ||
28 | Lit_change_type = Literal['A', 'D', 'C', 'M', 'R', 'T', 'U'] | |
29 | ||
30 | ||
31 | # def is_change_type(inp: str) -> TypeGuard[Lit_change_type]: | |
32 | # # return True | |
33 | # return inp in ['A', 'D', 'C', 'M', 'R', 'T', 'U'] | |
34 | ||
35 | # ------------------------------------------------------------------------ | |
36 | ||
37 | ||
15 | 38 | __all__ = ('Diffable', 'DiffIndex', 'Diff', 'NULL_TREE') |
16 | 39 | |
17 | 40 | # Special object to compare against the empty tree in diffs |
20 | 43 | _octal_byte_re = re.compile(b'\\\\([0-9]{3})') |
21 | 44 | |
22 | 45 | |
23 | def _octal_repl(matchobj): | |
46 | def _octal_repl(matchobj: Match) -> bytes: | |
24 | 47 | value = matchobj.group(1) |
25 | 48 | value = int(value, 8) |
26 | 49 | value = bytes(bytearray((value,))) |
27 | 50 | return value |
28 | 51 | |
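git quotes non-ASCII bytes in paths as a backslash followed by three octal digits; `_octal_repl` above converts one such escape back to its byte. A standalone sketch of the full unquoting step built around it (the `unquote_path` wrapper name is illustrative):

```python
import re

# a literal backslash followed by three octal digits
_octal_byte_re = re.compile(b'\\\\([0-9]{3})')

def _octal_repl(matchobj: "re.Match[bytes]") -> bytes:
    # convert the octal digits back into the single byte they encode
    return bytes([int(matchobj.group(1), 8)])

def unquote_path(path: bytes) -> bytes:
    """Undo git's octal quoting of non-ASCII path bytes (sketch)."""
    return _octal_byte_re.sub(_octal_repl, path)

# "café.txt" as git prints it: the UTF-8 bytes 0xC3 0xA9 become \303\251
assert unquote_path(b'caf\\303\\251.txt') == "café.txt".encode("utf-8")
```

Typing the callback as taking a `Match` and returning `bytes` mirrors the annotation added in the hunk and matches what `re.sub` expects for a replacement function over byte patterns.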
29 | 52 | |
30 | def decode_path(path, has_ab_prefix=True): | |
53 | def decode_path(path: bytes, has_ab_prefix: bool = True) -> Optional[bytes]: | |
31 | 54 | if path == b'/dev/null': |
32 | 55 | return None |
33 | 56 | |
59 | 82 | class Index(object): |
60 | 83 | pass |
61 | 84 | |
62 | def _process_diff_args(self, args): | |
85 | def _process_diff_args(self, args: List[Union[str, 'Diffable', Type['Diffable.Index'], object]] | |
86 | ) -> List[Union[str, 'Diffable', Type['Diffable.Index'], object]]: | |
63 | 87 | """ |
64 | 88 | :return: |
65 | 89 | possibly altered version of the given args list. |
67 | 91 | Subclasses can use it to alter the behaviour of the superclass""" |
68 | 92 | return args |
69 | 93 | |
70 | def diff(self, other=Index, paths=None, create_patch=False, **kwargs): | |
94 | def diff(self, other: Union[Type['Index'], 'Tree', 'Commit', None, str, object] = Index, | |
95 | paths: Union[PathLike, List[PathLike], Tuple[PathLike, ...], None] = None, | |
96 | create_patch: bool = False, **kwargs: Any) -> 'DiffIndex': | |
71 | 97 | """Creates diffs between two items being trees, trees and index or an |
72 | 98 | index and the working tree. It will detect renames automatically. |
73 | 99 | |
98 | 124 | :note: |
99 | 125 | On a bare repository, 'other' needs to be provided as Index or as |
100 | 126 | Tree/Commit, or a git command error will occur
101 | args = [] | |
127 | args: List[Union[PathLike, Diffable, Type['Diffable.Index'], object]] = [] | |
102 | 128 | args.append("--abbrev=40") # we need full shas |
103 | 129 | args.append("--full-index") # get full index paths, not only filenames |
104 | 130 | |
115 | 141 | |
116 | 142 | if paths is not None and not isinstance(paths, (tuple, list)): |
117 | 143 | paths = [paths] |
144 | ||
145 | if hasattr(self, 'Has_Repo'): | |
146 | self.repo: 'Repo' = self.repo | |
118 | 147 | |
119 | 148 | diff_cmd = self.repo.git.diff |
120 | 149 | if other is self.Index: |
148 | 177 | return index |
149 | 178 | |
150 | 179 | |
151 | class DiffIndex(list): | |
180 | T_Diff = TypeVar('T_Diff', bound='Diff') | |
181 | ||
182 | ||
183 | class DiffIndex(List[T_Diff]): | |
152 | 184 | |
153 | 185 | """Implements an Index for diffs, allowing a list of Diffs to be queried by |
154 | 186 | the diff properties. |
162 | 194 | # T = Changed in the type |
163 | 195 | change_type = ("A", "C", "D", "R", "M", "T") |
164 | 196 | |
165 | def iter_change_type(self, change_type): | |
197 | def iter_change_type(self, change_type: Lit_change_type) -> Iterator[T_Diff]: | |
166 | 198 | """ |
167 | 199 | :return: |
168 | 200 | iterator yielding Diff instances that match the given change_type |
179 | 211 | if change_type not in self.change_type: |
180 | 212 | raise ValueError("Invalid change type: %s" % change_type) |
181 | 213 | |
182 | for diff in self: | |
183 | if diff.change_type == change_type: | |
184 | yield diff | |
185 | elif change_type == "A" and diff.new_file: | |
186 | yield diff | |
187 | elif change_type == "D" and diff.deleted_file: | |
188 | yield diff | |
189 | elif change_type == "C" and diff.copied_file: | |
190 | yield diff | |
191 | elif change_type == "R" and diff.renamed: | |
192 | yield diff | |
193 | elif change_type == "M" and diff.a_blob and diff.b_blob and diff.a_blob != diff.b_blob: | |
194 | yield diff | |
214 | for diffidx in self: | |
215 | if diffidx.change_type == change_type: | |
216 | yield diffidx | |
217 | elif change_type == "A" and diffidx.new_file: | |
218 | yield diffidx | |
219 | elif change_type == "D" and diffidx.deleted_file: | |
220 | yield diffidx | |
221 | elif change_type == "C" and diffidx.copied_file: | |
222 | yield diffidx | |
223 | elif change_type == "R" and diffidx.renamed: | |
224 | yield diffidx | |
225 | elif change_type == "M" and diffidx.a_blob and diffidx.b_blob and diffidx.a_blob != diffidx.b_blob: | |
226 | yield diffidx | |
195 | 227 | # END for each diff |
196 | 228 | |
197 | 229 | |
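The change-type filtering that `iter_change_type` performs above can be exercised without a repository. This is a minimal standalone sketch of the same dispatch over the status letters; `FakeDiff` is a hypothetical stand-in record, not a GitPython type:

```python
from dataclasses import dataclass
from typing import Iterable, Iterator, Optional

@dataclass
class FakeDiff:
    # Stand-in for git.diff.Diff with only the fields the filter inspects.
    change_type: Optional[str] = None
    new_file: bool = False
    deleted_file: bool = False

def iter_change_type(diffs: Iterable[FakeDiff], change_type: str) -> Iterator[FakeDiff]:
    """Yield diffs matching a git status letter, mirroring DiffIndex.iter_change_type."""
    if change_type not in ("A", "C", "D", "R", "M", "T"):
        raise ValueError("Invalid change type: %s" % change_type)
    for d in diffs:
        if d.change_type == change_type:
            yield d
        elif change_type == "A" and d.new_file:
            yield d
        elif change_type == "D" and d.deleted_file:
            yield d

diffs = [FakeDiff(new_file=True), FakeDiff(deleted_file=True), FakeDiff(change_type="R")]
added = list(iter_change_type(diffs, "A"))    # the new_file entry
renamed = list(iter_change_type(diffs, "R"))  # the explicit "R" entry
```

Note that "A" and "D" match on the boolean flags as well as the letter, which is why the real method checks both.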
254 | 286 | "new_file", "deleted_file", "copied_file", "raw_rename_from", |
255 | 287 | "raw_rename_to", "diff", "change_type", "score") |
256 | 288 | |
257 | def __init__(self, repo, a_rawpath, b_rawpath, a_blob_id, b_blob_id, a_mode, | |
258 | b_mode, new_file, deleted_file, copied_file, raw_rename_from, | |
259 | raw_rename_to, diff, change_type, score): | |
260 | ||
261 | self.a_mode = a_mode | |
262 | self.b_mode = b_mode | |
289 | def __init__(self, repo: 'Repo', | |
290 | a_rawpath: Optional[bytes], b_rawpath: Optional[bytes], | |
291 | a_blob_id: Union[str, bytes, None], b_blob_id: Union[str, bytes, None], | |
292 | a_mode: Union[bytes, str, None], b_mode: Union[bytes, str, None], | |
293 | new_file: bool, deleted_file: bool, copied_file: bool, | |
294 | raw_rename_from: Optional[bytes], raw_rename_to: Optional[bytes], | |
295 | diff: Union[str, bytes, None], change_type: Optional[Lit_change_type], score: Optional[int]) -> None: | |
263 | 296 | |
264 | 297 | assert a_rawpath is None or isinstance(a_rawpath, bytes) |
265 | 298 | assert b_rawpath is None or isinstance(b_rawpath, bytes) |
266 | 299 | self.a_rawpath = a_rawpath |
267 | 300 | self.b_rawpath = b_rawpath |
268 | 301 | |
269 | if self.a_mode: | |
270 | self.a_mode = mode_str_to_int(self.a_mode) | |
271 | if self.b_mode: | |
272 | self.b_mode = mode_str_to_int(self.b_mode) | |
302 | self.a_mode = mode_str_to_int(a_mode) if a_mode else None | |
303 | self.b_mode = mode_str_to_int(b_mode) if b_mode else None | |
273 | 304 | |
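The hunk above replaces in-place mutation of `a_mode`/`b_mode` with conditional expressions around `mode_str_to_int`, which turns git's octal mode string into an int. A simplified stand-in (GitPython's own helper builds the value digit by digit, but for valid octal strings the result is the same; this one-liner is an assumption, not the library code):

```python
def mode_str_to_int(mode_str: str) -> int:
    """Convert an octal git file-mode string such as "100644" into an int."""
    return int(mode_str, 8)

blob_mode = mode_str_to_int("100644")  # regular file
tree_mode = mode_str_to_int("040000")  # directory entry
```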
274 | 305 | # Determine whether this diff references a submodule, if it does then |
275 | 306 | # we need to overwrite "repo" to the corresponding submodule's repo instead |
280 | 311 | repo = submodule.module() |
281 | 312 | break |
282 | 313 | |
314 | self.a_blob: Union['IndexObject', None] | |
283 | 315 | if a_blob_id is None or a_blob_id == self.NULL_HEX_SHA: |
284 | 316 | self.a_blob = None |
285 | 317 | else: |
286 | 318 | self.a_blob = Blob(repo, hex_to_bin(a_blob_id), mode=self.a_mode, path=self.a_path) |
287 | 319 | |
320 | self.b_blob: Union['IndexObject', None] | |
288 | 321 | if b_blob_id is None or b_blob_id == self.NULL_HEX_SHA: |
289 | 322 | self.b_blob = None |
290 | 323 | else: |
291 | 324 | self.b_blob = Blob(repo, hex_to_bin(b_blob_id), mode=self.b_mode, path=self.b_path) |
292 | 325 | |
293 | self.new_file = new_file | |
294 | self.deleted_file = deleted_file | |
295 | self.copied_file = copied_file | |
326 | self.new_file: bool = new_file | |
327 | self.deleted_file: bool = deleted_file | |
328 | self.copied_file: bool = copied_file | |
296 | 329 | |
297 | 330 | # be clear and use None instead of empty strings |
298 | 331 | assert raw_rename_from is None or isinstance(raw_rename_from, bytes) |
301 | 334 | self.raw_rename_to = raw_rename_to or None |
302 | 335 | |
303 | 336 | self.diff = diff |
304 | self.change_type = change_type | |
337 | self.change_type: Union[Lit_change_type, None] = change_type | |
305 | 338 | self.score = score |
306 | 339 | |
307 | def __eq__(self, other): | |
340 | def __eq__(self, other: object) -> bool: | |
308 | 341 | for name in self.__slots__: |
309 | 342 | if getattr(self, name) != getattr(other, name): |
310 | 343 | return False |
311 | 344 | # END for each name |
312 | 345 | return True |
313 | 346 | |
314 | def __ne__(self, other): | |
347 | def __ne__(self, other: object) -> bool: | |
315 | 348 | return not (self == other) |
316 | 349 | |
317 | def __hash__(self): | |
350 | def __hash__(self) -> int: | |
318 | 351 | return hash(tuple(getattr(self, n) for n in self.__slots__)) |
319 | 352 | |
320 | def __str__(self): | |
321 | h = "%s" | |
353 | def __str__(self) -> str: | |
354 | h: str = "%s" | |
322 | 355 | if self.a_blob: |
323 | 356 | h %= self.a_blob.path |
324 | 357 | elif self.b_blob: |
325 | 358 | h %= self.b_blob.path |
326 | 359 | |
327 | msg = '' | |
360 | msg: str = '' | |
328 | 361 | line = None # temp line |
329 | 362 | line_length = 0 # line length |
330 | 363 | for b, n in zip((self.a_blob, self.b_blob), ('lhs', 'rhs')): |
353 | 386 | if self.diff: |
354 | 387 | msg += '\n---' |
355 | 388 | try: |
356 | msg += self.diff.decode(defenc) | |
389 | msg += self.diff.decode(defenc) if isinstance(self.diff, bytes) else self.diff | |
357 | 390 | except UnicodeDecodeError: |
358 | 391 | msg += 'OMITTED BINARY DATA' |
359 | 392 | # end handle encoding |
366 | 399 | # end |
367 | 400 | return res |
368 | 401 | |
369 | @property | |
370 | def a_path(self): | |
402 | @ property | |
403 | def a_path(self) -> Optional[str]: | |
371 | 404 | return self.a_rawpath.decode(defenc, 'replace') if self.a_rawpath else None |
372 | 405 | |
373 | @property | |
374 | def b_path(self): | |
406 | @ property | |
407 | def b_path(self) -> Optional[str]: | |
375 | 408 | return self.b_rawpath.decode(defenc, 'replace') if self.b_rawpath else None |
376 | 409 | |
377 | @property | |
378 | def rename_from(self): | |
410 | @ property | |
411 | def rename_from(self) -> Optional[str]: | |
379 | 412 | return self.raw_rename_from.decode(defenc, 'replace') if self.raw_rename_from else None |
380 | 413 | |
381 | @property | |
382 | def rename_to(self): | |
414 | @ property | |
415 | def rename_to(self) -> Optional[str]: | |
383 | 416 | return self.raw_rename_to.decode(defenc, 'replace') if self.raw_rename_to else None |
384 | 417 | |
385 | @property | |
386 | def renamed(self): | |
418 | @ property | |
419 | def renamed(self) -> bool: | |
387 | 420 | """:returns: True if the blob of our diff has been renamed |
388 | 421 | :note: This property is deprecated, please use ``renamed_file`` instead. |
389 | 422 | """ |
390 | 423 | return self.renamed_file |
391 | 424 | |
392 | @property | |
393 | def renamed_file(self): | |
425 | @ property | |
426 | def renamed_file(self) -> bool: | |
394 | 427 | """:returns: True if the blob of our diff has been renamed |
395 | 428 | """ |
396 | 429 | return self.rename_from != self.rename_to |
397 | 430 | |
398 | @classmethod | |
399 | def _pick_best_path(cls, path_match, rename_match, path_fallback_match): | |
431 | @ classmethod | |
432 | def _pick_best_path(cls, path_match: bytes, rename_match: bytes, path_fallback_match: bytes) -> Optional[bytes]: | |
400 | 433 | if path_match: |
401 | 434 | return decode_path(path_match) |
402 | 435 | |
408 | 441 | |
409 | 442 | return None |
410 | 443 | |
411 | @classmethod | |
412 | def _index_from_patch_format(cls, repo, proc): | |
444 | @ classmethod | |
445 | def _index_from_patch_format(cls, repo: 'Repo', proc: Union['Popen', 'Git.AutoInterrupt']) -> DiffIndex: | |
413 | 446 | """Create a new DiffIndex from the given text which must be in patch format |
414 | 447 | :param repo: is the repository we are operating on - it is required |
415 | 448 | :param stream: result of 'git diff' as a stream (supporting file protocol) |
416 | 449 | :return: git.DiffIndex """ |
417 | 450 | |
418 | 451 | ## FIXME: Here SLURPING raw, need to re-phrase header-regexes linewise. |
419 | text = [] | |
420 | handle_process_output(proc, text.append, None, finalize_process, decode_streams=False) | |
452 | text_list: List[bytes] = [] | |
453 | handle_process_output(proc, text_list.append, None, finalize_process, decode_streams=False) | |
421 | 454 | |
422 | 455 | # for now, we have to bake the stream |
423 | text = b''.join(text) | |
424 | index = DiffIndex() | |
425 | previous_header = None | |
426 | header = None | |
456 | text = b''.join(text_list) | |
457 | index: 'DiffIndex' = DiffIndex() | |
458 | previous_header: Union[Match[bytes], None] = None | |
459 | header: Union[Match[bytes], None] = None | |
460 | a_path, b_path = None, None # for mypy | |
461 | a_mode, b_mode = None, None # for mypy | |
427 | 462 | for _header in cls.re_header.finditer(text): |
428 | 463 | a_path_fallback, b_path_fallback, \ |
429 | 464 | old_mode, new_mode, \ |
463 | 498 | previous_header = _header |
464 | 499 | header = _header |
465 | 500 | # end for each header we parse |
466 | if index: | |
501 | if index and header: | |
467 | 502 | index[-1].diff = text[header.end():] |
468 | 503 | # end assign last diff |
469 | 504 | |
470 | 505 | return index |
471 | 506 | |
472 | @classmethod | |
473 | def _index_from_raw_format(cls, repo, proc): | |
507 | @ staticmethod | |
508 | def _handle_diff_line(lines_bytes: bytes, repo: 'Repo', index: DiffIndex) -> None: | |
509 | lines = lines_bytes.decode(defenc) | |
510 | ||
511 | for line in lines.split(':')[1:]: | |
512 | meta, _, path = line.partition('\x00') | |
513 | path = path.rstrip('\x00') | |
514 | a_blob_id: Optional[str] | |
515 | b_blob_id: Optional[str] | |
516 | old_mode, new_mode, a_blob_id, b_blob_id, _change_type = meta.split(None, 4) | |
517 | # Change type can be R100 | |
518 | # R: status letter | |
519 | # 100: score (in case of copy and rename) | |
520 | # assert is_change_type(_change_type[0]), f"Unexpected value for change_type received: {_change_type[0]}" | |
521 | change_type: Lit_change_type = cast(Lit_change_type, _change_type[0]) | |
522 | score_str = ''.join(_change_type[1:]) | |
523 | score = int(score_str) if score_str.isdigit() else None | |
524 | path = path.strip() | |
525 | a_path = path.encode(defenc) | |
526 | b_path = path.encode(defenc) | |
527 | deleted_file = False | |
528 | new_file = False | |
529 | copied_file = False | |
530 | rename_from = None | |
531 | rename_to = None | |
532 | ||
533 | # NOTE: We cannot infer the change type from the existence of a blob, | 
534 | # as diffs against the working tree do not have blobs yet | 
535 | if change_type == 'D': | |
536 | b_blob_id = None # Optional[str] | |
537 | deleted_file = True | |
538 | elif change_type == 'A': | |
539 | a_blob_id = None | |
540 | new_file = True | |
541 | elif change_type == 'C': | |
542 | copied_file = True | |
543 | a_path_str, b_path_str = path.split('\x00', 1) | |
544 | a_path = a_path_str.encode(defenc) | |
545 | b_path = b_path_str.encode(defenc) | |
546 | elif change_type == 'R': | |
547 | a_path_str, b_path_str = path.split('\x00', 1) | |
548 | a_path = a_path_str.encode(defenc) | |
549 | b_path = b_path_str.encode(defenc) | |
550 | rename_from, rename_to = a_path, b_path | |
551 | elif change_type == 'T': | |
552 | # Nothing to do | |
553 | pass | |
554 | # END add/remove handling | |
555 | ||
556 | diff = Diff(repo, a_path, b_path, a_blob_id, b_blob_id, old_mode, new_mode, | |
557 | new_file, deleted_file, copied_file, rename_from, rename_to, | |
558 | '', change_type, score) | |
559 | index.append(diff) | |
560 | ||
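The `R100`-style field split in `_handle_diff_line` above (status letter plus optional similarity score) can be shown in isolation. `parse_change_type` is a hypothetical helper name for this sketch:

```python
from typing import Optional, Tuple

def parse_change_type(field: str) -> Tuple[str, Optional[int]]:
    """Split a raw-diff status field like "R100" or "M" into (letter, score).

    The score is the rename/copy similarity percentage; it is absent for
    plain modifications, additions, and deletions.
    """
    letter = field[0]
    score_str = field[1:]
    return letter, int(score_str) if score_str.isdigit() else None

assert parse_change_type("R100") == ("R", 100)  # rename, 100% similar
assert parse_change_type("M") == ("M", None)    # modification, no score
```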
561 | @ classmethod | |
562 | def _index_from_raw_format(cls, repo: 'Repo', proc: 'Popen') -> 'DiffIndex': | |
474 | 563 | """Create a new DiffIndex from the given stream which must be in raw format. |
475 | 564 | :return: git.DiffIndex""" |
476 | 565 | # handles |
477 | 566 | # :100644 100644 687099101... 37c5e30c8... M .gitignore |
478 | 567 | |
479 | index = DiffIndex() | |
480 | ||
481 | def handle_diff_line(lines): | |
482 | lines = lines.decode(defenc) | |
483 | ||
484 | for line in lines.split(':')[1:]: | |
485 | meta, _, path = line.partition('\x00') | |
486 | path = path.rstrip('\x00') | |
487 | old_mode, new_mode, a_blob_id, b_blob_id, _change_type = meta.split(None, 4) | |
488 | # Change type can be R100 | |
489 | # R: status letter | |
490 | # 100: score (in case of copy and rename) | |
491 | change_type = _change_type[0] | |
492 | score_str = ''.join(_change_type[1:]) | |
493 | score = int(score_str) if score_str.isdigit() else None | |
494 | path = path.strip() | |
495 | a_path = path.encode(defenc) | |
496 | b_path = path.encode(defenc) | |
497 | deleted_file = False | |
498 | new_file = False | |
499 | copied_file = False | |
500 | rename_from = None | |
501 | rename_to = None | |
502 | ||
503 | # NOTE: We cannot conclude from the existence of a blob to change type | |
504 | # as diffs with the working do not have blobs yet | |
505 | if change_type == 'D': | |
506 | b_blob_id = None | |
507 | deleted_file = True | |
508 | elif change_type == 'A': | |
509 | a_blob_id = None | |
510 | new_file = True | |
511 | elif change_type == 'C': | |
512 | copied_file = True | |
513 | a_path, b_path = path.split('\x00', 1) | |
514 | a_path = a_path.encode(defenc) | |
515 | b_path = b_path.encode(defenc) | |
516 | elif change_type == 'R': | |
517 | a_path, b_path = path.split('\x00', 1) | |
518 | a_path = a_path.encode(defenc) | |
519 | b_path = b_path.encode(defenc) | |
520 | rename_from, rename_to = a_path, b_path | |
521 | elif change_type == 'T': | |
522 | # Nothing to do | |
523 | pass | |
524 | # END add/remove handling | |
525 | ||
526 | diff = Diff(repo, a_path, b_path, a_blob_id, b_blob_id, old_mode, new_mode, | |
527 | new_file, deleted_file, copied_file, rename_from, rename_to, | |
528 | '', change_type, score) | |
529 | index.append(diff) | |
530 | ||
531 | handle_process_output(proc, handle_diff_line, None, finalize_process, decode_streams=False) | |
568 | index: 'DiffIndex' = DiffIndex() | |
569 | handle_process_output(proc, lambda byt: cls._handle_diff_line(byt, repo, index), | |
570 | None, finalize_process, decode_streams=False) | |
532 | 571 | |
533 | 572 | return index |
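The raw format consumed by `_index_from_raw_format` above (":100644 100644 687099101... 37c5e30c8... M .gitignore") can be illustrated with a standalone parser for one NUL-separated record. This is a simplified sketch of the splitting done in `_handle_diff_line`, not the real method:

```python
from typing import Dict

def parse_raw_record(record: str) -> Dict[str, str]:
    """Parse one record of `git diff --raw -z` output (leading ':' removed).

    Layout: "<old_mode> <new_mode> <a_sha> <b_sha> <status>\x00<path>\x00"
    """
    meta, _, path = record.partition("\x00")
    old_mode, new_mode, a_sha, b_sha, status = meta.split(None, 4)
    return {
        "old_mode": old_mode,
        "new_mode": new_mode,
        "a_blob_id": a_sha,
        "b_blob_id": b_sha,
        "change_type": status[0],
        "path": path.rstrip("\x00"),
    }

rec = parse_raw_record("100644 100644 6870991 37c5e30 M\x00.gitignore\x00")
```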
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | 5 | """ Module containing all exceptions thrown throughout the git package, """ |
6 | 6 | |
7 | from gitdb.exc import BadName # NOQA @UnusedWildImport skipcq: PYL-W0401, PYL-W0614 | |
7 | 8 | from gitdb.exc import * # NOQA @UnusedWildImport skipcq: PYL-W0401, PYL-W0614 |
8 | 9 | from git.compat import safe_decode |
10 | ||
11 | # typing ---------------------------------------------------- | |
12 | ||
13 | from typing import List, Sequence, Tuple, Union, TYPE_CHECKING | |
14 | from git.types import PathLike | |
15 | ||
16 | if TYPE_CHECKING: | |
17 | from git.repo.base import Repo | |
18 | ||
19 | # ------------------------------------------------------------------ | |
9 | 20 | |
10 | 21 | |
11 | 22 | class GitError(Exception): |
36 | 47 | #: "'%s' failed%s" |
37 | 48 | _msg = "Cmd('%s') failed%s" |
38 | 49 | |
39 | def __init__(self, command, status=None, stderr=None, stdout=None): | |
50 | def __init__(self, command: Union[List[str], Tuple[str, ...], str], | |
51 | status: Union[str, int, None, Exception] = None, | |
52 | stderr: Union[bytes, str, None] = None, | |
53 | stdout: Union[bytes, str, None] = None) -> None: | |
40 | 54 | if not isinstance(command, (tuple, list)): |
41 | 55 | command = command.split() |
42 | 56 | self.command = command |
54 | 68 | self._cmd = safe_decode(command[0]) |
55 | 69 | self._cmdline = ' '.join(safe_decode(i) for i in command) |
56 | 70 | self._cause = status and " due to: %s" % status or "!" |
57 | self.stdout = stdout and "\n stdout: '%s'" % safe_decode(stdout) or '' | |
58 | self.stderr = stderr and "\n stderr: '%s'" % safe_decode(stderr) or '' | |
71 | stdout_decode = safe_decode(stdout) | |
72 | stderr_decode = safe_decode(stderr) | |
73 | self.stdout = stdout_decode and "\n stdout: '%s'" % stdout_decode or '' | |
74 | self.stderr = stderr_decode and "\n stderr: '%s'" % stderr_decode or '' | |
59 | 75 | |
60 | def __str__(self): | |
76 | def __str__(self) -> str: | |
61 | 77 | return (self._msg + "\n cmdline: %s%s%s") % ( |
62 | 78 | self._cmd, self._cause, self._cmdline, self.stdout, self.stderr) |
63 | 79 | |
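The `_msg` template and `__str__` formatting in `CommandError` above combine a per-subclass message with a shared cmdline suffix. A minimal sketch of that pattern, assuming a trimmed-down class (`CommandFailed` is a hypothetical name, not the GitPython exception):

```python
from typing import List, Optional

class CommandFailed(Exception):
    """Sketch of CommandError's message-template pattern."""
    _msg = "Cmd('%s') failed%s"  # subclasses override this, as GitCommandNotFound does

    def __init__(self, command: List[str], status: Optional[int] = None) -> None:
        self._cmd = command[0]
        self._cmdline = " ".join(command)
        # "due to: <status>" when a status is given, bare "!" otherwise
        self._cause = status and " due to: %s" % status or "!"

    def __str__(self) -> str:
        return (self._msg + "\n  cmdline: %s") % (self._cmd, self._cause, self._cmdline)

err = CommandFailed(["git", "status"], 128)
```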
65 | 81 | class GitCommandNotFound(CommandError): |
66 | 82 | """Thrown if we cannot find the `git` executable in the PATH or at the path given by |
67 | 83 | the GIT_PYTHON_GIT_EXECUTABLE environment variable""" |
68 | def __init__(self, command, cause): | |
84 | ||
85 | def __init__(self, command: Union[List[str], Tuple[str], str], cause: Union[str, Exception]) -> None: | |
69 | 86 | super(GitCommandNotFound, self).__init__(command, cause) |
70 | 87 | self._msg = "Cmd('%s') not found%s" |
71 | 88 | |
73 | 90 | class GitCommandError(CommandError): |
74 | 91 | """ Thrown if execution of the git command fails with non-zero status code. """ |
75 | 92 | |
76 | def __init__(self, command, status, stderr=None, stdout=None): | |
93 | def __init__(self, command: Union[List[str], Tuple[str, ...], str], | |
94 | status: Union[str, int, None, Exception] = None, | |
95 | stderr: Union[bytes, str, None] = None, | |
96 | stdout: Union[bytes, str, None] = None, | |
97 | ) -> None: | |
77 | 98 | super(GitCommandError, self).__init__(command, status, stderr, stdout) |
78 | 99 | |
79 | 100 | |
91 | 112 | were checked out successfully and hence match the version stored in the |
92 | 113 | index""" |
93 | 114 | |
94 | def __init__(self, message, failed_files, valid_files, failed_reasons): | |
115 | def __init__(self, message: str, failed_files: Sequence[PathLike], valid_files: Sequence[PathLike], | |
116 | failed_reasons: List[str]) -> None: | |
117 | ||
95 | 118 | Exception.__init__(self, message) |
96 | 119 | self.failed_files = failed_files |
97 | 120 | self.failed_reasons = failed_reasons |
98 | 121 | self.valid_files = valid_files |
99 | 122 | |
100 | def __str__(self): | |
123 | def __str__(self) -> str: | |
101 | 124 | return Exception.__str__(self) + ":%s" % self.failed_files |
102 | 125 | |
103 | 126 | |
115 | 138 | """Thrown if a hook exits with a non-zero exit code. It provides access to the exit code and the string returned |
116 | 139 | via standard output""" |
117 | 140 | |
118 | def __init__(self, command, status, stderr=None, stdout=None): | |
141 | def __init__(self, command: Union[List[str], Tuple[str, ...], str], | |
142 | status: Union[str, int, None, Exception], | |
143 | stderr: Union[bytes, str, None] = None, | |
144 | stdout: Union[bytes, str, None] = None) -> None: | |
145 | ||
119 | 146 | super(HookExecutionError, self).__init__(command, status, stderr, stdout) |
120 | 147 | self._msg = "Hook('%s') failed%s" |
121 | 148 | |
123 | 150 | class RepositoryDirtyError(GitError): |
124 | 151 | """Thrown whenever an operation on a repository fails as it has uncommitted changes that would be overwritten""" |
125 | 152 | |
126 | def __init__(self, repo, message): | |
153 | def __init__(self, repo: 'Repo', message: str) -> None: | |
127 | 154 | self.repo = repo |
128 | 155 | self.message = message |
129 | 156 | |
130 | def __str__(self): | |
157 | def __str__(self) -> str: | |
131 | 158 | return "Operation cannot be performed on %r: %s" % (self.repo, self.message) |
0 | 0 | """Initialize the index package""" |
1 | 1 | # flake8: noqa |
2 | from __future__ import absolute_import | |
3 | ||
4 | 2 | from .base import * |
5 | 3 | from .typ import * |
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | ||
5 | 6 | import glob |
6 | 7 | from io import BytesIO |
7 | 8 | import os |
16 | 17 | from git.exc import ( |
17 | 18 | GitCommandError, |
18 | 19 | CheckoutError, |
20 | GitError, | |
19 | 21 | InvalidGitRepositoryError |
20 | 22 | ) |
21 | 23 | from git.objects import ( |
38 | 40 | from gitdb.base import IStream |
39 | 41 | from gitdb.db import MemoryDB |
40 | 42 | |
41 | import git.diff as diff | |
43 | import git.diff as git_diff | |
42 | 44 | import os.path as osp |
43 | 45 | |
44 | 46 | from .fun import ( |
62 | 64 | git_working_dir |
63 | 65 | ) |
64 | 66 | |
67 | # typing ----------------------------------------------------------------------------- | |
68 | ||
69 | from typing import (Any, BinaryIO, Callable, Dict, IO, Iterable, Iterator, List, NoReturn, | |
70 | Sequence, TYPE_CHECKING, Tuple, Type, Union) | |
71 | ||
72 | from git.types import Commit_ish, PathLike | |
73 | ||
74 | if TYPE_CHECKING: | |
75 | from subprocess import Popen | |
76 | from git.repo import Repo | |
77 | from git.refs.reference import Reference | |
78 | from git.util import Actor | |
79 | ||
80 | ||
81 | StageType = int | |
82 | Treeish = Union[Tree, Commit, str, bytes] | |
83 | ||
84 | # ------------------------------------------------------------------------------------ | |
85 | ||
65 | 86 | |
66 | 87 | __all__ = ('IndexFile', 'CheckoutError') |
67 | 88 | |
68 | 89 | |
69 | class IndexFile(LazyMixin, diff.Diffable, Serializable): | |
90 | class IndexFile(LazyMixin, git_diff.Diffable, Serializable): | |
70 | 91 | |
71 | 92 | """ |
72 | 93 | Implements an Index that can be manipulated using a native implementation in |
92 | 113 | _VERSION = 2 # latest version we support |
93 | 114 | S_IFGITLINK = S_IFGITLINK # a submodule |
94 | 115 | |
95 | def __init__(self, repo, file_path=None): | |
116 | def __init__(self, repo: 'Repo', file_path: Union[PathLike, None] = None) -> None: | |
96 | 117 | """Initialize this Index instance, optionally from the given ``file_path``. |
97 | 118 | If no file_path is given, we will be created from the current index file. |
98 | 119 | |
101 | 122 | self.repo = repo |
102 | 123 | self.version = self._VERSION |
103 | 124 | self._extension_data = b'' |
104 | self._file_path = file_path or self._index_path() | |
105 | ||
106 | def _set_cache_(self, attr): | |
125 | self._file_path: PathLike = file_path or self._index_path() | |
126 | ||
127 | def _set_cache_(self, attr: str) -> None: | |
107 | 128 | if attr == "entries": |
108 | 129 | # read the current index |
109 | 130 | # try memory map for speed |
114 | 135 | ok = True |
115 | 136 | except OSError: |
116 | 137 | # in new repositories, there may be no index, which means we are empty |
117 | self.entries = {} | |
118 | return | |
138 | self.entries: Dict[Tuple[PathLike, StageType], IndexEntry] = {} | |
139 | return None | |
119 | 140 | finally: |
120 | 141 | if not ok: |
121 | 142 | lfd.rollback() |
132 | 153 | else: |
133 | 154 | super(IndexFile, self)._set_cache_(attr) |
134 | 155 | |
135 | def _index_path(self): | |
136 | return join_path_native(self.repo.git_dir, "index") | |
156 | def _index_path(self) -> PathLike: | |
157 | if self.repo.git_dir: | |
158 | return join_path_native(self.repo.git_dir, "index") | |
159 | else: | |
160 | raise GitCommandError("No git directory given to join index path") | |
137 | 161 | |
138 | 162 | @property |
139 | def path(self): | |
163 | def path(self) -> PathLike: | |
140 | 164 | """ :return: Path to the index file we are representing """ |
141 | 165 | return self._file_path |
142 | 166 | |
143 | def _delete_entries_cache(self): | |
167 | def _delete_entries_cache(self) -> None: | |
144 | 168 | """Safely clear the entries cache so it can be recreated""" |
145 | 169 | try: |
146 | 170 | del(self.entries) |
151 | 175 | |
152 | 176 | #{ Serializable Interface |
153 | 177 | |
154 | def _deserialize(self, stream): | |
178 | def _deserialize(self, stream: IO) -> 'IndexFile': | |
155 | 179 | """Initialize this instance with index values read from the given stream""" |
156 | 180 | self.version, self.entries, self._extension_data, _conten_sha = read_cache(stream) |
157 | 181 | return self |
158 | 182 | |
159 | def _entries_sorted(self): | |
183 | def _entries_sorted(self) -> List[IndexEntry]: | |
160 | 184 | """:return: list of entries, in a sorted fashion, first by path, then by stage""" |
161 | 185 | return sorted(self.entries.values(), key=lambda e: (e.path, e.stage)) |
162 | 186 | |
163 | def _serialize(self, stream, ignore_extension_data=False): | |
187 | def _serialize(self, stream: IO, ignore_extension_data: bool = False) -> 'IndexFile': | |
164 | 188 | entries = self._entries_sorted() |
165 | extension_data = self._extension_data | |
189 | extension_data = self._extension_data # type: Union[None, bytes] | |
166 | 190 | if ignore_extension_data: |
167 | 191 | extension_data = None |
168 | 192 | write_cache(entries, stream, extension_data) |
170 | 194 | |
171 | 195 | #} END serializable interface |
172 | 196 | |
173 | def write(self, file_path=None, ignore_extension_data=False): | |
197 | def write(self, file_path: Union[None, PathLike] = None, ignore_extension_data: bool = False) -> None: | |
174 | 198 | """Write the current state to our file path or to the given one |
175 | 199 | |
176 | 200 | :param file_path: |
190 | 214 | Alternatively, use IndexFile.write_tree() to handle this case |
191 | 215 | automatically |
192 | 216 | |
193 | :return: self""" | |
217 | :return: None""" | 
194 | 218 | # make sure we have our entries read before getting a write lock |
195 | 219 | # else it would be done when streaming. This can happen |
196 | 220 | # if one doesn't change the index, but writes it right away |
214 | 238 | |
215 | 239 | @post_clear_cache |
216 | 240 | @default_index |
217 | def merge_tree(self, rhs, base=None): | |
241 | def merge_tree(self, rhs: Treeish, base: Union[None, Treeish] = None) -> 'IndexFile': | |
218 | 242 | """Merge the given rhs treeish into the current index, possibly taking |
219 | 243 | a common base treeish into account. |
220 | 244 | |
241 | 265 | # -i : ignore working tree status |
242 | 266 | # --aggressive : handle more merge cases |
243 | 267 | # -m : do an actual merge |
244 | args = ["--aggressive", "-i", "-m"] | |
268 | args: List[Union[Treeish, str]] = ["--aggressive", "-i", "-m"] | |
245 | 269 | if base is not None: |
246 | 270 | args.append(base) |
247 | 271 | args.append(rhs) |
250 | 274 | return self |
251 | 275 | |
252 | 276 | @classmethod |
253 | def new(cls, repo, *tree_sha): | |
277 | def new(cls, repo: 'Repo', *tree_sha: Union[str, Tree]) -> 'IndexFile': | |
254 | 278 | """ Merge the given treeish revisions into a new index which is returned. |
255 | 279 | This method behaves like git-read-tree --aggressive when doing the merge. |
256 | 280 | |
263 | 287 | New IndexFile instance. Its path will be undefined. |
264 | 288 | If you intend to write such a merged Index, supply an alternate file_path |
265 | 289 | to its 'write' method.""" |
266 | base_entries = aggressive_tree_merge(repo.odb, [to_bin_sha(str(t)) for t in tree_sha]) | |
290 | tree_sha_bytes: List[bytes] = [to_bin_sha(str(t)) for t in tree_sha] | |
291 | base_entries = aggressive_tree_merge(repo.odb, tree_sha_bytes) | |
267 | 292 | |
268 | 293 | inst = cls(repo) |
269 | 294 | # convert to entries dict |
270 | entries = dict(zip(((e.path, e.stage) for e in base_entries), | |
271 | (IndexEntry.from_base(e) for e in base_entries))) | |
295 | entries: Dict[Tuple[PathLike, int], IndexEntry] = dict(zip( | |
296 | ((e.path, e.stage) for e in base_entries), | |
297 | (IndexEntry.from_base(e) for e in base_entries))) | |
272 | 298 | |
273 | 299 | inst.entries = entries |
274 | 300 | return inst |
275 | 301 | |
276 | 302 | @classmethod |
277 | def from_tree(cls, repo, *treeish, **kwargs): | |
303 | def from_tree(cls, repo: 'Repo', *treeish: Treeish, **kwargs: Any) -> 'IndexFile': | |
278 | 304 | """Merge the given treeish revisions into a new index which is returned. |
279 | 305 | The original index will remain unaltered |
280 | 306 | |
311 | 337 | if len(treeish) == 0 or len(treeish) > 3: |
312 | 338 | raise ValueError("Please specify between 1 and 3 treeish, got %i" % len(treeish)) |
313 | 339 | |
314 | arg_list = [] | |
340 | arg_list: List[Union[Treeish, str]] = [] | |
315 | 341 | # ignore that working tree and index possibly are out of date |
316 | 342 | if len(treeish) > 1: |
317 | 343 | # drop unmerged entries when reading our index and merging |
330 | 356 | # as it considers existing entries. moving it essentially clears the index. |
331 | 357 | # Unfortunately there is no 'soft' way to do it. |
332 | 358 | # The TemporaryFileSwap assure the original file get put back |
333 | index_handler = TemporaryFileSwap(join_path_native(repo.git_dir, 'index')) | |
359 | if repo.git_dir: | |
360 | index_handler = TemporaryFileSwap(join_path_native(repo.git_dir, 'index')) | |
334 | 361 | try: |
335 | 362 | repo.git.read_tree(*arg_list, **kwargs) |
336 | 363 | index = cls(repo, tmp_index) |
345 | 372 | |
346 | 373 | # UTILITIES |
347 | 374 | @unbare_repo |
348 | def _iter_expand_paths(self, paths): | |
375 | def _iter_expand_paths(self: 'IndexFile', paths: Sequence[PathLike]) -> Iterator[PathLike]: | |
349 | 376 | """Expand the directories in list of paths to the corresponding paths accordingly, |
350 | 377 | |
351 | 378 | Note: git will add items multiple times even if a glob overlapped |
352 | 379 | with manually specified paths or if paths where specified multiple |
353 | 380 | times - we respect that and do not prune""" |
354 | def raise_exc(e): | |
381 | def raise_exc(e: Exception) -> NoReturn: | |
355 | 382 | raise e |
356 | r = self.repo.working_tree_dir | |
383 | r = str(self.repo.working_tree_dir) | |
357 | 384 | rs = r + os.sep |
358 | 385 | for path in paths: |
359 | abs_path = path | |
386 | abs_path = str(path) | |
360 | 387 | if not osp.isabs(abs_path): |
361 | 388 | abs_path = osp.join(r, path) |
362 | 389 | # END make absolute path |
373 | 400 | # end check symlink |
374 | 401 | |
375 | 402 | # if the path is not already pointing to an existing file, resolve globs if possible |
376 | if not os.path.exists(path) and ('?' in path or '*' in path or '[' in path): | |
403 | if not os.path.exists(abs_path) and ('?' in abs_path or '*' in abs_path or '[' in abs_path): | |
377 | 404 | resolved_paths = glob.glob(abs_path) |
378 | 405 | # not abs_path in resolved_paths: |
379 | 406 | # a glob() resolving to the same path we are feeding it with |
383 | 410 | # whose name contains wildcard characters. |
384 | 411 | if abs_path not in resolved_paths: |
385 | 412 | for f in self._iter_expand_paths(glob.glob(abs_path)): |
386 | yield f.replace(rs, '') | |
413 | yield str(f).replace(rs, '') | |
387 | 414 | continue |
388 | 415 | # END glob handling |
389 | 416 | try: |
395 | 422 | # END for each subdirectory |
396 | 423 | except OSError: |
397 | 424 | # was a file or something that could not be iterated |
398 | yield path.replace(rs, '') | |
425 | yield abs_path.replace(rs, '') | |
399 | 426 | # END path exception handling |
400 | 427 | # END for each path |
401 | 428 | |
402 | def _write_path_to_stdin(self, proc, filepath, item, fmakeexc, fprogress, | |
403 | read_from_stdout=True): | |
429 | def _write_path_to_stdin(self, proc: 'Popen', filepath: PathLike, item: PathLike, fmakeexc: Callable[..., GitError], | |
430 | fprogress: Callable[[PathLike, bool, PathLike], None], | |
431 | read_from_stdout: bool = True) -> Union[None, str]: | |
404 | 432 | """Write path to proc.stdin and make sure it processes the item, including progress. |
405 | 433 | |
406 | 434 | :return: stdout string |
416 | 444 | we will close stdin to break the pipe.""" |
417 | 445 | |
418 | 446 | fprogress(filepath, False, item) |
419 | rval = None | |
420 | try: | |
421 | proc.stdin.write(("%s\n" % filepath).encode(defenc)) | |
422 | except IOError as e: | |
423 | # pipe broke, usually because some error happened | |
424 | raise fmakeexc() from e | |
425 | # END write exception handling | |
426 | proc.stdin.flush() | |
427 | if read_from_stdout: | |
447 | rval: Union[None, str] = None | |
448 | ||
449 | if proc.stdin is not None: | |
450 | try: | |
451 | proc.stdin.write(("%s\n" % filepath).encode(defenc)) | |
452 | except IOError as e: | |
453 | # pipe broke, usually because some error happened | |
454 | raise fmakeexc() from e | |
455 | # END write exception handling | |
456 | proc.stdin.flush() | |
457 | ||
458 | if read_from_stdout and proc.stdout is not None: | |
428 | 459 | rval = proc.stdout.readline().strip() |
429 | 460 | fprogress(filepath, True, item) |
430 | 461 | return rval |
431 | 462 | |
432 | def iter_blobs(self, predicate=lambda t: True): | |
463 | def iter_blobs(self, predicate: Callable[[Tuple[StageType, Blob]], bool] = lambda t: True | |
464 | ) -> Iterator[Tuple[StageType, Blob]]: | |
433 | 465 | """ |
434 | 466 | :return: Iterator yielding tuples of Blob objects and stages, tuple(stage, Blob) |
435 | 467 | |
445 | 477 | yield output |
446 | 478 | # END for each entry |
447 | 479 | |
448 | def unmerged_blobs(self): | |
480 | def unmerged_blobs(self) -> Dict[PathLike, List[Tuple[StageType, Blob]]]: | |
449 | 481 | """ |
450 | 482 | :return: |
451 | Iterator yielding dict(path : list( tuple( stage, Blob, ...))), being | |
483 | Dict(path : list( tuple( stage, Blob, ...))), being | |
452 | 484 | a dictionary associating a path in the index with a list containing |
453 | 485 | sorted stage/blob pairs |
486 | ||
454 | 487 | |
455 | 488 | :note: |
456 | 489 | Blobs that have been removed in one side simply do not exist in the |
458 | 491 | are at stage 3 will not have a stage 3 entry. |
459 | 492 | """ |
460 | 493 | is_unmerged_blob = lambda t: t[0] != 0 |
461 | path_map = {} | |
494 | path_map: Dict[PathLike, List[Tuple[StageType, Blob]]] = {} | |
462 | 495 | for stage, blob in self.iter_blobs(is_unmerged_blob): |
463 | 496 | path_map.setdefault(blob.path, []).append((stage, blob)) |
464 | 497 | # END for each unmerged blob |
465 | 498 | for line in path_map.values(): |
466 | 499 | line.sort() |
500 | ||
467 | 501 | return path_map |
468 | 502 | |
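`unmerged_blobs` groups the output of `iter_blobs` by path and sorts each stage list in place. The same grouping, sketched with plain tuples standing in for `Blob` objects (paths and stages here are made up for illustration):

```python
# (stage, path) pairs as iter_blobs(is_unmerged_blob) might yield them;
# plain tuples stand in for real Blob objects.
blobs = [
    (1, "conflicted.txt"),  # stage 1: common ancestor
    (3, "conflicted.txt"),  # stage 3: theirs
    (2, "conflicted.txt"),  # stage 2: ours
]

path_map = {}
for stage, path in blobs:
    path_map.setdefault(path, []).append(stage)
for stages in path_map.values():
    stages.sort()  # same in-place sort as in the hunk above
```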
469 | @classmethod | |
470 | def entry_key(cls, *entry): | |
503 | @classmethod |
504 | def entry_key(cls, *entry: Union[BaseIndexEntry, PathLike, StageType]) -> Tuple[PathLike, StageType]: | |
471 | 505 | return entry_key(*entry) |
472 | 506 | |
473 | def resolve_blobs(self, iter_blobs): | |
507 | def resolve_blobs(self, iter_blobs: Iterator[Blob]) -> 'IndexFile': | |
474 | 508 | """Resolve the blobs given in blob iterator. This will effectively remove the |
475 | 509 | index entries of the respective path at all non-null stages and add the given |
476 | 510 | blob as new stage null blob. |
488 | 522 | for blob in iter_blobs: |
489 | 523 | stage_null_key = (blob.path, 0) |
490 | 524 | if stage_null_key in self.entries: |
491 | raise ValueError("Path %r already exists at stage 0" % blob.path) | |
525 | raise ValueError("Path %r already exists at stage 0" % str(blob.path)) | |
492 | 526 | # END assert blob is not stage 0 already |
493 | 527 | |
494 | 528 | # delete all possible stages |
505 | 539 | |
506 | 540 | return self |
507 | 541 | |
508 | def update(self): | |
542 | def update(self) -> 'IndexFile': | |
509 | 543 | """Reread the contents of our index file, discarding all cached information |
510 | 544 | we might have. |
511 | 545 | |
516 | 550 | # allows to lazily reread on demand |
517 | 551 | return self |
518 | 552 | |
519 | def write_tree(self): | |
553 | def write_tree(self) -> Tree: | |
520 | 554 | """Writes this index to a corresponding Tree object into the repository's |
521 | 555 | object database and return it. |
522 | 556 | |
541 | 575 | root_tree._cache = tree_items |
542 | 576 | return root_tree |
543 | 577 | |
544 | def _process_diff_args(self, args): | |
578 | def _process_diff_args(self, # type: ignore[override] | |
579 | args: List[Union[str, 'git_diff.Diffable', Type['git_diff.Diffable.Index']]] | |
580 | ) -> List[Union[str, 'git_diff.Diffable', Type['git_diff.Diffable.Index']]]: | |
545 | 581 | try: |
546 | 582 | args.pop(args.index(self)) |
547 | 583 | except IndexError: |
549 | 585 | # END remove self |
550 | 586 | return args |
551 | 587 | |
552 | def _to_relative_path(self, path): | |
553 | """:return: Version of path relative to our git directory or raise ValueError | |
554 | if it is not within our git direcotory""" | |
588 | def _to_relative_path(self, path: PathLike) -> PathLike: | |
589 | """ | |
590 | :return: Version of path relative to our git directory or raise ValueError | |
591 | if it is not within our git directory""" |
555 | 592 | if not osp.isabs(path): |
556 | 593 | return path |
557 | 594 | if self.repo.bare: |
558 | 595 | raise InvalidGitRepositoryError("require non-bare repository") |
559 | if not path.startswith(self.repo.working_tree_dir): | |
596 | if not str(path).startswith(str(self.repo.working_tree_dir)): | |
560 | 597 | raise ValueError("Absolute path %r is not in git repository at %r" % (path, self.repo.working_tree_dir)) |
561 | 598 | return os.path.relpath(path, self.repo.working_tree_dir) |
562 | 599 | |
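`_to_relative_path` passes relative paths through untouched and rewrites absolute paths against the working tree, which is why the annotated version coerces `PathLike` values to `str` before calling `startswith`. A standalone sketch, using the current directory as a stand-in for `repo.working_tree_dir`:

```python
import os
import os.path as osp

working_tree_dir = os.getcwd()  # stand-in for repo.working_tree_dir


def to_relative_path(path):
    # Relative paths pass through; absolute paths must lie under the
    # working tree and are rewritten relative to it.
    if not osp.isabs(path):
        return path
    if not str(path).startswith(str(working_tree_dir)):
        raise ValueError("Absolute path %r is not in the repository" % path)
    return os.path.relpath(path, working_tree_dir)


rel = to_relative_path(osp.join(working_tree_dir, "src", "mod.py"))
```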
563 | def _preprocess_add_items(self, items): | |
600 | def _preprocess_add_items(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']] | |
601 | ) -> Tuple[List[PathLike], List[BaseIndexEntry]]: | |
564 | 602 | """ Split the items into two lists of path strings and BaseEntries. """ |
565 | 603 | paths = [] |
566 | 604 | entries = [] |
580 | 618 | # END for each item |
581 | 619 | return paths, entries |
582 | 620 | |
583 | def _store_path(self, filepath, fprogress): | |
621 | def _store_path(self, filepath: PathLike, fprogress: Callable) -> BaseIndexEntry: | |
584 | 622 | """Store file at filepath in the database and return the base index entry |
585 | 623 | Needs the git_working_dir decorator active ! This must be assured in the calling code""" |
586 | 624 | st = os.lstat(filepath) # handles non-symlinks as well |
587 | 625 | if S_ISLNK(st.st_mode): |
588 | 626 | # in PY3, readlink is string, but we need bytes. In PY2, it's just OS encoded bytes, we assume UTF-8 |
589 | open_stream = lambda: BytesIO(force_bytes(os.readlink(filepath), encoding=defenc)) | |
627 | open_stream: Callable[[], BinaryIO] = lambda: BytesIO(force_bytes(os.readlink(filepath), | |
628 | encoding=defenc)) | |
590 | 629 | else: |
591 | 630 | open_stream = lambda: open(filepath, 'rb') |
592 | 631 | with open_stream() as stream: |
596 | 635 | return BaseIndexEntry((stat_mode_to_index_mode(st.st_mode), |
597 | 636 | istream.binsha, 0, to_native_path_linux(filepath))) |
598 | 637 | |
599 | @unbare_repo | |
600 | @git_working_dir | |
601 | def _entries_for_paths(self, paths, path_rewriter, fprogress, entries): | |
602 | entries_added = [] | |
638 | @unbare_repo |
639 | @git_working_dir |
640 | def _entries_for_paths(self, paths: List[str], path_rewriter: Callable, fprogress: Callable, | |
641 | entries: List[BaseIndexEntry]) -> List[BaseIndexEntry]: | |
642 | entries_added: List[BaseIndexEntry] = [] | |
603 | 643 | if path_rewriter: |
604 | 644 | for path in paths: |
605 | 645 | if osp.isabs(path): |
606 | 646 | abspath = path |
607 | gitrelative_path = path[len(self.repo.working_tree_dir) + 1:] | |
647 | gitrelative_path = path[len(str(self.repo.working_tree_dir)) + 1:] | |
608 | 648 | else: |
609 | 649 | gitrelative_path = path |
610 | abspath = osp.join(self.repo.working_tree_dir, gitrelative_path) | |
650 | if self.repo.working_tree_dir: | |
651 | abspath = osp.join(self.repo.working_tree_dir, gitrelative_path) | |
611 | 652 | # end obtain relative and absolute paths |
612 | 653 | |
613 | 654 | blob = Blob(self.repo, Blob.NULL_BIN_SHA, |
627 | 668 | # END path handling |
628 | 669 | return entries_added |
629 | 670 | |
630 | def add(self, items, force=True, fprogress=lambda *args: None, path_rewriter=None, | |
631 | write=True, write_extension_data=False): | |
671 | def add(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']], force: bool = True, | |
672 | fprogress: Callable = lambda *args: None, path_rewriter: Union[Callable[..., PathLike], None] = None, | |
673 | write: bool = True, write_extension_data: bool = False) -> List[BaseIndexEntry]: | |
632 | 674 | """Add files from the working tree, specific blobs or BaseIndexEntries |
633 | 675 | to the index. |
634 | 676 | |
731 | 773 | # automatically |
732 | 774 | # paths can be git-added, for everything else we use git-update-index |
733 | 775 | paths, entries = self._preprocess_add_items(items) |
734 | entries_added = [] | |
776 | entries_added: List[BaseIndexEntry] = [] | |
735 | 777 | # This code needs a working tree, therefore we try not to run it unless required. |
736 | 778 | # That way, we are OK on a bare repository as well. |
737 | 779 | # If there are no paths, the rewriter has nothing to do either |
750 | 792 | # create objects if required, otherwise go with the existing shas |
751 | 793 | null_entries_indices = [i for i, e in enumerate(entries) if e.binsha == Object.NULL_BIN_SHA] |
752 | 794 | if null_entries_indices: |
753 | @git_working_dir | |
754 | def handle_null_entries(self): | |
795 | @git_working_dir |
796 | def handle_null_entries(self: 'IndexFile') -> None: | |
755 | 797 | for ei in null_entries_indices: |
756 | 798 | null_entry = entries[ei] |
757 | 799 | new_entry = self._store_path(null_entry.path, fprogress) |
795 | 837 | |
796 | 838 | return entries_added |
797 | 839 | |
798 | def _items_to_rela_paths(self, items): | |
840 | def _items_to_rela_paths(self, items: Union[PathLike, Sequence[Union[PathLike, BaseIndexEntry, Blob, Submodule]]] | |
841 | ) -> List[PathLike]: | |
799 | 842 | """Returns a list of repo-relative paths from the given items which |
800 | 843 | may be absolute or relative paths, entries or blobs""" |
801 | 844 | paths = [] |
802 | 845 | # if string put in list |
803 | if isinstance(items, str): | |
846 | if isinstance(items, (str, os.PathLike)): | |
804 | 847 | items = [items] |
805 | 848 | |
806 | 849 | for item in items: |
813 | 856 | # END for each item |
814 | 857 | return paths |
815 | 858 | |
816 | @post_clear_cache | |
817 | @default_index | |
818 | def remove(self, items, working_tree=False, **kwargs): | |
859 | @post_clear_cache |
860 | @default_index |
861 | def remove(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']], working_tree: bool = False, | |
862 | **kwargs: Any) -> List[str]: | |
819 | 863 | """Remove the given items from the index and optionally from |
820 | 864 | the working tree as well. |
821 | 865 | |
864 | 908 | # rm 'path' |
865 | 909 | return [p[4:-1] for p in removed_paths] |
866 | 910 | |
867 | @post_clear_cache | |
868 | @default_index | |
869 | def move(self, items, skip_errors=False, **kwargs): | |
911 | @post_clear_cache |
912 | @default_index |
913 | def move(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']], skip_errors: bool = False, | |
914 | **kwargs: Any) -> List[Tuple[str, str]]: | |
870 | 915 | """Rename/move the items, whereas the last item is considered the destination of |
871 | 916 | the move operation. If the destination is a file, the first item ( of two ) |
872 | 917 | must be a file as well. If the destination is a directory, it may be preceded |
928 | 973 | |
929 | 974 | return out |
930 | 975 | |
931 | def commit(self, message, parent_commits=None, head=True, author=None, | |
932 | committer=None, author_date=None, commit_date=None, | |
933 | skip_hooks=False): | |
976 | def commit(self, | |
977 | message: str, | |
978 | parent_commits: Union[Commit_ish, None] = None, | |
979 | head: bool = True, | |
980 | author: Union[None, 'Actor'] = None, | |
981 | committer: Union[None, 'Actor'] = None, | |
982 | author_date: Union[str, None] = None, | |
983 | commit_date: Union[str, None] = None, | |
984 | skip_hooks: bool = False) -> Commit: | |
934 | 985 | """Commit the current default index file, creating a commit object. |
935 | 986 | For more information on the arguments, see tree.commit. |
936 | 987 | |
954 | 1005 | run_commit_hook('post-commit', self) |
955 | 1006 | return rval |
956 | 1007 | |
957 | def _write_commit_editmsg(self, message): | |
1008 | def _write_commit_editmsg(self, message: str) -> None: | |
958 | 1009 | with open(self._commit_editmsg_filepath(), "wb") as commit_editmsg_file: |
959 | 1010 | commit_editmsg_file.write(message.encode(defenc)) |
960 | 1011 | |
961 | def _remove_commit_editmsg(self): | |
1012 | def _remove_commit_editmsg(self) -> None: | |
962 | 1013 | os.remove(self._commit_editmsg_filepath()) |
963 | 1014 | |
964 | def _read_commit_editmsg(self): | |
1015 | def _read_commit_editmsg(self) -> str: | |
965 | 1016 | with open(self._commit_editmsg_filepath(), "rb") as commit_editmsg_file: |
966 | 1017 | return commit_editmsg_file.read().decode(defenc) |
967 | 1018 | |
968 | def _commit_editmsg_filepath(self): | |
1019 | def _commit_editmsg_filepath(self) -> str: | |
969 | 1020 | return osp.join(self.repo.common_dir, "COMMIT_EDITMSG") |
970 | 1021 | |
971 | @classmethod | |
972 | def _flush_stdin_and_wait(cls, proc, ignore_stdout=False): | |
973 | proc.stdin.flush() | |
974 | proc.stdin.close() | |
975 | stdout = '' | |
976 | if not ignore_stdout: | |
1022 | def _flush_stdin_and_wait(cls, proc: 'Popen[bytes]', ignore_stdout: bool = False) -> bytes: | |
1023 | stdin_IO = proc.stdin | |
1024 | if stdin_IO: | |
1025 | stdin_IO.flush() | |
1026 | stdin_IO.close() | |
1027 | ||
1028 | stdout = b'' | |
1029 | if not ignore_stdout and proc.stdout: | |
977 | 1030 | stdout = proc.stdout.read() |
978 | proc.stdout.close() | |
979 | proc.wait() | |
1031 | ||
1032 | if proc.stdout: | |
1033 | proc.stdout.close() | |
1034 | proc.wait() | |
980 | 1035 | return stdout |
981 | 1036 | |
982 | @default_index | |
983 | def checkout(self, paths=None, force=False, fprogress=lambda *args: None, **kwargs): | |
1037 | @default_index |
1038 | def checkout(self, paths: Union[None, Iterable[PathLike]] = None, force: bool = False, | |
1039 | fprogress: Callable = lambda *args: None, **kwargs: Any | |
1040 | ) -> Union[None, Iterator[PathLike], Sequence[PathLike]]: | |
984 | 1041 | """Checkout the given paths or all files from the version known to the index into |
985 | 1042 | the working tree. |
986 | 1043 | |
1031 | 1088 | failed_reasons = [] |
1032 | 1089 | unknown_lines = [] |
1033 | 1090 | |
1034 | def handle_stderr(proc, iter_checked_out_files): | |
1035 | stderr = proc.stderr.read() | |
1036 | if not stderr: | |
1037 | return | |
1091 | def handle_stderr(proc: 'Popen[bytes]', iter_checked_out_files: Iterable[PathLike]) -> None: | |
1092 | ||
1093 | stderr_IO = proc.stderr | |
1094 | if not stderr_IO: | |
1095 | return None # return early when there is no stderr stream |
1096 | else: | |
1097 | stderr_bytes = stderr_IO.read() | |
1038 | 1098 | # line contents: |
1039 | stderr = stderr.decode(defenc) | |
1099 | stderr = stderr_bytes.decode(defenc) | |
1040 | 1100 | # git-checkout-index: this already exists |
1041 | 1101 | endings = (' already exists', ' is not in the cache', ' does not exist at stage', ' is unmerged') |
1042 | 1102 | for line in stderr.splitlines(): |
1100 | 1160 | proc = self.repo.git.checkout_index(args, **kwargs) |
1101 | 1161 | # FIXME: Reading from GIL! |
1102 | 1162 | make_exc = lambda: GitCommandError(("git-checkout-index",) + tuple(args), 128, proc.stderr.read()) |
1103 | checked_out_files = [] | |
1163 | checked_out_files: List[PathLike] = [] | |
1104 | 1164 | |
1105 | 1165 | for path in paths: |
1106 | 1166 | co_path = to_native_path_linux(self._to_relative_path(path)) |
1110 | 1170 | try: |
1111 | 1171 | self.entries[(co_path, 0)] |
1112 | 1172 | except KeyError: |
1113 | folder = co_path | |
1173 | folder = str(co_path) | |
1114 | 1174 | if not folder.endswith('/'): |
1115 | 1175 | folder += '/' |
1116 | 1176 | for entry in self.entries.values(): |
1117 | if entry.path.startswith(folder): | |
1177 | if str(entry.path).startswith(folder): | |
1118 | 1178 | p = entry.path |
1119 | 1179 | self._write_path_to_stdin(proc, p, p, make_exc, |
1120 | 1180 | fprogress, read_from_stdout=False) |
1141 | 1201 | handle_stderr(proc, checked_out_files) |
1142 | 1202 | return checked_out_files |
1143 | 1203 | # END paths handling |
1144 | assert "Should not reach this point" | |
1145 | ||
1146 | @default_index | |
1147 | def reset(self, commit='HEAD', working_tree=False, paths=None, head=False, **kwargs): | |
1204 | ||
1205 | @default_index |
1206 | def reset(self, commit: Union[Commit, 'Reference', str] = 'HEAD', working_tree: bool = False, | |
1207 | paths: Union[None, Iterable[PathLike]] = None, | |
1208 | head: bool = False, **kwargs: Any) -> 'IndexFile': | |
1148 | 1209 | """Reset the index to reflect the tree at the given commit. This will not |
1149 | 1210 | adjust our HEAD reference as opposed to HEAD.reset by default. |
1150 | 1211 | |
1211 | 1272 | |
1212 | 1273 | return self |
1213 | 1274 | |
1214 | @default_index | |
1215 | def diff(self, other=diff.Diffable.Index, paths=None, create_patch=False, **kwargs): | |
1275 | # @ default_index, breaks typing for some reason, copied into function | |
1276 | def diff(self, # type: ignore[override] | |
1277 | other: Union[Type['git_diff.Diffable.Index'], 'Tree', 'Commit', str, None] = git_diff.Diffable.Index, | |
1278 | paths: Union[PathLike, List[PathLike], Tuple[PathLike, ...], None] = None, | |
1279 | create_patch: bool = False, **kwargs: Any | |
1280 | ) -> git_diff.DiffIndex: | |
1216 | 1281 | """Diff this index against the working copy or a Tree or Commit object |
1217 | 1282 | |
1218 | For a documentation of the parameters and return values, see | |
1283 | For a documentation of the parameters and return values, see |
1219 | 1284 | Diffable.diff |
1220 | 1285 | |
1221 | 1286 | :note: |
1222 | 1287 | Will only work with indices that represent the default git index as |
1223 | 1288 | they have not been initialized with a stream. |
1224 | 1289 | """ |
1290 | ||
1291 | # only run if we are the default repository index | |
1292 | if self._file_path != self._index_path(): | |
1293 | raise AssertionError( | |
1294 | "Cannot call %r on indices that do not represent the default git index" % self.diff) |
1225 | 1295 | # index against index is always empty |
1226 | 1296 | if other is self.Index: |
1227 | return diff.DiffIndex() | |
1297 | return git_diff.DiffIndex() | |
1228 | 1298 | |
1229 | 1299 | # index against anything but None is a reverse diff with the respective |
1230 | 1300 | # item. Handle existing -R flags properly. Transform strings to the object |
1233 | 1303 | other = self.repo.rev_parse(other) |
1234 | 1304 | # END object conversion |
1235 | 1305 | |
1236 | if isinstance(other, Object): | |
1306 | if isinstance(other, Object): # for Tree or Commit | |
1237 | 1307 | # invert the existing R flag |
1238 | 1308 | cur_val = kwargs.get('R', False) |
1239 | 1309 | kwargs['R'] = not cur_val |
0 | 0 | # Contains standalone functions to accompany the index implementation and make it |
1 | 1 | # more versatile |
2 | 2 | # NOTE: Autodoc hates it if this is a docstring |
3 | ||
3 | 4 | from io import BytesIO |
4 | 5 | import os |
5 | 6 | from stat import ( |
9 | 10 | S_ISDIR, |
10 | 11 | S_IFMT, |
11 | 12 | S_IFREG, |
13 | S_IXUSR, | |
12 | 14 | ) |
13 | 15 | import subprocess |
14 | 16 | |
46 | 48 | unpack |
47 | 49 | ) |
48 | 50 | |
51 | # typing ----------------------------------------------------------------------------- | |
52 | ||
53 | from typing import (Dict, IO, List, Sequence, TYPE_CHECKING, Tuple, Type, Union, cast) | |
54 | ||
55 | from git.types import PathLike | |
56 | ||
57 | if TYPE_CHECKING: | |
58 | from .base import IndexFile | |
59 | from git.db import GitCmdObjectDB | |
60 | from git.objects.tree import TreeCacheTup | |
61 | # from git.objects.fun import EntryTupOrNone | |
62 | ||
63 | # ------------------------------------------------------------------------------------ | |
64 | ||
49 | 65 | |
50 | 66 | S_IFGITLINK = S_IFLNK | S_IFDIR # a submodule |
51 | 67 | CE_NAMEMASK_INV = ~CE_NAMEMASK |
54 | 70 | 'stat_mode_to_index_mode', 'S_IFGITLINK', 'run_commit_hook', 'hook_path') |
55 | 71 | |
56 | 72 | |
57 | def hook_path(name, git_dir): | |
73 | def hook_path(name: str, git_dir: PathLike) -> str: | |
58 | 74 | """:return: path to the given named hook in the given git repository directory""" |
59 | 75 | return osp.join(git_dir, 'hooks', name) |
60 | 76 | |
61 | 77 | |
62 | def run_commit_hook(name, index, *args): | |
78 | def run_commit_hook(name: str, index: 'IndexFile', *args: str) -> None: | |
63 | 79 | """Run the commit hook of the given name. Silently ignores hooks that do not exist. |
64 | 80 | :param name: name of hook, like 'pre-commit' |
65 | 81 | :param index: IndexFile instance |
67 | 83 | :raises HookExecutionError: """ |
68 | 84 | hp = hook_path(name, index.repo.git_dir) |
69 | 85 | if not os.access(hp, os.X_OK): |
70 | return | |
86 | return None | |
71 | 87 | |
72 | 88 | env = os.environ.copy() |
73 | env['GIT_INDEX_FILE'] = safe_decode(index.path) | |
89 | env['GIT_INDEX_FILE'] = safe_decode(str(index.path)) | |
74 | 90 | env['GIT_EDITOR'] = ':' |
75 | 91 | try: |
76 | 92 | cmd = subprocess.Popen([hp] + list(args), |
83 | 99 | except Exception as ex: |
84 | 100 | raise HookExecutionError(hp, ex) from ex |
85 | 101 | else: |
86 | stdout = [] | |
87 | stderr = [] | |
88 | handle_process_output(cmd, stdout.append, stderr.append, finalize_process) | |
89 | stdout = ''.join(stdout) | |
90 | stderr = ''.join(stderr) | |
102 | stdout_list: List[str] = [] | |
103 | stderr_list: List[str] = [] | |
104 | handle_process_output(cmd, stdout_list.append, stderr_list.append, finalize_process) | |
105 | stdout = ''.join(stdout_list) | |
106 | stderr = ''.join(stderr_list) | |
91 | 107 | if cmd.returncode != 0: |
92 | 108 | stdout = force_text(stdout, defenc) |
93 | 109 | stderr = force_text(stderr, defenc) |
95 | 111 | # end handle return code |
96 | 112 | |
97 | 113 | |
98 | def stat_mode_to_index_mode(mode): | |
114 | def stat_mode_to_index_mode(mode: int) -> int: | |
99 | 115 | """Convert the given mode from a stat call to the corresponding index mode |
100 | 116 | and return it""" |
101 | 117 | if S_ISLNK(mode): # symlinks |
102 | 118 | return S_IFLNK |
103 | 119 | if S_ISDIR(mode) or S_IFMT(mode) == S_IFGITLINK: # submodules |
104 | 120 | return S_IFGITLINK |
105 | return S_IFREG | 0o644 | (mode & 0o111) # blobs with or without executable bit | |
106 | ||
107 | ||
108 | def write_cache(entries, stream, extension_data=None, ShaStreamCls=IndexFileSHA1Writer): | |
121 | return S_IFREG | (mode & S_IXUSR and 0o755 or 0o644) # blobs with or without executable bit | |
122 | ||
123 | ||
124 | def write_cache(entries: Sequence[Union[BaseIndexEntry, 'IndexEntry']], stream: IO[bytes], | |
125 | extension_data: Union[None, bytes] = None, | |
126 | ShaStreamCls: Type[IndexFileSHA1Writer] = IndexFileSHA1Writer) -> None: | |
109 | 127 | """Write the cache represented by entries to a stream |
110 | 128 | |
111 | 129 | :param entries: **sorted** list of entries |
118 | 136 | :param extension_data: any kind of data to write as a trailer, it must begin |
119 | 137 | a 4 byte identifier, followed by its size ( 4 bytes )""" |
120 | 138 | # wrap the stream into a compatible writer |
121 | stream = ShaStreamCls(stream) | |
122 | ||
123 | tell = stream.tell | |
124 | write = stream.write | |
139 | stream_sha = ShaStreamCls(stream) | |
140 | ||
141 | tell = stream_sha.tell | |
142 | write = stream_sha.write | |
125 | 143 | |
126 | 144 | # header |
127 | 145 | version = 2 |
131 | 149 | # body |
132 | 150 | for entry in entries: |
133 | 151 | beginoffset = tell() |
134 | write(entry[4]) # ctime | |
135 | write(entry[5]) # mtime | |
136 | path = entry[3] | |
137 | path = force_bytes(path, encoding=defenc) | |
152 | write(entry.ctime_bytes) # ctime | |
153 | write(entry.mtime_bytes) # mtime | |
154 | path_str = str(entry.path) | |
155 | path: bytes = force_bytes(path_str, encoding=defenc) | |
138 | 156 | plen = len(path) & CE_NAMEMASK # path length |
139 | assert plen == len(path), "Path %s too long to fit into index" % entry[3] | |
140 | flags = plen | (entry[2] & CE_NAMEMASK_INV) # clear possible previous values | |
141 | write(pack(">LLLLLL20sH", entry[6], entry[7], entry[0], | |
142 | entry[8], entry[9], entry[10], entry[1], flags)) | |
157 | assert plen == len(path), "Path %s too long to fit into index" % entry.path | |
158 | flags = plen | (entry.flags & CE_NAMEMASK_INV) # clear possible previous values | |
159 | write(pack(">LLLLLL20sH", entry.dev, entry.inode, entry.mode, | |
160 | entry.uid, entry.gid, entry.size, entry.binsha, flags)) | |
143 | 161 | write(path) |
144 | 162 | real_size = ((tell() - beginoffset + 8) & ~7) |
145 | 163 | write(b"\0" * ((beginoffset + real_size) - tell())) |
147 | 165 | |
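The `real_size` line pads each on-disk entry so the next one starts on an 8-byte boundary; the `+ 8` before masking guarantees at least one NUL terminator even when the raw record already ends on a boundary. The arithmetic in isolation:

```python
def padded_entry_size(raw_len):
    # Round up to the next multiple of 8, always adding at least one
    # NUL byte: the +8 makes a boundary-aligned record grow by 8.
    return (raw_len + 8) & ~7
```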
148 | 166 | # write previously cached extensions data |
149 | 167 | if extension_data is not None: |
150 | stream.write(extension_data) | |
168 | stream_sha.write(extension_data) | |
151 | 169 | |
152 | 170 | # write the sha over the content |
153 | stream.write_sha() | |
154 | ||
155 | ||
156 | def read_header(stream): | |
171 | stream_sha.write_sha() | |
172 | ||
173 | ||
174 | def read_header(stream: IO[bytes]) -> Tuple[int, int]: | |
157 | 175 | """Return tuple(version_long, num_entries) from the given stream""" |
158 | 176 | type_id = stream.read(4) |
159 | 177 | if type_id != b"DIRC": |
160 | 178 | raise AssertionError("Invalid index file header: %r" % type_id) |
161 | version, num_entries = unpack(">LL", stream.read(4 * 2)) | |
179 | unpacked = cast(Tuple[int, int], unpack(">LL", stream.read(4 * 2))) | |
180 | version, num_entries = unpacked | |
162 | 181 | |
163 | 182 | # TODO: handle version 3: extended data, see read-cache.c |
164 | 183 | assert version in (1, 2) |
165 | 184 | return version, num_entries |
166 | 185 | |
167 | 186 | |
168 | def entry_key(*entry): | |
187 | def entry_key(*entry: Union[BaseIndexEntry, PathLike, int]) -> Tuple[PathLike, int]: | |
169 | 188 | """:return: Key suitable to be used for the index.entries dictionary |
170 | 189 | :param entry: One instance of type BaseIndexEntry or the path and the stage""" |
190 | ||
191 | # def is_entry_key_tup(entry_key: Tuple) -> TypeGuard[Tuple[PathLike, int]]: | |
192 | # return isinstance(entry_key, tuple) and len(entry_key) == 2 | |
193 | ||
171 | 194 | if len(entry) == 1: |
172 | return (entry[0].path, entry[0].stage) | |
173 | return tuple(entry) | |
195 | entry_first = entry[0] | |
196 | assert isinstance(entry_first, BaseIndexEntry) | |
197 | return (entry_first.path, entry_first.stage) | |
198 | else: | |
199 | # assert is_entry_key_tup(entry) | |
200 | entry = cast(Tuple[PathLike, int], entry) | |
201 | return entry | |
174 | 202 | # END handle entry |
175 | 203 | |
176 | 204 | |
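`entry_key` accepts either a single entry object or an explicit `(path, stage)` pair, which is what the new `assert isinstance(...)` / `cast` combination encodes for the type checker. The dispatch, sketched with a minimal stand-in type:

```python
from typing import NamedTuple


class FakeEntry(NamedTuple):
    # Minimal stand-in for BaseIndexEntry: only the attributes entry_key reads.
    path: str
    stage: int


def entry_key(*entry):
    # One argument: an entry object, key taken from its attributes.
    # Two arguments: already a (path, stage) pair, passed through.
    if len(entry) == 1:
        return (entry[0].path, entry[0].stage)
    return tuple(entry)
```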
177 | def read_cache(stream): | |
205 | def read_cache(stream: IO[bytes]) -> Tuple[int, Dict[Tuple[PathLike, int], 'IndexEntry'], bytes, bytes]: | |
178 | 206 | """Read a cache file from the given stream |
179 | 207 | :return: tuple(version, entries_dict, extension_data, content_sha) |
180 | 208 | * version is the integer version number |
183 | 211 | * content_sha is a 20 byte sha on all cache file contents""" |
184 | 212 | version, num_entries = read_header(stream) |
185 | 213 | count = 0 |
186 | entries = {} | |
214 | entries: Dict[Tuple[PathLike, int], 'IndexEntry'] = {} | |
187 | 215 | |
188 | 216 | read = stream.read |
189 | 217 | tell = stream.tell |
217 | 245 | content_sha = extension_data[-20:] |
218 | 246 | |
219 | 247 | # truncate the sha in the end as we will dynamically create it anyway |
220 | extension_data = extension_data[:-20] | |
248 | extension_data = extension_data[:-20] |
221 | 249 | |
222 | 250 | return (version, entries, extension_data, content_sha) |
223 | 251 | |
224 | 252 | |
225 | def write_tree_from_cache(entries, odb, sl, si=0): | |
253 | def write_tree_from_cache(entries: List[IndexEntry], odb: 'GitCmdObjectDB', sl: slice, si: int = 0 | |
254 | ) -> Tuple[bytes, List['TreeCacheTup']]: | |
226 | 255 | """Create a tree from the given sorted list of entries and put the respective |
227 | 256 | trees into the given object database |
228 | 257 | |
232 | 261 | :param sl: slice indicating the range we should process on the entries list |
233 | 262 | :return: tuple(binsha, list(tree_entry, ...)) a tuple of a sha and a list of |
234 | 263 | tree entries being a tuple of hexsha, mode, name""" |
235 | tree_items = [] | |
236 | tree_items_append = tree_items.append | |
264 | tree_items: List['TreeCacheTup'] = [] | |
265 | ||
237 | 266 | ci = sl.start |
238 | 267 | end = sl.stop |
239 | 268 | while ci < end: |
245 | 274 | rbound = entry.path.find('/', si) |
246 | 275 | if rbound == -1: |
247 | 276 | # its not a tree |
248 | tree_items_append((entry.binsha, entry.mode, entry.path[si:])) | |
277 | tree_items.append((entry.binsha, entry.mode, entry.path[si:])) | |
249 | 278 | else: |
250 | 279 | # find common base range |
251 | 280 | base = entry.path[si:rbound] |
262 | 291 | # enter recursion |
263 | 292 | # ci - 1 as we want to count our current item as well |
264 | 293 | sha, _tree_entry_list = write_tree_from_cache(entries, odb, slice(ci - 1, xi), rbound + 1) |
265 | tree_items_append((sha, S_IFDIR, base)) | |
294 | tree_items.append((sha, S_IFDIR, base)) | |
266 | 295 | |
267 | 296 | # skip ahead |
268 | 297 | ci = xi |
271 | 300 | |
272 | 301 | # finally create the tree |
273 | 302 | sio = BytesIO() |
274 | tree_to_stream(tree_items, sio.write) | |
303 | tree_to_stream(tree_items, sio.write) # writes to stream as bytes, but doesn't change tree_items |
275 | 304 | sio.seek(0) |
276 | 305 | |
277 | 306 | istream = odb.store(IStream(str_tree_type, len(sio.getvalue()), sio)) |
278 | 307 | return (istream.binsha, tree_items) |
279 | 308 | |
280 | 309 | |
281 | def _tree_entry_to_baseindexentry(tree_entry, stage): | |
310 | def _tree_entry_to_baseindexentry(tree_entry: 'TreeCacheTup', stage: int) -> BaseIndexEntry: | |
282 | 311 | return BaseIndexEntry((tree_entry[1], tree_entry[0], stage << CE_STAGESHIFT, tree_entry[2])) |
283 | 312 | |
284 | 313 | |
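`_tree_entry_to_baseindexentry` stores the stage as `stage << CE_STAGESHIFT`: in the entry's 16-bit flags field, the stage occupies bits 12-13 (the shift is 12 in the index format). Round-tripping the encoding:

```python
CE_STAGESHIFT = 12                    # stage occupies bits 12-13 of the flags
CE_STAGEMASK = 0b11 << CE_STAGESHIFT


def pack_stage(stage):
    return stage << CE_STAGESHIFT


def unpack_stage(flags):
    return (flags & CE_STAGEMASK) >> CE_STAGESHIFT
```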
285 | def aggressive_tree_merge(odb, tree_shas): | |
314 | def aggressive_tree_merge(odb: 'GitCmdObjectDB', tree_shas: Sequence[bytes]) -> List[BaseIndexEntry]: | |
286 | 315 | """ |
287 | 316 | :return: list of BaseIndexEntries representing the aggressive merge of the given |
288 | 317 | trees. All valid entries are on stage 0, whereas the conflicting ones are left |
291 | 320 | :param tree_shas: 1, 2 or 3 trees as identified by their binary 20 byte shas |
292 | 321 | If 1 or two, the entries will effectively correspond to the last given tree |
293 | 322 | If 3 are given, a 3 way merge is performed""" |
294 | out = [] | |
295 | out_append = out.append | |
323 | out: List[BaseIndexEntry] = [] | |
296 | 324 | |
297 | 325 | # one and two way is the same for us, as we don't have to handle an existing |
298 | 326 | # index, instead |
299 | 327 | if len(tree_shas) in (1, 2): |
300 | 328 | for entry in traverse_tree_recursive(odb, tree_shas[-1], ''): |
301 | out_append(_tree_entry_to_baseindexentry(entry, 0)) | |
329 | out.append(_tree_entry_to_baseindexentry(entry, 0)) | |
302 | 330 | # END for each entry |
303 | 331 | return out |
304 | 332 | # END handle single tree |
319 | 347 | if(base[0] != ours[0] and base[0] != theirs[0] and ours[0] != theirs[0]) or \ |
320 | 348 | (base[1] != ours[1] and base[1] != theirs[1] and ours[1] != theirs[1]): |
321 | 349 | # changed by both |
322 | out_append(_tree_entry_to_baseindexentry(base, 1)) | |
323 | out_append(_tree_entry_to_baseindexentry(ours, 2)) | |
324 | out_append(_tree_entry_to_baseindexentry(theirs, 3)) | |
350 | out.append(_tree_entry_to_baseindexentry(base, 1)) | |
351 | out.append(_tree_entry_to_baseindexentry(ours, 2)) | |
352 | out.append(_tree_entry_to_baseindexentry(theirs, 3)) | |
325 | 353 | elif base[0] != ours[0] or base[1] != ours[1]: |
326 | 354 | # only we changed it |
327 | out_append(_tree_entry_to_baseindexentry(ours, 0)) | |
355 | out.append(_tree_entry_to_baseindexentry(ours, 0)) | |
328 | 356 | else: |
329 | 357 | # either nobody changed it, or they did. In either |
330 | 358 | # case, use theirs |
331 | out_append(_tree_entry_to_baseindexentry(theirs, 0)) | |
359 | out.append(_tree_entry_to_baseindexentry(theirs, 0)) | |
332 | 360 | # END handle modification |
333 | 361 | else: |
334 | 362 | |
335 | 363 | if ours[0] != base[0] or ours[1] != base[1]: |
336 | 364 | # they deleted it, we changed it, conflict |
337 | out_append(_tree_entry_to_baseindexentry(base, 1)) | |
338 | out_append(_tree_entry_to_baseindexentry(ours, 2)) | |
365 | out.append(_tree_entry_to_baseindexentry(base, 1)) | |
366 | out.append(_tree_entry_to_baseindexentry(ours, 2)) | |
339 | 367 | # else: |
340 | 368 | # we didn't change it, ignore |
341 | 369 | # pass |
348 | 376 | else: |
349 | 377 | if theirs[0] != base[0] or theirs[1] != base[1]: |
350 | 378 | # deleted in ours, changed theirs, conflict |
351 | out_append(_tree_entry_to_baseindexentry(base, 1)) | |
352 | out_append(_tree_entry_to_baseindexentry(theirs, 3)) | |
379 | out.append(_tree_entry_to_baseindexentry(base, 1)) | |
380 | out.append(_tree_entry_to_baseindexentry(theirs, 3)) | |
353 | 381 | # END theirs changed |
354 | 382 | # else: |
355 | 383 | # theirs didn't change |
360 | 388 | # all three can't be None |
361 | 389 | if ours is None: |
362 | 390 | # added in their branch |
363 | out_append(_tree_entry_to_baseindexentry(theirs, 0)) | |
391 | assert theirs is not None | |
392 | out.append(_tree_entry_to_baseindexentry(theirs, 0)) | |
364 | 393 | elif theirs is None: |
365 | 394 | # added in our branch |
366 | out_append(_tree_entry_to_baseindexentry(ours, 0)) | |
395 | out.append(_tree_entry_to_baseindexentry(ours, 0)) | |
367 | 396 | else: |
368 | 397 | # both have it, except for the base, see whether it changed |
369 | 398 | if ours[0] != theirs[0] or ours[1] != theirs[1]: |
370 | out_append(_tree_entry_to_baseindexentry(ours, 2)) | |
371 | out_append(_tree_entry_to_baseindexentry(theirs, 3)) | |
399 | out.append(_tree_entry_to_baseindexentry(ours, 2)) | |
400 | out.append(_tree_entry_to_baseindexentry(theirs, 3)) | |
372 | 401 | else: |
373 | 402 | # it was added the same in both |
374 | out_append(_tree_entry_to_baseindexentry(ours, 0)) | |
403 | out.append(_tree_entry_to_baseindexentry(ours, 0)) | |
375 | 404 | # END handle two items |
376 | 405 | # END handle heads |
377 | 406 | # END handle base exists |
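The stage numbers written by this function follow git's index convention: stage 0 is a cleanly merged entry, while stages 1, 2, and 3 hold the base, ours, and theirs versions of a conflicting path. The branch structure above can be sketched as a standalone decision table; `merge_stages` and the opaque entry values are illustrative, not part of GitPython, which compares mode and sha separately:

```python
# Illustrative decision table for one path in a three-way merge, mirroring
# the branches above. base/ours/theirs are opaque entry values, or None if
# the path is absent on that side. Returns (entry, stage) pairs:
# stage 0 = cleanly merged, 1 = base, 2 = ours, 3 = theirs.
def merge_stages(base, ours, theirs):
    out = []
    if base is not None:
        if ours is not None and theirs is not None:
            if base != ours and base != theirs and ours != theirs:
                # changed incompatibly on both sides: full conflict
                out += [(base, 1), (ours, 2), (theirs, 3)]
            elif base != ours:
                out.append((ours, 0))    # only we changed it
            else:
                out.append((theirs, 0))  # unchanged, or only they changed it
        elif ours is not None and ours != base:
            # they deleted it, we changed it: conflict
            out += [(base, 1), (ours, 2)]
        elif theirs is not None and theirs != base:
            # we deleted it, they changed it: conflict
            out += [(base, 1), (theirs, 3)]
        # a side that deleted an unchanged entry wins silently
    else:
        if ours is None:
            out.append((theirs, 0))      # added only in their branch
        elif theirs is None:
            out.append((ours, 0))        # added only in our branch
        elif ours != theirs:
            out += [(ours, 2), (theirs, 3)]  # added differently: conflict
        else:
            out.append((ours, 0))        # added identically in both
    return out
```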
7 | 7 | ) |
8 | 8 | from git.objects import Blob |
9 | 9 | |
10 | ||
11 | # typing ---------------------------------------------------------------------- | |
12 | ||
13 | from typing import (NamedTuple, Sequence, TYPE_CHECKING, Tuple, Union, cast) | |
14 | ||
15 | from git.types import PathLike | |
16 | ||
17 | if TYPE_CHECKING: | |
18 | from git.repo import Repo | |
19 | ||
20 | # --------------------------------------------------------------------------------- | |
10 | 21 | |
11 | 22 | __all__ = ('BlobFilter', 'BaseIndexEntry', 'IndexEntry') |
12 | 23 | |
30 | 41 | """ |
31 | 42 | __slots__ = 'paths' |
32 | 43 | |
33 | def __init__(self, paths): | |
44 | def __init__(self, paths: Sequence[PathLike]) -> None: | |
34 | 45 | """ |
35 | 46 | :param paths: |
36 | 47 | tuple or list of paths which are either pointing to directories or |
38 | 49 | """ |
39 | 50 | self.paths = paths |
40 | 51 | |
41 | def __call__(self, stage_blob): | |
52 | def __call__(self, stage_blob: Blob) -> bool: | |
42 | 53 | path = stage_blob[1].path |
43 | 54 | for p in self.paths: |
44 | 55 | if path.startswith(p): |
47 | 58 | return False |
48 | 59 | |
49 | 60 | |
50 | class BaseIndexEntry(tuple): | |
61 | class BaseIndexEntryHelper(NamedTuple): | |
62 | """Typed namedtuple to provide named attribute access for BaseIndexEntry. | |
63 | Needed to allow overriding __new__ in child class to preserve backwards compat.""" | |
64 | mode: int | |
65 | binsha: bytes | |
66 | flags: int | |
67 | path: PathLike | |
68 | ctime_bytes: bytes = pack(">LL", 0, 0) | |
69 | mtime_bytes: bytes = pack(">LL", 0, 0) | |
70 | dev: int = 0 | |
71 | inode: int = 0 | |
72 | uid: int = 0 | |
73 | gid: int = 0 | |
74 | size: int = 0 | |
75 | ||
76 | ||
77 | class BaseIndexEntry(BaseIndexEntryHelper): | |
51 | 78 | |
52 | 79 | """Small Brother of an index entry which can be created to describe changes |
53 | 80 | done to the index in which case plenty of additional information is not required. |
54 | 81 | |
55 | 82 | As the first 4 data members match exactly to the IndexEntry type, methods |
56 | 83 | expecting a BaseIndexEntry can also handle full IndexEntries even if they |
57 | use numeric indices for performance reasons. """ | |
84 | use numeric indices for performance reasons. | |
85 | """ | |
58 | 86 | |
59 | def __str__(self): | |
87 | def __new__(cls, inp_tuple: Union[Tuple[int, bytes, int, PathLike], | |
88 | Tuple[int, bytes, int, PathLike, bytes, bytes, int, int, int, int, int]] | |
89 | ) -> 'BaseIndexEntry': | |
90 | """Override __new__ to allow construction from a tuple for backwards compatibility """ | |
91 | return super().__new__(cls, *inp_tuple) | |
92 | ||
93 | def __str__(self) -> str: | |
60 | 94 | return "%o %s %i\t%s" % (self.mode, self.hexsha, self.stage, self.path) |
61 | 95 | |
62 | def __repr__(self): | |
96 | def __repr__(self) -> str: | |
63 | 97 | return "(%o, %s, %i, %s)" % (self.mode, self.hexsha, self.stage, self.path) |
64 | 98 | |
65 | 99 | @property |
66 | def mode(self): | |
67 | """ File Mode, compatible to stat module constants """ | |
68 | return self[0] | |
100 | def hexsha(self) -> str: | |
101 | """hex version of our sha""" | |
102 | return b2a_hex(self.binsha).decode('ascii') | |
69 | 103 | |
70 | 104 | @property |
71 | def binsha(self): | |
72 | """binary sha of the blob """ | |
73 | return self[1] | |
74 | ||
75 | @property | |
76 | def hexsha(self): | |
77 | """hex version of our sha""" | |
78 | return b2a_hex(self[1]).decode('ascii') | |
79 | ||
80 | @property | |
81 | def stage(self): | |
105 | def stage(self) -> int: | |
82 | 106 | """Stage of the entry, either: |
83 | 107 | |
84 | 108 | * 0 = default stage |
88 | 112 | |
89 | 113 | :note: For more information, see http://www.kernel.org/pub/software/scm/git/docs/git-read-tree.html |
90 | 114 | """ |
91 | return (self[2] & CE_STAGEMASK) >> CE_STAGESHIFT | |
92 | ||
93 | @property | |
94 | def path(self): | |
95 | """:return: our path relative to the repository working tree root""" | |
96 | return self[3] | |
97 | ||
98 | @property | |
99 | def flags(self): | |
100 | """:return: flags stored with this entry""" | |
101 | return self[2] | |
115 | return (self.flags & CE_STAGEMASK) >> CE_STAGESHIFT | |
102 | 116 | |
103 | 117 | @classmethod |
104 | def from_blob(cls, blob, stage=0): | |
118 | def from_blob(cls, blob: Blob, stage: int = 0) -> 'BaseIndexEntry': | |
105 | 119 | """:return: Fully equipped BaseIndexEntry at the given stage""" |
106 | 120 | return cls((blob.mode, blob.binsha, stage << CE_STAGESHIFT, blob.path)) |
107 | 121 | |
108 | def to_blob(self, repo): | |
122 | def to_blob(self, repo: 'Repo') -> Blob: | |
109 | 123 | """:return: Blob using the information of this index entry""" |
110 | 124 | return Blob(repo, self.binsha, self.mode, self.path) |
111 | 125 | |
119 | 133 | |
120 | 134 | See the properties for a mapping between names and tuple indices. """ |
121 | 135 | @property |
122 | def ctime(self): | |
136 | def ctime(self) -> Tuple[int, int]: | |
123 | 137 | """ |
124 | 138 | :return: |
125 | 139 | Tuple(int_time_seconds_since_epoch, int_nano_seconds) of the |
126 | 140 | file's creation time""" |
127 | return unpack(">LL", self[4]) | |
141 | return cast(Tuple[int, int], unpack(">LL", self.ctime_bytes)) | |
128 | 142 | |
129 | 143 | @property |
130 | def mtime(self): | |
144 | def mtime(self) -> Tuple[int, int]: | |
131 | 145 | """See ctime property, but returns modification time """ |
132 | return unpack(">LL", self[5]) | |
133 | ||
134 | @property | |
135 | def dev(self): | |
136 | """ Device ID """ | |
137 | return self[6] | |
138 | ||
139 | @property | |
140 | def inode(self): | |
141 | """ Inode ID """ | |
142 | return self[7] | |
143 | ||
144 | @property | |
145 | def uid(self): | |
146 | """ User ID """ | |
147 | return self[8] | |
148 | ||
149 | @property | |
150 | def gid(self): | |
151 | """ Group ID """ | |
152 | return self[9] | |
153 | ||
154 | @property | |
155 | def size(self): | |
156 | """:return: Uncompressed size of the blob """ | |
157 | return self[10] | |
146 | return cast(Tuple[int, int], unpack(">LL", self.mtime_bytes)) | |
158 | 147 | |
159 | 148 | @classmethod |
160 | def from_base(cls, base): | |
149 | def from_base(cls, base: 'BaseIndexEntry') -> 'IndexEntry': | |
161 | 150 | """ |
162 | 151 | :return: |
163 | 152 | Minimal entry as created from the given BaseIndexEntry instance. |
168 | 157 | return IndexEntry((base.mode, base.binsha, base.flags, base.path, time, time, 0, 0, 0, 0, 0)) |
169 | 158 | |
170 | 159 | @classmethod |
171 | def from_blob(cls, blob, stage=0): | |
160 | def from_blob(cls, blob: Blob, stage: int = 0) -> 'IndexEntry': | |
172 | 161 | """:return: Minimal entry resembling the given blob object""" |
173 | 162 | time = pack(">LL", 0, 0) |
174 | 163 | return IndexEntry((blob.mode, blob.binsha, stage << CE_STAGESHIFT, blob.path, |
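The `BaseIndexEntryHelper`/`__new__` arrangement in this diff keeps the old construct-from-a-single-tuple API while moving to a typed `NamedTuple` with defaulted stat fields. The pattern in isolation, with hypothetical names (`EntryBase` and `Entry` are not GitPython classes):

```python
from typing import NamedTuple, Tuple, Union

class EntryBase(NamedTuple):
    # Typed namedtuple: trailing fields get defaults, standing in for the
    # optional stat information of a full index entry.
    mode: int
    path: str
    size: int = 0

class Entry(EntryBase):
    def __new__(cls, inp_tuple: Union[Tuple[int, str], Tuple[int, str, int]]) -> 'Entry':
        # Accept a single tuple, as the old tuple-subclass API did,
        # and unpack it into the named fields.
        return super().__new__(cls, *inp_tuple)

# Short form and long form both construct from one tuple argument:
short = Entry((0o100644, "README.md"))
full = Entry((0o100755, "run.sh", 42))
```

Because the class is still a tuple, both named attribute access (`short.mode`) and the old numeric indexing (`short[0]`) keep working.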
6 | 6 | from git.compat import is_win |
7 | 7 | |
8 | 8 | import os.path as osp |
9 | ||
10 | ||
11 | # typing ---------------------------------------------------------------------- | |
12 | ||
13 | from typing import (Any, Callable, TYPE_CHECKING) | |
14 | ||
15 | from git.types import PathLike, _T | |
16 | ||
17 | if TYPE_CHECKING: | |
18 | from git.index import IndexFile | |
19 | ||
20 | # --------------------------------------------------------------------------------- | |
9 | 21 | |
10 | 22 | |
11 | 23 | __all__ = ('TemporaryFileSwap', 'post_clear_cache', 'default_index', 'git_working_dir') |
23 | 35 | and moving it back to its original location on object deletion.
24 | 36 | __slots__ = ("file_path", "tmp_file_path") |
25 | 37 | |
26 | def __init__(self, file_path): | |
38 | def __init__(self, file_path: PathLike) -> None: | |
27 | 39 | self.file_path = file_path |
28 | self.tmp_file_path = self.file_path + tempfile.mktemp('', '', '') | |
40 | self.tmp_file_path = str(self.file_path) + tempfile.mktemp('', '', '') | |
29 | 41 | # it may be that the source does not exist |
30 | 42 | try: |
31 | 43 | os.rename(self.file_path, self.tmp_file_path) |
32 | 44 | except OSError: |
33 | 45 | pass |
34 | 46 | |
35 | def __del__(self): | |
47 | def __del__(self) -> None: | |
36 | 48 | if osp.isfile(self.tmp_file_path): |
37 | 49 | if is_win and osp.exists(self.file_path): |
38 | 50 | os.remove(self.file_path) |
42 | 54 | |
43 | 55 | #{ Decorators |
44 | 56 | |
45 | def post_clear_cache(func): | |
57 | def post_clear_cache(func: Callable[..., _T]) -> Callable[..., _T]: | |
46 | 58 | """Decorator for functions that alter the index using the git command. This would |
47 | 59 | invalidate our possibly existing entries dictionary which is why it must be |
48 | 60 | deleted to allow it to be lazily reread later. |
53 | 65 | """ |
54 | 66 | |
55 | 67 | @wraps(func) |
56 | def post_clear_cache_if_not_raised(self, *args, **kwargs): | |
68 | def post_clear_cache_if_not_raised(self: 'IndexFile', *args: Any, **kwargs: Any) -> _T: | |
57 | 69 | rval = func(self, *args, **kwargs) |
58 | 70 | self._delete_entries_cache() |
59 | 71 | return rval |
62 | 74 | return post_clear_cache_if_not_raised |
63 | 75 | |
64 | 76 | |
65 | def default_index(func): | |
77 | def default_index(func: Callable[..., _T]) -> Callable[..., _T]: | |
66 | 78 | """Decorator assuring the wrapped method may only run if we are the default |
67 | 79 | repository index. This is as we rely on git commands that operate |
68 | 80 | on that index only. """ |
69 | 81 | |
70 | 82 | @wraps(func) |
71 | def check_default_index(self, *args, **kwargs): | |
83 | def check_default_index(self: 'IndexFile', *args: Any, **kwargs: Any) -> _T: | |
72 | 84 | if self._file_path != self._index_path(): |
73 | 85 | raise AssertionError( |
74 | 86 | "Cannot call %r on indices that do not represent the default git index" % func.__name__) |
78 | 90 | return check_default_index |
79 | 91 | |
80 | 92 | |
81 | def git_working_dir(func): | |
93 | def git_working_dir(func: Callable[..., _T]) -> Callable[..., _T]: | |
82 | 94 | """Decorator which changes the current working dir to the one of the git |
83 | 95 | repository in order to assure relative paths are handled correctly""" |
84 | 96 | |
85 | 97 | @wraps(func) |
86 | def set_git_working_dir(self, *args, **kwargs): | |
98 | def set_git_working_dir(self: 'IndexFile', *args: Any, **kwargs: Any) -> _T: | |
87 | 99 | cur_wd = os.getcwd() |
88 | os.chdir(self.repo.working_tree_dir) | |
100 | os.chdir(str(self.repo.working_tree_dir)) | |
89 | 101 | try: |
90 | 102 | return func(self, *args, **kwargs) |
91 | 103 | finally: |
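The `Callable[..., _T]` annotations added throughout this file let the type checker see that each decorator preserves the wrapped function's return type. A self-contained sketch of the same save/chdir/restore shape as `git_working_dir`, with a local `TypeVar` standing in for `git.types._T` and a hypothetical `working_dir` factory in place of the method decorator:

```python
import os
import tempfile
from functools import wraps
from typing import Any, Callable, TypeVar

_T = TypeVar('_T')

def working_dir(path: str) -> Callable[[Callable[..., _T]], Callable[..., _T]]:
    """Run the wrapped function with `path` as the cwd, restoring the
    previous cwd afterwards -- the shape of git_working_dir above."""
    def decorator(func: Callable[..., _T]) -> Callable[..., _T]:
        @wraps(func)
        def wrapper(*args: Any, **kwargs: Any) -> _T:
            cur_wd = os.getcwd()
            os.chdir(path)
            try:
                return func(*args, **kwargs)
            finally:
                # restore even if func raised
                os.chdir(cur_wd)
        return wrapper
    return decorator

tmp = tempfile.mkdtemp()

@working_dir(tmp)
def where() -> str:
    return os.getcwd()
```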
1 | 1 | Import all submodules' main classes into the package space
2 | 2 | """ |
3 | 3 | # flake8: noqa |
4 | from __future__ import absolute_import | |
5 | ||
6 | 4 | import inspect |
7 | 5 | |
8 | 6 | from .base import * |
15 | 13 | from .tree import * |
16 | 14 | # Fix import dependency - add IndexObject to the util module, so that it can be |
17 | 15 | # imported by the submodule.base |
18 | smutil.IndexObject = IndexObject | |
19 | smutil.Object = Object | |
16 | smutil.IndexObject = IndexObject # type: ignore[attr-defined] | |
17 | smutil.Object = Object # type: ignore[attr-defined] | |
20 | 18 | del(smutil) |
21 | 19 | |
22 | 20 | # must come after submodule was made available |
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | ||
6 | from git.exc import WorkTreeRepositoryUnsupported | |
5 | 7 | from git.util import LazyMixin, join_path_native, stream_copy, bin_to_hex |
6 | 8 | |
7 | 9 | import gitdb.typ as dbtyp |
10 | 12 | from .util import get_object_type_by_name |
11 | 13 | |
12 | 14 | |
13 | _assertion_msg_format = "Created object %r whose python type %r disagrees with the acutal git object type %r" | |
15 | # typing ------------------------------------------------------------------ | |
16 | ||
17 | from typing import Any, TYPE_CHECKING, Union | |
18 | ||
19 | from git.types import PathLike, Commit_ish, Lit_commit_ish | |
20 | ||
21 | if TYPE_CHECKING: | |
22 | from git.repo import Repo | |
23 | from gitdb.base import OStream | |
24 | from .tree import Tree | |
25 | from .blob import Blob | |
26 | from .submodule.base import Submodule | |
27 | from git.refs.reference import Reference | |
28 | ||
29 | IndexObjUnion = Union['Tree', 'Blob', 'Submodule'] | |
30 | ||
31 | # -------------------------------------------------------------------------- | |
32 | ||
33 | ||
34 | _assertion_msg_format = "Created object %r whose python type %r disagrees with the actual git object type %r" | 
14 | 35 | |
15 | 36 | __all__ = ("Object", "IndexObject") |
16 | 37 | |
23 | 44 | |
24 | 45 | TYPES = (dbtyp.str_blob_type, dbtyp.str_tree_type, dbtyp.str_commit_type, dbtyp.str_tag_type) |
25 | 46 | __slots__ = ("repo", "binsha", "size") |
26 | type = None # to be set by subclass | |
27 | ||
28 | def __init__(self, repo, binsha): | |
47 | type: Union[Lit_commit_ish, None] = None | |
48 | ||
49 | def __init__(self, repo: 'Repo', binsha: bytes): | |
29 | 50 | """Initialize an object by identifying it by its binary sha. |
30 | 51 | All keyword arguments will be set on demand if None. |
31 | 52 | |
38 | 59 | assert len(binsha) == 20, "Require 20 byte binary sha, got %r, len = %i" % (binsha, len(binsha)) |
39 | 60 | |
40 | 61 | @classmethod |
41 | def new(cls, repo, id): # @ReservedAssignment | |
62 | def new(cls, repo: 'Repo', id: Union[str, 'Reference']) -> Commit_ish: | |
42 | 63 | """ |
43 | 64 | :return: New Object instance of a type appropriate to the object type behind |
44 | 65 | id. The id of the newly created object will be a binsha even though |
51 | 72 | return repo.rev_parse(str(id)) |
52 | 73 | |
53 | 74 | @classmethod |
54 | def new_from_sha(cls, repo, sha1): | |
75 | def new_from_sha(cls, repo: 'Repo', sha1: bytes) -> Commit_ish: | |
55 | 76 | """ |
56 | 77 | :return: new object instance of a type appropriate to represent the given |
57 | 78 | binary sha1 |
65 | 86 | inst.size = oinfo.size |
66 | 87 | return inst |
67 | 88 | |
68 | def _set_cache_(self, attr): | |
89 | def _set_cache_(self, attr: str) -> None: | |
69 | 90 | """Retrieve object information""" |
70 | 91 | if attr == "size": |
71 | 92 | oinfo = self.repo.odb.info(self.binsha) |
72 | self.size = oinfo.size | |
93 | self.size = oinfo.size # type: int | |
73 | 94 | # assert oinfo.type == self.type, _assertion_msg_format % (self.binsha, oinfo.type, self.type) |
74 | 95 | else: |
75 | 96 | super(Object, self)._set_cache_(attr) |
76 | 97 | |
77 | def __eq__(self, other): | |
98 | def __eq__(self, other: Any) -> bool: | |
78 | 99 | """:return: True if the objects have the same SHA1""" |
79 | 100 | if not hasattr(other, 'binsha'): |
80 | 101 | return False |
81 | 102 | return self.binsha == other.binsha |
82 | 103 | |
83 | def __ne__(self, other): | |
104 | def __ne__(self, other: Any) -> bool: | |
84 | 105 | """:return: True if the objects do not have the same SHA1 """ |
85 | 106 | if not hasattr(other, 'binsha'): |
86 | 107 | return True |
87 | 108 | return self.binsha != other.binsha |
88 | 109 | |
89 | def __hash__(self): | |
110 | def __hash__(self) -> int: | |
90 | 111 | """:return: Hash of our id allowing objects to be used in dicts and sets""" |
91 | 112 | return hash(self.binsha) |
92 | 113 | |
93 | def __str__(self): | |
114 | def __str__(self) -> str: | |
94 | 115 | """:return: string of our SHA1 as understood by all git commands""" |
95 | 116 | return self.hexsha |
96 | 117 | |
97 | def __repr__(self): | |
118 | def __repr__(self) -> str: | |
98 | 119 | """:return: string with pythonic representation of our object""" |
99 | 120 | return '<git.%s "%s">' % (self.__class__.__name__, self.hexsha) |
100 | 121 | |
101 | 122 | @property |
102 | def hexsha(self): | |
123 | def hexsha(self) -> str: | |
103 | 124 | """:return: 40 byte hex version of our 20 byte binary sha""" |
104 | 125 | # b2a_hex produces bytes |
105 | 126 | return bin_to_hex(self.binsha).decode('ascii') |
106 | 127 | |
107 | 128 | @property |
108 | def data_stream(self): | |
129 | def data_stream(self) -> 'OStream': | |
109 | 130 | """ :return: File Object compatible stream to the uncompressed raw data of the object |
110 | 131 | :note: returned streams must be read in order""" |
111 | 132 | return self.repo.odb.stream(self.binsha) |
112 | 133 | |
113 | def stream_data(self, ostream): | |
134 | def stream_data(self, ostream: 'OStream') -> 'Object': | |
114 | 135 | """Writes our data directly to the given output stream |
115 | 136 | :param ostream: File object compatible stream object. |
116 | 137 | :return: self""" |
128 | 149 | # for compatibility with iterable lists |
129 | 150 | _id_attribute_ = 'path' |
130 | 151 | |
131 | def __init__(self, repo, binsha, mode=None, path=None): | |
152 | def __init__(self, | |
153 | repo: 'Repo', binsha: bytes, mode: Union[None, int] = None, path: Union[None, PathLike] = None | |
154 | ) -> None: | |
132 | 155 | """Initialize a newly instanced IndexObject |
133 | 156 | |
134 | 157 | :param repo: is the Repo we are located in |
148 | 171 | if path is not None: |
149 | 172 | self.path = path |
150 | 173 | |
151 | def __hash__(self): | |
174 | def __hash__(self) -> int: | |
152 | 175 | """ |
153 | 176 | :return: |
154 | 177 | Hash of our path as index items are uniquely identifiable by path, not |
155 | 178 | by their data !""" |
156 | 179 | return hash(self.path) |
157 | 180 | |
158 | def _set_cache_(self, attr): | |
181 | def _set_cache_(self, attr: str) -> None: | |
159 | 182 | if attr in IndexObject.__slots__: |
160 | 183 | # they cannot be retrieved later on (not without searching for them)
161 | 184 | raise AttributeError( |
166 | 189 | # END handle slot attribute |
167 | 190 | |
168 | 191 | @property |
169 | def name(self): | |
192 | def name(self) -> str: | |
170 | 193 | """:return: Name portion of the path, effectively being the basename""" |
171 | 194 | return osp.basename(self.path) |
172 | 195 | |
173 | 196 | @property |
174 | def abspath(self): | |
197 | def abspath(self) -> PathLike: | |
175 | 198 | """ |
176 | 199 | :return: |
177 | 200 | Absolute path to this index object in the file system ( as opposed to the |
178 | 201 | .path field which is a path relative to the git repository ). |
179 | 202 | |
180 | 203 | The returned path will be native to the system and will contain '\' on Windows. """
181 | return join_path_native(self.repo.working_tree_dir, self.path) | |
204 | if self.repo.working_tree_dir is not None: | |
205 | return join_path_native(self.repo.working_tree_dir, self.path) | |
206 | else: | |
207 | raise WorkTreeRepositoryUnsupported("Working_tree_dir was None or empty") |
5 | 5 | from mimetypes import guess_type |
6 | 6 | from . import base |
7 | 7 | |
8 | from git.types import Literal | |
9 | ||
8 | 10 | __all__ = ('Blob', ) |
9 | 11 | |
10 | 12 | |
12 | 14 | |
13 | 15 | """A Blob encapsulates a git blob object""" |
14 | 16 | DEFAULT_MIME_TYPE = "text/plain" |
15 | type = "blob" | |
17 | type: Literal['blob'] = "blob" | |
16 | 18 | |
17 | 19 | # valid blob modes |
18 | 20 | executable_mode = 0o100755 |
22 | 24 | __slots__ = () |
23 | 25 | |
24 | 26 | @property |
25 | def mime_type(self): | |
27 | def mime_type(self) -> str: | |
26 | 28 | """ |
27 | 29 | :return: String describing the mime type of this file (based on the filename) |
28 | 30 | :note: Defaults to 'text/plain' in case the actual file type is unknown. """ |
29 | 31 | guesses = None |
30 | 32 | if self.path: |
31 | guesses = guess_type(self.path) | |
33 | guesses = guess_type(str(self.path)) | |
32 | 34 | return guesses and guesses[0] or self.DEFAULT_MIME_TYPE |
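The `str()` call added around the `guess_type` argument is presumably there to guard against path-like objects, which the `mimetypes` module only began accepting in later Python versions. The fallback logic itself is small enough to sketch standalone (`mime_type` here is a free function mirroring `Blob.mime_type`):

```python
from mimetypes import guess_type

DEFAULT_MIME_TYPE = "text/plain"

def mime_type(path: str) -> str:
    # guess_type returns a (type, encoding) pair; type is None when the
    # extension is unknown, so fall back to the default, as Blob.mime_type does.
    guesses = guess_type(str(path)) if path else None
    return guesses and guesses[0] or DEFAULT_MIME_TYPE
```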
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | ||
5 | import datetime | |
6 | from subprocess import Popen | |
6 | 7 | from gitdb import IStream |
7 | 8 | from git.util import ( |
8 | 9 | hex_to_bin, |
9 | 10 | Actor, |
10 | Iterable, | |
11 | 11 | Stats, |
12 | 12 | finalize_process |
13 | 13 | ) |
16 | 16 | from .tree import Tree |
17 | 17 | from . import base |
18 | 18 | from .util import ( |
19 | Traversable, | |
20 | 19 | Serializable, |
20 | TraversableIterableObj, | |
21 | 21 | parse_date, |
22 | 22 | altz_to_utctz_str, |
23 | 23 | parse_actor_and_date, |
35 | 35 | from io import BytesIO |
36 | 36 | import logging |
37 | 37 | |
38 | ||
39 | # typing ------------------------------------------------------------------ | |
40 | ||
41 | from typing import Any, IO, Iterator, List, Sequence, Tuple, Union, TYPE_CHECKING, cast | |
42 | ||
43 | from git.types import PathLike, Literal | |
44 | ||
45 | if TYPE_CHECKING: | |
46 | from git.repo import Repo | |
47 | from git.refs import SymbolicReference | |
48 | ||
49 | # ------------------------------------------------------------------------ | |
50 | ||
38 | 51 | log = logging.getLogger('git.objects.commit') |
39 | 52 | log.addHandler(logging.NullHandler()) |
40 | 53 | |
41 | 54 | __all__ = ('Commit', ) |
42 | 55 | |
43 | 56 | |
44 | class Commit(base.Object, Iterable, Diffable, Traversable, Serializable): | |
57 | class Commit(base.Object, TraversableIterableObj, Diffable, Serializable): | |
45 | 58 | |
46 | 59 | """Wraps a git Commit object. |
47 | 60 | |
60 | 73 | default_encoding = "UTF-8" |
61 | 74 | |
62 | 75 | # object configuration |
63 | type = "commit" | |
76 | type: Literal['commit'] = "commit" | |
64 | 77 | __slots__ = ("tree", |
65 | 78 | "author", "authored_date", "author_tz_offset", |
66 | 79 | "committer", "committed_date", "committer_tz_offset", |
67 | 80 | "message", "parents", "encoding", "gpgsig") |
68 | 81 | _id_attribute_ = "hexsha" |
69 | 82 | |
70 | def __init__(self, repo, binsha, tree=None, author=None, authored_date=None, author_tz_offset=None, | |
71 | committer=None, committed_date=None, committer_tz_offset=None, | |
72 | message=None, parents=None, encoding=None, gpgsig=None): | |
83 | def __init__(self, repo: 'Repo', binsha: bytes, tree: Union[Tree, None] = None, | |
84 | author: Union[Actor, None] = None, | |
85 | authored_date: Union[int, None] = None, | |
86 | author_tz_offset: Union[None, float] = None, | |
87 | committer: Union[Actor, None] = None, | |
88 | committed_date: Union[int, None] = None, | |
89 | committer_tz_offset: Union[None, float] = None, | |
90 | message: Union[str, bytes, None] = None, | |
91 | parents: Union[Sequence['Commit'], None] = None, | |
92 | encoding: Union[str, None] = None, | |
93 | gpgsig: Union[str, None] = None) -> None: | |
73 | 94 | """Instantiate a new Commit. All keyword arguments taking None as default will |
74 | 95 | be implicitly set on first query. |
75 | 96 | |
106 | 127 | as what time.altzone returns. The sign is inverted compared to git's |
107 | 128 | UTC timezone.""" |
108 | 129 | super(Commit, self).__init__(repo, binsha) |
130 | self.binsha = binsha | |
109 | 131 | if tree is not None: |
110 | 132 | assert isinstance(tree, Tree), "Tree needs to be a Tree instance, was %s" % type(tree) |
111 | 133 | if tree is not None: |
132 | 154 | self.gpgsig = gpgsig |
133 | 155 | |
134 | 156 | @classmethod |
135 | def _get_intermediate_items(cls, commit): | |
136 | return commit.parents | |
157 | def _get_intermediate_items(cls, commit: 'Commit') -> Tuple['Commit', ...]: | |
158 | return tuple(commit.parents) | |
137 | 159 | |
138 | 160 | @classmethod |
139 | def _calculate_sha_(cls, repo, commit): | |
161 | def _calculate_sha_(cls, repo: 'Repo', commit: 'Commit') -> bytes: | |
140 | 162 | '''Calculate the sha of a commit. |
141 | 163 | |
142 | 164 | :param repo: Repo object the commit should be part of |
151 | 173 | istream = repo.odb.store(IStream(cls.type, streamlen, stream)) |
152 | 174 | return istream.binsha |
153 | 175 | |
154 | def replace(self, **kwargs): | |
176 | def replace(self, **kwargs: Any) -> 'Commit': | |
155 | 177 | '''Create new commit object from existing commit object. |
156 | 178 | |
157 | 179 | Any values provided as keyword arguments will replace the |
170 | 192 | |
171 | 193 | return new_commit |
172 | 194 | |
173 | def _set_cache_(self, attr): | |
195 | def _set_cache_(self, attr: str) -> None: | |
174 | 196 | if attr in Commit.__slots__: |
175 | 197 | # read the data in a chunk, its faster - then provide a file wrapper |
176 | 198 | _binsha, _typename, self.size, stream = self.repo.odb.stream(self.binsha) |
180 | 202 | # END handle attrs |
181 | 203 | |
182 | 204 | @property |
183 | def authored_datetime(self): | |
205 | def authored_datetime(self) -> datetime.datetime: | |
184 | 206 | return from_timestamp(self.authored_date, self.author_tz_offset) |
185 | 207 | |
186 | 208 | @property |
187 | def committed_datetime(self): | |
209 | def committed_datetime(self) -> datetime.datetime: | |
188 | 210 | return from_timestamp(self.committed_date, self.committer_tz_offset) |
189 | 211 | |
190 | 212 | @property |
191 | def summary(self): | |
213 | def summary(self) -> Union[str, bytes]: | |
192 | 214 | """:return: First line of the commit message""" |
193 | return self.message.split('\n', 1)[0] | |
194 | ||
195 | def count(self, paths='', **kwargs): | |
215 | if isinstance(self.message, str): | |
216 | return self.message.split('\n', 1)[0] | |
217 | else: | |
218 | return self.message.split(b'\n', 1)[0] | |
219 | ||
220 | def count(self, paths: Union[PathLike, Sequence[PathLike]] = '', **kwargs: Any) -> int: | |
196 | 221 | """Count the number of commits reachable from this commit |
197 | 222 | |
198 | 223 | :param paths: |
210 | 235 | return len(self.repo.git.rev_list(self.hexsha, **kwargs).splitlines()) |
211 | 236 | |
212 | 237 | @property |
213 | def name_rev(self): | |
238 | def name_rev(self) -> str: | |
214 | 239 | """ |
215 | 240 | :return: |
216 | 241 | String describing the commit's hex sha based on the closest Reference.
218 | 243 | return self.repo.git.name_rev(self) |
219 | 244 | |
220 | 245 | @classmethod |
221 | def iter_items(cls, repo, rev, paths='', **kwargs): | |
246 | def iter_items(cls, repo: 'Repo', rev: Union[str, 'Commit', 'SymbolicReference'], # type: ignore | |
247 | paths: Union[PathLike, Sequence[PathLike]] = '', **kwargs: Any | |
248 | ) -> Iterator['Commit']: | |
222 | 249 | """Find all commits matching the given criteria. |
223 | 250 | |
224 | 251 | :param repo: is the Repo |
238 | 265 | |
239 | 266 | # use -- in any case, to prevent possibility of ambiguous arguments |
240 | 267 | # see https://github.com/gitpython-developers/GitPython/issues/264 |
241 | args = ['--'] | |
268 | ||
269 | args_list: List[PathLike] = ['--'] | |
270 | ||
242 | 271 | if paths: |
243 | args.extend((paths, )) | |
272 | paths_tup: Tuple[PathLike, ...] | |
273 | if isinstance(paths, (str, os.PathLike)): | |
274 | paths_tup = (paths, ) | |
275 | else: | |
276 | paths_tup = tuple(paths) | |
277 | ||
278 | args_list.extend(paths_tup) | |
244 | 279 | # END if paths |
245 | 280 | |
246 | proc = repo.git.rev_list(rev, args, as_process=True, **kwargs) | |
281 | proc = repo.git.rev_list(rev, args_list, as_process=True, **kwargs) | |
247 | 282 | return cls._iter_from_process_or_stream(repo, proc) |
248 | 283 | |
249 | def iter_parents(self, paths='', **kwargs): | |
284 | def iter_parents(self, paths: Union[PathLike, Sequence[PathLike]] = '', **kwargs: Any) -> Iterator['Commit']: | |
250 | 285 | """Iterate _all_ parents of this commit. |
251 | 286 | |
252 | 287 | :param paths: |
262 | 297 | |
263 | 298 | return self.iter_items(self.repo, self, paths, **kwargs) |
264 | 299 | |
265 | @property | |
266 | def stats(self): | |
300 | @ property | |
301 | def stats(self) -> Stats: | |
267 | 302 | """Create a git stat from changes between this commit and its first parent |
268 | 303 | or from all changes done if this is the very first commit. |
269 | 304 | |
279 | 314 | text = self.repo.git.diff(self.parents[0].hexsha, self.hexsha, '--', numstat=True) |
280 | 315 | return Stats._list_from_string(self.repo, text) |
281 | 316 | |
282 | @classmethod | |
283 | def _iter_from_process_or_stream(cls, repo, proc_or_stream): | |
317 | @ classmethod | |
318 | def _iter_from_process_or_stream(cls, repo: 'Repo', proc_or_stream: Union[Popen, IO]) -> Iterator['Commit']: | |
284 | 319 | """Parse out commit information into a list of Commit objects |
285 | 320 | We expect one line per commit, and parse the actual commit information directly
286 | 321 | from our lightning fast object database
287 | 322 | |
288 | 323 | :param proc: git-rev-list process instance - one sha per line |
289 | 324 | :return: iterator returning Commit objects""" |
290 | stream = proc_or_stream | |
291 | if not hasattr(stream, 'readline'): | |
292 | stream = proc_or_stream.stdout | |
325 | ||
326 | # def is_proc(inp) -> TypeGuard[Popen]: | |
327 | # return hasattr(proc_or_stream, 'wait') and not hasattr(proc_or_stream, 'readline') | |
328 | ||
329 | # def is_stream(inp) -> TypeGuard[IO]: | |
330 | # return hasattr(proc_or_stream, 'readline') | |
331 | ||
332 | if hasattr(proc_or_stream, 'wait'): | |
333 | proc_or_stream = cast(Popen, proc_or_stream) | |
334 | if proc_or_stream.stdout is not None: | |
335 | stream = proc_or_stream.stdout | |
336 | elif hasattr(proc_or_stream, 'readline'): | |
337 | proc_or_stream = cast(IO, proc_or_stream) | |
338 | stream = proc_or_stream | |
293 | 339 | |
294 | 340 | readline = stream.readline |
295 | 341 | while True: |
308 | 354 | # TODO: Review this - it seems process handling got a bit out of control |
309 | 355 | # due to many developers trying to fix the open file handles issue |
310 | 356 | if hasattr(proc_or_stream, 'wait'): |
357 | proc_or_stream = cast(Popen, proc_or_stream) | |
311 | 358 | finalize_process(proc_or_stream) |
312 | 359 | |
313 | @classmethod | |
314 | def create_from_tree(cls, repo, tree, message, parent_commits=None, head=False, author=None, committer=None, | |
315 | author_date=None, commit_date=None): | |
360 | @ classmethod | |
361 | def create_from_tree(cls, repo: 'Repo', tree: Union[Tree, str], message: str, | |
362 | parent_commits: Union[None, List['Commit']] = None, head: bool = False, | |
363 | author: Union[None, Actor] = None, committer: Union[None, Actor] = None, | |
364 | author_date: Union[None, str] = None, commit_date: Union[None, str] = None) -> 'Commit': | |
316 | 365 | """Commit the given tree, creating a commit object. |
317 | 366 | |
318 | 367 | :param repo: Repo object the commit should be part of |
319 | 368 | :param tree: Tree object or hex or bin sha |
320 | 369 | the tree of the new commit |
321 | 370 | :param message: Commit message. It may be an empty string if no message is provided. |
322 | It will be converted to a string in any case. | |
371 | It will be converted to a string, in any case. | 
323 | 372 | :param parent_commits: |
324 | 373 | Optional Commit objects to use as parents for the new commit. |
325 | 374 | If empty list, the commit will have no parents at all and become |
353 | 402 | else: |
354 | 403 | for p in parent_commits: |
355 | 404 | if not isinstance(p, cls): |
356 | raise ValueError("Parent commit '%r' must be of type %s" % (p, cls)) | |
405 | raise ValueError(f"Parent commit '{p!r}' must be of type {cls}") | |
357 | 406 | # end check parent commit types |
358 | 407 | # END if parent commits are unset |
359 | 408 | |
429 | 478 | |
430 | 479 | #{ Serializable Implementation |
431 | 480 | |
432 | def _serialize(self, stream): | |
481 | def _serialize(self, stream: BytesIO) -> 'Commit': | |
433 | 482 | write = stream.write |
434 | 483 | write(("tree %s\n" % self.tree).encode('ascii')) |
435 | 484 | for p in self.parents: |
453 | 502 | write(("encoding %s\n" % self.encoding).encode('ascii')) |
454 | 503 | |
455 | 504 | try: |
456 | if self.__getattribute__('gpgsig') is not None: | |
505 | if self.__getattribute__('gpgsig'): | |
457 | 506 | write(b"gpgsig") |
458 | 507 | for sigline in self.gpgsig.rstrip("\n").split("\n"): |
459 | 508 | write((" " + sigline + "\n").encode('ascii')) |
470 | 519 | # END handle encoding |
471 | 520 | return self |
472 | 521 | |
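The `gpgsig` block written by `_serialize` above emits the keyword once and then prefixes every signature line with a single space, which is how git continues multi-line headers. A small sketch of just that loop (`serialize_gpgsig` is a hypothetical helper extracted for illustration, not GitPython API):

```python
def serialize_gpgsig(gpgsig: str) -> bytes:
    # Matches the loop in _serialize: write "gpgsig", then each signature
    # line indented with a single space (git's header-continuation syntax).
    out = b"gpgsig"
    for sigline in gpgsig.rstrip("\n").split("\n"):
        out += (" " + sigline + "\n").encode("ascii")
    return out


sig = "-----BEGIN PGP SIGNATURE-----\nabc\n-----END PGP SIGNATURE-----\n"
assert serialize_gpgsig(sig) == (
    b"gpgsig -----BEGIN PGP SIGNATURE-----\n abc\n -----END PGP SIGNATURE-----\n"
)
```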
473 | def _deserialize(self, stream): | |
474 | """:param from_rev_list: if true, the stream format is coming from the rev-list command | |
475 | Otherwise it is assumed to be a plain data stream from our object""" | |
522 | def _deserialize(self, stream: BytesIO) -> 'Commit': | |
523 | """ | |
524 | :param from_rev_list: if true, the stream format is coming from the rev-list command | |
525 | Otherwise it is assumed to be a plain data stream from our object | |
526 | """ | |
476 | 527 | readline = stream.readline |
477 | 528 | self.tree = Tree(self.repo, hex_to_bin(readline().split()[1]), Tree.tree_id << 12, '') |
478 | 529 | |
503 | 554 | # now we can have the encoding line, or an empty line followed by the optional |
504 | 555 | # message. |
505 | 556 | self.encoding = self.default_encoding |
506 | self.gpgsig = None | |
557 | self.gpgsig = "" | |
507 | 558 | |
508 | 559 | # read headers |
509 | 560 | enc = next_line |
510 | 561 | buf = enc.strip() |
511 | 562 | while buf: |
512 | 563 | if buf[0:10] == b"encoding ": |
513 | self.encoding = buf[buf.find(' ') + 1:].decode( | |
564 | self.encoding = buf[buf.find(b' ') + 1:].decode( | |
514 | 565 | self.encoding, 'ignore') |
515 | 566 | elif buf[0:7] == b"gpgsig ": |
516 | 567 | sig = buf[buf.find(b' ') + 1:] + b"\n" |
532 | 583 | # decode the authors name |
533 | 584 | |
534 | 585 | try: |
535 | self.author, self.authored_date, self.author_tz_offset = \ | |
586 | (self.author, self.authored_date, self.author_tz_offset) = \ | |
536 | 587 | parse_actor_and_date(author_line.decode(self.encoding, 'replace')) |
537 | 588 | except UnicodeDecodeError: |
538 | 589 | log.error("Failed to decode author line '%s' using encoding %s", author_line, self.encoding, |
552 | 603 | try: |
553 | 604 | self.message = self.message.decode(self.encoding, 'replace') |
554 | 605 | except UnicodeDecodeError: |
555 | log.error("Failed to decode message '%s' using encoding %s", self.message, self.encoding, exc_info=True) | |
606 | log.error("Failed to decode message '%s' using encoding %s", | |
607 | self.message, self.encoding, exc_info=True) | |
556 | 608 | # END exception handling |
557 | 609 | |
558 | 610 | return self |
0 | 0 | """Module with functions which are supposed to be as fast as possible""" |
1 | 1 | from stat import S_ISDIR |
2 | ||
3 | ||
2 | 4 | from git.compat import ( |
3 | 5 | safe_decode, |
4 | 6 | defenc |
5 | 7 | ) |
6 | 8 | |
9 | # typing ---------------------------------------------- | |
10 | ||
11 | from typing import Callable, List, MutableSequence, Sequence, Tuple, TYPE_CHECKING, Union, overload | |
12 | ||
13 | if TYPE_CHECKING: | |
14 | from _typeshed import ReadableBuffer | |
15 | from git import GitCmdObjectDB | |
16 | ||
17 | EntryTup = Tuple[bytes, int, str] # same as TreeCacheTup in tree.py | |
18 | EntryTupOrNone = Union[EntryTup, None] | |
19 | ||
20 | # --------------------------------------------------- | |
21 | ||
22 | ||
7 | 23 | __all__ = ('tree_to_stream', 'tree_entries_from_data', 'traverse_trees_recursive', |
8 | 24 | 'traverse_tree_recursive') |
9 | 25 | |
10 | 26 | |
11 | def tree_to_stream(entries, write): | |
27 | def tree_to_stream(entries: Sequence[EntryTup], write: Callable[['ReadableBuffer'], Union[int, None]]) -> None: | |
12 | 28 | """Write the give list of entries into a stream using its write method |
13 | 29 | :param entries: **sorted** list of tuples with (binsha, mode, name) |
14 | 30 | :param write: write method which takes a data string""" |
32 | 48 | # According to my tests, this is exactly what git does, that is it just |
33 | 49 | # takes the input literally, which appears to be utf8 on linux. |
34 | 50 | if isinstance(name, str): |
35 | name = name.encode(defenc) | |
36 | write(b''.join((mode_str, b' ', name, b'\0', binsha))) | |
51 | name_bytes = name.encode(defenc) | |
52 | else: | |
53 | name_bytes = name # type: ignore[unreachable] # check runtime types - is always str? | |
54 | write(b''.join((mode_str, b' ', name_bytes, b'\0', binsha))) | |
37 | 55 | # END for each item |
38 | 56 | |
39 | 57 | |
40 | def tree_entries_from_data(data): | |
58 | def tree_entries_from_data(data: bytes) -> List[EntryTup]: | |
41 | 59 | """Reads the binary representation of a tree and returns tuples of Tree items |
42 | 60 | :param data: data block with tree data (as bytes) |
43 | 61 | :return: list(tuple(binsha, mode, tree_relative_path), ...)""" |
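`tree_entries_from_data` walks git's raw tree layout, where each entry is `<octal mode> <name>\0<20-byte binsha>`. A simplified sketch of that parse, assuming well-formed input (the real function decodes the mode digit by digit and uses `safe_decode`; `parse_tree` is a hypothetical name for illustration):

```python
def parse_tree(data: bytes):
    # Each entry: b"<octal mode> <name>\x00<20-byte binsha>", concatenated.
    out = []
    i = 0
    while i < len(data):
        sp = data.index(b" ", i)
        mode = int(data[i:sp], 8)          # mode digits are octal ASCII
        nul = data.index(b"\x00", sp)
        name = data[sp + 1:nul].decode("utf-8")
        sha = data[nul + 1:nul + 21]       # binsha is always 20 raw bytes
        out.append((sha, mode, name))
        i = nul + 21
    return out


blob_sha = bytes(range(20))
raw = b"100644 hello.txt\x00" + blob_sha
assert parse_tree(raw) == [(blob_sha, 0o100644, "hello.txt")]
```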
71 | 89 | |
72 | 90 | # default encoding for strings in git is utf8 |
73 | 91 | # Only use the respective unicode object if the byte stream was encoded |
74 | name = data[ns:i] | |
75 | name = safe_decode(name) | |
92 | name_bytes = data[ns:i] | |
93 | name = safe_decode(name_bytes) | |
76 | 94 | |
77 | 95 | # byte is NULL, get next 20 |
78 | 96 | i += 1 |
83 | 101 | return out |
84 | 102 | |
85 | 103 | |
86 | def _find_by_name(tree_data, name, is_dir, start_at): | |
104 | def _find_by_name(tree_data: MutableSequence[EntryTupOrNone], name: str, is_dir: bool, start_at: int | |
105 | ) -> EntryTupOrNone: | |
87 | 106 | """return data entry matching the given name and tree mode |
88 | 107 | or None. |
89 | 108 | Before the item is returned, the respective data item is set |
90 | 109 | None in the tree_data list to mark it done""" |
110 | ||
91 | 111 | try: |
92 | 112 | item = tree_data[start_at] |
93 | 113 | if item and item[2] == name and S_ISDIR(item[1]) == is_dir: |
105 | 125 | return None |
106 | 126 | |
107 | 127 | |
108 | def _to_full_path(item, path_prefix): | |
128 | @overload | 
129 | def _to_full_path(item: None, path_prefix: str) -> None: | |
130 | ... | |
131 | ||
132 | ||
133 | @overload | 
134 | def _to_full_path(item: EntryTup, path_prefix: str) -> EntryTup: | |
135 | ... | |
136 | ||
137 | ||
138 | def _to_full_path(item: EntryTupOrNone, path_prefix: str) -> EntryTupOrNone: | |
109 | 139 | """Rebuild entry with given path prefix""" |
110 | 140 | if not item: |
111 | 141 | return item |
112 | 142 | return (item[0], item[1], path_prefix + item[2]) |
113 | 143 | |
114 | 144 | |
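The two `@overload` declarations added above let a type checker know that passing `None` returns `None` while passing a real entry tuple returns a tuple, even though a single implementation handles both. The same pattern in isolation (names mirror the diff but are redeclared here so the sketch is self-contained):

```python
from typing import Optional, Tuple, overload

EntryTup = Tuple[bytes, int, str]


@overload
def to_full_path(item: None, path_prefix: str) -> None: ...
@overload
def to_full_path(item: EntryTup, path_prefix: str) -> EntryTup: ...


def to_full_path(item: Optional[EntryTup], path_prefix: str) -> Optional[EntryTup]:
    # Runtime behavior is unchanged; the overloads only sharpen the
    # static types: None in means None out.
    if not item:
        return item
    return (item[0], item[1], path_prefix + item[2])


assert to_full_path(None, "a/") is None
assert to_full_path((b"\x00" * 20, 0o100644, "f"), "a/") == (b"\x00" * 20, 0o100644, "a/f")
```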
115 | def traverse_trees_recursive(odb, tree_shas, path_prefix): | |
145 | def traverse_trees_recursive(odb: 'GitCmdObjectDB', tree_shas: Sequence[Union[bytes, None]], | |
146 | path_prefix: str) -> List[Tuple[EntryTupOrNone, ...]]: | |
116 | 147 | """ |
117 | :return: list with entries according to the given binary tree-shas. | |
148 | :return: list of list with entries according to the given binary tree-shas. | |
118 | 149 | The result is encoded in a list |
119 | 150 | of n tuple|None per blob/commit, (n == len(tree_shas)), where |
120 | 151 | * [0] == 20 byte sha |
127 | 158 | :param path_prefix: a prefix to be added to the returned paths on this level, |
128 | 159 | set it '' for the first iteration |
129 | 160 | :note: The ordering of the returned items will be partially lost""" |
130 | trees_data = [] | |
161 | trees_data: List[List[EntryTupOrNone]] = [] | |
162 | ||
131 | 163 | nt = len(tree_shas) |
132 | 164 | for tree_sha in tree_shas: |
133 | 165 | if tree_sha is None: |
134 | data = [] | |
166 | data: List[EntryTupOrNone] = [] | |
135 | 167 | else: |
136 | data = tree_entries_from_data(odb.stream(tree_sha).read()) | |
168 | # make new list for typing as list invariant | |
169 | data = list(tree_entries_from_data(odb.stream(tree_sha).read())) | |
137 | 170 | # END handle muted trees |
138 | 171 | trees_data.append(data) |
139 | 172 | # END for each sha to get data for |
140 | 173 | |
141 | out = [] | |
142 | out_append = out.append | |
174 | out: List[Tuple[EntryTupOrNone, ...]] = [] | |
143 | 175 | |
144 | 176 | # find all matching entries and recursively process them together if the match |
145 | 177 | # is a tree. If the match is a non-tree item, put it into the result. |
146 | 178 | # Processed items will be set None |
147 | 179 | for ti, tree_data in enumerate(trees_data): |
180 | ||
148 | 181 | for ii, item in enumerate(tree_data): |
149 | 182 | if not item: |
150 | 183 | continue |
151 | 184 | # END skip already done items |
185 | entries: List[EntryTupOrNone] | |
152 | 186 | entries = [None for _ in range(nt)] |
153 | 187 | entries[ti] = item |
154 | 188 | _sha, mode, name = item |
160 | 194 | for tio in range(ti + 1, ti + nt): |
161 | 195 | tio = tio % nt |
162 | 196 | entries[tio] = _find_by_name(trees_data[tio], name, is_dir, ii) |
197 | ||
163 | 198 | # END for each other item data |
164 | ||
165 | 199 | # if we are a directory, enter recursion |
166 | 200 | if is_dir: |
167 | 201 | out.extend(traverse_trees_recursive( |
168 | 202 | odb, [((ei and ei[0]) or None) for ei in entries], path_prefix + name + '/')) |
169 | 203 | else: |
170 | out_append(tuple(_to_full_path(e, path_prefix) for e in entries)) | |
204 | out.append(tuple(_to_full_path(e, path_prefix) for e in entries)) | |
205 | ||
171 | 206 | # END handle recursion |
172 | ||
173 | 207 | # finally mark it done |
174 | 208 | tree_data[ii] = None |
175 | 209 | # END for each item |
180 | 214 | return out |
181 | 215 | |
182 | 216 | |
183 | def traverse_tree_recursive(odb, tree_sha, path_prefix): | |
217 | def traverse_tree_recursive(odb: 'GitCmdObjectDB', tree_sha: bytes, path_prefix: str) -> List[EntryTup]: | |
184 | 218 | """ |
185 | 219 | :return: list of entries of the tree pointed to by the binary tree_sha. An entry |
186 | 220 | has the following format: |
2 | 2 | import logging |
3 | 3 | import os |
4 | 4 | import stat |
5 | ||
5 | 6 | from unittest import SkipTest |
6 | 7 | import uuid |
7 | 8 | |
23 | 24 | BadName |
24 | 25 | ) |
25 | 26 | from git.objects.base import IndexObject, Object |
26 | from git.objects.util import Traversable | |
27 | from git.objects.util import TraversableIterableObj | |
28 | ||
27 | 29 | from git.util import ( |
28 | Iterable, | |
29 | 30 | join_path_native, |
30 | 31 | to_native_path_linux, |
31 | 32 | RemoteProgress, |
32 | 33 | rmtree, |
33 | unbare_repo | |
34 | unbare_repo, | |
35 | IterableList | |
34 | 36 | ) |
35 | 37 | from git.util import HIDE_WINDOWS_KNOWN_ERRORS |
36 | 38 | |
45 | 47 | ) |
46 | 48 | |
47 | 49 | |
50 | # typing ---------------------------------------------------------------------- | |
51 | from typing import Callable, Dict, Mapping, Sequence, TYPE_CHECKING, cast | |
52 | from typing import Any, Iterator, Union | |
53 | ||
54 | from git.types import Commit_ish, Literal, PathLike, TBD | |
55 | ||
56 | if TYPE_CHECKING: | |
57 | from git.index import IndexFile | |
58 | from git.repo import Repo | |
59 | from git.refs import Head | |
60 | ||
61 | ||
62 | # ----------------------------------------------------------------------------- | |
63 | ||
48 | 64 | __all__ = ["Submodule", "UpdateProgress"] |
49 | 65 | |
50 | 66 | |
57 | 73 | """Class providing detailed progress information to the caller who should |
58 | 74 | derive from it and implement the ``update(...)`` message""" |
59 | 75 | CLONE, FETCH, UPDWKTREE = [1 << x for x in range(RemoteProgress._num_op_codes, RemoteProgress._num_op_codes + 3)] |
60 | _num_op_codes = RemoteProgress._num_op_codes + 3 | |
76 | _num_op_codes: int = RemoteProgress._num_op_codes + 3 | |
61 | 77 | |
62 | 78 | __slots__ = () |
63 | 79 | |
72 | 88 | # IndexObject comes via the util module; it's a 'hacky' fix thanks to python's import
73 | 89 | # mechanism, which causes plenty of trouble if the only reason for packages and
74 | 90 | # modules is refactoring - subpackages shouldn't depend on parent packages
75 | class Submodule(IndexObject, Iterable, Traversable): | |
91 | class Submodule(IndexObject, TraversableIterableObj): | |
76 | 92 | |
77 | 93 | """Implements access to a git submodule. They are special in that their sha |
78 | 94 | represents a commit in the submodule's repository which is to be checked out |
89 | 105 | k_default_mode = stat.S_IFDIR | stat.S_IFLNK # submodules are directories with link-status |
90 | 106 | |
91 | 107 | # this is a bogus type for base class compatibility |
92 | type = 'submodule' | |
108 | type: Literal['submodule'] = 'submodule' # type: ignore | |
93 | 109 | |
94 | 110 | __slots__ = ('_parent_commit', '_url', '_branch_path', '_name', '__weakref__') |
95 | 111 | _cache_attrs = ('path', '_url', '_branch_path') |
96 | 112 | |
97 | def __init__(self, repo, binsha, mode=None, path=None, name=None, parent_commit=None, url=None, branch_path=None): | |
113 | def __init__(self, repo: 'Repo', binsha: bytes, | |
114 | mode: Union[int, None] = None, | |
115 | path: Union[PathLike, None] = None, | |
116 | name: Union[str, None] = None, | |
117 | parent_commit: Union[Commit_ish, None] = None, | |
118 | url: Union[str, None] = None, | |
119 | branch_path: Union[PathLike, None] = None | |
120 | ) -> None: | |
98 | 121 | """Initialize this instance with its attributes. We only document the ones |
99 | 122 | that differ from ``IndexObject`` |
100 | 123 | |
109 | 132 | if url is not None: |
110 | 133 | self._url = url |
111 | 134 | if branch_path is not None: |
112 | assert isinstance(branch_path, str) | |
135 | # assert isinstance(branch_path, str) | |
113 | 136 | self._branch_path = branch_path |
114 | 137 | if name is not None: |
115 | 138 | self._name = name |
116 | 139 | |
117 | def _set_cache_(self, attr): | |
140 | def _set_cache_(self, attr: str) -> None: | |
118 | 141 | if attr in ('path', '_url', '_branch_path'): |
119 | reader = self.config_reader() | |
142 | reader: SectionConstraint = self.config_reader() | |
120 | 143 | # default submodule values |
121 | 144 | try: |
122 | 145 | self.path = reader.get('path') |
123 | 146 | except cp.NoSectionError as e: |
124 | raise ValueError("This submodule instance does not exist anymore in '%s' file" | |
125 | % osp.join(self.repo.working_tree_dir, '.gitmodules')) from e | |
147 | if self.repo.working_tree_dir is not None: | |
148 | raise ValueError("This submodule instance does not exist anymore in '%s' file" | |
149 | % osp.join(self.repo.working_tree_dir, '.gitmodules')) from e | |
126 | 150 | # end |
127 | 151 | self._url = reader.get('url') |
128 | 152 | # git-python extension values - optional |
133 | 157 | super(Submodule, self)._set_cache_(attr) |
134 | 158 | # END handle attribute name |
135 | 159 | |
136 | def _get_intermediate_items(self, item): | |
160 | @classmethod | |
161 | def _get_intermediate_items(cls, item: 'Submodule') -> IterableList['Submodule']: | |
137 | 162 | """:return: all the submodules of our module repository""" |
138 | 163 | try: |
139 | return type(self).list_items(item.module()) | |
164 | return cls.list_items(item.module()) | |
140 | 165 | except InvalidGitRepositoryError: |
141 | return [] | |
166 | return IterableList('') | |
142 | 167 | # END handle intermediate items |
143 | 168 | |
144 | 169 | @classmethod |
145 | def _need_gitfile_submodules(cls, git): | |
170 | def _need_gitfile_submodules(cls, git: Git) -> bool: | |
146 | 171 | return git.version_info[:3] >= (1, 7, 5) |
147 | 172 | |
148 | def __eq__(self, other): | |
173 | def __eq__(self, other: Any) -> bool: | |
149 | 174 | """Compare with another submodule""" |
150 | 175 | # we may only compare by name as this should be the ID they are hashed with |
151 | 176 | # Otherwise this type wouldn't be hashable |
152 | 177 | # return self.path == other.path and self.url == other.url and super(Submodule, self).__eq__(other) |
153 | 178 | return self._name == other._name |
154 | 179 | |
155 | def __ne__(self, other): | |
180 | def __ne__(self, other: object) -> bool: | |
156 | 181 | """Compare with another submodule for inequality""" |
157 | 182 | return not (self == other) |
158 | 183 | |
159 | def __hash__(self): | |
184 | def __hash__(self) -> int: | |
160 | 185 | """Hash this instance using its logical id, not the sha""" |
161 | 186 | return hash(self._name) |
162 | 187 | |
163 | def __str__(self): | |
188 | def __str__(self) -> str: | |
164 | 189 | return self._name |
165 | 190 | |
166 | def __repr__(self): | |
191 | def __repr__(self) -> str: | |
167 | 192 | return "git.%s(name=%s, path=%s, url=%s, branch_path=%s)"\ |
168 | 193 | % (type(self).__name__, self._name, self.path, self.url, self.branch_path) |
169 | 194 | |
170 | 195 | @classmethod |
171 | def _config_parser(cls, repo, parent_commit, read_only): | |
196 | def _config_parser(cls, repo: 'Repo', | |
197 | parent_commit: Union[Commit_ish, None], | |
198 | read_only: bool) -> SubmoduleConfigParser: | |
172 | 199 | """:return: Config Parser constrained to our submodule in read or write mode |
173 | 200 | :raise IOError: If the .gitmodules file cannot be found, either locally or in the repository |
174 | 201 | at the given parent commit. Otherwise the exception would be delayed until the first |
181 | 208 | # We are most likely in an empty repository, so the HEAD doesn't point to a valid ref |
182 | 209 | pass |
183 | 210 | # end handle parent_commit |
184 | ||
185 | if not repo.bare and parent_matches_head: | |
211 | fp_module: Union[str, BytesIO] | |
212 | if not repo.bare and parent_matches_head and repo.working_tree_dir: | |
186 | 213 | fp_module = osp.join(repo.working_tree_dir, cls.k_modules_file) |
187 | 214 | else: |
188 | 215 | assert parent_commit is not None, "need valid parent_commit in bare repositories" |
200 | 227 | |
201 | 228 | return SubmoduleConfigParser(fp_module, read_only=read_only) |
202 | 229 | |
203 | def _clear_cache(self): | |
230 | def _clear_cache(self) -> None: | |
204 | 231 | # clear the possibly changed values |
205 | 232 | for name in self._cache_attrs: |
206 | 233 | try: |
211 | 238 | # END for each name to delete |
212 | 239 | |
213 | 240 | @classmethod |
214 | def _sio_modules(cls, parent_commit): | |
241 | def _sio_modules(cls, parent_commit: Commit_ish) -> BytesIO: | |
215 | 242 | """:return: Configuration file as BytesIO - we only access it through the respective blob's data""" |
216 | 243 | sio = BytesIO(parent_commit.tree[cls.k_modules_file].data_stream.read()) |
217 | 244 | sio.name = cls.k_modules_file |
218 | 245 | return sio |
219 | 246 | |
220 | def _config_parser_constrained(self, read_only): | |
247 | def _config_parser_constrained(self, read_only: bool) -> SectionConstraint: | |
221 | 248 | """:return: Config Parser constrained to our submodule in read or write mode""" |
222 | 249 | try: |
223 | pc = self.parent_commit | |
250 | pc: Union['Commit_ish', None] = self.parent_commit | |
224 | 251 | except ValueError: |
225 | 252 | pc = None |
226 | 253 | # end handle empty parent repository |
229 | 256 | return SectionConstraint(parser, sm_section(self.name)) |
230 | 257 | |
231 | 258 | @classmethod |
232 | def _module_abspath(cls, parent_repo, path, name): | |
259 | def _module_abspath(cls, parent_repo: 'Repo', path: PathLike, name: str) -> PathLike: | |
233 | 260 | if cls._need_gitfile_submodules(parent_repo.git): |
234 | 261 | return osp.join(parent_repo.git_dir, 'modules', name) |
235 | return osp.join(parent_repo.working_tree_dir, path) | |
262 | if parent_repo.working_tree_dir: | |
263 | return osp.join(parent_repo.working_tree_dir, path) | |
264 | raise NotADirectoryError() | |
236 | 265 | # end |
237 | 266 | |
238 | 267 | @classmethod |
239 | def _clone_repo(cls, repo, url, path, name, **kwargs): | |
268 | def _clone_repo(cls, repo: 'Repo', url: str, path: PathLike, name: str, **kwargs: Any) -> 'Repo': | |
240 | 269 | """:return: Repo instance of newly cloned repository |
241 | 270 | :param repo: our parent repository |
242 | 271 | :param url: url to clone from |
243 | :param path: repository-relative path to the submodule checkout location | |
272 | :param path: repository-relative path to the submodule checkout location | 
244 | 273 | :param name: canonical name of the submodule
245 | 274 | :param kwargs: additional arguments given to git.clone
246 | 275 | module_abspath = cls._module_abspath(repo, path, name) |
250 | 279 | module_abspath_dir = osp.dirname(module_abspath) |
251 | 280 | if not osp.isdir(module_abspath_dir): |
252 | 281 | os.makedirs(module_abspath_dir) |
253 | module_checkout_path = osp.join(repo.working_tree_dir, path) | |
282 | module_checkout_path = osp.join(str(repo.working_tree_dir), path) | |
254 | 283 | # end |
255 | 284 | |
256 | 285 | clone = git.Repo.clone_from(url, module_checkout_path, **kwargs) |
260 | 289 | return clone |
261 | 290 | |
262 | 291 | @classmethod |
263 | def _to_relative_path(cls, parent_repo, path): | |
264 | """:return: a path guaranteed to be relative to the given parent-repository | |
292 | def _to_relative_path(cls, parent_repo: 'Repo', path: PathLike) -> PathLike: | |
293 | """:return: a path guaranteed to be relative to the given parent repository | 
265 | 294 | :raise ValueError: if path is not contained in the parent repository's working tree""" |
266 | 295 | path = to_native_path_linux(path) |
267 | 296 | if path.endswith('/'): |
268 | 297 | path = path[:-1] |
269 | 298 | # END handle trailing slash |
270 | 299 | |
271 | if osp.isabs(path): | |
300 | if osp.isabs(path) and parent_repo.working_tree_dir: | |
272 | 301 | working_tree_linux = to_native_path_linux(parent_repo.working_tree_dir) |
273 | 302 | if not path.startswith(working_tree_linux): |
274 | 303 | raise ValueError("Submodule checkout path '%s' needs to be within the parents repository at '%s'" |
282 | 311 | return path |
283 | 312 | |
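`_to_relative_path` normalizes a submodule path and, when it is absolute, requires it to live inside the parent repository's working tree before stripping that prefix. A simplified, POSIX-path-only sketch of that check (`to_relative_path` is a hypothetical standalone name; the real method also converts to native linux-style paths first):

```python
def to_relative_path(working_tree_dir: str, path: str) -> str:
    # Strip a trailing slash, then relativize absolute paths against the
    # working tree - rejecting paths that escape it, as the diff does.
    if path.endswith("/"):
        path = path[:-1]
    if path.startswith("/"):
        wt = working_tree_dir.rstrip("/")
        if not path.startswith(wt):
            raise ValueError(
                "Submodule checkout path %r needs to be within %r" % (path, wt))
        path = path[len(wt) + 1:]
    return path


assert to_relative_path("/repo", "/repo/sub/") == "sub"
assert to_relative_path("/repo", "sub") == "sub"
```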
284 | 313 | @classmethod |
285 | def _write_git_file_and_module_config(cls, working_tree_dir, module_abspath): | |
286 | """Writes a .git file containing a (preferably) relative path to the actual git module repository. | |
314 | def _write_git_file_and_module_config(cls, working_tree_dir: PathLike, module_abspath: PathLike) -> None: | |
315 | """Writes a .git file containing a(preferably) relative path to the actual git module repository. | |
287 | 316 | It is an error if the module_abspath cannot be made into a relative path, relative to the working_tree_dir |
288 | 317 | :note: will overwrite existing files ! |
289 | 318 | :note: as we rewrite both the git file as well as the module configuration, we might fail on the configuration |
290 | and will not roll back changes done to the git file. This should be a non-issue, but may easily be fixed | |
319 | and will not roll back changes done to the git file. This should be a non-issue, but may easily be fixed | 
291 | 320 | if it becomes one |
292 | 321 | :param working_tree_dir: directory to write the .git file into |
293 | 322 | :param module_abspath: absolute path to the bare repository |
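The `.git` file described above is git's standard "gitdir pointer": a one-line file in the submodule's working tree pointing at the real repository under the superproject's `.git/modules`. A minimal sketch of producing that content with a relative path, assuming POSIX paths (`gitfile_content` is a hypothetical helper; the real method also writes the module's `core.worktree` setting):

```python
import os


def gitfile_content(working_tree_dir: str, module_abspath: str) -> str:
    # A .git file holds a single "gitdir:" line; a relative path keeps the
    # checkout relocatable, which is why the diff prefers it.
    rel = os.path.relpath(module_abspath, start=working_tree_dir)
    return "gitdir: %s" % rel


content = gitfile_content("/repo/sub", "/repo/.git/modules/sub")
assert content == "gitdir: ../.git/modules/sub"
```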
308 | 337 | #{ Edit Interface |
309 | 338 | |
310 | 339 | @classmethod |
311 | def add(cls, repo, name, path, url=None, branch=None, no_checkout=False, depth=None, env=None): | |
340 | def add(cls, repo: 'Repo', name: str, path: PathLike, url: Union[str, None] = None, | |
341 | branch: Union[str, None] = None, no_checkout: bool = False, depth: Union[int, None] = None, | |
342 | env: Union[Mapping[str, str], None] = None, clone_multi_options: Union[Sequence[TBD], None] = None | |
343 | ) -> 'Submodule': | |
312 | 344 | """Add a new submodule to the given repository. This will alter the index |
313 | 345 | as well as the .gitmodules file, but will not create a new commit. |
314 | 346 | If the submodule already exists, no matter if the configuration differs |
341 | 373 | and is defined in `os.environ`, value from `os.environ` will be used. |
342 | 374 | If you want to unset some variable, consider providing empty string |
343 | 375 | as its value. |
376 | :param clone_multi_options: A list of Clone options. Please see ``git.repo.base.Repo.clone`` | |
377 | for details. | |
344 | 378 | :return: The newly created submodule instance |
345 | 379 | :note: works atomically, such that no change will be done if the repository |
346 | 380 | update fails for instance""" |
381 | ||
347 | 382 | if repo.bare: |
348 | 383 | raise InvalidGitRepositoryError("Cannot add submodules to bare repositories") |
349 | 384 | # END handle bare repos |
361 | 396 | if sm.exists(): |
362 | 397 | # reretrieve submodule from tree |
363 | 398 | try: |
364 | sm = repo.head.commit.tree[path] | |
399 | sm = repo.head.commit.tree[str(path)] | |
365 | 400 | sm._name = name |
366 | 401 | return sm |
367 | 402 | except KeyError: |
384 | 419 | # END check url |
385 | 420 | # END verify urls match |
386 | 421 | |
387 | mrepo = None | |
422 | mrepo: Union[Repo, None] = None | |
423 | ||
388 | 424 | if url is None: |
389 | 425 | if not has_module: |
390 | raise ValueError("A URL was not given and existing repository did not exsit at %s" % path) | |
426 | raise ValueError("A URL was not given and a repository did not exist at %s" % path) | |
391 | 427 | # END check url |
392 | 428 | mrepo = sm.module() |
429 | # assert isinstance(mrepo, git.Repo) | |
393 | 430 | urls = [r.url for r in mrepo.remotes] |
394 | 431 | if not urls: |
395 | 432 | raise ValueError("Didn't find any remote url in repository at %s" % sm.abspath) |
397 | 434 | url = urls[0] |
398 | 435 | else: |
399 | 436 | # clone new repo |
400 | kwargs = {'n': no_checkout} | |
437 | kwargs: Dict[str, Union[bool, int, str, Sequence[TBD]]] = {'n': no_checkout} | |
401 | 438 | if not branch_is_default: |
402 | 439 | kwargs['b'] = br.name |
403 | 440 | # END setup checkout-branch |
407 | 444 | kwargs['depth'] = depth |
408 | 445 | else: |
409 | 446 | raise ValueError("depth should be an integer") |
447 | if clone_multi_options: | |
448 | kwargs['multi_options'] = clone_multi_options | |
410 | 449 | |
411 | 450 | # _clone_repo(cls, repo, url, path, name, **kwargs): |
412 | 451 | mrepo = cls._clone_repo(repo, url, path, name, env=env, **kwargs) |
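The hunk above widens the `kwargs` dict that `Submodule.add` forwards to the clone call, adding `multi_options` for the new `clone_multi_options` parameter. A self-contained sketch of that assembly logic, assuming the same short-option names the diff uses (`clone_kwargs` is a hypothetical name, not GitPython API):

```python
from typing import Any, Dict, Optional, Sequence


def clone_kwargs(no_checkout: bool, branch: Optional[str] = None,
                 depth: Optional[int] = None,
                 multi_options: Optional[Sequence[str]] = None) -> Dict[str, Any]:
    # Mirrors how the diff builds the options forwarded to git clone:
    # -n for no-checkout, -b for a non-default branch, --depth, and the
    # newly added pass-through list of extra clone options.
    kwargs: Dict[str, Any] = {"n": no_checkout}
    if branch is not None:
        kwargs["b"] = branch
    if depth is not None:
        if not isinstance(depth, int):
            raise ValueError("depth should be an integer")
        kwargs["depth"] = depth
    if multi_options:
        kwargs["multi_options"] = multi_options
    return kwargs


assert clone_kwargs(True, depth=1, multi_options=["--single-branch"]) == {
    "n": True, "depth": 1, "multi_options": ["--single-branch"]}
```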
419 | 458 | # otherwise there is a '-' character in front of the submodule listing |
420 | 459 | # a38efa84daef914e4de58d1905a500d8d14aaf45 mymodule (v0.9.0-1-ga38efa8) |
421 | 460 | # -a38efa84daef914e4de58d1905a500d8d14aaf45 submodules/intermediate/one |
461 | writer: Union[GitConfigParser, SectionConstraint] | |
462 | ||
422 | 463 | with sm.repo.config_writer() as writer: |
423 | 464 | writer.set_value(sm_section(name), 'url', url) |
424 | 465 | |
435 | 476 | sm._branch_path = br.path |
436 | 477 | |
437 | 478 | # we deliberately assume that our head matches our index ! |
438 | sm.binsha = mrepo.head.commit.binsha | |
479 | if mrepo: | |
480 | sm.binsha = mrepo.head.commit.binsha | |
439 | 481 | index.add([sm], write=True) |
440 | 482 | |
441 | 483 | return sm |
442 | 484 | |
443 | def update(self, recursive=False, init=True, to_latest_revision=False, progress=None, dry_run=False, | |
444 | force=False, keep_going=False, env=None): | |
485 | def update(self, recursive: bool = False, init: bool = True, to_latest_revision: bool = False, | |
486 | progress: Union['UpdateProgress', None] = None, dry_run: bool = False, | |
487 | force: bool = False, keep_going: bool = False, env: Union[Mapping[str, str], None] = None, | |
488 | clone_multi_options: Union[Sequence[TBD], None] = None) -> 'Submodule': | |
445 | 489 | """Update the repository of this submodule to point to the checkout |
446 | 490 | we point at with the binsha of this instance. |
447 | 491 | |
455 | 499 | was specified for this submodule and the branch existed remotely |
456 | 500 | :param progress: UpdateProgress instance or None if no progress should be shown |
457 | 501 | :param dry_run: if True, the operation will only be simulated, but not performed. |
458 | All performed operations are read-only | |
502 | All performed operations are read-only | 
459 | 503 | :param force: |
460 | 504 | If True, we may reset heads even if the repository in question is dirty. Additionally we will be allowed
461 | 505 | to set a tracking branch which is ahead of its remote branch back into the past or the location of the |
463 | 507 | If False, local tracking branches that are in the future of their respective remote branches will simply |
464 | 508 | not be moved. |
465 | 509 | :param keep_going: if True, we will ignore but log all errors, and keep going recursively. |
466 | Unless dry_run is set as well, keep_going could cause subsequent/inherited errors you wouldn't see | |
510 | Unless dry_run is set as well, keep_going could cause subsequent/inherited errors you wouldn't see | 
467 | 511 | otherwise. |
468 | 512 | In conjunction with dry_run, it can be useful to anticipate all errors when updating submodules |
469 | 513 | :param env: Optional dictionary containing the desired environment variables. |
472 | 516 | and is defined in `os.environ`, value from `os.environ` will be used. |
473 | 517 | If you want to unset some variable, consider providing empty string |
474 | 518 | as its value. |
519 | :param clone_multi_options: list of Clone options. Please see ``git.repo.base.Repo.clone`` | |
520 | for details. Only takes effect with the `init` option. | 
475 | 521 | :note: does nothing in bare repositories |
476 | 522 | :note: method is definitely not atomic if recursive is True
477 | 523 | :return: self""" |
518 | 564 | progress.update(op, i, len_rmts, prefix + "Done fetching remote of submodule %r" % self.name) |
519 | 565 | # END fetch new data |
520 | 566 | except InvalidGitRepositoryError: |
567 | mrepo = None | |
521 | 568 | if not init: |
522 | 569 | return self |
523 | 570 | # END early abort if init is not allowed |
538 | 585 | progress.update(BEGIN | CLONE, 0, 1, prefix + "Cloning url '%s' to '%s' in submodule %r" % |
539 | 586 | (self.url, checkout_module_abspath, self.name)) |
540 | 587 | if not dry_run: |
541 | mrepo = self._clone_repo(self.repo, self.url, self.path, self.name, n=True, env=env) | |
588 | mrepo = self._clone_repo(self.repo, self.url, self.path, self.name, n=True, env=env, | |
589 | multi_options=clone_multi_options) | |
542 | 590 | # END handle dry-run |
543 | 591 | progress.update(END | CLONE, 0, 1, prefix + "Done cloning to %s" % checkout_module_abspath) |
544 | 592 | |
545 | 593 | if not dry_run: |
546 | 594 | # see whether we have a valid branch to checkout |
547 | 595 | try: |
596 | mrepo = cast('Repo', mrepo) | |
548 | 597 | # find a remote which has our branch - we try to be flexible |
549 | 598 | remote_branch = find_first_remote_branch(mrepo.remotes, self.branch_name) |
550 | 599 | local_branch = mkhead(mrepo, self.branch_path) |
556 | 605 | |
557 | 606 | # make sure HEAD is not detached |
558 | 607 | mrepo.head.set_reference(local_branch, logmsg="submodule: attaching head to %s" % local_branch) |
559 | mrepo.head.ref.set_tracking_branch(remote_branch) | |
608 | mrepo.head.reference.set_tracking_branch(remote_branch) | |
560 | 609 | except (IndexError, InvalidGitRepositoryError): |
561 | 610 | log.warning("Failed to checkout tracking branch %s", self.branch_path) |
562 | 611 | # END handle tracking branch |
582 | 631 | if mrepo is not None and to_latest_revision: |
583 | 632 | msg_base = "Cannot update to latest revision in repository at %r as " % mrepo.working_dir |
584 | 633 | if not is_detached: |
585 | rref = mrepo.head.ref.tracking_branch() | |
634 | rref = mrepo.head.reference.tracking_branch() | |
586 | 635 | if rref is not None: |
587 | 636 | rcommit = rref.commit |
588 | 637 | binsha = rcommit.binsha |
589 | 638 | hexsha = rcommit.hexsha |
590 | 639 | else: |
591 | log.error("%s a tracking branch was not set for local branch '%s'", msg_base, mrepo.head.ref) | |
640 | log.error("%s a tracking branch was not set for local branch '%s'", | |
641 | msg_base, mrepo.head.reference) | |
592 | 642 | # END handle remote ref |
593 | 643 | else: |
594 | 644 | log.error("%s there was no local tracking branch", msg_base) |
605 | 655 | may_reset = True |
606 | 656 | if mrepo.head.commit.binsha != self.NULL_BIN_SHA: |
607 | 657 | base_commit = mrepo.merge_base(mrepo.head.commit, hexsha) |
608 | if len(base_commit) == 0 or base_commit[0].hexsha == hexsha: | |
658 | if len(base_commit) == 0 or (base_commit[0] is not None and base_commit[0].hexsha == hexsha): | |
609 | 659 | if force: |
610 | 660 | msg = "Will force checkout or reset on local branch that is possibly in the future of "
611 | 661 | msg += "the commit it will be checked out to, effectively 'forgetting' new commits" |
663 | 713 | return self |
664 | 714 | |
665 | 715 | @unbare_repo |
666 | def move(self, module_path, configuration=True, module=True): | |
716 | def move(self, module_path: PathLike, configuration: bool = True, module: bool = True) -> 'Submodule': | |
667 | 717 | """Move the submodule to another module path. This involves physically moving
668 | 718 | the repository at our current path, changing the configuration, as well as |
669 | 719 | adjusting our index entry accordingly. |
670 | 720 | |
671 | 721 | :param module_path: the path to which to move our module in the parent repository's working tree,
672 | given as repository-relative or absolute path. Intermediate directories will be created | |
722 | given as repository - relative or absolute path. Intermediate directories will be created | |
673 | 723 | accordingly. If the path already exists, it must be empty. |
674 | Trailing (back)slashes are removed automatically | |
724 | Trailing(back)slashes are removed automatically | |
675 | 725 | :param configuration: if True, the configuration will be adjusted to let |
676 | 726 | the submodule point to the given path. |
677 | 727 | :param module: if True, the repository managed by this submodule |
680 | 730 | :return: self |
681 | 731 | :raise ValueError: if the module path existed and was not empty, or was a file |
682 | 732 | :note: Currently the method is not atomic, and it could leave the repository |
683 | in an inconsistent state if a sub-step fails for some reason | |
733 | in an inconsistent state if a sub - step fails for some reason | |
684 | 734 | """ |
685 | 735 | if module + configuration < 1: |
686 | 736 | raise ValueError("You must specify to move at least the module or the configuration of the submodule") |
693 | 743 | return self |
694 | 744 | # END handle no change |
695 | 745 | |
696 | module_checkout_abspath = join_path_native(self.repo.working_tree_dir, module_checkout_path) | |
746 | module_checkout_abspath = join_path_native(str(self.repo.working_tree_dir), module_checkout_path) | |
697 | 747 | if osp.isfile(module_checkout_abspath): |
698 | 748 | raise ValueError("Cannot move repository onto a file: %s" % module_checkout_abspath) |
699 | 749 | # END handle target files |
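Several hunks in this diff wrap path attributes such as `working_tree_dir` in `str(...)` because those attributes are now typed as `PathLike`. A minimal sketch of the same normalization using `os.fspath` (the function name here is borrowed from the diff, but the body is an assumption, not GitPython's real `join_path_native`):

```python
import os
import os.path as osp
from pathlib import Path

def join_path_native(base: "os.PathLike[str] | str", rel: str) -> str:
    # Accept either str or Path, normalizing via os.fspath before
    # joining, much as the str(...) casts in the diff do.
    return osp.join(os.fspath(base), rel)

print(join_path_native(Path("repo"), "sub/module"))
```

`os.fspath` raises `TypeError` for anything that is neither a string nor a `PathLike`, so the cast also acts as a cheap type check.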
772 | 822 | return self |
773 | 823 | |
774 | 824 | @unbare_repo |
775 | def remove(self, module=True, force=False, configuration=True, dry_run=False): | |
825 | def remove(self, module: bool = True, force: bool = False, | |
826 | configuration: bool = True, dry_run: bool = False) -> 'Submodule': | |
776 | 827 | """Remove this submodule from the repository. This will remove our entry |
777 | from the .gitmodules file and the entry in the .git/config file. | |
828 | from the .gitmodules file and the entry in the .git / config file. | |
778 | 829 | |
779 | 830 | :param module: If True, the module checkout we point to will be deleted |
780 | 831 | as well. If the module is currently on a commit which is not part |
781 | 832 | of any branch in the remote, if the currently checked out branch
782 | is ahead of its tracking branch, if you have modifications in the | |
833 | is ahead of its tracking branch, if you have modifications in the
783 | 834 | working tree, or untracked files,
784 | 835 | In case the removal of the repository fails for these reasons, the |
785 | 836 | submodule status will not have been altered. |
786 | If this submodule has child-modules on its own, these will be deleted | |
837 | If this submodule has child - modules on its own, these will be deleted | |
787 | 838 | prior to touching the own module. |
788 | 839 | :param force: Enforces the deletion of the module even though it contains |
789 | modifications. This basically enforces a brute-force file system based | |
840 | modifications. This basically enforces a brute - force file system based | |
790 | 841 | deletion. |
791 | 842 | :param configuration: if True, the submodule is deleted from the configuration, |
792 | 843 | otherwise it isn't. Although this should be enabled most of the times, |
826 | 877 | # TODO: If we run into permission problems, we have a highly inconsistent |
827 | 878 | # state. Delete the .git folders last, start with the submodules first |
828 | 879 | mp = self.abspath |
829 | method = None | |
880 | method: Union[None, Callable[[PathLike], None]] = None | |
830 | 881 | if osp.islink(mp): |
831 | 882 | method = os.remove |
832 | 883 | elif osp.isdir(mp): |
879 | 930 | import gc |
880 | 931 | gc.collect() |
881 | 932 | try: |
882 | rmtree(wtd) | |
933 | rmtree(str(wtd)) | |
883 | 934 | except Exception as ex: |
884 | 935 | if HIDE_WINDOWS_KNOWN_ERRORS: |
885 | 936 | raise SkipTest("FIXME: fails with: PermissionError\n {}".format(ex)) from ex |
893 | 944 | rmtree(git_dir) |
894 | 945 | except Exception as ex: |
895 | 946 | if HIDE_WINDOWS_KNOWN_ERRORS: |
896 | raise SkipTest("FIXME: fails with: PermissionError\n %s", ex) from ex | |
947 | raise SkipTest(f"FIXME: fails with: PermissionError\n {ex}") from ex | |
897 | 948 | else: |
898 | 949 | raise |
899 | 950 | # end handle separate bare repository |
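The `SkipTest` change above fixes a real formatting bug: the old `raise SkipTest("... %s", ex)` passed the format string and the exception as two separate constructor arguments, so the `%s` was never interpolated. A small demonstration of the difference:

```python
ex = PermissionError("access denied")

# Old pattern: two constructor args, so no interpolation ever happens
# and str() shows the raw argument tuple.
bad = Exception("FIXME: fails with: PermissionError\n %s", ex)

# New pattern: interpolate with an f-string before constructing.
good = Exception(f"FIXME: fails with: PermissionError\n {ex}")

print(str(bad))
print(str(good))
```

The first message still contains a literal `%s`, while the second embeds the underlying error text as intended.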
917 | 968 | |
918 | 969 | # now git config - need the config intact, otherwise we can't query |
919 | 970 | # information anymore |
920 | with self.repo.config_writer() as writer: | |
921 | writer.remove_section(sm_section(self.name)) | |
922 | ||
923 | with self.config_writer() as writer: | |
924 | writer.remove_section() | |
971 | ||
972 | with self.repo.config_writer() as gcp_writer: | |
973 | gcp_writer.remove_section(sm_section(self.name)) | |
974 | ||
975 | with self.config_writer() as sc_writer: | |
976 | sc_writer.remove_section() | |
925 | 977 | # END delete configuration |
926 | 978 | |
927 | 979 | return self |
928 | 980 | |
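The renamed context-manager variables above (`gcp_writer`, `sc_writer`) both end in a section removal. With the standard-library `configparser` the equivalent removal of a `.gitmodules`-style section looks like this (a sketch with stdlib types, not GitPython's own `GitConfigParser`):

```python
import configparser

def sm_section(name: str) -> str:
    # Same section-title helper as in git.objects.submodule.util.
    return f'submodule "{name}"'

cfg = configparser.ConfigParser()
cfg.read_string('[submodule "vendor/lib"]\n'
                'path = vendor/lib\n'
                'url = https://example.com/lib.git\n')

# remove_section returns True if the section existed.
removed = cfg.remove_section(sm_section("vendor/lib"))
print(removed, cfg.sections())
```

Note that the quotes are part of the section name itself, which is why `sm_section` must produce the exact `submodule "<name>"` title.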
929 | def set_parent_commit(self, commit, check=True): | |
981 | def set_parent_commit(self, commit: Union[Commit_ish, None], check: bool = True) -> 'Submodule': | |
930 | 982 | """Set this instance to use the given commit whose tree is supposed to |
931 | 983 | contain the .gitmodules blob. |
932 | 984 | |
965 | 1017 | # If check is False, we might see a parent-commit that doesn't even contain the submodule anymore. |
966 | 1018 | # in that case, mark our sha as being NULL |
967 | 1019 | try: |
968 | self.binsha = pctree[self.path].binsha | |
1020 | self.binsha = pctree[str(self.path)].binsha | |
969 | 1021 | except KeyError: |
970 | 1022 | self.binsha = self.NULL_BIN_SHA |
971 | 1023 | # end |
974 | 1026 | return self |
975 | 1027 | |
976 | 1028 | @unbare_repo |
977 | def config_writer(self, index=None, write=True): | |
1029 | def config_writer(self, index: Union['IndexFile', None] = None, write: bool = True | |
1030 | ) -> SectionConstraint['SubmoduleConfigParser']: | |
978 | 1031 | """:return: a config writer instance allowing you to read and write the data |
979 | 1032 | belonging to this submodule into the .gitmodules file. |
980 | 1033 | |
995 | 1048 | return writer |
996 | 1049 | |
997 | 1050 | @unbare_repo |
998 | def rename(self, new_name): | |
1051 | def rename(self, new_name: str) -> 'Submodule': | |
999 | 1052 | """Rename this submodule |
1000 | 1053 | :note: This method takes care of renaming the submodule in various places, such as |
1001 | 1054 | |
1002 | * $parent_git_dir/config | |
1003 | * $working_tree_dir/.gitmodules | |
1004 | * (git >=v1.8.0: move submodule repository to new name) | |
1055 | * $parent_git_dir / config | |
1056 | * $working_tree_dir / .gitmodules | |
1057 | * (git >= v1.8.0: move submodule repository to new name) | |
1005 | 1058 | |
1006 | 1059 | As .gitmodules will be changed, you would need to make a commit afterwards. The changed .gitmodules file |
1007 | 1060 | will already be added to the index |
1030 | 1083 | destination_module_abspath = self._module_abspath(self.repo, self.path, new_name) |
1031 | 1084 | source_dir = mod.git_dir |
1032 | 1085 | # Let's be sure the submodule name is not so obviously tied to a directory |
1033 | if destination_module_abspath.startswith(mod.git_dir): | |
1086 | if str(destination_module_abspath).startswith(str(mod.git_dir)): | |
1034 | 1087 | tmp_dir = self._module_abspath(self.repo, self.path, str(uuid.uuid4())) |
1035 | 1088 | os.renames(source_dir, tmp_dir) |
1036 | 1089 | source_dir = tmp_dir |
1037 | 1090 | # end handle self-containment |
1038 | 1091 | os.renames(source_dir, destination_module_abspath) |
1039 | self._write_git_file_and_module_config(mod.working_tree_dir, destination_module_abspath) | |
1092 | if mod.working_tree_dir: | |
1093 | self._write_git_file_and_module_config(mod.working_tree_dir, destination_module_abspath) | |
1040 | 1094 | # end move separate git repository |
1041 | 1095 | |
1042 | 1096 | return self |
1046 | 1100 | #{ Query Interface |
1047 | 1101 | |
1048 | 1102 | @unbare_repo |
1049 | def module(self): | |
1103 | def module(self) -> 'Repo': | |
1050 | 1104 | """:return: Repo instance initialized from the repository at our submodule path |
1051 | 1105 | :raise InvalidGitRepositoryError: if a repository was not available. This could |
1052 | 1106 | also mean that it was not yet initialized""" |
1063 | 1117 | raise InvalidGitRepositoryError("Repository at %r was not yet checked out" % module_checkout_abspath) |
1064 | 1118 | # END handle exceptions |
1065 | 1119 | |
1066 | def module_exists(self): | |
1120 | def module_exists(self) -> bool: | |
1067 | 1121 | """:return: True if our module exists and is a valid git repository. See module() method""" |
1068 | 1122 | try: |
1069 | 1123 | self.module() |
1072 | 1126 | return False |
1073 | 1127 | # END handle exception |
1074 | 1128 | |
1075 | def exists(self): | |
1129 | def exists(self) -> bool: | |
1076 | 1130 | """ |
1077 | 1131 | :return: True if the submodule exists, False otherwise. Please note that |
1078 | a submodule may exist (in the .gitmodules file) even though its module | |
1132 | a submodule may exist ( in the .gitmodules file) even though its module | |
1079 | 1133 | doesn't exist on disk""" |
1080 | 1134 | # keep attributes for later, and restore them if we have no valid data |
1081 | 1135 | # this way we do not actually alter the state of the object |
1107 | 1161 | # END handle object state consistency |
1108 | 1162 | |
1109 | 1163 | @property |
1110 | def branch(self): | |
1164 | def branch(self) -> 'Head': | |
1111 | 1165 | """:return: The branch instance that we are to checkout |
1112 | 1166 | :raise InvalidGitRepositoryError: if our module is not yet checked out""" |
1113 | 1167 | return mkhead(self.module(), self._branch_path) |
1114 | 1168 | |
1115 | 1169 | @property |
1116 | def branch_path(self): | |
1170 | def branch_path(self) -> PathLike: | |
1117 | 1171 | """ |
1118 | :return: full (relative) path as string to the branch we would checkout | |
1172 | :return: full(relative) path as string to the branch we would checkout | |
1119 | 1173 | from the remote and track""" |
1120 | 1174 | return self._branch_path |
1121 | 1175 | |
1122 | 1176 | @property |
1123 | def branch_name(self): | |
1177 | def branch_name(self) -> str: | |
1124 | 1178 | """:return: the name of the branch, which is the shortest possible branch name""" |
1125 | 1179 | # use an instance method, for this we create a temporary Head instance |
1126 | 1180 | # which uses a repository that is available at least ( it makes no difference ) |
1127 | 1181 | return git.Head(self.repo, self._branch_path).name |
1128 | 1182 | |
1129 | 1183 | @property |
1130 | def url(self): | |
1131 | """:return: The url to the repository which our module-repository refers to""" | |
1184 | def url(self) -> str: | |
1185 | """:return: The url to the repository which our module - repository refers to""" | |
1132 | 1186 | return self._url |
1133 | 1187 | |
1134 | 1188 | @property |
1135 | def parent_commit(self): | |
1189 | def parent_commit(self) -> 'Commit_ish': | |
1136 | 1190 | """:return: Commit instance with the tree containing the .gitmodules file |
1137 | 1191 | :note: will always point to the current head's commit if it was not set explicitly""" |
1138 | 1192 | if self._parent_commit is None: |
1140 | 1194 | return self._parent_commit |
1141 | 1195 | |
1142 | 1196 | @property |
1143 | def name(self): | |
1197 | def name(self) -> str: | |
1144 | 1198 | """:return: The name of this submodule. It is used to identify it within the |
1145 | 1199 | .gitmodules file. |
1146 | 1200 | :note: by default, the name is the path at which to find the submodule, but |
1147 | in git-python it should be a unique identifier similar to the identifiers | |
1201 | in git - python it should be a unique identifier similar to the identifiers | |
1148 | 1202 | used for remotes, which allows changing the path of the submodule
1149 | 1203 | easily |
1150 | 1204 | """ |
1151 | 1205 | return self._name |
1152 | 1206 | |
1153 | def config_reader(self): | |
1207 | def config_reader(self) -> SectionConstraint[SubmoduleConfigParser]: | |
1154 | 1208 | """ |
1155 | 1209 | :return: ConfigReader instance which allows you to query the configuration values
1156 | 1210 | of this submodule, as provided by the .gitmodules file |
1160 | 1214 | :raise IOError: If the .gitmodules file/blob could not be read""" |
1161 | 1215 | return self._config_parser_constrained(read_only=True) |
1162 | 1216 | |
1163 | def children(self): | |
1217 | def children(self) -> IterableList['Submodule']: | |
1164 | 1218 | """ |
1165 | 1219 | :return: IterableList(Submodule, ...) an iterable list of submodules instances |
1166 | 1220 | which are children of this submodule or 0 if the submodule is not checked out""" |
1171 | 1225 | #{ Iterable Interface |
1172 | 1226 | |
1173 | 1227 | @classmethod |
1174 | def iter_items(cls, repo, parent_commit='HEAD'): | |
1228 | def iter_items(cls, repo: 'Repo', parent_commit: Union[Commit_ish, str] = 'HEAD', *args: Any, **kwargs: Any | 
1229 | ) -> Iterator['Submodule']: | |
1175 | 1230 | """:return: iterator yielding Submodule instances available in the given repository""" |
1176 | 1231 | try: |
1177 | 1232 | pc = repo.commit(parent_commit) # parent commit instance |
1178 | 1233 | parser = cls._config_parser(repo, pc, read_only=True) |
1179 | 1234 | except (IOError, BadName): |
1180 | return | |
1235 | return iter([]) | |
1181 | 1236 | # END handle empty iterator |
1182 | ||
1183 | rt = pc.tree # root tree | |
1184 | 1237 | |
1185 | 1238 | for sms in parser.sections(): |
1186 | 1239 | n = sm_name(sms) |
1194 | 1247 | # get the binsha |
1195 | 1248 | index = repo.index |
1196 | 1249 | try: |
1250 | rt = pc.tree # root tree | |
1197 | 1251 | sm = rt[p] |
1198 | 1252 | except KeyError: |
1199 | 1253 | # try the index, maybe it was just added |
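The `return` → `return iter([])` change in `iter_items` above follows from the new `Iterator` return annotation: once the method is a plain function rather than a generator in the failure path, every exit must return an iterator. A self-contained sketch of the pattern with stand-in types (not the real GitPython signature):

```python
from typing import Iterator, List, Optional

def iter_names(lines: Optional[List[str]]) -> Iterator[str]:
    # On failure, return an empty iterator rather than None,
    # so callers can always loop over the result.
    try:
        parsed = [ln.split("=", 1)[0].strip() for ln in lines]
    except TypeError:  # e.g. lines is None
        return iter([])
    return iter(parsed)

print(list(iter_names(["a = 1", "b = 2"])))
print(list(iter_names(None)))
```

Returning `iter([])` keeps both branches type-consistent, whereas a bare `return` would hand `None` to any caller that iterates the result.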
1 | 1 | Submodule, |
2 | 2 | UpdateProgress |
3 | 3 | ) |
4 | from .util import ( | |
5 | find_first_remote_branch | |
6 | ) | |
4 | from .util import find_first_remote_branch | |
7 | 5 | from git.exc import InvalidGitRepositoryError |
8 | 6 | import git |
9 | 7 | |
10 | 8 | import logging |
9 | ||
10 | # typing ------------------------------------------------------------------- | |
11 | ||
12 | from typing import TYPE_CHECKING, Union | |
13 | ||
14 | from git.types import Commit_ish | |
15 | ||
16 | if TYPE_CHECKING: | |
17 | from git.repo import Repo | |
18 | from git.util import IterableList | |
19 | ||
20 | # ---------------------------------------------------------------------------- | |
11 | 21 | |
12 | 22 | __all__ = ["RootModule", "RootUpdateProgress"] |
13 | 23 | |
41 | 51 | |
42 | 52 | k_root_name = '__ROOT__' |
43 | 53 | |
44 | def __init__(self, repo): | |
54 | def __init__(self, repo: 'Repo'): | |
45 | 55 | # repo, binsha, mode=None, path=None, name = None, parent_commit=None, url=None, ref=None) |
46 | 56 | super(RootModule, self).__init__( |
47 | 57 | repo, |
54 | 64 | branch_path=git.Head.to_full_path(self.k_head_default) |
55 | 65 | ) |
56 | 66 | |
57 | def _clear_cache(self): | |
67 | def _clear_cache(self) -> None: | |
58 | 68 | """May not do anything""" |
59 | 69 | pass |
60 | 70 | |
61 | 71 | #{ Interface |
62 | 72 | |
63 | def update(self, previous_commit=None, recursive=True, force_remove=False, init=True, | |
64 | to_latest_revision=False, progress=None, dry_run=False, force_reset=False, | |
65 | keep_going=False): | |
73 | def update(self, previous_commit: Union[Commit_ish, None] = None, # type: ignore[override] | |
74 | recursive: bool = True, force_remove: bool = False, init: bool = True, | |
75 | to_latest_revision: bool = False, progress: Union[None, 'RootUpdateProgress'] = None, | |
76 | dry_run: bool = False, force_reset: bool = False, keep_going: bool = False | |
77 | ) -> 'RootModule': | |
66 | 78 | """Update the submodules of this repository to the current HEAD commit. |
67 | 79 | This method behaves smartly by determining changes of the path of a submodules |
68 | 80 | repository, next to changes to the to-be-checked-out commit or the branch to be |
127 | 139 | previous_commit = repo.commit(previous_commit) # obtain commit object |
128 | 140 | # END handle previous commit |
129 | 141 | |
130 | psms = self.list_items(repo, parent_commit=previous_commit) | |
131 | sms = self.list_items(repo) | |
142 | psms: 'IterableList[Submodule]' = self.list_items(repo, parent_commit=previous_commit) | |
143 | sms: 'IterableList[Submodule]' = self.list_items(repo) | |
132 | 144 | spsms = set(psms) |
133 | 145 | ssms = set(sms) |
134 | 146 | |
161 | 173 | csms = (spsms & ssms) |
162 | 174 | len_csms = len(csms) |
163 | 175 | for i, csm in enumerate(csms): |
164 | psm = psms[csm.name] | |
165 | sm = sms[csm.name] | |
176 | psm: 'Submodule' = psms[csm.name] | |
177 | sm: 'Submodule' = sms[csm.name] | |
166 | 178 | |
167 | 179 | # PATH CHANGES |
168 | 180 | ############## |
342 | 354 | |
343 | 355 | return self |
344 | 356 | |
345 | def module(self): | |
357 | def module(self) -> 'Repo': | |
346 | 358 | """:return: the actual repository containing the submodules""" |
347 | 359 | return self.repo |
348 | 360 | #} END interface |
2 | 2 | from git.config import GitConfigParser |
3 | 3 | from io import BytesIO |
4 | 4 | import weakref |
5 | ||
6 | ||
7 | # typing ----------------------------------------------------------------------- | |
8 | ||
9 | from typing import Any, Sequence, TYPE_CHECKING, Union | |
10 | ||
11 | from git.types import PathLike | |
12 | ||
13 | if TYPE_CHECKING: | |
14 | from .base import Submodule | |
15 | from weakref import ReferenceType | |
16 | from git.repo import Repo | |
17 | from git.refs import Head | |
18 | from git import Remote | |
19 | from git.refs import RemoteReference | |
20 | ||
5 | 21 | |
6 | 22 | __all__ = ('sm_section', 'sm_name', 'mkhead', 'find_first_remote_branch', |
7 | 23 | 'SubmoduleConfigParser') |
9 | 25 | #{ Utilities |
10 | 26 | |
11 | 27 | |
12 | def sm_section(name): | |
28 | def sm_section(name: str) -> str: | |
13 | 29 | """:return: section title used in .gitmodules configuration file""" |
14 | return 'submodule "%s"' % name | |
30 | return f'submodule "{name}"' | |
15 | 31 | |
16 | 32 | |
17 | def sm_name(section): | |
33 | def sm_name(section: str) -> str: | |
18 | 34 | """:return: name of the submodule as parsed from the section name""" |
19 | 35 | section = section.strip() |
20 | 36 | return section[11:-1] |
21 | 37 | |
22 | 38 | |
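The `sm_section`/`sm_name` pair above round-trips a submodule name through its `.gitmodules` section title; the `[11:-1]` slice works because `'submodule "'` is exactly 11 characters. A self-contained sketch of the pair:

```python
def sm_section(name: str) -> str:
    # Section title as it appears in .gitmodules: submodule "<name>"
    return f'submodule "{name}"'

def sm_name(section: str) -> str:
    # Inverse: drop the 11-character prefix 'submodule "' and the
    # trailing quote.
    return section.strip()[11:-1]

print(sm_name(sm_section("vendor/lib")))
```

The two functions are exact inverses for any name without embedded quotes, which is what lets the parser map section titles back to submodule names.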
23 | def mkhead(repo, path): | |
39 | def mkhead(repo: 'Repo', path: PathLike) -> 'Head': | |
24 | 40 | """:return: New branch/head instance""" |
25 | 41 | return git.Head(repo, git.Head.to_full_path(path)) |
26 | 42 | |
27 | 43 | |
28 | def find_first_remote_branch(remotes, branch_name): | |
44 | def find_first_remote_branch(remotes: Sequence['Remote'], branch_name: str) -> 'RemoteReference': | |
29 | 45 | """Find the remote branch matching the name of the given branch or raise InvalidGitRepositoryError""" |
30 | 46 | for remote in remotes: |
31 | 47 | try: |
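`find_first_remote_branch` walks the remotes in order and returns the first one carrying the requested branch. A runnable sketch of that try-per-remote loop using stub objects (the `FakeRemote` class and the `ValueError` are stand-ins; the real helper operates on `git.Remote` instances and raises `InvalidGitRepositoryError`):

```python
from typing import Dict, Sequence

class FakeRemote:
    # Minimal stand-in for git.Remote: exposes named refs.
    def __init__(self, name: str, refs: Dict[str, str]) -> None:
        self.name = name
        self.refs = refs  # branch name -> ref (a sha string here)

def find_first_remote_branch(remotes: Sequence[FakeRemote], branch_name: str) -> str:
    # Try each remote in turn; a missing branch just moves on.
    for remote in remotes:
        try:
            return remote.refs[branch_name]
        except KeyError:
            continue
    raise ValueError(f"Didn't find remote branch {branch_name!r} in any of the given remotes")

remotes = [FakeRemote("origin", {"main": "abc123"}),
           FakeRemote("upstream", {"dev": "def456"})]
print(find_first_remote_branch(remotes, "dev"))
```

This ordering matters: if two remotes both carry the branch, the first remote in the sequence wins.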
52 | 68 | Please note that no mutating method will work in bare mode |
53 | 69 | """ |
54 | 70 | |
55 | def __init__(self, *args, **kwargs): | |
56 | self._smref = None | |
71 | def __init__(self, *args: Any, **kwargs: Any) -> None: | |
72 | self._smref: Union['ReferenceType[Submodule]', None] = None | |
57 | 73 | self._index = None |
58 | 74 | self._auto_write = True |
59 | 75 | super(SubmoduleConfigParser, self).__init__(*args, **kwargs) |
60 | 76 | |
61 | 77 | #{ Interface |
62 | def set_submodule(self, submodule): | |
78 | def set_submodule(self, submodule: 'Submodule') -> None: | |
63 | 79 | """Set this instance's submodule. It must be called before |
64 | 80 | the first write operation begins""" |
65 | 81 | self._smref = weakref.ref(submodule) |
66 | 82 | |
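The `_smref` attribute typed above holds only a *weak* reference to the submodule, so the parser never keeps the submodule object alive on its own. A small sketch of the pattern with placeholder classes (`Owner`/`Writer` are illustrative names, not GitPython API):

```python
import weakref
from typing import Optional

class Owner:
    pass

class Writer:
    # Holds a weak reference, as SubmoduleConfigParser does with _smref.
    def __init__(self) -> None:
        self._ref: Optional["weakref.ReferenceType[Owner]"] = None

    def set_owner(self, owner: Owner) -> None:
        self._ref = weakref.ref(owner)

    def owner_alive(self) -> bool:
        # Calling the weakref yields the object, or None once collected.
        return self._ref is not None and self._ref() is not None

w = Writer()
o = Owner()
w.set_owner(o)
print(w.owner_alive())
del o
print(w.owner_alive())
```

On CPython the `del` collects the owner immediately, so the second call reports `False`; the writer degrades gracefully instead of leaking the submodule.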
67 | def flush_to_index(self): | |
83 | def flush_to_index(self) -> None: | |
68 | 84 | """Flush changes in our configuration file to the index""" |
69 | 85 | assert self._smref is not None |
70 | 86 | # should always have a file here |
83 | 99 | #} END interface |
84 | 100 | |
85 | 101 | #{ Overridden Methods |
86 | def write(self): | |
87 | rval = super(SubmoduleConfigParser, self).write() | |
102 | def write(self) -> None: # type: ignore[override] | |
103 | rval: None = super(SubmoduleConfigParser, self).write() | |
88 | 104 | self.flush_to_index() |
89 | 105 | return rval |
90 | 106 | # END overridden methods |
8 | 8 | from ..util import hex_to_bin |
9 | 9 | from ..compat import defenc |
10 | 10 | |
11 | from typing import List, TYPE_CHECKING, Union | |
12 | ||
13 | from git.types import Literal | |
14 | ||
15 | if TYPE_CHECKING: | |
16 | from git.repo import Repo | |
17 | from git.util import Actor | |
18 | from .commit import Commit | |
19 | from .blob import Blob | |
20 | from .tree import Tree | |
21 | ||
11 | 22 | __all__ = ("TagObject", ) |
12 | 23 | |
13 | 24 | |
14 | 25 | class TagObject(base.Object): |
15 | 26 | |
16 | 27 | """Non-Lightweight tag carrying additional information about an object we are pointing to.""" |
17 | type = "tag" | |
28 | type: Literal['tag'] = "tag" | |
18 | 29 | __slots__ = ("object", "tag", "tagger", "tagged_date", "tagger_tz_offset", "message") |
19 | 30 | |
20 | def __init__(self, repo, binsha, object=None, tag=None, # @ReservedAssignment | |
21 | tagger=None, tagged_date=None, tagger_tz_offset=None, message=None): | |
31 | def __init__(self, repo: 'Repo', binsha: bytes, | |
32 | object: Union[None, base.Object] = None, | |
33 | tag: Union[None, str] = None, | |
34 | tagger: Union[None, 'Actor'] = None, | |
35 | tagged_date: Union[int, None] = None, | |
36 | tagger_tz_offset: Union[int, None] = None, | |
37 | message: Union[str, None] = None | |
38 | ) -> None: # @ReservedAssignment | |
22 | 39 | """Initialize a tag object with additional data |
23 | 40 | |
24 | 41 | :param repo: repository this object is located in |
33 | 50 | authored_date is in, in a format similar to time.altzone""" |
34 | 51 | super(TagObject, self).__init__(repo, binsha) |
35 | 52 | if object is not None: |
36 | self.object = object | |
53 | self.object: Union['Commit', 'Blob', 'Tree', 'TagObject'] = object | |
37 | 54 | if tag is not None: |
38 | 55 | self.tag = tag |
39 | 56 | if tagger is not None: |
45 | 62 | if message is not None: |
46 | 63 | self.message = message |
47 | 64 | |
48 | def _set_cache_(self, attr): | |
65 | def _set_cache_(self, attr: str) -> None: | |
49 | 66 | """Cache all our attributes at once""" |
50 | 67 | if attr in TagObject.__slots__: |
51 | 68 | ostream = self.repo.odb.stream(self.binsha) |
52 | lines = ostream.read().decode(defenc, 'replace').splitlines() | |
69 | lines: List[str] = ostream.read().decode(defenc, 'replace').splitlines() | |
53 | 70 | |
54 | 71 | _obj, hexsha = lines[0].split(" ") |
55 | 72 | _type_token, type_name = lines[1].split(" ") |
73 | object_type = get_object_type_by_name(type_name.encode('ascii')) | |
56 | 74 | self.object = \ |
57 | get_object_type_by_name(type_name.encode('ascii'))(self.repo, hex_to_bin(hexsha)) | |
75 | object_type(self.repo, hex_to_bin(hexsha)) | |
58 | 76 | |
59 | 77 | self.tag = lines[2][4:] # tag <tag name> |
60 | 78 |
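`_set_cache_` above parses the raw tag object line by line: line 0 is `object <hexsha>`, line 1 is `type <name>`, line 2 is `tag <name>` (hence the `lines[2][4:]` slice). A hedged sketch of that header parsing on a synthetic payload; the helper function is illustrative only:

```python
from typing import Dict

def parse_tag_header(data: str) -> Dict[str, str]:
    # Mirrors the line-oriented parsing in TagObject._set_cache_:
    # line 0 = object sha, line 1 = object type, line 2 = tag name.
    lines = data.splitlines()
    _obj, hexsha = lines[0].split(" ")
    _type_token, type_name = lines[1].split(" ")
    tag_name = lines[2][4:]  # strip the leading 'tag '
    return {"object": hexsha, "type": type_name, "tag": tag_name}

payload = ("object 2ea937d3b8c7d7a4c224b60e1b1f1a1b6a9f1e00\n"
           "type commit\n"
           "tag v1.0\n"
           "tagger A U Thor <a@example.com> 1400000000 +0000\n")
print(parse_tag_header(payload))
```

The extracted `type` name is what `get_object_type_by_name` uses in the diff above to pick the concrete object class before instantiating it from the hexsha.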
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | from git.util import join_path | |
6 | import git.diff as diff | |
5 | ||
6 | from git.util import IterableList, join_path | |
7 | import git.diff as git_diff | |
7 | 8 | from git.util import to_bin_sha |
8 | 9 | |
9 | 10 | from . import util |
10 | from .base import IndexObject | |
11 | from .base import IndexObject, IndexObjUnion | |
11 | 12 | from .blob import Blob |
12 | 13 | from .submodule.base import Submodule |
13 | 14 | |
16 | 17 | tree_to_stream |
17 | 18 | ) |
18 | 19 | |
19 | cmp = lambda a, b: (a > b) - (a < b) | |
20 | ||
21 | # typing ------------------------------------------------- | |
22 | ||
23 | from typing import (Any, Callable, Dict, Iterable, Iterator, List, | |
24 | Tuple, Type, Union, cast, TYPE_CHECKING) | |
25 | ||
26 | from git.types import PathLike, Literal | |
27 | ||
28 | if TYPE_CHECKING: | |
29 | from git.repo import Repo | |
30 | from io import BytesIO | |
31 | ||
32 | TreeCacheTup = Tuple[bytes, int, str] | |
33 | ||
34 | TraversedTreeTup = Union[Tuple[Union['Tree', None], IndexObjUnion, | |
35 | Tuple['Submodule', 'Submodule']]] | |
36 | ||
37 | ||
38 | # def is_tree_cache(inp: Tuple[bytes, int, str]) -> TypeGuard[TreeCacheTup]: | |
39 | # return isinstance(inp[0], bytes) and isinstance(inp[1], int) and isinstance([inp], str) | |
40 | ||
41 | #-------------------------------------------------------- | |
42 | ||
43 | ||
44 | cmp: Callable[[str, str], int] = lambda a, b: (a > b) - (a < b) | |
20 | 45 | |
21 | 46 | __all__ = ("TreeModifier", "Tree") |
22 | 47 | |
23 | 48 | |
24 | def git_cmp(t1, t2): | |
49 | def git_cmp(t1: TreeCacheTup, t2: TreeCacheTup) -> int: | |
25 | 50 | a, b = t1[2], t2[2] |
51 | # assert isinstance(a, str) and isinstance(b, str) | |
26 | 52 | len_a, len_b = len(a), len(b) |
27 | 53 | min_len = min(len_a, len_b) |
28 | 54 | min_cmp = cmp(a[:min_len], b[:min_len]) |
33 | 59 | return len_a - len_b |
34 | 60 | |
35 | 61 | |
36 | def merge_sort(a, cmp): | |
62 | def merge_sort(a: List[TreeCacheTup], | |
63 | cmp: Callable[[TreeCacheTup, TreeCacheTup], int]) -> None: | |
37 | 64 | if len(a) < 2: |
38 | return | |
65 | return None | |
39 | 66 | |
40 | 67 | mid = len(a) // 2 |
41 | 68 | lefthalf = a[:mid] |
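`git_cmp` above implements git's tree-entry ordering: compare the names over their common prefix, and only if that prefix is equal fall back to the length difference. A runnable sketch of `cmp` and `git_cmp` over the `(binsha, mode, name)` cache tuples:

```python
from typing import Tuple

TreeCacheTup = Tuple[bytes, int, str]

def cmp(a: str, b: str) -> int:
    # Three-way comparison, as in the module-level lambda.
    return (a > b) - (a < b)

def git_cmp(t1: TreeCacheTup, t2: TreeCacheTup) -> int:
    a, b = t1[2], t2[2]
    min_len = min(len(a), len(b))
    min_cmp = cmp(a[:min_len], b[:min_len])
    if min_cmp:
        return min_cmp
    # Names share a prefix: the shorter name sorts first.
    return len(a) - len(b)

print(git_cmp((b"", 0, "abc"), (b"", 0, "abd")))
print(git_cmp((b"", 0, "ab"), (b"", 0, "abc")))
```

Both calls return a negative number, i.e. the first entry sorts before the second; the stable `merge_sort` in the diff keeps equal entries in their original order.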
76 | 103 | the cache of a tree, will be sorted. Assuring it will be in a serializable state""" |
77 | 104 | __slots__ = '_cache' |
78 | 105 | |
79 | def __init__(self, cache): | |
106 | def __init__(self, cache: List[TreeCacheTup]) -> None: | |
80 | 107 | self._cache = cache |
81 | 108 | |
82 | def _index_by_name(self, name): | |
109 | def _index_by_name(self, name: str) -> int: | |
83 | 110 | """:return: index of an item with name, or -1 if not found""" |
84 | 111 | for i, t in enumerate(self._cache): |
85 | 112 | if t[2] == name: |
89 | 116 | return -1 |
90 | 117 | |
91 | 118 | #{ Interface |
92 | def set_done(self): | |
119 | def set_done(self) -> 'TreeModifier': | |
93 | 120 | """Call this method once you are done modifying the tree information. |
94 | 121 | It may be called several times, but be aware that each call will cause |
95 | 122 | a sort operation |
99 | 126 | #} END interface |
100 | 127 | |
101 | 128 | #{ Mutators |
102 | def add(self, sha, mode, name, force=False): | |
129 | def add(self, sha: bytes, mode: int, name: str, force: bool = False) -> 'TreeModifier': | |
103 | 130 | """Add the given item to the tree. If an item with the given name already |
104 | 131 | exists, nothing will be done, but a ValueError will be raised if the |
105 | 132 | sha and mode of the existing item do not match the one you add, unless |
117 | 144 | |
118 | 145 | sha = to_bin_sha(sha) |
119 | 146 | index = self._index_by_name(name) |
147 | ||
120 | 148 | item = (sha, mode, name) |
149 | # assert is_tree_cache(item) | |
150 | ||
121 | 151 | if index == -1: |
122 | 152 | self._cache.append(item) |
123 | 153 | else: |
132 | 162 | # END handle name exists |
133 | 163 | return self |
134 | 164 | |
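The `TreeModifier.add` logic above guards against conflicting duplicate names: a second add of an identical entry is a no-op, a conflicting one raises unless `force` is set. A condensed, self-contained sketch of that behavior (`MiniTreeModifier` is a stand-in, not the real class):

```python
from typing import List, Tuple

TreeCacheTup = Tuple[bytes, int, str]

class MiniTreeModifier:
    # Simplified sketch of TreeModifier's duplicate-name handling.
    def __init__(self) -> None:
        self._cache: List[TreeCacheTup] = []

    def _index_by_name(self, name: str) -> int:
        for i, t in enumerate(self._cache):
            if t[2] == name:
                return i
        return -1

    def add(self, sha: bytes, mode: int, name: str,
            force: bool = False) -> "MiniTreeModifier":
        index = self._index_by_name(name)
        item = (sha, mode, name)
        if index == -1:
            self._cache.append(item)
        elif force:
            self._cache[index] = item
        elif self._cache[index] != item:
            raise ValueError("Item %r existed with different properties" % name)
        # identical item: silently keep the existing entry
        return self

tm = MiniTreeModifier()
tm.add(b"\x01" * 20, 0o100644, "README")
tm.add(b"\x01" * 20, 0o100644, "README")  # same item: no-op
print(len(tm._cache))
```

`add_unchecked`, by contrast, skips the lookup entirely and appends blindly, which is why the diff adds the `isinstance` assertions there.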
135 | def add_unchecked(self, binsha, mode, name): | |
165 | def add_unchecked(self, binsha: bytes, mode: int, name: str) -> None: | |
136 | 166 | """Add the given item to the tree, its correctness is assumed, which |
137 | 167 | puts the caller into responsibility to assure the input is correct. |
138 | 168 | For more information on the parameters, see ``add`` |
139 | 169 | :param binsha: 20 byte binary sha""" |
140 | self._cache.append((binsha, mode, name)) | |
141 | ||
142 | def __delitem__(self, name): | |
170 | assert isinstance(binsha, bytes) and isinstance(mode, int) and isinstance(name, str) | |
171 | tree_cache = (binsha, mode, name) | |
172 | ||
173 | self._cache.append(tree_cache) | |
174 | ||
175 | def __delitem__(self, name: str) -> None: | |
143 | 176 | """Deletes an item with the given name if it exists""" |
144 | 177 | index = self._index_by_name(name) |
145 | 178 | if index > -1: |
148 | 181 | #} END mutators |
149 | 182 | |
150 | 183 | |
151 | class Tree(IndexObject, diff.Diffable, util.Traversable, util.Serializable): | |
184 | class Tree(IndexObject, git_diff.Diffable, util.Traversable, util.Serializable): | |
152 | 185 | |
153 | 186 | """Tree objects represent an ordered list of Blobs and other Trees. |
154 | 187 | |
161 | 194 | blob = tree[0] |
162 | 195 | """ |
163 | 196 | |
164 | type = "tree" | |
197 | type: Literal['tree'] = "tree" | |
165 | 198 | __slots__ = "_cache" |
166 | 199 | |
167 | 200 | # actual integer ids for comparison |
170 | 203 | symlink_id = 0o12 |
171 | 204 | tree_id = 0o04 |
172 | 205 | |
173 | _map_id_to_type = { | |
206 | _map_id_to_type: Dict[int, Type[IndexObjUnion]] = { | |
174 | 207 | commit_id: Submodule, |
175 | 208 | blob_id: Blob, |
176 | 209 | symlink_id: Blob |
177 | 210 | # tree id added once Tree is defined |
178 | 211 | } |
179 | 212 | |
180 | def __init__(self, repo, binsha, mode=tree_id << 12, path=None): | |
213 | def __init__(self, repo: 'Repo', binsha: bytes, mode: int = tree_id << 12, path: Union[PathLike, None] = None): | |
181 | 214 | super(Tree, self).__init__(repo, binsha, mode, path) |
182 | 215 | |
183 | @classmethod | |
184 | def _get_intermediate_items(cls, index_object): | |
216 | @ classmethod | |
217 | def _get_intermediate_items(cls, index_object: IndexObjUnion, | |
218 | ) -> Union[Tuple['Tree', ...], Tuple[()]]: | |
185 | 219 | if index_object.type == "tree": |
186 | 220 | return tuple(index_object._iter_convert_to_object(index_object._cache)) |
187 | 221 | return () |
188 | 222 | |
189 | def _set_cache_(self, attr): | |
223 | def _set_cache_(self, attr: str) -> None: | |
190 | 224 | if attr == "_cache": |
191 | 225 | # Set the data when we need it |
192 | 226 | ostream = self.repo.odb.stream(self.binsha) |
193 | self._cache = tree_entries_from_data(ostream.read()) | |
227 | self._cache: List[TreeCacheTup] = tree_entries_from_data(ostream.read()) | |
194 | 228 | else: |
195 | 229 | super(Tree, self)._set_cache_(attr) |
196 | 230 | # END handle attribute |
197 | 231 | |
198 | def _iter_convert_to_object(self, iterable): | |
232 | def _iter_convert_to_object(self, iterable: Iterable[TreeCacheTup] | |
233 | ) -> Iterator[IndexObjUnion]: | |
199 | 234 | """Iterable yields tuples of (binsha, mode, name), which will be converted |
200 | 235 | to the respective object representation""" |
201 | 236 | for binsha, mode, name in iterable: |
206 | 241 | raise TypeError("Unknown mode %o found in tree data for path '%s'" % (mode, path)) from e |
207 | 242 | # END for each item |
208 | 243 | |
209 | def join(self, file): | |
244 | def join(self, file: str) -> IndexObjUnion: | |
210 | 245 | """Find the named object in this tree's contents |
211 | 246 | :return: ``git.Blob`` or ``git.Tree`` or ``git.Submodule`` |
212 | 247 | |
239 | 274 | raise KeyError(msg % file) |
240 | 275 | # END handle long paths |
241 | 276 | |
242 | def __div__(self, file): | |
243 | """For PY2 only""" | |
244 | return self.join(file) | |
245 | ||
246 | def __truediv__(self, file): | |
277 | def __truediv__(self, file: str) -> IndexObjUnion: | |
247 | 278 | """For PY3 only""" |
248 | 279 | return self.join(file) |
249 | 280 | |
250 | @property | |
251 | def trees(self): | |
281 | @property | 
282 | def trees(self) -> List['Tree']: | |
252 | 283 | """:return: list(Tree, ...) list of trees directly below this tree""" |
253 | 284 | return [i for i in self if i.type == "tree"] |
254 | 285 | |
255 | @property | |
256 | def blobs(self): | |
286 | @property | 
287 | def blobs(self) -> List[Blob]: | |
257 | 288 | """:return: list(Blob, ...) list of blobs directly below this tree""" |
258 | 289 | return [i for i in self if i.type == "blob"] |
259 | 290 | |
260 | @property | |
261 | def cache(self): | |
291 | @property | 
292 | def cache(self) -> TreeModifier: | |
262 | 293 | """ |
263 | 294 | :return: An object allowing to modify the internal cache. This can be used |
264 | 295 | to change the tree's contents. When done, make sure you call ``set_done`` |
266 | 297 | See the ``TreeModifier`` for more information on how to alter the cache""" |
267 | 298 | return TreeModifier(self._cache) |
268 | 299 | |
269 | def traverse(self, predicate=lambda i, d: True, | |
270 | prune=lambda i, d: False, depth=-1, branch_first=True, | |
271 | visit_once=False, ignore_self=1): | |
272 | """For documentation, see util.Traversable.traverse | |
300 | def traverse(self, # type: ignore[override] | |
301 | predicate: Callable[[Union[IndexObjUnion, TraversedTreeTup], int], bool] = lambda i, d: True, | |
302 | prune: Callable[[Union[IndexObjUnion, TraversedTreeTup], int], bool] = lambda i, d: False, | |
303 | depth: int = -1, | |
304 | branch_first: bool = True, | |
305 | visit_once: bool = False, | |
306 | ignore_self: int = 1, | |
307 | as_edge: bool = False | |
308 | ) -> Union[Iterator[IndexObjUnion], | |
309 | Iterator[TraversedTreeTup]]: | |
310 | """For documentation, see util.Traversable._traverse() | |
273 | 311 | Trees are set to visit_once = False to gain more performance in the traversal""" |
274 | return super(Tree, self).traverse(predicate, prune, depth, branch_first, visit_once, ignore_self) | |
312 | ||
313 | # """ | |
314 | # # To typecheck instead of using cast. | |
315 | # import itertools | |
316 | # def is_tree_traversed(inp: Tuple) -> TypeGuard[Tuple[Iterator[Union['Tree', 'Blob', 'Submodule']]]]: | |
317 | # return all(isinstance(x, (Blob, Tree, Submodule)) for x in inp[1]) | |
318 | ||
319 | # ret = super(Tree, self).traverse(predicate, prune, depth, branch_first, visit_once, ignore_self) | |
320 | # ret_tup = itertools.tee(ret, 2) | |
321 | # assert is_tree_traversed(ret_tup), f"Type is {[type(x) for x in list(ret_tup[0])]}" | |
322 | # return ret_tup[0]""" | |
323 | return cast(Union[Iterator[IndexObjUnion], Iterator[TraversedTreeTup]], | |
324 | super(Tree, self)._traverse(predicate, prune, depth, # type: ignore | |
325 | branch_first, visit_once, ignore_self)) | |
326 | ||
327 | def list_traverse(self, *args: Any, **kwargs: Any) -> IterableList[IndexObjUnion]: | |
328 | """ | |
329 | :return: IterableList with the results of the traversal as produced by | |
330 | traverse() | |
331 | Tree -> IterableList[Union['Submodule', 'Tree', 'Blob']] | |
332 | """ | |
333 | return super(Tree, self)._list_traverse(*args, **kwargs) | 
275 | 334 | |
276 | 335 | # List protocol |
277 | def __getslice__(self, i, j): | |
336 | ||
337 | def __getslice__(self, i: int, j: int) -> List[IndexObjUnion]: | |
278 | 338 | return list(self._iter_convert_to_object(self._cache[i:j])) |
279 | 339 | |
280 | def __iter__(self): | |
340 | def __iter__(self) -> Iterator[IndexObjUnion]: | |
281 | 341 | return self._iter_convert_to_object(self._cache) |
282 | 342 | |
283 | def __len__(self): | |
343 | def __len__(self) -> int: | |
284 | 344 | return len(self._cache) |
285 | 345 | |
286 | def __getitem__(self, item): | |
346 | def __getitem__(self, item: Union[str, int, slice]) -> IndexObjUnion: | |
287 | 347 | if isinstance(item, int): |
288 | 348 | info = self._cache[item] |
289 | 349 | return self._map_id_to_type[info[1] >> 12](self.repo, info[0], info[1], join_path(self.path, info[2])) |
295 | 355 | |
296 | 356 | raise TypeError("Invalid index type: %r" % item) |
297 | 357 | |
298 | def __contains__(self, item): | |
358 | def __contains__(self, item: Union[IndexObjUnion, PathLike]) -> bool: | |
299 | 359 | if isinstance(item, IndexObject): |
300 | 360 | for info in self._cache: |
301 | 361 | if item.binsha == info[0]: |
306 | 366 | # compatibility |
307 | 367 | |
308 | 368 | # treat item as repo-relative path |
309 | path = self.path | |
310 | for info in self._cache: | |
311 | if item == join_path(path, info[2]): | |
312 | return True | |
369 | else: | |
370 | path = self.path | |
371 | for info in self._cache: | |
372 | if item == join_path(path, info[2]): | |
373 | return True | |
313 | 374 | # END for each item |
314 | 375 | return False |
315 | 376 | |
316 | def __reversed__(self): | |
317 | return reversed(self._iter_convert_to_object(self._cache)) | |
318 | ||
319 | def _serialize(self, stream): | |
377 | def __reversed__(self) -> Iterator[IndexObjUnion]: | |
378 | return reversed(self._iter_convert_to_object(self._cache)) # type: ignore | |
379 | ||
380 | def _serialize(self, stream: 'BytesIO') -> 'Tree': | |
320 | 381 | """Serialize this tree into the stream. Please note that we will assume |
321 | 382 | our tree data to be in a sorted state. If this is not the case, serialization |
322 | 383 | will not generate a correct tree representation as these are assumed to be sorted |
324 | 385 | tree_to_stream(self._cache, stream.write) |
325 | 386 | return self |
326 | 387 | |
327 | def _deserialize(self, stream): | |
388 | def _deserialize(self, stream: 'BytesIO') -> 'Tree': | |
328 | 389 | self._cache = tree_entries_from_data(stream.read()) |
329 | 390 | return self |
330 | 391 |
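`_serialize` assumes `_cache` is already sorted. Git's canonical tree order compares entry names byte-wise with a `/` appended to directory (tree) entries, which is why a subtree named `ab` sorts before a blob named `abc`. A hedged sketch of that sort key (the helper name is illustrative, not part of GitPython):

```python
from typing import List, Tuple

def git_entry_sort_key(name: str, mode: int) -> bytes:
    # directories participate in the sort as if their name ended with '/'
    is_tree = (mode >> 12) == 0o04
    return (name + ('/' if is_tree else '')).encode()

entries: List[Tuple[str, int]] = [("abc", 0o100644), ("ab", 0o040000)]
ordered = sorted(entries, key=lambda e: git_entry_sort_key(*e))
# tree "ab" sorts before blob "abc" because b"ab/" < b"abc"
```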
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | 5 | """Module for general utility functions""" |
6 | ||
7 | from abc import abstractmethod | |
8 | import warnings | |
6 | 9 | from git.util import ( |
7 | 10 | IterableList, |
11 | IterableObj, | |
8 | 12 | Actor |
9 | 13 | ) |
10 | 14 | |
11 | 15 | import re |
12 | from collections import deque as Deque | |
16 | from collections import deque | |
13 | 17 | |
14 | 18 | from string import digits |
15 | 19 | import time |
16 | 20 | import calendar |
17 | 21 | from datetime import datetime, timedelta, tzinfo |
18 | 22 | |
23 | # typing ------------------------------------------------------------ | |
24 | from typing import (Any, Callable, Deque, Iterator, NamedTuple, overload, Sequence, | |
25 | TYPE_CHECKING, Tuple, Type, TypeVar, Union, cast) | |
26 | ||
27 | from git.types import Has_id_attribute, Literal, Protocol, runtime_checkable | |
28 | ||
29 | if TYPE_CHECKING: | |
30 | from io import BytesIO, StringIO | |
31 | from .commit import Commit | |
32 | from .blob import Blob | |
33 | from .tag import TagObject | |
34 | from .tree import Tree, TraversedTreeTup | |
35 | from subprocess import Popen | |
36 | from .submodule.base import Submodule | |
37 | ||
38 | ||
39 | class TraverseNT(NamedTuple): | |
40 | depth: int | |
41 | item: Union['Traversable', 'Blob'] | |
42 | src: Union['Traversable', None] | |
43 | ||
44 | ||
45 | T_TIobj = TypeVar('T_TIobj', bound='TraversableIterableObj') # for TraversableIterableObj.traverse() | |
46 | ||
47 | TraversedTup = Union[Tuple[Union['Traversable', None], 'Traversable'], # for commit, submodule | |
48 | 'TraversedTreeTup'] # for tree.traverse() | |
49 | ||
50 | # -------------------------------------------------------------------- | |
51 | ||
19 | 52 | __all__ = ('get_object_type_by_name', 'parse_date', 'parse_actor_and_date', |
20 | 53 | 'ProcessStreamAdapter', 'Traversable', 'altz_to_utctz_str', 'utctz_to_altz', |
21 | 54 | 'verify_utctz', 'Actor', 'tzoffset', 'utc') |
25 | 58 | #{ Functions |
26 | 59 | |
27 | 60 | |
28 | def mode_str_to_int(modestr): | |
61 | def mode_str_to_int(modestr: Union[bytes, str]) -> int: | |
29 | 62 | """ |
30 | 63 | :param modestr: string like 755 or 644 or 100644 - only the last 6 chars will be used |
31 | 64 | :return: |
35 | 68 | for example.""" |
36 | 69 | mode = 0 |
37 | 70 | for iteration, char in enumerate(reversed(modestr[-6:])): |
71 | char = cast(Union[str, int], char) | |
38 | 72 | mode += int(char) << iteration * 3 |
39 | 73 | # END for each char |
40 | 74 | return mode |
41 | 75 | |
42 | 76 | |
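Each character of the mode string is one octal digit, i.e. three bits, so the loop shifts digit *i* by `3 * i`. The function reproduced standalone for illustration:

```python
def mode_str_to_int(modestr: str) -> int:
    # "100644" -> 0o100644: every character is an octal digit worth 3 bits,
    # accumulated from the least significant end of the string
    mode = 0
    for iteration, char in enumerate(reversed(modestr[-6:])):
        mode += int(char) << (iteration * 3)
    return mode
```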
43 | def get_object_type_by_name(object_type_name): | |
77 | def get_object_type_by_name(object_type_name: bytes | |
78 | ) -> Union[Type['Commit'], Type['TagObject'], Type['Tree'], Type['Blob']]: | |
44 | 79 | """ |
45 | 80 | :return: type suitable to handle the given object type name. |
46 | 81 | Use the type to create new instances. |
61 | 96 | from . import tree |
62 | 97 | return tree.Tree |
63 | 98 | else: |
64 | raise ValueError("Cannot handle unknown object type: %s" % object_type_name) | |
65 | ||
66 | ||
67 | def utctz_to_altz(utctz): | |
99 | raise ValueError("Cannot handle unknown object type: %s" % object_type_name.decode()) | |
100 | ||
101 | ||
102 | def utctz_to_altz(utctz: str) -> int: | |
68 | 103 | """we convert utctz to the timezone in seconds, it is the format time.altzone |
69 | 104 | returns. Git stores it as UTC timezone which has the opposite sign as well, |
70 | 105 | which explains the -1 * ( that was made explicit here ) |
72 | 107 | return -1 * int(float(utctz) / 100 * 3600) |
73 | 108 | |
74 | 109 | |
75 | def altz_to_utctz_str(altz): | |
110 | def altz_to_utctz_str(altz: float) -> str: | |
76 | 111 | """As above, but inverses the operation, returning a string that can be used |
77 | 112 | in commit objects""" |
78 | 113 | utci = -1 * int((float(altz) / 3600) * 100) |
82 | 117 | return prefix + utcs |
83 | 118 | |
84 | 119 | |
85 | def verify_utctz(offset): | |
120 | def verify_utctz(offset: str) -> str: | |
86 | 121 | """:raise ValueError: if offset is incorrect |
87 | 122 | :return: offset""" |
88 | 123 | fmt_exc = ValueError("Invalid timezone offset format: %s" % offset) |
100 | 135 | |
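The two offset helpers flip between git's east-positive `"+HHMM"` strings and the west-positive seconds of `time.altzone`. A standalone sketch of both directions (exact for whole-hour offsets only: the `/ 100` arithmetic treats the minute digits as hundredths of an hour, so an offset like `"+0530"` is not scaled exactly):

```python
def utctz_to_altz(utctz: str) -> int:
    # "+0200" -> -7200: git offsets are east-positive, time.altzone is
    # seconds *west* of UTC, hence the sign flip
    return -1 * int(float(utctz) / 100 * 3600)

def altz_to_utctz_str(altz: float) -> str:
    # -7200 -> "+0200", zero-padded to four digits
    utci = -1 * int((float(altz) / 3600) * 100)
    utcs = str(abs(utci)).rjust(4, '0')
    return ('-' if utci < 0 else '+') + utcs
```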
101 | 136 | |
102 | 137 | class tzoffset(tzinfo): |
103 | def __init__(self, secs_west_of_utc, name=None): | |
138 | ||
139 | def __init__(self, secs_west_of_utc: float, name: Union[None, str] = None) -> None: | |
104 | 140 | self._offset = timedelta(seconds=-secs_west_of_utc) |
105 | 141 | self._name = name or 'fixed' |
106 | 142 | |
107 | def __reduce__(self): | |
143 | def __reduce__(self) -> Tuple[Type['tzoffset'], Tuple[float, str]]: | |
108 | 144 | return tzoffset, (-self._offset.total_seconds(), self._name) |
109 | 145 | |
110 | def utcoffset(self, dt): | |
146 | def utcoffset(self, dt: Union[datetime, None]) -> timedelta: | |
111 | 147 | return self._offset |
112 | 148 | |
113 | def tzname(self, dt): | |
149 | def tzname(self, dt: Union[datetime, None]) -> str: | |
114 | 150 | return self._name |
115 | 151 | |
116 | def dst(self, dt): | |
152 | def dst(self, dt: Union[datetime, None]) -> timedelta: | |
117 | 153 | return ZERO |
118 | 154 | |
119 | 155 | |
120 | 156 | utc = tzoffset(0, 'UTC') |
121 | 157 | |
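`tzoffset` is a fixed-offset `tzinfo`; because the constructor takes seconds *west* of UTC, the stored `timedelta` is negated. A self-contained copy with a usage check (the `cet` instance is an illustrative example):

```python
from datetime import datetime, timedelta, tzinfo

ZERO = timedelta(0)

class tzoffset(tzinfo):
    def __init__(self, secs_west_of_utc: float, name: str = 'fixed') -> None:
        # seconds west of UTC -> negated timedelta, so an eastern zone
        # ends up with a positive utcoffset()
        self._offset = timedelta(seconds=-secs_west_of_utc)
        self._name = name

    def utcoffset(self, dt): return self._offset
    def tzname(self, dt): return self._name
    def dst(self, dt): return ZERO

utc = tzoffset(0, 'UTC')
cet = tzoffset(-3600, 'CET')                 # one hour east of UTC
epoch_cet = datetime.fromtimestamp(0, cet)   # the epoch rendered at +01:00
```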
122 | 158 | |
123 | def from_timestamp(timestamp, tz_offset): | |
159 | def from_timestamp(timestamp: float, tz_offset: float) -> datetime: | |
124 | 160 | """Converts a timestamp + tz_offset into an aware datetime instance.""" |
125 | 161 | utc_dt = datetime.fromtimestamp(timestamp, utc) |
126 | 162 | try: |
130 | 166 | return utc_dt |
131 | 167 | |
132 | 168 | |
133 | def parse_date(string_date): | |
169 | def parse_date(string_date: Union[str, datetime]) -> Tuple[int, int]: | |
134 | 170 | """ |
135 | 171 | Parse the given date as one of the following |
136 | 172 | |
144 | 180 | :raise ValueError: If the format could not be understood |
145 | 181 | :note: Date can also be YYYY.MM.DD, MM/DD/YYYY and DD.MM.YYYY. |
146 | 182 | """ |
147 | if isinstance(string_date, datetime) and string_date.tzinfo: | |
148 | offset = -int(string_date.utcoffset().total_seconds()) | |
149 | return int(string_date.astimezone(utc).timestamp()), offset | |
183 | if isinstance(string_date, datetime): | |
184 | if string_date.tzinfo: | |
185 | utcoffset = cast(timedelta, string_date.utcoffset()) # typeguard, if tzinfo is not None | 
186 | offset = -int(utcoffset.total_seconds()) | |
187 | return int(string_date.astimezone(utc).timestamp()), offset | |
188 | else: | |
189 | raise ValueError(f"string_date datetime object without tzinfo, {string_date}") | |
150 | 190 | |
151 | 191 | # git time |
152 | 192 | try: |
153 | 193 | if string_date.count(' ') == 1 and string_date.rfind(':') == -1: |
154 | timestamp, offset = string_date.split() | |
194 | timestamp, offset_str = string_date.split() | |
155 | 195 | if timestamp.startswith('@'): |
156 | 196 | timestamp = timestamp[1:] |
157 | timestamp = int(timestamp) | |
158 | return timestamp, utctz_to_altz(verify_utctz(offset)) | |
197 | timestamp_int = int(timestamp) | |
198 | return timestamp_int, utctz_to_altz(verify_utctz(offset_str)) | |
159 | 199 | else: |
160 | offset = "+0000" # local time by default | |
200 | offset_str = "+0000" # local time by default | |
161 | 201 | if string_date[-5] in '-+': |
162 | offset = verify_utctz(string_date[-5:]) | |
202 | offset_str = verify_utctz(string_date[-5:]) | |
163 | 203 | string_date = string_date[:-6] # skip space as well |
164 | 204 | # END split timezone info |
165 | offset = utctz_to_altz(offset) | |
205 | offset = utctz_to_altz(offset_str) | |
166 | 206 | |
167 | 207 | # now figure out the date and time portion - split time |
168 | 208 | date_formats = [] |
208 | 248 | raise ValueError("no format matched") |
209 | 249 | # END handle format |
210 | 250 | except Exception as e: |
211 | raise ValueError("Unsupported date format: %s" % string_date) from e | |
251 | raise ValueError(f"Unsupported date format or type: {string_date}, type={type(string_date)}") from e | |
212 | 252 | # END handle exceptions |
213 | 253 | |
214 | 254 | |
217 | 257 | _re_only_actor = re.compile(r'^.+? (.*)$') |
218 | 258 | |
219 | 259 | |
220 | def parse_actor_and_date(line): | |
260 | def parse_actor_and_date(line: str) -> Tuple[Actor, int, int]: | |
221 | 261 | """Parse out the actor (author or committer) info from a line like:: |
222 | 262 | |
223 | 263 | author Tom Preston-Werner <tom@mojombo.com> 1191999972 -0700 |
224 | 264 | |
225 | 265 | :return: [Actor, int_seconds_since_epoch, int_timezone_offset]""" |
226 | actor, epoch, offset = '', 0, 0 | |
266 | actor, epoch, offset = '', '0', '0' | |
227 | 267 | m = _re_actor_epoch.search(line) |
228 | 268 | if m: |
229 | 269 | actor, epoch, offset = m.groups() |
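`_re_actor_epoch` (not shown in full in this hunk) pulls the actor, epoch seconds, and UTC offset out of an author/committer line. An illustrative pattern with the same three-group layout (this regex is an assumption for demonstration, not GitPython's exact one):

```python
import re

# Hypothetical stand-in for _re_actor_epoch: lazily skip the leading
# "author"/"committer" keyword, then capture name+email, epoch, and offset.
actor_epoch = re.compile(r'^.+? (.*) (\d+) ([+-]\d+).*$')

line = "author Tom Preston-Werner <tom@mojombo.com> 1191999972 -0700"
match = actor_epoch.search(line)
assert match is not None
actor, epoch, offset = match.groups()
```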
246 | 286 | it if the instance goes out of scope.""" |
247 | 287 | __slots__ = ("_proc", "_stream") |
248 | 288 | |
249 | def __init__(self, process, stream_name): | |
289 | def __init__(self, process: 'Popen', stream_name: str) -> None: | |
250 | 290 | self._proc = process |
251 | self._stream = getattr(process, stream_name) | |
252 | ||
253 | def __getattr__(self, attr): | |
291 | self._stream: StringIO = getattr(process, stream_name) # guessed type | |
292 | ||
293 | def __getattr__(self, attr: str) -> Any: | |
254 | 294 | return getattr(self._stream, attr) |
255 | 295 | |
256 | 296 | |
257 | class Traversable(object): | |
297 | @runtime_checkable | |
298 | class Traversable(Protocol): | |
258 | 299 | |
259 | 300 | """Simple interface to perform depth-first or breadth-first traversals |
260 | 301 | into one direction. |
261 | 302 | Subclasses only need to implement one function. |
262 | Instances of the Subclass must be hashable""" | |
303 | Instances of the Subclass must be hashable | |
304 | ||
305 | Defined subclasses = [Commit, Tree, Submodule] | 
306 | """ | |
263 | 307 | __slots__ = () |
264 | 308 | |
265 | 309 | @classmethod |
266 | def _get_intermediate_items(cls, item): | |
310 | @abstractmethod | |
311 | def _get_intermediate_items(cls, item: Any) -> Sequence['Traversable']: | |
267 | 312 | """ |
268 | 313 | Returns: |
269 | List of items connected to the given item. | |
314 | Tuple of items connected to the given item. | |
270 | 315 | Must be implemented in subclass |
316 | ||
317 | class Commit:: (cls, Commit) -> Tuple[Commit, ...] | |
318 | class Submodule:: (cls, Submodule) -> IterableList[Submodule] | 
319 | class Tree:: (cls, Tree) -> Tuple[Tree, ...] | |
271 | 320 | """ |
272 | 321 | raise NotImplementedError("To be implemented in subclass") |
273 | 322 | |
274 | def list_traverse(self, *args, **kwargs): | |
323 | @abstractmethod | |
324 | def list_traverse(self, *args: Any, **kwargs: Any) -> Any: | |
325 | """ """ | |
326 | warnings.warn("list_traverse() method should only be called from subclasses. " | 
327 | "Calling from Traversable abstract class will raise NotImplementedError in 3.1.20. " | 
328 | "Builtin subclasses are 'Submodule', 'Tree' and 'Commit'.", | 
329 | DeprecationWarning, | |
330 | stacklevel=2) | |
331 | return self._list_traverse(*args, **kwargs) | |
332 | ||
333 | def _list_traverse(self, as_edge: bool = False, *args: Any, **kwargs: Any | |
334 | ) -> IterableList[Union['Commit', 'Submodule', 'Tree', 'Blob']]: | |
275 | 335 | """ |
276 | 336 | :return: IterableList with the results of the traversal as produced by |
277 | traverse()""" | |
278 | out = IterableList(self._id_attribute_) | |
279 | out.extend(self.traverse(*args, **kwargs)) | |
280 | return out | |
281 | ||
282 | def traverse(self, predicate=lambda i, d: True, | |
283 | prune=lambda i, d: False, depth=-1, branch_first=True, | |
284 | visit_once=True, ignore_self=1, as_edge=False): | |
337 | traverse() | |
338 | Commit -> IterableList['Commit'] | |
339 | Submodule -> IterableList['Submodule'] | |
340 | Tree -> IterableList[Union['Submodule', 'Tree', 'Blob']] | |
341 | """ | |
342 | # Commit and Submodule have id.__attribute__ as IterableObj | |
343 | # Tree has id.__attribute__ inherited from IndexObject | |
344 | if isinstance(self, Has_id_attribute): | |
345 | id = self._id_attribute_ | |
346 | else: | |
347 | id = "" # shouldn't reach here, unless Traversable subclass created with no _id_attribute_ | |
348 | # could add _id_attribute_ to Traversable, or make all Traversable also Iterable? | |
349 | ||
350 | if not as_edge: | |
351 | out: IterableList[Union['Commit', 'Submodule', 'Tree', 'Blob']] = IterableList(id) | |
352 | out.extend(self.traverse(as_edge=as_edge, *args, **kwargs)) | |
353 | return out | |
354 | # overloads in subclasses (mypy doesn't allow typing self: subclass) | 
355 | # Union[IterableList['Commit'], IterableList['Submodule'], IterableList[Union['Submodule', 'Tree', 'Blob']]] | |
356 | else: | |
357 | # Raise deprecationwarning, doesn't make sense to use this | |
358 | out_list: IterableList = IterableList(self.traverse(*args, **kwargs)) | |
359 | return out_list | |
360 | ||
361 | @abstractmethod | 
362 | def traverse(self, *args: Any, **kwargs: Any) -> Any: | |
363 | """ """ | |
364 | warnings.warn("traverse() method should only be called from subclasses. " | 
365 | "Calling from Traversable abstract class will raise NotImplementedError in 3.1.20. " | 
366 | "Builtin subclasses are 'Submodule', 'Tree' and 'Commit'.", | 
367 | DeprecationWarning, | |
368 | stacklevel=2) | |
369 | return self._traverse(*args, **kwargs) | |
370 | ||
371 | def _traverse(self, | |
372 | predicate: Callable[[Union['Traversable', 'Blob', TraversedTup], int], bool] = lambda i, d: True, | |
373 | prune: Callable[[Union['Traversable', 'Blob', TraversedTup], int], bool] = lambda i, d: False, | |
374 | depth: int = -1, branch_first: bool = True, visit_once: bool = True, | |
375 | ignore_self: int = 1, as_edge: bool = False | |
376 | ) -> Union[Iterator[Union['Traversable', 'Blob']], | |
377 | Iterator[TraversedTup]]: | |
285 | 378 | """:return: iterator yielding of items found when traversing self |
286 | ||
287 | 379 | :param predicate: f(i,d) returns False if item i at depth d should not be included in the result |
288 | 380 | |
289 | 381 | :param prune: |
312 | 404 | if True, return a pair of items, first being the source, second the |
313 | 405 | destination, i.e. tuple(src, dest) with the edge spanning from |
314 | 406 | source to destination""" |
407 | ||
408 | """ | |
409 | Commit -> Iterator[Union[Commit, Tuple[Commit, Commit]]] | 
410 | Submodule -> Iterator[Union[Submodule, Tuple[Submodule, Submodule]]] | 
411 | Tree -> Iterator[Union[Blob, Tree, Submodule, | 
412 | Tuple[Union[Submodule, Tree], Union[Blob, Tree, Submodule]]]] | 
413 | | 
414 | ignore_self=True as_edge=True -> Iterator[item] | 
415 | ignore_self=True as_edge=False -> Iterator[item] | 
416 | ignore_self=False as_edge=True -> Iterator[item] | Iterator[Tuple[src, item]] | 
417 | ignore_self=False as_edge=False -> Iterator[Tuple[src, item]]""" | 
418 | ||
315 | 419 | visited = set() |
316 | stack = Deque() | |
317 | stack.append((0, self, None)) # self is always depth level 0 | |
318 | ||
319 | def addToStack(stack, item, branch_first, depth): | |
420 | stack: Deque[TraverseNT] = deque() | |
421 | stack.append(TraverseNT(0, self, None)) # self is always depth level 0 | |
422 | ||
423 | def addToStack(stack: Deque[TraverseNT], | |
424 | src_item: 'Traversable', | |
425 | branch_first: bool, | |
426 | depth: int) -> None: | |
320 | 427 | lst = self._get_intermediate_items(item) |
321 | if not lst: | |
322 | return | |
428 | if not lst: # empty list | |
429 | return None | |
323 | 430 | if branch_first: |
324 | stack.extendleft((depth, i, item) for i in lst) | |
431 | stack.extendleft(TraverseNT(depth, i, src_item) for i in lst) | |
325 | 432 | else: |
326 | reviter = ((depth, lst[i], item) for i in range(len(lst) - 1, -1, -1)) | |
433 | reviter = (TraverseNT(depth, lst[i], src_item) for i in range(len(lst) - 1, -1, -1)) | |
327 | 434 | stack.extend(reviter) |
328 | 435 | # END addToStack local method |
329 | 436 | |
336 | 443 | if visit_once: |
337 | 444 | visited.add(item) |
338 | 445 | |
339 | rval = (as_edge and (src, item)) or item | |
446 | rval: Union[TraversedTup, 'Traversable', 'Blob'] | |
447 | if as_edge: # if as_edge return (src, item) unless src is None (e.g. for first item) | 
448 | rval = (src, item) | |
449 | else: | |
450 | rval = item | |
451 | ||
340 | 452 | if prune(rval, d): |
341 | 453 | continue |
342 | 454 | |
353 | 465 | # END for each item on work stack |
354 | 466 | |
355 | 467 | |
356 | class Serializable(object): | |
468 | @runtime_checkable | 
469 | class Serializable(Protocol): | |
357 | 470 | |
358 | 471 | """Defines methods to serialize and deserialize objects from and into a data stream""" |
359 | 472 | __slots__ = () |
360 | 473 | |
361 | def _serialize(self, stream): | |
474 | # @abstractmethod | |
475 | def _serialize(self, stream: 'BytesIO') -> 'Serializable': | |
362 | 476 | """Serialize the data of this object into the given data stream |
363 | 477 | :note: a serialized object would ``_deserialize`` into the same object |
364 | 478 | :param stream: a file-like object |
365 | 479 | :return: self""" |
366 | 480 | raise NotImplementedError("To be implemented in subclass") |
367 | 481 | |
368 | def _deserialize(self, stream): | |
482 | # @abstractmethod | |
483 | def _deserialize(self, stream: 'BytesIO') -> 'Serializable': | |
369 | 484 | """Deserialize all information regarding this object from the stream |
370 | 485 | :param stream: a file-like object |
371 | 486 | :return: self""" |
372 | 487 | raise NotImplementedError("To be implemented in subclass") |
488 | ||
489 | ||
490 | class TraversableIterableObj(IterableObj, Traversable): | |
491 | __slots__ = () | |
492 | ||
493 | TIobj_tuple = Tuple[Union[T_TIobj, None], T_TIobj] | |
494 | ||
495 | def list_traverse(self: T_TIobj, *args: Any, **kwargs: Any) -> IterableList[T_TIobj]: | |
496 | return super(TraversableIterableObj, self)._list_traverse(*args, **kwargs) | 
497 | ||
498 | @overload # type: ignore | 
499 | def traverse(self: T_TIobj | |
500 | ) -> Iterator[T_TIobj]: | |
501 | ... | |
502 | ||
503 | @overload | 
504 | def traverse(self: T_TIobj, | |
505 | predicate: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool], | |
506 | prune: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool], | |
507 | depth: int, branch_first: bool, visit_once: bool, | |
508 | ignore_self: Literal[True], | |
509 | as_edge: Literal[False], | |
510 | ) -> Iterator[T_TIobj]: | |
511 | ... | |
512 | ||
513 | @overload | 
514 | def traverse(self: T_TIobj, | |
515 | predicate: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool], | |
516 | prune: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool], | |
517 | depth: int, branch_first: bool, visit_once: bool, | |
518 | ignore_self: Literal[False], | |
519 | as_edge: Literal[True], | |
520 | ) -> Iterator[Tuple[Union[T_TIobj, None], T_TIobj]]: | |
521 | ... | |
522 | ||
523 | @overload | 
524 | def traverse(self: T_TIobj, | |
525 | predicate: Callable[[Union[T_TIobj, TIobj_tuple], int], bool], | |
526 | prune: Callable[[Union[T_TIobj, TIobj_tuple], int], bool], | |
527 | depth: int, branch_first: bool, visit_once: bool, | |
528 | ignore_self: Literal[True], | |
529 | as_edge: Literal[True], | |
530 | ) -> Iterator[Tuple[T_TIobj, T_TIobj]]: | |
531 | ... | |
532 | ||
533 | def traverse(self: T_TIobj, | |
534 | predicate: Callable[[Union[T_TIobj, TIobj_tuple], int], | |
535 | bool] = lambda i, d: True, | |
536 | prune: Callable[[Union[T_TIobj, TIobj_tuple], int], | |
537 | bool] = lambda i, d: False, | |
538 | depth: int = -1, branch_first: bool = True, visit_once: bool = True, | |
539 | ignore_self: int = 1, as_edge: bool = False | |
540 | ) -> Union[Iterator[T_TIobj], | |
541 | Iterator[Tuple[T_TIobj, T_TIobj]], | |
542 | Iterator[TIobj_tuple]]: | |
543 | """For documentation, see util.Traversable._traverse()""" | |
544 | ||
545 | """ | |
546 | # To typecheck instead of using cast. | |
547 | import itertools | |
548 | from git.types import TypeGuard | |
549 | def is_commit_traversed(inp: Tuple) -> TypeGuard[Tuple[Iterator[Tuple['Commit', 'Commit']]]]: | |
550 | for x in inp[1]: | |
551 | if not isinstance(x, tuple) and len(x) != 2: | |
552 | if all(isinstance(inner, Commit) for inner in x): | |
553 | continue | |
554 | return True | |
555 | ||
556 | ret = super(Commit, self).traverse(predicate, prune, depth, branch_first, visit_once, ignore_self, as_edge) | |
557 | ret_tup = itertools.tee(ret, 2) | |
558 | assert is_commit_traversed(ret_tup), f"{[type(x) for x in list(ret_tup[0])]}" | |
559 | return ret_tup[0] | |
560 | """ | |
561 | return cast(Union[Iterator[T_TIobj], | |
562 | Iterator[Tuple[Union[None, T_TIobj], T_TIobj]]], | |
563 | super(TraversableIterableObj, self)._traverse( | |
564 | predicate, prune, depth, branch_first, visit_once, ignore_self, as_edge # type: ignore | |
565 | )) |
0 | 0 | # flake8: noqa |
1 | from __future__ import absolute_import | |
2 | 1 | # import all modules in order, fix the names they require |
3 | 2 | from .symbolic import * |
4 | 3 | from .reference import * |
0 | from git.config import SectionConstraint | |
0 | from git.config import GitConfigParser, SectionConstraint | |
1 | 1 | from git.util import join_path |
2 | 2 | from git.exc import GitCommandError |
3 | 3 | |
4 | 4 | from .symbolic import SymbolicReference |
5 | 5 | from .reference import Reference |
6 | 6 | |
7 | # typing --------------------------------------------------- | 
8 | ||
9 | from typing import Any, Sequence, Union, TYPE_CHECKING | |
10 | ||
11 | from git.types import PathLike, Commit_ish | |
12 | ||
13 | if TYPE_CHECKING: | |
14 | from git.repo import Repo | |
15 | from git.objects import Commit | |
16 | from git.refs import RemoteReference | |
17 | ||
18 | # ------------------------------------------------------------------- | |
19 | ||
7 | 20 | __all__ = ["HEAD", "Head"] |
8 | 21 | |
9 | 22 | |
10 | def strip_quotes(string): | |
23 | def strip_quotes(string: str) -> str: | |
11 | 24 | if string.startswith('"') and string.endswith('"'): |
12 | 25 | return string[1:-1] |
13 | 26 | return string |
14 | ||
27 | ||
15 | 28 | |
16 | 29 | class HEAD(SymbolicReference): |
17 | 30 | |
21 | 34 | _ORIG_HEAD_NAME = 'ORIG_HEAD' |
22 | 35 | __slots__ = () |
23 | 36 | |
24 | def __init__(self, repo, path=_HEAD_NAME): | |
37 | def __init__(self, repo: 'Repo', path: PathLike = _HEAD_NAME): | |
25 | 38 | if path != self._HEAD_NAME: |
26 | 39 | raise ValueError("HEAD instance must point to %r, got %r" % (self._HEAD_NAME, path)) |
27 | 40 | super(HEAD, self).__init__(repo, path) |
28 | ||
29 | def orig_head(self): | |
41 | self.commit: 'Commit' | |
42 | ||
43 | def orig_head(self) -> SymbolicReference: | |
30 | 44 | """ |
31 | 45 | :return: SymbolicReference pointing at the ORIG_HEAD, which is maintained |
32 | 46 | to contain the previous value of HEAD""" |
33 | 47 | return SymbolicReference(self.repo, self._ORIG_HEAD_NAME) |
34 | 48 | |
35 | def reset(self, commit='HEAD', index=True, working_tree=False, | |
36 | paths=None, **kwargs): | |
49 | def reset(self, commit: Union[Commit_ish, SymbolicReference, str] = 'HEAD', | |
50 | index: bool = True, working_tree: bool = False, | |
51 | paths: Union[PathLike, Sequence[PathLike], None] = None, **kwargs: Any) -> 'HEAD': | |
37 | 52 | """Reset our HEAD to the given commit optionally synchronizing |
38 | 53 | the index and working tree. The reference we refer to will be set to |
39 | 54 | commit as well. |
59 | 74 | Additional arguments passed to git-reset. |
60 | 75 | |
61 | 76 | :return: self""" |
77 | mode: Union[str, None] | |
62 | 78 | mode = "--soft" |
63 | 79 | if index: |
64 | 80 | mode = "--mixed" |
112 | 128 | k_config_remote_ref = "merge" # branch to merge from remote |
113 | 129 | |
114 | 130 | @classmethod |
115 | def delete(cls, repo, *heads, **kwargs): | |
131 | def delete(cls, repo: 'Repo', *heads: 'Head', force: bool = False, **kwargs: Any) -> None: | |
116 | 132 | """Delete the given heads |
117 | 133 | |
118 | 134 | :param force: |
119 | 135 | If True, the heads will be deleted even if they are not yet merged into |
120 | 136 | the main development stream. |
121 | 137 | Default False""" |
122 | force = kwargs.get("force", False) | |
123 | 138 | flag = "-d" |
124 | 139 | if force: |
125 | 140 | flag = "-D" |
126 | 141 | repo.git.branch(flag, *heads) |
127 | 142 | |
128 | def set_tracking_branch(self, remote_reference): | |
143 | def set_tracking_branch(self, remote_reference: Union['RemoteReference', None]) -> 'Head': | |
129 | 144 | """ |
130 | 145 | Configure this branch to track the given remote reference. This will alter |
131 | 146 | this branch's configuration accordingly. |
150 | 165 | |
151 | 166 | return self |
152 | 167 | |
153 | def tracking_branch(self): | |
168 | def tracking_branch(self) -> Union['RemoteReference', None]: | |
154 | 169 | """ |
155 | 170 | :return: The remote_reference we are tracking, or None if we are |
156 | 171 | not a tracking branch""" |
165 | 180 | # we are not a tracking branch |
166 | 181 | return None |
167 | 182 | |
168 | def rename(self, new_path, force=False): | |
183 | def rename(self, new_path: PathLike, force: bool = False) -> 'Head': | |
169 | 184 | """Rename self to a new path |
170 | 185 | |
171 | 186 | :param new_path: |
186 | 201 | self.path = "%s/%s" % (self._common_path_default, new_path) |
187 | 202 | return self |
188 | 203 | |
189 | def checkout(self, force=False, **kwargs): | |
204 | def checkout(self, force: bool = False, **kwargs: Any) -> Union['HEAD', 'Head']: | |
190 | 205 | """Checkout this head by setting the HEAD to this reference, by updating the index |
191 | 206 | to reflect the tree we point to and by updating the working tree to reflect |
192 | 207 | the latest index. |
218 | 233 | self.repo.git.checkout(self, **kwargs) |
219 | 234 | if self.repo.head.is_detached: |
220 | 235 | return self.repo.head |
221 | return self.repo.active_branch | |
236 | else: | |
237 | return self.repo.active_branch | |
222 | 238 | |
223 | 239 | #{ Configuration |
224 | def _config_parser(self, read_only): | |
240 | def _config_parser(self, read_only: bool) -> SectionConstraint[GitConfigParser]: | |
225 | 241 | if read_only: |
226 | 242 | parser = self.repo.config_reader() |
227 | 243 | else: |
230 | 246 | |
231 | 247 | return SectionConstraint(parser, 'branch "%s"' % self.name) |
232 | 248 | |
233 | def config_reader(self): | |
249 | def config_reader(self) -> SectionConstraint[GitConfigParser]: | |
234 | 250 | """ |
235 | 251 | :return: A configuration parser instance constrained to only read |
236 | 252 | this instance's values""" |
237 | 253 | return self._config_parser(read_only=True) |
238 | 254 | |
239 | def config_writer(self): | |
255 | def config_writer(self) -> SectionConstraint[GitConfigParser]: | |
240 | 256 | """ |
241 | 257 | :return: A configuration writer instance with read- and write access |
242 | 258 | to options of this head""" |
0 | ||
1 | from mmap import mmap | |
0 | 2 | import re |
1 | import time | |
3 | import time as _time | |
2 | 4 | |
3 | 5 | from git.compat import defenc |
4 | 6 | from git.objects.util import ( |
19 | 21 | import os.path as osp |
20 | 22 | |
21 | 23 | |
24 | # typing ------------------------------------------------------------------ | |
25 | ||
26 | from typing import Iterator, List, Tuple, Union, TYPE_CHECKING | |
27 | ||
28 | from git.types import PathLike | |
29 | ||
30 | if TYPE_CHECKING: | |
31 | from git.refs import SymbolicReference | |
32 | from io import BytesIO | |
33 | from git.config import GitConfigParser, SectionConstraint # NOQA | |
34 | ||
35 | # ------------------------------------------------------------------------------ | |
36 | ||
22 | 37 | __all__ = ["RefLog", "RefLogEntry"] |
23 | 38 | |
24 | 39 | |
25 | class RefLogEntry(tuple): | |
40 | class RefLogEntry(Tuple[str, str, Actor, Tuple[int, int], str]): | |
26 | 41 | |
27 | 42 | """Named tuple allowing easy access to the revlog data fields""" |
28 | 43 | _re_hexsha_only = re.compile('^[0-9A-Fa-f]{40}$') |
29 | 44 | __slots__ = () |
30 | 45 | |
31 | def __repr__(self): | |
46 | def __repr__(self) -> str: | |
32 | 47 | """Representation of ourselves in git reflog format""" |
33 | 48 | return self.format() |
34 | 49 | |
35 | def format(self): | |
50 | def format(self) -> str: | |
36 | 51 | """:return: a string suitable to be placed in a reflog file""" |
37 | 52 | act = self.actor |
38 | 53 | time = self.time |
45 | 60 | self.message) |
46 | 61 | |
47 | 62 | @property |
48 | def oldhexsha(self): | |
63 | def oldhexsha(self) -> str: | |
49 | 64 | """The hexsha to the commit the ref pointed to before the change""" |
50 | 65 | return self[0] |
51 | 66 | |
52 | 67 | @property |
53 | def newhexsha(self): | |
68 | def newhexsha(self) -> str: | |
54 | 69 | """The hexsha to the commit the ref now points to, after the change""" |
55 | 70 | return self[1] |
56 | 71 | |
57 | 72 | @property |
58 | def actor(self): | |
73 | def actor(self) -> Actor: | |
59 | 74 | """Actor instance, providing access""" |
60 | 75 | return self[2] |
61 | 76 | |
62 | 77 | @property |
63 | def time(self): | |
78 | def time(self) -> Tuple[int, int]: | |
64 | 79 | """time as tuple: |
65 | 80 | |
66 | 81 | * [0] = int(time) |
68 | 83 | return self[3] |
69 | 84 | |
70 | 85 | @property |
71 | def message(self): | |
86 | def message(self) -> str: | |
72 | 87 | """Message describing the operation that acted on the reference""" |
73 | 88 | return self[4] |
74 | 89 | |
75 | 90 | @classmethod |
76 | def new(cls, oldhexsha, newhexsha, actor, time, tz_offset, message): # skipcq: PYL-W0621 | |
91 | def new(cls, oldhexsha: str, newhexsha: str, actor: Actor, time: int, tz_offset: int, message: str | |
92 | ) -> 'RefLogEntry': # skipcq: PYL-W0621 | |
77 | 93 | """:return: New instance of a RefLogEntry""" |
78 | 94 | if not isinstance(actor, Actor): |
79 | 95 | raise ValueError("Need actor instance, got %s" % actor) |
81 | 97 | return RefLogEntry((oldhexsha, newhexsha, actor, (time, tz_offset), message)) |
82 | 98 | |
83 | 99 | @classmethod |
84 | def from_line(cls, line): | |
100 | def from_line(cls, line: bytes) -> 'RefLogEntry': | |
85 | 101 | """:return: New RefLogEntry instance from the given revlog line. |
86 | 102 | :param line: line bytes without trailing newline |
87 | 103 | :raise ValueError: If line could not be parsed""" |
88 | line = line.decode(defenc) | |
89 | fields = line.split('\t', 1) | |
104 | line_str = line.decode(defenc) | |
105 | fields = line_str.split('\t', 1) | |
90 | 106 | if len(fields) == 1: |
91 | 107 | info, msg = fields[0], None |
92 | 108 | elif len(fields) == 2: |
93 | 109 | info, msg = fields |
94 | 110 | else: |
95 | 111 | raise ValueError("Line must have up to two TAB-separated fields." |
96 | " Got %s" % repr(line)) | |
112 | " Got %s" % repr(line_str)) | |
97 | 113 | # END handle first split |
98 | 114 | |
99 | 115 | oldhexsha = info[:40] |
110 | 126 | # END handle missing end brace |
111 | 127 | |
112 | 128 | actor = Actor._from_string(info[82:email_end + 1]) |
113 | time, tz_offset = parse_date(info[email_end + 2:]) # skipcq: PYL-W0621 | |
129 | time, tz_offset = parse_date( | |
130 | info[email_end + 2:]) # skipcq: PYL-W0621 | |
114 | 131 | |
115 | 132 | return RefLogEntry((oldhexsha, newhexsha, actor, (time, tz_offset), msg)) |
116 | 133 | |
117 | 134 | |
118 | class RefLog(list, Serializable): | |
119 | ||
120 | """A reflog contains reflog entries, each of which defines a certain state | |
135 | class RefLog(List[RefLogEntry], Serializable): | |
136 | ||
137 | """A reflog contains RefLogEntrys, each of which defines a certain state | |
121 | 138 | of the head in question. Custom query methods allow retrieving log entries |
122 | 139 | by date or by other criteria. |
123 | 140 | |
126 | 143 | |
127 | 144 | __slots__ = ('_path', ) |
128 | 145 | |
129 | def __new__(cls, filepath=None): | |
146 | def __new__(cls, filepath: Union[PathLike, None] = None) -> 'RefLog': | |
130 | 147 | inst = super(RefLog, cls).__new__(cls) |
131 | 148 | return inst |
132 | 149 | |
133 | def __init__(self, filepath=None): | |
150 | def __init__(self, filepath: Union[PathLike, None] = None): | |
134 | 151 | """Initialize this instance with an optional filepath, from which we will |
135 | 152 | initialize our data. The path is also used to write changes back using |
136 | 153 | the write() method""" |
139 | 156 | self._read_from_file() |
140 | 157 | # END handle filepath |
141 | 158 | |
142 | def _read_from_file(self): | |
159 | def _read_from_file(self) -> None: | |
143 | 160 | try: |
144 | fmap = file_contents_ro_filepath(self._path, stream=True, allow_mmap=True) | |
161 | fmap = file_contents_ro_filepath( | |
162 | self._path, stream=True, allow_mmap=True) | |
145 | 163 | except OSError: |
146 | 164 | # it is possible and allowed that the file doesn't exist! |
147 | 165 | return |
153 | 171 | fmap.close() |
154 | 172 | # END handle closing of handle |
155 | 173 | |
156 | #{ Interface | |
157 | ||
158 | @classmethod | |
159 | def from_file(cls, filepath): | |
174 | # { Interface | |
175 | ||
176 | @classmethod | |
177 | def from_file(cls, filepath: PathLike) -> 'RefLog': | |
160 | 178 | """ |
161 | 179 | :return: a new RefLog instance containing all entries from the reflog |
162 | 180 | at the given filepath |
165 | 183 | return cls(filepath) |
166 | 184 | |
167 | 185 | @classmethod |
168 | def path(cls, ref): | |
186 | def path(cls, ref: 'SymbolicReference') -> str: | |
169 | 187 | """ |
170 | 188 | :return: string to absolute path at which the reflog of the given ref |
171 | 189 | instance would be found. The path is not guaranteed to point to a valid |
174 | 192 | return osp.join(ref.repo.git_dir, "logs", to_native_path(ref.path)) |
175 | 193 | |
176 | 194 | @classmethod |
177 | def iter_entries(cls, stream): | |
195 | def iter_entries(cls, stream: Union[str, 'BytesIO', mmap]) -> Iterator[RefLogEntry]: | |
178 | 196 | """ |
179 | 197 | :return: Iterator yielding RefLogEntry instances, one for each line read |
180 | 198 | from the given stream. |
181 | 199 | :param stream: file-like object containing the revlog in its native format |
182 | or basestring instance pointing to a file to read""" | |
200 | or string instance pointing to a file to read""" | |
183 | 201 | new_entry = RefLogEntry.from_line |
184 | 202 | if isinstance(stream, str): |
185 | stream = file_contents_ro_filepath(stream) | |
203 | # default args return mmap on py>3 | |
204 | _stream = file_contents_ro_filepath(stream) | |
205 | assert isinstance(_stream, mmap) | |
206 | else: | |
207 | _stream = stream | |
186 | 208 | # END handle stream type |
187 | 209 | while True: |
188 | line = stream.readline() | |
210 | line = _stream.readline() | |
189 | 211 | if not line: |
190 | 212 | return |
191 | 213 | yield new_entry(line.strip()) |
192 | 214 | # END endless loop |
193 | stream.close() | |
194 | ||
195 | @classmethod | |
196 | def entry_at(cls, filepath, index): | |
197 | """:return: RefLogEntry at the given index | |
215 | ||
216 | @classmethod | |
217 | def entry_at(cls, filepath: PathLike, index: int) -> 'RefLogEntry': | |
218 | """ | |
219 | :return: RefLogEntry at the given index | |
220 | ||
198 | 221 | :param filepath: full path to the index file from which to read the entry |
222 | ||
199 | 223 | :param index: python list compatible index, i.e. it may be negative to |
200 | 224 | specify an entry counted from the end of the list |
201 | 225 | |
209 | 233 | if index < 0: |
210 | 234 | return RefLogEntry.from_line(fp.readlines()[index].strip()) |
211 | 235 | # read until index is reached |
236 | ||
212 | 237 | for i in range(index + 1): |
213 | 238 | line = fp.readline() |
214 | 239 | if not line: |
215 | break | |
240 | raise IndexError( | |
241 | f"Index file ended at line {i+1}, before given index was reached") | |
216 | 242 | # END abort on eof |
217 | 243 | # END handle runup |
218 | 244 | |
219 | if i != index or not line: # skipcq:PYL-W0631 | |
220 | raise IndexError | |
221 | # END handle exception | |
222 | ||
223 | 245 | return RefLogEntry.from_line(line.strip()) |
224 | 246 | # END handle index |
225 | 247 | |
226 | def to_file(self, filepath): | |
248 | def to_file(self, filepath: PathLike) -> None: | |
227 | 249 | """Write the contents of the reflog instance to a file at the given filepath. |
228 | 250 | :param filepath: path to file, parent directories are assumed to exist""" |
229 | 251 | lfd = LockedFD(filepath) |
240 | 262 | # END handle change |
241 | 263 | |
242 | 264 | @classmethod |
243 | def append_entry(cls, config_reader, filepath, oldbinsha, newbinsha, message): | |
265 | def append_entry(cls, config_reader: Union[Actor, 'GitConfigParser', 'SectionConstraint', None], | |
266 | filepath: PathLike, oldbinsha: bytes, newbinsha: bytes, message: str, | |
267 | write: bool = True) -> 'RefLogEntry': | |
244 | 268 | """Append a new log entry to the revlog at filepath. |
245 | 269 | |
246 | 270 | :param config_reader: configuration reader of the repository - used to obtain |
247 | user information. May also be an Actor instance identifying the committer directly. | |
248 | May also be None | |
271 | user information. May also be an Actor instance identifying the committer directly or None. | |
249 | 272 | :param filepath: full path to the log file |
250 | 273 | :param oldbinsha: binary sha of the previous commit |
251 | 274 | :param newbinsha: binary sha of the current commit |
252 | 275 | :param message: message describing the change to the reference |
253 | 276 | :param write: If True, the changes will be written right away. Otherwise |
254 | 277 | the change will not be written |
278 | ||
255 | 279 | :return: RefLogEntry object which was appended to the log |
280 | ||
256 | 281 | :note: As we are append-only, concurrent access is not a problem as we |
257 | 282 | do not interfere with readers.""" |
283 | ||
258 | 284 | if len(oldbinsha) != 20 or len(newbinsha) != 20: |
259 | 285 | raise ValueError("Shas need to be given in binary format") |
260 | 286 | # END handle sha type |
261 | 287 | assure_directory_exists(filepath, is_file=True) |
262 | 288 | first_line = message.split('\n')[0] |
263 | committer = isinstance(config_reader, Actor) and config_reader or Actor.committer(config_reader) | |
289 | if isinstance(config_reader, Actor): | |
290 | committer = config_reader # mypy thinks this is Actor | GitConfigParser, but why? | |
291 | else: | |
292 | committer = Actor.committer(config_reader) | |
264 | 293 | entry = RefLogEntry(( |
265 | 294 | bin_to_hex(oldbinsha).decode('ascii'), |
266 | 295 | bin_to_hex(newbinsha).decode('ascii'), |
267 | committer, (int(time.time()), time.altzone), first_line | |
296 | committer, (int(_time.time()), _time.altzone), first_line | |
268 | 297 | )) |
269 | 298 | |
270 | lf = LockFile(filepath) | |
271 | lf._obtain_lock_or_raise() | |
272 | fd = open(filepath, 'ab') | |
273 | try: | |
274 | fd.write(entry.format().encode(defenc)) | |
275 | finally: | |
276 | fd.close() | |
277 | lf._release_lock() | |
278 | # END handle write operation | |
279 | ||
299 | if write: | |
300 | lf = LockFile(filepath) | |
301 | lf._obtain_lock_or_raise() | |
302 | fd = open(filepath, 'ab') | |
303 | try: | |
304 | fd.write(entry.format().encode(defenc)) | |
305 | finally: | |
306 | fd.close() | |
307 | lf._release_lock() | |
308 | # END handle write operation | |
280 | 309 | return entry |
281 | 310 | |
282 | def write(self): | |
311 | def write(self) -> 'RefLog': | |
283 | 312 | """Write this instance's data to the file we are originating from |
284 | 313 | :return: self""" |
285 | 314 | if self._path is None: |
286 | raise ValueError("Instance was not initialized with a path, use to_file(...) instead") | |
315 | raise ValueError( | |
316 | "Instance was not initialized with a path, use to_file(...) instead") | |
287 | 317 | # END assert path |
288 | 318 | self.to_file(self._path) |
289 | 319 | return self |
290 | 320 | |
291 | #} END interface | |
292 | ||
293 | #{ Serializable Interface | |
294 | def _serialize(self, stream): | |
321 | # } END interface | |
322 | ||
323 | # { Serializable Interface | |
324 | def _serialize(self, stream: 'BytesIO') -> 'RefLog': | |
295 | 325 | write = stream.write |
296 | 326 | |
297 | 327 | # write all entries |
298 | 328 | for e in self: |
299 | 329 | write(e.format().encode(defenc)) |
300 | 330 | # END for each entry |
301 | ||
302 | def _deserialize(self, stream): | |
331 | return self | |
332 | ||
333 | def _deserialize(self, stream: 'BytesIO') -> 'RefLog': | |
303 | 334 | self.extend(self.iter_entries(stream)) |
304 | #} END serializable interface | |
335 | # } END serializable interface | |
336 | return self |
0 | 0 | from git.util import ( |
1 | 1 | LazyMixin, |
2 | Iterable, | |
2 | IterableObj, | |
3 | 3 | ) |
4 | from .symbolic import SymbolicReference | |
4 | from .symbolic import SymbolicReference, T_References | |
5 | ||
6 | ||
7 | # typing ------------------------------------------------------------------ | |
8 | ||
9 | from typing import Any, Callable, Iterator, Type, Union, TYPE_CHECKING # NOQA | |
10 | from git.types import Commit_ish, PathLike, _T # NOQA | |
11 | ||
12 | if TYPE_CHECKING: | |
13 | from git.repo import Repo | |
14 | ||
15 | # ------------------------------------------------------------------------------ | |
5 | 16 | |
6 | 17 | |
7 | 18 | __all__ = ["Reference"] |
9 | 20 | #{ Utilities |
10 | 21 | |
11 | 22 | |
12 | def require_remote_ref_path(func): | |
23 | def require_remote_ref_path(func: Callable[..., _T]) -> Callable[..., _T]: | |
13 | 24 | """A decorator raising a TypeError if we are not a valid remote, based on the path""" |
14 | 25 | |
15 | def wrapper(self, *args): | |
26 | def wrapper(self: T_References, *args: Any) -> _T: | |
16 | 27 | if not self.is_remote(): |
17 | 28 | raise ValueError("ref path does not point to a remote reference: %s" % self.path) |
18 | 29 | return func(self, *args) |
22 | 33 | #}END utilities |
23 | 34 | |
24 | 35 | |
25 | class Reference(SymbolicReference, LazyMixin, Iterable): | |
36 | class Reference(SymbolicReference, LazyMixin, IterableObj): | |
26 | 37 | |
27 | 38 | """Represents a named reference to any object. Subclasses may apply restrictions though, |
28 | 39 | i.e. Heads can only point to commits.""" |
31 | 42 | _resolve_ref_on_create = True |
32 | 43 | _common_path_default = "refs" |
33 | 44 | |
34 | def __init__(self, repo, path, check_path=True): | |
45 | def __init__(self, repo: 'Repo', path: PathLike, check_path: bool = True) -> None: | |
35 | 46 | """Initialize this instance |
36 | 47 | :param repo: Our parent repository |
37 | 48 | |
40 | 51 | refs/heads/master |
41 | 52 | :param check_path: if False, you can provide any path. Otherwise the path must start with the |
42 | 53 | default path prefix of this type.""" |
43 | if check_path and not path.startswith(self._common_path_default + '/'): | |
44 | raise ValueError("Cannot instantiate %r from path %s" % (self.__class__.__name__, path)) | |
54 | if check_path and not str(path).startswith(self._common_path_default + '/'): | |
55 | raise ValueError(f"Cannot instantiate {self.__class__.__name__!r} from path {path}") | |
56 | self.path: str # SymbolicReference converts to string atm | |
45 | 57 | super(Reference, self).__init__(repo, path) |
46 | 58 | |
47 | def __str__(self): | |
59 | def __str__(self) -> str: | |
48 | 60 | return self.name |
49 | 61 | |
50 | 62 | #{ Interface |
51 | 63 | |
52 | def set_object(self, object, logmsg=None): # @ReservedAssignment | |
64 | # @ReservedAssignment | |
65 | def set_object(self, object: Union[Commit_ish, 'SymbolicReference', str], logmsg: Union[str, None] = None | |
66 | ) -> 'Reference': | |
53 | 67 | """Special version which checks if the head-log needs an update as well |
54 | 68 | :return: self""" |
55 | 69 | oldbinsha = None |
83 | 97 | # NOTE: Don't have to overwrite properties, as they will only work without a log |
84 | 98 | |
85 | 99 | @property |
86 | def name(self): | |
100 | def name(self) -> str: | |
87 | 101 | """:return: (shortest) Name of this reference - it may contain path components""" |
88 | 102 | # first two path tokens can be removed as they are |
89 | 103 | # refs/heads or refs/tags or refs/remotes |
93 | 107 | return '/'.join(tokens[2:]) |
94 | 108 | |
95 | 109 | @classmethod |
96 | def iter_items(cls, repo, common_path=None): | |
110 | def iter_items(cls: Type[T_References], repo: 'Repo', common_path: Union[PathLike, None] = None, | |
111 | *args: Any, **kwargs: Any) -> Iterator[T_References]: | |
97 | 112 | """Equivalent to SymbolicReference.iter_items, but will return non-detached |
98 | 113 | references as well.""" |
99 | 114 | return cls._iter_items(repo, common_path) |
102 | 117 | |
103 | 118 | #{ Remote Interface |
104 | 119 | |
105 | @property | |
120 | @property # type: ignore ## mypy cannot deal with properties with an extra decorator (2021-04-21) | |
106 | 121 | @require_remote_ref_path |
107 | def remote_name(self): | |
122 | def remote_name(self) -> str: | |
108 | 123 | """ |
109 | 124 | :return: |
110 | 125 | Name of the remote we are a reference of, such as 'origin' for a reference |
113 | 128 | # /refs/remotes/<remote name>/<branch_name> |
114 | 129 | return tokens[2] |
115 | 130 | |
116 | @property | |
131 | @property # type: ignore ## mypy cannot deal with properties with an extra decorator (2021-04-21) | |
117 | 132 | @require_remote_ref_path |
118 | def remote_head(self): | |
133 | def remote_head(self) -> str: | |
119 | 134 | """:return: Name of the remote head itself, i.e. master. |
120 | 135 | :note: The returned name is usually not qualified enough to uniquely identify |
121 | 136 | a branch""" |
0 | 0 | import os |
1 | 1 | |
2 | 2 | from git.util import join_path |
3 | ||
4 | import os.path as osp | |
5 | 3 | |
6 | 4 | from .head import Head |
7 | 5 | |
8 | 6 | |
9 | 7 | __all__ = ["RemoteReference"] |
8 | ||
9 | # typing ------------------------------------------------------------------ | |
10 | ||
11 | from typing import Any, Iterator, NoReturn, Union, TYPE_CHECKING | |
12 | from git.types import PathLike | |
13 | ||
14 | ||
15 | if TYPE_CHECKING: | |
16 | from git.repo import Repo | |
17 | from git import Remote | |
18 | ||
19 | # ------------------------------------------------------------------------------ | |
10 | 20 | |
11 | 21 | |
12 | 22 | class RemoteReference(Head): |
15 | 25 | _common_path_default = Head._remote_common_path_default |
16 | 26 | |
17 | 27 | @classmethod |
18 | def iter_items(cls, repo, common_path=None, remote=None): | |
28 | def iter_items(cls, repo: 'Repo', common_path: Union[PathLike, None] = None, | |
29 | remote: Union['Remote', None] = None, *args: Any, **kwargs: Any | |
30 | ) -> Iterator['RemoteReference']: | |
19 | 31 | """Iterate remote references, and if given, constrain them to the given remote""" |
20 | 32 | common_path = common_path or cls._common_path_default |
21 | 33 | if remote is not None: |
22 | 34 | common_path = join_path(common_path, str(remote)) |
23 | 35 | # END handle remote constraint |
36 | # super is Reference | |
24 | 37 | return super(RemoteReference, cls).iter_items(repo, common_path) |
25 | 38 | |
26 | @classmethod | |
27 | def delete(cls, repo, *refs, **kwargs): | |
39 | @classmethod | |
40 | def delete(cls, repo: 'Repo', *refs: 'RemoteReference', **kwargs: Any) -> None: | |
28 | 41 | """Delete the given remote references |
29 | 42 | |
30 | 43 | :note: |
36 | 49 | # and delete remainders manually |
37 | 50 | for ref in refs: |
38 | 51 | try: |
39 | os.remove(osp.join(repo.common_dir, ref.path)) | |
52 | os.remove(os.path.join(repo.common_dir, ref.path)) | |
40 | 53 | except OSError: |
41 | 54 | pass |
42 | 55 | try: |
43 | os.remove(osp.join(repo.git_dir, ref.path)) | |
56 | os.remove(os.path.join(repo.git_dir, ref.path)) | |
44 | 57 | except OSError: |
45 | 58 | pass |
46 | 59 | # END for each ref |
47 | 60 | |
48 | @classmethod | |
49 | def create(cls, *args, **kwargs): | |
61 | @classmethod | |
62 | def create(cls, *args: Any, **kwargs: Any) -> NoReturn: | |
50 | 63 | """Used to disable this method""" |
51 | 64 | raise TypeError("Cannot explicitly create remote references") |
0 | from git.types import PathLike | |
0 | 1 | import os |
1 | 2 | |
2 | 3 | from git.compat import defenc |
3 | from git.objects import Object, Commit | |
4 | from git.objects import Object | |
5 | from git.objects.commit import Commit | |
4 | 6 | from git.util import ( |
5 | 7 | join_path, |
6 | 8 | join_path_native, |
14 | 16 | BadName |
15 | 17 | ) |
16 | 18 | |
17 | import os.path as osp | |
18 | ||
19 | 19 | from .log import RefLog |
20 | 20 | |
21 | # typing ------------------------------------------------------------------ | |
22 | ||
23 | from typing import Any, Iterator, List, Tuple, Type, TypeVar, Union, TYPE_CHECKING, cast # NOQA | |
24 | from git.types import Commit_ish, PathLike # NOQA | |
25 | ||
26 | if TYPE_CHECKING: | |
27 | from git.repo import Repo | |
28 | from git.refs import Head, TagReference, RemoteReference, Reference | |
29 | from .log import RefLogEntry | |
30 | from git.config import GitConfigParser | |
31 | from git.objects.commit import Actor | |
32 | ||
33 | ||
34 | T_References = TypeVar('T_References', bound='SymbolicReference') | |
35 | ||
36 | # ------------------------------------------------------------------------------ | |
37 | ||
21 | 38 | |
22 | 39 | __all__ = ["SymbolicReference"] |
23 | 40 | |
24 | 41 | |
25 | def _git_dir(repo, path): | |
42 | def _git_dir(repo: 'Repo', path: Union[PathLike, None]) -> PathLike: | |
26 | 43 | """ Find the git dir that's appropriate for the path""" |
27 | name = "%s" % (path,) | |
44 | name = f"{path}" | |
28 | 45 | if name in ['HEAD', 'ORIG_HEAD', 'FETCH_HEAD', 'index', 'logs']: |
29 | 46 | return repo.git_dir |
30 | 47 | return repo.common_dir |
44 | 61 | _remote_common_path_default = "refs/remotes" |
45 | 62 | _id_attribute_ = "name" |
46 | 63 | |
47 | def __init__(self, repo, path): | |
64 | def __init__(self, repo: 'Repo', path: PathLike, check_path: bool = False): | |
48 | 65 | self.repo = repo |
49 | 66 | self.path = path |
50 | 67 | |
51 | def __str__(self): | |
52 | return self.path | |
53 | ||
54 | def __repr__(self): | |
68 | def __str__(self) -> str: | |
69 | return str(self.path) | |
70 | ||
71 | def __repr__(self) -> str: | |
55 | 72 | return '<git.%s "%s">' % (self.__class__.__name__, self.path) |
56 | 73 | |
57 | def __eq__(self, other): | |
74 | def __eq__(self, other: object) -> bool: | |
58 | 75 | if hasattr(other, 'path'): |
76 | other = cast(SymbolicReference, other) | |
59 | 77 | return self.path == other.path |
60 | 78 | return False |
61 | 79 | |
62 | def __ne__(self, other): | |
80 | def __ne__(self, other: object) -> bool: | |
63 | 81 | return not (self == other) |
64 | 82 | |
65 | def __hash__(self): | |
83 | def __hash__(self) -> int: | |
66 | 84 | return hash(self.path) |
67 | 85 | |
68 | 86 | @property |
69 | def name(self): | |
87 | def name(self) -> str: | |
70 | 88 | """ |
71 | 89 | :return: |
72 | 90 | In case of symbolic references, the shortest assumable name |
73 | 91 | is the path itself.""" |
74 | return self.path | |
92 | return str(self.path) | |
75 | 93 | |
76 | 94 | @property |
77 | def abspath(self): | |
95 | def abspath(self) -> PathLike: | |
78 | 96 | return join_path_native(_git_dir(self.repo, self.path), self.path) |
79 | 97 | |
80 | 98 | @classmethod |
81 | def _get_packed_refs_path(cls, repo): | |
82 | return osp.join(repo.common_dir, 'packed-refs') | |
83 | ||
84 | @classmethod | |
85 | def _iter_packed_refs(cls, repo): | |
86 | """Returns an iterator yielding pairs of sha1/path pairs (as bytes) for the corresponding refs. | |
99 | def _get_packed_refs_path(cls, repo: 'Repo') -> str: | |
100 | return os.path.join(repo.common_dir, 'packed-refs') | |
101 | ||
102 | @classmethod | |
103 | def _iter_packed_refs(cls, repo: 'Repo') -> Iterator[Tuple[str, str]]: | |
104 | """Returns an iterator yielding pairs of sha1/path pairs (as strings) for the corresponding refs. | |
87 | 105 | :note: The packed refs file will be kept open as long as we iterate""" |
88 | 106 | try: |
89 | with open(cls._get_packed_refs_path(repo), 'rt') as fp: | |
107 | with open(cls._get_packed_refs_path(repo), 'rt', encoding='UTF-8') as fp: | |
90 | 108 | for line in fp: |
91 | 109 | line = line.strip() |
92 | 110 | if not line: |
111 | 129 | if line[0] == '^': |
112 | 130 | continue |
113 | 131 | |
114 | yield tuple(line.split(' ', 1)) | |
132 | yield cast(Tuple[str, str], tuple(line.split(' ', 1))) | |
115 | 133 | # END for each line |
116 | except (OSError, IOError): | |
117 | return | |
134 | except OSError: | |
135 | return None | |
118 | 136 | # END no packed-refs file handling |
119 | 137 | # NOTE: Had try-finally block around here to close the fp, |
120 | 138 | # but some python version wouldn't allow yields within that. |
122 | 140 | # alright. |
123 | 141 | |
124 | 142 | @classmethod |
125 | def dereference_recursive(cls, repo, ref_path): | |
143 | def dereference_recursive(cls, repo: 'Repo', ref_path: Union[PathLike, None]) -> str: | |
126 | 144 | """ |
127 | 145 | :return: hexsha stored in the reference at the given ref_path, recursively dereferencing all |
128 | 146 | intermediate references as required |
129 | 147 | :param repo: the repository containing the reference at ref_path""" |
148 | ||
130 | 149 | while True: |
131 | 150 | hexsha, ref_path = cls._get_ref_info(repo, ref_path) |
132 | 151 | if hexsha is not None: |
134 | 153 | # END recursive dereferencing |
135 | 154 | |
136 | 155 | @classmethod |
137 | def _get_ref_info_helper(cls, repo, ref_path): | |
156 | def _get_ref_info_helper(cls, repo: 'Repo', ref_path: Union[PathLike, None] | |
157 | ) -> Union[Tuple[str, None], Tuple[None, str]]: | |
138 | 158 | """Return: (str(sha), str(target_ref_path)) if available, the sha the file at |
139 | 159 | rela_path points to, or None. target_ref_path is the reference we |
140 | 160 | point to, or None""" |
141 | tokens = None | |
161 | tokens: Union[None, List[str], Tuple[str, str]] = None | |
142 | 162 | repodir = _git_dir(repo, ref_path) |
143 | 163 | try: |
144 | with open(osp.join(repodir, ref_path), 'rt', encoding='UTF-8') as fp: | |
164 | with open(os.path.join(repodir, str(ref_path)), 'rt', encoding='UTF-8') as fp: | |
145 | 165 | value = fp.read().rstrip() |
146 | 166 | # Don't only split on spaces, but on whitespace, which allows to parse lines like |
147 | 167 | # 60b64ef992065e2600bfef6187a97f92398a9144 branch 'master' of git-server:/path/to/repo |
148 | 168 | tokens = value.split() |
149 | 169 | assert(len(tokens) != 0) |
150 | except (OSError, IOError): | |
170 | except OSError: | |
151 | 171 | # Probably we are just packed, find our entry in the packed refs file |
152 | 172 | # NOTE: We are not a symbolic ref if we are in a packed file, as these |
153 | 173 | # are excluded explicitly |
173 | 193 | raise ValueError("Failed to parse reference information from %r" % ref_path) |
174 | 194 | |
175 | 195 | @classmethod |
176 | def _get_ref_info(cls, repo, ref_path): | |
196 | def _get_ref_info(cls, repo: 'Repo', ref_path: Union[PathLike, None]) -> Union[Tuple[str, None], Tuple[None, str]]: | |
177 | 197 | """Return: (str(sha), str(target_ref_path)) if available, the sha the file at |
178 | 198 | rela_path points to, or None. target_ref_path is the reference we |
179 | 199 | point to, or None""" |
180 | 200 | return cls._get_ref_info_helper(repo, ref_path) |
181 | 201 | |
182 | def _get_object(self): | |
202 | def _get_object(self) -> Commit_ish: | |
183 | 203 | """ |
184 | 204 | :return: |
185 | 205 | The object our ref currently refers to. Refs can be cached, they will |
188 | 208 | # Our path will be resolved to the hexsha which will be used accordingly |
189 | 209 | return Object.new_from_sha(self.repo, hex_to_bin(self.dereference_recursive(self.repo, self.path))) |
190 | 210 | |
191 | def _get_commit(self): | |
211 | def _get_commit(self) -> 'Commit': | |
192 | 212 | """ |
193 | 213 | :return: |
194 | 214 | Commit object we point to, works for detached and non-detached |
203 | 223 | # END handle type |
204 | 224 | return obj |
205 | 225 | |
206 | def set_commit(self, commit, logmsg=None): | |
226 | def set_commit(self, commit: Union[Commit, 'SymbolicReference', str], logmsg: Union[str, None] = None | |
227 | ) -> 'SymbolicReference': | |
207 | 228 | """As set_object, but restricts the type of object to be a Commit |
208 | 229 | |
209 | 230 | :raise ValueError: If commit is not a Commit object or doesn't point to |
232 | 253 | |
233 | 254 | return self |
234 | 255 | |
235 | def set_object(self, object, logmsg=None): # @ReservedAssignment | |
256 | def set_object(self, object: Union[Commit_ish, 'SymbolicReference', str], logmsg: Union[str, None] = None | |
257 | ) -> 'SymbolicReference': | |
236 | 258 | """Set the object we point to, possibly dereference our symbolic reference first. |
237 | 259 | If the reference does not exist, it will be created |
238 | 260 | |
259 | 281 | # set the commit on our reference |
260 | 282 | return self._get_reference().set_object(object, logmsg) |
261 | 283 | |
262 | commit = property(_get_commit, set_commit, doc="Query or set commits directly") | |
263 | object = property(_get_object, set_object, doc="Return the object our ref currently refers to") | |
264 | ||
265 | def _get_reference(self): | |
284 | commit = property(_get_commit, set_commit, doc="Query or set commits directly") # type: ignore | |
285 | object = property(_get_object, set_object, doc="Return the object our ref currently refers to") # type: ignore | |
286 | ||
287 | def _get_reference(self) -> 'SymbolicReference': | |
266 | 288 | """:return: Reference Object we point to |
267 | 289 | :raise TypeError: If this symbolic reference is detached, hence it doesn't point |
268 | 290 | to a reference, but to a commit""" |
271 | 293 | raise TypeError("%s is a detached symbolic reference as it points to %r" % (self, sha)) |
272 | 294 | return self.from_path(self.repo, target_ref_path) |
273 | 295 | |
274 | def set_reference(self, ref, logmsg=None): | |
296 | def set_reference(self, ref: Union[Commit_ish, 'SymbolicReference', str], | |
297 | logmsg: Union[str, None] = None) -> 'SymbolicReference': | |
275 | 298 | """Set ourselves to the given ref. It will stay a symbol if the ref is a Reference. |
276 | 299 | Otherwise an Object, given as Object instance or refspec, is assumed and if valid, |
277 | 300 | will be set which effectively detaches the reference if it was a purely
312 | 335 | raise TypeError("Require commit, got %r" % obj) |
313 | 336 | # END verify type |
314 | 337 | |
315 | oldbinsha = None | |
338 | oldbinsha: bytes = b'' | |
316 | 339 | if logmsg is not None: |
317 | 340 | try: |
318 | 341 | oldbinsha = self.commit.binsha |
341 | 364 | return self |
342 | 365 | |
343 | 366 | # aliased reference |
344 | reference = property(_get_reference, set_reference, doc="Returns the Reference we point to") | |
367 | reference: Union['Head', 'TagReference', 'RemoteReference', 'Reference'] | |
368 | reference = property(_get_reference, set_reference, doc="Returns the Reference we point to") # type: ignore | |
345 | 369 | ref = reference |
346 | 370 | |
347 | def is_valid(self): | |
371 | def is_valid(self) -> bool: | |
348 | 372 | """ |
349 | 373 | :return: |
350 | 374 | True if the reference is valid, hence it can be read and points to |
357 | 381 | return True |
358 | 382 | |
359 | 383 | @property |
360 | def is_detached(self): | |
384 | def is_detached(self) -> bool: | |
361 | 385 | """ |
362 | 386 | :return: |
363 | 387 | True if we are a detached reference, hence we point to a specific commit |
368 | 392 | except TypeError: |
369 | 393 | return True |
370 | 394 | |
371 | def log(self): | |
395 | def log(self) -> 'RefLog': | |
372 | 396 | """ |
373 | 397 | :return: RefLog for this reference. Its last entry reflects the latest change |
374 | 398 | applied to this reference |
377 | 401 | instead of calling this method repeatedly. It should be considered read-only.""" |
378 | 402 | return RefLog.from_file(RefLog.path(self)) |
379 | 403 | |
380 | def log_append(self, oldbinsha, message, newbinsha=None): | |
404 | def log_append(self, oldbinsha: bytes, message: Union[str, None], | |
405 | newbinsha: Union[bytes, None] = None) -> 'RefLogEntry': | |
381 | 406 | """Append a logentry to the logfile of this ref |
382 | 407 | |
383 | 408 | :param oldbinsha: binary sha this ref used to point to |
389 | 414 | # correct to allow overriding the committer on a per-commit level. |
390 | 415 | # See https://github.com/gitpython-developers/GitPython/pull/146 |
391 | 416 | try: |
392 | committer_or_reader = self.commit.committer | |
417 | committer_or_reader: Union['Actor', 'GitConfigParser'] = self.commit.committer | |
393 | 418 | except ValueError: |
394 | 419 | committer_or_reader = self.repo.config_reader() |
395 | 420 | # end handle newly cloned repositories |
396 | return RefLog.append_entry(committer_or_reader, RefLog.path(self), oldbinsha, | |
397 | (newbinsha is None and self.commit.binsha) or newbinsha, | |
398 | message) | |
399 | ||
400 | def log_entry(self, index): | |
421 | if newbinsha is None: | |
422 | newbinsha = self.commit.binsha | |
423 | ||
424 | if message is None: | |
425 | message = '' | |
426 | ||
427 | return RefLog.append_entry(committer_or_reader, RefLog.path(self), oldbinsha, newbinsha, message) | |
428 | ||
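The rewrite above replaces the `(newbinsha is None and self.commit.binsha) or newbinsha` expression with an explicit `None` check. Besides readability, the `and`/`or` idiom returns the wrong value whenever the fallback itself is falsy. A standalone sketch of the two behaviours (the byte strings are hypothetical stand-ins for binary shas):

```python
def pick_old(newbinsha, commit_binsha):
    # the pre-rewrite expression from log_append
    return (newbinsha is None and commit_binsha) or newbinsha

def pick_new(newbinsha, commit_binsha):
    # the explicit rewrite: fall back only when newbinsha is None
    if newbinsha is None:
        newbinsha = commit_binsha
    return newbinsha

head = b"\x01" * 20  # stand-in for self.commit.binsha

# Both agree on the common cases ...
assert pick_old(None, head) == pick_new(None, head) == head
assert pick_old(b"\x02" * 20, head) == b"\x02" * 20

# ... but the and/or idiom collapses to None when the fallback is falsy:
assert pick_old(None, b"") is None
assert pick_new(None, b"") == b""
```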
429 | def log_entry(self, index: int) -> 'RefLogEntry': | |
401 | 430 | """:return: RefLogEntry at the given index |
402 | 431 | :param index: python list compatible positive or negative index |
403 | 432 | |
407 | 436 | return RefLog.entry_at(RefLog.path(self), index) |
408 | 437 | |
409 | 438 | @classmethod |
410 | def to_full_path(cls, path): | |
439 | def to_full_path(cls, path: Union[PathLike, 'SymbolicReference']) -> PathLike: | |
411 | 440 | """ |
412 | 441 | :return: string with a full repository-relative path which can be used to initialize |
413 | 442 | a Reference instance, for instance by using ``Reference.from_path``""" |
416 | 445 | full_ref_path = path |
417 | 446 | if not cls._common_path_default: |
418 | 447 | return full_ref_path |
419 | if not path.startswith(cls._common_path_default + "/"): | |
448 | if not str(path).startswith(cls._common_path_default + "/"): | |
420 | 449 | full_ref_path = '%s/%s' % (cls._common_path_default, path) |
421 | 450 | return full_ref_path |
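The `to_full_path` logic above prefixes a partial ref name with the class's `_common_path_default` unless it is already present, or the class (like `SymbolicReference` itself) has no common prefix at all. A simplified standalone sketch of that rule, where `common_prefix` stands in for `cls._common_path_default`:

```python
def to_full_path(path, common_prefix="refs"):
    # str() mirrors the new code's handling of os.PathLike inputs
    path = str(path)
    if not common_prefix:       # e.g. SymbolicReference has no prefix
        return path
    if not path.startswith(common_prefix + "/"):
        return "%s/%s" % (common_prefix, path)
    return path

assert to_full_path("heads/master") == "refs/heads/master"
assert to_full_path("refs/heads/master") == "refs/heads/master"  # already full
assert to_full_path("HEAD", common_prefix="") == "HEAD"
```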
422 | 451 | |
423 | 452 | @classmethod |
424 | def delete(cls, repo, path): | |
453 | def delete(cls, repo: 'Repo', path: PathLike) -> None: | |
425 | 454 | """Delete the reference at the given path |
426 | 455 | |
427 | 456 | :param repo: |
432 | 461 | or just "myreference", hence 'refs/' is implied. |
433 | 462 | Alternatively the symbolic reference to be deleted""" |
434 | 463 | full_ref_path = cls.to_full_path(path) |
435 | abs_path = osp.join(repo.common_dir, full_ref_path) | |
436 | if osp.exists(abs_path): | |
464 | abs_path = os.path.join(repo.common_dir, full_ref_path) | |
465 | if os.path.exists(abs_path): | |
437 | 466 | os.remove(abs_path) |
438 | 467 | else: |
439 | 468 | # check packed refs |
443 | 472 | new_lines = [] |
444 | 473 | made_change = False |
445 | 474 | dropped_last_line = False |
446 | for line in reader: | |
447 | line = line.decode(defenc) | |
475 | for line_bytes in reader: | |
476 | line = line_bytes.decode(defenc) | |
448 | 477 | _, _, line_ref = line.partition(' ') |
449 | 478 | line_ref = line_ref.strip() |
450 | 479 | # keep line if it is a comment or if the ref to delete is not |
469 | 498 | with open(pack_file_path, 'wb') as fd: |
470 | 499 | fd.writelines(line.encode(defenc) for line in new_lines) |
471 | 500 | |
472 | except (OSError, IOError): | |
501 | except OSError: | |
473 | 502 | pass # it didn't exist at all |
474 | 503 | |
475 | 504 | # delete the reflog |
476 | 505 | reflog_path = RefLog.path(cls(repo, full_ref_path)) |
477 | if osp.isfile(reflog_path): | |
506 | if os.path.isfile(reflog_path): | |
478 | 507 | os.remove(reflog_path) |
479 | 508 | # END remove reflog |
480 | 509 | |
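When no loose ref file exists, `delete` above falls back to rewriting the packed-refs file, keeping comment lines and every entry except the ref being removed. A simplified sketch of that filtering step (the sample file contents are hypothetical):

```python
packed = [
    "# pack-refs with: peeled fully-peeled sorted",
    "aaaa111122223333444455556666777788889999 refs/heads/master",
    "bbbb222233334444555566667777888899990000 refs/tags/v1.0",
]

def drop_ref(lines, full_ref_path):
    kept = []
    for line in lines:
        _, _, line_ref = line.partition(' ')
        # keep comments and any ref other than the one being deleted
        if line.startswith('#') or line_ref.strip() != full_ref_path:
            kept.append(line)
    return kept

assert drop_ref(packed, "refs/tags/v1.0") == packed[:2]
assert drop_ref(packed, "refs/heads/missing") == packed  # nothing to drop
```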
481 | 510 | @classmethod |
482 | def _create(cls, repo, path, resolve, reference, force, logmsg=None): | |
511 | def _create(cls: Type[T_References], repo: 'Repo', path: PathLike, resolve: bool, | |
512 | reference: Union['SymbolicReference', str], force: bool, | |
513 | logmsg: Union[str, None] = None) -> T_References: | |
483 | 514 | """internal method used to create a new symbolic reference. |
484 | 515 | If resolve is False, the reference will be taken as is, creating |
485 | 516 | a proper symbolic reference. Otherwise it will be resolved to the |
487 | 518 | instead""" |
488 | 519 | git_dir = _git_dir(repo, path) |
489 | 520 | full_ref_path = cls.to_full_path(path) |
490 | abs_ref_path = osp.join(git_dir, full_ref_path) | |
521 | abs_ref_path = os.path.join(git_dir, full_ref_path) | |
491 | 522 | |
492 | 523 | # figure out target data |
493 | 524 | target = reference |
494 | 525 | if resolve: |
495 | 526 | target = repo.rev_parse(str(reference)) |
496 | 527 | |
497 | if not force and osp.isfile(abs_ref_path): | |
528 | if not force and os.path.isfile(abs_ref_path): | |
498 | 529 | target_data = str(target) |
499 | 530 | if isinstance(target, SymbolicReference): |
500 | target_data = target.path | |
531 | target_data = str(target.path) | |
501 | 532 | if not resolve: |
502 | 533 | target_data = "ref: " + target_data |
503 | 534 | with open(abs_ref_path, 'rb') as fd: |
512 | 543 | return ref |
513 | 544 | |
514 | 545 | @classmethod |
515 | def create(cls, repo, path, reference='HEAD', force=False, logmsg=None): | |
516 | """Create a new symbolic reference, hence a reference pointing to another reference. | |
546 | def create(cls: Type[T_References], repo: 'Repo', path: PathLike, | |
547 | reference: Union['SymbolicReference', str] = 'HEAD', | |
548 | logmsg: Union[str, None] = None, force: bool = False, **kwargs: Any) -> T_References: | |
549 | """Create a new symbolic reference, hence a reference pointing to another reference. | |
517 | 550 | |
518 | 551 | :param repo: |
519 | 552 | Repository to create the reference in |
543 | 576 | :note: This does not alter the current HEAD, index or Working Tree""" |
544 | 577 | return cls._create(repo, path, cls._resolve_ref_on_create, reference, force, logmsg) |
545 | 578 | |
546 | def rename(self, new_path, force=False): | |
579 | def rename(self, new_path: PathLike, force: bool = False) -> 'SymbolicReference': | |
547 | 580 | """Rename self to a new path |
548 | 581 | |
549 | 582 | :param new_path: |
561 | 594 | if self.path == new_path: |
562 | 595 | return self |
563 | 596 | |
564 | new_abs_path = osp.join(_git_dir(self.repo, new_path), new_path) | |
565 | cur_abs_path = osp.join(_git_dir(self.repo, self.path), self.path) | |
566 | if osp.isfile(new_abs_path): | |
597 | new_abs_path = os.path.join(_git_dir(self.repo, new_path), new_path) | |
598 | cur_abs_path = os.path.join(_git_dir(self.repo, self.path), self.path) | |
599 | if os.path.isfile(new_abs_path): | |
567 | 600 | if not force: |
568 | 601 | # if they point to the same file, it's not an error
569 | 602 | with open(new_abs_path, 'rb') as fd1: |
578 | 611 | os.remove(new_abs_path) |
579 | 612 | # END handle existing target file |
580 | 613 | |
581 | dname = osp.dirname(new_abs_path) | |
582 | if not osp.isdir(dname): | |
614 | dname = os.path.dirname(new_abs_path) | |
615 | if not os.path.isdir(dname): | |
583 | 616 | os.makedirs(dname) |
584 | 617 | # END create directory |
585 | 618 | |
589 | 622 | return self |
590 | 623 | |
591 | 624 | @classmethod |
592 | def _iter_items(cls, repo, common_path=None): | |
625 | def _iter_items(cls: Type[T_References], repo: 'Repo', common_path: Union[PathLike, None] = None | |
626 | ) -> Iterator[T_References]: | |
593 | 627 | if common_path is None: |
594 | 628 | common_path = cls._common_path_default |
595 | 629 | rela_paths = set() |
613 | 647 | |
614 | 648 | # read packed refs |
615 | 649 | for _sha, rela_path in cls._iter_packed_refs(repo): |
616 | if rela_path.startswith(common_path): | |
650 | if rela_path.startswith(str(common_path)): | |
617 | 651 | rela_paths.add(rela_path) |
618 | 652 | # END relative path matches common path |
619 | 653 | # END packed refs reading |
627 | 661 | # END for each sorted relative refpath |
628 | 662 | |
629 | 663 | @classmethod |
630 | def iter_items(cls, repo, common_path=None): | |
664 | def iter_items(cls: Type[T_References], repo: 'Repo', common_path: Union[PathLike, None] = None, | |
665 | *args: Any, **kwargs: Any) -> Iterator[T_References]: | |
631 | 666 | """Find all refs in the repository |
632 | 667 | |
633 | 668 | :param repo: is the Repo |
647 | 682 | return (r for r in cls._iter_items(repo, common_path) if r.__class__ == SymbolicReference or not r.is_detached) |
648 | 683 | |
649 | 684 | @classmethod |
650 | def from_path(cls, repo, path): | |
685 | def from_path(cls: Type[T_References], repo: 'Repo', path: PathLike) -> T_References: | |
651 | 686 | """ |
652 | 687 | :param path: full .git-directory-relative path name to the Reference to instantiate |
653 | 688 | :note: use to_full_path() if you only have a partial path of a known Reference Type |
662 | 697 | from . import HEAD, Head, RemoteReference, TagReference, Reference |
663 | 698 | for ref_type in (HEAD, Head, RemoteReference, TagReference, Reference, SymbolicReference): |
664 | 699 | try: |
700 | instance: T_References | |
665 | 701 | instance = ref_type(repo, path) |
666 | 702 | if instance.__class__ == SymbolicReference and instance.is_detached: |
667 | 703 | raise ValueError("SymbolRef was detached, we drop it") |
668 | return instance | |
704 | else: | |
705 | return instance | |
706 | ||
669 | 707 | except ValueError: |
670 | 708 | pass |
671 | 709 | # END exception handling |
672 | 710 | # END for each type to try |
673 | 711 | raise ValueError("Could not find reference type suitable to handle path %r" % path) |
674 | 712 | |
675 | def is_remote(self): | |
713 | def is_remote(self) -> bool: | |
676 | 714 | """:return: True if this symbolic reference points to a remote branch""" |
677 | return self.path.startswith(self._remote_common_path_default + "/") | |
715 | return str(self.path).startswith(self._remote_common_path_default + "/") |
0 | 0 | from .reference import Reference |
1 | 1 | |
2 | 2 | __all__ = ["TagReference", "Tag"] |
3 | ||
4 | # typing ------------------------------------------------------------------ | |
5 | ||
6 | from typing import Any, Type, Union, TYPE_CHECKING | |
7 | from git.types import Commit_ish, PathLike | |
8 | ||
9 | if TYPE_CHECKING: | |
10 | from git.repo import Repo | |
11 | from git.objects import Commit | |
12 | from git.objects import TagObject | |
13 | from git.refs import SymbolicReference | |
14 | ||
15 | ||
16 | # ------------------------------------------------------------------------------ | |
3 | 17 | |
4 | 18 | |
5 | 19 | class TagReference(Reference): |
17 | 31 | print(tagref.tag.message)""" |
18 | 32 | |
19 | 33 | __slots__ = () |
20 | _common_path_default = "refs/tags" | |
34 | _common_default = "tags" | |
35 | _common_path_default = Reference._common_path_default + "/" + _common_default | |
21 | 36 | |
22 | 37 | @property |
23 | def commit(self): | |
38 | def commit(self) -> 'Commit': # type: ignore[override] # LazyMixin has unrelated commit method | |
24 | 39 | """:return: Commit object the tag ref points to |
25 | ||
40 | ||
26 | 41 | :raise ValueError: if the tag points to a tree or blob""" |
27 | 42 | obj = self.object |
28 | 43 | while obj.type != 'commit': |
35 | 50 | return obj |
36 | 51 | |
37 | 52 | @property |
38 | def tag(self): | |
53 | def tag(self) -> Union['TagObject', None]: | |
39 | 54 | """ |
40 | 55 | :return: Tag object this tag ref points to or None in case |
41 | 56 | we are a light weight tag""" |
46 | 61 | |
47 | 62 | # make object read-only |
48 | 63 | # It should be reasonably hard to adjust an existing tag |
49 | object = property(Reference._get_object) | |
64 | ||
65 | # object = property(Reference._get_object) | |
66 | @property | |
67 | def object(self) -> Commit_ish: # type: ignore[override] | |
68 | return Reference._get_object(self) | |
50 | 69 | |
51 | 70 | @classmethod |
52 | def create(cls, repo, path, ref='HEAD', message=None, force=False, **kwargs): | |
71 | def create(cls: Type['TagReference'], repo: 'Repo', path: PathLike, | |
72 | reference: Union[str, 'SymbolicReference'] = 'HEAD', | |
73 | logmsg: Union[str, None] = None, | |
74 | force: bool = False, **kwargs: Any) -> 'TagReference': | |
53 | 75 | """Create a new tag reference. |
54 | 76 | |
55 | 77 | :param path: |
57 | 79 | The prefix refs/tags is implied |
58 | 80 | |
59 | 81 | :param ref: |
60 | A reference to the object you want to tag. It can be a commit, tree or | |
82 | A reference to the Object you want to tag. The Object can be a commit, tree or | |
61 | 83 | blob. |
62 | 84 | |
63 | :param message: | |
85 | :param logmsg: | |
64 | 86 | If not None, the message will be used in your tag object. This will also |
65 | 87 | create an additional tag object that allows obtaining that information, i.e.::
66 | 88 | |
67 | 89 | tagref.tag.message |
90 | ||
91 | :param message: | |
92 | Synonym for :param logmsg: | |
93 | Included for backwards compatibility. :param logmsg: is used in preference if both are given. | |
68 | 94 | |
69 | 95 | :param force: |
70 | 96 | If True, to force creation of a tag even though that tag already exists. |
73 | 99 | Additional keyword arguments to be passed to git-tag |
74 | 100 | |
75 | 101 | :return: A new TagReference""" |
76 | args = (path, ref) | |
77 | if message: | |
78 | kwargs['m'] = message | |
102 | if 'ref' in kwargs and kwargs['ref']: | |
103 | reference = kwargs['ref'] | |
104 | ||
105 | if logmsg: | |
106 | kwargs['m'] = logmsg | |
107 | elif 'message' in kwargs and kwargs['message']: | |
108 | kwargs['m'] = kwargs['message'] | |
109 | ||
79 | 110 | if force: |
80 | 111 | kwargs['f'] = True |
112 | ||
113 | args = (path, reference) | |
81 | 114 | |
82 | 115 | repo.git.tag(*args, **kwargs) |
83 | 116 | return TagReference(repo, "%s/%s" % (cls._common_path_default, path)) |
84 | 117 | |
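`TagReference.create` above renames the `message` keyword to `logmsg` while still honouring the old spelling, with `logmsg` winning if both are passed. A standalone sketch of just that resolution step (a hypothetical helper; `'m'` is the short option ultimately handed to git-tag):

```python
def resolve_message(logmsg=None, **kwargs):
    # mirrors the logmsg/message precedence in TagReference.create above
    out = {}
    if logmsg:
        out['m'] = logmsg
    elif kwargs.get('message'):
        out['m'] = kwargs['message']
    return out

assert resolve_message(logmsg="release 1.0") == {'m': "release 1.0"}
assert resolve_message(message="old keyword") == {'m': "old keyword"}
assert resolve_message(logmsg="wins", message="loses") == {'m': "wins"}
assert resolve_message() == {}  # lightweight tag: no -m passed at all
```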
85 | 118 | @classmethod |
86 | def delete(cls, repo, *tags): | |
119 | def delete(cls, repo: 'Repo', *tags: 'TagReference') -> None: # type: ignore[override] | |
87 | 120 | """Delete the given existing tag or tags""" |
88 | 121 | repo.git.tag("-d", *tags) |
89 | 122 |
8 | 8 | import re |
9 | 9 | |
10 | 10 | from git.cmd import handle_process_output, Git |
11 | from git.compat import (defenc, force_text, is_win) | |
11 | from git.compat import (defenc, force_text) | |
12 | 12 | from git.exc import GitCommandError |
13 | 13 | from git.util import ( |
14 | 14 | LazyMixin, |
15 | Iterable, | |
15 | IterableObj, | |
16 | 16 | IterableList, |
17 | 17 | RemoteProgress, |
18 | CallableRemoteProgress | |
18 | CallableRemoteProgress, | |
19 | 19 | ) |
20 | 20 | from git.util import ( |
21 | 21 | join_path, |
22 | 22 | ) |
23 | 23 | |
24 | 24 | from .config import ( |
25 | GitConfigParser, | |
25 | 26 | SectionConstraint, |
26 | 27 | cp, |
27 | 28 | ) |
33 | 34 | TagReference |
34 | 35 | ) |
35 | 36 | |
37 | # typing------------------------------------------------------- | |
38 | ||
39 | from typing import (Any, Callable, Dict, Iterator, List, NoReturn, Optional, Sequence, | |
40 | TYPE_CHECKING, Type, Union, cast, overload) | |
41 | ||
42 | from git.types import PathLike, Literal, Commit_ish | |
43 | ||
44 | if TYPE_CHECKING: | |
45 | from git.repo.base import Repo | |
46 | from git.objects.submodule.base import UpdateProgress | |
47 | # from git.objects.commit import Commit | |
48 | # from git.objects import Blob, Tree, TagObject | |
49 | ||
50 | flagKeyLiteral = Literal[' ', '!', '+', '-', '*', '=', 't', '?'] | |
51 | ||
52 | # def is_flagKeyLiteral(inp: str) -> TypeGuard[flagKeyLiteral]: | |
53 | # return inp in [' ', '!', '+', '-', '=', '*', 't', '?'] | |
54 | ||
55 | ||
56 | # ------------------------------------------------------------- | |
57 | ||
36 | 58 | |
37 | 59 | log = logging.getLogger('git.remote') |
38 | 60 | log.addHandler(logging.NullHandler()) |
43 | 65 | #{ Utilities |
44 | 66 | |
45 | 67 | |
46 | def add_progress(kwargs, git, progress): | |
68 | def add_progress(kwargs: Any, git: Git, | |
69 | progress: Union[RemoteProgress, 'UpdateProgress', Callable[..., RemoteProgress], None] | |
70 | ) -> Any: | |
47 | 71 | """Add the --progress flag to the given kwargs dict if supported by the |
48 | 72 | git command. If the actual progress in the given progress instance is not |
49 | 73 | given, we do not request any progress |
59 | 83 | #} END utilities |
60 | 84 | |
61 | 85 | |
62 | def to_progress_instance(progress): | |
86 | @ overload | |
87 | def to_progress_instance(progress: None) -> RemoteProgress: | |
88 | ... | |
89 | ||
90 | ||
91 | @ overload | |
92 | def to_progress_instance(progress: Callable[..., Any]) -> CallableRemoteProgress: | |
93 | ... | |
94 | ||
95 | ||
96 | @ overload | |
97 | def to_progress_instance(progress: RemoteProgress) -> RemoteProgress: | |
98 | ... | |
99 | ||
100 | ||
101 | def to_progress_instance(progress: Union[Callable[..., Any], RemoteProgress, None] | |
102 | ) -> Union[RemoteProgress, CallableRemoteProgress]: | |
63 | 103 | """Given the 'progress' return a suitable object derived from |
64 | 104 | RemoteProgress(). |
65 | 105 | """ |
75 | 115 | return progress |
76 | 116 | |
77 | 117 | |
78 | class PushInfo(object): | |
118 | class PushInfo(IterableObj, object): | |
79 | 119 | """ |
80 | 120 | Carries information about the result of a push operation of a single head:: |
81 | 121 | |
91 | 131 | info.summary # summary line providing human readable english text about the push |
92 | 132 | """ |
93 | 133 | __slots__ = ('local_ref', 'remote_ref_string', 'flags', '_old_commit_sha', '_remote', 'summary') |
134 | _id_attribute_ = 'pushinfo' | |
94 | 135 | |
95 | 136 | NEW_TAG, NEW_HEAD, NO_MATCH, REJECTED, REMOTE_REJECTED, REMOTE_FAILURE, DELETED, \ |
96 | 137 | FORCED_UPDATE, FAST_FORWARD, UP_TO_DATE, ERROR = [1 << x for x in range(11)] |
103 | 144 | '=': UP_TO_DATE, |
104 | 145 | '!': ERROR} |
105 | 146 | |
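The `PushInfo` constants above are generated with `[1 << x for x in range(11)]`, giving each status its own bit so several outcomes can be combined in a single `flags` integer. A standalone sketch of the same scheme:

```python
# Each name gets a distinct power of two, so statuses can be accumulated
# with |= and tested with & exactly as PushInfo does above.
NEW_TAG, NEW_HEAD, NO_MATCH, REJECTED, REMOTE_REJECTED, REMOTE_FAILURE, \
    DELETED, FORCED_UPDATE, FAST_FORWARD, UP_TO_DATE, ERROR = [1 << x for x in range(11)]

flags = 0
flags |= FORCED_UPDATE      # accumulate statuses with bitwise or
flags |= FAST_FORWARD

assert flags & FORCED_UPDATE        # test membership with bitwise and
assert flags & FAST_FORWARD
assert not flags & REJECTED
assert UP_TO_DATE == 1 << 9         # the tenth name occupies bit 9
```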
106 | def __init__(self, flags, local_ref, remote_ref_string, remote, old_commit=None, | |
107 | summary=''): | |
108 | """ Initialize a new instance """ | |
147 | def __init__(self, flags: int, local_ref: Union[SymbolicReference, None], remote_ref_string: str, remote: 'Remote', | |
148 | old_commit: Optional[str] = None, summary: str = '') -> None: | |
149 | """ Initialize a new instance | |
150 | local_ref: HEAD | Head | RemoteReference | TagReference | Reference | SymbolicReference | None """ | |
109 | 151 | self.flags = flags |
110 | 152 | self.local_ref = local_ref |
111 | 153 | self.remote_ref_string = remote_ref_string |
113 | 155 | self._old_commit_sha = old_commit |
114 | 156 | self.summary = summary |
115 | 157 | |
116 | @property | |
117 | def old_commit(self): | |
158 | @ property | |
159 | def old_commit(self) -> Union[str, SymbolicReference, Commit_ish, None]: | |
118 | 160 | return self._old_commit_sha and self._remote.repo.commit(self._old_commit_sha) or None |
119 | 161 | |
120 | @property | |
121 | def remote_ref(self): | |
162 | @ property | |
163 | def remote_ref(self) -> Union[RemoteReference, TagReference]: | |
122 | 164 | """ |
123 | 165 | :return: |
124 | 166 | Remote Reference or TagReference in the local repository corresponding |
133 | 175 | raise ValueError("Could not handle remote ref: %r" % self.remote_ref_string) |
134 | 176 | # END |
135 | 177 | |
136 | @classmethod | |
137 | def _from_line(cls, remote, line): | |
178 | @ classmethod | |
179 | def _from_line(cls, remote: 'Remote', line: str) -> 'PushInfo': | |
138 | 180 | """Create a new PushInfo instance as parsed from line which is expected to be like |
139 | 181 | refs/heads/master:refs/heads/master 05d2687..1d0568e as bytes""" |
140 | 182 | control_character, from_to, summary = line.split('\t', 3) |
150 | 192 | # from_to handling |
151 | 193 | from_ref_string, to_ref_string = from_to.split(':') |
152 | 194 | if flags & cls.DELETED: |
153 | from_ref = None | |
195 | from_ref: Union[SymbolicReference, None] = None | |
154 | 196 | else: |
155 | 197 | if from_ref_string == "(delete)": |
156 | 198 | from_ref = None |
158 | 200 | from_ref = Reference.from_path(remote.repo, from_ref_string) |
159 | 201 | |
160 | 202 | # commit handling, could be message or commit info |
161 | old_commit = None | |
203 | old_commit: Optional[str] = None | |
162 | 204 | if summary.startswith('['): |
163 | 205 | if "[rejected]" in summary: |
164 | 206 | flags |= cls.REJECTED |
186 | 228 | |
187 | 229 | return PushInfo(flags, from_ref, to_ref_string, remote, old_commit, summary) |
188 | 230 | |
189 | ||
190 | class FetchInfo(object): | |
231 | @ classmethod | |
232 | def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any | |
233 | ) -> NoReturn: # -> Iterator['PushInfo']: | |
234 | raise NotImplementedError | |
235 | ||
236 | ||
237 | class FetchInfo(IterableObj, object): | |
191 | 238 | |
192 | 239 | """ |
193 | 240 | Carries information about the results of a fetch operation of a single head:: |
204 | 251 | info.remote_ref_path # The path from which we fetched on the remote. It's the remote's version of our info.ref |
205 | 252 | """ |
206 | 253 | __slots__ = ('ref', 'old_commit', 'flags', 'note', 'remote_ref_path') |
254 | _id_attribute_ = 'fetchinfo' | |
207 | 255 | |
208 | 256 | NEW_TAG, NEW_HEAD, HEAD_UPTODATE, TAG_UPDATE, REJECTED, FORCED_UPDATE, \ |
209 | 257 | FAST_FORWARD, ERROR = [1 << x for x in range(8)] |
210 | 258 | |
211 | 259 | _re_fetch_result = re.compile(r'^\s*(.) (\[?[\w\s\.$@]+\]?)\s+(.+) -> ([^\s]+)( \(.*\)?$)?') |
212 | 260 | |
213 | _flag_map = { | |
261 | _flag_map: Dict[flagKeyLiteral, int] = { | |
214 | 262 | '!': ERROR, |
215 | 263 | '+': FORCED_UPDATE, |
216 | 264 | '*': 0, |
219 | 267 | '-': TAG_UPDATE, |
220 | 268 | } |
221 | 269 | |
222 | @classmethod | |
223 | def refresh(cls): | |
270 | @ classmethod | |
271 | def refresh(cls) -> Literal[True]: | |
224 | 272 | """This gets called by the refresh function (see the top level |
225 | 273 | __init__). |
226 | 274 | """ |
243 | 291 | |
244 | 292 | return True |
245 | 293 | |
246 | def __init__(self, ref, flags, note='', old_commit=None, remote_ref_path=None): | |
294 | def __init__(self, ref: SymbolicReference, flags: int, note: str = '', | |
295 | old_commit: Union[Commit_ish, None] = None, | |
296 | remote_ref_path: Optional[PathLike] = None) -> None: | |
247 | 297 | """ |
248 | 298 | Initialize a new instance |
249 | 299 | """ |
253 | 303 | self.old_commit = old_commit |
254 | 304 | self.remote_ref_path = remote_ref_path |
255 | 305 | |
256 | def __str__(self): | |
306 | def __str__(self) -> str: | |
257 | 307 | return self.name |
258 | 308 | |
259 | @property | |
260 | def name(self): | |
309 | @ property | |
310 | def name(self) -> str: | |
261 | 311 | """:return: Name of our remote ref""" |
262 | 312 | return self.ref.name |
263 | 313 | |
264 | @property | |
265 | def commit(self): | |
314 | @ property | |
315 | def commit(self) -> Commit_ish: | |
266 | 316 | """:return: Commit of our remote ref""" |
267 | 317 | return self.ref.commit |
268 | 318 | |
269 | @classmethod | |
270 | def _from_line(cls, repo, line, fetch_line): | |
319 | @ classmethod | |
320 | def _from_line(cls, repo: 'Repo', line: str, fetch_line: str) -> 'FetchInfo': | |
271 | 321 | """Parse information from the given line as returned by git-fetch -v |
272 | 322 | and return a new FetchInfo object representing this information. |
273 | 323 | |
274 | We can handle a line as follows | |
275 | "%c %-*s %-*s -> %s%s" | |
276 | ||
277 | Where c is either ' ', !, +, -, *, or = | |
324 | We can handle a line as follows: | |
325 | "%c %-\\*s %-\\*s -> %s%s" | |
326 | ||
327 | Where c is either ' ', !, +, -, \\*, or = | |
278 | 328 | ! means error |
279 | 329 | + means success forcing update |
280 | 330 | - means a tag was updated |
289 | 339 | raise ValueError("Failed to parse line: %r" % line) |
290 | 340 | |
291 | 341 | # parse lines |
292 | control_character, operation, local_remote_ref, remote_local_ref, note = match.groups() | |
342 | remote_local_ref_str: str | |
343 | control_character, operation, local_remote_ref, remote_local_ref_str, note = match.groups() | |
344 | # assert is_flagKeyLiteral(control_character), f"{control_character}" | |
345 | control_character = cast(flagKeyLiteral, control_character) | |
293 | 346 | try: |
294 | 347 | _new_hex_sha, _fetch_operation, fetch_note = fetch_line.split("\t") |
295 | 348 | ref_type_name, fetch_note = fetch_note.split(' ', 1) |
305 | 358 | # END control char exception handling |
306 | 359 | |
307 | 360 | # parse operation string for more info - makes no sense for symbolic refs, but we parse it anyway |
308 | old_commit = None | |
361 | old_commit: Union[Commit_ish, None] = None | |
309 | 362 | is_tag_operation = False |
310 | 363 | if 'rejected' in operation: |
311 | 364 | flags |= cls.REJECTED |
328 | 381 | # If we do not specify a target branch like master:refs/remotes/origin/master, |
329 | 382 | # the fetch result is stored in FETCH_HEAD which destroys the rule we usually |
330 | 383 | # have. In that case we use a symbolic reference which is detached |
331 | ref_type = None | |
332 | if remote_local_ref == "FETCH_HEAD": | |
384 | ref_type: Optional[Type[SymbolicReference]] = None | |
385 | if remote_local_ref_str == "FETCH_HEAD": | |
333 | 386 | ref_type = SymbolicReference |
334 | 387 | elif ref_type_name == "tag" or is_tag_operation: |
335 | 388 | # the ref_type_name can be branch, whereas we are still seeing a tag operation. It happens during |
357 | 410 | # by the 'ref/' prefix. Otherwise even a tag could be in refs/remotes, which is when it will have the |
358 | 411 | # 'tags/' subdirectory in its path. |
359 | 412 | # We don't want to test for actual existence, but try to figure everything out analytically. |
360 | ref_path = None | |
361 | remote_local_ref = remote_local_ref.strip() | |
362 | if remote_local_ref.startswith(Reference._common_path_default + "/"): | |
413 | ref_path: Optional[PathLike] = None | |
414 | remote_local_ref_str = remote_local_ref_str.strip() | |
415 | ||
416 | if remote_local_ref_str.startswith(Reference._common_path_default + "/"): | |
363 | 417 | # always use actual type if we get absolute paths |
364 | 418 | # Will always be the case if something is fetched outside of refs/remotes (if its not a tag) |
365 | ref_path = remote_local_ref | |
419 | ref_path = remote_local_ref_str | |
366 | 420 | if ref_type is not TagReference and not \ |
367 | remote_local_ref.startswith(RemoteReference._common_path_default + "/"): | |
421 | remote_local_ref_str.startswith(RemoteReference._common_path_default + "/"): | |
368 | 422 | ref_type = Reference |
369 | 423 | # END downgrade remote reference |
370 | elif ref_type is TagReference and 'tags/' in remote_local_ref: | |
424 | elif ref_type is TagReference and 'tags/' in remote_local_ref_str: | |
371 | 425 | # even though it's a tag, it is located in refs/remotes
372 | ref_path = join_path(RemoteReference._common_path_default, remote_local_ref) | |
426 | ref_path = join_path(RemoteReference._common_path_default, remote_local_ref_str) | |
373 | 427 | else: |
374 | ref_path = join_path(ref_type._common_path_default, remote_local_ref) | |
428 | ref_path = join_path(ref_type._common_path_default, remote_local_ref_str) | |
375 | 429 | # END obtain refpath |
376 | 430 | |
377 | 431 | # even though the path could be within the git conventions, we make |
383 | 437 | |
384 | 438 | return cls(remote_local_ref, flags, note, old_commit, local_remote_ref) |
385 | 439 | |
386 | ||
387 | class Remote(LazyMixin, Iterable): | |
440 | @ classmethod | |
441 | def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any | |
442 | ) -> NoReturn: # -> Iterator['FetchInfo']: | |
443 | raise NotImplementedError | |
444 | ||
445 | ||
446 | class Remote(LazyMixin, IterableObj): | |
388 | 447 | |
389 | 448 | """Provides easy read and write access to a git remote. |
390 | 449 | |
397 | 456 | __slots__ = ("repo", "name", "_config_reader") |
398 | 457 | _id_attribute_ = "name" |
399 | 458 | |
400 | def __init__(self, repo, name): | |
459 | def __init__(self, repo: 'Repo', name: str) -> None: | |
401 | 460 | """Initialize a remote instance |
402 | 461 | |
403 | 462 | :param repo: The repository we are a remote of |
404 | 463 | :param name: the name of the remote, i.e. 'origin'""" |
405 | 464 | self.repo = repo |
406 | 465 | self.name = name |
407 | ||
408 | if is_win: | |
409 | # some oddity: on windows, python 2.5, it for some reason does not realize | |
410 | # that it has the config_writer property, but instead calls __getattr__ | |
411 | # which will not yield the expected results. 'pinging' the members | |
412 | # with a dir call creates the config_writer property that we require | |
413 | # ... bugs like these make me wonder whether python really wants to be used | |
414 | # for production. It doesn't happen on linux though. | |
415 | dir(self) | |
416 | # END windows special handling | |
417 | ||
418 | def __getattr__(self, attr): | |
466 | self.url: str | |
467 | ||
468 | def __getattr__(self, attr: str) -> Any: | |
419 | 469 | """Allows to call this instance like |
420 | 470 | remote.special( \\*args, \\*\\*kwargs) to call git-remote special self.name""" |
421 | 471 | if attr == "_config_reader": |
429 | 479 | return super(Remote, self).__getattr__(attr) |
430 | 480 | # END handle exception |
431 | 481 | |
432 | def _config_section_name(self): | |
482 | def _config_section_name(self) -> str: | |
433 | 483 | return 'remote "%s"' % self.name |
434 | 484 | |
435 | def _set_cache_(self, attr): | |
485 | def _set_cache_(self, attr: str) -> None: | |
436 | 486 | if attr == "_config_reader": |
437 | 487 | # NOTE: This is cached as __getattr__ is overridden to return remote config values implicitly, such as |
438 | 488 | # in print(r.pushurl) |
440 | 490 | else: |
441 | 491 | super(Remote, self)._set_cache_(attr) |
442 | 492 | |
443 | def __str__(self): | |
493 | def __str__(self) -> str: | |
444 | 494 | return self.name |
445 | 495 | |
446 | def __repr__(self): | |
496 | def __repr__(self) -> str: | |
447 | 497 | return '<git.%s "%s">' % (self.__class__.__name__, self.name) |
448 | 498 | |
449 | def __eq__(self, other): | |
499 | def __eq__(self, other: object) -> bool: | |
450 | 500 | return isinstance(other, type(self)) and self.name == other.name |
451 | 501 | |
452 | def __ne__(self, other): | |
502 | def __ne__(self, other: object) -> bool: | |
453 | 503 | return not (self == other) |
454 | 504 | |
455 | def __hash__(self): | |
505 | def __hash__(self) -> int: | |
456 | 506 | return hash(self.name) |
457 | 507 | |
458 | def exists(self): | |
508 | def exists(self) -> bool: | |
459 | 509 | """ |
460 | 510 | :return: True if this is a valid, existing remote. |
461 | 511 | Valid remotes have an entry in the repository's configuration""" |
469 | 519 | return False |
470 | 520 | # end |
471 | 521 | |
472 | @classmethod | |
473 | def iter_items(cls, repo): | |
522 | @classmethod | |
523 | def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> Iterator['Remote']: | |
474 | 524 | """:return: Iterator yielding Remote objects of the given repository""" |
475 | 525 | for section in repo.config_reader("repository").sections(): |
476 | 526 | if not section.startswith('remote '): |
482 | 532 | yield Remote(repo, section[lbound + 1:rbound]) |
483 | 533 | # END for each configuration section |
484 | 534 | |
485 | def set_url(self, new_url, old_url=None, **kwargs): | |
535 | def set_url(self, new_url: str, old_url: Optional[str] = None, **kwargs: Any) -> 'Remote': | |
486 | 536 | """Configure URLs on current remote (cf command git remote set_url) |
487 | 537 | |
488 | 538 | This command manages URLs on the remote. |
499 | 549 | self.repo.git.remote(scmd, self.name, new_url, **kwargs) |
500 | 550 | return self |
501 | 551 | |
502 | def add_url(self, url, **kwargs): | |
552 | def add_url(self, url: str, **kwargs: Any) -> 'Remote': | |
503 | 553 | """Adds a new url on current remote (special case of git remote set_url) |
504 | 554 | |
505 | 555 | This command adds new URLs to a given remote, making it possible to have |
510 | 560 | """ |
511 | 561 | return self.set_url(url, add=True) |
512 | 562 | |
513 | def delete_url(self, url, **kwargs): | |
563 | def delete_url(self, url: str, **kwargs: Any) -> 'Remote': | |
514 | 564 | """Deletes a new url on current remote (special case of git remote set_url) |
515 | 565 | |
516 | 566 | This command deletes URLs from a given remote, making it possible to have
521 | 571 | """ |
522 | 572 | return self.set_url(url, delete=True) |
523 | 573 | |
524 | @property | |
525 | def urls(self): | |
574 | @property | |
575 | def urls(self) -> Iterator[str]: | |
526 | 576 | """:return: Iterator yielding all configured URL targets on a remote as strings""" |
527 | 577 | try: |
528 | 578 | remote_details = self.repo.git.remote("get-url", "--all", self.name) |
579 | assert isinstance(remote_details, str) | |
529 | 580 | for line in remote_details.split('\n'): |
530 | 581 | yield line |
531 | 582 | except GitCommandError as ex: |
537 | 588 | if 'Unknown subcommand: get-url' in str(ex): |
538 | 589 | try: |
539 | 590 | remote_details = self.repo.git.remote("show", self.name) |
591 | assert isinstance(remote_details, str) | |
540 | 592 | for line in remote_details.split('\n'): |
541 | 593 | if ' Push URL:' in line: |
542 | 594 | yield line.split(': ')[-1] |
543 | except GitCommandError as ex: | |
544 | if any(msg in str(ex) for msg in ['correct access rights', 'cannot run ssh']): | |
595 | except GitCommandError as _ex: | |
596 | if any(msg in str(_ex) for msg in ['correct access rights', 'cannot run ssh']): | |
545 | 597 | # If ssh is not setup to access this repository, see issue 694 |
546 | 598 | remote_details = self.repo.git.config('--get-all', 'remote.%s.url' % self.name) |
599 | assert isinstance(remote_details, str) | |
547 | 600 | for line in remote_details.split('\n'): |
548 | 601 | yield line |
549 | 602 | else: |
550 | raise ex | |
603 | raise _ex | |
551 | 604 | else: |
552 | 605 | raise ex |
553 | 606 | |
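The `urls` property above falls back from `git remote get-url --all` to parsing `git remote show` output for "Push URL:" lines when older git versions lack the `get-url` subcommand. A standalone sketch of that line parsing (the sample output string is illustrative, and the match pattern simply mirrors the check in the code above):

```python
def push_urls(show_output):
    """Yield push URLs from remote-show style output, matching the
    ' Push URL:' substring check used in the fallback branch above."""
    for line in show_output.split('\n'):
        if ' Push URL:' in line:
            yield line.split(': ')[-1]

sample = ("* remote origin\n"
          "  Fetch URL: https://example.com/a.git\n"
          "  Push URL: https://example.com/a.git")
print(list(push_urls(sample)))  # ['https://example.com/a.git']
```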
554 | @property | |
555 | def refs(self): | |
607 | @property | |
608 | def refs(self) -> IterableList[RemoteReference]: | |
556 | 609 | """ |
557 | 610 | :return: |
558 | 611 | IterableList of RemoteReference objects. It is prefixed, allowing |
559 | 612 | you to omit the remote path portion, i.e.:: |
560 | 613 | remote.refs.master # yields RemoteReference('/refs/remotes/origin/master')""" |
561 | out_refs = IterableList(RemoteReference._id_attribute_, "%s/" % self.name) | |
614 | out_refs: IterableList[RemoteReference] = IterableList(RemoteReference._id_attribute_, "%s/" % self.name) | |
562 | 615 | out_refs.extend(RemoteReference.list_items(self.repo, remote=self.name)) |
563 | 616 | return out_refs |
564 | 617 | |
565 | @property | |
566 | def stale_refs(self): | |
618 | @property | |
619 | def stale_refs(self) -> IterableList[Reference]: | |
567 | 620 | """ |
568 | 621 | :return: |
569 | 622 | IterableList of RemoteReference objects that do not have a corresponding
578 | 631 | as well. This is a fix for the issue described here: |
579 | 632 | https://github.com/gitpython-developers/GitPython/issues/260 |
580 | 633 | """ |
581 | out_refs = IterableList(RemoteReference._id_attribute_, "%s/" % self.name) | |
634 | out_refs: IterableList[Reference] = IterableList(RemoteReference._id_attribute_, "%s/" % self.name) | |
582 | 635 | for line in self.repo.git.remote("prune", "--dry-run", self).splitlines()[2:]: |
583 | 636 | # expecting |
584 | 637 | # * [would prune] origin/new_branch |
585 | 638 | token = " * [would prune] " |
586 | 639 | if not line.startswith(token): |
587 | raise ValueError("Could not parse git-remote prune result: %r" % line) | |
640 | continue | |
588 | 641 | ref_name = line.replace(token, "") |
589 | 642 | # sometimes, paths start with a full ref name, like refs/tags/foo, see #260 |
590 | 643 | if ref_name.startswith(Reference._common_path_default + '/'): |
591 | out_refs.append(SymbolicReference.from_path(self.repo, ref_name)) | |
644 | out_refs.append(Reference.from_path(self.repo, ref_name)) | |
592 | 645 | else: |
593 | 646 | fqhn = "%s/%s" % (RemoteReference._common_path_default, ref_name) |
594 | 647 | out_refs.append(RemoteReference(self.repo, fqhn)) |
596 | 649 | # END for each line |
597 | 650 | return out_refs |
598 | 651 | |
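The `stale_refs` property above parses `git remote prune --dry-run` output, and the diff changes non-matching lines from raising `ValueError` to being skipped with `continue`. A standalone sketch of that parsing (the sample output is illustrative):

```python
TOKEN = " * [would prune] "

def stale_ref_names(prune_output):
    """Parse ref names out of `git remote prune --dry-run` style output.
    Non-matching lines are skipped, matching the `continue` in the diff above."""
    for line in prune_output.splitlines()[2:]:  # skip the two header lines
        if not line.startswith(TOKEN):
            continue
        yield line.replace(TOKEN, "")

sample = ("Pruning origin\n"
          "URL: https://example.com/a.git\n"
          " * [would prune] origin/old_branch\n"
          " * [would prune] origin/gone")
print(list(stale_ref_names(sample)))  # ['origin/old_branch', 'origin/gone']
```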
599 | @classmethod | |
600 | def create(cls, repo, name, url, **kwargs): | |
652 | @classmethod | |
653 | def create(cls, repo: 'Repo', name: str, url: str, **kwargs: Any) -> 'Remote': | |
601 | 654 | """Create a new remote to the given repository |
602 | 655 | :param repo: Repository instance that is to receive the new remote |
603 | 656 | :param name: Desired name of the remote |
613 | 666 | # add is an alias |
614 | 667 | add = create |
615 | 668 | |
616 | @classmethod | |
617 | def remove(cls, repo, name): | |
669 | @classmethod | |
670 | def remove(cls, repo: 'Repo', name: str) -> str: | |
618 | 671 | """Remove the remote with the given name |
619 | 672 | :return: the passed remote name to remove |
620 | 673 | """ |
626 | 679 | # alias |
627 | 680 | rm = remove |
628 | 681 | |
629 | def rename(self, new_name): | |
682 | def rename(self, new_name: str) -> 'Remote': | |
630 | 683 | """Rename self to the given new_name |
631 | 684 | :return: self """ |
632 | 685 | if self.name == new_name: |
638 | 691 | |
639 | 692 | return self |
640 | 693 | |
641 | def update(self, **kwargs): | |
694 | def update(self, **kwargs: Any) -> 'Remote': | |
642 | 695 | """Fetch all changes for this remote, including new branches which will |
643 | 696 | be forced in (in case your local remote branch is not part of the new remote branches'
644 | 697 | ancestry anymore).
652 | 705 | self.repo.git.remote(scmd, self.name, **kwargs) |
653 | 706 | return self |
654 | 707 | |
655 | def _get_fetch_info_from_stderr(self, proc, progress): | |
708 | def _get_fetch_info_from_stderr(self, proc: 'Git.AutoInterrupt', | |
709 | progress: Union[Callable[..., Any], RemoteProgress, None] | |
710 | ) -> IterableList['FetchInfo']: | |
711 | ||
656 | 712 | progress = to_progress_instance(progress) |
657 | 713 | |
658 | 714 | # skip first line as it is some remote info we are not interested in |
659 | output = IterableList('name') | |
715 | output: IterableList['FetchInfo'] = IterableList('name') | |
660 | 716 | |
661 | 717 | # lines which are no progress are fetch info lines |
662 | 718 | # this also waits for the command to finish |
694 | 750 | msg += "Will ignore extra progress lines or fetch head lines." |
695 | 751 | msg %= (l_fil, l_fhi) |
696 | 752 | log.debug(msg) |
697 | log.debug("info lines: " + str(fetch_info_lines)) | |
698 | log.debug("head info : " + str(fetch_head_info)) | |
753 | log.debug(b"info lines: " + str(fetch_info_lines).encode("UTF-8")) | |
754 | log.debug(b"head info: " + str(fetch_head_info).encode("UTF-8")) | |
699 | 755 | if l_fil < l_fhi: |
700 | 756 | fetch_head_info = fetch_head_info[:l_fil] |
701 | 757 | else: |
711 | 767 | log.warning("Git informed while fetching: %s", err_line.strip()) |
712 | 768 | return output |
713 | 769 | |
714 | def _get_push_info(self, proc, progress): | |
770 | def _get_push_info(self, proc: 'Git.AutoInterrupt', | |
771 | progress: Union[Callable[..., Any], RemoteProgress, None]) -> IterableList[PushInfo]: | |
715 | 772 | progress = to_progress_instance(progress) |
716 | 773 | |
717 | 774 | # read progress information from stderr |
719 | 776 | # read the lines manually as it will use carriage returns between the messages |
720 | 777 | # to override the previous one. This is why we read the bytes manually |
721 | 778 | progress_handler = progress.new_message_handler() |
722 | output = [] | |
723 | ||
724 | def stdout_handler(line): | |
779 | output: IterableList[PushInfo] = IterableList('push_infos') | |
780 | ||
781 | def stdout_handler(line: str) -> None: | |
725 | 782 | try: |
726 | 783 | output.append(PushInfo._from_line(self, line)) |
727 | 784 | except ValueError: |
740 | 797 | |
741 | 798 | return output |
742 | 799 | |
743 | def _assert_refspec(self): | |
800 | def _assert_refspec(self) -> None: | |
744 | 801 | """Turns out we can't deal with remotes if the refspec is missing""" |
745 | 802 | config = self.config_reader |
746 | 803 | unset = 'placeholder' |
753 | 810 | finally: |
754 | 811 | config.release() |
755 | 812 | |
756 | def fetch(self, refspec=None, progress=None, verbose=True, **kwargs): | |
813 | def fetch(self, refspec: Union[str, List[str], None] = None, | |
814 | progress: Union[RemoteProgress, None, 'UpdateProgress'] = None, | |
815 | verbose: bool = True, **kwargs: Any) -> IterableList[FetchInfo]: | |
757 | 816 | """Fetch the latest changes for this remote |
758 | 817 | |
759 | 818 | :param refspec: |
784 | 843 | if refspec is None: |
785 | 844 | # No argument refspec, then ensure the repo's config has a fetch refspec. |
786 | 845 | self._assert_refspec() |
846 | ||
787 | 847 | kwargs = add_progress(kwargs, self.repo.git, progress) |
788 | 848 | if isinstance(refspec, list): |
789 | args = refspec | |
849 | args: Sequence[Optional[str]] = refspec | |
790 | 850 | else: |
791 | 851 | args = [refspec] |
792 | 852 | |
797 | 857 | self.repo.odb.update_cache() |
798 | 858 | return res |
799 | 859 | |
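The `fetch` method above accepts either a single refspec string, a list of refspecs, or `None`, and the annotated diff types the normalized form as `Sequence[Optional[str]]`. A minimal sketch of that normalization:

```python
from typing import List, Optional, Sequence, Union

def normalize_refspec(refspec: Union[str, List[str], None]) -> Sequence[Optional[str]]:
    """Mirror the list-vs-scalar handling in `fetch` above: a list passes
    through unchanged, anything else (including None) is wrapped in a
    one-element list."""
    if isinstance(refspec, list):
        return refspec
    return [refspec]

print(normalize_refspec('main'))           # ['main']
print(normalize_refspec(['main', 'dev']))  # ['main', 'dev']
print(normalize_refspec(None))             # [None]
```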
800 | def pull(self, refspec=None, progress=None, **kwargs): | |
860 | def pull(self, refspec: Union[str, List[str], None] = None, | |
861 | progress: Union[RemoteProgress, 'UpdateProgress', None] = None, | |
862 | **kwargs: Any) -> IterableList[FetchInfo]: | |
801 | 863 | """Pull changes from the given branch, being the same as a fetch followed |
802 | 864 | by a merge of branch with your local branch. |
803 | 865 | |
816 | 878 | self.repo.odb.update_cache() |
817 | 879 | return res |
818 | 880 | |
819 | def push(self, refspec=None, progress=None, **kwargs): | |
881 | def push(self, refspec: Union[str, List[str], None] = None, | |
882 | progress: Union[RemoteProgress, 'UpdateProgress', Callable[..., RemoteProgress], None] = None, | |
883 | **kwargs: Any) -> IterableList[PushInfo]: | |
820 | 884 | """Push changes from source branch in refspec to target branch in refspec. |
821 | 885 | |
822 | 886 | :param refspec: see 'fetch' method |
846 | 910 | universal_newlines=True, **kwargs) |
847 | 911 | return self._get_push_info(proc, progress) |
848 | 912 | |
849 | @property | |
850 | def config_reader(self): | |
913 | @property | |
914 | def config_reader(self) -> SectionConstraint[GitConfigParser]: | |
851 | 915 | """ |
852 | 916 | :return: |
853 | 917 | GitConfigParser compatible object able to read options for only our remote. |
854 | 918 | Hence you may simply type config.get("pushurl") to obtain the information"""
855 | 919 | return self._config_reader |
856 | 920 | |
857 | def _clear_cache(self): | |
921 | def _clear_cache(self) -> None: | |
858 | 922 | try: |
859 | 923 | del(self._config_reader) |
860 | 924 | except AttributeError: |
861 | 925 | pass |
862 | 926 | # END handle exception |
863 | 927 | |
864 | @property | |
865 | def config_writer(self): | |
928 | @property | |
929 | def config_writer(self) -> SectionConstraint: | |
866 | 930 | """ |
867 | 931 | :return: GitConfigParser compatible object able to write options for this remote. |
868 | 932 | :note: |
0 | 0 | """Initialize the Repo package""" |
1 | 1 | # flake8: noqa |
2 | from __future__ import absolute_import | |
3 | from .base import * | |
2 | from .base import Repo as Repo |
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | ||
6 | from collections import namedtuple | |
5 | from __future__ import annotations | |
7 | 6 | import logging |
8 | 7 | import os |
9 | 8 | import re |
9 | import shlex | |
10 | 10 | import warnings |
11 | from gitdb.db.loose import LooseObjectDB | |
12 | ||
13 | from gitdb.exc import BadObject | |
11 | 14 | |
12 | 15 | from git.cmd import ( |
13 | 16 | Git, |
25 | 28 | from git.objects import Submodule, RootModule, Commit |
26 | 29 | from git.refs import HEAD, Head, Reference, TagReference |
27 | 30 | from git.remote import Remote, add_progress, to_progress_instance |
28 | from git.util import Actor, finalize_process, decygpath, hex_to_bin, expand_path | |
31 | from git.util import Actor, finalize_process, decygpath, hex_to_bin, expand_path, remove_password_if_present | |
29 | 32 | import os.path as osp |
30 | 33 | |
31 | 34 | from .fun import rev_parse, is_git_dir, find_submodule_git_dir, touch, find_worktree_git_dir |
32 | 35 | import gc |
33 | 36 | import gitdb |
34 | 37 | |
35 | try: | |
36 | import pathlib | |
37 | except ImportError: | |
38 | pathlib = None | |
39 | ||
38 | # typing ------------------------------------------------------ | |
39 | ||
40 | from git.types import TBD, PathLike, Lit_config_levels, Commit_ish, Tree_ish, assert_never | |
41 | from typing import (Any, BinaryIO, Callable, Dict, | |
42 | Iterator, List, Mapping, Optional, Sequence, | |
43 | TextIO, Tuple, Type, Union, | |
44 | NamedTuple, cast, TYPE_CHECKING) | |
45 | ||
46 | from git.types import ConfigLevels_Tup, TypedDict | |
47 | ||
48 | if TYPE_CHECKING: | |
49 | from git.util import IterableList | |
50 | from git.refs.symbolic import SymbolicReference | |
51 | from git.objects import Tree | |
52 | from git.objects.submodule.base import UpdateProgress | |
53 | from git.remote import RemoteProgress | |
54 | ||
55 | # ----------------------------------------------------------- | |
40 | 56 | |
41 | 57 | log = logging.getLogger(__name__) |
42 | 58 | |
43 | BlameEntry = namedtuple('BlameEntry', ['commit', 'linenos', 'orig_path', 'orig_linenos']) | |
44 | ||
45 | ||
46 | 59 | __all__ = ('Repo',) |
60 | ||
61 | ||
62 | class BlameEntry(NamedTuple): | |
63 | commit: Dict[str, 'Commit'] | |
64 | linenos: range | |
65 | orig_path: Optional[str] | |
66 | orig_linenos: range | |
47 | 67 | |
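The diff replaces the `collections.namedtuple` `BlameEntry` with a `typing.NamedTuple` class: field access is unchanged, but each field now carries an annotation. A minimal standalone sketch (using plain `str` values where the real class holds `Commit` objects):

```python
from typing import Dict, NamedTuple, Optional

class BlameEntry(NamedTuple):
    commit: Dict[str, str]  # 'Commit' values in the real class; str here for a standalone sketch
    linenos: range
    orig_path: Optional[str]
    orig_linenos: range

# positional construction and attribute access work exactly as with the old namedtuple
entry = BlameEntry({'abc123': 'commit-object'}, range(1, 4), 'file.py', range(1, 4))
print(entry.orig_path)     # file.py
print(len(entry.linenos))  # 3
```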
48 | 68 | |
49 | 69 | class Repo(object): |
62 | 82 | 'git_dir' is the .git repository directory, which is always set.""" |
63 | 83 | DAEMON_EXPORT_FILE = 'git-daemon-export-ok' |
64 | 84 | |
65 | git = None # Must exist, or __del__ will fail in case we raise on `__init__()` | |
66 | working_dir = None | |
67 | _working_tree_dir = None | |
68 | git_dir = None | |
69 | _common_dir = None | |
85 | git = cast('Git', None) # Must exist, or __del__ will fail in case we raise on `__init__()` | |
86 | working_dir: Optional[PathLike] = None | |
87 | _working_tree_dir: Optional[PathLike] = None | |
88 | git_dir: PathLike = "" | |
89 | _common_dir: PathLike = "" | |
70 | 90 | |
71 | 91 | # precompiled regex |
72 | 92 | re_whitespace = re.compile(r'\s+') |
78 | 98 | |
79 | 99 | # invariants |
80 | 100 | # represents the configuration level of a configuration file |
81 | config_level = ("system", "user", "global", "repository") | |
101 | config_level: ConfigLevels_Tup = ("system", "user", "global", "repository") | |
82 | 102 | |
83 | 103 | # Subclass configuration |
84 | 104 | # Subclasses may easily bring in their own custom types by placing a constructor or type here |
85 | 105 | GitCommandWrapperType = Git |
86 | 106 | |
87 | def __init__(self, path=None, odbt=GitCmdObjectDB, search_parent_directories=False, expand_vars=True): | |
107 | def __init__(self, path: Optional[PathLike] = None, odbt: Type[LooseObjectDB] = GitCmdObjectDB, | |
108 | search_parent_directories: bool = False, expand_vars: bool = True) -> None: | |
88 | 109 | """Create a new Repo instance |
89 | 110 | |
90 | 111 | :param path: |
125 | 146 | warnings.warn("The use of environment variables in paths is deprecated" + |
126 | 147 | "\nfor security reasons and may be removed in the future!!") |
127 | 148 | epath = expand_path(epath, expand_vars) |
128 | if not os.path.exists(epath): | |
129 | raise NoSuchPathError(epath) | |
149 | if epath is not None: | |
150 | if not os.path.exists(epath): | |
151 | raise NoSuchPathError(epath) | |
130 | 152 | |
131 | 153 | ## Walk up the path to find the `.git` dir. |
132 | 154 | # |
189 | 211 | try: |
190 | 212 | common_dir = open(osp.join(self.git_dir, 'commondir'), 'rt').readlines()[0].strip() |
191 | 213 | self._common_dir = osp.join(self.git_dir, common_dir) |
192 | except (OSError, IOError): | |
193 | self._common_dir = None | |
214 | except OSError: | |
215 | self._common_dir = "" | |
194 | 216 | |
195 | 217 | # adjust the wd in case we are actually bare - we didn't know that |
196 | 218 | # in the first place |
198 | 220 | self._working_tree_dir = None |
199 | 221 | # END working dir handling |
200 | 222 | |
201 | self.working_dir = self._working_tree_dir or self.common_dir | |
223 | self.working_dir: Optional[PathLike] = self._working_tree_dir or self.common_dir | |
202 | 224 | self.git = self.GitCommandWrapperType(self.working_dir) |
203 | 225 | |
204 | 226 | # special handling, in special times |
205 | args = [osp.join(self.common_dir, 'objects')] | |
227 | rootpath = osp.join(self.common_dir, 'objects') | |
206 | 228 | if issubclass(odbt, GitCmdObjectDB): |
207 | args.append(self.git) | |
208 | self.odb = odbt(*args) | |
209 | ||
210 | def __enter__(self): | |
229 | self.odb = odbt(rootpath, self.git) | |
230 | else: | |
231 | self.odb = odbt(rootpath) | |
232 | ||
233 | def __enter__(self) -> 'Repo': | |
211 | 234 | return self |
212 | 235 | |
213 | def __exit__(self, exc_type, exc_value, traceback): | |
236 | def __exit__(self, *args: Any) -> None: | |
214 | 237 | self.close() |
215 | 238 | |
216 | def __del__(self): | |
239 | def __del__(self) -> None: | |
217 | 240 | try: |
218 | 241 | self.close() |
219 | 242 | except Exception: |
220 | 243 | pass |
221 | 244 | |
222 | def close(self): | |
245 | def close(self) -> None: | |
223 | 246 | if self.git: |
224 | 247 | self.git.clear_cache() |
225 | 248 | # Tempfiles objects on Windows are holding references to |
234 | 257 | if is_win: |
235 | 258 | gc.collect() |
236 | 259 | |
237 | def __eq__(self, rhs): | |
238 | if isinstance(rhs, Repo): | |
260 | def __eq__(self, rhs: object) -> bool: | |
261 | if isinstance(rhs, Repo) and self.git_dir: | |
239 | 262 | return self.git_dir == rhs.git_dir |
240 | 263 | return False |
241 | 264 | |
242 | def __ne__(self, rhs): | |
265 | def __ne__(self, rhs: object) -> bool: | |
243 | 266 | return not self.__eq__(rhs) |
244 | 267 | |
245 | def __hash__(self): | |
268 | def __hash__(self) -> int: | |
246 | 269 | return hash(self.git_dir) |
247 | 270 | |
248 | 271 | # Description property |
249 | def _get_description(self): | |
250 | filename = osp.join(self.git_dir, 'description') | |
272 | def _get_description(self) -> str: | |
273 | if self.git_dir: | |
274 | filename = osp.join(self.git_dir, 'description') | |
251 | 275 | with open(filename, 'rb') as fp: |
252 | 276 | return fp.read().rstrip().decode(defenc) |
253 | 277 | |
254 | def _set_description(self, descr): | |
255 | filename = osp.join(self.git_dir, 'description') | |
278 | def _set_description(self, descr: str) -> None: | |
279 | if self.git_dir: | |
280 | filename = osp.join(self.git_dir, 'description') | |
256 | 281 | with open(filename, 'wb') as fp: |
257 | 282 | fp.write((descr + '\n').encode(defenc)) |
258 | 283 | |
262 | 287 | del _set_description |
263 | 288 | |
264 | 289 | @property |
265 | def working_tree_dir(self): | |
290 | def working_tree_dir(self) -> Optional[PathLike]: | |
266 | 291 | """:return: The working tree directory of our git repository. If this is a bare repository, None is returned. |
267 | 292 | """ |
268 | 293 | return self._working_tree_dir |
269 | 294 | |
270 | 295 | @property |
271 | def common_dir(self): | |
296 | def common_dir(self) -> PathLike: | |
272 | 297 | """ |
273 | 298 | :return: The git dir that holds everything except possibly HEAD, |
274 | 299 | FETCH_HEAD, ORIG_HEAD, COMMIT_EDITMSG, index, and logs/.""" |
275 | return self._common_dir or self.git_dir | |
300 | if self._common_dir: | |
301 | return self._common_dir | |
302 | elif self.git_dir: | |
303 | return self.git_dir | |
304 | else: | |
305 | # or could return "" | |
306 | raise InvalidGitRepositoryError() | |
276 | 307 | |
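The reworked `common_dir` property above falls back from `_common_dir` to `git_dir` and now raises `InvalidGitRepositoryError` when both are unset, instead of the old `self._common_dir or self.git_dir` expression which could silently return an empty value. A standalone sketch of that fallback chain (the exception class is a stand-in for `git.exc.InvalidGitRepositoryError`):

```python
class InvalidGitRepositoryError(Exception):
    """Stand-in for git.exc.InvalidGitRepositoryError."""

def resolve_common_dir(common_dir, git_dir):
    """Mirror the fallback chain in the `common_dir` property above."""
    if common_dir:
        return common_dir
    elif git_dir:
        return git_dir
    raise InvalidGitRepositoryError()

print(resolve_common_dir("", "/repo/.git"))  # /repo/.git
```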
277 | 308 | @property |
278 | def bare(self): | |
309 | def bare(self) -> bool: | |
279 | 310 | """:return: True if the repository is bare""" |
280 | 311 | return self._bare |
281 | 312 | |
282 | 313 | @property |
283 | def heads(self): | |
314 | def heads(self) -> 'IterableList[Head]': | |
284 | 315 | """A list of ``Head`` objects representing the branch heads in |
285 | 316 | this repo |
286 | 317 | |
288 | 319 | return Head.list_items(self) |
289 | 320 | |
290 | 321 | @property |
291 | def references(self): | |
322 | def references(self) -> 'IterableList[Reference]': | |
292 | 323 | """A list of Reference objects representing tags, heads and remote references. |
293 | 324 | |
294 | 325 | :return: IterableList(Reference, ...)""" |
301 | 332 | branches = heads |
302 | 333 | |
303 | 334 | @property |
304 | def index(self): | |
335 | def index(self) -> 'IndexFile': | |
305 | 336 | """:return: IndexFile representing this repository's index. |
306 | 337 | :note: This property can be expensive, as the returned ``IndexFile`` will be |
307 | 338 | reinitialized. It's recommended to re-use the object.""" |
308 | 339 | return IndexFile(self) |
309 | 340 | |
310 | 341 | @property |
311 | def head(self): | |
342 | def head(self) -> 'HEAD': | |
312 | 343 | """:return: HEAD Object pointing to the current head reference""" |
313 | 344 | return HEAD(self, 'HEAD') |
314 | 345 | |
315 | 346 | @property |
316 | def remotes(self): | |
347 | def remotes(self) -> 'IterableList[Remote]': | |
317 | 348 | """A list of Remote objects allowing to access and manipulate remotes |
318 | 349 | :return: ``git.IterableList(Remote, ...)``""" |
319 | 350 | return Remote.list_items(self) |
320 | 351 | |
321 | def remote(self, name='origin'): | |
352 | def remote(self, name: str = 'origin') -> 'Remote': | |
322 | 353 | """:return: Remote with the specified name |
323 | 354 | :raise ValueError: if no remote with such a name exists""" |
324 | 355 | r = Remote(self, name) |
329 | 360 | #{ Submodules |
330 | 361 | |
331 | 362 | @property |
332 | def submodules(self): | |
363 | def submodules(self) -> 'IterableList[Submodule]': | |
333 | 364 | """ |
334 | 365 | :return: git.IterableList(Submodule, ...) of direct submodules |
335 | 366 | available from the current head""" |
336 | 367 | return Submodule.list_items(self) |
337 | 368 | |
338 | def submodule(self, name): | |
369 | def submodule(self, name: str) -> 'Submodule': | |
339 | 370 | """ :return: Submodule with the given name |
340 | 371 | :raise ValueError: If no such submodule exists""" |
341 | 372 | try: |
344 | 375 | raise ValueError("Didn't find submodule named %r" % name) from e |
345 | 376 | # END exception handling |
346 | 377 | |
347 | def create_submodule(self, *args, **kwargs): | |
378 | def create_submodule(self, *args: Any, **kwargs: Any) -> Submodule: | |
348 | 379 | """Create a new submodule |
349 | 380 | |
350 | 381 | :note: See the documentation of Submodule.add for a description of the |
352 | 383 | :return: created submodules""" |
353 | 384 | return Submodule.add(self, *args, **kwargs) |
354 | 385 | |
355 | def iter_submodules(self, *args, **kwargs): | |
386 | def iter_submodules(self, *args: Any, **kwargs: Any) -> Iterator[Submodule]: | |
356 | 387 | """An iterator yielding Submodule instances, see Traversable interface |
357 | 388 | for a description of args and kwargs |
358 | 389 | :return: Iterator""" |
359 | 390 | return RootModule(self).traverse(*args, **kwargs) |
360 | 391 | |
361 | def submodule_update(self, *args, **kwargs): | |
392 | def submodule_update(self, *args: Any, **kwargs: Any) -> Iterator[Submodule]: | |
362 | 393 | """Update the submodules, keeping the repository consistent as it will |
363 | 394 | take the previous state into consideration. For more information, please |
364 | 395 | see the documentation of RootModule.update""" |
367 | 398 | #}END submodules |
368 | 399 | |
369 | 400 | @property |
370 | def tags(self): | |
401 | def tags(self) -> 'IterableList[TagReference]': | |
371 | 402 | """A list of ``Tag`` objects that are available in this repo |
372 | 403 | :return: ``git.IterableList(TagReference, ...)`` """ |
373 | 404 | return TagReference.list_items(self) |
374 | 405 | |
375 | def tag(self, path): | |
406 | def tag(self, path: PathLike) -> TagReference: | |
376 | 407 | """:return: TagReference Object, reference pointing to a Commit or Tag |
377 | 408 | :param path: path to the tag reference, i.e. 0.1.5 or tags/0.1.5 """ |
378 | return TagReference(self, path) | |
379 | ||
380 | def create_head(self, path, commit='HEAD', force=False, logmsg=None): | |
409 | full_path = self._to_full_tag_path(path) | |
410 | return TagReference(self, full_path) | |
411 | ||
412 | @staticmethod | |
413 | def _to_full_tag_path(path: PathLike) -> str: | |
414 | path_str = str(path) | |
415 | if path_str.startswith(TagReference._common_path_default + '/'): | |
416 | return path_str | |
417 | if path_str.startswith(TagReference._common_default + '/'): | |
418 | return Reference._common_path_default + '/' + path_str | |
419 | else: | |
420 | return TagReference._common_path_default + '/' + path_str | |
421 | ||
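The new `_to_full_tag_path` helper above normalizes short tag names to full ref paths before constructing a `TagReference`. A standalone sketch, assuming `TagReference._common_path_default == 'refs/tags'`, `TagReference._common_default == 'tags'`, and `Reference._common_path_default == 'refs'` (GitPython's usual ref-path conventions):

```python
TAG_COMMON_PATH = 'refs/tags'  # assumed value of TagReference._common_path_default
TAG_COMMON = 'tags'            # assumed value of TagReference._common_default
REF_COMMON_PATH = 'refs'       # assumed value of Reference._common_path_default

def to_full_tag_path(path):
    """Normalize '0.1.5', 'tags/0.1.5', or 'refs/tags/0.1.5' to a full ref path."""
    path_str = str(path)
    if path_str.startswith(TAG_COMMON_PATH + '/'):
        return path_str                            # already fully qualified
    if path_str.startswith(TAG_COMMON + '/'):
        return REF_COMMON_PATH + '/' + path_str    # prepend 'refs/'
    return TAG_COMMON_PATH + '/' + path_str        # bare name: prepend 'refs/tags/'

print(to_full_tag_path('0.1.5'))            # refs/tags/0.1.5
print(to_full_tag_path('tags/0.1.5'))       # refs/tags/0.1.5
print(to_full_tag_path('refs/tags/0.1.5'))  # refs/tags/0.1.5
```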
422 | def create_head(self, path: PathLike, commit: str = 'HEAD', | |
423 | force: bool = False, logmsg: Optional[str] = None | |
424 | ) -> 'Head': | |
381 | 425 | """Create a new head within the repository. |
382 | 426 | For more documentation, please see the Head.create method. |
383 | 427 | |
384 | 428 | :return: newly created Head Reference""" |
385 | return Head.create(self, path, commit, force, logmsg) | |
386 | ||
387 | def delete_head(self, *heads, **kwargs): | |
429 | return Head.create(self, path, commit, logmsg, force) | |
430 | ||
431 | def delete_head(self, *heads: 'Head', **kwargs: Any) -> None: | |
388 | 432 | """Delete the given heads |
389 | 433 | |
390 | 434 | :param kwargs: Additional keyword arguments to be passed to git-branch""" |
391 | 435 | return Head.delete(self, *heads, **kwargs) |
392 | 436 | |
393 | def create_tag(self, path, ref='HEAD', message=None, force=False, **kwargs): | |
437 | def create_tag(self, path: PathLike, ref: str = 'HEAD', | |
438 | message: Optional[str] = None, force: bool = False, **kwargs: Any | |
439 | ) -> TagReference: | |
394 | 440 | """Create a new tag reference. |
395 | 441 | For more documentation, please see the TagReference.create method. |
396 | 442 | |
397 | 443 | :return: TagReference object """ |
398 | 444 | return TagReference.create(self, path, ref, message, force, **kwargs) |
399 | 445 | |
400 | def delete_tag(self, *tags): | |
446 | def delete_tag(self, *tags: TagReference) -> None: | |
401 | 447 | """Delete the given tag references""" |
402 | 448 | return TagReference.delete(self, *tags) |
403 | 449 | |
404 | def create_remote(self, name, url, **kwargs): | |
450 | def create_remote(self, name: str, url: str, **kwargs: Any) -> Remote: | |
405 | 451 | """Create a new remote. |
406 | 452 | |
407 | 453 | For more information, please see the documentation of the Remote.create |
410 | 456 | :return: Remote reference""" |
411 | 457 | return Remote.create(self, name, url, **kwargs) |
412 | 458 | |
413 | def delete_remote(self, remote): | |
459 | def delete_remote(self, remote: 'Remote') -> str: | |
414 | 460 | """Delete the given remote.""" |
415 | 461 | return Remote.remove(self, remote) |
416 | 462 | |
417 | def _get_config_path(self, config_level): | |
463 | def _get_config_path(self, config_level: Lit_config_levels) -> str: | |
418 | 464 | # we do not support an absolute path of the gitconfig on Windows,
419 | 465 | # use the global config instead |
420 | 466 | if is_win and config_level == "system": |
428 | 474 | elif config_level == "global": |
429 | 475 | return osp.normpath(osp.expanduser("~/.gitconfig")) |
430 | 476 | elif config_level == "repository": |
431 | return osp.normpath(osp.join(self._common_dir or self.git_dir, "config")) | |
432 | ||
433 | raise ValueError("Invalid configuration level: %r" % config_level) | |
434 | ||
435 | def config_reader(self, config_level=None): | |
477 | repo_dir = self._common_dir or self.git_dir | |
478 | if not repo_dir: | |
479 | raise NotADirectoryError | |
480 | else: | |
481 | return osp.normpath(osp.join(repo_dir, "config")) | |
482 | else: | |
483 | ||
484 | assert_never(config_level, # type:ignore[unreachable] | |
485 | ValueError(f"Invalid configuration level: {config_level!r}")) | |
486 | ||
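The new-side `_get_config_path` routes any unhandled `config_level` through `assert_never`, so the type checker can prove the `if/elif` chain is exhaustive over the `Lit_config_levels` literal. A minimal stdlib sketch of that pattern follows; `config_path_for` and the concrete paths are illustrative stand-ins, not GitPython's actual values.

```python
from typing import Literal, NoReturn, Optional

# the four config levels, expressed as a Literal type as in git/types.py
Lit_config_levels = Literal["system", "user", "global", "repository"]


def assert_never(value: NoReturn, exc: Optional[Exception] = None) -> NoReturn:
    """Runtime backstop for branches the type checker proves unreachable."""
    raise exc if exc is not None else ValueError(f"Unhandled value: {value!r}")


def config_path_for(level: Lit_config_levels, repo_dir: str) -> str:
    # hypothetical stand-in for Repo._get_config_path; real paths differ per OS
    if level == "system":
        return "/etc/gitconfig"
    elif level == "user":
        return "~/.config/git/config"
    elif level == "global":
        return "~/.gitconfig"
    elif level == "repository":
        return f"{repo_dir}/config"
    else:
        assert_never(level, ValueError(f"Invalid configuration level: {level!r}"))
```

If a fifth level were added to the `Literal` without a matching branch, mypy would flag the `assert_never` call; at runtime the supplied exception is raised instead of silently returning `None`.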
487 | def config_reader(self, config_level: Optional[Lit_config_levels] = None, | |
488 | ) -> GitConfigParser: | |
436 | 489 | """ |
437 | 490 | :return: |
438 | 491 | GitConfigParser allowing to read the full git configuration, but not to write it |
448 | 501 | unknown, instead the global path will be used.""" |
449 | 502 | files = None |
450 | 503 | if config_level is None: |
451 | files = [self._get_config_path(f) for f in self.config_level] | |
504 | files = [self._get_config_path(cast(Lit_config_levels, f)) | |
505 | for f in self.config_level if cast(Lit_config_levels, f)] | |
452 | 506 | else: |
453 | 507 | files = [self._get_config_path(config_level)] |
454 | 508 | return GitConfigParser(files, read_only=True, repo=self) |
455 | 509 | |
456 | def config_writer(self, config_level="repository"): | |
510 | def config_writer(self, config_level: Lit_config_levels = "repository" | |
511 | ) -> GitConfigParser: | |
457 | 512 | """ |
458 | 513 | :return: |
459 | 514 | GitConfigParser allowing to write values of the specified configuration file level. |
468 | 523 | repository = configuration file for this repository only""" |
469 | 524 | return GitConfigParser(self._get_config_path(config_level), read_only=False, repo=self) |
470 | 525 | |
471 | def commit(self, rev=None): | |
526 | def commit(self, rev: Union[str, Commit_ish, None] = None | |
527 | ) -> Commit: | |
472 | 528 | """The Commit object for the specified revision |
473 | 529 | |
474 | 530 | :param rev: revision specifier, see git-rev-parse for viable options. |
478 | 534 | return self.head.commit |
479 | 535 | return self.rev_parse(str(rev) + "^0") |
480 | 536 | |
481 | def iter_trees(self, *args, **kwargs): | |
537 | def iter_trees(self, *args: Any, **kwargs: Any) -> Iterator['Tree']: | |
482 | 538 | """:return: Iterator yielding Tree objects |
483 | 539 | :note: Takes all arguments known to iter_commits method""" |
484 | 540 | return (c.tree for c in self.iter_commits(*args, **kwargs)) |
485 | 541 | |
486 | def tree(self, rev=None): | |
542 | def tree(self, rev: Union[Tree_ish, str, None] = None) -> 'Tree': | |
487 | 543 | """The Tree object for the given treeish revision |
488 | 544 | Examples:: |
489 | 545 | |
500 | 556 | return self.head.commit.tree |
501 | 557 | return self.rev_parse(str(rev) + "^{tree}") |
502 | 558 | |
503 | def iter_commits(self, rev=None, paths='', **kwargs): | |
559 | def iter_commits(self, rev: Union[str, Commit, 'SymbolicReference', None] = None, | |
560 | paths: Union[PathLike, Sequence[PathLike]] = '', | |
561 | **kwargs: Any) -> Iterator[Commit]: | |
504 | 562 | """A list of Commit objects representing the history of a given ref/commit |
505 | 563 | |
506 | 564 | :param rev: |
524 | 582 | |
525 | 583 | return Commit.iter_items(self, rev, paths, **kwargs) |
526 | 584 | |
527 | def merge_base(self, *rev, **kwargs): | |
585 | def merge_base(self, *rev: TBD, **kwargs: Any | |
586 | ) -> List[Union[Commit_ish, None]]: | |
528 | 587 | """Find the closest common ancestor for the given revision (e.g. Commits, Tags, References, etc) |
529 | 588 | |
530 | 589 | :param rev: At least two revs to find the common ancestor for. |
537 | 596 | raise ValueError("Please specify at least two revs, got only %i" % len(rev)) |
538 | 597 | # end handle input |
539 | 598 | |
540 | res = [] | |
599 | res: List[Union[Commit_ish, None]] = [] | |
541 | 600 | try: |
542 | lines = self.git.merge_base(*rev, **kwargs).splitlines() | |
601 | lines = self.git.merge_base(*rev, **kwargs).splitlines() # List[str] | |
543 | 602 | except GitCommandError as err: |
544 | 603 | if err.status == 128: |
545 | 604 | raise |
555 | 614 | |
556 | 615 | return res |
557 | 616 | |
558 | def is_ancestor(self, ancestor_rev, rev): | |
617 | def is_ancestor(self, ancestor_rev: 'Commit', rev: 'Commit') -> bool: | |
559 | 618 | """Check if a commit is an ancestor of another |
560 | 619 | |
561 | 620 | :param ancestor_rev: Rev which should be an ancestor |
570 | 629 | raise |
571 | 630 | return True |
572 | 631 | |
573 | def _get_daemon_export(self): | |
574 | filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE) | |
632 | def is_valid_object(self, sha: str, object_type: Union[str, None] = None) -> bool: | |
633 | try: | |
634 | complete_sha = self.odb.partial_to_complete_sha_hex(sha) | |
635 | object_info = self.odb.info(complete_sha) | |
636 | if object_type: | |
637 | if object_info.type == object_type.encode(): | |
638 | return True | |
639 | else: | |
640 | log.debug("Commit hash points to an object of type '%s'. Requested were objects of type '%s'", | |
641 | object_info.type.decode(), object_type) | |
642 | return False | |
643 | else: | |
644 | return True | |
645 | except BadObject: | |
646 | log.debug("Commit hash is invalid.") | |
647 | return False | |
648 | ||
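`is_valid_object` is new in this snapshot: it resolves a possibly-partial sha through the object database and optionally checks the object's type. A self-contained sketch of that control flow, with a toy dict standing in for the gitdb-backed `odb` (the `ODB` contents and helper names are invented for illustration):

```python
from typing import Dict, Optional


class BadObject(Exception):
    """Raised when a (partial) sha matches no object, like gitdb's BadObject."""


# toy object database: full hexsha -> object type, stored as bytes like gitdb's
ODB: Dict[str, bytes] = {
    "3b7e6f3c6d1a": b"commit",
    "9a2f00ab11ee": b"blob",
}


def partial_to_complete(sha: str) -> str:
    matches = [full for full in ODB if full.startswith(sha)]
    if len(matches) != 1:  # missing or ambiguous prefix
        raise BadObject(sha)
    return matches[0]


def is_valid_object(sha: str, object_type: Optional[str] = None) -> bool:
    # mirrors the control flow of the new Repo.is_valid_object above
    try:
        found_type = ODB[partial_to_complete(sha)]
    except BadObject:
        return False
    if object_type is not None:
        # object types come back as bytes, so the requested str is encoded
        return found_type == object_type.encode()
    return True
```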
649 | def _get_daemon_export(self) -> bool: | |
650 | if self.git_dir: | |
651 | filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE) | |
575 | 652 | return osp.exists(filename) |
576 | 653 | |
577 | def _set_daemon_export(self, value): | |
578 | filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE) | |
654 | def _set_daemon_export(self, value: object) -> None: | |
655 | if self.git_dir: | |
656 | filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE) | |
579 | 657 | fileexists = osp.exists(filename) |
580 | 658 | if value and not fileexists: |
581 | 659 | touch(filename) |
587 | 665 | del _get_daemon_export |
588 | 666 | del _set_daemon_export |
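The `del _get_daemon_export` / `del _set_daemon_export` lines above use a classic idiom: build a `property` from private getter/setter functions, then delete the helper names from the class body so only the property is exposed. A runnable sketch under the same file-flag semantics (class and directory here are hypothetical):

```python
import os
import tempfile


class DaemonExport:
    """Sketch of the getter/setter/property/del pattern used by Repo."""
    DAEMON_EXPORT_FILE = "git-daemon-export-ok"

    def __init__(self, git_dir: str) -> None:
        self.git_dir = git_dir

    def _get_daemon_export(self) -> bool:
        return os.path.exists(os.path.join(self.git_dir, self.DAEMON_EXPORT_FILE))

    def _set_daemon_export(self, value: object) -> None:
        filename = os.path.join(self.git_dir, self.DAEMON_EXPORT_FILE)
        if value and not os.path.exists(filename):
            open(filename, "ab").close()  # touch the flag file
        elif not value and os.path.exists(filename):
            os.unlink(filename)

    daemon_export = property(_get_daemon_export, _set_daemon_export)
    # the property holds its own references, so the helper names can go
    del _get_daemon_export
    del _set_daemon_export


with tempfile.TemporaryDirectory() as d:
    repo = DaemonExport(d)
    before = repo.daemon_export   # flag file absent
    repo.daemon_export = True     # creates the flag file
    after = repo.daemon_export
```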
589 | 667 | |
590 | def _get_alternates(self): | |
668 | def _get_alternates(self) -> List[str]: | |
591 | 669 | """The list of alternates for this repo from which objects can be retrieved |
592 | 670 | |
593 | 671 | :return: list of strings being pathnames of alternates""" |
594 | alternates_path = osp.join(self.git_dir, 'objects', 'info', 'alternates') | |
672 | if self.git_dir: | |
673 | alternates_path = osp.join(self.git_dir, 'objects', 'info', 'alternates') | |
595 | 674 | |
596 | 675 | if osp.exists(alternates_path): |
597 | 676 | with open(alternates_path, 'rb') as f: |
599 | 678 | return alts.strip().splitlines() |
600 | 679 | return [] |
601 | 680 | |
602 | def _set_alternates(self, alts): | |
681 | def _set_alternates(self, alts: List[str]) -> None: | |
603 | 682 | """Sets the alternates |
604 | 683 | |
605 | 684 | :param alts: |
621 | 700 | alternates = property(_get_alternates, _set_alternates, |
622 | 701 | doc="Retrieve a list of alternates paths or set a list paths to be used as alternates") |
623 | 702 | |
624 | def is_dirty(self, index=True, working_tree=True, untracked_files=False, | |
625 | submodules=True, path=None): | |
703 | def is_dirty(self, index: bool = True, working_tree: bool = True, untracked_files: bool = False, | |
704 | submodules: bool = True, path: Optional[PathLike] = None) -> bool: | |
626 | 705 | """ |
627 | 706 | :return: |
628 | 707 | ``True`` if the repository is considered dirty. By default it will react
638 | 717 | if not submodules: |
639 | 718 | default_args.append('--ignore-submodules') |
640 | 719 | if path: |
641 | default_args.extend(["--", path]) | |
720 | default_args.extend(["--", str(path)]) | |
642 | 721 | if index: |
643 | 722 | # diff index against HEAD |
644 | 723 | if osp.isfile(self.index.path) and \ |
657 | 736 | return False |
658 | 737 | |
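The `is_dirty` change wraps `path` in `str()` because the parameter now accepts any `PathLike`, while the argument list handed to the git command must contain plain strings. A small sketch of that coercion (the `status_args` helper is invented for illustration; only the `["--", str(path)]` step comes from the code above):

```python
import os
from pathlib import Path
from typing import List, Optional, Union

PathLike = Union[str, "os.PathLike[str]"]


def status_args(path: Optional[PathLike] = None) -> List[str]:
    # command-line arguments must be plain strings, so any os.PathLike
    # (e.g. pathlib.Path) is coerced with str() before being appended
    args = ["--abbrev=40", "--full-index", "--raw"]
    if path:
        args.extend(["--", str(path)])
    return args
```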
659 | 738 | @property |
660 | def untracked_files(self): | |
739 | def untracked_files(self) -> List[str]: | |
661 | 740 | """ |
662 | 741 | :return: |
663 | 742 | list(str,...) |
672 | 751 | consider caching it yourself.""" |
673 | 752 | return self._get_untracked_files() |
674 | 753 | |
675 | def _get_untracked_files(self, *args, **kwargs): | |
754 | def _get_untracked_files(self, *args: Any, **kwargs: Any) -> List[str]: | |
676 | 755 | # make sure we get all files, not only untracked directories |
677 | 756 | proc = self.git.status(*args, |
678 | 757 | porcelain=True, |
696 | 775 | finalize_process(proc) |
697 | 776 | return untracked_files |
698 | 777 | |
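`_get_untracked_files` runs `git status --porcelain` and collects the `?? ` entries, un-escaping quoted paths. A stdlib sketch of that parsing step, fed a canned porcelain string rather than a live process (the quote handling here is simplified to one level of stripping, as the surrounding code also does):

```python
from typing import List


def parse_untracked(porcelain: str) -> List[str]:
    # `git status --porcelain` marks untracked entries with a "?? " prefix;
    # unusual paths come back quoted, so one level of quoting is stripped
    untracked: List[str] = []
    for line in porcelain.splitlines():
        if not line.startswith("?? "):
            continue
        filename = line[3:]
        if filename.startswith('"') and filename.endswith('"'):
            filename = filename[1:-1]
        untracked.append(filename)
    return untracked
```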
699 | def ignored(self, *paths): | |
778 | def ignored(self, *paths: PathLike) -> List[str]: | |
700 | 779 | """Checks if paths are ignored via .gitignore |
701 | 780 | This is done using the "git check-ignore" command.
702 | 781 | |
704 | 783 | :return: subset of those paths which are ignored |
705 | 784 | """ |
706 | 785 | try: |
707 | proc = self.git.check_ignore(*paths) | |
786 | proc: str = self.git.check_ignore(*paths) | |
708 | 787 | except GitCommandError: |
709 | 788 | return [] |
710 | 789 | return proc.replace("\\\\", "\\").replace('"', "").split("\n") |
711 | 790 | |
712 | 791 | @property |
713 | def active_branch(self): | |
792 | def active_branch(self) -> Head: | |
714 | 793 | """The name of the currently active branch. |
715 | ||
716 | 794 | :return: Head to the active branch""" |
795 | # reveal_type(self.head.reference) # => Reference | |
717 | 796 | return self.head.reference |
718 | 797 | |
719 | def blame_incremental(self, rev, file, **kwargs): | |
798 | def blame_incremental(self, rev: str | HEAD, file: str, **kwargs: Any) -> Iterator['BlameEntry']: | |
720 | 799 | """Iterator for blame information for the given file at the given revision. |
721 | 800 | |
722 | 801 | Unlike .blame(), this does not return the actual file's contents, only |
730 | 809 | If you combine all line number ranges outputted by this command, you |
731 | 810 | should get a continuous range spanning all line numbers in the file. |
732 | 811 | """ |
733 | data = self.git.blame(rev, '--', file, p=True, incremental=True, stdout_as_string=False, **kwargs) | |
734 | commits = {} | |
812 | ||
813 | data: bytes = self.git.blame(rev, '--', file, p=True, incremental=True, stdout_as_string=False, **kwargs) | |
814 | commits: Dict[bytes, Commit] = {} | |
735 | 815 | |
736 | 816 | stream = (line for line in data.split(b'\n') if line) |
737 | 817 | while True: |
739 | 819 | line = next(stream) # when exhausted, causes a StopIteration, terminating this function |
740 | 820 | except StopIteration: |
741 | 821 | return |
742 | hexsha, orig_lineno, lineno, num_lines = line.split() | |
743 | lineno = int(lineno) | |
744 | num_lines = int(num_lines) | |
745 | orig_lineno = int(orig_lineno) | |
822 | split_line = line.split() | |
823 | hexsha, orig_lineno_b, lineno_b, num_lines_b = split_line | |
824 | lineno = int(lineno_b) | |
825 | num_lines = int(num_lines_b) | |
826 | orig_lineno = int(orig_lineno_b) | |
746 | 827 | if hexsha not in commits: |
747 | 828 | # Now read the next few lines and build up a dict of properties |
748 | 829 | # for this commit |
749 | props = {} | |
830 | props: Dict[bytes, bytes] = {} | |
750 | 831 | while True: |
751 | 832 | try: |
752 | 833 | line = next(stream) |
790 | 871 | safe_decode(orig_filename), |
791 | 872 | range(orig_lineno, orig_lineno + num_lines)) |
792 | 873 | |
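The typed rewrite of `blame_incremental` above splits each header line into intermediate `*_b` names before converting, so the `int()` conversions don't confuse the type checker by rebinding one variable from `bytes` to `int`. A sketch of that header parse on a sample line (the sha and numbers are made up):

```python
from typing import Tuple


def parse_blame_header(line: bytes) -> Tuple[str, int, int, int]:
    # header lines from `git blame --porcelain --incremental` look like
    # b"<hexsha> <orig_lineno> <final_lineno> <num_lines>"
    hexsha, orig_lineno_b, lineno_b, num_lines_b = line.split()
    return hexsha.decode(), int(orig_lineno_b), int(lineno_b), int(num_lines_b)
```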
793 | def blame(self, rev, file, incremental=False, **kwargs): | |
874 | def blame(self, rev: Union[str, HEAD], file: str, incremental: bool = False, **kwargs: Any | |
875 | ) -> List[List[Commit | List[str | bytes] | None]] | Iterator[BlameEntry] | None: | |
794 | 876 | """The blame information for the given file at the given revision. |
795 | 877 | |
796 | 878 | :param rev: revision specifier, see git-rev-parse for viable options. |
797 | 879 | :return: |
798 | 880 | list: [git.Commit, list: [<line>]] |
799 | A list of tuples associating a Commit object with a list of lines that | |
881 | A list of lists associating a Commit object with a list of lines that | |
800 | 882 | changed within the given commit. The Commit objects will be given in order |
801 | 883 | of appearance.""" |
802 | 884 | if incremental: |
803 | 885 | return self.blame_incremental(rev, file, **kwargs) |
804 | 886 | |
805 | data = self.git.blame(rev, '--', file, p=True, stdout_as_string=False, **kwargs) | |
806 | commits = {} | |
807 | blames = [] | |
808 | info = None | |
887 | data: bytes = self.git.blame(rev, '--', file, p=True, stdout_as_string=False, **kwargs) | |
888 | commits: Dict[str, Commit] = {} | |
889 | blames: List[List[Commit | List[str | bytes] | None]] = [] | |
890 | ||
891 | class InfoTD(TypedDict, total=False): | |
892 | sha: str | |
893 | id: str | |
894 | filename: str | |
895 | summary: str | |
896 | author: str | |
897 | author_email: str | |
898 | author_date: int | |
899 | committer: str | |
900 | committer_email: str | |
901 | committer_date: int | |
902 | ||
903 | info: InfoTD = {} | |
809 | 904 | |
810 | 905 | keepends = True |
811 | for line in data.splitlines(keepends): | |
906 | for line_bytes in data.splitlines(keepends): | |
812 | 907 | try: |
813 | line = line.rstrip().decode(defenc) | |
908 | line_str = line_bytes.rstrip().decode(defenc) | |
814 | 909 | except UnicodeDecodeError: |
815 | 910 | firstpart = '' |
911 | parts = [] | |
816 | 912 | is_binary = True |
817 | 913 | else: |
818 | 914 | # We don't know where the binary data ends, as it could contain multiple newlines
819 | 915 | # along the way. So we rely on being able to decode the line to tell us what it is.
820 | 916 | # This can absolutely fail even on text files, but even if it does, we should be fine treating it |
821 | 917 | # as binary instead |
822 | parts = self.re_whitespace.split(line, 1) | |
918 | parts = self.re_whitespace.split(line_str, 1) | |
823 | 919 | firstpart = parts[0] |
824 | 920 | is_binary = False |
825 | 921 | # end handle decode of line |
850 | 946 | # committer-time 1192271832 |
851 | 947 | # committer-tz -0700 - IGNORED BY US |
852 | 948 | role = m.group(0) |
853 | if firstpart.endswith('-mail'): | |
854 | info["%s_email" % role] = parts[-1] | |
855 | elif firstpart.endswith('-time'): | |
856 | info["%s_date" % role] = int(parts[-1]) | |
857 | elif role == firstpart: | |
858 | info[role] = parts[-1] | |
949 | if role == 'author': | |
950 | if firstpart.endswith('-mail'): | |
951 | info["author_email"] = parts[-1] | |
952 | elif firstpart.endswith('-time'): | |
953 | info["author_date"] = int(parts[-1]) | |
954 | elif role == firstpart: | |
955 | info["author"] = parts[-1] | |
956 | elif role == 'committer': | |
957 | if firstpart.endswith('-mail'): | |
958 | info["committer_email"] = parts[-1] | |
959 | elif firstpart.endswith('-time'): | |
960 | info["committer_date"] = int(parts[-1]) | |
961 | elif role == firstpart: | |
962 | info["committer"] = parts[-1] | |
859 | 963 | # END distinguish mail,time,name |
860 | 964 | else: |
861 | 965 | # handle |
872 | 976 | c = commits.get(sha) |
873 | 977 | if c is None: |
874 | 978 | c = Commit(self, hex_to_bin(sha), |
875 | author=Actor._from_string(info['author'] + ' ' + info['author_email']), | |
979 | author=Actor._from_string(f"{info['author']} {info['author_email']}"), | |
876 | 980 | authored_date=info['author_date'], |
877 | 981 | committer=Actor._from_string( |
878 | info['committer'] + ' ' + info['committer_email']), | |
982 | f"{info['committer']} {info['committer_email']}"), | |
879 | 983 | committed_date=info['committer_date']) |
880 | 984 | commits[sha] = c |
985 | blames[-1][0] = c | |
881 | 986 | # END if commit objects needs initial creation |
882 | if not is_binary: | |
883 | if line and line[0] == '\t': | |
884 | line = line[1:] | |
885 | else: | |
886 | # NOTE: We are actually parsing lines out of binary data, which can lead to the | |
887 | # binary being split up along the newline separator. We will append this to the blame | |
888 | # we are currently looking at, even though it should be concatenated with the last line | |
889 | # we have seen. | |
890 | pass | |
891 | # end handle line contents | |
892 | blames[-1][0] = c | |
893 | blames[-1][1].append(line) | |
987 | ||
988 | if blames[-1][1] is not None: | |
989 | line: str | bytes | |
990 | if not is_binary: | |
991 | if line_str and line_str[0] == '\t': | |
992 | line_str = line_str[1:] | |
993 | line = line_str | |
994 | else: | |
995 | line = line_bytes | |
996 | # NOTE: We are actually parsing lines out of binary data, which can lead to the | |
997 | # binary being split up along the newline separator. We will append this to the | |
998 | # blame we are currently looking at, even though it should be concatenated with | |
999 | # the last line we have seen. | |
1000 | blames[-1][1].append(line) | |
1001 | ||
894 | 1002 | info = {'id': sha} |
895 | 1003 | # END if we collected commit info |
896 | 1004 | # END distinguish filename,summary,rest |
898 | 1006 | # END distinguish hexsha vs other information |
899 | 1007 | return blames |
900 | 1008 | |
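The `InfoTD` class introduced in `blame()` uses `TypedDict` with `total=False`, which makes every declared key optional; that matches how `info` is built up key by key while parsing. A trimmed, runnable sketch of the same idea (only a subset of the fields above is shown):

```python
import sys

# TypedDict is stdlib from Python 3.8; typing_extensions backports it earlier,
# mirroring the conditional import in git/types.py
if sys.version_info[:2] >= (3, 8):
    from typing import TypedDict
else:
    from typing_extensions import TypedDict


class InfoTD(TypedDict, total=False):
    # total=False makes every key optional, so the dict can start empty
    # and be filled in incrementally while parsing blame output
    sha: str
    author: str
    author_email: str
    author_date: int


info: InfoTD = {}
info["author"] = "Jane Doe"
info["author_date"] = 1192271832
```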
901 | @classmethod | |
902 | def init(cls, path=None, mkdir=True, odbt=GitCmdObjectDB, expand_vars=True, **kwargs): | |
1009 | @classmethod | 
1010 | def init(cls, path: Union[PathLike, None] = None, mkdir: bool = True, odbt: Type[GitCmdObjectDB] = GitCmdObjectDB, | |
1011 | expand_vars: bool = True, **kwargs: Any) -> 'Repo': | |
903 | 1012 | """Initialize a git repository at the given path if specified |
904 | 1013 | |
905 | 1014 | :param path: |
932 | 1041 | os.makedirs(path, 0o755) |
933 | 1042 | |
934 | 1043 | # git command automatically chdir into the directory |
935 | git = Git(path) | |
1044 | git = cls.GitCommandWrapperType(path) | |
936 | 1045 | git.init(**kwargs) |
937 | 1046 | return cls(path, odbt=odbt) |
938 | 1047 | |
939 | @classmethod | |
940 | def _clone(cls, git, url, path, odb_default_type, progress, multi_options=None, **kwargs): | |
941 | if progress is not None: | |
942 | progress = to_progress_instance(progress) | |
943 | ||
1048 | @classmethod | 
1049 | def _clone(cls, git: 'Git', url: PathLike, path: PathLike, odb_default_type: Type[GitCmdObjectDB], | |
1050 | progress: Union['RemoteProgress', 'UpdateProgress', Callable[..., 'RemoteProgress'], None] = None, | |
1051 | multi_options: Optional[List[str]] = None, **kwargs: Any | |
1052 | ) -> 'Repo': | |
944 | 1053 | odbt = kwargs.pop('odbt', odb_default_type) |
945 | 1054 | |
946 | 1055 | # when pathlib.Path or other classbased path is passed |
961 | 1070 | kwargs['separate_git_dir'] = Git.polish_url(sep_dir) |
962 | 1071 | multi = None |
963 | 1072 | if multi_options: |
964 | multi = ' '.join(multi_options).split(' ') | |
965 | proc = git.clone(multi, Git.polish_url(url), clone_path, with_extended_output=True, as_process=True, | |
1073 | multi = shlex.split(' '.join(multi_options)) | |
1074 | proc = git.clone(multi, Git.polish_url(str(url)), clone_path, with_extended_output=True, as_process=True, | |
966 | 1075 | v=True, universal_newlines=True, **add_progress(kwargs, git, progress)) |
967 | 1076 | if progress: |
968 | handle_process_output(proc, None, progress.new_message_handler(), finalize_process, decode_streams=False) | |
1077 | handle_process_output(proc, None, to_progress_instance(progress).new_message_handler(), | |
1078 | finalize_process, decode_streams=False) | |
969 | 1079 | else: |
970 | 1080 | (stdout, stderr) = proc.communicate() |
971 | log.debug("Cmd(%s)'s unused stdout: %s", getattr(proc, 'args', ''), stdout) | |
1081 | cmdline = getattr(proc, 'args', '') | |
1082 | cmdline = remove_password_if_present(cmdline) | |
1083 | ||
1084 | log.debug("Cmd(%s)'s unused stdout: %s", cmdline, stdout) | |
972 | 1085 | finalize_process(proc, stderr=stderr) |
973 | 1086 | |
974 | 1087 | # our git command could have a different working dir than our actual |
975 | 1088 | # environment, hence we prepend its working dir if required |
976 | if not osp.isabs(path) and git.working_dir: | |
977 | path = osp.join(git._working_dir, path) | |
1089 | if not osp.isabs(path): | |
1090 | path = osp.join(git._working_dir, path) if git._working_dir is not None else path | |
978 | 1091 | |
979 | 1092 | repo = cls(path, odbt=odbt) |
980 | 1093 | |
992 | 1105 | # END handle remote repo |
993 | 1106 | return repo |
994 | 1107 | |
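One fix in `_clone` above replaces `' '.join(multi_options).split(' ')` with `shlex.split(...)`: naive whitespace splitting destroys quoted option values, while `shlex` tokenizes them shell-style. The difference is easy to demonstrate (the option strings below are just sample inputs):

```python
import shlex

# clone options as a caller might pass them via multi_options
multi_options = ['--config "user.name=Jane Doe"', "--depth 1"]
joined = " ".join(multi_options)

naive = joined.split(" ")     # the old behaviour: breaks the quoted value
robust = shlex.split(joined)  # the new behaviour: shell-style tokens
```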
995 | def clone(self, path, progress=None, multi_options=None, **kwargs): | |
1108 | def clone(self, path: PathLike, progress: Optional[Callable] = None, | |
1109 | multi_options: Optional[List[str]] = None, **kwargs: Any) -> 'Repo': | |
996 | 1110 | """Create a clone from this repository. |
997 | 1111 | |
998 | 1112 | :param path: is the full path of the new repo (traditionally ends with ./<name>.git). |
1009 | 1123 | :return: ``git.Repo`` (the newly cloned repo)""" |
1010 | 1124 | return self._clone(self.git, self.common_dir, path, type(self.odb), progress, multi_options, **kwargs) |
1011 | 1125 | |
1012 | @classmethod | |
1013 | def clone_from(cls, url, to_path, progress=None, env=None, multi_options=None, **kwargs): | |
1126 | @classmethod | 
1127 | def clone_from(cls, url: PathLike, to_path: PathLike, progress: Optional[Callable] = None, | |
1128 | env: Optional[Mapping[str, str]] = None, | |
1129 | multi_options: Optional[List[str]] = None, **kwargs: Any) -> 'Repo': | |
1014 | 1130 | """Create a clone from the given URL |
1015 | 1131 | |
1016 | 1132 | :param url: valid git url, see http://www.kernel.org/pub/software/scm/git/docs/git-clone.html#URLS |
1025 | 1141 | :param multi_options: See ``clone`` method |
1026 | 1142 | :param kwargs: see the ``clone`` method |
1027 | 1143 | :return: Repo instance pointing to the cloned directory""" |
1028 | git = Git(os.getcwd()) | |
1144 | git = cls.GitCommandWrapperType(os.getcwd()) | |
1029 | 1145 | if env is not None: |
1030 | 1146 | git.update_environment(**env) |
1031 | 1147 | return cls._clone(git, url, to_path, GitCmdObjectDB, progress, multi_options, **kwargs) |
1032 | 1148 | |
1033 | def archive(self, ostream, treeish=None, prefix=None, **kwargs): | |
1149 | def archive(self, ostream: Union[TextIO, BinaryIO], treeish: Optional[str] = None, | |
1150 | prefix: Optional[str] = None, **kwargs: Any) -> Repo: | |
1034 | 1151 | """Archive the tree at the given revision. |
1035 | 1152 | |
1036 | 1153 | :param ostream: file compatible stream object to which the archive will be written as bytes |
1051 | 1168 | kwargs['prefix'] = prefix |
1052 | 1169 | kwargs['output_stream'] = ostream |
1053 | 1170 | path = kwargs.pop('path', []) |
1171 | path = cast(Union[PathLike, List[PathLike], Tuple[PathLike, ...]], path) | |
1054 | 1172 | if not isinstance(path, (tuple, list)): |
1055 | 1173 | path = [path] |
1056 | 1174 | # end assure paths is list |
1057 | ||
1058 | 1175 | self.git.archive(treeish, *path, **kwargs) |
1059 | 1176 | return self |
1060 | 1177 | |
1061 | def has_separate_working_tree(self): | |
1178 | def has_separate_working_tree(self) -> bool: | |
1062 | 1179 | """ |
1063 | 1180 | :return: True if our git_dir is not at the root of our working_tree_dir, but a .git file with a |
1064 | 1181 | platform agnostic symbolic link. Our git_dir will be wherever the .git file points to
1066 | 1183 | """ |
1067 | 1184 | if self.bare: |
1068 | 1185 | return False |
1069 | return osp.isfile(osp.join(self.working_tree_dir, '.git')) | |
1186 | if self.working_tree_dir: | |
1187 | return osp.isfile(osp.join(self.working_tree_dir, '.git')) | |
1188 | else: | |
1189 | return False # or raise Error? | |
1070 | 1190 | |
1071 | 1191 | rev_parse = rev_parse |
1072 | 1192 | |
1073 | def __repr__(self): | |
1193 | def __repr__(self) -> str: | |
1074 | 1194 | clazz = self.__class__ |
1075 | 1195 | return '<%s.%s %r>' % (clazz.__module__, clazz.__name__, self.git_dir) |
1076 | 1196 | |
1077 | def currently_rebasing_on(self): | |
1197 | def currently_rebasing_on(self) -> Commit | None: | |
1078 | 1198 | """ |
1079 | 1199 | :return: The commit which is currently being replayed while rebasing. |
1080 | 1200 | |
1081 | 1201 | None if we are not currently rebasing. |
1082 | 1202 | """ |
1083 | rebase_head_file = osp.join(self.git_dir, "REBASE_HEAD") | |
1203 | if self.git_dir: | |
1204 | rebase_head_file = osp.join(self.git_dir, "REBASE_HEAD") | |
1084 | 1205 | if not osp.isfile(rebase_head_file): |
1085 | 1206 | return None |
1086 | 1207 | return self.commit(open(rebase_head_file, "rt").readline().strip()) |
0 | 0 | """Package with general repository related functions""" |
1 | from __future__ import annotations | |
1 | 2 | import os |
2 | 3 | import stat |
3 | 4 | from string import digits |
14 | 15 | import os.path as osp |
15 | 16 | from git.cmd import Git |
16 | 17 | |
18 | # Typing ---------------------------------------------------------------------- | |
19 | ||
20 | from typing import Union, Optional, cast, TYPE_CHECKING | |
21 | from git.types import Commit_ish | |
22 | ||
23 | if TYPE_CHECKING: | |
24 | from git.types import PathLike | |
25 | from .base import Repo | |
26 | from git.db import GitCmdObjectDB | |
27 | from git.refs.reference import Reference | |
28 | from git.objects import Commit, TagObject, Blob, Tree | |
29 | from git.refs.tag import Tag | |
30 | ||
31 | # ---------------------------------------------------------------------------- | |
17 | 32 | |
18 | 33 | __all__ = ('rev_parse', 'is_git_dir', 'touch', 'find_submodule_git_dir', 'name_to_object', 'short_to_long', 'deref_tag', |
19 | 34 | 'to_commit', 'find_worktree_git_dir') |
20 | 35 | |
21 | 36 | |
22 | def touch(filename): | |
37 | def touch(filename: str) -> str: | |
23 | 38 | with open(filename, "ab"): |
24 | 39 | pass |
25 | 40 | return filename |
26 | 41 | |
27 | 42 | |
28 | def is_git_dir(d): | |
43 | def is_git_dir(d: 'PathLike') -> bool: | |
29 | 44 | """ This is taken from the git setup.c:is_git_directory |
30 | 45 | function. |
31 | 46 | |
47 | 62 | return False |
48 | 63 | |
49 | 64 | |
50 | def find_worktree_git_dir(dotgit): | |
65 | def find_worktree_git_dir(dotgit: 'PathLike') -> Optional[str]: | |
51 | 66 | """Search for a gitdir for this worktree.""" |
52 | 67 | try: |
53 | 68 | statbuf = os.stat(dotgit) |
66 | 81 | return None |
67 | 82 | |
68 | 83 | |
69 | def find_submodule_git_dir(d): | |
84 | def find_submodule_git_dir(d: 'PathLike') -> Optional['PathLike']: | |
70 | 85 | """Search for a submodule repo.""" |
71 | 86 | if is_git_dir(d): |
72 | 87 | return d |
74 | 89 | try: |
75 | 90 | with open(d) as fp: |
76 | 91 | content = fp.read().rstrip() |
77 | except (IOError, OSError): | |
92 | except IOError: | |
78 | 93 | # it's probably not a file |
79 | 94 | pass |
80 | 95 | else: |
91 | 106 | return None |
92 | 107 | |
93 | 108 | |
94 | def short_to_long(odb, hexsha): | |
109 | def short_to_long(odb: 'GitCmdObjectDB', hexsha: str) -> Optional[bytes]: | |
95 | 110 | """:return: long hexadecimal sha1 from the given less-than-40 byte hexsha |
96 | 111 | or None if no candidate could be found. |
97 | 112 | :param hexsha: hexsha with less than 40 byte""" |
102 | 117 | # END exception handling |
103 | 118 | |
104 | 119 | |
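`name_to_object` below first tests the name against `repo.re_hexsha_shortened` before attempting the `short_to_long` lookup. A sketch of that pre-check (the pattern here approximates GitPython's, which accepts 4 to 40 hex digits):

```python
import re

# approximately Repo.re_hexsha_shortened: 4 to 40 hexadecimal digits
re_hexsha_shortened = re.compile(r"^[0-9A-Fa-f]{4,40}$")


def looks_like_hexsha(name: str) -> bool:
    # branch names like "main" fail the match and fall through to ref lookup
    return re_hexsha_shortened.match(name) is not None
```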
105 | def name_to_object(repo, name, return_ref=False): | |
120 | def name_to_object(repo: 'Repo', name: str, return_ref: bool = False | |
121 | ) -> Union[SymbolicReference, 'Commit', 'TagObject', 'Blob', 'Tree']: | |
106 | 122 | """ |
107 | 123 | :return: object specified by the given name, hexshas ( short and long ) |
108 | 124 | as well as references are supported |
109 | 125 | :param return_ref: if name specifies a reference, we will return the reference |
110 | 126 | instead of the object. Otherwise it will raise BadObject or BadName |
111 | 127 | """ |
112 | hexsha = None | |
128 | hexsha: Union[None, str, bytes] = None | |
113 | 129 | |
114 | 130 | # is it a hexsha ? Try the most common ones, which is 7 to 40 |
115 | 131 | if repo.re_hexsha_shortened.match(name): |
149 | 165 | return Object.new_from_sha(repo, hex_to_bin(hexsha)) |
150 | 166 | |
151 | 167 | |
152 | def deref_tag(tag): | |
168 | def deref_tag(tag: 'Tag') -> 'TagObject': | |
153 | 169 | """Recursively dereference a tag and return the resulting object""" |
154 | 170 | while True: |
155 | 171 | try: |
160 | 176 | return tag |
161 | 177 | |
162 | 178 | |
163 | def to_commit(obj): | |
179 | def to_commit(obj: Object) -> Union['Commit', 'TagObject']: | |
164 | 180 | """Convert the given object to a commit if possible and return it""" |
165 | 181 | if obj.type == 'tag': |
166 | 182 | obj = deref_tag(obj) |
171 | 187 | return obj |
172 | 188 | |
173 | 189 | |
174 | def rev_parse(repo, rev): | |
190 | def rev_parse(repo: 'Repo', rev: str) -> Union['Commit', 'Tag', 'Tree', 'Blob']: | |
175 | 191 | """ |
176 | 192 | :return: Object at the given revision, either Commit, Tag, Tree or Blob |
177 | 193 | :param rev: git-rev-parse compatible revision specification as string, please see |
187 | 203 | raise NotImplementedError("commit by message search ( regex )") |
188 | 204 | # END handle search |
189 | 205 | |
190 | obj = None | |
206 | obj: Union[Commit_ish, 'Reference', None] = None | |
191 | 207 | ref = None |
192 | 208 | output_type = "commit" |
193 | 209 | start = 0 |
207 | 223 | ref = repo.head.ref |
208 | 224 | else: |
209 | 225 | if token == '@': |
210 | ref = name_to_object(repo, rev[:start], return_ref=True) | |
226 | ref = cast('Reference', name_to_object(repo, rev[:start], return_ref=True)) | |
211 | 227 | else: |
212 | obj = name_to_object(repo, rev[:start]) | |
228 | obj = cast(Commit_ish, name_to_object(repo, rev[:start])) | |
213 | 229 | # END handle token |
214 | 230 | # END handle refname |
231 | else: | |
232 | assert obj is not None | |
215 | 233 | |
216 | 234 | if ref is not None: |
217 | obj = ref.commit | |
235 | obj = cast('Commit', ref.commit) | |
218 | 236 | # END handle ref |
219 | 237 | # END initialize obj on first token |
220 | 238 | |
232 | 250 | pass # default |
233 | 251 | elif output_type == 'tree': |
234 | 252 | try: |
253 | obj = cast(Commit_ish, obj) | |
235 | 254 | obj = to_commit(obj).tree |
236 | 255 | except (AttributeError, ValueError): |
237 | 256 | pass # error raised later |
238 | 257 | # END exception handling |
239 | 258 | elif output_type in ('', 'blob'): |
240 | if obj.type == 'tag': | |
259 | obj = cast('TagObject', obj) | |
260 | if obj and obj.type == 'tag': | |
241 | 261 | obj = deref_tag(obj) |
242 | 262 | else: |
243 | 263 | # cannot do anything for non-tags |
265 | 285 | obj = Object.new_from_sha(repo, hex_to_bin(entry.newhexsha)) |
266 | 286 | |
267 | 287 | # make it pass the following checks |
268 | output_type = None | |
288 | output_type = '' | |
269 | 289 | else: |
270 | 290 | raise ValueError("Invalid output type: %s ( in %s )" % (output_type, rev)) |
271 | 291 | # END handle output type |
272 | 292 | |
273 | 293 | # empty output types don't require any specific type, it's just about dereferencing tags
274 | if output_type and obj.type != output_type: | |
294 | if output_type and obj and obj.type != output_type: | |
275 | 295 | raise ValueError("Could not accommodate requested object type %r, got %s" % (output_type, obj.type)) |
276 | 296 | # END verify output type |
277 | 297 | |
304 | 324 | parsed_to = start |
305 | 325 | # handle hierarchy walk |
306 | 326 | try: |
327 | obj = cast(Commit_ish, obj) | |
307 | 328 | if token == "~": |
308 | 329 | obj = to_commit(obj) |
309 | 330 | for _ in range(num): |
325 | 346 | # END end handle tag |
326 | 347 | except (IndexError, AttributeError) as e: |
327 | 348 | raise BadName( |
328 | "Invalid revision spec '%s' - not enough " | |
329 | "parent commits to reach '%s%i'" % (rev, token, num)) from e | |
349 | f"Invalid revision spec '{rev}' - not enough " | |
350 | f"parent commits to reach '{token}{int(num)}'") from e | |
330 | 351 | # END exception handling |
331 | 352 | # END parse loop |
332 | 353 | |
333 | 354 | # still no obj ? Its probably a simple name |
334 | 355 | if obj is None: |
335 | obj = name_to_object(repo, rev) | |
356 | obj = cast(Commit_ish, name_to_object(repo, rev)) | |
336 | 357 | parsed_to = lr |
337 | 358 | # END handle simple name |
338 | 359 |
0 | # -*- coding: utf-8 -*- | |
1 | # This module is part of GitPython and is released under | |
2 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php | |
3 | ||
4 | import os | |
5 | import sys | |
6 | from typing import (Callable, Dict, NoReturn, Sequence, Tuple, Union, Any, Iterator, # noqa: F401 | |
7 | NamedTuple, TYPE_CHECKING, TypeVar) # noqa: F401 | |
8 | ||
9 | if sys.version_info[:2] >= (3, 8): | |
10 | from typing import Final, Literal, SupportsIndex, TypedDict, Protocol, runtime_checkable # noqa: F401 | |
11 | else: | |
12 | from typing_extensions import (Final, Literal, SupportsIndex, # noqa: F401 | |
13 | TypedDict, Protocol, runtime_checkable) # noqa: F401 | |
14 | ||
15 | # if sys.version_info[:2] >= (3, 10): | |
16 | # from typing import TypeGuard # noqa: F401 | |
17 | # else: | |
18 | # from typing_extensions import TypeGuard # noqa: F401 | |
19 | ||
20 | ||
21 | if sys.version_info[:2] < (3, 9): | |
22 | PathLike = Union[str, os.PathLike] | |
23 | elif sys.version_info[:2] >= (3, 9): | |
24 | # os.PathLike only becomes subscriptable from Python 3.9 onwards | |
25 | PathLike = Union[str, os.PathLike] | |
26 | ||
27 | if TYPE_CHECKING: | |
28 | from git.repo import Repo | |
29 | from git.objects import Commit, Tree, TagObject, Blob | |
30 | # from git.refs import SymbolicReference | |
31 | ||
32 | TBD = Any | |
33 | _T = TypeVar('_T') | |
34 | ||
35 | Tree_ish = Union['Commit', 'Tree'] | |
36 | Commit_ish = Union['Commit', 'TagObject', 'Blob', 'Tree'] | |
37 | Lit_commit_ish = Literal['commit', 'tag', 'blob', 'tree'] | |
38 | ||
39 | # Config_levels --------------------------------------------------------- | |
40 | ||
41 | Lit_config_levels = Literal['system', 'global', 'user', 'repository'] | |
42 | ||
43 | ||
44 | # def is_config_level(inp: str) -> TypeGuard[Lit_config_levels]: | |
45 | # # return inp in get_args(Lit_config_level) # only py >= 3.8 | |
46 | # return inp in ("system", "user", "global", "repository") | |
47 | ||
48 | ||
49 | ConfigLevels_Tup = Tuple[Literal['system'], Literal['user'], Literal['global'], Literal['repository']] | |
50 | ||
51 | #----------------------------------------------------------------------------------- | |
52 | ||
53 | ||
54 | def assert_never(inp: NoReturn, raise_error: bool = True, exc: Union[Exception, None] = None) -> None: | |
55 | """For use in exhaustive checking of literal or Enum in if/else chain. | |
56 | Should only be reached if not all members are handled, or if a non-member is passed through the chain. | |
57 | ||
58 | If all members handled, type is Empty. Otherwise, will cause mypy error. | |
59 | If non-members given, should cause mypy error at variable creation. | |
60 | ||
61 | If raise_error is True, will also raise AssertionError or the Exception passed to exc. | |
62 | """ | |
63 | if raise_error: | |
64 | if exc is None: | |
65 | raise ValueError(f"An unhandled Literal ({inp}) in an if/else chain was found") | |
66 | else: | |
67 | raise exc | |
68 | else: | |
69 | pass | |
70 | ||
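The exhaustiveness helper above pairs with mypy at the end of a `Literal` if/else chain. A simplified, self-contained sketch of the intended usage (the `config_path` function and its return values are hypothetical, and `raise_error` is omitted for brevity):

```python
from typing import Literal, NoReturn, Union


def assert_never(inp: NoReturn, exc: Union[Exception, None] = None) -> None:
    # Reached only when a Literal member was not handled above.
    if exc is None:
        raise ValueError(f"An unhandled Literal ({inp}) in an if/else chain was found")
    raise exc


def config_path(level: Literal['system', 'global']) -> str:
    # Hypothetical consumer: if a branch is missing, mypy reports that
    # `level` has not been narrowed to Never at the assert_never() call.
    if level == 'system':
        return '/etc/gitconfig'
    elif level == 'global':
        return '~/.gitconfig'
    else:
        assert_never(level)
        return ''
```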
71 | ||
72 | class Files_TD(TypedDict): | |
73 | insertions: int | |
74 | deletions: int | |
75 | lines: int | |
76 | ||
77 | ||
78 | class Total_TD(TypedDict): | |
79 | insertions: int | |
80 | deletions: int | |
81 | lines: int | |
82 | files: int | |
83 | ||
84 | ||
85 | class HSH_TD(TypedDict): | |
86 | total: Total_TD | |
87 | files: Dict[PathLike, Files_TD] | |
88 | ||
89 | ||
90 | @runtime_checkable | |
91 | class Has_Repo(Protocol): | |
92 | repo: 'Repo' | |
93 | ||
94 | ||
95 | @runtime_checkable | |
96 | class Has_id_attribute(Protocol): | |
97 | _id_attribute_: str |
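The `runtime_checkable` protocols closing the new `types` module enable structural `isinstance()` checks. A small sketch (the `Head` class here is a stand-in for illustration, not the real `git.refs.Head`):

```python
from typing import Protocol, runtime_checkable  # Python 3.8+


@runtime_checkable
class Has_id_attribute(Protocol):
    _id_attribute_: str


class Head:
    # Stand-in mirroring how GitPython's iterable types advertise the
    # attribute used to identify them inside an IterableList.
    _id_attribute_ = "name"


# isinstance() only checks that the attribute is present, not its type.
```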
2 | 2 | # |
3 | 3 | # This module is part of GitPython and is released under |
4 | 4 | # the BSD License: http://www.opensource.org/licenses/bsd-license.php |
5 | ||
6 | from abc import abstractmethod | |
7 | from .exc import InvalidGitRepositoryError | |
8 | import os.path as osp | |
9 | from .compat import is_win | |
5 | 10 | import contextlib |
6 | 11 | from functools import wraps |
7 | 12 | import getpass |
15 | 20 | from sys import maxsize |
16 | 21 | import time |
17 | 22 | from unittest import SkipTest |
18 | ||
19 | from gitdb.util import (# NOQA @IgnorePep8 | |
23 | from urllib.parse import urlsplit, urlunsplit | |
24 | import warnings | |
25 | ||
26 | # from git.objects.util import Traversable | |
27 | ||
28 | # typing --------------------------------------------------------- | |
29 | ||
30 | from typing import (Any, AnyStr, BinaryIO, Callable, Dict, Generator, IO, Iterator, List, | |
31 | Optional, Pattern, Sequence, Tuple, TypeVar, Union, cast, | |
32 | TYPE_CHECKING, overload, ) | |
33 | ||
34 | import pathlib | |
35 | ||
36 | if TYPE_CHECKING: | |
37 | from git.remote import Remote | |
38 | from git.repo.base import Repo | |
39 | from git.config import GitConfigParser, SectionConstraint | |
40 | from git import Git | |
41 | # from git.objects.base import IndexObject | |
42 | ||
43 | ||
44 | from .types import (Literal, SupportsIndex, Protocol, runtime_checkable, # because behind py version guards | |
45 | PathLike, HSH_TD, Total_TD, Files_TD, # aliases | |
46 | Has_id_attribute) | |
47 | ||
48 | T_IterableObj = TypeVar('T_IterableObj', bound=Union['IterableObj', 'Has_id_attribute'], covariant=True) | |
49 | # So IterableList[Head] is subtype of IterableList[IterableObj] | |
50 | ||
51 | # --------------------------------------------------------------------- | |
52 | ||
53 | ||
54 | from gitdb.util import ( # NOQA @IgnorePep8 | |
20 | 55 | make_sha, |
21 | 56 | LockedFD, # @UnusedImport |
22 | 57 | file_contents_ro, # @UnusedImport |
28 | 63 | hex_to_bin, # @UnusedImport |
29 | 64 | ) |
30 | 65 | |
31 | from git.compat import is_win | |
32 | import os.path as osp | |
33 | ||
34 | from .exc import InvalidGitRepositoryError | |
35 | ||
36 | 66 | |
37 | 67 | # NOTE: Some of the unused imports might be used/imported by others. |
38 | 68 | # Handle once test-cases are back up and running. |
39 | 69 | # Most of these are unused here, but are for use by git-python modules so these |
40 | 70 | # don't see gitdb all the time. Flake of course doesn't like it. |
41 | 71 | __all__ = ["stream_copy", "join_path", "to_native_path_linux", |
42 | "join_path_native", "Stats", "IndexFileSHA1Writer", "Iterable", "IterableList", | |
72 | "join_path_native", "Stats", "IndexFileSHA1Writer", "IterableObj", "IterableList", | |
43 | 73 | "BlockingLockFile", "LockFile", 'Actor', 'get_user_id', 'assure_directory_exists', |
44 | 74 | 'RemoteProgress', 'CallableRemoteProgress', 'rmtree', 'unbare_repo', |
45 | 75 | 'HIDE_WINDOWS_KNOWN_ERRORS'] |
46 | 76 | |
47 | 77 | log = logging.getLogger(__name__) |
78 | ||
79 | # types############################################################ | |
80 | ||
48 | 81 | |
49 | 82 | #: We need an easy way to see if Appveyor TCs start failing, |
50 | 83 | #: so the errors marked with this var are considered "acknowledged" ones, awaiting remedy, |
52 | 85 | HIDE_WINDOWS_KNOWN_ERRORS = is_win and os.environ.get('HIDE_WINDOWS_KNOWN_ERRORS', True) |
53 | 86 | HIDE_WINDOWS_FREEZE_ERRORS = is_win and os.environ.get('HIDE_WINDOWS_FREEZE_ERRORS', True) |
54 | 87 | |
55 | #{ Utility Methods | |
56 | ||
57 | ||
58 | def unbare_repo(func): | |
88 | # { Utility Methods | |
89 | ||
90 | T = TypeVar('T') | |
91 | ||
92 | ||
93 | def unbare_repo(func: Callable[..., T]) -> Callable[..., T]: | |
59 | 94 | """Methods with this decorator raise InvalidGitRepositoryError if they |
60 | 95 | encounter a bare repository""" |
61 | 96 | |
62 | 97 | @wraps(func) |
63 | def wrapper(self, *args, **kwargs): | |
98 | def wrapper(self: 'Remote', *args: Any, **kwargs: Any) -> T: | |
64 | 99 | if self.repo.bare: |
65 | 100 | raise InvalidGitRepositoryError("Method '%s' cannot operate on bare repositories" % func.__name__) |
66 | 101 | # END bare method |
67 | 102 | return func(self, *args, **kwargs) |
68 | 103 | # END wrapper |
104 | ||
69 | 105 | return wrapper |
70 | 106 | |
71 | 107 | |
72 | 108 | @contextlib.contextmanager |
73 | def cwd(new_dir): | |
109 | def cwd(new_dir: PathLike) -> Generator[PathLike, None, None]: | |
74 | 110 | old_dir = os.getcwd() |
75 | 111 | os.chdir(new_dir) |
76 | 112 | try: |
79 | 115 | os.chdir(old_dir) |
80 | 116 | |
81 | 117 | |
82 | def rmtree(path): | |
118 | def rmtree(path: PathLike) -> None: | |
83 | 119 | """Remove the given directory tree recursively.
84 | 120 | |
85 | 121 | :note: we use shutil rmtree but adjust its behaviour to see whether files that |
86 | 122 | couldn't be deleted are read-only. Windows will not remove them in that case""" |
87 | 123 | |
88 | def onerror(func, path, exc_info): | |
124 | def onerror(func: Callable, path: PathLike, exc_info: str) -> None: | |
89 | 125 | # Is the error an access error ? |
90 | 126 | os.chmod(path, stat.S_IWUSR) |
91 | 127 | |
99 | 135 | return shutil.rmtree(path, False, onerror) |
100 | 136 | |
101 | 137 | |
102 | def rmfile(path): | |
138 | def rmfile(path: PathLike) -> None: | |
103 | 139 | """Ensure file deleted also on *Windows* where read-only files need special treatment.""" |
104 | 140 | if osp.isfile(path): |
105 | 141 | if is_win: |
107 | 143 | os.remove(path) |
108 | 144 | |
109 | 145 | |
110 | def stream_copy(source, destination, chunk_size=512 * 1024): | |
146 | def stream_copy(source: BinaryIO, destination: BinaryIO, chunk_size: int = 512 * 1024) -> int: | |
111 | 147 | """Copy all data from the source stream into the destination stream in chunks |
112 | 148 | of size chunk_size |
113 | 149 | |
123 | 159 | return br |
124 | 160 | |
125 | 161 | |
126 | def join_path(a, *p): | |
162 | def join_path(a: PathLike, *p: PathLike) -> PathLike: | |
127 | 163 | """Join path tokens together similar to osp.join, but always use |
128 | 164 | '/' instead of possibly '\' on windows.""" |
129 | path = a | |
165 | path = str(a) | |
130 | 166 | for b in p: |
167 | b = str(b) | |
131 | 168 | if not b: |
132 | 169 | continue |
133 | 170 | if b.startswith('/'): |
141 | 178 | |
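The hunk elides the rest of `join_path`'s body; as a rough sketch of the behaviour its docstring promises (always `'/'`, never `'\'`), under the assumption that segments are concatenated with deduplicated separators:

```python
def join_path(a, *p):
    # Join path tokens with '/' regardless of OS, skipping empty segments.
    path = str(a)
    for b in p:
        b = str(b)
        if not b:
            continue
        if b.startswith('/'):
            path += b[1:]
        elif path == '' or path.endswith('/'):
            path += b
        else:
            path += '/' + b
    return path
```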
142 | 179 | |
143 | 180 | if is_win: |
144 | def to_native_path_windows(path): | |
181 | def to_native_path_windows(path: PathLike) -> PathLike: | |
182 | path = str(path) | |
145 | 183 | return path.replace('/', '\\') |
146 | 184 | |
147 | def to_native_path_linux(path): | |
185 | def to_native_path_linux(path: PathLike) -> str: | |
186 | path = str(path) | |
148 | 187 | return path.replace('\\', '/') |
149 | 188 | |
150 | 189 | __all__.append("to_native_path_windows") |
151 | 190 | to_native_path = to_native_path_windows |
152 | 191 | else: |
153 | 192 | # no need for any work on linux |
154 | def to_native_path_linux(path): | |
155 | return path | |
193 | def to_native_path_linux(path: PathLike) -> str: | |
194 | return str(path) | |
195 | ||
156 | 196 | to_native_path = to_native_path_linux |
157 | 197 | |
158 | 198 | |
159 | def join_path_native(a, *p): | |
199 | def join_path_native(a: PathLike, *p: PathLike) -> PathLike: | |
160 | 200 | """ |
161 | 201 | As join path, but makes sure an OS native path is returned. This is only |
162 | 202 | needed to play it safe on my dear windows and to assure nice paths that only |
164 | 204 | return to_native_path(join_path(a, *p)) |
165 | 205 | |
166 | 206 | |
167 | def assure_directory_exists(path, is_file=False): | |
207 | def assure_directory_exists(path: PathLike, is_file: bool = False) -> bool: | |
168 | 208 | """Assure that the directory pointed to by path exists. |
169 | 209 | |
170 | 210 | :param is_file: If True, path is assumed to be a file and handled correctly. |
179 | 219 | return False |
180 | 220 | |
181 | 221 | |
182 | def _get_exe_extensions(): | |
222 | def _get_exe_extensions() -> Sequence[str]: | |
183 | 223 | PATHEXT = os.environ.get('PATHEXT', None) |
184 | return tuple(p.upper() for p in PATHEXT.split(os.pathsep)) \ | |
185 | if PATHEXT \ | |
186 | else (('.BAT', 'COM', '.EXE') if is_win else ()) | |
187 | ||
188 | ||
189 | def py_where(program, path=None): | |
224 | return tuple(p.upper() for p in PATHEXT.split(os.pathsep)) if PATHEXT \ | |
225 | else ('.BAT', 'COM', '.EXE') if is_win \ | |
226 | else ('') | |
227 | ||
228 | ||
229 | def py_where(program: str, path: Optional[PathLike] = None) -> List[str]: | |
190 | 230 | # From: http://stackoverflow.com/a/377028/548792 |
191 | 231 | winprog_exts = _get_exe_extensions() |
192 | 232 | |
193 | def is_exec(fpath): | |
233 | def is_exec(fpath: str) -> bool: | |
194 | 234 | return osp.isfile(fpath) and os.access(fpath, os.X_OK) and ( |
195 | 235 | os.name != 'nt' or not winprog_exts or any(fpath.upper().endswith(ext) |
196 | 236 | for ext in winprog_exts)) |
198 | 238 | progs = [] |
199 | 239 | if not path: |
200 | 240 | path = os.environ["PATH"] |
201 | for folder in path.split(os.pathsep): | |
241 | for folder in str(path).split(os.pathsep): | |
202 | 242 | folder = folder.strip('"') |
203 | 243 | if folder: |
204 | 244 | exe_path = osp.join(folder, program) |
208 | 248 | return progs |
209 | 249 | |
210 | 250 | |
211 | def _cygexpath(drive, path): | |
251 | def _cygexpath(drive: Optional[str], path: str) -> str: | |
212 | 252 | if osp.isabs(path) and not drive: |
213 | ## Invoked from `cygpath()` directly with `D:Apps\123`? | |
253 | # Invoked from `cygpath()` directly with `D:Apps\123`? | |
214 | 254 | # It's an error; leave it alone (just convert the slashes)
215 | p = path | |
255 | p = path # convert to str if AnyPath given | |
216 | 256 | else: |
217 | 257 | p = path and osp.normpath(osp.expandvars(osp.expanduser(path))) |
218 | 258 | if osp.isabs(p): |
223 | 263 | p = cygpath(p) |
224 | 264 | elif drive: |
225 | 265 | p = '/cygdrive/%s/%s' % (drive.lower(), p) |
226 | ||
227 | return p.replace('\\', '/') | |
228 | ||
229 | ||
230 | _cygpath_parsers = ( | |
231 | ## See: https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247(v=vs.85).aspx | |
232 | ## and: https://www.cygwin.com/cygwin-ug-net/using.html#unc-paths | |
266 | p_str = str(p) # ensure it is a str and not AnyPath | |
267 | return p_str.replace('\\', '/') | |
268 | ||
269 | ||
270 | _cygpath_parsers: Tuple[Tuple[Pattern[str], Callable, bool], ...] = ( | |
271 | # See: https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247(v=vs.85).aspx | |
272 | # and: https://www.cygwin.com/cygwin-ug-net/using.html#unc-paths | |
233 | 273 | (re.compile(r"\\\\\?\\UNC\\([^\\]+)\\([^\\]+)(?:\\(.*))?"), |
234 | 274 | (lambda server, share, rest_path: '//%s/%s/%s' % (server, share, rest_path.replace('\\', '/'))), |
235 | 275 | False |
236 | 276 | ), |
237 | 277 | |
238 | 278 | (re.compile(r"\\\\\?\\(\w):[/\\](.*)"), |
239 | _cygexpath, | |
279 | (_cygexpath), | |
240 | 280 | False |
241 | 281 | ), |
242 | 282 | |
243 | 283 | (re.compile(r"(\w):[/\\](.*)"), |
244 | _cygexpath, | |
284 | (_cygexpath), | |
245 | 285 | False |
246 | 286 | ), |
247 | 287 | |
248 | 288 | (re.compile(r"file:(.*)", re.I), |
249 | 289 | (lambda rest_path: rest_path), |
250 | True), | |
290 | True | |
291 | ), | |
251 | 292 | |
252 | 293 | (re.compile(r"(\w{2,}:.*)"), # remote URL, do nothing |
253 | 294 | (lambda url: url), |
254 | False), | |
295 | False | |
296 | ), | |
255 | 297 | ) |
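For the drive-letter pattern in the table above, the conversion `_cygexpath` performs on an absolute Windows path boils down to lower-casing the drive and flipping the slashes. A minimal illustration using that same regex:

```python
import re

# The drive-letter pattern from _cygpath_parsers.
drive_re = re.compile(r"(\w):[/\\](.*)")

m = drive_re.match(r"D:\Apps\repo")
assert m is not None
drive, rest = m.group(1), m.group(2)
# Same shape _cygexpath produces: '/cygdrive/<drive>/<posix path>'
cygdrive_path = '/cygdrive/%s/%s' % (drive.lower(), rest.replace('\\', '/'))
```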
256 | 298 | |
257 | 299 | |
258 | def cygpath(path): | |
300 | def cygpath(path: str) -> str: | |
259 | 301 | """Use :meth:`git.cmd.Git.polish_url()` instead, that works on any environment.""" |
302 | path = str(path) # ensure is str and not AnyPath. | |
303 | # Fix to use Paths when 3.5 dropped. or to be just str if only for urls? | |
260 | 304 | if not path.startswith(('/cygdrive', '//')): |
261 | 305 | for regex, parser, recurse in _cygpath_parsers: |
262 | 306 | match = regex.match(path) |
274 | 318 | _decygpath_regex = re.compile(r"/cygdrive/(\w)(/.*)?") |
275 | 319 | |
276 | 320 | |
277 | def decygpath(path): | |
321 | def decygpath(path: PathLike) -> str: | |
322 | path = str(path) | |
278 | 323 | m = _decygpath_regex.match(path) |
279 | 324 | if m: |
280 | 325 | drive, rest_path = m.groups() |
285 | 330 | |
286 | 331 | #: Store boolean flags denoting if a specific Git executable |
287 | 332 | #: is from a Cygwin installation (since `cache_lru()` unsupported on PY2). |
288 | _is_cygwin_cache = {} | |
289 | ||
290 | ||
291 | def is_cygwin_git(git_executable): | |
333 | _is_cygwin_cache: Dict[str, Optional[bool]] = {} | |
334 | ||
335 | ||
336 | @overload | |
337 | def is_cygwin_git(git_executable: None) -> Literal[False]: | |
338 | ... | |
339 | ||
340 | ||
341 | @overload | |
342 | def is_cygwin_git(git_executable: PathLike) -> bool: | |
343 | ... | |
344 | ||
345 | ||
346 | def is_cygwin_git(git_executable: Union[None, PathLike]) -> bool: | |
292 | 347 | if not is_win: |
293 | 348 | return False |
294 | 349 | |
295 | #from subprocess import check_output | |
296 | ||
297 | is_cygwin = _is_cygwin_cache.get(git_executable) | |
350 | if git_executable is None: | |
351 | return False | |
352 | ||
353 | git_executable = str(git_executable) | |
354 | is_cygwin = _is_cygwin_cache.get(git_executable) # type: Optional[bool] | |
298 | 355 | if is_cygwin is None: |
299 | 356 | is_cygwin = False |
300 | 357 | try: |
301 | 358 | git_dir = osp.dirname(git_executable) |
302 | 359 | if not git_dir: |
303 | 360 | res = py_where(git_executable) |
304 | git_dir = osp.dirname(res[0]) if res else None | |
305 | ||
306 | ## Just a name given, not a real path. | |
361 | git_dir = osp.dirname(res[0]) if res else "" | |
362 | ||
363 | # Just a name given, not a real path. | |
307 | 364 | uname_cmd = osp.join(git_dir, 'uname') |
308 | 365 | process = subprocess.Popen([uname_cmd], stdout=subprocess.PIPE, |
309 | 366 | universal_newlines=True) |
317 | 374 | return is_cygwin |
318 | 375 | |
319 | 376 | |
320 | def get_user_id(): | |
377 | def get_user_id() -> str: | |
321 | 378 | """:return: string identifying the currently active system user as name@node""" |
322 | 379 | return "%s@%s" % (getpass.getuser(), platform.node()) |
323 | 380 | |
324 | 381 | |
325 | def finalize_process(proc, **kwargs): | |
382 | def finalize_process(proc: Union[subprocess.Popen, 'Git.AutoInterrupt'], **kwargs: Any) -> None: | |
326 | 383 | """Wait for the process (clone, fetch, pull or push) and handle its errors accordingly""" |
327 | ## TODO: No close proc-streams?? | |
384 | # TODO: No close proc-streams?? | |
328 | 385 | proc.wait(**kwargs) |
329 | 386 | |
330 | 387 | |
331 | def expand_path(p, expand_vars=True): | |
388 | @overload | |
389 | def expand_path(p: None, expand_vars: bool = ...) -> None: | |
390 | ... | |
391 | ||
392 | ||
393 | @overload | |
394 | def expand_path(p: PathLike, expand_vars: bool = ...) -> str: | |
395 | # improve these overloads when 3.5 dropped | |
396 | ... | |
397 | ||
398 | ||
399 | def expand_path(p: Union[None, PathLike], expand_vars: bool = True) -> Optional[PathLike]: | |
400 | if isinstance(p, pathlib.Path): | |
401 | return p.resolve() | |
332 | 402 | try: |
333 | p = osp.expanduser(p) | |
403 | p = osp.expanduser(p) # type: ignore | |
334 | 404 | if expand_vars: |
335 | p = osp.expandvars(p) | |
336 | return osp.normpath(osp.abspath(p)) | |
405 | p = osp.expandvars(p) # type: ignore | |
406 | return osp.normpath(osp.abspath(p)) # type: ignore | |
337 | 407 | except Exception: |
338 | 408 | return None |
339 | 409 | |
340 | #} END utilities | |
341 | ||
342 | #{ Classes | |
410 | ||
411 | def remove_password_if_present(cmdline: Sequence[str]) -> List[str]: | |
412 | """ | |
413 | Parse any command line argument and, if one of the elements is a URL with a | |
414 | password, replace the password with stars (in-place). | |
415 | ||
416 | If no password is found, the command line is returned as-is. | |
417 | ||
418 | This should be used for every log line that prints a command line. | |
419 | """ | |
420 | new_cmdline = [] | |
421 | for index, to_parse in enumerate(cmdline): | |
422 | new_cmdline.append(to_parse) | |
423 | try: | |
424 | url = urlsplit(to_parse) | |
425 | # Remove password from the URL if present | |
426 | if url.password is None: | |
427 | continue | |
428 | ||
429 | edited_url = url._replace( | |
430 | netloc=url.netloc.replace(url.password, "*****")) | |
431 | new_cmdline[index] = urlunsplit(edited_url) | |
432 | except ValueError: | |
433 | # This is not a valid URL | |
434 | continue | |
435 | return new_cmdline | |
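Shown standalone so it can run in isolation, this is what the new helper does to a clone command that embeds credentials in the remote URL (the sample URL is invented):

```python
from typing import List, Sequence
from urllib.parse import urlsplit, urlunsplit


def remove_password_if_present(cmdline: Sequence[str]) -> List[str]:
    # Same logic as the function above, repeated verbatim.
    new_cmdline = []
    for index, to_parse in enumerate(cmdline):
        new_cmdline.append(to_parse)
        try:
            url = urlsplit(to_parse)
            if url.password is None:
                continue
            edited_url = url._replace(
                netloc=url.netloc.replace(url.password, "*****"))
            new_cmdline[index] = urlunsplit(edited_url)
        except ValueError:
            # This is not a valid URL
            continue
    return new_cmdline


masked = remove_password_if_present(
    ["git", "clone", "https://user:secret@example.com/repo.git"])
```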
436 | ||
437 | ||
438 | # } END utilities | |
439 | ||
440 | # { Classes | |
343 | 441 | |
344 | 442 | |
345 | 443 | class RemoteProgress(object): |
347 | 445 | Handler providing an interface to parse progress information emitted by git-push |
348 | 446 | and git-fetch and to dispatch callbacks allowing subclasses to react to the progress. |
349 | 447 | """ |
350 | _num_op_codes = 9 | |
448 | _num_op_codes: int = 9 | |
351 | 449 | BEGIN, END, COUNTING, COMPRESSING, WRITING, RECEIVING, RESOLVING, FINDING_SOURCES, CHECKING_OUT = \ |
352 | 450 | [1 << x for x in range(_num_op_codes)] |
353 | 451 | STAGE_MASK = BEGIN | END |
363 | 461 | re_op_absolute = re.compile(r"(remote: )?([\w\s]+):\s+()(\d+)()(.*)") |
364 | 462 | re_op_relative = re.compile(r"(remote: )?([\w\s]+):\s+(\d+)% \((\d+)/(\d+)\)(.*)") |
365 | 463 | |
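To see what these two class-level patterns capture, here is a quick check against a typical progress line (the sample line is invented for illustration):

```python
import re

# The patterns RemoteProgress uses to classify git transfer output.
re_op_absolute = re.compile(r"(remote: )?([\w\s]+):\s+()(\d+)()(.*)")
re_op_relative = re.compile(r"(remote: )?([\w\s]+):\s+(\d+)% \((\d+)/(\d+)\)(.*)")

match = re_op_relative.match("Compressing objects: 50% (1/2)")
assert match is not None
# Operation name, percentage, current count, and maximum count.
op_name, percent, cur_count, max_count = match.group(2, 3, 4, 5)
```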
366 | def __init__(self): | |
367 | self._seen_ops = [] | |
368 | self._cur_line = None | |
369 | self.error_lines = [] | |
370 | self.other_lines = [] | |
371 | ||
372 | def _parse_progress_line(self, line): | |
464 | def __init__(self) -> None: | |
465 | self._seen_ops: List[int] = [] | |
466 | self._cur_line: Optional[str] = None | |
467 | self.error_lines: List[str] = [] | |
468 | self.other_lines: List[str] = [] | |
469 | ||
470 | def _parse_progress_line(self, line: AnyStr) -> None: | |
373 | 471 | """Parse progress information from the given line as retrieved by git-push |
374 | 472 | or git-fetch. |
375 | 473 | |
376 | 474 | - Lines that do not contain progress info are stored in :attr:`other_lines`. |
377 | 475 | - Lines that seem to contain an error (i.e. start with error: or fatal:) are stored |
378 | in :attr:`error_lines`.""" | |
476 | in :attr:`error_lines`.""" | |
379 | 477 | # handle |
380 | 478 | # Counting objects: 4, done. |
381 | 479 | # Compressing objects: 50% (1/2) |
382 | 480 | # Compressing objects: 100% (2/2) |
383 | 481 | # Compressing objects: 100% (2/2), done. |
384 | self._cur_line = line = line.decode('utf-8') if isinstance(line, bytes) else line | |
385 | if self.error_lines or self._cur_line.startswith(('error:', 'fatal:')): | |
482 | if isinstance(line, bytes): # mypy argues about ternary assignment | |
483 | line_str = line.decode('utf-8') | |
484 | else: | |
485 | line_str = line | |
486 | self._cur_line = line_str | |
487 | ||
488 | if self._cur_line.startswith(('error:', 'fatal:')): | |
386 | 489 | self.error_lines.append(self._cur_line) |
387 | 490 | return |
388 | 491 | |
389 | 492 | # find escape characters and cut them away - regex will not work with |
390 | 493 | # them as they are non-ascii. As git might expect a tty, it will send them |
391 | 494 | last_valid_index = None |
392 | for i, c in enumerate(reversed(line)): | |
495 | for i, c in enumerate(reversed(line_str)): | |
393 | 496 | if ord(c) < 32: |
394 | 497 | # its a slice index |
395 | 498 | last_valid_index = -i - 1 |
396 | 499 | # END character was non-ascii |
397 | 500 | # END for each character in line |
398 | 501 | if last_valid_index is not None: |
399 | line = line[:last_valid_index] | |
502 | line_str = line_str[:last_valid_index] | |
400 | 503 | # END cut away invalid part |
401 | line = line.rstrip() | |
504 | line_str = line_str.rstrip() | |
402 | 505 | |
403 | 506 | cur_count, max_count = None, None |
404 | match = self.re_op_relative.match(line) | |
507 | match = self.re_op_relative.match(line_str) | |
405 | 508 | if match is None: |
406 | match = self.re_op_absolute.match(line) | |
509 | match = self.re_op_absolute.match(line_str) | |
407 | 510 | |
408 | 511 | if not match: |
409 | self.line_dropped(line) | |
410 | self.other_lines.append(line) | |
512 | self.line_dropped(line_str) | |
513 | self.other_lines.append(line_str) | |
411 | 514 | return |
412 | 515 | # END could not get match |
413 | 516 | |
436 | 539 | # This can't really be prevented, so we drop the line verbosely |
437 | 540 | # to make sure we get informed in case the process spits out new |
438 | 541 | # commands at some point. |
439 | self.line_dropped(line) | |
542 | self.line_dropped(line_str) | |
440 | 543 | # Note: Don't add this line to the other lines, as we have to silently |
441 | 544 | # drop it |
442 | return | |
545 | return None | |
443 | 546 | # END handle op code |
444 | 547 | |
445 | 548 | # figure out stage |
464 | 567 | max_count and float(max_count), |
465 | 568 | message) |
466 | 569 | |
467 | def new_message_handler(self): | |
570 | def new_message_handler(self) -> Callable[[str], None]: | |
468 | 571 | """ |
469 | 572 | :return: |
470 | 573 | a progress handler suitable for handle_process_output(), passing lines on to this Progress |
471 | 574 | handler in a suitable format""" |
472 | def handler(line): | |
575 | def handler(line: AnyStr) -> None: | |
473 | 576 | return self._parse_progress_line(line.rstrip()) |
474 | 577 | # end |
475 | 578 | return handler |
476 | 579 | |
477 | def line_dropped(self, line): | |
580 | def line_dropped(self, line: str) -> None: | |
478 | 581 | """Called whenever a line could not be understood and was therefore dropped.""" |
479 | 582 | pass |
480 | 583 | |
481 | def update(self, op_code, cur_count, max_count=None, message=''): | |
584 | def update(self, op_code: int, cur_count: Union[str, float], max_count: Union[str, float, None] = None, | |
585 | message: str = '',) -> None: | |
482 | 586 | """Called whenever the progress changes |
483 | 587 | |
484 | 588 | :param op_code: |
509 | 613 | """An implementation forwarding updates to any callable""" |
510 | 614 | __slots__ = ('_callable') |
511 | 615 | |
512 | def __init__(self, fn): | |
616 | def __init__(self, fn: Callable) -> None: | |
513 | 617 | self._callable = fn |
514 | 618 | super(CallableRemoteProgress, self).__init__() |
515 | 619 | |
516 | def update(self, *args, **kwargs): | |
620 | def update(self, *args: Any, **kwargs: Any) -> None: | |
517 | 621 | self._callable(*args, **kwargs) |
518 | 622 | |
519 | 623 | |
538 | 642 | |
539 | 643 | __slots__ = ('name', 'email') |
540 | 644 | |
541 | def __init__(self, name, email): | |
645 | def __init__(self, name: Optional[str], email: Optional[str]) -> None: | |
542 | 646 | self.name = name |
543 | 647 | self.email = email |
544 | 648 | |
545 | def __eq__(self, other): | |
649 | def __eq__(self, other: Any) -> bool: | |
546 | 650 | return self.name == other.name and self.email == other.email |
547 | 651 | |
548 | def __ne__(self, other): | |
652 | def __ne__(self, other: Any) -> bool: | |
549 | 653 | return not (self == other) |
550 | 654 | |
551 | def __hash__(self): | |
655 | def __hash__(self) -> int: | |
552 | 656 | return hash((self.name, self.email)) |
553 | 657 | |
554 | def __str__(self): | |
555 | return self.name | |
556 | ||
557 | def __repr__(self): | |
658 | def __str__(self) -> str: | |
659 | return self.name if self.name else "" | |
660 | ||
661 | def __repr__(self) -> str: | |
558 | 662 | return '<git.Actor "%s <%s>">' % (self.name, self.email) |
559 | 663 | |
560 | 664 | @classmethod |
561 | def _from_string(cls, string): | |
665 | def _from_string(cls, string: str) -> 'Actor': | |
562 | 666 | """Create an Actor from a string. |
563 | 667 | :param string: is the string, which is expected to be in regular git format |
564 | 668 | |
579 | 683 | # END handle name/email matching |
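The hunk does not show the pattern `_from_string` uses, so the regex below is a hypothetical stand-in that parses the same `Name <email>` shape its docstring describes:

```python
import re

# Hypothetical pattern, not GitPython's actual one.
actor_re = re.compile(r"(.*) <(.+?)>")

m = actor_re.match("Sebastian Thiel <byronimo@gmail.com>")
assert m is not None
name, email = m.group(1), m.group(2)
```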
580 | 684 | |
581 | 685 | @classmethod |
582 | def _main_actor(cls, env_name, env_email, config_reader=None): | |
686 | def _main_actor(cls, env_name: str, env_email: str, | |
687 | config_reader: Union[None, 'GitConfigParser', 'SectionConstraint'] = None) -> 'Actor': | |
583 | 688 | actor = Actor('', '') |
584 | 689 | user_id = None # We use this to avoid multiple calls to getpass.getuser() |
585 | 690 | |
586 | def default_email(): | |
691 | def default_email() -> str: | |
587 | 692 | nonlocal user_id |
588 | 693 | if not user_id: |
589 | 694 | user_id = get_user_id() |
590 | 695 | return user_id |
591 | 696 | |
592 | def default_name(): | |
697 | def default_name() -> str: | |
593 | 698 | return default_email().split('@')[0] |
594 | 699 | |
595 | 700 | for attr, evar, cvar, default in (('name', env_name, cls.conf_name, default_name), |
599 | 704 | setattr(actor, attr, val) |
600 | 705 | except KeyError: |
601 | 706 | if config_reader is not None: |
602 | setattr(actor, attr, config_reader.get_value('user', cvar, default())) | |
707 | try: | |
708 | val = config_reader.get('user', cvar) | |
709 | except Exception: | |
710 | val = default() | |
711 | setattr(actor, attr, val) | |
603 | 712 | # END config-reader handling |
604 | 713 | if not getattr(actor, attr): |
605 | 714 | setattr(actor, attr, default()) |
608 | 717 | return actor |
609 | 718 | |
610 | 719 | @classmethod |
611 | def committer(cls, config_reader=None): | |
720 | def committer(cls, config_reader: Union[None, 'GitConfigParser', 'SectionConstraint'] = None) -> 'Actor': | |
612 | 721 | """ |
613 | 722 | :return: Actor instance corresponding to the configured committer. It behaves |
614 | 723 | similar to the git implementation, such that the environment will override |
619 | 728 | return cls._main_actor(cls.env_committer_name, cls.env_committer_email, config_reader) |
620 | 729 | |
621 | 730 | @classmethod |
622 | def author(cls, config_reader=None): | |
731 | def author(cls, config_reader: Union[None, 'GitConfigParser', 'SectionConstraint'] = None) -> 'Actor': | |
623 | 732 | """Same as committer(), but defines the main author. It may be specified in the environment, |
624 | 733 | but defaults to the committer""" |
625 | 734 | return cls._main_actor(cls.env_author_name, cls.env_author_email, config_reader) |
653 | 762 | files = number of changed files as int""" |
654 | 763 | __slots__ = ("total", "files") |
655 | 764 | |
656 | def __init__(self, total, files): | |
765 | def __init__(self, total: Total_TD, files: Dict[PathLike, Files_TD]): | |
657 | 766 | self.total = total |
658 | 767 | self.files = files |
659 | 768 | |
660 | 769 | @classmethod |
661 | def _list_from_string(cls, repo, text): | |
770 | def _list_from_string(cls, repo: 'Repo', text: str) -> 'Stats': | |
662 | 771 | """Create a Stat object from output retrieved by git-diff. |
663 | 772 | |
664 | 773 | :return: git.Stat""" |
665 | hsh = {'total': {'insertions': 0, 'deletions': 0, 'lines': 0, 'files': 0}, 'files': {}} | |
774 | ||
775 | hsh: HSH_TD = {'total': {'insertions': 0, | |
776 | 'deletions': 0, | |
777 | 'lines': 0, | |
778 | 'files': 0}, | |
779 | 'files': {} | |
780 | } | |
666 | 781 | for line in text.splitlines(): |
667 | 782 | (raw_insertions, raw_deletions, filename) = line.split("\t") |
668 | 783 | insertions = raw_insertions != '-' and int(raw_insertions) or 0 |
671 | 786 | hsh['total']['deletions'] += deletions |
672 | 787 | hsh['total']['lines'] += insertions + deletions |
673 | 788 | hsh['total']['files'] += 1 |
674 | hsh['files'][filename.strip()] = {'insertions': insertions, | |
675 | 'deletions': deletions, | |
676 | 'lines': insertions + deletions} | |
789 | files_dict: Files_TD = {'insertions': insertions, | |
790 | 'deletions': deletions, | |
791 | 'lines': insertions + deletions} | |
792 | hsh['files'][filename.strip()] = files_dict | |
677 | 793 | return Stats(hsh['total'], hsh['files']) |
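The accumulation above consumes `git diff --numstat` output. A standalone sketch with a made-up two-file sample (a `-` marks a binary file with no line counts):

```python
from typing import Dict

text = "3\t1\tREADME.md\n-\t-\tlogo.png"  # invented sample numstat output

total = {'insertions': 0, 'deletions': 0, 'lines': 0, 'files': 0}
files: Dict[str, Dict[str, int]] = {}
for line in text.splitlines():
    raw_insertions, raw_deletions, filename = line.split("\t")
    insertions = int(raw_insertions) if raw_insertions != '-' else 0
    deletions = int(raw_deletions) if raw_deletions != '-' else 0
    total['insertions'] += insertions
    total['deletions'] += deletions
    total['lines'] += insertions + deletions
    total['files'] += 1
    files[filename.strip()] = {'insertions': insertions,
                               'deletions': deletions,
                               'lines': insertions + deletions}
```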
678 | 794 | |
679 | 795 | |
688 | 804 | :note: Based on the dulwich project""" |
689 | 805 | __slots__ = ("f", "sha1") |
690 | 806 | |
691 | def __init__(self, f): | |
807 | def __init__(self, f: IO) -> None: | |
692 | 808 | self.f = f |
693 | 809 | self.sha1 = make_sha(b"") |
694 | 810 | |
695 | def write(self, data): | |
811 | def write(self, data: AnyStr) -> int: | |
696 | 812 | self.sha1.update(data) |
697 | 813 | return self.f.write(data) |
698 | 814 | |
699 | def write_sha(self): | |
815 | def write_sha(self) -> bytes: | |
700 | 816 | sha = self.sha1.digest() |
701 | 817 | self.f.write(sha) |
702 | 818 | return sha |
703 | 819 | |
704 | def close(self): | |
820 | def close(self) -> bytes: | |
705 | 821 | sha = self.write_sha() |
706 | 822 | self.f.close() |
707 | 823 | return sha |
708 | 824 | |
709 | def tell(self): | |
825 | def tell(self) -> int: | |
710 | 826 | return self.f.tell() |
711 | 827 | |
712 | 828 | |
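The `IndexFileSHA1Writer` above tees every write into a SHA-1 digest so the trailing checksum required by git's index format can be appended at the end. A self-contained sketch of the same tee-and-checksum pattern (class and variable names here are illustrative, not GitPython's exports):

```python
import hashlib
import io
from typing import IO


class SHA1Writer:
    """Wrap a binary stream and hash everything written to it."""

    def __init__(self, f: IO[bytes]) -> None:
        self.f = f
        self.sha1 = hashlib.sha1(b"")

    def write(self, data: bytes) -> int:
        self.sha1.update(data)          # hash while writing
        return self.f.write(data)

    def write_sha(self) -> bytes:
        sha = self.sha1.digest()
        self.f.write(sha)               # trailing checksum, git-index style
        return sha


buf = io.BytesIO()
w = SHA1Writer(buf)
w.write(b"DIRC")                        # git index header magic
sha = w.write_sha()
assert buf.getvalue() == b"DIRC" + sha
```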
720 | 836 | Locks will automatically be released on destruction""" |
721 | 837 | __slots__ = ("_file_path", "_owns_lock") |
722 | 838 | |
723 | def __init__(self, file_path): | |
839 | def __init__(self, file_path: PathLike) -> None: | |
724 | 840 | self._file_path = file_path |
725 | 841 | self._owns_lock = False |
726 | 842 | |
727 | def __del__(self): | |
843 | def __del__(self) -> None: | |
728 | 844 | self._release_lock() |
729 | 845 | |
730 | def _lock_file_path(self): | |
846 | def _lock_file_path(self) -> str: | |
731 | 847 | """:return: Path to lockfile""" |
732 | 848 | return "%s.lock" % (self._file_path) |
733 | 849 | |
734 | def _has_lock(self): | |
850 | def _has_lock(self) -> bool: | |
735 | 851 | """:return: True if we have a lock and if the lockfile still exists |
736 | 852 | :raise AssertionError: if our lock-file does not exist""" |
737 | 853 | return self._owns_lock |
738 | 854 | |
739 | def _obtain_lock_or_raise(self): | |
855 | def _obtain_lock_or_raise(self) -> None: | |
740 | 856 | """Create a lock file as flag for other instances, mark our instance as lock-holder |
741 | 857 | |
742 | 858 | :raise IOError: if a lock was already present or a lock file could not be written""" |
758 | 874 | |
759 | 875 | self._owns_lock = True |
760 | 876 | |
761 | def _obtain_lock(self): | |
877 | def _obtain_lock(self) -> None: | |
762 | 878 | """The default implementation will raise if a lock cannot be obtained. |
763 | 879 | Subclasses may override this method to provide a different implementation""" |
764 | 880 | return self._obtain_lock_or_raise() |
765 | 881 | |
766 | def _release_lock(self): | |
882 | def _release_lock(self) -> None: | |
767 | 883 | """Release our lock if we have one""" |
768 | 884 | if not self._has_lock(): |
769 | 885 | return |
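The `LockFile` protocol above signals ownership by creating `<path>.lock` and releases it by deleting that file. A rough sketch of the protocol, assuming atomic creation via `O_CREAT | O_EXCL` (the sketch simplifies; the real class differs in details such as error types and `__del__` cleanup):

```python
import os
import tempfile


class SimpleLock:
    """Own the lock by being the process that created '<path>.lock'."""

    def __init__(self, file_path: str) -> None:
        self._lock_path = file_path + ".lock"
        self._owns_lock = False

    def acquire(self) -> None:
        # O_CREAT | O_EXCL makes creation atomic: it fails with
        # FileExistsError if another holder already created the file.
        fd = os.open(self._lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        self._owns_lock = True

    def release(self) -> None:
        if self._owns_lock:
            os.remove(self._lock_path)
            self._owns_lock = False


target = os.path.join(tempfile.mkdtemp(), "config")
lock = SimpleLock(target)
lock.acquire()
try:
    SimpleLock(target).acquire()        # a second holder must fail
    raise AssertionError("second acquisition should have failed")
except FileExistsError:
    pass
lock.release()
```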
788 | 904 | can never be obtained.""" |
789 | 905 | __slots__ = ("_check_interval", "_max_block_time") |
790 | 906 | |
791 | def __init__(self, file_path, check_interval_s=0.3, max_block_time_s=maxsize): | |
907 | def __init__(self, file_path: PathLike, check_interval_s: float = 0.3, max_block_time_s: int = maxsize) -> None: | |
792 | 908 | """Configure the instance |
793 | 909 | |
794 | 910 | :param check_interval_s: |
800 | 916 | self._check_interval = check_interval_s |
801 | 917 | self._max_block_time = max_block_time_s |
802 | 918 | |
803 | def _obtain_lock(self): | |
919 | def _obtain_lock(self) -> None: | |
804 | 920 | """This method blocks until it obtained the lock, or raises IOError if |
805 | 921 | it ran out of time or if the parent directory was not available anymore. |
806 | 922 | If this method returns, you are guaranteed to own the lock""" |
829 | 945 | # END endless loop |
830 | 946 | |
831 | 947 | |
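`BlockingLockFile._obtain_lock` above retries at `check_interval_s` until `max_block_time_s` elapses, then raises `IOError`. The retry loop can be sketched independently of the file system (the `try_acquire` callback and the timings below are illustrative):

```python
import time


def obtain_lock_blocking(try_acquire, check_interval_s: float = 0.01,
                         max_block_time_s: float = 0.5) -> None:
    """Call try_acquire() until it succeeds or the deadline passes.

    try_acquire() is expected to raise OSError while the lock is held
    elsewhere, mirroring the non-blocking acquisition above.
    """
    deadline = time.monotonic() + max_block_time_s
    while True:
        try:
            try_acquire()
            return
        except OSError:
            if time.monotonic() >= deadline:
                raise IOError("timed out waiting for lock")
            time.sleep(check_interval_s)


attempts = []

def flaky() -> None:
    attempts.append(1)
    if len(attempts) < 3:       # fail twice, then succeed
        raise OSError("locked")

obtain_lock_blocking(flaky)
```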
832 | class IterableList(list): | |
948 | class IterableList(List[T_IterableObj]): | |
833 | 949 | |
834 | 950 | """ |
835 | 951 | List of iterable objects allowing to query an object by id or by named index:: |
839 | 955 | heads['master'] |
840 | 956 | heads[0] |
841 | 957 | |
958 | Iterable parent objects = [Commit, SubModule, Reference, FetchInfo, PushInfo] | |
959 | Iterable via inheritance = [Head, TagReference, RemoteReference] | |
842 | 961 | It requires an id_attribute name to be set which will be queried from its |
843 | 962 | contained items to have a means for comparison. |
844 | 963 | |
847 | 966 | can be left out.""" |
848 | 967 | __slots__ = ('_id_attr', '_prefix') |
849 | 968 | |
850 | def __new__(cls, id_attr, prefix=''): | |
969 | def __new__(cls, id_attr: str, prefix: str = '') -> 'IterableList[IterableObj]': | |
851 | 970 | return super(IterableList, cls).__new__(cls) |
852 | 971 | |
853 | def __init__(self, id_attr, prefix=''): | |
972 | def __init__(self, id_attr: str, prefix: str = '') -> None: | |
854 | 973 | self._id_attr = id_attr |
855 | 974 | self._prefix = prefix |
856 | 975 | |
857 | def __contains__(self, attr): | |
976 | def __contains__(self, attr: object) -> bool: | |
858 | 977 | # first try identity match for performance |
859 | 978 | try: |
860 | 979 | rval = list.__contains__(self, attr) |
866 | 985 | |
867 | 986 | # otherwise make a full name search |
868 | 987 | try: |
869 | getattr(self, attr) | |
988 | getattr(self, cast(str, attr)) # use cast to silence mypy | |
870 | 989 | return True |
871 | 990 | except (AttributeError, TypeError): |
872 | 991 | return False |
873 | 992 | # END handle membership |
874 | 993 | |
875 | def __getattr__(self, attr): | |
994 | def __getattr__(self, attr: str) -> T_IterableObj: | |
876 | 995 | attr = self._prefix + attr |
877 | 996 | for item in self: |
878 | 997 | if getattr(item, self._id_attr) == attr: |
880 | 999 | # END for each item |
881 | 1000 | return list.__getattribute__(self, attr) |
882 | 1001 | |
883 | def __getitem__(self, index): | |
1002 | def __getitem__(self, index: Union[SupportsIndex, int, slice, str]) -> T_IterableObj: # type: ignore | |
1003 | ||
1004 | assert isinstance(index, (int, str, slice)), "Index of IterableList should be an int or str" | |
1005 | ||
884 | 1006 | if isinstance(index, int): |
885 | 1007 | return list.__getitem__(self, index) |
886 | ||
887 | try: | |
888 | return getattr(self, index) | |
889 | except AttributeError as e: | |
890 | raise IndexError("No item found with id %r" % (self._prefix + index)) from e | |
1008 | elif isinstance(index, slice): | |
1009 | raise ValueError("Index should be an int or str") | |
1010 | else: | |
1011 | try: | |
1012 | return getattr(self, index) | |
1013 | except AttributeError as e: | |
1014 | raise IndexError("No item found with id %r" % (self._prefix + index)) from e | |
891 | 1015 | # END handle getattr |
892 | 1016 | |
893 | def __delitem__(self, index): | |
894 | delindex = index | |
1017 | def __delitem__(self, index: Union[SupportsIndex, int, slice, str]) -> None: | |
1018 | ||
1019 | assert isinstance(index, (int, str)), "Index of IterableList should be an int or str" | |
1020 | ||
1021 | delindex = cast(int, index) | |
895 | 1022 | if not isinstance(index, int): |
896 | 1023 | delindex = -1 |
897 | 1024 | name = self._prefix + index |
908 | 1035 | list.__delitem__(self, delindex) |
909 | 1036 | |
910 | 1037 | |
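`IterableList` above lets callers address elements either positionally or by a name attribute, via `__getattr__`/`__getitem__` fallbacks. The lookup mechanics can be sketched with a small hypothetical `NamedList` (a simplification without the slice and `__delitem__` handling of the real class):

```python
from typing import Any


class NamedList(list):
    """List whose items can also be fetched by a configurable name attribute."""

    def __init__(self, id_attr: str, prefix: str = "") -> None:
        self._id_attr = id_attr
        self._prefix = prefix

    def __getattr__(self, attr: str) -> Any:
        name = self._prefix + attr
        for item in self:
            if getattr(item, self._id_attr) == name:
                return item
        # fall back to normal attribute lookup (raises AttributeError)
        return list.__getattribute__(self, attr)

    def __getitem__(self, index):
        if isinstance(index, int):
            return list.__getitem__(self, index)
        try:
            return getattr(self, index)
        except AttributeError as e:
            raise IndexError("No item found with id %r" % (self._prefix + index)) from e


class Head:
    def __init__(self, name: str) -> None:
        self.name = name


heads = NamedList("name")
heads.extend([Head("master"), Head("develop")])
assert heads["master"] is heads[0]      # lookup by name or by position
assert heads.develop is heads[1]        # attribute-style lookup
```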
911 | class Iterable(object): | |
1038 | class IterableClassWatcher(type): | |
1039 | """Metaclass that warns when the deprecated Iterable class is subclassed""" | 
1040 | def __init__(cls, name: str, bases: Tuple, clsdict: Dict) -> None: | |
1041 | for base in bases: | |
1042 | if type(base) == IterableClassWatcher: | |
1043 | warnings.warn(f"GitPython Iterable subclassed by {name}. " | |
1044 | "Iterable is deprecated due to naming clash since v3.1.18" | |
1045 | " and will be removed in 3.1.20, " | |
1046 | "Use IterableObj instead \n", | |
1047 | DeprecationWarning, | |
1048 | stacklevel=2) | |
1049 | ||
1050 | ||
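The `IterableClassWatcher` metaclass above emits a `DeprecationWarning` the moment anyone defines a subclass of the deprecated `Iterable`. The mechanism in isolation (names here are illustrative):

```python
import warnings


class Watcher(type):
    """Warn when a class is defined whose base uses this metaclass."""

    def __init__(cls, name, bases, clsdict):
        for base in bases:
            # only direct subclasses of a watched base trigger the warning;
            # defining the base itself (bases are plain types) does not
            if type(base) is Watcher:
                warnings.warn(f"{name} subclasses a deprecated base",
                              DeprecationWarning, stacklevel=2)


class OldBase(metaclass=Watcher):
    pass


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    class Child(OldBase):   # warning fires at class-definition time
        pass

assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```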
1051 | class Iterable(metaclass=IterableClassWatcher): | |
912 | 1052 | |
913 | 1053 | """Defines an interface for iterable items which is to assure a uniform |
914 | 1054 | way to retrieve and iterate items within the git repository""" |
916 | 1056 | _id_attribute_ = "attribute that most suitably identifies your instance" |
917 | 1057 | |
918 | 1058 | @classmethod |
919 | def list_items(cls, repo, *args, **kwargs): | |
1059 | def list_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> Any: | |
1060 | """ | |
1061 | Deprecated, use IterableObj instead. | |
1062 | Find all items of this type - subclasses can specify args and kwargs differently. | |
1063 | If no args are given, subclasses are obliged to return all items if no additional | |
1064 | arguments are given. | 
1065 | ||
1066 | :note: Favor the iter_items method as it will avoid loading all items into memory at once | 
1067 | ||
1068 | :return:list(Item,...) list of item instances""" | |
1069 | out_list: Any = IterableList(cls._id_attribute_) | |
1070 | out_list.extend(cls.iter_items(repo, *args, **kwargs)) | |
1071 | return out_list | |
1072 | ||
1073 | @classmethod | |
1074 | def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> Any: | |
1075 | # return typed to be compatible with subtypes e.g. Remote | |
1076 | """For more information about the arguments, see list_items | |
1077 | :return: iterator yielding Items""" | |
1078 | raise NotImplementedError("To be implemented by Subclass") | |
1079 | ||
1080 | ||
1081 | @runtime_checkable | |
1082 | class IterableObj(Protocol): | |
1083 | """Defines an interface for iterable items which is to assure a uniform | |
1084 | way to retrieve and iterate items within the git repository | |
1085 | ||
1086 | Subclasses = [Submodule, Commit, Reference, PushInfo, FetchInfo, Remote]""" | |
1087 | ||
1088 | __slots__ = () | |
1089 | _id_attribute_: str | |
1090 | ||
1091 | @classmethod | |
1092 | def list_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> IterableList[T_IterableObj]: | |
920 | 1093 | """ |
921 | 1094 | Find all items of this type - subclasses can specify args and kwargs differently. |
922 | 1095 | If no args are given, subclasses are obliged to return all items if no additional |
925 | 1098 | :note: Favor the iter_items method as it will avoid loading all items into memory at once
926 | 1099 | |
927 | 1100 | :return:list(Item,...) list of item instances""" |
928 | out_list = IterableList(cls._id_attribute_) | |
1101 | out_list: IterableList = IterableList(cls._id_attribute_) | |
929 | 1102 | out_list.extend(cls.iter_items(repo, *args, **kwargs)) |
930 | 1103 | return out_list |
931 | 1104 | |
932 | 1105 | @classmethod |
933 | def iter_items(cls, repo, *args, **kwargs): | |
1106 | @abstractmethod | |
1107 | def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any | |
1108 | ) -> Iterator[T_IterableObj]: # Iterator[T_IterableObj]: | |
1109 | # return typed to be compatible with subtypes e.g. Remote | |
934 | 1110 | """For more information about the arguments, see list_items |
935 | :return: iterator yielding Items""" | |
1111 | :return: iterator yielding Items""" | |
936 | 1112 | raise NotImplementedError("To be implemented by Subclass") |
937 | 1113 | |
938 | #} END classes | |
1114 | # } END classes | |
939 | 1115 | |
940 | 1116 | |
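`IterableObj` above is a `runtime_checkable` `Protocol`, so concrete repository types satisfy it structurally (by providing `_id_attribute_` and `iter_items`) rather than by inheriting from it. A compact sketch with a hypothetical `Tag` type standing in for GitPython's real subclasses:

```python
from typing import Any, Iterator, Protocol, runtime_checkable


@runtime_checkable
class IterableLike(Protocol):
    """Structural stand-in for IterableObj: an id attribute plus iter_items."""
    _id_attribute_: str

    @classmethod
    def iter_items(cls, repo: Any, *args: Any, **kwargs: Any) -> Iterator[Any]:
        ...


class Tag:
    _id_attribute_ = "name"

    def __init__(self, name: str) -> None:
        self.name = name

    @classmethod
    def iter_items(cls, repo: Any, *args: Any, **kwargs: Any) -> Iterator["Tag"]:
        # 'repo' would be a git.Repo in GitPython; a list of names stands in here
        return (cls(n) for n in repo)


tags = list(Tag.iter_items(["v1.0", "v1.1"]))
assert isinstance(tags[0], IterableLike)   # satisfied structurally, no inheritance
```

`isinstance` against a runtime-checkable protocol only checks member presence, not signatures, which is exactly what the deprecation migration above relies on.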
941 | 1117 | class NullHandler(logging.Handler): |
942 | def emit(self, record): | |
1118 | def emit(self, record: object) -> None: | |
943 | 1119 | pass |
0 | [build-system] | |
1 | requires = ["setuptools", "wheel"] | |
2 | build-backend = "setuptools.build_meta" | |
3 | ||
4 | [tool.pytest.ini_options] | |
5 | python_files = 'test_*.py' | |
6 | testpaths = 'test' # space separated list of paths from root e.g test tests doc/testing | 
7 | addopts = '--cov=git --cov-report=term --maxfail=10 --force-sugar --disable-warnings' | |
8 | filterwarnings = 'ignore::DeprecationWarning' | |
9 | # --cov coverage | |
10 | # --cov-report term # send report to terminal term-missing -> terminal with line numbers html xml | |
11 | # --cov-report term-missing # to terminal with line numbers | |
12 | # --cov-report html:path # html file at path | |
13 | # --maxfail # number of errors before giving up | |
14 | # --disable-warnings # Disable pytest warnings (not codebase warnings) | 
15 | # -rf # increased reporting of failures | |
16 | # -rE # increased reporting of errors | |
17 | # --ignore-glob=**/gitdb/* # ignore glob paths | |
18 | # filterwarnings ignore::WarningType # ignores those warnings | |
19 | ||
20 | [tool.mypy] | |
21 | disallow_untyped_defs = true | |
22 | no_implicit_optional = true | |
23 | warn_redundant_casts = true | |
24 | # warn_unused_ignores = true | |
25 | warn_unreachable = true | |
26 | show_error_codes = true | |
27 | implicit_reexport = true | |
28 | # strict = true | |
29 | ||
30 | # TODO: remove when 'gitdb' is fully annotated | |
31 | [[tool.mypy.overrides]] | |
32 | module = "gitdb.*" | |
33 | ignore_missing_imports = true | |
34 | ||
35 | [tool.coverage.run] | |
36 | source = ["git"] | |
37 | ||
38 | [tool.coverage.report] | |
39 | include = ["*/git/*"] | |
40 | omit = ["*/git/ext/*"] |
0 | #!/usr/bin/env python | |
1 | from __future__ import print_function | |
2 | try: | |
3 | from setuptools import setup, find_packages | |
4 | except ImportError: | |
5 | from ez_setup import use_setuptools | |
6 | use_setuptools() | |
7 | from setuptools import setup, find_packages | |
8 | ||
9 | from distutils.command.build_py import build_py as _build_py | |
0 | from typing import Sequence | |
1 | from setuptools import setup, find_packages | |
2 | from setuptools.command.build_py import build_py as _build_py | |
10 | 3 | from setuptools.command.sdist import sdist as _sdist |
11 | 4 | import fnmatch |
12 | 5 | import os |
25 | 18 | |
26 | 19 | class build_py(_build_py): |
27 | 20 | |
28 | def run(self): | |
21 | def run(self) -> None: | |
29 | 22 | init = path.join(self.build_lib, 'git', '__init__.py') |
30 | 23 | if path.exists(init): |
31 | 24 | os.unlink(init) |
36 | 29 | |
37 | 30 | class sdist(_sdist): |
38 | 31 | |
39 | def make_release_tree(self, base_dir, files): | |
32 | def make_release_tree(self, base_dir: str, files: Sequence) -> None: | |
40 | 33 | _sdist.make_release_tree(self, base_dir, files) |
41 | 34 | orig = path.join('git', '__init__.py') |
42 | 35 | assert path.exists(orig), orig |
47 | 40 | _stamp_version(dest) |
48 | 41 | |
49 | 42 | |
50 | def _stamp_version(filename): | |
43 | def _stamp_version(filename: str) -> None: | |
51 | 44 | found, out = False, [] |
52 | 45 | try: |
53 | 46 | with open(filename, 'r') as f: |
56 | 49 | line = line.replace("'git'", "'%s'" % VERSION) |
57 | 50 | found = True |
58 | 51 | out.append(line) |
59 | except (IOError, OSError): | |
52 | except OSError: | |
60 | 53 | print("Couldn't find file %s to stamp version" % filename, file=sys.stderr) |
61 | 54 | |
62 | 55 | if found: |
66 | 59 | print("WARNING: Couldn't find version line in file %s" % filename, file=sys.stderr) |
67 | 60 | |
68 | 61 | |
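`_stamp_version` above rewrites the `__version__ = 'git'` placeholder in the copied `git/__init__.py` during build/sdist. The same transformation in isolation (file name and version string are illustrative):

```python
import os
import tempfile

VERSION = "3.1.20"


def stamp_version(filename: str) -> None:
    """Replace the 'git' placeholder on the __version__ line with VERSION."""
    found, out = False, []
    with open(filename) as f:
        for line in f:
            if "__version__ =" in line:
                line = line.replace("'git'", "'%s'" % VERSION)
                found = True
            out.append(line)
    if found:
        with open(filename, "w") as f:
            f.writelines(out)


path = os.path.join(tempfile.mkdtemp(), "__init__.py")
with open(path, "w") as f:
    f.write("__version__ = 'git'\n")
stamp_version(path)
```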
69 | def build_py_modules(basedir, excludes=[]): | |
62 | def build_py_modules(basedir: str, excludes: Sequence = ()) -> Sequence: | |
70 | 63 | # create list of py_modules from tree |
71 | 64 | res = set() |
72 | 65 | _prefix = os.path.basename(basedir) |
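`build_py_modules` above walks the source tree and turns file paths into dotted module names, filtering with `fnmatch` patterns such as `git.ext.*`. Since the diff only shows its first lines, the following is a hedged reconstruction of how such a helper might look, not the exact upstream implementation:

```python
import fnmatch
import os
import tempfile


def list_py_modules(basedir: str, excludes=()) -> list:
    """Collect dotted module names under basedir, skipping fnmatch excludes."""
    res = set()
    prefix = os.path.basename(basedir)
    for root, _dirs, files in os.walk(basedir):
        for fname in files:
            if not fname.endswith(".py"):
                continue
            # path relative to basedir, minus '.py', joined with dots
            rel = os.path.relpath(os.path.join(root, fname[:-3]), basedir)
            module = ".".join([prefix] + rel.split(os.sep))
            if not any(fnmatch.fnmatch(module, pat) for pat in excludes):
                res.add(module)
    return sorted(res)


# demonstrate on a throwaway tree: git/util.py kept, git/ext/* excluded
base = tempfile.mkdtemp()
pkg = os.path.join(base, "git")
os.makedirs(os.path.join(pkg, "ext"))
open(os.path.join(pkg, "util.py"), "w").close()
open(os.path.join(pkg, "ext", "vendored.py"), "w").close()
mods = list_py_modules(pkg, excludes=["git.ext.*"])
```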
93 | 86 | author_email="byronimo@gmail.com, mtrier@gmail.com", |
94 | 87 | license="BSD", |
95 | 88 | url="https://github.com/gitpython-developers/GitPython", |
96 | packages=find_packages(exclude=("test.*")), | |
89 | packages=find_packages(exclude=["test", "test.*"]), | |
97 | 90 | include_package_data=True, |
98 | 91 | py_modules=build_py_modules("./git", excludes=["git.ext.*"]), |
99 | 92 | package_dir={'git': 'git'}, |
100 | python_requires='>=3.4', | |
93 | python_requires='>=3.7', | |
101 | 94 | install_requires=requirements, |
102 | 95 | tests_require=requirements + test_requirements, |
103 | 96 | zip_safe=False, |
119 | 112 | "Operating System :: POSIX", |
120 | 113 | "Operating System :: Microsoft :: Windows", |
121 | 114 | "Operating System :: MacOS :: MacOS X", |
115 | "Typing :: Typed", | 
122 | 116 | "Programming Language :: Python", |
123 | 117 | "Programming Language :: Python :: 3", |
124 | "Programming Language :: Python :: 3.4", | |
125 | "Programming Language :: Python :: 3.5", | |
126 | "Programming Language :: Python :: 3.6", | |
127 | 118 | "Programming Language :: Python :: 3.7", |
128 | 119 | "Programming Language :: Python :: 3.8", |
129 | "Programming Language :: Python :: 3.9" | |
120 | "Programming Language :: Python :: 3.9", | 
121 | "Programming Language :: Python :: 3.10" | |
130 | 122 | ] |
131 | 123 | ) |