Codebase: python-git / 64e593c
Commit: New upstream snapshot. (Debian Janitor)
57 changed file(s) with 3393 addition(s) and 1792 deletion(s).
 -Liam Beguin <liambeguin _at_ gmail.com>
 -Ram Rachum <ram _at_ rachum.com>
 -Alba Mendez <me _at_ alba.sh>
+-Robert Westman <robert _at_ byteflux.io>
 Portions derived from other open source works and are clearly marked.
 Please see the online documentation for the latest changelog:
-https://github.com/gitpython-developers/GitPython/blob/master/doc/source/changes.rst
+https://github.com/gitpython-developers/GitPython/blob/main/doc/source/changes.rst
-### How to contribute
+# How to contribute
 
 The following is a short step-by-step rundown of what one typically would do to contribute.
 
-* [fork this project](https://github.com/gitpython-developers/GitPython/fork) on GitHub.
-* For setting up the environment to run the self tests, please look at `.travis.yml`.
-* Please try to **write a test that fails unless the contribution is present.**
-* Feel free to add yourself to AUTHORS file.
-* Create a pull request.
+- [fork this project](https://github.com/gitpython-developers/GitPython/fork) on GitHub.
+- For setting up the environment to run the self tests, please run `init-tests-after-clone.sh`.
+- Please try to **write a test that fails unless the contribution is present.**
+- Try to avoid massive commits and prefer to take small steps, with one commit for each.
+- Feel free to add yourself to AUTHORS file.
+- Create a pull request.
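A regression test of the kind asked for above can be very small. The sketch below is illustrative only; the helper and test names are hypothetical and not part of the project:

```python
# A hypothetical regression test: it fails unless the contributed
# behavior (underscores become dashes, as git subcommand names expect)
# is present.
def dashify(name: str) -> str:
    return name.replace('_', '-')


def test_dashify_converts_underscores() -> None:
    assert dashify("cherry_pick") == "cherry-pick"
```

Under pytest, any function named `test_*` is collected automatically, so a failing `assert` is all the contribution check needs.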
 Metadata-Version: 1.2
 Name: GitPython
-Version: 3.1.14
+Version: 3.1.20
 Summary: Python Git Library
 Home-page: https://github.com/gitpython-developers/GitPython
 Author: Sebastian Thiel, Michael Trier
 Classifier: Operating System :: POSIX
 Classifier: Operating System :: Microsoft :: Windows
 Classifier: Operating System :: MacOS :: MacOS X
+Classifier: Typing :: Typed
 Classifier: Programming Language :: Python
 Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.4
-Classifier: Programming Language :: Python :: 3.5
-Classifier: Programming Language :: Python :: 3.6
 Classifier: Programming Language :: Python :: 3.7
 Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Requires-Python: >=3.4
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Requires-Python: >=3.7
 MANIFEST.in
 README.md
 VERSION
+pyproject.toml
 requirements.txt
 setup.py
 test-requirements.txt
 git/db.py
 git/diff.py
 git/exc.py
+git/py.typed
 git/remote.py
+git/types.py
 git/util.py
 git/index/__init__.py
 git/index/base.py
 gitdb<5,>=4.0.1
+
+[:python_version < "3.10"]
+typing-extensions>=3.7.4.3
+include AUTHORS
+include CHANGES
+include CONTRIBUTING.md
+include LICENSE
+include README.md
 include VERSION
-include LICENSE
-include CHANGES
-include AUTHORS
-include CONTRIBUTING.md
-include README.md
 include requirements.txt
+include test-requirements.txt
+include git/py.typed
 
 recursive-include doc *
 recursive-exclude test *
 Metadata-Version: 1.2
 Name: GitPython
-Version: 3.1.14
+Version: 3.1.20
 Summary: Python Git Library
 Home-page: https://github.com/gitpython-developers/GitPython
 Author: Sebastian Thiel, Michael Trier
 Classifier: Operating System :: POSIX
 Classifier: Operating System :: Microsoft :: Windows
 Classifier: Operating System :: MacOS :: MacOS X
+Classifier: Typing :: Typed
 Classifier: Programming Language :: Python
 Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.4
-Classifier: Programming Language :: Python :: 3.5
-Classifier: Programming Language :: Python :: 3.6
 Classifier: Programming Language :: Python :: 3.7
 Classifier: Programming Language :: Python :: 3.8
-Classifier: Programming Language :: Python :: 3.9
-Requires-Python: >=3.4
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Requires-Python: >=3.7
+![Python package](https://github.com/gitpython-developers/GitPython/workflows/Python%20package/badge.svg)
+[![Documentation Status](https://readthedocs.org/projects/gitpython/badge/?version=stable)](https://readthedocs.org/projects/gitpython/?badge=stable)
+[![Packaging status](https://repology.org/badge/tiny-repos/python:gitpython.svg)](https://repology.org/metapackage/python:gitpython/versions)
+
 ## [Gitoxide](https://github.com/Byron/gitoxide): A peek into the future…
 
 I started working on GitPython in 2009, back in the days when Python was 'my thing' and I had great plans with it.
 
 It provides abstractions of git objects for easy access of repository data, and additionally
 allows you to access the git repository more directly using either a pure python implementation,
-or the faster, but more resource intensive *git command* implementation.
+or the faster, but more resource intensive _git command_ implementation.
 
 The object database implementation is optimized for handling large quantities of objects and large datasets,
 which is achieved by using low-level structures and data streaming.
 
+### DEVELOPMENT STATUS
+
+This project is in **maintenance mode**, which means that
+
+- …there will be no feature development, unless these are contributed
+- …there will be no bug fixes, unless they are relevant to the safety of users, or contributed
+- …issues will be responded to with waiting times of up to a month
+
+The project is open to contributions of all kinds, as well as new maintainers.
 
 ### REQUIREMENTS
 
 If it is not in your `PATH`, you can help GitPython find it by setting
 the `GIT_PYTHON_GIT_EXECUTABLE=<path/to/git>` environment variable.
 
-* Git (1.7.x or newer)
-* Python >= 3.4
+- Git (1.7.x or newer)
+- Python >= 3.7
 
 The dependencies are listed in `./requirements.txt` and `./test-requirements.txt`.
 The installer takes care of installing them for you.
 
 ### RUNNING TESTS
 
-*Important*: Right after cloning this repository, please be sure to have executed
+_Important_: Right after cloning this repository, please be sure to have executed
 the `./init-tests-after-clone.sh` script in the repository root. Otherwise
 you will encounter test failures.
 
-On *Windows*, make sure you have `git-daemon` in your PATH. For MINGW-git, the `git-daemon.exe`
+On _Windows_, make sure you have `git-daemon` in your PATH. For MINGW-git, the `git-daemon.exe`
 exists in `Git\mingw64\libexec\git-core\`; CYGWIN has no daemon, but should get along fine
 with MINGW's.
 
-The easiest way to run tests is by using [tox](https://pypi.python.org/pypi/tox),
-a wrapper around virtualenv. It will take care of setting up environments with the proper
-dependencies installed and execute test commands. To install it simply:
-
-    pip install tox
-
-Then run:
-
-    tox
-
-For more fine-grained control, you can use `unittest`.
+Ensure testing libraries are installed.
+In the root directory, run: `pip install -r test-requirements.txt`
+
+To lint, run: `flake8`
+
+To typecheck, run: `mypy -p git`
+
+To test, run: `pytest`
+
+Configuration for flake8 is in the ./.flake8 file.
+
+Configurations for mypy, pytest and coverage.py are in ./pyproject.toml.
+
+The same linting and testing will also be performed against different supported python versions
+upon submitting a pull request (or on each push if you have a fork with a "main" branch and actions enabled).
 
 ### Contributions
 
 
 ### INFRASTRUCTURE
 
-* [User Documentation](http://gitpython.readthedocs.org)
-* [Questions and Answers](http://stackexchange.com/filters/167317/gitpython)
-* Please post on stackoverflow and use the `gitpython` tag
-* [Issue Tracker](https://github.com/gitpython-developers/GitPython/issues)
-* Post reproducible bugs and feature requests as a new issue.
+- [User Documentation](http://gitpython.readthedocs.org)
+- [Questions and Answers](http://stackexchange.com/filters/167317/gitpython)
+- Please post on stackoverflow and use the `gitpython` tag
+- [Issue Tracker](https://github.com/gitpython-developers/GitPython/issues)
+- Post reproducible bugs and feature requests as a new issue.
 Please be sure to provide the following information if posting bugs:
-  * GitPython version (e.g. `import git; git.__version__`)
-  * Python version (e.g. `python --version`)
-  * The encountered stack-trace, if applicable
-  * Enough information to allow reproducing the issue
+  - GitPython version (e.g. `import git; git.__version__`)
+  - Python version (e.g. `python --version`)
+  - The encountered stack-trace, if applicable
+  - Enough information to allow reproducing the issue
 
 ### How to make a new release
 
-* Update/verify the **version** in the `VERSION` file
-* Update/verify that the `doc/source/changes.rst` changelog file was updated
-* Commit everything
-* Run `git tag -s <version>` to tag the version in Git
-* Run `make release`
-* Close the milestone mentioned in the _changelog_ and create a new one. _Do not reuse milestones by renaming them_.
-* Set the upcoming version in the `VERSION` file, usually by
+- Update/verify the **version** in the `VERSION` file
+- Update/verify that the `doc/source/changes.rst` changelog file was updated
+- Commit everything
+- Run `git tag -s <version>` to tag the version in Git
+- Run `make release`
+- Close the milestone mentioned in the _changelog_ and create a new one. _Do not reuse milestones by renaming them_.
+- Set the upcoming version in the `VERSION` file, usually by
 incrementing the patch level, and possibly by appending `-dev`. Probably you
 want to `git push` once more.
 
 
 ### Projects using GitPython
 
-* [PyDriller](https://github.com/ishepard/pydriller)
-* [Kivy Designer](https://github.com/kivy/kivy-designer)
-* [Prowl](https://github.com/nettitude/Prowl)
-* [Python Taint](https://github.com/python-security/pyt)
-* [Buster](https://github.com/axitkhurana/buster)
-* [git-ftp](https://github.com/ezyang/git-ftp)
-* [Git-Pandas](https://github.com/wdm0006/git-pandas)
-* [PyGitUp](https://github.com/msiemens/PyGitUp)
-* [PyJFuzz](https://github.com/mseclab/PyJFuzz)
-* [Loki](https://github.com/Neo23x0/Loki)
-* [Omniwallet](https://github.com/OmniLayer/omniwallet)
-* [GitViper](https://github.com/BeayemX/GitViper)
-* [Git Gud](https://github.com/bthayer2365/git-gud)
+- [PyDriller](https://github.com/ishepard/pydriller)
+- [Kivy Designer](https://github.com/kivy/kivy-designer)
+- [Prowl](https://github.com/nettitude/Prowl)
+- [Python Taint](https://github.com/python-security/pyt)
+- [Buster](https://github.com/axitkhurana/buster)
+- [git-ftp](https://github.com/ezyang/git-ftp)
+- [Git-Pandas](https://github.com/wdm0006/git-pandas)
+- [PyGitUp](https://github.com/msiemens/PyGitUp)
+- [PyJFuzz](https://github.com/mseclab/PyJFuzz)
+- [Loki](https://github.com/Neo23x0/Loki)
+- [Omniwallet](https://github.com/OmniLayer/omniwallet)
+- [GitViper](https://github.com/BeayemX/GitViper)
+- [Git Gud](https://github.com/bthayer2365/git-gud)
 
 ### LICENSE
 
-New BSD License. See the LICENSE file.
-
-### DEVELOPMENT STATUS
-
-![Python package](https://github.com/gitpython-developers/GitPython/workflows/Python%20package/badge.svg)
-[![Documentation Status](https://readthedocs.org/projects/gitpython/badge/?version=stable)](https://readthedocs.org/projects/gitpython/?badge=stable)
-[![Packaging status](https://repology.org/badge/tiny-repos/python:gitpython.svg)](https://repology.org/metapackage/python:gitpython/versions)
-
-This project is in **maintenance mode**, which means that
-
-* …there will be no feature development, unless these are contributed
-* …there will be no bug fixes, unless they are relevant to the safety of users, or contributed
-* …issues will be responded to with waiting times of up to a month
-
-The project is open to contributions of all kinds, as well as new maintainers.
+New BSD License. See the LICENSE file.
 
 [contributing]: https://github.com/gitpython-developers/GitPython/blob/master/CONTRIBUTING.md
-3.1.14
+3.1.20
+python-git (3.1.20+git20210905.1.5da76e8-1) UNRELEASED; urgency=low
+
+  * New upstream snapshot.
+
+ -- Debian Janitor <janitor@jelmer.uk>  Thu, 09 Sep 2021 00:30:40 -0000
+
 python-git (3.1.14-1) unstable; urgency=medium
 
   * New upstream version 3.1.14
-sphinx<2.0
+sphinx==4.1.1
 sphinx_rtd_theme
+sphinx-autodoc-typehints
 Changelog
 =========
 
+3.1.20
+======
+
+* This is the second typed release with a lot of improvements under the hood.
+* Tracking issue: https://github.com/gitpython-developers/GitPython/issues/1095
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/52?closed=1
+
+
+3.1.19 (YANKED)
+===============
+
+* This is the second typed release with a lot of improvements under the hood.
+* Tracking issue: https://github.com/gitpython-developers/GitPython/issues/1095
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/51?closed=1
+
+3.1.18
+======
+
+* Drop support for python 3.5 to reduce the maintenance burden on typing. Lower patch levels of python 3.5 would break, too.
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/50?closed=1
+
+3.1.17
+======
+
+* Fix issues from 3.1.16 (see https://github.com/gitpython-developers/GitPython/issues/1238)
+* Fix issues from 3.1.15 (see https://github.com/gitpython-developers/GitPython/issues/1223)
+* Add more static typing information
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/49?closed=1
+
+3.1.16 (YANKED)
+===============
+
+* Fix issues from 3.1.15 (see https://github.com/gitpython-developers/GitPython/issues/1223)
+* Add more static typing information
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/48?closed=1
+
+3.1.15 (YANKED)
+===============
+
+* Add deprecation warning for python 3.5
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/47?closed=1
+
 3.1.14
 ======
 
 * git.Commit objects now have a ``replace`` method that will return a
   copy of the commit with modified attributes.
 * Add python 3.9 support
+* Drop python 3.4 support
+
+See the following for details:
+https://github.com/gitpython-developers/gitpython/milestone/46?closed=1
 
 3.1.13
 ======
 # All configuration values have a default; values that are commented out
 # serve to show the default.
 
-import sys, os
+import sys
+import os
 
 # If your extensions are in another directory, add it here. If the directory
 # is relative to the documentation root, use os.path.abspath to make it
 # built documents.
 #
 # The short X.Y version.
-with open(os.path.join(os.path.dirname(__file__),"..", "..", 'VERSION')) as fd:
+with open(os.path.join(os.path.dirname(__file__), "..", "..", 'VERSION')) as fd:
     VERSION = fd.readline().strip()
 version = VERSION
 # The full version, including alpha/beta/rc tags.
 # Grouping the document tree into LaTeX files. List of tuples
 # (source start file, target name, title, author, document class [howto/manual]).
 latex_documents = [
-    ('index', 'GitPython.tex', ur'GitPython Documentation',
-     ur'Michael Trier', 'manual'),
+    ('index', 'GitPython.tex', r'GitPython Documentation',
+     r'Michael Trier', 'manual'),
 ]
 
 # The name of an image file (relative to this directory) to place at the top of
 Requirements
 ============
 
-* `Python`_ >= 3.4
+* `Python`_ >= 3.7
 * `Git`_ 1.7.0 or newer
   It should also work with older versions, but it may be that some operations
   involving remotes will not work as expected.
 * `GitDB`_ - a pure python git database implementation
+* `typing_extensions`_ >= 3.7.3.4 (if python < 3.10)
 
 .. _Python: https://www.python.org
 .. _Git: https://git-scm.com/
 .. _GitDB: https://pypi.python.org/pypi/gitdb
+.. _typing_extensions: https://pypi.org/project/typing-extensions/
 
 Installing GitPython
 ====================
 ---------------------------
 
 GitPython is not suited for long-running processes (like daemons) as it tends to
 leak system resources. It was written in a time when destructors (as implemented
 in the `__del__` method) still ran deterministically.
 
 In case you still want to use it in such a context, you will want to search the
 
 GitPython provides object model access to your git repository. This tutorial is composed of multiple sections, most of which explain a real-life use case.
 
-All code presented here originated from `test_docs.py <https://github.com/gitpython-developers/GitPython/blob/master/test/test_docs.py>`_ to assure correctness. Knowing this should also allow you to more easily run the code for your own testing purposes, all you need is a developer installation of git-python.
+All code presented here originated from `test_docs.py <https://github.com/gitpython-developers/GitPython/blob/main/test/test_docs.py>`_ to assure correctness. Knowing this should also allow you to more easily run the code for your own testing purposes, all you need is a developer installation of git-python.
 
 Meet the Repo type
 ******************
 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
 # flake8: noqa
 #@PydevCodeAnalysisIgnore
+from git.exc import *  # @NoMove @IgnorePep8
 import inspect
 import os
 import sys
-
 import os.path as osp
 
+from typing import Optional
+from git.types import PathLike
 
-__version__ = '3.1.14'
+__version__ = '3.1.20'
 
 
 #{ Initialization
-def _init_externals():
+def _init_externals() -> None:
     """Initialize external projects by putting them into the path"""
-    if __version__ == '3.1.14' and 'PYOXIDIZER' not in os.environ:
+    if __version__ == '3.1.20' and 'PYOXIDIZER' not in os.environ:
         sys.path.insert(1, osp.join(osp.dirname(__file__), 'ext', 'gitdb'))
 
     try:
 
 #} END initialization
 
+
 #################
 _init_externals()
 #################
 
 #{ Imports
 
-from git.exc import *  # @NoMove @IgnorePep8
 try:
     from git.config import GitConfigParser  # @NoMove @IgnorePep8
     from git.objects import *  # @NoMove @IgnorePep8
 #{ Initialize git executable path
 GIT_OK = None
 
-def refresh(path=None):
+
+def refresh(path: Optional[PathLike] = None) -> None:
     """Convenience method for setting the git executable path."""
     global GIT_OK
     GIT_OK = False
     GIT_OK = True
 #} END initialize git executable path
 
+
 #################
 try:
     refresh()
     PIPE
 )
 import subprocess
-import sys
 import threading
-from collections import OrderedDict
 from textwrap import dedent
 
 from git.compat import (
     is_win,
 )
 from git.exc import CommandError
-from git.util import is_cygwin_git, cygpath, expand_path
+from git.util import is_cygwin_git, cygpath, expand_path, remove_password_if_present
 
 from .exc import (
     GitCommandError,
     stream_copy,
 )
 
+# typing ---------------------------------------------------------------------------
+
+from typing import (Any, AnyStr, BinaryIO, Callable, Dict, IO, Iterator, List, Mapping,
+                    Sequence, TYPE_CHECKING, TextIO, Tuple, Union, cast, overload)
+
+from git.types import PathLike, Literal, TBD
+
+if TYPE_CHECKING:
+    from git.repo.base import Repo
+    from git.diff import DiffIndex
+
+
+# ---------------------------------------------------------------------------------
+
 execute_kwargs = {'istream', 'with_extended_output',
                   'with_exceptions', 'as_process', 'stdout_as_string',
                   'output_stream', 'with_stdout', 'kill_after_timeout',
 # Documentation
 ## @{
 
-def handle_process_output(process, stdout_handler, stderr_handler,
-                          finalizer=None, decode_streams=True):
+def handle_process_output(process: Union[subprocess.Popen, 'Git.AutoInterrupt'],
+                          stdout_handler: Union[None,
+                                                Callable[[AnyStr], None],
+                                                Callable[[List[AnyStr]], None],
+                                                Callable[[bytes, 'Repo', 'DiffIndex'], None]],
+                          stderr_handler: Union[None,
+                                                Callable[[AnyStr], None],
+                                                Callable[[List[AnyStr]], None]],
+                          finalizer: Union[None,
+                                           Callable[[Union[subprocess.Popen, 'Git.AutoInterrupt']], None]] = None,
+                          decode_streams: bool = True) -> None:
     """Registers for notifications to learn that process output is ready to read, and dispatches lines to
     the respective line handlers.
     This function returns once the finalizer returns
     or if decoding must happen later (i.e. for Diffs).
     """
     # Use 2 "pump" threads and wait for both to finish.
-    def pump_stream(cmdline, name, stream, is_decode, handler):
+    def pump_stream(cmdline: str, name: str, stream: Union[BinaryIO, TextIO], is_decode: bool,
+                    handler: Union[None, Callable[[Union[bytes, str]], None]]) -> None:
         try:
             for line in stream:
                 if handler:
                     if is_decode:
-                        line = line.decode(defenc)
-                    handler(line)
+                        assert isinstance(line, bytes)
+                        line_str = line.decode(defenc)
+                        handler(line_str)
+                    else:
+                        handler(line)
         except Exception as ex:
-            log.error("Pumping %r of cmd(%s) failed due to: %r", name, cmdline, ex)
-            raise CommandError(['<%s-pump>' % name] + cmdline, ex) from ex
+            log.error("Pumping %r of cmd(%s) failed due to: %r", name, remove_password_if_present(cmdline), ex)
+            raise CommandError(['<%s-pump>' % name] + remove_password_if_present(cmdline), ex) from ex
         finally:
             stream.close()
 
     for name, stream, handler in pumps:
         t = threading.Thread(target=pump_stream,
                              args=(cmdline, name, stream, decode_streams, handler))
-        t.setDaemon(True)
+        t.daemon = True
         t.start()
         threads.append(t)
 
     if finalizer:
         return finalizer(process)
-
-
-def dashify(string):
+    else:
+        return None
+
+
+def dashify(string: str) -> str:
     return string.replace('_', '-')
 
 
-def slots_to_dict(self, exclude=()):
+def slots_to_dict(self: object, exclude: Sequence[str] = ()) -> Dict[str, Any]:
     return {s: getattr(self, s) for s in self.__slots__ if s not in exclude}
 
 
-def dict_to_slots_and__excluded_are_none(self, d, excluded=()):
+def dict_to_slots_and__excluded_are_none(self: object, d: Mapping[str, Any], excluded: Sequence[str] = ()) -> None:
     for k, v in d.items():
         setattr(self, k, v)
     for k in excluded:
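The two helpers above give `__slots__`-based classes pickle support, with excluded attributes forced back to `None` on load. A self-contained sketch of the same pattern; the `Handle` class is hypothetical, invented for illustration:

```python
import pickle
from typing import Any, Dict, Mapping, Sequence


def slots_to_dict(self: object, exclude: Sequence[str] = ()) -> Dict[str, Any]:
    # collect every slot value except the excluded ones
    return {s: getattr(self, s) for s in self.__slots__ if s not in exclude}


def dict_to_slots_and__excluded_are_none(self: object, d: Mapping[str, Any],
                                         excluded: Sequence[str] = ()) -> None:
    for k, v in d.items():
        setattr(self, k, v)
    for k in excluded:
        setattr(self, k, None)


class Handle:
    """Hypothetical slotted class whose file descriptor cannot survive pickling."""
    __slots__ = ('name', 'fd')

    def __init__(self, name: str, fd: int) -> None:
        self.name = name
        self.fd = fd

    def __getstate__(self) -> Dict[str, Any]:
        return slots_to_dict(self, exclude=('fd',))

    def __setstate__(self, d: Dict[str, Any]) -> None:
        dict_to_slots_and__excluded_are_none(self, d, excluded=('fd',))


restored = pickle.loads(pickle.dumps(Handle("log", fd=3)))
# restored.name is preserved; restored.fd comes back as None
```

Excluding a slot and reviving it as `None` is what lets Git instances drop their cached command handles across pickling.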
 
 ## CREATE_NEW_PROCESS_GROUP is needed to allow killing it afterwards,
 # see https://docs.python.org/3/library/subprocess.html#subprocess.Popen.send_signal
-PROC_CREATIONFLAGS = (CREATE_NO_WINDOW | subprocess.CREATE_NEW_PROCESS_GROUP
-                      if is_win else 0)
+PROC_CREATIONFLAGS = (CREATE_NO_WINDOW | subprocess.CREATE_NEW_PROCESS_GROUP  # type: ignore[attr-defined]
+                      if is_win else 0)  # mypy error if not windows
 
 
 class Git(LazyMixin):
 
     _excluded_ = ('cat_file_all', 'cat_file_header', '_version_info')
 
-    def __getstate__(self):
+    def __getstate__(self) -> Dict[str, Any]:
         return slots_to_dict(self, exclude=self._excluded_)
 
-    def __setstate__(self, d):
+    def __setstate__(self, d: Dict[str, Any]) -> None:
         dict_to_slots_and__excluded_are_none(self, d, excluded=self._excluded_)
 
     # CONFIGURATION
     # the top level __init__
 
     @classmethod
-    def refresh(cls, path=None):
+    def refresh(cls, path: Union[None, PathLike] = None) -> bool:
         """This gets called by the refresh function (see the top level
         __init__).
         """
         # - a GitCommandNotFound error is spawned by ourselves
         # - a PermissionError is spawned if the git executable provided
         #   cannot be executed for whatever reason
 
         has_git = False
         try:
             cls().version()
         return has_git
 
     @classmethod
-    def is_cygwin(cls):
+    def is_cygwin(cls) -> bool:
         return is_cygwin_git(cls.GIT_PYTHON_GIT_EXECUTABLE)
 
+    @overload
     @classmethod
-    def polish_url(cls, url, is_cygwin=None):
+    def polish_url(cls, url: str, is_cygwin: Literal[False] = ...) -> str:
+        ...
+
+    @overload
+    @classmethod
+    def polish_url(cls, url: str, is_cygwin: Union[None, bool] = None) -> str:
+        ...
+
+    @classmethod
+    def polish_url(cls, url: str, is_cygwin: Union[None, bool] = None) -> PathLike:
         if is_cygwin is None:
             is_cygwin = cls.is_cygwin()
 
             if url.startswith('~'):
                 url = os.path.expanduser(url)
             url = url.replace("\\\\", "\\").replace("\\", "/")
-
         return url
 
     class AutoInterrupt(object):
 
         __slots__ = ("proc", "args")
 
-        def __init__(self, proc, args):
+        def __init__(self, proc: Union[None, subprocess.Popen], args: Any) -> None:
             self.proc = proc
             self.args = args
 
-        def __del__(self):
+        def __del__(self) -> None:
             if self.proc is None:
                 return
 
             # did the process finish already so we have a return code ?
             try:
                 if proc.poll() is not None:
-                    return
+                    return None
             except OSError as ex:
                 log.info("Ignored error after process had died: %r", ex)
 
             # can be that nothing really exists anymore ...
             if os is None or getattr(os, 'kill', None) is None:
-                return
+                return None
 
             # try to kill it
             try:
                 call(("TASKKILL /F /T /PID %s 2>nul 1>nul" % str(proc.pid)), shell=True)
             # END exception handling
 
-        def __getattr__(self, attr):
+        def __getattr__(self, attr: str) -> Any:
             return getattr(self.proc, attr)
 
-        def wait(self, stderr=b''):  # TODO: Bad choice to mimic `proc.wait()` but with different args.
+        # TODO: Bad choice to mimic `proc.wait()` but with different args.
+        def wait(self, stderr: Union[None, str, bytes] = b'') -> int:
             """Wait for the process and return its status code.
 
             :param stderr: Previously read value of stderr, in case stderr is already closed.
             :warn: may deadlock if output or error pipes are used and not handled separately.
             :raise GitCommandError: if the return status is not 0"""
             if stderr is None:
-                stderr = b''
-            stderr = force_bytes(data=stderr, encoding='utf-8')
-
-            status = self.proc.wait()
-
-            def read_all_from_possibly_closed_stream(stream):
-                try:
-                    return stderr + force_bytes(stream.read())
-                except ValueError:
-                    return stderr or b''
-
-            if status != 0:
-                errstr = read_all_from_possibly_closed_stream(self.proc.stderr)
-                log.debug('AutoInterrupt wait stderr: %r' % (errstr,))
-                raise GitCommandError(self.args, status, errstr)
+                stderr = b''
+            stderr_b = force_bytes(data=stderr, encoding='utf-8')
+
+            status = 0
+            if self.proc is not None:
+                status = self.proc.wait()
+
+                def read_all_from_possibly_closed_stream(stream: Union[IO[bytes], None]) -> bytes:
+                    if stream:
+                        try:
+                            return stderr_b + force_bytes(stream.read())
+                        except ValueError:
+                            return stderr_b or b''
+                    else:
+                        return stderr_b or b''
+
+                if status != 0:
+                    errstr = read_all_from_possibly_closed_stream(self.proc.stderr)
+                    log.debug('AutoInterrupt wait stderr: %r' % (errstr,))
+                    raise GitCommandError(remove_password_if_present(self.args), status, errstr)
             # END status handling
             return status
 
         # END auto interrupt
 
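`wait()` above guards against reading from a stderr pipe that may already be closed. The defensive read in isolation; this is an illustrative extraction, not the exact upstream helper, and the sample data is made up:

```python
import io
from typing import IO, Optional


def read_all_from_possibly_closed_stream(stream: Optional[IO[bytes]],
                                         already_read: bytes = b'') -> bytes:
    """Append whatever is still readable; a closed stream raises ValueError."""
    if stream is None:
        return already_read
    try:
        return already_read + stream.read()
    except ValueError:  # "I/O operation on closed file"
        return already_read


stderr = io.BytesIO(b"fatal: not a git repository\n")
print(read_all_from_possibly_closed_stream(stderr))      # full message
stderr.close()
print(read_all_from_possibly_closed_stream(stderr, b"partial"))  # falls back to b'partial'
```

The `except ValueError` branch is what keeps error reporting from raising a second exception while the original command failure is being surfaced.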
     class CatFileContentStream(object):
 
         If not all data is read to the end of the object's lifetime, we read the
         rest to assure the underlying stream continues to work"""
 
-        __slots__ = ('_stream', '_nbr', '_size')
-
-        def __init__(self, size, stream):
+        __slots__: Tuple[str, ...] = ('_stream', '_nbr', '_size')
+
+        def __init__(self, size: int, stream: IO[bytes]) -> None:
             self._stream = stream
             self._size = size
             self._nbr = 0  # num bytes read
                 stream.read(1)
             # END handle empty streams
 
-        def read(self, size=-1):
+        def read(self, size: int = -1) -> bytes:
             bytes_left = self._size - self._nbr
             if bytes_left == 0:
                 return b''
             # END finish reading
             return data
 
-        def readline(self, size=-1):
+        def readline(self, size: int = -1) -> bytes:
             if self._nbr == self._size:
                 return b''
 
             return data
 
-        def readlines(self, size=-1):
+        def readlines(self, size: int = -1) -> List[bytes]:
             if self._nbr == self._size:
                 return []
 
             return out
 
         # skipcq: PYL-E0301
-        def __iter__(self):
+        def __iter__(self) -> 'Git.CatFileContentStream':
             return self
-
-        def __next__(self):
-            return self.next()
-
-        def next(self):
+
+        def __next__(self) -> bytes:
+            # note: must call self.next(); `next(self)` would recurse into __next__
+            return self.next()
+
+        def next(self) -> bytes:
             line = self.readline()
             if not line:
                 raise StopIteration
 
             return line
 
-        def __del__(self):
+        def __del__(self) -> None:
             bytes_left = self._size - self._nbr
             if bytes_left:
                 # read and discard - seeking is impossible within a stream
                 self._stream.read(bytes_left + 1)
             # END handle incomplete read
 
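`CatFileContentStream` wraps a larger stream and only exposes `size` bytes of it, which is how a single object's content is carved out of a long-running `git cat-file` pipe. A minimal standalone sketch of the bounded-read idea; the `BoundedStream` class name and sample bytes are hypothetical:

```python
import io
from typing import IO


class BoundedStream:
    """Expose at most `size` bytes of an underlying binary stream."""
    __slots__ = ('_stream', '_size', '_nbr')

    def __init__(self, size: int, stream: IO[bytes]) -> None:
        self._stream = stream
        self._size = size
        self._nbr = 0  # num bytes read so far

    def read(self, size: int = -1) -> bytes:
        bytes_left = self._size - self._nbr
        if bytes_left == 0:
            return b''
        # never read past the logical end, even on read(-1)
        size = min(bytes_left, size) if size > -1 else bytes_left
        data = self._stream.read(size)
        self._nbr += len(data)
        return data


raw = io.BytesIO(b"blob-content-and-trailing-data")
s = BoundedStream(12, raw)
print(s.read())  # b'blob-content'
print(s.read())  # b'' - the trailing data stays in the underlying stream
```

The real class additionally drains unread bytes in `__del__` so the shared pipe stays usable for the next object.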
-    def __init__(self, working_dir=None):
+    def __init__(self, working_dir: Union[None, PathLike] = None):
         """Initialize this instance with:
 
         :param working_dir:
            .git directory in case of bare repositories."""
         super(Git, self).__init__()
         self._working_dir = expand_path(working_dir)
-        self._git_options = ()
-        self._persistent_git_options = []
+        self._git_options: Union[List[str], Tuple[str, ...]] = ()
+        self._persistent_git_options: List[str] = []
 
         # Extra environment variables to pass to git commands
-        self._environment = {}
+        self._environment: Dict[str, str] = {}
 
         # cached command slots
-        self.cat_file_header = None
-        self.cat_file_all = None
-
-    def __getattr__(self, name):
+        self.cat_file_header: Union[None, TBD] = None
+        self.cat_file_all: Union[None, TBD] = None
+
+    def __getattr__(self, name: str) -> Any:
         """A convenience method as it allows to call the command as if it was
         an object.
         :return: Callable object that will execute call _call_process with your arguments."""
             return LazyMixin.__getattr__(self, name)
         return lambda *args, **kwargs: self._call_process(name, *args, **kwargs)
 
546 def set_persistent_git_options(self, **kwargs):
588 def set_persistent_git_options(self, **kwargs: Any) -> None:
547589 """Specify command line options to the git executable
548590 for subsequent subcommand calls
549591
557599 self._persistent_git_options = self.transform_kwargs(
558600 split_single_char_options=True, **kwargs)
559601
560 def _set_cache_(self, attr):
602 def _set_cache_(self, attr: str) -> None:
561603 if attr == '_version_info':
562604 # We only use the first 4 numbers, as everything else could be strings in fact (on windows)
563 version_numbers = self._call_process('version').split(' ')[2]
564 self._version_info = tuple(int(n) for n in version_numbers.split('.')[:4] if n.isdigit())
605 process_version = self._call_process('version')  # intentionally run with default *args and **kwargs
606 version_numbers = process_version.split(' ')[2]
607
608 self._version_info = cast(Tuple[int, int, int, int],
609 tuple(int(n) for n in version_numbers.split('.')[:4] if n.isdigit())
610 )
565611 else:
566612 super(Git, self)._set_cache_(attr)
567613 # END handle version info
568614
569 @property
570 def working_dir(self):
615 @property
616 def working_dir(self) -> Union[None, PathLike]:
571617 """:return: Git directory we are working on"""
572618 return self._working_dir
573619
574 @property
575 def version_info(self):
620 @property
621 def version_info(self) -> Tuple[int, int, int, int]:
576622 """
577623 :return: tuple(int, int, int, int) tuple with integers representing the major, minor
578624 and additional version numbers as parsed from git version.
579625 This value is generated on demand and is cached"""
580626 return self._version_info
581627
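The cached `_version_info` above is derived from `git version` output. A minimal standalone sketch of that parsing logic (the function name is illustrative, not part of GitPython):

```python
from typing import Tuple


def parse_git_version(version_line: str) -> Tuple[int, ...]:
    """Parse the numeric components out of `git version` output.

    Mirrors the _set_cache_ logic above: take the third whitespace-separated
    token and keep at most the first four dot-separated numeric fields,
    skipping non-numeric fields such as "windows" in "2.31.1.windows.1".
    """
    version_numbers = version_line.split(' ')[2]
    return tuple(int(n) for n in version_numbers.split('.')[:4] if n.isdigit())
```

This is why `version_info` can safely be typed `Tuple[int, int, int, int]` only via a `cast`: the parse may yield fewer than four fields.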
582 def execute(self, command,
583 istream=None,
584 with_extended_output=False,
585 with_exceptions=True,
586 as_process=False,
587 output_stream=None,
588 stdout_as_string=True,
589 kill_after_timeout=None,
590 with_stdout=True,
591 universal_newlines=False,
592 shell=None,
593 env=None,
594 max_chunk_size=io.DEFAULT_BUFFER_SIZE,
595 **subprocess_kwargs
596 ):
628 @overload
629 def execute(self,
630 command: Union[str, Sequence[Any]],
631 *,
632 as_process: Literal[True]
633 ) -> 'AutoInterrupt':
634 ...
635
636 @overload
637 def execute(self,
638 command: Union[str, Sequence[Any]],
639 *,
640 as_process: Literal[False] = False,
641 stdout_as_string: Literal[True]
642 ) -> Union[str, Tuple[int, str, str]]:
643 ...
644
645 @overload
646 def execute(self,
647 command: Union[str, Sequence[Any]],
648 *,
649 as_process: Literal[False] = False,
650 stdout_as_string: Literal[False] = False
651 ) -> Union[bytes, Tuple[int, bytes, str]]:
652 ...
653
654 @overload
655 def execute(self,
656 command: Union[str, Sequence[Any]],
657 *,
658 with_extended_output: Literal[False],
659 as_process: Literal[False],
660 stdout_as_string: Literal[True]
661 ) -> str:
662 ...
663
664 @overload
665 def execute(self,
666 command: Union[str, Sequence[Any]],
667 *,
668 with_extended_output: Literal[False],
669 as_process: Literal[False],
670 stdout_as_string: Literal[False]
671 ) -> bytes:
672 ...
673
674 def execute(self,
675 command: Union[str, Sequence[Any]],
676 istream: Union[None, BinaryIO] = None,
677 with_extended_output: bool = False,
678 with_exceptions: bool = True,
679 as_process: bool = False,
680 output_stream: Union[None, BinaryIO] = None,
681 stdout_as_string: bool = True,
682 kill_after_timeout: Union[None, int] = None,
683 with_stdout: bool = True,
684 universal_newlines: bool = False,
685 shell: Union[None, bool] = None,
686 env: Union[None, Mapping[str, str]] = None,
687 max_chunk_size: int = io.DEFAULT_BUFFER_SIZE,
688 **subprocess_kwargs: Any
689 ) -> Union[str, bytes, Tuple[int, Union[str, bytes], str], AutoInterrupt]:
597690 """Handles executing the command on the shell and consumes and returns
598691 the returned information (stdout)
599692
637730
638731 :param env:
639732 A dictionary of environment variables to be passed to `subprocess.Popen`.
640
733
641734 :param max_chunk_size:
642735 Maximum number of bytes in one chunk of data passed to the output_stream in
643736 one invocation of write() method. If the given number is not positive then
681774 :note:
682775 If you add additional keyword arguments to the signature of this method,
683776 you must update the execute_kwargs tuple housed in this module."""
777 # Remove password for the command if present
778 redacted_command = remove_password_if_present(command)
684779 if self.GIT_PYTHON_TRACE and (self.GIT_PYTHON_TRACE != 'full' or as_process):
685 log.info(' '.join(command))
780 log.info(' '.join(redacted_command))
686781
687782 # Allow the user to have the command executed in their working dir.
688 cwd = self._working_dir or os.getcwd()
783 try:
784 cwd = self._working_dir or os.getcwd() # type: Union[None, str]
785 if not os.access(str(cwd), os.X_OK):
786 cwd = None
787 except FileNotFoundError:
788 cwd = None
689789
690790 # Start the process
691791 inline_env = env
703803 if is_win:
704804 cmd_not_found_exception = OSError
705805 if kill_after_timeout:
706 raise GitCommandError(command, '"kill_after_timeout" feature is not supported on Windows.')
806 raise GitCommandError(redacted_command, '"kill_after_timeout" feature is not supported on Windows.')
707807 else:
708 if sys.version_info[0] > 2:
709 cmd_not_found_exception = FileNotFoundError # NOQA # exists, flake8 unknown @UndefinedVariable
710 else:
711 cmd_not_found_exception = OSError
808 cmd_not_found_exception = FileNotFoundError # NOQA # exists, flake8 unknown @UndefinedVariable
712809 # end handle
713810
714811 stdout_sink = (PIPE
718815 if istream:
719816 istream_ok = "<valid stream>"
720817 log.debug("Popen(%s, cwd=%s, universal_newlines=%s, shell=%s, istream=%s)",
721 command, cwd, universal_newlines, shell, istream_ok)
818 redacted_command, cwd, universal_newlines, shell, istream_ok)
722819 try:
723820 proc = Popen(command,
724821 env=env,
733830 creationflags=PROC_CREATIONFLAGS,
734831 **subprocess_kwargs
735832 )
833
736834 except cmd_not_found_exception as err:
737 raise GitCommandNotFound(command, err) from err
835 raise GitCommandNotFound(redacted_command, err) from err
836 else:
837 # replace with a typeguard for Popen[bytes]?
838 proc.stdout = cast(BinaryIO, proc.stdout)
839 proc.stderr = cast(BinaryIO, proc.stderr)
738840
739841 if as_process:
740842 return self.AutoInterrupt(proc, command)
741843
742 def _kill_process(pid):
844 def _kill_process(pid: int) -> None:
743845 """ Callback method to kill a process. """
744846 p = Popen(['ps', '--ppid', str(pid)], stdout=PIPE,
745847 creationflags=PROC_CREATIONFLAGS)
746848 child_pids = []
747 for line in p.stdout:
748 if len(line.split()) > 0:
749 local_pid = (line.split())[0]
750 if local_pid.isdigit():
751 child_pids.append(int(local_pid))
849 if p.stdout is not None:
850 for line in p.stdout:
851 if len(line.split()) > 0:
852 local_pid = (line.split())[0]
853 if local_pid.isdigit():
854 child_pids.append(int(local_pid))
752855 try:
753856 # Windows does not have SIGKILL, so use SIGTERM instead
754857 sig = getattr(signal, 'SIGKILL', signal.SIGTERM)
772875
773876 # Wait for the process to return
774877 status = 0
775 stdout_value = b''
776 stderr_value = b''
878 stdout_value: Union[str, bytes] = b''
879 stderr_value: Union[str, bytes] = b''
777880 newline = "\n" if universal_newlines else b"\n"
778881 try:
779882 if output_stream is None:
782885 stdout_value, stderr_value = proc.communicate()
783886 if kill_after_timeout:
784887 watchdog.cancel()
785 if kill_check.isSet():
888 if kill_check.is_set():
786889 stderr_value = ('Timeout: the command "%s" did not complete in %d '
787 'secs.' % (" ".join(command), kill_after_timeout))
890 'secs.' % (" ".join(redacted_command), kill_after_timeout))
788891 if not universal_newlines:
789892 stderr_value = stderr_value.encode(defenc)
790893 # strip trailing "\n"
791 if stdout_value.endswith(newline):
894 if stdout_value.endswith(newline): # type: ignore
792895 stdout_value = stdout_value[:-1]
793 if stderr_value.endswith(newline):
896 if stderr_value.endswith(newline): # type: ignore
794897 stderr_value = stderr_value[:-1]
898
795899 status = proc.returncode
796900 else:
797901 max_chunk_size = max_chunk_size if max_chunk_size and max_chunk_size > 0 else io.DEFAULT_BUFFER_SIZE
799903 stdout_value = proc.stdout.read()
800904 stderr_value = proc.stderr.read()
801905 # strip trailing "\n"
802 if stderr_value.endswith(newline):
906 if stderr_value.endswith(newline): # type: ignore
803907 stderr_value = stderr_value[:-1]
804908 status = proc.wait()
805909 # END stdout handling
808912 proc.stderr.close()
809913
810914 if self.GIT_PYTHON_TRACE == 'full':
811 cmdstr = " ".join(command)
812
813 def as_text(stdout_value):
915 cmdstr = " ".join(redacted_command)
916
917 def as_text(stdout_value: Union[bytes, str]) -> str:
814918 return not output_stream and safe_decode(stdout_value) or '<OUTPUT_STREAM>'
815919 # end
816920
824928 # END handle debug printing
825929
826930 if with_exceptions and status != 0:
827 raise GitCommandError(command, status, stderr_value, stdout_value)
931 raise GitCommandError(redacted_command, status, stderr_value, stdout_value)
828932
829933 if isinstance(stdout_value, bytes) and stdout_as_string: # could also be output_stream
830934 stdout_value = safe_decode(stdout_value)
835939 else:
836940 return stdout_value
837941
838 def environment(self):
942 def environment(self) -> Dict[str, str]:
839943 return self._environment
840944
841 def update_environment(self, **kwargs):
945 def update_environment(self, **kwargs: Any) -> Dict[str, Union[str, None]]:
842946 """
843947 Set environment variables for future git invocations. Return all changed
844948 values in a format that can be passed back into this function to revert
865969 return old_env
866970
867971 @contextmanager
868 def custom_environment(self, **kwargs):
972 def custom_environment(self, **kwargs: Any) -> Iterator[None]:
869973 """
870974 A context manager around the above ``update_environment`` method to restore the
871975 environment back to its previous state after operation.
883987 finally:
884988 self.update_environment(**old_env)
885989
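The `custom_environment` context manager above saves old values via `update_environment` and restores them in a `finally` block. A self-contained sketch of the same save-and-restore pattern (names are illustrative, and unset variables are tracked as `None` so they can be removed again):

```python
import os
from contextlib import contextmanager
from typing import Any, Dict, Iterator, Optional


@contextmanager
def temporary_env(**kwargs: Any) -> Iterator[None]:
    """Set environment variables for the duration of the block, then
    restore the previous values (or delete keys that did not exist)."""
    old_env: Dict[str, Optional[str]] = {}
    for key, value in kwargs.items():
        old_env[key] = os.environ.get(key)  # None means "was unset"
        os.environ[key] = str(value)
    try:
        yield
    finally:
        for key, old_value in old_env.items():
            if old_value is None:
                del os.environ[key]
            else:
                os.environ[key] = old_value
```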
886 def transform_kwarg(self, name, value, split_single_char_options):
990 def transform_kwarg(self, name: str, value: Any, split_single_char_options: bool) -> List[str]:
887991 if len(name) == 1:
888992 if value is True:
889993 return ["-%s" % name]
8991003 return ["--%s=%s" % (dashify(name), value)]
9001004 return []
9011005
902 def transform_kwargs(self, split_single_char_options=True, **kwargs):
1006 def transform_kwargs(self, split_single_char_options: bool = True, **kwargs: Any) -> List[str]:
9031007 """Transforms Python style kwargs into git command line options."""
9041008 args = []
905 kwargs = OrderedDict(sorted(kwargs.items(), key=lambda x: x[0]))
9061009 for k, v in kwargs.items():
9071010 if isinstance(v, (list, tuple)):
9081011 for value in v:
9121015 return args
9131016
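`transform_kwargs` renders each Python kwarg as git command-line tokens via `transform_kwarg`. A simplified standalone sketch of those rules (with a local `dashify`, standing in for GitPython's helper):

```python
from typing import Any, List


def dashify(name: str) -> str:
    """git option names use dashes where Python identifiers use underscores."""
    return name.replace('_', '-')


def transform_kwarg(name: str, value: Any, split_single_char_options: bool = True) -> List[str]:
    """Render one kwarg as git CLI tokens: single-letter names become short
    options, longer names long options; True means a bare flag; False/None
    suppress the option entirely."""
    if len(name) == 1:
        if value is True:
            return ['-%s' % name]
        elif value not in (False, None):
            if split_single_char_options:
                return ['-%s' % name, '%s' % value]
            return ['-%s%s' % (name, value)]
    else:
        if value is True:
            return ['--%s' % dashify(name)]
        elif value is not False and value is not None:
            return ['--%s=%s' % (dashify(name), value)]
    return []
```

So `git.rev_list('master', max_count=10, header=True)` ends up with `['--max-count=10', '--header']` on the command line.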
9141017 @classmethod
915 def __unpack_args(cls, arg_list):
916 if not isinstance(arg_list, (list, tuple)):
917 return [str(arg_list)]
1018 def __unpack_args(cls, arg_list: Sequence[str]) -> List[str]:
9181019
9191020 outlist = []
920 for arg in arg_list:
921 if isinstance(arg_list, (list, tuple)):
1021 if isinstance(arg_list, (list, tuple)):
1022 for arg in arg_list:
9221023 outlist.extend(cls.__unpack_args(arg))
923 # END recursion
924 else:
925 outlist.append(str(arg))
926 # END for each arg
1024 else:
1025 outlist.append(str(arg_list))
1026
9271027 return outlist
9281028
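The `__unpack_args` rewrite above moves the list/tuple check outside the loop and recurses per element. An equivalent standalone sketch of the flattening behavior:

```python
from typing import Any, List, Sequence, Union


def unpack_args(arg_list: Union[str, int, Sequence[Any]]) -> List[str]:
    """Recursively flatten nested lists/tuples of arguments into a flat
    list of strings, as the private __unpack_args classmethod does.
    Any non-sequence argument is stringified as-is."""
    outlist: List[str] = []
    if isinstance(arg_list, (list, tuple)):
        for arg in arg_list:
            outlist.extend(unpack_args(arg))
    else:
        outlist.append(str(arg_list))
    return outlist
```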
929 def __call__(self, **kwargs):
1029 def __call__(self, **kwargs: Any) -> 'Git':
9301030 """Specify command line options to the git executable
9311031 for a subcommand call
9321032
9421042 split_single_char_options=True, **kwargs)
9431043 return self
9441044
945 def _call_process(self, method, *args, **kwargs):
1045 @overload
1046 def _call_process(self, method: str, *args: None, **kwargs: None
1047 ) -> str:
1048 ... # if no args given, execute called with all defaults
1049
1050 @overload
1051 def _call_process(self, method: str,
1052 istream: int,
1053 as_process: Literal[True],
1054 *args: Any, **kwargs: Any
1055 ) -> 'Git.AutoInterrupt': ...
1056
1057 @overload
1058 def _call_process(self, method: str, *args: Any, **kwargs: Any
1059 ) -> Union[str, bytes, Tuple[int, Union[str, bytes], str], 'Git.AutoInterrupt']:
1060 ...
1061
1062 def _call_process(self, method: str, *args: Any, **kwargs: Any
1063 ) -> Union[str, bytes, Tuple[int, Union[str, bytes], str], 'Git.AutoInterrupt']:
9461064 """Run the given git command with the specified arguments and return
9471065 the result as a String
9481066
9591077 It contains key-values for the following:
9601078 - the :meth:`execute()` kwds, as listed in :var:`execute_kwargs`;
9611079 - "command options" to be converted by :meth:`transform_kwargs()`;
962 - the `'insert_kwargs_after'` key which its value must match one of ``*args``,
963 and any cmd-options will be appended after the matched arg.
1080 - the `'insert_kwargs_after'` key, whose value must match one of ``*args``
1081 and any cmd-options will be appended after the matched arg.
9641082
9651083 Examples::
9661084
9701088
9711089 git rev-list max-count 10 --header master
9721090
973 :return: Same as ``execute``"""
1091 :return: Same as ``execute``
1092 If no args are given, ``execute`` is called with its defaults (esp. as_process=False, stdout_as_string=True)
1093 and returns a str """
9741094 # Handle optional arguments prior to calling transform_kwargs
9751095 # otherwise these'll end up in args, which is bad.
9761096 exec_kwargs = {k: v for k, v in kwargs.items() if k in execute_kwargs}
9791099 insert_after_this_arg = opts_kwargs.pop('insert_kwargs_after', None)
9801100
9811101 # Prepare the argument list
1102
9821103 opt_args = self.transform_kwargs(**opts_kwargs)
9831104 ext_args = self.__unpack_args([a for a in args if a is not None])
9841105
9851106 if insert_after_this_arg is None:
986 args = opt_args + ext_args
1107 args_list = opt_args + ext_args
9871108 else:
9881109 try:
9891110 index = ext_args.index(insert_after_this_arg)
9911112 raise ValueError("Couldn't find argument '%s' in args %s to insert cmd options after"
9921113 % (insert_after_this_arg, str(ext_args))) from err
9931114 # end handle error
994 args = ext_args[:index + 1] + opt_args + ext_args[index + 1:]
1115 args_list = ext_args[:index + 1] + opt_args + ext_args[index + 1:]
9951116 # end handle opts_kwargs
9961117
9971118 call = [self.GIT_PYTHON_GIT_EXECUTABLE]
10051126 self._git_options = ()
10061127
10071128 call.append(dashify(method))
1008 call.extend(args)
1129 call.extend(args_list)
10091130
10101131 return self.execute(call, **exec_kwargs)
10111132
1012 def _parse_object_header(self, header_line):
1133 def _parse_object_header(self, header_line: str) -> Tuple[str, str, int]:
10131134 """
10141135 :param header_line:
10151136 <hex_sha> type_string size_as_int
10311152 raise ValueError("Failed to parse header: %r" % header_line)
10321153 return (tokens[0], tokens[1], int(tokens[2]))
10331154
1034 def _prepare_ref(self, ref):
1155 def _prepare_ref(self, ref: AnyStr) -> bytes:
10351156 # required for command to separate refs on stdin, as bytes
1036 refstr = ref
10371157 if isinstance(ref, bytes):
10381158 # Assume 40 bytes hexsha - bin-to-ascii for some reason returns bytes, not text
1039 refstr = ref.decode('ascii')
1159 refstr: str = ref.decode('ascii')
10401160 elif not isinstance(ref, str):
10411161 refstr = str(ref) # could be ref-object
1162 else:
1163 refstr = ref
10421164
10431165 if not refstr.endswith("\n"):
10441166 refstr += "\n"
10451167 return refstr.encode(defenc)
10461168
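`_prepare_ref` above normalizes a ref given as bytes, str, or an arbitrary ref object into newline-terminated bytes for git's stdin. A simplified sketch, using UTF-8 where GitPython uses the filesystem encoding `defenc`:

```python
from typing import Any


def prepare_ref(ref: Any, encoding: str = 'utf-8') -> bytes:
    """Coerce a ref into newline-terminated bytes, as required to feed
    refs to a long-running `git cat-file --batch` process over stdin."""
    if isinstance(ref, bytes):
        refstr = ref.decode('ascii')  # assume an ASCII hexsha
    elif not isinstance(ref, str):
        refstr = str(ref)  # could be a ref object
    else:
        refstr = ref
    if not refstr.endswith('\n'):
        refstr += '\n'
    return refstr.encode(encoding)
```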
1047 def _get_persistent_cmd(self, attr_name, cmd_name, *args, **kwargs):
1169 def _get_persistent_cmd(self, attr_name: str, cmd_name: str, *args: Any, **kwargs: Any
1170 ) -> 'Git.AutoInterrupt':
10481171 cur_val = getattr(self, attr_name)
10491172 if cur_val is not None:
10501173 return cur_val
10541177
10551178 cmd = self._call_process(cmd_name, *args, **options)
10561179 setattr(self, attr_name, cmd)
1180 cmd = cast('Git.AutoInterrupt', cmd)
10571181 return cmd
10581182
1059 def __get_object_header(self, cmd, ref):
1060 cmd.stdin.write(self._prepare_ref(ref))
1061 cmd.stdin.flush()
1062 return self._parse_object_header(cmd.stdout.readline())
1063
1064 def get_object_header(self, ref):
1183 def __get_object_header(self, cmd: 'Git.AutoInterrupt', ref: AnyStr) -> Tuple[str, str, int]:
1184 if cmd.stdin and cmd.stdout:
1185 cmd.stdin.write(self._prepare_ref(ref))
1186 cmd.stdin.flush()
1187 return self._parse_object_header(cmd.stdout.readline())
1188 else:
1189 raise ValueError("cmd stdin was empty")
1190
1191 def get_object_header(self, ref: str) -> Tuple[str, str, int]:
10651192 """ Use this method to quickly examine the type and size of the object behind
10661193 the given ref.
10671194
10721199 cmd = self._get_persistent_cmd("cat_file_header", "cat_file", batch_check=True)
10731200 return self.__get_object_header(cmd, ref)
10741201
1075 def get_object_data(self, ref):
1202 def get_object_data(self, ref: str) -> Tuple[str, str, int, bytes]:
10761203 """ As get_object_header, but returns object data as well
10771204 :return: (hexsha, type_string, size_as_int, data_string)
10781205 :note: not threadsafe"""
10811208 del(stream)
10821209 return (hexsha, typename, size, data)
10831210
1084 def stream_object_data(self, ref):
1211 def stream_object_data(self, ref: str) -> Tuple[str, str, int, 'Git.CatFileContentStream']:
10851212 """ As get_object_header, but returns the data as a stream
10861213
10871214 :return: (hexsha, type_string, size_as_int, stream)
10881215 :note: This method is not threadsafe, you need one independent Command instance per thread to be safe !"""
10891216 cmd = self._get_persistent_cmd("cat_file_all", "cat_file", batch=True)
10901217 hexsha, typename, size = self.__get_object_header(cmd, ref)
1091 return (hexsha, typename, size, self.CatFileContentStream(size, cmd.stdout))
1092
1093 def clear_cache(self):
1218 cmd_stdout = cmd.stdout if cmd.stdout is not None else io.BytesIO()
1219 return (hexsha, typename, size, self.CatFileContentStream(size, cmd_stdout))
1220
1221 def clear_cache(self) -> 'Git':
10941222 """Clear all kinds of internal caches to release resources.
10951223
10961224 Currently persistent commands will be interrupted.
1010 import os
1111 import sys
1212
13
1413 from gitdb.utils.encoding import (
1514 force_bytes, # @UnusedImport
1615 force_text # @UnusedImport
1716 )
1817
18 # typing --------------------------------------------------------------------
1919
20 is_win = (os.name == 'nt')
20 from typing import (
21 Any,
22 AnyStr,
23 Dict,
24 IO,
25 Optional,
26 Tuple,
27 Type,
28 Union,
29 overload,
30 )
31 # ---------------------------------------------------------------------------
32
33
34 is_win: bool = (os.name == 'nt')
2135 is_posix = (os.name == 'posix')
2236 is_darwin = (os.name == 'darwin')
2337 defenc = sys.getfilesystemencoding()
2438
2539
26 def safe_decode(s):
40 @overload
41 def safe_decode(s: None) -> None: ...
42
43
44 @overload
45 def safe_decode(s: AnyStr) -> str: ...
46
47
48 def safe_decode(s: Union[AnyStr, None]) -> Optional[str]:
2749 """Safely decodes a binary string to unicode"""
2850 if isinstance(s, str):
2951 return s
3052 elif isinstance(s, bytes):
3153 return s.decode(defenc, 'surrogateescape')
32 elif s is not None:
54 elif s is None:
55 return None
56 else:
3357 raise TypeError('Expected bytes or text, but got %r' % (s,))
3458
3559
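The `@overload` pair added above lets a type checker narrow `safe_decode(None)` to `None` and `safe_decode(b"...")` to `str`, while the runtime behavior is unchanged. The same pattern in isolation (function name is illustrative):

```python
from typing import AnyStr, Optional, Union, overload


@overload
def decode_maybe(s: None) -> None: ...


@overload
def decode_maybe(s: AnyStr) -> str: ...


def decode_maybe(s: Union[AnyStr, None]) -> Optional[str]:
    """Decode bytes to str (surrogateescape), pass str through, map None to None."""
    if isinstance(s, str):
        return s
    elif isinstance(s, bytes):
        return s.decode('utf-8', 'surrogateescape')
    elif s is None:
        return None
    else:
        raise TypeError('Expected bytes or text, but got %r' % (s,))
```

Note how the diff also reorders the branches (`elif s is None: return None` before the `else: raise`), which makes the None-to-None contract of the first overload explicit at runtime.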
36 def safe_encode(s):
37 """Safely decodes a binary string to unicode"""
60 @overload
61 def safe_encode(s: None) -> None: ...
62
63
64 @overload
65 def safe_encode(s: AnyStr) -> bytes: ...
66
67
68 def safe_encode(s: Optional[AnyStr]) -> Optional[bytes]:
69 """Safely encodes a string to bytes"""
3870 if isinstance(s, str):
3971 return s.encode(defenc)
4072 elif isinstance(s, bytes):
4173 return s
42 elif s is not None:
74 elif s is None:
75 return None
76 else:
4377 raise TypeError('Expected bytes or text, but got %r' % (s,))
4478
4579
46 def win_encode(s):
80 @overload
81 def win_encode(s: None) -> None: ...
82
83
84 @overload
85 def win_encode(s: AnyStr) -> bytes: ...
86
87
88 def win_encode(s: Optional[AnyStr]) -> Optional[bytes]:
4789 """Encode unicode strings for process arguments on Windows."""
4890 if isinstance(s, str):
4991 return s.encode(locale.getpreferredencoding(False))
5193 return s
5294 elif s is not None:
5395 raise TypeError('Expected bytes or text, but got %r' % (s,))
54
55
56 def with_metaclass(meta, *bases):
57 """copied from https://github.com/Byron/bcore/blob/master/src/python/butility/future.py#L15"""
58 class metaclass(meta):
59 __call__ = type.__call__
60 __init__ = type.__init__
61
62 def __new__(cls, name, nbases, d):
63 if nbases is None:
64 return type.__new__(cls, name, (), d)
65 return meta(name, bases, d)
66 return metaclass(meta.__name__ + 'Helper', None, {})
96 return None
55 """Module containing module parser implementation able to properly read and write
66 configuration files"""
77
8 import sys
89 import abc
910 from functools import wraps
1011 import inspect
11 from io import IOBase
12 from io import BufferedReader, IOBase
1213 import logging
1314 import os
1415 import re
1516 import fnmatch
16 from collections import OrderedDict
1717
1818 from git.compat import (
1919 defenc,
2020 force_text,
21 with_metaclass,
2221 is_win,
2322 )
23
2424 from git.util import LockFile
2525
2626 import os.path as osp
2727
2828 import configparser as cp
2929
30 # typing-------------------------------------------------------
31
32 from typing import (Any, Callable, Generic, IO, List, Dict, Sequence,
33 TYPE_CHECKING, Tuple, TypeVar, Union, cast, overload)
34
35 from git.types import Lit_config_levels, ConfigLevels_Tup, PathLike, assert_never, _T
36
37 if TYPE_CHECKING:
38 from git.repo.base import Repo
39 from io import BytesIO
40
41 T_ConfigParser = TypeVar('T_ConfigParser', bound='GitConfigParser')
42 T_OMD_value = TypeVar('T_OMD_value', str, bytes, int, float, bool)
43
44 if sys.version_info[:3] < (3, 7, 2):
45 # typing.OrderedDict not added until py 3.7.2
46 from collections import OrderedDict
47 OrderedDict_OMD = OrderedDict
48 else:
49 from typing import OrderedDict
50 OrderedDict_OMD = OrderedDict[str, List[T_OMD_value]] # type: ignore[assignment, misc]
51
52 # -------------------------------------------------------------
3053
3154 __all__ = ('GitConfigParser', 'SectionConstraint')
3255
3659
3760 # invariants
3861 # represents the configuration level of a configuration file
39 CONFIG_LEVELS = ("system", "user", "global", "repository")
62
63
64 CONFIG_LEVELS: ConfigLevels_Tup = ("system", "user", "global", "repository")
65
4066
4167 # Section pattern to detect conditional includes.
4268 # https://git-scm.com/docs/git-config#_conditional_includes
4470
4571
4672 class MetaParserBuilder(abc.ABCMeta):
47
4873 """Utility class wrapping base-class methods into decorators that assure read-only properties"""
49 def __new__(cls, name, bases, clsdict):
74 def __new__(cls, name: str, bases: Tuple, clsdict: Dict[str, Any]) -> 'MetaParserBuilder':
5075 """
5176 Equip all base-class methods with a needs_values decorator, and all non-const methods
5277 with a set_dirty_and_flush_changes decorator in addition to that."""
7297 return new_type
7398
7499
75 def needs_values(func):
100 def needs_values(func: Callable[..., _T]) -> Callable[..., _T]:
76101 """Returns method assuring we read values (on demand) before we try to access them"""
77102
78103 @wraps(func)
79 def assure_data_present(self, *args, **kwargs):
104 def assure_data_present(self: 'GitConfigParser', *args: Any, **kwargs: Any) -> _T:
80105 self.read()
81106 return func(self, *args, **kwargs)
82107 # END wrapper method
83108 return assure_data_present
84109
85110
86 def set_dirty_and_flush_changes(non_const_func):
111 def set_dirty_and_flush_changes(non_const_func: Callable[..., _T]) -> Callable[..., _T]:
87112 """Return method that checks whether given non constant function may be called.
88113 If so, the instance will be set dirty.
89114 Additionally, we flush the changes right to disk"""
90115
91 def flush_changes(self, *args, **kwargs):
116 def flush_changes(self: 'GitConfigParser', *args: Any, **kwargs: Any) -> _T:
92117 rval = non_const_func(self, *args, **kwargs)
93118 self._dirty = True
94119 self.write()
98123 return flush_changes
99124
100125
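The two decorators above wrap GitConfigParser methods so reads happen lazily and mutating calls flush to disk. A minimal standalone sketch of the `needs_values` pattern, with a toy stand-in class for the parser:

```python
from functools import wraps
from typing import Any, Callable, Dict, Optional, TypeVar

T = TypeVar('T')


def needs_values(func: Callable[..., T]) -> Callable[..., T]:
    """Ensure the instance has read its data before the wrapped method runs."""
    @wraps(func)
    def assure_data_present(self: Any, *args: Any, **kwargs: Any) -> T:
        self.read()  # the parser reads lazily, on first access
        return func(self, *args, **kwargs)
    return assure_data_present


class LazyConfig:
    """Toy stand-in for GitConfigParser: read() populates the data once."""

    def __init__(self) -> None:
        self.data: Optional[Dict[str, str]] = None

    def read(self) -> None:
        if self.data is None:  # idempotent, like GitConfigParser.read()
            self.data = {'core.bare': 'false'}

    @needs_values
    def get_value(self, key: str) -> str:
        assert self.data is not None
        return self.data[key]
```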
101 class SectionConstraint(object):
126 class SectionConstraint(Generic[T_ConfigParser]):
102127
103128 """Constrains a ConfigParser to only option commands which are constrained to
104129 always use the section we have been initialized with.
111136 _valid_attrs_ = ("get_value", "set_value", "get", "set", "getint", "getfloat", "getboolean", "has_option",
112137 "remove_section", "remove_option", "options")
113138
114 def __init__(self, config, section):
139 def __init__(self, config: T_ConfigParser, section: str) -> None:
115140 self._config = config
116141 self._section_name = section
117142
118 def __del__(self):
143 def __del__(self) -> None:
119144 # Yes, for some reason, we have to call it explicitly for it to work in PY3 !
120145 # Apparently __del__ doesn't get called anymore if refcount becomes 0
121146 # Ridiculous ... .
122147 self._config.release()
123148
124 def __getattr__(self, attr):
149 def __getattr__(self, attr: str) -> Any:
125150 if attr in self._valid_attrs_:
126151 return lambda *args, **kwargs: self._call_config(attr, *args, **kwargs)
127152 return super(SectionConstraint, self).__getattribute__(attr)
128153
129 def _call_config(self, method, *args, **kwargs):
154 def _call_config(self, method: str, *args: Any, **kwargs: Any) -> Any:
130155 """Call the configuration at the given method which must take a section name
131156 as first argument"""
132157 return getattr(self._config, method)(self._section_name, *args, **kwargs)
133158
134159 @property
135 def config(self):
160 def config(self) -> T_ConfigParser:
136161 """return: Configparser instance we constrain"""
137162 return self._config
138163
139 def release(self):
164 def release(self) -> None:
140165 """Equivalent to GitConfigParser.release(), which is called on our underlying parser instance"""
141166 return self._config.release()
142167
143 def __enter__(self):
168 def __enter__(self) -> 'SectionConstraint[T_ConfigParser]':
144169 self._config.__enter__()
145170 return self
146171
147 def __exit__(self, exception_type, exception_value, traceback):
172 def __exit__(self, exception_type: str, exception_value: str, traceback: str) -> None:
148173 self._config.__exit__(exception_type, exception_value, traceback)
149174
150175
151 class _OMD(OrderedDict):
176 class _OMD(OrderedDict_OMD):
152177 """Ordered multi-dict."""
153178
154 def __setitem__(self, key, value):
179 def __setitem__(self, key: str, value: _T) -> None:
155180 super(_OMD, self).__setitem__(key, [value])
156181
157 def add(self, key, value):
182 def add(self, key: str, value: Any) -> None:
183 if key not in self:
184 super(_OMD, self).__setitem__(key, [value])
185 return None
186 super(_OMD, self).__getitem__(key).append(value)
187
188 def setall(self, key: str, values: List[_T]) -> None:
189 super(_OMD, self).__setitem__(key, values)
190
191 def __getitem__(self, key: str) -> Any:
192 return super(_OMD, self).__getitem__(key)[-1]
193
194 def getlast(self, key: str) -> Any:
195 return super(_OMD, self).__getitem__(key)[-1]
196
197 def setlast(self, key: str, value: Any) -> None:
158198 if key not in self:
159199 super(_OMD, self).__setitem__(key, [value])
160200 return
161201
162 super(_OMD, self).__getitem__(key).append(value)
163
164 def setall(self, key, values):
165 super(_OMD, self).__setitem__(key, values)
166
167 def __getitem__(self, key):
168 return super(_OMD, self).__getitem__(key)[-1]
169
170 def getlast(self, key):
171 return super(_OMD, self).__getitem__(key)[-1]
172
173 def setlast(self, key, value):
174 if key not in self:
175 super(_OMD, self).__setitem__(key, [value])
176 return
177
178202 prior = super(_OMD, self).__getitem__(key)
179203 prior[-1] = value
180204
181 def get(self, key, default=None):
205 def get(self, key: str, default: Union[_T, None] = None) -> Union[_T, None]:
182206 return super(_OMD, self).get(key, [default])[-1]
183207
184 def getall(self, key):
208 def getall(self, key: str) -> List[_T]:
185209 return super(_OMD, self).__getitem__(key)
186210
187 def items(self):
211 def items(self) -> List[Tuple[str, _T]]: # type: ignore[override]
188212 """List of (key, last value for key)."""
189213 return [(k, self[k]) for k in self]
190214
191 def items_all(self):
215 def items_all(self) -> List[Tuple[str, List[_T]]]:
192216 """List of (key, list of values for key)."""
193217 return [(k, self.getall(k)) for k in self]
194218
195219
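`_OMD` keeps every value ever added for a key (as git config allows repeated keys) while behaving like a last-value-wins dict on plain access. A trimmed standalone version showing those semantics:

```python
from collections import OrderedDict
from typing import Any, List


class OMD(OrderedDict):
    """Ordered multi-dict: stores a list per key, exposes the last value."""

    def __setitem__(self, key: str, value: Any) -> None:
        super().__setitem__(key, [value])  # plain assignment replaces all values

    def add(self, key: str, value: Any) -> None:
        if key not in self:
            self[key] = value
        else:
            super().__getitem__(key).append(value)  # append to the multi-value list

    def __getitem__(self, key: str) -> Any:
        return super().__getitem__(key)[-1]  # last value wins

    def getall(self, key: str) -> List[Any]:
        return super().__getitem__(key)
```

This mirrors how a repeated option such as `remote.origin.fetch` is read: `get_value` sees the last entry, `get_values`/`getall` sees them all.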
196 def get_config_path(config_level):
220 def get_config_path(config_level: Lit_config_levels) -> str:
197221
198222 # we do not support an absolute path of the gitconfig on windows ,
199223 # use the global config instead
209233 return osp.normpath(osp.expanduser("~/.gitconfig"))
210234 elif config_level == "repository":
211235 raise ValueError("No repo to get repository configuration from. Use Repo._get_config_path")
212
213 raise ValueError("Invalid configuration level: %r" % config_level)
214
215
216 class GitConfigParser(with_metaclass(MetaParserBuilder, cp.RawConfigParser, object)):
236 else:
237 # Should not reach here. Will raise ValueError if it does. Static typing will warn about missing elifs
238 assert_never(config_level, # type: ignore[unreachable]
239 ValueError(f"Invalid configuration level: {config_level!r}"))
240
241
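The `assert_never` call added above turns the config-level dispatch into an exhaustiveness check: if a new `Lit_config_levels` member is added without a matching branch, mypy flags the unhandled case. A self-contained sketch of the idiom (GitPython's `assert_never` lives in `git.types` and also accepts an exception to raise; this local one is simplified):

```python
from typing import Literal, NoReturn

ConfigLevel = Literal['system', 'user', 'global', 'repository']


def assert_never(value: NoReturn) -> NoReturn:
    """Exhaustiveness guard: mypy errors on any call reachable with a real value."""
    raise ValueError('Unhandled configuration level: %r' % (value,))


def describe_level(level: ConfigLevel) -> str:
    # every Literal member must be consumed by a branch above assert_never
    if level == 'system':
        return '/etc/gitconfig'
    elif level in ('user', 'global'):
        return '~/.gitconfig'
    elif level == 'repository':
        return '.git/config'
    else:
        assert_never(level)
```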
242 class GitConfigParser(cp.RawConfigParser, metaclass=MetaParserBuilder):
217243
218244 """Implements specifics required to read git style configuration files.
219245
251277 # list of RawConfigParser methods able to change the instance
252278 _mutating_methods_ = ("add_section", "remove_section", "remove_option", "set")
253279
254 def __init__(self, file_or_files=None, read_only=True, merge_includes=True, config_level=None, repo=None):
280 def __init__(self, file_or_files: Union[None, PathLike, 'BytesIO', Sequence[Union[PathLike, 'BytesIO']]] = None,
281 read_only: bool = True, merge_includes: bool = True,
282 config_level: Union[Lit_config_levels, None] = None,
283 repo: Union['Repo', None] = None) -> None:
255284 """Initialize a configuration reader to read the given file_or_files and to
256285 possibly allow changes to it by setting read_only False
257286
271300
272301 """
273302 cp.RawConfigParser.__init__(self, dict_type=_OMD)
303 self._dict: Callable[..., _OMD] # type: ignore # mypy/typeshed bug?
304 self._defaults: _OMD
305 self._sections: _OMD # type: ignore # mypy/typeshed bug?
274306
275307 # Used in python 3, needs to stay in sync with sections for underlying implementation to work
276308 if not hasattr(self, '_proxies'):
277309 self._proxies = self._dict()
278310
279311 if file_or_files is not None:
280 self._file_or_files = file_or_files
312 self._file_or_files: Union[PathLike, 'BytesIO', Sequence[Union[PathLike, 'BytesIO']]] = file_or_files
281313 else:
282314 if config_level is None:
283315 if read_only:
284 self._file_or_files = [get_config_path(f) for f in CONFIG_LEVELS if f != 'repository']
316 self._file_or_files = [get_config_path(cast(Lit_config_levels, f))
317 for f in CONFIG_LEVELS
318 if f != 'repository']
285319 else:
286320 raise ValueError("No configuration level or configuration files specified")
287321 else:
292326 self._is_initialized = False
293327 self._merge_includes = merge_includes
294328 self._repo = repo
295 self._lock = None
329 self._lock: Union['LockFile', None] = None
296330 self._acquire_lock()
297331
298 def _acquire_lock(self):
332 def _acquire_lock(self) -> None:
299333 if not self._read_only:
300334 if not self._lock:
301 if isinstance(self._file_or_files, (tuple, list)):
335 if isinstance(self._file_or_files, (str, os.PathLike)):
336 file_or_files = self._file_or_files
337 elif isinstance(self._file_or_files, (tuple, list, Sequence)):
302338 raise ValueError(
303339 "Write-ConfigParsers can operate on a single file only, multiple files have been passed")
304 # END single file check
305
306 file_or_files = self._file_or_files
307 if not isinstance(self._file_or_files, str):
340 else:
308341 file_or_files = self._file_or_files.name
342
309343 # END get filename from handle/stream
310344 # initialize lock base - we want to write
311345 self._lock = self.t_lock(file_or_files)
314348 self._lock._obtain_lock()
315349 # END read-only check
316350
317 def __del__(self):
351 def __del__(self) -> None:
318352 """Write pending changes if required and release locks"""
319353 # NOTE: only consistent in PY2
320354 self.release()
321355
322 def __enter__(self):
356 def __enter__(self) -> 'GitConfigParser':
323357 self._acquire_lock()
324358 return self
325359
326 def __exit__(self, exception_type, exception_value, traceback):
360 def __exit__(self, *args: Any) -> None:
327361 self.release()
328362
329 def release(self):
363 def release(self) -> None:
330364 """Flush changes and release the configuration write lock. This instance must not be used anymore afterwards.
331365 In Python 3, it's required to explicitly release locks and flush changes, as __del__ is not called
332366 deterministically anymore."""
346380 # Usually when shutting down the interpreter, don't know how to fix this
347381 pass
348382 finally:
349 self._lock._release_lock()
350
351 def optionxform(self, optionstr):
383 if self._lock is not None:
384 self._lock._release_lock()
385
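The `__enter__`/`release` pairing above acquires the write lock on entry and flushes/releases it on exit. A minimal sketch of that pattern with hypothetical stand-in classes (not GitPython's actual `LockFile`):

```python
class Lock:
    """Hypothetical stand-in for git.util.LockFile."""
    def __init__(self) -> None:
        self.held = False

    def _obtain_lock(self) -> None:
        self.held = True

    def _release_lock(self) -> None:
        self.held = False


class Writer:
    """Acquire the lock in __enter__, release (and flush) via __exit__."""
    def __init__(self) -> None:
        self._lock = Lock()

    def __enter__(self) -> 'Writer':
        self._lock._obtain_lock()
        return self

    def __exit__(self, *args) -> None:
        self.release()

    def release(self) -> None:
        # pending changes would be flushed here before dropping the lock
        if self._lock is not None:
            self._lock._release_lock()


w = Writer()
with w:
    print(w._lock.held)  # True
print(w._lock.held)      # False
```

The `if self._lock is not None` guard mirrors the fix in `release()` above, where `_lock` is now typed as `Union['LockFile', None]`.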
386 def optionxform(self, optionstr: str) -> str:
352387 """Do not transform options in any way when writing"""
353388 return optionstr
354389
355 def _read(self, fp, fpname):
390 def _read(self, fp: Union[BufferedReader, IO[bytes]], fpname: str) -> None:
356391 """A direct copy of the py2.4 version of the super class's _read method
357392 to assure it uses ordered dicts. Had to change one line to make it work.
358393
368403 is_multi_line = False
369404 e = None # None, or an exception
370405
371 def string_decode(v):
406 def string_decode(v: str) -> str:
372407 if v[-1] == '\\':
373408 v = v[:-1]
374409 # end cut trailing escapes to prevent decode error
393428 # is it a section header?
394429 mo = self.SECTCRE.match(line.strip())
395430 if not is_multi_line and mo:
396 sectname = mo.group('header').strip()
431 sectname: str = mo.group('header').strip()
397432 if sectname in self._sections:
398433 cursect = self._sections[sectname]
399434 elif sectname == cp.DEFAULTSECT:
450485 if e:
451486 raise e
452487
453 def _has_includes(self):
488 def _has_includes(self) -> Union[bool, int]:
454489 return self._merge_includes and len(self._included_paths())
455490
456 def _included_paths(self):
457 """Return all paths that must be included to configuration.
491 def _included_paths(self) -> List[Tuple[str, str]]:
492 """Return a list of all paths that must be included in the configuration,
493 as tuples of (option, value).
458494 """
459495 paths = []
460496
487523 ),
488524 value
489525 )
490
491 if fnmatch.fnmatchcase(self._repo.git_dir, value):
492 paths += self.items(section)
526 if self._repo.git_dir:
527 if fnmatch.fnmatchcase(str(self._repo.git_dir), value):
528 paths += self.items(section)
493529
494530 elif keyword == "onbranch":
495531 try:
503539
504540 return paths
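The `gitdir` keyword handling above matches the repository's git directory against the include condition with a case-sensitive glob. A minimal sketch of that check (paths are illustrative):

```python
import fnmatch

# Conditional includes compare the repo's git_dir against the glob from an
# [includeIf "gitdir:..."] section; fnmatchcase keeps the match case-sensitive.
git_dir = "/home/user/work/project/.git"
print(fnmatch.fnmatchcase(git_dir, "/home/user/work/**"))   # True
print(fnmatch.fnmatchcase(git_dir, "/home/user/OTHER/**"))  # False
```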
505541
506 def read(self):
542 def read(self) -> None: # type: ignore[override]
507543 """Reads the data stored in the files we have been initialized with. It will
508544 ignore files that cannot be read, possibly leaving an empty configuration
509545
510546 :return: Nothing
511547 :raise IOError: if a file cannot be handled"""
512548 if self._is_initialized:
513 return
549 return None
514550 self._is_initialized = True
515551
516 if not isinstance(self._file_or_files, (tuple, list)):
552 files_to_read: List[Union[PathLike, IO]] = [""]
553 if isinstance(self._file_or_files, (str, os.PathLike)):
554 # for str or Path, as str is a type of Sequence
517555 files_to_read = [self._file_or_files]
518 else:
556 elif not isinstance(self._file_or_files, (tuple, list, Sequence)):
557 # could merge with above isinstance once runtime type known
558 files_to_read = [self._file_or_files]
559 else: # for lists or tuples
519560 files_to_read = list(self._file_or_files)
520561 # end assure we have a copy of the paths to handle
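The branching above that turns `_file_or_files` into `files_to_read` can be condensed into a small helper. This is a sketch of the same normalization, not the library code itself:

```python
import os

def normalize(file_or_files):
    # str/PathLike first: a str is itself a Sequence, so it must be
    # special-cased before the tuple/list check
    if isinstance(file_or_files, (str, os.PathLike)):
        return [file_or_files]
    # tuples/lists: copy, so popping while reading leaves the caller's intact
    if isinstance(file_or_files, (tuple, list)):
        return list(file_or_files)
    # anything else (e.g. a BytesIO stream): single entry
    return [file_or_files]

print(normalize("config"))    # ['config']
print(normalize(("a", "b")))  # ['a', 'b']
```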
521562
523564 num_read_include_files = 0
524565 while files_to_read:
525566 file_path = files_to_read.pop(0)
526 fp = file_path
527567 file_ok = False
528568
529 if hasattr(fp, "seek"):
530 self._read(fp, fp.name)
569 if hasattr(file_path, "seek"):
570 # must be a file-object
571 file_path = cast(IO[bytes], file_path) # replace with assert to narrow type, once sure
572 self._read(file_path, file_path.name)
531573 else:
532574 # assume a path if it is not a file-object
575 file_path = cast(PathLike, file_path)
533576 try:
534577 with open(file_path, 'rb') as fp:
535578 file_ok = True
547590 if not file_ok:
548591 continue
549592 # end ignore relative paths if we don't know the configuration file path
593 file_path = cast(PathLike, file_path)
550594 assert osp.isabs(file_path), "Need absolute paths to be sure our cycle checks will work"
551595 include_path = osp.join(osp.dirname(file_path), include_path)
552596 # end make include path absolute
567611 self._merge_includes = False
568612 # end
569613
570 def _write(self, fp):
614 def _write(self, fp: IO) -> None:
571615 """Write an .ini-format representation of the configuration state in
572616 git compatible format"""
573 def write_section(name, section_dict):
617 def write_section(name: str, section_dict: _OMD) -> None:
574618 fp.write(("[%s]\n" % name).encode(defenc))
619
620 values: Sequence[str] # runtime only gets str in tests, but should be whatever _OMD stores
621 v: str
575622 for (key, values) in section_dict.items_all():
576623 if key == "__name__":
577624 continue
583630
584631 if self._defaults:
585632 write_section(cp.DEFAULTSECT, self._defaults)
633 value: _OMD
634
586635 for name, value in self._sections.items():
587636 write_section(name, value)
588637
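The `write_section` helper above emits one line per stored value so multivars survive a round-trip. A standalone sketch of that output shape, with a plain dict of lists standing in for `_OMD`:

```python
def write_section(name, section_dict):
    # one "\tkey = value" line per stored value, mirroring git's config layout
    lines = ["[%s]" % name]
    for key, values in section_dict.items():
        if key == "__name__":
            continue
        for v in values:
            lines.append("\t%s = %s" % (key, v))
    return "\n".join(lines)

text = write_section('remote "origin"',
                     {"fetch": ["+refs/heads/*:refs/remotes/origin/*"]})
print(text)
```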
589 def items(self, section_name):
638 def items(self, section_name: str) -> List[Tuple[str, str]]: # type: ignore[override]
590639 """:return: list((option, value), ...) pairs of all items in the given section"""
591640 return [(k, v) for k, v in super(GitConfigParser, self).items(section_name) if k != '__name__']
592641
593 def items_all(self, section_name):
642 def items_all(self, section_name: str) -> List[Tuple[str, List[str]]]:
594643 """:return: list((option, [values...]), ...) pairs of all items in the given section"""
595644 rv = _OMD(self._defaults)
596645
607656 return rv.items_all()
608657
609658 @needs_values
610 def write(self):
659 def write(self) -> None:
611660 """Write changes to our file, if there are changes at all
612661
613662 :raise IOError: if this is a read-only writer instance or if we could not obtain
614663 a file lock"""
615664 self._assure_writable("write")
616665 if not self._dirty:
617 return
666 return None
618667
619668 if isinstance(self._file_or_files, (list, tuple)):
620669 raise AssertionError("Cannot write back if there is not exactly a single file to write to, have %i files"
624673 if self._has_includes():
625674 log.debug("Skipping write-back of configuration file as include files were merged in." +
626675 "Set merge_includes=False to prevent this.")
627 return
676 return None
628677 # end
629678
630679 fp = self._file_or_files
631680
632681 # we have a physical file on disk, so get a lock
633 is_file_lock = isinstance(fp, (str, IOBase))
634 if is_file_lock:
682 is_file_lock = isinstance(fp, (str, os.PathLike, IOBase)) # can't use Pathlike until 3.5 dropped
683 if is_file_lock and self._lock is not None: # else raise Error?
635684 self._lock._obtain_lock()
685
636686 if not hasattr(fp, "seek"):
637 with open(self._file_or_files, "wb") as fp:
638 self._write(fp)
687 fp = cast(PathLike, fp)
688 with open(fp, "wb") as fp_open:
689 self._write(fp_open)
639690 else:
691 fp = cast('BytesIO', fp)
640692 fp.seek(0)
641693 # make sure we do not overwrite into an existing file
642694 if hasattr(fp, 'truncate'):
643695 fp.truncate()
644696 self._write(fp)
645697
646 def _assure_writable(self, method_name):
698 def _assure_writable(self, method_name: str) -> None:
647699 if self.read_only:
648700 raise IOError("Cannot execute non-constant method %s.%s" % (self, method_name))
649701
650 def add_section(self, section):
702 def add_section(self, section: str) -> None:
651703 """Assures added options will stay in order"""
652704 return super(GitConfigParser, self).add_section(section)
653705
654706 @property
655 def read_only(self):
707 def read_only(self) -> bool:
656708 """:return: True if this instance may change the configuration file"""
657709 return self._read_only
658710
659 def get_value(self, section, option, default=None):
711 @overload
712 def get_value(self, section: str, option: str, default: None = None) -> Union[int, float, str, bool]: ...
713
714 @overload
715 def get_value(self, section: str, option: str, default: str) -> str: ...
716
717 @overload
718 def get_value(self, section: str, option: str, default: float) -> float: ...
719
720 def get_value(self, section: str, option: str, default: Union[int, float, str, bool, None] = None
721 ) -> Union[int, float, str, bool]:
722 # can default or return type include bool?
660723 """Get an option's value.
661724
662725 If multiple values are specified for this option in the section, the
678741
679742 return self._string_to_value(valuestr)
680743
681 def get_values(self, section, option, default=None):
744 def get_values(self, section: str, option: str, default: Union[int, float, str, bool, None] = None
745 ) -> List[Union[int, float, str, bool]]:
682746 """Get an option's values.
683747
684748 If multiple values are specified for this option in the section, all are
700764
701765 return [self._string_to_value(valuestr) for valuestr in lst]
702766
703 def _string_to_value(self, valuestr):
767 def _string_to_value(self, valuestr: str) -> Union[int, float, str, bool]:
704768 types = (int, float)
705769 for numtype in types:
706770 try:
707771 val = numtype(valuestr)
708
709772 # truncated value ?
710773 if val != float(valuestr):
711774 continue
712
713775 return val
714776 except (ValueError, TypeError):
715777 continue
729791
730792 return valuestr
731793
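The coercion in `_string_to_value` above tries `int`, then `float`, rejecting lossy conversions before falling back to booleans and the raw string. A condensed sketch (boolean spellings abbreviated to a subset):

```python
def string_to_value(valuestr):
    for numtype in (int, float):
        try:
            val = numtype(valuestr)
            # skip values that do not round-trip through float exactly
            if val != float(valuestr):
                continue
            return val
        except (ValueError, TypeError):
            continue
    lowered = valuestr.lower()
    if lowered in ("true", "yes", "on"):
        return True
    if lowered in ("false", "no", "off"):
        return False
    return valuestr

print(string_to_value("42"), string_to_value("1.5"), string_to_value("true"))
```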
732 def _value_to_string(self, value):
794 def _value_to_string(self, value: Union[str, bytes, int, float, bool]) -> str:
733795 if isinstance(value, (int, float, bool)):
734796 return str(value)
735797 return force_text(value)
736798
737799 @needs_values
738800 @set_dirty_and_flush_changes
739 def set_value(self, section, option, value):
801 def set_value(self, section: str, option: str, value: Union[str, bytes, int, float, bool]) -> 'GitConfigParser':
740802 """Sets the given option in section to the given value.
741803 It will create the section if required, and will not throw as opposed to the default
742804 ConfigParser 'set' method.
754816
755817 @needs_values
756818 @set_dirty_and_flush_changes
757 def add_value(self, section, option, value):
819 def add_value(self, section: str, option: str, value: Union[str, bytes, int, float, bool]) -> 'GitConfigParser':
758820 """Adds a value for the given option in section.
759821 It will create the section if required, and will not throw as opposed to the default
760822 ConfigParser 'set' method. The value becomes the new value of the option as returned
771833 self._sections[section].add(option, self._value_to_string(value))
772834 return self
773835
774 def rename_section(self, section, new_name):
836 def rename_section(self, section: str, new_name: str) -> 'GitConfigParser':
775837 """rename the given section to new_name
776838 :raise ValueError: if section doesn't exit
777839 :raise ValueError: if a section with new_name does already exist
66 from gitdb.db import GitDB # @UnusedImport
77 from gitdb.db import LooseObjectDB
88
9 from .exc import (
10 GitCommandError,
11 BadObject
12 )
9 from gitdb.exc import BadObject
10 from git.exc import GitCommandError
11
12 # typing-------------------------------------------------
13
14 from typing import TYPE_CHECKING
15 from git.types import PathLike
16
17 if TYPE_CHECKING:
18 from git.cmd import Git
1319
1420
21 # --------------------------------------------------------
22
1523 __all__ = ('GitCmdObjectDB', 'GitDB')
16
17 # class GitCmdObjectDB(CompoundDB, ObjectDBW):
1824
1925
2026 class GitCmdObjectDB(LooseObjectDB):
2733 have packs and the other implementations
2834 """
2935
30 def __init__(self, root_path, git):
36 def __init__(self, root_path: PathLike, git: 'Git') -> None:
3137 """Initialize this instance with the root and a git command"""
3238 super(GitCmdObjectDB, self).__init__(root_path)
3339 self._git = git
3440
35 def info(self, sha):
36 hexsha, typename, size = self._git.get_object_header(bin_to_hex(sha))
41 def info(self, binsha: bytes) -> OInfo:
42 hexsha, typename, size = self._git.get_object_header(bin_to_hex(binsha))
3743 return OInfo(hex_to_bin(hexsha), typename, size)
3844
39 def stream(self, sha):
45 def stream(self, binsha: bytes) -> OStream:
4046 """For now, all lookup is done by git itself"""
41 hexsha, typename, size, stream = self._git.stream_object_data(bin_to_hex(sha))
47 hexsha, typename, size, stream = self._git.stream_object_data(bin_to_hex(binsha))
4248 return OStream(hex_to_bin(hexsha), typename, size, stream)
4349
4450 # { Interface
4551
46 def partial_to_complete_sha_hex(self, partial_hexsha):
52 def partial_to_complete_sha_hex(self, partial_hexsha: str) -> bytes:
4753 """:return: Full binary 20 byte sha from the given partial hexsha
4854 :raise AmbiguousObjectName:
4955 :raise BadObject:
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5
56 import re
6
77 from git.cmd import handle_process_output
88 from git.compat import defenc
99 from git.util import finalize_process, hex_to_bin
1212 from .objects.util import mode_str_to_int
1313
1414
15 # typing ------------------------------------------------------------------
16
17 from typing import Any, Iterator, List, Match, Optional, Tuple, Type, TypeVar, Union, TYPE_CHECKING, cast
18 from git.types import PathLike, Literal
19
20 if TYPE_CHECKING:
21 from .objects.tree import Tree
22 from .objects import Commit
23 from git.repo.base import Repo
24 from git.objects.base import IndexObject
25 from subprocess import Popen
26 from git import Git
27
28 Lit_change_type = Literal['A', 'D', 'C', 'M', 'R', 'T', 'U']
29
30
31 # def is_change_type(inp: str) -> TypeGuard[Lit_change_type]:
32 # # return True
33 # return inp in ['A', 'D', 'C', 'M', 'R', 'T', 'U']
34
35 # ------------------------------------------------------------------------
36
37
1538 __all__ = ('Diffable', 'DiffIndex', 'Diff', 'NULL_TREE')
1639
1740 # Special object to compare against the empty tree in diffs
2043 _octal_byte_re = re.compile(b'\\\\([0-9]{3})')
2144
2245
23 def _octal_repl(matchobj):
46 def _octal_repl(matchobj: Match) -> bytes:
2447 value = matchobj.group(1)
2548 value = int(value, 8)
2649 value = bytes(bytearray((value,)))
2750 return value
2851
2952
30 def decode_path(path, has_ab_prefix=True):
53 def decode_path(path: bytes, has_ab_prefix: bool = True) -> Optional[bytes]:
3154 if path == b'/dev/null':
3255 return None
3356
5982 class Index(object):
6083 pass
6184
62 def _process_diff_args(self, args):
85 def _process_diff_args(self, args: List[Union[str, 'Diffable', Type['Diffable.Index'], object]]
86 ) -> List[Union[str, 'Diffable', Type['Diffable.Index'], object]]:
6387 """
6488 :return:
6589 possibly altered version of the given args list.
6791 Subclasses can use it to alter the behaviour of the superclass"""
6892 return args
6993
70 def diff(self, other=Index, paths=None, create_patch=False, **kwargs):
94 def diff(self, other: Union[Type['Index'], 'Tree', 'Commit', None, str, object] = Index,
95 paths: Union[PathLike, List[PathLike], Tuple[PathLike, ...], None] = None,
96 create_patch: bool = False, **kwargs: Any) -> 'DiffIndex':
7197 """Creates diffs between two items being trees, trees and index or an
7298 index and the working tree. It will detect renames automatically.
7399
98124 :note:
99125 On a bare repository, 'other' needs to be provided as Index or as
100126 Tree/Commit, or a git command error will occur"""
101 args = []
127 args: List[Union[PathLike, Diffable, Type['Diffable.Index'], object]] = []
102128 args.append("--abbrev=40") # we need full shas
103129 args.append("--full-index") # get full index paths, not only filenames
104130
115141
116142 if paths is not None and not isinstance(paths, (tuple, list)):
117143 paths = [paths]
144
145 if hasattr(self, 'Has_Repo'):
146 self.repo: 'Repo' = self.repo
118147
119148 diff_cmd = self.repo.git.diff
120149 if other is self.Index:
148177 return index
149178
150179
151 class DiffIndex(list):
180 T_Diff = TypeVar('T_Diff', bound='Diff')
181
182
183 class DiffIndex(List[T_Diff]):
152184
153185 """Implements an Index for diffs, allowing a list of Diffs to be queried by
154186 the diff properties.
162194 # T = Changed in the type
163195 change_type = ("A", "C", "D", "R", "M", "T")
164196
165 def iter_change_type(self, change_type):
197 def iter_change_type(self, change_type: Lit_change_type) -> Iterator[T_Diff]:
166198 """
167199 :return:
168200 iterator yielding Diff instances that match the given change_type
179211 if change_type not in self.change_type:
180212 raise ValueError("Invalid change type: %s" % change_type)
181213
182 for diff in self:
183 if diff.change_type == change_type:
184 yield diff
185 elif change_type == "A" and diff.new_file:
186 yield diff
187 elif change_type == "D" and diff.deleted_file:
188 yield diff
189 elif change_type == "C" and diff.copied_file:
190 yield diff
191 elif change_type == "R" and diff.renamed:
192 yield diff
193 elif change_type == "M" and diff.a_blob and diff.b_blob and diff.a_blob != diff.b_blob:
194 yield diff
214 for diffidx in self:
215 if diffidx.change_type == change_type:
216 yield diffidx
217 elif change_type == "A" and diffidx.new_file:
218 yield diffidx
219 elif change_type == "D" and diffidx.deleted_file:
220 yield diffidx
221 elif change_type == "C" and diffidx.copied_file:
222 yield diffidx
223 elif change_type == "R" and diffidx.renamed:
224 yield diffidx
225 elif change_type == "M" and diffidx.a_blob and diffidx.b_blob and diffidx.a_blob != diffidx.b_blob:
226 yield diffidx
195227 # END for each diff
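The dispatch in `iter_change_type` above also matches 'A'/'D' diffs by their boolean flags, not only by the recorded status letter. A self-contained sketch with a minimal stand-in for `Diff` (the `MiniDiff` class is hypothetical):

```python
from dataclasses import dataclass
from typing import Iterator, List, Optional

@dataclass
class MiniDiff:
    """Hypothetical stand-in exposing just the fields the filter reads."""
    change_type: Optional[str] = None
    new_file: bool = False
    deleted_file: bool = False

def iter_change_type(diffs: List[MiniDiff], change_type: str) -> Iterator[MiniDiff]:
    for d in diffs:
        if d.change_type == change_type:
            yield d
        elif change_type == "A" and d.new_file:
            yield d
        elif change_type == "D" and d.deleted_file:
            yield d

diffs = [MiniDiff(new_file=True), MiniDiff(change_type="D", deleted_file=True)]
print(len(list(iter_change_type(diffs, "A"))))  # 1
```

The `elif` chain ensures a diff whose letter already matched is not yielded a second time by a flag branch.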
196228
197229
254286 "new_file", "deleted_file", "copied_file", "raw_rename_from",
255287 "raw_rename_to", "diff", "change_type", "score")
256288
257 def __init__(self, repo, a_rawpath, b_rawpath, a_blob_id, b_blob_id, a_mode,
258 b_mode, new_file, deleted_file, copied_file, raw_rename_from,
259 raw_rename_to, diff, change_type, score):
260
261 self.a_mode = a_mode
262 self.b_mode = b_mode
289 def __init__(self, repo: 'Repo',
290 a_rawpath: Optional[bytes], b_rawpath: Optional[bytes],
291 a_blob_id: Union[str, bytes, None], b_blob_id: Union[str, bytes, None],
292 a_mode: Union[bytes, str, None], b_mode: Union[bytes, str, None],
293 new_file: bool, deleted_file: bool, copied_file: bool,
294 raw_rename_from: Optional[bytes], raw_rename_to: Optional[bytes],
295 diff: Union[str, bytes, None], change_type: Optional[Lit_change_type], score: Optional[int]) -> None:
263296
264297 assert a_rawpath is None or isinstance(a_rawpath, bytes)
265298 assert b_rawpath is None or isinstance(b_rawpath, bytes)
266299 self.a_rawpath = a_rawpath
267300 self.b_rawpath = b_rawpath
268301
269 if self.a_mode:
270 self.a_mode = mode_str_to_int(self.a_mode)
271 if self.b_mode:
272 self.b_mode = mode_str_to_int(self.b_mode)
302 self.a_mode = mode_str_to_int(a_mode) if a_mode else None
303 self.b_mode = mode_str_to_int(b_mode) if b_mode else None
273304
274305 # Determine whether this diff references a submodule, if it does then
275306 # we need to overwrite "repo" to the corresponding submodule's repo instead
280311 repo = submodule.module()
281312 break
282313
314 self.a_blob: Union['IndexObject', None]
283315 if a_blob_id is None or a_blob_id == self.NULL_HEX_SHA:
284316 self.a_blob = None
285317 else:
286318 self.a_blob = Blob(repo, hex_to_bin(a_blob_id), mode=self.a_mode, path=self.a_path)
287319
320 self.b_blob: Union['IndexObject', None]
288321 if b_blob_id is None or b_blob_id == self.NULL_HEX_SHA:
289322 self.b_blob = None
290323 else:
291324 self.b_blob = Blob(repo, hex_to_bin(b_blob_id), mode=self.b_mode, path=self.b_path)
292325
293 self.new_file = new_file
294 self.deleted_file = deleted_file
295 self.copied_file = copied_file
326 self.new_file: bool = new_file
327 self.deleted_file: bool = deleted_file
328 self.copied_file: bool = copied_file
296329
297330 # be clear and use None instead of empty strings
298331 assert raw_rename_from is None or isinstance(raw_rename_from, bytes)
301334 self.raw_rename_to = raw_rename_to or None
302335
303336 self.diff = diff
304 self.change_type = change_type
337 self.change_type: Union[Lit_change_type, None] = change_type
305338 self.score = score
306339
307 def __eq__(self, other):
340 def __eq__(self, other: object) -> bool:
308341 for name in self.__slots__:
309342 if getattr(self, name) != getattr(other, name):
310343 return False
311344 # END for each name
312345 return True
313346
314 def __ne__(self, other):
347 def __ne__(self, other: object) -> bool:
315348 return not (self == other)
316349
317 def __hash__(self):
350 def __hash__(self) -> int:
318351 return hash(tuple(getattr(self, n) for n in self.__slots__))
319352
320 def __str__(self):
321 h = "%s"
353 def __str__(self) -> str:
354 h: str = "%s"
322355 if self.a_blob:
323356 h %= self.a_blob.path
324357 elif self.b_blob:
325358 h %= self.b_blob.path
326359
327 msg = ''
360 msg: str = ''
328361 line = None # temp line
329362 line_length = 0 # line length
330363 for b, n in zip((self.a_blob, self.b_blob), ('lhs', 'rhs')):
353386 if self.diff:
354387 msg += '\n---'
355388 try:
356 msg += self.diff.decode(defenc)
389 msg += self.diff.decode(defenc) if isinstance(self.diff, bytes) else self.diff
357390 except UnicodeDecodeError:
358391 msg += 'OMITTED BINARY DATA'
359392 # end handle encoding
366399 # end
367400 return res
368401
369 @property
370 def a_path(self):
402 @ property
403 def a_path(self) -> Optional[str]:
371404 return self.a_rawpath.decode(defenc, 'replace') if self.a_rawpath else None
372405
373 @property
374 def b_path(self):
406 @ property
407 def b_path(self) -> Optional[str]:
375408 return self.b_rawpath.decode(defenc, 'replace') if self.b_rawpath else None
376409
377 @property
378 def rename_from(self):
410 @ property
411 def rename_from(self) -> Optional[str]:
379412 return self.raw_rename_from.decode(defenc, 'replace') if self.raw_rename_from else None
380413
381 @property
382 def rename_to(self):
414 @ property
415 def rename_to(self) -> Optional[str]:
383416 return self.raw_rename_to.decode(defenc, 'replace') if self.raw_rename_to else None
384417
385 @property
386 def renamed(self):
418 @ property
419 def renamed(self) -> bool:
387420 """:returns: True if the blob of our diff has been renamed
388421 :note: This property is deprecated, please use ``renamed_file`` instead.
389422 """
390423 return self.renamed_file
391424
392 @property
393 def renamed_file(self):
425 @ property
426 def renamed_file(self) -> bool:
394427 """:returns: True if the blob of our diff has been renamed
395428 """
396429 return self.rename_from != self.rename_to
397430
398 @classmethod
399 def _pick_best_path(cls, path_match, rename_match, path_fallback_match):
431 @ classmethod
432 def _pick_best_path(cls, path_match: bytes, rename_match: bytes, path_fallback_match: bytes) -> Optional[bytes]:
400433 if path_match:
401434 return decode_path(path_match)
402435
408441
409442 return None
410443
411 @classmethod
412 def _index_from_patch_format(cls, repo, proc):
444 @ classmethod
445 def _index_from_patch_format(cls, repo: 'Repo', proc: Union['Popen', 'Git.AutoInterrupt']) -> DiffIndex:
413446 """Create a new DiffIndex from the given text which must be in patch format
414447 :param repo: is the repository we are operating on - it is required
415448 :param stream: result of 'git diff' as a stream (supporting file protocol)
416449 :return: git.DiffIndex """
417450
418451 ## FIXME: Here SLURPING raw, need to re-phrase header-regexes linewise.
419 text = []
420 handle_process_output(proc, text.append, None, finalize_process, decode_streams=False)
452 text_list: List[bytes] = []
453 handle_process_output(proc, text_list.append, None, finalize_process, decode_streams=False)
421454
422455 # for now, we have to bake the stream
423 text = b''.join(text)
424 index = DiffIndex()
425 previous_header = None
426 header = None
456 text = b''.join(text_list)
457 index: 'DiffIndex' = DiffIndex()
458 previous_header: Union[Match[bytes], None] = None
459 header: Union[Match[bytes], None] = None
460 a_path, b_path = None, None # for mypy
461 a_mode, b_mode = None, None # for mypy
427462 for _header in cls.re_header.finditer(text):
428463 a_path_fallback, b_path_fallback, \
429464 old_mode, new_mode, \
463498 previous_header = _header
464499 header = _header
465500 # end for each header we parse
466 if index:
501 if index and header:
467502 index[-1].diff = text[header.end():]
468503 # end assign last diff
469504
470505 return index
471506
472 @classmethod
473 def _index_from_raw_format(cls, repo, proc):
507 @ staticmethod
508 def _handle_diff_line(lines_bytes: bytes, repo: 'Repo', index: DiffIndex) -> None:
509 lines = lines_bytes.decode(defenc)
510
511 for line in lines.split(':')[1:]:
512 meta, _, path = line.partition('\x00')
513 path = path.rstrip('\x00')
514 a_blob_id: Optional[str]
515 b_blob_id: Optional[str]
516 old_mode, new_mode, a_blob_id, b_blob_id, _change_type = meta.split(None, 4)
517 # Change type can be R100
518 # R: status letter
519 # 100: score (in case of copy and rename)
520 # assert is_change_type(_change_type[0]), f"Unexpected value for change_type received: {_change_type[0]}"
521 change_type: Lit_change_type = cast(Lit_change_type, _change_type[0])
522 score_str = ''.join(_change_type[1:])
523 score = int(score_str) if score_str.isdigit() else None
524 path = path.strip()
525 a_path = path.encode(defenc)
526 b_path = path.encode(defenc)
527 deleted_file = False
528 new_file = False
529 copied_file = False
530 rename_from = None
531 rename_to = None
532
533 # NOTE: We cannot conclude the change type from the existence of a blob,
534 # as diffs with the working tree do not have blobs yet
535 if change_type == 'D':
536 b_blob_id = None # Optional[str]
537 deleted_file = True
538 elif change_type == 'A':
539 a_blob_id = None
540 new_file = True
541 elif change_type == 'C':
542 copied_file = True
543 a_path_str, b_path_str = path.split('\x00', 1)
544 a_path = a_path_str.encode(defenc)
545 b_path = b_path_str.encode(defenc)
546 elif change_type == 'R':
547 a_path_str, b_path_str = path.split('\x00', 1)
548 a_path = a_path_str.encode(defenc)
549 b_path = b_path_str.encode(defenc)
550 rename_from, rename_to = a_path, b_path
551 elif change_type == 'T':
552 # Nothing to do
553 pass
554 # END add/remove handling
555
556 diff = Diff(repo, a_path, b_path, a_blob_id, b_blob_id, old_mode, new_mode,
557 new_file, deleted_file, copied_file, rename_from, rename_to,
558 '', change_type, score)
559 index.append(diff)
560
561 @ classmethod
562 def _index_from_raw_format(cls, repo: 'Repo', proc: 'Popen') -> 'DiffIndex':
474563 """Create a new DiffIndex from the given stream which must be in raw format.
475564 :return: git.DiffIndex"""
476565 # handles
477566 # :100644 100644 687099101... 37c5e30c8... M .gitignore
478567
479 index = DiffIndex()
480
481 def handle_diff_line(lines):
482 lines = lines.decode(defenc)
483
484 for line in lines.split(':')[1:]:
485 meta, _, path = line.partition('\x00')
486 path = path.rstrip('\x00')
487 old_mode, new_mode, a_blob_id, b_blob_id, _change_type = meta.split(None, 4)
488 # Change type can be R100
489 # R: status letter
490 # 100: score (in case of copy and rename)
491 change_type = _change_type[0]
492 score_str = ''.join(_change_type[1:])
493 score = int(score_str) if score_str.isdigit() else None
494 path = path.strip()
495 a_path = path.encode(defenc)
496 b_path = path.encode(defenc)
497 deleted_file = False
498 new_file = False
499 copied_file = False
500 rename_from = None
501 rename_to = None
502
503 # NOTE: We cannot conclude from the existence of a blob to change type
504 # as diffs with the working do not have blobs yet
505 if change_type == 'D':
506 b_blob_id = None
507 deleted_file = True
508 elif change_type == 'A':
509 a_blob_id = None
510 new_file = True
511 elif change_type == 'C':
512 copied_file = True
513 a_path, b_path = path.split('\x00', 1)
514 a_path = a_path.encode(defenc)
515 b_path = b_path.encode(defenc)
516 elif change_type == 'R':
517 a_path, b_path = path.split('\x00', 1)
518 a_path = a_path.encode(defenc)
519 b_path = b_path.encode(defenc)
520 rename_from, rename_to = a_path, b_path
521 elif change_type == 'T':
522 # Nothing to do
523 pass
524 # END add/remove handling
525
526 diff = Diff(repo, a_path, b_path, a_blob_id, b_blob_id, old_mode, new_mode,
527 new_file, deleted_file, copied_file, rename_from, rename_to,
528 '', change_type, score)
529 index.append(diff)
530
531 handle_process_output(proc, handle_diff_line, None, finalize_process, decode_streams=False)
568 index: 'DiffIndex' = DiffIndex()
569 handle_process_output(proc, lambda byt: cls._handle_diff_line(byt, repo, index),
570 None, finalize_process, decode_streams=False)
532571
533572 return index
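`_handle_diff_line` above splits each raw record into NUL-separated meta and path parts after the leading colon. The meta-field split can be sketched on its own (the record string is illustrative):

```python
def parse_raw_record(record: str):
    # record shape: "<old_mode> <new_mode> <a_sha> <b_sha> <status>\x00<path>"
    meta, _, path = record.partition('\x00')
    old_mode, new_mode, a_blob_id, b_blob_id, change = meta.split(None, 4)
    change_type = change[0]       # e.g. 'M', or 'R' taken from "R100"
    score = change[1:] or None    # similarity score for copies/renames
    return change_type, score, path.rstrip('\x00')

print(parse_raw_record("100644 100644 687099101 37c5e30c8 M\x00.gitignore"))
# ('M', None, '.gitignore')
```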
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
55 """ Module containing all exceptions thrown throughout the git package, """
66
7 from gitdb.exc import BadName # NOQA @UnusedWildImport skipcq: PYL-W0401, PYL-W0614
78 from gitdb.exc import * # NOQA @UnusedWildImport skipcq: PYL-W0401, PYL-W0614
89 from git.compat import safe_decode
10
11 # typing ----------------------------------------------------
12
13 from typing import List, Sequence, Tuple, Union, TYPE_CHECKING
14 from git.types import PathLike
15
16 if TYPE_CHECKING:
17 from git.repo.base import Repo
18
19 # ------------------------------------------------------------------
920
1021
1122 class GitError(Exception):
3647 #: "'%s' failed%s"
3748 _msg = "Cmd('%s') failed%s"
3849
39 def __init__(self, command, status=None, stderr=None, stdout=None):
50 def __init__(self, command: Union[List[str], Tuple[str, ...], str],
51 status: Union[str, int, None, Exception] = None,
52 stderr: Union[bytes, str, None] = None,
53 stdout: Union[bytes, str, None] = None) -> None:
4054 if not isinstance(command, (tuple, list)):
4155 command = command.split()
4256 self.command = command
5468 self._cmd = safe_decode(command[0])
5569 self._cmdline = ' '.join(safe_decode(i) for i in command)
5670 self._cause = status and " due to: %s" % status or "!"
57 self.stdout = stdout and "\n stdout: '%s'" % safe_decode(stdout) or ''
58 self.stderr = stderr and "\n stderr: '%s'" % safe_decode(stderr) or ''
71 stdout_decode = safe_decode(stdout)
72 stderr_decode = safe_decode(stderr)
73 self.stdout = stdout_decode and "\n stdout: '%s'" % stdout_decode or ''
74 self.stderr = stderr_decode and "\n stderr: '%s'" % stderr_decode or ''
5975
60 def __str__(self):
76 def __str__(self) -> str:
6177 return (self._msg + "\n cmdline: %s%s%s") % (
6278 self._cmd, self._cause, self._cmdline, self.stdout, self.stderr)
6379
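The message assembly in `CommandError.__init__`/`__str__` above (a cause defaulting to "!", stdout/stderr appended only when non-empty) can be sketched as a plain function; the name and exact layout here are illustrative, not the class's API:

```python
def format_command_error(command, status=None, stderr=None, stdout=None):
    # "and/or" idiom: falsy status yields "!", falsy streams yield ''
    cause = status and " due to: %s" % status or "!"
    out = stdout and "\n  stdout: '%s'" % stdout or ''
    err = stderr and "\n  stderr: '%s'" % stderr or ''
    return "Cmd('%s') failed%s\n  cmdline: %s%s%s" % (
        command[0], cause, ' '.join(command), out, err)

print(format_command_error(["git", "fetch"], status=128,
                           stderr="fatal: not a git repository"))
```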
6581 class GitCommandNotFound(CommandError):
6682 """Thrown if we cannot find the `git` executable in the PATH or at the path given by
6783 the GIT_PYTHON_GIT_EXECUTABLE environment variable"""
68 def __init__(self, command, cause):
84
85 def __init__(self, command: Union[List[str], Tuple[str], str], cause: Union[str, Exception]) -> None:
6986 super(GitCommandNotFound, self).__init__(command, cause)
7087 self._msg = "Cmd('%s') not found%s"
7188
7390 class GitCommandError(CommandError):
7491 """ Thrown if execution of the git command fails with non-zero status code. """
7592
76 def __init__(self, command, status, stderr=None, stdout=None):
93 def __init__(self, command: Union[List[str], Tuple[str, ...], str],
94 status: Union[str, int, None, Exception] = None,
95 stderr: Union[bytes, str, None] = None,
96 stdout: Union[bytes, str, None] = None,
97 ) -> None:
7798 super(GitCommandError, self).__init__(command, status, stderr, stdout)
7899
79100
91112 were checked out successfully and hence match the version stored in the
92113 index"""
93114
94 def __init__(self, message, failed_files, valid_files, failed_reasons):
115 def __init__(self, message: str, failed_files: Sequence[PathLike], valid_files: Sequence[PathLike],
116 failed_reasons: List[str]) -> None:
117
95118 Exception.__init__(self, message)
96119 self.failed_files = failed_files
97120 self.failed_reasons = failed_reasons
98121 self.valid_files = valid_files
99122
100 def __str__(self):
123 def __str__(self) -> str:
101124 return Exception.__str__(self) + ":%s" % self.failed_files
102125
103126
115138 """Thrown if a hook exits with a non-zero exit code. It provides access to the exit code and the string returned
116139 via standard output"""
117140
118 def __init__(self, command, status, stderr=None, stdout=None):
141 def __init__(self, command: Union[List[str], Tuple[str, ...], str],
142 status: Union[str, int, None, Exception],
143 stderr: Union[bytes, str, None] = None,
144 stdout: Union[bytes, str, None] = None) -> None:
145
119146 super(HookExecutionError, self).__init__(command, status, stderr, stdout)
120147 self._msg = "Hook('%s') failed%s"
121148
123150 class RepositoryDirtyError(GitError):
124151 """Thrown whenever an operation on a repository fails as it has uncommitted changes that would be overwritten"""
125152
126 def __init__(self, repo, message):
153 def __init__(self, repo: 'Repo', message: str) -> None:
127154 self.repo = repo
128155 self.message = message
129156
130 def __str__(self):
157 def __str__(self) -> str:
131158 return "Operation cannot be performed on %r: %s" % (self.repo, self.message)
00 """Initialize the index package"""
11 # flake8: noqa
2 from __future__ import absolute_import
3
42 from .base import *
53 from .typ import *
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5
56 import glob
67 from io import BytesIO
78 import os
1617 from git.exc import (
1718 GitCommandError,
1819 CheckoutError,
20 GitError,
1921 InvalidGitRepositoryError
2022 )
2123 from git.objects import (
3840 from gitdb.base import IStream
3941 from gitdb.db import MemoryDB
4042
41 import git.diff as diff
43 import git.diff as git_diff
4244 import os.path as osp
4345
4446 from .fun import (
6264 git_working_dir
6365 )
6466
67 # typing -----------------------------------------------------------------------------
68
69 from typing import (Any, BinaryIO, Callable, Dict, IO, Iterable, Iterator, List, NoReturn,
70 Sequence, TYPE_CHECKING, Tuple, Type, Union)
71
72 from git.types import Commit_ish, PathLike
73
74 if TYPE_CHECKING:
75 from subprocess import Popen
76 from git.repo import Repo
77 from git.refs.reference import Reference
78 from git.util import Actor
79
80
81 StageType = int
82 Treeish = Union[Tree, Commit, str, bytes]
83
84 # ------------------------------------------------------------------------------------
85
6586
6687 __all__ = ('IndexFile', 'CheckoutError')
6788
6889
69 class IndexFile(LazyMixin, diff.Diffable, Serializable):
90 class IndexFile(LazyMixin, git_diff.Diffable, Serializable):
7091
7192 """
7293 Implements an Index that can be manipulated using a native implementation in
92113 _VERSION = 2 # latest version we support
93114 S_IFGITLINK = S_IFGITLINK # a submodule
94115
95 def __init__(self, repo, file_path=None):
116 def __init__(self, repo: 'Repo', file_path: Union[PathLike, None] = None) -> None:
96117 """Initialize this Index instance, optionally from the given ``file_path``.
97118 If no file_path is given, we will be created from the current index file.
98119
101122 self.repo = repo
102123 self.version = self._VERSION
103124 self._extension_data = b''
104 self._file_path = file_path or self._index_path()
105
106 def _set_cache_(self, attr):
125 self._file_path: PathLike = file_path or self._index_path()
126
127 def _set_cache_(self, attr: str) -> None:
107128 if attr == "entries":
108129 # read the current index
109130 # try memory map for speed
114135 ok = True
115136 except OSError:
116137 # in new repositories, there may be no index, which means we are empty
117 self.entries = {}
118 return
138 self.entries: Dict[Tuple[PathLike, StageType], IndexEntry] = {}
139 return None
119140 finally:
120141 if not ok:
121142 lfd.rollback()
132153 else:
133154 super(IndexFile, self)._set_cache_(attr)
134155
135 def _index_path(self):
136 return join_path_native(self.repo.git_dir, "index")
156 def _index_path(self) -> PathLike:
157 if self.repo.git_dir:
158 return join_path_native(self.repo.git_dir, "index")
159 else:
160 raise GitCommandError("No git directory given to join index path")
137161
138162 @property
139 def path(self):
163 def path(self) -> PathLike:
140164 """ :return: Path to the index file we are representing """
141165 return self._file_path
142166
143 def _delete_entries_cache(self):
167 def _delete_entries_cache(self) -> None:
144168 """Safely clear the entries cache so it can be recreated"""
145169 try:
146170 del(self.entries)
151175
152176 #{ Serializable Interface
153177
154 def _deserialize(self, stream):
178 def _deserialize(self, stream: IO) -> 'IndexFile':
155179 """Initialize this instance with index values read from the given stream"""
156180 self.version, self.entries, self._extension_data, _content_sha = read_cache(stream)
157181 return self
158182
159 def _entries_sorted(self):
183 def _entries_sorted(self) -> List[IndexEntry]:
160184 """:return: list of entries, in a sorted fashion, first by path, then by stage"""
161185 return sorted(self.entries.values(), key=lambda e: (e.path, e.stage))
162186
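`_entries_sorted` orders entries by path first, then stage, via a tuple sort key. A small self-contained illustration (the `Entry` namedtuple is a stand-in for the real `IndexEntry`):

```python
from collections import namedtuple

# Stand-in for IndexEntry carrying only the fields the sort key touches.
Entry = namedtuple("Entry", ["path", "stage"])

def entries_sorted(entries):
    # Same key as IndexFile._entries_sorted: first by path, then by stage.
    return sorted(entries, key=lambda e: (e.path, e.stage))
```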
163 def _serialize(self, stream, ignore_extension_data=False):
187 def _serialize(self, stream: IO, ignore_extension_data: bool = False) -> 'IndexFile':
164188 entries = self._entries_sorted()
165 extension_data = self._extension_data
189 extension_data = self._extension_data # type: Union[None, bytes]
166190 if ignore_extension_data:
167191 extension_data = None
168192 write_cache(entries, stream, extension_data)
170194
171195 #} END serializable interface
172196
173 def write(self, file_path=None, ignore_extension_data=False):
197 def write(self, file_path: Union[None, PathLike] = None, ignore_extension_data: bool = False) -> None:
174198 """Write the current state to our file path or to the given one
175199
176200 :param file_path:
190214 Alternatively, use IndexFile.write_tree() to handle this case
191215 automatically
192216
193 :return: self"""
217 :return: None"""
194218 # make sure we have our entries read before getting a write lock
195219 # else it would be done when streaming. This can happen
196220 # if one doesn't change the index, but writes it right away
214238
215239 @post_clear_cache
216240 @default_index
217 def merge_tree(self, rhs, base=None):
241 def merge_tree(self, rhs: Treeish, base: Union[None, Treeish] = None) -> 'IndexFile':
218242 """Merge the given rhs treeish into the current index, possibly taking
219243 a common base treeish into account.
220244
241265 # -i : ignore working tree status
242266 # --aggressive : handle more merge cases
243267 # -m : do an actual merge
244 args = ["--aggressive", "-i", "-m"]
268 args: List[Union[Treeish, str]] = ["--aggressive", "-i", "-m"]
245269 if base is not None:
246270 args.append(base)
247271 args.append(rhs)
250274 return self
251275
252276 @classmethod
253 def new(cls, repo, *tree_sha):
277 def new(cls, repo: 'Repo', *tree_sha: Union[str, Tree]) -> 'IndexFile':
254278 """ Merge the given treeish revisions into a new index which is returned.
255279 This method behaves like git-read-tree --aggressive when doing the merge.
256280
263287 New IndexFile instance. Its path will be undefined.
264288 If you intend to write such a merged Index, supply an alternate file_path
265289 to its 'write' method."""
266 base_entries = aggressive_tree_merge(repo.odb, [to_bin_sha(str(t)) for t in tree_sha])
290 tree_sha_bytes: List[bytes] = [to_bin_sha(str(t)) for t in tree_sha]
291 base_entries = aggressive_tree_merge(repo.odb, tree_sha_bytes)
267292
268293 inst = cls(repo)
269294 # convert to entries dict
270 entries = dict(zip(((e.path, e.stage) for e in base_entries),
271 (IndexEntry.from_base(e) for e in base_entries)))
295 entries: Dict[Tuple[PathLike, int], IndexEntry] = dict(zip(
296 ((e.path, e.stage) for e in base_entries),
297 (IndexEntry.from_base(e) for e in base_entries)))
272298
273299 inst.entries = entries
274300 return inst
275301
276302 @classmethod
277 def from_tree(cls, repo, *treeish, **kwargs):
303 def from_tree(cls, repo: 'Repo', *treeish: Treeish, **kwargs: Any) -> 'IndexFile':
278304 """Merge the given treeish revisions into a new index which is returned.
279305 The original index will remain unaltered
280306
311337 if len(treeish) == 0 or len(treeish) > 3:
312338 raise ValueError("Please specify between 1 and 3 treeish, got %i" % len(treeish))
313339
314 arg_list = []
340 arg_list: List[Union[Treeish, str]] = []
315341 # ignore that working tree and index possibly are out of date
316342 if len(treeish) > 1:
317343 # drop unmerged entries when reading our index and merging
330356 # as it considers existing entries. moving it essentially clears the index.
331357 # Unfortunately there is no 'soft' way to do it.
332358 # The TemporaryFileSwap assures the original file gets put back
333 index_handler = TemporaryFileSwap(join_path_native(repo.git_dir, 'index'))
359 if repo.git_dir:
360 index_handler = TemporaryFileSwap(join_path_native(repo.git_dir, 'index'))
334361 try:
335362 repo.git.read_tree(*arg_list, **kwargs)
336363 index = cls(repo, tmp_index)
345372
346373 # UTILITIES
347374 @unbare_repo
348 def _iter_expand_paths(self, paths):
375 def _iter_expand_paths(self: 'IndexFile', paths: Sequence[PathLike]) -> Iterator[PathLike]:
349376 """Expand the directories in list of paths to the corresponding paths accordingly,
350377
351378 Note: git will add items multiple times even if a glob overlapped
352379 with manually specified paths or if paths were specified multiple
353380 times - we respect that and do not prune"""
354 def raise_exc(e):
381 def raise_exc(e: Exception) -> NoReturn:
355382 raise e
356 r = self.repo.working_tree_dir
383 r = str(self.repo.working_tree_dir)
357384 rs = r + os.sep
358385 for path in paths:
359 abs_path = path
386 abs_path = str(path)
360387 if not osp.isabs(abs_path):
361388 abs_path = osp.join(r, path)
362389 # END make absolute path
373400 # end check symlink
374401
375402 # if the path is not already pointing to an existing file, resolve globs if possible
376 if not os.path.exists(path) and ('?' in path or '*' in path or '[' in path):
403 if not os.path.exists(abs_path) and ('?' in abs_path or '*' in abs_path or '[' in abs_path):
377404 resolved_paths = glob.glob(abs_path)
378405 # not abs_path in resolved_paths:
379406 # a glob() resolving to the same path we are feeding it with
383410 # whose name contains wildcard characters.
384411 if abs_path not in resolved_paths:
385412 for f in self._iter_expand_paths(glob.glob(abs_path)):
386 yield f.replace(rs, '')
413 yield str(f).replace(rs, '')
387414 continue
388415 # END glob handling
389416 try:
395422 # END for each subdirectory
396423 except OSError:
397424 # was a file or something that could not be iterated
398 yield path.replace(rs, '')
425 yield abs_path.replace(rs, '')
399426 # END path exception handling
400427 # END for each path
401428
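`_iter_expand_paths` makes each input path absolute, resolves shell globs against the working tree, and yields results relative to it. A simplified sketch of that flow under those assumptions (unlike the real method, it does not recurse into subdirectories, and `expand_paths` is a hypothetical name):

```python
import glob
import os
import tempfile

def expand_paths(paths, working_dir):
    # Absolutize each path, expand glob patterns that do not name an
    # existing file, and strip the working-dir prefix from results.
    prefix = working_dir + os.sep
    for path in paths:
        abs_path = path if os.path.isabs(path) else os.path.join(working_dir, path)
        if not os.path.exists(abs_path) and any(c in abs_path for c in '?*['):
            for match in sorted(glob.glob(abs_path)):
                yield match.replace(prefix, '')
            continue
        yield abs_path.replace(prefix, '')

with tempfile.TemporaryDirectory() as wd:
    for name in ("a.py", "b.py", "c.txt"):
        open(os.path.join(wd, name), "w").close()
    result = list(expand_paths(["*.py"], wd))
```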
402 def _write_path_to_stdin(self, proc, filepath, item, fmakeexc, fprogress,
403 read_from_stdout=True):
429 def _write_path_to_stdin(self, proc: 'Popen', filepath: PathLike, item: PathLike, fmakeexc: Callable[..., GitError],
430 fprogress: Callable[[PathLike, bool, PathLike], None],
431 read_from_stdout: bool = True) -> Union[None, str]:
404432 """Write path to proc.stdin and make sure it processes the item, including progress.
405433
406434 :return: stdout string
416444 we will close stdin to break the pipe."""
417445
418446 fprogress(filepath, False, item)
419 rval = None
420 try:
421 proc.stdin.write(("%s\n" % filepath).encode(defenc))
422 except IOError as e:
423 # pipe broke, usually because some error happened
424 raise fmakeexc() from e
425 # END write exception handling
426 proc.stdin.flush()
427 if read_from_stdout:
447 rval: Union[None, str] = None
448
449 if proc.stdin is not None:
450 try:
451 proc.stdin.write(("%s\n" % filepath).encode(defenc))
452 except IOError as e:
453 # pipe broke, usually because some error happened
454 raise fmakeexc() from e
455 # END write exception handling
456 proc.stdin.flush()
457
458 if read_from_stdout and proc.stdout is not None:
428459 rval = proc.stdout.readline().strip()
429460 fprogress(filepath, True, item)
430461 return rval
431462
432 def iter_blobs(self, predicate=lambda t: True):
463 def iter_blobs(self, predicate: Callable[[Tuple[StageType, Blob]], bool] = lambda t: True
464 ) -> Iterator[Tuple[StageType, Blob]]:
433465 """
434466 :return: Iterator yielding tuples of Blob objects and stages, tuple(stage, Blob)
435467
445477 yield output
446478 # END for each entry
447479
448 def unmerged_blobs(self):
480 def unmerged_blobs(self) -> Dict[PathLike, List[Tuple[StageType, Blob]]]:
449481 """
450482 :return:
451 Iterator yielding dict(path : list( tuple( stage, Blob, ...))), being
483 Dict(path : list( tuple( stage, Blob, ...))), being
452484 a dictionary associating a path in the index with a list containing
453485 sorted stage/blob pairs
486
454487
455488 :note:
456489 Blobs that have been removed in one side simply do not exist in the
458491 are at stage 3 will not have a stage 3 entry.
459492 """
460493 is_unmerged_blob = lambda t: t[0] != 0
461 path_map = {}
494 path_map: Dict[PathLike, List[Tuple[StageType, Blob]]] = {}
462495 for stage, blob in self.iter_blobs(is_unmerged_blob):
463496 path_map.setdefault(blob.path, []).append((stage, blob))
464497 # END for each unmerged blob
465498 for line in path_map.values():
466499 line.sort()
500
467501 return path_map
468502
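`unmerged_blobs` groups `(stage, blob)` pairs per path with `dict.setdefault` and then sorts each group so stages come out ascending. The grouping logic in isolation (plain strings stand in for `Blob` objects, and `group_unmerged` is a hypothetical name):

```python
def group_unmerged(stage_path_blob_triples):
    # Group (stage, blob) tuples by path, then sort each group so the
    # lowest stage comes first - mirroring unmerged_blobs() above.
    path_map = {}
    for stage, path, blob in stage_path_blob_triples:
        path_map.setdefault(path, []).append((stage, blob))
    for pairs in path_map.values():
        pairs.sort()
    return path_map
```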
469 @classmethod
470 def entry_key(cls, *entry):
503 @classmethod
504 def entry_key(cls, *entry: Union[BaseIndexEntry, PathLike, StageType]) -> Tuple[PathLike, StageType]:
471505 return entry_key(*entry)
472506
473 def resolve_blobs(self, iter_blobs):
507 def resolve_blobs(self, iter_blobs: Iterator[Blob]) -> 'IndexFile':
474508 """Resolve the blobs given in blob iterator. This will effectively remove the
475509 index entries of the respective path at all non-null stages and add the given
476510 blob as new stage null blob.
488522 for blob in iter_blobs:
489523 stage_null_key = (blob.path, 0)
490524 if stage_null_key in self.entries:
491 raise ValueError("Path %r already exists at stage 0" % blob.path)
525 raise ValueError("Path %r already exists at stage 0" % str(blob.path))
492526 # END assert blob is not stage 0 already
493527
494528 # delete all possible stages
505539
506540 return self
507541
508 def update(self):
542 def update(self) -> 'IndexFile':
509543 """Reread the contents of our index file, discarding all cached information
510544 we might have.
511545
516550 # allows to lazily reread on demand
517551 return self
518552
519 def write_tree(self):
553 def write_tree(self) -> Tree:
520554 """Writes this index to a corresponding Tree object into the repository's
521555 object database and return it.
522556
541575 root_tree._cache = tree_items
542576 return root_tree
543577
544 def _process_diff_args(self, args):
578 def _process_diff_args(self, # type: ignore[override]
579 args: List[Union[str, 'git_diff.Diffable', Type['git_diff.Diffable.Index']]]
580 ) -> List[Union[str, 'git_diff.Diffable', Type['git_diff.Diffable.Index']]]:
545581 try:
546582 args.pop(args.index(self))
547583 except IndexError:
549585 # END remove self
550586 return args
551587
552 def _to_relative_path(self, path):
553 """:return: Version of path relative to our git directory or raise ValueError
554 if it is not within our git direcotory"""
588 def _to_relative_path(self, path: PathLike) -> PathLike:
589 """
590 :return: Version of path relative to our git directory or raise ValueError
591 if it is not within our git directory"""
555592 if not osp.isabs(path):
556593 return path
557594 if self.repo.bare:
558595 raise InvalidGitRepositoryError("require non-bare repository")
559 if not path.startswith(self.repo.working_tree_dir):
596 if not str(path).startswith(str(self.repo.working_tree_dir)):
560597 raise ValueError("Absolute path %r is not in git repository at %r" % (path, self.repo.working_tree_dir))
561598 return os.path.relpath(path, self.repo.working_tree_dir)
562599
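`_to_relative_path` passes relative paths through unchanged, converts absolute paths that live under the working tree, and rejects anything outside it. A standalone sketch of the same checks (the free function form is an assumption for illustration):

```python
import os
import os.path as osp

def to_relative_path(path, working_tree_dir):
    # Relative paths are already repo-relative; absolute paths must be
    # inside the working tree to be converted.
    if not osp.isabs(path):
        return path
    if not str(path).startswith(str(working_tree_dir)):
        raise ValueError("Absolute path %r is not in git repository at %r"
                         % (path, working_tree_dir))
    return os.path.relpath(path, working_tree_dir)
```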
563 def _preprocess_add_items(self, items):
600 def _preprocess_add_items(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']]
601 ) -> Tuple[List[PathLike], List[BaseIndexEntry]]:
564602 """ Split the items into two lists of path strings and BaseEntries. """
565603 paths = []
566604 entries = []
580618 # END for each item
581619 return paths, entries
582620
583 def _store_path(self, filepath, fprogress):
621 def _store_path(self, filepath: PathLike, fprogress: Callable) -> BaseIndexEntry:
584622 """Store file at filepath in the database and return the base index entry
585623 Needs the git_working_dir decorator active ! This must be assured in the calling code"""
586624 st = os.lstat(filepath) # handles non-symlinks as well
587625 if S_ISLNK(st.st_mode):
588626 # in PY3, readlink is string, but we need bytes. In PY2, it's just OS encoded bytes, we assume UTF-8
589 open_stream = lambda: BytesIO(force_bytes(os.readlink(filepath), encoding=defenc))
627 open_stream: Callable[[], BinaryIO] = lambda: BytesIO(force_bytes(os.readlink(filepath),
628 encoding=defenc))
590629 else:
591630 open_stream = lambda: open(filepath, 'rb')
592631 with open_stream() as stream:
596635 return BaseIndexEntry((stat_mode_to_index_mode(st.st_mode),
597636 istream.binsha, 0, to_native_path_linux(filepath)))
598637
599 @unbare_repo
600 @git_working_dir
601 def _entries_for_paths(self, paths, path_rewriter, fprogress, entries):
602 entries_added = []
638 @unbare_repo
639 @git_working_dir
640 def _entries_for_paths(self, paths: List[str], path_rewriter: Callable, fprogress: Callable,
641 entries: List[BaseIndexEntry]) -> List[BaseIndexEntry]:
642 entries_added: List[BaseIndexEntry] = []
603643 if path_rewriter:
604644 for path in paths:
605645 if osp.isabs(path):
606646 abspath = path
607 gitrelative_path = path[len(self.repo.working_tree_dir) + 1:]
647 gitrelative_path = path[len(str(self.repo.working_tree_dir)) + 1:]
608648 else:
609649 gitrelative_path = path
610 abspath = osp.join(self.repo.working_tree_dir, gitrelative_path)
650 if self.repo.working_tree_dir:
651 abspath = osp.join(self.repo.working_tree_dir, gitrelative_path)
611652 # end obtain relative and absolute paths
612653
613654 blob = Blob(self.repo, Blob.NULL_BIN_SHA,
627668 # END path handling
628669 return entries_added
629670
630 def add(self, items, force=True, fprogress=lambda *args: None, path_rewriter=None,
631 write=True, write_extension_data=False):
671 def add(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']], force: bool = True,
672 fprogress: Callable = lambda *args: None, path_rewriter: Union[Callable[..., PathLike], None] = None,
673 write: bool = True, write_extension_data: bool = False) -> List[BaseIndexEntry]:
632674 """Add files from the working tree, specific blobs or BaseIndexEntries
633675 to the index.
634676
731773 # automatically
732774 # paths can be git-added, for everything else we use git-update-index
733775 paths, entries = self._preprocess_add_items(items)
734 entries_added = []
776 entries_added: List[BaseIndexEntry] = []
735777 # This code needs a working tree, therefore we try not to run it unless required.
736778 # That way, we are OK on a bare repository as well.
737779 # If there are no paths, the rewriter has nothing to do either
750792 # create objects if required, otherwise go with the existing shas
751793 null_entries_indices = [i for i, e in enumerate(entries) if e.binsha == Object.NULL_BIN_SHA]
752794 if null_entries_indices:
753 @git_working_dir
754 def handle_null_entries(self):
795 @git_working_dir
796 def handle_null_entries(self: 'IndexFile') -> None:
755797 for ei in null_entries_indices:
756798 null_entry = entries[ei]
757799 new_entry = self._store_path(null_entry.path, fprogress)
795837
796838 return entries_added
797839
798 def _items_to_rela_paths(self, items):
840 def _items_to_rela_paths(self, items: Union[PathLike, Sequence[Union[PathLike, BaseIndexEntry, Blob, Submodule]]]
841 ) -> List[PathLike]:
799842 """Returns a list of repo-relative paths from the given items which
800843 may be absolute or relative paths, entries or blobs"""
801844 paths = []
802845 # if string put in list
803 if isinstance(items, str):
846 if isinstance(items, (str, os.PathLike)):
804847 items = [items]
805848
806849 for item in items:
813856 # END for each item
814857 return paths
815858
816 @post_clear_cache
817 @default_index
818 def remove(self, items, working_tree=False, **kwargs):
859 @post_clear_cache
860 @default_index
861 def remove(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']], working_tree: bool = False,
862 **kwargs: Any) -> List[str]:
819863 """Remove the given items from the index and optionally from
820864 the working tree as well.
821865
864908 # rm 'path'
865909 return [p[4:-1] for p in removed_paths]
866910
867 @post_clear_cache
868 @default_index
869 def move(self, items, skip_errors=False, **kwargs):
911 @post_clear_cache
912 @default_index
913 def move(self, items: Sequence[Union[PathLike, Blob, BaseIndexEntry, 'Submodule']], skip_errors: bool = False,
914 **kwargs: Any) -> List[Tuple[str, str]]:
870915 """Rename/move the items, whereas the last item is considered the destination of
871916 the move operation. If the destination is a file, the first item ( of two )
872917 must be a file as well. If the destination is a directory, it may be preceded
928973
929974 return out
930975
931 def commit(self, message, parent_commits=None, head=True, author=None,
932 committer=None, author_date=None, commit_date=None,
933 skip_hooks=False):
976 def commit(self,
977 message: str,
978 parent_commits: Union[Commit_ish, None] = None,
979 head: bool = True,
980 author: Union[None, 'Actor'] = None,
981 committer: Union[None, 'Actor'] = None,
982 author_date: Union[str, None] = None,
983 commit_date: Union[str, None] = None,
984 skip_hooks: bool = False) -> Commit:
934985 """Commit the current default index file, creating a commit object.
935986 For more information on the arguments, see tree.commit.
936987
9541005 run_commit_hook('post-commit', self)
9551006 return rval
9561007
957 def _write_commit_editmsg(self, message):
1008 def _write_commit_editmsg(self, message: str) -> None:
9581009 with open(self._commit_editmsg_filepath(), "wb") as commit_editmsg_file:
9591010 commit_editmsg_file.write(message.encode(defenc))
9601011
961 def _remove_commit_editmsg(self):
1012 def _remove_commit_editmsg(self) -> None:
9621013 os.remove(self._commit_editmsg_filepath())
9631014
964 def _read_commit_editmsg(self):
1015 def _read_commit_editmsg(self) -> str:
9651016 with open(self._commit_editmsg_filepath(), "rb") as commit_editmsg_file:
9661017 return commit_editmsg_file.read().decode(defenc)
9671018
968 def _commit_editmsg_filepath(self):
1019 def _commit_editmsg_filepath(self) -> str:
9691020 return osp.join(self.repo.common_dir, "COMMIT_EDITMSG")
9701021
971 @classmethod
972 def _flush_stdin_and_wait(cls, proc, ignore_stdout=False):
973 proc.stdin.flush()
974 proc.stdin.close()
975 stdout = ''
976 if not ignore_stdout:
1022 def _flush_stdin_and_wait(cls, proc: 'Popen[bytes]', ignore_stdout: bool = False) -> bytes:
1023 stdin_IO = proc.stdin
1024 if stdin_IO:
1025 stdin_IO.flush()
1026 stdin_IO.close()
1027
1028 stdout = b''
1029 if not ignore_stdout and proc.stdout:
9771030 stdout = proc.stdout.read()
978 proc.stdout.close()
979 proc.wait()
1031
1032 if proc.stdout:
1033 proc.stdout.close()
1034 proc.wait()
9801035 return stdout
9811036
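The guarded `_flush_stdin_and_wait` above checks that the pipes exist before touching them, since `Popen.stdin`/`Popen.stdout` are `None` unless the process was opened with `PIPE`. A runnable sketch of the same pattern against a trivial echo child:

```python
import subprocess
import sys

def flush_stdin_and_wait(proc, ignore_stdout=False):
    # Close stdin (sending EOF), optionally drain stdout, then reap the child.
    # Pipe attributes may be None, so guard every access.
    if proc.stdin:
        proc.stdin.flush()
        proc.stdin.close()
    stdout = b''
    if not ignore_stdout and proc.stdout:
        stdout = proc.stdout.read()
    if proc.stdout:
        proc.stdout.close()
    proc.wait()
    return stdout

# Child copies stdin to stdout; closing stdin lets its read() finish.
proc = subprocess.Popen(
    [sys.executable, "-c", "import sys; sys.stdout.write(sys.stdin.read())"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE)
proc.stdin.write(b"hello")
result = flush_stdin_and_wait(proc)
```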
982 @default_index
983 def checkout(self, paths=None, force=False, fprogress=lambda *args: None, **kwargs):
1037 @default_index
1038 def checkout(self, paths: Union[None, Iterable[PathLike]] = None, force: bool = False,
1039 fprogress: Callable = lambda *args: None, **kwargs: Any
1040 ) -> Union[None, Iterator[PathLike], Sequence[PathLike]]:
9841041 """Checkout the given paths or all files from the version known to the index into
9851042 the working tree.
9861043
10311088 failed_reasons = []
10321089 unknown_lines = []
10331090
1034 def handle_stderr(proc, iter_checked_out_files):
1035 stderr = proc.stderr.read()
1036 if not stderr:
1037 return
1091 def handle_stderr(proc: 'Popen[bytes]', iter_checked_out_files: Iterable[PathLike]) -> None:
1092
1093 stderr_IO = proc.stderr
1094 if not stderr_IO:
1095 return None # return early if there is no stderr pipe to read
1096 else:
1097 stderr_bytes = stderr_IO.read()
10381098 # line contents:
1039 stderr = stderr.decode(defenc)
1099 stderr = stderr_bytes.decode(defenc)
10401100 # git-checkout-index: this already exists
10411101 endings = (' already exists', ' is not in the cache', ' does not exist at stage', ' is unmerged')
10421102 for line in stderr.splitlines():
11001160 proc = self.repo.git.checkout_index(args, **kwargs)
11011161 # FIXME: Reading from GIL!
11021162 make_exc = lambda: GitCommandError(("git-checkout-index",) + tuple(args), 128, proc.stderr.read())
1103 checked_out_files = []
1163 checked_out_files: List[PathLike] = []
11041164
11051165 for path in paths:
11061166 co_path = to_native_path_linux(self._to_relative_path(path))
11101170 try:
11111171 self.entries[(co_path, 0)]
11121172 except KeyError:
1113 folder = co_path
1173 folder = str(co_path)
11141174 if not folder.endswith('/'):
11151175 folder += '/'
11161176 for entry in self.entries.values():
1117 if entry.path.startswith(folder):
1177 if str(entry.path).startswith(folder):
11181178 p = entry.path
11191179 self._write_path_to_stdin(proc, p, p, make_exc,
11201180 fprogress, read_from_stdout=False)
11411201 handle_stderr(proc, checked_out_files)
11421202 return checked_out_files
11431203 # END paths handling
1144 assert "Should not reach this point"
1145
1146 @default_index
1147 def reset(self, commit='HEAD', working_tree=False, paths=None, head=False, **kwargs):
1204
1205 @default_index
1206 def reset(self, commit: Union[Commit, 'Reference', str] = 'HEAD', working_tree: bool = False,
1207 paths: Union[None, Iterable[PathLike]] = None,
1208 head: bool = False, **kwargs: Any) -> 'IndexFile':
11481209 """Reset the index to reflect the tree at the given commit. This will not
11491210 adjust our HEAD reference as opposed to HEAD.reset by default.
11501211
12111272
12121273 return self
12131274
1214 @default_index
1215 def diff(self, other=diff.Diffable.Index, paths=None, create_patch=False, **kwargs):
1275 # @default_index breaks typing for some reason, so its check is inlined into the function
1276 def diff(self, # type: ignore[override]
1277 other: Union[Type['git_diff.Diffable.Index'], 'Tree', 'Commit', str, None] = git_diff.Diffable.Index,
1278 paths: Union[PathLike, List[PathLike], Tuple[PathLike, ...], None] = None,
1279 create_patch: bool = False, **kwargs: Any
1280 ) -> git_diff.DiffIndex:
12161281 """Diff this index against the working copy or a Tree or Commit object
12171282
1218 For a documentation of the parameters and return values, see
1283 For a documentation of the parameters and return values, see
12191284 Diffable.diff
12201285
12211286 :note:
12221287 Will only work with indices that represent the default git index as
12231288 they have not been initialized with a stream.
12241289 """
1290
1291 # only run if we are the default repository index
1292 if self._file_path != self._index_path():
1293 raise AssertionError(
1294 "Cannot call %r on indices that do not represent the default git index" % self.diff)
12251295 # index against index is always empty
12261296 if other is self.Index:
1227 return diff.DiffIndex()
1297 return git_diff.DiffIndex()
12281298
12291299 # index against anything but None is a reverse diff with the respective
12301300 # item. Handle existing -R flags properly. Transform strings to the object
12331303 other = self.repo.rev_parse(other)
12341304 # END object conversion
12351305
1236 if isinstance(other, Object):
1306 if isinstance(other, Object): # for Tree or Commit
12371307 # invert the existing R flag
12381308 cur_val = kwargs.get('R', False)
12391309 kwargs['R'] = not cur_val
00 # Contains standalone functions to accompany the index implementation and make it
11 # more versatile
22 # NOTE: Autodoc hates it if this is a docstring
3
34 from io import BytesIO
45 import os
56 from stat import (
910 S_ISDIR,
1011 S_IFMT,
1112 S_IFREG,
13 S_IXUSR,
1214 )
1315 import subprocess
1416
4648 unpack
4749 )
4850
51 # typing -----------------------------------------------------------------------------
52
53 from typing import (Dict, IO, List, Sequence, TYPE_CHECKING, Tuple, Type, Union, cast)
54
55 from git.types import PathLike
56
57 if TYPE_CHECKING:
58 from .base import IndexFile
59 from git.db import GitCmdObjectDB
60 from git.objects.tree import TreeCacheTup
61 # from git.objects.fun import EntryTupOrNone
62
63 # ------------------------------------------------------------------------------------
64
4965
5066 S_IFGITLINK = S_IFLNK | S_IFDIR # a submodule
5167 CE_NAMEMASK_INV = ~CE_NAMEMASK
5470 'stat_mode_to_index_mode', 'S_IFGITLINK', 'run_commit_hook', 'hook_path')
5571
5672
57 def hook_path(name, git_dir):
73 def hook_path(name: str, git_dir: PathLike) -> str:
5874 """:return: path to the given named hook in the given git repository directory"""
5975 return osp.join(git_dir, 'hooks', name)
6076
6177
62 def run_commit_hook(name, index, *args):
78 def run_commit_hook(name: str, index: 'IndexFile', *args: str) -> None:
6379 """Run the commit hook of the given name. Silently ignores hooks that do not exist.
6480 :param name: name of hook, like 'pre-commit'
6581 :param index: IndexFile instance
6783 :raises HookExecutionError: """
6884 hp = hook_path(name, index.repo.git_dir)
6985 if not os.access(hp, os.X_OK):
70 return
86 return None
7187
7288 env = os.environ.copy()
73 env['GIT_INDEX_FILE'] = safe_decode(index.path)
89 env['GIT_INDEX_FILE'] = safe_decode(str(index.path))
7490 env['GIT_EDITOR'] = ':'
7591 try:
7692 cmd = subprocess.Popen([hp] + list(args),
8399 except Exception as ex:
84100 raise HookExecutionError(hp, ex) from ex
85101 else:
86 stdout = []
87 stderr = []
88 handle_process_output(cmd, stdout.append, stderr.append, finalize_process)
89 stdout = ''.join(stdout)
90 stderr = ''.join(stderr)
102 stdout_list: List[str] = []
103 stderr_list: List[str] = []
104 handle_process_output(cmd, stdout_list.append, stderr_list.append, finalize_process)
105 stdout = ''.join(stdout_list)
106 stderr = ''.join(stderr_list)
91107 if cmd.returncode != 0:
92108 stdout = force_text(stdout, defenc)
93109 stderr = force_text(stderr, defenc)
95111 # end handle return code
96112
97113
98 def stat_mode_to_index_mode(mode):
114 def stat_mode_to_index_mode(mode: int) -> int:
99115 """Convert the given mode from a stat call to the corresponding index mode
100116 and return it"""
101117 if S_ISLNK(mode): # symlinks
102118 return S_IFLNK
103119 if S_ISDIR(mode) or S_IFMT(mode) == S_IFGITLINK: # submodules
104120 return S_IFGITLINK
105 return S_IFREG | 0o644 | (mode & 0o111) # blobs with or without executable bit
106
107
108 def write_cache(entries, stream, extension_data=None, ShaStreamCls=IndexFileSHA1Writer):
121 return S_IFREG | (mode & S_IXUSR and 0o755 or 0o644) # blobs with or without executable bit
122
123
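The change above replaces `0o644 | (mode & 0o111)` with an owner-exec check, so any blob whose owner-executable bit is set maps to `0o755` and everything else to `0o644`, matching git's mode normalization. A runnable sketch of the new mapping (the submodule/gitlink branch is omitted for brevity):

```python
from stat import S_IFREG, S_IFLNK, S_ISLNK, S_IXUSR

def stat_mode_to_index_mode(mode: int) -> int:
    # symlinks keep only the link file-type bits
    if S_ISLNK(mode):
        return S_IFLNK
    # regular blobs are normalized: owner-exec bit set -> 0o755, else 0o644
    return S_IFREG | (0o755 if mode & S_IXUSR else 0o644)

print(oct(stat_mode_to_index_mode(0o100744)))  # 0o100755
print(oct(stat_mode_to_index_mode(0o100664)))  # 0o100644
```

Note the behavioral difference from the removed line: a mode like `0o744` previously kept only the exec bits it had, while the new form normalizes it to the canonical `0o755`.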
124 def write_cache(entries: Sequence[Union[BaseIndexEntry, 'IndexEntry']], stream: IO[bytes],
125 extension_data: Union[None, bytes] = None,
126 ShaStreamCls: Type[IndexFileSHA1Writer] = IndexFileSHA1Writer) -> None:
109127 """Write the cache represented by entries to a stream
110128
111129 :param entries: **sorted** list of entries
118136 :param extension_data: any kind of data to write as a trailer, it must begin
119137 a 4 byte identifier, followed by its size ( 4 bytes )"""
120138 # wrap the stream into a compatible writer
121 stream = ShaStreamCls(stream)
122
123 tell = stream.tell
124 write = stream.write
139 stream_sha = ShaStreamCls(stream)
140
141 tell = stream_sha.tell
142 write = stream_sha.write
125143
126144 # header
127145 version = 2
131149 # body
132150 for entry in entries:
133151 beginoffset = tell()
134 write(entry[4]) # ctime
135 write(entry[5]) # mtime
136 path = entry[3]
137 path = force_bytes(path, encoding=defenc)
152 write(entry.ctime_bytes) # ctime
153 write(entry.mtime_bytes) # mtime
154 path_str = str(entry.path)
155 path: bytes = force_bytes(path_str, encoding=defenc)
138156 plen = len(path) & CE_NAMEMASK # path length
139 assert plen == len(path), "Path %s too long to fit into index" % entry[3]
140 flags = plen | (entry[2] & CE_NAMEMASK_INV) # clear possible previous values
141 write(pack(">LLLLLL20sH", entry[6], entry[7], entry[0],
142 entry[8], entry[9], entry[10], entry[1], flags))
157 assert plen == len(path), "Path %s too long to fit into index" % entry.path
158 flags = plen | (entry.flags & CE_NAMEMASK_INV) # clear possible previous values
159 write(pack(">LLLLLL20sH", entry.dev, entry.inode, entry.mode,
160 entry.uid, entry.gid, entry.size, entry.binsha, flags))
143161 write(path)
144162 real_size = ((tell() - beginoffset + 8) & ~7)
145163 write(b"\0" * ((beginoffset + real_size) - tell()))
147165
148166 # write previously cached extensions data
149167 if extension_data is not None:
150 stream.write(extension_data)
168 stream_sha.write(extension_data)
151169
152170 # write the sha over the content
153 stream.write_sha()
154
155
156 def read_header(stream):
171 stream_sha.write_sha()
172
173
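For reference, the `pack(">LLLLLL20sH", ...)` call in `write_cache` above writes the fixed-width tail of a version-2 index entry: six big-endian uint32s, the 20-byte binary sha, and a 16-bit flags word (ctime/mtime are written just before it). A sketch with hypothetical values:

```python
from struct import pack, unpack

# hypothetical entry values, for illustration only
dev, inode, mode, uid, gid, size = 0, 0, 0o100644, 1000, 1000, 11
binsha = b"\x00" * 20
flags = 9  # the path length lives in the low bits of flags

record = pack(">LLLLLL20sH", dev, inode, mode, uid, gid, size, binsha, flags)
print(len(record))  # 46 bytes: 6*4 + 20 + 2
```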
174 def read_header(stream: IO[bytes]) -> Tuple[int, int]:
157175 """Return tuple(version_long, num_entries) from the given stream"""
158176 type_id = stream.read(4)
159177 if type_id != b"DIRC":
160178 raise AssertionError("Invalid index file header: %r" % type_id)
161 version, num_entries = unpack(">LL", stream.read(4 * 2))
179 unpacked = cast(Tuple[int, int], unpack(">LL", stream.read(4 * 2)))
180 version, num_entries = unpacked
162181
163182 # TODO: handle version 3: extended data, see read-cache.c
164183 assert version in (1, 2)
165184 return version, num_entries
166185
167186
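`read_header` as typed above only consumes twelve bytes: the `DIRC` magic plus two big-endian uint32s for version and entry count. A self-contained sketch:

```python
from io import BytesIO
from struct import pack, unpack

def read_header(stream):
    # an index file starts with the 4-byte magic "DIRC",
    # then version and entry count as big-endian uint32s
    type_id = stream.read(4)
    if type_id != b"DIRC":
        raise AssertionError("Invalid index file header: %r" % type_id)
    return unpack(">LL", stream.read(8))

header = BytesIO(b"DIRC" + pack(">LL", 2, 3))
print(read_header(header))  # (2, 3)
```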
168 def entry_key(*entry):
187 def entry_key(*entry: Union[BaseIndexEntry, PathLike, int]) -> Tuple[PathLike, int]:
169188 """:return: Key suitable to be used for the index.entries dictionary
170189 :param entry: One instance of type BaseIndexEntry or the path and the stage"""
190
191 # def is_entry_key_tup(entry_key: Tuple) -> TypeGuard[Tuple[PathLike, int]]:
192 # return isinstance(entry_key, tuple) and len(entry_key) == 2
193
171194 if len(entry) == 1:
172 return (entry[0].path, entry[0].stage)
173 return tuple(entry)
195 entry_first = entry[0]
196 assert isinstance(entry_first, BaseIndexEntry)
197 return (entry_first.path, entry_first.stage)
198 else:
199 # assert is_entry_key_tup(entry)
200 entry = cast(Tuple[PathLike, int], entry)
201 return entry
174202 # END handle entry
175203
176204
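`entry_key` keeps two call conventions, which the new assert/cast split makes explicit: a single `BaseIndexEntry`-like object, or a `(path, stage)` pair. A sketch with a stand-in named tuple (`FakeEntry` is illustrative only, not the real class):

```python
from typing import NamedTuple

class FakeEntry(NamedTuple):  # stand-in for BaseIndexEntry
    path: str
    stage: int

def entry_key(*entry):
    # one argument: an entry object; two arguments: (path, stage)
    if len(entry) == 1:
        return (entry[0].path, entry[0].stage)
    return tuple(entry)

print(entry_key(FakeEntry("a.txt", 0)))  # ('a.txt', 0)
print(entry_key("a.txt", 2))             # ('a.txt', 2)
```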
177 def read_cache(stream):
205 def read_cache(stream: IO[bytes]) -> Tuple[int, Dict[Tuple[PathLike, int], 'IndexEntry'], bytes, bytes]:
178206 """Read a cache file from the given stream
179207 :return: tuple(version, entries_dict, extension_data, content_sha)
180208 * version is the integer version number
183211 * content_sha is a 20 byte sha on all cache file contents"""
184212 version, num_entries = read_header(stream)
185213 count = 0
186 entries = {}
214 entries: Dict[Tuple[PathLike, int], 'IndexEntry'] = {}
187215
188216 read = stream.read
189217 tell = stream.tell
217245 content_sha = extension_data[-20:]
218246
219247 # truncate the sha in the end as we will dynamically create it anyway
220 extension_data = extension_data[:-20]
248 extension_data = extension_data[: -20]
221249
222250 return (version, entries, extension_data, content_sha)
223251
224252
225 def write_tree_from_cache(entries, odb, sl, si=0):
253 def write_tree_from_cache(entries: List[IndexEntry], odb: 'GitCmdObjectDB', sl: slice, si: int = 0
254 ) -> Tuple[bytes, List['TreeCacheTup']]:
226255 """Create a tree from the given sorted list of entries and put the respective
227256 trees into the given object database
228257
232261 :param sl: slice indicating the range we should process on the entries list
233262 :return: tuple(binsha, list(tree_entry, ...)) a tuple of a sha and a list of
234263 tree entries being a tuple of hexsha, mode, name"""
235 tree_items = []
236 tree_items_append = tree_items.append
264 tree_items: List['TreeCacheTup'] = []
265
237266 ci = sl.start
238267 end = sl.stop
239268 while ci < end:
245274 rbound = entry.path.find('/', si)
246275 if rbound == -1:
247276 # its not a tree
248 tree_items_append((entry.binsha, entry.mode, entry.path[si:]))
277 tree_items.append((entry.binsha, entry.mode, entry.path[si:]))
249278 else:
250279 # find common base range
251280 base = entry.path[si:rbound]
262291 # enter recursion
263292 # ci - 1 as we want to count our current item as well
264293 sha, _tree_entry_list = write_tree_from_cache(entries, odb, slice(ci - 1, xi), rbound + 1)
265 tree_items_append((sha, S_IFDIR, base))
294 tree_items.append((sha, S_IFDIR, base))
266295
267296 # skip ahead
268297 ci = xi
271300
272301 # finally create the tree
273302 sio = BytesIO()
274 tree_to_stream(tree_items, sio.write)
303 tree_to_stream(tree_items, sio.write) # writes to stream as bytes, but doesn't change tree_items
275304 sio.seek(0)
276305
277306 istream = odb.store(IStream(str_tree_type, len(sio.getvalue()), sio))
278307 return (istream.binsha, tree_items)
279308
280309
281 def _tree_entry_to_baseindexentry(tree_entry, stage):
310 def _tree_entry_to_baseindexentry(tree_entry: 'TreeCacheTup', stage: int) -> BaseIndexEntry:
282311 return BaseIndexEntry((tree_entry[1], tree_entry[0], stage << CE_STAGESHIFT, tree_entry[2]))
283312
284313
285 def aggressive_tree_merge(odb, tree_shas):
314 def aggressive_tree_merge(odb: 'GitCmdObjectDB', tree_shas: Sequence[bytes]) -> List[BaseIndexEntry]:
286315 """
287316 :return: list of BaseIndexEntries representing the aggressive merge of the given
288317 trees. All valid entries are on stage 0, whereas the conflicting ones are left
291320 :param tree_shas: 1, 2 or 3 trees as identified by their binary 20 byte shas
292321 If 1 or two, the entries will effectively correspond to the last given tree
293322 If 3 are given, a 3 way merge is performed"""
294 out = []
295 out_append = out.append
323 out: List[BaseIndexEntry] = []
296324
297325 # one and two way is the same for us, as we don't have to handle an existing
298326 # index, instead
299327 if len(tree_shas) in (1, 2):
300328 for entry in traverse_tree_recursive(odb, tree_shas[-1], ''):
301 out_append(_tree_entry_to_baseindexentry(entry, 0))
329 out.append(_tree_entry_to_baseindexentry(entry, 0))
302330 # END for each entry
303331 return out
304332 # END handle single tree
319347 if(base[0] != ours[0] and base[0] != theirs[0] and ours[0] != theirs[0]) or \
320348 (base[1] != ours[1] and base[1] != theirs[1] and ours[1] != theirs[1]):
321349 # changed by both
322 out_append(_tree_entry_to_baseindexentry(base, 1))
323 out_append(_tree_entry_to_baseindexentry(ours, 2))
324 out_append(_tree_entry_to_baseindexentry(theirs, 3))
350 out.append(_tree_entry_to_baseindexentry(base, 1))
351 out.append(_tree_entry_to_baseindexentry(ours, 2))
352 out.append(_tree_entry_to_baseindexentry(theirs, 3))
325353 elif base[0] != ours[0] or base[1] != ours[1]:
326354 # only we changed it
327 out_append(_tree_entry_to_baseindexentry(ours, 0))
355 out.append(_tree_entry_to_baseindexentry(ours, 0))
328356 else:
329357 # either nobody changed it, or they did. In either
330358 # case, use theirs
331 out_append(_tree_entry_to_baseindexentry(theirs, 0))
359 out.append(_tree_entry_to_baseindexentry(theirs, 0))
332360 # END handle modification
333361 else:
334362
335363 if ours[0] != base[0] or ours[1] != base[1]:
336364 # they deleted it, we changed it, conflict
337 out_append(_tree_entry_to_baseindexentry(base, 1))
338 out_append(_tree_entry_to_baseindexentry(ours, 2))
365 out.append(_tree_entry_to_baseindexentry(base, 1))
366 out.append(_tree_entry_to_baseindexentry(ours, 2))
339367 # else:
340368 # we didn't change it, ignore
341369 # pass
348376 else:
349377 if theirs[0] != base[0] or theirs[1] != base[1]:
350378 # deleted in ours, changed theirs, conflict
351 out_append(_tree_entry_to_baseindexentry(base, 1))
352 out_append(_tree_entry_to_baseindexentry(theirs, 3))
379 out.append(_tree_entry_to_baseindexentry(base, 1))
380 out.append(_tree_entry_to_baseindexentry(theirs, 3))
353381 # END theirs changed
354382 # else:
355383 # theirs didn't change
360388 # all three can't be None
361389 if ours is None:
362390 # added in their branch
363 out_append(_tree_entry_to_baseindexentry(theirs, 0))
391 assert theirs is not None
392 out.append(_tree_entry_to_baseindexentry(theirs, 0))
364393 elif theirs is None:
365394 # added in our branch
366 out_append(_tree_entry_to_baseindexentry(ours, 0))
395 out.append(_tree_entry_to_baseindexentry(ours, 0))
367396 else:
368397 # both have it, except for the base, see whether it changed
369398 if ours[0] != theirs[0] or ours[1] != theirs[1]:
370 out_append(_tree_entry_to_baseindexentry(ours, 2))
371 out_append(_tree_entry_to_baseindexentry(theirs, 3))
399 out.append(_tree_entry_to_baseindexentry(ours, 2))
400 out.append(_tree_entry_to_baseindexentry(theirs, 3))
372401 else:
373402 # it was added the same in both
374 out_append(_tree_entry_to_baseindexentry(ours, 0))
403 out.append(_tree_entry_to_baseindexentry(ours, 0))
375404 # END handle two items
376405 # END handle heads
377406 # END handle base exists
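The stage assignment in `aggressive_tree_merge` above can be summarized for a single path present in all three trees. This sketch reduces entries to plain comparable values (a simplification of the `(binsha, mode)` comparisons in the diff) and mirrors the branch structure: stage 0 is merged, stages 1/2/3 are base/ours/theirs on conflict:

```python
def merge_one(base, ours, theirs):
    # both sides changed it differently -> conflict, keep all three staged
    if base != ours and base != theirs and ours != theirs:
        return [(1, base), (2, ours), (3, theirs)]
    if base != ours and base == theirs:
        return [(0, ours)]    # only we changed it
    # either nobody changed it, they changed it, or both made
    # the same change: take theirs/ours interchangeably
    return [(0, theirs)]

print(merge_one("a", "a", "b"))  # [(0, 'b')]
print(merge_one("a", "b", "c"))  # [(1, 'a'), (2, 'b'), (3, 'c')]
```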
77 )
88 from git.objects import Blob
99
10
11 # typing ----------------------------------------------------------------------
12
13 from typing import (NamedTuple, Sequence, TYPE_CHECKING, Tuple, Union, cast)
14
15 from git.types import PathLike
16
17 if TYPE_CHECKING:
18 from git.repo import Repo
19
20 # ---------------------------------------------------------------------------------
1021
1122 __all__ = ('BlobFilter', 'BaseIndexEntry', 'IndexEntry')
1223
3041 """
3142 __slots__ = 'paths'
3243
33 def __init__(self, paths):
44 def __init__(self, paths: Sequence[PathLike]) -> None:
3445 """
3546 :param paths:
3647 tuple or list of paths which are either pointing to directories or
3849 """
3950 self.paths = paths
4051
41 def __call__(self, stage_blob):
52 def __call__(self, stage_blob: Blob) -> bool:
4253 path = stage_blob[1].path
4354 for p in self.paths:
4455 if path.startswith(p):
4758 return False
4859
4960
50 class BaseIndexEntry(tuple):
61 class BaseIndexEntryHelper(NamedTuple):
62 """Typed namedtuple to provide named attribute access for BaseIndexEntry.
63 Needed to allow overriding __new__ in a child class to preserve backwards compatibility."""
64 mode: int
65 binsha: bytes
66 flags: int
67 path: PathLike
68 ctime_bytes: bytes = pack(">LL", 0, 0)
69 mtime_bytes: bytes = pack(">LL", 0, 0)
70 dev: int = 0
71 inode: int = 0
72 uid: int = 0
73 gid: int = 0
74 size: int = 0
75
76
77 class BaseIndexEntry(BaseIndexEntryHelper):
5178
5279 """Small Brother of an index entry which can be created to describe changes
5380 done to the index in which case plenty of additional information is not required.
5481
5582 As the first 4 data members match exactly to the IndexEntry type, methods
5683 expecting a BaseIndexEntry can also handle full IndexEntries even if they
57 use numeric indices for performance reasons. """
84 use numeric indices for performance reasons.
85 """
5886
59 def __str__(self):
87 def __new__(cls, inp_tuple: Union[Tuple[int, bytes, int, PathLike],
88 Tuple[int, bytes, int, PathLike, bytes, bytes, int, int, int, int, int]]
89 ) -> 'BaseIndexEntry':
90 """Override __new__ to allow construction from a tuple for backwards compatibility """
91 return super().__new__(cls, *inp_tuple)
92
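The `BaseIndexEntryHelper`/`__new__` split above is a general pattern: a `NamedTuple` supplies typed fields with defaults, and a subclass overrides `__new__` so callers can keep passing one plain tuple. A sketch with hypothetical fields:

```python
from typing import NamedTuple

class HelperTup(NamedTuple):  # illustrative stand-in, not the real helper
    mode: int
    path: str
    uid: int = 0  # trailing fields get defaults, like ctime_bytes etc. above

class Entry(HelperTup):
    def __new__(cls, inp_tuple):
        # accept a single tuple for backwards compatibility,
        # unpacking it into the NamedTuple fields
        return super().__new__(cls, *inp_tuple)

e = Entry((0o100644, "a.txt"))
print(e.mode, e.path, e.uid)  # 33188 a.txt 0
```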
93 def __str__(self) -> str:
6094 return "%o %s %i\t%s" % (self.mode, self.hexsha, self.stage, self.path)
6195
62 def __repr__(self):
96 def __repr__(self) -> str:
6397 return "(%o, %s, %i, %s)" % (self.mode, self.hexsha, self.stage, self.path)
6498
6599 @property
66 def mode(self):
67 """ File Mode, compatible to stat module constants """
68 return self[0]
100 def hexsha(self) -> str:
101 """hex version of our sha"""
102 return b2a_hex(self.binsha).decode('ascii')
69103
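The `hexsha` property is a thin wrapper over `binascii.b2a_hex`; for a 20-byte binary sha it always yields a 40-character hex string:

```python
from binascii import b2a_hex

binsha = bytes(range(20))  # a 20-byte binary sha, illustrative value
hexsha = b2a_hex(binsha).decode('ascii')
print(hexsha)  # 000102030405060708090a0b0c0d0e0f10111213
```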
70104 @property
71 def binsha(self):
72 """binary sha of the blob """
73 return self[1]
74
75 @property
76 def hexsha(self):
77 """hex version of our sha"""
78 return b2a_hex(self[1]).decode('ascii')
79
80 @property
81 def stage(self):
105 def stage(self) -> int:
82106 """Stage of the entry, either:
83107
84108 * 0 = default stage
88112
89113 :note: For more information, see http://www.kernel.org/pub/software/scm/git/docs/git-read-tree.html
90114 """
91 return (self[2] & CE_STAGEMASK) >> CE_STAGESHIFT
92
93 @property
94 def path(self):
95 """:return: our path relative to the repository working tree root"""
96 return self[3]
97
98 @property
99 def flags(self):
100 """:return: flags stored with this entry"""
101 return self[2]
115 return (self.flags & CE_STAGEMASK) >> CE_STAGESHIFT
102116
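The `stage` property masks and shifts the flags word. For reference, the bit layout per git's index format (constant values matching the `CE_*` names used above): the low 12 bits hold the path length, bits 12-13 the merge stage.

```python
CE_NAMEMASK = 0x0FFF   # low 12 bits: path length
CE_STAGESHIFT = 12
CE_STAGEMASK = 0x3000  # bits 12-13: merge stage

plen = 11                            # length of the entry's path
flags = plen | (2 << CE_STAGESHIFT)  # stage 2 ("ours" during a merge)
print((flags & CE_STAGEMASK) >> CE_STAGESHIFT)  # 2
print(flags & CE_NAMEMASK)                      # 11
```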
103117 @classmethod
104 def from_blob(cls, blob, stage=0):
118 def from_blob(cls, blob: Blob, stage: int = 0) -> 'BaseIndexEntry':
105119 """:return: Fully equipped BaseIndexEntry at the given stage"""
106120 return cls((blob.mode, blob.binsha, stage << CE_STAGESHIFT, blob.path))
107121
108 def to_blob(self, repo):
122 def to_blob(self, repo: 'Repo') -> Blob:
109123 """:return: Blob using the information of this index entry"""
110124 return Blob(repo, self.binsha, self.mode, self.path)
111125
119133
120134 See the properties for a mapping between names and tuple indices. """
121135 @property
122 def ctime(self):
136 def ctime(self) -> Tuple[int, int]:
123137 """
124138 :return:
125139 Tuple(int_time_seconds_since_epoch, int_nano_seconds) of the
126140 file's creation time"""
127 return unpack(">LL", self[4])
141 return cast(Tuple[int, int], unpack(">LL", self.ctime_bytes))
128142
129143 @property
130 def mtime(self):
144 def mtime(self) -> Tuple[int, int]:
131145 """See ctime property, but returns modification time """
132 return unpack(">LL", self[5])
133
134 @property
135 def dev(self):
136 """ Device ID """
137 return self[6]
138
139 @property
140 def inode(self):
141 """ Inode ID """
142 return self[7]
143
144 @property
145 def uid(self):
146 """ User ID """
147 return self[8]
148
149 @property
150 def gid(self):
151 """ Group ID """
152 return self[9]
153
154 @property
155 def size(self):
156 """:return: Uncompressed size of the blob """
157 return self[10]
146 return cast(Tuple[int, int], unpack(">LL", self.mtime_bytes))
158147
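The reworked `ctime`/`mtime` properties just unpack the stored byte strings: two big-endian uint32s, whole seconds since the epoch followed by nanoseconds. A sketch with an arbitrary timestamp:

```python
from struct import pack, unpack

# ctime_bytes/mtime_bytes hold (seconds, nanoseconds) as two uint32s
ctime_bytes = pack(">LL", 1700000000, 250000000)
print(unpack(">LL", ctime_bytes))  # (1700000000, 250000000)
```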
159148 @classmethod
160 def from_base(cls, base):
149 def from_base(cls, base: 'BaseIndexEntry') -> 'IndexEntry':
161150 """
162151 :return:
163152 Minimal entry as created from the given BaseIndexEntry instance.
168157 return IndexEntry((base.mode, base.binsha, base.flags, base.path, time, time, 0, 0, 0, 0, 0))
169158
170159 @classmethod
171 def from_blob(cls, blob, stage=0):
160 def from_blob(cls, blob: Blob, stage: int = 0) -> 'IndexEntry':
172161 """:return: Minimal entry resembling the given blob object"""
173162 time = pack(">LL", 0, 0)
174163 return IndexEntry((blob.mode, blob.binsha, stage << CE_STAGESHIFT, blob.path,
66 from git.compat import is_win
77
88 import os.path as osp
9
10
11 # typing ----------------------------------------------------------------------
12
13 from typing import (Any, Callable, TYPE_CHECKING)
14
15 from git.types import PathLike, _T
16
17 if TYPE_CHECKING:
18 from git.index import IndexFile
19
20 # ---------------------------------------------------------------------------------
921
1022
1123 __all__ = ('TemporaryFileSwap', 'post_clear_cache', 'default_index', 'git_working_dir')
2335 and moving it back to its original location on object deletion."""
2436 __slots__ = ("file_path", "tmp_file_path")
2537
26 def __init__(self, file_path):
38 def __init__(self, file_path: PathLike) -> None:
2739 self.file_path = file_path
28 self.tmp_file_path = self.file_path + tempfile.mktemp('', '', '')
40 self.tmp_file_path = str(self.file_path) + tempfile.mktemp('', '', '')
2941 # it may be that the source does not exist
3042 try:
3143 os.rename(self.file_path, self.tmp_file_path)
3244 except OSError:
3345 pass
3446
35 def __del__(self):
47 def __del__(self) -> None:
3648 if osp.isfile(self.tmp_file_path):
3749 if is_win and osp.exists(self.file_path):
3850 os.remove(self.file_path)
4254
4355 #{ Decorators
4456
45 def post_clear_cache(func):
57 def post_clear_cache(func: Callable[..., _T]) -> Callable[..., _T]:
4658 """Decorator for functions that alter the index using the git command. This would
4759 invalidate our possibly existing entries dictionary which is why it must be
4860 deleted to allow it to be lazily reread later.
5365 """
5466
5567 @wraps(func)
56 def post_clear_cache_if_not_raised(self, *args, **kwargs):
68 def post_clear_cache_if_not_raised(self: 'IndexFile', *args: Any, **kwargs: Any) -> _T:
5769 rval = func(self, *args, **kwargs)
5870 self._delete_entries_cache()
5971 return rval
6274 return post_clear_cache_if_not_raised
6375
6476
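`post_clear_cache` is a conventional `functools.wraps` decorator: run the wrapped method, then invalidate the cached entries. A runnable sketch with a stand-in class (`FakeIndex` and its attributes are illustrative, not GitPython API):

```python
from functools import wraps

def post_clear_cache(func):
    @wraps(func)
    def wrapper(self, *args, **kwargs):
        rval = func(self, *args, **kwargs)
        self._delete_entries_cache()  # drop the cache after the call succeeds
        return rval
    return wrapper

class FakeIndex:  # illustrative stand-in for IndexFile
    def _delete_entries_cache(self):
        self.cache_cleared = True

    @post_clear_cache
    def update(self):
        return "updated"

idx = FakeIndex()
print(idx.update(), idx.cache_cleared)  # updated True
```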
65 def default_index(func):
77 def default_index(func: Callable[..., _T]) -> Callable[..., _T]:
6678 """Decorator assuring the wrapped method may only run if we are the default
6779 repository index. This is as we rely on git commands that operate
6880 on that index only. """
6981
7082 @wraps(func)
71 def check_default_index(self, *args, **kwargs):
83 def check_default_index(self: 'IndexFile', *args: Any, **kwargs: Any) -> _T:
7284 if self._file_path != self._index_path():
7385 raise AssertionError(
7486 "Cannot call %r on indices that do not represent the default git index" % func.__name__)
7890 return check_default_index
7991
8092
81 def git_working_dir(func):
93 def git_working_dir(func: Callable[..., _T]) -> Callable[..., _T]:
8294 """Decorator which changes the current working dir to the one of the git
8395 repository in order to assure relative paths are handled correctly"""
8496
8597 @wraps(func)
86 def set_git_working_dir(self, *args, **kwargs):
98 def set_git_working_dir(self: 'IndexFile', *args: Any, **kwargs: Any) -> _T:
8799 cur_wd = os.getcwd()
88 os.chdir(self.repo.working_tree_dir)
100 os.chdir(str(self.repo.working_tree_dir))
89101 try:
90102 return func(self, *args, **kwargs)
91103 finally:
11 Import all submodules main classes into the package space
22 """
33 # flake8: noqa
4 from __future__ import absolute_import
5
64 import inspect
75
86 from .base import *
1513 from .tree import *
1614 # Fix import dependency - add IndexObject to the util module, so that it can be
1715 # imported by the submodule.base
18 smutil.IndexObject = IndexObject
19 smutil.Object = Object
16 smutil.IndexObject = IndexObject # type: ignore[attr-defined]
17 smutil.Object = Object # type: ignore[attr-defined]
2018 del(smutil)
2119
2220 # must come after submodule was made available
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5
6 from git.exc import WorkTreeRepositoryUnsupported
57 from git.util import LazyMixin, join_path_native, stream_copy, bin_to_hex
68
79 import gitdb.typ as dbtyp
1012 from .util import get_object_type_by_name
1113
1214
13 _assertion_msg_format = "Created object %r whose python type %r disagrees with the acutal git object type %r"
15 # typing ------------------------------------------------------------------
16
17 from typing import Any, TYPE_CHECKING, Union
18
19 from git.types import PathLike, Commit_ish, Lit_commit_ish
20
21 if TYPE_CHECKING:
22 from git.repo import Repo
23 from gitdb.base import OStream
24 from .tree import Tree
25 from .blob import Blob
26 from .submodule.base import Submodule
27 from git.refs.reference import Reference
28
29 IndexObjUnion = Union['Tree', 'Blob', 'Submodule']
30
31 # --------------------------------------------------------------------------
32
33
34 _assertion_msg_format = "Created object %r whose python type %r disagrees with the actual git object type %r"
1435
1536 __all__ = ("Object", "IndexObject")
1637
2344
2445 TYPES = (dbtyp.str_blob_type, dbtyp.str_tree_type, dbtyp.str_commit_type, dbtyp.str_tag_type)
2546 __slots__ = ("repo", "binsha", "size")
26 type = None # to be set by subclass
27
28 def __init__(self, repo, binsha):
47 type: Union[Lit_commit_ish, None] = None
48
49 def __init__(self, repo: 'Repo', binsha: bytes):
2950 """Initialize an object by identifying it by its binary sha.
3051 All keyword arguments will be set on demand if None.
3152
3859 assert len(binsha) == 20, "Require 20 byte binary sha, got %r, len = %i" % (binsha, len(binsha))
3960
4061 @classmethod
41 def new(cls, repo, id): # @ReservedAssignment
62 def new(cls, repo: 'Repo', id: Union[str, 'Reference']) -> Commit_ish:
4263 """
4364 :return: New Object instance of a type appropriate to the object type behind
4465 id. The id of the newly created object will be a binsha even though
5172 return repo.rev_parse(str(id))
5273
5374 @classmethod
54 def new_from_sha(cls, repo, sha1):
75 def new_from_sha(cls, repo: 'Repo', sha1: bytes) -> Commit_ish:
5576 """
5677 :return: new object instance of a type appropriate to represent the given
5778 binary sha1
6586 inst.size = oinfo.size
6687 return inst
6788
68 def _set_cache_(self, attr):
89 def _set_cache_(self, attr: str) -> None:
6990 """Retrieve object information"""
7091 if attr == "size":
7192 oinfo = self.repo.odb.info(self.binsha)
72 self.size = oinfo.size
93 self.size = oinfo.size # type: int
7394 # assert oinfo.type == self.type, _assertion_msg_format % (self.binsha, oinfo.type, self.type)
7495 else:
7596 super(Object, self)._set_cache_(attr)
7697
77 def __eq__(self, other):
98 def __eq__(self, other: Any) -> bool:
7899 """:return: True if the objects have the same SHA1"""
79100 if not hasattr(other, 'binsha'):
80101 return False
81102 return self.binsha == other.binsha
82103
83 def __ne__(self, other):
104 def __ne__(self, other: Any) -> bool:
84105 """:return: True if the objects do not have the same SHA1 """
85106 if not hasattr(other, 'binsha'):
86107 return True
87108 return self.binsha != other.binsha
88109
89 def __hash__(self):
110 def __hash__(self) -> int:
90111 """:return: Hash of our id allowing objects to be used in dicts and sets"""
91112 return hash(self.binsha)
92113
93 def __str__(self):
114 def __str__(self) -> str:
94115 """:return: string of our SHA1 as understood by all git commands"""
95116 return self.hexsha
96117
97 def __repr__(self):
118 def __repr__(self) -> str:
98119 """:return: string with pythonic representation of our object"""
99120 return '<git.%s "%s">' % (self.__class__.__name__, self.hexsha)
100121
101122 @property
102 def hexsha(self):
123 def hexsha(self) -> str:
103124 """:return: 40 byte hex version of our 20 byte binary sha"""
104125 # b2a_hex produces bytes
105126 return bin_to_hex(self.binsha).decode('ascii')
106127
107128 @property
108 def data_stream(self):
129 def data_stream(self) -> 'OStream':
109130 """ :return: File Object compatible stream to the uncompressed raw data of the object
110131 :note: returned streams must be read in order"""
111132 return self.repo.odb.stream(self.binsha)
112133
113 def stream_data(self, ostream):
134 def stream_data(self, ostream: 'OStream') -> 'Object':
114135 """Writes our data directly to the given output stream
115136 :param ostream: File object compatible stream object.
116137 :return: self"""
128149 # for compatibility with iterable lists
129150 _id_attribute_ = 'path'
130151
131 def __init__(self, repo, binsha, mode=None, path=None):
152 def __init__(self,
153 repo: 'Repo', binsha: bytes, mode: Union[None, int] = None, path: Union[None, PathLike] = None
154 ) -> None:
132155 """Initialize a newly instanced IndexObject
133156
134157 :param repo: is the Repo we are located in
148171 if path is not None:
149172 self.path = path
150173
151 def __hash__(self):
174 def __hash__(self) -> int:
152175 """
153176 :return:
154177 Hash of our path as index items are uniquely identifiable by path, not
155178 by their data !"""
156179 return hash(self.path)
157180
158 def _set_cache_(self, attr):
181 def _set_cache_(self, attr: str) -> None:
159182 if attr in IndexObject.__slots__:
160183 # they cannot be retrieved later on (not without searching for them)
161184 raise AttributeError(
166189 # END handle slot attribute
167190
168191 @property
169 def name(self):
192 def name(self) -> str:
170193 """:return: Name portion of the path, effectively being the basename"""
171194 return osp.basename(self.path)
172195
173196 @property
174 def abspath(self):
197 def abspath(self) -> PathLike:
175198 """
176199 :return:
177200 Absolute path to this index object in the file system ( as opposed to the
178201 .path field which is a path relative to the git repository ).
179202
180203 The returned path will be native to the system and contains '\' on windows. """
181 return join_path_native(self.repo.working_tree_dir, self.path)
204 if self.repo.working_tree_dir is not None:
205 return join_path_native(self.repo.working_tree_dir, self.path)
206 else:
207 raise WorkTreeRepositoryUnsupported("Working_tree_dir was None or empty")
55 from mimetypes import guess_type
66 from . import base
77
8 from git.types import Literal
9
810 __all__ = ('Blob', )
911
1012
1214
1315 """A Blob encapsulates a git blob object"""
1416 DEFAULT_MIME_TYPE = "text/plain"
15 type = "blob"
17 type: Literal['blob'] = "blob"
1618
1719 # valid blob modes
1820 executable_mode = 0o100755
2224 __slots__ = ()
2325
2426 @property
25 def mime_type(self):
27 def mime_type(self) -> str:
2628 """
2729 :return: String describing the mime type of this file (based on the filename)
2830 :note: Defaults to 'text/plain' in case the actual file type is unknown. """
2931 guesses = None
3032 if self.path:
31 guesses = guess_type(self.path)
33 guesses = guess_type(str(self.path))
3234 return guesses and guesses[0] or self.DEFAULT_MIME_TYPE
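The `str(self.path)` conversion added above presumably exists because `guess_type` did not accept path-like objects on all supported Pythons. A sketch of the fallback logic:

```python
from mimetypes import guess_type

DEFAULT_MIME_TYPE = "text/plain"

def mime_type(path):
    # guess from the filename; fall back when the path is empty
    # or the type is unknown (guess_type returns (None, None) then)
    guesses = guess_type(str(path)) if path else None
    return guesses and guesses[0] or DEFAULT_MIME_TYPE

print(mime_type("script.py"))     # text/x-python
print(mime_type("data.unknown"))  # text/plain
```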
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5
5 import datetime
6 from subprocess import Popen
67 from gitdb import IStream
78 from git.util import (
89 hex_to_bin,
910 Actor,
10 Iterable,
1111 Stats,
1212 finalize_process
1313 )
1616 from .tree import Tree
1717 from . import base
1818 from .util import (
19 Traversable,
2019 Serializable,
20 TraversableIterableObj,
2121 parse_date,
2222 altz_to_utctz_str,
2323 parse_actor_and_date,
3535 from io import BytesIO
3636 import logging
3737
38
39 # typing ------------------------------------------------------------------
40
41 from typing import Any, IO, Iterator, List, Sequence, Tuple, Union, TYPE_CHECKING, cast
42
43 from git.types import PathLike, Literal
44
45 if TYPE_CHECKING:
46 from git.repo import Repo
47 from git.refs import SymbolicReference
48
49 # ------------------------------------------------------------------------
50
3851 log = logging.getLogger('git.objects.commit')
3952 log.addHandler(logging.NullHandler())
4053
4154 __all__ = ('Commit', )
4255
4356
44 class Commit(base.Object, Iterable, Diffable, Traversable, Serializable):
57 class Commit(base.Object, TraversableIterableObj, Diffable, Serializable):
4558
4659 """Wraps a git Commit object.
4760
6073 default_encoding = "UTF-8"
6174
6275 # object configuration
63 type = "commit"
76 type: Literal['commit'] = "commit"
6477 __slots__ = ("tree",
6578 "author", "authored_date", "author_tz_offset",
6679 "committer", "committed_date", "committer_tz_offset",
6780 "message", "parents", "encoding", "gpgsig")
6881 _id_attribute_ = "hexsha"
6982
70 def __init__(self, repo, binsha, tree=None, author=None, authored_date=None, author_tz_offset=None,
71 committer=None, committed_date=None, committer_tz_offset=None,
72 message=None, parents=None, encoding=None, gpgsig=None):
83 def __init__(self, repo: 'Repo', binsha: bytes, tree: Union[Tree, None] = None,
84 author: Union[Actor, None] = None,
85 authored_date: Union[int, None] = None,
86 author_tz_offset: Union[None, float] = None,
87 committer: Union[Actor, None] = None,
88 committed_date: Union[int, None] = None,
89 committer_tz_offset: Union[None, float] = None,
90 message: Union[str, bytes, None] = None,
91 parents: Union[Sequence['Commit'], None] = None,
92 encoding: Union[str, None] = None,
93 gpgsig: Union[str, None] = None) -> None:
7394 """Instantiate a new Commit. All keyword arguments taking None as default will
7495 be implicitly set on first query.
7596
106127 as what time.altzone returns. The sign is inverted compared to git's
107128 UTC timezone."""
108129 super(Commit, self).__init__(repo, binsha)
130 self.binsha = binsha
109131 if tree is not None:
110132 assert isinstance(tree, Tree), "Tree needs to be a Tree instance, was %s" % type(tree)
111133 if tree is not None:
132154 self.gpgsig = gpgsig
133155
134156 @classmethod
135 def _get_intermediate_items(cls, commit):
136 return commit.parents
157 def _get_intermediate_items(cls, commit: 'Commit') -> Tuple['Commit', ...]:
158 return tuple(commit.parents)
137159
138160 @classmethod
139 def _calculate_sha_(cls, repo, commit):
161 def _calculate_sha_(cls, repo: 'Repo', commit: 'Commit') -> bytes:
140162 '''Calculate the sha of a commit.
141163
142164 :param repo: Repo object the commit should be part of
151173 istream = repo.odb.store(IStream(cls.type, streamlen, stream))
152174 return istream.binsha
153175
154 def replace(self, **kwargs):
176 def replace(self, **kwargs: Any) -> 'Commit':
155177 '''Create new commit object from existing commit object.
156178
157179 Any values provided as keyword arguments will replace the
170192
171193 return new_commit
172194
173 def _set_cache_(self, attr):
195 def _set_cache_(self, attr: str) -> None:
174196 if attr in Commit.__slots__:
175197 # read the data in a chunk, its faster - then provide a file wrapper
176198 _binsha, _typename, self.size, stream = self.repo.odb.stream(self.binsha)
180202 # END handle attrs
181203
182204 @property
183 def authored_datetime(self):
205 def authored_datetime(self) -> datetime.datetime:
184206 return from_timestamp(self.authored_date, self.author_tz_offset)
185207
186208 @property
187 def committed_datetime(self):
209 def committed_datetime(self) -> datetime.datetime:
188210 return from_timestamp(self.committed_date, self.committer_tz_offset)
189211
190212 @property
191 def summary(self):
213 def summary(self) -> Union[str, bytes]:
192214 """:return: First line of the commit message"""
193 return self.message.split('\n', 1)[0]
194
195 def count(self, paths='', **kwargs):
215 if isinstance(self.message, str):
216 return self.message.split('\n', 1)[0]
217 else:
218 return self.message.split(b'\n', 1)[0]
219
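`summary` now branches on the message type so the split delimiter matches (`str` vs `bytes`). Extracted as a small function for illustration:

```python
def summary(message):
    # first line of the commit message, preserving the message's type
    if isinstance(message, str):
        return message.split('\n', 1)[0]
    return message.split(b'\n', 1)[0]

print(summary("Fix bug\n\nLong description"))  # Fix bug
print(summary(b"Fix bug\nmore"))               # b'Fix bug'
```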
220 def count(self, paths: Union[PathLike, Sequence[PathLike]] = '', **kwargs: Any) -> int:
196221 """Count the number of commits reachable from this commit
197222
198223 :param paths:
210235 return len(self.repo.git.rev_list(self.hexsha, **kwargs).splitlines())
211236
212237 @property
213 def name_rev(self):
238 def name_rev(self) -> str:
214239 """
215240 :return:
216241 String describing the commits hex sha based on the closest Reference.
218243 return self.repo.git.name_rev(self)
219244
220245 @classmethod
221 def iter_items(cls, repo, rev, paths='', **kwargs):
246 def iter_items(cls, repo: 'Repo', rev: Union[str, 'Commit', 'SymbolicReference'], # type: ignore
247 paths: Union[PathLike, Sequence[PathLike]] = '', **kwargs: Any
248 ) -> Iterator['Commit']:
222249 """Find all commits matching the given criteria.
223250
224251 :param repo: is the Repo
238265
239266 # use -- in any case, to prevent possibility of ambiguous arguments
240267 # see https://github.com/gitpython-developers/GitPython/issues/264
241 args = ['--']
268
269 args_list: List[PathLike] = ['--']
270
242271 if paths:
243 args.extend((paths, ))
272 paths_tup: Tuple[PathLike, ...]
273 if isinstance(paths, (str, os.PathLike)):
274 paths_tup = (paths, )
275 else:
276 paths_tup = tuple(paths)
277
278 args_list.extend(paths_tup)
244279 # END if paths
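The normalization above accepts either a single path-like value or a sequence of them. A minimal sketch of the same pattern (helper name hypothetical):

```python
import os
from typing import Sequence, Tuple, Union

PathLike = Union[str, os.PathLike]

def to_path_tuple(paths: Union[PathLike, Sequence[PathLike]]) -> Tuple[PathLike, ...]:
    # A lone str is itself a Sequence (of characters), so test for the
    # scalar path types first before falling back to tuple().
    if isinstance(paths, (str, os.PathLike)):
        return (paths,)
    return tuple(paths)
```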
245280
246 proc = repo.git.rev_list(rev, args, as_process=True, **kwargs)
281 proc = repo.git.rev_list(rev, args_list, as_process=True, **kwargs)
247282 return cls._iter_from_process_or_stream(repo, proc)
248283
249 def iter_parents(self, paths='', **kwargs):
284 def iter_parents(self, paths: Union[PathLike, Sequence[PathLike]] = '', **kwargs: Any) -> Iterator['Commit']:
250285 """Iterate _all_ parents of this commit.
251286
252287 :param paths:
262297
263298 return self.iter_items(self.repo, self, paths, **kwargs)
264299
265 @property
266 def stats(self):
300 @property
301 def stats(self) -> Stats:
267302 """Create a git stat from changes between this commit and its first parent
268303 or from all changes done if this is the very first commit.
269304
279314 text = self.repo.git.diff(self.parents[0].hexsha, self.hexsha, '--', numstat=True)
280315 return Stats._list_from_string(self.repo, text)
281316
282 @classmethod
283 def _iter_from_process_or_stream(cls, repo, proc_or_stream):
317 @classmethod
318 def _iter_from_process_or_stream(cls, repo: 'Repo', proc_or_stream: Union[Popen, IO]) -> Iterator['Commit']:
284319 """Parse out commit information into a list of Commit objects
285320 We expect one-line per commit, and parse the actual commit information directly
286321 from our lightning-fast object database
287322
288323 :param proc: git-rev-list process instance - one sha per line
289324 :return: iterator returning Commit objects"""
290 stream = proc_or_stream
291 if not hasattr(stream, 'readline'):
292 stream = proc_or_stream.stdout
325
326 # def is_proc(inp) -> TypeGuard[Popen]:
327 # return hasattr(proc_or_stream, 'wait') and not hasattr(proc_or_stream, 'readline')
328
329 # def is_stream(inp) -> TypeGuard[IO]:
330 # return hasattr(proc_or_stream, 'readline')
331
332 if hasattr(proc_or_stream, 'wait'):
333 proc_or_stream = cast(Popen, proc_or_stream)
334 if proc_or_stream.stdout is not None:
335 stream = proc_or_stream.stdout
336 elif hasattr(proc_or_stream, 'readline'):
337 proc_or_stream = cast(IO, proc_or_stream)
338 stream = proc_or_stream
293339
294340 readline = stream.readline
295341 while True:
308354 # TODO: Review this - it seems process handling got a bit out of control
309355 # due to many developers trying to fix the open file handles issue
310356 if hasattr(proc_or_stream, 'wait'):
357 proc_or_stream = cast(Popen, proc_or_stream)
311358 finalize_process(proc_or_stream)
312359
313 @classmethod
314 def create_from_tree(cls, repo, tree, message, parent_commits=None, head=False, author=None, committer=None,
315 author_date=None, commit_date=None):
360 @classmethod
361 def create_from_tree(cls, repo: 'Repo', tree: Union[Tree, str], message: str,
362 parent_commits: Union[None, List['Commit']] = None, head: bool = False,
363 author: Union[None, Actor] = None, committer: Union[None, Actor] = None,
364 author_date: Union[None, str] = None, commit_date: Union[None, str] = None) -> 'Commit':
316365 """Commit the given tree, creating a commit object.
317366
318367 :param repo: Repo object the commit should be part of
319368 :param tree: Tree object or hex or bin sha
320369 the tree of the new commit
321370 :param message: Commit message. It may be an empty string if no message is provided.
322 It will be converted to a string in any case.
371 It will be converted to a string, in any case.
323372 :param parent_commits:
324373 Optional Commit objects to use as parents for the new commit.
325374 If empty list, the commit will have no parents at all and become
353402 else:
354403 for p in parent_commits:
355404 if not isinstance(p, cls):
356 raise ValueError("Parent commit '%r' must be of type %s" % (p, cls))
405 raise ValueError(f"Parent commit '{p!r}' must be of type {cls}")
357406 # end check parent commit types
358407 # END if parent commits are unset
359408
429478
430479 #{ Serializable Implementation
431480
432 def _serialize(self, stream):
481 def _serialize(self, stream: BytesIO) -> 'Commit':
433482 write = stream.write
434483 write(("tree %s\n" % self.tree).encode('ascii'))
435484 for p in self.parents:
453502 write(("encoding %s\n" % self.encoding).encode('ascii'))
454503
455504 try:
456 if self.__getattribute__('gpgsig') is not None:
505 if self.__getattribute__('gpgsig'):
457506 write(b"gpgsig")
458507 for sigline in self.gpgsig.rstrip("\n").split("\n"):
459508 write((" " + sigline + "\n").encode('ascii'))
470519 # END handle encoding
471520 return self
472521
473 def _deserialize(self, stream):
474 """:param from_rev_list: if true, the stream format is coming from the rev-list command
475 Otherwise it is assumed to be a plain data stream from our object"""
522 def _deserialize(self, stream: BytesIO) -> 'Commit':
523 """
524 :param from_rev_list: if true, the stream format is coming from the rev-list command
525 Otherwise it is assumed to be a plain data stream from our object
526 """
476527 readline = stream.readline
477528 self.tree = Tree(self.repo, hex_to_bin(readline().split()[1]), Tree.tree_id << 12, '')
478529
503554 # now we can have the encoding line, or an empty line followed by the optional
504555 # message.
505556 self.encoding = self.default_encoding
506 self.gpgsig = None
557 self.gpgsig = ""
507558
508559 # read headers
509560 enc = next_line
510561 buf = enc.strip()
511562 while buf:
512563 if buf[0:10] == b"encoding ":
513 self.encoding = buf[buf.find(' ') + 1:].decode(
564 self.encoding = buf[buf.find(b' ') + 1:].decode(
514565 self.encoding, 'ignore')
515566 elif buf[0:7] == b"gpgsig ":
516567 sig = buf[buf.find(b' ') + 1:] + b"\n"
532583 # decode the authors name
533584
534585 try:
535 self.author, self.authored_date, self.author_tz_offset = \
586 (self.author, self.authored_date, self.author_tz_offset) = \
536587 parse_actor_and_date(author_line.decode(self.encoding, 'replace'))
537588 except UnicodeDecodeError:
538589 log.error("Failed to decode author line '%s' using encoding %s", author_line, self.encoding,
552603 try:
553604 self.message = self.message.decode(self.encoding, 'replace')
554605 except UnicodeDecodeError:
555 log.error("Failed to decode message '%s' using encoding %s", self.message, self.encoding, exc_info=True)
606 log.error("Failed to decode message '%s' using encoding %s",
607 self.message, self.encoding, exc_info=True)
556608 # END exception handling
557609
558610 return self
00 """Module with functions which are supposed to be as fast as possible"""
11 from stat import S_ISDIR
2
3
24 from git.compat import (
35 safe_decode,
46 defenc
57 )
68
9 # typing ----------------------------------------------
10
11 from typing import Callable, List, MutableSequence, Sequence, Tuple, TYPE_CHECKING, Union, overload
12
13 if TYPE_CHECKING:
14 from _typeshed import ReadableBuffer
15 from git import GitCmdObjectDB
16
17 EntryTup = Tuple[bytes, int, str] # same as TreeCacheTup in tree.py
18 EntryTupOrNone = Union[EntryTup, None]
19
20 # ---------------------------------------------------
21
22
723 __all__ = ('tree_to_stream', 'tree_entries_from_data', 'traverse_trees_recursive',
824 'traverse_tree_recursive')
925
1026
11 def tree_to_stream(entries, write):
27 def tree_to_stream(entries: Sequence[EntryTup], write: Callable[['ReadableBuffer'], Union[int, None]]) -> None:
1228 """Write the give list of entries into a stream using its write method
1329 :param entries: **sorted** list of tuples with (binsha, mode, name)
1430 :param write: write method which takes a data string"""
3248 # According to my tests, this is exactly what git does, that is it just
3349 # takes the input literally, which appears to be utf8 on linux.
3450 if isinstance(name, str):
35 name = name.encode(defenc)
36 write(b''.join((mode_str, b' ', name, b'\0', binsha)))
51 name_bytes = name.encode(defenc)
52 else:
53 name_bytes = name # type: ignore[unreachable] # check runtime types - is always str?
54 write(b''.join((mode_str, b' ', name_bytes, b'\0', binsha)))
3755 # END for each item
3856
3957
40 def tree_entries_from_data(data):
58 def tree_entries_from_data(data: bytes) -> List[EntryTup]:
4159 """Reads the binary representation of a tree and returns tuples of Tree items
4260 :param data: data block with tree data (as bytes)
4361 :return: list(tuple(binsha, mode, tree_relative_path), ...)"""
7189
7290 # default encoding for strings in git is utf8
7391 # Only use the respective unicode object if the byte stream was encoded
74 name = data[ns:i]
75 name = safe_decode(name)
92 name_bytes = data[ns:i]
93 name = safe_decode(name_bytes)
7694
7795 # byte is NULL, get next 20
7896 i += 1
83101 return out
84102
85103
86 def _find_by_name(tree_data, name, is_dir, start_at):
104 def _find_by_name(tree_data: MutableSequence[EntryTupOrNone], name: str, is_dir: bool, start_at: int
105 ) -> EntryTupOrNone:
87106 """return data entry matching the given name and tree mode
88107 or None.
89108 Before the item is returned, the respective data item is set
90109 None in the tree_data list to mark it done"""
110
91111 try:
92112 item = tree_data[start_at]
93113 if item and item[2] == name and S_ISDIR(item[1]) == is_dir:
105125 return None
106126
107127
108 def _to_full_path(item, path_prefix):
128 @overload
129 def _to_full_path(item: None, path_prefix: str) -> None:
130 ...
131
132
133 @overload
134 def _to_full_path(item: EntryTup, path_prefix: str) -> EntryTup:
135 ...
136
137
138 def _to_full_path(item: EntryTupOrNone, path_prefix: str) -> EntryTupOrNone:
109139 """Rebuild entry with given path prefix"""
110140 if not item:
111141 return item
112142 return (item[0], item[1], path_prefix + item[2])
113143
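The paired `@overload` stubs above let a type checker see that passing `None` yields `None` while a real entry tuple yields a tuple; at runtime the stubs are erased and only the final, undecorated definition exists. A compact illustration of the same pattern (names hypothetical):

```python
from typing import Optional, Tuple, overload

Entry = Tuple[bytes, int, str]  # (binsha, mode, tree-relative path)

@overload
def with_prefix(item: None, prefix: str) -> None: ...
@overload
def with_prefix(item: Entry, prefix: str) -> Entry: ...

def with_prefix(item: Optional[Entry], prefix: str) -> Optional[Entry]:
    # The overloads above never run; this body handles both cases.
    if item is None:
        return None
    return (item[0], item[1], prefix + item[2])
```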
114144
115 def traverse_trees_recursive(odb, tree_shas, path_prefix):
145 def traverse_trees_recursive(odb: 'GitCmdObjectDB', tree_shas: Sequence[Union[bytes, None]],
146 path_prefix: str) -> List[Tuple[EntryTupOrNone, ...]]:
116147 """
117 :return: list with entries according to the given binary tree-shas.
148 :return: list of tuples with entries according to the given binary tree-shas.
118149 The result is encoded in a list
119150 of n tuple|None per blob/commit, (n == len(tree_shas)), where
120151 * [0] == 20 byte sha
127158 :param path_prefix: a prefix to be added to the returned paths on this level,
128159 set it '' for the first iteration
129160 :note: The ordering of the returned items will be partially lost"""
130 trees_data = []
161 trees_data: List[List[EntryTupOrNone]] = []
162
131163 nt = len(tree_shas)
132164 for tree_sha in tree_shas:
133165 if tree_sha is None:
134 data = []
166 data: List[EntryTupOrNone] = []
135167 else:
136 data = tree_entries_from_data(odb.stream(tree_sha).read())
168 # make a new list for typing, as List is invariant
169 data = list(tree_entries_from_data(odb.stream(tree_sha).read()))
137170 # END handle muted trees
138171 trees_data.append(data)
139172 # END for each sha to get data for
140173
141 out = []
142 out_append = out.append
174 out: List[Tuple[EntryTupOrNone, ...]] = []
143175
144176 # find all matching entries and recursively process them together if the match
145177 # is a tree. If the match is a non-tree item, put it into the result.
146178 # Processed items will be set None
147179 for ti, tree_data in enumerate(trees_data):
180
148181 for ii, item in enumerate(tree_data):
149182 if not item:
150183 continue
151184 # END skip already done items
185 entries: List[EntryTupOrNone]
152186 entries = [None for _ in range(nt)]
153187 entries[ti] = item
154188 _sha, mode, name = item
160194 for tio in range(ti + 1, ti + nt):
161195 tio = tio % nt
162196 entries[tio] = _find_by_name(trees_data[tio], name, is_dir, ii)
197
163198 # END for each other item data
164
165199 # if we are a directory, enter recursion
166200 if is_dir:
167201 out.extend(traverse_trees_recursive(
168202 odb, [((ei and ei[0]) or None) for ei in entries], path_prefix + name + '/'))
169203 else:
170 out_append(tuple(_to_full_path(e, path_prefix) for e in entries))
204 out.append(tuple(_to_full_path(e, path_prefix) for e in entries))
205
171206 # END handle recursion
172
173207 # finally mark it done
174208 tree_data[ii] = None
175209 # END for each item
180214 return out
181215
182216
183 def traverse_tree_recursive(odb, tree_sha, path_prefix):
217 def traverse_tree_recursive(odb: 'GitCmdObjectDB', tree_sha: bytes, path_prefix: str) -> List[EntryTup]:
184218 """
185219 :return: list of entries of the tree pointed to by the binary tree_sha. An entry
186220 has the following format:
22 import logging
33 import os
44 import stat
5
56 from unittest import SkipTest
67 import uuid
78
2324 BadName
2425 )
2526 from git.objects.base import IndexObject, Object
26 from git.objects.util import Traversable
27 from git.objects.util import TraversableIterableObj
28
2729 from git.util import (
28 Iterable,
2930 join_path_native,
3031 to_native_path_linux,
3132 RemoteProgress,
3233 rmtree,
33 unbare_repo
34 unbare_repo,
35 IterableList
3436 )
3537 from git.util import HIDE_WINDOWS_KNOWN_ERRORS
3638
4547 )
4648
4749
50 # typing ----------------------------------------------------------------------
51 from typing import Callable, Dict, Mapping, Sequence, TYPE_CHECKING, cast
52 from typing import Any, Iterator, Union
53
54 from git.types import Commit_ish, Literal, PathLike, TBD
55
56 if TYPE_CHECKING:
57 from git.index import IndexFile
58 from git.repo import Repo
59 from git.refs import Head
60
61
62 # -----------------------------------------------------------------------------
63
4864 __all__ = ["Submodule", "UpdateProgress"]
4965
5066
5773 """Class providing detailed progress information to the caller who should
5874 derive from it and implement the ``update(...)`` message"""
5975 CLONE, FETCH, UPDWKTREE = [1 << x for x in range(RemoteProgress._num_op_codes, RemoteProgress._num_op_codes + 3)]
60 _num_op_codes = RemoteProgress._num_op_codes + 3
76 _num_op_codes: int = RemoteProgress._num_op_codes + 3
6177
6278 __slots__ = ()
6379
7288 # IndexObject comes via the util module; it's a 'hacky' fix thanks to Python's
7389 # import mechanism, which causes plenty of trouble; the only reason for packages
7490 # and modules is refactoring - subpackages shouldn't depend on parent packages
75 class Submodule(IndexObject, Iterable, Traversable):
91 class Submodule(IndexObject, TraversableIterableObj):
7692
7793 """Implements access to a git submodule. They are special in that their sha
7894 represents a commit in the submodule's repository which is to be checked out
89105 k_default_mode = stat.S_IFDIR | stat.S_IFLNK # submodules are directories with link-status
90106
91107 # this is a bogus type for base class compatibility
92 type = 'submodule'
108 type: Literal['submodule'] = 'submodule' # type: ignore
93109
94110 __slots__ = ('_parent_commit', '_url', '_branch_path', '_name', '__weakref__')
95111 _cache_attrs = ('path', '_url', '_branch_path')
96112
97 def __init__(self, repo, binsha, mode=None, path=None, name=None, parent_commit=None, url=None, branch_path=None):
113 def __init__(self, repo: 'Repo', binsha: bytes,
114 mode: Union[int, None] = None,
115 path: Union[PathLike, None] = None,
116 name: Union[str, None] = None,
117 parent_commit: Union[Commit_ish, None] = None,
118 url: Union[str, None] = None,
119 branch_path: Union[PathLike, None] = None
120 ) -> None:
98121 """Initialize this instance with its attributes. We only document the ones
99122 that differ from ``IndexObject``
100123
109132 if url is not None:
110133 self._url = url
111134 if branch_path is not None:
112 assert isinstance(branch_path, str)
135 # assert isinstance(branch_path, str)
113136 self._branch_path = branch_path
114137 if name is not None:
115138 self._name = name
116139
117 def _set_cache_(self, attr):
140 def _set_cache_(self, attr: str) -> None:
118141 if attr in ('path', '_url', '_branch_path'):
119 reader = self.config_reader()
142 reader: SectionConstraint = self.config_reader()
120143 # default submodule values
121144 try:
122145 self.path = reader.get('path')
123146 except cp.NoSectionError as e:
124 raise ValueError("This submodule instance does not exist anymore in '%s' file"
125 % osp.join(self.repo.working_tree_dir, '.gitmodules')) from e
147 if self.repo.working_tree_dir is not None:
148 raise ValueError("This submodule instance does not exist anymore in '%s' file"
149 % osp.join(self.repo.working_tree_dir, '.gitmodules')) from e
126150 # end
127151 self._url = reader.get('url')
128152 # git-python extension values - optional
133157 super(Submodule, self)._set_cache_(attr)
134158 # END handle attribute name
135159
136 def _get_intermediate_items(self, item):
160 @classmethod
161 def _get_intermediate_items(cls, item: 'Submodule') -> IterableList['Submodule']:
137162 """:return: all the submodules of our module repository"""
138163 try:
139 return type(self).list_items(item.module())
164 return cls.list_items(item.module())
140165 except InvalidGitRepositoryError:
141 return []
166 return IterableList('')
142167 # END handle intermediate items
143168
144169 @classmethod
145 def _need_gitfile_submodules(cls, git):
170 def _need_gitfile_submodules(cls, git: Git) -> bool:
146171 return git.version_info[:3] >= (1, 7, 5)
147172
148 def __eq__(self, other):
173 def __eq__(self, other: Any) -> bool:
149174 """Compare with another submodule"""
150175 # we may only compare by name as this should be the ID they are hashed with
151176 # Otherwise this type wouldn't be hashable
152177 # return self.path == other.path and self.url == other.url and super(Submodule, self).__eq__(other)
153178 return self._name == other._name
154179
155 def __ne__(self, other):
180 def __ne__(self, other: object) -> bool:
156181 """Compare with another submodule for inequality"""
157182 return not (self == other)
158183
159 def __hash__(self):
184 def __hash__(self) -> int:
160185 """Hash this instance using its logical id, not the sha"""
161186 return hash(self._name)
162187
163 def __str__(self):
188 def __str__(self) -> str:
164189 return self._name
165190
166 def __repr__(self):
191 def __repr__(self) -> str:
167192 return "git.%s(name=%s, path=%s, url=%s, branch_path=%s)"\
168193 % (type(self).__name__, self._name, self.path, self.url, self.branch_path)
169194
170195 @classmethod
171 def _config_parser(cls, repo, parent_commit, read_only):
196 def _config_parser(cls, repo: 'Repo',
197 parent_commit: Union[Commit_ish, None],
198 read_only: bool) -> SubmoduleConfigParser:
172199 """:return: Config Parser constrained to our submodule in read or write mode
173200 :raise IOError: If the .gitmodules file cannot be found, either locally or in the repository
174201 at the given parent commit. Otherwise the exception would be delayed until the first
181208 # We are most likely in an empty repository, so the HEAD doesn't point to a valid ref
182209 pass
183210 # end handle parent_commit
184
185 if not repo.bare and parent_matches_head:
211 fp_module: Union[str, BytesIO]
212 if not repo.bare and parent_matches_head and repo.working_tree_dir:
186213 fp_module = osp.join(repo.working_tree_dir, cls.k_modules_file)
187214 else:
188215 assert parent_commit is not None, "need valid parent_commit in bare repositories"
200227
201228 return SubmoduleConfigParser(fp_module, read_only=read_only)
202229
203 def _clear_cache(self):
230 def _clear_cache(self) -> None:
204231 # clear the possibly changed values
205232 for name in self._cache_attrs:
206233 try:
211238 # END for each name to delete
212239
213240 @classmethod
214 def _sio_modules(cls, parent_commit):
241 def _sio_modules(cls, parent_commit: Commit_ish) -> BytesIO:
215242 """:return: Configuration file as BytesIO - we only access it through the respective blob's data"""
216243 sio = BytesIO(parent_commit.tree[cls.k_modules_file].data_stream.read())
217244 sio.name = cls.k_modules_file
218245 return sio
219246
220 def _config_parser_constrained(self, read_only):
247 def _config_parser_constrained(self, read_only: bool) -> SectionConstraint:
221248 """:return: Config Parser constrained to our submodule in read or write mode"""
222249 try:
223 pc = self.parent_commit
250 pc: Union['Commit_ish', None] = self.parent_commit
224251 except ValueError:
225252 pc = None
226253 # end handle empty parent repository
229256 return SectionConstraint(parser, sm_section(self.name))
230257
231258 @classmethod
232 def _module_abspath(cls, parent_repo, path, name):
259 def _module_abspath(cls, parent_repo: 'Repo', path: PathLike, name: str) -> PathLike:
233260 if cls._need_gitfile_submodules(parent_repo.git):
234261 return osp.join(parent_repo.git_dir, 'modules', name)
235 return osp.join(parent_repo.working_tree_dir, path)
262 if parent_repo.working_tree_dir:
263 return osp.join(parent_repo.working_tree_dir, path)
264 raise NotADirectoryError()
236265 # end
237266
238267 @classmethod
239 def _clone_repo(cls, repo, url, path, name, **kwargs):
268 def _clone_repo(cls, repo: 'Repo', url: str, path: PathLike, name: str, **kwargs: Any) -> 'Repo':
240269 """:return: Repo instance of newly cloned repository
241270 :param repo: our parent repository
242271 :param url: url to clone from
243 :param path: repository-relative path to the submodule checkout location
272 :param path: repository-relative path to the submodule checkout location
244273 :param name: canonical name of the submodule
245274 :param kwargs: additional arguments given to git.clone"""
246275 module_abspath = cls._module_abspath(repo, path, name)
250279 module_abspath_dir = osp.dirname(module_abspath)
251280 if not osp.isdir(module_abspath_dir):
252281 os.makedirs(module_abspath_dir)
253 module_checkout_path = osp.join(repo.working_tree_dir, path)
282 module_checkout_path = osp.join(str(repo.working_tree_dir), path)
254283 # end
255284
256285 clone = git.Repo.clone_from(url, module_checkout_path, **kwargs)
260289 return clone
261290
262291 @classmethod
263 def _to_relative_path(cls, parent_repo, path):
264 """:return: a path guaranteed to be relative to the given parent-repository
292 def _to_relative_path(cls, parent_repo: 'Repo', path: PathLike) -> PathLike:
293 """:return: a path guaranteed to be relative to the given parent - repository
265294 :raise ValueError: if path is not contained in the parent repository's working tree"""
266295 path = to_native_path_linux(path)
267296 if path.endswith('/'):
268297 path = path[:-1]
269298 # END handle trailing slash
270299
271 if osp.isabs(path):
300 if osp.isabs(path) and parent_repo.working_tree_dir:
272301 working_tree_linux = to_native_path_linux(parent_repo.working_tree_dir)
273302 if not path.startswith(working_tree_linux):
274303 raise ValueError("Submodule checkout path '%s' needs to be within the parents repository at '%s'"
282311 return path
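`_to_relative_path` above normalizes trailing slashes and validates that absolute paths fall inside the parent repository's working tree. A rough standalone sketch of the same checks (simplified; POSIX-style string paths assumed):

```python
import os.path as osp

def to_relative_path(working_tree_dir: str, path: str) -> str:
    # Drop a trailing slash, then require absolute paths to live inside
    # the working tree and strip that prefix to obtain a relative path.
    if path.endswith('/'):
        path = path[:-1]
    if osp.isabs(path):
        if not path.startswith(working_tree_dir):
            raise ValueError(
                "Submodule checkout path %r needs to be within %r"
                % (path, working_tree_dir))
        path = path[len(working_tree_dir) + 1:]
    return path
```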
283312
284313 @classmethod
285 def _write_git_file_and_module_config(cls, working_tree_dir, module_abspath):
286 """Writes a .git file containing a (preferably) relative path to the actual git module repository.
314 def _write_git_file_and_module_config(cls, working_tree_dir: PathLike, module_abspath: PathLike) -> None:
315 """Writes a .git file containing a(preferably) relative path to the actual git module repository.
287316 It is an error if the module_abspath cannot be made into a relative path, relative to the working_tree_dir
288317 :note: will overwrite existing files !
289318 :note: as we rewrite both the git file as well as the module configuration, we might fail on the configuration
290 and will not roll back changes done to the git file. This should be a non-issue, but may easily be fixed
319 and will not roll back changes done to the git file. This should be a non-issue, but may easily be fixed
291320 if it becomes one
292321 :param working_tree_dir: directory to write the .git file into
293322 :param module_abspath: absolute path to the bare repository
308337 #{ Edit Interface
309338
310339 @classmethod
311 def add(cls, repo, name, path, url=None, branch=None, no_checkout=False, depth=None, env=None):
340 def add(cls, repo: 'Repo', name: str, path: PathLike, url: Union[str, None] = None,
341 branch: Union[str, None] = None, no_checkout: bool = False, depth: Union[int, None] = None,
342 env: Union[Mapping[str, str], None] = None, clone_multi_options: Union[Sequence[TBD], None] = None
343 ) -> 'Submodule':
312344 """Add a new submodule to the given repository. This will alter the index
313345 as well as the .gitmodules file, but will not create a new commit.
314346 If the submodule already exists, no matter if the configuration differs
341373 and is defined in `os.environ`, value from `os.environ` will be used.
342374 If you want to unset some variable, consider providing empty string
343375 as its value.
376 :param clone_multi_options: A list of Clone options. Please see ``git.repo.base.Repo.clone``
377 for details.
344378 :return: The newly created submodule instance
345379 :note: works atomically, such that no change will be done if the repository
346380 update fails for instance"""
381
347382 if repo.bare:
348383 raise InvalidGitRepositoryError("Cannot add submodules to bare repositories")
349384 # END handle bare repos
361396 if sm.exists():
362397 # reretrieve submodule from tree
363398 try:
364 sm = repo.head.commit.tree[path]
399 sm = repo.head.commit.tree[str(path)]
365400 sm._name = name
366401 return sm
367402 except KeyError:
384419 # END check url
385420 # END verify urls match
386421
387 mrepo = None
422 mrepo: Union[Repo, None] = None
423
388424 if url is None:
389425 if not has_module:
390 raise ValueError("A URL was not given and existing repository did not exsit at %s" % path)
426 raise ValueError("A URL was not given and a repository did not exist at %s" % path)
391427 # END check url
392428 mrepo = sm.module()
429 # assert isinstance(mrepo, git.Repo)
393430 urls = [r.url for r in mrepo.remotes]
394431 if not urls:
395432 raise ValueError("Didn't find any remote url in repository at %s" % sm.abspath)
397434 url = urls[0]
398435 else:
399436 # clone new repo
400 kwargs = {'n': no_checkout}
437 kwargs: Dict[str, Union[bool, int, str, Sequence[TBD]]] = {'n': no_checkout}
401438 if not branch_is_default:
402439 kwargs['b'] = br.name
403440 # END setup checkout-branch
407444 kwargs['depth'] = depth
408445 else:
409446 raise ValueError("depth should be an integer")
447 if clone_multi_options:
448 kwargs['multi_options'] = clone_multi_options
410449
411450 # _clone_repo(cls, repo, url, path, name, **kwargs):
412451 mrepo = cls._clone_repo(repo, url, path, name, env=env, **kwargs)
419458 # otherwise there is a '-' character in front of the submodule listing
420459 # a38efa84daef914e4de58d1905a500d8d14aaf45 mymodule (v0.9.0-1-ga38efa8)
421460 # -a38efa84daef914e4de58d1905a500d8d14aaf45 submodules/intermediate/one
461 writer: Union[GitConfigParser, SectionConstraint]
462
422463 with sm.repo.config_writer() as writer:
423464 writer.set_value(sm_section(name), 'url', url)
424465
435476 sm._branch_path = br.path
436477
437478 # we deliberately assume that our head matches our index !
438 sm.binsha = mrepo.head.commit.binsha
479 if mrepo:
480 sm.binsha = mrepo.head.commit.binsha
439481 index.add([sm], write=True)
440482
441483 return sm
442484
443 def update(self, recursive=False, init=True, to_latest_revision=False, progress=None, dry_run=False,
444 force=False, keep_going=False, env=None):
485 def update(self, recursive: bool = False, init: bool = True, to_latest_revision: bool = False,
486 progress: Union['UpdateProgress', None] = None, dry_run: bool = False,
487 force: bool = False, keep_going: bool = False, env: Union[Mapping[str, str], None] = None,
488 clone_multi_options: Union[Sequence[TBD], None] = None) -> 'Submodule':
445489 """Update the repository of this submodule to point to the checkout
446490 we point at with the binsha of this instance.
447491
455499 was specified for this submodule and the branch existed remotely
456500 :param progress: UpdateProgress instance or None if no progress should be shown
457501 :param dry_run: if True, the operation will only be simulated, but not performed.
458 All performed operations are read-only
502 All performed operations are read-only
459503 :param force:
460504 If True, we may reset heads even if the repository in question is dirty. Additionally we will be allowed
461505 to set a tracking branch which is ahead of its remote branch back into the past or the location of the
463507 If False, local tracking branches that are in the future of their respective remote branches will simply
464508 not be moved.
465509 :param keep_going: if True, we will ignore but log all errors, and keep going recursively.
466 Unless dry_run is set as well, keep_going could cause subsequent/inherited errors you wouldn't see
510 Unless dry_run is set as well, keep_going could cause subsequent/inherited errors you wouldn't see
467511 otherwise.
468512 In conjunction with dry_run, it can be useful to anticipate all errors when updating submodules
469513 :param env: Optional dictionary containing the desired environment variables.
472516 and is defined in `os.environ`, value from `os.environ` will be used.
473517 If you want to unset some variable, consider providing empty string
474518 as its value.
519 :param clone_multi_options: list of Clone options. Please see ``git.repo.base.Repo.clone``
520 for details. Only takes effect with the `init` option.
475521 :note: does nothing in bare repositories
476522 :note: method is definitely not atomic if recursive is True
477523 :return: self"""
518564 progress.update(op, i, len_rmts, prefix + "Done fetching remote of submodule %r" % self.name)
519565 # END fetch new data
520566 except InvalidGitRepositoryError:
567 mrepo = None
521568 if not init:
522569 return self
523570 # END early abort if init is not allowed
538585 progress.update(BEGIN | CLONE, 0, 1, prefix + "Cloning url '%s' to '%s' in submodule %r" %
539586 (self.url, checkout_module_abspath, self.name))
540587 if not dry_run:
541 mrepo = self._clone_repo(self.repo, self.url, self.path, self.name, n=True, env=env)
588 mrepo = self._clone_repo(self.repo, self.url, self.path, self.name, n=True, env=env,
589 multi_options=clone_multi_options)
542590 # END handle dry-run
543591 progress.update(END | CLONE, 0, 1, prefix + "Done cloning to %s" % checkout_module_abspath)
544592
545593 if not dry_run:
546594 # see whether we have a valid branch to checkout
547595 try:
596 mrepo = cast('Repo', mrepo)
548597 # find a remote which has our branch - we try to be flexible
549598 remote_branch = find_first_remote_branch(mrepo.remotes, self.branch_name)
550599 local_branch = mkhead(mrepo, self.branch_path)
556605
557606 # make sure HEAD is not detached
558607 mrepo.head.set_reference(local_branch, logmsg="submodule: attaching head to %s" % local_branch)
559 mrepo.head.ref.set_tracking_branch(remote_branch)
608 mrepo.head.reference.set_tracking_branch(remote_branch)
560609 except (IndexError, InvalidGitRepositoryError):
561610 log.warning("Failed to checkout tracking branch %s", self.branch_path)
562611 # END handle tracking branch
582631 if mrepo is not None and to_latest_revision:
583632 msg_base = "Cannot update to latest revision in repository at %r as " % mrepo.working_dir
584633 if not is_detached:
585 rref = mrepo.head.ref.tracking_branch()
634 rref = mrepo.head.reference.tracking_branch()
586635 if rref is not None:
587636 rcommit = rref.commit
588637 binsha = rcommit.binsha
589638 hexsha = rcommit.hexsha
590639 else:
591 log.error("%s a tracking branch was not set for local branch '%s'", msg_base, mrepo.head.ref)
640 log.error("%s a tracking branch was not set for local branch '%s'",
641 msg_base, mrepo.head.reference)
592642 # END handle remote ref
593643 else:
594644 log.error("%s there was no local tracking branch", msg_base)
605655 may_reset = True
606656 if mrepo.head.commit.binsha != self.NULL_BIN_SHA:
607657 base_commit = mrepo.merge_base(mrepo.head.commit, hexsha)
608 if len(base_commit) == 0 or base_commit[0].hexsha == hexsha:
658 if len(base_commit) == 0 or (base_commit[0] is not None and base_commit[0].hexsha == hexsha):
609659 if force:
610660 msg = "Will force checkout or reset on local branch that is possibly in the future of"
611661 msg += " the commit it will be checked out to, effectively 'forgetting' new commits"
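The merge-base test above decides whether a reset would move the submodule's branch backwards: if the merge base of `HEAD` and the target commit *is* the target, then `HEAD` is a descendant of it and new commits would be "forgotten". A minimal sketch of that reachability check over a plain parents-dict DAG (the helper names here are illustrative, not GitPython API):

```python
def ancestors(commit, parents):
    """All commits reachable from `commit` (inclusive) in a parents-dict DAG."""
    seen, stack = set(), [commit]
    while stack:
        c = stack.pop()
        if c not in seen:
            seen.add(c)
            stack.extend(parents.get(c, ()))
    return seen

def reset_loses_commits(head, target, parents):
    # head is strictly ahead of target <=> target is a proper ancestor of head,
    # which is exactly when update() demands force_reset
    return target in ancestors(head, parents) and head != target

# linear history: A <- B <- C
parents = {"C": ["B"], "B": ["A"], "A": []}
```

With this history, resetting from "C" back to "A" would lose commits, while fast-forwarding from "A" to "C" would not.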
663713 return self
664714
665715 @unbare_repo
666 def move(self, module_path, configuration=True, module=True):
716 def move(self, module_path: PathLike, configuration: bool = True, module: bool = True) -> 'Submodule':
667717 """Move the submodule to a another module path. This involves physically moving
668718 the repository at our current path, changing the configuration, as well as
669719 adjusting our index entry accordingly.
670720
671721 :param module_path: the path to which to move our module in the parent repository's working tree,
672 given as repository-relative or absolute path. Intermediate directories will be created
722 given as repository-relative or absolute path. Intermediate directories will be created
673723 accordingly. If the path already exists, it must be empty.
674 Trailing (back)slashes are removed automatically
724 Trailing (back)slashes are removed automatically
675725 :param configuration: if True, the configuration will be adjusted to let
676726 the submodule point to the given path.
677727 :param module: if True, the repository managed by this submodule
680730 :return: self
681731 :raise ValueError: if the module path existed and was not empty, or was a file
682732 :note: Currently the method is not atomic, and it could leave the repository
683 in an inconsistent state if a sub-step fails for some reason
733 in an inconsistent state if a sub-step fails for some reason
684734 """
685735 if module + configuration < 1:
686736 raise ValueError("You must specify to move at least the module or the configuration of the submodule")
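The guard above relies on Python bools being ints: `module + configuration < 1` is true exactly when both flags are `False`, i.e. the caller asked `move()` to do nothing. A one-line sketch of that idiom:

```python
# Sketch of the validation at the top of Submodule.move(): since
# False == 0 and True == 1, the sum is below 1 only when both flags
# are False.
def nothing_to_move(module: bool, configuration: bool) -> bool:
    return module + configuration < 1
```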
693743 return self
694744 # END handle no change
695745
696 module_checkout_abspath = join_path_native(self.repo.working_tree_dir, module_checkout_path)
746 module_checkout_abspath = join_path_native(str(self.repo.working_tree_dir), module_checkout_path)
697747 if osp.isfile(module_checkout_abspath):
698748 raise ValueError("Cannot move repository onto a file: %s" % module_checkout_abspath)
699749 # END handle target files
772822 return self
773823
774824 @unbare_repo
775 def remove(self, module=True, force=False, configuration=True, dry_run=False):
825 def remove(self, module: bool = True, force: bool = False,
826 configuration: bool = True, dry_run: bool = False) -> 'Submodule':
776827 """Remove this submodule from the repository. This will remove our entry
777 from the .gitmodules file and the entry in the .git/config file.
828 from the .gitmodules file and the entry in the .git/config file.
778829
779830 :param module: If True, the module checkout we point to will be deleted
780831 as well. If the module is currently on a commit which is not part
781832 of any branch in the remote, if the currently checked out branch
782833 is ahead of its tracking branch, or if you have modifications in the
783 working tree, or untracked files,
834 working tree, or untracked files,
784835 In case the removal of the repository fails for these reasons, the
785836 submodule status will not have been altered.
786 If this submodule has child-modules on its own, these will be deleted
837 If this submodule has child-modules on its own, these will be deleted
787838 prior to touching the own module.
788839 :param force: Enforces the deletion of the module even though it contains
789 modifications. This basically enforces a brute-force file system based
840 modifications. This basically enforces a brute-force file system based
790841 deletion.
791842 :param configuration: if True, the submodule is deleted from the configuration,
792843 otherwise it isn't. Although this should be enabled most of the times,
826877 # TODO: If we run into permission problems, we have a highly inconsistent
827878 # state. Delete the .git folders last, start with the submodules first
828879 mp = self.abspath
829 method = None
880 method: Union[None, Callable[[PathLike], None]] = None
830881 if osp.islink(mp):
831882 method = os.remove
832883 elif osp.isdir(mp):
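The dispatch above picks a deletion callable per path type: `os.remove` for symlinks and files, a recursive tree removal for directories (GitPython uses its own `rmtree` wrapper to cope with read-only files on Windows; plain `shutil.rmtree` stands in here). A minimal sketch:

```python
import os
import shutil
import tempfile

def deletion_method(path):
    """Return a callable suited to delete `path`, or None if it is gone.
    Mirrors the method selection in Submodule.remove() (sketch only)."""
    if os.path.islink(path):
        return os.remove
    if os.path.isdir(path):
        return shutil.rmtree
    if os.path.isfile(path):
        return os.remove
    return None

# usage: pick the method for a scratch directory and apply it
scratch = tempfile.mkdtemp()
method = deletion_method(scratch)
method(scratch)
```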
879930 import gc
880931 gc.collect()
881932 try:
882 rmtree(wtd)
933 rmtree(str(wtd))
883934 except Exception as ex:
884935 if HIDE_WINDOWS_KNOWN_ERRORS:
885936 raise SkipTest("FIXME: fails with: PermissionError\n {}".format(ex)) from ex
893944 rmtree(git_dir)
894945 except Exception as ex:
895946 if HIDE_WINDOWS_KNOWN_ERRORS:
896 raise SkipTest("FIXME: fails with: PermissionError\n %s", ex) from ex
947 raise SkipTest(f"FIXME: fails with: PermissionError\n {ex}") from ex
897948 else:
898949 raise
899950 # end handle separate bare repository
917968
918969 # now git config - need the config intact, otherwise we can't query
919970 # information anymore
920 with self.repo.config_writer() as writer:
921 writer.remove_section(sm_section(self.name))
922
923 with self.config_writer() as writer:
924 writer.remove_section()
971
972 with self.repo.config_writer() as gcp_writer:
973 gcp_writer.remove_section(sm_section(self.name))
974
975 with self.config_writer() as sc_writer:
976 sc_writer.remove_section()
925977 # END delete configuration
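The two writers above drop the submodule's section from the parent repository's `.git/config` and from `.gitmodules`. The same two-step cleanup can be sketched with the stdlib `configparser` standing in for `GitConfigParser` (the repository URL is illustrative):

```python
import configparser

def sm_section(name):
    # same shape as git.objects.submodule.util.sm_section
    return 'submodule "%s"' % name

cfg = configparser.ConfigParser()
cfg.read_string('[submodule "extlib"]\nurl = https://example.com/extlib.git\n')

# removing the section deletes every key under it in one call
cfg.remove_section(sm_section("extlib"))
```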
926978
927979 return self
928980
929 def set_parent_commit(self, commit, check=True):
981 def set_parent_commit(self, commit: Union[Commit_ish, None], check: bool = True) -> 'Submodule':
930982 """Set this instance to use the given commit whose tree is supposed to
931983 contain the .gitmodules blob.
932984
9651017 # If check is False, we might see a parent-commit that doesn't even contain the submodule anymore.
9661018 # in that case, mark our sha as being NULL
9671019 try:
968 self.binsha = pctree[self.path].binsha
1020 self.binsha = pctree[str(self.path)].binsha
9691021 except KeyError:
9701022 self.binsha = self.NULL_BIN_SHA
9711023 # end
9741026 return self
9751027
9761028 @unbare_repo
977 def config_writer(self, index=None, write=True):
1029 def config_writer(self, index: Union['IndexFile', None] = None, write: bool = True
1030 ) -> SectionConstraint['SubmoduleConfigParser']:
9781031 """:return: a config writer instance allowing you to read and write the data
9791032 belonging to this submodule into the .gitmodules file.
9801033
9951048 return writer
9961049
9971050 @unbare_repo
998 def rename(self, new_name):
1051 def rename(self, new_name: str) -> 'Submodule':
9991052 """Rename this submodule
10001053 :note: This method takes care of renaming the submodule in various places, such as
10011054
1002 * $parent_git_dir/config
1003 * $working_tree_dir/.gitmodules
1004 * (git >=v1.8.0: move submodule repository to new name)
1055 * $parent_git_dir/config
1056 * $working_tree_dir/.gitmodules
1057 * (git >= v1.8.0: move submodule repository to new name)
10051058
10061059 As .gitmodules will be changed, you would need to make a commit afterwards. The changed .gitmodules file
10071060 will already be added to the index
10301083 destination_module_abspath = self._module_abspath(self.repo, self.path, new_name)
10311084 source_dir = mod.git_dir
10321085 # Let's be sure the submodule name is not so obviously tied to a directory
1033 if destination_module_abspath.startswith(mod.git_dir):
1086 if str(destination_module_abspath).startswith(str(mod.git_dir)):
10341087 tmp_dir = self._module_abspath(self.repo, self.path, str(uuid.uuid4()))
10351088 os.renames(source_dir, tmp_dir)
10361089 source_dir = tmp_dir
10371090 # end handle self-containment
10381091 os.renames(source_dir, destination_module_abspath)
1039 self._write_git_file_and_module_config(mod.working_tree_dir, destination_module_abspath)
1092 if mod.working_tree_dir:
1093 self._write_git_file_and_module_config(mod.working_tree_dir, destination_module_abspath)
10401094 # end move separate git repository
10411095
10421096 return self
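The self-containment dance in `rename()` above can be sketched in isolation: if the destination lies inside the source module directory, the source is first moved to a unique temporary name so `os.renames()` never tries to move a directory into itself (helper name and paths are illustrative):

```python
import os
import tempfile
import uuid

def safe_renames(source_dir, destination):
    """Move source_dir to destination, detouring via a uuid-named
    temporary directory when destination is nested inside source_dir."""
    if str(destination).startswith(str(source_dir)):
        tmp_dir = os.path.join(os.path.dirname(source_dir), str(uuid.uuid4()))
        os.renames(source_dir, tmp_dir)
        source_dir = tmp_dir
    # os.renames creates intermediate directories for the new path
    os.renames(source_dir, destination)

base = tempfile.mkdtemp()
src = os.path.join(base, "modules", "old")
os.makedirs(src)
safe_renames(src, os.path.join(base, "modules", "old", "new"))
```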
10461100 #{ Query Interface
10471101
10481102 @unbare_repo
1049 def module(self):
1103 def module(self) -> 'Repo':
10501104 """:return: Repo instance initialized from the repository at our submodule path
10511105 :raise InvalidGitRepositoryError: if a repository was not available. This could
10521106 also mean that it was not yet initialized"""
10631117 raise InvalidGitRepositoryError("Repository at %r was not yet checked out" % module_checkout_abspath)
10641118 # END handle exceptions
10651119
1066 def module_exists(self):
1120 def module_exists(self) -> bool:
10671121 """:return: True if our module exists and is a valid git repository. See module() method"""
10681122 try:
10691123 self.module()
10721126 return False
10731127 # END handle exception
10741128
1075 def exists(self):
1129 def exists(self) -> bool:
10761130 """
10771131 :return: True if the submodule exists, False otherwise. Please note that
1078 a submodule may exist (in the .gitmodules file) even though its module
1132 a submodule may exist (in the .gitmodules file) even though its module
10791133 doesn't exist on disk"""
10801134 # keep attributes for later, and restore them if we have no valid data
10811135 # this way we do not actually alter the state of the object
11071161 # END handle object state consistency
11081162
11091163 @property
1110 def branch(self):
1164 def branch(self) -> 'Head':
11111165 """:return: The branch instance that we are to checkout
11121166 :raise InvalidGitRepositoryError: if our module is not yet checked out"""
11131167 return mkhead(self.module(), self._branch_path)
11141168
11151169 @property
1116 def branch_path(self):
1170 def branch_path(self) -> PathLike:
11171171 """
1118 :return: full (relative) path as string to the branch we would checkout
1172 :return: full (relative) path as string to the branch we would checkout
11191173 from the remote and track"""
11201174 return self._branch_path
11211175
11221176 @property
1123 def branch_name(self):
1177 def branch_name(self) -> str:
11241178 """:return: the name of the branch, which is the shortest possible branch name"""
11251179 # use an instance method, for this we create a temporary Head instance
11261180 # which uses a repository that is available at least (it makes no difference)
11271181 return git.Head(self.repo, self._branch_path).name
11281182
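`branch_name` above derives the shortest name from the full ref path held in `_branch_path`; `git.Head.name` does the equivalent strip. A sketch of that derivation (hypothetical helper, not GitPython API):

```python
def short_branch_name(branch_path: str) -> str:
    """Strip the 'refs/heads/' prefix, if present, to get the short name."""
    prefix = "refs/heads/"
    if branch_path.startswith(prefix):
        return branch_path[len(prefix):]
    return branch_path
```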
11291183 @property
1130 def url(self):
1131 """:return: The url to the repository which our module-repository refers to"""
1184 def url(self) -> str:
1185 """:return: The url to the repository which our module - repository refers to"""
11321186 return self._url
11331187
11341188 @property
1135 def parent_commit(self):
1189 def parent_commit(self) -> 'Commit_ish':
11361190 """:return: Commit instance with the tree containing the .gitmodules file
11371191 :note: will always point to the current head's commit if it was not set explicitly"""
11381192 if self._parent_commit is None:
11401194 return self._parent_commit
11411195
11421196 @property
1143 def name(self):
1197 def name(self) -> str:
11441198 """:return: The name of this submodule. It is used to identify it within the
11451199 .gitmodules file.
11461200 :note: by default, the name is the path at which to find the submodule, but
1147 in git-python it should be a unique identifier similar to the identifiers
1201 in git-python it should be a unique identifier similar to the identifiers
11481202 used for remotes, which allows to change the path of the submodule
11491203 easily
11501204 """
11511205 return self._name
11521206
1153 def config_reader(self):
1207 def config_reader(self) -> SectionConstraint[SubmoduleConfigParser]:
11541208 """
11551209 :return: ConfigReader instance which allows you to query the configuration values
11561210 of this submodule, as provided by the .gitmodules file
11601214 :raise IOError: If the .gitmodules file/blob could not be read"""
11611215 return self._config_parser_constrained(read_only=True)
11621216
1163 def children(self):
1217 def children(self) -> IterableList['Submodule']:
11641218 """
11651219 :return: IterableList(Submodule, ...) an iterable list of submodules instances
11661220 which are children of this submodule or 0 if the submodule is not checked out"""
11711225 #{ Iterable Interface
11721226
11731227 @classmethod
1174 def iter_items(cls, repo, parent_commit='HEAD'):
1228 def iter_items(cls, repo: 'Repo', parent_commit: Union[Commit_ish, str] = 'HEAD', *args: Any, **kwargs: Any
1229 ) -> Iterator['Submodule']:
11751230 """:return: iterator yielding Submodule instances available in the given repository"""
11761231 try:
11771232 pc = repo.commit(parent_commit) # parent commit instance
11781233 parser = cls._config_parser(repo, pc, read_only=True)
11791234 except (IOError, BadName):
1180 return
1235 return iter([])
11811236 # END handle empty iterator
1182
1183 rt = pc.tree # root tree
11841237
11851238 for sms in parser.sections():
11861239 n = sm_name(sms)
11941247 # get the binsha
11951248 index = repo.index
11961249 try:
1250 rt = pc.tree # root tree
11971251 sm = rt[p]
11981252 except KeyError:
11991253 # try the index, maybe it was just added
11 Submodule,
22 UpdateProgress
33 )
4 from .util import (
5 find_first_remote_branch
6 )
4 from .util import find_first_remote_branch
75 from git.exc import InvalidGitRepositoryError
86 import git
97
108 import logging
9
10 # typing -------------------------------------------------------------------
11
12 from typing import TYPE_CHECKING, Union
13
14 from git.types import Commit_ish
15
16 if TYPE_CHECKING:
17 from git.repo import Repo
18 from git.util import IterableList
19
20 # ----------------------------------------------------------------------------
1121
1222 __all__ = ["RootModule", "RootUpdateProgress"]
1323
4151
4252 k_root_name = '__ROOT__'
4353
44 def __init__(self, repo):
54 def __init__(self, repo: 'Repo'):
4555 # repo, binsha, mode=None, path=None, name = None, parent_commit=None, url=None, ref=None)
4656 super(RootModule, self).__init__(
4757 repo,
5464 branch_path=git.Head.to_full_path(self.k_head_default)
5565 )
5666
57 def _clear_cache(self):
67 def _clear_cache(self) -> None:
5868 """May not do anything"""
5969 pass
6070
6171 #{ Interface
6272
63 def update(self, previous_commit=None, recursive=True, force_remove=False, init=True,
64 to_latest_revision=False, progress=None, dry_run=False, force_reset=False,
65 keep_going=False):
73 def update(self, previous_commit: Union[Commit_ish, None] = None, # type: ignore[override]
74 recursive: bool = True, force_remove: bool = False, init: bool = True,
75 to_latest_revision: bool = False, progress: Union[None, 'RootUpdateProgress'] = None,
76 dry_run: bool = False, force_reset: bool = False, keep_going: bool = False
77 ) -> 'RootModule':
6678 """Update the submodules of this repository to the current HEAD commit.
6779 This method behaves smartly by determining changes of the path of a submodules
6880 repository, next to changes to the to-be-checked-out commit or the branch to be
127139 previous_commit = repo.commit(previous_commit) # obtain commit object
128140 # END handle previous commit
129141
130 psms = self.list_items(repo, parent_commit=previous_commit)
131 sms = self.list_items(repo)
142 psms: 'IterableList[Submodule]' = self.list_items(repo, parent_commit=previous_commit)
143 sms: 'IterableList[Submodule]' = self.list_items(repo)
132144 spsms = set(psms)
133145 ssms = set(sms)
134146
161173 csms = (spsms & ssms)
162174 len_csms = len(csms)
163175 for i, csm in enumerate(csms):
164 psm = psms[csm.name]
165 sm = sms[csm.name]
176 psm: 'Submodule' = psms[csm.name]
177 sm: 'Submodule' = sms[csm.name]
166178
167179 # PATH CHANGES
168180 ##############
342354
343355 return self
344356
345 def module(self):
357 def module(self) -> 'Repo':
346358 """:return: the actual repository containing the submodules"""
347359 return self.repo
348360 #} END interface
22 from git.config import GitConfigParser
33 from io import BytesIO
44 import weakref
5
6
7 # typing -----------------------------------------------------------------------
8
9 from typing import Any, Sequence, TYPE_CHECKING, Union
10
11 from git.types import PathLike
12
13 if TYPE_CHECKING:
14 from .base import Submodule
15 from weakref import ReferenceType
16 from git.repo import Repo
17 from git.refs import Head
18 from git import Remote
19 from git.refs import RemoteReference
20
521
622 __all__ = ('sm_section', 'sm_name', 'mkhead', 'find_first_remote_branch',
723 'SubmoduleConfigParser')
925 #{ Utilities
1026
1127
12 def sm_section(name):
28 def sm_section(name: str) -> str:
1329 """:return: section title used in .gitmodules configuration file"""
14 return 'submodule "%s"' % name
30 return f'submodule "{name}"'
1531
1632
17 def sm_name(section):
33 def sm_name(section: str) -> str:
1834 """:return: name of the submodule as parsed from the section name"""
1935 section = section.strip()
2036 return section[11:-1]
2137
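The two helpers above are exact inverses: `sm_section` quotes a name into a `.gitmodules` section title, and `sm_name` slices it back out (`len('submodule "')` is 11, and the trailing quote is dropped by the `-1`). A round-trip sketch:

```python
def sm_section(name: str) -> str:
    return f'submodule "{name}"'

def sm_name(section: str) -> str:
    section = section.strip()
    return section[11:-1]  # drop 'submodule "' prefix and trailing quote
```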
2238
23 def mkhead(repo, path):
39 def mkhead(repo: 'Repo', path: PathLike) -> 'Head':
2440 """:return: New branch/head instance"""
2541 return git.Head(repo, git.Head.to_full_path(path))
2642
2743
28 def find_first_remote_branch(remotes, branch_name):
44 def find_first_remote_branch(remotes: Sequence['Remote'], branch_name: str) -> 'RemoteReference':
2945 """Find the remote branch matching the name of the given branch or raise InvalidGitRepositoryError"""
3046 for remote in remotes:
3147 try:
5268 Please note that no mutating method will work in bare mode
5369 """
5470
55 def __init__(self, *args, **kwargs):
56 self._smref = None
71 def __init__(self, *args: Any, **kwargs: Any) -> None:
72 self._smref: Union['ReferenceType[Submodule]', None] = None
5773 self._index = None
5874 self._auto_write = True
5975 super(SubmoduleConfigParser, self).__init__(*args, **kwargs)
6076
6177 #{ Interface
62 def set_submodule(self, submodule):
78 def set_submodule(self, submodule: 'Submodule') -> None:
6379 """Set this instance's submodule. It must be called before
6480 the first write operation begins"""
6581 self._smref = weakref.ref(submodule)
6682
67 def flush_to_index(self):
83 def flush_to_index(self) -> None:
6884 """Flush changes in our configuration file to the index"""
6985 assert self._smref is not None
7086 # should always have a file here
8399 #} END interface
84100
85101 #{ Overridden Methods
86 def write(self):
87 rval = super(SubmoduleConfigParser, self).write()
102 def write(self) -> None: # type: ignore[override]
103 rval: None = super(SubmoduleConfigParser, self).write()
88104 self.flush_to_index()
89105 return rval
90106 # END overridden methods
88 from ..util import hex_to_bin
99 from ..compat import defenc
1010
11 from typing import List, TYPE_CHECKING, Union
12
13 from git.types import Literal
14
15 if TYPE_CHECKING:
16 from git.repo import Repo
17 from git.util import Actor
18 from .commit import Commit
19 from .blob import Blob
20 from .tree import Tree
21
1122 __all__ = ("TagObject", )
1223
1324
1425 class TagObject(base.Object):
1526
1627 """Non-Lightweight tag carrying additional information about an object we are pointing to."""
17 type = "tag"
28 type: Literal['tag'] = "tag"
1829 __slots__ = ("object", "tag", "tagger", "tagged_date", "tagger_tz_offset", "message")
1930
20 def __init__(self, repo, binsha, object=None, tag=None, # @ReservedAssignment
21 tagger=None, tagged_date=None, tagger_tz_offset=None, message=None):
31 def __init__(self, repo: 'Repo', binsha: bytes,
32 object: Union[None, base.Object] = None,
33 tag: Union[None, str] = None,
34 tagger: Union[None, 'Actor'] = None,
35 tagged_date: Union[int, None] = None,
36 tagger_tz_offset: Union[int, None] = None,
37 message: Union[str, None] = None
38 ) -> None: # @ReservedAssignment
2239 """Initialize a tag object with additional data
2340
2441 :param repo: repository this object is located in
3350 authored_date is in, in a format similar to time.altzone"""
3451 super(TagObject, self).__init__(repo, binsha)
3552 if object is not None:
36 self.object = object
53 self.object: Union['Commit', 'Blob', 'Tree', 'TagObject'] = object
3754 if tag is not None:
3855 self.tag = tag
3956 if tagger is not None:
4562 if message is not None:
4663 self.message = message
4764
48 def _set_cache_(self, attr):
65 def _set_cache_(self, attr: str) -> None:
4966 """Cache all our attributes at once"""
5067 if attr in TagObject.__slots__:
5168 ostream = self.repo.odb.stream(self.binsha)
52 lines = ostream.read().decode(defenc, 'replace').splitlines()
69 lines: List[str] = ostream.read().decode(defenc, 'replace').splitlines()
5370
5471 _obj, hexsha = lines[0].split(" ")
5572 _type_token, type_name = lines[1].split(" ")
73 object_type = get_object_type_by_name(type_name.encode('ascii'))
5674 self.object = \
57 get_object_type_by_name(type_name.encode('ascii'))(self.repo, hex_to_bin(hexsha))
75 object_type(self.repo, hex_to_bin(hexsha))
5876
5977 self.tag = lines[2][4:] # tag <tag name>
6078
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5 from git.util import join_path
6 import git.diff as diff
5
6 from git.util import IterableList, join_path
7 import git.diff as git_diff
78 from git.util import to_bin_sha
89
910 from . import util
10 from .base import IndexObject
11 from .base import IndexObject, IndexObjUnion
1112 from .blob import Blob
1213 from .submodule.base import Submodule
1314
1617 tree_to_stream
1718 )
1819
19 cmp = lambda a, b: (a > b) - (a < b)
20
21 # typing -------------------------------------------------
22
23 from typing import (Any, Callable, Dict, Iterable, Iterator, List,
24 Tuple, Type, Union, cast, TYPE_CHECKING)
25
26 from git.types import PathLike, Literal
27
28 if TYPE_CHECKING:
29 from git.repo import Repo
30 from io import BytesIO
31
32 TreeCacheTup = Tuple[bytes, int, str]
33
34 TraversedTreeTup = Union[Tuple[Union['Tree', None], IndexObjUnion,
35 Tuple['Submodule', 'Submodule']]]
36
37
38 # def is_tree_cache(inp: Tuple[bytes, int, str]) -> TypeGuard[TreeCacheTup]:
39 # return isinstance(inp[0], bytes) and isinstance(inp[1], int) and isinstance([inp], str)
40
41 #--------------------------------------------------------
42
43
44 cmp: Callable[[str, str], int] = lambda a, b: (a > b) - (a < b)
2045
2146 __all__ = ("TreeModifier", "Tree")
2247
2348
24 def git_cmp(t1, t2):
49 def git_cmp(t1: TreeCacheTup, t2: TreeCacheTup) -> int:
2550 a, b = t1[2], t2[2]
51 # assert isinstance(a, str) and isinstance(b, str)
2652 len_a, len_b = len(a), len(b)
2753 min_len = min(len_a, len_b)
2854 min_cmp = cmp(a[:min_len], b[:min_len])
3359 return len_a - len_b
3460
3561
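The `cmp` lambda above reimplements Python 2's three-way `cmp()` as the sign of `(a > b) - (a < b)`, and `git_cmp` compares entry names on their common prefix first, falling back to length. A self-contained sketch of both:

```python
from typing import Tuple

cmp = lambda a, b: (a > b) - (a < b)  # three-way comparison: -1, 0, or 1

TreeCacheTup = Tuple[bytes, int, str]  # (binsha, mode, name)

def git_cmp(t1: TreeCacheTup, t2: TreeCacheTup) -> int:
    a, b = t1[2], t2[2]
    min_len = min(len(a), len(b))
    min_cmp = cmp(a[:min_len], b[:min_len])
    if min_cmp:
        return min_cmp
    return len(a) - len(b)  # equal prefixes: shorter name sorts first

entry = lambda name: (b"\0" * 20, 0o100644, name)  # illustrative cache tuple
```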
36 def merge_sort(a, cmp):
62 def merge_sort(a: List[TreeCacheTup],
63 cmp: Callable[[TreeCacheTup, TreeCacheTup], int]) -> None:
3764 if len(a) < 2:
38 return
65 return None
3966
4067 mid = len(a) // 2
4168 lefthalf = a[:mid]
76103 the cache of a tree, will be sorted. Assuring it will be in a serializable state"""
77104 __slots__ = '_cache'
78105
79 def __init__(self, cache):
106 def __init__(self, cache: List[TreeCacheTup]) -> None:
80107 self._cache = cache
81108
82 def _index_by_name(self, name):
109 def _index_by_name(self, name: str) -> int:
83110 """:return: index of an item with name, or -1 if not found"""
84111 for i, t in enumerate(self._cache):
85112 if t[2] == name:
89116 return -1
90117
91118 #{ Interface
92 def set_done(self):
119 def set_done(self) -> 'TreeModifier':
93120 """Call this method once you are done modifying the tree information.
94121 It may be called several times, but be aware that each call will cause
95122 a sort operation
99126 #} END interface
100127
101128 #{ Mutators
102 def add(self, sha, mode, name, force=False):
129 def add(self, sha: bytes, mode: int, name: str, force: bool = False) -> 'TreeModifier':
103130 """Add the given item to the tree. If an item with the given name already
104131 exists, nothing will be done, but a ValueError will be raised if the
105132 sha and mode of the existing item do not match the one you add, unless
117144
118145 sha = to_bin_sha(sha)
119146 index = self._index_by_name(name)
147
120148 item = (sha, mode, name)
149 # assert is_tree_cache(item)
150
121151 if index == -1:
122152 self._cache.append(item)
123153 else:
132162 # END handle name exists
133163 return self
134164
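The `add()` semantics above can be sketched directly on the raw cache list of `(binsha, mode, name)` tuples: append a new entry, replace on `force`, and reject a mismatching duplicate otherwise (standalone function, not the GitPython class):

```python
def add(cache, sha, mode, name, force=False):
    """Sketch of TreeModifier.add() on a list of (sha, mode, name) tuples."""
    index = next((i for i, t in enumerate(cache) if t[2] == name), -1)
    item = (sha, mode, name)
    if index == -1:
        cache.append(item)
    elif force:
        cache[index] = item
    elif cache[index] != item:
        raise ValueError("Item %r existed with different properties" % name)
    return cache

cache = []
add(cache, b"\x01" * 20, 0o100644, "readme")
```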
135 def add_unchecked(self, binsha, mode, name):
165 def add_unchecked(self, binsha: bytes, mode: int, name: str) -> None:
136166 """Add the given item to the tree, its correctness is assumed, which
137167 puts the caller into responsibility to assure the input is correct.
138168 For more information on the parameters, see ``add``
139169 :param binsha: 20 byte binary sha"""
140 self._cache.append((binsha, mode, name))
141
142 def __delitem__(self, name):
170 assert isinstance(binsha, bytes) and isinstance(mode, int) and isinstance(name, str)
171 tree_cache = (binsha, mode, name)
172
173 self._cache.append(tree_cache)
174
175 def __delitem__(self, name: str) -> None:
143176 """Deletes an item with the given name if it exists"""
144177 index = self._index_by_name(name)
145178 if index > -1:
148181 #} END mutators
149182
150183
151 class Tree(IndexObject, diff.Diffable, util.Traversable, util.Serializable):
184 class Tree(IndexObject, git_diff.Diffable, util.Traversable, util.Serializable):
152185
153186 """Tree objects represent an ordered list of Blobs and other Trees.
154187
161194 blob = tree[0]
162195 """
163196
164 type = "tree"
197 type: Literal['tree'] = "tree"
165198 __slots__ = "_cache"
166199
167200 # actual integer ids for comparison
170203 symlink_id = 0o12
171204 tree_id = 0o04
172205
173 _map_id_to_type = {
206 _map_id_to_type: Dict[int, Type[IndexObjUnion]] = {
174207 commit_id: Submodule,
175208 blob_id: Blob,
176209 symlink_id: Blob
177210 # tree id added once Tree is defined
178211 }
179212
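The id scheme behind `_map_id_to_type` above: the upper octal digits of a tree-entry mode select the object type, so `mode >> 12` yields exactly the class ids defined a few lines up (blob `0o10`, symlink `0o12`, tree `0o04`, commit/submodule `0o16`). A sketch with the standard git modes:

```python
def type_id(mode: int) -> int:
    """Shift off the permission bits, keeping the object-type digits."""
    return mode >> 12

# standard git tree-entry modes
blob_mode      = 0o100644
symlink_mode   = 0o120000
tree_mode      = 0o040000
submodule_mode = 0o160000  # gitlink / commit entry
```

This is the same shift `Tree.__getitem__` applies to `info[1]` before indexing `_map_id_to_type`.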
180 def __init__(self, repo, binsha, mode=tree_id << 12, path=None):
213 def __init__(self, repo: 'Repo', binsha: bytes, mode: int = tree_id << 12, path: Union[PathLike, None] = None):
181214 super(Tree, self).__init__(repo, binsha, mode, path)
182215
183 @classmethod
184 def _get_intermediate_items(cls, index_object):
216 @classmethod
217 def _get_intermediate_items(cls, index_object: IndexObjUnion,
218 ) -> Union[Tuple['Tree', ...], Tuple[()]]:
185219 if index_object.type == "tree":
186220 return tuple(index_object._iter_convert_to_object(index_object._cache))
187221 return ()
188222
189 def _set_cache_(self, attr):
223 def _set_cache_(self, attr: str) -> None:
190224 if attr == "_cache":
191225 # Set the data when we need it
192226 ostream = self.repo.odb.stream(self.binsha)
193 self._cache = tree_entries_from_data(ostream.read())
227 self._cache: List[TreeCacheTup] = tree_entries_from_data(ostream.read())
194228 else:
195229 super(Tree, self)._set_cache_(attr)
196230 # END handle attribute
197231
198 def _iter_convert_to_object(self, iterable):
232 def _iter_convert_to_object(self, iterable: Iterable[TreeCacheTup]
233 ) -> Iterator[IndexObjUnion]:
199234 """Iterable yields tuples of (binsha, mode, name), which will be converted
200235 to the respective object representation"""
201236 for binsha, mode, name in iterable:
206241 raise TypeError("Unknown mode %o found in tree data for path '%s'" % (mode, path)) from e
207242 # END for each item
208243
209 def join(self, file):
244 def join(self, file: str) -> IndexObjUnion:
210245 """Find the named object in this tree's contents
211246 :return: ``git.Blob`` or ``git.Tree`` or ``git.Submodule``
212247
239274 raise KeyError(msg % file)
240275 # END handle long paths
241276
242 def __div__(self, file):
243 """For PY2 only"""
244 return self.join(file)
245
246 def __truediv__(self, file):
277 def __truediv__(self, file: str) -> IndexObjUnion:
247278 """For PY3 only"""
248279 return self.join(file)
249280
250 @property
251 def trees(self):
281 @property
282 def trees(self) -> List['Tree']:
252283 """:return: list(Tree, ...) list of trees directly below this tree"""
253284 return [i for i in self if i.type == "tree"]
254285
255 @property
256 def blobs(self):
286 @property
287 def blobs(self) -> List[Blob]:
257288 """:return: list(Blob, ...) list of blobs directly below this tree"""
258289 return [i for i in self if i.type == "blob"]
259290
260 @property
261 def cache(self):
291 @property
292 def cache(self) -> TreeModifier:
262293 """
263294 :return: An object allowing to modify the internal cache. This can be used
264295 to change the tree's contents. When done, make sure you call ``set_done``
266297 See the ``TreeModifier`` for more information on how to alter the cache"""
267298 return TreeModifier(self._cache)
268299
269 def traverse(self, predicate=lambda i, d: True,
270 prune=lambda i, d: False, depth=-1, branch_first=True,
271 visit_once=False, ignore_self=1):
272 """For documentation, see util.Traversable.traverse
300 def traverse(self, # type: ignore[override]
301 predicate: Callable[[Union[IndexObjUnion, TraversedTreeTup], int], bool] = lambda i, d: True,
302 prune: Callable[[Union[IndexObjUnion, TraversedTreeTup], int], bool] = lambda i, d: False,
303 depth: int = -1,
304 branch_first: bool = True,
305 visit_once: bool = False,
306 ignore_self: int = 1,
307 as_edge: bool = False
308 ) -> Union[Iterator[IndexObjUnion],
309 Iterator[TraversedTreeTup]]:
310 """For documentation, see util.Traversable._traverse()
273311 Trees are set to visit_once = False to gain more performance in the traversal"""
274 return super(Tree, self).traverse(predicate, prune, depth, branch_first, visit_once, ignore_self)
312
313 # """
314 # # To typecheck instead of using cast.
315 # import itertools
316 # def is_tree_traversed(inp: Tuple) -> TypeGuard[Tuple[Iterator[Union['Tree', 'Blob', 'Submodule']]]]:
317 # return all(isinstance(x, (Blob, Tree, Submodule)) for x in inp[1])
318
319 # ret = super(Tree, self).traverse(predicate, prune, depth, branch_first, visit_once, ignore_self)
320 # ret_tup = itertools.tee(ret, 2)
321 # assert is_tree_traversed(ret_tup), f"Type is {[type(x) for x in list(ret_tup[0])]}"
322 # return ret_tup[0]"""
323 return cast(Union[Iterator[IndexObjUnion], Iterator[TraversedTreeTup]],
324 super(Tree, self)._traverse(predicate, prune, depth, # type: ignore
325 branch_first, visit_once, ignore_self))
326
327 def list_traverse(self, *args: Any, **kwargs: Any) -> IterableList[IndexObjUnion]:
328 """
329 :return: IterableList with the results of the traversal as produced by
330 traverse()
331 Tree -> IterableList[Union['Submodule', 'Tree', 'Blob']]
332 """
333 return super(Tree, self)._list_traverse(*args, **kwargs)
275334
276335 # List protocol
277 def __getslice__(self, i, j):
336
337 def __getslice__(self, i: int, j: int) -> List[IndexObjUnion]:
278338 return list(self._iter_convert_to_object(self._cache[i:j]))
279339
280 def __iter__(self):
340 def __iter__(self) -> Iterator[IndexObjUnion]:
281341 return self._iter_convert_to_object(self._cache)
282342
283 def __len__(self):
343 def __len__(self) -> int:
284344 return len(self._cache)
285345
286 def __getitem__(self, item):
346 def __getitem__(self, item: Union[str, int, slice]) -> IndexObjUnion:
287347 if isinstance(item, int):
288348 info = self._cache[item]
289349 return self._map_id_to_type[info[1] >> 12](self.repo, info[0], info[1], join_path(self.path, info[2]))
295355
296356 raise TypeError("Invalid index type: %r" % item)
297357
298 def __contains__(self, item):
358 def __contains__(self, item: Union[IndexObjUnion, PathLike]) -> bool:
299359 if isinstance(item, IndexObject):
300360 for info in self._cache:
301361 if item.binsha == info[0]:
306366 # compatibility
307367
308368 # treat item as repo-relative path
309 path = self.path
310 for info in self._cache:
311 if item == join_path(path, info[2]):
312 return True
369 else:
370 path = self.path
371 for info in self._cache:
372 if item == join_path(path, info[2]):
373 return True
313374 # END for each item
314375 return False
315376
316 def __reversed__(self):
317 return reversed(self._iter_convert_to_object(self._cache))
318
319 def _serialize(self, stream):
377 def __reversed__(self) -> Iterator[IndexObjUnion]:
378 return reversed(self._iter_convert_to_object(self._cache)) # type: ignore
379
380 def _serialize(self, stream: 'BytesIO') -> 'Tree':
320381 """Serialize this tree into the stream. Please note that we will assume
321382 our tree data to be in a sorted state. If this is not the case, serialization
322383 will not generate a correct tree representation as these are assumed to be sorted
324385 tree_to_stream(self._cache, stream.write)
325386 return self
326387
327 def _deserialize(self, stream):
388 def _deserialize(self, stream: 'BytesIO') -> 'Tree':
328389 self._cache = tree_entries_from_data(stream.read())
329390 return self
330391
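The newly typed `__getitem__`/`__contains__` pair above dispatches on `int`, `slice` and path string. A minimal stand-alone model of that dispatch (hypothetical `MiniTree` over `(hexsha, mode, name)` tuples; GitPython's real cache entries and sha-to-object conversion are omitted):

```python
from typing import List, Tuple, Union

Entry = Tuple[str, int, str]  # (hexsha, mode, name) - simplified cache record

class MiniTree:
    """Sketch of Tree's list protocol: index by position, slice, or name."""

    def __init__(self, entries: List[Entry]) -> None:
        self._cache = entries

    def __getitem__(self, item: Union[int, slice, str]) -> Union[Entry, List[Entry]]:
        if isinstance(item, (int, slice)):
            return self._cache[item]          # positional / slice access
        if isinstance(item, str):
            for info in self._cache:          # name-based lookup, like tree["file"]
                if info[2] == item:
                    return info
            raise KeyError("Blob or Tree named %r not found" % item)
        raise TypeError("Invalid index type: %r" % item)

    def __contains__(self, item: str) -> bool:
        return any(info[2] == item for info in self._cache)

    def __len__(self) -> int:
        return len(self._cache)
```

The real class additionally accepts `IndexObject` instances in `__contains__`, comparing by `binsha`, as the hunk above shows.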
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
55 """Module for general utility functions"""
6
7 from abc import abstractmethod
8 import warnings
69 from git.util import (
710 IterableList,
11 IterableObj,
812 Actor
913 )
1014
1115 import re
12 from collections import deque as Deque
16 from collections import deque
1317
1418 from string import digits
1519 import time
1620 import calendar
1721 from datetime import datetime, timedelta, tzinfo
1822
23 # typing ------------------------------------------------------------
24 from typing import (Any, Callable, Deque, Iterator, NamedTuple, overload, Sequence,
25 TYPE_CHECKING, Tuple, Type, TypeVar, Union, cast)
26
27 from git.types import Has_id_attribute, Literal, Protocol, runtime_checkable
28
29 if TYPE_CHECKING:
30 from io import BytesIO, StringIO
31 from .commit import Commit
32 from .blob import Blob
33 from .tag import TagObject
34 from .tree import Tree, TraversedTreeTup
35 from subprocess import Popen
36 from .submodule.base import Submodule
37
38
39 class TraverseNT(NamedTuple):
40 depth: int
41 item: Union['Traversable', 'Blob']
42 src: Union['Traversable', None]
43
44
45 T_TIobj = TypeVar('T_TIobj', bound='TraversableIterableObj') # for TraversableIterableObj.traverse()
46
47 TraversedTup = Union[Tuple[Union['Traversable', None], 'Traversable'], # for commit, submodule
48 'TraversedTreeTup'] # for tree.traverse()
49
50 # --------------------------------------------------------------------
51
1952 __all__ = ('get_object_type_by_name', 'parse_date', 'parse_actor_and_date',
2053 'ProcessStreamAdapter', 'Traversable', 'altz_to_utctz_str', 'utctz_to_altz',
2154 'verify_utctz', 'Actor', 'tzoffset', 'utc')
2558 #{ Functions
2659
2760
28 def mode_str_to_int(modestr):
61 def mode_str_to_int(modestr: Union[bytes, str]) -> int:
2962 """
3063 :param modestr: string like 755 or 644 or 100644 - only the last 6 chars will be used
3164 :return:
3568 for example."""
3669 mode = 0
3770 for iteration, char in enumerate(reversed(modestr[-6:])):
71 char = cast(Union[str, int], char)
3872 mode += int(char) << iteration * 3
3973 # END for each char
4074 return mode
4175
4276
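`mode_str_to_int` assembles the octal mode three bits per digit. A self-contained sketch of the same logic, with an explicit guard for the bytes case (iterating over `bytes` yields ints in Python 3; that guard is my addition, not upstream code):

```python
from typing import Union

def mode_str_to_int(modestr: Union[bytes, str]) -> int:
    """Convert a mode string like '755', '644' or '100644' to its int value.

    Only the last 6 characters are used; each octal digit contributes 3 bits.
    """
    mode = 0
    for iteration, char in enumerate(reversed(modestr[-6:])):
        # bytes iteration yields ints (e.g. 52 for b'4'), str yields '4'
        digit = int(chr(char)) if isinstance(char, int) else int(char)
        mode += digit << iteration * 3
    return mode
```

For example, `mode_str_to_int('100644')` equals `0o100644`, git's mode for a regular non-executable file.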
43 def get_object_type_by_name(object_type_name):
77 def get_object_type_by_name(object_type_name: bytes
78 ) -> Union[Type['Commit'], Type['TagObject'], Type['Tree'], Type['Blob']]:
4479 """
4580 :return: type suitable to handle the given object type name.
4681 Use the type to create new instances.
6196 from . import tree
6297 return tree.Tree
6398 else:
64 raise ValueError("Cannot handle unknown object type: %s" % object_type_name)
65
66
67 def utctz_to_altz(utctz):
99 raise ValueError("Cannot handle unknown object type: %s" % object_type_name.decode())
100
101
102 def utctz_to_altz(utctz: str) -> int:
68103 """we convert utctz to the timezone in seconds, it is the format time.altzone
69104 returns. Git stores it as UTC timezone which has the opposite sign as well,
70105 which explains the -1 * ( that was made explicit here )
72107 return -1 * int(float(utctz) / 100 * 3600)
73108
74109
75 def altz_to_utctz_str(altz):
110 def altz_to_utctz_str(altz: float) -> str:
76111 """As above, but inverses the operation, returning a string that can be used
77112 in commit objects"""
78113 utci = -1 * int((float(altz) / 3600) * 100)
82117 return prefix + utcs
83118
84119
85 def verify_utctz(offset):
120 def verify_utctz(offset: str) -> str:
86121 """:raise ValueError: if offset is incorrect
87122 :return: offset"""
88123 fmt_exc = ValueError("Invalid timezone offset format: %s" % offset)
100135
101136
102137 class tzoffset(tzinfo):
103 def __init__(self, secs_west_of_utc, name=None):
138
139 def __init__(self, secs_west_of_utc: float, name: Union[None, str] = None) -> None:
104140 self._offset = timedelta(seconds=-secs_west_of_utc)
105141 self._name = name or 'fixed'
106142
107 def __reduce__(self):
143 def __reduce__(self) -> Tuple[Type['tzoffset'], Tuple[float, str]]:
108144 return tzoffset, (-self._offset.total_seconds(), self._name)
109145
110 def utcoffset(self, dt):
146 def utcoffset(self, dt: Union[datetime, None]) -> timedelta:
111147 return self._offset
112148
113 def tzname(self, dt):
149 def tzname(self, dt: Union[datetime, None]) -> str:
114150 return self._name
115151
116 def dst(self, dt):
152 def dst(self, dt: Union[datetime, None]) -> timedelta:
117153 return ZERO
118154
119155
120156 utc = tzoffset(0, 'UTC')
121157
122158
123 def from_timestamp(timestamp, tz_offset):
159 def from_timestamp(timestamp: float, tz_offset: float) -> datetime:
124160 """Converts a timestamp + tz_offset into an aware datetime instance."""
125161 utc_dt = datetime.fromtimestamp(timestamp, utc)
126162 try:
130166 return utc_dt
131167
132168
133 def parse_date(string_date):
169 def parse_date(string_date: Union[str, datetime]) -> Tuple[int, int]:
134170 """
135171 Parse the given date as one of the following
136172
144180 :raise ValueError: If the format could not be understood
145181 :note: Date can also be YYYY.MM.DD, MM/DD/YYYY and DD.MM.YYYY.
146182 """
147 if isinstance(string_date, datetime) and string_date.tzinfo:
148 offset = -int(string_date.utcoffset().total_seconds())
149 return int(string_date.astimezone(utc).timestamp()), offset
183 if isinstance(string_date, datetime):
184 if string_date.tzinfo:
185 utcoffset = cast(timedelta, string_date.utcoffset()) # typeguard, if tzinfo is not None
186 offset = -int(utcoffset.total_seconds())
187 return int(string_date.astimezone(utc).timestamp()), offset
188 else:
189 raise ValueError(f"string_date datetime object without tzinfo, {string_date}")
150190
151191 # git time
152192 try:
153193 if string_date.count(' ') == 1 and string_date.rfind(':') == -1:
154 timestamp, offset = string_date.split()
194 timestamp, offset_str = string_date.split()
155195 if timestamp.startswith('@'):
156196 timestamp = timestamp[1:]
157 timestamp = int(timestamp)
158 return timestamp, utctz_to_altz(verify_utctz(offset))
197 timestamp_int = int(timestamp)
198 return timestamp_int, utctz_to_altz(verify_utctz(offset_str))
159199 else:
160 offset = "+0000" # local time by default
200 offset_str = "+0000" # local time by default
161201 if string_date[-5] in '-+':
162 offset = verify_utctz(string_date[-5:])
202 offset_str = verify_utctz(string_date[-5:])
163203 string_date = string_date[:-6] # skip space as well
164204 # END split timezone info
165 offset = utctz_to_altz(offset)
205 offset = utctz_to_altz(offset_str)
166206
167207 # now figure out the date and time portion - split time
168208 date_formats = []
208248 raise ValueError("no format matched")
209249 # END handle format
210250 except Exception as e:
211 raise ValueError("Unsupported date format: %s" % string_date) from e
251 raise ValueError(f"Unsupported date format or type: {string_date}, type={type(string_date)}") from e
212252 # END handle exceptions
213253
214254
217257 _re_only_actor = re.compile(r'^.+? (.*)$')
218258
219259
220 def parse_actor_and_date(line):
260 def parse_actor_and_date(line: str) -> Tuple[Actor, int, int]:
221261 """Parse out the actor (author or committer) info from a line like::
222262
223263 author Tom Preston-Werner <tom@mojombo.com> 1191999972 -0700
224264
225265 :return: [Actor, int_seconds_since_epoch, int_timezone_offset]"""
226 actor, epoch, offset = '', 0, 0
266 actor, epoch, offset = '', '0', '0'
227267 m = _re_actor_epoch.search(line)
228268 if m:
229269 actor, epoch, offset = m.groups()
246286 it if the instance goes out of scope."""
247287 __slots__ = ("_proc", "_stream")
248288
249 def __init__(self, process, stream_name):
289 def __init__(self, process: 'Popen', stream_name: str) -> None:
250290 self._proc = process
251 self._stream = getattr(process, stream_name)
252
253 def __getattr__(self, attr):
291 self._stream: StringIO = getattr(process, stream_name) # guessed type
292
293 def __getattr__(self, attr: str) -> Any:
254294 return getattr(self._stream, attr)
255295
256296
257 class Traversable(object):
297 @runtime_checkable
298 class Traversable(Protocol):
258299
259300 """Simple interface to perform depth-first or breadth-first traversals
260301 into one direction.
261302 Subclasses only need to implement one function.
262 Instances of the Subclass must be hashable"""
303 Instances of the Subclass must be hashable
304
305 Defined subclasses = [Commit, Tree, Submodule]
306 """
263307 __slots__ = ()
264308
265309 @classmethod
266 def _get_intermediate_items(cls, item):
310 @abstractmethod
311 def _get_intermediate_items(cls, item: Any) -> Sequence['Traversable']:
267312 """
268313 Returns:
269 List of items connected to the given item.
314 Tuple of items connected to the given item.
270315 Must be implemented in subclass
316
317 class Commit:: (cls, Commit) -> Tuple[Commit, ...]
318 class Submodule:: (cls, Submodule) -> IterableList[Submodule]
319 class Tree:: (cls, Tree) -> Tuple[Tree, ...]
271320 """
272321 raise NotImplementedError("To be implemented in subclass")
273322
274 def list_traverse(self, *args, **kwargs):
323 @abstractmethod
324 def list_traverse(self, *args: Any, **kwargs: Any) -> Any:
325 """ """
326 warnings.warn("list_traverse() method should only be called from subclasses. "
327 "Calling from Traversable abstract class will raise NotImplementedError in 3.1.20. "
328 "Builtin subclasses are 'Submodule', 'Tree' and 'Commit'.",
329 DeprecationWarning,
330 stacklevel=2)
331 return self._list_traverse(*args, **kwargs)
332
333 def _list_traverse(self, as_edge: bool = False, *args: Any, **kwargs: Any
334 ) -> IterableList[Union['Commit', 'Submodule', 'Tree', 'Blob']]:
275335 """
276336 :return: IterableList with the results of the traversal as produced by
277 traverse()"""
278 out = IterableList(self._id_attribute_)
279 out.extend(self.traverse(*args, **kwargs))
280 return out
281
282 def traverse(self, predicate=lambda i, d: True,
283 prune=lambda i, d: False, depth=-1, branch_first=True,
284 visit_once=True, ignore_self=1, as_edge=False):
337 traverse()
338 Commit -> IterableList['Commit']
339 Submodule -> IterableList['Submodule']
340 Tree -> IterableList[Union['Submodule', 'Tree', 'Blob']]
341 """
342 # Commit and Submodule have _id_attribute_ as IterableObj
343 # Tree has _id_attribute_ inherited from IndexObject
344 if isinstance(self, Has_id_attribute):
345 id = self._id_attribute_
346 else:
347 id = "" # shouldn't reach here, unless Traversable subclass created with no _id_attribute_
348 # could add _id_attribute_ to Traversable, or make all Traversable also Iterable?
349
350 if not as_edge:
351 out: IterableList[Union['Commit', 'Submodule', 'Tree', 'Blob']] = IterableList(id)
352 out.extend(self.traverse(as_edge=as_edge, *args, **kwargs))
353 return out
354 # overloads in subclasses (mypy doesn't allow typing self: subclass)
355 # Union[IterableList['Commit'], IterableList['Submodule'], IterableList[Union['Submodule', 'Tree', 'Blob']]]
356 else:
357 # Raise DeprecationWarning; it doesn't make sense to use this
358 out_list: IterableList = IterableList(self.traverse(*args, **kwargs))
359 return out_list
360
361 @ abstractmethod
362 def traverse(self, *args: Any, **kwargs: Any) -> Any:
363 """ """
364 warnings.warn("traverse() method should only be called from subclasses. "
365 "Calling from Traversable abstract class will raise NotImplementedError in 3.1.20. "
366 "Builtin subclasses are 'Submodule', 'Tree' and 'Commit'.",
367 DeprecationWarning,
368 stacklevel=2)
369 return self._traverse(*args, **kwargs)
370
371 def _traverse(self,
372 predicate: Callable[[Union['Traversable', 'Blob', TraversedTup], int], bool] = lambda i, d: True,
373 prune: Callable[[Union['Traversable', 'Blob', TraversedTup], int], bool] = lambda i, d: False,
374 depth: int = -1, branch_first: bool = True, visit_once: bool = True,
375 ignore_self: int = 1, as_edge: bool = False
376 ) -> Union[Iterator[Union['Traversable', 'Blob']],
377 Iterator[TraversedTup]]:
285378 """:return: iterator yielding of items found when traversing self
286
287379 :param predicate: f(i,d) returns False if item i at depth d should not be included in the result
288380
289381 :param prune:
312404 if True, return a pair of items, first being the source, second the
313405 destination, i.e. tuple(src, dest) with the edge spanning from
314406 source to destination"""
407
408 """
409 Commit -> Iterator[Union[Commit, Tuple[Commit, Commit]]
410 Submodule -> Iterator[Submodule, Tuple[Submodule, Submodule]]
411 Tree -> Iterator[Union[Blob, Tree, Submodule,
412 Tuple[Union[Submodule, Tree], Union[Blob, Tree, Submodule]]]
413
414 ignore_self=True as_edge=True -> Iterator[item]
415 ignore_self=True as_edge=False -> Iterator[item]
416 ignore_self=False as_edge=True -> Iterator[item] | Iterator[Tuple[src, item]]
417 ignore_self=False as_edge=False -> Iterator[Tuple[src, item]]"""
418
315419 visited = set()
316 stack = Deque()
317 stack.append((0, self, None)) # self is always depth level 0
318
319 def addToStack(stack, item, branch_first, depth):
420 stack: Deque[TraverseNT] = deque()
421 stack.append(TraverseNT(0, self, None)) # self is always depth level 0
422
423 def addToStack(stack: Deque[TraverseNT],
424 src_item: 'Traversable',
425 branch_first: bool,
426 depth: int) -> None:
320427 lst = self._get_intermediate_items(item)
321 if not lst:
322 return
428 if not lst: # empty list
429 return None
323430 if branch_first:
324 stack.extendleft((depth, i, item) for i in lst)
431 stack.extendleft(TraverseNT(depth, i, src_item) for i in lst)
325432 else:
326 reviter = ((depth, lst[i], item) for i in range(len(lst) - 1, -1, -1))
433 reviter = (TraverseNT(depth, lst[i], src_item) for i in range(len(lst) - 1, -1, -1))
327434 stack.extend(reviter)
328435 # END addToStack local method
329436
336443 if visit_once:
337444 visited.add(item)
338445
339 rval = (as_edge and (src, item)) or item
446 rval: Union[TraversedTup, 'Traversable', 'Blob']
447 if as_edge: # if as_edge return (src, item) unless src is None (e.g. for first item)
448 rval = (src, item)
449 else:
450 rval = item
451
340452 if prune(rval, d):
341453 continue
342454
353465 # END for each item on work stack
354466
355467
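The typed `_traverse` above keeps the same deque algorithm as before: push `TraverseNT` records, pop, and filter through `predicate`/`prune`. Stripped of the predicates and edge handling, the core walk looks roughly like this (a simplified sketch over a plain adjacency dict, not the GitPython method itself):

```python
from collections import deque
from typing import Dict, Iterator, List, Set

def traverse(graph: Dict[str, List[str]], root: str,
             branch_first: bool = True) -> Iterator[str]:
    # deque-driven walk with a visit-once set, like Traversable._traverse
    visited: Set[str] = set()
    stack = deque([root])
    while stack:
        item = stack.popleft()
        if item in visited:
            continue
        visited.add(item)
        yield item
        children = graph.get(item, [])
        if branch_first:
            stack.extend(children)                # breadth-first: queue at the back
        else:
            stack.extendleft(reversed(children))  # depth-first: popped next, in order
    # END while stack non-empty
```

`extendleft` reverses what it is given, which is why the depth-first branch feeds it `reversed(children)` to preserve child order; the real code uses the equivalent generator over `range(len(lst) - 1, -1, -1)`.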
356 class Serializable(object):
468 @ runtime_checkable
469 class Serializable(Protocol):
357470
358471 """Defines methods to serialize and deserialize objects from and into a data stream"""
359472 __slots__ = ()
360473
361 def _serialize(self, stream):
474 # @abstractmethod
475 def _serialize(self, stream: 'BytesIO') -> 'Serializable':
362476 """Serialize the data of this object into the given data stream
363477 :note: a serialized object would ``_deserialize`` into the same object
364478 :param stream: a file-like object
365479 :return: self"""
366480 raise NotImplementedError("To be implemented in subclass")
367481
368 def _deserialize(self, stream):
482 # @abstractmethod
483 def _deserialize(self, stream: 'BytesIO') -> 'Serializable':
369484 """Deserialize all information regarding this object from the stream
370485 :param stream: a file-like object
371486 :return: self"""
372487 raise NotImplementedError("To be implemented in subclass")
488
489
490 class TraversableIterableObj(IterableObj, Traversable):
491 __slots__ = ()
492
493 TIobj_tuple = Tuple[Union[T_TIobj, None], T_TIobj]
494
495 def list_traverse(self: T_TIobj, *args: Any, **kwargs: Any) -> IterableList[T_TIobj]:
496 return super(TraversableIterableObj, self)._list_traverse(*args, **kwargs)
497
498 @ overload # type: ignore
499 def traverse(self: T_TIobj
500 ) -> Iterator[T_TIobj]:
501 ...
502
503 @ overload
504 def traverse(self: T_TIobj,
505 predicate: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool],
506 prune: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool],
507 depth: int, branch_first: bool, visit_once: bool,
508 ignore_self: Literal[True],
509 as_edge: Literal[False],
510 ) -> Iterator[T_TIobj]:
511 ...
512
513 @ overload
514 def traverse(self: T_TIobj,
515 predicate: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool],
516 prune: Callable[[Union[T_TIobj, Tuple[Union[T_TIobj, None], T_TIobj]], int], bool],
517 depth: int, branch_first: bool, visit_once: bool,
518 ignore_self: Literal[False],
519 as_edge: Literal[True],
520 ) -> Iterator[Tuple[Union[T_TIobj, None], T_TIobj]]:
521 ...
522
523 @ overload
524 def traverse(self: T_TIobj,
525 predicate: Callable[[Union[T_TIobj, TIobj_tuple], int], bool],
526 prune: Callable[[Union[T_TIobj, TIobj_tuple], int], bool],
527 depth: int, branch_first: bool, visit_once: bool,
528 ignore_self: Literal[True],
529 as_edge: Literal[True],
530 ) -> Iterator[Tuple[T_TIobj, T_TIobj]]:
531 ...
532
533 def traverse(self: T_TIobj,
534 predicate: Callable[[Union[T_TIobj, TIobj_tuple], int],
535 bool] = lambda i, d: True,
536 prune: Callable[[Union[T_TIobj, TIobj_tuple], int],
537 bool] = lambda i, d: False,
538 depth: int = -1, branch_first: bool = True, visit_once: bool = True,
539 ignore_self: int = 1, as_edge: bool = False
540 ) -> Union[Iterator[T_TIobj],
541 Iterator[Tuple[T_TIobj, T_TIobj]],
542 Iterator[TIobj_tuple]]:
543 """For documentation, see util.Traversable._traverse()"""
544
545 """
546 # To typecheck instead of using cast.
547 import itertools
548 from git.types import TypeGuard
549 def is_commit_traversed(inp: Tuple) -> TypeGuard[Tuple[Iterator[Tuple['Commit', 'Commit']]]]:
550 for x in inp[1]:
551 if not isinstance(x, tuple) and len(x) != 2:
552 if all(isinstance(inner, Commit) for inner in x):
553 continue
554 return True
555
556 ret = super(Commit, self).traverse(predicate, prune, depth, branch_first, visit_once, ignore_self, as_edge)
557 ret_tup = itertools.tee(ret, 2)
558 assert is_commit_traversed(ret_tup), f"{[type(x) for x in list(ret_tup[0])]}"
559 return ret_tup[0]
560 """
561 return cast(Union[Iterator[T_TIobj],
562 Iterator[Tuple[Union[None, T_TIobj], T_TIobj]]],
563 super(TraversableIterableObj, self)._traverse(
564 predicate, prune, depth, branch_first, visit_once, ignore_self, as_edge # type: ignore
565 ))
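The two timezone helpers annotated above are inverses of each other. Copied out of context they can be exercised directly (note that, as upstream, the arithmetic only round-trips whole-hour offsets; half-hour zones like `+0530` are distorted by the `/ 100 * 3600` conversion):

```python
def utctz_to_altz(utctz: str) -> int:
    """'+0200' -> -7200: seconds west of UTC, matching time.altzone's sign."""
    return -1 * int(float(utctz) / 100 * 3600)

def altz_to_utctz_str(altz: float) -> str:
    """-7200 -> '+0200': the inverse, as stored in commit objects."""
    utci = -1 * int((float(altz) / 3600) * 100)
    utcs = str(abs(utci)).rjust(4, "0")
    prefix = (utci < 0 and "-") or "+"
    return prefix + utcs
```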
(New empty file)
00 # flake8: noqa
1 from __future__ import absolute_import
21 # import all modules in order, fix the names they require
32 from .symbolic import *
43 from .reference import *
0 from git.config import SectionConstraint
0 from git.config import GitConfigParser, SectionConstraint
11 from git.util import join_path
22 from git.exc import GitCommandError
33
44 from .symbolic import SymbolicReference
55 from .reference import Reference
66
7 # typing ---------------------------------------------------
8
9 from typing import Any, Sequence, Union, TYPE_CHECKING
10
11 from git.types import PathLike, Commit_ish
12
13 if TYPE_CHECKING:
14 from git.repo import Repo
15 from git.objects import Commit
16 from git.refs import RemoteReference
17
18 # -------------------------------------------------------------------
19
720 __all__ = ["HEAD", "Head"]
821
922
10 def strip_quotes(string):
23 def strip_quotes(string: str) -> str:
1124 if string.startswith('"') and string.endswith('"'):
1225 return string[1:-1]
1326 return string
14
27
1528
1629 class HEAD(SymbolicReference):
1730
2134 _ORIG_HEAD_NAME = 'ORIG_HEAD'
2235 __slots__ = ()
2336
24 def __init__(self, repo, path=_HEAD_NAME):
37 def __init__(self, repo: 'Repo', path: PathLike = _HEAD_NAME):
2538 if path != self._HEAD_NAME:
2639 raise ValueError("HEAD instance must point to %r, got %r" % (self._HEAD_NAME, path))
2740 super(HEAD, self).__init__(repo, path)
28
29 def orig_head(self):
41 self.commit: 'Commit'
42
43 def orig_head(self) -> SymbolicReference:
3044 """
3145 :return: SymbolicReference pointing at the ORIG_HEAD, which is maintained
3246 to contain the previous value of HEAD"""
3347 return SymbolicReference(self.repo, self._ORIG_HEAD_NAME)
3448
35 def reset(self, commit='HEAD', index=True, working_tree=False,
36 paths=None, **kwargs):
49 def reset(self, commit: Union[Commit_ish, SymbolicReference, str] = 'HEAD',
50 index: bool = True, working_tree: bool = False,
51 paths: Union[PathLike, Sequence[PathLike], None] = None, **kwargs: Any) -> 'HEAD':
3752 """Reset our HEAD to the given commit optionally synchronizing
3853 the index and working tree. The reference we refer to will be set to
3954 commit as well.
5974 Additional arguments passed to git-reset.
6075
6176 :return: self"""
77 mode: Union[str, None]
6278 mode = "--soft"
6379 if index:
6480 mode = "--mixed"
112128 k_config_remote_ref = "merge" # branch to merge from remote
113129
114130 @classmethod
115 def delete(cls, repo, *heads, **kwargs):
131 def delete(cls, repo: 'Repo', *heads: 'Head', force: bool = False, **kwargs: Any) -> None:
116132 """Delete the given heads
117133
118134 :param force:
119135 If True, the heads will be deleted even if they are not yet merged into
120136 the main development stream.
121137 Default False"""
122 force = kwargs.get("force", False)
123138 flag = "-d"
124139 if force:
125140 flag = "-D"
126141 repo.git.branch(flag, *heads)
127142
128 def set_tracking_branch(self, remote_reference):
143 def set_tracking_branch(self, remote_reference: Union['RemoteReference', None]) -> 'Head':
129144 """
130145 Configure this branch to track the given remote reference. This will alter
131146 this branch's configuration accordingly.
150165
151166 return self
152167
153 def tracking_branch(self):
168 def tracking_branch(self) -> Union['RemoteReference', None]:
154169 """
155170 :return: The remote_reference we are tracking, or None if we are
156171 not a tracking branch"""
165180 # we are not a tracking branch
166181 return None
167182
168 def rename(self, new_path, force=False):
183 def rename(self, new_path: PathLike, force: bool = False) -> 'Head':
169184 """Rename self to a new path
170185
171186 :param new_path:
186201 self.path = "%s/%s" % (self._common_path_default, new_path)
187202 return self
188203
189 def checkout(self, force=False, **kwargs):
204 def checkout(self, force: bool = False, **kwargs: Any) -> Union['HEAD', 'Head']:
190205 """Checkout this head by setting the HEAD to this reference, by updating the index
191206 to reflect the tree we point to and by updating the working tree to reflect
192207 the latest index.
218233 self.repo.git.checkout(self, **kwargs)
219234 if self.repo.head.is_detached:
220235 return self.repo.head
221 return self.repo.active_branch
236 else:
237 return self.repo.active_branch
222238
223239 #{ Configuration
224 def _config_parser(self, read_only):
240 def _config_parser(self, read_only: bool) -> SectionConstraint[GitConfigParser]:
225241 if read_only:
226242 parser = self.repo.config_reader()
227243 else:
230246
231247 return SectionConstraint(parser, 'branch "%s"' % self.name)
232248
233 def config_reader(self):
249 def config_reader(self) -> SectionConstraint[GitConfigParser]:
234250 """
235251 :return: A configuration parser instance constrained to only read
236252 this instance's values"""
237253 return self._config_parser(read_only=True)
238254
239 def config_writer(self):
255 def config_writer(self) -> SectionConstraint[GitConfigParser]:
240256 """
241257 :return: A configuration writer instance with read-and write access
242258 to options of this head"""
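The new `delete` signature above replaces `kwargs.get("force", False)` with an explicit `force` parameter. The flag selection it encodes can be shown in isolation (hypothetical `branch_delete_args` helper for illustration, not part of GitPython's API):

```python
from typing import List

def branch_delete_args(*heads: str, force: bool = False) -> List[str]:
    # mirrors Head.delete: -D force-deletes unmerged branches, -d is the safe default
    flag = "-D" if force else "-d"
    return ["branch", flag, *heads]
```

Making `force` explicit means callers get a TypeError for misspelled keywords instead of a silently ignored option, while `**kwargs` still forwards everything else to git-branch.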
0
1 from mmap import mmap
02 import re
1 import time
3 import time as _time
24
35 from git.compat import defenc
46 from git.objects.util import (
1921 import os.path as osp
2022
2123
24 # typing ------------------------------------------------------------------
25
26 from typing import Iterator, List, Tuple, Union, TYPE_CHECKING
27
28 from git.types import PathLike
29
30 if TYPE_CHECKING:
31 from git.refs import SymbolicReference
32 from io import BytesIO
33 from git.config import GitConfigParser, SectionConstraint # NOQA
34
35 # ------------------------------------------------------------------------------
36
2237 __all__ = ["RefLog", "RefLogEntry"]
2338
2439
25 class RefLogEntry(tuple):
40 class RefLogEntry(Tuple[str, str, Actor, Tuple[int, int], str]):
2641
2742 """Named tuple allowing easy access to the revlog data fields"""
2843 _re_hexsha_only = re.compile('^[0-9A-Fa-f]{40}$')
2944 __slots__ = ()
3045
31 def __repr__(self):
46 def __repr__(self) -> str:
3247 """Representation of ourselves in git reflog format"""
3348 return self.format()
3449
35 def format(self):
50 def format(self) -> str:
3651 """:return: a string suitable to be placed in a reflog file"""
3752 act = self.actor
3853 time = self.time
4560 self.message)
4661
4762 @property
48 def oldhexsha(self):
63 def oldhexsha(self) -> str:
4964 """The hexsha to the commit the ref pointed to before the change"""
5065 return self[0]
5166
5267 @property
53 def newhexsha(self):
68 def newhexsha(self) -> str:
5469 """The hexsha to the commit the ref now points to, after the change"""
5570 return self[1]
5671
5772 @property
58 def actor(self):
73 def actor(self) -> Actor:
5974 """Actor instance, providing access"""
6075 return self[2]
6176
6277 @property
63 def time(self):
78 def time(self) -> Tuple[int, int]:
6479 """time as tuple:
6580
6681 * [0] = int(time)
6883 return self[3]
6984
7085 @property
71 def message(self):
86 def message(self) -> str:
7287 """Message describing the operation that acted on the reference"""
7388 return self[4]
7489
7590 @classmethod
76 def new(cls, oldhexsha, newhexsha, actor, time, tz_offset, message): # skipcq: PYL-W0621
91 def new(cls, oldhexsha: str, newhexsha: str, actor: Actor, time: int, tz_offset: int, message: str
92 ) -> 'RefLogEntry': # skipcq: PYL-W0621
7793 """:return: New instance of a RefLogEntry"""
7894 if not isinstance(actor, Actor):
7995 raise ValueError("Need actor instance, got %s" % actor)
8197 return RefLogEntry((oldhexsha, newhexsha, actor, (time, tz_offset), message))
8298
8399 @classmethod
84 def from_line(cls, line):
100 def from_line(cls, line: bytes) -> 'RefLogEntry':
85101 """:return: New RefLogEntry instance from the given revlog line.
86102 :param line: line bytes without trailing newline
87103 :raise ValueError: If line could not be parsed"""
88 line = line.decode(defenc)
89 fields = line.split('\t', 1)
104 line_str = line.decode(defenc)
105 fields = line_str.split('\t', 1)
90106 if len(fields) == 1:
91107 info, msg = fields[0], None
92108 elif len(fields) == 2:
93109 info, msg = fields
94110 else:
95111 raise ValueError("Line must have up to two TAB-separated fields."
96 " Got %s" % repr(line))
112 " Got %s" % repr(line_str))
97113 # END handle first split
98114
99115 oldhexsha = info[:40]
110126 # END handle missing end brace
111127
112128 actor = Actor._from_string(info[82:email_end + 1])
113 time, tz_offset = parse_date(info[email_end + 2:]) # skipcq: PYL-W0621
129 time, tz_offset = parse_date(
130 info[email_end + 2:]) # skipcq: PYL-W0621
114131
115132 return RefLogEntry((oldhexsha, newhexsha, actor, (time, tz_offset), msg))
116133
117134
118 class RefLog(list, Serializable):
119
120 """A reflog contains reflog entries, each of which defines a certain state
135 class RefLog(List[RefLogEntry], Serializable):
136
137 """A reflog contains RefLogEntrys, each of which defines a certain state
121138 of the head in question. Custom query methods allow to retrieve log entries
122139 by date or by other criteria.
123140
126143
127144 __slots__ = ('_path', )
128145
129 def __new__(cls, filepath=None):
146 def __new__(cls, filepath: Union[PathLike, None] = None) -> 'RefLog':
130147 inst = super(RefLog, cls).__new__(cls)
131148 return inst
132149
133 def __init__(self, filepath=None):
150 def __init__(self, filepath: Union[PathLike, None] = None):
134151 """Initialize this instance with an optional filepath, from which we will
135152 initialize our data. The path is also used to write changes back using
136153 the write() method"""
139156 self._read_from_file()
140157 # END handle filepath
141158
142 def _read_from_file(self):
159 def _read_from_file(self) -> None:
143160 try:
144 fmap = file_contents_ro_filepath(self._path, stream=True, allow_mmap=True)
161 fmap = file_contents_ro_filepath(
162 self._path, stream=True, allow_mmap=True)
145163 except OSError:
146164 # it is possible and allowed that the file doesn't exist !
147165 return
153171 fmap.close()
154172 # END handle closing of handle
155173
156 #{ Interface
157
158 @classmethod
159 def from_file(cls, filepath):
174 # { Interface
175
176 @classmethod
177 def from_file(cls, filepath: PathLike) -> 'RefLog':
160178 """
161179 :return: a new RefLog instance containing all entries from the reflog
162180 at the given filepath
165183 return cls(filepath)
166184
167185 @classmethod
168 def path(cls, ref):
186 def path(cls, ref: 'SymbolicReference') -> str:
169187 """
170188 :return: string to absolute path at which the reflog of the given ref
171189 instance would be found. The path is not guaranteed to point to a valid
174192 return osp.join(ref.repo.git_dir, "logs", to_native_path(ref.path))
175193
176194 @classmethod
177 def iter_entries(cls, stream):
195 def iter_entries(cls, stream: Union[str, 'BytesIO', mmap]) -> Iterator[RefLogEntry]:
178196 """
179197 :return: Iterator yielding RefLogEntry instances, one for each line read
180198 from the given stream.
181199 :param stream: file-like object containing the revlog in its native format
182 or basestring instance pointing to a file to read"""
200 or string instance pointing to a file to read"""
183201 new_entry = RefLogEntry.from_line
184202 if isinstance(stream, str):
185 stream = file_contents_ro_filepath(stream)
203 # default args return mmap on py3
204 _stream = file_contents_ro_filepath(stream)
205 assert isinstance(_stream, mmap)
206 else:
207 _stream = stream
186208 # END handle stream type
187209 while True:
188 line = stream.readline()
210 line = _stream.readline()
189211 if not line:
190212 return
191213 yield new_entry(line.strip())
192214 # END endless loop
193 stream.close()
194
195 @classmethod
196 def entry_at(cls, filepath, index):
197 """:return: RefLogEntry at the given index
215
216 @classmethod
217 def entry_at(cls, filepath: PathLike, index: int) -> 'RefLogEntry':
218 """
219 :return: RefLogEntry at the given index
220
198221 :param filepath: full path to the index file from which to read the entry
222
199223 :param index: python list compatible index, i.e. it may be negative to
200224 specify an entry counted from the end of the list
201225
209233 if index < 0:
210234 return RefLogEntry.from_line(fp.readlines()[index].strip())
211235 # read until index is reached
236
212237 for i in range(index + 1):
213238 line = fp.readline()
214239 if not line:
215 break
240 raise IndexError(
241 f"Index file ended at line {i+1}, before given index was reached")
216242 # END abort on eof
217243 # END handle runup
218244
219 if i != index or not line: # skipcq:PYL-W0631
220 raise IndexError
221 # END handle exception
222
223245 return RefLogEntry.from_line(line.strip())
224246 # END handle index
225247
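`entry_at` honors python-list index semantics: a negative index reads the whole file and indexes from the end, while a positive index stops as soon as the requested line is reached or the file ends. A minimal stand-in (hypothetical helper, not the real implementation):

```python
def line_at(lines, index):
    """Return the line at a python-list-compatible index."""
    if index < 0:
        return lines[index]  # negative index: read everything, count from end
    # positive index: scan forward, failing if the input runs out first
    for i, line in enumerate(lines):
        if i == index:
            return line
    raise IndexError(
        "File ended at line %d, before given index was reached" % len(lines))
```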
226 def to_file(self, filepath):
248 def to_file(self, filepath: PathLike) -> None:
227249 """Write the contents of the reflog instance to a file at the given filepath.
228250 :param filepath: path to file, parent directories are assumed to exist"""
229251 lfd = LockedFD(filepath)
240262 # END handle change
241263
242264 @classmethod
243 def append_entry(cls, config_reader, filepath, oldbinsha, newbinsha, message):
265 def append_entry(cls, config_reader: Union[Actor, 'GitConfigParser', 'SectionConstraint', None],
266 filepath: PathLike, oldbinsha: bytes, newbinsha: bytes, message: str,
267 write: bool = True) -> 'RefLogEntry':
244268 """Append a new log entry to the revlog at filepath.
245269
246270 :param config_reader: configuration reader of the repository - used to obtain
247 user information. May also be an Actor instance identifying the committer directly.
248 May also be None
271 user information. May also be an Actor instance identifying the committer directly or None.
249272 :param filepath: full path to the log file
250273 :param oldbinsha: binary sha of the previous commit
251274 :param newbinsha: binary sha of the current commit
252275 :param message: message describing the change to the reference
253276 :param write: If True, the changes will be written right away. Otherwise
254277 the change will not be written
278
255279 :return: RefLogEntry object which was appended to the log
280
256281 :note: As we are append-only, concurrent access is not a problem as we
257282 do not interfere with readers."""
283
258284 if len(oldbinsha) != 20 or len(newbinsha) != 20:
259285 raise ValueError("Shas need to be given in binary format")
260286 # END handle sha type
261287 assure_directory_exists(filepath, is_file=True)
262288 first_line = message.split('\n')[0]
263 committer = isinstance(config_reader, Actor) and config_reader or Actor.committer(config_reader)
289 if isinstance(config_reader, Actor):
290 committer = config_reader # mypy thinks this is Actor | GitConfigParser, but why?
291 else:
292 committer = Actor.committer(config_reader)
264293 entry = RefLogEntry((
265294 bin_to_hex(oldbinsha).decode('ascii'),
266295 bin_to_hex(newbinsha).decode('ascii'),
267 committer, (int(time.time()), time.altzone), first_line
296 committer, (int(_time.time()), _time.altzone), first_line
268297 ))
269298
270 lf = LockFile(filepath)
271 lf._obtain_lock_or_raise()
272 fd = open(filepath, 'ab')
273 try:
274 fd.write(entry.format().encode(defenc))
275 finally:
276 fd.close()
277 lf._release_lock()
278 # END handle write operation
279
299 if write:
300 lf = LockFile(filepath)
301 lf._obtain_lock_or_raise()
302 fd = open(filepath, 'ab')
303 try:
304 fd.write(entry.format().encode(defenc))
305 finally:
306 fd.close()
307 lf._release_lock()
308 # END handle write operation
280309 return entry
281310
282 def write(self):
311 def write(self) -> 'RefLog':
283312 """Write this instance's data to the file we are originating from
284313 :return: self"""
285314 if self._path is None:
286 raise ValueError("Instance was not initialized with a path, use to_file(...) instead")
315 raise ValueError(
316 "Instance was not initialized with a path, use to_file(...) instead")
287317 # END assert path
288318 self.to_file(self._path)
289319 return self
290320
291 #} END interface
292
293 #{ Serializable Interface
294 def _serialize(self, stream):
321 # } END interface
322
323 # { Serializable Interface
324 def _serialize(self, stream: 'BytesIO') -> 'RefLog':
295325 write = stream.write
296326
297327 # write all entries
298328 for e in self:
299329 write(e.format().encode(defenc))
300330 # END for each entry
301
302 def _deserialize(self, stream):
331 return self
332
333 def _deserialize(self, stream: 'BytesIO') -> 'RefLog':
303334 self.extend(self.iter_entries(stream))
304 #} END serializable interface
335 # } END serializable interface
336 return self
00 from git.util import (
11 LazyMixin,
2 Iterable,
2 IterableObj,
33 )
4 from .symbolic import SymbolicReference
4 from .symbolic import SymbolicReference, T_References
5
6
7 # typing ------------------------------------------------------------------
8
9 from typing import Any, Callable, Iterator, Type, Union, TYPE_CHECKING # NOQA
10 from git.types import Commit_ish, PathLike, _T # NOQA
11
12 if TYPE_CHECKING:
13 from git.repo import Repo
14
15 # ------------------------------------------------------------------------------
516
617
718 __all__ = ["Reference"]
920 #{ Utilities
1021
1122
12 def require_remote_ref_path(func):
23 def require_remote_ref_path(func: Callable[..., _T]) -> Callable[..., _T]:
1324 """A decorator raising a ValueError if we are not a valid remote, based on the path"""
1425
15 def wrapper(self, *args):
26 def wrapper(self: T_References, *args: Any) -> _T:
1627 if not self.is_remote():
1728 raise ValueError("ref path does not point to a remote reference: %s" % self.path)
1829 return func(self, *args)
2233 #}END utilities
2334
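The guard-decorator pattern used by `require_remote_ref_path` generalizes to any self-check. A generic sketch (the real decorator checks `self.is_remote()`; the `require` factory and `Ref` class here are invented for illustration):

```python
from functools import wraps

def require(predicate, message):
    """Build a decorator that rejects calls when predicate(self) is false."""
    def decorator(func):
        @wraps(func)
        def wrapper(self, *args):
            if not predicate(self):
                raise ValueError(message % self.path)
            return func(self, *args)
        return wrapper
    return decorator

class Ref:
    def __init__(self, path):
        self.path = path

    @require(lambda self: self.path.startswith('refs/remotes/'),
             'ref path does not point to a remote reference: %s')
    def remote_name(self):
        # /refs/remotes/<remote name>/<branch_name>
        return self.path.split('/')[2]
```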
2435
25 class Reference(SymbolicReference, LazyMixin, Iterable):
36 class Reference(SymbolicReference, LazyMixin, IterableObj):
2637
2738 """Represents a named reference to any object. Subclasses may apply restrictions though,
2839 i.e. Heads can only point to commits."""
3142 _resolve_ref_on_create = True
3243 _common_path_default = "refs"
3344
34 def __init__(self, repo, path, check_path=True):
45 def __init__(self, repo: 'Repo', path: PathLike, check_path: bool = True) -> None:
3546 """Initialize this instance
3647 :param repo: Our parent repository
3748
4051 refs/heads/master
4152 :param check_path: if False, you can provide any path. Otherwise the path must start with the
4253 default path prefix of this type."""
43 if check_path and not path.startswith(self._common_path_default + '/'):
44 raise ValueError("Cannot instantiate %r from path %s" % (self.__class__.__name__, path))
54 if check_path and not str(path).startswith(self._common_path_default + '/'):
55 raise ValueError(f"Cannot instantiate {self.__class__.__name__!r} from path {path}")
56 self.path: str # SymbolicReference converts to string atm
4557 super(Reference, self).__init__(repo, path)
4658
47 def __str__(self):
59 def __str__(self) -> str:
4860 return self.name
4961
5062 #{ Interface
5163
52 def set_object(self, object, logmsg=None): # @ReservedAssignment
64 # @ReservedAssignment
65 def set_object(self, object: Union[Commit_ish, 'SymbolicReference', str], logmsg: Union[str, None] = None
66 ) -> 'Reference':
5367 """Special version which checks if the head-log needs an update as well
5468 :return: self"""
5569 oldbinsha = None
8397 # NOTE: Don't have to overwrite properties, as they will only work without the log
8498
8599 @property
86 def name(self):
100 def name(self) -> str:
87101 """:return: (shortest) Name of this reference - it may contain path components"""
88102 # first two path tokens can be removed as they are
89103 # refs/heads or refs/tags or refs/remotes
93107 return '/'.join(tokens[2:])
94108
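The shortening rule in `name` simply drops the leading two path tokens (`refs/heads`, `refs/tags`, `refs/remotes`). As a free-standing sketch of that rule (hypothetical helper):

```python
def short_name(path):
    """Drop the leading 'refs/<kind>' tokens, keeping any remaining path."""
    tokens = path.split('/')
    if len(tokens) < 3:
        return path  # not a conventional refs/<kind>/<name> path
    return '/'.join(tokens[2:])
```

Note that the result may still contain slashes, e.g. for a branch named `feature/x`.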
95109 @classmethod
96 def iter_items(cls, repo, common_path=None):
110 def iter_items(cls: Type[T_References], repo: 'Repo', common_path: Union[PathLike, None] = None,
111 *args: Any, **kwargs: Any) -> Iterator[T_References]:
97112 """Equivalent to SymbolicReference.iter_items, but will return non-detached
98113 references as well."""
99114 return cls._iter_items(repo, common_path)
102117
103118 #{ Remote Interface
104119
105 @property
120 @property # type: ignore ## mypy cannot deal with properties with an extra decorator (2021-04-21)
106121 @require_remote_ref_path
107 def remote_name(self):
122 def remote_name(self) -> str:
108123 """
109124 :return:
110125 Name of the remote we are a reference of, such as 'origin' for a reference
113128 # /refs/remotes/<remote name>/<branch_name>
114129 return tokens[2]
115130
116 @property
131 @property # type: ignore ## mypy cannot deal with properties with an extra decorator (2021-04-21)
117132 @require_remote_ref_path
118 def remote_head(self):
133 def remote_head(self) -> str:
119134 """:return: Name of the remote head itself, i.e. master.
120135 :note: The returned name is usually not qualified enough to uniquely identify
121136 a branch"""
00 import os
11
22 from git.util import join_path
3
4 import os.path as osp
53
64 from .head import Head
75
86
97 __all__ = ["RemoteReference"]
8
9 # typing ------------------------------------------------------------------
10
11 from typing import Any, Iterator, NoReturn, Union, TYPE_CHECKING
12 from git.types import PathLike
13
14
15 if TYPE_CHECKING:
16 from git.repo import Repo
17 from git import Remote
18
19 # ------------------------------------------------------------------------------
1020
1121
1222 class RemoteReference(Head):
1525 _common_path_default = Head._remote_common_path_default
1626
1727 @classmethod
18 def iter_items(cls, repo, common_path=None, remote=None):
28 def iter_items(cls, repo: 'Repo', common_path: Union[PathLike, None] = None,
29 remote: Union['Remote', None] = None, *args: Any, **kwargs: Any
30 ) -> Iterator['RemoteReference']:
1931 """Iterate remote references, and if given, constrain them to the given remote"""
2032 common_path = common_path or cls._common_path_default
2133 if remote is not None:
2234 common_path = join_path(common_path, str(remote))
2335 # END handle remote constraint
36 # super is Reference
2437 return super(RemoteReference, cls).iter_items(repo, common_path)
2538
26 @classmethod
27 def delete(cls, repo, *refs, **kwargs):
39 @classmethod
40 def delete(cls, repo: 'Repo', *refs: 'RemoteReference', **kwargs: Any) -> None:
2841 """Delete the given remote references
2942
3043 :note:
3649 # and delete remainders manually
3750 for ref in refs:
3851 try:
39 os.remove(osp.join(repo.common_dir, ref.path))
52 os.remove(os.path.join(repo.common_dir, ref.path))
4053 except OSError:
4154 pass
4255 try:
43 os.remove(osp.join(repo.git_dir, ref.path))
56 os.remove(os.path.join(repo.git_dir, ref.path))
4457 except OSError:
4558 pass
4659 # END for each ref
4760
48 @classmethod
49 def create(cls, *args, **kwargs):
61 @classmethod
62 def create(cls, *args: Any, **kwargs: Any) -> NoReturn:
5063 """Used to disable this method"""
5164 raise TypeError("Cannot explicitly create remote references")
0 from git.types import PathLike
01 import os
12
23 from git.compat import defenc
3 from git.objects import Object, Commit
4 from git.objects import Object
5 from git.objects.commit import Commit
46 from git.util import (
57 join_path,
68 join_path_native,
1416 BadName
1517 )
1618
17 import os.path as osp
18
1919 from .log import RefLog
2020
21 # typing ------------------------------------------------------------------
22
23 from typing import Any, Iterator, List, Tuple, Type, TypeVar, Union, TYPE_CHECKING, cast # NOQA
24 from git.types import Commit_ish, PathLike # NOQA
25
26 if TYPE_CHECKING:
27 from git.repo import Repo
28 from git.refs import Head, TagReference, RemoteReference, Reference
29 from .log import RefLogEntry
30 from git.config import GitConfigParser
31 from git.objects.commit import Actor
32
33
34 T_References = TypeVar('T_References', bound='SymbolicReference')
35
36 # ------------------------------------------------------------------------------
37
2138
2239 __all__ = ["SymbolicReference"]
2340
2441
25 def _git_dir(repo, path):
42 def _git_dir(repo: 'Repo', path: Union[PathLike, None]) -> PathLike:
2643 """ Find the git dir that's appropriate for the path"""
27 name = "%s" % (path,)
44 name = f"{path}"
2845 if name in ['HEAD', 'ORIG_HEAD', 'FETCH_HEAD', 'index', 'logs']:
2946 return repo.git_dir
3047 return repo.common_dir
4461 _remote_common_path_default = "refs/remotes"
4562 _id_attribute_ = "name"
4663
47 def __init__(self, repo, path):
64 def __init__(self, repo: 'Repo', path: PathLike, check_path: bool = False):
4865 self.repo = repo
4966 self.path = path
5067
51 def __str__(self):
52 return self.path
53
54 def __repr__(self):
68 def __str__(self) -> str:
69 return str(self.path)
70
71 def __repr__(self) -> str:
5572 return '<git.%s "%s">' % (self.__class__.__name__, self.path)
5673
57 def __eq__(self, other):
74 def __eq__(self, other: object) -> bool:
5875 if hasattr(other, 'path'):
76 other = cast(SymbolicReference, other)
5977 return self.path == other.path
6078 return False
6179
62 def __ne__(self, other):
80 def __ne__(self, other: object) -> bool:
6381 return not (self == other)
6482
65 def __hash__(self):
83 def __hash__(self) -> int:
6684 return hash(self.path)
6785
6886 @property
69 def name(self):
87 def name(self) -> str:
7088 """
7189 :return:
7290 In case of symbolic references, the shortest assumable name
7391 is the path itself."""
74 return self.path
92 return str(self.path)
7593
7694 @property
77 def abspath(self):
95 def abspath(self) -> PathLike:
7896 return join_path_native(_git_dir(self.repo, self.path), self.path)
7997
8098 @classmethod
81 def _get_packed_refs_path(cls, repo):
82 return osp.join(repo.common_dir, 'packed-refs')
83
84 @classmethod
85 def _iter_packed_refs(cls, repo):
86 """Returns an iterator yielding pairs of sha1/path pairs (as bytes) for the corresponding refs.
99 def _get_packed_refs_path(cls, repo: 'Repo') -> str:
100 return os.path.join(repo.common_dir, 'packed-refs')
101
102 @classmethod
103 def _iter_packed_refs(cls, repo: 'Repo') -> Iterator[Tuple[str, str]]:
104 """Returns an iterator yielding pairs of sha1/path pairs (as strings) for the corresponding refs.
87105 :note: The packed refs file will be kept open as long as we iterate"""
88106 try:
89 with open(cls._get_packed_refs_path(repo), 'rt') as fp:
107 with open(cls._get_packed_refs_path(repo), 'rt', encoding='UTF-8') as fp:
90108 for line in fp:
91109 line = line.strip()
92110 if not line:
111129 if line[0] == '^':
112130 continue
113131
114 yield tuple(line.split(' ', 1))
132 yield cast(Tuple[str, str], tuple(line.split(' ', 1)))
115133 # END for each line
116 except (OSError, IOError):
117 return
134 except OSError:
135 return None
118136 # END no packed-refs file handling
119137 # NOTE: Had try-finally block around here to close the fp,
120138 # but some python version wouldn't allow yields within that.
122140 # alright.
123141
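The packed-refs parsing above skips header and peeled-tag lines and splits each remaining line once on the first space. A self-contained sketch of that loop (assuming the conventional packed-refs layout; not the exact GitPython code):

```python
def iter_packed_refs(lines):
    """Yield (sha, ref_path) pairs, skipping headers and peeled-tag lines."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # blank line or '# pack-refs with: ...' header
        if line[0] == '^':
            continue  # peeled tag object belonging to the previous line
        # split only on the first space: ref paths may contain spaces
        yield tuple(line.split(' ', 1))

content = [
    '# pack-refs with: peeled fully-peeled sorted',
    'aaaa refs/heads/main',
    'bbbb refs/tags/v1.0',
    '^cccc',
]
```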
124142 @classmethod
125 def dereference_recursive(cls, repo, ref_path):
143 def dereference_recursive(cls, repo: 'Repo', ref_path: Union[PathLike, None]) -> str:
126144 """
127145 :return: hexsha stored in the reference at the given ref_path, recursively dereferencing all
128146 intermediate references as required
129147 :param repo: the repository containing the reference at ref_path"""
148
130149 while True:
131150 hexsha, ref_path = cls._get_ref_info(repo, ref_path)
132151 if hexsha is not None:
134153 # END recursive dereferencing
135154
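`dereference_recursive` loops until `_get_ref_info` yields a hexsha rather than another ref path. The same control flow over an in-memory ref map (a toy model; the `refs` dict and `deref` name are invented):

```python
def deref(refs, path):
    """Follow 'ref: <path>' indirections until a hexsha is reached."""
    while True:
        value = refs[path]
        if value.startswith('ref: '):
            path = value[len('ref: '):]  # symbolic: keep dereferencing
        else:
            return value  # direct hexsha

refs = {
    'HEAD': 'ref: refs/heads/main',
    'refs/heads/main': 'abc123',
}
```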
136155 @classmethod
137 def _get_ref_info_helper(cls, repo, ref_path):
156 def _get_ref_info_helper(cls, repo: 'Repo', ref_path: Union[PathLike, None]
157 ) -> Union[Tuple[str, None], Tuple[None, str]]:
138158 """Return: (str(sha), str(target_ref_path)) if available, the sha the file at
139159 rela_path points to, or None. target_ref_path is the reference we
140160 point to, or None"""
141 tokens = None
161 tokens: Union[None, List[str], Tuple[str, str]] = None
142162 repodir = _git_dir(repo, ref_path)
143163 try:
144 with open(osp.join(repodir, ref_path), 'rt', encoding='UTF-8') as fp:
164 with open(os.path.join(repodir, str(ref_path)), 'rt', encoding='UTF-8') as fp:
145165 value = fp.read().rstrip()
146166 # Don't only split on spaces, but on whitespace, which allows to parse lines like
147167 # 60b64ef992065e2600bfef6187a97f92398a9144 branch 'master' of git-server:/path/to/repo
148168 tokens = value.split()
149169 assert(len(tokens) != 0)
150 except (OSError, IOError):
170 except OSError:
151171 # Probably we are just packed, find our entry in the packed refs file
152172 # NOTE: We are not a symbolic ref if we are in a packed file, as these
153173 # are excluded explicitly
173193 raise ValueError("Failed to parse reference information from %r" % ref_path)
174194
175195 @classmethod
176 def _get_ref_info(cls, repo, ref_path):
196 def _get_ref_info(cls, repo: 'Repo', ref_path: Union[PathLike, None]) -> Union[Tuple[str, None], Tuple[None, str]]:
177197 """Return: (str(sha), str(target_ref_path)) if available, the sha the file at
178198 rela_path points to, or None. target_ref_path is the reference we
179199 point to, or None"""
180200 return cls._get_ref_info_helper(repo, ref_path)
181201
182 def _get_object(self):
202 def _get_object(self) -> Commit_ish:
183203 """
184204 :return:
185205 The object our ref currently refers to. Refs can be cached, they will
188208 # Our path will be resolved to the hexsha which will be used accordingly
189209 return Object.new_from_sha(self.repo, hex_to_bin(self.dereference_recursive(self.repo, self.path)))
190210
191 def _get_commit(self):
211 def _get_commit(self) -> 'Commit':
192212 """
193213 :return:
194214 Commit object we point to, works for detached and non-detached
203223 # END handle type
204224 return obj
205225
206 def set_commit(self, commit, logmsg=None):
226 def set_commit(self, commit: Union[Commit, 'SymbolicReference', str], logmsg: Union[str, None] = None
227 ) -> 'SymbolicReference':
207228 """As set_object, but restricts the type of object to be a Commit
208229
209230 :raise ValueError: If commit is not a Commit object or doesn't point to
232253
233254 return self
234255
235 def set_object(self, object, logmsg=None): # @ReservedAssignment
256 def set_object(self, object: Union[Commit_ish, 'SymbolicReference', str], logmsg: Union[str, None] = None
257 ) -> 'SymbolicReference':
236258 """Set the object we point to, possibly dereference our symbolic reference first.
237259 If the reference does not exist, it will be created
238260
259281 # set the commit on our reference
260282 return self._get_reference().set_object(object, logmsg)
261283
262 commit = property(_get_commit, set_commit, doc="Query or set commits directly")
263 object = property(_get_object, set_object, doc="Return the object our ref currently refers to")
264
265 def _get_reference(self):
284 commit = property(_get_commit, set_commit, doc="Query or set commits directly") # type: ignore
285 object = property(_get_object, set_object, doc="Return the object our ref currently refers to") # type: ignore
286
287 def _get_reference(self) -> 'SymbolicReference':
266288 """:return: Reference Object we point to
267289 :raise TypeError: If this symbolic reference is detached, hence it doesn't point
268290 to a reference, but to a commit"""
271293 raise TypeError("%s is a detached symbolic reference as it points to %r" % (self, sha))
272294 return self.from_path(self.repo, target_ref_path)
273295
274 def set_reference(self, ref, logmsg=None):
296 def set_reference(self, ref: Union[Commit_ish, 'SymbolicReference', str],
297 logmsg: Union[str, None] = None) -> 'SymbolicReference':
275298 """Set ourselves to the given ref. It will stay a symbol if the ref is a Reference.
276299 Otherwise an Object, given as Object instance or refspec, is assumed and if valid,
277300 will be set which effectively detaches the reference if it was a purely
312335 raise TypeError("Require commit, got %r" % obj)
313336 # END verify type
314337
315 oldbinsha = None
338 oldbinsha: bytes = b''
316339 if logmsg is not None:
317340 try:
318341 oldbinsha = self.commit.binsha
341364 return self
342365
343366 # aliased reference
344 reference = property(_get_reference, set_reference, doc="Returns the Reference we point to")
367 reference: Union['Head', 'TagReference', 'RemoteReference', 'Reference']
368 reference = property(_get_reference, set_reference, doc="Returns the Reference we point to") # type: ignore
345369 ref = reference
346370
347 def is_valid(self):
371 def is_valid(self) -> bool:
348372 """
349373 :return:
350374 True if the reference is valid, hence it can be read and points to
357381 return True
358382
359383 @property
360 def is_detached(self):
384 def is_detached(self) -> bool:
361385 """
362386 :return:
363387 True if we are a detached reference, hence we point to a specific commit
368392 except TypeError:
369393 return True
370394
371 def log(self):
395 def log(self) -> 'RefLog':
372396 """
373397 :return: RefLog for this reference. Its last entry reflects the latest change
374398 applied to this reference
377401 instead of calling this method repeatedly. It should be considered read-only."""
378402 return RefLog.from_file(RefLog.path(self))
379403
380 def log_append(self, oldbinsha, message, newbinsha=None):
404 def log_append(self, oldbinsha: bytes, message: Union[str, None],
405 newbinsha: Union[bytes, None] = None) -> 'RefLogEntry':
381406 """Append a logentry to the logfile of this ref
382407
383408 :param oldbinsha: binary sha this ref used to point to
389414 # correct to allow overriding the committer on a per-commit level.
390415 # See https://github.com/gitpython-developers/GitPython/pull/146
391416 try:
392 committer_or_reader = self.commit.committer
417 committer_or_reader: Union['Actor', 'GitConfigParser'] = self.commit.committer
393418 except ValueError:
394419 committer_or_reader = self.repo.config_reader()
395420 # end handle newly cloned repositories
396 return RefLog.append_entry(committer_or_reader, RefLog.path(self), oldbinsha,
397 (newbinsha is None and self.commit.binsha) or newbinsha,
398 message)
399
400 def log_entry(self, index):
421 if newbinsha is None:
422 newbinsha = self.commit.binsha
423
424 if message is None:
425 message = ''
426
427 return RefLog.append_entry(committer_or_reader, RefLog.path(self), oldbinsha, newbinsha, message)
428
429 def log_entry(self, index: int) -> 'RefLogEntry':
401430 """:return: RefLogEntry at the given index
402431 :param index: python list compatible positive or negative index
403432
407436 return RefLog.entry_at(RefLog.path(self), index)
408437
409438 @classmethod
410 def to_full_path(cls, path):
439 def to_full_path(cls, path: Union[PathLike, 'SymbolicReference']) -> PathLike:
411440 """
412441 :return: string with a full repository-relative path which can be used to initialize
413442 a Reference instance, for instance by using ``Reference.from_path``"""
416445 full_ref_path = path
417446 if not cls._common_path_default:
418447 return full_ref_path
419 if not path.startswith(cls._common_path_default + "/"):
448 if not str(path).startswith(cls._common_path_default + "/"):
420449 full_ref_path = '%s/%s' % (cls._common_path_default, path)
421450 return full_ref_path
422451
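`to_full_path` only prepends the class's `_common_path_default` when the given path is not already qualified. As a minimal sketch of that prefixing rule (standalone function, default prefix chosen for illustration):

```python
def to_full_path(path, common='refs/heads'):
    """Prefix 'path' with the class default unless it is already qualified."""
    if not common:
        return path  # e.g. SymbolicReference itself has no default prefix
    if not str(path).startswith(common + '/'):
        return '%s/%s' % (common, path)
    return path
```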
423452 @classmethod
424 def delete(cls, repo, path):
453 def delete(cls, repo: 'Repo', path: PathLike) -> None:
425454 """Delete the reference at the given path
426455
427456 :param repo:
432461 or just "myreference", hence 'refs/' is implied.
433462 Alternatively the symbolic reference to be deleted"""
434463 full_ref_path = cls.to_full_path(path)
435 abs_path = osp.join(repo.common_dir, full_ref_path)
436 if osp.exists(abs_path):
464 abs_path = os.path.join(repo.common_dir, full_ref_path)
465 if os.path.exists(abs_path):
437466 os.remove(abs_path)
438467 else:
439468 # check packed refs
443472 new_lines = []
444473 made_change = False
445474 dropped_last_line = False
446 for line in reader:
447 line = line.decode(defenc)
475 for line_bytes in reader:
476 line = line_bytes.decode(defenc)
448477 _, _, line_ref = line.partition(' ')
449478 line_ref = line_ref.strip()
450479 # keep line if it is a comment or if the ref to delete is not
469498 with open(pack_file_path, 'wb') as fd:
470499 fd.writelines(line.encode(defenc) for line in new_lines)
471500
472 except (OSError, IOError):
501 except OSError:
473502 pass # it didn't exist at all
474503
475504 # delete the reflog
476505 reflog_path = RefLog.path(cls(repo, full_ref_path))
477 if osp.isfile(reflog_path):
506 if os.path.isfile(reflog_path):
478507 os.remove(reflog_path)
479508 # END remove reflog
480509
481510 @classmethod
482 def _create(cls, repo, path, resolve, reference, force, logmsg=None):
511 def _create(cls: Type[T_References], repo: 'Repo', path: PathLike, resolve: bool,
512 reference: Union['SymbolicReference', str], force: bool,
513 logmsg: Union[str, None] = None) -> T_References:
483514 """internal method used to create a new symbolic reference.
484515 If resolve is False, the reference will be taken as is, creating
485516 a proper symbolic reference. Otherwise it will be resolved to the
487518 instead"""
488519 git_dir = _git_dir(repo, path)
489520 full_ref_path = cls.to_full_path(path)
490 abs_ref_path = osp.join(git_dir, full_ref_path)
521 abs_ref_path = os.path.join(git_dir, full_ref_path)
491522
492523 # figure out target data
493524 target = reference
494525 if resolve:
495526 target = repo.rev_parse(str(reference))
496527
497 if not force and osp.isfile(abs_ref_path):
528 if not force and os.path.isfile(abs_ref_path):
498529 target_data = str(target)
499530 if isinstance(target, SymbolicReference):
500 target_data = target.path
531 target_data = str(target.path)
501532 if not resolve:
502533 target_data = "ref: " + target_data
503534 with open(abs_ref_path, 'rb') as fd:
512543 return ref
513544
514545 @classmethod
515 def create(cls, repo, path, reference='HEAD', force=False, logmsg=None):
516 """Create a new symbolic reference, hence a reference pointing to another reference.
546 def create(cls: Type[T_References], repo: 'Repo', path: PathLike,
547 reference: Union['SymbolicReference', str] = 'HEAD',
548 logmsg: Union[str, None] = None, force: bool = False, **kwargs: Any) -> T_References:
549 """Create a new symbolic reference, hence a reference pointing to another reference.
517550
518551 :param repo:
519552 Repository to create the reference in
543576 :note: This does not alter the current HEAD, index or Working Tree"""
544577 return cls._create(repo, path, cls._resolve_ref_on_create, reference, force, logmsg)
545578
546 def rename(self, new_path, force=False):
579 def rename(self, new_path: PathLike, force: bool = False) -> 'SymbolicReference':
547580 """Rename self to a new path
548581
549582 :param new_path:
561594 if self.path == new_path:
562595 return self
563596
564 new_abs_path = osp.join(_git_dir(self.repo, new_path), new_path)
565 cur_abs_path = osp.join(_git_dir(self.repo, self.path), self.path)
566 if osp.isfile(new_abs_path):
597 new_abs_path = os.path.join(_git_dir(self.repo, new_path), new_path)
598 cur_abs_path = os.path.join(_git_dir(self.repo, self.path), self.path)
599 if os.path.isfile(new_abs_path):
567600 if not force:
568601 # if they point to the same file, its not an error
569602 with open(new_abs_path, 'rb') as fd1:
578611 os.remove(new_abs_path)
579612 # END handle existing target file
580613
581 dname = osp.dirname(new_abs_path)
582 if not osp.isdir(dname):
614 dname = os.path.dirname(new_abs_path)
615 if not os.path.isdir(dname):
583616 os.makedirs(dname)
584617 # END create directory
585618
589622 return self
590623
591624 @classmethod
592 def _iter_items(cls, repo, common_path=None):
625 def _iter_items(cls: Type[T_References], repo: 'Repo', common_path: Union[PathLike, None] = None
626 ) -> Iterator[T_References]:
593627 if common_path is None:
594628 common_path = cls._common_path_default
595629 rela_paths = set()
613647
614648 # read packed refs
615649 for _sha, rela_path in cls._iter_packed_refs(repo):
616 if rela_path.startswith(common_path):
650 if rela_path.startswith(str(common_path)):
617651 rela_paths.add(rela_path)
618652 # END relative path matches common path
619653 # END packed refs reading
627661 # END for each sorted relative refpath
628662
629663 @classmethod
630 def iter_items(cls, repo, common_path=None):
664 def iter_items(cls: Type[T_References], repo: 'Repo', common_path: Union[PathLike, None] = None,
665 *args: Any, **kwargs: Any) -> Iterator[T_References]:
631666 """Find all refs in the repository
632667
633668 :param repo: is the Repo
647682 return (r for r in cls._iter_items(repo, common_path) if r.__class__ == SymbolicReference or not r.is_detached)
648683
649684 @classmethod
650 def from_path(cls, repo, path):
685 def from_path(cls: Type[T_References], repo: 'Repo', path: PathLike) -> T_References:
651686 """
652687 :param path: full .git-directory-relative path name to the Reference to instantiate
653688 :note: use to_full_path() if you only have a partial path of a known Reference Type
662697 from . import HEAD, Head, RemoteReference, TagReference, Reference
663698 for ref_type in (HEAD, Head, RemoteReference, TagReference, Reference, SymbolicReference):
664699 try:
700 instance: T_References
665701 instance = ref_type(repo, path)
666702 if instance.__class__ == SymbolicReference and instance.is_detached:
667703 raise ValueError("SymbolRef was detached, we drop it")
668 return instance
704 else:
705 return instance
706
669707 except ValueError:
670708 pass
671709 # END exception handling
672710 # END for each type to try
673711 raise ValueError("Could not find reference type suitable to handle path %r" % path)
674712
675 def is_remote(self):
713 def is_remote(self) -> bool:
676714 """:return: True if this symbolic reference points to a remote branch"""
677 return self.path.startswith(self._remote_common_path_default + "/")
715 return str(self.path).startswith(self._remote_common_path_default + "/")
00 from .reference import Reference
11
22 __all__ = ["TagReference", "Tag"]
3
4 # typing ------------------------------------------------------------------
5
6 from typing import Any, Type, Union, TYPE_CHECKING
7 from git.types import Commit_ish, PathLike
8
9 if TYPE_CHECKING:
10 from git.repo import Repo
11 from git.objects import Commit
12 from git.objects import TagObject
13 from git.refs import SymbolicReference
14
15
16 # ------------------------------------------------------------------------------
317
418
519 class TagReference(Reference):
1731 print(tagref.tag.message)"""
1832
1933 __slots__ = ()
20 _common_path_default = "refs/tags"
34 _common_default = "tags"
35 _common_path_default = Reference._common_path_default + "/" + _common_default
2136
2237 @property
23 def commit(self):
38 def commit(self) -> 'Commit': # type: ignore[override] # LazyMixin has unrelated commit method
2439 """:return: Commit object the tag ref points to
25
40
2641 :raise ValueError: if the tag points to a tree or blob"""
2742 obj = self.object
2843 while obj.type != 'commit':
3550 return obj
3651
3752 @property
38 def tag(self):
53 def tag(self) -> Union['TagObject', None]:
3954 """
4055 :return: Tag object this tag ref points to or None in case
4156 we are a light weight tag"""
4661
4762 # make object read-only
4863 # It should be reasonably hard to adjust an existing tag
49 object = property(Reference._get_object)
64
65 # object = property(Reference._get_object)
66 @property
67 def object(self) -> Commit_ish: # type: ignore[override]
68 return Reference._get_object(self)
5069
5170 @classmethod
52 def create(cls, repo, path, ref='HEAD', message=None, force=False, **kwargs):
71 def create(cls: Type['TagReference'], repo: 'Repo', path: PathLike,
72 reference: Union[str, 'SymbolicReference'] = 'HEAD',
73 logmsg: Union[str, None] = None,
74 force: bool = False, **kwargs: Any) -> 'TagReference':
5375 """Create a new tag reference.
5476
5577 :param path:
5779 The prefix refs/tags is implied
5880
5981 :param ref:
60 A reference to the object you want to tag. It can be a commit, tree or
82 A reference to the Object you want to tag. The Object can be a commit, tree or
6183 blob.
6284
63 :param message:
85 :param logmsg:
6486 If not None, the message will be used in your tag object. This will also
6587 create an additional tag object that allows to obtain that information, i.e.::
6688
6789 tagref.tag.message
90
91 :param message:
92 Synonym for :param logmsg:
93 Included for backwards compatibility. :param logmsg: is used in preference if both are given.
6894
6995 :param force:
7096 If True, force creation of the tag even if a tag with that name already exists.
7399 Additional keyword arguments to be passed to git-tag
74100
75101 :return: A new TagReference"""
76 args = (path, ref)
77 if message:
78 kwargs['m'] = message
102 if 'ref' in kwargs and kwargs['ref']:
103 reference = kwargs['ref']
104
105 if logmsg:
106 kwargs['m'] = logmsg
107 elif 'message' in kwargs and kwargs['message']:
108 kwargs['m'] = kwargs['message']
109
79110 if force:
80111 kwargs['f'] = True
112
113 args = (path, reference)
81114
82115 repo.git.tag(*args, **kwargs)
83116 return TagReference(repo, "%s/%s" % (cls._common_path_default, path))
84117
85118 @classmethod
86 def delete(cls, repo, *tags):
119 def delete(cls, repo: 'Repo', *tags: 'TagReference') -> None: # type: ignore[override]
87120 """Delete the given existing tag or tags"""
88121 repo.git.tag("-d", *tags)
89122
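The reworked `create` above resolves the new `logmsg` keyword against the legacy `message` one before shelling out to git-tag. That precedence logic can be sketched in isolation (a minimal sketch; `resolve_tag_kwargs` is a hypothetical helper, not part of GitPython):

```python
def resolve_tag_kwargs(logmsg=None, force=False, **kwargs):
    """Mirror the keyword handling in TagReference.create (sketch):
    'm' comes from logmsg, falling back to the legacy 'message'
    keyword, and 'f' is set when force is requested."""
    out = dict(kwargs)
    out.pop('message', None)  # 'message' never reaches git-tag directly
    if logmsg:
        out['m'] = logmsg
    elif kwargs.get('message'):
        out['m'] = kwargs['message']
    if force:
        out['f'] = True
    return out

# logmsg takes precedence over the legacy keyword
print(resolve_tag_kwargs(logmsg="v1 release", message="ignored"))  # → {'m': 'v1 release'}
print(resolve_tag_kwargs(message="legacy text", force=True))       # → {'m': 'legacy text', 'f': True}
```

Keeping `message` as a fallback means existing callers of the old signature continue to work unchanged.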
88 import re
99
1010 from git.cmd import handle_process_output, Git
11 from git.compat import (defenc, force_text, is_win)
11 from git.compat import (defenc, force_text)
1212 from git.exc import GitCommandError
1313 from git.util import (
1414 LazyMixin,
15 Iterable,
15 IterableObj,
1616 IterableList,
1717 RemoteProgress,
18 CallableRemoteProgress
18 CallableRemoteProgress,
1919 )
2020 from git.util import (
2121 join_path,
2222 )
2323
2424 from .config import (
25 GitConfigParser,
2526 SectionConstraint,
2627 cp,
2728 )
3334 TagReference
3435 )
3536
37 # typing-------------------------------------------------------
38
39 from typing import (Any, Callable, Dict, Iterator, List, NoReturn, Optional, Sequence,
40 TYPE_CHECKING, Type, Union, cast, overload)
41
42 from git.types import PathLike, Literal, Commit_ish
43
44 if TYPE_CHECKING:
45 from git.repo.base import Repo
46 from git.objects.submodule.base import UpdateProgress
47 # from git.objects.commit import Commit
48 # from git.objects import Blob, Tree, TagObject
49
50 flagKeyLiteral = Literal[' ', '!', '+', '-', '*', '=', 't', '?']
51
52 # def is_flagKeyLiteral(inp: str) -> TypeGuard[flagKeyLiteral]:
53 # return inp in [' ', '!', '+', '-', '=', '*', 't', '?']
54
55
56 # -------------------------------------------------------------
57
3658
3759 log = logging.getLogger('git.remote')
3860 log.addHandler(logging.NullHandler())
4365 #{ Utilities
4466
4567
46 def add_progress(kwargs, git, progress):
68 def add_progress(kwargs: Any, git: Git,
69 progress: Union[RemoteProgress, 'UpdateProgress', Callable[..., RemoteProgress], None]
70 ) -> Any:
4771 """Add the --progress flag to the given kwargs dict if supported by the
4872 git command. If no progress instance is given, we do not request
4973 any progress
5983 #} END utilities
6084
6185
62 def to_progress_instance(progress):
86 @ overload
87 def to_progress_instance(progress: None) -> RemoteProgress:
88 ...
89
90
91 @ overload
92 def to_progress_instance(progress: Callable[..., Any]) -> CallableRemoteProgress:
93 ...
94
95
96 @ overload
97 def to_progress_instance(progress: RemoteProgress) -> RemoteProgress:
98 ...
99
100
101 def to_progress_instance(progress: Union[Callable[..., Any], RemoteProgress, None]
102 ) -> Union[RemoteProgress, CallableRemoteProgress]:
63103 """Given the 'progress' return a suitable object derived from
64104 RemoteProgress().
65105 """
75115 return progress
76116
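The three `overload` stubs above describe a runtime dispatch: `None` yields a dummy `RemoteProgress`, a bare callable gets wrapped, and an existing instance passes through. A sketch with stand-in classes (assumption: these are minimal stand-ins, not the real `git.util` classes):

```python
class RemoteProgress:
    """Stand-in for git.util.RemoteProgress."""

class CallableRemoteProgress(RemoteProgress):
    """Stand-in: adapts a plain callable to the RemoteProgress interface."""
    def __init__(self, func):
        self._callable = func

def to_progress_instance(progress):
    # a bare callable is wrapped so it can receive progress messages
    if callable(progress) and not isinstance(progress, RemoteProgress):
        return CallableRemoteProgress(progress)
    # no progress requested: hand back a do-nothing instance
    if progress is None:
        return RemoteProgress()
    return progress  # already a RemoteProgress (sub)instance
```

The overloads let a type checker know that, for example, `to_progress_instance(None)` is always a plain `RemoteProgress`, even though all three cases share one implementation.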
77117
78 class PushInfo(object):
118 class PushInfo(IterableObj, object):
79119 """
80120 Carries information about the result of a push operation of a single head::
81121
91131 info.summary # summary line providing human readable english text about the push
92132 """
93133 __slots__ = ('local_ref', 'remote_ref_string', 'flags', '_old_commit_sha', '_remote', 'summary')
134 _id_attribute_ = 'pushinfo'
94135
95136 NEW_TAG, NEW_HEAD, NO_MATCH, REJECTED, REMOTE_REJECTED, REMOTE_FAILURE, DELETED, \
96137 FORCED_UPDATE, FAST_FORWARD, UP_TO_DATE, ERROR = [1 << x for x in range(11)]
103144 '=': UP_TO_DATE,
104145 '!': ERROR}
105146
106 def __init__(self, flags, local_ref, remote_ref_string, remote, old_commit=None,
107 summary=''):
108 """ Initialize a new instance """
147 def __init__(self, flags: int, local_ref: Union[SymbolicReference, None], remote_ref_string: str, remote: 'Remote',
148 old_commit: Optional[str] = None, summary: str = '') -> None:
149 """ Initialize a new instance
150 local_ref: HEAD | Head | RemoteReference | TagReference | Reference | SymbolicReference | None """
109151 self.flags = flags
110152 self.local_ref = local_ref
111153 self.remote_ref_string = remote_ref_string
113155 self._old_commit_sha = old_commit
114156 self.summary = summary
115157
116 @property
117 def old_commit(self):
158 @ property
159 def old_commit(self) -> Union[str, SymbolicReference, Commit_ish, None]:
118160 return self._old_commit_sha and self._remote.repo.commit(self._old_commit_sha) or None
119161
120 @property
121 def remote_ref(self):
162 @ property
163 def remote_ref(self) -> Union[RemoteReference, TagReference]:
122164 """
123165 :return:
124166 Remote Reference or TagReference in the local repository corresponding
133175 raise ValueError("Could not handle remote ref: %r" % self.remote_ref_string)
134176 # END
135177
136 @classmethod
137 def _from_line(cls, remote, line):
178 @ classmethod
179 def _from_line(cls, remote: 'Remote', line: str) -> 'PushInfo':
138180 """Create a new PushInfo instance as parsed from line which is expected to be like
139181 refs/heads/master:refs/heads/master 05d2687..1d0568e as bytes"""
140182 control_character, from_to, summary = line.split('\t', 3)
150192 # from_to handling
151193 from_ref_string, to_ref_string = from_to.split(':')
152194 if flags & cls.DELETED:
153 from_ref = None
195 from_ref: Union[SymbolicReference, None] = None
154196 else:
155197 if from_ref_string == "(delete)":
156198 from_ref = None
158200 from_ref = Reference.from_path(remote.repo, from_ref_string)
159201
160202 # commit handling, could be message or commit info
161 old_commit = None
203 old_commit: Optional[str] = None
162204 if summary.startswith('['):
163205 if "[rejected]" in summary:
164206 flags |= cls.REJECTED
186228
187229 return PushInfo(flags, from_ref, to_ref_string, remote, old_commit, summary)
188230
189
190 class FetchInfo(object):
231 @ classmethod
232 def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any
233 ) -> NoReturn: # -> Iterator['PushInfo']:
234 raise NotImplementedError
235
236
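`_from_line` above splits each porcelain push line on tabs and translates the leading control character through `_flag_map`. A sketch reproducing only the two map entries visible in this diff ('=' and '!'; the real map covers every character git-push can emit):

```python
# Flag bits as in PushInfo: UP_TO_DATE and ERROR are the two highest of
# the eleven 1 << x values (sketch; other flags omitted).
UP_TO_DATE, ERROR = 1 << 9, 1 << 10

_flag_map = {'=': UP_TO_DATE, '!': ERROR}

def parse_push_line(line):
    """Split a porcelain push line and translate its control character."""
    control_character, from_to, summary = line.split('\t', 3)
    try:
        flags = _flag_map[control_character]
    except KeyError as e:
        raise ValueError("Control character %r unknown" % control_character) from e
    return flags, from_to, summary

flags, from_to, summary = parse_push_line("=\trefs/heads/main:refs/heads/main\t[up to date]")
```

The `from_to` field is then split on ':' to recover the local and remote ref strings, as the method does a few lines further down.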
237 class FetchInfo(IterableObj, object):
191238
192239 """
193240 Carries information about the results of a fetch operation of a single head::
204251 info.remote_ref_path # The path from which we fetched on the remote. It's the remote's version of our info.ref
205252 """
206253 __slots__ = ('ref', 'old_commit', 'flags', 'note', 'remote_ref_path')
254 _id_attribute_ = 'fetchinfo'
207255
208256 NEW_TAG, NEW_HEAD, HEAD_UPTODATE, TAG_UPDATE, REJECTED, FORCED_UPDATE, \
209257 FAST_FORWARD, ERROR = [1 << x for x in range(8)]
210258
211259 _re_fetch_result = re.compile(r'^\s*(.) (\[?[\w\s\.$@]+\]?)\s+(.+) -> ([^\s]+)( \(.*\)?$)?')
212260
213 _flag_map = {
261 _flag_map: Dict[flagKeyLiteral, int] = {
214262 '!': ERROR,
215263 '+': FORCED_UPDATE,
216264 '*': 0,
219267 '-': TAG_UPDATE,
220268 }
221269
222 @classmethod
223 def refresh(cls):
270 @ classmethod
271 def refresh(cls) -> Literal[True]:
224272 """This gets called by the refresh function (see the top level
225273 __init__).
226274 """
243291
244292 return True
245293
246 def __init__(self, ref, flags, note='', old_commit=None, remote_ref_path=None):
294 def __init__(self, ref: SymbolicReference, flags: int, note: str = '',
295 old_commit: Union[Commit_ish, None] = None,
296 remote_ref_path: Optional[PathLike] = None) -> None:
247297 """
248298 Initialize a new instance
249299 """
253303 self.old_commit = old_commit
254304 self.remote_ref_path = remote_ref_path
255305
256 def __str__(self):
306 def __str__(self) -> str:
257307 return self.name
258308
259 @property
260 def name(self):
309 @ property
310 def name(self) -> str:
261311 """:return: Name of our remote ref"""
262312 return self.ref.name
263313
264 @property
265 def commit(self):
314 @ property
315 def commit(self) -> Commit_ish:
266316 """:return: Commit of our remote ref"""
267317 return self.ref.commit
268318
269 @classmethod
270 def _from_line(cls, repo, line, fetch_line):
319 @ classmethod
320 def _from_line(cls, repo: 'Repo', line: str, fetch_line: str) -> 'FetchInfo':
271321 """Parse information from the given line as returned by git-fetch -v
272322 and return a new FetchInfo object representing this information.
273323
274 We can handle a line as follows
275 "%c %-*s %-*s -> %s%s"
276
277 Where c is either ' ', !, +, -, *, or =
324 We can handle a line as follows:
325 "%c %-\\*s %-\\*s -> %s%s"
326
327 Where c is either ' ', !, +, -, \\*, or =
278328 ! means error
279329 + means success forcing update
280330 - means a tag was updated
289339 raise ValueError("Failed to parse line: %r" % line)
290340
291341 # parse lines
292 control_character, operation, local_remote_ref, remote_local_ref, note = match.groups()
342 remote_local_ref_str: str
343 control_character, operation, local_remote_ref, remote_local_ref_str, note = match.groups()
344 # assert is_flagKeyLiteral(control_character), f"{control_character}"
345 control_character = cast(flagKeyLiteral, control_character)
293346 try:
294347 _new_hex_sha, _fetch_operation, fetch_note = fetch_line.split("\t")
295348 ref_type_name, fetch_note = fetch_note.split(' ', 1)
305358 # END control char exception handling
306359
307360 # parse operation string for more info - makes no sense for symbolic refs, but we parse it anyway
308 old_commit = None
361 old_commit: Union[Commit_ish, None] = None
309362 is_tag_operation = False
310363 if 'rejected' in operation:
311364 flags |= cls.REJECTED
328381 # If we do not specify a target branch like master:refs/remotes/origin/master,
329382 # the fetch result is stored in FETCH_HEAD which destroys the rule we usually
330383 # have. In that case we use a symbolic reference which is detached
331 ref_type = None
332 if remote_local_ref == "FETCH_HEAD":
384 ref_type: Optional[Type[SymbolicReference]] = None
385 if remote_local_ref_str == "FETCH_HEAD":
333386 ref_type = SymbolicReference
334387 elif ref_type_name == "tag" or is_tag_operation:
335388 # the ref_type_name can be branch, whereas we are still seeing a tag operation. It happens during
357410 # by the 'ref/' prefix. Otherwise even a tag could be in refs/remotes, which is when it will have the
358411 # 'tags/' subdirectory in its path.
359412 # We don't want to test for actual existence, but try to figure everything out analytically.
360 ref_path = None
361 remote_local_ref = remote_local_ref.strip()
362 if remote_local_ref.startswith(Reference._common_path_default + "/"):
413 ref_path: Optional[PathLike] = None
414 remote_local_ref_str = remote_local_ref_str.strip()
415
416 if remote_local_ref_str.startswith(Reference._common_path_default + "/"):
363417 # always use actual type if we get absolute paths
364418 # Will always be the case if something is fetched outside of refs/remotes (if its not a tag)
365 ref_path = remote_local_ref
419 ref_path = remote_local_ref_str
366420 if ref_type is not TagReference and not \
367 remote_local_ref.startswith(RemoteReference._common_path_default + "/"):
421 remote_local_ref_str.startswith(RemoteReference._common_path_default + "/"):
368422 ref_type = Reference
369423 # END downgrade remote reference
370 elif ref_type is TagReference and 'tags/' in remote_local_ref:
424 elif ref_type is TagReference and 'tags/' in remote_local_ref_str:
371425 # even though its a tag, it is located in refs/remotes
372 ref_path = join_path(RemoteReference._common_path_default, remote_local_ref)
426 ref_path = join_path(RemoteReference._common_path_default, remote_local_ref_str)
373427 else:
374 ref_path = join_path(ref_type._common_path_default, remote_local_ref)
428 ref_path = join_path(ref_type._common_path_default, remote_local_ref_str)
375429 # END obtain refpath
376430
377431 # even though the path could be within the git conventions, we make
383437
384438 return cls(remote_local_ref, flags, note, old_commit, local_remote_ref)
385439
386
387 class Remote(LazyMixin, Iterable):
440 @ classmethod
441 def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any
442 ) -> NoReturn: # -> Iterator['FetchInfo']:
443 raise NotImplementedError
444
445
446 class Remote(LazyMixin, IterableObj):
388447
389448 """Provides easy read and write access to a git remote.
390449
397456 __slots__ = ("repo", "name", "_config_reader")
398457 _id_attribute_ = "name"
399458
400 def __init__(self, repo, name):
459 def __init__(self, repo: 'Repo', name: str) -> None:
401460 """Initialize a remote instance
402461
403462 :param repo: The repository we are a remote of
404463 :param name: the name of the remote, i.e. 'origin'"""
405464 self.repo = repo
406465 self.name = name
407
408 if is_win:
409 # some oddity: on windows, python 2.5, it for some reason does not realize
410 # that it has the config_writer property, but instead calls __getattr__
411 # which will not yield the expected results. 'pinging' the members
412 # with a dir call creates the config_writer property that we require
413 # ... bugs like these make me wonder whether python really wants to be used
414 # for production. It doesn't happen on linux though.
415 dir(self)
416 # END windows special handling
417
418 def __getattr__(self, attr):
466 self.url: str
467
468 def __getattr__(self, attr: str) -> Any:
419469 """Allows to call this instance like
420470 remote.special( \\*args, \\*\\*kwargs) to call git-remote special self.name"""
421471 if attr == "_config_reader":
429479 return super(Remote, self).__getattr__(attr)
430480 # END handle exception
431481
432 def _config_section_name(self):
482 def _config_section_name(self) -> str:
433483 return 'remote "%s"' % self.name
434484
435 def _set_cache_(self, attr):
485 def _set_cache_(self, attr: str) -> None:
436486 if attr == "_config_reader":
437487 # NOTE: This is cached as __getattr__ is overridden to return remote config values implicitly, such as
438488 # in print(r.pushurl)
440490 else:
441491 super(Remote, self)._set_cache_(attr)
442492
443 def __str__(self):
493 def __str__(self) -> str:
444494 return self.name
445495
446 def __repr__(self):
496 def __repr__(self) -> str:
447497 return '<git.%s "%s">' % (self.__class__.__name__, self.name)
448498
449 def __eq__(self, other):
499 def __eq__(self, other: object) -> bool:
450500 return isinstance(other, type(self)) and self.name == other.name
451501
452 def __ne__(self, other):
502 def __ne__(self, other: object) -> bool:
453503 return not (self == other)
454504
455 def __hash__(self):
505 def __hash__(self) -> int:
456506 return hash(self.name)
457507
458 def exists(self):
508 def exists(self) -> bool:
459509 """
460510 :return: True if this is a valid, existing remote.
461511 Valid remotes have an entry in the repository's configuration"""
469519 return False
470520 # end
471521
472 @classmethod
473 def iter_items(cls, repo):
522 @ classmethod
523 def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> Iterator['Remote']:
474524 """:return: Iterator yielding Remote objects of the given repository"""
475525 for section in repo.config_reader("repository").sections():
476526 if not section.startswith('remote '):
482532 yield Remote(repo, section[lbound + 1:rbound])
483533 # END for each configuration section
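`iter_items` above recovers remote names from config section headers of the form `remote "origin"`. The extraction between the quote bounds can be sketched as a standalone helper (a sketch; `remote_name_from_section` is a hypothetical name, and the quote search mirrors the elided `lbound`/`rbound` computation):

```python
def remote_name_from_section(section):
    """Extract the remote name from a config section header,
    e.g. 'remote "origin"' -> 'origin'; None for unrelated sections."""
    if not section.startswith('remote '):
        return None  # skip non-remote sections, as iter_items does
    lbound = section.find('"')
    rbound = section.rfind('"')
    if lbound == -1 or rbound == -1 or lbound == rbound:
        raise ValueError("Remote-Section has invalid format: %r" % section)
    return section[lbound + 1:rbound]
```

Each extracted name is what `Remote(repo, name)` is constructed with in the loop above.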
484534
485 def set_url(self, new_url, old_url=None, **kwargs):
535 def set_url(self, new_url: str, old_url: Optional[str] = None, **kwargs: Any) -> 'Remote':
486536 """Configure URLs on current remote (cf command git remote set_url)
487537
488538 This command manages URLs on the remote.
499549 self.repo.git.remote(scmd, self.name, new_url, **kwargs)
500550 return self
501551
502 def add_url(self, url, **kwargs):
552 def add_url(self, url: str, **kwargs: Any) -> 'Remote':
503553 """Adds a new URL to the current remote (special case of git remote set_url)
504554
505555 This command adds new URLs to a given remote, making it possible to have
510560 """
511561 return self.set_url(url, add=True)
512562
513 def delete_url(self, url, **kwargs):
563 def delete_url(self, url: str, **kwargs: Any) -> 'Remote':
514564 """Deletes a URL from the current remote (special case of git remote set_url)
515565
516566 This command deletes URLs from a given remote, making it possible to have
521571 """
522572 return self.set_url(url, delete=True)
523573
524 @property
525 def urls(self):
574 @ property
575 def urls(self) -> Iterator[str]:
526576 """:return: Iterator yielding all configured URL targets on a remote as strings"""
527577 try:
528578 remote_details = self.repo.git.remote("get-url", "--all", self.name)
579 assert isinstance(remote_details, str)
529580 for line in remote_details.split('\n'):
530581 yield line
531582 except GitCommandError as ex:
537588 if 'Unknown subcommand: get-url' in str(ex):
538589 try:
539590 remote_details = self.repo.git.remote("show", self.name)
591 assert isinstance(remote_details, str)
540592 for line in remote_details.split('\n'):
541593 if ' Push URL:' in line:
542594 yield line.split(': ')[-1]
543 except GitCommandError as ex:
544 if any(msg in str(ex) for msg in ['correct access rights', 'cannot run ssh']):
595 except GitCommandError as _ex:
596 if any(msg in str(_ex) for msg in ['correct access rights', 'cannot run ssh']):
545597 # If ssh is not setup to access this repository, see issue 694
546598 remote_details = self.repo.git.config('--get-all', 'remote.%s.url' % self.name)
599 assert isinstance(remote_details, str)
547600 for line in remote_details.split('\n'):
548601 yield line
549602 else:
550 raise ex
603 raise _ex
551604 else:
552605 raise ex
553606
554 @property
555 def refs(self):
607 @ property
608 def refs(self) -> IterableList[RemoteReference]:
556609 """
557610 :return:
558611 IterableList of RemoteReference objects. It is prefixed, allowing
559612 you to omit the remote path portion, i.e.::
560613 remote.refs.master # yields RemoteReference('/refs/remotes/origin/master')"""
561 out_refs = IterableList(RemoteReference._id_attribute_, "%s/" % self.name)
614 out_refs: IterableList[RemoteReference] = IterableList(RemoteReference._id_attribute_, "%s/" % self.name)
562615 out_refs.extend(RemoteReference.list_items(self.repo, remote=self.name))
563616 return out_refs
564617
565 @property
566 def stale_refs(self):
618 @ property
619 def stale_refs(self) -> IterableList[Reference]:
567620 """
568621 :return:
569622 IterableList RemoteReference objects that do not have a corresponding
578631 as well. This is a fix for the issue described here:
579632 https://github.com/gitpython-developers/GitPython/issues/260
580633 """
581 out_refs = IterableList(RemoteReference._id_attribute_, "%s/" % self.name)
634 out_refs: IterableList[Reference] = IterableList(RemoteReference._id_attribute_, "%s/" % self.name)
582635 for line in self.repo.git.remote("prune", "--dry-run", self).splitlines()[2:]:
583636 # expecting
584637 # * [would prune] origin/new_branch
585638 token = " * [would prune] "
586639 if not line.startswith(token):
587 raise ValueError("Could not parse git-remote prune result: %r" % line)
640 continue
588641 ref_name = line.replace(token, "")
589642 # sometimes, paths start with a full ref name, like refs/tags/foo, see #260
590643 if ref_name.startswith(Reference._common_path_default + '/'):
591 out_refs.append(SymbolicReference.from_path(self.repo, ref_name))
644 out_refs.append(Reference.from_path(self.repo, ref_name))
592645 else:
593646 fqhn = "%s/%s" % (RemoteReference._common_path_default, ref_name)
594647 out_refs.append(RemoteReference(self.repo, fqhn))
596649 # END for each line
597650 return out_refs
598651
599 @classmethod
600 def create(cls, repo, name, url, **kwargs):
652 @ classmethod
653 def create(cls, repo: 'Repo', name: str, url: str, **kwargs: Any) -> 'Remote':
601654 """Create a new remote to the given repository
602655 :param repo: Repository instance that is to receive the new remote
603656 :param name: Desired name of the remote
613666 # add is an alias
614667 add = create
615668
616 @classmethod
617 def remove(cls, repo, name):
669 @ classmethod
670 def remove(cls, repo: 'Repo', name: str) -> str:
618671 """Remove the remote with the given name
619672 :return: the passed remote name to remove
620673 """
626679 # alias
627680 rm = remove
628681
629 def rename(self, new_name):
682 def rename(self, new_name: str) -> 'Remote':
630683 """Rename self to the given new_name
631684 :return: self """
632685 if self.name == new_name:
638691
639692 return self
640693
641 def update(self, **kwargs):
694 def update(self, **kwargs: Any) -> 'Remote':
642695 """Fetch all changes for this remote, including new branches which will
643696 be forced in (in case your local remote branch is no longer part of the
644697 new remote branch's ancestry).
652705 self.repo.git.remote(scmd, self.name, **kwargs)
653706 return self
654707
655 def _get_fetch_info_from_stderr(self, proc, progress):
708 def _get_fetch_info_from_stderr(self, proc: 'Git.AutoInterrupt',
709 progress: Union[Callable[..., Any], RemoteProgress, None]
710 ) -> IterableList['FetchInfo']:
711
656712 progress = to_progress_instance(progress)
657713
658714 # skip first line as it is some remote info we are not interested in
659 output = IterableList('name')
715 output: IterableList['FetchInfo'] = IterableList('name')
660716
661717 # lines which are no progress are fetch info lines
662718 # this also waits for the command to finish
694750 msg += "Will ignore extra progress lines or fetch head lines."
695751 msg %= (l_fil, l_fhi)
696752 log.debug(msg)
697 log.debug("info lines: " + str(fetch_info_lines))
698 log.debug("head info : " + str(fetch_head_info))
753 log.debug(b"info lines: " + str(fetch_info_lines).encode("UTF-8"))
754 log.debug(b"head info: " + str(fetch_head_info).encode("UTF-8"))
699755 if l_fil < l_fhi:
700756 fetch_head_info = fetch_head_info[:l_fil]
701757 else:
711767 log.warning("Git informed while fetching: %s", err_line.strip())
712768 return output
713769
714 def _get_push_info(self, proc, progress):
770 def _get_push_info(self, proc: 'Git.AutoInterrupt',
771 progress: Union[Callable[..., Any], RemoteProgress, None]) -> IterableList[PushInfo]:
715772 progress = to_progress_instance(progress)
716773
717774 # read progress information from stderr
719776 # read the lines manually as it will use carriage returns between the messages
720777 # to override the previous one. This is why we read the bytes manually
721778 progress_handler = progress.new_message_handler()
722 output = []
723
724 def stdout_handler(line):
779 output: IterableList[PushInfo] = IterableList('push_infos')
780
781 def stdout_handler(line: str) -> None:
725782 try:
726783 output.append(PushInfo._from_line(self, line))
727784 except ValueError:
740797
741798 return output
742799
743 def _assert_refspec(self):
800 def _assert_refspec(self) -> None:
744801 """Turns out we can't deal with remotes if the refspec is missing"""
745802 config = self.config_reader
746803 unset = 'placeholder'
753810 finally:
754811 config.release()
755812
756 def fetch(self, refspec=None, progress=None, verbose=True, **kwargs):
813 def fetch(self, refspec: Union[str, List[str], None] = None,
814 progress: Union[RemoteProgress, None, 'UpdateProgress'] = None,
815 verbose: bool = True, **kwargs: Any) -> IterableList[FetchInfo]:
757816 """Fetch the latest changes for this remote
758817
759818 :param refspec:
784843 if refspec is None:
785844 # No argument refspec, then ensure the repo's config has a fetch refspec.
786845 self._assert_refspec()
846
787847 kwargs = add_progress(kwargs, self.repo.git, progress)
788848 if isinstance(refspec, list):
789 args = refspec
849 args: Sequence[Optional[str]] = refspec
790850 else:
791851 args = [refspec]
792852
797857 self.repo.odb.update_cache()
798858 return res
799859
800 def pull(self, refspec=None, progress=None, **kwargs):
860 def pull(self, refspec: Union[str, List[str], None] = None,
861 progress: Union[RemoteProgress, 'UpdateProgress', None] = None,
862 **kwargs: Any) -> IterableList[FetchInfo]:
801863 """Pull changes from the given branch, being the same as a fetch followed
802864 by a merge of branch with your local branch.
803865
816878 self.repo.odb.update_cache()
817879 return res
818880
819 def push(self, refspec=None, progress=None, **kwargs):
881 def push(self, refspec: Union[str, List[str], None] = None,
882 progress: Union[RemoteProgress, 'UpdateProgress', Callable[..., RemoteProgress], None] = None,
883 **kwargs: Any) -> IterableList[PushInfo]:
820884 """Push changes from source branch in refspec to target branch in refspec.
821885
822886 :param refspec: see 'fetch' method
846910 universal_newlines=True, **kwargs)
847911 return self._get_push_info(proc, progress)
848912
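As the `progress` annotations above show, `push` also accepts a plain callable, which `to_progress_instance` wraps in a `CallableRemoteProgress`. The callable receives the arguments of `RemoteProgress.update` (a sketch; the handler body and the integer op-code value are illustrative, the real stage constants live on `RemoteProgress`):

```python
messages = []

def progress_handler(op_code, cur_count, max_count=None, message=''):
    """Signature expected of a callable progress handler
    (mirrors RemoteProgress.update)."""
    if max_count:
        messages.append("%s: %d/%d %s" % (op_code, cur_count, max_count, message))
    else:
        messages.append("%s: %d %s" % (op_code, cur_count, message))

# a remote operation would invoke the wrapped handler roughly like this:
progress_handler(32, 10, 100, "Receiving objects")
```

Passing such a handler as `remote.push(progress=progress_handler)` gives line-by-line progress without subclassing `RemoteProgress`.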
849 @property
850 def config_reader(self):
913 @ property
914 def config_reader(self) -> SectionConstraint[GitConfigParser]:
851915 """
852916 :return:
853917 GitConfigParser compatible object able to read options for only our remote.
854918 Hence you may simply type config.get("pushurl") to obtain the information"""
855919 return self._config_reader
856920
857 def _clear_cache(self):
921 def _clear_cache(self) -> None:
858922 try:
859923 del(self._config_reader)
860924 except AttributeError:
861925 pass
862926 # END handle exception
863927
864 @property
865 def config_writer(self):
928 @ property
929 def config_writer(self) -> SectionConstraint:
866930 """
867931 :return: GitConfigParser compatible object able to write options for this remote.
868932 :note:
00 """Initialize the Repo package"""
11 # flake8: noqa
2 from __future__ import absolute_import
3 from .base import *
2 from .base import Repo as Repo
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5
6 from collections import namedtuple
5 from __future__ import annotations
76 import logging
87 import os
98 import re
9 import shlex
1010 import warnings
11 from gitdb.db.loose import LooseObjectDB
12
13 from gitdb.exc import BadObject
1114
1215 from git.cmd import (
1316 Git,
2528 from git.objects import Submodule, RootModule, Commit
2629 from git.refs import HEAD, Head, Reference, TagReference
2730 from git.remote import Remote, add_progress, to_progress_instance
28 from git.util import Actor, finalize_process, decygpath, hex_to_bin, expand_path
31 from git.util import Actor, finalize_process, decygpath, hex_to_bin, expand_path, remove_password_if_present
2932 import os.path as osp
3033
3134 from .fun import rev_parse, is_git_dir, find_submodule_git_dir, touch, find_worktree_git_dir
3235 import gc
3336 import gitdb
3437
35 try:
36 import pathlib
37 except ImportError:
38 pathlib = None
39
38 # typing ------------------------------------------------------
39
40 from git.types import TBD, PathLike, Lit_config_levels, Commit_ish, Tree_ish, assert_never
41 from typing import (Any, BinaryIO, Callable, Dict,
42 Iterator, List, Mapping, Optional, Sequence,
43 TextIO, Tuple, Type, Union,
44 NamedTuple, cast, TYPE_CHECKING)
45
46 from git.types import ConfigLevels_Tup, TypedDict
47
48 if TYPE_CHECKING:
49 from git.util import IterableList
50 from git.refs.symbolic import SymbolicReference
51 from git.objects import Tree
52 from git.objects.submodule.base import UpdateProgress
53 from git.remote import RemoteProgress
54
55 # -----------------------------------------------------------
4056
4157 log = logging.getLogger(__name__)
4258
43 BlameEntry = namedtuple('BlameEntry', ['commit', 'linenos', 'orig_path', 'orig_linenos'])
44
45
4659 __all__ = ('Repo',)
60
61
62 class BlameEntry(NamedTuple):
63 commit: Dict[str, 'Commit']
64 linenos: range
65 orig_path: Optional[str]
66 orig_linenos: range
4767
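Converting `BlameEntry` from `collections.namedtuple` to `typing.NamedTuple` as above keeps the runtime tuple behaviour while adding per-field annotations. The pattern in miniature (stand-in field types; the real `commit` field is `Dict[str, 'Commit']`):

```python
from typing import NamedTuple, Optional

class BlameEntrySketch(NamedTuple):
    """Miniature of the BlameEntry conversion (stand-in types)."""
    commit: str                 # stands in for Dict[str, 'Commit']
    linenos: range
    orig_path: Optional[str]
    orig_linenos: range

entry = BlameEntrySketch("deadbeef", range(1, 4), "README", range(1, 4))
# tuple behaviour is unchanged: indexing still matches attribute access
assert entry[0] == entry.commit == "deadbeef"
print(list(entry.linenos))  # → [1, 2, 3]
```

Existing callers that index or unpack `BlameEntry` therefore keep working, while type checkers now see the field types.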
4868
4969 class Repo(object):
6282 'git_dir' is the .git repository directory, which is always set."""
6383 DAEMON_EXPORT_FILE = 'git-daemon-export-ok'
6484
65 git = None # Must exist, or __del__ will fail in case we raise on `__init__()`
66 working_dir = None
67 _working_tree_dir = None
68 git_dir = None
69 _common_dir = None
85 git = cast('Git', None) # Must exist, or __del__ will fail in case we raise on `__init__()`
86 working_dir: Optional[PathLike] = None
87 _working_tree_dir: Optional[PathLike] = None
88 git_dir: PathLike = ""
89 _common_dir: PathLike = ""
7090
7191 # precompiled regex
7292 re_whitespace = re.compile(r'\s+')
7898
7999 # invariants
80100 # represents the configuration level of a configuration file
81 config_level = ("system", "user", "global", "repository")
101 config_level: ConfigLevels_Tup = ("system", "user", "global", "repository")
82102
83103 # Subclass configuration
84104 # Subclasses may easily bring in their own custom types by placing a constructor or type here
85105 GitCommandWrapperType = Git
86106
87 def __init__(self, path=None, odbt=GitCmdObjectDB, search_parent_directories=False, expand_vars=True):
107 def __init__(self, path: Optional[PathLike] = None, odbt: Type[LooseObjectDB] = GitCmdObjectDB,
108 search_parent_directories: bool = False, expand_vars: bool = True) -> None:
88109 """Create a new Repo instance
89110
90111 :param path:
125146 warnings.warn("The use of environment variables in paths is deprecated" +
126147 "\nfor security reasons and may be removed in the future!!")
127148 epath = expand_path(epath, expand_vars)
128 if not os.path.exists(epath):
129 raise NoSuchPathError(epath)
149 if epath is not None:
150 if not os.path.exists(epath):
151 raise NoSuchPathError(epath)
130152
131153 ## Walk up the path to find the `.git` dir.
132154 #
189211 try:
190212 common_dir = open(osp.join(self.git_dir, 'commondir'), 'rt').readlines()[0].strip()
191213 self._common_dir = osp.join(self.git_dir, common_dir)
192 except (OSError, IOError):
193 self._common_dir = None
214 except OSError:
215 self._common_dir = ""
194216
195217 # adjust the wd in case we are actually bare - we didn't know that
196218 # in the first place
198220 self._working_tree_dir = None
199221 # END working dir handling
200222
201 self.working_dir = self._working_tree_dir or self.common_dir
223 self.working_dir: Optional[PathLike] = self._working_tree_dir or self.common_dir
202224 self.git = self.GitCommandWrapperType(self.working_dir)
203225
204226 # special handling, in special times
205 args = [osp.join(self.common_dir, 'objects')]
227 rootpath = osp.join(self.common_dir, 'objects')
206228 if issubclass(odbt, GitCmdObjectDB):
207 args.append(self.git)
208 self.odb = odbt(*args)
209
210 def __enter__(self):
229 self.odb = odbt(rootpath, self.git)
230 else:
231 self.odb = odbt(rootpath)
232
233 def __enter__(self) -> 'Repo':
211234 return self
212235
213 def __exit__(self, exc_type, exc_value, traceback):
236 def __exit__(self, *args: Any) -> None:
214237 self.close()
215238
216 def __del__(self):
239 def __del__(self) -> None:
217240 try:
218241 self.close()
219242 except Exception:
220243 pass
221244
222 def close(self):
245 def close(self) -> None:
223246 if self.git:
224247 self.git.clear_cache()
225248 # Tempfiles objects on Windows are holding references to
234257 if is_win:
235258 gc.collect()
236259
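The `__enter__`/`__exit__` pair above makes `Repo` usable as a context manager that guarantees `close()` runs. The pattern reduces to (a generic sketch, not the real class):

```python
class ClosingSketch:
    """Same shape as Repo's context-manager support: __enter__ returns
    self and __exit__ unconditionally delegates to close()."""
    def __init__(self):
        self.closed = False

    def __enter__(self):
        return self

    def __exit__(self, *args):
        self.close()

    def close(self):
        self.closed = True

with ClosingSketch() as obj:
    assert not obj.closed   # still open inside the block
assert obj.closed           # close() ran on exit, even after exceptions
```

Using `with Repo(path) as repo: ...` thus releases cached resources deterministically instead of relying on `__del__`.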
237 def __eq__(self, rhs):
238 if isinstance(rhs, Repo):
260 def __eq__(self, rhs: object) -> bool:
261 if isinstance(rhs, Repo) and self.git_dir:
239262 return self.git_dir == rhs.git_dir
240263 return False
241264
242 def __ne__(self, rhs):
265 def __ne__(self, rhs: object) -> bool:
243266 return not self.__eq__(rhs)
244267
245 def __hash__(self):
268 def __hash__(self) -> int:
246269 return hash(self.git_dir)
247270
248271 # Description property
249 def _get_description(self):
250 filename = osp.join(self.git_dir, 'description')
272 def _get_description(self) -> str:
273 if self.git_dir:
274 filename = osp.join(self.git_dir, 'description')
251275 with open(filename, 'rb') as fp:
252276 return fp.read().rstrip().decode(defenc)
253277
254 def _set_description(self, descr):
255 filename = osp.join(self.git_dir, 'description')
278 def _set_description(self, descr: str) -> None:
279 if self.git_dir:
280 filename = osp.join(self.git_dir, 'description')
256281 with open(filename, 'wb') as fp:
257282 fp.write((descr + '\n').encode(defenc))
258283
262287 del _set_description
263288
264289 @property
265 def working_tree_dir(self):
290 def working_tree_dir(self) -> Optional[PathLike]:
266291 """:return: The working tree directory of our git repository. If this is a bare repository, None is returned.
267292 """
268293 return self._working_tree_dir
269294
270295 @property
271 def common_dir(self):
296 def common_dir(self) -> PathLike:
272297 """
273298 :return: The git dir that holds everything except possibly HEAD,
274299 FETCH_HEAD, ORIG_HEAD, COMMIT_EDITMSG, index, and logs/."""
275 return self._common_dir or self.git_dir
300 if self._common_dir:
301 return self._common_dir
302 elif self.git_dir:
303 return self.git_dir
304 else:
305 # or could return ""
306 raise InvalidGitRepositoryError()
276307
277308 @property
278 def bare(self):
309 def bare(self) -> bool:
279310 """:return: True if the repository is bare"""
280311 return self._bare
281312
282313 @property
283 def heads(self):
314 def heads(self) -> 'IterableList[Head]':
284315 """A list of ``Head`` objects representing the branch heads in
285316 this repo
286317
288319 return Head.list_items(self)
289320
290321 @property
291 def references(self):
322 def references(self) -> 'IterableList[Reference]':
292323 """A list of Reference objects representing tags, heads and remote references.
293324
294325 :return: IterableList(Reference, ...)"""
301332 branches = heads
302333
303334 @property
304 def index(self):
335 def index(self) -> 'IndexFile':
305336 """:return: IndexFile representing this repository's index.
306337 :note: This property can be expensive, as the returned ``IndexFile`` will be
307338 reinitialized. It's recommended to re-use the object."""
308339 return IndexFile(self)
309340
310341 @property
311 def head(self):
342 def head(self) -> 'HEAD':
312343 """:return: HEAD Object pointing to the current head reference"""
313344 return HEAD(self, 'HEAD')
314345
315346 @property
316 def remotes(self):
347 def remotes(self) -> 'IterableList[Remote]':
317348 """A list of Remote objects allowing access to and manipulation of remotes
318349 :return: ``git.IterableList(Remote, ...)``"""
319350 return Remote.list_items(self)
320351
321 def remote(self, name='origin'):
352 def remote(self, name: str = 'origin') -> 'Remote':
322353 """:return: Remote with the specified name
323354 :raise ValueError: if no remote with such a name exists"""
324355 r = Remote(self, name)
329360 #{ Submodules
330361
331362 @property
332 def submodules(self):
363 def submodules(self) -> 'IterableList[Submodule]':
333364 """
334365 :return: git.IterableList(Submodule, ...) of direct submodules
335366 available from the current head"""
336367 return Submodule.list_items(self)
337368
338 def submodule(self, name):
369 def submodule(self, name: str) -> 'Submodule':
339370 """ :return: Submodule with the given name
340371 :raise ValueError: If no such submodule exists"""
341372 try:
344375 raise ValueError("Didn't find submodule named %r" % name) from e
345376 # END exception handling
346377
347 def create_submodule(self, *args, **kwargs):
378 def create_submodule(self, *args: Any, **kwargs: Any) -> Submodule:
348379 """Create a new submodule
349380
350381 :note: See the documentation of Submodule.add for a description of the
352383 :return: created submodules"""
353384 return Submodule.add(self, *args, **kwargs)
354385
355 def iter_submodules(self, *args, **kwargs):
386 def iter_submodules(self, *args: Any, **kwargs: Any) -> Iterator[Submodule]:
356387 """An iterator yielding Submodule instances, see Traversable interface
357388 for a description of args and kwargs
358389 :return: Iterator"""
359390 return RootModule(self).traverse(*args, **kwargs)
360391
361 def submodule_update(self, *args, **kwargs):
392 def submodule_update(self, *args: Any, **kwargs: Any) -> Iterator[Submodule]:
362393 """Update the submodules, keeping the repository consistent as it will
363394 take the previous state into consideration. For more information, please
364395 see the documentation of RootModule.update"""
367398 #}END submodules
368399
369400 @property
370 def tags(self):
401 def tags(self) -> 'IterableList[TagReference]':
371402 """A list of ``Tag`` objects that are available in this repo
372403 :return: ``git.IterableList(TagReference, ...)`` """
373404 return TagReference.list_items(self)
374405
375 def tag(self, path):
406 def tag(self, path: PathLike) -> TagReference:
376407 """:return: TagReference Object, reference pointing to a Commit or Tag
377408 :param path: path to the tag reference, i.e. 0.1.5 or tags/0.1.5 """
378 return TagReference(self, path)
379
380 def create_head(self, path, commit='HEAD', force=False, logmsg=None):
409 full_path = self._to_full_tag_path(path)
410 return TagReference(self, full_path)
411
412 @staticmethod
413 def _to_full_tag_path(path: PathLike) -> str:
414 path_str = str(path)
415 if path_str.startswith(TagReference._common_path_default + '/'):
416 return path_str
417 if path_str.startswith(TagReference._common_default + '/'):
418 return Reference._common_path_default + '/' + path_str
419 else:
420 return TagReference._common_path_default + '/' + path_str
421
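The new `_to_full_tag_path` helper accepts a bare tag name, a `tags/...` path, or an already fully qualified path. A minimal standalone sketch of the same normalization; the prefix constants here are assumptions mirroring git's conventional reference layout, not values read from the library:

```python
# Sketch of the tag-path normalization in _to_full_tag_path() above.
# The prefixes mirror git's conventional layout and are assumed here:
TAG_PREFIX = "refs/tags"   # stands in for TagReference._common_path_default
TAG_SHORT = "tags"         # stands in for TagReference._common_default
REF_PREFIX = "refs"        # stands in for Reference._common_path_default


def to_full_tag_path(path: str) -> str:
    if path.startswith(TAG_PREFIX + "/"):
        return path                        # already fully qualified
    if path.startswith(TAG_SHORT + "/"):
        return REF_PREFIX + "/" + path     # "tags/x" -> "refs/tags/x"
    return TAG_PREFIX + "/" + path         # bare name -> "refs/tags/x"
```

With this in place, `repo.tag("0.1.5")`, `repo.tag("tags/0.1.5")`, and `repo.tag("refs/tags/0.1.5")` all resolve to the same reference path.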
422 def create_head(self, path: PathLike, commit: str = 'HEAD',
423 force: bool = False, logmsg: Optional[str] = None
424 ) -> 'Head':
381425 """Create a new head within the repository.
382426 For more documentation, please see the Head.create method.
383427
384428 :return: newly created Head Reference"""
385 return Head.create(self, path, commit, force, logmsg)
386
387 def delete_head(self, *heads, **kwargs):
429 return Head.create(self, path, commit, logmsg, force)
430
431 def delete_head(self, *heads: 'Head', **kwargs: Any) -> None:
388432 """Delete the given heads
389433
390434 :param kwargs: Additional keyword arguments to be passed to git-branch"""
391435 return Head.delete(self, *heads, **kwargs)
392436
393 def create_tag(self, path, ref='HEAD', message=None, force=False, **kwargs):
437 def create_tag(self, path: PathLike, ref: str = 'HEAD',
438 message: Optional[str] = None, force: bool = False, **kwargs: Any
439 ) -> TagReference:
394440 """Create a new tag reference.
395441 For more documentation, please see the TagReference.create method.
396442
397443 :return: TagReference object """
398444 return TagReference.create(self, path, ref, message, force, **kwargs)
399445
400 def delete_tag(self, *tags):
446 def delete_tag(self, *tags: TagReference) -> None:
401447 """Delete the given tag references"""
402448 return TagReference.delete(self, *tags)
403449
404 def create_remote(self, name, url, **kwargs):
450 def create_remote(self, name: str, url: str, **kwargs: Any) -> Remote:
405451 """Create a new remote.
406452
407453 For more information, please see the documentation of the Remote.create
410456 :return: Remote reference"""
411457 return Remote.create(self, name, url, **kwargs)
412458
413 def delete_remote(self, remote):
459 def delete_remote(self, remote: 'Remote') -> str:
414460 """Delete the given remote."""
415461 return Remote.remove(self, remote)
416462
417 def _get_config_path(self, config_level):
463 def _get_config_path(self, config_level: Lit_config_levels) -> str:
418464 # we do not support an absolute path of the gitconfig on Windows,
419465 # use the global config instead
420466 if is_win and config_level == "system":
428474 elif config_level == "global":
429475 return osp.normpath(osp.expanduser("~/.gitconfig"))
430476 elif config_level == "repository":
431 return osp.normpath(osp.join(self._common_dir or self.git_dir, "config"))
432
433 raise ValueError("Invalid configuration level: %r" % config_level)
434
435 def config_reader(self, config_level=None):
477 repo_dir = self._common_dir or self.git_dir
478 if not repo_dir:
479 raise NotADirectoryError
480 else:
481 return osp.normpath(osp.join(repo_dir, "config"))
482 else:
483
484 assert_never(config_level, # type:ignore[unreachable]
485 ValueError(f"Invalid configuration level: {config_level!r}"))
486
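The rewritten `_get_config_path` ends in an `assert_never()` call so that a type checker can verify the `Lit_config_levels` branches are exhaustive. A minimal sketch of such a helper, assuming it raises the supplied exception for values that should have been ruled out (the real signature in `git.types` may differ):

```python
# Sketch of an assert_never() exhaustiveness helper (assumed behavior:
# raise the supplied exception, or a generic one, at runtime).
from typing import NoReturn, Optional


def assert_never(value: NoReturn, exc: Optional[Exception] = None) -> NoReturn:
    """Raise on values the type checker should have proven unreachable."""
    if exc is not None:
        raise exc
    raise ValueError(f"Unhandled value: {value!r}")
```

If every literal is handled above the call, mypy narrows the argument to `NoReturn` and the call type-checks; an unhandled literal becomes a static error instead of a silent fall-through.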
487 def config_reader(self, config_level: Optional[Lit_config_levels] = None,
488 ) -> GitConfigParser:
436489 """
437490 :return:
438491 GitConfigParser that allows reading the full git configuration, but not writing it
448501 unknown, instead the global path will be used."""
449502 files = None
450503 if config_level is None:
451 files = [self._get_config_path(f) for f in self.config_level]
504 files = [self._get_config_path(cast(Lit_config_levels, f))
505 for f in self.config_level if cast(Lit_config_levels, f)]
452506 else:
453507 files = [self._get_config_path(config_level)]
454508 return GitConfigParser(files, read_only=True, repo=self)
455509
456 def config_writer(self, config_level="repository"):
510 def config_writer(self, config_level: Lit_config_levels = "repository"
511 ) -> GitConfigParser:
457512 """
458513 :return:
459514 GitConfigParser that allows writing values at the specified configuration file level.
468523 repository = configuration file for this repository only"""
469524 return GitConfigParser(self._get_config_path(config_level), read_only=False, repo=self)
470525
471 def commit(self, rev=None):
526 def commit(self, rev: Union[str, Commit_ish, None] = None
527 ) -> Commit:
472528 """The Commit object for the specified revision
473529
474530 :param rev: revision specifier, see git-rev-parse for viable options.
478534 return self.head.commit
479535 return self.rev_parse(str(rev) + "^0")
480536
481 def iter_trees(self, *args, **kwargs):
537 def iter_trees(self, *args: Any, **kwargs: Any) -> Iterator['Tree']:
482538 """:return: Iterator yielding Tree objects
483539 :note: Takes all arguments known to iter_commits method"""
484540 return (c.tree for c in self.iter_commits(*args, **kwargs))
485541
486 def tree(self, rev=None):
542 def tree(self, rev: Union[Tree_ish, str, None] = None) -> 'Tree':
487543 """The Tree object for the given treeish revision
488544 Examples::
489545
500556 return self.head.commit.tree
501557 return self.rev_parse(str(rev) + "^{tree}")
502558
503 def iter_commits(self, rev=None, paths='', **kwargs):
559 def iter_commits(self, rev: Union[str, Commit, 'SymbolicReference', None] = None,
560 paths: Union[PathLike, Sequence[PathLike]] = '',
561 **kwargs: Any) -> Iterator[Commit]:
504562 """A list of Commit objects representing the history of a given ref/commit
505563
506564 :param rev:
524582
525583 return Commit.iter_items(self, rev, paths, **kwargs)
526584
527 def merge_base(self, *rev, **kwargs):
585 def merge_base(self, *rev: TBD, **kwargs: Any
586 ) -> List[Union[Commit_ish, None]]:
528587 """Find the closest common ancestor for the given revision (e.g. Commits, Tags, References, etc)
529588
530589 :param rev: At least two revs to find the common ancestor for.
537596 raise ValueError("Please specify at least two revs, got only %i" % len(rev))
538597 # end handle input
539598
540 res = []
599 res: List[Union[Commit_ish, None]] = []
541600 try:
542 lines = self.git.merge_base(*rev, **kwargs).splitlines()
601 lines = self.git.merge_base(*rev, **kwargs).splitlines() # List[str]
543602 except GitCommandError as err:
544603 if err.status == 128:
545604 raise
555614
556615 return res
557616
558 def is_ancestor(self, ancestor_rev, rev):
617 def is_ancestor(self, ancestor_rev: 'Commit', rev: 'Commit') -> bool:
559618 """Check if a commit is an ancestor of another
560619
561620 :param ancestor_rev: Rev which should be an ancestor
570629 raise
571630 return True
572631
573 def _get_daemon_export(self):
574 filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE)
632 def is_valid_object(self, sha: str, object_type: Union[str, None] = None) -> bool:
633 try:
634 complete_sha = self.odb.partial_to_complete_sha_hex(sha)
635 object_info = self.odb.info(complete_sha)
636 if object_type:
637 if object_info.type == object_type.encode():
638 return True
639 else:
640 log.debug("Commit hash points to an object of type '%s'. Requested were objects of type '%s'",
641 object_info.type.decode(), object_type)
642 return False
643 else:
644 return True
645 except BadObject:
646 log.debug("Commit hash is invalid.")
647 return False
648
649 def _get_daemon_export(self) -> bool:
650 if self.git_dir:
651 filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE)
575652 return osp.exists(filename)
576653
577 def _set_daemon_export(self, value):
578 filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE)
654 def _set_daemon_export(self, value: object) -> None:
655 if self.git_dir:
656 filename = osp.join(self.git_dir, self.DAEMON_EXPORT_FILE)
579657 fileexists = osp.exists(filename)
580658 if value and not fileexists:
581659 touch(filename)
587665 del _get_daemon_export
588666 del _set_daemon_export
589667
590 def _get_alternates(self):
668 def _get_alternates(self) -> List[str]:
591669 """The list of alternates for this repo from which objects can be retrieved
592670
593671 :return: list of strings being pathnames of alternates"""
594 alternates_path = osp.join(self.git_dir, 'objects', 'info', 'alternates')
672 if self.git_dir:
673 alternates_path = osp.join(self.git_dir, 'objects', 'info', 'alternates')
595674
596675 if osp.exists(alternates_path):
597676 with open(alternates_path, 'rb') as f:
599678 return alts.strip().splitlines()
600679 return []
601680
602 def _set_alternates(self, alts):
681 def _set_alternates(self, alts: List[str]) -> None:
603682 """Sets the alternates
604683
605684 :param alts:
621700 alternates = property(_get_alternates, _set_alternates,
622701 doc="Retrieve a list of alternates paths or set a list paths to be used as alternates")
623702
624 def is_dirty(self, index=True, working_tree=True, untracked_files=False,
625 submodules=True, path=None):
703 def is_dirty(self, index: bool = True, working_tree: bool = True, untracked_files: bool = False,
704 submodules: bool = True, path: Optional[PathLike] = None) -> bool:
626705 """
627706 :return:
628707 ``True`` if the repository is considered dirty. By default it will react
638717 if not submodules:
639718 default_args.append('--ignore-submodules')
640719 if path:
641 default_args.extend(["--", path])
720 default_args.extend(["--", str(path)])
642721 if index:
643722 # diff index against HEAD
644723 if osp.isfile(self.index.path) and \
657736 return False
658737
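The `is_dirty` change above wraps `path` in `str()` so `PathLike` values survive the argument assembly. A sketch of that assembly; the base diff flags are a placeholder here (elided, not GitPython's actual list), only the submodule/path handling mirrors the diff:

```python
# Sketch of the argument assembly in is_dirty() above; base diff
# flags elided (placeholder), submodule/path handling as in the diff.
from typing import List, Optional


def diff_args(submodules: bool = True, path: Optional[str] = None) -> List[str]:
    args: List[str] = []  # base flags elided
    if not submodules:
        args.append("--ignore-submodules")
    if path:
        # str() added in the diff so os.PathLike values work here too
        args.extend(["--", str(path)])
    return args
```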
659738 @property
660 def untracked_files(self):
739 def untracked_files(self) -> List[str]:
661740 """
662741 :return:
663742 list(str,...)
672751 consider caching it yourself."""
673752 return self._get_untracked_files()
674753
675 def _get_untracked_files(self, *args, **kwargs):
754 def _get_untracked_files(self, *args: Any, **kwargs: Any) -> List[str]:
676755 # make sure we get all files, not only untracked directories
677756 proc = self.git.status(*args,
678757 porcelain=True,
696775 finalize_process(proc)
697776 return untracked_files
698777
699 def ignored(self, *paths):
778 def ignored(self, *paths: PathLike) -> List[str]:
700779 """Checks if paths are ignored via .gitignore
701780 It does so using the "git check-ignore" command.
702781
704783 :return: subset of those paths which are ignored
705784 """
706785 try:
707 proc = self.git.check_ignore(*paths)
786 proc: str = self.git.check_ignore(*paths)
708787 except GitCommandError:
709788 return []
710789 return proc.replace("\\\\", "\\").replace('"', "").split("\n")
711790
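The return statement of `ignored()` post-processes the `git check-ignore` output: unescape doubled backslashes, strip the quoting git adds around unusual paths, and split into one entry per line. As a standalone helper:

```python
# The post-processing from ignored() above, factored out: unescape
# backslashes, drop git's quoting, split into individual paths.
from typing import List


def parse_check_ignore(output: str) -> List[str]:
    return output.replace("\\\\", "\\").replace('"', "").split("\n")
```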
712791 @property
713 def active_branch(self):
792 def active_branch(self) -> Head:
714793 """The name of the currently active branch.
715
716794 :return: Head to the active branch"""
795 # reveal_type(self.head.reference) # => Reference
717796 return self.head.reference
718797
719 def blame_incremental(self, rev, file, **kwargs):
798 def blame_incremental(self, rev: str | HEAD, file: str, **kwargs: Any) -> Iterator['BlameEntry']:
720799 """Iterator for blame information for the given file at the given revision.
721800
722801 Unlike .blame(), this does not return the actual file's contents, only
730809 If you combine all line number ranges outputted by this command, you
731810 should get a continuous range spanning all line numbers in the file.
732811 """
733 data = self.git.blame(rev, '--', file, p=True, incremental=True, stdout_as_string=False, **kwargs)
734 commits = {}
812
813 data: bytes = self.git.blame(rev, '--', file, p=True, incremental=True, stdout_as_string=False, **kwargs)
814 commits: Dict[bytes, Commit] = {}
735815
736816 stream = (line for line in data.split(b'\n') if line)
737817 while True:
739819 line = next(stream) # when exhausted, causes a StopIteration, terminating this function
740820 except StopIteration:
741821 return
742 hexsha, orig_lineno, lineno, num_lines = line.split()
743 lineno = int(lineno)
744 num_lines = int(num_lines)
745 orig_lineno = int(orig_lineno)
822 split_line = line.split()
823 hexsha, orig_lineno_b, lineno_b, num_lines_b = split_line
824 lineno = int(lineno_b)
825 num_lines = int(num_lines_b)
826 orig_lineno = int(orig_lineno_b)
746827 if hexsha not in commits:
747828 # Now read the next few lines and build up a dict of properties
748829 # for this commit
749 props = {}
830 props: Dict[bytes, bytes] = {}
750831 while True:
751832 try:
752833 line = next(stream)
790871 safe_decode(orig_filename),
791872 range(orig_lineno, orig_lineno + num_lines))
792873
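Each group in `git blame -p --incremental` output starts with a header of the form `<sha> <orig_lineno> <lineno> <num_lines>`, which the loop above unpacks and converts. The same parsing as a standalone sketch (field layout per git-blame's porcelain format):

```python
# Standalone sketch of the group-header parsing in blame_incremental():
# header layout is b"<sha> <orig_lineno> <lineno> <num_lines>".
from typing import Tuple


def parse_blame_header(line: bytes) -> Tuple[str, int, int, int]:
    hexsha, orig_lineno, lineno, num_lines = line.split()
    return hexsha.decode(), int(orig_lineno), int(lineno), int(num_lines)
```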
793 def blame(self, rev, file, incremental=False, **kwargs):
874 def blame(self, rev: Union[str, HEAD], file: str, incremental: bool = False, **kwargs: Any
875 ) -> List[List[Commit | List[str | bytes] | None]] | Iterator[BlameEntry] | None:
794876 """The blame information for the given file at the given revision.
795877
796878 :param rev: revision specifier, see git-rev-parse for viable options.
797879 :return:
798880 list: [git.Commit, list: [<line>]]
799 A list of tuples associating a Commit object with a list of lines that
881 A list of lists associating a Commit object with a list of lines that
800882 changed within the given commit. The Commit objects will be given in order
801883 of appearance."""
802884 if incremental:
803885 return self.blame_incremental(rev, file, **kwargs)
804886
805 data = self.git.blame(rev, '--', file, p=True, stdout_as_string=False, **kwargs)
806 commits = {}
807 blames = []
808 info = None
887 data: bytes = self.git.blame(rev, '--', file, p=True, stdout_as_string=False, **kwargs)
888 commits: Dict[str, Commit] = {}
889 blames: List[List[Commit | List[str | bytes] | None]] = []
890
891 class InfoTD(TypedDict, total=False):
892 sha: str
893 id: str
894 filename: str
895 summary: str
896 author: str
897 author_email: str
898 author_date: int
899 committer: str
900 committer_email: str
901 committer_date: int
902
903 info: InfoTD = {}
809904
810905 keepends = True
811 for line in data.splitlines(keepends):
906 for line_bytes in data.splitlines(keepends):
812907 try:
813 line = line.rstrip().decode(defenc)
908 line_str = line_bytes.rstrip().decode(defenc)
814909 except UnicodeDecodeError:
815910 firstpart = ''
911 parts = []
816912 is_binary = True
817913 else:
818914 # We don't know where the binary data ends, as it could contain multiple newlines
819915 # along the way, so we rely on being able to decode the line to tell us what it is.
820916 # This can absolutely fail even on text files, but even if it does, we should be fine treating it
821917 # as binary instead
822 parts = self.re_whitespace.split(line, 1)
918 parts = self.re_whitespace.split(line_str, 1)
823919 firstpart = parts[0]
824920 is_binary = False
825921 # end handle decode of line
850946 # committer-time 1192271832
851947 # committer-tz -0700 - IGNORED BY US
852948 role = m.group(0)
853 if firstpart.endswith('-mail'):
854 info["%s_email" % role] = parts[-1]
855 elif firstpart.endswith('-time'):
856 info["%s_date" % role] = int(parts[-1])
857 elif role == firstpart:
858 info[role] = parts[-1]
949 if role == 'author':
950 if firstpart.endswith('-mail'):
951 info["author_email"] = parts[-1]
952 elif firstpart.endswith('-time'):
953 info["author_date"] = int(parts[-1])
954 elif role == firstpart:
955 info["author"] = parts[-1]
956 elif role == 'committer':
957 if firstpart.endswith('-mail'):
958 info["committer_email"] = parts[-1]
959 elif firstpart.endswith('-time'):
960 info["committer_date"] = int(parts[-1])
961 elif role == firstpart:
962 info["committer"] = parts[-1]
859963 # END distinguish mail,time,name
860964 else:
861965 # handle
872976 c = commits.get(sha)
873977 if c is None:
874978 c = Commit(self, hex_to_bin(sha),
875 author=Actor._from_string(info['author'] + ' ' + info['author_email']),
979 author=Actor._from_string(f"{info['author']} {info['author_email']}"),
876980 authored_date=info['author_date'],
877981 committer=Actor._from_string(
878 info['committer'] + ' ' + info['committer_email']),
982 f"{info['committer']} {info['committer_email']}"),
879983 committed_date=info['committer_date'])
880984 commits[sha] = c
985 blames[-1][0] = c
881986 # END if commit objects needs initial creation
882 if not is_binary:
883 if line and line[0] == '\t':
884 line = line[1:]
885 else:
886 # NOTE: We are actually parsing lines out of binary data, which can lead to the
887 # binary being split up along the newline separator. We will append this to the blame
888 # we are currently looking at, even though it should be concatenated with the last line
889 # we have seen.
890 pass
891 # end handle line contents
892 blames[-1][0] = c
893 blames[-1][1].append(line)
987
988 if blames[-1][1] is not None:
989 line: str | bytes
990 if not is_binary:
991 if line_str and line_str[0] == '\t':
992 line_str = line_str[1:]
993 line = line_str
994 else:
995 line = line_bytes
996 # NOTE: We are actually parsing lines out of binary data, which can lead to the
997 # binary being split up along the newline separator. We will append this to the
998 # blame we are currently looking at, even though it should be concatenated with
999 # the last line we have seen.
1000 blames[-1][1].append(line)
1001
8941002 info = {'id': sha}
8951003 # END if we collected commit info
8961004 # END distinguish filename,summary,rest
8981006 # END distinguish hexsha vs other information
8991007 return blames
9001008
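The diff unrolls the `"%s_email" % role` style dispatch into explicit author/committer branches so the `InfoTD` TypedDict keys stay literal. The underlying per-line dispatch is simple; a sketch mirroring the logic (not GitPython's exact code):

```python
# Sketch of the "<role>" / "<role>-mail" / "<role>-time" dispatch that
# blame() performs per porcelain header line; role is "author" or
# "committer". Mirrors the logic, not the library's exact code.
from typing import Dict, Union


def record_field(info: Dict[str, Union[str, int]], firstpart: str, value: str) -> None:
    role = firstpart.split("-", 1)[0]
    if firstpart.endswith("-mail"):
        info[role + "_email"] = value
    elif firstpart.endswith("-time"):
        info[role + "_date"] = int(value)  # epoch seconds
    elif firstpart == role:
        info[role] = value
```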
901 @classmethod
902 def init(cls, path=None, mkdir=True, odbt=GitCmdObjectDB, expand_vars=True, **kwargs):
1009 @classmethod
1010 def init(cls, path: Union[PathLike, None] = None, mkdir: bool = True, odbt: Type[GitCmdObjectDB] = GitCmdObjectDB,
1011 expand_vars: bool = True, **kwargs: Any) -> 'Repo':
9031012 """Initialize a git repository at the given path if specified
9041013
9051014 :param path:
9321041 os.makedirs(path, 0o755)
9331042
9341043 # git command automatically chdir into the directory
935 git = Git(path)
1044 git = cls.GitCommandWrapperType(path)
9361045 git.init(**kwargs)
9371046 return cls(path, odbt=odbt)
9381047
939 @classmethod
940 def _clone(cls, git, url, path, odb_default_type, progress, multi_options=None, **kwargs):
941 if progress is not None:
942 progress = to_progress_instance(progress)
943
1048 @classmethod
1049 def _clone(cls, git: 'Git', url: PathLike, path: PathLike, odb_default_type: Type[GitCmdObjectDB],
1050 progress: Union['RemoteProgress', 'UpdateProgress', Callable[..., 'RemoteProgress'], None] = None,
1051 multi_options: Optional[List[str]] = None, **kwargs: Any
1052 ) -> 'Repo':
9441053 odbt = kwargs.pop('odbt', odb_default_type)
9451054
9461055 # when pathlib.Path or other classbased path is passed
9611070 kwargs['separate_git_dir'] = Git.polish_url(sep_dir)
9621071 multi = None
9631072 if multi_options:
964 multi = ' '.join(multi_options).split(' ')
965 proc = git.clone(multi, Git.polish_url(url), clone_path, with_extended_output=True, as_process=True,
1073 multi = shlex.split(' '.join(multi_options))
1074 proc = git.clone(multi, Git.polish_url(str(url)), clone_path, with_extended_output=True, as_process=True,
9661075 v=True, universal_newlines=True, **add_progress(kwargs, git, progress))
9671076 if progress:
968 handle_process_output(proc, None, progress.new_message_handler(), finalize_process, decode_streams=False)
1077 handle_process_output(proc, None, to_progress_instance(progress).new_message_handler(),
1078 finalize_process, decode_streams=False)
9691079 else:
9701080 (stdout, stderr) = proc.communicate()
971 log.debug("Cmd(%s)'s unused stdout: %s", getattr(proc, 'args', ''), stdout)
1081 cmdline = getattr(proc, 'args', '')
1082 cmdline = remove_password_if_present(cmdline)
1083
1084 log.debug("Cmd(%s)'s unused stdout: %s", cmdline, stdout)
9721085 finalize_process(proc, stderr=stderr)
9731086
9741087 # our git command could have a different working dir than our actual
9751088 # environment, hence we prepend its working dir if required
976 if not osp.isabs(path) and git.working_dir:
977 path = osp.join(git._working_dir, path)
1089 if not osp.isabs(path):
1090 path = osp.join(git._working_dir, path) if git._working_dir is not None else path
9781091
9791092 repo = cls(path, odbt=odbt)
9801093
9921105 # END handle remote repo
9931106 return repo
9941107
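The switch from `' '.join(multi_options).split(' ')` to `shlex.split(...)` in `_clone` matters when an option value contains quoted whitespace; a naive split breaks it apart, shlex keeps it together:

```python
# Why _clone() now uses shlex.split() for multi_options: quoted
# values survive, a naive str.split(' ') breaks them apart.
import shlex

options = ["--depth 1", "--config 'user.name=Jane Doe'"]
naive = " ".join(options).split(" ")   # splits inside the quotes
multi = shlex.split(" ".join(options))  # keeps "user.name=Jane Doe" whole
```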
995 def clone(self, path, progress=None, multi_options=None, **kwargs):
1108 def clone(self, path: PathLike, progress: Optional[Callable] = None,
1109 multi_options: Optional[List[str]] = None, **kwargs: Any) -> 'Repo':
9961110 """Create a clone from this repository.
9971111
9981112 :param path: is the full path of the new repo (traditionally ends with ./<name>.git).
10091123 :return: ``git.Repo`` (the newly cloned repo)"""
10101124 return self._clone(self.git, self.common_dir, path, type(self.odb), progress, multi_options, **kwargs)
10111125
1012 @classmethod
1013 def clone_from(cls, url, to_path, progress=None, env=None, multi_options=None, **kwargs):
1126 @classmethod
1127 def clone_from(cls, url: PathLike, to_path: PathLike, progress: Optional[Callable] = None,
1128 env: Optional[Mapping[str, str]] = None,
1129 multi_options: Optional[List[str]] = None, **kwargs: Any) -> 'Repo':
10141130 """Create a clone from the given URL
10151131
10161132 :param url: valid git url, see http://www.kernel.org/pub/software/scm/git/docs/git-clone.html#URLS
10251141 :param multi_options: See ``clone`` method
10261142 :param kwargs: see the ``clone`` method
10271143 :return: Repo instance pointing to the cloned directory"""
1028 git = Git(os.getcwd())
1144 git = cls.GitCommandWrapperType(os.getcwd())
10291145 if env is not None:
10301146 git.update_environment(**env)
10311147 return cls._clone(git, url, to_path, GitCmdObjectDB, progress, multi_options, **kwargs)
10321148
1033 def archive(self, ostream, treeish=None, prefix=None, **kwargs):
1149 def archive(self, ostream: Union[TextIO, BinaryIO], treeish: Optional[str] = None,
1150 prefix: Optional[str] = None, **kwargs: Any) -> Repo:
10341151 """Archive the tree at the given revision.
10351152
10361153 :param ostream: file compatible stream object to which the archive will be written as bytes
10511168 kwargs['prefix'] = prefix
10521169 kwargs['output_stream'] = ostream
10531170 path = kwargs.pop('path', [])
1171 path = cast(Union[PathLike, List[PathLike], Tuple[PathLike, ...]], path)
10541172 if not isinstance(path, (tuple, list)):
10551173 path = [path]
10561174 # end assure paths is list
1057
10581175 self.git.archive(treeish, *path, **kwargs)
10591176 return self
10601177
1061 def has_separate_working_tree(self):
1178 def has_separate_working_tree(self) -> bool:
10621179 """
10631180 :return: True if our git_dir is not at the root of our working_tree_dir, but a .git file with a
10641181 platform agnostic symbolic link. Our git_dir will be wherever the .git file points to
10661183 """
10671184 if self.bare:
10681185 return False
1069 return osp.isfile(osp.join(self.working_tree_dir, '.git'))
1186 if self.working_tree_dir:
1187 return osp.isfile(osp.join(self.working_tree_dir, '.git'))
1188 else:
1189 return False # or raise Error?
10701190
10711191 rev_parse = rev_parse
10721192
1073 def __repr__(self):
1193 def __repr__(self) -> str:
10741194 clazz = self.__class__
10751195 return '<%s.%s %r>' % (clazz.__module__, clazz.__name__, self.git_dir)
10761196
1077 def currently_rebasing_on(self):
1197 def currently_rebasing_on(self) -> Commit | None:
10781198 """
10791199 :return: The commit which is currently being replayed while rebasing.
10801200
10811201 None if we are not currently rebasing.
10821202 """
1083 rebase_head_file = osp.join(self.git_dir, "REBASE_HEAD")
1203 if self.git_dir:
1204 rebase_head_file = osp.join(self.git_dir, "REBASE_HEAD")
10841205 if not osp.isfile(rebase_head_file):
10851206 return None
10861207 return self.commit(open(rebase_head_file, "rt").readline().strip())
00 """Package with general repository related functions"""
1 from __future__ import annotations
12 import os
23 import stat
34 from string import digits
1415 import os.path as osp
1516 from git.cmd import Git
1617
18 # Typing ----------------------------------------------------------------------
19
20 from typing import Union, Optional, cast, TYPE_CHECKING
21 from git.types import Commit_ish
22
23 if TYPE_CHECKING:
24 from git.types import PathLike
25 from .base import Repo
26 from git.db import GitCmdObjectDB
27 from git.refs.reference import Reference
28 from git.objects import Commit, TagObject, Blob, Tree
29 from git.refs.tag import Tag
30
31 # ----------------------------------------------------------------------------
1732
1833 __all__ = ('rev_parse', 'is_git_dir', 'touch', 'find_submodule_git_dir', 'name_to_object', 'short_to_long', 'deref_tag',
1934 'to_commit', 'find_worktree_git_dir')
2035
2136
22 def touch(filename):
37 def touch(filename: str) -> str:
2338 with open(filename, "ab"):
2439 pass
2540 return filename
2641
2742
28 def is_git_dir(d):
43 def is_git_dir(d: 'PathLike') -> bool:
2944 """ This is taken from the git setup.c:is_git_directory
3045 function.
3146
4762 return False
4863
4964
50 def find_worktree_git_dir(dotgit):
65 def find_worktree_git_dir(dotgit: 'PathLike') -> Optional[str]:
5166 """Search for a gitdir for this worktree."""
5267 try:
5368 statbuf = os.stat(dotgit)
6681 return None
6782
6883
69 def find_submodule_git_dir(d):
84 def find_submodule_git_dir(d: 'PathLike') -> Optional['PathLike']:
7085 """Search for a submodule repo."""
7186 if is_git_dir(d):
7287 return d
7489 try:
7590 with open(d) as fp:
7691 content = fp.read().rstrip()
77 except (IOError, OSError):
92 except IOError:
7893 # it's probably not a file
7994 pass
8095 else:
91106 return None
92107
93108
94 def short_to_long(odb, hexsha):
109 def short_to_long(odb: 'GitCmdObjectDB', hexsha: str) -> Optional[bytes]:
95110 """:return: long hexadecimal sha1 from the given less-than-40 byte hexsha
96111 or None if no candidate could be found.
97112 :param hexsha: hexsha with less than 40 byte"""
102117 # END exception handling
103118
104119
105 def name_to_object(repo, name, return_ref=False):
120 def name_to_object(repo: 'Repo', name: str, return_ref: bool = False
121 ) -> Union[SymbolicReference, 'Commit', 'TagObject', 'Blob', 'Tree']:
106122 """
107123 :return: object specified by the given name, hexshas ( short and long )
108124 as well as references are supported
109125 :param return_ref: if name specifies a reference, we will return the reference
110126 instead of the object. Otherwise it will raise BadObject or BadName
111127 """
112 hexsha = None
128 hexsha: Union[None, str, bytes] = None
113129
114130 # is it a hexsha ? Try the most common ones, which is 7 to 40
115131 if repo.re_hexsha_shortened.match(name):
149165 return Object.new_from_sha(repo, hex_to_bin(hexsha))
150166
151167
152 def deref_tag(tag):
168 def deref_tag(tag: 'Tag') -> 'TagObject':
153169 """Recursively dereference a tag and return the resulting object"""
154170 while True:
155171 try:
160176 return tag
161177
162178
163 def to_commit(obj):
179 def to_commit(obj: Object) -> Union['Commit', 'TagObject']:
164180 """Convert the given object to a commit if possible and return it"""
165181 if obj.type == 'tag':
166182 obj = deref_tag(obj)
171187 return obj
172188
173189
174 def rev_parse(repo, rev):
190 def rev_parse(repo: 'Repo', rev: str) -> Union['Commit', 'Tag', 'Tree', 'Blob']:
175191 """
176192 :return: Object at the given revision, either Commit, Tag, Tree or Blob
177193 :param rev: git-rev-parse compatible revision specification as string, please see
187203 raise NotImplementedError("commit by message search ( regex )")
188204 # END handle search
189205
190 obj = None
206 obj: Union[Commit_ish, 'Reference', None] = None
191207 ref = None
192208 output_type = "commit"
193209 start = 0
207223 ref = repo.head.ref
208224 else:
209225 if token == '@':
210 ref = name_to_object(repo, rev[:start], return_ref=True)
226 ref = cast('Reference', name_to_object(repo, rev[:start], return_ref=True))
211227 else:
212 obj = name_to_object(repo, rev[:start])
228 obj = cast(Commit_ish, name_to_object(repo, rev[:start]))
213229 # END handle token
214230 # END handle refname
231 else:
232 assert obj is not None
215233
216234 if ref is not None:
217 obj = ref.commit
235 obj = cast('Commit', ref.commit)
218236 # END handle ref
219237 # END initialize obj on first token
220238
232250 pass # default
233251 elif output_type == 'tree':
234252 try:
253 obj = cast(Commit_ish, obj)
235254 obj = to_commit(obj).tree
236255 except (AttributeError, ValueError):
237256 pass # error raised later
238257 # END exception handling
239258 elif output_type in ('', 'blob'):
240 if obj.type == 'tag':
259 obj = cast('TagObject', obj)
260 if obj and obj.type == 'tag':
241261 obj = deref_tag(obj)
242262 else:
243263 # cannot do anything for non-tags
265285 obj = Object.new_from_sha(repo, hex_to_bin(entry.newhexsha))
266286
267287 # make it pass the following checks
268 output_type = None
288 output_type = ''
269289 else:
270290 raise ValueError("Invalid output type: %s ( in %s )" % (output_type, rev))
271291 # END handle output type
272292
273293 # empty output types don't require any specific type, it's just about dereferencing tags
274 if output_type and obj.type != output_type:
294 if output_type and obj and obj.type != output_type:
275295 raise ValueError("Could not accommodate requested object type %r, got %s" % (output_type, obj.type))
276296 # END verify output type
277297
304324 parsed_to = start
305325 # handle hierarchy walk
306326 try:
327 obj = cast(Commit_ish, obj)
307328 if token == "~":
308329 obj = to_commit(obj)
309330 for _ in range(num):
325346 # END end handle tag
326347 except (IndexError, AttributeError) as e:
327348 raise BadName(
328 "Invalid revision spec '%s' - not enough "
329 "parent commits to reach '%s%i'" % (rev, token, num)) from e
349 f"Invalid revision spec '{rev}' - not enough "
350 f"parent commits to reach '{token}{int(num)}'") from e
330351 # END exception handling
331352 # END parse loop
332353
333354 # still no obj? It's probably a simple name
334355 if obj is None:
335 obj = name_to_object(repo, rev)
356 obj = cast(Commit_ish, name_to_object(repo, rev))
336357 parsed_to = lr
337358 # END handle simple name
338359
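The rev_parse() loop above consumes '~' and '^' suffixes one token at a time, counting parent/ancestor steps. A minimal, self-contained sketch of that tokenisation step (split_rev_tokens is a hypothetical helper for illustration, not part of GitPython):

```python
import re

def split_rev_tokens(rev: str):
    """Split a git-rev-parse style suffix such as 'HEAD~2^1' into the
    base name and a list of (token, count) steps, mirroring the parse
    loop in rev_parse(). Hypothetical helper, for illustration only."""
    match = re.match(r"([^~^]*)((?:[~^]\d*)*)$", rev)
    if match is None:
        raise ValueError("invalid revision spec: %r" % rev)
    base, suffix = match.groups()
    steps = []
    for token, num in re.findall(r"([~^])(\d*)", suffix):
        # a bare '~' or '^' counts as one step, as in git itself
        steps.append((token, int(num) if num else 1))
    return base, steps
```

This only models the suffix walk; resolving the base name to an object is what name_to_object() does.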
0 # -*- coding: utf-8 -*-
1 # This module is part of GitPython and is released under
2 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
3
4 import os
5 import sys
6 from typing import (Callable, Dict, NoReturn, Sequence, Tuple, Union, Any, Iterator, # noqa: F401
7 NamedTuple, TYPE_CHECKING, TypeVar) # noqa: F401
8
9 if sys.version_info[:2] >= (3, 8):
10 from typing import Final, Literal, SupportsIndex, TypedDict, Protocol, runtime_checkable # noqa: F401
11 else:
12 from typing_extensions import (Final, Literal, SupportsIndex, # noqa: F401
13 TypedDict, Protocol, runtime_checkable) # noqa: F401
14
15 # if sys.version_info[:2] >= (3, 10):
16 # from typing import TypeGuard # noqa: F401
17 # else:
18 # from typing_extensions import TypeGuard # noqa: F401
19
20
21 if sys.version_info[:2] < (3, 9):
22 PathLike = Union[str, os.PathLike]
23 elif sys.version_info[:2] >= (3, 9):
24 # os.PathLike only becomes subscriptable from Python 3.9 onwards
25 PathLike = Union[str, os.PathLike]
26
27 if TYPE_CHECKING:
28 from git.repo import Repo
29 from git.objects import Commit, Tree, TagObject, Blob
30 # from git.refs import SymbolicReference
31
32 TBD = Any
33 _T = TypeVar('_T')
34
35 Tree_ish = Union['Commit', 'Tree']
36 Commit_ish = Union['Commit', 'TagObject', 'Blob', 'Tree']
37 Lit_commit_ish = Literal['commit', 'tag', 'blob', 'tree']
38
39 # Config_levels ---------------------------------------------------------
40
41 Lit_config_levels = Literal['system', 'global', 'user', 'repository']
42
43
44 # def is_config_level(inp: str) -> TypeGuard[Lit_config_levels]:
45 # # return inp in get_args(Lit_config_level) # only py >= 3.8
46 # return inp in ("system", "user", "global", "repository")
47
48
49 ConfigLevels_Tup = Tuple[Literal['system'], Literal['user'], Literal['global'], Literal['repository']]
50
51 #-----------------------------------------------------------------------------------
52
53
54 def assert_never(inp: NoReturn, raise_error: bool = True, exc: Union[Exception, None] = None) -> None:
55 """For use in exhaustive checking of a Literal or Enum in an if/else chain.
56 Should only be reached if not all members are handled, or if a non-member is passed through the chain.
57
58 If all members handled, type is Empty. Otherwise, will cause mypy error.
59 If non-members given, should cause mypy error at variable creation.
60
61 If raise_error is True, will also raise AssertionError or the Exception passed to exc.
62 """
63 if raise_error:
64 if exc is None:
65 raise ValueError(f"An unhandled Literal ({inp}) in an if/else chain was found")
66 else:
67 raise exc
68 else:
69 pass
70
71
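assert_never() supports exhaustive if/else chains over Literal members such as Lit_config_levels. A small sketch of the pattern, with a simplified stand-in for the helper; describe_level and its description strings are hypothetical, for illustration only:

```python
from typing import NoReturn

def assert_never(inp: NoReturn) -> None:
    # simplified stand-in for the helper above
    raise ValueError(f"An unhandled Literal ({inp}) in an if/else chain was found")

def describe_level(level: str) -> str:
    # exhaustive chain over the members of Lit_config_levels;
    # the descriptions are illustrative only
    if level == "system":
        return "machine-wide configuration"
    elif level == "global":
        return "per-user configuration"
    elif level == "user":
        return "user-level configuration"
    elif level == "repository":
        return "per-repository configuration"
    else:
        assert_never(level)  # unreachable if all members are handled
```

With a Literal-typed argument, mypy flags the else branch as reachable whenever a member is missing from the chain.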
72 class Files_TD(TypedDict):
73 insertions: int
74 deletions: int
75 lines: int
76
77
78 class Total_TD(TypedDict):
79 insertions: int
80 deletions: int
81 lines: int
82 files: int
83
84
85 class HSH_TD(TypedDict):
86 total: Total_TD
87 files: Dict[PathLike, Files_TD]
88
89
90 @runtime_checkable
91 class Has_Repo(Protocol):
92 repo: 'Repo'
93
94
95 @runtime_checkable
96 class Has_id_attribute(Protocol):
97 _id_attribute_: str
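Because these protocols are decorated with @runtime_checkable, isinstance() checks merely test for the presence of the declared attribute, not its type. A self-contained sketch; the local Head class is a hypothetical stand-in, not git.Head:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class Has_id_attribute(Protocol):
    _id_attribute_: str

class Head:
    # any class exposing the attribute satisfies the protocol structurally
    _id_attribute_ = "name"

class Unrelated:
    pass
```

This is what lets IterableList accept anything bound by Has_id_attribute without a common base class.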
22 #
33 # This module is part of GitPython and is released under
44 # the BSD License: http://www.opensource.org/licenses/bsd-license.php
5
6 from abc import abstractmethod
7 from .exc import InvalidGitRepositoryError
8 import os.path as osp
9 from .compat import is_win
510 import contextlib
611 from functools import wraps
712 import getpass
1520 from sys import maxsize
1621 import time
1722 from unittest import SkipTest
18
19 from gitdb.util import (# NOQA @IgnorePep8
23 from urllib.parse import urlsplit, urlunsplit
24 import warnings
25
26 # from git.objects.util import Traversable
27
28 # typing ---------------------------------------------------------
29
30 from typing import (Any, AnyStr, BinaryIO, Callable, Dict, Generator, IO, Iterator, List,
31 Optional, Pattern, Sequence, Tuple, TypeVar, Union, cast,
32 TYPE_CHECKING, overload, )
33
34 import pathlib
35
36 if TYPE_CHECKING:
37 from git.remote import Remote
38 from git.repo.base import Repo
39 from git.config import GitConfigParser, SectionConstraint
40 from git import Git
41 # from git.objects.base import IndexObject
42
43
44 from .types import (Literal, SupportsIndex, Protocol, runtime_checkable, # because behind py version guards
45 PathLike, HSH_TD, Total_TD, Files_TD, # aliases
46 Has_id_attribute)
47
48 T_IterableObj = TypeVar('T_IterableObj', bound=Union['IterableObj', 'Has_id_attribute'], covariant=True)
49 # So IterableList[Head] is subtype of IterableList[IterableObj]
50
51 # ---------------------------------------------------------------------
52
53
54 from gitdb.util import ( # NOQA @IgnorePep8
2055 make_sha,
2156 LockedFD, # @UnusedImport
2257 file_contents_ro, # @UnusedImport
2863 hex_to_bin, # @UnusedImport
2964 )
3065
31 from git.compat import is_win
32 import os.path as osp
33
34 from .exc import InvalidGitRepositoryError
35
3666
3767 # NOTE: Some of the unused imports might be used/imported by others.
3868 # Handle once test-cases are back up and running.
3969 # Most of these are unused here, but are for use by git-python modules so that
4070 # they don't have to import gitdb directly. Flake8 of course doesn't like it.
4171 __all__ = ["stream_copy", "join_path", "to_native_path_linux",
42 "join_path_native", "Stats", "IndexFileSHA1Writer", "Iterable", "IterableList",
72 "join_path_native", "Stats", "IndexFileSHA1Writer", "IterableObj", "IterableList",
4373 "BlockingLockFile", "LockFile", 'Actor', 'get_user_id', 'assure_directory_exists',
4474 'RemoteProgress', 'CallableRemoteProgress', 'rmtree', 'unbare_repo',
4575 'HIDE_WINDOWS_KNOWN_ERRORS']
4676
4777 log = logging.getLogger(__name__)
78
79 # types############################################################
80
4881
4982 #: We need an easy way to see if Appveyor TCs start failing,
5083 #: so the errors marked with this var are considered "acknowledged" ones, awaiting remedy,
5285 HIDE_WINDOWS_KNOWN_ERRORS = is_win and os.environ.get('HIDE_WINDOWS_KNOWN_ERRORS', True)
5386 HIDE_WINDOWS_FREEZE_ERRORS = is_win and os.environ.get('HIDE_WINDOWS_FREEZE_ERRORS', True)
5487
55 #{ Utility Methods
56
57
58 def unbare_repo(func):
88 # { Utility Methods
89
90 T = TypeVar('T')
91
92
93 def unbare_repo(func: Callable[..., T]) -> Callable[..., T]:
5994 """Methods with this decorator raise InvalidGitRepositoryError if they
6095 encounter a bare repository"""
6196
6297 @wraps(func)
63 def wrapper(self, *args, **kwargs):
98 def wrapper(self: 'Remote', *args: Any, **kwargs: Any) -> T:
6499 if self.repo.bare:
65100 raise InvalidGitRepositoryError("Method '%s' cannot operate on bare repositories" % func.__name__)
66101 # END bare method
67102 return func(self, *args, **kwargs)
68103 # END wrapper
104
69105 return wrapper
70106
71107
72108 @contextlib.contextmanager
73 def cwd(new_dir):
109 def cwd(new_dir: PathLike) -> Generator[PathLike, None, None]:
74110 old_dir = os.getcwd()
75111 os.chdir(new_dir)
76112 try:
79115 os.chdir(old_dir)
80116
81117
82 def rmtree(path):
118 def rmtree(path: PathLike) -> None:
83119 """Remove the given path recursively.
84120
85121 :note: we use shutil rmtree but adjust its behaviour to see whether files that
86122 couldn't be deleted are read-only. Windows will not remove them in that case"""
87123
88 def onerror(func, path, exc_info):
124 def onerror(func: Callable, path: PathLike, exc_info: str) -> None:
89125 # Is the error an access error ?
90126 os.chmod(path, stat.S_IWUSR)
91127
99135 return shutil.rmtree(path, False, onerror)
100136
101137
102 def rmfile(path):
138 def rmfile(path: PathLike) -> None:
103139 """Ensure file deleted also on *Windows* where read-only files need special treatment."""
104140 if osp.isfile(path):
105141 if is_win:
107143 os.remove(path)
108144
109145
110 def stream_copy(source, destination, chunk_size=512 * 1024):
146 def stream_copy(source: BinaryIO, destination: BinaryIO, chunk_size: int = 512 * 1024) -> int:
111147 """Copy all data from the source stream into the destination stream in chunks
112148 of size chunk_size
113149
123159 return br
124160
125161
126 def join_path(a, *p):
162 def join_path(a: PathLike, *p: PathLike) -> PathLike:
127163 """Join path tokens together similar to osp.join, but always use
128164 '/' instead of possibly '\' on windows."""
129 path = a
165 path = str(a)
130166 for b in p:
167 b = str(b)
131168 if not b:
132169 continue
133170 if b.startswith('/'):
141178
142179
143180 if is_win:
144 def to_native_path_windows(path):
181 def to_native_path_windows(path: PathLike) -> PathLike:
182 path = str(path)
145183 return path.replace('/', '\\')
146184
147 def to_native_path_linux(path):
185 def to_native_path_linux(path: PathLike) -> str:
186 path = str(path)
148187 return path.replace('\\', '/')
149188
150189 __all__.append("to_native_path_windows")
151190 to_native_path = to_native_path_windows
152191 else:
153192 # no need for any work on linux
154 def to_native_path_linux(path):
155 return path
193 def to_native_path_linux(path: PathLike) -> str:
194 return str(path)
195
156196 to_native_path = to_native_path_linux
157197
158198
159 def join_path_native(a, *p):
199 def join_path_native(a: PathLike, *p: PathLike) -> PathLike:
160200 """
161201 As join path, but makes sure an OS native path is returned. This is only
162202 needed to play it safe on my dear windows and to assure nice paths that only
164204 return to_native_path(join_path(a, *p))
165205
166206
167 def assure_directory_exists(path, is_file=False):
207 def assure_directory_exists(path: PathLike, is_file: bool = False) -> bool:
168208 """Assure that the directory pointed to by path exists.
169209
170210 :param is_file: If True, path is assumed to be a file and handled correctly.
179219 return False
180220
181221
182 def _get_exe_extensions():
222 def _get_exe_extensions() -> Sequence[str]:
183223 PATHEXT = os.environ.get('PATHEXT', None)
184 return tuple(p.upper() for p in PATHEXT.split(os.pathsep)) \
185 if PATHEXT \
186 else (('.BAT', 'COM', '.EXE') if is_win else ())
187
188
189 def py_where(program, path=None):
224 return tuple(p.upper() for p in PATHEXT.split(os.pathsep)) if PATHEXT \
225 else ('.BAT', '.COM', '.EXE') if is_win \
226 else ()
227
228
229 def py_where(program: str, path: Optional[PathLike] = None) -> List[str]:
190230 # From: http://stackoverflow.com/a/377028/548792
191231 winprog_exts = _get_exe_extensions()
192232
193 def is_exec(fpath):
233 def is_exec(fpath: str) -> bool:
194234 return osp.isfile(fpath) and os.access(fpath, os.X_OK) and (
195235 os.name != 'nt' or not winprog_exts or any(fpath.upper().endswith(ext)
196236 for ext in winprog_exts))
198238 progs = []
199239 if not path:
200240 path = os.environ["PATH"]
201 for folder in path.split(os.pathsep):
241 for folder in str(path).split(os.pathsep):
202242 folder = folder.strip('"')
203243 if folder:
204244 exe_path = osp.join(folder, program)
208248 return progs
209249
210250
211 def _cygexpath(drive, path):
251 def _cygexpath(drive: Optional[str], path: str) -> str:
212252 if osp.isabs(path) and not drive:
213 ## Invoked from `cygpath()` directly with `D:Apps\123`?
253 # Invoked from `cygpath()` directly with `D:Apps\123`?
214254 # It's an error; leave it alone, just convert the slashes.
215 p = path
255 p = path # convert to str if AnyPath given
216256 else:
217257 p = path and osp.normpath(osp.expandvars(osp.expanduser(path)))
218258 if osp.isabs(p):
223263 p = cygpath(p)
224264 elif drive:
225265 p = '/cygdrive/%s/%s' % (drive.lower(), p)
226
227 return p.replace('\\', '/')
228
229
230 _cygpath_parsers = (
231 ## See: https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247(v=vs.85).aspx
232 ## and: https://www.cygwin.com/cygwin-ug-net/using.html#unc-paths
266 p_str = str(p) # ensure it is a str and not AnyPath
267 return p_str.replace('\\', '/')
268
269
270 _cygpath_parsers: Tuple[Tuple[Pattern[str], Callable, bool], ...] = (
271 # See: https://msdn.microsoft.com/en-us/library/windows/desktop/aa365247(v=vs.85).aspx
272 # and: https://www.cygwin.com/cygwin-ug-net/using.html#unc-paths
233273 (re.compile(r"\\\\\?\\UNC\\([^\\]+)\\([^\\]+)(?:\\(.*))?"),
234274 (lambda server, share, rest_path: '//%s/%s/%s' % (server, share, rest_path.replace('\\', '/'))),
235275 False
236276 ),
237277
238278 (re.compile(r"\\\\\?\\(\w):[/\\](.*)"),
239 _cygexpath,
279 (_cygexpath),
240280 False
241281 ),
242282
243283 (re.compile(r"(\w):[/\\](.*)"),
244 _cygexpath,
284 (_cygexpath),
245285 False
246286 ),
247287
248288 (re.compile(r"file:(.*)", re.I),
249289 (lambda rest_path: rest_path),
250 True),
290 True
291 ),
251292
252293 (re.compile(r"(\w{2,}:.*)"), # remote URL, do nothing
253294 (lambda url: url),
254 False),
295 False
296 ),
255297 )
256298
257299
258 def cygpath(path):
300 def cygpath(path: str) -> str:
259301 """Use :meth:`git.cmd.Git.polish_url()` instead, that works on any environment."""
302 path = str(path) # ensure is str and not AnyPath.
303 # Fix to use Paths when 3.5 dropped. or to be just str if only for urls?
260304 if not path.startswith(('/cygdrive', '//')):
261305 for regex, parser, recurse in _cygpath_parsers:
262306 match = regex.match(path)
274318 _decygpath_regex = re.compile(r"/cygdrive/(\w)(/.*)?")
275319
276320
277 def decygpath(path):
321 def decygpath(path: PathLike) -> str:
322 path = str(path)
278323 m = _decygpath_regex.match(path)
279324 if m:
280325 drive, rest_path = m.groups()
285330
286331 #: Store boolean flags denoting if a specific Git executable
287332 #: is from a Cygwin installation (since `cache_lru()` unsupported on PY2).
288 _is_cygwin_cache = {}
289
290
291 def is_cygwin_git(git_executable):
333 _is_cygwin_cache: Dict[str, Optional[bool]] = {}
334
335
336 @overload
337 def is_cygwin_git(git_executable: None) -> Literal[False]:
338 ...
339
340
341 @overload
342 def is_cygwin_git(git_executable: PathLike) -> bool:
343 ...
344
345
346 def is_cygwin_git(git_executable: Union[None, PathLike]) -> bool:
292347 if not is_win:
293348 return False
294349
295 #from subprocess import check_output
296
297 is_cygwin = _is_cygwin_cache.get(git_executable)
350 if git_executable is None:
351 return False
352
353 git_executable = str(git_executable)
354 is_cygwin = _is_cygwin_cache.get(git_executable) # type: Optional[bool]
298355 if is_cygwin is None:
299356 is_cygwin = False
300357 try:
301358 git_dir = osp.dirname(git_executable)
302359 if not git_dir:
303360 res = py_where(git_executable)
304 git_dir = osp.dirname(res[0]) if res else None
305
306 ## Just a name given, not a real path.
361 git_dir = osp.dirname(res[0]) if res else ""
362
363 # Just a name given, not a real path.
307364 uname_cmd = osp.join(git_dir, 'uname')
308365 process = subprocess.Popen([uname_cmd], stdout=subprocess.PIPE,
309366 universal_newlines=True)
317374 return is_cygwin
318375
319376
320 def get_user_id():
377 def get_user_id() -> str:
321378 """:return: string identifying the currently active system user as name@node"""
322379 return "%s@%s" % (getpass.getuser(), platform.node())
323380
324381
325 def finalize_process(proc, **kwargs):
382 def finalize_process(proc: Union[subprocess.Popen, 'Git.AutoInterrupt'], **kwargs: Any) -> None:
326383 """Wait for the process (clone, fetch, pull or push) and handle its errors accordingly"""
327 ## TODO: No close proc-streams??
384 # TODO: No close proc-streams??
328385 proc.wait(**kwargs)
329386
330387
331 def expand_path(p, expand_vars=True):
388 @overload
389 def expand_path(p: None, expand_vars: bool = ...) -> None:
390 ...
391
392
393 @overload
394 def expand_path(p: PathLike, expand_vars: bool = ...) -> str:
395 # improve these overloads when 3.5 dropped
396 ...
397
398
399 def expand_path(p: Union[None, PathLike], expand_vars: bool = True) -> Optional[PathLike]:
400 if isinstance(p, pathlib.Path):
401 return p.resolve()
332402 try:
333 p = osp.expanduser(p)
403 p = osp.expanduser(p) # type: ignore
334404 if expand_vars:
335 p = osp.expandvars(p)
336 return osp.normpath(osp.abspath(p))
405 p = osp.expandvars(p) # type: ignore
406 return osp.normpath(osp.abspath(p)) # type: ignore
337407 except Exception:
338408 return None
339409
340 #} END utilities
341
342 #{ Classes
410
411 def remove_password_if_present(cmdline: Sequence[str]) -> List[str]:
412 """
413 Parse any command line argument and if one of the elements is a URL with a
414 password, replace the password with stars (in-place).
415 
416 If nothing is found, the command line is returned as-is.
417 
418 This should be used for every log line that prints a command line.
419 """
420 new_cmdline = []
421 for index, to_parse in enumerate(cmdline):
422 new_cmdline.append(to_parse)
423 try:
424 url = urlsplit(to_parse)
425 # Remove password from the URL if present
426 if url.password is None:
427 continue
428
429 edited_url = url._replace(
430 netloc=url.netloc.replace(url.password, "*****"))
431 new_cmdline[index] = urlunsplit(edited_url)
432 except ValueError:
433 # This is not a valid URL
434 continue
435 return new_cmdline
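Run over a fetch or clone command line, the helper masks credentials embedded in remote URLs before they reach a log. A condensed, self-contained copy of the logic above, using only urllib.parse:

```python
from typing import List, Sequence
from urllib.parse import urlsplit, urlunsplit

def remove_password_if_present(cmdline: Sequence[str]) -> List[str]:
    """Condensed copy of the helper above, for demonstration."""
    new_cmdline = []
    for to_parse in cmdline:
        try:
            url = urlsplit(to_parse)
            if url.password is None:
                # not a URL, or no credentials embedded: keep as-is
                new_cmdline.append(to_parse)
                continue
            # rebuild the netloc with the password replaced by stars
            edited = url._replace(netloc=url.netloc.replace(url.password, "*****"))
            new_cmdline.append(urlunsplit(edited))
        except ValueError:
            # not a valid URL at all
            new_cmdline.append(to_parse)
    return new_cmdline
```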
436
437
438 # } END utilities
439
440 # { Classes
343441
344442
345443 class RemoteProgress(object):
347445 Handler providing an interface to parse progress information emitted by git-push
348446 and git-fetch and to dispatch callbacks allowing subclasses to react to the progress.
349447 """
350 _num_op_codes = 9
448 _num_op_codes: int = 9
351449 BEGIN, END, COUNTING, COMPRESSING, WRITING, RECEIVING, RESOLVING, FINDING_SOURCES, CHECKING_OUT = \
352450 [1 << x for x in range(_num_op_codes)]
353451 STAGE_MASK = BEGIN | END
363461 re_op_absolute = re.compile(r"(remote: )?([\w\s]+):\s+()(\d+)()(.*)")
364462 re_op_relative = re.compile(r"(remote: )?([\w\s]+):\s+(\d+)% \((\d+)/(\d+)\)(.*)")
365463
366 def __init__(self):
367 self._seen_ops = []
368 self._cur_line = None
369 self.error_lines = []
370 self.other_lines = []
371
372 def _parse_progress_line(self, line):
464 def __init__(self) -> None:
465 self._seen_ops: List[int] = []
466 self._cur_line: Optional[str] = None
467 self.error_lines: List[str] = []
468 self.other_lines: List[str] = []
469
470 def _parse_progress_line(self, line: AnyStr) -> None:
373471 """Parse progress information from the given line as retrieved by git-push
374472 or git-fetch.
375473
376474 - Lines that do not contain progress info are stored in :attr:`other_lines`.
377475 - Lines that seem to contain an error (i.e. start with error: or fatal:) are stored
378 in :attr:`error_lines`."""
476 in :attr:`error_lines`."""
379477 # handle
380478 # Counting objects: 4, done.
381479 # Compressing objects: 50% (1/2)
382480 # Compressing objects: 100% (2/2)
383481 # Compressing objects: 100% (2/2), done.
384 self._cur_line = line = line.decode('utf-8') if isinstance(line, bytes) else line
385 if self.error_lines or self._cur_line.startswith(('error:', 'fatal:')):
482 if isinstance(line, bytes): # mypy argues about ternary assignment
483 line_str = line.decode('utf-8')
484 else:
485 line_str = line
486 self._cur_line = line_str
487
488 if self._cur_line.startswith(('error:', 'fatal:')):
386489 self.error_lines.append(self._cur_line)
387490 return
388491
389492 # find escape characters and cut them away - regex will not work with
390493 # them as they are non-ascii. As git might expect a tty, it will send them
391494 last_valid_index = None
392 for i, c in enumerate(reversed(line)):
495 for i, c in enumerate(reversed(line_str)):
393496 if ord(c) < 32:
394497 # it's a slice index
395498 last_valid_index = -i - 1
396499 # END character was non-ascii
397500 # END for each character in line
398501 if last_valid_index is not None:
399 line = line[:last_valid_index]
502 line_str = line_str[:last_valid_index]
400503 # END cut away invalid part
401 line = line.rstrip()
504 line_str = line_str.rstrip()
402505
403506 cur_count, max_count = None, None
404 match = self.re_op_relative.match(line)
507 match = self.re_op_relative.match(line_str)
405508 if match is None:
406 match = self.re_op_absolute.match(line)
509 match = self.re_op_absolute.match(line_str)
407510
408511 if not match:
409 self.line_dropped(line)
410 self.other_lines.append(line)
512 self.line_dropped(line_str)
513 self.other_lines.append(line_str)
411514 return
412515 # END could not get match
413516
436539 # This can't really be prevented, so we drop the line verbosely
437540 # to make sure we get informed in case the process spits out new
438541 # commands at some point.
439 self.line_dropped(line)
542 self.line_dropped(line_str)
440543 # Note: Don't add this line to the other lines, as we have to silently
441544 # drop it
442 return
545 return None
443546 # END handle op code
444547
445548 # figure out stage
464567 max_count and float(max_count),
465568 message)
466569
467 def new_message_handler(self):
570 def new_message_handler(self) -> Callable[[str], None]:
468571 """
469572 :return:
470573 a progress handler suitable for handle_process_output(), passing lines on to this Progress
471574 handler in a suitable format"""
472 def handler(line):
575 def handler(line: AnyStr) -> None:
473576 return self._parse_progress_line(line.rstrip())
474577 # end
475578 return handler
476579
477 def line_dropped(self, line):
580 def line_dropped(self, line: str) -> None:
478581 """Called whenever a line could not be understood and was therefore dropped."""
479582 pass
480583
481 def update(self, op_code, cur_count, max_count=None, message=''):
584 def update(self, op_code: int, cur_count: Union[str, float], max_count: Union[str, float, None] = None,
585 message: str = '',) -> None:
482586 """Called whenever the progress changes
483587
484588 :param op_code:
509613 """An implementation forwarding updates to any callable"""
510614 __slots__ = ('_callable')
511615
512 def __init__(self, fn):
616 def __init__(self, fn: Callable) -> None:
513617 self._callable = fn
514618 super(CallableRemoteProgress, self).__init__()
515619
516 def update(self, *args, **kwargs):
620 def update(self, *args: Any, **kwargs: Any) -> None:
517621 self._callable(*args, **kwargs)
518622
519623
538642
539643 __slots__ = ('name', 'email')
540644
541 def __init__(self, name, email):
645 def __init__(self, name: Optional[str], email: Optional[str]) -> None:
542646 self.name = name
543647 self.email = email
544648
545 def __eq__(self, other):
649 def __eq__(self, other: Any) -> bool:
546650 return self.name == other.name and self.email == other.email
547651
548 def __ne__(self, other):
652 def __ne__(self, other: Any) -> bool:
549653 return not (self == other)
550654
551 def __hash__(self):
655 def __hash__(self) -> int:
552656 return hash((self.name, self.email))
553657
554 def __str__(self):
555 return self.name
556
557 def __repr__(self):
658 def __str__(self) -> str:
659 return self.name if self.name else ""
660
661 def __repr__(self) -> str:
558662 return '<git.Actor "%s <%s>">' % (self.name, self.email)
559663
560664 @classmethod
561 def _from_string(cls, string):
665 def _from_string(cls, string: str) -> 'Actor':
562666 """Create an Actor from a string.
563667 :param string: is the string, which is expected to be in regular git format
564668
579683 # END handle name/email matching
580684
581685 @classmethod
582 def _main_actor(cls, env_name, env_email, config_reader=None):
686 def _main_actor(cls, env_name: str, env_email: str,
687 config_reader: Union[None, 'GitConfigParser', 'SectionConstraint'] = None) -> 'Actor':
583688 actor = Actor('', '')
584689 user_id = None # We use this to avoid multiple calls to getpass.getuser()
585690
586 def default_email():
691 def default_email() -> str:
587692 nonlocal user_id
588693 if not user_id:
589694 user_id = get_user_id()
590695 return user_id
591696
592 def default_name():
697 def default_name() -> str:
593698 return default_email().split('@')[0]
594699
595700 for attr, evar, cvar, default in (('name', env_name, cls.conf_name, default_name),
599704 setattr(actor, attr, val)
600705 except KeyError:
601706 if config_reader is not None:
602 setattr(actor, attr, config_reader.get_value('user', cvar, default()))
707 try:
708 val = config_reader.get('user', cvar)
709 except Exception:
710 val = default()
711 setattr(actor, attr, val)
603712 # END config-reader handling
604713 if not getattr(actor, attr):
605714 setattr(actor, attr, default())
608717 return actor
609718
610719 @classmethod
611 def committer(cls, config_reader=None):
720 def committer(cls, config_reader: Union[None, 'GitConfigParser', 'SectionConstraint'] = None) -> 'Actor':
612721 """
613722 :return: Actor instance corresponding to the configured committer. It behaves
614723 similar to the git implementation, such that the environment will override
619728 return cls._main_actor(cls.env_committer_name, cls.env_committer_email, config_reader)
620729
621730 @classmethod
622 def author(cls, config_reader=None):
731 def author(cls, config_reader: Union[None, 'GitConfigParser', 'SectionConstraint'] = None) -> 'Actor':
623732 """Same as committer(), but defines the main author. It may be specified in the environment,
624733 but defaults to the committer"""
625734 return cls._main_actor(cls.env_author_name, cls.env_author_email, config_reader)
653762 files = number of changed files as int"""
654763 __slots__ = ("total", "files")
655764
656 def __init__(self, total, files):
765 def __init__(self, total: Total_TD, files: Dict[PathLike, Files_TD]):
657766 self.total = total
658767 self.files = files
659768
660769 @classmethod
661 def _list_from_string(cls, repo, text):
770 def _list_from_string(cls, repo: 'Repo', text: str) -> 'Stats':
662771 """Create a Stat object from output retrieved by git-diff.
663772
664773 :return: git.Stat"""
665 hsh = {'total': {'insertions': 0, 'deletions': 0, 'lines': 0, 'files': 0}, 'files': {}}
774
775 hsh: HSH_TD = {'total': {'insertions': 0,
776 'deletions': 0,
777 'lines': 0,
778 'files': 0},
779 'files': {}
780 }
666781 for line in text.splitlines():
667782 (raw_insertions, raw_deletions, filename) = line.split("\t")
668783 insertions = raw_insertions != '-' and int(raw_insertions) or 0
671786 hsh['total']['deletions'] += deletions
672787 hsh['total']['lines'] += insertions + deletions
673788 hsh['total']['files'] += 1
674 hsh['files'][filename.strip()] = {'insertions': insertions,
675 'deletions': deletions,
676 'lines': insertions + deletions}
789 files_dict: Files_TD = {'insertions': insertions,
790 'deletions': deletions,
791 'lines': insertions + deletions}
792 hsh['files'][filename.strip()] = files_dict
677793 return Stats(hsh['total'], hsh['files'])
678794
679795
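Stats._list_from_string() parses the tab-separated output of `git diff --numstat`, where binary files report '-' instead of a count. A standalone sketch of the same parsing (parse_numstat is a hypothetical helper name):

```python
def parse_numstat(text: str):
    """Parse `git diff --numstat`-style lines of 'insertions<TAB>deletions<TAB>path'
    the way Stats._list_from_string does. Hypothetical standalone helper."""
    total = {'insertions': 0, 'deletions': 0, 'lines': 0, 'files': 0}
    files = {}
    for line in text.splitlines():
        raw_insertions, raw_deletions, filename = line.split("\t")
        # binary files report '-' instead of a number; count them as 0
        insertions = int(raw_insertions) if raw_insertions != '-' else 0
        deletions = int(raw_deletions) if raw_deletions != '-' else 0
        total['insertions'] += insertions
        total['deletions'] += deletions
        total['lines'] += insertions + deletions
        total['files'] += 1
        files[filename.strip()] = {'insertions': insertions,
                                   'deletions': deletions,
                                   'lines': insertions + deletions}
    return total, files
```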
688804 :note: Based on the dulwich project"""
689805 __slots__ = ("f", "sha1")
690806
691 def __init__(self, f):
807 def __init__(self, f: IO) -> None:
692808 self.f = f
693809 self.sha1 = make_sha(b"")
694810
695 def write(self, data):
811 def write(self, data: AnyStr) -> int:
696812 self.sha1.update(data)
697813 return self.f.write(data)
698814
699 def write_sha(self):
815 def write_sha(self) -> bytes:
700816 sha = self.sha1.digest()
701817 self.f.write(sha)
702818 return sha
703819
704 def close(self):
820 def close(self) -> bytes:
705821 sha = self.write_sha()
706822 self.f.close()
707823 return sha
708824
709 def tell(self):
825 def tell(self) -> int:
710826 return self.f.tell()
711827
712828
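IndexFileSHA1Writer tees every write through a SHA1 hash and appends the digest at the end, as the git index file format requires. A minimal sketch of the idea against an in-memory buffer (SHA1Writer here is a simplified stand-in):

```python
import hashlib
import io

class SHA1Writer:
    """Simplified sketch of the IndexFileSHA1Writer idea: hash all
    written data, then append the digest as a trailing checksum."""

    def __init__(self, f):
        self.f = f
        self.sha1 = hashlib.sha1(b"")

    def write(self, data: bytes) -> int:
        self.sha1.update(data)
        return self.f.write(data)

    def write_sha(self) -> bytes:
        # append the running digest to the stream and return it
        sha = self.sha1.digest()
        self.f.write(sha)
        return sha

buf = io.BytesIO()
w = SHA1Writer(buf)
w.write(b"DIRC")        # content is hashed as it is written
tail = w.write_sha()    # 20-byte checksum appended at the end
```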
720836 Locks will automatically be released on destruction"""
721837 __slots__ = ("_file_path", "_owns_lock")
722838
723 def __init__(self, file_path):
839 def __init__(self, file_path: PathLike) -> None:
724840 self._file_path = file_path
725841 self._owns_lock = False
726842
727 def __del__(self):
843 def __del__(self) -> None:
728844 self._release_lock()
729845
730 def _lock_file_path(self):
846 def _lock_file_path(self) -> str:
731847 """:return: Path to lockfile"""
732848 return "%s.lock" % (self._file_path)
733849
734 def _has_lock(self):
850 def _has_lock(self) -> bool:
735851 """:return: True if we have a lock and if the lockfile still exists
736852 :raise AssertionError: if our lock-file does not exist"""
737853 return self._owns_lock
738854
739 def _obtain_lock_or_raise(self):
855 def _obtain_lock_or_raise(self) -> None:
740856 """Create a lock file as flag for other instances, mark our instance as lock-holder
741857
742858 :raise IOError: if a lock was already present or a lock file could not be written"""
758874
759875 self._owns_lock = True
760876
761 def _obtain_lock(self):
877 def _obtain_lock(self) -> None:
762878 """The default implementation will raise if a lock cannot be obtained.
763879 Subclasses may override this method to provide a different implementation"""
764880 return self._obtain_lock_or_raise()
765881
766 def _release_lock(self):
882 def _release_lock(self) -> None:
767883 """Release our lock if we have one"""
768884 if not self._has_lock():
769885 return
788904 can never be obtained."""
789905 __slots__ = ("_check_interval", "_max_block_time")
790906
791 def __init__(self, file_path, check_interval_s=0.3, max_block_time_s=maxsize):
907 def __init__(self, file_path: PathLike, check_interval_s: float = 0.3, max_block_time_s: int = maxsize) -> None:
792908 """Configure the instance
793909
794910 :param check_interval_s:
800916 self._check_interval = check_interval_s
801917 self._max_block_time = max_block_time_s
802918
803 def _obtain_lock(self):
919 def _obtain_lock(self) -> None:
804920 """This method blocks until it obtained the lock, or raises IOError if
805921 it ran out of time or if the parent directory was not available anymore.
806922 If this method returns, you are guaranteed to own the lock"""
829945 # END endless loop
830946
831947
832 class IterableList(list):
948 class IterableList(List[T_IterableObj]):
833949
834950 """
835951 List of iterable objects allowing to query an object by id or by named index::
839955 heads['master']
840956 heads[0]
841957
 958 Iterable parent objects = [Commit, SubModule, Reference, FetchInfo, PushInfo]
 959 Iterable via inheritance = [Head, TagReference, RemoteReference]
842961 It requires an id_attribute name to be set which will be queried from its
843962 contained items to have a means for comparison.
844963
847966 can be left out."""
848967 __slots__ = ('_id_attr', '_prefix')
849968
850 def __new__(cls, id_attr, prefix=''):
969 def __new__(cls, id_attr: str, prefix: str = '') -> 'IterableList[IterableObj]':
851970 return super(IterableList, cls).__new__(cls)
852971
853 def __init__(self, id_attr, prefix=''):
972 def __init__(self, id_attr: str, prefix: str = '') -> None:
854973 self._id_attr = id_attr
855974 self._prefix = prefix
856975
857 def __contains__(self, attr):
976 def __contains__(self, attr: object) -> bool:
858977 # first try identity match for performance
859978 try:
860979 rval = list.__contains__(self, attr)
866985
867986 # otherwise make a full name search
868987 try:
869 getattr(self, attr)
988 getattr(self, cast(str, attr)) # use cast to silence mypy
870989 return True
871990 except (AttributeError, TypeError):
872991 return False
873992 # END handle membership
874993
875 def __getattr__(self, attr):
994 def __getattr__(self, attr: str) -> T_IterableObj:
876995 attr = self._prefix + attr
877996 for item in self:
878997 if getattr(item, self._id_attr) == attr:
880999 # END for each item
8811000 return list.__getattribute__(self, attr)
8821001
883 def __getitem__(self, index):
1002 def __getitem__(self, index: Union[SupportsIndex, int, slice, str]) -> T_IterableObj: # type: ignore
1003
1004 assert isinstance(index, (int, str, slice)), "Index of IterableList should be an int or str"
1005
8841006 if isinstance(index, int):
8851007 return list.__getitem__(self, index)
886
887 try:
888 return getattr(self, index)
889 except AttributeError as e:
890 raise IndexError("No item found with id %r" % (self._prefix + index)) from e
1008 elif isinstance(index, slice):
1009 raise ValueError("Index should be an int or str")
1010 else:
1011 try:
1012 return getattr(self, index)
1013 except AttributeError as e:
1014 raise IndexError("No item found with id %r" % (self._prefix + index)) from e
8911015 # END handle getattr
8921016
893 def __delitem__(self, index):
894 delindex = index
1017 def __delitem__(self, index: Union[SupportsIndex, int, slice, str]) -> None:
1018
1019 assert isinstance(index, (int, str)), "Index of IterableList should be an int or str"
1020
1021 delindex = cast(int, index)
8951022 if not isinstance(index, int):
8961023 delindex = -1
8971024 name = self._prefix + index
9081035 list.__delitem__(self, delindex)
9091036
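The `IterableList` methods above resolve string keys through `__getattr__` so items can be fetched by their id attribute as well as by index. A condensed, self-contained sketch of that lookup (the `NamedList`/`Head` names are stand-ins, not the library's classes):

```python
class NamedList(list):
    """Minimal sketch of IterableList's name-or-index lookup."""

    def __init__(self, id_attr: str, prefix: str = ""):
        self._id_attr = id_attr
        self._prefix = prefix

    def __getattr__(self, attr):
        # called only when normal attribute lookup fails
        attr = self._prefix + attr
        for item in self:
            if getattr(item, self._id_attr) == attr:
                return item
        return list.__getattribute__(self, attr)  # raises AttributeError

    def __getitem__(self, index):
        if isinstance(index, int):
            return list.__getitem__(self, index)
        try:
            return getattr(self, index)
        except AttributeError as e:
            raise IndexError("No item found with id %r" % (self._prefix + index)) from e


class Head:
    def __init__(self, name):
        self.name = name


heads = NamedList("name")
heads.extend([Head("master"), Head("feature")])
assert heads[0].name == "master"
assert heads["feature"] is heads[1]
```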
9101037
911 class Iterable(object):
1038 class IterableClassWatcher(type):
1039 """ Metaclass that watches """
1040 def __init__(cls, name: str, bases: Tuple, clsdict: Dict) -> None:
1041 for base in bases:
1042 if type(base) == IterableClassWatcher:
1043 warnings.warn(f"GitPython Iterable subclassed by {name}. "
1044 "Iterable is deprecated due to naming clash since v3.1.18"
1045 " and will be removed in 3.1.20, "
1046 "Use IterableObj instead \n",
1047 DeprecationWarning,
1048 stacklevel=2)
1049
1050
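The watcher metaclass above fires at class-creation time: defining a subclass of the deprecated base triggers the warning, with no change needed in the subclass itself. A small sketch of the same technique (names here are illustrative):

```python
import warnings


class DeprecatedBaseWatcher(type):
    # __init__ runs once per class definition, so the warning fires
    # when a subclass is *declared*, not when it is instantiated
    def __init__(cls, name, bases, clsdict):
        for base in bases:
            if type(base) is DeprecatedBaseWatcher:
                warnings.warn(
                    f"{name} subclasses a deprecated base class",
                    DeprecationWarning,
                    stacklevel=2,
                )
        super().__init__(name, bases, clsdict)


class OldBase(metaclass=DeprecatedBaseWatcher):
    pass


with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")

    class Client(OldBase):  # triggers the warning
        pass

assert any(issubclass(w.category, DeprecationWarning) for w in caught)
```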
1051 class Iterable(metaclass=IterableClassWatcher):
9121052
9131053 """Defines an interface for iterable items which is to assure a uniform
9141054 way to retrieve and iterate items within the git repository"""
9161056 _id_attribute_ = "attribute that most suitably identifies your instance"
9171057
9181058 @classmethod
919 def list_items(cls, repo, *args, **kwargs):
1059 def list_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> Any:
1060 """
1061 Deprecated, use IterableObj instead.
1062 Find all items of this type - subclasses can specify args and kwargs differently.
1063 If no args are given, subclasses are obliged to return all items if no additional
 1064 arguments are given.
1065
 1066 :note: Favor the iter_items method as it will avoid eagerly collecting all items.
1067
 1068 :return: list(Item, ...) list of item instances"""
1069 out_list: Any = IterableList(cls._id_attribute_)
1070 out_list.extend(cls.iter_items(repo, *args, **kwargs))
1071 return out_list
1072
1073 @classmethod
1074 def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> Any:
1075 # return typed to be compatible with subtypes e.g. Remote
1076 """For more information about the arguments, see list_items
1077 :return: iterator yielding Items"""
1078 raise NotImplementedError("To be implemented by Subclass")
1079
1080
1081 @runtime_checkable
1082 class IterableObj(Protocol):
1083 """Defines an interface for iterable items which is to assure a uniform
1084 way to retrieve and iterate items within the git repository
1085
1086 Subclasses = [Submodule, Commit, Reference, PushInfo, FetchInfo, Remote]"""
1087
1088 __slots__ = ()
1089 _id_attribute_: str
1090
1091 @classmethod
1092 def list_items(cls, repo: 'Repo', *args: Any, **kwargs: Any) -> IterableList[T_IterableObj]:
9201093 """
9211094 Find all items of this type - subclasses can specify args and kwargs differently.
9221095 If no args are given, subclasses are obliged to return all items if no additional
 9251098 :note: Favor the iter_items method as it will avoid eagerly collecting all items.
9261099
 9271100 :return: list(Item, ...) list of item instances"""
928 out_list = IterableList(cls._id_attribute_)
1101 out_list: IterableList = IterableList(cls._id_attribute_)
9291102 out_list.extend(cls.iter_items(repo, *args, **kwargs))
9301103 return out_list
9311104
9321105 @classmethod
933 def iter_items(cls, repo, *args, **kwargs):
1106 @abstractmethod
1107 def iter_items(cls, repo: 'Repo', *args: Any, **kwargs: Any
1108 ) -> Iterator[T_IterableObj]: # Iterator[T_IterableObj]:
1109 # return typed to be compatible with subtypes e.g. Remote
9341110 """For more information about the arguments, see list_items
935 :return: iterator yielding Items"""
1111 :return: iterator yielding Items"""
9361112 raise NotImplementedError("To be implemented by Subclass")
9371113
938 #} END classes
1114 # } END classes
9391115
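`IterableObj` above is a `runtime_checkable` `Protocol`, so conformance is structural: any class exposing the required members matches, with no inheritance needed. A minimal sketch of how that behaves (protocol and class names here are stand-ins):

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class SupportsIdLookup(Protocol):
    # structural typing: any class with this attribute conforms
    _id_attribute_: str


class Tag:
    _id_attribute_ = "name"


class Unrelated:
    pass


# runtime_checkable permits isinstance() checks against the protocol,
# but only attribute *presence* is verified, not attribute types
assert isinstance(Tag(), SupportsIdLookup)
assert not isinstance(Unrelated(), SupportsIdLookup)
```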
9401116
9411117 class NullHandler(logging.Handler):
942 def emit(self, record):
1118 def emit(self, record: object) -> None:
9431119 pass
0 [build-system]
1 requires = ["setuptools", "wheel"]
2 build-backend = "setuptools.build_meta"
3
4 [tool.pytest.ini_options]
5 python_files = 'test_*.py'
 6 testpaths = 'test' # space-separated list of paths from root, e.g. test tests doc/testing
7 addopts = '--cov=git --cov-report=term --maxfail=10 --force-sugar --disable-warnings'
8 filterwarnings = 'ignore::DeprecationWarning'
9 # --cov coverage
10 # --cov-report term # send report to terminal term-missing -> terminal with line numbers html xml
11 # --cov-report term-missing # to terminal with line numbers
12 # --cov-report html:path # html file at path
13 # --maxfail # number of errors before giving up
 14 # --disable-warnings # Disable pytest warnings (not codebase warnings)
15 # -rf # increased reporting of failures
16 # -rE # increased reporting of errors
17 # --ignore-glob=**/gitdb/* # ignore glob paths
18 # filterwarnings ignore::WarningType # ignores those warnings
19
20 [tool.mypy]
21 disallow_untyped_defs = true
22 no_implicit_optional = true
23 warn_redundant_casts = true
24 # warn_unused_ignores = true
25 warn_unreachable = true
26 show_error_codes = true
27 implicit_reexport = true
28 # strict = true
29
30 # TODO: remove when 'gitdb' is fully annotated
31 [[tool.mypy.overrides]]
32 module = "gitdb.*"
33 ignore_missing_imports = true
34
35 [tool.coverage.run]
36 source = ["git"]
37
38 [tool.coverage.report]
39 include = ["*/git/*"]
40 omit = ["*/git/ext/*"]
00 gitdb>=4.0.1,<5
1 typing-extensions>=3.7.4.3;python_version<"3.10"
0 #!/usr/bin/env python
1 from __future__ import print_function
2 try:
3 from setuptools import setup, find_packages
4 except ImportError:
5 from ez_setup import use_setuptools
6 use_setuptools()
7 from setuptools import setup, find_packages
8
9 from distutils.command.build_py import build_py as _build_py
0 from typing import Sequence
1 from setuptools import setup, find_packages
2 from setuptools.command.build_py import build_py as _build_py
103 from setuptools.command.sdist import sdist as _sdist
114 import fnmatch
125 import os
2518
2619 class build_py(_build_py):
2720
28 def run(self):
21 def run(self) -> None:
2922 init = path.join(self.build_lib, 'git', '__init__.py')
3023 if path.exists(init):
3124 os.unlink(init)
3629
3730 class sdist(_sdist):
3831
39 def make_release_tree(self, base_dir, files):
32 def make_release_tree(self, base_dir: str, files: Sequence) -> None:
4033 _sdist.make_release_tree(self, base_dir, files)
4134 orig = path.join('git', '__init__.py')
4235 assert path.exists(orig), orig
4740 _stamp_version(dest)
4841
4942
50 def _stamp_version(filename):
43 def _stamp_version(filename: str) -> None:
5144 found, out = False, []
5245 try:
5346 with open(filename, 'r') as f:
5649 line = line.replace("'git'", "'%s'" % VERSION)
5750 found = True
5851 out.append(line)
59 except (IOError, OSError):
52 except OSError:
6053 print("Couldn't find file %s to stamp version" % filename, file=sys.stderr)
6154
6255 if found:
6659 print("WARNING: Couldn't find version line in file %s" % filename, file=sys.stderr)
6760
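`_stamp_version` above rewrites the `'git'` placeholder in the copied `__init__.py` with the release version. A self-contained sketch filling in the elided matching step; the `"__version__ ="` marker is an assumption based on the surrounding hunk, not shown in it:

```python
import os
import sys
import tempfile


def stamp_version(filename: str, version: str) -> None:
    # Same approach as _stamp_version: replace the 'git' placeholder
    # on the version-assignment line with the real version string.
    found, out = False, []
    try:
        with open(filename, "r") as f:
            for line in f:
                if "__version__ =" in line and "'git'" in line:
                    line = line.replace("'git'", "'%s'" % version)
                    found = True
                out.append(line)
    except OSError:
        print("Couldn't find file %s to stamp version" % filename, file=sys.stderr)
        return
    if found:
        with open(filename, "w") as f:
            f.writelines(out)


path = os.path.join(tempfile.mkdtemp(), "__init__.py")
with open(path, "w") as f:
    f.write("__version__ = 'git'\n")
stamp_version(path, "3.1.20")
with open(path) as f:
    assert f.read() == "__version__ = '3.1.20'\n"
```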
6861
69 def build_py_modules(basedir, excludes=[]):
62 def build_py_modules(basedir: str, excludes: Sequence = ()) -> Sequence:
7063 # create list of py_modules from tree
7164 res = set()
7265 _prefix = os.path.basename(basedir)
9386 author_email="byronimo@gmail.com, mtrier@gmail.com",
9487 license="BSD",
9588 url="https://github.com/gitpython-developers/GitPython",
96 packages=find_packages(exclude=("test.*")),
89 packages=find_packages(exclude=["test", "test.*"]),
9790 include_package_data=True,
9891 py_modules=build_py_modules("./git", excludes=["git.ext.*"]),
9992 package_dir={'git': 'git'},
100 python_requires='>=3.4',
93 python_requires='>=3.7',
10194 install_requires=requirements,
10295 tests_require=requirements + test_requirements,
10396 zip_safe=False,
119112 "Operating System :: POSIX",
120113 "Operating System :: Microsoft :: Windows",
121114 "Operating System :: MacOS :: MacOS X",
115 "Typing:: Typed",
122116 "Programming Language :: Python",
123117 "Programming Language :: Python :: 3",
124 "Programming Language :: Python :: 3.4",
125 "Programming Language :: Python :: 3.5",
126 "Programming Language :: Python :: 3.6",
127118 "Programming Language :: Python :: 3.7",
128119 "Programming Language :: Python :: 3.8",
129 "Programming Language :: Python :: 3.9"
120 "Programming Language :: Python :: 3.9"
121 "Programming Language :: Python :: 3.10"
130122 ]
131123 )
00 ddt>=1.1.1
1 coverage
1 mypy
2
23 flake8
3 tox
4 flake8-bugbear
5 flake8-comprehensions
6 flake8-typing-imports
7
48 virtualenv
5 nose
9
10 pytest
11 pytest-cov
12 pytest-sugar