New Upstream Snapshot - networkx

Ready changes

Summary

Merged new upstream version: 3.0+git20230110.1.b316afe (was: 2.8.8).
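
The snapshot version string above appears to follow the pattern `<upstream>+git<date>.<n>.<short-hash>`. As a small illustrative sketch (the field names are an assumption inferred from this one example, not a formal specification of the Janitor's versioning scheme), it can be split into its parts like so:

```python
# Hypothetical breakdown of the snapshot version shown above.
# Assumed layout: <upstream>+git<date>.<sequence>.<short-commit-hash>
version = "3.0+git20230110.1.b316afe"

upstream, _, snapshot = version.partition("+git")
date, seq, commit = snapshot.split(".")

print(upstream)  # 3.0       -- last upstream release the snapshot is based on
print(date)      # 20230110  -- date of the upstream commit
print(commit)    # b316afe   -- abbreviated upstream git commit
```

Because the date and commit hash sort after the plain upstream version, `3.0+git20230110.1.b316afe` compares as newer than `3.0` under Debian version ordering, which is what lets apt treat the snapshot as an upgrade over `2.8.8`.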

Resulting package

Built on 2023-01-11T04:24 (took 10m4s)

The resulting binary packages can be installed (if you have the apt repository enabled) by running:

apt install -t fresh-snapshots python3-networkx

Diff

diff --git a/.circleci/config.yml b/.circleci/config.yml
deleted file mode 100644
index 4639853..0000000
--- a/.circleci/config.yml
+++ /dev/null
@@ -1,108 +0,0 @@
-# See: https://circleci.com/docs/2.0/language-python/
-
-version: 2.1
-jobs:
-  documentation:
-    docker:
-      - image: cimg/python:3.9
-
-    steps:
-      - checkout
-
-      - run:
-          name: Update apt-get
-          command: |
-            sudo apt-get update
-
-      - run:
-          name: Install Graphviz
-          command: |
-            sudo apt-get install graphviz libgraphviz-dev
-
-      - run:
-          name: Install TeX
-          command: |
-            sudo apt-get install texlive texlive-latex-extra latexmk texlive-xetex fonts-freefont-otf xindy
-
-      - run:
-          name: Install pysal dependencies
-          command: |
-            sudo apt-get install libspatialindex-dev
-
-      - restore_cache:
-          keys:
-            - pip-cache-v1
-
-      - run:
-          name: Install Python dependencies
-          command: |
-            python3 -m venv venv
-            source venv/bin/activate
-            pip install --upgrade pip wheel setuptools
-            pip install -r requirements/default.txt -r requirements/test.txt
-            pip install -r requirements/extra.txt
-            pip install -r requirements/example.txt
-            pip install -r requirements/doc.txt
-            pip list
-
-      - save_cache:
-          key: pip-cache-v1
-          paths:
-            - ~/.cache/pip
-
-      - run:
-          name: Install
-          command: |
-            source venv/bin/activate
-            pip install -e .
-
-      - run:
-          name: Build docs
-          command: |
-            # NOTE: bad interaction w/ blas multithreading on circleci
-            export OMP_NUM_THREADS=1
-            source venv/bin/activate
-            make -C doc/ html
-            make -C doc/ latexpdf LATEXOPTS="-file-line-error -halt-on-error"
-            cp -a doc/build/latex/networkx_reference.pdf doc/build/html/_downloads/.
-
-      - store_artifacts:
-          path: doc/build/html
-
-  image:
-    docker:
-      - image: cimg/python:3.9
-
-    steps:
-      - checkout
-
-      - run:
-          name: Install Python dependencies
-          command: |
-            python -m venv venv
-            source venv/bin/activate
-            pip install --upgrade pip wheel setuptools
-            pip install -r requirements/default.txt -r requirements/test.txt
-            pip install pytest-mpl  # NOTE: specified here to avoid confusing conda
-            pip list
-
-      - run:
-          name: Install
-          command: |
-            source venv/bin/activate
-            pip install -e .
-
-      - run:
-          name: Test NetworkX drawing
-          command: |
-            source venv/bin/activate
-            pytest --mpl --mpl-generate-summary=html --mpl-results-path=results --pyargs networkx.drawing
-
-      - store_artifacts:
-          path: results
-
-workflows:
-  documentation_and_image_comparison:
-    jobs:
-      - documentation
-      - image
diff --git a/.codecov.yml b/.codecov.yml
deleted file mode 100644
index e28a818..0000000
--- a/.codecov.yml
+++ /dev/null
@@ -1,9 +0,0 @@
-# Allow coverage to decrease by 0.05%.
-coverage:
-  status:
-    project:
-      default:
-        threshold: 0.05%
-
-# Don't post a comment on pull requests.
-comment: off
diff --git a/.coveragerc b/.coveragerc
deleted file mode 100644
index 64641b7..0000000
--- a/.coveragerc
+++ /dev/null
@@ -1,4 +0,0 @@
-[run]
-branch = True
-source = networkx
-omit = */tests/*, conftest.py, *testing/test.py
diff --git a/.git-blame-ignore-revs b/.git-blame-ignore-revs
deleted file mode 100644
index ad47efc..0000000
--- a/.git-blame-ignore-revs
+++ /dev/null
@@ -1,12 +0,0 @@
-f6755ffa00211b523c6c0bec5398bc6c3c43c8b1
-44680a2466c3c429d9c01f55d71952efbb09b25c
-e9c6af06313d853fd6d03799b85905f6ac38691c
-735e6d856b81989cc12dfa30d8cd8e2ddb7dcead
-f30e9392bef0dccbcfd1b73ccb934064f6200fa3
-b22d6b36ce0545995c99d233546e8a1fe7e27fc5
-3351206a3ce5b3a39bb2fc451e93ef545b96c95b
-99fc1bb6690ac1a45124db2a01c12fd64dcb109b
-cea08c3bb8ca5aa2e167d534b0c5629205733762
-bec833c60c61e838722bf096da75949a9b519d1f
-be23fa0e422b51f4526828cb19b8105c89e5dcbb
-5c0b11afb4c0882a070d522ef3fa41482ba935d3
diff --git a/.github/FUNDING.yml b/.github/FUNDING.yml
deleted file mode 100644
index 6b83ca6..0000000
--- a/.github/FUNDING.yml
+++ /dev/null
@@ -1 +0,0 @@
-custom: https://numfocus.org/donate-to-networkx
diff --git a/.github/ISSUE_TEMPLATE/bug_report.md b/.github/ISSUE_TEMPLATE/bug_report.md
deleted file mode 100644
index 5326ab6..0000000
--- a/.github/ISSUE_TEMPLATE/bug_report.md
+++ /dev/null
@@ -1,31 +0,0 @@
----
-name: Bug report
-about: "Please describe the problem you have encountered"
----
-
-<!-- If you have a general question about NetworkX, please use the discussions tab to create a new discussion -->
-
-<!--- Provide a general summary of the issue in the Title above -->
-
-### Current Behavior
-
-<!--- Tell us what happens instead of the expected behavior -->
-
-### Expected Behavior
-
-<!--- Tell us what should happen -->
-
-### Steps to Reproduce
-
-<!--- Provide a minimal example that reproduces the bug -->
-
-### Environment
-
-<!--- Please provide details about your local environment -->
-
-Python version:
-NetworkX version:
-
-### Additional context
-
-<!--- Add any other context about the problem here, screenshots, etc. -->
diff --git a/.github/ISSUE_TEMPLATE/config.yml b/.github/ISSUE_TEMPLATE/config.yml
deleted file mode 100644
index d0be5a4..0000000
--- a/.github/ISSUE_TEMPLATE/config.yml
+++ /dev/null
@@ -1,8 +0,0 @@
-blank_issues_enabled: true
-contact_links:
-  - name: Questions about NetworkX
-    url: https://github.com/networkx/networkx/discussions/new?category=q-a
-    about: Ask questions about usage of NetworkX
-  - name: Discussions and Ideas
-    url: https://github.com/networkx/networkx/discussions
-    about: Talk about new algorithms, feature requests, show your latest application of networks.
diff --git a/.github/PULL_REQUEST_TEMPLATE.md b/.github/PULL_REQUEST_TEMPLATE.md
deleted file mode 100644
index bcc8cfd..0000000
--- a/.github/PULL_REQUEST_TEMPLATE.md
+++ /dev/null
@@ -1,4 +0,0 @@
-<!--
-Please run black to format your code.
-See https://networkx.org/documentation/latest/developer/contribute.html for details.
--->
diff --git a/.github/workflows/circleci.yml b/.github/workflows/circleci.yml
deleted file mode 100644
index 5d2cc41..0000000
--- a/.github/workflows/circleci.yml
+++ /dev/null
@@ -1,25 +0,0 @@
-name: circleci
-
-on: [status]
-jobs:
-  image:
-    runs-on: ubuntu-latest
-    name: Run CircleCI image artifact redirector
-    steps:
-      - name: GitHub Action step
-        uses: larsoner/circleci-artifacts-redirector-action@master
-        with:
-          repo-token: ${{ secrets.GITHUB_TOKEN }}
-          artifact-path: 0/results/fig_comparison.html
-          circleci-jobs: image
-
-  documentation:
-    runs-on: ubuntu-latest
-    name: Run CircleCI documentation artifact redirector
-    steps:
-      - name: GitHub Action step
-        uses: larsoner/circleci-artifacts-redirector-action@master
-        with:
-          repo-token: ${{ secrets.GITHUB_TOKEN }}
-          artifact-path: 0/doc/build/html/index.html
-          circleci-jobs: documentation
diff --git a/.github/workflows/coverage.yml b/.github/workflows/coverage.yml
deleted file mode 100644
index a7d0c8f..0000000
--- a/.github/workflows/coverage.yml
+++ /dev/null
@@ -1,38 +0,0 @@
-name: coverage
-
-on:
-  push:
-    branches: [v2.8]
-  pull_request:
-    branches: [v2.8]
-
-jobs:
-  report:
-    runs-on: ubuntu-22.04
-    strategy:
-      matrix:
-        python-version: ["3.10"]
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Before install
-        run: |
-          sudo apt-get update
-          sudo apt-get install graphviz graphviz-dev
-
-      - name: Install packages
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/default.txt -r requirements/test.txt
-          pip install -r requirements/extra.txt
-          pip install .
-          pip list
-
-      - name: Test NetworkX
-        run: |
-          pytest --cov=networkx --runslow --doctest-modules --durations=20 --pyargs networkx
-          codecov
diff --git a/.github/workflows/deploy-docs.yml b/.github/workflows/deploy-docs.yml
deleted file mode 100644
index fc714d4..0000000
--- a/.github/workflows/deploy-docs.yml
+++ /dev/null
@@ -1,72 +0,0 @@
-name: deploy
-
-on:
-  push:
-    branches: [v2.8]
-
-jobs:
-  documentation:
-    # Do not attempt to deploy documentation on forks
-    if: github.repository_owner == 'networkx'
-
-    runs-on: ubuntu-22.04
-
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python
-        uses: actions/setup-python@v3
-        with:
-          python-version: "3.9"
-
-      - name: Before install
-        run: |
-          sudo apt-get update
-          sudo apt-get install graphviz graphviz-dev
-          sudo apt-get install texlive texlive-latex-extra latexmk texlive-xetex
-          sudo apt-get install fonts-freefont-otf xindy
-          sudo apt-get install libspatialindex-dev
-
-      - name: Install packages
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/default.txt -r requirements/test.txt
-          pip install -r requirements/extra.txt
-          pip install -r requirements/example.txt
-          pip install -U -r requirements/doc.txt
-          pip install .
-          pip list
-
-      # To set up a cross-repository deploy key:
-      # 1. Create a key pair:
-      #   `ssh-keygen -t ed25519 -C "nx_doc_deploy_bot@nomail"`
-      # 2. Add the public key to the networkx/documentation repo
-      #   - Settings -> Deploy keys -> Add new
-      #   - Make sure the key has write permissions
-      # 3. Add private key as a secret to networkx/networkx repo
-      #   - Settings -> Secrets -> New Repository Secret
-      #   - Make sure the name is the same as below: CI_DEPLOY_KEY
-      - name: Install SSH agent
-        if: github.ref == 'refs/heads/v2.8'
-        uses: webfactory/ssh-agent@v0.5.4
-        with:
-          ssh-private-key: ${{ secrets.CI_DEPLOY_KEY }}
-
-      - name: Build docs
-        if: github.ref == 'refs/heads/v2.8'
-        run: |
-          export DISPLAY=:99
-          make -C doc/ html
-          make -C doc/ latexpdf LATEXOPTS="-file-line-error -halt-on-error"
-          cp -a doc/build/latex/networkx_reference.pdf doc/build/html/_downloads/.
-
-      - name: Deploy docs
-        if: github.ref == 'refs/heads/v2.8'
-        uses: JamesIves/github-pages-deploy-action@releases/v3
-        with:
-          GIT_CONFIG_NAME: nx-doc-deploy-bot
-          GIT_CONFIG_EMAIL: nx-doc-deploy-bot@nomail
-          FOLDER: doc/build/html
-          REPOSITORY_NAME: networkx/documentation
-          BRANCH: gh-pages
-          TARGET_FOLDER: latest
-          SSH: true
diff --git a/.github/workflows/lint.yml b/.github/workflows/lint.yml
deleted file mode 100644
index 4c6c698..0000000
--- a/.github/workflows/lint.yml
+++ /dev/null
@@ -1,33 +0,0 @@
-name: style
-
-on:
-  push:
-    branches:
-      - v2.8
-  pull_request:
-    branches:
-      - v2.8
-
-jobs:
-  format:
-    runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        python-version: [3.8]
-
-    steps:
-      - uses: actions/checkout@v3
-
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Install packages
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/developer.txt
-          pip list
-
-      - name: Lint
-        run: pre-commit run --all-files --show-diff-on-failure --color always
diff --git a/.github/workflows/mypy.yml b/.github/workflows/mypy.yml
deleted file mode 100644
index 9aecd42..0000000
--- a/.github/workflows/mypy.yml
+++ /dev/null
@@ -1,30 +0,0 @@
-name: Mypy
-
-on: [push, pull_request]
-
-jobs:
-  type-check:
-    runs-on: ubuntu-latest
-    strategy:
-      matrix:
-        python-version: [3.9]
-
-    steps:
-      - uses: actions/checkout@v3
-
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Install packages
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/developer.txt
-          # Rm below when yaml no longer a dep
-          pip install types-PyYAML
-          pip install -e .
-          pip list
-
-      - name: run mypy
-        run: mypy -p networkx
diff --git a/.github/workflows/pytest-randomly.yml b/.github/workflows/pytest-randomly.yml
deleted file mode 100644
index 84168d2..0000000
--- a/.github/workflows/pytest-randomly.yml
+++ /dev/null
@@ -1,26 +0,0 @@
-name: Run test suite in random order
-on:
-  schedule:
-    # Run this workflow once a day
-    - cron: "0 0 * * *"
-
-jobs:
-  randomize-test-order:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v2
-      - name: Set up Python
-        uses: actions/setup-python@v2
-        with:
-          python-version: "3.9"
-
-      - name: Install packages
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/default.txt -r requirements/test.txt
-          pip install pytest-randomly
-          pip install .
-          pip list
-
-      - name: Run tests
-        run: pytest --doctest-modules --durations=10 --pyargs networkx
diff --git a/.github/workflows/test.yml b/.github/workflows/test.yml
deleted file mode 100644
index 6e402f2..0000000
--- a/.github/workflows/test.yml
+++ /dev/null
@@ -1,142 +0,0 @@
-name: test
-
-on:
-  push:
-    branches:
-      - v2.8
-  pull_request:
-    branches:
-      - v2.8
-
-jobs:
-  base:
-    runs-on: ${{ matrix.os }}-latest
-    strategy:
-      matrix:
-        os: [ubuntu, macos, windows]
-        python-version: ["pypy-3.8"]
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Install packages
-        run: |
-          python -m pip install --upgrade pip wheel setuptools
-          python -m pip install -r requirements/test.txt
-          python -m pip install .
-          python -m pip list
-
-      - name: Test NetworkX
-        run: |
-          pytest --durations=10 --pyargs networkx
-
-  default:
-    runs-on: ${{ matrix.os }}-latest
-    strategy:
-      matrix:
-        os: [ubuntu, macos, windows]
-        python-version: ["3.8", "3.9", "3.10", "3.11"]
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Install packages
-        run: |
-          python -m pip install --upgrade pip wheel setuptools
-          python -m pip install -r requirements/default.txt -r requirements/test.txt
-          python -m pip install .
-          python -m pip list
-
-      - name: Test NetworkX
-        run: |
-          pytest --doctest-modules --durations=10 --pyargs networkx
-
-  extra:
-    runs-on: ${{ matrix.os }}
-    strategy:
-      matrix:
-        os: [ubuntu-22.04, macos-latest, windows-latest]
-        python-version: ["3.8", "3.9", "3.10", "3.11"]
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Before install (Linux)
-        if: runner.os == 'Linux'
-        run: sudo apt-get update && sudo apt-get install graphviz graphviz-dev
-
-      - name: Before install (macOS)
-        if: runner.os == 'macOS'
-        run: brew install graphviz
-
-      - name: Before install (Windows)
-        if: runner.os == 'Windows'
-        run: choco install graphviz
-
-      - name: Install packages (Linux)
-        if: runner.os == 'Linux'
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/default.txt -r requirements/test.txt
-          pip install -r requirements/extra.txt
-          pip install .
-          pip list
-      - name: Install packages (macOS)
-        if: runner.os == 'macOS'
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install -r requirements/default.txt -r requirements/test.txt
-          pip install --global-option=build_ext --global-option="-I/usr/local/include/" --global-option="-L/usr/local/lib/" pygraphviz
-          pip install -r requirements/extra.txt
-          pip install .
-          pip list
-      - name: Install packages (windows)
-        if: runner.os == 'Windows'
-        run: |
-          echo "C:\Program Files\Graphviz\bin" | Out-File -FilePath $env:GITHUB_PATH -Encoding utf8 -Append
-          python -m pip install --upgrade pip wheel setuptools
-          python -m pip install -r requirements/default.txt -r requirements/test.txt
-          python -m pip install --global-option=build_ext `
-                                --global-option="-IC:\Program Files\Graphviz\include" `
-                                --global-option="-LC:\Program Files\Graphviz\lib" `
-                                pygraphviz
-          python -m pip install -r requirements/extra.txt
-          python -m pip install .
-          python -m pip list
-
-      - name: Test NetworkX
-        run: |
-          pytest --doctest-modules --durations=10 --pyargs networkx
-
-  prerelease:
-    runs-on: ${{ matrix.os }}-latest
-    strategy:
-      matrix:
-        os: [ubuntu, macos]
-        python-version: ["3.8", "3.9", "3.10", "3.11"]
-    steps:
-      - uses: actions/checkout@v3
-      - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v3
-        with:
-          python-version: ${{ matrix.python-version }}
-
-      - name: Install packages
-        run: |
-          pip install --upgrade pip wheel setuptools
-          pip install --pre -r requirements/default.txt -r requirements/test.txt
-          pip install .
-          pip list
-
-      - name: Test NetworkX
-        run: |
-          pytest --doctest-modules --durations=10 --pyargs networkx
diff --git a/.gitignore b/.gitignore
deleted file mode 100644
index e454ae3..0000000
--- a/.gitignore
+++ /dev/null
@@ -1,58 +0,0 @@
-*.pyc
-__pycache__
-*~
-.DS_Store
-build/*
-dist/*
-networkx/version.py
-examples/*/*.png
-doc/networkx-documentation.zip
-doc/networkx_reference.pdf
-doc/networkx_tutorial.pdf
-doc/build
-doc/ghpages_build
-.coverage
-*.class
-
-# Generated while building documentation.
-doc/auto_examples
-doc/modules
-doc/reference/generated
-doc/reference/algorithms/generated
-doc/reference/classes/generated
-doc/reference/readwrite/generated
-doc/path.to.file
-
-examples/advanced/edgelist.utf-8
-examples/basic/grid.edgelist
-
-# Generated when 'python setup_egg.py'
-networkx.egg-info/
-
-# Sublime Text project files
-*.sublime-project
-*.sublime-workspace
-
-# Backup files
-*.bak
-
-# IPython Notebook Checkpoints
-.ipynb_checkpoints/
-
-# Vim's swap files
-*.sw[op]
-
-# Spyder project file
-.spyderproject
-
-# PyCharm project file
-.idea
-
-# VS Code settings
-.vscode
-
-# PyTest Cache
-.pytest_cache
-
-# Virtual environment directory
-networkx-dev/
\ No newline at end of file
diff --git a/.mypy.ini b/.mypy.ini
deleted file mode 100644
index f10a25b..0000000
--- a/.mypy.ini
+++ /dev/null
@@ -1,3 +0,0 @@
-[mypy]
-ignore_missing_imports = True
-exclude = yaml|subgraphviews|reportviews*
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
deleted file mode 100644
index 3efcf9c..0000000
--- a/.pre-commit-config.yaml
+++ /dev/null
@@ -1,30 +0,0 @@
-# Install pre-commit hooks via
-# pre-commit install
-
-repos:
-  - repo: https://github.com/psf/black
-    rev: 22.10.0
-    hooks:
-      - id: black
-  - repo: https://github.com/asottile/pyupgrade
-    rev: v3.0.0
-    hooks:
-      - id: pyupgrade
-        args: [--py38-plus]
-  - repo: https://github.com/asottile/blacken-docs
-    rev: v1.12.1
-    hooks:
-      - id: blacken-docs
-  - repo: https://github.com/pycqa/isort
-    rev: 5.10.1
-    hooks:
-      - id: isort
-        name: isort (python)
-        args: ["--profile", "black", "--filter-files", "--skip", "__init__.py"]
-        files: ^networkx/
-  - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: v2.7.1
-    hooks:
-      - id: prettier
-        files: \.(html|md|yml|yaml)
-        args: [--prose-wrap=preserve]
diff --git a/CODE_OF_CONDUCT.rst b/CODE_OF_CONDUCT.rst
deleted file mode 100644
index 0822de9..0000000
--- a/CODE_OF_CONDUCT.rst
+++ /dev/null
@@ -1,152 +0,0 @@
-.. _code_of_conduct:
-
-Code of Conduct
-===============
-
-
-Introduction
-------------
-
-This code of conduct applies to all spaces managed by the NetworkX project,
-including all public and private mailing lists, issue trackers, wikis, and
-any other communication channel used by our community.
-
-This code of conduct should be honored by everyone who participates in
-the NetworkX community formally or informally, or claims any affiliation with the
-project, in any project-related activities and especially when representing the
-project, in any role.
-
-This code is not exhaustive or complete. It serves to distill our common
-understanding of a collaborative, shared environment and goals. Please try to
-follow this code in spirit as much as in letter, to create a friendly and
-productive environment that enriches the surrounding community.
-
-
-Specific Guidelines
--------------------
-
-We strive to:
-
-1. Be open. We invite anyone to participate in our community. We prefer to use
-   public methods of communication for project-related messages, unless
-   discussing something sensitive. This applies to messages for help or
-   project-related support, too; not only is a public support request much more
-   likely to result in an answer to a question, it also ensures that any
-   inadvertent mistakes in answering are more easily detected and corrected.
-
-2. Be empathetic, welcoming, friendly, and patient. We work together to resolve
-   conflict, and assume good intentions. We may all experience some frustration
-   from time to time, but we do not allow frustration to turn into a personal
-   attack. A community where people feel uncomfortable or threatened is not a
-   productive one.
-
-3. Be collaborative. Our work will be used by other people, and in turn we will
-   depend on the work of others. When we make something for the benefit of the
-   project, we are willing to explain to others how it works, so that they can
-   build on the work to make it even better. Any decision we make will affect
-   users and colleagues, and we take those consequences seriously when making
-   decisions.
-
-4. Be inquisitive. Nobody knows everything! Asking questions early avoids many
-   problems later, so we encourage questions, although we may direct them to
-   the appropriate forum. We will try hard to be responsive and helpful.
-
-5. Be careful in the words that we choose.  We are careful and respectful in
-   our communication and we take responsibility for our own speech. Be kind to
-   others. Do not insult or put down other participants.  We will not accept
-   harassment or other exclusionary behaviour, such as:
-
-    - Violent threats or language directed against another person.
-    - Sexist, racist, or otherwise discriminatory jokes and language.
-    - Posting sexually explicit or violent material.
-    - Posting (or threatening to post) other people's personally identifying information ("doxing").
-    - Sharing private content, such as emails sent privately or non-publicly,
-      or unlogged forums such as IRC channel history, without the sender's consent.
-    - Personal insults, especially those using racist or sexist terms.
-    - Unwelcome sexual attention.
-    - Excessive profanity. Please avoid swearwords; people differ greatly in their sensitivity to swearing.
-    - Repeated harassment of others. In general, if someone asks you to stop, then stop.
-    - Advocating for, or encouraging, any of the above behaviour.
-
-
-Diversity Statement
--------------------
-
-The NetworkX project welcomes and encourages participation by everyone. We are
-committed to being a community that everyone enjoys being part of. Although
-we may not always be able to accommodate each individual's preferences, we try
-our best to treat everyone kindly.
-
-No matter how you identify yourself or how others perceive you: we welcome you.
-Though no list can hope to be comprehensive, we explicitly honour diversity in:
-age, culture, ethnicity, genotype, gender identity or expression, language,
-national origin, neurotype, phenotype, political beliefs, profession, race,
-religion, sexual orientation, socioeconomic status, subculture and technical
-ability.
-
-Though we welcome people fluent in all languages, NetworkX development is
-conducted in English.
-
-Standards for behaviour in the NetworkX community are detailed in the Code of
-Conduct above. Participants in our community should uphold these standards
-in all their interactions and help others to do so as well (see next section).
-
-
-Reporting Guidelines
---------------------
-
-We know that it is painfully common for internet communication to start at or
-devolve into obvious and flagrant abuse.  We also recognize that sometimes
-people may have a bad day, or be unaware of some of the guidelines in this Code
-of Conduct. Please keep this in mind when deciding on how to respond to a
-breach of this Code.
-
-For clearly intentional breaches, report those to the NetworkX Steering Council
-(see below). For possibly unintentional breaches, you may reply to the person
-and point out this code of conduct (either in public or in private, whatever is
-most appropriate). If you would prefer not to do that, please feel free to
-report to the NetworkX Steering Council directly, or ask the Council for
-advice, in confidence.
-
-You can report issues to the
-`NetworkX Steering Council <https://github.com/orgs/networkx/teams/steering-council/members>`__,
-at networkx-conduct@groups.io.
-
-If your report involves any members of the Council, or if they feel they have
-a conflict of interest in handling it, then they will recuse themselves from
-considering your report. Alternatively, if for any reason you feel
-uncomfortable making a report to the Council, then you can also contact:
-
-- Senior `NumFOCUS staff <https://numfocus.org/code-of-conduct#persons-responsible>`__: conduct@numfocus.org.
-
-
-Incident reporting resolution & Code of Conduct enforcement
------------------------------------------------------------
-
-We will investigate and respond to all complaints. The NetworkX Steering Council
-will protect the identity of the reporter, and treat the content of
-complaints as confidential (unless the reporter agrees otherwise).
-
-In case of severe and obvious breaches, e.g., personal threat or violent, sexist
-or racist language, we will immediately disconnect the originator from NetworkX
-communication channels.
-
-In cases not involving clear severe and obvious breaches of this code of
-conduct, the process for acting on any received code of conduct violation
-report will be:
-
-1. acknowledge report is received
-2. reasonable discussion/feedback
-3. mediation (if feedback didn't help, and only if both reporter and reportee agree to this)
-4. enforcement via transparent decision by the NetworkX Steering Council
-
-The Council will respond to any report as soon as possible, and at most
-within 72 hours.
-
-
-Endnotes
---------
-
-This document is adapted from:
-
-- `SciPy Code of Conduct <http://scipy.github.io/devdocs/dev/conduct/code_of_conduct.html>`_
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 003ad8c..a8cb72c 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -47,7 +47,7 @@ Development Workflow
          # Install main development and runtime dependencies of networkx
          pip install -r requirements/default.txt -r requirements/test.txt -r requirements/developer.txt
          #
-         # (Optional) Install pygraphviz, pydot, and gdal packages
+         # (Optional) Install pygraphviz and pydot packages
          # These packages require that you have your system properly configured
          # and what that involves differs on various systems.
          # pip install -r requirements/extra.txt
@@ -68,7 +68,7 @@ Development Workflow
          # Install main development and runtime dependencies of networkx
          conda install -c conda-forge --file requirements/default.txt --file requirements/test.txt --file requirements/developer.txt
          #
-         # (Optional) Install pygraphviz, pydot, and gdal packages
+         # (Optional) Install pygraphviz and pydot packages
          # These packages require that you have your system properly configured
          # and what that involves differs on various systems.
          # conda install -c conda-forge --file requirements/extra.txt
@@ -325,11 +325,11 @@ Or the tests for a specific submodule::
 
 Or tests from a specific file::
 
-    $ PYTHONPATH=. pytest networkx/readwrite/tests/test_yaml.py
+    $ PYTHONPATH=. pytest networkx/readwrite/tests/test_edgelist.py
 
 Or a single test within that file::
 
-    $ PYTHONPATH=. pytest networkx/readwrite/tests/test_yaml.py::TestYaml::testUndirected
+    $ PYTHONPATH=. pytest networkx/readwrite/tests/test_edgelist.py::test_parse_edgelist_with_data_list
 
 Use ``--doctest-modules`` to run doctests.
 For example, run all tests and all doctests using::
diff --git a/INSTALL.rst b/INSTALL.rst
index 792c099..b8ffc2d 100644
--- a/INSTALL.rst
+++ b/INSTALL.rst
@@ -78,7 +78,7 @@ Extra packages
 --------------
 
 .. note::
-   Some optional packages (e.g., `gdal`) may require compiling
+   Some optional packages may require compiling
    C or C++ code.  If you have difficulty installing these packages
    with `pip`, please consult the homepages of those packages.
 
@@ -89,8 +89,6 @@ version requirements.
 - `PyGraphviz <http://pygraphviz.github.io/>`_ and
   `pydot <https://github.com/erocarrera/pydot>`_ provide graph drawing
   and graph layout algorithms via `GraphViz <http://graphviz.org/>`_.
-- `PyYAML <http://pyyaml.org/>`_ provides YAML format reading and writing.
-- `gdal <http://www.gdal.org/>`_ provides shapefile format reading and writing.
 - `lxml <http://lxml.de/>`_ used for GraphML XML format.
 
 To install ``networkx`` and extra packages, do::
@@ -99,7 +97,7 @@ To install ``networkx`` and extra packages, do::
 
 To explicitly install all optional packages, do::
 
-    $ pip install pygraphviz pydot pyyaml gdal lxml
+    $ pip install pygraphviz pydot lxml
 
 Or, install any optional package (e.g., ``pygraphviz``) individually::
 
diff --git a/LICENSE.txt b/LICENSE.txt
index a274a66..42b6f17 100644
--- a/LICENSE.txt
+++ b/LICENSE.txt
@@ -2,7 +2,7 @@ NetworkX is distributed with the 3-clause BSD license.
 
 ::
 
-   Copyright (C) 2004-2022, NetworkX Developers
+   Copyright (C) 2004-2023, NetworkX Developers
    Aric Hagberg <hagberg@lanl.gov>
    Dan Schult <dschult@colgate.edu>
    Pieter Swart <swart@lanl.gov>
diff --git a/PKG-INFO b/PKG-INFO
new file mode 100644
index 0000000..2b1d4d0
--- /dev/null
+++ b/PKG-INFO
@@ -0,0 +1,113 @@
+Metadata-Version: 2.1
+Name: networkx
+Version: 3.1rc1.dev0
+Summary: Python package for creating and manipulating graphs and networks
+Home-page: https://networkx.org/
+Author: Aric Hagberg
+Author-email: hagberg@lanl.gov
+Maintainer: NetworkX Developers
+Maintainer-email: networkx-discuss@googlegroups.com
+Project-URL: Bug Tracker, https://github.com/networkx/networkx/issues
+Project-URL: Documentation, https://networkx.org/documentation/stable/
+Project-URL: Source Code, https://github.com/networkx/networkx
+Keywords: Networks,Graph Theory,Mathematics,network,graph,discrete mathematics,math
+Platform: Linux
+Platform: Mac OSX
+Platform: Windows
+Platform: Unix
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: Intended Audience :: Science/Research
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3 :: Only
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Topic :: Scientific/Engineering :: Bio-Informatics
+Classifier: Topic :: Scientific/Engineering :: Information Analysis
+Classifier: Topic :: Scientific/Engineering :: Mathematics
+Classifier: Topic :: Scientific/Engineering :: Physics
+Requires-Python: >=3.8
+Provides-Extra: default
+Provides-Extra: developer
+Provides-Extra: doc
+Provides-Extra: extra
+Provides-Extra: test
+License-File: LICENSE.txt
+
+NetworkX
+========
+
+.. image:: https://github.com/networkx/networkx/workflows/test/badge.svg?branch=main
+  :target: https://github.com/networkx/networkx/actions?query=workflow%3A%22test%22
+
+.. image:: https://codecov.io/gh/networkx/networkx/branch/main/graph/badge.svg
+   :target: https://app.codecov.io/gh/networkx/networkx/branch/main
+   
+.. image:: https://img.shields.io/github/labels/networkx/networkx/Good%20First%20Issue?color=green&label=Contribute%20&style=flat-square
+   :target: https://github.com/networkx/networkx/issues?q=is%3Aopen+is%3Aissue+label%3A%22Good+First+Issue%22
+   
+
+NetworkX is a Python package for the creation, manipulation,
+and study of the structure, dynamics, and functions
+of complex networks.
+
+- **Website (including documentation):** https://networkx.org
+- **Mailing list:** https://groups.google.com/forum/#!forum/networkx-discuss
+- **Source:** https://github.com/networkx/networkx
+- **Bug reports:** https://github.com/networkx/networkx/issues
+- **Report a security vulnerability:** https://tidelift.com/security
+- **Tutorial:** https://networkx.org/documentation/latest/tutorial.html
+- **GitHub Discussions:** https://github.com/networkx/networkx/discussions
+
+Simple example
+--------------
+
+Find the shortest path between two nodes in an undirected graph:
+
+.. code:: pycon
+
+    >>> import networkx as nx
+    >>> G = nx.Graph()
+    >>> G.add_edge("A", "B", weight=4)
+    >>> G.add_edge("B", "D", weight=2)
+    >>> G.add_edge("A", "C", weight=3)
+    >>> G.add_edge("C", "D", weight=4)
+    >>> nx.shortest_path(G, "A", "D", weight="weight")
+    ['A', 'B', 'D']
+
+Install
+-------
+
+Install the latest version of NetworkX::
+
+    $ pip install networkx
+
+Install with all optional dependencies::
+
+    $ pip install networkx[all]
+
+For additional details, please see `INSTALL.rst`.
+
+Bugs
+----
+
+Please report any bugs that you find `here <https://github.com/networkx/networkx/issues>`_.
+Or, even better, fork the repository on `GitHub <https://github.com/networkx/networkx>`_
+and create a pull request (PR). We welcome all changes, big or small, and we
+will help you make the PR if you are new to `git` (just ask on the issue and/or
+see `CONTRIBUTING.rst`).
+
+License
+-------
+
+Released under the 3-Clause BSD license (see `LICENSE.txt`)::
+
+   Copyright (C) 2004-2023 NetworkX Developers
+   Aric Hagberg <hagberg@lanl.gov>
+   Dan Schult <dschult@colgate.edu>
+   Pieter Swart <swart@lanl.gov>
diff --git a/README.rst b/README.rst
index e76b856..0f7e1c4 100644
--- a/README.rst
+++ b/README.rst
@@ -66,7 +66,7 @@ License
 
 Released under the 3-Clause BSD license (see `LICENSE.txt`)::
 
-   Copyright (C) 2004-2022 NetworkX Developers
+   Copyright (C) 2004-2023 NetworkX Developers
    Aric Hagberg <hagberg@lanl.gov>
    Dan Schult <dschult@colgate.edu>
    Pieter Swart <swart@lanl.gov>
diff --git a/debian/changelog b/debian/changelog
index 45357d6..e1d4938 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,8 +1,9 @@
-networkx (2.8.8-2) UNRELEASED; urgency=medium
+networkx (3.0+git20230110.1.b316afe-1) UNRELEASED; urgency=medium
 
   * Trim trailing whitespace.
+  * New upstream snapshot.
 
- -- Debian Janitor <janitor@jelmer.uk>  Tue, 10 Jan 2023 05:38:51 -0000
+ -- Debian Janitor <janitor@jelmer.uk>  Wed, 11 Jan 2023 04:19:36 -0000
 
 networkx (2.8.8-1) unstable; urgency=medium
 
diff --git a/doc/README.md b/doc/README.md
deleted file mode 100644
index 96012b2..0000000
--- a/doc/README.md
+++ /dev/null
@@ -1,31 +0,0 @@
-# Building docs
-
-We use Sphinx for generating the API and reference documentation.
-
-Pre-built versions can be found at
-
-    https://networkx.org/
-
-for both the stable and the latest (i.e., development) releases.
-
-## Instructions
-
-After installing NetworkX and its dependencies, install the Python
-packages needed to build the documentation by entering::
-
-    pip install -r requirements/doc.txt
-
-in the root directory.
-
-To build the HTML documentation, enter::
-
-    make html
-
-in the `doc/` directory. This will generate a `build/html` subdirectory
-containing the built documentation.
-
-To build the PDF documentation, enter::
-
-    make latexpdf
-
-You will need to have LaTeX installed for this.
diff --git a/doc/_static/networkx_banner.svg b/doc/_static/networkx_banner.svg
deleted file mode 100644
index aa20bf8..0000000
--- a/doc/_static/networkx_banner.svg
+++ /dev/null
@@ -1,216 +0,0 @@
-<?xml version="1.0" encoding="UTF-8" standalone="no"?>
-<!-- Created with Inkscape (http://www.inkscape.org/) -->
-
-<svg
-   xmlns:dc="http://purl.org/dc/elements/1.1/"
-   xmlns:cc="http://creativecommons.org/ns#"
-   xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
-   xmlns:svg="http://www.w3.org/2000/svg"
-   xmlns="http://www.w3.org/2000/svg"
-   xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
-   xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
-   width="119.00541mm"
-   height="26.772989mm"
-   viewBox="0 0 119.00541 26.772989"
-   version="1.1"
-   id="svg1557"
-   inkscape:version="0.92.4 (unknown)"
-   sodipodi:docname="networkx_banner.svg">
-  <defs
-     id="defs1551" />
-  <sodipodi:namedview
-     id="base"
-     pagecolor="#ffffff"
-     bordercolor="#666666"
-     borderopacity="1.0"
-     inkscape:pageopacity="0.0"
-     inkscape:pageshadow="2"
-     inkscape:zoom="1.4"
-     inkscape:cx="238.50138"
-     inkscape:cy="-32.516569"
-     inkscape:document-units="mm"
-     inkscape:current-layer="layer1"
-     showgrid="false"
-     inkscape:window-width="1920"
-     inkscape:window-height="1061"
-     inkscape:window-x="0"
-     inkscape:window-y="19"
-     inkscape:window-maximized="0"
-     fit-margin-top="0"
-     fit-margin-left="0"
-     fit-margin-right="0"
-     fit-margin-bottom="0" />
-  <metadata
-     id="metadata1554">
-    <rdf:RDF>
-      <cc:Work
-         rdf:about="">
-        <dc:format>image/svg+xml</dc:format>
-        <dc:type
-           rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
-        <dc:title></dc:title>
-      </cc:Work>
-    </rdf:RDF>
-  </metadata>
-  <g
-     inkscape:label="Layer 1"
-     inkscape:groupmode="layer"
-     id="layer1"
-     transform="translate(-11.556817,-12.2266)">
-    <g
-       id="g996-1"
-       transform="matrix(0.26458333,0,0,0.26458333,-2.2489386,8.0893424)">
-      <text
-         id="text3150-8"
-         y="83.149475"
-         x="163.22154"
-         style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;line-height:0%;font-family:Verdana;-inkscape-font-specification:Verdana;letter-spacing:0px;word-spacing:0px;fill:#4d4d4d;fill-opacity:1;stroke:none"
-         xml:space="preserve"><tspan
-           style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;font-size:70px;line-height:1.25;font-family:'DejaVu Sans';-inkscape-font-specification:'DejaVu Sans'"
-           y="83.149475"
-           x="163.22154"
-           id="tspan3152-3"
-           sodipodi:role="line">NetworkX</tspan></text>
-      <text
-         id="text3154-4"
-         y="111.43372"
-         x="167.75905"
-         style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;line-height:0%;font-family:Verdana;-inkscape-font-specification:Verdana;letter-spacing:0px;word-spacing:0px;fill:#4d4d4d;fill-opacity:1;stroke:none"
-         xml:space="preserve"><tspan
-           style="font-style:normal;font-variant:normal;font-weight:normal;font-stretch:normal;font-size:24.66666603px;line-height:1;font-family:'DejaVu Sans';-inkscape-font-specification:'DejaVu Sans'"
-           y="111.43372"
-           x="167.75905"
-           id="tspan3156-2"
-           sodipodi:role="line">Network Analysis in Python</tspan></text>
-    </g>
-    <g
-       id="g4909"
-       transform="matrix(0.26458333,0,0,0.26458333,-2.8736486,-86.571652)">
-      <circle
-         transform="rotate(-20)"
-         style="opacity:1;fill:#ff7f0e;fill-opacity:1;stroke:none;stroke-width:4.04473829;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-         id="path4144-3-5-3"
-         cx="-42.136234"
-         cy="425.194"
-         r="14.62983" />
-      <circle
-         transform="rotate(-20)"
-         style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:6.54949951;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-         id="path4144-3"
-         cx="-42.036808"
-         cy="425.4267"
-         r="22.590828" />
-      <circle
-         transform="rotate(-20)"
-         style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:4.68222761;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-         id="path4144-3-7-3"
-         cx="-43.053768"
-         cy="473.88513"
-         r="12.229949" />
-      <circle
-         transform="rotate(-20)"
-         style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:3.35354948;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-         id="path4144-3-6-7"
-         cx="-84.925018"
-         cy="425.4267"
-         r="8.75945" />
-      <path
-         style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:5.36567926;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
-         d="m 73.611215,425.93735 9.387581,-3.4168"
-         id="path4218-9"
-         inkscape:connector-curvature="0"
-         sodipodi:nodetypes="cc" />
-      <path
-         style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:7.49156427;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
-         d="m 113.7196,435.34942 4.77054,13.10695"
-         id="path4218-6-8"
-         inkscape:connector-curvature="0"
-         sodipodi:nodetypes="cc" />
-      <circle
-         transform="matrix(-0.73923793,0.67344435,0.67344435,0.73923793,0,0)"
-         r="8.75945"
-         cy="375.46466"
-         cx="157.81384"
-         id="circle1079-6"
-         style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:3.35354948;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" />
-      <path
-         sodipodi:nodetypes="cc"
-         inkscape:connector-curvature="0"
-         id="path1081-7"
-         d="m 129.96983,389.50545 -7.38503,6.72775"
-         style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:5.36567926;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1" />
-      <g
-         transform="matrix(0.65159599,0.11562736,0.11562736,-0.65159599,57.400159,489.11581)"
-         id="g1103-0">
-        <circle
-           r="8.5899563"
-           cy="60.834969"
-           cx="50.778111"
-           id="circle1099-4"
-           style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:3.28865886;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" />
-        <path
-           sodipodi:nodetypes="cc"
-           inkscape:connector-curvature="0"
-           id="path1101-8"
-           d="m 59.033005,60.834966 h 9.796749"
-           style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:5.26185417;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1" />
-      </g>
-      <g
-         id="g1109-4"
-         transform="matrix(0.1145481,0.65178659,0.65178659,-0.1145481,15.845354,381.11932)">
-        <circle
-           style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:3.28865886;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-           id="circle1105-8"
-           cx="50.778111"
-           cy="60.834969"
-           r="8.5899563" />
-        <path
-           style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:5.26185417;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
-           d="m 59.033005,60.834966 h 9.796749"
-           id="path1107-1"
-           inkscape:connector-curvature="0"
-           sodipodi:nodetypes="cc" />
-      </g>
-      <circle
-         transform="matrix(-0.90723971,-0.42061396,-0.42061396,0.90723971,0,0)"
-         style="opacity:1;fill:none;fill-opacity:1;stroke:#2c7fb8;stroke-width:2.17635441;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-         id="circle1111-6"
-         cx="-307.58365"
-         cy="329.86111"
-         r="5.6846242" />
-      <path
-         style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:3.48216701;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1"
-         d="m 135.35175,426.33931 -5.88186,-2.72694"
-         id="path1113-8"
-         inkscape:connector-curvature="0"
-         sodipodi:nodetypes="cc" />
-      <circle
-         transform="rotate(-20)"
-         r="7.0025988"
-         cy="473.88513"
-         cx="-43.053768"
-         id="circle1165-5"
-         style="opacity:1;fill:#ff7f0e;fill-opacity:1;stroke:none;stroke-width:1.93602252;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" />
-      <circle
-         transform="rotate(-20)"
-         r="5.0957913"
-         cy="425.4267"
-         cx="-84.925018"
-         id="circle1167-2"
-         style="opacity:1;fill:#ff7f0e;fill-opacity:1;stroke:none;stroke-width:1.40884352;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1" />
-      <circle
-         transform="rotate(-20)"
-         style="opacity:1;fill:#ff7f0e;fill-opacity:1;stroke:none;stroke-width:1.40884352;stroke-linecap:round;stroke-linejoin:miter;stroke-miterlimit:0;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1"
-         id="circle1169-1"
-         cx="-3.3006787"
-         cy="407.26898"
-         r="5.0957913" />
-      <path
-         sodipodi:nodetypes="cc"
-         inkscape:connector-curvature="0"
-         id="path1195-9"
-         d="m 136.92323,434.33633 -8.1301,14.54894"
-         style="fill:none;fill-opacity:1;fill-rule:evenodd;stroke:#2c7fb8;stroke-width:3.48216701;stroke-linecap:butt;stroke-linejoin:miter;stroke-miterlimit:4;stroke-dasharray:none;stroke-opacity:1" />
-    </g>
-  </g>
-</svg>
diff --git a/doc/_templates/layout.html b/doc/_templates/layout.html
new file mode 100644
index 0000000..b4665f6
--- /dev/null
+++ b/doc/_templates/layout.html
@@ -0,0 +1,2 @@
+{% extends "!layout.html" %} {% block content %} {% include "dev_banner.html" %}
+{{ super() }} {% endblock %}
diff --git a/doc/conf.py b/doc/conf.py
index f5b6d55..4059015 100644
--- a/doc/conf.py
+++ b/doc/conf.py
@@ -222,11 +222,11 @@ latex_appendices = ["tutorial"]
 intersphinx_mapping = {
     "python": ("https://docs.python.org/3/", None),
     "numpy": ("https://numpy.org/doc/stable/", None),
-    "neps": ("https://numpy.org/neps", None),
-    "matplotlib": ("https://matplotlib.org/stable", None),
-    "scipy": ("https://docs.scipy.org/doc/scipy/reference", None),
-    "pandas": ("https://pandas.pydata.org/pandas-docs/stable", None),
-    "geopandas": ("https://geopandas.org/", None),
+    "neps": ("https://numpy.org/neps/", None),
+    "matplotlib": ("https://matplotlib.org/stable/", None),
+    "scipy": ("https://docs.scipy.org/doc/scipy/", None),
+    "pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
+    "geopandas": ("https://geopandas.org/en/stable/", None),
     "pygraphviz": ("https://pygraphviz.github.io/documentation/stable/", None),
     "sphinx-gallery": ("https://sphinx-gallery.github.io/stable/", None),
     "nx-guides": ("https://networkx.org/nx-guides/", None),
diff --git a/doc/developer/about_us.rst b/doc/developer/about_us.rst
index b785cf2..34c203f 100644
--- a/doc/developer/about_us.rst
+++ b/doc/developer/about_us.rst
@@ -145,6 +145,8 @@ to add your name to the bottom of the list.
 - Philip Boalch
 - Matt Schwennesen, Github: `mjschwenne <https://github.com/mjschwenne>`_
 - Andrew Knyazev, Github: `lobpcg <https://github.com/lobpcg>`_, LinkedIn: `andrew-knyazev <https://www.linkedin.com/in/andrew-knyazev>`_
+- Luca Cappelletti, GitHub: `LucaCappelletti94 <https://github.com/LucaCappelletti94>`_
+- Sultan Orazbayev, GitHub: `SultanOrazbayev <https://github.com/SultanOrazbayev>`_, LinkedIn: `Sultan Orazbayev <https://www.linkedin.com/in/sultan-orazbayev/>`_
 
 A supplementary (but still incomplete) list of contributors is given by the
 list of names that have commits in ``networkx``'s
diff --git a/doc/developer/deprecations.rst b/doc/developer/deprecations.rst
index db98405..120f49e 100644
--- a/doc/developer/deprecations.rst
+++ b/doc/developer/deprecations.rst
@@ -46,76 +46,19 @@ Version 3.0
 
 * In ``readwrite/gml.py`` remove ``literal_stringizer`` and related tests.
 * In ``readwrite/gml.py`` remove ``literal_destringizer`` and related tests.
-* In ``utils/misc.py`` remove ``is_string_like`` and related tests.
-* In ``utils/misc.py`` remove ``make_str`` and related tests.
-* In ``utils/misc.py`` remove ``is_iterator``.
-* In ``utils/misc.py`` remove ``iterable``.
-* In ``utils/misc.py`` remove ``is_list_of_ints``.
-* In ``utils/misc.py`` remove ``consume``.
-* In ``utils/misc.py`` remove ``default_opener``.
-* In ``utils/misc.py`` remove ``empty_generator``.
-* Remove ``utils/contextmanagers.py`` and related tests.
-* In ``drawing/nx_agraph.py`` remove ``display_pygraphviz`` and related tests.
-* In ``algorithms/chordal.py`` replace ``chordal_graph_cliques`` with ``_chordal_graph_cliques``.
-* In ``algorithms/centrality/betweenness_centrality_subset.py`` remove ``betweenness_centrality_source``.
-* In ``algorithms/centrality/betweenness.py`` remove ``edge_betweeness``.
-* In ``algorithms/community_modularity_max.py`` remove old name ``_naive_greedy_modularity_communities``.
-* In ``linalg/algebraicconnectivity.py`` remove ``_CholeskySolver`` and related code.
-* In ``convert_matrix.py`` remove ``to_numpy_matrix`` and ``from_numpy_matrix``.
-* In ``readwrite/json_graph/cytoscape.py``, change function signature for
-  ``cytoscape_graph`` and ``cytoscape_data`` to replace the ``attrs`` keyword.
-  argument with explicit ``name`` and ``ident`` keyword args.
-* In ``readwrite/json_graph/tree.py``, remove ``attrs`` kwarg from ``tree_graph``
-  and ``tree_data``.
-* Undo changes related to the removal of ``pyyaml``. Remove the
-  ``__getattr__`` definitions from ``networkx/__init__.py``,
-  ``networkx/readwrite/__init__.py`` and ``networkx/readwrite/nx_yaml.py`` and
-  remove ``networkx/readwrite/tests/test_getattr_nxyaml_removal.py``
-* Remove ``readwrite/gpickle.py`` and related tests.
-* Remove ``readwrite/nx_shp.py`` and related tests (add info in alternatives).
 * Remove ``copy`` method in the coreview Filtered-related classes and related tests.
 * In ``algorithms/link_analysis/pagerank_alg.py`` replace ``pagerank`` with ``pagerank_scipy``.
 * In ``algorithms/link_analysis/pagerank_alg.py`` rename ``pagerank_numpy`` as ``_pagerank_numpy``.
 * In ``convert_matrix.py`` remove ``order`` kwarg from ``to_pandas_edgelist`` and docstring
-* Remove ``readwrite/json_graph/jit.py`` and related tests.
-* In ``utils/misc.py`` remove ``generate_unique_node`` and related tests.
-* In ``algorithms/link_analysis/hits_alg.py`` remove ``hub_matrix`` and ``authority_matrix``
-* In ``algorithms/link_analysis/hits_alg.py``, remove ``hits_numpy`` and ``hist_scipy``.
-* In ``classes`` remove the ``ordered`` module and the four ``Ordered``
-  classes defined therein.
-* In ``utils/decorators.py`` remove ``preserve_random_state``.
-* In ``algorithms/community/quality.py`` remove ``coverage`` and ``performance``.
-* Remove ``testing``.
-* In ``linalg/graphmatrix.py`` remove ``adj_matrix``.
-* In ``algorithms/similarity.py`` replace ``simrank_similarity`` with ``simrank_similarity_numpy``.
-* In ``algorithms/assortativity/mixing.py`` remove ``numeric_mixing_matrix``.
-* In ``algorithms/assortativity/connectivity.py`` remove ``k_nearest_neighbors``.
-* In ``utils/decorators.py`` remove ``random_state``.
 * In ``algorithms/operators/binary.py`` remove ``name`` kwarg from ``union`` and docstring.
-* In ``generators/geometric.py`` remove ``euclidean`` and tests.
-* In ``algorithms/node_classification/`` remove ``hmn.py``, ``lgc.py``,
-  and ``utils.py`` after moving the functions defined therein into the newly created
-  ``node_classification.py`` module, which will replace the current package.
 * In ``algorithms/link_analysis/pagerank_alg.py``, remove the
   ``np.asmatrix`` wrappers on the return values of ``google_matrix`` and remove
   the associated FutureWarning.
-* In ``convert_matrix.py`` remove ``from_scipy_sparse_matrix`` and
-  ``to_scipy_sparse_matrix``.
 * In ``linalg/attrmatrix.py`` remove the FutureWarning, update the
   return type by removing ``np.asmatrix``, and update the docstring to
   reflect that the function returns a ``numpy.ndarray`` instance.
-* In ``generators/small.py`` remove ``make_small_graph`` and
-  ``make_small_undirected_graph``.
-* In ``convert_matrix.py`` remove ``to_numpy_recarray``.
-* In ``classes/function.py`` remove ``info``.
-* In ``algorithms/community/modularity_max.py``, remove the deprecated
-  ``n_communities`` parameter from the ``greedy_modularity_communities``
-  function.
 * In ``algorithms/distance_measures.py`` remove ``extrema_bounding``.
-* In ``utils/misc.py`` remove ``dict_to_numpy_array1`` and ``dict_to_numpy_array2``.
-* In ``utils/misc.py`` remove ``to_tuple``.
 * In ``algorithms/matching.py``, remove parameter ``maxcardinality`` from ``min_weight_matching``.
-* In ``drawing/nx_pydot.py``, change PendingDeprecationWarning to DeprecationWarning.
 
 
 Version 3.2
diff --git a/doc/developer/projects.rst b/doc/developer/projects.rst
index f6a2963..755b03b 100644
--- a/doc/developer/projects.rst
+++ b/doc/developer/projects.rst
@@ -41,35 +41,14 @@ Pedagogical Interactive Notebooks for Algorithms Implemented in NetworkX
   pedagogical interactive notebooks for the medium duration project and 4-5 notebooks
   for the long duration project.
 
-Implement the VF2++ Graph Isomorphism Algorithm
------------------------------------------------
-
-- Abstract: The `Graph Isomorphism Problem`_ is a famous difficult network problem at
-  the boundary between P and NP-Complete. The VF2 algorithm is included with NetworkX
-  in a recursive formulation. There is an improved version of this algorithm called
-  `VF2++`_ which we intend to implement. We have early attempts at a nonrecursive version
-  of the main algorithm that also address subgraph isomorphism and subgraph monomorphism.
-  This project involves fully implementing them and extending to directed and multigraph
-  settings.
-
-- Recommended Skills: Python, graph algorithms
-
-- Expected Outcome: A new set of functions in NetworkX that implement the VF2++
-  algorithm for all problem and graph types in a nonrecursive manner.
-
-- Complexity: Moderate
-
-- Interested Mentors: `@dschult <https://github.com/dschult/>`__,
-  `@MridulS <https://github.com/MridulS/>`__, `@boothby <https://github.com/boothby/>`__,
-
-.. _`Graph Isomorphism Problem`: https://en.wikipedia.org/wiki/Graph_isomorphism_problem
-.. _VF2++: https://doi.org/10.1016/j.dam.2018.02.018
-
-- Expected time commitment: Long project (~350 hours)
-
 Completed Projects
 ==================
 
+- `VF2++ algorithm for graph isomorphism`_
+    - Program: Google Summer of Code 2022
+    - Contributor: `@kpetridis24 <https://github.com/kpetridis24/>`__
+    - Link to Proposal: `GSoC 2022: VF2++ Algorithm <https://github.com/networkx/archive/blob/main/proposals-gsoc/GSoC-2022-VF2plusplus-isomorphism.pdf>`__
+
 - `Louvain community detection algorithm`_ 
     - Program: Google Summer of Code 2021
     - Contributor: `@z3y50n <https://github.com/z3y50n/>`__
@@ -98,6 +77,7 @@ Completed Projects
     - Contributor: `@MridulS <https://github.com/MridulS/>`__
     - Link to Proposal: `GSoC 2015: NetworkX 2.0 API <https://github.com/networkx/archive/blob/main/proposals-gsoc/GSoC-2015-NetworkX-2.0-api.md>`__
 
+.. _`VF2++ algorithm for graph isomorphism`: https://github.com/networkx/networkx/pull/5788
 .. _`Louvain community detection algorithm`: https://github.com/networkx/networkx/pull/4929
 .. _`Asadpour algorithm for directed travelling salesman problem`: https://github.com/networkx/networkx/pull/4740
 .. _`Directed acyclic graphs and topological sort`: https://github.com/networkx/nx-guides/pull/44
diff --git a/doc/developer/roadmap.rst b/doc/developer/roadmap.rst
index f0f1b1a..93dec06 100644
--- a/doc/developer/roadmap.rst
+++ b/doc/developer/roadmap.rst
@@ -15,7 +15,7 @@ Installation
 ------------
 
 We aim to make NetworkX as easy to install as possible.
-Some of our dependencies (e.g., graphviz and gdal) can be tricky to install.
+Some of our dependencies (e.g., graphviz) can be tricky to install.
 Other of our dependencies are easy to install on the CPython platform, but
 may be more involved on other platforms such as PyPy.
 Addressing these installation issues may involve working with the external projects.
diff --git a/doc/reference/algorithms/assortativity.rst b/doc/reference/algorithms/assortativity.rst
index 02f3b8d..8ec6167 100644
--- a/doc/reference/algorithms/assortativity.rst
+++ b/doc/reference/algorithms/assortativity.rst
@@ -30,7 +30,6 @@ Average degree connectivity
    :toctree: generated/
 
    average_degree_connectivity
-   k_nearest_neighbors
 
 
 Mixing
@@ -40,7 +39,6 @@ Mixing
 
    attribute_mixing_matrix
    degree_mixing_matrix
-   numeric_mixing_matrix
    attribute_mixing_dict
    degree_mixing_dict
    mixing_dict
diff --git a/doc/reference/algorithms/centrality.rst b/doc/reference/algorithms/centrality.rst
index 84472b9..92ac029 100644
--- a/doc/reference/algorithms/centrality.rst
+++ b/doc/reference/algorithms/centrality.rst
@@ -45,7 +45,6 @@ Current Flow Closeness
    :toctree: generated/
 
    betweenness_centrality
-   betweenness_centrality_source
    betweenness_centrality_subset
    edge_betweenness_centrality
    edge_betweenness_centrality_subset
@@ -149,3 +148,10 @@ VoteRank
    :toctree: generated/
 
    voterank
+
+Laplacian
+---------
+.. autosummary::
+   :toctree: generated/
+
+   laplacian_centrality
\ No newline at end of file
diff --git a/doc/reference/algorithms/community.rst b/doc/reference/algorithms/community.rst
index ff60d36..55f7ad6 100644
--- a/doc/reference/algorithms/community.rst
+++ b/doc/reference/algorithms/community.rst
@@ -71,10 +71,8 @@ Measuring partitions
 .. autosummary::
    :toctree: generated/
 
-   coverage
    modularity
    partition_quality
-   performance
 
 Partitions via centrality measures
 ----------------------------------
diff --git a/doc/reference/algorithms/distance_measures.rst b/doc/reference/algorithms/distance_measures.rst
index 1a6a3c7..f0d0918 100644
--- a/doc/reference/algorithms/distance_measures.rst
+++ b/doc/reference/algorithms/distance_measures.rst
@@ -10,7 +10,6 @@ Distance Measures
    center
    diameter
    eccentricity
-   extrema_bounding
    periphery
    radius
    resistance_distance
diff --git a/doc/reference/algorithms/isomorphism.rst b/doc/reference/algorithms/isomorphism.rst
index 5a29f1e..1d64bd0 100644
--- a/doc/reference/algorithms/isomorphism.rst
+++ b/doc/reference/algorithms/isomorphism.rst
@@ -4,9 +4,6 @@
 Isomorphism
 ***********
 
-.. toctree::
-   :maxdepth: 2
-
 .. automodule:: networkx.algorithms.isomorphism
 .. autosummary::
    :toctree: generated/
@@ -16,6 +13,15 @@ Isomorphism
    fast_could_be_isomorphic
    faster_could_be_isomorphic
 
+VF2++
+-----
+.. automodule:: networkx.algorithms.isomorphism.vf2pp
+.. autosummary::
+   :toctree: generated/
+
+   vf2pp_is_isomorphic
+   vf2pp_all_isomorphisms
+   vf2pp_isomorphism
 
 Tree Isomorphism
 -----------------
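The new ``VF2++`` entry points documented above can be exercised directly; the following is a minimal sketch, assuming NetworkX 3.0 where these functions are re-exported at the top level (e.g. ``nx.vf2pp_is_isomorphic``):

```python
import networkx as nx

# Two structurally identical graphs with different node labels.
G1 = nx.path_graph(4)  # path 0-1-2-3
G2 = nx.relabel_nodes(G1, {0: "a", 1: "b", 2: "c", 3: "d"})

# Boolean isomorphism check.
iso = nx.vf2pp_is_isomorphic(G1, G2, node_label=None)

# One concrete node mapping, or None if no isomorphism exists.
mapping = nx.vf2pp_isomorphism(G1, G2, node_label=None)

# Generator over all isomorphisms; a 4-node path has two
# (identity and reversal of the path).
n_mappings = sum(1 for _ in nx.vf2pp_all_isomorphisms(G1, G2, node_label=None))
```

Passing ``node_label`` selects a node attribute that must match under the mapping; ``None`` means only structure is compared.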
diff --git a/doc/reference/algorithms/link_analysis.rst b/doc/reference/algorithms/link_analysis.rst
index d85ab33..c691f89 100644
--- a/doc/reference/algorithms/link_analysis.rst
+++ b/doc/reference/algorithms/link_analysis.rst
@@ -10,8 +10,6 @@ PageRank
    :toctree: generated/
 
    pagerank
-   pagerank_numpy
-   pagerank_scipy
    google_matrix
 
 Hits
@@ -22,8 +20,3 @@ Hits
    :toctree: generated/
 
    hits
-   hits_numpy
-   hits_scipy
-   hub_matrix
-   authority_matrix
-
diff --git a/doc/reference/algorithms/node_classification.rst b/doc/reference/algorithms/node_classification.rst
index 2229818..6b79fe6 100644
--- a/doc/reference/algorithms/node_classification.rst
+++ b/doc/reference/algorithms/node_classification.rst
@@ -1,21 +1,9 @@
 Node Classification
 ===================
-.. automodule:: networkx.algorithms.node_classification
-.. currentmodule:: networkx
 
-Harmonic Function
------------------
-.. automodule:: networkx.algorithms.node_classification.hmn
+.. automodule:: networkx.algorithms.node_classification
 .. autosummary::
    :toctree: generated/
 
    harmonic_function
-
-
-Local and Global Consistency
-----------------------------
-.. automodule:: networkx.algorithms.node_classification.lgc
-.. autosummary::
-   :toctree: generated/
-
    local_and_global_consistency
diff --git a/doc/reference/algorithms/operators.rst b/doc/reference/algorithms/operators.rst
index 7babf2c..13632e2 100644
--- a/doc/reference/algorithms/operators.rst
+++ b/doc/reference/algorithms/operators.rst
@@ -43,3 +43,4 @@ Operators
    strong_product
    tensor_product
    power
+   corona_product
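`corona_product` is new in this release; a quick size check (assuming networkx >= 3.0), since the corona of G and H has |V(G)|·(1+|V(H)|) nodes:

```python
import networkx as nx

G = nx.cycle_graph(3)  # center graph
H = nx.path_graph(2)   # outer graph, one copy per node of G
C = nx.corona_product(G, H)

# 3 center nodes + 3 copies of H with 2 nodes each = 9 nodes
assert len(C) == 9
# edges: |E(G)| + |V(G)|*|E(H)| + |V(G)|*|V(H)| = 3 + 3 + 6 = 12
assert C.number_of_edges() == 12
```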
diff --git a/doc/reference/algorithms/similarity.rst b/doc/reference/algorithms/similarity.rst
index 4721101..17a4d68 100644
--- a/doc/reference/algorithms/similarity.rst
+++ b/doc/reference/algorithms/similarity.rst
@@ -11,6 +11,5 @@ Similarity Measures
    optimize_graph_edit_distance
    optimize_edit_paths
    simrank_similarity
-   simrank_similarity_numpy
    panther_similarity
    generate_random_paths
diff --git a/doc/reference/algorithms/swap.rst b/doc/reference/algorithms/swap.rst
index ea80cb8..4375b33 100644
--- a/doc/reference/algorithms/swap.rst
+++ b/doc/reference/algorithms/swap.rst
@@ -7,5 +7,6 @@ Swap
    :toctree: generated/
 
    double_edge_swap
+   directed_edge_swap
    connected_double_edge_swap
 
diff --git a/doc/reference/classes/index.rst b/doc/reference/classes/index.rst
index a0fc388..a4acd53 100644
--- a/doc/reference/classes/index.rst
+++ b/doc/reference/classes/index.rst
@@ -39,13 +39,9 @@ Basic graph types
    multidigraph
 
 .. note:: NetworkX uses `dicts` to store the nodes and neighbors in a graph.
-   So the reporting of nodes and edges for the base graph classes will not
-   necessarily be consistent across versions and platforms.  If you need the
-   order of nodes and edges to be consistent (e.g., when writing automated
-   tests), please see :class:`~networkx.OrderedGraph`,
-   :class:`~networkx.OrderedDiGraph`, :class:`~networkx.OrderedMultiGraph`,
-   or :class:`~networkx.OrderedMultiDiGraph`, which behave like the base
-   graph classes but give a consistent order for reporting of nodes and edges.
+   So the reporting of nodes and edges for the base graph classes may not
+   necessarily be consistent across versions and platforms; however, the reporting
+   for CPython is consistent across platforms and versions after 3.6.
 
 Graph Views
 ===========
@@ -99,3 +95,16 @@ Filters
    show_diedges
    show_multidiedges
    show_multiedges
+
+Backends
+========
+
+.. note:: This is an experimental feature to dispatch your computations to an
+   alternate backend like GraphBLAS, instead of using pure Python dictionaries
+   for computation. Things will change and break in the future!
+
+.. automodule:: networkx.classes.backends
+.. autosummary::
+   :toctree: generated/
+
+   _dispatch
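The real `_dispatch` machinery discovers backends through entry points and is explicitly experimental; the core idea, a decorator that routes a call to a registered implementation before falling back to pure Python, can be sketched with a toy registry (all names below are illustrative, not the networkx API):

```python
# Toy illustration of backend dispatch -- NOT the networkx API.
_backends = {}

def register_backend(name, impl):
    _backends[name] = impl

def dispatch(func):
    # Route to a backend when one is requested, else run the
    # pure-Python implementation unchanged.
    def wrapper(*args, backend=None, **kwargs):
        if backend is not None:
            return _backends[backend](*args, **kwargs)
        return func(*args, **kwargs)
    return wrapper

@dispatch
def degree_sum(adj):
    return sum(len(nbrs) for nbrs in adj.values())

register_backend("twice", lambda adj: 2 * len(adj))

adj = {0: [1], 1: [0, 2], 2: [1]}
print(degree_sum(adj))                   # pure-Python path -> 4
print(degree_sum(adj, backend="twice"))  # backend path -> 6
```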
diff --git a/doc/reference/classes/ordered.rst b/doc/reference/classes/ordered.rst
deleted file mode 100644
index c9bd45f..0000000
--- a/doc/reference/classes/ordered.rst
+++ /dev/null
@@ -1,13 +0,0 @@
-.. _ordered:
-
-============================================
-Ordered Graphs---Consistently ordered graphs
-============================================
-
-.. automodule:: networkx.classes.ordered
-
-.. currentmodule:: networkx
-.. autoclass:: OrderedGraph
-.. autoclass:: OrderedDiGraph
-.. autoclass:: OrderedMultiGraph
-.. autoclass:: OrderedMultiDiGraph
diff --git a/doc/reference/convert.rst b/doc/reference/convert.rst
index 4566048..1af838c 100644
--- a/doc/reference/convert.rst
+++ b/doc/reference/convert.rst
@@ -37,10 +37,7 @@ Numpy
 .. autosummary::
    :toctree: generated/
 
-   to_numpy_matrix
    to_numpy_array
-   to_numpy_recarray
-   from_numpy_matrix
    from_numpy_array
 
 Scipy
@@ -49,8 +46,6 @@ Scipy
    :toctree: generated/
 
    to_scipy_sparse_array
-   to_scipy_sparse_matrix
-   from_scipy_sparse_matrix
    from_scipy_sparse_array
 
 Pandas
diff --git a/doc/reference/drawing.rst b/doc/reference/drawing.rst
index efd3436..daf0949 100644
--- a/doc/reference/drawing.rst
+++ b/doc/reference/drawing.rst
@@ -95,4 +95,14 @@ Graph Layout
    spectral_layout
    spiral_layout
    multipartite_layout
-   
+
+
+LaTeX Code
+==========
+.. automodule:: networkx.drawing.nx_latex
+.. autosummary::
+   :toctree: generated/
+
+   to_latex_raw
+   to_latex
+   write_latex
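The `nx_latex` entries above cover the TikZ export added in 3.0; a minimal sketch (assuming networkx >= 3.0):

```python
import networkx as nx

G = nx.path_graph(3)
pos = {0: (0, 0), 1: (1, 0), 2: (2, 0)}

# as_document=False returns just the figure/tikzpicture snippet
# rather than a complete standalone LaTeX document.
latex = nx.to_latex(G, pos=pos, as_document=False)
assert "tikzpicture" in latex
```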
diff --git a/doc/reference/functions.rst b/doc/reference/functions.rst
index 04ad1e5..a859bc8 100644
--- a/doc/reference/functions.rst
+++ b/doc/reference/functions.rst
@@ -12,7 +12,6 @@ Graph
    degree
    degree_histogram
    density
-   info
    create_empty_copy
    is_directed
    to_directed
diff --git a/doc/reference/generators.rst b/doc/reference/generators.rst
index 0985a0b..d2f12b2 100644
--- a/doc/reference/generators.rst
+++ b/doc/reference/generators.rst
@@ -73,7 +73,6 @@ Small
 .. autosummary::
    :toctree: generated/
 
-   make_small_graph
    LCF_graph
    bull_graph
    chvatal_graph
diff --git a/doc/reference/readwrite/gpickle.rst b/doc/reference/readwrite/gpickle.rst
deleted file mode 100644
index 55255c1..0000000
--- a/doc/reference/readwrite/gpickle.rst
+++ /dev/null
@@ -1,8 +0,0 @@
-Pickle
-======
-.. automodule:: networkx.readwrite.gpickle
-.. autosummary::
-   :toctree: generated/
-
-   read_gpickle
-   write_gpickle
diff --git a/doc/reference/readwrite/index.rst b/doc/reference/readwrite/index.rst
index 76a3f21..ad852e3 100644
--- a/doc/reference/readwrite/index.rst
+++ b/doc/reference/readwrite/index.rst
@@ -12,11 +12,9 @@ Reading and writing graphs
    edgelist
    gexf
    gml
-   gpickle
    graphml
    json_graph
    leda
    sparsegraph6
    pajek
-   nx_shp
    matrix_market
diff --git a/doc/reference/readwrite/json_graph.rst b/doc/reference/readwrite/json_graph.rst
index 6ae934f..5336877 100644
--- a/doc/reference/readwrite/json_graph.rst
+++ b/doc/reference/readwrite/json_graph.rst
@@ -12,5 +12,4 @@ JSON
    cytoscape_graph
    tree_data
    tree_graph
-   jit_data
-   jit_graph
+
diff --git a/doc/reference/readwrite/matrix_market.rst b/doc/reference/readwrite/matrix_market.rst
index 3717383..a722c1f 100644
--- a/doc/reference/readwrite/matrix_market.rst
+++ b/doc/reference/readwrite/matrix_market.rst
@@ -95,6 +95,6 @@ sparse matrices::
 
     >>> # Read from file
     >>> fh.seek(0)
-    >>> H = nx.from_scipy_sparse_matrix(sp.io.mmread(fh))
+    >>> H = nx.from_scipy_sparse_array(sp.io.mmread(fh))
     >>> H.edges() == G.edges()
     True
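The doctest fix above reflects the `*_matrix` to `*_array` rename; a minimal round trip through the new functions (assuming networkx >= 3.0 and scipy are installed):

```python
import networkx as nx

G = nx.path_graph(4)
A = nx.to_scipy_sparse_array(G)   # scipy sparse array adjacency
H = nx.from_scipy_sparse_array(A)
assert sorted(H.edges()) == sorted(G.edges())
```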
diff --git a/doc/reference/readwrite/nx_shp.rst b/doc/reference/readwrite/nx_shp.rst
deleted file mode 100644
index 4750762..0000000
--- a/doc/reference/readwrite/nx_shp.rst
+++ /dev/null
@@ -1,10 +0,0 @@
-GIS Shapefile
-=============
-.. automodule:: networkx.readwrite.nx_shp
-.. autosummary::
-   :toctree: generated/
-
-   read_shp
-   write_shp
-
-
diff --git a/doc/reference/utils.rst b/doc/reference/utils.rst
index 7a223fb..e8c1ce4 100644
--- a/doc/reference/utils.rst
+++ b/doc/reference/utils.rst
@@ -12,16 +12,13 @@ Helper Functions
    :toctree: generated/
 
    arbitrary_element
-   is_string_like
    flatten
-   iterable
    make_list_of_ints
-   make_str
-   generate_unique_node
-   default_opener
+   dict_to_numpy_array
    pairwise
    groups
    create_random_state
+   create_py_random_state
    nodes_equal
    edges_equal
    graphs_equal
@@ -68,3 +65,11 @@ Cuthill-Mckee Ordering
 
    cuthill_mckee_ordering
    reverse_cuthill_mckee_ordering
+
+Mapped Queue
+------------
+.. automodule:: networkx.utils.mapped_queue
+.. autosummary::
+   :toctree: generated/
+
+   MappedQueue
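`MappedQueue` is a min-heap whose entries can be updated or removed by value; the standard-library recipe behind that idea, lazy invalidation on top of `heapq`, looks roughly like this (illustrative sketch, not the `MappedQueue` implementation):

```python
import heapq

# Priority queue with updatable priorities via lazy deletion:
# stale entries stay in the heap but are skipped on pop.
class UpdatableQueue:
    def __init__(self):
        self._heap = []
        self._priority = {}  # current priority per element

    def push(self, elt, priority):
        self._priority[elt] = priority
        heapq.heappush(self._heap, (priority, elt))

    def pop(self):
        while self._heap:
            priority, elt = heapq.heappop(self._heap)
            if self._priority.get(elt) == priority:  # not stale
                del self._priority[elt]
                return elt
        raise IndexError("pop from empty queue")

q = UpdatableQueue()
q.push("a", 3)
q.push("b", 1)
q.push("a", 0)   # update: "a" now outranks "b"
print(q.pop())   # -> a
print(q.pop())   # -> b
```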
diff --git a/doc/release/contribs.py b/doc/release/contribs.py
index 55ccacc..6d2f26b 100644
--- a/doc/release/contribs.py
+++ b/doc/release/contribs.py
@@ -5,11 +5,17 @@ import sys
 import string
 import shlex
 
-if len(sys.argv) != 2:
-    print("Usage: ./contributors.py tag-of-previous-release")
+if len(sys.argv) < 2 or len(sys.argv) > 3:
+    print(
+        "Usage: ./contributors.py tag-of-previous-release tag-of-newer-release (optional)"
+    )
     sys.exit(-1)
 
 tag = sys.argv[1]
+if len(sys.argv) < 3:
+    compare_tag = None
+else:
+    compare_tag = sys.argv[2]
 
 
 def call(cmd):
@@ -17,19 +23,42 @@ def call(cmd):
 
 
 tag_date = call(f"git log -n1 --format='%ci' {tag}")[0]
+if compare_tag:
+    compare_tag_date = call(f"git log -n1 --format='%ci' {compare_tag}")[0]
+
 print(f"Release {tag} was on {tag_date}\n")
 
-merges = call(f"git log --since='{tag_date}' --merges --format='>>>%B' --reverse")
+if compare_tag:
+    merges = call(
+        f"git log --since='{tag_date}' --until='{compare_tag_date}' --merges --format='>>>%B' --reverse"
+    )
+else:
+    merges = call(f"git log --since='{tag_date}' --merges --format='>>>%B' --reverse")
 merges = [m for m in merges if m.strip()]
 merges = "\n".join(merges).split(">>>")
 merges = [m.split("\n")[:2] for m in merges]
 merges = [m for m in merges if len(m) == 2 and m[1].strip()]
 
-num_commits = call(f"git rev-list {tag}..HEAD --count")[0]
+if compare_tag:
+    num_commits = call(f"git rev-list {tag}..{compare_tag} --count")[0]
+else:
+    num_commits = call(f"git rev-list {tag}..HEAD --count")[0]
+
 print(f"A total of {num_commits} changes have been committed.\n")
 
 # Use filter to remove empty strings
-commits = filter(None, call(f"git log --since='{tag_date}' --pretty=%s --reverse"))
+if compare_tag:
+    commits = filter(
+        None,
+        call(
+            f"git log --since='{tag_date}' --until='{compare_tag_date}' --pretty=%s --reverse"
+        ),
+    )
+else:
+    commits = filter(
+        None,
+        call(f"git log --since='{tag_date}' --pretty=%s --reverse"),
+    )
 for c in commits:
     print("- " + c)
 
@@ -44,12 +73,17 @@ for (merge, message) in merges:
 
 print("\nMade by the following committers [alphabetical by last name]:\n")
 
-authors = call(f"git log --since='{tag_date}' --format=%aN")
+if compare_tag:
+    authors = call(
+        f"git log --since='{tag_date}' --until='{compare_tag_date}' --format=%aN"
+    )
+else:
+    authors = call(f"git log --since='{tag_date}' --format=%aN")
 authors = [a.strip() for a in authors if a.strip()]
 
 
 def key(author):
-    author = [v for v in author.split() if v[0] in string.ascii_letters]
+    author = [v for v in author.split()]
     if len(author) > 0:
         return author[-1]
 
diff --git a/doc/release/index.rst b/doc/release/index.rst
index 7ab33c9..200207d 100644
--- a/doc/release/index.rst
+++ b/doc/release/index.rst
@@ -15,6 +15,7 @@ period.
    :maxdepth: 2
 
    release_dev
+   release_3.0
    release_2.8.8
    release_2.8.7
    release_2.8.6
diff --git a/doc/release/migration_guide_from_2.x_to_3.0.rst b/doc/release/migration_guide_from_2.x_to_3.0.rst
index a7dc8a4..8035f4d 100644
--- a/doc/release/migration_guide_from_2.x_to_3.0.rst
+++ b/doc/release/migration_guide_from_2.x_to_3.0.rst
@@ -1,8 +1,8 @@
 :orphan:
 
-*****************************
-Preparing for the 3.0 release
-*****************************
+*******************************
+Migration guide from 2.X to 3.0
+*******************************
 
 .. note::
    Much of the work leading to the NetworkX 3.0 release will be included
@@ -11,7 +11,7 @@ Preparing for the 3.0 release
    ongoing work and will help you understand what changes you can make now
    to minimize the disruption caused by the move to 3.0.
 
-This is a guide for people moving from NetworkX 2.X to NetworkX 3.0
+This is a guide for people moving from NetworkX 2.X to NetworkX 3.0.
 
 Any issues with these can be discussed on the `mailing list
 <https://groups.google.com/forum/#!forum/networkx-discuss>`_.
@@ -34,9 +34,6 @@ structures (``Graph``, ``DiGraph``, etc.) and common algorithms, but some
 functionality, e.g. functions found in the ``networkx.linalg`` package, are
 only available if these additional libraries are installed.
 
-.. **TODO**: Generate a table showing dependencies of individual nx objects?
-.. Probably overkill...
-
 Improved integration with scientific Python
 -------------------------------------------
 
@@ -179,15 +176,7 @@ improving supported for array representations of multi-attribute adjacency::
 Deprecated code
 ---------------
 
-The 2.6 release deprecates over 30 functions.
-See :ref:`networkx_2.6`.
-
-.. **TODO**: A table summarizing one deprecation per row w/ 3 columns: 1. the
-.. deprecated function, 2. the old usage, 3. the replacement usage.
-
----
-
-The functions `read_gpickle` and `write_gpickle` will be removed in 3.0.
+The functions `read_gpickle` and `write_gpickle` were removed in 3.0.
 You can read and write NetworkX graphs as Python pickles.
 
 >>> import pickle
@@ -199,7 +188,7 @@ You can read and write NetworkX graphs as Python pickles.
 ...     G = pickle.load(f)
 ... 
 
-The functions `read_yaml` and `write_yaml` will be removed in 3.0.
+The functions `read_yaml` and `write_yaml` were removed in 3.0.
 You can read and write NetworkX graphs in YAML format
 using pyyaml.
 
diff --git a/doc/release/release_2.8.8.rst b/doc/release/release_2.8.8.rst
index 750230f..b6e88e7 100644
--- a/doc/release/release_2.8.8.rst
+++ b/doc/release/release_2.8.8.rst
@@ -1,4 +1,4 @@
-NetworkX 2.8.7
+NetworkX 2.8.8
 ==============
 
 Release date: 1 November 2022
diff --git a/doc/release/release_3.0.rst b/doc/release/release_3.0.rst
new file mode 100644
index 0000000..955f9f5
--- /dev/null
+++ b/doc/release/release_3.0.rst
@@ -0,0 +1,327 @@
+NetworkX 3.0
+============
+
+Release date: 7 January 2023
+
+Supports Python 3.8, 3.9, 3.10, and 3.11.
+
+NetworkX is a Python package for the creation, manipulation, and study of the
+structure, dynamics, and functions of complex networks.
+
+For more information, please visit our `website <https://networkx.org/>`_
+and our :ref:`gallery of examples <examples_gallery>`.
+Please send comments and questions to the `networkx-discuss mailing list
+<http://groups.google.com/group/networkx-discuss>`_.
+
+Highlights
+----------
+
+This release is the result of 8 months of work with over 180 changes by
+41 contributors. We also have a `guide for people moving from NetworkX 2.X
+to NetworkX 3.0 <https://networkx.org/documentation/latest/release/migration_guide_from_2.x_to_3.0.html>`_. Highlights include:
+
+- Better syncing between G._succ and G._adj for directed G.
+  And slightly better speed from all the core adjacency data structures.
+  G.adj is now a cached_property while still having the cache reset when
+  G._adj is set to a new dict (which doesn't happen very often).
+  Note: We have always assumed that G._succ and G._adj point to the same
+  object. But we did not enforce it well. If you have somehow worked
+  around our attempts and are relying on these private attributes being
+  allowed to be different from each other due to loopholes in our previous
+  code, you will have to look for other loopholes in our new code
+  (or subclass DiGraph to explicitly allow this).
+- If your code sets G._succ or G._adj to new dictionary-like objects, you no longer
+  have to set them both. Setting either will ensure the other is set as well.
+  And the cached_properties G.adj and G.succ will be reset accordingly too.
+- If you use the presence of the attribute `_adj` as a criterion for the object
+  being a Graph instance, that code may need updating. The graph classes
+  themselves now have an attribute `_adj`. So, it is possible that whatever you
+  are checking might be a class rather than an instance. We suggest you check
+  for the attribute `_adj` to verify it is like a NetworkX graph object or type,
+  and then use `type(obj) is type` to check if it is a class.
+- We have added an `experimental plugin feature <https://github.com/networkx/networkx/pull/6000>`_,
+  which lets users choose alternate backends such as GraphBLAS or CuGraph for
+  computation. This is an opt-in feature and may change in future releases.
+- Improved integration with the general `Scientific Python ecosystem <https://networkx.org/documentation/latest/release/migration_guide_from_2.x_to_3.0.html#improved-integration-with-scientific-python>`_.
+- New drawing feature (module and tests) for exporting NetworkX graphs to the TikZ
+  library of TeX/LaTeX. The basic interface is ``nx.to_latex(G, pos, **options)`` to
+  construct a string of LaTeX code, or ``nx.write_latex(G, filename, as_document=True, **options)``
+  to write the string to a file.
+- Added an improved subgraph isomorphism algorithm called VF2++.
+
+Improvements
+------------
+- [`#5663 <https://github.com/networkx/networkx/pull/5663>`_]
+  Implements edge swapping for directed graphs.
+- [`#5883 <https://github.com/networkx/networkx/pull/5883>`_]
+  Replace the implementation of ``lowest_common_ancestor`` and
+  ``all_pairs_lowest_common_ancestor`` with a "naive" algorithm to fix
+  several bugs and improve performance.
+- [`#5912 <https://github.com/networkx/networkx/pull/5912>`_]
+  The ``mapping`` argument of the ``relabel_nodes`` function can be either a
+  mapping or a function that creates a mapping. ``relabel_nodes`` first checks
+  whether the ``mapping`` is callable - if so, then it is used as a function.
+  This fixes a bug related to ``mapping=str`` and may change the behavior for
+  other ``mapping`` arguments that implement both ``__getitem__`` and
+  ``__call__``.
+- [`#5898 <https://github.com/networkx/networkx/pull/5898>`_]
+  Implements computing and checking for minimal d-separators between two nodes.
+  Also adds functionality to DAGs for computing v-structures.
+- [`#5943 <https://github.com/networkx/networkx/pull/5943>`_]
+  ``is_path`` used to raise a `KeyError` when the ``path`` argument contained
+  a node that was not in the Graph. The behavior has been updated so that
+  ``is_path`` returns `False` in this case rather than raising the exception.
+- [`#6003 <https://github.com/networkx/networkx/pull/6003>`_]
+  ``avg_shortest_path_length`` now raises an exception if the provided
+  graph is directed but not strongly connected. The previous test (weak
+  connectivity) was wrong; in that case, the returned value was nonsensical.
+
+API Changes
+-----------
+
+- [`#5813 <https://github.com/networkx/networkx/pull/5813>`_]
+  OrderedGraph and other Ordered classes are replaced by Graph because
+  Python dicts (and thus networkx graphs) now maintain order.
+- [`#5899 <https://github.com/networkx/networkx/pull/5899>`_]
+  The `attrs` keyword argument will be replaced with keyword only arguments
+  `source`, `target`, `name`, `key` and `link` for `json_graph/node_link` functions.
+
+Deprecations
+------------
+
+- [`#5723 <https://github.com/networkx/networkx/issues/5723>`_]
+  ``nx.nx_pydot.*`` will be deprecated in the future if pydot isn't being
+  actively maintained. Users are recommended to use pygraphviz instead. 
+- [`#5899 <https://github.com/networkx/networkx/pull/5899>`_]
+  The `attrs` keyword argument will be replaced with keyword only arguments
+  `source`, `target`, `name`, `key` and `link` for `json_graph/node_link` functions.
+
+Merged PRs
+----------
+
+- Bump release version
+- Add characteristic polynomial example to polynomials docs (#5730)
+- Remove deprecated function is_string_like (#5738)
+- Remove deprecated function make_str (#5739)
+- Remove unused 'name' parameter from `union` (#5741)
+- Remove deprecated function is_iterator (#5740)
+- Remove deprecated `euclidean` from geometric.py (#5744)
+- Remove deprecated function utils.consume (#5745)
+- Rm `to_numpy_recarray` (#5737)
+- Remove deprecated function utils.empty_generator (#5748)
+- Rm jit.py (#5751)
+- Remove deprecated context managers (#5752)
+- Remove deprecated function utils.to_tuple (#5755)
+- Remove deprecated display_pygraphviz (#5754)
+- Remove to_numpy_matrix & from_numpy_matrix (#5746)
+- Remove deprecated decorator preserve_random_state (#5768)
+- Remove deprecated function is_list_of_ints (#5743)
+- Remove decorator random_state (#5770)
+- remove `adj_matrix` from `linalg/graphmatrix.py` (#5753)
+- Remove betweenness_centrality_source (#5786)
+- Remove deprecated simrank_similarity_numpy (#5783)
+- Remove networkx.testing subpackage (#5782)
+- Change PyDot PendingDeprecation to Deprecation (#5781)
+- Remove deprecated numeric_mixing_matrix (#5777)
+- Remove deprecated functions make_small_graph and make_small_undirected_graph (#5761)
+- Remove _naive_greedy_modularity_communities (#5760)
+- Make chordal_graph_cliques a generator (#5758)
+- update cytoscape functions to drop old signature (#5784)
+- Remove deprecated functions dict_to_numpy_array2 and dict_to_numpy_array1 (#5756)
+- Remove deprecated function utils.default_opener (#5747)
+- Remove deprecated function iterable (#5742)
+- remove old attr keyword from json_graph/tree (#5785)
+- Remove generate_unique_node (#5780)
+- Replace node_classification subpackage with a module (#5774)
+- Remove gpickle (#5773)
+- Remove deprecated function extrema_bounding (#5757)
+- Remove coverage and performance from quality (#5775)
+- Update return type of google_matrix to numpy.ndarray (#5762)
+- Remove deprecated k-nearest-neighbors (#5769)
+- Remove gdal dependency (#5766)
+- Update return type of attrmatrix (#5764)
+- Remove unused deprecated argument from to_pandas_edgelist (#5778)
+- Remove deprecated function edge_betweeness (#5765)
+- Remove pyyaml dependency (#5763)
+- Remove copy methods for Filter* coreviews (#5776)
+- Remove deprecated function nx.info (#5759)
+- Remove deprecated n_communities argument from greedy_modularity_communities (#5789)
+- Remove deprecated functions hub_matrix and authority_matrix (#5767)
+- Make HITS numpy and scipy private functions (#5771)
+- Add Triad example plot (#5528)
+- Add gallery example visualizing DAG with multiple layouts (#5432)
+- Make pagerank numpy and scipy private functions (#5772)
+- Implement directed edge swap (#5663)
+- Update relabel.py to preserve node order (#5258)
+- Modify DAG example to show topological layout. (#5835)
+- Add docstring example for self-ancestors/descendants (#5802)
+- Update precommit linters (#5839)
+- remove to/from_scipy_sparse_matrix (#5779)
+- Clean up from PR #5779 (#5841)
+- Corona Product (#5223)
+- Add direct link to github networkx org sponsorship (#5843)
+- added examples to efficiency_measures.py (#5643)
+- added examples to regular.py (#5642)
+- added examples to degree_alg.py (#5644)
+- Add docstring examples for triads functions (#5522)
+- Fix docbuild warnings: is_string_like is removed and identation in corona product (#5845)
+- Use py_random_state to control randomness of random_triad (#5847)
+- Remove OrderedGraphs (#5813)
+- Drop NumPy 1.19 (#5856)
+- Speed up unionfind a bit by not adding root node in the path (#5844)
+- Minor doc fixups (#5868)
+- Attempt to reverse slowdown from hasattr  needed for cached_property (#5836)
+- make lazy_import private and remove its internal use (#5878)
+- strategy_saturation_largest_first now accepts partial colorings (#5888)
+- Add weight distance metrics (#5305)
+- docstring updates for `union`, `disjoint_union`, and `compose` (#5892)
+- Update precommit hooks (#5923)
+- Remove old Appveyor cruft (#5924)
+- signature change for `node_link` functions: for issue #5787 (#5899)
+- Replace LCA with naive implementations (#5883)
+- Bump nodelink args deprecation expiration to v3.2 (#5933)
+- Update mapping logic in `relabel_nodes` (#5912)
+- Update pygraphviz (#5934)
+- Further improvements to strategy_saturation_largest_first (#5935)
+- Arf layout (#5910)
+- [ENH] Find and verify a minimal D-separating set in DAG (#5898)
+- Add Mehlhorn Steiner approximations (#5629)
+- Preliminary VF2++ Implementation (#5788)
+- Minor docstring touchups and test refactor for `is_path` (#5967)
+- Switch to relative import for vf2pp_helpers. (#5973)
+- Add vf2pp_helpers subpackage to wheel (#5975)
+- Enhance biconnected components to avoid indexing (#5974)
+- Update mentored projects list (#5985)
+- Add concurrency hook to cancel jobs on new push. (#5986)
+- Make all.py generator friendly (#5984)
+- Only run scheduled pytest-randomly job in main repo. (#5993)
+- Fix steiner tree test (#5999)
+- Update doc requirements (#6008)
+- VF2++ for Directed Graphs (#5972)
+- Fix defect and update docs for MappedQueue, related to gh-5681 (#5939)
+- Update pydata-sphinx-theme (#6012)
+- Update numpydoc (#6022)
+- Fixed test for average shortest path in the case of directed graphs (#6003)
+- Update deprecations after 3.0 dep sprint (#6031)
+- Use scipy.sparse array datastructure (#6037)
+- Designate 3.0b1 release
+- Bump release version
+- Use org funding.yml
+- Update which flow functions support the cutoff argument (#6085)
+- Update GML parsing/writing to allow empty lists/tuples as node attributes (#6093)
+- Warn on unused visualization kwargs that only apply to FancyArrowPatch edges (#6098)
+- Fix weighted MultiDiGraphs in DAG longest path algorithms + add additional tests (#5988)
+- Circular center node layout (#6114)
+- Fix doc inconsistencies related to cutoff in connectivity.py and disjoint_paths.py (#6113)
+- Remove deprecated maxcardinality parameter from min_weight_matching (#6146)
+- Remove deprecated `find_cores` (#6139)
+- Remove deprecated project function from bipartite package. (#6147)
+- Improve test coverage for voterank algorithm (#6161)
+- plugin based backend infrastructure to use multiple computation backends (#6000)
+- Undocumented parameters in dispersion (#6183)
+- Swap.py coverage to 100 (#6176)
+- Improve test coverage for current_flow_betweenness module (#6143)
+- Completed Testing in community.py resolves issue #6184 (#6185)
+- Added an example to algebraic_connectivity (#6153)
+- Add ThinGraph example to Multi*Graph doc_strings (#6160)
+- Fix defect in eulerize, replace reciprocal edge weights (#6145)
+- For issue #6030 Add test coverage for algorithms in beamsearch.py (#6087)
+- Improve test coverage expanders stochastic graph generators (#6073)
+- Update developer requirements  (#6194)
+- Designate 3.0rc1 release
+- Bump release version
+- Tests added in test_centrality.py (#6200)
+- add laplacian_spectrum example (#6169)
+- PR for issue #6033 Improve test coverage for algorithms in betweenness_subset.py #6033 (#6083)
+- Di graph edges doc fix (#6108)
+- Improve coverage for core.py (#6116)
+- Add clear edges method as a method to be frozen by nx.freeze (#6190)
+- Adds LCA test case for self-ancestors from gh-4458. (#6218)
+- Minor Python 2 cleanup (#6219)
+- Add example laplacian matrix  (#6168)
+- Revert 6219 and delete comment. (#6222)
+- fix wording in error message (#6228)
+- Rm incorrect test case for connected edge swap (#6223)
+- add missing `seed` to function called by `connected_double_edge_swap` (#6231)
+- Hide edges with a weight of None in A*. (#5945)
+- Add dfs_labeled_edges reporting of reverse edges due to depth_limit. (#6240)
+- Warn users about duplicate nodes in generator function input (#6237)
+- Reenable geospatial examples (#6252)
+- Draft 3.0 release notes (#6232)
+- Add 2.8.x release notes (#6255)
+- doc: clarify allowed `alpha` when using nx.draw_networkx_edges (#6254)
+- Add a contributor (#6256)
+- Allow MultiDiGraphs for LCA (#6234)
+- Update simple_paths.py to improve readability of the BFS. (#6273)
+- doc: update documentation when providing an iterator over current graph to add/remove_edges_from. (#6268)
+- Fix bug vf2pp is isomorphic issue 6257 (#6270)
+- Improve test coverage for Eigenvector centrality  (#6227)
+- Bug fix in swap: directed_edge_swap and double_edge_swap  (#6149)
+- Adding a test to verify that a NetworkXError is raised when calling n… (#6265)
+- Pin to sphinx 5.2.3 (#6277)
+- Update pre-commit hooks (#6278)
+- Update GH actions (#6280)
+- Fix links in release notes (#6281)
+- bug fix in smallworld.py: random_reference and lattice_reference (#6151)
+- [DOC] Follow numpydoc standard in barbell_graph documentation (#6286)
+- Update simple_paths.py: consistent behaviour for `is_simple_path` when path contains nodes not in the graph. (#6272)
+- Correctly point towards 2.8.8 in release notes (#6298)
+- Isomorphism improve documentation (#6295)
+- Improvements and test coverage for `line.py` (#6215)
+- Fix typo in Katz centrality comment (#6310)
+- Broken link in isomorphism documentation (#6296)
+- Update copyright years to 2023 (#6322)
+- fix warnings for make doctest (#6323)
+- fix whitespace issue in test_internet_as_graph (#6324)
+- Create a Tikz latex drawing feature for networkx (#6238)
+- Fix docstrings (#6329)
+- Fix documentation deployment (#6330)
+- Fix links to migration guide (#6331)
+- Fix typo in readme file (#6312)
+- Fix typos in the networkx codebase (#6335)
+- Refactor vf2pp modules and test files (#6334)
+
+Contributors
+------------
+
+- 0ddoe_s
+- Abangma Jessika
+- Adam Li
+- Adam Richardson
+- Ali Faraji
+- Alimi Qudirah
+- Anurag Bhat
+- Ben Heil
+- Brian Hou
+- Casper van Elteren
+- danieleades
+- Dan Schult
+- ddelange
+- Dilara Tekinoglu
+- Dimitrios Papageorgiou
+- Douglas K. G. Araujo
+- George Watkins
+- Guy Aglionby
+- Isaac Western
+- Jarrod Millman
+- Juanita Gomez
+- Kevin Brown
+- Konstantinos Petridis
+- ladykkk
+- Lucas H. McCabe
+- Ludovic Stephan
+- Lukong123
+- Matt Schwennesen
+- Michael Holtz
+- Morrison Turnansky
+- Mridul Seth
+- nsengaw4c
+- Okite chimaobi Samuel
+- Paula Pérez Bianchi
+- Radoslav Fulek
+- reneechebbo
+- Ross Barnowski
+- Sebastiano Vigna
+- stevenstrickler
+- Sultan Orazbayev
+- Tina Oberoi
diff --git a/doc/release/release_dev.rst b/doc/release/release_dev.rst
index 199aa1e..de72cba 100644
--- a/doc/release/release_dev.rst
+++ b/doc/release/release_dev.rst
@@ -1,9 +1,9 @@
-Next Release
-============
+3.1 (unreleased)
+================
 
 Release date: TBD
 
-Supports Python ...
+Supports Python 3.9, 3.10, and 3.11.
 
 NetworkX is a Python package for the creation, manipulation, and study of the
 structure, dynamics, and functions of complex networks.
@@ -19,25 +19,6 @@ Highlights
 This release is the result of X of work with over X pull requests by
 X contributors. Highlights include:
 
-- Better syncing between G._succ and G._adj for directed G.
-  And slightly better speed from all the core adjacency data structures.
-  G.adj is now a cached_property while still having the cache reset when
-  G._adj is set to a new dict (which doesn't happen very often).
-  Note: We have always assumed that G._succ and G._adj point to the same
-  object. But we did not enforce it well. If you have somehow worked
-  around our attempts and are relying on these private attributes being
-  allowed to be different from each other due to loopholes in our previous
-  code, you will have to look for other loopholes in our new code
-  (or subclass DiGraph to explicitly allow this).
-- If your code sets G._succ or G._adj to new dictionary-like objects, you no longer
-  have to set them both. Setting either will ensure the other is set as well.
-  And the cached_properties G.adj and G.succ will be rest accordingly too.
-- If you use the presence of the attribute `_adj` as a criteria for the object
-  being a Graph instance, that code may need updating. The graph classes
-  themselves now have an attribute `_adj`. So, it is possible that whatever you
-  are checking might be a class rather than an instance. We suggest you check
-  for attribute `_adj` to verify it is like a NetworkX graph object or type and
-  then `type(obj) is type` to check if it is a class.
 
 Improvements
 ------------
diff --git a/doc/release/report_functions_without_rst_generated.py b/doc/release/report_functions_without_rst_generated.py
index f73da43..3e29706 100644
--- a/doc/release/report_functions_without_rst_generated.py
+++ b/doc/release/report_functions_without_rst_generated.py
@@ -13,13 +13,9 @@ for n, f in funcs:
     # print(result)
 
     old_names = (
-        "find_cores",
         "test",
-        "edge_betweenness",
-        "betweenness_centrality_source",
         "write_graphml_lxml",
         "write_graphml_xml",
-        "adj_matrix",
         "project",
         "fruchterman_reingold_layout",
         "node_degree_xy",
diff --git a/doc/release/report_functions_without_rst_generated.sh b/doc/release/report_functions_without_rst_generated.sh
deleted file mode 100644
index e701889..0000000
--- a/doc/release/report_functions_without_rst_generated.sh
+++ /dev/null
@@ -1,38 +0,0 @@
-#!/bin/bash
-# Authored-by: James Trimble <james.trimble@yahoo.co.uk> 
-# expanded beyond /algorithms/ by Dan Schult <dschult@colgate.edu>
-
-echo ALGORITHMS
-find networkx/algorithms -name "*.py" | xargs cat | tr -d ' \n' | grep -o '__all__=\[["a-z0-9_,]*\]' | grep -o '"[a-z0-9_]*"' | tr -d '"' | sort | uniq > tmp_funcs.txt
-cat doc/reference/algorithms/*.rst | tr -d ' ' | sort | uniq > tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt > functions_possibly_missing_from_doc.txt
-for f in $(cat functions_possibly_missing_from_doc.txt); do echo $f; grep -l -r "def ${f}" networkx/algorithms | grep -v ".pyc" | sed 's/^/  /'; done
-
-echo GENERATORS
-find networkx/generators -name "*.py" | xargs cat | tr -d ' \n' | grep -o '__all__=\[["a-z0-9_,]*\]' | grep -o '"[a-z0-9_]*"' | tr -d '"' | sort | uniq > tmp_funcs.txt
-cat doc/reference/generators.rst | tr -d ' ' | sort | uniq > tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt > functions_possibly_missing_from_doc.txt
-for f in $(cat functions_possibly_missing_from_doc.txt); do echo $f; grep -l -r "def ${f}" networkx/generators | grep -v ".pyc" | sed 's/^/  /'; done
-
-echo LINALG
-find networkx/linalg -name "*.py" | xargs cat | tr -d ' \n' | grep -o '__all__=\[["a-z0-9_,]*\]' | grep -o '"[a-z0-9_]*"' | tr -d '"' | sort | uniq > tmp_funcs.txt
-cat doc/reference/linalg.rst | tr -d ' ' | sort | uniq > tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt > functions_possibly_missing_from_doc.txt
-for f in $(cat functions_possibly_missing_from_doc.txt); do echo $f; grep -l -r "def ${f}" networkx/linalg | grep -v ".pyc" | sed 's/^/  /'; done
-
-echo CLASSES
-find networkx/classes -name "*.py" | xargs cat | tr -d ' \n' | grep -o '__all__=\[["a-z0-9_,]*\]' | grep -o '"[a-z0-9_]*"' | tr -d '"' | sort | uniq > tmp_funcs.txt
-cat doc/reference/{filters,functions,classes/*}.rst | tr -d ' ' | sort | uniq > tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt > functions_possibly_missing_from_doc.txt
-for f in $(cat functions_possibly_missing_from_doc.txt); do echo $f; grep -l -r "def ${f}" networkx/classes | grep -v ".pyc" | sed 's/^/  /'; done
-
-echo READWRITE
-find networkx/readwrite -name "*.py" | xargs cat | tr -d ' \n' | grep -o '__all__=\[["a-z0-9_,]*\]' | grep -o '"[a-z0-9_]*"' | tr -d '"' | sort | uniq > tmp_funcs.txt
-cat doc/reference/readwrite/*.rst | tr -d ' ' | sort | uniq > tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt
-comm -2 -3 tmp_funcs.txt tmp_doc.txt > functions_possibly_missing_from_doc.txt
-for f in $(cat functions_possibly_missing_from_doc.txt); do echo $f; grep -l -r "def ${f}" networkx/readwrite | grep -v ".pyc" | sed 's/^/  /'; done
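The deleted shell pipeline above scrapes `__all__` lists with `grep`; the same idea can be expressed in stdlib Python by parsing the AST instead. This is a rough sketch (the `public_names` helper is not part of the repository), not a drop-in replacement for the script:

```python
import ast
import pathlib

def public_names(pkg_dir):
    """Collect every string listed in an ``__all__ = [...]`` assignment
    anywhere under ``pkg_dir`` (rough equivalent of the grep pipeline)."""
    names = set()
    for py in pathlib.Path(pkg_dir).rglob("*.py"):
        try:
            tree = ast.parse(py.read_text())
        except SyntaxError:
            continue  # skip files that do not parse
        for node in ast.walk(tree):
            if isinstance(node, ast.Assign) and any(
                isinstance(t, ast.Name) and t.id == "__all__"
                for t in node.targets
            ):
                if isinstance(node.value, (ast.List, ast.Tuple)):
                    names |= {
                        e.value
                        for e in node.value.elts
                        if isinstance(e, ast.Constant) and isinstance(e.value, str)
                    }
    return names
```

Names possibly missing from the docs would then be the set difference between `public_names(...)` and the names extracted from the `.rst` files, analogous to the `comm -2 -3` step.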
diff --git a/doc/tutorial.rst b/doc/tutorial.rst
index 6c6a0a8..c703d9f 100644
--- a/doc/tutorial.rst
+++ b/doc/tutorial.rst
@@ -509,7 +509,7 @@ like so:
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
 
 NetworkX supports many popular formats, such as edge lists, adjacency lists,
-GML, GraphML, pickle, LEDA and others.
+GML, GraphML, LEDA and others.
 
 .. nbplot::
 
diff --git a/examples/algorithms/plot_betweenness_centrality.py b/examples/algorithms/plot_betweenness_centrality.py
index e8354c6..52994d5 100644
--- a/examples/algorithms/plot_betweenness_centrality.py
+++ b/examples/algorithms/plot_betweenness_centrality.py
@@ -1,6 +1,6 @@
 """
 =====================
-Betweeness Centrality
+Betweenness Centrality
 =====================
 
 Betweenness centrality measures of positive gene functional associations
@@ -70,13 +70,13 @@ ax.text(
 ax.text(
     0.80,
     0.06,
-    "node size = betweeness centrality",
+    "node size = betweenness centrality",
     horizontalalignment="center",
     transform=ax.transAxes,
     fontdict=font,
 )
 
-# Resize figure for label readibility
+# Resize figure for label readability
 ax.margins(0.1, 0.05)
 fig.tight_layout()
 plt.axis("off")
diff --git a/examples/algorithms/plot_blockmodel.py b/examples/algorithms/plot_blockmodel.py
index e3fec60..b41f0c3 100644
--- a/examples/algorithms/plot_blockmodel.py
+++ b/examples/algorithms/plot_blockmodel.py
@@ -55,7 +55,7 @@ G = nx.read_edgelist("hartford_drug.edgelist")
 H = G.subgraph(next(nx.connected_components(G)))
 # Makes life easier to have consecutively labeled integer nodes
 H = nx.convert_node_labels_to_integers(H)
-# Create parititions with hierarchical clustering
+# Create partitions with hierarchical clustering
 partitions = create_hc(H)
 # Build blockmodel graph
 BM = nx.quotient_graph(H, partitions, relabel=True)
diff --git a/examples/algorithms/plot_parallel_betweenness.py b/examples/algorithms/plot_parallel_betweenness.py
index d9333f4..e6d238d 100644
--- a/examples/algorithms/plot_parallel_betweenness.py
+++ b/examples/algorithms/plot_parallel_betweenness.py
@@ -65,7 +65,7 @@ G_ws = nx.connected_watts_strogatz_graph(1000, 4, 0.1)
 for G in [G_ba, G_er, G_ws]:
     print("")
     print("Computing betweenness centrality for:")
-    print(nx.info(G))
+    print(G)
     print("\tParallel version")
     start = time.time()
     bt = betweenness_centrality_parallel(G)
diff --git a/examples/drawing/icons/computer_black_144x144.png b/examples/drawing/icons/computer_black_144x144.png
deleted file mode 100644
index bddc3a0..0000000
Binary files a/examples/drawing/icons/computer_black_144x144.png and /dev/null differ
diff --git a/examples/drawing/icons/router_black_144x144.png b/examples/drawing/icons/router_black_144x144.png
deleted file mode 100644
index 51588d3..0000000
Binary files a/examples/drawing/icons/router_black_144x144.png and /dev/null differ
diff --git a/examples/drawing/icons/switch_black_144x144.png b/examples/drawing/icons/switch_black_144x144.png
deleted file mode 100644
index edf38c5..0000000
Binary files a/examples/drawing/icons/switch_black_144x144.png and /dev/null differ
diff --git a/examples/drawing/plot_center_node.py b/examples/drawing/plot_center_node.py
new file mode 100644
index 0000000..d7eb470
--- /dev/null
+++ b/examples/drawing/plot_center_node.py
@@ -0,0 +1,20 @@
+"""
+====================
+Custom Node Position
+====================
+
+Draw a graph with node(s) located at user-defined positions.
+
+When a position is set by the user, the other nodes can still be neatly organised in a layout.
+"""
+
+import networkx as nx
+import numpy as np
+
+G = nx.path_graph(20)  # An example graph
+center_node = 5  # Or any other node to be in the center
+edge_nodes = set(G) - {center_node}
+# Ensures the nodes around the circle are evenly distributed
+pos = nx.circular_layout(G.subgraph(edge_nodes))
+pos[center_node] = np.array([0, 0])  # manually specify node position
+nx.draw(G, pos, with_labels=True)
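The new example above can be sanity-checked without rendering anything. With the same layout code (assuming `networkx` and `numpy` are installed), every non-center node should land on the unit circle, since `circular_layout` places nodes on a circle of radius `scale` (default 1):

```python
import networkx as nx
import numpy as np

G = nx.path_graph(20)
center_node = 5
edge_nodes = set(G) - {center_node}
pos = nx.circular_layout(G.subgraph(edge_nodes))
pos[center_node] = np.array([0, 0])

# Distance of each surrounding node from the manually placed center
radii = [float(np.hypot(*pos[n])) for n in edge_nodes]
print(all(abs(r - 1.0) < 1e-6 for r in radii))  # True
```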
diff --git a/examples/drawing/plot_chess_masters.py b/examples/drawing/plot_chess_masters.py
index 52f4cfb..7e7bfed 100644
--- a/examples/drawing/plot_chess_masters.py
+++ b/examples/drawing/plot_chess_masters.py
@@ -145,7 +145,7 @@ ax.text(
     fontdict=font,
 )
 
-# Resize figure for label readibility
+# Resize figure for label readability
 ax.margins(0.1, 0.05)
 fig.tight_layout()
 plt.axis("off")
diff --git a/examples/external/force/README.txt b/examples/external/force/README.txt
index b49796c..6bac21a 100644
--- a/examples/external/force/README.txt
+++ b/examples/external/force/README.txt
@@ -1,7 +1,7 @@
 Modified from the D3 example at
 https://bl.ocks.org/mbostock/4062045
 
-Run the file force.py to generate the force.json data file needed for this to work.
+Run the file javascript_force.py to generate the force.json data file needed for this to work.
 
 Then copy all of the files in this directory to a webserver and load force.html.
 
diff --git a/examples/geospatial/cholera_cases.gpkg b/examples/geospatial/cholera_cases.gpkg
deleted file mode 100644
index a21456f..0000000
Binary files a/examples/geospatial/cholera_cases.gpkg and /dev/null differ
diff --git a/examples/geospatial/extended_description.rst b/examples/geospatial/extended_description.rst
deleted file mode 100644
index dd3ae10..0000000
--- a/examples/geospatial/extended_description.rst
+++ /dev/null
@@ -1,261 +0,0 @@
-:orphan:
-
-Geospatial Examples Description
--------------------------------
-
-Functions for reading and writing shapefiles are provided in NetworkX versions <3.0.
-However, we recommend that you use the following libraries when working
-with geospatial data (including reading and writing shapefiles).
-
-Geospatial Python Libraries
-~~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-`GeoPandas <https://geopandas.readthedocs.io/>`__ provides
-interoperability between geospatial formats and storage mechanisms
-(e.g., databases) and Pandas data frames for tabular-oriented processing
-of spatial data, as well as a wide array of supporting functionality
-including spatial indices, spatial predicates (e.g., test if geometries
-intersect each other), spatial operations (e.g., the area of overlap
-between intersecting polygons), and more.
-
-See the following examples that use GeoPandas:
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a delaunay graph (plus its dual, the set of Voronoi polygons) f...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_delaunay_thumb.png
-     :alt: Delaunay graphs from geographic points
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_delaunay.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of geographic lines (sometimes called &quot;lines...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_lines_thumb.png
-     :alt: Graphs from a set of lines
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_lines.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of polygons using PySAL and geopandas. We&#x27;ll...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_polygons_thumb.png
-     :alt: Graphs from Polygons
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_polygons.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of points using PySAL and geopandas. In this...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_points_thumb.png
-     :alt: Graphs from geographic points
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_points.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to use OSMnx to download and model a street network from OpenStreetMap, ...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_osmnx_thumb.png
-     :alt: OpenStreetMap with OSMnx
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_osmnx.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-clear"></div>
-
-`PySAL <https://pysal.org/>`__ provides a rich suite of spatial analysis
-algorithms. From a network analysis context, `spatial
-weights <https://pysal.org/libpysal/api.html#spatial-weights>`__
-provide… (Levi please add more here).
-
-See the following examples that use PySAL:
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a delaunay graph (plus its dual, the set of Voronoi polygons) f...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_delaunay_thumb.png
-     :alt: Delaunay graphs from geographic points
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_delaunay.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of geographic lines (sometimes called &quot;lines...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_lines_thumb.png
-     :alt: Graphs from a set of lines
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_lines.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of polygons using PySAL and geopandas. We&#x27;ll...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_polygons_thumb.png
-     :alt: Graphs from Polygons
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_polygons.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of points using PySAL and geopandas. In this...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_points_thumb.png
-     :alt: Graphs from geographic points
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_points.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-clear"></div>
-
-`momepy <http://docs.momepy.org/en/stable/>`__ builds on top of
-GeoPandas and PySAL to provide a suite of algorithms focused on urban
-morphology. From a network analysis context, momepy enables you to
-convert your line geometry to `networkx.MultiGraph` and back to 
-`geopandas.GeoDataFrame` and apply a range of analytical functions aiming at 
-morphological description of (street) network configurations.
-
-See the following examples that use momepy:
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to build a graph from a set of geographic lines (sometimes called &quot;lines...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_lines_thumb.png
-     :alt: Graphs from a set of lines
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_lines.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-clear"></div>
-
-`OSMnx <https://osmnx.readthedocs.io/>`__ provides a set of tools to retrieve,
-model, project, analyze, and visualize OpenStreetMap street networks (and any
-other networked infrastructure) as `networkx.MultiDiGraph` objects, and convert
-these MultiDiGraphs to/from `geopandas.GeoDataFrame`. It can automatically add
-node/edge attributes for: elevation and grade (using the Google Maps Elevation
-API), edge travel speed, edge traversal time, and edge bearing. It can also
-retrieve any other spatial data from OSM (such as building footprints, public
-parks, schools, transit stops, etc) as Geopandas GeoDataFrames.
-
-See the following examples that use OSMnx:
-
-.. raw:: html
-
-    <div class="sphx-glr-thumbcontainer" tooltip="This example shows how to use OSMnx to download and model a street network from OpenStreetMap, ...">
-
-.. only:: html
-
- .. figure:: /auto_examples/geospatial/images/thumb/sphx_glr_plot_osmnx_thumb.png
-     :alt: OpenStreetMap with OSMnx
-
-     :ref:`sphx_glr_auto_examples_geospatial_plot_osmnx.py`
-
-.. raw:: html
-
-    </div>
-
-.. raw:: html
-
-    <div class="sphx-glr-clear"></div>
-
-Key Concepts
-~~~~~~~~~~~~
-
-One of the essential tasks in network analysis of geospatial data is
-defining the spatial relationships between spatial features (points,
-lines, or polygons).
-
-``PySAL`` provides several ways of representing these spatial
-relationships between features using the concept of spatial weights.
-These include relationships such as ``Queen``, ``Rook``, ...
-(Levi please add more here with a brief explanation of each).
-
-``momepy`` allows representation of street networks as both primal
-and dual graphs (in a street network analysis sense). The primal approach
-turns intersections into Graph nodes and street segments into edges,
-a format which is used for a majority of morphological studies. The dual 
-approach uses street segments as nodes and intersection topology
-as edges, which allows encoding of angular information (i.e an analysis
-can be weighted by angles between street segments instead of their length).
-
-``OSMnx`` represents street networks as primal, nonplanar, directed graphs with
-possible self-loops and parallel edges to model real-world street network form
-and flow. Nodes represent intersections and dead-ends, and edges represent the
-street segments linking them. Details of OSMnx's modeling methodology are
-available at https://doi.org/10.1016/j.compenvurbsys.2017.05.004
-
-Learn More
-~~~~~~~~~~
-
-To learn more see `Geographic Data Science with PySAL and the PyData Stack
-<https://geographicdata.science/book/intro.html>`_.
diff --git a/examples/geospatial/nuts1.geojson b/examples/geospatial/nuts1.geojson
deleted file mode 100644
index 19037a0..0000000
--- a/examples/geospatial/nuts1.geojson
+++ /dev/null
@@ -1,122 +0,0 @@
-{
-"type": "FeatureCollection",
-"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:OGC:1.3:CRS84" } },
-"features": [
-{ "type": "Feature", "id": 1, "properties": { "NUTS_ID": "AT1", "STAT_LEVL_": 1, "SHAPE_AREA": 2.9405632435800002, "SHAPE_LEN": 9.5340599707300004 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 16.940278, 48.617245498999978 ], [ 16.949778, 48.535791999000025 ], [ 16.851106, 48.438635001000023 ], [ 16.976203, 48.172244498 ], [ 17.066741, 48.11868149899999 ], [ 17.1607975, 48.006656501 ], [ 17.093074, 47.708236 ], [ 16.421846, 47.664704498999981 ], [ 16.652076, 47.6229035 ], [ 16.64622, 47.446597 ], [ 16.4337615, 47.352918499999987 ], [ 16.508267499999988, 47.001255999000023 ], [ 16.113849, 46.869067998999981 ], [ 15.996236, 46.8353985 ], [ 16.121719499999983, 46.990666499999975 ], [ 16.015158499999984, 47.36712749899999 ], [ 16.171769499999982, 47.4224 ], [ 15.847011, 47.567789499000014 ], [ 15.217306, 47.7960275 ], [ 15.0892495, 47.741471000999979 ], [ 14.738462, 47.748723499999983 ], [ 14.479043499999989, 48.104414500000019 ], [ 14.521179, 48.237609498999973 ], [ 14.8288425, 48.189076999 ], [ 14.964447, 48.37851200099999 ], [ 14.691014, 48.584301998 ], [ 14.990445, 49.009649001000014 ], [ 15.542449499999975, 48.907696997000016 ], [ 15.753633499999978, 48.852178501000026 ], [ 16.940278, 48.617245498999978 ] ] ] } },
-{ "type": "Feature", "id": 16, "properties": { "NUTS_ID": "AT2", "STAT_LEVL_": 1, "SHAPE_AREA": 3.0970141940899998, "SHAPE_LEN": 9.7616838950999991 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 15.996236, 46.8353985 ], [ 16.038086, 46.656145497000011 ], [ 15.786422, 46.7074695 ], [ 15.649988, 46.705757 ], [ 15.40197949899999, 46.65354849900001 ], [ 15.065120499999978, 46.652112497000019 ], [ 14.6745805, 46.450687500000015 ], [ 14.5651755, 46.372453496999981 ], [ 14.434500500000013, 46.442943501 ], [ 13.714184999, 46.522703499999977 ], [ 13.504249500000014, 46.566303998000024 ], [ 12.731392998999979, 46.634288 ], [ 12.690635, 46.656972000999986 ], [ 12.841158, 46.860979499 ], [ 12.6568335, 47.099517497000022 ], [ 13.354519499999981, 47.09710400099999 ], [ 13.784528498999975, 46.943882497 ], [ 13.864178998999989, 47.25217 ], [ 13.607597, 47.283564999000021 ], [ 13.585662999000022, 47.474799499000028 ], [ 13.722560499999986, 47.462111500999981 ], [ 13.777418, 47.714528998999981 ], [ 14.010598, 47.700891997999975 ], [ 14.738462, 47.748723499999983 ], [ 15.0892495, 47.741471000999979 ], [ 15.217306, 47.7960275 ], [ 15.847011, 47.567789499000014 ], [ 16.171769499999982, 47.4224 ], [ 16.015158499999984, 47.36712749899999 ], [ 16.121719499999983, 46.990666499999975 ], [ 15.996236, 46.8353985 ] ] ] } },
-{ "type": "Feature", "id": 28, "properties": { "NUTS_ID": "AT3", "STAT_LEVL_": 1, "SHAPE_AREA": 4.1089052759299998, "SHAPE_LEN": 16.642867131900001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 14.691014, 48.584301998 ], [ 14.964447, 48.37851200099999 ], [ 14.8288425, 48.189076999 ], [ 14.521179, 48.237609498999973 ], [ 14.479043499999989, 48.104414500000019 ], [ 14.738462, 47.748723499999983 ], [ 14.010598, 47.700891997999975 ], [ 13.777418, 47.714528998999981 ], [ 13.722560499999986, 47.462111500999981 ], [ 13.585662999000022, 47.474799499000028 ], [ 13.607597, 47.283564999000021 ], [ 13.864178998999989, 47.25217 ], [ 13.784528498999975, 46.943882497 ], [ 13.354519499999981, 47.09710400099999 ], [ 12.6568335, 47.099517497000022 ], [ 12.841158, 46.860979499 ], [ 12.690635, 46.656972000999986 ], [ 12.477924, 46.679835497999989 ], [ 12.143811, 46.913779998 ], [ 12.2407455, 47.069168499 ], [ 12.136014, 47.0806675 ], [ 11.627199500000017, 47.013299 ], [ 11.164281500000016, 46.965722500000027 ], [ 11.02225, 46.765410498999984 ], [ 10.4696515, 46.854909 ], [ 10.389317999000014, 47.000524499999983 ], [ 10.144974499999989, 46.851009498999986 ], [ 9.607078, 47.0607745 ], [ 9.620580500000017, 47.151645499999972 ], [ 9.530749, 47.270581000999982 ], [ 9.673370499999976, 47.38151050099998 ], [ 9.5587205, 47.541892999000027 ], [ 9.967813499999977, 47.546240496999985 ], [ 9.999526, 47.48301699699999 ], [ 10.178353, 47.270113998999989 ], [ 10.454439, 47.55579699899999 ], [ 10.886199, 47.536847998999974 ], [ 10.991202499999986, 47.396131 ], [ 11.0165005, 47.396368000999985 ], [ 11.4214275, 47.44485199899998 ], [ 11.410219, 47.495324000999972 ], [ 11.632883, 47.592445997000027 ], [ 12.060662, 47.618743499 ], [ 12.338045498999975, 47.697087501 ], [ 12.575026499999979, 47.632316 ], [ 12.695795499999974, 47.682222998999976 ], [ 13.046055500000023, 47.520502498999974 ], [ 12.8757855, 47.962608999 ], [ 12.86018150000001, 47.996639999000024 ], [ 12.751555, 48.112810497999988 
], [ 12.9446845, 48.206692498999985 ], [ 13.177043, 48.294389997 ], [ 13.407838, 48.372160501 ], [ 13.438430499999981, 48.548895001 ], [ 13.51336950000001, 48.590977998000028 ], [ 13.727090499999974, 48.513019500999974 ], [ 13.796107, 48.71360049899999 ], [ 13.839507, 48.771605001000012 ], [ 14.691014, 48.584301998 ] ] ] } },
-{ "type": "Feature", "id": 37, "properties": { "NUTS_ID": "BE3", "STAT_LEVL_": 1, "SHAPE_AREA": 2.1906072535800001, "SHAPE_LEN": 9.2029315527800009 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 6.020998999000028, 50.754295500000012 ], [ 6.26861550000001, 50.625981 ], [ 6.189312500000028, 50.56609400100001 ], [ 6.192200499000023, 50.521055499 ], [ 6.206287499999974, 50.521302000999981 ], [ 6.315556, 50.497042498999974 ], [ 6.405028500000014, 50.323308499 ], [ 6.137662499999976, 50.129951499000015 ], [ 6.0248995, 50.182779498 ], [ 5.746319, 49.853595 ], [ 5.910688, 49.662388001000011 ], [ 5.818117, 49.5463105 ], [ 5.734555999, 49.545690501000024 ], [ 5.470882999000025, 49.49723799899999 ], [ 5.393511, 49.617111 ], [ 5.153738499999974, 49.717926 ], [ 4.969431, 49.801825999000016 ], [ 4.851578, 49.793255000999977 ], [ 4.793194, 49.98199900100002 ], [ 4.896794, 50.137420498999973 ], [ 4.796697, 50.148677998999972 ], [ 4.432494, 49.941615996999985 ], [ 4.233133, 49.957828498000026 ], [ 4.140853, 49.978759997 ], [ 4.027774500000021, 50.358330498999976 ], [ 3.710389, 50.303165499999977 ], [ 3.65551, 50.461735498999985 ], [ 3.615081499999974, 50.490399001000014 ], [ 3.286492, 50.52756899799999 ], [ 3.245294, 50.713009498000019 ], [ 3.176996, 50.756164497999976 ], [ 3.324118, 50.722308999 ], [ 3.459731, 50.765968500999975 ], [ 3.5412725, 50.733700499 ], [ 3.775359499999979, 50.747729999 ], [ 3.815390499999978, 50.75072499700002 ], [ 3.895736499, 50.732944498999984 ], [ 4.100483, 50.70595549799998 ], [ 4.597269, 50.763534500999981 ], [ 5.019701, 50.750973000999977 ], [ 5.1018525, 50.708652498999982 ], [ 5.236892, 50.72714600099999 ], [ 5.431686500000012, 50.719803 ], [ 5.687622, 50.811923998999987 ], [ 5.682000499000026, 50.757446497999979 ], [ 5.892073, 50.755237498999975 ], [ 6.020998999000028, 50.754295500000012 ] ] ], [ [ [ 3.0187085, 50.773532999 ], [ 2.863276, 50.708343496999987 ], [ 2.842169, 50.751404 ], [ 2.945734077, 50.793946362999975 ], [ 
3.0187085, 50.773532999 ] ] ] ] } },
-{ "type": "Feature", "id": 64, "properties": { "NUTS_ID": "BG3", "STAT_LEVL_": 1, "SHAPE_AREA": 7.5037102699, "SHAPE_LEN": 16.905305932499999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 24.641605998999978, 43.733412999 ], [ 25.293740500000013, 43.654294 ], [ 25.544640500000014, 43.64298149699999 ], [ 25.671862499999975, 43.691339998999979 ], [ 26.358359999000015, 44.038415500999974 ], [ 26.379968, 44.042950999000027 ], [ 27.271344999, 44.12633649899999 ], [ 27.69541, 43.987343 ], [ 28.578884, 43.738739 ], [ 28.529502998999988, 43.423537997999972 ], [ 28.0601805, 43.316445000999977 ], [ 27.88262, 42.838413498000023 ], [ 27.899066999000013, 42.700245500999984 ], [ 27.4479045, 42.461486998999987 ], [ 28.035512499999982, 41.983079498999984 ], [ 27.559395, 41.904781496999988 ], [ 27.059650499999975, 42.088336997999988 ], [ 26.9492285, 42.000213498999983 ], [ 26.561544500000025, 41.92627349899999 ], [ 26.538056, 42.154196999000021 ], [ 26.192733499999974, 42.208129997000015 ], [ 26.057769499000017, 42.017656499999987 ], [ 25.609251, 42.200959 ], [ 25.348064, 42.11437749800001 ], [ 25.007195500000023, 42.737618999 ], [ 24.385927, 42.749953498000025 ], [ 24.12798399899998, 42.774947997000027 ], [ 24.165291501000013, 42.929701498999975 ], [ 23.971503499999983, 43.062251499000013 ], [ 23.566569, 42.993892999000025 ], [ 23.412767, 43.160521997999979 ], [ 23.00621, 43.192878498000027 ], [ 22.747454, 43.386612999000022 ], [ 22.666975499999978, 43.402747498999986 ], [ 22.357130499999982, 43.809481999000013 ], [ 22.536145, 44.045511499999975 ], [ 22.6751615, 44.215662996999981 ], [ 22.966399500000023, 44.098524997000027 ], [ 23.045898, 44.063749499999972 ], [ 22.838719, 43.877852001 ], [ 22.997154500000022, 43.807626997 ], [ 23.63008, 43.791259997999987 ], [ 24.112719, 43.699566498000024 ], [ 24.324132, 43.699567996999974 ], [ 24.641605998999978, 43.733412999 ] ] ] } },
-{ "type": "Feature", "id": 87, "properties": { "NUTS_ID": "BG4", "STAT_LEVL_": 1, "SHAPE_AREA": 4.7374808307, "SHAPE_LEN": 11.5242072777 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 26.561544500000025, 41.92627349899999 ], [ 26.357879, 41.711104498999987 ], [ 26.060393, 41.688516000999982 ], [ 26.158688499999982, 41.391182496999988 ], [ 25.948626, 41.320342000999972 ], [ 25.906437499999981, 41.307574998 ], [ 25.224698499999988, 41.264630997999973 ], [ 25.1793755, 41.310187998 ], [ 24.783956, 41.360188997000023 ], [ 24.525637, 41.568699998 ], [ 24.059744, 41.522112 ], [ 23.624222998999983, 41.375727497000014 ], [ 22.9275915, 41.338539498999978 ], [ 22.968327499999987, 41.51983549900001 ], [ 22.867214, 42.022199496999974 ], [ 22.510412, 42.15515849799999 ], [ 22.3602065, 42.311157001000026 ], [ 22.559255, 42.480283998 ], [ 22.461472, 42.64849049899999 ], [ 22.44282149899999, 42.825456501000019 ], [ 22.5155105, 42.868538498000021 ], [ 22.7483995, 42.889787496999986 ], [ 23.00621, 43.192878498000027 ], [ 23.412767, 43.160521997999979 ], [ 23.566569, 42.993892999000025 ], [ 23.971503499999983, 43.062251499000013 ], [ 24.165291501000013, 42.929701498999975 ], [ 24.12798399899998, 42.774947997000027 ], [ 24.385927, 42.749953498000025 ], [ 25.007195500000023, 42.737618999 ], [ 25.348064, 42.11437749800001 ], [ 25.609251, 42.200959 ], [ 26.057769499000017, 42.017656499999987 ], [ 26.192733499999974, 42.208129997000015 ], [ 26.538056, 42.154196999000021 ], [ 26.561544500000025, 41.92627349899999 ] ] ] } },
-{ "type": "Feature", "id": 101, "properties": { "NUTS_ID": "CY0", "STAT_LEVL_": 1, "SHAPE_AREA": 0.87854701902400001, "SHAPE_LEN": 5.8309313981199997 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 34.43715, 35.60157049899999 ], [ 33.906834, 35.257218 ], [ 34.077861499999983, 34.966491001 ], [ 33.6778885, 34.971539999000015 ], [ 33.006973499000026, 34.648288499999978 ], [ 32.425964500000021, 34.743192497999985 ], [ 32.269781, 35.066213999000013 ], [ 32.881518, 35.16222 ], [ 32.9214245, 35.403959998 ], [ 33.652597, 35.35919099900002 ], [ 34.588354, 35.695434498999987 ], [ 34.43715, 35.60157049899999 ] ] ] } },
-{ "type": "Feature", "id": 105, "properties": { "NUTS_ID": "CZ0", "STAT_LEVL_": 1, "SHAPE_AREA": 9.8540716308099991, "SHAPE_LEN": 17.3294302971 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 14.823362, 50.870550497000011 ], [ 15.032691, 51.021315999000024 ], [ 15.535267499999975, 50.779375999000024 ], [ 16.107318, 50.662072998999975 ], [ 16.443536, 50.586257499999988 ], [ 16.195729, 50.432135001 ], [ 16.58029, 50.142787998000017 ], [ 16.86327, 50.19812299900002 ], [ 17.028323, 50.229996998999979 ], [ 16.907924499999979, 50.44945400099999 ], [ 17.429605, 50.254513001000021 ], [ 17.718403998999975, 50.32095 ], [ 17.758479, 50.206568 ], [ 17.592736, 50.160014 ], [ 17.868675, 49.972545997 ], [ 18.035060999, 50.06577199899999 ], [ 18.575724, 49.910423 ], [ 18.851551, 49.51718900100002 ], [ 18.4035955, 49.396745499000019 ], [ 18.322436, 49.315059 ], [ 17.64693, 48.854265998000017 ], [ 17.3967255, 48.813349997999978 ], [ 17.2016625, 48.878028997 ], [ 16.940278, 48.617245498999978 ], [ 15.753633499999978, 48.852178501000026 ], [ 15.542449499999975, 48.907696997000016 ], [ 14.990445, 49.009649001000014 ], [ 14.691014, 48.584301998 ], [ 13.839507, 48.771605001000012 ], [ 13.5515785, 48.967787497000018 ], [ 13.4166515, 48.980035499 ], [ 13.170907999, 49.173579501 ], [ 12.633763499999986, 49.476194997999983 ], [ 12.593778499999985, 49.542191499000012 ], [ 12.401524, 49.758372998000027 ], [ 12.550988500000017, 49.905088001000024 ], [ 12.260799, 50.058156498000017 ], [ 12.160679, 50.219848997999975 ], [ 12.100900500000023, 50.31802799799999 ], [ 12.5837785, 50.407078999000021 ], [ 12.948144500000012, 50.404311499000016 ], [ 13.501846, 50.633642999000017 ], [ 13.652174, 50.730359997999983 ], [ 14.388335, 50.900298997999982 ], [ 14.317873500000019, 51.054698998999982 ], [ 14.491221, 51.043530498999985 ], [ 14.6188, 50.857804497000018 ], [ 14.823362, 50.870550497000011 ] ] ] } },
-{ "type": "Feature", "id": 129, "properties": { "NUTS_ID": "CH0", "STAT_LEVL_": 1, "SHAPE_AREA": 4.8872175037299996, "SHAPE_LEN": 13.39682568 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 9.5587205, 47.541892999000027 ], [ 9.673370499999976, 47.38151050099998 ], [ 9.530749, 47.270581000999982 ], [ 9.4760475, 47.051797498999974 ], [ 9.607078, 47.0607745 ], [ 10.144974499999989, 46.851009498999986 ], [ 10.389317999000014, 47.000524499999983 ], [ 10.4696515, 46.854909 ], [ 10.452801, 46.530682999000021 ], [ 10.2448745, 46.622091997999974 ], [ 9.714149500000019, 46.292708499000014 ], [ 9.248531500000013, 46.233768000999987 ], [ 9.1593775, 46.169601 ], [ 8.988276499999984, 45.972282498000027 ], [ 9.088803499999983, 45.896897001000013 ], [ 8.912147, 45.830444999 ], [ 8.713936, 46.097271998999986 ], [ 8.384717, 46.452158499 ], [ 7.8771375, 45.926954997999985 ], [ 7.86407650000001, 45.916750499999978 ], [ 7.044886, 45.922412999000016 ], [ 6.797888999, 46.136798999 ], [ 6.821063998999989, 46.427154498999982 ], [ 6.24136, 46.343582497 ], [ 6.231694, 46.329429498000025 ], [ 6.219547, 46.311877998999989 ], [ 6.310211, 46.244045498999981 ], [ 5.956067, 46.1320955 ], [ 6.125608, 46.317229999 ], [ 6.064003, 46.416228997000019 ], [ 6.138106, 46.557666997000013 ], [ 6.460011, 46.851551498999982 ], [ 6.858914, 47.165294499000026 ], [ 7.061636, 47.34373449899999 ], [ 6.879806, 47.352439998000023 ], [ 6.939186, 47.433704499999976 ], [ 7.130353, 47.503040499 ], [ 7.326466, 47.439853500000027 ], [ 7.380894, 47.431892499000014 ], [ 7.42113949899999, 47.446387999000024 ], [ 7.445019, 47.46172349699998 ], [ 7.510905499999978, 47.502582497999981 ], [ 7.5551595, 47.56456399699999 ], [ 7.589039, 47.589877999 ], [ 7.634097, 47.561113501000023 ], [ 7.713784499999974, 47.539405 ], [ 7.894107500000018, 47.586374998999986 ], [ 8.426434500000028, 47.567548998 ], [ 8.562841, 47.599432498 ], [ 8.595602, 47.6055445 ], [ 8.606369500000028, 47.669005498999979 ], [ 8.510115, 
47.776190499999984 ], [ 8.61233, 47.801462 ], [ 8.740104499999973, 47.752789001 ], [ 8.7279815, 47.692680499 ], [ 8.795708499999989, 47.675595499999986 ], [ 8.843022, 47.712262499000019 ], [ 8.8749985, 47.65597349799998 ], [ 9.182192, 47.655890499 ], [ 9.495604, 47.551454998999986 ], [ 9.5587205, 47.541892999000027 ] ] ] } },
-{ "type": "Feature", "id": 158, "properties": { "NUTS_ID": "BE1", "STAT_LEVL_": 1, "SHAPE_AREA": 0.0030247151296100001, "SHAPE_LEN": 0.49368216231500001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 4.47678350000001, 50.8203775 ], [ 4.4804765, 50.794257999000024 ], [ 4.2455665, 50.817627 ], [ 4.47678350000001, 50.8203775 ] ] ] } },
-{ "type": "Feature", "id": 161, "properties": { "NUTS_ID": "BE2", "STAT_LEVL_": 1, "SHAPE_AREA": 1.72470830463, "SHAPE_LEN": 7.8510293864299996 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 3.0187085, 50.773532999 ], [ 2.945734077, 50.793946362999975 ], [ 2.842169, 50.751404 ], [ 2.863276, 50.708343496999987 ], [ 2.607036, 50.91268949900001 ], [ 2.546011, 51.089381998000022 ], [ 2.749051499000018, 51.161769999 ], [ 3.110688499999981, 51.312259501000028 ], [ 3.367216499999984, 51.368134499 ], [ 3.380661, 51.274299497000015 ], [ 3.8563395, 51.211056 ], [ 3.9776665, 51.225131998999984 ], [ 4.23481750000002, 51.348254 ], [ 4.242049, 51.353966999000022 ], [ 4.24366950000001, 51.374729499000011 ], [ 4.279565, 51.376017497000021 ], [ 4.669544, 51.426383999 ], [ 4.759926, 51.50246449799999 ], [ 5.10218, 51.42900499699999 ], [ 5.237716499999976, 51.261600499 ], [ 5.5662835, 51.220836497999983 ], [ 5.798274, 51.059853498999985 ], [ 5.766149, 51.009235499999988 ], [ 5.758272498999986, 50.954795 ], [ 5.687622, 50.811923998999987 ], [ 5.431686500000012, 50.719803 ], [ 5.236892, 50.72714600099999 ], [ 5.1018525, 50.708652498999982 ], [ 5.019701, 50.750973000999977 ], [ 4.597269, 50.763534500999981 ], [ 4.100483, 50.70595549799998 ], [ 3.895736499, 50.732944498999984 ], [ 3.815390499999978, 50.75072499700002 ], [ 3.775359499999979, 50.747729999 ], [ 3.5412725, 50.733700499 ], [ 3.459731, 50.765968500999975 ], [ 3.324118, 50.722308999 ], [ 3.176996, 50.756164497999976 ], [ 3.098481, 50.779019 ], [ 3.0187085, 50.773532999 ] ], [ [ 4.4804765, 50.794257999000024 ], [ 4.47678350000001, 50.8203775 ], [ 4.2455665, 50.817627 ], [ 4.4804765, 50.794257999000024 ] ] ] } },
-{ "type": "Feature", "id": 203, "properties": { "NUTS_ID": "DE8", "STAT_LEVL_": 1, "SHAPE_AREA": 3.21364124189, "SHAPE_LEN": 13.3120506311 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 12.926951499999973, 54.427963998999985 ], [ 12.363174500000014, 54.267097998999986 ], [ 13.017391718999988, 54.438720773999989 ], [ 13.109225, 54.27934049800001 ], [ 13.319396, 54.193505997999978 ], [ 13.344701499999985, 54.181111497000018 ], [ 13.742715499999974, 54.031386499 ], [ 13.819152499999973, 53.841944996999985 ], [ 14.267542, 53.697806496999988 ], [ 14.412157, 53.329635998000015 ], [ 13.710065499999985, 53.4790625 ], [ 13.2414035, 53.232401 ], [ 12.984683, 53.1649855 ], [ 12.3317515, 53.318011 ], [ 12.325101500000017, 53.321301498000025 ], [ 11.265732, 53.121978 ], [ 11.171861499999977, 53.15664399799999 ], [ 10.595047, 53.363927500999978 ], [ 10.951918499999977, 53.647622499000022 ], [ 10.76296450000001, 53.811153 ], [ 10.903661500999988, 53.956822000999978 ], [ 11.5617785, 54.028088497999988 ], [ 11.998484500000018, 54.174969998999984 ], [ 12.201068, 54.244695498999988 ], [ 12.2855305, 54.274951999 ], [ 12.520881067, 54.474234541999976 ], [ 12.926951499999973, 54.427963998999985 ] ] ], [ [ [ 13.391083499999979, 54.651317 ], [ 13.67934, 54.56278249799999 ], [ 13.6824535, 54.349581 ], [ 13.394217500000025, 54.222502500000019 ], [ 13.114586, 54.331907996999973 ], [ 13.2614835, 54.382916500000022 ], [ 13.270131, 54.480173498999989 ], [ 13.144181, 54.546965499 ], [ 13.391083499999979, 54.651317 ] ] ], [ [ [ 14.2130775, 53.866479498999979 ], [ 13.826150499999983, 53.849684997999987 ], [ 13.750745999, 54.14996050100001 ], [ 13.9720845, 54.06874749799999 ], [ 14.226302, 53.928652998000018 ], [ 14.2130775, 53.866479498999979 ] ] ] ] } },
-{ "type": "Feature", "id": 213, "properties": { "NUTS_ID": "DE9", "STAT_LEVL_": 1, "SHAPE_AREA": 6.2722593853799999, "SHAPE_LEN": 17.880560420199998 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 8.614913999, 53.882118 ], [ 9.02991750000001, 53.825564997000015 ], [ 9.02242, 53.87951750000002 ], [ 9.199750999, 53.88010449799998 ], [ 9.48589, 53.707664497999986 ], [ 9.73010399899999, 53.557580997 ], [ 9.768861, 53.50527899799999 ], [ 10.308312, 53.433221498000023 ], [ 10.469184, 53.385844 ], [ 10.595047, 53.363927500999978 ], [ 11.171861499999977, 53.15664399799999 ], [ 11.265732, 53.121978 ], [ 11.597784499999989, 53.035926499000027 ], [ 11.505027, 52.941032499000016 ], [ 10.841556, 52.852205 ], [ 10.759314500000016, 52.795830500000022 ], [ 11.0087815, 52.496747500000026 ], [ 10.93454250000002, 52.471794999 ], [ 11.086244, 52.22863399900001 ], [ 10.964414499999975, 52.056642997999973 ], [ 10.801428499999986, 52.0480005 ], [ 10.561227, 52.004065998999977 ], [ 10.701372, 51.642187499999977 ], [ 10.677283, 51.638376000999983 ], [ 10.488551, 51.574778498 ], [ 10.365365, 51.55589100100002 ], [ 9.928339, 51.375299 ], [ 9.710295479000024, 51.301537750000023 ], [ 9.568025, 51.340001498999982 ], [ 9.557291, 51.35137849900002 ], [ 9.647755500000017, 51.55251099899999 ], [ 9.62582500100001, 51.580205 ], [ 9.672377499999982, 51.568403999 ], [ 9.685331, 51.582016499000019 ], [ 9.440457, 51.650393999000016 ], [ 9.417317500000024, 51.647269499 ], [ 9.459648500000014, 51.86279749800002 ], [ 9.323343500000021, 51.85506099700001 ], [ 9.308893, 51.922719501000017 ], [ 9.15560449899999, 52.09783 ], [ 8.985787500000015, 52.19457649899999 ], [ 9.125252499999988, 52.411993498000015 ], [ 8.703008999000019, 52.500437998 ], [ 8.2972135, 52.456497997999975 ], [ 8.4661865, 52.267614 ], [ 8.410586500000022, 52.115114996999978 ], [ 8.096448, 52.057145000999981 ], [ 7.885159, 52.0833025 ], [ 8.007796, 52.115332997999985 ], [ 7.956465, 52.2724905 ], [ 7.964625500000011, 
52.32485850099999 ], [ 7.608038500000021, 52.474015999000017 ], [ 7.317481, 52.280271998999979 ], [ 7.099148999000022, 52.24305849699999 ], [ 7.065685, 52.241372999000021 ], [ 6.987941499999977, 52.469540999 ], [ 6.697865499999978, 52.486285998000028 ], [ 6.709732499999973, 52.627823498999987 ], [ 7.006229500000018, 52.63876299899999 ], [ 7.092692, 52.83820099899998 ], [ 7.202794499999982, 53.113281498999982 ], [ 7.208935, 53.243064498000024 ], [ 7.2643, 53.325526497999988 ], [ 6.999446499999976, 53.359887501 ], [ 7.24368149899999, 53.66778199700002 ], [ 7.550173, 53.675037499999974 ], [ 7.809895499999982, 53.707661998999981 ], [ 8.091291, 53.638109000999975 ], [ 8.061327, 53.505957001000013 ], [ 8.195555, 53.409171499000024 ], [ 8.329544, 53.614027 ], [ 8.554898499999979, 53.525131000999977 ], [ 8.492652, 53.472420000999989 ], [ 8.652134000999979, 53.516016998999987 ], [ 8.650632499999972, 53.602565001000016 ], [ 8.520410500000025, 53.606205000999978 ], [ 8.614913999, 53.882118 ] ], [ [ 8.711424, 53.044629997000015 ], [ 8.915827, 53.011021497 ], [ 8.979256, 53.045849499999974 ], [ 8.984185500000024, 53.126070998999978 ], [ 8.485330499999975, 53.22712 ], [ 8.6548985, 53.108864499999981 ], [ 8.711424, 53.044629997000015 ] ] ] } },
-{ "type": "Feature", "id": 264, "properties": { "NUTS_ID": "DEA", "STAT_LEVL_": 1, "SHAPE_AREA": 4.4627307736199997, "SHAPE_LEN": 12.1282181866 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 9.440457, 51.650393999000016 ], [ 9.155410500000016, 51.442674999000019 ], [ 8.970653500000026, 51.5067735 ], [ 8.556348, 51.277495 ], [ 8.7582145, 51.177181499000028 ], [ 8.549084999, 51.101868001000014 ], [ 8.477892, 50.96904749700002 ], [ 8.355495999000027, 50.862001 ], [ 8.125776499999972, 50.685813998000015 ], [ 8.03969, 50.697375498999975 ], [ 7.851496, 50.925832500000013 ], [ 7.785898, 50.939913999 ], [ 7.6610035, 50.820364500999972 ], [ 7.44077199899999, 50.711437998 ], [ 7.2123995, 50.623404999 ], [ 7.210872, 50.649543499 ], [ 7.1952005, 50.642722999 ], [ 6.927901, 50.558618500000023 ], [ 6.800711499999977, 50.361781499000017 ], [ 6.425295, 50.323011998000027 ], [ 6.405028500000014, 50.323308499 ], [ 6.315556, 50.497042498999974 ], [ 6.206287499999974, 50.521302000999981 ], [ 6.192200499000023, 50.521055499 ], [ 6.189312500000028, 50.56609400100001 ], [ 6.26861550000001, 50.625981 ], [ 6.020998999000028, 50.754295500000012 ], [ 6.0869475, 50.913134999000022 ], [ 5.877084998999976, 51.032101 ], [ 6.174812, 51.1845135 ], [ 6.072657, 51.242587497999978 ], [ 6.224405, 51.364978999000016 ], [ 5.953192, 51.747845998 ], [ 6.167766, 51.900804499 ], [ 6.4077795, 51.828092001000016 ], [ 6.828513, 51.9640665 ], [ 6.760465, 52.118569499999978 ], [ 7.065685, 52.241372999000021 ], [ 7.099148999000022, 52.24305849699999 ], [ 7.317481, 52.280271998999979 ], [ 7.608038500000021, 52.474015999000017 ], [ 7.964625500000011, 52.32485850099999 ], [ 7.956465, 52.2724905 ], [ 8.007796, 52.115332997999985 ], [ 7.885159, 52.0833025 ], [ 8.096448, 52.057145000999981 ], [ 8.410586500000022, 52.115114996999978 ], [ 8.4661865, 52.267614 ], [ 8.2972135, 52.456497997999975 ], [ 8.703008999000019, 52.500437998 ], [ 9.125252499999988, 52.411993498000015 ], [ 8.985787500000015, 
52.19457649899999 ], [ 9.15560449899999, 52.09783 ], [ 9.308893, 51.922719501000017 ], [ 9.323343500000021, 51.85506099700001 ], [ 9.459648500000014, 51.86279749800002 ], [ 9.417317500000024, 51.647269499 ], [ 9.440457, 51.650393999000016 ] ] ] } },
-{ "type": "Feature", "id": 335, "properties": { "NUTS_ID": "DE1", "STAT_LEVL_": 1, "SHAPE_AREA": 4.4214885156900001, "SHAPE_LEN": 9.9770721050599995 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 9.5587205, 47.541892999000027 ], [ 9.495604, 47.551454998999986 ], [ 9.182192, 47.655890499 ], [ 8.8749985, 47.65597349799998 ], [ 8.843022, 47.712262499000019 ], [ 8.795708499999989, 47.675595499999986 ], [ 8.7279815, 47.692680499 ], [ 8.740104499999973, 47.752789001 ], [ 8.61233, 47.801462 ], [ 8.510115, 47.776190499999984 ], [ 8.606369500000028, 47.669005498999979 ], [ 8.595602, 47.6055445 ], [ 8.562841, 47.599432498 ], [ 8.426434500000028, 47.567548998 ], [ 7.894107500000018, 47.586374998999986 ], [ 7.713784499999974, 47.539405 ], [ 7.634097, 47.561113501000023 ], [ 7.589039, 47.589877999 ], [ 7.545990500000016, 47.743573998999977 ], [ 7.577291, 48.115654999000014 ], [ 7.5779195, 48.121391999000025 ], [ 7.680713, 48.257266996999988 ], [ 7.9596305, 48.718580998999983 ], [ 8.232632998999975, 48.966571500999976 ], [ 8.261284, 48.980916998999987 ], [ 8.277349, 48.989939000999982 ], [ 8.33997, 49.080149999000014 ], [ 8.4130725, 49.249816499000019 ], [ 8.466985500000021, 49.28297550100001 ], [ 8.487268, 49.290026499000021 ], [ 8.4709155, 49.340712999 ], [ 8.497316, 49.411347 ], [ 8.474739704, 49.440616029000012 ], [ 8.423068, 49.541821 ], [ 8.422700500000019, 49.57419249899999 ], [ 8.4224395, 49.583385498999974 ], [ 8.581375, 49.519780498999978 ], [ 8.899572499999977, 49.50365549899999 ], [ 8.899355, 49.48455 ], [ 8.950348500000018, 49.454992999000012 ], [ 8.93188, 49.470636996999986 ], [ 9.083426499999973, 49.52608699699999 ], [ 9.103006, 49.577456 ], [ 9.41092, 49.663507998999989 ], [ 9.295673, 49.740530998999986 ], [ 9.471497499, 49.779726501000027 ], [ 9.648736, 49.791477499999985 ], [ 9.926560999, 49.48483550100002 ], [ 10.083623, 49.542955499000016 ], [ 10.118327500000021, 49.47316949899999 ], [ 10.1119665, 49.384910499 ], [ 10.256763499999977, 49.059495 ], 
[ 10.4098495, 48.977435499000023 ], [ 10.423689, 48.744495997 ], [ 10.487258, 48.69666200099999 ], [ 10.278268, 48.516074499000013 ], [ 10.230779499999983, 48.510511 ], [ 10.1338955, 48.454871498999978 ], [ 10.0326945, 48.45719649900002 ], [ 9.997606, 48.35011799900002 ], [ 10.095357499999977, 48.164014999000017 ], [ 10.136457, 48.108459498 ], [ 10.135728, 48.02376149700001 ], [ 10.132387, 48.01539999900001 ], [ 10.11495450000001, 47.976257501000021 ], [ 10.104207, 47.974358497000026 ], [ 10.11013650000001, 47.937149998999985 ], [ 10.131928500000015, 47.820087496999975 ], [ 10.0772925, 47.63927449800002 ], [ 9.692543, 47.610768999000015 ], [ 9.5587205, 47.541892999000027 ] ] ] } },
-{ "type": "Feature", "id": 384, "properties": { "NUTS_ID": "DE2", "STAT_LEVL_": 1, "SHAPE_AREA": 8.5129416394100001, "SHAPE_LEN": 17.544328932799999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 11.919884500000023, 50.42440349899999 ], [ 12.100900500000023, 50.31802799799999 ], [ 12.160679, 50.219848997999975 ], [ 12.260799, 50.058156498000017 ], [ 12.550988500000017, 49.905088001000024 ], [ 12.401524, 49.758372998000027 ], [ 12.593778499999985, 49.542191499000012 ], [ 12.633763499999986, 49.476194997999983 ], [ 13.170907999, 49.173579501 ], [ 13.4166515, 48.980035499 ], [ 13.5515785, 48.967787497000018 ], [ 13.839507, 48.771605001000012 ], [ 13.796107, 48.71360049899999 ], [ 13.727090499999974, 48.513019500999974 ], [ 13.51336950000001, 48.590977998000028 ], [ 13.438430499999981, 48.548895001 ], [ 13.407838, 48.372160501 ], [ 13.177043, 48.294389997 ], [ 12.9446845, 48.206692498999985 ], [ 12.751555, 48.112810497999988 ], [ 12.86018150000001, 47.996639999000024 ], [ 12.8757855, 47.962608999 ], [ 13.046055500000023, 47.520502498999974 ], [ 12.695795499999974, 47.682222998999976 ], [ 12.575026499999979, 47.632316 ], [ 12.338045498999975, 47.697087501 ], [ 12.060662, 47.618743499 ], [ 11.632883, 47.592445997000027 ], [ 11.410219, 47.495324000999972 ], [ 11.4214275, 47.44485199899998 ], [ 11.0165005, 47.396368000999985 ], [ 10.991202499999986, 47.396131 ], [ 10.886199, 47.536847998999974 ], [ 10.454439, 47.55579699899999 ], [ 10.178353, 47.270113998999989 ], [ 9.999526, 47.48301699699999 ], [ 9.967813499999977, 47.546240496999985 ], [ 9.5587205, 47.541892999000027 ], [ 9.692543, 47.610768999000015 ], [ 10.0772925, 47.63927449800002 ], [ 10.131928500000015, 47.820087496999975 ], [ 10.11013650000001, 47.937149998999985 ], [ 10.104207, 47.974358497000026 ], [ 10.11495450000001, 47.976257501000021 ], [ 10.132387, 48.01539999900001 ], [ 10.135728, 48.02376149700001 ], [ 10.136457, 48.108459498 ], [ 10.095357499999977, 48.164014999000017 ], [ 9.997606, 
48.35011799900002 ], [ 10.0326945, 48.45719649900002 ], [ 10.1338955, 48.454871498999978 ], [ 10.230779499999983, 48.510511 ], [ 10.278268, 48.516074499000013 ], [ 10.487258, 48.69666200099999 ], [ 10.423689, 48.744495997 ], [ 10.4098495, 48.977435499000023 ], [ 10.256763499999977, 49.059495 ], [ 10.1119665, 49.384910499 ], [ 10.118327500000021, 49.47316949899999 ], [ 10.083623, 49.542955499000016 ], [ 9.926560999, 49.48483550100002 ], [ 9.648736, 49.791477499999985 ], [ 9.471497499, 49.779726501000027 ], [ 9.295673, 49.740530998999986 ], [ 9.41092, 49.663507998999989 ], [ 9.103006, 49.577456 ], [ 9.150809, 49.742850497 ], [ 9.036080500000025, 49.846503998 ], [ 9.05008, 49.866315000999975 ], [ 9.016088500000023, 49.991340501000025 ], [ 8.990559999000027, 50.06711900099998 ], [ 9.404984500000012, 50.087734497999975 ], [ 9.623150999000018, 50.229039998000019 ], [ 9.7329145, 50.356149499000026 ], [ 9.935655, 50.419606499999986 ], [ 10.0413385, 50.516469498999982 ], [ 10.450532, 50.4018595 ], [ 10.610115, 50.227994998999975 ], [ 10.729202, 50.230005496999979 ], [ 10.8514945, 50.262762499000019 ], [ 10.715266499999984, 50.363590998 ], [ 10.9457185, 50.38646649899999 ], [ 11.189943, 50.27118550099999 ], [ 11.265938, 50.479421 ], [ 11.481568, 50.431621501 ], [ 11.603290500000014, 50.398766001000013 ], [ 11.919884500000023, 50.42440349899999 ] ] ] } },
-{ "type": "Feature", "id": 488, "properties": { "NUTS_ID": "DE3", "STAT_LEVL_": 1, "SHAPE_AREA": 0.110168847705, "SHAPE_LEN": 1.4095856494600001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 13.66698, 52.474166998999976 ], [ 13.699738, 52.377882998000018 ], [ 13.420984998999984, 52.376247001000024 ], [ 13.31212, 52.399599500000022 ], [ 13.165898, 52.394275498000013 ], [ 13.153695500000026, 52.50182150099999 ], [ 13.164263, 52.598900500000013 ], [ 13.39854, 52.648194498 ], [ 13.610827500000028, 52.544235500000013 ], [ 13.66698, 52.474166998999976 ] ] ] } },
-{ "type": "Feature", "id": 491, "properties": { "NUTS_ID": "DE4", "STAT_LEVL_": 1, "SHAPE_AREA": 3.87565509606, "SHAPE_LEN": 11.671553189899999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 14.412157, 53.329635998000015 ], [ 14.143658, 52.9613685 ], [ 14.156692, 52.895590998999978 ], [ 14.436438, 52.679900498999984 ], [ 14.565063, 52.624497 ], [ 14.534361999, 52.395008 ], [ 14.600891499999989, 52.272051998999984 ], [ 14.755227, 52.070024998 ], [ 14.716716, 52.001188001 ], [ 14.729862, 51.581776999 ], [ 14.447771, 51.542068997 ], [ 14.163324499999987, 51.541042999000013 ], [ 13.835313, 51.37678999799999 ], [ 13.691250500000024, 51.374012999 ], [ 13.21015, 51.404735999000025 ], [ 13.051025, 51.647677 ], [ 13.1509135, 51.859610499999974 ], [ 12.769780500000024, 51.979274499999974 ], [ 12.3761225, 52.04511949800002 ], [ 12.276724499000011, 52.104018 ], [ 12.31718, 52.454095499 ], [ 12.171555, 52.506336497 ], [ 12.249203500000021, 52.79186199899999 ], [ 12.126811499999974, 52.890199499 ], [ 11.597784499999989, 53.035926499000027 ], [ 11.265732, 53.121978 ], [ 12.325101500000017, 53.321301498000025 ], [ 12.3317515, 53.318011 ], [ 12.984683, 53.1649855 ], [ 13.2414035, 53.232401 ], [ 13.710065499999985, 53.4790625 ], [ 14.412157, 53.329635998000015 ] ], [ [ 13.66698, 52.474166998999976 ], [ 13.610827500000028, 52.544235500000013 ], [ 13.39854, 52.648194498 ], [ 13.164263, 52.598900500000013 ], [ 13.153695500000026, 52.50182150099999 ], [ 13.165898, 52.394275498000013 ], [ 13.31212, 52.399599500000022 ], [ 13.420984998999984, 52.376247001000024 ], [ 13.699738, 52.377882998000018 ], [ 13.66698, 52.474166998999976 ] ] ] } },
-{ "type": "Feature", "id": 511, "properties": { "NUTS_ID": "DE5", "STAT_LEVL_": 1, "SHAPE_AREA": 0.061226229512700001, "SHAPE_LEN": 1.76656281046 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 8.979256, 53.045849499999974 ], [ 8.915827, 53.011021497 ], [ 8.711424, 53.044629997000015 ], [ 8.6548985, 53.108864499999981 ], [ 8.485330499999975, 53.22712 ], [ 8.984185500000024, 53.126070998999978 ], [ 8.979256, 53.045849499999974 ] ] ], [ [ [ 8.652134000999979, 53.516016998999987 ], [ 8.492652, 53.472420000999989 ], [ 8.554898499999979, 53.525131000999977 ], [ 8.4832025, 53.600498997999978 ], [ 8.520410500000025, 53.606205000999978 ], [ 8.650632499999972, 53.602565001000016 ], [ 8.652134000999979, 53.516016998999987 ] ] ] ] } },
-{ "type": "Feature", "id": 515, "properties": { "NUTS_ID": "DE6", "STAT_LEVL_": 1, "SHAPE_AREA": 0.075484323706799999, "SHAPE_LEN": 1.3487076921500001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 10.308312, 53.433221498000023 ], [ 9.768861, 53.50527899799999 ], [ 9.73010399899999, 53.557580997 ], [ 9.945376, 53.652927998 ], [ 10.072805, 53.709633999 ], [ 10.236678499999982, 53.496354498000017 ], [ 10.308312, 53.433221498000023 ] ] ] } },
-{ "type": "Feature", "id": 518, "properties": { "NUTS_ID": "DE7", "STAT_LEVL_": 1, "SHAPE_AREA": 2.6844014076599998, "SHAPE_LEN": 8.8288737353099993 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 9.928339, 51.375299 ], [ 10.206942, 51.190648996999983 ], [ 10.210013, 51.144082998999977 ], [ 10.202181, 51.012009001000024 ], [ 10.182348499999989, 50.99848750000001 ], [ 10.021558500000026, 50.9929495 ], [ 9.9261765, 50.767389998999988 ], [ 10.077555500000017, 50.63762399699999 ], [ 10.0413385, 50.516469498999982 ], [ 9.935655, 50.419606499999986 ], [ 9.7329145, 50.356149499000026 ], [ 9.623150999000018, 50.229039998000019 ], [ 9.404984500000012, 50.087734497999975 ], [ 8.990559999000027, 50.06711900099998 ], [ 9.016088500000023, 49.991340501000025 ], [ 9.05008, 49.866315000999975 ], [ 9.036080500000025, 49.846503998 ], [ 9.150809, 49.742850497 ], [ 9.103006, 49.577456 ], [ 9.083426499999973, 49.52608699699999 ], [ 8.93188, 49.470636996999986 ], [ 8.950348500000018, 49.454992999000012 ], [ 8.899355, 49.48455 ], [ 8.899572499999977, 49.50365549899999 ], [ 8.581375, 49.519780498999978 ], [ 8.4224395, 49.583385498999974 ], [ 8.414774, 49.595051997999974 ], [ 8.44637749899999, 49.730799497000021 ], [ 8.448365, 49.73366149899999 ], [ 8.400055, 49.803675 ], [ 8.3430305, 49.940506 ], [ 8.288245, 49.995134 ], [ 8.190038500000014, 50.03529599699999 ], [ 8.17513550000001, 50.034255497 ], [ 7.773997, 50.066539998999986 ], [ 8.1219135, 50.277224999 ], [ 7.971559500000012, 50.406220499000028 ], [ 8.151592, 50.599370998999973 ], [ 8.125776499999972, 50.685813998000015 ], [ 8.355495999000027, 50.862001 ], [ 8.477892, 50.96904749700002 ], [ 8.549084999, 51.101868001000014 ], [ 8.7582145, 51.177181499000028 ], [ 8.556348, 51.277495 ], [ 8.970653500000026, 51.5067735 ], [ 9.155410500000016, 51.442674999000019 ], [ 9.440457, 51.650393999000016 ], [ 9.685331, 51.582016499000019 ], [ 9.672377499999982, 51.568403999 ], [ 9.62582500100001, 51.580205 ], [ 9.647755500000017, 
51.55251099899999 ], [ 9.557291, 51.35137849900002 ], [ 9.568025, 51.340001498999982 ], [ 9.710295479000024, 51.301537750000023 ], [ 9.928339, 51.375299 ] ] ] } },
-{ "type": "Feature", "id": 540, "properties": { "NUTS_ID": "DEB", "STAT_LEVL_": 1, "SHAPE_AREA": 2.46718641429, "SHAPE_LEN": 8.1942008487300004 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 8.125776499999972, 50.685813998000015 ], [ 8.151592, 50.599370998999973 ], [ 7.971559500000012, 50.406220499000028 ], [ 8.1219135, 50.277224999 ], [ 7.773997, 50.066539998999986 ], [ 8.17513550000001, 50.034255497 ], [ 8.190038500000014, 50.03529599699999 ], [ 8.288245, 49.995134 ], [ 8.3430305, 49.940506 ], [ 8.400055, 49.803675 ], [ 8.448365, 49.73366149899999 ], [ 8.44637749899999, 49.730799497000021 ], [ 8.414774, 49.595051997999974 ], [ 8.4224395, 49.583385498999974 ], [ 8.422700500000019, 49.57419249899999 ], [ 8.423068, 49.541821 ], [ 8.474739704, 49.440616029000012 ], [ 8.497316, 49.411347 ], [ 8.4709155, 49.340712999 ], [ 8.487268, 49.290026499000021 ], [ 8.466985500000021, 49.28297550100001 ], [ 8.4130725, 49.249816499000019 ], [ 8.33997, 49.080149999000014 ], [ 8.277349, 48.989939000999982 ], [ 8.261284, 48.980916998999987 ], [ 8.232632998999975, 48.966571500999976 ], [ 8.068407499999978, 48.999316497999985 ], [ 7.910654500000021, 49.045163499000012 ], [ 7.635651, 49.053950499999985 ], [ 7.36875550000002, 49.16145799899999 ], [ 7.320341, 49.189498999000023 ], [ 7.394494, 49.316351998000016 ], [ 7.402075500000024, 49.367718501000013 ], [ 7.395898499999987, 49.372045996999987 ], [ 7.292601499999989, 49.408222497 ], [ 7.252589, 49.43146550099999 ], [ 7.276622499999974, 49.548623500000019 ], [ 7.02798, 49.63943849899999 ], [ 6.891454, 49.613422497999977 ], [ 6.380052499999977, 49.551104999000017 ], [ 6.4749625, 49.821274999000025 ], [ 6.137662499999976, 50.129951499000015 ], [ 6.405028500000014, 50.323308499 ], [ 6.425295, 50.323011998000027 ], [ 6.800711499999977, 50.361781499000017 ], [ 6.927901, 50.558618500000023 ], [ 7.1952005, 50.642722999 ], [ 7.210872, 50.649543499 ], [ 7.2123995, 50.623404999 ], [ 7.44077199899999, 50.711437998 ], [ 7.6610035, 
50.820364500999972 ], [ 7.785898, 50.939913999 ], [ 7.851496, 50.925832500000013 ], [ 8.03969, 50.697375498999975 ], [ 8.125776499999972, 50.685813998000015 ] ] ] } },
-{ "type": "Feature", "id": 595, "properties": { "NUTS_ID": "DEG", "STAT_LEVL_": 1, "SHAPE_AREA": 2.05802907599, "SHAPE_LEN": 8.2329101100200006 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 12.284594500000026, 51.091162497000028 ], [ 12.617355, 50.980792998000027 ], [ 12.652865500000019, 50.923678000999985 ], [ 12.250825500000019, 50.818318498999986 ], [ 12.3189165, 50.676532499000018 ], [ 11.944137500000011, 50.591275499 ], [ 11.919884500000023, 50.42440349899999 ], [ 11.603290500000014, 50.398766001000013 ], [ 11.481568, 50.431621501 ], [ 11.265938, 50.479421 ], [ 11.189943, 50.27118550099999 ], [ 10.9457185, 50.38646649899999 ], [ 10.715266499999984, 50.363590998 ], [ 10.8514945, 50.262762499000019 ], [ 10.729202, 50.230005496999979 ], [ 10.610115, 50.227994998999975 ], [ 10.450532, 50.4018595 ], [ 10.0413385, 50.516469498999982 ], [ 10.077555500000017, 50.63762399699999 ], [ 9.9261765, 50.767389998999988 ], [ 10.021558500000026, 50.9929495 ], [ 10.182348499999989, 50.99848750000001 ], [ 10.202181, 51.012009001000024 ], [ 10.210013, 51.144082998999977 ], [ 10.206942, 51.190648996999983 ], [ 9.928339, 51.375299 ], [ 10.365365, 51.55589100100002 ], [ 10.488551, 51.574778498 ], [ 10.677283, 51.638376000999983 ], [ 10.701372, 51.642187499999977 ], [ 10.916058998999972, 51.616374001 ], [ 10.978113, 51.426884998999981 ], [ 11.428868500000021, 51.339809999000011 ], [ 11.473968, 51.29505849899999 ], [ 11.385374500000012, 51.24588699899999 ], [ 11.484608499999979, 51.105437499 ], [ 11.696938499999987, 51.087490998000021 ], [ 12.021018500000025, 50.96912249899998 ], [ 12.163369499999988, 50.958822498000018 ], [ 12.224169, 50.942934998 ], [ 12.284594500000026, 51.091162497000028 ] ] ] } },
-{ "type": "Feature", "id": 621, "properties": { "NUTS_ID": "DK0", "STAT_LEVL_": 1, "SHAPE_AREA": 5.5988693816300001, "SHAPE_LEN": 33.248423858300001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 10.277603904999978, 56.700858940999979 ], [ 9.801955, 56.6385995 ], [ 10.194114500000012, 56.684693501000027 ], [ 10.308140107999975, 56.606842685 ], [ 10.9599055, 56.442081498999983 ], [ 10.712867, 56.14205549799999 ], [ 10.345911, 56.194126 ], [ 10.009511, 55.700878000999978 ], [ 9.671952499999975, 55.712069998 ], [ 9.543939500000022, 55.706680999000014 ], [ 9.860733, 55.625149 ], [ 9.739651342, 55.529699214 ], [ 10.493877499, 55.547950500000013 ], [ 10.427106, 55.429145999000013 ], [ 10.856239500000015, 55.293189500999972 ], [ 10.5537165, 54.944484499 ], [ 10.072239, 55.077411498 ], [ 9.679817695, 55.511982487000012 ], [ 9.663652053000021, 55.510181843 ], [ 9.488094499999988, 55.49062699699999 ], [ 9.422177499999975, 55.033451 ], [ 10.0712885, 54.877284997 ], [ 9.619266499999981, 54.93403200099999 ], [ 9.420151499999974, 54.831956500999979 ], [ 9.113097, 54.8736015 ], [ 8.63592599899999, 54.911682998 ], [ 8.684189, 55.158839998000019 ], [ 8.620333500000015, 55.428672999000014 ], [ 8.076632500000017, 55.556964999 ], [ 8.17079, 55.815367 ], [ 8.386469, 55.907278500000018 ], [ 8.099594, 56.036021998000024 ], [ 8.19683, 56.694996 ], [ 8.668789, 56.4678495 ], [ 9.002038860000027, 56.80120575 ], [ 9.1771, 56.708271 ], [ 9.05857850000001, 56.629835997999976 ], [ 9.319437, 56.672226998999975 ], [ 9.159611, 56.677932498000018 ], [ 9.24345, 56.74848949699998 ], [ 9.164284, 56.888828499999988 ], [ 10.272072, 56.9368935 ], [ 10.3327445, 56.708076496999979 ], [ 10.277603904999978, 56.700858940999979 ] ] ], [ [ [ 12.5641005, 55.994034 ], [ 12.583112, 55.807791501 ], [ 12.584052499999984, 55.721773499999983 ], [ 12.6812885, 55.588256999 ], [ 12.504888, 55.6374695 ], [ 12.363498, 55.593827498999985 ], [ 12.221227, 55.425629501 ], [ 12.4563445, 55.292506999000011 ], [ 
12.0024285, 54.96804599699999 ], [ 11.6157665, 55.081359998999972 ], [ 11.828402499999982, 55.042315997 ], [ 11.748527500000023, 55.219173497999975 ], [ 11.243200499000011, 55.207942996999975 ], [ 10.87105, 55.74376300099999 ], [ 11.348573, 55.757736500000021 ], [ 11.4824, 55.96112800100002 ], [ 11.777166500000021, 55.9767685 ], [ 11.743550500000026, 55.789573497999982 ], [ 11.805708192, 55.680681231999984 ], [ 11.87555, 55.737591499000018 ], [ 11.9599945, 55.850509498 ], [ 11.905017, 55.9359095 ], [ 12.0596445, 55.732090001000017 ], [ 11.980809, 55.730124998 ], [ 11.951244, 55.675521999000011 ], [ 12.084782, 55.654254001000027 ], [ 12.087174684999979, 55.719519628 ], [ 12.088165831000026, 55.746555279 ], [ 12.089422, 55.78082 ], [ 12.01011649899999, 55.96801749799999 ], [ 11.844483500000024, 55.967036999000015 ], [ 12.5641005, 55.994034 ] ] ], [ [ [ 8.668212, 56.950069498 ], [ 8.515605634, 56.686523847999979 ], [ 8.577901844, 56.692498349 ], [ 8.581561781, 56.695444119 ], [ 8.9003495, 56.952026499 ], [ 8.7765885, 56.694549498000015 ], [ 8.595991889, 56.690836525 ], [ 8.583263347000013, 56.690574833000028 ], [ 8.4040455, 56.670237 ], [ 8.232896, 56.798159499 ], [ 8.589805500000011, 57.11797349699998 ], [ 9.3992035, 57.164098497 ], [ 9.95829550000002, 57.595298997999976 ], [ 10.644709499999976, 57.742755499999987 ], [ 10.422333499999979, 57.14775149799999 ], [ 8.668212, 56.950069498 ] ] ], [ [ [ 12.166872, 54.834666997 ], [ 11.4617675, 54.610306 ], [ 10.956038499999977, 54.815006497000013 ], [ 12.166872, 54.834666997 ] ] ], [ [ [ 14.779674, 55.29000449900002 ], [ 15.158351, 55.083439 ], [ 14.683027, 55.09734649699999 ], [ 14.779674, 55.29000449900002 ] ] ] ] } },
-{ "type": "Feature", "id": 639, "properties": { "NUTS_ID": "EE0", "STAT_LEVL_": 1, "SHAPE_AREA": 6.9388307308700004, "SHAPE_LEN": 17.375034765500001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 27.357049499000027, 58.787144001 ], [ 27.621053500000016, 58.0051115 ], [ 27.351579, 57.51823699900001 ], [ 26.524905499999988, 57.516247497999984 ], [ 26.056459, 57.848431496999979 ], [ 25.046307, 58.040146000999982 ], [ 24.834082500000022, 57.9727795 ], [ 24.352817500000015, 57.876556500999982 ], [ 24.58152, 58.325415498999973 ], [ 24.111982500000011, 58.240461501000027 ], [ 23.521601, 58.564879500000018 ], [ 23.404937, 59.025805 ], [ 23.730738499999973, 59.237132501000019 ], [ 24.796578, 59.571803500999977 ], [ 25.83015949899999, 59.564065 ], [ 26.759436, 59.499518497999986 ], [ 28.041862499999979, 59.4701025 ], [ 28.209798001000024, 59.371387998999978 ], [ 27.357049499000027, 58.787144001 ] ] ], [ [ [ 23.345607500000028, 58.610229998000023 ], [ 22.719874, 58.212165001000017 ], [ 21.828614, 58.307032998000011 ], [ 22.543011499999977, 58.642051496000022 ], [ 23.345607500000028, 58.610229998000023 ] ] ], [ [ [ 22.9317145, 58.971051000999978 ], [ 23.069148, 58.838146000999984 ], [ 22.393010999000012, 58.891237998 ], [ 22.592449, 59.088072996999983 ], [ 22.9317145, 58.971051000999978 ] ] ] ] } },
-{ "type": "Feature", "id": 651, "properties": { "NUTS_ID": "EL3", "STAT_LEVL_": 1, "SHAPE_AREA": 0.34999048249499998, "SHAPE_LEN": 4.1181184112300002 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 24.079767, 38.161911998999983 ], [ 24.072212499999978, 37.682054000999983 ], [ 23.743593, 37.853809497999976 ], [ 23.672225500000025, 37.941819999000018 ], [ 23.5713945, 37.993224499 ], [ 23.5969505, 38.01517499800002 ], [ 23.179699500000027, 37.951589997999974 ], [ 23.133437, 37.920439499 ], [ 22.848893499999974, 38.02820650000001 ], [ 23.117532, 38.060645997999984 ], [ 23.126179, 38.168400001 ], [ 23.63415550000002, 38.211311498999976 ], [ 23.690418, 38.340297498999973 ], [ 24.079767, 38.161911998999983 ] ] ], [ [ [ 23.5226725, 37.518099499000016 ], [ 23.4244195, 37.411700998000015 ], [ 23.2003785, 37.59665300099999 ], [ 23.386825499999986, 37.561183998999979 ], [ 23.5226725, 37.518099499000016 ] ] ] ] } },
-{ "type": "Feature", "id": 660, "properties": { "NUTS_ID": "EL4", "STAT_LEVL_": 1, "SHAPE_AREA": 1.18768899792, "SHAPE_LEN": 13.9733443587 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 23.596821, 35.628162498999984 ], [ 23.8409785, 35.52808 ], [ 24.174324, 35.587199998000017 ], [ 24.317411, 35.353774998 ], [ 24.928003499999988, 35.40690999899999 ], [ 25.040821, 35.400146499000016 ], [ 25.490698, 35.298331997999981 ], [ 25.7719595, 35.341518499000017 ], [ 25.7262935, 35.13115699799999 ], [ 26.263292499999977, 35.265228501000024 ], [ 26.281149, 35.111587499 ], [ 25.550568, 34.99075699799999 ], [ 24.7709425, 34.929885997999975 ], [ 24.722517, 35.092544499999974 ], [ 24.287523498999974, 35.176006500000028 ], [ 23.515789, 35.289715 ], [ 23.596821, 35.628162498999984 ] ] ], [ [ [ 26.424524500000018, 39.334041500000012 ], [ 26.609173, 39.010497998 ], [ 25.831421, 39.18841949900002 ], [ 26.424524500000018, 39.334041500000012 ] ] ], [ [ [ 28.22072350000002, 36.45793699799998 ], [ 27.863466500000015, 35.929278497999974 ], [ 27.718461499999989, 35.94023049899999 ], [ 27.703456500000016, 36.146589999000014 ], [ 28.22072350000002, 36.45793699799998 ] ] ], [ [ [ 26.159764, 38.553092998000011 ], [ 26.03731349899999, 38.19849399899999 ], [ 25.8641245, 38.24406799799999 ], [ 25.857202500000028, 38.580429001000027 ], [ 26.159764, 38.553092998000011 ] ] ], [ [ [ 27.230274, 35.508568001000015 ], [ 27.134781, 35.396366 ], [ 27.061262, 35.59654999899999 ], [ 27.2101575, 35.830795497999986 ], [ 27.230274, 35.508568001000015 ] ] ], [ [ [ 24.9576055, 37.897117498999989 ], [ 24.9635755, 37.683925499 ], [ 24.684284, 37.951064998999982 ], [ 24.9576055, 37.897117498999989 ] ] ], [ [ [ 26.8575075, 37.797096497999974 ], [ 27.070253499999978, 37.717021998 ], [ 26.5665095, 37.730647998999984 ], [ 26.8575075, 37.797096497999974 ] ] ] ] } },
-{ "type": "Feature", "id": 673, "properties": { "NUTS_ID": "EL5", "STAT_LEVL_": 1, "SHAPE_AREA": 5.5230486305399999, "SHAPE_LEN": 20.862764837099999 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 26.628431499999976, 41.345533499 ], [ 26.321652, 41.250406499 ], [ 26.291015500000015, 40.93188349899998 ], [ 26.032758, 40.730256999 ], [ 25.632429, 40.862804498 ], [ 25.139545, 40.990687498999989 ], [ 24.81349, 40.858085499000026 ], [ 24.508724, 40.958480998000027 ], [ 24.088630500000022, 40.72168399899999 ], [ 23.86774650000001, 40.77872849900001 ], [ 23.765077500000018, 40.757892498999979 ], [ 23.7568645, 40.634563497999977 ], [ 24.010988, 40.389434499 ], [ 24.005043, 40.31456749900002 ], [ 23.861864, 40.37016300099998 ], [ 23.694948, 40.303939999000022 ], [ 23.997488499999974, 40.107707999000013 ], [ 23.9472275, 39.938018996999972 ], [ 23.65232450000002, 40.224617 ], [ 23.005256498999984, 40.350360501000011 ], [ 22.814283499999988, 40.478442998999981 ], [ 22.947689500000024, 40.626636497999982 ], [ 22.682419, 40.528964998999982 ], [ 22.663957499999981, 40.491554496999981 ], [ 22.547895499999981, 40.145462001 ], [ 22.66243, 39.975547999000014 ], [ 22.1152075, 40.189948997999977 ], [ 21.918218500000023, 39.852611498999977 ], [ 21.29648450000002, 39.857772997999973 ], [ 21.181753, 39.501956999000015 ], [ 21.373546499999975, 39.174735998000017 ], [ 21.1086085, 39.04614650100001 ], [ 20.738746, 39.018909499000017 ], [ 20.4778215, 39.27484149899999 ], [ 20.464247, 39.273101999 ], [ 20.3006785, 39.31573499699999 ], [ 20.008812499999976, 39.691292001000022 ], [ 20.391308, 39.788483499999984 ], [ 20.312236499999983, 39.991022498 ], [ 20.778501, 40.348788999000021 ], [ 21.056068499999981, 40.6166955 ], [ 20.980204500000013, 40.855665 ], [ 21.787378, 40.9311255 ], [ 21.929438, 41.100350498000012 ], [ 22.216216, 41.170457498000019 ], [ 22.332052, 41.120272500999988 ], [ 22.732037, 41.146391498000014 ], [ 22.879178500000023, 41.340652998 ], [ 22.9275915, 41.338539498999978 ], [ 23.624222998999983, 41.375727497000014 ], [ 24.059744, 41.522112 ], [ 24.525637, 41.568699998 ], [ 24.783956, 41.360188997000023 ], [ 25.1793755, 41.310187998 ], [ 25.224698499999988, 41.264630997999973 ], [ 25.906437499999981, 41.307574998 ], [ 25.948626, 41.320342000999972 ], [ 26.158688499999982, 41.391182496999988 ], [ 26.060393, 41.688516000999982 ], [ 26.357879, 41.711104498999987 ], [ 26.6001205, 41.601198997999973 ], [ 26.628431499999976, 41.345533499 ] ] ], [ [ [ 23.679613, 39.972439001 ], [ 23.750192500000026, 39.915984998999988 ], [ 23.355198, 39.953880498999979 ], [ 23.3263685, 40.183349498999974 ], [ 23.679613, 39.972439001 ] ] ] ] } },
-{ "type": "Feature", "id": 692, "properties": { "NUTS_ID": "EL6", "STAT_LEVL_": 1, "SHAPE_AREA": 5.8435701306799999, "SHAPE_LEN": 28.343296034400002 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 22.66243, 39.975547999000014 ], [ 22.915911, 39.60293599900001 ], [ 22.976824, 39.555328498999984 ], [ 23.3514765, 39.187774499999989 ], [ 23.073730500000011, 39.084652 ], [ 23.212417500000015, 39.146873497 ], [ 23.1633645, 39.2728995 ], [ 22.821161500000017, 39.277507501 ], [ 23.018066499999975, 39.002654996999979 ], [ 22.527525, 38.858909499999982 ], [ 23.3777905, 38.52575699800002 ], [ 23.328289, 38.501220499999988 ], [ 23.560117, 38.502597998 ], [ 23.655134499999974, 38.352496998999982 ], [ 23.690418, 38.340297498999973 ], [ 23.63415550000002, 38.211311498999976 ], [ 23.126179, 38.168400001 ], [ 22.6080685, 38.359401499 ], [ 21.851717, 38.374519497999984 ], [ 21.486598730000026, 38.301853584000014 ], [ 21.486297499999978, 38.327686498999981 ], [ 21.484744471999988, 38.306621568000025 ], [ 21.318926, 38.499629997 ], [ 21.1428795, 38.3037875 ], [ 21.105279793000022, 38.392762489 ], [ 21.075875442999973, 38.462344194000025 ], [ 20.988678, 38.66868599899999 ], [ 20.726846500000022, 38.814430499000025 ], [ 20.764536, 38.957309499000019 ], [ 21.066231, 38.877136499000017 ], [ 21.1086085, 39.04614650100001 ], [ 21.373546499999975, 39.174735998000017 ], [ 21.181753, 39.501956999000015 ], [ 21.29648450000002, 39.857772997999973 ], [ 21.918218500000023, 39.852611498999977 ], [ 22.1152075, 40.189948997999977 ], [ 22.66243, 39.975547999000014 ] ] ], [ [ [ 23.2003785, 37.59665300099999 ], [ 23.4244195, 37.411700998000015 ], [ 22.725086, 37.572639498 ], [ 22.982481, 37.048026998000012 ], [ 23.196792500000015, 36.434662 ], [ 22.7856215, 36.796966499 ], [ 22.386833, 36.591141 ], [ 22.149515, 37.018146499000011 ], [ 21.9301395, 36.9811555 ], [ 21.877396, 36.717616497999984 ], [ 21.704893, 36.811290499999984 ], [ 21.565725499, 37.163509499999975 ], [ 21.680233, 37.377433999 ], [ 21.105238, 37.848926499000015 ], [ 21.351074, 38.102138498999977 ], [ 21.871086, 38.334537497999975 ], [ 22.3729725, 38.142223498000021 ], [ 22.962061, 37.949569499 ], [ 23.178438, 37.803564498000014 ], [ 23.120031499999982, 37.72998499900001 ], [ 23.2003785, 37.59665300099999 ] ] ], [ [ [ 23.426233500000023, 38.90583049899999 ], [ 24.155115, 38.651188 ], [ 24.248035500000015, 38.22748199900002 ], [ 24.583963499999982, 38.0242235 ], [ 24.30678, 38.06962199899999 ], [ 24.048601, 38.395148998000025 ], [ 23.648699, 38.398697 ], [ 23.511806499999977, 38.587054997999985 ], [ 23.196602, 38.834670999000025 ], [ 22.829393499999981, 38.825450999 ], [ 23.317873, 39.038505499 ], [ 23.426233500000023, 38.90583049899999 ] ] ], [ [ [ 20.628674, 38.325271497000017 ], [ 20.79628550000001, 38.065116999 ], [ 20.341785500000015, 38.176131997000027 ], [ 20.536981500000024, 38.469810499 ], [ 20.628674, 38.325271497000017 ] ] ], [ [ [ 20.899757, 37.77567349899999 ], [ 20.988697, 37.705222497000022 ], [ 20.878824, 37.729893 ], [ 20.829103, 37.644634998000015 ], [ 20.625544, 37.820702000999972 ], [ 20.632630999000014, 37.88441899899999 ], [ 20.71006, 37.922859999000025 ], [ 20.899757, 37.77567349899999 ] ] ], [ [ [ 19.841421500000024, 39.659159499 ], [ 20.109863500000017, 39.360972 ], [ 19.641207, 39.749874498999986 ], [ 19.915536499999973, 39.79195249899999 ], [ 19.841421500000024, 39.659159499 ] ] ], [ [ [ 20.721731, 38.626224498999989 ], [ 20.551553500000011, 38.577827497999976 ], [ 20.5996705, 38.779578997999977 ], [ 20.693756, 38.851817998 ], [ 20.721731, 38.626224498999989 ] ] ] ] } },
-{ "type": "Feature", "id": 717, "properties": { "NUTS_ID": "ES1", "STAT_LEVL_": 1, "SHAPE_AREA": 5.1093221077699997, "SHAPE_LEN": 16.225247708600001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -3.153338, 43.353221999000027 ], [ -3.450147, 43.236207998999987 ], [ -3.417681500000015, 43.133369497999979 ], [ -3.945488, 43.005718496999975 ], [ -3.815811, 42.812561 ], [ -3.99967, 42.76893249699998 ], [ -4.002319, 42.830867497999975 ], [ -4.04593, 42.766566996999984 ], [ -4.081359, 42.761417499 ], [ -4.238510999000027, 42.954314997999973 ], [ -4.737022998999976, 43.02092349899999 ], [ -4.841038500000025, 43.180709499999978 ], [ -6.824167, 42.915012499 ], [ -7.076834, 42.50812399900002 ], [ -6.8227655, 42.490833001 ], [ -6.784308, 42.253608499999984 ], [ -6.983513500000015, 41.972903998999982 ], [ -7.20046450000001, 41.879749498000024 ], [ -8.051862500000027, 41.820613998 ], [ -8.1650755, 41.818302 ], [ -8.199000500000011, 42.154418998999972 ], [ -8.863186, 41.872066499000027 ], [ -8.898017, 42.1119195 ], [ -8.658873500000027, 42.291434998999989 ], [ -8.833692499999984, 42.470134498999982 ], [ -8.726749499999983, 42.68825149700001 ], [ -9.0309545, 42.697562996999977 ], [ -9.277219, 43.0441095 ], [ -7.904372500000022, 43.769103997 ], [ -7.699736499999972, 43.735115001 ], [ -7.031837, 43.544471497000018 ], [ -5.84171, 43.655956499000013 ], [ -4.51230099899999, 43.393204001000015 ], [ -3.426641001, 43.413513001000013 ], [ -3.153338, 43.353221999000027 ] ] ] } },
-{ "type": "Feature", "id": 727, "properties": { "NUTS_ID": "ES2", "STAT_LEVL_": 1, "SHAPE_AREA": 7.7314428233200001, "SHAPE_LEN": 14.658468562 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -1.785978, 43.350478996999982 ], [ -1.728903, 43.296088997000027 ], [ -1.6087885, 43.251976001 ], [ -0.724501, 42.920158500000014 ], [ -0.551047, 42.777637500000026 ], [ -0.313342, 42.849364997 ], [ 0.4777555, 42.700019997000027 ], [ 0.660127, 42.69095249899999 ], [ 0.767070499999988, 42.34881599800002 ], [ 0.335948499999972, 41.407432498999981 ], [ 0.385724, 41.278839498000025 ], [ 0.220409500000017, 41.07143 ], [ 0.170789500000012, 40.732837 ], [ -0.197124499999973, 40.784457998999983 ], [ -0.279997499999979, 40.369499499000028 ], [ -0.837749501000019, 39.976817997000012 ], [ -0.797643, 39.881076499000017 ], [ -0.912739, 39.87310750099999 ], [ -1.1423615, 39.971855499000014 ], [ -1.165154, 40.010108998000021 ], [ -1.448831499999983, 40.145358497000018 ], [ -1.806344, 40.39824699899998 ], [ -1.545494500000018, 40.595210999000017 ], [ -1.6174335, 40.94374099700002 ], [ -2.05169, 41.146857999000019 ], [ -2.170485499999984, 41.318836498999985 ], [ -1.825316, 41.778365999000016 ], [ -1.8565395, 41.966388498000015 ], [ -2.282167500000014, 42.13168699900001 ], [ -2.9136375, 42.022838499999978 ], [ -3.13429450000001, 42.54202249799999 ], [ -2.8581175, 42.638168499000017 ], [ -3.286797499999977, 42.885097497 ], [ -3.0369895, 42.982185497999978 ], [ -3.0890235, 43.00158299899999 ], [ -3.141396, 43.161483498999985 ], [ -3.417681500000015, 43.133369497999979 ], [ -3.450147, 43.236207998999987 ], [ -3.153338, 43.353221999000027 ], [ -2.508807499999989, 43.376544998999975 ], [ -2.412847, 43.321082998 ], [ -1.785978, 43.350478996999982 ] ] ] } },
-{ "type": "Feature", "id": 740, "properties": { "NUTS_ID": "ES3", "STAT_LEVL_": 1, "SHAPE_AREA": 0.81272479065799996, "SHAPE_LEN": 4.3266876300900003 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -3.130407, 40.40515399899999 ], [ -3.194294500000012, 40.248005998999986 ], [ -3.067689, 40.157884998999975 ], [ -3.161419500000022, 40.06489699799999 ], [ -3.623443, 40.053848500000015 ], [ -4.190704, 40.297315997999988 ], [ -4.578901, 40.217157499 ], [ -4.160293500000023, 40.689853500000027 ], [ -3.539573, 41.16499349899999 ], [ -3.394300499999986, 41.000234498 ], [ -3.468398, 40.689896498999985 ], [ -3.130407, 40.40515399899999 ] ] ] } },
-{ "type": "Feature", "id": 743, "properties": { "NUTS_ID": "DEC", "STAT_LEVL_": 1, "SHAPE_AREA": 0.321613830253, "SHAPE_LEN": 2.6466424927699999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 7.36875550000002, 49.16145799899999 ], [ 7.101069, 49.155998 ], [ 6.723465499999975, 49.218828998999982 ], [ 6.556986, 49.419208498999978 ], [ 6.367107499999975, 49.469507001000011 ], [ 6.380052499999977, 49.551104999000017 ], [ 6.891454, 49.613422497999977 ], [ 7.02798, 49.63943849899999 ], [ 7.276622499999974, 49.548623500000019 ], [ 7.252589, 49.43146550099999 ], [ 7.292601499999989, 49.408222497 ], [ 7.395898499999987, 49.372045996999987 ], [ 7.402075500000024, 49.367718501000013 ], [ 7.394494, 49.316351998000016 ], [ 7.320341, 49.189498999000023 ], [ 7.36875550000002, 49.16145799899999 ] ] ] } },
-{ "type": "Feature", "id": 751, "properties": { "NUTS_ID": "DED", "STAT_LEVL_": 1, "SHAPE_AREA": 2.3377208450500002, "SHAPE_LEN": 9.00332850707 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 14.729862, 51.581776999 ], [ 14.974183, 51.36394999700002 ], [ 15.037271, 51.243749998999988 ], [ 14.823362, 50.870550497000011 ], [ 14.6188, 50.857804497000018 ], [ 14.491221, 51.043530498999985 ], [ 14.317873500000019, 51.054698998999982 ], [ 14.388335, 50.900298997999982 ], [ 13.652174, 50.730359997999983 ], [ 13.501846, 50.633642999000017 ], [ 12.948144500000012, 50.404311499000016 ], [ 12.5837785, 50.407078999000021 ], [ 12.100900500000023, 50.31802799799999 ], [ 11.919884500000023, 50.42440349899999 ], [ 11.944137500000011, 50.591275499 ], [ 12.3189165, 50.676532499000018 ], [ 12.250825500000019, 50.818318498999986 ], [ 12.652865500000019, 50.923678000999985 ], [ 12.617355, 50.980792998000027 ], [ 12.284594500000026, 51.091162497000028 ], [ 12.1709045, 51.2755085 ], [ 12.193544499999973, 51.332518500999981 ], [ 12.198939, 51.531320498000014 ], [ 12.580263, 51.626065499999982 ], [ 13.051025, 51.647677 ], [ 13.21015, 51.404735999000025 ], [ 13.691250500000024, 51.374012999 ], [ 13.835313, 51.37678999799999 ], [ 14.163324499999987, 51.541042999000013 ], [ 14.447771, 51.542068997 ], [ 14.729862, 51.581776999 ] ] ] } },
-{ "type": "Feature", "id": 768, "properties": { "NUTS_ID": "DEE", "STAT_LEVL_": 1, "SHAPE_AREA": 2.7298399960699999, "SHAPE_LEN": 8.7958688020100002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 13.051025, 51.647677 ], [ 12.580263, 51.626065499999982 ], [ 12.198939, 51.531320498000014 ], [ 12.193544499999973, 51.332518500999981 ], [ 12.1709045, 51.2755085 ], [ 12.284594500000026, 51.091162497000028 ], [ 12.224169, 50.942934998 ], [ 12.163369499999988, 50.958822498000018 ], [ 12.021018500000025, 50.96912249899998 ], [ 11.696938499999987, 51.087490998000021 ], [ 11.484608499999979, 51.105437499 ], [ 11.385374500000012, 51.24588699899999 ], [ 11.473968, 51.29505849899999 ], [ 11.428868500000021, 51.339809999000011 ], [ 10.978113, 51.426884998999981 ], [ 10.916058998999972, 51.616374001 ], [ 10.701372, 51.642187499999977 ], [ 10.561227, 52.004065998999977 ], [ 10.801428499999986, 52.0480005 ], [ 10.964414499999975, 52.056642997999973 ], [ 11.086244, 52.22863399900001 ], [ 10.93454250000002, 52.471794999 ], [ 11.0087815, 52.496747500000026 ], [ 10.759314500000016, 52.795830500000022 ], [ 10.841556, 52.852205 ], [ 11.505027, 52.941032499000016 ], [ 11.597784499999989, 53.035926499000027 ], [ 12.126811499999974, 52.890199499 ], [ 12.249203500000021, 52.79186199899999 ], [ 12.171555, 52.506336497 ], [ 12.31718, 52.454095499 ], [ 12.276724499000011, 52.104018 ], [ 12.3761225, 52.04511949800002 ], [ 12.769780500000024, 51.979274499999974 ], [ 13.1509135, 51.859610499999974 ], [ 13.051025, 51.647677 ] ] ] } },
-{ "type": "Feature", "id": 784, "properties": { "NUTS_ID": "DEF", "STAT_LEVL_": 1, "SHAPE_AREA": 2.0719277246600001, "SHAPE_LEN": 8.32640217384 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 9.420151499999974, 54.831956500999979 ], [ 9.422934, 54.823222998 ], [ 9.491949499999976, 54.822635499 ], [ 10.03117, 54.636575000999983 ], [ 10.168517, 54.432844998 ], [ 10.174162, 54.345745498999975 ], [ 10.713750499000014, 54.305072500999984 ], [ 11.1256095, 54.373613499999976 ], [ 11.093783, 54.199050500999988 ], [ 10.755363499999987, 54.054051501 ], [ 10.840886001, 53.991893998000023 ], [ 10.903661500999988, 53.956822000999978 ], [ 10.76296450000001, 53.811153 ], [ 10.951918499999977, 53.647622499000022 ], [ 10.595047, 53.363927500999978 ], [ 10.469184, 53.385844 ], [ 10.308312, 53.433221498000023 ], [ 10.236678499999982, 53.496354498000017 ], [ 10.072805, 53.709633999 ], [ 9.945376, 53.652927998 ], [ 9.73010399899999, 53.557580997 ], [ 9.48589, 53.707664497999986 ], [ 9.199750999, 53.88010449799998 ], [ 9.02242, 53.87951750000002 ], [ 8.8448045, 54.266328499999986 ], [ 8.602134, 54.340507996999975 ], [ 9.010464, 54.496822999000017 ], [ 8.372664499999985, 54.896856497999977 ], [ 8.63592599899999, 54.911682998 ], [ 9.113097, 54.8736015 ], [ 9.420151499999974, 54.831956500999979 ] ] ] } },
-{ "type": "Feature", "id": 799, "properties": { "NUTS_ID": "FR6", "STAT_LEVL_": 1, "SHAPE_AREA": 11.8789682494, "SHAPE_LEN": 17.746398878800001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 2.28104350000001, 46.420403497 ], [ 2.565372500000024, 46.143036 ], [ 2.60902149899999, 45.966643998999984 ], [ 2.388014, 45.827372997 ], [ 2.492129499999976, 45.737669997000012 ], [ 2.50841250000002, 45.478501499 ], [ 2.062908, 44.976504498999986 ], [ 2.207473, 44.61552899899999 ], [ 2.4789475, 44.64800999900001 ], [ 2.7167695, 44.928827996999985 ], [ 2.9816755, 44.644672999000022 ], [ 3.120173500000021, 44.261838496999985 ], [ 3.373648, 44.170759498999985 ], [ 3.263114, 44.092425499 ], [ 3.448355, 44.019103498999982 ], [ 3.358362, 43.913829498999974 ], [ 2.935457, 43.694664999 ], [ 2.565782500000012, 43.422958 ], [ 2.265415, 43.452913498999976 ], [ 2.029134, 43.436895497000023 ], [ 1.6884235, 43.273554497000021 ], [ 1.949341, 43.120973 ], [ 2.166049, 42.663917498999979 ], [ 1.7860985, 42.573658 ], [ 1.442566, 42.603668001000017 ], [ 0.858215, 42.825740999 ], [ 0.660127, 42.69095249899999 ], [ 0.4777555, 42.700019997000027 ], [ -0.313342, 42.849364997 ], [ -0.551047, 42.777637500000026 ], [ -0.724501, 42.920158500000014 ], [ -1.6087885, 43.251976001 ], [ -1.728903, 43.296088997000027 ], [ -1.785978, 43.350478996999982 ], [ -1.52487050000002, 43.529700999 ], [ -1.253891, 44.467605499 ], [ -1.156919, 45.472530499000015 ], [ -0.708231, 45.327480498999989 ], [ -0.040197499999977, 45.10237999899999 ], [ 0.004336, 45.191628 ], [ 0.629741, 45.714569997000012 ], [ 0.823432500000024, 46.128584499 ], [ 1.177279, 46.383947998 ], [ 1.4151855, 46.347215001 ], [ 2.167784499999982, 46.424068998999985 ], [ 2.28104350000001, 46.420403497 ] ] ] } },
-{ "type": "Feature", "id": 819, "properties": { "NUTS_ID": "FR7", "STAT_LEVL_": 1, "SHAPE_AREA": 8.2323456312500003, "SHAPE_LEN": 16.838942168900001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 5.310563499000011, 46.446769999000026 ], [ 5.4736565, 46.264284001000021 ], [ 6.064003, 46.416228997000019 ], [ 6.125608, 46.317229999 ], [ 5.956067, 46.1320955 ], [ 6.310211, 46.244045498999981 ], [ 6.219547, 46.311877998999989 ], [ 6.231694, 46.329429498000025 ], [ 6.24136, 46.343582497 ], [ 6.821063998999989, 46.427154498999982 ], [ 6.797888999, 46.136798999 ], [ 7.044886, 45.922412999000016 ], [ 6.8023685, 45.778562 ], [ 7.104723, 45.468454499000018 ], [ 7.125157, 45.243994498 ], [ 6.630051, 45.109856499999978 ], [ 6.26057, 45.12684399699998 ], [ 6.203923499999974, 45.01247100099999 ], [ 6.355365, 44.854820499000027 ], [ 5.80147, 44.706777500999976 ], [ 5.418397500000026, 44.424768498999981 ], [ 5.676035999000021, 44.191428498999983 ], [ 5.4987865, 44.115716497999983 ], [ 4.804566, 44.303896998000027 ], [ 4.650611, 44.329802997 ], [ 4.6492275, 44.27036 ], [ 4.258899499999984, 44.264422998999976 ], [ 3.998161499999981, 44.459798498 ], [ 3.862531, 44.743865997 ], [ 3.361347500000022, 44.971408 ], [ 3.103125, 44.884632497999974 ], [ 2.9816755, 44.644672999000022 ], [ 2.7167695, 44.928827996999985 ], [ 2.4789475, 44.64800999900001 ], [ 2.207473, 44.61552899899999 ], [ 2.062908, 44.976504498999986 ], [ 2.50841250000002, 45.478501499 ], [ 2.492129499999976, 45.737669997000012 ], [ 2.388014, 45.827372997 ], [ 2.60902149899999, 45.966643998999984 ], [ 2.565372500000024, 46.143036 ], [ 2.28104350000001, 46.420403497 ], [ 3.032063, 46.794909501 ], [ 3.629422499999976, 46.749456497999972 ], [ 3.998868500000015, 46.46486699899998 ], [ 3.899538499000016, 46.275907999000026 ], [ 4.388074500000016, 46.219790499 ], [ 4.780208500000015, 46.176675999 ], [ 4.953853998999989, 46.513422999 ], [ 5.310563499000011, 46.446769999000026 ] ] ] } },
-{ "type": "Feature", "id": 834, "properties": { "NUTS_ID": "FR8", "STAT_LEVL_": 1, "SHAPE_AREA": 7.6266291756699998, "SHAPE_LEN": 23.436939791499999 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 7.007760500000018, 44.23670049899999 ], [ 7.714236500000027, 44.061513499 ], [ 7.529827, 43.784007997 ], [ 7.43605928, 43.769769968999981 ], [ 7.413667280000027, 43.744077703000016 ], [ 6.933721, 43.480064497 ], [ 6.621062, 43.157783500999983 ], [ 5.671879, 43.17926799899999 ], [ 4.230281, 43.460186 ], [ 4.101040500000011, 43.554370999000014 ], [ 3.24056250000001, 43.212802999000019 ], [ 2.9889005, 42.864599999 ], [ 3.035430500000018, 42.8420695 ], [ 3.050192, 42.870636 ], [ 3.043504499999983, 42.83815 ], [ 3.174803999, 42.435375 ], [ 1.964552, 42.382156999000017 ], [ 1.731011, 42.492400998999983 ], [ 1.725801, 42.504401998999981 ], [ 1.7860985, 42.573658 ], [ 2.166049, 42.663917498999979 ], [ 1.949341, 43.120973 ], [ 1.6884235, 43.273554497000021 ], [ 2.029134, 43.436895497000023 ], [ 2.265415, 43.452913498999976 ], [ 2.565782500000012, 43.422958 ], [ 2.935457, 43.694664999 ], [ 3.358362, 43.913829498999974 ], [ 3.448355, 44.019103498999982 ], [ 3.263114, 44.092425499 ], [ 3.373648, 44.170759498999985 ], [ 3.120173500000021, 44.261838496999985 ], [ 2.9816755, 44.644672999000022 ], [ 3.103125, 44.884632497999974 ], [ 3.361347500000022, 44.971408 ], [ 3.862531, 44.743865997 ], [ 3.998161499999981, 44.459798498 ], [ 4.258899499999984, 44.264422998999976 ], [ 4.6492275, 44.27036 ], [ 4.650611, 44.329802997 ], [ 4.804566, 44.303896998000027 ], [ 5.4987865, 44.115716497999983 ], [ 5.676035999000021, 44.191428498999983 ], [ 5.418397500000026, 44.424768498999981 ], [ 5.80147, 44.706777500999976 ], [ 6.355365, 44.854820499000027 ], [ 6.203923499999974, 45.01247100099999 ], [ 6.26057, 45.12684399699998 ], [ 6.630051, 45.109856499999978 ], [ 7.065755, 44.713464497000018 ], [ 6.948443, 44.654741999 ], [ 6.887428, 44.361287 ], [ 7.007760500000018, 44.23670049899999 ] ] ], [ [ [ 9.402272, 41.858703499 ], [ 9.219715, 41.367603998999982 ], [ 8.788534, 41.557075497000028 ], [ 8.591134, 41.962154496999972 ], [ 8.573006498999973, 42.238570998 ], [ 8.691523500000017, 42.266464000999974 ], [ 8.573408, 42.381405 ], [ 8.727043, 42.561603499 ], [ 9.301698499999986, 42.678638501000023 ], [ 9.343248500000016, 42.999740499999973 ], [ 9.4632805, 42.986736499000017 ], [ 9.559117500000013, 42.196696997 ], [ 9.402272, 41.858703499 ] ] ] ] } },
-{ "type": "Feature", "id": 851, "properties": { "NUTS_ID": "FRA", "STAT_LEVL_": 1, "SHAPE_AREA": 7.2853621183900001, "SHAPE_LEN": 16.8260370478 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -52.565187, 2.516254 ], [ -52.904169, 2.190938 ], [ -53.723031, 2.31207 ], [ -54.152372500000013, 2.119626501000027 ], [ -54.587859499999979, 2.32202350099999 ], [ -54.208187, 2.7777375 ], [ -53.979807, 3.606575 ], [ -54.356726499999979, 4.047393500999988 ], [ -54.468998, 4.890381 ], [ -54.008094501000016, 5.623116499999981 ], [ -53.9683655, 5.745176500000014 ], [ -53.7992595, 5.723899501 ], [ -52.94350750000001, 5.451987499999973 ], [ -52.364701, 4.9207345 ], [ -51.813244, 4.6343435 ], [ -51.629835500000013, 4.189126499 ], [ -52.565187, 2.516254 ] ] ], [ [ [ 55.825796500000024, -21.143524499000023 ], [ 55.797395, -21.350129498 ], [ 55.292974, -21.229784998000014 ], [ 55.283669, -20.926209498999981 ], [ 55.6477375, -20.91560449799999 ], [ 55.825796500000024, -21.143524499000023 ] ] ], [ [ [ -61.003888, 14.808371500000021 ], [ -60.857675500000028, 14.398175 ], [ -61.081969, 14.470402498999988 ], [ -61.229269, 14.822112499000013 ], [ -61.003888, 14.808371500000021 ] ] ], [ [ [ -61.555346499999985, 16.053240499000026 ], [ -61.6968005, 15.946588999000028 ], [ -61.80593, 16.257474498000022 ], [ -61.672612500000014, 16.324603500000023 ], [ -61.555346499999985, 16.053240499000026 ] ] ], [ [ [ -61.254871499999979, 16.256139 ], [ -61.527298499999972, 16.219122998999978 ], [ -61.541210499999977, 16.438073 ], [ -61.464057500000024, 16.513064499 ], [ -61.254871499999979, 16.256139 ] ] ], [ [ [ 45.2132815, -12.873075500000027 ], [ 45.1483005, -13.000090499 ], [ 45.071559, -12.902030998999976 ], [ 45.154967, -12.92356799800001 ], [ 45.04258, -12.744023498999979 ], [ 45.088423499999976, -12.669118998999977 ], [ 45.125466999000025, -12.727415999000016 ], [ 45.236856, -12.753384998 ], [ 45.186050500000022, -12.839496 ], [ 45.2132815, -12.873075500000027 ] ] ] ] } },
-{ "type": "Feature", "id": 863, "properties": { "NUTS_ID": "HR0", "STAT_LEVL_": 1, "SHAPE_AREA": 6.2709515486800003, "SHAPE_LEN": 27.335127013899999 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 14.589105421, 45.236414152 ], [ 14.8205145, 44.970927997999979 ], [ 14.438129499000013, 45.064864499 ], [ 14.585924309, 45.238482154 ], [ 14.332406, 45.355255000999989 ], [ 14.2265825, 45.153838499000017 ], [ 13.906588, 44.768634999000028 ], [ 13.606798, 45.11697399799999 ], [ 13.489867, 45.486835497000015 ], [ 13.583067, 45.477409997 ], [ 14.109059, 45.482458496999982 ], [ 14.11819650000001, 45.48123899699999 ], [ 14.570012500000018, 45.672944497 ], [ 15.226437499999975, 45.42708799799999 ], [ 15.385532, 45.486580499000013 ], [ 15.277050499999973, 45.604461499000024 ], [ 15.331290500000023, 45.762854999000012 ], [ 15.404425, 45.792716997000014 ], [ 15.706404500000019, 45.97534299900002 ], [ 15.627262, 46.085953499000027 ], [ 15.7914755, 46.259327497000015 ], [ 15.8768685, 46.280055499000014 ], [ 16.301549, 46.378287996999973 ], [ 16.242148499999985, 46.490075497000021 ], [ 16.596805, 46.475902498999972 ], [ 16.854754999000022, 46.350440999 ], [ 16.8760435, 46.320602496999982 ], [ 17.294325, 45.988544500999978 ], [ 17.651499, 45.847835499999974 ], [ 17.911642, 45.790950996999982 ], [ 18.446884, 45.737048997999977 ], [ 18.821306, 45.914382997000018 ], [ 18.889734, 45.921180497000023 ], [ 19.033942, 45.486821498999973 ], [ 19.004717, 45.435032 ], [ 19.425167499999986, 45.167692499 ], [ 19.0864565, 45.145390999000028 ], [ 19.022069499999986, 44.855353998999988 ], [ 18.531997499999989, 45.090629498 ], [ 17.143591499000024, 45.162589500000024 ], [ 16.529669, 45.226574 ], [ 16.288761, 44.995689497 ], [ 15.836288500000023, 45.22247699899998 ], [ 15.736722, 44.935630999000011 ], [ 16.131936999, 44.454681500999982 ], [ 16.208290375999979, 44.216682898999977 ], [ 16.544348, 43.974178498000015 ], [ 16.811584499999981, 43.755492998000022 ], [ 17.450845500000014, 43.176868500000012 ], [ 17.4751225, 43.161456997000016 ], [ 17.712072499999977, 42.973003499000015 ], [ 17.581335, 42.938422999000011 ], [ 17.355711, 43.084621501000015 ], [ 16.3882865, 43.509128498999985 ], [ 16.475813, 43.534843501000012 ], [ 16.013214, 43.502506 ], [ 15.558911500000022, 43.871951997 ], [ 15.111249, 44.260101497999983 ], [ 15.598931350999976, 44.172556789 ], [ 15.2894715, 44.363227999 ], [ 14.896002, 44.696899497 ], [ 14.881313499999976, 45.034404999 ], [ 14.589105421, 45.236414152 ] ] ], [ [ [ 18.438104, 42.555705000999978 ], [ 18.525213, 42.420462997000016 ], [ 17.000639, 43.046752998999978 ], [ 17.469783999000015, 42.914893996999979 ], [ 17.6491565, 42.888923499999976 ], [ 18.438104, 42.555705000999978 ] ] ], [ [ [ 14.474727499999972, 44.95611549900002 ], [ 14.517015500000014, 44.608916998999973 ], [ 14.3020285, 44.93503199700001 ], [ 14.3602285, 45.088412999000013 ], [ 14.474727499999972, 44.95611549900002 ] ] ] ] } },
-{ "type": "Feature", "id": 888, "properties": { "NUTS_ID": "ES4", "STAT_LEVL_": 1, "SHAPE_AREA": 22.818076117299999, "SHAPE_LEN": 31.029754506500002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -3.417681500000015, 43.133369497999979 ], [ -3.141396, 43.161483498999985 ], [ -3.0890235, 43.00158299899999 ], [ -3.0369895, 42.982185497999978 ], [ -3.286797499999977, 42.885097497 ], [ -2.8581175, 42.638168499000017 ], [ -3.13429450000001, 42.54202249799999 ], [ -2.9136375, 42.022838499999978 ], [ -2.282167500000014, 42.13168699900001 ], [ -1.8565395, 41.966388498000015 ], [ -1.825316, 41.778365999000016 ], [ -2.170485499999984, 41.318836498999985 ], [ -2.05169, 41.146857999000019 ], [ -1.6174335, 40.94374099700002 ], [ -1.545494500000018, 40.595210999000017 ], [ -1.806344, 40.39824699899998 ], [ -1.448831499999983, 40.145358497000018 ], [ -1.165154, 40.010108998000021 ], [ -1.1423615, 39.971855499000014 ], [ -1.505146500000023, 39.41801449799999 ], [ -1.161849, 39.305431499 ], [ -1.266710499999988, 39.051032500000019 ], [ -0.959359, 38.944587497999976 ], [ -0.928884499999981, 38.783839999 ], [ -1.02687, 38.655509498000015 ], [ -1.402179499999988, 38.690803500000015 ], [ -2.341602, 38.026019999000027 ], [ -2.551274001000024, 38.084118 ], [ -2.481576, 38.393101 ], [ -2.762069, 38.532779499000014 ], [ -4.268895, 38.347212997999975 ], [ -5.008071500000028, 38.715274998999973 ], [ -5.046996001000025, 38.72913349800001 ], [ -5.568977500000017, 38.432639998000013 ], [ -5.584845499999972, 38.131752 ], [ -5.874359, 38.158622998 ], [ -6.180307500000026, 37.941077998000026 ], [ -6.9317385, 38.208377998 ], [ -7.10795250000001, 38.188121500000022 ], [ -7.316636, 38.439876498999979 ], [ -7.203135, 38.75101749800001 ], [ -6.9513915, 39.024070499 ], [ -7.231467, 39.278431 ], [ -7.53490245, 39.661986891000026 ], [ -7.015405, 39.670856499000024 ], [ -6.864203, 40.011867498000015 ], [ -7.011960499999986, 40.126934 ], [ -6.9512985, 40.257445999000026 ], [ -6.865144, 40.270694499 ], [ -6.801935, 40.861045998 ], [ -6.929903500000023, 41.029466499000023 ], [ -6.689786, 41.205241498000021 ], [ -6.479713, 41.294379999 ], [ -6.189352, 41.575046499 ], [ -6.5884615, 41.967761998000015 ], [ -6.983513500000015, 41.972903998999982 ], [ -6.784308, 42.253608499999984 ], [ -6.8227655, 42.490833001 ], [ -7.076834, 42.50812399900002 ], [ -6.824167, 42.915012499 ], [ -4.841038500000025, 43.180709499999978 ], [ -4.737022998999976, 43.02092349899999 ], [ -4.238510999000027, 42.954314997999973 ], [ -4.081359, 42.761417499 ], [ -4.04593, 42.766566996999984 ], [ -4.002319, 42.830867497999975 ], [ -3.99967, 42.76893249699998 ], [ -3.815811, 42.812561 ], [ -3.945488, 43.005718496999975 ], [ -3.417681500000015, 43.133369497999979 ] ], [ [ -3.130407, 40.40515399899999 ], [ -3.468398, 40.689896498999985 ], [ -3.394300499999986, 41.000234498 ], [ -3.539573, 41.16499349899999 ], [ -4.160293500000023, 40.689853500000027 ], [ -4.578901, 40.217157499 ], [ -4.190704, 40.297315997999988 ], [ -3.623443, 40.053848500000015 ], [ -3.161419500000022, 40.06489699799999 ], [ -3.067689, 40.157884998999975 ], [ -3.194294500000012, 40.248005998999986 ], [ -3.130407, 40.40515399899999 ] ] ] } },
-{ "type": "Feature", "id": 908, "properties": { "NUTS_ID": "ES5", "STAT_LEVL_": 1, "SHAPE_AREA": 6.28681523868, "SHAPE_LEN": 22.031441685299999 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 1.731011, 42.492400998999983 ], [ 1.964552, 42.382156999000017 ], [ 3.174803999, 42.435375 ], [ 3.111661500000025, 42.203849498000011 ], [ 3.21998050000002, 41.92232149900002 ], [ 2.778513, 41.648807498 ], [ 1.645323500000018, 41.19562149699999 ], [ 0.966513, 41.028423498999985 ], [ 0.5152385, 40.522918498000024 ], [ -0.188467, 39.721954498 ], [ -0.334561, 39.423534499000027 ], [ 0.012785310000027, 38.864592846999983 ], [ 0.205479500000024, 38.731944999 ], [ -0.301367, 38.481929498999989 ], [ -0.7621355, 37.847007997999981 ], [ -1.030276500000014, 38.097076497999979 ], [ -1.02687, 38.655509498000015 ], [ -0.928884499999981, 38.783839999 ], [ -0.959359, 38.944587497999976 ], [ -1.266710499999988, 39.051032500000019 ], [ -1.161849, 39.305431499 ], [ -1.505146500000023, 39.41801449799999 ], [ -1.1423615, 39.971855499000014 ], [ -0.912739, 39.87310750099999 ], [ -0.797643, 39.881076499000017 ], [ -0.837749501000019, 39.976817997000012 ], [ -0.279997499999979, 40.369499499000028 ], [ -0.197124499999973, 40.784457998999983 ], [ 0.170789500000012, 40.732837 ], [ 0.220409500000017, 41.07143 ], [ 0.385724, 41.278839498000025 ], [ 0.335948499999972, 41.407432498999981 ], [ 0.767070499999988, 42.34881599800002 ], [ 0.660127, 42.69095249899999 ], [ 0.858215, 42.825740999 ], [ 1.442566, 42.603668001000017 ], [ 1.725801, 42.504401998999981 ], [ 1.731011, 42.492400998999983 ] ] ], [ [ [ 3.079701, 39.901542499000016 ], [ 3.1449035, 39.773849500999972 ], [ 3.3484785, 39.788887000999978 ], [ 3.455952500000024, 39.656703998000012 ], [ 3.0730825, 39.268691999 ], [ 2.344436, 39.587226998 ], [ 2.778520500000013, 39.85567899900002 ], [ 3.213000500000021, 39.955375996999976 ], [ 3.079701, 39.901542499000016 ] ] ], [ [ [ 1.423349, 38.905044498999985 ], [ 1.3711275, 38.830333498000016 ], [ 1.213064499999973, 38.901359500000012 ], [ 1.310512500000016, 39.042953499000021 ], [ 1.603850500000021, 39.091018497999983 ], [ 1.605787500000019, 39.02961349899999 ], [ 1.423349, 38.905044498999985 ] ] ], [ [ [ 4.094098, 40.05835349900002 ], [ 4.28757, 39.814689499 ], [ 3.791304, 40.016177998999979 ], [ 4.094098, 40.05835349900002 ] ] ] ] } },
-{ "type": "Feature", "id": 922, "properties": { "NUTS_ID": "ES6", "STAT_LEVL_": 1, "SHAPE_AREA": 10.183726200400001, "SHAPE_LEN": 17.966652552199999 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -5.008071500000028, 38.715274998999973 ], [ -4.268895, 38.347212997999975 ], [ -2.762069, 38.532779499000014 ], [ -2.481576, 38.393101 ], [ -2.551274001000024, 38.084118 ], [ -2.341602, 38.026019999000027 ], [ -1.402179499999988, 38.690803500000015 ], [ -1.02687, 38.655509498000015 ], [ -1.030276500000014, 38.097076497999979 ], [ -0.7621355, 37.847007997999981 ], [ -0.8599795, 37.721446997999976 ], [ -0.720821, 37.605872998 ], [ -1.325323500000025, 37.562442498999985 ], [ -1.630033, 37.375182999 ], [ -2.12434, 36.731842001000018 ], [ -2.3680915, 36.841457499 ], [ -2.700591, 36.682723997999972 ], [ -3.128691, 36.750887998999985 ], [ -3.777457500000025, 36.737926500000015 ], [ -4.418207, 36.717219999 ], [ -5.252405, 36.311275498999976 ], [ -5.339225, 36.152034997999976 ], [ -5.351522499999987, 36.152568998999982 ], [ -5.395709008999972, 36.127541227999984 ], [ -5.60843, 36.007053497000015 ], [ -6.040427500000021, 36.192378999000027 ], [ -6.440116, 36.720863498000028 ], [ -6.345561, 36.798766998000019 ], [ -6.893693499999983, 37.161465 ], [ -7.401916500000027, 37.174827498000013 ], [ -7.512691500000017, 37.526256499 ], [ -7.2632845, 37.979908 ], [ -7.002483499999983, 38.0227165 ], [ -6.9317385, 38.208377998 ], [ -6.180307500000026, 37.941077998000026 ], [ -5.874359, 38.158622998 ], [ -5.584845499999972, 38.131752 ], [ -5.568977500000017, 38.432639998000013 ], [ -5.046996001000025, 38.72913349800001 ], [ -5.008071500000028, 38.715274998999973 ] ] ], [ [ [ -2.950587, 35.318417999000019 ], [ -2.92736, 35.274264 ], [ -2.970093500000019, 35.289367500000026 ], [ -2.950587, 35.318417999000019 ] ] ], [ [ [ -5.35072900099999, 35.909002998 ], [ -5.34252117599999, 35.873038039999983 ], [ -5.382034499999975, 35.912606499999981 ], [ -5.35072900099999, 35.909002998 ] ] ] ] } },
-{ "type": "Feature", "id": 938, "properties": { "NUTS_ID": "ES7", "STAT_LEVL_": 1, "SHAPE_AREA": 0.58145106276699998, "SHAPE_LEN": 9.1537334656899993 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -16.3605925, 28.379459499 ], [ -16.640173, 28.00498799799999 ], [ -16.924217, 28.350006 ], [ -16.134222, 28.5812585 ], [ -16.3605925, 28.379459499 ] ] ], [ [ [ -15.424459500000012, 27.807855497999981 ], [ -15.600111, 27.734939498000017 ], [ -15.8329905, 27.90972349899999 ], [ -15.709276000999978, 28.16573899799999 ], [ -15.4035465, 28.171691999000018 ], [ -15.424459500000012, 27.807855497999981 ] ] ], [ [ [ -14.2242965, 28.162038998000014 ], [ -14.492716, 28.083705999000017 ], [ -14.226619, 28.21153449799999 ], [ -14.016622499999983, 28.714993998000011 ], [ -13.877083500000026, 28.752040998999973 ], [ -13.922949, 28.247462998 ], [ -14.2242965, 28.162038998000014 ] ] ], [ [ [ -13.482365500000014, 28.999332498 ], [ -13.8541985, 28.862073998000028 ], [ -13.756207500000016, 29.0776095 ], [ -13.470518, 29.238542498000015 ], [ -13.464197, 29.12962349899999 ], [ -13.482365500000014, 28.999332498 ] ] ], [ [ [ -17.7602195, 28.569320999000013 ], [ -17.842235500000015, 28.452651499000012 ], [ -17.9487345, 28.840764997 ], [ -17.777729, 28.839669997999977 ], [ -17.7602195, 28.569320999000013 ] ] ], [ [ [ -17.099160500999972, 28.0943145 ], [ -17.199426500000015, 28.023731498000018 ], [ -17.3492185, 28.099218498000027 ], [ -17.318047, 28.204323999 ], [ -17.118093499999986, 28.149373998999977 ], [ -17.099160500999972, 28.0943145 ] ] ], [ [ [ -17.883365500000025, 27.80445449699999 ], [ -17.979040000999987, 27.640519998 ], [ -18.158359500000017, 27.712901499999987 ], [ -18.131494, 27.771581498999979 ], [ -18.036621001000015, 27.762228 ], [ -17.958796, 27.841646 ], [ -17.883365500000025, 27.80445449699999 ] ] ] ] } },
-{ "type": "Feature", "id": 948, "properties": { "NUTS_ID": "FI1", "STAT_LEVL_": 1, "SHAPE_AREA": 61.297172761299997, "SHAPE_LEN": 51.154091122899999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 28.8007675, 68.869286000999978 ], [ 28.4339175, 68.539672 ], [ 28.6461425, 68.196304500999986 ], [ 30.0170405, 67.673556500000018 ], [ 29.033642, 66.942135001 ], [ 29.572938500000021, 66.432856 ], [ 30.13848, 65.668682000999979 ], [ 29.7547315, 65.497369498000012 ], [ 29.64493, 64.866495499999985 ], [ 30.086534500000027, 64.773990498999979 ], [ 30.044892, 64.402029997999989 ], [ 30.553427999, 64.132247999000015 ], [ 29.971917, 63.757165499999985 ], [ 31.586729, 62.90870200099999 ], [ 30.143966499999976, 61.85223799900001 ], [ 27.991279500000019, 60.668979 ], [ 27.797273500000017, 60.54534300099999 ], [ 26.453932, 60.487652500000024 ], [ 26.009723687000019, 60.316479188000017 ], [ 26.010455, 60.321357 ], [ 25.993759220000015, 60.337336162999975 ], [ 25.9122875, 60.366004498 ], [ 25.863847, 60.323478497999986 ], [ 25.936341, 60.26062 ], [ 25.7869765, 60.237285997000015 ], [ 25.627897500000017, 60.326478 ], [ 25.7202395, 60.339234498 ], [ 25.6696885, 60.383327501 ], [ 25.432495500000016, 60.222521 ], [ 25.216344105000019, 60.21141365599999 ], [ 24.575224, 60.178468501 ], [ 24.202054, 60.027420999000014 ], [ 23.839481499999977, 60.018403001000024 ], [ 23.956796, 60.004310997 ], [ 23.692564563000019, 59.972044676999985 ], [ 23.727499, 59.943315996000024 ], [ 23.3554325, 59.917364998999972 ], [ 23.186451499999976, 59.827804001 ], [ 22.932191499999988, 59.824022 ], [ 23.118828807999989, 60.000895125999989 ], [ 23.028183, 60.101486500000021 ], [ 22.87021, 60.1982375 ], [ 23.079409, 60.375627 ], [ 22.454013998999983, 60.265965000999984 ], [ 22.604136499999981, 60.373881499999982 ], [ 22.50068, 60.418537497999978 ], [ 22.295877999000027, 60.384289497999987 ], [ 22.253048428, 60.411757148999982 ], [ 22.247403, 60.419651498 ], [ 22.24190927799998, 60.418904734000023 ], [ 
21.846205, 60.63480049899999 ], [ 21.585519498999986, 60.491007001000014 ], [ 21.572621, 60.500252499 ], [ 21.592718, 60.517643501 ], [ 21.5593745, 60.559245001000022 ], [ 21.462105, 60.581443501000024 ], [ 21.1746465, 60.87034399700002 ], [ 21.419929500000023, 61.047442998 ], [ 21.7037775, 61.557694501000014 ], [ 21.27957, 61.993263 ], [ 21.366348, 62.370907 ], [ 21.08101, 62.6323175 ], [ 21.483848500000022, 62.97023950099998 ], [ 21.455967718000011, 63.012294849999989 ], [ 21.3373805, 63.0409545 ], [ 22.137469558000021, 63.261305887999981 ], [ 22.159774, 63.255317 ], [ 22.224766719, 63.291334242 ], [ 22.393028500000014, 63.333281500999988 ], [ 22.324308499999972, 63.581827499999974 ], [ 22.8713565, 63.8339785 ], [ 22.927715464000016, 63.82609472199999 ], [ 22.989051, 63.788022 ], [ 23.027271499999983, 63.788767501 ], [ 23.61733300100002, 64.048771998 ], [ 24.539910500000019, 64.814548496999976 ], [ 25.487671, 64.9614365 ], [ 25.366445, 65.427499499000021 ], [ 25.084607499000015, 65.588949496999987 ], [ 24.8877245, 65.665782 ], [ 24.155129, 65.816027 ], [ 23.6455995, 66.3014085 ], [ 23.995160500999987, 66.819708999 ], [ 23.3940235, 67.485820999 ], [ 23.652348999000026, 67.958793001 ], [ 21.888943000999973, 68.5843835 ], [ 20.548636499999986, 69.059968501000014 ], [ 21.2788215, 69.311883998999974 ], [ 21.9836115, 69.072893500000021 ], [ 22.374526, 68.716667 ], [ 24.903199500000028, 68.554591 ], [ 25.777502500000026, 69.018279001 ], [ 25.702101, 69.253661501000011 ], [ 25.950607, 69.6965505 ], [ 27.984522, 70.01396549899999 ], [ 29.336974, 69.478310500000021 ], [ 28.8057925, 69.111147 ], [ 28.92968, 69.051905 ], [ 28.415766500000018, 68.915452501 ], [ 28.8007675, 68.869286000999978 ] ] ] } },
-{ "type": "Feature", "id": 971, "properties": { "NUTS_ID": "FI2", "STAT_LEVL_": 1, "SHAPE_AREA": 0.144539654431, "SHAPE_LEN": 1.72365985858 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 19.916036500000018, 60.422706501999983 ], [ 20.2753515, 60.28527 ], [ 20.168312500000013, 60.17514399800001 ], [ 20.043062, 60.102843001 ], [ 19.980941499999972, 60.104172 ], [ 19.9577145, 60.060943501 ], [ 19.513948, 60.178337 ], [ 19.916036500000018, 60.422706501999983 ] ] ] } },
-{ "type": "Feature", "id": 975, "properties": { "NUTS_ID": "FR1", "STAT_LEVL_": 1, "SHAPE_AREA": 1.46444370405, "SHAPE_LEN": 5.2383928090699996 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 1.704359, 49.232196997000017 ], [ 2.5905285, 49.079653999000016 ], [ 3.07188, 49.117553500999975 ], [ 3.4851835, 48.851910498999985 ], [ 3.557419164, 48.616713788000027 ], [ 3.414788999, 48.390268499 ], [ 3.049454, 48.360027499000012 ], [ 2.936316, 48.163391499999989 ], [ 2.476016500000014, 48.1296365 ], [ 2.402663, 48.3207175 ], [ 1.99409, 48.286584 ], [ 1.9221465, 48.457599499000025 ], [ 1.501526500000011, 48.941051997999978 ], [ 1.608799, 49.077894 ], [ 1.704359, 49.232196997000017 ] ] ] } },
-{ "type": "Feature", "id": 985, "properties": { "NUTS_ID": "FR2", "STAT_LEVL_": 1, "SHAPE_AREA": 17.828110909900001, "SHAPE_LEN": 30.506354878900002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 4.140853, 49.978759997 ], [ 4.233133, 49.957828498000026 ], [ 4.432494, 49.941615996999985 ], [ 4.796697, 50.148677998999972 ], [ 4.896794, 50.137420498999973 ], [ 4.793194, 49.98199900100002 ], [ 4.851578, 49.793255000999977 ], [ 4.969431, 49.801825999000016 ], [ 5.153738499999974, 49.717926 ], [ 5.393511, 49.617111 ], [ 5.107566500000019, 49.584689498999978 ], [ 4.950990499999989, 49.23686749699999 ], [ 4.9884305, 48.684422000999973 ], [ 5.470055, 48.420926499000018 ], [ 5.8847235, 47.92604649899999 ], [ 5.37408, 47.604537999 ], [ 5.477542, 47.60871849900002 ], [ 5.375406999, 47.460167499000022 ], [ 5.518537499999979, 47.304183998999974 ], [ 5.255232499999977, 46.979887498999972 ], [ 5.440604, 46.637912 ], [ 5.310563499000011, 46.446769999000026 ], [ 4.953853998999989, 46.513422999 ], [ 4.780208500000015, 46.176675999 ], [ 4.388074500000016, 46.219790499 ], [ 3.899538499000016, 46.275907999000026 ], [ 3.998868500000015, 46.46486699899998 ], [ 3.629422499999976, 46.749456497999972 ], [ 3.032063, 46.794909501 ], [ 2.28104350000001, 46.420403497 ], [ 2.167784499999982, 46.424068998999985 ], [ 1.4151855, 46.347215001 ], [ 1.177279, 46.383947998 ], [ 0.867469, 46.748216499000023 ], [ 0.05383, 47.163733499999978 ], [ 0.230000500000017, 47.608397498999977 ], [ 0.614432500000021, 47.694215497000016 ], [ 0.841217498999981, 48.103059501000018 ], [ 0.797658499000022, 48.19445500099999 ], [ 0.35289, 48.459688 ], [ -0.054527, 48.3820045 ], [ -0.234124, 48.561847500999988 ], [ -0.86036, 48.501458499000023 ], [ -1.070164499999976, 48.508492 ], [ -1.42794, 48.461915001000023 ], [ -1.571089500000028, 48.626442 ], [ -1.391047, 48.644653496999979 ], [ -1.942384, 49.725959999 ], [ -1.266492500000027, 49.69559849699999 ], [ -1.3077705, 49.545719 ], [ -1.11962, 49.35556800099999 ], 
[ 0.297224500000027, 49.429863001 ], [ 0.3389785, 49.4409255 ], [ 0.065609, 49.512576998999975 ], [ 0.192298, 49.706962498999985 ], [ 1.379698, 50.06500999799999 ], [ 1.641539500000022, 50.352149997000026 ], [ 3.090252, 50.053740499000014 ], [ 3.1727045, 50.011996497999974 ], [ 4.140853, 49.978759997 ] ], [ [ 1.704359, 49.232196997000017 ], [ 1.608799, 49.077894 ], [ 1.501526500000011, 48.941051997999978 ], [ 1.9221465, 48.457599499000025 ], [ 1.99409, 48.286584 ], [ 2.402663, 48.3207175 ], [ 2.476016500000014, 48.1296365 ], [ 2.936316, 48.163391499999989 ], [ 3.049454, 48.360027499000012 ], [ 3.414788999, 48.390268499 ], [ 3.557419164, 48.616713788000027 ], [ 3.4851835, 48.851910498999985 ], [ 3.07188, 49.117553500999975 ], [ 2.5905285, 49.079653999000016 ], [ 1.704359, 49.232196997000017 ] ] ] } },
-{ "type": "Feature", "id": 1014, "properties": { "NUTS_ID": "FR3", "STAT_LEVL_": 1, "SHAPE_AREA": 1.5419431428799999, "SHAPE_LEN": 6.4342402718200002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 2.863276, 50.708343496999987 ], [ 3.0187085, 50.773532999 ], [ 3.098481, 50.779019 ], [ 3.176996, 50.756164497999976 ], [ 3.245294, 50.713009498000019 ], [ 3.286492, 50.52756899799999 ], [ 3.615081499999974, 50.490399001000014 ], [ 3.65551, 50.461735498999985 ], [ 3.710389, 50.303165499999977 ], [ 4.027774500000021, 50.358330498999976 ], [ 4.140853, 49.978759997 ], [ 3.1727045, 50.011996497999974 ], [ 3.090252, 50.053740499000014 ], [ 1.641539500000022, 50.352149997000026 ], [ 1.580953, 50.8695525 ], [ 2.067705, 51.00649999699999 ], [ 2.546011, 51.089381998000022 ], [ 2.607036, 50.91268949900001 ], [ 2.863276, 50.708343496999987 ] ] ] } },
-{ "type": "Feature", "id": 1018, "properties": { "NUTS_ID": "FR4", "STAT_LEVL_": 1, "SHAPE_AREA": 5.8831875126900002, "SHAPE_LEN": 12.4740951319 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 5.818117, 49.5463105 ], [ 5.893386, 49.496944498 ], [ 6.367107499999975, 49.469507001000011 ], [ 6.556986, 49.419208498999978 ], [ 6.723465499999975, 49.218828998999982 ], [ 7.101069, 49.155998 ], [ 7.36875550000002, 49.16145799899999 ], [ 7.635651, 49.053950499999985 ], [ 7.910654500000021, 49.045163499000012 ], [ 8.068407499999978, 48.999316497999985 ], [ 8.232632998999975, 48.966571500999976 ], [ 7.9596305, 48.718580998999983 ], [ 7.680713, 48.257266996999988 ], [ 7.5779195, 48.121391999000025 ], [ 7.577291, 48.115654999000014 ], [ 7.545990500000016, 47.743573998999977 ], [ 7.589039, 47.589877999 ], [ 7.5551595, 47.56456399699999 ], [ 7.510905499999978, 47.502582497999981 ], [ 7.445019, 47.46172349699998 ], [ 7.42113949899999, 47.446387999000024 ], [ 7.380894, 47.431892499000014 ], [ 7.326466, 47.439853500000027 ], [ 7.130353, 47.503040499 ], [ 6.939186, 47.433704499999976 ], [ 6.879806, 47.352439998000023 ], [ 7.061636, 47.34373449899999 ], [ 6.858914, 47.165294499000026 ], [ 6.460011, 46.851551498999982 ], [ 6.138106, 46.557666997000013 ], [ 6.064003, 46.416228997000019 ], [ 5.4736565, 46.264284001000021 ], [ 5.310563499000011, 46.446769999000026 ], [ 5.440604, 46.637912 ], [ 5.255232499999977, 46.979887498999972 ], [ 5.518537499999979, 47.304183998999974 ], [ 5.375406999, 47.460167499000022 ], [ 5.477542, 47.60871849900002 ], [ 5.37408, 47.604537999 ], [ 5.8847235, 47.92604649899999 ], [ 5.470055, 48.420926499000018 ], [ 4.9884305, 48.684422000999973 ], [ 4.950990499999989, 49.23686749699999 ], [ 5.107566500000019, 49.584689498999978 ], [ 5.393511, 49.617111 ], [ 5.470882999000025, 49.49723799899999 ], [ 5.734555999, 49.545690501000024 ], [ 5.818117, 49.5463105 ] ] ] } },
-{ "type": "Feature", "id": 1032, "properties": { "NUTS_ID": "FR5", "STAT_LEVL_": 1, "SHAPE_AREA": 9.9106182720800007, "SHAPE_LEN": 18.907034111600002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -1.571089500000028, 48.626442 ], [ -1.42794, 48.461915001000023 ], [ -1.070164499999976, 48.508492 ], [ -0.86036, 48.501458499000023 ], [ -0.234124, 48.561847500999988 ], [ -0.054527, 48.3820045 ], [ 0.35289, 48.459688 ], [ 0.797658499000022, 48.19445500099999 ], [ 0.841217498999981, 48.103059501000018 ], [ 0.614432500000021, 47.694215497000016 ], [ 0.230000500000017, 47.608397498999977 ], [ 0.05383, 47.163733499999978 ], [ 0.867469, 46.748216499000023 ], [ 1.177279, 46.383947998 ], [ 0.823432500000024, 46.128584499 ], [ 0.629741, 45.714569997000012 ], [ 0.004336, 45.191628 ], [ -0.040197499999977, 45.10237999899999 ], [ -0.708231, 45.327480498999989 ], [ -1.230697500000019, 45.680572498 ], [ -1.053074, 46.00383 ], [ -1.129406, 46.310271999 ], [ -1.8123445, 46.49341949799998 ], [ -2.141985, 46.81897349799999 ], [ -1.980413, 47.028907999000012 ], [ -2.547108001000026, 47.292373499 ], [ -2.458493, 47.448119999000028 ], [ -3.528605313000014, 47.778291696999986 ], [ -4.734697499999982, 48.038242497999988 ], [ -4.269574, 48.132770500999982 ], [ -4.2257935, 48.28749099700002 ], [ -4.273602501000028, 48.443435500000021 ], [ -4.772754, 48.329235 ], [ -4.7699215, 48.521136999000021 ], [ -3.950495, 48.652869998000028 ], [ -3.65915050000001, 48.659209998999984 ], [ -3.084001, 48.865695999000025 ], [ -2.6863985, 48.493152499000018 ], [ -2.123706500000026, 48.604404499 ], [ -2.148958, 48.633731996999984 ], [ -2.045604, 48.63681399699999 ], [ -2.006897, 48.566108498 ], [ -1.982181075000028, 48.554643027999987 ], [ -1.872124, 48.645713999 ], [ -1.571089500000028, 48.626442 ] ] ] } },
-{ "type": "Feature", "id": 1036, "properties": { "NUTS_ID": "HU1", "STAT_LEVL_": 1, "SHAPE_AREA": 0.86330293364900001, "SHAPE_LEN": 4.1065573863199996 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 18.965907500000014, 47.028965 ], [ 18.688433499999974, 47.577067499 ], [ 18.848478, 47.818228001000023 ], [ 18.7548155, 47.975082499 ], [ 18.928392, 48.056832499 ], [ 19.086134500000014, 47.838170499 ], [ 19.570937, 47.734900001000028 ], [ 19.666329, 47.588550000999987 ], [ 19.990462499999978, 47.346628001 ], [ 20.094639, 47.0074265 ], [ 18.965907500000014, 47.028965 ] ] ] } },
-{ "type": "Feature", "id": 1040, "properties": { "NUTS_ID": "HU2", "STAT_LEVL_": 1, "SHAPE_AREA": 4.3137756546799997, "SHAPE_LEN": 9.3225174596299993 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 18.848478, 47.818228001000023 ], [ 18.688433499999974, 47.577067499 ], [ 18.965907500000014, 47.028965 ], [ 18.925097, 46.85716850099999 ], [ 19.006243499999982, 46.704923496999982 ], [ 18.802291, 46.108764999000016 ], [ 18.821306, 45.914382997000018 ], [ 18.446884, 45.737048997999977 ], [ 17.911642, 45.790950996999982 ], [ 17.651499, 45.847835499999974 ], [ 17.294325, 45.988544500999978 ], [ 16.8760435, 46.320602496999982 ], [ 16.854754999000022, 46.350440999 ], [ 16.596805, 46.475902498999972 ], [ 16.3707935, 46.722243499 ], [ 16.113849, 46.869067998999981 ], [ 16.508267499999988, 47.001255999000023 ], [ 16.4337615, 47.352918499999987 ], [ 16.64622, 47.446597 ], [ 16.652076, 47.6229035 ], [ 16.421846, 47.664704498999981 ], [ 17.093074, 47.708236 ], [ 17.1607975, 48.006656501 ], [ 17.247427500000015, 48.012008998999988 ], [ 17.705436500000019, 47.758992498999987 ], [ 17.893923, 47.739456999000026 ], [ 18.848478, 47.818228001000023 ] ] ] } },
-{ "type": "Feature", "id": 1053, "properties": { "NUTS_ID": "HU3", "STAT_LEVL_": 1, "SHAPE_AREA": 5.8422609989199996, "SHAPE_LEN": 13.2434117271 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 22.155306, 48.403396499 ], [ 22.896270500000014, 47.954120500999977 ], [ 22.1808375, 47.600094499000022 ], [ 22.12832, 47.598089496999989 ], [ 21.658955, 47.022131498000022 ], [ 21.441398, 46.651467 ], [ 21.103170499999976, 46.262590497000019 ], [ 20.7756, 46.275909999000021 ], [ 20.705303500000014, 46.160937499 ], [ 20.264296, 46.1263735 ], [ 19.86539, 46.150334499 ], [ 19.698099, 46.187930999 ], [ 19.3027095, 45.991550500000017 ], [ 18.889734, 45.921180497000023 ], [ 18.821306, 45.914382997000018 ], [ 18.802291, 46.108764999000016 ], [ 19.006243499999982, 46.704923496999982 ], [ 18.925097, 46.85716850099999 ], [ 18.965907500000014, 47.028965 ], [ 20.094639, 47.0074265 ], [ 19.990462499999978, 47.346628001 ], [ 19.666329, 47.588550000999987 ], [ 19.570937, 47.734900001000028 ], [ 19.086134500000014, 47.838170499 ], [ 18.928392, 48.056832499 ], [ 19.0143225, 48.077736499000025 ], [ 20.051879, 48.16770399699999 ], [ 20.463937, 48.463967 ], [ 21.440056, 48.585232999000027 ], [ 21.721957, 48.351050499 ], [ 22.121077500000013, 48.378311499 ], [ 22.155306, 48.403396499 ] ] ] } },
-{ "type": "Feature", "id": 1067, "properties": { "NUTS_ID": "IE0", "STAT_LEVL_": 1, "SHAPE_AREA": 9.3568211137299997, "SHAPE_LEN": 24.5137708524 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -8.177718, 54.464973500999974 ], [ -7.565956, 54.126514499 ], [ -7.0286355, 54.421306497999979 ], [ -6.623778500000014, 54.036548497000013 ], [ -6.2680155, 54.102337 ], [ -6.1031835, 53.99999950099999 ], [ -6.3811025, 54.012704 ], [ -6.247098, 53.722454500000026 ], [ -6.2148995, 53.633406998999988 ], [ -6.102773, 53.210773 ], [ -6.144516501, 52.737696498999981 ], [ -6.360401500000023, 52.174297998999975 ], [ -7.842156, 51.954218498999978 ], [ -8.534105, 51.603961498999979 ], [ -9.820953499999973, 51.44911749900001 ], [ -9.4395065, 51.723395000999972 ], [ -10.143066499999975, 51.597052999000027 ], [ -10.359855687999982, 51.89811573899999 ], [ -10.292081, 51.9134095 ], [ -10.305404416999977, 51.91783544499998 ], [ -9.792653, 52.155574499000011 ], [ -10.472409, 52.180784501 ], [ -9.365306771, 52.572083917999976 ], [ -9.365192, 52.572124499999973 ], [ -8.733330465999984, 52.662677988999974 ], [ -8.72813801, 52.663422141000012 ], [ -8.728047772000025, 52.663435361999973 ], [ -8.728093624999985, 52.663433558 ], [ -8.733924531000014, 52.66327281299999 ], [ -9.545411424, 52.640890680999973 ], [ -9.546329543000013, 52.640865521000023 ], [ -9.546003731999974, 52.641486854999982 ], [ -9.282234397000025, 53.14464463600001 ], [ -9.282228877000023, 53.144654906000028 ], [ -9.282189335999988, 53.14465521599999 ], [ -9.009308499999975, 53.140817498999979 ], [ -9.009085012000014, 53.141198864999978 ], [ -8.933212, 53.270670997000025 ], [ -9.63143794299998, 53.296942788000024 ], [ -9.630072499999983, 53.297348496999973 ], [ -9.63144528700002, 53.297345429000018 ], [ -10.179155499999979, 53.406988999000021 ], [ -10.189941, 53.556088500999977 ], [ -9.80073826, 53.608912261 ], [ -9.900522, 53.764440997 ], [ -9.546066, 53.88420849900001 ], [ -10.070641500000022, 54.27627250099999 ], [ 
-9.132258499999978, 54.162360498 ], [ -8.508256, 54.216985497999985 ], [ -8.134833, 54.603353501000015 ], [ -8.7886565, 54.688915996999981 ], [ -8.281798499999979, 55.158847998 ], [ -7.339527, 55.374958 ], [ -6.924876, 55.233694500000013 ], [ -7.256068500000026, 55.067034998 ], [ -7.534506, 54.74713900099999 ], [ -7.921372, 54.696544499000026 ], [ -7.703411501, 54.608287999000027 ], [ -8.177718, 54.464973500999974 ] ] ] } },
-{ "type": "Feature", "id": 1090, "properties": { "NUTS_ID": "ITF", "STAT_LEVL_": 1, "SHAPE_AREA": 7.7655461834499997, "SHAPE_LEN": 20.516959983 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 17.388981, 40.89186349900001 ], [ 18.097453, 40.515377999 ], [ 18.517056, 40.135147999000026 ], [ 18.369068, 39.793742001 ], [ 18.047947500000021, 39.92847849899999 ], [ 17.763504499000021, 40.295507 ], [ 17.127098, 40.517620500000021 ], [ 16.867264, 40.397845499000027 ], [ 16.644317, 40.119095999000024 ], [ 16.527955500000019, 39.668659500999979 ], [ 17.025064499999985, 39.483578498999975 ], [ 17.188881, 39.020471998 ], [ 16.890732, 38.92712849899999 ], [ 16.566985499999987, 38.765481998999974 ], [ 16.5820845, 38.46979049700002 ], [ 16.0638095, 37.924418996999975 ], [ 15.678212, 37.954119498000011 ], [ 15.636063, 38.231473498000014 ], [ 15.652291742999978, 38.247265964 ], [ 15.687350835000018, 38.281382564000012 ], [ 15.918956, 38.50676149899999 ], [ 15.8481175, 38.658864498000014 ], [ 16.2137535, 38.810439998999982 ], [ 16.093153, 39.048742998000023 ], [ 15.75595850000002, 39.923496 ], [ 15.644945499000016, 40.042790498999977 ], [ 15.356917, 39.999090997999986 ], [ 14.787251500000025, 40.666180498000017 ], [ 14.747420499999976, 40.67764649899999 ], [ 14.692414499999984, 40.634065998999972 ], [ 14.528490499999975, 40.607152498 ], [ 14.46862, 40.620201499000018 ], [ 14.324673, 40.569084998999983 ], [ 14.460677499999974, 40.742930500999989 ], [ 14.0321305, 40.898838997999974 ], [ 13.760795499999972, 41.223167997000019 ], [ 13.8737185, 41.338294001 ], [ 13.977928, 41.462454 ], [ 13.941038499, 41.68794399699999 ], [ 13.296299499999975, 41.948588500000028 ], [ 13.0305745, 42.115353 ], [ 13.1913805, 42.587491 ], [ 13.39403149899999, 42.591233999 ], [ 13.3577745, 42.694069997999975 ], [ 13.9157315, 42.894577 ], [ 14.1469065, 42.530596499000012 ], [ 14.253960499000016, 42.444798499 ], [ 14.779646999000022, 42.070021499 ], [ 15.138178499999981, 41.927007499000013 ], [ 
16.17653150000001, 41.884789500000011 ], [ 15.89714, 41.603231999 ], [ 16.024738500000012, 41.425590499 ], [ 16.542302499000016, 41.229441 ], [ 17.388981, 40.89186349900001 ] ] ] } },
-{ "type": "Feature", "id": 1121, "properties": { "NUTS_ID": "ITG", "STAT_LEVL_": 1, "SHAPE_AREA": 5.1115389580599997, "SHAPE_LEN": 15.519685297200001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 15.572551, 38.234204998999985 ], [ 15.2580825, 37.807223997999984 ], [ 15.091499, 37.3577085 ], [ 15.315962500000012, 37.035012498000015 ], [ 15.116582, 36.674955999000019 ], [ 15.000358, 36.702847497999983 ], [ 14.493577, 36.78673349799999 ], [ 14.337594500000023, 37.001979999000014 ], [ 14.036207, 37.10602799899999 ], [ 12.896496, 37.577118498 ], [ 12.672787, 37.560080998999979 ], [ 12.4429295, 37.811051998999972 ], [ 12.561744499999975, 38.065561998000021 ], [ 12.731687, 38.180297499 ], [ 12.811005, 38.081515497999987 ], [ 12.9772625, 38.040205999000023 ], [ 13.36569350000002, 38.182289498999978 ], [ 13.7438295, 37.97024649799999 ], [ 14.183486, 38.019396498999981 ], [ 15.652927, 38.26710399699999 ], [ 15.572551, 38.234204998999985 ] ] ], [ [ [ 9.423461998999983, 41.176741000999982 ], [ 9.748871999000016, 40.660406997999985 ], [ 9.8270205, 40.512112499000011 ], [ 9.626526999000021, 40.224876998000013 ], [ 9.7352045, 40.075661998999976 ], [ 9.6511185, 39.549323997999977 ], [ 9.555865, 39.133966997000016 ], [ 9.1099865, 39.214018501 ], [ 8.859653, 38.877469498999972 ], [ 8.612392, 38.957787998000015 ], [ 8.368104499000026, 39.213810500000022 ], [ 8.393, 39.446529499 ], [ 8.5022545, 39.713006999000015 ], [ 8.399809, 40.407568498999979 ], [ 8.135010500000021, 40.736348998999972 ], [ 8.201709, 40.97256649799999 ], [ 8.4171275, 40.83831349899998 ], [ 8.802688499999988, 40.931047998 ], [ 9.163662, 41.235624001000019 ], [ 9.423461998999983, 41.176741000999982 ] ] ] ] } },
-{ "type": "Feature", "id": 1141, "properties": { "NUTS_ID": "ITH", "STAT_LEVL_": 1, "SHAPE_AREA": 7.1090756829000004, "SHAPE_LEN": 19.566307989799999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 12.690635, 46.656972000999986 ], [ 12.731392998999979, 46.634288 ], [ 13.504249500000014, 46.566303998000024 ], [ 13.714184999, 46.522703499999977 ], [ 13.684032, 46.437472501 ], [ 13.3754925, 46.29823249899999 ], [ 13.664347500000019, 46.177549999 ], [ 13.496938999, 46.051334999 ], [ 13.597145, 45.81952250099999 ], [ 13.596243, 45.807937501000026 ], [ 13.9186565, 45.63351749899999 ], [ 13.7228235, 45.594725497000013 ], [ 13.579783, 45.786951499 ], [ 13.40649350000001, 45.725175001000025 ], [ 13.130853, 45.771844499999986 ], [ 13.098785501, 45.644532997 ], [ 12.434162500000014, 45.424485996999977 ], [ 12.631836, 45.534338499 ], [ 12.413683999, 45.544310500999984 ], [ 12.132562, 45.300370499999985 ], [ 12.200494, 45.257337 ], [ 12.2045665, 45.197675998000022 ], [ 12.330571500000019, 45.1605505 ], [ 12.328777, 45.151860999 ], [ 12.302155605999985, 45.13788754699999 ], [ 12.279706042999976, 45.123266828 ], [ 12.399056499999972, 44.792615501 ], [ 12.269892, 44.630007497 ], [ 12.384282, 44.224726499999974 ], [ 12.450349, 44.162189496999986 ], [ 12.75078, 43.970601500999976 ], [ 12.669299, 43.823227498999984 ], [ 12.493956, 43.91554650099999 ], [ 12.513435500000014, 43.991269497000019 ], [ 12.4046075, 43.955678501000023 ], [ 12.4177, 43.899029997000014 ], [ 12.2837975, 43.764898997999978 ], [ 12.107464, 43.75375349699999 ], [ 11.98652850000002, 43.761913998000011 ], [ 11.710189, 43.877430498000024 ], [ 11.715906500000017, 44.122540499000024 ], [ 11.524959500000023, 44.157639999000025 ], [ 11.202439500000025, 44.100721 ], [ 11.049449, 44.090229998999973 ], [ 10.814787, 44.11617449900001 ], [ 10.624078, 44.120351501000016 ], [ 10.470149, 44.226041001 ], [ 10.253875, 44.268566998999972 ], [ 10.142054499999972, 44.353852999000026 ], [ 9.686725500000023, 
44.365921999000022 ], [ 9.4790595, 44.40924050000001 ], [ 9.493363, 44.555858999 ], [ 9.202995499999986, 44.613476499 ], [ 9.200074, 44.686099496999987 ], [ 9.3246605, 44.69 ], [ 9.548684499999979, 45.132648499000027 ], [ 9.8911015, 45.130898 ], [ 10.083503, 45.04395850100002 ], [ 10.464030499999978, 44.937171501000023 ], [ 10.504344, 44.922417999 ], [ 10.887909499999978, 44.914265999 ], [ 11.246231500000022, 44.951427997 ], [ 11.426765499999988, 44.950076 ], [ 11.205502500000023, 45.109487499000011 ], [ 10.6546555, 45.415826998999989 ], [ 10.631074, 45.609512500999983 ], [ 10.840176499999984, 45.832758501 ], [ 10.502388, 45.830373 ], [ 10.515751500000022, 46.34322449699999 ], [ 10.622145499999988, 46.448101999000016 ], [ 10.452801, 46.530682999000021 ], [ 10.4696515, 46.854909 ], [ 11.02225, 46.765410498999984 ], [ 11.164281500000016, 46.965722500000027 ], [ 11.627199500000017, 47.013299 ], [ 12.136014, 47.0806675 ], [ 12.2407455, 47.069168499 ], [ 12.143811, 46.913779998 ], [ 12.477924, 46.679835497999989 ], [ 12.690635, 46.656972000999986 ] ] ] } },
-{ "type": "Feature", "id": 1169, "properties": { "NUTS_ID": "ITI", "STAT_LEVL_": 1, "SHAPE_AREA": 6.33140759553, "SHAPE_LEN": 14.0181138018 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 12.4177, 43.899029997000014 ], [ 12.493956, 43.91554650099999 ], [ 12.669299, 43.823227498999984 ], [ 12.75078, 43.970601500999976 ], [ 13.172615, 43.75034599899999 ], [ 13.6420965, 43.474142498999981 ], [ 13.742970499000023, 43.29414699900002 ], [ 13.849453, 43.066659497999979 ], [ 13.9157315, 42.894577 ], [ 13.3577745, 42.694069997999975 ], [ 13.39403149899999, 42.591233999 ], [ 13.1913805, 42.587491 ], [ 13.0305745, 42.115353 ], [ 13.296299499999975, 41.948588500000028 ], [ 13.941038499, 41.68794399699999 ], [ 13.977928, 41.462454 ], [ 13.8737185, 41.338294001 ], [ 13.760795499999972, 41.223167997000019 ], [ 13.067982, 41.221979 ], [ 12.7733215, 41.416263999000023 ], [ 11.733844, 42.15805899899999 ], [ 11.449938499999973, 42.377670501000011 ], [ 11.0979815, 42.393149499 ], [ 11.1765795, 42.541544 ], [ 10.7056725, 42.9418645 ], [ 10.499046, 42.93527050099999 ], [ 10.528247498999974, 43.231603999000015 ], [ 10.299847, 43.581928999000013 ], [ 10.258118, 43.815146998999978 ], [ 10.143476, 43.97542049899999 ], [ 10.018769500000019, 44.044535998000015 ], [ 9.686725500000023, 44.365921999000022 ], [ 10.142054499999972, 44.353852999000026 ], [ 10.253875, 44.268566998999972 ], [ 10.470149, 44.226041001 ], [ 10.624078, 44.120351501000016 ], [ 10.814787, 44.11617449900001 ], [ 11.049449, 44.090229998999973 ], [ 11.202439500000025, 44.100721 ], [ 11.524959500000023, 44.157639999000025 ], [ 11.715906500000017, 44.122540499000024 ], [ 11.710189, 43.877430498000024 ], [ 11.98652850000002, 43.761913998000011 ], [ 12.107464, 43.75375349699999 ], [ 12.2837975, 43.764898997999978 ], [ 12.4177, 43.899029997000014 ] ], [ [ 12.4466885, 41.901744998000026 ], [ 12.458003, 41.901485498999989 ], [ 12.45582, 41.907197499 ], [ 12.4466885, 41.901744998000026 ] ] ] } },
-{ "type": "Feature", "id": 1197, "properties": { "NUTS_ID": "LI0", "STAT_LEVL_": 1, "SHAPE_AREA": 0.018972630538699999, "SHAPE_LEN": 0.59777272911599999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 9.607078, 47.0607745 ], [ 9.4760475, 47.051797498999974 ], [ 9.530749, 47.270581000999982 ], [ 9.620580500000017, 47.151645499999972 ], [ 9.607078, 47.0607745 ] ] ] } },
-{ "type": "Feature", "id": 1201, "properties": { "NUTS_ID": "LT0", "STAT_LEVL_": 1, "SHAPE_AREA": 9.1183198946200008, "SHAPE_LEN": 14.746266998499999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 24.91819, 56.440865 ], [ 25.0920375, 56.186041998 ], [ 25.821386500000017, 56.052179 ], [ 26.0461545, 55.944105998999987 ], [ 26.630365, 55.680666998999982 ], [ 26.553886, 55.388916000999984 ], [ 26.8356225, 55.285648498 ], [ 26.743454, 55.254102998 ], [ 25.779577500000016, 54.854839501000015 ], [ 25.762614, 54.5768945 ], [ 25.5319225, 54.342031498999972 ], [ 25.779838, 54.160037000999978 ], [ 25.497109500000022, 54.309776499 ], [ 24.835949500000027, 54.149027499 ], [ 24.435035500000026, 53.901003 ], [ 23.51465, 53.956559997999989 ], [ 23.321498500000018, 54.25332599799998 ], [ 22.792095500000016, 54.363358996999978 ], [ 22.588968, 55.070251498 ], [ 21.651070999000012, 55.179983500999981 ], [ 21.271226238999986, 55.244369351999978 ], [ 21.2643495, 55.245534997999982 ], [ 21.064238, 56.069136999000023 ], [ 21.978209, 56.385142998999982 ], [ 22.635563499999989, 56.368168 ], [ 22.920517, 56.39914299899999 ], [ 24.1518, 56.253348501 ], [ 24.91819, 56.440865 ] ] ] } },
-{ "type": "Feature", "id": 1214, "properties": { "NUTS_ID": "LU0", "STAT_LEVL_": 1, "SHAPE_AREA": 0.334562602948, "SHAPE_LEN": 2.34710743195 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 6.137662499999976, 50.129951499000015 ], [ 6.4749625, 49.821274999000025 ], [ 6.380052499999977, 49.551104999000017 ], [ 6.367107499999975, 49.469507001000011 ], [ 5.893386, 49.496944498 ], [ 5.818117, 49.5463105 ], [ 5.910688, 49.662388001000011 ], [ 5.746319, 49.853595 ], [ 6.0248995, 50.182779498 ], [ 6.137662499999976, 50.129951499000015 ] ] ] } },
-{ "type": "Feature", "id": 1218, "properties": { "NUTS_ID": "LV0", "STAT_LEVL_": 1, "SHAPE_AREA": 9.6020568119899998, "SHAPE_LEN": 18.2246422222 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 27.351579, 57.51823699900001 ], [ 27.690777, 57.37056099900002 ], [ 27.8701565, 57.28979499899998 ], [ 27.6594445, 56.8343655 ], [ 28.192765, 56.448565998999982 ], [ 28.154555500000015, 56.169843501 ], [ 27.615505499999983, 55.787071501000014 ], [ 26.630365, 55.680666998999982 ], [ 26.0461545, 55.944105998999987 ], [ 25.821386500000017, 56.052179 ], [ 25.0920375, 56.186041998 ], [ 24.91819, 56.440865 ], [ 24.1518, 56.253348501 ], [ 22.920517, 56.39914299899999 ], [ 22.635563499999989, 56.368168 ], [ 21.978209, 56.385142998999982 ], [ 21.064238, 56.069136999000023 ], [ 20.97067850000002, 56.352584497 ], [ 21.05211, 56.823620999000013 ], [ 21.3748955, 57.00309699799999 ], [ 21.419337, 57.290952499000014 ], [ 21.7031505, 57.5685305 ], [ 22.6050035, 57.758573999000021 ], [ 23.146890499999984, 57.316258 ], [ 23.304514, 57.064289499999973 ], [ 23.934532499999989, 57.006370496999978 ], [ 24.119202499999972, 57.086290001 ], [ 24.410199499999976, 57.266025501 ], [ 24.352817500000015, 57.876556500999982 ], [ 24.834082500000022, 57.9727795 ], [ 25.046307, 58.040146000999982 ], [ 26.056459, 57.848431496999979 ], [ 26.524905499999988, 57.516247497999984 ], [ 27.351579, 57.51823699900001 ] ] ] } },
-{ "type": "Feature", "id": 1227, "properties": { "NUTS_ID": "ME0", "STAT_LEVL_": 1, "SHAPE_AREA": 1.46919018022, "SHAPE_LEN": 5.89019509358 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 18.525213, 42.420462997000016 ], [ 18.438104, 42.555705000999978 ], [ 18.5672495, 42.651405497999974 ], [ 18.491927998999984, 42.983962997999981 ], [ 18.683405, 43.245486998999979 ], [ 19.224069, 43.527541 ], [ 20.063936500000011, 43.006823999 ], [ 20.352928500000019, 42.833381498999984 ], [ 20.024653, 42.765138498999988 ], [ 20.0763, 42.555823498999985 ], [ 19.621872, 42.589744498000016 ], [ 19.282484, 42.180015502 ], [ 19.372067, 41.850320497999974 ], [ 18.689521633000027, 42.464353484000014 ], [ 18.525213, 42.420462997000016 ] ] ] } },
-{ "type": "Feature", "id": 1231, "properties": { "NUTS_ID": "MK0", "STAT_LEVL_": 1, "SHAPE_AREA": 2.70884348404, "SHAPE_LEN": 6.7098309700699996 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 22.9275915, 41.338539498999978 ], [ 22.879178500000023, 41.340652998 ], [ 22.732037, 41.146391498000014 ], [ 22.332052, 41.120272500999988 ], [ 22.216216, 41.170457498000019 ], [ 21.929438, 41.100350498000012 ], [ 21.787378, 40.9311255 ], [ 20.980204500000013, 40.855665 ], [ 20.837823, 40.927682999000012 ], [ 20.515577, 41.230955500999983 ], [ 20.456284499999981, 41.554024001000016 ], [ 20.557774, 41.581870502000015 ], [ 20.594286, 41.877327498 ], [ 21.10894, 42.206170998 ], [ 21.2126975, 42.110701001 ], [ 21.4443185, 42.234930498999972 ], [ 21.58694650000001, 42.262817498 ], [ 22.3602065, 42.311157001000026 ], [ 22.510412, 42.15515849799999 ], [ 22.867214, 42.022199496999974 ], [ 22.968327499999987, 41.51983549900001 ], [ 22.9275915, 41.338539498999978 ] ] ] } },
-{ "type": "Feature", "id": 1242, "properties": { "NUTS_ID": "MT0", "STAT_LEVL_": 1, "SHAPE_AREA": 0.0251866693234, "SHAPE_LEN": 1.05686311101 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 14.5720435, 35.849754497999982 ], [ 14.535398, 35.807502998000018 ], [ 14.416679499999987, 35.828189499000018 ], [ 14.342219999, 35.873474498 ], [ 14.319842, 35.970368 ], [ 14.376091499999973, 35.987453999000024 ], [ 14.350481, 35.969952498999987 ], [ 14.5720435, 35.849754497999982 ] ] ], [ [ [ 14.336181, 36.032258499000022 ], [ 14.218846, 36.020974497999987 ], [ 14.186361, 36.036271999 ], [ 14.184842, 36.074078498 ], [ 14.300056499999982, 36.057194998999989 ], [ 14.336181, 36.032258499000022 ] ] ] ] } },
-{ "type": "Feature", "id": 1247, "properties": { "NUTS_ID": "NL1", "STAT_LEVL_": 1, "SHAPE_AREA": 1.12198002137, "SHAPE_LEN": 4.9628447500900004 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 7.208935, 53.243064498000024 ], [ 7.202794499999982, 53.113281498999982 ], [ 7.092692, 52.83820099899998 ], [ 7.006229500000018, 52.63876299899999 ], [ 6.709732499999973, 52.627823498999987 ], [ 6.629429500000015, 52.669658500000025 ], [ 6.1630035, 52.680061498999976 ], [ 6.119814, 52.854267000999982 ], [ 5.819826499999976, 52.81731750099999 ], [ 5.795148499999982, 52.806499498999983 ], [ 5.377261499999975, 52.764805 ], [ 5.167425499999979, 52.998798498999975 ], [ 5.164383499999985, 53.000910501000021 ], [ 5.4112285, 53.151724498000021 ], [ 6.1913015, 53.410937999 ], [ 6.882578, 53.440654999 ], [ 6.874905, 53.408012998 ], [ 7.0927125, 53.25701749699999 ], [ 7.208935, 53.243064498000024 ] ] ] } },
-{ "type": "Feature", "id": 1260, "properties": { "NUTS_ID": "NL2", "STAT_LEVL_": 1, "SHAPE_AREA": 1.46019485675, "SHAPE_LEN": 6.53269045452 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 6.709732499999973, 52.627823498999987 ], [ 6.697865499999978, 52.486285998000028 ], [ 6.987941499999977, 52.469540999 ], [ 7.065685, 52.241372999000021 ], [ 6.760465, 52.118569499999978 ], [ 6.828513, 51.9640665 ], [ 6.4077795, 51.828092001000016 ], [ 6.167766, 51.900804499 ], [ 5.953192, 51.747845998 ], [ 5.8651585, 51.757407999 ], [ 5.597879499999976, 51.828082999 ], [ 5.128057500000011, 51.73761 ], [ 5.000534, 51.820937999000023 ], [ 5.149456, 51.933452499 ], [ 5.606011500000022, 51.943248498 ], [ 5.404633, 52.249629998999978 ], [ 5.335462, 52.29021849899999 ], [ 5.079161, 52.388653 ], [ 5.0604265, 52.578937499 ], [ 5.377261499999975, 52.764805 ], [ 5.795148499999982, 52.806499498999983 ], [ 5.819826499999976, 52.81731750099999 ], [ 6.119814, 52.854267000999982 ], [ 6.1630035, 52.680061498999976 ], [ 6.629429500000015, 52.669658500000025 ], [ 6.709732499999973, 52.627823498999987 ] ] ] } },
-{ "type": "Feature", "id": 1279, "properties": { "NUTS_ID": "IS0", "STAT_LEVL_": 1, "SHAPE_AREA": 19.582305420200001, "SHAPE_LEN": 42.765015966100002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -16.194012499999985, 66.537420000999987 ], [ -15.7608105, 66.280061499999988 ], [ -14.530650501000025, 66.377677 ], [ -15.1874295, 66.107847999 ], [ -14.604863500000022, 65.959309 ], [ -14.841373499999975, 65.725313001000018 ], [ -13.606742500999985, 65.510168 ], [ -13.495021, 65.076342496999985 ], [ -14.962440500000014, 64.23983099899999 ], [ -18.177745, 63.458595998000021 ], [ -22.007397, 63.835987500999977 ], [ -22.13086850000002, 63.836598001000027 ], [ -22.749017, 63.970331498 ], [ -22.125507, 64.040599 ], [ -21.3700475, 64.380325998999979 ], [ -22.031621500000028, 64.303023999 ], [ -22.4159795, 64.812200999000027 ], [ -24.04781, 64.87882700099999 ], [ -21.8073225, 65.025850000999981 ], [ -21.7099895, 65.15945799799999 ], [ -22.560879, 65.169345998999972 ], [ -21.681208, 65.451475 ], [ -24.532042499999989, 65.503079 ], [ -24.099384, 65.805268499000022 ], [ -23.171936500000015, 65.774955998 ], [ -23.868307, 65.886556499999983 ], [ -23.182204500000012, 65.837784498000019 ], [ -23.819687, 66.034912 ], [ -23.478353, 66.194693500000028 ], [ -22.360858, 65.925525999 ], [ -22.976494, 66.222132499999987 ], [ -22.362860500000011, 66.2698555 ], [ -23.1958975, 66.350033997000025 ], [ -22.9317105, 66.4693605 ], [ -21.327746499999989, 66.006199001000027 ], [ -21.3420165, 65.734767501000022 ], [ -21.7825325, 65.764513497999985 ], [ -21.094844500000022, 65.447562999000013 ], [ -20.264036, 65.725345499000014 ], [ -20.421178, 66.084051499999987 ], [ -19.473689, 65.736606500999983 ], [ -19.361081500000012, 65.829324 ], [ -19.4362145, 66.0476645 ], [ -18.77751, 66.191427001000022 ], [ -17.4125985, 65.992246497 ], [ -16.194012499999985, 66.537420000999987 ] ] ] } },
-{ "type": "Feature", "id": 1284, "properties": { "NUTS_ID": "ITC", "STAT_LEVL_": 1, "SHAPE_AREA": 6.6847335358500004, "SHAPE_LEN": 17.1356662484 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 10.452801, 46.530682999000021 ], [ 10.622145499999988, 46.448101999000016 ], [ 10.515751500000022, 46.34322449699999 ], [ 10.502388, 45.830373 ], [ 10.840176499999984, 45.832758501 ], [ 10.631074, 45.609512500999983 ], [ 10.6546555, 45.415826998999989 ], [ 11.205502500000023, 45.109487499000011 ], [ 11.426765499999988, 44.950076 ], [ 11.246231500000022, 44.951427997 ], [ 10.887909499999978, 44.914265999 ], [ 10.504344, 44.922417999 ], [ 10.464030499999978, 44.937171501000023 ], [ 10.083503, 45.04395850100002 ], [ 9.8911015, 45.130898 ], [ 9.548684499999979, 45.132648499000027 ], [ 9.3246605, 44.69 ], [ 9.200074, 44.686099496999987 ], [ 9.202995499999986, 44.613476499 ], [ 9.493363, 44.555858999 ], [ 9.4790595, 44.40924050000001 ], [ 9.686725500000023, 44.365921999000022 ], [ 10.018769500000019, 44.044535998000015 ], [ 9.511344, 44.216675497999972 ], [ 8.63301, 44.379803498 ], [ 8.135350999000025, 43.938940498000022 ], [ 7.529827, 43.784007997 ], [ 7.714236500000027, 44.061513499 ], [ 7.007760500000018, 44.23670049899999 ], [ 6.887428, 44.361287 ], [ 6.948443, 44.654741999 ], [ 7.065755, 44.713464497000018 ], [ 6.630051, 45.109856499999978 ], [ 7.125157, 45.243994498 ], [ 7.104723, 45.468454499000018 ], [ 6.8023685, 45.778562 ], [ 7.044886, 45.922412999000016 ], [ 7.86407650000001, 45.916750499999978 ], [ 7.8771375, 45.926954997999985 ], [ 8.384717, 46.452158499 ], [ 8.713936, 46.097271998999986 ], [ 8.912147, 45.830444999 ], [ 9.088803499999983, 45.896897001000013 ], [ 8.988276499999984, 45.972282498000027 ], [ 9.1593775, 46.169601 ], [ 9.248531500000013, 46.233768000999987 ], [ 9.714149500000019, 46.292708499000014 ], [ 10.2448745, 46.622091997999974 ], [ 10.452801, 46.530682999000021 ] ] ] } },
-{ "type": "Feature", "id": 1298, "properties": { "NUTS_ID": "PL5", "STAT_LEVL_": 1, "SHAPE_AREA": 3.8374115411399998, "SHAPE_LEN": 10.978364321000001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 18.1636815, 51.172516997 ], [ 18.672954499000014, 51.056902501000025 ], [ 18.615877500000011, 50.85358750099999 ], [ 18.607473500000026, 50.550011 ], [ 18.425866499999984, 50.248965498000018 ], [ 18.059780499999988, 50.174652499999979 ], [ 18.035060999, 50.06577199899999 ], [ 17.868675, 49.972545997 ], [ 17.592736, 50.160014 ], [ 17.758479, 50.206568 ], [ 17.718403998999975, 50.32095 ], [ 17.429605, 50.254513001000021 ], [ 16.907924499999979, 50.44945400099999 ], [ 17.028323, 50.229996998999979 ], [ 16.86327, 50.19812299900002 ], [ 16.58029, 50.142787998000017 ], [ 16.195729, 50.432135001 ], [ 16.443536, 50.586257499999988 ], [ 16.107318, 50.662072998999975 ], [ 15.535267499999975, 50.779375999000024 ], [ 15.032691, 51.021315999000024 ], [ 14.823362, 50.870550497000011 ], [ 15.037271, 51.243749998999988 ], [ 14.974183, 51.36394999700002 ], [ 15.714032499999973, 51.51739699699999 ], [ 15.979245, 51.802309001000026 ], [ 16.416191, 51.784865 ], [ 16.828372, 51.57218799899999 ], [ 17.257433, 51.642843001000017 ], [ 17.556245, 51.58430149899999 ], [ 17.795269500000018, 51.194146000999979 ], [ 17.939456, 51.10886499899999 ], [ 18.1636815, 51.172516997 ] ] ] } },
-{ "type": "Feature", "id": 1308, "properties": { "NUTS_ID": "PL6", "STAT_LEVL_": 1, "SHAPE_AREA": 8.1421002457899991, "SHAPE_LEN": 17.552293975200001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 22.792095500000016, 54.363358996999978 ], [ 22.476556500000015, 54.201305500999979 ], [ 22.782806, 53.915489498999989 ], [ 22.703502500000013, 53.767354496999985 ], [ 22.135728, 53.544539998 ], [ 21.598199912999974, 53.480181970999979 ], [ 21.55169, 53.478128001000016 ], [ 20.675953999, 53.269529498 ], [ 20.411137, 53.214165 ], [ 19.7615955, 53.15179649800001 ], [ 19.684806, 52.963038998 ], [ 19.444664, 52.939042500000028 ], [ 19.522742999, 52.74920049799999 ], [ 19.289184499999976, 52.39271249799998 ], [ 19.04712, 52.332803999000021 ], [ 18.377150500000027, 52.53746350099999 ], [ 17.458523, 52.738948000999983 ], [ 17.509582500000022, 52.917767997 ], [ 17.3015565, 52.994803 ], [ 17.438793499999974, 53.26751450099999 ], [ 17.390653, 53.490964 ], [ 16.892248, 53.655868999 ], [ 16.982053499000017, 53.904909499999974 ], [ 16.792764499999976, 53.985550499999988 ], [ 16.858620499999972, 54.38257649799999 ], [ 16.699085, 54.569247001 ], [ 17.666607, 54.7832315 ], [ 18.35961, 54.81719149700001 ], [ 18.828968499999974, 54.607707999000013 ], [ 18.3967745, 54.747280499999988 ], [ 18.541726499999982, 54.58448 ], [ 18.950029500000028, 54.358310501 ], [ 19.639032, 54.458294498999976 ], [ 19.648523, 54.453339 ], [ 19.256953, 54.2784605 ], [ 19.8037685, 54.4424105 ], [ 20.313503, 54.402202000999978 ], [ 21.559322, 54.322504 ], [ 22.792095500000016, 54.363358996999978 ] ] ] } },
-{ "type": "Feature", "id": 1326, "properties": { "NUTS_ID": "PT1", "STAT_LEVL_": 1, "SHAPE_AREA": 9.3186138296800003, "SHAPE_LEN": 17.9613633114 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -6.983513500000015, 41.972903998999982 ], [ -6.5884615, 41.967761998000015 ], [ -6.189352, 41.575046499 ], [ -6.479713, 41.294379999 ], [ -6.689786, 41.205241498000021 ], [ -6.929903500000023, 41.029466499000023 ], [ -6.801935, 40.861045998 ], [ -6.865144, 40.270694499 ], [ -6.9512985, 40.257445999000026 ], [ -7.011960499999986, 40.126934 ], [ -6.864203, 40.011867498000015 ], [ -7.015405, 39.670856499000024 ], [ -7.53490245, 39.661986891000026 ], [ -7.231467, 39.278431 ], [ -6.9513915, 39.024070499 ], [ -7.203135, 38.75101749800001 ], [ -7.316636, 38.439876498999979 ], [ -7.10795250000001, 38.188121500000022 ], [ -6.9317385, 38.208377998 ], [ -7.002483499999983, 38.0227165 ], [ -7.2632845, 37.979908 ], [ -7.512691500000017, 37.526256499 ], [ -7.401916500000027, 37.174827498000013 ], [ -7.8825875, 36.96182299899999 ], [ -8.592472, 37.1220545 ], [ -8.981500499999981, 37.027438497999981 ], [ -8.7963155, 37.442948 ], [ -8.735066500000016, 38.516034999 ], [ -9.222857499999975, 38.413747999 ], [ -9.260362499999985, 38.662664999000015 ], [ -8.924883500000021, 38.758676499999979 ], [ -8.968469, 38.827762499000016 ], [ -9.477468499999986, 38.701838497999972 ], [ -9.416459499999974, 39.054692999 ], [ -9.365950500999986, 39.348377998999979 ], [ -9.04026600100002, 39.741422000999989 ], [ -8.894938, 40.045502999 ], [ -8.784033, 40.520366498999977 ], [ -8.6531195, 40.964778497 ], [ -8.776427500000011, 41.471979497 ], [ -8.811573, 41.611576996999986 ], [ -8.863186, 41.872066499000027 ], [ -8.199000500000011, 42.154418998999972 ], [ -8.1650755, 41.818302 ], [ -8.051862500000027, 41.820613998 ], [ -7.20046450000001, 41.879749498000024 ], [ -6.983513500000015, 41.972903998999982 ] ] ] } },
-{ "type": "Feature", "id": 1355, "properties": { "NUTS_ID": "PT2", "STAT_LEVL_": 1, "SHAPE_AREA": 0.043939992241600001, "SHAPE_LEN": 1.4808539034099999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -25.1341635, 37.807617498000013 ], [ -25.5115255, 37.708371499 ], [ -25.85274, 37.85310549899998 ], [ -25.689569, 37.841939501000013 ], [ -25.1341635, 37.807617498000013 ] ] ] } },
-{ "type": "Feature", "id": 1358, "properties": { "NUTS_ID": "PT3", "STAT_LEVL_": 1, "SHAPE_AREA": 0.055018856820200003, "SHAPE_LEN": 1.11196618179 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -16.746594500000015, 32.73417 ], [ -16.944882501, 32.632935498999984 ], [ -17.2126035, 32.736969999 ], [ -17.193386499999974, 32.870556498999974 ], [ -16.746594500000015, 32.73417 ] ] ] } },
-{ "type": "Feature", "id": 1362, "properties": { "NUTS_ID": "RO1", "STAT_LEVL_": 1, "SHAPE_AREA": 8.0538642859799996, "SHAPE_LEN": 12.7418179266 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 24.947100499999976, 47.729124 ], [ 24.960837500000025, 47.596850999000026 ], [ 25.063383, 47.13653849799999 ], [ 25.2467105, 47.097919 ], [ 25.661582499000019, 47.091646500000024 ], [ 25.8619675, 46.923383498000021 ], [ 25.79531, 46.71911749899999 ], [ 25.975159500000018, 46.69731249900002 ], [ 26.263439, 46.246338497000011 ], [ 26.4399845, 46.038876499000025 ], [ 26.392094499999985, 45.802389496999979 ], [ 26.093334500000026, 45.516305997000018 ], [ 26.0727185, 45.505787999 ], [ 25.452539, 45.441341998999974 ], [ 25.321577499999989, 45.381088998 ], [ 25.103214, 45.585193999000012 ], [ 24.68492550000002, 45.604110997000021 ], [ 24.5128345, 45.586824998999987 ], [ 23.703613, 45.496772001000011 ], [ 23.597270499999979, 45.473471499000027 ], [ 23.234864, 46.030501996999988 ], [ 22.748716, 46.351207499999987 ], [ 22.676575500000013, 46.405825497000023 ], [ 21.441398, 46.651467 ], [ 21.658955, 47.022131498000022 ], [ 22.12832, 47.598089496999989 ], [ 22.1808375, 47.600094499000022 ], [ 22.896270500000014, 47.954120500999977 ], [ 23.1885355, 48.108687998999983 ], [ 23.493605, 47.96781149899999 ], [ 24.583293500000025, 47.964851496999984 ], [ 24.947100499999976, 47.729124 ] ] ] } },
-{ "type": "Feature", "id": 1377, "properties": { "NUTS_ID": "RO2", "STAT_LEVL_": 1, "SHAPE_AREA": 8.3495283464300005, "SHAPE_LEN": 17.2443908003 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 28.578884, 43.738739 ], [ 27.69541, 43.987343 ], [ 27.271344999, 44.12633649899999 ], [ 28.017373998999972, 44.340248997 ], [ 28.110168499999986, 44.439703501 ], [ 27.881007, 44.763076499000022 ], [ 27.203458500000011, 44.787073998999972 ], [ 26.6055515, 44.856993998 ], [ 26.0727185, 45.505787999 ], [ 26.093334500000026, 45.516305997000018 ], [ 26.392094499999985, 45.802389496999979 ], [ 26.4399845, 46.038876499000025 ], [ 26.263439, 46.246338497000011 ], [ 25.975159500000018, 46.69731249900002 ], [ 25.79531, 46.71911749899999 ], [ 25.8619675, 46.923383498000021 ], [ 25.661582499000019, 47.091646500000024 ], [ 25.2467105, 47.097919 ], [ 25.063383, 47.13653849799999 ], [ 24.960837500000025, 47.596850999000026 ], [ 24.947100499999976, 47.729124 ], [ 26.098827500000027, 47.97879599700002 ], [ 26.63056, 48.259749999 ], [ 27.1646515, 47.994683499000018 ], [ 27.3911665, 47.58939749699999 ], [ 28.113805500000012, 46.838411001 ], [ 28.260886, 46.437139499000011 ], [ 28.115802, 46.107826997000018 ], [ 28.088561500000026, 45.606112499 ], [ 28.21136, 45.467279999000027 ], [ 28.28599, 45.430630001 ], [ 28.717, 45.224379999 ], [ 29.42759, 45.44223999899998 ], [ 29.6796395, 45.211838 ], [ 29.600269, 44.839554498999973 ], [ 28.994085, 44.679625498 ], [ 28.625982500000021, 44.297031998000023 ], [ 28.578884, 43.738739 ] ] ] } },
-{ "type": "Feature", "id": 1392, "properties": { "NUTS_ID": "RO3", "STAT_LEVL_": 1, "SHAPE_AREA": 4.14173980187, "SHAPE_LEN": 10.045947246600001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 26.0727185, 45.505787999 ], [ 26.6055515, 44.856993998 ], [ 27.203458500000011, 44.787073998999972 ], [ 27.881007, 44.763076499000022 ], [ 28.110168499999986, 44.439703501 ], [ 28.017373998999972, 44.340248997 ], [ 27.271344999, 44.12633649899999 ], [ 26.379968, 44.042950999000027 ], [ 26.358359999000015, 44.038415500999974 ], [ 25.671862499999975, 43.691339998999979 ], [ 25.544640500000014, 43.64298149699999 ], [ 25.293740500000013, 43.654294 ], [ 24.641605998999978, 43.733412999 ], [ 24.755096, 43.817601498999977 ], [ 24.613092499, 44.014155997999978 ], [ 24.884019500000022, 44.382636497000021 ], [ 24.438121, 44.845375499 ], [ 24.5128345, 45.586824998999987 ], [ 24.68492550000002, 45.604110997000021 ], [ 25.103214, 45.585193999000012 ], [ 25.321577499999989, 45.381088998 ], [ 25.452539, 45.441341998999974 ], [ 26.0727185, 45.505787999 ] ] ] } },
-{ "type": "Feature", "id": 1404, "properties": { "NUTS_ID": "RO4", "STAT_LEVL_": 1, "SHAPE_AREA": 6.9361637261500002, "SHAPE_LEN": 13.9495964542 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 24.5128345, 45.586824998999987 ], [ 24.438121, 44.845375499 ], [ 24.884019500000022, 44.382636497000021 ], [ 24.613092499, 44.014155997999978 ], [ 24.755096, 43.817601498999977 ], [ 24.641605998999978, 43.733412999 ], [ 24.324132, 43.699567996999974 ], [ 24.112719, 43.699566498000024 ], [ 23.63008, 43.791259997999987 ], [ 22.997154500000022, 43.807626997 ], [ 22.838719, 43.877852001 ], [ 23.045898, 44.063749499999972 ], [ 22.966399500000023, 44.098524997000027 ], [ 22.6751615, 44.215662996999981 ], [ 22.4573995, 44.4674665 ], [ 22.705956500000013, 44.603216498999984 ], [ 22.467334, 44.7147455 ], [ 22.1595795, 44.471777498999984 ], [ 22.016132500000026, 44.599202499 ], [ 22.012350500000025, 44.602318498999978 ], [ 21.35847, 44.82161199699999 ], [ 21.560126500000024, 44.8890065 ], [ 21.479178499999989, 45.193027501000017 ], [ 21.016575, 45.324627497999984 ], [ 20.662833, 45.794115998999985 ], [ 20.264296, 46.1263735 ], [ 20.705303500000014, 46.160937499 ], [ 20.7756, 46.275909999000021 ], [ 21.103170499999976, 46.262590497000019 ], [ 21.441398, 46.651467 ], [ 22.676575500000013, 46.405825497000023 ], [ 22.748716, 46.351207499999987 ], [ 23.234864, 46.030501996999988 ], [ 23.597270499999979, 45.473471499000027 ], [ 23.703613, 45.496772001000011 ], [ 24.5128345, 45.586824998999987 ] ] ] } },
-{ "type": "Feature", "id": 1409, "properties": { "NUTS_ID": "NL3", "STAT_LEVL_": 1, "SHAPE_AREA": 1.22674453319, "SHAPE_LEN": 9.6691880804999997 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 4.249017, 51.6458015 ], [ 4.279565, 51.376017497000021 ], [ 4.24366950000001, 51.374729499000011 ], [ 3.4342, 51.526157499000021 ], [ 4.223578, 51.438655997000012 ], [ 4.161857000999987, 51.666831576999982 ], [ 4.135041090000016, 51.673303697999984 ], [ 3.678739, 51.695324001000017 ], [ 3.839083, 51.758297 ], [ 4.1277885, 52.000542497000026 ], [ 4.198018499999989, 52.054144496999982 ], [ 4.3742325, 52.187089996999987 ], [ 4.493847500000015, 52.32826000099999 ], [ 4.560596499999974, 52.437426001 ], [ 4.609676499999978, 52.573401000999979 ], [ 4.649071, 52.756177499999978 ], [ 4.730989, 52.962188498999978 ], [ 5.164383499999985, 53.000910501000021 ], [ 5.167425499999979, 52.998798498999975 ], [ 5.377261499999975, 52.764805 ], [ 5.0604265, 52.578937499 ], [ 5.079161, 52.388653 ], [ 5.335462, 52.29021849899999 ], [ 5.404633, 52.249629998999978 ], [ 5.606011500000022, 51.943248498 ], [ 5.149456, 51.933452499 ], [ 5.000534, 51.820937999000023 ], [ 4.676294499999983, 51.7249185 ], [ 4.620420500000023, 51.71412650000002 ], [ 4.249017, 51.6458015 ] ] ], [ [ [ 3.9776665, 51.225131998999984 ], [ 3.8563395, 51.211056 ], [ 3.380661, 51.274299497000015 ], [ 3.367216499999984, 51.368134499 ], [ 4.23481750000002, 51.348254 ], [ 3.9776665, 51.225131998999984 ] ] ] ] } },
-{ "type": "Feature", "id": 1430, "properties": { "NUTS_ID": "NL4", "STAT_LEVL_": 1, "SHAPE_AREA": 0.93053041389699997, "SHAPE_LEN": 5.9277501068699996 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 5.953192, 51.747845998 ], [ 6.224405, 51.364978999000016 ], [ 6.072657, 51.242587497999978 ], [ 6.174812, 51.1845135 ], [ 5.877084998999976, 51.032101 ], [ 6.0869475, 50.913134999000022 ], [ 6.020998999000028, 50.754295500000012 ], [ 5.892073, 50.755237498999975 ], [ 5.682000499000026, 50.757446497999979 ], [ 5.687622, 50.811923998999987 ], [ 5.758272498999986, 50.954795 ], [ 5.766149, 51.009235499999988 ], [ 5.798274, 51.059853498999985 ], [ 5.5662835, 51.220836497999983 ], [ 5.237716499999976, 51.261600499 ], [ 5.10218, 51.42900499699999 ], [ 4.759926, 51.50246449799999 ], [ 4.669544, 51.426383999 ], [ 4.279565, 51.376017497000021 ], [ 4.249017, 51.6458015 ], [ 4.620420500000023, 51.71412650000002 ], [ 4.676294499999983, 51.7249185 ], [ 5.000534, 51.820937999000023 ], [ 5.128057500000011, 51.73761 ], [ 5.597879499999976, 51.828082999 ], [ 5.8651585, 51.757407999 ], [ 5.953192, 51.747845998 ] ] ] } },
-{ "type": "Feature", "id": 1441, "properties": { "NUTS_ID": "NO0", "STAT_LEVL_": 1, "SHAPE_AREA": 55.983827901700003, "SHAPE_LEN": 166.53739427900001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 28.548399, 70.96723950099999 ], [ 27.855804499999977, 70.430145499999981 ], [ 28.510006, 70.447929999 ], [ 28.397211, 70.542122001 ], [ 28.96657349899999, 70.884941 ], [ 31.060808, 70.287788501000023 ], [ 28.73865071, 70.155567389999987 ], [ 28.575159, 70.118652500999985 ], [ 29.767906536, 69.823490934 ], [ 29.459511, 69.648964001000024 ], [ 30.839926, 69.775808000999973 ], [ 30.954538500000012, 69.632432 ], [ 30.085324500000013, 69.658104501000025 ], [ 30.1884235, 69.568457500000022 ], [ 28.92968, 69.051905 ], [ 28.8057925, 69.111147 ], [ 29.336974, 69.478310500000021 ], [ 27.984522, 70.01396549899999 ], [ 25.950607, 69.6965505 ], [ 25.702101, 69.253661501000011 ], [ 25.777502500000026, 69.018279001 ], [ 24.903199500000028, 68.554591 ], [ 22.374526, 68.716667 ], [ 21.9836115, 69.072893500000021 ], [ 21.2788215, 69.311883998999974 ], [ 20.548636499999986, 69.059968501000014 ], [ 20.0600475, 69.045759001000022 ], [ 20.3358725, 68.802312500000028 ], [ 19.921397, 68.356013001 ], [ 18.125925, 68.536515999000017 ], [ 18.151354, 68.19879 ], [ 17.899761500000011, 67.969372 ], [ 17.281524, 68.118815 ], [ 16.158001500000012, 67.51915900099999 ], [ 16.387759, 67.045462499999985 ], [ 15.3772265, 66.484303996999984 ], [ 15.453994500000022, 66.345233501 ], [ 14.516289, 66.132579 ], [ 14.6254765, 65.811808499999984 ], [ 14.325985, 65.118916001 ], [ 13.654257500000028, 64.580340498999988 ], [ 14.113869500000021, 64.462484501 ], [ 14.157109, 64.195054497 ], [ 13.967524500000025, 64.00797 ], [ 12.683565, 63.9742235 ], [ 12.149767, 63.593946999000025 ], [ 11.974581, 63.269228 ], [ 12.0524555, 63.1834445 ], [ 12.218231, 63.000332500000013 ], [ 12.056142, 62.611918499000012 ], [ 12.2546595, 62.331024496999987 ], [ 12.29937, 62.267494 ], [ 12.137665, 61.723816998000018 ], [ 12.870847500000025, 61.356494998000016 ], [ 12.670176500000025, 61.055976498 ], [ 12.22399, 61.013078001 ], [ 12.606881499999986, 60.512742496999977 ], [ 12.499544999000022, 60.09765699799999 ], [ 11.839729, 59.840767 ], [ 11.926970499999982, 59.790475999000023 ], [ 11.93987850000002, 59.69458099799999 ], [ 11.691129, 59.589547499999981 ], [ 11.8261965, 59.237849998 ], [ 11.632557, 58.908306501000027 ], [ 11.460827, 58.988658499 ], [ 11.15222652599999, 59.068580457 ], [ 11.133916596, 59.073322399 ], [ 10.825053, 59.153312501000016 ], [ 10.686118, 59.489406998999982 ], [ 10.5818175, 59.756759500999976 ], [ 10.764788, 59.8295445 ], [ 10.681954881000024, 59.884828632999984 ], [ 10.64198, 59.911508500000025 ], [ 10.497314500000016, 59.7883385 ], [ 10.54188649299999, 59.708155780000027 ], [ 10.555847737000022, 59.683040208000023 ], [ 10.617811, 59.571571499000015 ], [ 10.421314, 59.525081500999988 ], [ 10.4391965, 59.665851498999984 ], [ 10.212189500000022, 59.737282 ], [ 10.3198815, 59.691343501 ], [ 10.374606, 59.676967499999989 ], [ 10.386558499999978, 59.275020498 ], [ 10.273399499999982, 59.041736500000013 ], [ 9.839197001, 59.044178998 ], [ 9.289463, 58.839130496 ], [ 9.46771050000001, 58.829845498 ], [ 9.365971, 58.771349 ], [ 9.082190498999978, 58.74871449699998 ], [ 9.0575915, 58.725215999 ], [ 9.244501, 58.725330499999984 ], [ 8.185152, 58.1429415 ], [ 7.038199, 58.021426999000028 ], [ 6.553461500000026, 58.117786498999976 ], [ 6.89118, 58.2743605 ], [ 6.438839, 58.2908195 ], [ 5.491752, 58.75478 ], [ 5.5742715, 59.030342500000017 ], [ 5.995267500000011, 58.968890998 ], [ 6.217452499999979, 59.266296497999974 ], [ 5.941965499999981, 59.361309001 ], [ 5.614074, 59.330623499000012 ], [ 5.550886, 59.271621499999981 ], [ 5.295012441999972, 59.370869887000026 ], [ 5.298477, 59.145370501 ], [ 5.175062500000024, 59.178374497999982 ], [ 5.302875500000027, 59.481168001000015 ], [ 5.534834, 59.7320785 ], [ 5.506673157000023, 59.52538185899999 ], [ 5.794665721, 59.650943915000028 ], [ 6.381009, 59.874691 ], [ 5.675114, 59.848640499 ], [ 5.994348, 59.954563 ], [ 6.0828945, 60.191448 ], [ 6.353219500000023, 60.370498998000016 ], [ 6.829082746999973, 60.471445758000016 ], [ 6.716949, 60.520217996999975 ], [ 6.0151285, 60.269165 ], [ 5.817266, 59.983326 ], [ 5.5750435, 60.153873498999985 ], [ 5.756293, 60.399605 ], [ 5.446748500000012, 60.156437000999972 ], [ 5.2010105, 60.290233501999978 ], [ 5.309116500000016, 60.388206502 ], [ 5.262637499999983, 60.505832496999972 ], [ 5.7384535, 60.459213499999976 ], [ 5.7344455, 60.673576500000024 ], [ 5.708317500000021, 60.471111501 ], [ 5.339822, 60.541588 ], [ 5.706546, 60.759361497999976 ], [ 5.280158, 60.540760001000024 ], [ 5.207727998999985, 60.621887001 ], [ 5.281049, 60.632553 ], [ 4.94567, 60.809383500000024 ], [ 5.469385499999987, 60.659252 ], [ 5.140299, 60.84025250000002 ], [ 5.115073511999981, 60.965715414999977 ], [ 5.026263, 61.01004050099999 ], [ 5.426630526999986, 61.043119309000019 ], [ 5.394584, 61.069965497 ], [ 4.947169499999973, 61.25885 ], [ 5.209688823000022, 61.334528406 ], [ 4.977364500000022, 61.41730899800001 ], [ 5.843049, 61.45913699800002 ], [ 5.24131566599999, 61.583933230000014 ], [ 4.994847999, 61.591152001000012 ], [ 4.969778, 61.72112249700001 ], [ 6.846349, 61.870303997 ], [ 5.4584625, 61.93935 ], [ 5.089433, 62.167289499999981 ], [ 5.491458, 62.014787500000011 ], [ 5.6237655, 62.067054500999973 ], [ 5.405087499999979, 62.1274295 ], [ 6.3250655, 62.060600500000021 ], [ 5.931158, 62.21788399799999 ], [ 6.328697, 62.37550749799999 ], [ 6.653805499999976, 62.193794497999988 ], [ 6.573409500000025, 62.54216 ], [ 6.2564425, 62.529735500000015 ], [ 6.334281499999975, 62.61331949800001 ], [ 8.146601499999974, 62.688270500999977 ], [ 6.898579, 62.909847501 ], [ 7.3014915, 63.010261496999988 ], [ 7.450778500000013, 62.90481949799999 ], [ 8.11747, 62.921828998000024 ], [ 8.107145499000012, 63.105423 ], [ 8.764103, 63.18451 ], [ 8.832189, 63.201980000999981 ], [ 8.490692, 63.2775115 ], [ 8.765663999000026, 63.34213649899999 ], [ 8.650367, 63.400879 ], [ 8.7551115, 63.424341 ], [ 9.0929175, 63.287159 ], [ 9.501772, 63.39708699800002 ], [ 9.249117, 63.367828497 ], [ 9.147209, 63.488140000999977 ], [ 9.753410499999973, 63.645359000999974 ], [ 9.714933499999972, 63.615505000999974 ], [ 9.977688, 63.441615997999975 ], [ 9.8231755, 63.312244501 ], [ 10.85359, 63.439093496999988 ], [ 10.941581, 63.564289 ], [ 10.627927, 63.546741500999985 ], [ 11.458975, 63.802337500000021 ], [ 11.0724775, 63.858989501 ], [ 11.501289499999984, 64.005218501 ], [ 11.222313, 64.074219001000017 ], [ 10.5912035, 63.804355499999986 ], [ 11.070417500000019, 63.8419495 ], [ 10.955013501, 63.7456095 ], [ 10.170315, 63.526134496999987 ], [ 9.8057985, 63.626873 ], [ 10.130084, 63.754104499999983 ], [ 9.529121499999974, 63.712398496999981 ], [ 10.010268949000022, 64.045650763000026 ], [ 10.634974129999989, 64.391002416999982 ], [ 10.548171, 64.44136050100002 ], [ 11.3407555, 64.436919998 ], [ 11.225456, 64.311905001000014 ], [ 11.7259625, 64.584793 ], [ 11.1820715, 64.740906 ], [ 11.514355244, 64.878553708000027 ], [ 11.278464499999984, 64.857429498999977 ], [ 11.9609795, 65.077147997999987 ], [ 11.981972, 65.070815497000012 ], [ 12.0461065, 65.06245 ], [ 12.092824, 65.0398255 ], [ 12.187348499999985, 65.000111498000024 ], [ 12.182329, 64.981381 ], [ 12.9642955, 65.322120499999983 ], [ 12.041082500000016, 65.216194 ], [ 12.292716, 65.588462998000011 ], [ 12.781976499999985, 65.462524499999972 ], [ 12.347975, 65.627327 ], [ 12.646986, 65.814437997000027 ], [ 12.666879418, 65.920708010999988 ], [ 12.3801345, 65.894111 ], [ 14.147304500000018, 66.330467 ], [ 13.0099535, 66.191283999 ], [ 13.562423500000023, 66.30467999699999 ], [ 12.990386999, 66.348778 ], [ 13.733414499999981, 66.609176500999979 ], [ 13.2072895, 66.716881 ], [ 14.000651, 66.797889500999986 ], [ 13.492176, 66.949196500000028 ], [ 14.01318550000002, 66.961944501 ], [ 14.0676185, 67.161491501 ], [ 14.511739499999976, 67.207649 ], [ 15.08839799899999, 67.236534 ], [ 15.433605, 67.104980500000011 ], [ 15.481187, 67.1844635 ], [ 15.162948498999981, 67.329765 ], [ 14.308716, 67.263771 ], [ 15.650709213000027, 67.53825820899999 ], [ 15.160784499999977, 67.632698 ], [ 15.855372499999987, 67.7099 ], [ 14.75716, 67.804359499999975 ], [ 16.001165500000013, 67.996216002999972 ], [ 15.526616, 68.065216002 ], [ 15.395943499999987, 68.036521999 ], [ 15.440778500000022, 68.024948 ], [ 15.4147835, 67.995910499999979 ], [ 15.275926500000025, 68.050613 ], [ 16.062347499, 68.255936001 ], [ 16.819980499999986, 68.155219999 ], [ 16.275680500000021, 68.373138499999982 ], [ 17.900738, 68.417808 ], [ 16.075683500000025, 68.414055001 ], [ 16.5795895, 68.541367 ], [ 17.685274, 68.674110499999983 ], [ 17.251898, 68.75971199899999 ], [ 17.805559, 68.749992500000019 ], [ 17.447077, 68.907905501000016 ], [ 18.166633499, 69.1502 ], [ 17.96954549899999, 69.228561501 ], [ 18.238685499999974, 69.482642999 ], [ 19.547487499999988, 69.214874499000018 ], [ 18.934316500000023, 69.59994500099998 ], [ 19.2743185, 69.779205499999989 ], [ 20.394979499999977, 69.890387998999984 ], [ 19.930265500000019, 69.266662500999985 ], [ 20.459503, 69.762077498999986 ], [ 21.325062, 69.909232998999983 ], [ 21.241577, 70.008903499999974 ], [ 22.140845, 69.745193501000017 ], [ 21.727436, 70.05191050000002 ], [ 21.19089, 70.221240500000022 ], [ 21.424858, 70.22282400099999 ], [ 21.478477275999978, 70.200361853 ], [ 21.493496359, 70.199272055999984 ], [ 21.56427, 70.323387001000015 ], [ 21.8320885, 70.168754501000024 ], [ 21.782270499999981, 70.258720499999981 ], [ 21.983658, 70.329452499000013 ], [ 22.96682, 70.189369000999989 ], [ 23.001796500000012, 69.91750349900002 ], [ 23.407093, 69.964683499999978 ], [ 23.52833, 70.015915 ], [ 23.132837, 70.085571501 ], [ 23.179638, 70.216247501 ], [ 23.246681500000022, 70.248192 ], [ 24.090675499999975, 70.541496499 ], [ 24.298256, 70.686134501000026 ], [ 24.236952, 70.8358915 ], [ 24.553296499999988, 70.97564699899999 ], [ 25.911754499999972, 70.875648500000011 ], [ 25.086530499999981, 70.514610499000014 ], [ 24.912155, 70.096092 ], [ 25.153381500000023, 70.064895499999977 ], [ 26.572368499999982, 70.954368500999976 ], [ 26.733936500000027, 70.824897998999973 ], [ 26.475652500000024, 70.354866 ], [ 27.032476499999973, 70.483855999000014 ], [ 26.951067, 70.546791 ], [ 27.7029875, 70.800621001000025 ], [ 27.1011425, 70.93241899899999 ], [ 27.66123, 71.126999001 ], [ 28.548399, 70.96723950099999 ] ] ], [ [ [ 16.005573, 68.7555385 ], [ 15.7227115, 68.536804001 ], [ 16.076727, 68.748443499000018 ], [ 16.302828, 68.715187001 ], [ 16.123302500000023, 68.84416199899999 ], [ 16.409397, 68.857864499000016 ], [ 16.456951999000012, 68.557357998999976 ], [ 14.98062149899999, 68.273948500000017 ], [ 15.716285, 68.693619001 ], [ 15.464602500000012, 68.753418001 ], [ 15.619679500000018, 68.952705499999979 ], [ 15.959016500000018, 68.889045499000019 ], [ 15.963360495000018, 68.876588513 ], [ 15.988082191999979, 68.805695734999972 ], [ 16.005573, 68.7555385 ] ] ], [ [ [ 23.4573385, 70.765121500000021 ], [ 22.798172, 70.513825000999987 ], [ 21.928592499999979, 70.642860499999983 ], [ 23.4573385, 70.765121500000021 ] ] ], [ [ [ 18.080169500000011, 69.426903001000028 ], [ 16.782012999000017, 69.0575945 ], [ 17.658531, 69.487670998999988 ], [ 18.080169500000011, 69.426903001000028 ] ] ], [ [ [ 15.41872, 68.7030565 ], [ 14.368575998999972, 68.684494 ], [ 15.317963500000019, 68.917221 ], [ 15.253855, 68.755890001000012 ], [ 15.41872, 68.7030565 ] ] ], [ [ [ 15.165045500000019, 68.442626999000026 ], [ 14.206444499999975, 68.17415599899999 ], [ 14.4163855, 68.393623499 ], [ 15.165045500000019, 68.442626999000026 ] ] ], [ [ [ 5.396219498999983, 59.711742501 ], [ 5.183555500000011, 59.574615500999982 ], [ 5.0594595, 59.854045998 ], [ 5.179249, 59.877666498999986 ], [ 5.396219498999983, 59.711742501 ] ] ] ] } },
-{ "type": "Feature", "id": 1469, "properties": { "NUTS_ID": "PL1", "STAT_LEVL_": 1, "SHAPE_AREA": 7.0434331381900002, "SHAPE_LEN": 13.7039180622 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 19.74706, 50.86597000099999 ], [ 19.243142499999976, 51.036844497 ], [ 18.672954499000014, 51.056902501000025 ], [ 18.1636815, 51.172516997 ], [ 18.0745215, 51.349915001 ], [ 18.47196550000001, 51.851005499 ], [ 18.685665500000027, 51.821002997999983 ], [ 18.827809, 52.06418800099999 ], [ 19.04712, 52.332803999000021 ], [ 19.289184499999976, 52.39271249799998 ], [ 19.522742999, 52.74920049799999 ], [ 19.444664, 52.939042500000028 ], [ 19.684806, 52.963038998 ], [ 19.7615955, 53.15179649800001 ], [ 20.411137, 53.214165 ], [ 20.675953999, 53.269529498 ], [ 21.55169, 53.478128001000016 ], [ 21.598199912999974, 53.480181970999979 ], [ 21.735427500000014, 53.312727999 ], [ 21.694427500000018, 53.138093997999988 ], [ 22.453771500000016, 52.788238499999977 ], [ 22.408589, 52.609689999000011 ], [ 22.58041350000002, 52.393157997 ], [ 23.128409, 52.287841500000013 ], [ 22.622901499000022, 52.018744497 ], [ 21.889174500000024, 51.973184 ], [ 21.879981, 51.69363149899999 ], [ 21.6155425, 51.617562 ], [ 21.87317600099999, 51.474913499000024 ], [ 21.802998, 51.072078998999984 ], [ 21.630164499999978, 51.06338899799999 ], [ 20.432815, 51.33940499900001 ], [ 19.993847, 51.183954001000018 ], [ 20.0394705, 50.990206496999974 ], [ 19.74706, 50.86597000099999 ] ] ] } },
-{ "type": "Feature", "id": 1485, "properties": { "NUTS_ID": "PL2", "STAT_LEVL_": 1, "SHAPE_AREA": 3.3945000458100001, "SHAPE_LEN": 9.2183648553400008 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 19.74706, 50.86597000099999 ], [ 19.712949499999979, 50.729156000999978 ], [ 19.949966500000016, 50.504782501000022 ], [ 20.681539, 50.205875497000022 ], [ 21.208831499999974, 50.354897499 ], [ 21.150439, 49.976484500000026 ], [ 21.297149499999989, 49.842864 ], [ 21.241772, 49.776104499999974 ], [ 21.397712500000011, 49.433793998999988 ], [ 20.9237225, 49.296234498999979 ], [ 20.6149395, 49.417832998999984 ], [ 19.883929500000022, 49.204176999000026 ], [ 19.467386499999975, 49.613767 ], [ 19.153403, 49.40377700099998 ], [ 18.851551, 49.51718900100002 ], [ 18.575724, 49.910423 ], [ 18.035060999, 50.06577199899999 ], [ 18.059780499999988, 50.174652499999979 ], [ 18.425866499999984, 50.248965498000018 ], [ 18.607473500000026, 50.550011 ], [ 18.615877500000011, 50.85358750099999 ], [ 18.672954499000014, 51.056902501000025 ], [ 19.243142499999976, 51.036844497 ], [ 19.74706, 50.86597000099999 ] ] ] } },
-{ "type": "Feature", "id": 1502, "properties": { "NUTS_ID": "PL3", "STAT_LEVL_": 1, "SHAPE_AREA": 9.8888091762200006, "SHAPE_LEN": 21.4364826057 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 23.51465, 53.956559997999989 ], [ 23.588702, 53.695928497000011 ], [ 23.918259, 53.157621999000014 ], [ 23.916239, 52.904812 ], [ 23.93864, 52.712916001 ], [ 23.178338, 52.283140998000022 ], [ 23.653733, 52.072494999000014 ], [ 23.529021, 51.731750998999985 ], [ 23.617665999, 51.507613999 ], [ 23.709221, 51.277648498000019 ], [ 24.1457825, 50.869376999 ], [ 24.0345575, 50.44484350099998 ], [ 23.547642, 50.251602 ], [ 22.6861495, 49.57316549699999 ], [ 22.894087, 49.016692998999986 ], [ 22.56684, 49.088377498999989 ], [ 21.397712500000011, 49.433793998999988 ], [ 21.241772, 49.776104499999974 ], [ 21.297149499999989, 49.842864 ], [ 21.150439, 49.976484500000026 ], [ 21.208831499999974, 50.354897499 ], [ 20.681539, 50.205875497000022 ], [ 19.949966500000016, 50.504782501000022 ], [ 19.712949499999979, 50.729156000999978 ], [ 19.74706, 50.86597000099999 ], [ 20.0394705, 50.990206496999974 ], [ 19.993847, 51.183954001000018 ], [ 20.432815, 51.33940499900001 ], [ 21.630164499999978, 51.06338899799999 ], [ 21.802998, 51.072078998999984 ], [ 21.87317600099999, 51.474913499000024 ], [ 21.6155425, 51.617562 ], [ 21.879981, 51.69363149899999 ], [ 21.889174500000024, 51.973184 ], [ 22.622901499000022, 52.018744497 ], [ 23.128409, 52.287841500000013 ], [ 22.58041350000002, 52.393157997 ], [ 22.408589, 52.609689999000011 ], [ 22.453771500000016, 52.788238499999977 ], [ 21.694427500000018, 53.138093997999988 ], [ 21.735427500000014, 53.312727999 ], [ 21.598199912999974, 53.480181970999979 ], [ 22.135728, 53.544539998 ], [ 22.703502500000013, 53.767354496999985 ], [ 22.782806, 53.915489498999989 ], [ 22.476556500000015, 54.201305500999979 ], [ 22.792095500000016, 54.363358996999978 ], [ 23.321498500000018, 54.25332599799998 ], [ 23.51465, 53.956559997999989 ] ] ] } },
-{ "type": "Feature", "id": 1520, "properties": { "NUTS_ID": "PL4", "STAT_LEVL_": 1, "SHAPE_AREA": 8.9635947398799996, "SHAPE_LEN": 15.9574634674 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 19.04712, 52.332803999000021 ], [ 18.827809, 52.06418800099999 ], [ 18.685665500000027, 51.821002997999983 ], [ 18.47196550000001, 51.851005499 ], [ 18.0745215, 51.349915001 ], [ 18.1636815, 51.172516997 ], [ 17.939456, 51.10886499899999 ], [ 17.795269500000018, 51.194146000999979 ], [ 17.556245, 51.58430149899999 ], [ 17.257433, 51.642843001000017 ], [ 16.828372, 51.57218799899999 ], [ 16.416191, 51.784865 ], [ 15.979245, 51.802309001000026 ], [ 15.714032499999973, 51.51739699699999 ], [ 14.974183, 51.36394999700002 ], [ 14.729862, 51.581776999 ], [ 14.716716, 52.001188001 ], [ 14.755227, 52.070024998 ], [ 14.600891499999989, 52.272051998999984 ], [ 14.534361999, 52.395008 ], [ 14.565063, 52.624497 ], [ 14.436438, 52.679900498999984 ], [ 14.156692, 52.895590998999978 ], [ 14.143658, 52.9613685 ], [ 14.412157, 53.329635998000015 ], [ 14.267542, 53.697806496999988 ], [ 14.619581499999981, 53.649783999000022 ], [ 14.573513, 53.849196999000014 ], [ 14.2130775, 53.866479498999979 ], [ 14.226302, 53.928652998000018 ], [ 15.388074500000016, 54.158907 ], [ 16.699085, 54.569247001 ], [ 16.858620499999972, 54.38257649799999 ], [ 16.792764499999976, 53.985550499999988 ], [ 16.982053499000017, 53.904909499999974 ], [ 16.892248, 53.655868999 ], [ 17.390653, 53.490964 ], [ 17.438793499999974, 53.26751450099999 ], [ 17.3015565, 52.994803 ], [ 17.509582500000022, 52.917767997 ], [ 17.458523, 52.738948000999983 ], [ 18.377150500000027, 52.53746350099999 ], [ 19.04712, 52.332803999000021 ] ] ] } },
-{ "type": "Feature", "id": 1542, "properties": { "NUTS_ID": "SE1", "STAT_LEVL_": 1, "SHAPE_AREA": 7.6648856093999997, "SHAPE_LEN": 18.135565057800001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 17.370218, 60.654461500000025 ], [ 17.675121999, 60.505647996999983 ], [ 17.9892855, 60.6041095 ], [ 18.617636, 60.232382500000028 ], [ 18.310146499999973, 60.316625499999986 ], [ 18.513265499999989, 60.14808899799999 ], [ 18.838097, 60.110722501 ], [ 19.0854185, 59.758264497000027 ], [ 18.058382209, 59.379042366000022 ], [ 18.138552267000023, 59.332316024000022 ], [ 18.708546500000011, 59.290029998000023 ], [ 17.991549570000018, 58.966979990000027 ], [ 17.868635241999982, 58.847063186000014 ], [ 17.850585, 58.901825998999982 ], [ 17.827534277999973, 58.884624761999987 ], [ 17.605396633, 58.94228633099999 ], [ 17.592441408000013, 58.945649194 ], [ 17.572396, 58.950852498000017 ], [ 16.75066350100002, 58.626932001 ], [ 16.218891636000023, 58.625096370999984 ], [ 16.948542, 58.4797815 ], [ 16.414471, 58.475954499000011 ], [ 16.927406, 58.335789498 ], [ 16.627013499999975, 58.356352500000014 ], [ 16.854670499, 58.172553497000024 ], [ 16.666683499999976, 57.996076497999979 ], [ 16.239588, 58.134933499999988 ], [ 15.942313999000021, 57.803287497999975 ], [ 15.421051, 57.704879498000025 ], [ 15.1283585, 57.716312501 ], [ 14.982714, 58.1491125 ], [ 14.41671, 58.187240497 ], [ 14.778936499, 58.645992501000023 ], [ 14.295995, 59.012779501000011 ], [ 14.437155500000017, 60.026160996999977 ], [ 15.422092500000019, 59.854747499999974 ], [ 15.801567, 60.179103499 ], [ 16.703831, 60.195720498000014 ], [ 17.192478, 60.300712499999975 ], [ 17.370218, 60.654461500000025 ] ] ] } },
-{ "type": "Feature", "id": 1551, "properties": { "NUTS_ID": "SE2", "STAT_LEVL_": 1, "SHAPE_AREA": 12.5635036565, "SHAPE_LEN": 26.444577975000001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 12.228614499, 59.256698498999981 ], [ 13.254575499999987, 58.725619998000013 ], [ 13.589992, 59.059444499999984 ], [ 14.295995, 59.012779501000011 ], [ 14.778936499, 58.645992501000023 ], [ 14.41671, 58.187240497 ], [ 14.982714, 58.1491125 ], [ 15.1283585, 57.716312501 ], [ 15.421051, 57.704879498000025 ], [ 15.942313999000021, 57.803287497999975 ], [ 16.239588, 58.134933499999988 ], [ 16.666683499999976, 57.996076497999979 ], [ 16.690222393, 57.99604851399999 ], [ 16.73475643099999, 57.995995569 ], [ 16.813461, 57.995902 ], [ 16.727634500000022, 57.443552997999973 ], [ 16.0498715, 56.321888500999989 ], [ 15.85154799899999, 56.085102498000026 ], [ 15.592694, 56.211868998 ], [ 15.348007, 56.124922501000015 ], [ 14.687203, 56.167117 ], [ 14.781868499999973, 56.034379999 ], [ 14.721438499999977, 55.993557001 ], [ 14.5463995, 56.061317499999973 ], [ 14.220109499999978, 55.830457500000023 ], [ 14.360844, 55.554259498000022 ], [ 14.194909, 55.384035000999972 ], [ 12.80983950000001, 55.37825650000002 ], [ 13.063593, 55.670109497 ], [ 12.445960500000012, 56.302680001 ], [ 12.783795, 56.219567998 ], [ 12.619629499999974, 56.42242299899999 ], [ 12.899313, 56.4490045 ], [ 12.936727500000018, 56.587077 ], [ 11.913888, 57.40103749799999 ], [ 11.922900500000026, 57.562223 ], [ 11.990241500000025, 57.721453500999985 ], [ 11.6737205, 57.84875499899999 ], [ 11.844677282000021, 58.160147672999983 ], [ 11.406957, 58.143874 ], [ 11.702689, 58.433648 ], [ 11.220071499000028, 58.40892949800002 ], [ 11.3094795, 58.479590496000014 ], [ 11.256905, 58.677688000999979 ], [ 11.174135001000025, 58.718292497999983 ], [ 11.23903150000001, 58.838971500000014 ], [ 11.113543499, 58.998263497999972 ], [ 11.460827, 58.988658499 ], [ 11.632557, 58.908306501000027 ], [ 11.8261965, 59.237849998 ], 
[ 12.228614499, 59.256698498999981 ] ] ], [ [ [ 18.711624, 57.244686498000021 ], [ 18.340005500000018, 57.072327998999981 ], [ 18.091156, 57.259374001000026 ], [ 18.106193500000018, 57.53464150100001 ], [ 18.777192500000012, 57.867417000999978 ], [ 19.097435, 57.819173001000024 ], [ 18.810876, 57.705975501000012 ], [ 18.761467499999981, 57.469703501000026 ], [ 18.917816500000015, 57.403347499 ], [ 18.711624, 57.244686498000021 ] ] ], [ [ [ 17.07371999899999, 57.179024001000016 ], [ 16.400671, 56.195806996999977 ], [ 16.413673, 56.589974499999983 ], [ 16.959014, 57.291363499999989 ], [ 17.076804309000011, 57.355879516000016 ], [ 17.07371999899999, 57.179024001000016 ] ] ] ] } },
-{ "type": "Feature", "id": 1563, "properties": { "NUTS_ID": "SK0", "STAT_LEVL_": 1, "SHAPE_AREA": 5.9578391045199997, "SHAPE_LEN": 13.4472835294 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 21.397712500000011, 49.433793998999988 ], [ 22.56684, 49.088377498999989 ], [ 22.382817499999987, 48.86226399899999 ], [ 22.155306, 48.403396499 ], [ 22.121077500000013, 48.378311499 ], [ 21.721957, 48.351050499 ], [ 21.440056, 48.585232999000027 ], [ 20.463937, 48.463967 ], [ 20.051879, 48.16770399699999 ], [ 19.0143225, 48.077736499000025 ], [ 18.928392, 48.056832499 ], [ 18.7548155, 47.975082499 ], [ 18.848478, 47.818228001000023 ], [ 17.893923, 47.739456999000026 ], [ 17.705436500000019, 47.758992498999987 ], [ 17.247427500000015, 48.012008998999988 ], [ 17.1607975, 48.006656501 ], [ 17.066741, 48.11868149899999 ], [ 16.976203, 48.172244498 ], [ 16.851106, 48.438635001000023 ], [ 16.949778, 48.535791999000025 ], [ 16.940278, 48.617245498999978 ], [ 17.2016625, 48.878028997 ], [ 17.3967255, 48.813349997999978 ], [ 17.64693, 48.854265998000017 ], [ 18.322436, 49.315059 ], [ 18.4035955, 49.396745499000019 ], [ 18.851551, 49.51718900100002 ], [ 19.153403, 49.40377700099998 ], [ 19.467386499999975, 49.613767 ], [ 19.883929500000022, 49.204176999000026 ], [ 20.6149395, 49.417832998999984 ], [ 20.9237225, 49.296234498999979 ], [ 21.397712500000011, 49.433793998999988 ] ] ] } },
-{ "type": "Feature", "id": 1577, "properties": { "NUTS_ID": "TR1", "STAT_LEVL_": 1, "SHAPE_AREA": 0.63573077059399996, "SHAPE_LEN": 4.8702439844700001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 29.0726335, 41.124540498999977 ], [ 28.824152, 40.954229499 ], [ 27.991102500000011, 41.018344997999975 ], [ 28.181659498999977, 41.564058999 ], [ 29.1158845, 41.228910498 ], [ 29.037527, 41.155839999000023 ], [ 29.0726335, 41.124540498999977 ] ] ], [ [ [ 29.849114, 41.064824498 ], [ 29.342517499999985, 40.807657499000015 ], [ 29.031469500000014, 40.967085 ], [ 29.087270499999988, 41.178521999 ], [ 29.265264, 41.230552 ], [ 29.865624500000024, 41.143447498 ], [ 29.849114, 41.064824498 ] ] ] ] } },
-{ "type": "Feature", "id": 1580, "properties": { "NUTS_ID": "TR2", "STAT_LEVL_": 1, "SHAPE_AREA": 4.579164777, "SHAPE_LEN": 16.835196242599999 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ 28.164779, 40.395613 ], [ 28.067550499999982, 40.253133999 ], [ 28.2650375, 39.870233998 ], [ 28.968574, 39.600677499000028 ], [ 28.885427, 39.372048998000025 ], [ 28.597988, 39.280636 ], [ 28.652546500000028, 39.147368498999981 ], [ 28.15897, 39.05698399900001 ], [ 27.800632, 39.313108 ], [ 27.381316, 39.352746501000013 ], [ 26.763921, 39.175209999 ], [ 26.671144500000025, 39.279348497 ], [ 26.950548500000025, 39.559968498999979 ], [ 26.618685, 39.547879498999976 ], [ 26.062643, 39.479497998999989 ], [ 26.179243, 39.99143399799999 ], [ 26.7568455, 40.403798998000013 ], [ 27.319921500000021, 40.433412498 ], [ 27.5062805, 40.304677998999978 ], [ 27.879064500000027, 40.372956998 ], [ 27.753523, 40.528421498 ], [ 28.016809500000022, 40.447443 ], [ 28.164779, 40.395613 ] ] ], [ [ [ 27.059650499999975, 42.088336997999988 ], [ 27.559395, 41.904781496999988 ], [ 28.035512499999982, 41.983079498999984 ], [ 28.152973, 41.57896249700002 ], [ 28.181659498999977, 41.564058999 ], [ 27.991102500000011, 41.018344997999975 ], [ 27.5128535, 40.974729497 ], [ 26.968651500000021, 40.553368998999986 ], [ 26.165872499999978, 40.052561 ], [ 26.218675500000018, 40.319448497 ], [ 26.838197499999978, 40.58582499900001 ], [ 26.734886500000016, 40.64293299799999 ], [ 26.151155500000016, 40.5893155 ], [ 26.032758, 40.730256999 ], [ 26.291015500000015, 40.93188349899998 ], [ 26.321652, 41.250406499 ], [ 26.628431499999976, 41.345533499 ], [ 26.6001205, 41.601198997999973 ], [ 26.357879, 41.711104498999987 ], [ 26.561544500000025, 41.92627349899999 ], [ 26.9492285, 42.000213498999983 ], [ 27.059650499999975, 42.088336997999988 ] ] ] ] } },
-{ "type": "Feature", "id": 1588, "properties": { "NUTS_ID": "TR3", "STAT_LEVL_": 1, "SHAPE_AREA": 9.1200942847499995, "SHAPE_LEN": 21.6883571686 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 31.6197325, 39.10305549899999 ], [ 31.621331, 38.619947497999988 ], [ 31.233349499999974, 38.409972998 ], [ 30.986741, 38.466414499 ], [ 30.039979500000015, 37.752413999 ], [ 29.852119, 37.752285498999981 ], [ 29.500381, 37.618871997999975 ], [ 29.606530500000019, 37.395360998 ], [ 29.341447, 37.006848998 ], [ 29.71326449899999, 36.956309498999985 ], [ 29.637207, 36.669686001 ], [ 29.2605615, 36.304398998000011 ], [ 29.102026, 36.386720497999988 ], [ 29.103811, 36.670343498000022 ], [ 28.928788999, 36.754858001 ], [ 28.851038, 36.662084499 ], [ 28.270406499999979, 36.852479997999978 ], [ 27.979345500000022, 36.553571498 ], [ 28.131860500000016, 36.793966501 ], [ 27.374422, 36.685484 ], [ 28.032835, 36.787759999 ], [ 28.330560499, 37.033395498 ], [ 27.565542, 36.977285499 ], [ 27.4248715, 37.03574199799999 ], [ 27.264269, 36.964120998 ], [ 27.32292, 37.15848349800001 ], [ 27.47090350000002, 37.08077799900002 ], [ 27.611025, 37.257513 ], [ 27.404363, 37.368095 ], [ 27.424582461, 37.407003472999975 ], [ 27.419004802000018, 37.41190175600002 ], [ 27.202436499999976, 37.350824499999987 ], [ 27.21665, 37.59111199900002 ], [ 27.003069, 37.659795998999982 ], [ 27.234331, 37.72418399899999 ], [ 27.264569, 37.875048498000012 ], [ 26.757936500000028, 38.221479499999987 ], [ 26.5912, 38.102750497999978 ], [ 26.281446500000015, 38.26644349899999 ], [ 26.344708121999986, 38.485118011 ], [ 26.348637239000027, 38.49869967 ], [ 26.397398, 38.667249498999979 ], [ 26.6252955, 38.52710550099999 ], [ 26.674020499999983, 38.311987 ], [ 27.167742, 38.440419999000028 ], [ 26.73378550000001, 38.653470998999978 ], [ 27.06627450000002, 38.877225997999972 ], [ 26.763921, 39.175209999 ], [ 27.381316, 39.352746501000013 ], [ 27.800632, 39.313108 ], [ 28.15897, 39.05698399900001 ], [ 
28.652546500000028, 39.147368498999981 ], [ 28.597988, 39.280636 ], [ 28.885427, 39.372048998000025 ], [ 28.968574, 39.600677499000028 ], [ 29.1940935, 39.600573998000016 ], [ 29.396933, 39.901012499999979 ], [ 29.74218350000001, 39.871334 ], [ 29.760535, 39.710123497999973 ], [ 30.0601345, 39.658641 ], [ 30.355477, 39.499004999000022 ], [ 30.429491499999983, 39.217612498999983 ], [ 30.741455499999972, 39.126772999000025 ], [ 31.192543, 39.279110498000023 ], [ 31.6197325, 39.10305549899999 ] ] ] } },
-{ "type": "Feature", "id": 1600, "properties": { "NUTS_ID": "TR4", "STAT_LEVL_": 1, "SHAPE_AREA": 5.1458750138699996, "SHAPE_LEN": 15.4662318942 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 31.2956565, 41.11632 ], [ 31.75363, 41.005629999 ], [ 32.134024, 41.030704 ], [ 32.560043, 40.807477499000015 ], [ 32.554816, 40.691943998 ], [ 31.911622, 40.328740498 ], [ 31.128547500000025, 40.36527999899999 ], [ 30.82533, 40.125784499000019 ], [ 31.676844, 40.035240500999976 ], [ 32.069824, 39.276070998000023 ], [ 31.8153175, 39.128693499 ], [ 31.6197325, 39.10305549899999 ], [ 31.192543, 39.279110498000023 ], [ 30.741455499999972, 39.126772999000025 ], [ 30.429491499999983, 39.217612498999983 ], [ 30.355477, 39.499004999000022 ], [ 30.0601345, 39.658641 ], [ 29.760535, 39.710123497999973 ], [ 29.74218350000001, 39.871334 ], [ 29.396933, 39.901012499999979 ], [ 29.1940935, 39.600573998000016 ], [ 28.968574, 39.600677499000028 ], [ 28.2650375, 39.870233998 ], [ 28.067550499999982, 40.253133999 ], [ 28.164779, 40.395613 ], [ 29.154478, 40.425374998999985 ], [ 28.991232500000024, 40.466273498000021 ], [ 28.780161, 40.53619199799999 ], [ 29.546993, 40.695436 ], [ 29.9421865, 40.750489999000024 ], [ 29.342517499999985, 40.807657499000015 ], [ 29.849114, 41.064824498 ], [ 29.865624500000024, 41.143447498 ], [ 30.354686500000014, 41.184194499 ], [ 30.9820555, 41.072602501 ], [ 31.2956565, 41.11632 ] ] ] } },
-{ "type": "Feature", "id": 1611, "properties": { "NUTS_ID": "TR5", "STAT_LEVL_": 1, "SHAPE_AREA": 7.7779817103999997, "SHAPE_LEN": 16.850021436799999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 33.695422, 40.33224149900002 ], [ 33.247372499999983, 39.64108699799999 ], [ 33.441716, 39.35144900099999 ], [ 33.886019, 39.043412499 ], [ 33.722786, 38.942947998000022 ], [ 33.7054745, 38.680822499999977 ], [ 33.464405, 38.636591499000019 ], [ 33.2033695, 38.278494001000013 ], [ 33.3944535, 37.97126099799999 ], [ 34.033573499999989, 38.004810997999982 ], [ 34.4004265, 37.751674998999988 ], [ 34.3350345, 37.479407497000011 ], [ 34.428914500000019, 37.316384998999979 ], [ 33.997476, 37.097663997999973 ], [ 32.95091050000002, 36.831344999 ], [ 33.238955499999975, 36.520920999 ], [ 32.573227, 36.357624 ], [ 32.457231499999978, 36.738470999000015 ], [ 31.740267500000016, 37.35582599899999 ], [ 31.4533915, 37.333832998999981 ], [ 31.300975, 37.404258997999989 ], [ 31.419393, 37.973507997000013 ], [ 31.5975305, 38.0560855 ], [ 31.233349499999974, 38.409972998 ], [ 31.621331, 38.619947497999988 ], [ 31.6197325, 39.10305549899999 ], [ 31.8153175, 39.128693499 ], [ 32.069824, 39.276070998000023 ], [ 31.676844, 40.035240500999976 ], [ 30.82533, 40.125784499000019 ], [ 31.128547500000025, 40.36527999899999 ], [ 31.911622, 40.328740498 ], [ 32.554816, 40.691943998 ], [ 32.972095498999977, 40.615893501000016 ], [ 33.220758999, 40.320784498000023 ], [ 33.695422, 40.33224149900002 ] ] ] } },
-{ "type": "Feature", "id": 1617, "properties": { "NUTS_ID": "TR6", "STAT_LEVL_": 1, "SHAPE_AREA": 9.0355123981199998, "SHAPE_LEN": 27.319507149100001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 37.6386675, 37.931167 ], [ 37.432401, 37.641709 ], [ 37.623216500000012, 37.511652497 ], [ 37.08278, 37.175676 ], [ 36.901796, 37.333660499000018 ], [ 36.697388499999988, 37.218142496999974 ], [ 36.465673, 36.94866749900001 ], [ 36.660010393999983, 36.833231536000028 ], [ 36.549576, 36.487769997999976 ], [ 36.683648, 36.236783499000012 ], [ 36.39222, 36.213326 ], [ 36.37497, 35.99791399899999 ], [ 36.168469, 35.819717997999987 ], [ 35.917953, 35.928695499000014 ], [ 35.978777, 36.019441498999981 ], [ 35.779623, 36.318555998000022 ], [ 36.2157305, 36.65963549899999 ], [ 36.1485765, 36.855066498999975 ], [ 35.963096, 36.902986999 ], [ 35.56584, 36.5647945 ], [ 35.339365, 36.538526499 ], [ 34.899245, 36.738280998999983 ], [ 34.5607205, 36.768163496999989 ], [ 33.68421, 36.133401999 ], [ 32.57634, 36.093232999 ], [ 32.029032, 36.538202499000022 ], [ 30.698167500000011, 36.884519497999975 ], [ 30.327932499999974, 36.296952998999984 ], [ 29.644518, 36.196329 ], [ 29.2605615, 36.304398998000011 ], [ 29.637207, 36.669686001 ], [ 29.71326449899999, 36.956309498999985 ], [ 29.341447, 37.006848998 ], [ 29.606530500000019, 37.395360998 ], [ 29.500381, 37.618871997999975 ], [ 29.852119, 37.752285498999981 ], [ 30.039979500000015, 37.752413999 ], [ 30.986741, 38.466414499 ], [ 31.233349499999974, 38.409972998 ], [ 31.5975305, 38.0560855 ], [ 31.419393, 37.973507997000013 ], [ 31.300975, 37.404258997999989 ], [ 31.4533915, 37.333832998999981 ], [ 31.740267500000016, 37.35582599899999 ], [ 32.457231499999978, 36.738470999000015 ], [ 32.573227, 36.357624 ], [ 33.238955499999975, 36.520920999 ], [ 32.95091050000002, 36.831344999 ], [ 33.997476, 37.097663997999973 ], [ 34.428914500000019, 37.316384998999979 ], [ 34.748, 37.409303998999974 ], [ 34.8854045, 37.67054299900002 ], [ 
35.218708, 37.759224499000027 ], [ 35.574764, 37.737158498999975 ], [ 35.6144215, 37.962110999 ], [ 36.253814, 38.379422998999985 ], [ 36.440496, 38.21943249899999 ], [ 36.769845499999974, 38.546435998999982 ], [ 37.341487, 38.593515498999977 ], [ 37.259534499999972, 38.479911 ], [ 37.7602465, 38.203050997999981 ], [ 37.6386675, 37.931167 ] ] ] } },
-{ "type": "Feature", "id": 1629, "properties": { "NUTS_ID": "TR7", "STAT_LEVL_": 1, "SHAPE_AREA": 9.4533397717100005, "SHAPE_LEN": 16.901896064700001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 38.78355, 40.083271498999977 ], [ 38.3667795, 39.93940199799999 ], [ 38.349975, 39.149943496999981 ], [ 37.825541, 39.093950499000016 ], [ 37.341487, 38.593515498999977 ], [ 36.769845499999974, 38.546435998999982 ], [ 36.440496, 38.21943249899999 ], [ 36.253814, 38.379422998999985 ], [ 35.6144215, 37.962110999 ], [ 35.574764, 37.737158498999975 ], [ 35.218708, 37.759224499000027 ], [ 34.8854045, 37.67054299900002 ], [ 34.748, 37.409303998999974 ], [ 34.428914500000019, 37.316384998999979 ], [ 34.3350345, 37.479407497000011 ], [ 34.4004265, 37.751674998999988 ], [ 34.033573499999989, 38.004810997999982 ], [ 33.3944535, 37.97126099799999 ], [ 33.2033695, 38.278494001000013 ], [ 33.464405, 38.636591499000019 ], [ 33.7054745, 38.680822499999977 ], [ 33.722786, 38.942947998000022 ], [ 33.886019, 39.043412499 ], [ 33.441716, 39.35144900099999 ], [ 33.247372499999983, 39.64108699799999 ], [ 33.695422, 40.33224149900002 ], [ 33.9162255, 40.263038 ], [ 34.175507, 39.940434 ], [ 35.072878, 40.012056498999982 ], [ 35.1556395, 40.226113 ], [ 35.35115, 40.2433805 ], [ 35.442588, 40.23402249899999 ], [ 36.003423, 39.95451049799999 ], [ 36.606419500000015, 39.980525999 ], [ 36.8059275, 40.228436999 ], [ 37.450466, 40.170535498999982 ], [ 37.584064, 40.424386998999978 ], [ 37.73245750000001, 40.3425095 ], [ 38.148452500000019, 40.524218497999982 ], [ 38.213064, 40.213312498999983 ], [ 38.78355, 40.083271498999977 ] ] ] } },
-{ "type": "Feature", "id": 1640, "properties": { "NUTS_ID": "TR8", "STAT_LEVL_": 1, "SHAPE_AREA": 8.0248435480399998, "SHAPE_LEN": 15.605980006 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 37.152348500000016, 41.148413498000025 ], [ 36.671261, 40.915629498999976 ], [ 37.632352, 40.543681998000011 ], [ 37.584064, 40.424386998999978 ], [ 37.450466, 40.170535498999982 ], [ 36.8059275, 40.228436999 ], [ 36.606419500000015, 39.980525999 ], [ 36.003423, 39.95451049799999 ], [ 35.442588, 40.23402249899999 ], [ 35.35115, 40.2433805 ], [ 35.1556395, 40.226113 ], [ 35.072878, 40.012056498999982 ], [ 34.175507, 39.940434 ], [ 33.9162255, 40.263038 ], [ 33.695422, 40.33224149900002 ], [ 33.220758999, 40.320784498000023 ], [ 32.972095498999977, 40.615893501000016 ], [ 32.554816, 40.691943998 ], [ 32.560043, 40.807477499000015 ], [ 32.134024, 41.030704 ], [ 31.75363, 41.005629999 ], [ 31.2956565, 41.11632 ], [ 31.403204, 41.317392498 ], [ 32.062578, 41.580305999000018 ], [ 32.736103500000013, 41.849009498999976 ], [ 33.3196165, 42.01599299899999 ], [ 34.228878499000018, 41.954958998999984 ], [ 34.7977085, 41.954602997999984 ], [ 34.943724, 42.096982998999977 ], [ 35.513699499999973, 41.635979999000028 ], [ 35.9609585, 41.734693 ], [ 36.385412499999973, 41.256273498999974 ], [ 36.651739, 41.383401999 ], [ 37.152348500000016, 41.148413498000025 ] ] ] } },
-{ "type": "Feature", "id": 1654, "properties": { "NUTS_ID": "TR9", "STAT_LEVL_": 1, "SHAPE_AREA": 3.81807358941, "SHAPE_LEN": 13.882364019500001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 42.515174931999979, 41.43828241599999 ], [ 42.596096, 41.2710925 ], [ 42.28524950000002, 40.916437998999982 ], [ 41.9458735, 40.950454997 ], [ 41.809857500000021, 40.684673499999974 ], [ 41.384645499999976, 40.567911998 ], [ 41.164019499, 40.834939498999972 ], [ 40.60606150000001, 40.536 ], [ 40.461455, 40.52868399800002 ], [ 40.089906, 40.572094999 ], [ 39.643336499999975, 40.095837999000025 ], [ 39.789502, 39.940146 ], [ 39.4362185, 39.878558 ], [ 38.95667850000001, 40.068511999 ], [ 38.78355, 40.083271498999977 ], [ 38.213064, 40.213312498999983 ], [ 38.148452500000019, 40.524218497999982 ], [ 37.73245750000001, 40.3425095 ], [ 37.584064, 40.424386998999978 ], [ 37.632352, 40.543681998000011 ], [ 36.671261, 40.915629498999976 ], [ 37.152348500000016, 41.148413498000025 ], [ 38.109992499999976, 40.95851799799999 ], [ 39.1793085, 41.074095999 ], [ 40.328287499999988, 40.987883501 ], [ 41.2519, 41.32993349899999 ], [ 41.547133499999973, 41.520375498000021 ], [ 42.515174931999979, 41.43828241599999 ] ] ] } },
-{ "type": "Feature", "id": 1662, "properties": { "NUTS_ID": "TRA", "STAT_LEVL_": 1, "SHAPE_AREA": 7.4832750096799998, "SHAPE_LEN": 17.830357565700002 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 44.049955809999972, 39.362815426999987 ], [ 43.716901, 39.209512500000017 ], [ 43.412439, 39.393592499000022 ], [ 43.1418885, 39.310626996999986 ], [ 43.184449, 39.198752497999976 ], [ 42.979636, 39.016623498 ], [ 42.75008, 38.92519099899999 ], [ 42.671909, 39.21350849800001 ], [ 42.406795, 39.467103499000018 ], [ 41.801216, 39.14470749899999 ], [ 41.20683, 39.353699001 ], [ 40.651635, 39.522910498999977 ], [ 40.4484405, 39.522155498000018 ], [ 39.828901499999972, 39.601652498000021 ], [ 39.072714, 39.437300499 ], [ 38.7295135, 39.137155496999981 ], [ 38.768138, 39.006262998000011 ], [ 38.6840535, 39.02384299900001 ], [ 38.349975, 39.149943496999981 ], [ 38.3667795, 39.93940199799999 ], [ 38.78355, 40.083271498999977 ], [ 38.95667850000001, 40.068511999 ], [ 39.4362185, 39.878558 ], [ 39.789502, 39.940146 ], [ 39.643336499999975, 40.095837999000025 ], [ 40.089906, 40.572094999 ], [ 40.461455, 40.52868399800002 ], [ 40.60606150000001, 40.536 ], [ 41.164019499, 40.834939498999972 ], [ 41.384645499999976, 40.567911998 ], [ 41.809857500000021, 40.684673499999974 ], [ 41.9458735, 40.950454997 ], [ 42.28524950000002, 40.916437998999982 ], [ 42.596096, 41.2710925 ], [ 42.515174931999979, 41.43828241599999 ], [ 42.836095, 41.584442497999987 ], [ 43.473827, 41.123297499999978 ], [ 43.469638, 41.057344998000019 ], [ 43.750409, 40.744998999000018 ], [ 43.583054, 40.45111099899998 ], [ 43.653923, 40.130377499000019 ], [ 44.351662, 40.022220998000023 ], [ 44.76841, 39.714094498 ], [ 44.812042500000018, 39.63174349799999 ], [ 44.502099499999986, 39.7169935 ], [ 44.401101, 39.416515999000012 ], [ 44.049955809999972, 39.362815426999987 ] ] ] } },
-{ "type": "Feature", "id": 1672, "properties": { "NUTS_ID": "TRB", "STAT_LEVL_": 1, "SHAPE_AREA": 8.4190056696200006, "SHAPE_LEN": 20.650621091600001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 44.049955809999972, 39.362815426999987 ], [ 44.300266998999973, 38.842627999 ], [ 44.30526, 38.400534998000012 ], [ 44.482518, 38.341299998000011 ], [ 44.223969, 37.899149998999974 ], [ 44.462854938000021, 37.809568168999988 ], [ 44.617768, 37.717975998999975 ], [ 44.588852, 37.443092998 ], [ 44.801659, 37.321662999000011 ], [ 44.793147499999975, 37.17628399900002 ], [ 44.349709, 37.038322998000012 ], [ 44.118882, 37.315686 ], [ 43.424563, 37.275334499999985 ], [ 43.307242, 37.438266498000019 ], [ 43.505390499999976, 37.713529998000013 ], [ 42.9662085, 37.762133 ], [ 42.767993499999989, 37.913629497999978 ], [ 41.702295, 38.246542998999985 ], [ 41.506182, 38.564977999 ], [ 41.380549998999982, 38.492253498000025 ], [ 41.176564, 38.716726 ], [ 40.459416499999975, 38.622623498999985 ], [ 40.31407, 38.465824499 ], [ 39.162863, 38.304434497999978 ], [ 39.122747, 38.183312999 ], [ 38.816859, 38.012205499 ], [ 38.329782500000022, 38.204376998999976 ], [ 38.095202500000028, 38.108330497999987 ], [ 38.0894285, 37.954172998999979 ], [ 37.6386675, 37.931167 ], [ 37.7602465, 38.203050997999981 ], [ 37.259534499999972, 38.479911 ], [ 37.341487, 38.593515498999977 ], [ 37.825541, 39.093950499000016 ], [ 38.349975, 39.149943496999981 ], [ 38.6840535, 39.02384299900001 ], [ 38.768138, 39.006262998000011 ], [ 38.7295135, 39.137155496999981 ], [ 39.072714, 39.437300499 ], [ 39.828901499999972, 39.601652498000021 ], [ 40.4484405, 39.522155498000018 ], [ 40.651635, 39.522910498999977 ], [ 41.20683, 39.353699001 ], [ 41.801216, 39.14470749899999 ], [ 42.406795, 39.467103499000018 ], [ 42.671909, 39.21350849800001 ], [ 42.75008, 38.92519099899999 ], [ 42.979636, 39.016623498 ], [ 43.184449, 39.198752497999976 ], [ 43.1418885, 39.310626996999986 ], [ 43.412439, 39.393592499000022 ], 
[ 43.716901, 39.209512500000017 ], [ 44.049955809999972, 39.362815426999987 ] ] ] } },
-{ "type": "Feature", "id": 1683, "properties": { "NUTS_ID": "TRC", "STAT_LEVL_": 1, "SHAPE_AREA": 7.7164063001300001, "SHAPE_LEN": 16.635246868599999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 43.424563, 37.275334499999985 ], [ 42.786801, 37.38367499899999 ], [ 42.3556825, 37.10799350100001 ], [ 42.180832, 37.290541999000027 ], [ 41.663342, 37.102903498999979 ], [ 40.770821, 37.11804999899999 ], [ 40.226033, 36.901373498999988 ], [ 39.221524, 36.665340996999987 ], [ 38.386376, 36.898329997000019 ], [ 38.047589500000015, 36.845554497000023 ], [ 37.585516499999983, 36.703768498999978 ], [ 37.127495, 36.659157 ], [ 36.678777500000024, 36.83266099799999 ], [ 36.660010393999983, 36.833231536000028 ], [ 36.465673, 36.94866749900001 ], [ 36.697388499999988, 37.218142496999974 ], [ 36.901796, 37.333660499000018 ], [ 37.08278, 37.175676 ], [ 37.623216500000012, 37.511652497 ], [ 37.432401, 37.641709 ], [ 37.6386675, 37.931167 ], [ 38.0894285, 37.954172998999979 ], [ 38.095202500000028, 38.108330497999987 ], [ 38.329782500000022, 38.204376998999976 ], [ 38.816859, 38.012205499 ], [ 39.122747, 38.183312999 ], [ 39.162863, 38.304434497999978 ], [ 40.31407, 38.465824499 ], [ 40.459416499999975, 38.622623498999985 ], [ 41.176564, 38.716726 ], [ 41.380549998999982, 38.492253498000025 ], [ 41.506182, 38.564977999 ], [ 41.702295, 38.246542998999985 ], [ 42.767993499999989, 37.913629497999978 ], [ 42.9662085, 37.762133 ], [ 43.505390499999976, 37.713529998000013 ], [ 43.307242, 37.438266498000019 ], [ 43.424563, 37.275334499999985 ] ] ] } },
-{ "type": "Feature", "id": 1697, "properties": { "NUTS_ID": "UKC", "STAT_LEVL_": 1, "SHAPE_AREA": 1.1414604798700001, "SHAPE_LEN": 5.7423487498399997 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -0.7909065, 54.5594825 ], [ -1.234791, 54.51036849799999 ], [ -1.434778, 54.487514498999985 ], [ -1.696865, 54.536060501 ], [ -2.1701665, 54.458256 ], [ -2.312043001, 54.791080500000021 ], [ -2.567809, 54.8236425 ], [ -2.4862965, 55.083118499000022 ], [ -2.689751, 55.18906399799999 ], [ -2.165465499999982, 55.468467501000021 ], [ -2.335961, 55.63214499899999 ], [ -2.034329, 55.811165 ], [ -1.838949500000012, 55.642334 ], [ -1.639348, 55.578319497999985 ], [ -1.461693, 55.0743905 ], [ -1.3639015, 54.9441835 ], [ -1.3473725, 54.860690998999985 ], [ -1.2422295, 54.722594997999977 ], [ -1.261252, 54.571902997999985 ], [ -0.7909065, 54.5594825 ] ] ] } },
-{ "type": "Feature", "id": 1709, "properties": { "NUTS_ID": "SE3", "STAT_LEVL_": 1, "SHAPE_AREA": 58.209015498799999, "SHAPE_LEN": 45.190549258099999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 24.155129, 65.816027 ], [ 22.34076600100002, 65.868960499000025 ], [ 22.402720499, 65.537355497000021 ], [ 22.0356215, 65.4648855 ], [ 21.524723359, 65.231805586 ], [ 21.543604500000015, 65.063058997999974 ], [ 21.156021, 64.815484500000025 ], [ 21.605362500000012, 64.433961 ], [ 20.31427, 63.674042500999974 ], [ 19.8976955, 63.617022999000028 ], [ 19.683038999000019, 63.434309000999974 ], [ 19.286468, 63.4697195 ], [ 18.331517, 63.053526501000022 ], [ 18.55759150099999, 62.962596500000018 ], [ 18.0637, 62.610071 ], [ 17.4012075, 62.549360498999988 ], [ 17.392239500000017, 62.323179001000028 ], [ 17.6594715, 62.2330035 ], [ 17.489299500000016, 62.136950499000022 ], [ 17.3074195, 61.85283999699999 ], [ 17.402429499999982, 61.7223975 ], [ 17.037339499999973, 61.574806497 ], [ 17.070646, 60.89967199900002 ], [ 17.370218, 60.654461500000025 ], [ 17.192478, 60.300712499999975 ], [ 16.703831, 60.195720498000014 ], [ 15.801567, 60.179103499 ], [ 15.422092500000019, 59.854747499999974 ], [ 14.437155500000017, 60.026160996999977 ], [ 14.295995, 59.012779501000011 ], [ 13.589992, 59.059444499999984 ], [ 13.254575499999987, 58.725619998000013 ], [ 12.228614499, 59.256698498999981 ], [ 11.8261965, 59.237849998 ], [ 11.691129, 59.589547499999981 ], [ 11.93987850000002, 59.69458099799999 ], [ 11.926970499999982, 59.790475999000023 ], [ 11.839729, 59.840767 ], [ 12.499544999000022, 60.09765699799999 ], [ 12.606881499999986, 60.512742496999977 ], [ 12.22399, 61.013078001 ], [ 12.670176500000025, 61.055976498 ], [ 12.870847500000025, 61.356494998000016 ], [ 12.137665, 61.723816998000018 ], [ 12.29937, 62.267494 ], [ 12.2546595, 62.331024496999987 ], [ 12.056142, 62.611918499000012 ], [ 12.218231, 63.000332500000013 ], [ 12.0524555, 63.1834445 ], [ 11.974581, 63.269228 ], [ 
12.149767, 63.593946999000025 ], [ 12.683565, 63.9742235 ], [ 13.967524500000025, 64.00797 ], [ 14.157109, 64.195054497 ], [ 14.113869500000021, 64.462484501 ], [ 13.654257500000028, 64.580340498999988 ], [ 14.325985, 65.118916001 ], [ 14.6254765, 65.811808499999984 ], [ 14.516289, 66.132579 ], [ 15.453994500000022, 66.345233501 ], [ 15.3772265, 66.484303996999984 ], [ 16.387759, 67.045462499999985 ], [ 16.158001500000012, 67.51915900099999 ], [ 17.281524, 68.118815 ], [ 17.899761500000011, 67.969372 ], [ 18.151354, 68.19879 ], [ 18.125925, 68.536515999000017 ], [ 19.921397, 68.356013001 ], [ 20.3358725, 68.802312500000028 ], [ 20.0600475, 69.045759001000022 ], [ 20.548636499999986, 69.059968501000014 ], [ 21.888943000999973, 68.5843835 ], [ 23.652348999000026, 67.958793001 ], [ 23.3940235, 67.485820999 ], [ 23.995160500999987, 66.819708999 ], [ 23.6455995, 66.3014085 ], [ 24.155129, 65.816027 ] ] ] } },
-{ "type": "Feature", "id": 1721, "properties": { "NUTS_ID": "SI0", "STAT_LEVL_": 1, "SHAPE_AREA": 2.2971441979099998, "SHAPE_LEN": 9.3683069783700006 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 16.596805, 46.475902498999972 ], [ 16.242148499999985, 46.490075497000021 ], [ 16.301549, 46.378287996999973 ], [ 15.8768685, 46.280055499000014 ], [ 15.7914755, 46.259327497000015 ], [ 15.627262, 46.085953499000027 ], [ 15.706404500000019, 45.97534299900002 ], [ 15.404425, 45.792716997000014 ], [ 15.331290500000023, 45.762854999000012 ], [ 15.277050499999973, 45.604461499000024 ], [ 15.385532, 45.486580499000013 ], [ 15.226437499999975, 45.42708799799999 ], [ 14.570012500000018, 45.672944497 ], [ 14.11819650000001, 45.48123899699999 ], [ 14.109059, 45.482458496999982 ], [ 13.583067, 45.477409997 ], [ 13.7228235, 45.594725497000013 ], [ 13.9186565, 45.63351749899999 ], [ 13.596243, 45.807937501000026 ], [ 13.597145, 45.81952250099999 ], [ 13.496938999, 46.051334999 ], [ 13.664347500000019, 46.177549999 ], [ 13.3754925, 46.29823249899999 ], [ 13.684032, 46.437472501 ], [ 13.714184999, 46.522703499999977 ], [ 14.434500500000013, 46.442943501 ], [ 14.5651755, 46.372453496999981 ], [ 14.6745805, 46.450687500000015 ], [ 15.065120499999978, 46.652112497000019 ], [ 15.40197949899999, 46.65354849900001 ], [ 15.649988, 46.705757 ], [ 15.786422, 46.7074695 ], [ 16.038086, 46.656145497000011 ], [ 15.996236, 46.8353985 ], [ 16.113849, 46.869067998999981 ], [ 16.3707935, 46.722243499 ], [ 16.596805, 46.475902498999972 ] ] ] } },
-{ "type": "Feature", "id": 1743, "properties": { "NUTS_ID": "UKD", "STAT_LEVL_": 1, "SHAPE_AREA": 1.9298566909199999, "SHAPE_LEN": 8.6431854386100007 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -2.1701665, 54.458256 ], [ -2.4608275, 54.226764498000023 ], [ -2.469517, 54.04625699799999 ], [ -2.1844825, 53.952304998999978 ], [ -2.046089, 53.850178 ], [ -2.06121, 53.825671999 ], [ -2.146293001, 53.682266 ], [ -1.909583, 53.53842149899998 ], [ -1.9633505, 53.50985349699999 ], [ -2.0310235, 53.370289001 ], [ -1.987376, 53.213608 ], [ -2.380768, 52.998428499999989 ], [ -2.69927450099999, 52.995460501000025 ], [ -2.726823500000023, 52.9832955 ], [ -3.084193, 53.256122498000025 ], [ -3.110706, 53.296317999 ], [ -3.200367501000017, 53.387527499999976 ], [ -2.928556500000013, 53.308277 ], [ -2.7524115, 53.3147545 ], [ -2.675166499999989, 53.354484500000012 ], [ -2.693358, 53.361835498 ], [ -2.826677, 53.331672501000014 ], [ -3.008742, 53.438411498999983 ], [ -3.1054585, 53.551548001000015 ], [ -2.956203, 53.697529001000021 ], [ -2.833743, 53.722129999 ], [ -3.057371499999988, 53.776481499 ], [ -3.047940499999982, 53.875774498999988 ], [ -2.86947, 54.17673849900001 ], [ -3.149062, 54.093601 ], [ -3.174973, 54.114906499000028 ], [ -3.2049515, 54.215130000999977 ], [ -3.223693609, 54.249058382999976 ], [ -3.613598, 54.525096999000027 ], [ -3.3986365, 54.869125499 ], [ -3.11915, 54.927886999 ], [ -3.091623992999985, 54.97554629199999 ], [ -2.858508, 55.108424999000022 ], [ -2.689751, 55.18906399799999 ], [ -2.4862965, 55.083118499000022 ], [ -2.567809, 54.8236425 ], [ -2.312043001, 54.791080500000021 ], [ -2.1701665, 54.458256 ] ] ] } },
-{ "type": "Feature", "id": 1769, "properties": { "NUTS_ID": "UKE", "STAT_LEVL_": 1, "SHAPE_AREA": 2.12376823527, "SHAPE_LEN": 8.2862590722699991 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 0.017378, 53.525367501 ], [ -0.738485500000024, 53.519866998999987 ], [ -0.741809832, 53.516955832 ], [ -0.7974175, 53.455085998000015 ], [ -0.935518, 53.502521499000011 ], [ -1.199686499999984, 53.311454997 ], [ -1.324675, 53.328853498 ], [ -1.599034500000016, 53.311401498 ], [ -1.801430499999981, 53.481018001 ], [ -1.822188499999982, 53.521117998000022 ], [ -1.909583, 53.53842149899998 ], [ -2.146293001, 53.682266 ], [ -2.06121, 53.825671999 ], [ -2.046089, 53.850178 ], [ -2.1844825, 53.952304998999978 ], [ -2.469517, 54.04625699799999 ], [ -2.4608275, 54.226764498000023 ], [ -2.1701665, 54.458256 ], [ -1.696865, 54.536060501 ], [ -1.434778, 54.487514498999985 ], [ -1.234791, 54.51036849799999 ], [ -0.7909065, 54.5594825 ], [ -0.2124495, 54.15762700099998 ], [ 0.141740261999985, 53.61067700699999 ], [ -0.250096, 53.733318500999985 ], [ -0.419136501000025, 53.719619997 ], [ -0.698366, 53.684653998999977 ], [ -0.722945813000024, 53.611722574 ], [ -0.7450505, 53.571201498999983 ], [ -0.739235033, 53.52388799 ], [ -0.720730575, 53.611187982999979 ], [ -0.295713499999977, 53.713866999 ], [ 0.017378, 53.525367501 ] ] ] } },
-{ "type": "Feature", "id": 1785, "properties": { "NUTS_ID": "UKF", "STAT_LEVL_": 1, "SHAPE_AREA": 2.0991644171799999, "SHAPE_LEN": 7.2785898763399999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -1.801430499999981, 53.481018001 ], [ -1.599034500000016, 53.311401498 ], [ -1.324675, 53.328853498 ], [ -1.199686499999984, 53.311454997 ], [ -0.935518, 53.502521499000011 ], [ -0.7974175, 53.455085998000015 ], [ -0.741809832, 53.516955832 ], [ -0.738485500000024, 53.519866998999987 ], [ 0.017378, 53.525367501 ], [ 0.355712, 53.192062498999974 ], [ 0.043715500000019, 52.903881 ], [ 0.26596, 52.810116000999983 ], [ 0.171689, 52.738036998999974 ], [ -0.031214, 52.66153699900002 ], [ -0.4948475, 52.64027975099998 ], [ -0.41533, 52.578746997 ], [ -0.341543, 52.466945499000019 ], [ -0.465321500000016, 52.322956001000023 ], [ -0.668097, 52.195033998999975 ], [ -0.70541750000001, 52.191570497999976 ], [ -0.871292499999981, 52.0402525 ], [ -1.118058, 52.015426498000011 ], [ -1.331868499999985, 52.168487500000026 ], [ -1.20158, 52.396735998999986 ], [ -1.589611, 52.68727099900002 ], [ -1.597507, 52.700431998999989 ], [ -1.987376, 53.213608 ], [ -2.0310235, 53.370289001 ], [ -1.9633505, 53.50985349699999 ], [ -1.909583, 53.53842149899998 ], [ -1.822188499999982, 53.521117998000022 ], [ -1.801430499999981, 53.481018001 ] ] ] } },
-{ "type": "Feature", "id": 1800, "properties": { "NUTS_ID": "UKG", "STAT_LEVL_": 1, "SHAPE_AREA": 1.7063612080399999, "SHAPE_LEN": 5.7747078284500004 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -1.331868499999985, 52.168487500000026 ], [ -1.6657325, 51.987491498 ], [ -1.76762700099999, 52.112594499000011 ], [ -2.351362, 52.021365999000011 ], [ -2.6502135, 51.826152998999987 ], [ -3.067357500000014, 51.983150501000011 ], [ -3.1419115, 52.127876499000024 ], [ -2.954719, 52.349258500000019 ], [ -3.235562, 52.44255049899999 ], [ -3.14748, 52.890155998000012 ], [ -2.726823500000023, 52.9832955 ], [ -2.69927450099999, 52.995460501000025 ], [ -2.380768, 52.998428499999989 ], [ -1.987376, 53.213608 ], [ -1.597507, 52.700431998999989 ], [ -1.589611, 52.68727099900002 ], [ -1.20158, 52.396735998999986 ], [ -1.331868499999985, 52.168487500000026 ] ] ] } },
-{ "type": "Feature", "id": 1818, "properties": { "NUTS_ID": "UKH", "STAT_LEVL_": 1, "SHAPE_AREA": 2.3906274238899998, "SHAPE_LEN": 8.4359191964600004 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 1.2919645, 51.870651 ], [ 0.844628, 51.781318499 ], [ 0.71307, 51.715091500000028 ], [ 0.756304, 51.691742 ], [ 0.9458965, 51.716212998 ], [ 0.776491998999973, 51.636024498999973 ], [ 0.76434071599999, 51.63674747499999 ], [ 0.85409737399999, 51.601599641 ], [ 0.958179500000028, 51.61969 ], [ 0.821244, 51.540714498 ], [ 0.626791, 51.532172999000011 ], [ 0.51370350000002, 51.531184998000015 ], [ 0.40186059199999, 51.456630585000028 ], [ 0.379662895000024, 51.457289678999985 ], [ 0.217437, 51.480072 ], [ 0.210514, 51.490035998999986 ], [ 0.313079500000015, 51.565814999 ], [ 0.200354, 51.624931499000013 ], [ 0.138225499999976, 51.623543 ], [ -0.012219500000015, 51.646228998000026 ], [ -0.011878500000023, 51.6808775 ], [ -0.18204750000001, 51.668601997 ], [ -0.304423, 51.636348499 ], [ -0.5005615, 51.599689497999975 ], [ -0.745649501, 51.842094498999984 ], [ -0.553598, 51.8267095 ], [ -0.652946499999985, 51.969230501000027 ], [ -0.5917725, 52.110691 ], [ -0.668097, 52.195033998999975 ], [ -0.465321500000016, 52.322956001000023 ], [ -0.341543, 52.466945499000019 ], [ -0.41533, 52.578746997 ], [ -0.4948475, 52.64027975099998 ], [ -0.031214, 52.66153699900002 ], [ 0.171689, 52.738036998999974 ], [ 0.26596, 52.810116000999983 ], [ 0.695918, 52.98762899799999 ], [ 1.675478, 52.74269099899999 ], [ 1.7405435, 52.532100500000013 ], [ 1.756430500000022, 52.471725498000012 ], [ 1.391530358000011, 51.98931597 ], [ 1.05638, 51.95149249799999 ], [ 1.2919645, 51.870651 ] ] ] } },
-{ "type": "Feature", "id": 1838, "properties": { "NUTS_ID": "UKI", "STAT_LEVL_": 1, "SHAPE_AREA": 0.202723101494, "SHAPE_LEN": 2.0682560974299999 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ 0.210514, 51.490035998999986 ], [ 0.217437, 51.480072 ], [ 0.152974, 51.408702999000013 ], [ 0.148871, 51.4085045 ], [ 0.042433500000016, 51.292667498000014 ], [ 0.002330500000028, 51.329132 ], [ -0.156508, 51.3215065 ], [ -0.330622, 51.329006000999982 ], [ -0.317662, 51.393665499 ], [ -0.458605499999976, 51.456314 ], [ -0.509666501000027, 51.469173500000011 ], [ -0.489989499999979, 51.494747 ], [ -0.5005615, 51.599689497999975 ], [ -0.304423, 51.636348499 ], [ -0.18204750000001, 51.668601997 ], [ -0.011878500000023, 51.6808775 ], [ -0.012219500000015, 51.646228998000026 ], [ 0.138225499999976, 51.623543 ], [ 0.200354, 51.624931499000013 ], [ 0.313079500000015, 51.565814999 ], [ 0.210514, 51.490035998999986 ] ] ] } },
-{ "type": "Feature", "id": 1865, "properties": { "NUTS_ID": "UKJ", "STAT_LEVL_": 1, "SHAPE_AREA": 2.3484679541600002, "SHAPE_LEN": 11.501812599200001 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -0.5917725, 52.110691 ], [ -0.652946499999985, 51.969230501000027 ], [ -0.553598, 51.8267095 ], [ -0.745649501, 51.842094498999984 ], [ -0.5005615, 51.599689497999975 ], [ -0.489989499999979, 51.494747 ], [ -0.509666501000027, 51.469173500000011 ], [ -0.458605499999976, 51.456314 ], [ -0.317662, 51.393665499 ], [ -0.330622, 51.329006000999982 ], [ -0.156508, 51.3215065 ], [ 0.002330500000028, 51.329132 ], [ 0.042433500000016, 51.292667498000014 ], [ 0.148871, 51.4085045 ], [ 0.152974, 51.408702999000013 ], [ 0.217437, 51.480072 ], [ 0.379662895000024, 51.457289678999985 ], [ 0.40186059199999, 51.456630585000028 ], [ 0.458900500000027, 51.454936998999983 ], [ 0.720437, 51.45972799899999 ], [ 0.539261022, 51.407019827 ], [ 0.6268945, 51.37521 ], [ 0.951056, 51.373244999 ], [ 0.950297499999976, 51.345749 ], [ 1.4497465, 51.377177997999979 ], [ 1.3796385, 51.1421585 ], [ 0.854784, 50.923709998999982 ], [ -0.03817650000002, 50.799503499000025 ], [ -0.216009, 50.827563997000027 ], [ -0.932831, 50.843151 ], [ -1.021779507000019, 50.835727013 ], [ -1.116579, 50.842498998 ], [ -1.3651595, 50.880069497000022 ], [ -1.4770375, 50.923957997 ], [ -1.306856, 50.819854498999973 ], [ -1.6920495, 50.736625498000024 ], [ -1.956807, 50.98982999899999 ], [ -1.6233815, 50.954635500999984 ], [ -1.49828, 51.329376 ], [ -1.584689500000025, 51.524913998999978 ], [ -1.602793500000018, 51.518295498999976 ], [ -1.6830405, 51.690112997000028 ], [ -1.6657325, 51.987491498 ], [ -1.331868499999985, 52.168487500000026 ], [ -1.118058, 52.015426498000011 ], [ -0.871292499999981, 52.0402525 ], [ -0.70541750000001, 52.191570497999976 ], [ -0.668097, 52.195033998999975 ], [ -0.5917725, 52.110691 ] ] ], [ [ [ -1.069896, 50.683647001 ], [ -1.303737500000011, 50.576045997999984 ], [ -1.586375, 
50.663165997000021 ], [ -1.292164500000013, 50.75799949899999 ], [ -1.069896, 50.683647001 ] ] ] ] } },
-{ "type": "Feature", "id": 1891, "properties": { "NUTS_ID": "UKK", "STAT_LEVL_": 1, "SHAPE_AREA": 3.1640568410799998, "SHAPE_LEN": 11.3657412761 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -1.6657325, 51.987491498 ], [ -1.6830405, 51.690112997000028 ], [ -1.602793500000018, 51.518295498999976 ], [ -1.584689500000025, 51.524913998999978 ], [ -1.49828, 51.329376 ], [ -1.6233815, 50.954635500999984 ], [ -1.956807, 50.98982999899999 ], [ -1.6920495, 50.736625498000024 ], [ -1.7407255, 50.7215385 ], [ -2.0406, 50.719962999000018 ], [ -1.953566001000013, 50.594131499000014 ], [ -2.947815000999981, 50.718307501000027 ], [ -3.5090305, 50.516514001000019 ], [ -3.507436, 50.378989999 ], [ -3.723704, 50.201519 ], [ -4.1231325, 50.346740497999974 ], [ -4.177532638, 50.364021469000022 ], [ -4.76328, 50.326236498000014 ], [ -5.185075499999982, 49.963165496999977 ], [ -5.479827, 50.126014499 ], [ -4.545959499999981, 50.928352499000027 ], [ -4.139783841, 51.077219300000024 ], [ -4.200037001, 51.200939 ], [ -3.7207785, 51.233093499 ], [ -2.992948, 51.32031999899999 ], [ -2.679612, 51.480212999 ], [ -2.674005500000021, 51.544273498999985 ], [ -2.534737, 51.677245998999979 ], [ -2.66065692, 51.627663196000015 ], [ -2.6502135, 51.826152998999987 ], [ -2.351362, 52.021365999000011 ], [ -1.76762700099999, 52.112594499000011 ], [ -1.6657325, 51.987491498 ] ] ] } },
-{ "type": "Feature", "id": 1908, "properties": { "NUTS_ID": "UKL", "STAT_LEVL_": 1, "SHAPE_AREA": 2.60665836629, "SHAPE_LEN": 12.047229786400001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -3.084193, 53.256122498000025 ], [ -2.726823500000023, 52.9832955 ], [ -3.14748, 52.890155998000012 ], [ -3.235562, 52.44255049899999 ], [ -2.954719, 52.349258500000019 ], [ -3.1419115, 52.127876499000024 ], [ -3.067357500000014, 51.983150501000011 ], [ -2.6502135, 51.826152998999987 ], [ -2.66065692, 51.627663196000015 ], [ -3.082688, 51.501907499000026 ], [ -3.6378555, 51.469840998 ], [ -3.88623, 51.617385997999975 ], [ -4.307776, 51.610301999 ], [ -4.05255, 51.703281498000024 ], [ -4.929975500000012, 51.59847650099999 ], [ -5.3202485, 51.861602999000013 ], [ -4.207504501000017, 52.263820497999973 ], [ -3.9310135, 52.553612 ], [ -4.128679, 52.612781499999983 ], [ -4.058637499999975, 52.920723001 ], [ -4.767578500000013, 52.79496 ], [ -4.216110416999982, 53.19851737099998 ], [ -4.69663, 53.30674349899999 ], [ -4.040188, 53.310653499000011 ], [ -4.196792116999973, 53.210816994000027 ], [ -4.007376, 53.246928998999977 ], [ -3.363392499999975, 53.352028 ], [ -3.084193, 53.256122498000025 ] ] ] } },
-{ "type": "Feature", "id": 1923, "properties": { "NUTS_ID": "UKM", "STAT_LEVL_": 1, "SHAPE_AREA": 10.765903724199999, "SHAPE_LEN": 41.233591048100003 }, "geometry": { "type": "MultiPolygon", "coordinates": [ [ [ [ -3.771491500000025, 57.867034999 ], [ -4.4320525, 57.494354000999977 ], [ -2.801509, 57.6952475 ], [ -1.78944, 57.503555498000026 ], [ -2.425346, 56.755187997 ], [ -3.051892501, 56.458499997999979 ], [ -3.27144769, 56.353184073000023 ], [ -3.2769755, 56.350532500999975 ], [ -3.264612652999972, 56.352858137 ], [ -3.243498734000013, 56.356829979999986 ], [ -2.803087, 56.439677997999979 ], [ -2.620387, 56.2611885 ], [ -3.391451500000016, 56.00593199799999 ], [ -3.866839500000026, 56.1206625 ], [ -3.820153, 56.099117501000023 ], [ -3.515724, 56.002258500999972 ], [ -3.425330499999973, 55.994030001 ], [ -3.077645500000017, 55.94689549899999 ], [ -2.3666235, 55.946113498999978 ], [ -2.034329, 55.811165 ], [ -2.335961, 55.63214499899999 ], [ -2.165465499999982, 55.468467501000021 ], [ -2.689751, 55.18906399799999 ], [ -2.858508, 55.108424999000022 ], [ -3.091623992999985, 54.97554629199999 ], [ -4.4054395, 54.677509500999975 ], [ -5.183450000999983, 54.912364998999976 ], [ -5.040228500000012, 54.997726500999988 ], [ -4.65827250000001, 55.570289499000012 ], [ -4.888966499999981, 55.87478999699999 ], [ -4.7651305, 55.957771499999978 ], [ -4.392671741000015, 55.889815842000019 ], [ -4.609648, 55.94667449799999 ], [ -4.828885, 56.079009998 ], [ -5.419614, 55.896728498000016 ], [ -5.520004500000027, 55.361922998000011 ], [ -5.799178, 55.299698001000024 ], [ -5.436028, 55.857562998999981 ], [ -5.666273499999988, 55.800583001 ], [ -5.487101000999985, 56.256931501 ], [ -5.654321, 56.297649497 ], [ -5.38434560799999, 56.531742240000028 ], [ -5.122688000999972, 56.827274499 ], [ -5.684919, 56.4972955 ], [ -6.226232, 56.710991001000025 ], [ -5.75170300100001, 56.782165497999983 ], [ -5.924258, 56.891563497999982 ], [ -5.72895, 57.101478498 ], [ -5.5094375, 
57.097774499000025 ], [ -5.681389, 57.1505305 ], [ -5.670602562999989, 57.208438753999985 ], [ -6.017994, 57.017947999 ], [ -6.7897145, 57.42120749899999 ], [ -6.296708, 57.707561498000018 ], [ -6.146597000999975, 57.574474497999972 ], [ -6.130341, 57.316936499 ], [ -5.885413500000027, 57.238167001000022 ], [ -5.6476125, 57.255294998000011 ], [ -5.664395153999976, 57.219288447 ], [ -5.415392, 57.230156 ], [ -5.82202, 57.363647496999988 ], [ -5.8141635, 57.858711 ], [ -5.071033000999989, 57.81991949799999 ], [ -5.458675, 58.074409495999987 ], [ -5.0071175, 58.625992000999986 ], [ -3.024449, 58.64426400100001 ], [ -3.116977500000019, 58.36858 ], [ -4.400494, 57.918987497999979 ], [ -3.771491500000025, 57.867034999 ] ] ], [ [ [ -6.180591, 58.467010500000015 ], [ -6.344816, 58.234204497 ], [ -6.135358, 58.259441498 ], [ -6.967598, 57.727894000999981 ], [ -7.1331955, 57.836654501 ], [ -6.8160135, 57.900836998999978 ], [ -7.079999499999985, 57.967232001000013 ], [ -6.907952, 58.049675001000026 ], [ -7.10329500099999, 58.073547500000018 ], [ -7.027442, 58.244353997000019 ], [ -6.180591, 58.467010500000015 ] ] ], [ [ [ -1.155833500000028, 60.338035499999989 ], [ -1.277545, 59.852599998000017 ], [ -1.702932, 60.255359499 ], [ -1.339904500999978, 60.359771497999986 ], [ -1.632932, 60.483162 ], [ -1.499491319000015, 60.543659871999978 ], [ -1.304159, 60.623996499999976 ], [ -1.155833500000028, 60.338035499999989 ] ] ], [ [ [ -5.646332, 56.440559501 ], [ -6.385666501, 56.288787997999975 ], [ -6.323570501, 56.606162998 ], [ -5.646332, 56.440559501 ] ] ], [ [ [ -2.9119245, 59.006061499 ], [ -2.704751499999986, 58.961914001000025 ], [ -3.368161499999985, 58.998829 ], [ -2.9119245, 59.006061499 ] ] ] ] } },
-{ "type": "Feature", "id": 1943, "properties": { "NUTS_ID": "UKN", "STAT_LEVL_": 1, "SHAPE_AREA": 1.88082099117, "SHAPE_LEN": 7.4104194730500001 }, "geometry": { "type": "Polygon", "coordinates": [ [ [ -6.2680155, 54.102337 ], [ -6.623778500000014, 54.036548497000013 ], [ -7.0286355, 54.421306497999979 ], [ -7.565956, 54.126514499 ], [ -8.177718, 54.464973500999974 ], [ -7.703411501, 54.608287999000027 ], [ -7.921372, 54.696544499000026 ], [ -7.534506, 54.74713900099999 ], [ -7.256068500000026, 55.067034998 ], [ -7.099074804999987, 55.100807920000022 ], [ -6.455277, 55.2393035 ], [ -5.976527499999975, 55.056598501 ], [ -5.992633500000011, 54.989261999 ], [ -5.721779, 54.772852496999974 ], [ -5.720909001, 54.772636498999987 ], [ -5.691673, 54.809592999000017 ], [ -5.690664001000016, 54.769462497 ], [ -5.912931500000013, 54.648036998 ], [ -5.855336, 54.63377 ], [ -5.575003, 54.670196501000021 ], [ -5.432802, 54.48764399800001 ], [ -5.499689499999988, 54.338054499 ], [ -5.874409, 54.184848500999976 ], [ -6.2680155, 54.102337 ] ] ] } }
-]
-}
diff --git a/examples/geospatial/rivers.geojson b/examples/geospatial/rivers.geojson
deleted file mode 100644
index 1f22c3d..0000000
--- a/examples/geospatial/rivers.geojson
+++ /dev/null
@@ -1,28 +0,0 @@
-{
-"type": "FeatureCollection",
-"crs": { "type": "name", "properties": { "name": "urn:ogc:def:crs:EPSG::32718" } },
-"features": [
-{ "type": "Feature", "properties": { "Length": 8726.582, "id": 0 }, "geometry": { "type": "LineString", "coordinates": [ [ 554143.902784303762019, 9370463.606285564601421 ], [ 554090.77431408688426, 9370670.527637269347906 ], [ 554090.774323970079422, 9371226.15380304120481 ], [ 554241.587090324610472, 9371670.65468049235642 ], [ 554694.025547004304826, 9371853.21761728823185 ], [ 555225.839149633422494, 9371750.02984730899334 ], [ 555343.558532191440463, 9371655.854319509118795 ], [ 555345.387838716618717, 9371654.390889570116997 ], [ 555662.402400243096054, 9371400.77914435043931 ], [ 556122.77833932172507, 9371003.903421515598893 ], [ 557091.155320196412504, 9369979.963763244450092 ], [ 557341.200555620715022, 9369629.900410778820515 ], [ 557346.268165812827647, 9369622.805799175053835 ], [ 557527.71874447260052, 9369368.775065643712878 ], [ 557765.84421896468848, 9368876.649134300649166 ], [ 558049.371886842884123, 9368053.068484710529447 ], [ 558053.988722653128207, 9368039.657954649999738 ], [ 558099.219791372306645, 9367908.272239688783884 ], [ 558226.220121195539832, 9367098.645607182756066 ], [ 558099.219845050014555, 9366423.956662653014064 ], [ 558080.528334323316813, 9366376.590186208486557 ] ] } },
-{ "type": "Feature", "properties": { "Length": 4257.036, "id": 1 }, "geometry": { "type": "LineString", "coordinates": [ [ 556297.090028856880963, 9365917.049970472231507 ], [ 555900.606109458953142, 9365657.158350326120853 ], [ 555782.799677672795951, 9365476.812701666727662 ], [ 555700.779348274692893, 9365322.130740420892835 ], [ 555688.470050815492868, 9365265.110052039846778 ], [ 555681.259945538826287, 9365190.290051722899079 ], [ 555674.760025494731963, 9364974.659947790205479 ], [ 555682.420050675049424, 9364852.709972137585282 ], [ 555694.550037671811879, 9364781.880019277334213 ], [ 555727.349953718483448, 9364659.910016257315874 ], [ 555744.420012982562184, 9364518.670053444802761 ], [ 555729.489983960986137, 9364441.560034865513444 ], [ 555699.249948039650917, 9364386.189982520416379 ], [ 555684.109970162622631, 9364315.900024494156241 ], [ 555686.449967577122152, 9364255.080044345930219 ], [ 555728.060013480484486, 9364125.630033632740378 ], [ 555749.610024364665151, 9364085.360002601519227 ], [ 555770.949986239895225, 9364059.309945780783892 ], [ 555794.450053149834275, 9364010.699949137866497 ], [ 555829.790047806687653, 9363850.149949369952083 ], [ 555874.309982567094266, 9363744.100027633830905 ], [ 555907.019950926303864, 9363695.920032363384962 ], [ 555947.87997955083847, 9363615.940017953515053 ], [ 556013.410024100914598, 9363509.110036455094814 ], [ 556202.53995044529438, 9363306.050033012405038 ], [ 556274.150049703195691, 9363241.279995545744896 ], [ 556299.859950244426727, 9363225.749980885535479 ], [ 556317.170045118778944, 9363207.929970927536488 ], [ 556341.640031377784908, 9363196.830052131786942 ], [ 556441.779971952550113, 9363116.399995505809784 ], [ 556510.090008058585227, 9363068.569956073537469 ], [ 556590.200010528787971, 9363037.669945260509849 ], [ 556741.749954544007778, 9362995.009969409555197 ], [ 557237.734752693213522, 9362710.7868034504354 ] ] } },
-{ "type": "Feature", "properties": { "Length": 4126.414, "id": 2 }, "geometry": { "type": "LineString", "coordinates": [ [ 556950.010039016604424, 9367879.529990935698152 ], [ 556969.360006319358945, 9367808.640045430511236 ], [ 557003.739985266700387, 9367721.949950056150556 ], [ 557036.919998358935118, 9367525.920052092522383 ], [ 557039.840040050446987, 9367455.529969932511449 ], [ 557024.140032706782222, 9367220.819958550855517 ], [ 557024.3699557390064, 9367105.170054715126753 ], [ 557049.169946600683033, 9366954.26003678701818 ], [ 557070.480000229552388, 9366864.159951467067003 ], [ 557127.270002986304462, 9366510.689949084073305 ], [ 557132.919950186274946, 9366497.259986190125346 ], [ 557139.560054676607251, 9366401.4000089392066 ], [ 557136.479962345212698, 9366150.580038771033287 ], [ 557151.830032242462039, 9365999.990010671317577 ], [ 557155.590027201920748, 9365871.080005507916212 ], [ 557152.200018741190434, 9365729.669961683452129 ], [ 557146.04999157320708, 9365662.309972265735269 ], [ 557096.070009808056056, 9365457.700033070519567 ], [ 557080.740025472827256, 9365348.980017950758338 ], [ 557047.89997448399663, 9365232.779988572001457 ], [ 556998.720020562410355, 9364984.410006260499358 ], [ 556998.470042725093663, 9364891.139983013272285 ], [ 557016.959983234293759, 9364759.48998936265707 ], [ 557020.109949165023863, 9364692.179985167458653 ], [ 556986.650004463270307, 9364540.580016005784273 ], [ 556978.97003573179245, 9364414.860015098005533 ], [ 556989.520046097226441, 9364373.359963739290833 ], [ 557295.943237604573369, 9363890.830840945243835 ] ] } },
-{ "type": "Feature", "properties": { "Length": 1193.85, "id": 3 }, "geometry": { "type": "LineString", "coordinates": [ [ 557295.943237604573369, 9363890.830840945243835 ], [ 557179.526292844675481, 9363224.079394550994039 ], [ 557237.734752693213522, 9362710.7868034504354 ] ] } },
-{ "type": "Feature", "properties": { "Length": 3678.113, "id": 4 }, "geometry": { "type": "LineString", "coordinates": [ [ 557237.734752693213522, 9362710.7868034504354 ], [ 557322.401589901186526, 9362345.661031894385815 ], [ 557629.318806320428848, 9362086.368931673467159 ], [ 557952.111221184954047, 9361938.201930712908506 ], [ 558433.653843024745584, 9361885.285094810649753 ], [ 558571.23748308327049, 9361906.45183520950377 ], [ 558804.071235538460314, 9362118.119011694565415 ], [ 559042.196680794470012, 9362398.577756013721228 ], [ 559280.322152965702116, 9362620.82816337607801 ], [ 559481.405938610434532, 9362795.453616688027978 ], [ 559994.698619224131107, 9362964.787325641140342 ], [ 560098.0373490806669, 9362989.473788045346737 ] ] } },
-{ "type": "Feature", "properties": { "Length": 732.404, "id": 5 }, "geometry": { "type": "LineString", "coordinates": [ [ 557427.158627981320024, 9364610.527111783623695 ], [ 557407.068465149030089, 9364546.998659269884229 ], [ 557295.943237604573369, 9363890.830840945243835 ] ] } },
-{ "type": "Feature", "properties": { "Length": 1884.698, "id": 6 }, "geometry": { "type": "LineString", "coordinates": [ [ 558080.528334323316813, 9366376.590186208486557 ], [ 557602.860547710210085, 9365166.124975387006998 ], [ 557427.158627981320024, 9364610.527111783623695 ] ] } },
-{ "type": "Feature", "properties": { "Length": 3442.012, "id": 7 }, "geometry": { "type": "LineString", "coordinates": [ [ 557427.158627981320024, 9364610.527111783623695 ], [ 557805.752903219312429, 9364280.688339363783598 ], [ 558106.716948954388499, 9363923.500108260661364 ], [ 558275.389236090704799, 9363589.46295397169888 ], [ 558457.290563251823187, 9363242.196594277396798 ], [ 558702.030567424371839, 9363080.139034846797585 ], [ 559002.994778543710709, 9363030.529521865770221 ], [ 559194.818084074184299, 9363076.831730449572206 ], [ 559380.026796967722476, 9363083.446249704807997 ], [ 560098.0373490806669, 9362989.473788045346737 ] ] } },
-{ "type": "Feature", "properties": { "Length": 2233.621, "id": 8 }, "geometry": { "type": "LineString", "coordinates": [ [ 558080.528334323316813, 9366376.590186208486557 ], [ 558315.076750056818128, 9366239.931772261857986 ], [ 558526.479321829974651, 9366203.154564624652267 ], [ 558709.042151712812483, 9366137.008597478270531 ], [ 558870.438306444324553, 9366025.883540779352188 ], [ 559097.980513922870159, 9365814.216369282454252 ], [ 559232.918147033080459, 9365687.216153411194682 ], [ 559373.147643727250397, 9365512.590732062235475 ], [ 559513.377018287777901, 9365263.882038922980428 ], [ 559611.273070521652699, 9365033.693947337567806 ], [ 559631.074507311917841, 9364934.686608903110027 ] ] } },
-{ "type": "Feature", "properties": { "Length": 10205.469, "id": 9 }, "geometry": { "type": "LineString", "coordinates": [ [ 561623.109724191948771, 9370725.055553076788783 ], [ 561676.18997294921428, 9370714.779967816546559 ], [ 561872.700051285326481, 9370633.689976416528225 ], [ 561965.950044951401651, 9370604.980027223005891 ], [ 562030.249970029108226, 9370576.360040871426463 ], [ 562124.010049782693386, 9370520.819982415065169 ], [ 562167.6200290042907, 9370486.610048672184348 ], [ 562228.209969406016171, 9370457.580005738884211 ], [ 562373.359956271015108, 9370432.180039843544364 ], [ 562492.949983207508922, 9370383.569977002218366 ], [ 562715.320010068826377, 9370251.099986046552658 ], [ 562817.060029708780348, 9370201.850013209506869 ], [ 562910.809955675154924, 9370133.78995799459517 ], [ 563009.018621220253408, 9370046.866670096293092 ], [ 563009.985151003114879, 9370046.011303126811981 ], [ 563020.719979303888977, 9370036.509967906400561 ], [ 563153.016688115894794, 9369931.317918915301561 ], [ 563160.323813738301396, 9369925.507776217535138 ], [ 563259.789997064508498, 9369846.419983314350247 ], [ 563309.629967176355422, 9369788.110029544681311 ], [ 563352.719949327409267, 9369746.749972360208631 ], [ 563482.359983313828707, 9369582.040037693455815 ], [ 563546.450017801485956, 9369482.339969750493765 ], [ 563568.230047828517854, 9369438.550010936334729 ], [ 563640.699980834499002, 9369257.379998253658414 ], [ 563704.739994174800813, 9369068.280027104541659 ], [ 563758.530009357258677, 9368843.660054840147495 ], [ 563782.790046340785921, 9368769.659959333017468 ], [ 563802.600015265867114, 9368654.040054289624095 ], [ 563809.940042392350733, 9368570.360043799504638 ], [ 563810.639974911697209, 9368493.269952712580562 ], [ 563828.989995523355901, 9368377.549995694309473 ], [ 563835.819965007714927, 9368229.960012884810567 ], [ 563817.079956445842981, 9367927.960038047283888 ], [ 563789.859994518570602, 9367789.759950948879123 ], [ 563659.979972447268665, 
9367401.359956156462431 ], [ 563571.960032540373504, 9367214.499949682503939 ], [ 563534.900016752071679, 9367147.420007986947894 ], [ 563487.759952491149306, 9367080.549984656274319 ], [ 563350.280050816945732, 9366928.819999873638153 ], [ 563299.450048363767564, 9366889.93997529707849 ], [ 563052.050024612806737, 9366770.009978745132685 ], [ 562937.400018213316798, 9366720.689949177205563 ], [ 562669.570023030042648, 9366625.919982662424445 ], [ 562326.890030118636787, 9366482.909977274015546 ], [ 562099.610032433643937, 9366398.40998206473887 ], [ 561991.650016712956131, 9366349.370033476501703 ], [ 561779.182359215803444, 9366384.032598856836557 ], [ 561636.968650399707258, 9366417.105622258037329 ], [ 561464.213070619851351, 9366402.262741258367896 ], [ 561463.75820950884372, 9366402.262741258367896 ], [ 561463.073749605566263, 9366401.980590904131532 ], [ 561382.249945723451674, 9366407.229994645342231 ], [ 561342.7800421891734, 9366397.170049915090203 ], [ 561318.300000792369246, 9366379.429999388754368 ], [ 561298.470034805126488, 9366324.010052978992462 ], [ 561272.490045037120581, 9366274.860033167526126 ], [ 561245.32002614159137, 9366258.689991842955351 ], [ 561209.280043817125261, 9366250.149994755163789 ], [ 561164.889944912865758, 9366248.770035138353705 ], [ 561131.099964152090251, 9366287.050040356814861 ], [ 561116.460049882531166, 9366310.640005555003881 ], [ 561048.77001735009253, 9366340.479974111542106 ], [ 560988.900028594769537, 9366352.31000741943717 ], [ 560941.339979647658765, 9366336.740006012842059 ], [ 560923.070013260468841, 9366316.929958242923021 ], [ 560919.330047259107232, 9366300.380018414929509 ], [ 560921.569994766265154, 9366260.259971223771572 ], [ 560936.799992737360299, 9366192.209996210411191 ], [ 560925.05003163870424, 9366126.270037136971951 ], [ 560876.040008636191487, 9366071.009984320029616 ], [ 560811.089955819770694, 9366030.080027123913169 ], [ 560777.720031510107219, 9366021.430015010759234 ], [ 
560722.320030105300248, 9366019.599988108500838 ], [ 560687.230004222132266, 9366026.950045935809612 ], [ 560611.969972735270858, 9366056.569956619292498 ], [ 560558.649947706609964, 9366056.350004274398088 ], [ 560505.070005991496146, 9366036.09994831867516 ], [ 560424.559977711178362, 9365964.249981984496117 ], [ 560389.580007423646748, 9365942.949960263445973 ], [ 560368.409995377995074, 9365939.010003125295043 ], [ 560323.710036418400705, 9365944.029953524470329 ], [ 560310.509968371130526, 9365949.480042090639472 ], [ 560272.819948982447386, 9366039.820001818239689 ], [ 560255.240021657198668, 9366054.909989798441529 ], [ 560241.229975403286517, 9366057.109999237582088 ], [ 560027.700045749545097, 9365958.499957986176014 ], [ 559981.980015838518739, 9365926.920028259977698 ], [ 559960.809977819211781, 9365902.629947766661644 ], [ 559937.809967474080622, 9365863.670021612197161 ], [ 559929.169960617087781, 9365818.25998305901885 ], [ 559928.909987358376384, 9365702.910010498017073 ], [ 559941.24996009003371, 9365584.660001531243324 ], [ 559941.370029916986823, 9365478.899977192282677 ], [ 559933.660025156103075, 9365340.459967693313956 ], [ 559916.789950799196959, 9365227.910002045333385 ], [ 559887.94998219050467, 9365172.430055160075426 ], [ 559835.799970475956798, 9365101.660033008083701 ], [ 559793.400005815550685, 9365063.109944943338633 ], [ 559703.000015168450773, 9365007.539955627173185 ], [ 559631.074507311917841, 9364934.686608903110027 ] ] } },
-{ "type": "Feature", "properties": { "Length": 2061.313, "id": 10 }, "geometry": { "type": "LineString", "coordinates": [ [ 559631.074507311917841, 9364934.686608903110027 ], [ 559674.773152898997068, 9364716.193343315273523 ], [ 559717.106637695804238, 9364488.651212530210614 ], [ 559754.148339871317148, 9364245.234052091836929 ], [ 559788.544262650422752, 9364015.046059736981988 ], [ 559833.523490295745432, 9363858.941560404375196 ], [ 559854.690266018733382, 9363729.295549921691418 ], [ 559965.815547276288271, 9363552.024331323802471 ], [ 560113.982422707602382, 9363395.919846467673779 ], [ 560233.045221633277833, 9363181.606919135898352 ], [ 560319.598569430410862, 9363042.402333023026586 ] ] } },
-{ "type": "Feature", "properties": { "Length": 83.734, "id": 11 }, "geometry": { "type": "LineString", "coordinates": [ [ 559897.255002218298614, 9369874.2249611672014 ], [ 559964.208351333625615, 9369823.988749623298645 ] ] } },
-{ "type": "Feature", "properties": { "Length": 1687.179, "id": 12 }, "geometry": { "type": "LineString", "coordinates": [ [ 559935.500016157515347, 9368146.289991278201342 ], [ 559922.310023608617485, 9368229.959980944171548 ], [ 559929.429956942796707, 9368342.340002160519361 ], [ 559941.249947638250887, 9368419.490002373233438 ], [ 559941.480000208131969, 9368557.889966269955039 ], [ 559928.1699612531811, 9368763.130033286288381 ], [ 559915.889949735254049, 9368846.899954961612821 ], [ 559915.899990991689265, 9369264.769971307367086 ], [ 559956.630007576197386, 9369486.99005114659667 ], [ 559954.07999915163964, 9369624.289997108280659 ], [ 559947.930049324408174, 9369697.340018127113581 ], [ 559954.159957599826157, 9369781.860050067305565 ], [ 559961.630633292719722, 9369813.181432234123349 ], [ 559964.208351333625615, 9369823.988749623298645 ] ] } },
-{ "type": "Feature", "properties": { "Length": 2098.763, "id": 13 }, "geometry": { "type": "LineString", "coordinates": [ [ 559964.208351333625615, 9369823.988749623298645 ], [ 559993.93003548309207, 9369948.599983097985387 ], [ 560024.45994486194104, 9370038.910024585202336 ], [ 560075.27995169442147, 9370135.399959975853562 ], [ 560100.700002578087151, 9370167.779956474900246 ], [ 560138.18996147532016, 9370201.569987757131457 ], [ 560355.609977724030614, 9370357.219972815364599 ], [ 560409.639980466105044, 9370401.170008989050984 ], [ 560457.739998309873044, 9370449.789993532001972 ], [ 560565.390003379434347, 9370582.470039075240493 ], [ 560643.150022109039128, 9370615.329998968169093 ], [ 560757.790031655691564, 9370645.409984136000276 ], [ 560924.549954789690673, 9370717.559975691139698 ], [ 561044.0500304447487, 9370744.38003858178854 ], [ 561172.519950155168772, 9370748.350045446306467 ], [ 561304.110004225745797, 9370782.95998015627265 ], [ 561365.289962874725461, 9370784.139984445646405 ], [ 561432.550000729039311, 9370771.879966421052814 ], [ 561564.86994812823832, 9370736.329980153590441 ], [ 561623.109724191948771, 9370725.055553076788783 ] ] } },
-{ "type": "Feature", "properties": { "Length": 227.876, "id": 14 }, "geometry": { "type": "LineString", "coordinates": [ [ 560098.0373490806669, 9362989.473788045346737 ], [ 560319.598569430410862, 9363042.402333023026586 ] ] } },
-{ "type": "Feature", "properties": { "Length": 3105.917, "id": 15 }, "geometry": { "type": "LineString", "coordinates": [ [ 560319.598569430410862, 9363042.402333023026586 ], [ 560947.200462090782821, 9363192.329369526356459 ], [ 561666.868644681759179, 9363234.662870485335588 ], [ 562316.908597845584154, 9363141.799990758299828 ], [ 562329.508644707500935, 9363140.000025192275643 ], [ 562407.703429786488414, 9363128.829273901879787 ], [ 563046.928949325345457, 9363061.146588189527392 ], [ 563048.499706661328673, 9363060.980306170880795 ], [ 563307.288444920442998, 9363033.579099290072918 ], [ 563385.142031962051988, 9362997.808577870950103 ] ] } },
-{ "type": "Feature", "properties": { "Length": 3886.99, "id": 16 }, "geometry": { "type": "LineString", "coordinates": [ [ 560815.479997918009758, 9365530.689978899434209 ], [ 560902.259947555139661, 9365515.080050561577082 ], [ 561049.850032853893936, 9365511.849946537986398 ], [ 561124.029952492564917, 9365494.479996873065829 ], [ 561249.360020865686238, 9365444.579967383295298 ], [ 561353.540000218898058, 9365425.390015292912722 ], [ 561434.549947706982493, 9365368.420008925721049 ], [ 561472.269985198974609, 9365324.550047246739268 ], [ 561481.899960414506495, 9365299.940050642937422 ], [ 561494.540039928629994, 9365285.530010910704732 ], [ 561533.909994185902178, 9365167.599945735186338 ], [ 561561.699985285289586, 9365039.019990542903543 ], [ 561574.470049194991589, 9364817.399949222803116 ], [ 561624.59000070951879, 9364589.349988346919417 ], [ 561660.90998010057956, 9364508.99999831803143 ], [ 561690.690036914311349, 9364465.460024479776621 ], [ 561818.679991500452161, 9364344.429977498948574 ], [ 562040.629969758912921, 9364188.099969102069736 ], [ 562171.840027747675776, 9364083.560016987845302 ], [ 562318.199945067055523, 9363925.839996244758368 ], [ 562361.270036499947309, 9363890.489975864067674 ], [ 562408.920048193074763, 9363862.110008614137769 ], [ 562502.29994682315737, 9363827.250028057023883 ], [ 562561.040022045373917, 9363792.399958049878478 ], [ 562630.640048502013087, 9363740.510028844699264 ], [ 562737.170037001371384, 9363644.159994296729565 ], [ 562854.730042885988951, 9363567.550046514719725 ], [ 562903.770015719346702, 9363527.429986264556646 ], [ 562959.38994843326509, 9363496.420005165040493 ], [ 563032.1500472901389, 9363465.870003057643771 ], [ 563078.400049898773432, 9363435.940001895651221 ], [ 563128.439971880055964, 9363360.700036335736513 ], [ 563136.859951579943299, 9363332.6300014462322 ], [ 563385.142031962051988, 9362997.808577870950103 ] ] } },
-{ "type": "Feature", "properties": { "Length": 1235.319, "id": 17 }, "geometry": { "type": "LineString", "coordinates": [ [ 561911.180035983212292, 9371902.450048869475722 ], [ 561911.290002782829106, 9371774.560007127001882 ], [ 561916.410003134049475, 9371707.729972602799535 ], [ 561910.259977804496884, 9371553.489982048049569 ], [ 561896.39003321994096, 9371460.949989477172494 ], [ 561875.290005644783378, 9371378.690044919028878 ], [ 561838.010008008219302, 9371289.289959898218513 ], [ 561813.170048271305859, 9371259.270027354359627 ], [ 561716.240030192770064, 9371063.799999132752419 ], [ 561640.690031638368964, 9370842.729984594509006 ], [ 561626.13999830186367, 9370782.070050254464149 ], [ 561623.109724191948771, 9370725.055553076788783 ] ] } },
-{ "type": "Feature", "properties": { "Length": 1106.873, "id": 18 }, "geometry": { "type": "LineString", "coordinates": [ [ 563385.142031962051988, 9362997.808577870950103 ], [ 564090.456766493618488, 9362673.74504354223609 ], [ 564386.659932249225676, 9362527.618199057877064 ] ] } },
-{ "type": "Feature", "properties": { "Length": 1122.178, "id": 19 }, "geometry": { "type": "LineString", "coordinates": [ [ 565364.253777569159865, 9362503.811572419479489 ], [ 565383.560033028945327, 9362503.310028443112969 ], [ 565293.300813051871955, 9362573.508931530639529 ], [ 565154.570039769634604, 9362686.990004047751427 ], [ 565069.959959919564426, 9362712.129985691979527 ], [ 564881.849966394715011, 9362714.800014909356833 ], [ 564821.750010306946933, 9362704.4400226008147 ], [ 564794.950000228360295, 9362693.920012114569545 ], [ 564789.299989039078355, 9362685.28998427093029 ], [ 564386.659932249225676, 9362527.618199057877064 ] ] } },
-{ "type": "Feature", "properties": { "Length": 3264.726, "id": 20 }, "geometry": { "type": "LineString", "coordinates": [ [ 564386.659932249225676, 9362527.618199057877064 ], [ 564884.208424394950271, 9362282.160890934988856 ], [ 565656.793198664672673, 9361625.993028860539198 ], [ 566175.377591628581285, 9361117.991917047649622 ], [ 566598.711782545782626, 9360631.157560428604484 ], [ 566802.417236858978868, 9360379.115381432697177 ] ] } },
-{ "type": "Feature", "properties": { "Length": 2306.128, "id": 21 }, "geometry": { "type": "LineString", "coordinates": [ [ 558080.528334323316813, 9366376.590186208486557 ], [ 558150.80009717203211, 9366018.721947841346264 ], [ 558252.318974075606093, 9365828.925786674022675 ], [ 558459.770592095912434, 9365714.165317131206393 ], [ 558583.358790065394714, 9365599.404847588390112 ], [ 558658.394481689785607, 9365511.127563323825598 ], [ 558751.085630166926421, 9365356.642315862700343 ], [ 558830.53518600447569, 9365255.123438958078623 ], [ 558892.329284989275038, 9365188.915475759655237 ], [ 558962.951112400507554, 9365118.293648349121213 ], [ 559055.642260877648368, 9365065.327277790755033 ], [ 559174.816594633972272, 9365007.94704301841557 ], [ 559280.749335750704631, 9364968.222265100106597 ], [ 559369.026620014687069, 9364954.980672460049391 ], [ 559631.074507311917841, 9364934.686608903110027 ], [ 559631.074507311917841, 9364934.686608903110027 ] ] } }
-]
-}
diff --git a/examples/geospatial/schools.dbf b/examples/geospatial/schools.dbf
deleted file mode 100644
index 70ab8d1..0000000
Binary files a/examples/geospatial/schools.dbf and /dev/null differ
diff --git a/examples/geospatial/schools.prj b/examples/geospatial/schools.prj
deleted file mode 100644
index b7b18c8..0000000
--- a/examples/geospatial/schools.prj
+++ /dev/null
@@ -1 +0,0 @@
-PROJCS["NAD_1983_StatePlane_Arizona_Central_FIPS_0202_Feet",GEOGCS["GCS_North_American_1983",DATUM["D_North_American_1983",SPHEROID["GRS_1980",6378137.0,298.257222101]],PRIMEM["Greenwich",0.0],UNIT["Degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["False_Easting",699998.6],PARAMETER["False_Northing",0.0],PARAMETER["Central_Meridian",-111.9166666666667],PARAMETER["Scale_Factor",0.9999],PARAMETER["Latitude_Of_Origin",31.0],UNIT["Foot_US",0.3048006096012192]]
\ No newline at end of file
diff --git a/examples/geospatial/schools.sbn b/examples/geospatial/schools.sbn
deleted file mode 100644
index 0508084..0000000
Binary files a/examples/geospatial/schools.sbn and /dev/null differ
diff --git a/examples/geospatial/schools.sbx b/examples/geospatial/schools.sbx
deleted file mode 100644
index abadf28..0000000
Binary files a/examples/geospatial/schools.sbx and /dev/null differ
diff --git a/examples/geospatial/schools.shp b/examples/geospatial/schools.shp
deleted file mode 100644
index b71949e..0000000
Binary files a/examples/geospatial/schools.shp and /dev/null differ
diff --git a/examples/geospatial/schools.shp.xml b/examples/geospatial/schools.shp.xml
deleted file mode 100644
index 9ec306c..0000000
--- a/examples/geospatial/schools.shp.xml
+++ /dev/null
@@ -1,546 +0,0 @@
-<?xml version="1.0"?>
-<metadata><idinfo><citation><citeinfo><origin>Arizona Department of Environmental Quality, Arizona Department of Health Services</origin><pubdate>August 14, 2007</pubdate><title Sync="TRUE">SCHOOLS_EVERYTHING_8_7_08</title><edition>Schools 8-14-2007</edition><geoform Sync="TRUE">vector digital data</geoform><onlink>\\adeq.lcl\gisprod\data\adeq\schools-everything-8-14-07.shp</onlink><ftname Sync="TRUE">SCHOOLS_EVERYTHING_8_7_08</ftname></citeinfo></citation><descript><abstract>This data set is a general reference for schools or "learning sites" in Arizona.  It represents schools from the AZ Department of Education (CTDS numbers, charter and public schools), AZ School Facilities Board, private schools, some technical schools, colleges and universities.</abstract><purpose>The intention with which the data set was developed is for general reference only.  It is representative only presenting a single point in time for the topic "learning sites."  It is not the final or authoritative legal documentation for the learning sites data or locations.</purpose><supplinf>This data set does not contain locations for all cosmetology or beauty colleges, horseshoeing or welding technical schools, or other trade schools.</supplinf><langdata Sync="TRUE">en</langdata></descript><timeperd><timeinfo><sngdate><caldate>REQUIRED: The year (and optionally month, or month and day) for which the data set corresponds to the ground.</caldate></sngdate></timeinfo><current>publication date</current></timeperd><status><progress>In work</progress><update>As needed</update></status><spdom><bounding><westbc Sync="TRUE">-114.993109</westbc><eastbc Sync="TRUE">-108.985983</eastbc><northbc Sync="TRUE">37.012375</northbc><southbc Sync="TRUE">31.281734</southbc></bounding><lboundng><leftbc Sync="TRUE">144373.429325</leftbc><rightbc Sync="TRUE">679220.687500</rightbc><bottombc Sync="TRUE">3466849.812280</bottombc><topbc 
Sync="TRUE">4096245.182700</topbc></lboundng></spdom><keywords><theme><themekt>REQUIRED: Reference to a formally registered thesaurus or a similar authoritative source of theme keywords.</themekt><themekey>ADEQ</themekey><themekey>environment</themekey><themekey>Arizona</themekey><themekey>Environmental Quality</themekey><themekey>Department of Environmental Quality</themekey><themekey>schools</themekey><themekey>learning sites</themekey><themekey>colleges</themekey><themekey>universities</themekey><themekey>grade school</themekey><themekey>elementary school</themekey><themekey>high school</themekey><themekey>middle school</themekey><themekey>kindergarten</themekey><themekey>private school</themekey><themekey>parochial school</themekey><themekey>montessori</themekey><themekey>community college</themekey><themekey>junior college</themekey><themekey>university</themekey><themekey>Arizona  Department of Education</themekey><themekey>Charter School</themekey></theme><place><placekey>Arizona</placekey></place><temporal><tempkey>2008</tempkey></temporal></keywords><accconst>Access to these data are allowed for non-commercial applications without charge.  Commercial uses require payment.</accconst><useconst>The Arizona Department of Environmental Quality and others have compiled this data as a service to our customers using information from various sources. ADEQ and its collaborators cannot ensure that the information is accurate, current or complete. Neither the information presented nor maps derived from them are official documents.  
-
-All data are provided "as is" and may contain errors. The data are for reference and illustration purposes only and are not suitable for site-specific decision making. Information found here should not be used for making financial or any other commitments. Conclusions drawn from such information are the responsibility of the user.  
-
-ADEQ assumes no responsibility for errors arising from misuse of the data or maps derived from the data. ADEQ disclaims any liability for injury, damage or loss that might result from the use of this information. In no event shall ADEQ become liable to users of these data and maps, or any other party, arising from the use or modification of the data.</useconst><ptcontac><cntinfo><cntorgp><cntorg>Arizona Department of Environmental Quality</cntorg></cntorgp><cntaddr><addrtype>mailing and physical address</addrtype><address>1110 W Washington St</address><city>Phoenix</city><state>Arizona</state><postal>85007</postal><country>USA</country></cntaddr></cntinfo></ptcontac><datacred>This data set has been created in collaboration with the Arizona Department of Education, Arizona Department of Health Services, Arizona State Land Department  and the Arizona State Cartographers Office.</datacred><secinfo><secclass>Unclassified</secclass></secinfo><native Sync="TRUE">Microsoft Windows XP Version 5.1 (Build 2600) Service Pack 3; ESRI ArcCatalog 9.3.1.1850</native><natvform Sync="TRUE">Shapefile</natvform></idinfo><dataqual><lineage><procstep><procdesc>Dataset copied.</procdesc></procstep><procstep><procdesc>Metadata imported.</procdesc><srcused>D:\DOCUME~1\VMG~1.ADE\LOCALS~1\Temp\xmlCA.tmp</srcused></procstep><procstep><procdesc>Metadata imported.</procdesc><srcused>S:\common\vmg\schoolsmetadata.xml</srcused></procstep><procstep><procdesc Sync="TRUE">Metadata imported.</procdesc><srcused Sync="TRUE">S:\common\vmg\schools_everything_5_19_08.xml</srcused><date Sync="TRUE">20080708</date><time Sync="TRUE">12334800</time></procstep><procstep><procdesc Sync="TRUE">Metadata imported.</procdesc><srcused Sync="TRUE">T:\data\adeq\cross_media\schools_everything_7_8_08.shp.xml</srcused><date Sync="TRUE">20080811</date><time Sync="TRUE">10052400</time></procstep><procstep><procdesc Sync="TRUE">Dataset copied.</procdesc><srcused Sync="TRUE"></srcused><procdate 
Sync="TRUE">20120210</procdate><proctime Sync="TRUE">09541600</proctime></procstep><procstep><procdesc Sync="TRUE">Dataset copied.</procdesc><srcused Sync="TRUE"></srcused><procdate Sync="TRUE">20120210</procdate><proctime Sync="TRUE">09575300</proctime></procstep><procstep><procdesc Sync="TRUE">Dataset copied.</procdesc><srcused Sync="TRUE"></srcused><procdate Sync="TRUE">20120210</procdate><proctime Sync="TRUE">09582900</proctime></procstep></lineage></dataqual><spdoinfo><direct Sync="TRUE">Vector</direct><ptvctinf><esriterm Name="ADEQ_AZ_SCHOOLS_2008"><efeatyp Sync="TRUE">Simple</efeatyp><efeageom Sync="TRUE">1</efeageom><esritopo Sync="TRUE">FALSE</esritopo><efeacnt Sync="TRUE">2772</efeacnt><spindex Sync="TRUE">TRUE</spindex><linrefer Sync="TRUE">FALSE</linrefer></esriterm></ptvctinf></spdoinfo><spref><horizsys><planar><planci><plance Sync="TRUE">coordinate pair</plance><coordrep><absres Sync="TRUE">0.000000</absres><ordres Sync="TRUE">0.000000</ordres></coordrep><plandu Sync="TRUE">meters</plandu></planci></planar><geodetic><horizdn Sync="TRUE">North American Datum of 1983</horizdn><ellips Sync="TRUE">Geodetic Reference System 80</ellips><semiaxis Sync="TRUE">6378137.000000</semiaxis><denflat Sync="TRUE">298.257222</denflat></geodetic><cordsysn><geogcsn Sync="TRUE">GCS_North_American_1983</geogcsn><projcsn Sync="TRUE">NAD_1983_UTM_Zone_12N</projcsn></cordsysn></horizsys></spref><eainfo><detailed Name="ADEQ_AZ_SCHOOLS_2008"><enttyp><enttypl Sync="TRUE">ADEQ_AZ_SCHOOLS_2008</enttypl><enttypt Sync="TRUE">Feature Class</enttypt><enttypc Sync="TRUE">2772</enttypc></enttyp><attr><attrlabl>FID</attrlabl><attrdef>Internal feature number.</attrdef><attrdefs>ESRI</attrdefs><attrdomv><udom>Sequential unique whole numbers that are automatically generated.</udom></attrdomv><attalias Sync="TRUE">FID</attalias><attrtype Sync="TRUE">OID</attrtype><attwidth Sync="TRUE">4</attwidth><atprecis Sync="TRUE">0</atprecis><attscale 
Sync="TRUE">0</attscale></attr><attr><attrlabl>Shape</attrlabl><attrdef>Feature geometry.</attrdef><attrdefs>ESRI</attrdefs><attrdomv><udom>Coordinates defining the features.</udom></attrdomv><attalias Sync="TRUE">Shape</attalias><attrtype Sync="TRUE">Geometry</attrtype><attwidth Sync="TRUE">0</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>NAME</attrlabl><attrdef>School or learning site name</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Good</attrva><attrvae>Names verified to mulitple sources</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">69</attwidth><attalias Sync="TRUE">NAME</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>ADDRESS</attrlabl><attrdef>Physical address of school or learning site</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Good</attrva><attrvae>Verified to multiple sources</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">44</attwidth><attalias Sync="TRUE">ADDRESS</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>ZIP</attrlabl><attrdef>ZIP Code of physical location</attrdef><attrdefs>USPS ZIP Code</attrdefs><attrvai><attrva>Good</attrva><attrvae>Verified to outside source</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">7</attwidth><attalias Sync="TRUE">ZIP</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>CTDS</attrlabl><attrdef>AZ Dept of Education Identification Number, (County Code, Type Code, District Code &amp; Site Number</attrdef><attrdefs>AZ Dept. 
of Education</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Missing leading zeros in string field</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><attalias Sync="TRUE">CTDS</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>CTDS_NUM</attrlabl><attrtype Sync="TRUE">Double</attrtype><attwidth Sync="TRUE">11</attwidth><attalias Sync="TRUE">CTDS_NUM</attalias><atprecis Sync="TRUE">11</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>STATUS</attrlabl><attrdef>Operating Status (open , closed, proposed)</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Good</attrva><attrvae>Validated to multiple sources</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">8</attwidth><attalias Sync="TRUE">STATUS</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>LOCATION</attrlabl><attrdef>Location check method</attrdef><attrdefs>ADEQ</attrdefs><attrdomv><edom><edomv>DIG</edomv><edomvd>Digitally verified against raster data or other data set (parcels)</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>NON</edomv><edomvd>Non-specific, multiple methods of verification (digital, geocoding, GPS, etc.)</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>GPS</edomv><edomvd>Global Positioning System - field collected</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>GEO</edomv><edomvd>Originally geocoded - address matched [location verified by other methods]</edomvd><edomvds>ADEQ</edomvds></edom></attrdomv><attrvai><attrva>Good</attrva><attrvae>Verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><attalias Sync="TRUE">LOCATION</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>QA_QC</attrlabl><attrdef>Quality 
Assurance / Quality Control Code</attrdef><attrdefs>ADEQ</attrdefs><attrdomv><edom><edomv>0</edomv><edomvd>Used as identifier for version additions to data set - GPS], location quality is "ok"</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>1</edomv><edomvd>Very high confidence of location accuracy, matched to at least two independent sources</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>2</edomv><edomvd>Low confidence of locational accuracy, unable to match to other sources</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>3</edomv><edomvd>Used as identifier for version additions to data set - GPS or NON, location quality is "ok"</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>4</edomv><edomvd>Used as identifier for version additions to data set - GPS or NON, location quality is "ok"</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>5</edomv><edomvd>Very high confidence of location accuracy, matched digitally, etc. to at least two independent sources</edomvd><edomvds>ADEQ</edomvds></edom></attrdomv><attrvai><attrva>Good</attrva></attrvai><attrtype Sync="TRUE">Integer</attrtype><attwidth Sync="TRUE">7</attwidth><attalias Sync="TRUE">QA_QC</attalias><atprecis Sync="TRUE">7</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>CITY</attrlabl><attrdef>Physical location city or town</attrdef><attrdefs>Digital</attrdefs><attrvai><attrva>Good</attrva><attrvae>Verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">22</attwidth><attalias Sync="TRUE">CITY</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>COUNTY</attrlabl><attrdef>County</attrdef><attrdefs>Digital</attrdefs><attrvai><attrva>Good</attrva><attrvae>Verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">COUNTY</attalias><atprecis Sync="TRUE">0</atprecis><attscale 
Sync="TRUE">0</attscale></attr><attr><attrlabl>PHONE</attrlabl><attrdef>Phone Number</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">PHONE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>FAX</attrlabl><attrdef>FAX Number</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">FAX</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>LOWGRADE</attrlabl><attrdef>Lowest class level</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">LOWGRADE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>HIGHGRADE</attrlabl><attrdef>Highest class level taught</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">HIGHGRADE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>COMMENT</attrlabl><attrdef>Comments Field</attrdef><attrdefs>Varies</attrdefs><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">49</attwidth><attalias Sync="TRUE">COMMENT</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>DISTRICT</attrlabl><attrdef>School District or Charter Holder</attrdef><attrdefs>AZ 
Department of Education</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not fully verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">55</attwidth><attalias Sync="TRUE">DISTRICT</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">GRADE</attrlabl><attalias Sync="TRUE">GRADE</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">7</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">NURSE</attrlabl><attalias Sync="TRUE">NURSE</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">11</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">RN_PHN</attrlabl><attalias Sync="TRUE">RN_PHN</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">16</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">JUV_POP</attrlabl><attalias Sync="TRUE">JUV_POP</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">DISTNUM</attrlabl><attalias Sync="TRUE">DISTNUM</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">MAILTO</attrlabl><attalias Sync="TRUE">MAILTO</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">44</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">MAILCITY</attrlabl><attalias Sync="TRUE">MAILCITY</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">20</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl 
Sync="TRUE">MAILSTAT</attrlabl><attalias Sync="TRUE">MAILSTAT</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">MAILZIP</attrlabl><attalias Sync="TRUE">MAILZIP</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">8</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">CLASS</attrlabl><attalias Sync="TRUE">CLASS</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">17</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">TYPE_1</attrlabl><attalias Sync="TRUE">TYPE_1</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">KINDER</attrlabl><attalias Sync="TRUE">KINDER</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">8</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">FIRST</attrlabl><attalias Sync="TRUE">FIRST</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">SECOND</attrlabl><attalias Sync="TRUE">SECOND</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">THIRD</attrlabl><attalias Sync="TRUE">THIRD</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">FOURTH</attrlabl><attalias Sync="TRUE">FOURTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth 
Sync="TRUE">8</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">FIFTH</attrlabl><attalias Sync="TRUE">FIFTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">SIXTH</attrlabl><attalias Sync="TRUE">SIXTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">SEVENTH</attrlabl><attalias Sync="TRUE">SEVENTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">EIGHTH</attrlabl><attalias Sync="TRUE">EIGHTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">7</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">NINTH</attrlabl><attalias Sync="TRUE">NINTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">TENTH</attrlabl><attalias Sync="TRUE">TENTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">7</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">ELEVENTH</attrlabl><attalias Sync="TRUE">ELEVENTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">TWELFTH</attrlabl><attalias Sync="TRUE">TWELFTH</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl 
Sync="TRUE">PRESCHL</attrlabl><attalias Sync="TRUE">PRESCHL</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">ACCURACY</attrlabl><attalias Sync="TRUE">ACCURACY</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">BOARDING</attrlabl><attalias Sync="TRUE">BOARDING</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">REGION</attrlabl><attalias Sync="TRUE">REGION</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl Sync="TRUE">WEB_PAGE</attrlabl><attalias Sync="TRUE">WEB_PAGE</attalias><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">69</attwidth><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>GRADE</attrlabl><attrdef>Range of classes taught</attrdef><attrdefs>Varies</attrdefs><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">Double</attrtype><attwidth Sync="TRUE">11</attwidth><attalias Sync="TRUE">PLAC_IDNO</attalias><atprecis Sync="TRUE">11</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>NURSE</attrlabl><attrdef>School Nurse present?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Unknown</attrva></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">49</attwidth><attalias Sync="TRUE">COMMENT</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>RN_PHN</attrlabl><attrdef>Registered Nurse Phone Number</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Unknown</attrva></attrvai><attrtype
Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">HIGHGRADE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>JUV_POP</attrlabl><attrdef>Juvenile Population (number of students)</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Unknown</attrva></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">LOWGRADE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>DISTNUM</attrlabl><attrdef>School District Number or Charter Holder Number</attrdef><attrdefs>AZ Department of Education</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Partially verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">FAX</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>MAILTO</attrlabl><attrdef>Mailing Address</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">PHONE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>MAILCITY</attrlabl><attrdef>Mailing Address City</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">COUNTY</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>MAILSTAT</attrlabl><attrdef>Mailing Address State</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">22</attwidth><attalias Sync="TRUE">CITY</attalias><atprecis 
Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>MAILZIP</attrlabl><attrdef>Mailing Address ZIP</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not verified</attrvae></attrvai><attrtype Sync="TRUE">Integer</attrtype><attwidth Sync="TRUE">7</attwidth><attalias Sync="TRUE">QA_QC</attalias><atprecis Sync="TRUE">7</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>CLASS</attrlabl><attrdef>Class - grade levels</attrdef><attrdefs>Varies</attrdefs><attrdomv><edom><edomv>University</edomv><edomvd>University level</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>Comm. College</edomv><edomvd>Community College - part of community college network</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>College</edomv><edomvd>College - non-university or community college</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>Tech</edomv><edomvd>Technical Schools</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>Rel. College</edomv><edomvd>Religious College</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>Special Needs</edomv><edomvd>Special needs schools</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>All Grades</edomv><edomvd>All grade levels</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>High</edomv><edomvd>High School</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>JR/SR High</edomv><edomvd>Seventh through twelfth grade</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>Middle</edomv><edomvd>Middle School</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>Primary</edomv><edomvd>Primary or elementary school</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>(Blank)</edomv><edomvd>Unknown grade levels</edomvd></edom></attrdomv><attrvai><attrva>Good</attrva><attrvae>Verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><attalias Sync="TRUE">LOCATION</attalias><atprecis Sync="TRUE">0</atprecis><attscale 
Sync="TRUE">0</attscale></attr><attr><attrlabl>TYPE_1</attrlabl><attrdef>Type of School</attrdef><attrdefs>Varies</attrdefs><attrdomv><edom><edomv>Charter</edomv><edomvd>Arizona Charter School</edomvd><edomvds>Arizona Board of Charter Schools</edomvds></edom><edom><edomv>Public</edomv><edomvd>Public School</edomvd><edomvds>AZ Department of Education</edomvds></edom><edom><edomv>BIA</edomv><edomvd>Bureau of Indian Affairs operated school</edomvd><edomvds>US BIA</edomvds></edom><edom><edomv>Closed</edomv><edomvd>Closed School</edomvd><edomvds>ADEQ</edomvds></edom><edom><edomv>Private</edomv><edomvd>Private or Parochial operated school</edomvd><edomvds>Varies</edomvds></edom><edom><edomv>Tribal</edomv><edomvd>Tribe operated school</edomvd><edomvds>Varies</edomvds></edom></attrdomv><attrvai><attrva>Good</attrva><attrvae>Verified</attrvae></attrvai><attrmfrq>As needed</attrmfrq><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">69</attwidth><attalias Sync="TRUE">NAME</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>KINDER</attrlabl><attrdef>Kindergarten Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">Geometry</attrtype><attwidth Sync="TRUE">0</attwidth><attalias Sync="TRUE">Shape</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale><attrdomv><udom Sync="TRUE">Coordinates defining the features.</udom></attrdomv></attr><attr><attrlabl>FIRST</attrlabl><attrdef>First Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">OID</attrtype><attwidth Sync="TRUE">4</attwidth><attalias Sync="TRUE">FID</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale><attrdomv><udom Sync="TRUE">Sequential unique whole numbers that are automatically
generated.</udom></attrdomv></attr><attr><attrlabl>SECOND</attrlabl><attrdef>Second Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><attalias Sync="TRUE">SECOND</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>THIRD</attrlabl><attrdef>Third Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><attalias Sync="TRUE">THIRD</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>FOURTH</attrlabl><attrdef>Fourth Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">8</attwidth><attalias Sync="TRUE">FOURTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>FIFTH</attrlabl><attrdef>Fifth Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><attalias Sync="TRUE">FIFTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>SIXTH</attrlabl><attrdef>Sixth Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><attalias Sync="TRUE">SIXTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>SEVENTH</attrlabl><attrdef>Seventh Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not 
Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><attalias Sync="TRUE">SEVENTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>EIGHTH</attrlabl><attrdef>Eighth Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">7</attwidth><attalias Sync="TRUE">EIGHTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>NINTH</attrlabl><attrdef>Ninth Grade Taught</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">6</attwidth><attalias Sync="TRUE">NINTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>TENTH</attrlabl><attrdef>Tenth Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">7</attwidth><attalias Sync="TRUE">TENTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>ELEVENTH</attrlabl><attrdef>Eleventh Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><attalias Sync="TRUE">ELEVENTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>TWELFTH</attrlabl><attrdef>Twelfth Grade Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><attalias Sync="TRUE">TWELFTH</attalias><atprecis Sync="TRUE">0</atprecis><attscale 
Sync="TRUE">0</attscale></attr><attr><attrlabl>PRESCHL</attrlabl><attrdef>Preschool Level Taught?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><attalias Sync="TRUE">PRESCHL</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>ACCURACY</attrlabl><attrdef>Original Accuracy</attrdef><attrdefs>ADHS</attrdefs><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">12</attwidth><attalias Sync="TRUE">ACCURACY</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>BOARDING</attrlabl><attrdef>Boarding School?</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">10</attwidth><attalias Sync="TRUE">BOARDING</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>REGION</attrlabl><attrdef>Region of State</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Not Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">9</attwidth><attalias Sync="TRUE">REGION</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>WEB_PAGE</attrlabl><attrdef>School Web Page Address</attrdef><attrdefs>Varies</attrdefs><attrvai><attrva>Medium</attrva><attrvae>Partially Verified</attrvae></attrvai><attrtype Sync="TRUE">String</attrtype><attwidth Sync="TRUE">69</attwidth><attalias Sync="TRUE">WEB_PAGE</attalias><atprecis Sync="TRUE">0</atprecis><attscale Sync="TRUE">0</attscale></attr><attr><attrlabl>PLAC_IDNO</attrlabl><attrtype Sync="TRUE">Double</attrtype><attwidth Sync="TRUE">11</attwidth><attalias Sync="TRUE">PLAC_IDNO</attalias><atprecis Sync="TRUE">11</atprecis><attscale
Sync="TRUE">0</attscale></attr></detailed></eainfo><distinfo><resdesc>Downloadable Data</resdesc><stdorder><digform><digtinfo><transize>3.017</transize><dssize Sync="TRUE">0.074</dssize></digtinfo></digform></stdorder></distinfo><metainfo><metd Sync="TRUE">20090730</metd><metc><cntinfo><cntorgp><cntorg>REQUIRED: The organization responsible for the metadata information.</cntorg><cntper>REQUIRED: The person responsible for the metadata information.</cntper></cntorgp><cntaddr><addrtype>REQUIRED: The mailing and/or physical address for the organization or individual.</addrtype><city>REQUIRED: The city of the address.</city><state>REQUIRED: The state or province of the address.</state><postal>REQUIRED: The ZIP or other postal code of the address.</postal></cntaddr><cntvoice>REQUIRED: The telephone number by which individuals can speak to the organization or individual.</cntvoice></cntinfo></metc><metstdn Sync="TRUE">FGDC Content Standards for Digital Geospatial Metadata</metstdn><metstdv Sync="TRUE">FGDC-STD-001-1998</metstdv><mettc Sync="TRUE">local time</mettc><metextns><onlink>http://www.esri.com/metadata/esriprof80.html</onlink><metprof>ESRI Metadata Profile</metprof></metextns><metextns><onlink>http://www.esri.com/metadata/esriprof80.html</onlink><metprof>ESRI Metadata Profile</metprof></metextns><metextns><onlink>http://www.esri.com/metadata/esriprof80.html</onlink><metprof>ESRI Metadata Profile</metprof></metextns><langmeta Sync="TRUE">en</langmeta></metainfo><Esri><CreaDate>20120210</CreaDate><CreaTime>09582900</CreaTime><SyncOnce>FALSE</SyncOnce><SyncDate>20110620</SyncDate><SyncTime>14302900</SyncTime><ModDate>20110620</ModDate><ModTime>14302900</ModTime><DataProperties><itemProps><itemName Sync="TRUE">ADEQ_AZ_SCHOOLS_2008</itemName><imsContentType Sync="TRUE">002</imsContentType><nativeExtBox><westBL Sync="TRUE">144373.429325</westBL><eastBL Sync="TRUE">679220.687500</eastBL><southBL Sync="TRUE">3466849.812280</southBL><northBL 
Sync="TRUE">4096245.182700</northBL><exTypeCode Sync="TRUE">1</exTypeCode></nativeExtBox><itemSize Sync="TRUE">0.074</itemSize><itemLocation><linkage Sync="TRUE">file://\\itfs1.asurite.ad.asu.edu\gisshare1$\spatial_data\AZ_Data_by_Topic\Education\AZ_Schools\ADEQ_AZ_SCHOOLS_2008.shp</linkage><protocol Sync="TRUE">Local Area Network</protocol></itemLocation></itemProps><coordRef><type Sync="TRUE">Projected</type><geogcsn Sync="TRUE">GCS_North_American_1983</geogcsn><projcsn Sync="TRUE">NAD_1983_UTM_Zone_12N</projcsn><peXml Sync="TRUE">&lt;ProjectedCoordinateSystem xsi:type='typens:ProjectedCoordinateSystem' xmlns:xsi='http://www.w3.org/2001/XMLSchema-instance' xmlns:xs='http://www.w3.org/2001/XMLSchema' xmlns:typens='http://www.esri.com/schemas/ArcGIS/10.0'&gt;&lt;WKT&gt;PROJCS[&amp;quot;NAD_1983_UTM_Zone_12N&amp;quot;,GEOGCS[&amp;quot;GCS_North_American_1983&amp;quot;,DATUM[&amp;quot;D_North_American_1983&amp;quot;,SPHEROID[&amp;quot;GRS_1980&amp;quot;,6378137.0,298.257222101]],PRIMEM[&amp;quot;Greenwich&amp;quot;,0.0],UNIT[&amp;quot;Degree&amp;quot;,0.0174532925199433]],PROJECTION[&amp;quot;Transverse_Mercator&amp;quot;],PARAMETER[&amp;quot;False_Easting&amp;quot;,500000.0],PARAMETER[&amp;quot;False_Northing&amp;quot;,0.0],PARAMETER[&amp;quot;Central_Meridian&amp;quot;,-111.0],PARAMETER[&amp;quot;Scale_Factor&amp;quot;,0.9996],PARAMETER[&amp;quot;Latitude_Of_Origin&amp;quot;,0.0],UNIT[&amp;quot;Meter&amp;quot;,1.0],AUTHORITY[&amp;quot;EPSG&amp;quot;,26912]]&lt;/WKT&gt;&lt;XOrigin&gt;-5120900&lt;/XOrigin&gt;&lt;YOrigin&gt;-9998100&lt;/YOrigin&gt;&lt;XYScale&gt;450445547.3910538&lt;/XYScale&gt;&lt;ZOrigin&gt;-100000&lt;/ZOrigin&gt;&lt;ZScale&gt;10000&lt;/ZScale&gt;&lt;MOrigin&gt;-100000&lt;/MOrigin&gt;&lt;MScale&gt;10000&lt;/MScale&gt;&lt;XYTolerance&gt;0.001&lt;/XYTolerance&gt;&lt;ZTolerance&gt;0.001&lt;/ZTolerance&gt;&lt;MTolerance&gt;0.001&lt;/MTolerance&gt;&lt;HighPrecision&gt;true&lt;/HighPrecision&gt;&lt;WKID&gt;26912&lt;/WKID&gt;&lt;/ProjectedCoordinateSystem&g
t;</peXml></coordRef><lineage><Process ToolSource="C:\Program Files\ArcGIS\ArcToolbox\Toolboxes\Data Management Tools.tbx\Project" Date="20120210" Time="095753">Project schools Z:\Desktop\example_data\schools_Project.shp PROJCS['NAD_1983_StatePlane_Arizona_Central_FIPS_0202_Feet',GEOGCS['GCS_North_American_1983',DATUM['D_North_American_1983',SPHEROID['GRS_1980',6378137.0,298.257222101]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Transverse_Mercator'],PARAMETER['False_Easting',699998.6],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-111.9166666666667],PARAMETER['Scale_Factor',0.9999],PARAMETER['Latitude_Of_Origin',31.0],UNIT['Foot_US',0.3048006096012192]] # PROJCS['NAD_1983_UTM_Zone_12N',GEOGCS['GCS_North_American_1983',DATUM['D_North_American_1983',SPHEROID['GRS_1980',6378137.0,298.257222101]],PRIMEM['Greenwich',0.0],UNIT['Degree',0.0174532925199433]],PROJECTION['Transverse_Mercator'],PARAMETER['False_Easting',500000.0],PARAMETER['False_Northing',0.0],PARAMETER['Central_Meridian',-111.0],PARAMETER['Scale_Factor',0.9996],PARAMETER['Latitude_Of_Origin',0.0],UNIT['Meter',1.0]]</Process></lineage></DataProperties><ArcGISFormat>1.0</ArcGISFormat><MetaID>{171D5199-0709-4DCB-AFC4-AD87037D006B}</MetaID></Esri><mdLang><languageCode value="en"/></mdLang><mdHrLv><ScopeCd value="005"/></mdHrLv><mdStanName>ArcGIS Metadata</mdStanName><mdStanVer>1.0</mdStanVer><distInfo><distTranOps><transSize>3.017</transSize></distTranOps><distTranOps><onLineSrc><linkage>\\adeq.lcl\gisprod\data\adeq\schools-everything-8-14-07.shp</linkage></onLineSrc></distTranOps><distFormat><formatName Sync="TRUE">Shapefile</formatName></distFormat><distributor><distorTran><onLineSrc><linkage Sync="TRUE">file://</linkage><protocol Sync="TRUE">Local Area Network</protocol></onLineSrc></distorTran></distributor></distInfo><dataIdInfo><idCitation><date><pubDate/></date><resEd>Schools 8-14-2007</resEd><citRespParty><rpOrgName>Arizona Department of Environmental Quality, 
Arizona Department of Health Services</rpOrgName><role><RoleCd value="006"/></role></citRespParty><resTitle Sync="TRUE">ADEQ_AZ_SCHOOLS_2008</resTitle><presForm><PresFormCd value="005"></PresFormCd></presForm></idCitation><idAbs>This data set is a general reference for schools or "learning sites" in Arizona. It represents schools from the AZ Department of Education (CTDS numbers, charter and public schools), AZ School Facilities Board, private schools, some technical schools, colleges and universities.</idAbs><idPurp>The intention with which the data set was developed is for general reference only. It is representative only presenting a single point in time for the topic "learning sites." It is not the final or authoritative legal documentation for the learning sites data or locations.</idPurp><idCredit>This data set has been created in collaboration with the Arizona Department of Education, Arizona Department of Health Services, Arizona State Land Department and the Arizona State Cartographers Office.</idCredit><idStatus><ProgCd value="007"/></idStatus><idPoC><rpOrgName>Arizona Department of Environmental Quality</rpOrgName><rpCntInfo><cntAddress><delPoint>1110 W Washington St</delPoint><city>Phoenix</city><adminArea>Arizona</adminArea><postCode>85007</postCode><country>US</country></cntAddress></rpCntInfo><role><RoleCd value="007"/></role></idPoC><resMaint><maintFreq><MaintFreqCd value="009"/></maintFreq></resMaint><placeKeys><keyword>Arizona</keyword></placeKeys><tempKeys><keyword>2008</keyword></tempKeys><themeKeys><keyword>montessori</keyword><keyword>environment</keyword><keyword>junior college</keyword><keyword>middle school</keyword><keyword>elementary school</keyword><keyword>grade school</keyword><keyword>community college</keyword><keyword>Arizona</keyword><keyword>kindergarten</keyword><keyword>schools</keyword><keyword>colleges</keyword><keyword>Department of Environmental Quality</keyword><keyword>private 
school</keyword><keyword>ADEQ</keyword><keyword>Arizona Department of Education</keyword><keyword>parochial school</keyword><keyword>learning sites</keyword><keyword>Charter School</keyword><keyword>high school</keyword><keyword>Environmental Quality</keyword><keyword>universities</keyword><keyword>university</keyword></themeKeys><searchKeys><keyword>montessori</keyword><keyword>environment</keyword><keyword>junior college</keyword><keyword>middle school</keyword><keyword>elementary school</keyword><keyword>grade school</keyword><keyword>community college</keyword><keyword>Arizona</keyword><keyword>kindergarten</keyword><keyword>schools</keyword><keyword>colleges</keyword><keyword>Department of Environmental Quality</keyword><keyword>private school</keyword><keyword>ADEQ</keyword><keyword>Arizona Department of Education</keyword><keyword>parochial school</keyword><keyword>learning sites</keyword><keyword>Arizona</keyword><keyword>2008</keyword><keyword>Charter School</keyword><keyword>high school</keyword><keyword>Environmental Quality</keyword><keyword>universities</keyword><keyword>university</keyword></searchKeys><resConst><LegConsts><accessConsts><RestrictCd value="008"/></accessConsts><othConsts>Access constraints: Access to these data are allowed for non-commercial applications without charge. Commercial uses require payment.</othConsts></LegConsts></resConst><resConst><LegConsts><useConsts><RestrictCd value="008"/></useConsts><othConsts>Use constraints: The Arizona Department of Environmental Quality and others have compiled this data as a service to our customers using information from various sources. ADEQ and its collaborators cannot ensure that the information is accurate, current or complete. Neither the information presented nor maps derived from them are official documents. All data are provided "as is" and may contain errors. The data are for reference and illustration purposes only and are not suitable for site-specific decision making. 
Information found here should not be used for making financial or any other commitments. Conclusions drawn from such information are the responsibility of the user. ADEQ assumes no responsibility for errors arising from misuse of the data or maps derived from the data. ADEQ disclaims any liability for injury, damage or loss that might result from the use of this information. In no event shall ADEQ become liable to users of these data and maps, or any other party, arising from the use or modification of the data.</othConsts></LegConsts></resConst><resConst><SecConsts><class><ClasscationCd value="001"/></class></SecConsts></resConst><dataLang><languageCode value="en" country="US"/></dataLang><tpCat><TopicCatCd value="007"/></tpCat><dataExt><exDesc>publication date</exDesc><geoEle><GeoBndBox esriExtentType="search"><westBL Sync="TRUE">-114.993109</westBL><eastBL Sync="TRUE">-108.985983</eastBL><northBL Sync="TRUE">37.012375</northBL><southBL Sync="TRUE">31.281734</southBL><exTypeCode Sync="TRUE">1</exTypeCode></GeoBndBox></geoEle></dataExt><suppInfo>This data set does not contain locations for all cosmetology or beauty colleges, horseshoeing or welding technical schools, or other trade schools.</suppInfo><envirDesc Sync="TRUE">Microsoft Windows XP Version 5.1 (Build 2600) Service Pack 3; ESRI ArcGIS 10.0.2.3200</envirDesc><descKeys><thesaName uuidref="723f6998-058e-11dc-8314-0800200c9a66"></thesaName><keyword Sync="TRUE">002</keyword></descKeys><spatRpType><SpatRepTypCd value="001"></SpatRepTypCd></spatRpType></dataIdInfo><dqInfo><dqScope><scpLvl><ScopeCd value="005"/></scpLvl></dqScope><dataLineage/></dqInfo><spatRepInfo><VectSpatRep><topLvl><TopoLevCd value="001"></TopoLevCd></topLvl><geometObjs><geoObjTyp><GeoObjTypCd value="004"></GeoObjTypCd><geoObjCnt Sync="TRUE">2772</geoObjCnt></geoObjTyp></geometObjs></VectSpatRep></spatRepInfo><Binary><Enclosure><Descript>original metadata</Descript><Data SourceMetadata="yes" OriginalFileName="source_metadata.xml" 
SourceMetadataDigest="3bf31794c3811024c8bd56a81c2e3c" EsriPropertyType="Base64" SourceMetadataSchema="fgdc">PD94bWwgdmVyc2lvbj0iMS4wIiBlbmNvZGluZz0idXRmLTgiPz4NCjxtZXRhZGF0YT4NCiAgPGlk
-aW5mbz4NCiAgICA8Y2l0YXRpb24+DQogICAgICA8Y2l0ZWluZm8+DQogICAgICAgIDxvcmlnaW4+
-QXJpem9uYSBEZXBhcnRtZW50IG9mIEVudmlyb25tZW50YWwgUXVhbGl0eSwgQXJpem9uYSBEZXBh
-cnRtZW50IG9mIEhlYWx0aCBTZXJ2aWNlczwvb3JpZ2luPg0KICAgICAgICA8cHViZGF0ZT5BdWd1
-c3QgMTQsIDIwMDc8L3B1YmRhdGU+DQogICAgICAgIDxlZGl0aW9uPlNjaG9vbHMgOC0xNC0yMDA3
-PC9lZGl0aW9uPg0KICAgICAgICA8b25saW5rPlxcYWRlcS5sY2xcZ2lzcHJvZFxkYXRhXGFkZXFc
-c2Nob29scy1ldmVyeXRoaW5nLTgtMTQtMDcuc2hwPC9vbmxpbms+DQogICAgICA8L2NpdGVpbmZv
-Pg0KICAgIDwvY2l0YXRpb24+DQogICAgPGRlc2NyaXB0Pg0KICAgICAgPGFic3RyYWN0PlRoaXMg
-ZGF0YSBzZXQgaXMgYSBnZW5lcmFsIHJlZmVyZW5jZSBmb3Igc2Nob29scyBvciAibGVhcm5pbmcg
-c2l0ZXMiIGluIEFyaXpvbmEuICBJdCByZXByZXNlbnRzIHNjaG9vbHMgZnJvbSB0aGUgQVogRGVw
-YXJ0bWVudCBvZiBFZHVjYXRpb24gKENURFMgbnVtYmVycywgY2hhcnRlciBhbmQgcHVibGljIHNj
-aG9vbHMpLCBBWiBTY2hvb2wgRmFjaWxpdGllcyBCb2FyZCwgcHJpdmF0ZSBzY2hvb2xzLCBzb21l
-IHRlY2huaWNhbCBzY2hvb2xzLCBjb2xsZWdlcyBhbmQgdW5pdmVyc2l0aWVzLjwvYWJzdHJhY3Q+
-DQogICAgICA8cHVycG9zZT5UaGUgaW50ZW50aW9uIHdpdGggd2hpY2ggdGhlIGRhdGEgc2V0IHdh
-cyBkZXZlbG9wZWQgaXMgZm9yIGdlbmVyYWwgcmVmZXJlbmNlIG9ubHkuICBJdCBpcyByZXByZXNl
-bnRhdGl2ZSBvbmx5IHByZXNlbnRpbmcgYSBzaW5nbGUgcG9pbnQgaW4gdGltZSBmb3IgdGhlIHRv
-cGljICJsZWFybmluZyBzaXRlcy4iICBJdCBpcyBub3QgdGhlIGZpbmFsIG9yIGF1dGhvcml0YXRp
-dmUgbGVnYWwgZG9jdW1lbnRhdGlvbiBmb3IgdGhlIGxlYXJuaW5nIHNpdGVzIGRhdGEgb3IgbG9j
-YXRpb25zLjwvcHVycG9zZT4NCiAgICAgIDxzdXBwbGluZj5UaGlzIGRhdGEgc2V0IGRvZXMgbm90
-IGNvbnRhaW4gbG9jYXRpb25zIGZvciBhbGwgY29zbWV0b2xvZ3kgb3IgYmVhdXR5IGNvbGxlZ2Vz
-LCBob3JzZXNob2Vpbmcgb3Igd2VsZGluZyB0ZWNobmljYWwgc2Nob29scywgb3Igb3RoZXIgdHJh
-ZGUgc2Nob29scy48L3N1cHBsaW5mPg0KICAgIDwvZGVzY3JpcHQ+DQogICAgPHRpbWVwZXJkPg0K
-ICAgICAgPGN1cnJlbnQ+cHVibGljYXRpb24gZGF0ZTwvY3VycmVudD4NCiAgICA8L3RpbWVwZXJk
-Pg0KICAgIDxzdGF0dXM+DQogICAgICA8cHJvZ3Jlc3M+SW4gd29yazwvcHJvZ3Jlc3M+DQogICAg
-ICA8dXBkYXRlPkFzIG5lZWRlZDwvdXBkYXRlPg0KICAgIDwvc3RhdHVzPg0KICAgIDxrZXl3b3Jk
-cz4NCiAgICAgIDx0aGVtZT4NCiAgICAgICAgPHRoZW1la2V5PkFERVE8L3RoZW1la2V5Pg0KICAg
-ICAgICA8dGhlbWVrZXk+ZW52aXJvbm1lbnQ8L3RoZW1la2V5Pg0KICAgICAgICA8dGhlbWVrZXk+
-QXJpem9uYTwvdGhlbWVrZXk+DQogICAgICAgIDx0aGVtZWtleT5FbnZpcm9ubWVudGFsIFF1YWxp
-dHk8L3RoZW1la2V5Pg0KICAgICAgICA8dGhlbWVrZXk+RGVwYXJ0bWVudCBvZiBFbnZpcm9ubWVu
-dGFsIFF1YWxpdHk8L3RoZW1la2V5Pg0KICAgICAgICA8dGhlbWVrZXk+c2Nob29sczwvdGhlbWVr
-ZXk+DQogICAgICAgIDx0aGVtZWtleT5sZWFybmluZyBzaXRlczwvdGhlbWVrZXk+DQogICAgICAg
-IDx0aGVtZWtleT5jb2xsZWdlczwvdGhlbWVrZXk+DQogICAgICAgIDx0aGVtZWtleT51bml2ZXJz
-aXRpZXM8L3RoZW1la2V5Pg0KICAgICAgICA8dGhlbWVrZXk+Z3JhZGUgc2Nob29sPC90aGVtZWtl
-eT4NCiAgICAgICAgPHRoZW1la2V5PmVsZW1lbnRhcnkgc2Nob29sPC90aGVtZWtleT4NCiAgICAg
-ICAgPHRoZW1la2V5PmhpZ2ggc2Nob29sPC90aGVtZWtleT4NCiAgICAgICAgPHRoZW1la2V5Pm1p
-ZGRsZSBzY2hvb2w8L3RoZW1la2V5Pg0KICAgICAgICA8dGhlbWVrZXk+a2luZGVyZ2FydGVuPC90
-aGVtZWtleT4NCiAgICAgICAgPHRoZW1la2V5PnByaXZhdGUgc2Nob29sPC90aGVtZWtleT4NCiAg
-ICAgICAgPHRoZW1la2V5PnBhcm9jaGlhbCBzY2hvb2w8L3RoZW1la2V5Pg0KICAgICAgICA8dGhl
-bWVrZXk+bW9udGVzc29yaTwvdGhlbWVrZXk+DQogICAgICAgIDx0aGVtZWtleT5jb21tdW5pdHkg
-Y29sbGVnZTwvdGhlbWVrZXk+DQogICAgICAgIDx0aGVtZWtleT5qdW5pb3IgY29sbGVnZTwvdGhl
-bWVrZXk+DQogICAgICAgIDx0aGVtZWtleT51bml2ZXJzaXR5PC90aGVtZWtleT4NCiAgICAgICAg
-PHRoZW1la2V5PkFyaXpvbmEgIERlcGFydG1lbnQgb2YgRWR1Y2F0aW9uPC90aGVtZWtleT4NCiAg
-ICAgICAgPHRoZW1la2V5PkNoYXJ0ZXIgU2Nob29sPC90aGVtZWtleT4NCiAgICAgIDwvdGhlbWU+
-DQogICAgICA8cGxhY2U+DQogICAgICAgIDxwbGFjZWtleT5Bcml6b25hPC9wbGFjZWtleT4NCiAg
-ICAgIDwvcGxhY2U+DQogICAgICA8dGVtcG9yYWw+DQogICAgICAgIDx0ZW1wa2V5PjIwMDg8L3Rl
-bXBrZXk+DQogICAgICA8L3RlbXBvcmFsPg0KICAgIDwva2V5d29yZHM+DQogICAgPGFjY2NvbnN0
-PkFjY2VzcyB0byB0aGVzZSBkYXRhIGFyZSBhbGxvd2VkIGZvciBub24tY29tbWVyY2lhbCBhcHBs
-aWNhdGlvbnMgd2l0aG91dCBjaGFyZ2UuICBDb21tZXJjaWFsIHVzZXMgcmVxdWlyZSBwYXltZW50
-LjwvYWNjY29uc3Q+DQogICAgPHVzZWNvbnN0PlRoZSBBcml6b25hIERlcGFydG1lbnQgb2YgRW52
-aXJvbm1lbnRhbCBRdWFsaXR5IGFuZCBvdGhlcnMgaGF2ZSBjb21waWxlZCB0aGlzIGRhdGEgYXMg
-YSBzZXJ2aWNlIHRvIG91ciBjdXN0b21lcnMgdXNpbmcgaW5mb3JtYXRpb24gZnJvbSB2YXJpb3Vz
-IHNvdXJjZXMuIEFERVEgYW5kIGl0cyBjb2xsYWJvcmF0b3JzIGNhbm5vdCBlbnN1cmUgdGhhdCB0
-aGUgaW5mb3JtYXRpb24gaXMgYWNjdXJhdGUsIGN1cnJlbnQgb3IgY29tcGxldGUuIE5laXRoZXIg
-dGhlIGluZm9ybWF0aW9uIHByZXNlbnRlZCBub3IgbWFwcyBkZXJpdmVkIGZyb20gdGhlbSBhcmUg
-b2ZmaWNpYWwgZG9jdW1lbnRzLiAgDQoNCkFsbCBkYXRhIGFyZSBwcm92aWRlZCAiYXMgaXMiIGFu
-ZCBtYXkgY29udGFpbiBlcnJvcnMuIFRoZSBkYXRhIGFyZSBmb3IgcmVmZXJlbmNlIGFuZCBpbGx1
-c3RyYXRpb24gcHVycG9zZXMgb25seSBhbmQgYXJlIG5vdCBzdWl0YWJsZSBmb3Igc2l0ZS1zcGVj
-aWZpYyBkZWNpc2lvbiBtYWtpbmcuIEluZm9ybWF0aW9uIGZvdW5kIGhlcmUgc2hvdWxkIG5vdCBi
-ZSB1c2VkIGZvciBtYWtpbmcgZmluYW5jaWFsIG9yIGFueSBvdGhlciBjb21taXRtZW50cy4gQ29u
-Y2x1c2lvbnMgZHJhd24gZnJvbSBzdWNoIGluZm9ybWF0aW9uIGFyZSB0aGUgcmVzcG9uc2liaWxp
-dHkgb2YgdGhlIHVzZXIuICANCg0KQURFUSBhc3N1bWVzIG5vIHJlc3BvbnNpYmlsaXR5IGZvciBl
-cnJvcnMgYXJpc2luZyBmcm9tIG1pc3VzZSBvZiB0aGUgZGF0YSBvciBtYXBzIGRlcml2ZWQgZnJv
-bSB0aGUgZGF0YS4gQURFUSBkaXNjbGFpbXMgYW55IGxpYWJpbGl0eSBmb3IgaW5qdXJ5LCBkYW1h
-Z2Ugb3IgbG9zcyB0aGF0IG1pZ2h0IHJlc3VsdCBmcm9tIHRoZSB1c2Ugb2YgdGhpcyBpbmZvcm1h
-dGlvbi4gSW4gbm8gZXZlbnQgc2hhbGwgQURFUSBiZWNvbWUgbGlhYmxlIHRvIHVzZXJzIG9mIHRo
-ZXNlIGRhdGEgYW5kIG1hcHMsIG9yIGFueSBvdGhlciBwYXJ0eSwgYXJpc2luZyBmcm9tIHRoZSB1
-c2Ugb3IgbW9kaWZpY2F0aW9uIG9mIHRoZSBkYXRhLjwvdXNlY29uc3Q+DQogICAgPHB0Y29udGFj
-Pg0KICAgICAgPGNudGluZm8+DQogICAgICAgIDxjbnRvcmdwPg0KICAgICAgICAgIDxjbnRvcmc+
-QXJpem9uYSBEZXBhcnRtZW50IG9mIEVudmlyb25tZW50YWwgUXVhbGl0eTwvY250b3JnPg0KICAg
-ICAgICA8L2NudG9yZ3A+DQogICAgICAgIDxjbnRhZGRyPg0KICAgICAgICAgIDxhZGRydHlwZT5t
-YWlsaW5nIGFuZCBwaHlzaWNhbCBhZGRyZXNzPC9hZGRydHlwZT4NCiAgICAgICAgICA8YWRkcmVz
-cz4xMTEwIFcgV2FzaGluZ3RvbiBTdDwvYWRkcmVzcz4NCiAgICAgICAgICA8Y2l0eT5QaG9lbml4
-PC9jaXR5Pg0KICAgICAgICAgIDxzdGF0ZT5Bcml6b25hPC9zdGF0ZT4NCiAgICAgICAgICA8cG9z
-dGFsPjg1MDA3PC9wb3N0YWw+DQogICAgICAgICAgPGNvdW50cnk+VVNBPC9jb3VudHJ5Pg0KICAg
-ICAgICA8L2NudGFkZHI+DQogICAgICA8L2NudGluZm8+DQogICAgPC9wdGNvbnRhYz4NCiAgICA8
-ZGF0YWNyZWQ+VGhpcyBkYXRhIHNldCBoYXMgYmVlbiBjcmVhdGVkIGluIGNvbGxhYm9yYXRpb24g
-d2l0aCB0aGUgQXJpem9uYSBEZXBhcnRtZW50IG9mIEVkdWNhdGlvbiwgQXJpem9uYSBEZXBhcnRt
-ZW50IG9mIEhlYWx0aCBTZXJ2aWNlcywgQXJpem9uYSBTdGF0ZSBMYW5kIERlcGFydG1lbnQgIGFu
-ZCB0aGUgQXJpem9uYSBTdGF0ZSBDYXJ0b2dyYXBoZXJzIE9mZmljZS48L2RhdGFjcmVkPg0KICAg
-IDxzZWNpbmZvPg0KICAgICAgPHNlY2NsYXNzPlVuY2xhc3NpZmllZDwvc2VjY2xhc3M+DQogICAg
-PC9zZWNpbmZvPg0KICA8L2lkaW5mbz4NCiAgPGRhdGFxdWFsPg0KICAgIDxsaW5lYWdlPg0KICAg
-ICAgPHByb2NzdGVwPg0KICAgICAgICA8cHJvY2Rlc2M+RGF0YXNldCBjb3BpZWQuPC9wcm9jZGVz
-Yz4NCiAgICAgIDwvcHJvY3N0ZXA+DQogICAgICA8cHJvY3N0ZXA+DQogICAgICAgIDxwcm9jZGVz
-Yz5NZXRhZGF0YSBpbXBvcnRlZC48L3Byb2NkZXNjPg0KICAgICAgICA8c3JjdXNlZD5EOlxET0NV
-TUV+MVxWTUd+MS5BREVcTE9DQUxTfjFcVGVtcFx4bWxDQS50bXA8L3NyY3VzZWQ+DQogICAgICA8
-L3Byb2NzdGVwPg0KICAgICAgPHByb2NzdGVwPg0KICAgICAgICA8cHJvY2Rlc2M+TWV0YWRhdGEg
-aW1wb3J0ZWQuPC9wcm9jZGVzYz4NCiAgICAgICAgPHNyY3VzZWQ+UzpcY29tbW9uXHZtZ1xzY2hv
-b2xzbWV0YWRhdGEueG1sPC9zcmN1c2VkPg0KICAgICAgPC9wcm9jc3RlcD4NCiAgICA8L2xpbmVh
-Z2U+DQogIDwvZGF0YXF1YWw+DQogIDxzcGRvaW5mbz4NCiAgICA8cHR2Y3RpbmY+DQogICAgICA8
-ZXNyaXRlcm0gTmFtZT0iU0NIT09MU19FVkVSWVRISU5HXzhfN18wOCIgLz4NCiAgICA8L3B0dmN0
-aW5mPg0KICA8L3NwZG9pbmZvPg0KICA8ZWFpbmZvPg0KICAgIDxkZXRhaWxlZCBOYW1lPSJTQ0hP
-T0xTX0VWRVJZVEhJTkdfOF83XzA4Ij4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+
-RklEPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+SW50ZXJuYWwgZmVhdHVyZSBudW1iZXIu
-PC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+RVNSSTwvYXR0cmRlZnM+DQogICAgICAgIDxh
-dHRyZG9tdj4NCiAgICAgICAgICA8dWRvbT5TZXF1ZW50aWFsIHVuaXF1ZSB3aG9sZSBudW1iZXJz
-IHRoYXQgYXJlIGF1dG9tYXRpY2FsbHkgZ2VuZXJhdGVkLjwvdWRvbT4NCiAgICAgICAgPC9hdHRy
-ZG9tdj4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+U2hh
-cGU8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5GZWF0dXJlIGdlb21ldHJ5LjwvYXR0cmRl
-Zj4NCiAgICAgICAgPGF0dHJkZWZzPkVTUkk8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cmRvbXY+
-DQogICAgICAgICAgPHVkb20+Q29vcmRpbmF0ZXMgZGVmaW5pbmcgdGhlIGZlYXR1cmVzLjwvdWRv
-bT4NCiAgICAgICAgPC9hdHRyZG9tdj4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0KICAg
-ICAgICA8YXR0cmxhYmw+TkFNRTwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlNjaG9vbCBv
-ciBsZWFybmluZyBzaXRlIG5hbWU8L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8
-L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPkdvb2Q8L2F0
-dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5OYW1lcyB2ZXJpZmllZCB0byBtdWxpdHBsZSBzb3Vy
-Y2VzPC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICAgIDxhdHRybWZycT5BcyBu
-ZWVkZWQ8L2F0dHJtZnJxPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxh
-dHRybGFibD5BRERSRVNTPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+UGh5c2ljYWwgYWRk
-cmVzcyBvZiBzY2hvb2wgb3IgbGVhcm5pbmcgc2l0ZTwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJk
-ZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAgIDxhdHRy
-dmE+R29vZDwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPlZlcmlmaWVkIHRvIG11bHRpcGxl
-IHNvdXJjZXM8L2F0dHJ2YWU+DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgICAgPGF0dHJtZnJx
-PkFzIG5lZWRlZDwvYXR0cm1mcnE+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAg
-ICAgPGF0dHJsYWJsPlpJUDwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlpJUCBDb2RlIG9m
-IHBoeXNpY2FsIGxvY2F0aW9uPC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VVNQUyBaSVAg
-Q29kZTwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAgIDxhdHRydmE+R29v
-ZDwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPlZlcmlmaWVkIHRvIG91dHNpZGUgc291cmNl
-PC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICAgIDxhdHRybWZycT5BcyBuZWVk
-ZWQ8L2F0dHJtZnJxPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRy
-bGFibD5DVERTPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+QVogRGVwdCBvZiBFZHVjYXRp
-b24gSWRlbnRpZmljYXRpb24gTnVtYmVyLCAoQ291bnR5IENvZGUsIFR5cGUgQ29kZSwgRGlzdHJp
-Y3QgQ29kZSAmYW1wOyBTaXRlIE51bWJlcjwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPkFa
-IERlcHQuIG9mIEVkdWNhdGlvbjwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAg
-ICAgIDxhdHRydmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+TWlzc2luZyBs
-ZWFkaW5nIHplcm9zIGluIHN0cmluZyBmaWVsZDwvYXR0cnZhZT4NCiAgICAgICAgPC9hdHRydmFp
-Pg0KICAgICAgICA8YXR0cm1mcnE+QXMgbmVlZGVkPC9hdHRybWZycT4NCiAgICAgIDwvYXR0cj4N
-CiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+Q1REU19OVU08L2F0dHJsYWJsPg0KICAg
-ICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5TVEFUVVM8L2F0dHJs
-YWJsPg0KICAgICAgICA8YXR0cmRlZj5PcGVyYXRpbmcgU3RhdHVzIChvcGVuICwgY2xvc2VkLCBw
-cm9wb3NlZCk8L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0K
-ICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPkdvb2Q8L2F0dHJ2YT4NCiAgICAg
-ICAgICA8YXR0cnZhZT5WYWxpZGF0ZWQgdG8gbXVsdGlwbGUgc291cmNlczwvYXR0cnZhZT4NCiAg
-ICAgICAgPC9hdHRydmFpPg0KICAgICAgICA8YXR0cm1mcnE+QXMgbmVlZGVkPC9hdHRybWZycT4N
-CiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+TE9DQVRJT048
-L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5Mb2NhdGlvbiBjaGVjayBtZXRob2Q8L2F0dHJk
-ZWY+DQogICAgICAgIDxhdHRyZGVmcz5BREVRPC9hdHRyZGVmcz4NCiAgICAgICAgPGF0dHJkb212
-Pg0KICAgICAgICAgIDxlZG9tPg0KICAgICAgICAgICAgPGVkb212PkRJRzwvZWRvbXY+DQogICAg
-ICAgICAgICA8ZWRvbXZkPkRpZ2l0YWxseSB2ZXJpZmllZCBhZ2FpbnN0IHJhc3RlciBkYXRhIG9y
-IG90aGVyIGRhdGEgc2V0IChwYXJjZWxzKTwvZWRvbXZkPg0KICAgICAgICAgICAgPGVkb212ZHM+
-QURFUTwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQogICAg
-ICAgICAgICA8ZWRvbXY+Tk9OPC9lZG9tdj4NCiAgICAgICAgICAgIDxlZG9tdmQ+Tm9uLXNwZWNp
-ZmljLCBtdWx0aXBsZSBtZXRob2RzIG9mIHZlcmlmaWNhdGlvbiAoZGlnaXRhbCwgZ2VvY29kaW5n
-LCBHUFMsIGV0Yy4pPC9lZG9tdmQ+DQogICAgICAgICAgICA8ZWRvbXZkcz5BREVRPC9lZG9tdmRz
-Pg0KICAgICAgICAgIDwvZWRvbT4NCiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9t
-dj5HUFM8L2Vkb212Pg0KICAgICAgICAgICAgPGVkb212ZD5HbG9iYWwgUG9zaXRpb25pbmcgU3lz
-dGVtIC0gZmllbGQgY29sbGVjdGVkPC9lZG9tdmQ+DQogICAgICAgICAgICA8ZWRvbXZkcz5BREVR
-PC9lZG9tdmRzPg0KICAgICAgICAgIDwvZWRvbT4NCiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAg
-ICAgIDxlZG9tdj5HRU88L2Vkb212Pg0KICAgICAgICAgICAgPGVkb212ZD5PcmlnaW5hbGx5IGdl
-b2NvZGVkIC0gYWRkcmVzcyBtYXRjaGVkIFtsb2NhdGlvbiB2ZXJpZmllZCBieSBvdGhlciBtZXRo
-b2RzXTwvZWRvbXZkPg0KICAgICAgICAgICAgPGVkb212ZHM+QURFUTwvZWRvbXZkcz4NCiAgICAg
-ICAgICA8L2Vkb20+DQogICAgICAgIDwvYXR0cmRvbXY+DQogICAgICAgIDxhdHRydmFpPg0KICAg
-ICAgICAgIDxhdHRydmE+R29vZDwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPlZlcmlmaWVk
-PC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICAgIDxhdHRybWZycT5BcyBuZWVk
-ZWQ8L2F0dHJtZnJxPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRy
-bGFibD5RQV9RQzwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlF1YWxpdHkgQXNzdXJhbmNl
-IC8gUXVhbGl0eSBDb250cm9sIENvZGU8L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5BREVR
-PC9hdHRyZGVmcz4NCiAgICAgICAgPGF0dHJkb212Pg0KICAgICAgICAgIDxlZG9tPg0KICAgICAg
-ICAgICAgPGVkb212PjA8L2Vkb212Pg0KICAgICAgICAgICAgPGVkb212ZD5Vc2VkIGFzIGlkZW50
-aWZpZXIgZm9yIHZlcnNpb24gYWRkaXRpb25zIHRvIGRhdGEgc2V0IC0gR1BTXSwgbG9jYXRpb24g
-cXVhbGl0eSBpcyAib2siPC9lZG9tdmQ+DQogICAgICAgICAgICA8ZWRvbXZkcz5BREVRPC9lZG9t
-dmRzPg0KICAgICAgICAgIDwvZWRvbT4NCiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxl
-ZG9tdj4xPC9lZG9tdj4NCiAgICAgICAgICAgIDxlZG9tdmQ+VmVyeSBoaWdoIGNvbmZpZGVuY2Ug
-b2YgbG9jYXRpb24gYWNjdXJhY3ksIG1hdGNoZWQgdG8gYXQgbGVhc3QgdHdvIGluZGVwZW5kZW50
-IHNvdXJjZXM8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRzPkFERVE8L2Vkb212ZHM+DQog
-ICAgICAgICAgPC9lZG9tPg0KICAgICAgICAgIDxlZG9tPg0KICAgICAgICAgICAgPGVkb212PjI8
-L2Vkb212Pg0KICAgICAgICAgICAgPGVkb212ZD5Mb3cgY29uZmlkZW5jZSBvZiBsb2NhdGlvbmFs
-IGFjY3VyYWN5LCB1bmFibGUgdG8gbWF0Y2ggdG8gb3RoZXIgc291cmNlczwvZWRvbXZkPg0KICAg
-ICAgICAgICAgPGVkb212ZHM+QURFUTwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAg
-ICAgICAgPGVkb20+DQogICAgICAgICAgICA8ZWRvbXY+MzwvZWRvbXY+DQogICAgICAgICAgICA8
-ZWRvbXZkPlVzZWQgYXMgaWRlbnRpZmllciBmb3IgdmVyc2lvbiBhZGRpdGlvbnMgdG8gZGF0YSBz
-ZXQgLSBHUFMgb3IgTk9OLCBsb2NhdGlvbiBxdWFsaXR5IGlzICJvayI8L2Vkb212ZD4NCiAgICAg
-ICAgICAgIDxlZG9tdmRzPkFERVE8L2Vkb212ZHM+DQogICAgICAgICAgPC9lZG9tPg0KICAgICAg
-ICAgIDxlZG9tPg0KICAgICAgICAgICAgPGVkb212PjQ8L2Vkb212Pg0KICAgICAgICAgICAgPGVk
-b212ZD5Vc2VkIGFzIGlkZW50aWZpZXIgZm9yIHZlcnNpb24gYWRkaXRpb25zIHRvIGRhdGEgc2V0
-IC0gR1BTIG9yIE5PTiwgbG9jYXRpb24gcXVhbGl0eSBpcyAib2siPC9lZG9tdmQ+DQogICAgICAg
-ICAgICA8ZWRvbXZkcz5BREVRPC9lZG9tdmRzPg0KICAgICAgICAgIDwvZWRvbT4NCiAgICAgICAg
-ICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj41PC9lZG9tdj4NCiAgICAgICAgICAgIDxlZG9t
-dmQ+VmVyeSBoaWdoIGNvbmZpZGVuY2Ugb2YgbG9jYXRpb24gYWNjdXJhY3ksIG1hdGNoZWQgZGln
-aXRhbGx5LCBldGMuIHRvIGF0IGxlYXN0IHR3byBpbmRlcGVuZGVudCBzb3VyY2VzPC9lZG9tdmQ+
-DQogICAgICAgICAgICA8ZWRvbXZkcz5BREVRPC9lZG9tdmRzPg0KICAgICAgICAgIDwvZWRvbT4N
-CiAgICAgICAgPC9hdHRyZG9tdj4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2
-YT5Hb29kPC9hdHRydmE+DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgIDwvYXR0cj4NCiAgICAg
-IDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+Q0lUWTwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRy
-ZGVmPlBoeXNpY2FsIGxvY2F0aW9uIGNpdHkgb3IgdG93bjwvYXR0cmRlZj4NCiAgICAgICAgPGF0
-dHJkZWZzPkRpZ2l0YWw8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8
-YXR0cnZhPkdvb2Q8L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5WZXJpZmllZDwvYXR0cnZh
-ZT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgICA8YXR0cm1mcnE+QXMgbmVlZGVkPC9hdHRy
-bWZycT4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+Q09V
-TlRZPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+Q291bnR5PC9hdHRyZGVmPg0KICAgICAg
-ICA8YXR0cmRlZnM+RGlnaXRhbDwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAg
-ICAgIDxhdHRydmE+R29vZDwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPlZlcmlmaWVkPC9h
-dHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICAgIDxhdHRybWZycT5BcyBuZWVkZWQ8
-L2F0dHJtZnJxPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFi
-bD5QSE9ORTwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlBob25lIE51bWJlcjwvYXR0cmRl
-Zj4NCiAgICAgICAgPGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFp
-Pg0KICAgICAgICAgIDxhdHRydmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+
-Tm90IHZlcmlmaWVkPC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICAgIDxhdHRy
-bWZycT5BcyBuZWVkZWQ8L2F0dHJtZnJxPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQog
-ICAgICAgIDxhdHRybGFibD5GQVg8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5GQVggTnVt
-YmVyPC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRyZGVmcz4NCiAgICAg
-ICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2YT4NCiAgICAgICAg
-ICA8YXR0cnZhZT5Ob3QgdmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwvYXR0cnZhaT4NCiAg
-ICAgICAgPGF0dHJtZnJxPkFzIG5lZWRlZDwvYXR0cm1mcnE+DQogICAgICA8L2F0dHI+DQogICAg
-ICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkxPV0dSQURFPC9hdHRybGFibD4NCiAgICAgICAg
-PGF0dHJkZWY+TG93ZXN0IGNsYXNzIGxldmVsPC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+
-VmFyaWVzPC9hdHRyZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5N
-ZWRpdW08L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5Ob3QgVmVyaWZpZWQ8L2F0dHJ2YWU+
-DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgICAgPGF0dHJtZnJxPkFzIG5lZWRlZDwvYXR0cm1m
-cnE+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkhJR0hH
-UkFERTwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPkhpZ2hlc3QgY2xhc3MgbGV2ZWwgdGF1
-Z2h0PC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRyZGVmcz4NCiAgICAg
-ICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2YT4NCiAgICAgICAg
-ICA8YXR0cnZhZT5Ob3QgdmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwvYXR0cnZhaT4NCiAg
-ICAgICAgPGF0dHJtZnJxPkFzIG5lZWRlZDwvYXR0cm1mcnE+DQogICAgICA8L2F0dHI+DQogICAg
-ICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkNPTU1FTlQ8L2F0dHJsYWJsPg0KICAgICAgICA8
-YXR0cmRlZj5Db21tZW50cyBGaWVsZDwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPlZhcmll
-czwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRybWZycT5BcyBuZWVkZWQ8L2F0dHJtZnJxPg0KICAg
-ICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5ESVNUUklDVDwvYXR0
-cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlNjaG9vbCBEaXN0cmljdCBvciBDaGFydGVyIEhvbGRl
-cjwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPkFaIERlcGFydG1lbnQgb2YgRWR1Y2F0aW9u
-PC9hdHRyZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08
-L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5Ob3QgZnVsbHkgdmVyaWZpZWQ8L2F0dHJ2YWU+
-DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgICAgPGF0dHJtZnJxPkFzIG5lZWRlZDwvYXR0cm1m
-cnE+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkdSQURF
-PC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+UmFuZ2Ugb2YgY2xhc3NlcyB0YXVnaHQ8L2F0
-dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0
-cm1mcnE+QXMgbmVlZGVkPC9hdHRybWZycT4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0K
-ICAgICAgICA8YXR0cmxhYmw+TlVSU0U8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5TY2hv
-b2wgTnVyc2UgcHJlc2VudD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0
-dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPlVua25vd248L2F0
-dHJ2YT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQog
-ICAgICAgIDxhdHRybGFibD5STl9QSE48L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5SZWdp
-c3RlciBOdXJzZSBQaG9uZSBOdW1iZXI8L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJp
-ZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPlVua25v
-d248L2F0dHJ2YT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0
-dHI+DQogICAgICAgIDxhdHRybGFibD5KVVZfUE9QPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJk
-ZWY+SnV2ZW5pbGUgUG9wdWxhdGlvbiAobnVtYmVyIG9mIHN0dWRlbnRzKTwvYXR0cmRlZj4NCiAg
-ICAgICAgPGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAg
-ICAgICAgIDxhdHRydmE+VW5rbm93bjwvYXR0cnZhPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAg
-ICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkRJU1ROVU08L2F0dHJs
-YWJsPg0KICAgICAgICA8YXR0cmRlZj5TY2hvb2wgRGlzdHJpY3QgTnVtYmVyIG9yIENoYXJ0ZXIg
-SG9sZGVyIE51bWJlcjwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPkFaIERlcGFydG1lbnQg
-b2YgRWR1Y2F0aW9uPC9hdHRyZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0
-dHJ2YT5NZWRpdW08L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5QYXJ0aWFsbHkgdmVyaWZp
-ZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxh
-dHRyPg0KICAgICAgICA8YXR0cmxhYmw+TUFJTFRPPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJk
-ZWY+TWFpbGluZyBBZGRyZXNzPC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9h
-dHRyZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0
-dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5Ob3QgdmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAg
-IDwvYXR0cnZhaT4NCiAgICAgICAgPGF0dHJtZnJxPkFzIG5lZWRlZDwvYXR0cm1mcnE+DQogICAg
-ICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPk1BSUxDSVRZPC9hdHRy
-bGFibD4NCiAgICAgICAgPGF0dHJkZWY+TWFpbGluZyBBZGRyZXNzIENpdHk8L2F0dHJkZWY+DQog
-ICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAg
-ICAgICAgICA8YXR0cnZhPk1lZGl1bTwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPk5vdCB2
-ZXJpZmllZDwvYXR0cnZhZT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAg
-ICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5NQUlMU1RBVDwvYXR0cmxhYmw+DQogICAgICAg
-IDxhdHRyZGVmPk1haWxpbmcgQWRkcmVzcyBTdGF0ZTwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJk
-ZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAgIDxhdHRy
-dmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+Tm90IHZlcmlmaWVkPC9hdHRy
-dmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAg
-ICAgICAgPGF0dHJsYWJsPk1BSUxaSVA8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5NYWls
-aW5nIEFkZHJlc3MgWklQPC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRy
-ZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2
-YT4NCiAgICAgICAgICA8YXR0cnZhZT5Ob3QgdmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwv
-YXR0cnZhaT4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+
-Q0xBU1M8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5DbGFzcyAtIGdyYWRlIGxldmVsczwv
-YXR0cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxh
-dHRyZG9tdj4NCiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj5Vbml2ZXJzaXR5
-PC9lZG9tdj4NCiAgICAgICAgICAgIDxlZG9tdmQ+VW5pdmVyc2l0eSBsZXZlbDwvZWRvbXZkPg0K
-ICAgICAgICAgICAgPGVkb212ZHM+VmFyaWVzPC9lZG9tdmRzPg0KICAgICAgICAgIDwvZWRvbT4N
-CiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj5Db21tLiBDb2xsZWdlPC9lZG9t
-dj4NCiAgICAgICAgICAgIDxlZG9tdmQ+Q29tbXVuaXR5IENvbGxlZ2UgLSBwYXJ0IG9mIGNvbW11
-bml0eSBjb2xsZWdlIG5ldHdvcms8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRzPlZhcmll
-czwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQogICAgICAg
-ICAgICA8ZWRvbXY+Q29sbGVnZTwvZWRvbXY+DQogICAgICAgICAgICA8ZWRvbXZkPkNvbGxlZ2Ug
-LSBub24tdW5pdmVyc2l0eSBvciBjb21tdW5pdHkgY29sbGVnZTwvZWRvbXZkPg0KICAgICAgICAg
-ICAgPGVkb212ZHM+VmFyaWVzPC9lZG9tdmRzPg0KICAgICAgICAgIDwvZWRvbT4NCiAgICAgICAg
-ICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj5UZWNoPC9lZG9tdj4NCiAgICAgICAgICAgIDxl
-ZG9tdmQ+VGVjaG5pY2FsIFNjaG9vbHM8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRzPlZh
-cmllczwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQogICAg
-ICAgICAgICA8ZWRvbXY+UmVsLiBDb2xsZWdlPC9lZG9tdj4NCiAgICAgICAgICAgIDxlZG9tdmQ+
-UmVsaWdpb3VzIENvbGxlZ2U8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRzPlZhcmllczwv
-ZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQogICAgICAgICAg
-ICA8ZWRvbXY+U3BlY2lhbCBOZWVkczwvZWRvbXY+DQogICAgICAgICAgICA8ZWRvbXZkPlNwZWNp
-YWwgbmVlZHMgc2Nob29sczwvZWRvbXZkPg0KICAgICAgICAgICAgPGVkb212ZHM+QURFUTwvZWRv
-bXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQogICAgICAgICAgICA8
-ZWRvbXY+QWxsIEdyYWRlczwvZWRvbXY+DQogICAgICAgICAgICA8ZWRvbXZkPkFsbCBncmFkZSBs
-ZXZlbHM8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRzPkFERVE8L2Vkb212ZHM+DQogICAg
-ICAgICAgPC9lZG9tPg0KICAgICAgICAgIDxlZG9tPg0KICAgICAgICAgICAgPGVkb212PkhpZ2g8
-L2Vkb212Pg0KICAgICAgICAgICAgPGVkb212ZD5IaWdoIFNjaG9vbDwvZWRvbXZkPg0KICAgICAg
-ICAgICAgPGVkb212ZHM+VmFyaWVzPC9lZG9tdmRzPg0KICAgICAgICAgIDwvZWRvbT4NCiAgICAg
-ICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj5KUi9TUiBIaWdoPC9lZG9tdj4NCiAgICAg
-ICAgICAgIDxlZG9tdmQ+U2V2ZW50aCB0aHJvdWdoIHR3ZWxmdGggZ3JhZGU8L2Vkb212ZD4NCiAg
-ICAgICAgICAgIDxlZG9tdmRzPkFERVE8L2Vkb212ZHM+DQogICAgICAgICAgPC9lZG9tPg0KICAg
-ICAgICAgIDxlZG9tPg0KICAgICAgICAgICAgPGVkb212Pk1pZGRsZTwvZWRvbXY+DQogICAgICAg
-ICAgICA8ZWRvbXZkPk1pZGRsZSBTY2hvb2w8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRz
-PlZhcmllczwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQog
-ICAgICAgICAgICA8ZWRvbXY+UHJpbWFyeTwvZWRvbXY+DQogICAgICAgICAgICA8ZWRvbXZkPlBy
-aW1hcnkgb3IgZWxlbWVudGFyeSBzY2hvb2w8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRz
-PlZhcmllczwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQog
-ICAgICAgICAgICA8ZWRvbXY+KEJsYW5rKTwvZWRvbXY+DQogICAgICAgICAgICA8ZWRvbXZkPlVu
-a25vd24gZ3JhZGUgbGV2ZWxzPC9lZG9tdmQ+DQogICAgICAgICAgPC9lZG9tPg0KICAgICAgICA8
-L2F0dHJkb212Pg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPkdvb2Q8L2F0
-dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5WZXJpZmllZDwvYXR0cnZhZT4NCiAgICAgICAgPC9h
-dHRydmFpPg0KICAgICAgICA8YXR0cm1mcnE+QXMgbmVlZGVkPC9hdHRybWZycT4NCiAgICAgIDwv
-YXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+VFlQRV8xPC9hdHRybGFibD4N
-CiAgICAgICAgPGF0dHJkZWY+VHlwZSBvZiBTY2hvb2w8L2F0dHJkZWY+DQogICAgICAgIDxhdHRy
-ZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cmRvbXY+DQogICAgICAgICAgPGVk
-b20+DQogICAgICAgICAgICA8ZWRvbXY+Q2hhcnRlcjwvZWRvbXY+DQogICAgICAgICAgICA8ZWRv
-bXZkPkFyaXpvbmEgQ2hhcnRlciBTY2hvb2w8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRz
-PkFyaXpvbmEgQm9hcmQgb2YgQ2hhcnRlciBTY2hvb2xzPC9lZG9tdmRzPg0KICAgICAgICAgIDwv
-ZWRvbT4NCiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj5QdWJsaWM8L2Vkb212
-Pg0KICAgICAgICAgICAgPGVkb212ZD5QdWJsaWMgU2Nob29sPC9lZG9tdmQ+DQogICAgICAgICAg
-ICA8ZWRvbXZkcz5BeiBEZXBhcnRlbWVudCBvZiBFZHVjYXRpb248L2Vkb212ZHM+DQogICAgICAg
-ICAgPC9lZG9tPg0KICAgICAgICAgIDxlZG9tPg0KICAgICAgICAgICAgPGVkb212PkJJQTwvZWRv
-bXY+DQogICAgICAgICAgICA8ZWRvbXZkPkJ1cmVhdSBvZiBJbmRpYW4gQWZmYWlycyBvcGVyYXRl
-ZCBzY2hvb2w8L2Vkb212ZD4NCiAgICAgICAgICAgIDxlZG9tdmRzPlVTIEJJQTwvZWRvbXZkcz4N
-CiAgICAgICAgICA8L2Vkb20+DQogICAgICAgICAgPGVkb20+DQogICAgICAgICAgICA8ZWRvbXY+
-Q2xvc2VkPC9lZG9tdj4NCiAgICAgICAgICAgIDxlZG9tdmQ+Q2xvc2VkIFNjaG9vbDwvZWRvbXZk
-Pg0KICAgICAgICAgICAgPGVkb212ZHM+QURFUTwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+
-DQogICAgICAgICAgPGVkb20+DQogICAgICAgICAgICA8ZWRvbXY+UHJpdmF0ZTwvZWRvbXY+DQog
-ICAgICAgICAgICA8ZWRvbXZkPlByaXZhdGUgb3IgUGFyb2NoaWFsIG9wZXJhdGVkIHNjaG9vbDwv
-ZWRvbXZkPg0KICAgICAgICAgICAgPGVkb212ZHM+VmFyaWVzPC9lZG9tdmRzPg0KICAgICAgICAg
-IDwvZWRvbT4NCiAgICAgICAgICA8ZWRvbT4NCiAgICAgICAgICAgIDxlZG9tdj5UcmliYWw8L2Vk
-b212Pg0KICAgICAgICAgICAgPGVkb212ZD5UcmliZSBvcGVyYXRlZCBzY2hvb2w8L2Vkb212ZD4N
-CiAgICAgICAgICAgIDxlZG9tdmRzPlZhcmllczwvZWRvbXZkcz4NCiAgICAgICAgICA8L2Vkb20+
-DQogICAgICAgIDwvYXR0cmRvbXY+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAgIDxhdHRy
-dmE+R29vZDwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPnZlcmlmaWVkPC9hdHRydmFlPg0K
-ICAgICAgICA8L2F0dHJ2YWk+DQogICAgICAgIDxhdHRybWZycT5BcyBuZWVkZWQ8L2F0dHJtZnJx
-Pg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5LSU5ERVI8
-L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5LaW5kZXJnYXJkZW4gVGF1Z2h0PzwvYXR0cmRl
-Zj4NCiAgICAgICAgPGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFp
-Pg0KICAgICAgICAgIDxhdHRydmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+
-Tm90IFZlcmlmaWVkPC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICA8L2F0dHI+
-DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkZJUlNUPC9hdHRybGFibD4NCiAgICAg
-ICAgPGF0dHJkZWY+Rmlyc3QgR3JhZGUgVGF1Z2h0PzwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJk
-ZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAgIDxhdHRy
-dmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+Tm90IFZlcmlmaWVkPC9hdHRy
-dmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAg
-ICAgICAgPGF0dHJsYWJsPlNFQ09ORDwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlNlY29u
-ZCBHcmFkZSBUYXVnaHQ/PC9hdHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRy
-ZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2
-YT4NCiAgICAgICAgICA8YXR0cnZhZT5Ob3QgVmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwv
-YXR0cnZhaT4NCiAgICAgIDwvYXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+
-VEhJUkQ8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5UaGlyZCBHcmFkZSBUYXVnaHQ/PC9h
-dHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRyZGVmcz4NCiAgICAgICAgPGF0
-dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0
-cnZhZT5Ob3QgVmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgIDwv
-YXR0cj4NCiAgICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+Rk9VUlRIPC9hdHRybGFibD4N
-CiAgICAgICAgPGF0dHJkZWY+Rm91cnRoIEdyYWRlIFRhdWdodD88L2F0dHJkZWY+DQogICAgICAg
-IDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAg
-ICA8YXR0cnZhPk1lZGl1bTwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmll
-ZDwvYXR0cnZhZT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0
-dHI+DQogICAgICAgIDxhdHRybGFibD5GSUZUSDwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVm
-PkZpZnRoIEdyYWRlIFRhdWdodD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8
-L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPk1lZGl1bTwv
-YXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZhZT4NCiAgICAg
-ICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRy
-bGFibD5TSVhUSDwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlNpeHRoIEdyYWRlIFRhdWdo
-dD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAg
-ICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPk1lZGl1bTwvYXR0cnZhPg0KICAgICAgICAg
-IDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZhZT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAg
-ICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5TRVZFTlRIPC9hdHRy
-bGFibD4NCiAgICAgICAgPGF0dHJkZWY+U2V2ZW50aCBHcmFkZSBUYXVnaHQ/PC9hdHRyZGVmPg0K
-ICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRyZGVmcz4NCiAgICAgICAgPGF0dHJ2YWk+DQog
-ICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0cnZhZT5Ob3Qg
-VmVyaWZpZWQ8L2F0dHJ2YWU+DQogICAgICAgIDwvYXR0cnZhaT4NCiAgICAgIDwvYXR0cj4NCiAg
-ICAgIDxhdHRyPg0KICAgICAgICA8YXR0cmxhYmw+RUlHSFRIPC9hdHRybGFibD4NCiAgICAgICAg
-PGF0dHJkZWY+RWlnaHRoIEdyYWRlIFRhdWdodD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVm
-cz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZh
-Pk1lZGl1bTwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZh
-ZT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAg
-ICAgIDxhdHRybGFibD5OSU5USDwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPk5pbnRoIEdy
-YWRlIFRhdWdodDwvYXR0cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+
-DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAgIDxhdHRydmE+TWVkaXVtPC9hdHRydmE+DQog
-ICAgICAgICAgPGF0dHJ2YWU+Tm90IFZlcmlmaWVkPC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2
-YWk+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPlRFTlRI
-PC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+VGVudGggR3JhZGUgVGF1Z2h0PzwvYXR0cmRl
-Zj4NCiAgICAgICAgPGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFp
-Pg0KICAgICAgICAgIDxhdHRydmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+
-Tm90IFZlcmlmaWVkPC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICA8L2F0dHI+
-DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPkVMRVZFTlRIPC9hdHRybGFibD4NCiAg
-ICAgICAgPGF0dHJkZWY+RWxldmVudGggR3JhZGUgVGF1Z2h0PzwvYXR0cmRlZj4NCiAgICAgICAg
-PGF0dHJkZWZzPlZhcmllczwvYXR0cmRlZnM+DQogICAgICAgIDxhdHRydmFpPg0KICAgICAgICAg
-IDxhdHRydmE+TWVkaXVtPC9hdHRydmE+DQogICAgICAgICAgPGF0dHJ2YWU+Tm90IFZlcmlmaWVk
-PC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQogICAgICA8L2F0dHI+DQogICAgICA8YXR0
-cj4NCiAgICAgICAgPGF0dHJsYWJsPlRXRUxGVEg8L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRl
-Zj5Ud2VsZnRoIEdyYWRlIFRhdWdodD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJp
-ZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPk1lZGl1
-bTwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZhZT4NCiAg
-ICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxh
-dHRybGFibD5QUkVTQ0hMPC9hdHRybGFibD4NCiAgICAgICAgPGF0dHJkZWY+UHJlc2Nob29sIExl
-dmVsIFRhdWdodD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZz
-Pg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPk1lZGl1bTwvYXR0cnZhPg0K
-ICAgICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZhZT4NCiAgICAgICAgPC9hdHRy
-dmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5BQ0NV
-UkFDWTwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPk9yaWdpbmFsICBBY2N1cmFjeTwvYXR0
-cmRlZj4NCiAgICAgICAgPGF0dHJkZWZzPkFESFM8L2F0dHJkZWZzPg0KICAgICAgPC9hdHRyPg0K
-ICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5CT0FSRElORzwvYXR0cmxhYmw+DQogICAg
-ICAgIDxhdHRyZGVmPkJvYXJkaW5nIFNjaG9vbD88L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVm
-cz5WYXJpZXM8L2F0dHJkZWZzPg0KICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZh
-Pk1lZGl1bTwvYXR0cnZhPg0KICAgICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZh
-ZT4NCiAgICAgICAgPC9hdHRydmFpPg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAg
-ICAgIDxhdHRybGFibD5SRUdJT048L2F0dHJsYWJsPg0KICAgICAgICA8YXR0cmRlZj5SZWdpb24g
-b2YgU3RhdGU8L2F0dHJkZWY+DQogICAgICAgIDxhdHRyZGVmcz5WYXJpZXM8L2F0dHJkZWZzPg0K
-ICAgICAgICA8YXR0cnZhaT4NCiAgICAgICAgICA8YXR0cnZhPk1lZGl1bTwvYXR0cnZhPg0KICAg
-ICAgICAgIDxhdHRydmFlPk5vdCBWZXJpZmllZDwvYXR0cnZhZT4NCiAgICAgICAgPC9hdHRydmFp
-Pg0KICAgICAgPC9hdHRyPg0KICAgICAgPGF0dHI+DQogICAgICAgIDxhdHRybGFibD5XRUJfUEFH
-RTwvYXR0cmxhYmw+DQogICAgICAgIDxhdHRyZGVmPlNjaG9vbCBXZWIgUGFnZSBBZGRyZXNzPC9h
-dHRyZGVmPg0KICAgICAgICA8YXR0cmRlZnM+VmFyaWVzPC9hdHRyZGVmcz4NCiAgICAgICAgPGF0
-dHJ2YWk+DQogICAgICAgICAgPGF0dHJ2YT5NZWRpdW08L2F0dHJ2YT4NCiAgICAgICAgICA8YXR0
-cnZhZT5QYXJpdGlhbGx5IFZlcmlmaWVkPC9hdHRydmFlPg0KICAgICAgICA8L2F0dHJ2YWk+DQog
-ICAgICA8L2F0dHI+DQogICAgICA8YXR0cj4NCiAgICAgICAgPGF0dHJsYWJsPlBMQUNfSUROTzwv
-YXR0cmxhYmw+DQogICAgICA8L2F0dHI+DQogICAgPC9kZXRhaWxlZD4NCiAgPC9lYWluZm8+DQog
-IDxkaXN0aW5mbz4NCiAgICA8cmVzZGVzYz5Eb3dubG9hZGFibGUgRGF0YTwvcmVzZGVzYz4NCiAg
diff --git a/examples/geospatial/schools.shx b/examples/geospatial/schools.shx
deleted file mode 100644
index 5c20492..0000000
Binary files a/examples/geospatial/schools.shx and /dev/null differ
diff --git a/examples/graph/plot_dag_layout.py b/examples/graph/plot_dag_layout.py
new file mode 100644
index 0000000..0b12013
--- /dev/null
+++ b/examples/graph/plot_dag_layout.py
@@ -0,0 +1,42 @@
+"""
+========================
+DAG - Topological Layout
+========================
+
+This example combines the `topological_generations` generator with
+`multipartite_layout` to show how to visualize a DAG in topologically-sorted
+order.
+"""
+
+import networkx as nx
+import matplotlib.pyplot as plt
+
+
+G = nx.DiGraph(
+    [
+        ("f", "a"),
+        ("a", "b"),
+        ("a", "e"),
+        ("b", "c"),
+        ("b", "d"),
+        ("d", "e"),
+        ("f", "c"),
+        ("f", "g"),
+        ("h", "f"),
+    ]
+)
+
+for layer, nodes in enumerate(nx.topological_generations(G)):
+    # `multipartite_layout` expects the layer as a node attribute, so add the
+    # numeric layer value as a node attribute
+    for node in nodes:
+        G.nodes[node]["layer"] = layer
+
+# Compute the multipartite_layout using the "layer" node attribute
+pos = nx.multipartite_layout(G, subset_key="layer")
+
+fig, ax = plt.subplots()
+nx.draw_networkx(G, pos=pos, ax=ax)
+ax.set_title("DAG layout in topological order")
+fig.tight_layout()
+plt.show()
diff --git a/examples/graph/plot_degree_sequence.py b/examples/graph/plot_degree_sequence.py
index 0d32091..87abb64 100644
--- a/examples/graph/plot_degree_sequence.py
+++ b/examples/graph/plot_degree_sequence.py
@@ -15,7 +15,9 @@ z = [5, 3, 3, 3, 3, 2, 2, 2, 1, 1, 1]
 print(nx.is_graphical(z))
 
 print("Configuration model")
-G = nx.configuration_model(z, seed=seed)  # configuration model, seed for reproduciblity
+G = nx.configuration_model(
+    z, seed=seed
+)  # configuration model, seed for reproducibility
 degree_sequence = [d for n, d in G.degree()]  # degree sequence
 print(f"Degree sequence {degree_sequence}")
 print("Degree histogram")
diff --git a/examples/graph/plot_triad_types.py b/examples/graph/plot_triad_types.py
new file mode 100644
index 0000000..eacbc6e
--- /dev/null
+++ b/examples/graph/plot_triad_types.py
@@ -0,0 +1,63 @@
+"""
+======
+Triads
+======
+According to Snijders, T. (2012), "Transitivity and triads" (University of
+Oxford), there are 16 possible triad types. This plot shows the 16 triad
+types that can be identified within directed networks. Triadic relationships
+are especially useful when analysing social networks. The first three digits
+refer to the number of mutual, asymmetric and null dyads (bidirectional,
+unidirectional and nonedges), and the letter gives the orientation as
+Up (U), Down (D), Cyclical (C) or Transitive (T).
+"""
+
+import networkx as nx
+import matplotlib.pyplot as plt
+
+fig, axes = plt.subplots(4, 4, figsize=(10, 10))
+triads = {
+    "003": [],
+    "012": [(1, 2)],
+    "102": [(1, 2), (2, 1)],
+    "021D": [(3, 1), (3, 2)],
+    "021U": [(1, 3), (2, 3)],
+    "021C": [(1, 3), (3, 2)],
+    "111D": [(1, 2), (2, 1), (3, 1)],
+    "111U": [(1, 2), (2, 1), (1, 3)],
+    "030T": [(1, 2), (3, 2), (1, 3)],
+    "030C": [(1, 3), (3, 2), (2, 1)],
+    "201": [(1, 2), (2, 1), (3, 1), (1, 3)],
+    "120D": [(1, 2), (2, 1), (3, 1), (3, 2)],
+    "120U": [(1, 2), (2, 1), (1, 3), (2, 3)],
+    "120C": [(1, 2), (2, 1), (1, 3), (3, 2)],
+    "210": [(1, 2), (2, 1), (1, 3), (3, 2), (2, 3)],
+    "300": [(1, 2), (2, 1), (2, 3), (3, 2), (1, 3), (3, 1)],
+}
+
+for (title, triad), ax in zip(triads.items(), axes.flatten()):
+    G = nx.DiGraph()
+    G.add_nodes_from([1, 2, 3])
+    G.add_edges_from(triad)
+    nx.draw_networkx(
+        G,
+        ax=ax,
+        with_labels=False,
+        node_color=["green"],
+        node_size=200,
+        arrowsize=20,
+        width=2,
+        pos=nx.planar_layout(G),
+    )
+    ax.set_xlim(val * 1.2 for val in ax.get_xlim())
+    ax.set_ylim(val * 1.2 for val in ax.get_ylim())
+    ax.text(
+        0,
+        0,
+        title,
+        fontsize=15,
+        fontweight="extra bold",
+        horizontalalignment="center",
+        bbox=dict(boxstyle="square,pad=0.3", fc="none"),
+    )
+fig.tight_layout()
+plt.show()
diff --git a/networkx.egg-info/PKG-INFO b/networkx.egg-info/PKG-INFO
new file mode 100644
index 0000000..2b1d4d0
--- /dev/null
+++ b/networkx.egg-info/PKG-INFO
@@ -0,0 +1,113 @@
+Metadata-Version: 2.1
+Name: networkx
+Version: 3.1rc1.dev0
+Summary: Python package for creating and manipulating graphs and networks
+Home-page: https://networkx.org/
+Author: Aric Hagberg
+Author-email: hagberg@lanl.gov
+Maintainer: NetworkX Developers
+Maintainer-email: networkx-discuss@googlegroups.com
+Project-URL: Bug Tracker, https://github.com/networkx/networkx/issues
+Project-URL: Documentation, https://networkx.org/documentation/stable/
+Project-URL: Source Code, https://github.com/networkx/networkx
+Keywords: Networks,Graph Theory,Mathematics,network,graph,discrete mathematics,math
+Platform: Linux
+Platform: Mac OSX
+Platform: Windows
+Platform: Unix
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: Intended Audience :: Science/Research
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Operating System :: OS Independent
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3 :: Only
+Classifier: Topic :: Software Development :: Libraries :: Python Modules
+Classifier: Topic :: Scientific/Engineering :: Bio-Informatics
+Classifier: Topic :: Scientific/Engineering :: Information Analysis
+Classifier: Topic :: Scientific/Engineering :: Mathematics
+Classifier: Topic :: Scientific/Engineering :: Physics
+Requires-Python: >=3.8
+Provides-Extra: default
+Provides-Extra: developer
+Provides-Extra: doc
+Provides-Extra: extra
+Provides-Extra: test
+License-File: LICENSE.txt
+
+NetworkX
+========
+
+.. image:: https://github.com/networkx/networkx/workflows/test/badge.svg?branch=main
+  :target: https://github.com/networkx/networkx/actions?query=workflow%3A%22test%22
+
+.. image:: https://codecov.io/gh/networkx/networkx/branch/main/graph/badge.svg
+   :target: https://app.codecov.io/gh/networkx/networkx/branch/main
+   
+.. image:: https://img.shields.io/github/labels/networkx/networkx/Good%20First%20Issue?color=green&label=Contribute%20&style=flat-square
+   :target: https://github.com/networkx/networkx/issues?q=is%3Aopen+is%3Aissue+label%3A%22Good+First+Issue%22
+   
+
+NetworkX is a Python package for the creation, manipulation,
+and study of the structure, dynamics, and functions
+of complex networks.
+
+- **Website (including documentation):** https://networkx.org
+- **Mailing list:** https://groups.google.com/forum/#!forum/networkx-discuss
+- **Source:** https://github.com/networkx/networkx
+- **Bug reports:** https://github.com/networkx/networkx/issues
+- **Report a security vulnerability:** https://tidelift.com/security
+- **Tutorial:** https://networkx.org/documentation/latest/tutorial.html
+- **GitHub Discussions:** https://github.com/networkx/networkx/discussions
+
+Simple example
+--------------
+
+Find the shortest path between two nodes in an undirected graph:
+
+.. code:: pycon
+
+    >>> import networkx as nx
+    >>> G = nx.Graph()
+    >>> G.add_edge("A", "B", weight=4)
+    >>> G.add_edge("B", "D", weight=2)
+    >>> G.add_edge("A", "C", weight=3)
+    >>> G.add_edge("C", "D", weight=4)
+    >>> nx.shortest_path(G, "A", "D", weight="weight")
+    ['A', 'B', 'D']
+
+Install
+-------
+
+Install the latest version of NetworkX::
+
+    $ pip install networkx
+
+Install with all optional dependencies::
+
+    $ pip install networkx[all]
+
+For additional details, please see `INSTALL.rst`.
+
+Bugs
+----
+
+Please report any bugs that you find `here <https://github.com/networkx/networkx/issues>`_.
+Or, even better, fork the repository on `GitHub <https://github.com/networkx/networkx>`_
+and create a pull request (PR). We welcome all changes, big or small, and we
+will help you make the PR if you are new to `git` (just ask on the issue and/or
+see `CONTRIBUTING.rst`).
+
+License
+-------
+
+Released under the 3-Clause BSD license (see `LICENSE.txt`)::
+
+   Copyright (C) 2004-2023 NetworkX Developers
+   Aric Hagberg <hagberg@lanl.gov>
+   Dan Schult <dschult@colgate.edu>
+   Pieter Swart <swart@lanl.gov>
diff --git a/networkx.egg-info/SOURCES.txt b/networkx.egg-info/SOURCES.txt
new file mode 100644
index 0000000..36548a3
--- /dev/null
+++ b/networkx.egg-info/SOURCES.txt
@@ -0,0 +1,844 @@
+CONTRIBUTING.rst
+INSTALL.rst
+LICENSE.txt
+MANIFEST.in
+README.rst
+setup.py
+doc/Makefile
+doc/conf.py
+doc/index.rst
+doc/install.rst
+doc/tutorial.rst
+doc/_static/copybutton.js
+doc/_static/custom.css
+doc/_templates/dev_banner.html
+doc/_templates/eol_banner.html
+doc/_templates/layout.html
+doc/_templates/version.html
+doc/_templates/autosummary/base.rst
+doc/_templates/autosummary/class.rst
+doc/developer/about_us.rst
+doc/developer/code_of_conduct.rst
+doc/developer/contribute.rst
+doc/developer/core_developer.rst
+doc/developer/deprecations.rst
+doc/developer/index.rst
+doc/developer/new_contributor_faq.rst
+doc/developer/projects.rst
+doc/developer/release.rst
+doc/developer/roadmap.rst
+doc/developer/team.rst
+doc/developer/values.rst
+doc/developer/nxeps/index.rst
+doc/developer/nxeps/nxep-0000.rst
+doc/developer/nxeps/nxep-0001.rst
+doc/developer/nxeps/nxep-0002.rst
+doc/developer/nxeps/nxep-0003.rst
+doc/developer/nxeps/nxep-0004.rst
+doc/developer/nxeps/nxep-template.rst
+doc/developer/nxeps/_static/nxep-0000.png
+doc/reference/convert.rst
+doc/reference/drawing.rst
+doc/reference/exceptions.rst
+doc/reference/functions.rst
+doc/reference/generators.rst
+doc/reference/glossary.rst
+doc/reference/index.rst
+doc/reference/introduction.rst
+doc/reference/linalg.rst
+doc/reference/randomness.rst
+doc/reference/relabel.rst
+doc/reference/utils.rst
+doc/reference/algorithms/approximation.rst
+doc/reference/algorithms/assortativity.rst
+doc/reference/algorithms/asteroidal.rst
+doc/reference/algorithms/bipartite.rst
+doc/reference/algorithms/boundary.rst
+doc/reference/algorithms/bridges.rst
+doc/reference/algorithms/centrality.rst
+doc/reference/algorithms/chains.rst
+doc/reference/algorithms/chordal.rst
+doc/reference/algorithms/clique.rst
+doc/reference/algorithms/clustering.rst
+doc/reference/algorithms/coloring.rst
+doc/reference/algorithms/communicability_alg.rst
+doc/reference/algorithms/community.rst
+doc/reference/algorithms/component.rst
+doc/reference/algorithms/connectivity.rst
+doc/reference/algorithms/core.rst
+doc/reference/algorithms/covering.rst
+doc/reference/algorithms/cuts.rst
+doc/reference/algorithms/cycles.rst
+doc/reference/algorithms/d_separation.rst
+doc/reference/algorithms/dag.rst
+doc/reference/algorithms/distance_measures.rst
+doc/reference/algorithms/distance_regular.rst
+doc/reference/algorithms/dominance.rst
+doc/reference/algorithms/dominating.rst
+doc/reference/algorithms/efficiency_measures.rst
+doc/reference/algorithms/euler.rst
+doc/reference/algorithms/flow.rst
+doc/reference/algorithms/graph_hashing.rst
+doc/reference/algorithms/graphical.rst
+doc/reference/algorithms/hierarchy.rst
+doc/reference/algorithms/hybrid.rst
+doc/reference/algorithms/index.rst
+doc/reference/algorithms/isolates.rst
+doc/reference/algorithms/isomorphism.ismags.rst
+doc/reference/algorithms/isomorphism.rst
+doc/reference/algorithms/isomorphism.vf2.rst
+doc/reference/algorithms/link_analysis.rst
+doc/reference/algorithms/link_prediction.rst
+doc/reference/algorithms/lowest_common_ancestors.rst
+doc/reference/algorithms/matching.rst
+doc/reference/algorithms/minors.rst
+doc/reference/algorithms/mis.rst
+doc/reference/algorithms/moral.rst
+doc/reference/algorithms/node_classification.rst
+doc/reference/algorithms/non_randomness.rst
+doc/reference/algorithms/operators.rst
+doc/reference/algorithms/planar_drawing.rst
+doc/reference/algorithms/planarity.rst
+doc/reference/algorithms/polynomials.rst
+doc/reference/algorithms/reciprocity.rst
+doc/reference/algorithms/regular.rst
+doc/reference/algorithms/rich_club.rst
+doc/reference/algorithms/shortest_paths.rst
+doc/reference/algorithms/similarity.rst
+doc/reference/algorithms/simple_paths.rst
+doc/reference/algorithms/smallworld.rst
+doc/reference/algorithms/smetric.rst
+doc/reference/algorithms/sparsifiers.rst
+doc/reference/algorithms/structuralholes.rst
+doc/reference/algorithms/summarization.rst
+doc/reference/algorithms/swap.rst
+doc/reference/algorithms/threshold.rst
+doc/reference/algorithms/tournament.rst
+doc/reference/algorithms/traversal.rst
+doc/reference/algorithms/tree.rst
+doc/reference/algorithms/triads.rst
+doc/reference/algorithms/vitality.rst
+doc/reference/algorithms/voronoi.rst
+doc/reference/algorithms/wiener.rst
+doc/reference/classes/digraph.rst
+doc/reference/classes/graph.rst
+doc/reference/classes/index.rst
+doc/reference/classes/multidigraph.rst
+doc/reference/classes/multigraph.rst
+doc/reference/readwrite/adjlist.rst
+doc/reference/readwrite/edgelist.rst
+doc/reference/readwrite/gexf.rst
+doc/reference/readwrite/gml.rst
+doc/reference/readwrite/graphml.rst
+doc/reference/readwrite/index.rst
+doc/reference/readwrite/json_graph.rst
+doc/reference/readwrite/leda.rst
+doc/reference/readwrite/matrix_market.rst
+doc/reference/readwrite/multiline_adjlist.rst
+doc/reference/readwrite/pajek.rst
+doc/reference/readwrite/sparsegraph6.rst
+doc/release/api_0.99.rst
+doc/release/api_1.0.rst
+doc/release/api_1.10.rst
+doc/release/api_1.11.rst
+doc/release/api_1.4.rst
+doc/release/api_1.5.rst
+doc/release/api_1.6.rst
+doc/release/api_1.7.rst
+doc/release/api_1.8.rst
+doc/release/api_1.9.rst
+doc/release/contribs.py
+doc/release/index.rst
+doc/release/migration_guide_from_1.x_to_2.0.rst
+doc/release/migration_guide_from_2.x_to_3.0.rst
+doc/release/old_release_log.rst
+doc/release/release_2.0.rst
+doc/release/release_2.1.rst
+doc/release/release_2.2.rst
+doc/release/release_2.3.rst
+doc/release/release_2.4.rst
+doc/release/release_2.5.rst
+doc/release/release_2.6.rst
+doc/release/release_2.7.1.rst
+doc/release/release_2.7.rst
+doc/release/release_2.8.1.rst
+doc/release/release_2.8.2.rst
+doc/release/release_2.8.3.rst
+doc/release/release_2.8.4.rst
+doc/release/release_2.8.5.rst
+doc/release/release_2.8.6.rst
+doc/release/release_2.8.7.rst
+doc/release/release_2.8.8.rst
+doc/release/release_2.8.rst
+doc/release/release_3.0.rst
+doc/release/release_dev.rst
+doc/release/release_template.rst
+doc/release/report_functions_without_rst_generated.py
+examples/README.txt
+examples/./README.txt
+examples/3d_drawing/README.txt
+examples/3d_drawing/mayavi2_spring.py
+examples/3d_drawing/plot_basic.py
+examples/algorithms/README.txt
+examples/algorithms/WormNet.v3.benchmark.txt
+examples/algorithms/hartford_drug.edgelist
+examples/algorithms/plot_beam_search.py
+examples/algorithms/plot_betweenness_centrality.py
+examples/algorithms/plot_blockmodel.py
+examples/algorithms/plot_circuits.py
+examples/algorithms/plot_davis_club.py
+examples/algorithms/plot_dedensification.py
+examples/algorithms/plot_iterated_dynamical_systems.py
+examples/algorithms/plot_krackhardt_centrality.py
+examples/algorithms/plot_parallel_betweenness.py
+examples/algorithms/plot_rcm.py
+examples/algorithms/plot_snap.py
+examples/algorithms/plot_subgraphs.py
+examples/basic/README.txt
+examples/basic/plot_properties.py
+examples/basic/plot_read_write.py
+examples/basic/plot_simple_graph.py
+examples/drawing/README.txt
+examples/drawing/chess_masters_WCC.pgn.bz2
+examples/drawing/knuth_miles.txt.gz
+examples/drawing/plot_center_node.py
+examples/drawing/plot_chess_masters.py
+examples/drawing/plot_custom_node_icons.py
+examples/drawing/plot_degree.py
+examples/drawing/plot_directed.py
+examples/drawing/plot_edge_colormap.py
+examples/drawing/plot_ego_graph.py
+examples/drawing/plot_eigenvalues.py
+examples/drawing/plot_four_grids.py
+examples/drawing/plot_house_with_colors.py
+examples/drawing/plot_knuth_miles.py
+examples/drawing/plot_labels_and_colors.py
+examples/drawing/plot_multipartite_graph.py
+examples/drawing/plot_node_colormap.py
+examples/drawing/plot_rainbow_coloring.py
+examples/drawing/plot_random_geometric_graph.py
+examples/drawing/plot_sampson.py
+examples/drawing/plot_selfloops.py
+examples/drawing/plot_simple_path.py
+examples/drawing/plot_spectral_grid.py
+examples/drawing/plot_tsp.py
+examples/drawing/plot_unix_email.py
+examples/drawing/plot_weighted_graph.py
+examples/drawing/sampson_data.zip
+examples/drawing/unix_email.mbox
+examples/external/README.txt
+examples/external/javascript_force.py
+examples/external/plot_igraph.py
+examples/external/force/README.txt
+examples/external/force/force.css
+examples/external/force/force.html
+examples/external/force/force.js
+examples/geospatial/README.txt
+examples/geospatial/plot_delaunay.py
+examples/geospatial/plot_lines.py
+examples/geospatial/plot_osmnx.py
+examples/geospatial/plot_points.py
+examples/geospatial/plot_polygons.py
+examples/graph/README.txt
+examples/graph/plot_dag_layout.py
+examples/graph/plot_degree_sequence.py
+examples/graph/plot_erdos_renyi.py
+examples/graph/plot_expected_degree_sequence.py
+examples/graph/plot_football.py
+examples/graph/plot_karate_club.py
+examples/graph/plot_morse_trie.py
+examples/graph/plot_napoleon_russian_campaign.py
+examples/graph/plot_roget.py
+examples/graph/plot_triad_types.py
+examples/graph/plot_words.py
+examples/graph/roget_dat.txt.gz
+examples/graph/words_dat.txt.gz
+examples/graphviz_drawing/README.txt
+examples/graphviz_drawing/plot_attributes.py
+examples/graphviz_drawing/plot_conversion.py
+examples/graphviz_drawing/plot_grid.py
+examples/graphviz_drawing/plot_mini_atlas.py
+examples/graphviz_layout/README.txt
+examples/graphviz_layout/lanl_routes.edgelist
+examples/graphviz_layout/plot_atlas.py
+examples/graphviz_layout/plot_circular_tree.py
+examples/graphviz_layout/plot_decomposition.py
+examples/graphviz_layout/plot_giant_component.py
+examples/graphviz_layout/plot_lanl_routes.py
+examples/subclass/README.txt
+examples/subclass/plot_antigraph.py
+examples/subclass/plot_printgraph.py
+networkx/__init__.py
+networkx/conftest.py
+networkx/convert.py
+networkx/convert_matrix.py
+networkx/exception.py
+networkx/lazy_imports.py
+networkx/relabel.py
+networkx.egg-info/PKG-INFO
+networkx.egg-info/SOURCES.txt
+networkx.egg-info/dependency_links.txt
+networkx.egg-info/not-zip-safe
+networkx.egg-info/requires.txt
+networkx.egg-info/top_level.txt
+networkx/algorithms/__init__.py
+networkx/algorithms/asteroidal.py
+networkx/algorithms/boundary.py
+networkx/algorithms/bridges.py
+networkx/algorithms/chains.py
+networkx/algorithms/chordal.py
+networkx/algorithms/clique.py
+networkx/algorithms/cluster.py
+networkx/algorithms/communicability_alg.py
+networkx/algorithms/core.py
+networkx/algorithms/covering.py
+networkx/algorithms/cuts.py
+networkx/algorithms/cycles.py
+networkx/algorithms/d_separation.py
+networkx/algorithms/dag.py
+networkx/algorithms/distance_measures.py
+networkx/algorithms/distance_regular.py
+networkx/algorithms/dominance.py
+networkx/algorithms/dominating.py
+networkx/algorithms/efficiency_measures.py
+networkx/algorithms/euler.py
+networkx/algorithms/graph_hashing.py
+networkx/algorithms/graphical.py
+networkx/algorithms/hierarchy.py
+networkx/algorithms/hybrid.py
+networkx/algorithms/isolate.py
+networkx/algorithms/link_prediction.py
+networkx/algorithms/lowest_common_ancestors.py
+networkx/algorithms/matching.py
+networkx/algorithms/mis.py
+networkx/algorithms/moral.py
+networkx/algorithms/node_classification.py
+networkx/algorithms/non_randomness.py
+networkx/algorithms/planar_drawing.py
+networkx/algorithms/planarity.py
+networkx/algorithms/polynomials.py
+networkx/algorithms/reciprocity.py
+networkx/algorithms/regular.py
+networkx/algorithms/richclub.py
+networkx/algorithms/similarity.py
+networkx/algorithms/simple_paths.py
+networkx/algorithms/smallworld.py
+networkx/algorithms/smetric.py
+networkx/algorithms/sparsifiers.py
+networkx/algorithms/structuralholes.py
+networkx/algorithms/summarization.py
+networkx/algorithms/swap.py
+networkx/algorithms/threshold.py
+networkx/algorithms/tournament.py
+networkx/algorithms/triads.py
+networkx/algorithms/vitality.py
+networkx/algorithms/voronoi.py
+networkx/algorithms/wiener.py
+networkx/algorithms/approximation/__init__.py
+networkx/algorithms/approximation/clique.py
+networkx/algorithms/approximation/clustering_coefficient.py
+networkx/algorithms/approximation/connectivity.py
+networkx/algorithms/approximation/distance_measures.py
+networkx/algorithms/approximation/dominating_set.py
+networkx/algorithms/approximation/kcomponents.py
+networkx/algorithms/approximation/matching.py
+networkx/algorithms/approximation/maxcut.py
+networkx/algorithms/approximation/ramsey.py
+networkx/algorithms/approximation/steinertree.py
+networkx/algorithms/approximation/traveling_salesman.py
+networkx/algorithms/approximation/treewidth.py
+networkx/algorithms/approximation/vertex_cover.py
+networkx/algorithms/approximation/tests/__init__.py
+networkx/algorithms/approximation/tests/test_approx_clust_coeff.py
+networkx/algorithms/approximation/tests/test_clique.py
+networkx/algorithms/approximation/tests/test_connectivity.py
+networkx/algorithms/approximation/tests/test_distance_measures.py
+networkx/algorithms/approximation/tests/test_dominating_set.py
+networkx/algorithms/approximation/tests/test_kcomponents.py
+networkx/algorithms/approximation/tests/test_matching.py
+networkx/algorithms/approximation/tests/test_maxcut.py
+networkx/algorithms/approximation/tests/test_ramsey.py
+networkx/algorithms/approximation/tests/test_steinertree.py
+networkx/algorithms/approximation/tests/test_traveling_salesman.py
+networkx/algorithms/approximation/tests/test_treewidth.py
+networkx/algorithms/approximation/tests/test_vertex_cover.py
+networkx/algorithms/assortativity/__init__.py
+networkx/algorithms/assortativity/connectivity.py
+networkx/algorithms/assortativity/correlation.py
+networkx/algorithms/assortativity/mixing.py
+networkx/algorithms/assortativity/neighbor_degree.py
+networkx/algorithms/assortativity/pairs.py
+networkx/algorithms/assortativity/tests/__init__.py
+networkx/algorithms/assortativity/tests/base_test.py
+networkx/algorithms/assortativity/tests/test_connectivity.py
+networkx/algorithms/assortativity/tests/test_correlation.py
+networkx/algorithms/assortativity/tests/test_mixing.py
+networkx/algorithms/assortativity/tests/test_neighbor_degree.py
+networkx/algorithms/assortativity/tests/test_pairs.py
+networkx/algorithms/bipartite/__init__.py
+networkx/algorithms/bipartite/basic.py
+networkx/algorithms/bipartite/centrality.py
+networkx/algorithms/bipartite/cluster.py
+networkx/algorithms/bipartite/covering.py
+networkx/algorithms/bipartite/edgelist.py
+networkx/algorithms/bipartite/generators.py
+networkx/algorithms/bipartite/matching.py
+networkx/algorithms/bipartite/matrix.py
+networkx/algorithms/bipartite/projection.py
+networkx/algorithms/bipartite/redundancy.py
+networkx/algorithms/bipartite/spectral.py
+networkx/algorithms/bipartite/tests/__init__.py
+networkx/algorithms/bipartite/tests/test_basic.py
+networkx/algorithms/bipartite/tests/test_centrality.py
+networkx/algorithms/bipartite/tests/test_cluster.py
+networkx/algorithms/bipartite/tests/test_covering.py
+networkx/algorithms/bipartite/tests/test_edgelist.py
+networkx/algorithms/bipartite/tests/test_generators.py
+networkx/algorithms/bipartite/tests/test_matching.py
+networkx/algorithms/bipartite/tests/test_matrix.py
+networkx/algorithms/bipartite/tests/test_project.py
+networkx/algorithms/bipartite/tests/test_redundancy.py
+networkx/algorithms/bipartite/tests/test_spectral_bipartivity.py
+networkx/algorithms/centrality/__init__.py
+networkx/algorithms/centrality/betweenness.py
+networkx/algorithms/centrality/betweenness_subset.py
+networkx/algorithms/centrality/closeness.py
+networkx/algorithms/centrality/current_flow_betweenness.py
+networkx/algorithms/centrality/current_flow_betweenness_subset.py
+networkx/algorithms/centrality/current_flow_closeness.py
+networkx/algorithms/centrality/degree_alg.py
+networkx/algorithms/centrality/dispersion.py
+networkx/algorithms/centrality/eigenvector.py
+networkx/algorithms/centrality/flow_matrix.py
+networkx/algorithms/centrality/group.py
+networkx/algorithms/centrality/harmonic.py
+networkx/algorithms/centrality/katz.py
+networkx/algorithms/centrality/laplacian.py
+networkx/algorithms/centrality/load.py
+networkx/algorithms/centrality/percolation.py
+networkx/algorithms/centrality/reaching.py
+networkx/algorithms/centrality/second_order.py
+networkx/algorithms/centrality/subgraph_alg.py
+networkx/algorithms/centrality/trophic.py
+networkx/algorithms/centrality/voterank_alg.py
+networkx/algorithms/centrality/tests/__init__.py
+networkx/algorithms/centrality/tests/test_betweenness_centrality.py
+networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py
+networkx/algorithms/centrality/tests/test_closeness_centrality.py
+networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py
+networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality_subset.py
+networkx/algorithms/centrality/tests/test_current_flow_closeness.py
+networkx/algorithms/centrality/tests/test_degree_centrality.py
+networkx/algorithms/centrality/tests/test_dispersion.py
+networkx/algorithms/centrality/tests/test_eigenvector_centrality.py
+networkx/algorithms/centrality/tests/test_group.py
+networkx/algorithms/centrality/tests/test_harmonic_centrality.py
+networkx/algorithms/centrality/tests/test_katz_centrality.py
+networkx/algorithms/centrality/tests/test_laplacian_centrality.py
+networkx/algorithms/centrality/tests/test_load_centrality.py
+networkx/algorithms/centrality/tests/test_percolation_centrality.py
+networkx/algorithms/centrality/tests/test_reaching.py
+networkx/algorithms/centrality/tests/test_second_order_centrality.py
+networkx/algorithms/centrality/tests/test_subgraph.py
+networkx/algorithms/centrality/tests/test_trophic.py
+networkx/algorithms/centrality/tests/test_voterank.py
+networkx/algorithms/coloring/__init__.py
+networkx/algorithms/coloring/equitable_coloring.py
+networkx/algorithms/coloring/greedy_coloring.py
+networkx/algorithms/coloring/tests/__init__.py
+networkx/algorithms/coloring/tests/test_coloring.py
+networkx/algorithms/community/__init__.py
+networkx/algorithms/community/asyn_fluid.py
+networkx/algorithms/community/centrality.py
+networkx/algorithms/community/community_utils.py
+networkx/algorithms/community/kclique.py
+networkx/algorithms/community/kernighan_lin.py
+networkx/algorithms/community/label_propagation.py
+networkx/algorithms/community/louvain.py
+networkx/algorithms/community/lukes.py
+networkx/algorithms/community/modularity_max.py
+networkx/algorithms/community/quality.py
+networkx/algorithms/community/tests/__init__.py
+networkx/algorithms/community/tests/test_asyn_fluid.py
+networkx/algorithms/community/tests/test_centrality.py
+networkx/algorithms/community/tests/test_kclique.py
+networkx/algorithms/community/tests/test_kernighan_lin.py
+networkx/algorithms/community/tests/test_label_propagation.py
+networkx/algorithms/community/tests/test_louvain.py
+networkx/algorithms/community/tests/test_lukes.py
+networkx/algorithms/community/tests/test_modularity_max.py
+networkx/algorithms/community/tests/test_quality.py
+networkx/algorithms/community/tests/test_utils.py
+networkx/algorithms/components/__init__.py
+networkx/algorithms/components/attracting.py
+networkx/algorithms/components/biconnected.py
+networkx/algorithms/components/connected.py
+networkx/algorithms/components/semiconnected.py
+networkx/algorithms/components/strongly_connected.py
+networkx/algorithms/components/weakly_connected.py
+networkx/algorithms/components/tests/__init__.py
+networkx/algorithms/components/tests/test_attracting.py
+networkx/algorithms/components/tests/test_biconnected.py
+networkx/algorithms/components/tests/test_connected.py
+networkx/algorithms/components/tests/test_semiconnected.py
+networkx/algorithms/components/tests/test_strongly_connected.py
+networkx/algorithms/components/tests/test_weakly_connected.py
+networkx/algorithms/connectivity/__init__.py
+networkx/algorithms/connectivity/connectivity.py
+networkx/algorithms/connectivity/cuts.py
+networkx/algorithms/connectivity/disjoint_paths.py
+networkx/algorithms/connectivity/edge_augmentation.py
+networkx/algorithms/connectivity/edge_kcomponents.py
+networkx/algorithms/connectivity/kcomponents.py
+networkx/algorithms/connectivity/kcutsets.py
+networkx/algorithms/connectivity/stoerwagner.py
+networkx/algorithms/connectivity/utils.py
+networkx/algorithms/connectivity/tests/__init__.py
+networkx/algorithms/connectivity/tests/test_connectivity.py
+networkx/algorithms/connectivity/tests/test_cuts.py
+networkx/algorithms/connectivity/tests/test_disjoint_paths.py
+networkx/algorithms/connectivity/tests/test_edge_augmentation.py
+networkx/algorithms/connectivity/tests/test_edge_kcomponents.py
+networkx/algorithms/connectivity/tests/test_kcomponents.py
+networkx/algorithms/connectivity/tests/test_kcutsets.py
+networkx/algorithms/connectivity/tests/test_stoer_wagner.py
+networkx/algorithms/flow/__init__.py
+networkx/algorithms/flow/boykovkolmogorov.py
+networkx/algorithms/flow/capacityscaling.py
+networkx/algorithms/flow/dinitz_alg.py
+networkx/algorithms/flow/edmondskarp.py
+networkx/algorithms/flow/gomory_hu.py
+networkx/algorithms/flow/maxflow.py
+networkx/algorithms/flow/mincost.py
+networkx/algorithms/flow/networksimplex.py
+networkx/algorithms/flow/preflowpush.py
+networkx/algorithms/flow/shortestaugmentingpath.py
+networkx/algorithms/flow/utils.py
+networkx/algorithms/flow/tests/__init__.py
+networkx/algorithms/flow/tests/gl1.gpickle.bz2
+networkx/algorithms/flow/tests/gw1.gpickle.bz2
+networkx/algorithms/flow/tests/netgen-2.gpickle.bz2
+networkx/algorithms/flow/tests/test_gomory_hu.py
+networkx/algorithms/flow/tests/test_maxflow.py
+networkx/algorithms/flow/tests/test_maxflow_large_graph.py
+networkx/algorithms/flow/tests/test_mincost.py
+networkx/algorithms/flow/tests/test_networksimplex.py
+networkx/algorithms/flow/tests/wlm3.gpickle.bz2
+networkx/algorithms/isomorphism/__init__.py
+networkx/algorithms/isomorphism/ismags.py
+networkx/algorithms/isomorphism/isomorph.py
+networkx/algorithms/isomorphism/isomorphvf2.py
+networkx/algorithms/isomorphism/matchhelpers.py
+networkx/algorithms/isomorphism/temporalisomorphvf2.py
+networkx/algorithms/isomorphism/tree_isomorphism.py
+networkx/algorithms/isomorphism/vf2pp.py
+networkx/algorithms/isomorphism/vf2userfunc.py
+networkx/algorithms/isomorphism/tests/__init__.py
+networkx/algorithms/isomorphism/tests/iso_r01_s80.A99
+networkx/algorithms/isomorphism/tests/iso_r01_s80.B99
+networkx/algorithms/isomorphism/tests/si2_b06_m200.A99
+networkx/algorithms/isomorphism/tests/si2_b06_m200.B99
+networkx/algorithms/isomorphism/tests/test_ismags.py
+networkx/algorithms/isomorphism/tests/test_isomorphism.py
+networkx/algorithms/isomorphism/tests/test_isomorphvf2.py
+networkx/algorithms/isomorphism/tests/test_match_helpers.py
+networkx/algorithms/isomorphism/tests/test_temporalisomorphvf2.py
+networkx/algorithms/isomorphism/tests/test_tree_isomorphism.py
+networkx/algorithms/isomorphism/tests/test_vf2pp.py
+networkx/algorithms/isomorphism/tests/test_vf2pp_helpers.py
+networkx/algorithms/isomorphism/tests/test_vf2userfunc.py
+networkx/algorithms/link_analysis/__init__.py
+networkx/algorithms/link_analysis/hits_alg.py
+networkx/algorithms/link_analysis/pagerank_alg.py
+networkx/algorithms/link_analysis/tests/__init__.py
+networkx/algorithms/link_analysis/tests/test_hits.py
+networkx/algorithms/link_analysis/tests/test_pagerank.py
+networkx/algorithms/minors/__init__.py
+networkx/algorithms/minors/contraction.py
+networkx/algorithms/minors/tests/test_contraction.py
+networkx/algorithms/operators/__init__.py
+networkx/algorithms/operators/all.py
+networkx/algorithms/operators/binary.py
+networkx/algorithms/operators/product.py
+networkx/algorithms/operators/unary.py
+networkx/algorithms/operators/tests/__init__.py
+networkx/algorithms/operators/tests/test_all.py
+networkx/algorithms/operators/tests/test_binary.py
+networkx/algorithms/operators/tests/test_product.py
+networkx/algorithms/operators/tests/test_unary.py
+networkx/algorithms/shortest_paths/__init__.py
+networkx/algorithms/shortest_paths/astar.py
+networkx/algorithms/shortest_paths/dense.py
+networkx/algorithms/shortest_paths/generic.py
+networkx/algorithms/shortest_paths/unweighted.py
+networkx/algorithms/shortest_paths/weighted.py
+networkx/algorithms/shortest_paths/tests/__init__.py
+networkx/algorithms/shortest_paths/tests/test_astar.py
+networkx/algorithms/shortest_paths/tests/test_dense.py
+networkx/algorithms/shortest_paths/tests/test_dense_numpy.py
+networkx/algorithms/shortest_paths/tests/test_generic.py
+networkx/algorithms/shortest_paths/tests/test_unweighted.py
+networkx/algorithms/shortest_paths/tests/test_weighted.py
+networkx/algorithms/tests/__init__.py
+networkx/algorithms/tests/test_asteroidal.py
+networkx/algorithms/tests/test_boundary.py
+networkx/algorithms/tests/test_bridges.py
+networkx/algorithms/tests/test_chains.py
+networkx/algorithms/tests/test_chordal.py
+networkx/algorithms/tests/test_clique.py
+networkx/algorithms/tests/test_cluster.py
+networkx/algorithms/tests/test_communicability.py
+networkx/algorithms/tests/test_core.py
+networkx/algorithms/tests/test_covering.py
+networkx/algorithms/tests/test_cuts.py
+networkx/algorithms/tests/test_cycles.py
+networkx/algorithms/tests/test_d_separation.py
+networkx/algorithms/tests/test_dag.py
+networkx/algorithms/tests/test_distance_measures.py
+networkx/algorithms/tests/test_distance_regular.py
+networkx/algorithms/tests/test_dominance.py
+networkx/algorithms/tests/test_dominating.py
+networkx/algorithms/tests/test_efficiency.py
+networkx/algorithms/tests/test_euler.py
+networkx/algorithms/tests/test_graph_hashing.py
+networkx/algorithms/tests/test_graphical.py
+networkx/algorithms/tests/test_hierarchy.py
+networkx/algorithms/tests/test_hybrid.py
+networkx/algorithms/tests/test_isolate.py
+networkx/algorithms/tests/test_link_prediction.py
+networkx/algorithms/tests/test_lowest_common_ancestors.py
+networkx/algorithms/tests/test_matching.py
+networkx/algorithms/tests/test_max_weight_clique.py
+networkx/algorithms/tests/test_mis.py
+networkx/algorithms/tests/test_moral.py
+networkx/algorithms/tests/test_node_classification.py
+networkx/algorithms/tests/test_non_randomness.py
+networkx/algorithms/tests/test_planar_drawing.py
+networkx/algorithms/tests/test_planarity.py
+networkx/algorithms/tests/test_polynomials.py
+networkx/algorithms/tests/test_reciprocity.py
+networkx/algorithms/tests/test_regular.py
+networkx/algorithms/tests/test_richclub.py
+networkx/algorithms/tests/test_similarity.py
+networkx/algorithms/tests/test_simple_paths.py
+networkx/algorithms/tests/test_smallworld.py
+networkx/algorithms/tests/test_smetric.py
+networkx/algorithms/tests/test_sparsifiers.py
+networkx/algorithms/tests/test_structuralholes.py
+networkx/algorithms/tests/test_summarization.py
+networkx/algorithms/tests/test_swap.py
+networkx/algorithms/tests/test_threshold.py
+networkx/algorithms/tests/test_tournament.py
+networkx/algorithms/tests/test_triads.py
+networkx/algorithms/tests/test_vitality.py
+networkx/algorithms/tests/test_voronoi.py
+networkx/algorithms/tests/test_wiener.py
+networkx/algorithms/traversal/__init__.py
+networkx/algorithms/traversal/beamsearch.py
+networkx/algorithms/traversal/breadth_first_search.py
+networkx/algorithms/traversal/depth_first_search.py
+networkx/algorithms/traversal/edgebfs.py
+networkx/algorithms/traversal/edgedfs.py
+networkx/algorithms/traversal/tests/__init__.py
+networkx/algorithms/traversal/tests/test_beamsearch.py
+networkx/algorithms/traversal/tests/test_bfs.py
+networkx/algorithms/traversal/tests/test_dfs.py
+networkx/algorithms/traversal/tests/test_edgebfs.py
+networkx/algorithms/traversal/tests/test_edgedfs.py
+networkx/algorithms/tree/__init__.py
+networkx/algorithms/tree/branchings.py
+networkx/algorithms/tree/coding.py
+networkx/algorithms/tree/decomposition.py
+networkx/algorithms/tree/mst.py
+networkx/algorithms/tree/operations.py
+networkx/algorithms/tree/recognition.py
+networkx/algorithms/tree/tests/__init__.py
+networkx/algorithms/tree/tests/test_branchings.py
+networkx/algorithms/tree/tests/test_coding.py
+networkx/algorithms/tree/tests/test_decomposition.py
+networkx/algorithms/tree/tests/test_mst.py
+networkx/algorithms/tree/tests/test_operations.py
+networkx/algorithms/tree/tests/test_recognition.py
+networkx/classes/__init__.py
+networkx/classes/backends.py
+networkx/classes/coreviews.py
+networkx/classes/digraph.py
+networkx/classes/filters.py
+networkx/classes/function.py
+networkx/classes/graph.py
+networkx/classes/graphviews.py
+networkx/classes/multidigraph.py
+networkx/classes/multigraph.py
+networkx/classes/reportviews.py
+networkx/classes/tests/__init__.py
+networkx/classes/tests/historical_tests.py
+networkx/classes/tests/test_coreviews.py
+networkx/classes/tests/test_digraph.py
+networkx/classes/tests/test_digraph_historical.py
+networkx/classes/tests/test_filters.py
+networkx/classes/tests/test_function.py
+networkx/classes/tests/test_graph.py
+networkx/classes/tests/test_graph_historical.py
+networkx/classes/tests/test_graphviews.py
+networkx/classes/tests/test_multidigraph.py
+networkx/classes/tests/test_multigraph.py
+networkx/classes/tests/test_reportviews.py
+networkx/classes/tests/test_special.py
+networkx/classes/tests/test_subgraphviews.py
+networkx/drawing/__init__.py
+networkx/drawing/layout.py
+networkx/drawing/nx_agraph.py
+networkx/drawing/nx_latex.py
+networkx/drawing/nx_pydot.py
+networkx/drawing/nx_pylab.py
+networkx/drawing/tests/__init__.py
+networkx/drawing/tests/test_agraph.py
+networkx/drawing/tests/test_latex.py
+networkx/drawing/tests/test_layout.py
+networkx/drawing/tests/test_pydot.py
+networkx/drawing/tests/test_pylab.py
+networkx/drawing/tests/baseline/test_house_with_colors.png
+networkx/generators/__init__.py
+networkx/generators/atlas.dat.gz
+networkx/generators/atlas.py
+networkx/generators/classic.py
+networkx/generators/cographs.py
+networkx/generators/community.py
+networkx/generators/degree_seq.py
+networkx/generators/directed.py
+networkx/generators/duplication.py
+networkx/generators/ego.py
+networkx/generators/expanders.py
+networkx/generators/geometric.py
+networkx/generators/harary_graph.py
+networkx/generators/internet_as_graphs.py
+networkx/generators/intersection.py
+networkx/generators/interval_graph.py
+networkx/generators/joint_degree_seq.py
+networkx/generators/lattice.py
+networkx/generators/line.py
+networkx/generators/mycielski.py
+networkx/generators/nonisomorphic_trees.py
+networkx/generators/random_clustered.py
+networkx/generators/random_graphs.py
+networkx/generators/small.py
+networkx/generators/social.py
+networkx/generators/spectral_graph_forge.py
+networkx/generators/stochastic.py
+networkx/generators/sudoku.py
+networkx/generators/trees.py
+networkx/generators/triads.py
+networkx/generators/tests/__init__.py
+networkx/generators/tests/test_atlas.py
+networkx/generators/tests/test_classic.py
+networkx/generators/tests/test_cographs.py
+networkx/generators/tests/test_community.py
+networkx/generators/tests/test_degree_seq.py
+networkx/generators/tests/test_directed.py
+networkx/generators/tests/test_duplication.py
+networkx/generators/tests/test_ego.py
+networkx/generators/tests/test_expanders.py
+networkx/generators/tests/test_geometric.py
+networkx/generators/tests/test_harary_graph.py
+networkx/generators/tests/test_internet_as_graphs.py
+networkx/generators/tests/test_intersection.py
+networkx/generators/tests/test_interval_graph.py
+networkx/generators/tests/test_joint_degree_seq.py
+networkx/generators/tests/test_lattice.py
+networkx/generators/tests/test_line.py
+networkx/generators/tests/test_mycielski.py
+networkx/generators/tests/test_nonisomorphic_trees.py
+networkx/generators/tests/test_random_clustered.py
+networkx/generators/tests/test_random_graphs.py
+networkx/generators/tests/test_small.py
+networkx/generators/tests/test_spectral_graph_forge.py
+networkx/generators/tests/test_stochastic.py
+networkx/generators/tests/test_sudoku.py
+networkx/generators/tests/test_trees.py
+networkx/generators/tests/test_triads.py
+networkx/linalg/__init__.py
+networkx/linalg/algebraicconnectivity.py
+networkx/linalg/attrmatrix.py
+networkx/linalg/bethehessianmatrix.py
+networkx/linalg/graphmatrix.py
+networkx/linalg/laplacianmatrix.py
+networkx/linalg/modularitymatrix.py
+networkx/linalg/spectrum.py
+networkx/linalg/tests/__init__.py
+networkx/linalg/tests/test_algebraic_connectivity.py
+networkx/linalg/tests/test_attrmatrix.py
+networkx/linalg/tests/test_bethehessian.py
+networkx/linalg/tests/test_graphmatrix.py
+networkx/linalg/tests/test_laplacian.py
+networkx/linalg/tests/test_modularity.py
+networkx/linalg/tests/test_spectrum.py
+networkx/readwrite/__init__.py
+networkx/readwrite/adjlist.py
+networkx/readwrite/edgelist.py
+networkx/readwrite/gexf.py
+networkx/readwrite/gml.py
+networkx/readwrite/graph6.py
+networkx/readwrite/graphml.py
+networkx/readwrite/leda.py
+networkx/readwrite/multiline_adjlist.py
+networkx/readwrite/p2g.py
+networkx/readwrite/pajek.py
+networkx/readwrite/sparse6.py
+networkx/readwrite/text.py
+networkx/readwrite/json_graph/__init__.py
+networkx/readwrite/json_graph/adjacency.py
+networkx/readwrite/json_graph/cytoscape.py
+networkx/readwrite/json_graph/node_link.py
+networkx/readwrite/json_graph/tree.py
+networkx/readwrite/json_graph/tests/__init__.py
+networkx/readwrite/json_graph/tests/test_adjacency.py
+networkx/readwrite/json_graph/tests/test_cytoscape.py
+networkx/readwrite/json_graph/tests/test_node_link.py
+networkx/readwrite/json_graph/tests/test_tree.py
+networkx/readwrite/tests/__init__.py
+networkx/readwrite/tests/test_adjlist.py
+networkx/readwrite/tests/test_edgelist.py
+networkx/readwrite/tests/test_gexf.py
+networkx/readwrite/tests/test_gml.py
+networkx/readwrite/tests/test_graph6.py
+networkx/readwrite/tests/test_graphml.py
+networkx/readwrite/tests/test_leda.py
+networkx/readwrite/tests/test_p2g.py
+networkx/readwrite/tests/test_pajek.py
+networkx/readwrite/tests/test_sparse6.py
+networkx/readwrite/tests/test_text.py
+networkx/tests/__init__.py
+networkx/tests/test_all_random_functions.py
+networkx/tests/test_convert.py
+networkx/tests/test_convert_numpy.py
+networkx/tests/test_convert_pandas.py
+networkx/tests/test_convert_scipy.py
+networkx/tests/test_exceptions.py
+networkx/tests/test_import.py
+networkx/tests/test_lazy_imports.py
+networkx/tests/test_relabel.py
+networkx/utils/__init__.py
+networkx/utils/decorators.py
+networkx/utils/heaps.py
+networkx/utils/mapped_queue.py
+networkx/utils/misc.py
+networkx/utils/random_sequence.py
+networkx/utils/rcm.py
+networkx/utils/union_find.py
+networkx/utils/tests/__init__.py
+networkx/utils/tests/test__init.py
+networkx/utils/tests/test_decorators.py
+networkx/utils/tests/test_heaps.py
+networkx/utils/tests/test_mapped_queue.py
+networkx/utils/tests/test_misc.py
+networkx/utils/tests/test_random_sequence.py
+networkx/utils/tests/test_rcm.py
+networkx/utils/tests/test_unionfind.py
+requirements/README.md
+requirements/default.txt
+requirements/developer.txt
+requirements/doc.txt
+requirements/example.txt
+requirements/extra.txt
+requirements/release.txt
+requirements/test.txt
\ No newline at end of file
diff --git a/networkx.egg-info/dependency_links.txt b/networkx.egg-info/dependency_links.txt
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/networkx.egg-info/dependency_links.txt
@@ -0,0 +1 @@
+
diff --git a/networkx.egg-info/not-zip-safe b/networkx.egg-info/not-zip-safe
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/networkx.egg-info/not-zip-safe
@@ -0,0 +1 @@
+
diff --git a/networkx.egg-info/requires.txt b/networkx.egg-info/requires.txt
new file mode 100644
index 0000000..69ef029
--- /dev/null
+++ b/networkx.egg-info/requires.txt
@@ -0,0 +1,30 @@
+
+[default]
+matplotlib>=3.4
+numpy>=1.20
+pandas>=1.3
+scipy>=1.8
+
+[developer]
+mypy>=0.991
+pre-commit>=2.21
+
+[doc]
+nb2plots>=0.6
+numpydoc>=1.5
+pillow>=9.2
+pydata-sphinx-theme>=0.11
+sphinx-gallery>=0.11
+sphinx==5.2.3
+texext>=0.6.7
+
+[extra]
+lxml>=4.6
+pydot>=1.4.2
+pygraphviz>=1.10
+sympy>=1.10
+
+[test]
+codecov>=2.1
+pytest-cov>=4.0
+pytest>=7.2
diff --git a/networkx.egg-info/top_level.txt b/networkx.egg-info/top_level.txt
new file mode 100644
index 0000000..4d07dfe
--- /dev/null
+++ b/networkx.egg-info/top_level.txt
@@ -0,0 +1 @@
+networkx
diff --git a/networkx/__init__.py b/networkx/__init__.py
index d49e31e..6c6310f 100644
--- a/networkx/__init__.py
+++ b/networkx/__init__.py
@@ -8,47 +8,7 @@ structure, dynamics, and functions of complex networks.
 See https://networkx.org for complete documentation.
 """
 
-__version__ = "2.8.8"
-
-
-def __getattr__(name):
-    """Remove functions and provide informative error messages."""
-    if name == "nx_yaml":
-        raise ImportError(
-            "\nThe nx_yaml module has been removed from NetworkX.\n"
-            "Please use the `yaml` package directly for working with yaml data.\n"
-            "For example, a networkx.Graph `G` can be written to and loaded\n"
-            "from a yaml file with:\n\n"
-            "    import yaml\n\n"
-            "    with open('path_to_yaml_file', 'w') as fh:\n"
-            "        yaml.dump(G, fh)\n"
-            "    with open('path_to_yaml_file', 'r') as fh:\n"
-            "        G = yaml.load(fh, Loader=yaml.Loader)\n\n"
-            "Note that yaml.Loader is considered insecure - see the pyyaml\n"
-            "documentation for further details.\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    if name == "read_yaml":
-        raise ImportError(
-            "\nread_yaml has been removed from NetworkX, please use `yaml`\n"
-            "directly:\n\n"
-            "    import yaml\n\n"
-            "    with open('path', 'r') as fh:\n"
-            "        yaml.load(fh, Loader=yaml.Loader)\n\n"
-            "Note that yaml.Loader is considered insecure - see the pyyaml\n"
-            "documentation for further details.\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    if name == "write_yaml":
-        raise ImportError(
-            "\nwrite_yaml has been removed from NetworkX, please use `yaml`\n"
-            "directly:\n\n"
-            "    import yaml\n\n"
-            "    with open('path_for_yaml_output', 'w') as fh:\n"
-            "        yaml.dump(G_to_be_yaml, fh)\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    raise AttributeError(f"module {__name__} has no attribute {name}")
+__version__ = "3.1rc1.dev0"
 
 
 # These are imported in order as listed
@@ -61,6 +21,7 @@ from networkx import utils
 from networkx import classes
 from networkx.classes import filters
 from networkx.classes import *
+from networkx.classes import _dispatch
 
 from networkx import convert
 from networkx.convert import *
@@ -84,7 +45,5 @@ from networkx.algorithms import *
 from networkx import linalg
 from networkx.linalg import *
 
-from networkx.testing.test import run as test
-
 from networkx import drawing
 from networkx.drawing import *
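The removed `__getattr__` shim in `networkx/__init__.py` pointed users at PyYAML directly. A minimal round-trip matching the advice in the removed error messages might look like the following (assumes PyYAML is installed; this is an illustration, not part of the diff):

```python
import yaml
import networkx as nx

G = nx.path_graph(3)  # 0-1-2

# Serialize the graph object itself. The resulting document uses
# !!python/object tags, so loading it back requires yaml.Loader,
# which is NOT safe for untrusted input (see the pyyaml docs).
text = yaml.dump(G)
H = yaml.load(text, Loader=yaml.Loader)

assert sorted(H.edges()) == [(0, 1), (1, 2)]
```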
diff --git a/networkx/algorithms/__init__.py b/networkx/algorithms/__init__.py
index 9fa60b9..cb5ea77 100644
--- a/networkx/algorithms/__init__.py
+++ b/networkx/algorithms/__init__.py
@@ -85,7 +85,6 @@ from networkx.algorithms import tree
 # to the user as direct imports from the `networkx` namespace.
 from networkx.algorithms.bipartite import complete_bipartite_graph
 from networkx.algorithms.bipartite import is_bipartite
-from networkx.algorithms.bipartite import project
 from networkx.algorithms.bipartite import projected_graph
 from networkx.algorithms.connectivity import all_pairs_node_connectivity
 from networkx.algorithms.connectivity import all_node_cuts
@@ -117,6 +116,7 @@ from networkx.algorithms.isomorphism import could_be_isomorphic
 from networkx.algorithms.isomorphism import fast_could_be_isomorphic
 from networkx.algorithms.isomorphism import faster_could_be_isomorphic
 from networkx.algorithms.isomorphism import is_isomorphic
+from networkx.algorithms.isomorphism.vf2pp import *
 from networkx.algorithms.tree.branchings import maximum_branching
 from networkx.algorithms.tree.branchings import maximum_spanning_arborescence
 from networkx.algorithms.tree.branchings import minimum_branching
diff --git a/networkx/algorithms/approximation/steinertree.py b/networkx/algorithms/approximation/steinertree.py
index 496098b..4c15ba6 100644
--- a/networkx/algorithms/approximation/steinertree.py
+++ b/networkx/algorithms/approximation/steinertree.py
@@ -46,22 +46,109 @@ def metric_closure(G, weight="weight"):
     return M
 
 
+def _mehlhorn_steiner_tree(G, terminal_nodes, weight):
+    paths = nx.multi_source_dijkstra_path(G, terminal_nodes)
+
+    d_1 = {}
+    s = {}
+    for v in G.nodes():
+        s[v] = paths[v][0]
+        d_1[(v, s[v])] = len(paths[v]) - 1
+
+    # G1-G4 names match those from the Mehlhorn 1988 paper.
+    G_1_prime = nx.Graph()
+    for u, v, data in G.edges(data=True):
+        su, sv = s[u], s[v]
+        weight_here = d_1[(u, su)] + data.get(weight, 1) + d_1[(v, sv)]
+        if not G_1_prime.has_edge(su, sv):
+            G_1_prime.add_edge(su, sv, weight=weight_here)
+        else:
+            new_weight = min(weight_here, G_1_prime[su][sv][weight])
+            G_1_prime.add_edge(su, sv, weight=new_weight)
+
+    G_2 = nx.minimum_spanning_edges(G_1_prime, data=True)
+
+    G_3 = nx.Graph()
+    for u, v, d in G_2:
+        path = nx.shortest_path(G, u, v, weight)
+        for n1, n2 in pairwise(path):
+            G_3.add_edge(n1, n2)
+
+    G_3_mst = list(nx.minimum_spanning_edges(G_3, data=False))
+    if G.is_multigraph():
+        G_3_mst = (
+            (u, v, min(G[u][v], key=lambda k: G[u][v][k][weight])) for u, v in G_3_mst
+        )
+    G_4 = G.edge_subgraph(G_3_mst).copy()
+    _remove_nonterminal_leaves(G_4, terminal_nodes)
+    return G_4.edges()
+
+
+def _kou_steiner_tree(G, terminal_nodes, weight):
+    # H is the subgraph induced by terminal_nodes in the metric closure M of G.
+    M = metric_closure(G, weight=weight)
+    H = M.subgraph(terminal_nodes)
+
+    # Use the 'distance' attribute of each edge provided by M.
+    mst_edges = nx.minimum_spanning_edges(H, weight="distance", data=True)
+
+    # Create an iterator over each edge in each shortest path; repeats are okay
+    mst_all_edges = chain.from_iterable(pairwise(d["path"]) for u, v, d in mst_edges)
+    if G.is_multigraph():
+        mst_all_edges = (
+            (u, v, min(G[u][v], key=lambda k: G[u][v][k][weight]))
+            for u, v in mst_all_edges
+        )
+
+    # Find the MST again, over this new set of edges
+    G_S = G.edge_subgraph(mst_all_edges)
+    T_S = nx.minimum_spanning_edges(G_S, weight="weight", data=False)
+
+    # Leaf nodes that are not terminal might still remain; remove them here
+    T_H = G.edge_subgraph(T_S).copy()
+    _remove_nonterminal_leaves(T_H, terminal_nodes)
+
+    return T_H.edges()
+
+
+def _remove_nonterminal_leaves(G, terminals):
+    terminals_set = set(terminals)
+    for n in list(G.nodes):
+        if n not in terminals_set and G.degree(n) == 1:
+            G.remove_node(n)
+
+
+ALGORITHMS = {
+    "kou": _kou_steiner_tree,
+    "mehlhorn": _mehlhorn_steiner_tree,
+}
+
+
 @not_implemented_for("directed")
-def steiner_tree(G, terminal_nodes, weight="weight"):
-    """Return an approximation to the minimum Steiner tree of a graph.
-
-    The minimum Steiner tree of `G` w.r.t a set of `terminal_nodes`
-    is a tree within `G` that spans those nodes and has minimum size
-    (sum of edge weights) among all such trees.
-
-    The minimum Steiner tree can be approximated by computing the minimum
-    spanning tree of the subgraph of the metric closure of *G* induced by the
-    terminal nodes, where the metric closure of *G* is the complete graph in
-    which each edge is weighted by the shortest path distance between the
-    nodes in *G* .
-    This algorithm produces a tree whose weight is within a (2 - (2 / t))
-    factor of the weight of the optimal Steiner tree where *t* is number of
-    terminal nodes.
+def steiner_tree(G, terminal_nodes, weight="weight", method=None):
+    r"""Return an approximation to the minimum Steiner tree of a graph.
+
+    The minimum Steiner tree of `G` w.r.t a set of `terminal_nodes` (also *S*)
+    is a tree within `G` that spans those nodes and has minimum size (sum of
+    edge weights) among all such trees.
+
+    The approximation algorithm is specified with the `method` keyword
+    argument. Both available algorithms produce a tree whose weight is
+    within a (2 - (2 / l)) factor of the weight of the optimal Steiner tree,
+    where *l* is the minimum number of leaf nodes across all possible Steiner
+    trees.
+
+    * `kou` [2]_ (runtime $O(|S| |V|^2)$) computes the minimum spanning tree of
+    the subgraph of the metric closure of *G* induced by the terminal nodes,
+    where the metric closure of *G* is the complete graph in which each edge is
+    weighted by the shortest path distance between the nodes in *G*.
+
+    * `mehlhorn` [3]_ (runtime $O(|E|+|V|\log|V|)$) modifies Kou et al.'s
+    algorithm, beginning by finding the closest terminal node for each
+    non-terminal. This data is used to create a complete graph containing only
+    the terminal nodes, in which each edge is weighted with the shortest path
+    distance between them. The algorithm then proceeds in the same way as Kou
+    et al.
 
     Parameters
     ----------
@@ -71,6 +158,15 @@ def steiner_tree(G, terminal_nodes, weight="weight"):
          A list of terminal nodes for which minimum steiner tree is
          to be found.
 
+    weight : string (default = 'weight')
+        Use the edge attribute specified by this string as the edge weight.
+        Any edge attribute not present defaults to 1.
+
+    method : string, optional (default = 'kou')
+        The algorithm to use to approximate the Steiner tree.
+        Supported options: 'kou', 'mehlhorn'.
+        Other inputs produce a ValueError.
+
     Returns
     -------
     NetworkX graph
@@ -86,15 +182,33 @@ def steiner_tree(G, terminal_nodes, weight="weight"):
     References
     ----------
     .. [1] Steiner_tree_problem on Wikipedia.
-       https://en.wikipedia.org/wiki/Steiner_tree_problem
+           https://en.wikipedia.org/wiki/Steiner_tree_problem
+    .. [2] Kou, L., G. Markowsky, and L. Berman. 1981.
+           ‘A Fast Algorithm for Steiner Trees’.
+           Acta Informatica 15 (2): 141–45.
+           https://doi.org/10.1007/BF00288961.
+    .. [3] Mehlhorn, Kurt. 1988.
+           ‘A Faster Approximation Algorithm for the Steiner Problem in Graphs’.
+           Information Processing Letters 27 (3): 125–28.
+           https://doi.org/10.1016/0020-0190(88)90066-X.
     """
-    # H is the subgraph induced by terminal_nodes in the metric closure M of G.
-    M = metric_closure(G, weight=weight)
-    H = M.subgraph(terminal_nodes)
-    # Use the 'distance' attribute of each edge provided by M.
-    mst_edges = nx.minimum_spanning_edges(H, weight="distance", data=True)
-    # Create an iterator over each edge in each shortest path; repeats are okay
-    edges = chain.from_iterable(pairwise(d["path"]) for u, v, d in mst_edges)
+    if method is None:
+        import warnings
+
+        msg = (
+            "steiner_tree will change default method from 'kou' to 'mehlhorn'"
+            "in version 3.2.\nSet the `method` kwarg to remove this warning."
+        )
+        warnings.warn(msg, FutureWarning, stacklevel=4)
+        method = "kou"
+
+    try:
+        algo = ALGORITHMS[method]
+    except KeyError as e:
+        msg = f"{method} is not a valid choice for an algorithm."
+        raise ValueError(msg) from e
+
+    edges = algo(G, terminal_nodes, weight)
     # For multigraph we should add the minimal weight edge keys
     if G.is_multigraph():
         edges = (
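The new `method` keyword shown in the hunk above can be exercised directly. A minimal sketch, assuming a networkx version (3.0+) in which both approximations ship; the graph is the one from the test suite below:

```python
import networkx as nx
from networkx.algorithms.approximation import steiner_tree

# Weighted graph where the cheap 2-7-5 shortcut should end up in the
# approximate Steiner tree instead of the expensive 4-5 chain.
G = nx.Graph()
G.add_weighted_edges_from(
    [(1, 2, 10), (2, 3, 10), (3, 4, 10), (4, 5, 10), (5, 6, 10), (2, 7, 1), (7, 5, 1)]
)
terminals = [1, 2, 3, 4, 5]

for method in ("kou", "mehlhorn"):
    T = steiner_tree(G, terminals, method=method)
    # Both algorithms return a tree spanning every terminal node.
    assert set(terminals) <= set(T.nodes)
    assert nx.is_tree(T)
```

Passing an unsupported string (e.g. `method="foo"`) raises the `ValueError` added above.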
diff --git a/networkx/algorithms/approximation/tests/test_ramsey.py b/networkx/algorithms/approximation/tests/test_ramsey.py
index 856a8ef..32fe1fb 100644
--- a/networkx/algorithms/approximation/tests/test_ramsey.py
+++ b/networkx/algorithms/approximation/tests/test_ramsey.py
@@ -11,7 +11,7 @@ def test_ramsey():
     idens = nx.density(graph.subgraph(i))
     assert idens == 0.0, "i-set not correctly found by ramsey!"
 
-    # this trival graph has no cliques. should just find i-sets
+    # this trivial graph has no cliques. should just find i-sets
     graph = nx.trivial_graph()
     c, i = apxa.ramsey_R2(graph)
     assert c == {0}, "clique not correctly found by ramsey!"
diff --git a/networkx/algorithms/approximation/tests/test_steinertree.py b/networkx/algorithms/approximation/tests/test_steinertree.py
index d58eb66..d7af1a1 100644
--- a/networkx/algorithms/approximation/tests/test_steinertree.py
+++ b/networkx/algorithms/approximation/tests/test_steinertree.py
@@ -8,24 +8,75 @@ from networkx.utils import edges_equal
 class TestSteinerTree:
     @classmethod
     def setup_class(cls):
-        G = nx.Graph()
-        G.add_edge(1, 2, weight=10)
-        G.add_edge(2, 3, weight=10)
-        G.add_edge(3, 4, weight=10)
-        G.add_edge(4, 5, weight=10)
-        G.add_edge(5, 6, weight=10)
-        G.add_edge(2, 7, weight=1)
-        G.add_edge(7, 5, weight=1)
-        cls.G = G
-        cls.term_nodes = [1, 2, 3, 4, 5]
+        G1 = nx.Graph()
+        G1.add_edge(1, 2, weight=10)
+        G1.add_edge(2, 3, weight=10)
+        G1.add_edge(3, 4, weight=10)
+        G1.add_edge(4, 5, weight=10)
+        G1.add_edge(5, 6, weight=10)
+        G1.add_edge(2, 7, weight=1)
+        G1.add_edge(7, 5, weight=1)
+
+        G2 = nx.Graph()
+        G2.add_edge(0, 5, weight=6)
+        G2.add_edge(1, 2, weight=2)
+        G2.add_edge(1, 5, weight=3)
+        G2.add_edge(2, 4, weight=4)
+        G2.add_edge(3, 5, weight=5)
+        G2.add_edge(4, 5, weight=1)
+
+        G3 = nx.Graph()
+        G3.add_edge(1, 2, weight=8)
+        G3.add_edge(1, 9, weight=3)
+        G3.add_edge(1, 8, weight=6)
+        G3.add_edge(1, 10, weight=2)
+        G3.add_edge(1, 14, weight=3)
+        G3.add_edge(2, 3, weight=6)
+        G3.add_edge(3, 4, weight=3)
+        G3.add_edge(3, 10, weight=2)
+        G3.add_edge(3, 11, weight=1)
+        G3.add_edge(4, 5, weight=1)
+        G3.add_edge(4, 11, weight=1)
+        G3.add_edge(5, 6, weight=4)
+        G3.add_edge(5, 11, weight=2)
+        G3.add_edge(5, 12, weight=1)
+        G3.add_edge(5, 13, weight=3)
+        G3.add_edge(6, 7, weight=2)
+        G3.add_edge(6, 12, weight=3)
+        G3.add_edge(6, 13, weight=1)
+        G3.add_edge(7, 8, weight=3)
+        G3.add_edge(7, 9, weight=3)
+        G3.add_edge(7, 11, weight=5)
+        G3.add_edge(7, 13, weight=2)
+        G3.add_edge(7, 14, weight=4)
+        G3.add_edge(8, 9, weight=2)
+        G3.add_edge(9, 14, weight=1)
+        G3.add_edge(10, 11, weight=2)
+        G3.add_edge(10, 14, weight=1)
+        G3.add_edge(11, 12, weight=1)
+        G3.add_edge(11, 14, weight=7)
+        G3.add_edge(12, 14, weight=3)
+        G3.add_edge(12, 15, weight=1)
+        G3.add_edge(13, 14, weight=4)
+        G3.add_edge(13, 15, weight=1)
+        G3.add_edge(14, 15, weight=2)
+
+        cls.G1 = G1
+        cls.G2 = G2
+        cls.G3 = G3
+        cls.G1_term_nodes = [1, 2, 3, 4, 5]
+        cls.G2_term_nodes = [0, 2, 3]
+        cls.G3_term_nodes = [1, 3, 5, 6, 8, 10, 11, 12, 13]
+
+        cls.methods = ["kou", "mehlhorn"]
 
     def test_connected_metric_closure(self):
-        G = self.G.copy()
+        G = self.G1.copy()
         G.add_node(100)
         pytest.raises(nx.NetworkXError, metric_closure, G)
 
     def test_metric_closure(self):
-        M = metric_closure(self.G)
+        M = metric_closure(self.G1)
         mc = [
             (1, 2, {"distance": 10, "path": [1, 2]}),
             (1, 3, {"distance": 20, "path": [1, 2, 3]}),
@@ -52,15 +103,71 @@ class TestSteinerTree:
         assert edges_equal(list(M.edges(data=True)), mc)
 
     def test_steiner_tree(self):
-        S = steiner_tree(self.G, self.term_nodes)
-        expected_steiner_tree = [
-            (1, 2, {"weight": 10}),
-            (2, 3, {"weight": 10}),
-            (2, 7, {"weight": 1}),
-            (3, 4, {"weight": 10}),
-            (5, 7, {"weight": 1}),
+        valid_steiner_trees = [
+            [
+                [
+                    (1, 2, {"weight": 10}),
+                    (2, 3, {"weight": 10}),
+                    (2, 7, {"weight": 1}),
+                    (3, 4, {"weight": 10}),
+                    (5, 7, {"weight": 1}),
+                ],
+                [
+                    (1, 2, {"weight": 10}),
+                    (2, 7, {"weight": 1}),
+                    (3, 4, {"weight": 10}),
+                    (4, 5, {"weight": 10}),
+                    (5, 7, {"weight": 1}),
+                ],
+                [
+                    (1, 2, {"weight": 10}),
+                    (2, 3, {"weight": 10}),
+                    (2, 7, {"weight": 1}),
+                    (4, 5, {"weight": 10}),
+                    (5, 7, {"weight": 1}),
+                ],
+            ],
+            [
+                [
+                    (0, 5, {"weight": 6}),
+                    (1, 2, {"weight": 2}),
+                    (1, 5, {"weight": 3}),
+                    (3, 5, {"weight": 5}),
+                ],
+                [
+                    (0, 5, {"weight": 6}),
+                    (4, 2, {"weight": 4}),
+                    (4, 5, {"weight": 1}),
+                    (3, 5, {"weight": 5}),
+                ],
+            ],
+            [
+                [
+                    (1, 10, {"weight": 2}),
+                    (3, 10, {"weight": 2}),
+                    (3, 11, {"weight": 1}),
+                    (5, 12, {"weight": 1}),
+                    (6, 13, {"weight": 1}),
+                    (8, 9, {"weight": 2}),
+                    (9, 14, {"weight": 1}),
+                    (10, 14, {"weight": 1}),
+                    (11, 12, {"weight": 1}),
+                    (12, 15, {"weight": 1}),
+                    (13, 15, {"weight": 1}),
+                ]
+            ],
         ]
-        assert edges_equal(list(S.edges(data=True)), expected_steiner_tree)
+        for method in self.methods:
+            for G, term_nodes, valid_trees in zip(
+                [self.G1, self.G2, self.G3],
+                [self.G1_term_nodes, self.G2_term_nodes, self.G3_term_nodes],
+                valid_steiner_trees,
+            ):
+                S = steiner_tree(G, term_nodes, method=method)
+                assert any(
+                    edges_equal(list(S.edges(data=True)), valid_tree)
+                    for valid_tree in valid_trees
+                )
 
     def test_multigraph_steiner_tree(self):
         G = nx.MultiGraph()
@@ -79,5 +186,6 @@ class TestSteinerTree:
             (3, 4, 0, {"weight": 1}),
             (3, 5, 0, {"weight": 1}),
         ]
-        T = steiner_tree(G, terminal_nodes)
-        assert edges_equal(T.edges(data=True, keys=True), expected_edges)
+        for method in self.methods:
+            S = steiner_tree(G, terminal_nodes, method=method)
+            assert edges_equal(S.edges(data=True, keys=True), expected_edges)
diff --git a/networkx/algorithms/approximation/traveling_salesman.py b/networkx/algorithms/approximation/traveling_salesman.py
index 806c8b7..881fff9 100644
--- a/networkx/algorithms/approximation/traveling_salesman.py
+++ b/networkx/algorithms/approximation/traveling_salesman.py
@@ -178,7 +178,7 @@ def christofides(G, weight="weight", tree=None):
     L.remove_nodes_from([v for v, degree in tree.degree if not (degree % 2)])
     MG = nx.MultiGraph()
     MG.add_edges_from(tree.edges)
-    edges = nx.min_weight_matching(L, maxcardinality=True, weight=weight)
+    edges = nx.min_weight_matching(L, weight=weight)
     MG.add_edges_from(edges)
     return _shortcutting(nx.eulerian_circuit(MG))
 
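The `maxcardinality=True` argument is dropped because `min_weight_matching` now always computes a maximum-cardinality matching; `christofides` itself is called as before. A quick sketch of the unchanged public API:

```python
import networkx as nx
from networkx.algorithms.approximation import christofides

# Christofides needs a complete graph with metric edge weights.
G = nx.complete_graph(5)
for u, v in G.edges:
    G.edges[u, v]["weight"] = abs(u - v)  # |u - v| satisfies the triangle inequality

cycle = christofides(G)
# The returned tour starts and ends at the same node and visits each node once.
assert cycle[0] == cycle[-1]
assert set(cycle) == set(G.nodes)
assert len(cycle) == len(G) + 1
```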
diff --git a/networkx/algorithms/assortativity/connectivity.py b/networkx/algorithms/assortativity/connectivity.py
index ad05418..dad28f9 100644
--- a/networkx/algorithms/assortativity/connectivity.py
+++ b/networkx/algorithms/assortativity/connectivity.py
@@ -2,7 +2,7 @@ from collections import defaultdict
 
 import networkx as nx
 
-__all__ = ["average_degree_connectivity", "k_nearest_neighbors"]
+__all__ = ["average_degree_connectivity"]
 
 
 def average_degree_connectivity(
@@ -119,21 +119,3 @@ def average_degree_connectivity(
 
     # normalize
     return {k: avg if dnorm[k] == 0 else avg / dnorm[k] for k, avg in dsum.items()}
-
-
-def k_nearest_neighbors(G, source="in+out", target="in+out", nodes=None, weight=None):
-    """Compute the average degree connectivity of graph.
-
-    .. deprecated 2.6
-
-      k_nearest_neighbors function is deprecated and will be removed in v3.0.
-      Use `average_degree_connectivity` instead.
-    """
-    import warnings
-
-    msg = (
-        "k_nearest_neighbors function is deprecated and will be removed in v3.0.\n"
-        "Use `average_degree_connectivity` instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    return average_degree_connectivity(G, source, target, nodes, weight)
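With the deprecated `k_nearest_neighbors` alias removed, callers switch to `average_degree_connectivity`, which has the same signature and return value. A small sketch:

```python
import networkx as nx

# k_nearest_neighbors(G) is gone; average_degree_connectivity(G) is the
# drop-in replacement.
G = nx.path_graph(4)  # node degrees: 1, 2, 2, 1
knn = nx.average_degree_connectivity(G)
# Degree-1 endpoints touch only degree-2 nodes; interior nodes average (1 + 2) / 2.
assert knn == {1: 2.0, 2: 1.5}
```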
diff --git a/networkx/algorithms/assortativity/mixing.py b/networkx/algorithms/assortativity/mixing.py
index f457d75..6a44841 100644
--- a/networkx/algorithms/assortativity/mixing.py
+++ b/networkx/algorithms/assortativity/mixing.py
@@ -9,7 +9,6 @@ __all__ = [
     "attribute_mixing_dict",
     "degree_mixing_matrix",
     "degree_mixing_dict",
-    "numeric_mixing_matrix",
     "mixing_dict",
 ]
 
@@ -209,58 +208,6 @@ def degree_mixing_matrix(
     return a
 
 
-def numeric_mixing_matrix(G, attribute, nodes=None, normalized=True, mapping=None):
-    """Returns numeric mixing matrix for attribute.
-
-    .. deprecated:: 2.6
-
-       numeric_mixing_matrix is deprecated and will be removed in 3.0.
-       Use `attribute_mixing_matrix` instead.
-
-    Parameters
-    ----------
-    G : graph
-       NetworkX graph object.
-
-    attribute : string
-       Node attribute key.
-
-    nodes: list or iterable (optional)
-        Build the matrix only with nodes in container. The default is all nodes.
-
-    normalized : bool (default=True)
-       Return counts if False or probabilities if True.
-
-    mapping : dictionary, optional
-       Mapping from node attribute to integer index in matrix.
-       If not specified, an arbitrary ordering will be used.
-
-    Notes
-    -----
-    If each node has a unique attribute value, the unnormalized mixing matrix
-    will be equal to the adjacency matrix. To get a denser mixing matrix,
-    the rounding can be performed to form groups of nodes with equal values.
-    For example, the exact height of persons in cm (180.79155222, 163.9080892,
-    163.30095355, 167.99016217, 168.21590163, ...) can be rounded to (180, 163,
-    163, 168, 168, ...).
-
-    Returns
-    -------
-    m: numpy array
-       Counts, or joint, probability of occurrence of node attribute pairs.
-    """
-    import warnings
-
-    msg = (
-        "numeric_mixing_matrix is deprecated and will be removed in v3.0.\n"
-        "Use `attribute_mixing_matrix` instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    return attribute_mixing_matrix(
-        G, attribute, nodes=nodes, normalized=normalized, mapping=mapping
-    )
-
-
 def mixing_dict(xy, normalized=False):
     """Returns a dictionary representation of mixing matrix.
 
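Code that used the removed `numeric_mixing_matrix` migrates by passing an explicit `mapping` to `attribute_mixing_matrix`, exactly as the updated tests below do. A minimal sketch (requires numpy):

```python
import networkx as nx
import numpy as np

# numeric_mixing_matrix was removed; attribute_mixing_matrix with a mapping
# from attribute values to matrix indices is the replacement.
G = nx.Graph()
G.add_nodes_from([0, 1], margin=-2)
G.add_nodes_from([2], margin=-3)
G.add_edges_from([(0, 1), (0, 2)])

mapping = {-2: 0, -3: 1}
m = nx.attribute_mixing_matrix(G, "margin", mapping=mapping, normalized=False)
# Undirected edges are counted in both directions: (0, 1) is -2/-2, (0, 2) is -2/-3.
assert np.array_equal(m, np.array([[2.0, 1.0], [1.0, 0.0]]))
```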
diff --git a/networkx/algorithms/assortativity/tests/base_test.py b/networkx/algorithms/assortativity/tests/base_test.py
index 73bb32d..46d6300 100644
--- a/networkx/algorithms/assortativity/tests/base_test.py
+++ b/networkx/algorithms/assortativity/tests/base_test.py
@@ -37,6 +37,27 @@ class BaseTestAttributeMixing:
         S.add_edge(2, 2)
         cls.S = S
 
+        N = nx.Graph()
+        N.add_nodes_from([0, 1], margin=-2)
+        N.add_nodes_from([2, 3], margin=-2)
+        N.add_nodes_from([4], margin=-3)
+        N.add_nodes_from([5], margin=-4)
+        N.add_edges_from([(0, 1), (2, 3), (0, 4), (2, 5)])
+        cls.N = N
+
+        F = nx.Graph()
+        F.add_edges_from([(0, 3), (1, 3), (2, 3)], weight=0.5)
+        F.add_edge(0, 2, weight=1)
+        nx.set_node_attributes(F, dict(F.degree(weight="weight")), "margin")
+        cls.F = F
+
+        K = nx.Graph()
+        K.add_nodes_from([1, 2], margin=-1)
+        K.add_nodes_from([3], margin=1)
+        K.add_nodes_from([4], margin=2)
+        K.add_edges_from([(3, 4), (1, 2), (1, 3)])
+        cls.K = K
+
 
 class BaseTestDegreeMixing:
     @classmethod
@@ -58,28 +79,3 @@ class BaseTestDegreeMixing:
         S2 = nx.star_graph(4)
         cls.DS = nx.disjoint_union(S1, S2)
         cls.DS.add_edge(4, 5)
-
-
-class BaseTestNumericMixing:
-    @classmethod
-    def setup_class(cls):
-        N = nx.Graph()
-        N.add_nodes_from([0, 1], margin=-2)
-        N.add_nodes_from([2, 3], margin=-2)
-        N.add_nodes_from([4], margin=-3)
-        N.add_nodes_from([5], margin=-4)
-        N.add_edges_from([(0, 1), (2, 3), (0, 4), (2, 5)])
-        cls.N = N
-
-        F = nx.Graph()
-        F.add_edges_from([(0, 3), (1, 3), (2, 3)], weight=0.5)
-        F.add_edge(0, 2, weight=1)
-        nx.set_node_attributes(F, dict(F.degree(weight="weight")), "margin")
-        cls.F = F
-
-        M = nx.Graph()
-        M.add_nodes_from([1, 2], margin=-1)
-        M.add_nodes_from([3], margin=1)
-        M.add_nodes_from([4], margin=2)
-        M.add_edges_from([(3, 4), (1, 2), (1, 3)])
-        cls.M = M
diff --git a/networkx/algorithms/assortativity/tests/test_connectivity.py b/networkx/algorithms/assortativity/tests/test_connectivity.py
index c8fae23..21c6287 100644
--- a/networkx/algorithms/assortativity/tests/test_connectivity.py
+++ b/networkx/algorithms/assortativity/tests/test_connectivity.py
@@ -86,8 +86,6 @@ class TestNeighborConnectivity:
         assert nd == 1.8
         nd = nx.average_degree_connectivity(G, weight="weight")[5]
         assert nd == pytest.approx(3.222222, abs=1e-5)
-        nd = nx.k_nearest_neighbors(G, weight="weight")[5]
-        assert nd == pytest.approx(3.222222, abs=1e-5)
 
     def test_zero_deg(self):
         G = nx.DiGraph()
diff --git a/networkx/algorithms/assortativity/tests/test_correlation.py b/networkx/algorithms/assortativity/tests/test_correlation.py
index ffba703..dbaa432 100644
--- a/networkx/algorithms/assortativity/tests/test_correlation.py
+++ b/networkx/algorithms/assortativity/tests/test_correlation.py
@@ -7,11 +7,7 @@ pytest.importorskip("scipy")
 import networkx as nx
 from networkx.algorithms.assortativity.correlation import attribute_ac
 
-from .base_test import (
-    BaseTestAttributeMixing,
-    BaseTestDegreeMixing,
-    BaseTestNumericMixing,
-)
+from .base_test import BaseTestAttributeMixing, BaseTestDegreeMixing
 
 
 class TestDegreeMixingCorrelation(BaseTestDegreeMixing):
@@ -99,16 +95,14 @@ class TestAttributeMixingCorrelation(BaseTestAttributeMixing):
         r = attribute_ac(a)
         np.testing.assert_almost_equal(r, 0.029, decimal=3)
 
-
-class TestNumericMixingCorrelation(BaseTestNumericMixing):
-    def test_numeric_assortativity_negative(self):
+    def test_attribute_assortativity_negative(self):
         r = nx.numeric_assortativity_coefficient(self.N, "margin")
         np.testing.assert_almost_equal(r, -0.2903, decimal=4)
 
-    def test_numeric_assortativity_float(self):
+    def test_attribute_assortativity_float(self):
         r = nx.numeric_assortativity_coefficient(self.F, "margin")
         np.testing.assert_almost_equal(r, -0.1429, decimal=4)
 
-    def test_numeric_assortativity_mixed(self):
-        r = nx.numeric_assortativity_coefficient(self.M, "margin")
+    def test_attribute_assortativity_mixed(self):
+        r = nx.numeric_assortativity_coefficient(self.K, "margin")
         np.testing.assert_almost_equal(r, 0.4340, decimal=4)
diff --git a/networkx/algorithms/assortativity/tests/test_mixing.py b/networkx/algorithms/assortativity/tests/test_mixing.py
index cb4ae07..9af0986 100644
--- a/networkx/algorithms/assortativity/tests/test_mixing.py
+++ b/networkx/algorithms/assortativity/tests/test_mixing.py
@@ -5,11 +5,7 @@ np = pytest.importorskip("numpy")
 
 import networkx as nx
 
-from .base_test import (
-    BaseTestAttributeMixing,
-    BaseTestDegreeMixing,
-    BaseTestNumericMixing,
-)
+from .base_test import BaseTestAttributeMixing, BaseTestDegreeMixing
 
 
 class TestDegreeMixingDict(BaseTestDegreeMixing):
@@ -159,24 +155,22 @@ class TestAttributeMixingMatrix(BaseTestAttributeMixing):
         a = nx.attribute_mixing_matrix(self.M, "fish", mapping=mapping)
         np.testing.assert_equal(a, a_result / a_result.sum())
 
-
-class TestNumericMixingMatrix(BaseTestNumericMixing):
-    def test_numeric_mixing_matrix_negative(self):
+    def test_attribute_mixing_matrix_negative(self):
         mapping = {-2: 0, -3: 1, -4: 2}
         a_result = np.array([[4.0, 1.0, 1.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
-        a = nx.numeric_mixing_matrix(
+        a = nx.attribute_mixing_matrix(
             self.N, "margin", mapping=mapping, normalized=False
         )
         np.testing.assert_equal(a, a_result)
-        a = nx.numeric_mixing_matrix(self.N, "margin", mapping=mapping)
+        a = nx.attribute_mixing_matrix(self.N, "margin", mapping=mapping)
         np.testing.assert_equal(a, a_result / float(a_result.sum()))
 
-    def test_numeric_mixing_matrix_float(self):
+    def test_attribute_mixing_matrix_float(self):
         mapping = {0.5: 1, 1.5: 0}
         a_result = np.array([[6.0, 1.0], [1.0, 0.0]])
-        a = nx.numeric_mixing_matrix(
+        a = nx.attribute_mixing_matrix(
             self.F, "margin", mapping=mapping, normalized=False
         )
         np.testing.assert_equal(a, a_result)
-        a = nx.numeric_mixing_matrix(self.F, "margin", mapping=mapping)
+        a = nx.attribute_mixing_matrix(self.F, "margin", mapping=mapping)
         np.testing.assert_equal(a, a_result / a_result.sum())
diff --git a/networkx/algorithms/bipartite/basic.py b/networkx/algorithms/bipartite/basic.py
index ac4686a..808b335 100644
--- a/networkx/algorithms/bipartite/basic.py
+++ b/networkx/algorithms/bipartite/basic.py
@@ -82,6 +82,7 @@ def color(G):
     return color
 
 
+@nx._dispatch
 def is_bipartite(G):
     """Returns True if graph G is bipartite, False if not.
 
diff --git a/networkx/algorithms/bipartite/matrix.py b/networkx/algorithms/bipartite/matrix.py
index 276d3e4..3e5db20 100644
--- a/networkx/algorithms/bipartite/matrix.py
+++ b/networkx/algorithms/bipartite/matrix.py
@@ -50,7 +50,7 @@ def biadjacency_matrix(
 
     Returns
     -------
-    M : SciPy sparse matrix
+    M : SciPy sparse array
         Biadjacency matrix representation of the bipartite graph G.
 
     Notes
@@ -103,16 +103,8 @@ def biadjacency_matrix(
                 if u in row_index and v in col_index
             )
         )
-    # TODO: change coo_matrix -> coo_array for NX 3.0
-    A = sp.sparse.coo_matrix((data, (row, col)), shape=(nlen, mlen), dtype=dtype)
+    A = sp.sparse.coo_array((data, (row, col)), shape=(nlen, mlen), dtype=dtype)
     try:
-        import warnings
-
-        warnings.warn(
-            "biadjacency_matrix will return a scipy.sparse array instead of a matrix in NetworkX 3.0",
-            FutureWarning,
-            stacklevel=2,
-        )
         return A.asformat(format)
     except ValueError as err:
         raise nx.NetworkXError(f"Unknown sparse array format: {format}") from err
@@ -120,11 +112,11 @@ def biadjacency_matrix(
 
 def from_biadjacency_matrix(A, create_using=None, edge_attribute="weight"):
     r"""Creates a new bipartite graph from a biadjacency matrix given as a
-    SciPy sparse matrix.
+    SciPy sparse array.
 
     Parameters
     ----------
-    A: scipy sparse matrix
+    A: scipy sparse array
       A biadjacency matrix representation of a graph
 
     create_using: NetworkX graph
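The sparse-matrix to sparse-array migration above means `biadjacency_matrix` now hands back a `scipy.sparse` array, which `from_biadjacency_matrix` accepts unchanged. A round-trip sketch (requires scipy):

```python
import networkx as nx
from networkx.algorithms import bipartite

# biadjacency_matrix now returns a SciPy sparse *array* rather than a matrix.
B = nx.complete_bipartite_graph(2, 3)
A = bipartite.biadjacency_matrix(B, row_order=[0, 1])  # top nodes as rows
assert A.shape == (2, 3)

# The inverse direction accepts the sparse array directly.
H = bipartite.from_biadjacency_matrix(A)
assert H.number_of_edges() == B.number_of_edges()
```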
diff --git a/networkx/algorithms/bipartite/projection.py b/networkx/algorithms/bipartite/projection.py
index 8864195..93f2c29 100644
--- a/networkx/algorithms/bipartite/projection.py
+++ b/networkx/algorithms/bipartite/projection.py
@@ -4,7 +4,6 @@ from networkx.exception import NetworkXAlgorithmError
 from networkx.utils import not_implemented_for
 
 __all__ = [
-    "project",
     "projected_graph",
     "weighted_projected_graph",
     "collaboration_weighted_projected_graph",
@@ -522,17 +521,3 @@ def generic_weighted_projected_graph(B, nodes, weight_function=None):
             weight = weight_function(B, u, v)
             G.add_edge(u, v, weight=weight)
     return G
-
-
-def project(B, nodes, create_using=None):
-    import warnings
-
-    warnings.warn(
-        (
-            "networkx.project is deprecated and will be removed"
-            "in NetworkX 3.0, use networkx.projected_graph instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    return projected_graph(B, nodes)
diff --git a/networkx/algorithms/bipartite/redundancy.py b/networkx/algorithms/bipartite/redundancy.py
index 55de063..dd6e343 100644
--- a/networkx/algorithms/bipartite/redundancy.py
+++ b/networkx/algorithms/bipartite/redundancy.py
@@ -103,8 +103,6 @@ def _node_redundancy(G, v):
 
     """
     n = len(G[v])
-    # TODO On Python 3, we could just use `G[u].keys() & G[w].keys()` instead
-    # of instantiating the entire sets.
     overlap = sum(
         1 for (u, w) in combinations(G[v], 2) if (set(G[u]) & set(G[w])) - {v}
     )
diff --git a/networkx/algorithms/bipartite/tests/test_centrality.py b/networkx/algorithms/bipartite/tests/test_centrality.py
index 50ac906..19fb5d1 100644
--- a/networkx/algorithms/bipartite/tests/test_centrality.py
+++ b/networkx/algorithms/bipartite/tests/test_centrality.py
@@ -55,6 +55,22 @@ class TestBipartiteCentrality:
         c = bipartite.closeness_centrality(G, [1])
         assert c == {0: 0.0, 1: 0.0}
 
+    def test_bipartite_closeness_centrality_unconnected(self):
+        G = nx.complete_bipartite_graph(3, 3)
+        G.add_edge(6, 7)
+        c = bipartite.closeness_centrality(G, [0, 2, 4, 6], normalized=False)
+        answer = {
+            0: 10.0 / 7,
+            2: 10.0 / 7,
+            4: 10.0 / 7,
+            6: 10.0,
+            1: 10.0 / 7,
+            3: 10.0 / 7,
+            5: 10.0 / 7,
+            7: 10.0,
+        }
+        assert c == answer
+
     def test_davis_degree_centrality(self):
         G = self.davis
         deg = bipartite.degree_centrality(G, self.top_nodes)
diff --git a/networkx/algorithms/boundary.py b/networkx/algorithms/boundary.py
index 25c1e28..04d59de 100644
--- a/networkx/algorithms/boundary.py
+++ b/networkx/algorithms/boundary.py
@@ -10,9 +10,12 @@ nodes in *S* that are outside *S*.
 """
 from itertools import chain
 
+import networkx as nx
+
 __all__ = ["edge_boundary", "node_boundary"]
 
 
+@nx._dispatch
 def edge_boundary(G, nbunch1, nbunch2=None, data=False, keys=False, default=None):
     """Returns the edge boundary of `nbunch1`.
 
@@ -89,6 +92,7 @@ def edge_boundary(G, nbunch1, nbunch2=None, data=False, keys=False, default=None
     )
 
 
+@nx._dispatch

 def node_boundary(G, nbunch1, nbunch2=None):
     """Returns the node boundary of `nbunch1`.
 
diff --git a/networkx/algorithms/centrality/__init__.py b/networkx/algorithms/centrality/__init__.py
index cf07fe2..c91a904 100644
--- a/networkx/algorithms/centrality/__init__.py
+++ b/networkx/algorithms/centrality/__init__.py
@@ -17,3 +17,4 @@ from .second_order import *
 from .subgraph_alg import *
 from .trophic import *
 from .voterank_alg import *
+from .laplacian import *
diff --git a/networkx/algorithms/centrality/betweenness.py b/networkx/algorithms/centrality/betweenness.py
index 54b7db9..5c9a0c5 100644
--- a/networkx/algorithms/centrality/betweenness.py
+++ b/networkx/algorithms/centrality/betweenness.py
@@ -1,16 +1,17 @@
 """Betweenness centrality measures."""
-import warnings
 from collections import deque
 from heapq import heappop, heappush
 from itertools import count
 
+import networkx as nx
 from networkx.algorithms.shortest_paths.weighted import _weight_function
 from networkx.utils import py_random_state
 from networkx.utils.decorators import not_implemented_for
 
-__all__ = ["betweenness_centrality", "edge_betweenness_centrality", "edge_betweenness"]
+__all__ = ["betweenness_centrality", "edge_betweenness_centrality"]
 
 
+@nx._dispatch
 @py_random_state(5)
 def betweenness_centrality(
     G, k=None, normalized=True, weight=None, endpoints=False, seed=None
@@ -147,6 +148,7 @@ def betweenness_centrality(
     return betweenness
 
 
+@nx._dispatch
 @py_random_state(4)
 def edge_betweenness_centrality(G, k=None, normalized=True, weight=None, seed=None):
     r"""Compute betweenness centrality for edges.
@@ -242,14 +244,6 @@ def edge_betweenness_centrality(G, k=None, normalized=True, weight=None, seed=No
     return betweenness
 
 
-# obsolete name
-def edge_betweenness(G, k=None, normalized=True, weight=None, seed=None):
-    warnings.warn(
-        "edge_betweeness is replaced by edge_betweenness_centrality", DeprecationWarning
-    )
-    return edge_betweenness_centrality(G, k, normalized, weight, seed)
-
-
 # helpers for betweenness centrality
 
 
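With the obsolete `edge_betweenness` alias removed above, the single remaining public entry point is `edge_betweenness_centrality`. A small sketch:

```python
import networkx as nx

# edge_betweenness(G) is gone; edge_betweenness_centrality(G) is the survivor.
G = nx.path_graph(4)
ebc = nx.edge_betweenness_centrality(G, normalized=False)
# In a path graph the middle edge carries the most shortest paths:
# (0,2), (0,3), (1,2), (1,3) all cross edge (1, 2).
assert ebc[(1, 2)] == 4.0
assert ebc[(0, 1)] == ebc[(2, 3)] == 3.0
```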
diff --git a/networkx/algorithms/centrality/betweenness_subset.py b/networkx/algorithms/centrality/betweenness_subset.py
index 6b6958f..24e127b 100644
--- a/networkx/algorithms/centrality/betweenness_subset.py
+++ b/networkx/algorithms/centrality/betweenness_subset.py
@@ -1,6 +1,4 @@
 """Betweenness centrality measures for subsets of nodes."""
-import warnings
-
 from networkx.algorithms.centrality.betweenness import _add_edge_keys
 from networkx.algorithms.centrality.betweenness import (
     _single_source_dijkstra_path_basic as dijkstra,
@@ -11,7 +9,6 @@ from networkx.algorithms.centrality.betweenness import (
 
 __all__ = [
     "betweenness_centrality_subset",
-    "betweenness_centrality_source",
     "edge_betweenness_centrality_subset",
 ]
 
@@ -199,16 +196,6 @@ def edge_betweenness_centrality_subset(
     return b
 
 
-# obsolete name
-def betweenness_centrality_source(G, normalized=True, weight=None, sources=None):
-    msg = "betweenness_centrality_source --> betweenness_centrality_subset"
-    warnings.warn(msg, DeprecationWarning)
-    if sources is None:
-        sources = G.nodes()
-    targets = list(G)
-    return betweenness_centrality_subset(G, sources, targets, normalized, weight)
-
-
 def _accumulate_subset(betweenness, S, P, sigma, s, targets):
     delta = dict.fromkeys(S, 0.0)
     target_set = set(targets) - {s}
diff --git a/networkx/algorithms/centrality/degree_alg.py b/networkx/algorithms/centrality/degree_alg.py
index a7e7b92..87beef8 100644
--- a/networkx/algorithms/centrality/degree_alg.py
+++ b/networkx/algorithms/centrality/degree_alg.py
@@ -1,9 +1,11 @@
 """Degree centrality measures."""
+import networkx as nx
 from networkx.utils.decorators import not_implemented_for
 
 __all__ = ["degree_centrality", "in_degree_centrality", "out_degree_centrality"]
 
 
+@nx._dispatch
 def degree_centrality(G):
     """Compute the degree centrality for nodes.
 
@@ -20,6 +22,12 @@ def degree_centrality(G):
     nodes : dictionary
        Dictionary of nodes with degree centrality as the value.
 
+    Examples
+    --------
+    >>> G = nx.Graph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
+    >>> nx.degree_centrality(G)
+    {0: 1.0, 1: 1.0, 2: 0.6666666666666666, 3: 0.6666666666666666}
+
     See Also
     --------
     betweenness_centrality, load_centrality, eigenvector_centrality
@@ -41,6 +49,7 @@ def degree_centrality(G):
     return centrality
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 def in_degree_centrality(G):
     """Compute the in-degree centrality for nodes.
@@ -63,6 +72,12 @@ def in_degree_centrality(G):
     NetworkXNotImplemented
         If G is undirected.
 
+    Examples
+    --------
+    >>> G = nx.DiGraph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
+    >>> nx.in_degree_centrality(G)
+    {0: 0.0, 1: 0.3333333333333333, 2: 0.6666666666666666, 3: 0.6666666666666666}
+
     See Also
     --------
     degree_centrality, out_degree_centrality
@@ -84,6 +99,7 @@ def in_degree_centrality(G):
     return centrality
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 def out_degree_centrality(G):
     """Compute the out-degree centrality for nodes.
@@ -106,6 +122,12 @@ def out_degree_centrality(G):
     NetworkXNotImplemented
         If G is undirected.
 
+    Examples
+    --------
+    >>> G = nx.DiGraph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
+    >>> nx.out_degree_centrality(G)
+    {0: 1.0, 1: 0.6666666666666666, 2: 0.0, 3: 0.0}
+
     See Also
     --------
     degree_centrality, in_degree_centrality
diff --git a/networkx/algorithms/centrality/dispersion.py b/networkx/algorithms/centrality/dispersion.py
index 8005670..a61a7b4 100644
--- a/networkx/algorithms/centrality/dispersion.py
+++ b/networkx/algorithms/centrality/dispersion.py
@@ -19,6 +19,13 @@ def dispersion(G, u=None, v=None, normalized=True, alpha=1.0, b=0.0, c=0.0):
         The target of the dispersion score if specified.
     normalized : bool
          If True (default) normalize by the embeddedness of the nodes (u and v).
+    alpha, b, c : float
+        Parameters for the normalization procedure. When `normalized` is True,
+        the dispersion value is normalized by::
+
+            result = ((dispersion + b) ** alpha) / (embeddedness + c)
+
+        as long as the denominator is nonzero.
 
     Returns
     -------
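The normalization formula documented above can be checked by hand on a tiny graph. A sketch using only the public `nx.dispersion` API and its default parameters:

```python
import networkx as nx

# Nodes 0 and 1 share neighbors 2 and 3, which are neither adjacent to each
# other nor linked through anyone besides 0 and 1, so the pair (2, 3) counts
# toward the dispersion of v=1 from u=0.
G = nx.Graph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
d = nx.dispersion(G, u=0, v=1)

# normalized result = ((dispersion + b) ** alpha) / (embeddedness + c)
#                   = ((1 + 0) ** 1.0) / (2 + 0) = 0.5 with the defaults
assert d == 0.5
```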
diff --git a/networkx/algorithms/centrality/eigenvector.py b/networkx/algorithms/centrality/eigenvector.py
index f55cedb..bd8a8fd 100644
--- a/networkx/algorithms/centrality/eigenvector.py
+++ b/networkx/algorithms/centrality/eigenvector.py
@@ -7,6 +7,7 @@ from networkx.utils import not_implemented_for
 __all__ = ["eigenvector_centrality", "eigenvector_centrality_numpy"]
 
 
+@nx._dispatch
 @not_implemented_for("multigraph")
 def eigenvector_centrality(G, max_iter=100, tol=1.0e-6, nstart=None, weight=None):
     r"""Compute the eigenvector centrality for the graph `G`.
diff --git a/networkx/algorithms/centrality/katz.py b/networkx/algorithms/centrality/katz.py
index fd0bb93..f429400 100644
--- a/networkx/algorithms/centrality/katz.py
+++ b/networkx/algorithms/centrality/katz.py
@@ -7,6 +7,7 @@ from networkx.utils import not_implemented_for
 __all__ = ["katz_centrality", "katz_centrality_numpy"]
 
 
+@nx._dispatch
 @not_implemented_for("multigraph")
 def katz_centrality(
     G,
@@ -168,7 +169,7 @@ def katz_centrality(
     for _ in range(max_iter):
         xlast = x
         x = dict.fromkeys(xlast, 0)
-        # do the multiplication y^T = Alpha * x^T A - Beta
+        # do the multiplication y^T = Alpha * x^T A + Beta
         for n in x:
             for nbr in G[n]:
                 x[nbr] += xlast[n] * G[n][nbr].get(weight, 1)
diff --git a/networkx/algorithms/centrality/laplacian.py b/networkx/algorithms/centrality/laplacian.py
new file mode 100644
index 0000000..c4bc03f
--- /dev/null
+++ b/networkx/algorithms/centrality/laplacian.py
@@ -0,0 +1,137 @@
+"""
+Laplacian centrality measures.
+"""
+import networkx as nx
+
+__all__ = ["laplacian_centrality"]
+
+
+def laplacian_centrality(
+    G, normalized=True, nodelist=None, weight="weight", walk_type=None, alpha=0.95
+):
+    r"""Compute the Laplacian centrality for nodes in the graph `G`.
+
+    The Laplacian Centrality of a node `i` is measured by the drop in the
+    Laplacian Energy after deleting node `i` from the graph. The Laplacian Energy
+    is the sum of the squared eigenvalues of a graph's Laplacian matrix.
+
+    .. math::
+
+        C_L(u_i,G) = \frac{(\Delta E)_i}{E_L (G)} = \frac{E_L (G)-E_L (G_i)}{E_L (G)}
+
+        E_L (G) = \sum_{i=0}^n \lambda_i^2
+
+    Where $E_L (G)$ is the Laplacian energy of graph `G`,
+    $E_L (G_i)$ is the Laplacian energy of graph `G` after deleting node `i`
+    and $\lambda_i$ are the eigenvalues of `G`'s Laplacian matrix.
+    This formula shows the normalized value. Without normalization,
+    the numerator on the right side is returned.
+
+    Parameters
+    ----------
+
+    G : graph
+        A networkx graph
+
+    normalized : bool (default = True)
+        If True the centrality score is scaled so the sum over all nodes is 1.
+        If False the centrality score for each node is the drop in Laplacian
+        energy when that node is removed.
+
+    nodelist : list, optional (default = None)
+        The rows and columns are ordered according to the nodes in nodelist.
+        If nodelist is None, then the ordering is produced by G.nodes().
+
+    weight : string or None, optional (default=`weight`)
+        Optional parameter `weight` to compute the Laplacian matrix.
+        The edge data key used to compute each value in the matrix.
+        If None, then each edge has weight 1.
+
+    walk_type : string or None, optional (default=None)
+        Optional parameter `walk_type` used when calling
+        :func:`directed_laplacian_matrix <networkx.directed_laplacian_matrix>`.
+        If None, the transition matrix is selected depending on the properties
+        of the graph. Otherwise it can be `random`, `lazy`, or `pagerank`.
+
+    alpha : real (default = 0.95)
+        Optional parameter `alpha` used when calling
+        :func:`directed_laplacian_matrix <networkx.directed_laplacian_matrix>`.
+        (1 - alpha) is the teleportation probability used with pagerank.
+
+    Returns
+    -------
+    nodes : dictionary
+       Dictionary of nodes with Laplacian centrality as the value.
+
+    Examples
+    --------
+    >>> G = nx.Graph()
+    >>> edges = [(0, 1, 4), (0, 2, 2), (2, 1, 1), (1, 3, 2), (1, 4, 2), (4, 5, 1)]
+    >>> G.add_weighted_edges_from(edges)
+    >>> sorted((v, f"{c:0.2f}") for v, c in laplacian_centrality(G).items())
+    [(0, '0.70'), (1, '0.90'), (2, '0.28'), (3, '0.22'), (4, '0.26'), (5, '0.04')]
+
+    Notes
+    -----
+    The algorithm is implemented based on [1]_ with an extension to directed graphs
+    using the `nx.directed_laplacian_matrix` function.
+
+    Raises
+    ------
+    NetworkXPointlessConcept
+        If the graph `G` is the null graph.
+
+    References
+    ----------
+    .. [1] Qi, X., Fuller, E., Wu, Q., Wu, Y., and Zhang, C.-Q. (2012).
+       Laplacian centrality: A new centrality measure for weighted networks.
+       Information Sciences, 194:240-253.
+       https://math.wvu.edu/~cqzhang/Publication-files/my-paper/INS-2012-Laplacian-W.pdf
+
+    See Also
+    --------
+    directed_laplacian_matrix
+    laplacian_matrix
+    """
+    import numpy as np
+    import scipy as sp
+    import scipy.linalg  # call as sp.linalg
+
+    if len(G) == 0:
+        raise nx.NetworkXPointlessConcept("null graph has no centrality defined")
+
+    if nodelist is not None:
+        nodeset = set(G.nbunch_iter(nodelist))
+        if len(nodeset) != len(nodelist):
+            raise nx.NetworkXError("nodelist has duplicate nodes or nodes not in G")
+        nodes = nodelist + [n for n in G if n not in nodeset]
+    else:
+        nodelist = nodes = list(G)
+
+    if G.is_directed():
+        lap_matrix = nx.directed_laplacian_matrix(G, nodes, weight, walk_type, alpha)
+    else:
+        lap_matrix = nx.laplacian_matrix(G, nodes, weight).toarray()
+
+    full_energy = np.power(sp.linalg.eigh(lap_matrix, eigvals_only=True), 2).sum()
+
+    # calculate laplacian centrality
+    laplace_centralities_dict = {}
+    for i, node in enumerate(nodelist):
+        # remove row and col i from lap_matrix
+        all_but_i = list(np.arange(lap_matrix.shape[0]))
+        all_but_i.remove(i)
+        A_2 = lap_matrix[all_but_i, :][:, all_but_i]
+
+        # Adjust diagonal for removed row
+        new_diag = lap_matrix.diagonal() - abs(lap_matrix[:, i])
+        np.fill_diagonal(A_2, new_diag[all_but_i])
+
+        new_energy = np.power(sp.linalg.eigh(A_2, eigvals_only=True), 2).sum()
+        lapl_cent = full_energy - new_energy
+        if normalized:
+            lapl_cent = lapl_cent / full_energy
+
+        laplace_centralities_dict[node] = lapl_cent
+
+    return laplace_centralities_dict
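Because the Laplacian of an undirected graph is symmetric, the Laplacian energy defined in the docstring (the sum of squared eigenvalues) equals trace(L^2), which is simply the sum of the squared entries of L. That makes the energy of the docstring's example graph checkable without an eigensolver. A pure-Python cross-check, independent of the function's actual implementation:

```python
# Edge list from the laplacian_centrality docstring example.
edges = [(0, 1, 4), (0, 2, 2), (2, 1, 1), (1, 3, 2), (1, 4, 2), (4, 5, 1)]

def laplacian_energy(edges, weighted=True):
    """E_L(G) = sum(lambda_i**2) = trace(L @ L) = sum of squared entries of L."""
    deg = {}
    off_diag = 0.0
    for u, v, w in edges:
        wt = w if weighted else 1
        deg[u] = deg.get(u, 0) + wt   # diagonal entries are (weighted) degrees
        deg[v] = deg.get(v, 0) + wt
        off_diag += 2 * wt * wt       # L[u][v] = L[v][u] = -wt
    return sum(d * d for d in deg.values()) + off_diag

print(laplacian_energy(edges))                  # 200.0 (weighted)
print(laplacian_energy(edges, weighted=False))  # 42.0  (unweighted)
```

These values match the `full_energy` constants (200 and 42) used by the new tests for this graph.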
diff --git a/networkx/algorithms/centrality/percolation.py b/networkx/algorithms/centrality/percolation.py
index 4d70338..c15ff56 100644
--- a/networkx/algorithms/centrality/percolation.py
+++ b/networkx/algorithms/centrality/percolation.py
@@ -62,7 +62,7 @@ def percolation_centrality(G, attribute="percolation", states=None, weight=None)
     -----
     The algorithm is from Mahendra Piraveenan, Mikhail Prokopenko, and
     Liaquat Hossain [1]_
-    Pair dependecies are calculated and accumulated using [2]_
+    Pair dependencies are calculated and accumulated using [2]_
 
     For weighted graphs the edge weights must be greater than zero.
     Zero edge weights can produce an infinite number of equal length
diff --git a/networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py b/networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py
index 6b66b8a..a35a401 100644
--- a/networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py
+++ b/networkx/algorithms/centrality/tests/test_betweenness_centrality_subset.py
@@ -116,21 +116,48 @@ class TestSubsetBetweennessCentrality:
         for n in sorted(G):
             assert b[n] == pytest.approx(expected_b[n], abs=1e-7)
 
+    def test_normalized_p2(self):
+        """
+        Betweenness Centrality Subset: Normalized P2
+        if n <= 2: no normalization, betweenness centrality should be 0 for all nodes.
+        """
+        G = nx.Graph()
+        nx.add_path(G, range(2))
+        b_answer = {0: 0, 1: 0.0}
+        b = nx.betweenness_centrality_subset(
+            G, sources=[0], targets=[1], normalized=True, weight=None
+        )
+        for n in sorted(G):
+            assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
 
-class TestBetweennessCentralitySources:
-    def test_K5(self):
-        """Betweenness Centrality Sources: K5"""
-        G = nx.complete_graph(5)
-        b = nx.betweenness_centrality_source(G, weight=None, normalized=False)
-        b_answer = {0: 0.0, 1: 0.0, 2: 0.0, 3: 0.0, 4: 0.0}
+    def test_normalized_P5_directed(self):
+        """Betweenness Centrality Subset: Normalized Directed P5"""
+        G = nx.DiGraph()
+        nx.add_path(G, range(5))
+        b_answer = {0: 0, 1: 1.0 / 12.0, 2: 1.0 / 12.0, 3: 0, 4: 0}
+        b = nx.betweenness_centrality_subset(
+            G, sources=[0], targets=[3], normalized=True, weight=None
+        )
         for n in sorted(G):
             assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
 
-    def test_P3(self):
-        """Betweenness Centrality Sources: P3"""
-        G = nx.path_graph(3)
-        b_answer = {0: 0.0, 1: 1.0, 2: 0.0}
-        b = nx.betweenness_centrality_source(G, weight=None, normalized=True)
+    def test_weighted_graph(self):
+        """Betweenness Centrality Subset: Weighted Graph"""
+        G = nx.DiGraph()
+        G.add_edge(0, 1, weight=3)
+        G.add_edge(0, 2, weight=2)
+        G.add_edge(0, 3, weight=6)
+        G.add_edge(0, 4, weight=4)
+        G.add_edge(1, 3, weight=5)
+        G.add_edge(1, 5, weight=5)
+        G.add_edge(2, 4, weight=1)
+        G.add_edge(3, 4, weight=2)
+        G.add_edge(3, 5, weight=1)
+        G.add_edge(4, 5, weight=4)
+        b_answer = {0: 0.0, 1: 0.0, 2: 0.5, 3: 0.5, 4: 0.5, 5: 0.0}
+        b = nx.betweenness_centrality_subset(
+            G, sources=[0], targets=[5], normalized=False, weight="weight"
+        )
         for n in sorted(G):
             assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
 
@@ -225,3 +252,89 @@ class TestEdgeSubsetBetweennessCentrality:
         )
         for n in sorted(G.edges()):
             assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
+
+    def test_diamond_multi_path(self):
+        """Edge betweenness subset centrality: Diamond Multi Path"""
+        G = nx.Graph()
+        G.add_edges_from(
+            [
+                (1, 2),
+                (1, 3),
+                (1, 4),
+                (1, 5),
+                (1, 10),
+                (10, 11),
+                (11, 12),
+                (12, 9),
+                (2, 6),
+                (3, 6),
+                (4, 6),
+                (5, 7),
+                (7, 8),
+                (6, 8),
+                (8, 9),
+            ]
+        )
+        b_answer = dict.fromkeys(G.edges(), 0)
+        b_answer[(8, 9)] = 0.4
+        b_answer[(6, 8)] = b_answer[(7, 8)] = 0.2
+        b_answer[(2, 6)] = b_answer[(3, 6)] = b_answer[(4, 6)] = 0.2 / 3.0
+        b_answer[(1, 2)] = b_answer[(1, 3)] = b_answer[(1, 4)] = 0.2 / 3.0
+        b_answer[(5, 7)] = 0.2
+        b_answer[(1, 5)] = 0.2
+        b_answer[(9, 12)] = 0.1
+        b_answer[(11, 12)] = b_answer[(10, 11)] = b_answer[(1, 10)] = 0.1
+        b = nx.edge_betweenness_centrality_subset(
+            G, sources=[1], targets=[9], weight=None
+        )
+        for n in G.edges():
+            sort_n = tuple(sorted(n))
+            assert b[n] == pytest.approx(b_answer[sort_n], abs=1e-7)
+
+    def test_normalized_p1(self):
+        """
+        Edge betweenness subset centrality: P1
+        if n <= 1: no normalization; betweenness centrality is 0 for all edges.
+        """
+        G = nx.Graph()
+        nx.add_path(G, range(1))
+        b_answer = dict.fromkeys(G.edges(), 0)
+        b = nx.edge_betweenness_centrality_subset(
+            G, sources=[0], targets=[0], normalized=True, weight=None
+        )
+        for n in G.edges():
+            assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
+
+    def test_normalized_P5_directed(self):
+        """Edge betweenness subset centrality: Normalized Directed P5"""
+        G = nx.DiGraph()
+        nx.add_path(G, range(5))
+        b_answer = dict.fromkeys(G.edges(), 0)
+        b_answer[(0, 1)] = b_answer[(1, 2)] = b_answer[(2, 3)] = 0.05
+        b = nx.edge_betweenness_centrality_subset(
+            G, sources=[0], targets=[3], normalized=True, weight=None
+        )
+        for n in G.edges():
+            assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
+
+    def test_weighted_graph(self):
+        """Edge betweenness subset centrality: Weighted Graph"""
+        G = nx.DiGraph()
+        G.add_edge(0, 1, weight=3)
+        G.add_edge(0, 2, weight=2)
+        G.add_edge(0, 3, weight=6)
+        G.add_edge(0, 4, weight=4)
+        G.add_edge(1, 3, weight=5)
+        G.add_edge(1, 5, weight=5)
+        G.add_edge(2, 4, weight=1)
+        G.add_edge(3, 4, weight=2)
+        G.add_edge(3, 5, weight=1)
+        G.add_edge(4, 5, weight=4)
+        b_answer = dict.fromkeys(G.edges(), 0)
+        b_answer[(0, 2)] = b_answer[(2, 4)] = b_answer[(4, 5)] = 0.5
+        b_answer[(0, 3)] = b_answer[(3, 5)] = 0.5
+        b = nx.edge_betweenness_centrality_subset(
+            G, sources=[0], targets=[5], normalized=False, weight="weight"
+        )
+        for n in G.edges():
+            assert b[n] == pytest.approx(b_answer[n], abs=1e-7)
diff --git a/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py b/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py
index e9f5179..4e3d438 100644
--- a/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py
+++ b/networkx/algorithms/centrality/tests/test_current_flow_betweenness_centrality.py
@@ -134,6 +134,11 @@ class TestApproximateFlowBetweennessCentrality:
             for n in sorted(G):
                 np.testing.assert_allclose(b[n], b_answer[n], atol=epsilon)
 
+    def test_lower_kmax(self):
+        G = nx.complete_graph(4)
+        with pytest.raises(nx.NetworkXError, match="Increase kmax or epsilon"):
+            nx.approximate_current_flow_betweenness_centrality(G, kmax=4)
+
 
 class TestWeightedFlowBetweennessCentrality:
     pass
@@ -175,3 +180,18 @@ class TestEdgeFlowBetweennessCentrality:
         for (s, t), v1 in b_answer.items():
             v2 = b.get((s, t), b.get((t, s)))
             assert v1 == pytest.approx(v2, abs=1e-7)
+
+
+@pytest.mark.parametrize(
+    "centrality_func",
+    (
+        nx.current_flow_betweenness_centrality,
+        nx.edge_current_flow_betweenness_centrality,
+        nx.approximate_current_flow_betweenness_centrality,
+    ),
+)
+def test_unconnected_graphs_betweenness_centrality(centrality_func):
+    G = nx.Graph([(1, 2), (3, 4)])
+    G.add_node(5)
+    with pytest.raises(nx.NetworkXError, match="Graph not connected"):
+        centrality_func(G)
diff --git a/networkx/algorithms/centrality/tests/test_eigenvector_centrality.py b/networkx/algorithms/centrality/tests/test_eigenvector_centrality.py
index 7a44aff..407205b 100644
--- a/networkx/algorithms/centrality/tests/test_eigenvector_centrality.py
+++ b/networkx/algorithms/centrality/tests/test_eigenvector_centrality.py
@@ -166,3 +166,10 @@ class TestEigenvectorCentralityExceptions:
     def test_empty_numpy(self):
         with pytest.raises(nx.NetworkXException):
             nx.eigenvector_centrality_numpy(nx.Graph())
+
+    def test_zero_nstart(self):
+        G = nx.Graph([(1, 2), (1, 3), (2, 3)])
+        with pytest.raises(
+            nx.NetworkXException, match="initial vector cannot have all zero values"
+        ):
+            nx.eigenvector_centrality(G, nstart={v: 0 for v in G})
diff --git a/networkx/algorithms/centrality/tests/test_laplacian_centrality.py b/networkx/algorithms/centrality/tests/test_laplacian_centrality.py
new file mode 100644
index 0000000..0cc59c6
--- /dev/null
+++ b/networkx/algorithms/centrality/tests/test_laplacian_centrality.py
@@ -0,0 +1,189 @@
+import pytest
+
+import networkx as nx
+
+np = pytest.importorskip("numpy")
+sp = pytest.importorskip("scipy")
+
+
+def test_laplacian_centrality_E():
+    E = nx.Graph()
+    E.add_weighted_edges_from(
+        [(0, 1, 4), (4, 5, 1), (0, 2, 2), (2, 1, 1), (1, 3, 2), (1, 4, 2)]
+    )
+    d = nx.laplacian_centrality(E)
+    exact = {
+        0: 0.700000,
+        1: 0.900000,
+        2: 0.280000,
+        3: 0.220000,
+        4: 0.260000,
+        5: 0.040000,
+    }
+
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+    # Check not normalized
+    full_energy = 200
+    dnn = nx.laplacian_centrality(E, normalized=False)
+    for n, dc in dnn.items():
+        assert exact[n] * full_energy == pytest.approx(dc, abs=1e-7)
+
+    # Check unweighted not-normalized version
+    duw_nn = nx.laplacian_centrality(E, normalized=False, weight=None)
+    print(duw_nn)
+    exact_uw_nn = {
+        0: 18,
+        1: 34,
+        2: 18,
+        3: 10,
+        4: 16,
+        5: 6,
+    }
+    for n, dc in duw_nn.items():
+        assert exact_uw_nn[n] == pytest.approx(dc, abs=1e-7)
+
+    # Check unweighted version
+    duw = nx.laplacian_centrality(E, weight=None)
+    full_energy = 42
+    for n, dc in duw.items():
+        assert exact_uw_nn[n] / full_energy == pytest.approx(dc, abs=1e-7)
+
+
+def test_laplacian_centrality_KC():
+    KC = nx.karate_club_graph()
+    d = nx.laplacian_centrality(KC)
+    exact = {
+        0: 0.2543593,
+        1: 0.1724524,
+        2: 0.2166053,
+        3: 0.0964646,
+        4: 0.0350344,
+        5: 0.0571109,
+        6: 0.0540713,
+        7: 0.0788674,
+        8: 0.1222204,
+        9: 0.0217565,
+        10: 0.0308751,
+        11: 0.0215965,
+        12: 0.0174372,
+        13: 0.118861,
+        14: 0.0366341,
+        15: 0.0548712,
+        16: 0.0172772,
+        17: 0.0191969,
+        18: 0.0225564,
+        19: 0.0331147,
+        20: 0.0279955,
+        21: 0.0246361,
+        22: 0.0382339,
+        23: 0.1294193,
+        24: 0.0227164,
+        25: 0.0644697,
+        26: 0.0281555,
+        27: 0.075188,
+        28: 0.0364742,
+        29: 0.0707087,
+        30: 0.0708687,
+        31: 0.131019,
+        32: 0.2370821,
+        33: 0.3066709,
+    }
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+    # Check not normalized
+    full_energy = 12502
+    dnn = nx.laplacian_centrality(KC, normalized=False)
+    for n, dc in dnn.items():
+        assert exact[n] * full_energy == pytest.approx(dc, abs=1e-3)
+
+
+def test_laplacian_centrality_K():
+    K = nx.krackhardt_kite_graph()
+    d = nx.laplacian_centrality(K)
+    exact = {
+        0: 0.3010753,
+        1: 0.3010753,
+        2: 0.2258065,
+        3: 0.483871,
+        4: 0.2258065,
+        5: 0.3870968,
+        6: 0.3870968,
+        7: 0.1935484,
+        8: 0.0752688,
+        9: 0.0322581,
+    }
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+    # Check not normalized
+    full_energy = 186
+    dnn = nx.laplacian_centrality(K, normalized=False)
+    for n, dc in dnn.items():
+        assert exact[n] * full_energy == pytest.approx(dc, abs=1e-3)
+
+
+def test_laplacian_centrality_P3():
+    P3 = nx.path_graph(3)
+    d = nx.laplacian_centrality(P3)
+    exact = {0: 0.6, 1: 1.0, 2: 0.6}
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+
+def test_laplacian_centrality_K5():
+    K5 = nx.complete_graph(5)
+    d = nx.laplacian_centrality(K5)
+    exact = {0: 0.52, 1: 0.52, 2: 0.52, 3: 0.52, 4: 0.52}
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+
+def test_laplacian_centrality_FF():
+    FF = nx.florentine_families_graph()
+    d = nx.laplacian_centrality(FF)
+    exact = {
+        "Acciaiuoli": 0.0804598,
+        "Medici": 0.4022989,
+        "Castellani": 0.1724138,
+        "Peruzzi": 0.183908,
+        "Strozzi": 0.2528736,
+        "Barbadori": 0.137931,
+        "Ridolfi": 0.2183908,
+        "Tornabuoni": 0.2183908,
+        "Albizzi": 0.1954023,
+        "Salviati": 0.1149425,
+        "Pazzi": 0.0344828,
+        "Bischeri": 0.1954023,
+        "Guadagni": 0.2298851,
+        "Ginori": 0.045977,
+        "Lamberteschi": 0.0574713,
+    }
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+
+def test_laplacian_centrality_DG():
+    DG = nx.DiGraph([(0, 5), (1, 5), (2, 5), (3, 5), (4, 5), (5, 6), (5, 7), (5, 8)])
+    d = nx.laplacian_centrality(DG)
+    exact = {
+        0: 0.2123352,
+        5: 0.515391,
+        1: 0.2123352,
+        2: 0.2123352,
+        3: 0.2123352,
+        4: 0.2123352,
+        6: 0.2952031,
+        7: 0.2952031,
+        8: 0.2952031,
+    }
+    for n, dc in d.items():
+        assert exact[n] == pytest.approx(dc, abs=1e-7)
+
+    # Check not normalized
+    full_energy = 9.50704
+    dnn = nx.laplacian_centrality(DG, normalized=False)
+    for n, dc in dnn.items():
+        assert exact[n] * full_energy == pytest.approx(dc, abs=1e-4)
diff --git a/networkx/algorithms/centrality/tests/test_voterank.py b/networkx/algorithms/centrality/tests/test_voterank.py
index aa653ae..1212681 100644
--- a/networkx/algorithms/centrality/tests/test_voterank.py
+++ b/networkx/algorithms/centrality/tests/test_voterank.py
@@ -28,6 +28,10 @@ class TestVoteRankCentrality:
         )
         assert [0, 7, 6] == nx.voterank(G)
 
+    def test_voterank_emptygraph(self):
+        G = nx.Graph()
+        assert [] == nx.voterank(G)
+
     # Graph unit test
     def test_voterank_centrality_2(self):
         G = nx.florentine_families_graph()
diff --git a/networkx/algorithms/chordal.py b/networkx/algorithms/chordal.py
index ad17ef7..6ff8b04 100644
--- a/networkx/algorithms/chordal.py
+++ b/networkx/algorithms/chordal.py
@@ -6,7 +6,6 @@ A graph is chordal if every cycle of length at least 4 has a chord
 https://en.wikipedia.org/wiki/Chordal_graph
 """
 import sys
-import warnings
 
 import networkx as nx
 from networkx.algorithms.components import connected_components
@@ -162,7 +161,7 @@ def find_induced_nodes(G, s, t, treewidth_bound=sys.maxsize):
 
 
 def chordal_graph_cliques(G):
-    """Returns the set of maximal cliques of a chordal graph.
+    """Returns all maximal cliques of a chordal graph.
 
     The algorithm breaks the graph in connected components and performs a
     maximum cardinality search in each component to get the cliques.
@@ -172,9 +171,11 @@ def chordal_graph_cliques(G):
     G : graph
       A NetworkX graph
 
-    Returns
-    -------
-    cliques : A set containing the maximal cliques in G.
+    Yields
+    ------
+    frozenset of nodes
+        Maximal cliques, each of which is a frozenset of
+        nodes in `G`. The order of cliques is arbitrary.
 
     Raises
     ------
@@ -200,11 +201,35 @@ def chordal_graph_cliques(G):
     ... ]
     >>> G = nx.Graph(e)
     >>> G.add_node(9)
-    >>> setlist = nx.chordal_graph_cliques(G)
+    >>> cliques = [c for c in chordal_graph_cliques(G)]
+    >>> cliques[0]
+    frozenset({1, 2, 3})
     """
-    msg = "This will return a generator in 3.0."
-    warnings.warn(msg, DeprecationWarning)
-    return {c for c in _chordal_graph_cliques(G)}
+    for C in (G.subgraph(c).copy() for c in connected_components(G)):
+        if C.number_of_nodes() == 1:
+            if nx.number_of_selfloops(C) > 0:
+                raise nx.NetworkXError("Input graph is not chordal.")
+            yield frozenset(C.nodes())
+        else:
+            unnumbered = set(C.nodes())
+            v = arbitrary_element(C)
+            unnumbered.remove(v)
+            numbered = {v}
+            clique_wanna_be = {v}
+            while unnumbered:
+                v = _max_cardinality_node(C, unnumbered, numbered)
+                unnumbered.remove(v)
+                numbered.add(v)
+                new_clique_wanna_be = set(C.neighbors(v)) & numbered
+                sg = C.subgraph(clique_wanna_be)
+                if _is_complete_graph(sg):
+                    new_clique_wanna_be.add(v)
+                    if not new_clique_wanna_be >= clique_wanna_be:
+                        yield frozenset(clique_wanna_be)
+                    clique_wanna_be = new_clique_wanna_be
+                else:
+                    raise nx.NetworkXError("Input graph is not chordal.")
+            yield frozenset(clique_wanna_be)
 
 
 def chordal_graph_treewidth(G):
@@ -331,78 +356,6 @@ def _find_chordality_breaker(G, s=None, treewidth_bound=sys.maxsize):
     return ()
 
 
-def _chordal_graph_cliques(G):
-    """Returns all maximal cliques of a chordal graph.
-
-    The algorithm breaks the graph in connected components and performs a
-    maximum cardinality search in each component to get the cliques.
-
-    Parameters
-    ----------
-    G : graph
-      A NetworkX graph
-
-    Returns
-    -------
-    iterator
-        An iterator over maximal cliques, each of which is a frozenset of
-        nodes in `G`. The order of cliques is arbitrary.
-
-    Raises
-    ------
-    NetworkXError
-        The algorithm does not support DiGraph, MultiGraph and MultiDiGraph.
-        The algorithm can only be applied to chordal graphs. If the input
-        graph is found to be non-chordal, a :exc:`NetworkXError` is raised.
-
-    Examples
-    --------
-    >>> e = [
-    ...     (1, 2),
-    ...     (1, 3),
-    ...     (2, 3),
-    ...     (2, 4),
-    ...     (3, 4),
-    ...     (3, 5),
-    ...     (3, 6),
-    ...     (4, 5),
-    ...     (4, 6),
-    ...     (5, 6),
-    ...     (7, 8),
-    ... ]
-    >>> G = nx.Graph(e)
-    >>> G.add_node(9)
-    >>> cliques = [c for c in _chordal_graph_cliques(G)]
-    >>> cliques[0]
-    frozenset({1, 2, 3})
-    """
-    for C in (G.subgraph(c).copy() for c in connected_components(G)):
-        if C.number_of_nodes() == 1:
-            if nx.number_of_selfloops(C) > 0:
-                raise nx.NetworkXError("Input graph is not chordal.")
-            yield frozenset(C.nodes())
-        else:
-            unnumbered = set(C.nodes())
-            v = arbitrary_element(C)
-            unnumbered.remove(v)
-            numbered = {v}
-            clique_wanna_be = {v}
-            while unnumbered:
-                v = _max_cardinality_node(C, unnumbered, numbered)
-                unnumbered.remove(v)
-                numbered.add(v)
-                new_clique_wanna_be = set(C.neighbors(v)) & numbered
-                sg = C.subgraph(clique_wanna_be)
-                if _is_complete_graph(sg):
-                    new_clique_wanna_be.add(v)
-                    if not new_clique_wanna_be >= clique_wanna_be:
-                        yield frozenset(clique_wanna_be)
-                    clique_wanna_be = new_clique_wanna_be
-                else:
-                    raise nx.NetworkXError("Input graph is not chordal.")
-            yield frozenset(clique_wanna_be)
-
-
 @not_implemented_for("directed")
 def complete_to_chordal_graph(G):
     """Return a copy of G completed to a chordal graph
diff --git a/networkx/algorithms/clique.py b/networkx/algorithms/clique.py
index afdaa47..c563e2d 100644
--- a/networkx/algorithms/clique.py
+++ b/networkx/algorithms/clique.py
@@ -137,6 +137,67 @@ def find_cliques(G, nodes=None):
     ValueError
         If `nodes` is not a clique.
 
+    Examples
+    --------
+    >>> from pprint import pprint  # For nice dict formatting
+    >>> G = nx.karate_club_graph()
+    >>> sum(1 for c in nx.find_cliques(G))  # The number of maximal cliques in G
+    36
+    >>> max(nx.find_cliques(G), key=len)  # The largest maximal clique in G
+    [0, 1, 2, 3, 13]
+
+    The size of the largest maximal clique is known as the *clique number* of
+    the graph, which can be found directly with:
+
+    >>> max(len(c) for c in nx.find_cliques(G))
+    5
+
+    One can also compute the number of maximal cliques in `G` that contain a given
+    node. The following produces a dictionary keyed by node whose
+    values are the number of maximal cliques in `G` that contain the node:
+
+    >>> pprint({n: sum(1 for c in nx.find_cliques(G) if n in c) for n in G})
+    {0: 13,
+     1: 6,
+     2: 7,
+     3: 3,
+     4: 2,
+     5: 3,
+     6: 3,
+     7: 1,
+     8: 3,
+     9: 2,
+     10: 2,
+     11: 1,
+     12: 1,
+     13: 2,
+     14: 1,
+     15: 1,
+     16: 1,
+     17: 1,
+     18: 1,
+     19: 2,
+     20: 1,
+     21: 1,
+     22: 1,
+     23: 3,
+     24: 2,
+     25: 2,
+     26: 1,
+     27: 3,
+     28: 2,
+     29: 2,
+     30: 2,
+     31: 4,
+     32: 9,
+     33: 14}
+
+    Or, similarly, the maximal cliques in `G` that contain a given node.
+    For example, the 4 maximal cliques that contain node 31:
+
+    >>> [c for c in nx.find_cliques(G) if 31 in c]
+    [[0, 31], [33, 32, 31], [33, 28, 31], [24, 25, 31]]
+
     See Also
     --------
     find_cliques_recursive
@@ -274,7 +335,7 @@ def find_cliques_recursive(G, nodes=None):
     See Also
     --------
     find_cliques
-        An iterative version of the same algorithm.
+        An iterative version of the same algorithm. See docstring for examples.
 
     Notes
     -----
@@ -451,6 +512,14 @@ def graph_clique_number(G, cliques=None):
     The *clique number* of a graph is the size of the largest clique in
     the graph.
 
+    .. deprecated:: 3.0
+
+       graph_clique_number is deprecated in NetworkX 3.0 and will be removed
+       in v3.2. The graph clique number can be computed directly with::
+
+           max(len(c) for c in nx.find_cliques(G))
+
+
     Parameters
     ----------
     G : NetworkX graph
@@ -473,6 +542,16 @@ def graph_clique_number(G, cliques=None):
     maximal cliques.
 
     """
+    import warnings
+
+    warnings.warn(
+        (
+            "\n\ngraph_clique_number is deprecated and will be removed.\n"
+            "Use: ``max(len(c) for c in nx.find_cliques(G))`` instead."
+        ),
+        DeprecationWarning,
+        stacklevel=2,
+    )
     if len(G.nodes) < 1:
         return 0
     if cliques is None:
@@ -483,6 +562,13 @@ def graph_clique_number(G, cliques=None):
 def graph_number_of_cliques(G, cliques=None):
     """Returns the number of maximal cliques in the graph.
 
+    .. deprecated:: 3.0
+
+       graph_number_of_cliques is deprecated and will be removed in v3.2.
+       The number of maximal cliques can be computed directly with::
+
+           sum(1 for _ in nx.find_cliques(G))
+
     Parameters
     ----------
     G : NetworkX graph
@@ -505,6 +591,16 @@ def graph_number_of_cliques(G, cliques=None):
     maximal cliques.
 
     """
+    import warnings
+
+    warnings.warn(
+        (
+            "\n\ngraph_number_of_cliques is deprecated and will be removed.\n"
+            "Use: ``sum(1 for _ in nx.find_cliques(G))`` instead."
+        ),
+        DeprecationWarning,
+        stacklevel=2,
+    )
     if cliques is None:
         cliques = list(find_cliques(G))
     return len(cliques)
@@ -576,9 +672,29 @@ def node_clique_number(G, nodes=None, cliques=None, separate_nodes=False):
 def number_of_cliques(G, nodes=None, cliques=None):
     """Returns the number of maximal cliques for each node.
 
+    .. deprecated:: 3.0
+
+       number_of_cliques is deprecated and will be removed in v3.2.
+       Use the result of `find_cliques` directly to compute the number of
+       cliques containing each node::
+
+           {n: sum(1 for c in nx.find_cliques(G) if n in c) for n in G}
+
     Returns a single or list depending on input nodes.
     Optional list of cliques can be input if already computed.
     """
+    import warnings
+
+    warnings.warn(
+        (
+            "\n\nnumber_of_cliques is deprecated and will be removed.\n"
+            "Use the result of find_cliques directly to compute the number\n"
+            "of cliques containing each node:\n\n"
+            "    {n: sum(1 for c in nx.find_cliques(G) if n in c) for n in G}\n\n"
+        ),
+        DeprecationWarning,
+        stacklevel=2,
+    )
     if cliques is None:
         cliques = list(find_cliques(G))
 
@@ -599,9 +715,29 @@ def number_of_cliques(G, nodes=None, cliques=None):
 def cliques_containing_node(G, nodes=None, cliques=None):
     """Returns a list of cliques containing the given node.
 
+    .. deprecated:: 3.0
+
+       cliques_containing_node is deprecated and will be removed in v3.2.
+       Use the result of `find_cliques` directly to compute the cliques that
+       contain each node::
+
+           {n: [c for c in nx.find_cliques(G) if n in c] for n in G}
+
     Returns a single list or list of lists depending on input nodes.
     Optional list of cliques can be input if already computed.
     """
+    import warnings
+
+    warnings.warn(
+        (
+            "\n\ncliques_containing_node is deprecated and will be removed.\n"
+            "Use the result of find_cliques directly to compute maximal cliques\n"
+            "containing each node:\n\n"
+            "    {n: [c for c in nx.find_cliques(G) if n in c] for n in G}\n\n"
+        ),
+        DeprecationWarning,
+        stacklevel=2,
+    )
     if cliques is None:
         cliques = list(find_cliques(G))
 
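The deprecation notes above point to dict comprehensions over `find_cliques` as the replacement. A minimal sketch of both replacements (the example graph is hypothetical):

```python
import networkx as nx

# Two maximal cliques, {0, 1, 2} and {2, 3, 4}, sharing node 2.
G = nx.Graph([(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4)])

# Replaces the deprecated number_of_cliques(G):
num_cliques = {n: sum(1 for c in nx.find_cliques(G) if n in c) for n in G}

# Replaces the deprecated cliques_containing_node(G):
node_cliques = {n: [c for c in nx.find_cliques(G) if n in c] for n in G}
```

Note that each comprehension re-runs `find_cliques`; to compute both results, materialize `cliques = list(nx.find_cliques(G))` once and reuse it.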
diff --git a/networkx/algorithms/cluster.py b/networkx/algorithms/cluster.py
index 1421fef..03d5986 100644
--- a/networkx/algorithms/cluster.py
+++ b/networkx/algorithms/cluster.py
@@ -3,6 +3,7 @@
 from collections import Counter
 from itertools import chain, combinations
 
+import networkx as nx
 from networkx.utils import not_implemented_for
 
 __all__ = [
@@ -15,6 +16,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch("triangles")
 @not_implemented_for("directed")
 def triangles(G, nodes=None):
     """Compute the number of triangles.
@@ -218,6 +220,7 @@ def _directed_weighted_triangles_and_degree_iter(G, nodes=None, weight="weight")
         yield (i, dtotal, dbidirectional, directed_triangles)
 
 
+@nx._dispatch(name="average_clustering")
 def average_clustering(G, nodes=None, weight=None, count_zeros=True):
     r"""Compute the average clustering coefficient for the graph G.
 
@@ -277,6 +280,7 @@ def average_clustering(G, nodes=None, weight=None, count_zeros=True):
     return sum(c) / len(c)
 
 
+@nx._dispatch(name="clustering")
 def clustering(G, nodes=None, weight=None):
     r"""Compute the clustering coefficient for nodes.
 
@@ -325,8 +329,10 @@ def clustering(G, nodes=None, weight=None):
     ----------
     G : graph
 
-    nodes : container of nodes, optional (default=all nodes in G)
-       Compute clustering for nodes in this container.
+    nodes : node, iterable of nodes, or None (default=None)
+        If a singleton node, return the clustering coefficient for that node.
+        If an iterable, compute clustering for each of those nodes.
+        If `None` (the default) compute clustering for all nodes in `G`.
 
     weight : string or None, optional (default=None)
        The edge attribute that holds the numerical value used as a weight.
@@ -390,6 +396,7 @@ def clustering(G, nodes=None, weight=None):
     return clusterc
 
 
+@nx._dispatch("transitivity")
 def transitivity(G):
     r"""Compute graph transitivity, the fraction of all possible triangles
     present in G.
@@ -428,6 +435,7 @@ def transitivity(G):
     return 0 if triangles == 0 else triangles / contri
 
 
+@nx._dispatch(name="square_clustering")
 def square_clustering(G, nodes=None):
     r"""Compute the squares clustering coefficient for nodes.
 
@@ -505,6 +513,7 @@ def square_clustering(G, nodes=None):
     return clustering
 
 
+@nx._dispatch("generalized_degree")
 @not_implemented_for("directed")
 def generalized_degree(G, nodes=None):
     r"""Compute the generalized degree for nodes.
@@ -546,7 +555,7 @@ def generalized_degree(G, nodes=None):
 
     Notes
     -----
-    In a network of N nodes, the highest triangle multiplicty an edge can have
+    In a network of N nodes, the highest triangle multiplicity an edge can have
     is N-2.
 
     The return value does not include a `zero` entry if no edges of a
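The `nodes` parameter of `triangles` and `clustering` accepts a single node, an iterable of nodes, or `None`; a quick sketch of the three call forms (assuming NetworkX >= 3.0 semantics):

```python
import networkx as nx

G = nx.complete_graph(4)  # every node lies in 3 triangles

one = nx.triangles(G, 0)         # single node -> a number
some = nx.triangles(G, [0, 1])   # iterable -> dict for those nodes
every = nx.triangles(G)          # None (default) -> dict for all nodes

coeffs = nx.clustering(G)        # clustering coefficient per node
```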
diff --git a/networkx/algorithms/coloring/greedy_coloring.py b/networkx/algorithms/coloring/greedy_coloring.py
index 329746c..78e0d15 100644
--- a/networkx/algorithms/coloring/greedy_coloring.py
+++ b/networkx/algorithms/coloring/greedy_coloring.py
@@ -209,29 +209,42 @@ def strategy_saturation_largest_first(G, colors):
 
     """
     distinct_colors = {v: set() for v in G}
-    for i in range(len(G)):
-        # On the first time through, simply choose the node of highest degree.
-        if i == 0:
-            node = max(G, key=G.degree)
-            yield node
-            # Add the color 0 to the distinct colors set for each
-            # neighbors of that node.
-            for v in G[node]:
-                distinct_colors[v].add(0)
-        else:
-            # Compute the maximum saturation and the set of nodes that
-            # achieve that saturation.
-            saturation = {
-                v: len(c) for v, c in distinct_colors.items() if v not in colors
-            }
-            # Yield the node with the highest saturation, and break ties by
-            # degree.
-            node = max(saturation, key=lambda v: (saturation[v], G.degree(v)))
-            yield node
-            # Update the distinct color sets for the neighbors.
-            color = colors[node]
-            for v in G[node]:
-                distinct_colors[v].add(color)
+
+    # Add the node color assignments given in colors to the
+    # distinct colors set for each neighbor of that node
+    for node, color in colors.items():
+        for neighbor in G[node]:
+            distinct_colors[neighbor].add(color)
+
+    # Check that the color assignments in colors are valid
+    # i.e. no neighboring nodes have the same color
+    if len(colors) >= 2:
+        for node, color in colors.items():
+            if color in distinct_colors[node]:
+                raise nx.NetworkXError("Neighboring nodes must have different colors")
+
+    # If 0 nodes have been colored, simply choose the node of highest degree.
+    if not colors:
+        node = max(G, key=G.degree)
+        yield node
+        # Add the color 0 to the distinct colors set for each
+        # neighbor of that node.
+        for v in G[node]:
+            distinct_colors[v].add(0)
+
+    while not len(G) == len(colors):
+        # Update the distinct color sets for the neighbors.
+        for node, color in colors.items():
+            for neighbor in G[node]:
+                distinct_colors[neighbor].add(color)
+
+        # Compute the maximum saturation and the set of nodes that
+        # achieve that saturation.
+        saturation = {v: len(c) for v, c in distinct_colors.items() if v not in colors}
+        # Yield the node with the highest saturation, and break ties by
+        # degree.
+        node = max(saturation, key=lambda v: (saturation[v], G.degree(v)))
+        yield node
 
 
 #: Dictionary mapping name of a strategy as a string to the strategy function.
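The rewritten strategy above is exposed through `greedy_color`; a small usage sketch. The rewrite additionally validates and resumes from a partially filled `colors` mapping, which the ordinary call below does not need to exercise:

```python
import networkx as nx

G = nx.cycle_graph(5)  # an odd cycle needs exactly 3 colors
coloring = nx.greedy_color(G, strategy="saturation_largest_first")
```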
diff --git a/networkx/algorithms/coloring/tests/test_coloring.py b/networkx/algorithms/coloring/tests/test_coloring.py
index cc422e3..6ab95be 100644
--- a/networkx/algorithms/coloring/tests/test_coloring.py
+++ b/networkx/algorithms/coloring/tests/test_coloring.py
@@ -2,6 +2,8 @@
 
 """
 
+import itertools
+
 import pytest
 
 import networkx as nx
@@ -429,6 +431,78 @@ class TestColoring:
         )
         check_state(**params)
 
+    def test_strategy_saturation_largest_first(self):
+        def color_remaining_nodes(
+            G,
+            colored_nodes,
+            full_color_assignment=None,
+            nodes_to_add_between_calls=1,
+        ):
+
+            color_assignments = []
+            aux_colored_nodes = colored_nodes.copy()
+
+            node_iterator = nx.algorithms.coloring.greedy_coloring.strategy_saturation_largest_first(
+                G, aux_colored_nodes
+            )
+
+            for u in node_iterator:
+                # Set to keep track of colors of neighbours
+                neighbour_colors = {
+                    aux_colored_nodes[v] for v in G[u] if v in aux_colored_nodes
+                }
+                # Find the first unused color.
+                for color in itertools.count():
+                    if color not in neighbour_colors:
+                        break
+                aux_colored_nodes[u] = color
+                color_assignments.append((u, color))
+
+                # Color nodes between iterations
+                for i in range(nodes_to_add_between_calls - 1):
+                    if not len(color_assignments) + len(colored_nodes) >= len(
+                        full_color_assignment
+                    ):
+                        full_color_assignment_node, color = full_color_assignment[
+                            len(color_assignments) + len(colored_nodes)
+                        ]
+
+                        # Assign the new color to the current node.
+                        aux_colored_nodes[full_color_assignment_node] = color
+                        color_assignments.append((full_color_assignment_node, color))
+
+            return color_assignments, aux_colored_nodes
+
+        for G, _, _ in SPECIAL_TEST_CASES["saturation_largest_first"]:
+
+            G = G()
+
+            # Check that function still works when nodes are colored between iterations
+            for nodes_to_add_between_calls in range(1, 5):
+                # Get a full color assignment, (including the order in which nodes were colored)
+                colored_nodes = {}
+                full_color_assignment, full_colored_nodes = color_remaining_nodes(
+                    G, colored_nodes
+                )
+
+                # For each node in the color assignment, add it to colored_nodes and re-run the function
+                for ind, (node, color) in enumerate(full_color_assignment):
+                    colored_nodes[node] = color
+
+                    (
+                        partial_color_assignment,
+                        partial_colored_nodes,
+                    ) = color_remaining_nodes(
+                        G,
+                        colored_nodes,
+                        full_color_assignment=full_color_assignment,
+                        nodes_to_add_between_calls=nodes_to_add_between_calls,
+                    )
+
+                    # Check that the color assignment and order of remaining nodes are the same
+                    assert full_color_assignment[ind + 1 :] == partial_color_assignment
+                    assert full_colored_nodes == partial_colored_nodes
+
 
 #  ############################  Utility functions ############################
 def verify_coloring(graph, coloring):
diff --git a/networkx/algorithms/communicability_alg.py b/networkx/algorithms/communicability_alg.py
index ba4b4ab..1d2161d 100644
--- a/networkx/algorithms/communicability_alg.py
+++ b/networkx/algorithms/communicability_alg.py
@@ -36,7 +36,7 @@ def communicability(G):
        Communicability between all pairs of nodes in G  using spectral
        decomposition.
     communicability_betweenness_centrality:
-       Communicability betweeness centrality for each node in G.
+       Communicability betweenness centrality for each node in G.
 
     Notes
     -----
@@ -116,7 +116,7 @@ def communicability_exp(G):
     communicability:
        Communicability between pairs of nodes in G.
     communicability_betweenness_centrality:
-       Communicability betweeness centrality for each node in G.
+       Communicability betweenness centrality for each node in G.
 
     Notes
     -----
diff --git a/networkx/algorithms/community/louvain.py b/networkx/algorithms/community/louvain.py
index 9471130..577766c 100644
--- a/networkx/algorithms/community/louvain.py
+++ b/networkx/algorithms/community/louvain.py
@@ -125,7 +125,7 @@ def louvain_partitions(
     A dendrogram is a diagram representing a tree and each level represents
     a partition of the G graph. The top level contains the smallest communities
     and as you traverse to the bottom of the tree the communities get bigger
-    and the overal modularity increases making the partition better.
+    and the overall modularity increases making the partition better.
 
     Each level is generated by executing the two phases of the Louvain Community
     Detection Algorithm.
diff --git a/networkx/algorithms/community/modularity_max.py b/networkx/algorithms/community/modularity_max.py
index 67a4961..fabf116 100644
--- a/networkx/algorithms/community/modularity_max.py
+++ b/networkx/algorithms/community/modularity_max.py
@@ -10,7 +10,6 @@ from networkx.utils.mapped_queue import MappedQueue
 __all__ = [
     "greedy_modularity_communities",
     "naive_greedy_modularity_communities",
-    "_naive_greedy_modularity_communities",
 ]
 
 
@@ -225,7 +224,11 @@ def _greedy_modularity_communities_generator(G, weight=None, resolution=1):
 
 
 def greedy_modularity_communities(
-    G, weight=None, resolution=1, cutoff=1, best_n=None, n_communities=None
+    G,
+    weight=None,
+    resolution=1,
+    cutoff=1,
+    best_n=None,
 ):
     r"""Find communities in G using greedy modularity maximization.
 
@@ -234,7 +237,7 @@ def greedy_modularity_communities(
 
     Greedy modularity maximization begins with each node in its own community
     and repeatedly joins the pair of communities that lead to the largest
-    modularity until no futher increase in modularity is possible (a maximum).
+    modularity until no further increase in modularity is possible (a maximum).
     Two keyword arguments adjust the stopping condition. `cutoff` is a lower
     limit on the number of communities so you can stop the process before
     reaching a maximum (used to save computation time). `best_n` is an upper
@@ -271,23 +274,10 @@ def greedy_modularity_communities(
         starts to decrease until `best_n` communities remain.
         If ``None``, don't force it to continue beyond a maximum.
 
-    n_communities : int or None, optional (default=None)
-
-        .. deprecated:: 3.0
-           The `n_communities` parameter is deprecated - use `cutoff` and/or
-           `best_n` to set bounds on the desired number of communities instead.
-
-        A minimum number of communities below which the merging process stops.
-        The process stops at this number of communities even if modularity
-        is not maximized. The goal is to let the user stop the process early.
-        The process stops before the cutoff if it finds a maximum of modularity.
-
     Raises
     ------
     ValueError : If the `cutoff` or `best_n`  value is not in the range
         ``[1, G.number_of_nodes()]``, or if `best_n` < `cutoff`.
-        Also raised if `cutoff` is used with the deprecated `n_communities`
-        parameter.
 
     Returns
     -------
@@ -330,18 +320,6 @@ def greedy_modularity_communities(
             return [set(G)]
     else:
         best_n = G.number_of_nodes()
-    if n_communities is not None:
-        import warnings
-
-        warnings.warn(
-            "kwarg ``n_communities`` in greedy_modularity_communities is deprecated"
-            "and will be removed in version 3.0.   Use ``cutoff`` instead.",
-            DeprecationWarning,
-        )
-        if cutoff == 1:
-            cutoff = n_communities
-        else:
-            raise ValueError(f"Can not set both n_communities and cutoff.")
 
     # retrieve generator object to construct output
     community_gen = _greedy_modularity_communities_generator(
@@ -468,7 +446,3 @@ def naive_greedy_modularity_communities(G, resolution=1, weight=None):
             communities[i] = frozenset([])
     # Remove empty communities and sort
     return sorted((c for c in communities if len(c) > 0), key=len, reverse=True)
-
-
-# old name
-_naive_greedy_modularity_communities = naive_greedy_modularity_communities
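With `n_communities` removed, `cutoff` and `best_n` bound the number of communities instead; a minimal sketch on a graph with an obvious two-community structure:

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Two 5-cliques joined by a single edge.
G = nx.barbell_graph(5, 0)

# best_n caps the community count from above; cutoff (default 1) from below.
communities = greedy_modularity_communities(G, best_n=2)
```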
diff --git a/networkx/algorithms/community/quality.py b/networkx/algorithms/community/quality.py
index 7de8059..39bf0cd 100644
--- a/networkx/algorithms/community/quality.py
+++ b/networkx/algorithms/community/quality.py
@@ -11,14 +11,14 @@ from networkx.algorithms.community.community_utils import is_partition
 from networkx.utils import not_implemented_for
 from networkx.utils.decorators import argmap
 
-__all__ = ["coverage", "modularity", "performance", "partition_quality"]
+__all__ = ["modularity", "partition_quality"]
 
 
 class NotAPartition(NetworkXError):
     """Raised if a given collection is not a partition."""
 
     def __init__(self, G, collection):
-        msg = f"{G} is not a valid partition of the graph {collection}"
+        msg = f"{collection} is not a valid partition of the graph {G}"
         super().__init__(msg)
 
 
@@ -59,6 +59,7 @@ def _require_partition(G, partition):
 require_partition = argmap(_require_partition, (0, 1))
 
 
+@nx._dispatch
 def intra_community_edges(G, partition):
     """Returns the number of intra-community edges for a partition of `G`.
 
@@ -76,6 +77,7 @@ def intra_community_edges(G, partition):
     return sum(G.subgraph(block).size() for block in partition)
 
 
+@nx._dispatch
 def inter_community_edges(G, partition):
     """Returns the number of inter-community edges for a partition of `G`.
     according to the given
@@ -139,109 +141,6 @@ def inter_community_non_edges(G, partition):
     return inter_community_edges(nx.complement(G), partition)
 
 
-@not_implemented_for("multigraph")
-@require_partition
-def performance(G, partition):
-    """Returns the performance of a partition.
-
-    .. deprecated:: 2.6
-       Use `partition_quality` instead.
-
-    The *performance* of a partition is the number of
-    intra-community edges plus inter-community non-edges divided by the total
-    number of potential edges.
-
-    Parameters
-    ----------
-    G : NetworkX graph
-        A simple graph (directed or undirected).
-
-    partition : sequence
-        Partition of the nodes of `G`, represented as a sequence of
-        sets of nodes. Each block of the partition represents a
-        community.
-
-    Returns
-    -------
-    float
-        The performance of the partition, as defined above.
-
-    Raises
-    ------
-    NetworkXError
-        If `partition` is not a valid partition of the nodes of `G`.
-
-    References
-    ----------
-    .. [1] Santo Fortunato.
-           "Community Detection in Graphs".
-           *Physical Reports*, Volume 486, Issue 3--5 pp. 75--174
-           <https://arxiv.org/abs/0906.0612>
-
-    """
-    # Compute the number of intra-community edges and inter-community
-    # edges.
-    intra_edges = intra_community_edges(G, partition)
-    inter_edges = inter_community_non_edges(G, partition)
-    # Compute the number of edges in the complete graph (directed or
-    # undirected, as it depends on `G`) on `n` nodes.
-    #
-    # (If `G` is an undirected graph, we divide by two since we have
-    # double-counted each potential edge. We use integer division since
-    # `total_pairs` is guaranteed to be even.)
-    n = len(G)
-    total_pairs = n * (n - 1)
-    if not G.is_directed():
-        total_pairs //= 2
-    return (intra_edges + inter_edges) / total_pairs
-
-
-@require_partition
-def coverage(G, partition):
-    """Returns the coverage of a partition.
-
-    .. deprecated:: 2.6
-       Use `partition_quality` instead.
-
-    The *coverage* of a partition is the ratio of the number of
-    intra-community edges to the total number of edges in the graph.
-
-    Parameters
-    ----------
-    G : NetworkX graph
-
-    partition : sequence
-        Partition of the nodes of `G`, represented as a sequence of
-        sets of nodes. Each block of the partition represents a
-        community.
-
-    Returns
-    -------
-    float
-        The coverage of the partition, as defined above.
-
-    Raises
-    ------
-    NetworkXError
-        If `partition` is not a valid partition of the nodes of `G`.
-
-    Notes
-    -----
-    If `G` is a multigraph, the multiplicity of edges is counted.
-
-    References
-    ----------
-    .. [1] Santo Fortunato.
-           "Community Detection in Graphs".
-           *Physical Reports*, Volume 486, Issue 3--5 pp. 75--174
-           <https://arxiv.org/abs/0906.0612>
-
-    """
-    intra_edges = intra_community_edges(G, partition)
-    total_edges = G.number_of_edges()
-    return intra_edges / total_edges
-
-
 def modularity(G, communities, weight="weight", resolution=1):
     r"""Returns the modularity of the given partition of the graph.
 
@@ -294,7 +193,7 @@ def modularity(G, communities, weight="weight", resolution=1):
     Returns
     -------
     Q : float
-        The modularity of the paritition.
+        The modularity of the partition.
 
     Raises
     ------
diff --git a/networkx/algorithms/community/tests/test_quality.py b/networkx/algorithms/community/tests/test_quality.py
index 1d6aeb8..3447c94 100644
--- a/networkx/algorithms/community/tests/test_quality.py
+++ b/networkx/algorithms/community/tests/test_quality.py
@@ -6,12 +6,7 @@ import pytest
 
 import networkx as nx
 from networkx import barbell_graph
-from networkx.algorithms.community import (
-    coverage,
-    modularity,
-    partition_quality,
-    performance,
-)
+from networkx.algorithms.community import modularity, partition_quality
 from networkx.algorithms.community.quality import inter_community_edges
 
 
@@ -22,14 +17,12 @@ class TestPerformance:
         """Tests that a poor partition has a low performance measure."""
         G = barbell_graph(3, 0)
         partition = [{0, 1, 4}, {2, 3, 5}]
-        assert 8 / 15 == pytest.approx(performance(G, partition), abs=1e-7)
         assert 8 / 15 == pytest.approx(partition_quality(G, partition)[1], abs=1e-7)
 
     def test_good_partition(self):
         """Tests that a good partition has a high performance measure."""
         G = barbell_graph(3, 0)
         partition = [{0, 1, 2}, {3, 4, 5}]
-        assert 14 / 15 == pytest.approx(performance(G, partition), abs=1e-7)
         assert 14 / 15 == pytest.approx(partition_quality(G, partition)[1], abs=1e-7)
 
 
@@ -40,14 +33,12 @@ class TestCoverage:
         """Tests that a poor partition has a low coverage measure."""
         G = barbell_graph(3, 0)
         partition = [{0, 1, 4}, {2, 3, 5}]
-        assert 3 / 7 == pytest.approx(coverage(G, partition), abs=1e-7)
         assert 3 / 7 == pytest.approx(partition_quality(G, partition)[0], abs=1e-7)
 
     def test_good_partition(self):
         """Tests that a good partition has a high coverage measure."""
         G = barbell_graph(3, 0)
         partition = [{0, 1, 2}, {3, 4, 5}]
-        assert 6 / 7 == pytest.approx(coverage(G, partition), abs=1e-7)
         assert 6 / 7 == pytest.approx(partition_quality(G, partition)[0], abs=1e-7)
 
 
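`partition_quality` returns the `(coverage, performance)` pair that the removed functions computed separately; a sketch using the same barbell fixture as the tests above:

```python
import networkx as nx
from networkx.algorithms.community import partition_quality

G = nx.barbell_graph(3, 0)
coverage, performance = partition_quality(G, [{0, 1, 2}, {3, 4, 5}])
```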
diff --git a/networkx/algorithms/components/biconnected.py b/networkx/algorithms/components/biconnected.py
index 1eebe8a..f638453 100644
--- a/networkx/algorithms/components/biconnected.py
+++ b/networkx/algorithms/components/biconnected.py
@@ -343,6 +343,7 @@ def _biconnected_dfs(G, components=True):
         visited.add(start)
         edge_stack = []
         stack = [(start, start, iter(G[start]))]
+        edge_index = {}
         while stack:
             grandparent, parent, children = stack[-1]
             try:
@@ -353,29 +354,34 @@ def _biconnected_dfs(G, components=True):
                     if discovery[child] <= discovery[parent]:  # back edge
                         low[parent] = min(low[parent], discovery[child])
                         if components:
+                            edge_index[parent, child] = len(edge_stack)
                             edge_stack.append((parent, child))
                 else:
                     low[child] = discovery[child] = len(discovery)
                     visited.add(child)
                     stack.append((parent, child, iter(G[child])))
                     if components:
+                        edge_index[parent, child] = len(edge_stack)
                         edge_stack.append((parent, child))
+
             except StopIteration:
                 stack.pop()
                 if len(stack) > 1:
                     if low[parent] >= discovery[grandparent]:
                         if components:
-                            ind = edge_stack.index((grandparent, parent))
+                            ind = edge_index[grandparent, parent]
                             yield edge_stack[ind:]
-                            edge_stack = edge_stack[:ind]
+                            del edge_stack[ind:]
+
                         else:
                             yield grandparent
                     low[grandparent] = min(low[parent], low[grandparent])
                 elif stack:  # length 1 so grandparent is root
                     root_children += 1
                     if components:
-                        ind = edge_stack.index((grandparent, parent))
+                        ind = edge_index[grandparent, parent]
                         yield edge_stack[ind:]
+                        del edge_stack[ind:]
         if not components:
             # root node is articulation point if it has more than 1 child
             if root_children > 1:
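The new `edge_index` dict replaces a linear `edge_stack.index` scan with an O(1) lookup; the public behavior of the biconnectivity routines is unchanged. A quick check on a graph with one bridge:

```python
import networkx as nx

# Two triangles joined by the bridge (2, 3).
G = nx.barbell_graph(3, 0)

components = [set(c) for c in nx.biconnected_components(G)]
cut_nodes = set(nx.articulation_points(G))
```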
diff --git a/networkx/algorithms/components/connected.py b/networkx/algorithms/components/connected.py
index e6b122e..2e74880 100644
--- a/networkx/algorithms/components/connected.py
+++ b/networkx/algorithms/components/connected.py
@@ -12,6 +12,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch
 @not_implemented_for("directed")
 def connected_components(G):
     """Generate connected components.
diff --git a/networkx/algorithms/components/strongly_connected.py b/networkx/algorithms/components/strongly_connected.py
index 1967740..b4a089c 100644
--- a/networkx/algorithms/components/strongly_connected.py
+++ b/networkx/algorithms/components/strongly_connected.py
@@ -12,6 +12,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 def strongly_connected_components(G):
     """Generate nodes in strongly connected components of graph.
diff --git a/networkx/algorithms/components/weakly_connected.py b/networkx/algorithms/components/weakly_connected.py
index 822719a..0a86a01 100644
--- a/networkx/algorithms/components/weakly_connected.py
+++ b/networkx/algorithms/components/weakly_connected.py
@@ -9,6 +9,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 def weakly_connected_components(G):
     """Generate weakly connected components of G.
diff --git a/networkx/algorithms/connectivity/connectivity.py b/networkx/algorithms/connectivity/connectivity.py
index b782031..07636a9 100644
--- a/networkx/algorithms/connectivity/connectivity.py
+++ b/networkx/algorithms/connectivity/connectivity.py
@@ -75,12 +75,10 @@ def local_node_connectivity(
         Residual network to compute maximum flow. If provided it will be
         reused instead of recreated. Default value: None.
 
-    cutoff : integer, float
+    cutoff : integer, float, or None (default: None)
         If specified, the maximum flow algorithm will terminate when the
-        flow value reaches or exceeds the cutoff. This is only for the
-        algorithms that support the cutoff parameter: :meth:`edmonds_karp`
-        and :meth:`shortest_augmenting_path`. Other algorithms will ignore
-        this parameter. Default value: None.
+        flow value reaches or exceeds the cutoff. This only works for flows
+        that support the cutoff parameter (most do) and is ignored otherwise.
 
     Returns
     -------
@@ -529,12 +527,10 @@ def local_edge_connectivity(
         Residual network to compute maximum flow. If provided it will be
         reused instead of recreated. Default value: None.
 
-    cutoff : integer, float
+    cutoff : integer, float, or None (default: None)
         If specified, the maximum flow algorithm will terminate when the
-        flow value reaches or exceeds the cutoff. This is only for the
-        algorithms that support the cutoff parameter: :meth:`edmonds_karp`
-        and :meth:`shortest_augmenting_path`. Other algorithms will ignore
-        this parameter. Default value: None.
+        flow value reaches or exceeds the cutoff. This only works for flows
+        that support the cutoff parameter (most do) and is ignored otherwise.
 
     Returns
     -------
@@ -679,12 +675,10 @@ def edge_connectivity(G, s=None, t=None, flow_func=None, cutoff=None):
         choice of the default function may change from version
         to version and should not be relied on. Default value: None.
 
-    cutoff : integer, float
+    cutoff : integer, float, or None (default: None)
         If specified, the maximum flow algorithm will terminate when the
-        flow value reaches or exceeds the cutoff. This is only for the
-        algorithms that support the cutoff parameter: e.g., :meth:`edmonds_karp`
-        and :meth:`shortest_augmenting_path`. Other algorithms will ignore
-        this parameter. Default value: None.
+        flow value reaches or exceeds the cutoff. This only works for flows
+        that support the cutoff parameter (most do) and is ignored otherwise.
 
     Returns
     -------
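The reworded `cutoff` docs apply to the connectivity functions above; a sketch of early termination. The exact value returned under a cutoff depends on the flow function, so only bounds are asserted here:

```python
import networkx as nx

G = nx.complete_graph(5)  # edge connectivity between any pair is 4

k = nx.edge_connectivity(G, 0, 4)
k_early = nx.edge_connectivity(G, 0, 4, cutoff=2)  # stop once flow >= 2
```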
diff --git a/networkx/algorithms/connectivity/disjoint_paths.py b/networkx/algorithms/connectivity/disjoint_paths.py
index 378a709..e9b0fae 100644
--- a/networkx/algorithms/connectivity/disjoint_paths.py
+++ b/networkx/algorithms/connectivity/disjoint_paths.py
@@ -1,7 +1,7 @@
 """Flow based node and edge disjoint paths."""
 import networkx as nx
 
-# Define the default maximum flow function to use for the undelying
+# Define the default maximum flow function to use for the underlying
 # maximum flow computations
 from networkx.algorithms.flow import (
     edmonds_karp,
@@ -48,13 +48,11 @@ def edge_disjoint_paths(
         may change from version to version and should not be relied on.
         Default value: None.
 
-    cutoff : int
-        Maximum number of paths to yield. Some of the maximum flow
-        algorithms, such as :meth:`edmonds_karp` (the default) and
-        :meth:`shortest_augmenting_path` support the cutoff parameter,
-        and will terminate when the flow value reaches or exceeds the
-        cutoff. Other algorithms will ignore this parameter.
-        Default value: None.
+    cutoff : integer or None (default: None)
+        Maximum number of paths to yield. If specified, the maximum flow
+        algorithm will terminate when the flow value reaches or exceeds the
+        cutoff. This only works for flows that support the cutoff parameter
+        (most do) and is ignored otherwise.
 
     auxiliary : NetworkX DiGraph
         Auxiliary digraph to compute flow based edge connectivity. It has
@@ -254,13 +252,11 @@ def node_disjoint_paths(
         of the default function may change from version to version and
         should not be relied on. Default value: None.
 
-    cutoff : int
-        Maximum number of paths to yield. Some of the maximum flow
-        algorithms, such as :meth:`edmonds_karp` (the default) and
-        :meth:`shortest_augmenting_path` support the cutoff parameter,
-        and will terminate when the flow value reaches or exceeds the
-        cutoff. Other algorithms will ignore this parameter.
-        Default value: None.
+    cutoff : integer or None (default: None)
+        Maximum number of paths to yield. If specified, the maximum flow
+        algorithm will terminate when the flow value reaches or exceeds the
+        cutoff. This only works for flows that support the cutoff parameter
+        (most do) and is ignored otherwise.
 
     auxiliary : NetworkX DiGraph
         Auxiliary digraph to compute flow based node connectivity. It has
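For the disjoint-path functions, `cutoff` bounds the number of paths yielded; a brief sketch with the default flow function (which supports the parameter):

```python
import networkx as nx

G = nx.complete_graph(5)

all_paths = list(nx.edge_disjoint_paths(G, 0, 4))
few_paths = list(nx.edge_disjoint_paths(G, 0, 4, cutoff=2))
```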
diff --git a/networkx/algorithms/connectivity/edge_augmentation.py b/networkx/algorithms/connectivity/edge_augmentation.py
index a8c5e83..b9e5f5d 100644
--- a/networkx/algorithms/connectivity/edge_augmentation.py
+++ b/networkx/algorithms/connectivity/edge_augmentation.py
@@ -262,7 +262,7 @@ def k_edge_augmentation(G, k, avail=None, weight=None, partial=False):
             aug_edges = greedy_k_edge_augmentation(
                 G, k=k, avail=avail, weight=weight, seed=0
             )
-        # Do eager evaulation so we can catch any exceptions
+        # Do eager evaluation so we can catch any exceptions
         # Before executing partial code.
         yield from list(aug_edges)
     except nx.NetworkXUnfeasible:
@@ -368,7 +368,7 @@ def partial_k_edge_augmentation(G, k, avail, weight=None):
             }
             # Remove potential augmenting edges
             C.remove_edges_from(sub_avail.keys())
-            # Find a subset of these edges that makes the compoment
+            # Find a subset of these edges that makes the component
             # k-edge-connected and ignore the rest
             yield from nx.k_edge_augmentation(C, k=k, avail=sub_avail)
 
@@ -542,7 +542,7 @@ def _lightest_meta_edges(mapping, avail_uv, avail_w):
     -----
     Each node in the metagraph is a k-edge-connected component in the original
     graph.  We don't care about any edge within the same k-edge-connected
-    component, so we ignore self edges.  We also are only intereseted in the
+    component, so we ignore self edges.  We also are only interested in the
     minimum weight edge bridging each k-edge-connected component so, we group
     the edges by meta-edge and take the lightest in each group.
 
diff --git a/networkx/algorithms/connectivity/tests/test_kcutsets.py b/networkx/algorithms/connectivity/tests/test_kcutsets.py
index 91426f1..3d48c4c 100644
--- a/networkx/algorithms/connectivity/tests/test_kcutsets.py
+++ b/networkx/algorithms/connectivity/tests/test_kcutsets.py
@@ -241,7 +241,6 @@ def test_non_repeated_cuts():
     solution = [{32, 33}, {2, 33}, {0, 3}, {0, 1}, {29, 33}]
     cuts = list(nx.all_node_cuts(G))
     if len(solution) != len(cuts):
-        print(nx.info(G))
         print(f"Solution: {solution}")
         print(f"Result: {cuts}")
     assert len(solution) == len(cuts)
diff --git a/networkx/algorithms/core.py b/networkx/algorithms/core.py
index e39eb84..165e583 100644
--- a/networkx/algorithms/core.py
+++ b/networkx/algorithms/core.py
@@ -34,7 +34,6 @@ from networkx.utils import not_implemented_for
 
 __all__ = [
     "core_number",
-    "find_cores",
     "k_core",
     "k_shell",
     "k_crust",
@@ -44,6 +43,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch
 @not_implemented_for("multigraph")
 def core_number(G):
     """Returns the core number for each vertex.
@@ -115,18 +115,6 @@ def core_number(G):
     return core
 
 
-def find_cores(G):
-    import warnings
-
-    msg = (
-        "\nfind_cores is deprecated as of version 2.7 and will be removed "
-        "in version 3.0.\n"
-        "The find_cores function is renamed core_number\n"
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    return nx.core_number(G)
-
-
 def _core_subgraph(G, k_filter, k=None, core=None):
     """Returns the subgraph induced by nodes passing filter `k_filter`.
 
@@ -154,6 +142,7 @@ def _core_subgraph(G, k_filter, k=None, core=None):
     return G.subgraph(nodes).copy()
 
 
+@nx._dispatch
 def k_core(G, k=None, core_number=None):
     """Returns the k-core of G.
 
@@ -378,6 +367,7 @@ def k_corona(G, k, core_number=None):
     return _core_subgraph(G, func, k, core_number)
 
 
+@nx._dispatch
 @not_implemented_for("directed")
 @not_implemented_for("multigraph")
 def k_truss(G, k):
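The core.py hunk above removes the deprecated `find_cores` alias in favor of `core_number`. As a refresher on what `core_number` computes, here is a minimal stdlib-only sketch of the degree-peeling idea; it is an illustrative stand-in, not the library's linear-time implementation:

```python
def core_numbers(adj):
    """Peeling sketch: repeatedly strip nodes of degree <= k.

    `adj` maps each node to the set of its neighbors in a simple
    undirected graph; returns {node: core number}.
    """
    adj = {v: set(ns) for v, ns in adj.items()}
    core = {}
    k = 0
    while adj:
        low = [v for v, ns in adj.items() if len(ns) <= k]
        if not low:
            k += 1
            continue
        for v in low:
            core[v] = k
            for u in adj.pop(v):
                if u in adj:
                    adj[u].discard(v)
    return core

# Triangle a-b-c with a pendant node d hanging off a:
graph = {"a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b"}, "d": {"a"}}
assert core_numbers(graph) == {"a": 2, "b": 2, "c": 2, "d": 1}
```

Code that previously called `nx.find_cores(G)` can call `nx.core_number(G)` unchanged; the alias was a pure rename.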
diff --git a/networkx/algorithms/cuts.py b/networkx/algorithms/cuts.py
index ae1cb02..d225996 100644
--- a/networkx/algorithms/cuts.py
+++ b/networkx/algorithms/cuts.py
@@ -21,6 +21,7 @@ __all__ = [
 # TODO STILL NEED TO UPDATE ALL THE DOCUMENTATION!
 
 
+@nx._dispatch
 def cut_size(G, S, T=None, weight=None):
     """Returns the size of the cut between two sets of nodes.
 
@@ -83,6 +84,7 @@ def cut_size(G, S, T=None, weight=None):
     return sum(weight for u, v, weight in edges)
 
 
+@nx._dispatch
 def volume(G, S, weight=None):
     """Returns the volume of a set of nodes.
 
@@ -125,6 +127,7 @@ def volume(G, S, weight=None):
     return sum(d for v, d in degree(S, weight=weight))
 
 
+@nx._dispatch
 def normalized_cut_size(G, S, T=None, weight=None):
     """Returns the normalized size of the cut between two sets of nodes.
 
@@ -177,6 +180,7 @@ def normalized_cut_size(G, S, T=None, weight=None):
     return num_cut_edges * ((1 / volume_S) + (1 / volume_T))
 
 
+@nx._dispatch
 def conductance(G, S, T=None, weight=None):
     """Returns the conductance of two sets of nodes.
 
@@ -224,6 +228,7 @@ def conductance(G, S, T=None, weight=None):
     return num_cut_edges / min(volume_S, volume_T)
 
 
+@nx._dispatch
 def edge_expansion(G, S, T=None, weight=None):
     """Returns the edge expansion between two node sets.
 
@@ -270,6 +275,7 @@ def edge_expansion(G, S, T=None, weight=None):
     return num_cut_edges / min(len(S), len(T))
 
 
+@nx._dispatch
 def mixing_expansion(G, S, T=None, weight=None):
     """Returns the mixing expansion between two node sets.
 
@@ -317,6 +323,7 @@ def mixing_expansion(G, S, T=None, weight=None):
 
 # TODO What is the generalization to two arguments, S and T? Does the
 # denominator become `min(len(S), len(T))`?
+@nx._dispatch
 def node_expansion(G, S):
     """Returns the node expansion of the set `S`.
 
@@ -356,6 +363,7 @@ def node_expansion(G, S):
 
 # TODO What is the generalization to two arguments, S and T? Does the
 # denominator become `min(len(S), len(T))`?
+@nx._dispatch
 def boundary_expansion(G, S):
     """Returns the boundary expansion of the set `S`.
 
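The functions decorated above (`cut_size`, `volume`, `conductance`, and friends) compute simple set-based quantities. A stdlib-only sketch of the unweighted definitions, using a plain edge list in place of a NetworkX graph:

```python
def cut_size(edges, S, T):
    # number of edges with one endpoint in S and the other in T
    return sum(1 for u, v in edges if (u in S and v in T) or (u in T and v in S))

def volume(edges, S):
    # sum of the degrees of the nodes in S (each endpoint counts once)
    return sum((u in S) + (v in S) for u, v in edges)

def conductance(edges, S, T):
    # cut size normalized by the smaller of the two volumes
    return cut_size(edges, S, T) / min(volume(edges, S), volume(edges, T))

# 4-cycle 0-1-2-3-0 split into halves: two crossing edges, volume 4 each
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
S, T = {0, 1}, {2, 3}
assert cut_size(edges, S, T) == 2
assert volume(edges, S) == 4
assert conductance(edges, S, T) == 0.5
```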
diff --git a/networkx/algorithms/cycles.py b/networkx/algorithms/cycles.py
index 48d32ae..1dd91bd 100644
--- a/networkx/algorithms/cycles.py
+++ b/networkx/algorithms/cycles.py
@@ -529,7 +529,7 @@ def minimum_cycle_basis(G, weight=None):
     --------
     simple_cycles, cycle_basis
     """
-    # We first split the graph in commected subgraphs
+    # We first split the graph in connected subgraphs
     return sum(
         (_min_cycle_basis(G.subgraph(c), weight) for c in nx.connected_components(G)),
         [],
diff --git a/networkx/algorithms/d_separation.py b/networkx/algorithms/d_separation.py
index caf26d0..ce7fe31 100644
--- a/networkx/algorithms/d_separation.py
+++ b/networkx/algorithms/d_separation.py
@@ -11,6 +11,65 @@ The implementation is based on the conceptually simple linear time
 algorithm presented in [2]_.  Refer to [3]_, [4]_ for a couple of
 alternative algorithms.
 
+Here, we provide a brief overview of d-separation and related concepts that
+are relevant for understanding it:
+
+Blocking paths
+--------------
+
+Before the overview, we introduce the following terminology to describe paths:
+
+- "open" path: A path between two nodes that can be traversed
+- "blocked" path: A path between two nodes that cannot be traversed
+
+A **collider** is a triplet of nodes along a path of the form
+``... u -> c <- v ...``, where ``c`` is a common successor of ``u`` and ``v``.
+A path through a collider is considered "blocked" by default. When a node that
+is a collider, or a descendant of a collider, is included in the d-separating
+set, the path through that collider node becomes "open"; we then call such a
+node an open collider.
+
+The d-separating set blocks the paths between ``u`` and ``v``. If you include
+colliders, or their descendant nodes, in the d-separating set, then those
+colliders will open up, enabling a path to be traversed if it is not blocked
+some other way.
+
+Illustration of D-separation with examples
+------------------------------------------
+
+A pair of nodes, ``u`` and ``v``, is considered connected if there is at
+least one open path between them: a path that encounters neither a collider
+nor a variable in the d-separating set.
+
+For example, if the d-separating set is the empty set, then the following paths are
+unblocked between ``u`` and ``v``:
+
+- u <- z -> v
+- u -> w -> ... -> z -> v
+
+If, for example, ``z`` is in the d-separating set, then ``z`` blocks those
+paths between ``u`` and ``v``.
+
+Colliders block a path by default if neither they nor their descendants are
+included in the d-separating set. An example of a path that is blocked when
+the d-separating set is empty is:
+
+- u -> w -> ... -> z <- v
+
+because ``z`` is a collider in this path and ``z`` is not in the d-separating
+set. However, if ``z`` or a descendant of ``z`` is included in the
+d-separating set, then the path through the collider at ``z``
+(``... -> z <- ...``) is now "open".
+
+D-separation is concerned with blocking all paths between ``u`` and ``v``.
+Therefore, a d-separating set between ``u`` and ``v`` is one where all paths
+are blocked.
+
+D-separation and its applications in probability
+------------------------------------------------
+
+D-separation is commonly used in probabilistic graphical models. D-separation
+connects the idea of probabilistic "dependence" with separation in a graph. If
+one assumes the causal Markov condition [5]_, then d-separation implies conditional
+independence in probability distributions.
 
 Examples
 --------
@@ -55,6 +114,8 @@ References
 .. [4] Koller, D., & Friedman, N. (2009).
    Probabilistic graphical models: principles and techniques. The MIT Press.
 
+.. [5] https://en.wikipedia.org/wiki/Causal_Markov_condition
+
 """
 
 from collections import deque
@@ -62,7 +123,7 @@ from collections import deque
 import networkx as nx
 from networkx.utils import UnionFind, not_implemented_for
 
-__all__ = ["d_separated"]
+__all__ = ["d_separated", "minimal_d_separator", "is_minimal_d_separator"]
 
 
 @not_implemented_for("undirected")
@@ -100,6 +161,15 @@ def d_separated(G, x, y, z):
         If any of the input nodes are not found in the graph,
         a :exc:`NodeNotFound` exception is raised.
 
+    Notes
+    -----
+    A d-separating set in a DAG is a set of nodes that
+    blocks all paths between the two sets. Nodes in `z`
+    block a path if they are part of the path and are not a collider,
+    or a descendant of a collider. A collider structure along a path
+    is ``... -> c <- ...`` where ``c`` is the collider node.
+
+    https://en.wikipedia.org/wiki/Bayesian_network#d-separation
     """
 
     if not nx.is_directed_acyclic_graph(G):
@@ -140,3 +210,232 @@ def d_separated(G, x, y, z):
         return False
     else:
         return True
+
+
+@not_implemented_for("undirected")
+def minimal_d_separator(G, u, v):
+    """Compute a minimal d-separating set between `u` and `v`.
+
+    A d-separating set in a DAG is a set of nodes that blocks all paths
+    between the two nodes, `u` and `v`. This function
+    constructs a d-separating set that is "minimal", meaning that no proper
+    subset of it is itself a d-separating set for `u` and `v`. A minimal
+    d-separating set is not necessarily unique. For more details, see Notes.
+
+    Parameters
+    ----------
+    G : graph
+        A networkx DAG.
+    u : node
+        A node in the graph, G.
+    v : node
+        A node in the graph, G.
+
+    Raises
+    ------
+    NetworkXError
+        Raises a :exc:`NetworkXError` if the input graph is not a DAG.
+
+    NodeNotFound
+        If any of the input nodes are not found in the graph,
+        a :exc:`NodeNotFound` exception is raised.
+
+    References
+    ----------
+    .. [1] Tian, J., & Paz, A. (1998). Finding Minimal D-separators.
+
+    Notes
+    -----
+    This function only finds *a* minimal d-separator. It does not guarantee
+    uniqueness, since in a DAG there may be more than one minimal d-separator
+    between two nodes. Moreover, this only checks for minimal separators
+    between two nodes, not two sets. Finding minimal d-separators between
+    two sets of nodes is not supported.
+
+    Uses the algorithm presented in [1]_. The complexity of the algorithm
+    is :math:`O(|E_{An}^m|)`, where :math:`|E_{An}^m|` stands for the
+    number of edges in the moralized graph of the sub-graph consisting
+    of only the ancestors of 'u' and 'v'. For full details, see [1]_.
+
+    The algorithm works by constructing the moral graph consisting of just
+    the ancestors of `u` and `v`. Then it constructs a candidate for
+    a separating set ``Z'`` from the predecessors of `u` and `v`.
+    Then BFS is run starting from `u` and marking nodes
+    found from ``Z'`` and calling those nodes ``Z''``.
+    Then BFS is run again starting from `v` and marking nodes if they are
+    present in ``Z''``. Those marked nodes are the returned minimal
+    d-separating set.
+
+    https://en.wikipedia.org/wiki/Bayesian_network#d-separation
+    """
+    if not nx.is_directed_acyclic_graph(G):
+        raise nx.NetworkXError("graph should be directed acyclic")
+
+    union_uv = {u, v}
+
+    if any(n not in G.nodes for n in union_uv):
+        raise nx.NodeNotFound("one or more specified nodes not found in the graph")
+
+    # first construct the set of ancestors of X and Y
+    x_anc = nx.ancestors(G, u)
+    y_anc = nx.ancestors(G, v)
+    D_anc_xy = x_anc.union(y_anc)
+    D_anc_xy.update((u, v))
+
+    # second, construct the moralization of the subgraph of Anc(X,Y)
+    moral_G = nx.moral_graph(G.subgraph(D_anc_xy))
+
+    # find a separating set Z' in moral_G
+    Z_prime = set(G.predecessors(u)).union(set(G.predecessors(v)))
+
+    # perform BFS on the moral graph from u, marking nodes in Z_prime
+    Z_dprime = _bfs_with_marks(moral_G, u, Z_prime)
+    Z = _bfs_with_marks(moral_G, v, Z_dprime)
+    return Z
+
+
+@not_implemented_for("undirected")
+def is_minimal_d_separator(G, u, v, z):
+    """Determine if a d-separating set is minimal.
+
+    A d-separating set, `z`, in a DAG is a set of nodes that blocks
+    all paths between the two nodes, `u` and `v`. This function
+    verifies that a set is "minimal", meaning that no proper subset of `z`
+    is also a d-separating set between the two nodes.
+
+    Parameters
+    ----------
+    G : nx.DiGraph
+        The graph.
+    u : node
+        A node in the graph.
+    v : node
+        A node in the graph.
+    z : Set of nodes
+        The set of nodes to check if it is a minimal d-separating set.
+
+    Returns
+    -------
+    bool
+        Whether or not the `z` separating set is minimal.
+
+    Raises
+    ------
+    NetworkXError
+        Raises a :exc:`NetworkXError` if the input graph is not a DAG.
+
+    NodeNotFound
+        If any of the input nodes are not found in the graph,
+        a :exc:`NodeNotFound` exception is raised.
+
+    References
+    ----------
+    .. [1] Tian, J., & Paz, A. (1998). Finding Minimal D-separators.
+
+    Notes
+    -----
+    This function only verifies that a d-separating set is minimal between
+    two nodes; verifying minimality between two sets of nodes is not
+    supported.
+
+    Uses algorithm 2 presented in [1]_. The complexity of the algorithm
+    is :math:`O(|E_{An}^m|)`, where :math:`|E_{An}^m|` stands for the
+    number of edges in the moralized graph of the sub-graph consisting
+    of only the ancestors of ``u`` and ``v``.
+
+    The algorithm works by constructing the moral graph consisting of just
+    the ancestors of `u` and `v`. First, it performs BFS on the moral graph
+    starting from `u` and marking any nodes it encounters that are part of
+    the separating set, `z`. If a node is marked, then it does not continue
+    along that path. In the second stage, BFS with markings is repeated on the
+    moral graph starting from `v`. If at any stage, any node in `z` is
+    not marked, then `z` is considered not minimal. If the end of the algorithm
+    is reached, then `z` is minimal.
+
+    For full details, see [1]_.
+
+    https://en.wikipedia.org/wiki/Bayesian_network#d-separation
+    """
+    if not nx.is_directed_acyclic_graph(G):
+        raise nx.NetworkXError("graph should be directed acyclic")
+
+    union_uv = {u, v}
+    union_uv.update(z)
+
+    if any(n not in G.nodes for n in union_uv):
+        raise nx.NodeNotFound("one or more specified nodes not found in the graph")
+
+    x_anc = nx.ancestors(G, u)
+    y_anc = nx.ancestors(G, v)
+    xy_anc = x_anc.union(y_anc)
+
+    # if Z contains any node which is not in ancestors of X or Y
+    # then it is definitely not minimal
+    if any(node not in xy_anc for node in z):
+        return False
+
+    D_anc_xy = x_anc.union(y_anc)
+    D_anc_xy.update((u, v))
+
+    # second, construct the moralization of the subgraph
+    moral_G = nx.moral_graph(G.subgraph(D_anc_xy))
+
+    # start BFS from X
+    marks = _bfs_with_marks(moral_G, u, z)
+
+    # if not all the Z is marked, then the set is not minimal
+    if any(node not in marks for node in z):
+        return False
+
+    # similarly, start BFS from Y and check the marks
+    marks = _bfs_with_marks(moral_G, v, z)
+    # if not all the Z is marked, then the set is not minimal
+    if any(node not in marks for node in z):
+        return False
+
+    return True
+
+
+@not_implemented_for("directed")
+def _bfs_with_marks(G, start_node, check_set):
+    """Breadth-first-search with markings.
+
+    Performs BFS starting from ``start_node`` and whenever a node
+    inside ``check_set`` is met, it is "marked". Once a node is marked,
+    BFS does not continue along that path. The resulting marked nodes
+    are returned.
+
+    Parameters
+    ----------
+    G : nx.Graph
+        An undirected graph.
+    start_node : node
+        The start of the BFS.
+    check_set : set
+        The set of nodes to check against.
+
+    Returns
+    -------
+    marked : set
+        A set of nodes that were marked.
+    """
+    visited = dict()
+    marked = set()
+    queue = []
+
+    visited[start_node] = None
+    queue.append(start_node)
+    while queue:
+        m = queue.pop(0)
+
+        for nbr in G.neighbors(m):
+            if nbr not in visited:
+                # memoize where we visited so far
+                visited[nbr] = None
+
+                # mark the node in Z' and do not continue along that path
+                if nbr in check_set:
+                    marked.add(nbr)
+                else:
+                    queue.append(nbr)
+    return marked
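The `_bfs_with_marks` helper above uses `list.pop(0)`, which costs O(n) per dequeue; the same traversal can be sketched stand-alone with `collections.deque` (this is an illustrative version operating on a plain adjacency dict, not the library code):

```python
from collections import deque

def bfs_with_marks(adj, start, check_set):
    # BFS over an undirected adjacency dict; nodes in check_set are
    # "marked" and the search does not continue through them.
    visited = {start}
    marked = set()
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in visited:
                visited.add(nbr)
                if nbr in check_set:
                    marked.add(nbr)  # mark and stop along this path
                else:
                    queue.append(nbr)
    return marked

# Path a - b - c - d with check_set {c}: the search from a is cut off at c,
# so c is marked and d is never reached.
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
assert bfs_with_marks(adj, "a", {"c"}) == {"c"}
```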
diff --git a/networkx/algorithms/dag.py b/networkx/algorithms/dag.py
index 27f1a82..2073516 100644
--- a/networkx/algorithms/dag.py
+++ b/networkx/algorithms/dag.py
@@ -8,7 +8,7 @@ to the user to check for that.
 import heapq
 from collections import deque
 from functools import partial
-from itertools import chain, product, starmap
+from itertools import chain, combinations, product, starmap
 from math import gcd
 
 import networkx as nx
@@ -30,11 +30,13 @@ __all__ = [
     "dag_longest_path",
     "dag_longest_path_length",
     "dag_to_branching",
+    "compute_v_structures",
 ]
 
 chaini = chain.from_iterable
 
 
+@nx._dispatch
 def descendants(G, source):
     """Returns all nodes reachable from `source` in `G`.
 
@@ -56,9 +58,14 @@ def descendants(G, source):
     Examples
     --------
     >>> DG = nx.path_graph(5, create_using=nx.DiGraph)
-    >>> sorted(list(nx.descendants(DG, 2)))
+    >>> sorted(nx.descendants(DG, 2))
     [3, 4]
 
+    The `source` node is not a descendant of itself, but can be included manually:
+
+    >>> sorted(nx.descendants(DG, 2) | {2})
+    [2, 3, 4]
+
     See also
     --------
     ancestors
@@ -66,6 +73,7 @@ def descendants(G, source):
     return {child for parent, child in nx.bfs_edges(G, source)}
 
 
+@nx._dispatch
 def ancestors(G, source):
     """Returns all nodes having a path to `source` in `G`.
 
@@ -87,9 +95,14 @@ def ancestors(G, source):
     Examples
     --------
     >>> DG = nx.path_graph(5, create_using=nx.DiGraph)
-    >>> sorted(list(nx.ancestors(DG, 2)))
+    >>> sorted(nx.ancestors(DG, 2))
     [0, 1]
 
+    The `source` node is not an ancestor of itself, but can be included manually:
+
+    >>> sorted(nx.ancestors(DG, 2) | {2})
+    [0, 1, 2]
+
     See also
     --------
     descendants
@@ -995,7 +1008,15 @@ def dag_longest_path(G, weight="weight", default_weight=1, topo_order=None):
     dist = {}  # stores {v : (length, u)}
     for v in topo_order:
         us = [
-            (dist[u][0] + data.get(weight, default_weight), u)
+            (
+                dist[u][0]
+                + (
+                    max(data.values(), key=lambda x: x.get(weight, default_weight))
+                    if G.is_multigraph()
+                    else data
+                ).get(weight, default_weight),
+                u,
+            )
             for u, data in G.pred[v].items()
         ]
 
@@ -1057,8 +1078,13 @@ def dag_longest_path_length(G, weight="weight", default_weight=1):
     """
     path = nx.dag_longest_path(G, weight, default_weight)
     path_length = 0
-    for (u, v) in pairwise(path):
-        path_length += G[u][v].get(weight, default_weight)
+    if G.is_multigraph():
+        for u, v in pairwise(path):
+            i = max(G[u][v], key=lambda x: G[u][v][x].get(weight, default_weight))
+            path_length += G[u][v][i].get(weight, default_weight)
+    else:
+        for (u, v) in pairwise(path):
+            path_length += G[u][v].get(weight, default_weight)
 
     return path_length
 
@@ -1177,3 +1203,33 @@ def dag_to_branching(G):
     B.remove_node(0)
     B.remove_node(-1)
     return B
+
+
+@not_implemented_for("undirected")
+def compute_v_structures(G):
+    """Iterate through the graph to compute all v-structures.
+
+    V-structures are triples in the directed graph where
+    two parent nodes point to the same child and the two parent nodes
+    are not adjacent.
+
+    Parameters
+    ----------
+    G : graph
+        A networkx DiGraph.
+
+    Returns
+    -------
+    vstructs : iterator of tuples
+        The v-structures within the graph. Each v-structure is a 3-tuple with
+        the parent, collider, and other parent.
+
+    Notes
+    -----
+    https://en.wikipedia.org/wiki/Collider_(statistics)
+    """
+    for collider, preds in G.pred.items():
+        for common_parents in combinations(preds, r=2):
+            # sort the parents for a canonical ordering of the triple
+            common_parents = sorted(common_parents)
+            yield (common_parents[0], collider, common_parents[1])
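The `compute_v_structures` generator above can be mirrored without NetworkX using a plain predecessor mapping; the hypothetical `preds` dict below stands in for `G.pred`:

```python
from itertools import combinations

def v_structures(preds):
    # preds maps each node to the collection of its parents (predecessors);
    # every unordered pair of parents of a common child yields one triple
    for collider, parents in preds.items():
        for p, q in combinations(sorted(parents), r=2):
            yield (p, collider, q)

# DAG with edges 0 -> 2, 1 -> 2, 3 -> 0: only node 2 has two parents
preds = {0: [3], 1: [], 2: [0, 1], 3: []}
assert list(v_structures(preds)) == [(0, 2, 1)]
```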
diff --git a/networkx/algorithms/distance_measures.py b/networkx/algorithms/distance_measures.py
index 3f59a2a..9ea6c07 100644
--- a/networkx/algorithms/distance_measures.py
+++ b/networkx/algorithms/distance_measures.py
@@ -4,7 +4,6 @@ import networkx as nx
 from networkx.utils import not_implemented_for
 
 __all__ = [
-    "extrema_bounding",
     "eccentricity",
     "diameter",
     "radius",
@@ -15,15 +14,9 @@ __all__ = [
 ]
 
 
-def extrema_bounding(G, compute="diameter"):
+def _extrema_bounding(G, compute="diameter", weight=None):
     """Compute requested extreme distance metric of undirected graph G
 
-    .. deprecated:: 2.8
-
-       extrema_bounding is deprecated and will be removed in NetworkX 3.0.
-       Use the corresponding distance measure with the `usebounds=True` option
-       instead.
-
     Computation is based on smart lower and upper bounds, and in practice
     linear in the number of nodes, rather than quadratic (except for some
     border cases such as complete graphs or circle shaped graphs).
@@ -40,70 +33,25 @@ def extrema_bounding(G, compute="diameter"):
        "center" for the set of nodes with eccentricity equal to the radius,
        "eccentricities" for the maximum distance from each node to all other nodes in G
 
-    Returns
-    -------
-    value : value of the requested metric
-       int for "diameter" and "radius" or
-       list of nodes for "center" and "periphery" or
-       dictionary of eccentricity values keyed by node for "eccentricities"
-
-    Raises
-    ------
-    NetworkXError
-        If the graph consists of multiple components
-    ValueError
-        If `compute` is not one of "diameter", "radius", "periphery", "center",
-        or "eccentricities".
-
-    Notes
-    -----
-    This algorithm was proposed in the following papers:
-
-    F.W. Takes and W.A. Kosters, Determining the Diameter of Small World
-    Networks, in Proceedings of the 20th ACM International Conference on
-    Information and Knowledge Management (CIKM 2011), pp. 1191-1196, 2011.
-    doi: https://doi.org/10.1145/2063576.2063748
-
-    F.W. Takes and W.A. Kosters, Computing the Eccentricity Distribution of
-    Large Graphs, Algorithms 6(1): 100-118, 2013.
-    doi: https://doi.org/10.3390/a6010100
-
-    M. Borassi, P. Crescenzi, M. Habib, W.A. Kosters, A. Marino and F.W. Takes,
-    Fast Graph Diameter and Radius BFS-Based Computation in (Weakly Connected)
-    Real-World Graphs, Theoretical Computer Science 586: 59-80, 2015.
-    doi: https://doi.org/10.1016/j.tcs.2015.02.033
-    """
-    import warnings
-
-    msg = "extrema_bounding is deprecated and will be removed in networkx 3.0\n"
-    # NOTE: _extrema_bounding does input checking, so it is skipped here
-    if compute in {"diameter", "radius", "periphery", "center"}:
-        msg += f"Use nx.{compute}(G, usebounds=True) instead."
-    if compute == "eccentricities":
-        msg += f"Use nx.eccentricity(G) instead."
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-    return _extrema_bounding(G, compute=compute)
+    weight : string, function, or None
+        If this is a string, then edge weights will be accessed via the
+        edge attribute with this key (that is, the weight of the edge
+        joining `u` to `v` will be ``G.edges[u, v][weight]``). If no
+        such edge attribute exists, the weight of the edge is assumed to
+        be one.
 
+        If this is a function, the weight of an edge is the value
+        returned by the function. The function must accept exactly three
+        positional arguments: the two endpoints of an edge and the
+        dictionary of edge attributes for that edge. The function must
+        return a number.
 
-def _extrema_bounding(G, compute="diameter"):
-    """Compute requested extreme distance metric of undirected graph G
+        If this is None, every edge has weight/distance/cost 1.
 
-    Computation is based on smart lower and upper bounds, and in practice
-    linear in the number of nodes, rather than quadratic (except for some
-    border cases such as complete graphs or circle shaped graphs).
+        Weights stored as floating point values can lead to small round-off
+        errors in distances. Use integer weights to avoid this.
 
-    Parameters
-    ----------
-    G : NetworkX graph
-       An undirected graph
-
-    compute : string denoting the requesting metric
-       "diameter" for the maximal eccentricity value,
-       "radius" for the minimal eccentricity value,
-       "periphery" for the set of nodes with eccentricity equal to the diameter,
-       "center" for the set of nodes with eccentricity equal to the radius,
-       "eccentricities" for the maximum distance from each node to all other nodes in G
+        Weights should be positive, since they are distances.
 
     Returns
     -------
@@ -118,25 +66,26 @@ def _extrema_bounding(G, compute="diameter"):
         If the graph consists of multiple components
     ValueError
         If `compute` is not one of "diameter", "radius", "periphery", "center", or "eccentricities".
+
     Notes
     -----
-    This algorithm was proposed in the following papers:
-
-    F.W. Takes and W.A. Kosters, Determining the Diameter of Small World
-    Networks, in Proceedings of the 20th ACM International Conference on
-    Information and Knowledge Management (CIKM 2011), pp. 1191-1196, 2011.
-    doi: https://doi.org/10.1145/2063576.2063748
+    This algorithm was proposed in [1]_ and discussed further in [2]_ and [3]_.
 
-    F.W. Takes and W.A. Kosters, Computing the Eccentricity Distribution of
-    Large Graphs, Algorithms 6(1): 100-118, 2013.
-    doi: https://doi.org/10.3390/a6010100
-
-    M. Borassi, P. Crescenzi, M. Habib, W.A. Kosters, A. Marino and F.W. Takes,
-    Fast Graph Diameter and Radius BFS-Based Computation in (Weakly Connected)
-    Real-World Graphs, Theoretical Computer Science 586: 59-80, 2015.
-    doi: https://doi.org/10.1016/j.tcs.2015.02.033
+    References
+    ----------
+    .. [1] F. W. Takes, W. A. Kosters,
+       "Determining the diameter of small world networks."
+       Proceedings of the 20th ACM international conference on Information and knowledge management, 2011
+       https://dl.acm.org/doi/abs/10.1145/2063576.2063748
+    .. [2] F. W. Takes, W. A. Kosters,
+       "Computing the Eccentricity Distribution of Large Graphs."
+       Algorithms, 2013
+       https://www.mdpi.com/1999-4893/6/1/100
+    .. [3] M. Borassi, P. Crescenzi, M. Habib, W. A. Kosters, A. Marino, F. W. Takes,
+       "Fast diameter and radius BFS-based computation in (weakly connected) real-world graphs: With an application to the six degrees of separation games."
+       Theoretical Computer Science, 2015
+       https://www.sciencedirect.com/science/article/pii/S0304397515001644
     """
-
     # init variables
     degrees = dict(G.degree())  # start with the highest degree node
     minlowernode = max(degrees, key=degrees.get)
@@ -163,7 +112,8 @@ def _extrema_bounding(G, compute="diameter"):
         high = not high
 
         # get distances from/to current node and derive eccentricity
-        dist = dict(nx.single_source_shortest_path_length(G, current))
+        dist = nx.shortest_path_length(G, source=current, weight=weight)
+
         if len(dist) != N:
             msg = "Cannot compute metric because graph is not connected."
             raise nx.NetworkXError(msg)
@@ -272,20 +222,20 @@ def _extrema_bounding(G, compute="diameter"):
     # return the correct value of the requested metric
     if compute == "diameter":
         return maxlower
-    elif compute == "radius":
+    if compute == "radius":
         return minupper
-    elif compute == "periphery":
+    if compute == "periphery":
         p = [v for v in G if ecc_lower[v] == maxlower]
         return p
-    elif compute == "center":
+    if compute == "center":
         c = [v for v in G if ecc_upper[v] == minupper]
         return c
-    elif compute == "eccentricities":
+    if compute == "eccentricities":
         return ecc_lower
     return None
 
 
-def eccentricity(G, v=None, sp=None):
+def eccentricity(G, v=None, sp=None, weight=None):
     """Returns the eccentricity of nodes in G.
 
     The eccentricity of a node v is the maximum distance from v to
@@ -302,6 +252,26 @@ def eccentricity(G, v=None, sp=None):
     sp : dict of dicts, optional
        All pairs shortest path lengths as a dictionary of dictionaries
 
+    weight : string, function, or None (default=None)
+        If this is a string, then edge weights will be accessed via the
+        edge attribute with this key (that is, the weight of the edge
+        joining `u` to `v` will be ``G.edges[u, v][weight]``). If no
+        such edge attribute exists, the weight of the edge is assumed to
+        be one.
+
+        If this is a function, the weight of an edge is the value
+        returned by the function. The function must accept exactly three
+        positional arguments: the two endpoints of an edge and the
+        dictionary of edge attributes for that edge. The function must
+        return a number.
+
+        If this is None, every edge has weight/distance/cost 1.
+
+        Weights stored as floating point values can lead to small round-off
+        errors in distances. Use integer weights to avoid this.
+
+        Weights should be positive, since they are distances.
+
     Returns
     -------
     ecc : dictionary
@@ -324,11 +294,11 @@ def eccentricity(G, v=None, sp=None):
     #    else:                      # assume v is a container of nodes
     #        nodes=v
     order = G.order()
-
     e = {}
     for n in G.nbunch_iter(v):
         if sp is None:
-            length = nx.single_source_shortest_path_length(G, n)
+            length = nx.shortest_path_length(G, source=n, weight=weight)
+
             L = len(length)
         else:
             try:
@@ -350,11 +320,10 @@ def eccentricity(G, v=None, sp=None):
 
     if v in G:
         return e[v]  # return single value
-    else:
-        return e
+    return e
 
 
-def diameter(G, e=None, usebounds=False):
+def diameter(G, e=None, usebounds=False, weight=None):
     """Returns the diameter of the graph G.
 
     The diameter is the maximum eccentricity.
@@ -367,6 +336,26 @@ def diameter(G, e=None, usebounds=False):
     e : eccentricity dictionary, optional
       A precomputed dictionary of eccentricities.
 
+    weight : string, function, or None
+        If this is a string, then edge weights will be accessed via the
+        edge attribute with this key (that is, the weight of the edge
+        joining `u` to `v` will be ``G.edges[u, v][weight]``). If no
+        such edge attribute exists, the weight of the edge is assumed to
+        be one.
+
+        If this is a function, the weight of an edge is the value
+        returned by the function. The function must accept exactly three
+        positional arguments: the two endpoints of an edge and the
+        dictionary of edge attributes for that edge. The function must
+        return a number.
+
+        If this is None, every edge has weight/distance/cost 1.
+
+        Weights stored as floating point values can lead to small round-off
+        errors in distances. Use integer weights to avoid this.
+
+        Weights should be positive, since they are distances.
+
     Returns
     -------
     d : integer
@@ -383,13 +372,13 @@ def diameter(G, e=None, usebounds=False):
     eccentricity
     """
     if usebounds is True and e is None and not G.is_directed():
-        return _extrema_bounding(G, compute="diameter")
+        return _extrema_bounding(G, compute="diameter", weight=weight)
     if e is None:
-        e = eccentricity(G)
+        e = eccentricity(G, weight=weight)
     return max(e.values())
 
 
-def periphery(G, e=None, usebounds=False):
+def periphery(G, e=None, usebounds=False, weight=None):
     """Returns the periphery of the graph G.
 
     The periphery is the set of nodes with eccentricity equal to the diameter.
@@ -402,6 +391,26 @@ def periphery(G, e=None, usebounds=False):
     e : eccentricity dictionary, optional
       A precomputed dictionary of eccentricities.
 
+    weight : string, function, or None
+        If this is a string, then edge weights will be accessed via the
+        edge attribute with this key (that is, the weight of the edge
+        joining `u` to `v` will be ``G.edges[u, v][weight]``). If no
+        such edge attribute exists, the weight of the edge is assumed to
+        be one.
+
+        If this is a function, the weight of an edge is the value
+        returned by the function. The function must accept exactly three
+        positional arguments: the two endpoints of an edge and the
+        dictionary of edge attributes for that edge. The function must
+        return a number.
+
+        If this is None, every edge has weight/distance/cost 1.
+
+        Weights stored as floating point values can lead to small round-off
+        errors in distances. Use integer weights to avoid this.
+
+        Weights should be positive, since they are distances.
+
     Returns
     -------
     p : list
@@ -419,15 +428,15 @@ def periphery(G, e=None, usebounds=False):
     center
     """
     if usebounds is True and e is None and not G.is_directed():
-        return _extrema_bounding(G, compute="periphery")
+        return _extrema_bounding(G, compute="periphery", weight=weight)
     if e is None:
-        e = eccentricity(G)
+        e = eccentricity(G, weight=weight)
     diameter = max(e.values())
     p = [v for v in e if e[v] == diameter]
     return p
 
 
-def radius(G, e=None, usebounds=False):
+def radius(G, e=None, usebounds=False, weight=None):
     """Returns the radius of the graph G.
 
     The radius is the minimum eccentricity.
@@ -440,6 +449,26 @@ def radius(G, e=None, usebounds=False):
     e : eccentricity dictionary, optional
       A precomputed dictionary of eccentricities.
 
+    weight : string, function, or None
+        If this is a string, then edge weights will be accessed via the
+        edge attribute with this key (that is, the weight of the edge
+        joining `u` to `v` will be ``G.edges[u, v][weight]``). If no
+        such edge attribute exists, the weight of the edge is assumed to
+        be one.
+
+        If this is a function, the weight of an edge is the value
+        returned by the function. The function must accept exactly three
+        positional arguments: the two endpoints of an edge and the
+        dictionary of edge attributes for that edge. The function must
+        return a number.
+
+        If this is None, every edge has weight/distance/cost 1.
+
+        Weights stored as floating point values can lead to small round-off
+        errors in distances. Use integer weights to avoid this.
+
+        Weights should be positive, since they are distances.
+
     Returns
     -------
     r : integer
@@ -453,13 +482,13 @@ def radius(G, e=None, usebounds=False):
 
     """
     if usebounds is True and e is None and not G.is_directed():
-        return _extrema_bounding(G, compute="radius")
+        return _extrema_bounding(G, compute="radius", weight=weight)
     if e is None:
-        e = eccentricity(G)
+        e = eccentricity(G, weight=weight)
     return min(e.values())
 
 
-def center(G, e=None, usebounds=False):
+def center(G, e=None, usebounds=False, weight=None):
     """Returns the center of the graph G.
 
     The center is the set of nodes with eccentricity equal to radius.
@@ -472,6 +501,26 @@ def center(G, e=None, usebounds=False):
     e : eccentricity dictionary, optional
       A precomputed dictionary of eccentricities.
 
+    weight : string, function, or None
+        If this is a string, then edge weights will be accessed via the
+        edge attribute with this key (that is, the weight of the edge
+        joining `u` to `v` will be ``G.edges[u, v][weight]``). If no
+        such edge attribute exists, the weight of the edge is assumed to
+        be one.
+
+        If this is a function, the weight of an edge is the value
+        returned by the function. The function must accept exactly three
+        positional arguments: the two endpoints of an edge and the
+        dictionary of edge attributes for that edge. The function must
+        return a number.
+
+        If this is None, every edge has weight/distance/cost 1.
+
+        Weights stored as floating point values can lead to small round-off
+        errors in distances. Use integer weights to avoid this.
+
+        Weights should be positive, since they are distances.
+
     Returns
     -------
     c : list
@@ -489,9 +538,9 @@ def center(G, e=None, usebounds=False):
     periphery
     """
     if usebounds is True and e is None and not G.is_directed():
-        return _extrema_bounding(G, compute="center")
+        return _extrema_bounding(G, compute="center", weight=weight)
     if e is None:
-        e = eccentricity(G)
+        e = eccentricity(G, weight=weight)
     radius = min(e.values())
     p = [v for v in e if e[v] == radius]
     return p
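The hunks above thread a new `weight` keyword through `eccentricity`, `diameter`, `periphery`, `radius`, and `center`. A minimal sketch of what the weighted variant computes, written against the long-stable `shortest_path_length` API only (so it runs on builds without this change):

```python
import networkx as nx

# A 3-node path with one heavy edge; weighted eccentricities differ
# from the unweighted (hop-count) ones.
G = nx.Graph()
G.add_edge(0, 1, weight=1)
G.add_edge(1, 2, weight=10)

# Per-node weighted eccentricity: the max weighted shortest-path length.
ecc = {
    n: max(nx.shortest_path_length(G, source=n, weight="weight").values())
    for n in G
}
print(ecc)                # {0: 11, 1: 10, 2: 11}
print(max(ecc.values()))  # weighted diameter: 11
print(min(ecc.values()))  # weighted radius: 10
```

With `weight=None` the same graph has diameter 2, which is exactly the old hop-count behavior the docstrings describe.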
@@ -632,14 +681,23 @@ def resistance_distance(G, nodeA, nodeB, weight=None, invert_weight=True):
 
     Notes
     -----
-    Overview discussion:
-    * https://en.wikipedia.org/wiki/Resistance_distance
-    * http://mathworld.wolfram.com/ResistanceDistance.html
-
-    Additional details:
-    Vaya Sapobi Samui Vos, “Methods for determining the effective resistance,” M.S.,
-    Mathematisch Instituut, Universiteit Leiden, Leiden, Netherlands, 2016
-    Available: `Link to thesis <https://www.universiteitleiden.nl/binaries/content/assets/science/mi/scripties/master/vos_vaya_master.pdf>`_
+    Overviews are provided in [1]_ and [2]_. Additional details on computational
+    methods, proofs of properties, and corresponding MATLAB codes are provided
+    in [3]_.
+
+    References
+    ----------
+    .. [1] Wikipedia
+       "Resistance distance."
+       https://en.wikipedia.org/wiki/Resistance_distance
+    .. [2] E. W. Weisstein
+       "Resistance Distance."
+       MathWorld--A Wolfram Web Resource
+       https://mathworld.wolfram.com/ResistanceDistance.html
+    .. [3] V. S. S. Vos,
+       "Methods for determining the effective resistance."
+       M.S. thesis, Mathematisch Instituut, Universiteit Leiden, 2016
+       https://www.universiteitleiden.nl/binaries/content/assets/science/mi/scripties/master/vos_vaya_master.pdf
     """
     import numpy as np
     import scipy as sp
@@ -671,7 +729,7 @@ def resistance_distance(G, nodeA, nodeB, weight=None, invert_weight=True):
     # Replace with collapsing topology or approximated zero?
 
     # Using determinants to compute the effective resistance is more memory
-    # efficent than directly calculating the psuedo-inverse
+    # efficient than directly calculating the pseudo-inverse
     L = nx.laplacian_matrix(G, node_list, weight=weight).asformat("csc")
     indices = list(range(L.shape[0]))
     # w/ nodeA removed
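The comment in the hunk above notes that the determinant route is more memory-efficient than the pseudo-inverse. For reference, the textbook pseudo-inverse formula it alludes to can be sketched with plain NumPy (building the Laplacian by hand to stay self-contained):

```python
import numpy as np

# Adjacency matrix of a 4-cycle (nodes 0-1-2-3-0).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A        # graph Laplacian
Linv = np.linalg.pinv(L)              # Moore-Penrose pseudo-inverse

# Effective resistance between opposite nodes 0 and 2:
r = Linv[0, 0] + Linv[2, 2] - 2 * Linv[0, 2]
print(round(r, 6))  # 1.0 — two length-2 paths in parallel: (2*2)/(2+2)
```

This matches what `resistance_distance` returns for the same pair, but requires materializing the dense pseudo-inverse, which is the cost the comment is avoiding.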
diff --git a/networkx/algorithms/dominating.py b/networkx/algorithms/dominating.py
index 32fff4d..042e2b8 100644
--- a/networkx/algorithms/dominating.py
+++ b/networkx/algorithms/dominating.py
@@ -64,6 +64,7 @@ def dominating_set(G, start_with=None):
     return dominating_set
 
 
+@nx._dispatch
 def is_dominating_set(G, nbunch):
     """Checks if `nbunch` is a dominating set for `G`.
 
diff --git a/networkx/algorithms/efficiency_measures.py b/networkx/algorithms/efficiency_measures.py
index 45f19cd..3234399 100644
--- a/networkx/algorithms/efficiency_measures.py
+++ b/networkx/algorithms/efficiency_measures.py
@@ -28,6 +28,12 @@ def efficiency(G, u, v):
     float
         Multiplicative inverse of the shortest path distance between the nodes.
 
+    Examples
+    --------
+    >>> G = nx.Graph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
+    >>> nx.efficiency(G, 2, 3)  # this gives efficiency for nodes 2 and 3
+    0.5
+
     Notes
     -----
     Edge weights are ignored when computing the shortest path distances.
@@ -71,6 +77,12 @@ def global_efficiency(G):
     float
         The average global efficiency of the graph.
 
+    Examples
+    --------
+    >>> G = nx.Graph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
+    >>> round(nx.global_efficiency(G), 12)
+    0.916666666667
+
     Notes
     -----
     Edge weights are ignored when computing the shortest path distances.
@@ -126,6 +138,12 @@ def local_efficiency(G):
     float
         The average local efficiency of the graph.
 
+    Examples
+    --------
+    >>> G = nx.Graph([(0, 1), (0, 2), (0, 3), (1, 2), (1, 3)])
+    >>> nx.local_efficiency(G)
+    0.9166666666666667
+
     Notes
     -----
     Edge weights are ignored when computing the shortest path distances.
diff --git a/networkx/algorithms/euler.py b/networkx/algorithms/euler.py
index e50a0e9..b68d5a8 100644
--- a/networkx/algorithms/euler.py
+++ b/networkx/algorithms/euler.py
@@ -382,7 +382,10 @@ def eulerian_path(G, source=None, keys=False):
 
 @not_implemented_for("directed")
 def eulerize(G):
-    """Transforms a graph into an Eulerian graph
+    """Transforms a graph into an Eulerian graph.
+
+    If `G` is Eulerian, the result is `G` as a MultiGraph; otherwise the result is a smallest
+    (in terms of the number of edges) multigraph whose underlying simple graph is `G`.
 
     Parameters
     ----------
@@ -434,13 +437,21 @@ def eulerize(G):
         for m, n in combinations(odd_degree_nodes, 2)
     ]
 
-    # use inverse path lengths as edge-weights in a new graph
+    # use the number of vertices in a graph + 1 as an upper bound on
+    # the maximum length of a path in G
+    upper_bound_on_max_path_length = len(G) + 1
+
+    # use "len(G) + 1 - len(P)",
+    # where P is a shortest path between vertices n and m,
+    # as edge-weights in a new graph
     # store the paths in the graph for easy indexing later
     Gp = nx.Graph()
     for n, Ps in odd_deg_pairs_paths:
         for m, P in Ps.items():
             if n != m:
-                Gp.add_edge(m, n, weight=1 / len(P), path=P)
+                Gp.add_edge(
+                    m, n, weight=upper_bound_on_max_path_length - len(P), path=P
+                )
 
     # find the minimum weight matching of edges in the weighted graph
     best_matching = nx.Graph(list(nx.max_weight_matching(Gp)))
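The hunk above replaces the fractional `1 / len(P)` matching weight with the integer score `len(G) + 1 - len(P)`, so `max_weight_matching` still prefers short augmenting paths but without floating-point round-off. A small end-to-end check of `eulerize` itself:

```python
import networkx as nx

# A 4-node path: the two endpoints have odd degree, so eulerize
# duplicates the shortest path between them, making every degree even.
G = nx.path_graph(4)
H = nx.eulerize(G)
print(nx.is_eulerian(H))    # True
print(H.number_of_edges())  # 6 (3 original edges + 3 duplicated)
```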
diff --git a/networkx/algorithms/flow/capacityscaling.py b/networkx/algorithms/flow/capacityscaling.py
index b565077..374c104 100644
--- a/networkx/algorithms/flow/capacityscaling.py
+++ b/networkx/algorithms/flow/capacityscaling.py
@@ -290,7 +290,7 @@ def capacity_scaling(
         for u, v, e in nx.selfloop_edges(G, data=True)
     )
 
-    # Determine the maxmimum edge capacity.
+    # Determine the maximum edge capacity.
     wmax = max(chain([-inf], (e["capacity"] for u, v, e in R.edges(data=True))))
     if wmax == -inf:
         # Residual network has no edges.
diff --git a/networkx/algorithms/flow/maxflow.py b/networkx/algorithms/flow/maxflow.py
index 8d2fb8f..b0098d0 100644
--- a/networkx/algorithms/flow/maxflow.py
+++ b/networkx/algorithms/flow/maxflow.py
@@ -12,14 +12,6 @@ from .utils import build_flow_dict
 
 # Define the default flow function for computing maximum flow.
 default_flow_func = preflow_push
-# Functions that don't support cutoff for minimum cut computations.
-flow_funcs = [
-    boykov_kolmogorov,
-    dinitz,
-    edmonds_karp,
-    preflow_push,
-    shortest_augmenting_path,
-]
 
 __all__ = ["maximum_flow", "maximum_flow_value", "minimum_cut", "minimum_cut_value"]
 
@@ -452,7 +444,7 @@ def minimum_cut(flowG, _s, _t, capacity="capacity", flow_func=None, **kwargs):
     if not callable(flow_func):
         raise nx.NetworkXError("flow_func has to be callable.")
 
-    if kwargs.get("cutoff") is not None and flow_func in flow_funcs:
+    if kwargs.get("cutoff") is not None and flow_func is preflow_push:
         raise nx.NetworkXError("cutoff should not be specified.")
 
     R = flow_func(flowG, _s, _t, capacity=capacity, value_only=True, **kwargs)
@@ -603,7 +595,7 @@ def minimum_cut_value(flowG, _s, _t, capacity="capacity", flow_func=None, **kwar
     if not callable(flow_func):
         raise nx.NetworkXError("flow_func has to be callable.")
 
-    if kwargs.get("cutoff") is not None and flow_func in flow_funcs:
+    if kwargs.get("cutoff") is not None and flow_func is preflow_push:
         raise nx.NetworkXError("cutoff should not be specified.")
 
     R = flow_func(flowG, _s, _t, capacity=capacity, value_only=True, **kwargs)
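The change above narrows the `cutoff` rejection from all five flow functions to `preflow_push` alone, the one algorithm with no early-termination support. The error path that survives the change can be exercised like this (it behaves the same before and after):

```python
import networkx as nx
from networkx.algorithms.flow import preflow_push

G = nx.DiGraph()
G.add_edge("x", "a", capacity=3.0)
G.add_edge("a", "y", capacity=2.0)

# preflow_push has no cutoff support, so passing one is rejected.
try:
    nx.minimum_cut_value(G, "x", "y", flow_func=preflow_push, cutoff=1.0)
    raised = False
except nx.NetworkXError:
    raised = True
print(raised)  # True
```

After this change, the other flow functions accept `cutoff` and forward it, which is what the simplified tests below verify by only checking `preflow_push`.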
diff --git a/networkx/algorithms/flow/tests/test_maxflow.py b/networkx/algorithms/flow/tests/test_maxflow.py
index 6448a76..a672caa 100644
--- a/networkx/algorithms/flow/tests/test_maxflow.py
+++ b/networkx/algorithms/flow/tests/test_maxflow.py
@@ -20,6 +20,7 @@ flow_funcs = {
     preflow_push,
     shortest_augmenting_path,
 }
+
 max_min_funcs = {nx.maximum_flow, nx.minimum_cut}
 flow_value_funcs = {nx.maximum_flow_value, nx.minimum_cut_value}
 interface_funcs = max_min_funcs & flow_value_funcs
@@ -427,25 +428,24 @@ class TestMaxFlowMinCutInterface:
 
     def test_minimum_cut_no_cutoff(self):
         G = self.G
-        for flow_func in flow_funcs:
-            pytest.raises(
-                nx.NetworkXError,
-                nx.minimum_cut,
-                G,
-                "x",
-                "y",
-                flow_func=flow_func,
-                cutoff=1.0,
-            )
-            pytest.raises(
-                nx.NetworkXError,
-                nx.minimum_cut_value,
-                G,
-                "x",
-                "y",
-                flow_func=flow_func,
-                cutoff=1.0,
-            )
+        pytest.raises(
+            nx.NetworkXError,
+            nx.minimum_cut,
+            G,
+            "x",
+            "y",
+            flow_func=preflow_push,
+            cutoff=1.0,
+        )
+        pytest.raises(
+            nx.NetworkXError,
+            nx.minimum_cut_value,
+            G,
+            "x",
+            "y",
+            flow_func=preflow_push,
+            cutoff=1.0,
+        )
 
     def test_kwargs(self):
         G = self.H
diff --git a/networkx/algorithms/flow/tests/test_maxflow_large_graph.py b/networkx/algorithms/flow/tests/test_maxflow_large_graph.py
index c62c0a9..7adf9de 100644
--- a/networkx/algorithms/flow/tests/test_maxflow_large_graph.py
+++ b/networkx/algorithms/flow/tests/test_maxflow_large_graph.py
@@ -1,7 +1,9 @@
 """Maximum flow algorithms test suite on large graphs.
 """
 
+import bz2
 import os
+import pickle
 
 import pytest
 
@@ -47,8 +49,10 @@ def gen_pyramid(N):
 
 def read_graph(name):
     dirname = os.path.dirname(__file__)
-    path = os.path.join(dirname, name + ".gpickle.bz2")
-    return nx.read_gpickle(path)
+    fname = os.path.join(dirname, name + ".gpickle.bz2")
+    with bz2.BZ2File(fname, "rb") as f:
+        G = pickle.load(f)
+    return G
 
 
 def validate_flows(G, s, t, soln_value, R, flow_func):
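The `read_graph` rewrite above replaces the removed `nx.read_gpickle` helper with an explicit `bz2` + `pickle` load. A self-contained round-trip of the same idiom (writing to a temporary file so nothing in the test tree is touched):

```python
import bz2
import os
import pickle
import tempfile

import networkx as nx

# Round-trip replacement for the removed read_gpickle / write_gpickle.
G = nx.path_graph(4)
fname = os.path.join(tempfile.mkdtemp(), "graph.gpickle.bz2")
with bz2.BZ2File(fname, "wb") as f:
    pickle.dump(G, f, pickle.HIGHEST_PROTOCOL)
with bz2.BZ2File(fname, "rb") as f:
    H = pickle.load(f)
print(sorted(H.edges()))  # [(0, 1), (1, 2), (2, 3)]
```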
diff --git a/networkx/algorithms/flow/tests/test_mincost.py b/networkx/algorithms/flow/tests/test_mincost.py
index 5a8c2d7..c49d3e7 100644
--- a/networkx/algorithms/flow/tests/test_mincost.py
+++ b/networkx/algorithms/flow/tests/test_mincost.py
@@ -1,4 +1,6 @@
+import bz2
 import os
+import pickle
 
 import pytest
 
@@ -460,7 +462,8 @@ class TestMinCostFlow:
 
     def test_large(self):
         fname = os.path.join(os.path.dirname(__file__), "netgen-2.gpickle.bz2")
-        G = nx.read_gpickle(fname)
+        with bz2.BZ2File(fname, "rb") as f:
+            G = pickle.load(f)
         flowCost, flowDict = nx.network_simplex(G)
         assert 6749969302 == flowCost
         assert 6749969302 == nx.cost_of_flow(G, flowDict)
diff --git a/networkx/algorithms/flow/tests/test_networksimplex.py b/networkx/algorithms/flow/tests/test_networksimplex.py
index 0c25db9..78800f5 100644
--- a/networkx/algorithms/flow/tests/test_networksimplex.py
+++ b/networkx/algorithms/flow/tests/test_networksimplex.py
@@ -1,4 +1,6 @@
+import bz2
 import os
+import pickle
 
 import pytest
 
@@ -140,7 +142,8 @@ def test_google_or_tools_example2():
 
 def test_large():
     fname = os.path.join(os.path.dirname(__file__), "netgen-2.gpickle.bz2")
-    G = nx.read_gpickle(fname)
+    with bz2.BZ2File(fname, "rb") as f:
+        G = pickle.load(f)
     flowCost, flowDict = nx.network_simplex(G)
     assert 6749969302 == flowCost
     assert 6749969302 == nx.cost_of_flow(G, flowDict)
diff --git a/networkx/algorithms/isolate.py b/networkx/algorithms/isolate.py
index e81e722..f998328 100644
--- a/networkx/algorithms/isolate.py
+++ b/networkx/algorithms/isolate.py
@@ -1,10 +1,12 @@
 """
 Functions for identifying isolate (degree zero) nodes.
 """
+import networkx as nx
 
 __all__ = ["is_isolate", "isolates", "number_of_isolates"]
 
 
+@nx._dispatch
 def is_isolate(G, n):
     """Determines whether a node is an isolate.
 
@@ -37,6 +39,7 @@ def is_isolate(G, n):
     return G.degree(n) == 0
 
 
+@nx._dispatch
 def isolates(G):
     """Iterator over isolates in the graph.
 
@@ -82,6 +85,7 @@ def isolates(G):
     return (n for n, d in G.degree() if d == 0)
 
 
+@nx._dispatch
 def number_of_isolates(G):
     """Returns the number of isolates in the graph.
 
diff --git a/networkx/algorithms/isomorphism/__init__.py b/networkx/algorithms/isomorphism/__init__.py
index ddcedea..58c2268 100644
--- a/networkx/algorithms/isomorphism/__init__.py
+++ b/networkx/algorithms/isomorphism/__init__.py
@@ -4,3 +4,4 @@ from networkx.algorithms.isomorphism.matchhelpers import *
 from networkx.algorithms.isomorphism.temporalisomorphvf2 import *
 from networkx.algorithms.isomorphism.ismags import *
 from networkx.algorithms.isomorphism.tree_isomorphism import *
+from networkx.algorithms.isomorphism.vf2pp import *
diff --git a/networkx/algorithms/isomorphism/ismags.py b/networkx/algorithms/isomorphism/ismags.py
index bfb5eea..3526161 100644
--- a/networkx/algorithms/isomorphism/ismags.py
+++ b/networkx/algorithms/isomorphism/ismags.py
@@ -136,7 +136,7 @@ def are_all_equal(iterable):
         pass
     else:
         if len(shape) > 1:
-            message = "The function does not works on multidimension arrays."
+            message = "The function does not work on multidimensional arrays."
             raise NotImplementedError(message) from None
 
     iterator = iter(iterable)
@@ -226,7 +226,7 @@ def intersect(collection_of_sets):
 
 class ISMAGS:
     """
-    Implements the ISMAGS subgraph matching algorith. [1]_ ISMAGS stands for
+    Implements the ISMAGS subgraph matching algorithm. [1]_ ISMAGS stands for
     "Index-based Subgraph Matching Algorithm with General Symmetries". As the
     name implies, it is symmetry aware and will only generate non-symmetric
     isomorphisms.
@@ -587,7 +587,7 @@ class ISMAGS:
         graph : networkx.Graph
             The graph whose symmetry should be analyzed.
         node_partitions : list of sets
-            A list of sets containining node keys. Node keys in the same set
+            A list of sets containing node keys. Node keys in the same set
             are considered equivalent. Every node key in `graph` should be in
             exactly one of the sets. If all nodes are equivalent, this should
             be ``[set(graph.nodes)]``.
@@ -910,7 +910,7 @@ class ISMAGS:
         # "part of" the subgraph in to_be_mapped, and we make it a little
         # smaller every iteration.
 
-        # pylint disable becuase it's guarded against by default value
+        # pylint disable because it's guarded against by default value
         current_size = len(
             next(iter(to_be_mapped), [])
         )  # pylint: disable=stop-iteration-return
diff --git a/networkx/algorithms/isomorphism/isomorph.py b/networkx/algorithms/isomorphism/isomorph.py
index 1b9a727..b4de3f5 100644
--- a/networkx/algorithms/isomorphism/isomorph.py
+++ b/networkx/algorithms/isomorphism/isomorph.py
@@ -24,6 +24,10 @@ def could_be_isomorphic(G1, G2):
     Notes
     -----
     Checks for matching degree, triangle, and number of cliques sequences.
+    The triangle sequence contains the number of triangles each node is part of.
+    The clique sequence contains for each node the number of maximal cliques
+    involving that node.
+
     """
 
     # Check global properties
@@ -33,13 +37,15 @@ def could_be_isomorphic(G1, G2):
     # Check local properties
     d1 = G1.degree()
     t1 = nx.triangles(G1)
-    c1 = nx.number_of_cliques(G1)
+    clqs_1 = list(nx.find_cliques(G1))
+    c1 = {n: sum(1 for c in clqs_1 if n in c) for n in G1}  # number of cliques
     props1 = [[d, t1[v], c1[v]] for v, d in d1]
     props1.sort()
 
     d2 = G2.degree()
     t2 = nx.triangles(G2)
-    c2 = nx.number_of_cliques(G2)
+    clqs_2 = list(nx.find_cliques(G2))
+    c2 = {n: sum(1 for c in clqs_2 if n in c) for n in G2}  # number of cliques
     props2 = [[d, t2[v], c2[v]] for v, d in d2]
     props2.sort()
 
@@ -65,7 +71,8 @@ def fast_could_be_isomorphic(G1, G2):
 
     Notes
     -----
-    Checks for matching degree and triangle sequences.
+    Checks for matching degree and triangle sequences. The triangle
+    sequence contains the number of triangles each node is part of.
     """
     # Check global properties
     if G1.order() != G2.order():
@@ -219,7 +226,7 @@ def is_isomorphic(G1, G2, node_match=None, edge_match=None):
        "An Improved Algorithm for Matching Large Graphs",
        3rd IAPR-TC15 Workshop  on Graph-based Representations in
        Pattern Recognition, Cuen, pp. 149-159, 2001.
-       https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.101.5342
+       https://www.researchgate.net/publication/200034365_An_Improved_Algorithm_for_Matching_Large_Graphs
     """
     if G1.is_directed() and G2.is_directed():
         GM = nx.algorithms.isomorphism.DiGraphMatcher
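The `could_be_isomorphic` hunk above inlines the per-node maximal-clique count that the removed `nx.number_of_cliques` helper used to provide. The same computation in isolation, on a triangle with a pendant edge:

```python
import networkx as nx

# Triangle {0, 1, 2} plus pendant edge (1, 3): two maximal cliques.
G = nx.Graph([(0, 1), (0, 2), (1, 2), (1, 3)])
cliques = list(nx.find_cliques(G))
counts = {n: sum(1 for c in cliques if n in c) for n in G}
print(counts)  # {0: 1, 1: 2, 2: 1, 3: 1} — node 1 sits in both cliques
```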
diff --git a/networkx/algorithms/isomorphism/isomorphvf2.py b/networkx/algorithms/isomorphism/isomorphvf2.py
index bcd478e..878924e 100644
--- a/networkx/algorithms/isomorphism/isomorphvf2.py
+++ b/networkx/algorithms/isomorphism/isomorphvf2.py
@@ -116,7 +116,7 @@ References
       Algorithm for Matching Large Graphs", 3rd IAPR-TC15 Workshop
       on Graph-based Representations in Pattern Recognition, Cuen,
       pp. 149-159, 2001.
-      https://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.101.5342
+      https://www.researchgate.net/publication/200034365_An_Improved_Algorithm_for_Matching_Large_Graphs
 
 See Also
 --------
diff --git a/networkx/algorithms/isomorphism/tests/test_isomorphvf2.py b/networkx/algorithms/isomorphism/tests/test_isomorphvf2.py
index 5d3f41b..8040310 100644
--- a/networkx/algorithms/isomorphism/tests/test_isomorphvf2.py
+++ b/networkx/algorithms/isomorphism/tests/test_isomorphvf2.py
@@ -59,7 +59,7 @@ class TestWikipediaExample:
 
         mapping = sorted(gm.mapping.items())
 
-    # this mapping is only one of the possibilies
+    # this mapping is only one of the possibilities
     # so this test needs to be reconsidered
     #        isomap = [('a', 1), ('b', 6), ('c', 3), ('d', 8),
     #                  ('g', 2), ('h', 5), ('i', 4), ('j', 7)]
@@ -401,3 +401,9 @@ def test_monomorphism_edge_match():
 
     gm = iso.DiGraphMatcher(G, SG, edge_match=iso.categorical_edge_match("label", None))
     assert gm.subgraph_is_monomorphic()
+
+
+def test_isomorphvf2pp_multidigraphs():
+    g = nx.MultiDiGraph({0: [1, 1, 2, 2, 3], 1: [2, 3, 3], 2: [3]})
+    h = nx.MultiDiGraph({0: [1, 1, 2, 2, 3], 1: [2, 3, 3], 3: [2]})
+    assert not nx.vf2pp_is_isomorphic(g, h)
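The multidigraph pair in the new test is a good regression case because reversing the single edge between 2 and 3 breaks isomorphism even though both graphs share the same multiset of (in-degree, out-degree) pairs. The long-standing VF2 matcher reaches the same verdict, which can be checked on builds without `vf2pp`:

```python
import networkx as nx

g = nx.MultiDiGraph({0: [1, 1, 2, 2, 3], 1: [2, 3, 3], 2: [3]})
h = nx.MultiDiGraph({0: [1, 1, 2, 2, 3], 1: [2, 3, 3], 3: [2]})
print(nx.is_isomorphic(g, h))  # False
```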
diff --git a/networkx/algorithms/isomorphism/tests/test_vf2pp.py b/networkx/algorithms/isomorphism/tests/test_vf2pp.py
new file mode 100644
index 0000000..2cf40d0
--- /dev/null
+++ b/networkx/algorithms/isomorphism/tests/test_vf2pp.py
@@ -0,0 +1,1608 @@
+import itertools as it
+
+import pytest
+
+import networkx as nx
+from networkx import vf2pp_is_isomorphic, vf2pp_isomorphism
+
+labels_same = ["blue"]
+
+labels_many = [
+    "white",
+    "red",
+    "blue",
+    "green",
+    "orange",
+    "black",
+    "purple",
+    "yellow",
+    "brown",
+    "cyan",
+    "solarized",
+    "pink",
+    "none",
+]
+
+
+class TestPreCheck:
+    def test_first_graph_empty(self):
+        G1 = nx.Graph()
+        G2 = nx.Graph([(0, 1), (1, 2)])
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+    def test_second_graph_empty(self):
+        G1 = nx.Graph([(0, 1), (1, 2)])
+        G2 = nx.Graph()
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+    def test_different_order1(self):
+        G1 = nx.path_graph(5)
+        G2 = nx.path_graph(6)
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+    def test_different_order2(self):
+        G1 = nx.barbell_graph(100, 20)
+        G2 = nx.barbell_graph(101, 20)
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+    def test_different_order3(self):
+        G1 = nx.complete_graph(7)
+        G2 = nx.complete_graph(8)
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+    def test_different_degree_sequences1(self):
+        G1 = nx.Graph([(0, 1), (0, 2), (1, 2), (1, 3), (0, 4)])
+        G2 = nx.Graph([(0, 1), (0, 2), (1, 2), (1, 3), (0, 4), (2, 5)])
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+        G2.remove_node(3)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(["a"]))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle("a"))), "label")
+
+        assert vf2pp_is_isomorphic(G1, G2)
+
+    def test_different_degree_sequences2(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (0, 2),
+                (2, 3),
+                (3, 4),
+                (4, 5),
+                (5, 6),
+                (6, 3),
+                (4, 7),
+                (7, 8),
+                (8, 3),
+            ]
+        )
+        G2 = G1.copy()
+        G2.add_edge(8, 0)
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+        G1.add_edge(6, 1)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(["a"]))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle("a"))), "label")
+
+        assert vf2pp_is_isomorphic(G1, G2)
+
+    def test_different_degree_sequences3(self):
+        G1 = nx.Graph([(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4), (2, 5), (2, 6)])
+        G2 = nx.Graph(
+            [(0, 1), (0, 6), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4), (2, 5), (2, 6)]
+        )
+        assert not vf2pp_is_isomorphic(G1, G2)
+
+        G1.add_edge(3, 5)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(["a"]))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle("a"))), "label")
+
+        assert vf2pp_is_isomorphic(G1, G2)
+
+    def test_label_distribution(self):
+        G1 = nx.Graph([(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4), (2, 5), (2, 6)])
+        G2 = nx.Graph([(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4), (2, 5), (2, 6)])
+
+        colors1 = ["blue", "blue", "blue", "yellow", "black", "purple", "purple"]
+        colors2 = ["blue", "blue", "yellow", "yellow", "black", "purple", "purple"]
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(colors1[::-1]))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(colors2[::-1]))), "label")
+
+        assert not vf2pp_is_isomorphic(G1, G2, node_label="label")
+        G2.nodes[3]["label"] = "blue"
+        assert vf2pp_is_isomorphic(G1, G2, node_label="label")
+
+
+class TestAllGraphTypesEdgeCases:
+    @pytest.mark.parametrize("graph_type", (nx.Graph, nx.MultiGraph, nx.DiGraph))
+    def test_both_graphs_empty(self, graph_type):
+        G = graph_type()
+        H = graph_type()
+        assert vf2pp_isomorphism(G, H) is None
+
+        G.add_node(0)
+
+        assert vf2pp_isomorphism(G, H) is None
+        assert vf2pp_isomorphism(H, G) is None
+
+        H.add_node(0)
+        assert vf2pp_isomorphism(G, H) == {0: 0}
+
+    @pytest.mark.parametrize("graph_type", (nx.Graph, nx.MultiGraph, nx.DiGraph))
+    def test_first_graph_empty(self, graph_type):
+        G = graph_type()
+        H = graph_type([(0, 1)])
+        assert vf2pp_isomorphism(G, H) is None
+
+    @pytest.mark.parametrize("graph_type", (nx.Graph, nx.MultiGraph, nx.DiGraph))
+    def test_second_graph_empty(self, graph_type):
+        G = graph_type([(0, 1)])
+        H = graph_type()
+        assert vf2pp_isomorphism(G, H) is None
+
+
+class TestGraphISOVF2pp:
+    def test_custom_graph1_same_labels(self):
+        G1 = nx.Graph()
+
+        mapped = {1: "A", 2: "B", 3: "C", 4: "D", 5: "Z", 6: "E"}
+        edges1 = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 6), (3, 4), (5, 1), (5, 2)]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Add edge making G1 symmetrical
+        G1.add_edge(3, 7)
+        G1.nodes[7]["label"] = "blue"
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Make G2 isomorphic to G1
+        G2.add_edges_from([(mapped[3], "X"), (mapped[6], mapped[5])])
+        G1.add_edge(4, 7)
+        G2.nodes["X"]["label"] = "blue"
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Re-structure maintaining isomorphism
+        G1.remove_edges_from([(1, 4), (1, 3)])
+        G2.remove_edges_from([(mapped[1], mapped[5]), (mapped[1], mapped[2])])
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+    def test_custom_graph1_different_labels(self):
+        G1 = nx.Graph()
+
+        mapped = {1: "A", 2: "B", 3: "C", 4: "D", 5: "Z", 6: "E"}
+        edges1 = [(1, 2), (1, 3), (1, 4), (2, 3), (2, 6), (3, 4), (5, 1), (5, 2)]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+    def test_custom_graph2_same_labels(self):
+        G1 = nx.Graph()
+
+        mapped = {1: "A", 2: "C", 3: "D", 4: "E", 5: "G", 7: "B", 6: "F"}
+        edges1 = [(1, 2), (1, 5), (5, 6), (2, 3), (2, 4), (3, 4), (4, 5), (2, 7)]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Obtain two isomorphic subgraphs from the graph
+        G2.remove_edge(mapped[1], mapped[2])
+        G2.add_edge(mapped[1], mapped[4])
+        H1 = nx.Graph(G1.subgraph([2, 3, 4, 7]))
+        H2 = nx.Graph(G2.subgraph([mapped[1], mapped[4], mapped[5], mapped[6]]))
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+        # Add edges maintaining isomorphism
+        H1.add_edges_from([(3, 7), (4, 7)])
+        H2.add_edges_from([(mapped[1], mapped[6]), (mapped[4], mapped[6])])
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+    def test_custom_graph2_different_labels(self):
+        G1 = nx.Graph()
+
+        mapped = {1: "A", 2: "C", 3: "D", 4: "E", 5: "G", 7: "B", 6: "F"}
+        edges1 = [(1, 2), (1, 5), (5, 6), (2, 3), (2, 4), (3, 4), (4, 5), (2, 7)]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+
+        # Adding new nodes
+        G1.add_node(0)
+        G2.add_node("Z")
+        G1.nodes[0]["label"] = G1.nodes[1]["label"]
+        G2.nodes["Z"]["label"] = G1.nodes[1]["label"]
+        mapped.update({0: "Z"})
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+        # Change the color of one of the nodes
+        G2.nodes["Z"]["label"] = G1.nodes[2]["label"]
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Add an extra edge
+        G1.nodes[0]["label"] = "blue"
+        G2.nodes["Z"]["label"] = "blue"
+        G1.add_edge(0, 1)
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Add extra edge to both
+        G2.add_edge("Z", "A")
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+    def test_custom_graph3_same_labels(self):
+        G1 = nx.Graph()
+
+        mapped = {1: 9, 2: 8, 3: 7, 4: 6, 5: 3, 8: 5, 9: 4, 7: 1, 6: 2}
+        edges1 = [
+            (1, 2),
+            (1, 3),
+            (2, 3),
+            (3, 4),
+            (4, 5),
+            (4, 7),
+            (4, 9),
+            (5, 8),
+            (8, 9),
+            (5, 6),
+            (6, 7),
+            (5, 2),
+        ]
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Connect nodes maintaining symmetry
+        G1.add_edges_from([(6, 9), (7, 8)])
+        G2.add_edges_from([(mapped[6], mapped[8]), (mapped[7], mapped[9])])
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Make isomorphic
+        G1.add_edges_from([(6, 8), (7, 9)])
+        G2.add_edges_from([(mapped[6], mapped[9]), (mapped[7], mapped[8])])
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Connect more nodes
+        G1.add_edges_from([(2, 7), (3, 6)])
+        G2.add_edges_from([(mapped[2], mapped[7]), (mapped[3], mapped[6])])
+        G1.add_node(10)
+        G2.add_node("Z")
+        G1.nodes[10]["label"] = "blue"
+        G2.nodes["Z"]["label"] = "blue"
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Connect the newly added node to opposite sides of the graph
+        G1.add_edges_from([(10, 1), (10, 5), (10, 8)])
+        G2.add_edges_from([("Z", mapped[1]), ("Z", mapped[4]), ("Z", mapped[9])])
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Get two subgraphs that are not isomorphic but are easy to make isomorphic
+        H1 = nx.Graph(G1.subgraph([2, 3, 4, 5, 6, 7, 10]))
+        H2 = nx.Graph(
+            G2.subgraph(
+                [mapped[4], mapped[5], mapped[6], mapped[7], mapped[8], mapped[9], "Z"]
+            )
+        )
+        assert vf2pp_isomorphism(H1, H2, node_label="label") is None
+
+        # Restructure both to make them isomorphic
+        H1.add_edges_from([(10, 2), (10, 6), (3, 6), (2, 7), (2, 6), (3, 7)])
+        H2.add_edges_from(
+            [("Z", mapped[7]), (mapped[6], mapped[9]), (mapped[7], mapped[8])]
+        )
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+        # Add edges with opposite direction in each Graph
+        H1.add_edge(3, 5)
+        H2.add_edge(mapped[5], mapped[7])
+        assert vf2pp_isomorphism(H1, H2, node_label="label") is None
+
+    def test_custom_graph3_different_labels(self):
+        G1 = nx.Graph()
+
+        mapped = {1: 9, 2: 8, 3: 7, 4: 6, 5: 3, 8: 5, 9: 4, 7: 1, 6: 2}
+        edges1 = [
+            (1, 2),
+            (1, 3),
+            (2, 3),
+            (3, 4),
+            (4, 5),
+            (4, 7),
+            (4, 9),
+            (5, 8),
+            (8, 9),
+            (5, 6),
+            (6, 7),
+            (5, 2),
+        ]
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+        # Add extra edge to G1
+        G1.add_edge(1, 7)
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Compensate in G2
+        G2.add_edge(9, 1)
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+        # Add extra node
+        G1.add_node("A")
+        G2.add_node("K")
+        G1.nodes["A"]["label"] = "green"
+        G2.nodes["K"]["label"] = "green"
+        mapped.update({"A": "K"})
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+        # Connect A to one side of G1 and K to the opposite
+        G1.add_edge("A", 6)
+        G2.add_edge("K", 5)
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Make the graphs symmetrical
+        G1.add_edge(1, 5)
+        G1.add_edge(2, 9)
+        G2.add_edge(9, 3)
+        G2.add_edge(8, 4)
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Assign the same color to every node, so the two opposite sides are identical
+        for node in G1.nodes():
+            G1.nodes[node]["label"] = "red"
+            G2.nodes[mapped[node]]["label"] = "red"
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+    def test_custom_graph4_different_labels(self):
+        G1 = nx.Graph()
+        edges1 = [
+            (1, 2),
+            (2, 3),
+            (3, 8),
+            (3, 4),
+            (4, 5),
+            (4, 6),
+            (3, 6),
+            (8, 7),
+            (8, 9),
+            (5, 9),
+            (10, 11),
+            (11, 12),
+            (12, 13),
+            (11, 13),
+        ]
+
+        mapped = {
+            1: "n",
+            2: "m",
+            3: "l",
+            4: "j",
+            5: "k",
+            6: "i",
+            7: "g",
+            8: "h",
+            9: "f",
+            10: "b",
+            11: "a",
+            12: "d",
+            13: "e",
+        }
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+    def test_custom_graph4_same_labels(self):
+        G1 = nx.Graph()
+        edges1 = [
+            (1, 2),
+            (2, 3),
+            (3, 8),
+            (3, 4),
+            (4, 5),
+            (4, 6),
+            (3, 6),
+            (8, 7),
+            (8, 9),
+            (5, 9),
+            (10, 11),
+            (11, 12),
+            (12, 13),
+            (11, 13),
+        ]
+
+        mapped = {
+            1: "n",
+            2: "m",
+            3: "l",
+            4: "j",
+            5: "k",
+            6: "i",
+            7: "g",
+            8: "h",
+            9: "f",
+            10: "b",
+            11: "a",
+            12: "d",
+            13: "e",
+        }
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Add nodes of different label
+        G1.add_node(0)
+        G2.add_node("z")
+        G1.nodes[0]["label"] = "green"
+        G2.nodes["z"]["label"] = "blue"
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Make the labels identical
+        G2.nodes["z"]["label"] = "green"
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Change the structure of the graphs, keeping them isomorphic
+        G1.add_edge(2, 5)
+        G2.remove_edge("i", "l")
+        G2.add_edge("g", "l")
+        G2.add_edge("m", "f")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Change the structure of the disconnected sub-graph, keeping it isomorphic
+        G1.remove_node(13)
+        G2.remove_node("d")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Connect the newly added node to the disconnected graph, which now is just a path of size 3
+        G1.add_edge(0, 10)
+        G2.add_edge("e", "z")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Connect the two disconnected sub-graphs, forming a single graph
+        G1.add_edge(11, 3)
+        G1.add_edge(0, 8)
+        G2.add_edge("a", "l")
+        G2.add_edge("z", "j")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+    def test_custom_graph5_same_labels(self):
+        G1 = nx.Graph()
+        edges1 = [
+            (1, 5),
+            (1, 2),
+            (1, 4),
+            (2, 3),
+            (2, 6),
+            (3, 4),
+            (3, 7),
+            (4, 8),
+            (5, 8),
+            (5, 6),
+            (6, 7),
+            (7, 8),
+        ]
+        mapped = {1: "a", 2: "h", 3: "d", 4: "i", 5: "g", 6: "b", 7: "j", 8: "c"}
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Add different edges in each graph, maintaining symmetry
+        G1.add_edges_from([(3, 6), (2, 7), (2, 5), (1, 3), (4, 7), (6, 8)])
+        G2.add_edges_from(
+            [
+                (mapped[6], mapped[3]),
+                (mapped[2], mapped[7]),
+                (mapped[1], mapped[6]),
+                (mapped[5], mapped[7]),
+                (mapped[3], mapped[8]),
+                (mapped[2], mapped[4]),
+            ]
+        )
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+        # Obtain two different but isomorphic subgraphs from G1 and G2
+        H1 = nx.Graph(G1.subgraph([1, 5, 8, 6, 7, 3]))
+        H2 = nx.Graph(
+            G2.subgraph(
+                [mapped[1], mapped[4], mapped[8], mapped[7], mapped[3], mapped[5]]
+            )
+        )
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+        # Delete a corresponding node from each of the two graphs
+        H1.remove_node(8)
+        H2.remove_node(mapped[7])
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+        # Re-orient, maintaining isomorphism
+        H1.add_edge(1, 6)
+        H1.remove_edge(3, 6)
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+    def test_custom_graph5_different_labels(self):
+        G1 = nx.Graph()
+        edges1 = [
+            (1, 5),
+            (1, 2),
+            (1, 4),
+            (2, 3),
+            (2, 6),
+            (3, 4),
+            (3, 7),
+            (4, 8),
+            (5, 8),
+            (5, 6),
+            (6, 7),
+            (7, 8),
+        ]
+        mapped = {1: "a", 2: "h", 3: "d", 4: "i", 5: "g", 6: "b", 7: "j", 8: "c"}
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        colors = ["red", "blue", "grey", "none", "brown", "solarized", "yellow", "pink"]
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+        # Assign different colors to matching nodes
+        for c, node in enumerate(G1.nodes()):
+            G1.nodes[node]["label"] = colors[c]
+            G2.nodes[mapped[node]]["label"] = colors[(c + 3) % len(colors)]
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label") is None
+
+        # Get symmetrical sub-graphs of G1, G2 and compare them
+        H1 = G1.subgraph([1, 5])
+        H2 = G2.subgraph(["i", "c"])
+        for node1, node2 in zip(H1.nodes(), H2.nodes()):
+            H1.nodes[node1]["label"] = "red"
+            H2.nodes[node2]["label"] = "red"
+
+        assert vf2pp_isomorphism(H1, H2, node_label="label")
+
+    def test_disconnected_graph_all_same_labels(self):
+        G1 = nx.Graph()
+        G1.add_nodes_from(range(10))
+
+        mapped = {0: 9, 1: 8, 2: 7, 3: 6, 4: 5, 5: 4, 6: 3, 7: 2, 8: 1, 9: 0}
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+    def test_disconnected_graph_all_different_labels(self):
+        G1 = nx.Graph()
+        G1.add_nodes_from(range(10))
+
+        mapped = {0: 9, 1: 8, 2: 7, 3: 6, 4: 5, 5: 4, 6: 3, 7: 2, 8: 1, 9: 0}
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        assert vf2pp_isomorphism(G1, G2, node_label="label") == mapped
+
+    def test_disconnected_graph_some_same_labels(self):
+        G1 = nx.Graph()
+        G1.add_nodes_from(range(10))
+
+        mapped = {0: 9, 1: 8, 2: 7, 3: 6, 4: 5, 5: 4, 6: 3, 7: 2, 8: 1, 9: 0}
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        colors = [
+            "white",
+            "white",
+            "white",
+            "purple",
+            "purple",
+            "red",
+            "red",
+            "pink",
+            "pink",
+            "pink",
+        ]
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(colors))), "label")
+        nx.set_node_attributes(
+            G2, dict(zip([mapped[n] for n in G1], it.cycle(colors))), "label"
+        )
+
+        assert vf2pp_isomorphism(G1, G2, node_label="label")
+
+
+class TestMultiGraphISOVF2pp:
+    def test_custom_multigraph1_same_labels(self):
+        G1 = nx.MultiGraph()
+
+        mapped = {1: "A", 2: "B", 3: "C", 4: "D", 5: "Z", 6: "E"}
+        edges1 = [
+            (1, 2),
+            (1, 3),
+            (1, 4),
+            (1, 4),
+            (1, 4),
+            (2, 3),
+            (2, 6),
+            (2, 6),
+            (3, 4),
+            (3, 4),
+            (5, 1),
+            (5, 1),
+            (5, 2),
+            (5, 2),
+        ]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Transfer the 2-clique to the right side of G1
+        G1.remove_edges_from([(2, 6), (2, 6)])
+        G1.add_edges_from([(3, 6), (3, 6)])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Delete an edge from each, making them symmetrical, so the position of the 2-clique doesn't matter
+        G2.remove_edge(mapped[1], mapped[4])
+        G1.remove_edge(1, 4)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Add self-loops
+        G1.add_edges_from([(5, 5), (5, 5), (1, 1)])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Compensate in G2
+        G2.add_edges_from(
+            [(mapped[1], mapped[1]), (mapped[4], mapped[4]), (mapped[4], mapped[4])]
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+    def test_custom_multigraph1_different_labels(self):
+        G1 = nx.MultiGraph()
+
+        mapped = {1: "A", 2: "B", 3: "C", 4: "D", 5: "Z", 6: "E"}
+        edges1 = [
+            (1, 2),
+            (1, 3),
+            (1, 4),
+            (1, 4),
+            (1, 4),
+            (2, 3),
+            (2, 6),
+            (2, 6),
+            (3, 4),
+            (3, 4),
+            (5, 1),
+            (5, 1),
+            (5, 2),
+            (5, 2),
+        ]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Re-structure G1, maintaining the degree sequence
+        G1.remove_edge(1, 4)
+        G1.add_edge(1, 5)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Restructure G2, making it isomorphic to G1
+        G2.remove_edge("A", "D")
+        G2.add_edge("A", "Z")
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Add edges from a node to itself
+        G1.add_edges_from([(6, 6), (6, 6), (6, 6)])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Same for G2
+        G2.add_edges_from([("E", "E"), ("E", "E"), ("E", "E")])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+    def test_custom_multigraph2_same_labels(self):
+        G1 = nx.MultiGraph()
+
+        mapped = {1: "A", 2: "C", 3: "D", 4: "E", 5: "G", 7: "B", 6: "F"}
+        edges1 = [
+            (1, 2),
+            (1, 2),
+            (1, 5),
+            (1, 5),
+            (1, 5),
+            (5, 6),
+            (2, 3),
+            (2, 3),
+            (2, 4),
+            (3, 4),
+            (3, 4),
+            (4, 5),
+            (4, 5),
+            (4, 5),
+            (2, 7),
+            (2, 7),
+            (2, 7),
+        ]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Obtain two non-isomorphic subgraphs from the graph
+        G2.remove_edges_from([(mapped[1], mapped[2]), (mapped[1], mapped[2])])
+        G2.add_edge(mapped[1], mapped[4])
+        H1 = nx.MultiGraph(G1.subgraph([2, 3, 4, 7]))
+        H2 = nx.MultiGraph(G2.subgraph([mapped[1], mapped[4], mapped[5], mapped[6]]))
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+        # Make them isomorphic
+        H1.remove_edge(3, 4)
+        H1.add_edges_from([(2, 3), (2, 4), (2, 4)])
+        H2.add_edges_from([(mapped[5], mapped[6]), (mapped[5], mapped[6])])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Remove triangle edge
+        H1.remove_edges_from([(2, 3), (2, 3), (2, 3)])
+        H2.remove_edges_from([(mapped[5], mapped[4])] * 3)
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Change the edge orientation such that H1 is a rotated version of H2
+        H1.remove_edges_from([(2, 7), (2, 7)])
+        H1.add_edges_from([(3, 4), (3, 4)])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Add extra edges maintaining degree sequence, but in a non-symmetrical manner
+        H2.add_edge(mapped[5], mapped[1])
+        H1.add_edge(3, 4)
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+    def test_custom_multigraph2_different_labels(self):
+        G1 = nx.MultiGraph()
+
+        mapped = {1: "A", 2: "C", 3: "D", 4: "E", 5: "G", 7: "B", 6: "F"}
+        edges1 = [
+            (1, 2),
+            (1, 2),
+            (1, 5),
+            (1, 5),
+            (1, 5),
+            (5, 6),
+            (2, 3),
+            (2, 3),
+            (2, 4),
+            (3, 4),
+            (3, 4),
+            (4, 5),
+            (4, 5),
+            (4, 5),
+            (2, 7),
+            (2, 7),
+            (2, 7),
+        ]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Re-structure G1
+        G1.remove_edge(2, 7)
+        G1.add_edge(5, 6)
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Same for G2
+        G2.remove_edge("B", "C")
+        G2.add_edge("G", "F")
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Delete node from G1 and G2, keeping them isomorphic
+        G1.remove_node(3)
+        G2.remove_node("D")
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Change G1 edges
+        G1.remove_edge(1, 2)
+        G1.remove_edge(2, 7)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Make G2 identical to G1, but with different edge orientation and different labels
+        G2.add_edges_from([("A", "C"), ("C", "E"), ("C", "E")])
+        G2.remove_edges_from(
+            [("A", "G"), ("A", "G"), ("F", "G"), ("E", "G"), ("E", "G")]
+        )
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Make all labels the same, so G1 and G2 are also isomorphic
+        for n1, n2 in zip(G1.nodes(), G2.nodes()):
+            G1.nodes[n1]["label"] = "blue"
+            G2.nodes[n2]["label"] = "blue"
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+    def test_custom_multigraph3_same_labels(self):
+        G1 = nx.MultiGraph()
+
+        mapped = {1: 9, 2: 8, 3: 7, 4: 6, 5: 3, 8: 5, 9: 4, 7: 1, 6: 2}
+        edges1 = [
+            (1, 2),
+            (1, 3),
+            (1, 3),
+            (2, 3),
+            (2, 3),
+            (3, 4),
+            (4, 5),
+            (4, 7),
+            (4, 9),
+            (4, 9),
+            (4, 9),
+            (5, 8),
+            (5, 8),
+            (8, 9),
+            (8, 9),
+            (5, 6),
+            (6, 7),
+            (6, 7),
+            (6, 7),
+            (5, 2),
+        ]
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Connect nodes maintaining symmetry
+        G1.add_edges_from([(6, 9), (7, 8), (5, 8), (4, 9), (4, 9)])
+        G2.add_edges_from(
+            [
+                (mapped[6], mapped[8]),
+                (mapped[7], mapped[9]),
+                (mapped[5], mapped[8]),
+                (mapped[4], mapped[9]),
+                (mapped[4], mapped[9]),
+            ]
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Make isomorphic
+        G1.add_edges_from([(6, 8), (6, 8), (7, 9), (7, 9), (7, 9)])
+        G2.add_edges_from(
+            [
+                (mapped[6], mapped[8]),
+                (mapped[6], mapped[9]),
+                (mapped[7], mapped[8]),
+                (mapped[7], mapped[9]),
+                (mapped[7], mapped[9]),
+            ]
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Connect more nodes
+        G1.add_edges_from([(2, 7), (2, 7), (3, 6), (3, 6)])
+        G2.add_edges_from(
+            [
+                (mapped[2], mapped[7]),
+                (mapped[2], mapped[7]),
+                (mapped[3], mapped[6]),
+                (mapped[3], mapped[6]),
+            ]
+        )
+        G1.add_node(10)
+        G2.add_node("Z")
+        G1.nodes[10]["label"] = "blue"
+        G2.nodes["Z"]["label"] = "blue"
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Connect the newly added node to opposite sides of the graph
+        G1.add_edges_from([(10, 1), (10, 5), (10, 8), (10, 10), (10, 10)])
+        G2.add_edges_from(
+            [
+                ("Z", mapped[1]),
+                ("Z", mapped[4]),
+                ("Z", mapped[9]),
+                ("Z", "Z"),
+                ("Z", "Z"),
+            ]
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # We connected the new node to opposite sides, so G1 must be made symmetrical to G2. Re-structure them accordingly
+        G1.remove_edges_from([(1, 3), (4, 9), (4, 9), (7, 9)])
+        G2.remove_edges_from(
+            [
+                (mapped[1], mapped[3]),
+                (mapped[4], mapped[9]),
+                (mapped[4], mapped[9]),
+                (mapped[7], mapped[9]),
+            ]
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Get two subgraphs that are not isomorphic but are easy to make isomorphic
+        H1 = nx.Graph(G1.subgraph([2, 3, 4, 5, 6, 7, 10]))
+        H2 = nx.Graph(
+            G2.subgraph(
+                [mapped[4], mapped[5], mapped[6], mapped[7], mapped[8], mapped[9], "Z"]
+            )
+        )
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+        # Restructure both to make them isomorphic
+        H1.add_edges_from([(10, 2), (10, 6), (3, 6), (2, 7), (2, 6), (3, 7)])
+        H2.add_edges_from(
+            [("Z", mapped[7]), (mapped[6], mapped[9]), (mapped[7], mapped[8])]
+        )
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Remove one self-loop in H2
+        H2.remove_edge("Z", "Z")
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+        # Compensate in H1
+        H1.remove_edge(10, 10)
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+    def test_custom_multigraph3_different_labels(self):
+        G1 = nx.MultiGraph()
+
+        mapped = {1: 9, 2: 8, 3: 7, 4: 6, 5: 3, 8: 5, 9: 4, 7: 1, 6: 2}
+        edges1 = [
+            (1, 2),
+            (1, 3),
+            (1, 3),
+            (2, 3),
+            (2, 3),
+            (3, 4),
+            (4, 5),
+            (4, 7),
+            (4, 9),
+            (4, 9),
+            (4, 9),
+            (5, 8),
+            (5, 8),
+            (8, 9),
+            (8, 9),
+            (5, 6),
+            (6, 7),
+            (6, 7),
+            (6, 7),
+            (5, 2),
+        ]
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Delete edge maintaining isomorphism
+        G1.remove_edge(4, 9)
+        G2.remove_edge(4, 6)
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Change edge orientation such that G1 mirrors G2
+        G1.add_edges_from([(4, 9), (1, 2), (1, 2)])
+        G1.remove_edges_from([(1, 3), (1, 3)])
+        G2.add_edges_from([(3, 5), (7, 9)])
+        G2.remove_edge(8, 9)
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Make all labels the same, so G1 and G2 are also isomorphic
+        for n1, n2 in zip(G1.nodes(), G2.nodes()):
+            G1.nodes[n1]["label"] = "blue"
+            G2.nodes[n2]["label"] = "blue"
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        G1.add_node(10)
+        G2.add_node("Z")
+        G1.nodes[10]["label"] = "green"
+        G2.nodes["Z"]["label"] = "green"
+
+        # Add different number of edges between the new nodes and themselves
+        G1.add_edges_from([(10, 10), (10, 10)])
+        G2.add_edges_from([("Z", "Z")])
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Make the number of self-loops equal
+        G1.remove_edge(10, 10)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Connect the new node to the graph
+        G1.add_edges_from([(10, 3), (10, 4)])
+        G2.add_edges_from([("Z", 8), ("Z", 3)])
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Remove central node
+        G1.remove_node(4)
+        G2.remove_node(3)
+        G1.add_edges_from([(5, 6), (5, 6), (5, 7)])
+        G2.add_edges_from([(1, 6), (1, 6), (6, 2)])
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+    def test_custom_multigraph4_same_labels(self):
+        G1 = nx.MultiGraph()
+        edges1 = [
+            (1, 2),
+            (1, 2),
+            (2, 2),
+            (2, 3),
+            (3, 8),
+            (3, 8),
+            (3, 4),
+            (4, 5),
+            (4, 5),
+            (4, 5),
+            (4, 6),
+            (3, 6),
+            (3, 6),
+            (6, 6),
+            (8, 7),
+            (7, 7),
+            (8, 9),
+            (9, 9),
+            (8, 9),
+            (8, 9),
+            (5, 9),
+            (10, 11),
+            (11, 12),
+            (12, 13),
+            (11, 13),
+            (10, 10),
+            (10, 11),
+            (11, 13),
+        ]
+
+        mapped = {
+            1: "n",
+            2: "m",
+            3: "l",
+            4: "j",
+            5: "k",
+            6: "i",
+            7: "g",
+            8: "h",
+            9: "f",
+            10: "b",
+            11: "a",
+            12: "d",
+            13: "e",
+        }
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Add extra but corresponding edges to both graphs
+        G1.add_edges_from([(2, 2), (2, 3), (2, 8), (3, 4)])
+        G2.add_edges_from([("m", "m"), ("m", "l"), ("m", "h"), ("l", "j")])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Obtain subgraphs
+        H1 = nx.MultiGraph(G1.subgraph([2, 3, 4, 6, 10, 11, 12, 13]))
+        H2 = nx.MultiGraph(
+            G2.subgraph(
+                [
+                    mapped[2],
+                    mapped[3],
+                    mapped[8],
+                    mapped[9],
+                    mapped[10],
+                    mapped[11],
+                    mapped[12],
+                    mapped[13],
+                ]
+            )
+        )
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+        # Make them isomorphic
+        H2.remove_edges_from(
+            [(mapped[3], mapped[2]), (mapped[9], mapped[8]), (mapped[2], mapped[2])]
+        )
+        H2.add_edges_from([(mapped[9], mapped[9]), (mapped[2], mapped[8])])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Restructure the disconnected subgraph
+        H1.remove_node(12)
+        H2.remove_node(mapped[12])
+        H1.add_edge(13, 13)
+        H2.add_edge(mapped[13], mapped[13])
+
+        # Connect the two disconnected components, forming a single graph
+        H1.add_edges_from([(3, 13), (6, 11)])
+        H2.add_edges_from([(mapped[8], mapped[10]), (mapped[2], mapped[11])])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Move self-loops between nodes in one graph, maintaining the degree sequence
+        H1.remove_edges_from([(2, 2), (3, 6)])
+        H1.add_edges_from([(6, 6), (2, 3)])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+    def test_custom_multigraph4_different_labels(self):
+        G1 = nx.MultiGraph()
+        edges1 = [
+            (1, 2),
+            (1, 2),
+            (2, 2),
+            (2, 3),
+            (3, 8),
+            (3, 8),
+            (3, 4),
+            (4, 5),
+            (4, 5),
+            (4, 5),
+            (4, 6),
+            (3, 6),
+            (3, 6),
+            (6, 6),
+            (8, 7),
+            (7, 7),
+            (8, 9),
+            (9, 9),
+            (8, 9),
+            (8, 9),
+            (5, 9),
+            (10, 11),
+            (11, 12),
+            (12, 13),
+            (11, 13),
+        ]
+
+        mapped = {
+            1: "n",
+            2: "m",
+            3: "l",
+            4: "j",
+            5: "k",
+            6: "i",
+            7: "g",
+            8: "h",
+            9: "f",
+            10: "b",
+            11: "a",
+            12: "d",
+            13: "e",
+        }
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m == mapped
+
+        # Add extra but corresponding edges to both graphs
+        G1.add_edges_from([(2, 2), (2, 3), (2, 8), (3, 4)])
+        G2.add_edges_from([("m", "m"), ("m", "l"), ("m", "h"), ("l", "j")])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m == mapped
+
+        # Obtain isomorphic subgraphs
+        H1 = nx.MultiGraph(G1.subgraph([2, 3, 4, 6]))
+        H2 = nx.MultiGraph(G2.subgraph(["m", "l", "j", "i"]))
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Delete the 3-clique, keeping only the path graph. Now H1 mirrors H2
+        H1.remove_node(4)
+        H2.remove_node("j")
+        H1.remove_edges_from([(2, 2), (2, 3), (6, 6)])
+        H2.remove_edges_from([("l", "i"), ("m", "m"), ("m", "m")])
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+        # Assign the same labels so that the mirrored graphs become isomorphic
+        for n1, n2 in zip(H1.nodes(), H2.nodes()):
+            H1.nodes[n1]["label"] = "red"
+            H2.nodes[n2]["label"] = "red"
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Leave only one node with self-loop
+        H1.remove_nodes_from([3, 6])
+        H2.remove_nodes_from(["m", "l"])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Remove one self-loop from H1
+        H1.remove_edge(2, 2)
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert not m
+
+        # Same for H2
+        H2.remove_edge("i", "i")
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Compose H1 with the disconnected sub-graph of G1. Same for H2
+        S1 = nx.compose(H1, nx.MultiGraph(G1.subgraph([10, 11, 12, 13])))
+        S2 = nx.compose(H2, nx.MultiGraph(G2.subgraph(["a", "b", "d", "e"])))
+
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+        # Connect the two components
+        S1.add_edges_from([(13, 13), (13, 13), (2, 13)])
+        S2.add_edges_from([("a", "a"), ("a", "a"), ("i", "e")])
+        m = vf2pp_isomorphism(H1, H2, node_label="label")
+        assert m
+
+    def test_custom_multigraph5_same_labels(self):
+        G1 = nx.MultiGraph()
+
+        edges1 = [
+            (1, 5),
+            (1, 2),
+            (1, 4),
+            (2, 3),
+            (2, 6),
+            (3, 4),
+            (3, 7),
+            (4, 8),
+            (5, 8),
+            (5, 6),
+            (6, 7),
+            (7, 8),
+        ]
+        mapped = {1: "a", 2: "h", 3: "d", 4: "i", 5: "g", 6: "b", 7: "j", 8: "c"}
+
+        G1.add_edges_from(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Add multiple edges and self-loops, maintaining isomorphism
+        G1.add_edges_from(
+            [(1, 2), (1, 2), (3, 7), (8, 8), (8, 8), (7, 8), (2, 3), (5, 6)]
+        )
+        G2.add_edges_from(
+            [
+                ("a", "h"),
+                ("a", "h"),
+                ("d", "j"),
+                ("c", "c"),
+                ("c", "c"),
+                ("j", "c"),
+                ("d", "h"),
+                ("g", "b"),
+            ]
+        )
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Make G2 a rotated version of G1
+        G2.remove_edges_from(
+            [
+                ("a", "h"),
+                ("a", "h"),
+                ("d", "j"),
+                ("c", "c"),
+                ("c", "c"),
+                ("j", "c"),
+                ("d", "h"),
+                ("g", "b"),
+            ]
+        )
+        G2.add_edges_from(
+            [
+                ("d", "i"),
+                ("a", "h"),
+                ("g", "b"),
+                ("g", "b"),
+                ("i", "i"),
+                ("i", "i"),
+                ("b", "j"),
+                ("d", "j"),
+            ]
+        )
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+    def test_disconnected_multigraph_all_same_labels(self):
+        G1 = nx.MultiGraph()
+        G1.add_nodes_from(range(10))
+        G1.add_edges_from([(i, i) for i in range(10)])
+
+        mapped = {0: 9, 1: 8, 2: 7, 3: 6, 4: 5, 5: 4, 6: 3, 7: 2, 8: 1, 9: 0}
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_same))), "label")
+        nx.set_node_attributes(G2, dict(zip(G2, it.cycle(labels_same))), "label")
+
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Add self-loops only to G1; the graphs are no longer isomorphic
+        G1.add_edges_from([(i, i) for i in range(5, 8)] * 3)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Compensate in G2
+        G2.add_edges_from([(i, i) for i in range(0, 3)] * 3)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+        # Add one more self-loop in G2
+        G2.add_edges_from([(0, 0), (1, 1), (1, 1)])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Compensate in G1
+        G1.add_edges_from([(5, 5), (7, 7), (7, 7)])
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+    def test_disconnected_multigraph_all_different_labels(self):
+        G1 = nx.MultiGraph()
+        G1.add_nodes_from(range(10))
+        G1.add_edges_from([(i, i) for i in range(10)])
+
+        mapped = {0: 9, 1: 8, 2: 7, 3: 6, 4: 5, 5: 4, 6: 3, 7: 2, 8: 1, 9: 0}
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip([mapped[n] for n in G1], it.cycle(labels_many))),
+            "label",
+        )
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+        assert m == mapped
+
+        # Add self-loops to non-mapped nodes of G1. With distinct labels, the graphs are no longer isomorphic
+        G1.add_edges_from([(i, i) for i in range(5, 8)] * 3)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Add self-loops to non-mapped nodes in G2 as well
+        G2.add_edges_from([(mapped[i], mapped[i]) for i in range(0, 3)] * 7)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Add self-loops to mapped nodes in G2
+        G2.add_edges_from([(mapped[i], mapped[i]) for i in range(5, 8)] * 3)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert not m
+
+        # Add self-loops to G1 so that the self-loop counts match in both graphs
+        G1.add_edges_from([(i, i) for i in range(0, 3)] * 7)
+        m = vf2pp_isomorphism(G1, G2, node_label="label")
+        assert m
+
+
+class TestDiGraphISOVF2pp:
+    def test_wikipedia_graph(self):
+        edges1 = [
+            (1, 5),
+            (1, 2),
+            (1, 4),
+            (3, 2),
+            (6, 2),
+            (3, 4),
+            (7, 3),
+            (4, 8),
+            (5, 8),
+            (6, 5),
+            (6, 7),
+            (7, 8),
+        ]
+        mapped = {1: "a", 2: "h", 3: "d", 4: "i", 5: "g", 6: "b", 7: "j", 8: "c"}
+
+        G1 = nx.DiGraph(edges1)
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        assert vf2pp_isomorphism(G1, G2) == mapped
+
+        # Change the direction of an edge
+        G1.remove_edge(1, 5)
+        G1.add_edge(5, 1)
+        assert vf2pp_isomorphism(G1, G2) is None
+
+    def test_non_isomorphic_same_degree_sequence(self):
+        r"""
+                G1                           G2
+        x--------------x              x--------------x
+        | \            |              | \            |
+        |  x-------x   |              |  x-------x   |
+        |  |       |   |              |  |       |   |
+        |  x-------x   |              |  x-------x   |
+        | /            |              |            \ |
+        x--------------x              x--------------x
+        """
+        edges1 = [
+            (1, 5),
+            (1, 2),
+            (4, 1),
+            (3, 2),
+            (3, 4),
+            (4, 8),
+            (5, 8),
+            (6, 5),
+            (6, 7),
+            (7, 8),
+        ]
+        edges2 = [
+            (1, 5),
+            (1, 2),
+            (4, 1),
+            (3, 2),
+            (4, 3),
+            (5, 8),
+            (6, 5),
+            (6, 7),
+            (3, 7),
+            (8, 7),
+        ]
+
+        G1 = nx.DiGraph(edges1)
+        G2 = nx.DiGraph(edges2)
+        assert vf2pp_isomorphism(G1, G2) is None
diff --git a/networkx/algorithms/isomorphism/tests/test_vf2pp_helpers.py b/networkx/algorithms/isomorphism/tests/test_vf2pp_helpers.py
new file mode 100644
index 0000000..ab3439e
--- /dev/null
+++ b/networkx/algorithms/isomorphism/tests/test_vf2pp_helpers.py
@@ -0,0 +1,3103 @@
+import itertools as it
+
+import pytest
+
+import networkx as nx
+from networkx import vf2pp_is_isomorphic, vf2pp_isomorphism
+from networkx.algorithms.isomorphism.vf2pp import (
+    _consistent_PT,
+    _cut_PT,
+    _feasibility,
+    _find_candidates,
+    _find_candidates_Di,
+    _GraphParameters,
+    _initialize_parameters,
+    _matching_order,
+    _restore_Tinout,
+    _restore_Tinout_Di,
+    _StateParameters,
+    _update_Tinout,
+)
+
+labels_same = ["blue"]
+
+labels_many = [
+    "white",
+    "red",
+    "blue",
+    "green",
+    "orange",
+    "black",
+    "purple",
+    "yellow",
+    "brown",
+    "cyan",
+    "solarized",
+    "pink",
+    "none",
+]
+
+
+class TestNodeOrdering:
+    def test_empty_graph(self):
+        G1 = nx.Graph()
+        G2 = nx.Graph()
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        assert len(set(_matching_order(gparams))) == 0
+
+    def test_single_node(self):
+        G1 = nx.Graph()
+        G2 = nx.Graph()
+        G1.add_node(1)
+        G2.add_node(1)
+
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels_many))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip(G2, it.cycle(labels_many))),
+            "label",
+        )
+        l1, l2 = nx.get_node_attributes(G1, "label"), nx.get_node_attributes(
+            G2, "label"
+        )
+
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+        m = _matching_order(gparams)
+        assert m == [1]
+
+    def test_matching_order(self):
+        labels = [
+            "blue",
+            "blue",
+            "red",
+            "red",
+            "red",
+            "red",
+            "green",
+            "green",
+            "green",
+            "yellow",
+            "purple",
+            "purple",
+            "blue",
+            "blue",
+        ]
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (0, 2),
+                (1, 2),
+                (2, 5),
+                (2, 4),
+                (1, 3),
+                (1, 4),
+                (3, 6),
+                (4, 6),
+                (6, 7),
+                (7, 8),
+                (9, 10),
+                (9, 11),
+                (11, 12),
+                (11, 13),
+                (12, 13),
+                (10, 13),
+            ]
+        )
+        G2 = G1.copy()
+        nx.set_node_attributes(G1, dict(zip(G1, it.cycle(labels))), "label")
+        nx.set_node_attributes(
+            G2,
+            dict(zip(G2, it.cycle(labels))),
+            "label",
+        )
+        l1, l2 = nx.get_node_attributes(G1, "label"), nx.get_node_attributes(
+            G2, "label"
+        )
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        expected = [9, 11, 10, 13, 12, 1, 2, 4, 0, 3, 6, 5, 7, 8]
+        assert _matching_order(gparams) == expected
+
+    def test_matching_order_all_branches(self):
+        G1 = nx.Graph(
+            [(0, 1), (0, 2), (0, 3), (0, 4), (1, 2), (1, 3), (1, 4), (2, 4), (3, 4)]
+        )
+        G1.add_node(5)
+        G2 = G1.copy()
+
+        G1.nodes[0]["label"] = "black"
+        G1.nodes[1]["label"] = "blue"
+        G1.nodes[2]["label"] = "blue"
+        G1.nodes[3]["label"] = "red"
+        G1.nodes[4]["label"] = "red"
+        G1.nodes[5]["label"] = "blue"
+
+        G2.nodes[0]["label"] = "black"
+        G2.nodes[1]["label"] = "blue"
+        G2.nodes[2]["label"] = "blue"
+        G2.nodes[3]["label"] = "red"
+        G2.nodes[4]["label"] = "red"
+        G2.nodes[5]["label"] = "blue"
+
+        l1, l2 = nx.get_node_attributes(G1, "label"), nx.get_node_attributes(
+            G2, "label"
+        )
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        expected = [0, 4, 1, 3, 2, 5]
+        assert _matching_order(gparams) == expected
+
+
+class TestGraphCandidateSelection:
+    G1_edges = [
+        (1, 2),
+        (1, 4),
+        (1, 5),
+        (2, 3),
+        (2, 4),
+        (3, 4),
+        (4, 5),
+        (1, 6),
+        (6, 7),
+        (6, 8),
+        (8, 9),
+        (7, 9),
+    ]
+    mapped = {
+        0: "x",
+        1: "a",
+        2: "b",
+        3: "c",
+        4: "d",
+        5: "e",
+        6: "f",
+        7: "g",
+        8: "h",
+        9: "i",
+    }
+
+    def test_no_covered_neighbors_no_labels(self):
+        G1 = nx.Graph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = dict(G1.degree)
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1 = {7, 8, 2, 4, 5}
+        T1_tilde = {0, 3, 6}
+        T2 = {"g", "h", "b", "d", "e"}
+        T2_tilde = {"x", "c", "f"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        u = 3
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 0
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        m.pop(9)
+        m_rev.pop(self.mapped[9])
+
+        T1 = {2, 4, 5, 6}
+        T1_tilde = {0, 3, 7, 8, 9}
+        T2 = {"g", "h", "b", "d", "e", "f"}
+        T2_tilde = {"x", "c", "g", "h", "i"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        u = 7
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {
+            self.mapped[u],
+            self.mapped[8],
+            self.mapped[3],
+            self.mapped[9],
+        }
+
+    def test_no_covered_neighbors_with_labels(self):
+        G1 = nx.Graph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = dict(G1.degree)
+        nx.set_node_attributes(
+            G1,
+            dict(zip(G1, it.cycle(labels_many))),
+            "label",
+        )
+        nx.set_node_attributes(
+            G2,
+            dict(
+                zip(
+                    [self.mapped[n] for n in G1],
+                    it.cycle(labels_many),
+                )
+            ),
+            "label",
+        )
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1 = {7, 8, 2, 4, 5, 6}
+        T1_tilde = {0, 3}
+        T2 = {"g", "h", "b", "d", "e", "f"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        u = 3
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 0
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        # Change label of disconnected node
+        G1.nodes[u]["label"] = "blue"
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        # No candidate
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == set()
+
+        m.pop(9)
+        m_rev.pop(self.mapped[9])
+
+        T1 = {2, 4, 5, 6}
+        T1_tilde = {0, 3, 7, 8, 9}
+        T2 = {"b", "d", "e", "f"}
+        T2_tilde = {"x", "c", "g", "h", "i"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        u = 7
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        G1.nodes[8]["label"] = G1.nodes[7]["label"]
+        G2.nodes[self.mapped[8]]["label"] = G1.nodes[7]["label"]
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[8]}
+
+    def test_covered_neighbors_no_labels(self):
+        G1 = nx.Graph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = dict(G1.degree)
+        l1 = dict(G1.nodes(data=None, default=-1))
+        l2 = dict(G2.nodes(data=None, default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1 = {7, 8, 2, 4, 5, 6}
+        T1_tilde = {0, 3}
+        T2 = {"g", "h", "b", "d", "e", "f"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        u = 5
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 6
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[2]}
+
+    def test_covered_neighbors_with_labels(self):
+        G1 = nx.Graph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = dict(G1.degree)
+        nx.set_node_attributes(
+            G1,
+            dict(zip(G1, it.cycle(labels_many))),
+            "label",
+        )
+        nx.set_node_attributes(
+            G2,
+            dict(
+                zip(
+                    [self.mapped[n] for n in G1],
+                    it.cycle(labels_many),
+                )
+            ),
+            "label",
+        )
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1 = {7, 8, 2, 4, 5, 6}
+        T1_tilde = {0, 3}
+        T2 = {"g", "h", "b", "d", "e", "f"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        u = 5
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 6
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        # Assign node 2 the same label as node 6
+        G1.nodes[2]["label"] = G1.nodes[u]["label"]
+        G2.nodes[self.mapped[2]]["label"] = G1.nodes[u]["label"]
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups({node: degree for node, degree in G2.degree()}),
+        )
+
+        candidates = _find_candidates(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[2]}
+
+
+class TestDiGraphCandidateSelection:
+    G1_edges = [
+        (1, 2),
+        (1, 4),
+        (5, 1),
+        (2, 3),
+        (4, 2),
+        (3, 4),
+        (4, 5),
+        (1, 6),
+        (6, 7),
+        (6, 8),
+        (8, 9),
+        (7, 9),
+    ]
+    mapped = {
+        0: "x",
+        1: "a",
+        2: "b",
+        3: "c",
+        4: "d",
+        5: "e",
+        6: "f",
+        7: "g",
+        8: "h",
+        9: "i",
+    }
+
+    def test_no_covered_neighbors_no_labels(self):
+        G1 = nx.DiGraph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(G1.in_degree, G1.out_degree)
+        }
+
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1_out = {2, 4, 6}
+        T1_in = {5, 7, 8}
+        T1_tilde = {0, 3}
+        T2_out = {"b", "d", "f"}
+        T2_in = {"e", "g", "h"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 3
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 0
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        m.pop(9)
+        m_rev.pop(self.mapped[9])
+
+        T1_out = {2, 4, 6}
+        T1_in = {5}
+        T1_tilde = {0, 3, 7, 8, 9}
+        T2_out = {"b", "d", "f"}
+        T2_in = {"e"}
+        T2_tilde = {"x", "c", "g", "h", "i"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 7
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[8], self.mapped[3]}
+
+    def test_no_covered_neighbors_with_labels(self):
+        G1 = nx.DiGraph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(G1.in_degree, G1.out_degree)
+        }
+        nx.set_node_attributes(
+            G1,
+            dict(zip(G1, it.cycle(labels_many))),
+            "label",
+        )
+        nx.set_node_attributes(
+            G2,
+            dict(
+                zip(
+                    [self.mapped[n] for n in G1],
+                    it.cycle(labels_many),
+                )
+            ),
+            "label",
+        )
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1_out = {2, 4, 6}
+        T1_in = {5, 7, 8}
+        T1_tilde = {0, 3}
+        T2_out = {"b", "d", "f"}
+        T2_in = {"e", "g", "h"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 3
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 0
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        # Change label of disconnected node
+        G1.nodes[u]["label"] = "blue"
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        # No candidate
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == set()
+
+        m.pop(9)
+        m_rev.pop(self.mapped[9])
+
+        T1_out = {2, 4, 6}
+        T1_in = {5}
+        T1_tilde = {0, 3, 7, 8, 9}
+        T2_out = {"b", "d", "f"}
+        T2_in = {"e"}
+        T2_tilde = {"x", "c", "g", "h", "i"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 7
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        G1.nodes[8]["label"] = G1.nodes[7]["label"]
+        G2.nodes[self.mapped[8]]["label"] = G1.nodes[7]["label"]
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[8]}
+
+    def test_covered_neighbors_no_labels(self):
+        G1 = nx.DiGraph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(G1.in_degree, G1.out_degree)
+        }
+
+        l1 = dict(G1.nodes(data=None, default=-1))
+        l2 = dict(G2.nodes(data=None, default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1_out = {2, 4, 6}
+        T1_in = {5, 7, 8}
+        T1_tilde = {0, 3}
+        T2_out = {"b", "d", "f"}
+        T2_in = {"e", "g", "h"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 5
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 6
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        # Change the direction of an edge to make the degree orientation the same as that of the first candidate of u.
+        G1.remove_edge(4, 2)
+        G1.add_edge(2, 4)
+        G2.remove_edge("d", "b")
+        G2.add_edge("b", "d")
+
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[2]}
+
+    def test_covered_neighbors_with_labels(self):
+        G1 = nx.DiGraph()
+        G1.add_edges_from(self.G1_edges)
+        G1.add_node(0)
+        G2 = nx.relabel_nodes(G1, self.mapped)
+
+        G1.remove_edge(4, 2)
+        G1.add_edge(2, 4)
+        G2.remove_edge("d", "b")
+        G2.add_edge("b", "d")
+
+        G1_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(G1.in_degree, G1.out_degree)
+        }
+
+        nx.set_node_attributes(
+            G1,
+            dict(zip(G1, it.cycle(labels_many))),
+            "label",
+        )
+        nx.set_node_attributes(
+            G2,
+            dict(
+                zip(
+                    [self.mapped[n] for n in G1],
+                    it.cycle(labels_many),
+                )
+            ),
+            "label",
+        )
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        m = {9: self.mapped[9], 1: self.mapped[1]}
+        m_rev = {self.mapped[9]: 9, self.mapped[1]: 1}
+
+        T1_out = {2, 4, 6}
+        T1_in = {5, 7, 8}
+        T1_tilde = {0, 3}
+        T2_out = {"b", "d", "f"}
+        T2_in = {"e", "g", "h"}
+        T2_tilde = {"x", "c"}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 5
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        u = 6
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+        # Assign node 2 the same label as node 6
+        G1.nodes[2]["label"] = G1.nodes[u]["label"]
+        G2.nodes[self.mapped[2]]["label"] = G1.nodes[u]["label"]
+        l1 = dict(G1.nodes(data="label", default=-1))
+        l2 = dict(G2.nodes(data="label", default=-1))
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u], self.mapped[2]}
+
+        # Change the direction of an edge to make the degree orientation the same as that of the first candidate of u.
+        G1.remove_edge(2, 4)
+        G1.add_edge(4, 2)
+        G2.remove_edge("b", "d")
+        G2.add_edge("d", "b")
+
+        gparams = _GraphParameters(
+            G1,
+            G2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        G2.in_degree(), G2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        candidates = _find_candidates_Di(u, gparams, sparams, G1_degree)
+        assert candidates == {self.mapped[u]}
+
+    def test_same_in_out_degrees_no_candidate(self):
+        g1 = nx.DiGraph([(4, 1), (4, 2), (3, 4), (5, 4), (6, 4)])
+        g2 = nx.DiGraph([(1, 4), (2, 4), (3, 4), (4, 5), (4, 6)])
+
+        l1 = dict(g1.nodes(data=None, default=-1))
+        l2 = dict(g2.nodes(data=None, default=-1))
+        gparams = _GraphParameters(
+            g1,
+            g2,
+            l1,
+            l2,
+            nx.utils.groups(l1),
+            nx.utils.groups(l2),
+            nx.utils.groups(
+                {
+                    node: (in_degree, out_degree)
+                    for (node, in_degree), (_, out_degree) in zip(
+                        g2.in_degree(), g2.out_degree()
+                    )
+                }
+            ),
+        )
+
+        g1_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(g1.in_degree, g1.out_degree)
+        }
+
+        m = {1: 1, 2: 2, 3: 3}
+        m_rev = m.copy()
+
+        T1_out = {4}
+        T1_in = {4}
+        T1_tilde = {5, 6}
+        T2_out = {4}
+        T2_in = {4}
+        T2_tilde = {5, 6}
+
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        u = 4
+        # Despite having the same in- and out-degree, there is no candidate for u=4
+        candidates = _find_candidates_Di(u, gparams, sparams, g1_degree)
+        assert candidates == set()
+        # Notice how the regular candidate selection method returns the wrong result.
+        assert _find_candidates(u, gparams, sparams, g1_degree) == {4}
+
+
+class TestGraphISOFeasibility:
+    def test_const_covered_neighbors(self):
+        G1 = nx.Graph([(0, 1), (1, 2), (3, 0), (3, 2)])
+        G2 = nx.Graph([("a", "b"), ("b", "c"), ("k", "a"), ("k", "c")])
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_no_covered_neighbors(self):
+        G1 = nx.Graph([(0, 1), (1, 2), (3, 4), (3, 5)])
+        G2 = nx.Graph([("a", "b"), ("b", "c"), ("k", "w"), ("k", "z")])
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_mixed_covered_uncovered_neighbors(self):
+        G1 = nx.Graph([(0, 1), (1, 2), (3, 0), (3, 2), (3, 4), (3, 5)])
+        G2 = nx.Graph(
+            [("a", "b"), ("b", "c"), ("k", "a"), ("k", "c"), ("k", "w"), ("k", "z")]
+        )
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_fail_cases(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (10, 5),
+                (10, 6),
+                (4, 1),
+                (5, 3),
+            ]
+        )
+        G2 = nx.Graph(
+            [
+                ("a", "b"),
+                ("b", "c"),
+                ("k", "a"),
+                ("k", "d"),
+                ("k", "e"),
+                ("k", "f"),
+                ("k", "g"),
+                ("e", "b"),
+                ("f", "d"),
+            ]
+        )
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 10, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Delete one uncovered neighbor of u. Notice how it still passes the test.
+        # Two reasons for this:
+        #   1. If u, v had different degrees from the beginning, they wouldn't
+        #      be selected as candidates in the first place.
+        #   2. Even if they are selected, consistency is basically 1-look-ahead,
+        #      meaning that we take into consideration the relation of the
+        #      candidates with their mapped neighbors. The node we deleted is
+        #      not a covered neighbor.
+        #      Such nodes will be checked by the cut_PT function, which is
+        #      basically the 2-look-ahead, checking the relation of the
+        #      candidates with T1, T2 (to which the node we just deleted belongs).
+        G1.remove_node(6)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Add one more covered neighbor of u in G1
+        G1.add_edge(u, 2)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.add_edge(v, "c")
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Add one more covered neighbor of v in G2
+        G2.add_edge(v, "x")
+        G1.add_node(7)
+        sparams.mapping.update({7: "x"})
+        sparams.reverse_mapping.update({"x": 7})
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.add_edge(u, 7)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    @pytest.mark.parametrize("graph_type", (nx.Graph, nx.DiGraph))
+    def test_cut_inconsistent_labels(self, graph_type):
+        G1 = graph_type(
+            [
+                (0, 1),
+                (1, 2),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (10, 5),
+                (10, 6),
+                (4, 1),
+                (5, 3),
+            ]
+        )
+        G2 = graph_type(
+            [
+                ("a", "b"),
+                ("b", "c"),
+                ("k", "a"),
+                ("k", "d"),
+                ("k", "e"),
+                ("k", "f"),
+                ("k", "g"),
+                ("e", "b"),
+                ("f", "d"),
+            ]
+        )
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        l1.update({6: "green"})  # Change the label of one neighbor of u
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+
+        u, v = 10, "k"
+        assert _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_consistent_labels(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (10, 5),
+                (10, 6),
+                (4, 1),
+                (5, 3),
+            ]
+        )
+        G2 = nx.Graph(
+            [
+                ("a", "b"),
+                ("b", "c"),
+                ("k", "a"),
+                ("k", "d"),
+                ("k", "e"),
+                ("k", "f"),
+                ("k", "g"),
+                ("e", "b"),
+                ("f", "d"),
+            ]
+        )
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5},
+            None,
+            {6},
+            None,
+            {"e", "f"},
+            None,
+            {"g"},
+            None,
+        )
+
+        u, v = 10, "k"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_same_labels(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (10, 5),
+                (10, 6),
+                (4, 1),
+                (5, 3),
+            ]
+        )
+        mapped = {0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g", 10: "k"}
+        G2 = nx.relabel_nodes(G1, mapped)
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5},
+            None,
+            {6},
+            None,
+            {"e", "f"},
+            None,
+            {"g"},
+            None,
+        )
+
+        u, v = 10, "k"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change intersection between G1[u] and T1, so it's not the same as the one between G2[v] and T2
+        G1.remove_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.remove_edge(v, mapped[4])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change intersection between G2[v] and T2_tilde, so it's not the same as the one between G1[u] and T1_tilde
+        G2.remove_edge(v, mapped[6])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.remove_edge(u, 6)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add disconnected nodes, which will form the new Ti_out
+        G1.add_nodes_from([6, 7, 8])
+        G2.add_nodes_from(["g", "y", "z"])
+        sparams.T1_tilde.update({6, 7, 8})
+        sparams.T2_tilde.update({"g", "y", "z"})
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add some new nodes to the mapping
+        sparams.mapping.update({6: "g", 7: "y"})
+        sparams.reverse_mapping.update({"g": 6, "y": 7})
+
+        # Add more nodes to T1, T2.
+        G1.add_edges_from([(6, 20), (7, 20), (6, 21)])
+        G2.add_edges_from([("g", "i"), ("g", "j"), ("y", "j")])
+
+        sparams.mapping.update({20: "j", 21: "i"})
+        sparams.reverse_mapping.update({"j": 20, "i": 21})
+        sparams.T1.update({20, 21})
+        sparams.T2.update({"i", "j"})
+        sparams.T1_tilde.difference_update({6, 7})
+        sparams.T2_tilde.difference_update({"g", "y"})
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add nodes from the new T1 and T2, as neighbors of u and v respectively
+        G1.add_edges_from([(u, 20), (u, 21)])
+        G2.add_edges_from([(v, "i"), (v, "j")])
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the edges, maintaining the G1[u]-T1 intersection
+        G1.remove_edge(u, 20)
+        G1.add_edge(u, 4)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Connect u to 8, which is still in T1_tilde
+        G1.add_edge(u, 8)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for v and z, so that inters(G1[u], T1out) == inters(G2[v], T2out)
+        G2.add_edge(v, "z")
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_different_labels(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (1, 14),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (4, 10),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 3),
+                (20, 5),
+                (20, 0),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        l1 = {n: "none" for n in G1.nodes()}
+        l2 = dict()
+
+        l1.update(
+            {
+                9: "blue",
+                15: "blue",
+                12: "blue",
+                11: "green",
+                3: "green",
+                8: "red",
+                0: "red",
+                5: "yellow",
+            }
+        )
+        l2.update({mapped[n]: l for n, l in l1.items()})
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 14},
+            None,
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "h", "o"},
+            None,
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the orientation of the labels on neighbors of u compared to neighbors of v. Leave the structure intact
+        l1.update({9: "red"})
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # compensate in G2
+        l2.update({mapped[9]: "red"})
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the intersection of G1[u] and T1
+        G1.add_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G2[v] and T2
+        G2.add_edge(v, mapped[4])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the intersection of G2[v] and T2_tilde
+        G2.remove_edge(v, mapped[8])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G1[u] and T1_tilde
+        G1.remove_edge(u, 8)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Place 8 and mapped[8] in T1 and T2 respectively, by connecting it to covered nodes
+        G1.add_edge(8, 3)
+        G2.add_edge(mapped[8], mapped[3])
+        sparams.T1.add(8)
+        sparams.T2.add(mapped[8])
+        sparams.T1_tilde.remove(8)
+        sparams.T2_tilde.remove(mapped[8])
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Remove neighbor of u from T1
+        G1.remove_node(5)
+        l1.pop(5)
+        sparams.T1.remove(5)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same in G2
+        G2.remove_node(mapped[5])
+        l2.pop(mapped[5])
+        sparams.T2.remove(mapped[5])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_feasibility_same_labels(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (1, 14),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (4, 10),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 2),
+                (20, 5),
+                (20, 0),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {mapped[n]: "blue" for n in G1.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 14},
+            None,
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "h", "o"},
+            None,
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the structure in G2 such that ONLY consistency is harmed
+        G2.remove_edge(mapped[20], mapped[2])
+        G2.add_edge(mapped[20], mapped[3])
+
+        # The consistency check fails, while the cutting rules are satisfied!
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1 and make it consistent
+        G1.remove_edge(20, 2)
+        G1.add_edge(20, 3)
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # ONLY fail the cutting check
+        G2.add_edge(v, mapped[10])
+        assert _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_feasibility_different_labels(self):
+        G1 = nx.Graph(
+            [
+                (0, 1),
+                (1, 2),
+                (1, 14),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (4, 10),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 2),
+                (20, 5),
+                (20, 0),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        l1 = {n: "none" for n in G1.nodes()}
+        l2 = dict()
+
+        l1.update(
+            {
+                9: "blue",
+                15: "blue",
+                12: "blue",
+                11: "green",
+                2: "green",
+                8: "red",
+                0: "red",
+                5: "yellow",
+            }
+        )
+        l2.update({mapped[n]: l for n, l in l1.items()})
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 14},
+            None,
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "h", "o"},
+            None,
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the structure in G2 such that ONLY consistency is harmed
+        G2.remove_edge(mapped[20], mapped[2])
+        G2.add_edge(mapped[20], mapped[3])
+        l2.update({mapped[3]: "green"})
+
+        # The consistency check fails, while the cutting rules are satisfied!
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1 and make it consistent
+        G1.remove_edge(20, 2)
+        G1.add_edge(20, 3)
+        l1.update({3: "green"})
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # ONLY fail the cutting check
+        l1.update({5: "red"})
+        assert _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+
+class TestMultiGraphISOFeasibility:
+    def test_const_covered_neighbors(self):
+        G1 = nx.MultiGraph(
+            [(0, 1), (0, 1), (1, 2), (3, 0), (3, 0), (3, 0), (3, 2), (3, 2)]
+        )
+        G2 = nx.MultiGraph(
+            [
+                ("a", "b"),
+                ("a", "b"),
+                ("b", "c"),
+                ("k", "a"),
+                ("k", "a"),
+                ("k", "a"),
+                ("k", "c"),
+                ("k", "c"),
+            ]
+        )
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_no_covered_neighbors(self):
+        G1 = nx.MultiGraph([(0, 1), (0, 1), (1, 2), (3, 4), (3, 4), (3, 5)])
+        G2 = nx.MultiGraph([("a", "b"), ("b", "c"), ("k", "w"), ("k", "w"), ("k", "z")])
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_mixed_covered_uncovered_neighbors(self):
+        G1 = nx.MultiGraph(
+            [(0, 1), (1, 2), (3, 0), (3, 0), (3, 0), (3, 2), (3, 2), (3, 4), (3, 5)]
+        )
+        G2 = nx.MultiGraph(
+            [
+                ("a", "b"),
+                ("b", "c"),
+                ("k", "a"),
+                ("k", "a"),
+                ("k", "a"),
+                ("k", "c"),
+                ("k", "c"),
+                ("k", "w"),
+                ("k", "z"),
+            ]
+        )
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_fail_cases(self):
+        G1 = nx.MultiGraph(
+            [
+                (0, 1),
+                (1, 2),
+                (10, 0),
+                (10, 0),
+                (10, 0),
+                (10, 3),
+                (10, 3),
+                (10, 4),
+                (10, 5),
+                (10, 6),
+                (10, 6),
+                (4, 1),
+                (5, 3),
+            ]
+        )
+        mapped = {0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g", 10: "k"}
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 10, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Delete one uncovered neighbor of u. Notice how it still passes the
+        # test. Two reasons for this:
+        #   1. If u, v had different degrees from the beginning, they wouldn't
+        #      be selected as candidates in the first place.
+        #   2. Even if they are selected, consistency is basically a
+        #      1-look-ahead check, meaning that we only take into consideration
+        #      the relation of the candidates with their mapped neighbors.
+        #      The node we deleted is not a covered neighbor. Such nodes are
+        #      checked by the cut_PT function, which is basically the
+        #      2-look-ahead, checking the relation of the candidates with
+        #      T1, T2 (to which the node we just deleted belongs).
+        G1.remove_node(6)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Add one more covered neighbor of u in G1
+        G1.add_edge(u, 2)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.add_edge(v, "c")
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Add one more covered neighbor of v in G2
+        G2.add_edge(v, "x")
+        G1.add_node(7)
+        sparams.mapping.update({7: "x"})
+        sparams.reverse_mapping.update({"x": 7})
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.add_edge(u, 7)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Delete an edge between u and a covered neighbor
+        G1.remove_edges_from([(u, 0), (u, 0)])
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.remove_edges_from([(v, mapped[0]), (v, mapped[0])])
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Remove an edge between v and a covered neighbor
+        G2.remove_edge(v, mapped[3])
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.remove_edge(u, 3)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_cut_same_labels(self):
+        G1 = nx.MultiGraph(
+            [
+                (0, 1),
+                (1, 2),
+                (10, 0),
+                (10, 0),
+                (10, 0),
+                (10, 3),
+                (10, 3),
+                (10, 4),
+                (10, 4),
+                (10, 5),
+                (10, 5),
+                (10, 5),
+                (10, 5),
+                (10, 6),
+                (10, 6),
+                (4, 1),
+                (5, 3),
+            ]
+        )
+        mapped = {0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g", 10: "k"}
+        G2 = nx.relabel_nodes(G1, mapped)
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5},
+            None,
+            {6},
+            None,
+            {"e", "f"},
+            None,
+            {"g"},
+            None,
+        )
+
+        u, v = 10, "k"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Remove one of the multiple edges between u and a neighbor
+        G1.remove_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate: remove the remaining parallel edge in G1 and both parallel edges in G2
+        G1.remove_edge(u, 4)
+        G2.remove_edges_from([(v, mapped[4]), (v, mapped[4])])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change intersection between G2[v] and T2_tilde, so it's not the same as the one between G1[u] and T1_tilde
+        G2.remove_edge(v, mapped[6])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.remove_edge(u, 6)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add more edges between u and neighbor which belongs in T1_tilde
+        G1.add_edges_from([(u, 5), (u, 5), (u, 5)])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.add_edges_from([(v, mapped[5]), (v, mapped[5]), (v, mapped[5])])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add disconnected nodes, which will form the new Ti_tilde
+        G1.add_nodes_from([6, 7, 8])
+        G2.add_nodes_from(["g", "y", "z"])
+        G1.add_edges_from([(u, 6), (u, 6), (u, 6), (u, 8)])
+        G2.add_edges_from([(v, "g"), (v, "g"), (v, "g"), (v, "z")])
+
+        sparams.T1_tilde.update({6, 7, 8})
+        sparams.T2_tilde.update({"g", "y", "z"})
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add some new nodes to the mapping
+        sparams.mapping.update({6: "g", 7: "y"})
+        sparams.reverse_mapping.update({"g": 6, "y": 7})
+
+        # Add more nodes to T1, T2.
+        G1.add_edges_from([(6, 20), (7, 20), (6, 21)])
+        G2.add_edges_from([("g", "i"), ("g", "j"), ("y", "j")])
+
+        sparams.T1.update({20, 21})
+        sparams.T2.update({"i", "j"})
+        sparams.T1_tilde.difference_update({6, 7})
+        sparams.T2_tilde.difference_update({"g", "y"})
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Remove some edges
+        G2.remove_edge(v, "g")
+        assert _cut_PT(u, v, gparams, sparams)
+
+        G1.remove_edge(u, 6)
+        G1.add_edge(u, 8)
+        G2.add_edge(v, "z")
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add nodes from the new T1 and T2, as neighbors of u and v respectively
+        G1.add_edges_from([(u, 20), (u, 20), (u, 20), (u, 21)])
+        G2.add_edges_from([(v, "i"), (v, "i"), (v, "i"), (v, "j")])
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the edges
+        G1.remove_edge(u, 20)
+        G1.add_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        G2.remove_edge(v, "i")
+        G2.add_edge(v, mapped[4])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_different_labels(self):
+        G1 = nx.MultiGraph(
+            [
+                (0, 1),
+                (0, 1),
+                (1, 2),
+                (1, 2),
+                (1, 14),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (4, 10),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 9),
+                (20, 9),
+                (20, 15),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (20, 11),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 8),
+                (20, 3),
+                (20, 3),
+                (20, 5),
+                (20, 5),
+                (20, 5),
+                (20, 0),
+                (20, 0),
+                (20, 0),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        l1 = {n: "none" for n in G1.nodes()}
+        l2 = dict()
+
+        l1.update(
+            {
+                9: "blue",
+                15: "blue",
+                12: "blue",
+                11: "green",
+                3: "green",
+                8: "red",
+                0: "red",
+                5: "yellow",
+            }
+        )
+        l2.update({mapped[n]: l for n, l in l1.items()})
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 14},
+            None,
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "h", "o"},
+            None,
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the label assignment on the neighbors of u compared to the neighbors of v. Leave the structure intact
+        l1.update({9: "red"})
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # compensate in G2
+        l2.update({mapped[9]: "red"})
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the intersection of G1[u] and T1
+        G1.add_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G2[v] and T2
+        G2.add_edge(v, mapped[4])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Delete one from the multiple edges
+        G2.remove_edge(v, mapped[8])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G1[u] and T1_tilde
+        G1.remove_edge(u, 8)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Place 8 and mapped[8] in T1 and T2 respectively, by connecting them to covered nodes
+        G1.add_edges_from([(8, 3), (8, 3), (8, u)])
+        G2.add_edges_from([(mapped[8], mapped[3]), (mapped[8], mapped[3])])
+        sparams.T1.add(8)
+        sparams.T2.add(mapped[8])
+        sparams.T1_tilde.remove(8)
+        sparams.T2_tilde.remove(mapped[8])
+
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Fix uneven edges
+        G1.remove_edge(8, u)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Remove neighbor of u from T1
+        G1.remove_node(5)
+        l1.pop(5)
+        sparams.T1.remove(5)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same in G2
+        G2.remove_node(mapped[5])
+        l2.pop(mapped[5])
+        sparams.T2.remove(mapped[5])
+        assert not _cut_PT(u, v, gparams, sparams)
+
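
*Editorial aside (not part of the diff): the `l1`/`l2` label dictionaries and `_cut_PT` calls above exercise label-aware pruning internally. A minimal sketch of how the same label constraints surface in the public VF2++ API, assuming the `nx.vf2pp_is_isomorphic` entry point with its `node_label` parameter, where labels are stored as node attributes:*

```python
# Sketch: label-aware isomorphism via the public vf2pp API.
# Labels live as node attributes and are selected by attribute name.
import networkx as nx

G1 = nx.Graph([(0, 1), (1, 2)])
G2 = nx.Graph([("a", "b"), ("b", "c")])
nx.set_node_attributes(G1, {0: "red", 1: "blue", 2: "red"}, "color")
nx.set_node_attributes(G2, {"a": "red", "b": "blue", "c": "red"}, "color")

# With matching labels, the two path graphs are isomorphic.
same = nx.vf2pp_is_isomorphic(G1, G2, node_label="color")

# Flipping one label breaks the match, much like the l1.update({9: "red"})
# steps in the tests above, even though the structure is unchanged.
G2.nodes["c"]["color"] = "green"
mismatched = nx.vf2pp_is_isomorphic(G1, G2, node_label="color")
```

*When `node_label=None`, labels are ignored and only structure is compared.*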
+    def test_feasibility_same_labels(self):
+        G1 = nx.MultiGraph(
+            [
+                (0, 1),
+                (0, 1),
+                (1, 2),
+                (1, 2),
+                (1, 14),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (4, 10),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 9),
+                (20, 9),
+                (20, 15),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (20, 11),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 8),
+                (20, 3),
+                (20, 3),
+                (20, 5),
+                (20, 5),
+                (20, 5),
+                (20, 0),
+                (20, 0),
+                (20, 0),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {mapped[n]: "blue" for n in G1.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 14},
+            None,
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "h", "o"},
+            None,
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the structure in G2 so that ONLY consistency is harmed
+        G2.remove_edges_from([(mapped[20], mapped[3]), (mapped[20], mapped[3])])
+        G2.add_edges_from([(mapped[20], mapped[2]), (mapped[20], mapped[2])])
+
+        # Consistency check fails, while the cutting rules are satisfied!
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1 and make it consistent
+        G1.remove_edges_from([(20, 3), (20, 3)])
+        G1.add_edges_from([(20, 2), (20, 2)])
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # ONLY fail the cutting check
+        G2.add_edges_from([(v, mapped[10])] * 5)
+        assert _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Pass all tests
+        G1.add_edges_from([(u, 10)] * 5)
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_feasibility_different_labels(self):
+        G1 = nx.MultiGraph(
+            [
+                (0, 1),
+                (0, 1),
+                (1, 2),
+                (1, 2),
+                (1, 14),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (4, 10),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 9),
+                (20, 9),
+                (20, 15),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (20, 11),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 8),
+                (20, 2),
+                (20, 2),
+                (20, 5),
+                (20, 5),
+                (20, 5),
+                (20, 0),
+                (20, 0),
+                (20, 0),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+        l1 = {n: "none" for n in G1.nodes()}
+        l2 = dict()
+
+        l1.update(
+            {
+                9: "blue",
+                15: "blue",
+                12: "blue",
+                11: "green",
+                2: "green",
+                8: "red",
+                0: "red",
+                5: "yellow",
+            }
+        )
+        l2.update({mapped[n]: l for n, l in l1.items()})
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 14},
+            None,
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "h", "o"},
+            None,
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the structure in G2 so that ONLY consistency is harmed
+        G2.remove_edges_from([(mapped[20], mapped[2]), (mapped[20], mapped[2])])
+        G2.add_edges_from([(mapped[20], mapped[3]), (mapped[20], mapped[3])])
+        l2.update({mapped[3]: "green"})
+
+        # Consistency check fails, while the cutting rules are satisfied!
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1 and make it consistent
+        G1.remove_edges_from([(20, 2), (20, 2)])
+        G1.add_edges_from([(20, 3), (20, 3)])
+        l1.update({3: "green"})
+        assert not _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # ONLY fail the cutting check
+        l1.update({5: "red"})
+        assert _cut_PT(u, v, gparams, sparams)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+
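
*Editorial aside (not part of the diff): the class below moves from MultiGraph to DiGraph feasibility, where the state is split into in/out frontiers (`T1_in`/`T1_out`). A small sketch of what direction-sensitivity means at the public API level, assuming this snapshot's directed VF2++ support in `nx.vf2pp_is_isomorphic`:*

```python
# Sketch: for DiGraphs, edge direction must be preserved by the mapping,
# which is what the T1_in/T1_out bookkeeping in the tests below tracks.
import networkx as nx

G1 = nx.DiGraph([(0, 1), (0, 2)])        # out-star: 0 points at both leaves
G2 = nx.DiGraph([("a", "b"), ("a", "c")])  # same shape, directions line up
forward = nx.vf2pp_is_isomorphic(G1, G2)

# Reversing G2 turns it into an in-star; no direction-preserving mapping exists.
backward = nx.vf2pp_is_isomorphic(G1, G2.reverse())
```
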
+class TestDiGraphISOFeasibility:
+    def test_const_covered_neighbors(self):
+        G1 = nx.DiGraph([(0, 1), (1, 2), (0, 3), (2, 3)])
+        G2 = nx.DiGraph([("a", "b"), ("b", "c"), ("a", "k"), ("c", "k")])
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_no_covered_neighbors(self):
+        G1 = nx.DiGraph([(0, 1), (1, 2), (3, 4), (3, 5)])
+        G2 = nx.DiGraph([("a", "b"), ("b", "c"), ("k", "w"), ("k", "z")])
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_mixed_covered_uncovered_neighbors(self):
+        G1 = nx.DiGraph([(0, 1), (1, 2), (3, 0), (3, 2), (3, 4), (3, 5)])
+        G2 = nx.DiGraph(
+            [("a", "b"), ("b", "c"), ("k", "a"), ("k", "c"), ("k", "w"), ("k", "z")]
+        )
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 3, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_const_fail_cases(self):
+        G1 = nx.DiGraph(
+            [
+                (0, 1),
+                (2, 1),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (5, 10),
+                (10, 6),
+                (1, 4),
+                (5, 3),
+            ]
+        )
+        G2 = nx.DiGraph(
+            [
+                ("a", "b"),
+                ("c", "b"),
+                ("k", "a"),
+                ("k", "d"),
+                ("k", "e"),
+                ("f", "k"),
+                ("k", "g"),
+                ("b", "e"),
+                ("f", "d"),
+            ]
+        )
+        gparams = _GraphParameters(G1, G2, None, None, None, None, None)
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+        u, v = 10, "k"
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Delete one uncovered neighbor of u. Notice how it still passes the
+        # test. Two reasons for this:
+        #   1. If u, v had different degrees from the beginning, they wouldn't
+        #      be selected as candidates in the first place.
+        #   2. Even if they are selected, consistency is basically
+        #      1-look-ahead, meaning that we take into consideration the
+        #      relation of the candidates with their mapped neighbors.
+        #      The node we deleted is not a covered neighbor.
+        #      Such nodes will be checked by the cut_PT function, which is
+        #      basically the 2-look-ahead, checking the relation of the
+        #      candidates with T1, T2 (to which the node we just deleted belongs).
+        G1.remove_node(6)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Add one more covered neighbor of u in G1
+        G1.add_edge(u, 2)
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.add_edge(v, "c")
+        assert _consistent_PT(u, v, gparams, sparams)
+
+        # Add one more covered neighbor of v in G2
+        G2.add_edge(v, "x")
+        G1.add_node(7)
+        sparams.mapping.update({7: "x"})
+        sparams.reverse_mapping.update({"x": 7})
+        assert not _consistent_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.add_edge(u, 7)
+        assert _consistent_PT(u, v, gparams, sparams)
+
+    def test_cut_inconsistent_labels(self):
+        G1 = nx.DiGraph(
+            [
+                (0, 1),
+                (2, 1),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (5, 10),
+                (10, 6),
+                (1, 4),
+                (5, 3),
+            ]
+        )
+        G2 = nx.DiGraph(
+            [
+                ("a", "b"),
+                ("c", "b"),
+                ("k", "a"),
+                ("k", "d"),
+                ("k", "e"),
+                ("f", "k"),
+                ("k", "g"),
+                ("b", "e"),
+                ("f", "d"),
+            ]
+        )
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        l1.update({5: "green"})  # Change the label of one neighbor of u
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+            None,
+        )
+
+        u, v = 10, "k"
+        assert _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_consistent_labels(self):
+        G1 = nx.DiGraph(
+            [
+                (0, 1),
+                (2, 1),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (5, 10),
+                (10, 6),
+                (1, 4),
+                (5, 3),
+            ]
+        )
+        G2 = nx.DiGraph(
+            [
+                ("a", "b"),
+                ("c", "b"),
+                ("k", "a"),
+                ("k", "d"),
+                ("k", "e"),
+                ("f", "k"),
+                ("k", "g"),
+                ("b", "e"),
+                ("f", "d"),
+            ]
+        )
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4},
+            {5, 10},
+            {6},
+            None,
+            {"e"},
+            {"f", "k"},
+            {"g"},
+            None,
+        )
+
+        u, v = 10, "k"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_same_labels(self):
+        G1 = nx.DiGraph(
+            [
+                (0, 1),
+                (2, 1),
+                (10, 0),
+                (10, 3),
+                (10, 4),
+                (5, 10),
+                (10, 6),
+                (1, 4),
+                (5, 3),
+            ]
+        )
+        mapped = {0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g", 10: "k"}
+        G2 = nx.relabel_nodes(G1, mapped)
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4},
+            {5, 10},
+            {6},
+            None,
+            {"e"},
+            {"f", "k"},
+            {"g"},
+            None,
+        )
+
+        u, v = 10, "k"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change intersection between G1[u] and T1_out, so it's not the same as the one between G2[v] and T2_out
+        G1.remove_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.remove_edge(v, mapped[4])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change intersection between G1[u] and T1_in, so it's not the same as the one between G2[v] and T2_in
+        G1.remove_edge(5, u)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G2
+        G2.remove_edge(mapped[5], v)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change intersection between G2[v] and T2_tilde, so it's not the same as the one between G1[u] and T1_tilde
+        G2.remove_edge(v, mapped[6])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Compensate in G1
+        G1.remove_edge(u, 6)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Add disconnected nodes, which will form the new Ti_tilde
+        G1.add_nodes_from([6, 7, 8])
+        G2.add_nodes_from(["g", "y", "z"])
+        sparams.T1_tilde.update({6, 7, 8})
+        sparams.T2_tilde.update({"g", "y", "z"})
+
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_cut_different_labels(self):
+        G1 = nx.DiGraph(
+            [
+                (0, 1),
+                (1, 2),
+                (14, 1),
+                (0, 4),
+                (1, 5),
+                (2, 6),
+                (3, 7),
+                (3, 6),
+                (10, 4),
+                (4, 9),
+                (6, 10),
+                (20, 9),
+                (20, 15),
+                (20, 12),
+                (20, 11),
+                (12, 13),
+                (11, 13),
+                (20, 8),
+                (20, 3),
+                (20, 5),
+                (0, 20),
+            ]
+        )
+        mapped = {
+            0: "a",
+            1: "b",
+            2: "c",
+            3: "d",
+            4: "e",
+            5: "f",
+            6: "g",
+            7: "h",
+            8: "i",
+            9: "j",
+            10: "k",
+            11: "l",
+            12: "m",
+            13: "n",
+            14: "o",
+            15: "p",
+            20: "x",
+        }
+        G2 = nx.relabel_nodes(G1, mapped)
+
+        l1 = {n: "none" for n in G1.nodes()}
+        l2 = dict()
+
+        l1.update(
+            {
+                9: "blue",
+                15: "blue",
+                12: "blue",
+                11: "green",
+                3: "green",
+                8: "red",
+                0: "red",
+                5: "yellow",
+            }
+        )
+        l2.update({mapped[n]: l for n, l in l1.items()})
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c", 3: "d"},
+            {"a": 0, "b": 1, "c": 2, "d": 3},
+            {4, 5, 6, 7, 20},
+            {14, 20},
+            {9, 10, 15, 12, 11, 13, 8},
+            None,
+            {"e", "f", "g", "x"},
+            {"o", "x"},
+            {"j", "k", "l", "m", "n", "i", "p"},
+            None,
+        )
+
+        u, v = 20, "x"
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the label assignment on the neighbors of u compared to the neighbors of v. Leave the structure intact
+        l1.update({9: "red"})
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # compensate in G2
+        l2.update({mapped[9]: "red"})
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the intersection of G1[u] and T1_out
+        G1.add_edge(u, 4)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G2[v] and T2_out
+        G2.add_edge(v, mapped[4])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the intersection of G1[u] and T1_in
+        G1.add_edge(u, 14)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G2[v] and T2_in
+        G2.add_edge(v, mapped[14])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Change the intersection of G2[v] and T2_tilde
+        G2.remove_edge(v, mapped[8])
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same for G1[u] and T1_tilde
+        G1.remove_edge(u, 8)
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Place 8 and mapped[8] in T1 and T2 respectively, by connecting them to covered nodes
+        G1.add_edge(8, 3)
+        G2.add_edge(mapped[8], mapped[3])
+        sparams.T1.add(8)
+        sparams.T2.add(mapped[8])
+        sparams.T1_tilde.remove(8)
+        sparams.T2_tilde.remove(mapped[8])
+
+        assert not _cut_PT(u, v, gparams, sparams)
+
+        # Remove neighbor of u from T1
+        G1.remove_node(5)
+        l1.pop(5)
+        sparams.T1.remove(5)
+        assert _cut_PT(u, v, gparams, sparams)
+
+        # Same in G2
+        G2.remove_node(mapped[5])
+        l2.pop(mapped[5])
+        sparams.T2.remove(mapped[5])
+        assert not _cut_PT(u, v, gparams, sparams)
+
+    def test_predecessor_T1_in_fail(self):
+        G1 = nx.DiGraph(
+            [(0, 1), (0, 3), (4, 0), (1, 5), (5, 2), (3, 6), (4, 6), (6, 5)]
+        )
+        mapped = {0: "a", 1: "b", 2: "c", 3: "d", 4: "e", 5: "f", 6: "g"}
+        G2 = nx.relabel_nodes(G1, mapped)
+        l1 = {n: "blue" for n in G1.nodes()}
+        l2 = {n: "blue" for n in G2.nodes()}
+
+        gparams = _GraphParameters(
+            G1, G2, l1, l2, nx.utils.groups(l1), nx.utils.groups(l2), None
+        )
+        sparams = _StateParameters(
+            {0: "a", 1: "b", 2: "c"},
+            {"a": 0, "b": 1, "c": 2},
+            {3, 5},
+            {4, 5},
+            {6},
+            None,
+            {"d", "f"},
+            {"f"},  # mapped[4] is missing from T2_in
+            {"g"},
+            None,
+        )
+
+        u, v = 6, "g"
+        assert _cut_PT(u, v, gparams, sparams)
+
+        sparams.T2_in.add("e")
+        assert not _cut_PT(u, v, gparams, sparams)
+
+
+class TestGraphTinoutUpdating:
+    edges = [
+        (1, 3),
+        (2, 3),
+        (3, 4),
+        (4, 9),
+        (4, 5),
+        (3, 9),
+        (5, 8),
+        (5, 7),
+        (8, 7),
+        (6, 7),
+    ]
+    mapped = {
+        0: "x",
+        1: "a",
+        2: "b",
+        3: "c",
+        4: "d",
+        5: "e",
+        6: "f",
+        7: "g",
+        8: "h",
+        9: "i",
+    }
+    G1 = nx.Graph()
+    G1.add_edges_from(edges)
+    G1.add_node(0)
+    G2 = nx.relabel_nodes(G1, mapping=mapped)
+
+    def test_updating(self):
+        G2_degree = dict(self.G2.degree)
+        gparams, sparams = _initialize_parameters(self.G1, self.G2, G2_degree)
+        m, m_rev, T1, _, T1_tilde, _, T2, _, T2_tilde, _ = sparams
+
+        # Add node to the mapping
+        m[4] = self.mapped[4]
+        m_rev[self.mapped[4]] = 4
+        _update_Tinout(4, self.mapped[4], gparams, sparams)
+
+        assert T1 == {3, 5, 9}
+        assert T2 == {"c", "i", "e"}
+        assert T1_tilde == {0, 1, 2, 6, 7, 8}
+        assert T2_tilde == {"x", "a", "b", "f", "g", "h"}
+
+        # Add node to the mapping
+        m[5] = self.mapped[5]
+        m_rev.update({self.mapped[5]: 5})
+        _update_Tinout(5, self.mapped[5], gparams, sparams)
+
+        assert T1 == {3, 9, 8, 7}
+        assert T2 == {"c", "i", "h", "g"}
+        assert T1_tilde == {0, 1, 2, 6}
+        assert T2_tilde == {"x", "a", "b", "f"}
+
+        # Add node to the mapping
+        m[6] = self.mapped[6]
+        m_rev.update({self.mapped[6]: 6})
+        _update_Tinout(6, self.mapped[6], gparams, sparams)
+
+        assert T1 == {3, 9, 8, 7}
+        assert T2 == {"c", "i", "h", "g"}
+        assert T1_tilde == {0, 1, 2}
+        assert T2_tilde == {"x", "a", "b"}
+
+        # Add node to the mapping
+        m[3] = self.mapped[3]
+        m_rev.update({self.mapped[3]: 3})
+        _update_Tinout(3, self.mapped[3], gparams, sparams)
+
+        assert T1 == {1, 2, 9, 8, 7}
+        assert T2 == {"a", "b", "i", "h", "g"}
+        assert T1_tilde == {0}
+        assert T2_tilde == {"x"}
+
+        # Add node to the mapping
+        m[0] = self.mapped[0]
+        m_rev.update({self.mapped[0]: 0})
+        _update_Tinout(0, self.mapped[0], gparams, sparams)
+
+        assert T1 == {1, 2, 9, 8, 7}
+        assert T2 == {"a", "b", "i", "h", "g"}
+        assert T1_tilde == set()
+        assert T2_tilde == set()
+
+    def test_restoring(self):
+        m = {0: "x", 3: "c", 4: "d", 5: "e", 6: "f"}
+        m_rev = {"x": 0, "c": 3, "d": 4, "e": 5, "f": 6}
+
+        T1 = {1, 2, 7, 9, 8}
+        T2 = {"a", "b", "g", "i", "h"}
+        T1_tilde = set()
+        T2_tilde = set()
+
+        gparams = _GraphParameters(self.G1, self.G2, {}, {}, {}, {}, {})
+        sparams = _StateParameters(
+            m, m_rev, T1, None, T1_tilde, None, T2, None, T2_tilde, None
+        )
+
+        # Remove a node from the mapping
+        m.pop(0)
+        m_rev.pop("x")
+        _restore_Tinout(0, self.mapped[0], gparams, sparams)
+
+        assert T1 == {1, 2, 7, 9, 8}
+        assert T2 == {"a", "b", "g", "i", "h"}
+        assert T1_tilde == {0}
+        assert T2_tilde == {"x"}
+
+        # Remove a node from the mapping
+        m.pop(6)
+        m_rev.pop("f")
+        _restore_Tinout(6, self.mapped[6], gparams, sparams)
+
+        assert T1 == {1, 2, 7, 9, 8}
+        assert T2 == {"a", "b", "g", "i", "h"}
+        assert T1_tilde == {0, 6}
+        assert T2_tilde == {"x", "f"}
+
+        # Remove a node from the mapping
+        m.pop(3)
+        m_rev.pop("c")
+        _restore_Tinout(3, self.mapped[3], gparams, sparams)
+
+        assert T1 == {7, 9, 8, 3}
+        assert T2 == {"g", "i", "h", "c"}
+        assert T1_tilde == {0, 6, 1, 2}
+        assert T2_tilde == {"x", "f", "a", "b"}
+
+        # Remove a node from the mapping
+        m.pop(5)
+        m_rev.pop("e")
+        _restore_Tinout(5, self.mapped[5], gparams, sparams)
+
+        assert T1 == {9, 3, 5}
+        assert T2 == {"i", "c", "e"}
+        assert T1_tilde == {0, 6, 1, 2, 7, 8}
+        assert T2_tilde == {"x", "f", "a", "b", "g", "h"}
+
+        # Remove a node from the mapping
+        m.pop(4)
+        m_rev.pop("d")
+        _restore_Tinout(4, self.mapped[4], gparams, sparams)
+
+        assert T1 == set()
+        assert T2 == set()
+        assert T1_tilde == set(self.G1.nodes())
+        assert T2_tilde == set(self.G2.nodes())
+
+
+class TestDiGraphTinoutUpdating:
+    edges = [
+        (1, 3),
+        (3, 2),
+        (3, 4),
+        (4, 9),
+        (4, 5),
+        (3, 9),
+        (5, 8),
+        (5, 7),
+        (8, 7),
+        (7, 6),
+    ]
+    mapped = {
+        0: "x",
+        1: "a",
+        2: "b",
+        3: "c",
+        4: "d",
+        5: "e",
+        6: "f",
+        7: "g",
+        8: "h",
+        9: "i",
+    }
+    G1 = nx.DiGraph(edges)
+    G1.add_node(0)
+    G2 = nx.relabel_nodes(G1, mapping=mapped)
+
+    def test_updating(self):
+        G2_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(
+                self.G2.in_degree, self.G2.out_degree
+            )
+        }
+        gparams, sparams = _initialize_parameters(self.G1, self.G2, G2_degree)
+        m, m_rev, T1_out, T1_in, T1_tilde, _, T2_out, T2_in, T2_tilde, _ = sparams
+
+        # Add node to the mapping
+        m[4] = self.mapped[4]
+        m_rev[self.mapped[4]] = 4
+        _update_Tinout(4, self.mapped[4], gparams, sparams)
+
+        assert T1_out == {5, 9}
+        assert T1_in == {3}
+        assert T2_out == {"i", "e"}
+        assert T2_in == {"c"}
+        assert T1_tilde == {0, 1, 2, 6, 7, 8}
+        assert T2_tilde == {"x", "a", "b", "f", "g", "h"}
+
+        # Add node to the mapping
+        m[5] = self.mapped[5]
+        m_rev[self.mapped[5]] = 5
+        _update_Tinout(5, self.mapped[5], gparams, sparams)
+
+        assert T1_out == {9, 8, 7}
+        assert T1_in == {3}
+        assert T2_out == {"i", "g", "h"}
+        assert T2_in == {"c"}
+        assert T1_tilde == {0, 1, 2, 6}
+        assert T2_tilde == {"x", "a", "b", "f"}
+
+        # Add node to the mapping
+        m[6] = self.mapped[6]
+        m_rev[self.mapped[6]] = 6
+        _update_Tinout(6, self.mapped[6], gparams, sparams)
+
+        assert T1_out == {9, 8, 7}
+        assert T1_in == {3, 7}
+        assert T2_out == {"i", "g", "h"}
+        assert T2_in == {"c", "g"}
+        assert T1_tilde == {0, 1, 2}
+        assert T2_tilde == {"x", "a", "b"}
+
+        # Add node to the mapping
+        m[3] = self.mapped[3]
+        m_rev[self.mapped[3]] = 3
+        _update_Tinout(3, self.mapped[3], gparams, sparams)
+
+        assert T1_out == {9, 8, 7, 2}
+        assert T1_in == {7, 1}
+        assert T2_out == {"i", "g", "h", "b"}
+        assert T2_in == {"g", "a"}
+        assert T1_tilde == {0}
+        assert T2_tilde == {"x"}
+
+        # Add node to the mapping
+        m[0] = self.mapped[0]
+        m_rev[self.mapped[0]] = 0
+        _update_Tinout(0, self.mapped[0], gparams, sparams)
+
+        assert T1_out == {9, 8, 7, 2}
+        assert T1_in == {7, 1}
+        assert T2_out == {"i", "g", "h", "b"}
+        assert T2_in == {"g", "a"}
+        assert T1_tilde == set()
+        assert T2_tilde == set()
+
+    def test_restoring(self):
+        m = {0: "x", 3: "c", 4: "d", 5: "e", 6: "f"}
+        m_rev = {"x": 0, "c": 3, "d": 4, "e": 5, "f": 6}
+
+        T1_out = {2, 7, 9, 8}
+        T1_in = {1, 7}
+        T2_out = {"b", "g", "i", "h"}
+        T2_in = {"a", "g"}
+        T1_tilde = set()
+        T2_tilde = set()
+
+        gparams = _GraphParameters(self.G1, self.G2, {}, {}, {}, {}, {})
+        sparams = _StateParameters(
+            m, m_rev, T1_out, T1_in, T1_tilde, None, T2_out, T2_in, T2_tilde, None
+        )
+
+        # Remove a node from the mapping
+        m.pop(0)
+        m_rev.pop("x")
+        _restore_Tinout_Di(0, self.mapped[0], gparams, sparams)
+
+        assert T1_out == {2, 7, 9, 8}
+        assert T1_in == {1, 7}
+        assert T2_out == {"b", "g", "i", "h"}
+        assert T2_in == {"a", "g"}
+        assert T1_tilde == {0}
+        assert T2_tilde == {"x"}
+
+        # Remove a node from the mapping
+        m.pop(6)
+        m_rev.pop("f")
+        _restore_Tinout_Di(6, self.mapped[6], gparams, sparams)
+
+        assert T1_out == {2, 9, 8, 7}
+        assert T1_in == {1}
+        assert T2_out == {"b", "i", "h", "g"}
+        assert T2_in == {"a"}
+        assert T1_tilde == {0, 6}
+        assert T2_tilde == {"x", "f"}
+
+        # Remove a node from the mapping
+        m.pop(3)
+        m_rev.pop("c")
+        _restore_Tinout_Di(3, self.mapped[3], gparams, sparams)
+
+        assert T1_out == {9, 8, 7}
+        assert T1_in == {3}
+        assert T2_out == {"i", "h", "g"}
+        assert T2_in == {"c"}
+        assert T1_tilde == {0, 6, 1, 2}
+        assert T2_tilde == {"x", "f", "a", "b"}
+
+        # Remove a node from the mapping
+        m.pop(5)
+        m_rev.pop("e")
+        _restore_Tinout_Di(5, self.mapped[5], gparams, sparams)
+
+        assert T1_out == {9, 5}
+        assert T1_in == {3}
+        assert T2_out == {"i", "e"}
+        assert T2_in == {"c"}
+        assert T1_tilde == {0, 6, 1, 2, 8, 7}
+        assert T2_tilde == {"x", "f", "a", "b", "h", "g"}
+
+        # Remove a node from the mapping
+        m.pop(4)
+        m_rev.pop("d")
+        _restore_Tinout_Di(4, self.mapped[4], gparams, sparams)
+
+        assert T1_out == set()
+        assert T1_in == set()
+        assert T2_out == set()
+        assert T2_in == set()
+        assert T1_tilde == set(self.G1.nodes())
+        assert T2_tilde == set(self.G2.nodes())
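The tests above repeatedly assert the invariant that `_update_Tinout` and `_restore_Tinout` maintain. As a minimal standalone sketch (using a hypothetical `frontier` helper on a plain adjacency dict, not the library's incremental bookkeeping): after each extension of the mapping, `T1` is the set of uncovered neighbors of covered nodes, and `T1_tilde` is every remaining node outside both the mapping and `T1`.

```python
# Standalone sketch of the invariant checked by TestGraphTinoutUpdating.
# The graph and mapping order mirror test_updating above.
edges = [(1, 3), (2, 3), (3, 4), (4, 9), (4, 5), (3, 9), (5, 8), (5, 7), (8, 7), (6, 7)]
nodes = {0, 1, 2, 3, 4, 5, 6, 7, 8, 9}
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def frontier(adj, covered):
    """Uncovered neighbors of covered nodes (what T1 holds after an update)."""
    return {nbr for n in covered for nbr in adj[n]} - covered

covered = {4}                      # map node 4 first, as in test_updating
T1 = frontier(adj, covered)
T1_tilde = nodes - covered - T1    # everything outside the mapping and T1
assert T1 == {3, 5, 9}
assert T1_tilde == {0, 1, 2, 6, 7, 8}

covered.add(5)                     # then map node 5
assert frontier(adj, covered) == {3, 7, 8, 9}
```

The actual implementation updates these sets incrementally on each mapping change rather than recomputing them from scratch, which is what the update/restore pairs of tests exercise.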
diff --git a/networkx/algorithms/isomorphism/tree_isomorphism.py b/networkx/algorithms/isomorphism/tree_isomorphism.py
index 7e13d02..cfb0a93 100644
--- a/networkx/algorithms/isomorphism/tree_isomorphism.py
+++ b/networkx/algorithms/isomorphism/tree_isomorphism.py
@@ -103,7 +103,7 @@ def generate_isomorphism(v, w, M, ordered_children):
 def rooted_tree_isomorphism(t1, root1, t2, root2):
     """
     Given two rooted trees `t1` and `t2`,
-    with roots `root1` and `root2` respectivly
+    with roots `root1` and `root2` respectively
     this routine will determine if they are isomorphic.
 
     These trees may be either directed or undirected,
@@ -186,7 +186,7 @@ def rooted_tree_isomorphism(t1, root1, t2, root2):
         forlabel = sorted((ordered_labels[v], v) for v in L[i])
 
         # now assign labels to these nodes, according to the sorted order
-        # starting from 0, where idential ordered_labels get the same label
+        # starting from 0, where identical ordered_labels get the same label
         current = 0
         for i, (ol, v) in enumerate(forlabel):
             # advance to next label if not 0, and different from previous
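The comment corrected in this hunk describes how `rooted_tree_isomorphism` assigns canonical integer labels over sorted `(ordered_label, node)` pairs. A minimal standalone sketch of just that step, using hypothetical input data rather than the function's real tree structures:

```python
# Sketch of the label-assignment step: walk the sorted (ordered_label, node)
# pairs, assigning integers starting from 0 and giving identical
# ordered_labels the same integer.
forlabel = sorted([((0, 1), "a"), ((0, 1), "b"), ((1,), "c"), ((2,), "d")])

label_of = {}
current = 0
prev = None
for ol, v in forlabel:
    if prev is not None and ol != prev:
        current += 1  # new ordered_label -> advance to the next integer label
    label_of[v] = current
    prev = ol

assert label_of == {"a": 0, "b": 0, "c": 1, "d": 2}
```

Nodes "a" and "b" share an ordered_label and therefore receive the same canonical label, which is what lets the algorithm match structurally equivalent subtrees across the two trees.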
diff --git a/networkx/algorithms/isomorphism/vf2pp.py b/networkx/algorithms/isomorphism/vf2pp.py
new file mode 100644
index 0000000..9a9fd26
--- /dev/null
+++ b/networkx/algorithms/isomorphism/vf2pp.py
@@ -0,0 +1,1059 @@
+"""
+***************
+VF2++ Algorithm
+***************
+
+An implementation of the VF2++ algorithm for Graph Isomorphism testing.
+
+The simplest interface to use this module is to call:
+
+`vf2pp_is_isomorphic`: to check whether two graphs are isomorphic.
+`vf2pp_isomorphism`: to obtain the node mapping between two graphs,
+in case they are isomorphic.
+`vf2pp_all_isomorphisms`: to generate all possible mappings between two graphs,
+if isomorphic.
+
+Introduction
+------------
+The VF2++ algorithm follows a similar logic to that of VF2, while also
+introducing new easy-to-check cutting rules and determining the optimal access
+order of nodes. It is also implemented in a non-recursive manner, which saves
+both time and space when compared to its recursive predecessor.
+
+The optimal node ordering is obtained by taking into consideration both the
+degree and the label rarity of each node.
+This way we place the nodes that are more likely to match first in the order,
+thus examining the most promising branches in the beginning.
+The rules also consider node labels, making it easier to prune unfruitful
+branches early in the process.
+
+Examples
+--------
+
+Suppose G1 and G2 are isomorphic graphs. Verification is as follows:
+
+Without node labels:
+
+>>> import networkx as nx
+>>> G1 = nx.path_graph(4)
+>>> G2 = nx.path_graph(4)
+>>> nx.vf2pp_is_isomorphic(G1, G2, node_label=None)
+True
+>>> nx.vf2pp_isomorphism(G1, G2, node_label=None)
+{1: 1, 2: 2, 0: 0, 3: 3}
+
+With node labels:
+
+>>> G1 = nx.path_graph(4)
+>>> G2 = nx.path_graph(4)
+>>> mapped = {1: 1, 2: 2, 3: 3, 0: 0}
+>>> nx.set_node_attributes(G1, dict(zip(G1, ["blue", "red", "green", "yellow"])), "label")
+>>> nx.set_node_attributes(G2, dict(zip([mapped[u] for u in G1], ["blue", "red", "green", "yellow"])), "label")
+>>> nx.vf2pp_is_isomorphic(G1, G2, node_label="label")
+True
+>>> nx.vf2pp_isomorphism(G1, G2, node_label="label")
+{1: 1, 2: 2, 0: 0, 3: 3}
+
+"""
+import collections
+
+import networkx as nx
+
+__all__ = ["vf2pp_isomorphism", "vf2pp_is_isomorphic", "vf2pp_all_isomorphisms"]
+
+_GraphParameters = collections.namedtuple(
+    "_GraphParameters",
+    [
+        "G1",
+        "G2",
+        "G1_labels",
+        "G2_labels",
+        "nodes_of_G1Labels",
+        "nodes_of_G2Labels",
+        "G2_nodes_of_degree",
+    ],
+)
+
+_StateParameters = collections.namedtuple(
+    "_StateParameters",
+    [
+        "mapping",
+        "reverse_mapping",
+        "T1",
+        "T1_in",
+        "T1_tilde",
+        "T1_tilde_in",
+        "T2",
+        "T2_in",
+        "T2_tilde",
+        "T2_tilde_in",
+    ],
+)
+
+
+def vf2pp_isomorphism(G1, G2, node_label=None, default_label=None):
+    """Return an isomorphic mapping between `G1` and `G2` if it exists.
+
+    Parameters
+    ----------
+    G1, G2 : NetworkX Graph or MultiGraph instances.
+        The two graphs to check for isomorphism.
+
+    node_label : str, optional
+        The name of the node attribute to be used when comparing nodes.
+        The default is `None`, meaning node attributes are not considered
+        in the comparison. Any node that doesn't have the `node_label`
+        attribute uses `default_label` instead.
+
+    default_label : scalar
+        Default value to use when a node doesn't have an attribute
+        named `node_label`. Default is `None`.
+
+    Returns
+    -------
+    dict or None
+        Node mapping if the two graphs are isomorphic. None otherwise.
+    """
+    try:
+        mapping = next(vf2pp_all_isomorphisms(G1, G2, node_label, default_label))
+        return mapping
+    except StopIteration:
+        return None
+
+
+def vf2pp_is_isomorphic(G1, G2, node_label=None, default_label=None):
+    """Examines whether G1 and G2 are isomorphic.
+
+    Parameters
+    ----------
+    G1, G2 : NetworkX Graph or MultiGraph instances.
+        The two graphs to check for isomorphism.
+
+    node_label : str, optional
+        The name of the node attribute to be used when comparing nodes.
+        The default is `None`, meaning node attributes are not considered
+        in the comparison. Any node that doesn't have the `node_label`
+        attribute uses `default_label` instead.
+
+    default_label : scalar
+        Default value to use when a node doesn't have an attribute
+        named `node_label`. Default is `None`.
+
+    Returns
+    -------
+    bool
+        True if the two graphs are isomorphic, False otherwise.
+    """
+    return vf2pp_isomorphism(G1, G2, node_label, default_label) is not None
+
+
+def vf2pp_all_isomorphisms(G1, G2, node_label=None, default_label=None):
+    """Yields all the possible mappings between G1 and G2.
+
+    Parameters
+    ----------
+    G1, G2 : NetworkX Graph or MultiGraph instances.
+        The two graphs to check for isomorphism.
+
+    node_label : str, optional
+        The name of the node attribute to be used when comparing nodes.
+        The default is `None`, meaning node attributes are not considered
+        in the comparison. Any node that doesn't have the `node_label`
+        attribute uses `default_label` instead.
+
+    default_label : scalar
+        Default value to use when a node doesn't have an attribute
+        named `node_label`. Default is `None`.
+
+    Yields
+    ------
+    dict
+        Isomorphic mapping between the nodes in `G1` and `G2`.
+    """
+    if G1.number_of_nodes() == 0 or G2.number_of_nodes() == 0:
+        return False
+
+    # Create the degree dicts based on graph type
+    if G1.is_directed():
+        G1_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(G1.in_degree, G1.out_degree)
+        }
+        G2_degree = {
+            n: (in_degree, out_degree)
+            for (n, in_degree), (_, out_degree) in zip(G2.in_degree, G2.out_degree)
+        }
+    else:
+        G1_degree = dict(G1.degree)
+        G2_degree = dict(G2.degree)
+
+    if not G1.is_directed():
+        find_candidates = _find_candidates
+        restore_Tinout = _restore_Tinout
+    else:
+        find_candidates = _find_candidates_Di
+        restore_Tinout = _restore_Tinout_Di
+
+    # Check that both graphs have the same number of nodes and degree sequence
+    if G1.order() != G2.order():
+        return False
+    if sorted(G1_degree.values()) != sorted(G2_degree.values()):
+        return False
+
+    # Initialize parameters and cache necessary information about degree and labels
+    graph_params, state_params = _initialize_parameters(
+        G1, G2, G2_degree, node_label, default_label
+    )
+
+    # Check that G1 and G2 have the same labels, and that the number of nodes per label is equal between the two graphs
+    if not _precheck_label_properties(graph_params):
+        return False
+
+    # Calculate the optimal node ordering
+    node_order = _matching_order(graph_params)
+
+    # Initialize the stack
+    stack = []
+    candidates = iter(
+        find_candidates(node_order[0], graph_params, state_params, G1_degree)
+    )
+    stack.append((node_order[0], candidates))
+
+    mapping = state_params.mapping
+    reverse_mapping = state_params.reverse_mapping
+
+    # Index of the node from the order, currently being examined
+    matching_node = 1
+
+    while stack:
+        current_node, candidate_nodes = stack[-1]
+
+        try:
+            candidate = next(candidate_nodes)
+        except StopIteration:
+            # If no remaining candidates, return to a previous state, and follow another branch
+            stack.pop()
+            matching_node -= 1
+            if stack:
+                # Pop the previously added u-v pair, and look for a different candidate _v for u
+                popped_node1, _ = stack[-1]
+                popped_node2 = mapping[popped_node1]
+                mapping.pop(popped_node1)
+                reverse_mapping.pop(popped_node2)
+                restore_Tinout(popped_node1, popped_node2, graph_params, state_params)
+            continue
+
+        if _feasibility(current_node, candidate, graph_params, state_params):
+            # If adding this pair completes the mapping, yield a copy of it
+            if len(mapping) == G2.number_of_nodes() - 1:
+                cp_mapping = mapping.copy()
+                cp_mapping[current_node] = candidate
+                yield cp_mapping
+                continue
+
+            # Feasibility rules pass, so extend the mapping and update the parameters
+            mapping[current_node] = candidate
+            reverse_mapping[candidate] = current_node
+            _update_Tinout(current_node, candidate, graph_params, state_params)
+            # Append the next node and its candidates to the stack
+            candidates = iter(
+                find_candidates(
+                    node_order[matching_node], graph_params, state_params, G1_degree
+                )
+            )
+            stack.append((node_order[matching_node], candidates))
+            matching_node += 1
+
+
+def _precheck_label_properties(graph_params):
+    G1, G2, G1_labels, G2_labels, nodes_of_G1Labels, nodes_of_G2Labels, _ = graph_params
+    if any(
+        label not in nodes_of_G1Labels or len(nodes_of_G1Labels[label]) != len(nodes)
+        for label, nodes in nodes_of_G2Labels.items()
+    ):
+        return False
+    return True
+
+
+def _initialize_parameters(G1, G2, G2_degree, node_label=None, default_label=-1):
+    """Initializes all the necessary parameters for VF2++
+
+    Parameters
+    ----------
+    G1,G2: NetworkX Graph or MultiGraph instances.
+        The two graphs to check for isomorphism or monomorphism
+
+    G1_labels,G2_labels: dict
+        The label of every node in G1 and G2 respectively
+
+    Returns
+    -------
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2
+        G1_labels,G2_labels: dict
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+    """
+    G1_labels = dict(G1.nodes(data=node_label, default=default_label))
+    G2_labels = dict(G2.nodes(data=node_label, default=default_label))
+
+    graph_params = _GraphParameters(
+        G1,
+        G2,
+        G1_labels,
+        G2_labels,
+        nx.utils.groups(G1_labels),
+        nx.utils.groups(G2_labels),
+        nx.utils.groups(G2_degree),
+    )
+
+    T1, T1_in = set(), set()
+    T2, T2_in = set(), set()
+    if G1.is_directed():
+        T1_tilde, T1_tilde_in = (
+            set(G1.nodes()),
+            set(),
+        )  # todo: do we need Ti_tilde_in? What nodes does it have?
+        T2_tilde, T2_tilde_in = set(G2.nodes()), set()
+    else:
+        T1_tilde, T1_tilde_in = set(G1.nodes()), set()
+        T2_tilde, T2_tilde_in = set(G2.nodes()), set()
+
+    state_params = _StateParameters(
+        dict(),
+        dict(),
+        T1,
+        T1_in,
+        T1_tilde,
+        T1_tilde_in,
+        T2,
+        T2_in,
+        T2_tilde,
+        T2_tilde_in,
+    )
+
+    return graph_params, state_params
+
+
+def _matching_order(graph_params):
+    """The node ordering as introduced in VF2++.
+
+    Notes
+    -----
+    Taking into account the structure of the graph and the node labeling, the nodes are placed in an order such that
+    most of the unfruitful/infeasible branches of the search space can be pruned at high levels, significantly
+    decreasing the number of visited states. The premise is that the algorithm will be able to recognize
+    inconsistencies early, descending deep into the search tree only when needed.
+
+    Parameters
+    ----------
+    graph_params: namedtuple
+        Contains:
+
+            G1,G2: NetworkX Graph or MultiGraph instances.
+                The two graphs to check for isomorphism or monomorphism.
+
+            G1_labels,G2_labels: dict
+                The label of every node in G1 and G2 respectively.
+
+    Returns
+    -------
+    node_order: list
+        The ordering of the nodes.
+    """
+    G1, G2, G1_labels, _, _, nodes_of_G2Labels, _ = graph_params
+    if not G1 and not G2:
+        return []
+
+    if G1.is_directed():
+        G1 = G1.to_undirected(as_view=True)
+
+    V1_unordered = set(G1.nodes())
+    label_rarity = {label: len(nodes) for label, nodes in nodes_of_G2Labels.items()}
+    used_degrees = {node: 0 for node in G1}
+    node_order = []
+
+    while V1_unordered:
+        max_rarity = min(label_rarity[G1_labels[x]] for x in V1_unordered)
+        rarest_nodes = [
+            n for n in V1_unordered if label_rarity[G1_labels[n]] == max_rarity
+        ]
+        max_node = max(rarest_nodes, key=G1.degree)
+
+        for dlevel_nodes in nx.bfs_layers(G1, max_node):
+            nodes_to_add = dlevel_nodes.copy()
+            while nodes_to_add:
+                max_used_degree = max(used_degrees[n] for n in nodes_to_add)
+                max_used_degree_nodes = [
+                    n for n in nodes_to_add if used_degrees[n] == max_used_degree
+                ]
+                max_degree = max(G1.degree[n] for n in max_used_degree_nodes)
+                max_degree_nodes = [
+                    n for n in max_used_degree_nodes if G1.degree[n] == max_degree
+                ]
+                next_node = min(
+                    max_degree_nodes, key=lambda x: label_rarity[G1_labels[x]]
+                )
+
+                node_order.append(next_node)
+                for node in G1.neighbors(next_node):
+                    used_degrees[node] += 1
+
+                nodes_to_add.remove(next_node)
+                label_rarity[G1_labels[next_node]] -= 1
+                V1_unordered.discard(next_node)
+
+    return node_order
+
+
+def _find_candidates(
+    u, graph_params, state_params, G1_degree
+):  # todo: make the 4th argument the degree of u
+    """Given node u of G1, finds the candidates of u from G2.
+
+    Parameters
+    ----------
+    u: Graph node
+        The node from G1 for which to find the candidates from G2.
+
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2: NetworkX Graph or MultiGraph instances.
+            The two graphs to check for isomorphism or monomorphism
+
+        G1_labels,G2_labels: dict
+            The label of every node in G1 and G2 respectively
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+
+    Returns
+    -------
+    candidates: set
+        The nodes from G2 which are candidates for u.
+    """
+    G1, G2, G1_labels, _, _, nodes_of_G2Labels, G2_nodes_of_degree = graph_params
+    mapping, reverse_mapping, _, _, _, _, _, _, T2_tilde, _ = state_params
+
+    covered_neighbors = [nbr for nbr in G1[u] if nbr in mapping]
+    if not covered_neighbors:
+        candidates = set(nodes_of_G2Labels[G1_labels[u]])
+        candidates.intersection_update(G2_nodes_of_degree[G1_degree[u]])
+        candidates.intersection_update(T2_tilde)
+        candidates.difference_update(reverse_mapping)
+        if G1.is_multigraph():
+            candidates.difference_update(
+                {
+                    node
+                    for node in candidates
+                    if G1.number_of_edges(u, u) != G2.number_of_edges(node, node)
+                }
+            )
+        return candidates
+
+    nbr1 = covered_neighbors[0]
+    common_nodes = set(G2[mapping[nbr1]])
+
+    for nbr1 in covered_neighbors[1:]:
+        common_nodes.intersection_update(G2[mapping[nbr1]])
+
+    common_nodes.difference_update(reverse_mapping)
+    common_nodes.intersection_update(G2_nodes_of_degree[G1_degree[u]])
+    common_nodes.intersection_update(nodes_of_G2Labels[G1_labels[u]])
+    if G1.is_multigraph():
+        common_nodes.difference_update(
+            {
+                node
+                for node in common_nodes
+                if G1.number_of_edges(u, u) != G2.number_of_edges(node, node)
+            }
+        )
+    return common_nodes
+
+
+def _find_candidates_Di(u, graph_params, state_params, G1_degree):
+    G1, G2, G1_labels, _, _, nodes_of_G2Labels, G2_nodes_of_degree = graph_params
+    mapping, reverse_mapping, _, _, _, _, _, _, T2_tilde, _ = state_params
+
+    covered_successors = [succ for succ in G1[u] if succ in mapping]
+    covered_predecessors = [pred for pred in G1.pred[u] if pred in mapping]
+
+    if not (covered_successors or covered_predecessors):
+        candidates = set(nodes_of_G2Labels[G1_labels[u]])
+        candidates.intersection_update(G2_nodes_of_degree[G1_degree[u]])
+        candidates.intersection_update(T2_tilde)
+        candidates.difference_update(reverse_mapping)
+        if G1.is_multigraph():
+            candidates.difference_update(
+                {
+                    node
+                    for node in candidates
+                    if G1.number_of_edges(u, u) != G2.number_of_edges(node, node)
+                }
+            )
+        return candidates
+
+    if covered_successors:
+        succ1 = covered_successors[0]
+        common_nodes = set(G2.pred[mapping[succ1]])
+
+        for succ1 in covered_successors[1:]:
+            common_nodes.intersection_update(G2.pred[mapping[succ1]])
+    else:
+        pred1 = covered_predecessors.pop()
+        common_nodes = set(G2[mapping[pred1]])
+
+    for pred1 in covered_predecessors:
+        common_nodes.intersection_update(G2[mapping[pred1]])
+
+    common_nodes.difference_update(reverse_mapping)
+    common_nodes.intersection_update(G2_nodes_of_degree[G1_degree[u]])
+    common_nodes.intersection_update(nodes_of_G2Labels[G1_labels[u]])
+    if G1.is_multigraph():
+        common_nodes.difference_update(
+            {
+                node
+                for node in common_nodes
+                if G1.number_of_edges(u, u) != G2.number_of_edges(node, node)
+            }
+        )
+    return common_nodes
+
+
+def _feasibility(node1, node2, graph_params, state_params):
+    """Given a candidate pair of nodes u and v from G1 and G2 respectively, checks if it's feasible to extend the
+    mapping, i.e. if u and v can be matched.
+
+    Notes
+    -----
+    This function performs all the necessary checking by applying both consistency and cutting rules.
+
+    Parameters
+    ----------
+    node1, node2: Graph node
+        The candidate pair of nodes being checked for matching
+
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2: NetworkX Graph or MultiGraph instances.
+            The two graphs to check for isomorphism or monomorphism
+
+        G1_labels,G2_labels: dict
+            The label of every node in G1 and G2 respectively
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+
+    Returns
+    -------
+    True if all checks are successful, False otherwise.
+    """
+    G1 = graph_params.G1
+
+    if _cut_PT(node1, node2, graph_params, state_params):
+        return False
+
+    if G1.is_multigraph():
+        if not _consistent_PT(node1, node2, graph_params, state_params):
+            return False
+
+    return True
+
+
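The `_feasibility` helper added above accepts a candidate pair only if it survives both the cutting rules and the consistency rules. A minimal sketch of that idea on plain adjacency dicts (a hypothetical helper, not the library's API; the real checks are far richer — labels grouped per neighborhood, Ti intersections, multigraph edge counts):

```python
def feasible(u, v, G1, G2, labels1, labels2, mapping):
    # Cutting rule: matched nodes must carry the same label and degree.
    if labels1[u] != labels2[v] or len(G1[u]) != len(G2[v]):
        return False
    # Consistency rule: every already-covered neighbor of u must map to a neighbor of v.
    for nbr in G1[u]:
        if nbr in mapping and mapping[nbr] not in G2[v]:
            return False
    return True

G1 = {0: {1}, 1: {0, 2}, 2: {1}}                       # path 0-1-2
G2 = {"a": {"b"}, "b": {"a", "c"}, "c": {"b"}}         # path a-b-c
labels1 = {0: "x", 1: "x", 2: "x"}
labels2 = {n: "x" for n in G2}

assert feasible(1, "b", G1, G2, labels1, labels2, {0: "a"})
assert not feasible(0, "b", G1, G2, labels1, labels2, {})  # degree 1 vs 2
```

Both rules are cheap local checks, which is what lets the search prune branches before recursing.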
+def _cut_PT(u, v, graph_params, state_params):
+    """Implements the cutting rules for the ISO problem.
+
+    Parameters
+    ----------
+    u, v: Graph node
+        The two candidate nodes being examined.
+
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2: NetworkX Graph or MultiGraph instances.
+            The two graphs to check for isomorphism or monomorphism
+
+        G1_labels,G2_labels: dict
+            The label of every node in G1 and G2 respectively
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+
+    Returns
+    -------
+    True if we should prune this branch, i.e. the node pair failed the cutting checks. False otherwise.
+    """
+    G1, G2, G1_labels, G2_labels, _, _, _ = graph_params
+    (
+        _,
+        _,
+        T1,
+        T1_in,
+        T1_tilde,
+        _,
+        T2,
+        T2_in,
+        T2_tilde,
+        _,
+    ) = state_params
+
+    u_labels_predecessors, v_labels_predecessors = {}, {}
+    if G1.is_directed():
+        u_labels_predecessors = nx.utils.groups(
+            {n1: G1_labels[n1] for n1 in G1.pred[u]}
+        )
+        v_labels_predecessors = nx.utils.groups(
+            {n2: G2_labels[n2] for n2 in G2.pred[v]}
+        )
+
+        if set(u_labels_predecessors.keys()) != set(v_labels_predecessors.keys()):
+            return True
+
+    u_labels_successors = nx.utils.groups({n1: G1_labels[n1] for n1 in G1[u]})
+    v_labels_successors = nx.utils.groups({n2: G2_labels[n2] for n2 in G2[v]})
+
+    # If the neighbors of u do not have the same labels as those of v, the pair is not feasible.
+    if set(u_labels_successors.keys()) != set(v_labels_successors.keys()):
+        return True
+
+    for label, G1_nbh in u_labels_successors.items():
+        G2_nbh = v_labels_successors[label]
+
+        if G1.is_multigraph():
+            # Check that the sorted edge multiplicities from u to its neighborhood match those from v
+            u_nbrs_edges = sorted(G1.number_of_edges(u, x) for x in G1_nbh)
+            v_nbrs_edges = sorted(G2.number_of_edges(v, x) for x in G2_nbh)
+            if any(
+                u_nbr_edges != v_nbr_edges
+                for u_nbr_edges, v_nbr_edges in zip(u_nbrs_edges, v_nbrs_edges)
+            ):
+                return True
+
+        if len(T1.intersection(G1_nbh)) != len(T2.intersection(G2_nbh)):
+            return True
+        if len(T1_tilde.intersection(G1_nbh)) != len(T2_tilde.intersection(G2_nbh)):
+            return True
+        if G1.is_directed() and len(T1_in.intersection(G1_nbh)) != len(
+            T2_in.intersection(G2_nbh)
+        ):
+            return True
+
+    if not G1.is_directed():
+        return False
+
+    for label, G1_pred in u_labels_predecessors.items():
+        G2_pred = v_labels_predecessors[label]
+
+        if G1.is_multigraph():
+            # Check that the sorted edge multiplicities from u to its predecessors match those from v
+            u_pred_edges = sorted(G1.number_of_edges(u, x) for x in G1_pred)
+            v_pred_edges = sorted(G2.number_of_edges(v, x) for x in G2_pred)
+            if any(
+                u_nbr_edges != v_nbr_edges
+                for u_nbr_edges, v_nbr_edges in zip(u_pred_edges, v_pred_edges)
+            ):
+                return True
+
+        if len(T1.intersection(G1_pred)) != len(T2.intersection(G2_pred)):
+            return True
+        if len(T1_tilde.intersection(G1_pred)) != len(T2_tilde.intersection(G2_pred)):
+            return True
+        if len(T1_in.intersection(G1_pred)) != len(T2_in.intersection(G2_pred)):
+            return True
+
+    return False
+
+
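The core of `_cut_PT` is comparing the label structure of the two neighborhoods. A simplified sketch on plain adjacency dicts (a hypothetical helper, not the library's API — the actual routine additionally compares, per label, the sizes of the intersections with T1/T2, T1_in/T2_in and T1_tilde/T2_tilde):

```python
from collections import Counter

def cut_by_neighbor_labels(u, v, G1, G2, labels1, labels2):
    # Prune the pair (u, v) if the multisets of labels seen around u
    # and around v differ; an isomorphism could never match them.
    return Counter(labels1[n] for n in G1[u]) != Counter(labels2[n] for n in G2[v])

G1 = {0: {1, 2}, 1: {0}, 2: {0}}
G2 = {"a": {"b", "c"}, "b": {"a"}, "c": {"a"}}
labels1 = {0: "hub", 1: "leaf", 2: "leaf"}
labels2 = {"a": "hub", "b": "leaf", "c": "hub"}  # "c" mislabeled on purpose

assert not cut_by_neighbor_labels(1, "b", G1, G2, labels1, labels2)  # both see one "hub"
assert cut_by_neighbor_labels(0, "a", G1, G2, labels1, labels2)      # {leaf,leaf} vs {leaf,hub}
```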
+def _consistent_PT(u, v, graph_params, state_params):
+    """Checks the consistency of extending the mapping using the current node pair.
+
+    Parameters
+    ----------
+    u, v: Graph node
+        The two candidate nodes being examined.
+
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2: NetworkX Graph or MultiGraph instances.
+            The two graphs to check for isomorphism or monomorphism
+
+        G1_labels,G2_labels: dict
+            The label of every node in G1 and G2 respectively
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+
+    Returns
+    -------
+    True if the pair passes all the consistency checks successfully. False otherwise.
+    """
+    G1, G2 = graph_params.G1, graph_params.G2
+    mapping, reverse_mapping = state_params.mapping, state_params.reverse_mapping
+
+    for neighbor in G1[u]:
+        if neighbor in mapping:
+            if G1.number_of_edges(u, neighbor) != G2.number_of_edges(
+                v, mapping[neighbor]
+            ):
+                return False
+
+    for neighbor in G2[v]:
+        if neighbor in reverse_mapping:
+            if G1.number_of_edges(u, reverse_mapping[neighbor]) != G2.number_of_edges(
+                v, neighbor
+            ):
+                return False
+
+    if not G1.is_directed():
+        return True
+
+    for predecessor in G1.pred[u]:
+        if predecessor in mapping:
+            if G1.number_of_edges(predecessor, u) != G2.number_of_edges(
+                mapping[predecessor], v
+            ):
+                return False
+
+    for predecessor in G2.pred[v]:
+        if predecessor in reverse_mapping:
+            if G1.number_of_edges(
+                reverse_mapping[predecessor], u
+            ) != G2.number_of_edges(predecessor, v):
+                return False
+
+    return True
+
+
+def _update_Tinout(new_node1, new_node2, graph_params, state_params):
+    """Updates Ti/Ti_tilde (i=1,2) when a new node pair u-v is added to the mapping.
+
+    Notes
+    -----
+    This function should be called right after the feasibility checks have passed and node1 has been mapped to node2.
+    Its purpose is to avoid recomputing Ti/Ti_tilde from scratch by iterating over all nodes of the graph and checking
+    which ones satisfy the necessary conditions. Instead, every step of the algorithm focuses exclusively on the two
+    nodes being added to the mapping, updating Ti/Ti_tilde incrementally.
+
+    Parameters
+    ----------
+    new_node1, new_node2: Graph node
+        The two new nodes, added to the mapping.
+
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2: NetworkX Graph or MultiGraph instances.
+            The two graphs to check for isomorphism or monomorphism
+
+        G1_labels,G2_labels: dict
+            The label of every node in G1 and G2 respectively
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+    """
+    G1, G2, _, _, _, _, _ = graph_params
+    (
+        mapping,
+        reverse_mapping,
+        T1,
+        T1_in,
+        T1_tilde,
+        T1_tilde_in,
+        T2,
+        T2_in,
+        T2_tilde,
+        T2_tilde_in,
+    ) = state_params
+
+    uncovered_successors_G1 = {succ for succ in G1[new_node1] if succ not in mapping}
+    uncovered_successors_G2 = {
+        succ for succ in G2[new_node2] if succ not in reverse_mapping
+    }
+
+    # Add the uncovered neighbors of node1 and node2 in T1 and T2 respectively
+    T1.update(uncovered_successors_G1)
+    T2.update(uncovered_successors_G2)
+    T1.discard(new_node1)
+    T2.discard(new_node2)
+
+    T1_tilde.difference_update(uncovered_successors_G1)
+    T2_tilde.difference_update(uncovered_successors_G2)
+    T1_tilde.discard(new_node1)
+    T2_tilde.discard(new_node2)
+
+    if not G1.is_directed():
+        return
+
+    uncovered_predecessors_G1 = {
+        pred for pred in G1.pred[new_node1] if pred not in mapping
+    }
+    uncovered_predecessors_G2 = {
+        pred for pred in G2.pred[new_node2] if pred not in reverse_mapping
+    }
+
+    T1_in.update(uncovered_predecessors_G1)
+    T2_in.update(uncovered_predecessors_G2)
+    T1_in.discard(new_node1)
+    T2_in.discard(new_node2)
+
+    T1_tilde.difference_update(uncovered_predecessors_G1)
+    T2_tilde.difference_update(uncovered_predecessors_G2)
+    T1_tilde.discard(new_node1)
+    T2_tilde.discard(new_node2)
+
+
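The undirected half of `_update_Tinout` reduces to a handful of set operations. A self-contained sketch on a plain adjacency dict (hypothetical helper, not the library's API):

```python
def update_T(new_node, G, mapping, T, T_tilde):
    # Uncovered neighbors of the newly mapped node enter the frontier T
    # and leave T_tilde; the mapped node itself leaves both sets.
    uncovered = {n for n in G[new_node] if n not in mapping}
    T |= uncovered
    T.discard(new_node)
    T_tilde -= uncovered
    T_tilde.discard(new_node)

G = {0: {1, 2}, 1: {0}, 2: {0, 3}, 3: {2}}
mapping = {0: "a"}                 # node 0 was just mapped
T, T_tilde = set(), {0, 1, 2, 3}   # state before the update
update_T(0, G, mapping, T, T_tilde)

assert T == {1, 2}      # the uncovered neighbors of 0
assert T_tilde == {3}   # everything else stays unseen
```

Because only the new node's neighborhood is touched, each update costs O(deg(new_node)) instead of a full scan of the graph.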
+def _restore_Tinout(popped_node1, popped_node2, graph_params, state_params):
+    """Restores Ti/Ti_tilde to their previous version when a node pair is deleted from the mapping.
+
+    Parameters
+    ----------
+    popped_node1, popped_node2: Graph node
+        The two nodes deleted from the mapping.
+
+    graph_params: namedtuple
+        Contains all the Graph-related parameters:
+
+        G1,G2: NetworkX Graph or MultiGraph instances.
+            The two graphs to check for isomorphism or monomorphism
+
+        G1_labels,G2_labels: dict
+            The label of every node in G1 and G2 respectively
+
+    state_params: namedtuple
+        Contains all the State-related parameters:
+
+        mapping: dict
+            The mapping as extended so far. Maps nodes of G1 to nodes of G2
+
+        reverse_mapping: dict
+            The reverse mapping as extended so far. Maps nodes from G2 to nodes of G1. It's basically "mapping" reversed
+
+        T1, T2: set
+            Ti contains uncovered neighbors of covered nodes from Gi, i.e. nodes that are not in the mapping, but are
+            neighbors of nodes that are.
+
+        T1_tilde, T2_tilde: set
+            Ti_tilde contains all the nodes from Gi that are neither in the mapping nor in Ti
+    """
+    # If the node we want to remove from the mapping has at least one covered neighbor, keep it in T1.
+    G1, G2, _, _, _, _, _ = graph_params
+    (
+        mapping,
+        reverse_mapping,
+        T1,
+        T1_in,
+        T1_tilde,
+        T1_tilde_in,
+        T2,
+        T2_in,
+        T2_tilde,
+        T2_tilde_in,
+    ) = state_params
+
+    is_added = False
+    for neighbor in G1[popped_node1]:
+        if neighbor in mapping:
+            # if a neighbor of the excluded node1 is in the mapping, keep node1 in T1
+            is_added = True
+            T1.add(popped_node1)
+        else:
+            # Only exclude the neighbor from T1 if it has no other connection to a covered node
+            if any(nbr in mapping for nbr in G1[neighbor]):
+                continue
+            T1.discard(neighbor)
+            T1_tilde.add(neighbor)
+
+    # Case where the node is present in neither the mapping nor T1. By definition, it belongs to T1_tilde
+    if not is_added:
+        T1_tilde.add(popped_node1)
+
+    is_added = False
+    for neighbor in G2[popped_node2]:
+        if neighbor in reverse_mapping:
+            is_added = True
+            T2.add(popped_node2)
+        else:
+            if any(nbr in reverse_mapping for nbr in G2[neighbor]):
+                continue
+            T2.discard(neighbor)
+            T2_tilde.add(neighbor)
+
+    if not is_added:
+        T2_tilde.add(popped_node2)
+
+
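`_restore_Tinout` is the inverse of the update step: popping a node must leave T/T_tilde exactly as they were before that node was mapped. A round-trip sketch of the undirected case on a plain adjacency dict (hypothetical helper, not the library's API):

```python
def restore_T(popped, G, mapping, T, T_tilde):
    # The popped node stays in T only if it still has a covered neighbor;
    # each neighbor drops from T (back into T_tilde) when it loses its
    # last covered neighbor.
    kept = False
    for nbr in G[popped]:
        if nbr in mapping:
            kept = True
            T.add(popped)
        elif not any(n in mapping for n in G[nbr]):
            T.discard(nbr)
            T_tilde.add(nbr)
    if not kept:
        T_tilde.add(popped)

G = {0: {1, 2}, 1: {0}, 2: {0, 3}, 3: {2}}
mapping = {}                  # node 0 has just been removed from the mapping
T, T_tilde = {1, 2}, {3}      # state left behind by the earlier update
restore_T(0, G, mapping, T, T_tilde)

assert T == set() and T_tilde == {0, 1, 2, 3}  # pre-mapping state recovered
```

Maintaining this exact inverse is what allows the backtracking search to undo a choice in O(deg) time.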
+def _restore_Tinout_Di(popped_node1, popped_node2, graph_params, state_params):
+    # If the node we want to remove from the mapping has at least one covered neighbor, keep it in T1/T1_in.
+    G1, G2, _, _, _, _, _ = graph_params
+    (
+        mapping,
+        reverse_mapping,
+        T1,
+        T1_in,
+        T1_tilde,
+        T1_tilde_in,
+        T2,
+        T2_in,
+        T2_tilde,
+        T2_tilde_in,
+    ) = state_params
+
+    is_added = False
+    for successor in G1[popped_node1]:
+        if successor in mapping:
+            # if a successor of the popped node is in the mapping, keep the popped node in T1_in
+            is_added = True
+            T1_in.add(popped_node1)
+        else:
+            # Only exclude the successor from T1/T1_in if it has no remaining connection to a covered node
+            if not any(pred in mapping for pred in G1.pred[successor]):
+                T1.discard(successor)
+
+            if not any(succ in mapping for succ in G1[successor]):
+                T1_in.discard(successor)
+
+            if successor not in T1:
+                if successor not in T1_in:
+                    T1_tilde.add(successor)
+
+    for predecessor in G1.pred[popped_node1]:
+        if predecessor in mapping:
+            # if a predecessor of the popped node is in the mapping, keep the popped node in T1
+            is_added = True
+            T1.add(popped_node1)
+        else:
+            # Only exclude the predecessor from T1/T1_in if it has no remaining connection to a covered node
+            if not any(pred in mapping for pred in G1.pred[predecessor]):
+                T1.discard(predecessor)
+
+            if not any(succ in mapping for succ in G1[predecessor]):
+                T1_in.discard(predecessor)
+
+            if not (predecessor in T1 or predecessor in T1_in):
+                T1_tilde.add(predecessor)
+
+    # Case where the node is present in neither the mapping nor T1. By definition, it belongs to T1_tilde
+    if not is_added:
+        T1_tilde.add(popped_node1)
+
+    is_added = False
+    for successor in G2[popped_node2]:
+        if successor in reverse_mapping:
+            is_added = True
+            T2_in.add(popped_node2)
+        else:
+            if not any(pred in reverse_mapping for pred in G2.pred[successor]):
+                T2.discard(successor)
+
+            if not any(succ in reverse_mapping for succ in G2[successor]):
+                T2_in.discard(successor)
+
+            if successor not in T2:
+                if successor not in T2_in:
+                    T2_tilde.add(successor)
+
+    for predecessor in G2.pred[popped_node2]:
+        if predecessor in reverse_mapping:
+            # if a predecessor of the popped node is in the mapping, keep the popped node in T2
+            is_added = True
+            T2.add(popped_node2)
+        else:
+            # Only exclude the predecessor from T2/T2_in if it has no remaining connection to a covered node
+            if not any(pred in reverse_mapping for pred in G2.pred[predecessor]):
+                T2.discard(predecessor)
+
+            if not any(succ in reverse_mapping for succ in G2[predecessor]):
+                T2_in.discard(predecessor)
+
+            if not (predecessor in T2 or predecessor in T2_in):
+                T2_tilde.add(predecessor)
+
+    if not is_added:
+        T2_tilde.add(popped_node2)
diff --git a/networkx/algorithms/link_analysis/hits_alg.py b/networkx/algorithms/link_analysis/hits_alg.py
index 2deb3f4..47fcdb4 100644
--- a/networkx/algorithms/link_analysis/hits_alg.py
+++ b/networkx/algorithms/link_analysis/hits_alg.py
@@ -2,9 +2,10 @@
 """
 import networkx as nx
 
-__all__ = ["hits", "hits_numpy", "hits_scipy", "authority_matrix", "hub_matrix"]
+__all__ = ["hits"]
 
 
+@nx._dispatch
 def hits(G, max_iter=100, tol=1.0e-8, nstart=None, normalized=True):
     """Returns HITS hubs and authorities values for nodes.
 
@@ -144,51 +145,9 @@ def _hits_python(G, max_iter=100, tol=1.0e-8, nstart=None, normalized=True):
     return h, a
 
 
-def authority_matrix(G, nodelist=None):
-    """Returns the HITS authority matrix.
-
-    .. deprecated:: 2.6
-    """
-    import warnings
-
-    msg = (
-        "\nauthority_matrix is deprecated as of version 2.6 and will be removed "
-        "in version 3.0.\n"
-        "The authority matrix can be computed by::\n"
-        "    >>> M = nx.to_numpy_array(G, nodelist=nodelist)\n"
-        "    >>> M.T @ M"
-    )
-    warnings.warn(msg, DeprecationWarning)
-    M = nx.to_numpy_array(G, nodelist=nodelist)
-    return M.T @ M
-
-
-def hub_matrix(G, nodelist=None):
-    """Returns the HITS hub matrix.
-
-    .. deprecated:: 2.6
-    """
-    import warnings
-
-    msg = (
-        "\nhub_matrix is deprecated as of version 2.6 and will be removed "
-        "in version 3.0.\n"
-        "The hub matrix can be computed by::\n"
-        "    >>> M = nx.to_numpy_array(G, nodelist=nodelist)\n"
-        "    >>> M @ M.T"
-    )
-    warnings.warn(msg, DeprecationWarning)
-    M = nx.to_numpy_array(G, nodelist=nodelist)
-    return M @ M.T
-
-
-def hits_numpy(G, normalized=True):
+def _hits_numpy(G, normalized=True):
     """Returns HITS hubs and authorities values for nodes.
 
-    .. deprecated:: 2.6
-
-       hits_numpy is deprecated and will be removed in networkx 3.0.
-
     The HITS algorithm computes two numbers for a node.
     Authorities estimates the node value based on the incoming links.
     Hubs estimates the node value based on outgoing links.
@@ -221,10 +180,11 @@ def hits_numpy(G, normalized=True):
     >>> hubs_matrix = adj_ary @ adj_ary.T
     >>> authority_matrix = adj_ary.T @ adj_ary
 
-    `hits_numpy` maps the eigenvector corresponding to the maximum eigenvalue
+    `_hits_numpy` maps the eigenvector corresponding to the maximum eigenvalue
     of the respective matrices to the nodes in `G`:
 
-    >>> hubs, authority = nx.hits_numpy(G)
+    >>> from networkx.algorithms.link_analysis.hits_alg import _hits_numpy
+    >>> hubs, authority = _hits_numpy(G)
 
     Notes
     -----
@@ -245,19 +205,8 @@ def hits_numpy(G, normalized=True):
        doi:10.1145/324133.324140.
        http://www.cs.cornell.edu/home/kleinber/auth.pdf.
     """
-    import warnings
-
     import numpy as np
 
-    warnings.warn(
-        (
-            "networkx.hits_numpy is deprecated and will be removed"
-            "in NetworkX 3.0, use networkx.hits instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-
     if len(G) == 0:
         return {}, {}
     adj_ary = nx.to_numpy_array(G)
@@ -280,12 +229,9 @@ def hits_numpy(G, normalized=True):
     return hubs, authorities
 
 
-def hits_scipy(G, max_iter=100, tol=1.0e-6, nstart=None, normalized=True):
+def _hits_scipy(G, max_iter=100, tol=1.0e-6, nstart=None, normalized=True):
     """Returns HITS hubs and authorities values for nodes.
 
-    .. deprecated:: 2.6
-
-       hits_scipy is deprecated and will be removed in networkx 3.0
 
     The HITS algorithm computes two numbers for a node.
     Authorities estimates the node value based on the incoming links.
@@ -316,8 +262,9 @@ def hits_scipy(G, max_iter=100, tol=1.0e-6, nstart=None, normalized=True):
 
     Examples
     --------
+    >>> from networkx.algorithms.link_analysis.hits_alg import _hits_scipy
     >>> G = nx.path_graph(4)
-    >>> h, a = nx.hits(G)
+    >>> h, a = _hits_scipy(G)
 
     Notes
     -----
@@ -350,19 +297,8 @@ def hits_scipy(G, max_iter=100, tol=1.0e-6, nstart=None, normalized=True):
        doi:10.1145/324133.324140.
        http://www.cs.cornell.edu/home/kleinber/auth.pdf.
     """
-    import warnings
-
     import numpy as np
 
-    warnings.warn(
-        (
-            "networkx.hits_scipy is deprecated and will be removed"
-            "in NetworkX 3.0, use networkx.hits instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-
     if len(G) == 0:
         return {}, {}
     A = nx.to_scipy_sparse_array(G, nodelist=list(G))
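The diff above keeps `nx.hits` public while making the numpy/scipy/python backends private. The underlying power iteration is short enough to sketch in pure Python (a hypothetical simplified helper, not the library's API — no tolerance check, fixed iteration count):

```python
def hits_sketch(edges, iters=50):
    # HITS mutual recursion: an authority score is the sum of the hub
    # scores pointing at it; a hub score is the sum of the authority
    # scores it points at. Normalize each pass so values stay bounded.
    nodes = {n for e in edges for n in e}
    h = {n: 1.0 for n in nodes}
    a = {n: 1.0 for n in nodes}
    for _ in range(iters):
        a = {n: sum(h[u] for u, v in edges if v == n) for n in nodes}
        h = {n: sum(a[v] for u, v in edges if u == n) for n in nodes}
        for d in (a, h):
            s = sum(d.values()) or 1.0
            for n in d:
                d[n] /= s
    return h, a

h, a = hits_sketch([("x", "y"), ("z", "y")])
assert a["y"] > a["x"]   # "y" collects all incoming links
assert h["x"] == h["z"]  # symmetric hubs score equally
```

`nx.hits` itself now solves the same fixed point via scipy's sparse eigensolver, which is why non-convergence surfaces as `ArpackNoConvergence` in the updated tests.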
diff --git a/networkx/algorithms/link_analysis/pagerank_alg.py b/networkx/algorithms/link_analysis/pagerank_alg.py
index ece444c..d11faab 100644
--- a/networkx/algorithms/link_analysis/pagerank_alg.py
+++ b/networkx/algorithms/link_analysis/pagerank_alg.py
@@ -3,9 +3,10 @@ from warnings import warn
 
 import networkx as nx
 
-__all__ = ["pagerank", "pagerank_numpy", "pagerank_scipy", "google_matrix"]
+__all__ = ["pagerank", "google_matrix"]
 
 
+@nx._dispatch
 def pagerank(
     G,
     alpha=0.85,
@@ -35,7 +36,7 @@ def pagerank(
       The "personalization vector" consisting of a dictionary with a
       key some subset of graph nodes and personalization value each of those.
       At least one personalization value must be non-zero.
-      If not specfiied, a nodes personalization value will be zero.
+      If not specified, a node's personalization value will be zero.
       By default, a uniform distribution is used.
 
     max_iter : integer, optional
@@ -86,7 +87,7 @@ def pagerank(
 
     See Also
     --------
-    pagerank_numpy, pagerank_scipy, google_matrix
+    google_matrix
 
     Raises
     ------
@@ -105,7 +106,7 @@ def pagerank(
        http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf
 
     """
-    return pagerank_scipy(
+    return _pagerank_scipy(
         G, alpha, personalization, max_iter, tol, nstart, weight, dangling
     )
 
@@ -170,6 +171,7 @@ def _pagerank_python(
     raise nx.PowerIterationFailedConvergence(max_iter)
 
 
+@nx._dispatch
 def google_matrix(
     G, alpha=0.85, personalization=None, nodelist=None, weight="weight", dangling=None
 ):
@@ -188,7 +190,7 @@ def google_matrix(
       The "personalization vector" consisting of a dictionary with a
       key some subset of graph nodes and personalization value each of those.
       At least one personalization value must be non-zero.
-      If not specfiied, a nodes personalization value will be zero.
+      If not specified, a node's personalization value will be zero.
       By default, a uniform distribution is used.
 
     nodelist : list, optional
@@ -209,12 +211,12 @@ def google_matrix(
 
     Returns
     -------
-    A : NumPy matrix
+    A : 2D NumPy ndarray
        Google matrix of the graph
 
     Notes
     -----
-    The matrix returned represents the transition matrix that describes the
+    The array returned represents the transition matrix that describes the
     Markov chain used in PageRank. For PageRank to converge to a unique
     solution (i.e., a unique stationary distribution in a Markov chain), the
     transition matrix must be irreducible. In other words, it must be that
@@ -227,28 +229,17 @@ def google_matrix(
 
     See Also
     --------
-    pagerank, pagerank_numpy, pagerank_scipy
+    pagerank
     """
-    # TODO: Remove this warning in version 3.0
-    import warnings
-
     import numpy as np
 
-    warnings.warn(
-        "google_matrix will return an np.ndarray instead of a np.matrix in\n"
-        "NetworkX version 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-
     if nodelist is None:
         nodelist = list(G)
 
     A = nx.to_numpy_array(G, nodelist=nodelist, weight=weight)
     N = len(G)
     if N == 0:
-        # TODO: Remove np.asmatrix wrapper in version 3.0
-        return np.asmatrix(A)
+        return A
 
     # Personalization vector
     if personalization is None:
@@ -273,11 +264,12 @@ def google_matrix(
 
     A /= A.sum(axis=1)[:, np.newaxis]  # Normalize rows to sum to 1
 
-    # TODO: Remove np.asmatrix wrapper in version 3.0
-    return np.asmatrix(alpha * A + (1 - alpha) * p)
+    return alpha * A + (1 - alpha) * p
 
 
-def pagerank_numpy(G, alpha=0.85, personalization=None, weight="weight", dangling=None):
+def _pagerank_numpy(
+    G, alpha=0.85, personalization=None, weight="weight", dangling=None
+):
     """Returns the PageRank of the nodes in the graph.
 
     PageRank computes a ranking of the nodes in the graph G based on
@@ -297,7 +289,7 @@ def pagerank_numpy(G, alpha=0.85, personalization=None, weight="weight", danglin
       The "personalization vector" consisting of a dictionary with a
       key some subset of graph nodes and personalization value each of those.
       At least one personalization value must be non-zero.
-      If not specfiied, a nodes personalization value will be zero.
+      If not specified, a node's personalization value will be zero.
       By default, a uniform distribution is used.
 
     weight : key, optional
@@ -319,8 +311,9 @@ def pagerank_numpy(G, alpha=0.85, personalization=None, weight="weight", danglin
 
     Examples
     --------
+    >>> from networkx.algorithms.link_analysis.pagerank_alg import _pagerank_numpy
     >>> G = nx.DiGraph(nx.path_graph(4))
-    >>> pr = nx.pagerank_numpy(G, alpha=0.9)
+    >>> pr = _pagerank_numpy(G, alpha=0.9)
 
     Notes
     -----
@@ -334,7 +327,7 @@ def pagerank_numpy(G, alpha=0.85, personalization=None, weight="weight", danglin
 
     See Also
     --------
-    pagerank, pagerank_scipy, google_matrix
+    pagerank, google_matrix
 
     References
     ----------
@@ -345,8 +338,6 @@ def pagerank_numpy(G, alpha=0.85, personalization=None, weight="weight", danglin
        The PageRank citation ranking: Bringing order to the Web. 1999
        http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf
     """
-    msg = "networkx.pagerank_numpy is deprecated and will be removed in NetworkX 3.0, use networkx.pagerank instead."
-    warn(msg, DeprecationWarning, stacklevel=2)
     import numpy as np
 
     if len(G) == 0:
@@ -363,7 +354,7 @@ def pagerank_numpy(G, alpha=0.85, personalization=None, weight="weight", danglin
     return dict(zip(G, map(float, largest / norm)))
 
 
-def pagerank_scipy(
+def _pagerank_scipy(
     G,
     alpha=0.85,
     personalization=None,
@@ -392,7 +383,7 @@ def pagerank_scipy(
       The "personalization vector" consisting of a dictionary with a
       key some subset of graph nodes and personalization value each of those.
       At least one personalization value must be non-zero.
-      If not specfiied, a nodes personalization value will be zero.
+      If not specified, a nodes personalization value will be zero.
       By default, a uniform distribution is used.
 
     max_iter : integer, optional
@@ -423,8 +414,9 @@ def pagerank_scipy(
 
     Examples
     --------
+    >>> from networkx.algorithms.link_analysis.pagerank_alg import _pagerank_scipy
     >>> G = nx.DiGraph(nx.path_graph(4))
-    >>> pr = nx.pagerank_scipy(G, alpha=0.9)
+    >>> pr = _pagerank_scipy(G, alpha=0.9)
 
     Notes
     -----
@@ -437,7 +429,7 @@ def pagerank_scipy(
 
     See Also
     --------
-    pagerank, pagerank_numpy, google_matrix
+    pagerank
 
     Raises
     ------
@@ -455,8 +447,6 @@ def pagerank_scipy(
        The PageRank citation ranking: Bringing order to the Web. 1999
        http://dbpubs.stanford.edu:8090/pub/showDoc.Fulltext?lang=en&doc=1999-66&format=pdf
     """
-    msg = "networkx.pagerank_scipy is deprecated and will be removed in NetworkX 3.0, use networkx.pagerank instead."
-    warn(msg, DeprecationWarning, stacklevel=2)
     import numpy as np
     import scipy as sp
     import scipy.sparse  # call as sp.sparse
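As with HITS, `nx.pagerank` stays public and now always delegates to the private scipy backend. The damped power iteration it implements can be sketched in pure Python (hypothetical helper, not the library's API — uniform personalization, no dangling-node handling):

```python
def pagerank_sketch(edges, alpha=0.85, iters=100):
    # Each pass redistributes rank along out-edges, damped by alpha,
    # with (1 - alpha) of the mass restarting uniformly.
    nodes = {n for e in edges for n in e}
    out = {n: [v for u, v in edges if u == n] for n in nodes}
    pr = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {n: (1 - alpha) / len(nodes) for n in nodes}
        for u in nodes:
            for v in out[u]:
                new[v] += alpha * pr[u] / len(out[u])
        pr = new
    return pr

pr = pagerank_sketch([("a", "b"), ("c", "b"), ("b", "a"), ("b", "c")])
assert pr["b"] > pr["a"]                   # "b" receives two in-links
assert abs(sum(pr.values()) - 1.0) < 1e-9  # ranks form a distribution
```

The production `_pagerank_scipy` does the same iteration vectorized over a sparse adjacency matrix, plus the dangling and personalization handling documented above.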
diff --git a/networkx/algorithms/link_analysis/tests/test_hits.py b/networkx/algorithms/link_analysis/tests/test_hits.py
index df5f0da..1b242a1 100644
--- a/networkx/algorithms/link_analysis/tests/test_hits.py
+++ b/networkx/algorithms/link_analysis/tests/test_hits.py
@@ -6,7 +6,11 @@ np = pytest.importorskip("numpy")
 sp = pytest.importorskip("scipy")
 import scipy.sparse  # call as sp.sparse
 
-from networkx.algorithms.link_analysis.hits_alg import _hits_python
+from networkx.algorithms.link_analysis.hits_alg import (
+    _hits_numpy,
+    _hits_python,
+    _hits_scipy,
+)
 
 # Example from
 # A. Langville and C. Meyer, "A survey of eigenvector methods of web
@@ -32,13 +36,13 @@ class TestHITS:
 
     def test_hits_numpy(self):
         G = self.G
-        h, a = nx.hits_numpy(G)
+        h, a = _hits_numpy(G)
         for n in G:
             assert h[n] == pytest.approx(G.h[n], abs=1e-4)
         for n in G:
             assert a[n] == pytest.approx(G.a[n], abs=1e-4)
 
-    @pytest.mark.parametrize("hits_alg", (nx.hits, nx.hits_scipy, _hits_python))
+    @pytest.mark.parametrize("hits_alg", (nx.hits, _hits_python, _hits_scipy))
     def test_hits(self, hits_alg):
         G = self.G
         h, a = hits_alg(G, tol=1.0e-08)
@@ -56,34 +60,21 @@ class TestHITS:
     def test_empty(self):
         G = nx.Graph()
         assert nx.hits(G) == ({}, {})
-        assert nx.hits_numpy(G) == ({}, {})
+        assert _hits_numpy(G) == ({}, {})
         assert _hits_python(G) == ({}, {})
-        assert nx.hits_scipy(G) == ({}, {})
-        assert nx.authority_matrix(G).shape == (0, 0)
-        assert nx.hub_matrix(G).shape == (0, 0)
+        assert _hits_scipy(G) == ({}, {})
 
     def test_hits_not_convergent(self):
         G = nx.path_graph(50)
         with pytest.raises(nx.PowerIterationFailedConvergence):
-            nx.hits_scipy(G, max_iter=1)
+            _hits_scipy(G, max_iter=1)
         with pytest.raises(nx.PowerIterationFailedConvergence):
             _hits_python(G, max_iter=1)
         with pytest.raises(nx.PowerIterationFailedConvergence):
-            nx.hits_scipy(G, max_iter=0)
+            _hits_scipy(G, max_iter=0)
         with pytest.raises(nx.PowerIterationFailedConvergence):
             _hits_python(G, max_iter=0)
         with pytest.raises(ValueError):
             nx.hits(G, max_iter=0)
         with pytest.raises(sp.sparse.linalg.ArpackNoConvergence):
             nx.hits(G, max_iter=1)
-
-
-@pytest.mark.parametrize("hits_alg", (nx.hits_numpy, nx.hits_scipy))
-def test_deprecation_warnings(hits_alg):
-    """Make sure deprecation warnings are raised.
-
-    To be removed when deprecations expire.
-    """
-    G = nx.DiGraph(nx.path_graph(4))
-    with pytest.warns(DeprecationWarning):
-        hits_alg(G)
diff --git a/networkx/algorithms/link_analysis/tests/test_pagerank.py b/networkx/algorithms/link_analysis/tests/test_pagerank.py
index 4c9722f..930b4e4 100644
--- a/networkx/algorithms/link_analysis/tests/test_pagerank.py
+++ b/networkx/algorithms/link_analysis/tests/test_pagerank.py
@@ -7,7 +7,11 @@ import networkx as nx
 np = pytest.importorskip("numpy")
 pytest.importorskip("scipy")
 
-from networkx.algorithms.link_analysis.pagerank_alg import _pagerank_python
+from networkx.algorithms.link_analysis.pagerank_alg import (
+    _pagerank_numpy,
+    _pagerank_python,
+    _pagerank_scipy,
+)
 
 # Example from
 # A. Langville and C. Meyer, "A survey of eigenvector methods of web
@@ -74,7 +78,7 @@ class TestPageRank:
 
     def test_numpy_pagerank(self):
         G = self.G
-        p = nx.pagerank_numpy(G, alpha=0.9)
+        p = _pagerank_numpy(G, alpha=0.9)
         for n in G:
             assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
 
@@ -82,11 +86,11 @@ class TestPageRank:
         G = self.G
         M = nx.google_matrix(G, alpha=0.9, nodelist=sorted(G))
         _, ev = np.linalg.eig(M.T)
-        p = np.array(ev[:, 0] / ev[:, 0].sum())[:, 0]
+        p = ev[:, 0] / ev[:, 0].sum()
         for (a, b) in zip(p, self.G.pagerank.values()):
             assert a == pytest.approx(b, abs=1e-7)
 
-    @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, nx.pagerank_numpy))
+    @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, _pagerank_numpy))
     def test_personalization(self, alg):
         G = nx.complete_graph(4)
         personalize = {0: 1, 1: 1, 2: 4, 3: 4}
@@ -153,7 +157,7 @@ class TestPageRank:
                 else:
                     assert M2[i, j] == pytest.approx(M1[i, j], abs=1e-4)
 
-    @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, nx.pagerank_numpy))
+    @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python, _pagerank_numpy))
     def test_dangling_pagerank(self, alg):
         pr = alg(self.G, dangling=self.dangling_edges)
         for n in self.G:
@@ -163,7 +167,7 @@ class TestPageRank:
         G = nx.Graph()
         assert nx.pagerank(G) == {}
         assert _pagerank_python(G) == {}
-        assert nx.pagerank_numpy(G) == {}
+        assert _pagerank_numpy(G) == {}
         assert nx.google_matrix(G).shape == (0, 0)
 
     @pytest.mark.parametrize("alg", (nx.pagerank, _pagerank_python))
@@ -184,37 +188,26 @@ class TestPageRank:
 class TestPageRankScipy(TestPageRank):
     def test_scipy_pagerank(self):
         G = self.G
-        p = nx.pagerank_scipy(G, alpha=0.9, tol=1.0e-08)
+        p = _pagerank_scipy(G, alpha=0.9, tol=1.0e-08)
         for n in G:
             assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
         personalize = {n: random.random() for n in G}
-        p = nx.pagerank_scipy(G, alpha=0.9, tol=1.0e-08, personalization=personalize)
+        p = _pagerank_scipy(G, alpha=0.9, tol=1.0e-08, personalization=personalize)
 
         nstart = {n: random.random() for n in G}
-        p = nx.pagerank_scipy(G, alpha=0.9, tol=1.0e-08, nstart=nstart)
+        p = _pagerank_scipy(G, alpha=0.9, tol=1.0e-08, nstart=nstart)
         for n in G:
             assert p[n] == pytest.approx(G.pagerank[n], abs=1e-4)
 
     def test_scipy_pagerank_max_iter(self):
         with pytest.raises(nx.PowerIterationFailedConvergence):
-            nx.pagerank_scipy(self.G, max_iter=0)
+            _pagerank_scipy(self.G, max_iter=0)
 
     def test_dangling_scipy_pagerank(self):
-        pr = nx.pagerank_scipy(self.G, dangling=self.dangling_edges)
+        pr = _pagerank_scipy(self.G, dangling=self.dangling_edges)
         for n in self.G:
             assert pr[n] == pytest.approx(self.G.dangling_pagerank[n], abs=1e-4)
 
     def test_empty_scipy(self):
         G = nx.Graph()
-        assert nx.pagerank_scipy(G) == {}
-
-
-@pytest.mark.parametrize("pagerank_alg", (nx.pagerank_numpy, nx.pagerank_scipy))
-def test_deprecation_warnings(pagerank_alg):
-    """Make sure deprecation warnings are raised.
-
-    To be removed when deprecations expire.
-    """
-    G = nx.DiGraph(nx.path_graph(4))
-    with pytest.warns(DeprecationWarning):
-        pagerank_alg(G, alpha=0.9)
+        assert _pagerank_scipy(G) == {}
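As with HITS, the pagerank tests now import the private `_pagerank_numpy`/`_pagerank_python`/`_pagerank_scipy` helpers, leaving `nx.pagerank` as the sole public entry point. A dependency-free sketch of the underlying power iteration, including the dangling-node redistribution these tests cover (the successor-dict encoding is an assumption of the sketch):

```python
def pagerank(succ, alpha=0.85, max_iter=100, tol=1e-10):
    """PageRank by power iteration over a successor dict."""
    nodes = sorted(set(succ) | {v for vs in succ.values() for v in vs})
    N = len(nodes)
    x = {n: 1.0 / N for n in nodes}  # start from the uniform distribution
    for _ in range(max_iter):
        xlast = x
        x = {n: 0.0 for n in nodes}
        # mass sitting on dangling nodes is redistributed uniformly
        dangling = sum(xlast[n] for n in nodes if not succ.get(n))
        for u in nodes:
            out = succ.get(u, [])
            for v in out:
                x[v] += alpha * xlast[u] / len(out)
        for n in nodes:
            x[n] += alpha * dangling / N + (1 - alpha) / N
        if sum(abs(x[n] - xlast[n]) for n in nodes) < N * tol:
            return x
    raise RuntimeError("power iteration failed to converge")
```

On a symmetric 3-cycle the fixed point is the uniform distribution, so the scores come out at 1/3 each.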
diff --git a/networkx/algorithms/link_prediction.py b/networkx/algorithms/link_prediction.py
index de29e24..771ce79 100644
--- a/networkx/algorithms/link_prediction.py
+++ b/networkx/algorithms/link_prediction.py
@@ -94,6 +94,7 @@ def resource_allocation_index(G, ebunch=None):
     return _apply_prediction(G, predict, ebunch)
 
 
+@nx._dispatch
 @not_implemented_for("directed")
 @not_implemented_for("multigraph")
 def jaccard_coefficient(G, ebunch=None):
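The only change here registers `jaccard_coefficient` for backend dispatch via `@nx._dispatch`. The measure itself is `|Γ(u) ∩ Γ(v)| / |Γ(u) ∪ Γ(v)|` over neighbor sets; a minimal stand-alone sketch over a plain adjacency dict (the networkx function instead yields `(u, v, p)` triples for an `ebunch` of node pairs):

```python
def jaccard_coefficient(adj, u, v):
    """Jaccard similarity of the neighbor sets of u and v.

    adj: dict mapping node -> set of neighbors (undirected graph).
    """
    union = adj[u] | adj[v]
    if not union:
        return 0.0  # both nodes isolated; define the coefficient as 0
    return len(adj[u] & adj[v]) / len(union)
```

For the path `0-1-2-3`, nodes 0 and 2 share the single neighbor 1 out of the union `{1, 3}`, giving 0.5.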
diff --git a/networkx/algorithms/lowest_common_ancestors.py b/networkx/algorithms/lowest_common_ancestors.py
index 68aaf7d..9c40758 100644
--- a/networkx/algorithms/lowest_common_ancestors.py
+++ b/networkx/algorithms/lowest_common_ancestors.py
@@ -14,7 +14,6 @@ __all__ = [
 
 
 @not_implemented_for("undirected")
-@not_implemented_for("multigraph")
 def all_pairs_lowest_common_ancestor(G, pairs=None):
     """Return the lowest common ancestor of all pairs or the provided pairs
 
@@ -112,7 +111,6 @@ def all_pairs_lowest_common_ancestor(G, pairs=None):
 
 
 @not_implemented_for("undirected")
-@not_implemented_for("multigraph")
 def lowest_common_ancestor(G, node1, node2, default=None):
     """Compute the lowest common ancestor of the given pair of nodes.
 
@@ -150,7 +148,6 @@ def lowest_common_ancestor(G, node1, node2, default=None):
 
 
 @not_implemented_for("undirected")
-@not_implemented_for("multigraph")
 def tree_all_pairs_lowest_common_ancestor(G, root=None, pairs=None):
     r"""Yield the lowest common ancestor for sets of pairs in a tree.
 
@@ -237,7 +234,8 @@ def tree_all_pairs_lowest_common_ancestor(G, root=None, pairs=None):
                     msg = "No root specified and tree has multiple sources."
                     raise nx.NetworkXError(msg)
                 root = n
-            elif deg > 1:
+            # checking deg>1 is not sufficient for MultiDiGraphs
+            elif deg > 1 and len(G.pred[n]) > 1:
                 msg = "Tree LCA only defined on trees; use DAG routine."
                 raise nx.NetworkXError(msg)
     if root is None:
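These hunks lift the `multigraph` restriction from the LCA routines and tighten the tree check accordingly: `deg > 1` counts incoming edges (which a `MultiDiGraph` can duplicate), while the added `len(G.pred[n]) > 1` counts distinct predecessors, so parallel edges from a single parent no longer trip the "not a tree" error. For reference, a minimal LCA sketch over a parent map (a hypothetical encoding for illustration; the networkx routines operate on `DiGraph`/`MultiDiGraph` objects):

```python
def lowest_common_ancestor(parent, u, v):
    """LCA of u and v in a rooted tree given as a child -> parent map.

    The root maps to None. A node counts as its own ancestor.
    """
    ancestors = set()
    while u is not None:       # collect u and all of its ancestors
        ancestors.add(u)
        u = parent.get(u)
    while v not in ancestors:  # walk v upward until the paths meet
        v = parent[v]
    return v
```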
diff --git a/networkx/algorithms/matching.py b/networkx/algorithms/matching.py
index a4e7e9c..da6416d 100644
--- a/networkx/algorithms/matching.py
+++ b/networkx/algorithms/matching.py
@@ -255,7 +255,7 @@ def is_perfect_matching(G, matching):
 
 @not_implemented_for("multigraph")
 @not_implemented_for("directed")
-def min_weight_matching(G, maxcardinality=None, weight="weight"):
+def min_weight_matching(G, weight="weight"):
     """Computing a minimum-weight maximal matching of G.
 
     Use the maximum-weight algorithm with edge weights subtracted
@@ -290,15 +290,6 @@ def min_weight_matching(G, maxcardinality=None, weight="weight"):
     G : NetworkX graph
       Undirected graph
 
-    maxcardinality: bool
-        .. deprecated:: 2.8
-            The `maxcardinality` parameter will be removed in v3.0.
-            It doesn't make sense to set it to False when looking for
-            a min weight matching because then we just return no edges.
-
-        If maxcardinality is True, compute the maximum-cardinality matching
-        with minimum weight among all maximum-cardinality matchings.
-
     weight: string, optional (default='weight')
        Edge data key corresponding to the edge weight.
        If key not found, uses 1 as weight.
@@ -312,12 +303,6 @@ def min_weight_matching(G, maxcardinality=None, weight="weight"):
     --------
     max_weight_matching
     """
-    if maxcardinality not in (True, None):
-        raise nx.NetworkXError(
-            "The argument maxcardinality does not make sense "
-            "in the context of minimum weight matchings."
-            "It is deprecated and will be removed in v3.0."
-        )
     if len(G.edges) == 0:
         return max_weight_matching(G, maxcardinality=True, weight=weight)
     G_edges = G.edges(data=weight, default=1)
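With the deprecated `maxcardinality` parameter and its guard removed, `min_weight_matching` always inverts the edge weights and delegates to `max_weight_matching(..., maxcardinality=True)`, per the docstring ("Use the maximum-weight algorithm with edge weights subtracted by the maximum weight plus 1"). A small stand-alone sketch of that transform (the helper name is hypothetical; the real function applies this inline):

```python
def invert_weights(edges):
    """Map each weight w to (max_w + 1) - w over (u, v, w) triples.

    Every transformed weight stays strictly positive, so a
    maximum-weight maximal matching under the new weights is a
    minimum-weight maximal matching under the original ones.
    """
    max_w = max(w for _, _, w in edges)
    return [(u, v, max_w + 1 - w) for u, v, w in edges]
```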
diff --git a/networkx/algorithms/node_classification.py b/networkx/algorithms/node_classification.py
new file mode 100644
index 0000000..2875db0
--- /dev/null
+++ b/networkx/algorithms/node_classification.py
@@ -0,0 +1,218 @@
+""" This module provides the functions for node classification problem.
+
+The functions in this module are not imported
+into the top level `networkx` namespace.
+You can access these functions by importing
+the `networkx.algorithms.node_classification` modules,
+then accessing the functions as attributes of `node_classification`.
+For example:
+
+  >>> from networkx.algorithms import node_classification
+  >>> G = nx.path_graph(4)
+  >>> G.edges()
+  EdgeView([(0, 1), (1, 2), (2, 3)])
+  >>> G.nodes[0]["label"] = "A"
+  >>> G.nodes[3]["label"] = "B"
+  >>> node_classification.harmonic_function(G)
+  ['A', 'A', 'B', 'B']
+
+References
+----------
+Zhu, X., Ghahramani, Z., & Lafferty, J. (2003, August).
+Semi-supervised learning using gaussian fields and harmonic functions.
+In ICML (Vol. 3, pp. 912-919).
+"""
+import networkx as nx
+
+__all__ = ["harmonic_function", "local_and_global_consistency"]
+
+
+@nx.utils.not_implemented_for("directed")
+def harmonic_function(G, max_iter=30, label_name="label"):
+    """Node classification by Harmonic function
+
+    Function for computing Harmonic function algorithm by Zhu et al.
+
+    Parameters
+    ----------
+    G : NetworkX Graph
+    max_iter : int
+        maximum number of iterations allowed
+    label_name : string
+        name of target labels to predict
+
+    Returns
+    -------
+    predicted : list
+        List of length ``len(G)`` with the predicted labels for each node.
+
+    Raises
+    ------
+    NetworkXError
+        If no nodes in `G` have attribute `label_name`.
+
+    Examples
+    --------
+    >>> from networkx.algorithms import node_classification
+    >>> G = nx.path_graph(4)
+    >>> G.nodes[0]["label"] = "A"
+    >>> G.nodes[3]["label"] = "B"
+    >>> G.nodes(data=True)
+    NodeDataView({0: {'label': 'A'}, 1: {}, 2: {}, 3: {'label': 'B'}})
+    >>> G.edges()
+    EdgeView([(0, 1), (1, 2), (2, 3)])
+    >>> predicted = node_classification.harmonic_function(G)
+    >>> predicted
+    ['A', 'A', 'B', 'B']
+
+    References
+    ----------
+    Zhu, X., Ghahramani, Z., & Lafferty, J. (2003, August).
+    Semi-supervised learning using gaussian fields and harmonic functions.
+    In ICML (Vol. 3, pp. 912-919).
+    """
+    import numpy as np
+    import scipy as sp
+    import scipy.sparse  # call as sp.sparse
+
+    X = nx.to_scipy_sparse_array(G)  # adjacency matrix
+    labels, label_dict = _get_label_info(G, label_name)
+
+    if labels.shape[0] == 0:
+        raise nx.NetworkXError(
+            f"No node on the input graph is labeled by '{label_name}'."
+        )
+
+    n_samples = X.shape[0]
+    n_classes = label_dict.shape[0]
+    F = np.zeros((n_samples, n_classes))
+
+    # Build propagation matrix
+    degrees = X.sum(axis=0)
+    degrees[degrees == 0] = 1  # Avoid division by 0
+    # TODO: csr_array
+    D = sp.sparse.csr_array(sp.sparse.diags((1.0 / degrees), offsets=0))
+    P = (D @ X).tolil()
+    P[labels[:, 0]] = 0  # labels[:, 0] indicates IDs of labeled nodes
+    # Build base matrix
+    B = np.zeros((n_samples, n_classes))
+    B[labels[:, 0], labels[:, 1]] = 1
+
+    for _ in range(max_iter):
+        F = (P @ F) + B
+
+    return label_dict[np.argmax(F, axis=1)].tolist()
+
+
+@nx.utils.not_implemented_for("directed")
+def local_and_global_consistency(G, alpha=0.99, max_iter=30, label_name="label"):
+    """Node classification by Local and Global Consistency
+
+    Function for computing Local and global consistency algorithm by Zhou et al.
+
+    Parameters
+    ----------
+    G : NetworkX Graph
+    alpha : float
+        Clamping factor
+    max_iter : int
+        Maximum number of iterations allowed
+    label_name : string
+        Name of target labels to predict
+
+    Returns
+    -------
+    predicted : list
+        List of length ``len(G)`` with the predicted labels for each node.
+
+    Raises
+    ------
+    NetworkXError
+        If no nodes in `G` have attribute `label_name`.
+
+    Examples
+    --------
+    >>> from networkx.algorithms import node_classification
+    >>> G = nx.path_graph(4)
+    >>> G.nodes[0]["label"] = "A"
+    >>> G.nodes[3]["label"] = "B"
+    >>> G.nodes(data=True)
+    NodeDataView({0: {'label': 'A'}, 1: {}, 2: {}, 3: {'label': 'B'}})
+    >>> G.edges()
+    EdgeView([(0, 1), (1, 2), (2, 3)])
+    >>> predicted = node_classification.local_and_global_consistency(G)
+    >>> predicted
+    ['A', 'A', 'B', 'B']
+
+    References
+    ----------
+    Zhou, D., Bousquet, O., Lal, T. N., Weston, J., & Schölkopf, B. (2004).
+    Learning with local and global consistency.
+    Advances in neural information processing systems, 16(16), 321-328.
+    """
+    import numpy as np
+    import scipy as sp
+    import scipy.sparse  # call as sp.sparse
+
+    X = nx.to_scipy_sparse_array(G)  # adjacency matrix
+    labels, label_dict = _get_label_info(G, label_name)
+
+    if labels.shape[0] == 0:
+        raise nx.NetworkXError(
+            f"No node on the input graph is labeled by '{label_name}'."
+        )
+
+    n_samples = X.shape[0]
+    n_classes = label_dict.shape[0]
+    F = np.zeros((n_samples, n_classes))
+
+    # Build propagation matrix
+    degrees = X.sum(axis=0)
+    degrees[degrees == 0] = 1  # Avoid division by 0
+    # TODO: csr_array
+    D2 = np.sqrt(sp.sparse.csr_array(sp.sparse.diags((1.0 / degrees), offsets=0)))
+    P = alpha * ((D2 @ X) @ D2)
+    # Build base matrix
+    B = np.zeros((n_samples, n_classes))
+    B[labels[:, 0], labels[:, 1]] = 1 - alpha
+
+    for _ in range(max_iter):
+        F = (P @ F) + B
+
+    return label_dict[np.argmax(F, axis=1)].tolist()
+
+
+def _get_label_info(G, label_name):
+    """Get and return information of labels from the input graph
+
+    Parameters
+    ----------
+    G : NetworkX graph
+    label_name : string
+        Name of the target label
+
+    Returns
+    -------
+    labels : numpy array, shape = [n_labeled_samples, 2]
+        Array of pairs of labeled node ID and label ID
+    label_dict : numpy array, shape = [n_classes]
+        Array of labels
+        i-th element contains the label corresponding to label ID `i`
+    """
+    import numpy as np
+
+    labels = []
+    label_to_id = {}
+    lid = 0
+    for i, n in enumerate(G.nodes(data=True)):
+        if label_name in n[1]:
+            label = n[1][label_name]
+            if label not in label_to_id:
+                label_to_id[label] = lid
+                lid += 1
+            labels.append([i, label_to_id[label]])
+    labels = np.array(labels)
+    label_dict = np.array(
+        [label for label, _ in sorted(label_to_id.items(), key=lambda x: x[1])]
+    )
+    return (labels, label_dict)
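The sparse update `F = (P @ F) + B` in the new module clamps the labelled rows (their rows of `P` are zeroed and `B` re-injects the seed labels) and takes a degree-weighted average of neighbour scores everywhere else. A dependency-free sketch of the same fixed-point iteration over an adjacency dict (a hypothetical encoding; the module itself works on scipy sparse arrays):

```python
def harmonic_function(adj, seed_labels, max_iter=30):
    """Label propagation by the harmonic-function iteration.

    adj: dict node -> list of neighbours (undirected graph).
    seed_labels: dict node -> class label for the labelled nodes.
    """
    classes = sorted(set(seed_labels.values()))
    idx = {c: j for j, c in enumerate(classes)}
    # score[n][j] is the belief that node n belongs to class j
    score = {n: [0.0] * len(classes) for n in adj}
    for n, c in seed_labels.items():
        score[n][idx[c]] = 1.0
    for _ in range(max_iter):
        nxt = {}
        for n in adj:
            if n in seed_labels:        # labelled nodes stay clamped
                nxt[n] = score[n]
            else:                       # unlabelled: average neighbours
                deg = len(adj[n]) or 1  # guard isolated nodes
                nxt[n] = [sum(score[m][j] for m in adj[n]) / deg
                          for j in range(len(classes))]
        score = nxt
    return {n: classes[max(range(len(classes)), key=lambda j: score[n][j])]
            for n in adj}
```

On the path graph from the docstring example (labels "A" at node 0 and "B" at node 3) this reproduces the documented prediction `['A', 'A', 'B', 'B']`.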
diff --git a/networkx/algorithms/node_classification/__init__.py b/networkx/algorithms/node_classification/__init__.py
deleted file mode 100644
index 23fa264..0000000
--- a/networkx/algorithms/node_classification/__init__.py
+++ /dev/null
@@ -1,52 +0,0 @@
-""" This module provides the functions for node classification problem.
-
-The functions in this module are not imported
-into the top level `networkx` namespace.
-You can access these functions by importing
-the `networkx.algorithms.node_classification` modules,
-then accessing the functions as attributes of `node_classification`.
-For example:
-
-  >>> from networkx.algorithms import node_classification
-  >>> G = nx.path_graph(4)
-  >>> G.edges()
-  EdgeView([(0, 1), (1, 2), (2, 3)])
-  >>> G.nodes[0]["label"] = "A"
-  >>> G.nodes[3]["label"] = "B"
-  >>> node_classification.harmonic_function(G)
-  ['A', 'A', 'B', 'B']
-
-"""
-
-
-def __getattr__(name):
-    if name in ("hmn", "lgc"):
-        import warnings
-        import importlib
-
-        fn_name = (
-            "harmonic_function" if name == "hmn" else "local_and_global_consistency"
-        )
-        msg = (
-            f"The {name}  module is deprecated and will be removed in version 3.0.\n"
-            f"Access `{fn_name}` directly from `node_classification`:\n\n"
-            "    from networkx.algorithms import node_classification\n"
-            f"    node_classification.{fn_name}\n"
-        )
-        warnings.warn(msg, category=DeprecationWarning, stacklevel=2)
-        return importlib.import_module(
-            f".{name}", "networkx.algorithms.node_classification"
-        )
-    if name == "harmonic_function":
-        from .hmn import harmonic_function
-
-        return harmonic_function
-    if name == "local_and_global_consistency":
-        from .lgc import local_and_global_consistency
-
-        return local_and_global_consistency
-    raise AttributeError(f"module {__name__} has no attribute {name}")
-
-
-def __dir__():
-    return ["harmonic_function", "local_and_global_consistency"]
diff --git a/networkx/algorithms/node_classification/hmn.py b/networkx/algorithms/node_classification/hmn.py
deleted file mode 100644
index 727ee36..0000000
--- a/networkx/algorithms/node_classification/hmn.py
+++ /dev/null
@@ -1,88 +0,0 @@
-"""Function for computing Harmonic function algorithm by Zhu et al.
-
-References
-----------
-Zhu, X., Ghahramani, Z., & Lafferty, J. (2003, August).
-Semi-supervised learning using gaussian fields and harmonic functions.
-In ICML (Vol. 3, pp. 912-919).
-"""
-import networkx as nx
-from networkx.algorithms.node_classification.utils import _get_label_info
-from networkx.utils.decorators import not_implemented_for
-
-__all__ = ["harmonic_function"]
-
-
-@not_implemented_for("directed")
-def harmonic_function(G, max_iter=30, label_name="label"):
-    """Node classification by Harmonic function
-
-    Parameters
-    ----------
-    G : NetworkX Graph
-    max_iter : int
-        maximum number of iterations allowed
-    label_name : string
-        name of target labels to predict
-
-    Returns
-    -------
-    predicted : list
-        List of length ``len(G)`` with the predicted labels for each node.
-
-    Raises
-    ------
-    NetworkXError
-        If no nodes in `G` have attribute `label_name`.
-
-    Examples
-    --------
-    >>> from networkx.algorithms import node_classification
-    >>> G = nx.path_graph(4)
-    >>> G.nodes[0]["label"] = "A"
-    >>> G.nodes[3]["label"] = "B"
-    >>> G.nodes(data=True)
-    NodeDataView({0: {'label': 'A'}, 1: {}, 2: {}, 3: {'label': 'B'}})
-    >>> G.edges()
-    EdgeView([(0, 1), (1, 2), (2, 3)])
-    >>> predicted = node_classification.harmonic_function(G)
-    >>> predicted
-    ['A', 'A', 'B', 'B']
-
-    References
-    ----------
-    Zhu, X., Ghahramani, Z., & Lafferty, J. (2003, August).
-    Semi-supervised learning using gaussian fields and harmonic functions.
-    In ICML (Vol. 3, pp. 912-919).
-    """
-    import numpy as np
-    import scipy as sp
-    import scipy.sparse  # call as sp.sparse
-
-    X = nx.to_scipy_sparse_array(G)  # adjacency matrix
-    labels, label_dict = _get_label_info(G, label_name)
-
-    if labels.shape[0] == 0:
-        raise nx.NetworkXError(
-            f"No node on the input graph is labeled by '{label_name}'."
-        )
-
-    n_samples = X.shape[0]
-    n_classes = label_dict.shape[0]
-    F = np.zeros((n_samples, n_classes))
-
-    # Build propagation matrix
-    degrees = X.sum(axis=0)
-    degrees[degrees == 0] = 1  # Avoid division by 0
-    # TODO: csr_array
-    D = sp.sparse.csr_array(sp.sparse.diags((1.0 / degrees), offsets=0))
-    P = (D @ X).tolil()
-    P[labels[:, 0]] = 0  # labels[:, 0] indicates IDs of labeled nodes
-    # Build base matrix
-    B = np.zeros((n_samples, n_classes))
-    B[labels[:, 0], labels[:, 1]] = 1
-
-    for _ in range(max_iter):
-        F = (P @ F) + B
-
-    return label_dict[np.argmax(F, axis=1)].tolist()
diff --git a/networkx/algorithms/node_classification/lgc.py b/networkx/algorithms/node_classification/lgc.py
deleted file mode 100644
index 5324470..0000000
--- a/networkx/algorithms/node_classification/lgc.py
+++ /dev/null
@@ -1,89 +0,0 @@
-"""Function for computing Local and global consistency algorithm by Zhou et al.
-
-References
-----------
-Zhou, D., Bousquet, O., Lal, T. N., Weston, J., & Schölkopf, B. (2004).
-Learning with local and global consistency.
-Advances in neural information processing systems, 16(16), 321-328.
-"""
-import networkx as nx
-from networkx.algorithms.node_classification.utils import _get_label_info
-from networkx.utils.decorators import not_implemented_for
-
-__all__ = ["local_and_global_consistency"]
-
-
-@not_implemented_for("directed")
-def local_and_global_consistency(G, alpha=0.99, max_iter=30, label_name="label"):
-    """Node classification by Local and Global Consistency
-
-    Parameters
-    ----------
-    G : NetworkX Graph
-    alpha : float
-        Clamping factor
-    max_iter : int
-        Maximum number of iterations allowed
-    label_name : string
-        Name of target labels to predict
-
-    Returns
-    -------
-    predicted : list
-        List of length ``len(G)`` with the predicted labels for each node.
-
-    Raises
-    ------
-    NetworkXError
-        If no nodes in `G` have attribute `label_name`.
-
-    Examples
-    --------
-    >>> from networkx.algorithms import node_classification
-    >>> G = nx.path_graph(4)
-    >>> G.nodes[0]["label"] = "A"
-    >>> G.nodes[3]["label"] = "B"
-    >>> G.nodes(data=True)
-    NodeDataView({0: {'label': 'A'}, 1: {}, 2: {}, 3: {'label': 'B'}})
-    >>> G.edges()
-    EdgeView([(0, 1), (1, 2), (2, 3)])
-    >>> predicted = node_classification.local_and_global_consistency(G)
-    >>> predicted
-    ['A', 'A', 'B', 'B']
-
-    References
-    ----------
-    Zhou, D., Bousquet, O., Lal, T. N., Weston, J., & Schölkopf, B. (2004).
-    Learning with local and global consistency.
-    Advances in neural information processing systems, 16(16), 321-328.
-    """
-    import numpy as np
-    import scipy as sp
-    import scipy.sparse  # call as sp.sparse
-
-    X = nx.to_scipy_sparse_array(G)  # adjacency matrix
-    labels, label_dict = _get_label_info(G, label_name)
-
-    if labels.shape[0] == 0:
-        raise nx.NetworkXError(
-            f"No node on the input graph is labeled by '{label_name}'."
-        )
-
-    n_samples = X.shape[0]
-    n_classes = label_dict.shape[0]
-    F = np.zeros((n_samples, n_classes))
-
-    # Build propagation matrix
-    degrees = X.sum(axis=0)
-    degrees[degrees == 0] = 1  # Avoid division by 0
-    # TODO: csr_array
-    D2 = np.sqrt(sp.sparse.csr_array(sp.sparse.diags((1.0 / degrees), offsets=0)))
-    P = alpha * ((D2 @ X) @ D2)
-    # Build base matrix
-    B = np.zeros((n_samples, n_classes))
-    B[labels[:, 0], labels[:, 1]] = 1 - alpha
-
-    for _ in range(max_iter):
-        F = (P @ F) + B
-
-    return label_dict[np.argmax(F, axis=1)].tolist()
diff --git a/networkx/algorithms/node_classification/utils.py b/networkx/algorithms/node_classification/utils.py
deleted file mode 100644
index f7d7ac2..0000000
--- a/networkx/algorithms/node_classification/utils.py
+++ /dev/null
@@ -1,34 +0,0 @@
-def _get_label_info(G, label_name):
-    """Get and return information of labels from the input graph
-
-    Parameters
-    ----------
-    G : Network X graph
-    label_name : string
-        Name of the target label
-
-    Returns
-    ----------
-    labels : numpy array, shape = [n_labeled_samples, 2]
-        Array of pairs of labeled node ID and label ID
-    label_dict : numpy array, shape = [n_classes]
-        Array of labels
-        i-th element contains the label corresponding label ID `i`
-    """
-    import numpy as np
-
-    labels = []
-    label_to_id = {}
-    lid = 0
-    for i, n in enumerate(G.nodes(data=True)):
-        if label_name in n[1]:
-            label = n[1][label_name]
-            if label not in label_to_id:
-                label_to_id[label] = lid
-                lid += 1
-            labels.append([i, label_to_id[label]])
-    labels = np.array(labels)
-    label_dict = np.array(
-        [label for label, _ in sorted(label_to_id.items(), key=lambda x: x[1])]
-    )
-    return (labels, label_dict)
diff --git a/networkx/algorithms/operators/all.py b/networkx/algorithms/operators/all.py
index 7d7c19c..2dd4643 100644
--- a/networkx/algorithms/operators/all.py
+++ b/networkx/algorithms/operators/all.py
@@ -1,26 +1,27 @@
 """Operations on many graphs.
 """
-from itertools import zip_longest
+from itertools import chain, repeat
 
 import networkx as nx
 
 __all__ = ["union_all", "compose_all", "disjoint_union_all", "intersection_all"]
 
 
-def union_all(graphs, rename=(None,)):
+def union_all(graphs, rename=()):
     """Returns the union of all graphs.
 
     The graphs must be disjoint, otherwise an exception is raised.
 
     Parameters
     ----------
-    graphs : list of graphs
-       List of NetworkX graphs
+    graphs : iterable
+       Iterable of NetworkX graphs
 
-    rename : bool , default=(None, None)
-       Node names of G and H can be changed by specifying the tuple
+    rename : iterable, optional
+       Node names of graphs can be changed by specifying the tuple
        rename=('G-','H-') (for example).  Node "u" in G is then renamed
-       "G-u" and "v" in H is renamed "H-v".
+       "G-u" and "v" in H is renamed "H-v". Infinite generators (like itertools.count)
+       are also supported.
 
     Returns
     -------
@@ -45,16 +46,8 @@ def union_all(graphs, rename=(None,)):
     union
     disjoint_union_all
     """
-    # collect the graphs in case an iterator was passed
-    graphs = list(graphs)
-
-    if not graphs:
-        raise ValueError("cannot apply union_all to an empty list")
-
-    U = graphs[0]
-
-    if any(G.is_multigraph() != U.is_multigraph() for G in graphs):
-        raise nx.NetworkXError("All graphs must be graphs or multigraphs.")
+    R = None
+    seen_nodes = set()
 
     # rename graph to obtain disjoint node labels
     def add_prefix(graph, prefix):
@@ -62,41 +55,37 @@ def union_all(graphs, rename=(None,)):
             return graph
 
         def label(x):
-            if isinstance(x, str):
-                name = prefix + x
-            else:
-                name = prefix + repr(x)
-            return name
+            return f"{prefix}{x}"
 
         return nx.relabel_nodes(graph, label)
 
-    graphs = [add_prefix(G, name) for G, name in zip_longest(graphs, rename)]
-
-    if sum(len(G) for G in graphs) != len(set().union(*graphs)):
-        raise nx.NetworkXError(
-            "The node sets of the graphs are not disjoint.",
-            "Use appropriate rename"
-            "=(G1prefix,G2prefix,...,GNprefix)"
-            "or use disjoint_union(G1,G2,...,GN).",
-        )
-
-    # Union is the same type as first graph
-    R = U.__class__()
-
-    # add graph attributes, later attributes take precedent over earlier ones
-    for G in graphs:
+    rename = chain(rename, repeat(None))
+    graphs = (add_prefix(G, name) for G, name in zip(graphs, rename))
+
+    for i, G in enumerate(graphs):
+        G_nodes_set = set(G.nodes)
+        if i == 0:
+            # Union is the same type as first graph
+            R = G.__class__()
+        elif G.is_multigraph() != R.is_multigraph():
+            raise nx.NetworkXError("All graphs must be graphs or multigraphs.")
+        elif not seen_nodes.isdisjoint(G_nodes_set):
+            raise nx.NetworkXError(
+                "The node sets of the graphs are not disjoint.",
+                "Use appropriate rename"
+                "=(G1prefix,G2prefix,...,GNprefix)"
+                "or use disjoint_union(G1,G2,...,GN).",
+            )
+
+        seen_nodes |= G_nodes_set
         R.graph.update(G.graph)
-
-    # add nodes and attributes
-    for G in graphs:
         R.add_nodes_from(G.nodes(data=True))
+        R.add_edges_from(
+            G.edges(keys=True, data=True) if G.is_multigraph() else G.edges(data=True)
+        )
 
-    if U.is_multigraph():
-        for G in graphs:
-            R.add_edges_from(G.edges(keys=True, data=True))
-    else:
-        for G in graphs:
-            R.add_edges_from(G.edges(data=True))
+    if R is None:
+        raise ValueError("cannot apply union_all to an empty list")
 
     return R
 
@@ -109,8 +98,8 @@ def disjoint_union_all(graphs):
 
     Parameters
     ----------
-    graphs : list
-       List of NetworkX graphs
+    graphs : iterable
+       Iterable of NetworkX graphs
 
     Returns
     -------
@@ -129,22 +118,15 @@ def disjoint_union_all(graphs):
     If a graph attribute is present in multiple graphs, then the value
     from the last graph in the list with that attribute is used.
     """
-    graphs = list(graphs)
 
-    if not graphs:
-        raise ValueError("cannot apply disjoint_union_all to an empty list")
+    def yield_relabeled(graphs):
+        first_label = 0
+        for G in graphs:
+            yield nx.convert_node_labels_to_integers(G, first_label=first_label)
+            first_label += len(G)
 
-    first_labels = [0]
-    for G in graphs[:-1]:
-        first_labels.append(len(G) + first_labels[-1])
+    R = union_all(yield_relabeled(graphs))
 
-    relabeled = [
-        nx.convert_node_labels_to_integers(G, first_label=first_label)
-        for G, first_label in zip(graphs, first_labels)
-    ]
-    R = union_all(relabeled)
-    for G in graphs:
-        R.graph.update(G.graph)
     return R
 
 
@@ -156,8 +138,8 @@ def compose_all(graphs):
 
     Parameters
     ----------
-    graphs : list
-       List of NetworkX graphs
+    graphs : iterable
+       Iterable of NetworkX graphs
 
     Returns
     -------
@@ -177,30 +159,25 @@ def compose_all(graphs):
     If a graph attribute is present in multiple graphs, then the value
     from the last graph in the list with that attribute is used.
     """
-    graphs = list(graphs)
-
-    if not graphs:
-        raise ValueError("cannot apply compose_all to an empty list")
-
-    U = graphs[0]
+    R = None
 
-    if any(G.is_multigraph() != U.is_multigraph() for G in graphs):
-        raise nx.NetworkXError("All graphs must be graphs or multigraphs.")
-
-    R = U.__class__()
     # add graph attributes, H attributes take precedent over G attributes
-    for G in graphs:
-        R.graph.update(G.graph)
+    for i, G in enumerate(graphs):
+        if i == 0:
+            # create new graph
+            R = G.__class__()
+        elif G.is_multigraph() != R.is_multigraph():
+            raise nx.NetworkXError("All graphs must be graphs or multigraphs.")
 
-    for G in graphs:
+        R.graph.update(G.graph)
         R.add_nodes_from(G.nodes(data=True))
+        R.add_edges_from(
+            G.edges(keys=True, data=True) if G.is_multigraph() else G.edges(data=True)
+        )
+
+    if R is None:
+        raise ValueError("cannot apply compose_all to an empty list")
 
-    if U.is_multigraph():
-        for G in graphs:
-            R.add_edges_from(G.edges(keys=True, data=True))
-    else:
-        for G in graphs:
-            R.add_edges_from(G.edges(data=True))
     return R
 
 
@@ -210,8 +187,8 @@ def intersection_all(graphs):
 
     Parameters
     ----------
-    graphs : list
-       List of NetworkX graphs
+    graphs : iterable
+       Iterable of NetworkX graphs
 
     Returns
     -------
@@ -227,27 +204,28 @@ def intersection_all(graphs):
     Attributes from the graph, nodes, and edges are not copied to the new
     graph.
     """
-    graphs = list(graphs)
-
-    if not graphs:
-        raise ValueError("cannot apply intersection_all to an empty list")
+    R = None
+
+    for i, G in enumerate(graphs):
+        G_nodes_set = set(G.nodes)
+        G_edges_set = set(G.edges(keys=True) if G.is_multigraph() else G.edges())
+        if i == 0:
+            # create new graph
+            R = G.__class__()
+            node_intersection = G_nodes_set
+            edge_intersection = G_edges_set
+        elif G.is_multigraph() != R.is_multigraph():
+            raise nx.NetworkXError("All graphs must be graphs or multigraphs.")
+        else:
+            node_intersection &= G_nodes_set
+            edge_intersection &= G_edges_set
 
-    U = graphs[0]
+        R.graph.update(G.graph)
 
-    if any(G.is_multigraph() != U.is_multigraph() for G in graphs):
-        raise nx.NetworkXError("All graphs must be graphs or multigraphs.")
+    if R is None:
+        raise ValueError("cannot apply intersection_all to an empty list")
 
-    # create new graph
-    node_intersection = set.intersection(*[set(G.nodes) for G in graphs])
-    R = U.__class__()
     R.add_nodes_from(node_intersection)
-
-    if U.is_multigraph():
-        edge_sets = [set(G.edges(keys=True)) for G in graphs]
-    else:
-        edge_sets = [set(G.edges()) for G in graphs]
-
-    edge_intersection = set.intersection(*edge_sets)
     R.add_edges_from(edge_intersection)
 
     return R
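The rewritten `intersection_all` folds the node and edge intersections in a single pass, so the input can be any iterable rather than a materialized list. The same streaming pattern can be sketched with plain sets; `intersect_edge_sets` below is a hypothetical helper operating on `(node_set, edge_set)` pairs, not the NetworkX API:

```python
def intersect_edge_sets(graphs):
    """Single-pass intersection over an iterable of (nodes, edges) pairs.

    Mirrors the streaming fold used by the patched intersection_all:
    initialize on the first element, then intersect in place, so the
    input never needs to be converted to a list.
    """
    node_inter = edge_inter = None
    for nodes, edges in graphs:
        if node_inter is None:
            # first graph: start from its node/edge sets
            node_inter, edge_inter = set(nodes), set(edges)
        else:
            node_inter &= set(nodes)
            edge_inter &= set(edges)
    if node_inter is None:
        raise ValueError("cannot intersect an empty iterable")
    return node_inter, edge_inter
```

As in the patch, the empty-input check moves to the end, after the loop has had a chance to consume the iterable.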
diff --git a/networkx/algorithms/operators/binary.py b/networkx/algorithms/operators/binary.py
index 7ea3d77..09f59d1 100644
--- a/networkx/algorithms/operators/binary.py
+++ b/networkx/algorithms/operators/binary.py
@@ -14,7 +14,7 @@ __all__ = [
 ]
 
 
-def union(G, H, rename=(None, None), name=None):
+def union(G, H, rename=()):
     """Combine graphs G and H. The names of nodes must be unique.
 
     A name collision between the graphs will raise an exception.
@@ -27,17 +27,11 @@ def union(G, H, rename=(None, None), name=None):
     G, H : graph
        A NetworkX graph
 
-    rename : tuple , default=(None, None)
+    rename : iterable , optional
        Node names of G and H can be changed by specifying the tuple
        rename=('G-','H-') (for example).  Node "u" in G is then renamed
        "G-u" and "v" in H is renamed "H-v".
 
-    name : string
-       Specify the name for the union graph
-
-       .. deprecated:: 2.7
-           This is deprecated and will be removed in version v3.0.
-
     Returns
     -------
     U : A union graph with the same type as G.
@@ -72,15 +66,6 @@ def union(G, H, rename=(None, None), name=None):
 
 
     """
-    if name is not None:
-        import warnings
-
-        warnings.warn(
-            "name parameter is deprecated and will be removed in version 3.0",
-            DeprecationWarning,
-            stacklevel=2,
-        )
-
     return nx.union_all([G, H], rename)
 
 
@@ -433,11 +418,7 @@ def full_join(G, H, rename=(None, None)):
             return graph
 
         def label(x):
-            if isinstance(x, str):
-                name = prefix + x
-            else:
-                name = prefix + repr(x)
-            return name
+            return f"{prefix}{x}"
 
         return nx.relabel_nodes(graph, label)
 
diff --git a/networkx/algorithms/operators/product.py b/networkx/algorithms/operators/product.py
index 4c56bbe..a43dab1 100644
--- a/networkx/algorithms/operators/product.py
+++ b/networkx/algorithms/operators/product.py
@@ -13,6 +13,7 @@ __all__ = [
     "strong_product",
     "power",
     "rooted_product",
+    "corona_product",
 ]
 
 
@@ -459,3 +460,68 @@ def rooted_product(G, H, root):
     R.add_edges_from(((g, e[0]), (g, e[1])) for g in G for e in H.edges())
 
     return R
+
+
+@not_implemented_for("directed")
+@not_implemented_for("multigraph")
+def corona_product(G, H):
+    r"""Returns the Corona product of G and H.
+
+    The corona product of $G$ and $H$ is the graph $C = G \circ H$ obtained by
+    taking one copy of $G$, called the center graph, $|V(G)|$ copies of $H$,
+    called the outer graph, and making the $i$-th vertex of $G$ adjacent to
+    every vertex of the $i$-th copy of $H$, where $1 \leq i \leq |V(G)|$.
+
+    Parameters
+    ----------
+    G, H: NetworkX graphs
+        The graphs to take the corona product of.
+        `G` is the center graph and `H` is the outer graph.
+
+    Returns
+    -------
+    C: NetworkX graph
+        The Corona product of G and H.
+
+    Raises
+    ------
+    NetworkXError
+        If G and H are not both directed or both undirected.
+
+    Examples
+    --------
+    >>> G = nx.cycle_graph(4)
+    >>> H = nx.path_graph(2)
+    >>> C = nx.corona_product(G, H)
+    >>> list(C)
+    [0, 1, 2, 3, (0, 0), (0, 1), (1, 0), (1, 1), (2, 0), (2, 1), (3, 0), (3, 1)]
+    >>> print(C)
+    Graph with 12 nodes and 16 edges
+
+    References
+    ----------
+    .. [1] M. Tavakoli, F. Rahbarnia, and A. R. Ashrafi,
+       "Studying the corona product of graphs under some graph invariants,"
+       Transactions on Combinatorics, vol. 3, no. 3, pp. 43–49, Sep. 2014,
+       doi: 10.22108/toc.2014.5542.
+    .. [2] A. Faraji, "Corona Product in Graph Theory," Ali Faraji, May 11, 2021.
+       https://blog.alifaraji.ir/math/graph-theory/corona-product.html (accessed Dec. 07, 2021).
+    """
+    GH = _init_product_graph(G, H)
+    GH.add_nodes_from(G)
+    GH.add_edges_from(G.edges)
+
+    for G_node in G:
+
+        # copy nodes of H in GH, call it H_i
+        GH.add_nodes_from((G_node, v) for v in H)
+
+        # copy edges of H_i based on H
+        GH.add_edges_from(
+            ((G_node, e0), (G_node, e1), d) for e0, e1, d in H.edges.data()
+        )
+
+        # create new edges between G_node and every node of H_i
+        GH.add_edges_from((G_node, (G_node, H_node)) for H_node in H)
+
+    return GH
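The construction above fixes the size of the corona product: one copy of $G$ plus $|V(G)|$ copies of $H$ gives $|V(G)|(1 + |V(H)|)$ nodes, and the edges are those of $G$, of each copy of $H$, and one edge per (center node, outer node) pair. A hypothetical helper (not part of the patch) computing these counts:

```python
def corona_sizes(n_g, m_g, n_h, m_h):
    """Expected node/edge counts of the corona product G ∘ H.

    n_g, m_g: node and edge counts of the center graph G.
    n_h, m_h: node and edge counts of the outer graph H.
    """
    nodes = n_g * (1 + n_h)          # G plus n_g copies of H
    edges = m_g + n_g * m_h + n_g * n_h  # G edges + copied H edges + joins
    return nodes, edges
```

These are exactly the quantities the new `test_corona_product` asserts; for the docstring's `C = corona_product(cycle_graph(4), path_graph(2))` they give 12 nodes and 16 edges.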
diff --git a/networkx/algorithms/operators/tests/test_all.py b/networkx/algorithms/operators/tests/test_all.py
index e09791b..2454851 100644
--- a/networkx/algorithms/operators/tests/test_all.py
+++ b/networkx/algorithms/operators/tests/test_all.py
@@ -28,13 +28,13 @@ def test_union_all_attributes():
         assert ghj.nodes[n] == eval(graph).nodes[int(node)]
 
     assert ghj.graph["attr"] == "attr"
-    assert ghj.graph["name"] == "j"  # j graph attributes take precendent
+    assert ghj.graph["name"] == "j"  # j graph attributes take precedence
 
 
 def test_intersection_all():
     G = nx.Graph()
     H = nx.Graph()
-    R = nx.Graph()
+    R = nx.Graph(awesome=True)
     G.add_nodes_from([1, 2, 3, 4])
     G.add_edge(1, 2)
     G.add_edge(2, 3)
@@ -47,6 +47,7 @@ def test_intersection_all():
     I = nx.intersection_all([G, H, R])
     assert set(I.nodes()) == {1, 2, 3, 4}
     assert sorted(I.edges()) == [(2, 3)]
+    assert I.graph["awesome"]
 
 
 def test_intersection_all_different_node_sets():
@@ -238,9 +239,10 @@ def test_union_all_multigraph():
 
 
 def test_input_output():
-    l = [nx.Graph([(1, 2)]), nx.Graph([(3, 4)])]
+    l = [nx.Graph([(1, 2)]), nx.Graph([(3, 4)], awesome=True)]
     U = nx.disjoint_union_all(l)
     assert len(l) == 2
+    assert U.graph["awesome"]
     C = nx.compose_all(l)
     assert len(l) == 2
     l = [nx.Graph([(1, 2)]), nx.Graph([(1, 2)])]
diff --git a/networkx/algorithms/operators/tests/test_binary.py b/networkx/algorithms/operators/tests/test_binary.py
index f11e159..b4e64f8 100644
--- a/networkx/algorithms/operators/tests/test_binary.py
+++ b/networkx/algorithms/operators/tests/test_binary.py
@@ -23,7 +23,7 @@ def test_union_attributes():
         assert gh.nodes[n] == eval(graph).nodes[int(node)]
 
     assert gh.graph["attr"] == "attr"
-    assert gh.graph["name"] == "h"  # h graph attributes take precendent
+    assert gh.graph["name"] == "h"  # h graph attributes take precedence
 
 
 def test_intersection():
diff --git a/networkx/algorithms/operators/tests/test_product.py b/networkx/algorithms/operators/tests/test_product.py
index fb97756..75bc048 100644
--- a/networkx/algorithms/operators/tests/test_product.py
+++ b/networkx/algorithms/operators/tests/test_product.py
@@ -425,3 +425,11 @@ def test_rooted_product():
     R = nx.rooted_product(G, H, "a")
     assert len(R) == len(G) * len(H)
     assert R.size() == G.size() + len(G) * H.size()
+
+
+def test_corona_product():
+    G = nx.cycle_graph(3)
+    H = nx.path_graph(2)
+    C = nx.corona_product(G, H)
+    assert len(C) == (len(G) * len(H)) + len(G)
+    assert C.size() == G.size() + len(G) * H.size() + len(G) * len(H)
diff --git a/networkx/algorithms/polynomials.py b/networkx/algorithms/polynomials.py
index 35c0166..27dc580 100644
--- a/networkx/algorithms/polynomials.py
+++ b/networkx/algorithms/polynomials.py
@@ -5,6 +5,19 @@ variety of structural information. Examples include the Tutte polynomial,
 chromatic polynomial, characteristic polynomial, and matching polynomial. An
 extensive treatment is provided in [1]_.
 
+For a simple example, the `~sympy.matrices.matrices.MatrixDeterminant.charpoly`
+method can be used to compute the characteristic polynomial from the adjacency
+matrix of a graph. Consider the complete graph ``K_4``:
+
+>>> import sympy
+>>> x = sympy.Symbol("x")
+>>> G = nx.complete_graph(4)
+>>> A = nx.adjacency_matrix(G)
+>>> M = sympy.SparseMatrix(A.todense())
+>>> M.charpoly(x).as_expr()
+x**4 - 6*x**2 - 8*x - 3
+
+
 .. [1] Y. Shi, M. Dehmer, X. Li, I. Gutman,
    "Graph Polynomials"
 """
diff --git a/networkx/algorithms/reciprocity.py b/networkx/algorithms/reciprocity.py
index 1b7761b..d58b607 100644
--- a/networkx/algorithms/reciprocity.py
+++ b/networkx/algorithms/reciprocity.py
@@ -1,4 +1,5 @@
 """Algorithms to calculate reciprocity in a directed graph."""
+import networkx as nx
 from networkx import NetworkXError
 
 from ..utils import not_implemented_for
@@ -6,6 +7,7 @@ from ..utils import not_implemented_for
 __all__ = ["reciprocity", "overall_reciprocity"]
 
 
+@nx._dispatch
 @not_implemented_for("undirected", "multigraph")
 def reciprocity(G, nodes=None):
     r"""Compute the reciprocity in a directed graph.
@@ -73,6 +75,7 @@ def _reciprocity_iter(G, nodes):
             yield (node, reciprocity)
 
 
+@nx._dispatch
 @not_implemented_for("undirected", "multigraph")
 def overall_reciprocity(G):
     """Compute the reciprocity for the whole graph.
diff --git a/networkx/algorithms/regular.py b/networkx/algorithms/regular.py
index 3f76d40..94ec71d 100644
--- a/networkx/algorithms/regular.py
+++ b/networkx/algorithms/regular.py
@@ -5,6 +5,7 @@ from networkx.utils import not_implemented_for
 __all__ = ["is_regular", "is_k_regular", "k_factor"]
 
 
+@nx._dispatch
 def is_regular(G):
     """Determines whether the graph ``G`` is a regular graph.
 
@@ -21,6 +22,12 @@ def is_regular(G):
     bool
         Whether the given graph or digraph is regular.
 
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 4), (4, 1)])
+    >>> nx.is_regular(G)
+    True
+
     """
     n1 = nx.utils.arbitrary_element(G)
     if not G.is_directed():
@@ -34,6 +41,7 @@ def is_regular(G):
         return in_regular and out_regular
 
 
+@nx._dispatch
 @not_implemented_for("directed")
 def is_k_regular(G, k):
     """Determines whether the graph ``G`` is a k-regular graph.
@@ -49,6 +57,12 @@ def is_k_regular(G, k):
     bool
         Whether the given graph is k-regular.
 
+    Examples
+    --------
+    >>> G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1)])
+    >>> nx.is_k_regular(G, k=3)
+    False
+
     """
     return all(d == k for n, d in G.degree)
 
@@ -78,6 +92,13 @@ def k_factor(G, k, matching_weight="weight"):
     G2 : NetworkX graph
         A k-factor of G
 
+    Examples
+    --------
+    >>> G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 1)])
+    >>> G2 = nx.k_factor(G, k=1)
+    >>> G2.edges()
+    EdgeView([(1, 2), (3, 4)])
+
     References
     ----------
     .. [1] "An algorithm for computing simple k-factors.",
diff --git a/networkx/algorithms/shortest_paths/astar.py b/networkx/algorithms/shortest_paths/astar.py
index 5d5a847..6f0e442 100644
--- a/networkx/algorithms/shortest_paths/astar.py
+++ b/networkx/algorithms/shortest_paths/astar.py
@@ -46,7 +46,7 @@ def astar_path(G, source, target, heuristic=None, weight="weight"):
        returned by the function. The function must accept exactly three
        positional arguments: the two endpoints of an edge and the
        dictionary of edge attributes for that edge. The function must
-       return a number.
+       return a number or None to indicate a hidden edge.
 
     Raises
     ------
@@ -67,6 +67,14 @@ def astar_path(G, source, target, heuristic=None, weight="weight"):
     >>> print(nx.astar_path(G, (0, 0), (2, 2), heuristic=dist, weight="cost"))
     [(0, 0), (0, 1), (0, 2), (1, 2), (2, 2)]
 
+    Notes
+    -----
+    Edge weight attributes must be numerical.
+    Distances are calculated as sums of weighted edges traversed.
+
+    The weight function can be used to hide edges by returning None.
+    So ``weight = lambda u, v, d: 1 if d['color']=="red" else None``
+    will find the shortest red path.
 
     See Also
     --------
@@ -127,7 +135,10 @@ def astar_path(G, source, target, heuristic=None, weight="weight"):
         explored[curnode] = parent
 
         for neighbor, w in G[curnode].items():
-            ncost = dist + weight(curnode, neighbor, w)
+            cost = weight(curnode, neighbor, w)
+            if cost is None:
+                continue
+            ncost = dist + cost
             if neighbor in enqueued:
                 qcost, h = enqueued[neighbor]
                 # if qcost <= ncost, a less costly path from the
@@ -179,7 +190,7 @@ def astar_path_length(G, source, target, heuristic=None, weight="weight"):
        returned by the function. The function must accept exactly three
        positional arguments: the two endpoints of an edge and the
        dictionary of edge attributes for that edge. The function must
-       return a number.
+       return a number or None to indicate a hidden edge.
     Raises
     ------
     NetworkXNoPath
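The hidden-edge convention added here (a weight function returning ``None`` causes the edge to be skipped) can be sketched in a plain Dijkstra-style loop. This is a simplified standalone illustration, not the NetworkX implementation; `adj` is a hypothetical nested-dict adjacency structure:

```python
import heapq


def dijkstra_hidden(adj, source, target, weight):
    """Shortest path length where weight(u, v, data) may return None.

    adj: {u: {v: data_dict}} adjacency; a None weight hides the edge,
    exactly as in the patched A* neighbor loop above.
    """
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == target:
            return d
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, data in adj[u].items():
            w = weight(u, v, data)
            if w is None:  # hidden edge: skip it
                continue
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    raise ValueError("no path between source and target")
```

With ``weight=lambda u, v, d: d["w"] if d["color"] == "red" else None`` this finds the shortest all-red path, mirroring the docstring note.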
diff --git a/networkx/algorithms/shortest_paths/generic.py b/networkx/algorithms/shortest_paths/generic.py
index 129f741..8981841 100644
--- a/networkx/algorithms/shortest_paths/generic.py
+++ b/networkx/algorithms/shortest_paths/generic.py
@@ -16,6 +16,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch
 def has_path(G, source, target):
     """Returns *True* if *G* has a path from *source* to *target*.
 
@@ -36,6 +37,7 @@ def has_path(G, source, target):
     return True
 
 
+@nx._dispatch
 def shortest_path(G, source=None, target=None, weight=None, method="dijkstra"):
     """Compute shortest paths in the graph.
 
@@ -170,6 +172,7 @@ def shortest_path(G, source=None, target=None, weight=None, method="dijkstra"):
     return paths
 
 
+@nx._dispatch
 def shortest_path_length(G, source=None, target=None, weight=None, method="dijkstra"):
     """Compute shortest path lengths in the graph.
 
@@ -320,12 +323,16 @@ def average_shortest_path_length(G, weight=None, method=None):
 
     .. math::
 
-       a =\sum_{s,t \in V} \frac{d(s, t)}{n(n-1)}
+       a =\sum_{\substack{s,t \in V \\ s\neq t}} \frac{d(s, t)}{n(n-1)}
 
     where `V` is the set of nodes in `G`,
     `d(s, t)` is the shortest path from `s` to `t`,
     and `n` is the number of nodes in `G`.
 
+    .. versionchanged:: 3.0
+       An exception is raised for directed graphs that are not strongly
+       connected.
+
     Parameters
     ----------
     G : NetworkX graph
@@ -354,7 +361,7 @@ def average_shortest_path_length(G, weight=None, method=None):
         If `G` is the null graph (that is, the graph on zero nodes).
 
     NetworkXError
-        If `G` is not connected (or not weakly connected, in the case
+        If `G` is not connected (or not strongly connected, in the case
         of a directed graph).
 
     ValueError
@@ -397,9 +404,10 @@ def average_shortest_path_length(G, weight=None, method=None):
     # For the special case of the trivial graph, return zero immediately.
     if n == 1:
         return 0
-    # Shortest path length is undefined if the graph is disconnected.
-    if G.is_directed() and not nx.is_weakly_connected(G):
-        raise nx.NetworkXError("Graph is not weakly connected.")
+    # Shortest path length is undefined if the graph is not strongly connected.
+    if G.is_directed() and not nx.is_strongly_connected(G):
+        raise nx.NetworkXError("Graph is not strongly connected.")
+    # Shortest path length is undefined if the graph is not connected.
     if not G.is_directed() and not nx.is_connected(G):
         raise nx.NetworkXError("Graph is not connected.")
 
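The formula above (the sum of pairwise distances divided by $n(n-1)$) can be sketched for an unweighted, connected, undirected graph by running BFS from every node. A minimal standalone version under those assumptions, not the NetworkX implementation:

```python
from collections import deque


def average_shortest_path_length(adj):
    """Average pairwise distance in an unweighted undirected graph.

    adj: {u: set(neighbors)}; assumes the graph is connected and has
    at least two nodes, matching the connectivity checks above.
    """
    n = len(adj)
    total = 0
    for s in adj:
        # BFS distances from s
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())  # d(s, s) = 0 contributes nothing
    return total / (n * (n - 1))
```

For a directed graph the same double loop only makes sense when every ordered pair is reachable, which is why the patch tightens the check from weak to strong connectivity.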
diff --git a/networkx/algorithms/shortest_paths/tests/test_astar.py b/networkx/algorithms/shortest_paths/tests/test_astar.py
index e622502..3cc4050 100644
--- a/networkx/algorithms/shortest_paths/tests/test_astar.py
+++ b/networkx/algorithms/shortest_paths/tests/test_astar.py
@@ -44,6 +44,25 @@ class TestAStar:
         assert nx.astar_path(self.XG, "s", "v") == ["s", "x", "u", "v"]
         assert nx.astar_path_length(self.XG, "s", "v") == 9
 
+    def test_astar_directed_weight_function(self):
+        w1 = lambda u, v, d: d["weight"]
+        assert nx.astar_path(self.XG, "x", "u", weight=w1) == ["x", "u"]
+        assert nx.astar_path_length(self.XG, "x", "u", weight=w1) == 3
+        assert nx.astar_path(self.XG, "s", "v", weight=w1) == ["s", "x", "u", "v"]
+        assert nx.astar_path_length(self.XG, "s", "v", weight=w1) == 9
+
+        w2 = lambda u, v, d: None if (u, v) == ("x", "u") else d["weight"]
+        assert nx.astar_path(self.XG, "x", "u", weight=w2) == ["x", "y", "s", "u"]
+        assert nx.astar_path_length(self.XG, "x", "u", weight=w2) == 19
+        assert nx.astar_path(self.XG, "s", "v", weight=w2) == ["s", "x", "v"]
+        assert nx.astar_path_length(self.XG, "s", "v", weight=w2) == 10
+
+        w3 = lambda u, v, d: d["weight"] + 10
+        assert nx.astar_path(self.XG, "x", "u", weight=w3) == ["x", "u"]
+        assert nx.astar_path_length(self.XG, "x", "u", weight=w3) == 13
+        assert nx.astar_path(self.XG, "s", "v", weight=w3) == ["s", "x", "v"]
+        assert nx.astar_path_length(self.XG, "s", "v", weight=w3) == 30
+
     def test_astar_multigraph(self):
         G = nx.MultiDiGraph(self.XG)
         G.add_weighted_edges_from((u, v, 1000) for (u, v) in list(G.edges()))
diff --git a/networkx/algorithms/shortest_paths/tests/test_generic.py b/networkx/algorithms/shortest_paths/tests/test_generic.py
index 093fd9c..91b0e30 100644
--- a/networkx/algorithms/shortest_paths/tests/test_generic.py
+++ b/networkx/algorithms/shortest_paths/tests/test_generic.py
@@ -324,13 +324,16 @@ class TestAverageShortestPathLength:
         )
         assert ans == pytest.approx(4, abs=1e-7)
 
-    def test_disconnected(self):
+    def test_directed_not_strongly_connected(self):
+        G = nx.DiGraph([(0, 1)])
+        with pytest.raises(nx.NetworkXError, match="Graph is not strongly connected"):
+            nx.average_shortest_path_length(G)
+
+    def test_undirected_not_connected(self):
         g = nx.Graph()
         g.add_nodes_from(range(3))
         g.add_edge(0, 1)
         pytest.raises(nx.NetworkXError, nx.average_shortest_path_length, g)
-        g = g.to_directed()
-        pytest.raises(nx.NetworkXError, nx.average_shortest_path_length, g)
 
     def test_trivial_graph(self):
         """Tests that the trivial graph has average path length zero,
diff --git a/networkx/algorithms/shortest_paths/weighted.py b/networkx/algorithms/shortest_paths/weighted.py
index ef0ee63..e131d49 100644
--- a/networkx/algorithms/shortest_paths/weighted.py
+++ b/networkx/algorithms/shortest_paths/weighted.py
@@ -105,7 +105,7 @@ def dijkstra_path(G, source, target, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -186,7 +186,7 @@ def dijkstra_path_length(G, source, target, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -266,7 +266,7 @@ def single_source_dijkstra_path(G, source, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -330,7 +330,7 @@ def single_source_dijkstra_path_length(G, source, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -408,7 +408,7 @@ def single_source_dijkstra(G, source, target=None, cutoff=None, weight="weight")
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -506,7 +506,7 @@ def multi_source_dijkstra_path(G, sources, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -579,7 +579,7 @@ def multi_source_dijkstra_path_length(G, sources, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -664,7 +664,7 @@ def multi_source_dijkstra(G, sources, target=None, cutoff=None, weight="weight")
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -778,7 +778,8 @@ def _dijkstra_multisource(
         nodes.
 
     weight: function
-        Function with (u, v, data) input that returns that edges weight
+        Function with (u, v, data) input that returns that edge's weight
+        or None to indicate a hidden edge
 
     pred: dict of lists, optional(default=None)
         dict to store a list of predecessors keyed by that node
@@ -892,7 +893,7 @@ def dijkstra_predecessor_and_distance(G, source, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -957,7 +958,7 @@ def all_pairs_dijkstra(G, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Yields
     ------
@@ -1025,7 +1026,7 @@ def all_pairs_dijkstra_path_length(G, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -1083,7 +1084,7 @@ def all_pairs_dijkstra_path(G, cutoff=None, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
@@ -2251,7 +2252,7 @@ def bidirectional_dijkstra(G, source, target, weight="weight"):
         returned by the function. The function must accept exactly three
         positional arguments: the two endpoints of an edge and the
         dictionary of edge attributes for that edge. The function must
-        return a number.
+        return a number or None to indicate a hidden edge.
 
     Returns
     -------
diff --git a/networkx/algorithms/similarity.py b/networkx/algorithms/similarity.py
index fe6e0f2..26bd3b8 100644
--- a/networkx/algorithms/similarity.py
+++ b/networkx/algorithms/similarity.py
@@ -28,7 +28,6 @@ __all__ = [
     "optimize_graph_edit_distance",
     "optimize_edit_paths",
     "simrank_similarity",
-    "simrank_similarity_numpy",
     "panther_similarity",
     "generate_random_paths",
 ]
@@ -1500,34 +1499,6 @@ def _simrank_similarity_numpy(
     return newsim
 
 
-def simrank_similarity_numpy(
-    G,
-    source=None,
-    target=None,
-    importance_factor=0.9,
-    max_iterations=100,
-    tolerance=1e-4,
-):
-    """Calculate SimRank of nodes in ``G`` using matrices with ``numpy``.
-
-    .. deprecated:: 2.6
-        simrank_similarity_numpy is deprecated and will be removed in networkx 3.0.
-        Use simrank_similarity
-
-    """
-    warnings.warn(
-        (
-            "networkx.simrank_similarity_numpy is deprecated and will be removed"
-            "in NetworkX 3.0, use networkx.simrank_similarity instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    return _simrank_similarity_numpy(
-        G, source, target, importance_factor, max_iterations, tolerance
-    )
-
-
 def panther_similarity(G, source, k=5, path_length=5, c=0.5, delta=0.1, eps=None):
     r"""Returns the Panther similarity of nodes in the graph `G` to node ``v``.
 
diff --git a/networkx/algorithms/simple_paths.py b/networkx/algorithms/simple_paths.py
index e19e4e4..d81d234 100644
--- a/networkx/algorithms/simple_paths.py
+++ b/networkx/algorithms/simple_paths.py
@@ -13,6 +13,7 @@ __all__ = [
 ]
 
 
+@nx._dispatch
 def is_simple_path(G, nodes):
     """Returns True if and only if `nodes` form a simple path in `G`.
 
@@ -71,13 +72,23 @@ def is_simple_path(G, nodes):
     # NetworkXPointlessConcept here.
     if len(nodes) == 0:
         return False
+
     # If the list is a single node, just check that the node is actually
     # in the graph.
     if len(nodes) == 1:
         return nodes[0] in G
-    # Test that no node appears more than once, and that each
-    # adjacent pair of nodes is adjacent.
-    return len(set(nodes)) == len(nodes) and all(v in G[u] for u, v in pairwise(nodes))
+
+    # check that all nodes in the list are in the graph, if at least one
+    # is not in the graph, then this is not a simple path
+    if not all(n in G for n in nodes):
+        return False
+
+    # If the list contains repeated nodes, then it's not a simple path
+    if len(set(nodes)) != len(nodes):
+        return False
+
+    # Test that each adjacent pair of nodes is adjacent.
+    return all(v in G[u] for u, v in pairwise(nodes))
 
 
 def all_simple_paths(G, source, target, cutoff=None):
@@ -256,7 +267,7 @@ def _empty_generator():
 
 
 def _all_simple_paths_graph(G, source, targets, cutoff):
-    visited = dict.fromkeys([source])
+    visited = {source: True}
     stack = [iter(G[source])]
     while stack:
         children = stack[-1]
@@ -269,7 +280,7 @@ def _all_simple_paths_graph(G, source, targets, cutoff):
                 continue
             if child in targets:
                 yield list(visited) + [child]
-            visited[child] = None
+            visited[child] = True
             if targets - set(visited.keys()):  # expand stack until find all targets
                 stack.append(iter(G[child]))
             else:
@@ -282,7 +293,7 @@ def _all_simple_paths_graph(G, source, targets, cutoff):
 
 
 def _all_simple_paths_multigraph(G, source, targets, cutoff):
-    visited = dict.fromkeys([source])
+    visited = {source: True}
     stack = [(v for u, v in G.edges(source))]
     while stack:
         children = stack[-1]
@@ -295,7 +306,7 @@ def _all_simple_paths_multigraph(G, source, targets, cutoff):
                 continue
             if child in targets:
                 yield list(visited) + [child]
-            visited[child] = None
+            visited[child] = True
             if targets - set(visited.keys()):
                 stack.append((v for u, v in G.edges(child)))
             else:
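The reordered checks in the patched `is_simple_path` (membership of every node first, then duplicates, then adjacency of consecutive pairs) can be mirrored on a plain adjacency dict. A hypothetical standalone version of the same logic:

```python
def is_simple_path(adj, nodes):
    """True iff `nodes` is a simple path in the graph given by `adj`.

    adj: {node: set(neighbors)} -- a plain-dict stand-in for a graph,
    following the same check order as the patched NetworkX function.
    """
    if len(nodes) == 0:
        return False                 # empty sequence: not a path
    if len(nodes) == 1:
        return nodes[0] in adj       # single node: must exist in the graph
    if not all(n in adj for n in nodes):
        return False                 # any node missing from the graph
    if len(set(nodes)) != len(nodes):
        return False                 # repeated node: not simple
    # every consecutive pair must be adjacent
    return all(v in adj[u] for u, v in zip(nodes, nodes[1:]))
```

The explicit membership check is the behavioral addition: previously a list containing a node absent from the graph raised a `KeyError`-style lookup error instead of returning `False`.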
diff --git a/networkx/algorithms/smallworld.py b/networkx/algorithms/smallworld.py
index fc64d13..9b039e3 100644
--- a/networkx/algorithms/smallworld.py
+++ b/networkx/algorithms/smallworld.py
@@ -46,6 +46,11 @@ def random_reference(G, niter=1, connectivity=True, seed=None):
     G : graph
         The randomized graph.
 
+    Raises
+    ------
+    NetworkXError
+        If `G` has fewer than four nodes or fewer than two edges.
+
     Notes
     -----
     The implementation is adapted from the algorithm by Maslov and Sneppen
@@ -58,7 +63,9 @@ def random_reference(G, niter=1, connectivity=True, seed=None):
            Science 296.5569 (2002): 910-913.
     """
     if len(G) < 4:
-        raise nx.NetworkXError("Graph has less than four nodes.")
+        raise nx.NetworkXError("Graph has fewer than four nodes.")
+    if len(G.edges) < 2:
+        raise nx.NetworkXError("Graph has fewer than two edges.")
 
     from networkx.utils import cumulative_distribution, discrete_sequence
 
@@ -119,10 +126,10 @@ def lattice_reference(G, niter=5, D=None, connectivity=True, seed=None):
     Parameters
     ----------
     G : graph
-        An undirected graph with 4 or more nodes.
+        An undirected graph.
 
     niter : integer (optional, default=1)
-        An edge is rewired approximatively niter times.
+        An edge is rewired approximately niter times.
 
     D : numpy.array (optional, default=None)
         Distance to the diagonal matrix.
@@ -139,6 +146,11 @@ def lattice_reference(G, niter=5, D=None, connectivity=True, seed=None):
     G : graph
         The latticized graph.
 
+    Raises
+    ------
+    NetworkXError
+        If `G` has fewer than four nodes or fewer than two edges.
+
     Notes
     -----
     The implementation is adapted from the algorithm by Sporns et al. [1]_.
@@ -160,7 +172,9 @@ def lattice_reference(G, niter=5, D=None, connectivity=True, seed=None):
     local_conn = nx.connectivity.local_edge_connectivity
 
     if len(G) < 4:
-        raise nx.NetworkXError("Graph has less than four nodes.")
+        raise nx.NetworkXError("Graph has fewer than four nodes.")
+    if len(G.edges) < 2:
+        raise nx.NetworkXError("Graph has fewer than two edges.")
     # Instead of choosing uniformly at random from a generated edge list,
     # this algorithm chooses nonuniformly from the set of nodes with
     # probability weighted by degree.
diff --git a/networkx/algorithms/smetric.py b/networkx/algorithms/smetric.py
index b851e1e..785b2da 100644
--- a/networkx/algorithms/smetric.py
+++ b/networkx/algorithms/smetric.py
@@ -3,6 +3,7 @@ import networkx as nx
 __all__ = ["s_metric"]
 
 
+@nx._dispatch
 def s_metric(G, normalized=True):
     """Returns the s-metric of graph.
 
diff --git a/networkx/algorithms/structuralholes.py b/networkx/algorithms/structuralholes.py
index 55cdfe4..9f67f59 100644
--- a/networkx/algorithms/structuralholes.py
+++ b/networkx/algorithms/structuralholes.py
@@ -5,6 +5,7 @@ import networkx as nx
 __all__ = ["constraint", "local_constraint", "effective_size"]
 
 
+@nx._dispatch
 def mutual_weight(G, u, v, weight=None):
     """Returns the sum of the weights of the edge from `u` to `v` and
     the edge from `v` to `u` in `G`.
diff --git a/networkx/algorithms/summarization.py b/networkx/algorithms/summarization.py
index 16c7e62..ce0c9e6 100644
--- a/networkx/algorithms/summarization.py
+++ b/networkx/algorithms/summarization.py
@@ -20,7 +20,7 @@ nodes called compressor or virtual nodes to reduce the total number of edges in
 a graph. Edge-grouping techniques can be lossless, meaning that they can be
 used to re-create the original graph, or techniques can be lossy, requiring
 less space to store the summary graph, but at the expense of lower
-recontruction accuracy of the original graph.
+reconstruction accuracy of the original graph.
 
 Bit-compression techniques minimize the amount of information needed to
 describe the original graph, while revealing structural patterns in the
@@ -177,20 +177,20 @@ def dedensify(G, threshold, prefix=None, copy=True):
     high_degree_nodes = {n for n, d in degrees if d > threshold}
     low_degree_nodes = G.nodes() - high_degree_nodes
 
-    auxillary = {}
+    auxiliary = {}
     for node in G:
         high_degree_neighbors = frozenset(high_degree_nodes & set(G[node]))
         if high_degree_neighbors:
-            if high_degree_neighbors in auxillary:
-                auxillary[high_degree_neighbors].add(node)
+            if high_degree_neighbors in auxiliary:
+                auxiliary[high_degree_neighbors].add(node)
             else:
-                auxillary[high_degree_neighbors] = {node}
+                auxiliary[high_degree_neighbors] = {node}
 
     if copy:
         G = G.copy()
 
     compressor_nodes = set()
-    for index, (high_degree_nodes, low_degree_nodes) in enumerate(auxillary.items()):
+    for index, (high_degree_nodes, low_degree_nodes) in enumerate(auxiliary.items()):
         low_degree_node_count = len(low_degree_nodes)
         high_degree_node_count = len(high_degree_nodes)
         old_edges = high_degree_node_count * low_degree_node_count
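The dedensify hunk above fixes the spelling of the auxiliary table that keys each node by the frozenset of its high-degree neighbors. A minimal stdlib-only sketch of that grouping step, assuming a plain adjacency-dict representation (the helper name `group_by_high_degree_neighbors` is illustrative, not part of the networkx API):

```python
from collections import defaultdict

def group_by_high_degree_neighbors(adj, threshold):
    """Key each node by the frozenset of its high-degree neighbors.

    adj maps node -> set of neighbors in an undirected graph. Nodes with
    degree strictly greater than `threshold` count as high-degree,
    matching the `d > threshold` test in the hunk above.
    """
    high = {n for n, nbrs in adj.items() if len(nbrs) > threshold}
    auxiliary = defaultdict(set)
    for node, nbrs in adj.items():
        key = frozenset(high & nbrs)
        if key:  # skip nodes with no high-degree neighbor
            auxiliary[key].add(node)
    return dict(auxiliary)

# Two hubs (0 and 5) each adjacent to the same four low-degree nodes:
adj = {0: {1, 2, 3, 4}, 5: {1, 2, 3, 4},
       1: {0, 5}, 2: {0, 5}, 3: {0, 5}, 4: {0, 5}}
print(group_by_high_degree_neighbors(adj, threshold=3))
```

`dedensify` then replaces each group's edges with a single compressor node whenever doing so reduces the total edge count.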
diff --git a/networkx/algorithms/swap.py b/networkx/algorithms/swap.py
index 26a1f31..c02a4b3 100644
--- a/networkx/algorithms/swap.py
+++ b/networkx/algorithms/swap.py
@@ -6,7 +6,127 @@ import math
 import networkx as nx
 from networkx.utils import py_random_state
 
-__all__ = ["double_edge_swap", "connected_double_edge_swap"]
+__all__ = ["double_edge_swap", "connected_double_edge_swap", "directed_edge_swap"]
+
+
+@py_random_state(3)
+@nx.utils.not_implemented_for("undirected")
+def directed_edge_swap(G, *, nswap=1, max_tries=100, seed=None):
+    """Swap three edges in a directed graph while keeping the node degrees fixed.
+
+    A directed edge swap swaps three edges such that a -> b -> c -> d becomes
+    a -> c -> b -> d. This pattern of swapping allows all possible states with the
+    same in- and out-degree distribution in a directed graph to be reached.
+
+    If the swap would create parallel edges (e.g. if a -> c already existed in the
+    previous example), another attempt is made to find a suitable trio of edges.
+
+    Parameters
+    ----------
+    G : DiGraph
+       A directed graph
+
+    nswap : integer (optional, default=1)
+       Number of three-edge (directed) swaps to perform
+
+    max_tries : integer (optional, default=100)
+       Maximum number of attempts to swap edges
+
+    seed : integer, random_state, or None (default)
+        Indicator of random number generation state.
+        See :ref:`Randomness<randomness>`.
+
+    Returns
+    -------
+    G : DiGraph
+       The graph after the edges are swapped.
+
+    Raises
+    ------
+    NetworkXError
+        If `G` is not directed, or
+        If nswap > max_tries, or
+        If there are fewer than 4 nodes or 3 edges in `G`.
+    NetworkXAlgorithmError
+        If the number of swap attempts exceeds `max_tries` before `nswap` swaps are made
+
+    Notes
+    -----
+    Does not enforce any connectivity constraints.
+
+    The graph G is modified in place.
+
+    References
+    ----------
+    .. [1] Erdős, Péter L., et al. “A Simple Havel-Hakimi Type Algorithm to Realize
+           Graphical Degree Sequences of Directed Graphs.” ArXiv:0905.4913 [Math],
+           Jan. 2010. https://doi.org/10.48550/arXiv.0905.4913.
+           Published 2010 in Elec. J. Combinatorics (17(1)). R66.
+           http://www.combinatorics.org/Volume_17/PDF/v17i1r66.pdf
+    .. [2] “Combinatorics - Reaching All Possible Simple Directed Graphs with a given
+           Degree Sequence with 2-Edge Swaps.” Mathematics Stack Exchange,
+           https://math.stackexchange.com/questions/22272/. Accessed 30 May 2022.
+    """
+    if nswap > max_tries:
+        raise nx.NetworkXError("Number of swaps > number of tries allowed.")
+    if len(G) < 4:
+        raise nx.NetworkXError("DiGraph has fewer than four nodes.")
+    if len(G.edges) < 3:
+        raise nx.NetworkXError("DiGraph has fewer than 3 edges")
+
+    # Instead of choosing uniformly at random from a generated edge list,
+    # this algorithm chooses nonuniformly from the set of nodes with
+    # probability weighted by degree.
+    tries = 0
+    swapcount = 0
+    keys, degrees = zip(*G.degree())  # keys, degree
+    cdf = nx.utils.cumulative_distribution(degrees)  # cdf of degree
+    discrete_sequence = nx.utils.discrete_sequence
+
+    while swapcount < nswap:
+        # choose source node index from discrete distribution
+        start_index = discrete_sequence(1, cdistribution=cdf, seed=seed)[0]
+        start = keys[start_index]
+        tries += 1
+
+        if tries > max_tries:
+            msg = f"Maximum number of swap attempts ({tries}) exceeded before desired swaps achieved ({nswap})."
+            raise nx.NetworkXAlgorithmError(msg)
+
+        # If the given node doesn't have any out edges, then there isn't anything to swap
+        if G.out_degree(start) == 0:
+            continue
+        second = seed.choice(list(G.succ[start]))
+        if start == second:
+            continue
+
+        if G.out_degree(second) == 0:
+            continue
+        third = seed.choice(list(G.succ[second]))
+        if second == third:
+            continue
+
+        if G.out_degree(third) == 0:
+            continue
+        fourth = seed.choice(list(G.succ[third]))
+        if third == fourth:
+            continue
+
+        if (
+            third not in G.succ[start]
+            and fourth not in G.succ[second]
+            and second not in G.succ[third]
+        ):
+            # Swap nodes
+            G.add_edge(start, third)
+            G.add_edge(third, second)
+            G.add_edge(second, fourth)
+            G.remove_edge(start, second)
+            G.remove_edge(second, third)
+            G.remove_edge(third, fourth)
+            swapcount += 1
+
+    return G
 
 
 @py_random_state(3)
@@ -43,6 +163,15 @@ def double_edge_swap(G, nswap=1, max_tries=100, seed=None):
     G : graph
        The graph after double edge swaps.
 
+    Raises
+    ------
+    NetworkXError
+        If `G` is directed, or
+        If `nswap` > `max_tries`, or
+        If there are fewer than 4 nodes or 2 edges in `G`.
+    NetworkXAlgorithmError
+        If the number of swap attempts exceeds `max_tries` before `nswap` swaps are made
+
     Notes
     -----
     Does not enforce any connectivity constraints.
@@ -50,11 +179,15 @@ def double_edge_swap(G, nswap=1, max_tries=100, seed=None):
     The graph G is modified in place.
     """
     if G.is_directed():
-        raise nx.NetworkXError("double_edge_swap() not defined for directed graphs.")
+        raise nx.NetworkXError(
+            "double_edge_swap() not defined for directed graphs. Use directed_edge_swap instead."
+        )
     if nswap > max_tries:
         raise nx.NetworkXError("Number of swaps > number of tries allowed.")
     if len(G) < 4:
-        raise nx.NetworkXError("Graph has less than four nodes.")
+        raise nx.NetworkXError("Graph has fewer than four nodes.")
+    if len(G.edges) < 2:
+        raise nx.NetworkXError("Graph has fewer than 2 edges")
     # Instead of choosing uniformly at random from a generated edge list,
     # this algorithm chooses nonuniformly from the set of nodes with
     # probability weighted by degree.
@@ -165,7 +298,7 @@ def connected_double_edge_swap(G, nswap=1, _window_threshold=3, seed=None):
     if not nx.is_connected(G):
         raise nx.NetworkXError("Graph not connected")
     if len(G) < 4:
-        raise nx.NetworkXError("Graph has less than four nodes.")
+        raise nx.NetworkXError("Graph has fewer than four nodes.")
     n = 0
     swapcount = 0
     deg = G.degree()
@@ -230,7 +363,7 @@ def connected_double_edge_swap(G, nswap=1, _window_threshold=3, seed=None):
             while wcount < window and n < nswap:
                 # Pick two random edges without creating the edge list. Choose
                 # source nodes from the discrete degree distribution.
-                (ui, xi) = nx.utils.discrete_sequence(2, cdistribution=cdf)
+                (ui, xi) = discrete_sequence(2, cdistribution=cdf, seed=seed)
                 # If the source nodes are the same, skip this pair.
                 if ui == xi:
                     continue
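The new `directed_edge_swap` above rewires a walk a -> b -> c -> d into a -> c -> b -> d, which leaves every in- and out-degree unchanged. A self-contained sketch of one such rotation on a plain successor map, not the networkx implementation (it conservatively requires all four nodes to be distinct, which rules out self-loops):

```python
import random

def three_edge_swap(succ, rng):
    """Attempt one a->b->c->d to a->c->b->d rotation; return True if applied.

    succ maps each node to the set of its successors (a simple digraph).
    Attempts that dead-end, would create a self-loop, or would duplicate
    an existing edge are rejected and return False.
    """
    a = rng.choice(sorted(succ))
    if not succ[a]:
        return False                   # no out-edge to start the walk
    b = rng.choice(sorted(succ[a]))
    if not succ[b]:
        return False
    c = rng.choice(sorted(succ[b]))
    if not succ[c]:
        return False
    d = rng.choice(sorted(succ[c]))
    if len({a, b, c, d}) < 4:
        return False                   # rotation could create a self-loop
    if c in succ[a] or d in succ[b] or b in succ[c]:
        return False                   # rotation would duplicate an edge
    succ[a].remove(b); succ[a].add(c)  # a -> b becomes a -> c
    succ[b].remove(c); succ[b].add(d)  # b -> c becomes b -> d
    succ[c].remove(d); succ[c].add(b)  # c -> d becomes c -> b
    return True

# On a directed 5-cycle the first attempt always finds a valid walk,
# and the degree sequence is preserved afterwards.
succ = {1: {2}, 2: {3}, 3: {4}, 4: {5}, 5: {1}}
three_edge_swap(succ, random.Random(0))
print(succ)
```

Each node keeps out-degree 1 and in-degree 1 after the rotation; only which edges realize those degrees changes.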
diff --git a/networkx/algorithms/tests/test_chordal.py b/networkx/algorithms/tests/test_chordal.py
index c72699c..2d4b757 100644
--- a/networkx/algorithms/tests/test_chordal.py
+++ b/networkx/algorithms/tests/test_chordal.py
@@ -89,11 +89,11 @@ class TestMCS:
             frozenset([2, 3, 4]),
             frozenset([3, 4, 5, 6]),
         }
-        assert nx.chordal_graph_cliques(self.chordal_G) == cliques
+        assert {c for c in nx.chordal_graph_cliques(self.chordal_G)} == cliques
         with pytest.raises(nx.NetworkXError, match="Input graph is not chordal"):
-            nx.chordal_graph_cliques(self.non_chordal_G)
+            {c for c in nx.chordal_graph_cliques(self.non_chordal_G)}
         with pytest.raises(nx.NetworkXError, match="Input graph is not chordal"):
-            nx.chordal_graph_cliques(self.self_loop_G)
+            {c for c in nx.chordal_graph_cliques(self.self_loop_G)}
 
     def test_chordal_find_cliques_path(self):
         G = nx.path_graph(10)
@@ -104,7 +104,7 @@ class TestMCS:
     def test_chordal_find_cliquesCC(self):
         cliques = {frozenset([1, 2, 3]), frozenset([2, 3, 4]), frozenset([3, 4, 5, 6])}
         cgc = nx.chordal_graph_cliques
-        assert cgc(self.connected_chordal_G) == cliques
+        assert {c for c in cgc(self.connected_chordal_G)} == cliques
 
     def test_complete_to_chordal_graph(self):
         fgrg = nx.fast_gnp_random_graph
diff --git a/networkx/algorithms/tests/test_clique.py b/networkx/algorithms/tests/test_clique.py
index f6d5335..f523801 100644
--- a/networkx/algorithms/tests/test_clique.py
+++ b/networkx/algorithms/tests/test_clique.py
@@ -69,80 +69,96 @@ class TestCliques:
 
     def test_clique_number(self):
         G = self.G
-        assert nx.graph_clique_number(G) == 4
-        assert nx.graph_clique_number(G, cliques=self.cl) == 4
+        with pytest.deprecated_call():
+            assert nx.graph_clique_number(G) == 4
+        with pytest.deprecated_call():
+            assert nx.graph_clique_number(G, cliques=self.cl) == 4
 
     def test_clique_number2(self):
         G = nx.Graph()
         G.add_nodes_from([1, 2, 3])
-        assert nx.graph_clique_number(G) == 1
+        with pytest.deprecated_call():
+            assert nx.graph_clique_number(G) == 1
 
     def test_clique_number3(self):
         G = nx.Graph()
-        assert nx.graph_clique_number(G) == 0
+        with pytest.deprecated_call():
+            assert nx.graph_clique_number(G) == 0
 
     def test_number_of_cliques(self):
         G = self.G
-        assert nx.graph_number_of_cliques(G) == 5
-        assert nx.graph_number_of_cliques(G, cliques=self.cl) == 5
-        assert nx.number_of_cliques(G, 1) == 1
-        assert list(nx.number_of_cliques(G, [1]).values()) == [1]
-        assert list(nx.number_of_cliques(G, [1, 2]).values()) == [1, 2]
-        assert nx.number_of_cliques(G, [1, 2]) == {1: 1, 2: 2}
-        assert nx.number_of_cliques(G, 2) == 2
-        assert nx.number_of_cliques(G) == {
-            1: 1,
-            2: 2,
-            3: 1,
-            4: 2,
-            5: 1,
-            6: 2,
-            7: 1,
-            8: 1,
-            9: 1,
-            10: 1,
-            11: 1,
-        }
-        assert nx.number_of_cliques(G, nodes=list(G)) == {
-            1: 1,
-            2: 2,
-            3: 1,
-            4: 2,
-            5: 1,
-            6: 2,
-            7: 1,
-            8: 1,
-            9: 1,
-            10: 1,
-            11: 1,
-        }
-        assert nx.number_of_cliques(G, nodes=[2, 3, 4]) == {2: 2, 3: 1, 4: 2}
-        assert nx.number_of_cliques(G, cliques=self.cl) == {
-            1: 1,
-            2: 2,
-            3: 1,
-            4: 2,
-            5: 1,
-            6: 2,
-            7: 1,
-            8: 1,
-            9: 1,
-            10: 1,
-            11: 1,
-        }
-        assert nx.number_of_cliques(G, list(G), cliques=self.cl) == {
-            1: 1,
-            2: 2,
-            3: 1,
-            4: 2,
-            5: 1,
-            6: 2,
-            7: 1,
-            8: 1,
-            9: 1,
-            10: 1,
-            11: 1,
-        }
+        with pytest.deprecated_call():
+            assert nx.graph_number_of_cliques(G) == 5
+        with pytest.deprecated_call():
+            assert nx.graph_number_of_cliques(G, cliques=self.cl) == 5
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, 1) == 1
+        with pytest.deprecated_call():
+            assert list(nx.number_of_cliques(G, [1]).values()) == [1]
+        with pytest.deprecated_call():
+            assert list(nx.number_of_cliques(G, [1, 2]).values()) == [1, 2]
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, [1, 2]) == {1: 1, 2: 2}
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, 2) == 2
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G) == {
+                1: 1,
+                2: 2,
+                3: 1,
+                4: 2,
+                5: 1,
+                6: 2,
+                7: 1,
+                8: 1,
+                9: 1,
+                10: 1,
+                11: 1,
+            }
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, nodes=list(G)) == {
+                1: 1,
+                2: 2,
+                3: 1,
+                4: 2,
+                5: 1,
+                6: 2,
+                7: 1,
+                8: 1,
+                9: 1,
+                10: 1,
+                11: 1,
+            }
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, nodes=[2, 3, 4]) == {2: 2, 3: 1, 4: 2}
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, cliques=self.cl) == {
+                1: 1,
+                2: 2,
+                3: 1,
+                4: 2,
+                5: 1,
+                6: 2,
+                7: 1,
+                8: 1,
+                9: 1,
+                10: 1,
+                11: 1,
+            }
+        with pytest.deprecated_call():
+            assert nx.number_of_cliques(G, list(G), cliques=self.cl) == {
+                1: 1,
+                2: 2,
+                3: 1,
+                4: 2,
+                5: 1,
+                6: 2,
+                7: 1,
+                8: 1,
+                9: 1,
+                10: 1,
+                11: 1,
+            }
 
     def test_node_clique_number(self):
         G = self.G
@@ -182,23 +198,31 @@ class TestCliques:
 
     def test_cliques_containing_node(self):
         G = self.G
-        assert nx.cliques_containing_node(G, 1) == [[2, 6, 1, 3]]
-        assert list(nx.cliques_containing_node(G, [1]).values()) == [[[2, 6, 1, 3]]]
-        assert [
-            sorted(c) for c in list(nx.cliques_containing_node(G, [1, 2]).values())
-        ] == [[[2, 6, 1, 3]], [[2, 6, 1, 3], [2, 6, 4]]]
-        result = nx.cliques_containing_node(G, [1, 2])
+        with pytest.deprecated_call():
+            assert nx.cliques_containing_node(G, 1) == [[2, 6, 1, 3]]
+        with pytest.deprecated_call():
+            assert list(nx.cliques_containing_node(G, [1]).values()) == [[[2, 6, 1, 3]]]
+        with pytest.deprecated_call():
+            assert [
+                sorted(c) for c in list(nx.cliques_containing_node(G, [1, 2]).values())
+            ] == [[[2, 6, 1, 3]], [[2, 6, 1, 3], [2, 6, 4]]]
+        with pytest.deprecated_call():
+            result = nx.cliques_containing_node(G, [1, 2])
         for k, v in result.items():
             result[k] = sorted(v)
         assert result == {1: [[2, 6, 1, 3]], 2: [[2, 6, 1, 3], [2, 6, 4]]}
-        assert nx.cliques_containing_node(G, 1) == [[2, 6, 1, 3]]
+        with pytest.deprecated_call():
+            assert nx.cliques_containing_node(G, 1) == [[2, 6, 1, 3]]
         expected = [{2, 6, 1, 3}, {2, 6, 4}]
-        answer = [set(c) for c in nx.cliques_containing_node(G, 2)]
+        with pytest.deprecated_call():
+            answer = [set(c) for c in nx.cliques_containing_node(G, 2)]
         assert answer in (expected, list(reversed(expected)))
 
-        answer = [set(c) for c in nx.cliques_containing_node(G, 2, cliques=self.cl)]
+        with pytest.deprecated_call():
+            answer = [set(c) for c in nx.cliques_containing_node(G, 2, cliques=self.cl)]
         assert answer in (expected, list(reversed(expected)))
-        assert len(nx.cliques_containing_node(G)) == 11
+        with pytest.deprecated_call():
+            assert len(nx.cliques_containing_node(G)) == 11
 
     def test_make_clique_bipartite(self):
         G = self.G
diff --git a/networkx/algorithms/tests/test_core.py b/networkx/algorithms/tests/test_core.py
index db2d277..535af31 100644
--- a/networkx/algorithms/tests/test_core.py
+++ b/networkx/algorithms/tests/test_core.py
@@ -1,3 +1,5 @@
+import pytest
+
 import networkx as nx
 from networkx.utils import nodes_equal
 
@@ -67,6 +69,12 @@ class TestCore:
         assert nodes_equal(nodes_by_core[1], [1, 3])
         assert nodes_equal(nodes_by_core[2], [2, 4, 5, 6])
 
+    def test_core_number_self_loop(self):
+        G = nx.cycle_graph(3)
+        G.add_edge(0, 0)
+        with pytest.raises(nx.NetworkXError, match="Input graph has self loops"):
+            nx.core_number(G)
+
     def test_directed_core_number(self):
         """core number had a bug for directed graphs found in issue #1959"""
         # small example where too timid edge removal can make cn[2] = 3
@@ -169,3 +177,9 @@ class TestCore:
         assert nodes_equal(nodes_by_layer[3], [9, 11])
         assert nodes_equal(nodes_by_layer[4], [1, 2, 4, 5, 6, 8])
         assert nodes_equal(nodes_by_layer[5], [3, 7])
+
+    def test_onion_self_loop(self):
+        G = nx.cycle_graph(3)
+        G.add_edge(0, 0)
+        with pytest.raises(nx.NetworkXError, match="Input graph contains self loops"):
+            nx.onion_layers(G)
diff --git a/networkx/algorithms/tests/test_d_separation.py b/networkx/algorithms/tests/test_d_separation.py
index 23367a0..74c16ae 100644
--- a/networkx/algorithms/tests/test_d_separation.py
+++ b/networkx/algorithms/tests/test_d_separation.py
@@ -132,11 +132,16 @@ def test_undirected_graphs_are_not_supported():
     """
     Test that undirected graphs are not supported.
 
-    d-separation does not apply in the case of undirected graphs.
+    d-separation and its related algorithms do not apply in
+    the case of undirected graphs.
     """
+    g = nx.path_graph(3, nx.Graph)
     with pytest.raises(nx.NetworkXNotImplemented):
-        g = nx.path_graph(3, nx.Graph)
         nx.d_separated(g, {0}, {1}, {2})
+    with pytest.raises(nx.NetworkXNotImplemented):
+        nx.is_minimal_d_separator(g, {0}, {1}, {2})
+    with pytest.raises(nx.NetworkXNotImplemented):
+        nx.minimal_d_separator(g, {0}, {1})
 
 
 def test_cyclic_graphs_raise_error():
@@ -145,9 +150,13 @@ def test_cyclic_graphs_raise_error():
 
     This is because PGMs assume a directed acyclic graph.
     """
+    g = nx.cycle_graph(3, nx.DiGraph)
     with pytest.raises(nx.NetworkXError):
-        g = nx.cycle_graph(3, nx.DiGraph)
         nx.d_separated(g, {0}, {1}, {2})
+    with pytest.raises(nx.NetworkXError):
+        nx.minimal_d_separator(g, {0}, {1})
+    with pytest.raises(nx.NetworkXError):
+        nx.is_minimal_d_separator(g, {0}, {1}, {2})
 
 
 def test_invalid_nodes_raise_error(asia_graph):
@@ -156,3 +165,38 @@ def test_invalid_nodes_raise_error(asia_graph):
     """
     with pytest.raises(nx.NodeNotFound):
         nx.d_separated(asia_graph, {0}, {1}, {2})
+    with pytest.raises(nx.NodeNotFound):
+        nx.is_minimal_d_separator(asia_graph, 0, 1, {2})
+    with pytest.raises(nx.NodeNotFound):
+        nx.minimal_d_separator(asia_graph, 0, 1)
+
+
+def test_minimal_d_separator():
+    # Case 1:
+    # create a graph A -> B <- C
+    # B -> D -> E;
+    # B -> F;
+    # G -> E;
+    edge_list = [("A", "B"), ("C", "B"), ("B", "D"), ("D", "E"), ("B", "F"), ("G", "E")]
+    G = nx.DiGraph(edge_list)
+    assert not nx.d_separated(G, {"B"}, {"E"}, set())
+
+    # minimal set of the corresponding graph
+    # for B and E should be (D,)
+    Zmin = nx.minimal_d_separator(G, "B", "E")
+
+    # the minimal separating set should pass the test for minimality
+    assert nx.is_minimal_d_separator(G, "B", "E", Zmin)
+    assert Zmin == {"D"}
+
+    # Case 2:
+    # create a graph A -> B -> C
+    # B -> D -> C;
+    edge_list = [("A", "B"), ("B", "C"), ("B", "D"), ("D", "C")]
+    G = nx.DiGraph(edge_list)
+    assert not nx.d_separated(G, {"A"}, {"C"}, set())
+    Zmin = nx.minimal_d_separator(G, "A", "C")
+
+    # the minimal separating set should pass the test for minimality
+    assert nx.is_minimal_d_separator(G, "A", "C", Zmin)
+    assert Zmin == {"B"}
diff --git a/networkx/algorithms/tests/test_dag.py b/networkx/algorithms/tests/test_dag.py
index b39b033..7ad6a77 100644
--- a/networkx/algorithms/tests/test_dag.py
+++ b/networkx/algorithms/tests/test_dag.py
@@ -60,6 +60,31 @@ class TestDagLongestPath:
         # this will raise NotImplementedError when nodes need to be ordered
         nx.dag_longest_path(G)
 
+    def test_multigraph_unweighted(self):
+        edges = [(1, 2), (2, 3), (2, 3), (3, 4), (4, 5), (1, 3), (1, 5), (3, 5)]
+        G = nx.MultiDiGraph(edges)
+        assert nx.dag_longest_path(G) == [1, 2, 3, 4, 5]
+
+    def test_multigraph_weighted(self):
+        G = nx.MultiDiGraph()
+        edges = [
+            (1, 2, 2),
+            (2, 3, 2),
+            (1, 3, 1),
+            (1, 3, 5),
+            (1, 3, 2),
+        ]
+        G.add_weighted_edges_from(edges)
+        assert nx.dag_longest_path(G) == [1, 3]
+
+    def test_multigraph_weighted_default_weight(self):
+        G = nx.MultiDiGraph([(1, 2), (2, 3)])  # Unweighted edges
+        G.add_weighted_edges_from([(1, 3, 1), (1, 3, 5), (1, 3, 2)])
+
+        # Default value for default weight is 1
+        assert nx.dag_longest_path(G) == [1, 3]
+        assert nx.dag_longest_path(G, default_weight=3) == [1, 2, 3]
+
 
 class TestDagLongestPathLength:
     """Unit tests for computing the length of a longest path in a
@@ -91,6 +116,23 @@ class TestDagLongestPathLength:
         G.add_weighted_edges_from(edges)
         assert nx.dag_longest_path_length(G) == 5
 
+    def test_multigraph_unweighted(self):
+        edges = [(1, 2), (2, 3), (2, 3), (3, 4), (4, 5), (1, 3), (1, 5), (3, 5)]
+        G = nx.MultiDiGraph(edges)
+        assert nx.dag_longest_path_length(G) == 4
+
+    def test_multigraph_weighted(self):
+        G = nx.MultiDiGraph()
+        edges = [
+            (1, 2, 2),
+            (2, 3, 2),
+            (1, 3, 1),
+            (1, 3, 5),
+            (1, 3, 2),
+        ]
+        G.add_weighted_edges_from(edges)
+        assert nx.dag_longest_path_length(G) == 5
+
 
 class TestDAG:
     @classmethod
@@ -708,3 +750,22 @@ def test_ancestors_descendants_undirected():
     undirected graphs."""
     G = nx.path_graph(5)
     nx.ancestors(G, 2) == nx.descendants(G, 2) == {0, 1, 3, 4}
+
+
+def test_compute_v_structures_raise():
+    G = nx.Graph()
+    pytest.raises(nx.NetworkXNotImplemented, nx.compute_v_structures, G)
+
+
+def test_compute_v_structures():
+    edges = [(0, 1), (0, 2), (3, 2)]
+    G = nx.DiGraph(edges)
+
+    v_structs = set(nx.compute_v_structures(G))
+    assert len(v_structs) == 1
+    assert (0, 2, 3) in v_structs
+
+    edges = [("A", "B"), ("C", "B"), ("B", "D"), ("D", "E"), ("G", "E")]
+    G = nx.DiGraph(edges)
+    v_structs = set(nx.compute_v_structures(G))
+    assert len(v_structs) == 2
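The new `compute_v_structures` tests above enumerate colliders: triples u -> child <- v where the two parents are themselves non-adjacent. A stdlib-only sketch of that enumeration over parent and adjacency maps (the helper name `v_structures` and the dict-based graph representation are assumptions for illustration):

```python
from itertools import combinations

def v_structures(pred, adj):
    """Yield (u, child, v) colliders: u -> child <- v with u, v non-adjacent.

    pred maps node -> set of parents; adj maps node -> set of all nodes it
    is adjacent to in either direction.
    """
    for child, parents in pred.items():
        for u, v in combinations(sorted(parents), 2):
            if v not in adj[u]:  # parents must not be connected
                yield (u, child, v)

# The DiGraph from the first test case, edges [(0, 1), (0, 2), (3, 2)]:
pred = {0: set(), 1: {0}, 2: {0, 3}, 3: set()}
adj = {0: {1, 2}, 1: {0}, 2: {0, 3}, 3: {2}}
print(list(v_structures(pred, adj)))  # the single collider at node 2
```

On the second test case's graph the same enumeration finds two colliders, one at "B" and one at "E", matching the asserted count of 2.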
diff --git a/networkx/algorithms/tests/test_distance_measures.py b/networkx/algorithms/tests/test_distance_measures.py
index d7cec15..a5066f6 100644
--- a/networkx/algorithms/tests/test_distance_measures.py
+++ b/networkx/algorithms/tests/test_distance_measures.py
@@ -7,15 +7,6 @@ from networkx import convert_node_labels_to_integers as cnlti
 from networkx.algorithms.distance_measures import _extrema_bounding
 
 
-@pytest.mark.parametrize(
-    "compute", ("diameter", "radius", "periphery", "center", "eccentricities")
-)
-def test_extrema_bounding_deprecated(compute):
-    G = nx.complete_graph(3)
-    with pytest.deprecated_call():
-        nx.extrema_bounding(G, compute=compute)
-
-
 def test__extrema_bounding_invalid_compute_kwarg():
     G = nx.path_graph(3)
     with pytest.raises(ValueError, match="compute must be one of"):
@@ -106,6 +97,228 @@ class TestDistance:
             nx.eccentricity(DG)
 
 
+class TestWeightedDistance:
+    def setup_method(self):
+        G = nx.Graph()
+        G.add_edge(0, 1, weight=0.6, cost=0.6, high_cost=6)
+        G.add_edge(0, 2, weight=0.2, cost=0.2, high_cost=2)
+        G.add_edge(2, 3, weight=0.1, cost=0.1, high_cost=1)
+        G.add_edge(2, 4, weight=0.7, cost=0.7, high_cost=7)
+        G.add_edge(2, 5, weight=0.9, cost=0.9, high_cost=9)
+        G.add_edge(1, 5, weight=0.3, cost=0.3, high_cost=3)
+        self.G = G
+        self.weight_fn = lambda v, u, e: 2
+
+    def test_eccentricity_weight_None(self):
+        assert nx.eccentricity(self.G, 1, weight=None) == 3
+        e = nx.eccentricity(self.G, weight=None)
+        assert e[1] == 3
+
+        e = nx.eccentricity(self.G, v=1, weight=None)
+        assert e == 3
+
+        # This behavior changed in version 1.8 (ticket #739)
+        e = nx.eccentricity(self.G, v=[1, 1], weight=None)
+        assert e[1] == 3
+        e = nx.eccentricity(self.G, v=[1, 2], weight=None)
+        assert e[1] == 3
+
+    def test_eccentricity_weight_attr(self):
+        assert nx.eccentricity(self.G, 1, weight="weight") == 1.5
+        e = nx.eccentricity(self.G, weight="weight")
+        assert (
+            e
+            == nx.eccentricity(self.G, weight="cost")
+            != nx.eccentricity(self.G, weight="high_cost")
+        )
+        assert e[1] == 1.5
+
+        e = nx.eccentricity(self.G, v=1, weight="weight")
+        assert e == 1.5
+
+        # This behavior changed in version 1.8 (ticket #739)
+        e = nx.eccentricity(self.G, v=[1, 1], weight="weight")
+        assert e[1] == 1.5
+        e = nx.eccentricity(self.G, v=[1, 2], weight="weight")
+        assert e[1] == 1.5
+
+    def test_eccentricity_weight_fn(self):
+        assert nx.eccentricity(self.G, 1, weight=self.weight_fn) == 6
+        e = nx.eccentricity(self.G, weight=self.weight_fn)
+        assert e[1] == 6
+
+        e = nx.eccentricity(self.G, v=1, weight=self.weight_fn)
+        assert e == 6
+
+        # This behavior changed in version 1.8 (ticket #739)
+        e = nx.eccentricity(self.G, v=[1, 1], weight=self.weight_fn)
+        assert e[1] == 6
+        e = nx.eccentricity(self.G, v=[1, 2], weight=self.weight_fn)
+        assert e[1] == 6
+
+    def test_diameter_weight_None(self):
+        assert nx.diameter(self.G, weight=None) == 3
+
+    def test_diameter_weight_attr(self):
+        assert (
+            nx.diameter(self.G, weight="weight")
+            == nx.diameter(self.G, weight="cost")
+            == 1.6
+            != nx.diameter(self.G, weight="high_cost")
+        )
+
+    def test_diameter_weight_fn(self):
+        assert nx.diameter(self.G, weight=self.weight_fn) == 6
+
+    def test_radius_weight_None(self):
+        assert pytest.approx(nx.radius(self.G, weight=None)) == 2
+
+    def test_radius_weight_attr(self):
+        assert (
+            pytest.approx(nx.radius(self.G, weight="weight"))
+            == pytest.approx(nx.radius(self.G, weight="cost"))
+            == 0.9
+            != nx.radius(self.G, weight="high_cost")
+        )
+
+    def test_radius_weight_fn(self):
+        assert nx.radius(self.G, weight=self.weight_fn) == 4
+
+    def test_periphery_weight_None(self):
+        for v in set(nx.periphery(self.G, weight=None)):
+            assert nx.eccentricity(self.G, v, weight=None) == nx.diameter(
+                self.G, weight=None
+            )
+
+    def test_periphery_weight_attr(self):
+        periphery = set(nx.periphery(self.G, weight="weight"))
+        assert (
+            periphery
+            == set(nx.periphery(self.G, weight="cost"))
+            == set(nx.periphery(self.G, weight="high_cost"))
+        )
+        for v in periphery:
+            assert (
+                nx.eccentricity(self.G, v, weight="high_cost")
+                != nx.eccentricity(self.G, v, weight="weight")
+                == nx.eccentricity(self.G, v, weight="cost")
+                == nx.diameter(self.G, weight="weight")
+                == nx.diameter(self.G, weight="cost")
+                != nx.diameter(self.G, weight="high_cost")
+            )
+            assert nx.eccentricity(self.G, v, weight="high_cost") == nx.diameter(
+                self.G, weight="high_cost"
+            )
+
+    def test_periphery_weight_fn(self):
+        for v in set(nx.periphery(self.G, weight=self.weight_fn)):
+            assert nx.eccentricity(self.G, v, weight=self.weight_fn) == nx.diameter(
+                self.G, weight=self.weight_fn
+            )
+
+    def test_center_weight_None(self):
+        for v in set(nx.center(self.G, weight=None)):
+            assert pytest.approx(nx.eccentricity(self.G, v, weight=None)) == nx.radius(
+                self.G, weight=None
+            )
+
+    def test_center_weight_attr(self):
+        center = set(nx.center(self.G, weight="weight"))
+        assert (
+            center
+            == set(nx.center(self.G, weight="cost"))
+            != set(nx.center(self.G, weight="high_cost"))
+        )
+        for v in center:
+            assert (
+                nx.eccentricity(self.G, v, weight="high_cost")
+                != pytest.approx(nx.eccentricity(self.G, v, weight="weight"))
+                == pytest.approx(nx.eccentricity(self.G, v, weight="cost"))
+                == nx.radius(self.G, weight="weight")
+                == nx.radius(self.G, weight="cost")
+                != nx.radius(self.G, weight="high_cost")
+            )
+            assert nx.eccentricity(self.G, v, weight="high_cost") == nx.radius(
+                self.G, weight="high_cost"
+            )
+
+    def test_center_weight_fn(self):
+        for v in set(nx.center(self.G, weight=self.weight_fn)):
+            assert nx.eccentricity(self.G, v, weight=self.weight_fn) == nx.radius(
+                self.G, weight=self.weight_fn
+            )
+
+    def test_bound_diameter_weight_None(self):
+        assert nx.diameter(self.G, usebounds=True, weight=None) == 3
+
+    def test_bound_diameter_weight_attr(self):
+        assert (
+            nx.diameter(self.G, usebounds=True, weight="high_cost")
+            != nx.diameter(self.G, usebounds=True, weight="weight")
+            == nx.diameter(self.G, usebounds=True, weight="cost")
+            == 1.6
+            != nx.diameter(self.G, usebounds=True, weight="high_cost")
+        )
+        assert nx.diameter(self.G, usebounds=True, weight="high_cost") == nx.diameter(
+            self.G, usebounds=True, weight="high_cost"
+        )
+
+    def test_bound_diameter_weight_fn(self):
+        assert nx.diameter(self.G, usebounds=True, weight=self.weight_fn) == 6
+
+    def test_bound_radius_weight_None(self):
+        assert pytest.approx(nx.radius(self.G, usebounds=True, weight=None)) == 2
+
+    def test_bound_radius_weight_attr(self):
+        assert (
+            nx.radius(self.G, usebounds=True, weight="high_cost")
+            != pytest.approx(nx.radius(self.G, usebounds=True, weight="weight"))
+            == pytest.approx(nx.radius(self.G, usebounds=True, weight="cost"))
+            == 0.9
+            != nx.radius(self.G, usebounds=True, weight="high_cost")
+        )
+        assert nx.radius(self.G, usebounds=True, weight="high_cost") == nx.radius(
+            self.G, usebounds=True, weight="high_cost"
+        )
+
+    def test_bound_radius_weight_fn(self):
+        assert nx.radius(self.G, usebounds=True, weight=self.weight_fn) == 4
+
+    def test_bound_periphery_weight_None(self):
+        result = {1, 3, 4}
+        assert set(nx.periphery(self.G, usebounds=True, weight=None)) == result
+
+    def test_bound_periphery_weight_attr(self):
+        result = {4, 5}
+        assert (
+            set(nx.periphery(self.G, usebounds=True, weight="weight"))
+            == set(nx.periphery(self.G, usebounds=True, weight="cost"))
+            == result
+        )
+
+    def test_bound_periphery_weight_fn(self):
+        result = {1, 3, 4}
+        assert (
+            set(nx.periphery(self.G, usebounds=True, weight=self.weight_fn)) == result
+        )
+
+    def test_bound_center_weight_None(self):
+        result = {0, 2, 5}
+        assert set(nx.center(self.G, usebounds=True, weight=None)) == result
+
+    def test_bound_center_weight_attr(self):
+        result = {0}
+        assert (
+            set(nx.center(self.G, usebounds=True, weight="weight"))
+            == set(nx.center(self.G, usebounds=True, weight="cost"))
+            == result
+        )
+
+    def test_bound_center_weight_fn(self):
+        result = {0, 2, 5}
+        assert set(nx.center(self.G, usebounds=True, weight=self.weight_fn)) == result
+
+
 class TestResistanceDistance:
     @classmethod
     def setup_class(cls):
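(Editor's aside, not part of the diff.) The weighted distance-measure tests above exercise the `weight` parameter on `nx.diameter`, `nx.radius`, and `nx.center`; a minimal illustrative sketch, with a toy graph and `cost` values invented here:

```python
import networkx as nx

# A 4-node path: 0 - 1 - 2 - 3, with a "cost" attribute on each edge.
G = nx.path_graph(4)
nx.set_edge_attributes(G, {(0, 1): 5, (1, 2): 1, (2, 3): 5}, "cost")

# Unweighted measures count hops.
hop_diameter = nx.diameter(G)                  # longest shortest path in hops
hop_center = set(nx.center(G))                 # nodes of minimum eccentricity

# Weighted measures use Dijkstra distances over the named edge attribute.
cost_diameter = nx.diameter(G, weight="cost")  # 5 + 1 + 5 along the path
cost_center = set(nx.center(G, weight="cost"))
```

Here both variants agree that the inner nodes form the center, but the weighted diameter reflects the edge costs rather than the hop count.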
diff --git a/networkx/algorithms/tests/test_euler.py b/networkx/algorithms/tests/test_euler.py
index 7dfe2d1..cba66ee 100644
--- a/networkx/algorithms/tests/test_euler.py
+++ b/networkx/algorithms/tests/test_euler.py
@@ -273,3 +273,23 @@ class TestEulerize:
         G = nx.complete_graph(4)
         assert nx.is_eulerian(nx.eulerize(G))
         assert nx.is_eulerian(nx.eulerize(nx.MultiGraph(G)))
+
+    def test_on_non_eulerian_graph(self):
+        G = nx.cycle_graph(18)
+        G.add_edge(0, 18)
+        G.add_edge(18, 19)
+        G.add_edge(17, 19)
+        G.add_edge(4, 20)
+        G.add_edge(20, 21)
+        G.add_edge(21, 22)
+        G.add_edge(22, 23)
+        G.add_edge(23, 24)
+        G.add_edge(24, 25)
+        G.add_edge(25, 26)
+        G.add_edge(26, 27)
+        G.add_edge(27, 28)
+        G.add_edge(28, 13)
+        assert not nx.is_eulerian(G)
+        G = nx.eulerize(G)
+        assert nx.is_eulerian(G)
+        assert nx.number_of_edges(G) == 39
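(Editor's aside, not part of the diff.) A small usage sketch of the `nx.eulerize` behavior checked by the new test, on a simpler illustrative graph:

```python
import networkx as nx

G = nx.path_graph(4)          # end nodes 0 and 3 have odd degree
assert not nx.is_eulerian(G)

# eulerize duplicates edges along shortest paths between odd-degree nodes
# until every node has even degree, returning a MultiGraph.
H = nx.eulerize(G)
now_eulerian = nx.is_eulerian(H)
all_even = all(d % 2 == 0 for _, d in H.degree())
```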
diff --git a/networkx/algorithms/tests/test_lowest_common_ancestors.py b/networkx/algorithms/tests/test_lowest_common_ancestors.py
index 512a1fe..1a8fd03 100644
--- a/networkx/algorithms/tests/test_lowest_common_ancestors.py
+++ b/networkx/algorithms/tests/test_lowest_common_ancestors.py
@@ -136,12 +136,6 @@ class TestTreeLCA:
         with pytest.raises(NNI):
             next(all_pairs_lca(G))
         pytest.raises(NNI, nx.lowest_common_ancestor, G, 0, 1)
-        G = nx.MultiDiGraph([(0, 1)])
-        with pytest.raises(NNI):
-            next(tree_all_pairs_lca(G))
-        with pytest.raises(NNI):
-            next(all_pairs_lca(G))
-        pytest.raises(NNI, nx.lowest_common_ancestor, G, 0, 1)
 
     def test_tree_all_pairs_lca_trees_without_LCAs(self):
         G = nx.DiGraph()
@@ -150,6 +144,41 @@ class TestTreeLCA:
         assert ans == [((3, 3), 3)]
 
 
+class TestMultiTreeLCA(TestTreeLCA):
+    @classmethod
+    def setup_class(cls):
+        cls.DG = nx.MultiDiGraph()
+        edges = [(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)]
+        cls.DG.add_edges_from(edges)
+        cls.ans = dict(tree_all_pairs_lca(cls.DG, 0))
+        # add multiedges
+        cls.DG.add_edges_from(edges)
+
+        gold = {(n, n): n for n in cls.DG}
+        gold.update({(0, i): 0 for i in range(1, 7)})
+        gold.update(
+            {
+                (1, 2): 0,
+                (1, 3): 1,
+                (1, 4): 1,
+                (1, 5): 0,
+                (1, 6): 0,
+                (2, 3): 0,
+                (2, 4): 0,
+                (2, 5): 2,
+                (2, 6): 2,
+                (3, 4): 1,
+                (3, 5): 0,
+                (3, 6): 0,
+                (4, 5): 0,
+                (4, 6): 0,
+                (5, 6): 2,
+            }
+        )
+
+        cls.gold = gold
+
+
 class TestDAGLCA:
     @classmethod
     def setup_class(cls):
@@ -316,6 +345,7 @@ class TestDAGLCA:
 
     def test_all_pairs_lca_one_pair_gh4942(self):
         G = nx.DiGraph()
+        # Note: the order of edge addition is critical to the test
         G.add_edge(0, 1)
         G.add_edge(2, 0)
         G.add_edge(2, 3)
@@ -323,3 +353,75 @@ class TestDAGLCA:
         G.add_edge(5, 2)
 
         assert nx.lowest_common_ancestor(G, 1, 3) == 2
+
+
+class TestMultiDiGraph_DAGLCA(TestDAGLCA):
+    @classmethod
+    def setup_class(cls):
+        cls.DG = nx.MultiDiGraph()
+        nx.add_path(cls.DG, (0, 1, 2, 3))
+        # add multiedges
+        nx.add_path(cls.DG, (0, 1, 2, 3))
+        nx.add_path(cls.DG, (0, 4, 3))
+        nx.add_path(cls.DG, (0, 5, 6, 8, 3))
+        nx.add_path(cls.DG, (5, 7, 8))
+        cls.DG.add_edge(6, 2)
+        cls.DG.add_edge(7, 2)
+
+        cls.root_distance = nx.shortest_path_length(cls.DG, source=0)
+
+        cls.gold = {
+            (1, 1): 1,
+            (1, 2): 1,
+            (1, 3): 1,
+            (1, 4): 0,
+            (1, 5): 0,
+            (1, 6): 0,
+            (1, 7): 0,
+            (1, 8): 0,
+            (2, 2): 2,
+            (2, 3): 2,
+            (2, 4): 0,
+            (2, 5): 5,
+            (2, 6): 6,
+            (2, 7): 7,
+            (2, 8): 7,
+            (3, 3): 3,
+            (3, 4): 4,
+            (3, 5): 5,
+            (3, 6): 6,
+            (3, 7): 7,
+            (3, 8): 8,
+            (4, 4): 4,
+            (4, 5): 0,
+            (4, 6): 0,
+            (4, 7): 0,
+            (4, 8): 0,
+            (5, 5): 5,
+            (5, 6): 5,
+            (5, 7): 5,
+            (5, 8): 5,
+            (6, 6): 6,
+            (6, 7): 5,
+            (6, 8): 6,
+            (7, 7): 7,
+            (7, 8): 7,
+            (8, 8): 8,
+        }
+        cls.gold.update(((0, n), 0) for n in cls.DG)
+
+
+def test_all_pairs_lca_self_ancestors():
+    """Self-ancestors should always be the node itself, i.e. lca of (0, 0) is 0.
+    See gh-4458."""
+    # DAG for test - note order of node/edge addition is relevant
+    G = nx.DiGraph()
+    G.add_nodes_from(range(5))
+    G.add_edges_from([(1, 0), (2, 0), (3, 2), (4, 1), (4, 3)])
+
+    ap_lca = nx.all_pairs_lowest_common_ancestor
+    assert all(u == v == a for (u, v), a in ap_lca(G) if u == v)
+    MG = nx.MultiDiGraph(G)
+    assert all(u == v == a for (u, v), a in ap_lca(MG) if u == v)
+    MG.add_edges_from([(1, 0), (2, 0)])
+    assert all(u == v == a for (u, v), a in ap_lca(MG) if u == v)
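(Editor's aside, not part of the diff.) The LCA tests above reuse the same binary-tree shape; a minimal sketch of the two query styles on that tree:

```python
import networkx as nx

# The tree shape used in TestTreeLCA/TestMultiTreeLCA above.
G = nx.DiGraph([(0, 1), (0, 2), (1, 3), (1, 4), (2, 5), (2, 6)])

single = nx.lowest_common_ancestor(G, 3, 4)  # 3 and 4 share parent 1
pairs = dict(nx.all_pairs_lowest_common_ancestor(G, pairs=[(3, 4), (3, 5)]))
```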
diff --git a/networkx/algorithms/tests/test_matching.py b/networkx/algorithms/tests/test_matching.py
index 57603bc..37853e3 100644
--- a/networkx/algorithms/tests/test_matching.py
+++ b/networkx/algorithms/tests/test_matching.py
@@ -117,14 +117,12 @@ class TestMaxWeightMatching:
             nx.max_weight_matching(G), matching_dict_to_set({1: 2, 2: 1})
         )
         assert edges_equal(
-            nx.max_weight_matching(G, 1), matching_dict_to_set({1: 3, 2: 4, 3: 1, 4: 2})
+            nx.max_weight_matching(G, maxcardinality=True),
+            matching_dict_to_set({1: 3, 2: 4, 3: 1, 4: 2}),
         )
         assert edges_equal(
             nx.min_weight_matching(G), matching_dict_to_set({1: 2, 3: 4})
         )
-        assert edges_equal(
-            nx.min_weight_matching(G, 1), matching_dict_to_set({1: 2, 3: 4})
-        )
 
     def test_s_blossom(self):
         """Create S-blossom and use it for augmentation:"""
diff --git a/networkx/algorithms/tests/test_node_classification_deprecations.py b/networkx/algorithms/tests/test_node_classification_deprecations.py
deleted file mode 100644
index 2d12561..0000000
--- a/networkx/algorithms/tests/test_node_classification_deprecations.py
+++ /dev/null
@@ -1,41 +0,0 @@
-"""TODO: Remove this test module for version 3.0."""
-
-
-import sys
-
-import pytest
-
-# NOTE: It is necessary to prevent previous imports in the test suite from
-# "contaminating" the tests for the deprecation warnings by removing
-# node_classification from sys.modules.
-
-
-def test_hmn_deprecation_warning():
-    sys.modules.pop("networkx.algorithms.node_classification", None)
-    with pytest.warns(DeprecationWarning):
-        from networkx.algorithms.node_classification import hmn
-
-
-def test_lgc_deprecation_warning():
-    sys.modules.pop("networkx.algorithms.node_classification", None)
-    with pytest.warns(DeprecationWarning):
-        from networkx.algorithms.node_classification import lgc
-
-
-def test_no_warn_on_function_import(recwarn):
-    # Accessing the functions shouldn't raise any warning
-    sys.modules.pop("networkx.algorithms.node_classification", None)
-    from networkx.algorithms.node_classification import (
-        harmonic_function,
-        local_and_global_consistency,
-    )
-
-    assert len(recwarn) == 0
-
-
-def test_no_warn_on_package_import(recwarn):
-    # Accessing the package shouldn't raise any warning
-    sys.modules.pop("networkx.algorithms.node_classification", None)
-    from networkx.algorithms import node_classification
-
-    assert len(recwarn) == 0
diff --git a/networkx/algorithms/tests/test_planarity.py b/networkx/algorithms/tests/test_planarity.py
index 675a5d9..470b1d2 100644
--- a/networkx/algorithms/tests/test_planarity.py
+++ b/networkx/algorithms/tests/test_planarity.py
@@ -202,7 +202,7 @@ class TestLRPlanarity:
         self.check_graph(G, is_planar=True)
 
     def test_graph1(self):
-        G = nx.OrderedGraph(
+        G = nx.Graph(
             [
                 (3, 10),
                 (2, 13),
@@ -219,7 +219,7 @@ class TestLRPlanarity:
         self.check_graph(G, is_planar=True)
 
     def test_graph2(self):
-        G = nx.OrderedGraph(
+        G = nx.Graph(
             [
                 (1, 2),
                 (4, 13),
@@ -243,7 +243,7 @@ class TestLRPlanarity:
         self.check_graph(G, is_planar=False)
 
     def test_graph3(self):
-        G = nx.OrderedGraph(
+        G = nx.Graph(
             [
                 (0, 7),
                 (3, 11),
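(Editor's aside, not part of the diff.) The planarity tests above call `check_graph`, which wraps `nx.check_planarity`; a minimal sketch of that API on the classic examples:

```python
import networkx as nx

# check_planarity returns (is_planar, certificate); the certificate is a
# PlanarEmbedding when planar, and None otherwise (unless counterexample=True).
planar, embedding = nx.check_planarity(nx.complete_graph(4))  # K4 is planar
not_planar, cert = nx.check_planarity(nx.complete_graph(5))   # K5 is not
```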
diff --git a/networkx/algorithms/tests/test_similarity.py b/networkx/algorithms/tests/test_similarity.py
index 9b620de..a7ca1d8 100644
--- a/networkx/algorithms/tests/test_similarity.py
+++ b/networkx/algorithms/tests/test_similarity.py
@@ -523,7 +523,10 @@ class TestSimilarity:
         assert graph_edit_distance(G1, G2, node_match=nmatch, edge_match=ematch) == 1
 
     # note: nx.simrank_similarity_numpy not included because returns np.array
-    simrank_algs = [nx.simrank_similarity, nx.similarity._simrank_similarity_python]
+    simrank_algs = [
+        nx.simrank_similarity,
+        nx.algorithms.similarity._simrank_similarity_python,
+    ]
 
     @pytest.mark.parametrize("simrank_similarity", simrank_algs)
     def test_simrank_no_source_no_target(self, simrank_similarity):
diff --git a/networkx/algorithms/tests/test_simple_paths.py b/networkx/algorithms/tests/test_simple_paths.py
index 08348b9..c2d4c00 100644
--- a/networkx/algorithms/tests/test_simple_paths.py
+++ b/networkx/algorithms/tests/test_simple_paths.py
@@ -58,6 +58,10 @@ class TestIsSimplePath:
         G = nx.path_graph(2)
         assert not nx.is_simple_path(G, [0, 2])
 
+    def test_missing_starting_node(self):
+        G = nx.path_graph(2)
+        assert not nx.is_simple_path(G, [2, 0])
+
     def test_directed_path(self):
         G = nx.DiGraph([(0, 1), (1, 2)])
         assert nx.is_simple_path(G, [0, 1, 2])
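(Editor's aside, not part of the diff.) The new `test_missing_starting_node` confirms that `is_simple_path` returns False rather than raising when a node is absent; a sketch of the three cases:

```python
import networkx as nx

G = nx.path_graph(4)  # 0 - 1 - 2 - 3
on_path = nx.is_simple_path(G, [0, 1, 2])   # consecutive nodes joined by edges
skip = nx.is_simple_path(G, [0, 2])         # 0 and 2 are not adjacent
missing = nx.is_simple_path(G, [9, 0])      # starting node not in G
```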
diff --git a/networkx/algorithms/tests/test_smallworld.py b/networkx/algorithms/tests/test_smallworld.py
index 42ede0e..d115dd9 100644
--- a/networkx/algorithms/tests/test_smallworld.py
+++ b/networkx/algorithms/tests/test_smallworld.py
@@ -68,3 +68,11 @@ def test_omega():
 
     for o in omegas:
         assert -1 <= o <= 1
+
+
+@pytest.mark.parametrize("f", (nx.random_reference, nx.lattice_reference))
+def test_graph_no_edges(f):
+    G = nx.Graph()
+    G.add_nodes_from([0, 1, 2, 3])
+    with pytest.raises(nx.NetworkXError, match="Graph has fewer that 2 edges"):
+        f(G)
diff --git a/networkx/algorithms/tests/test_swap.py b/networkx/algorithms/tests/test_swap.py
index 9982b95..49dd5f8 100644
--- a/networkx/algorithms/tests/test_swap.py
+++ b/networkx/algorithms/tests/test_swap.py
@@ -2,8 +2,26 @@ import pytest
 
 import networkx as nx
 
-# import random
-# random.seed(0)
+
+def test_directed_edge_swap():
+    graph = nx.path_graph(200, create_using=nx.DiGraph)
+    in_degrees = sorted((n, d) for n, d in graph.in_degree())
+    out_degrees = sorted((n, d) for n, d in graph.out_degree())
+    G = nx.directed_edge_swap(graph, nswap=40, max_tries=500, seed=1)
+    assert in_degrees == sorted((n, d) for n, d in G.in_degree())
+    assert out_degrees == sorted((n, d) for n, d in G.out_degree())
+
+
+def test_edge_cases_directed_edge_swap():
+    # Test cases where swaps are impossible: either too few edges exist, or self loops/cycles are unavoidable
+    # TODO: Rewrite function to explicitly check for impossible swaps and raise error
+    e = (
+        "Maximum number of swap attempts \\(11\\) exceeded "
+        "before desired swaps achieved \\(\\d\\)."
+    )
+    graph = nx.DiGraph([(0, 0), (0, 1), (1, 0), (2, 3), (3, 2)])
+    with pytest.raises(nx.NetworkXAlgorithmError, match=e):
+        nx.directed_edge_swap(graph, nswap=1, max_tries=10, seed=1)
 
 
 def test_double_edge_swap():
@@ -54,6 +72,31 @@ def test_connected_double_edge_swap_star_low_window_threshold():
     assert degrees == sorted(d for n, d in graph.degree())
 
 
+def test_directed_edge_swap_small():
+    with pytest.raises(nx.NetworkXError):
+        G = nx.directed_edge_swap(nx.path_graph(3, create_using=nx.DiGraph))
+
+
+def test_directed_edge_swap_tries():
+    with pytest.raises(nx.NetworkXError):
+        G = nx.directed_edge_swap(
+            nx.path_graph(3, create_using=nx.DiGraph), nswap=1, max_tries=0
+        )
+
+
+def test_directed_exception_undirected():
+    graph = nx.Graph([(0, 1), (2, 3)])
+    with pytest.raises(nx.NetworkXNotImplemented):
+        G = nx.directed_edge_swap(graph)
+
+
+def test_directed_edge_max_tries():
+    with pytest.raises(nx.NetworkXAlgorithmError):
+        G = nx.directed_edge_swap(
+            nx.complete_graph(4, nx.DiGraph()), nswap=1, max_tries=5
+        )
+
+
 def test_double_edge_swap_small():
     with pytest.raises(nx.NetworkXError):
         G = nx.double_edge_swap(nx.path_graph(3))
@@ -92,3 +135,22 @@ def test_degree_seq_c4():
     degrees = sorted(d for n, d in G.degree())
     G = nx.double_edge_swap(G, 1, 100)
     assert degrees == sorted(d for n, d in G.degree())
+
+
+def test_fewer_than_4_nodes():
+    G = nx.DiGraph()
+    G.add_nodes_from([0, 1, 2])
+    with pytest.raises(nx.NetworkXError, match=".*fewer than four nodes."):
+        nx.directed_edge_swap(G)
+
+
+def test_less_than_3_edges():
+    G = nx.DiGraph([(0, 1), (1, 2)])
+    G.add_nodes_from([3, 4])
+    with pytest.raises(nx.NetworkXError, match=".*fewer than 3 edges"):
+        nx.directed_edge_swap(G)
+
+    G = nx.Graph()
+    G.add_nodes_from([0, 1, 2, 3])
+    with pytest.raises(nx.NetworkXError, match=".*fewer than 2 edges"):
+        nx.double_edge_swap(G)
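(Editor's aside, not part of the diff.) The invariant these swap tests check, degree preservation, can be sketched directly; the graph and parameters here are illustrative:

```python
import networkx as nx

G = nx.barbell_graph(5, 3)
degrees_before = sorted(d for _, d in G.degree())

# Rewires pairs of edges in place; each node keeps its degree.
nx.double_edge_swap(G, nswap=10, max_tries=1000, seed=42)
degrees_after = sorted(d for _, d in G.degree())
```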
diff --git a/networkx/algorithms/threshold.py b/networkx/algorithms/threshold.py
index 5c50394..f462f26 100644
--- a/networkx/algorithms/threshold.py
+++ b/networkx/algorithms/threshold.py
@@ -42,13 +42,13 @@ def is_threshold_graph(G):
 
 def is_threshold_sequence(degree_sequence):
     """
-    Returns True if the sequence is a threshold degree seqeunce.
+    Returns True if the sequence is a threshold degree sequence.
 
     Uses the property that a threshold graph must be constructed by
     adding either dominating or isolated nodes. Thus, it can be
     deconstructed iteratively by removing a node of degree zero or a
     node that connects to the remaining nodes.  If this deconstruction
-    failes then the sequence is not a threshold sequence.
+    fails then the sequence is not a threshold sequence.
     """
     ds = degree_sequence[:]  # get a copy so we don't destroy original
     ds.sort()
@@ -95,7 +95,7 @@ def creation_sequence(degree_sequence, with_labels=False, compact=False):
         raise ValueError("compact sequences cannot be labeled")
 
     # make an indexed copy
-    if isinstance(degree_sequence, dict):  # labeled degree seqeunce
+    if isinstance(degree_sequence, dict):  # labeled degree sequence
         ds = [[degree, label] for (label, degree) in degree_sequence.items()]
     else:
         ds = [[d, i] for i, d in enumerate(degree_sequence)]
@@ -668,7 +668,7 @@ def betweenness_sequence(creation_sequence, normalized=True):
     cs = creation_sequence
     seq = []  # betweenness
     lastchar = "d"  # first node is always a 'd'
-    dr = float(cs.count("d"))  # number of d's to the right of curren pos
+    dr = float(cs.count("d"))  # number of d's to the right of current pos
     irun = 0  # number of i's in the last run
     drun = 0  # number of d's in the last run
     dlast = 0.0  # betweenness of last d
diff --git a/networkx/algorithms/tournament.py b/networkx/algorithms/tournament.py
index 278a1c4..ef1d8a0 100644
--- a/networkx/algorithms/tournament.py
+++ b/networkx/algorithms/tournament.py
@@ -61,6 +61,7 @@ def index_satisfying(iterable, condition):
         raise ValueError("iterable must be non-empty") from err
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 @not_implemented_for("multigraph")
 def is_tournament(G):
@@ -179,6 +180,7 @@ def random_tournament(n, seed=None):
     return nx.DiGraph(edges)
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 @not_implemented_for("multigraph")
 def score_sequence(G):
@@ -208,6 +210,7 @@ def score_sequence(G):
     return sorted(d for v, d in G.out_degree())
 
 
+@nx._dispatch
 @not_implemented_for("undirected")
 @not_implemented_for("multigraph")
 def tournament_matrix(G):
@@ -237,7 +240,7 @@ def tournament_matrix(G):
 
     Returns
     -------
-    SciPy sparse matrix
+    SciPy sparse array
         The tournament matrix of the tournament graph `G`.
 
     Raises
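(Editor's aside, not part of the diff.) As a reminder of what the tournament helpers decorated above compute, a toy 3-node tournament:

```python
import networkx as nx
from networkx.algorithms import tournament

# A tournament: exactly one directed edge between every pair of nodes.
G = nx.DiGraph([(0, 1), (0, 2), (1, 2)])

ok = tournament.is_tournament(G)
scores = tournament.score_sequence(G)  # sorted out-degrees
```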
diff --git a/networkx/algorithms/traversal/breadth_first_search.py b/networkx/algorithms/traversal/breadth_first_search.py
index 55ba1f0..300a6a9 100644
--- a/networkx/algorithms/traversal/breadth_first_search.py
+++ b/networkx/algorithms/traversal/breadth_first_search.py
@@ -88,6 +88,7 @@ def generic_bfs_edges(G, source, neighbors=None, depth_limit=None, sort_neighbor
             queue.popleft()
 
 
+@nx._dispatch
 def bfs_edges(G, source, reverse=False, depth_limit=None, sort_neighbors=None):
     """Iterate over edges in a breadth-first-search starting at source.
 
diff --git a/networkx/algorithms/traversal/depth_first_search.py b/networkx/algorithms/traversal/depth_first_search.py
index 0ccca4f..c250787 100644
--- a/networkx/algorithms/traversal/depth_first_search.py
+++ b/networkx/algorithms/traversal/depth_first_search.py
@@ -364,12 +364,15 @@ def dfs_labeled_edges(G, source=None, depth_limit=None):
     edges: generator
        A generator of triples of the form (*u*, *v*, *d*), where (*u*,
        *v*) is the edge being explored in the depth-first search and *d*
-       is one of the strings 'forward', 'nontree', or 'reverse'. A
-       'forward' edge is one in which *u* has been visited but *v* has
+       is one of the strings 'forward', 'nontree', 'reverse', or 'reverse-depth_limit'.
+       A 'forward' edge is one in which *u* has been visited but *v* has
        not. A 'nontree' edge is one in which both *u* and *v* have been
        visited but the edge is not in the DFS tree. A 'reverse' edge is
-       on in which both *u* and *v* have been visited and the edge is in
-       the DFS tree.
+       one in which both *u* and *v* have been visited and the edge is in
+       the DFS tree. When the `depth_limit` is reached via a 'forward' edge,
+       a 'reverse' edge is immediately generated rather than the subtree
+       being explored. To indicate this flavor of 'reverse' edge, the string
+       yielded is 'reverse-depth_limit'.
 
     Examples
     --------
@@ -436,6 +439,8 @@ def dfs_labeled_edges(G, source=None, depth_limit=None):
                     visited.add(child)
                     if depth_now > 1:
                         stack.append((child, depth_now - 1, iter(G[child])))
+                    else:
+                        yield parent, child, "reverse-depth_limit"
             except StopIteration:
                 stack.pop()
                 if stack:
diff --git a/networkx/algorithms/traversal/tests/test_beamsearch.py b/networkx/algorithms/traversal/tests/test_beamsearch.py
index 249cc2f..8945b41 100644
--- a/networkx/algorithms/traversal/tests/test_beamsearch.py
+++ b/networkx/algorithms/traversal/tests/test_beamsearch.py
@@ -25,3 +25,8 @@ class TestBeamSearch:
         G = nx.cycle_graph(4)
         edges = nx.bfs_beam_edges(G, 0, identity, width=2)
         assert list(edges) == [(0, 3), (0, 1), (3, 2)]
+
+    def test_width_none(self):
+        G = nx.cycle_graph(4)
+        edges = nx.bfs_beam_edges(G, 0, identity, width=None)
+        assert list(edges) == [(0, 3), (0, 1), (3, 2)]
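(Editor's aside, not part of the diff.) The beam-search tests above score neighbors with the identity function; a self-contained version of the same call:

```python
import networkx as nx

G = nx.cycle_graph(4)
# width limits how many highest-scoring neighbors survive at each level;
# scoring each node by its own value makes the traversal prefer larger labels.
edges = list(nx.bfs_beam_edges(G, 0, lambda v: v, width=2))
```

Per the tests above, this yields `[(0, 3), (0, 1), (3, 2)]`.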
diff --git a/networkx/algorithms/traversal/tests/test_dfs.py b/networkx/algorithms/traversal/tests/test_dfs.py
index 7652809..0eb698b 100644
--- a/networkx/algorithms/traversal/tests/test_dfs.py
+++ b/networkx/algorithms/traversal/tests/test_dfs.py
@@ -59,11 +59,43 @@ class TestDFS:
         edges = list(nx.dfs_labeled_edges(self.G, source=0))
         forward = [(u, v) for (u, v, d) in edges if d == "forward"]
         assert forward == [(0, 0), (0, 1), (1, 2), (2, 4), (1, 3)]
+        assert edges == [
+            (0, 0, "forward"),
+            (0, 1, "forward"),
+            (1, 0, "nontree"),
+            (1, 2, "forward"),
+            (2, 1, "nontree"),
+            (2, 4, "forward"),
+            (4, 2, "nontree"),
+            (4, 0, "nontree"),
+            (2, 4, "reverse"),
+            (1, 2, "reverse"),
+            (1, 3, "forward"),
+            (3, 1, "nontree"),
+            (3, 0, "nontree"),
+            (1, 3, "reverse"),
+            (0, 1, "reverse"),
+            (0, 3, "nontree"),
+            (0, 4, "nontree"),
+            (0, 0, "reverse"),
+        ]
 
     def test_dfs_labeled_disconnected_edges(self):
         edges = list(nx.dfs_labeled_edges(self.D))
         forward = [(u, v) for (u, v, d) in edges if d == "forward"]
         assert forward == [(0, 0), (0, 1), (2, 2), (2, 3)]
+        assert edges == [
+            (0, 0, "forward"),
+            (0, 1, "forward"),
+            (1, 0, "nontree"),
+            (0, 1, "reverse"),
+            (0, 0, "reverse"),
+            (2, 2, "forward"),
+            (2, 3, "forward"),
+            (3, 2, "nontree"),
+            (2, 3, "reverse"),
+            (2, 2, "reverse"),
+        ]
 
     def test_dfs_tree_isolates(self):
         G = nx.Graph()
@@ -141,12 +173,79 @@ class TestDepthLimitedSearch:
         edges = nx.dfs_edges(self.G, source=9, depth_limit=4)
         assert list(edges) == [(9, 8), (8, 7), (7, 2), (2, 1), (2, 3), (9, 10)]
 
-    def test_dls_labeled_edges(self):
+    def test_dls_labeled_edges_depth_1(self):
         edges = list(nx.dfs_labeled_edges(self.G, source=5, depth_limit=1))
         forward = [(u, v) for (u, v, d) in edges if d == "forward"]
         assert forward == [(5, 5), (5, 4), (5, 6)]
+        # Note: reverse-depth_limit edge types were not reported before gh-6240
+        assert edges == [
+            (5, 5, "forward"),
+            (5, 4, "forward"),
+            (5, 4, "reverse-depth_limit"),
+            (5, 6, "forward"),
+            (5, 6, "reverse-depth_limit"),
+            (5, 5, "reverse"),
+        ]
 
-    def test_dls_labeled_disconnected_edges(self):
+    def test_dls_labeled_edges_depth_2(self):
         edges = list(nx.dfs_labeled_edges(self.G, source=6, depth_limit=2))
         forward = [(u, v) for (u, v, d) in edges if d == "forward"]
         assert forward == [(6, 6), (6, 5), (5, 4)]
+        assert edges == [
+            (6, 6, "forward"),
+            (6, 5, "forward"),
+            (5, 4, "forward"),
+            (5, 4, "reverse-depth_limit"),
+            (5, 6, "nontree"),
+            (6, 5, "reverse"),
+            (6, 6, "reverse"),
+        ]
+
+    def test_dls_labeled_disconnected_edges(self):
+        edges = list(nx.dfs_labeled_edges(self.D, depth_limit=1))
+        assert edges == [
+            (0, 0, "forward"),
+            (0, 1, "forward"),
+            (0, 1, "reverse-depth_limit"),
+            (0, 0, "reverse"),
+            (2, 2, "forward"),
+            (2, 3, "forward"),
+            (2, 3, "reverse-depth_limit"),
+            (2, 7, "forward"),
+            (2, 7, "reverse-depth_limit"),
+            (2, 2, "reverse"),
+            (8, 8, "forward"),
+            (8, 7, "nontree"),
+            (8, 9, "forward"),
+            (8, 9, "reverse-depth_limit"),
+            (8, 8, "reverse"),
+            (10, 10, "forward"),
+            (10, 9, "nontree"),
+            (10, 10, "reverse"),
+        ]
+        # large depth_limit has no impact
+        edges = list(nx.dfs_labeled_edges(self.D, depth_limit=19))
+        assert edges == [
+            (0, 0, "forward"),
+            (0, 1, "forward"),
+            (1, 0, "nontree"),
+            (0, 1, "reverse"),
+            (0, 0, "reverse"),
+            (2, 2, "forward"),
+            (2, 3, "forward"),
+            (3, 2, "nontree"),
+            (2, 3, "reverse"),
+            (2, 7, "forward"),
+            (7, 2, "nontree"),
+            (7, 8, "forward"),
+            (8, 7, "nontree"),
+            (8, 9, "forward"),
+            (9, 8, "nontree"),
+            (9, 10, "forward"),
+            (10, 9, "nontree"),
+            (9, 10, "reverse"),
+            (8, 9, "reverse"),
+            (7, 8, "reverse"),
+            (2, 7, "reverse"),
+            (2, 2, "reverse"),
+        ]
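(Editor's aside, not part of the diff.) The edge labels asserted above come from `nx.dfs_labeled_edges`; a minimal sketch on a 3-node path:

```python
import networkx as nx

G = nx.path_graph(3)  # 0 - 1 - 2
# Each traversal event is tagged 'forward', 'nontree', or 'reverse'
# (plus 'reverse-depth_limit' when a depth_limit cuts the search short).
labels = list(nx.dfs_labeled_edges(G, source=0))
forward = [(u, v) for u, v, d in labels if d == "forward"]
```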
diff --git a/networkx/algorithms/tree/mst.py b/networkx/algorithms/tree/mst.py
index e2ff7c6..efb356f 100644
--- a/networkx/algorithms/tree/mst.py
+++ b/networkx/algorithms/tree/mst.py
@@ -847,10 +847,10 @@ def random_spanning_tree(G, weight=None, *, multiplicative=True, seed=None):
         Find the sum of weights of the spanning trees of `G` using the
         approioate `method`.
 
-        This is easy if the choosen method is 'multiplicative', since we can
+        This is easy if the chosen method is 'multiplicative', since we can
         use Kirchhoff's Tree Matrix Theorem directly. However, with the
         'additive' method, this process is slightly more complex and less
-        computatiionally efficent as we have to find the number of spanning
+        computationally efficient as we have to find the number of spanning
         trees which contain each possible edge in the graph.
 
         Parameters
@@ -882,7 +882,7 @@ def random_spanning_tree(G, weight=None, *, multiplicative=True, seed=None):
             #    the number of spanning trees which have to include that edge. This
             #    can be accomplished by contracting the edge and finding the
             #    multiplicative total spanning tree weight if the weight of each edge
-            #    is assumed to be 1, which is conviently built into networkx already,
+            #    is assumed to be 1, which is conveniently built into networkx already,
             #    by calling total_spanning_tree_weight with weight=None
             else:
                 total = 0
diff --git a/networkx/algorithms/triads.py b/networkx/algorithms/triads.py
index 1c107a1..eaacf22 100644
--- a/networkx/algorithms/triads.py
+++ b/networkx/algorithms/triads.py
@@ -6,10 +6,9 @@
 
 from collections import defaultdict
 from itertools import combinations, permutations
-from random import sample
 
 import networkx as nx
-from networkx.utils import not_implemented_for
+from networkx.utils import not_implemented_for, py_random_state
 
 __all__ = [
     "triadic_census",
@@ -149,6 +148,30 @@ def triadic_census(G, nodelist=None):
     census : dict
        Dictionary with triad type as keys and number of occurrences as values.
 
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 1), (3, 4), (4, 1), (4, 2)])
+    >>> triadic_census = nx.triadic_census(G)
+    >>> for key, value in triadic_census.items():
+    ...     print(f"{key}: {value}")
+    ...
+    003: 0
+    012: 0
+    102: 0
+    021D: 0
+    021U: 0
+    021C: 0
+    111D: 0
+    111U: 0
+    030T: 2
+    030C: 2
+    201: 0
+    120D: 0
+    120U: 0
+    120C: 0
+    210: 0
+    300: 0
+
     Notes
     -----
     This algorithm has complexity $O(m)$ where $m$ is the number of edges in
@@ -252,6 +275,7 @@ def triadic_census(G, nodelist=None):
     return census
 
 
+@nx._dispatch()
 def is_triad(G):
     """Returns True if the graph G is a triad, else False.
 
@@ -264,6 +288,15 @@ def is_triad(G):
     -------
     istriad : boolean
        Whether G is a valid triad
+
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 1)])
+    >>> nx.is_triad(G)
+    True
+    >>> G.add_edge(0, 1)
+    >>> nx.is_triad(G)
+    False
     """
     if isinstance(G, nx.Graph):
         if G.order() == 3 and nx.is_directed(G):
@@ -285,6 +318,13 @@ def all_triplets(G):
     -------
     triplets : generator of 3-tuples
        Generator of tuples of 3 nodes
+
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 4)])
+    >>> list(nx.all_triplets(G))
+    [(1, 2, 3), (1, 2, 4), (1, 3, 4), (2, 3, 4)]
+
     """
     triplets = combinations(G.nodes(), 3)
     return triplets
@@ -303,6 +343,17 @@ def all_triads(G):
     -------
     all_triads : generator of DiGraphs
        Generator of triads (order-3 DiGraphs)
+
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 1), (3, 4), (4, 1), (4, 2)])
+    >>> for triad in nx.all_triads(G):
+    ...     print(triad.edges)
+    [(1, 2), (2, 3), (3, 1)]
+    [(1, 2), (4, 1), (4, 2)]
+    [(3, 1), (3, 4), (4, 1)]
+    [(2, 3), (3, 4), (4, 2)]
+
     """
     triplets = combinations(G.nodes(), 3)
     for triplet in triplets:
@@ -312,6 +363,29 @@ def all_triads(G):
 @not_implemented_for("undirected")
 def triads_by_type(G):
     """Returns a list of all triads for each triad type in a directed graph.
+    There are exactly 16 possible triad types. Suppose 1, 2, 3 are three
+    nodes; they will be classified as a particular triad type if their connections
+    are as follows:
+
+    - 003: 1, 2, 3
+    - 012: 1 -> 2, 3
+    - 102: 1 <-> 2, 3
+    - 021D: 1 <- 2 -> 3
+    - 021U: 1 -> 2 <- 3
+    - 021C: 1 -> 2 -> 3
+    - 111D: 1 <-> 2 <- 3
+    - 111U: 1 <-> 2 -> 3
+    - 030T: 1 -> 2 -> 3, 1 -> 3
+    - 030C: 1 <- 2 <- 3, 1 -> 3
+    - 201: 1 <-> 2 <-> 3
+    - 120D: 1 <- 2 -> 3, 1 <-> 3
+    - 120U: 1 -> 2 <- 3, 1 <-> 3
+    - 120C: 1 -> 2 -> 3, 1 <-> 3
+    - 210: 1 -> 2 <-> 3, 1 <-> 3
+    - 300: 1 <-> 2 <-> 3, 1 <-> 3
+
+    Refer to the :doc:`example gallery <auto_examples/graph/plot_triad_types>`
+    for visual examples of the triad types.
 
     Parameters
     ----------
@@ -322,6 +396,21 @@ def triads_by_type(G):
     -------
     tri_by_type : dict
        Dictionary with triad types as keys and lists of triads as values.
+
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (1, 3), (2, 3), (3, 1), (5, 6), (5, 4), (6, 7)])
+    >>> triads = nx.triads_by_type(G)
+    >>> triads['120C'][0].edges()
+    OutEdgeView([(1, 2), (1, 3), (2, 3), (3, 1)])
+    >>> triads['012'][0].edges()
+    OutEdgeView([(1, 2)])
+
+    References
+    ----------
+    .. [1] Snijders, T. (2012). "Transitivity and triads." University of
+        Oxford.
+        https://web.archive.org/web/20170830032057/http://www.stats.ox.ac.uk/~snijders/Trans_Triads_ha.pdf
     """
     # num_triads = o * (o - 1) * (o - 2) // 6
     # if num_triads > TRIAD_LIMIT: print(WARNING)
@@ -347,6 +436,15 @@ def triad_type(G):
     triad_type : str
        A string identifying the triad type
 
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 1)])
+    >>> nx.triad_type(G)
+    '030C'
+    >>> G.add_edge(1, 3)
+    >>> nx.triad_type(G)
+    '120C'
+
     Notes
     -----
     There can be 6 unique edges in a triad (order-3 DiGraph) (so 2^6 = 64 unique
@@ -360,7 +458,7 @@ def triad_type(G):
 
     {m}     = number of mutual ties (takes 0, 1, 2, 3); a mutual tie is (0,1)
               AND (1,0)
-    {a}     = number of assymmetric ties (takes 0, 1, 2, 3); an assymmetric tie
+    {a}     = number of asymmetric ties (takes 0, 1, 2, 3); an asymmetric tie
               is (0,1) BUT NOT (1,0) or vice versa
     {n}     = number of null ties (takes 0, 1, 2, 3); a null tie is NEITHER
               (0,1) NOR (1,0)
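The {m}{a}{n} naming scheme described above can be checked with a short stand-alone sketch. `tie_counts` is a hypothetical helper, not part of networkx; it counts mutual, asymmetric, and null ties among three nodes given an edge list:

```python
from itertools import combinations

def tie_counts(edges, nodes=(1, 2, 3)):
    """Count (mutual, asymmetric, null) ties among the given nodes."""
    E = set(edges)
    m = a = n = 0
    for u, v in combinations(nodes, 2):
        uv, vu = (u, v) in E, (v, u) in E
        if uv and vu:
            m += 1  # mutual: both (u, v) and (v, u) present
        elif uv or vu:
            a += 1  # asymmetric: exactly one direction present
        else:
            n += 1  # null: neither direction present
    return m, a, n

# The 3-cycle 1 -> 2 -> 3 -> 1 has three asymmetric ties: an '030' triad.
print(tie_counts([(1, 2), (2, 3), (3, 1)]))  # (0, 3, 0)
```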
@@ -422,20 +520,32 @@ def triad_type(G):
 
 
 @not_implemented_for("undirected")
-def random_triad(G):
+@py_random_state(1)
+def random_triad(G, seed=None):
     """Returns a random triad from a directed graph.
 
     Parameters
     ----------
     G : digraph
        A NetworkX DiGraph
+    seed : integer, random_state, or None (default)
+        Indicator of random number generation state.
+        See :ref:`Randomness<randomness>`.
 
     Returns
     -------
     G2 : subgraph
        A randomly selected triad (order-3 NetworkX DiGraph)
+
+    Examples
+    --------
+    >>> G = nx.DiGraph([(1, 2), (1, 3), (2, 3), (3, 1), (5, 6), (5, 4), (6, 7)])
+    >>> triad = nx.random_triad(G, seed=1)
+    >>> triad.edges
+    OutEdgeView([(1, 2)])
+
     """
-    nodes = sample(list(G.nodes()), 3)
+    nodes = seed.sample(list(G.nodes()), 3)
     G2 = G.subgraph(nodes)
     return G2
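The new `seed` argument replaces the module-level `sample` call with `seed.sample`, where the `py_random_state` decorator turns an integer seed into a `random.Random` instance. A stdlib-only sketch of why this makes the draw reproducible (the function name here is illustrative, not networkx API):

```python
import random

def sample_three(nodes, seed=None):
    # py_random_state does roughly this for integer seeds.
    rng = random.Random(seed)
    return rng.sample(list(nodes), 3)

# The same seed always yields the same set of three nodes.
print(sample_three(range(10), seed=1) == sample_three(range(10), seed=1))  # True
```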
 
diff --git a/networkx/classes/__init__.py b/networkx/classes/__init__.py
index d5bb1d7..cc534cd 100644
--- a/networkx/classes/__init__.py
+++ b/networkx/classes/__init__.py
@@ -2,7 +2,7 @@ from .graph import Graph
 from .digraph import DiGraph
 from .multigraph import MultiGraph
 from .multidigraph import MultiDiGraph
-from .ordered import *
+from .backends import _dispatch
 
 from .function import *
 
@@ -11,3 +11,4 @@ from networkx.classes import filters
 from networkx.classes import coreviews
 from networkx.classes import graphviews
 from networkx.classes import reportviews
+from networkx.classes import backends
diff --git a/networkx/classes/backends.py b/networkx/classes/backends.py
new file mode 100644
index 0000000..cb0957f
--- /dev/null
+++ b/networkx/classes/backends.py
@@ -0,0 +1,225 @@
+"""
+Code to support various backends in a plugin dispatch architecture.
+
+Create a Dispatcher
+-------------------
+
+To be a valid plugin, a package must register an entry_point
+of `networkx.plugins` with a key pointing to the handler.
+
+For example:
+
+```
+entry_points={'networkx.plugins': 'sparse = networkx_plugin_sparse'}
+```
+
+The plugin must create a Graph-like object which contains an attribute
+`__networkx_plugin__` with a value of the entry point name.
+
+Continuing the example above:
+
+```
+class WrappedSparse:
+    __networkx_plugin__ = "sparse"
+    ...
+```
+
+When a dispatchable NetworkX algorithm encounters a Graph-like object
+with a `__networkx_plugin__` attribute, it will look for the associated
+dispatch object in the entry_points, load it, and dispatch the work to it.
+
+
+Testing
+-------
+To assist in validating the backend algorithm implementations, if an
+environment variable `NETWORKX_GRAPH_CONVERT` is set to a registered
+plugin key, the dispatch machinery will automatically convert regular
+networkx Graphs and DiGraphs to the backend equivalent by calling
+`<backend dispatcher>.convert_from_nx(G, weight=weight, name=name)`.
+
+The converted object is then passed to the backend implementation of
+the algorithm. The result is then passed to
+`<backend dispatcher>.convert_to_nx(result, name=name)` to convert back
+to a form expected by the NetworkX tests.
+
+By defining `convert_from_nx` and `convert_to_nx` methods and setting
+the environment variable, NetworkX will automatically route tests on
+dispatchable algorithms to the backend, allowing the full networkx test
+suite to be run against the backend implementation.
+
+Example pytest invocation:
+NETWORKX_GRAPH_CONVERT=sparse pytest --pyargs networkx
+
+Dispatchable algorithms which are not implemented by the backend
+will cause a `pytest.xfail()`, giving some indication that not all
+tests are working, while avoiding causing an explicit failure.
+
+A special `on_start_tests(items)` function may be defined by the backend.
+It will be called with the list of NetworkX tests discovered. Each item
+is a pytest.Node object. If the backend does not support the test, that
+test can be marked as xfail.
+"""
+import functools
+import inspect
+import os
+import sys
+from importlib.metadata import entry_points
+
+from ..exception import NetworkXNotImplemented
+
+__all__ = ["_dispatch", "_mark_tests"]
+
+
+class PluginInfo:
+    """Lazily loaded entry_points plugin information"""
+
+    def __init__(self):
+        self._items = None
+
+    def __bool__(self):
+        return len(self.items) > 0
+
+    @property
+    def items(self):
+        if self._items is None:
+            if sys.version_info < (3, 10):
+                self._items = entry_points()["networkx.plugins"]
+            else:
+                self._items = entry_points(group="networkx.plugins")
+        return self._items
+
+    def __contains__(self, name):
+        if sys.version_info < (3, 10):
+            return len([ep for ep in self.items if ep.name == name]) > 0
+        return name in self.items.names
+
+    def __getitem__(self, name):
+        if sys.version_info < (3, 10):
+            return [ep for ep in self.items if ep.name == name][0]
+        return self.items[name]
+
+
+plugins = PluginInfo()
+_registered_algorithms = {}
+
+
+def _register_algo(name, wrapped_func):
+    if name in _registered_algorithms:
+        raise KeyError(f"Algorithm already exists in dispatch registry: {name}")
+    _registered_algorithms[name] = wrapped_func
+    wrapped_func.dispatchname = name
+
+
+def _dispatch(func=None, *, name=None):
+    """Dispatches to a backend algorithm
+    when the first argument is a backend graph-like object.
+    """
+    # Allow any of the following decorator forms:
+    #  - @_dispatch
+    #  - @_dispatch()
+    #  - @_dispatch("override_name")
+    #  - @_dispatch(name="override_name")
+    if func is None:
+        if name is None:
+            return _dispatch
+        return functools.partial(_dispatch, name=name)
+    if isinstance(func, str):
+        return functools.partial(_dispatch, name=func)
+    # If name not provided, use the name of the function
+    if name is None:
+        name = func.__name__
+
+    @functools.wraps(func)
+    def wrapper(*args, **kwds):
+        graph = args[0]
+        if hasattr(graph, "__networkx_plugin__") and plugins:
+            plugin_name = graph.__networkx_plugin__
+            if plugin_name in plugins:
+                backend = plugins[plugin_name].load()
+                if hasattr(backend, name):
+                    return getattr(backend, name).__call__(*args, **kwds)
+                else:
+                    raise NetworkXNotImplemented(
+                        f"'{name}' not implemented by {plugin_name}"
+                    )
+        return func(*args, **kwds)
+
+    _register_algo(name, wrapper)
+    return wrapper
+
+
+def test_override_dispatch(func=None, *, name=None):
+    """Auto-converts the first argument into the backend equivalent,
+    causing the dispatching mechanism to trigger for every
+    decorated algorithm."""
+    if func is None:
+        if name is None:
+            return test_override_dispatch
+        return functools.partial(test_override_dispatch, name=name)
+    if isinstance(func, str):
+        return functools.partial(test_override_dispatch, name=func)
+    # If name not provided, use the name of the function
+    if name is None:
+        name = func.__name__
+
+    sig = inspect.signature(func)
+
+    @functools.wraps(func)
+    def wrapper(*args, **kwds):
+        backend = plugins[plugin_name].load()
+        if not hasattr(backend, name):
+            pytest.xfail(f"'{name}' not implemented by {plugin_name}")
+        bound = sig.bind(*args, **kwds)
+        bound.apply_defaults()
+        graph, *args = args
+        # Convert graph into backend graph-like object
+        #   Include the weight label, if provided to the algorithm
+        weight = None
+        if "weight" in bound.arguments:
+            weight = bound.arguments["weight"]
+        elif "data" in bound.arguments and "default" in bound.arguments:
+            # This case exists for several MultiGraph edge algorithms
+            if isinstance(bound.arguments["data"], str):
+                weight = bound.arguments["data"]
+            elif bound.arguments["data"]:
+                weight = "weight"
+        graph = backend.convert_from_nx(graph, weight=weight, name=name)
+        result = getattr(backend, name).__call__(graph, *args, **kwds)
+        return backend.convert_to_nx(result, name=name)
+
+    _register_algo(name, wrapper)
+    return wrapper
+
+
+# Check for auto-convert testing
+# This allows existing NetworkX tests to be run against a backend
+# implementation without any changes to the testing code. The only
+# required change is to set an environment variable prior to running
+# pytest.
+if os.environ.get("NETWORKX_GRAPH_CONVERT"):
+    plugin_name = os.environ["NETWORKX_GRAPH_CONVERT"]
+    if not plugins:
+        raise Exception("No registered networkx.plugins entry_points")
+    if plugin_name not in plugins:
+        raise Exception(
+            f"No registered networkx.plugins entry_point named {plugin_name}"
+        )
+
+    try:
+        import pytest
+    except ImportError:
+        raise ImportError(
+            "Missing pytest, which is required when using NETWORKX_GRAPH_CONVERT"
+        )
+
+    # Override `dispatch` for testing
+    _dispatch = test_override_dispatch
+
+
+def _mark_tests(items):
+    """Allow backend to mark tests (skip or xfail) if they aren't able to correctly handle them"""
+    if os.environ.get("NETWORKX_GRAPH_CONVERT"):
+        plugin_name = os.environ["NETWORKX_GRAPH_CONVERT"]
+        backend = plugins[plugin_name].load()
+        if hasattr(backend, "on_start_tests"):
+            getattr(backend, "on_start_tests")(items)
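The heart of `_dispatch` above is an attribute check plus a registry lookup. Here is a self-contained sketch of the same pattern with a hard-coded registry standing in for the `networkx.plugins` entry-point machinery (all names here are made up for illustration):

```python
import functools

# Stand-in registry; the real code loads backends from entry points.
_backends = {"fake": {"pagerank": lambda g: "backend result"}}

def dispatch(func):
    @functools.wraps(func)
    def wrapper(graph, *args, **kwds):
        plugin = getattr(graph, "__networkx_plugin__", None)
        impl = _backends.get(plugin, {}).get(func.__name__)
        if impl is not None:
            return impl(graph, *args, **kwds)  # hand off to the backend
        return func(graph, *args, **kwds)      # fall back to the default
    return wrapper

@dispatch
def pagerank(graph):
    return "networkx result"

class FakeBackendGraph:
    __networkx_plugin__ = "fake"

print(pagerank(FakeBackendGraph()))  # backend result
print(pagerank(object()))            # networkx result
```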
diff --git a/networkx/classes/coreviews.py b/networkx/classes/coreviews.py
index 6c5b8a4..c2b8355 100644
--- a/networkx/classes/coreviews.py
+++ b/networkx/classes/coreviews.py
@@ -2,7 +2,6 @@
 These ``Views`` often restrict element access, with either the entire view or
 layers of nested mappings being read-only.
 """
-import warnings
 from collections.abc import Mapping
 
 __all__ = [
@@ -286,25 +285,6 @@ class FilterAtlas(Mapping):  # nodedict, nbrdict, keydict
             return self._atlas[key]
         raise KeyError(f"Key {key} not found")
 
-    # FIXME should this just be removed? we don't use it, but someone might
-    def copy(self):
-        warnings.warn(
-            (
-                "FilterAtlas.copy is deprecated.\n"
-                "It will be removed in NetworkX 3.0.\n"
-                "Please open an Issue on https://github.com/networkx/networkx/issues\n"
-                "if you use this feature. We think that no one does use it."
-            ),
-            DeprecationWarning,
-        )
-        try:  # check that NODE_OK has attr 'nodes'
-            node_ok_shorter = 2 * len(self.NODE_OK.nodes) < len(self._atlas)
-        except AttributeError:
-            node_ok_shorter = False
-        if node_ok_shorter:
-            return {u: self._atlas[u] for u in self.NODE_OK.nodes if u in self._atlas}
-        return {u: d for u, d in self._atlas.items() if self.NODE_OK(u)}
-
     def __str__(self):
         return str({nbr: self[nbr] for nbr in self})
 
@@ -339,38 +319,6 @@ class FilterAdjacency(Mapping):  # edgedict
             return FilterAtlas(self._atlas[node], new_node_ok)
         raise KeyError(f"Key {node} not found")
 
-    # FIXME should this just be removed? we don't use it, but someone might
-    def copy(self):
-        warnings.warn(
-            (
-                "FilterAdjacency.copy is deprecated.\n"
-                "It will be removed in NetworkX 3.0.\n"
-                "Please open an Issue on https://github.com/networkx/networkx/issues\n"
-                "if you use this feature. We think that no one does use it."
-            ),
-            DeprecationWarning,
-        )
-        try:  # check that NODE_OK has attr 'nodes'
-            node_ok_shorter = 2 * len(self.NODE_OK.nodes) < len(self._atlas)
-        except AttributeError:
-            node_ok_shorter = False
-        if node_ok_shorter:
-            return {
-                u: {
-                    v: d
-                    for v, d in self._atlas[u].items()
-                    if self.NODE_OK(v)
-                    if self.EDGE_OK(u, v)
-                }
-                for u in self.NODE_OK.nodes
-                if u in self._atlas
-            }
-        return {
-            u: {v: d for v, d in nbrs.items() if self.NODE_OK(v) if self.EDGE_OK(u, v)}
-            for u, nbrs in self._atlas.items()
-            if self.NODE_OK(u)
-        }
-
     def __str__(self):
         return str({nbr: self[nbr] for nbr in self})
 
@@ -407,33 +355,6 @@ class FilterMultiInner(FilterAdjacency):  # muliedge_seconddict
             return FilterAtlas(self._atlas[nbr], new_node_ok)
         raise KeyError(f"Key {nbr} not found")
 
-    # FIXME should this just be removed? we don't use it, but someone might
-    def copy(self):
-        warnings.warn(
-            (
-                "FilterMultiInner.copy is deprecated.\n"
-                "It will be removed in NetworkX 3.0.\n"
-                "Please open an Issue on https://github.com/networkx/networkx/issues\n"
-                "if you use this feature. We think that no one does use it."
-            ),
-            DeprecationWarning,
-        )
-        try:  # check that NODE_OK has attr 'nodes'
-            node_ok_shorter = 2 * len(self.NODE_OK.nodes) < len(self._atlas)
-        except AttributeError:
-            node_ok_shorter = False
-        if node_ok_shorter:
-            return {
-                v: {k: d for k, d in self._atlas[v].items() if self.EDGE_OK(v, k)}
-                for v in self.NODE_OK.nodes
-                if v in self._atlas
-            }
-        return {
-            v: {k: d for k, d in nbrs.items() if self.EDGE_OK(v, k)}
-            for v, nbrs in self._atlas.items()
-            if self.NODE_OK(v)
-        }
-
 
 class FilterMultiAdjacency(FilterAdjacency):  # multiedgedict
     def __getitem__(self, node):
@@ -444,39 +365,3 @@ class FilterMultiAdjacency(FilterAdjacency):  # multiedgedict
 
             return FilterMultiInner(self._atlas[node], self.NODE_OK, edge_ok)
         raise KeyError(f"Key {node} not found")
-
-    # FIXME should this just be removed? we don't use it, but someone might
-    def copy(self):
-        warnings.warn(
-            (
-                "FilterMultiAdjacency.copy is deprecated.\n"
-                "It will be removed in NetworkX 3.0.\n"
-                "Please open an Issue on https://github.com/networkx/networkx/issues\n"
-                "if you use this feature. We think that no one does use it."
-            ),
-            DeprecationWarning,
-        )
-        try:  # check that NODE_OK has attr 'nodes'
-            node_ok_shorter = 2 * len(self.NODE_OK.nodes) < len(self._atlas)
-        except AttributeError:
-            node_ok_shorter = False
-        if node_ok_shorter:
-            my_nodes = self.NODE_OK.nodes
-            return {
-                u: {
-                    v: {k: d for k, d in kd.items() if self.EDGE_OK(u, v, k)}
-                    for v, kd in self._atlas[u].items()
-                    if v in my_nodes
-                }
-                for u in my_nodes
-                if u in self._atlas
-            }
-        return {
-            u: {
-                v: {k: d for k, d in kd.items() if self.EDGE_OK(u, v, k)}
-                for v, kd in nbrs.items()
-                if self.NODE_OK(v)
-            }
-            for u, nbrs in self._atlas.items()
-            if self.NODE_OK(u)
-        }
diff --git a/networkx/classes/digraph.py b/networkx/classes/digraph.py
index 9528a15..8e1f752 100644
--- a/networkx/classes/digraph.py
+++ b/networkx/classes/digraph.py
@@ -99,7 +99,6 @@ class DiGraph(Graph):
     Graph
     MultiGraph
     MultiDiGraph
-    OrderedDiGraph
 
     Examples
     --------
@@ -308,11 +307,6 @@ class DiGraph(Graph):
     >>> G.add_edge(2, 2)
     >>> G[2][1] is G[2][2]
     True
-
-
-    Please see :mod:`~networkx.classes.ordered` for more examples of
-    creating graph subclasses by overwriting the base class `dict` with
-    a dictionary-like object.
     """
 
     _adj = _CachedPropertyResetterAdjAndSucc()  # type: ignore
@@ -329,7 +323,7 @@ class DiGraph(Graph):
             graph is created.  The data can be an edge list, or any
             NetworkX graph object.  If the corresponding optional Python
             packages are installed the data can also be a 2D NumPy array, a
-            SciPy sparse matrix, or a PyGraphviz graph.
+            SciPy sparse array, or a PyGraphviz graph.
 
         attr : keyword arguments, optional (default= no attributes)
             Attributes to add to graph as key=value pairs.
@@ -491,6 +485,16 @@ class DiGraph(Graph):
         --------
         add_node
 
+        Notes
+        -----
+        When adding nodes from an iterator over the graph you are changing,
+        a `RuntimeError` can be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_nodes)`, and pass this
+        object to `G.add_nodes_from`.
+
         Examples
         --------
         >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -515,6 +519,13 @@ class DiGraph(Graph):
         >>> H.nodes[1]["size"]
         11
 
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.DiGraph([(0, 1), (1, 2), (3, 4)])
+        >>> # wrong way - will raise RuntimeError
+        >>> # G.add_nodes_from(n + 1 for n in G.nodes)
+        >>> # correct way
+        >>> G.add_nodes_from(list(n + 1 for n in G.nodes))
         """
         for n in nodes_for_adding:
             try:
@@ -588,6 +599,16 @@ class DiGraph(Graph):
         --------
         remove_node
 
+        Notes
+        -----
+        When removing nodes from an iterator over the graph you are changing,
+        a `RuntimeError` will be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_nodes)`, and pass this
+        object to `G.remove_nodes_from`.
+
         Examples
         --------
         >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -598,6 +619,13 @@ class DiGraph(Graph):
         >>> list(G.nodes)
         []
 
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.DiGraph([(0, 1), (1, 2), (3, 4)])
+        >>> # this command will fail, as the graph's dict is modified during iteration
+        >>> # G.remove_nodes_from(n for n in G.nodes if n < 2)
+        >>> # this command will work, since the dictionary underlying the graph is not modified
+        >>> G.remove_nodes_from(list(n for n in G.nodes if n < 2))
         """
         for n in nodes:
             try:
@@ -708,6 +736,14 @@ class DiGraph(Graph):
         Edge attributes specified in an ebunch take precedence over
         attributes specified via keyword arguments.
 
+        When adding edges from an iterator over the graph you are changing,
+        a `RuntimeError` can be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_edges)`, and pass this
+        object to `G.add_edges_from`.
+
         Examples
         --------
         >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -719,6 +755,15 @@ class DiGraph(Graph):
 
         >>> G.add_edges_from([(1, 2), (2, 3)], weight=3)
         >>> G.add_edges_from([(3, 4), (1, 4)], label="WN2898")
+
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.DiGraph([(1, 2), (2, 3), (3, 4)])
+        >>> # Grow graph by one new node, adding edges to all existing nodes.
+        >>> # wrong way - will raise RuntimeError
+        >>> # G.add_edges_from(((5, n) for n in G.nodes))
+        >>> # correct way - note that there will be no self-edge for node 5
+        >>> G.add_edges_from(list((5, n) for n in G.nodes))
         """
         for e in ebunch_to_add:
             ne = len(e)
@@ -955,7 +1000,7 @@ class DiGraph(Graph):
 
     @cached_property
     def in_edges(self):
-        """An InEdgeView of the Graph as G.in_edges or G.in_edges().
+        """A view of the in edges of the graph as G.in_edges or G.in_edges().
 
         in_edges(self, nbunch=None, data=False, default=None):
 
@@ -973,11 +1018,20 @@ class DiGraph(Graph):
 
         Returns
         -------
-        in_edges : InEdgeView
+        in_edges : InEdgeView or InEdgeDataView
             A view of edge attributes, usually it iterates over (u, v)
             or (u, v, d) tuples of edges, but can also be used for
             attribute lookup as `edges[u, v]['foo']`.
 
+        Examples
+        --------
+        >>> G = nx.DiGraph()
+        >>> G.add_edge(1, 2, color='blue')
+        >>> G.in_edges()
+        InEdgeView([(1, 2)])
+        >>> G.in_edges(nbunch=2)
+        InEdgeDataView([(1, 2)])
+
         See Also
         --------
         edges
diff --git a/networkx/classes/function.py b/networkx/classes/function.py
index 3750fd1..194ab1f 100644
--- a/networkx/classes/function.py
+++ b/networkx/classes/function.py
@@ -18,7 +18,6 @@ __all__ = [
     "number_of_edges",
     "density",
     "is_directed",
-    "info",
     "freeze",
     "is_frozen",
     "subgraph",
@@ -202,6 +201,7 @@ def freeze(G):
     G.remove_edge = frozen
     G.remove_edges_from = frozen
     G.clear = frozen
+    G.clear_edges = frozen
     G.frozen = True
     return G
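The change above closes a gap: `clear_edges` existed as a mutator but `freeze` did not yet stub it out. The pattern is ordinary instance-attribute shadowing; a minimal sketch with a toy class (not the real `Graph`, and raising `ValueError` rather than `NetworkXError`):

```python
def frozen(*args, **kwargs):
    raise ValueError("Frozen graph can't be modified")

class ToyGraph:
    def __init__(self):
        self.edges = set()

    def add_edge(self, u, v):
        self.edges.add((u, v))

g = ToyGraph()
g.add_edge(1, 2)
# Shadow every mutator on the instance, as freeze() does.
g.add_edge = g.clear_edges = frozen
try:
    g.add_edge(2, 3)
except ValueError as err:
    print(err)  # Frozen graph can't be modified
```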
 
@@ -552,51 +552,6 @@ def create_empty_copy(G, with_data=True):
     return H
 
 
-def info(G, n=None):
-    """Return a summary of information for the graph G or a single node n.
-
-    The summary includes the number of nodes and edges, or neighbours for a single
-    node.
-
-    Parameters
-    ----------
-    G : Networkx graph
-       A graph
-    n : node (any hashable)
-       A node in the graph G
-
-    Returns
-    -------
-    info : str
-        A string containing the short summary
-
-    Raises
-    ------
-    NetworkXError
-        If n is not in the graph G
-
-    .. deprecated:: 2.7
-       ``info`` is deprecated and will be removed in NetworkX 3.0.
-    """
-    import warnings
-
-    warnings.warn(
-        ("info is deprecated and will be removed in version 3.0.\n"),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    if n is None:
-        return str(G)
-    if n not in G:
-        raise nx.NetworkXError(f"node {n} not in graph")
-    info = ""  # append this all to a string
-    info += f"Node {n} has the following properties:\n"
-    info += f"Degree: {G.degree(n)}\n"
-    info += "Neighbors: "
-    info += " ".join(str(nbr) for nbr in G.neighbors(n))
-    return info
-
-
 def set_node_attributes(G, values, name=None):
     """Sets node attributes from a given value or dictionary of values.
 
diff --git a/networkx/classes/graph.py b/networkx/classes/graph.py
index 6edc506..70e2b25 100644
--- a/networkx/classes/graph.py
+++ b/networkx/classes/graph.py
@@ -95,7 +95,6 @@ class Graph:
     DiGraph
     MultiGraph
     MultiDiGraph
-    OrderedGraph
 
     Examples
     --------
@@ -302,10 +301,6 @@ class Graph:
     >>> G.add_edge(2, 2)
     >>> G[2][1] is G[2][2]
     True
-
-    Please see :mod:`~networkx.classes.ordered` for more examples of
-    creating graph subclasses by overwriting the base class `dict` with
-    a dictionary-like object.
     """
 
     _adj = _CachedPropertyResetterAdj()
@@ -344,7 +339,7 @@ class Graph:
             graph is created.  The data can be an edge list, or any
             NetworkX graph object.  If the corresponding optional Python
             packages are installed the data can also be a 2D NumPy array, a
-            SciPy sparse matrix, or a PyGraphviz graph.
+            SciPy sparse array, or a PyGraphviz graph.
 
         attr : keyword arguments, optional (default= no attributes)
             Attributes to add to graph as key=value pairs.
@@ -415,7 +410,8 @@ class Graph:
         Returns
         -------
         info : string
-            Graph information as provided by `nx.info`
+            Graph information including the graph name (if any), graph type, and the
+            number of nodes and edges.
 
         Examples
         --------
@@ -583,6 +579,16 @@ class Graph:
         --------
         add_node
 
+        Notes
+        -----
+        When adding nodes from an iterator over the graph you are changing,
+        a `RuntimeError` can be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_nodes)`, and pass this
+        object to `G.add_nodes_from`.
+
         Examples
         --------
         >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -607,6 +613,13 @@ class Graph:
         >>> H.nodes[1]["size"]
         11
 
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.Graph([(0, 1), (1, 2), (3, 4)])
+        >>> # wrong way - will raise RuntimeError
+        >>> # G.add_nodes_from(n + 1 for n in G.nodes)
+        >>> # correct way
+        >>> G.add_nodes_from(list(n + 1 for n in G.nodes))
         """
         for n in nodes_for_adding:
             try:
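The `RuntimeError` described in the new notes is ordinary `dict` behavior and easy to reproduce without networkx:

```python
d = {0: None, 1: None}
try:
    for k in d:
        d[k + 10] = None  # mutating the dict mid-iteration
except RuntimeError as err:
    print(err)  # dictionary changed size during iteration

# Materializing the iterator first sidesteps the problem.
for k in list(d):
    d[k + 10] = None
```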
@@ -678,6 +691,16 @@ class Graph:
         --------
         remove_node
 
+        Notes
+        -----
+        When removing nodes from an iterator over the graph you are changing,
+        a `RuntimeError` will be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_nodes)`, and pass this
+        object to `G.remove_nodes_from`.
+
         Examples
         --------
         >>> G = nx.path_graph(3)  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -688,6 +711,13 @@ class Graph:
         >>> list(G.nodes)
         []
 
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.Graph([(0, 1), (1, 2), (3, 4)])
+        >>> # this command will fail, as the graph's dict is modified during iteration
+        >>> # G.remove_nodes_from(n for n in G.nodes if n < 2)
+        >>> # this command will work, since the dictionary underlying the graph is not modified
+        >>> G.remove_nodes_from(list(n for n in G.nodes if n < 2))
         """
         adj = self._adj
         for n in nodes:
@@ -954,6 +984,14 @@ class Graph:
         Edge attributes specified in an ebunch take precedence over
         attributes specified via keyword arguments.
 
+        When adding edges from an iterator over the graph you are changing,
+        a `RuntimeError` can be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_edges)`, and pass this
+        object to `G.add_edges_from`.
+
         Examples
         --------
         >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -965,6 +1003,15 @@ class Graph:
 
         >>> G.add_edges_from([(1, 2), (2, 3)], weight=3)
         >>> G.add_edges_from([(3, 4), (1, 4)], label="WN2898")
+
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.Graph([(1, 2), (2, 3), (3, 4)])
+        >>> # Grow graph by one new node, adding edges to all existing nodes.
+        >>> # wrong way - will raise RuntimeError
+        >>> # G.add_edges_from(((5, n) for n in G.nodes))
+        >>> # correct way - note that there will be no self-edge for node 5
+        >>> G.add_edges_from(list((5, n) for n in G.nodes))
         """
         for e in ebunch_to_add:
             ne = len(e)
@@ -1016,10 +1063,28 @@ class Graph:
         the edge data. For MultiGraph/MultiDiGraph, duplicate edges
         are stored.
 
+        When adding edges from an iterator over the graph you are changing,
+        a `RuntimeError` can be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_edges)`, and pass this
+        object to `G.add_weighted_edges_from`.
+
         Examples
         --------
         >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
         >>> G.add_weighted_edges_from([(0, 1, 3.0), (1, 2, 7.5)])
+
+        Evaluate an iterator over edges before passing it
+
+        >>> G = nx.Graph([(1, 2), (2, 3), (3, 4)])
+        >>> weight = 0.1
+        >>> # Grow graph by one new node, adding edges to all existing nodes.
+        >>> # wrong way - will raise RuntimeError
+        >>> # G.add_weighted_edges_from(((5, n, weight) for n in G.nodes))
+        >>> # correct way - note that there will be no self-edge for node 5
+        >>> G.add_weighted_edges_from(list((5, n, weight) for n in G.nodes))
         """
         self.add_edges_from(((u, v, {weight: d}) for u, v, d in ebunch_to_add), **attr)
 
diff --git a/networkx/classes/graphviews.py b/networkx/classes/graphviews.py
index dcb7836..5e3fa81 100644
--- a/networkx/classes/graphviews.py
+++ b/networkx/classes/graphviews.py
@@ -8,7 +8,7 @@ a graph to reverse directed edges, or treat a directed graph
 as undirected, etc. This module provides those graph views.
 
 The resulting views are essentially read-only graphs that
-report data from the orignal graph object. We provide an
+report data from the original graph object. We provide an
 attribute G._graph which points to the underlying graph object.
 
 Note: Since graphviews look like graphs, one can end up with
diff --git a/networkx/classes/multidigraph.py b/networkx/classes/multidigraph.py
index e118dc2..603f0b2 100644
--- a/networkx/classes/multidigraph.py
+++ b/networkx/classes/multidigraph.py
@@ -63,7 +63,6 @@ class MultiDiGraph(MultiGraph, DiGraph):
     Graph
     DiGraph
     MultiGraph
-    OrderedMultiDiGraph
 
     Examples
     --------
@@ -271,9 +270,26 @@ class MultiDiGraph(MultiGraph, DiGraph):
         Class to create a new graph structure in the `to_undirected` method.
         If `None`, a NetworkX class (Graph or MultiGraph) is used.
 
-    Please see :mod:`~networkx.classes.ordered` for examples of
-    creating graph subclasses by overwriting the base class `dict` with
-    a dictionary-like object.
+    **Subclassing Example**
+
+    Create a low memory graph class that effectively disallows edge
+    attributes by using a single attribute dict for all edges.
+    This reduces the memory used, but you lose edge attributes.
+
+    >>> class ThinGraph(nx.Graph):
+    ...     all_edge_dict = {"weight": 1}
+    ...
+    ...     def single_edge_dict(self):
+    ...         return self.all_edge_dict
+    ...
+    ...     edge_attr_dict_factory = single_edge_dict
+    >>> G = ThinGraph()
+    >>> G.add_edge(2, 1)
+    >>> G[2][1]
+    {'weight': 1}
+    >>> G.add_edge(2, 2)
+    >>> G[2][1] is G[2][2]
+    True
     """
 
     # node_dict_factory = dict    # already assigned in Graph
@@ -292,7 +308,7 @@ class MultiDiGraph(MultiGraph, DiGraph):
             an empty graph is created.  The data can be an edge list, or any
             NetworkX graph object.  If the corresponding optional Python
             packages are installed the data can also be a 2D NumPy array, a
-            SciPy sparse matrix, or a PyGraphviz graph.
+            SciPy sparse array, or a PyGraphviz graph.
 
         multigraph_input : bool or None (default None)
             Note: Only used when `incoming_graph_data` is a dict.
@@ -658,7 +674,7 @@ class MultiDiGraph(MultiGraph, DiGraph):
 
     @cached_property
     def in_edges(self):
-        """An InMultiEdgeView of the Graph as G.in_edges or G.in_edges().
+        """A view of the in edges of the graph as G.in_edges or G.in_edges().
 
         in_edges(self, nbunch=None, data=False, keys=False, default=None)
 
@@ -679,7 +695,7 @@ class MultiDiGraph(MultiGraph, DiGraph):
 
         Returns
         -------
-        in_edges : InMultiEdgeView
+        in_edges : InMultiEdgeView or InMultiEdgeDataView
             A view of edge attributes, usually it iterates over (u, v)
             or (u, v, k) or (u, v, k, d) tuples of edges, but can also be
             used for attribute lookup as `edges[u, v, k]['foo']`.
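The new ThinGraph docstring example above shows a single shared attribute dict for all edges. The memory trade-off can be illustrated without networkx: the sketch below (a hypothetical `ThinAdjacency` toy class, not the real `Graph` machinery) stores one class-level dict that every edge entry points at, mirroring what overriding `edge_attr_dict_factory` achieves:

```python
# Minimal sketch (no networkx required) of the ThinGraph idea from the
# docstring above: every edge shares ONE attribute dict, so per-edge
# attribute storage costs nothing -- but a write through one edge is
# visible through every edge.

class ThinAdjacency:
    """Toy adjacency store where all edges share a single attr dict."""

    all_edge_dict = {"weight": 1}  # one dict object for every edge

    def __init__(self):
        self.adj = {}

    def add_edge(self, u, v):
        # Both directions point at the SAME shared dict object.
        self.adj.setdefault(u, {})[v] = self.all_edge_dict
        self.adj.setdefault(v, {})[u] = self.all_edge_dict

G = ThinAdjacency()
G.add_edge(2, 1)
G.add_edge(2, 2)
print(G.adj[2][1])                 # {'weight': 1}
print(G.adj[2][1] is G.adj[2][2])  # True -- identity, not just equality
```

The `is` check is the whole point: identity (not equality) is what proves no per-edge dict was allocated, and also why edge attributes are effectively read-only in such a subclass.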
diff --git a/networkx/classes/multigraph.py b/networkx/classes/multigraph.py
index 3332201..a91e012 100644
--- a/networkx/classes/multigraph.py
+++ b/networkx/classes/multigraph.py
@@ -36,7 +36,7 @@ class MultiGraph(Graph):
         graph is created.  The data can be any format that is supported
         by the to_networkx_graph() function, currently including edge list,
         dict of dicts, dict of lists, NetworkX graph, 2D NumPy array,
-        SciPy sparse matrix, or PyGraphviz graph.
+        SciPy sparse array, or PyGraphviz graph.
 
     multigraph_input : bool or None (default None)
         Note: Only used when `incoming_graph_data` is a dict.
@@ -59,7 +59,6 @@ class MultiGraph(Graph):
     Graph
     DiGraph
     MultiDiGraph
-    OrderedMultiGraph
 
     Examples
     --------
@@ -264,9 +263,26 @@ class MultiGraph(Graph):
         Class to create a new graph structure in the `to_undirected` method.
         If `None`, a NetworkX class (Graph or MultiGraph) is used.
 
-    Please see :mod:`~networkx.classes.ordered` for examples of
-    creating graph subclasses by overwriting the base class `dict` with
-    a dictionary-like object.
+    **Subclassing Example**
+
+    Create a low memory graph class that effectively disallows edge
+    attributes by using a single attribute dict for all edges.
+    This reduces the memory used, but you lose edge attributes.
+
+    >>> class ThinGraph(nx.Graph):
+    ...     all_edge_dict = {"weight": 1}
+    ...
+    ...     def single_edge_dict(self):
+    ...         return self.all_edge_dict
+    ...
+    ...     edge_attr_dict_factory = single_edge_dict
+    >>> G = ThinGraph()
+    >>> G.add_edge(2, 1)
+    >>> G[2][1]
+    {'weight': 1}
+    >>> G.add_edge(2, 2)
+    >>> G[2][1] is G[2][2]
+    True
     """
 
     # node_dict_factory = dict    # already assigned in Graph
@@ -301,7 +317,7 @@ class MultiGraph(Graph):
             an empty graph is created.  The data can be an edge list, or any
             NetworkX graph object.  If the corresponding optional Python
             packages are installed the data can also be a 2D NumPy array, a
-            SciPy sparse matrix, or a PyGraphviz graph.
+            SciPy sparse array, or a PyGraphviz graph.
 
         multigraph_input : bool or None (default None)
             Note: Only used when `incoming_graph_data` is a dict.
@@ -546,6 +562,14 @@ class MultiGraph(Graph):
         This method can be overridden by subclassing the base class and
         providing a custom ``new_edge_key()`` method.
 
+        When adding edges from an iterator over the graph you are changing,
+        a `RuntimeError` can be raised with message:
+        `RuntimeError: dictionary changed size during iteration`. This
+        happens when the graph's underlying dictionary is modified during
+        iteration. To avoid this error, evaluate the iterator into a separate
+        object, e.g. by using `list(iterator_of_edges)`, and pass this
+        object to `G.add_edges_from`.
+
         Examples
         --------
         >>> G = nx.Graph()  # or DiGraph, MultiGraph, MultiDiGraph, etc
@@ -557,6 +581,15 @@ class MultiGraph(Graph):
 
         >>> G.add_edges_from([(1, 2), (2, 3)], weight=3)
         >>> G.add_edges_from([(3, 4), (1, 4)], label="WN2898")
+
+        Evaluate an iterator over a graph if using it to modify the same graph
+
+        >>> G = nx.MultiGraph([(1, 2), (2, 3), (3, 4)])
+        >>> # Grow graph by one new node, adding edges to all existing nodes.
+        >>> # wrong way - will raise RuntimeError
+        >>> # G.add_edges_from(((5, n) for n in G.nodes))
+        >>> # correct way - note that there will be no self-edge for node 5
+        >>> assigned_keys = G.add_edges_from(list((5, n) for n in G.nodes))
         """
         keylist = []
         for e in ebunch_to_add:
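The MultiGraph docstring above mentions that edge keys come from an overridable `new_edge_key()` hook and that `add_edges_from` returns the assigned keys. A rough sketch of that keying scheme with plain dicts (hypothetical `add_edge`/`new_edge_key` helpers, not the real networkx implementation) shows how parallel edges get distinct integer keys:

```python
# Sketch of MultiGraph-style keyed edge storage using plain dicts:
# node -> neighbor -> key -> attribute dict.

adj = {}

def new_edge_key(u, v):
    # Start from the current edge count and skip any key already taken,
    # mirroring the default integer-key strategy described above.
    keys = adj.get(u, {}).get(v, {})
    key = len(keys)
    while key in keys:
        key += 1
    return key

def add_edge(u, v, **attrs):
    key = new_edge_key(u, v)
    # Store the same attr dict under both directions (undirected).
    adj.setdefault(u, {}).setdefault(v, {})[key] = attrs
    adj.setdefault(v, {}).setdefault(u, {})[key] = attrs
    return key

print(add_edge(1, 2, weight=3))  # 0
print(add_edge(1, 2, weight=7))  # 1 -- parallel edge gets a fresh key
print(adj[1][2])                 # {0: {'weight': 3}, 1: {'weight': 7}}
```

Subclasses that need stable or semantic keys override `new_edge_key()`, which is exactly the customization point the docstring calls out.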
diff --git a/networkx/classes/ordered.py b/networkx/classes/ordered.py
deleted file mode 100644
index ca82d12..0000000
--- a/networkx/classes/ordered.py
+++ /dev/null
@@ -1,162 +0,0 @@
-"""
-
-.. deprecated:: 2.6
-
-   The ordered variants of graph classes in this module are deprecated and
-   will be removed in version 3.0.
-
-Consistently ordered variants of the default base classes.
-Note that if you are using Python 3.6+, you shouldn't need these classes
-because the dicts in Python 3.6+ are ordered.
-Note also that there are many differing expectations for the word "ordered"
-and that these classes may not provide the order you expect.
-The intent here is to give a consistent order not a particular order.
-
-The Ordered (Di/Multi/MultiDi) Graphs give a consistent order for reporting of
-nodes and edges.  The order of node reporting agrees with node adding, but for
-edges, the order is not necessarily the order that the edges were added.
-
-In general, you should use the default (i.e., unordered) graph classes.
-However, there are times (e.g., when testing) when you may need the
-order preserved.
-
-Special care is required when using subgraphs of the Ordered classes.
-The order of nodes in the subclass is not necessarily the same order
-as the original class.  In general it is probably better to avoid using
-subgraphs and replace with code similar to:
-
-.. code-block:: python
-
-    # instead of SG = G.subgraph(ordered_nodes)
-    SG = nx.OrderedGraph()
-    SG.add_nodes_from(ordered_nodes)
-    SG.add_edges_from((u, v) for (u, v) in G.edges() if u in SG if v in SG)
-
-"""
-import warnings
-from collections import OrderedDict
-
-from .digraph import DiGraph
-from .graph import Graph
-from .multidigraph import MultiDiGraph
-from .multigraph import MultiGraph
-
-__all__ = []
-
-__all__.extend(
-    ["OrderedGraph", "OrderedDiGraph", "OrderedMultiGraph", "OrderedMultiDiGraph"]
-)
-
-
-class OrderedGraph(Graph):
-    """Consistently ordered variant of :class:`~networkx.Graph`.
-
-    .. deprecated:: 2.6
-
-       OrderedGraph is deprecated and will be removed in version 3.0.
-       Use `Graph` instead, which guarantees order is preserved for
-       Python >= 3.7
-    """
-
-    node_dict_factory = OrderedDict
-    adjlist_outer_dict_factory = OrderedDict
-    adjlist_inner_dict_factory = OrderedDict
-    edge_attr_dict_factory = OrderedDict
-
-    def __init__(self, incoming_graph_data=None, **attr):
-        warnings.warn(
-            (
-                "OrderedGraph is deprecated and will be removed in version 3.0.\n"
-                "Use `Graph` instead, which guarantees order is preserved for\n"
-                "Python >= 3.7\n"
-            ),
-            DeprecationWarning,
-            stacklevel=2,
-        )
-        super().__init__(incoming_graph_data, **attr)
-
-
-class OrderedDiGraph(DiGraph):
-    """Consistently ordered variant of :class:`~networkx.DiGraph`.
-
-    .. deprecated:: 2.6
-
-       OrderedDiGraph is deprecated and will be removed in version 3.0.
-       Use `DiGraph` instead, which guarantees order is preserved for
-       Python >= 3.7
-    """
-
-    node_dict_factory = OrderedDict
-    adjlist_outer_dict_factory = OrderedDict
-    adjlist_inner_dict_factory = OrderedDict
-    edge_attr_dict_factory = OrderedDict
-
-    def __init__(self, incoming_graph_data=None, **attr):
-        warnings.warn(
-            (
-                "OrderedDiGraph is deprecated and will be removed in version 3.0.\n"
-                "Use `DiGraph` instead, which guarantees order is preserved for\n"
-                "Python >= 3.7\n"
-            ),
-            DeprecationWarning,
-            stacklevel=2,
-        )
-        super().__init__(incoming_graph_data, **attr)
-
-
-class OrderedMultiGraph(MultiGraph):
-    """Consistently ordered variant of :class:`~networkx.MultiGraph`.
-
-    .. deprecated:: 2.6
-
-       OrderedMultiGraph is deprecated and will be removed in version 3.0.
-       Use `MultiGraph` instead, which guarantees order is preserved for
-       Python >= 3.7
-    """
-
-    node_dict_factory = OrderedDict
-    adjlist_outer_dict_factory = OrderedDict
-    adjlist_inner_dict_factory = OrderedDict
-    edge_key_dict_factory = OrderedDict
-    edge_attr_dict_factory = OrderedDict
-
-    def __init__(self, incoming_graph_data=None, **attr):
-        warnings.warn(
-            (
-                "OrderedMultiGraph is deprecated and will be removed in version 3.0.\n"
-                "Use `MultiGraph` instead, which guarantees order is preserved for\n"
-                "Python >= 3.7\n"
-            ),
-            DeprecationWarning,
-            stacklevel=2,
-        )
-        super().__init__(incoming_graph_data, **attr)
-
-
-class OrderedMultiDiGraph(MultiDiGraph):
-    """Consistently ordered variant of :class:`~networkx.MultiDiGraph`.
-
-    .. deprecated:: 2.6
-
-       OrderedMultiDiGraph is deprecated and will be removed in version 3.0.
-       Use `MultiDiGraph` instead, which guarantees order is preserved for
-       Python >= 3.7
-    """
-
-    node_dict_factory = OrderedDict
-    adjlist_outer_dict_factory = OrderedDict
-    adjlist_inner_dict_factory = OrderedDict
-    edge_key_dict_factory = OrderedDict
-    edge_attr_dict_factory = OrderedDict
-
-    def __init__(self, incoming_graph_data=None, **attr):
-        warnings.warn(
-            (
-                "OrderedMultiDiGraph is deprecated and will be removed in version 3.0.\n"
-                "Use `MultiDiGraph` instead, which guarantees order is preserved for\n"
-                "Python >= 3.7\n"
-            ),
-            DeprecationWarning,
-            stacklevel=2,
-        )
-        super().__init__(incoming_graph_data, **attr)
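The deletion of `networkx/classes/ordered.py` above rests on a language guarantee: since Python 3.7, the built-in `dict` preserves insertion order, which is the only property the `OrderedDict`-backed factories were supplying. A quick stdlib check:

```python
from collections import OrderedDict

# Since Python 3.7, insertion order of the built-in dict is guaranteed,
# so the Ordered* graph classes (which swapped in OrderedDict factories)
# no longer add anything.
d = {}
for node in [3, 1, 2]:
    d[node] = {}

od = OrderedDict([(3, {}), (1, {}), (2, {})])

print(list(d))                # [3, 1, 2] -- insertion order kept
print(list(d) == list(od))    # True -- same iteration order as OrderedDict
```

This is why the removal is safe for node ordering; note the old module's own caveat still applies in spirit: edge iteration order is consistent, not necessarily the order edges were added.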
diff --git a/networkx/classes/tests/test_coreviews.py b/networkx/classes/tests/test_coreviews.py
index fdea00f..f773b85 100644
--- a/networkx/classes/tests/test_coreviews.py
+++ b/networkx/classes/tests/test_coreviews.py
@@ -360,72 +360,3 @@ class TestFilteredGraphs:
             assert RG.adj[2].copy() == RG.adj[2]
             assert RsG.adj.copy() == RsG.adj
             assert RsG.adj[2].copy() == RsG.adj[2]
-
-    def test_filtered_copy(self):
-        # TODO: This function can be removed when filtered.copy()
-        # deprecation expires
-        SubGraph = nx.graphviews.subgraph_view
-        for Graph in self.Graphs:
-            G = nx.path_graph(4, Graph)
-            SG = G.subgraph([2, 3])
-            RG = SubGraph(G, nx.filters.hide_nodes([0, 1]))
-            RsG = SubGraph(G, nx.filters.show_nodes([2, 3]))
-            # test FilterAtlas & co in these subgraphs
-            assert SG._node.copy() == SG._node
-            assert SG.adj._atlas.copy() == SG.adj._atlas
-            assert SG.adj[2]._atlas.copy() == SG.adj[2]._atlas
-            assert SG.adj[2]._atlas[3].copy() == SG.adj[2]._atlas[3]
-            assert RG.adj._atlas.copy() == RG.adj._atlas
-            assert RG.adj[2]._atlas.copy() == RG.adj[2]._atlas
-            assert RG.adj[2]._atlas[3].copy() == RG.adj[2]._atlas[3]
-            assert RG._node.copy() == RG._node
-            assert RsG.adj._atlas.copy() == RsG.adj._atlas
-            assert RsG.adj[2]._atlas.copy() == RsG.adj[2]._atlas
-            assert RsG.adj[2]._atlas[3].copy() == RsG.adj[2]._atlas[3]
-            assert RsG._node.copy() == RsG._node
-            # test MultiFilterInner
-            if G.is_multigraph():
-                assert SG.adj[2]._atlas[3][0].copy() == SG.adj[2]._atlas[3][0]
-                assert RG.adj[2]._atlas[3][0].copy() == RG.adj[2]._atlas[3][0]
-                assert RsG.adj[2]._atlas[3][0].copy() == RsG.adj[2]._atlas[3][0]
-
-            # test deprecation
-            # FilterAtlas.copy()
-            pytest.deprecated_call(SG._node.copy)
-            # FilterAdjacency.copy()
-            pytest.deprecated_call(SG.adj._atlas.copy)
-            # FilterMultiAdjacency.copy()
-            if G.is_multigraph():
-                pytest.deprecated_call(SG.adj._atlas.copy)
-            # FilterMultiInner.copy()
-            if G.is_multigraph():
-                pytest.deprecated_call(SG.adj[2]._atlas.copy)
-
-            SSG = SG.subgraph([2])
-            assert list(SSG) == [2]
-
-            # check case when node_ok is small
-            G = nx.complete_graph(9, Graph)
-            SG = G.subgraph([2, 3])
-            RG = SubGraph(G, nx.filters.hide_nodes([0, 1]))
-            RsG = SubGraph(G, nx.filters.show_nodes([2, 3, 4, 5, 6, 7, 8]))
-            assert SG.adj._atlas.copy() == SG.adj._atlas
-            assert SG.adj[2]._atlas.copy() == SG.adj[2]._atlas
-            assert SG.adj[2]._atlas[3].copy() == SG.adj[2]._atlas[3]
-            assert SG._node.copy() == SG._node
-            assert RG.adj._atlas.copy() == RG.adj._atlas
-            assert RG.adj[2]._atlas.copy() == RG.adj[2]._atlas
-            assert RG.adj[2]._atlas[3].copy() == RG.adj[2]._atlas[3]
-            assert RG._node.copy() == RG._node
-            assert RsG.adj._atlas.copy() == RsG.adj._atlas
-            assert RsG.adj[2]._atlas.copy() == RsG.adj[2]._atlas
-            assert RsG.adj[2]._atlas[3].copy() == RsG.adj[2]._atlas[3]
-            assert RsG._node.copy() == RsG._node
-            # test MultiFilterInner
-            if G.is_multigraph():
-                assert SG.adj[2][3]._atlas.copy() == SG.adj[2][3]._atlas
-                assert RG.adj[2][3]._atlas.copy() == RG.adj[2][3]._atlas
-                assert RsG.adj[2][3]._atlas.copy() == RsG.adj[2][3]._atlas
-
-            SSG = SG.subgraph([2])
-            assert list(SSG) == [2]
diff --git a/networkx/classes/tests/test_function.py b/networkx/classes/tests/test_function.py
index c738ab5..8d534ad 100644
--- a/networkx/classes/tests/test_function.py
+++ b/networkx/classes/tests/test_function.py
@@ -252,6 +252,7 @@ class TestFunction:
         pytest.raises(nx.NetworkXError, G.add_edges_from, [(1, 2)])
         pytest.raises(nx.NetworkXError, G.remove_edge, 1, 2)
         pytest.raises(nx.NetworkXError, G.remove_edges_from, [(1, 2)])
+        pytest.raises(nx.NetworkXError, G.clear_edges)
         pytest.raises(nx.NetworkXError, G.clear)
 
     def test_is_frozen(self):
@@ -260,37 +261,17 @@ class TestFunction:
         assert G.frozen == nx.is_frozen(self.G)
         assert G.frozen
 
-    def test_info(self):
-        G = nx.path_graph(5)
-        G.name = "path_graph(5)"
-        info = nx.info(G)
-        expected_graph_info = "Graph named 'path_graph(5)' with 5 nodes and 4 edges"
-        assert info == expected_graph_info
-
-        info = nx.info(G, n=1)
-        assert type(info) == str
-        expected_node_info = "\n".join(
-            ["Node 1 has the following properties:", "Degree: 2", "Neighbors: 0 2"]
-        )
-        assert info == expected_node_info
-
-        # must raise an error for a non-existent node
-        pytest.raises(nx.NetworkXError, nx.info, G, 1248)
-
-    def test_info_digraph(self):
-        G = nx.DiGraph(name="path_graph(5)")
-        nx.add_path(G, [0, 1, 2, 3, 4])
-        info = nx.info(G)
-        expected_graph_info = "DiGraph named 'path_graph(5)' with 5 nodes and 4 edges"
-        assert info == expected_graph_info
-
-        info = nx.info(G, n=1)
-        expected_node_info = "\n".join(
-            ["Node 1 has the following properties:", "Degree: 2", "Neighbors: 2"]
-        )
-        assert info == expected_node_info
-
-        pytest.raises(nx.NetworkXError, nx.info, G, n=-1)
+    def test_node_attributes_are_still_mutable_on_frozen_graph(self):
+        G = nx.freeze(nx.path_graph(3))
+        node = G.nodes[0]
+        node["node_attribute"] = True
+        assert node["node_attribute"] == True
+
+    def test_edge_attributes_are_still_mutable_on_frozen_graph(self):
+        G = nx.freeze(nx.path_graph(3))
+        edge = G.edges[(0, 1)]
+        edge["edge_attribute"] = True
+        assert edge["edge_attribute"] == True
 
     def test_neighbors_complete_graph(self):
         graph = nx.complete_graph(100)
diff --git a/networkx/classes/tests/test_graphviews.py b/networkx/classes/tests/test_graphviews.py
index d17a424..8fe80ac 100644
--- a/networkx/classes/tests/test_graphviews.py
+++ b/networkx/classes/tests/test_graphviews.py
@@ -255,7 +255,7 @@ class TestChainsOfViews:
 
     def test_subgraph_copy(self):
         for origG in self.graphs:
-            G = nx.OrderedGraph(origG)
+            G = nx.Graph(origG)
             SG = G.subgraph([4, 5, 6])
             H = SG.copy()
             assert type(G) == type(H)
@@ -330,10 +330,10 @@ class TestChainsOfViews:
         assert not hasattr(DCSG, "_graph")  # not a view
 
     def test_copy_of_view(self):
-        G = nx.OrderedMultiGraph(self.MGv)
-        assert G.__class__.__name__ == "OrderedMultiGraph"
+        G = nx.MultiGraph(self.MGv)
+        assert G.__class__.__name__ == "MultiGraph"
         G = G.copy(as_view=True)
-        assert G.__class__.__name__ == "OrderedMultiGraph"
+        assert G.__class__.__name__ == "MultiGraph"
 
     def test_subclass(self):
         class MyGraph(nx.DiGraph):
diff --git a/networkx/classes/tests/test_ordered.py b/networkx/classes/tests/test_ordered.py
deleted file mode 100644
index f29ecb4..0000000
--- a/networkx/classes/tests/test_ordered.py
+++ /dev/null
@@ -1,40 +0,0 @@
-import networkx as nx
-
-
-class TestOrdered:
-    # Just test instantiation.
-    def test_graph(self):
-        G = nx.OrderedGraph()
-
-    def test_digraph(self):
-        G = nx.OrderedDiGraph()
-
-    def test_multigraph(self):
-        G = nx.OrderedMultiGraph()
-
-    def test_multidigraph(self):
-        G = nx.OrderedMultiDiGraph()
-
-
-class TestOrderedFeatures:
-    @classmethod
-    def setup_class(cls):
-        cls.G = nx.OrderedDiGraph()
-        cls.G.add_nodes_from([1, 2, 3])
-        cls.G.add_edges_from([(2, 3), (1, 3)])
-
-    def test_subgraph_order(self):
-        G = self.G
-        G_sub = G.subgraph([1, 2, 3])
-        assert list(G.nodes) == list(G_sub.nodes)
-        assert list(G.edges) == list(G_sub.edges)
-        assert list(G.pred[3]) == list(G_sub.pred[3])
-        assert [2, 1] == list(G_sub.pred[3])
-        assert [] == list(G_sub.succ[3])
-
-        G_sub = nx.induced_subgraph(G, [1, 2, 3])
-        assert list(G.nodes) == list(G_sub.nodes)
-        assert list(G.edges) == list(G_sub.edges)
-        assert list(G.pred[3]) == list(G_sub.pred[3])
-        assert [2, 1] == list(G_sub.pred[3])
-        assert [] == list(G_sub.succ[3])
diff --git a/networkx/classes/tests/test_special.py b/networkx/classes/tests/test_special.py
index fbeb5f8..1fa7960 100644
--- a/networkx/classes/tests/test_special.py
+++ b/networkx/classes/tests/test_special.py
@@ -1,5 +1,3 @@
-from collections import OrderedDict
-
 import networkx as nx
 
 from .test_digraph import BaseDiGraphTester
@@ -58,19 +56,6 @@ class TestSpecialGraph(_TestGraph):
         self.Graph = nx.Graph
 
 
-class TestOrderedGraph(_TestGraph):
-    def setup_method(self):
-        _TestGraph.setup_method(self)
-
-        class MyGraph(nx.Graph):
-            node_dict_factory = OrderedDict
-            adjlist_outer_dict_factory = OrderedDict
-            adjlist_inner_dict_factory = OrderedDict
-            edge_attr_dict_factory = OrderedDict
-
-        self.Graph = MyGraph
-
-
 class TestThinGraph(BaseGraphTester):
     def setup_method(self):
         all_edge_dict = {"weight": 1}
@@ -99,19 +84,6 @@ class TestSpecialDiGraph(_TestDiGraph):
         self.Graph = nx.DiGraph
 
 
-class TestOrderedDiGraph(_TestDiGraph):
-    def setup_method(self):
-        _TestDiGraph.setup_method(self)
-
-        class MyGraph(nx.DiGraph):
-            node_dict_factory = OrderedDict
-            adjlist_outer_dict_factory = OrderedDict
-            adjlist_inner_dict_factory = OrderedDict
-            edge_attr_dict_factory = OrderedDict
-
-        self.Graph = MyGraph
-
-
 class TestThinDiGraph(BaseDiGraphTester):
     def setup_method(self):
         all_edge_dict = {"weight": 1}
@@ -153,35 +125,7 @@ class TestSpecialMultiGraph(_TestMultiGraph):
         self.Graph = nx.MultiGraph
 
 
-class TestOrderedMultiGraph(_TestMultiGraph):
-    def setup_method(self):
-        _TestMultiGraph.setup_method(self)
-
-        class MyGraph(nx.MultiGraph):
-            node_dict_factory = OrderedDict
-            adjlist_outer_dict_factory = OrderedDict
-            adjlist_inner_dict_factory = OrderedDict
-            edge_key_dict_factory = OrderedDict
-            edge_attr_dict_factory = OrderedDict
-
-        self.Graph = MyGraph
-
-
 class TestSpecialMultiDiGraph(_TestMultiDiGraph):
     def setup_method(self):
         _TestMultiDiGraph.setup_method(self)
         self.Graph = nx.MultiDiGraph
-
-
-class TestOrderedMultiDiGraph(_TestMultiDiGraph):
-    def setup_method(self):
-        _TestMultiDiGraph.setup_method(self)
-
-        class MyGraph(nx.MultiDiGraph):
-            node_dict_factory = OrderedDict
-            adjlist_outer_dict_factory = OrderedDict
-            adjlist_inner_dict_factory = OrderedDict
-            edge_key_dict_factory = OrderedDict
-            edge_attr_dict_factory = OrderedDict
-
-        self.Graph = MyGraph
diff --git a/networkx/conftest.py b/networkx/conftest.py
index 0306530..4c5c185 100644
--- a/networkx/conftest.py
+++ b/networkx/conftest.py
@@ -30,6 +30,10 @@ def pytest_configure(config):
 
 
 def pytest_collection_modifyitems(config, items):
+    # Allow pluggable backends to add markers to tests when
+    # running in auto-conversion test mode
+    networkx.classes.backends._mark_tests(items)
+
     if config.getoption("--runslow"):
         # --runslow given in cli: do not skip slow tests
         return
@@ -42,15 +46,6 @@ def pytest_collection_modifyitems(config, items):
 # TODO: The warnings below need to be dealt with, but for now we silence them.
 @pytest.fixture(autouse=True)
 def set_warnings():
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="k_nearest_neighbors"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="numeric_mixing_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message=r"Ordered.* is deprecated"
-    )
     warnings.filterwarnings(
         "ignore",
         category=DeprecationWarning,
@@ -61,187 +56,12 @@ def set_warnings():
         category=DeprecationWarning,
         message="literal_destringizer is deprecated",
     )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="is_string_like is deprecated"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="\nauthority_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="\nhub_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="default_opener is deprecated"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="empty_generator is deprecated"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="make_str is deprecated"
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=DeprecationWarning,
-        message="generate_unique_node is deprecated",
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=DeprecationWarning,
-        message="context manager reversed is deprecated",
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=DeprecationWarning,
-        message="This will return a generator in 3.0*",
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="betweenness_centrality_source"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="edge_betweeness"
-    )
-    warnings.filterwarnings(
-        "ignore", category=PendingDeprecationWarning, message="the matrix subclass"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="to_numpy_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="from_numpy_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="networkx.pagerank_numpy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="networkx.pagerank_scipy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="write_gpickle"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="read_gpickle"
-    )
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="write_shp")
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="read_shp")
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="edges_from_line"
-    )
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="write_yaml")
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="read_yaml")
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="FilterAtlas.copy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="FilterAdjacency.copy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="FilterMultiAdjacency.copy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="FilterMultiInner.copy"
-    )
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="jit_data")
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="jit_graph")
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="consume")
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="iterable is deprecated"
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=FutureWarning,
-        message="\nThe function signature for cytoscape",
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="\nThe `attrs` keyword"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="preserve_random_state"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="`almost_equal`"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="`assert_nodes_equal`"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="`assert_edges_equal`"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="`assert_graphs_equal`"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="networkx.hits_scipy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="networkx.hits_numpy"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="preserve_random_state"
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=FutureWarning,
-        message="google_matrix will return an np.ndarray instead of a np.matrix",
-    )
-    ### Future warnings from scipy.sparse array transition
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="biadjacency_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="bethe_hessian_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="incidence_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="laplacian_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="normalized_laplacian_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="directed_laplacian_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=FutureWarning,
-        message="directed_combinatorial_laplacian_matrix",
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="modularity_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="directed_modularity_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore", category=FutureWarning, message="adjacency_matrix"
-    )
-    warnings.filterwarnings(
-        "ignore",
-        category=DeprecationWarning,
-        message="\n\nThe scipy.sparse array containers",
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="networkx.project"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="\nfind_cores"
-    )
-    warnings.filterwarnings("ignore", category=FutureWarning, message="attr_matrix")
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message=r"\n\nmake_small_.*"
-    )
-    warnings.filterwarnings(
-        "ignore", category=DeprecationWarning, message="to_numpy_recarray"
-    )
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="info")
-    warnings.filterwarnings("ignore", category=DeprecationWarning, message="to_tuple")
     # create_using for scale_free_graph
     warnings.filterwarnings(
         "ignore", category=DeprecationWarning, message="The create_using argument"
     )
     warnings.filterwarnings(
-        "ignore", category=PendingDeprecationWarning, message="nx.nx_pydot"
+        "ignore", category=DeprecationWarning, message="nx.nx_pydot"
     )
     warnings.filterwarnings(
         "ignore",
@@ -292,13 +112,6 @@ try:
 except ImportError:
     has_pygraphviz = False
 
-try:
-    import yaml
-
-    has_yaml = True
-except ImportError:
-    has_yaml = False
-
 try:
     import pydot
 
@@ -306,13 +119,6 @@ try:
 except ImportError:
     has_pydot = False
 
-try:
-    import ogr
-
-    has_ogr = True
-except ImportError:
-    has_ogr = False
-
 try:
     import sympy
 
@@ -328,12 +134,13 @@ collect_ignore = []
 needs_numpy = [
     "algorithms/approximation/traveling_salesman.py",
     "algorithms/centrality/current_flow_closeness.py",
-    "algorithms/node_classification/__init__.py",
+    "algorithms/node_classification.py",
     "algorithms/non_randomness.py",
     "algorithms/shortest_paths/dense.py",
     "linalg/bethehessianmatrix.py",
     "linalg/laplacianmatrix.py",
     "utils/misc.py",
+    "algorithms/centrality/laplacian.py",
 ]
 needs_scipy = [
     "algorithms/approximation/traveling_salesman.py",
@@ -351,9 +158,7 @@ needs_scipy = [
     "algorithms/communicability_alg.py",
     "algorithms/link_analysis/hits_alg.py",
     "algorithms/link_analysis/pagerank_alg.py",
-    "algorithms/node_classification/__init__.py",
-    "algorithms/node_classification/hmn.py",
-    "algorithms/node_classification/lgc.py",
+    "algorithms/node_classification.py",
     "algorithms/similarity.py",
     "convert_matrix.py",
     "drawing/layout.py",
@@ -365,13 +170,12 @@ needs_scipy = [
     "linalg/modularitymatrix.py",
     "linalg/spectrum.py",
     "utils/rcm.py",
+    "algorithms/centrality/laplacian.py",
 ]
 needs_matplotlib = ["drawing/nx_pylab.py"]
 needs_pandas = ["convert_matrix.py"]
-needs_yaml = ["readwrite/nx_yaml.py"]
 needs_pygraphviz = ["drawing/nx_agraph.py"]
 needs_pydot = ["drawing/nx_pydot.py"]
-needs_ogr = ["readwrite/nx_shp.py"]
 needs_sympy = ["algorithms/polynomials.py"]
 
 if not has_numpy:
@@ -382,13 +186,9 @@ if not has_matplotlib:
     collect_ignore += needs_matplotlib
 if not has_pandas:
     collect_ignore += needs_pandas
-if not has_yaml:
-    collect_ignore += needs_yaml
 if not has_pygraphviz:
     collect_ignore += needs_pygraphviz
 if not has_pydot:
     collect_ignore += needs_pydot
-if not has_ogr:
-    collect_ignore += needs_ogr
 if not has_sympy:
     collect_ignore += needs_sympy
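The conftest filters above rely on `warnings.filterwarnings` matching its `message` argument as a regular expression against the *start* of the warning text, which is how individual deprecation messages are targeted. A minimal stdlib-only sketch of that mechanism:

```python
import warnings

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    # Same pattern as the conftest filters: match on the message prefix.
    warnings.filterwarnings("ignore", category=DeprecationWarning, message="nx.nx_pydot")
    warnings.warn("nx.nx_pydot is deprecated", DeprecationWarning)
    warnings.warn("unrelated deprecation", DeprecationWarning)

# Only the warning whose message does not match the filter is recorded.
assert len(caught) == 1
assert "unrelated" in str(caught[0].message)
```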
diff --git a/networkx/convert.py b/networkx/convert.py
index 3356dd0..7ed668f 100644
--- a/networkx/convert.py
+++ b/networkx/convert.py
@@ -57,7 +57,7 @@ def to_networkx_graph(data, create_using=None, multigraph_input=False):
          generator of edges
          Pandas DataFrame (row per edge)
          2D numpy array
-         scipy sparse matrix
+         scipy sparse array
          pygraphviz agraph
 
     create_using : NetworkX graph constructor, optional (default=nx.Graph)
@@ -135,7 +135,7 @@ def to_networkx_graph(data, create_using=None, multigraph_input=False):
     except ImportError:
         warnings.warn("pandas not found, skipping conversion test.", ImportWarning)
 
-    # numpy matrix or ndarray
+    # numpy array
     try:
         import numpy as np
 
@@ -149,16 +149,16 @@ def to_networkx_graph(data, create_using=None, multigraph_input=False):
     except ImportError:
         warnings.warn("numpy not found, skipping conversion test.", ImportWarning)
 
-    # scipy sparse matrix - any format
+    # scipy sparse array - any format
     try:
         import scipy
 
         if hasattr(data, "format"):
             try:
-                return nx.from_scipy_sparse_matrix(data, create_using=create_using)
+                return nx.from_scipy_sparse_array(data, create_using=create_using)
             except Exception as err:
                 raise nx.NetworkXError(
-                    "Input is not a correct scipy sparse matrix type."
+                    "Input is not a correct scipy sparse array type."
                 ) from err
     except ImportError:
         warnings.warn("scipy not found, skipping conversion test.", ImportWarning)
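The dispatch above now routes any input with a `format` attribute to `from_scipy_sparse_array`. A quick sketch of the new entry point, assuming NetworkX ≥ 2.7 and SciPy ≥ 1.8 (where the `csr_array` container exists):

```python
import networkx as nx
import scipy.sparse as sp

# A 2-node undirected graph from a sparse-array adjacency structure;
# nonzero entries become edges with a "weight" attribute.
A = sp.csr_array([[0, 1], [1, 0]])
G = nx.from_scipy_sparse_array(A)

assert G.number_of_edges() == 1
assert G[0][1]["weight"] == 1
```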
diff --git a/networkx/convert_matrix.py b/networkx/convert_matrix.py
index fecc8ca..027db4f 100644
--- a/networkx/convert_matrix.py
+++ b/networkx/convert_matrix.py
@@ -26,24 +26,18 @@ nx_agraph, nx_pydot
 """
 
 import itertools
-import warnings
 from collections import defaultdict
 
 import networkx as nx
 from networkx.utils import not_implemented_for
 
 __all__ = [
-    "from_numpy_matrix",
-    "to_numpy_matrix",
     "from_pandas_adjacency",
     "to_pandas_adjacency",
     "from_pandas_edgelist",
     "to_pandas_edgelist",
-    "to_numpy_recarray",
     "from_scipy_sparse_array",
-    "from_scipy_sparse_matrix",
     "to_scipy_sparse_array",
-    "to_scipy_sparse_matrix",
     "from_numpy_array",
     "to_numpy_array",
 ]
@@ -197,7 +191,7 @@ def from_pandas_adjacency(df, create_using=None):
     1  2  1
     >>> G = nx.from_pandas_adjacency(df)
     >>> G.name = "Graph from pandas adjacency matrix"
-    >>> print(nx.info(G))
+    >>> print(G)
     Graph named 'Graph from pandas adjacency matrix' with 2 nodes and 3 edges
     """
 
@@ -221,7 +215,6 @@ def to_pandas_edgelist(
     target="target",
     nodelist=None,
     dtype=None,
-    order=None,
     edge_key=None,
 ):
     """Returns the graph edge list as a Pandas DataFrame.
@@ -246,12 +239,6 @@ def to_pandas_edgelist(
         Use to create the DataFrame. Data type to force.
         Only a single dtype is allowed. If None, infer.
 
-    order : None
-        An unused parameter mistakenly included in the function.
-
-        .. deprecated:: 2.6
-            This is deprecated and will be removed in NetworkX v3.0.
-
     edge_key : str or int or None, optional (default=None)
         A valid column name (string or integer) for the edge keys (for the
         multigraph case). If None, edge keys are not stored in the DataFrame.
@@ -476,323 +463,6 @@ def from_pandas_edgelist(
     return g
 
 
-def to_numpy_matrix(
-    G,
-    nodelist=None,
-    dtype=None,
-    order=None,
-    multigraph_weight=sum,
-    weight="weight",
-    nonedge=0.0,
-):
-    """Returns the graph adjacency matrix as a NumPy matrix.
-
-    Parameters
-    ----------
-    G : graph
-        The NetworkX graph used to construct the NumPy matrix.
-
-    nodelist : list, optional
-        The rows and columns are ordered according to the nodes in `nodelist`.
-        If `nodelist` is None, then the ordering is produced by G.nodes().
-
-    dtype : NumPy data type, optional
-        A valid single NumPy data type used to initialize the array.
-        This must be a simple type such as int or numpy.float64 and
-        not a compound data type (see to_numpy_recarray)
-        If None, then the NumPy default is used.
-
-    order : {'C', 'F'}, optional
-        Whether to store multidimensional data in C- or Fortran-contiguous
-        (row- or column-wise) order in memory. If None, then the NumPy default
-        is used.
-
-    multigraph_weight : {sum, min, max}, optional
-        An operator that determines how weights in multigraphs are handled.
-        The default is to sum the weights of the multiple edges.
-
-    weight : string or None optional (default = 'weight')
-        The edge attribute that holds the numerical value used for
-        the edge weight. If an edge does not have that attribute, then the
-        value 1 is used instead.
-
-    nonedge : float (default = 0.0)
-        The matrix values corresponding to nonedges are typically set to zero.
-        However, this could be undesirable if there are matrix values
-        corresponding to actual edges that also have the value zero. If so,
-        one might prefer nonedges to have some other value, such as nan.
-
-    Returns
-    -------
-    M : NumPy matrix
-        Graph adjacency matrix
-
-    See Also
-    --------
-    to_numpy_recarray
-
-    Notes
-    -----
-    For directed graphs, entry i,j corresponds to an edge from i to j.
-
-    The matrix entries are assigned to the weight edge attribute. When
-    an edge does not have a weight attribute, the value of the entry is set to
-    the number 1.  For multiple (parallel) edges, the values of the entries
-    are determined by the `multigraph_weight` parameter.  The default is to
-    sum the weight attributes for each of the parallel edges.
-
-    When `nodelist` does not contain every node in `G`, the matrix is built
-    from the subgraph of `G` that is induced by the nodes in `nodelist`.
-
-    The convention used for self-loop edges in graphs is to assign the
-    diagonal matrix entry value to the weight attribute of the edge
-    (or the number 1 if the edge has no weight attribute).  If the
-    alternate convention of doubling the edge weight is desired the
-    resulting Numpy matrix can be modified as follows:
-
-    >>> import numpy as np
-    >>> G = nx.Graph([(1, 1)])
-    >>> A = nx.to_numpy_matrix(G)
-    >>> A
-    matrix([[1.]])
-    >>> A[np.diag_indices_from(A)] *= 2
-    >>> A
-    matrix([[2.]])
-
-    Examples
-    --------
-    >>> G = nx.MultiDiGraph()
-    >>> G.add_edge(0, 1, weight=2)
-    0
-    >>> G.add_edge(1, 0)
-    0
-    >>> G.add_edge(2, 2, weight=3)
-    0
-    >>> G.add_edge(2, 2)
-    1
-    >>> nx.to_numpy_matrix(G, nodelist=[0, 1, 2])
-    matrix([[0., 2., 0.],
-            [1., 0., 0.],
-            [0., 0., 4.]])
-
-    """
-    warnings.warn(
-        (
-            "to_numpy_matrix is deprecated and will be removed in NetworkX 3.0.\n"
-            "Use to_numpy_array instead, e.g. np.asmatrix(to_numpy_array(G, **kwargs))"
-        ),
-        DeprecationWarning,
-    )
-
-    import numpy as np
-
-    A = to_numpy_array(
-        G,
-        nodelist=nodelist,
-        dtype=dtype,
-        order=order,
-        multigraph_weight=multigraph_weight,
-        weight=weight,
-        nonedge=nonedge,
-    )
-    M = np.asmatrix(A, dtype=dtype)
-    return M
-
-
-def from_numpy_matrix(A, parallel_edges=False, create_using=None):
-    """Returns a graph from numpy matrix.
-
-    The numpy matrix is interpreted as an adjacency matrix for the graph.
-
-    Parameters
-    ----------
-    A : numpy matrix
-        An adjacency matrix representation of a graph
-
-    parallel_edges : Boolean
-        If True, `create_using` is a multigraph, and `A` is an
-        integer matrix, then entry *(i, j)* in the matrix is interpreted as the
-        number of parallel edges joining vertices *i* and *j* in the graph.
-        If False, then the entries in the adjacency matrix are interpreted as
-        the weight of a single edge joining the vertices.
-
-    create_using : NetworkX graph constructor, optional (default=nx.Graph)
-       Graph type to create. If graph instance, then cleared before populated.
-
-    Notes
-    -----
-    For directed graphs, explicitly mention create_using=nx.DiGraph,
-    and entry i,j of A corresponds to an edge from i to j.
-
-    If `create_using` is :class:`networkx.MultiGraph` or
-    :class:`networkx.MultiDiGraph`, `parallel_edges` is True, and the
-    entries of `A` are of type :class:`int`, then this function returns a
-    multigraph (constructed from `create_using`) with parallel edges.
-
-    If `create_using` indicates an undirected multigraph, then only the edges
-    indicated by the upper triangle of the matrix `A` will be added to the
-    graph.
-
-    If the numpy matrix has a single data type for each matrix entry it
-    will be converted to an appropriate Python data type.
-
-    If the numpy matrix has a user-specified compound data type the names
-    of the data fields will be used as attribute keys in the resulting
-    NetworkX graph.
-
-    See Also
-    --------
-    to_numpy_recarray
-
-    Examples
-    --------
-    Simple integer weights on edges:
-
-    >>> import numpy as np
-    >>> A = np.array([[1, 1], [2, 1]])
-    >>> G = nx.from_numpy_matrix(A)
-
-    If `create_using` indicates a multigraph and the matrix has only integer
-    entries and `parallel_edges` is False, then the entries will be treated
-    as weights for edges joining the nodes (without creating parallel edges):
-
-    >>> A = np.array([[1, 1], [1, 2]])
-    >>> G = nx.from_numpy_matrix(A, create_using=nx.MultiGraph)
-    >>> G[1][1]
-    AtlasView({0: {'weight': 2}})
-
-    If `create_using` indicates a multigraph and the matrix has only integer
-    entries and `parallel_edges` is True, then the entries will be treated
-    as the number of parallel edges joining those two vertices:
-
-    >>> A = np.array([[1, 1], [1, 2]])
-    >>> temp = nx.MultiGraph()
-    >>> G = nx.from_numpy_matrix(A, parallel_edges=True, create_using=temp)
-    >>> G[1][1]
-    AtlasView({0: {'weight': 1}, 1: {'weight': 1}})
-
-    User defined compound data type on edges:
-
-    >>> dt = [("weight", float), ("cost", int)]
-    >>> A = np.array([[(1.0, 2)]], dtype=dt)
-    >>> G = nx.from_numpy_matrix(A)
-    >>> list(G.edges())
-    [(0, 0)]
-    >>> G[0][0]["cost"]
-    2
-    >>> G[0][0]["weight"]
-    1.0
-
-    """
-    warnings.warn(
-        (
-            "from_numpy_matrix is deprecated and will be removed in NetworkX 3.0.\n"
-            "Use from_numpy_array instead, e.g. from_numpy_array(A, **kwargs)"
-        ),
-        DeprecationWarning,
-    )
-    return from_numpy_array(A, parallel_edges=parallel_edges, create_using=create_using)
-
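For code still calling the removed `from_numpy_matrix`, the replacement named in the deprecation message takes the same arguments. A minimal sketch, assuming a NetworkX version providing `from_numpy_array` (≥ 2.0):

```python
import numpy as np
import networkx as nx

A = np.array([[0, 1], [1, 0]])
G = nx.from_numpy_array(A)  # drop-in for the removed from_numpy_matrix(A)

assert list(G.edges()) == [(0, 1)]
```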
-
-@not_implemented_for("multigraph")
-def to_numpy_recarray(G, nodelist=None, dtype=None, order=None):
-    """Returns the graph adjacency matrix as a NumPy recarray.
-
-    .. deprecated:: 2.7
-
-       ``to_numpy_recarray`` is deprecated and will be removed in NetworkX 3.0.
-       Use ``nx.to_numpy_array(G, dtype=dtype, weight=None).view(np.recarray)``
-       instead.
-
-    Parameters
-    ----------
-    G : graph
-        The NetworkX graph used to construct the NumPy recarray.
-
-    nodelist : list, optional
-       The rows and columns are ordered according to the nodes in `nodelist`.
-       If `nodelist` is None, then the ordering is produced by G.nodes().
-
-    dtype : NumPy data-type, optional
-        A valid NumPy named dtype used to initialize the NumPy recarray.
-        The data type names are assumed to be keys in the graph edge attribute
-        dictionary. The default is ``dtype([("weight", float)])``.
-
-    order : {'C', 'F'}, optional
-        Whether to store multidimensional data in C- or Fortran-contiguous
-        (row- or column-wise) order in memory. If None, then the NumPy default
-        is used.
-
-    Returns
-    -------
-    M : NumPy recarray
-       The graph with specified edge data as a Numpy recarray
-
-    Notes
-    -----
-    When `nodelist` does not contain every node in `G`, the adjacency
-    matrix is built from the subgraph of `G` that is induced by the nodes in
-    `nodelist`.
-
-    Examples
-    --------
-    >>> G = nx.Graph()
-    >>> G.add_edge(1, 2, weight=7.0, cost=5)
-    >>> A = nx.to_numpy_recarray(G, dtype=[("weight", float), ("cost", int)])
-    >>> print(A.weight)
-    [[0. 7.]
-     [7. 0.]]
-    >>> print(A.cost)
-    [[0 5]
-     [5 0]]
-
-    """
-    import warnings
-
-    import numpy as np
-
-    warnings.warn(
-        (
-            "to_numpy_recarray is deprecated and will be removed in version 3.0.\n"
-            "Use to_numpy_array instead::\n\n"
-            "    nx.to_numpy_array(G, dtype=dtype, weight=None).view(np.recarray)"
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-
-    if dtype is None:
-        dtype = [("weight", float)]
-
-    if nodelist is None:
-        nodelist = list(G)
-        nodeset = G
-        nlen = len(G)
-    else:
-        nlen = len(nodelist)
-        nodeset = set(G.nbunch_iter(nodelist))
-        if nlen != len(nodeset):
-            for n in nodelist:
-                if n not in G:
-                    raise nx.NetworkXError(f"Node {n} in nodelist is not in G")
-            raise nx.NetworkXError("nodelist contains duplicates.")
-
-    undirected = not G.is_directed()
-    index = dict(zip(nodelist, range(nlen)))
-    M = np.zeros((nlen, nlen), dtype=dtype, order=order)
-
-    names = M.dtype.names
-    for u, v, attrs in G.edges(data=True):
-        if (u in nodeset) and (v in nodeset):
-            i, j = index[u], index[v]
-            values = tuple(attrs[n] for n in names)
-            M[i, j] = values
-            if undirected:
-                M[j, i] = M[i, j]
-
-    return M.view(np.recarray)
-
-
 def to_scipy_sparse_array(G, nodelist=None, dtype=None, weight="weight", format="csr"):
     """Returns the graph adjacency matrix as a SciPy sparse array.
 
@@ -841,7 +511,7 @@ def to_scipy_sparse_array(G, nodelist=None, dtype=None, weight="weight", format=
     diagonal matrix entry value to the weight attribute of the edge
     (or the number 1 if the edge has no weight attribute).  If the
     alternate convention of doubling the edge weight is desired the
-    resulting Scipy sparse matrix can be modified as follows:
+    resulting SciPy sparse array can be modified as follows:
 
     >>> G = nx.Graph([(1, 1)])
     >>> A = nx.to_scipy_sparse_array(G)
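The self-loop doubling convention shown in the doctest above works the same on the array container, since sparse arrays keep `setdiag`. A sketch, assuming NetworkX ≥ 2.7:

```python
import networkx as nx

G = nx.Graph([(1, 1)])        # one self-loop, default weight 1
A = nx.to_scipy_sparse_array(G)
A.setdiag(A.diagonal() * 2)   # double each self-loop weight in place

assert A.toarray()[0, 0] == 2
```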
@@ -927,189 +597,8 @@ def to_scipy_sparse_array(G, nodelist=None, dtype=None, weight="weight", format=
         raise nx.NetworkXError(f"Unknown sparse matrix format: {format}") from err
 
 
-def to_scipy_sparse_matrix(G, nodelist=None, dtype=None, weight="weight", format="csr"):
-    """Returns the graph adjacency matrix as a SciPy sparse matrix.
-
-    Parameters
-    ----------
-    G : graph
-        The NetworkX graph used to construct the sparse matrix.
-
-    nodelist : list, optional
-       The rows and columns are ordered according to the nodes in `nodelist`.
-       If `nodelist` is None, then the ordering is produced by G.nodes().
-
-    dtype : NumPy data-type, optional
-        A valid NumPy dtype used to initialize the array. If None, then the
-        NumPy default is used.
-
-    weight : string or None   optional (default='weight')
-        The edge attribute that holds the numerical value used for
-        the edge weight.  If None then all edge weights are 1.
-
-    format : str in {'bsr', 'csr', 'csc', 'coo', 'lil', 'dia', 'dok'}
-        The type of the matrix to be returned (default 'csr').  For
-        some algorithms different implementations of sparse matrices
-        can perform better.  See [1]_ for details.
-
-    Returns
-    -------
-    A : SciPy sparse matrix
-       Graph adjacency matrix.
-
-    Notes
-    -----
-    For directed graphs, matrix entry i,j corresponds to an edge from i to j.
-
-    The matrix entries are populated using the edge attribute held in
-    parameter weight. When an edge does not have that attribute, the
-    value of the entry is 1.
-
-    For multiple edges the matrix values are the sums of the edge weights.
-
-    When `nodelist` does not contain every node in `G`, the adjacency matrix
-    is built from the subgraph of `G` that is induced by the nodes in
-    `nodelist`.
-
-    The convention used for self-loop edges in graphs is to assign the
-    diagonal matrix entry value to the weight attribute of the edge
-    (or the number 1 if the edge has no weight attribute).  If the
-    alternate convention of doubling the edge weight is desired the
-    resulting Scipy sparse matrix can be modified as follows:
-
-    >>> G = nx.Graph([(1, 1)])
-    >>> A = nx.to_scipy_sparse_matrix(G)
-    >>> print(A.todense())
-    [[1]]
-    >>> A.setdiag(A.diagonal() * 2)
-    >>> print(A.todense())
-    [[2]]
-
-    Examples
-    --------
-    >>> G = nx.MultiDiGraph()
-    >>> G.add_edge(0, 1, weight=2)
-    0
-    >>> G.add_edge(1, 0)
-    0
-    >>> G.add_edge(2, 2, weight=3)
-    0
-    >>> G.add_edge(2, 2)
-    1
-    >>> S = nx.to_scipy_sparse_matrix(G, nodelist=[0, 1, 2])
-    >>> print(S.todense())
-    [[0 2 0]
-     [1 0 0]
-     [0 0 4]]
-
-    References
-    ----------
-    .. [1] Scipy Dev. References, "Sparse Matrices",
-       https://docs.scipy.org/doc/scipy/reference/sparse.html
-    """
-    import scipy as sp
-    import scipy.sparse
-
-    warnings.warn(
-        (
-            "\n\nThe scipy.sparse array containers will be used instead of matrices\n"
-            "in Networkx 3.0. Use `to_scipy_sparse_array` instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    A = to_scipy_sparse_array(
-        G, nodelist=nodelist, dtype=dtype, weight=weight, format=format
-    )
-    return sp.sparse.csr_matrix(A).asformat(format)
-
-
-def from_scipy_sparse_matrix(
-    A, parallel_edges=False, create_using=None, edge_attribute="weight"
-):
-    """Creates a new graph from an adjacency matrix given as a SciPy sparse
-    matrix.
-
-    Parameters
-    ----------
-    A: scipy sparse matrix
-      An adjacency matrix representation of a graph
-
-    parallel_edges : Boolean
-      If this is True, `create_using` is a multigraph, and `A` is an
-      integer matrix, then entry *(i, j)* in the matrix is interpreted as the
-      number of parallel edges joining vertices *i* and *j* in the graph.
-      If it is False, then the entries in the matrix are interpreted as
-      the weight of a single edge joining the vertices.
-
-    create_using : NetworkX graph constructor, optional (default=nx.Graph)
-       Graph type to create. If graph instance, then cleared before populated.
-
-    edge_attribute: string
-       Name of edge attribute to store matrix numeric value. The data will
-       have the same type as the matrix entry (int, float, (real,imag)).
-
-    Notes
-    -----
-    For directed graphs, explicitly mention create_using=nx.DiGraph,
-    and entry i,j of A corresponds to an edge from i to j.
-
-    If `create_using` is :class:`networkx.MultiGraph` or
-    :class:`networkx.MultiDiGraph`, `parallel_edges` is True, and the
-    entries of `A` are of type :class:`int`, then this function returns a
-    multigraph (constructed from `create_using`) with parallel edges.
-    In this case, `edge_attribute` will be ignored.
-
-    If `create_using` indicates an undirected multigraph, then only the edges
-    indicated by the upper triangle of the matrix `A` will be added to the
-    graph.
-
-    Examples
-    --------
-    >>> import scipy as sp
-    >>> import scipy.sparse  # call as sp.sparse
-    >>> A = sp.sparse.eye(2, 2, 1)
-    >>> G = nx.from_scipy_sparse_matrix(A)
-
-    If `create_using` indicates a multigraph and the matrix has only integer
-    entries and `parallel_edges` is False, then the entries will be treated
-    as weights for edges joining the nodes (without creating parallel edges):
-
-    >>> A = sp.sparse.csr_matrix([[1, 1], [1, 2]])
-    >>> G = nx.from_scipy_sparse_matrix(A, create_using=nx.MultiGraph)
-    >>> G[1][1]
-    AtlasView({0: {'weight': 2}})
-
-    If `create_using` indicates a multigraph and the matrix has only integer
-    entries and `parallel_edges` is True, then the entries will be treated
-    as the number of parallel edges joining those two vertices:
-
-    >>> A = sp.sparse.csr_matrix([[1, 1], [1, 2]])
-    >>> G = nx.from_scipy_sparse_matrix(
-    ...     A, parallel_edges=True, create_using=nx.MultiGraph
-    ... )
-    >>> G[1][1]
-    AtlasView({0: {'weight': 1}, 1: {'weight': 1}})
-
-    """
-    warnings.warn(
-        (
-            "\n\nThe scipy.sparse array containers will be used instead of matrices\n"
-            "in Networkx 3.0. Use `from_scipy_sparse_array` instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    return from_scipy_sparse_array(
-        A,
-        parallel_edges=parallel_edges,
-        create_using=create_using,
-        edge_attribute=edge_attribute,
-    )
-
-
 def _csr_gen_triples(A):
-    """Converts a SciPy sparse matrix in **Compressed Sparse Row** format to
+    """Converts a SciPy sparse array in **Compressed Sparse Row** format to
     an iterable of weighted edge triples.
 
     """
@@ -1121,7 +610,7 @@ def _csr_gen_triples(A):
 
 
 def _csc_gen_triples(A):
-    """Converts a SciPy sparse matrix in **Compressed Sparse Column** format to
+    """Converts a SciPy sparse array in **Compressed Sparse Column** format to
     an iterable of weighted edge triples.
 
     """
@@ -1133,7 +622,7 @@ def _csc_gen_triples(A):
 
 
 def _coo_gen_triples(A):
-    """Converts a SciPy sparse matrix in **Coordinate** format to an iterable
+    """Converts a SciPy sparse array in **Coordinate** format to an iterable
     of weighted edge triples.
 
     """
@@ -1142,7 +631,7 @@ def _coo_gen_triples(A):
 
 
 def _dok_gen_triples(A):
-    """Converts a SciPy sparse matrix in **Dictionary of Keys** format to an
+    """Converts a SciPy sparse array in **Dictionary of Keys** format to an
     iterable of weighted edge triples.
 
     """
@@ -1154,7 +643,7 @@ def _generate_weighted_edges(A):
     """Returns an iterable over (u, v, w) triples, where u and v are adjacent
     vertices and w is the weight of the edge joining u and v.
 
-    `A` is a SciPy sparse matrix (in any format).
+    `A` is a SciPy sparse array (in any format).
 
     """
     if A.format == "csr":
diff --git a/networkx/drawing/__init__.py b/networkx/drawing/__init__.py
index 1e8542f..0f53309 100644
--- a/networkx/drawing/__init__.py
+++ b/networkx/drawing/__init__.py
@@ -1,6 +1,7 @@
 # graph drawing and interface to graphviz
 
 from .layout import *
+from .nx_latex import *
 from .nx_pylab import *
 from . import nx_agraph
 from . import nx_pydot
diff --git a/networkx/drawing/layout.py b/networkx/drawing/layout.py
index b6d2afe..2d49c30 100644
--- a/networkx/drawing/layout.py
+++ b/networkx/drawing/layout.py
@@ -32,6 +32,7 @@ __all__ = [
     "fruchterman_reingold_layout",
     "spiral_layout",
     "multipartite_layout",
+    "arf_layout",
 ]
 
 
@@ -1110,6 +1111,118 @@ def multipartite_layout(G, subset_key="subset", align="vertical", scale=1, cente
     return pos
 
 
+def arf_layout(
+    G,
+    pos=None,
+    scaling=1,
+    a=1.1,
+    etol=1e-6,
+    dt=1e-3,
+    max_iter=1000,
+):
+    """Arf layout for networkx
+
+    The attractive and repulsive forces (arf) layout [1]
+    improves the spring layout in three ways. First, it
+    prevents congestion of highly connected nodes due to
+    strong forcing between nodes. Second, it utilizes the
+    layout space more effectively by preventing large gaps
+    that spring layout tends to create. Lastly, the arf
+    layout represents symmetries in the layout better than
+    the default spring layout.
+
+    Parameters
+    ----------
+    G : nx.Graph or nx.DiGraph
+        Networkx graph.
+    pos : dict
+        Initial position of the nodes. If set to None, a
+        random layout will be used.
+    scaling : float
+        Scales the radius of the circular layout space.
+    a : float
+        Strength of springs between connected nodes. Should be larger than 1. The greater a, the clearer the separation of unconnected sub-clusters.
+    etol : float
+        Gradient sum of the spring forces must drop below `etol` for successful termination.
+    dt : float
+        Time step for force differential equation simulations.
+    max_iter : int
+        Max iterations before termination of the algorithm.
+
+    References
+    ----------
+    .. [1] "Self-Organization Applied to Dynamic Network Layout", M. Geipel,
+            International Journal of Modern Physics C, 2007, Vol 18, No 10, pp. 1537-1549.
+            https://doi.org/10.1142/S0129183107011558 https://arxiv.org/abs/0704.1748
+
+    Returns
+    -------
+    pos : dict
+        A dictionary of positions keyed by node.
+
+    Examples
+    --------
+    >>> G = nx.grid_graph((5, 5))
+    >>> pos = nx.arf_layout(G)
+
+    """
+    import warnings
+
+    import numpy as np
+
+    if a <= 1:
+        msg = "The parameter a should be larger than 1"
+        raise ValueError(msg)
+
+    pos_tmp = nx.random_layout(G)
+    if pos is None:
+        pos = pos_tmp
+    else:
+        for node in G.nodes():
+            if node not in pos:
+                pos[node] = pos_tmp[node].copy()
+
+    # Initialize spring constant matrix
+    N = len(G)
+    # No nodes no computation
+    if N == 0:
+        return pos
+
+    # init force of springs
+    K = np.ones((N, N)) - np.eye(N)
+    node_order = {node: i for i, node in enumerate(G)}
+    for x, y in G.edges():
+        if x != y:
+            idx, jdx = (node_order[i] for i in (x, y))
+            K[idx, jdx] = a
+
+    # vectorize values
+    p = np.asarray(list(pos.values()))
+
+    # equation 10 in [1]
+    rho = scaling * np.sqrt(N)
+
+    # looping variables
+    error = etol + 1
+    n_iter = 0
+    while error > etol:
+        diff = p[:, np.newaxis] - p[np.newaxis]
+        A = np.linalg.norm(diff, axis=-1)[..., np.newaxis]
+        # attraction_force - repulsions force
+        # suppress nans due to division; caused by diagonal set to zero.
+        # Does not affect the computation due to nansum
+        with warnings.catch_warnings():
+            warnings.simplefilter("ignore")
+            change = K[..., np.newaxis] * diff - rho / A * diff
+        change = np.nansum(change, axis=0)
+        p += change * dt
+
+        error = np.linalg.norm(change, axis=-1).sum()
+        if n_iter > max_iter:
+            break
+        n_iter += 1
+    return {node: pi for node, pi in zip(G.nodes(), p)}
+
+
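The vectorized update inside the loop above can be illustrated on a toy 3-node path with made-up positions; this mirrors the `K`/`rho` setup for a single force step and is a sketch, not the library code:

```python
import numpy as np

a, scaling, dt = 1.1, 1, 1e-3
N = 3
K = np.ones((N, N)) - np.eye(N)   # baseline repulsive coupling, no self-terms
K[0, 1] = K[1, 2] = a             # stronger springs on the two path edges
p = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.5]])
rho = scaling * np.sqrt(N)        # equation 10 in the referenced paper

diff = p[:, np.newaxis] - p[np.newaxis]                # pairwise displacements
A = np.linalg.norm(diff, axis=-1)[..., np.newaxis]     # pairwise distances
with np.errstate(divide="ignore", invalid="ignore"):
    change = K[..., np.newaxis] * diff - rho / A * diff
# nansum drops the NaNs produced by the zero diagonal distances.
p = p + np.nansum(change, axis=0) * dt

assert p.shape == (N, 2)
assert np.isfinite(p).all()
```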
 def rescale_layout(pos, scale=1):
     """Returns scaled position array to (-scale, scale) in all axes.
 
diff --git a/networkx/drawing/nx_agraph.py b/networkx/drawing/nx_agraph.py
index eeb9cf8..2ffa21f 100644
--- a/networkx/drawing/nx_agraph.py
+++ b/networkx/drawing/nx_agraph.py
@@ -324,7 +324,7 @@ def view_pygraphviz(
     G : NetworkX graph
         The machine to draw.
     edgelabel : str, callable, None
-        If a string, then it specifes the edge attribute to be displayed
+        If a string, then it specifies the edge attribute to be displayed
         on the edge labels. If a callable, then it is called for each
         edge and it should return the string to be displayed on the edges.
         The function signature of `edgelabel` should be edgelabel(data),
@@ -458,52 +458,3 @@ def view_pygraphviz(
         Image.open(path.name).show()
 
     return path.name, A
-
-
-def display_pygraphviz(graph, path, format=None, prog=None, args=""):
-    """Internal function to display a graph in OS dependent manner.
-
-    Parameters
-    ----------
-    graph : PyGraphviz graph
-        A PyGraphviz AGraph instance.
-    path :  file object
-        An already opened file object that will be closed.
-    format : str, None
-        An attempt is made to guess the output format based on the extension
-        of the filename. If that fails, the value of `format` is used.
-    prog : string
-        Name of Graphviz layout program.
-    args : str
-        Additional arguments to pass to the Graphviz layout program.
-
-    Notes
-    -----
-    If this function is called in succession too quickly, sometimes the
-    image is not displayed. So you might consider time.sleep(.5) between
-    calls if you experience problems.
-
-    """
-    import warnings
-
-    from PIL import Image
-
-    warnings.warn(
-        "display_pygraphviz is deprecated and will be removed in NetworkX 3.0. "
-        "To view a graph G using pygraphviz, use nx.nx_agraph.view_pygraphviz(G). "
-        "To view a graph from file, consider an image processing libary like "
-        "`Pillow`, e.g. ``PIL.Image.open(path.name).show()``",
-        DeprecationWarning,
-    )
-    if format is None:
-        filename = path.name
-        format = os.path.splitext(filename)[1].lower()[1:]
-    if not format:
-        # Let the draw() function use its default
-        format = None
-
-    # Save to a file and display in the default viewer.
-    # We must close the file before viewing it.
-    graph.draw(path, format, prog, args)
-    path.close()
-    Image.open(filename).show()
diff --git a/networkx/drawing/nx_latex.py b/networkx/drawing/nx_latex.py
new file mode 100644
index 0000000..9ccd6ee
--- /dev/null
+++ b/networkx/drawing/nx_latex.py
@@ -0,0 +1,571 @@
+r"""
+*****
+LaTeX
+*****
+
+Export NetworkX graphs in LaTeX format using the TikZ library within TeX/LaTeX.
+Usually, you will want the drawing to appear in a figure environment so
+you use ``to_latex(G, caption="A caption")``. If you want the raw
+drawing commands without a figure environment use :func:`to_latex_raw`.
+And if you want to write to a file instead of just returning the latex
+code as a string, use ``write_latex(G, "filename.tex", caption="A caption")``.
+
+To construct a figure with subfigures for each graph to be shown, provide
+``to_latex`` or ``write_latex`` a list of graphs, a list of subcaptions,
+and a number of rows of subfigures inside the figure.
+
+To be able to refer to the figures or subfigures in latex using ``\\ref``,
+the keyword ``latex_label`` is available for figures and ``sub_labels`` for
+a list of labels, one for each subfigure.
+
+We intend to eventually provide an interface to the TikZ Graph
+features which include e.g. layout algorithms.
+
+Let us know via GitHub what you'd like to see available, or better yet,
+give us some code to do it, or best of all, make a GitHub pull request
+to add the feature.
+
+The TikZ approach
+=================
+Drawing options can be stored on the graph as node/edge attributes, or
+can be provided as dicts keyed by node/edge to a string of the options
+for that node/edge. Similarly a label can be shown for each node/edge
+by specifying the labels as graph node/edge attributes or by providing
+a dict keyed by node/edge to the text to be written for that node/edge.
+
+Options for the tikzpicture environment (e.g. "[scale=2]") can be provided
+via a keyword argument. Similarly default node and edge options can be
+provided through keyword arguments. The default node options are applied
+to the single TikZ "path" that draws all nodes (and no edges). The default edge
+options are applied to a TikZ "scope" which contains a path for each edge.
+
+Examples
+========
+>>> G = nx.path_graph(3)
+>>> nx.write_latex(G, "just_my_figure.tex", as_document=True)
+>>> nx.write_latex(G, "my_figure.tex", caption="A path graph", latex_label="fig1")
+>>> latex_code = nx.to_latex(G)  # a string rather than a file
+
+You can change many features of the nodes and edges.
+
+>>> G = nx.path_graph(4, create_using=nx.DiGraph)
+>>> pos = {n: (n, n) for n in G}  # nodes set on a line
+
+>>> G.nodes[0]["style"] = "blue"
+>>> G.nodes[2]["style"] = "line width=3,draw"
+>>> G.nodes[3]["label"] = "Stop"
+>>> G.edges[(0, 1)]["label"] = "1st Step"
+>>> G.edges[(0, 1)]["label_opts"] = "near start"
+>>> G.edges[(1, 2)]["style"] = "line width=3"
+>>> G.edges[(1, 2)]["label"] = "2nd Step"
+>>> G.edges[(2, 3)]["style"] = "green"
+>>> G.edges[(2, 3)]["label"] = "3rd Step"
+>>> G.edges[(2, 3)]["label_opts"] = "near end"
+
+>>> nx.write_latex(G, "latex_graph.tex", pos=pos, as_document=True)
+
+Then compile the LaTeX using something like ``pdflatex latex_graph.tex``
+and view the pdf file created: ``latex_graph.pdf``.
+
+If you want **subfigures** each containing one graph, you can input a list of graphs.
+
+>>> H1 = nx.path_graph(4)
+>>> H2 = nx.complete_graph(4)
+>>> H3 = nx.path_graph(8)
+>>> H4 = nx.complete_graph(8)
+>>> graphs = [H1, H2, H3, H4]
+>>> caps = ["Path 4", "Complete graph 4", "Path 8", "Complete graph 8"]
+>>> lbls = ["fig2a", "fig2b", "fig2c", "fig2d"]
+>>> nx.write_latex(graphs, "subfigs.tex", n_rows=2, sub_captions=caps, sub_labels=lbls)
+>>> latex_code = nx.to_latex(graphs, n_rows=2, sub_captions=caps, sub_labels=lbls)
+
+>>> node_color = {0: "red", 1: "orange", 2: "blue", 3: "gray!90"}
+>>> edge_width = {e: "line width=1.5" for e in H3.edges}
+>>> pos = nx.circular_layout(H3)
+>>> latex_code = nx.to_latex(H3, pos, node_options=node_color, edge_options=edge_width)
+>>> print(latex_code)
+\documentclass{report}
+\usepackage{tikz}
+\usepackage{subcaption}
+<BLANKLINE>
+\begin{document}
+\begin{figure}
+  \begin{tikzpicture}
+      \draw
+        (1.0, 0.0) node[red] (0){0}
+        (0.707, 0.707) node[orange] (1){1}
+        (-0.0, 1.0) node[blue] (2){2}
+        (-0.707, 0.707) node[gray!90] (3){3}
+        (-1.0, -0.0) node (4){4}
+        (-0.707, -0.707) node (5){5}
+        (0.0, -1.0) node (6){6}
+        (0.707, -0.707) node (7){7};
+      \begin{scope}[-]
+        \draw[line width=1.5] (0) to (1);
+        \draw[line width=1.5] (1) to (2);
+        \draw[line width=1.5] (2) to (3);
+        \draw[line width=1.5] (3) to (4);
+        \draw[line width=1.5] (4) to (5);
+        \draw[line width=1.5] (5) to (6);
+        \draw[line width=1.5] (6) to (7);
+      \end{scope}
+    \end{tikzpicture}
+\end{figure}
+\end{document}
+
+Notes
+-----
+If you want to change the preamble/postamble of the figure/document/subfigure
+environment, use the keyword arguments: `figure_wrapper`, `document_wrapper`,
+`subfigure_wrapper`. The default values are stored in private variables
+e.g. ``nx.drawing.nx_latex._DOC_WRAPPER_TIKZ``
+
+References
+----------
+TikZ:          https://tikz.dev/
+
+TikZ options details:   https://tikz.dev/tikz-actions
+"""
+import numbers
+import os
+
+import networkx as nx
+
+__all__ = [
+    "to_latex_raw",
+    "to_latex",
+    "write_latex",
+]
+
+
+@nx.utils.not_implemented_for("multigraph")
+def to_latex_raw(
+    G,
+    pos="pos",
+    tikz_options="",
+    default_node_options="",
+    node_options="node_options",
+    node_label="label",
+    default_edge_options="",
+    edge_options="edge_options",
+    edge_label="label",
+    edge_label_options="edge_label_options",
+):
+    """Return a string of the LaTeX/TikZ code to draw `G`
+
+    This function produces just the code for the tikzpicture
+    without any enclosing environment.
+
+    Parameters
+    ==========
+    G : NetworkX graph
+        The NetworkX graph to be drawn
+    pos : string or dict (default "pos")
+        The name of the node attribute on `G` that holds the position of each node.
+        Positions can be sequences of length 2 with numbers for (x,y) coordinates.
+        They can also be strings to denote positions in TikZ style, such as (x, y)
+        or (angle:radius).
+        If a dict, it should be keyed by node to a position.
+        If an empty dict, a circular layout is computed by TikZ.
+    tikz_options : string
+        The tikzpicture options description defining the options for the picture.
+        Often large scale options like `[scale=2]`.
+    default_node_options : string
+        The draw options for a path of nodes. Individual node options override these.
+    node_options : string or dict
+        The name of the node attribute on `G` that holds the options for each node.
+        Or a dict keyed by node to a string holding the options for that node.
+    node_label : string or dict
+        The name of the node attribute on `G` that holds the node label (text)
+        displayed for each node. If the attribute is "" or not present, the node
+        itself is drawn as a string. LaTeX processing such as ``"$A_1$"`` is allowed.
+        Or a dict keyed by node to a string holding the label for that node.
+    default_edge_options : string
+        The options for the scope drawing all edges. The default is "[-]" for
+        undirected graphs and "[->]" for directed graphs.
+    edge_options : string or dict
+        The name of the edge attribute on `G` that holds the options for each edge.
+        If the edge is a self-loop and ``"loop" not in edge_options`` the option
+        "loop," is added to the options for the self-loop edge. Hence you can
+        use "[loop above]" explicitly, but the default is "[loop]".
+        Or a dict keyed by edge to a string holding the options for that edge.
+    edge_label : string or dict
+        The name of the edge attribute on `G` that holds the edge label (text)
+        displayed for each edge. If the attribute is "" or not present, no edge
+        label is drawn.
+        Or a dict keyed by edge to a string holding the label for that edge.
+    edge_label_options : string or dict
+        The name of the edge attribute on `G` that holds the label options for
+        each edge. For example, "[sloped,above,blue]". The default is no options.
+        Or a dict keyed by edge to a string holding the label options for that edge.
+
+    Returns
+    =======
+    latex_code : string
+       The text string which draws the desired graph(s) when compiled by LaTeX.
+
+    See Also
+    ========
+    to_latex
+    write_latex
+    """
+    i4 = "\n    "
+    i8 = "\n        "
+
+    # set up position dict
+    # TODO allow pos to be None and use a nice TikZ default
+    if not isinstance(pos, dict):
+        pos = nx.get_node_attributes(G, pos)
+    if not pos:
+        # circular layout with radius 1
+        pos = {n: f"({round(2 * 3.1415 * i / len(G), 3)}:10)" for i, n in enumerate(G)}
+    for node in G:
+        if node not in pos:
+            raise nx.NetworkXError(f"node {node} has no specified pos {pos}")
+        posnode = pos[node]
+        if not isinstance(posnode, str):
+            try:
+                posx, posy = posnode
+                pos[node] = f"({round(posx, 3)}, {round(posy, 3)})"
+            except (TypeError, ValueError):
+                msg = f"position pos[{node}] is not 2-tuple or a string: {posnode}"
+                raise nx.NetworkXError(msg)
+
+    # set up all the dicts
+    if not isinstance(node_options, dict):
+        node_options = nx.get_node_attributes(G, node_options)
+    if not isinstance(node_label, dict):
+        node_label = nx.get_node_attributes(G, node_label)
+    if not isinstance(edge_options, dict):
+        edge_options = nx.get_edge_attributes(G, edge_options)
+    if not isinstance(edge_label, dict):
+        edge_label = nx.get_edge_attributes(G, edge_label)
+    if not isinstance(edge_label_options, dict):
+        edge_label_options = nx.get_edge_attributes(G, edge_label_options)
+
+    # process default options (add brackets or not)
+    topts = "" if tikz_options == "" else f"[{tikz_options.strip('[]')}]"
+    defn = "" if default_node_options == "" else f"[{default_node_options.strip('[]')}]"
+    linestyle = f"{'->' if G.is_directed() else '-'}"
+    if default_edge_options == "":
+        defe = "[" + linestyle + "]"
+    elif "-" in default_edge_options:
+        defe = default_edge_options
+    else:
+        defe = f"[{linestyle},{default_edge_options.strip('[]')}]"
+
+    # Construct the string line by line
+    result = "  \\begin{tikzpicture}" + topts
+    result += i4 + "  \\draw" + defn
+    # load the nodes
+    for n in G:
+        # node options goes inside square brackets
+        nopts = f"[{node_options[n].strip('[]')}]" if n in node_options else ""
+        # node text goes inside curly brackets {}
+        ntext = f"{{{node_label[n]}}}" if n in node_label else f"{{{n}}}"
+
+        result += i8 + f"{pos[n]} node{nopts} ({n}){ntext}"
+    result += ";\n"
+
+    # load the edges
+    result += "      \\begin{scope}" + defe
+    for edge in G.edges:
+        u, v = edge[:2]
+        e_opts = f"{edge_options[edge]}".strip("[]") if edge in edge_options else ""
+        # add loop options for selfloops if not present
+        if u == v and "loop" not in e_opts:
+            e_opts = "loop," + e_opts
+        e_opts = f"[{e_opts}]" if e_opts != "" else ""
+        # TODO -- handle bending of multiedges
+
+        els = edge_label_options[edge] if edge in edge_label_options else ""
+        # edge label options goes inside square brackets []
+        els = f"[{els.strip('[]')}]"
+        # edge text is drawn using the TikZ node command inside curly brackets {}
+        e_label = f" node{els} {{{edge_label[edge]}}}" if edge in edge_label else ""
+
+        result += i8 + f"\\draw{e_opts} ({u}) to{e_label} ({v});"
+
+    result += "\n      \\end{scope}\n    \\end{tikzpicture}\n"
+    return result
+
+
+_DOC_WRAPPER_TIKZ = r"""\documentclass{{report}}
+\usepackage{{tikz}}
+\usepackage{{subcaption}}
+
+\begin{{document}}
+{content}
+\end{{document}}"""
+
+
+_FIG_WRAPPER = r"""\begin{{figure}}
+{content}{caption}{label}
+\end{{figure}}"""
+
+
+_SUBFIG_WRAPPER = r"""  \begin{{subfigure}}{{{size}\textwidth}}
+{content}{caption}{label}
+  \end{{subfigure}}"""
+
+
+def to_latex(
+    Gbunch,
+    pos="pos",
+    tikz_options="",
+    default_node_options="",
+    node_options="node_options",
+    node_label="node_label",
+    default_edge_options="",
+    edge_options="edge_options",
+    edge_label="edge_label",
+    edge_label_options="edge_label_options",
+    caption="",
+    latex_label="",
+    sub_captions=None,
+    sub_labels=None,
+    n_rows=1,
+    as_document=True,
+    document_wrapper=_DOC_WRAPPER_TIKZ,
+    figure_wrapper=_FIG_WRAPPER,
+    subfigure_wrapper=_SUBFIG_WRAPPER,
+):
+    """Return latex code to draw the graph(s) in `Gbunch`
+
+    The TikZ drawing utility in LaTeX is used to draw the graph(s).
+    If `Gbunch` is a graph, it is drawn in a figure environment.
+    If `Gbunch` is an iterable of graphs, each is drawn in a subfigure environment
+    within a single figure environment.
+
+    If `as_document` is True, the figure is wrapped inside a document environment
+    so that the resulting string is ready to be compiled by LaTeX. Otherwise,
+    the string is ready for inclusion in a larger tex document using ``\\include``
+    or ``\\input`` statements.
+
+    Parameters
+    ==========
+    Gbunch : NetworkX graph or iterable of NetworkX graphs
+        The NetworkX graph to be drawn or an iterable of graphs
+        to be drawn inside subfigures of a single figure.
+    pos : string or list of strings
+        The name of the node attribute on `G` that holds the position of each node.
+        Positions can be sequences of length 2 with numbers for (x,y) coordinates.
+        They can also be strings to denote positions in TikZ style, such as (x, y)
+        or (angle:radius).
+        If a dict, it should be keyed by node to a position.
+        If an empty dict, a circular layout is computed by TikZ.
+        If you are drawing many graphs in subfigures, use a list of position dicts.
+    tikz_options : string
+        The tikzpicture options description defining the options for the picture.
+        Often large scale options like `[scale=2]`.
+    default_node_options : string
+        The draw options for a path of nodes. Individual node options override these.
+    node_options : string or dict
+        The name of the node attribute on `G` that holds the options for each node.
+        Or a dict keyed by node to a string holding the options for that node.
+    node_label : string or dict
+        The name of the node attribute on `G` that holds the node label (text)
+        displayed for each node. If the attribute is "" or not present, the node
+        itself is drawn as a string. LaTeX processing such as ``"$A_1$"`` is allowed.
+        Or a dict keyed by node to a string holding the label for that node.
+    default_edge_options : string
+        The options for the scope drawing all edges. The default is "[-]" for
+        undirected graphs and "[->]" for directed graphs.
+    edge_options : string or dict
+        The name of the edge attribute on `G` that holds the options for each edge.
+        If the edge is a self-loop and ``"loop" not in edge_options`` the option
+        "loop," is added to the options for the self-loop edge. Hence you can
+        use "[loop above]" explicitly, but the default is "[loop]".
+        Or a dict keyed by edge to a string holding the options for that edge.
+    edge_label : string or dict
+        The name of the edge attribute on `G` that holds the edge label (text)
+        displayed for each edge. If the attribute is "" or not present, no edge
+        label is drawn.
+        Or a dict keyed by edge to a string holding the label for that edge.
+    edge_label_options : string or dict
+        The name of the edge attribute on `G` that holds the label options for
+        each edge. For example, "[sloped,above,blue]". The default is no options.
+        Or a dict keyed by edge to a string holding the label options for that edge.
+    caption : string
+        The caption string for the figure environment
+    latex_label : string
+        The latex label used for the figure for easy referral from the main text
+    sub_captions : list of strings
+        The sub_caption string for each subfigure in the figure
+    sub_labels : list of strings
+        The latex label for each subfigure in the figure
+    n_rows : int
+        The number of rows of subfigures to arrange for multiple graphs
+    as_document : bool
+        Whether to wrap the latex code in a document environment for compiling
+    document_wrapper : formatted text string with variable ``content``.
+        This text is formatted with the ``content`` variable to embed the figure
+        in a document environment with a preamble setting up TikZ.
+    figure_wrapper : formatted text string
+        This text is evaluated with variables ``content``, ``caption`` and ``label``.
+        It wraps the content and if a caption is provided, adds the latex code for
+        that caption, and if a label is provided, adds the latex code for a label.
+    subfigure_wrapper : formatted text string
+        This text is evaluated with variables ``size``, ``content``, ``caption`` and ``label``.
+        It wraps the content and if a caption is provided, adds the latex code for
+        that caption, and if a label is provided, adds the latex code for a label.
+        The size is the width of each subfigure as a fraction of ``\\textwidth``.
+
+    Returns
+    =======
+    latex_code : string
+        The text string which draws the desired graph(s) when compiled by LaTeX.
+
+    See Also
+    ========
+    write_latex
+    to_latex_raw
+    """
+    if hasattr(Gbunch, "adj"):
+        raw = to_latex_raw(
+            Gbunch,
+            pos,
+            tikz_options,
+            default_node_options,
+            node_options,
+            node_label,
+            default_edge_options,
+            edge_options,
+            edge_label,
+            edge_label_options,
+        )
+    else:  # iterator of graphs
+        sbf = subfigure_wrapper
+        size = 1 / n_rows
+
+        N = len(Gbunch)
+        if isinstance(pos, (str, dict)):
+            pos = [pos] * N
+        if sub_captions is None:
+            sub_captions = [""] * N
+        if sub_labels is None:
+            sub_labels = [""] * N
+        if not (len(Gbunch) == len(pos) == len(sub_captions) == len(sub_labels)):
+            raise nx.NetworkXError(
+                "length of Gbunch, sub_captions and sub_figures must agree"
+            )
+
+        raw = ""
+        for G, pos, subcap, sublbl in zip(Gbunch, pos, sub_captions, sub_labels):
+            subraw = to_latex_raw(
+                G,
+                pos,
+                tikz_options,
+                default_node_options,
+                node_options,
+                node_label,
+                default_edge_options,
+                edge_options,
+                edge_label,
+                edge_label_options,
+            )
+            cap = f"    \\caption{{{subcap}}}" if subcap else ""
+            lbl = f"\\label{{{sublbl}}}" if sublbl else ""
+            raw += sbf.format(size=size, content=subraw, caption=cap, label=lbl)
+            raw += "\n"
+
+    # put raw latex code into a figure environment and optionally into a document
+    raw = raw[:-1]
+    cap = f"\n  \\caption{{{caption}}}" if caption else ""
+    lbl = f"\\label{{{latex_label}}}" if latex_label else ""
+    fig = figure_wrapper.format(content=raw, caption=cap, label=lbl)
+    if as_document:
+        return document_wrapper.format(content=fig)
+    return fig
+
+
+@nx.utils.open_file(1, mode="w")
+def write_latex(Gbunch, path, **options):
+    """Write the latex code to draw the graph(s) onto `path`.
+
+    This convenience function creates the latex drawing code as a string
+    and writes that to a file ready to be compiled when `as_document` is True
+    or ready to be pulled into your main LaTeX document via ``\\input`` or ``\\include``.
+
+    The `path` argument can be a string filename or a file handle to write to.
+
+    Parameters
+    ----------
+    Gbunch : NetworkX graph or iterable of NetworkX graphs
+        If Gbunch is a graph, it is drawn in a figure environment.
+        If Gbunch is an iterable of graphs, each is drawn in a subfigure
+        environment within a single figure environment.
+    path : filename
+        Filename or file handle to write to
+    options : dict
+        Keyword options passed through to ``to_latex`` (unrecognized options are ignored):
+
+        pos : string or dict or list
+            The name of the node attribute on `G` that holds the position of each node.
+            Positions can be sequences of length 2 with numbers for (x,y) coordinates.
+            They can also be strings to denote positions in TikZ style, such as (x, y)
+            or (angle:radius).
+            If a dict, it should be keyed by node to a position.
+            If an empty dict, a circular layout is computed by TikZ.
+            If you are drawing many graphs in subfigures, use a list of position dicts.
+        tikz_options : string
+            The tikzpicture options description defining the options for the picture.
+            Often large scale options like `[scale=2]`.
+        default_node_options : string
+            The draw options for a path of nodes. Individual node options override these.
+        node_options : string or dict
+            The name of the node attribute on `G` that holds the options for each node.
+            Or a dict keyed by node to a string holding the options for that node.
+        node_label : string or dict
+            The name of the node attribute on `G` that holds the node label (text)
+            displayed for each node. If the attribute is "" or not present, the node
+            itself is drawn as a string. LaTeX processing such as ``"$A_1$"`` is allowed.
+            Or a dict keyed by node to a string holding the label for that node.
+        default_edge_options : string
+            The options for the scope drawing all edges. The default is "[-]" for
+            undirected graphs and "[->]" for directed graphs.
+        edge_options : string or dict
+            The name of the edge attribute on `G` that holds the options for each edge.
+            If the edge is a self-loop and ``"loop" not in edge_options`` the option
+            "loop," is added to the options for the self-loop edge. Hence you can
+            use "[loop above]" explicitly, but the default is "[loop]".
+            Or a dict keyed by edge to a string holding the options for that edge.
+        edge_label : string or dict
+            The name of the edge attribute on `G` that holds the edge label (text)
+            displayed for each edge. If the attribute is "" or not present, no edge
+            label is drawn.
+            Or a dict keyed by edge to a string holding the label for that edge.
+        edge_label_options : string or dict
+            The name of the edge attribute on `G` that holds the label options for
+            each edge. For example, "[sloped,above,blue]". The default is no options.
+            Or a dict keyed by edge to a string holding the label options for that edge.
+        caption : string
+            The caption string for the figure environment
+        latex_label : string
+            The latex label used for the figure for easy referral from the main text
+        sub_captions : list of strings
+            The sub_caption string for each subfigure in the figure
+        sub_labels : list of strings
+            The latex label for each subfigure in the figure
+        n_rows : int
+            The number of rows of subfigures to arrange for multiple graphs
+        as_document : bool
+            Whether to wrap the latex code in a document environment for compiling
+        document_wrapper : formatted text string with variable ``content``.
+            This text is formatted with the ``content`` variable to embed the figure
+            in a document environment with a preamble setting up TikZ.
+        figure_wrapper : formatted text string
+            This text is evaluated with variables ``content``, ``caption`` and ``label``.
+            It wraps the content and if a caption is provided, adds the latex code for
+            that caption, and if a label is provided, adds the latex code for a label.
+        subfigure_wrapper : formatted text string
+            This text is evaluated with variables ``size``, ``content``, ``caption`` and ``label``.
+            It wraps the content and if a caption is provided, adds the latex code for
+            that caption, and if a label is provided, adds the latex code for a label.
+            The size is the width of each subfigure as a fraction of ``\\textwidth``.
+
+    See Also
+    ========
+    to_latex
+    """
+    path.write(to_latex(Gbunch, **options))
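As an aside on the new `nx_latex.py` above: the option-handling pattern used throughout `to_latex_raw` (an empty option string stays empty; anything else is stripped of any existing square brackets and re-wrapped) can be sketched in isolation. The helper name `bracketed` is ours for illustration, not part of the diff:

```python
def bracketed(options: str) -> str:
    """Mirror nx_latex's option normalization: an empty string stays
    empty; otherwise strip any existing [] and re-wrap, so bare and
    pre-bracketed option strings produce the same TikZ output."""
    if options == "":
        return ""
    return f"[{options.strip('[]')}]"

print(bracketed("scale=2"))    # [scale=2]
print(bracketed("[scale=2]"))  # [scale=2]
print(bracketed(""))           # (empty string)
```

This is why the docstrings say you can pass either `"scale=2"` or `"[scale=2]"` as `tikz_options` and get identical pictures.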
diff --git a/networkx/drawing/nx_pydot.py b/networkx/drawing/nx_pydot.py
index 2055eb3..93513b0 100644
--- a/networkx/drawing/nx_pydot.py
+++ b/networkx/drawing/nx_pydot.py
@@ -47,7 +47,7 @@ def write_dot(G, path):
         "nx.nx_agraph.write_dot instead.\n\n"
         "See https://github.com/networkx/networkx/issues/5723"
     )
-    warnings.warn(msg, PendingDeprecationWarning, stacklevel=2)
+    warnings.warn(msg, DeprecationWarning, stacklevel=2)
     P = to_pydot(G)
     path.write(P.to_string())
     return
@@ -84,7 +84,7 @@ def read_dot(path):
         "nx.nx_agraph.read_dot instead.\n\n"
         "See https://github.com/networkx/networkx/issues/5723"
     )
-    warnings.warn(msg, PendingDeprecationWarning, stacklevel=2)
+    warnings.warn(msg, DeprecationWarning, stacklevel=2)
 
     data = path.read()
 
@@ -123,7 +123,7 @@ def from_pydot(P):
         "known issues and is not actively maintained.\n\n"
         "See https://github.com/networkx/networkx/issues/5723"
     )
-    warnings.warn(msg, PendingDeprecationWarning, stacklevel=2)
+    warnings.warn(msg, DeprecationWarning, stacklevel=2)
 
     if P.get_strict(None):  # pydot bug: get_strict() shouldn't take argument
         multiedges = False
@@ -223,7 +223,7 @@ def to_pydot(N):
         "known issues and is not actively maintained.\n\n"
         "See https://github.com/networkx/networkx/issues/5723"
     )
-    warnings.warn(msg, PendingDeprecationWarning, stacklevel=2)
+    warnings.warn(msg, DeprecationWarning, stacklevel=2)
 
     # set Graphviz graph type
     if N.is_directed():
@@ -352,7 +352,7 @@ def graphviz_layout(G, prog="neato", root=None):
         "nx.nx_agraph.graphviz_layout instead.\n\n"
         "See https://github.com/networkx/networkx/issues/5723"
     )
-    warnings.warn(msg, PendingDeprecationWarning, stacklevel=2)
+    warnings.warn(msg, DeprecationWarning, stacklevel=2)
 
     return pydot_layout(G=G, prog=prog, root=root)
 
@@ -402,7 +402,7 @@ def pydot_layout(G, prog="neato", root=None):
         "known issues and is not actively maintained.\n\n"
         "See https://github.com/networkx/networkx/issues/5723"
     )
-    warnings.warn(msg, PendingDeprecationWarning, stacklevel=2)
+    warnings.warn(msg, DeprecationWarning, stacklevel=2)
     P = to_pydot(G)
     if root is not None:
         P.set("root", str(root))
diff --git a/networkx/drawing/nx_pylab.py b/networkx/drawing/nx_pylab.py
index 09e5ed3..5e22f84 100644
--- a/networkx/drawing/nx_pylab.py
+++ b/networkx/drawing/nx_pylab.py
@@ -522,8 +522,11 @@ def draw_networkx_edges(
         Also, `(offset, onoffseq)` tuples can be used as style instead of a strings.
         (See `matplotlib.patches.FancyArrowPatch`: `linestyle`)
 
-    alpha : float or None (default=None)
-        The edge transparency
+    alpha : float or array of floats (default=None)
+        The edge transparency.  This can be a single alpha value,
+        in which case it will be applied to all specified edges. Otherwise,
+        if it is an array, the elements of alpha will be applied to the colors
+        in order (cycling through alpha multiple times if necessary).
 
     edge_cmap : Matplotlib colormap, optional
         Colormap for mapping intensities of edges
@@ -575,14 +578,14 @@ def draw_networkx_edges(
         Label for legend
 
     min_source_margin : int (default=0)
-        The minimum margin (gap) at the begining of the edge at the source.
+        The minimum margin (gap) at the beginning of the edge at the source.
 
     min_target_margin : int (default=0)
         The minimum margin (gap) at the end of the edge at the target.
 
     Returns
     -------
-     matplotlib.colections.LineCollection or a list of matplotlib.patches.FancyArrowPatch
+     matplotlib.collections.LineCollection or a list of matplotlib.patches.FancyArrowPatch
         If ``arrows=True``, a list of FancyArrowPatches is returned.
         If ``arrows=False``, a LineCollection is returned.
         If ``arrows=None`` (the default), then a LineCollection is returned if
@@ -650,6 +653,42 @@ def draw_networkx_edges(
     # undirected graphs (for performance reasons) and use FancyArrowPatches
     # for directed graphs.
     # The `arrows` keyword can be used to override the default behavior
+    use_linecollection = not G.is_directed()
+    if arrows in (True, False):
+        use_linecollection = not arrows
+
+    # Some kwargs only apply to FancyArrowPatches. Warn users when they use
+    # non-default values for these kwargs when LineCollection is being used
+    # instead of silently ignoring the specified option
+    if use_linecollection and any(
+        [
+            arrowstyle is not None,
+            arrowsize != 10,
+            connectionstyle != "arc3",
+            min_source_margin != 0,
+            min_target_margin != 0,
+        ]
+    ):
+        import warnings
+
+        msg = (
+            "\n\nThe {0} keyword argument is not applicable when drawing edges\n"
+            "with LineCollection.\n\n"
+            "To make this warning go away, either specify `arrows=True` to\n"
+            "force FancyArrowPatches or use the default value for {0}.\n"
+            "Note that using FancyArrowPatches may be slow for large graphs.\n"
+        )
+        if arrowstyle is not None:
+            msg = msg.format("arrowstyle")
+        if arrowsize != 10:
+            msg = msg.format("arrowsize")
+        if connectionstyle != "arc3":
+            msg = msg.format("connectionstyle")
+        if min_source_margin != 0:
+            msg = msg.format("min_source_margin")
+        if min_target_margin != 0:
+            msg = msg.format("min_target_margin")
+        warnings.warn(msg, category=UserWarning, stacklevel=2)
 
     if arrowstyle == None:
         if G.is_directed():
@@ -657,10 +696,6 @@ def draw_networkx_edges(
         else:
             arrowstyle = "-"
 
-    use_linecollection = not G.is_directed()
-    if arrows in (True, False):
-        use_linecollection = not arrows
-
     if ax is None:
         ax = plt.gca()
 
@@ -749,7 +784,7 @@ def draw_networkx_edges(
                 # is 0, e.g. for a single node. In this case, fall back to scaling
                 # by the maximum node size
                 selfloop_ht = 0.005 * max_nodesize if h == 0 else h
-                # this is called with _screen space_ values so covert back
+                # this is called with _screen space_ values so convert back
                 # to data space
                 data_loc = ax.transData.inverted().transform(posA)
                 v_shift = 0.1 * selfloop_ht
@@ -1077,7 +1112,7 @@ def draw_networkx_edge_labels(
     ax : Matplotlib Axes object, optional
         Draw the graph in the specified Matplotlib axes.
 
-    rotate : bool (deafult=True)
+    rotate : bool (default=True)
         Rotate edge labels to lie parallel to edges
 
     clip_on : bool (default=True)
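The hunk above moves the LineCollection-vs-FancyArrowPatch decision ahead of the new warning check, so inapplicable keyword arguments are reported instead of silently ignored. A minimal, networkx-free sketch of that dispatch logic (the standalone function name is hypothetical; the defaults are copied from the hunk):

```python
import warnings

# Defaults copied from the hunk above; a warning fires only when a
# non-default value is supplied while LineCollection is in use
_FAP_DEFAULTS = {
    "arrowstyle": None,
    "arrowsize": 10,
    "connectionstyle": "arc3",
    "min_source_margin": 0,
    "min_target_margin": 0,
}


def uses_linecollection(directed, arrows=None, **kwargs):
    """Return True when edges would be drawn with LineCollection,
    warning about FancyArrowPatch-only kwargs in that case."""
    use_linecollection = not directed
    if arrows in (True, False):
        use_linecollection = not arrows
    if use_linecollection:
        for name, default in _FAP_DEFAULTS.items():
            if kwargs.get(name, default) != default:
                warnings.warn(
                    f"The {name} keyword argument is not applicable "
                    "when drawing edges with LineCollection.",
                    category=UserWarning,
                    stacklevel=2,
                )
    return use_linecollection
```

As in the patch, passing `arrows=True` forces FancyArrowPatches and suppresses the warning entirely.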
diff --git a/networkx/drawing/tests/test_latex.py b/networkx/drawing/tests/test_latex.py
new file mode 100644
index 0000000..14ab542
--- /dev/null
+++ b/networkx/drawing/tests/test_latex.py
@@ -0,0 +1,292 @@
+import pytest
+
+import networkx as nx
+
+
+def test_tikz_attributes():
+    G = nx.path_graph(4, create_using=nx.DiGraph)
+    pos = {n: (n, n) for n in G}
+
+    G.add_edge(0, 0)
+    G.edges[(0, 0)]["label"] = "Loop"
+    G.edges[(0, 0)]["label_options"] = "midway"
+
+    G.nodes[0]["style"] = "blue"
+    G.nodes[1]["style"] = "line width=3,draw"
+    G.nodes[2]["style"] = "circle,draw,blue!50"
+    G.nodes[3]["label"] = "Stop"
+    G.edges[(0, 1)]["label"] = "1st Step"
+    G.edges[(0, 1)]["label_options"] = "near end"
+    G.edges[(2, 3)]["label"] = "3rd Step"
+    G.edges[(2, 3)]["label_options"] = "near start"
+    G.edges[(2, 3)]["style"] = "bend left,green"
+    G.edges[(1, 2)]["label"] = "2nd"
+    G.edges[(1, 2)]["label_options"] = "pos=0.5"
+    G.edges[(1, 2)]["style"] = ">->,bend right,line width=3,green!90"
+
+    output_tex = nx.to_latex(
+        G,
+        pos=pos,
+        as_document=False,
+        tikz_options="[scale=3]",
+        node_options="style",
+        edge_options="style",
+        node_label="label",
+        edge_label="label",
+        edge_label_options="label_options",
+    )
+    expected_tex = r"""\begin{figure}
+  \begin{tikzpicture}[scale=3]
+      \draw
+        (0, 0) node[blue] (0){0}
+        (1, 1) node[line width=3,draw] (1){1}
+        (2, 2) node[circle,draw,blue!50] (2){2}
+        (3, 3) node (3){Stop};
+      \begin{scope}[->]
+        \draw (0) to node[near end] {1st Step} (1);
+        \draw[loop,] (0) to node[midway] {Loop} (0);
+        \draw[>->,bend right,line width=3,green!90] (1) to node[pos=0.5] {2nd} (2);
+        \draw[bend left,green] (2) to node[near start] {3rd Step} (3);
+      \end{scope}
+    \end{tikzpicture}
+\end{figure}"""
+
+    assert output_tex == expected_tex
+    # print(output_tex)
+    # # Pretty way to assert that A.to_document() == expected_tex
+    # content_same = True
+    # for aa, bb in zip(expected_tex.split("\n"), output_tex.split("\n")):
+    #     if aa != bb:
+    #         content_same = False
+    #         print(f"-{aa}|\n+{bb}|")
+    # assert content_same
+
+
+def test_basic_multiple_graphs():
+    H1 = nx.path_graph(4)
+    H2 = nx.complete_graph(4)
+    H3 = nx.path_graph(8)
+    H4 = nx.complete_graph(8)
+    captions = [
+        "Path on 4 nodes",
+        "Complete graph on 4 nodes",
+        "Path on 8 nodes",
+        "Complete graph on 8 nodes",
+    ]
+    labels = ["fig2a", "fig2b", "fig2c", "fig2d"]
+    latex_code = nx.to_latex(
+        [H1, H2, H3, H4],
+        n_rows=2,
+        sub_captions=captions,
+        sub_labels=labels,
+    )
+    # print(latex_code)
+    assert "begin{document}" in latex_code
+    assert "begin{figure}" in latex_code
+    assert latex_code.count("begin{subfigure}") == 4
+    assert latex_code.count("tikzpicture") == 8
+    assert latex_code.count("[-]") == 4
+
+
+def test_basic_tikz():
+    expected_tex = r"""\documentclass{report}
+\usepackage{tikz}
+\usepackage{subcaption}
+
+\begin{document}
+\begin{figure}
+  \begin{subfigure}{0.5\textwidth}
+  \begin{tikzpicture}[scale=2]
+      \draw[gray!90]
+        (0.749, 0.702) node[red!90] (0){0}
+        (1.0, -0.014) node[red!90] (1){1}
+        (-0.777, -0.705) node (2){2}
+        (-0.984, 0.042) node (3){3}
+        (-0.028, 0.375) node[cyan!90] (4){4}
+        (-0.412, 0.888) node (5){5}
+        (0.448, -0.856) node (6){6}
+        (0.003, -0.431) node[cyan!90] (7){7};
+      \begin{scope}[->,gray!90]
+        \draw (0) to (4);
+        \draw (0) to (5);
+        \draw (0) to (6);
+        \draw (0) to (7);
+        \draw (1) to (4);
+        \draw (1) to (5);
+        \draw (1) to (6);
+        \draw (1) to (7);
+        \draw (2) to (4);
+        \draw (2) to (5);
+        \draw (2) to (6);
+        \draw (2) to (7);
+        \draw (3) to (4);
+        \draw (3) to (5);
+        \draw (3) to (6);
+        \draw (3) to (7);
+      \end{scope}
+    \end{tikzpicture}
+    \caption{My tikz number 1 of 2}\label{tikz_1_2}
+  \end{subfigure}
+  \begin{subfigure}{0.5\textwidth}
+  \begin{tikzpicture}[scale=2]
+      \draw[gray!90]
+        (0.749, 0.702) node[green!90] (0){0}
+        (1.0, -0.014) node[green!90] (1){1}
+        (-0.777, -0.705) node (2){2}
+        (-0.984, 0.042) node (3){3}
+        (-0.028, 0.375) node[purple!90] (4){4}
+        (-0.412, 0.888) node (5){5}
+        (0.448, -0.856) node (6){6}
+        (0.003, -0.431) node[purple!90] (7){7};
+      \begin{scope}[->,gray!90]
+        \draw (0) to (4);
+        \draw (0) to (5);
+        \draw (0) to (6);
+        \draw (0) to (7);
+        \draw (1) to (4);
+        \draw (1) to (5);
+        \draw (1) to (6);
+        \draw (1) to (7);
+        \draw (2) to (4);
+        \draw (2) to (5);
+        \draw (2) to (6);
+        \draw (2) to (7);
+        \draw (3) to (4);
+        \draw (3) to (5);
+        \draw (3) to (6);
+        \draw (3) to (7);
+      \end{scope}
+    \end{tikzpicture}
+    \caption{My tikz number 2 of 2}\label{tikz_2_2}
+  \end{subfigure}
+  \caption{A graph generated with python and latex.}
+\end{figure}
+\end{document}"""
+
+    edges = [
+        (0, 4),
+        (0, 5),
+        (0, 6),
+        (0, 7),
+        (1, 4),
+        (1, 5),
+        (1, 6),
+        (1, 7),
+        (2, 4),
+        (2, 5),
+        (2, 6),
+        (2, 7),
+        (3, 4),
+        (3, 5),
+        (3, 6),
+        (3, 7),
+    ]
+    G = nx.DiGraph()
+    G.add_nodes_from(range(8))
+    G.add_edges_from(edges)
+    pos = {
+        0: (0.7490296171687696, 0.702353520257394),
+        1: (1.0, -0.014221357723796535),
+        2: (-0.7765783344161441, -0.7054170966808919),
+        3: (-0.9842690223417624, 0.04177547602465483),
+        4: (-0.02768523817180917, 0.3745724439551441),
+        5: (-0.41154855146767433, 0.8880106515525136),
+        6: (0.44780153389148264, -0.8561492709269164),
+        7: (0.0032499953371383505, -0.43092436645809945),
+    }
+
+    rc_node_color = {0: "red!90", 1: "red!90", 4: "cyan!90", 7: "cyan!90"}
+    gp_node_color = {0: "green!90", 1: "green!90", 4: "purple!90", 7: "purple!90"}
+
+    H = G.copy()
+    nx.set_node_attributes(G, rc_node_color, "color")
+    nx.set_node_attributes(H, gp_node_color, "color")
+
+    sub_captions = ["My tikz number 1 of 2", "My tikz number 2 of 2"]
+    sub_labels = ["tikz_1_2", "tikz_2_2"]
+
+    output_tex = nx.to_latex(
+        [G, H],
+        [pos, pos],
+        tikz_options="[scale=2]",
+        default_node_options="gray!90",
+        default_edge_options="gray!90",
+        node_options="color",
+        sub_captions=sub_captions,
+        sub_labels=sub_labels,
+        caption="A graph generated with python and latex.",
+        n_rows=2,
+        as_document=True,
+    )
+
+    assert output_tex == expected_tex
+    # print(output_tex)
+    # # Pretty way to assert that A.to_document() == expected_tex
+    # content_same = True
+    # for aa, bb in zip(expected_tex.split("\n"), output_tex.split("\n")):
+    #     if aa != bb:
+    #         content_same = False
+    #         print(f"-{aa}|\n+{bb}|")
+    # assert content_same
+
+
+def test_exception_pos_single_graph(to_latex=nx.to_latex):
+    # smoke test that pos can be a string
+    G = nx.path_graph(4)
+    to_latex(G, pos="pos")
+
+    # must include all nodes
+    pos = {0: (1, 2), 1: (0, 1), 2: (2, 1)}
+    with pytest.raises(nx.NetworkXError):
+        to_latex(G, pos)
+
+    # must have 2 values
+    pos[3] = (1, 2, 3)
+    with pytest.raises(nx.NetworkXError):
+        to_latex(G, pos)
+    pos[3] = 2
+    with pytest.raises(nx.NetworkXError):
+        to_latex(G, pos)
+
+    # check that passes with 2 values
+    pos[3] = (3, 2)
+    to_latex(G, pos)
+
+
+def test_exception_multiple_graphs(to_latex=nx.to_latex):
+    G = nx.path_graph(3)
+    pos_bad = {0: (1, 2), 1: (0, 1)}
+    pos_OK = {0: (1, 2), 1: (0, 1), 2: (2, 1)}
+    fourG = [G, G, G, G]
+    fourpos = [pos_OK, pos_OK, pos_OK, pos_OK]
+
+    # input single dict to use for all graphs
+    to_latex(fourG, pos_OK)
+    with pytest.raises(nx.NetworkXError):
+        to_latex(fourG, pos_bad)
+
+    # input list of dicts to use for all graphs
+    to_latex(fourG, fourpos)
+    with pytest.raises(nx.NetworkXError):
+        to_latex(fourG, [pos_bad, pos_bad, pos_bad, pos_bad])
+
+    # every pos dict must include all nodes
+    with pytest.raises(nx.NetworkXError):
+        to_latex(fourG, [pos_OK, pos_OK, pos_bad, pos_OK])
+
+    # test sub_captions and sub_labels (len must match Gbunch)
+    with pytest.raises(nx.NetworkXError):
+        to_latex(fourG, fourpos, sub_captions=["hi", "hi"])
+
+    with pytest.raises(nx.NetworkXError):
+        to_latex(fourG, fourpos, sub_labels=["hi", "hi"])
+
+    # all pass
+    to_latex(fourG, fourpos, sub_captions=["hi"] * 4, sub_labels=["lbl"] * 4)
+
+
+def test_exception_multigraph():
+    G = nx.path_graph(4, create_using=nx.MultiGraph)
+    G.add_edge(1, 2)
+    with pytest.raises(nx.NetworkXNotImplemented):
+        nx.to_latex(G)
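The exception tests above pin down the position checks in `nx.to_latex`: every node needs an entry in `pos`, and each entry must unpack to exactly two values. A standalone sketch of that validation (the helper name and the use of `ValueError` in place of `nx.NetworkXError` are assumptions for illustration):

```python
def validate_pos(nodes, pos):
    """Raise ValueError unless every node has a 2-valued position."""
    missing = [n for n in nodes if n not in pos]
    if missing:
        raise ValueError(f"pos is missing nodes: {missing}")
    for n in nodes:
        try:
            x, y = pos[n]  # must unpack to exactly two values
        except (TypeError, ValueError) as err:
            raise ValueError(f"pos[{n!r}] must have exactly 2 values") from err
```

A 3-tuple fails with too many values, while a bare scalar fails because it cannot be unpacked at all; both surface as the same error, mirroring the two failure cases the tests check.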
diff --git a/networkx/drawing/tests/test_layout.py b/networkx/drawing/tests/test_layout.py
index f24d003..faf734c 100644
--- a/networkx/drawing/tests/test_layout.py
+++ b/networkx/drawing/tests/test_layout.py
@@ -66,6 +66,7 @@ class TestLayout:
         nx.kamada_kawai_layout(G)
         nx.kamada_kawai_layout(G, dim=1)
         nx.kamada_kawai_layout(G, dim=3)
+        nx.arf_layout(G)
 
     def test_smoke_string(self):
         G = self.Gs
@@ -80,6 +81,7 @@ class TestLayout:
         nx.kamada_kawai_layout(G)
         nx.kamada_kawai_layout(G, dim=1)
         nx.kamada_kawai_layout(G, dim=3)
+        nx.arf_layout(G)
 
     def check_scale_and_center(self, pos, scale, center):
         center = np.array(center)
@@ -175,6 +177,10 @@ class TestLayout:
         pos = nx.circular_layout(self.Gi)
         npos = nx.fruchterman_reingold_layout(self.Gi, pos=pos)
 
+    def test_smoke_initial_pos_arf(self):
+        pos = nx.circular_layout(self.Gi)
+        npos = nx.arf_layout(self.Gi, pos=pos)
+
     def test_fixed_node_fruchterman_reingold(self):
         # Dense version (numpy based)
         pos = nx.circular_layout(self.Gi)
@@ -242,6 +248,8 @@ class TestLayout:
         assert vpos == {}
         vpos = nx.kamada_kawai_layout(G, center=(1, 1))
         assert vpos == {}
+        vpos = nx.arf_layout(G)
+        assert vpos == {}
 
     def test_bipartite_layout(self):
         G = nx.complete_bipartite_graph(3, 5)
@@ -402,7 +410,6 @@ class TestLayout:
         for k, v in expectation.items():
             assert (s_vpos[k] == v).all()
         s_vpos = nx.rescale_layout_dict(vpos, scale=2)
-
         expectation = {
             0: np.array((-2, -2)),
             1: np.array((2, 2)),
@@ -411,6 +418,24 @@ class TestLayout:
         for k, v in expectation.items():
             assert (s_vpos[k] == v).all()
 
+    def test_arf_layout_partial_input_test(self):
+        """
+        Checks whether partial pos input still returns a proper position.
+        """
+        G = self.Gs
+        node = nx.utils.arbitrary_element(G)
+        pos = nx.circular_layout(G)
+        del pos[node]
+        pos = nx.arf_layout(G, pos=pos)
+        assert len(pos) == len(G)
+
+    def test_arf_layout_negative_a_check(self):
+        """
+        Checks that invalid input parameters raise errors; e.g. `a` must be larger than 1.
+        """
+        G = self.Gs
+        pytest.raises(ValueError, nx.arf_layout, G=G, a=-1)
+
 
 def test_multipartite_layout_nonnumeric_partition_labels():
     """See gh-5123."""
diff --git a/networkx/drawing/tests/test_pylab.py b/networkx/drawing/tests/test_pylab.py
index f642dcc..cef2702 100644
--- a/networkx/drawing/tests/test_pylab.py
+++ b/networkx/drawing/tests/test_pylab.py
@@ -1,6 +1,7 @@
 """Unit tests for matplotlib drawing functions."""
 import itertools
 import os
+import warnings
 
 import pytest
 
@@ -396,6 +397,7 @@ def test_labels_and_colors():
         G,
         pos,
         edgelist=[(4, 5), (5, 6), (6, 7), (7, 4)],
+        arrows=True,
         min_source_margin=0.5,
         min_target_margin=0.75,
         width=8,
@@ -752,3 +754,38 @@ def test_draw_networkx_edges_undirected_selfloop_colors():
     for fap, clr, slp in zip(ax.patches, edge_colors[-3:], sl_points):
         assert fap.get_path().contains_point(slp)
         assert mpl.colors.same_color(fap.get_edgecolor(), clr)
+    plt.delaxes(ax)
+
+
+@pytest.mark.parametrize(
+    "fap_only_kwarg",  # Non-default values for kwargs that only apply to FAPs
+    (
+        {"arrowstyle": "-"},
+        {"arrowsize": 20},
+        {"connectionstyle": "arc3,rad=0.2"},
+        {"min_source_margin": 10},
+        {"min_target_margin": 10},
+    ),
+)
+def test_user_warnings_for_unused_edge_drawing_kwargs(fap_only_kwarg):
+    """Users should get a warning when they specify a non-default value for
+    one of the kwargs that applies only to edges drawn with FancyArrowPatches,
+    but FancyArrowPatches aren't being used under the hood."""
+    G = nx.path_graph(3)
+    pos = {n: (n, n) for n in G}
+    fig, ax = plt.subplots()
+    # By default, an undirected graph will use LineCollection to represent
+    # the edges
+    kwarg_name = list(fap_only_kwarg.keys())[0]
+    with pytest.warns(
+        UserWarning, match=f"\n\nThe {kwarg_name} keyword argument is not applicable"
+    ):
+        nx.draw_networkx_edges(G, pos, ax=ax, **fap_only_kwarg)
+    # FancyArrowPatches are always used when `arrows=True` is specified.
+    # Check that warnings are *not* raised in this case
+    with warnings.catch_warnings():
+        # Escalate warnings -> errors so tests fail if warnings are raised
+        warnings.simplefilter("error")
+        nx.draw_networkx_edges(G, pos, ax=ax, arrows=True, **fap_only_kwarg)
+
+    plt.delaxes(ax)
diff --git a/networkx/generators/classic.py b/networkx/generators/classic.py
index d83062a..9b5be71 100644
--- a/networkx/generators/classic.py
+++ b/networkx/generators/classic.py
@@ -145,7 +145,26 @@ def balanced_tree(r, h, create_using=None):
 def barbell_graph(m1, m2, create_using=None):
     """Returns the Barbell Graph: two complete graphs connected by a path.
 
-    For $m1 > 1$ and $m2 >= 0$.
+    Parameters
+    ----------
+    m1 : int
+        Size of the left and right barbells, must be greater than 1.
+
+    m2 : int
+        Length of the path connecting the barbells.
+
+    create_using : NetworkX graph constructor, optional (default=nx.Graph)
+       Graph type to create. If graph instance, then cleared before populated.
+       Only undirected Graphs are supported.
+
+    Returns
+    -------
+    G : NetworkX graph
+        A barbell graph.
+
+    Notes
+    -----
+
 
     Two identical complete graphs $K_{m1}$ form the left and right bells,
     and are connected by a path $P_{m2}$.
@@ -177,14 +196,17 @@ def barbell_graph(m1, m2, create_using=None):
     G.add_nodes_from(range(m1, m1 + m2 - 1))
     if m2 > 1:
         G.add_edges_from(pairwise(range(m1, m1 + m2)))
+
     # right barbell
     G.add_edges_from(
         (u, v) for u in range(m1 + m2, 2 * m1 + m2) for v in range(u + 1, 2 * m1 + m2)
     )
+
     # connect it up
     G.add_edge(m1 - 1, m1)
     if m2 > 0:
         G.add_edge(m1 + m2 - 1, m1 + m2)
+
     return G
 
 
@@ -233,6 +255,8 @@ def complete_graph(n, create_using=None):
     n : int or iterable container of nodes
         If n is an integer, nodes are from range(n).
         If n is a container of nodes, those nodes appear in the graph.
+        Warning: n is not checked for duplicates; if duplicates are present,
+        the resulting graph may not be as desired. Make sure you have no duplicates.
     create_using : NetworkX graph constructor, optional (default=nx.Graph)
        Graph type to create. If graph instance, then cleared before populated.
 
@@ -360,6 +384,8 @@ def cycle_graph(n, create_using=None):
     n : int or iterable container of nodes
         If n is an integer, nodes are from `range(n)`.
         If n is a container of nodes, those nodes appear in the graph.
+        Warning: n is not checked for duplicates; if duplicates are present,
+        the resulting graph may not be as desired. Make sure you have no duplicates.
     create_using : NetworkX graph constructor, optional (default=nx.Graph)
        Graph type to create. If graph instance, then cleared before populated.
 
@@ -523,7 +549,9 @@ def lollipop_graph(m, n, create_using=None):
     ----------
     m, n : int or iterable container of nodes (default = 0)
         If an integer, nodes are from `range(m)` and `range(m,m+n)`.
-        If a container, the entries are the coordinate of the node.
+        If a container of nodes, those nodes appear in the graph.
+        Warning: m and n are not checked for duplicates; if duplicates are present,
+        the resulting graph may not be as desired. Make sure you have no duplicates.
 
         The nodes for m appear in the complete graph $K_m$ and the nodes
         for n appear in the path $P_n$
@@ -587,6 +615,8 @@ def path_graph(n, create_using=None):
     n : int or iterable
         If an integer, nodes are 0 to n - 1.
         If an iterable of nodes, in the order they appear in the path.
+        Warning: n is not checked for duplicates; if duplicates are present,
+        the resulting graph may not be as desired. Make sure you have no duplicates.
     create_using : NetworkX graph constructor, optional (default=nx.Graph)
        Graph type to create. If graph instance, then cleared before populated.
 
@@ -608,6 +638,8 @@ def star_graph(n, create_using=None):
     n : int or iterable
         If an integer, node labels are 0 to n with center 0.
         If an iterable of nodes, the center is the first.
+        Warning: n is not checked for duplicates; if duplicates are present,
+        the resulting graph may not be as desired. Make sure you have no duplicates.
     create_using : NetworkX graph constructor, optional (default=nx.Graph)
        Graph type to create. If graph instance, then cleared before populated.
 
@@ -679,6 +711,8 @@ def wheel_graph(n, create_using=None):
     n : int or iterable
         If an integer, node labels are 0 to n with center 0.
         If an iterable of nodes, the center is the first.
+        Warning: n is not checked for duplicates; if duplicates are present,
+        the resulting graph may not be as desired. Make sure you have no duplicates.
     create_using : NetworkX graph constructor, optional (default=nx.Graph)
        Graph type to create. If graph instance, then cleared before populated.
 
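The duplicate-node warning added to the generators above exists because a container of nodes is consumed as-is; duplicates collapse silently and the caller gets a smaller graph than intended. A networkx-free sketch of the effect, using a plain dict-of-sets adjacency (the helper is hypothetical):

```python
from itertools import combinations


def complete_adjacency(nodes):
    """Build a complete-graph adjacency dict; duplicate nodes collapse
    because dict keys are unique."""
    adj = {n: set() for n in nodes}  # duplicates overwrite each other here
    for u, v in combinations(adj, 2):
        adj[u].add(v)
        adj[v].add(u)
    return adj
```

Passing `["a", "b", "a"]` and expecting a triangle yields only a single edge between two nodes, which is exactly the "may not be as desired" outcome the docstrings now warn about.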
diff --git a/networkx/generators/cographs.py b/networkx/generators/cographs.py
index e876358..9f2e35b 100644
--- a/networkx/generators/cographs.py
+++ b/networkx/generators/cographs.py
@@ -25,7 +25,7 @@ def random_cograph(n, seed=None):
     Cographs or $P_4$-free graphs can be obtained from a single vertex
     by disjoint union and complementation operations.
 
-    This generator starts off from a single vertex and performes disjoint
+    This generator starts off from a single vertex and performs disjoint
     union and full join operations on itself.
     The decision on which operation will take place is random.
 
diff --git a/networkx/generators/community.py b/networkx/generators/community.py
index f1f4d67..1379b36 100644
--- a/networkx/generators/community.py
+++ b/networkx/generators/community.py
@@ -132,7 +132,7 @@ def relaxed_caveman_graph(l, k, p, seed=None):
     k : int
       Size of cliques
     p : float
-      Probabilty of rewiring each edge.
+      Probability of rewiring each edge.
     seed : integer, random_state, or None (default)
         Indicator of random number generation state.
         See :ref:`Randomness<randomness>`.
@@ -320,7 +320,7 @@ def gaussian_random_partition_graph(n, s, v, p_in, p_out, directed=False, seed=N
     v : float
       Shape parameter. The variance of cluster size distribution is s/v.
     p_in : float
-      Probabilty of intra cluster connection.
+      Probability of intra cluster connection.
     p_out : float
       Probability of inter cluster connection.
     directed : boolean, optional default=False
diff --git a/networkx/generators/ego.py b/networkx/generators/ego.py
index cca7dfa..d1c126f 100644
--- a/networkx/generators/ego.py
+++ b/networkx/generators/ego.py
@@ -6,6 +6,7 @@ __all__ = ["ego_graph"]
 import networkx as nx
 
 
+@nx._dispatch
 def ego_graph(G, n, radius=1, center=True, undirected=False, distance=None):
     """Returns induced subgraph of neighbors centered at node n within
     a given radius.
diff --git a/networkx/generators/geometric.py b/networkx/generators/geometric.py
index de5fbd4..dcc0512 100644
--- a/networkx/generators/geometric.py
+++ b/networkx/generators/geometric.py
@@ -19,24 +19,6 @@ __all__ = [
 ]
 
 
-def euclidean(x, y):
-    """Returns the Euclidean distance between the vectors ``x`` and ``y``.
-
-    Each of ``x`` and ``y`` can be any iterable of numbers. The
-    iterables must be of the same length.
-
-     .. deprecated:: 2.7
-    """
-    import warnings
-
-    msg = (
-        "euclidean is deprecated and will be removed in 3.0."
-        "Use math.dist(x, y) instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    return math.dist(x, y)
-
-
 def geometric_edges(G, radius, p=2):
     """Returns edge list of node pairs within `radius` of each other.
 
diff --git a/networkx/generators/internet_as_graphs.py b/networkx/generators/internet_as_graphs.py
index c3c1278..830c854 100644
--- a/networkx/generators/internet_as_graphs.py
+++ b/networkx/generators/internet_as_graphs.py
@@ -272,7 +272,7 @@ class AS_graph_generator:
     def add_cp_peering_link(self, cp, to_kind):
         """Add a peering link to a content provider (CP) node.
 
-        Target node j can be CP or M and it is drawn uniformely among the nodes
+        Target node j can be CP or M and it is drawn uniformly among the nodes
         belonging to the same region as cp.
 
         Parameters
diff --git a/networkx/generators/line.py b/networkx/generators/line.py
index 7432c2f..7450750 100644
--- a/networkx/generators/line.py
+++ b/networkx/generators/line.py
@@ -246,7 +246,7 @@ def inverse_line_graph(G):
     This is an implementation of the Roussopoulos algorithm.
 
     If G consists of multiple components, then the algorithm doesn't work.
-    You should invert every component seperately:
+    You should invert every component separately:
 
     >>> K5 = nx.complete_graph(5)
     >>> P4 = nx.Graph([("a", "b"), ("b", "c"), ("c", "d")])
@@ -274,7 +274,14 @@ def inverse_line_graph(G):
     elif G.number_of_nodes() > 1 and G.number_of_edges() == 0:
         msg = (
             "inverse_line_graph() doesn't work on an edgeless graph. "
-            "Please use this function on each component seperately."
+            "Please use this function on each component separately."
+        )
+        raise nx.NetworkXError(msg)
+
+    if nx.number_of_selfloops(G) != 0:
+        msg = (
+            "A line graph as generated by NetworkX has no selfloops, so G has no "
+            "inverse line graph. Please remove the selfloops from G and try again."
         )
         raise nx.NetworkXError(msg)
 
@@ -380,13 +387,9 @@ def _find_partition(G, starting_cell):
     partitioned_vertices = list(starting_cell)
     while G_partition.number_of_edges() > 0:
         # there are still edges left and so more cells to be made
-        u = partitioned_vertices[-1]
+        u = partitioned_vertices.pop()
         deg_u = len(G_partition[u])
-        if deg_u == 0:
-            # if u has no edges left in G_partition then we have found
-            # all of its cells so we do not need to keep looking
-            partitioned_vertices.pop()
-        else:
+        if deg_u != 0:
             # if u still has edges then we need to find its other cell
             # this other cell must be a complete subgraph or else G is
             # not a line graph
@@ -433,6 +436,8 @@ def _select_starting_cell(G, starting_edge=None):
         e = arbitrary_element(G.edges())
     else:
         e = starting_edge
+        if e[0] not in G.nodes():
+            raise nx.NetworkXError(f"Vertex {e[0]} not in graph")
         if e[1] not in G[e[0]]:
             msg = f"starting_edge ({e[0]}, {e[1]}) is not in the Graph"
             raise nx.NetworkXError(msg)
@@ -447,10 +452,10 @@ def _select_starting_cell(G, starting_edge=None):
         T = e_triangles[0]
         a, b, c = T
         # ab was original edge so check the other 2 edges
-        ac_edges = [x for x in _triangles(G, (a, c))]
-        bc_edges = [x for x in _triangles(G, (b, c))]
-        if len(ac_edges) == 1:
-            if len(bc_edges) == 1:
+        ac_edges = len(_triangles(G, (a, c)))
+        bc_edges = len(_triangles(G, (b, c)))
+        if ac_edges == 1:
+            if bc_edges == 1:
                 starting_cell = T
             else:
                 return _select_starting_cell(G, starting_edge=(b, c))
@@ -469,29 +474,22 @@ def _select_starting_cell(G, starting_edge=None):
             starting_cell = T
         elif r - 1 <= s <= r:
             # check if odd triangles containing e form complete subgraph
-            # there must be exactly s+2 of them
-            # and they must all be connected
             triangle_nodes = set()
             for T in odd_triangles:
                 for x in T:
                     triangle_nodes.add(x)
-            if len(triangle_nodes) == s + 2:
-                for u in triangle_nodes:
-                    for v in triangle_nodes:
-                        if u != v and (v not in G[u]):
-                            msg = (
-                                "G is not a line graph (odd triangles "
-                                "do not form complete subgraph)"
-                            )
-                            raise nx.NetworkXError(msg)
-                # otherwise then we can use this as the starting cell
-                starting_cell = tuple(triangle_nodes)
-            else:
-                msg = (
-                    "G is not a line graph (odd triangles "
-                    "do not form complete subgraph)"
-                )
-                raise nx.NetworkXError(msg)
+
+            for u in triangle_nodes:
+                for v in triangle_nodes:
+                    if u != v and (v not in G[u]):
+                        msg = (
+                            "G is not a line graph (odd triangles "
+                            "do not form complete subgraph)"
+                        )
+                        raise nx.NetworkXError(msg)
+            # otherwise then we can use this as the starting cell
+            starting_cell = tuple(triangle_nodes)
+
         else:
             msg = (
                 "G is not a line graph (incorrect number of "
diff --git a/networkx/generators/small.py b/networkx/generators/small.py
index 7b4dd57..2eb4e6f 100644
--- a/networkx/generators/small.py
+++ b/networkx/generators/small.py
@@ -4,7 +4,6 @@ Various small and named graphs, together with some compact generators.
 """
 
 __all__ = [
-    "make_small_graph",
     "LCF_graph",
     "bull_graph",
     "chvatal_graph",
@@ -60,115 +59,6 @@ def _raise_on_directed(func):
     return wrapper
 
 
-def make_small_undirected_graph(graph_description, create_using=None):
-    """
-    Return a small undirected graph described by graph_description.
-
-    .. deprecated:: 2.7
-
-       make_small_undirected_graph is deprecated and will be removed in
-       version 3.0. If "ltype" == "adjacencylist", convert the list to a dict
-       and use `from_dict_of_lists`. If "ltype" == "edgelist", use
-       `from_edgelist`.
-
-    See make_small_graph.
-    """
-    import warnings
-
-    msg = (
-        "\n\nmake_small_undirected_graph is deprecated and will be removed in "
-        "version 3.0.\n"
-        "If `ltype` == 'adjacencylist', convert `xlist` to a dict and use\n"
-        "`from_dict_of_lists` instead.\n"
-        "If `ltype` == 'edgelist', use `from_edgelist` instead."
-    )
-    warnings.warn(msg, category=DeprecationWarning, stacklevel=2)
-
-    G = empty_graph(0, create_using)
-    if G.is_directed():
-        raise NetworkXError("Directed Graph not supported")
-    return make_small_graph(graph_description, G)
-
-
-def make_small_graph(graph_description, create_using=None):
-    """
-    Return the small graph described by graph_description.
-
-    .. deprecated:: 2.7
-
-       make_small_graph is deprecated and will be removed in
-       version 3.0. If "ltype" == "adjacencylist", convert the list to a dict
-       and use `from_dict_of_lists`. If "ltype" == "edgelist", use
-       `from_edgelist`.
-
-    graph_description is a list of the form [ltype,name,n,xlist]
-
-    Here ltype is one of "adjacencylist" or "edgelist",
-    name is the name of the graph and n the number of nodes.
-    This constructs a graph of n nodes with integer labels 0,..,n-1.
-
-    If ltype="adjacencylist"  then xlist is an adjacency list
-    with exactly n entries, in with the j'th entry (which can be empty)
-    specifies the nodes connected to vertex j.
-    e.g. the "square" graph C_4 can be obtained by
-
-    >>> G = nx.make_small_graph(
-    ...     ["adjacencylist", "C_4", 4, [[2, 4], [1, 3], [2, 4], [1, 3]]]
-    ... )
-
-    or, since we do not need to add edges twice,
-
-    >>> G = nx.make_small_graph(["adjacencylist", "C_4", 4, [[2, 4], [3], [4], []]])
-
-    If ltype="edgelist" then xlist is an edge list
-    written as [[v1,w2],[v2,w2],...,[vk,wk]],
-    where vj and wj integers in the range 1,..,n
-    e.g. the "square" graph C_4 can be obtained by
-
-    >>> G = nx.make_small_graph(
-    ...     ["edgelist", "C_4", 4, [[1, 2], [3, 4], [2, 3], [4, 1]]]
-    ... )
-
-    Use the create_using argument to choose the graph class/type.
-    """
-    import warnings
-
-    msg = (
-        "\n\nmake_small_graph is deprecated and will be removed in version 3.0.\n"
-        "If `ltype` == 'adjacencylist', convert `xlist` to a dict and use\n"
-        "`from_dict_of_lists` instead.\n"
-        "If `ltype` == 'edgelist', use `from_edgelist` instead."
-    )
-    warnings.warn(msg, category=DeprecationWarning, stacklevel=2)
-
-    if graph_description[0] not in ("adjacencylist", "edgelist"):
-        raise NetworkXError("ltype must be either adjacencylist or edgelist")
-
-    ltype = graph_description[0]
-    name = graph_description[1]
-    n = graph_description[2]
-
-    G = empty_graph(n, create_using)
-    nodes = G.nodes()
-
-    if ltype == "adjacencylist":
-        adjlist = graph_description[3]
-        if len(adjlist) != n:
-            raise NetworkXError("invalid graph_description")
-        G.add_edges_from([(u - 1, v) for v in nodes for u in adjlist[v]])
-    elif ltype == "edgelist":
-        edgelist = graph_description[3]
-        for e in edgelist:
-            v1 = e[0] - 1
-            v2 = e[1] - 1
-            if v1 < 0 or v1 > n - 1 or v2 < 0 or v2 > n - 1:
-                raise NetworkXError("invalid graph_description")
-            else:
-                G.add_edge(v1, v2)
-    G.name = name
-    return G
-
-
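The removal above follows the deprecation message, which points users at `from_dict_of_lists` and `from_edgelist`. A minimal migration sketch (assuming the 1-based labels used by `make_small_graph` are shifted to the 0-based labels the replacement functions expect):

```python
import networkx as nx

# Old call (1-based labels):
#   nx.make_small_graph(["adjacencylist", "C_4", 4, [[2, 4], [1, 3], [2, 4], [1, 3]]])
# New: convert to a 0-based dict of lists and use from_dict_of_lists.
adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
C4_from_adj = nx.from_dict_of_lists(adj)

# Old call (1-based labels):
#   nx.make_small_graph(["edgelist", "C_4", 4, [[1, 2], [2, 3], [3, 4], [4, 1]]])
# New: shift the edge list to 0-based labels and use from_edgelist.
C4_from_edges = nx.from_edgelist([(0, 1), (1, 2), (2, 3), (3, 0)])

assert nx.is_isomorphic(C4_from_adj, nx.cycle_graph(4))
assert nx.is_isomorphic(C4_from_edges, nx.cycle_graph(4))
```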
 def LCF_graph(n, shift_list, repeats, create_using=None):
     """
     Return the cubic graph specified in LCF notation.
diff --git a/networkx/generators/tests/test_community.py b/networkx/generators/tests/test_community.py
index 9f4a2e4..24af7fd 100644
--- a/networkx/generators/tests/test_community.py
+++ b/networkx/generators/tests/test_community.py
@@ -110,6 +110,10 @@ def test_caveman_graph():
     G = nx.caveman_graph(4, 3)
     assert len(G) == 12
 
+    G = nx.caveman_graph(5, 1)
+    E5 = nx.empty_graph(5)
+    assert nx.is_isomorphic(G, E5)
+
     G = nx.caveman_graph(1, 5)
     K5 = nx.complete_graph(5)
     assert nx.is_isomorphic(G, K5)
@@ -133,6 +137,9 @@ def test_gaussian_random_partition_graph():
     pytest.raises(
         nx.NetworkXError, nx.gaussian_random_partition_graph, 100, 101, 10, 1, 0
     )
+    # Test when clusters are likely less than 1
+    G = nx.gaussian_random_partition_graph(10, 0.5, 0.5, 0.5, 0.5, seed=1)
+    assert len(G) == 10
 
 
 def test_ring_of_cliques():
@@ -146,8 +153,14 @@ def test_ring_of_cliques():
                 # the edge that already exists cannot be duplicated
                 expected_num_edges = i * (((j * (j - 1)) // 2) + 1) - 1
             assert G.number_of_edges() == expected_num_edges
-    pytest.raises(nx.NetworkXError, nx.ring_of_cliques, 1, 5)
-    pytest.raises(nx.NetworkXError, nx.ring_of_cliques, 3, 0)
+    with pytest.raises(
+        nx.NetworkXError, match="A ring of cliques must have at least two cliques"
+    ):
+        nx.ring_of_cliques(1, 5)
+    with pytest.raises(
+        nx.NetworkXError, match="The cliques must have at least two nodes"
+    ):
+        nx.ring_of_cliques(3, 0)
 
 
 def test_windmill_graph():
@@ -159,8 +172,14 @@ def test_windmill_graph():
             assert G.degree(0) == G.number_of_nodes() - 1
             for i in range(1, G.number_of_nodes()):
                 assert G.degree(i) == k - 1
-    pytest.raises(nx.NetworkXError, nx.ring_of_cliques, 1, 3)
-    pytest.raises(nx.NetworkXError, nx.ring_of_cliques, 15, 0)
+    with pytest.raises(
+        nx.NetworkXError, match="A windmill graph must have at least two cliques"
+    ):
+        nx.windmill_graph(1, 3)
+    with pytest.raises(
+        nx.NetworkXError, match="The cliques must have at least two nodes"
+    ):
+        nx.windmill_graph(3, 0)
 
 
 def test_stochastic_block_model():
@@ -215,7 +234,7 @@ def test_generator():
 
 
 def test_invalid_tau1():
-    with pytest.raises(nx.NetworkXError):
+    with pytest.raises(nx.NetworkXError, match="tau2 must be greater than one"):
         n = 100
         tau1 = 2
         tau2 = 1
@@ -224,7 +243,7 @@ def test_invalid_tau1():
 
 
 def test_invalid_tau2():
-    with pytest.raises(nx.NetworkXError):
+    with pytest.raises(nx.NetworkXError, match="tau1 must be greater than one"):
         n = 100
         tau1 = 1
         tau2 = 2
@@ -233,7 +252,7 @@ def test_invalid_tau2():
 
 
 def test_mu_too_large():
-    with pytest.raises(nx.NetworkXError):
+    with pytest.raises(nx.NetworkXError, match="mu must be in the interval \\[0, 1\\]"):
         n = 100
         tau1 = 2
         tau2 = 2
@@ -242,7 +261,7 @@ def test_mu_too_large():
 
 
 def test_mu_too_small():
-    with pytest.raises(nx.NetworkXError):
+    with pytest.raises(nx.NetworkXError, match="mu must be in the interval \\[0, 1\\]"):
         n = 100
         tau1 = 2
         tau2 = 2
@@ -251,18 +270,93 @@ def test_mu_too_small():
 
 
 def test_both_degrees_none():
-    with pytest.raises(nx.NetworkXError):
+    with pytest.raises(
+        nx.NetworkXError,
+        match="Must assign exactly one of min_degree and average_degree",
+    ):
         n = 100
         tau1 = 2
         tau2 = 2
-        mu = -1
+        mu = 1
         nx.LFR_benchmark_graph(n, tau1, tau2, mu)
 
 
 def test_neither_degrees_none():
-    with pytest.raises(nx.NetworkXError):
+    with pytest.raises(
+        nx.NetworkXError,
+        match="Must assign exactly one of min_degree and average_degree",
+    ):
         n = 100
         tau1 = 2
         tau2 = 2
-        mu = -1
+        mu = 1
         nx.LFR_benchmark_graph(n, tau1, tau2, mu, min_degree=2, average_degree=5)
+
+
+def test_max_iters_exceeded():
+    with pytest.raises(
+        nx.ExceededMaxIterations,
+        match="Could not assign communities; try increasing min_community",
+    ):
+        n = 10
+        tau1 = 2
+        tau2 = 2
+        mu = 0.1
+        nx.LFR_benchmark_graph(n, tau1, tau2, mu, min_degree=2, max_iters=10, seed=1)
+
+
+def test_max_deg_out_of_range():
+    with pytest.raises(
+        nx.NetworkXError, match="max_degree must be in the interval \\(0, n\\]"
+    ):
+        n = 10
+        tau1 = 2
+        tau2 = 2
+        mu = 0.1
+        nx.LFR_benchmark_graph(
+            n, tau1, tau2, mu, max_degree=n + 1, max_iters=10, seed=1
+        )
+
+
+def test_max_community():
+    n = 250
+    tau1 = 3
+    tau2 = 1.5
+    mu = 0.1
+    G = nx.LFR_benchmark_graph(
+        n,
+        tau1,
+        tau2,
+        mu,
+        average_degree=5,
+        max_degree=100,
+        min_community=50,
+        max_community=200,
+        seed=10,
+    )
+    assert len(G) == 250
+    C = {frozenset(G.nodes[v]["community"]) for v in G}
+    assert nx.community.is_partition(G.nodes(), C)
+
+
+def test_powerlaw_iterations_exceeded():
+    with pytest.raises(
+        nx.ExceededMaxIterations, match="Could not create power law sequence"
+    ):
+        n = 100
+        tau1 = 2
+        tau2 = 2
+        mu = 1
+        nx.LFR_benchmark_graph(n, tau1, tau2, mu, min_degree=2, max_iters=0)
+
+
+def test_no_scipy_zeta():
+    zeta2 = 1.6449340668482264
+    assert abs(zeta2 - nx.generators.community._hurwitz_zeta(2, 1, 0.0001)) < 0.01
+
+
+def test_generate_min_degree_itr():
+    with pytest.raises(
+        nx.ExceededMaxIterations, match="Could not match average_degree"
+    ):
+        nx.generators.community._generate_min_degree(2, 2, 1, 0.01, 0)
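The `match=` arguments added throughout this test file are treated by pytest as regular expressions searched against the exception message, which is why the square brackets in the `mu` messages are escaped. A small standalone illustration (using a plain `ValueError` rather than the NetworkX generators, for brevity):

```python
import pytest

# `match` is applied with re.search against str(excinfo.value), so literal
# brackets (as in "interval [0, 1]") must be escaped in the pattern.
with pytest.raises(ValueError, match=r"interval \[0, 1\]") as excinfo:
    raise ValueError("mu must be in the interval [0, 1]")

assert "interval [0, 1]" in str(excinfo.value)
```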
diff --git a/networkx/generators/tests/test_geometric.py b/networkx/generators/tests/test_geometric.py
index 58490bf..56ee369 100644
--- a/networkx/generators/tests/test_geometric.py
+++ b/networkx/generators/tests/test_geometric.py
@@ -137,7 +137,7 @@ class TestSoftRandomGeometricGraph:
         assert len(SRGG.edges()) <= len(RGG.edges())
 
     def test_p_dist_zero(self):
-        """Tests if p_dict = 0 returns disconencted graph with 0 edges"""
+        """Tests if p_dict = 0 returns disconnected graph with 0 edges"""
 
         def p_dist(dist):
             return 0
@@ -208,7 +208,7 @@ class TestGeographicalThresholdGraph:
                 assert not join(G, u, v, 10, -2, l1dist)
 
     def test_p_dist_zero(self):
-        """Tests if p_dict = 0 returns disconencted graph with 0 edges"""
+        """Tests if p_dict = 0 returns disconnected graph with 0 edges"""
 
         def p_dist(dist):
             return 0
diff --git a/networkx/generators/tests/test_internet_as_graphs.py b/networkx/generators/tests/test_internet_as_graphs.py
index fdfbf05..0d578b4 100644
--- a/networkx/generators/tests/test_internet_as_graphs.py
+++ b/networkx/generators/tests/test_internet_as_graphs.py
@@ -27,10 +27,7 @@ class TestInternetASTopology:
             elif cls.G.nodes[i]["type"] == "CP":
                 cls.CP.append(i)
             else:
-                raise ValueError(
-                    "Inconsistent data in the graph\
-                        node attributes"
-                )
+                raise ValueError("Inconsistent data in the graph node attributes")
             cls.set_customers(i)
             cls.set_providers(i)
 
@@ -48,8 +45,7 @@ class TestInternetASTopology:
                         cls.customers[i].add(j)
                     elif i != customer:
                         raise ValueError(
-                            "Inconsistent data in the graph\
-                                edge attributes"
+                            "Inconsistent data in the graph edge attributes"
                         )
 
     @classmethod
@@ -66,8 +62,7 @@ class TestInternetASTopology:
                         cls.providers[i].add(j)
                     elif j != customer:
                         raise ValueError(
-                            "Inconsistent data in the graph\
-                                edge attributes"
+                            "Inconsistent data in the graph edge attributes"
                         )
 
     def test_wrong_input(self):
@@ -136,10 +131,7 @@ class TestInternetASTopology:
                 elif j == cust:
                     prov = i
                 else:
-                    raise ValueError(
-                        "Inconsistent data in the graph edge\
-                            attributes"
-                    )
+                    raise ValueError("Inconsistent data in the graph edge attributes")
                 if cust in self.M:
                     d_m += 1
                     if self.G.nodes[prov]["type"] == "T":
@@ -153,10 +145,7 @@ class TestInternetASTopology:
                     if self.G.nodes[prov]["type"] == "T":
                         t_cp += 1
                 else:
-                    raise ValueError(
-                        "Inconsistent data in the graph edge\
-                            attributes"
-                    )
+                    raise ValueError("Inconsistent data in the graph edge attributes")
             elif e["type"] == "peer":
                 if self.G.nodes[i]["type"] == "M" and self.G.nodes[j]["type"] == "M":
                     p_m_m += 1
@@ -170,10 +159,7 @@ class TestInternetASTopology:
                 ):
                     p_cp_m += 1
             else:
-                raise ValueError(
-                    "Unexpected data in the graph edge\
-                        attributes"
-                )
+                raise ValueError("Unexpected data in the graph edge attributes")
 
         assert d_m / len(self.M) == approx((2 + (2.5 * self.n) / 10000), abs=1e-0)
         assert d_cp / len(self.CP) == approx((2 + (1.5 * self.n) / 10000), abs=1e-0)
diff --git a/networkx/generators/tests/test_joint_degree_seq.py b/networkx/generators/tests/test_joint_degree_seq.py
index aa581c5..15deb76 100644
--- a/networkx/generators/tests/test_joint_degree_seq.py
+++ b/networkx/generators/tests/test_joint_degree_seq.py
@@ -99,7 +99,7 @@ def test_is_valid_directed_joint_degree():
     nkk = {1: {1: 2, 2: 2}}
     assert not is_valid_directed_joint_degree(in_degrees, out_degrees, nkk)
 
-    # not realizable, degree seqeunces have fewer than required nodes.
+    # not realizable, degree sequences have fewer than required nodes.
     in_degrees = [0, 1, 2]
     assert not is_valid_directed_joint_degree(in_degrees, out_degrees, nkk)
 
@@ -110,7 +110,7 @@ def test_directed_joint_degree_graph(n=15, m=100, ntimes=1000):
         # generate gnm random graph and calculate its joint degree.
         g = gnm_random_graph(n, m, None, directed=True)
 
-        # in-degree seqeunce of g as a list of integers.
+        # in-degree sequence of g as a list of integers.
         in_degrees = list(dict(g.in_degree()).values())
         # out-degree sequence of g as a list of integers.
         out_degrees = list(dict(g.out_degree()).values())
diff --git a/networkx/generators/tests/test_line.py b/networkx/generators/tests/test_line.py
index 96380ac..8ad2462 100644
--- a/networkx/generators/tests/test_line.py
+++ b/networkx/generators/tests/test_line.py
@@ -167,55 +167,62 @@ class TestGeneratorInverseLine:
         solution = nx.path_graph(2)
         assert nx.is_isomorphic(H, solution)
 
-    def test_claw(self):
-        # This is the simplest non-line graph
-        G = nx.Graph()
-        G_edges = [[0, 1], [0, 2], [0, 3]]
-        G.add_edges_from(G_edges)
+    def test_edgeless_graph(self):
+        G = nx.empty_graph(5)
+        with pytest.raises(nx.NetworkXError, match="edgeless graph"):
+            nx.inverse_line_graph(G)
+
+    def test_selfloops_error(self):
+        G = nx.cycle_graph(4)
+        G.add_edge(0, 0)
         pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
 
-    def test_non_line_graph(self):
-        # These are other non-line graphs
+    def test_non_line_graphs(self):
+        # Tests several known non-line graphs for impossibility
+        # Adapted from L.W.Beineke, "Characterizations of derived graphs"
+
+        # claw graph
+        claw = nx.star_graph(3)
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, claw)
 
         # wheel graph with 6 nodes
-        G = nx.Graph()
-        G_edges = [
-            [0, 1],
-            [0, 2],
-            [0, 3],
-            [0, 4],
-            [0, 5],
-            [1, 2],
-            [2, 3],
-            [3, 4],
-            [4, 5],
-            [5, 1],
-        ]
-        G.add_edges_from(G_edges)
+        wheel = nx.wheel_graph(6)
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, wheel)
+
+        # K5 with one edge removed
+        K5m = nx.complete_graph(5)
+        K5m.remove_edge(0, 1)
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, K5m)
+
+        # graph without any odd triangles (contains claw as induced subgraph)
+        G = nx.compose(nx.path_graph(2), nx.complete_bipartite_graph(2, 3))
         pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
 
-        #   3---4---5
-        #  / \ / \ /
-        # 0---1---2
-        G = nx.Graph()
-        G_edges = [
-            [0, 1],
-            [1, 2],
-            [3, 4],
-            [4, 5],
-            [0, 3],
-            [1, 3],
-            [1, 4],
-            [2, 4],
-            [2, 5],
-        ]
-        G.add_edges_from(G_edges)
+        ## Variations on a diamond graph
+
+        # Diamond + 2 edges (+ "roof")
+        G = nx.diamond_graph()
+        G.add_edges_from([(4, 0), (5, 3)])
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
+        G.add_edge(4, 5)
         pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
 
-        # K_5 minus an edge
-        K5me = nx.complete_graph(5)
-        K5me.remove_edge(0, 1)
-        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, K5me)
+        # Diamond + 2 connected edges
+        G = nx.diamond_graph()
+        G.add_edges_from([(4, 0), (4, 3)])
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
+
+        # Diamond + K3 + one edge (+ 2*K3)
+        G = nx.diamond_graph()
+        G.add_edges_from([(4, 0), (4, 1), (4, 2), (5, 3)])
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
+        G.add_edges_from([(5, 1), (5, 2)])
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
+
+        # 4 triangles
+        G = nx.diamond_graph()
+        G.add_edges_from([(4, 0), (4, 1), (5, 2), (5, 3)])
+        pytest.raises(nx.NetworkXError, nx.inverse_line_graph, G)
 
     def test_wrong_graph_type(self):
         G = nx.DiGraph()
@@ -275,3 +282,28 @@ class TestGeneratorInverseLine:
         H = nx.line_graph(G)
         J = nx.inverse_line_graph(H)
         assert nx.is_isomorphic(G, J)
+
+
+class TestGeneratorPrivateFunctions:
+    def test_triangles_error(self):
+        G = nx.diamond_graph()
+        pytest.raises(nx.NetworkXError, line._triangles, G, (4, 0))
+        pytest.raises(nx.NetworkXError, line._triangles, G, (0, 3))
+
+    def test_odd_triangles_error(self):
+        G = nx.diamond_graph()
+        pytest.raises(nx.NetworkXError, line._odd_triangle, G, (0, 1, 4))
+        pytest.raises(nx.NetworkXError, line._odd_triangle, G, (0, 1, 3))
+
+    def test_select_starting_cell_error(self):
+        G = nx.diamond_graph()
+        pytest.raises(nx.NetworkXError, line._select_starting_cell, G, (4, 0))
+        pytest.raises(nx.NetworkXError, line._select_starting_cell, G, (0, 3))
+
+    def test_diamond_graph(self):
+        G = nx.diamond_graph()
+        for edge in G.edges:
+            cell = line._select_starting_cell(G, starting_edge=edge)
+            # Starting cell should always be one of the two triangles
+            assert len(cell) == 3
+            assert all(v in G[u] for u in cell for v in cell if u != v)
diff --git a/networkx/generators/tests/test_small.py b/networkx/generators/tests/test_small.py
index 5f5406f..9939c8a 100644
--- a/networkx/generators/tests/test_small.py
+++ b/networkx/generators/tests/test_small.py
@@ -15,15 +15,6 @@ null = nx.null_graph()
 
 
 class TestGeneratorsSmall:
-    def test_make_small_graph(self):
-        d = ["adjacencylist", "Bull Graph", 5, [[2, 3], [1, 3, 4], [1, 2, 5], [2], [3]]]
-        G = nx.make_small_graph(d)
-        assert is_isomorphic(G, nx.bull_graph())
-
-        # Test small graph creation error with wrong ltype
-        d[0] = "erroneouslist"
-        pytest.raises(nx.NetworkXError, nx.make_small_graph, graph_description=d)
-
     def test__LCF_graph(self):
         # If n<=0, then return the null_graph
         G = nx.LCF_graph(-10, [1, 2], 100)
diff --git a/networkx/generators/tests/test_stochastic.py b/networkx/generators/tests/test_stochastic.py
index d75a6a0..3f48f0d 100644
--- a/networkx/generators/tests/test_stochastic.py
+++ b/networkx/generators/tests/test_stochastic.py
@@ -51,6 +51,17 @@ class TestStochasticGraph:
             (0, 2, d),
         ]
 
+    def test_zero_weights(self):
+        """Smoke test: ensure ZeroDivisionError is not raised."""
+        G = nx.DiGraph()
+        G.add_edge(0, 1, weight=0)
+        G.add_edge(0, 2, weight=0)
+        S = nx.stochastic_graph(G)
+        assert sorted(S.edges(data=True)) == [
+            (0, 1, {"weight": 0}),
+            (0, 2, {"weight": 0}),
+        ]
+
     def test_graph_disallowed(self):
         with pytest.raises(nx.NetworkXNotImplemented):
             nx.stochastic_graph(nx.Graph())
diff --git a/networkx/lazy_imports.py b/networkx/lazy_imports.py
index 49344db..6201088 100644
--- a/networkx/lazy_imports.py
+++ b/networkx/lazy_imports.py
@@ -93,7 +93,7 @@ class DelayedImportErrorModule(types.ModuleType):
             fd = self.__frame_data
             raise ModuleNotFoundError(
                 f"No module named '{fd['spec']}'\n\n"
-                "This error is lazily reported, having originally occured in\n"
+                "This error is lazily reported, having originally occurred in\n"
                 f'  File {fd["filename"]}, line {fd["lineno"]}, in {fd["function"]}\n\n'
                 f'----> {"".join(fd["code_context"]).strip()}'
             )
diff --git a/networkx/linalg/algebraicconnectivity.py b/networkx/linalg/algebraicconnectivity.py
index b1b9ef6..a309b1c 100644
--- a/networkx/linalg/algebraicconnectivity.py
+++ b/networkx/linalg/algebraicconnectivity.py
@@ -317,7 +317,7 @@ def _get_fiedler_func(method):
 def algebraic_connectivity(
     G, weight="weight", normalized=False, tol=1e-8, method="tracemin_pcg", seed=None
 ):
-    """Returns the algebraic connectivity of an undirected graph.
+    r"""Returns the algebraic connectivity of an undirected graph.
 
     The algebraic connectivity of a connected undirected graph is the second
     smallest eigenvalue of its Laplacian matrix.
@@ -377,6 +377,19 @@ def algebraic_connectivity(
     See Also
     --------
     laplacian_matrix
+
+    Examples
+    --------
+    For undirected graphs, algebraic connectivity tells us whether a graph
+    is connected: `G` is connected iff ``algebraic_connectivity(G) > 0``:
+
+    >>> G = nx.complete_graph(5)
+    >>> nx.algebraic_connectivity(G) > 0
+    True
+    >>> G.add_node(10)  # G is no longer connected
+    >>> nx.algebraic_connectivity(G) > 0
+    False
+
     """
     if len(G) < 2:
         raise nx.NetworkXError("graph has less than two nodes.")
diff --git a/networkx/linalg/attrmatrix.py b/networkx/linalg/attrmatrix.py
index 685d393..ace7144 100644
--- a/networkx/linalg/attrmatrix.py
+++ b/networkx/linalg/attrmatrix.py
@@ -150,7 +150,7 @@ def attr_matrix(
     dtype=None,
     order=None,
 ):
-    """Returns a NumPy matrix using attributes from G.
+    """Returns the attribute matrix using attributes from `G` as a numpy array.
 
     If only `G` is passed in, then the adjacency matrix is constructed.
 
@@ -164,12 +164,12 @@ def attr_matrix(
     Parameters
     ----------
     G : graph
-        The NetworkX graph used to construct the NumPy matrix.
+        The NetworkX graph used to construct the attribute matrix.
 
     edge_attr : str, optional
         Each element of the matrix represents a running total of the
         specified edge attribute for edges whose node attributes correspond
-        to the rows/cols of the matirx. The attribute must be present for
+        to the rows/cols of the matrix. The attribute must be present for
         all edges in the graph. If no attribute is specified, then we
         just count the number of edges whose node attributes correspond
         to the matrix element.
@@ -204,11 +204,11 @@ def attr_matrix(
 
     Returns
     -------
-    M : NumPy matrix
+    M : 2D NumPy ndarray
         The attribute matrix.
 
     ordering : list
-        If `rc_order` was specified, then only the matrix is returned.
+        If `rc_order` was specified, then only the attribute matrix is returned.
         However, if `rc_order` was None, then the ordering used to construct
         the matrix is returned as well.
 
@@ -221,16 +221,16 @@ def attr_matrix(
     >>> G.add_edge(0, 2, thickness=2)
     >>> G.add_edge(1, 2, thickness=3)
     >>> nx.attr_matrix(G, rc_order=[0, 1, 2])
-    matrix([[0., 1., 1.],
-            [1., 0., 1.],
-            [1., 1., 0.]])
+    array([[0., 1., 1.],
+           [1., 0., 1.],
+           [1., 1., 0.]])
 
     Alternatively, we can obtain the matrix describing edge thickness.
 
     >>> nx.attr_matrix(G, edge_attr="thickness", rc_order=[0, 1, 2])
-    matrix([[0., 1., 2.],
-            [1., 0., 3.],
-            [2., 3., 0.]])
+    array([[0., 1., 2.],
+           [1., 0., 3.],
+           [2., 3., 0.]])
 
     We can also color the nodes and ask for the probability distribution over
     all edges (u,v) describing:
@@ -242,8 +242,8 @@ def attr_matrix(
     >>> G.nodes[2]["color"] = "blue"
     >>> rc = ["red", "blue"]
     >>> nx.attr_matrix(G, node_attr="color", normalized=True, rc_order=rc)
-    matrix([[0.33333333, 0.66666667],
-            [1.        , 0.        ]])
+    array([[0.33333333, 0.66666667],
+           [1.        , 0.        ]])
 
     For example, the above tells us that for all edges (u,v):
 
@@ -256,8 +256,8 @@ def attr_matrix(
     Finally, we can obtain the total weights listed by the node colors.
 
     >>> nx.attr_matrix(G, edge_attr="weight", node_attr="color", rc_order=rc)
-    matrix([[3., 2.],
-            [2., 0.]])
+    array([[3., 2.],
+           [2., 0.]])
 
     Thus, the total weight over all edges (u,v) with u and v having colors:
 
@@ -298,19 +298,6 @@ def attr_matrix(
     if normalized:
         M /= M.sum(axis=1).reshape((N, 1))
 
-    import warnings
-
-    warnings.warn(
-        (
-            "attr_matrix will return an numpy.ndarray instead of a numpy.matrix "
-            "in NetworkX 3.0."
-        ),
-        category=FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: Remove asmatrix in NetworkX 3.0
-    M = np.asmatrix(M)
-
     if rc_order is None:
         return M, ordering
     else:
@@ -320,7 +307,7 @@ def attr_matrix(
 def attr_sparse_matrix(
     G, edge_attr=None, node_attr=None, normalized=False, rc_order=None, dtype=None
 ):
-    """Returns a SciPy sparse matrix using attributes from G.
+    """Returns a SciPy sparse array using attributes from G.
 
     If only `G` is passed in, then the adjacency matrix is constructed.
 
@@ -369,7 +356,7 @@ def attr_sparse_matrix(
 
     Returns
     -------
-    M : SciPy sparse matrix
+    M : SciPy sparse array
         The attribute matrix.
 
     ordering : list
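With the `np.asmatrix` wrapper and `FutureWarning` removed in the hunk above, `attr_matrix` returns a plain 2D `numpy.ndarray`. A short sketch of the new behavior, using the same toy graph as the docstring (assuming a NetworkX build that includes this change):

```python
import networkx as nx
import numpy as np

G = nx.Graph()
G.add_edge(0, 1, thickness=1)
G.add_edge(0, 2, thickness=2)
G.add_edge(1, 2, thickness=3)

# With no edge_attr, entries count edges between the row/column nodes.
M = nx.attr_matrix(G, rc_order=[0, 1, 2])
assert isinstance(M, np.ndarray)
assert M.shape == (3, 3)

# With edge_attr, entries accumulate that attribute instead.
T = nx.attr_matrix(G, edge_attr="thickness", rc_order=[0, 1, 2])
assert T[1, 2] == 3.0
```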
diff --git a/networkx/linalg/bethehessianmatrix.py b/networkx/linalg/bethehessianmatrix.py
index c2595e4..443ba5b 100644
--- a/networkx/linalg/bethehessianmatrix.py
+++ b/networkx/linalg/bethehessianmatrix.py
@@ -32,7 +32,7 @@ def bethe_hessian_matrix(G, r=None, nodelist=None):
 
     Returns
     -------
-    H : scipy.sparse.csr_matrix
+    H : scipy.sparse.csr_array
       The Bethe Hessian matrix of `G`, with parameter `r`.
 
     Examples
@@ -75,12 +75,4 @@ def bethe_hessian_matrix(G, r=None, nodelist=None):
     D = sp.sparse.csr_array(sp.sparse.spdiags(A.sum(axis=1), 0, m, n, format="csr"))
     # TODO: Rm csr_array wrapper when eye array creation becomes available
     I = sp.sparse.csr_array(sp.sparse.eye(m, n, format="csr"))
-    import warnings
-
-    warnings.warn(
-        "bethe_hessian_matrix will return a scipy.sparse array instead of a matrix in Networkx 3.0",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: Remove the csr_matrix wrapper in NetworkX 3.0
-    return sp.sparse.csr_matrix((r**2 - 1) * I - r * A + D)
+    return (r**2 - 1) * I - r * A + D
diff --git a/networkx/linalg/graphmatrix.py b/networkx/linalg/graphmatrix.py
index ad0d2e3..24354e5 100644
--- a/networkx/linalg/graphmatrix.py
+++ b/networkx/linalg/graphmatrix.py
@@ -3,7 +3,7 @@ Adjacency matrix and incidence matrix of graphs.
 """
 import networkx as nx
 
-__all__ = ["incidence_matrix", "adj_matrix", "adjacency_matrix"]
+__all__ = ["incidence_matrix", "adjacency_matrix"]
 
 
 def incidence_matrix(G, nodelist=None, edgelist=None, oriented=False, weight=None):
@@ -40,7 +40,7 @@ def incidence_matrix(G, nodelist=None, edgelist=None, oriented=False, weight=Non
 
     Returns
     -------
-    A : SciPy sparse matrix
+    A : SciPy sparse array
       The incidence matrix of G.
 
     Notes
@@ -93,14 +93,6 @@ def incidence_matrix(G, nodelist=None, edgelist=None, oriented=False, weight=Non
         else:
             A[ui, ei] = wt
             A[vi, ei] = wt
-    import warnings
-
-    warnings.warn(
-        "incidence_matrix will return a scipy.sparse array instead of a matrix in Networkx 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: Rm sp.sparse.csc_matrix in Networkx 3.0
     return A.asformat("csc")
 
 
@@ -126,7 +118,7 @@ def adjacency_matrix(G, nodelist=None, dtype=None, weight="weight"):
 
     Returns
     -------
-    A : SciPy sparse matrix
+    A : SciPy sparse array
       Adjacency matrix representation of G.
 
     Notes
@@ -145,7 +137,7 @@ def adjacency_matrix(G, nodelist=None, dtype=None, weight="weight"):
     diagonal matrix entry value to the edge weight attribute
     (or the number 1 if the edge has no weight attribute).  If the
     alternate convention of doubling the edge weight is desired the
-    resulting Scipy sparse matrix can be modified as follows:
+    resulting SciPy sparse array can be modified as follows:
 
     >>> G = nx.Graph([(1, 1)])
     >>> A = nx.adjacency_matrix(G)
@@ -162,29 +154,4 @@ def adjacency_matrix(G, nodelist=None, dtype=None, weight="weight"):
     to_dict_of_dicts
     adjacency_spectrum
     """
-    import warnings
-
-    warnings.warn(
-        "adjacency_matrix will return a scipy.sparse array instead of a matrix in Networkx 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: Change to `to_scipy_sparse_array` for networkx 3.0
-    return nx.to_scipy_sparse_matrix(G, nodelist=nodelist, dtype=dtype, weight=weight)
-
-
-def _adj_matrix_warning(G, nodelist=None, dtype=None, weight="weight"):
-    import warnings
-
-    warnings.warn(
-        (
-            "adj_matrix is deprecated and will be removed in version 3.0.\n"
-            "Use `adjacency_matrix` instead\n"
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    return adjacency_matrix(G, nodelist, dtype, weight)
-
-
-adj_matrix = _adj_matrix_warning
+    return nx.to_scipy_sparse_array(G, nodelist=nodelist, dtype=dtype, weight=weight)
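With the switch to `to_scipy_sparse_array` above, `adjacency_matrix` now yields a SciPy sparse array rather than a sparse matrix. A brief sketch of the resulting behavior (assuming a NetworkX build that includes this change):

```python
import networkx as nx

G = nx.path_graph(3)
A = nx.adjacency_matrix(G)  # SciPy sparse array in CSR format

# Sparse arrays follow array (not matrix) semantics; use .toarray()
# for a dense view when comparing against explicit values.
assert A.shape == (3, 3)
assert A.toarray().tolist() == [[0, 1, 0], [1, 0, 1], [0, 1, 0]]
```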
diff --git a/networkx/linalg/laplacianmatrix.py b/networkx/linalg/laplacianmatrix.py
index f1053b7..4448b3e 100644
--- a/networkx/linalg/laplacianmatrix.py
+++ b/networkx/linalg/laplacianmatrix.py
@@ -34,7 +34,7 @@ def laplacian_matrix(G, nodelist=None, weight="weight"):
 
     Returns
     -------
-    L : SciPy sparse matrix
+    L : SciPy sparse array
       The Laplacian matrix of G.
 
     Notes
@@ -46,6 +46,21 @@ def laplacian_matrix(G, nodelist=None, weight="weight"):
     to_numpy_array
     normalized_laplacian_matrix
     laplacian_spectrum
+
+    Examples
+    --------
+    For graphs with multiple connected components, L is permutation-similar
+    to a block diagonal matrix where each block is the respective Laplacian
+    matrix for each component.
+
+    >>> G = nx.Graph([(1, 2), (2, 3), (4, 5)])
+    >>> print(nx.laplacian_matrix(G).toarray())
+    [[ 1 -1  0  0  0]
+     [-1  2 -1  0  0]
+     [ 0 -1  1  0  0]
+     [ 0  0  0  1 -1]
+     [ 0  0  0 -1  1]]
+
     """
     import scipy as sp
     import scipy.sparse  # call as sp.sparse
@@ -56,15 +71,7 @@ def laplacian_matrix(G, nodelist=None, weight="weight"):
     n, m = A.shape
     # TODO: rm csr_array wrapper when spdiags can produce arrays
     D = sp.sparse.csr_array(sp.sparse.spdiags(A.sum(axis=1), 0, m, n, format="csr"))
-    import warnings
-
-    warnings.warn(
-        "laplacian_matrix will return a scipy.sparse array instead of a matrix in Networkx 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: rm sp.sparse.csr_matrix in version 3.0
-    return sp.sparse.csr_matrix(D - A)
+    return D - A
 
 
 @not_implemented_for("directed")
@@ -95,7 +102,7 @@ def normalized_laplacian_matrix(G, nodelist=None, weight="weight"):
 
     Returns
     -------
-    N : Scipy sparse matrix
+    N : SciPy sparse array
       The normalized Laplacian matrix of G.
 
     Notes
@@ -136,15 +143,7 @@ def normalized_laplacian_matrix(G, nodelist=None, weight="weight"):
     diags_sqrt[np.isinf(diags_sqrt)] = 0
     # TODO: rm csr_array wrapper when spdiags can produce arrays
     DH = sp.sparse.csr_array(sp.sparse.spdiags(diags_sqrt, 0, m, n, format="csr"))
-    import warnings
-
-    warnings.warn(
-        "normalized_laplacian_matrix will return a scipy.sparse array instead of a matrix in Networkx 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: rm csr_matrix wrapper for NX 3.0
-    return sp.sparse.csr_matrix(DH @ (L @ DH))
+    return DH @ (L @ DH)
 
 
 def total_spanning_tree_weight(G, weight=None):
@@ -269,15 +268,7 @@ def directed_laplacian_matrix(
     # NOTE: This could be sparsified for the non-pagerank cases
     I = np.identity(len(G))
 
-    import warnings
-
-    warnings.warn(
-        "directed_laplacian_matrix will return a numpy array instead of a matrix in NetworkX 3.0",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: rm np.asmatrix for networkx 3.0
-    return np.asmatrix(I - (Q + Q.T) / 2.0)
+    return I - (Q + Q.T) / 2.0
 
 
 @not_implemented_for("undirected")
@@ -356,17 +347,7 @@ def directed_combinatorial_laplacian_matrix(
     # TODO: Rm csr_array wrapper when spdiags array creation becomes available
     Phi = sp.sparse.csr_array(sp.sparse.spdiags(p, 0, n, n)).toarray()
 
-    import warnings
-
-    warnings.warn(
-        "directed_combinatorial_laplacian_matrix will return a numpy array instead of a matrix in NetworkX 3.0",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: Rm np.asmatrix for networkx 3.0
-    import numpy as np
-
-    return np.asmatrix(Phi - (Phi @ P + P.T @ Phi) / 2.0)
+    return Phi - (Phi @ P + P.T @ Phi) / 2.0
 
 
 def _transition_matrix(G, nodelist=None, weight="weight", walk_type=None, alpha=0.95):
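The hunks above drop the 2.x `FutureWarning` shims, so these functions now return arrays directly instead of `np.matrix`/`csr_matrix` wrappers. A minimal sketch of the new return types (not part of the patch; assumes NetworkX 3.0 with SciPy installed):

```python
import networkx as nx
import numpy as np
import scipy.sparse as sp

G = nx.path_graph(3)

# normalized_laplacian_matrix returns the DH @ (L @ DH) product directly,
# a SciPy sparse array, with no csr_matrix wrapper or FutureWarning.
N = nx.normalized_laplacian_matrix(G)
assert isinstance(N, sp.csr_array)

# directed_laplacian_matrix returns a plain ndarray, not np.matrix.
D = nx.directed_laplacian_matrix(nx.DiGraph([(0, 1), (1, 2), (2, 0)]))
assert isinstance(D, np.ndarray) and not isinstance(D, np.matrix)
```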
diff --git a/networkx/linalg/modularitymatrix.py b/networkx/linalg/modularitymatrix.py
index 978d226..59087b8 100644
--- a/networkx/linalg/modularitymatrix.py
+++ b/networkx/linalg/modularitymatrix.py
@@ -39,7 +39,7 @@ def modularity_matrix(G, nodelist=None, weight=None):
 
     Returns
     -------
-    B : Numpy matrix
+    B : Numpy array
       The modularity matrix of G.
 
     Examples
@@ -71,15 +71,7 @@ def modularity_matrix(G, nodelist=None, weight=None):
     # Expected adjacency matrix
     X = np.outer(k, k) / (2 * m)
 
-    import warnings
-
-    warnings.warn(
-        "modularity_matrix will return a numpy array instead of a matrix in NetworkX 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: rm np.asmatrix for networkx 3.0
-    return np.asmatrix(A - X)
+    return A - X
 
 
 @not_implemented_for("undirected")
@@ -116,7 +108,7 @@ def directed_modularity_matrix(G, nodelist=None, weight=None):
 
     Returns
     -------
-    B : Numpy matrix
+    B : Numpy array
       The modularity matrix of G.
 
     Examples
@@ -169,12 +161,4 @@ def directed_modularity_matrix(G, nodelist=None, weight=None):
     # Expected adjacency matrix
     X = np.outer(k_out, k_in) / m
 
-    import warnings
-
-    warnings.warn(
-        "directed_modularity_matrix will return a numpy array instead of a matrix in NetworkX 3.0.",
-        FutureWarning,
-        stacklevel=2,
-    )
-    # TODO: rm np.asmatrix for networkx 3.0
-    return np.asmatrix(A - X)
+    return A - X
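With the `np.asmatrix` wrappers removed, both modularity functions now hand back plain NumPy arrays. A quick check (not part of the patch; assumes NetworkX 3.0):

```python
import networkx as nx
import numpy as np

G = nx.path_graph(4)

# A - X is returned as an ndarray in 3.0; the np.matrix wrapper is gone.
B = nx.modularity_matrix(G)
assert isinstance(B, np.ndarray) and not isinstance(B, np.matrix)
assert B.shape == (4, 4)

# Same for the directed variant (which requires a directed graph).
DB = nx.directed_modularity_matrix(nx.DiGraph([(0, 1), (1, 2), (2, 0)]))
assert isinstance(DB, np.ndarray) and not isinstance(DB, np.matrix)
```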
diff --git a/networkx/linalg/spectrum.py b/networkx/linalg/spectrum.py
index caa95fc..66aedbb 100644
--- a/networkx/linalg/spectrum.py
+++ b/networkx/linalg/spectrum.py
@@ -32,11 +32,23 @@ def laplacian_spectrum(G, weight="weight"):
     Notes
     -----
     For MultiGraph/MultiDiGraph, the edges weights are summed.
-    See to_numpy_array for other options.
+    See :func:`~networkx.convert_matrix.to_numpy_array` for other options.
 
     See Also
     --------
     laplacian_matrix
+
+    Examples
+    --------
+    The multiplicity of 0 as an eigenvalue of the laplacian matrix is equal
+    to the number of connected components of G.
+
+    >>> G = nx.Graph()  # Create a graph with 5 nodes and 3 connected components
+    >>> G.add_nodes_from(range(5))
+    >>> G.add_edges_from([(0, 2), (3, 4)])
+    >>> nx.laplacian_spectrum(G)
+    array([0., 0., 0., 2., 2.])
+
     """
     import scipy as sp
     import scipy.linalg  # call as sp.linalg
diff --git a/networkx/readwrite/__init__.py b/networkx/readwrite/__init__.py
index b97724b..f655098 100644
--- a/networkx/readwrite/__init__.py
+++ b/networkx/readwrite/__init__.py
@@ -4,50 +4,9 @@ A package for reading and writing graphs in various formats.
 """
 
 
-def __getattr__(name):
-    """Remove functions and provide informative error messages."""
-    if name == "nx_yaml":
-        raise ImportError(
-            "\nThe nx_yaml module has been removed from NetworkX.\n"
-            "Please use the `yaml` package directly for working with yaml data.\n"
-            "For example, a networkx.Graph `G` can be written to and loaded\n"
-            "from a yaml file with:\n\n"
-            "    import yaml\n\n"
-            "    with open('path_to_yaml_file', 'w') as fh:\n"
-            "        yaml.dump(G, fh)\n"
-            "    with open('path_to_yaml_file', 'r') as fh:\n"
-            "        G = yaml.load(fh, Loader=yaml.Loader)\n\n"
-            "Note that yaml.Loader is considered insecure - see the pyyaml\n"
-            "documentation for further details.\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    if name == "read_yaml":
-        raise ImportError(
-            "\nread_yaml has been removed from NetworkX, please use `yaml`\n"
-            "directly:\n\n"
-            "    import yaml\n\n"
-            "    with open('path', 'r') as fh:\n"
-            "        yaml.load(fh, Loader=yaml.Loader)\n\n"
-            "Note that yaml.Loader is considered insecure - see the pyyaml\n"
-            "documentation for further details.\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    if name == "write_yaml":
-        raise ImportError(
-            "\nwrite_yaml has been removed from NetworkX, please use `yaml`\n"
-            "directly:\n\n"
-            "    import yaml\n\n"
-            "    with open('path_for_yaml_output', 'w') as fh:\n"
-            "        yaml.dump(G_to_be_yaml, fh)\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    raise AttributeError(f"module {__name__} has no attribute {name}")
-
-
 from networkx.readwrite.adjlist import *
 from networkx.readwrite.multiline_adjlist import *
 from networkx.readwrite.edgelist import *
-from networkx.readwrite.gpickle import *
 from networkx.readwrite.pajek import *
 from networkx.readwrite.leda import *
 from networkx.readwrite.sparse6 import *
@@ -55,6 +14,5 @@ from networkx.readwrite.graph6 import *
 from networkx.readwrite.gml import *
 from networkx.readwrite.graphml import *
 from networkx.readwrite.gexf import *
-from networkx.readwrite.nx_shp import *
 from networkx.readwrite.json_graph import *
 from networkx.readwrite.text import *
diff --git a/networkx/readwrite/gml.py b/networkx/readwrite/gml.py
index 1a14b80..98fbd40 100644
--- a/networkx/readwrite/gml.py
+++ b/networkx/readwrite/gml.py
@@ -363,6 +363,11 @@ def parse_gml_lines(lines, label, destringizer):
                         value = destringizer(value)
                     except ValueError:
                         pass
+                # Special handling for empty lists and tuples
+                if value == "()":
+                    value = tuple()
+                if value == "[]":
+                    value = list()
                 curr_token = next(tokens)
             elif category == Pattern.DICT_START:
                 curr_token, value = parse_dict(curr_token)
@@ -381,7 +386,7 @@ def parse_gml_lines(lines, label, destringizer):
                     except Exception:
                         msg = (
                             "an int, float, string, '[' or string"
-                            + " convertable ASCII value for node id or label"
+                            + " convertible ASCII value for node id or label"
                         )
                         unexpected(curr_token, msg)
                 # Special handling for nan and infinity.  Since the gml language
@@ -658,7 +663,7 @@ def generate_gml(G, stringizer=None):
         label "1"
       ]
     ]
-    >>> G = nx.OrderedMultiGraph([("a", "b"), ("a", "b")])
+    >>> G = nx.MultiGraph([("a", "b"), ("a", "b")])
     >>> print("\n".join(nx.generate_gml(G)))
     graph [
       multigraph 1
@@ -728,12 +733,9 @@ def generate_gml(G, stringizer=None):
                 for key, value in value.items():
                     yield from stringize(key, value, (), next_indent)
                 yield indent + "]"
-            elif (
-                isinstance(value, (list, tuple))
-                and key != "label"
-                and value
-                and not in_list
-            ):
+            elif isinstance(value, (list, tuple)) and key != "label" and not in_list:
+                if len(value) == 0:
+                    yield indent + key + " " + f'"{value!r}"'
                 if len(value) == 1:
                     yield indent + key + " " + f'"{LIST_START_VALUE}"'
                 for val in value:
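The new empty-container branches mean an empty list or tuple attribute now survives a GML round trip: `generate_gml` serializes it as the string `"[]"` or `"()"`, and `parse_gml` converts that placeholder back. A sketch (not part of the patch; assumes NetworkX 3.0):

```python
import networkx as nx

G = nx.Graph()
G.add_node(0, tags=[])  # empty list attribute

s = "\n".join(nx.generate_gml(G))
assert 'tags "[]"' in s  # serialized via repr, per the new len(value) == 0 branch

# parse_gml turns the "[]" placeholder back into an empty list; the
# destringizer lets the integer node label round-trip as well.
H = nx.parse_gml(s, destringizer=int)
assert H.nodes[0]["tags"] == []
```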
diff --git a/networkx/readwrite/gpickle.py b/networkx/readwrite/gpickle.py
deleted file mode 100644
index 0054afd..0000000
--- a/networkx/readwrite/gpickle.py
+++ /dev/null
@@ -1,109 +0,0 @@
-"""
-**************
-Pickled Graphs
-**************
-Read and write NetworkX graphs as Python pickles.
-
-.. warning::
-    The pickle library is not secure and can be used to create arbitray objects. 
-    Only unpickle data you trust - see :doc:`library/pickle` for additional information.
-
-"The pickle module implements a fundamental, but powerful algorithm
-for serializing and de-serializing a Python object
-structure. "Pickling" is the process whereby a Python object hierarchy
-is converted into a byte stream, and "unpickling" is the inverse
-operation, whereby a byte stream is converted back into an object
-hierarchy."
-
-Note that NetworkX graphs can contain any hashable Python object as
-node (not just integers and strings).  For arbitrary data types it may
-be difficult to represent the data as text.  In that case using Python
-pickles to store the graph data can be used.
-
-Format
-------
-See https://docs.python.org/3/library/pickle.html
-"""
-
-__all__ = ["read_gpickle", "write_gpickle"]
-
-import pickle
-import warnings
-
-from networkx.utils import open_file
-
-
-@open_file(1, mode="wb")
-def write_gpickle(G, path, protocol=pickle.HIGHEST_PROTOCOL):
-    """Write graph in Python pickle format.
-
-    Pickles are a serialized byte stream of a Python object [1]_.
-    This format will preserve Python objects used as nodes or edges.
-
-    Parameters
-    ----------
-    G : graph
-       A NetworkX graph
-
-    path : file or string
-       File or filename to write.
-       Filenames ending in .gz or .bz2 will be compressed.
-
-    protocol : integer
-        Pickling protocol to use. Default value: ``pickle.HIGHEST_PROTOCOL``.
-
-    Examples
-    --------
-    >>> G = nx.path_graph(4)
-    >>> nx.write_gpickle(G, "test.gpickle")
-
-    References
-    ----------
-    .. [1] https://docs.python.org/3/library/pickle.html
-
-    .. deprecated:: 2.6
-    """
-    msg = (
-        "write_gpickle is deprecated and will be removed in 3.0."
-        "Use ``pickle.dump(G, path, protocol)``"
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    pickle.dump(G, path, protocol)
-
-
-@open_file(0, mode="rb")
-def read_gpickle(path):
-    """Read graph object in Python pickle format.
-
-    Pickles are a serialized byte stream of a Python object [1]_.
-    This format will preserve Python objects used as nodes or edges.
-
-    Parameters
-    ----------
-    path : file or string
-       File or filename to write.
-       Filenames ending in .gz or .bz2 will be uncompressed.
-
-    Returns
-    -------
-    G : graph
-       A NetworkX graph
-
-    Examples
-    --------
-    >>> G = nx.path_graph(4)
-    >>> nx.write_gpickle(G, "test.gpickle")
-    >>> G = nx.read_gpickle("test.gpickle")
-
-    References
-    ----------
-    .. [1] https://docs.python.org/3/library/pickle.html
-
-    .. deprecated:: 2.6
-    """
-    msg = (
-        "read_gpickle is deprecated and will be removed in 3.0."
-        "Use ``pickle.load(path)``"
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    return pickle.load(path)
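As the deprecation messages in the removed module state, `write_gpickle`/`read_gpickle` are replaced by calling `pickle` directly on an open binary file. A round-trip sketch (not part of the patch; uses an in-memory buffer in place of a file path):

```python
import io
import pickle

import networkx as nx

G = nx.path_graph(4)

# write_gpickle(G, path) becomes pickle.dump on a binary file object:
buf = io.BytesIO()
pickle.dump(G, buf, pickle.HIGHEST_PROTOCOL)

# read_gpickle(path) becomes pickle.load:
buf.seek(0)
H = pickle.load(buf)

assert sorted(H.edges()) == sorted(G.edges())
assert list(H.nodes()) == list(G.nodes())
```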
diff --git a/networkx/readwrite/json_graph/__init__.py b/networkx/readwrite/json_graph/__init__.py
index 7715fbb..2ee9d12 100644
--- a/networkx/readwrite/json_graph/__init__.py
+++ b/networkx/readwrite/json_graph/__init__.py
@@ -15,5 +15,4 @@ The three formats that you can generate with NetworkX are:
 from networkx.readwrite.json_graph.node_link import *
 from networkx.readwrite.json_graph.adjacency import *
 from networkx.readwrite.json_graph.tree import *
-from networkx.readwrite.json_graph.jit import *
 from networkx.readwrite.json_graph.cytoscape import *
diff --git a/networkx/readwrite/json_graph/cytoscape.py b/networkx/readwrite/json_graph/cytoscape.py
index 296242c..c0c0e3f 100644
--- a/networkx/readwrite/json_graph/cytoscape.py
+++ b/networkx/readwrite/json_graph/cytoscape.py
@@ -3,24 +3,13 @@ import networkx as nx
 __all__ = ["cytoscape_data", "cytoscape_graph"]
 
 
-def cytoscape_data(G, attrs=None, name="name", ident="id"):
+def cytoscape_data(G, name="name", ident="id"):
     """Returns data in Cytoscape JSON format (cyjs).
 
     Parameters
     ----------
     G : NetworkX Graph
         The graph to convert to cytoscape format
-    attrs : dict or None (default=None)
-        A dictionary containing the keys 'name' and 'ident' which are mapped to
-        the 'name' and 'id' node elements in cyjs format. All other keys are
-        ignored. Default is `None` which results in the default mapping
-        ``dict(name="name", ident="id")``.
-
-        .. deprecated:: 2.6
-
-           The `attrs` keyword argument will be replaced with `name` and
-           `ident` in networkx 3.0
-
     name : string
         A string which is mapped to the 'name' node element in cyjs format.
         Must not have the same value as `ident`.
@@ -58,30 +47,6 @@ def cytoscape_data(G, attrs=None, name="name", ident="id"):
        {'data': {'id': '1', 'value': 1, 'name': '1'}}],
       'edges': [{'data': {'source': 0, 'target': 1}}]}}
     """
-    # ------ TODO: Remove between the lines in 3.0 ----- #
-    if attrs is not None:
-        import warnings
-
-        msg = (
-            "\nThe `attrs` keyword argument of cytoscape_data is deprecated\n"
-            "and will be removed in networkx 3.0.\n"
-            "It is replaced with explicit `name` and `ident` keyword\n"
-            "arguments.\n"
-            "To make this warning go away and ensure usage is forward\n"
-            "compatible, replace `attrs` with `name` and `ident`,\n"
-            "for example:\n\n"
-            "   >>> cytoscape_data(G, attrs={'name': 'foo', 'ident': 'bar'})\n\n"
-            "should instead be written as\n\n"
-            "   >>> cytoscape_data(G, name='foo', ident='bar')\n\n"
-            "in networkx 3.0.\n"
-            "The default values of 'name' and 'id' will not change."
-        )
-        warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-        name = attrs["name"]
-        ident = attrs["ident"]
-    # -------------------------------------------------- #
-
     if name == ident:
         raise nx.NetworkXError("name and ident must be different.")
 
@@ -115,7 +80,7 @@ def cytoscape_data(G, attrs=None, name="name", ident="id"):
     return jsondata
 
 
-def cytoscape_graph(data, attrs=None, name="name", ident="id"):
+def cytoscape_graph(data, name="name", ident="id"):
     """
     Create a NetworkX graph from a dictionary in cytoscape JSON format.
 
@@ -123,17 +88,6 @@ def cytoscape_graph(data, attrs=None, name="name", ident="id"):
     ----------
     data : dict
         A dictionary of data conforming to cytoscape JSON format.
-    attrs : dict or None (default=None)
-        A dictionary containing the keys 'name' and 'ident' which are mapped to
-        the 'name' and 'id' node elements in cyjs format. All other keys are
-        ignored. Default is `None` which results in the default mapping
-        ``dict(name="name", ident="id")``.
-
-        .. deprecated:: 2.6
-
-           The `attrs` keyword argument will be replaced with `name` and
-           `ident` in networkx 3.0
-
     name : string
         A string which is mapped to the 'name' node element in cyjs format.
         Must not have the same value as `ident`.
@@ -181,29 +135,6 @@ def cytoscape_graph(data, attrs=None, name="name", ident="id"):
     >>> G.edges(data=True)
     EdgeDataView([(0, 1, {'source': 0, 'target': 1})])
     """
-    # ------ TODO: Remove between the lines in 3.0 ----- #
-    if attrs is not None:
-        import warnings
-
-        msg = (
-            "\nThe `attrs` keyword argument of cytoscape_data is deprecated\n"
-            "and will be removed in networkx 3.0.\n"
-            "It is replaced with explicit `name` and `ident` keyword\n"
-            "arguments.\n"
-            "To make this warning go away and ensure usage is forward\n"
-            "compatible, replace `attrs` with `name` and `ident`,\n"
-            "for example:\n\n"
-            "   >>> cytoscape_data(G, attrs={'name': 'foo', 'ident': 'bar'})\n\n"
-            "should instead be written as\n\n"
-            "   >>> cytoscape_data(G, name='foo', ident='bar')\n\n"
-            "The default values of 'name' and 'id' will not change."
-        )
-        warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-        name = attrs["name"]
-        ident = attrs["ident"]
-    # -------------------------------------------------- #
-
     if name == ident:
         raise nx.NetworkXError("name and ident must be different.")
 
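Per the removed migration message, the deleted `attrs` dict is replaced by explicit `name` and `ident` keyword arguments. A minimal sketch of the 3.0 call style (not part of the patch; the "foo"/"bar" key names are illustrative):

```python
import networkx as nx
from networkx.readwrite.json_graph import cytoscape_data, cytoscape_graph

G = nx.path_graph(3)

# Old (now removed): cytoscape_data(G, attrs={"name": "foo", "ident": "bar"})
data = cytoscape_data(G, name="foo", ident="bar")

# Round-trip with the same keyword arguments recovers the graph structure.
H = cytoscape_graph(data, name="foo", ident="bar")
assert sorted(H.edges()) == sorted(G.edges())
assert set(H.nodes()) == set(G.nodes())
```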
diff --git a/networkx/readwrite/json_graph/jit.py b/networkx/readwrite/json_graph/jit.py
deleted file mode 100644
index 043f1a1..0000000
--- a/networkx/readwrite/json_graph/jit.py
+++ /dev/null
@@ -1,118 +0,0 @@
-"""
-Read and write NetworkX graphs as JavaScript InfoVis Toolkit (JIT) format JSON.
-
-See the `JIT documentation`_ for more examples.
-
-Format
-------
-var json = [
-  {
-    "id": "aUniqueIdentifier",
-    "name": "usually a nodes name",
-    "data": {
-      "some key": "some value",
-      "some other key": "some other value"
-     },
-    "adjacencies": [
-    {
-      nodeTo:"aNodeId",
-      data: {} //put whatever you want here
-    },
-    'other adjacencies go here...'
-  },
-
-  'other nodes go here...'
-];
-.. _JIT documentation: http://thejit.org
-"""
-
-import json
-import warnings
-
-import networkx as nx
-from networkx.utils.decorators import not_implemented_for
-
-__all__ = ["jit_graph", "jit_data"]
-
-
-def jit_graph(data, create_using=None):
-    """Read a graph from JIT JSON.
-
-    Parameters
-    ----------
-    data : JSON Graph Object
-
-    create_using : Networkx Graph, optional (default: Graph())
-        Return graph of this type. The provided instance will be cleared.
-
-    Returns
-    -------
-    G : NetworkX Graph built from create_using if provided.
-
-    .. deprecated:: 2.6
-    """
-    warnings.warn(
-        ("jit_graph is deprecated and will be removed in NetworkX 3.0."),
-        DeprecationWarning,
-    )
-
-    if create_using is None:
-        G = nx.Graph()
-    else:
-        G = create_using
-        G.clear()
-
-    if isinstance(data, str):
-        data = json.loads(data)
-
-    for node in data:
-        G.add_node(node["id"], **node["data"])
-        if node.get("adjacencies") is not None:
-            for adj in node["adjacencies"]:
-                G.add_edge(node["id"], adj["nodeTo"], **adj["data"])
-    return G
-
-
-@not_implemented_for("multigraph")
-def jit_data(G, indent=None, default=None):
-    """Returns data in JIT JSON format.
-
-    Parameters
-    ----------
-    G : NetworkX Graph
-
-    indent: optional, default=None
-        If indent is a non-negative integer, then JSON array elements and
-        object members will be pretty-printed with that indent level.
-        An indent level of 0, or negative, will only insert newlines.
-        None (the default) selects the most compact representation.
-
-    default: optional, default=None
-         It will pass the value to the json.dumps function in order to
-         be able to serialize custom objects used as nodes.
-
-    Returns
-    -------
-    data: JIT JSON string
-
-    .. deprecated:: 2.6
-    """
-    warnings.warn(
-        ("jit_data is deprecated and will be removed in NetworkX 3.0."),
-        DeprecationWarning,
-    )
-    json_graph = []
-    for node in G.nodes():
-        json_node = {"id": node, "name": node}
-        # node data
-        json_node["data"] = G.nodes[node]
-        # adjacencies
-        if G[node]:
-            json_node["adjacencies"] = []
-            for neighbour in G[node]:
-                adjacency = {"nodeTo": neighbour}
-                # adjacency data
-                adjacency["data"] = G.edges[node, neighbour]
-                json_node["adjacencies"].append(adjacency)
-        json_graph.append(json_node)
-    return json.dumps(json_graph, indent=indent, default=default)
diff --git a/networkx/readwrite/json_graph/tests/test_cytoscape.py b/networkx/readwrite/json_graph/tests/test_cytoscape.py
index e92e737..5d47f21 100644
--- a/networkx/readwrite/json_graph/tests/test_cytoscape.py
+++ b/networkx/readwrite/json_graph/tests/test_cytoscape.py
@@ -7,23 +7,6 @@ import networkx as nx
 from networkx.readwrite.json_graph import cytoscape_data, cytoscape_graph
 
 
-# TODO: To be removed when signature change complete in 3.0
-def test_attrs_deprecation(recwarn):
-    G = nx.path_graph(3)
-
-    # No warnings when `attrs` kwarg not used
-    data = cytoscape_data(G)
-    H = cytoscape_graph(data)
-    assert len(recwarn) == 0
-
-    # Future warning raised with `attrs` kwarg
-    attrs = {"name": "foo", "ident": "bar"}
-    with pytest.warns(DeprecationWarning):
-        data = cytoscape_data(G, attrs)
-    with pytest.warns(DeprecationWarning):
-        H = cytoscape_graph(data, attrs)
-
-
 def test_graph():
     G = nx.path_graph(4)
     H = cytoscape_graph(cytoscape_data(G))
diff --git a/networkx/readwrite/json_graph/tests/test_jit.py b/networkx/readwrite/json_graph/tests/test_jit.py
deleted file mode 100644
index 309c405..0000000
--- a/networkx/readwrite/json_graph/tests/test_jit.py
+++ /dev/null
@@ -1,66 +0,0 @@
-import json
-
-import pytest
-
-import networkx as nx
-from networkx.readwrite.json_graph import jit_data, jit_graph
-
-
-class TestJIT:
-    def test_jit(self):
-        G = nx.Graph()
-        G.add_node("Node1", node_data="foobar")
-        G.add_node("Node3", node_data="bar")
-        G.add_node("Node4")
-        G.add_edge("Node1", "Node2", weight=9, something="isSomething")
-        G.add_edge("Node2", "Node3", weight=4, something="isNotSomething")
-        G.add_edge("Node1", "Node2")
-        d = jit_data(G)
-        K = jit_graph(json.loads(d))
-        assert nx.is_isomorphic(G, K)
-
-    def test_jit_2(self):
-        G = nx.Graph()
-        G.add_node(1, node_data=3)
-        G.add_node(3, node_data=0)
-        G.add_edge(1, 2, weight=9, something=0)
-        G.add_edge(2, 3, weight=4, something=3)
-        G.add_edge(1, 2)
-        d = jit_data(G)
-        K = jit_graph(json.loads(d))
-        assert nx.is_isomorphic(G, K)
-
-    def test_jit_directed(self):
-        G = nx.DiGraph()
-        G.add_node(1, node_data=3)
-        G.add_node(3, node_data=0)
-        G.add_edge(1, 2, weight=9, something=0)
-        G.add_edge(2, 3, weight=4, something=3)
-        G.add_edge(1, 2)
-        d = jit_data(G)
-        K = jit_graph(json.loads(d), create_using=nx.DiGraph())
-        assert nx.is_isomorphic(G, K)
-
-    def test_jit_multi_directed(self):
-        G = nx.MultiDiGraph()
-        G.add_node(1, node_data=3)
-        G.add_node(3, node_data=0)
-        G.add_edge(1, 2, weight=9, something=0)
-        G.add_edge(2, 3, weight=4, something=3)
-        G.add_edge(1, 2)
-        pytest.raises(nx.NetworkXNotImplemented, jit_data, G)
-
-        H = nx.DiGraph(G)
-        d = jit_data(H)
-        K = jit_graph(json.loads(d), create_using=nx.MultiDiGraph())
-        assert nx.is_isomorphic(H, K)
-        K.add_edge(1, 2)
-        assert not nx.is_isomorphic(H, K)
-        assert nx.is_isomorphic(G, K)
-
-    def test_jit_round_trip(self):
-        G = nx.Graph()
-        d = nx.jit_data(G)
-        H = jit_graph(json.loads(d))
-        K = jit_graph(d)
-        assert nx.is_isomorphic(H, K)
diff --git a/networkx/readwrite/json_graph/tests/test_tree.py b/networkx/readwrite/json_graph/tests/test_tree.py
index 59a81df..643a14d 100644
--- a/networkx/readwrite/json_graph/tests/test_tree.py
+++ b/networkx/readwrite/json_graph/tests/test_tree.py
@@ -46,20 +46,3 @@ def test_exceptions():
         G = nx.MultiDiGraph()
         G.add_node(0)
         tree_data(G, 0, ident="node", children="node")
-
-
-# NOTE: To be removed when deprecation expires in 3.0
-def test_attrs_deprecation(recwarn):
-    G = nx.path_graph(3, create_using=nx.DiGraph)
-
-    # No warnings when `attrs` kwarg not used
-    data = tree_data(G, 0)
-    H = tree_graph(data)
-    assert len(recwarn) == 0
-
-    # DeprecationWarning issued when `attrs` is used
-    attrs = {"id": "foo", "children": "bar"}
-    with pytest.warns(DeprecationWarning):
-        data = tree_data(G, 0, attrs=attrs)
-    with pytest.warns(DeprecationWarning):
-        H = tree_graph(data, attrs=attrs)
diff --git a/networkx/readwrite/json_graph/tree.py b/networkx/readwrite/json_graph/tree.py
index ab7098c..3e9a4c9 100644
--- a/networkx/readwrite/json_graph/tree.py
+++ b/networkx/readwrite/json_graph/tree.py
@@ -5,8 +5,7 @@ import networkx as nx
 __all__ = ["tree_data", "tree_graph"]
 
 
-# NOTE: Remove attrs from signature in 3.0
-def tree_data(G, root, attrs=None, ident="id", children="children"):
+def tree_data(G, root, ident="id", children="children"):
     """Returns data in tree format that is suitable for JSON serialization
     and use in Javascript documents.
 
@@ -18,20 +17,6 @@ def tree_data(G, root, attrs=None, ident="id", children="children"):
     root : node
        The root of the tree
 
-    attrs : dict
-        A dictionary that contains two keys 'id' and 'children'. The
-        corresponding values provide the attribute names for storing
-        NetworkX-internal graph data. The values should be unique. Default
-        value: :samp:`dict(id='id', children='children')`.
-
-        If some user-defined graph data use these attribute names as data keys,
-        they may be silently dropped.
-
-        .. deprecated:: 2.6
-
-           The `attrs` keyword argument is replaced by `ident` and `children`
-           and will be removed in networkx 3.0
-
     ident : string
         Attribute name for storing NetworkX-internal graph data. `ident` must
         have a different value than `children`. The default is 'id'.
@@ -79,28 +64,6 @@ def tree_data(G, root, attrs=None, ident="id", children="children"):
     if not nx.is_weakly_connected(G):
         raise TypeError("G is not weakly connected.")
 
-    # NOTE: to be removed in 3.0
-    if attrs is not None:
-        import warnings
-
-        msg = (
-            "\nThe `attrs` keyword argument of tree_data is deprecated\n"
-            "and will be removed in networkx 3.0.\n"
-            "It is replaced with explicit `ident` and `children` "
-            "keyword arguments.\n"
-            "To make this warning go away and ensure usage is forward\n"
-            "compatible, replace `attrs` with `ident` and `children,\n"
-            "for example:\n\n"
-            "    >>> tree_data(G, root, attrs={'id': 'foo', 'children': 'bar'})\n\n"
-            "should instead be written as\n\n"
-            "    >>> tree_data(G, root, ident='foo', children='bar')\n\n"
-            "The default values of 'id' and 'children' will not change."
-        )
-        warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-        ident = attrs["id"]
-        children = attrs["children"]
-
     if ident == children:
         raise nx.NetworkXError("The values for `id` and `children` must be different.")
 
@@ -122,23 +85,13 @@ def tree_data(G, root, attrs=None, ident="id", children="children"):
     return data
 
 
-def tree_graph(data, attrs=None, ident="id", children="children"):
+def tree_graph(data, ident="id", children="children"):
     """Returns graph from tree data format.
 
     Parameters
     ----------
     data : dict
         Tree formatted graph data
-    attrs : dict
-        A dictionary that contains two keys 'id' and 'children'. The
-        corresponding values provide the attribute names for storing
-        NetworkX-internal graph data. The values should be unique. Default
-        value: :samp:`dict(id='id', children='children')`.
-
-        .. deprecated:: 2.6
-
-           The `attrs` keyword argument is replaced by `ident` and `children`
-           and will be removed in networkx 3.0
 
     ident : string
         Attribute name for storing NetworkX-internal graph data. `ident` must
@@ -164,26 +117,6 @@ def tree_graph(data, attrs=None, ident="id", children="children"):
     tree_data, node_link_data, adjacency_data
     """
     graph = nx.DiGraph()
-    if attrs is not None:
-        import warnings
-
-        msg = (
-            "\nThe `attrs` keyword argument of tree_graph is deprecated\n"
-            "and will be removed in networkx 3.0.\n"
-            "It is replaced with explicit `ident` and `children` "
-            "keyword arguments.\n"
-            "To make this warning go away and ensure usage is\n"
-            "forward compatible, replace `attrs` with `ident` and `children,\n"
-            "for example:\n\n"
-            "    >>> tree_graph(data, attrs={'id': 'foo', 'children': 'bar'})\n\n"
-            "should instead be written as\n\n"
-            "    >>> tree_graph(data, ident='foo', children='bar')\n\n"
-            "The default values of 'id' and 'children' will not change."
-        )
-        warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-        ident = attrs["id"]
-        children = attrs["children"]
 
     def add_children(parent, children_):
         for data in children_:
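The same `attrs` → explicit-keyword migration applies to the tree format: `ident` and `children` replace the removed dict. A sketch (not part of the patch; "foo"/"bar" are illustrative key names, and the input must be a directed tree):

```python
import networkx as nx
from networkx.readwrite.json_graph import tree_data, tree_graph

G = nx.DiGraph([(0, 1), (0, 2), (1, 3)])

# Old (now removed): tree_data(G, 0, attrs={"id": "foo", "children": "bar"})
data = tree_data(G, 0, ident="foo", children="bar")
assert data["foo"] == 0 and "bar" in data  # root id and its children list

# Round-trip with the same keyword arguments rebuilds the tree.
H = tree_graph(data, ident="foo", children="bar")
assert sorted(H.edges()) == [(0, 1), (0, 2), (1, 3)]
```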
diff --git a/networkx/readwrite/nx_shp.py b/networkx/readwrite/nx_shp.py
deleted file mode 100644
index dd48712..0000000
--- a/networkx/readwrite/nx_shp.py
+++ /dev/null
@@ -1,350 +0,0 @@
-"""
-*********
-Shapefile
-*********
-
-Generates a networkx.DiGraph from point and line shapefiles.
-
-"The Esri Shapefile or simply a shapefile is a popular geospatial vector
-data format for geographic information systems software. It is developed
-and regulated by Esri as a (mostly) open specification for data
-interoperability among Esri and other software products."
-See https://en.wikipedia.org/wiki/Shapefile for additional information.
-"""
-import warnings
-
-import networkx as nx
-
-__all__ = ["read_shp", "write_shp"]
-
-
-def read_shp(path, simplify=True, geom_attrs=True, strict=True):
-    """Generates a networkx.DiGraph from shapefiles.
-
-    .. deprecated:: 2.6
-
-       read_shp is deprecated and will be removed in NetworkX 3.0.
-       See https://networkx.org/documentation/latest/auto_examples/index.html#geospatial.
-
-    Point geometries are
-    translated into nodes, lines into edges. Coordinate tuples are used as
-    keys. Attributes are preserved, line geometries are simplified into start
-    and end coordinates. Accepts a single shapefile or directory of many
-    shapefiles.
-
-    "The Esri Shapefile or simply a shapefile is a popular geospatial vector
-    data format for geographic information systems software [1]_."
-
-    Parameters
-    ----------
-    path : file or string
-       File, directory, or filename to read.
-
-    simplify:  bool
-        If True, simplify line geometries to start and end coordinates.
-        If False, and line feature geometry has multiple segments, the
-        non-geometric attributes for that feature will be repeated for each
-        edge comprising that feature.
-
-    geom_attrs: bool
-        If True, include the Wkb, Wkt and Json geometry attributes with
-        each edge.
-
-        NOTE:  if these attributes are available, write_shp will use them
-        to write the geometry.  If nodes store the underlying coordinates for
-        the edge geometry as well (as they do when they are read via
-        this method) and they change, your geomety will be out of sync.
-
-    strict: bool
-        If True, raise NetworkXError when feature geometry is missing or
-        GeometryType is not supported.
-        If False, silently ignore missing or unsupported geometry in features.
-
-    Returns
-    -------
-    G : NetworkX graph
-
-    Raises
-    ------
-    ImportError
-       If ogr module is not available.
-
-    RuntimeError
-       If file cannot be open or read.
-
-    NetworkXError
-       If strict=True and feature is missing geometry or GeometryType is
-       not supported.
-
-    Examples
-    --------
-    >>> G = nx.read_shp("test.shp")  # doctest: +SKIP
-
-    References
-    ----------
-    .. [1] https://en.wikipedia.org/wiki/Shapefile
-    """
-    msg = (
-        "read_shp is deprecated and will be removed in 3.0."
-        "See https://networkx.org/documentation/latest/auto_examples/index.html#geospatial."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    try:
-        from osgeo import ogr
-    except ImportError as err:
-        raise ImportError("read_shp requires OGR: http://www.gdal.org/") from err
-
-    if not isinstance(path, str):
-        return
-
-    net = nx.DiGraph()
-    shp = ogr.Open(path)
-    if shp is None:
-        raise RuntimeError(f"Unable to open {path}")
-    for lyr in shp:
-        fields = [x.GetName() for x in lyr.schema]
-        for f in lyr:
-            g = f.geometry()
-            if g is None:
-                if strict:
-                    raise nx.NetworkXError("Bad data: feature missing geometry")
-                else:
-                    continue
-            flddata = [f.GetField(f.GetFieldIndex(x)) for x in fields]
-            attributes = dict(zip(fields, flddata))
-            attributes["ShpName"] = lyr.GetName()
-            # Note:  Using layer level geometry type
-            if g.GetGeometryType() == ogr.wkbPoint:
-                net.add_node((g.GetPoint_2D(0)), **attributes)
-            elif g.GetGeometryType() in (ogr.wkbLineString, ogr.wkbMultiLineString):
-                for edge in edges_from_line(g, attributes, simplify, geom_attrs):
-                    e1, e2, attr = edge
-                    net.add_edge(e1, e2)
-                    net[e1][e2].update(attr)
-            else:
-                if strict:
-                    raise nx.NetworkXError(
-                        f"GeometryType {g.GetGeometryType()} not supported"
-                    )
-
-    return net
-
-
-def edges_from_line(geom, attrs, simplify=True, geom_attrs=True):
-    """
-    Generate edges for each line in geom
-    Written as a helper for read_shp
-
-    Parameters
-    ----------
-
-    geom:  ogr line geometry
-        To be converted into an edge or edges
-
-    attrs:  dict
-        Attributes to be associated with all geoms
-
-    simplify:  bool
-        If True, simplify the line as in read_shp
-
-    geom_attrs:  bool
-        If True, add geom attributes to edge as in read_shp
-
-
-    Returns
-    -------
-     edges:  generator of edges
-        each edge is a tuple of form
-        (node1_coord, node2_coord, attribute_dict)
-        suitable for expanding into a networkx Graph add_edge call
-
-    .. deprecated:: 2.6
-    """
-    msg = (
-        "edges_from_line is deprecated and will be removed in 3.0."
-        "See https://networkx.org/documentation/latest/auto_examples/index.html#geospatial."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    try:
-        from osgeo import ogr
-    except ImportError as err:
-        raise ImportError(
-            "edges_from_line requires OGR: " "http://www.gdal.org/"
-        ) from err
-
-    if geom.GetGeometryType() == ogr.wkbLineString:
-        if simplify:
-            edge_attrs = attrs.copy()
-            last = geom.GetPointCount() - 1
-            if geom_attrs:
-                edge_attrs["Wkb"] = geom.ExportToWkb()
-                edge_attrs["Wkt"] = geom.ExportToWkt()
-                edge_attrs["Json"] = geom.ExportToJson()
-            yield (geom.GetPoint_2D(0), geom.GetPoint_2D(last), edge_attrs)
-        else:
-            for i in range(0, geom.GetPointCount() - 1):
-                pt1 = geom.GetPoint_2D(i)
-                pt2 = geom.GetPoint_2D(i + 1)
-                edge_attrs = attrs.copy()
-                if geom_attrs:
-                    segment = ogr.Geometry(ogr.wkbLineString)
-                    segment.AddPoint_2D(pt1[0], pt1[1])
-                    segment.AddPoint_2D(pt2[0], pt2[1])
-                    edge_attrs["Wkb"] = segment.ExportToWkb()
-                    edge_attrs["Wkt"] = segment.ExportToWkt()
-                    edge_attrs["Json"] = segment.ExportToJson()
-                    del segment
-                yield (pt1, pt2, edge_attrs)
-
-    elif geom.GetGeometryType() == ogr.wkbMultiLineString:
-        for i in range(geom.GetGeometryCount()):
-            geom_i = geom.GetGeometryRef(i)
-            yield from edges_from_line(geom_i, attrs, simplify, geom_attrs)
-
-
-def write_shp(G, outdir):
-    """Writes a networkx.DiGraph to two shapefiles, edges and nodes.
-
-    .. deprecated:: 2.6
-
-       write_shp is deprecated and will be removed in 3.0.
-       See https://networkx.org/documentation/latest/auto_examples/index.html#geospatial.
-
-    Nodes and edges are expected to have a Well Known Binary (Wkb) or
-    Well Known Text (Wkt) key in order to generate geometries. Also
-    acceptable are nodes with a numeric tuple key (x,y).
-
-    "The Esri Shapefile or simply a shapefile is a popular geospatial vector
-    data format for geographic information systems software [1]_."
-
-    Parameters
-    ----------
-    G : NetworkX graph
-        Directed graph
-    outdir : directory path
-       Output directory for the two shapefiles.
-
-    Returns
-    -------
-    None
-
-    Examples
-    --------
-    nx.write_shp(digraph, '/shapefiles') # doctest +SKIP
-
-    References
-    ----------
-    .. [1] https://en.wikipedia.org/wiki/Shapefile
-    """
-    msg = (
-        "write_shp is deprecated and will be removed in 3.0."
-        "See https://networkx.org/documentation/latest/auto_examples/index.html#geospatial."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    try:
-        from osgeo import ogr
-    except ImportError as err:
-        raise ImportError("write_shp requires OGR: http://www.gdal.org/") from err
-    # easier to debug in python if ogr throws exceptions
-    ogr.UseExceptions()
-
-    def netgeometry(key, data):
-        if "Wkb" in data:
-            geom = ogr.CreateGeometryFromWkb(data["Wkb"])
-        elif "Wkt" in data:
-            geom = ogr.CreateGeometryFromWkt(data["Wkt"])
-        elif type(key[0]).__name__ == "tuple":  # edge keys are packed tuples
-            geom = ogr.Geometry(ogr.wkbLineString)
-            _from, _to = key[0], key[1]
-            try:
-                geom.SetPoint(0, *_from)
-                geom.SetPoint(1, *_to)
-            except TypeError:
-                # assume user used tuple of int and choked ogr
-                _ffrom = [float(x) for x in _from]
-                _fto = [float(x) for x in _to]
-                geom.SetPoint(0, *_ffrom)
-                geom.SetPoint(1, *_fto)
-        else:
-            geom = ogr.Geometry(ogr.wkbPoint)
-            try:
-                geom.SetPoint(0, *key)
-            except TypeError:
-                # assume user used tuple of int and choked ogr
-                fkey = [float(x) for x in key]
-                geom.SetPoint(0, *fkey)
-
-        return geom
-
-    # Create_feature with new optional attributes arg (should be dict type)
-    def create_feature(geometry, lyr, attributes=None):
-        feature = ogr.Feature(lyr.GetLayerDefn())
-        feature.SetGeometry(g)
-        if attributes is not None:
-            # Loop through attributes, assigning data to each field
-            for field, data in attributes.items():
-                feature.SetField(field, data)
-        lyr.CreateFeature(feature)
-        feature.Destroy()
-
-    # Conversion dict between python and ogr types
-    OGRTypes = {int: ogr.OFTInteger, str: ogr.OFTString, float: ogr.OFTReal}
-
-    # Check/add fields from attribute data to Shapefile layers
-    def add_fields_to_layer(key, value, fields, layer):
-        # Field not in previous edges so add to dict
-        if type(value) in OGRTypes:
-            fields[key] = OGRTypes[type(value)]
-        else:
-            # Data type not supported, default to string (char 80)
-            fields[key] = ogr.OFTString
-        # Create the new field
-        newfield = ogr.FieldDefn(key, fields[key])
-        layer.CreateField(newfield)
-
-    drv = ogr.GetDriverByName("ESRI Shapefile")
-    shpdir = drv.CreateDataSource(outdir)
-    # delete pre-existing output first otherwise ogr chokes
-    try:
-        shpdir.DeleteLayer("nodes")
-    except:
-        pass
-    nodes = shpdir.CreateLayer("nodes", None, ogr.wkbPoint)
-
-    # Storage for node field names and their data types
-    node_fields = {}
-
-    def create_attributes(data, fields, layer):
-        attributes = {}  # storage for attribute data (indexed by field names)
-        for key, value in data.items():
-            # Reject spatial data not required for attribute table
-            if key != "Json" and key != "Wkt" and key != "Wkb" and key != "ShpName":
-                # Check/add field and data type to fields dict
-                if key not in fields:
-                    add_fields_to_layer(key, value, fields, layer)
-                # Store the data from new field to dict for CreateLayer()
-                attributes[key] = value
-        return attributes, layer
-
-    for n in G:
-        data = G.nodes[n]
-        g = netgeometry(n, data)
-        attributes, nodes = create_attributes(data, node_fields, nodes)
-        create_feature(g, nodes, attributes)
-
-    try:
-        shpdir.DeleteLayer("edges")
-    except:
-        pass
-    edges = shpdir.CreateLayer("edges", None, ogr.wkbLineString)
-
-    # New edge attribute write support merged into edge loop
-    edge_fields = {}  # storage for field names and their data types
-
-    for edge in G.edges(data=True):
-        data = G.get_edge_data(*edge)
-        g = netgeometry(edge, data)
-        attributes, edges = create_attributes(edge[2], edge_fields, edges)
-        create_feature(g, edges, attributes)
-
-    nodes, edges = None, None
diff --git a/networkx/readwrite/nx_yaml.py b/networkx/readwrite/nx_yaml.py
deleted file mode 100644
index b8ed9e5..0000000
--- a/networkx/readwrite/nx_yaml.py
+++ /dev/null
@@ -1,59 +0,0 @@
-"""
-****
-YAML
-****
-Read and write NetworkX graphs in YAML format.
-
-"YAML is a data serialization format designed for human readability
-and interaction with scripting languages."
-See http://www.yaml.org for documentation.
-
-Format
-------
-http://pyyaml.org/wiki/PyYAML
-
-"""
-
-
-def __dir__():
-    return ["read_yaml", "write_yaml"]
-
-
-def __getattr__(name):
-    """Remove functions and provide informative error messages."""
-    if name == "nx_yaml":
-        raise ImportError(
-            "\nThe nx_yaml module has been removed from NetworkX.\n"
-            "Please use the `yaml` package directly for working with yaml data.\n"
-            "For example, a networkx.Graph `G` can be written to and loaded\n"
-            "from a yaml file with:\n\n"
-            "    import yaml\n\n"
-            "    with open('path_to_yaml_file', 'w') as fh:\n"
-            "        yaml.dump(G, fh)\n"
-            "    with open('path_to_yaml_file', 'r') as fh:\n"
-            "        G = yaml.load(fh, Loader=yaml.Loader)\n\n"
-            "Note that yaml.Loader is considered insecure - see the pyyaml\n"
-            "documentation for further details.\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    if name == "read_yaml":
-        raise ImportError(
-            "\nread_yaml has been removed from NetworkX, please use `yaml`\n"
-            "directly:\n\n"
-            "    import yaml\n\n"
-            "    with open('path', 'r') as fh:\n"
-            "        yaml.load(fh, Loader=yaml.Loader)\n\n"
-            "Note that yaml.Loader is considered insecure - see the pyyaml\n"
-            "documentation for further details.\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    if name == "write_yaml":
-        raise ImportError(
-            "\nwrite_yaml has been removed from NetworkX, please use `yaml`\n"
-            "directly:\n\n"
-            "    import yaml\n\n"
-            "    with open('path_for_yaml_output', 'w') as fh:\n"
-            "        yaml.dump(G_to_be_yaml, fh)\n\n"
-            "This message will be removed in NetworkX 3.0."
-        )
-    raise AttributeError(f"module {__name__} has no attribute {name}")
diff --git a/networkx/readwrite/tests/test_edgelist.py b/networkx/readwrite/tests/test_edgelist.py
index abd1d3c..18b726f 100644
--- a/networkx/readwrite/tests/test_edgelist.py
+++ b/networkx/readwrite/tests/test_edgelist.py
@@ -183,7 +183,7 @@ class TestEdgelist:
 
     def test_write_edgelist_1(self):
         fh = io.BytesIO()
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_edges_from([(1, 2), (2, 3)])
         nx.write_edgelist(G, fh, data=False)
         fh.seek(0)
@@ -191,7 +191,7 @@ class TestEdgelist:
 
     def test_write_edgelist_2(self):
         fh = io.BytesIO()
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_edges_from([(1, 2), (2, 3)])
         nx.write_edgelist(G, fh, data=True)
         fh.seek(0)
@@ -199,7 +199,7 @@ class TestEdgelist:
 
     def test_write_edgelist_3(self):
         fh = io.BytesIO()
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_edge(1, 2, weight=2.0)
         G.add_edge(2, 3, weight=3.0)
         nx.write_edgelist(G, fh, data=True)
@@ -208,7 +208,7 @@ class TestEdgelist:
 
     def test_write_edgelist_4(self):
         fh = io.BytesIO()
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_edge(1, 2, weight=2.0)
         G.add_edge(2, 3, weight=3.0)
         nx.write_edgelist(G, fh, data=[("weight")])
diff --git a/networkx/readwrite/tests/test_getattr_nxyaml_removal.py b/networkx/readwrite/tests/test_getattr_nxyaml_removal.py
deleted file mode 100644
index 83fc059..0000000
--- a/networkx/readwrite/tests/test_getattr_nxyaml_removal.py
+++ /dev/null
@@ -1,38 +0,0 @@
-"""Test that informative exception messages are raised when attempting to
-access nx_yaml."""
-
-import pytest
-
-_msg_stub = "\n.* has been removed from NetworkX"
-
-
-def test_access_from_module():
-    with pytest.raises(ImportError, match=_msg_stub):
-        from networkx.readwrite.nx_yaml import read_yaml
-    with pytest.raises(ImportError, match=_msg_stub):
-        from networkx.readwrite.nx_yaml import write_yaml
-
-
-def test_access_from_nx_namespace():
-    import networkx as nx
-
-    with pytest.raises(ImportError, match=_msg_stub):
-        nx.read_yaml
-    with pytest.raises(ImportError, match=_msg_stub):
-        nx.write_yaml
-
-
-def test_access_from_readwrite_pkg():
-    from networkx import readwrite
-
-    with pytest.raises(ImportError, match=_msg_stub):
-        readwrite.read_yaml
-    with pytest.raises(ImportError, match=_msg_stub):
-        readwrite.write_yaml
-
-
-def test_accessing_nx_yaml():
-    import networkx as nx
-
-    with pytest.raises(ImportError, match=_msg_stub):
-        nx.nx_yaml
diff --git a/networkx/readwrite/tests/test_gexf.py b/networkx/readwrite/tests/test_gexf.py
index 7166c09..eba6aed 100644
--- a/networkx/readwrite/tests/test_gexf.py
+++ b/networkx/readwrite/tests/test_gexf.py
@@ -276,7 +276,7 @@ org/2001/XMLSchema-instance" xsi:schemaLocation="http://www.gexf.net/\
 
     def test_write_with_node_attributes(self):
         # Addresses #673.
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_edges_from([(0, 1), (1, 2), (2, 3)])
         for i in range(4):
             G.nodes[i]["id"] = i
diff --git a/networkx/readwrite/tests/test_gml.py b/networkx/readwrite/tests/test_gml.py
index 19fb7ed..68b0da6 100644
--- a/networkx/readwrite/tests/test_gml.py
+++ b/networkx/readwrite/tests/test_gml.py
@@ -215,7 +215,7 @@ graph
     def test_tuplelabels(self):
         # https://github.com/networkx/networkx/pull/1048
         # Writing tuple labels to GML failed.
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_edge((0, 1), (1, 0))
         data = "\n".join(nx.generate_gml(G, stringizer=literal_stringizer))
         answer = """graph [
@@ -571,10 +571,6 @@ graph
         G = nx.Graph()
         G.graph["data"] = frozenset([1, 2, 3])
         assert_generate_error(G, stringizer=literal_stringizer)
-        G = nx.Graph()
-        G.graph["data"] = []
-        assert_generate_error(G)
-        assert_generate_error(G, stringizer=len)
 
     def test_label_kwarg(self):
         G = nx.parse_gml(self.simple_data, label="id")
@@ -712,3 +708,23 @@ class TestPropertyLists:
             f.seek(0)
             graph = nx.read_gml(f)
         assert graph.nodes(data=True)["n1"] == {"properties": ["element"]}
+
+
+@pytest.mark.parametrize("coll", (list(), tuple()))
+def test_stringize_empty_list_tuple(coll):
+    G = nx.path_graph(2)
+    G.nodes[0]["test"] = coll  # test serializing an empty collection
+    f = io.BytesIO()
+    nx.write_gml(G, f)  # Smoke test - should not raise
+    f.seek(0)
+    H = nx.read_gml(f)
+    assert H.nodes["0"]["test"] == coll  # Check empty list round-trips properly
+    # Check full round-tripping. Note that nodes are loaded as strings by
+    # default, so there needs to be some remapping prior to comparison
+    H = nx.relabel_nodes(H, {"0": 0, "1": 1})
+    assert nx.utils.graphs_equal(G, H)
+    # Same as above, but use destringizer for node remapping. Should have no
+    # effect on node attr
+    f.seek(0)
+    H = nx.read_gml(f, destringizer=int)
+    assert nx.utils.graphs_equal(G, H)
diff --git a/networkx/readwrite/tests/test_gpickle.py b/networkx/readwrite/tests/test_gpickle.py
deleted file mode 100644
index 3ef83b5..0000000
--- a/networkx/readwrite/tests/test_gpickle.py
+++ /dev/null
@@ -1,75 +0,0 @@
-import os
-import tempfile
-
-import networkx as nx
-from networkx.utils import edges_equal, graphs_equal, nodes_equal
-
-
-class TestGpickle:
-    @classmethod
-    def setup_class(cls):
-        G = nx.Graph(name="test")
-        e = [("a", "b"), ("b", "c"), ("c", "d"), ("d", "e"), ("e", "f"), ("a", "f")]
-        G.add_edges_from(e, width=10)
-        G.add_node("g", color="green")
-        G.graph["number"] = 1
-        DG = nx.DiGraph(G)
-        MG = nx.MultiGraph(G)
-        MG.add_edge("a", "a")
-        MDG = nx.MultiDiGraph(G)
-        MDG.add_edge("a", "a")
-        fG = G.copy()
-        fDG = DG.copy()
-        fMG = MG.copy()
-        fMDG = MDG.copy()
-        nx.freeze(fG)
-        nx.freeze(fDG)
-        nx.freeze(fMG)
-        nx.freeze(fMDG)
-        cls.G = G
-        cls.DG = DG
-        cls.MG = MG
-        cls.MDG = MDG
-        cls.fG = fG
-        cls.fDG = fDG
-        cls.fMG = fMG
-        cls.fMDG = fMDG
-
-    def test_gpickle(self):
-        for G in [
-            self.G,
-            self.DG,
-            self.MG,
-            self.MDG,
-            self.fG,
-            self.fDG,
-            self.fMG,
-            self.fMDG,
-        ]:
-            (fd, fname) = tempfile.mkstemp()
-            nx.write_gpickle(G, fname)
-            Gin = nx.read_gpickle(fname)
-            assert nodes_equal(list(G.nodes(data=True)), list(Gin.nodes(data=True)))
-            assert edges_equal(list(G.edges(data=True)), list(Gin.edges(data=True)))
-            assert graphs_equal(G, Gin)
-            os.close(fd)
-            os.unlink(fname)
-
-    def test_protocol(self):
-        for G in [
-            self.G,
-            self.DG,
-            self.MG,
-            self.MDG,
-            self.fG,
-            self.fDG,
-            self.fMG,
-            self.fMDG,
-        ]:
-            with tempfile.TemporaryFile() as f:
-                nx.write_gpickle(G, f, 0)
-                f.seek(0)
-                Gin = nx.read_gpickle(f)
-                assert nodes_equal(list(G.nodes(data=True)), list(Gin.nodes(data=True)))
-                assert edges_equal(list(G.edges(data=True)), list(Gin.edges(data=True)))
-                assert graphs_equal(G, Gin)
diff --git a/networkx/readwrite/tests/test_graphml.py b/networkx/readwrite/tests/test_graphml.py
index cc37132..0d27719 100644
--- a/networkx/readwrite/tests/test_graphml.py
+++ b/networkx/readwrite/tests/test_graphml.py
@@ -1352,7 +1352,7 @@ class TestWriteGraphML(BaseGraphML):
                 ".//{http://graphml.graphdrawing.org/xmlns}edge"
             )
         ]
-        # verify edge id value is equal to sepcified attribute value
+        # verify edge id value is equal to specified attribute value
         assert sorted(edge_ids) == sorted(edge_attributes.values())
 
         # check graphml generated from generate_graphml()
@@ -1404,7 +1404,7 @@ class TestWriteGraphML(BaseGraphML):
                 ".//{http://graphml.graphdrawing.org/xmlns}edge"
             )
         ]
-        # verify edge id value is equal to sepcified attribute value
+        # verify edge id value is equal to specified attribute value
         assert sorted(edge_ids) == sorted(edge_attributes.values())
 
         # check graphml generated from generate_graphml()
@@ -1482,7 +1482,7 @@ class TestWriteGraphML(BaseGraphML):
         os.unlink(fname)
 
     def test_unicode_escape(self):
-        # test for handling json escaped stings in python 2 Issue #1880
+        # test for handling json escaped strings in python 2 Issue #1880
         import json
 
         a = dict(a='{"a": "123"}')  # an object with many chars to escape
diff --git a/networkx/readwrite/tests/test_p2g.py b/networkx/readwrite/tests/test_p2g.py
index 8280870..e4c50de 100644
--- a/networkx/readwrite/tests/test_p2g.py
+++ b/networkx/readwrite/tests/test_p2g.py
@@ -43,7 +43,7 @@ c
 
 """
         fh = io.BytesIO()
-        G = nx.OrderedDiGraph()
+        G = nx.DiGraph()
         G.name = "foo"
         G.add_edges_from([(1, 2), (2, 3)])
         write_p2g(G, fh)
diff --git a/networkx/readwrite/tests/test_shp.py b/networkx/readwrite/tests/test_shp.py
deleted file mode 100644
index 7ada2e8..0000000
--- a/networkx/readwrite/tests/test_shp.py
+++ /dev/null
@@ -1,288 +0,0 @@
-"""Unit tests for shp.
-"""
-
-import os
-import tempfile
-
-import pytest
-
-ogr = pytest.importorskip("osgeo.ogr")
-
-import networkx as nx
-
-
-class TestShp:
-    def setup_method(self):
-        def createlayer(driver, layerType=ogr.wkbLineString):
-            lyr = driver.CreateLayer("edges", None, layerType)
-            namedef = ogr.FieldDefn("Name", ogr.OFTString)
-            namedef.SetWidth(32)
-            lyr.CreateField(namedef)
-            return lyr
-
-        drv = ogr.GetDriverByName("ESRI Shapefile")
-
-        testdir = os.path.join(tempfile.gettempdir(), "shpdir")
-        shppath = os.path.join(tempfile.gettempdir(), "tmpshp.shp")
-        multi_shppath = os.path.join(tempfile.gettempdir(), "tmp_mshp.shp")
-
-        self.deletetmp(drv, testdir, shppath, multi_shppath)
-        os.mkdir(testdir)
-
-        self.names = ["a", "b", "c", "c"]  # edgenames
-        self.paths = (
-            [(1.0, 1.0), (2.0, 2.0)],
-            [(2.0, 2.0), (3.0, 3.0)],
-            [(0.9, 0.9), (4.0, 0.9), (4.0, 2.0)],
-        )
-
-        self.simplified_names = ["a", "b", "c"]  # edgenames
-        self.simplified_paths = (
-            [(1.0, 1.0), (2.0, 2.0)],
-            [(2.0, 2.0), (3.0, 3.0)],
-            [(0.9, 0.9), (4.0, 2.0)],
-        )
-
-        self.multi_names = ["a", "a", "a", "a"]  # edgenames
-
-        shp = drv.CreateDataSource(shppath)
-        lyr = createlayer(shp)
-
-        for path, name in zip(self.paths, self.names):
-            feat = ogr.Feature(lyr.GetLayerDefn())
-            g = ogr.Geometry(ogr.wkbLineString)
-            for p in path:
-                g.AddPoint_2D(*p)
-            feat.SetGeometry(g)
-            feat.SetField("Name", name)
-            lyr.CreateFeature(feat)
-
-        # create single record multiline shapefile for testing
-        multi_shp = drv.CreateDataSource(multi_shppath)
-        multi_lyr = createlayer(multi_shp, ogr.wkbMultiLineString)
-
-        multi_g = ogr.Geometry(ogr.wkbMultiLineString)
-        for path in self.paths:
-
-            g = ogr.Geometry(ogr.wkbLineString)
-            for p in path:
-                g.AddPoint_2D(*p)
-
-            multi_g.AddGeometry(g)
-
-        multi_feat = ogr.Feature(multi_lyr.GetLayerDefn())
-        multi_feat.SetGeometry(multi_g)
-        multi_feat.SetField("Name", "a")
-        multi_lyr.CreateFeature(multi_feat)
-
-        self.shppath = shppath
-        self.multi_shppath = multi_shppath
-        self.testdir = testdir
-        self.drv = drv
-
-    def deletetmp(self, drv, *paths):
-        for p in paths:
-            if os.path.exists(p):
-                drv.DeleteDataSource(p)
-
-    def testload(self):
-        def compare_graph_paths_names(g, paths, names):
-            expected = nx.DiGraph()
-            for p in paths:
-                nx.add_path(expected, p)
-            assert sorted(expected.nodes) == sorted(g.nodes)
-            assert sorted(expected.edges()) == sorted(g.edges())
-            g_names = [g.get_edge_data(s, e)["Name"] for s, e in g.edges()]
-            assert names == sorted(g_names)
-
-        # simplified
-        G = nx.read_shp(self.shppath)
-        compare_graph_paths_names(G, self.simplified_paths, self.simplified_names)
-
-        # unsimplified
-        G = nx.read_shp(self.shppath, simplify=False)
-        compare_graph_paths_names(G, self.paths, self.names)
-
-        # multiline unsimplified
-        G = nx.read_shp(self.multi_shppath, simplify=False)
-        compare_graph_paths_names(G, self.paths, self.multi_names)
-
-    def checkgeom(self, lyr, expected):
-        feature = lyr.GetNextFeature()
-        actualwkt = []
-        while feature:
-            actualwkt.append(feature.GetGeometryRef().ExportToWkt())
-            feature = lyr.GetNextFeature()
-        assert sorted(expected) == sorted(actualwkt)
-
-    def test_geometryexport(self):
-        expectedpoints_simple = (
-            "POINT (1 1)",
-            "POINT (2 2)",
-            "POINT (3 3)",
-            "POINT (0.9 0.9)",
-            "POINT (4 2)",
-        )
-        expectedlines_simple = (
-            "LINESTRING (1 1,2 2)",
-            "LINESTRING (2 2,3 3)",
-            "LINESTRING (0.9 0.9,4.0 0.9,4 2)",
-        )
-        expectedpoints = (
-            "POINT (1 1)",
-            "POINT (2 2)",
-            "POINT (3 3)",
-            "POINT (0.9 0.9)",
-            "POINT (4.0 0.9)",
-            "POINT (4 2)",
-        )
-        expectedlines = (
-            "LINESTRING (1 1,2 2)",
-            "LINESTRING (2 2,3 3)",
-            "LINESTRING (0.9 0.9,4.0 0.9)",
-            "LINESTRING (4.0 0.9,4 2)",
-        )
-
-        tpath = os.path.join(tempfile.gettempdir(), "shpdir")
-        G = nx.read_shp(self.shppath)
-        nx.write_shp(G, tpath)
-        shpdir = ogr.Open(tpath)
-        self.checkgeom(shpdir.GetLayerByName("nodes"), expectedpoints_simple)
-        self.checkgeom(shpdir.GetLayerByName("edges"), expectedlines_simple)
-
-        # Test unsimplified
-        # Nodes should have additional point,
-        # edges should be 'flattened'
-        G = nx.read_shp(self.shppath, simplify=False)
-        nx.write_shp(G, tpath)
-        shpdir = ogr.Open(tpath)
-        self.checkgeom(shpdir.GetLayerByName("nodes"), expectedpoints)
-        self.checkgeom(shpdir.GetLayerByName("edges"), expectedlines)
-
-    def test_attributeexport(self):
-        def testattributes(lyr, graph):
-            feature = lyr.GetNextFeature()
-            while feature:
-                coords = []
-                ref = feature.GetGeometryRef()
-                last = ref.GetPointCount() - 1
-                edge_nodes = (ref.GetPoint_2D(0), ref.GetPoint_2D(last))
-                name = feature.GetFieldAsString("Name")
-                assert graph.get_edge_data(*edge_nodes)["Name"] == name
-                feature = lyr.GetNextFeature()
-
-        tpath = os.path.join(tempfile.gettempdir(), "shpdir")
-
-        G = nx.read_shp(self.shppath)
-        nx.write_shp(G, tpath)
-        shpdir = ogr.Open(tpath)
-        edges = shpdir.GetLayerByName("edges")
-        testattributes(edges, G)
-
-    # Test export of node attributes in nx.write_shp (#2778)
-    def test_nodeattributeexport(self):
-        tpath = os.path.join(tempfile.gettempdir(), "shpdir")
-
-        G = nx.DiGraph()
-        A = (0, 0)
-        B = (1, 1)
-        C = (2, 2)
-        G.add_edge(A, B)
-        G.add_edge(A, C)
-        label = "node_label"
-        for n, d in G.nodes(data=True):
-            d["label"] = label
-        nx.write_shp(G, tpath)
-
-        H = nx.read_shp(tpath)
-        for n, d in H.nodes(data=True):
-            assert d["label"] == label
-
-    def test_wkt_export(self):
-        G = nx.DiGraph()
-        tpath = os.path.join(tempfile.gettempdir(), "shpdir")
-        points = ("POINT (0.9 0.9)", "POINT (4 2)")
-        line = ("LINESTRING (0.9 0.9,4 2)",)
-        G.add_node(1, Wkt=points[0])
-        G.add_node(2, Wkt=points[1])
-        G.add_edge(1, 2, Wkt=line[0])
-        try:
-            nx.write_shp(G, tpath)
-        except Exception as err:
-            assert False, err
-        shpdir = ogr.Open(tpath)
-        self.checkgeom(shpdir.GetLayerByName("nodes"), points)
-        self.checkgeom(shpdir.GetLayerByName("edges"), line)
-
-    def teardown_method(self):
-        self.deletetmp(self.drv, self.testdir, self.shppath)
-
-
-def test_read_shp_nofile():
-    with pytest.raises(RuntimeError):
-        G = nx.read_shp("hopefully_this_file_will_not_be_available")
-
-
-class TestMissingGeometry:
-    def setup_method(self):
-        self.setup_path()
-        self.delete_shapedir()
-        self.create_shapedir()
-
-    def teardown_method(self):
-        self.delete_shapedir()
-
-    def setup_path(self):
-        self.path = os.path.join(tempfile.gettempdir(), "missing_geometry")
-
-    def create_shapedir(self):
-        drv = ogr.GetDriverByName("ESRI Shapefile")
-        shp = drv.CreateDataSource(self.path)
-        lyr = shp.CreateLayer("nodes", None, ogr.wkbPoint)
-        feature = ogr.Feature(lyr.GetLayerDefn())
-        feature.SetGeometry(None)
-        lyr.CreateFeature(feature)
-        feature.Destroy()
-
-    def delete_shapedir(self):
-        drv = ogr.GetDriverByName("ESRI Shapefile")
-        if os.path.exists(self.path):
-            drv.DeleteDataSource(self.path)
-
-    def test_missing_geometry(self):
-        with pytest.raises(nx.NetworkXError):
-            G = nx.read_shp(self.path)
-
-
-class TestMissingAttrWrite:
-    def setup_method(self):
-        self.setup_path()
-        self.delete_shapedir()
-
-    def teardown_method(self):
-        self.delete_shapedir()
-
-    def setup_path(self):
-        self.path = os.path.join(tempfile.gettempdir(), "missing_attributes")
-
-    def delete_shapedir(self):
-        drv = ogr.GetDriverByName("ESRI Shapefile")
-        if os.path.exists(self.path):
-            drv.DeleteDataSource(self.path)
-
-    def test_missing_attributes(self):
-        G = nx.DiGraph()
-        A = (0, 0)
-        B = (1, 1)
-        C = (2, 2)
-        G.add_edge(A, B, foo=100)
-        G.add_edge(A, C)
-
-        nx.write_shp(G, self.path)
-        H = nx.read_shp(self.path)
-
-        for u, v, d in H.edges(data=True):
-            if u == A and v == B:
-                assert d["foo"] == 100
-            if u == A and v == C:
-                assert d["foo"] is None
diff --git a/networkx/relabel.py b/networkx/relabel.py
index ec34142..df3da44 100644
--- a/networkx/relabel.py
+++ b/networkx/relabel.py
@@ -6,6 +6,9 @@ __all__ = ["convert_node_labels_to_integers", "relabel_nodes"]
 def relabel_nodes(G, mapping, copy=True):
     """Relabel the nodes of the graph G according to a given mapping.
 
+    The original node ordering may not be preserved if `copy` is `False` and the
+    mapping includes overlap between old and new labels.
+
     Parameters
     ----------
     G : graph
@@ -111,16 +114,10 @@ def relabel_nodes(G, mapping, copy=True):
     --------
     convert_node_labels_to_integers
     """
-    # you can pass a function f(old_label) -> new_label
-    # or a class e.g. str(old_label) -> new_label
-    # but we'll just make a dictionary here regardless
-    # To allow classes, we check if __getitem__ is a bound method using __self__
-    if not (
-        hasattr(mapping, "__getitem__") and hasattr(mapping.__getitem__, "__self__")
-    ):
-        m = {n: mapping(n) for n in G}
-    else:
-        m = mapping
+    # you can pass any callable, e.g. a function f(old_label) -> new_label or a
+    # class such as str(old_label) -> new_label; we build a dictionary here regardless
+    m = {n: mapping(n) for n in G} if callable(mapping) else mapping
+
     if copy:
         return _relabel_copy(G, m)
     else:
@@ -128,9 +125,7 @@ def relabel_nodes(G, mapping, copy=True):
 
 
 def _relabel_inplace(G, mapping):
-    old_labels = set(mapping.keys())
-    new_labels = set(mapping.values())
-    if len(old_labels & new_labels) > 0:
+    if len(mapping.keys() & mapping.values()) > 0:
         # labels sets overlap
         # can we topological sort and still do the relabeling?
         D = nx.DiGraph(list(mapping.items()))
@@ -143,8 +138,8 @@ def _relabel_inplace(G, mapping):
                 "resolve the mapping. Use copy=True."
             ) from err
     else:
-        # non-overlapping label sets
-        nodes = old_labels
+        # non-overlapping label sets; keep nodes in the order they appear in G
+        nodes = [n for n in G if n in mapping]
 
     multigraph = G.is_multigraph()
     directed = G.is_directed()
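The new one-line dispatch in `relabel_nodes` can be sketched without networkx itself. `resolve_mapping` below is a hypothetical stand-in for the simplified logic, not part of the library; note that a plain sequence such as `"0123"` has `__getitem__` but is not callable, which is why it now fails later with an `AttributeError` (exercised by the new `test_relabel_nodes_non_mapping_or_callable` test in this diff):

```python
def resolve_mapping(mapping, nodes):
    # Mirror the simplified relabel_nodes dispatch: any callable is applied
    # to each node to build a dict; anything else is assumed to be a Mapping.
    return {n: mapping(n) for n in nodes} if callable(mapping) else mapping


print(resolve_mapping(str, [0, 1, 2]))       # {0: '0', 1: '1', 2: '2'}
print(resolve_mapping({0: "a"}, [0, 1, 2]))  # dict passed through: {0: 'a'}
```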
diff --git a/networkx/testing/__init__.py b/networkx/testing/__init__.py
deleted file mode 100644
index 884ac83..0000000
--- a/networkx/testing/__init__.py
+++ /dev/null
@@ -1,2 +0,0 @@
-from networkx.testing.utils import *
-from networkx.testing.test import run
diff --git a/networkx/testing/test.py b/networkx/testing/test.py
deleted file mode 100644
index 41739be..0000000
--- a/networkx/testing/test.py
+++ /dev/null
@@ -1,44 +0,0 @@
-import warnings
-
-
-def run(verbosity=1, doctest=False):
-    """Run NetworkX tests.
-
-    Parameters
-    ----------
-    verbosity: integer, optional
-      Level of detail in test reports.  Higher numbers provide more detail.
-
-    doctest: bool, optional
-      True to run doctests in code modules
-    """
-    warnings.warn(
-        (
-            "`run` is deprecated and will be removed in version 3.0.\n"
-            "Call `pytest` directly from the commandline instead.\n"
-        ),
-        DeprecationWarning,
-    )
-
-    import pytest
-
-    pytest_args = ["-l"]
-
-    if verbosity and int(verbosity) > 1:
-        pytest_args += ["-" + "v" * (int(verbosity) - 1)]
-
-    if doctest:
-        pytest_args += ["--doctest-modules"]
-
-    pytest_args += ["--pyargs", "networkx"]
-
-    try:
-        code = pytest.main(pytest_args)
-    except SystemExit as err:
-        code = err.code
-
-    return code == 0
-
-
-if __name__ == "__main__":
-    run()
diff --git a/networkx/testing/tests/__init__.py b/networkx/testing/tests/__init__.py
deleted file mode 100644
index e69de29..0000000
diff --git a/networkx/testing/tests/test_utils.py b/networkx/testing/tests/test_utils.py
deleted file mode 100644
index 32804f3..0000000
--- a/networkx/testing/tests/test_utils.py
+++ /dev/null
@@ -1,160 +0,0 @@
-import networkx as nx
-from networkx.testing import assert_edges_equal, assert_graphs_equal, assert_nodes_equal
-
-# thanks to numpy for this GenericTest class (numpy/testing/test_utils.py)
-
-
-class _GenericTest:
-    @classmethod
-    def _test_equal(cls, a, b):
-        cls._assert_func(a, b)
-
-    @classmethod
-    def _test_not_equal(cls, a, b):
-        try:
-            cls._assert_func(a, b)
-            passed = True
-        except AssertionError:
-            pass
-        else:
-            raise AssertionError("a and b are found equal but are not")
-
-
-class TestNodesEqual(_GenericTest):
-    _assert_func = assert_nodes_equal
-
-    def test_nodes_equal(self):
-        a = [1, 2, 5, 4]
-        b = [4, 5, 1, 2]
-        self._test_equal(a, b)
-
-    def test_nodes_not_equal(self):
-        a = [1, 2, 5, 4]
-        b = [4, 5, 1, 3]
-        self._test_not_equal(a, b)
-
-    def test_nodes_with_data_equal(self):
-        G = nx.Graph()
-        G.add_nodes_from([1, 2, 3], color="red")
-        H = nx.Graph()
-        H.add_nodes_from([1, 2, 3], color="red")
-        self._test_equal(G.nodes(data=True), H.nodes(data=True))
-
-    def test_edges_with_data_not_equal(self):
-        G = nx.Graph()
-        G.add_nodes_from([1, 2, 3], color="red")
-        H = nx.Graph()
-        H.add_nodes_from([1, 2, 3], color="blue")
-        self._test_not_equal(G.nodes(data=True), H.nodes(data=True))
-
-
-class TestEdgesEqual(_GenericTest):
-    _assert_func = assert_edges_equal
-
-    def test_edges_equal(self):
-        a = [(1, 2), (5, 4)]
-        b = [(4, 5), (1, 2)]
-        self._test_equal(a, b)
-
-    def test_edges_not_equal(self):
-        a = [(1, 2), (5, 4)]
-        b = [(4, 5), (1, 3)]
-        self._test_not_equal(a, b)
-
-    def test_edges_with_data_equal(self):
-        G = nx.MultiGraph()
-        nx.add_path(G, [0, 1, 2], weight=1)
-        H = nx.MultiGraph()
-        nx.add_path(H, [0, 1, 2], weight=1)
-        self._test_equal(G.edges(data=True, keys=True), H.edges(data=True, keys=True))
-
-    def test_edges_with_data_not_equal(self):
-        G = nx.MultiGraph()
-        nx.add_path(G, [0, 1, 2], weight=1)
-        H = nx.MultiGraph()
-        nx.add_path(H, [0, 1, 2], weight=2)
-        self._test_not_equal(
-            G.edges(data=True, keys=True), H.edges(data=True, keys=True)
-        )
-
-    def test_no_edges(self):
-        G = nx.MultiGraph()
-        H = nx.MultiGraph()
-        self._test_equal(G.edges(data=True, keys=True), H.edges(data=True, keys=True))
-
-    def test_duplicate_edges(self):
-        a = [(1, 2), (5, 4), (1, 2)]
-        b = [(4, 5), (1, 2)]
-        self._test_not_equal(a, b)
-
-    def test_duplicate_edges_with_data(self):
-        a = [(1, 2, {"weight": 10}), (5, 4), (1, 2, {"weight": 1})]
-        b = [(4, 5), (1, 2), (1, 2, {"weight": 1})]
-        self._test_not_equal(a, b)
-
-    def test_order_of_edges_with_data(self):
-        a = [(1, 2, {"weight": 10}), (1, 2, {"weight": 1})]
-        b = [(1, 2, {"weight": 1}), (1, 2, {"weight": 10})]
-        self._test_equal(a, b)
-
-    def test_order_of_multiedges(self):
-        wt1 = {"weight": 1}
-        wt2 = {"weight": 2}
-        a = [(1, 2, wt1), (1, 2, wt1), (1, 2, wt2)]
-        b = [(1, 2, wt1), (1, 2, wt2), (1, 2, wt2)]
-        self._test_not_equal(a, b)
-
-    def test_order_of_edges_with_keys(self):
-        a = [(1, 2, 0, {"weight": 10}), (1, 2, 1, {"weight": 1}), (1, 2, 2)]
-        b = [(1, 2, 1, {"weight": 1}), (1, 2, 2), (1, 2, 0, {"weight": 10})]
-        self._test_equal(a, b)
-        a = [(1, 2, 1, {"weight": 10}), (1, 2, 0, {"weight": 1}), (1, 2, 2)]
-        b = [(1, 2, 1, {"weight": 1}), (1, 2, 2), (1, 2, 0, {"weight": 10})]
-        self._test_not_equal(a, b)
-
-
-class TestGraphsEqual(_GenericTest):
-    _assert_func = assert_graphs_equal
-
-    def test_graphs_equal(self):
-        G = nx.path_graph(4)
-        H = nx.Graph()
-        nx.add_path(H, range(4))
-        self._test_equal(G, H)
-
-    def test_digraphs_equal(self):
-        G = nx.path_graph(4, create_using=nx.DiGraph())
-        H = nx.DiGraph()
-        nx.add_path(H, range(4))
-        self._test_equal(G, H)
-
-    def test_multigraphs_equal(self):
-        G = nx.path_graph(4, create_using=nx.MultiGraph())
-        H = nx.MultiGraph()
-        nx.add_path(H, range(4))
-        self._test_equal(G, H)
-
-    def test_multidigraphs_equal(self):
-        G = nx.path_graph(4, create_using=nx.MultiDiGraph())
-        H = nx.MultiDiGraph()
-        nx.add_path(H, range(4))
-        self._test_equal(G, H)
-
-    def test_graphs_not_equal(self):
-        G = nx.path_graph(4)
-        H = nx.Graph()
-        nx.add_cycle(H, range(4))
-        self._test_not_equal(G, H)
-
-    def test_graphs_not_equal2(self):
-        G = nx.path_graph(4)
-        H = nx.Graph()
-        nx.add_path(H, range(3))
-        self._test_not_equal(G, H)
-
-    def test_graphs_not_equal3(self):
-        G = nx.path_graph(4)
-        H = nx.Graph()
-        nx.add_path(H, range(4))
-        H.name = "path_graph(4)"
-        self._test_not_equal(G, H)
diff --git a/networkx/testing/utils.py b/networkx/testing/utils.py
deleted file mode 100644
index cf6935d..0000000
--- a/networkx/testing/utils.py
+++ /dev/null
@@ -1,54 +0,0 @@
-import warnings
-
-from networkx.utils import edges_equal, graphs_equal, nodes_equal
-
-__all__ = [
-    "assert_nodes_equal",
-    "assert_edges_equal",
-    "assert_graphs_equal",
-    "almost_equal",
-]
-
-
-def almost_equal(x, y, places=7):
-    warnings.warn(
-        (
-            "`almost_equal` is deprecated and will be removed in version 3.0.\n"
-            "Use `pytest.approx` instead.\n"
-        ),
-        DeprecationWarning,
-    )
-    return round(abs(x - y), places) == 0
-
-
-def assert_nodes_equal(nodes1, nodes2):
-    warnings.warn(
-        (
-            "`assert_nodes_equal` is deprecated and will be removed in version 3.0.\n"
-            "Use `from networkx.utils import nodes_equal` and `assert nodes_equal` instead.\n"
-        ),
-        DeprecationWarning,
-    )
-    assert nodes_equal(nodes1, nodes2)
-
-
-def assert_edges_equal(edges1, edges2):
-    warnings.warn(
-        (
-            "`assert_edges_equal` is deprecated and will be removed in version 3.0.\n"
-            "Use `from networkx.utils import edges_equal` and `assert edges_equal` instead.\n"
-        ),
-        DeprecationWarning,
-    )
-    assert edges_equal(edges1, edges2)
-
-
-def assert_graphs_equal(graph1, graph2):
-    warnings.warn(
-        (
-            "`assert_graphs_equal` is deprecated and will be removed in version 3.0.\n"
-            "Use `from networkx.utils import graphs_equal` and `assert graphs_equal` instead.\n"
-        ),
-        DeprecationWarning,
-    )
-    assert graphs_equal(graph1, graph2)
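The removed `almost_equal` helper rounded `abs(x - y)` to `places` decimals and compared against zero. For code that cannot adopt the recommended `pytest.approx`, a stdlib near-equivalent is sketched below; it is an approximation, since `round` applies banker's rounding exactly at the boundary:

```python
import math


def almost_equal(x, y, places=7):
    # round(abs(x - y), places) == 0 accepts differences up to roughly
    # 0.5 * 10**-places, which math.isclose expresses as an absolute tolerance.
    return math.isclose(x, y, rel_tol=0.0, abs_tol=0.5 * 10**-places)


print(almost_equal(1.0, 1.0 + 1e-8))  # True
print(almost_equal(1.0, 1.001))       # False
```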
diff --git a/networkx/tests/test_all_random_functions.py b/networkx/tests/test_all_random_functions.py
index e8aaba1..828c313 100644
--- a/networkx/tests/test_all_random_functions.py
+++ b/networkx/tests/test_all_random_functions.py
@@ -85,7 +85,6 @@ def run_all_random_functions(seed):
     )
     t(nx.betweenness_centrality, G, seed=seed)
     t(nx.edge_betweenness_centrality, G, seed=seed)
-    t(nx.edge_betweenness, G, seed=seed)
     t(nx.approximate_current_flow_betweenness_centrality, G, seed=seed)
     # print("kernighan")
     t(nx.algorithms.community.kernighan_lin_bisection, G, seed=seed)
@@ -135,6 +134,7 @@ def run_all_random_functions(seed):
     t(nx.random_clustered_graph, joint_degree_sequence, seed=seed)
     constructor = [(3, 3, 0.5), (10, 10, 0.7)]
     t(nx.random_shell_graph, constructor, seed=seed)
+    t(nx.random_triad, G.to_directed(), seed=seed)
     mapping = {1: 0.4, 2: 0.3, 3: 0.3}
     t(nx.utils.random_weighted_sample, mapping, k, seed=seed)
     t(nx.utils.weighted_choice, mapping, seed=seed)
diff --git a/networkx/tests/test_convert.py b/networkx/tests/test_convert.py
index 5c0a904..44bed94 100644
--- a/networkx/tests/test_convert.py
+++ b/networkx/tests/test_convert.py
@@ -249,11 +249,11 @@ class TestConvert:
 
     def test_attribute_dict_integrity(self):
         # we must not replace dict-like graph data structures with dicts
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_nodes_from("abc")
-        H = to_networkx_graph(G, create_using=nx.OrderedGraph)
+        H = to_networkx_graph(G, create_using=nx.Graph)
         assert list(H.nodes) == list(G.nodes)
-        H = nx.OrderedDiGraph(G)
+        H = nx.DiGraph(G)
         assert list(H.nodes) == list(G.nodes)
 
     def test_to_edgelist(self):
diff --git a/networkx/tests/test_convert_numpy.py b/networkx/tests/test_convert_numpy.py
index e341ab2..b370f12 100644
--- a/networkx/tests/test_convert_numpy.py
+++ b/networkx/tests/test_convert_numpy.py
@@ -8,258 +8,6 @@ from networkx.generators.classic import barbell_graph, cycle_graph, path_graph
 from networkx.utils import graphs_equal
 
 
-def test_to_numpy_matrix_deprecation():
-    pytest.deprecated_call(nx.to_numpy_matrix, nx.Graph())
-
-
-def test_from_numpy_matrix_deprecation():
-    pytest.deprecated_call(nx.from_numpy_matrix, np.eye(2))
-
-
-def test_to_numpy_recarray_deprecation():
-    pytest.deprecated_call(nx.to_numpy_recarray, nx.Graph())
-
-
-class TestConvertNumpyMatrix:
-    # TODO: This entire class can be removed when to/from_numpy_matrix
-    # deprecation expires
-    def setup_method(self):
-        self.G1 = barbell_graph(10, 3)
-        self.G2 = cycle_graph(10, create_using=nx.DiGraph)
-
-        self.G3 = self.create_weighted(nx.Graph())
-        self.G4 = self.create_weighted(nx.DiGraph())
-
-    def test_exceptions(self):
-        G = np.array("a")
-        pytest.raises(nx.NetworkXError, nx.to_networkx_graph, G)
-
-    def create_weighted(self, G):
-        g = cycle_graph(4)
-        G.add_nodes_from(g)
-        G.add_weighted_edges_from((u, v, 10 + u) for u, v in g.edges())
-        return G
-
-    def assert_equal(self, G1, G2):
-        assert sorted(G1.nodes()) == sorted(G2.nodes())
-        assert sorted(G1.edges()) == sorted(G2.edges())
-
-    def identity_conversion(self, G, A, create_using):
-        assert A.sum() > 0
-        GG = nx.from_numpy_matrix(A, create_using=create_using)
-        self.assert_equal(G, GG)
-        GW = nx.to_networkx_graph(A, create_using=create_using)
-        self.assert_equal(G, GW)
-        GI = nx.empty_graph(0, create_using).__class__(A)
-        self.assert_equal(G, GI)
-
-    def test_shape(self):
-        "Conversion from non-square array."
-        A = np.array([[1, 2, 3], [4, 5, 6]])
-        pytest.raises(nx.NetworkXError, nx.from_numpy_matrix, A)
-
-    def test_identity_graph_matrix(self):
-        "Conversion from graph to matrix to graph."
-        A = nx.to_numpy_matrix(self.G1)
-        self.identity_conversion(self.G1, A, nx.Graph())
-
-    def test_identity_graph_array(self):
-        "Conversion from graph to array to graph."
-        A = nx.to_numpy_matrix(self.G1)
-        A = np.asarray(A)
-        self.identity_conversion(self.G1, A, nx.Graph())
-
-    def test_identity_digraph_matrix(self):
-        """Conversion from digraph to matrix to digraph."""
-        A = nx.to_numpy_matrix(self.G2)
-        self.identity_conversion(self.G2, A, nx.DiGraph())
-
-    def test_identity_digraph_array(self):
-        """Conversion from digraph to array to digraph."""
-        A = nx.to_numpy_matrix(self.G2)
-        A = np.asarray(A)
-        self.identity_conversion(self.G2, A, nx.DiGraph())
-
-    def test_identity_weighted_graph_matrix(self):
-        """Conversion from weighted graph to matrix to weighted graph."""
-        A = nx.to_numpy_matrix(self.G3)
-        self.identity_conversion(self.G3, A, nx.Graph())
-
-    def test_identity_weighted_graph_array(self):
-        """Conversion from weighted graph to array to weighted graph."""
-        A = nx.to_numpy_matrix(self.G3)
-        A = np.asarray(A)
-        self.identity_conversion(self.G3, A, nx.Graph())
-
-    def test_identity_weighted_digraph_matrix(self):
-        """Conversion from weighted digraph to matrix to weighted digraph."""
-        A = nx.to_numpy_matrix(self.G4)
-        self.identity_conversion(self.G4, A, nx.DiGraph())
-
-    def test_identity_weighted_digraph_array(self):
-        """Conversion from weighted digraph to array to weighted digraph."""
-        A = nx.to_numpy_matrix(self.G4)
-        A = np.asarray(A)
-        self.identity_conversion(self.G4, A, nx.DiGraph())
-
-    def test_nodelist(self):
-        """Conversion from graph to matrix to graph with nodelist."""
-        P4 = path_graph(4)
-        P3 = path_graph(3)
-        nodelist = list(P3)
-        A = nx.to_numpy_matrix(P4, nodelist=nodelist)
-        GA = nx.Graph(A)
-        self.assert_equal(GA, P3)
-
-        assert nx.to_numpy_matrix(P3, nodelist=[]).shape == (0, 0)
-        # Test nodelist duplicates.
-        long_nodelist = nodelist + [0]
-        pytest.raises(nx.NetworkXError, nx.to_numpy_matrix, P3, nodelist=long_nodelist)
-
-        # Test nodelist contains non-nodes
-        nonnodelist = [-1, 0, 1, 2]
-        pytest.raises(nx.NetworkXError, nx.to_numpy_matrix, P3, nodelist=nonnodelist)
-
-    def test_weight_keyword(self):
-        WP4 = nx.Graph()
-        WP4.add_edges_from((n, n + 1, dict(weight=0.5, other=0.3)) for n in range(3))
-        P4 = path_graph(4)
-        A = nx.to_numpy_matrix(P4)
-        np.testing.assert_equal(A, nx.to_numpy_matrix(WP4, weight=None))
-        np.testing.assert_equal(0.5 * A, nx.to_numpy_matrix(WP4))
-        np.testing.assert_equal(0.3 * A, nx.to_numpy_matrix(WP4, weight="other"))
-
-    def test_from_numpy_matrix_type(self):
-        pytest.importorskip("scipy")
-
-        A = np.matrix([[1]])
-        G = nx.from_numpy_matrix(A)
-        assert type(G[0][0]["weight"]) == int
-
-        A = np.matrix([[1]]).astype(float)
-        G = nx.from_numpy_matrix(A)
-        assert type(G[0][0]["weight"]) == float
-
-        A = np.matrix([[1]]).astype(str)
-        G = nx.from_numpy_matrix(A)
-        assert type(G[0][0]["weight"]) == str
-
-        A = np.matrix([[1]]).astype(bool)
-        G = nx.from_numpy_matrix(A)
-        assert type(G[0][0]["weight"]) == bool
-
-        A = np.matrix([[1]]).astype(complex)
-        G = nx.from_numpy_matrix(A)
-        assert type(G[0][0]["weight"]) == complex
-
-        A = np.matrix([[1]]).astype(object)
-        pytest.raises(TypeError, nx.from_numpy_matrix, A)
-
-        G = nx.cycle_graph(3)
-        A = nx.adjacency_matrix(G).todense()
-        H = nx.from_numpy_matrix(A)
-        assert all(type(m) == int and type(n) == int for m, n in H.edges())
-        H = nx.from_numpy_array(A)
-        assert all(type(m) == int and type(n) == int for m, n in H.edges())
-
-    def test_from_numpy_matrix_dtype(self):
-        dt = [("weight", float), ("cost", int)]
-        A = np.matrix([[(1.0, 2)]], dtype=dt)
-        G = nx.from_numpy_matrix(A)
-        assert type(G[0][0]["weight"]) == float
-        assert type(G[0][0]["cost"]) == int
-        assert G[0][0]["cost"] == 2
-        assert G[0][0]["weight"] == 1.0
-
-    def test_to_numpy_recarray(self):
-        G = nx.Graph()
-        G.add_edge(1, 2, weight=7.0, cost=5)
-        A = nx.to_numpy_recarray(G, dtype=[("weight", float), ("cost", int)])
-        assert sorted(A.dtype.names) == ["cost", "weight"]
-        assert A.weight[0, 1] == 7.0
-        assert A.weight[0, 0] == 0.0
-        assert A.cost[0, 1] == 5
-        assert A.cost[0, 0] == 0
-
-    def test_numpy_multigraph(self):
-        G = nx.MultiGraph()
-        G.add_edge(1, 2, weight=7)
-        G.add_edge(1, 2, weight=70)
-        A = nx.to_numpy_matrix(G)
-        assert A[1, 0] == 77
-        A = nx.to_numpy_matrix(G, multigraph_weight=min)
-        assert A[1, 0] == 7
-        A = nx.to_numpy_matrix(G, multigraph_weight=max)
-        assert A[1, 0] == 70
-
-    def test_from_numpy_matrix_parallel_edges(self):
-        """Tests that the :func:`networkx.from_numpy_matrix` function
-        interprets integer weights as the number of parallel edges when
-        creating a multigraph.
-
-        """
-        A = np.matrix([[1, 1], [1, 2]])
-        # First, with a simple graph, each integer entry in the adjacency
-        # matrix is interpreted as the weight of a single edge in the graph.
-        expected = nx.DiGraph()
-        edges = [(0, 0), (0, 1), (1, 0)]
-        expected.add_weighted_edges_from([(u, v, 1) for (u, v) in edges])
-        expected.add_edge(1, 1, weight=2)
-        actual = nx.from_numpy_matrix(A, parallel_edges=True, create_using=nx.DiGraph)
-        assert graphs_equal(actual, expected)
-        actual = nx.from_numpy_matrix(A, parallel_edges=False, create_using=nx.DiGraph)
-        assert graphs_equal(actual, expected)
-        # Now each integer entry in the adjacency matrix is interpreted as the
-        # number of parallel edges in the graph if the appropriate keyword
-        # argument is specified.
-        edges = [(0, 0), (0, 1), (1, 0), (1, 1), (1, 1)]
-        expected = nx.MultiDiGraph()
-        expected.add_weighted_edges_from([(u, v, 1) for (u, v) in edges])
-        actual = nx.from_numpy_matrix(
-            A, parallel_edges=True, create_using=nx.MultiDiGraph
-        )
-        assert graphs_equal(actual, expected)
-        expected = nx.MultiDiGraph()
-        expected.add_edges_from(set(edges), weight=1)
-        # The sole self-loop (edge 0) on vertex 1 should have weight 2.
-        expected[1][1][0]["weight"] = 2
-        actual = nx.from_numpy_matrix(
-            A, parallel_edges=False, create_using=nx.MultiDiGraph
-        )
-        assert graphs_equal(actual, expected)
-
-    def test_symmetric(self):
-        """Tests that a symmetric matrix has edges added only once to an
-        undirected multigraph when using :func:`networkx.from_numpy_matrix`.
-
-        """
-        A = np.matrix([[0, 1], [1, 0]])
-        G = nx.from_numpy_matrix(A, create_using=nx.MultiGraph)
-        expected = nx.MultiGraph()
-        expected.add_edge(0, 1, weight=1)
-        assert graphs_equal(G, expected)
-
-    def test_dtype_int_graph(self):
-        """Test that setting dtype int actually gives an integer matrix.
-
-        For more information, see GitHub pull request #1363.
-
-        """
-        G = nx.complete_graph(3)
-        A = nx.to_numpy_matrix(G, dtype=int)
-        assert A.dtype == int
-
-    def test_dtype_int_multigraph(self):
-        """Test that setting dtype int actually gives an integer matrix.
-
-        For more information, see GitHub pull request #1363.
-
-        """
-        G = nx.MultiGraph(nx.complete_graph(3))
-        A = nx.to_numpy_matrix(G, dtype=int)
-        assert A.dtype == int
-
-
 class TestConvertNumpyArray:
     def setup_method(self):
         self.G1 = barbell_graph(10, 3)
@@ -434,75 +182,6 @@ class TestConvertNumpyArray:
         assert A.dtype == int
 
 
-@pytest.fixture
-def recarray_test_graph():
-    G = nx.Graph()
-    G.add_edge(1, 2, weight=7.0, cost=5)
-    return G
-
-
-def test_to_numpy_recarray(recarray_test_graph):
-    A = nx.to_numpy_recarray(
-        recarray_test_graph, dtype=[("weight", float), ("cost", int)]
-    )
-    assert sorted(A.dtype.names) == ["cost", "weight"]
-    assert A.weight[0, 1] == 7.0
-    assert A.weight[0, 0] == 0.0
-    assert A.cost[0, 1] == 5
-    assert A.cost[0, 0] == 0
-    with pytest.raises(AttributeError, match="has no attribute"):
-        A.color[0, 1]
-
-
-def test_to_numpy_recarray_default_dtype(recarray_test_graph):
-    A = nx.to_numpy_recarray(recarray_test_graph)
-    assert A.dtype.names == ("weight",)
-    assert A.weight[0, 0] == 0
-    assert A.weight[0, 1] == 7
-    with pytest.raises(AttributeError, match="has no attribute"):
-        A.cost[0, 1]
-
-
-def test_to_numpy_recarray_directed(recarray_test_graph):
-    G = recarray_test_graph.to_directed()
-    G.remove_edge(2, 1)
-    A = nx.to_numpy_recarray(G, dtype=[("weight", float), ("cost", int)])
-    np.testing.assert_array_equal(A.weight, np.array([[0, 7.0], [0, 0]]))
-    np.testing.assert_array_equal(A.cost, np.array([[0, 5], [0, 0]]))
-
-
-def test_to_numpy_recarray_default_dtype_no_weight():
-    G = nx.Graph()
-    G.add_edge(0, 1, color="red")
-    with pytest.raises(KeyError):
-        A = nx.to_numpy_recarray(G)
-    A = nx.to_numpy_recarray(G, dtype=[("color", "U8")])
-    assert A.color[0, 1] == "red"
-
-
-@pytest.fixture
-def recarray_nodelist_test_graph():
-    G = nx.Graph()
-    G.add_edges_from(
-        [(0, 1, {"weight": 1.0}), (0, 2, {"weight": 2.0}), (1, 2, {"weight": 0.5})]
-    )
-    return G
-
-
-def test_to_numpy_recarray_nodelist(recarray_nodelist_test_graph):
-    A = nx.to_numpy_recarray(recarray_nodelist_test_graph, nodelist=[0, 1])
-    np.testing.assert_array_equal(A.weight, np.array([[0, 1], [1, 0]]))
-
-
-@pytest.mark.parametrize(
-    ("nodelist", "errmsg"),
-    (([2, 3], "in nodelist is not in G"), ([1, 1], "nodelist contains duplicates")),
-)
-def test_to_numpy_recarray_bad_nodelist(recarray_nodelist_test_graph, nodelist, errmsg):
-    with pytest.raises(nx.NetworkXError, match=errmsg):
-        A = nx.to_numpy_recarray(recarray_nodelist_test_graph, nodelist=nodelist)
-
-
 @pytest.fixture
 def multigraph_test_graph():
     G = nx.MultiGraph()
diff --git a/networkx/tests/test_convert_scipy.py b/networkx/tests/test_convert_scipy.py
index 4b2537d..f1a79df 100644
--- a/networkx/tests/test_convert_scipy.py
+++ b/networkx/tests/test_convert_scipy.py
@@ -281,13 +281,3 @@ def test_from_scipy_sparse_array_formats(sparse_format):
     )
     A = sp.sparse.coo_array([[0, 3, 2], [3, 0, 1], [2, 1, 0]]).asformat(sparse_format)
     assert graphs_equal(expected, nx.from_scipy_sparse_array(A))
-
-
-# NOTE: remove when to/from_sparse_matrix deprecations expire
-def test_scipy_sparse_matrix_deprecations():
-    G = nx.path_graph(3)
-    msg = "\n\nThe scipy.sparse array containers will be used instead of matrices"
-    with pytest.warns(DeprecationWarning, match=msg):
-        M = nx.to_scipy_sparse_matrix(G)
-    with pytest.warns(DeprecationWarning, match=msg):
-        H = nx.from_scipy_sparse_matrix(M)
diff --git a/networkx/tests/test_relabel.py b/networkx/tests/test_relabel.py
index 9a86b38..0ebf4d3 100644
--- a/networkx/tests/test_relabel.py
+++ b/networkx/tests/test_relabel.py
@@ -106,12 +106,19 @@ class TestRelabel:
         H = nx.relabel_nodes(G, mapping)
         assert nodes_equal(H.nodes(), [65, 66, 67, 68])
 
-    def test_relabel_nodes_classes(self):
-        G = nx.empty_graph()
-        G.add_edges_from([(0, 1), (0, 2), (1, 2), (2, 3)])
+    def test_relabel_nodes_callable_type(self):
+        G = nx.path_graph(4)
         H = nx.relabel_nodes(G, str)
         assert nodes_equal(H.nodes, ["0", "1", "2", "3"])
 
+    @pytest.mark.parametrize("non_mc", ("0123", ["0", "1", "2", "3"]))
+    def test_relabel_nodes_non_mapping_or_callable(self, non_mc):
+        """If `mapping` is neither a Callable nor a Mapping, an exception
+        should be raised."""
+        G = nx.path_graph(4)
+        with pytest.raises(AttributeError):
+            nx.relabel_nodes(G, non_mc)
+
     def test_relabel_nodes_graph(self):
         G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")])
         mapping = {"A": "aardvark", "B": "bear", "C": "cat", "D": "dog"}
@@ -119,7 +126,7 @@ class TestRelabel:
         assert nodes_equal(H.nodes(), ["aardvark", "bear", "cat", "dog"])
 
     def test_relabel_nodes_orderedgraph(self):
-        G = nx.OrderedGraph()
+        G = nx.Graph()
         G.add_nodes_from([1, 2, 3])
         G.add_edges_from([(1, 3), (2, 3)])
         mapping = {1: "a", 2: "b", 3: "c"}
@@ -306,3 +313,35 @@ class TestRelabel:
         H = nx.relabel_nodes(G, mapping, copy=True)
         with pytest.raises(nx.NetworkXUnfeasible):
             H = nx.relabel_nodes(G, mapping, copy=False)
+
+    def test_relabel_preserve_node_order_full_mapping_with_copy_true(self):
+        G = nx.path_graph(3)
+        original_order = list(G.nodes())
+        mapping = {2: "a", 1: "b", 0: "c"}  # dictionary keys out of order on purpose
+        H = nx.relabel_nodes(G, mapping, copy=True)
+        new_order = list(H.nodes())
+        assert [mapping.get(i, i) for i in original_order] == new_order
+
+    def test_relabel_preserve_node_order_full_mapping_with_copy_false(self):
+        G = nx.path_graph(3)
+        original_order = list(G)
+        mapping = {2: "a", 1: "b", 0: "c"}  # dictionary keys out of order on purpose
+        H = nx.relabel_nodes(G, mapping, copy=False)
+        new_order = list(H)
+        assert [mapping.get(i, i) for i in original_order] == new_order
+
+    def test_relabel_preserve_node_order_partial_mapping_with_copy_true(self):
+        G = nx.path_graph(3)
+        original_order = list(G)
+        mapping = {1: "a", 0: "b"}  # partial mapping and keys out of order on purpose
+        H = nx.relabel_nodes(G, mapping, copy=True)
+        new_order = list(H)
+        assert [mapping.get(i, i) for i in original_order] == new_order
+
+    def test_relabel_preserve_node_order_partial_mapping_with_copy_false(self):
+        G = nx.path_graph(3)
+        original_order = list(G)
+        mapping = {1: "a", 0: "b"}  # partial mapping and keys out of order on purpose
+        H = nx.relabel_nodes(G, mapping, copy=False)
+        new_order = list(H)
+        assert [mapping.get(i, i) for i in original_order] != new_order
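The node-ordering tests above hinge on `_relabel_inplace` resolving overlapping old/new labels via a topological sort of the mapping. A stdlib-only sketch using `graphlib` illustrates the required processing order (an assumption for illustration; networkx builds its own `DiGraph` from `mapping.items()` instead):

```python
from graphlib import TopologicalSorter

mapping = {0: 1, 1: 2}  # overlap: label 1 is both an old and a new label

ts = TopologicalSorter()
for old, new in mapping.items():
    ts.add(new, old)  # dependency edge old -> new, as in nx.DiGraph(mapping.items())

# Relabel in reverse topological order so each target label is vacated first.
# A cyclic mapping (e.g. {0: 1, 1: 0}) raises CycleError in static_order(),
# matching the NetworkXUnfeasible case in _relabel_inplace.
order = [n for n in reversed(list(ts.static_order())) if n in mapping]
print(order)  # [1, 0]: move 1 -> 2 before 0 -> 1
```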
diff --git a/networkx/utils/__init__.py b/networkx/utils/__init__.py
index 9f168de..48f02c1 100644
--- a/networkx/utils/__init__.py
+++ b/networkx/utils/__init__.py
@@ -4,4 +4,3 @@ from networkx.utils.random_sequence import *
 from networkx.utils.union_find import *
 from networkx.utils.rcm import *
 from networkx.utils.heaps import *
-from networkx.utils.contextmanagers import *
diff --git a/networkx/utils/contextmanagers.py b/networkx/utils/contextmanagers.py
deleted file mode 100644
index dcd6b9c..0000000
--- a/networkx/utils/contextmanagers.py
+++ /dev/null
@@ -1,47 +0,0 @@
-import warnings
-from contextlib import contextmanager
-
-__all__ = ["reversed"]
-
-
-@contextmanager
-def reversed(G):
-    """A context manager for temporarily reversing a directed graph in place.
-
-    .. deprecated:: 2.6
-
-       This context manager is deprecated and will be removed in 3.0.
-       Use ``G.reverse(copy=False) if G.is_directed() else G`` instead.
-
-    This is a no-op for undirected graphs.
-
-    Parameters
-    ----------
-    G : graph
-        A NetworkX graph.
-
-    Warning
-    -------
-    The reversed context manager is deprecated in favor
-    of G.reverse(copy=False). The view allows multiple threads to use the
-    same graph without confusion while the context manager does not.
-    This context manager is scheduled to be removed in version 3.0.
-    """
-    msg = (
-        "context manager reversed is deprecated and to be removed in 3.0."
-        "Use G.reverse(copy=False) if G.is_directed() else G instead."
-    )
-    warnings.warn(msg, DeprecationWarning)
-
-    directed = G.is_directed()
-    if directed:
-        G._pred, G._succ = G._succ, G._pred
-        G._adj = G._succ
-
-    try:
-        yield
-    finally:
-        if directed:
-            # Reverse the reverse.
-            G._pred, G._succ = G._succ, G._pred
-            G._adj = G._succ
diff --git a/networkx/utils/decorators.py b/networkx/utils/decorators.py
index d72af39..6dfc9bf 100644
--- a/networkx/utils/decorators.py
+++ b/networkx/utils/decorators.py
@@ -16,8 +16,6 @@ __all__ = [
     "not_implemented_for",
     "open_file",
     "nodes_or_number",
-    "preserve_random_state",
-    "random_state",
     "np_random_state",
     "py_random_state",
     "argmap",
@@ -258,62 +256,6 @@ def nodes_or_number(which_args):
     return argmap(_nodes_or_number, *iter_wa)
 
 
-def preserve_random_state(func):
-    """Decorator to preserve the numpy.random state during a function.
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-
-    Parameters
-    ----------
-    func : function
-        function around which to preserve the random state.
-
-    Returns
-    -------
-    wrapper : function
-        Function which wraps the input function by saving the state before
-        calling the function and restoring the function afterward.
-
-    Examples
-    --------
-    Decorate functions like this::
-
-        @preserve_random_state
-        def do_random_stuff(x, y):
-            return x + y * numpy.random.random()
-
-    Notes
-    -----
-    If numpy.random is not importable, the state is not saved or restored.
-    """
-    import warnings
-
-    msg = "preserve_random_state is deprecated and will be removed in 3.0."
-    warnings.warn(msg, DeprecationWarning)
-
-    try:
-        import numpy as np
-
-        @contextmanager
-        def save_random_state():
-            state = np.random.get_state()
-            try:
-                yield
-            finally:
-                np.random.set_state(state)
-
-        def wrapper(*args, **kwargs):
-            with save_random_state():
-                np.random.seed(1234567890)
-                return func(*args, **kwargs)
-
-        wrapper.__name__ = func.__name__
-        return wrapper
-    except ImportError:
-        return func
-
-
 def np_random_state(random_state_argument):
     """Decorator to generate a `numpy.random.RandomState` instance.
 
@@ -358,27 +300,6 @@ def np_random_state(random_state_argument):
     return argmap(create_random_state, random_state_argument)
 
 
-def random_state(random_state_argument):
-    """Decorator to generate a `numpy.random.RandomState` instance.
-
-    .. deprecated:: 2.7
-
-       This function is a deprecated alias for `np_random_state` and will be
-       removed in version 3.0. Use np_random_state instead.
-    """
-    import warnings
-
-    warnings.warn(
-        (
-            "`random_state` is a deprecated alias for `np_random_state`\n"
-            "and will be removed in version 3.0. Use `np_random_state` instead."
-        ),
-        DeprecationWarning,
-        stacklevel=2,
-    )
-    return np_random_state(random_state_argument)
-
-
 def py_random_state(random_state_argument):
     """Decorator to generate a random.Random instance (or equiv).
 
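The deprecated `random_state` alias removed above simply forwarded to `np_random_state`; both follow the same pattern: a decorator that normalizes one argument (an int seed, `None`, or an existing RNG) into a ready-to-use generator before the wrapped function runs. The sketch below illustrates that pattern with the stdlib `random` module only; `py_random_state_sketch` and `sample_mean` are invented names for illustration, not networkx's actual implementation (which is built on `argmap`).

```python
import functools
import random

def py_random_state_sketch(arg_index):
    """Hypothetical stand-in for networkx's py_random_state: normalize the
    positional argument at `arg_index` into a random.Random instance.
    Simplified: assumes the seed argument is always passed positionally."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            args = list(args)
            seed = args[arg_index]
            if seed is None:
                rng = random.Random()       # fresh, OS-seeded generator
            elif isinstance(seed, random.Random):
                rng = seed                  # already an RNG: pass it through
            else:
                rng = random.Random(seed)   # int (or other hashable) seed
            args[arg_index] = rng
            return func(*args, **kwargs)
        return wrapper
    return decorator

@py_random_state_sketch(1)
def sample_mean(n, seed):
    # By the time the body runs, `seed` is a random.Random instance.
    return sum(seed.random() for _ in range(n)) / n
```

Because the decorator owns the normalization, every decorated function accepts `None`, an int, or an RNG uniformly, and the same int seed reproduces the same result.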
diff --git a/networkx/utils/mapped_queue.py b/networkx/utils/mapped_queue.py
index 0ff53a0..2535458 100644
--- a/networkx/utils/mapped_queue.py
+++ b/networkx/utils/mapped_queue.py
@@ -43,7 +43,12 @@ class _HeapElement:
             return self.priority < other
         # assume comparing to another _HeapElement
         if self.priority == other_priority:
-            return self.element < other.element
+            try:
+                return self.element < other.element
+            except TypeError as err:
+                raise TypeError(
+                    "Consider using a tuple, with a priority value that can be compared."
+                )
         return self.priority < other_priority
 
     def __gt__(self, other):
@@ -53,7 +58,12 @@ class _HeapElement:
             return self.priority > other
         # assume comparing to another _HeapElement
         if self.priority == other_priority:
-            return self.element < other.element
+            try:
+                return self.element > other.element
+            except TypeError as err:
+                raise TypeError(
+                    "Consider using a tuple, with a priority value that can be compared."
+                )
         return self.priority > other_priority
 
     def __eq__(self, other):
@@ -93,20 +103,28 @@ class MappedQueue:
     library. While MappedQueue is designed for maximum compatibility with
     heapq, it adds element removal, lookup, and priority update.
 
+    Parameters
+    ----------
+    data : dict or iterable
+
     Examples
     --------
 
-    A `MappedQueue` can be created empty or optionally given an array of
-    initial elements. Calling `push()` will add an element and calling `pop()`
-    will remove and return the smallest element.
+    A `MappedQueue` can be created empty, or optionally, given a dictionary
+    of initial elements and priorities.  The methods `push`, `pop`,
+    `remove`, and `update` operate on the queue.
 
-    >>> q = MappedQueue([916, 50, 4609, 493, 237])
-    >>> q.push(1310)
+    >>> colors_nm = {'red':665, 'blue': 470, 'green': 550}
+    >>> q = MappedQueue(colors_nm)
+    >>> q.remove('red')
+    >>> q.update('green', 'violet', 400)
+    >>> q.push('indigo', 425)
     True
-    >>> [q.pop() for i in range(len(q.heap))]
-    [50, 237, 493, 916, 1310, 4609]
+    >>> [q.pop().element for i in range(len(q.heap))]
+    ['violet', 'indigo', 'blue']
 
-    Elements can also be updated or removed from anywhere in the queue.
+    A `MappedQueue` can also be initialized with a list or other iterable. The priority is assumed
+    to be the sort order of the items in the list.
 
     >>> q = MappedQueue([916, 50, 4609, 493, 237])
     >>> q.remove(493)
@@ -114,6 +132,17 @@ class MappedQueue:
     >>> [q.pop() for i in range(len(q.heap))]
     [50, 916, 1117, 4609]
 
+    An exception is raised if the elements are not comparable.
+
+    >>> q = MappedQueue([100, 'a'])
+    Traceback (most recent call last):
+    ...
+    TypeError: '<' not supported between instances of 'int' and 'str'
+
+    To avoid the exception, use a dictionary to assign priorities to the elements.
+
+    >>> q = MappedQueue({100: 0, 'a': 1 })
+
     References
     ----------
     .. [1] Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2001).
@@ -122,9 +151,11 @@ class MappedQueue:
        Pearson Education.
     """
 
-    def __init__(self, data=[]):
+    def __init__(self, data=None):
         """Priority queue class with updatable priorities."""
-        if isinstance(data, dict):
+        if data is None:
+            self.heap = list()
+        elif isinstance(data, dict):
             self.heap = [_HeapElement(v, k) for k, v in data.items()]
         else:
             self.heap = list(data)
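Two things happen in the `mapped_queue.py` hunks above: the docstring now documents dict initialization plus `push`/`pop`/`remove`/`update`, and the `data=[]` default is replaced with `data=None` (a mutable default would be shared across every instance created without arguments). The sketch below is a minimal stand-in for that interface using the lazy-deletion recipe from the `heapq` documentation; `MiniMappedQueue` is an invented name and this is not networkx's actual implementation, which keeps the heap consistent eagerly via a position map.

```python
import heapq
import itertools

class MiniMappedQueue:
    """Minimal sketch of a priority queue with removal and priority update,
    in the spirit of networkx's MappedQueue (not its real implementation)."""
    _REMOVED = object()  # sentinel marking logically deleted heap entries

    def __init__(self, data=None):
        # `data=None` instead of `data=[]`: avoids the shared-mutable-default
        # pitfall that the diff above fixes in MappedQueue.__init__.
        self._heap = []
        self._entries = {}                # element -> [priority, count, elt]
        self._counter = itertools.count() # tie-breaker for equal priorities
        if isinstance(data, dict):
            for element, priority in data.items():
                self.push(element, priority)
        elif data is not None:
            for element in data:
                self.push(element, element)  # priority == item's sort order

    def push(self, element, priority):
        if element in self._entries:
            return False
        entry = [priority, next(self._counter), element]
        self._entries[element] = entry
        heapq.heappush(self._heap, entry)
        return True

    def remove(self, element):
        entry = self._entries.pop(element)
        entry[2] = self._REMOVED  # mark dead; skipped lazily on pop

    def update(self, element, new_element, priority):
        self.remove(element)
        self.push(new_element, priority)

    def pop(self):
        while self._heap:
            _priority, _count, element = heapq.heappop(self._heap)
            if element is not self._REMOVED:
                del self._entries[element]
                return element
        raise IndexError("pop from an empty queue")
```

Usage then mirrors the new doctest: initialize from `{'red': 665, 'blue': 470, 'green': 550}`, `remove('red')`, `update('green', 'violet', 400)`, `push('indigo', 425)`, and pops come out in ascending priority order. The counter tie-breaker also sidesteps the incomparable-elements `TypeError` that the `_HeapElement` hunks above surface for tied priorities.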
diff --git a/networkx/utils/misc.py b/networkx/utils/misc.py
index 51a527d..31189bf 100644
--- a/networkx/utils/misc.py
+++ b/networkx/utils/misc.py
@@ -21,24 +21,12 @@ from itertools import chain, tee
 import networkx as nx
 
 __all__ = [
-    "is_string_like",
-    "iterable",
-    "empty_generator",
     "flatten",
     "make_list_of_ints",
-    "is_list_of_ints",
-    "make_str",
-    "generate_unique_node",
-    "default_opener",
     "dict_to_numpy_array",
-    "dict_to_numpy_array1",
-    "dict_to_numpy_array2",
-    "is_iterator",
     "arbitrary_element",
-    "consume",
     "pairwise",
     "groups",
-    "to_tuple",
     "create_random_state",
     "create_py_random_state",
     "PythonRandomInterface",
@@ -53,51 +41,6 @@ __all__ = [
 # see G.add_nodes and others in Graph Class in networkx/base.py
 
 
-def is_string_like(obj):  # from John Hunter, types-free version
-    """Check if obj is string.
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-    """
-    msg = (
-        "is_string_like is deprecated and will be removed in 3.0."
-        "Use isinstance(obj, str) instead."
-    )
-    warnings.warn(msg, DeprecationWarning)
-    return isinstance(obj, str)
-
-
-def iterable(obj):
-    """Return True if obj is iterable with a well-defined len().
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-    """
-    msg = (
-        "iterable is deprecated and will be removed in 3.0."
-        "Use isinstance(obj, (collections.abc.Iterable, collections.abc.Sized)) instead."
-    )
-    warnings.warn(msg, DeprecationWarning)
-    if hasattr(obj, "__iter__"):
-        return True
-    try:
-        len(obj)
-    except:
-        return False
-    return True
-
-
-def empty_generator():
-    """Return a generator with no members.
-
-    .. deprecated:: 2.6
-    """
-    warnings.warn(
-        "empty_generator is deprecated and will be removed in v3.0.", DeprecationWarning
-    )
-    return (i for i in ())
-
-
 def flatten(obj, result=None):
     """Return flattened version of (possibly nested) iterable object."""
     if not isinstance(obj, (Iterable, Sized)) or isinstance(obj, str):
@@ -148,79 +91,6 @@ def make_list_of_ints(sequence):
     return sequence
 
 
-def is_list_of_ints(intlist):
-    """Return True if list is a list of ints.
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-    """
-    msg = (
-        "is_list_of_ints is deprecated and will be removed in 3.0."
-        "See also: ``networkx.utils.make_list_of_ints.``"
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    if not isinstance(intlist, list):
-        return False
-    for i in intlist:
-        if not isinstance(i, int):
-            return False
-    return True
-
-
-def make_str(x):
-    """Returns the string representation of t.
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-    """
-    msg = "make_str is deprecated and will be removed in 3.0. Use str instead."
-    warnings.warn(msg, DeprecationWarning)
-    return str(x)
-
-
-def generate_unique_node():
-    """Generate a unique node label.
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-    """
-    msg = "generate_unique_node is deprecated and will be removed in 3.0. Use uuid.uuid4 instead."
-    warnings.warn(msg, DeprecationWarning)
-    return str(uuid.uuid4())
-
-
-def default_opener(filename):
-    """Opens `filename` using system's default program.
-
-    .. deprecated:: 2.6
-       default_opener is deprecated and will be removed in version 3.0.
-       Consider an image processing library to open images, such as Pillow::
-
-           from PIL import Image
-           Image.open(filename).show()
-
-    Parameters
-    ----------
-    filename : str
-        The path of the file to be opened.
-
-    """
-    warnings.warn(
-        "default_opener is deprecated and will be removed in version 3.0. ",
-        DeprecationWarning,
-    )
-    from subprocess import call
-
-    cmds = {
-        "darwin": ["open"],
-        "linux": ["xdg-open"],
-        "linux2": ["xdg-open"],
-        "win32": ["cmd.exe", "/C", "start", ""],
-    }
-    cmd = cmds[sys.platform] + [filename]
-    call(cmd)
-
-
 def dict_to_numpy_array(d, mapping=None):
     """Convert a dictionary of dictionaries to a numpy array
     with optional mapping."""
@@ -232,23 +102,6 @@ def dict_to_numpy_array(d, mapping=None):
         return _dict_to_numpy_array1(d, mapping)
 
 
-def dict_to_numpy_array2(d, mapping=None):
-    """Convert a dict of dicts to a 2d numpy array with optional mapping.
-
-    .. deprecated:: 2.8
-
-       dict_to_numpy_array2 is deprecated and will be removed in networkx 3.0.
-       Use `dict_to_numpy_array` instead.
-    """
-    msg = (
-        "dict_to_numpy_array2 is deprecated and will be removed in networkx 3.0.\n"
-        "Use dict_to_numpy_array instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-    return _dict_to_numpy_array2(d, mapping)
-
-
 def _dict_to_numpy_array2(d, mapping=None):
     """Convert a dictionary of dictionaries to a 2d numpy array
     with optional mapping.
@@ -272,23 +125,6 @@ def _dict_to_numpy_array2(d, mapping=None):
     return a
 
 
-def dict_to_numpy_array1(d, mapping=None):
-    """Convert a dict of numbers to a 1d numpy array with optional mapping.
-
-    .. deprecated:: 2.8
-
-       dict_to_numpy_array1 is deprecated and will be removed in networkx 3.0.
-       Use dict_to_numpy_array instead.
-    """
-    msg = (
-        "dict_to_numpy_array1 is deprecated and will be removed in networkx 3.0.\n"
-        "Use dict_to_numpy_array instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-
-    return _dict_to_numpy_array1(d, mapping)
-
-
 def _dict_to_numpy_array1(d, mapping=None):
     """Convert a dictionary of numbers to a 1d numpy array with optional mapping."""
     import numpy as np
@@ -304,21 +140,6 @@ def _dict_to_numpy_array1(d, mapping=None):
     return a
 
 
-def is_iterator(obj):
-    """Returns True if and only if the given object is an iterator object.
-
-    .. deprecated:: 2.6.0
-        Deprecated in favor of ``isinstance(obj, collections.abc.Iterator)``
-    """
-    msg = (
-        "is_iterator is deprecated and will be removed in version 3.0. "
-        "Use ``isinstance(obj, collections.abc.Iterator)`` instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    has_next_attr = hasattr(obj, "__next__") or hasattr(obj, "next")
-    return iter(obj) is obj and has_next_attr
-
-
 def arbitrary_element(iterable):
     """Returns an arbitrary element of `iterable` without removing it.
 
@@ -388,22 +209,6 @@ def arbitrary_element(iterable):
     return next(iter(iterable))
 
 
-# Recipe from the itertools documentation.
-def consume(iterator):
-    """Consume the iterator entirely.
-
-    .. deprecated:: 2.6
-        This is deprecated and will be removed in NetworkX v3.0.
-    """
-    # Feed the entire iterator into a zero-length deque.
-    msg = (
-        "consume is deprecated and will be removed in version 3.0. "
-        "Use ``collections.deque(iterator, maxlen=0)`` instead."
-    )
-    warnings.warn(msg, DeprecationWarning, stacklevel=2)
-    deque(iterator, maxlen=0)
-
-
 # Recipe from the itertools documentation.
 def pairwise(iterable, cyclic=False):
     "s -> (s0, s1), (s1, s2), (s2, s3), ..."
@@ -436,31 +241,6 @@ def groups(many_to_one):
     return dict(one_to_many)
 
 
-def to_tuple(x):
-    """Converts lists to tuples.
-
-    .. deprecated:: 2.8
-
-       to_tuple is deprecated and will be removed in NetworkX 3.0.
-
-    Examples
-    --------
-    >>> from networkx.utils import to_tuple
-    >>> a_list = [1, 2, [1, 4]]
-    >>> to_tuple(a_list)
-    (1, 2, (1, 4))
-    """
-    warnings.warn(
-        "to_tuple is deprecated and will be removed in NetworkX 3.0.",
-        DeprecationWarning,
-        stacklevel=2,
-    )
-
-    if not isinstance(x, (tuple, list)):
-        return x
-    return tuple(map(to_tuple, x))
-
-
 def create_random_state(random_state=None):
     """Returns a numpy.random.RandomState or numpy.random.Generator instance
     depending on input.
diff --git a/networkx/utils/tests/test_contextmanager.py b/networkx/utils/tests/test_contextmanager.py
deleted file mode 100644
index 6924683..0000000
--- a/networkx/utils/tests/test_contextmanager.py
+++ /dev/null
@@ -1,18 +0,0 @@
-import networkx as nx
-
-
-def test_reversed():
-    G = nx.DiGraph()
-    G.add_edge("A", "B")
-
-    # no exception
-    with nx.utils.reversed(G):
-        pass
-    assert "B" in G["A"]
-
-    # exception
-    try:
-        with nx.utils.reversed(G):
-            raise Exception
-    except:
-        assert "B" in G["A"]
diff --git a/networkx/utils/tests/test_decorators.py b/networkx/utils/tests/test_decorators.py
index 93f22be..b89426e 100644
--- a/networkx/utils/tests/test_decorators.py
+++ b/networkx/utils/tests/test_decorators.py
@@ -11,9 +11,7 @@ from networkx.utils.decorators import (
     not_implemented_for,
     np_random_state,
     open_file,
-    preserve_random_state,
     py_random_state,
-    random_state,
 )
 from networkx.utils.misc import PythonRandomInterface
 
@@ -206,17 +204,6 @@ class TestOpenFileDecorator:
         self.writer_kwarg(path=None)
 
 
-@preserve_random_state
-def test_preserve_random_state():
-    try:
-        import numpy.random
-
-        r = numpy.random.random()
-    except ImportError:
-        return
-    assert abs(r - 0.61879477158568) < 1e-16
-
-
 class TestRandomState:
     @classmethod
     def setup_class(cls):
diff --git a/networkx/utils/tests/test_mapped_queue.py b/networkx/utils/tests/test_mapped_queue.py
index 3570ad2..ca9b7e4 100644
--- a/networkx/utils/tests/test_mapped_queue.py
+++ b/networkx/utils/tests/test_mapped_queue.py
@@ -12,6 +12,13 @@ def test_HeapElement_gtlt():
     assert 1 < bar
 
 
+def test_HeapElement_gtlt_tied_priority():
+    bar = _HeapElement(1, "a")
+    foo = _HeapElement(1, "b")
+    assert foo > bar
+    assert bar < foo
+
+
 def test_HeapElement_eq():
     bar = _HeapElement(1.1, "a")
     foo = _HeapElement(1, "a")
@@ -63,6 +70,10 @@ class TestMappedQueue:
         q = MappedQueue(h)
         self._check_map(q)
 
+    def test_incomparable(self):
+        h = [5, 4, "a", 2, 1, 0]
+        pytest.raises(TypeError, MappedQueue, h)
+
     def test_len(self):
         h = [5, 4, 3, 2, 1, 0]
         q = MappedQueue(h)
@@ -197,6 +208,30 @@ class TestMappedDict(TestMappedQueue):
         priority_dict = {elt: elt for elt in h}
         return MappedQueue(priority_dict)
 
+    def test_init(self):
+        d = {5: 0, 4: 1, "a": 2, 2: 3, 1: 4}
+        q = MappedQueue(d)
+        assert q.position == d
+
+    def test_ties(self):
+        d = {5: 0, 4: 1, 3: 2, 2: 3, 1: 4}
+        q = MappedQueue(d)
+        assert q.position == {elt: pos for pos, elt in enumerate(q.heap)}
+
+    def test_pop(self):
+        d = {5: 0, 4: 1, 3: 2, 2: 3, 1: 4}
+        q = MappedQueue(d)
+        assert q.pop() == _HeapElement(0, 5)
+        assert q.position == {elt: pos for pos, elt in enumerate(q.heap)}
+
+    def test_empty_pop(self):
+        q = MappedQueue()
+        pytest.raises(IndexError, q.pop)
+
+    def test_incomparable_ties(self):
+        d = {5: 0, 4: 0, "a": 0, 2: 0, 1: 0}
+        pytest.raises(TypeError, MappedQueue, d)
+
     def test_push(self):
         to_push = [6, 1, 4, 3, 2, 5, 0]
         h_sifted = [0, 2, 1, 6, 3, 5, 4]
diff --git a/networkx/utils/tests/test_misc.py b/networkx/utils/tests/test_misc.py
index b886852..18d2878 100644
--- a/networkx/utils/tests/test_misc.py
+++ b/networkx/utils/tests/test_misc.py
@@ -13,13 +13,9 @@ from networkx.utils import (
     discrete_sequence,
     flatten,
     groups,
-    is_string_like,
-    iterable,
     make_list_of_ints,
-    make_str,
     pairwise,
     powerlaw_sequence,
-    to_tuple,
 )
 from networkx.utils.misc import _dict_to_numpy_array1, _dict_to_numpy_array2
 
@@ -62,28 +58,6 @@ def test_flatten(nested, result):
     assert issubclass(type(val), tuple)
 
 
-def test_is_string_like():
-    assert is_string_like("aaaa")
-    assert not is_string_like(None)
-    assert not is_string_like(123)
-
-
-def test_iterable():
-    assert not iterable(None)
-    assert not iterable(10)
-    assert iterable([1, 2, 3])
-    assert iterable((1, 2, 3))
-    assert iterable({1: "A", 2: "X"})
-    assert iterable("ABC")
-
-
-def test_graph_iterable():
-    K = nx.complete_graph(10)
-    assert iterable(K)
-    assert iterable(K.nodes())
-    assert iterable(K.edges())
-
-
 def test_make_list_of_ints():
     mylist = [1, 2, 3.0, 42, -2]
     assert make_list_of_ints(mylist) is mylist
@@ -99,20 +73,6 @@ def test_random_number_distribution():
     z = discrete_sequence(20, distribution=[0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 3])
 
 
-def test_make_str_with_bytes():
-    x = "qualité"
-    y = make_str(x)
-    assert isinstance(y, str)
-    assert len(y) == 7
-
-
-def test_make_str_with_unicode():
-    x = "qualité"
-    y = make_str(x)
-    assert isinstance(y, str)
-    assert len(y) == 7
-
-
 class TestNumpyArray:
     @classmethod
     def setup_class(cls):
@@ -195,23 +155,6 @@ def test_groups():
     assert {} == groups({})
 
 
-def test_to_tuple():
-    a_list = [1, 2, [1, 3]]
-    actual = to_tuple(a_list)
-    expected = (1, 2, (1, 3))
-    assert actual == expected
-
-    a_tuple = (1, 2)
-    actual = to_tuple(a_tuple)
-    expected = a_tuple
-    assert actual == expected
-
-    a_mix = (1, 2, [1, 3])
-    actual = to_tuple(a_mix)
-    expected = (1, 2, (1, 3))
-    assert actual == expected
-
-
 def test_create_random_state():
     np = pytest.importorskip("numpy")
     rs = np.random.RandomState
@@ -310,13 +253,3 @@ def test_arbitrary_element_raises(iterator):
     """Value error is raised when input is an iterator."""
     with pytest.raises(ValueError, match="from an iterator"):
         arbitrary_element(iterator)
-
-
-def test_dict_to_numpy_array_deprecations():
-    np = pytest.importorskip("numpy")
-    d = {"a": 1}
-    with pytest.deprecated_call():
-        nx.utils.dict_to_numpy_array1(d)
-    d2 = {"a": {"b": 2}}
-    with pytest.deprecated_call():
-        nx.utils.dict_to_numpy_array2(d2)
diff --git a/networkx/utils/union_find.py b/networkx/utils/union_find.py
index 27596c9..426ed2d 100644
--- a/networkx/utils/union_find.py
+++ b/networkx/utils/union_find.py
@@ -53,11 +53,12 @@ class UnionFind:
             return object
 
         # find path of objects leading to the root
-        path = [object]
+        path = []
         root = self.parents[object]
-        while root != path[-1]:
-            path.append(root)
-            root = self.parents[root]
+        while root != object:
+            path.append(object)
+            object = root
+            root = self.parents[object]
 
         # compress the path and return
         for ancestor in path:
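The `union_find.py` hunk above rewrites `find`'s first loop so the path is collected by walking parent pointers until the root is reached, then every node on that path is reparented directly to the root (path compression). A minimal self-contained sketch of that same two-pass structure, under invented names (`UnionFindSketch` is for illustration only, not networkx's full `UnionFind`, which also tracks weights):

```python
class UnionFindSketch:
    """Minimal union-find with the same two-pass find as the patched code:
    walk to the root, then point every visited node directly at it."""

    def __init__(self):
        self.parents = {}

    def find(self, obj):
        if obj not in self.parents:
            self.parents[obj] = obj  # new singleton set
            return obj
        # Pass 1: collect the path from obj up to (not including) the root.
        path = []
        root = self.parents[obj]
        while root != obj:
            path.append(obj)
            obj = root
            root = self.parents[obj]
        # Pass 2: path compression -- reparent the whole path to the root,
        # so later finds on these nodes take one step.
        for node in path:
            self.parents[node] = root
        return root

    def union(self, a, b):
        root_a, root_b = self.find(a), self.find(b)
        if root_a != root_b:
            self.parents[root_b] = root_a
```

After `union(1, 2)` and `union(2, 3)`, a subsequent `find(3)` both returns the shared root and leaves `parents[3]` pointing straight at it.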
diff --git a/requirements/default.txt b/requirements/default.txt
index 153e8b5..add8758 100644
--- a/requirements/default.txt
+++ b/requirements/default.txt
@@ -1,4 +1,4 @@
-numpy>=1.19
+numpy>=1.20
 scipy>=1.8
 matplotlib>=3.4
 pandas>=1.3
diff --git a/requirements/developer.txt b/requirements/developer.txt
index 068b450..cccc39c 100644
--- a/requirements/developer.txt
+++ b/requirements/developer.txt
@@ -1,2 +1,2 @@
-pre-commit>=2.20
-mypy>=0.982
+pre-commit>=2.21
+mypy>=0.991
diff --git a/requirements/doc.txt b/requirements/doc.txt
index 7cc1e78..7721db0 100644
--- a/requirements/doc.txt
+++ b/requirements/doc.txt
@@ -1,7 +1,7 @@
-sphinx>=5.2
+sphinx==5.2.3
 pydata-sphinx-theme>=0.11
 sphinx-gallery>=0.11
 numpydoc>=1.5
 pillow>=9.2
 nb2plots>=0.6
-texext>=0.6.6
+texext>=0.6.7
diff --git a/requirements/example.txt b/requirements/example.txt
index a256fbd..9350879 100644
--- a/requirements/example.txt
+++ b/requirements/example.txt
@@ -1,6 +1,6 @@
-osmnx>=1.1
+osmnx>=1.2
 momepy>=0.5
 contextily>=1.2
-seaborn>=0.11
-cairocffi>=1.3
-igraph>=0.9.8
+seaborn>=0.12
+cairocffi>=1.4
+igraph>=0.10
diff --git a/requirements/extra.txt b/requirements/extra.txt
index 75db002..081f2b3 100644
--- a/requirements/extra.txt
+++ b/requirements/extra.txt
@@ -1,4 +1,4 @@
 lxml>=4.6
-pygraphviz>=1.9
+pygraphviz>=1.10
 pydot>=1.4.2
 sympy>=1.10
diff --git a/requirements/release.txt b/requirements/release.txt
index 465bec5..384b780 100644
--- a/requirements/release.txt
+++ b/requirements/release.txt
@@ -1,3 +1,3 @@
-build>=0.8
+build>=0.9
 twine>=4.0
-wheel>=0.37
+wheel>=0.38
diff --git a/setup.cfg b/setup.cfg
new file mode 100644
index 0000000..8bfd5a1
--- /dev/null
+++ b/setup.cfg
@@ -0,0 +1,4 @@
+[egg_info]
+tag_build = 
+tag_date = 0
+
diff --git a/setup.py b/setup.py
index e201ec3..f698715 100644
--- a/setup.py
+++ b/setup.py
@@ -68,7 +68,6 @@ packages = [
     "networkx.algorithms",
     "networkx.algorithms.assortativity",
     "networkx.algorithms.bipartite",
-    "networkx.algorithms.node_classification",
     "networkx.algorithms.centrality",
     "networkx.algorithms.community",
     "networkx.algorithms.components",
@@ -90,7 +89,6 @@ packages = [
     "networkx.readwrite",
     "networkx.readwrite.json_graph",
     "networkx.tests",
-    "networkx.testing",
     "networkx.utils",
 ]
 
@@ -124,13 +122,12 @@ dd = os.path.join(docdirbase, "examples", "javascript/force")
 pp = os.path.join("examples", "javascript/force")
 data.append((dd, glob(os.path.join(pp, "*"))))
 
-# add the tests
+# add the tests subpackage(s)
 package_data = {
     "networkx": ["tests/*.py"],
     "networkx.algorithms": ["tests/*.py"],
     "networkx.algorithms.assortativity": ["tests/*.py"],
     "networkx.algorithms.bipartite": ["tests/*.py"],
-    "networkx.algorithms.node_classification": ["tests/*.py"],
     "networkx.algorithms.centrality": ["tests/*.py"],
     "networkx.algorithms.community": ["tests/*.py"],
     "networkx.algorithms.components": ["tests/*.py"],
@@ -151,7 +148,6 @@ package_data = {
     "networkx.linalg": ["tests/*.py"],
     "networkx.readwrite": ["tests/*.py"],
     "networkx.readwrite.json_graph": ["tests/*.py"],
-    "networkx.testing": ["tests/*.py"],
     "networkx.utils": ["tests/*.py"],
 }
 
diff --git a/tools/team_list.py b/tools/team_list.py
deleted file mode 100644
index ce2c271..0000000
--- a/tools/team_list.py
+++ /dev/null
@@ -1,85 +0,0 @@
-import os
-import sys
-import requests
-
-
-project = "networkx"
-core = "core-developers"
-emeritus = "emeritus-developers"
-core_url = f"https://api.github.com/orgs/{project}/teams/{core}/members"
-emeritus_url = f"https://api.github.com/orgs/{project}/teams/{emeritus}/members"
-
-
-token = os.environ.get("GH_TOKEN", None)
-if token is None:
-    print(
-        "No token found.  Please export a GH_TOKEN with permissions "
-        "to read team members."
-    )
-    sys.exit(-1)
-
-
-def api(url):
-    json = requests.get(url=url, headers={"Authorization": f"token {token}"}).json()
-    if "message" in json and json["message"] == "Bad credentials":
-        raise RuntimeError("Invalid token provided")
-    else:
-        return json
-
-
-resp = api(core_url)
-core = sorted(resp, key=lambda user: user["login"].lower())
-
-resp = api(emeritus_url)
-emeritus = sorted(resp, key=lambda user: user["login"].lower())
-
-
-def render_team(team):
-    for member in team:
-        profile = api(member["url"])
-
-        print(
-            f"""
-.. raw:: html
-
-   <div class="team-member">
-     <a href="https://github.com/{member['login']}" class="team-member-name">
-        <div class="team-member-photo">
-           <img
-             src="{member['avatar_url']}&s=40"
-             loading="lazy"
-             alt="Avatar picture of @{profile['login']}"
-           />
-        </div>
-        {profile['name'] if profile['name'] else '@' + profile['login']}
-     </a>
-     <div class="team-member-handle">@{member['login']}</div>
-   </div>
-"""
-        )
-
-
-print(
-    """
-Core Developers
----------------
-
-NetworkX development is guided by the following core team:
-
-"""
-)
-
-render_team(core)
-
-print(
-    """
-
-Emeritus Developers
--------------------
-
-We thank these previously-active core developers for their contributions to NetworkX.
-
-"""
-)
-
-render_team(emeritus)
