New Upstream Snapshot - dials-data

Ready changes

Summary

Merged new upstream version: 2.4.0+git20221116.1.d299025 (was: 2.4.0+git20221116.0.d299025).

Resulting package

Built on 2023-01-22T11:05 (took 25m40s)

The resulting binary packages can be installed (if you have the apt repository enabled) by running one of:

apt install -t fresh-snapshots python3-dials-data
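
Once installed, the dials.data console script shipped by python3-dials-data can serve as a quick sanity check of the snapshot. This is a minimal sketch, not part of the build report, assuming the fresh-snapshots repository is enabled and the dataset hosts are reachable (x4wide is one of the dataset definitions bundled with the package, and the list/get subcommands appear in the scripts further down in this diff):

dials.data list
dials.data get -q x4wide

The first command should print the bundled dataset definitions; the second fetches the x4wide dataset and, with the cli.py change included in this diff, exits with an error message if the download fails.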

Lintian Result

Diff

diff --git a/.azure-pipelines/azure-pipelines.yml b/.azure-pipelines/azure-pipelines.yml
deleted file mode 100644
index dc03351..0000000
--- a/.azure-pipelines/azure-pipelines.yml
+++ /dev/null
@@ -1,283 +0,0 @@
-stages:
-- stage: static
-  displayName: Static Analysis
-  jobs:
-  - job: checks
-    displayName: static code analysis
-    pool:
-      vmImage: ubuntu-20.04
-    steps:
-      # Run syntax validation using oldest and latest Python
-      - task: UsePythonVersion@0
-        displayName: Set up python
-        inputs:
-          versionSpec: 3.7
-
-      - bash: python .azure-pipelines/syntax-validation.py
-        displayName: Syntax validation (3.7)
-
-      - task: UsePythonVersion@0
-        displayName: Set up python
-        inputs:
-          versionSpec: 3.10
-
-      - bash: python .azure-pipelines/syntax-validation.py
-        displayName: Syntax validation (3.10)
-
-      - bash: |
-          set -eux
-          pip install --disable-pip-version-check flake8 flake8-comprehensions
-          python .azure-pipelines/flake8-validation.py
-        displayName: Flake8 validation
-
-      - bash: |
-          set -eux
-          # install versions matching the ones in the corresponding pre-commit hook
-          pip install --disable-pip-version-check mypy==0.931 types-PyYAML==6.0.4 types-requests==2.27.11
-          mypy --no-strict-optional dials_data/
-        displayName: Type checking
-
-      # Set up constants for further build steps
-      - bash: |
-          echo "##vso[task.setvariable variable=BUILD_REPOSITORY_NAME;isOutput=true]${BUILD_REPOSITORY_NAME}"
-          echo "##vso[task.setvariable variable=BUILD_SOURCEBRANCH;isOutput=true]${BUILD_SOURCEBRANCH}"
-          diff -q <(git rev-parse HEAD~0^{tree}) <(git rev-parse HEAD~1^{tree}) >/dev/null && {
-            # Current git commit and parent commit have 100% identical files
-            echo "##vso[task.setvariable variable=AUTOMATED_COMMIT;isOutput=true]true"
-          } || {
-            echo "##vso[task.setvariable variable=AUTOMATED_COMMIT;isOutput=true]false"
-          }
-        displayName: Set up build constants
-        name: constants
-
-      - bash: |
-          echo BUILD_REPOSITORY_NAME = $(constants.BUILD_REPOSITORY_NAME)
-          echo BUILD_SOURCEBRANCH = $(constants.BUILD_SOURCEBRANCH)
-          echo AUTOMATED_COMMIT = $(constants.AUTOMATED_COMMIT)
-
-          if [ "$(constants.BUILD_REPOSITORY_NAME)" == "dials/data" ] && \
-             [ "$(constants.AUTOMATED_COMMIT)" == "true" ] && \
-             [[ "$(constants.BUILD_SOURCEBRANCH)" == refs/pull/* ]]; then
-            echo "##[error]Intentionally marking the initial commit of the automated pull request as failed."
-            echo "The pull request requires a second, meaningful commit, which will run normally."
-            exit 1
-          fi
-        displayName: Verify constants
-
-- stage: build
-  displayName: Build
-  dependsOn:
-  - static
-  condition: and(succeeded(),
-                 not(or(and(eq(dependencies.static.outputs['checks.constants.AUTOMATED_COMMIT'], 'true'),
-                            startsWith(dependencies.static.outputs['checks.constants.BUILD_SOURCEBRANCH'], 'refs/pull/')),
-                        startsWith(dependencies.static.outputs['checks.constants.BUILD_SOURCEBRANCH'], 'refs/heads/update-'))))
-             # Skip entire build step on pointless pull request and hash update builds
-  jobs:
-  - job: build
-    displayName: build package
-    pool:
-      vmImage: ubuntu-20.04
-    steps:
-      - task: UsePythonVersion@0
-        displayName: Set up python
-        inputs:
-          versionSpec: 3.9
-
-      - bash: |
-          pip install --disable-pip-version-check -r requirements_dev.txt
-        displayName: Install dependencies
-
-      - bash: .azure-pipelines/update-package-version
-        displayName: Set package version
-
-      - bash: |
-          set -ex
-          python setup.py sdist bdist_wheel
-          mkdir -p dist/pypi
-          shopt -s extglob
-          mv -v dist/!(pypi) dist/pypi
-          git archive HEAD | gzip > dist/repo-source.tar.gz
-          ls -laR dist
-        displayName: Build python package
-
-      - task: PublishBuildArtifacts@1
-        displayName: Store artifact
-        inputs:
-          pathToPublish: dist/
-          artifactName: package
-
-      - bash: |
-          twine check dist/pypi/*
-          python setup.py checkdocs
-        displayName: Check package description
-
-- stage: update
-  displayName: Update hashes
-  dependsOn:
-  - static
-  condition: and(succeeded(),
-                 eq(dependencies.static.outputs['checks.constants.BUILD_REPOSITORY_NAME'], 'dials/data'),
-                 startsWith(dependencies.static.outputs['checks.constants.BUILD_SOURCEBRANCH'], 'refs/heads/update-'),
-                 eq(dependencies.static.outputs['checks.constants.AUTOMATED_COMMIT'], 'true'))
-             # Only run this stage on automated commits on 'update-...' branches on the main repository
-  jobs:
-  - job: update
-    displayName: Update hash information
-    pool:
-      vmImage: ubuntu-20.04
-    steps:
-      - task: UsePythonVersion@0
-        displayName: Set up python
-        inputs:
-          versionSpec: 3.9
-
-      - script: |
-          set -eux
-          pip install --disable-pip-version-check -r requirements.txt
-          pip install --disable-pip-version-check -e .
-        displayName: Install package
-
-      - bash: .azure-pipelines/update-hashinfo
-        displayName: Update hash information
-        env:
-          GITHUB_TOKEN: $(GITHUB_TOKEN)
-
-      - bash: rm .git/credentials
-        displayName: Remove git credentials
-        condition: always()
-
-- stage: tests
-  displayName: Run unit tests
-  dependsOn:
-  - static
-  - build
-  condition: and(succeeded(),
-                 not(and(eq(dependencies.static.outputs['checks.constants.BUILD_REPOSITORY_NAME'], 'dials/data'),
-                         startsWith(dependencies.static.outputs['checks.constants.BUILD_SOURCEBRANCH'], 'refs/heads/update-'))))
-             # Do not need to bother running tests on automated 'update-...' branches
-  jobs:
-  - job: linux
-    pool:
-      vmImage: ubuntu-20.04
-    strategy:
-      matrix:
-        python37:
-          PYTHON_VERSION: 3.7
-        python38:
-          PYTHON_VERSION: 3.8
-        python39:
-          PYTHON_VERSION: 3.9
-        python310:
-          PYTHON_VERSION: 3.10
-    steps:
-    - template: ci.yml
-  - job: macOS
-    pool:
-      vmImage: macOS-latest
-    strategy:
-      matrix:
-        python37:
-          PYTHON_VERSION: 3.7
-        python38:
-          PYTHON_VERSION: 3.8
-        python39:
-          PYTHON_VERSION: 3.9
-        python310:
-          PYTHON_VERSION: 3.10
-    steps:
-    - template: ci.yml
-  - job: windows
-    pool:
-      vmImage: windows-latest
-    strategy:
-      matrix:
-        python37:
-          PYTHON_VERSION: 3.7
-        python38:
-          PYTHON_VERSION: 3.8
-        python39:
-          PYTHON_VERSION: 3.9
-        python310:
-          PYTHON_VERSION: 3.10
-    steps:
-    - template: ci-windows.yml
-
-- stage: deploy
-  displayName: Publish release
-  dependsOn:
-  - static
-  - tests
-  condition: and(succeeded(),
-                 eq(dependencies.static.outputs['checks.constants.BUILD_REPOSITORY_NAME'], 'dials/data'),
-                 eq(dependencies.static.outputs['checks.constants.BUILD_SOURCEBRANCH'], 'refs/heads/master'))
-  jobs:
-  - job: pypi
-    displayName: Publish pypi release
-    pool:
-      vmImage: ubuntu-20.04
-    steps:
-      - checkout: none
-
-      - task: UsePythonVersion@0
-        displayName: Set up python
-        inputs:
-          versionSpec: 3.9
-
-      - task: DownloadBuildArtifacts@0
-        displayName: Get pre-built package
-        inputs:
-          buildType: 'current'
-          downloadType: 'single'
-          artifactName: 'package'
-          downloadPath: '$(System.ArtifactsDirectory)'
-
-      - script: |
-          pip install --disable-pip-version-check twine
-        displayName: Install twine
-
-      - task: TwineAuthenticate@1
-        displayName: Set up credentials
-        inputs:
-          pythonUploadServiceConnection: pypi-dials-data
-
-      - bash: |
-          python -m twine upload -r pypi-dials-data --config-file $(PYPIRC_PATH) $(System.ArtifactsDirectory)/package/pypi/*.tar.gz $(System.ArtifactsDirectory)/package/pypi/*.whl
-        displayName: Publish package
-
-- stage: hashinfo
-  displayName: Check missing hashes
-  dependsOn:
-  - static
-  - build
-  - tests
-  condition: and(succeeded(),
-                 eq(dependencies.static.outputs['checks.constants.BUILD_SOURCEBRANCH'], 'refs/heads/master'),
-                 eq(dependencies.static.outputs['checks.constants.BUILD_REPOSITORY_NAME'], 'dials/data'))
-             # only run this job in the main branch of the main repository
-  jobs:
-  - job: hashinfo
-    displayName: Check hashes
-    pool:
-      vmImage: ubuntu-20.04
-    steps:
-      - task: UsePythonVersion@0
-        displayName: Set up python
-        inputs:
-          versionSpec: 3.9
-
-      - script: |
-          set -eux
-          pip install --disable-pip-version-check -r requirements.txt
-          pip install --disable-pip-version-check -e .
-        displayName: Install package
-
-      - bash: .azure-pipelines/create-hashinfo-pull-requests
-        displayName: Create pull requests
-        timeoutInMinutes: 5
-        env:
-          GITHUB_TOKEN: $(GITHUB_TOKEN)
-
-      - bash: rm .git/credentials
-        displayName: Remove git credentials
-        condition: always()
diff --git a/.azure-pipelines/ci-windows.yml b/.azure-pipelines/ci-windows.yml
deleted file mode 100644
index 5ee87ed..0000000
--- a/.azure-pipelines/ci-windows.yml
+++ /dev/null
@@ -1,45 +0,0 @@
-steps:
-- checkout: none
-
-- task: UsePythonVersion@0
-  inputs:
-    versionSpec: '$(PYTHON_VERSION)'
-  displayName: 'Use Python $(PYTHON_VERSION)'
-
-- task: DownloadBuildArtifacts@0
-  displayName: Get pre-built package
-  inputs:
-    buildType: 'current'
-    downloadType: 'single'
-    artifactName: 'package'
-    downloadPath: '$(System.ArtifactsDirectory)'
-
-- task: ExtractFiles@1
-  displayName: Checkout sources
-  inputs:
-    archiveFilePatterns: "$(System.ArtifactsDirectory)/package/repo-source.tar.gz"
-    destinationFolder: "$(Pipeline.Workspace)/src"
-
-- script: |
-    pip install --disable-pip-version-check -r $(Pipeline.Workspace)\src\requirements_dev.txt
-  displayName: Install dependencies
-
-- powershell: |
-    pip install --disable-pip-version-check $(dir $(System.ArtifactsDirectory)/package/pypi/*.whl | % {$_.FullName})
-  displayName: Install package
-
-- script: |
-    set PYTHONDEVMODE=1
-    coverage run -m pytest -ra
-  displayName: Run tests
-  workingDirectory: $(Pipeline.Workspace)/src
-
-- script: coverage xml
-  displayName: Export coverage statistics
-  workingDirectory: $(Pipeline.Workspace)/src
-
-- bash: bash <(curl -s https://codecov.io/bash) -t $(CODECOV_TOKEN) -n "Python $(PYTHON_VERSION) $(Agent.OS)"
-  displayName: Publish coverage stats
-  continueOnError: True
-  workingDirectory: $(Pipeline.Workspace)/src
-  timeoutInMinutes: 2
diff --git a/.azure-pipelines/ci.yml b/.azure-pipelines/ci.yml
deleted file mode 100644
index 83c5ba3..0000000
--- a/.azure-pipelines/ci.yml
+++ /dev/null
@@ -1,45 +0,0 @@
-steps:
-- checkout: none
-
-- task: UsePythonVersion@0
-  inputs:
-    versionSpec: '$(PYTHON_VERSION)'
-  displayName: 'Use Python $(PYTHON_VERSION)'
-
-- task: DownloadBuildArtifacts@0
-  displayName: Get pre-built package
-  inputs:
-    buildType: 'current'
-    downloadType: 'single'
-    artifactName: 'package'
-    downloadPath: '$(System.ArtifactsDirectory)'
-
-- task: ExtractFiles@1
-  displayName: Checkout sources
-  inputs:
-    archiveFilePatterns: "$(System.ArtifactsDirectory)/package/repo-source.tar.gz"
-    destinationFolder: "$(Pipeline.Workspace)/src"
-
-- script: |
-    pip install --disable-pip-version-check -r $(Pipeline.Workspace)/src/requirements_dev.txt
-  displayName: Install dependencies
-
-- script: |
-    pip install --disable-pip-version-check $(System.ArtifactsDirectory)/package/pypi/*.whl
-  displayName: Install package
-
-- script: |
-    export PYTHONDEVMODE=1
-    coverage run -m pytest -ra
-  displayName: Run tests
-  workingDirectory: $(Pipeline.Workspace)/src
-
-- script: coverage xml
-  displayName: Export coverage statistics
-  workingDirectory: $(Pipeline.Workspace)/src
-
-- bash: bash <(curl -s https://codecov.io/bash) -t $(CODECOV_TOKEN) -n "Python $(PYTHON_VERSION) $(Agent.OS)"
-  displayName: Publish coverage stats
-  continueOnError: True
-  workingDirectory: $(Pipeline.Workspace)/src
-  timeoutInMinutes: 2
diff --git a/.azure-pipelines/create-hashinfo-pull-requests b/.azure-pipelines/create-hashinfo-pull-requests
deleted file mode 100755
index ce81773..0000000
--- a/.azure-pipelines/create-hashinfo-pull-requests
+++ /dev/null
@@ -1,48 +0,0 @@
-#!/bin/bash -eu
-
-echo "##[section]Setting up git configuration"
-git config --global user.name "Azure on github.com/dials/data"
-git config --global user.email "dials@noreply.github.com"
-git config credential.helper "store --file=.git/credentials"
-echo "https://${GITHUB_TOKEN}:@github.com" > .git/credentials
-echo
-
-echo "##[section]Installing github cli"
-pushd .. >/dev/null
-wget -nv https://github.com/github/hub/releases/download/v2.6.0/hub-linux-amd64-2.6.0.tgz -O - | tar xz
-export PATH=$(pwd)/hub-linux-amd64-2.6.0/bin/:${PATH}
-popd >/dev/null
-hub --version
-echo
-
-echo "##[section]Open pull requests:"
-declare -A UPDATEBRANCHES
-for BRANCH in $(hub pr list -b master -f '%H%n'); do
-  echo ${BRANCH}
-  UPDATEBRANCHES[${BRANCH}]=1
-done
-echo
-
-echo "##[section]Checking for required definition updates..."
-echo Base commit is ${BUILD_SOURCEVERSION}
-for DATASET in $(python -c "import dials_data.datasets as ddd; print('\n'.join(ddd.fileinfo_dirty))")
-do
-  echo Checking ${DATASET}
-  [ ${UPDATEBRANCHES[update-${DATASET}]+x} ] || {
-    echo Creating/Updating branch ${DATASET}
-    git checkout -b "update-${DATASET}"
-    git reset --hard ${BUILD_SOURCEVERSION}
-    git commit --allow-empty -m "Trigger update of dataset ${DATASET}"
-    git push -f -u origin "update-${DATASET}"
-    echo
-    echo Generating pull request:
-    hub pull-request -f -m "Update dataset ${DATASET}
-
-This is an automated pull request to trigger an update of the hash information on dataset \`${DATASET}\`.
-
-These pull requests normally consist of two commits: the first one is an empty commit that triggers the
-hash information update, and the second commit is the actual hash information update. Once both commits
-have appeared, the tests have passed, and you are happy with the results you can squash merge the PR.
-"
-  }
-done
diff --git a/.azure-pipelines/flake8-validation.py b/.azure-pipelines/flake8-validation.py
deleted file mode 100644
index 7b997fa..0000000
--- a/.azure-pipelines/flake8-validation.py
+++ /dev/null
@@ -1,42 +0,0 @@
-from __future__ import annotations
-
-import os
-import subprocess
-
-# Flake8 validation
-failures = 0
-try:
-    flake8 = subprocess.run(
-        [
-            "flake8",
-            "--exit-zero",
-        ],
-        capture_output=True,
-        check=True,
-        encoding="latin-1",
-        timeout=300,
-    )
-except (subprocess.CalledProcessError, subprocess.TimeoutExpired) as e:
-    print(
-        "##vso[task.logissue type=error;]flake8 validation failed with",
-        str(e.__class__.__name__),
-    )
-    print(e.stdout)
-    print(e.stderr)
-    print("##vso[task.complete result=Failed;]flake8 validation failed")
-    exit()
-for line in flake8.stdout.split("\n"):
-    if ":" not in line:
-        continue
-    filename, lineno, column, error = line.split(":", maxsplit=3)
-    errcode, error = error.strip().split(" ", maxsplit=1)
-    filename = os.path.normpath(filename)
-    failures += 1
-    print(
-        f"##vso[task.logissue type=error;sourcepath={filename};"
-        f"linenumber={lineno};columnnumber={column};code={errcode};]" + error
-    )
-
-if failures:
-    print(f"##vso[task.logissue type=warning]Found {failures} flake8 violation(s)")
-    print(f"##vso[task.complete result=Failed;]Found {failures} flake8 violation(s)")
diff --git a/.azure-pipelines/syntax-validation.py b/.azure-pipelines/syntax-validation.py
deleted file mode 100644
index 2d74948..0000000
--- a/.azure-pipelines/syntax-validation.py
+++ /dev/null
@@ -1,32 +0,0 @@
-from __future__ import annotations
-
-import ast
-import os
-import sys
-
-print("Python", sys.version, "\n")
-
-failures = 0
-
-for base, _, files in os.walk("."):
-    for f in files:
-        if not f.endswith(".py"):
-            continue
-        filename = os.path.normpath(os.path.join(base, f))
-        try:
-            with open(filename) as fh:
-                ast.parse(fh.read())
-        except SyntaxError as se:
-            failures += 1
-            print(
-                f"##vso[task.logissue type=error;sourcepath={filename};"
-                f"linenumber={se.lineno};columnnumber={se.offset};]"
-                f"SyntaxError: {se.msg}"
-            )
-            print(" " + se.text + " " * se.offset + "^")
-            print(f"SyntaxError: {se.msg} in {filename} line {se.lineno}")
-            print()
-
-if failures:
-    print(f"##vso[task.logissue type=warning]Found {failures} syntax error(s)")
-    print(f"##vso[task.complete result=Failed;]Found {failures} syntax error(s)")
diff --git a/.azure-pipelines/update-hashinfo b/.azure-pipelines/update-hashinfo
deleted file mode 100755
index ad32f6a..0000000
--- a/.azure-pipelines/update-hashinfo
+++ /dev/null
@@ -1,45 +0,0 @@
-#!/bin/bash -eu
-
-echo "##[section]Setting up git configuration"
-git config --global user.name "Azure on github.com/dials/data"
-git config --global user.email "dials@noreply.github.com"
-git config credential.helper "store --file=.git/credentials"
-echo "https://${GITHUB_TOKEN}:@github.com" > .git/credentials
-echo
-
-git checkout "${BUILD_SOURCEBRANCHNAME}"
-CANDIDATE=$(echo ${BUILD_SOURCEBRANCHNAME} | cut -c 8-)
-if [[ -z $CANDIDATE ]]; then
-  echo Could not determine dataset candidate to update
-  exit 1
-fi
-
-echo "##[section]Considering updating information on dataset ${CANDIDATE}"
-for DATASET in $(dials.data list --quiet --missing-hashinfo)
-do
-  if [[ "${DATASET}" == "${CANDIDATE}" ]]; then
-    echo "##[section]Updating dataset ${DATASET}"
-    dials.data get --create-hashinfo ${DATASET}
-    mv ${DATASET}.yml dials_data/hashinfo/${DATASET}.yml
-    if [ -z "$(git status --porcelain)" ]; then
-      echo Working directory remained clean. Something went wrong.
-      exit 1
-    fi
-    git add dials_data/hashinfo/${DATASET}.yml
-    echo
-    echo "##[section]Committing update"
-    git commit -m "Update file information for dataset ${DATASET}"
-    # Ensure dataset will not be downloaded again in the next round
-    dials.data list --quiet --missing-hashinfo | tee remaining-datasets
-    grep -- "^${DATASET}$" < remaining-datasets && {
-      echo Hashinfo sanity check failed
-      exit 1
-    }
-    git push
-    echo Success.
-    exit 0
-  fi
-done
-
-echo "##[error]Could not identify candidate to update. Something went wrong"
-exit 1
diff --git a/.azure-pipelines/update-package-version b/.azure-pipelines/update-package-version
deleted file mode 100755
index d364c07..0000000
--- a/.azure-pipelines/update-package-version
+++ /dev/null
@@ -1,26 +0,0 @@
-#!/bin/bash -eu
-
-VERSION=$(grep "^__version__" dials_data/__init__.py | cut -d'"' -f 2)
-BASETAG=v${VERSION}
-git tag --list | grep "^${BASETAG}$" || {
-  echo Searching for tag on remote
-  if git fetch origin tag ${BASETAG}; then
-    echo Tag found on remote. Deepening checkout...
-    git fetch --shallow-exclude ${BASETAG}
-    git fetch --deepen 1
-  else
-    echo "Tag not found on remote. Let's make a new one!"
-    git tag ${BASETAG}
-    echo "##[warning]Tag ${BASETAG} is missing on your repository. Packages generated during the build will not have reliable version numbers"
-  fi
-}
-
-DEPTH=$(git rev-list ${BASETAG}..HEAD --count --first-parent)
-REAL_VERSION=${VERSION%.0}.${DEPTH}
-echo Setting package version from ${VERSION} to ${REAL_VERSION}
-sed -i "s/^__version__ =.*/__version__ = \"${REAL_VERSION}\"/" dials_data/__init__.py
-sed -i "s/^version = .*$/version = ${REAL_VERSION}/" setup.cfg
-
-COMMIT=$(git rev-parse --verify HEAD)
-echo Setting package commit to ${COMMIT}
-sed -i "s/^__commit__ =.*/__commit__ = \"${COMMIT}\"/" dials_data/__init__.py
diff --git a/.bumpversion.cfg b/.bumpversion.cfg
deleted file mode 100644
index b97c10c..0000000
--- a/.bumpversion.cfg
+++ /dev/null
@@ -1,16 +0,0 @@
-[bumpversion]
-current_version = 2.4.0
-commit = True
-tag = True
-
-[bumpversion:file:setup.cfg]
-search = version = {current_version}
-replace = version = {new_version}
-
-[bumpversion:file:dials_data/__init__.py]
-search = __version__ = "{current_version}"
-replace = __version__ = "{new_version}"
-
-[bumpversion:file:docs/conf.py]
-search = version = "{current_version}"
-replace = version = "{new_version}"
diff --git a/.codecov.yml b/.codecov.yml
deleted file mode 100644
index 7c8e50e..0000000
--- a/.codecov.yml
+++ /dev/null
@@ -1,16 +0,0 @@
-coverage:
-  status:
-    project:
-      default:
-        target: 0%
-        threshold: 100%
-    patch:
-      default:
-        target: 0%
-        threshold: 100%
-
-comment:
-  layout: "diff, flags"
-  branches:
-    - master
-  after_n_builds: 3
diff --git a/.git-notifier b/.git-notifier
deleted file mode 100644
index 2c4cd39..0000000
--- a/.git-notifier
+++ /dev/null
@@ -1,12 +0,0 @@
-# GIT change notifications sent out by DLS Jenkins
-
-link=https://github.com/dials/data
-commit_link=https://github.com/dials/data/commit/%s
-
-sender=dials-data commit
-
-recipient=dials-commit
-recipient_domain=jiscmail.ac.uk
-
-replyto=dials-support
-replyto_domain=lists.sourceforge.net
diff --git a/.github/ISSUE_TEMPLATE.md b/.github/ISSUE_TEMPLATE.md
deleted file mode 100644
index 1920f4a..0000000
--- a/.github/ISSUE_TEMPLATE.md
+++ /dev/null
@@ -1,15 +0,0 @@
-* DIALS Regression Data version:
-* Python version:
-* Operating System:
-
-### Description
-
-Describe what you were trying to get done.
-Tell us what happened, what went wrong, and what you expected to happen.
-
-### What I Did
-
-```
-Paste the command(s) you ran and the output.
-If there was a crash, please include the traceback here.
-```
diff --git a/.github/renovate.json b/.github/renovate.json
deleted file mode 100644
index c0de59a..0000000
--- a/.github/renovate.json
+++ /dev/null
@@ -1,38 +0,0 @@
-{
-  "extends": [
-    "config:base",
-    ":disableDependencyDashboard"
-  ],
-  "labels": [
-    "dependencies"
-  ],
-  "pip_requirements": {
-    "fileMatch": [
-      "^requirements.*\\.txt$"
-    ],
-    "groupName": "all dependencies",
-    "groupSlug": "all",
-    "packageRules": [
-      {
-        "groupName": "all dependencies",
-        "groupSlug": "all",
-        "matchPackagePatterns": [
-          "*"
-        ]
-      }
-    ]
-  },
-  "prCreation": "not-pending",
-  "prHourlyLimit": 2,
-  "pre-commit": {
-    "schedule": [
-      "after 10am and before 4pm every 3 months on the first day of the month"
-    ],
-    "stabilityDays": 10
-  },
-  "schedule": [
-    "after 7am and before 4pm every monday"
-  ],
-  "stabilityDays": 2,
-  "timezone": "Europe/London"
-}
diff --git a/.gitignore b/.gitignore
deleted file mode 100644
index 31eec24..0000000
--- a/.gitignore
+++ /dev/null
@@ -1,112 +0,0 @@
-# Byte-compiled / optimized / DLL files
-__pycache__/
-*.py[cod]
-*$py.class
-
-# editor files
-*.swp
-~*
-.project
-.pydevproject
-
-# C extensions
-*.so
-
-# Distribution / packaging
-.Python
-env/
-build/
-develop-eggs/
-dist/
-downloads/
-eggs/
-.eggs/
-lib/
-lib64/
-parts/
-sdist/
-var/
-wheels/
-*.egg-info/
-.installed.cfg
-*.egg
-
-# PyInstaller
-#  Usually these files are written by a python script from a template
-#  before PyInstaller builds the exe, so as to inject date/other infos into it.
-*.manifest
-*.spec
-
-# Installer logs
-pip-log.txt
-pip-delete-this-directory.txt
-
-# Unit test / coverage reports
-htmlcov/
-.tox/
-.coverage
-.coverage.*
-.cache
-nosetests.xml
-coverage.xml
-*.cover
-.hypothesis/
-.pytest_cache/
-
-# Translations
-*.mo
-*.pot
-
-# Django stuff:
-*.log
-local_settings.py
-
-# Flask stuff:
-instance/
-.webassets-cache
-
-# Scrapy stuff:
-.scrapy
-
-# Sphinx documentation
-docs/_build/
-
-# PyBuilder
-target/
-
-# Jupyter Notebook
-.ipynb_checkpoints
-
-# pyenv
-.python-version
-
-# celery beat schedule file
-celerybeat-schedule
-
-# SageMath parsed files
-*.sage.py
-
-# dotenv
-.env
-
-# virtualenv
-.venv
-venv/
-ENV/
-
-# Spyder project settings
-.spyderproject
-.spyproject
-
-# Rope project settings
-.ropeproject
-
-# mkdocs documentation
-/site
-
-# mypy
-.mypy_cache/
-
-# notification logs
-.git-notifier*
-git-notifier.log
diff --git a/.pre-commit-config.yaml b/.pre-commit-config.yaml
deleted file mode 100644
index c6476bf..0000000
--- a/.pre-commit-config.yaml
+++ /dev/null
@@ -1,53 +0,0 @@
-repos:
-# Syntax validation and some basic sanity checks
-- repo: https://github.com/pre-commit/pre-commit-hooks
-  rev: v4.0.1
-  hooks:
-  - id: check-merge-conflict
-  - id: check-ast
-    fail_fast: True
-  - id: check-json
-  - id: check-added-large-files
-    args: ['--maxkb=200']
-  - id: check-yaml
-
-# Automatically sort imports
-- repo: https://github.com/PyCQA/isort
-  rev: 5.10.1
-  hooks:
-  - id: isort
-    args: [
-           '-a', 'from __future__ import annotations',        # 3.7-3.11
-           '--rm', 'from __future__ import absolute_import',  # -3.0
-           '--rm', 'from __future__ import division',         # -3.0
-           '--rm', 'from __future__ import generator_stop',   # -3.7
-           '--rm', 'from __future__ import generators',       # -2.3
-           '--rm', 'from __future__ import nested_scopes',    # -2.2
-           '--rm', 'from __future__ import print_function',   # -3.0
-           '--rm', 'from __future__ import unicode_literals', # -3.0
-           '--rm', 'from __future__ import with_statement',   # -2.6
-          ]
-
-# Automatic source code formatting
-- repo: https://github.com/psf/black
-  rev: 22.1.0
-  hooks:
-  - id: black
-    args: [--safe, --quiet]
-
-# Linting
-- repo: https://github.com/PyCQA/flake8
-  rev: 4.0.1
-  hooks:
-  - id: flake8
-    additional_dependencies: ['flake8-comprehensions==3.8.0']
-
-# Type checking
-# Remember to change versions in .azure-pipelines/azure-pipelines.yml to match
-# the versions here.
-- repo: https://github.com/pre-commit/mirrors-mypy
-  rev: v0.931
-  hooks:
-  - id: mypy
-    files: 'dials_data/.*\.py$'
-    additional_dependencies: ['types-PyYAML==6.0.4', 'types-requests==2.27.11']
diff --git a/CONTRIBUTING.rst b/CONTRIBUTING.rst
index 95338a6..44e086c 100644
--- a/CONTRIBUTING.rst
+++ b/CONTRIBUTING.rst
@@ -51,7 +51,7 @@ If you are proposing a feature:
 Get Started!
 ------------
 
-Ready to contribute? Here's how to set up `dials_data` for local development.
+Ready to contribute? Here's how to set up `dials-data` for local development.
 
 1. Fork the `dials/data` `repository on GitHub <https://github.com/dials/data>`__.
 2. Clone your fork locally::
diff --git a/HISTORY.rst b/HISTORY.rst
index 1f7595d..495cb82 100644
--- a/HISTORY.rst
+++ b/HISTORY.rst
@@ -2,6 +2,10 @@
 History
 =======
 
+2.5 (????-??-??)
+^^^^^^^^^^^^^^^^
+
+
 2.4 (2022-03-07)
 ^^^^^^^^^^^^^^^^
 
diff --git a/PKG-INFO b/PKG-INFO
new file mode 100644
index 0000000..23734bd
--- /dev/null
+++ b/PKG-INFO
@@ -0,0 +1,187 @@
+Metadata-Version: 2.1
+Name: dials_data
+Version: 2.4.0
+Summary: DIALS Regression Data Manager
+Home-page: https://github.com/dials/data
+Author: DIALS development team
+Author-email: dials-support@lists.sourceforge.net
+License: BSD 3-Clause License
+Project-URL: Bug Tracker, https://github.com/dials/data/issues
+Project-URL: Documentation, https://dials-data.readthedocs.io/
+Project-URL: Source Code, https://github.com/dials/data
+Keywords: dials,dials_data
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Natural Language :: English
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Requires-Python: >=3.7
+Description-Content-Type: text/x-rst
+License-File: LICENSE
+
+=============================
+DIALS Regression Data Manager
+=============================
+
+.. image:: https://img.shields.io/pypi/v/dials_data.svg
+        :target: https://pypi.python.org/pypi/dials_data
+        :alt: PyPI release
+
+.. image:: https://img.shields.io/conda/vn/conda-forge/dials-data.svg
+        :target: https://anaconda.org/conda-forge/dials-data
+        :alt: Conda release
+
+.. image:: https://travis-ci.com/dials/data.svg?branch=master
+        :target: https://travis-ci.com/dials/data
+        :alt: Build status
+
+.. image:: https://img.shields.io/lgtm/grade/python/g/dials/data.svg?logo=lgtm&logoWidth=18
+        :target: https://lgtm.com/projects/g/dials/data/context:python
+        :alt: Language grade: Python
+
+.. image:: https://img.shields.io/lgtm/alerts/g/dials/data.svg?logo=lgtm&logoWidth=18
+        :target: https://lgtm.com/projects/g/dials/data/alerts/
+        :alt: Total alerts
+
+.. image:: https://readthedocs.org/projects/dials-data/badge/?version=latest
+        :target: https://dials-data.readthedocs.io/en/latest/?badge=latest
+        :alt: Documentation status
+
+.. image:: https://img.shields.io/pypi/pyversions/dials_data.svg
+        :target: https://pypi.org/project/dials_data/
+        :alt: Supported Python versions
+
+.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
+        :target: https://github.com/ambv/black
+        :alt: Code style: black
+
+.. image:: https://img.shields.io/pypi/l/dials_data.svg
+        :target: https://pypi.python.org/pypi/dials_data
+        :alt: BSD license
+
+A python package providing data files used for regression tests in
+DIALS_, dxtbx_, xia2_ and related packages.
+
+If you want to know more about what ``dials-data`` is you can
+have a read through the `background information <https://dials-data.readthedocs.io/en/latest/why.html>`__.
+
+For everything else `the main documentation <https://dials-data.readthedocs.io/>`__ is probably the best start.
+
+
+Installation
+^^^^^^^^^^^^
+
+To install this package in a normal Python environment, run::
+
+    pip install dials-data
+
+and then you can use it with::
+
+    dials.data
+
+If you are in a conda environment you can instead run::
+
+    conda install -c conda-forge dials-data
+
+For more details please take a look at the
+`installation and usage page <https://dials-data.readthedocs.io/en/latest/installation.html>`__.
+
+
+.. _DIALS: https://dials.github.io
+.. _dxtbx: https://github.com/cctbx/cctbx_project/tree/master/dxtbx
+.. _xia2: https://xia2.github.io
+
+=======
+History
+=======
+
+2.5 (????-??-??)
+^^^^^^^^^^^^^^^^
+
+
+2.4 (2022-03-07)
+^^^^^^^^^^^^^^^^
+
+* dials_data no longer uses ``py.path`` internally.
+* dials_data now includes type checking with mypy.
+* We started using the ``requests`` library for faster downloads.
+* Downloads now happen in parallel.
+
+2.3 (2022-01-11)
+^^^^^^^^^^^^^^^^
+
+* Drop Python 3.6 compatibility
+* Dataset `SSX_CuNiR_processed` has been renamed to `cunir_serial_processed` for consistency
+  with `cunir_serial`
+
+2.2 (2021-06-18)
+^^^^^^^^^^^^^^^^
+
+* Deprecate the use of ``py.path`` as test fixture return type.
+  You can either silence the warning by specifying ``dials_data("dataset", pathlib=False)``
+  or move to the new ``pathlib.Path`` return objects by setting ``pathlib=True``.
+  This deprecation is planned to be in place for a considerable amount of time.
+  In the next major release (3.0) the default return type will become ``pathlib.Path``,
+  with ``py.path`` still available if ``pathlib=False`` is specified. At this point
+  the ``pathlib=`` argument will be deprecated.
+  In the following minor release (3.1) all support for ``py.path`` will be dropped.
+
+2.1 (2020-06-11)
+^^^^^^^^^^^^^^^^
+
+* Drops Python 2.7 compatibility
+* Uses importlib.resources to access resource files (requires Python 3.9 or installed package importlib_resources)
+
+2.0 (2019-04-15)
+^^^^^^^^^^^^^^^^
+
+* Convert dials_data to a pytest plugin
+
+1.0 (2019-02-16)
+^^^^^^^^^^^^^^^^
+
+* Add functions for forward-compatibility
+* Enable new release process including automatic deployment of updates
+
+0.6 (2019-02-15)
+^^^^^^^^^^^^^^^^
+
+* Added datasets blend_tutorial, thaumatin_i04
+
+0.5 (2019-01-24)
+^^^^^^^^^^^^^^^^
+
+* Added documentation
+* Added datasets fumarase, vmxi_thaumatin
+
+0.4 (2019-01-11)
+^^^^^^^^^^^^^^^^
+
+* Beta release
+* Added datasets insulin, pychef
+* Automated generation of hashinfo files via Travis
+
+
+0.3 (2019-01-09)
+^^^^^^^^^^^^^^^^
+
+* Dataset download mechanism
+* Added dataset x4wide
+
+
+0.2 (2019-01-08)
+^^^^^^^^^^^^^^^^
+
+* Alpha release
+* Basic command line interface
+* pytest fixture
+
+
+0.1 (2018-11-02)
+^^^^^^^^^^^^^^^^
+
+* First automatic deployment and release on PyPI
diff --git a/README.rst b/README.rst
index 480a4d1..2b6770a 100644
--- a/README.rst
+++ b/README.rst
@@ -41,7 +41,7 @@ DIALS Regression Data Manager
 A python package providing data files used for regression tests in
 DIALS_, dxtbx_, xia2_ and related packages.
 
-If you want to know more about what ``dials_data`` is you can
+If you want to know more about what ``dials-data`` is you can
 have a read through the `background information <https://dials-data.readthedocs.io/en/latest/why.html>`__.
 
 For everything else `the main documentation <https://dials-data.readthedocs.io/>`__ is probably the best start.
@@ -52,7 +52,7 @@ Installation
 
 To install this package in a normal Python environment, run::
 
-    pip install dials_data
+    pip install dials-data
 
 and then you can use it with::
 
@@ -60,7 +60,7 @@ and then you can use it with::
 
 If you are in a conda environment you can instead run::
 
-    conda install -c conda-forge dials_data
+    conda install -c conda-forge dials-data
 
 For more details please take a look at the
 `installation and usage page <https://dials-data.readthedocs.io/en/latest/installation.html>`__.
diff --git a/debian/changelog b/debian/changelog
index ab42a7a..ed0b4bb 100644
--- a/debian/changelog
+++ b/debian/changelog
@@ -1,3 +1,11 @@
+dials-data (2.4.0+git20221116.1.d299025-1) UNRELEASED; urgency=low
+
+  * New upstream snapshot.
+  * New upstream snapshot.
+  * New upstream snapshot.
+
+ -- Debian Janitor <janitor@jelmer.uk>  Sun, 22 Jan 2023 10:42:13 -0000
+
 dials-data (2.4.0-1) unstable; urgency=medium
 
   * New upstream version 2.4.0
diff --git a/dials_data.egg-info/PKG-INFO b/dials_data.egg-info/PKG-INFO
new file mode 100644
index 0000000..f59824b
--- /dev/null
+++ b/dials_data.egg-info/PKG-INFO
@@ -0,0 +1,187 @@
+Metadata-Version: 2.1
+Name: dials-data
+Version: 2.4.0
+Summary: DIALS Regression Data Manager
+Home-page: https://github.com/dials/data
+Author: DIALS development team
+Author-email: dials-support@lists.sourceforge.net
+License: BSD 3-Clause License
+Project-URL: Bug Tracker, https://github.com/dials/data/issues
+Project-URL: Documentation, https://dials-data.readthedocs.io/
+Project-URL: Source Code, https://github.com/dials/data
+Keywords: dials,dials_data
+Classifier: Development Status :: 5 - Production/Stable
+Classifier: Intended Audience :: Developers
+Classifier: License :: OSI Approved :: BSD License
+Classifier: Natural Language :: English
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.7
+Classifier: Programming Language :: Python :: 3.8
+Classifier: Programming Language :: Python :: 3.9
+Classifier: Programming Language :: Python :: 3.10
+Requires-Python: >=3.7
+Description-Content-Type: text/x-rst
+License-File: LICENSE
+
+=============================
+DIALS Regression Data Manager
+=============================
+
+.. image:: https://img.shields.io/pypi/v/dials_data.svg
+        :target: https://pypi.python.org/pypi/dials_data
+        :alt: PyPI release
+
+.. image:: https://img.shields.io/conda/vn/conda-forge/dials-data.svg
+        :target: https://anaconda.org/conda-forge/dials-data
+        :alt: Conda release
+
+.. image:: https://travis-ci.com/dials/data.svg?branch=master
+        :target: https://travis-ci.com/dials/data
+        :alt: Build status
+
+.. image:: https://img.shields.io/lgtm/grade/python/g/dials/data.svg?logo=lgtm&logoWidth=18
+        :target: https://lgtm.com/projects/g/dials/data/context:python
+        :alt: Language grade: Python
+
+.. image:: https://img.shields.io/lgtm/alerts/g/dials/data.svg?logo=lgtm&logoWidth=18
+        :target: https://lgtm.com/projects/g/dials/data/alerts/
+        :alt: Total alerts
+
+.. image:: https://readthedocs.org/projects/dials-data/badge/?version=latest
+        :target: https://dials-data.readthedocs.io/en/latest/?badge=latest
+        :alt: Documentation status
+
+.. image:: https://img.shields.io/pypi/pyversions/dials_data.svg
+        :target: https://pypi.org/project/dials_data/
+        :alt: Supported Python versions
+
+.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
+        :target: https://github.com/ambv/black
+        :alt: Code style: black
+
+.. image:: https://img.shields.io/pypi/l/dials_data.svg
+        :target: https://pypi.python.org/pypi/dials_data
+        :alt: BSD license
+
+A python package providing data files used for regression tests in
+DIALS_, dxtbx_, xia2_ and related packages.
+
+If you want to know more about what ``dials-data`` is you can
+have a read through the `background information <https://dials-data.readthedocs.io/en/latest/why.html>`__.
+
+For everything else `the main documentation <https://dials-data.readthedocs.io/>`__ is probably the best start.
+
+
+Installation
+^^^^^^^^^^^^
+
+To install this package in a normal Python environment, run::
+
+    pip install dials-data
+
+and then you can use it with::
+
+    dials.data
+
+If you are in a conda environment you can instead run::
+
+    conda install -c conda-forge dials-data
+
+For more details please take a look at the
+`installation and usage page <https://dials-data.readthedocs.io/en/latest/installation.html>`__.
+
+
+.. _DIALS: https://dials.github.io
+.. _dxtbx: https://github.com/cctbx/cctbx_project/tree/master/dxtbx
+.. _xia2: https://xia2.github.io
+
+=======
+History
+=======
+
+2.5 (????-??-??)
+^^^^^^^^^^^^^^^^
+
+
+2.4 (2022-03-07)
+^^^^^^^^^^^^^^^^
+
+* dials_data no longer uses ``py.path`` internally.
+* dials_data now includes type checking with mypy.
+* We started using the ``requests`` library for faster downloads.
+* Downloads now happen in parallel.
+
+2.3 (2022-01-11)
+^^^^^^^^^^^^^^^^
+
+* Drop Python 3.6 compatibility
+* Dataset `SSX_CuNiR_processed` has been renamed to `cunir_serial_processed` for consistency
+  with `cunir_serial`
+
+2.2 (2021-06-18)
+^^^^^^^^^^^^^^^^
+
+* Deprecate the use of ``py.path`` as test fixture return type.
+  You can either silence the warning by specifying ``dials_data("dataset", pathlib=False)``
+  or move to the new ``pathlib.Path`` return objects by setting ``pathlib=True``.
+  This deprecation is planned to be in place for a considerable amount of time.
+  In the next major release (3.0) the default return type will become ``pathlib.Path``,
+  with ``py.path`` still available if ``pathlib=False`` is specified. At this point
+  the ``pathlib=`` argument will be deprecated.
+  In the following minor release (3.1) all support for ``py.path`` will be dropped.
+
+2.1 (2020-06-11)
+^^^^^^^^^^^^^^^^
+
+* Drops Python 2.7 compatibility
+* Uses importlib.resources to access resource files (requires Python 3.9 or installed package importlib_resources)
+
+2.0 (2019-04-15)
+^^^^^^^^^^^^^^^^
+
+* Convert dials_data to a pytest plugin
+
+1.0 (2019-02-16)
+^^^^^^^^^^^^^^^^
+
+* Add functions for forward-compatibility
+* Enable new release process including automatic deployment of updates
+
+0.6 (2019-02-15)
+^^^^^^^^^^^^^^^^
+
+* Added datasets blend_tutorial, thaumatin_i04
+
+0.5 (2019-01-24)
+^^^^^^^^^^^^^^^^
+
+* Added documentation
+* Added datasets fumarase, vmxi_thaumatin
+
+0.4 (2019-01-11)
+^^^^^^^^^^^^^^^^
+
+* Beta release
+* Added datasets insulin, pychef
+* Automated generation of hashinfo files via Travis
+
+
+0.3 (2019-01-09)
+^^^^^^^^^^^^^^^^
+
+* Dataset download mechanism
+* Added dataset x4wide
+
+
+0.2 (2019-01-08)
+^^^^^^^^^^^^^^^^
+
+* Alpha release
+* Basic command line interface
+* pytest fixture
+
+
+0.1 (2018-11-02)
+^^^^^^^^^^^^^^^^
+
+* First automatic deployment and release on PyPI
diff --git a/dials_data.egg-info/SOURCES.txt b/dials_data.egg-info/SOURCES.txt
new file mode 100644
index 0000000..7203987
--- /dev/null
+++ b/dials_data.egg-info/SOURCES.txt
@@ -0,0 +1,110 @@
+CONTRIBUTING.rst
+HISTORY.rst
+LICENSE
+MANIFEST.in
+README.rst
+pyproject.toml
+setup.cfg
+setup.py
+dials_data/__init__.py
+dials_data/cli.py
+dials_data/datasets.py
+dials_data/download.py
+dials_data/py.typed
+dials_data/pytest11.py
+dials_data.egg-info/PKG-INFO
+dials_data.egg-info/SOURCES.txt
+dials_data.egg-info/dependency_links.txt
+dials_data.egg-info/entry_points.txt
+dials_data.egg-info/not-zip-safe
+dials_data.egg-info/requires.txt
+dials_data.egg-info/top_level.txt
+dials_data/definitions/4fluoro_cxi.yml
+dials_data/definitions/MyD88_processed.yml
+dials_data/definitions/aluminium_standard.yml
+dials_data/definitions/blend_tutorial.yml
+dials_data/definitions/centroid_test_data.yml
+dials_data/definitions/cunir_serial.yml
+dials_data/definitions/cunir_serial_processed.yml
+dials_data/definitions/dtpb_serial_processed.yml
+dials_data/definitions/four_circle_eiger.yml
+dials_data/definitions/fumarase.yml
+dials_data/definitions/i19_1_pdteet_index.yml
+dials_data/definitions/image_examples.yml
+dials_data/definitions/insulin.yml
+dials_data/definitions/insulin_processed.yml
+dials_data/definitions/isis_sxd_example_data.yml
+dials_data/definitions/l_cysteine_4_sweeps_scaled.yml
+dials_data/definitions/l_cysteine_dials_output.yml
+dials_data/definitions/lcls_rayonix_kapton.yml
+dials_data/definitions/lysozyme_JF16M_4img.yml
+dials_data/definitions/lysozyme_JF16M_4img_spectra.yml
+dials_data/definitions/lysozyme_electron_diffraction.yml
+dials_data/definitions/mpro_x0305_processed.yml
+dials_data/definitions/mpro_x0692.yml
+dials_data/definitions/multi_crystal_proteinase_k.yml
+dials_data/definitions/pycbf.yml
+dials_data/definitions/pychef.yml
+dials_data/definitions/quartz_processed.yml
+dials_data/definitions/refinement_test_data.yml
+dials_data/definitions/relion_tutorial_data.yml
+dials_data/definitions/small_molecule_example.yml
+dials_data/definitions/spring8_ccp4_2018.yml
+dials_data/definitions/thaumatin_eiger_screen.yml
+dials_data/definitions/thaumatin_grid_scan.yml
+dials_data/definitions/thaumatin_i04.yml
+dials_data/definitions/trypsin_multi_lattice.yml
+dials_data/definitions/vmxi_proteinase_k_sweeps.yml
+dials_data/definitions/vmxi_thaumatin.yml
+dials_data/definitions/vmxi_thaumatin_grid_index.yml
+dials_data/definitions/x4wide.yml
+dials_data/definitions/x4wide_processed.yml
+dials_data/hashinfo/4fluoro_cxi.yml
+dials_data/hashinfo/MyD88_processed.yml
+dials_data/hashinfo/aluminium_standard.yml
+dials_data/hashinfo/blend_tutorial.yml
+dials_data/hashinfo/centroid_test_data.yml
+dials_data/hashinfo/cunir_serial.yml
+dials_data/hashinfo/cunir_serial_processed.yml
+dials_data/hashinfo/dtpb_serial_processed.yml
+dials_data/hashinfo/four_circle_eiger.yml
+dials_data/hashinfo/fumarase.yml
+dials_data/hashinfo/i19_1_pdteet_index.yml
+dials_data/hashinfo/image_examples.yml
+dials_data/hashinfo/insulin.yml
+dials_data/hashinfo/insulin_processed.yml
+dials_data/hashinfo/isis_sxd_example_data.yml
+dials_data/hashinfo/l_cysteine_4_sweeps_scaled.yml
+dials_data/hashinfo/l_cysteine_dials_output.yml
+dials_data/hashinfo/lcls_rayonix_kapton.yml
+dials_data/hashinfo/lysozyme_JF16M_4img.yml
+dials_data/hashinfo/lysozyme_JF16M_4img_spectra.yml
+dials_data/hashinfo/lysozyme_electron_diffraction.yml
+dials_data/hashinfo/mpro_x0305_processed.yml
+dials_data/hashinfo/mpro_x0692.yml
+dials_data/hashinfo/multi_crystal_proteinase_k.yml
+dials_data/hashinfo/pycbf.yml
+dials_data/hashinfo/pychef.yml
+dials_data/hashinfo/quartz_processed.yml
+dials_data/hashinfo/refinement_test_data.yml
+dials_data/hashinfo/relion_tutorial_data.yml
+dials_data/hashinfo/small_molecule_example.yml
+dials_data/hashinfo/spring8_ccp4_2018.yml
+dials_data/hashinfo/thaumatin_eiger_screen.yml
+dials_data/hashinfo/thaumatin_grid_scan.yml
+dials_data/hashinfo/thaumatin_i04.yml
+dials_data/hashinfo/trypsin_multi_lattice.yml
+dials_data/hashinfo/vmxi_proteinase_k_sweeps.yml
+dials_data/hashinfo/vmxi_thaumatin.yml
+dials_data/hashinfo/vmxi_thaumatin_grid_index.yml
+dials_data/hashinfo/x4wide.yml
+dials_data/hashinfo/x4wide_processed.yml
+docs/conf.py
+docs/contributing.rst
+docs/history.rst
+docs/index.rst
+docs/installation.rst
+docs/readme.rst
+docs/why.rst
+tests/test_dials_data.py
+tests/test_yaml_files.py
\ No newline at end of file
diff --git a/dials_data.egg-info/dependency_links.txt b/dials_data.egg-info/dependency_links.txt
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/dials_data.egg-info/dependency_links.txt
@@ -0,0 +1 @@
+
diff --git a/dials_data.egg-info/entry_points.txt b/dials_data.egg-info/entry_points.txt
new file mode 100644
index 0000000..34be63b
--- /dev/null
+++ b/dials_data.egg-info/entry_points.txt
@@ -0,0 +1,11 @@
+[console_scripts]
+dials.data = dials_data.cli:main
+
+[libtbx.dispatcher.script]
+dials.data = dials.data
+
+[libtbx.precommit]
+dials_data = dials_data
+
+[pytest11]
+dials_data = dials_data.pytest11
diff --git a/dials_data.egg-info/not-zip-safe b/dials_data.egg-info/not-zip-safe
new file mode 100644
index 0000000..8b13789
--- /dev/null
+++ b/dials_data.egg-info/not-zip-safe
@@ -0,0 +1 @@
+
diff --git a/dials_data.egg-info/requires.txt b/dials_data.egg-info/requires.txt
new file mode 100644
index 0000000..03c12d4
--- /dev/null
+++ b/dials_data.egg-info/requires.txt
@@ -0,0 +1,4 @@
+importlib_resources>=1.1
+pytest
+pyyaml
+requests
diff --git a/dials_data.egg-info/top_level.txt b/dials_data.egg-info/top_level.txt
new file mode 100644
index 0000000..ed12ea6
--- /dev/null
+++ b/dials_data.egg-info/top_level.txt
@@ -0,0 +1 @@
+dials_data
diff --git a/dials_data/cli.py b/dials_data/cli.py
index d8c5c22..bea84bd 100644
--- a/dials_data/cli.py
+++ b/dials_data/cli.py
@@ -79,6 +79,8 @@ def cli_get(cmd_args):
         hashinfo = dials_data.download.fetch_dataset(
             ds, ignore_hashinfo=args.create_hashinfo, verify=args.verify
         )
+        if not hashinfo:
+            exit(f"Error downloading dataset {ds}")
         if args.create_hashinfo:
             if not args.quiet:
                 print(f"Writing file integrity information to {ds}.yml")
diff --git a/dials_data/definitions/4fluoro_cxi.yml b/dials_data/definitions/4fluoro_cxi.yml
new file mode 100644
index 0000000..e70fd56
--- /dev/null
+++ b/dials_data/definitions/4fluoro_cxi.yml
@@ -0,0 +1,42 @@
+name: 4-fluoro dataset for 2022 LCLS smSFX workshop
+author: Aaron Brewster, Daniel Paley, Elyse Schriber, Vanessa Oklejas, David Moreau, and J. Nathan Hohman
+license: CC-BY 4.0
+url: https://doi.org/10.5281/zenodo.7117518
+description: >
+ This is the dataset for the 2022 LCLS user meeting workshop, "Applications for small-molecule Serial Femtosecond Diffraction in Materials Science and Chemistry"
+
+ For full details, please refer to the full set of data processing instructions here:
+ https://docs.google.com/document/d/1UNz3zTATIak5UkVGIli0W3HjHZr9orBiWp4nt12AzOQ
+
+ The dataset contains these folders:
+
+ - calibration: calibration constants needed for indexing
+   - r0215.expt: refined geometry file for the JF4M detector
+   - psana_border_hot_shift4.mask: untrusted pixel mask
+ - spotfinding:
+   - r0165_strong.expt, r0165_strong.refl: spotfinding results for run 165
+   - 4fluoro_peaks.txt: a list of 20 peaks harvested from the spotfinder powder pattern.
+ - ten_cbfs: 10 cbf files of 4-fluoro, created from LCLS experiment LY65, run 164.
+ - indexing: working folder for indexing the 10 cbfs.  Contains params_1.phil file, the indexing parameters for cctbx.small_cell_process
+ - merging: data folders, one per run, with integrated.expt and integrated.refl files
+   - mark1.phil and mark0.phil: phil parameters for merging
+ - refinement: 4fluoro_final.res, 4fluoro.hkl, 4fluoro_solved1.res, 4fluoro_solved2.res, 4fluoro_start.ins, input and output files for solving and refining the 4fluoro structure
+
+data:
+  - url: https://zenodo.org/record/7117519/files/lcls_2022_smSFX_workshop_data.tar.gz
+    files:
+      - lcls_2022_smSFX_workshop_data/calibration/psana_border_hot_shift4.mask
+      - lcls_2022_smSFX_workshop_data/calibration/r0215.expt
+      - lcls_2022_smSFX_workshop_data/indexing/params_1.phil
+      - lcls_2022_smSFX_workshop_data/spotfinding/r0165_strong.expt
+      - lcls_2022_smSFX_workshop_data/spotfinding/r0165_strong.refl
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_02081.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_08320.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_15949.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_43630.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_45478.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_05021.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_10738.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_42048.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_44166.cbf
+      - lcls_2022_smSFX_workshop_data/ten_cbfs/cxily6520_r0164_67542.cbf
diff --git a/dials_data/definitions/aluminium_standard.yml b/dials_data/definitions/aluminium_standard.yml
index 46619d1..279851e 100644
--- a/dials_data/definitions/aluminium_standard.yml
+++ b/dials_data/definitions/aluminium_standard.yml
@@ -5,17 +5,15 @@ description: >
   Powder diffraction data collected by Yun Song at DLS on eBIC's Talos Arctica (m12).
   The sample is a polycrystalline aluminium standard.
 
-  imported.expt was generated with
+  `imported.expt` was generated with:
     $ dials.import $(dials.data get -q aluminium_standard)/0p67_5s_0000.mrc
 
-  eyeballed.expt was generated with dials.powder_calibrate_widget
+  `eyeballed.expt` was generated with: 
+    $ dials.powder_calibrate imported.expt standard="Al" coarse_geom="eyeballed.expt"
+
+  `calibrated.expt` was generated without using the wizard with:
+    $ dials.powder_calibrate eyeballed.expt standard="Al" eyeball=False
 
-  calibrated.expt was generated with
-    starting_file = "eyeballed.expt"
-    test_args = [starting_file, "standard=Al", "eyeball=False"]
-    expt_parameters, user_arguments = parse_args(args=test_args)
-    calibrator = PowderCalibrator(expt_params=expt_parameters, user_args=user_arguments)
-    calibrator.calibrate_with_calibrant(verbose=True)
   
   Used for powder geometry calibration testing.
 
diff --git a/dials_data/definitions/cunir_serial.yml b/dials_data/definitions/cunir_serial.yml
index cf87f4d..36e882b 100644
--- a/dials_data/definitions/cunir_serial.yml
+++ b/dials_data/definitions/cunir_serial.yml
@@ -4,6 +4,8 @@ license: CC-BY 4.0
 description: >
   5 example SSX images of CuNiR, taken on I24 beamline at Diamond
   Light Source.
+  Also provided is a reference PDB structure for this protein (pdb entry 2BW4),
+  in PDB and cif format, plus the reference intensity data from that deposition.
 
 data:
   - url: https://github.com/dials/data-files/raw/a0e40127492faa165a9f5ef7590ff1591957e6d9/ssx_CuNiR_test_data/merlin0047_17000.cbf
@@ -11,3 +13,6 @@ data:
   - url: https://github.com/dials/data-files/raw/a0e40127492faa165a9f5ef7590ff1591957e6d9/ssx_CuNiR_test_data/merlin0047_17002.cbf
   - url: https://github.com/dials/data-files/raw/a0e40127492faa165a9f5ef7590ff1591957e6d9/ssx_CuNiR_test_data/merlin0047_17003.cbf
   - url: https://github.com/dials/data-files/raw/a0e40127492faa165a9f5ef7590ff1591957e6d9/ssx_CuNiR_test_data/merlin0047_17004.cbf
+  - url: https://files.rcsb.org/download/2BW4.pdb
+  - url: https://files.rcsb.org/download/2bw4-sf.cif
+  - url: https://files.rcsb.org/download/2bw4.cif
\ No newline at end of file
diff --git a/dials_data/definitions/cunir_serial_processed.yml b/dials_data/definitions/cunir_serial_processed.yml
index d4a76c0..d2fbc28 100644
--- a/dials_data/definitions/cunir_serial_processed.yml
+++ b/dials_data/definitions/cunir_serial_processed.yml
@@ -6,6 +6,7 @@ description: >
   Light Source, courtesy of Robin Owen and Mike Hough.
   5 images were processed with DIALS v3.7.3, with and without a reference
   geometry, for testing indexing.
+  Indexed data were integrated with DIALS commit #37ef448 with dev.dials.ssx_integrate
   Images can be found in the 'cunir_serial' data set; file references are
   laid out so that the images can be accessed if that data set is present.
 
@@ -16,3 +17,5 @@ data:
   - url: https://github.com/dials/data-files/raw/da38a5de509cb7cb22cc28f439058c66923333d1/ssx_CuNiR_test_data/strong_5.refl
   - url: https://github.com/dials/data-files/raw/70d2cbb1bb5d0678b133039a7e009a623762f0ff/ssx_CuNiR_test_data/indexed.expt
   - url: https://github.com/dials/data-files/raw/70d2cbb1bb5d0678b133039a7e009a623762f0ff/ssx_CuNiR_test_data/indexed.refl
+  - url: https://github.com/dials/data-files/raw/d42250171384b66828803de1b07d1009db2e5472/ssx_CuNiR_test_data/integrated.expt
+  - url: https://github.com/dials/data-files/raw/d42250171384b66828803de1b07d1009db2e5472/ssx_CuNiR_test_data/integrated.refl
diff --git a/dials_data/definitions/dtpb_serial_processed.yml b/dials_data/definitions/dtpb_serial_processed.yml
new file mode 100644
index 0000000..292a45c
--- /dev/null
+++ b/dials_data/definitions/dtpb_serial_processed.yml
@@ -0,0 +1,17 @@
+name: xia2.ssx output of processing a DtpB SSX data set from VMXi
+author: James Beilsten-Edmands (2022)
+license: CC-BY 4.0
+description: >
+  Processed data from DtpB taken on VMXi beamline at Diamond Light
+  Source in Feb 2022, visit nt30330-15, courtesy of Mike Hough and James Sandy.
+
+  The integrated files were generated from xia2.ssx processing of grid
+  scans on two wells of dye-type peroxidase (DtpB). The files were generated
+  with dials version 3.12.0, as described in the entry in dials/data-files.
+  
+
+data:
+  - url: https://github.com/dials/data-files/raw/3849a695d7a136d6e93584ccefc17ef59de8c815/ssx_DtpB_test_data/well39_batch12_integrated.expt
+  - url: https://github.com/dials/data-files/raw/3849a695d7a136d6e93584ccefc17ef59de8c815/ssx_DtpB_test_data/well39_batch12_integrated.refl
+  - url: https://github.com/dials/data-files/raw/3849a695d7a136d6e93584ccefc17ef59de8c815/ssx_DtpB_test_data/well42_batch6_integrated.expt
+  - url: https://github.com/dials/data-files/raw/3849a695d7a136d6e93584ccefc17ef59de8c815/ssx_DtpB_test_data/well42_batch6_integrated.refl
diff --git a/dials_data/definitions/four_circle_eiger.yml b/dials_data/definitions/four_circle_eiger.yml
index 47ac681..27a12f1 100644
--- a/dials_data/definitions/four_circle_eiger.yml
+++ b/dials_data/definitions/four_circle_eiger.yml
@@ -25,19 +25,19 @@ description: >
     CuHF2pyz2PF6b_P_O_02.nxs — a 600-image 120° ω scan.
   
 data:
-  - url: https://zenodo.org/record/6093231/files/01_CuHF2pyz2PF6b_Phi.tar.xz
+  - url: https://zenodo.org/record/6347466/files/01_CuHF2pyz2PF6b_Phi.tar.xz
     files:
       - 01_CuHF2pyz2PF6b_Phi/CuHF2pyz2PF6b_Phi_01_000001.h5
       - 01_CuHF2pyz2PF6b_Phi/CuHF2pyz2PF6b_Phi_01_000002.h5
       - 01_CuHF2pyz2PF6b_Phi/CuHF2pyz2PF6b_Phi_01_meta.h5
       - 01_CuHF2pyz2PF6b_Phi/CuHF2pyz2PF6b_Phi_01.nxs
-  - url: https://zenodo.org/record/6093231/files/02_CuHF2pyz2PF6b_2T.tar.xz
+  - url: https://zenodo.org/record/6347466/files/02_CuHF2pyz2PF6b_2T.tar.xz
     files:
       - 02_CuHF2pyz2PF6b_2T/CuHF2pyz2PF6b_2T_01_000001.h5
       - 02_CuHF2pyz2PF6b_2T/CuHF2pyz2PF6b_2T_01_000002.h5
       - 02_CuHF2pyz2PF6b_2T/CuHF2pyz2PF6b_2T_01_meta.h5
       - 02_CuHF2pyz2PF6b_2T/CuHF2pyz2PF6b_2T_01.nxs
-  - url: https://zenodo.org/record/6093231/files/03_CuHF2pyz2PF6b_P_O.tar.xz
+  - url: https://zenodo.org/record/6347466/files/03_CuHF2pyz2PF6b_P_O.tar.xz
     files:
       - 03_CuHF2pyz2PF6b_P_O/CuHF2pyz2PF6b_P_O_01_000001.h5
       - 03_CuHF2pyz2PF6b_P_O/CuHF2pyz2PF6b_P_O_01_000002.h5
diff --git a/dials_data/definitions/image_examples.yml b/dials_data/definitions/image_examples.yml
index 9f41bc4..4607ac8 100644
--- a/dials_data/definitions/image_examples.yml
+++ b/dials_data/definitions/image_examples.yml
@@ -19,5 +19,6 @@ data:
   - url: https://github.com/dials/data-files/raw/7260492a487e88dde58f8be044b21cef1c7578ef/image_examples/dectris_eiger_master.h5
   - url: https://github.com/dials/data-files/raw/9ecf5a4e6d36a2c77574178d306292dd1c093359/image_examples/Gatan_float32_zero_array_001.dm4.gz
   - url: https://github.com/dials/data-files/raw/f587a81ac82b3c35a757492e92259a0105ee1c2a/image_examples/SACLA-MPCCD-run266702-0-subset-refined_experiments_level1.json
+  - url: https://github.com/dials/data-files/raw/30e7df95d17ef152e1eef6e286a4f8143eac7c73/image_examples/SACLA-MPCCD-run266702-0-subset-known_orientations.expt
   - url: https://github.com/dials/data-files/raw/f587a81ac82b3c35a757492e92259a0105ee1c2a/image_examples/SACLA-MPCCD-run266702-0-subset.h5
   - url: https://github.com/dials/data-files/raw/2fc35b473ea4a9c68279e438d4e39e2662d9dfdb/image_examples/endonat3_001.mar2300
diff --git a/dials_data/definitions/insulin_processed.yml b/dials_data/definitions/insulin_processed.yml
index 570564b..55a86fa 100644
--- a/dials_data/definitions/insulin_processed.yml
+++ b/dials_data/definitions/insulin_processed.yml
@@ -12,15 +12,15 @@ description: >
     $ dials.scale symmetrized.{expt,refl}
 
 data:
-  - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/imported.expt
+  - url: https://github.com/dials/data-files/raw/dab5530ec4e1631dda14e91bb59e8d6368b4dcdf/insulin_processed/imported.expt
   - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/strong.refl
-  - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/indexed.expt
+  - url: https://github.com/dials/data-files/raw/dab5530ec4e1631dda14e91bb59e8d6368b4dcdf/insulin_processed/indexed.expt
   - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/indexed.refl
-  - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/integrated.expt
+  - url: https://github.com/dials/data-files/raw/dab5530ec4e1631dda14e91bb59e8d6368b4dcdf/insulin_processed/integrated.expt
   - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/integrated.refl
-  - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/refined.expt
+  - url: https://github.com/dials/data-files/raw/dab5530ec4e1631dda14e91bb59e8d6368b4dcdf/insulin_processed/refined.expt
   - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/refined.refl
-  - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/scaled.expt
+  - url: https://github.com/dials/data-files/raw/dab5530ec4e1631dda14e91bb59e8d6368b4dcdf/insulin_processed/scaled.expt
   - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/scaled.refl
-  - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/symmetrized.expt
+  - url: https://github.com/dials/data-files/raw/dab5530ec4e1631dda14e91bb59e8d6368b4dcdf/insulin_processed/symmetrized.expt
   - url: https://github.com/dials/data-files/raw/4ccd4932291c258316530597e40e252d1f5e37c2/insulin_processed/symmetrized.refl
diff --git a/dials_data/definitions/lysozyme_JF16M_4img.yml b/dials_data/definitions/lysozyme_JF16M_4img.yml
new file mode 100644
index 0000000..c2b9741
--- /dev/null
+++ b/dials_data/definitions/lysozyme_JF16M_4img.yml
@@ -0,0 +1,15 @@
+name: 4 image lysozyme dataset recorded on the Jungfrau 16M detector at SwissFEL and formatted as a NeXus file
+author: Aaron Brewster, Meitian Wang, and Herbert J Bernstein (2022)
+license: CC-BY 4.0
+url: https://doi.org/10.5281/zenodo.7005251
+description: >
+  This is a 4-image lysozyme dataset derived from https://doi.org/10.5281/zenodo.3352357.
+  Data were provided by Meitian Wang at PSI and the master file was revised in May 2020 for
+  full NXmx compliance. These 4 images can be processed by DIALS using the commands in the
+  linked dataset above. The images were rounded to integer values and compressed to save
+  file space.
+
+data:
+  - url: https://zenodo.org/record/7017221/files/lyso009a_0087.JF07T32V01_master_4img.h5
+  - url: https://zenodo.org/record/7017221/files/lyso009a_0087.JF07T32V01_master_4img_imported.expt
+
diff --git a/dials_data/definitions/lysozyme_JF16M_4img_spectra.yml b/dials_data/definitions/lysozyme_JF16M_4img_spectra.yml
new file mode 100644
index 0000000..3230e2a
--- /dev/null
+++ b/dials_data/definitions/lysozyme_JF16M_4img_spectra.yml
@@ -0,0 +1,10 @@
+name: 4 image cytochrome dataset with per-shot spectra recorded on the Jungfrau 16M detector at SwissFEL in NeXus format
+author: Aaron Brewster (2022)
+license: CC-BY 4.0
+url: https://doi.org/10.5281/zenodo.7011529
+description: >
+  Cytochrome data recorded in 2019 on the Jungfrau 16M detector at SwissFEL, including per-shot spectra measured for each SASE pulse.
+
+data:
+  - url: https://zenodo.org/record/7011530/files/run_000795.JF07T32V01_master_spectrum_4img.h5
+
diff --git a/dials_data/definitions/quartz_processed.yml b/dials_data/definitions/quartz_processed.yml
new file mode 100644
index 0000000..1e07bfa
--- /dev/null
+++ b/dials_data/definitions/quartz_processed.yml
@@ -0,0 +1,29 @@
+name: DIALS output of processing a quartz crystal electron diffraction data set
+author: David Waterman (2022)
+license: CC-BY 4.0
+url: https://github.com/dials/data-files/blob/master/quartz_processed/README.md
+description: >
+  Files created by DIALS data processing of quartz electron diffraction
+  data from https://zenodo.org/record/5579793. The
+  processing script used to produce these files with
+  DIALS 3.dev.698-gba04be81f is included for reference. The integrated
+  intensities are exported in cif_pets format, allowing absolute
+  structure determination by Jana2020.
+data:
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/dials_dyn.cif_pets
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/dials_prf.cif_pets
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/imported.expt
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/indexed.expt
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/indexed.refl
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/integrated.expt
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/integrated.refl
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/process-quartz.sh
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/README.md
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/refined.expt
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/refined.refl
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/reindexed.expt
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/scaled.expt
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/scaled.refl
+  - url: https://github.com/dials/data-files/raw/bc91cf2c7fd659962ea543fd26afe68f0c71caa4/quartz_processed/strong.refl
+
+
diff --git a/dials_data/definitions/refinement_test_data.yml b/dials_data/definitions/refinement_test_data.yml
new file mode 100644
index 0000000..d2402af
--- /dev/null
+++ b/dials_data/definitions/refinement_test_data.yml
@@ -0,0 +1,123 @@
+name: Refinement test data
+author: David Waterman (2022)
+license: CC-BY 4.0
+description: >
+  Data files used in DIALS tests, mainly for geometry refinement but
+  also used elsewhere.
+
+  These files were originally located in the dials_regression
+  repository. They are derived files created by processing with various
+  versions of DIALS; the exact version information is not generally
+  available.
+
+
+data:
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/centroid/experiments_XPARM_REGULARIZED.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/hierarchy_test/datablock.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/cspad_refinement/regression_experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/cspad_refinement/refine.phil
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/cspad_refinement/cspad_refined_experiments_step6_level2_300.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/cspad_refinement/cspad_reflections_step7_300.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/outlier_rejection/residuals.dat
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/centroid_outlier/residuals.refl
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/i23_as_24_panel_barrel/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/i23_as_24_panel_barrel/indexed.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_stills/combined_experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_stills/regression_experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_stills/combined_reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_sweep_one_sample/glucose_isomerase/SWEEP1/index/sv_refined_experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_041/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_041/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_023/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_023/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_024/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_024/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_022/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_022/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_047/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_047/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_020/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_020/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_044/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_044/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_043/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_043/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_017/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_017/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_037/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_037/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_018/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_018/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_048/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_048/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_025/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_025/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_032/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_032/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_009/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_009/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_004/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_004/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_046/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_046/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_019/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_019/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_033/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_033/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_029/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_029/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_013/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_013/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_028/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_028/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_038/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_038/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_031/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_031/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_014/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_014/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_035/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_035/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_011/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_011/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_012/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_012/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_042/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_042/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_005/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_005/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_030/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_030/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_040/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_040/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_002/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_002/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_006/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_006/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_007/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_007/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_003/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_003/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_021/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_021/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_036/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_036/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_026/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_026/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_027/reflections.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/data/sweep_027/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/multi_narrow_wedges/regression_experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/xfel_metrology/benchmark_level2d.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/xfel_metrology/refine.phil
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/xfel_metrology/benchmark_level2d.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/varying_beam_direction/refined_static.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/varying_beam_direction/refined_static.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/dials-423/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/dials-423/subset.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/i04_weak_data/experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/i04_weak_data/regression_experiments.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/i04_weak_data/indexed_strong.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/centroid/experiments_XPARM_REGULARIZED.json
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/centroid/spot_1000_xds.pickle
+  - url: https://github.com/dials/data-files/raw/41bdd5db9ef1407869eb9c3a06c4e20796d82779/refinement_test_data/centroid/spot_all_xds.pickle
+
diff --git a/dials_data/download.py b/dials_data/download.py
index 70d9ec1..870e6ba 100644
--- a/dials_data/download.py
+++ b/dials_data/download.py
@@ -15,6 +15,8 @@ from urllib.parse import urlparse
 
 import py.path
 import requests
+from requests.adapters import HTTPAdapter
+from urllib3.util.retry import Retry
 
 import dials_data.datasets
 
@@ -74,22 +76,6 @@ def _file_lock(file_handle):
             _platform_unlock(file_handle)
 
 
-@contextlib.contextmanager
-def download_lock(target_dir: Optional[Path]):
-    """
-    Obtains a (cooperative) lock on a lockfile in a target directory, so only a
-    single (cooperative) process can enter this context manager at any one time.
-    If the lock is held this will block until the existing lock is released.
-    """
-    if not target_dir:
-        yield
-        return
-    target_dir.mkdir(parents=True, exist_ok=True)
-    with target_dir.joinpath(".lock").open(mode="w") as fh:
-        with _file_lock(fh):
-            yield
-
-
 def _download_to_file(session: requests.Session, url: str, pyfile: Path):
     """
     Downloads a single URL to a file.
@@ -118,17 +104,18 @@ def fetch_dataset(
     read_only: bool = False,
     verbose: bool = False,
     pre_scan: bool = True,
-    download_lockdir: Optional[Path] = None,
-) -> Union[bool, Any]:
+) -> Union[bool, dict[str, Any]]:
     """Check for the presence or integrity of the local copy of the specified
     test dataset. If the dataset is not available or out of date then attempt
     to download/update it transparently.
 
-    :param verbose:          Show everything as it happens.
     :param pre_scan:         If all files are present and all file sizes match
                              then skip file integrity check and exit quicker.
     :param read_only:        Only use existing data, never download anything.
                              Implies pre_scan=True.
+    :param verbose:          Show everything as it happens.
+    :param verify:           Check all files against integrity information and
+                             fail on any mismatch.
     :returns:                False if the dataset can not be downloaded/updated
                              for any reason.
                              True if the dataset is present and passes a
@@ -171,20 +158,44 @@ def fetch_dataset(
         if read_only:
             return False
 
-    # Acquire lock if required as files may be downloaded/written.
-    with download_lock(download_lockdir):
-        _fetch_filelist(filelist)
+    # Obtain a (cooperative) lock on a dataset-specific lockfile, so only one
+    # (cooperative) process can enter this context at any one time. The lock
+    # file sits in the directory above the dataset directory, so as not to
+    # interfere with dataset files.
+    target_dir.mkdir(parents=True, exist_ok=True)
+    with target_dir.with_name(f".lock.{dataset}").open(mode="w") as fh:
+        with _file_lock(fh):
+            verification_records = _fetch_filelist(filelist)
 
+    # If any errors occurred during download then don't trust the dataset.
+    if verify and not all(verification_records):
+        return False
+
+    integrity_info["verify"] = verification_records
     return integrity_info
 
 
-def _fetch_filelist(filelist: list[dict[str, Any]]) -> None:
+def _fetch_filelist(filelist: list[dict[str, Any]]) -> list[dict[str, Any] | None]:
     with requests.Session() as rs:
+        retry_adapter = HTTPAdapter(
+            max_retries=Retry(
+                total=5,
+                backoff_factor=1,
+                raise_on_status=True,
+                status_forcelist={413, 429, 500, 502, 503, 504},
+            )
+        )
+        rs.mount("http://", retry_adapter)
+        rs.mount("https://", retry_adapter)
+
         pool = concurrent.futures.ThreadPoolExecutor(max_workers=5)
-        pool.map(functools.partial(_fetch_file, rs), filelist)
+        results = pool.map(functools.partial(_fetch_file, rs), filelist)
+        return list(results)
 
 
-def _fetch_file(session: requests.Session, source: dict[str, Any]) -> None:
+def _fetch_file(
+    session: requests.Session, source: dict[str, Any]
+) -> dict[str, Any] | None:
     valid = False
     if source["file"].is_file():
         # verify
@@ -202,18 +213,21 @@ def _fetch_file(session: requests.Session, source: dict[str, Any]) -> None:
         downloaded = True
 
     # verify
+    validation_record = {
+        "size": source["file"].stat().st_size,
+        "hash": file_hash(source["file"]),
+    }
     valid = True
     if source["verify"]:
-        if source["verify"]["size"] != source["file"].stat().st_size:
+        if source["verify"]["size"] != validation_record["size"]:
             print(
                 f"File size mismatch on {source['file']}: "
-                f"{source['file'].stat().st_size}, expected {source['verify']['size']}"
+                f"{validation_record['size']}, expected {source['verify']['size']}"
             )
-        elif source["verify"]["hash"] != file_hash(source["file"]):
+            valid = False
+        elif source["verify"]["hash"] != validation_record["hash"]:
             print(f"File hash mismatch on {source['file']}")
-    else:
-        source["verify"]["size"] = source["file"].stat().st_size
-        source["verify"]["hash"] = file_hash(source["file"])
+            valid = False
 
     # If the file is a tar archive, then decompress
     if source["files"]:
@@ -243,6 +257,11 @@ def _fetch_file(session: requests.Session, source: dict[str, Any]) -> None:
                                 f"in tar archive {source['file']}"
                             )
 
+    if valid:
+        return validation_record
+    else:
+        return None
+
 
 class DataFetcher:
     """A class that offers access to regression datasets.
@@ -320,7 +339,6 @@ class DataFetcher:
                 test_data,
                 pre_scan=True,
                 read_only=False,
-                download_lockdir=self._target_dir,
             )
         if data_available:
             return self._target_dir / test_data
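
Note on the download.py changes above: the new code mounts a urllib3 Retry policy on the requests session and records a size/hash pair per downloaded file. The following is a minimal, standalone sketch of that retry pattern only; it is not part of the dials_data API, and the function name and URL are made up for illustration.

    import requests
    from requests.adapters import HTTPAdapter
    from urllib3.util.retry import Retry

    def make_retrying_session(total: int = 5, backoff: float = 1.0) -> requests.Session:
        """Return a requests.Session that retries transient HTTP failures."""
        session = requests.Session()
        adapter = HTTPAdapter(
            max_retries=Retry(
                total=total,
                backoff_factor=backoff,  # exponential back-off between attempts
                raise_on_status=True,  # raise after the final failed attempt
                status_forcelist={413, 429, 500, 502, 503, 504},
            )
        )
        # Apply the same retry policy to both plain and TLS connections.
        session.mount("http://", adapter)
        session.mount("https://", adapter)
        return session

    if __name__ == "__main__":
        with make_retrying_session() as session:
            # Hypothetical URL, shown only to demonstrate the call pattern.
            response = session.get("https://example.com/data.bin", stream=True, timeout=60)
            response.raise_for_status()

With a session like this, each downloaded file can then be checked against the size and hash entries recorded in the hashinfo YAML files that follow.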
diff --git a/dials_data/hashinfo/4fluoro_cxi.yml b/dials_data/hashinfo/4fluoro_cxi.yml
new file mode 100644
index 0000000..7619c62
--- /dev/null
+++ b/dials_data/hashinfo/4fluoro_cxi.yml
@@ -0,0 +1,5 @@
+definition: cec7e819d3a91eb58a282034eff84ede1fbf07e0f7ff52915e5c08f4d6217c19
+formatversion: 1
+verify:
+- hash: e96817a470fcd73b758e86b7cc6b9b2c1309e6321d708690911a58650c45990b
+  size: 243302684
diff --git a/dials_data/hashinfo/README.MD b/dials_data/hashinfo/README.MD
deleted file mode 100644
index f344793..0000000
--- a/dials_data/hashinfo/README.MD
+++ /dev/null
@@ -1,5 +0,0 @@
-Dataset hash information
-========================
-
-This directory is maintained by automated tools.
-You should not make any changes here yourself.
diff --git a/dials_data/hashinfo/aluminium_standard.yml b/dials_data/hashinfo/aluminium_standard.yml
index bf6acaf..6371f2d 100644
--- a/dials_data/hashinfo/aluminium_standard.yml
+++ b/dials_data/hashinfo/aluminium_standard.yml
@@ -1,4 +1,4 @@
-definition: 4eaca4076872da390c5db4ca7e545159674de804ca8639cb38a99db05651e80b
+definition: 3a4add516a6156e54c0b3069ea74867a9d3104c4dcbacfffd95d204c50f59785
 formatversion: 1
 verify:
 - hash: 1438d6a48e3d6c9086d0d5d24ab677b88564a7208276a146e543ff0309fd29b3
diff --git a/dials_data/hashinfo/cunir_serial.yml b/dials_data/hashinfo/cunir_serial.yml
index 606a50d..0d5505a 100644
--- a/dials_data/hashinfo/cunir_serial.yml
+++ b/dials_data/hashinfo/cunir_serial.yml
@@ -1,4 +1,4 @@
-definition: 5e499205e846d62b787bc8555ba0774f59b368086534be254a493d895a154a30
+definition: f345653a884251cf972677e3ca25d852be762aead4f74659e64e5ae3c1cca317
 formatversion: 1
 verify:
 - hash: 890da105dc7ef00bf59b1095d1f671f0ca3bf1fea8dd3ff369ecb9c424880473
@@ -11,3 +11,9 @@ verify:
   size: 6229967
 - hash: 92d68ed019bc50f9f480d18acf4fe3345f6bd4343674a20f56e41d2bc1b8e71e
   size: 6229971
+- hash: 87486a542b050f0872853a4acdd25a8dd6c58cf8f68914ad50576fe94171ab0e
+  size: 631719
+- hash: b47cf43312f14e6ad567fcc5ddca6025bc2abd3a8cb3e8b3f68e84e7b31da2ce
+  size: 8159030
+- hash: e6719e3947e60e0aa400d588ecc339ac7c606aa5d97a24afd6a9b27ece853c36
+  size: 714975
diff --git a/dials_data/hashinfo/cunir_serial_processed.yml b/dials_data/hashinfo/cunir_serial_processed.yml
index 65bcd64..08e7401 100644
--- a/dials_data/hashinfo/cunir_serial_processed.yml
+++ b/dials_data/hashinfo/cunir_serial_processed.yml
@@ -1,4 +1,4 @@
-definition: 16b3f8545250a8fb50634d426f23c4d914d1896fd7fab31420ae10854c22e30f
+definition: dca09c839750f39409699fcbe91889b52a220608ac19ecdb935c254bedb12378
 formatversion: 1
 verify:
 - hash: dca8ecd27f43c0140fbbfc946e7fff5931a829a3ab9358f351d2908fce536806
@@ -13,3 +13,7 @@ verify:
   size: 22264
 - hash: 8336e3ebf9d472225c1aa4fa80afb380f3521265f15746a1742b65f5c1ead0c7
   size: 148385
+- hash: a6a34f0f725cba86cf6daf5aa5bd2f1a324570bf4b259506560437d1bd6fb481
+  size: 27688
+- hash: cd5799e897fd8710d6a8dc984754444ca7bf9410e7e58e41640e6d8fda8695c7
+  size: 1712773
diff --git a/dials_data/hashinfo/dtpb_serial_processed.yml b/dials_data/hashinfo/dtpb_serial_processed.yml
new file mode 100644
index 0000000..f4508ef
--- /dev/null
+++ b/dials_data/hashinfo/dtpb_serial_processed.yml
@@ -0,0 +1,11 @@
+definition: 5a8f934115b6e7ba1bd5a829d5074513c0fcbdecff2f6b29b15e970e498b4a24
+formatversion: 1
+verify:
+- hash: c8db32a2802cad53548edd7fd4a84df080f41502a2a82d75560089c9177ba6f4
+  size: 64536
+- hash: 98a87ba8ba7843ad9e34c109d4b6a18ebf4f11b09bfb82ddbeb81b34ada15661
+  size: 8130377
+- hash: e7b7f7f1ae5d5cb99f6bc6e21cf00a0583d118c229770e705148a48caef1852f
+  size: 76592
+- hash: 16ad47bc607f34bfac501d7f2e7f6d5aeaecd9082d49c0bb571eeeaa5dd57a64
+  size: 10655254
diff --git a/dials_data/hashinfo/four_circle_eiger.yml b/dials_data/hashinfo/four_circle_eiger.yml
index 5e8c2c8..9f6cf46 100644
--- a/dials_data/hashinfo/four_circle_eiger.yml
+++ b/dials_data/hashinfo/four_circle_eiger.yml
@@ -1,9 +1,9 @@
-definition: 26dad0598c500ceb4a535234ce0515dc9562b861dd3573fd671785713ecdf44d
+definition: 840461a64431a4d3d8db523105924fc36363c8fbb7f804b4ee745d8c2468ec26
 formatversion: 1
 verify:
-- hash: 13a4088f838f20b4959e5607d799bb6f418d35f893f2f8b4effc01ecf0cf1c9c
-  size: 382763416
-- hash: 496f96c265a091eb151d8fc25e352e297be4257fbacda035403036094a8f0b4c
-  size: 376888156
-- hash: d4eda31c0a9285e4f064298ea7001a6d5531ddd00ab2b8cee644bb9520738cb5
-  size: 509819364
+- hash: b5681250f91d78da28b7dec81b14952e3584a48b6c123f002b4b67c55025c905
+  size: 382763432
+- hash: 8038e64ef32afe4e0e4854d2123cba1576046aeecab74c6dd1b74d205248bee3
+  size: 376889048
+- hash: 79e472dd3611bab714126176ee813f6a43dc41d4fccbd2f137e0c7192bc2768a
+  size: 509817940
diff --git a/dials_data/hashinfo/image_examples.yml b/dials_data/hashinfo/image_examples.yml
index 7d4ed32..3fc207d 100644
--- a/dials_data/hashinfo/image_examples.yml
+++ b/dials_data/hashinfo/image_examples.yml
@@ -1,4 +1,4 @@
-definition: 663d71414bc6007bd36bc9c249d0b5f16c70c0ccd1c3ac1e8bf593e77d8ce176
+definition: d6189969684d3c9c5fa19e51d2962f24c76f487cf3f223f292c9048323ccb90c
 formatversion: 1
 verify:
 - hash: 91a7545b2baff2388a80033d88a948d277b913f923a54b7d0c4100df3122fcc2
@@ -25,6 +25,8 @@ verify:
   size: 238496
 - hash: fc26cb2308578ed38f38ed00092b3221d55ac44c1b0e90f80ef92eb490b25f45
   size: 9657
+- hash: b0fa7adcdf731a48d6b9897eaf1992af046a3e8769a4be1b291074656215c873
+  size: 55825
 - hash: e913b3141201347962744a9fa27080711db5d76c9b1e254fcfb28b51b6265006
   size: 7751177
 - hash: ebf8470b43774e9be5d215da3944d952a071728ef34cbe5e645deb0f5461a286
diff --git a/dials_data/hashinfo/insulin_processed.yml b/dials_data/hashinfo/insulin_processed.yml
index 37b14b7..59f89f1 100644
--- a/dials_data/hashinfo/insulin_processed.yml
+++ b/dials_data/hashinfo/insulin_processed.yml
@@ -1,27 +1,27 @@
-definition: 5ac9b218995d9a84d216fc977ce5e49bf5d2692a2476edbbd3880ed8a6826b51
+definition: 6f8188c95486befbfeaf65c71b5e2c1604eebad929e2c534e7b84d89b6a3218c
 formatversion: 1
 verify:
-- hash: 5054edb58f00f0cd7488995f0aa0ccb56173b83f4c2162bd26c7d3f9571cc516
-  size: 5072
+- hash: a25389d7d1c006a8b2ee3e45fb459d08666fc6e9bedf63dc26fdaec4238abb7c
+  size: 5083
 - hash: 6caf12cdf76c49f69d07b25525e3785f88d3dc8cc6430c2e58a8eec749f88274
   size: 22129777
-- hash: 34bd9d49a5d49ab84f7035788ec844170efcc7ccebf2d79e7b703387fc781c3f
-  size: 8307
+- hash: bb511704aff8d9a07b88ce12abcf40d8bc8a0df6cb2eedeb5daebc03662a2105
+  size: 8318
 - hash: 68b4fa52cd71e322e884fea8bab767c94f42cfc740a721c4b097508e01aee869
   size: 25806186
-- hash: d52a91871724a8186b7cbab546be758f92351990a666816f7802b8874b83a0d6
-  size: 22970
+- hash: 58b7dc99552d7b61e336b7c22776254543a0c368ca00a9a494b50f94913e659f
+  size: 22981
 - hash: 7b1c6a3493f4daffca7a6f7b4a277b5e74ad550a88df3f90e8c3bdc3371fb6b7
   size: 19802366
-- hash: 1a817ac9c096e388ba317357d5950cb45d784d31d2fe85b1d9e4603cfcf920d5
-  size: 22807
+- hash: a0e5282b0cb5d14e38a96115c729aa9157ff29c3bcb7e5bcda1f8e24b2e3bffe
+  size: 22818
 - hash: 2aebab9cba3123cb92463106980a7915da8133d15ba21634841142965f8c6430
   size: 25806186
-- hash: 57dca301ef4449174b6f5301621866b9bb3b9a74edf1a351802ca345cf96a207
-  size: 22394
+- hash: e6e1dce6e99d8aa35583df7bc92ab15575efaa209091ebc4b7a759e9402a92bc
+  size: 22405
 - hash: f761340c3235aac6b4fe01eeee0acf0f67dfddb4e0c2da4f7b47f07003f0015c
   size: 22242390
-- hash: 3ae17d8824e9fe22a1b8ff4aa7e67e19c26c298a194cf5de298b408afcf7d0a0
-  size: 20382
+- hash: fd99a5eece769e359ea6ee11dfb833c3e349b601f84299f02a878b5068251451
+  size: 20393
 - hash: 4c7257fc2a1068653d2a28c0d1bbde6ebad273abeaa9a53a9c9f6baaebc81ae1
   size: 19802366
diff --git a/dials_data/hashinfo/lysozyme_JF16M_4img.yml b/dials_data/hashinfo/lysozyme_JF16M_4img.yml
new file mode 100644
index 0000000..6f2052f
--- /dev/null
+++ b/dials_data/hashinfo/lysozyme_JF16M_4img.yml
@@ -0,0 +1,7 @@
+definition: 1c3d901a065b4085aff516157a6617558ca907a7ca82cab019123b79ba5be5e6
+formatversion: 1
+verify:
+- hash: 4070fd3c456225fcdbe3be14701705fe1c3b663b5118a335f9e69d7c04f5d048
+  size: 94937893
+- hash: 393a5cf71361889cef0b91d3a273d294364b403384bc3f0e34328cda2e1ada16
+  size: 702058
diff --git a/dials_data/hashinfo/lysozyme_JF16M_4img_spectra.yml b/dials_data/hashinfo/lysozyme_JF16M_4img_spectra.yml
new file mode 100644
index 0000000..9dd9c73
--- /dev/null
+++ b/dials_data/hashinfo/lysozyme_JF16M_4img_spectra.yml
@@ -0,0 +1,5 @@
+definition: 0d4ae8765cac31e8b6872689ae973abe8e203f62f2e07ee6fd9d07f0980325d5
+formatversion: 1
+verify:
+- hash: c2cb030bf32ba451dd51238d1da55c4b6dc60081078aa34470e724f95af9b374
+  size: 115491620
diff --git a/dials_data/hashinfo/quartz_processed.yml b/dials_data/hashinfo/quartz_processed.yml
new file mode 100644
index 0000000..1448d6e
--- /dev/null
+++ b/dials_data/hashinfo/quartz_processed.yml
@@ -0,0 +1,33 @@
+definition: 75184e1f17b8ca695487ca696925c7e6c95b72bac7068a677e7049646ff540af
+formatversion: 1
+verify:
+- hash: 73c763c82bcc36c77f393431819d33a4eff6936819cd73ce68c5c93b5cf03c09
+  size: 351608
+- hash: 6cf1067ec24a5ca61dfa2fcd3cd0dc9c4529bdcf89011a6e46d9be7ee77d8aa4
+  size: 351608
+- hash: 7b8f9bbfd6019f16e4d08f8dfca9674528bfa06a28cfebe972fa90f24ff6aa58
+  size: 6121
+- hash: f029f64f08a718058e5b5f5aa29c0cf5b28549f8085e46b66435c06f0003b319
+  size: 9330
+- hash: 81b9733c29413208cbbcb0092a7c0316da0eb0708d9158044886b12ac15aa6d7
+  size: 831364
+- hash: b4fd266c23a1bb49eefeaae59084e8f8085dcb0aafabecd3362cda11a16ec434
+  size: 40185
+- hash: d25d7033b2b60622b8f34e2ff08592ab1a7a6264b87afd16f82980e303d03065
+  size: 604462
+- hash: 68eae685fa608e0171505f2738b7cd0f6fcbc112b8b928a5aab7c1057481622a
+  size: 1976
+- hash: f8797d3d2517b97627eb468aff8534405e71848dc752c95ac06346ff2d162d22
+  size: 665
+- hash: 7a8ba47d06a5bb2cca8c2324e4796eb0c49f5a156cbd74a03e49e2a1da62276a
+  size: 40025
+- hash: 45b6d035a9264db54353b045aa2d068f15ddc7e8bcf6293a034d46451977c05f
+  size: 831364
+- hash: f18838ded14632f99bf873c1cdcf744aaccd0671eadf00edfcae34ad911e34df
+  size: 6705
+- hash: 3ba9601c2b5f8cc11bce372c45614073f20afd669690a36996803661a4e28b79
+  size: 44105
+- hash: c3ea883ad2f9e3045693ebd9f6ad2d823728c8ad6a1cbcd073ddf1246d7d0eb6
+  size: 657410
+- hash: cf9d34faaaac5c7e90f248f966ff9739c2a4d1e381e82b7c5daa7fd11c39eb1e
+  size: 749779
diff --git a/dials_data/hashinfo/refinement_test_data.yml b/dials_data/hashinfo/refinement_test_data.yml
new file mode 100644
index 0000000..a58f406
--- /dev/null
+++ b/dials_data/hashinfo/refinement_test_data.yml
@@ -0,0 +1,219 @@
+definition: f20d69d2aea06c137668dcc3c6077d4eb4d95aebf8fa2d88fc90fd452d8314ab
+formatversion: 1
+verify:
+- hash: 4adfe355f85ce2863eb02950c6680b59e925d83aef171eadcfb6735ccef24714
+  size: 26867
+- hash: fb7a0aadc9a48f42dd5a53484d6855ef3d5447c6ac72f7f7ac15820040b12652
+  size: 79530
+- hash: 92e3bdb5939d0c84b71dc621f5aaa1c26b09a5224dda251f8eb8c95343b500cc
+  size: 520366
+- hash: 44a935b80ccabaf1a4725aafbe30d9238111515766285cf4244984acbf4851af
+  size: 643
+- hash: 751853f649847e47dd8d63ec1b4bb089604df684eeebf6198d00bf16158bfbce
+  size: 520349
+- hash: d3ea3c868a926c1e965332d8e5d97ec29c9267c7657a8ed782a0feaf80bbb297
+  size: 38173521
+- hash: 8526ffd521c01daa8b287fb2d5f8390086e59c4bf019572818944bd4806e424e
+  size: 576207
+- hash: 59dc6d5d7df49ec49e6b65628cbe6346c9bd9c16ccb17297b164d59a731f8157
+  size: 78067
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: ea35dd2edbe56722c553fb0f4403aaf9135c45f47afb9f579bac848bd5236467
+  size: 42358074
+- hash: 87e7a913a76babbb75743d6a1f2588d439be2c865beda70fc58598a9d6d69a9c
+  size: 79214
+- hash: 92e3bdb5939d0c84b71dc621f5aaa1c26b09a5224dda251f8eb8c95343b500cc
+  size: 520366
+- hash: 199247aeb958a088cea06c27b10ebb4ab79579efc3ccf0a909396c20eee4be1f
+  size: 264954
+- hash: 123c5d48ad5d380281c6c2d250c367871c9b8976777a6f6a3cc5be4dfae357aa
+  size: 661611
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: b6533fb27f07754eeb4635103203d4aa8ef417d45c7bf4c4e13cec0c4d553aad
+  size: 43744
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: f5c64b384b0fa143e8f8286dbc0a97c0023fa54032ae6c871975b6be25b4b2d7
+  size: 101306
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: 92e3bdb5939d0c84b71dc621f5aaa1c26b09a5224dda251f8eb8c95343b500cc
+  size: 520366
+- hash: fbcab5dbf6f58c2a86cdc5a9a4431e5b8b5d4f32b46c4590057883eb3b158b45
+  size: 3156033
+- hash: 44a935b80ccabaf1a4725aafbe30d9238111515766285cf4244984acbf4851af
+  size: 643
+- hash: 05d6ecfb8a160b3628bd64262fb9b6805328aae7fcb8002fe1fa397d1f94f178
+  size: 375269
+- hash: c5753dc9217fe53c0ba0214d4761e82a778fb0a1053dea4188abd6ece926e1e0
+  size: 19382
+- hash: 7ee8708b301f7bd4dbd71232a8d7653375ce33026826aff725c30115070242c7
+  size: 24013584
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: 5ec5b42c027476dc2f0fb39aa3fdb5bfeeddaae86ad68eb813f9aec68181c736
+  size: 3988042
+- hash: 793acb9083b96a96b3e41518abe8b0679669cb08bf79a36e78596d0f59c99bec
+  size: 69417
+- hash: 92e3bdb5939d0c84b71dc621f5aaa1c26b09a5224dda251f8eb8c95343b500cc
+  size: 520366
+- hash: ac825426d951983a83615f69ab5a5f27d151d492f1cb3a85d009fdcaf0e99e16
+  size: 24482245
+- hash: 4adfe355f85ce2863eb02950c6680b59e925d83aef171eadcfb6735ccef24714
+  size: 26867
+- hash: 1c79579d42f50829e709480d2a5c4b0f8604408dea5a63134995e5a96b413ad1
+  size: 238547
+- hash: f31ebc7c8a6529eff7ad3c10e7b36a37e266c412af9331cefadf57cfc210c5c9
+  size: 6424911
diff --git a/dials_data/hashinfo/trypsin_multi_lattice.yml b/dials_data/hashinfo/trypsin_multi_lattice.yml
new file mode 100644
index 0000000..ab8273f
--- /dev/null
+++ b/dials_data/hashinfo/trypsin_multi_lattice.yml
@@ -0,0 +1,57 @@
+definition: 2adcdef8cb1e03db3383be7736e2107f4fe0c236eb839f1a7518db71f3ec4e16
+formatversion: 1
+verify:
+- hash: e3d716094ee85fd3f1c90167a615f9940ede621d816fdbae9abbfb950fc5f043
+  size: 74765
+- hash: 689ec7d122f6751be52fe39ac21f13e8d9dbdca2c088e2e95f9129b243f61232
+  size: 45901633
+- hash: 3666cb1d94d849e5a023b7047b38300c11821068b51701d5ced66d9e45220c86
+  size: 4540047
+- hash: 8bd3faa1ce94b49e7361b1cdd2ba66fb63b863132eec18bdaec77a989d32d08b
+  size: 4541793
+- hash: 40f8f19f83f3148c17c59e11c16cbe8691c7a60a8c25c3d9504e3d0448ba0b2b
+  size: 4540415
+- hash: d494956ca01dd621c7c2fb18166e24e62f76ac768768a896eb3430dd143856b5
+  size: 4539556
+- hash: 9a1edcc7111e556f03442794f21276e9b2aefc118d049fb9d0f9a9798a601213
+  size: 4540508
+- hash: 060336c6c8f2cd8ff5117702c4f8d0bc5955d6783a5ebda42cedd03e1a570f00
+  size: 4540381
+- hash: 0982285a31c7b53230c916858434a49c62379ed11c2aacaa4b52057e5f254986
+  size: 4541707
+- hash: e7e122279f656a5264373594197574c6e447041db7fe579993931e86c49765a3
+  size: 4541370
+- hash: 4c677d3eaf00f050efcb51b08502d428c396c0c42ee56b7f53109a699e570900
+  size: 4542988
+- hash: b0e8c5828851a73c338ef09aaf52c3568b2f457ab46598755c6c31820bb1fdb7
+  size: 4541415
+- hash: b1afd5e951ab9adff116f15bc2114ba2f0fe380289807151279361687b0901e1
+  size: 4540755
+- hash: f18bb5d16c20c382ccdcaa647b327ca8452dce68b0c6969373b87d5f93bee10c
+  size: 4541863
+- hash: 87398af2647aec3ae808e95862c13215e14e7219bd8a874973de92436e223a6b
+  size: 4541642
+- hash: cb0749d08a8b3058ab639ad48e621eaf3a76e3058568035240313a3226e7142c
+  size: 4542815
+- hash: 5a45a0a8aa3675b3a6c0e439eae3567e091fff0c8e860c2b856e11a0c2635592
+  size: 4540449
+- hash: 7c457f675f720e3cbd3511a1a4d7b203ab61631102621b44c671e7c123b44907
+  size: 4541383
+- hash: 87612b2b2a49a996f53cc514ef0468d5a81a5a9c0a1b693e4eb2b4d8a99cb5e4
+  size: 4542658
+- hash: d21976d03e5ed0f74f2155e381da9e51fdc1c628ccff7ba0408ce2f09e8765fb
+  size: 4543703
+- hash: a07c1ce725cc6dd44bbe006f89ecd96952e8851921b9bdc6f93a6c9ffff36552
+  size: 4541193
+- hash: 1dea1be2f49c1a33490e520717727dfb8ec182482d38105dd1aaa3de94310da2
+  size: 4542044
+- hash: 5123a856b97c95b41a14d7a56b5bdb7510a05e8a562cf87b6d88a48e03c59a14
+  size: 4540533
+- hash: 3a351bf6bfb4c5b8704bfde633a4e9df3a1b31cc33173b53588c097fd1c76255
+  size: 4540302
+- hash: f3365905d3cceaadcf11aa629835c4b384e6d708029f178cf9aa68aff07bca61
+  size: 4542529
+- hash: de02cbd40e27d94b9ea1d5c653b4af12313af9cadfe449c299e5c606b2585909
+  size: 4537890
+- hash: 424807d1122f41a59474d5dbdd64800e9e8d377ed6e06df39dc87e6b588725d0
+  size: 4533592
diff --git a/docs/Makefile b/docs/Makefile
deleted file mode 100644
index 866bb6e..0000000
--- a/docs/Makefile
+++ /dev/null
@@ -1,20 +0,0 @@
-# Minimal makefile for Sphinx documentation
-#
-
-# You can set these variables from the command line.
-SPHINXOPTS    =
-SPHINXBUILD   = python -msphinx
-SPHINXPROJ    = dials_data
-SOURCEDIR     = .
-BUILDDIR      = _build
-
-# Put it first so that "make" without argument is like "make help".
-help:
-	@$(SPHINXBUILD) -M help "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
-
-.PHONY: help Makefile
-
-# Catch-all target: route all unknown targets to Sphinx using the new
-# "make mode" option.  $(O) is meant as a shortcut for $(SPHINXOPTS).
-%: Makefile
-	@$(SPHINXBUILD) -M $@ "$(SOURCEDIR)" "$(BUILDDIR)" $(SPHINXOPTS) $(O)
diff --git a/docs/_templates/breadcrumbs.html b/docs/_templates/breadcrumbs.html
deleted file mode 100644
index b2e28df..0000000
--- a/docs/_templates/breadcrumbs.html
+++ /dev/null
@@ -1,43 +0,0 @@
-{# Support for Sphinx 1.3+ page_source_suffix, but don't break old builds. #}
-
-{% if page_source_suffix %}
-{% set suffix = page_source_suffix %}
-{% else %}
-{% set suffix = source_suffix %}
-{% endif %}
-
-{% if meta is defined and meta is not none %}
-{% set check_meta = True %}
-{% else %}
-{% set check_meta = False %}
-{% endif %}
-
-<div role="navigation" aria-label="breadcrumbs navigation">
-
-  <ul class="wy-breadcrumbs">
-    {% block breadcrumbs %}
-      <li><a href="{{ pathto(master_doc) }}">{{ _('Docs') }}</a> &raquo;</li>
-        {% for doc in parents %}
-          <li><a href="{{ doc.link|e }}">{{ doc.title }}</a> &raquo;</li>
-        {% endfor %}
-      <li>{{ title }}</li>
-    {% endblock %}
-    {% block breadcrumbs_aside %}
-      <li class="wy-breadcrumbs-aside">
-        <a href="https://github.com/dials/data" class="fa fa-github"> View project on GitHub</a>
-      </li>
-    {% endblock %}
-  </ul>
-
-  {% if (theme_prev_next_buttons_location == 'top' or theme_prev_next_buttons_location == 'both') and (next or prev) %}
-  <div class="rst-breadcrumbs-buttons" role="navigation" aria-label="breadcrumb navigation">
-      {% if next %}
-        <a href="{{ next.link|e }}" class="btn btn-neutral float-right" title="{{ next.title|striptags|e }}" accesskey="n">Next <span class="fa fa-arrow-circle-right"></span></a>
-      {% endif %}
-      {% if prev %}
-        <a href="{{ prev.link|e }}" class="btn btn-neutral float-left" title="{{ prev.title|striptags|e }}" accesskey="p"><span class="fa fa-arrow-circle-left"></span> Previous</a>
-      {% endif %}
-  </div>
-  {% endif %}
-  <hr/>
-</div>
diff --git a/docs/make.bat b/docs/make.bat
deleted file mode 100644
index ce35e8f..0000000
--- a/docs/make.bat
+++ /dev/null
@@ -1,36 +0,0 @@
-@ECHO OFF
-
-pushd %~dp0
-
-REM Command file for Sphinx documentation
-
-if "%SPHINXBUILD%" == "" (
-	set SPHINXBUILD=python -msphinx
-)
-set SOURCEDIR=.
-set BUILDDIR=_build
-set SPHINXPROJ=dials_data
-
-if "%1" == "" goto help
-
-%SPHINXBUILD% >NUL 2>NUL
-if errorlevel 9009 (
-	echo.
-	echo.The Sphinx module was not found. Make sure you have Sphinx installed,
-	echo.then set the SPHINXBUILD environment variable to point to the full
-	echo.path of the 'sphinx-build' executable. Alternatively you may add the
-	echo.Sphinx directory to PATH.
-	echo.
-	echo.If you don't have Sphinx installed, grab it from
-	echo.http://sphinx-doc.org/
-	exit /b 1
-)
-
-%SPHINXBUILD% -M %1 %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
-goto end
-
-:help
-%SPHINXBUILD% -M help %SOURCEDIR% %BUILDDIR% %SPHINXOPTS%
-
-:end
-popd
diff --git a/requirements.txt b/requirements.txt
deleted file mode 100644
index 2b32889..0000000
--- a/requirements.txt
+++ /dev/null
@@ -1,5 +0,0 @@
-importlib_resources==5.4.0
-py==1.11.0
-pytest==7.0.1
-pyyaml==6.0
-requests==2.27.1
diff --git a/requirements_dev.txt b/requirements_dev.txt
deleted file mode 100644
index c4b6d93..0000000
--- a/requirements_dev.txt
+++ /dev/null
@@ -1,8 +0,0 @@
-collective.checkdocs==0.2
-coverage==6.3.2
-importlib_resources==5.4.0
-py==1.11.0
-pytest==7.0.1
-pyyaml==6.0
-requests==2.27.1
-wheel==0.37.1
diff --git a/requirements_doc.txt b/requirements_doc.txt
deleted file mode 100644
index 9a901e1..0000000
--- a/requirements_doc.txt
+++ /dev/null
@@ -1,5 +0,0 @@
-py==1.11.0
-pytest==7.0.1
-pyyaml==6.0
-Sphinx==4.4.0
-sphinx_rtd_theme==1.0.0
diff --git a/setup.cfg b/setup.cfg
index 9b0d33d..05b7d43 100644
--- a/setup.cfg
+++ b/setup.cfg
@@ -2,10 +2,10 @@
 name = dials_data
 version = 2.4.0
 url = https://github.com/dials/data
-project_urls =
-    Bug Tracker = https://github.com/dials/data/issues
-    Documentation = https://dials-data.readthedocs.io/
-    Source Code = https://github.com/dials/data
+project_urls = 
+	Bug Tracker = https://github.com/dials/data/issues
+	Documentation = https://dials-data.readthedocs.io/
+	Source Code = https://github.com/dials/data
 description = DIALS Regression Data Manager
 author = DIALS development team
 author_email = dials-support@lists.sourceforge.net
@@ -13,56 +13,52 @@ long_description = file: README.rst, HISTORY.rst
 long_description_content_type = text/x-rst
 license = BSD 3-Clause License
 license_file = LICENSE
-classifiers =
-    Development Status :: 5 - Production/Stable
-    Intended Audience :: Developers
-    License :: OSI Approved :: BSD License
-    Natural Language :: English
-    Programming Language :: Python :: 3
-    Programming Language :: Python :: 3.7
-    Programming Language :: Python :: 3.8
-    Programming Language :: Python :: 3.9
-    Programming Language :: Python :: 3.10
+classifiers = 
+	Development Status :: 5 - Production/Stable
+	Intended Audience :: Developers
+	License :: OSI Approved :: BSD License
+	Natural Language :: English
+	Programming Language :: Python :: 3
+	Programming Language :: Python :: 3.7
+	Programming Language :: Python :: 3.8
+	Programming Language :: Python :: 3.9
+	Programming Language :: Python :: 3.10
 keywords = dials, dials_data
 
 [options]
 include_package_data = True
-install_requires =
-    importlib_resources>=1.1
-    pytest
-    pyyaml
-    requests
-# importlib; python_version == "2.6"
+install_requires = 
+	importlib_resources>=1.1
+	pytest
+	pyyaml
+	requests
 packages = find:
 python_requires = >=3.7
 zip_safe = False
 
 [options.entry_points]
-console_scripts =
-    dials.data = dials_data.cli:main
-libtbx.dispatcher.script =
-    dials.data = dials.data
-libtbx.precommit =
-    dials_data = dials_data
-pytest11 =
-    dials_data = dials_data.pytest11
+console_scripts = 
+	dials.data = dials_data.cli:main
+libtbx.dispatcher.script = 
+	dials.data = dials.data
+libtbx.precommit = 
+	dials_data = dials_data
+pytest11 = 
+	dials_data = dials_data.pytest11
 
 [options.package_data]
 dials_data = py.typed
 
 [flake8]
-# Black disagrees with flake8 on a few points. Ignore those.
 ignore = E203, E266, E501, W503
-# E203 whitespace before ':'
-# E266 too many leading '#' for block comment
-# E501 line too long
-# W503 line break before binary operator
-
 max-line-length = 88
+select = 
+	E401,E711,E712,E713,E714,E721,E722,E901,
+	F401,F402,F403,F405,F541,F631,F632,F633,F811,F812,F821,F822,F841,F901,
+	W191,W291,W292,W293,W602,W603,W604,W605,W606,
+	C4,
+
+[egg_info]
+tag_build = 
+tag_date = 0
 
-select =
-    E401,E711,E712,E713,E714,E721,E722,E901,
-    F401,F402,F403,F405,F541,F631,F632,F633,F811,F812,F821,F822,F841,F901,
-    W191,W291,W292,W293,W602,W603,W604,W605,W606,
-    # flake8-comprehensions, https://github.com/adamchainz/flake8-comprehensions
-    C4,
diff --git a/tests/test_dials_data.py b/tests/test_dials_data.py
index c0d5ffb..f1308cd 100644
--- a/tests/test_dials_data.py
+++ b/tests/test_dials_data.py
@@ -44,9 +44,7 @@ def test_datafetcher_constructs_py_path(fetcher, root):
         ds = df("dataset")
     assert pathlib.Path(ds).resolve() == pathlib.Path("/tmp/root/dataset").resolve()
     assert isinstance(ds, py.path.local)
-    fetcher.assert_called_once_with(
-        "dataset", pre_scan=True, read_only=False, download_lockdir=mock.ANY
-    )
+    fetcher.assert_called_once_with("dataset", pre_scan=True, read_only=False)
 
     ds = df("dataset", pathlib=False)
     assert pathlib.Path(ds).resolve() == pathlib.Path("/tmp/root/dataset").resolve()
@@ -66,9 +64,7 @@ def test_datafetcher_constructs_path(fetcher, root):
     assert ds == test_path / "dataset"
 
     assert isinstance(ds, pathlib.Path)
-    fetcher.assert_called_once_with(
-        "dataset", pre_scan=True, read_only=False, download_lockdir=mock.ANY
-    )
+    fetcher.assert_called_once_with("dataset", pre_scan=True, read_only=False)
 
     with pytest.warns(DeprecationWarning):
         ds = df("dataset")
@@ -76,6 +72,4 @@ def test_datafetcher_constructs_path(fetcher, root):
     assert not isinstance(
         ds, pathlib.Path
     )  # default is currently to return py.path.local()
-    fetcher.assert_called_once_with(
-        "dataset", pre_scan=True, read_only=False, download_lockdir=mock.ANY
-    )
+    fetcher.assert_called_once_with("dataset", pre_scan=True, read_only=False)

Debdiff

[The following lists of changes regard files as different if they have different names, permissions or owners.]

Files in second set of .debs but not in first

-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/definitions/4fluoro_cxi.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/definitions/dtpb_serial_processed.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/definitions/lysozyme_JF16M_4img.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/definitions/lysozyme_JF16M_4img_spectra.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/definitions/quartz_processed.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/definitions/refinement_test_data.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/4fluoro_cxi.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/dtpb_serial_processed.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/lysozyme_JF16M_4img.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/lysozyme_JF16M_4img_spectra.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/quartz_processed.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/refinement_test_data.yml
-rw-r--r--  root/root   /usr/lib/python3/dist-packages/dials_data/hashinfo/trypsin_multi_lattice.yml
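These definition and hashinfo YAML files are installed as package data inside python3-dials-data, so they can be read from the installed package. A hedged sketch using importlib_resources (a declared dependency of dials_data), assuming a version recent enough to provide files(); the file chosen is just one example from the list above:

import importlib_resources
import yaml

# Load the packaged hashinfo for the trypsin_multi_lattice dataset.
resource = importlib_resources.files("dials_data") / "hashinfo" / "trypsin_multi_lattice.yml"
info = yaml.safe_load(resource.read_text())
print(info["formatversion"], len(info["verify"]))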

No differences were encountered in the control files
