mapproxy / 3dd1679
Imported Upstream version 1.10.0 Bas Couwenberg 7 years ago
119 changed file(s) with 5071 addition(s) and 774 deletion(s).
0 <!--- Is this a bug? -->
1 <!--- This issue tracker is only used for tracking bugs. Please use the mailing
2 list, if you have any question or need help: https://mapproxy.org/support -->
3
4 <!--- It is a bug! -->
5 <!--- Please provide a general summary of the issue in the Title above -->
6
7 ## Context
8 <!--- Provide a more detailed introduction to the issue itself, and why you consider it to be a bug -->
9
10 ## Expected Behavior
11 <!--- Tell us what should happen -->
12
13 ## Actual Behavior
14 <!--- Tell us what happens instead -->
15
16 ## Possible Fix
17 <!--- Not obligatory, but suggest a fix or reason for the bug -->
18
19 ## Steps to Reproduce
20 <!--- Provide an unambiguous set of steps to reproduce this bug -->
21 <!--- Include _minimal_ but _complete_ configurations and test requests. -->
22 <!--- Use https://gist.github.com to link to larger configurations. -->
23 1.
24 2.
25 3.
26 4.
27
28 ## Context
29 <!--- How has this bug affected you? What were you trying to accomplish? -->
30
31 ## Your Environment
32 <!--- Include as many relevant details about the environment you experienced the bug in -->
33 * Version used:
34 * Environment name and version (e.g. Python 2.7.5 with mod_wsgi 4.5.9):
35 * Server type and version:
36 * Operating System and version:
1414 .settings
1515 .pydevproject
1616 .tox/
17
18 .idea/
00 language: python
11
22 python:
3 - "2.6"
43 - "2.7"
54 - "3.3"
65 - "3.4"
98 services:
109 - couchdb
1110 - riak
11 - redis-server
1212
1313 addons:
1414 apt:
2828 - libprotoc-dev
2929
3030 env:
31 - MAPPROXY_TEST_COUCHDB=http://127.0.0.1:5984
31 global:
32 - MAPPROXY_TEST_COUCHDB=http://127.0.0.1:5984
33 - MAPPROXY_TEST_REDIS=127.0.0.1:6379
34
35 # do not load /etc/boto.cfg with Python 3 incompatible plugin
36 # https://github.com/travis-ci/travis-ci/issues/5246#issuecomment-166460882
37 - BOTO_CONFIG=/doesnotexist
3238
3339 cache:
3440 directories:
3642
3743 install:
3844 # riak packages are not compatible with Python 3
39 - "if [[ $TRAVIS_PYTHON_VERSION = '2.7' ]]; then pip install --use-mirrors protobuf>=2.4.1 riak==2.2 riak_pb>=2.0; export MAPPROXY_TEST_COUCHDB=http://127.0.0.1:5984; export MAPPROXY_TEST_RIAK_PBC=pbc://localhost:8087; fi"
45 - "if [[ $TRAVIS_PYTHON_VERSION = '2.7' ]]; then pip install protobuf>=2.4.1 riak==2.2 riak_pb>=2.0; export MAPPROXY_TEST_RIAK_PBC=pbc://localhost:8087; fi"
4046 - "pip install -r requirements-tests.txt"
47 - "pip freeze"
4148
4249 script: nosetests mapproxy
2828 - Richard Duivenvoorde
2929 - Stephan Holl
3030 - Steven D. Lander
31 - Tom Payne
31 - Tom Payne
32 - Joseph Svrcek
0 1.10.0 2017-05-18
1 ~~~~~~~~~~~~~~~~~
2
3 Improvements:
4
5 - Support for S3 cache.
6 - Support for the ArcGIS Compact Cache format version 1.
7 - Support for GeoPackage files.
8 - Support for Redis cache.
9 - Support meta-tiles for tile sources with the bulk_meta_tiles option.
10 - mbtiles/sqlite cache: Store multiple tiles in one transaction.
11 - mbtiles/sqlite cache: Make timeout and WAL configurable.
12 - ArcGIS REST source: Improve handling for ImageServer endpoints.
13 - ArcGIS REST source: Support FeatureInfo requests.
14 - ArcGIS REST source: Support min_res and max_res.
15 - Support merging of RGB images with fixed transparency.
16 - Coverages: Clip source requests at coverage boundaries.
17 - Coverages: Build the difference, union or intersection of multiple coverages.
18 - Coverages: Create coverages from webmercator tile coordinates like 05/182/123
19 with expire tiles files.
20 - Coverages: Add native support for GeoJSON (no OGR/GDAL required).
21 - mapproxy-seed: Add --duration, --reseed-file and --reseed-interval options.
22
23 Fixes:
24
25 - Fix level selection for grids with small res_factor.
26 - mapproxy-util scales: Fix for Python 3.
27 - WMS: Fix FeatureInfo precision for transformed requests.
28 - Auth-API: Fix FeatureInfo for layers with limitto.
29 - Fix subpixel transformation deviations with Pillow 3.4 or higher.
30 - mapproxy-seed: Reduce log output, especially in --quiet mode.
31 - mapproxy-seed: Improve tile counter for tile grids with custom resolutions.
32 - mapproxy-seed: Improve saving of the seed progress for --continue.
33 - Fix band-merging when not all sources return an image.
34
35 Other:
36
37 - Python 2.6 is no longer supported.
38
39
40 1.9.1 2017-01-18
41 ~~~~~~~~~~~~~~~~
42
43 Fixes:
44
45 - serve-develop: fixed reloader for Windows installations made
46 with recent pip version (#279)
47
048 1.9.0 2016-07-22
149 ~~~~~~~~~~~~~~~~
250
00 MapProxy is an open source proxy for geospatial data. It caches, accelerates and transforms data from existing map services and serves any desktop or web GIS client.
11
2 .. image:: http://mapproxy.org/mapproxy.png
2 .. image:: https://mapproxy.org/mapproxy.png
33
44 MapProxy is a tile cache, but also offers many new and innovative features like full support for WMS clients.
55
6 MapProxy is actively developed and supported by `Omniscale <http://omniscale.com>`_, it is released under the Apache Software License 2.0, runs on Unix/Linux and Windows and is easy to install and to configure.
6 MapProxy is actively developed and supported by `Omniscale <https://omniscale.com>`_, it is released under the Apache Software License 2.0, runs on Unix/Linux and Windows and is easy to install and to configure.
77
8 Go to http://mapproxy.org/ for more information.
8 Go to https://mapproxy.org/ for more information.
99
10 The documentation is available at: http://mapproxy.org/docs/latest/
10 The documentation is available at: https://mapproxy.org/docs/latest/
1111
3232 def __init__(self, app, global_conf):
3333 self.app = app
3434
35 def __call__(self, environ, start_reponse):
35 def __call__(self, environ, start_response):
3636 if random.randint(0, 1) == 1:
37 return self.app(environ, start_reponse)
37 return self.app(environ, start_response)
3838 else:
39 start_reponse('403 Forbidden',
39 start_response('403 Forbidden',
4040 [('content-type', 'text/plain')])
4141 return ['no luck today']
4242
8585 self.app = app
8686 self.prefix = prefix
8787
88 def __call__(self, environ, start_reponse):
88 def __call__(self, environ, start_response):
8989 # put authorize callback function into environment
9090 environ['mapproxy.authorize'] = self.authorize
91 return self.app(environ, start_reponse)
91 return self.app(environ, start_response)
9292
9393 def authorize(self, service, layers=[], environ=None, **kw):
9494 allowed = denied = False
516516 def __init__(self, app, global_conf):
517517 self.app = app
518518
519 def __call__(self, environ, start_reponse):
519 def __call__(self, environ, start_response):
520520 environ['mapproxy.authorize'] = self.authorize
521 return self.app(environ, start_reponse)
521 return self.app(environ, start_response)
522522
523523 def authorize(self, service, layers=[]):
524524 instance_name = environ.get('mapproxy.instance_name', '')
33 .. versionadded:: 1.2.0
44
55 MapProxy supports multiple backends to store the internal tiles. The default backend is file based and does not require any further configuration.
6
67
78 Configuration
89 =============
2425
2526 The following backend types are available.
2627
28
29 - :ref:`cache_file`
30 - :ref:`cache_mbtiles`
31 - :ref:`cache_sqlite`
32 - :ref:`cache_geopackage`
33 - :ref:`cache_couchdb`
34 - :ref:`cache_riak`
35 - :ref:`cache_redis`
36 - :ref:`cache_s3`
37 - :ref:`cache_compact`
38
39 .. _cache_file:
40
2741 ``file``
2842 ========
2943
5266
5367 .. versionadded:: 1.6.0
5468
69 .. _cache_mbtiles:
5570
5671 ``mbtiles``
5772 ===========
86101 The MBTiles format specification does not include any timestamps for each tile, and the seeding function is therefore limited. If you include any ``refresh_before`` time in a seed task, all tiles will be recreated regardless of the value. The cleanup process does not support any ``remove_before`` times for MBTiles and it always removes all tiles.
87102 Use the ``--summary`` option of the ``mapproxy-seed`` tool.
88103
104 The note about ``bulk_meta_tiles`` for SQLite below applies to MBTiles as well.
105
106 .. _cache_sqlite:
107
89108 ``sqlite``
90109 ===========
91110
92111 .. versionadded:: 1.6.0
93112
94 Use SQLite databases to store the tiles, similar to ``mbtiles`` cache. The difference to ``mbtiles`` cache is that the ``sqlite`` cache stores each level into a separate databse. This makes it easy to remove complete levels during mapproxy-seed cleanup processes. The ``sqlite`` cache also stores the timestamp of each tile.
113 Use SQLite databases to store the tiles, similar to ``mbtiles`` cache. The difference to ``mbtiles`` cache is that the ``sqlite`` cache stores each level into a separate database. This makes it easy to remove complete levels during mapproxy-seed cleanup processes. The ``sqlite`` cache also stores the timestamp of each tile.
95114
96115 Available options:
97116
98117 ``dirname``:
99 The direcotry where the level databases will be stored.
118 The directory where the level databases will be stored.
100119
101120 ``tile_lock_dir``:
102121 Directory where MapProxy should write lock files when it creates new tiles for this cache. Defaults to ``cache_data/tile_locks``.
113132 type: sqlite
114133 directory: /path/to/cache
115134
135
136 .. note::
137
138 .. versionadded:: 1.10.0
139
140 All tiles from a meta-tile request are stored into the SQLite file in one transaction to increase performance. You need to activate the :ref:`bulk_meta_tiles <bulk_meta_tiles>` option to get the same benefit when you are using tiled sources.
141
142 ::
143
144 caches:
145 sqlite_cache:
146 sources: [mytilesource]
147 bulk_meta_tiles: true
148 grids: [GLOBAL_MERCATOR]
149 cache:
150 type: sqlite
151 directory: /path/to/cache
152
153 .. _cache_couchdb:
116154
117155 ``couchdb``
118156 ===========
234272
235273 The ``_attachments``-part is the internal structure of CouchDB where the tile itself is stored. You can access the tile directly at: ``http://localhost:9999/mywms_tiles/mygrid-3-1-2/tile``.
236274
275 .. _cache_riak:
237276
238277 ``riak``
239278 ========
287326 default_ports:
288327 pb: 8087
289328 http: 8098
329
330 .. _cache_redis:
331
332 ``redis``
333 =========
334
335 .. versionadded:: 1.10.0
336
337 Store tiles in a `Redis <https://redis.io/>`_ in-memory database. This backend is useful for short-term caching. A typical use case is a small Redis cache that allows you to benefit from meta-tiling.
338
339 Your Redis database should be configured with ``maxmemory`` and ``maxmemory-policy`` options to limit the memory usage. For example::
340
341 maxmemory 256mb
342 maxmemory-policy volatile-ttl
343
344
345 Requirements
346 ------------
347
348 You will need the `Python Redis client <https://pypi.python.org/pypi/redis>`_. You can install it in the usual way, for example with ``pip install redis``.
349
350 Configuration
351 -------------
352
353 Available options:
354
355 ``host``:
356 Host name of the Redis server. Defaults to ``127.0.0.1``.
357
358 ``port``:
359 Port of the Redis server. Defaults to ``6379``.
360
361 ``db``:
362 Number of the Redis database. Please refer to the Redis documentation. Defaults to ``0``.
363
364 ``prefix``:
365 The prefix added to each tile-key in the Redis cache. Used to distinguish tiles from different caches and grids. Defaults to ``cache-name_grid-name``.
366
367 ``default_ttl``:
368 The default Time-To-Live of each tile in the Redis cache in seconds. Defaults to 3600 seconds (1 hour).
369
370
371
372 Example
373 -------
374
375 ::
376
377 redis_cache:
378 sources: [mywms]
379 grids: [mygrid]
380 cache:
381 type: redis
382 default_ttl: 600
383
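To illustrate how tiles end up in Redis, here is a minimal sketch with the Python ``redis`` client mentioned above. The key layout (``prefix-z-x-y``) and the file name are assumptions for illustration only, not the documented storage format::

    import redis

    r = redis.StrictRedis(host='127.0.0.1', port=6379, db=0)

    # hypothetical key for tile z=4, x=8, y=5; the default prefix is
    # cache-name_grid-name
    key = 'redis_cache_mygrid-4-8-5'

    with open('tile.png', 'rb') as f:
        # default_ttl: 600 -> the tile expires after ten minutes
        r.setex(key, 600, f.read())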
384
385 .. _cache_geopackage:
386
387 ``geopackage``
388 ==============
389
390 .. versionadded:: 1.10.0
391
392 Store tiles in a `geopackage <http://www.geopackage.org/>`_ database. MapProxy creates a tile table if one isn't defined and populates the required metadata fields.
393 This backend is good for datasets that require portability.
394 Available options:
395
396 ``filename``:
397 The path to the geopackage file. Defaults to ``cachename.gpkg``.
398
399 ``table_name``:
400 The name of the table where the tiles should be stored (or retrieved if using an existing cache). Defaults to ``cachename_gridname``.
401
402 ``levels``:
403 Set this to ``true`` to cache each level in a separate geopackage file within a directory. Defaults to ``false``.
404 If set to true, ``filename`` is ignored.
405
406 ``directory``:
407 If ``levels`` is ``true``, use this to specify the directory where the geopackage files are stored.
408
409 You can set ``sources`` to an empty list if you use an existing geopackage file and do not have a source.
410
411 ::
412
413 caches:
414 geopackage_cache:
415 sources: []
416 grids: [GLOBAL_MERCATOR]
417 cache:
418 type: geopackage
419 filename: /path/to/bluemarble.gpkg
420 table_name: bluemarble_tiles
421
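Because a geopackage is a regular SQLite database, you can inspect a cache with Python's standard library. A minimal sketch for the example above, using the standard GeoPackage tiles schema (``zoom_level``, ``tile_column``, ``tile_row``, ``tile_data``)::

    import sqlite3

    conn = sqlite3.connect('/path/to/bluemarble.gpkg')
    cur = conn.execute(
        'SELECT tile_data FROM bluemarble_tiles'
        ' WHERE zoom_level=? AND tile_column=? AND tile_row=?',
        (4, 8, 5),
    )
    row = cur.fetchone()
    if row:
        with open('tile.png', 'wb') as f:
            f.write(row[0])  # tile_data is the raw image blob
    conn.close()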
422 .. note::
423
424 The geopackage format specification does not include any timestamps for each tile, and the seeding function is therefore limited. If you include any ``refresh_before`` time in a seed task, all tiles will be recreated regardless of the value. The cleanup process does not support any ``remove_before`` times for geopackage and it always removes all tiles.
425 Use the ``--summary`` option of the ``mapproxy-seed`` tool.
426
427
428 .. _cache_s3:
429
430 ``s3``
431 ======
432
433 .. versionadded:: 1.10.0
434
435 Store tiles in `Amazon Simple Storage Service (S3) <https://aws.amazon.com/s3/>`_.
436
437
438 Requirements
439 ------------
440
441 You will need the Python `boto3 <https://github.com/boto/boto3>`_ package. You can install it in the usual way, for example with ``pip install boto3``.
442
443 Configuration
444 -------------
445
446 Available options:
447
448 ``bucket_name``:
449 The bucket used for this cache. You can set the default bucket with ``globals.cache.s3.bucket_name``.
450
451 ``profile_name``:
452 Optional profile name for `shared credentials <http://boto3.readthedocs.io/en/latest/guide/configuration.html>`_ for this cache. Alternative methods of authentication are using the ``AWS_ACCESS_KEY_ID`` and ``AWS_SECRET_ACCESS_KEY`` environment variables, or using an `IAM role <http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/iam-roles-for-amazon-ec2.html>`_ when running on an Amazon EC2 instance.
453 You can set the default profile with ``globals.cache.s3.profile_name``.
454
455 ``directory``:
456 Base directory (path) where all tiles are stored.
457
458 ``directory_layout``:
459 Defines the directory layout for the tiles (``12/12345/67890.png``, ``L12/R00010932/C00003039.png``, etc.). See :ref:`cache_file` for available options. Defaults to ``tms`` (e.g. ``12/12345/67890.png``). This cache also supports ``reverse_tms`` where tiles are stored as ``y/x/z.format``. See *note* below.
460
461 .. note::
462 The hierarchical ``directory_layouts`` can hit limitations of S3 *"if you are routinely processing 100 or more requests per second"*. ``directory_layout: reverse_tms`` can work around this limitation. Please read `S3 Request Rate and Performance Considerations <http://docs.aws.amazon.com/AmazonS3/latest/dev/request-rate-perf-considerations.html>`_ for more information on this issue.
463
464 Example
465 -------
466
467 ::
468
469 caches:
470 my_layer_20110501_epsg_4326_cache_out:
471 sources: [my_layer_20110501_cache]
472 cache:
473 type: s3
474 directory: /1.0.0/my_layer/default/20110501/4326/
475 bucket_name: my-s3-tiles-cache
476
477 globals:
478 cache:
479 s3:
480 profile_name: default
481
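For reference, a minimal ``boto3`` sketch that lists a few tile objects below the configured ``directory`` of the example above (S3 keys usually carry no leading slash, so the prefix is given without one)::

    import boto3

    session = boto3.session.Session(profile_name='default')
    s3 = session.client('s3')

    resp = s3.list_objects_v2(
        Bucket='my-s3-tiles-cache',
        Prefix='1.0.0/my_layer/default/20110501/4326/',
        MaxKeys=10,
    )
    for obj in resp.get('Contents', []):
        print(obj['Key'], obj['Size'])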
482
483 .. _cache_compact:
484
485
486 ``compact``
487 ===========
488
489 .. versionadded:: 1.10.0
490
491 Store tiles in ArcGIS compatible compact cache files. A single compact cache ``.bundle`` file stores up to about 16,000 tiles. There is one additional ``.bundlx`` index file for each ``.bundle`` data file.
492
493 Only version 1 of the compact cache format (ArcGIS 10.0-10.2) is supported. Version 2 (ArcGIS 10.3 or higher) is not supported at the moment.
494
495 Available options:
496
497 ``directory``:
498 Directory where MapProxy should store the level directories. This will not add the cache name or grid name to the path. You can use this option to point MapProxy to an existing compact cache.
499
500 ``version``:
501 The version of the ArcGIS compact cache format. This option is required.
502
503
504 You can set ``sources`` to an empty list if you use existing compact cache files and do not have a source.
505
506
507 The following configuration will load tiles from ``/path/to/cache/L00/R0000C0000.bundle``, etc.
508
509 ::
510
511 caches:
512 compact_cache:
513 sources: []
514 grids: [webmercator]
515 cache:
516 type: compact
517 version: 1
518 directory: /path/to/cache
519
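The bundle file for a tile can be derived from the tile coordinate. The following sketch mirrors the lookup used by MapProxy's implementation (see ``CompactCacheV1._get_bundle`` further down in this diff): bundles are aligned to a 128x128 tile grid and rows/columns are encoded as hexadecimal numbers::

    import os

    BUNDLE_GRID = 128  # 128 * 128 = 16384 tiles per bundle

    def bundle_path(cache_dir, tile_coord):
        x, y, z = tile_coord
        # align the tile to the lower-left corner of its bundle
        c = x // BUNDLE_GRID * BUNDLE_GRID
        r = y // BUNDLE_GRID * BUNDLE_GRID
        return os.path.join(cache_dir, 'L%02d' % z,
                            'R%04xC%04x' % (r, c) + '.bundle')

    print(bundle_path('/path/to/cache', (300, 150, 5)))
    # /path/to/cache/L05/R0080C0100.bundle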
520 .. note::
521
522 The compact cache format does not include any timestamps for each tile, and the seeding function is therefore limited. If you include any ``refresh_before`` time in a seed task, all tiles will be recreated regardless of the value. The cleanup process does not support any ``remove_before`` times for compact caches and it always removes all tiles.
523 Use the ``--summary`` option of the ``mapproxy-seed`` tool.
524
525
526 .. note::
527
528 The compact cache format is append-only to allow parallel read and write operations. Removing or refreshing tiles with ``mapproxy-seed`` does not reduce the size of the cache files. Therefore, this format is not suitable for caches that require frequent updates.
529
4848 # built documents.
4949 #
5050 # The short X.Y version.
51 version = '1.8'
51 version = '1.10'
5252 # The full version, including alpha/beta/rc tags.
53 release = '1.8.2a0'
53 release = '1.10.0a0'
5454
5555 # The language for content autogenerated by Sphinx. Refer to documentation
5656 # for a list of supported languages.
418418 """"""""""""""""""""""""""
419419 If set to ``true``, MapProxy will only issue a single request to the source. This option can reduce the request latency for uncached areas (on demand caching).
420420
421 By default MapProxy requests all uncached meta tiles that intersect the requested bbox. With a typical configuration it is not uncommon that a requests will trigger four requests each larger than 2000x2000 pixel. With the ``minimize_meta_requests`` option enabled, each request will trigger only one request to the source. That request will be aligned to the next tile boundaries and the tiles will be cached.
421 By default MapProxy requests all uncached meta-tiles that intersect the requested bbox. With a typical configuration it is not uncommon that a request will trigger four source requests, each larger than 2000x2000 pixels. With the ``minimize_meta_requests`` option enabled, each request will trigger only one request to the source. That request will be aligned to the next tile boundaries and the tiles will be cached.
422422
423423 .. index:: watermark
424424
468468 """""""""""""""""""""""""""""""""
469469
470470 Change the ``meta_size`` and ``meta_buffer`` of this cache. See :ref:`global cache options <meta_size>` for more details.
471
472 ``bulk_meta_tiles``
473 """""""""""""""""""
474
475 Enables meta-tile handling for tiled sources. See :ref:`global cache options <meta_size>` for more details.
471476
472477 ``image``
473478 """""""""
607612 """"""""
608613
609614 The extent of your grid. You can use either a list or a string with the lower left and upper right coordinates. You can set the SRS of the coordinates with the ``bbox_srs`` option. If that option is not set the ``srs`` of the grid will be used.
615
616 MapProxy always expects your BBOX coordinates order to be west, south, east, north, regardless of your SRS :ref:`axis order <axis_order>`.
617
610618 ::
611619
612620 bbox: [0, 40, 15, 55]
632640 The following values are supported:
633641
634642 ``ll`` or ``sw``:
635
636643 If the x=0, y=0 tile is in the lower-left/south-west corner of the tile grid. This is the default.
637644
638645 ``ul`` or ``nw``:
639
640646 If the x=0, y=0 tile is in the upper-left/north-west corner of the tile grid.
641647
642648
787793 ``cache``
788794 """""""""
789795
796 The following options define how tiles are created and stored. Most options can be set individually for each cache as well.
797
790798 .. versionadded:: 1.6.0 ``tile_lock_dir``
799 .. versionadded:: 1.10.0 ``bulk_meta_tiles``
791800
792801
793802 .. _meta_size:
794803
795804 ``meta_size``
796 MapProxy does not make a single request for every tile but will request a large meta-tile that consist of multiple tiles. ``meta_size`` defines how large a meta-tile is. A ``meta_size`` of ``[4, 4]`` will request 16 tiles in one pass. With a tile size of 256x256 this will result in 1024x1024 requests to the source WMS.
805 MapProxy does not make a single request for every tile it needs, but it will request a large meta-tile that consists of multiple tiles. ``meta_size`` defines how large a meta-tile is. A ``meta_size`` of ``[4, 4]`` will request 16 tiles in one pass. With a tile size of 256x256 this will result in 1024x1024 requests to the source. Tiled sources are still requested tile by tile, but you can configure MapProxy to load multiple tiles in bulk with ``bulk_meta_tiles``.
806
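As a quick sanity check, the size of a source request follows directly from ``meta_size`` and the tile size (a sketch; the ``meta_buffer`` described below adds extra pixels on each edge)::

    tile_size = (256, 256)
    meta_size = (4, 4)

    width = tile_size[0] * meta_size[0]    # 1024
    height = tile_size[1] * meta_size[1]   # 1024
    tiles = meta_size[0] * meta_size[1]    # 16
    print('%dx%d source request for %d tiles' % (width, height, tiles))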
807
808 .. _bulk_meta_tiles:
809
810 ``bulk_meta_tiles``
811 Enables meta-tile handling for caches with tile sources.
812 If set to ``true``, MapProxy will request neighboring tiles from the source even if only one tile is requested from the cache. ``meta_size`` defines how many tiles should be requested in one step and ``concurrent_tile_creators`` defines how many requests are made in parallel. This option improves the performance for caches that can store multiple tiles with one request, like SQLite/MBTiles, but not for the ``file`` cache.
813
797814
798815 ``meta_buffer``
799816 MapProxy will increase the size of each meta-tile request by this number of
821838 can either be absolute (e.g. ``/tmp/lock/mapproxy``) or relative to the
822839 mapproxy.yaml file. Defaults to ``./cache_data/dir_of_the_cache/tile_locks``.
823840
841
824842 ``concurrent_tile_creators``
825 This limits the number of parallel requests MapProxy will make to a source WMS. This limit is per request and not for all MapProxy requests. To limit the requests MapProxy makes to a single server use the ``concurrent_requests`` option.
826
827 Example: A request in an uncached region requires MapProxy to fetch four meta-tiles. A ``concurrent_tile_creators`` value of two allows MapProxy to make two requests to the source WMS request in parallel. The splitting of the meta tile and the encoding of the new tiles will happen in parallel to.
843 This limits the number of parallel requests MapProxy will make to a source. This limit is per request for this cache and not for all MapProxy requests. To limit the requests MapProxy makes to a single server use the ``concurrent_requests`` option.
844
845 Example: A request in an uncached region requires MapProxy to fetch four meta-tiles. A ``concurrent_tile_creators`` value of two allows MapProxy to make two requests to the source WMS in parallel. The splitting of the meta-tile and the encoding of the new tiles will happen in parallel, too.
828846
829847
830848 ``link_single_color_images``
896914 http:
897915 ssl_ca_certs: /etc/ssl/certs/ca-certificates.crt
898916
899 If you want to use SSL but do not need certificate verification, then you can disable it with the ``ssl_no_cert_checks`` option. You can also disable this check on a source level, see :ref:`WMS source options <wms_source-ssl_no_cert_checks>`.
917 If you want to use SSL but do not need certificate verification, then you can disable it with the ``ssl_no_cert_checks`` option. You can also disable this check on a source level, see :ref:`WMS source options <wms_source_ssl_no_cert_checks>`.
900918 ::
901919
902920 http:
66 MapProxy supports coverages for :doc:`sources <sources>` and in the :doc:`mapproxy-seed tool <seed>`. Refer to the corresponding section in the documentation.
77
88
9 There are three different ways to describe a coverage.
9 There are five different ways to describe a coverage:
1010
1111 - a simple rectangular bounding box,
1212 - a text file with one or more (multi)polygons in WKT format,
13 - (multi)polygons from any data source readable with OGR (e.g. Shapefile, GeoJSON, PostGIS)
14
13 - a GeoJSON file with (multi)polygon features,
14 - (multi)polygons from any data source readable with OGR (e.g. Shapefile, GeoJSON, PostGIS),
15 - a file with webmercator tile coordinates.
16
17 .. versionadded:: 1.10
18
19 You can also build intersections, unions and differences between multiple coverages.
1520
1621 Requirements
1722 ------------
4550 For simple box coverages.
4651
4752 ``bbox`` or ``datasource``:
48 A simple BBOX as a list, e.g: `[4, -30, 10, -28]` or as a string `4,-30,10,-28`.
53 A simple BBOX as a list of minx, miny, maxx, maxy, e.g.: `[4, -30, 10, -28]` or as a string `4,-30,10,-28`.
4954
5055 Polygon file
5156 """"""""""""
5560
5661 ``datasource``:
5762 The path to the polygon file. Should be relative to the proxy configuration or absolute.
63
64 GeoJSON
65 """""""
66
67 .. versionadded:: 1.10
68 Previous versions required OGR/GDAL for reading GeoJSON.
69
70 You can use GeoJSON files with Polygon and MultiPolygon geometries. FeatureCollections and Features of these geometries are supported as well. MapProxy uses OGR to read GeoJSON files if you define a ``where`` filter.
71
72 ``datasource``:
73 The path to the GeoJSON file. Should be relative to the proxy configuration or absolute.
5874
5975 OGR datasource
6076 """"""""""""""
7288 statement (e.g. ``'CNTRY_NAME="Germany"'``) or a full select statement. Refer to the
7389 `OGR SQL support documentation <http://www.gdal.org/ogr/ogr_sql.html>`_. If this
7490 option is unset, the first layer from the datasource will be used.
91
92
93 Expire tiles
94 """"""""""""
95
96 .. versionadded:: 1.10
97
98 A text file with webmercator tile coordinates. The tiles should be in ``z/x/y`` format (e.g. ``14/1283/6201``),
99 with one tile coordinate per line. Only tiles in the webmercator grid are supported (origin is always ``nw``).
100
101 ``expire_tiles``:
102 File or directory with expire tile files. Directories are loaded recursively.
103
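An expire tiles file is plain text with one tile per line. A minimal sketch of a reader (the file name is hypothetical)::

    def read_expire_tiles(filename):
        # each line holds one webmercator tile as "z/x/y", e.g. "14/1283/6201"
        tiles = []
        with open(filename) as f:
            for line in f:
                line = line.strip()
                if not line:
                    continue
                z, x, y = map(int, line.split('/'))
                tiles.append((x, y, z))
        return tiles

    tiles = read_expire_tiles('expire_tiles.txt')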
104
105 Union
106 """""
107
108 .. versionadded:: 1.10
109
110 A union coverage contains the combined coverage of one or more sub-coverages. This can be used to combine multiple coverages for a single source. Each sub-coverage can be of any supported type and SRS.
111
112 ``union``:
113 A list of multiple coverages.
114
115 Difference
116 """"""""""
117
118 .. versionadded:: 1.10
119
120 A difference coverage subtracts the coverage of other sub-coverages from the first coverage. This can be used to exclude parts from a coverage. Each sub-coverage can be of any supported type and SRS.
121
122 ``difference``:
123 A list of multiple coverages.
124
125
126 Intersection
127 """"""""""""
128
129 .. versionadded:: 1.10
130
131 An intersection coverage contains only areas that are covered by all sub-coverages. This can be used to limit a larger coverage to a smaller area. Each sub-coverage can be of any supported type and SRS.
132
133 ``intersection``:
134 A list of multiple coverages.
135
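Conceptually, the three operations behave like the corresponding set operations on geometries. A sketch with Shapely (which MapProxy uses for polygon coverages)::

    from shapely.geometry import box

    a = box(5, 50, 10, 55)  # first coverage (minx, miny, maxx, maxy)
    b = box(8, 52, 12, 57)  # second coverage

    a.union(b)         # area covered by a or b
    a.difference(b)    # area of a that is not covered by b
    a.intersection(b)  # area covered by both a and b

    print(a.intersection(b).bounds)  # (8.0, 52.0, 10.0, 55.0)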
136
137 Clipping
138 --------
139 .. versionadded:: 1.10.0
140
141 By default MapProxy tries to get and serve the full source image even if a coverage covers only parts of it.
142 Clipping by coverage can be enabled by setting ``clip: true``. If enabled, all areas outside the coverage will be converted to transparent pixels.
143
144 The ``clip`` option is only active for source coverages and not for seeding coverages.
75145
76146
77147 Examples
95165 srs: 'EPSG:4326'
96166
97167
168 Example of an intersection coverage with clipping::
169
170 sources:
171 mywms:
172 type: wms
173 req:
174 url: http://example.com/service?
175 layers: base
176 coverage:
177 clip: true
178 intersection:
179 - bbox: [5, 50, 10, 55]
180 srs: 'EPSG:4326'
181 - datasource: coverage.geojson
182 srs: 'EPSG:4326'
183
184
98185 mapproxy-seed
99186 """""""""""""
100187
114114
115115 <Directory /path/to/mapproxy/>
116116 Order deny,allow
117 Require all granted # for Apache 2.4
118 # Allow from all # for Apache 2.2
117 # For Apache 2.4:
118 Require all granted
119 # For Apache 2.2:
120 # Allow from all
119121 </Directory>
120122
121123
9393
9494 GDAL *(optional)*
9595 ~~~~~~~~~~~~~~~~~
96 The :doc:`coverage feature <coverages>` allows you to read geometries from OGR datasources (Shapefiles, PostGIS, etc.). This package is optional and only required for OGR datasource support. OGR is part of GDAL (``libgdal-dev``).
96 The :doc:`coverage feature <coverages>` allows you to read geometries from OGR datasources (Shapefiles, PostGIS, etc.). This package is optional and only required for OGR datasource support (BBOX, WKT and GeoJSON coverages are supported natively). OGR is part of GDAL (``libgdal-dev``).
9797
9898 .. _lxml_install:
9999
00 Installation on Windows
11 =======================
22
3 .. note:: You can also :doc:`install MapProxy inside an existing OSGeo4W installation<install_osgeo4w>`.
4
5 At frist you need a working Python installation. You can download Python from: http://www.python.org/download/. MapProxy requires Python 2.7, 3.3 or 3.4. Python 2.6 should still work, but it is no longer officially supported.
6
3 First you need a working Python installation. You can download Python from: https://www.python.org/download/. MapProxy requires Python 2.7, 3.3, 3.4, 3.5 or 3.6. Python 2.6 should still work, but it is no longer officially supported. We recommend the latest available 2.7 version.
74
85 Virtualenv
96 ----------
2320
21 .. note:: Apache mod_wsgi does not work well with virtualenv on Windows. If you want to use mod_wsgi for deployment, then you should skip the creation of the virtualenv.
2522
26 After you activated the new environment, you have access to ``python`` and ``easy_install``.
23 After you have activated the new environment, you have access to ``python`` and ``pip``.
2724 To install MapProxy with most dependencies call::
2825
29 easy_install MapProxy
26 pip install MapProxy
3027
3128 This might take a minute. You can skip the next step.
3229
3330
34 Setuptools
35 ----------
31 PIP
32 ---
3633
37 MapProxy and most dependencies can be installed with the ``easy_install`` command.
38 You need to `install the setuptool package <http://pypi.python.org/pypi/setuptools>`_ to get the ``easy_install`` command.
34 MapProxy and most dependencies can be installed with the ``pip`` command. ``pip`` is already installed if you are using Python >=2.7.9, or Python >=3.4. `Read the pip documentation for more information <https://pip.pypa.io/en/stable/installing/>`_.
3935
4036 After that you can install MapProxy with::
4137
42 c:\Python27\Scripts\easy_install MapProxy
38 c:\Python27\Scripts\pip install MapProxy
4339
4440 This might take a minute.
4541
5248 Pillow and YAML
5349 ~~~~~~~~~~~~~~~
5450
55 Pillow and PyYAML are installed automatically by ``easy_install``.
51 Pillow and PyYAML are installed automatically by ``pip``.
5652
5753 PyProj
5854 ~~~~~~
5955
6056 Since libproj4 is generally not available on a Windows system, you will also need to install the Python package ``pyproj``.
57 You need to manually download the ``pyproj`` package for your system. See below for *Platform dependent packages*.
6158
6259 ::
6360
64 easy_install pyproj
61 pip install path\to\pyproj-xxx.whl
6562
6663
6764 Shapely and GEOS *(optional)*
6865 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
69 Shapely can be installed with ``easy_install Shapely``. This will already include the required ``geos.dll``.
66 Shapely can be installed with ``pip install Shapely``. This will already include the required ``geos.dll``.
7067
7168
7269 GDAL *(optional)*
8683 set GDAL_DRIVER_PATH=C:\Program Files (x86)\GDAL\gdalplugins
8784
8885
86 .. _win_platform_packages:
87
8988 Platform dependent packages
9089 ---------------------------
9190
92 All Python packages are downloaded from http://pypi.python.org/, but not all platform combinations might be available as a binary package, especially if you run a 64bit version of Windows.
91 ``pip`` downloads all packages from https://pypi.python.org/, but not all platform combinations might be available as a binary package, especially if you run a 64bit version of Python.
9392
94 If you run into troubles during installation, because it is trying to compile something (e.g. complaining about ``vcvarsall.bat``), you should look at Christoph Gohlke's `Unofficial Windows Binaries for Python Extension Packages <http://www.lfd.uci.edu/~gohlke/pythonlibs/>`_.
93 If you run into trouble during installation because pip is trying to compile something (e.g. complaining about ``vcvarsall.bat``), you should look at Christoph Gohlke's `Unofficial Windows Binaries for Python Extension Packages <http://www.lfd.uci.edu/~gohlke/pythonlibs/>`_. This is a reliable site for binary Python packages. You need to download the right package: the ``cpxx`` tag refers to the Python version (e.g. ``cp27`` for Python 2.7); ``win32`` is for 32bit Python installations and ``amd64`` for 64bit.
9594
96 You can install the ``.exe`` packages with ``easy_install``::
95 You can install the ``.whl``, ``.zip`` or ``.exe`` packages with ``pip``::
9796
98 easy_install path\to\package-xxx.exe
97 pip install path\to\package-xxx.whl
9998
10099
101100 Check installation
109108
110109 Now continue with :ref:`Create a configuration <create_configuration>` from the installation documentation.
111110
112
503503 Export tiles like the internal cache directory structure. This is compatible with TileCache.
504504
505505 ``mbtile``:
506 Exports tiles into a MBTile file.
506 Export tiles into an MBTiles file.
507
508 ``sqlite``:
509 Export tiles into SQLite level files.
510
511 ``geopackage``:
512 Export tiles into a GeoPackage file.
507513
508514 ``arcgis``:
509 Exports tiles in a ArcGIS exploded cache directory structure.
510
515 Export tiles in an ArcGIS exploded cache directory structure.
516
517 ``compact-v1``:
518 Export tiles as ArcGIS compact cache bundle files (version 1).
511519
512520
513521 Examples
2020
2121 .. option:: -s <seed.yaml>, --seed-conf=<seed.yaml>
2222
23 The seed configuration. You can also pass the configration as the last argument to ``mapproxy-seed``
23 The seed configuration. You can also pass the configuration as the last argument to ``mapproxy-seed``
2424
2525 .. option:: -f <mapproxy.yaml>, --proxy-conf=<mapproxy.yaml>
2626
6666
6767 Filename where MapProxy stores the seeding progress for the ``--continue`` option. Defaults to ``.mapproxy_seed_progress`` in the current working directory. MapProxy will remove that file after a successful seed.
6868
69 .. option:: --duration
70
71 Stop the seeding process after this duration. This option accepts durations in the following format: ``120s``, ``15m``, ``4h``, ``0.5d``.
72 Use this option in combination with ``--continue`` to be able to resume the seeding.
73
74 .. option:: --reseed-file
75
76 File created by ``mapproxy-seed`` at the start of a new seeding.
77
78 .. option:: --reseed-interval
79
80 Only start seeding if ``--reseed-file`` is older than this duration.
81 This option accepts durations in the following format: ``120s``, ``15m``, ``4h``, ``0.5d``.
82 Use this option in combination with ``--continue`` to be able to resume the seeding.
83
84
6985 .. option:: --use-cache-lock
7086
7187 Lock each cache to prevent multiple parallel `mapproxy-seed` calls from working on the same cache.
8096
8197 .. versionadded:: 1.7.0
8298 ``--log-config`` option
99
100 .. versionadded:: 1.10.0
101 ``--duration``, ``--reseed-file`` and ``--reseed-interval`` options
102
103
83104
84105
85106 Examples
377398 austria:
378399 bbox: [9.36, 46.33, 17.28, 49.09]
379400 srs: 'EPSG:4326'
401
402
403 .. _background_seeding:
404
405 Example: Background seeding
406 ---------------------------
407
408 .. versionadded:: 1.10.0
409
410 The ``--duration`` option allows you to run MapProxy seeding for a limited time. In combination with the ``--continue`` option, you can resume the seeding process at a later time.
411 You can use this to call ``mapproxy-seed`` with ``cron`` to seed in the off-hours.
412
413 However, this will restart the seeding process from the beginning every time the seeding is completed.
414 You can prevent this with the ``--reseed-interval`` and ``--reseed-file`` options.
415 The following example starts seeding for six hours. It will seed for another six hours every time you call this command again. Once all seed and cleanup tasks have been processed, the command will exit immediately every time you call it within 14 days after the first call. After 14 days, the modification time of the ``reseed.time`` file will be updated and the re-seeding process starts again.
416
417 ::
418
419 mapproxy-seed -f mapproxy.yaml -s seed.yaml \
420 --reseed-interval 14d --duration 6h --reseed-file reseed.time \
421 --continue --progress-file .mapproxy_seed_progress
422
423 You can use the ``--reseed-file`` as an ``mtime``-file for ``refresh_before`` and ``remove_before``.
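
The durations accepted by ``--duration`` and ``--reseed-interval`` map to seconds as in this sketch (an illustration only, not the actual parser)::

    # convert "120s", "15m", "4h" or "0.5d" to seconds
    UNITS = {'s': 1, 'm': 60, 'h': 3600, 'd': 86400}

    def parse_duration(value):
        return float(value[:-1]) * UNITS[value[-1]]

    assert parse_duration('120s') == 120
    assert parse_duration('14d') == 1209600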
424
380425
381426
382427 .. _seed_old_configuration:
107107
108108 A list of image mime types the server should offer.
109109
110 .. _wms_featureinfo_types:
111
110112 ``featureinfo_types``
111113 """""""""""""""""""""
112114
113 A list of feature info types the server should offer. Available types are ``text``, ``html`` and ``xml``. The types then are advertised in the capabilities with the correct mime type.
115 A list of feature info types the server should offer. Available types are ``text``, ``html``, ``xml`` and ``json``. The types are advertised in the capabilities with the correct mime type. Defaults to ``[text, html, xml]``.
114116
115117 ``featureinfo_xslt``
116118 """"""""""""""""""""
256256 .. _arcgis_label:
257257
258258 ArcGIS REST API
259 """
259 """""""""""""""
260
261 .. versionadded:: 1.9.0
260262
261263 Use the type ``arcgis`` for ArcGIS MapServer and ImageServer REST server endpoints. This
262264 source is based on :ref:`the WMS source <wms_label>` and most WMS options apply to the
265267 ``req``
266268 ^^^^^^^
267269
268 This describes the ArcGIS source. The only required option is ``url``. You need to set ``transparent`` to ``true`` if you want to use this source as an overlay.
269 ::
270
271 req:
272 url: http://example.org/ArcGIS/rest/services/Imagery/MapService
273 layers: show: 0,1
274 transparent: true
275
276 .. _example_configuration:
270 This describes the ArcGIS source. The only required option is ``url``. You need to set ``transparent`` to ``true`` if you want to use this source as an overlay. You can also add ArcGIS-specific parameters to ``req``, for example to set the `interpolation method for ImageServers <http://resources.arcgis.com/en/help/rest/apiref/exportimage.html>`_.
271
272
273 ``opts``
274 ^^^^^^^^
275
276 .. versionadded:: 1.10.0
277
278 This option affects what request MapProxy sends to the source ArcGIS server.
279
280 ``featureinfo``
281 If this is set to ``true``, MapProxy will mark the layer as queryable and incoming `GetFeatureInfo` requests will be forwarded as ``identify`` requests to the source server. ArcGIS REST servers only support the HTML and JSON formats. You need to enable support for JSON with :ref:`wms_featureinfo_types`.
282
283 ``featureinfo_return_geometries``
284 Whether the source should include the feature geometries.
285
286 ``featureinfo_tolerance``
287 Tolerance in pixels within which the ArcGIS server should identify features.
277288
278289 Example configuration
279290 ^^^^^^^^^^^^^^^^^^^^^
280291
281 Minimal example::
292 MapServer example::
282293
283294 my_minimal_arcgissource:
284295 type: arcgis
285296 req:
297 layers: show:0,1
286298 url: http://example.org/ArcGIS/rest/services/Imagery/MapService
287
288 Full example::
299 transparent: true
300
301 ImageServer example::
289302
290303 my_arcgissource:
291304 type: arcgis
292305 coverage:
293306 polygons: GM.txt
294 polygons_srs: EPSG:900913
307 srs: EPSG:3857
295308 req:
296 url: http://example.org/ArcGIS/rest/services/Imagery/MapService
297 layers: show:0,1
298 transparent: true
309 url: http://example.org/ArcGIS/rest/services/World/MODIS/ImageServer
310 interpolation: RSP_CubicConvolution
311 bandIds: 2,0,1
312
299313
300314 .. _tiles_label:
301315
360374 - ``headers``
361375 - ``client_timeout``
362376 - ``ssl_ca_certs``
363 - ``ssl_no_cert_checks`` (:ref:`see above <wms_source-ssl_no_cert_checks>`)
377 - ``ssl_no_cert_checks`` (:ref:`see above <wms_source_ssl_no_cert_checks>`)
364378
365379 See :ref:`HTTP Options <http_ssl>` for detailed documentation.
366380
11 demo:
22 wms:
33 md:
4 title: MapProxy WMS Proxy
5 abstract: This is the fantastic MapProxy.
6 online_resource: http://mapproxy.org/
7 contact:
8 person: Your Name Here
9 position: Technical Director
10 organization:
11 address: Fakestreet 123
12 city: Somewhere
13 postcode: 12345
14 country: Germany
15 phone: +49(0)000-000000-0
16 fax: +49(0)000-000000-0
17 email: info@omniscale.de
18 access_constraints:
19 This service is intended for private and
20 evaluation use only. The data is licensed
21 as Creative Commons Attribution-Share Alike 2.0
22 (http://creativecommons.org/licenses/by-sa/2.0/)
23 fees: 'None'
4 title: MapProxy WMS Proxy
5 abstract: This is the fantastic MapProxy.
6 online_resource: http://mapproxy.org/
7 contact:
8 person: Your Name Here
9 position: Technical Director
10 organization:
11 address: Fakestreet 123
12 city: Somewhere
13 postcode: 12345
14 country: Germany
15 phone: +49(0)000-000000-0
16 fax: +49(0)000-000000-0
17 email: info@omniscale.de
18 access_constraints:
19 Insert license and copyright information for this service.
20 fees: 'None'
2421
2522 sources:
2623 test_wms:
77 contact:
88 person: Your Name Here
99 position: Technical Director
10 organization:
10 organization:
1111 address: Fakestreet 123
1212 city: Somewhere
1313 postcode: 12345
1616 fax: +49(0)000-000000-0
1717 email: info@omniscale.de
1818 access_constraints:
19 This service is intended for private and
20 evaluation use only. The data is licensed
21 as Creative Commons Attribution-Share Alike 2.0
22 (http://creativecommons.org/licenses/by-sa/2.0/)
19 Insert license and copyright information for this service.
2320 fees: 'None'
2421
2522 sources:
77 contact:
88 person: Your Name Here
99 position: Technical Director
10 organization:
10 organization:
1111 address: Fakestreet 123
1212 city: Somewhere
1313 postcode: 12345
1616 fax: +49(0)000-000000-0
1717 email: info@omniscale.de
1818 access_constraints:
19 This service is intended for private and
20 evaluation use only. The data is licensed
21 as Creative Commons Attribution-Share Alike 2.0
22 (http://creativecommons.org/licenses/by-sa/2.0/)
19 Insert license and copyright information for this service.
2320 fees: 'None'
2421
2522 sources:
77 contact:
88 person: Your Name Here
99 position: Technical Director
10 organization:
10 organization:
1111 address: Fakestreet 123
1212 city: Somewhere
1313 postcode: 12345
1616 fax: +49(0)000-000000-0
1717 email: info@omniscale.de
1818 access_constraints:
19 This service is intended for private and
20 evaluation use only. The data is licensed
21 as Creative Commons Attribution-Share Alike 2.0
22 (http://creativecommons.org/licenses/by-sa/2.0/)
19 Insert license and copyright information for this service.
2320 fees: 'None'
2421
2522 sources:
77 contact:
88 person: Your Name Here
99 position: Technical Director
10 organization:
10 organization:
1111 address: Fakestreet 123
1212 city: Somewhere
1313 postcode: 12345
1616 fax: +49(0)000-000000-0
1717 email: info@omniscale.de
1818 access_constraints:
19 This service is intended for private and
20 evaluation use only. The data is licensed
21 as Creative Commons Attribution-Share Alike 2.0
22 (http://creativecommons.org/licenses/by-sa/2.0/)
19 Insert license and copyright information for this service.
2320 fees: 'None'
2421
2522 sources:
00 # This file is part of the MapProxy project.
11 # Copyright (C) 2010 Omniscale <http://omniscale.de>
2 #
2 #
33 # Licensed under the Apache License, Version 2.0 (the "License");
44 # you may not use this file except in compliance with the License.
55 # You may obtain a copy of the License at
6 #
6 #
77 # http://www.apache.org/licenses/LICENSE-2.0
8 #
8 #
99 # Unless required by applicable law or agreed to in writing, software
1010 # distributed under the License is distributed on an "AS IS" BASIS,
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1616 Tile caching (creation, caching and retrieval of tiles).
1717
1818 .. digraph:: Schematic Call Graph
19
19
2020 ranksep = 0.1;
21 node [shape="box", height="0", width="0"]
22
21 node [shape="box", height="0", width="0"]
22
2323 cl [label="CacheMapLayer" href="<mapproxy.layer.CacheMapLayer>"]
2424 tm [label="TileManager", href="<mapproxy.cache.tile.TileManager>"];
2525 fc [label="FileCache", href="<mapproxy.cache.file.FileCache>"];
3030 tm -> fc [label="load\\nstore\\nis_cached"];
3131 tm -> s [label="get_map"]
3232 }
33
33
3434
3535 """
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2016 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement
16 import errno
17 import hashlib
18 import os
19 import shutil
20 import struct
21
22 from mapproxy.image import ImageSource
23 from mapproxy.cache.base import TileCacheBase, tile_buffer
24 from mapproxy.util.fs import ensure_directory, write_atomic
25 from mapproxy.util.lock import FileLock
26 from mapproxy.compat import BytesIO
27
28 import logging
29 log = logging.getLogger(__name__)
30
31
32 class CompactCacheV1(TileCacheBase):
33 supports_timestamp = False
34
35 def __init__(self, cache_dir):
36 self.lock_cache_id = 'compactcache-' + hashlib.md5(cache_dir.encode('utf-8')).hexdigest()
37 self.cache_dir = cache_dir
38
39 def _get_bundle(self, tile_coord):
40 x, y, z = tile_coord
41
42 level_dir = os.path.join(self.cache_dir, 'L%02d' % z)
43
44 c = x // BUNDLEX_GRID_WIDTH * BUNDLEX_GRID_WIDTH
45 r = y // BUNDLEX_GRID_HEIGHT * BUNDLEX_GRID_HEIGHT
46
47 basename = 'R%04xC%04x' % (r, c)
48 return Bundle(os.path.join(level_dir, basename), offset=(c, r))
49
50 def is_cached(self, tile):
51 if tile.coord is None:
52 return True
53 if tile.source:
54 return True
55
56 return self._get_bundle(tile.coord).is_cached(tile)
57
58 def store_tile(self, tile):
59 if tile.stored:
60 return True
61
62 return self._get_bundle(tile.coord).store_tile(tile)
63
64 def load_tile(self, tile, with_metadata=False):
65 if tile.source or tile.coord is None:
66 return True
67
68 return self._get_bundle(tile.coord).load_tile(tile)
69
70 def remove_tile(self, tile):
71 if tile.coord is None:
72 return True
73
74 return self._get_bundle(tile.coord).remove_tile(tile)
75
76 def load_tile_metadata(self, tile):
77 if self.load_tile(tile):
78 tile.timestamp = -1
79
80 def remove_level_tiles_before(self, level, timestamp):
81 if timestamp == 0:
82 level_dir = os.path.join(self.cache_dir, 'L%02d' % level)
83 shutil.rmtree(level_dir, ignore_errors=True)
84 return True
85 return False
86
87 BUNDLE_EXT = '.bundle'
88 BUNDLEX_EXT = '.bundlx'
89
90 class Bundle(object):
91 def __init__(self, base_filename, offset):
92 self.base_filename = base_filename
93 self.lock_filename = base_filename + '.lck'
94 self.offset = offset
95
96 def _rel_tile_coord(self, tile_coord):
97 return (
98 tile_coord[0] % BUNDLEX_GRID_WIDTH,
99 tile_coord[1] % BUNDLEX_GRID_HEIGHT,
100 )
101
102 def is_cached(self, tile):
103 if tile.source or tile.coord is None:
104 return True
105
106 idx = BundleIndex(self.base_filename + BUNDLEX_EXT)
107 x, y = self._rel_tile_coord(tile.coord)
108 offset = idx.tile_offset(x, y)
109 if offset == 0:
110 return False
111
112 bundle = BundleData(self.base_filename + BUNDLE_EXT, self.offset)
113 size = bundle.read_size(offset)
114 return size != 0
115
116 def store_tile(self, tile):
117 if tile.stored:
118 return True
119
120 with tile_buffer(tile) as buf:
121 data = buf.read()
122
123 with FileLock(self.lock_filename):
124 bundle = BundleData(self.base_filename + BUNDLE_EXT, self.offset)
125 idx = BundleIndex(self.base_filename + BUNDLEX_EXT)
126 x, y = self._rel_tile_coord(tile.coord)
127 offset = idx.tile_offset(x, y)
128 offset, size = bundle.append_tile(data, prev_offset=offset)
129 idx.update_tile_offset(x, y, offset=offset, size=size)
130
131 return True
132
133 def load_tile(self, tile, with_metadata=False):
134 if tile.source or tile.coord is None:
135 return True
136
137 idx = BundleIndex(self.base_filename + BUNDLEX_EXT)
138 x, y = self._rel_tile_coord(tile.coord)
139 offset = idx.tile_offset(x, y)
140 if offset == 0:
141 return False
142
143 bundle = BundleData(self.base_filename + BUNDLE_EXT, self.offset)
144 data = bundle.read_tile(offset)
145 if not data:
146 return False
147 tile.source = ImageSource(BytesIO(data))
148
149 return True
150
151 def remove_tile(self, tile):
152 if tile.coord is None:
153 return True
154
155 with FileLock(self.lock_filename):
156 idx = BundleIndex(self.base_filename + BUNDLEX_EXT)
157 x, y = self._rel_tile_coord(tile.coord)
158 idx.remove_tile_offset(x, y)
159
160 return True
161
162
163 BUNDLEX_GRID_WIDTH = 128
164 BUNDLEX_GRID_HEIGHT = 128
165 BUNDLEX_HEADER_SIZE = 16
166 BUNDLEX_HEADER = b'\x03\x00\x00\x00\x10\x00\x00\x00\x00\x40\x00\x00\x05\x00\x00\x00'
167 BUNDLEX_FOOTER_SIZE = 16
168 BUNDLEX_FOOTER = b'\x00\x00\x00\x00\x10\x00\x00\x00\x10\x00\x00\x00\x00\x00\x00\x00'
169
170 class BundleIndex(object):
171 def __init__(self, filename):
172 self.filename = filename
173 # defer initialization to update/remove calls to avoid
174 # index creation on is_cached (prevents new files in read-only caches)
175 self._initialized = False
176
177 def _init_index(self):
178 self._initialized = True
179 if os.path.exists(self.filename):
180 return
181 ensure_directory(self.filename)
182 buf = BytesIO()
183 buf.write(BUNDLEX_HEADER)
184 for i in range(BUNDLEX_GRID_WIDTH * BUNDLEX_GRID_HEIGHT):
185 buf.write(struct.pack('<Q', (i*4)+BUNDLE_HEADER_SIZE)[:5])
186 buf.write(BUNDLEX_FOOTER)
187 write_atomic(self.filename, buf.getvalue())
188
189 def _tile_offset(self, x, y):
190 return BUNDLEX_HEADER_SIZE + (x * BUNDLEX_GRID_HEIGHT + y) * 5
191
192 def tile_offset(self, x, y):
193 idx_offset = self._tile_offset(x, y)
194 try:
195 with open(self.filename, 'rb') as f:
196 f.seek(idx_offset)
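# offsets are stored as 5-byte little-endian values; pad with three
# zero bytes so struct can unpack them as an 8-byte '<Q' value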
197 offset = struct.unpack('<Q', f.read(5) + b'\x00\x00\x00')[0]
198 return offset
199 except IOError as ex:
200 if ex.errno == errno.ENOENT:
201 # missing bundle index file -> missing tile
202 return 0
203 raise
204
205 def update_tile_offset(self, x, y, offset, size):
206 self._init_index()
207 idx_offset = self._tile_offset(x, y)
208 offset = struct.pack('<Q', offset)[:5]
209 with open(self.filename, 'r+b') as f:
210 f.seek(idx_offset, os.SEEK_SET)
211 f.write(offset)
212
213 def remove_tile_offset(self, x, y):
214 self._init_index()
215 idx_offset = self._tile_offset(x, y)
216 with open(self.filename, 'r+b') as f:
217 f.seek(idx_offset)
218 f.write(b'\x00' * 5)
219
220 # The bundle file has a header with 15 little-endian long values (60 bytes).
221 # NOTE: the fixed values might be some flags for image options (format, aliasing)
222 # all files available for testing had the same values however.
223 BUNDLE_HEADER_SIZE = 60
224 BUNDLE_HEADER = [
225 3 , # 0, fixed
226 16384 , # 1, max. num of tiles 128*128 = 16384
227 16 , # 2, size of largest tile
228 5 , # 3, fixed
229 0 , # 4, num of tiles in bundle (*4)
230 0 , # 5, fixed
231 60+65536 , # 6, bundle size
232 0 , # 7, fixed
233 40 , # 8 fixed
234 0 , # 9, fixed
235 16 , # 10, fixed
236 0 , # 11, y0
237 127 , # 12, y1
238 0 , # 13, x0
239 127 , # 14, x1
240 ]
241 BUNDLE_HEADER_STRUCT_FORMAT = '<lllllllllllllll'
242
243 class BundleData(object):
244 def __init__(self, filename, tile_offsets):
245 self.filename = filename
246 self.tile_offsets = tile_offsets
247 if not os.path.exists(self.filename):
248 self._init_bundle()
249
250 def _init_bundle(self):
251 ensure_directory(self.filename)
252 header = list(BUNDLE_HEADER)
253 header[13], header[11] = self.tile_offsets
254 header[14], header[12] = header[13]+127, header[11]+127
255 write_atomic(self.filename,
256 struct.pack(BUNDLE_HEADER_STRUCT_FORMAT, *header) +
257 # zero-size entry for each tile
258 (b'\x00' * (BUNDLEX_GRID_HEIGHT * BUNDLEX_GRID_WIDTH * 4)))
259
260 def read_size(self, offset):
261 with open(self.filename, 'rb') as f:
262 f.seek(offset)
263 return struct.unpack('<L', f.read(4))[0]
264
265 def read_tile(self, offset):
266 with open(self.filename, 'rb') as f:
267 f.seek(offset)
268 size = struct.unpack('<L', f.read(4))[0]
269 if size <= 0:
270 return False
271 return f.read(size)
272
273 def append_tile(self, data, prev_offset):
274 size = len(data)
275 is_new_tile = True
276 with open(self.filename, 'r+b') as f:
277 if prev_offset:
278 f.seek(prev_offset, os.SEEK_SET)
279 if f.tell() == prev_offset:
280 if struct.unpack('<L', f.read(4))[0] > 0:
281 is_new_tile = False
282
283 f.seek(0, os.SEEK_END)
284 offset = f.tell()
285 if offset == 0:
286 f.write(b'\x00' * 16) # header
287 offset = 16
288 f.write(struct.pack('<L', size))
289 f.write(data)
290
291 # update header
292 f.seek(0, os.SEEK_SET)
293 header = list(struct.unpack(BUNDLE_HEADER_STRUCT_FORMAT, f.read(60)))
294 header[2] = max(header[2], size)
295 header[6] += size + 4
296 if is_new_tile:
297 header[4] += 4
298 f.seek(0, os.SEEK_SET)
299 f.write(struct.pack(BUNDLE_HEADER_STRUCT_FORMAT, *header))
300
301 return offset, size
1919
2020 from mapproxy.util.fs import ensure_directory, write_atomic
2121 from mapproxy.image import ImageSource, is_single_color_image
22 from mapproxy.cache import path
2223 from mapproxy.cache.base import TileCacheBase, tile_buffer
23 from mapproxy.compat import string_type
2424
2525 import logging
2626 log = logging.getLogger('mapproxy.cache.file')
3030 This class is responsible to store and load the actual tile data.
3131 """
3232 def __init__(self, cache_dir, file_ext, directory_layout='tc',
33 link_single_color_images=False, lock_timeout=60.0):
33 link_single_color_images=False):
3434 """
3535 :param cache_dir: the path where the tile will be stored
3636 :param file_ext: the file extension that will be appended to
4141 self.cache_dir = cache_dir
4242 self.file_ext = file_ext
4343 self.link_single_color_images = link_single_color_images
44 self._tile_location, self._level_location = path.location_funcs(layout=directory_layout)
45 if self._level_location is None:
46 self.level_location = None # disable level based clean-ups
4447
45 if directory_layout == 'tc':
46 self.tile_location = self._tile_location_tc
47 self.level_location = self._level_location
48 elif directory_layout == 'mp':
49 self.tile_location = self._tile_location_mp
50 self.level_location = self._level_location
51 elif directory_layout == 'tms':
52 self.tile_location = self._tile_location_tms
53 self.level_location = self._level_location_tms
54 elif directory_layout == 'quadkey':
55 self.tile_location = self._tile_location_quadkey
56 self.level_location = self._level_location
57 elif directory_layout == 'arcgis':
58 self.tile_location = self._tile_location_arcgiscache
59 self.level_location = self._level_location_arcgiscache
60 else:
61 raise ValueError('unknown directory_layout "%s"' % directory_layout)
48 def tile_location(self, tile, create_dir=False):
49 return self._tile_location(tile, self.cache_dir, self.file_ext, create_dir=create_dir)
6250
63 def _level_location(self, level):
51 def level_location(self, level):
6452 """
6553 Return the path where all tiles for `level` will be stored.
6654
6755 >>> c = FileCache(cache_dir='/tmp/cache/', file_ext='png')
68 >>> c._level_location(2)
56 >>> c.level_location(2)
6957 '/tmp/cache/02'
7058 """
71 if isinstance(level, string_type):
72 return os.path.join(self.cache_dir, level)
73 else:
74 return os.path.join(self.cache_dir, "%02d" % level)
75
76 def _tile_location_tc(self, tile, create_dir=False):
77 """
78 Return the location of the `tile`. Caches the result as ``location``
79 property of the `tile`.
80
81 :param tile: the tile object
82 :param create_dir: if True, create all necessary directories
83 :return: the full filename of the tile
84
85 >>> from mapproxy.cache.tile import Tile
86 >>> c = FileCache(cache_dir='/tmp/cache/', file_ext='png')
87 >>> c.tile_location(Tile((3, 4, 2))).replace('\\\\', '/')
88 '/tmp/cache/02/000/000/003/000/000/004.png'
89 """
90 if tile.location is None:
91 x, y, z = tile.coord
92 parts = (self._level_location(z),
93 "%03d" % int(x / 1000000),
94 "%03d" % (int(x / 1000) % 1000),
95 "%03d" % (int(x) % 1000),
96 "%03d" % int(y / 1000000),
97 "%03d" % (int(y / 1000) % 1000),
98 "%03d.%s" % (int(y) % 1000, self.file_ext))
99 tile.location = os.path.join(*parts)
100 if create_dir:
101 ensure_directory(tile.location)
102 return tile.location
103
104 def _tile_location_mp(self, tile, create_dir=False):
105 """
106 Return the location of the `tile`. Caches the result as ``location``
107 property of the `tile`.
108
109 :param tile: the tile object
110 :param create_dir: if True, create all necessary directories
111 :return: the full filename of the tile
112
113 >>> from mapproxy.cache.tile import Tile
114 >>> c = FileCache(cache_dir='/tmp/cache/', file_ext='png', directory_layout='mp')
115 >>> c.tile_location(Tile((3, 4, 2))).replace('\\\\', '/')
116 '/tmp/cache/02/0000/0003/0000/0004.png'
117 >>> c.tile_location(Tile((12345678, 98765432, 22))).replace('\\\\', '/')
118 '/tmp/cache/22/1234/5678/9876/5432.png'
119 """
120 if tile.location is None:
121 x, y, z = tile.coord
122 parts = (self._level_location(z),
123 "%04d" % int(x / 10000),
124 "%04d" % (int(x) % 10000),
125 "%04d" % int(y / 10000),
126 "%04d.%s" % (int(y) % 10000, self.file_ext))
127 tile.location = os.path.join(*parts)
128 if create_dir:
129 ensure_directory(tile.location)
130 return tile.location
131
132 def _tile_location_tms(self, tile, create_dir=False):
133 """
134 Return the location of the `tile`. Caches the result as ``location``
135 property of the `tile`.
136
137 :param tile: the tile object
138 :param create_dir: if True, create all necessary directories
139 :return: the full filename of the tile
140
141 >>> from mapproxy.cache.tile import Tile
142 >>> c = FileCache(cache_dir='/tmp/cache/', file_ext='png', directory_layout='tms')
143 >>> c.tile_location(Tile((3, 4, 2))).replace('\\\\', '/')
144 '/tmp/cache/2/3/4.png'
145 """
146 if tile.location is None:
147 x, y, z = tile.coord
148 tile.location = os.path.join(
149 self.level_location(str(z)),
150 str(x), str(y) + '.' + self.file_ext
151 )
152 if create_dir:
153 ensure_directory(tile.location)
154 return tile.location
155
156 def _level_location_tms(self, z):
157 return self._level_location(str(z))
158
159 def _tile_location_quadkey(self, tile, create_dir=False):
160 """
161 Return the location of the `tile`. Caches the result as ``location``
162 property of the `tile`.
163
164 :param tile: the tile object
165 :param create_dir: if True, create all necessary directories
166 :return: the full filename of the tile
167
168 >>> from mapproxy.cache.tile import Tile
169 >>> from mapproxy.cache.file import FileCache
170 >>> c = FileCache(cache_dir='/tmp/cache/', file_ext='png', directory_layout='quadkey')
171 >>> c.tile_location(Tile((3, 4, 2))).replace('\\\\', '/')
172 '/tmp/cache/11.png'
173 """
174 if tile.location is None:
175 x, y, z = tile.coord
176 quadKey = ""
177 for i in range(z,0,-1):
178 digit = 0
179 mask = 1 << (i-1)
180 if (x & mask) != 0:
181 digit += 1
182 if (y & mask) != 0:
183 digit += 2
184 quadKey += str(digit)
185 tile.location = os.path.join(
186 self.cache_dir, quadKey + '.' + self.file_ext
187 )
188 if create_dir:
189 ensure_directory(tile.location)
190 return tile.location
191
192 def _tile_location_arcgiscache(self, tile, create_dir=False):
193 """
194 Return the location of the `tile`. Caches the result as ``location``
195 property of the `tile`.
196
197 :param tile: the tile object
198 :param create_dir: if True, create all necessary directories
199 :return: the full filename of the tile
200
201 >>> from mapproxy.cache.tile import Tile
202 >>> from mapproxy.cache.file import FileCache
203 >>> c = FileCache(cache_dir='/tmp/cache/', file_ext='png', directory_layout='arcgis')
204 >>> c.tile_location(Tile((1234567, 87654321, 9))).replace('\\\\', '/')
205 '/tmp/cache/L09/R05397fb1/C0012d687.png'
206 """
207 if tile.location is None:
208 x, y, z = tile.coord
209 parts = (self._level_location_arcgiscache(z), 'R%08x' % y, 'C%08x.%s' % (x, self.file_ext))
210 tile.location = os.path.join(*parts)
211 if create_dir:
212 ensure_directory(tile.location)
213 return tile.location
214
215 def _level_location_arcgiscache(self, z):
216 return self._level_location('L%02d' % z)
59 return self._level_location(level, self.cache_dir)
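A minimal sketch of the effect of this refactoring, using the 'reverse_tms' layout (paths are illustrative):

    from mapproxy.cache.file import FileCache

    cache = FileCache('/tmp/cache', 'png', directory_layout='reverse_tms')
    # location_funcs() returns no level function for 'reverse_tms',
    # so level-based clean-ups are disabled on this instance:
    assert cache.level_location is None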
21760
21861 def _single_color_tile_location(self, color, create_dir=False):
21962 """
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2011-2013 Omniscale <http://omniscale.de>
2
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement
16
17 import hashlib
18 import logging
19 import os
20 import re
21 import sqlite3
22 import threading
23
24 from mapproxy.cache.base import TileCacheBase, tile_buffer, REMOVE_ON_UNLOCK
25 from mapproxy.compat import BytesIO, PY2, itertools
26 from mapproxy.image import ImageSource
27 from mapproxy.srs import get_epsg_num
28 from mapproxy.util.fs import ensure_directory
29 from mapproxy.util.lock import FileLock
30
31
32 log = logging.getLogger(__name__)
33
34 class GeopackageCache(TileCacheBase):
35 supports_timestamp = False
36
37 def __init__(self, geopackage_file, tile_grid, table_name, with_timestamps=False, timeout=30, wal=False):
38 self.tile_grid = tile_grid
39 self.table_name = self._check_table_name(table_name)
40 self.lock_cache_id = 'gpkg' + hashlib.md5(geopackage_file.encode('utf-8')).hexdigest()
41 self.geopackage_file = geopackage_file
42 # XXX timestamps not implemented
43 self.supports_timestamp = with_timestamps
44 self.timeout = timeout
45 self.wal = wal
46 self.ensure_gpkg()
47 self._db_conn_cache = threading.local()
48
49 @property
50 def db(self):
51 if not getattr(self._db_conn_cache, 'db', None):
52 self.ensure_gpkg()
53 self._db_conn_cache.db = sqlite3.connect(self.geopackage_file, timeout=self.timeout)
54 return self._db_conn_cache.db
55
56 def cleanup(self):
57 """
58 Close all open connections and remove them from the cache.
59 """
60 if getattr(self._db_conn_cache, 'db', None):
61 self._db_conn_cache.db.close()
62 self._db_conn_cache.db = None
63
64 @staticmethod
65 def _check_table_name(table_name):
66 """
67 >>> GeopackageCache._check_table_name("test")
68 'test'
69 >>> GeopackageCache._check_table_name("test_2")
70 'test_2'
71 >>> GeopackageCache._check_table_name("test-2")
72 'test-2'
73 >>> GeopackageCache._check_table_name("test3;")
74 Traceback (most recent call last):
75 ...
76 ValueError: The table_name test3; contains unsupported characters.
77 >>> GeopackageCache._check_table_name("table name")
78 Traceback (most recent call last):
79 ...
80 ValueError: The table_name table name contains unsupported characters.
81
82 @param table_name: A desired name for a geopackage table.
83 @return: The table name if it is valid; raises ValueError otherwise.
84 """
85 # Regex string indicating table names which will be accepted.
86 regex_str = '^[a-zA-Z0-9_-]+$'
87 if re.match(regex_str, table_name):
88 return table_name
89 else:
90 msg = ("The table name may only contain alphanumeric characters, an underscore, "
91 "or a dash: {}".format(regex_str))
92 log.info(msg)
93 raise ValueError("The table_name {0} contains unsupported characters.".format(table_name))
94
95 def ensure_gpkg(self):
96 if not os.path.isfile(self.geopackage_file):
97 with FileLock(self.geopackage_file + '.init.lck',
98 remove_on_unlock=REMOVE_ON_UNLOCK):
99 ensure_directory(self.geopackage_file)
100 self._initialize_gpkg()
101 else:
102 if not self.check_gpkg():
103 ensure_directory(self.geopackage_file)
104 self._initialize_gpkg()
105
106 def check_gpkg(self):
107 if not self._verify_table():
108 return False
109 if not self._verify_gpkg_contents():
110 return False
111 if not self._verify_tile_size():
112 return False
113 return True
114
115 def _verify_table(self):
116 with sqlite3.connect(self.geopackage_file) as db:
117 cur = db.execute("""SELECT name FROM sqlite_master WHERE type='table' AND name=?""",
118 (self.table_name,))
119 content = cur.fetchone()
120 if not content:
121 # Table doesn't exist; _initialize_gpkg will create a new one.
122 return False
123 return True
124
125 def _verify_gpkg_contents(self):
126 with sqlite3.connect(self.geopackage_file) as db:
127 cur = db.execute("""SELECT * FROM gpkg_contents WHERE table_name = ?"""
128 , (self.table_name,))
129
130 results = cur.fetchone()
131 if not results:
132 # Table doesn't exist in gpkg_contents; _initialize_gpkg will add it.
133 return False
134 gpkg_data_type = results[1]
135 gpkg_srs_id = results[9]
136 cur = db.execute("""SELECT * FROM gpkg_spatial_ref_sys WHERE srs_id = ?"""
137 , (gpkg_srs_id,))
138
139 gpkg_coordsys_id = cur.fetchone()[3]
140 if gpkg_data_type.lower() != "tiles":
141 log.info("The geopackage table name already exists for a data type other than tiles.")
142 raise ValueError("table_name is improperly configured.")
143 if gpkg_coordsys_id != get_epsg_num(self.tile_grid.srs.srs_code):
144 log.info(
145 "The geopackage {0} table name {1} already exists and has an SRS of {2}, which does not match the configured" \
146 " Mapproxy SRS of {3}.".format(self.geopackage_file, self.table_name, gpkg_coordsys_id,
147 get_epsg_num(self.tile_grid.srs.srs_code)))
148 raise ValueError("srs is improperly configured.")
149 return True
150
151 def _verify_tile_size(self):
152 with sqlite3.connect(self.geopackage_file) as db:
153 cur = db.execute(
154 """SELECT * FROM gpkg_tile_matrix WHERE table_name = ?""",
155 (self.table_name,))
156
157 results = cur.fetchall()
158 results = results[0]
159 tile_size = self.tile_grid.tile_size
160
161 if not results:
162 # There is no tile conflict. Return to allow the creation of new tiles.
163 return True
164
165 gpkg_table_name, gpkg_zoom_level, gpkg_matrix_width, gpkg_matrix_height, gpkg_tile_width, gpkg_tile_height, \
166 gpkg_pixel_x_size, gpkg_pixel_y_size = results
167 resolution = self.tile_grid.resolution(gpkg_zoom_level)
168 if gpkg_tile_width != tile_size[0] or gpkg_tile_height != tile_size[1]:
169 log.info(
170 "The geopackage {0} table name {1} already exists and has tile sizes of ({2},{3})"
171 " which is different than the configure tile sizes of ({4},{5}).".format(self.geopackage_file,
172 self.table_name,
173 gpkg_tile_width,
174 gpkg_tile_height,
175 tile_size[0],
176 tile_size[1]))
177 log.info("The current mapproxy configuration is invalid for this geopackage.")
178 raise ValueError("tile_size is improperly configured.")
179 if not is_close(gpkg_pixel_x_size, resolution) or not is_close(gpkg_pixel_y_size, resolution):
180 log.info(
181 "The geopackage {0} table name {1} already exists and level {2} a resolution of ({3:.13f},{4:.13f})"
182 " which is different than the configured resolution of ({5:.13f},{6:.13f}).".format(self.geopackage_file,
183 self.table_name,
184 gpkg_zoom_level,
185 gpkg_pixel_x_size,
186 gpkg_pixel_y_size,
187 resolution,
188 resolution))
189 log.info("The current mapproxy configuration is invalid for this geopackage.")
190 raise ValueError("res is improperly configured.")
191 return True
192
193 def _initialize_gpkg(self):
194 log.info('initializing Geopackage file %s', self.geopackage_file)
195 db = sqlite3.connect(self.geopackage_file)
196
197 if self.wal:
198 db.execute('PRAGMA journal_mode=wal')
199
200 proj = get_epsg_num(self.tile_grid.srs.srs_code)
201 stmts = ["""
202 CREATE TABLE IF NOT EXISTS gpkg_contents
203 (table_name TEXT NOT NULL PRIMARY KEY, -- The name of the tiles, or feature table
204 data_type TEXT NOT NULL, -- Type of data stored in the table: "features" per clause Features (http://www.geopackage.org/spec/#features), "tiles" per clause Tiles (http://www.geopackage.org/spec/#tiles), or an implementer-defined value for other data tables per clause in an Extended GeoPackage
205 identifier TEXT UNIQUE, -- A human-readable identifier (e.g. short name) for the table_name content
206 description TEXT DEFAULT '', -- A human-readable description for the table_name content
207 last_change DATETIME NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%fZ','now')), -- Timestamp value in ISO 8601 format as defined by the strftime function %Y-%m-%dT%H:%M:%fZ format string applied to the current time
208 min_x DOUBLE, -- Bounding box minimum easting or longitude for all content in table_name
209 min_y DOUBLE, -- Bounding box minimum northing or latitude for all content in table_name
210 max_x DOUBLE, -- Bounding box maximum easting or longitude for all content in table_name
211 max_y DOUBLE, -- Bounding box maximum northing or latitude for all content in table_name
212 srs_id INTEGER, -- Spatial Reference System ID: gpkg_spatial_ref_sys.srs_id; when data_type is features, SHALL also match gpkg_geometry_columns.srs_id; When data_type is tiles, SHALL also match gpkg_tile_matrix_set.srs.id
213 CONSTRAINT fk_gc_r_srs_id FOREIGN KEY (srs_id) REFERENCES gpkg_spatial_ref_sys(srs_id))
214 """,
215 """
216 CREATE TABLE IF NOT EXISTS gpkg_spatial_ref_sys
217 (srs_name TEXT NOT NULL, -- Human readable name of this SRS (Spatial Reference System)
218 srs_id INTEGER NOT NULL PRIMARY KEY, -- Unique identifier for each Spatial Reference System within a GeoPackage
219 organization TEXT NOT NULL, -- Case-insensitive name of the defining organization e.g. EPSG or epsg
220 organization_coordsys_id INTEGER NOT NULL, -- Numeric ID of the Spatial Reference System assigned by the organization
221 definition TEXT NOT NULL, -- Well-known Text representation of the Spatial Reference System
222 description TEXT)
223 """,
224 """
225 CREATE TABLE IF NOT EXISTS gpkg_tile_matrix
226 (table_name TEXT NOT NULL, -- Tile Pyramid User Data Table Name
227 zoom_level INTEGER NOT NULL, -- 0 <= zoom_level <= max_level for table_name
228 matrix_width INTEGER NOT NULL, -- Number of columns (>= 1) in tile matrix at this zoom level
229 matrix_height INTEGER NOT NULL, -- Number of rows (>= 1) in tile matrix at this zoom level
230 tile_width INTEGER NOT NULL, -- Tile width in pixels (>= 1) for this zoom level
231 tile_height INTEGER NOT NULL, -- Tile height in pixels (>= 1) for this zoom level
232 pixel_x_size DOUBLE NOT NULL, -- In t_table_name srid units or default meters for srid 0 (>0)
233 pixel_y_size DOUBLE NOT NULL, -- In t_table_name srid units or default meters for srid 0 (>0)
234 CONSTRAINT pk_ttm PRIMARY KEY (table_name, zoom_level), CONSTRAINT fk_tmm_table_name FOREIGN KEY (table_name) REFERENCES gpkg_contents(table_name))
235 """,
236 """
237 CREATE TABLE IF NOT EXISTS gpkg_tile_matrix_set
238 (table_name TEXT NOT NULL PRIMARY KEY, -- Tile Pyramid User Data Table Name
239 srs_id INTEGER NOT NULL, -- Spatial Reference System ID: gpkg_spatial_ref_sys.srs_id
240 min_x DOUBLE NOT NULL, -- Bounding box minimum easting or longitude for all content in table_name
241 min_y DOUBLE NOT NULL, -- Bounding box minimum northing or latitude for all content in table_name
242 max_x DOUBLE NOT NULL, -- Bounding box maximum easting or longitude for all content in table_name
243 max_y DOUBLE NOT NULL, -- Bounding box maximum northing or latitude for all content in table_name
244 CONSTRAINT fk_gtms_table_name FOREIGN KEY (table_name) REFERENCES gpkg_contents(table_name), CONSTRAINT fk_gtms_srs FOREIGN KEY (srs_id) REFERENCES gpkg_spatial_ref_sys (srs_id))
245 """,
246 """
247 CREATE TABLE IF NOT EXISTS [{0}]
248 (id INTEGER PRIMARY KEY AUTOINCREMENT, -- Autoincrement primary key
249 zoom_level INTEGER NOT NULL, -- min(zoom_level) <= zoom_level <= max(zoom_level) for t_table_name
250 tile_column INTEGER NOT NULL, -- 0 to tile_matrix matrix_width - 1
251 tile_row INTEGER NOT NULL, -- 0 to tile_matrix matrix_height - 1
252 tile_data BLOB NOT NULL, -- Of an image MIME type specified in clauses Tile Encoding PNG, Tile Encoding JPEG, Tile Encoding WEBP
253 UNIQUE (zoom_level, tile_column, tile_row))
254 """.format(self.table_name)
255 ]
256
257 for stmt in stmts:
258 db.execute(stmt)
259
260 db.execute("PRAGMA foreign_keys = 1;")
261
262 # List of WKT execute statements and data.
263 wkt_statement = """
264 INSERT OR REPLACE INTO gpkg_spatial_ref_sys (
265 srs_id,
266 organization,
267 organization_coordsys_id,
268 srs_name,
269 definition)
270 VALUES (?, ?, ?, ?, ?)
271 """
272 wkt_entries = [(3857, 'epsg', 3857, 'WGS 84 / Pseudo-Mercator',
273 """
274 PROJCS["WGS 84 / Pseudo-Mercator",GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,\
275 AUTHORITY["EPSG","7030"]],AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],\
276 UNIT["degree",0.0174532925199433,AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","9122"]]AUTHORITY["EPSG","4326"]],\
277 PROJECTION["Mercator_1SP"],PARAMETER["central_meridian",0],PARAMETER["scale_factor",1],PARAMETER["false_easting",0],\
278 PARAMETER["false_northing",0],UNIT["metre",1,AUTHORITY["EPSG","9001"]],AXIS["X",EAST],AXIS["Y",NORTH]\
279 """
280 ),
281 (4326, 'epsg', 4326, 'WGS 84',
282 """
283 GEOGCS["WGS 84",DATUM["WGS_1984",SPHEROID["WGS 84",6378137,298.257223563,AUTHORITY["EPSG","7030"]],\
284 AUTHORITY["EPSG","6326"]],PRIMEM["Greenwich",0,AUTHORITY["EPSG","8901"]],UNIT["degree",0.0174532925199433,\
285 AUTHORITY["EPSG","9122"]],AUTHORITY["EPSG","4326"]]\
286 """
287 ),
288 (-1, 'NONE', -1, ' ', 'undefined'),
289 (0, 'NONE', 0, ' ', 'undefined')
290 ]
291
292 if get_epsg_num(self.tile_grid.srs.srs_code) not in [4326, 3857]:
293 wkt_entries.append((proj, 'epsg', proj, 'Not provided', "Added via Mapproxy."))
294 db.commit()
295
296 # Add geopackage version to the header (1.0)
297 db.execute("PRAGMA application_id = 1196437808;")
298 db.commit()
299
300 for wkt_entry in wkt_entries:
301 try:
302 db.execute(wkt_statement, (wkt_entry[0], wkt_entry[1], wkt_entry[2], wkt_entry[3], wkt_entry[4]))
303 except sqlite3.IntegrityError:
304 log.info("srs_id already exists.".format(wkt_entry[0]))
305 db.commit()
306
307 # Ensure that tile table exists here, don't overwrite a valid entry.
308 try:
309 db.execute("""
310 INSERT INTO gpkg_contents (
311 table_name,
312 data_type,
313 identifier,
314 description,
315 min_x,
316 max_x,
317 min_y,
318 max_y,
319 srs_id)
320 VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?);
321 """, (self.table_name,
322 "tiles",
323 self.table_name,
324 "Created with Mapproxy.",
325 self.tile_grid.bbox[0],
326 self.tile_grid.bbox[2],
327 self.tile_grid.bbox[1],
328 self.tile_grid.bbox[3],
329 proj))
330 except sqlite3.IntegrityError:
331 pass
332 db.commit()
333
334 # Ensure that tile set exists here, don't overwrite a valid entry.
335 try:
336 db.execute("""
337 INSERT INTO gpkg_tile_matrix_set (table_name, srs_id, min_x, max_x, min_y, max_y)
338 VALUES (?, ?, ?, ?, ?, ?);
339 """, (
340 self.table_name, proj, self.tile_grid.bbox[0], self.tile_grid.bbox[2], self.tile_grid.bbox[1],
341 self.tile_grid.bbox[3]))
342 except sqlite3.IntegrityError:
343 pass
344 db.commit()
345
346 tile_size = self.tile_grid.tile_size
347 for grid, resolution, level in zip(self.tile_grid.grid_sizes,
348 self.tile_grid.resolutions, range(20)):
349 db.execute("""INSERT OR REPLACE INTO gpkg_tile_matrix
350 (table_name, zoom_level, matrix_width, matrix_height, tile_width, tile_height, pixel_x_size, pixel_y_size)
351 VALUES(?, ?, ?, ?, ?, ?, ?, ?)
352 """,
353 (self.table_name, level, grid[0], grid[1], tile_size[0], tile_size[1], resolution, resolution))
354 db.commit()
355 db.close()
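The application_id set above is the GeoPackage 1.0 magic number; a standalone check:

    import struct

    # 1196437808 is the byte string 'GP10' read as a big-endian 32-bit integer
    assert struct.unpack('>i', b'GP10')[0] == 1196437808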
356
357 def is_cached(self, tile):
358 if tile.coord is None:
359 return True
360 if tile.source:
361 return True
362
363 return self.load_tile(tile)
364
365
366 def store_tile(self, tile):
367 if tile.stored:
368 return True
369 return self._store_bulk([tile])
370
371 def store_tiles(self, tiles):
372 tiles = [t for t in tiles if not t.stored]
373 return self._store_bulk(tiles)
374
375
376 def _store_bulk(self, tiles):
377 records = []
378 # tile_buffer (as_buffer) will encode the tile to the target format
379 # we collect all tiles first, to avoid keeping the db transaction
380 # open during this slow encoding
381 for tile in tiles:
382 with tile_buffer(tile) as buf:
383 if PY2:
384 content = buffer(buf.read())
385 else:
386 content = buf.read()
387 x, y, level = tile.coord
388 records.append((level, x, y, content))
389
390 cursor = self.db.cursor()
391 try:
392 stmt = "INSERT OR REPLACE INTO [{0}] (zoom_level, tile_column, tile_row, tile_data) VALUES (?,?,?,?)".format(
393 self.table_name)
394 cursor.executemany(stmt, records)
395 self.db.commit()
396 except sqlite3.OperationalError as ex:
397 log.warn('unable to store tile: %s', ex)
398 return False
399 return True
400
401 def load_tile(self, tile, with_metadata=False):
402 if tile.source or tile.coord is None:
403 return True
404
405 cur = self.db.cursor()
406 cur.execute("""SELECT tile_data FROM [{0}]
407 WHERE tile_column = ? AND
408 tile_row = ? AND
409 zoom_level = ?""".format(self.table_name), tile.coord)
410
411 content = cur.fetchone()
412 if content:
413 tile.source = ImageSource(BytesIO(content[0]))
414 return True
415 else:
416 return False
417
418 def load_tiles(self, tiles, with_metadata=False):
419 # associate the right tiles with the cursor
420 tile_dict = {}
421 coords = []
422 for tile in tiles:
423 if tile.source or tile.coord is None:
424 continue
425 x, y, level = tile.coord
426 coords.append(x)
427 coords.append(y)
428 coords.append(level)
429 tile_dict[(x, y)] = tile
430
431 if not tile_dict:
432 # all tiles loaded or coords are None
433 return True
434
435 stmt_base = "SELECT tile_column, tile_row, tile_data FROM [{0}] WHERE ".format(self.table_name)
436
437 loaded_tiles = 0
438
439 # SQLite limits the number of bound variables (default 999) -> split into multiple requests if more arguments are needed
440 while coords:
441 cur_coords = coords[:999]
442
443 stmt = stmt_base + ' OR '.join(
444 ['(tile_column = ? AND tile_row = ? AND zoom_level = ?)'] * (len(cur_coords) // 3))
445
446 cursor = self.db.cursor()
447 cursor.execute(stmt, cur_coords)
448
449 for row in cursor:
450 loaded_tiles += 1
451 tile = tile_dict[(row[0], row[1])]
452 data = row[2]
453 tile.size = len(data)
454 tile.source = ImageSource(BytesIO(data))
455 cursor.close()
456
457 coords = coords[999:]
458
459 return loaded_tiles == len(tile_dict)
460
461 def remove_tile(self, tile):
462 cursor = self.db.cursor()
463 cursor.execute(
464 "DELETE FROM [{0}] WHERE (tile_column = ? AND tile_row = ? AND zoom_level = ?)".format(self.table_name),
465 tile.coord)
466 self.db.commit()
467 if cursor.rowcount:
468 return True
469 return False
470
471 def remove_level_tiles_before(self, level, timestamp):
472 if timestamp == 0:
473 cursor = self.db.cursor()
474 cursor.execute(
475 "DELETE FROM [{0}] WHERE (zoom_level = ?)".format(self.table_name), (level,))
476 self.db.commit()
477 log.info("Cursor rowcount = {0}".format(cursor.rowcount))
478 if cursor.rowcount:
479 return True
480 return False
481
482 def load_tile_metadata(self, tile):
483 self.load_tile(tile)
484
485
486 class GeopackageLevelCache(TileCacheBase):
487
488 def __init__(self, geopackage_dir, tile_grid, table_name, timeout=30, wal=False):
489 self.lock_cache_id = 'gpkg-' + hashlib.md5(geopackage_dir.encode('utf-8')).hexdigest()
490 self.cache_dir = geopackage_dir
491 self.tile_grid = tile_grid
492 self.table_name = table_name
493 self.timeout = timeout
494 self.wal = wal
495 self._geopackage = {}
496 self._geopackage_lock = threading.Lock()
497
498 def _get_level(self, level):
499 if level in self._geopackage:
500 return self._geopackage[level]
501
502 with self._geopackage_lock:
503 if level not in self._geopackage:
504 geopackage_filename = os.path.join(self.cache_dir, '%s.gpkg' % level)
505 self._geopackage[level] = GeopackageCache(
506 geopackage_filename,
507 self.tile_grid,
508 self.table_name,
509 with_timestamps=True,
510 timeout=self.timeout,
511 wal=self.wal,
512 )
513
514 return self._geopackage[level]
515
516 def cleanup(self):
517 """
518 Close all open connections and remove them from the cache.
519 """
520 with self._geopackage_lock:
521 for gp in self._geopackage.values():
522 gp.cleanup()
523
524 def is_cached(self, tile):
525 if tile.coord is None:
526 return True
527 if tile.source:
528 return True
529
530 return self._get_level(tile.coord[2]).is_cached(tile)
531
532 def store_tile(self, tile):
533 if tile.stored:
534 return True
535
536 return self._get_level(tile.coord[2]).store_tile(tile)
537
538 def store_tiles(self, tiles):
539 failed = False
540 for level, tiles in itertools.groupby(tiles, key=lambda t: t.coord[2]):
541 tiles = [t for t in tiles if not t.stored]
542 res = self._get_level(level).store_tiles(tiles)
543 if not res: failed = True
544 return not failed
545
546 def load_tile(self, tile, with_metadata=False):
547 if tile.source or tile.coord is None:
548 return True
549
550 return self._get_level(tile.coord[2]).load_tile(tile, with_metadata=with_metadata)
551
552 def load_tiles(self, tiles, with_metadata=False):
553 level = None
554 for tile in tiles:
555 if tile.source or tile.coord is None:
556 continue
557 level = tile.coord[2]
558 break
559
560 if level is None:
561 return True
562
563 return self._get_level(level).load_tiles(tiles, with_metadata=with_metadata)
564
565 def remove_tile(self, tile):
566 if tile.coord is None:
567 return True
568
569 return self._get_level(tile.coord[2]).remove_tile(tile)
570
571 def remove_level_tiles_before(self, level, timestamp):
572 level_cache = self._get_level(level)
573 if timestamp == 0:
574 level_cache.cleanup()
575 os.unlink(level_cache.geopackage_file)
576 return True
577 else:
578 return level_cache.remove_level_tiles_before(level, timestamp)
579
580
581 def is_close(a, b, rel_tol=1e-09, abs_tol=0.0):
582 """
583 See PEP 485, added here for legacy versions.
584
585 >>> is_close(0.0, 0.0)
586 True
587 >>> is_close(1, 1.0)
588 True
589 >>> is_close(0.01, 0.001)
590 False
591 >>> is_close(0.0001001, 0.0001, rel_tol=1e-02)
592 True
593 >>> is_close(0.0001001, 0.0001)
594 False
595
596 @param a: An int or float.
597 @param b: An int or float.
598 @param rel_tol: Relative tolerance - maximum allowed relative difference between a and b.
599 @param abs_tol: Absolute tolerance - minimum absolute difference, useful for comparisons near zero.
600 @return: True if the values a and b are close.
601
602 """
603 return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)
2020 import time
2121
2222 from mapproxy.image import ImageSource
23 from mapproxy.cache.base import TileCacheBase, tile_buffer, CacheBackendError
23 from mapproxy.cache.base import TileCacheBase, tile_buffer, REMOVE_ON_UNLOCK
2424 from mapproxy.util.fs import ensure_directory
2525 from mapproxy.util.lock import FileLock
26 from mapproxy.compat import BytesIO, PY2
26 from mapproxy.compat import BytesIO, PY2, itertools
2727
2828 import logging
2929 log = logging.getLogger(__name__)
3737 class MBTilesCache(TileCacheBase):
3838 supports_timestamp = False
3939
40 def __init__(self, mbtile_file, with_timestamps=False):
40 def __init__(self, mbtile_file, with_timestamps=False, timeout=30, wal=False):
4141 self.lock_cache_id = 'mbtiles-' + hashlib.md5(mbtile_file.encode('utf-8')).hexdigest()
4242 self.mbtile_file = mbtile_file
4343 self.supports_timestamp = with_timestamps
44 self.timeout = timeout
45 self.wal = wal
4446 self.ensure_mbtile()
4547 self._db_conn_cache = threading.local()
4648
4850 def db(self):
4951 if not getattr(self._db_conn_cache, 'db', None):
5052 self.ensure_mbtile()
51 self._db_conn_cache.db = sqlite3.connect(self.mbtile_file)
53 self._db_conn_cache.db = sqlite3.connect(self.mbtile_file, self.timeout)
5254 return self._db_conn_cache.db
5355
5456 def cleanup(self):
6163
6264 def ensure_mbtile(self):
6365 if not os.path.exists(self.mbtile_file):
64 with FileLock(os.path.join(os.path.dirname(self.mbtile_file), 'init.lck'),
65 remove_on_unlock=True):
66 with FileLock(self.mbtile_file + '.init.lck',
67 remove_on_unlock=REMOVE_ON_UNLOCK):
6668 if not os.path.exists(self.mbtile_file):
6769 ensure_directory(self.mbtile_file)
6870 self._initialize_mbtile()
7072 def _initialize_mbtile(self):
7173 log.info('initializing MBTile file %s', self.mbtile_file)
7274 db = sqlite3.connect(self.mbtile_file)
75
76 if self.wal:
77 db.execute('PRAGMA journal_mode=wal')
78
7379 stmt = """
7480 CREATE TABLE tiles (
7581 zoom_level integer,
134140 def store_tile(self, tile):
135141 if tile.stored:
136142 return True
137 with tile_buffer(tile) as buf:
138 if PY2:
139 content = buffer(buf.read())
143 return self._store_bulk([tile])
144
145 def store_tiles(self, tiles):
146 tiles = [t for t in tiles if not t.stored]
147 return self._store_bulk(tiles)
148
149 def _store_bulk(self, tiles):
150 records = []
151 # tile_buffer (as_buffer) will encode the tile to the target format
152 # we collect all tiles first, to avoid keeping the db transaction
153 # open during this slow encoding
154 for tile in tiles:
155 with tile_buffer(tile) as buf:
156 if PY2:
157 content = buffer(buf.read())
158 else:
159 content = buf.read()
160 x, y, level = tile.coord
161 if self.supports_timestamp:
162 records.append((level, x, y, content, time.time()))
163 else:
164 records.append((level, x, y, content))
165
166 cursor = self.db.cursor()
167 try:
168 if self.supports_timestamp:
169 stmt = "INSERT OR REPLACE INTO tiles (zoom_level, tile_column, tile_row, tile_data, last_modified) VALUES (?,?,?,?, datetime(?, 'unixepoch', 'localtime'))"
170 cursor.executemany(stmt, records)
140171 else:
141 content = buf.read()
142 x, y, level = tile.coord
143 cursor = self.db.cursor()
144 try:
145 if self.supports_timestamp:
146 stmt = "INSERT OR REPLACE INTO tiles (zoom_level, tile_column, tile_row, tile_data, last_modified) VALUES (?,?,?,?, datetime(?, 'unixepoch', 'localtime'))"
147 cursor.execute(stmt, (level, x, y, content, time.time()))
148 else:
149 stmt = "INSERT OR REPLACE INTO tiles (zoom_level, tile_column, tile_row, tile_data) VALUES (?,?,?,?)"
150 cursor.execute(stmt, (level, x, y, content))
151 self.db.commit()
152 except sqlite3.OperationalError as ex:
153 log.warn('unable to store tile: %s', ex)
154 return False
155 return True
172 stmt = "INSERT OR REPLACE INTO tiles (zoom_level, tile_column, tile_row, tile_data) VALUES (?,?,?,?)"
173 cursor.executemany(stmt, records)
174 self.db.commit()
175 except sqlite3.OperationalError as ex:
176 log.warn('unable to store tile: %s', ex)
177 return False
178 return True
156179
157180 def load_tile(self, tile, with_metadata=False):
158181 if tile.source or tile.coord is None:
270293 class MBTilesLevelCache(TileCacheBase):
271294 supports_timestamp = True
272295
273 def __init__(self, mbtiles_dir):
296 def __init__(self, mbtiles_dir, timeout=30, wal=False):
274297 self.lock_cache_id = 'sqlite-' + hashlib.md5(mbtiles_dir.encode('utf-8')).hexdigest()
275298 self.cache_dir = mbtiles_dir
276299 self._mbtiles = {}
300 self.timeout = timeout
301 self.wal = wal
277302 self._mbtiles_lock = threading.Lock()
278303
279304 def _get_level(self, level):
286311 self._mbtiles[level] = MBTilesCache(
287312 mbtile_filename,
288313 with_timestamps=True,
314 timeout=self.timeout,
315 wal=self.wal,
289316 )
290317
291318 return self._mbtiles[level]
311338 return True
312339
313340 return self._get_level(tile.coord[2]).store_tile(tile)
341
342 def store_tiles(self, tiles):
343 failed = False
344 for level, tiles in itertools.groupby(tiles, key=lambda t: t.coord[2]):
345 tiles = [t for t in tiles if not t.stored]
346 res = self._get_level(level).store_tiles(tiles)
347 if not res: failed = True
348 return not failed
314349
315350 def load_tile(self, tile, with_metadata=False):
316351 if tile.source or tile.coord is None:
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2010-2016 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 import os
16 from mapproxy.compat import string_type
17 from mapproxy.util.fs import ensure_directory
18
19
20 def location_funcs(layout):
21 if layout == 'tc':
22 return tile_location_tc, level_location
23 elif layout == 'mp':
24 return tile_location_mp, level_location
25 elif layout == 'tms':
26 return tile_location_tms, level_location
27 elif layout == 'reverse_tms':
28 return tile_location_reverse_tms, None
29 elif layout == 'quadkey':
30 return tile_location_quadkey, no_level_location
31 elif layout == 'arcgis':
32 return tile_location_arcgiscache, level_location_arcgiscache
33 else:
34 raise ValueError('unknown directory_layout "%s"' % layout)
35
36 def level_location(level, cache_dir):
37 """
38 Return the path where all tiles for `level` will be stored.
39
40 >>> level_location(2, '/tmp/cache')
41 '/tmp/cache/02'
42 """
43 if isinstance(level, string_type):
44 return os.path.join(cache_dir, level)
45 else:
46 return os.path.join(cache_dir, "%02d" % level)
47
48
49 def level_part(level):
50 """
51 Return the level directory name (the path component for `level`).
52
53 >>> level_part(2)
54 '02'
55 >>> level_part('2')
56 '2'
57 """
58 if isinstance(level, string_type):
59 return level
60 else:
61 return "%02d" % level
62
63
64 def tile_location_tc(tile, cache_dir, file_ext, create_dir=False):
65 """
66 Return the location of the `tile`. Caches the result as ``location``
67 property of the `tile`.
68
69 :param tile: the tile object
70 :param create_dir: if True, create all necessary directories
71 :return: the full filename of the tile
72
73 >>> from mapproxy.cache.tile import Tile
74 >>> tile_location_tc(Tile((3, 4, 2)), '/tmp/cache', 'png').replace('\\\\', '/')
75 '/tmp/cache/02/000/000/003/000/000/004.png'
76 """
77 if tile.location is None:
78 x, y, z = tile.coord
79 parts = (cache_dir,
80 level_part(z),
81 "%03d" % int(x / 1000000),
82 "%03d" % (int(x / 1000) % 1000),
83 "%03d" % (int(x) % 1000),
84 "%03d" % int(y / 1000000),
85 "%03d" % (int(y / 1000) % 1000),
86 "%03d.%s" % (int(y) % 1000, file_ext))
87 tile.location = os.path.join(*parts)
88 if create_dir:
89 ensure_directory(tile.location)
90 return tile.location
91
92 def tile_location_mp(tile, cache_dir, file_ext, create_dir=False):
93 """
94 Return the location of the `tile`. Caches the result as ``location``
95 property of the `tile`.
96
97 :param tile: the tile object
98 :param create_dir: if True, create all necessary directories
99 :return: the full filename of the tile
100
101 >>> from mapproxy.cache.tile import Tile
102 >>> tile_location_mp(Tile((3, 4, 2)), '/tmp/cache', 'png').replace('\\\\', '/')
103 '/tmp/cache/02/0000/0003/0000/0004.png'
104 >>> tile_location_mp(Tile((12345678, 98765432, 22)), '/tmp/cache', 'png').replace('\\\\', '/')
105 '/tmp/cache/22/1234/5678/9876/5432.png'
106 """
107 if tile.location is None:
108 x, y, z = tile.coord
109 parts = (cache_dir,
110 level_part(z),
111 "%04d" % int(x / 10000),
112 "%04d" % (int(x) % 10000),
113 "%04d" % int(y / 10000),
114 "%04d.%s" % (int(y) % 10000, file_ext))
115 tile.location = os.path.join(*parts)
116 if create_dir:
117 ensure_directory(tile.location)
118 return tile.location
119
120 def tile_location_tms(tile, cache_dir, file_ext, create_dir=False):
121 """
122 Return the location of the `tile`. Caches the result as ``location``
123 property of the `tile`.
124
125 :param tile: the tile object
126 :param create_dir: if True, create all necessary directories
127 :return: the full filename of the tile
128
129 >>> from mapproxy.cache.tile import Tile
130 >>> tile_location_tms(Tile((3, 4, 2)), '/tmp/cache', 'png').replace('\\\\', '/')
131 '/tmp/cache/2/3/4.png'
132 """
133 if tile.location is None:
134 x, y, z = tile.coord
135 tile.location = os.path.join(
136 cache_dir, level_part(str(z)),
137 str(x), str(y) + '.' + file_ext
138 )
139 if create_dir:
140 ensure_directory(tile.location)
141 return tile.location
142
143 def tile_location_reverse_tms(tile, cache_dir, file_ext, create_dir=False):
144 """
145 Return the location of the `tile`. Caches the result as ``location``
146 property of the `tile`.
147
148 :param tile: the tile object
149 :param create_dir: if True, create all necessary directories
150 :return: the full filename of the tile
151
152 >>> from mapproxy.cache.tile import Tile
153 >>> tile_location_reverse_tms(Tile((3, 4, 2)), '/tmp/cache', 'png').replace('\\\\', '/')
154 '/tmp/cache/4/3/2.png'
155 """
156 if tile.location is None:
157 x, y, z = tile.coord
158 tile.location = os.path.join(
159 cache_dir, str(y), str(x), str(z) + '.' + file_ext
160 )
161 if create_dir:
162 ensure_directory(tile.location)
163 return tile.location
164
165 def level_location_tms(level, cache_dir):
166 return level_location(str(level), cache_dir=cache_dir)
167
168 def tile_location_quadkey(tile, cache_dir, file_ext, create_dir=False):
169 """
170 Return the location of the `tile`. Caches the result as ``location``
171 property of the `tile`.
172
173 :param tile: the tile object
174 :param create_dir: if True, create all necessary directories
175 :return: the full filename of the tile
176
177 >>> from mapproxy.cache.tile import Tile
178 >>> tile_location_quadkey(Tile((3, 4, 2)), '/tmp/cache', 'png').replace('\\\\', '/')
179 '/tmp/cache/11.png'
180 """
181 if tile.location is None:
182 x, y, z = tile.coord
183 quadKey = ""
184 for i in range(z,0,-1):
185 digit = 0
186 mask = 1 << (i-1)
187 if (x & mask) != 0:
188 digit += 1
189 if (y & mask) != 0:
190 digit += 2
191 quadKey += str(digit)
192 tile.location = os.path.join(
193 cache_dir, quadKey + '.' + file_ext
194 )
195 if create_dir:
196 ensure_directory(tile.location)
197 return tile.location
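The bit interleaving above can be verified with a standalone sketch (it mirrors the loop; the first assertion matches the doctest result):

    def quadkey(x, y, z):
        key = ""
        for i in range(z, 0, -1):
            mask = 1 << (i - 1)
            digit = (1 if x & mask else 0) + (2 if y & mask else 0)
            key += str(digit)
        return key

    assert quadkey(3, 4, 2) == "11"
    assert quadkey(3, 4, 3) == "211"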
198
199 def no_level_location(level, cache_dir):
200 # dummy for quadkey cache which stores all tiles in one directory
201 raise NotImplementedError('cache does not have any level location')
202
203 def tile_location_arcgiscache(tile, cache_dir, file_ext, create_dir=False):
204 """
205 Return the location of the `tile`. Caches the result as ``location``
206 property of the `tile`.
207
208 :param tile: the tile object
209 :param create_dir: if True, create all necessary directories
210 :return: the full filename of the tile
211
212 >>> from mapproxy.cache.tile import Tile
213 >>> tile_location_arcgiscache(Tile((1234567, 87654321, 9)), '/tmp/cache', 'png').replace('\\\\', '/')
214 '/tmp/cache/L09/R05397fb1/C0012d687.png'
215 """
216 if tile.location is None:
217 x, y, z = tile.coord
218 parts = (cache_dir, 'L%02d' % z, 'R%08x' % y, 'C%08x.%s' % (x, file_ext))
219 tile.location = os.path.join(*parts)
220 if create_dir:
221 ensure_directory(tile.location)
222 return tile.location
223
224 def level_location_arcgiscache(z, cache_dir):
225 return level_location('L%02d' % z, cache_dir=cache_dir)
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2017 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement, absolute_import
16
17 import hashlib
18
19 from mapproxy.image import ImageSource
20 from mapproxy.cache.base import (
21 TileCacheBase,
22 tile_buffer,
23 )
24 from mapproxy.compat import BytesIO
25
26 try:
27 import redis
28 except ImportError:
29 redis = None
30
31
32 import logging
33 log = logging.getLogger(__name__)
34
35
36 class RedisCache(TileCacheBase):
37 def __init__(self, host, port, prefix, ttl=0, db=0):
38 if redis is None:
39 raise ImportError("Redis backend requires 'redis' package.")
40
41 self.prefix = prefix
42 self.lock_cache_id = 'redis-' + hashlib.md5((host + str(port) + prefix + str(db)).encode('utf-8')).hexdigest()
43 self.ttl = ttl
44 self.r = redis.StrictRedis(host=host, port=port, db=db)
45
46 def _key(self, tile):
47 x, y, z = tile.coord
48 return self.prefix + '-%d-%d-%d' % (z, x, y)
49
50 def is_cached(self, tile):
51 if tile.coord is None or tile.source:
52 return True
53
54 return self.r.exists(self._key(tile))
55
56 def store_tile(self, tile):
57 if tile.stored:
58 return True
59
60 key = self._key(tile)
61
62 with tile_buffer(tile) as buf:
63 data = buf.read()
64
65 r = self.r.set(key, data)
66 if self.ttl:
67 # use ms expire times for unit-tests
68 self.r.pexpire(key, int(self.ttl * 1000))
69 return r
70
71 def load_tile(self, tile, with_metadata=False):
72 if tile.source or tile.coord is None:
73 return True
74 key = self._key(tile)
75 tile_data = self.r.get(key)
76 if tile_data:
77 tile.source = ImageSource(BytesIO(tile_data))
78 return True
79 return False
80
81 def remove_tile(self, tile):
82 if tile.coord is None:
83 return True
84
85 key = self._key(tile)
86 self.r.delete(key)
87 return True
2929 from mapproxy.client.log import log_request
3030 from mapproxy.cache.tile import TileCreator, Tile
3131 from mapproxy.source import SourceError
32 from mapproxy.util.lock import LockTimeout
3233
3334 def has_renderd_support():
3435 if not json or not requests:
7071 if result['status'] == 'error':
7172 log_request(address, 500, None, duration=duration, method='RENDERD')
7273 raise SourceError("Error from renderd: %s" % result.get('error_message', 'unknown error from renderd'))
74 elif result['status'] == 'lock':
75 log_request(address, 503, None, duration=duration, method='RENDERD')
76 raise LockTimeout("Lock timeout from renderd: %s" % result.get('error_message', 'unknown lock timeout error from renderd'))
7377
7478 log_request(address, 200, None, duration=duration, method='RENDERD')
7579
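The status dispatch above handles three renderd response shapes (fields as used in this hunk; values illustrative):

    # {'status': 'error', 'error_message': '...'} -> SourceError, logged as 500
    # {'status': 'lock',  'error_message': '...'} -> LockTimeout, logged as 503
    # any other status falls through and is logged as 200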
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2016 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement
16
17 import hashlib
18 import sys
19 import threading
20
21 from mapproxy.image import ImageSource
22 from mapproxy.cache import path
23 from mapproxy.cache.base import tile_buffer, TileCacheBase
24 from mapproxy.util import async
25 from mapproxy.util.py import reraise_exception
26
27 try:
28 import boto3
29 import botocore
30 except ImportError:
31 boto3 = None
32
33
34 import logging
35 log = logging.getLogger('mapproxy.cache.s3')
36
37
38 _s3_sessions_cache = threading.local()
39 def s3_session(profile_name=None):
40 if not hasattr(_s3_sessions_cache, 'sessions'):
41 _s3_sessions_cache.sessions = {}
42 if profile_name not in _s3_sessions_cache.sessions:
43 _s3_sessions_cache.sessions[profile_name] = boto3.session.Session(profile_name=profile_name)
44 return _s3_sessions_cache.sessions[profile_name]
45
46 class S3ConnectionError(Exception):
47 pass
48
49 class S3Cache(TileCacheBase):
50
51 def __init__(self, base_path, file_ext, directory_layout='tms',
52 bucket_name='mapproxy', profile_name=None,
53 _concurrent_writer=4):
54 super(S3Cache, self).__init__()
55 self.lock_cache_id = hashlib.md5(base_path.encode('utf-8') + bucket_name.encode('utf-8')).hexdigest()
56 self.bucket_name = bucket_name
57 try:
58 self.bucket = self.conn().head_bucket(Bucket=bucket_name)
59 except botocore.exceptions.ClientError as e:
60 if e.response['Error']['Code'] == '404':
61 raise S3ConnectionError('No such bucket: %s' % bucket_name)
62 elif e.response['Error']['Code'] == '403':
63 raise S3ConnectionError('Access denied. Check your credentials')
64 else:
65 reraise_exception(
66 S3ConnectionError('Unknown error: %s' % e),
67 sys.exc_info(),
68 )
69
70 self.base_path = base_path
71 self.file_ext = file_ext
72 self._concurrent_writer = _concurrent_writer
73
74 self._tile_location, _ = path.location_funcs(layout=directory_layout)
75
76 def tile_key(self, tile):
77 return self._tile_location(tile, self.base_path, self.file_ext).lstrip('/')
78
79 def conn(self):
80 if boto3 is None:
81 raise ImportError("S3 Cache requires 'boto3' package.")
82
83 try:
84 return s3_session().client("s3")
85 except Exception as e:
86 raise S3ConnectionError('Error during connection: %s' % e)
87
88 def load_tile_metadata(self, tile):
89 if tile.timestamp:
90 return
91 self.is_cached(tile)
92
93 def _set_metadata(self, response, tile):
94 if 'LastModified' in response:
95 tile.timestamp = float(response['LastModified'].strftime('%s'))
96 if 'ContentLength' in response:
97 tile.size = response['ContentLength']
98
99 def is_cached(self, tile):
100 if tile.is_missing():
101 key = self.tile_key(tile)
102 try:
103 r = self.conn().head_object(Bucket=self.bucket_name, Key=key)
104 self._set_metadata(r, tile)
105 except botocore.exceptions.ClientError as e:
106 if e.response['Error']['Code'] in ('404', 'NoSuchKey'):
107 return False
108 raise
109
110 return True
111
112 def load_tiles(self, tiles, with_metadata=True):
113 p = async.Pool(min(4, len(tiles)))
114 return all(p.map(self.load_tile, tiles))
115
116 def load_tile(self, tile, with_metadata=True):
117 if not tile.is_missing():
118 return True
119
120 key = self.tile_key(tile)
121 log.debug('S3:load_tile, key: %s' % key)
122
123 try:
124 r = self.conn().get_object(Bucket=self.bucket_name, Key=key)
125 self._set_metadata(r, tile)
126 tile.source = ImageSource(r['Body'])
127 except botocore.exceptions.ClientError as e:
128 error = e.response.get('Errors', e.response)['Error'] # moto get_object can return Error wrapped in Errors...
129 if error['Code'] in ('404', 'NoSuchKey'):
130 return False
131 raise
132
133 return True
134
135 def remove_tile(self, tile):
136 key = self.tile_key(tile)
137 log.debug('remove_tile, key: %s' % key)
138 self.conn().delete_object(Bucket=self.bucket_name, Key=key)
139
140 def store_tiles(self, tiles):
141 p = async.Pool(min(self._concurrent_writer, len(tiles)))
142 p.map(self.store_tile, tiles)
143
144 def store_tile(self, tile):
145 if tile.stored:
146 return
147
148 key = self.tile_key(tile)
149 log.debug('S3: store_tile, key: %s' % key)
150
151 extra_args = {}
152 if self.file_ext in ('jpeg', 'png'):
153 extra_args['ContentType'] = 'image/' + self.file_ext
154 with tile_buffer(tile) as buf:
155 self.conn().upload_fileobj(
156 NopCloser(buf), # upload_fileobj closes buf, wrap in NopCloser
157 self.bucket_name,
158 key,
159 ExtraArgs=extra_args)
160
161 class NopCloser(object):
162 def __init__(self, wrapped):
163 self.wrapped = wrapped
164
165 def close(self):
166 pass
167
168 def __getattr__(self, name):
169 return getattr(self.wrapped, name)
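An illustrative construction (bucket name and base path are assumptions; boto3 credentials must be configured and the bucket must exist, since __init__ issues a head_bucket call):

    cache = S3Cache(base_path='/tiles/osm', file_ext='png',
                    directory_layout='tms', bucket_name='my-tile-bucket')
    # tile_key() strips the leading slash for S3:
    # Tile((3, 4, 2)) -> 'tiles/osm/2/3/4.png'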
3636
3737 from __future__ import with_statement
3838
39 from functools import partial
3940 from contextlib import contextmanager
4041 from mapproxy.grid import MetaGrid
4142 from mapproxy.image.merge import merge_images
4243 from mapproxy.image.tile import TileSplitter
4344 from mapproxy.layer import MapQuery, BlankImage
4445 from mapproxy.util import async
46 from mapproxy.util.py import reraise
4547
4648 class TileManager(object):
4749 """
5557 """
5658 def __init__(self, grid, cache, sources, format, locker, image_opts=None, request_format=None,
5759 meta_buffer=None, meta_size=None, minimize_meta_requests=False, identifier=None,
58 pre_store_filter=None, concurrent_tile_creators=1, tile_creator_class=None):
60 pre_store_filter=None, concurrent_tile_creators=1, tile_creator_class=None,
61 bulk_meta_tiles=False,
62 ):
5963 self.grid = grid
6064 self.cache = cache
6165 self.locker = locker
7781 self.meta_grid = MetaGrid(grid, meta_size=meta_size, meta_buffer=meta_buffer)
7882 elif any(source.supports_meta_tiles for source in sources):
7983 raise ValueError('meta tiling configured but not supported by all sources')
84 elif meta_size and not meta_size == [1, 1] and bulk_meta_tiles:
85 # meta tiles configured but all sources are tiled
86 # use bulk_meta_tile mode which downloads the tiles in parallel
87 self.meta_grid = MetaGrid(grid, meta_size=meta_size, meta_buffer=0)
88 self.tile_creator_class = partial(self.tile_creator_class, bulk_meta_tiles=True)
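What the partial() above effectively does, as a standalone sketch (TileCreator signature from this file):

    from functools import partial

    creator_cls = partial(TileCreator, bulk_meta_tiles=True)
    # later calls such as creator_cls(tile_mgr, dimensions=...) now create
    # TileCreator instances with bulk_meta_tiles preset to True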
8089
8190 @contextmanager
8291 def session(self):
195204 return tile
196205
197206 class TileCreator(object):
198 def __init__(self, tile_mgr, dimensions=None, image_merger=None):
207 def __init__(self, tile_mgr, dimensions=None, image_merger=None, bulk_meta_tiles=False):
199208 self.cache = tile_mgr.cache
200209 self.sources = tile_mgr.sources
201210 self.grid = tile_mgr.grid
202211 self.meta_grid = tile_mgr.meta_grid
212 self.bulk_meta_tiles = bulk_meta_tiles
203213 self.tile_mgr = tile_mgr
204214 self.dimensions = dimensions
205215 self.image_merger = image_merger
282292 try:
283293 img = source.get_map(query)
284294 except BlankImage:
285 return None
295 return None, None
286296 else:
287 return img
288
289 imgs = []
290 for img in async.imap(get_map_from_source, self.sources):
291 if img is not None:
292 imgs.append(img)
293
294 merger = self.image_merger
295 if not merger:
296 merger = merge_images
297 return merger(imgs, size=query.size, image_opts=self.tile_mgr.image_opts)
297 return (img, source.coverage)
298
299 layers = []
300 for layer in async.imap(get_map_from_source, self.sources):
301 if layer[0] is not None:
302 layers.append(layer)
303
304 return merge_images(layers, size=query.size, bbox=query.bbox, bbox_srs=query.srs,
305 image_opts=self.tile_mgr.image_opts, merger=self.image_merger)
298306
299307 def _create_meta_tiles(self, meta_tiles):
308 if self.bulk_meta_tiles:
309 created_tiles = []
310 for meta_tile in meta_tiles:
311 created_tiles.extend(self._create_bulk_meta_tile(meta_tile))
312 return created_tiles
313
300314 if self.tile_mgr.concurrent_tile_creators > 1 and len(meta_tiles) > 1:
301315 return self._create_threaded(self._create_meta_tile, meta_tiles)
302316
306320 return created_tiles
307321
308322 def _create_meta_tile(self, meta_tile):
323 """
324 _create_meta_tile queries a single meta tile and splits it into
325 tiles.
326 """
309327 tile_size = self.grid.tile_size
310328 query = MapQuery(meta_tile.bbox, meta_tile.size, self.grid.srs, self.tile_mgr.request_format,
311329 dimensions=self.dimensions)
320338 if meta_tile_image.cacheable:
321339 self.cache.store_tiles(splitted_tiles)
322340 return splitted_tiles
323 # else
341 # else
324342 tiles = [Tile(coord) for coord in meta_tile.tiles]
325343 self.cache.load_tiles(tiles)
326344 return tiles
345
346 def _create_bulk_meta_tile(self, meta_tile):
347 """
348 _create_bulk_meta_tile queries each tile of the meta tile in parallel
349 (using concurrent_tile_creators).
350 """
351 tile_size = self.grid.tile_size
352 main_tile = Tile(meta_tile.main_tile_coord)
353 with self.tile_mgr.lock(main_tile):
354 if not all(self.is_cached(t) for t in meta_tile.tiles if t is not None):
355 async_pool = async.Pool(self.tile_mgr.concurrent_tile_creators)
356 def query_tile(coord):
357 try:
358 query = MapQuery(self.grid.tile_bbox(coord), tile_size, self.grid.srs, self.tile_mgr.request_format,
359 dimensions=self.dimensions)
360 tile_image = self._query_sources(query)
361 if tile_image is None:
362 return None
363
364 if self.tile_mgr.image_opts != tile_image.image_opts:
365 # call as_buffer to force conversion into cache format
366 tile_image.as_buffer(self.tile_mgr.image_opts)
367
368 tile = Tile(coord, cacheable=tile_image.cacheable)
369 tile.source = tile_image
370 tile = self.tile_mgr.apply_tile_filter(tile)
371 except BlankImage:
372 return None
373 else:
374 return tile
375
376 tiles = []
377 for tile_task in async_pool.imap(query_tile,
378 [t for t in meta_tile.tiles if t is not None],
379 use_result_objects=True,
380 ):
381 if tile_task.exception is None:
382 tile = tile_task.result
383 if tile is not None:
384 tiles.append(tile)
385 else:
386 ex = tile_task.exception
387 async_pool.shutdown(True)
388 reraise(ex)
389
390 self.cache.store_tiles([t for t in tiles if t.cacheable])
391 return tiles
392
393 # else
394 tiles = [Tile(coord) for coord in meta_tile.tiles]
395 self.cache.load_tiles(tiles)
396 return tiles
397
327398
328399 class Tile(object):
329400 """
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
14
15 from mapproxy.client.http import HTTPClient
16 from mapproxy.client.wms import WMSInfoClient
17 from mapproxy.srs import SRS
18 from mapproxy.featureinfo import create_featureinfo_doc
1419
1520 class ArcGISClient(object):
1621 def __init__(self, request_template, http_client=None):
3237 req.params.transparent = query.transparent
3338
3439 return req.complete_url
40
41 def combined_client(self, other, query):
42 return
43
44 class ArcGISInfoClient(WMSInfoClient):
45 def __init__(self, request_template, supported_srs=None, http_client=None,
46 return_geometries=False,
47 tolerance=5,
48 ):
49 self.request_template = request_template
50 self.http_client = http_client or HTTPClient()
51 if not supported_srs and self.request_template.params.srs is not None:
52 supported_srs = [SRS(self.request_template.params.srs)]
53 self.supported_srs = supported_srs or []
54 self.return_geometries = return_geometries
55 self.tolerance = tolerance
56
57 def get_info(self, query):
58 if self.supported_srs and query.srs not in self.supported_srs:
59 query = self._get_transformed_query(query)
60 resp = self._retrieve(query)
61 # always use query.info_format and not the content-type from the response (even the esri example server always returns text/plain)
62 return create_featureinfo_doc(resp.read(), query.info_format)
63
64 def _query_url(self, query):
65 req = self.request_template.copy()
66 req.params.bbox = query.bbox
67 req.params.size = query.size
68 req.params.pos = query.pos
69 req.params.srs = query.srs.srs_code
70 if query.info_format.startswith('text/html'):
71 req.params['f'] = 'html'
72 else:
73 req.params['f'] = 'json'
74
75 req.params['tolerance'] = self.tolerance
76 req.params['returnGeometry'] = str(self.return_geometries).lower()
77
78 return req.complete_url
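An illustrative client setup (request_template is assumed to be an ArcGIS identify request object; values are examples):

    client = ArcGISInfoClient(request_template, tolerance=8, return_geometries=True)
    # _query_url() then adds f=json (or f=html for HTML info formats),
    # tolerance=8 and returnGeometry=true to the identify request URL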
133133 """
134134 req_srs = query.srs
135135 req_bbox = query.bbox
136 req_coord = make_lin_transf((0, 0, query.size[0], query.size[1]), req_bbox)(query.pos)
137
136138 info_srs = self._best_supported_srs(req_srs)
137139 info_bbox = req_srs.transform_bbox_to(info_srs, req_bbox)
138
139 req_coord = make_lin_transf((0, query.size[1], query.size[0], 0), req_bbox)(query.pos)
140 # calculate new info_size to keep square pixels after transform_bbox_to
141 info_aratio = (info_bbox[3] - info_bbox[1])/(info_bbox[2] - info_bbox[0])
142 info_size = query.size[0], int(info_aratio*query.size[0])
140143
141144 info_coord = req_srs.transform_to(info_srs, req_coord)
142 info_pos = make_lin_transf((info_bbox), (0, query.size[1], query.size[0], 0))(info_coord)
145 info_pos = make_lin_transf((info_bbox), (0, 0, info_size[0], info_size[1]))(info_coord)
143146 info_pos = int(round(info_pos[0])), int(round(info_pos[1]))
144147
145148 info_query = InfoQuery(
146149 bbox=info_bbox,
147 size=query.size,
150 size=info_size,
148151 srs=info_srs,
149152 pos=info_pos,
150153 info_format=query.info_format,
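A worked example of the square-pixel computation above (numbers are illustrative):

    # request 400px wide; transformed info_bbox spans 20 x 10 map units
    info_bbox = (0.0, 0.0, 20.0, 10.0)
    info_aratio = (info_bbox[3] - info_bbox[1]) / (info_bbox[2] - info_bbox[0])  # 0.5
    info_size = 400, int(info_aratio * 400)  # (400, 200): pixels stay square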
1818 'ImageChops', 'quantize']
1919
2020 try:
21 import PIL
2122 from PIL import Image, ImageColor, ImageDraw, ImageFont, ImagePalette, ImageChops, ImageMath
2223 # prevent pyflakes warnings
2324 Image, ImageColor, ImageDraw, ImageFont, ImagePalette, ImageChops, ImageMath
2425 except ImportError:
25 try:
26 import Image, ImageColor, ImageDraw, ImageFont, ImagePalette, ImageChops, ImageMath
27 # prevent pyflakes warnings
28 Image, ImageColor, ImageDraw, ImageFont, ImagePalette, ImageChops, ImageMath
29 except ImportError:
30 # allow MapProxy to start without PIL (for tilecache only).
31 # issue warning and raise ImportError on first use of
32 # a function that requires PIL
33 warnings.warn('PIL is not available')
34 class NoPIL(object):
35 def __getattr__(self, name):
36 if name.startswith('__'):
37 raise AttributeError()
38 raise ImportError('PIL is not available')
39 ImageDraw = ImageFont = ImagePalette = ImageChops = NoPIL()
40 # add some dummy stuff required on import/load time
41 Image = NoPIL()
42 Image.NEAREST = Image.BILINEAR = Image.BICUBIC = 1
43 Image.Image = NoPIL
44 ImageColor = NoPIL()
45 ImageColor.getrgb = lambda x: x
26 # allow MapProxy to start without PIL (for tilecache only).
27 # issue warning and raise ImportError on first use of
28 # a function that requires PIL
29 warnings.warn('PIL is not available')
30 class NoPIL(object):
31 def __getattr__(self, name):
32 if name.startswith('__'):
33 raise AttributeError()
34 raise ImportError('PIL is not available')
35 ImageDraw = ImageFont = ImagePalette = ImageChops = NoPIL()
36 # add some dummy stuff required on import/load time
37 Image = NoPIL()
38 Image.NEAREST = Image.BILINEAR = Image.BICUBIC = 1
39 Image.Image = NoPIL
40 ImageColor = NoPIL()
41 ImageColor.getrgb = lambda x: x
4642
4743 def has_alpha_composite_support():
4844 return hasattr(Image, 'alpha_composite')
45
46 def transform_uses_center():
47 # transformation behavior changed with Pillow 3.4
48 # https://github.com/python-pillow/Pillow/commit/5232361718bae0f0ccda76bfd5b390ebf9179b18
49 if hasattr(PIL, 'PILLOW_VERSION'):
50 if not PIL.PILLOW_VERSION.startswith(('1.', '2.', '3.0', '3.1', '3.2', '3.3')):
51 return True
52 return False
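
For illustration, a minimal sketch of how this check is consumed (it mirrors the px_offset selection in the mesh transformation code further below):

# half-pixel offset is only needed for Pillow releases before 3.4
px_offset = 0.0 if transform_uses_center() else 0.5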
4953
5054 def quantize_pil(img, colors=256, alpha=False, defaults=None):
5155 if hasattr(Image, 'FASTOCTREE'):
2020 load_datasource,
2121 load_ogr_datasource,
2222 load_polygons,
23 load_expire_tiles,
2324 require_geom_support,
2425 build_multipolygon,
2526 )
26 from mapproxy.util.coverage import coverage
27 from mapproxy.util.coverage import (
28 coverage,
29 diff_coverage,
30 union_coverage,
31 intersection_coverage,
32 )
2733 from mapproxy.compat import string_type
2834
2935 bbox_string_re = re.compile(r'[-+]?\d*\.?\d+,[-+]?\d*\.?\d+,[-+]?\d*\.?\d+,[-+]?\d*\.?\d+')
3036
3137 def load_coverage(conf, base_path=None):
32 if 'ogr_datasource' in conf:
38 clip = False
39 if 'clip' in conf:
40 clip = conf['clip']
41
42 if 'union' in conf:
43 parts = []
44 for cov in conf['union']:
45 parts.append(load_coverage(cov))
46 return union_coverage(parts, clip=clip)
47 elif 'intersection' in conf:
48 parts = []
49 for cov in conf['intersection']:
50 parts.append(load_coverage(cov))
51 return intersection_coverage(parts, clip=clip)
52 elif 'difference' in conf:
53 parts = []
54 for cov in conf['difference']:
55 parts.append(load_coverage(cov))
56 return diff_coverage(parts, clip=clip)
57 elif 'ogr_datasource' in conf:
3358 require_geom_support()
3459 srs = conf['ogr_srs']
3560 datasource = conf['ogr_datasource']
6994 where = conf.get('where', None)
7095 geom = load_datasource(datasource, where)
7196 bbox, geom = build_multipolygon(geom, simplify=True)
97 elif 'expire_tiles' in conf:
98 require_geom_support()
99 filename = abspath(conf['expire_tiles'])
100 geom = load_expire_tiles(filename)
101 _, geom = build_multipolygon(geom, simplify=False)
102 return coverage(geom, SRS(3857))
72103 else:
73104 return None
74 return coverage(geom or bbox, SRS(srs))
105
106 return coverage(geom or bbox, SRS(srs), clip=clip)
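
As an illustration, a hypothetical conf dict (all values made up; the plain bbox branch is elided from this hunk) that load_coverage would resolve into a clipped difference of two bbox coverages:

conf = {
    'difference': [
        {'bbox': [5, 50, 15, 55], 'srs': 'EPSG:4326'},
        {'bbox': [8, 51, 10, 53], 'srs': 'EPSG:4326'},
    ],
    'clip': True,
}
cov = load_coverage(conf)  # coverage of the first bbox minus the second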
6262 meta_buffer = 80,
6363 minimize_meta_requests = False,
6464 link_single_color_images = False,
65 sqlite_timeout = 30,
6566 )
6667
6768 grid = dict(
620620 request = create_request(self.conf["req"], params)
621621 http_client, request.url = self.http_client(request.url)
622622 coverage = self.coverage()
623 res_range = resolution_range(self.conf)
623624
624625 client = ArcGISClient(request, http_client)
625626 image_opts = self.image_opts(format=params.get('format'))
626627 return ArcGISSource(client, image_opts=image_opts, coverage=coverage,
628 res_range=res_range,
627629 supported_srs=supported_srs,
628630 supported_formats=supported_formats or None)
631
632
633 def fi_source(self, params=None):
634 from mapproxy.client.arcgis import ArcGISInfoClient
635 from mapproxy.request.arcgis import create_identify_request
636 from mapproxy.source.arcgis import ArcGISInfoSource
637 from mapproxy.srs import SRS
638
639 if params is None: params = {}
640 request_format = self.conf['req'].get('format')
641 if request_format:
642 params['format'] = request_format
643 supported_srs = [SRS(code) for code in self.conf.get('supported_srs', [])]
644 fi_source = None
645 if self.conf.get('opts', {}).get('featureinfo', False):
646 opts = self.conf['opts']
647 tolerance = opts.get('featureinfo_tolerance', 5)
648 return_geometries = opts.get('featureinfo_return_geometries', False)
649
650 fi_request = create_identify_request(self.conf['req'], params)
651
652
653 http_client, fi_request.url = self.http_client(fi_request.url)
654 fi_client = ArcGISInfoClient(fi_request,
655 supported_srs=supported_srs,
656 http_client=http_client,
657 tolerance=tolerance,
658 return_geometries=return_geometries,
659 )
660 fi_source = ArcGISInfoSource(fi_client)
661 return fi_source
629662
630663
631664 class WMSSourceConfiguration(SourceConfiguration):
952985 return self.context.globals.get_path('cache_dir', self.conf,
953986 global_key='cache.base_dir')
954987
988 @memoize
989 def has_multiple_grids(self):
990 return len(self.grid_confs()) > 1
991
955992 def lock_dir(self):
956993 lock_dir = self.context.globals.get_path('cache.tile_lock_dir', self.conf)
957994 if not lock_dir:
9641001 cache_dir = self.cache_dir()
9651002 directory_layout = self.conf.get('cache', {}).get('directory_layout', 'tc')
9661003 if self.conf.get('cache', {}).get('directory'):
1004 if self.has_multiple_grids():
1005 raise ConfigurationError(
1006 "using single directory for cache with multiple grids in %s" %
1007 (self.conf['name']),
1008 )
9671009 pass
9681010 elif self.conf.get('cache', {}).get('use_grid_names'):
9691011 cache_dir = os.path.join(cache_dir, self.conf['name'], grid_conf.tile_grid().name)
9771019 log.warn('link_single_color_images not supported on windows')
9781020 link_single_color_images = False
9791021
980 lock_timeout = self.context.globals.get_value('http.client_timeout', {})
981
9821022 return FileCache(
9831023 cache_dir,
9841024 file_ext=file_ext,
9851025 directory_layout=directory_layout,
986 lock_timeout=lock_timeout,
9871026 link_single_color_images=link_single_color_images,
9881027 )
9891028
9991038 else:
10001039 mbfile_path = os.path.join(self.cache_dir(), filename)
10011040
1041 sqlite_timeout = self.context.globals.get_value('cache.sqlite_timeout', self.conf)
1042 wal = self.context.globals.get_value('cache.sqlite_wal', self.conf)
1043
10021044 return MBTilesCache(
10031045 mbfile_path,
1046 timeout=sqlite_timeout,
1047 wal=wal,
1048 )
1049
1050 def _geopackage_cache(self, grid_conf, file_ext):
1051 from mapproxy.cache.geopackage import GeopackageCache, GeopackageLevelCache
1052
1053 filename = self.conf['cache'].get('filename')
1054 table_name = self.conf['cache'].get('table_name') or \
1055 "{}_{}".format(self.conf['name'], grid_conf.tile_grid().name)
1056 levels = self.conf['cache'].get('levels')
1057
1058 if not filename:
1059 filename = self.conf['name'] + '.gpkg'
1060 if filename.startswith('.' + os.sep):
1061 gpkg_file_path = self.context.globals.abspath(filename)
1062 else:
1063 gpkg_file_path = os.path.join(self.cache_dir(), filename)
1064
1065 cache_dir = self.conf['cache'].get('directory')
1066 if cache_dir:
1067 cache_dir = os.path.join(
1068 self.context.globals.abspath(cache_dir),
1069 grid_conf.tile_grid().name
1070 )
1071 else:
1072 cache_dir = self.cache_dir()
1073 cache_dir = os.path.join(
1074 cache_dir,
1075 self.conf['name'],
1076 grid_conf.tile_grid().name
1077 )
1078
1079 if levels:
1080 return GeopackageLevelCache(
1081 cache_dir, grid_conf.tile_grid(), table_name
1082 )
1083 else:
1084 return GeopackageCache(
1085 gpkg_file_path, grid_conf.tile_grid(), table_name
1086 )
1087
1088 def _s3_cache(self, grid_conf, file_ext):
1089 from mapproxy.cache.s3 import S3Cache
1090
1091 bucket_name = self.context.globals.get_value('cache.bucket_name', self.conf,
1092 global_key='cache.s3.bucket_name')
1093
1094 if not bucket_name:
1095 raise ConfigurationError("no bucket_name configured for s3 cache %s" % self.conf['name'])
1096
1097 profile_name = self.context.globals.get_value('cache.profile_name', self.conf,
1098 global_key='cache.s3.profile_name')
1099
1100 directory_layout = self.conf['cache'].get('directory_layout', 'tms')
1101
1102 base_path = self.conf['cache'].get('directory', None)
1103 if base_path is None:
1104 base_path = os.path.join(self.conf['name'], grid_conf.tile_grid().name)
1105
1106 return S3Cache(
1107 base_path=base_path,
1108 file_ext=file_ext,
1109 directory_layout=directory_layout,
1110 bucket_name=bucket_name,
1111 profile_name=profile_name,
10041112 )
10051113
10061114 def _sqlite_cache(self, grid_conf, file_ext):
10201128 grid_conf.tile_grid().name
10211129 )
10221130
1131 sqlite_timeout = self.context.globals.get_value('cache.sqlite_timeout', self.conf)
1132 wal = self.context.globals.get_value('cache.sqlite_wal', self.conf)
1133
10231134 return MBTilesLevelCache(
10241135 cache_dir,
1136 timeout=sqlite_timeout,
1137 wal=wal,
10251138 )
10261139
10271140 def _couchdb_cache(self, grid_conf, file_ext):
10711184 return RiakCache(nodes=nodes, protocol=protocol, bucket=bucket,
10721185 tile_grid=grid_conf.tile_grid(),
10731186 use_secondary_index=use_secondary_index,
1187 )
1188
1189 def _redis_cache(self, grid_conf, file_ext):
1190 from mapproxy.cache.redis import RedisCache
1191
1192 host = self.conf['cache'].get('host', '127.0.0.1')
1193 port = self.conf['cache'].get('port', 6379)
1194 db = self.conf['cache'].get('db', 0)
1195 ttl = self.conf['cache'].get('default_ttl', 3600)
1196
1197 prefix = self.conf['cache'].get('prefix')
1198 if not prefix:
1199 prefix = self.conf['name'] + '_' + grid_conf.tile_grid().name
1200
1201 return RedisCache(
1202 host=host,
1203 port=port,
1204 db=db,
1205 prefix=prefix,
1206 ttl=ttl,
1207 )
1208
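A hedged sketch of the cache section this method reads (values are examples only); without an explicit prefix, tiles are keyed under '<cache-name>_<grid-name>':

cache_conf = {
    'type': 'redis',
    'host': '127.0.0.1',
    'port': 6379,
    'db': 0,
    'default_ttl': 3600,  # tiles expire after one hour
}
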
1209 def _compact_cache(self, grid_conf, file_ext):
1210 from mapproxy.cache.compact import CompactCacheV1
1211
1212 cache_dir = self.cache_dir()
1213 if self.conf.get('cache', {}).get('directory'):
1214 if self.has_multiple_grids():
1215 raise ConfigurationError(
1216 "using single directory for cache with multiple grids in %s" %
1217 (self.conf['name']),
1218 )
1219 pass
1220 else:
1221 cache_dir = os.path.join(cache_dir, self.conf['name'], grid_conf.tile_grid().name)
1222
1223 if self.conf['cache']['version'] != 1:
1224 raise ConfigurationError("compact cache only supports version 1")
1225 return CompactCacheV1(
1226 cache_dir=cache_dir,
10741227 )
10751228
10761229 def _tile_cache(self, grid_conf, file_ext):
12271380 factor=source.get('factor', 1.0),
12281381 )
12291382
1230 return band_merger.merge, sources, source_image_opts
1383 return band_merger, sources, source_image_opts
12311384
12321385 @memoize
12331386 def caches(self):
12521405 global_key='cache.meta_buffer')
12531406 meta_size = self.context.globals.get_value('meta_size', self.conf,
12541407 global_key='cache.meta_size')
1408 bulk_meta_tiles = self.context.globals.get_value('bulk_meta_tiles', self.conf,
1409 global_key='cache.bulk_meta_tiles')
12551410 minimize_meta_requests = self.context.globals.get_value('minimize_meta_requests', self.conf,
12561411 global_key='cache.minimize_meta_requests')
12571412 concurrent_tile_creators = self.context.globals.get_value('concurrent_tile_creators', self.conf,
13351490 minimize_meta_requests=minimize_meta_requests,
13361491 concurrent_tile_creators=concurrent_tile_creators,
13371492 pre_store_filter=tile_filter,
1338 tile_creator_class=tile_creator_class)
1493 tile_creator_class=tile_creator_class,
1494 bulk_meta_tiles=bulk_meta_tiles,
1495 )
13391496 extent = merge_layer_extents(sources)
13401497 if extent.is_default:
13411498 extent = map_extent_from_grid(tile_grid)
14921649 return dimensions
14931650
14941651 @memoize
1495 def tile_layers(self):
1652 def tile_layers(self, grid_name_as_path=False):
14961653 from mapproxy.service.tile import TileLayer
14971654 from mapproxy.cache.dummy import DummyCache
14981655
15231680 tile_layers = []
15241681 for cache_name in sources:
15251682 for grid, extent, cache_source in self.context.caches[cache_name].caches():
1526
15271683 if dimensions and not isinstance(cache_source.cache, DummyCache):
15281684 # caching of dimension layers is not supported yet
15291685 raise ConfigurationError(
15341690 md = {}
15351691 md['title'] = self.conf['title']
15361692 md['name'] = self.conf['name']
1537 md['name_path'] = (self.conf['name'], grid.srs.srs_code.replace(':', '').upper())
15381693 md['grid_name'] = grid.name
1694 if grid_name_as_path:
1695 md['name_path'] = (md['name'], md['grid_name'])
1696 else:
1697 md['name_path'] = (self.conf['name'], grid.srs.srs_code.replace(':', '').upper())
15391698 md['name_internal'] = md['name_path'][0] + '_' + md['name_path'][1]
15401699 md['format'] = self.context.caches[cache_name].image_opts().format
15411700 md['cache_name'] = cache_name
16111770 def tile_layers(self, conf, use_grid_names=False):
16121771 layers = odict()
16131772 for layer_name, layer_conf in iteritems(self.context.layers):
1614 for tile_layer in layer_conf.tile_layers():
1773 for tile_layer in layer_conf.tile_layers(grid_name_as_path=use_grid_names):
16151774 if not tile_layer: continue
16161775 if use_grid_names:
1617 # new style layer names are tuples
1618 tile_layer.md['name_path'] = (tile_layer.md['name'], tile_layer.md['grid_name'])
16191776 layers[tile_layer.md['name_path']] = tile_layer
16201777 else:
16211778 layers[tile_layer.md['name_internal']] = tile_layer
3535 else:
3636 return [], True
3737
38 coverage = {
38 coverage = recursive({
3939 'polygons': str(),
4040 'polygons_srs': str(),
4141 'bbox': one_of(str(), [number()]),
4646 'datasource': one_of(str(), [number()]),
4747 'where': str(),
4848 'srs': str(),
49 }
49 'expire_tiles': str(),
50 'union': [recursive()],
51 'difference': [recursive()],
52 'intersection': [recursive()],
53 'clip': bool(),
54 })
55
5056 image_opts = {
5157 'mode': str(),
5258 'colors': number(),
105111 },
106112 'sqlite': {
107113 'directory': str(),
114 'sqlite_timeout': number(),
115 'sqlite_wal': bool(),
108116 'tile_lock_dir': str(),
109117 },
110118 'mbtiles': {
111119 'filename': str(),
112 'tile_lock_dir': str(),
120 'sqlite_timeout': number(),
121 'sqlite_wal': bool(),
122 'tile_lock_dir': str(),
123 },
124 'geopackage': {
125 'filename': str(),
126 'directory': str(),
127 'tile_lock_dir': str(),
128 'table_name': str(),
129 'levels': bool(),
113130 },
114131 'couchdb': {
115132 'url': str(),
120137 'tile_id': str(),
121138 'tile_lock_dir': str(),
122139 },
140 's3': {
141 'bucket_name': str(),
142 'directory_layout': str(),
143 'directory': str(),
144 'profile_name': str(),
145 'tile_lock_dir': str(),
146 },
123147 'riak': {
124148 'nodes': [riak_node],
125149 'protocol': one_of('pbc', 'http', 'https'),
129153 'http': number(),
130154 },
131155 'secondary_index': bool(),
132 }
156 'tile_lock_dir': str(),
157 },
158 'redis': {
159 'host': str(),
160 'port': int(),
161 'db': int(),
162 'prefix': str(),
163 'default_ttl': int(),
164 },
165 'compact': {
166 'directory': str(),
167 required('version'): number(),
168 'tile_lock_dir': str(),
169 },
133170 }
134171
135172 on_error = {
323360 'tile_lock_dir': str(),
324361 'meta_size': [number()],
325362 'meta_buffer': number(),
363 'bulk_meta_tiles': bool(),
326364 'max_tile_limit': number(),
327365 'minimize_meta_requests': bool(),
328366 'concurrent_tile_creators': int(),
329367 'link_single_color_images': bool(),
368 's3': {
369 'bucket_name': str(),
370 'profile_name': str(),
371 },
330372 },
331373 'grid': {
332374 'tile_size': [int()],
355397 'cache_dir': str(),
356398 'meta_size': [number()],
357399 'meta_buffer': number(),
400 'bulk_meta_tiles': bool(),
358401 'minimize_meta_requests': bool(),
359402 'concurrent_tile_creators': int(),
360403 'disable_storage': bool(),
485528 'transparent': bool(),
486529 'time': str()
487530 },
531 'opts': {
532 'featureinfo': bool(),
533 'featureinfo_tolerance': number(),
534 'featureinfo_return_geometries': bool(),
535 },
488536 'supported_srs': [str()],
489537 'http': http_opts
490538 }),
4949 email: info@omniscale.de
5050 # multiline strings are possible with the right indentation
5151 access_constraints:
52 This service is intended for private and evaluation use only.
53 The data is licensed as Creative Commons Attribution-Share Alike 2.0
54 (http://creativecommons.org/licenses/by-sa/2.0/)
52 Insert license and copyright information for this service.
5553 fees: 'None'
5654
5755 wms:
105103 email: info@omniscale.de
106104 # multiline strings are possible with the right indentation
107105 access_constraints:
108 This service is intended for private and evaluation use only.
109 The data is licensed as Creative Commons Attribution-Share Alike 2.0
110 (http://creativecommons.org/licenses/by-sa/2.0/)
106 Insert license and copyright information for this service.
111107 fees: 'None'
112108
113109 layers:
1313 contact:
1414 person: Your Name Here
1515 position: Technical Director
16 organization:
16 organization:
1717 address: Fakestreet 123
1818 city: Somewhere
1919 postcode: 12345
2222 fax: +49(0)000-000000-0
2323 email: info@omniscale.de
2424 access_constraints:
25 This service is intended for private and evaluation use only.
26 The data is licensed as Creative Commons Attribution-Share Alike 2.0
27 (http://creativecommons.org/licenses/by-sa/2.0/)
25 Insert license and copyright information for this service.
2826 fees: 'None'
2927
3028 layers:
3432 # - name: osm_full_example
3533 # title: Omniscale OSM WMS - osm.omniscale.net
3634 # sources: [osm_cache_full_example]
37
35
3836 caches:
3937 osm_cache:
4038 grids: [GLOBAL_MERCATOR, global_geodetic_sqrt2]
4139 sources: [osm_wms]
42
40
4341 # osm_cache_full_example:
4442 # meta_buffer: 20
4543 # meta_size: [5, 5]
7674 # # # always request in this format
7775 # # format: image/png
7876 # map: /home/map/mapserver.map
79
77
8078
8179 grids:
8280 global_geodetic_sqrt2:
1313 # limitations under the License.
1414
1515 import copy
16 import json
17
18 from functools import reduce
1619 from io import StringIO
17 from mapproxy.compat import string_type, PY2, BytesIO
20
21 from mapproxy.compat import string_type, PY2, BytesIO, iteritems
1822
1923 try:
2024 from lxml import etree, html
119123
120124 return cls(result_tree)
121125
126
127 class JSONFeatureInfoDoc(FeatureInfoDoc):
128 info_type = 'json'
129
130 def __init__(self, content):
131 self.content = content
132
133 def as_string(self):
134 return self.content
135
136 @classmethod
137 def combine(cls, docs):
138 contents = [json.loads(d.content) for d in docs]
139 combined = reduce(lambda a, b: merge_dict(a, b), contents)
140 return cls(json.dumps(combined))
141
142
143 def merge_dict(base, other):
144 """
145 Return `base` dict with values from `other` merged in.
146 """
147 for k, v in iteritems(other):
148 if k not in base:
149 base[k] = v
150 else:
151 if isinstance(base[k], dict):
152 merge_dict(base[k], v)
153 elif isinstance(base[k], list):
154 base[k].extend(v)
155 else:
156 base[k] = v
157 return base
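
A short usage sketch of the JSON combining above (hypothetical feature collections; lists are extended, nested dicts merged recursively):

a = JSONFeatureInfoDoc('{"results": [{"id": 1}]}')
b = JSONFeatureInfoDoc('{"results": [{"id": 2}]}')
combined = JSONFeatureInfoDoc.combine([a, b])
# combined.as_string() -> '{"results": [{"id": 1}, {"id": 2}]}'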
158
122159 def create_featureinfo_doc(content, info_format):
123160 info_format = info_format.split(';', 1)[0].strip() # remove mime options like charset
124161 if info_format in ('text/xml', 'application/vnd.ogc.gml'):
125162 return XMLFeatureInfoDoc(content)
126163 if info_format == 'text/html':
127164 return HTMLFeatureInfoDoc(content)
165 if info_format == 'application/json':
166 return JSONFeatureInfoDoc(content)
128167
129168 return TextFeatureInfoDoc(content)
130169
408408 threshold = thresholds.pop() if thresholds else None
409409
410410 if threshold_result is not None:
411 return threshold_result
411 # Use the previous level that was within stretch_factor,
412 # but only if this level's res is smaller than the requested res.
413 # This fixes selection for resolutions that are closer together than stretch_factor.
414 #
415 if l_res < res:
416 return threshold_result
412417
413418 if l_res <= res*self.stretch_factor:
419 # l_res within stretch_factor
420 # remember this level and check for thresholds or a better res in the next loop
414421 threshold_result = level
415422 prev_l_res = l_res
416423 return level
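
A small worked example of this selection (hypothetical values, stretch_factor=1.15, no thresholds configured):

# levels with res 10.0 and 5.0; requested res: 9.0
# level 10.0: 10.0 <= 9.0 * 1.15 (= 10.35) -> remembered as threshold_result
# level 5.0:  5.0 < 9.0 -> the remembered 10.0 level is returned,
#             instead of oversampling the finer 5.0 level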
10591066 def deg_to_m(deg):
10601067 return deg * (6378137 * 2 * math.pi) / 360
10611068
1062 OGC_PIXLE_SIZE = 0.00028 #m/px
1069 OGC_PIXEL_SIZE = 0.00028 #m/px
10631070
10641071 def ogc_scale_to_res(scale):
1065 return scale * OGC_PIXLE_SIZE
1072 return scale * OGC_PIXEL_SIZE
10661073 def res_to_ogc_scale(res):
1067 return res / OGC_PIXLE_SIZE
1074 return res / OGC_PIXEL_SIZE
10681075
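For example, with the 0.28 mm/px OGC pixel size:

ogc_scale_to_res(25000)  # 25000 * 0.00028 = 7.0 m/px
res_to_ogc_scale(7.0)    # 7.0 / 0.00028 = 25000.0
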
10691076 def resolution_range(min_res=None, max_res=None, max_scale=None, min_scale=None):
10701077 if min_scale == max_scale == min_res == max_res == None:
3030
3131 def mask_image(img, bbox, bbox_srs, coverage):
3232 geom = mask_polygons(bbox, SRS(bbox_srs), coverage)
33 mask = image_mask_from_geom(img, bbox, geom)
33 mask = image_mask_from_geom(img.size, bbox, geom)
3434 img = img.convert('RGBA')
3535 img.paste((255, 255, 255, 0), (0, 0), mask)
3636 return img
4040 coverage = coverage.intersection(bbox, bbox_srs)
4141 return flatten_to_polygons(coverage.geom)
4242
43 def image_mask_from_geom(img, bbox, polygons):
44 transf = make_lin_transf(bbox, (0, 0) + img.size)
43 def image_mask_from_geom(size, bbox, polygons):
44 mask = Image.new('L', size, 255)
45 if len(polygons) == 0:
46 return mask
4547
46 mask = Image.new('L', img.size, 255)
48 transf = make_lin_transf(bbox, (0, 0) + size)
49
50 # use a negative ~0.1 pixel buffer
51 buffer = -0.1 * min((bbox[2] - bbox[0]) / size[0], (bbox[3] - bbox[1]) / size[1])
52
4753 draw = ImageDraw.Draw(mask)
4854
49 for p in polygons:
55 def draw_polygon(p):
5056 draw.polygon([transf(coord) for coord in p.exterior.coords], fill=0)
5157 for ring in p.interiors:
5258 draw.polygon([transf(coord) for coord in ring.coords], fill=255)
5359
60 for p in polygons:
61 # the slightly shrunken polygon does not include pixels that are merely touched outside the coverage
62 buffered = p.buffer(buffer, resolution=1, join_style=2)
63
64 if buffered.type == 'MultiPolygon':
65 # a negative buffer can turn a polygon into a multipolygon
66 for p in buffered:
67 draw_polygon(p)
68 else:
69 draw_polygon(buffered)
70
5471 return mask
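
A usage sketch of the masking above (assuming Shapely is available; bbox, size and polygon are made up):

from shapely.geometry import Polygon

poly = Polygon([(2, 2), (8, 2), (8, 8), (2, 8)])
mask = image_mask_from_geom((256, 256), (0, 0, 10, 10), [poly])
# pixels inside the (slightly shrunken) polygon are 0, all others 255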
3535 self.layers = []
3636 self.cacheable = True
3737
38 def add(self, layer_img, layer=None):
38 def add(self, img, coverage=None):
3939 """
4040 Add one layer image to merge. Bottom-layers first.
4141 """
42 if layer_img is not None:
43 self.layers.append((layer_img, layer))
42 if img is not None:
43 self.layers.append((img, coverage))
44
45
46 class LayerMerger(LayerMerger):
4447
4548 def merge(self, image_opts, size=None, bbox=None, bbox_srs=None, coverage=None):
4649 """
5356 if not self.layers:
5457 return BlankImageSource(size=size, image_opts=image_opts, cacheable=True)
5558 if len(self.layers) == 1:
56 layer_img, layer = self.layers[0]
59 layer_img, layer_coverage = self.layers[0]
5760 layer_opts = layer_img.image_opts
5861 if (((layer_opts and not layer_opts.transparent) or image_opts.transparent)
5962 and (not size or size == layer_img.size)
60 and (not layer or not layer.coverage or not layer.coverage.clip)
63 and (not layer_coverage or not layer_coverage.clip)
6164 and not coverage):
6265 # layer is opaque, no need to make transparent or add bgcolor
6366 return layer_img
6770
6871 cacheable = self.cacheable
6972 result = create_image(size, image_opts)
70 for layer_img, layer in self.layers:
73 for layer_img, layer_coverage in self.layers:
7174 if not layer_img.cacheable:
7275 cacheable = False
7376 img = layer_img.as_image()
7780 else:
7881 opacity = layer_image_opts.opacity
7982
80 if layer and layer.coverage and layer.coverage.clip:
81 img = mask_image(img, bbox, bbox_srs, layer.coverage)
83 if layer_coverage and layer_coverage.clip:
84 img = mask_image(img, bbox, bbox_srs, layer_coverage)
8285
8386 if result.mode != 'RGBA':
8487 merge_composite = False
8588 else:
8689 merge_composite = has_alpha_composite_support()
90
91 if 'transparency' in img.info:
92 # non-paletted PNGs can have a fixed transparency value
93 # convert to RGBA to have full alpha
94 img = img.convert('RGBA')
8795
8896 if merge_composite:
8997 if opacity is not None and opacity < 1.0:
95103 ImageChops.constant(alpha, int(255 * opacity))
96104 )
97105 img.putalpha(alpha)
98 if img.mode == 'RGB':
99 result.paste(img, (0, 0))
100 else:
106 if img.mode in ('RGBA', 'P'):
101107 # assume paletted images have transparency
102108 if img.mode == 'P':
103109 img = img.convert('RGBA')
104110 result = Image.alpha_composite(result, img)
111 else:
112 result.paste(img, (0, 0))
105113 else:
106114 if opacity is not None and opacity < 1.0:
107115 img = img.convert(result.mode)
108116 result = Image.blend(result, img, layer_image_opts.opacity)
109 elif img.mode == 'RGBA' or img.mode == 'P':
117 elif img.mode in ('RGBA', 'P'):
110118 # assume paletted images have transparency
111119 if img.mode == 'P':
112120 img = img.convert('RGBA')
148156 self.cacheable = True
149157 self.mode = mode
150158 self.max_band = {}
159 self.max_src_images = 0
151160
152161 def add_ops(self, dst_band, src_img, src_band, factor=1.0):
153162 self.ops.append(band_ops(
158167 ))
159168 # store highest requested band index for each source
160169 self.max_band[src_img] = max(self.max_band.get(src_img, 0), src_band)
170 self.max_src_images = max(src_img+1, self.max_src_images)
161171
162172 def merge(self, sources, image_opts, size=None, bbox=None, bbox_srs=None, coverage=None):
163 if not sources:
173 if len(sources) < self.max_src_images:
164174 return BlankImageSource(size=size, image_opts=image_opts, cacheable=True)
165175
166176 if size is None:
218228 return ImageSource(result, size=size, image_opts=image_opts)
219229
220230
221 def merge_images(images, image_opts, size=None):
231 def merge_images(layers, image_opts, size=None, bbox=None, bbox_srs=None, merger=None):
222232 """
223233 Merge multiple images into one.
224234
226236 :param format: the format of the output `ImageSource`
227237 :param size: size of the merged image, if ``None`` the size
228238 of the first image is used
239 :param bbox: Bounding box
240 :param bbox_srs: Bounding box SRS
241 :param merger: Image merger
229242 :rtype: `ImageSource`
230243 """
231 merger = LayerMerger()
232 for img in images:
233 merger.add(img)
234 return merger.merge(image_opts=image_opts, size=size)
244 if merger is None:
245 merger = LayerMerger()
246
247 # BandMerger does not have coverage support, passing only images
248 if isinstance(merger, BandMerger):
249 sources = [l[0] if isinstance(l, tuple) else l for l in layers]
250 return merger.merge(sources, image_opts=image_opts, size=size, bbox=bbox, bbox_srs=bbox_srs)
251
252 for layer in layers:
253 if isinstance(layer, tuple):
254 merger.add(layer[0], layer[1])
255 else:
256 merger.add(layer)
257
258 return merger.merge(image_opts=image_opts, size=size, bbox=bbox, bbox_srs=bbox_srs)
259
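A hedged usage sketch (img1, img2, cov and opts stand for ImageSource, coverage and ImageOptions objects created elsewhere):

# layers are plain images or (image, coverage) tuples;
# coverages with clip=True are masked out during merging
result = merge_images(
    [img1, (img2, cov)],
    image_opts=opts,
    size=(256, 256),
    bbox=(0, 0, 10, 10), bbox_srs=4326,
)
# with a BandMerger instance as merger, the coverages are stripped and only
# the images are passed on, since band merging has no coverage support
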
235260
236261 def concat_legends(legends, format='png', size=None, bgcolor='#ffffff', transparent=True):
237262 """
00 # This file is part of the MapProxy project.
11 # Copyright (C) 2010 Omniscale <http://omniscale.de>
2 #
2 #
33 # Licensed under the Apache License, Version 2.0 (the "License");
44 # you may not use this file except in compliance with the License.
55 # You may obtain a copy of the License at
6 #
6 #
77 # http://www.apache.org/licenses/LICENSE-2.0
8 #
8 #
99 # Unless required by applicable law or agreed to in writing, software
1010 # distributed under the License is distributed on an "AS IS" BASIS,
1111 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
3232 """
3333 self.tile_grid = tile_grid
3434 self.tile_size = tile_size
35
35
3636 def merge(self, ordered_tiles, image_opts):
3737 """
3838 Merge all tiles into one image.
39
39
4040 :param ordered_tiles: list of tiles, sorted row-wise (top to bottom)
4141 :rtype: `ImageSource`
4242 """
4646 tile = ordered_tiles.pop()
4747 return tile
4848 src_size = self._src_size()
49
49
5050 result = create_image(src_size, image_opts)
5151
5252 cacheable = True
7272 else:
7373 raise
7474 return ImageSource(result, size=src_size, image_opts=image_opts, cacheable=cacheable)
75
75
7676 def _src_size(self):
7777 width = self.tile_grid[0]*self.tile_size[0]
7878 height = self.tile_grid[1]*self.tile_size[1]
7979 return width, height
80
80
8181 def _tile_offset(self, i):
8282 """
8383 Return the image offset (upper-left coord) of the i-th tile,
8585 """
8686 return (i%self.tile_grid[0]*self.tile_size[0],
8787 i//self.tile_grid[0]*self.tile_size[1])
88
88
8989
9090 class TileSplitter(object):
9191 """
105105 minx, miny = crop_coord
106106 maxx = minx + tile_size[0]
107107 maxy = miny + tile_size[1]
108
108
109109 if (minx < 0 or miny < 0 or maxx > self.meta_img.size[0]
110110 or maxy > self.meta_img.size[1]):
111111
120120 else:
121121 crop = self.meta_img.crop((minx, miny, maxx, maxy))
122122 return ImageSource(crop, size=tile_size, image_opts=self.image_opts)
123
123
124124
125125 class TiledImage(object):
126126 """
141141 self.tile_size = tile_size
142142 self.src_bbox = src_bbox
143143 self.src_srs = src_srs
144
144
145145 def image(self, image_opts):
146146 """
147147 Return the tiles as one merged image.
148
148
149149 :rtype: `ImageSource`
150150 """
151151 tm = TileMerger(self.tile_grid, self.tile_size)
152152 return tm.merge(self.tiles, image_opts=image_opts)
153
153
154154 def transform(self, req_bbox, req_srs, out_size, image_opts):
155155 """
156156 Return the the tiles as one merged and transformed image.
157
157
158158 :param req_bbox: the bbox of the output image
159159 :param req_srs: the srs of the req_bbox
160160 :param out_size: the size in pixel of the output image
1414
1515 from __future__ import division
1616
17 from mapproxy.compat.image import Image
17 from mapproxy.compat.image import Image, transform_uses_center
1818 from mapproxy.image import ImageSource, image_filter
1919 from mapproxy.srs import make_lin_transf, bbox_equals
2020
136136 to_src_px = make_lin_transf(src_bbox, src_quad)
137137 to_dst_w = make_lin_transf(dst_quad, dst_bbox)
138138 meshes = []
139
140 # more recent versions of Pillow use center coordinates for
141 # transformations, we manually need to add half a pixel otherwise
142 if transform_uses_center():
143 px_offset = 0.0
144 else:
145 px_offset = 0.5
146
139147 def dst_quad_to_src(quad):
140148 src_quad = []
141149 for dst_px in [(quad[0], quad[1]), (quad[0], quad[3]),
142150 (quad[2], quad[3]), (quad[2], quad[1])]:
143 dst_w = to_dst_w((dst_px[0]+0.5, dst_px[1]+0.5))
151 dst_w = to_dst_w((dst_px[0]+px_offset, dst_px[1]+px_offset))
144152 src_w = self.dst_srs.transform_to(self.src_srs, dst_w)
145153 src_px = to_src_px(src_w)
146154 src_quad.extend(src_px)
151151
152152 @property
153153 def coord(self):
154 return make_lin_transf((0, self.size[1], self.size[0], 0), self.bbox)(self.pos)
154 return make_lin_transf((0, 0, self.size[0], self.size[1]), self.bbox)(self.pos)
155155
156156 class LegendQuery(object):
157157 def __init__(self, format, scale):
1313 # limitations under the License.
1414
1515 from functools import partial as fp
16 from mapproxy.compat import string_type
17 from mapproxy.compat.modules import urlparse
1618 from mapproxy.request.base import RequestParams, BaseRequest
17 from mapproxy.compat import string_type
18
19 from mapproxy.srs import make_lin_transf
1920
2021 class ArcGISExportRequestParams(RequestParams):
2122 """
8586 del _set_srs
8687
8788
89
90 class ArcGISIdentifyRequestParams(ArcGISExportRequestParams):
91 def _get_format(self):
92 """
93 The requested format as a string (without any 'image/', 'text/', etc. prefixes)
94 """
95 return self["format"]
96 def _set_format(self, format):
97 self["format"] = format.rsplit("/")[-1]
98 format = property(_get_format, _set_format)
99 del _get_format
100 del _set_format
101
102 def _get_bbox(self):
103 """
104 ``bbox`` as a tuple (minx, miny, maxx, maxy).
105 """
106 if 'mapExtent' not in self.params or self.params['mapExtent'] is None:
107 return None
108 points = [float(val) for val in self.params['mapExtent'].split(',')]
109 return tuple(points[:4])
110 def _set_bbox(self, value):
111 if value is not None and not isinstance(value, string_type):
112 value = ','.join(str(x) for x in value)
113 self['mapExtent'] = value
114 bbox = property(_get_bbox, _set_bbox)
115 del _get_bbox
116 del _set_bbox
117
118 def _get_size(self):
119 """
120 Size of the request in pixel as a tuple (width, height),
121 or None if one is missing.
122 """
123 if 'imageDisplay' not in self.params or self.params['imageDisplay'] is None:
124 return None
125 dim = [float(val) for val in self.params['imageDisplay'].split(',')]
126 return tuple(dim[:2])
127 def _set_size(self, value):
128 if value is not None and not isinstance(value, string_type):
129 value = ','.join(str(x) for x in value) + ',96'
130 self['imageDisplay'] = value
131 size = property(_get_size, _set_size)
132 del _get_size
133 del _set_size
134
135 def _get_pos(self):
136 size = self.size
137 vals = self['geometry'].split(',')
138 x, y = float(vals[0]), float(vals[1])
139 return make_lin_transf(self.bbox, (0, 0, size[0], size[1]))((x, y))
140
141 def _set_pos(self, value):
142 size = self.size
143 req_coord = make_lin_transf((0, 0, size[0], size[1]), self.bbox)(value)
144 self['geometry'] = '%f,%f' % req_coord
145 pos = property(_get_pos, _set_pos)
146 del _get_pos
147 del _set_pos
148
149 @property
150 def srs(self):
151 srs = self.params.get('sr', None)
152 if srs:
153 return 'EPSG:%s' % srs
154
155 @srs.setter
156 def srs(self, srs):
157 if hasattr(srs, 'srs_code'):
158 code = srs.srs_code
159 else:
160 code = srs
161 self.params['sr'] = code.rsplit(':', 1)[-1]
162
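For illustration, how these properties serialize into the ArcGIS identify parameters (params stands for an ArcGISIdentifyRequestParams instance; values are made up):

params.bbox = (-114, 35, -111, 37)  # -> mapExtent='-114,35,-111,37'
params.size = (256, 256)            # -> imageDisplay='256,256,96' (96 dpi is appended)
params.srs = 'EPSG:3857'            # -> sr='3857'
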
88163 class ArcGISRequest(BaseRequest):
89164 request_params = ArcGISExportRequestParams
90165 fixed_params = {"f": "image"}
92167 def __init__(self, param=None, url='', validate=False, http=None):
93168 BaseRequest.__init__(self, param, url, validate, http)
94169
95 self.url = self.url.rstrip("/")
96 if not self.url.endswith("export"):
97 self.url += "/export"
170 self.url = rest_endpoint(url)
98171
99172 def copy(self):
100173 return self.__class__(param=self.params.copy(), url=self.url)
107180 return params.query_string
108181
109182
110 def create_request(req_data, param):
183 class ArcGISIdentifyRequest(BaseRequest):
184 request_params = ArcGISIdentifyRequestParams
185 fixed_params = {'geometryType': 'esriGeometryPoint'}
186 def __init__(self, param=None, url='', validate=False, http=None):
187 BaseRequest.__init__(self, param, url, validate, http)
188
189 self.url = rest_identify_endpoint(url)
190
191 def copy(self):
192 return self.__class__(param=self.params.copy(), url=self.url)
193
194 @property
195 def query_string(self):
196 params = self.params.copy()
197 for key, value in self.fixed_params.items():
198 params[key] = value
199 return params.query_string
200
201
202
203 def create_identify_request(req_data, param):
111204 req_data = req_data.copy()
112205
113206 # Pop the URL off the request data.
114207 url = req_data['url']
115208 del req_data['url']
116209
210 return ArcGISIdentifyRequest(url=url, param=req_data)
211
212 def create_request(req_data, param):
213 req_data = req_data.copy()
214
215 # Pop the URL off the request data.
216 url = req_data['url']
217 del req_data['url']
218
117219 if 'format' in param:
118220 req_data['format'] = param['format']
119221
122224 req_data['transparent'] = str(req_data['transparent'])
123225
124226 return ArcGISRequest(url=url, param=req_data)
227
228
229 def rest_endpoint(url):
230 parts = urlparse.urlsplit(url)
231 path = parts.path.rstrip('/').split('/')
232
233 if path[-1] in ('export', 'exportImage'):
234 if path[-2] == 'MapServer':
235 path[-1] = 'export'
236 elif path[-2] == 'ImageServer':
237 path[-1] = 'exportImage'
238 elif path[-1] == 'MapServer':
239 path.append('export')
240 elif path[-1] == 'ImageServer':
241 path.append('exportImage')
242
243 parts = parts[0], parts[1], '/'.join(path), parts[3], parts[4]
244 return urlparse.urlunsplit(parts)
245
246
247 def rest_identify_endpoint(url):
248 parts = urlparse.urlsplit(url)
249 path = parts.path.rstrip('/').split('/')
250
251 if path[-1] in ('export', 'exportImage'):
252 path[-1] = 'identify'
253 elif path[-1] in ('MapServer', 'ImageServer'):
254 path.append('identify')
255
256 parts = parts[0], parts[1], '/'.join(path), parts[3], parts[4]
257 return urlparse.urlunsplit(parts)
258
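A few examples of the endpoint rewriting above (hypothetical service URLs):

rest_endpoint('http://example.org/arcgis/rest/services/World/MapServer')
# -> 'http://example.org/arcgis/rest/services/World/MapServer/export'
rest_endpoint('http://example.org/arcgis/rest/services/Elev/ImageServer/export')
# -> 'http://example.org/arcgis/rest/services/Elev/ImageServer/exportImage'
rest_identify_endpoint('http://example.org/arcgis/rest/services/World/MapServer')
# -> 'http://example.org/arcgis/rest/services/World/MapServer/identify'
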
733733 Version('1.3.0'): (('text', 'text/plain'),
734734 ('html', 'text/html'),
735735 ('xml', 'text/xml'),
736 ('json', 'application/json'),
736737 ),
737738 None: (('text', 'text/plain'),
738739 ('html', 'text/html'),
739740 ('xml', 'application/vnd.ogc.gml'),
741 ('json', 'application/json'),
740742 )
741743 }
742744
9090
9191 self.last_modified = timestamp
9292 if (timestamp or etag_data) and max_age is not None:
93 self.headers['Cache-control'] = 'max-age=%d public' % max_age
93 self.headers['Cache-control'] = 'public, max-age=%d, s-maxage=%d' % (max_age, max_age)
9494
9595 def make_conditional(self, req):
9696 """
9595 parser = optparse.OptionParser("%prog grids [options] mapproxy_conf")
9696 parser.add_option("-f", "--mapproxy-conf", dest="mapproxy_conf",
9797 help="MapProxy configuration")
98
99 parser.add_option("-q", "--quiet",
100 action="count", dest="quiet", default=0,
101 help="reduce number of messages to stdout, repeat to disable progress output")
98102
99103 parser.add_option("--source", dest="source",
100104 help="source to export (source or cache)")
203207 'type': 'mbtiles',
204208 'filename': options.dest,
205209 }
210 elif options.type == 'sqlite':
211 cache_conf['cache'] = {
212 'type': 'sqlite',
213 'directory': options.dest,
214 }
215 elif options.type == 'geopackage':
216 cache_conf['cache'] = {
217 'type': 'geopackage',
218 'filename': options.dest,
219 }
220 elif options.type == 'compact-v1':
221 cache_conf['cache'] = {
222 'type': 'compact',
223 'version': 1,
224 'directory': options.dest,
225 }
206226 elif options.type in ('tc', 'mapproxy'):
207227 cache_conf['cache'] = {
208228 'type': 'file',
256276
257277 print(format_export_task(task, custom_grid=custom_grid))
258278
259 logger = ProgressLog(verbose=True, silent=False)
279 logger = ProgressLog(verbose=options.quiet==0, silent=options.quiet>=2)
260280 try:
261281 seed_task(task, progress_logger=logger, dry_run=options.dry_run,
262282 concurrency=options.concurrency)
9090 if args[0] == '-':
9191 values = values_from_stdin()
9292 elif options.eval:
93 values = map(eval, args)
93 values = [eval(a) for a in args]
9494 else:
95 values = map(float, args)
95 values = [float(a) for a in args]
9696
9797 values.sort(reverse=True)
9898
1515 from __future__ import print_function
1616
1717 import os
18 from mapproxy.compat.itertools import izip_longest
1819 from mapproxy.seed.util import format_cleanup_task
1920 from mapproxy.util.fs import cleanup_directory
20 from mapproxy.seed.seeder import TileWorkerPool, TileWalker, TileCleanupWorker
21 from mapproxy.seed.seeder import (
22 TileWorkerPool, TileWalker, TileCleanupWorker,
23 SeedProgress,
24 )
25 from mapproxy.seed.util import ProgressLog
2126
2227 def cleanup(tasks, concurrency=2, dry_run=False, skip_geoms_for_last_levels=0,
2328 verbose=True, progress_logger=None):
2732 if task.coverage is False:
2833 continue
2934
35 # seed_progress for tilewalker cleanup
36 seed_progress = None
37 # cleanup_progress for os.walk based cleanup
38 cleanup_progress = None
39 if progress_logger and progress_logger.progress_store:
40 progress_logger.current_task_id = task.id
41 start_progress = progress_logger.progress_store.get(task.id)
42 seed_progress = SeedProgress(old_progress_identifier=start_progress)
43 cleanup_progress = DirectoryCleanupProgress(old_dir=start_progress)
44
3045 if task.complete_extent:
31 if hasattr(task.tile_manager.cache, 'level_location'):
32 simple_cleanup(task, dry_run=dry_run, progress_logger=progress_logger)
46 if callable(getattr(task.tile_manager.cache, 'level_location', None)):
47 simple_cleanup(task, dry_run=dry_run, progress_logger=progress_logger,
48 cleanup_progress=cleanup_progress)
3349 continue
34 elif hasattr(task.tile_manager.cache, 'remove_level_tiles_before'):
50 elif callable(getattr(task.tile_manager.cache, 'remove_level_tiles_before', None)):
3551 cache_cleanup(task, dry_run=dry_run, progress_logger=progress_logger)
3652 continue
3753
3854 tilewalker_cleanup(task, dry_run=dry_run, concurrency=concurrency,
3955 skip_geoms_for_last_levels=skip_geoms_for_last_levels,
40 progress_logger=progress_logger)
56 progress_logger=progress_logger,
57 seed_progress=seed_progress,
58 )
4159
42 def simple_cleanup(task, dry_run, progress_logger=None):
60
61 def simple_cleanup(task, dry_run, progress_logger=None, cleanup_progress=None):
4362 """
4463 Cleanup cache level on file system level.
4564 """
65
4666 for level in task.levels:
4767 level_dir = task.tile_manager.cache.level_location(level)
4868 if dry_run:
5272 file_handler = None
5373 if progress_logger:
5474 progress_logger.log_message('removing old tiles in ' + normpath(level_dir))
75 if progress_logger.progress_store:
76 cleanup_progress.step_dir(level_dir)
77 if cleanup_progress.already_processed():
78 continue
79 progress_logger.progress_store.add(
80 task.id,
81 cleanup_progress.current_progress_identifier(),
82 )
83 progress_logger.progress_store.write()
84
5585 cleanup_directory(level_dir, task.remove_timestamp,
5686 file_handler=file_handler, remove_empty_dirs=True)
5787
77107 return path
78108
79109 def tilewalker_cleanup(task, dry_run, concurrency, skip_geoms_for_last_levels,
80 progress_logger=None):
110 progress_logger=None, seed_progress=None):
81111 """
82112 Cleanup tiles with tile traversal.
83113 """
87117 dry_run=dry_run, size=concurrency)
88118 tile_walker = TileWalker(task, tile_worker_pool, handle_stale=True,
89119 work_on_metatiles=False, progress_logger=progress_logger,
90 skip_geoms_for_last_levels=skip_geoms_for_last_levels)
120 skip_geoms_for_last_levels=skip_geoms_for_last_levels,
121 seed_progress=seed_progress)
91122 try:
92123 tile_walker.walk()
93124 except KeyboardInterrupt:
95126 raise
96127 finally:
97128 tile_worker_pool.stop()
129
130
131 class DirectoryCleanupProgress(object):
132 def __init__(self, old_dir=None):
133 self.old_dir = old_dir
134 self.current_dir = None
135
136 def step_dir(self, dir):
137 self.current_dir = dir
138
139 def already_processed(self):
140 return self.can_skip(self.old_dir, self.current_dir)
141
142 def current_progress_identifier(self):
143 if self.already_processed() or self.current_dir is None:
144 return self.old_dir
145 return self.current_dir
146
147 @staticmethod
148 def can_skip(old_dir, current_dir):
149 """
150 Return True if `current_dir` comes before `old_dir` when compared
151 lexicographically.
152
153 >>> DirectoryCleanupProgress.can_skip(None, '/00')
154 False
155 >>> DirectoryCleanupProgress.can_skip(None, '/00/000/000')
156 False
157
158 >>> DirectoryCleanupProgress.can_skip('/01/000/001', '/00')
159 True
160 >>> DirectoryCleanupProgress.can_skip('/01/000/001', '/01/000/000')
161 True
162 >>> DirectoryCleanupProgress.can_skip('/01/000/001', '/01/000/000/000')
163 True
164 >>> DirectoryCleanupProgress.can_skip('/01/000/001', '/01/000/001')
165 False
166 >>> DirectoryCleanupProgress.can_skip('/01/000/001', '/01/000/001/000')
167 False
168 """
169 if old_dir is None:
170 return False
171 if current_dir is None:
172 return False
173 for old, current in izip_longest(old_dir.split(os.path.sep), current_dir.split(os.path.sep), fillvalue=None):
174 if old is None:
175 return False
176 if current is None:
177 return False
178 if old < current:
179 return False
180 if old > current:
181 return True
182 return False
183
184 def running(self):
185 return True
1414
1515 from __future__ import print_function
1616
17 import errno
18 import os
19 import re
20 import signal
1721 import sys
22 import time
1823 import logging
1924 from logging.config import fileConfig
2025
21 from optparse import OptionParser
26 from subprocess import Popen
27 from optparse import OptionParser, OptionValueError
2228
2329 from mapproxy.config.loader import load_configuration, ConfigurationError
2430 from mapproxy.seed.config import load_seed_tasks_conf
2834 ProgressLog, ProgressStore)
2935 from mapproxy.seed.cachelock import CacheLocker
3036
37 SECONDS_PER_DAY = 60 * 60 * 24
38 SECONDS_PER_MINUTE = 60
39
3140 def setup_logging(logging_conf=None):
3241 if logging_conf is not None:
3342 fileConfig(logging_conf, {'here': './'})
4150 "[%(asctime)s] %(name)s - %(levelname)s - %(message)s")
4251 ch.setFormatter(formatter)
4352 mapproxy_log.addHandler(ch)
53
54
55 def check_duration(option, opt, value, parser):
56 try:
57 setattr(parser.values, option.dest, parse_duration(value))
58 except ValueError:
59 raise OptionValueError(
60 "option %s: invalid duration value: %r, expected (10s, 15m, 0.5h, 3d, etc)"
61 % (opt, value),
62 )
63
64
65 def parse_duration(string):
66 match = re.match(r'^(\d*\.?\d+)(s|m|h|d)', string)
67 if not match:
68 raise ValueError('invalid duration, not in format: 10s, 0.5h, etc.')
69 duration = float(match.group(1))
70 unit = match.group(2)
71 if unit == 's':
72 return duration
73 duration *= 60
74 if unit == 'm':
75 return duration
76 duration *= 60
77 if unit == 'h':
78 return duration
79 duration *= 24
80 return duration
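
A few examples of the accepted duration values:

parse_duration('90s')   # 90.0 (seconds)
parse_duration('15m')   # 900.0
parse_duration('0.5h')  # 1800.0
parse_duration('2d')    # 172800.0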
81
4482
4583 class SeedScript(object):
4684 usage = "usage: %prog [options] seed_conf"
96134 default=None,
97135 help="filename for storing the seed progress (for --continue option)")
98136
137 parser.add_option("--duration", dest="duration",
138 help="stop seeding after (120s, 15m, 4h, 0.5d, etc)",
139 type=str, action="callback", callback=check_duration)
140
141 parser.add_option("--reseed-file", dest="reseed_file",
142 help="start of last re-seed", metavar="FILE",
143 default=None)
144 parser.add_option("--reseed-interval", dest="reseed_interval",
145 help="only start seeding if --reseed-file is older then --reseed-interval",
146 metavar="DURATION",
147 type=str, action="callback", callback=check_duration,
148 default=None)
149
99150 parser.add_option("--log-config", dest='logging_conf', default=None,
100151 help="logging configuration")
101152
117168
118169 setup_logging(options.logging_conf)
119170
171 if options.duration:
172 # calls with --duration are handled in call_with_duration
173 sys.exit(self.call_with_duration(options, args))
174
120175 try:
121176 mapproxy_conf = load_configuration(options.conf_file, seed=True)
122177 except ConfigurationError as ex:
131186 if not sys.stdout.isatty() and options.quiet == 0:
132187 # disable verbose output for non-ttys
133188 options.quiet = 1
189
190 progress = None
191 if options.continue_seed or options.progress_file:
192 if not options.progress_file:
193 options.progress_file = '.mapproxy_seed_progress'
194 progress = ProgressStore(options.progress_file,
195 continue_seed=options.continue_seed)
196
197 if options.reseed_file:
198 if not os.path.exists(options.reseed_file):
199 # create --reseed-file if missing
200 with open(options.reseed_file, 'w'):
201 pass
202 else:
203 if progress and not os.path.exists(options.progress_file):
204 # we have an existing --reseed-file but no --progress-file
205 # meaning the last seed call was completed
206 if options.reseed_interval and (
207 os.path.getmtime(options.reseed_file) > (time.time() - options.reseed_interval)
208 ):
209 print("no need for re-seeding")
210 sys.exit(1)
211 os.utime(options.reseed_file, (time.time(), time.time()))
134212
135213 with mapproxy_conf:
136214 try:
150228 for task in cleanup_tasks:
151229 print(format_cleanup_task(task))
152230 return 0
153
154 progress = None
155 if options.continue_seed or options.progress_file:
156 if options.progress_file:
157 progress_file = options.progress_file
158 else:
159 progress_file = '.mapproxy_seed_progress'
160 progress = ProgressStore(progress_file,
161 continue_seed=options.continue_seed)
162231
163232 try:
164233 if options.interactive:
177246 print('========== Cleanup tasks ==========')
178247 print('Start cleanup process (%d task%s)' % (
179248 len(cleanup_tasks), 's' if len(cleanup_tasks) > 1 else ''))
180 logger = ProgressLog(verbose=options.quiet==0, silent=options.quiet>=2)
249 logger = ProgressLog(verbose=options.quiet==0, silent=options.quiet>=2,
250 progress_store=progress)
181251 cleanup(cleanup_tasks, verbose=options.quiet==0, dry_run=options.dry_run,
182252 concurrency=options.concurrency, progress_logger=logger,
183253 skip_geoms_for_last_levels=options.geom_levels)
224294
225295 return seed_names, cleanup_names
226296
297 def call_with_duration(self, options, args):
298 # --duration is implemented by calling mapproxy-seed again in a separate
299 # process (but without --duration) and terminating that process
300 # after --duration
301
302 argv = sys.argv[:]
303 for i, arg in enumerate(sys.argv):
304 if arg == '--duration':
305 argv = sys.argv[:i] + sys.argv[i+2:]
306 break
307 elif arg.startswith('--duration='):
308 argv = sys.argv[:i] + sys.argv[i+1:]
309 break
310
311 # call mapproxy-seed again, poll status, terminate after --duration
312 cmd = Popen(args=argv)
313 start = time.time()
314 while True:
315 if (time.time() - start) > options.duration:
316 try:
317 cmd.send_signal(signal.SIGINT)
318 # try to stop with sigint
319 # send sigterm after 10 seconds
320 for _ in range(10):
321 time.sleep(1)
322 if cmd.poll() is not None:
323 break
324 else:
325 cmd.terminate()
326 except OSError as ex:
327 if ex.errno != errno.ESRCH: # no such process
328 raise
329 return 0
330 if cmd.poll() is not None:
331 return cmd.returncode
332 try:
333 time.sleep(1)
334 except KeyboardInterrupt:
335 # force termination
336 start = 0
337
338
227339 def interactive(self, seed_tasks, cleanup_tasks):
228340 selected_seed_tasks = []
229341 print('========== Select seeding tasks ==========')
263375 result.extend(args.split(','))
264376 return result
265377
378
266379 if __name__ == '__main__':
267380 main()
1515 from __future__ import print_function, division
1616
1717 import sys
18 from collections import deque
1819 from contextlib import contextmanager
1920 import time
2021 try:
3132 from mapproxy.seed.util import format_seed_task, timestamp
3233 from mapproxy.seed.cachelock import DummyCacheLocker, CacheLockedError
3334
34 from mapproxy.seed.util import (exp_backoff, ETA, limit_sub_bbox,
35 from mapproxy.seed.util import (exp_backoff, limit_sub_bbox,
3536 status_symbol, BackoffError)
3637
3738 import logging
5354 queue_class = multiprocessing.Queue
5455
5556
56 class TileProcessor(object):
57 def __init__(self, dry_run=False):
58 self._lastlog = time.time()
59 self.dry_run = dry_run
60
61 def log_progress(self, progress):
62 if (self._lastlog + .1) < time.time():
63 # log progress at most every 100ms
64 print('[%s] %6.2f%% %s \tETA: %s\r' % (
65 timestamp(), progress[1]*100, progress[0],
66 progress[2]
67 ), end=' ')
68 sys.stdout.flush()
69 self._lastlog = time.time()
70
71 def process(self, tiles, progress):
72 if not self.dry_run:
73 self.process_tiles(tiles)
74
75 self.log_progress(progress)
76
77 def stop(self):
78 raise NotImplementedError()
79
80 def process_tiles(self, tiles):
81 raise NotImplementedError()
82
83
84 class TileWorkerPool(TileProcessor):
57 class TileWorkerPool(object):
8558 """
8659 Manages multiple TileWorker.
8760 """
8861 def __init__(self, task, worker_class, size=2, dry_run=False, progress_logger=None):
89 TileProcessor.__init__(self, dry_run=dry_run)
9062 self.tiles_queue = queue_class(size)
9163 self.task = task
9264 self.dry_run = dry_run
192164 class SeedProgress(object):
193165 def __init__(self, old_progress_identifier=None):
194166 self.progress = 0.0
195 self.eta = ETA()
196167 self.level_progress_percentages = [1.0]
197 self.level_progresses = []
168 self.level_progresses = None
169 self.level_progresses_level = 0
198170 self.progress_str_parts = []
199 self.old_level_progresses = None
200 if old_progress_identifier is not None:
201 self.old_level_progresses = old_progress_identifier
171 self.old_level_progresses = old_progress_identifier
202172
203173 def step_forward(self, subtiles=1):
204174 self.progress += self.level_progress_percentages[-1] / subtiles
205 self.eta.update(self.progress)
206175
207176 @property
208177 def progress_str(self):
210179
211180 @contextmanager
212181 def step_down(self, i, subtiles):
182 if self.level_progresses is None:
183 self.level_progresses = []
184 self.level_progresses = self.level_progresses[:self.level_progresses_level]
213185 self.level_progresses.append((i, subtiles))
186 self.level_progresses_level += 1
214187 self.progress_str_parts.append(status_symbol(i, subtiles))
215188 self.level_progress_percentages.append(self.level_progress_percentages[-1] / subtiles)
189
216190 yield
191
217192 self.level_progress_percentages.pop()
218193 self.progress_str_parts.pop()
219 self.level_progresses.pop()
194
195 self.level_progresses_level -= 1
196 if self.level_progresses_level == 0:
197 self.level_progresses = []
220198
221199 def already_processed(self):
222 if self.old_level_progresses == []:
223 return True
224
225 if self.old_level_progresses is None:
226 return False
227
228 if self.progress_is_behind(self.old_level_progresses, self.level_progresses):
229 return True
230 else:
231 return False
200 return self.can_skip(self.old_level_progresses, self.level_progresses)
232201
233202 def current_progress_identifier(self):
234 return self.level_progresses
203 if self.already_processed() or self.level_progresses is None:
204 return self.old_level_progresses
205 return self.level_progresses[:]
235206
236207 @staticmethod
237 def progress_is_behind(old_progress, current_progress):
208 def can_skip(old_progress, current_progress):
238209 """
239210 Return True if `current_progress` is behind `old_progress`, i.e.
240211 it has not yet reached the old progress and can be skipped.
241212
242 >>> SeedProgress.progress_is_behind([], [(0, 1)])
213 >>> SeedProgress.can_skip(None, [(0, 4)])
214 False
215 >>> SeedProgress.can_skip([], [(0, 4)])
243216 True
244 >>> SeedProgress.progress_is_behind([(0, 1), (1, 4)], [(0, 1)])
245 False
246 >>> SeedProgress.progress_is_behind([(0, 1), (1, 4)], [(0, 1), (0, 4)])
217 >>> SeedProgress.can_skip([(0, 4)], None)
218 False
219 >>> SeedProgress.can_skip([(0, 4)], [(0, 4)])
220 False
221 >>> SeedProgress.can_skip([(1, 4)], [(0, 4)])
247222 True
248 >>> SeedProgress.progress_is_behind([(0, 1), (1, 4)], [(0, 1), (1, 4)])
223 >>> SeedProgress.can_skip([(0, 4)], [(0, 4), (0, 4)])
224 False
225
226 >>> SeedProgress.can_skip([(0, 4), (0, 4), (2, 4)], [(0, 4), (0, 4)])
227 False
228 >>> SeedProgress.can_skip([(0, 4), (0, 4), (2, 4)], [(0, 4), (0, 4), (1, 4)])
249229 True
250 >>> SeedProgress.progress_is_behind([(0, 1), (1, 4)], [(0, 1), (3, 4)])
251 False
252
253 """
254 for old, current in izip_longest(old_progress, current_progress, fillvalue=(9e15, 9e15)):
230 >>> SeedProgress.can_skip([(0, 4), (0, 4), (2, 4)], [(0, 4), (0, 4), (2, 4)])
231 False
232 >>> SeedProgress.can_skip([(0, 4), (0, 4), (2, 4)], [(0, 4), (0, 4), (3, 4)])
233 False
234 >>> SeedProgress.can_skip([(0, 4), (0, 4), (2, 4)], [(0, 4), (1, 4)])
235 False
236 >>> SeedProgress.can_skip([(0, 4), (0, 4), (2, 4)], [(0, 4), (1, 4), (0, 4)])
237 False
238 """
239 if current_progress is None:
240 return False
241 if old_progress is None:
242 return False
243 if old_progress == []:
244 return True
245 for old, current in izip_longest(old_progress, current_progress, fillvalue=None):
246 if old is None:
247 return False
248 if current is None:
249 return False
255250 if old < current:
256251 return False
257252 if old > current:
258253 return True
259 return True
254 return False
260255
261256 def running(self):
262257 return True
269264
270265
271266 class TileWalker(object):
267 """
268 TileWalker traverses all tiles of a tile grid and calls worker_pool.process
269 for each (meta) tile. It walks the tile grid (pyramid) depth-first.
270 Intersections with coverages are checked before handling the subtiles of the
271 next level, which allows it to determine whether all subtiles should be seeded or skipped.
272 """
272273 def __init__(self, task, worker_pool, handle_stale=False, handle_uncached=False,
273274 work_on_metatiles=True, skip_geoms_for_last_levels=0, progress_logger=None,
274275 seed_progress=None):
282283 self.progress_logger = progress_logger
283284
284285 num_seed_levels = len(task.levels)
285 self.report_till_level = task.levels[int(num_seed_levels * 0.8)]
286 if num_seed_levels >= 4:
287 self.report_till_level = task.levels[num_seed_levels-2]
288 else:
289 self.report_till_level = task.levels[num_seed_levels-1]
286290 meta_size = self.tile_mgr.meta_grid.meta_size if self.tile_mgr.meta_grid else (1, 1)
287291 self.tiles_per_metatile = meta_size[0] * meta_size[1]
288292 self.grid = MetaGrid(self.tile_mgr.grid, meta_size=meta_size, meta_buffer=0)
289293 self.count = 0
290294 self.seed_progress = seed_progress or SeedProgress()
295
296 # It is possible that we 'walk' through the same tile multiple times
297 # when seeding irregular tile grids[0]. limit_sub_bbox prevents us from
298 # recursing into the same area multiple times, but a tile can still be
299 # processed more than once. Locking prevents a tile from being seeded
300 # multiple times, but we can still count the same tile multiple times
301 # (in dry-mode, or while the tile is in the process queue).
302
303 # Tile counts can be off by 280% with sqrt2 grids.
304 # We keep a small cache of already processed tiles to skip most duplicates.
305 # A simple cache of 64 tile coordinates for each level already brings the
306 # difference down to ~8%, which is good enough and faster than a more
307 # sophisticated FIFO cache with O(1) lookup, or even caching all tiles.
308
309 # [0] irregular tile grids: where one tile does not have exactly 4 subtiles
310 # Typically when you use res_factor, or a custom res list.
311 self.seeded_tiles = {l: deque(maxlen=64) for l in task.levels}
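
The deduplication described in the comment above boils down to a tiny per-level FIFO of recent tile coordinates; a standalone sketch of the idea (not MapProxy API):

    from collections import deque

    recent = deque(maxlen=64)  # oldest coordinates fall out automatically

    def process_once(tile_coord):
        if tile_coord in recent:      # linear scan, but over at most 64 items
            return False              # duplicate within the recent window
        recent.appendleft(tile_coord)
        return True

Membership testing on a deque is O(n), but with n capped at 64 this stays cheaper than a dict-based FIFO cache with O(1) lookup, which matches the reasoning in the comment.
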
291312
292313 def walk(self):
293314 assert self.handle_stale or self.handle_uncached
329350 if current_level in levels:
330351 levels = levels[1:]
331352 process = True
332 current_level += 1
333353
334354 for i, (subtile, sub_bbox, intersection) in enumerate(subtiles):
335355 if subtile is None: # no intersection
346366 if self.seed_progress.already_processed():
347367 self.seed_progress.step_forward()
348368 else:
349 self._walk(sub_bbox, levels, current_level=current_level,
369 self._walk(sub_bbox, levels, current_level=current_level+1,
350370 all_subtiles=all_subtiles)
351371
352372 if not process:
353373 continue
374
375 # check if subtile was already processed. see comment in __init__
376 if subtile in self.seeded_tiles[current_level]:
377 continue
378 self.seeded_tiles[current_level].appendleft(subtile)
354379
355380 if not self.work_on_metatiles:
356381 # collect actual tiles
434459 self.remove_timestamp = remove_timestamp
435460 self.coverage = coverage
436461 self.complete_extent = complete_extent
462
463 @property
464 def id(self):
465 return 'cleanup', self.md['name'], self.md['cache_name'], self.md['grid_name']
437466
438467 def intersects(self, bbox):
439468 if self.coverage.contains(bbox, self.grid.srs): return CONTAINS
4141 dict.__setitem__(self, key, val)
4242 dict.__setitem__(self, val, key)
4343
44 class ETA(object):
45 def __init__(self):
46 self.avgs = []
47 self.last_tick_start = time.time()
48 self.progress = 0.0
49 self.ticks = 10000
50 self.tick_duration_sums = 0.0
51 self.tick_duration_divisor = 0.0
52 self.tick_count = 0
53
54 def update(self, progress):
55 self.progress = progress
56 missing_ticks = (self.progress * self.ticks) - self.tick_count
57 if missing_ticks:
58 tick_duration = (time.time() - self.last_tick_start) / missing_ticks
59
60 while missing_ticks > 0:
61
62 # reduce the influence of older measurements
63 self.tick_duration_sums *= 0.999
64 self.tick_duration_divisor *= 0.999
65
66 self.tick_count += 1
67
68 self.tick_duration_sums += tick_duration
69 self.tick_duration_divisor += 1
70
71 missing_ticks -= 1
72
73 self.last_tick_start = time.time()
74
75 def eta_string(self):
76 timestamp = self.eta()
77 if timestamp is None:
78 return 'N/A'
79 try:
80 return time.strftime('%Y-%m-%d-%H:%M:%S', time.localtime(timestamp))
81 except (ValueError, OSError): # OSError since Py 3.3
82 # raised when time is out of range (e.g. year >2038)
83 return 'N/A'
84
85 def eta(self):
86 if not self.tick_count: return
87 return (self.last_tick_start +
88 ((self.tick_duration_sums/self.tick_duration_divisor)
89 * (self.ticks - self.tick_count)))
90
91 def __str__(self):
92 return self.eta_string()
93
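
The removed ETA class estimated remaining time from an exponentially decayed average of tick durations: each update multiplies the running sums by 0.999 before adding the new sample. The core of that technique as a standalone sketch (the class name here is chosen for illustration):

    class DecayedAverage(object):
        # running average that slowly forgets old samples
        def __init__(self, decay=0.999):
            self.sums = 0.0
            self.divisor = 0.0
            self.decay = decay

        def add(self, duration):
            self.sums = self.sums * self.decay + duration
            self.divisor = self.divisor * self.decay + 1.0

        def average(self):
            return self.sums / self.divisor  # recent samples dominate

With a decay of 0.999, a sample's weight halves after roughly 690 updates (ln 2 / ln(1/0.999)), so the estimate adapts to changing seeding speed.
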
9444 class ProgressStore(object):
9545 """
9646 Reads and stores seed progresses to a file.
14191 if not out:
14292 out = sys.stdout
14393 self.out = out
144 self.lastlog = time.time()
94 self._laststep = time.time()
95 self._lastprogress = 0
96
14597 self.verbose = verbose
14698 self.silent = silent
14799 self.current_task_id = None
156108 def log_step(self, progress):
157109 if not self.verbose:
158110 return
159 if (self.lastlog + .1) < time.time():
160 # log progress at most every 100ms
161 self.out.write('[%s] %6.2f%%\t%-20s ETA: %s\r' % (
111 if (self._laststep + .5) < time.time():
112 # log progress at most every 500ms
113 self.out.write('[%s] %6.2f%%\t%-20s \r' % (
162114 timestamp(), progress.progress*100, progress.progress_str,
163 progress.eta
164115 ))
165116 self.out.flush()
166 self.lastlog = time.time()
117 self._laststep = time.time()
167118
168119 def log_progress(self, progress, level, bbox, tiles):
169 if self.progress_store and self.current_task_id:
170 self.progress_store.add(self.current_task_id,
171 progress.current_progress_identifier())
172 self.progress_store.write()
120 progress_interval = 1
121 if not self.verbose:
122 progress_interval = 30
123
124 log_progress = False
125 if progress.progress == 1.0 or (self._lastprogress + progress_interval) < time.time():
126 self._lastprogress = time.time()
127 log_progress = True
128
129 if log_progress:
130 if self.progress_store and self.current_task_id:
131 self.progress_store.add(self.current_task_id,
132 progress.current_progress_identifier())
133 self.progress_store.write()
173134
174135 if self.silent:
175136 return
176 self.out.write('[%s] %2s %6.2f%% %s (%d tiles) ETA: %s\n' % (
177 timestamp(), level, progress.progress*100,
178 format_bbox(bbox), tiles, progress.eta))
179 self.out.flush()
137
138 if log_progress:
139 self.out.write('[%s] %2s %6.2f%% %s (%d tiles)\n' % (
140 timestamp(), level, progress.progress*100,
141 format_bbox(bbox), tiles))
142 self.out.flush()
180143
181144
182145 def limit_sub_bbox(bbox, sub_bbox):
00 <?xml version="1.0"?>
1 <Capabilities xmlns="http://www.opengis.net/wmts/1.0" xmlns:ows="http://www.opengis.net/ows/1.1" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:gml="http://www.opengis.net/gml" xsi:schemaLocation="http://www.opengis.net/wmts/1.0 ../wmtsGetCapabilities_response.xsd" version="1.0.0">
1 <Capabilities xmlns="http://www.opengis.net/wmts/1.0" xmlns:ows="http://www.opengis.net/ows/1.1" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:gml="http://www.opengis.net/gml" xsi:schemaLocation="http://www.opengis.net/wmts/1.0 http://schemas.opengis.net/wmts/1.0/wmtsGetCapabilities_response.xsd" version="1.0.0">
22 <ows:ServiceIdentification>
33 <ows:Title>{{service.title}}</ows:Title>
44 <ows:Abstract>{{service.abstract}}</ows:Abstract>
547547 if layer_task.exception is None:
548548 layer, layer_img = layer_task.result
549549 if layer_img is not None:
550 layer_merger.add(layer_img, layer=layer)
550 layer_merger.add(layer_img, layer.coverage)
551551 else:
552552 ex = layer_task.exception
553553 async_pool.shutdown(True)
565565 if layer_task.exception is None:
566566 layer, layer_img = layer_task.result
567567 if layer_img is not None:
568 layer_merger.add(layer_img, layer=layer)
568 layer_merger.add(layer_img, layer.coverage)
569569 rendered += 1
570570 else:
571571 layer_merger.cacheable = False
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 from mapproxy.source.wms import WMSSource
15 from mapproxy.source.wms import WMSSource, WMSInfoSource
1616
1717 import logging
1818 log = logging.getLogger('mapproxy.source.arcgis')
2020
2121 class ArcGISSource(WMSSource):
2222 def __init__(self, client, image_opts=None, coverage=None,
23 supported_srs=None, supported_formats=None):
24 WMSSource.__init__(self, client, image_opts=image_opts, coverage=coverage,
25 supported_srs=supported_srs, supported_formats=supported_formats)
23 res_range=None, supported_srs=None, supported_formats=None):
24 WMSSource.__init__(self, client, image_opts=image_opts,
25 coverage=coverage, res_range=res_range,
26 supported_srs=supported_srs,
27 supported_formats=supported_formats)
28
29
30 class ArcGISInfoSource(WMSInfoSource):
31 def __init__(self, client):
32 self.client = client
33
34 def get_info(self, query):
35 doc = self.client.get_info(query)
36 return doc
1414
1515 from __future__ import print_function
1616
17 import re
1718 import threading
1819 import sys
1920 import cgi
3536 else:
3637 from http.server import HTTPServer as HTTPServer_, BaseHTTPRequestHandler
3738
38 class RequestsMissmatchError(AssertionError):
39 class RequestsMismatchError(AssertionError):
3940 def __init__(self, assertions):
4041 self.assertions = assertions
4142
4344 assertions = []
4445 for assertion in self.assertions:
4546 assertions.append(text_indent(str(assertion), ' ', ' - '))
46 return 'requests missmatch:\n' + '\n'.join(assertions)
47 return 'requests mismatch:\n' + '\n'.join(assertions)
4748
4849 class RequestError(str):
4950 pass
5556 text = first_indent + text
5657 return text.replace('\n', '\n' + indent)
5758
58 class RequestMissmatch(object):
59 class RequestMismatch(object):
5960 def __init__(self, msg, expected, actual):
6061 self.msg = msg
6162 self.expected = expected
6263 self.actual = actual
6364
6465 def __str__(self):
65 return ('requests missmatch, expected:\n' +
66 return ('requests mismatch, expected:\n' +
6667 text_indent(str(self.expected), ' ') +
6768 '\n got:\n' + text_indent(str(self.actual), ' '))
6869
161162 if 'method' in req:
162163 if req['method'] != method:
163164 self.server.assertions.append(
164 RequestMissmatch('unexpected method', req['method'], method)
165 RequestMismatch('unexpected method', req['method'], method)
165166 )
166167 self.server.shutdown = True
167168 if req.get('require_basic_auth', False):
176177 for k, v in req['headers'].items():
177178 if k not in self.headers:
178179 self.server.assertions.append(
179 RequestMissmatch('missing header', k, self.headers)
180 RequestMismatch('missing header', k, self.headers)
180181 )
181182 elif self.headers[k] != v:
182183 self.server.assertions.append(
183 RequestMissmatch('header missmatch', '%s: %s' % (k, v), self.headers)
184 RequestMismatch('header mismatch', '%s: %s' % (k, v), self.headers)
184185 )
185186 if not query_comparator(req['path'], self.query_data):
186187 self.server.assertions.append(
187 RequestMissmatch('requests differ', req['path'], self.query_data)
188 RequestMismatch('requests differ', req['path'], self.query_data)
188189 )
189190 query_actual = set(query_to_dict(self.query_data).items())
190191 query_expected = set(query_to_dict(req['path']).items())
191192 self.server.assertions.append(
192 RequestMissmatch('requests params differ', query_expected - query_actual, query_actual - query_expected)
193 RequestMismatch('requests params differ', query_expected - query_actual, query_actual - query_expected)
193194 )
194195 self.server.shutdown = True
195196 if 'req_assert_function' in req:
270271
271272 if not self._thread.sucess and value:
272273 print('requests to mock httpd did not '
273 'match expectations:\n %s' % RequestsMissmatchError(self._thread.assertions))
274 'match expectations:\n %s' % RequestsMismatchError(self._thread.assertions))
274275 if value:
275276 raise reraise((type, value, traceback))
276277 if not self._thread.sucess:
277 raise RequestsMissmatchError(self._thread.assertions)
278 raise RequestsMismatchError(self._thread.assertions)
278279
279280 def wms_query_eq(expected, actual):
280281 """
311312
312313 return True
313314
315 numbers_only = re.compile(r'^-?\d+\.\d+(,-?\d+\.\d+)*$')
316
314317 def query_eq(expected, actual):
315318 """
316319 >>> query_eq('bAR=baz&foo=bizz', 'foO=bizz&bar=baz')
321324 True
322325 >>> query_eq('/1/2/3.png', '/1/2/0.png')
323326 False
324 """
325 return (query_to_dict(expected) == query_to_dict(actual) and
326 path_from_query(expected) == path_from_query(actual))
327
328 def assert_query_eq(expected, actual):
327 >>> query_eq('/map?point=2.9999999999,1.00000000001', '/map?point=3.0,1.0')
328 True
329 """
330
331 if path_from_query(expected) != path_from_query(actual):
332 return False
333
334 expected = query_to_dict(expected)
335 actual = query_to_dict(actual)
336
337 if set(expected.keys()) != set(actual.keys()):
338 return False
339
340 for ke, ve in expected.items():
341 if numbers_only.match(ve):
342 if not float_string_almost_eq(ve, actual[ke]):
343 return False
344 else:
345 if ve != actual[ke]:
346 return False
347
348 return True
349
350 def float_string_almost_eq(expected, actual):
351 """
352 Compare whether two strings of comma-separated floats are almost equal.
353 Both strings must contain only floats.
354
355 >>> float_string_almost_eq('12345678900', '12345678901')
356 False
357 >>> float_string_almost_eq('12345678900.0', '12345678901.0')
358 True
359
360 >>> float_string_almost_eq('12345678900.0,-3.0', '12345678901.0,-2.9999999999')
361 True
362 """
363 if not numbers_only.match(expected) or not numbers_only.match(actual):
364 return False
365
366 expected_nums = [float(x) for x in expected.split(',')]
367 actual_nums = [float(x) for x in actual.split(',')]
368
369 if len(expected_nums) != len(actual_nums):
370 return False
371
372 for e, a in zip(expected_nums, actual_nums):
373 if abs(e - a) > abs((e+a)/2)/10e9:
374 return False
375
376 return True
377
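
`float_string_almost_eq` applies a relative tolerance of roughly 1e-10: the `10e9` literal evaluates to 1e10, so differences below one ten-billionth of the mean magnitude are ignored. A quick usage sketch, assuming the definitions above:

    # differences far below the relative tolerance are treated as equal
    assert float_string_almost_eq('20037508.342789244', '20037508.342789245')
    # differences above it are not
    assert not float_string_almost_eq('20037508.3', '20037509.3')
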
378 def assert_query_eq(expected, actual, fuzzy_number_compare=False):
329379 path_actual = path_from_query(actual)
330380 path_expected = path_from_query(expected)
331381 assert path_expected == path_actual, path_expected + '!=' + path_actual
333383 query_actual = set(query_to_dict(actual).items())
334384 query_expected = set(query_to_dict(expected).items())
335385
336 assert query_expected == query_actual, '%s != %s\t%s|%s' % (
386 if fuzzy_number_compare:
387 equal = query_eq(expected, actual)
388 else:
389 equal = query_expected == query_actual
390 assert equal, '%s != %s\t%s|%s' % (
337391 expected, actual, query_expected - query_actual, query_actual - query_expected)
338392
339393 def path_from_query(query):
390444 yield
391445 except:
392446 if not t.sucess:
393 print(str(RequestsMissmatchError(t.assertions)))
447 print(str(RequestsMismatchError(t.assertions)))
394448 raise
395449 finally:
396450 t.shutdown = True
397451 t.join(1)
398452 if not t.sucess:
399 raise RequestsMissmatchError(t.assertions)
453 raise RequestsMismatchError(t.assertions)
400454
401455 @contextmanager
402456 def mock_single_req_httpd(address, request_handler):
406460 yield
407461 except:
408462 if not t.sucess:
409 print(str(RequestsMissmatchError(t.assertions)))
463 print(str(RequestsMismatchError(t.assertions)))
410464 raise
411465 finally:
412466 t.shutdown = True
413467 t.join(1)
414468 if not t.sucess:
415 raise RequestsMissmatchError(t.assertions)
469 raise RequestsMismatchError(t.assertions)
416470
417471
418472 def make_wsgi_env(query_string, extra_environ={}):
00 services:
11 tms:
2 wms:
3 featureinfo_types: ['json']
24
35 layers:
46 - name: app2_layer
79 - name: app2_with_layers_layer
810 title: ArcGIS Cache Layer
911 sources: [app2_with_layers_cache]
12 - name: app2_with_layers_fi_layer
13 title: ArcGIS Cache Layer
14 sources: [app2_with_layers_fi_cache]
1015 - name: app2_wrong_url_layer
1116 title: ArcGIS Cache Layer
1217 sources: [app2_wrong_url_cache]
1823 app2_with_layers_cache:
1924 grids: [GLOBAL_MERCATOR]
2025 sources: [app2_with_layers_source]
26 app2_with_layers_fi_cache:
27 grids: [GLOBAL_MERCATOR]
28 sources: [app2_with_layers_fi_source]
2129 app2_wrong_url_cache:
2230 grids: [GLOBAL_MERCATOR]
2331 sources: [app2_wrong_url_source]
3139 type: arcgis
3240 req:
3341 layers: show:0,1
34 url: http://localhost:42423/arcgis/rest/services/ExampleLayer/ImageServer
42 url: http://localhost:42423/arcgis/rest/services/ExampleLayer/MapServer
43 app2_with_layers_fi_source:
44 type: arcgis
45 opts:
46 featureinfo: true
47 featureinfo_tolerance: 10
48 featureinfo_return_geometries: true
49 supported_srs: ['EPSG:3857']
50 req:
51 layers: show:1,2,3
52 url: http://localhost:42423/arcgis/rest/services/ExampleLayer/MapServer
3553 app2_wrong_url_source:
3654 type: arcgis
3755 req:
0 globals:
1 cache:
2 base_dir: cache_data/
3
4 services:
5 tms:
6 wms:
7 md:
8 title: MapProxy test fixture
9
10 layers:
11 - name: gpkg
12 title: TMS Cache Layer
13 sources: [gpkg_cache, new_gpkg, new_gpkg_table]
14 - name: gpkg_new
15 title: TMS Cache Layer
16 sources: [new_gpkg]
17
18 caches:
19 gpkg_cache:
20 grids: [cache_grid]
21 cache:
22 type: geopackage
23 filename: ./cache.gpkg
24 table_name: cache
25 tile_lock_dir: ./testlockdir
26 sources: [tms]
27 new_gpkg:
28 grids: [new_grid]
29 sources: []
30 cache:
31 type: geopackage
32 filename: ./cache_new.gpkg
33 table_name: cache
34 tile_lock_dir: ./testlockdir
35 new_gpkg_table:
36 grids: [cache_grid]
37 cache:
38 type: geopackage
39 filename: ./cache.gpkg
40 table_name: new_cache
41 tile_lock_dir: ./testlockdir
42 sources: [tms]
43
44 grids:
45 cache_grid:
46 srs: EPSG:900913
47 new_grid:
48 srs: EPSG:4326
49
50
51 sources:
52 tms:
53 type: tile
54 url: http://localhost:42423/tiles/%(tc_path)s.png
55
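
The geopackage caches configured above are plain SQLite files following the GeoPackage tiles layout. A minimal sketch of inspecting one tile directly with the standard library (file and table names taken from the fixture config above; row-origin conversion is left out):

    import sqlite3

    # fetch a raw tile blob from the 'cache' table of cache.gpkg
    with sqlite3.connect('cache.gpkg') as db:
        cur = db.execute(
            'SELECT tile_data FROM cache '
            'WHERE zoom_level=? AND tile_column=? AND tile_row=?',
            (1, 0, 0))
        row = cur.fetchone()
        tile_png = row[0] if row else None  # None if the tile is missing
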
0 globals:
1 cache:
2 s3:
3 bucket_name: default_bucket
4
5 services:
6 tms:
7 wms:
8 md:
9 title: MapProxy S3
10
11 layers:
12 - name: default
13 title: Default
14 sources: [default_cache]
15 - name: quadkey
16 title: Quadkey
17 sources: [quadkey_cache]
18 - name: reverse
19 title: Reverse
20 sources: [reverse_cache]
21
22 caches:
23 default_cache:
24 grids: [webmercator]
25 cache:
26 type: s3
27 sources: [tms]
28
29 quadkey_cache:
30 grids: [webmercator]
31 cache:
32 type: s3
33 bucket_name: tiles
34 directory_layout: quadkey
35 directory: quadkeytiles
36 sources: [tms]
37
38 reverse_cache:
39 grids: [webmercator]
40 cache:
41 type: s3
42 bucket_name: tiles
43 directory_layout: reverse_tms
44 directory: reversetiles
45 sources: [tms]
46
47 grids:
48 webmercator:
49 name: WebMerc
50 base: GLOBAL_WEBMERCATOR
51
52
53 sources:
54 tms:
55 type: tile
56 url: http://localhost:42423/tiles/%(tc_path)s.png
57
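
A sketch of how the three `directory_layout` settings above translate a tile coordinate into an object key; the concrete keys match the ones uploaded in the S3 system test later in this diff (tile x=1, y=9, z=4):

    def quadkey(x, y, z):
        # interleave the x/y bits, most significant zoom level first
        key = ''
        for i in range(z, 0, -1):
            mask = 1 << (i - 1)
            key += str((1 if x & mask else 0) + (2 if y & mask else 0))
        return key

    # tms (default):  default_cache/WebMerc/4/1/9.png  -> 'z/x/y'
    # quadkey:        quadkeytiles/2003.png            -> quadkey(1, 9, 4) == '2003'
    # reverse_tms:    reversetiles/9/1/4.png           -> 'y/x/z'
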
2525 fax: +49(0)441-9392774-9
2626 email: info@omniscale.de
2727 access_constraints:
28 This service is intended for private and evaluation use only.
29 The data is licensed as Creative Commons Attribution-Share Alike 2.0
30 (http://creativecommons.org/licenses/by-sa/2.0/)
28 Here be dragons.
3129
3230 layers:
3331 - name: wms_cache
2424 fax: +49(0)441-9392774-9
2525 email: info@omniscale.de
2626 access_constraints:
27 This service is intended for private and evaluation use only.
28 The data is licensed as Creative Commons Attribution-Share Alike 2.0
29 (http://creativecommons.org/licenses/by-sa/2.0/)
27 Here be dragons.
3028
3129 layers:
3230 - name: jpeg_cache_tiff_source
4040 fax: +49(0)441-9392774-9
4141 email: info@omniscale.de
4242 access_constraints:
43 This service is intended for private and evaluation use only.
44 The data is licensed as Creative Commons Attribution-Share Alike 2.0
45 (http://creativecommons.org/licenses/by-sa/2.0/)
43 Here be dragons.
4644 inspire_md:
4745 type: linked
4846 languages:
4040 fax: +49(0)441-9392774-9
4141 email: info@omniscale.de
4242 access_constraints:
43 This service is intended for private and evaluation use only.
44 The data is licensed as Creative Commons Attribution-Share Alike 2.0
45 (http://creativecommons.org/licenses/by-sa/2.0/)
43 Here be dragons.
4644 keyword_list:
4745 - vocabulary: GEMET
4846 keywords: [Orthoimagery]
4040 fax: +49(0)441-9392774-9
4141 email: info@omniscale.de
4242 access_constraints:
43 This service is intended for private and evaluation use only.
44 The data is licensed as Creative Commons Attribution-Share Alike 2.0
45 (http://creativecommons.org/licenses/by-sa/2.0/)
43 Here be dragons.
4644
4745 layers:
4846 - name: direct
2525 fax: +49(0)441-9392774-9
2626 email: info@omniscale.de
2727 access_constraints:
28 This service is intended for private and evaluation use only.
29 The data is licensed as Creative Commons Attribution-Share Alike 2.0
30 (http://creativecommons.org/licenses/by-sa/2.0/)
28 Here be dragons.
3129
3230 layers:
3331 - name: wms_legend
2525 fax: +49(0)441-9392774-9
2626 email: info@omniscale.de
2727 access_constraints:
28 This service is intended for private and evaluation use only.
29 The data is licensed as Creative Commons Attribution-Share Alike 2.0
30 (http://creativecommons.org/licenses/by-sa/2.0/)
28 Here be dragons.
3129
3230 layers:
3331 - name: mixed_mode
34 title: cache with PNG and JPEG
32 title: cache with PNG and JPEG
3533 sources: [mixed_cache]
3634
3735 caches:
2525 fax: +49(0)441-9392774-9
2626 email: info@omniscale.de
2727 access_constraints:
28 This service is intended for private and evaluation use only.
29 The data is licensed as Creative Commons Attribution-Share Alike 2.0
30 (http://creativecommons.org/licenses/by-sa/2.0/)
28 Here be dragons.
3129
3230 layers:
3331 - name: res
2525 fax: +49(0)441-9392774-9
2626 email: info@omniscale.de
2727 access_constraints:
28 This service is intended for private and evaluation use only.
29 The data is licensed as Creative Commons Attribution-Share Alike 2.0
30 (http://creativecommons.org/licenses/by-sa/2.0/)
28 Here be dragons.
3129
3230 layers:
3331 - name: wms_cache
2727 <ContactElectronicMailAddress>osm@omniscale.de</ContactElectronicMailAddress>
2828 </ContactInformation>
2929 <Fees>none</Fees>
30 <AccessConstraints>This service is intended for private and evaluation use only. The data is licensed as Creative Commons Attribution-Share Alike 2.0 (http://creativecommons.org/licenses/by-sa/2.0/)</AccessConstraints>
30 <AccessConstraints>Here be dragons.</AccessConstraints>
3131 </Service>
3232 <Capability>
3333 <Request>
2727 <ContactElectronicMailAddress>info@omniscale.de</ContactElectronicMailAddress>
2828 </ContactInformation>
2929 <Fees>None</Fees>
30 <AccessConstraints>This service is intended for private and evaluation use only. The data is licensed as Creative Commons Attribution-Share Alike 2.0 (http://creativecommons.org/licenses/by-sa/2.0/)</AccessConstraints>
30 <AccessConstraints>Here be dragons.</AccessConstraints>
3131 </Service>
3232 <Capability>
3333 <Request>
2323 <ContactElectronicMailAddress>info@omniscale.de</ContactElectronicMailAddress>
2424 </ContactInformation>
2525 <Fees>None</Fees>
26 <AccessConstraints>This service is intended for private and evaluation use only. The data is licensed as Creative Commons Attribution-Share Alike 2.0 (http://creativecommons.org/licenses/by-sa/2.0/)</AccessConstraints>
26 <AccessConstraints>Here be dragons.</AccessConstraints>
2727 </Service>
2828 <Capability>
2929 <Request>
2222 fax: +49(0)441-9392774-9
2323 email: info@omniscale.de
2424 access_constraints:
25 This service is intended for private and evaluation use only.
26 The data is licensed as Creative Commons Attribution-Share Alike 2.0
27 (http://creativecommons.org/licenses/by-sa/2.0/)
25 Here be dragons.
2826
2927 layers:
3028 - name: direct
2727 fax: +49(0)441-9392774-9
2828 email: info@omniscale.de
2929 access_constraints:
30 This service is intended for private and evaluation use only.
31 The data is licensed as Creative Commons Attribution-Share Alike 2.0
32 (http://creativecommons.org/licenses/by-sa/2.0/)
30 Here be dragons.
3331
3432 layers:
3533 - name: wms_cache
1515 from __future__ import with_statement, division
1616
1717 from io import BytesIO
18 from mapproxy.request.arcgis import ArcGISRequest
18 from mapproxy.request.wms import WMS111FeatureInfoRequest
1919 from mapproxy.test.image import is_png, create_tmp_image
2020 from mapproxy.test.http import mock_httpd
2121 from mapproxy.test.system import module_setup, module_teardown, SystemTest
3131
3232 transp = create_tmp_image((512, 512), mode='RGBA', color=(0, 0, 0, 0))
3333
34
3435 class TestArcgisSource(SystemTest):
3536 config = test_config
3637 def setup(self):
3738 SystemTest.setup(self)
39 self.common_fi_req = WMS111FeatureInfoRequest(url='/service?',
40 param=dict(x='10', y='20', width='200', height='200', layers='app2_with_layers_fi_layer',
41 format='image/png', query_layers='app2_with_layers_fi_layer', styles='',
42 bbox='1000,400,2000,1400', srs='EPSG:3857', info_format='application/json'))
3843
3944 def test_get_tile(self):
40 expected_req = [({'path': '/arcgis/rest/services/ExampleLayer/ImageServer/export?f=image&format=png&imageSR=900913&bboxSR=900913&bbox=-20037508.342789244,-20037508.342789244,20037508.342789244,20037508.342789244&size=512,512'},
45 expected_req = [({'path': '/arcgis/rest/services/ExampleLayer/ImageServer/exportImage?f=image&format=png&imageSR=900913&bboxSR=900913&bbox=-20037508.342789244,-20037508.342789244,20037508.342789244,20037508.342789244&size=512,512'},
4146 {'body': transp, 'headers': {'content-type': 'image/png'}}),
4247 ]
4348
4954 assert is_png(data)
5055
5156 def test_get_tile_with_layer(self):
52 expected_req = [({'path': '/arcgis/rest/services/ExampleLayer/ImageServer/export?f=image&format=png&layers=show:0,1&imageSR=900913&bboxSR=900913&bbox=-20037508.342789244,-20037508.342789244,20037508.342789244,20037508.342789244&size=512,512'},
57 expected_req = [({'path': '/arcgis/rest/services/ExampleLayer/MapServer/export?f=image&format=png&layers=show:0,1&imageSR=900913&bboxSR=900913&bbox=-20037508.342789244,-20037508.342789244,20037508.342789244,20037508.342789244&size=512,512'},
5358 {'body': transp, 'headers': {'content-type': 'image/png'}}),
5459 ]
5560
6166 assert is_png(data)
6267
6368 def test_get_tile_from_missing_arcgis_layer(self):
64 expected_req = [({'path': '/arcgis/rest/services/NonExistentLayer/ImageServer/export?f=image&format=png&imageSR=900913&bboxSR=900913&bbox=-20037508.342789244,-20037508.342789244,20037508.342789244,20037508.342789244&size=512,512'},
69 expected_req = [({'path': '/arcgis/rest/services/NonExistentLayer/ImageServer/exportImage?f=image&format=png&imageSR=900913&bboxSR=900913&bbox=-20037508.342789244,-20037508.342789244,20037508.342789244,20037508.342789244&size=512,512'},
6570 {'body': b'', 'status': 400}),
6671 ]
6772
6873 with mock_httpd(('localhost', 42423), expected_req, bbox_aware_query_comparator=True):
6974 resp = self.app.get('/tms/1.0.0/app2_wrong_url_layer/0/0/1.png', status=500)
7075 eq_(resp.status_code, 500)
76
77 def test_identify(self):
78 expected_req = [(
79 {'path': '/arcgis/rest/services/ExampleLayer/MapServer/identify?f=json&'
80 'geometry=1050.000000,1300.000000&returnGeometry=true&imageDisplay=200,200,96'
81 '&mapExtent=1000.0,400.0,2000.0,1400.0&layers=show:1,2,3'
82 '&tolerance=10&geometryType=esriGeometryPoint&sr=3857'
83 },
84 {'body': b'{"results": []}', 'headers': {'content-type': 'application/json'}}),
85 ]
86
87 with mock_httpd(('localhost', 42423), expected_req, bbox_aware_query_comparator=True):
88 resp = self.app.get(self.common_fi_req)
89 eq_(resp.content_type, 'application/json')
90 eq_(resp.content_length, len(resp.body))
91 eq_(resp.body, b'{"results": []}')
92
93
94 def test_transformed_identify(self):
95 expected_req = [(
96 {'path': '/arcgis/rest/services/ExampleLayer/MapServer/identify?f=json&'
97 'geometry=573295.377585,6927820.884193&returnGeometry=true&imageDisplay=200,321,96'
98 '&mapExtent=556597.453966,6446275.84102,890555.926346,6982997.92039&layers=show:1,2,3'
99 '&tolerance=10&geometryType=esriGeometryPoint&sr=3857'
100 },
101 {'body': b'{"results": []}', 'headers': {'content-type': 'application/json'}}),
102 ]
103
104 with mock_httpd(('localhost', 42423), expected_req):
105 self.common_fi_req.params.bbox = '5,50,8,53'
106 self.common_fi_req.params.srs = 'EPSG:4326'
107 resp = self.app.get(self.common_fi_req)
108 eq_(resp.content_type, 'application/json')
109 eq_(resp.content_length, len(resp.body))
110 eq_(resp.body, b'{"results": []}')
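
The mocked paths above show how a WMS GetFeatureInfo request is mapped onto the ArcGIS REST `identify` endpoint. A rough sketch of that parameter mapping; the helper itself is hypothetical, only the parameter names follow the mocked requests:

    def identify_params(bbox, size, pos, srs_code, layers, tolerance=10):
        # pixel position -> map coordinates (pixel y axis points down)
        x = bbox[0] + pos[0] * (bbox[2] - bbox[0]) / size[0]
        y = bbox[3] - pos[1] * (bbox[3] - bbox[1]) / size[1]
        return {
            'f': 'json',
            'geometry': '%f,%f' % (x, y),
            'geometryType': 'esriGeometryPoint',
            'mapExtent': ','.join(str(c) for c in bbox),
            'imageDisplay': '%d,%d,96' % (size[0], size[1]),
            'layers': layers,
            'tolerance': tolerance,
            'sr': srs_code,
        }

    # identify_params((1000, 400, 2000, 1400), (200, 200), (10, 20), 3857,
    # 'show:1,2,3') yields geometry '1050.000000,1300.000000', as in the
    # first mocked request above.
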
263263 return {
264264 'authorized': 'partial',
265265 'layers': {
266 'layer1b': {'featureinfo': True, 'limited_to': {'srs': 'EPSG:4326', 'geometry': [-40.0, -40.0, 0.0, 0.0]}},
266 'layer1b': {'featureinfo': True, 'limited_to': {'srs': 'EPSG:4326', 'geometry': [-80.0, -40.0, 0.0, -10.0]}},
267267 }
268268 }
269269
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2011 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement, division
16
17 import os
18 import shutil
19
20 from io import BytesIO
21
22 from mapproxy.request.wms import WMS111MapRequest
23 from mapproxy.test.http import MockServ
24 from mapproxy.test.image import is_png, create_tmp_image
25 from mapproxy.test.system import prepare_env, create_app, module_teardown, SystemTest
26 from mapproxy.cache.geopackage import GeopackageCache
27 from mapproxy.grid import TileGrid
28 from nose.tools import eq_
29 import sqlite3
30
31 test_config = {}
32
33
34 def setup_module():
35 prepare_env(test_config, 'cache_geopackage.yaml')
36
37 shutil.copy(os.path.join(test_config['fixture_dir'], 'cache.gpkg'),
38 test_config['base_dir'])
39 create_app(test_config)
40
41
42 def teardown_module():
43 module_teardown(test_config)
44
45
46 class TestGeopackageCache(SystemTest):
47 config = test_config
48 table_name = 'cache'
49
50 def setup(self):
51 SystemTest.setup(self)
52 self.common_map_req = WMS111MapRequest(url='/service?',
53 param=dict(service='WMS',
54 version='1.1.1', bbox='-180,-80,0,0',
55 width='200', height='200',
56 layers='gpkg', srs='EPSG:4326',
57 format='image/png',
58 styles='', request='GetMap'))
59
60 def test_get_map_cached(self):
61 resp = self.app.get(self.common_map_req)
62 eq_(resp.content_type, 'image/png')
63 data = BytesIO(resp.body)
64 assert is_png(data)
65
66 def test_get_map_uncached(self):
67 assert os.path.exists(os.path.join(test_config['base_dir'], 'cache.gpkg')) # already created on startup
68
69 self.common_map_req.params.bbox = '-180,0,0,80'
70 serv = MockServ(port=42423)
71 serv.expects('/tiles/01/000/000/000/000/000/001.png')
72 serv.returns(create_tmp_image((256, 256)))
73 with serv:
74 resp = self.app.get(self.common_map_req)
75 eq_(resp.content_type, 'image/png')
76 data = BytesIO(resp.body)
77 assert is_png(data)
78
79 # now cached
80 resp = self.app.get(self.common_map_req)
81 eq_(resp.content_type, 'image/png')
82 data = BytesIO(resp.body)
83 assert is_png(data)
84
85 def test_bad_config_geopackage_no_gpkg_contents(self):
86 gpkg_file = os.path.join(test_config['base_dir'], 'cache.gpkg')
87 table_name = 'no_gpkg_contents'
88
89 with sqlite3.connect(gpkg_file) as db:
90 cur = db.execute('''SELECT name FROM sqlite_master WHERE type='table' AND name=?''',
91 (table_name,))
92 content = cur.fetchone()
93 assert content[0] == table_name
94
95 with sqlite3.connect(gpkg_file) as db:
96 cur = db.execute('''SELECT table_name FROM gpkg_contents WHERE table_name=?''',
97 (table_name,))
98 content = cur.fetchone()
99 assert not content
100
101 GeopackageCache(gpkg_file, TileGrid(srs=4326), table_name=table_name)
102
103 with sqlite3.connect(gpkg_file) as db:
104 cur = db.execute('''SELECT table_name FROM gpkg_contents WHERE table_name=?''',
105 (table_name,))
106 content = cur.fetchone()
107 assert content[0] == table_name
108
109 def test_bad_config_geopackage_no_spatial_ref_sys(self):
110 gpkg_file = os.path.join(test_config['base_dir'], 'cache.gpkg')
111 organization_coordsys_id = 3785
112 table_name='no_gpkg_spatial_ref_sys'
113
114 with sqlite3.connect(gpkg_file) as db:
115 cur = db.execute('''SELECT organization_coordsys_id FROM gpkg_spatial_ref_sys WHERE organization_coordsys_id=?''',
116 (organization_coordsys_id,))
117 content = cur.fetchone()
118 assert not content
119
120 GeopackageCache(gpkg_file, TileGrid(srs=3785), table_name=table_name)
121
122 with sqlite3.connect(gpkg_file) as db:
123 cur = db.execute(
124 '''SELECT organization_coordsys_id FROM gpkg_spatial_ref_sys WHERE organization_coordsys_id=?''',
125 (organization_coordsys_id,))
126 content = cur.fetchone()
127 assert content[0] == organization_coordsys_id
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2016 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement, division
16
17 from io import BytesIO
18
19 from mapproxy.request.wms import WMS111MapRequest
20 from mapproxy.test.image import is_png, create_tmp_image
21 from mapproxy.test.system import prepare_env, create_app, module_teardown, SystemTest
22
23 from nose.tools import eq_
24 from nose.plugins.skip import SkipTest
25
26 try:
27 import boto3
28 from moto import mock_s3
29 except ImportError:
30 boto3 = None
31 mock_s3 = None
32
33
34 test_config = {}
35
36 _mock = None
37
38 def setup_module():
39 if not mock_s3 or not boto3:
40 raise SkipTest("boto3 and moto required for S3 tests")
41
42 global _mock
43 _mock = mock_s3()
44 _mock.start()
45
46 boto3.client("s3").create_bucket(Bucket="default_bucket")
47 boto3.client("s3").create_bucket(Bucket="tiles")
48 boto3.client("s3").create_bucket(Bucket="reversetiles")
49
50 prepare_env(test_config, 'cache_s3.yaml')
51 create_app(test_config)
52
53 def teardown_module():
54 module_teardown(test_config)
55 _mock.stop()
56
57 class TestS3Cache(SystemTest):
58 config = test_config
59 table_name = 'cache'
60
61 def setup(self):
62 SystemTest.setup(self)
63 self.common_map_req = WMS111MapRequest(url='/service?',
64 param=dict(service='WMS',
65 version='1.1.1', bbox='-150,-40,-140,-30',
66 width='100', height='100',
67 layers='default', srs='EPSG:4326',
68 format='image/png',
69 styles='', request='GetMap'))
70
71 def test_get_map_cached(self):
72 # mock_s3 interferes with MockServ, use boto to manually upload tile
73 tile = create_tmp_image((256, 256))
74 boto3.client("s3").upload_fileobj(
75 BytesIO(tile),
76 Bucket='default_bucket',
77 Key='default_cache/WebMerc/4/1/9.png',
78 )
79
80 resp = self.app.get(self.common_map_req)
81 eq_(resp.content_type, 'image/png')
82 data = BytesIO(resp.body)
83 assert is_png(data)
84
85
86 def test_get_map_cached_quadkey(self):
87 # mock_s3 interferes with MockServ, use boto to manually upload tile
88 tile = create_tmp_image((256, 256))
89 boto3.client("s3").upload_fileobj(
90 BytesIO(tile),
91 Bucket='tiles',
92 Key='quadkeytiles/2003.png',
93 )
94
95 self.common_map_req.params.layers = 'quadkey'
96 resp = self.app.get(self.common_map_req)
97 eq_(resp.content_type, 'image/png')
98 data = BytesIO(resp.body)
99 assert is_png(data)
100
101 def test_get_map_cached_reverse_tms(self):
102 # mock_s3 interferes with MockServ, use boto to manually upload tile
103 tile = create_tmp_image((256, 256))
104 boto3.client("s3").upload_fileobj(
105 BytesIO(tile),
106 Bucket='tiles',
107 Key='reversetiles/9/1/4.png',
108 )
109
110 self.common_map_req.params.layers = 'reverse'
111 resp = self.app.get(self.common_map_req)
112 eq_(resp.content_type, 'image/png')
113 data = BytesIO(resp.body)
114 assert is_png(data)
8787 assert 'Last-modified' not in resp.headers
8888 else:
8989 eq_(resp.headers['Last-modified'], format_httpdate(timestamp))
90 eq_(resp.headers['Cache-control'], 'max-age=%d public' % max_age)
90 eq_(resp.headers['Cache-control'], 'public, max-age=%d, s-maxage=%d' % (max_age, max_age))
9191
9292 def test_get_cached_tile(self):
9393 etag, max_age = self._update_timestamp()
7070
7171 def test_tms_capabilities(self):
7272 resp = self.app.get('/tms/1.0.0/')
73 assert 'http://localhost/tms/1.0.0/multi_cache/wmts_incompatible_grid' in resp
74 assert 'http://localhost/tms/1.0.0/multi_cache/GLOBAL_WEBMERCATOR' in resp
75 assert 'http://localhost/tms/1.0.0/multi_cache/InspireCrs84Quad' in resp
76 assert 'http://localhost/tms/1.0.0/multi_cache/gk3' in resp
77 assert 'http://localhost/tms/1.0.0/cache/utm32' in resp
73 assert 'http://localhost/tms/1.0.0/multi_cache/EPSG25832' in resp
74 assert 'http://localhost/tms/1.0.0/multi_cache/EPSG3857' in resp
75 assert 'http://localhost/tms/1.0.0/multi_cache/CRS84' in resp
76 assert 'http://localhost/tms/1.0.0/multi_cache/EPSG31467' in resp
77 assert 'http://localhost/tms/1.0.0/cache/EPSG25832' in resp
7878 xml = resp.lxml
7979 assert xml.xpath('count(//TileMap)') == 5
8080
178178 def _check_cache_control_headers(self, resp, etag, max_age):
179179 eq_(resp.headers['ETag'], etag)
180180 eq_(resp.headers['Last-modified'], 'Fri, 13 Feb 2009 23:31:30 GMT')
181 eq_(resp.headers['Cache-control'], 'max-age=%d public' % max_age)
181 eq_(resp.headers['Cache-control'], 'public, max-age=%d, s-maxage=%d' % (max_age, max_age))
182182
183183 def test_get_cached_tile(self):
184184 etag, max_age = self._update_timestamp()
438438 # broken bbox for the requested srs
439439 url = """/service?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap&BBOX=-72988843.697212,-255661507.634227,142741550.188860,255661507.634227&SRS=EPSG:25833&WIDTH=164&HEIGHT=388&LAYERS=wms_cache_100&STYLES=&FORMAT=image/png&TRANSPARENT=TRUE"""
440440 resp = self.app.get(url)
441 is_111_exception(resp.lxml, 'Request too large or invalid BBOX.')
441 # result depends on proj version
442 is_111_exception(resp.lxml, re_msg='Request too large or invalid BBOX.|Could not transform BBOX: Invalid result.')
442443
443444 def test_get_map_broken_bbox(self):
444445 url = """/service?VERSION=1.1.11&REQUEST=GetMap&SRS=EPSG:31468&BBOX=-10000855.0573254,2847125.18913603,-9329367.42767611,4239924.78564583&WIDTH=130&HEIGHT=62&LAYERS=wms_cache&STYLES=&FORMAT=image/png&TRANSPARENT=TRUE"""
528529 def test_get_featureinfo_transformed(self):
529530 expected_req = ({'path': r'/service?LAYERs=foo,bar&SERVICE=WMS&FORMAT=image%2Fpng'
530531 '&REQUEST=GetFeatureInfo&HEIGHT=200&SRS=EPSG%3A900913'
531 '&BBOX=5197367.93088,5312902.73895,5311885.44223,5434731.78213'
532 '&BBOX=1172272.30156,7196018.03449,1189711.04571,7213496.99738'
532533 '&styles=&VERSION=1.1.1&feature_count=100'
533 '&WIDTH=200&QUERY_LAYERS=foo,bar&X=14&Y=78'},
534 '&WIDTH=200&QUERY_LAYERS=foo,bar&X=14&Y=20'},
534535 {'body': b'info', 'headers': {'content-type': 'text/plain'}})
535536
536537 # our fi point at x=10,y=20
537 p_25832 = (3570269+10*(3643458 - 3570269)/200, 5540889+20*(5614078 - 5540889)/200)
538 # the transformed fi point at x=10,y=22
539 p_900913 = (5197367.93088+14*(5311885.44223 - 5197367.93088)/200,
540 5312902.73895+78*(5434731.78213 - 5312902.73895)/200)
538 p_25832 = (600000+10*(610000 - 600000)/200, 6010000-20*(6010000 - 6000000)/200)
539 # the transformed fi point at x=14,y=20
540 p_900913 = (1172272.30156+14*(1189711.04571-1172272.30156)/200,
541 7213496.99738-20*(7213496.99738 - 7196018.03449)/200)
541542
542543 # are they the same?
543 # check with tolerance: pixel resolution is ~570 and the x/y position is rounded to pixels
544 assert abs(SRS(25832).transform_to(SRS(900913), p_25832)[0] - p_900913[0]) < 570/2
545 assert abs(SRS(25832).transform_to(SRS(900913), p_25832)[1] - p_900913[1]) < 570/2
544 # check with tolerance: pixel resolution is ~50 and the x/y position is rounded to pixels
545 assert abs(SRS(25832).transform_to(SRS(900913), p_25832)[0] - p_900913[0]) < 50
546 assert abs(SRS(25832).transform_to(SRS(900913), p_25832)[1] - p_900913[1]) < 50
546547
547548 with mock_httpd(('localhost', 42423), [expected_req], bbox_aware_query_comparator=True):
548 self.common_fi_req.params['bbox'] = '3570269,5540889,3643458,5614078'
549 self.common_fi_req.params['bbox'] = '600000,6000000,610000,6010000'
549550 self.common_fi_req.params['srs'] = 'EPSG:25832'
550551 self.common_fi_req.params.pos = 10, 20
551552 self.common_fi_req.params['feature_count'] = 100
1414
1515 import requests
1616 from mapproxy.test.http import (
17 MockServ, RequestsMissmatchError, mock_httpd,
18 basic_auth_value,
17 MockServ, RequestsMismatchError, mock_httpd,
18 basic_auth_value, query_eq,
1919 )
2020
2121 from nose.tools import eq_
4747 try:
4848 with serv:
4949 requests.get('http://localhost:%d/test' % serv.port)
50 except RequestsMissmatchError as ex:
50 except RequestsMismatchError as ex:
5151 assert ex.assertions[0].expected == 'Accept: Coffee'
5252
5353 def test_expects_post(self):
6464 try:
6565 with serv:
6666 requests.get('http://localhost:%d/test' % serv.port)
67 except RequestsMissmatchError as ex:
67 except RequestsMismatchError as ex:
6868 assert ex.assertions[0].expected == 'POST'
6969 assert ex.assertions[0].actual == 'GET'
7070 else:
136136 with serv:
137137 resp = requests.get('http://localhost:%d/test1' % serv.port)
138138 eq_(resp.content, b'hello1')
139 except RequestsMissmatchError as ex:
140 assert 'requests missmatch:\n - missing requests' in str(ex)
139 except RequestsMismatchError as ex:
140 assert 'requests mismatch:\n - missing requests' in str(ex)
141141 else:
142142 raise AssertionError('AssertionError expected')
143143
176176 raise AssertionError('RequestException expected')
177177 resp = requests.get('http://localhost:%d/test2' % serv.port)
178178 eq_(resp.content, b'hello2')
179 except RequestsMissmatchError as ex:
179 except RequestsMismatchError as ex:
180180 assert 'unexpected request' in ex.assertions[0]
181181 else:
182182 raise AssertionError('AssertionError expected')
206206 'Authorization': basic_auth_value('foo', 'bar'), 'Accept': 'Coffee'}
207207 )
208208 eq_(resp.content, b'ok')
209
210
211 def test_query_eq():
212 assert query_eq('?baz=42&foo=bar', '?foo=bar&baz=42')
213 assert query_eq('?baz=42.00&foo=bar', '?foo=bar&baz=42.0')
214 assert query_eq('?baz=42.000000001&foo=bar', '?foo=bar&baz=42.0')
215 assert not query_eq('?baz=42.00000001&foo=bar', '?foo=bar&baz=42.0')
216
217 assert query_eq('?baz=42.000000001,23.99999999999&foo=bar', '?foo=bar&baz=42.0,24.0')
218 assert not query_eq('?baz=42.00000001&foo=bar', '?foo=bar&baz=42.0')
3131 stop = time.time()
3232
3333 duration = stop - start
34 assert duration < 0.2
34 assert duration < 0.5, "took %s" % duration
3535
3636 eq_(len(result), 40)
3737
6767 stop = time.time()
6868
6969 duration = stop - start
70 assert duration < 0.1
70 assert duration < 0.2, "took %s" % duration
7171
7272 eq_(len(result), 40)
7373
322322 ((0.0, -90.0, 180.0, 90.0), (512, 512), SRS(4326))])
323323
324324
325 class TestTileManagerWMSSourceConcurrent(TestTileManagerWMSSource):
326 def setup(self):
327 TestTileManagerWMSSource.setup(self)
328 self.tile_mgr.concurrent_tile_creators = 2
329
325330 class TestTileManagerWMSSourceMinimalMetaRequests(object):
326331 def setup(self):
327332 self.file_cache = MockFileCache('/dev/null', 'png')
480485 locker=self.locker)
481486
482487 assert self.tile_mgr.meta_grid is None
488
489
490 class TestTileManagerBulkMetaTiles(object):
491 def setup(self):
492 self.file_cache = MockFileCache('/dev/null', 'png')
493 self.grid = TileGrid(SRS(4326), bbox=[-180, -90, 180, 90], origin='ul')
494 self.source_base = SolidColorMockSource(color='#ff0000')
495 self.source_base.supports_meta_tiles = False
496 self.source_overlay = MockSource()
497 self.source_overlay.supports_meta_tiles = False
498 self.locker = TileLocker(tmp_lock_dir, 10, "id")
499 self.tile_mgr = TileManager(self.grid, self.file_cache,
500 [self.source_base, self.source_overlay], 'png',
501 meta_size=[2, 2], meta_buffer=0,
502 locker=self.locker,
503 bulk_meta_tiles=True,
504 )
505
506 def test_bulk_get(self):
507 tiles = self.tile_mgr.creator().create_tiles([Tile((0, 0, 2))])
508 eq_(len(tiles), 2*2)
509 eq_(self.file_cache.stored_tiles, set([(0, 0, 2), (1, 0, 2), (0, 1, 2), (1, 1, 2)]))
510 for requested in [self.source_base.requested, self.source_overlay.requested]:
511 eq_(set(requested), set([
512 ((-180.0, 0.0, -90.0, 90.0), (256, 256), SRS(4326)),
513 ((-90.0, 0.0, 0.0, 90.0), (256, 256), SRS(4326)),
514 ((-180.0, -90.0, -90.0, 0.0), (256, 256), SRS(4326)),
515 ((-90.0, -90.0, 0.0, 0.0), (256, 256), SRS(4326)),
516 ]))
517
518 def test_bulk_get_error(self):
519 self.tile_mgr.sources = [self.source_base, ErrorSource()]
520 try:
521 self.tile_mgr.creator().create_tiles([Tile((0, 0, 2))])
522 except Exception as ex:
523 eq_(ex.args[0], "source error")
524
525 def test_bulk_get_multiple_meta_tiles(self):
526 tiles = self.tile_mgr.creator().create_tiles([Tile((1, 0, 2)), Tile((2, 0, 2))])
527 eq_(len(tiles), 2*2*2)
528 eq_(self.file_cache.stored_tiles, set([
529 (0, 0, 2), (1, 0, 2), (0, 1, 2), (1, 1, 2),
530 (2, 0, 2), (3, 0, 2), (2, 1, 2), (3, 1, 2),
531 ]))
532
533 class ErrorSource(MapLayer):
534 def __init__(self, *args):
535 MapLayer.__init__(self, *args)
536 self.requested = []
537
538 def get_map(self, query):
539 self.requested.append((query.bbox, query.size, query.srs))
540 raise Exception("source error")
541
542 class TestTileManagerBulkMetaTilesConcurrent(TestTileManagerBulkMetaTiles):
543 def setup(self):
544 TestTileManagerBulkMetaTiles.setup(self)
545 self.tile_mgr.concurrent_tile_creators = 2
546
483547
484548 default_image_opts = ImageOptions(resampling='bicubic')
485549
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2016 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement, division
16
17 import os
18 import time
19 import struct
20
21 from io import BytesIO
22
23 from mapproxy.cache.compact import CompactCacheV1
24 from mapproxy.cache.tile import Tile
25 from mapproxy.image import ImageSource
26 from mapproxy.image.opts import ImageOptions
27 from mapproxy.test.unit.test_cache_tile import TileCacheTestBase
28
29 from nose.tools import eq_
30
31 class TestCompactCacheV1(TileCacheTestBase):
32
33 always_loads_metadata = True
34
35 def setup(self):
36 TileCacheTestBase.setup(self)
37 self.cache = CompactCacheV1(
38 cache_dir=self.cache_dir,
39 )
40
41 def test_bundle_files(self):
42 assert not os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundle'))
43 assert not os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundlx'))
44 self.cache.store_tile(self.create_tile(coord=(0, 0, 0)))
45 assert os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundle'))
46 assert os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundlx'))
47
48 assert not os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundle'))
49 assert not os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundlx'))
50 self.cache.store_tile(self.create_tile(coord=(127, 127, 12)))
51 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundle'))
52 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundlx'))
53
54 assert not os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0100C0080.bundle'))
55 assert not os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0100C0080.bundlx'))
56 self.cache.store_tile(self.create_tile(coord=(128, 256, 12)))
57 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0100C0080.bundle'))
58 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0100C0080.bundlx'))
59
60 def test_bundle_files_not_created_on_is_cached(self):
61 assert not os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundle'))
62 assert not os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundlx'))
63 self.cache.is_cached(Tile(coord=(0, 0, 0)))
64 assert not os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundle'))
65 assert not os.path.exists(os.path.join(self.cache_dir, 'L00', 'R0000C0000.bundlx'))
66
67 def test_missing_tiles(self):
68 self.cache.store_tile(self.create_tile(coord=(130, 200, 8)))
69 assert os.path.exists(os.path.join(self.cache_dir, 'L08', 'R0080C0080.bundle'))
70 assert os.path.exists(os.path.join(self.cache_dir, 'L08', 'R0080C0080.bundlx'))
71
72 # test that all other tiles in this bundle are missing
73 assert self.cache.is_cached(Tile((130, 200, 8)))
74 for x in range(128, 255):
75 for y in range(128, 255):
76 if x == 130 and y == 200:
77 continue
78 assert not self.cache.is_cached(Tile((x, y, 8))), (x, y)
79 assert not self.cache.load_tile(Tile((x, y, 8))), (x, y)
80
81 def test_remove_level_tiles_before(self):
82 self.cache.store_tile(self.create_tile(coord=(0, 0, 12)))
83 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundle'))
84 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundlx'))
85
86 # not removed with timestamp
87 self.cache.remove_level_tiles_before(12, time.time())
88 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundle'))
89 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0000C0000.bundlx'))
90
91 # removed with timestamp=0 (remove_all:true in seed.yaml)
92 self.cache.remove_level_tiles_before(12, 0)
93 assert not os.path.exists(os.path.join(self.cache_dir, 'L12'))
94
95
96 def test_bundle_header(self):
97 t = Tile((5000, 1000, 12), ImageSource(BytesIO(b'a' * 4000), image_opts=ImageOptions(format='image/png')))
98 self.cache.store_tile(t)
99 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0380C1380.bundle'))
100 assert os.path.exists(os.path.join(self.cache_dir, 'L12', 'R0380C1380.bundlx'))
101
102 def assert_header(tile_bytes_written, max_tile_bytes):
103 with open(os.path.join(self.cache_dir, 'L12', 'R0380C1380.bundle'), 'r+b') as f:
104 header = struct.unpack('<lllllllllllllll', f.read(60))
105 eq_(header[11], 896)
106 eq_(header[12], 1023)
107 eq_(header[13], 4992)
108 eq_(header[14], 5119)
109 eq_(header[6], 60 + 128*128*4 + sum(tile_bytes_written))
110 eq_(header[2], max_tile_bytes)
111 eq_(header[4], len(tile_bytes_written)*4)
112
113 assert_header([4000 + 4], 4000)
114
115 t = Tile((5000, 1001, 12), ImageSource(BytesIO(b'a' * 6000), image_opts=ImageOptions(format='image/png')))
116 self.cache.store_tile(t)
117 assert_header([4000 + 4, 6000 + 4], 6000)
118
119 t = Tile((4992, 999, 12), ImageSource(BytesIO(b'a' * 1000), image_opts=ImageOptions(format='image/png')))
120 self.cache.store_tile(t)
121 assert_header([4000 + 4, 6000 + 4, 1000 + 4], 6000)
122
123 t = Tile((5000, 1001, 12), ImageSource(BytesIO(b'a' * 3000), image_opts=ImageOptions(format='image/png')))
124 self.cache.store_tile(t)
125 assert_header([4000 + 4, 6000 + 4 + 3000 + 4, 1000 + 4], 6000) # still contains bytes from overwritten tile
126
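
The `assert_header` helper above decodes the 60-byte bundle header as 15 little-endian int32 values; the field meanings below are inferred from the assertions, not from an official format description. A standalone sketch of reading those fields:

    import struct

    def read_bundle_header(path):
        # Compact Cache V1 bundle: 60 byte header of 15 little-endian int32s
        with open(path, 'rb') as f:
            header = struct.unpack('<15l', f.read(60))
        return {
            'max_tile_bytes': header[2],    # size of the largest stored tile
            'tile_index_bytes': header[4],  # 4 bytes per stored tile
            'file_size': header[6],         # 60 B header + 128*128*4 B index + tiles
            'row_range': (header[11], header[12]),  # e.g. (896, 1023)
            'col_range': (header[13], header[14]),  # e.g. (4992, 5119)
        }
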
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2016 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement, division
16
17 import os
18 import time
19 import sqlite3
20 import threading
21
22 from io import BytesIO
23
24 from mapproxy.image import ImageSource
25 from mapproxy.cache.geopackage import GeopackageCache, GeopackageLevelCache
26 from mapproxy.cache.tile import Tile
27 from mapproxy.grid import tile_grid, TileGrid
28 from mapproxy.test.unit.test_cache_tile import TileCacheTestBase
29
30 from nose.tools import eq_
31
32 class TestGeopackageCache(TileCacheTestBase):
33
34 always_loads_metadata = True
35
36 def setup(self):
37 TileCacheTestBase.setup(self)
38 self.gpkg_file = os.path.join(self.cache_dir, 'tmp.gpkg')
39 self.table_name = 'test_tiles'
40 self.cache = GeopackageCache(
41 self.gpkg_file,
42 tile_grid=tile_grid(3857, name='global-webmercator'),
43 table_name=self.table_name,
44 )
45
46 def teardown(self):
47 if self.cache:
48 self.cache.cleanup()
49 TileCacheTestBase.teardown(self)
50
51 def test_new_geopackage(self):
52 assert os.path.exists(self.gpkg_file)
53
54 with sqlite3.connect(self.gpkg_file) as db:
55 cur = db.execute('''SELECT name FROM sqlite_master WHERE type='table' AND name=?''',
56 (self.table_name,))
57 content = cur.fetchone()
58 assert content[0] == self.table_name
59
60 with sqlite3.connect(self.gpkg_file) as db:
61 cur = db.execute('''SELECT table_name, data_type FROM gpkg_contents WHERE table_name = ?''',
62 (self.table_name,))
63 content = cur.fetchone()
64 assert content[0] == self.table_name
65 assert content[1] == 'tiles'
66
67 with sqlite3.connect(self.gpkg_file) as db:
68 cur = db.execute('''SELECT table_name FROM gpkg_tile_matrix WHERE table_name = ?''',
69 (self.table_name,))
70 content = cur.fetchall()
71 assert len(content) == 20
72
73 with sqlite3.connect(self.gpkg_file) as db:
74 cur = db.execute('''SELECT table_name FROM gpkg_tile_matrix_set WHERE table_name = ?''',
75 (self.table_name,))
76 content = cur.fetchone()
77 assert content[0] == self.table_name
78
79 def test_load_empty_tileset(self):
80 assert self.cache.load_tiles([Tile(None)]) == True
81 assert self.cache.load_tiles([Tile(None), Tile(None), Tile(None)]) == True
82
83 def test_load_more_than_2000_tiles(self):
84 # prepare data
85 for i in range(0, 2010):
86 assert self.cache.store_tile(Tile((i, 0, 10), ImageSource(BytesIO(b'foo'))))
87
88 tiles = [Tile((i, 0, 10)) for i in range(0, 2010)]
89 assert self.cache.load_tiles(tiles)
90
91 def test_timeouts(self):
92 self.cache._db_conn_cache.db = sqlite3.connect(self.cache.geopackage_file, timeout=0.05)
93
94 def block():
95 # block database by delaying the commit
96 db = sqlite3.connect(self.cache.geopackage_file)
97 cur = db.cursor()
98 stmt = "INSERT OR REPLACE INTO {0} (zoom_level, tile_column, tile_row, tile_data) " \
99 "VALUES (?,?,?,?)".format(self.table_name)
100 cur.execute(stmt, (3, 1, 1, '1234'))
101 time.sleep(0.2)
102 db.commit()
103
104 try:
105 assert self.cache.store_tile(self.create_tile((0, 0, 1))) == True
106
107 t = threading.Thread(target=block)
108 t.start()
109 time.sleep(0.05)
110 assert self.cache.store_tile(self.create_tile((0, 0, 1))) == False
111 finally:
112 t.join()
113
114 assert self.cache.store_tile(self.create_tile((0, 0, 1))) == True
115
116
117 class TestGeopackageLevelCache(TileCacheTestBase):
118
119 always_loads_metadata = True
120
121 def setup(self):
122 TileCacheTestBase.setup(self)
123 self.cache = GeopackageLevelCache(
124 self.cache_dir,
125 tile_grid=tile_grid(3857, name='global-webmercator'),
126 table_name='test_tiles',
127 )
128
129 def teardown(self):
130 if self.cache:
131 self.cache.cleanup()
132 TileCacheTestBase.teardown(self)
133
134 def test_level_files(self):
135 if os.path.exists(self.cache_dir):
136 eq_(os.listdir(self.cache_dir), [])
137
138 self.cache.store_tile(self.create_tile((0, 0, 1)))
139 eq_(os.listdir(self.cache_dir), ['1.gpkg'])
140
141 self.cache.store_tile(self.create_tile((0, 0, 5)))
142 eq_(sorted(os.listdir(self.cache_dir)), ['1.gpkg', '5.gpkg'])
143
144 def test_remove_level_files(self):
145 self.cache.store_tile(self.create_tile((0, 0, 1)))
146 self.cache.store_tile(self.create_tile((0, 0, 2)))
147 eq_(sorted(os.listdir(self.cache_dir)), ['1.gpkg', '2.gpkg'])
148
149 self.cache.remove_level_tiles_before(1, timestamp=0)
150 eq_(os.listdir(self.cache_dir), ['2.gpkg'])
151
152 def test_remove_level_tiles_before(self):
153 self.cache.store_tile(self.create_tile((0, 0, 1)))
154 self.cache.store_tile(self.create_tile((0, 0, 2)))
155
156 eq_(sorted(os.listdir(self.cache_dir)), ['1.gpkg', '2.gpkg'])
157 assert self.cache.is_cached(Tile((0, 0, 1)))
158
159 self.cache.remove_level_tiles_before(1, timestamp=time.time() - 60)
160 assert self.cache.is_cached(Tile((0, 0, 1)))
161
162 self.cache.remove_level_tiles_before(1, timestamp=0)
163 assert not self.cache.is_cached(Tile((0, 0, 1)))
164
165 eq_(sorted(os.listdir(self.cache_dir)), ['1.gpkg', '2.gpkg'])
166 assert self.cache.is_cached(Tile((0, 0, 2)))
167
168
169 def test_bulk_store_tiles_with_different_levels(self):
170 self.cache.store_tiles([
171 self.create_tile((0, 0, 1)),
172 self.create_tile((0, 0, 2)),
173 self.create_tile((1, 0, 2)),
174 self.create_tile((1, 0, 1)),
175 ])
176
177 eq_(sorted(os.listdir(self.cache_dir)), ['1.gpkg', '2.gpkg'])
178 assert self.cache.is_cached(Tile((0, 0, 1)))
179 assert self.cache.is_cached(Tile((1, 0, 1)))
180 assert self.cache.is_cached(Tile((0, 0, 2)))
181 assert self.cache.is_cached(Tile((1, 0, 2)))
182
183 class TestGeopackageCacheInitErrors(object):
184 table_name = 'cache'
185
186 def test_bad_config_geopackage_srs(self):
187 error_msg = None
188 gpkg_file = os.path.join(os.path.dirname(__file__), 'fixture', 'cache.gpkg')
191 table_name = 'cache'
192 try:
193 GeopackageCache(gpkg_file, TileGrid(srs=4326), table_name)
194 except ValueError as ve:
195 error_msg = ve
196 assert "srs is improperly configured." in str(error_msg)
197
198 def test_bad_config_geopackage_tile(self):
199 error_msg = None
200 gpkg_file = os.path.join(os.path.dirname(__file__), 'fixture', 'cache.gpkg')
203 table_name = 'cache'
204 try:
205 GeopackageCache(gpkg_file, TileGrid(srs=900913, tile_size=(512, 512)), table_name)
206 except ValueError as ve:
207 error_msg = ve
208 assert "tile_size is improperly configured." in str(error_msg)
209
210 def test_bad_config_geopackage_res(self):
211 error_msg = None
212 gpkg_file = os.path.join(os.path.dirname(__file__), 'fixture', 'cache.gpkg')
215 table_name = 'cache'
216 try:
217 GeopackageCache(gpkg_file, TileGrid(srs=900913, res=[1000, 100, 10]), table_name)
218 except ValueError as ve:
219 error_msg = ve
220 assert "res is improperly configured." in str(error_msg)
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2017 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from __future__ import with_statement
16
17 try:
18 import redis
19 except ImportError:
20 redis = None
21
22 import time
23 import os
24
25 from nose.plugins.skip import SkipTest
26
27 from mapproxy.cache.tile import Tile
28 from mapproxy.cache.redis import RedisCache
29
30 from mapproxy.test.unit.test_cache_tile import TileCacheTestBase
31
32 class TestRedisCache(TileCacheTestBase):
33 always_loads_metadata = False
34 def setup(self):
35 if not redis:
36 raise SkipTest("redis required for Redis tests")
37
38 redis_host = os.environ.get('MAPPROXY_TEST_REDIS')
39 if not redis_host:
40 raise SkipTest("MAPPROXY_TEST_REDIS is not set")
41 self.host, self.port = redis_host.split(':')
42
43 TileCacheTestBase.setup(self)
44
45 self.cache = RedisCache(self.host, int(self.port), prefix='mapproxy-test', db=1)
46
47 def teardown(self):
48 for k in self.cache.r.keys('mapproxy-test-*'):
49 self.cache.r.delete(k)
50 TileCacheTestBase.teardown(self)
50
51 def test_expire(self):
52 cache = RedisCache(self.host, int(self.port), prefix='mapproxy-test', db=1, ttl=0)
53 t1 = self.create_tile(coord=(9382, 1234, 9))
54 assert cache.store_tile(t1)
55 time.sleep(0.1)
56 t2 = Tile(t1.coord)
57 assert cache.is_cached(t2)
58
59 cache = RedisCache(self.host, int(self.port), prefix='mapproxy-test', db=1, ttl=0.05)
60 t1 = self.create_tile(coord=(5382, 2234, 9))
61 assert cache.store_tile(t1)
62 time.sleep(0.1)
63 t2 = Tile(t1.coord)
64 assert not cache.is_cached(t2)
65
66 def test_double_remove(self):
67 tile = self.create_tile()
68 self.create_cached_tile(tile)
69 assert self.cache.remove_tile(tile)
70 assert self.cache.remove_tile(tile)
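The RedisCache exercised here maps onto plain redis-py primitives. A minimal sketch, assuming the same host/port/prefix/db/ttl options as the test setup; TinyRedisTileCache is illustrative, not the actual implementation:

import redis

class TinyRedisTileCache(object):
    def __init__(self, host, port, prefix='mapproxy-test', db=1, ttl=0):
        self.r = redis.StrictRedis(host=host, port=port, db=db)
        self.prefix = prefix
        self.ttl = ttl

    def _key(self, coord):
        x, y, z = coord
        return '%s-%d-%d-%d' % (self.prefix, z, x, y)

    def store(self, coord, data):
        if self.ttl:
            # sub-second TTLs (as in test_expire) need millisecond precision
            return self.r.psetex(self._key(coord), int(self.ttl * 1000), data)
        return self.r.set(self._key(coord), data)

    def is_cached(self, coord):
        return self.r.exists(self._key(coord))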
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2011 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 try:
16 import boto3
17 from moto import mock_s3
18 except ImportError:
19 boto3 = None
20 mock_s3 = None
21
22 from nose.plugins.skip import SkipTest
23
24 from mapproxy.cache.s3 import S3Cache
25 from mapproxy.test.unit.test_cache_tile import TileCacheTestBase
26
27
28 class TestS3Cache(TileCacheTestBase):
29 always_loads_metadata = True
30 uses_utc = True
31
32 def setup(self):
33 if not mock_s3 or not boto3:
34 raise SkipTest("boto3 and moto required for S3 tests")
35
36 TileCacheTestBase.setup(self)
37
38 self.mock = mock_s3()
39 self.mock.start()
40
41 self.bucket_name = "test"
42 dir_name = 'mapproxy'
43
44 boto3.client("s3").create_bucket(Bucket=self.bucket_name)
45
46 self.cache = S3Cache(dir_name,
47 file_ext='png',
48 directory_layout='tms',
49 bucket_name=self.bucket_name,
50 profile_name=None,
51 _concurrent_writer=1, # moto is not thread safe
52 )
53
54 def teardown(self):
55 self.mock.stop()
56 TileCacheTestBase.teardown(self)
57
58 def check_tile_key(self, layout, tile_coord, key):
59 cache = S3Cache('/mycache/webmercator', 'png', bucket_name=self.bucket_name, directory_layout=layout)
60 cache.store_tile(self.create_tile(tile_coord))
61
62 # raises, if key is missing
63 boto3.client("s3").head_object(Bucket=self.bucket_name, Key=key)
64
65 def test_tile_keys(self):
66 yield self.check_tile_key, 'mp', (12345, 67890, 2), 'mycache/webmercator/02/0001/2345/0006/7890.png'
67 yield self.check_tile_key, 'mp', (12345, 67890, 12), 'mycache/webmercator/12/0001/2345/0006/7890.png'
68
69 yield self.check_tile_key, 'tc', (12345, 67890, 2), 'mycache/webmercator/02/000/012/345/000/067/890.png'
70 yield self.check_tile_key, 'tc', (12345, 67890, 12), 'mycache/webmercator/12/000/012/345/000/067/890.png'
71
72 yield self.check_tile_key, 'tms', (12345, 67890, 2), 'mycache/webmercator/2/12345/67890.png'
73 yield self.check_tile_key, 'tms', (12345, 67890, 12), 'mycache/webmercator/12/12345/67890.png'
74
75 yield self.check_tile_key, 'quadkey', (0, 0, 0), 'mycache/webmercator/.png'
76 yield self.check_tile_key, 'quadkey', (0, 0, 1), 'mycache/webmercator/0.png'
77 yield self.check_tile_key, 'quadkey', (1, 1, 1), 'mycache/webmercator/3.png'
78 yield self.check_tile_key, 'quadkey', (12345, 67890, 12), 'mycache/webmercator/200200331021.png'
79
80 yield self.check_tile_key, 'arcgis', (1, 2, 3), 'mycache/webmercator/L03/R00000002/C00000001.png'
81 yield self.check_tile_key, 'arcgis', (9, 2, 3), 'mycache/webmercator/L03/R00000002/C00000009.png'
82 yield self.check_tile_key, 'arcgis', (10, 2, 3), 'mycache/webmercator/L03/R00000002/C0000000a.png'
83 yield self.check_tile_key, 'arcgis', (12345, 67890, 12), 'mycache/webmercator/L12/R00010932/C00003039.png'
84
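The directory_layout variants above translate tile coordinates into S3 object keys. A minimal sketch of the 'tms' case with plain boto3; tms_key is a hypothetical helper, the bucket name 'test' mirrors the test setup, and under moto this runs without real AWS credentials:

import boto3

def tms_key(base, file_ext, coord):
    # 'tms' layout as asserted above: <base>/<z>/<x>/<y>.<ext>
    x, y, z = coord
    return '%s/%d/%d/%d.%s' % (base.strip('/'), z, x, y, file_ext)

client = boto3.client('s3')
key = tms_key('/mycache/webmercator', 'png', (12345, 67890, 12))
# -> 'mycache/webmercator/12/12345/67890.png'
client.put_object(Bucket='test', Key=key, Body=b'...')
client.head_object(Bucket='test', Key=key)  # raises if the key is missing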
1414
1515 from __future__ import with_statement
1616
17 import datetime
1718 import os
1819 import shutil
1920 import threading
2829 from mapproxy.cache.tile import Tile
2930 from mapproxy.cache.file import FileCache
3031 from mapproxy.cache.mbtiles import MBTilesCache, MBTilesLevelCache
31 from mapproxy.cache.base import CacheBackendError
3232 from mapproxy.image import ImageSource
3333 from mapproxy.image.opts import ImageOptions
3434 from mapproxy.test.image import create_tmp_image_buf, is_png
3535
36 from nose.tools import eq_, assert_raises
36 from nose.tools import eq_
3737
3838 tile_image = create_tmp_image_buf((256, 256), color='blue')
3939 tile_image2 = create_tmp_image_buf((256, 256), color='red')
4040
41 def timestamp_is_now(timestamp, delta=5):
42 return abs(timestamp - time.time()) <= delta
4341
4442 class TileCacheTestBase(object):
4543 always_loads_metadata = False
44 uses_utc = False
4645
4746 def setup(self):
4847 self.cache_dir = tempfile.mkdtemp()
5150 if hasattr(self, 'cache_dir') and os.path.exists(self.cache_dir):
5251 shutil.rmtree(self.cache_dir)
5352
54 def create_tile(self, coord=(0, 0, 4)):
53 def create_tile(self, coord=(3009, 589, 12)):
5554 return Tile(coord,
5655 ImageSource(tile_image,
5756 image_opts=ImageOptions(format='image/png')))
5857
59 def create_another_tile(self, coord=(0, 0, 4)):
58 def create_another_tile(self, coord=(3009, 589, 12)):
6059 return Tile(coord,
6160 ImageSource(tile_image2,
6261 image_opts=ImageOptions(format='image/png')))
6362
6463 def test_is_cached_miss(self):
65 assert not self.cache.is_cached(Tile((0, 0, 4)))
64 assert not self.cache.is_cached(Tile((3009, 589, 12)))
6665
6766 def test_is_cached_hit(self):
6867 tile = self.create_tile()
6968 self.create_cached_tile(tile)
70 assert self.cache.is_cached(Tile((0, 0, 4)))
69 assert self.cache.is_cached(Tile((3009, 589, 12)))
7170
7271 def test_is_cached_none(self):
7372 assert self.cache.is_cached(Tile(None))
7675 assert self.cache.load_tile(Tile(None))
7776
7877 def test_load_tile_not_cached(self):
79 tile = Tile((0, 0, 4))
78 tile = Tile((3009, 589, 12))
8079 assert not self.cache.load_tile(tile)
8180 assert tile.source is None
8281 assert tile.is_missing()
8483 def test_load_tile_cached(self):
8584 tile = self.create_tile()
8685 self.create_cached_tile(tile)
87 tile = Tile((0, 0, 4))
86 tile = Tile((3009, 589, 12))
8887 assert self.cache.load_tile(tile) == True
8988 assert not tile.is_missing()
9089
9190 def test_store_tiles(self):
92 tiles = [self.create_tile((x, 0, 4)) for x in range(4)]
91 tiles = [self.create_tile((x, 589, 12)) for x in range(4)]
9392 tiles[0].stored = True
9493 self.cache.store_tiles(tiles)
9594
96 tiles = [Tile((x, 0, 4)) for x in range(4)]
95 tiles = [Tile((x, 589, 12)) for x in range(4)]
9796 assert tiles[0].is_missing()
9897 assert self.cache.load_tile(tiles[0]) == False
9998 assert tiles[0].is_missing()
144143 assert self.cache.load_tile(tile, with_metadata=True)
145144 assert tile.source is not None
146145 if tile.timestamp:
147 assert timestamp_is_now(tile.timestamp, delta=10)
146 now = time.time()
147 if self.uses_utc:
148 now = time.mktime(datetime.datetime.utcnow().timetuple())
149 assert abs(tile.timestamp - now) <= 10
148150 if tile.size:
149151 assert tile.size == size
150152
171173 # tile object is marked as stored,
172174 # check that is is not stored 'again'
173175 # (used for disable_storage)
174 tile = Tile((0, 0, 4), ImageSource(BytesIO(b'foo')))
176 tile = Tile((1234, 589, 12), ImageSource(BytesIO(b'foo')))
175177 tile.stored = True
176178 self.cache.store_tile(tile)
177179
178180 assert self.cache.is_cached(tile)
179181
180 tile = Tile((0, 0, 4))
182 tile = Tile((1234, 589, 12))
181183 assert not self.cache.is_cached(tile)
182184
183185 def test_remove(self):
187189
188190 self.cache.remove_tile(Tile((1, 0, 4)))
189191 assert not self.cache.is_cached(Tile((1, 0, 4)))
192
193 # check if we can recreate a removed tile
194 tile = self.create_tile((1, 0, 4))
195 self.create_cached_tile(tile)
196 assert self.cache.is_cached(Tile((1, 0, 4)))
190197
191198 def create_cached_tile(self, tile):
192199 self.cache.store_tile(tile)
247254 with open(loc, 'wb') as f:
248255 f.write(b'foo')
249256
257
258 def check_tile_location(self, layout, tile_coord, path):
259 cache = FileCache('/tmp/foo', 'png', directory_layout=layout)
260 eq_(cache.tile_location(Tile(tile_coord)), path)
261
262 def test_tile_locations(self):
263 yield self.check_tile_location, 'mp', (12345, 67890, 2), '/tmp/foo/02/0001/2345/0006/7890.png'
264 yield self.check_tile_location, 'mp', (12345, 67890, 12), '/tmp/foo/12/0001/2345/0006/7890.png'
265
266 yield self.check_tile_location, 'tc', (12345, 67890, 2), '/tmp/foo/02/000/012/345/000/067/890.png'
267 yield self.check_tile_location, 'tc', (12345, 67890, 12), '/tmp/foo/12/000/012/345/000/067/890.png'
268
269 yield self.check_tile_location, 'tms', (12345, 67890, 2), '/tmp/foo/2/12345/67890.png'
270 yield self.check_tile_location, 'tms', (12345, 67890, 12), '/tmp/foo/12/12345/67890.png'
271
272 yield self.check_tile_location, 'quadkey', (0, 0, 0), '/tmp/foo/.png'
273 yield self.check_tile_location, 'quadkey', (0, 0, 1), '/tmp/foo/0.png'
274 yield self.check_tile_location, 'quadkey', (1, 1, 1), '/tmp/foo/3.png'
275 yield self.check_tile_location, 'quadkey', (12345, 67890, 12), '/tmp/foo/200200331021.png'
276
277 yield self.check_tile_location, 'arcgis', (1, 2, 3), '/tmp/foo/L03/R00000002/C00000001.png'
278 yield self.check_tile_location, 'arcgis', (9, 2, 3), '/tmp/foo/L03/R00000002/C00000009.png'
279 yield self.check_tile_location, 'arcgis', (10, 2, 3), '/tmp/foo/L03/R00000002/C0000000a.png'
280 yield self.check_tile_location, 'arcgis', (12345, 67890, 12), '/tmp/foo/L12/R00010932/C00003039.png'
281
282
283 def check_level_location(self, layout, level, path):
284 cache = FileCache('/tmp/foo', 'png', directory_layout=layout)
285 eq_(cache.level_location(level), path)
286
287 def test_level_locations(self):
288 yield self.check_level_location, 'mp', 2, '/tmp/foo/02'
289 yield self.check_level_location, 'mp', 12, '/tmp/foo/12'
290
291 yield self.check_level_location, 'tc', 2, '/tmp/foo/02'
292 yield self.check_level_location, 'tc', 12, '/tmp/foo/12'
293
294 yield self.check_level_location, 'tms', '2', '/tmp/foo/2'
295 yield self.check_level_location, 'tms', 12, '/tmp/foo/12'
296
297 yield self.check_level_location, 'arcgis', 3, '/tmp/foo/L03'
300 yield self.check_level_location, 'arcgis', 12, '/tmp/foo/L12'
301
302 def test_level_location_quadkey(self):
303 try:
304 self.check_level_location('quadkey', 0, None)
305 except NotImplementedError:
306 pass
307 else:
308 assert False, "expected NotImplementedError"
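The quadkey expectations follow the usual Bing-style bit interleaving of x and y. A minimal sketch that reproduces the values asserted above (a hypothetical helper, not the FileCache code itself):

def quadkey(x, y, z):
    # interleave the bits of x and y, most significant first;
    # level 0 yields an empty key, hence the bare '.png' location above
    key = ''
    for i in range(z, 0, -1):
        digit = 0
        mask = 1 << (i - 1)
        if x & mask:
            digit += 1
        if y & mask:
            digit += 2
        key += str(digit)
    return key

assert quadkey(12345, 67890, 12) == '200200331021'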
250309
251310 class TestMBTileCache(TileCacheTestBase):
252311 def setup(self):
346405
347406 eq_(sorted(os.listdir(self.cache_dir)), ['1.mbtile', '2.mbtile'])
348407 assert self.cache.is_cached(Tile((0, 0, 2)))
408
409 def test_bulk_store_tiles_with_different_levels(self):
410 self.cache.store_tiles([
411 self.create_tile((0, 0, 1)),
412 self.create_tile((0, 0, 2)),
413 self.create_tile((1, 0, 2)),
414 self.create_tile((1, 0, 1)),
415 ])
416
417 eq_(sorted(os.listdir(self.cache_dir)), ['1.mbtile', '2.mbtile'])
418 assert self.cache.is_cached(Tile((0, 0, 1)))
419 assert self.cache.is_cached(Tile((1, 0, 1)))
420 assert self.cache.is_cached(Tile((0, 0, 2)))
421 assert self.cache.is_cached(Tile((1, 0, 2)))
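Storing a whole batch in one transaction, as these bulk tests exercise, comes down to a single executemany per batch. A minimal sketch against the standard MBTiles tiles table; illustrative only, the real MBTilesCache additionally handles the TMS y-flip, timestamps and timeouts:

import sqlite3

def store_tiles(mbtiles_file, tiles):
    # tiles: iterable of (x, y, z, data) tuples, committed as one transaction
    with sqlite3.connect(mbtiles_file) as db:
        db.executemany(
            'INSERT OR REPLACE INTO tiles '
            '(zoom_level, tile_column, tile_row, tile_data) VALUES (?, ?, ?, ?)',
            [(z, x, y, sqlite3.Binary(data)) for x, y, z, data in tiles])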
295295 http = MockHTTPClient()
296296 wms = WMSInfoClient(req, http_client=http, supported_srs=[SRS(25832)])
297297 fi_req = InfoQuery((8, 50, 9, 51), (512, 512),
298 SRS(4326), (256, 256), 'text/plain')
298 SRS(4326), (128, 64), 'text/plain')
299299
300300 wms.get_info(fi_req)
301301
302302 assert wms_query_eq(http.requested[0],
303303 TESTSERVER_URL+'/service?map=foo&LAYERS=foo&SERVICE=WMS&FORMAT=image%2Fpng'
304 '&REQUEST=GetFeatureInfo&HEIGHT=512&SRS=EPSG%3A25832&info_format=text/plain'
304 '&REQUEST=GetFeatureInfo&SRS=EPSG%3A25832&info_format=text/plain'
305305 '&query_layers=foo'
306 '&VERSION=1.1.1&WIDTH=512&STYLES=&x=259&y=255'
307 '&BBOX=428333.552496,5538630.70275,500000.0,5650300.78652')
306 '&VERSION=1.1.1&WIDTH=512&HEIGHT=797&STYLES=&x=135&y=101'
307 '&BBOX=428333.552496,5538630.70275,500000.0,5650300.78652'), http.requested[0]
308308
309309 def test_transform_fi_request(self):
310310 req = WMS111FeatureInfoRequest(url=TESTSERVER_URL + '/service?map=foo', param={'layers':'foo', 'srs': 'EPSG:25832'})
311311 http = MockHTTPClient()
312312 wms = WMSInfoClient(req, http_client=http)
313313 fi_req = InfoQuery((8, 50, 9, 51), (512, 512),
314 SRS(4326), (256, 256), 'text/plain')
314 SRS(4326), (128, 64), 'text/plain')
315315
316316 wms.get_info(fi_req)
317317
318318 assert wms_query_eq(http.requested[0],
319319 TESTSERVER_URL+'/service?map=foo&LAYERS=foo&SERVICE=WMS&FORMAT=image%2Fpng'
320 '&REQUEST=GetFeatureInfo&HEIGHT=512&SRS=EPSG%3A25832&info_format=text/plain'
320 '&REQUEST=GetFeatureInfo&SRS=EPSG%3A25832&info_format=text/plain'
321321 '&query_layers=foo'
322 '&VERSION=1.1.1&WIDTH=512&STYLES=&x=259&y=255'
323 '&BBOX=428333.552496,5538630.70275,500000.0,5650300.78652')
322 '&VERSION=1.1.1&WIDTH=512&HEIGHT=797&STYLES=&x=135&y=101'
323 '&BBOX=428333.552496,5538630.70275,500000.0,5650300.78652'), http.requested[0]
324324
325325 class TestWMSMapRequest100(object):
326326 def setup(self):
0 # This file is part of the MapProxy project.
1 # Copyright (C) 2010 Omniscale <http://omniscale.de>
2 #
3 # Licensed under the Apache License, Version 2.0 (the "License");
4 # you may not use this file except in compliance with the License.
5 # You may obtain a copy of the License at
6 #
7 # http://www.apache.org/licenses/LICENSE-2.0
8 #
9 # Unless required by applicable law or agreed to in writing, software
10 # distributed under the License is distributed on an "AS IS" BASIS,
11 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 # See the License for the specific language governing permissions and
13 # limitations under the License.
14
15 from io import BytesIO
16
17 from mapproxy.client.arcgis import ArcGISInfoClient
18 from mapproxy.layer import InfoQuery
19 from mapproxy.request.arcgis import ArcGISIdentifyRequest
20 from mapproxy.srs import SRS
21 from mapproxy.test.http import assert_query_eq
22
23 TESTSERVER_ADDRESS = ('127.0.0.1', 56413)
24 TESTSERVER_URL = 'http://%s:%s' % TESTSERVER_ADDRESS
25
26
27
28 class MockHTTPClient(object):
29 def __init__(self):
30 self.requested = []
31
32 def open(self, url, data=None):
33 self.requested.append(url)
34 result = BytesIO(b'{}')
35 result.seek(0)
36 result.headers = {}
37 return result
38
39 class TestArcGISInfoClient(object):
40 def test_fi_request(self):
41 req = ArcGISIdentifyRequest(url=TESTSERVER_URL + '/MapServer/export?map=foo', param={'layers':'foo'})
42 http = MockHTTPClient()
43 wms = ArcGISInfoClient(req, http_client=http, supported_srs=[SRS(4326)])
44 fi_req = InfoQuery((8, 50, 9, 51), (512, 512),
45 SRS(4326), (128, 64), 'text/plain')
46
47 wms.get_info(fi_req)
48
49 assert_query_eq(http.requested[0],
50 TESTSERVER_URL+'/MapServer/identify?map=foo'
51 '&imageDisplay=512,512,96&sr=4326&f=json'
52 '&layers=foo&tolerance=5&returnGeometry=false'
53 '&geometryType=esriGeometryPoint&geometry=8.250000,50.875000'
54 '&mapExtent=8,50,9,51',
55 fuzzy_number_compare=True)
56
57 def test_transform_fi_request_supported_srs(self):
58 req = ArcGISIdentifyRequest(url=TESTSERVER_URL + '/MapServer/export?map=foo', param={'layers':'foo'})
59 http = MockHTTPClient()
60 wms = ArcGISInfoClient(req, http_client=http, supported_srs=[SRS(25832)])
61 fi_req = InfoQuery((8, 50, 9, 51), (512, 512),
62 SRS(4326), (128, 64), 'text/plain')
63
64 wms.get_info(fi_req)
65
66 assert_query_eq(http.requested[0],
67 TESTSERVER_URL+'/MapServer/identify?map=foo'
68 '&imageDisplay=512,797,96&sr=25832&f=json'
69 '&layers=foo&tolerance=5&returnGeometry=false'
70 '&geometryType=esriGeometryPoint&geometry=447229.979084,5636149.370634'
71 '&mapExtent=428333.552496,5538630.70275,500000.0,5650300.78652',
72 fuzzy_number_compare=True)
2323 merge_dict,
2424 ConfigurationError,
2525 )
26 from mapproxy.config.coverage import load_coverage
2627 from mapproxy.config.spec import validate_options
2728 from mapproxy.cache.tile import TileManager
29 from mapproxy.seed.spec import validate_seed_conf
2830 from mapproxy.test.helper import TempFile
2931 from mapproxy.test.unit.test_grid import assert_almost_equal_bbox
3032 from nose.tools import eq_, assert_raises
922924
923925 conf.globals.image_options.image_opts({}, 'image/jpeg')
924926
927 class TestLoadCoverage(object):
928 def test_union(self):
929 conf = {
930 'coverages': {
931 'covname': {
932 'union': [
933 {'bbox': [0, 0, 10, 10], 'srs': 'EPSG:4326'},
934 {'bbox': [10, 0, 20, 10], 'srs': 'EPSG:4326', 'unknown': True},
935 ],
936 },
937 },
938 }
939
940 errors, informal_only = validate_seed_conf(conf)
941 assert informal_only
942 assert len(errors) == 1
943 eq_(errors[0], "unknown 'unknown' in coverages.covname.union[1]")
2020 from lxml import etree, html
2121 from nose.tools import eq_
2222
23 from mapproxy.featureinfo import (combined_inputs, XSLTransformer,
24 XMLFeatureInfoDoc, HTMLFeatureInfoDoc)
23 from mapproxy.featureinfo import (
24 combined_inputs,
25 XSLTransformer,
26 XMLFeatureInfoDoc,
27 HTMLFeatureInfoDoc,
28 JSONFeatureInfoDoc,
29 )
2530 from mapproxy.test.helper import strip_whitespace
2631
2732 def test_combined_inputs():
176181 b"<p>baz2\n<p>foo</p>\n<body><p>bar</p></body>",
177182 result.as_string())
178183 eq_(result.info_type, 'text')
184
185 class TestJSONFeatureInfoDocs(object):
186 def test_combine(self):
187 docs = [
188 JSONFeatureInfoDoc('{}'),
189 JSONFeatureInfoDoc('{"results": [{"foo": 1}]}'),
190 JSONFeatureInfoDoc('{"results": [{"bar": 2}]}'),
191 ]
192 result = JSONFeatureInfoDoc.combine(docs)
193
194 eq_('''{"results": [{"foo": 1}, {"bar": 2}]}''',
195 result.as_string())
196 eq_(result.info_type, 'json')
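The combine semantics tested here amount to concatenating the 'results' arrays of all documents into the first one. A minimal sketch of that merge; combine_json is illustrative, not the JSONFeatureInfoDoc API:

import json

def combine_json(docs):
    combined = json.loads(docs[0])
    for doc in docs[1:]:
        combined.setdefault('results', []).extend(
            json.loads(doc).get('results', []))
    return json.dumps(combined)

combine_json(['{}', '{"results": [{"foo": 1}]}', '{"results": [{"bar": 2}]}'])
# -> '{"results": [{"foo": 1}, {"bar": 2}]}'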
1515 from __future__ import division, with_statement
1616
1717 import os
18 import tempfile
19 import shutil
1820
1921 from mapproxy.srs import SRS, bbox_equals
2022 from mapproxy.util.geom import (
2123 load_polygons,
2224 load_datasource,
25 load_geojson,
26 load_expire_tiles,
2327 transform_geometry,
2428 geom_support,
2529 bbox_polygon,
2630 build_multipolygon,
2731 )
28 from mapproxy.util.coverage import coverage, MultiCoverage
32 from mapproxy.util.coverage import (
33 coverage,
34 MultiCoverage,
35 union_coverage,
36 diff_coverage,
37 intersection_coverage,
38 )
2939 from mapproxy.layer import MapExtent, DefaultMapExtent
3040 from mapproxy.test.helper import TempFile
3141
137147 eq_(polygon.type, 'Polygon')
138148 assert polygon.equals(shapely.geometry.Polygon([(0, 0), (15, 0), (15, 10), (0, 10)]))
139149
150
151 class TestGeoJSONLoading(object):
152 def test_geojson(self):
153 yield (self.check_geojson,
154 '''{"type": "Polygon", "coordinates": [[[0, 0], [10, 0], [10, 10], [0, 0]]]}''',
155 shapely.geometry.Polygon([[0, 0], [10, 0], [10, 10], [0, 0]]),
156 )
157
158 yield (self.check_geojson,
159 '''{"type": "MultiPolygon", "coordinates": [[[[0, 0], [10, 0], [10, 10], [0, 0]]], [[[20, 0], [30, 0], [20, 10], [20, 0]]]]}''',
160 shapely.geometry.Polygon([[0, 0], [10, 0], [10, 10], [0, 0]]).union(shapely.geometry.Polygon([[20, 0], [30, 0], [20, 10], [20, 0]])),
161 )
162
163 yield (self.check_geojson,
164 '''{"type": "Feature", "geometry": {"type": "Polygon", "coordinates": [[[0, 0], [10, 0], [10, 10], [0, 0]]]}}''',
165 shapely.geometry.Polygon([[0, 0], [10, 0], [10, 10], [0, 0]]),
166 )
167
168 yield (self.check_geojson,
169 '''{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Polygon", "coordinates": [[[0, 0], [10, 0], [10, 10], [0, 0]]]}}]}''',
170 shapely.geometry.Polygon([[0, 0], [10, 0], [10, 10], [0, 0]]),
171 )
172
173 def check_geojson(self, geojson, geometry):
174 with TempFile() as fname:
175 with open(fname, 'w') as f:
176 f.write(geojson)
177 polygon = load_geojson(fname)
178 bbox, polygon = build_multipolygon(polygon, simplify=True)
179 assert polygon.is_valid
180 assert polygon.type in ('Polygon', 'MultiPolygon'), polygon.type
181 assert polygon.equals(geometry)
182
183
140184 class TestTransform(object):
141185 def test_polygon_transf(self):
142186 p1 = shapely.geometry.Polygon([(0, 0), (10, 0), (10, 10), (0, 10)])
267311 assert coverage([-10, 10, 80, 80], SRS(4326)) != coverage([-10, 10.0, 80.0, 80], SRS(31467))
268312
269313
314 class TestUnionCoverage(object):
315 def setup(self):
316 self.coverage = union_coverage([
317 coverage([0, 0, 10, 10], SRS(4326)),
318 coverage(shapely.wkt.loads("POLYGON((10 0, 20 0, 20 10, 10 10, 10 0))"), SRS(4326)),
319 coverage(shapely.wkt.loads("POLYGON((-1000000 0, 0 0, 0 1000000, -1000000 1000000, -1000000 0))"), SRS(3857)),
320 ])
321
322 def test_bbox(self):
323 assert bbox_equals(self.coverage.bbox, [-8.98315284, 0.0, 20.0, 10.0], 0.0001), self.coverage.bbox
324
325 def test_contains(self):
326 assert self.coverage.contains((0, 0, 5, 5), SRS(4326))
327 assert self.coverage.contains((-50000, 0, -20000, 20000), SRS(3857))
328 assert not self.coverage.contains((-50000, -100, -20000, 20000), SRS(3857))
329
330 def test_intersects(self):
331 assert self.coverage.intersects((0, 0, 5, 5), SRS(4326))
332 assert self.coverage.intersects((5, 0, 25, 5), SRS(4326))
333 assert self.coverage.intersects((-50000, 0, -20000, 20000), SRS(3857))
334 assert self.coverage.intersects((-50000, -100, -20000, 20000), SRS(3857))
335
336
337 class TestDiffCoverage(object):
338 def setup(self):
339 g1 = coverage(shapely.wkt.loads("POLYGON((-10 0, 20 0, 20 10, -10 10, -10 0))"), SRS(4326))
340 g2 = coverage([0, 2, 8, 8], SRS(4326))
341 g3 = coverage(shapely.wkt.loads("POLYGON((-1000000 500000, 0 500000, 0 1000000, -1000000 1000000, -1000000 500000))"), SRS(3857))
342 self.coverage = diff_coverage([g1, g2, g3])
343
344 def test_bbox(self):
345 assert bbox_equals(self.coverage.bbox, [-10, 0.0, 20.0, 10.0], 0.0001), self.coverage.bbox
346
347 def test_contains(self):
348 assert self.coverage.contains((0, 0, 1, 1), SRS(4326))
349 assert self.coverage.contains((-1100000, 510000, -1050000, 600000), SRS(3857))
350 assert not self.coverage.contains((-1100000, 510000, -990000, 600000), SRS(3857)) # touches g3
351 assert not self.coverage.contains((4, 4, 5, 5), SRS(4326)) # in g2
352
353 def test_intersects(self):
354 assert self.coverage.intersects((0, 0, 1, 1), SRS(4326))
355 assert self.coverage.intersects((-1100000, 510000, -1050000, 600000), SRS(3857))
356 assert self.coverage.intersects((-1100000, 510000, -990000, 600000), SRS(3857)) # touches g3
357 assert not self.coverage.intersects((4, 4, 5, 5), SRS(4326)) # in g2
358
359
360 class TestIntersectionCoverage(object):
361 def setup(self):
362 g1 = coverage(shapely.wkt.loads("POLYGON((0 0, 10 0, 10 10, 0 10, 0 0))"), SRS(4326))
363 g2 = coverage([5, 5, 15, 15], SRS(4326))
364 self.coverage = intersection_coverage([g1, g2])
365
366 def test_bbox(self):
367 assert bbox_equals(self.coverage.bbox, [5.0, 5.0, 10.0, 10.0], 0.0001), self.coverage.bbox
368
369 def test_contains(self):
370 assert not self.coverage.contains((0, 0, 1, 1), SRS(4326))
371 assert self.coverage.contains((6, 6, 7, 7), SRS(4326))
372
373 def test_intersects(self):
374 assert self.coverage.intersects((3, 6, 7, 7), SRS(4326))
375 assert self.coverage.intersects((6, 6, 7, 7), SRS(4326))
376 assert not self.coverage.intersects((0, 0, 1, 1), SRS(4326))
377
378
270379 class TestMultiCoverage(object):
271380 def setup(self):
272381 # box from 10 10 to 80 80 with small spike/corner to -10 60 (upper left)
363472
364473 geoms = load_datasource(fname)
365474 eq_(len(geoms), 2)
475
476 def test_geojson(self):
477 with TempFile() as fname:
478 with open(fname, 'wb') as f:
479 f.write(b'''{"type": "FeatureCollection", "features": [
480 {"type": "Feature", "geometry": {"type": "Polygon", "coordinates": [[[0, 0], [10, 0], [10, 10], [0, 0]]]} },
481 {"type": "Feature", "geometry": {"type": "MultiPolygon", "coordinates": [[[[0, 0], [10, 0], [10, 10], [0, 0]]], [[[0, 0], [10, 0], [10, 10], [0, 0]]], [[[0, 0], [10, 0], [10, 10], [0, 0]]]]} },
482 {"type": "Feature", "geometry": {"type": "Point", "coordinates": [0, 0]} }
483 ]}''')
484
485 geoms = load_datasource(fname)
486 eq_(len(geoms), 4)
487
488 def test_expire_tiles_dir(self):
489 dirname = tempfile.mkdtemp()
490 try:
491 fname = os.path.join(dirname, 'tiles')
492 with open(fname, 'wb') as f:
493 f.write(b"4/2/5\n")
494 f.write(b"4/2/6\n")
495 f.write(b"4/4/3\n")
496
497 geoms = load_expire_tiles(dirname)
498 eq_(len(geoms), 3)
499 finally:
500 shutil.rmtree(dirname)
501
502 def test_expire_tiles_file(self):
503 with TempFile() as fname:
504 with open(fname, 'wb') as f:
505 f.write(b"4/2/5\n")
506 f.write(b"4/2/6\n")
507 f.write(b"error\n")
508 f.write(b"4/2/1\n") # rest of file is ignored
509
510 geoms = load_expire_tiles(fname)
511 eq_(len(geoms), 2)
676676 assert t1[2] == t2[0]
677677 assert t1[0] == t3[0]
678678 assert t1[1] == t3[3]
679
680
681 class TestClosestLevelTinyResFactor(object):
682 def setup(self):
683 self.grid = TileGrid(SRS(31467),
684 bbox=[420000,30000,900000,350000], origin='ul',
685 res=[4000,3750,3500,3250,3000,2750,2500,2250,2000,1750,1500,1250,1000,750,650,500,250,100,50,20,10,5,2.5,2,1.5,1,0.5],
686 )
687
688 def test_closest_level(self):
689 eq_(self.grid.closest_level(5000), 0)
690 eq_(self.grid.closest_level(4000), 0)
691 eq_(self.grid.closest_level(3750), 1)
692 eq_(self.grid.closest_level(3500), 2)
693 eq_(self.grid.closest_level(3250), 3)
694 eq_(self.grid.closest_level(3000), 4)
679695
680696
681697 class TestOrigins(object):
1818 import os
1919 from io import BytesIO
2020 from mapproxy.compat.image import Image, ImageDraw
21 from mapproxy.image import ImageSource, ReadBufWrapper, is_single_color_image
22 from mapproxy.image import peek_image_format
21 from mapproxy.image import (
22 ImageSource,
23 BlankImageSource,
24 ReadBufWrapper,
25 is_single_color_image,
26 peek_image_format,
27 _make_transparent as make_transparent,
28 SubImageSource,
29 img_has_transparency,
30 quantize,
31 )
2332 from mapproxy.image.merge import merge_images, BandMerger
24 from mapproxy.image import _make_transparent as make_transparent, SubImageSource, img_has_transparency, quantize
2533 from mapproxy.image.opts import ImageOptions
2634 from mapproxy.image.tile import TileMerger, TileSplitter
2735 from mapproxy.image.transform import ImageTransformer
310318 (10*10, (127, 127, 255, 255)),
311319 ])
312320
321 def test_merge_L(self):
322 img1 = ImageSource(Image.new('RGBA', (10, 10), (255, 0, 255, 255)))
323 img2 = ImageSource(Image.new('L', (10, 10), 100))
324
325 # img2 overlays img1
326 result = merge_images([img1, img2], ImageOptions(transparent=True))
327 img = result.as_image()
328 assert_img_colors_eq(img, [
329 (10*10, (100, 100, 100, 255)),
330 ])
331
313332 def test_paletted_merge(self):
314333 if not hasattr(Image, 'FASTOCTREE'):
315334 raise SkipTest()
345364 result = merge_images([img1, img2], ImageOptions(transparent=False))
346365 img = result.as_image()
347366 eq_(img.getpixel((0, 0)), (0, 255, 255))
367
368 def test_merge_rgb_with_transp(self):
369 img1 = ImageSource(Image.new('RGB', (10, 10), (255, 0, 255)))
370 raw = Image.new('RGB', (10, 10), (0, 255, 255))
371 raw.info = {'transparency': (0, 255, 255)} # make full transparent
372 img2 = ImageSource(raw)
373
374 result = merge_images([img1, img2], ImageOptions(transparent=False))
375 img = result.as_image()
376 eq_(img.getpixel((0, 0)), (255, 0, 255))
348377
349378
350379 class TestLayerCompositeMerge(object):
581610 self.img1 = ImageSource(Image.new('RGB', (10, 10), (100, 110, 120)))
582611 self.img2 = ImageSource(Image.new('RGB', (10, 10), (200, 210, 220)))
583612 self.img3 = ImageSource(Image.new('RGB', (10, 10), (0, 255, 0)))
613 self.blank = BlankImageSource(size=(10, 10), image_opts=ImageOptions())
584614
585615 def test_merge_noops(self):
586616 """
594624 eq_(img.size, (10, 10))
595625 eq_(img.getpixel((0, 0)), (0, 0, 0))
596626
597 def test_merge_no_source(self):
598 """
599 Check that empty source list returns BlankImageSource.
627 def test_merge_missing_source(self):
628 """
629 Check that empty source list or source list with missing images
630 returns BlankImageSource.
600631 """
601632 merger = BandMerger(mode='RGB')
602633 merger.add_ops(dst_band=0, src_img=0, src_band=0)
634 merger.add_ops(dst_band=1, src_img=1, src_band=0)
635 merger.add_ops(dst_band=2, src_img=2, src_band=0)
603636
604637 img_opts = ImageOptions('RGBA', transparent=True)
605638 result = merger.merge([], img_opts, size=(10, 10))
607640
608641 eq_(img.size, (10, 10))
609642 eq_(img.getpixel((0, 0)), (255, 255, 255, 0))
643
644 result = merger.merge([self.img0, self.img1], img_opts, size=(10, 10))
645 img = result.as_image()
646
647 eq_(img.size, (10, 10))
648 eq_(img.getpixel((0, 0)), (255, 255, 255, 0))
649
610650
611651 def test_rgb_merge(self):
612652 """
1212 # See the License for the specific language governing permissions and
1313 # limitations under the License.
1414
15 from mapproxy.compat.image import Image
15 from mapproxy.compat.image import Image, ImageDraw
1616 from mapproxy.srs import SRS
1717 from mapproxy.image import ImageSource
1818 from mapproxy.image.opts import ImageOptions
1919 from mapproxy.image.mask import mask_image_source_from_coverage
20 from mapproxy.image.merge import LayerMerger
2021 from mapproxy.util.coverage import load_limited_to
21 from mapproxy.test.image import assert_img_colors_eq
22 from mapproxy.test.image import assert_img_colors_eq, create_image
23 from nose.tools import eq_
2224
2325 try:
2426 from shapely.geometry import Polygon
7274 geom = 'POLYGON((2 2, 2 8, 8 8, 8 2, 2 2), (4 4, 4 6, 6 6, 6 4, 4 4))'
7375
7476 result = mask_image_source_from_coverage(img, [0, 0, 10, 10], SRS(4326), coverage(geom))
75 # 60*61 - 20*21 = 3240
77 # 60*60 - 20*20 = 3200
7678 assert_img_colors_eq(result.as_image().getcolors(),
77 [(10000-3240, (255, 255, 255, 0)), (3240, (100, 0, 200, 255))])
79 [(10000-3200, (255, 255, 255, 0)), (3200, (100, 0, 200, 255))])
7880
7981 def test_shapely_mask_with_transform_partial_image_transparent(self):
8082 img = ImageSource(Image.new('RGB', (100, 100), color=(100, 0, 200)),
8688 # 20*20 = 400
8789 assert_img_colors_eq(result.as_image().getcolors(),
8890 [(10000-400, (255, 255, 255, 0)), (400, (100, 0, 200, 255))])
91
92
93 class TestLayerCoverageMerge(object):
94 def setup(self):
95 self.coverage1 = coverage(Polygon([(0, 0), (0, 10), (10, 10), (10, 0)]), 3857)
96 self.coverage2 = coverage([2, 2, 8, 8], 3857)
97
98 def test_merge_single_coverage(self):
99 merger = LayerMerger()
100 merger.add(ImageSource(Image.new('RGB', (10, 10), (255, 255, 255))), self.coverage1)
101 result = merger.merge(image_opts=ImageOptions(transparent=True), bbox=(5, 0, 15, 10), bbox_srs=3857)
102 img = result.as_image()
103 eq_(img.mode, 'RGBA')
104 eq_(img.getpixel((4, 0)), (255, 255, 255, 255))
105 eq_(img.getpixel((6, 0)), (255, 255, 255, 0))
106
107 def test_merge_overlapping_coverage(self):
108 color1 = (255, 255, 0)
109 color2 = (0, 255, 255)
110 merger = LayerMerger()
111 merger.add(ImageSource(Image.new('RGB', (10, 10), color1)), self.coverage1)
112 merger.add(ImageSource(Image.new('RGB', (10, 10), color2)), self.coverage2)
113
114 result = merger.merge(image_opts=ImageOptions(), bbox=(0, 0, 10, 10), bbox_srs=3857)
115 img = result.as_image()
116 eq_(img.mode, 'RGB')
117
118 expected = create_image((10, 10), color1, 'RGB')
119 draw = ImageDraw.Draw(expected)
120 draw.polygon([(2, 2), (7, 2), (7, 7), (2, 7)], fill=color2)
121
122 for x in range(0, 9):
123 for y in range(0, 9):
124 eq_(img.getpixel((x, y)), expected.getpixel((x, y)))
125
2121 from mapproxy.request.wms import (wms_request, WMSMapRequest, WMSMapRequestParams,
2222 WMS111MapRequest, WMS100MapRequest, WMS130MapRequest,
2323 WMS111FeatureInfoRequest)
24 from mapproxy.request.arcgis import ArcGISRequest
24 from mapproxy.request.arcgis import ArcGISRequest, ArcGISIdentifyRequest
2525 from mapproxy.exception import RequestError
2626 from mapproxy.request.wms.exception import (WMS111ExceptionHandler, WMSImageExceptionHandler,
2727 WMSBlankExceptionHandler)
230230 req.params.bboxSR = SRS("EPSG:4326")
231231 eq_("4326", req.params.bboxSR)
232232 eq_("4326", req.params["bboxSR"])
233
234 def check_endpoint(self, url, expected):
235 req = ArcGISRequest(url=url)
236 eq_(req.url, expected)
237
238 def test_endpoint_urls(self):
239 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer/', 'http://example.com/ArcGIS/rest/MapServer/export'
240 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer', 'http://example.com/ArcGIS/rest/MapServer/export'
241 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer/export', 'http://example.com/ArcGIS/rest/MapServer/export'
242 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/', 'http://example.com/ArcGIS/rest/ImageServer/exportImage'
243 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer', 'http://example.com/ArcGIS/rest/ImageServer/exportImage'
244 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/export', 'http://example.com/ArcGIS/rest/ImageServer/exportImage'
245 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/exportImage', 'http://example.com/ArcGIS/rest/ImageServer/exportImage'
246
247 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer/export?param=foo', 'http://example.com/ArcGIS/rest/MapServer/export?param=foo'
248 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/export?param=foo', 'http://example.com/ArcGIS/rest/ImageServer/exportImage?param=foo'
249
250
251 class TestArcGISIdentifyRequest(object):
252 def test_base_request(self):
253 req = ArcGISIdentifyRequest(url="http://example.com/ArcGIS/rest/MapServer/")
254 eq_("http://example.com/ArcGIS/rest/MapServer/identify", req.url)
255 req.params.bbox = [-180.0, -90.0, 180.0, 90.0]
256 eq_((-180.0, -90.0, 180.0, 90.0), req.params.bbox)
257 eq_("-180.0,-90.0,180.0,90.0", req.params["mapExtent"])
258 req.params.size = [256, 256]
259 eq_((256, 256), req.params.size)
260 eq_("256,256,96", req.params["imageDisplay"])
261 req.params.srs = "EPSG:4326"
262 eq_("EPSG:4326", req.params.srs)
263 eq_("4326", req.params["sr"])
264
265 def check_endpoint(self, url, expected):
266 req = ArcGISIdentifyRequest(url=url)
267 eq_(req.url, expected)
268
269 def test_endpoint_urls(self):
270 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer/', 'http://example.com/ArcGIS/rest/MapServer/identify'
271 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer', 'http://example.com/ArcGIS/rest/MapServer/identify'
272 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer/export', 'http://example.com/ArcGIS/rest/MapServer/identify'
273 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/', 'http://example.com/ArcGIS/rest/ImageServer/identify'
274 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer', 'http://example.com/ArcGIS/rest/ImageServer/identify'
275 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/export', 'http://example.com/ArcGIS/rest/ImageServer/identify'
276 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/exportImage', 'http://example.com/ArcGIS/rest/ImageServer/identify'
277
278 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/MapServer/export?param=foo', 'http://example.com/ArcGIS/rest/MapServer/identify?param=foo'
279 yield self.check_endpoint, 'http://example.com/ArcGIS/rest/ImageServer/export?param=foo', 'http://example.com/ArcGIS/rest/ImageServer/identify?param=foo'
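The endpoint rewriting checked above can be sketched with the standard urllib machinery; identify_endpoint is a hypothetical helper that mirrors the expected URL mappings, not the ArcGISIdentifyRequest code:

try:
    from urllib.parse import urlsplit, urlunsplit  # Python 3
except ImportError:
    from urlparse import urlsplit, urlunsplit  # Python 2

def identify_endpoint(url):
    # map .../MapServer[/export] and .../ImageServer[/export[Image]]
    # to the service's identify endpoint, keeping any query string
    scheme, netloc, path, query, frag = urlsplit(url)
    path = path.rstrip('/')
    for suffix in ('/exportImage', '/export'):
        if path.endswith(suffix):
            path = path[:-len(suffix)]
            break
    return urlunsplit((scheme, netloc, path + '/identify', query, frag))

identify_endpoint('http://example.com/ArcGIS/rest/ImageServer/exportImage?param=foo')
# -> 'http://example.com/ArcGIS/rest/ImageServer/identify?param=foo'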
233280
234281
235282 class TestRequest(object):
153153
154154 def test_seed_full_bbox_continue(self):
155155 task = self.make_bbox_task([-180, -90, 180, 90], SRS(4326), [0, 1, 2])
156 seed_progress = SeedProgress([(0, 1), (0, 2)])
156 seed_progress = SeedProgress([(0, 1), (1, 2)])
157157 seeder = TileWalker(task, self.seed_pool, handle_uncached=True, seed_progress=seed_progress)
158158 seeder.walk()
159159
262262 before_timestamp_from_options({'minutes': 15}) + 60 * 15,
263263 time.time(), -1
264264 )
265
266 class TestSeedProgress(object):
267 def test_progress_identifier(self):
268 old = SeedProgress()
269 with old.step_down(0, 2):
270 with old.step_down(0, 4):
271 eq_(old.current_progress_identifier(), [(0, 2), (0, 4)])
272 # previous leaves are still present
273 eq_(old.current_progress_identifier(), [(0, 2), (0, 4)])
274 with old.step_down(1, 4):
275 eq_(old.current_progress_identifier(), [(0, 2), (1, 4)])
276 eq_(old.current_progress_identifier(), [(0, 2), (1, 4)])
277
278 eq_(old.current_progress_identifier(), []) # empty list after seed
279
280 with old.step_down(1, 2):
281 eq_(old.current_progress_identifier(), [(1, 2)])
282 with old.step_down(0, 4):
283 with old.step_down(1, 4):
284 eq_(old.current_progress_identifier(), [(1, 2), (0, 4), (1, 4)])
285
286 def test_already_processed(self):
287 new = SeedProgress([(0, 2)])
288 with new.step_down(0, 2):
289 assert not new.already_processed()
290 with new.step_down(0, 2):
291 assert not new.already_processed()
292
293 new = SeedProgress([(1, 2)])
294 with new.step_down(0, 2):
295 assert new.already_processed()
296 with new.step_down(0, 2):
297 assert new.already_processed()
298
299
300 new = SeedProgress([(0, 2), (1, 4), (2, 4)])
301 with new.step_down(0, 2):
302 assert not new.already_processed()
303 with new.step_down(0, 4):
304 assert new.already_processed()
305 with new.step_down(1, 4):
306 assert not new.already_processed()
307 with new.step_down(1, 4):
308 assert new.already_processed()
309 with new.step_down(2, 4):
310 assert not new.already_processed()
311 with new.step_down(3, 4):
312 assert not new.already_processed()
313 with new.step_down(2, 4):
314 assert not new.already_processed()
1414
1515 from __future__ import with_statement, division
1616
17 from mapproxy.layer import MapQuery
17 from mapproxy.layer import MapQuery, InfoQuery
1818 from mapproxy.srs import SRS
1919 from mapproxy.service.wms import combined_layers
2020 from nose.tools import eq_
7575 eq_(combined[1].client.request_template.params.layers, ['c', 'd'])
7676 eq_(combined[2].client.request_template.params.layers, ['e', 'f'])
7777
78
79 class TestInfoQuery(object):
80 def test_coord(self):
81 query = InfoQuery((8, 50, 9, 51), (400, 1000),
82 SRS(4326), (100, 600), 'text/plain')
83 eq_(query.coord, (8.25, 50.4))
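The expected coordinate is plain linear interpolation of the pixel position into the bbox, with y counted from the top edge. A small sketch of that arithmetic; info_coord is illustrative, and the semantics are inferred from the asserted value:

def info_coord(bbox, size, pos):
    minx, miny, maxx, maxy = bbox
    width, height = size
    x, y = pos
    # x grows to the right, y counts down from the top edge
    return (minx + (maxx - minx) * x / float(width),
            maxy - (maxy - miny) * y / float(height))

info_coord((8, 50, 9, 51), (400, 1000), (100, 600))
# -> (8.25, 50.4), matching the eq_ above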
9191 raise
9292 if len(args[0]) == 1:
9393 eventlet.sleep()
94 return _result_iter([call(*zip(*args)[0])], use_result_objects)
94 return _result_iter([call(*list(zip(*args))[0])], use_result_objects)
9595 pool = eventlet.greenpool.GreenPool(self.size)
9696 return _result_iter(pool.imap(call, *args), use_result_objects)
9797
2424 load_polygon_lines,
2525 transform_geometry,
2626 bbox_polygon,
27 EmptyGeometryError,
2728 )
2829 from mapproxy.srs import SRS
2930
3839 # missing Shapely is handled by require_geom_support
3940 pass
4041
41 def coverage(geom, srs):
42 def coverage(geom, srs, clip=False):
4243 if isinstance(geom, (list, tuple)):
43 return BBOXCoverage(geom, srs)
44 return BBOXCoverage(geom, srs, clip=clip)
4445 else:
45 return GeomCoverage(geom, srs)
46 return GeomCoverage(geom, srs, clip=clip)
4647
4748 def load_limited_to(limited_to):
4849 require_geom_support()
106107 return '<MultiCoverage %r: %r>' % (self.extent.llbbox, self.coverages)
107108
108109 class BBOXCoverage(object):
109 clip = False
110 def __init__(self, bbox, srs):
110 def __init__(self, bbox, srs, clip=False):
111111 self.bbox = bbox
112112 self.srs = srs
113113 self.geom = None
114 self.clip = clip
114115
115116 @property
116117 def extent(self):
138139
139140 if intersection[0] >= intersection[2] or intersection[1] >= intersection[3]:
140141 return None
141 return BBOXCoverage(intersection, self.srs)
142 return BBOXCoverage(intersection, self.srs, clip=self.clip)
142143
143144 def contains(self, bbox, srs):
144145 bbox = self._bbox_in_coverage_srs(bbox, srs)
149150 return self
150151
151152 bbox = self.srs.transform_bbox_to(srs, self.bbox)
152 return BBOXCoverage(bbox, srs)
153 return BBOXCoverage(bbox, srs, clip=self.clip)
153154
154155 def __eq__(self, other):
155156 if not isinstance(other, BBOXCoverage):
217218 return self
218219
219220 geom = transform_geometry(self.srs, srs, self.geom)
220 return GeomCoverage(geom, srs)
221 return GeomCoverage(geom, srs, clip=self.clip)
221222
222223 def intersects(self, bbox, srs):
223224 bbox = self._geom_in_coverage_srs(bbox, srs)
226227
227228 def intersection(self, bbox, srs):
228229 bbox = self._geom_in_coverage_srs(bbox, srs)
229 return GeomCoverage(self.geom.intersection(bbox), self.srs)
230 return GeomCoverage(self.geom.intersection(bbox), self.srs, clip=self.clip)
230231
231232 def contains(self, bbox, srs):
232233 bbox = self._geom_in_coverage_srs(bbox, srs)
254255 return not self.__eq__(other)
255256
256257 def __repr__(self):
257 return '<GeomCoverage %r: %r>' % (self.extent.llbbox, self.geom)
258 return '<GeomCoverage %r: %r>' % (self.extent.llbbox, self.geom)
259
260 def union_coverage(coverages, clip=None):
261 """
262 Create a coverage that is the union of all `coverages`.
263 Resulting coverage is in the SRS of the first coverage.
264 """
265 srs = coverages[0].srs
266
267 coverages = [c.transform_to(srs) for c in coverages]
268
269 geoms = []
270 for c in coverages:
271 if isinstance(c, BBOXCoverage):
272 geoms.append(bbox_polygon(c.bbox))
273 else:
274 geoms.append(c.geom)
275
276 import shapely.ops
277 union = shapely.ops.cascaded_union(geoms)
278
279 return GeomCoverage(union, srs=srs, clip=clip)
280
281 def diff_coverage(coverages, clip=None):
282 """
283 Create a coverage by subtracting all `coverages` from the first one.
284 Resulting coverage is in the SRS of the first coverage.
285 """
286 srs = coverages[0].srs
287
288 coverages = [c.transform_to(srs) for c in coverages]
289
290 geoms = []
291 for c in coverages:
292 if isinstance(c, BBOXCoverage):
293 geoms.append(bbox_polygon(c.bbox))
294 else:
295 geoms.append(c.geom)
296
297 import shapely.ops
298 sub = shapely.ops.cascaded_union(geoms[1:])
298 diff = geoms[0].difference(sub)
299
300 if diff.is_empty:
301 raise EmptyGeometryError("diff did not return any geometry")
302
303 return GeomCoverage(diff, srs=srs, clip=clip)
304
305 def intersection_coverage(coverages, clip=None):
306 """
307 Create a coverage by creating the intersection of all `coverages`.
308 Resulting coverage is in the SRS of the first coverage.
309 """
310 srs = coverages[0].srs
311
312 coverages = [c.transform_to(srs) for c in coverages]
313
314 geoms = []
315 for c in coverages:
316 if isinstance(c, BBOXCoverage):
317 geoms.append(bbox_polygon(c.bbox))
318 else:
319 geoms.append(c.geom)
320
321 from functools import reduce # reduce is not a builtin on Python 3
322 intersection = reduce(lambda a, b: a.intersection(b), geoms)
322
323 if intersection.is_empty:
324 raise EmptyGeometryError("intersection did not return any geometry")
325
326 return GeomCoverage(intersection, srs=srs, clip=clip)
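A quick usage sketch of these helpers, with values chosen purely for illustration: two adjacent boxes are merged into one coverage, then a hole is cut out of the result:

from mapproxy.srs import SRS
from mapproxy.util.coverage import coverage, union_coverage, diff_coverage

merged = union_coverage([
    coverage([0, 0, 10, 10], SRS(4326)),
    coverage([10, 0, 20, 10], SRS(4326)),
])
ring = diff_coverage([merged, coverage([5, 2, 15, 8], SRS(4326))])
assert ring.contains((1, 1, 2, 2), SRS(4326))      # left of the hole
assert not ring.contains((6, 3, 7, 4), SRS(4326))  # inside the hole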
603603 _log('info', ' * Restarting with reloader')
604604
605605 args = [sys.executable] + sys.argv
606 # pip installs commands as .exe, but sys.argv[0]
607 # can miss the prefix. add .exe to avoid file-not-found
608 # in subprocess call
609 if os.name == 'nt' and '.' not in args[1]:
610 args[1] = args[1] + '.exe'
611
606 if os.name == 'nt':
607 # pip installs commands as .exe, but sys.argv[0]
608 # can miss the prefix.
609 # Add .exe to avoid file-not-found in subprocess call.
610 # Also, recent pip versions create .exe commands that are not
611 # executable by Python, but there is a -script.py which
612 # we need to call in this case. Check for this first.
613 if os.path.exists(args[1] + '-script.py'):
614 args[1] = args[1] + '-script.py'
615 elif not args[1].endswith('.exe'):
616 args[1] = args[1] + '.exe'
612617 new_environ = os.environ.copy()
613618 new_environ['WERKZEUG_RUN_MAIN'] = 'true'
614619
1313 md = cap.metadata()
1414 eq_(md['name'], 'OGC:WMS')
1515 eq_(md['title'], 'Omniscale OpenStreetMap WMS')
16 eq_(md['access_constraints'], 'This service is intended for private and evaluation use only. The data is licensed as Creative Commons Attribution-Share Alike 2.0 (http://creativecommons.org/licenses/by-sa/2.0/)')
16 eq_(md['access_constraints'], 'Here be dragons.')
1717 eq_(md['fees'], 'none')
1818 eq_(md['online_resource'], 'http://omniscale.de/')
1919 eq_(md['abstract'], 'Omniscale OpenStreetMap WMS (powered by MapProxy)')
2727 <ContactElectronicMailAddress>osm@omniscale.de</ContactElectronicMailAddress>
2828 </ContactInformation>
2929 <Fees>none</Fees>
30 <AccessConstraints>This service is intended for private and evaluation use only. The data is licensed as Creative Commons Attribution-Share Alike 2.0 (http://creativecommons.org/licenses/by-sa/2.0/)</AccessConstraints>
30 <AccessConstraints>Here be dragons.</AccessConstraints>
3131 </Service>
3232 <Capability>
3333 <Request>
1515 from __future__ import division, with_statement
1616
1717 import os
18 import json
1819 import codecs
1920 from functools import partial
2021 from contextlib import closing
2122
23 from mapproxy.grid import tile_grid
2224 from mapproxy.compat import string_type
2325
2426 import logging
5456
5557 Returns a list of Shapely Polygons.
5658 """
57 # check if it is a wkt file
59 # check if it is a wkt or geojson file
5860 if os.path.exists(os.path.abspath(datasource)):
5961 with open(os.path.abspath(datasource), 'rb') as fp:
6062 data = fp.read(50)
6163 if data.lower().lstrip().startswith((b'polygon', b'multipolygon')):
6264 return load_polygons(datasource)
63
65 # only load geojson directly if we don't have a filter
66 if where is None and data and data.startswith(b'{'):
67 return load_geojson(datasource)
6468 # otherwise pass to OGR
6569 return load_ogr_datasource(datasource, where=where)
6670
110114
111115 return polygons
112116
117 def load_geojson(datasource):
118 with open(datasource) as f:
119 geojson = json.load(f)
120 t = geojson.get('type')
121 if not t:
122 raise CoverageReadError("not a GeoJSON")
123 geometries = []
124 if t == 'FeatureCollection':
125 for f in geojson.get('features', []):
126 geom = f.get('geometry')
127 if geom:
128 geometries.append(geom)
129 elif t == 'Feature':
130 if 'geometry' in geojson:
131 geometries.append(geojson['geometry'])
132 elif t in ('Polygon', 'MultiPolygon'):
133 geometries.append(geojson)
134 else:
135 log_config.warn('skipping GeoJSON object of type %s from %s: '
136 'expected Feature, FeatureCollection, Polygon or MultiPolygon',
137 t, datasource)
137
138 polygons = []
139 for geom in geometries:
140 geom = shapely.geometry.asShape(geom)
141 if geom.type == 'Polygon':
142 polygons.append(geom)
143 elif geom.type == 'MultiPolygon':
144 for p in geom:
145 polygons.append(p)
146 else:
147 log_config.warn('ignoring non-polygon geometry (%s) from %s',
148 geom.type, datasource)
149
150 return polygons
151
113152 def load_polygon_lines(line_iter, source='<string>'):
114153 polygons = []
115154 for line in line_iter:
172211 transf = partial(transform_xy, from_srs, to_srs)
173212
174213 if geometry.type == 'Polygon':
175 return transform_polygon(transf, geometry)
176
177 if geometry.type == 'MultiPolygon':
178 return transform_multipolygon(transf, geometry)
179
180 raise ValueError('cannot transform %s' % geometry.type)
214 result = transform_polygon(transf, geometry)
215 elif geometry.type == 'MultiPolygon':
216 result = transform_multipolygon(transf, geometry)
217 else:
218 raise ValueError('cannot transform %s' % geometry.type)
219
220 if not result.is_valid:
221 result = result.buffer(0)
222 return result
181223
182224 def transform_polygon(transf, polygon):
183225 ext = transf(polygon.exterior.xy)
215257
216258 return []
217259
218
260 def load_expire_tiles(expire_dir, grid=None):
261 if grid is None:
262 grid = tile_grid(3857, origin='nw')
263 tiles = set()
264
265 def parse(filename):
266 with open(filename) as f:
267 try:
268 for line in f:
269 if not line:
270 continue
271 tile = tuple(map(int, line.split('/')))
272 tiles.add(tile)
273 except Exception:
274 log_config.warn('found error in %s, skipping rest of file', filename)
275
276 if os.path.isdir(expire_dir):
277 for root, dirs, files in os.walk(expire_dir):
278 for name in files:
279 filename = os.path.join(root, name)
280 parse(filename)
281 else:
282 parse(expire_dir)
283
284 boxes = []
285 for tile in tiles:
286 z, x, y = tile
287 boxes.append(shapely.geometry.box(*grid.tile_bbox((x, y, z))))
288
289 return boxes
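load_expire_tiles turns osm2pgsql-style z/x/y expiry lists into webmercator tile boxes. A short usage sketch; the file path and contents are illustrative:

with open('/tmp/expire_tiles', 'w') as f:
    f.write('4/2/5\n4/2/6\n')

boxes = load_expire_tiles('/tmp/expire_tiles')  # two shapely boxes in EPSG:3857

from shapely.ops import cascaded_union
geom = cascaded_union(boxes)  # merge adjacent tile boxes into one geometry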
6969
7070 def memoize(func):
7171 @wraps(func)
72 def wrapper(self, *args):
72 def wrapper(self, *args, **kwargs):
7373 if not hasattr(self, '__memoize_cache'):
7474 self.__memoize_cache = {}
7575 cache = self.__memoize_cache.setdefault(func, {})
76 if args not in cache:
77 cache[args] = func(self, *args)
78 return cache[args]
76 key = args + tuple(sorted(kwargs.items()))
77 if key not in cache:
78 cache[key] = func(self, *args, **kwargs)
79 return cache[key]
7980 return wrapper
8081
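With keyword arguments now part of the cache key, keyword calls are memoized as well. A small usage sketch; Grid is a hypothetical class, and note that a keyword call and the equivalent positional call produce different keys:

class Grid(object):
    @memoize
    def resolution(self, level, dpi=96.0):
        # runs once per distinct key; later identical calls hit the cache
        return 360.0 / (256 * 2 ** level) * (dpi / 96.0)

g = Grid()
g.resolution(3)            # key: (3,)
g.resolution(3)            # served from the cache
g.resolution(3, dpi=96.0)  # key: (3, ('dpi', 96.0)) -- cached separately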
1010 from scriptine.shell import backtick_, sh
1111
1212 PACKAGE_NAME = 'MapProxy'
13 REMOTE_DOC_LOCATION = 'omniscale.de:domains/mapproxy.org/docs'
14 REMOTE_REL_LOCATION = 'omniscale.de:domains/mapproxy.org/static/rel'
13 REMOTE_DOC_LOCATION = 'mapproxy.org:/opt/www/mapproxy.org/docs'
14 REMOTE_REL_LOCATION = 'mapproxy.org:/opt/www/mapproxy.org/static/rel'
1515
1616 VERSION_FILES = [
1717 ('setup.py', 'version="###"'),
7777 remote_rel_location = REMOTE_REL_LOCATION
7878 sh('scp dist/MapProxy-%(ver)s.* %(remote_rel_location)s' % locals())
7979
80 def upload_test_sdist_command():
81 date = backtick_('date +%Y%m%d').strip()
82 print('python setup.py egg_info -R -D -b ".dev%s" register -r testpypi sdist upload -r testpypi' % (date, ))
83
8084 def upload_final_sdist_command():
8185 sh('python setup.py egg_info -b "" -D sdist upload')
8286
0 WebTest==2.0.10
1 lxml==3.2.4
2 nose==1.3.0
3 Shapely==1.5.8
4 PyYAML==3.10
5 Pillow==2.8.1
6 WebOb==1.2.3
7 beautifulsoup4==4.4.0
8 coverage==3.7
9 requests==2.0.1
10 six==1.4.1
11 waitress==0.8.7
0 WebTest==2.0.25
1 lxml==3.7.3
2 nose==1.3.7
3 Shapely==1.5.17
4 PyYAML==3.12
5 Pillow==4.0.0
6 WebOb==1.7.1
7 coverage==4.3.4
8 requests==2.13.0
9 boto3==1.4.4
10 moto==0.4.31
11 eventlet==0.20.1
12 beautifulsoup4==4.5.3
13 boto==2.46.1
14 botocore==1.5.14
15 docutils==0.13.1
16 enum-compat==0.0.2
17 futures==3.0.5
18 greenlet==0.4.12
19 httpretty==0.8.10
20 Jinja2==2.9.5
21 jmespath==0.9.1
22 MarkupSafe==0.23
23 olefile==0.44
24 python-dateutil==2.6.0
25 pytz==2016.10
26 s3transfer==0.1.10
27 six==1.10.0
28 waitress==1.0.2
29 Werkzeug==0.11.15
30 xmltodict==0.10.2
31 redis==2.10.5
5353
5454 setup(
5555 name='MapProxy',
56 version="1.8.2a0",
56 version="1.10.0a0",
5757 description='An accelerating proxy for web map services',
5858 long_description=long_description(7),
5959 author='Oliver Tonnhofer',
3131 sphinx-build -b html -d {envtmpdir}/doctrees . {envtmpdir}/html
3232 sphinx-build -b latex -d {envtmpdir}/doctrees . {envtmpdir}/latex
3333 make -C {envtmpdir}/latex all-pdf
34 rsync -a --delete-after {envtmpdir}/html/ {envtmpdir}/latex/MapProxy.pdf ssh-226270-upload@mapproxy.org:domains/mapproxy.org/docs/nightly/
34 rsync -a --delete-after {envtmpdir}/html/ {envtmpdir}/latex/MapProxy.pdf os@mapproxy.org:/opt/www/mapproxy.org/docs/nightly/