Codebase list drms / abb409e
Update upstream source from tag 'upstream/0.6.3'
Update to upstream version '0.6.3' with Debian dir f242a771372a44bac05774b2b7305eba2824f029
Ole Streicher (1 year, 6 months ago)
55 changed file(s) with 1356 addition(s) and 1303 deletion(s).
0 # readthedocs.yml
1 # Read the Docs configuration file
2 # See https://docs.readthedocs.io/en/stable/config-file/v2.html for details
3
40 version: 2
1 build:
2 os: ubuntu-20.04
3 tools:
4 python: "3.9"
5 apt_packages:
6 - graphviz
57
68 sphinx:
79 builder: html
810 configuration: docs/conf.py
9 fail_on_warning: true
11 fail_on_warning: false
1012
1113 python:
1214 install:
13 - method: pip
14 extra_requirements:
15 - dev
16 path: .
15 - method: pip
16 extra_requirements:
17 - all
18 - docs
19 path: .
0 0.6.3 (2022-10-13)
1 ==================
2
3 Bug Fixes
4 ---------
5
6 - Updated indexing in a function to prevent FutureWarnings from pandas. (`#73 <https://github.com/sunpy/drms/pull/73>`__)
7
8
9 Trivial/Internal Changes
10 ------------------------
11
12 - Updated the init of `drms.json.HttpJsonRequest` to raise a nicer message if the URL fails to open. (`#76 <https://github.com/sunpy/drms/pull/76>`__)
13
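For context, a minimal sketch of the error-wrapping pattern that entry describes; this is illustrative only and not the actual ``drms.json.HttpJsonRequest`` code:

.. code-block:: python

    import urllib.error
    import urllib.request

    def _open_url_with_nicer_error(url):
        # Sketch: surface the offending URL in the error message instead of
        # letting the bare urllib failure propagate on its own.
        try:
            return urllib.request.urlopen(url)
        except urllib.error.URLError as exc:
            raise urllib.error.URLError(f"Failed to open URL {url!r}: {exc}") from exc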
14
015 0.6.2 (2021-05-15)
116 ==================
217
0 Copyright (c) 2013-2021 The SunPy developers
0 Copyright (c) 2013-2022 The SunPy developers
11 All rights reserved.
22
33 Redistribution and use in source and binary forms, with or without
00 Metadata-Version: 2.1
11 Name: drms
2 Version: 0.6.2
3 Summary: "Access HMI, AIA and MDI data with Python from the public JSOC DRMS server"
2 Version: 0.6.3
3 Summary: Access HMI, AIA and MDI data with Python from the Stanford JSOC DRMS
44 Home-page: https://sunpy.org
55 Author: The SunPy Community
66 Author-email: sunpy@googlegroups.com
77 License: BSD 2-Clause
8 Description: ====
9 drms
10 ====
11
12 `Docs <https://docs.sunpy.org/projects/drms/>`__ |
13 `Tutorial <https://docs.sunpy.org/projects/drms/en/latest/tutorial.html>`__ |
14 `Github <https://github.com/sunpy/drms>`__ |
15 `PyPI <https://pypi.python.org/pypi/drms>`__
16
17 |JOSS| |Zenodo|
18
19 .. |JOSS| image:: https://joss.theoj.org/papers/10.21105/joss.01614/status.svg
20 :target: https://doi.org/10.21105/joss.01614
21 .. |Zenodo| image:: https://zenodo.org/badge/58651845.svg
22 :target: https://zenodo.org/badge/latestdoi/58651845
23
24 The ``drms`` module provides an easy-to-use interface for accessing HMI, AIA and MDI data with Python.
25 It uses the publicly accessible `JSOC <http://jsoc.stanford.edu/>`__ DRMS server by default, but can also be used with local `NetDRMS <http://jsoc.stanford.edu/netdrms/>`__ sites.
26 More information, including a detailed tutorial, is available in the `Documentation <https://docs.sunpy.org/projects/drms/>`__.
27
28 Getting Help
29 ------------
30
31 This is a SunPy-affiliated package. For more information or to ask questions
32 about drms or SunPy, check out:
33
34 - `drms Documentation`_
35 - `SunPy Matrix Channel`_
36 - `SunPy mailing list`_
37
38 .. _drms Documentation: https://docs.sunpy.org/projects/drms/en/latest/
39 .. _SunPy Matrix Channel: https://riot.im/app/#/room/#sunpy:matrix.org
40
41 Contributing
42 ------------
43
44 If you would like to get involved, start by joining the `SunPy mailing list`_ and check out the `Developers Guide`_ section of the SunPy docs.
45 Stop by our chat room `#sunpy:matrix.org`_ if you have any questions.
46 Help is always welcome so let us know what you like to work on, or check out the `issues page`_ for the list of known outstanding items.
47
48 For more information on contributing to SunPy and to DRMS, please read our `Newcomers guide`_.
49
50 .. _SunPy mailing list: https://groups.google.com/forum/#!forum/sunpy
51 .. _Developers Guide: https://docs.sunpy.org/en/latest/dev_guide/index.html
52 .. _`#sunpy:matrix.org`: https://app.element.io/#/room/#sunpy:openastronomy.org
53 .. _issues page: https://github.com/sunpy/drms/issues
54 .. _Newcomers guide: https://docs.sunpy.org/en/latest/dev_guide/newcomers.html
55
56 Code of Conduct (CoC)
57 ---------------------
58
59 When you are interacting with the SunPy community you are asked to follow our `code of conduct`_.
60
61 .. _code of conduct: https://docs.sunpy.org/en/latest/code_of_conduct.html
62
63 Citation
64 --------
65
66 If you use ``drms`` in your work, please consider citing our `paper`_.
67
68 .. code :: bibtex
69
70 @article{Glogowski2019,
71 doi = {10.21105/joss.01614},
72 url = {https://doi.org/10.21105/joss.01614},
73 year = {2019},
74 publisher = {The Open Journal},
75 volume = {4},
76 number = {40},
77 pages = {1614},
78 author = {Kolja Glogowski and Monica G. Bobra and Nitin Choudhary and Arthur B. Amezcua and Stuart J. Mumford},
79 title = {drms: A Python package for accessing HMI and AIA data},
80 journal = {Journal of Open Source Software}
81 }
82
83 .. _paper: https://doi.org/10.21105/joss.01614
84
85 Acknowledgements
86 ----------------
87
88 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
89
908 Keywords: solar physics,solar,science,data
919 Platform: any
9210 Classifier: Development Status :: 5 - Production/Stable
9614 Classifier: Operating System :: OS Independent
9715 Classifier: Programming Language :: Python
9816 Classifier: Programming Language :: Python :: 3
99 Classifier: Programming Language :: Python :: 3.7
10017 Classifier: Programming Language :: Python :: 3.8
10118 Classifier: Programming Language :: Python :: 3.9
19 Classifier: Programming Language :: Python :: 3.10
10220 Classifier: Topic :: Scientific/Engineering :: Astronomy
10321 Classifier: Topic :: Scientific/Engineering :: Physics
10422 Provides: drms
105 Requires-Python: >=3.7
23 Requires-Python: >=3.8
10624 Description-Content-Type: text/x-rst
10725 Provides-Extra: tests
10826 Provides-Extra: docs
10927 Provides-Extra: dev
11028 Provides-Extra: all
29 License-File: LICENSE.rst
30
31 ====
32 drms
33 ====
34
35 `Docs <https://docs.sunpy.org/projects/drms/>`__ |
36 `Tutorial <https://docs.sunpy.org/projects/drms/en/latest/tutorial.html>`__ |
37 `Github <https://github.com/sunpy/drms>`__ |
38 `PyPI <https://pypi.org/project/drms/>`__
39
40 |JOSS| |Zenodo|
41
42 .. |JOSS| image:: https://joss.theoj.org/papers/10.21105/joss.01614/status.svg
43 :target: https://doi.org/10.21105/joss.01614
44 .. |Zenodo| image:: https://zenodo.org/badge/58651845.svg
45 :target: https://zenodo.org/badge/latestdoi/58651845
46
47 The ``drms`` module provides an easy-to-use interface for accessing HMI, AIA and MDI data with Python.
48 It uses the publicly accessible `JSOC <http://jsoc.stanford.edu/>`__ DRMS server by default, but can also be used with local `NetDRMS <http://jsoc.stanford.edu/netdrms/>`__ sites.
49 More information, including a detailed tutorial, is available in the `Documentation <https://docs.sunpy.org/projects/drms/>`__.
50
51 Getting Help
52 ------------
53 For more information or to ask questions about ``drms``, check out:
54
55 - `drms Documentation <https://docs.sunpy.org/projects/drms/en/latest/>`__
56
57 .. _drms Documentation: https://docs.sunpy.org/projects/drms/en/latest/
58
59 Contributing
60 ------------
61 If you would like to get involved, start by joining the `SunPy Chat`_ and read our `Newcomers' guide <https://docs.sunpy.org/en/latest/dev_guide/newcomers.html>`__.
62 Help is always welcome, so let us know what you would like to work on, or check out the `issues page <https://github.com/sunpy/drms/issues>`__ for a list of known outstanding items.
63
64 Citation
65 --------
66 If you use ``drms`` in your work, please cite our `paper <https://doi.org/10.21105/joss.01614>`__.
67
68 .. code :: bibtex
69
70 @article{Glogowski2019,
71 doi = {10.21105/joss.01614},
72 url = {https://doi.org/10.21105/joss.01614},
73 year = {2019},
74 publisher = {The Open Journal},
75 volume = {4},
76 number = {40},
77 pages = {1614},
78 author = {Kolja Glogowski and Monica G. Bobra and Nitin Choudhary and Arthur B. Amezcua and Stuart J. Mumford},
79 title = {drms: A Python package for accessing HMI and AIA data},
80 journal = {Journal of Open Source Software}
81 }
82
83 Code of Conduct (CoC)
84 ---------------------
85
86 When you are interacting with the SunPy community you are asked to follow our `code of conduct <https://docs.sunpy.org/en/latest/code_of_conduct.html>`__.
87
88 Acknowledgements
89 ----------------
90 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
91
92 .. _SunPy Chat: https://openastronomy.element.io/#/room/#sunpy:openastronomy.org
44 `Docs <https://docs.sunpy.org/projects/drms/>`__ |
55 `Tutorial <https://docs.sunpy.org/projects/drms/en/latest/tutorial.html>`__ |
66 `Github <https://github.com/sunpy/drms>`__ |
7 `PyPI <https://pypi.python.org/pypi/drms>`__
7 `PyPI <https://pypi.org/project/drms/>`__
88
99 |JOSS| |Zenodo|
1010
1919
2020 Getting Help
2121 ------------
22 For more information or to ask questions about ``drms``, check out:
2223
23 This is a SunPy-affiliated package. For more information or to ask questions
24 about drms or SunPy, check out:
25
26 - `drms Documentation`_
27 - `SunPy Matrix Channel`_
28 - `SunPy mailing list`_
24 - `drms Documentation <https://docs.sunpy.org/projects/drms/en/latest/>`__
2925
3026 .. _drms Documentation: https://docs.sunpy.org/projects/drms/en/latest/
31 .. _SunPy Matrix Channel: https://riot.im/app/#/room/#sunpy:matrix.org
3227
3328 Contributing
3429 ------------
35
36 If you would like to get involved, start by joining the `SunPy mailing list`_ and check out the `Developers Guide`_ section of the SunPy docs.
37 Stop by our chat room `#sunpy:matrix.org`_ if you have any questions.
38 Help is always welcome so let us know what you like to work on, or check out the `issues page`_ for the list of known outstanding items.
39
40 For more information on contributing to SunPy and to DRMS, please read our `Newcomers guide`_.
41
42 .. _SunPy mailing list: https://groups.google.com/forum/#!forum/sunpy
43 .. _Developers Guide: https://docs.sunpy.org/en/latest/dev_guide/index.html
44 .. _`#sunpy:matrix.org`: https://app.element.io/#/room/#sunpy:openastronomy.org
45 .. _issues page: https://github.com/sunpy/drms/issues
46 .. _Newcomers guide: https://docs.sunpy.org/en/latest/dev_guide/newcomers.html
47
48 Code of Conduct (CoC)
49 ---------------------
50
51 When you are interacting with the SunPy community you are asked to follow our `code of conduct`_.
52
53 .. _code of conduct: https://docs.sunpy.org/en/latest/code_of_conduct.html
30 If you would like to get involved, start by joining the `SunPy Chat`_ and read our `Newcomers' guide <https://docs.sunpy.org/en/latest/dev_guide/newcomers.html>`__.
31 Help is always welcome, so let us know what you would like to work on, or check out the `issues page <https://github.com/sunpy/drms/issues>`__ for a list of known outstanding items.
5432
5533 Citation
5634 --------
57
58 If you use ``drms`` in your work, please consider citing our `paper`_.
35 If you use ``drms`` in your work, please cite our `paper <https://doi.org/10.21105/joss.01614>`__.
5936
6037 .. code :: bibtex
6138
7249 journal = {Journal of Open Source Software}
7350 }
7451
75 .. _paper: https://doi.org/10.21105/joss.01614
52 Code of Conduct (CoC)
53 ---------------------
54
55 When you are interacting with the SunPy community you are asked to follow our `code of conduct <https://docs.sunpy.org/en/latest/code_of_conduct.html>`__.
7656
7757 Acknowledgements
7858 ----------------
59 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
7960
80 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
61 .. _SunPy Chat: https://openastronomy.element.io/#/room/#sunpy:openastronomy.org
55 import drms
66
77 # Test URLs, used to check if an online site is reachable
8 jsoc_testurl = 'http://jsoc.stanford.edu/'
9 kis_testurl = 'http://drms.leibniz-kis.de/'
8 jsoc_testurl = "http://jsoc.stanford.edu/"
9 kis_testurl = "http://drms.leibniz-kis.de/"
1010
1111
1212 def pytest_addoption(parser):
13 parser.addoption('--email', help='Export email address')
13 parser.addoption("--email", help="Export email address")
1414
1515
1616 class lazily_cached:
2222 self.func = lambda: f(*args, **kwargs)
2323
2424 def __call__(self):
25 if not hasattr(self, 'result'):
25 if not hasattr(self, "result"):
2626 self.result = self.func()
2727 return self.result
2828
4545
4646 def pytest_runtest_setup(item):
4747 # Skip JSOC online site tests if the site is not reachable.
48 if item.get_closest_marker('jsoc') is not None:
48 if item.get_closest_marker("jsoc") is not None:
4949 if not jsoc_reachable():
50 pytest.skip('JSOC is not reachable')
50 pytest.skip("JSOC is not reachable")
5151
5252 # Skip KIS online site tests if the site is not reachable.
53 if item.get_closest_marker('kis') is not None:
53 if item.get_closest_marker("kis") is not None:
5454 if not kis_reachable():
55 pytest.skip('KIS is not reachable')
55 pytest.skip("KIS is not reachable")
5656
5757 # Skip export tests if no email address was specified.
58 if item.get_closest_marker('export') is not None:
59 email = item.config.getoption('email')
58 if item.get_closest_marker("export") is not None:
59 email = item.config.getoption("email")
6060 if email is None:
61 pytest.skip('No email address specified; use the --email option to enable export tests')
61 pytest.skip("No email address specified; use the --email option to enable export tests")
6262
6363
6464 @pytest.fixture
6666 """
6767 Email address from --email command line option.
6868 """
69 return request.config.getoption('--email')
69 return request.config.getoption("--email")
7070
7171
7272 @pytest.fixture
7474 """
7575 Client fixture for JSOC online tests, does not use email.
7676 """
77 return drms.Client('jsoc')
77 return drms.Client("jsoc")
7878
7979
8080 @pytest.fixture
8282 """
8383 Client fixture for JSOC online tests, uses email if specified.
8484 """
85 return drms.Client('jsoc', email=email)
85 return drms.Client("jsoc", email=email)
8686
8787
8888 @pytest.fixture
9090 """
9191 Client fixture for KIS online tests.
9292 """
93 return drms.Client('kis')
93 return drms.Client("kis")
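For orientation, this is roughly how a test module consumes the fixtures and markers defined in the ``conftest.py`` above; the fixture names (``jsoc_client``, ``jsoc_client_export``) are assumed from the docstrings, and the tests themselves are hypothetical:

.. code-block:: python

    import pytest

    @pytest.mark.jsoc  # pytest_runtest_setup() skips this test if JSOC is unreachable
    def test_series_listing(jsoc_client):
        assert isinstance(jsoc_client.series(r"hmi\.m_"), list)

    @pytest.mark.jsoc
    @pytest.mark.export  # skipped unless pytest is run with --email
    def test_export_client(jsoc_client_export):
        assert jsoc_client_export is not None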
22 *************
33 API Reference
44 *************
5
6 .. automodapi:: drms
57
68 .. automodapi:: drms.client
79
0 #
10 # Configuration file for the Sphinx documentation builder.
2 #
3 # This file does only contain a selection of the most common options. For a
4 # full list see the documentation:
5 # http://www.sphinx-doc.org/en/master/config
6
71
82 # -- Project information -----------------------------------------------------
93
104 import os
115 from drms import __version__
126
13 project = 'drms'
14 copyright = '2021, The SunPy Developers'
15 author = 'The SunPy Developers'
7 project = "drms"
8 copyright = "2022, The SunPy Developers"
9 author = "The SunPy Developers"
1610
1711 # The full version, including alpha/beta/rc tags
1812 release = __version__
19 is_development = '.dev' in __version__
13 is_development = ".dev" in __version__
2014
2115 # -- General configuration ---------------------------------------------------
2216
2418 # extensions coming with Sphinx (named 'sphinx.ext.*') or your custom
2519 # ones.
2620 extensions = [
27 'sphinx_automodapi.automodapi',
28 'sphinx_automodapi.smart_resolver',
29 'sphinx_gallery.gen_gallery',
30 'sphinx.ext.autodoc',
31 'sphinx.ext.coverage',
32 'sphinx.ext.doctest',
33 'sphinx.ext.inheritance_diagram',
34 'sphinx.ext.intersphinx',
35 'sphinx.ext.mathjax',
36 'sphinx.ext.napoleon',
37 'sphinx.ext.todo',
38 'sphinx.ext.viewcode',
39 'sphinx_changelog',
21 "sphinx_automodapi.automodapi",
22 "sphinx_automodapi.smart_resolver",
23 "sphinx_gallery.gen_gallery",
24 "sphinx.ext.autodoc",
25 "sphinx.ext.coverage",
26 "sphinx.ext.doctest",
27 "sphinx.ext.inheritance_diagram",
28 "sphinx.ext.intersphinx",
29 "sphinx.ext.mathjax",
30 "sphinx.ext.napoleon",
31 "sphinx.ext.todo",
32 "sphinx.ext.viewcode",
33 "sphinx_changelog",
4034 ]
4135 numpydoc_show_class_members = False
4236
4640 # List of patterns, relative to source directory, that match files and
4741 # directories to ignore when looking for source files.
4842 # This pattern also affects html_static_path and html_extra_path.
49 exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']
43 exclude_patterns = ["_build", "Thumbs.db", ".DS_Store"]
5044
5145 # The suffix(es) of source filenames.
5246 # You can specify multiple suffix as a list of string:
53 source_suffix = '.rst'
47 source_suffix = ".rst"
5448
5549 # The master toctree document.
56 master_doc = 'index'
50 master_doc = "index"
5751
5852 # The reST default role (used for this markup: `text`) to use for all
5953 # documents. Set to the "smart" one.
60 default_role = 'obj'
54 default_role = "obj"
6155
6256 # Enable nitpicky mode, which forces links to be non-broken
6357 nitpicky = True
6458 nitpick_ignore = [
65 ('py:obj', 'numpy.datetime64'),
59 ("py:obj", "numpy.datetime64"),
6660 # See https://github.com/numpy/numpy/issues/10039
6761 ]
6862
69 # Example configuration for intersphinx: refer to the Python standard library.
63 # -- Options for intersphinx extension -----------------------------------------
7064 intersphinx_mapping = {
71 'python': ('https://docs.python.org/3/', (None, 'http://data.astropy.org/intersphinx/python3.inv'),),
72 'pandas': ('http://pandas.pydata.org/pandas-docs/stable/', None),
73 'numpy': ('https://docs.scipy.org/doc/numpy/', (None, 'http://data.astropy.org/intersphinx/numpy.inv'),),
74 'scipy': (
75 'https://docs.scipy.org/doc/scipy/reference/',
76 (None, 'http://data.astropy.org/intersphinx/scipy.inv'),
77 ),
78 'matplotlib': ('https://matplotlib.org/', (None, 'http://data.astropy.org/intersphinx/matplotlib.inv'),),
79 'astropy': ('http://docs.astropy.org/en/stable/', None),
80 'sunpy': ('https://docs.sunpy.org/en/stable/', None),
65 "python": ("https://docs.python.org/3/", None),
66 "numpy": ("https://numpy.org/doc/stable/", None),
67 "astropy": ("https://docs.astropy.org/en/stable/", None),
68 "pandas": ("https://pandas.pydata.org/pandas-docs/stable/", None),
69 "sunpy": ("https://docs.sunpy.org/en/stable/", None),
8170 }
8271
8372 # -- Options for HTML output -------------------------------------------------
8877 try:
8978 from sunpy_sphinx_theme.conf import *
9079 except ImportError:
91 html_theme = 'default'
80 html_theme = "default"
9281
9382 # JSOC email os env
94 os.environ["JSOC_EMAIL"] = "jsoc@cadair.com"
83 os.environ["JSOC_EMAIL"] = "jsoc@sunpy.org"
9584
9685 # Add any paths that contain custom static files (such as style sheets) here,
9786 # relative to this directory. They are copied after the builtin static files,
9988 # html_static_path = ['_static']
10089
10190 # Render inheritance diagrams in SVG
102 graphviz_output_format = 'svg'
91 graphviz_output_format = "svg"
10392
10493 graphviz_dot_args = [
105 '-Nfontsize=10',
106 '-Nfontname=Helvetica Neue, Helvetica, Arial, sans-serif',
107 '-Efontsize=10',
108 '-Efontname=Helvetica Neue, Helvetica, Arial, sans-serif',
109 '-Gfontsize=10',
110 '-Gfontname=Helvetica Neue, Helvetica, Arial, sans-serif',
94 "-Nfontsize=10",
95 "-Nfontname=Helvetica Neue, Helvetica, Arial, sans-serif",
96 "-Efontsize=10",
97 "-Efontname=Helvetica Neue, Helvetica, Arial, sans-serif",
98 "-Gfontsize=10",
99 "-Gfontname=Helvetica Neue, Helvetica, Arial, sans-serif",
111100 ]
112101
113102 # -- Sphinx Gallery ------------------------------------------------------------
114103 from sphinx_gallery.sorting import ExampleTitleSortKey
115104
116105 sphinx_gallery_conf = {
117 'backreferences_dir': os.path.join('generated', 'modules'),
118 'filename_pattern': '^((?!skip_).)*$',
119 'examples_dirs': os.path.join('..', 'examples'),
120 'within_subsection_order': ExampleTitleSortKey,
121 'gallery_dirs': os.path.join('generated', 'gallery'),
106 "backreferences_dir": os.path.join("generated", "modules"),
107 "filename_pattern": "^((?!skip_).)*$",
108 "examples_dirs": os.path.join("..", "examples"),
109 "within_subsection_order": ExampleTitleSortKey,
110 "gallery_dirs": os.path.join("generated", "gallery"),
122111 # Comes from the theme.
123 'default_thumb_file': os.path.join(html_static_path[0], 'img', 'sunpy_icon_128x128.png'),
124 'abort_on_example_error': False,
125 'only_warn_on_example_error': True,
126 'plot_gallery': True,
127 'remove_config_comments': True,
128 'doc_module': ('sunpy'),
112 "default_thumb_file": os.path.join(html_static_path[0], "img", "sunpy_icon_128x128.png"),
113 "abort_on_example_error": False,
114 "only_warn_on_example_error": True,
115 "plot_gallery": True,
116 "remove_config_comments": True,
117 "doc_module": ("sunpy"),
129118 }
0 ##################
0 ******************
11 drms documentation
2 ##################
2 ******************
33
44 :Github: https://github.com/sunpy/drms
5 :PyPI: https://pypi.python.org/pypi/drms
5 :PyPI: https://pypi.org/project/drms/
66
7 Python library for accessing HMI, AIA and MDI data.
7 Python library for accessing HMI, AIA and MDI data from the Joint Science Operations Center (JSOC) or other Data Record Management System (DRMS) servers.
88
99 .. toctree::
1010 :maxdepth: 2
0 ************
01 Introduction
1 ============
2
2 ************
33 The ``drms`` Python package can be used to access HMI, AIA and MDI data which are stored in a DRMS database system.
44
5 DRMS stands for *Data Record Management System* and is a system that was developed by the `Joint Science Operation Center <http://jsoc.stanford.edu/>`__ (JSOC), headquartered at Stanford University, to handle the data produced by the AIA and HMI instruments aboard the `Solar Dynamics Observatory <http://sdo.gsfc.nasa.gov/>`__ spacecraft.
5 DRMS stands for *Data Record Management System* and is a system that was developed by the `Joint Science Operations Center <http://jsoc.stanford.edu/>`__ (JSOC), headquartered at Stanford University, to handle the data produced by the AIA and HMI instruments aboard the `Solar Dynamics Observatory <https://sdo.gsfc.nasa.gov/>`__ spacecraft.
66
77 By default the ``drms`` library uses the HTTP/JSON interface provided by JSOC and has similar functionality to the `JSOC Lookdata <http://jsoc.stanford.edu/ajax/lookdata.html>`__ website.
88 It can be used to query metadata, submit data export requests and download data files.
1010 This module also works well for local `NetDRMS <http://jsoc.stanford.edu/netdrms/>`__ sites, as long as the site runs a web server providing the needed CGI programs ``show_series`` and ``jsoc_info`` (for the data export functionality, additional CGIs, like ``jsoc_fetch``, are needed).
1111
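As a quick illustration of the metadata-query workflow described above (the same calls are covered in depth in the tutorial):

.. code-block:: python

    >>> import drms
    >>> client = drms.Client()  # uses the public JSOC server by default
    >>> series_list = client.series(r'hmi\.m_')  # doctest: +REMOTE_DATA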
1212 Requirements
13 ------------
14
13 ============
1514 The ``drms`` module supports Python 3.7 or newer.
1615 It requires the following Python packages:
1716
1918 - pandas
2019
2120 Installation
22 ------------
21 ============
22 If you are using `miniforge`_ (which is ``conda`` preconfigured to use the conda-forge channel):
2323
24 If you are using `Anaconda`_, it is recommended to use the `conda-forge`_ package::
24 .. code-block:: bash
2525
26 conda config --append channels conda-forge
2726 conda install drms
2827
29 Otherwise the ``drms`` Python package can be installed from `PyPI`_ using::
28 Otherwise the ``drms`` Python package can be installed from `PyPI`_ using:
29
30 .. code-block:: bash
3031
3132 pip install drms
3233
3334 .. note::
34 If you do not use a Python distribution, like `Anaconda`_,
35 If you do not use a Python distribution, like `miniforge`_,
3536 and did not create an isolated Python environment using `Virtualenv`_,
36 you might need to add ``--user`` to the ``pip`` command::
37 you might need to add ``--user`` to the ``pip`` command:
3738
38 pip install --user drms
39 .. code-block:: bash
3940
40 .. _PyPI: https://pypi.python.org/pypi/drms
41 pip install --user drms
42
43 .. _PyPI: https://pypi.org/project/drms/
4144 .. _conda-forge: https://anaconda.org/conda-forge/drms
42 .. _Anaconda: https://www.anaconda.com/distribution/
43 .. _Virtualenv: https://virtualenv.pypa.io
45 .. _miniforge: https://github.com/conda-forge/miniforge#miniforge3
46 .. _Virtualenv: https://virtualenv.pypa.io/en/latest/
4447
4548 Acknowledgements
46 ----------------
47
49 ================
4850 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
00 .. _tutorial:
11
2 ********
23 Tutorial
3 ========
4
4 ********
55 This tutorial gives an introduction on how to use the ``drms`` Python library.
66 More detailed information on the different classes and functions can be found in the :ref:`API Reference <reference>`.
77
88 Basic usage
9 -----------
10
11 In this first part, we start with looking at data series that are available from `JSOC <http://jsoc.stanford.edu/>`__ and perform some basic DRMS queries to obtain keyword data (metadata) and segment file (data) locations.
9 ===========
10 We start by looking at the data series that are available from `JSOC <http://jsoc.stanford.edu/>`__ and perform some basic DRMS queries to obtain keyword data (metadata) and segment file (data) locations.
1211 This is essentially what you can do on the `JSOC Lookdata <http://jsoc.stanford.edu/ajax/lookdata.html>`__ website.
1312
14 To be able to access the JSOC DRMS from Python, we first need to import the ``drms`` module and create an instance of the `~drms.client.Client` class::
13 To be able to access the JSOC DRMS from Python, we first need to import the ``drms`` module and create an instance of the `~drms.client.Client` class:
14
15 .. code-block:: python
1516
1617 >>> import drms
1718 >>> client = drms.Client() # doctest: +REMOTE_DATA
1819
19 All available data series can be now retrieved by calling the :meth:`drms.client.Client.series` method.
20 All available data series can now be retrieved by calling :meth:`drms.client.Client.series`.
2021 HMI series names start with ``"hmi."``, AIA series names with ``"aia."`` and the names of MDI series with ``"mdi."``.
2122
2223 The first (optional) parameter of this method takes a regular expression that allows you to filter the result.
23 If you, for example, want to obtain a list of HMI series, with a name that start with the string ``"m_"``, you can write::
24 If, for example, you want to obtain a list of HMI series whose names start with the string ``"m_"``, you can write:
25
26 .. code-block:: python
2427
2528 >>> client.series(r'hmi\.m_') # doctest: +REMOTE_DATA
2629 ['hmi.M_45s', 'hmi.M_45s_dcon', 'hmi.M_720s', 'hmi.M_720s_dcon', 'hmi.M_720s_dconS', 'hmi.m_720s_mod', 'hmi.m_720s_nrt']
3134 DRMS records can be selected by creating a query string that contains a series name, followed by one or more fields, which are surrounded by square brackets.
3235 Each of those fields corresponds to a specific primekey that is specified in the series definition.
3336 A complete set of primekeys represents a unique identifier for a record in that particular series.
34 For more detailed information on building record set queries, including additional non-primekey fields, see the `JSOC Help <http://jsoc.stanford.edu/ajax/RecordSetHelp.html>`__ page about this topiclient.
35
36 With the ``drms`` module you can use the :meth:`drms.client.Client.pkeys` method to obtain a list of all primekeys of a series, e.g.::
37 For more detailed information on building record set queries, including additional non-primekey fields, see the `JSOC Help <http://jsoc.stanford.edu/ajax/RecordSetHelp.html>`__ page about this topic.
38
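To make that anatomy concrete, here is a hypothetical record-set string for ``hmi.m_720s`` (whose primekeys, ``T_REC`` and ``CAMERA``, are listed just below); the two bracketed fields select one day of records from camera 2:

.. code-block:: python

    >>> # series[primekey1][primekey2] -- illustrative only, not executed here
    >>> record_set = 'hmi.m_720s[2016.04.01_TAI/1d][2]'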
39 With the ``drms`` module you can use :meth:`drms.client.Client.pkeys` to obtain a list of all primekeys of a series:
40
41 .. code-block:: python
3742
3843 >>> client.pkeys('hmi.m_720s') # doctest: +REMOTE_DATA
3944 ['T_REC', 'CAMERA']
40
4145 >>> client.pkeys('hmi.v_sht_modes') # doctest: +REMOTE_DATA
4246 ['T_START', 'LMIN', 'LMAX', 'NDT']
4347
4448 A list of all (regular) keywords can be obtained using :meth:`drms.client.Client.keys`.
45 You can also use the method :meth:`drms.client.Client.info` to get more detailed information about a series, e.g.::
46
47 >>> si = client.info('hmi.v_avg120') # doctest: +REMOTE_DATA
48 >>> si.segments # doctest: +REMOTE_DATA
49 You can also use :meth:`drms.client.Client.info` to get more detailed information about a series:
50
51 .. code-block:: python
52
53 >>> series_info = client.info('hmi.v_avg120') # doctest: +REMOTE_DATA
54 >>> series_info.segments # doctest: +REMOTE_DATA
4955 type units protocol dims note
5056 name
5157 mean short m/s fits 4096x4096 Doppler mean
5359 valid short NA fits 4096x4096 valid pixel count
5460 Log char NA generic run log
5561
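The plain keyword list mentioned above is retrieved in the same way; a short sketch (the returned list is long, so its output is omitted):

.. code-block:: python

    >>> keyword_names = client.keys('hmi.v_avg120')  # doctest: +REMOTE_DATA
    >>> # keyword_names is a plain Python list of keyword name strings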
56 All table-like structures, returned by routines in the ``drms`` module, are `Pandas DataFrames <http://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.html>`__.
57 If you are new to `Pandas <http://pandas.pydata.org/>`__, you should have a look at the introduction to `Pandas Data Structures <http://pandas.pydata.org/pandas-docs/stable/dsintro.html>`__.
58
59 Record set queries, used to obtain keyword data and get the location of data segments, can be performed using the :meth:`drms.client.Client.query` method.
60 To get, for example, the record time and the mean value for some of the HMI Dopplergrams that were recorded on April 1, 2016, together with the spacecraft's radial velocity in respect to the Sun, you can write::
61
62 >>> k = client.query('hmi.v_45s[2016.04.01_TAI/1d@6h]',
62 All table-like structures, returned by routines in the ``drms`` module, are `Pandas DataFrames <https://pandas.pydata.org/pandas-docs/stable/generated/pandas.DataFrame.html>`__.
63 If you are new to `Pandas <https://pandas.pydata.org/>`__, you should have a look at the introduction to `Pandas Data Structures <https://pandas.pydata.org/pandas-docs/stable/dsintro.html>`__.
64
65 Record set queries, used to obtain keyword data and get the location of data segments, can be performed using :meth:`drms.client.Client.query`.
66 To get, for example, the record time and the mean value for some of the HMI Dopplergrams that were recorded on April 1, 2016, together with the spacecraft's radial velocity with respect to the Sun, you can write:
67
68 .. code-block:: python
69
70 >>> query = client.query('hmi.v_45s[2016.04.01_TAI/1d@6h]',
6371 ... key='T_REC, DATAMEAN, OBS_VR') # doctest: +REMOTE_DATA
64 >>> k # doctest: +REMOTE_DATA
72 >>> query # doctest: +REMOTE_DATA
6573 T_REC DATAMEAN OBS_VR
6674 0 2016.04.01_00:00:00_TAI 3313.104980 3309.268006
6775 1 2016.04.01_06:00:00_TAI 878.075195 887.864139
6876 2 2016.04.01_12:00:00_TAI -2289.062500 -2284.690263
6977 3 2016.04.01_18:00:00_TAI 128.609283 137.836168
7078
71 JSOC time strings can be converted to a naive ``datetime``
72 representation using the :meth:`drms.utils.to_datetime` utility function::
73
74 >>> t = drms.to_datetime(k.T_REC) # doctest: +REMOTE_DATA
75 >>> t # doctest: +REMOTE_DATA
79 JSOC time strings can be converted to a naive `~datetime.datetime` representation using :meth:`drms.utils.to_datetime`:
80
81 .. code-block:: python
82
83 >>> timestamps = drms.to_datetime(query.T_REC) # doctest: +REMOTE_DATA
84 >>> timestamps # doctest: +REMOTE_DATA
7685 0 2016-04-01 00:00:00
7786 1 2016-04-01 06:00:00
7887 2 2016-04-01 12:00:00
8190
8291 For most of the HMI and MDI data sets, the `TAI <https://en.wikipedia.org/wiki/International_Atomic_Time>`__ time standard is used which, in contrast to `UTC <https://en.wikipedia.org/wiki/Coordinated_Universal_Time>`__, does not make use of any leap seconds.
8392 The TAI standard is currently not supported by the Python standard libraries.
84 If you need to convert timestamps between TAI and UTC, you can use the `Astropy <http://www.astropy.org/>`__ time module::
93 If you need to convert timestamps between TAI and UTC, you can use `Astropy <https://www.astropy.org/>`__:
94
95 .. code-block:: python
8596
8697 >>> from astropy.time import Time
87 >>> ta = Time(t[0], format='datetime', scale='tai') # doctest: +REMOTE_DATA
88 >>> ta # doctest: +REMOTE_DATA
98 >>> start_time = Time(timestamps[0], format='datetime', scale='tai') # doctest: +REMOTE_DATA
99 >>> start_time # doctest: +REMOTE_DATA
89100 <Time object: scale='tai' format='datetime' value=2016-04-01 00:00:00>
90 >>> ta.utc # doctest: +REMOTE_DATA
101 >>> start_time.utc # doctest: +REMOTE_DATA
91102 <Time object: scale='utc' format='datetime' value=2016-03-31 23:59:24>
92103
93 The ``"hmi.v_45s"`` series has a data segment with the name ``"Dopplergram"``, which contains Dopplergrams for each record in the series, that are stored as `FITS <http://fits.gsfclient.nasa.gov/>`__ files.
94 The location of the FITS files for the record set query in the example above, can be obtained by using the ``seg`` parameter of the :meth:`drms.client.Client.query` method::
95
96 >>> s = client.query('hmi.v_45s[2016.04.01_TAI/1d@6h]', seg='Dopplergram') # doctest: +REMOTE_DATA
97 >>> s # doctest: +REMOTE_DATA
104 The ``"hmi.v_45s"`` series has a data segment with the name ``"Dopplergram"``, which contains Dopplergrams for each record in the series, that are stored as `FITS <https://fits.gsfc.nasa.gov/>`__ files.
105 The location of the FITS files for the record set query in the example above, can be obtained by using the ``seg`` parameter of :meth:`drms.client.Client.query`:
106
107 .. code-block:: python
108
109 >>> query = client.query('hmi.v_45s[2016.04.01_TAI/1d@6h]', seg='Dopplergram') # doctest: +REMOTE_DATA
110 >>> query # doctest: +REMOTE_DATA
98111 Dopplergram
99112 0 /SUM58/D803708321/S00008/Dopplergram.fits
100113 1 /SUM41/D803708361/S00008/Dopplergram.fits
101114 2 /SUM71/D803720859/S00008/Dopplergram.fits
102115 3 /SUM70/D803730119/S00008/Dopplergram.fits
103116
104 Note that the ``key`` and ``seg`` parameters can also be used together in one :meth:`drms.client.Client.query` call, i.e.::
105
106 >>> k, s = client.query('hmi.v_45s[2016.04.01_TAI/1d@6h]',
117 Note that the ``key`` and ``seg`` parameters can also be used together in one :meth:`drms.client.Client.query` call:
118
119 .. code-block:: python
120
121 >>> keys, segments = client.query('hmi.v_45s[2016.04.01_TAI/1d@6h]',
107122 ... key='T_REC, DATAMEAN, OBS_VR', seg='Dopplergram') # doctest: +REMOTE_DATA
108123
109124 The file paths listed above are the storage location on the JSOC server.
110 You can access these files, even if you do not have direct NFS access to the filesystem, by prepending the JSOC URL to segment file path::
111
112 >>> url = 'http://jsoc.stanford.edu' + s.Dopplergram[0] # doctest: +REMOTE_DATA
125 You can access these files, even if you do not have direct NFS access to the filesystem, by prepending the JSOC URL to the segment file path:
126
127 .. code-block:: python
128
129 >>> url = 'http://jsoc.stanford.edu' + segments.Dopplergram[0] # doctest: +REMOTE_DATA
113130 >>> url # doctest: +REMOTE_DATA
114131 'http://jsoc.stanford.edu/SUM58/D803708321/S00008/Dopplergram.fits'
115132
116133 >>> from astropy.io import fits
117 >>> a = fits.getdata(url) # doctest: +REMOTE_DATA
118 >>> print(a.shape, a.dtype) # doctest: +REMOTE_DATA
134 >>> data = fits.getdata(url) # doctest: +REMOTE_DATA
135 >>> print(data.shape, data.dtype) # doctest: +REMOTE_DATA
119136 (4096, 4096) float32
120137
121 Note that FITS files which are accessed in this way, do not contain anykeyword data in their headers.
138 Note that FITS files which are accessed in this way do not contain any keyword data in their headers.
122139 This is perfectly fine in many cases, because you can just use :meth:`drms.client.Client.query` to obtain the data of all required keywords.
123140 If you need FITS files with headers that contain all the keyword data, you need to submit an export request to JSOC, which is described in the next section.
124141
125 Export requests can also be useful, if you want to download more than only one or two files (even without keyword headers), because you can then use the :meth:`drms.client.ExportRequest.download` method, which takes care of creating URLs, downloading the data and (if necessary) generating suitable local filenames.
142 Export requests can also be useful if you want to download more than one or two files (even without keyword headers), because you can then use :meth:`drms.client.ExportRequest.download`, which takes care of creating URLs, downloading the data and (if necessary) generating suitable local filenames.
126143
127144 Data export requests
128145 --------------------
129
130146 Data export requests can be interactively built and submitted on the `JSOC Export Data <http://jsoc.stanford.edu/ajax/exportdata.html>`__ webpage, where you can also find more information about the different export options that are available.
131 Note that a registered email address is required to for submitting export requests. You can register your email address on the `JSOC email registration <http://jsoc.stanford.edu/ajax/register_email.html>`__ webpage.
147 Note that a registered email address is required for submitting export requests.
148 You can register your email address on the `JSOC email registration <http://jsoc.stanford.edu/ajax/register_email.html>`__ webpage.
132149
133150 It is advisable to have a closer look at the export webpage before submitting export requests using the ``drms`` library.
134151 It is also possible to submit an export request on the webpage and then use the Python routines to query the request status and download files.
136153 .. warning::
137154 Please replace the email below with your own registered email.
138155
139 First, we start again with importing the ``drms`` library and creating a `~drms.client.Client` instance::
156 .. code-block:: python
157
158 >>> import os
159 >>> email_address = os.environ["JSOC_EMAIL"]
160
161 First, we start again with importing the ``drms`` library and creating a `~drms.client.Client` instance:
162
163 .. code-block:: python
140164
141165 >>> import drms
142 >>> client = drms.Client(email='nabil.freij@gmail.com', verbose=True) # doctest: +REMOTE_DATA
166 >>> client = drms.Client(email=email_address, verbose=True) # doctest: +REMOTE_DATA
143167
144168 In this case we also provide an email address (which needs to be already registered at JSOC) and turn on status messages by enabling the ``verbose`` flag.
145169
146 We now create a download directory for our downloads, in case it does not exist yet::
170 We now create a download directory for our downloads, in case it does not exist yet:
171
172 .. code-block:: python
147173
148174 >>> import os
149175 >>> out_dir = 'downloads'
156182 In the following examples we confine ourselves to the methods ``url_quick`` and ``url`` and the protocols ``as-is`` and ``fits``.
157183
158184 url_quick / as-is
159 ~~~~~~~~~~~~~~~~~
160
185 ^^^^^^^^^^^^^^^^^
161186 The most direct and quickest way of downloading files is the combination ``url_quick`` / ``as-is``.
162187 This (in most cases) does not create an actual export request, where you would have to wait for it to finish, but rather compiles a list of files from your data export query, which can then be directly downloaded.
163188 This also means that this kind of export usually has no ``ExportID`` assigned to it.
165190
166191 As an example, we now create an ``url_quick`` / ``as-is`` export request for the same record set that was used in the previous section.
167192 For export requests, the segment name is specified using an additional field in the query string, surrounded by curly braces.
168 Note that :meth:`drms.client.Client.export` performs an ``url_quick`` / ``as-is`` export request by default, so you do not need to explicitly use ``method='url_quick'`` and ``protocol='as-is'`` in this case::
169
170 >>> r = client.export('hmi.v_45s[2016.04.01_TAI/1d@6h]{Dopplergram}') # doctest: +REMOTE_DATA
171 >>> r # doctest: +REMOTE_DATA
193 Note that :meth:`drms.client.Client.export` performs an ``url_quick`` / ``as-is`` export request by default, so you do not need to explicitly use ``method='url_quick'`` and ``protocol='as-is'`` in this case:
194
195 .. code-block:: python
196
197 >>> export_request = client.export('hmi.v_45s[2016.04.01_TAI/1d@6h]{Dopplergram}') # doctest: +REMOTE_DATA
198 >>> export_request # doctest: +REMOTE_DATA
172199 <ExportRequest: id=None, status=0>
173200
174 >>> r.data.filename # doctest: +REMOTE_DATA
201 >>> export_request.data.filename # doctest: +REMOTE_DATA
175202 0 /SUM58/D803708321/S00008/Dopplergram.fits
176203 1 /SUM41/D803708361/S00008/Dopplergram.fits
177204 2 /SUM71/D803720859/S00008/Dopplergram.fits
178205 3 /SUM70/D803730119/S00008/Dopplergram.fits
179206 Name: filename, dtype: object
180207
181 Download URLs can now be generated using the :attr:`drms.client.ExportRequest.urls` attribute::
182
183 >>> r.urls.url[0] # doctest: +REMOTE_DATA
208 Download URLs can now be generated using the :attr:`drms.client.ExportRequest.urls` attribute:
209
210 .. code-block:: python
211
212 >>> export_request.urls.url[0] # doctest: +REMOTE_DATA
184213 'http://jsoc.stanford.edu/SUM58/D803708321/S00008/Dopplergram.fits'
185214
186 Files can be downloaded using the :meth:`drms.client.ExportRequest.download` method.
187 You can (optionally) select which file(s) you want to download, by using the ``index`` parameter of this method. The following, for example, only downloads the first file of the request::
188
189 >>> r.download(out_dir, 0) # doctest: +REMOTE_DATA
215 Files can be downloaded using :meth:`drms.client.ExportRequest.download`.
216 You can (optionally) select which file(s) you want to download by using the ``index`` parameter of this method.
217 The following, for example, only downloads the first file of the request:
218
219 .. code-block:: python
220
221 >>> export_request.download(out_dir, 0) # doctest: +REMOTE_DATA
190222 Downloading file 1 of 1...
191223 record: hmi.V_45s[2016.04.01_00:00:00_TAI][2]{Dopplergram}
192224 filename: Dopplergram.fits
197229 If you need keyword data added to the headers, you have to use the ``fits`` export protocol instead, which is described below.
198230
199231 url / fits
200 ~~~~~~~~~~
201
232 ^^^^^^^^^^
202233 Using the ``fits`` export protocol allows you to request FITS files that include all keyword data in their headers.
203234 Note that this protocol *does not convert* other file formats into the FITS format.
204235 The only purpose of ``protocol='fits'`` is to add keyword data to the headers of segment files that are already stored using the FITS format.
209240
210241 In the following example, we use the ``hmi.sharp_720s`` series, which contains `Spaceweather HMI Active Region Patches <http://jsoc.stanford.edu/doc/data/hmi/sharp/sharp.htm>`__ (SHARPs), and download some data files from this series.
211242
212 First we have a look at the content of the series, by using :meth:`drms.client.Client.info` to get a `~drms.client.SeriesInfo` instance for this particular series::
213
214 >>> si = client.info('hmi.sharp_720s') # doctest: +REMOTE_DATA
215
216 >>> si.note # doctest: +REMOTE_DATA
243 First, we have a look at the contents of the series by using :meth:`drms.client.Client.info` to get a `~drms.client.SeriesInfo` instance for this particular series:
244
245 .. code-block:: python
246
247 >>> series_info = client.info('hmi.sharp_720s') # doctest: +REMOTE_DATA
248 >>> series_info.note # doctest: +REMOTE_DATA
217249 'Spaceweather HMI Active Region Patch (SHARP): CCD coordinates'
218
219 >>> si.primekeys # doctest: +REMOTE_DATA
250 >>> series_info.primekeys # doctest: +REMOTE_DATA
220251 ['HARPNUM', 'T_REC']
221252
222 This series contains a total of 31 different data segments::
223
224 >>> len(si.segments) # doctest: +REMOTE_DATA
253 This series contains a total of 31 different data segments:
254
255 .. code-block:: python
256
257 >>> len(series_info.segments) # doctest: +REMOTE_DATA
225258 31
226
227 >>> si.segments.index.values # doctest: +REMOTE_DATA
259 >>> series_info.segments.index.values # doctest: +REMOTE_DATA
228260 array(['magnetogram', 'bitmap', 'Dopplergram', 'continuum', 'inclination',
229261 'azimuth', 'field', 'vlos_mag', 'dop_width', 'eta_0', 'damping',
230262 'src_continuum', 'src_grad', 'alpha_mag', 'chisq', 'conv_flag',
234266 'inclination_alpha_err', 'azimuth_alpha_err', 'disambig',
235267 'conf_disambig'], dtype=object)
236268
237 Here, we are only interested in magnetograms and continuum intensity maps ::
238
239 >>> si.segments.loc[['continuum', 'magnetogram']] # doctest: +REMOTE_DATA
269 Here, we are only interested in magnetograms and continuum intensity maps:
270
271 .. code-block:: python
272
273 >>> series_info.segments.loc[['continuum', 'magnetogram']] # doctest: +REMOTE_DATA
240274 type units protocol dims note
241275 name
242276 continuum int DN/s fits VARxVAR continuum intensity
244278
245279 which are stored as FITS files with varying dimensions.
246280
247 If we now want to submit an export request for a magnetogram and an intensity map of HARP number 4864, recorded at midnight on November 30, 2014, we can use the following export query string::
248
249 >>> ds = 'hmi.sharp_720s[4864][2014.11.30_00:00_TAI]{continuum, magnetogram}' # doctest: +REMOTE_DATA
250
251 In order to obtain FITS files that include keyword data in their headers, we then need to use ``protocol='fits'`` when submitting the request using :meth:`drms.client.Client.export`::
252
253 >>> r = client.export(ds, method='url', protocol='fits') # doctest: +REMOTE_DATA
254 >>> r # doctest: +REMOTE_DATA
281 If we now want to submit an export request for a magnetogram and an intensity map of HARP number 4864, recorded at midnight on November 30, 2014, we can use the following export query string:
282
283 .. code-block:: python
284
285 >>> query_string = 'hmi.sharp_720s[4864][2014.11.30_00:00_TAI]{continuum, magnetogram}' # doctest: +REMOTE_DATA
286
287 In order to obtain FITS files that include keyword data in their headers, we then need to use ``protocol='fits'`` when submitting the request using :meth:`drms.client.Client.export`:
288
289 .. code-block:: python
290
291 >>> export_request = client.export(query_string, method='url', protocol='fits') # doctest: +REMOTE_DATA
292 >>> export_request # doctest: +REMOTE_DATA
255293 <ExportRequest: id=JSOC_..., status=2>
256294
257 We now need to wait for the server to prepare the requested files::
258
259 >>> r.wait() # doctest: +REMOTE_DATA
295 We now need to wait for the server to prepare the requested files:
296
297 .. code-block:: python
298
299 >>> export_request.wait() # doctest: +REMOTE_DATA
260300 Export request pending. [id=..., status=2]
261301 Waiting for 5 seconds...
262302 ...
263303
264 >>> r.status # doctest: +REMOTE_DATA
304 >>> export_request.status # doctest: +REMOTE_DATA
265305 0
266306
267307 Note that calling :meth:`drms.client.ExportRequest.wait` is optional.
268308 It gives you some control over the waiting process, but it can usually be omitted, in which case :meth:`~drms.client.ExportRequest.wait` is called implicitly when you, for example, try to download the requested files.
269309
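In other words, the following sketch works just as well, since ``download`` blocks until the request is ready:

.. code-block:: python

    >>> export_request = client.export(query_string, method='url', protocol='fits')  # doctest: +SKIP
    >>> export_request.download(out_dir)  # doctest: +SKIP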
270310 After the export request is finished, a unique request URL is created for you, which points to the location where all your requested files are stored.
271 You can use the :attr:`drms.client.ExportRequest.request_url` attribute to obtain this URL::
272
273 >>> r.request_url # doctest: +REMOTE_DATA
311 You can use the :attr:`drms.client.ExportRequest.request_url` attribute to obtain this URL:
312
313 .. code-block:: python
314
315 >>> export_request.request_url # doctest: +REMOTE_DATA
274316 'http://jsoc.stanford.edu/.../S00000'
275317
276318 Note that this location is only temporary and that all files will be deleted after a couple of days.
277319
278 Downloading the data works exactly like in the previous example, by using the :meth:`drms.client.ExportRequest.download` method::
279
280 >>> r.download(out_dir) # doctest: +REMOTE_DATA
320 Downloading the data works exactly as in the previous example, by using :meth:`drms.client.ExportRequest.download`:
321
322 .. code-block:: python
323
324 >>> export_request.download(out_dir) # doctest: +REMOTE_DATA
281325 Downloading file 1 of 2...
282326 ...
283327 Downloading file 2 of 2...
284328 ...
285329
286 .. note::
287 If you want to access an existing export request that you have submitted earlier, or if you submitted an export request using the `JSOC Export Data <http://jsoc.stanford.edu/ajax/exportdata.html>`__ webpage and want to access it from Python, you can use the :meth:`drms.client.Client.export_from_id` method with the corresponding ``ExportID`` to create an `drms.client.ExportRequest` instance for this particular request.
330 If you want to access an existing export request that you have submitted earlier, or one that you submitted using the `JSOC Export Data <http://jsoc.stanford.edu/ajax/exportdata.html>`__ webpage,
331 you can use :meth:`drms.client.Client.export_from_id` with the corresponding ``ExportID`` to create a `drms.client.ExportRequest` instance for this particular request.
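A minimal sketch of that call (the request ID below is only a placeholder):

.. code-block:: python

    >>> export_request = client.export_from_id('JSOC_20221013_123')  # doctest: +SKIP
    >>> export_request.download(out_dir)  # doctest: +SKIP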
1616 # Must be done before any drms imports
1717 __minimum_python_version__ = "3.7"
1818
19
1920 class UnsupportedPythonError(Exception):
20 """Running on an unsupported version of Python."""
21 """
22 Running on an unsupported version of Python.
23 """
2124
2225
23 if sys.version_info < tuple(int(val) for val in __minimum_python_version__.split('.')):
26 if sys.version_info < tuple(int(val) for val in __minimum_python_version__.split(".")):
2427 # This has to be .format to keep backwards compatibly.
25 raise UnsupportedPythonError(
26 "sunpy does not support Python < {}".format(__minimum_python_version__))
28 raise UnsupportedPythonError("sunpy does not support Python < {}".format(__minimum_python_version__))
2729
2830
2931 def _get_bibtex():
3032 import textwrap
3133
3234 # Set the bibtex entry to the article referenced in CITATION.rst
33 citation_file = os.path.join(os.path.dirname(__file__), 'CITATION.rst')
35 citation_file = os.path.join(os.path.dirname(__file__), "CITATION.rst")
3436
3537 # Explicitly specify UTF-8 encoding in case the system's default encoding is problematic
36 with open(citation_file, encoding='utf-8') as citation:
38 with open(citation_file, encoding="utf-8") as citation:
3739 # Extract the first bibtex block:
38 ref = citation.read().partition('.. code:: bibtex\n\n')[2]
39 lines = ref.split('\n')
40 ref = citation.read().partition(".. code:: bibtex\n\n")[2]
41 lines = ref.split("\n")
4042 # Only read the lines which are indented
41 lines = lines[: [line.startswith(' ') for line in lines].index(False)]
42 ref = textwrap.dedent('\n'.join(lines))
43 lines = lines[: [line.startswith(" ") for line in lines].index(False)]
44 ref = textwrap.dedent("\n".join(lines))
4345 return ref
46
4447
4548 __citation__ = __bibtex__ = _get_bibtex()
4649
47 from .version import version as __version__
4850 # DRMS imports to collapse the namespace
4951 from .client import *
5052 from .config import *
5153 from .exceptions import *
5254 from .json import *
5355 from .utils import *
56 from .version import version as __version__
00 # coding: utf-8
11 # file generated by setuptools_scm
22 # don't change, don't track in version control
3 version = '0.6.2'
4 version_tuple = (0, 6, 2)
3 __version__ = version = '0.6.3'
4 __version_tuple__ = version_tuple = (0, 6, 3)
1212 from .json import HttpJsonClient
1313 from .utils import _extract_series_name, _pd_to_numeric_coerce, _split_arg
1414
15 __all__ = ['SeriesInfo', 'ExportRequest', 'Client']
15 __all__ = ["SeriesInfo", "ExportRequest", "Client"]
1616
1717
1818 class SeriesInfo:
5050 def __init__(self, d, name=None):
5151 self._d = d
5252 self.name = name
53 self.retention = self._d.get('retention')
54 self.unitsize = self._d.get('unitsize')
55 self.archive = self._d.get('archive')
56 self.tapegroup = self._d.get('tapegroup')
57 self.note = self._d.get('note')
58 self.primekeys = self._d.get('primekeys')
59 self.dbindex = self._d.get('dbindex')
60 self.keywords = self._parse_keywords(d['keywords'])
61 self.links = self._parse_links(d['links'])
62 self.segments = self._parse_segments(d['segments'])
53 self.retention = self._d.get("retention")
54 self.unitsize = self._d.get("unitsize")
55 self.archive = self._d.get("archive")
56 self.tapegroup = self._d.get("tapegroup")
57 self.note = self._d.get("note")
58 self.primekeys = self._d.get("primekeys")
59 self.dbindex = self._d.get("dbindex")
60 self.keywords = self._parse_keywords(d["keywords"])
61 self.links = self._parse_links(d["links"])
62 self.segments = self._parse_segments(d["segments"])
6363
6464 @staticmethod
6565 def _parse_keywords(d):
66 keys = ['name', 'type', 'recscope', 'defval', 'units', 'note', 'linkinfo']
66 keys = ["name", "type", "recscope", "defval", "units", "note", "linkinfo"]
6767 res = []
6868 for di in d:
6969 resi = []
7373 if not res:
7474 res = None # workaround for older pandas versions
7575 res = pd.DataFrame(res, columns=keys)
76 res.index = res.pop('name')
77 res['is_time'] = res.type == 'time'
78 res['is_integer'] = res.type == 'short'
79 res['is_integer'] |= res.type == 'int'
80 res['is_integer'] |= res.type == 'longlong'
81 res['is_real'] = res.type == 'float'
82 res['is_real'] |= res.type == 'double'
83 res['is_numeric'] = res.is_integer | res.is_real
76 res.index = res.pop("name")
77 res["is_time"] = res.type == "time"
78 res["is_integer"] = res.type == "short"
79 res["is_integer"] |= res.type == "int"
80 res["is_integer"] |= res.type == "longlong"
81 res["is_real"] = res.type == "float"
82 res["is_real"] |= res.type == "double"
83 res["is_numeric"] = res.is_integer | res.is_real
8484 return res
8585
8686 @staticmethod
8787 def _parse_links(d):
88 keys = ['name', 'target', 'kind', 'note']
88 keys = ["name", "target", "kind", "note"]
8989 res = []
9090 for di in d:
9191 resi = []
9595 if not res:
9696 res = None # workaround for older pandas versions
9797 res = pd.DataFrame(res, columns=keys)
98 res.index = res.pop('name')
98 res.index = res.pop("name")
9999 return res
100100
101101 @staticmethod
102102 def _parse_segments(d):
103 keys = ['name', 'type', 'units', 'protocol', 'dims', 'note']
103 keys = ["name", "type", "units", "protocol", "dims", "note"]
104104 res = []
105105 for di in d:
106106 resi = []
110110 if not res:
111111 res = None # workaround for older pandas versions
112112 res = pd.DataFrame(res, columns=keys)
113 res.index = res.pop('name')
113 res.index = res.pop("name")
114114 return res
115115
116116 def __repr__(self):
117117 if self.name is None:
118 return '<SeriesInfo>'
118 return "<SeriesInfo>"
119119 else:
120 return f'<SeriesInfo: {self.name}>'
120 return f"<SeriesInfo: {self.name}>"
121121
122122
123123 class ExportRequest:
146146 return cls(d, client)
147147
148148 def __repr__(self):
149 idstr = str(None) if self._requestid is None else (f'{self._requestid}')
150 return f'<ExportRequest: id={idstr}, status={int(self._status)}>'
149 idstr = str(None) if self._requestid is None else (f"{self._requestid}")
150 return f"<ExportRequest: id={idstr}, status={int(self._status)}>"
151151
152152 @staticmethod
153153 def _parse_data(d):
154 keys = ['record', 'filename']
154 keys = ["record", "filename"]
155155 res = None if d is None else [(di.get(keys[0]), di.get(keys[1])) for di in d]
156156 if not res:
157157 res = None # workaround for older pandas versions
163163 d = self._client._json.exp_status(self._requestid)
164164 self._d = d
165165 self._d_time = time.time()
166 self._status = int(self._d.get('status', self._status))
167 self._requestid = self._d.get('requestid', self._requestid)
166 self._status = int(self._d.get("status", self._status))
167 self._requestid = self._d.get("requestid", self._requestid)
168168 if self._requestid is None:
169169 # Apparently 'reqid' is used instead of 'requestid' for certain
170170 # protocols like 'mpg'
171 self._requestid = self._d.get('reqid')
172 if self._requestid == '':
171 self._requestid = self._d.get("reqid")
172 if self._requestid == "":
173173 # Use None if the requestid is empty (url_quick + as-is)
174174 self._requestid = None
175175
177177 if self._status in self._status_codes_ok_or_pending:
178178 if self._status != self._status_code_notfound or notfound_ok:
179179 return # request has not failed (yet)
180 msg = self._d.get('error')
180 msg = self._d.get("error")
181181 if msg is None:
182 msg = 'DRMS export request failed.'
183 msg += f' [status={int(self._status)}]'
182 msg = "DRMS export request failed."
183 msg += f" [status={int(self._status)}]"
184184 raise DrmsExportError(msg)
185185
186186 def _generate_download_urls(self):
191191 data_dir = self.dir
192192
193193 # Clear first record name for movies, as it is not a DRMS record.
194 if self.protocol in ['mpg', 'mp4']:
195 if res.record[0].startswith('movie'):
194 if self.protocol in ["mpg", "mp4"]:
195 if res.record[0].startswith("movie"):
196196 res.record[0] = None
197197
198198 # tar exports provide only a single TAR file with full path
199199 if self.tarfile is not None:
200200 data_dir = None
201 res = pd.DataFrame([(None, self.tarfile)], columns=['record', 'filename'])
201 res = pd.DataFrame([(None, self.tarfile)], columns=["record", "filename"])
202202
203203 # If data_dir is None, the filename column should contain the full
204204 # path of the file and we need to extract the basename part. If
205205 # data_dir contains a directory, the filename column should contain
206206 # only the basename and we need to join it with the directory.
207207 if data_dir is None:
208 res.rename(columns={'filename': 'fpath'}, inplace=True)
209 split_fpath = res.fpath.str.split('/')
210 res['filename'] = [sfp[-1] for sfp in split_fpath]
208 res.rename(columns={"filename": "fpath"}, inplace=True)
209 split_fpath = res.fpath.str.split("/")
210 res["filename"] = [sfp[-1] for sfp in split_fpath]
211211 else:
212 res['fpath'] = [f'{data_dir}/{filename}' for filename in res.filename]
213
214 if self.method.startswith('url'):
212 res["fpath"] = [f"{data_dir}/{filename}" for filename in res.filename]
213
214 if self.method.startswith("url"):
215215 baseurl = self._client._server.http_download_baseurl
216 elif self.method.startswith('ftp'):
216 elif self.method.startswith("ftp"):
217217 baseurl = self._client._server.ftp_download_baseurl
218218 else:
219 raise RuntimeError(f'Download is not supported for export method {self.method}')
219 raise RuntimeError(f"Download is not supported for export method {self.method}")
220220
221221 # Generate download URLs.
222222 urls = []
223223 for fp in res.fpath:
224 while fp.startswith('/'):
224 while fp.startswith("/"):
225225 fp = fp[1:]
226226 urls.append(urljoin(baseurl, fp))
227 res['url'] = urls
227 res["url"] = urls
228228
229229 # Remove rows with missing files.
230 res = res[res.filename != 'NoDataFile']
231 del res['fpath']
230 res = res[res.filename != "NoDataFile"]
231 del res["fpath"]
232232 return res
233233
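The URL generation above strips leading slashes from each file path and joins the remainder onto the server's HTTP download base URL. A short sketch with a hypothetical export path; the base URL is the JSOC value registered in config.py later in this diff:

from urllib.parse import urljoin

baseurl = "http://jsoc.stanford.edu/"             # http_download_baseurl of the JSOC config
fpath = "/SUM12/D34567/S00000/magnetogram.fits"   # hypothetical fpath value
while fpath.startswith("/"):
    fpath = fpath[1:]
print(urljoin(baseurl, fpath))
# http://jsoc.stanford.edu/SUM12/D34567/S00000/magnetogram.fits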
234234 @staticmethod
239239 i = 1
240240 new_fname = fname
241241 while os.path.exists(new_fname):
242 new_fname = f'{fname}.{int(i)}'
242 new_fname = f"{fname}.{int(i)}"
243243 i += 1
244244 return new_fname
245245
262262 """
263263 (string) Export method.
264264 """
265 return self._d.get('method')
265 return self._d.get("method")
266266
267267 @property
268268 def protocol(self):
269269 """
270270 (string) Export protocol.
271271 """
272 return self._d.get('protocol')
272 return self._d.get("protocol")
273273
274274 @property
275275 def dir(self):
280280 self._raise_on_error()
281281 else:
282282 self.wait()
283 data_dir = self._d.get('dir')
283 data_dir = self._d.get("dir")
284284 return data_dir if data_dir else None
285285
286286 @property
295295 self._raise_on_error()
296296 else:
297297 self.wait()
298 return self._parse_data(self._d.get('data'))
298 return self._parse_data(self._d.get("data"))
299299
300300 @property
301301 def tarfile(self):
306306 self._raise_on_error()
307307 else:
308308 self.wait()
309 data_tarfile = self._d.get('tarfile')
309 data_tarfile = self._d.get("tarfile")
310310 return data_tarfile if data_tarfile else None
311311
312312 @property
318318 self._raise_on_error()
319319 else:
320320 self.wait()
321 data_keywords = self._d.get('keywords')
321 data_keywords = self._d.get("keywords")
322322 return data_keywords if data_keywords else None
323323
324324 @property
330330 http_baseurl = self._client._server.http_download_baseurl
331331 if data_dir is None or http_baseurl is None:
332332 return None
333 if data_dir.startswith('/'):
333 if data_dir.startswith("/"):
334334 data_dir = data_dir[1:]
335335 return urljoin(http_baseurl, data_dir)
336336
462462
463463 while True:
464464 if verbose:
465 idstr = str(None) if self._requestid is None else (f'{self._requestid}')
466 print(f'Export request pending. [id={idstr}, status={self._status}]')
465 idstr = str(None) if self._requestid is None else (f"{self._requestid}")
466 print(f"Export request pending. [id={idstr}, status={self._status}]")
467467
468468 # Use the user-provided sleep value or the server's wait value.
469469 # In case neither is available, wait for 5 seconds.
470 wait_secs = self._d.get('wait', 5) if sleep is None else sleep
470 wait_secs = self._d.get("wait", 5) if sleep is None else sleep
471471
472472 # Consider the time that passed since the last status update.
473473 wait_secs -= time.time() - self._d_time
480480 return False
481481
482482 if verbose:
483 print(f'Waiting for {int(round(wait_secs))} seconds...')
483 print(f"Waiting for {int(round(wait_secs))} seconds...")
484484 time.sleep(wait_secs)
485485
486486 if self.has_finished():
491491 if retries_notfound <= 0:
492492 self._raise_on_error(notfound_ok=False)
493493 if verbose:
494 print(f'Request not found on server, {retries_notfound} retries left.')
494 print(f"Request not found on server, {retries_notfound} retries left.")
495495 retries_notfound -= 1
496496
497497 def download(self, directory, index=None, fname_from_rec=None, verbose=None):
544544 """
545545 out_dir = os.path.abspath(directory)
546546 if not os.path.isdir(out_dir):
547 raise OSError(f'Download directory {out_dir} does not exist')
547 raise OSError(f"Download directory {out_dir} does not exist")
548548
549549 if np.isscalar(index):
550550 index = [int(index)]
559559
560560 if fname_from_rec is None:
561561 # For 'url_quick', generate local filenames from record strings.
562 if self.method == 'url_quick':
562 if self.method == "url_quick":
563563 fname_from_rec = True
564564
565565 # self.urls contains the same records as self.data, except for the tar
581581
582582 fpath = os.path.join(out_dir, filename)
583583 fpath_new = self._next_available_filename(fpath)
584 fpath_tmp = self._next_available_filename(f'{fpath_new}.part')
584 fpath_tmp = self._next_available_filename(f"{fpath_new}.part")
585585 if verbose:
586 print(f'Downloading file {int(i + 1)} of {int(ndata)}...')
587 print(f' record: {di.record}')
588 print(f' filename: {di.filename}')
586 print(f"Downloading file {int(i + 1)} of {int(ndata)}...")
587 print(f" record: {di.record}")
588 print(f" filename: {di.filename}")
589589 try:
590590 urlretrieve(di.url, fpath_tmp)
591591 except (HTTPError, URLError):
592592 fpath_new = None
593593 if verbose:
594 print(' -> Error: Could not download file')
594 print(" -> Error: Could not download file")
595595 else:
596596 fpath_new = self._next_available_filename(fpath)
597597 os.rename(fpath_tmp, fpath_new)
598598 if verbose:
599 print(f' -> {os.path.relpath(fpath_new)}')
599 print(f" -> {os.path.relpath(fpath_new)}")
600600 downloads.append(fpath_new)
601601
602 res = data[['record', 'url']].copy()
603 res['download'] = downloads
602 res = data[["record", "url"]].copy()
603 res["download"] = downloads
604604 return res
605605
606606
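download() above resolves each export URL into a local file, renaming on collisions, and returns a DataFrame with 'record', 'url' and 'download' columns. A hedged end-to-end sketch, assuming an email address registered for JSOC exports and an existing target directory (record set, method and timeout mirror the tests at the end of this diff):

import drms

client = drms.Client("jsoc", email="user@example.com")   # hypothetical registered address
r = client.export("hmi.v_avg120[2150]{mean,power}", method="url_quick", protocol="as-is")
if r.wait(timeout=60):                                    # poll until the request finishes
    result = r.download("/tmp/drms", verbose=True)        # directory must already exist
    print(result.download)                                # local paths of the downloaded files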
610610
611611 Parameters
612612 ----------
613 server : str or ServerConfig
613 server : str or drms.config.ServerConfig
614614 Registered server ID or ServerConfig instance.
615615 Defaults to JSOC.
616616 email : str or None
621621 Print debug output (disabled by default).
622622 """
623623
624 def __init__(self, server='jsoc', email=None, verbose=False, debug=False):
624 def __init__(self, server="jsoc", email=None, verbose=False, debug=False):
625625 self._json = HttpJsonClient(server=server, debug=debug)
626626 self._info_cache = {}
627627 self.verbose = verbose # use property for conversion to bool
628628 self.email = email # use property for email validation
629629
630630 def __repr__(self):
631 return f'<Client: {self._server.name}>'
631 return f"<Client: {self._server.name}>"
632632
633633 def _convert_numeric_keywords(self, ds, kdf, skip_conversion=None):
634634 si = self.info(ds)
635635 int_keys = list(si.keywords[si.keywords.is_integer].index)
636636 num_keys = list(si.keywords[si.keywords.is_numeric].index)
637 num_keys += ['*recnum*', '*sunum*', '*size*']
637 num_keys += ["*recnum*", "*sunum*", "*size*"]
638638 if skip_conversion is None:
639639 skip_conversion = []
640640 elif isinstance(skip_conversion, str):
645645 # pandas apparently does not support hexadecimal strings, so
646646 # we need a special treatment for integer strings that start
647647 # with '0x', like QUALITY. The following to_numeric call is
648 # still neccessary as the results are still Python objects.
648 # still necessary as the results are still Python objects.
649649 if k in int_keys and kdf[k].dtype is np.dtype(object):
650 idx = kdf[k].str.startswith('0x')
650 idx = kdf[k].str.startswith("0x")
651651 if idx.any():
652 kdf.loc[idx, k] = kdf.loc[idx, k].map(lambda x: int(x, base=16))
652 k_idx = kdf.columns.get_loc(k)
653 kdf[kdf.columns[k_idx]] = kdf[kdf.columns[k_idx]].apply(int, base=16)
653654 if k in num_keys:
654655 kdf[k] = _pd_to_numeric_coerce(kdf[k])
655656
659660 Raises a DrmsQueryError, using the json error message from d.
660661 """
661662 if status is None:
662 status = d.get('status')
663 msg = d.get('error')
663 status = d.get("status")
664 msg = d.get("error")
664665 if msg is None:
665 msg = 'DRMS Query failed.'
666 msg += f' [status={status}]'
666 msg = "DRMS Query failed."
667 msg += f" [status={status}]"
667668 raise DrmsQueryError(msg)
668669
669670 def _generate_filenamefmt(self, sname):
679680 pkfmt_list = []
680681 for k in si.primekeys:
681682 if si.keywords.loc[k].is_time:
682 pkfmt_list.append(f'{{{k}:A}}')
683 pkfmt_list.append(f"{{{k}:A}}")
683684 else:
684 pkfmt_list.append(f'{{{k}}}')
685 pkfmt_list.append(f"{{{k}}}")
685686
686687 if pkfmt_list:
687 return '{}.{}.{{segment}}'.format(si.name, '.'.join(pkfmt_list))
688 return "{}.{}.{{segment}}".format(si.name, ".".join(pkfmt_list))
688689 else:
689 return str(si.name) + '.{recnum:%lld}.{segment}'
690 return str(si.name) + ".{recnum:%lld}.{segment}"
690691
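The format-string builder above emits one placeholder per primekey and appends ':A' for time keys. Worked through for hmi.v_45s, whose primekeys T_REC and CAMERA are listed in the tests below (the is_time flags are assumed from that series):

si_name = "hmi.v_45s"
primekeys = [("T_REC", True), ("CAMERA", False)]   # (name, is_time)
pkfmt_list = [f"{{{k}:A}}" if is_time else f"{{{k}}}" for k, is_time in primekeys]
print("{}.{}.{{segment}}".format(si_name, ".".join(pkfmt_list)))
# hmi.v_45s.{T_REC:A}.{CAMERA}.{segment}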
691692 # Some regular expressions used to parse export request queries.
692 _re_export_recset = re.compile(r'^\s*([\w\.]+)\s*(\[.*\])?\s*(?:\{([\w\s\.,]*)\})?\s*$')
693 _re_export_recset_pkeys = re.compile(r'\[([^\[^\]]*)\]')
694 _re_export_recset_slist = re.compile(r'[\s,]+')
693 _re_export_recset = re.compile(r"^\s*([\w\.]+)\s*(\[.*\])?\s*(?:\{([\w\s\.,]*)\})?\s*$")
694 _re_export_recset_pkeys = re.compile(r"\[([^\[^\]]*)\]")
695 _re_export_recset_slist = re.compile(r"[\s,]+")
695696
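The first regular expression above splits an export record-set query into series name, primekey filters and segment list. Applied to a query string that also appears in the tests below:

import re

_re_export_recset = re.compile(r"^\s*([\w\.]+)\s*(\[.*\])?\s*(?:\{([\w\s\.,]*)\})?\s*$")
m = _re_export_recset.match("hmi.v_45s[2013.07.03_08:42_TAI/3m]{Dopplergram}")
print(m.groups())
# ('hmi.v_45s', '[2013.07.03_08:42_TAI/3m]', 'Dopplergram')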
696697 @staticmethod
697698 def _parse_export_recset(rs):
735736 # Cleanup time strings.
736737 if si.keywords.loc[si.primekeys[i]].is_time:
737738 v = pkeys[i]
738 v = v.replace('.', "").replace(':', "").replace('-', "")
739 v = v.replace(".", "").replace(":", "").replace("-", "")
739740 pkeys[i] = v
740741
741742 # Generate filename.
742743 fname = si.name
743744 if pkeys is not None:
744745 pkeys = [k for k in pkeys if k.strip()]
745 pkeys_str = '.'.join(pkeys)
746 pkeys_str = ".".join(pkeys)
746747 if pkeys_str:
747 fname += f'.{pkeys_str}'
748 fname += f".{pkeys_str}"
748749 if segs is not None:
749750 segs = [s for s in segs if s.strip()]
750 segs_str = '.'.join(segs)
751 segs_str = ".".join(segs)
751752 if segs_str:
752 fname += f'.{segs_str}'
753 fname += f".{segs_str}"
753754
754755 if old_fname is not None:
755756 # Try to use the file extension of the original filename.
756 known_fname_extensions = ['.fits', '.txt', '.jpg', '.mpg', '.mp4', '.tar']
757 known_fname_extensions = [".fits", ".txt", ".jpg", ".mpg", ".mp4", ".tar"]
757758 for ext in known_fname_extensions:
758759 if old_fname.endswith(ext):
759760 return fname + ext
761762
762763 # Export color table names, from (internal) series "jsoc.Color_Tables"
763764 _export_color_table_names = [
764 'HMI_mag.lut',
765 'aia_131.lut',
766 'aia_1600.lut',
767 'aia_1700.lut',
768 'aia_171.lut',
769 'aia_193.lut',
770 'aia_211.lut',
771 'aia_304.lut',
772 'aia_335.lut',
773 'aia_4500.lut',
774 'aia_94.lut',
775 'aia_mixed',
776 'bb.sao',
777 'grey.sao',
778 'heat.sao',
765 "HMI_mag.lut",
766 "aia_131.lut",
767 "aia_1600.lut",
768 "aia_1700.lut",
769 "aia_171.lut",
770 "aia_193.lut",
771 "aia_211.lut",
772 "aia_304.lut",
773 "aia_335.lut",
774 "aia_4500.lut",
775 "aia_94.lut",
776 "aia_mixed",
777 "bb.sao",
778 "grey.sao",
779 "heat.sao",
779780 ]
780781
781782 # Export scaling types, from (internal) series "jsoc.Color_Tables"
782 _export_scaling_names = ['LOG', 'MINMAX', 'MINMAXGIVEN', 'SQRT', 'mag']
783 _export_scaling_names = ["LOG", "MINMAX", "MINMAXGIVEN", "SQRT", "mag"]
783784
784785 @staticmethod
785786 def _validate_export_protocol_args(protocol_args):
789790 if protocol_args is None:
790791 return
791792
792 ct_key = 'ct'
793 ct_key = "ct"
793794 ct = protocol_args.get(ct_key)
794795 if ct is None:
795 ct_key = 'CT'
796 ct_key = "CT"
796797 ct = protocol_args.get(ct_key)
797798 if ct is not None:
798799 ll = [s.lower() for s in Client._export_color_table_names]
799800 try:
800801 i = ll.index(ct.lower())
801802 except ValueError:
802 msg = f'{ct} is not a valid color table, '
803 msg += 'available color tables: {}'.format(
804 ', '.join([str(s) for s in Client._export_color_table_names])
803 msg = f"{ct} is not a valid color table, "
804 msg += "available color tables: {}".format(
805 ", ".join([str(s) for s in Client._export_color_table_names])
805806 )
806807 raise ValueError(msg)
807808 protocol_args[ct_key] = Client._export_color_table_names[i]
808809
809 scaling = protocol_args.get('scaling')
810 scaling = protocol_args.get("scaling")
810811 if scaling is not None:
811812 ll = [s.lower() for s in Client._export_scaling_names]
812813 try:
813814 i = ll.index(scaling.lower())
814815 except ValueError:
815 msg = f'{scaling} is not a valid scaling type,'
816 msg += 'available scaling types: {}'.format(
817 ', '.join([str(s) for s in Client._export_scaling_names])
818 )
816 msg = f"{scaling} is not a valid scaling type,"
817 msg += "available scaling types: {}".format(", ".join([str(s) for s in Client._export_scaling_names]))
819818 raise ValueError(msg)
820 protocol_args['scaling'] = Client._export_scaling_names[i]
819 protocol_args["scaling"] = Client._export_scaling_names[i]
821820
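The validation above canonicalizes the 'ct' and 'scaling' entries of protocol_args against the color-table and scaling-type lists; 'size', 'min' and 'max' are handled by the request builder in json.py further down. A hedged sketch of arguments for an image export (the record set is the one used in the AIA test below; values are illustrative):

protocol_args = {"ct": "aia_171.lut", "scaling": "LOG", "size": 1}
# client.export("aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}",
#               protocol="jpg", protocol_args=protocol_args, method="url")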
822821 @property
823822 def _server(self):
847846 @email.setter
848847 def email(self, value):
849848 if value is not None and not self.check_email(value):
850 raise ValueError('Email address is invalid or not registered')
849 raise ValueError("Email address is invalid or not registered")
851850 self._email = value
852851
853852 @property
884883 primekeys and a description of the selected series (see
885884 parameter ``full``).
886885 """
887 if not self._server.check_supported('series'):
888 raise DrmsOperationNotSupported('Server does not support series list access')
886 if not self._server.check_supported("series"):
887 raise DrmsOperationNotSupported("Server does not support series list access")
889888 if self._server.url_show_series_wrapper is None:
890889 # No wrapper CGI available, use the regular version.
891890 d = self._json.show_series(regex)
892 status = d.get('status')
891 status = d.get("status")
893892 if status != 0:
894893 self._raise_query_error(d)
895894 if full:
896 keys = ('name', 'primekeys', 'note')
897 if not d['names']:
895 keys = ("name", "primekeys", "note")
896 if not d["names"]:
898897 return pd.DataFrame(columns=keys)
899 recs = [(it['name'], _split_arg(it['primekeys']), it['note']) for it in d['names']]
898 recs = [(it["name"], _split_arg(it["primekeys"]), it["note"]) for it in d["names"]]
900899 return pd.DataFrame(recs, columns=keys)
901900 else:
902 if not d['names']:
901 if not d["names"]:
903902 return []
904 return [it['name'] for it in d['names']]
903 return [it["name"] for it in d["names"]]
905904 else:
906905 # Use show_series_wrapper instead of the regular version.
907906 d = self._json.show_series_wrapper(regex, info=full)
908907 if full:
909 keys = ('name', 'note')
910 if not d['seriesList']:
908 keys = ("name", "note")
909 if not d["seriesList"]:
911910 return pd.DataFrame(columns=keys)
912911 recs = []
913 for it in d['seriesList']:
912 for it in d["seriesList"]:
914913 name, info = tuple(it.items())[0]
915 note = info.get('description', "")
914 note = info.get("description", "")
916915 recs.append((name, note))
917916 return pd.DataFrame(recs, columns=keys)
918917 else:
919 return d['seriesList']
918 return d["seriesList"]
920919
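series() above returns a plain list of series names, or with full=True a DataFrame with 'name' and 'note' (plus 'primekeys' when the regular show_series CGI is used instead of the wrapper). A usage sketch mirroring the tests at the end of this diff:

import drms

client = drms.Client("jsoc")
slist = client.series(r"hmi\.")             # series names matching the regex
info = client.series(r"hmi\.", full=True)   # DataFrame with at least 'name' and 'note'
print("hmi.v_45s" in (s.lower() for s in slist))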
921920 def info(self, ds):
922921 """
933932 SeriesInfo instance containing information about the data
934933 series.
935934 """
936 if not self._server.check_supported('info'):
937 raise DrmsOperationNotSupported('Server does not support series info access')
935 if not self._server.check_supported("info"):
936 raise DrmsOperationNotSupported("Server does not support series info access")
938937 name = _extract_series_name(ds)
939938 if name is not None:
940939 name = name.lower()
941940 if name in self._info_cache:
942941 return self._info_cache[name]
943942 d = self._json.series_struct(name)
944 status = d.get('status')
943 status = d.get("status")
945944 if status != 0:
946945 self._raise_query_error(d)
947946 si = SeriesInfo(d, name=name)
10471046 Link query results. This DataFrame is only returned,
10481047 if link is not None.
10491048 """
1050 if not self._server.check_supported('query'):
1051 raise DrmsOperationNotSupported('Server does not support DRMS queries')
1049 if not self._server.check_supported("query"):
1050 raise DrmsOperationNotSupported("Server does not support DRMS queries")
10521051 if pkeys:
10531052 pk = self.pkeys(ds)
10541053 key = _split_arg(key) if key is not None else []
10561055 key = pk + key
10571056
10581057 lres = self._json.rs_list(ds, key, seg, link, recinfo=rec_index, n=n)
1059 status = lres.get('status')
1058 status = lres.get("status")
10601059 if status != 0:
10611060 self._raise_query_error(lres)
10621061
10631062 res = []
10641063 if key is not None:
1065 if 'keywords' in lres:
1066 names = [it['name'] for it in lres['keywords']]
1067 values = [it['values'] for it in lres['keywords']]
1064 if "keywords" in lres:
1065 names = [it["name"] for it in lres["keywords"]]
1066 values = [it["values"] for it in lres["keywords"]]
10681067 res_key = pd.DataFrame.from_dict(OrderedDict(zip(names, values)))
10691068 else:
10701069 res_key = pd.DataFrame()
10731072 res.append(res_key)
10741073
10751074 if seg is not None:
1076 if 'segments' in lres:
1077 names = [it['name'] for it in lres['segments']]
1078 values = [it['values'] for it in lres['segments']]
1075 if "segments" in lres:
1076 names = [it["name"] for it in lres["segments"]]
1077 values = [it["values"] for it in lres["segments"]]
10791078 res_seg = pd.DataFrame.from_dict(OrderedDict(zip(names, values)))
10801079 else:
10811080 res_seg = pd.DataFrame()
10821081 res.append(res_seg)
10831082
10841083 if link is not None:
1085 if 'links' in lres:
1086 names = [it['name'] for it in lres['links']]
1087 values = [it['values'] for it in lres['links']]
1084 if "links" in lres:
1085 names = [it["name"] for it in lres["links"]]
1086 values = [it["values"] for it in lres["links"]]
10881087 res_link = pd.DataFrame.from_dict(OrderedDict(zip(names, values)))
10891088 else:
10901089 res_link = pd.DataFrame()
10911090 res.append(res_link)
10921091
10931092 if rec_index:
1094 index = [it['name'] for it in lres['recinfo']]
1093 index = [it["name"] for it in lres["recinfo"]]
10951094 for r in res:
10961095 r.index = index
10971096
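query() returns one DataFrame per requested category (keywords, segments, links), in that order. A sketch taken almost verbatim from the query tests at the end of this diff:

import drms

client = drms.Client("jsoc")
keys, segs = client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]",
                          key="T_REC, CRLT_OBS", seg="Dopplergram")
print(keys.columns.tolist())   # expected: ['T_REC', 'CRLT_OBS']
print(segs.columns.tolist())   # expected: ['Dopplergram']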
11211120 True if the email address is valid and registered, False
11221121 otherwise.
11231122 """
1124 if not self._server.check_supported('email'):
1125 raise DrmsOperationNotSupported('Server does not support user emails')
1123 if not self._server.check_supported("email"):
1124 raise DrmsOperationNotSupported("Server does not support user emails")
11261125 res = self._json.check_address(email)
1127 status = res.get('status')
1126 status = res.get("status")
11281127 return status is not None and int(status) == 2
11291128
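check_email() above maps the status value 2 returned by the checkAddress CGI to True; the email setter earlier in this class relies on the same check. A short sketch with a hypothetical address:

import drms

client = drms.Client("jsoc")
if client.check_email("user@example.com"):   # True only for registered addresses
    client.email = "user@example.com"        # the setter raises ValueError otherwise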
11301129 def export(
11311130 self,
11321131 ds,
1133 method='url_quick',
1134 protocol='as-is',
1132 method="url_quick",
1133 protocol="as-is",
11351134 protocol_args=None,
11361135 filenamefmt=None,
11371136 n=None,
12051204 -------
12061205 result : `ExportRequest`
12071206 """
1208 if not self._server.check_supported('export'):
1209 raise DrmsOperationNotSupported('Server does not support export requests')
1207 if not self._server.check_supported("export"):
1208 raise DrmsOperationNotSupported("Server does not support export requests")
12101209 if email is None:
12111210 if self._email is None:
1212 raise ValueError('The email argument is required, when no default email address was set')
1211 raise ValueError("The email argument is required, when no default email address was set")
12131212 email = self._email
12141213
12151214 if filenamefmt is None:
12181217 elif filenamefmt is False:
12191218 filenamefmt = None
12201219
1221 if protocol.lower() in ['jpg', 'mpg', 'mp4']:
1220 if protocol.lower() in ["jpg", "mpg", "mp4"]:
12221221 self._validate_export_protocol_args(protocol_args)
12231222
12241223 d = self._json.exp_request(
12471246 -------
12481247 result : `ExportRequest`
12491248 """
1250 if not self._server.check_supported('export'):
1251 raise DrmsOperationNotSupported('Server does not support export requests')
1249 if not self._server.check_supported("export"):
1250 raise DrmsOperationNotSupported("Server does not support export requests")
12521251 return ExportRequest._create_from_id(requestid, client=self)
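The final hunk above rebuilds an ExportRequest from an existing request id. In the upstream API this method is Client.export_from_id; the name is not visible in this hunk, so treat it as an assumption. Sketch:

import drms

client = drms.Client("jsoc")
# Assumed method name (not shown in this hunk): Client.export_from_id
r = client.export_from_id("JSOC_20221013_12345")   # hypothetical request id
r.wait()
print(r.urls)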
00 from urllib.parse import urljoin
11
2 __all__ = ['ServerConfig', 'register_server']
2 __all__ = ["ServerConfig", "register_server"]
33
44
55 class ServerConfig:
3838 """
3939
4040 _valid_keys = [
41 'name',
42 'cgi_baseurl',
43 'cgi_show_series',
44 'cgi_jsoc_info',
45 'cgi_jsoc_fetch',
46 'cgi_check_address',
47 'cgi_show_series_wrapper',
48 'show_series_wrapper_dbhost',
49 'url_show_series',
50 'url_jsoc_info',
51 'url_jsoc_fetch',
52 'url_check_address',
53 'url_show_series_wrapper',
54 'encoding',
55 'http_download_baseurl',
56 'ftp_download_baseurl',
41 "name",
42 "cgi_baseurl",
43 "cgi_show_series",
44 "cgi_jsoc_info",
45 "cgi_jsoc_fetch",
46 "cgi_check_address",
47 "cgi_show_series_wrapper",
48 "show_series_wrapper_dbhost",
49 "url_show_series",
50 "url_jsoc_info",
51 "url_jsoc_fetch",
52 "url_check_address",
53 "url_show_series_wrapper",
54 "encoding",
55 "http_download_baseurl",
56 "ftp_download_baseurl",
5757 ]
5858
5959 def __init__(self, config=None, **kwargs):
6262
6363 for k in d:
6464 if k not in self._valid_keys:
65 raise ValueError(f'Invalid server config key: {k}')
65 raise ValueError(f"Invalid server config key: {k}")
6666
67 if 'name' not in d:
67 if "name" not in d:
6868 raise ValueError('Server config entry "name" is missing')
6969
7070 # encoding defaults to latin1
71 if 'encoding' not in d:
72 d['encoding'] = 'latin1'
71 if "encoding" not in d:
72 d["encoding"] = "latin1"
7373
7474 # Generate URL entries from CGI entries, if cgi_baseurl exists and
7575 # the specific URL entry is not already set.
76 if 'cgi_baseurl' in d:
77 cgi_baseurl = d['cgi_baseurl']
78 cgi_keys = [k for k in self._valid_keys if k.startswith('cgi') and k != 'cgi_baseurl']
76 if "cgi_baseurl" in d:
77 cgi_baseurl = d["cgi_baseurl"]
78 cgi_keys = [k for k in self._valid_keys if k.startswith("cgi") and k != "cgi_baseurl"]
7979 for k in cgi_keys:
80 url_key = f'url{k[3:]}'
80 url_key = f"url{k[3:]}"
8181 cgi_value = d.get(k)
8282 if d.get(url_key) is None and cgi_value is not None:
8383 d[url_key] = urljoin(cgi_baseurl, cgi_value)
9797 def __setattr__(self, name, value):
9898 if name in self._valid_keys:
9999 if not isinstance(value, str):
100 raise ValueError(f'{name} config value must be a string')
100 raise ValueError(f"{name} config value must be a string")
101101 self._d[name] = value
102102 else:
103103 object.__setattr__(self, name, value)
112112 """
113113 Check if an operation is supported by the server.
114114 """
115 if op == 'series':
115 if op == "series":
116116 return (self.cgi_show_series is not None) or (self.cgi_show_series_wrapper is not None)
117 elif op == 'info':
117 elif op == "info":
118118 return self.cgi_jsoc_info is not None
119 elif op == 'query':
119 elif op == "query":
120120 return self.cgi_jsoc_info is not None
121 elif op == 'email':
121 elif op == "email":
122122 return self.cgi_check_address is not None
123 elif op == 'export':
123 elif op == "export":
124124 return (self.cgi_jsoc_info is not None) and (self.cgi_jsoc_fetch is not None)
125125 else:
126 raise ValueError(f'Unknown operation: {op!r}')
126 raise ValueError(f"Unknown operation: {op!r}")
127127
128128
129129 def register_server(config):
133133 global _server_configs
134134 name = config.name.lower()
135135 if name in _server_configs:
136 raise RuntimeError(f'ServerConfig {name} already registered')
136 raise RuntimeError(f"ServerConfig {name} already registered")
137137 _server_configs[config.name.lower()] = config
138138
139139
143143 # Register public JSOC DRMS server.
144144 register_server(
145145 ServerConfig(
146 name='JSOC',
147 cgi_baseurl='http://jsoc.stanford.edu/cgi-bin/ajax/',
148 cgi_show_series='show_series',
149 cgi_jsoc_info='jsoc_info',
150 cgi_jsoc_fetch='jsoc_fetch',
151 cgi_check_address='checkAddress.sh',
152 cgi_show_series_wrapper='showextseries',
153 show_series_wrapper_dbhost='hmidb2',
154 http_download_baseurl='http://jsoc.stanford.edu/',
155 ftp_download_baseurl='ftp://pail.stanford.edu/export/',
146 name="JSOC",
147 cgi_baseurl="http://jsoc.stanford.edu/cgi-bin/ajax/",
148 cgi_show_series="show_series",
149 cgi_jsoc_info="jsoc_info",
150 cgi_jsoc_fetch="jsoc_fetch",
151 cgi_check_address="checkAddress.sh",
152 cgi_show_series_wrapper="showextseries",
153 show_series_wrapper_dbhost="hmidb2",
154 http_download_baseurl="http://jsoc.stanford.edu/",
155 ftp_download_baseurl="ftp://pail.stanford.edu/export/",
156156 )
157157 )
158158
159159 # Register KIS DRMS server.
160160 register_server(
161161 ServerConfig(
162 name='KIS',
163 cgi_baseurl='http://drms.leibniz-kis.de/cgi-bin/',
164 cgi_show_series='show_series',
165 cgi_jsoc_info='jsoc_info',
162 name="KIS",
163 cgi_baseurl="http://drms.leibniz-kis.de/cgi-bin/",
164 cgi_show_series="show_series",
165 cgi_jsoc_info="jsoc_info",
166166 )
167167 )
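ServerConfig expands 'cgi_*' entries into full 'url_*' entries relative to cgi_baseurl, and register_server() stores each config under its lower-cased name. A hedged sketch of registering a local NetDRMS site, modelled on the KIS entry above (the base URL is hypothetical):

from drms.config import ServerConfig, register_server

register_server(
    ServerConfig(
        name="MYSITE",
        cgi_baseurl="http://drms.example.org/cgi-bin/",
        cgi_show_series="show_series",
        cgi_jsoc_info="jsoc_info",
    )
)
# drms.Client("mysite") should now resolve to this config; export operations remain
# unsupported because cgi_jsoc_fetch is not set (see check_supported above).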
00 __all__ = [
1 'DrmsError',
2 'DrmsQueryError',
3 'DrmsExportError',
4 'DrmsOperationNotSupported',
1 "DrmsError",
2 "DrmsQueryError",
3 "DrmsExportError",
4 "DrmsOperationNotSupported",
55 ]
66
77
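These exception names are re-exported at package level (see the symbol test near the end of this diff). A minimal sketch of catching a failed export, assuming a client with a registered export email:

import drms

client = drms.Client("jsoc", email="user@example.com")   # hypothetical registered address
try:
    r = client.export("hmi.v_45s[2016.04.01_TAI/1d@6h]{Dopplergram}", protocol="fits", method="url")
    r.wait(timeout=60)
except drms.DrmsExportError as e:   # raised via ExportRequest._raise_on_error above
    print(e)                        # "DRMS export request failed. [status=...]" or the server's message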
00 import json as _json
11 from urllib.parse import urlencode, quote_plus
2 from urllib.request import urlopen
2 from urllib.request import HTTPError, urlopen
33
44 from .config import ServerConfig, _server_configs
55 from .utils import _split_arg
66
7 __all__ = ['const', 'HttpJsonRequest', 'HttpJsonClient']
7 __all__ = ["const", "HttpJsonRequest", "HttpJsonClient"]
88
99
1010 class JsocInfoConstants:
3737 = ``'*archive*'``
3838 """
3939
40 all = '**ALL**'
41 none = '**NONE**'
42 recdir = '*recdir*'
43 dirmtime = '*dirmtime*'
44 logdir = '*logdir*'
45 recnum = '*recnum*'
46 sunum = '*sunum*'
47 size = '*size*'
48 online = '*online*'
49 retain = '*retain*'
50 archive = '*archive*'
40 all = "**ALL**"
41 none = "**NONE**"
42 recdir = "*recdir*"
43 dirmtime = "*dirmtime*"
44 logdir = "*logdir*"
45 recnum = "*recnum*"
46 sunum = "*sunum*"
47 size = "*size*"
48 online = "*online*"
49 retain = "*retain*"
50 archive = "*archive*"
5151
5252
5353 const = JsocInfoConstants()
6262
6363 def __init__(self, url, encoding):
6464 self._encoding = encoding
65 self._http = urlopen(url)
65 try:
66 self._http = urlopen(url)
67 except HTTPError as e:
68 e.msg = f"Failed to open URL: {e.url} with {e.code} - {e.msg}"
69 raise e
6670 self._data_str = None
6771 self._data = None
6872
6973 def __repr__(self):
70 return f'<HttpJsonRequest: {self.url}>'
74 return f"<HttpJsonRequest: {self.url}>"
7175
7276 @property
7377 def url(self):
99103 Enable or disable debug mode (default is disabled).
100104 """
101105
102 def __init__(self, server='jsoc', debug=False):
106 def __init__(self, server="jsoc", debug=False):
103107 if isinstance(server, ServerConfig):
104108 self._server = server
105109 else:
107111 self.debug = debug
108112
109113 def __repr__(self):
110 return f'<HttpJsonClient: {self._server.name}>'
114 return f"<HttpJsonClient: {self._server.name}>"
111115
112116 def _json_request(self, url):
113117 if self.debug:
139143 -------
140144 result : dict
141145 """
142 query = '?' if ds_filter is not None else ""
146 query = "?" if ds_filter is not None else ""
143147 if ds_filter is not None:
144 query += urlencode({'filter': ds_filter})
148 query += urlencode({"filter": ds_filter})
145149 req = self._json_request(self._server.url_show_series + query)
146150 return req.data
147151
167171 -------
168172 result : dict
169173 """
170 query_args = {'dbhost': self._server.show_series_wrapper_dbhost}
174 query_args = {"dbhost": self._server.show_series_wrapper_dbhost}
171175 if ds_filter is not None:
172 query_args['filter'] = ds_filter
176 query_args["filter"] = ds_filter
173177 if info:
174 query_args['info'] = '1'
175 query = f'?{urlencode(query_args)}'
178 query_args["info"] = "1"
179 query = f"?{urlencode(query_args)}"
176180 req = self._json_request(self._server.url_show_series_wrapper + query)
177181 return req.data
178182
242246 Dictionary containing the requested record set information.
243247 """
244248 if key is None and seg is None and link is None:
245 raise ValueError('At least one key, seg or link must be specified')
246 d = {'op': 'rs_list', 'ds': ds}
249 raise ValueError("At least one key, seg or link must be specified")
250 d = {"op": "rs_list", "ds": ds}
247251 if key is not None:
248 d['key'] = ','.join(_split_arg(key))
252 d["key"] = ",".join(_split_arg(key))
249253 if seg is not None:
250 d['seg'] = ','.join(_split_arg(seg))
254 d["seg"] = ",".join(_split_arg(seg))
251255 if link is not None:
252 d['link'] = ','.join(_split_arg(link))
256 d["link"] = ",".join(_split_arg(link))
253257 if recinfo:
254 d['R'] = '1'
258 d["R"] = "1"
255259 if n is not None:
256 d['n'] = f'{int(int(n))}'
260 d["n"] = f"{int(int(n))}"
257261 if uid is not None:
258 d['userhandle'] = uid
259 query = f'?{urlencode(d)}'
262 d["userhandle"] = uid
263 query = f"?{urlencode(d)}"
260264 req = self._json_request(self._server.url_jsoc_info + query)
261265 return req.data
262266
279283 - 4: Email address has neither been validated nor registered
280284 - -2: Not a valid email address
281285 """
282 query = '?' + urlencode({'address': quote_plus(email), 'checkonly': '1'})
286 query = "?" + urlencode({"address": quote_plus(email), "checkonly": "1"})
283287 req = self._json_request(self._server.url_check_address + query)
284288 return req.data
285289
334338 self,
335339 ds,
336340 notify,
337 method='url_quick',
338 protocol='as-is',
341 method="url_quick",
342 protocol="as-is",
339343 protocol_args=None,
340344 filenamefmt=None,
341345 n=None,
343347 requestor=None,
344348 ):
345349 method = method.lower()
346 method_list = ['url_quick', 'url', 'url-tar', 'ftp', 'ftp-tar']
350 method_list = ["url_quick", "url", "url-tar", "ftp", "ftp-tar"]
347351 if method not in method_list:
348352 raise ValueError(
349 'Method {} is not supported, valid methods are: {}'.format(
350 method, ', '.join(str(s) for s in method_list)
353 "Method {} is not supported, valid methods are: {}".format(
354 method, ", ".join(str(s) for s in method_list)
351355 )
352356 )
353357
354358 protocol = protocol.lower()
355 img_protocol_list = ['jpg', 'mpg', 'mp4']
356 protocol_list = ['as-is', 'fits'] + img_protocol_list
359 img_protocol_list = ["jpg", "mpg", "mp4"]
360 protocol_list = ["as-is", "fits"] + img_protocol_list
357361 if protocol not in protocol_list:
358362 raise ValueError(
359 'Protocol {} is not supported, valid protocols are: {}'.format(
360 protocol, ', '.join(str(s) for s in protocol_list)
363 "Protocol {} is not supported, valid protocols are: {}".format(
364 protocol, ", ".join(str(s) for s in protocol_list)
361365 )
362366 )
363367
364368 # method "url_quick" is meant to be used with "as-is", change method
365369 # to "url" if protocol is not "as-is"
366 if method == 'url_quick' and protocol != 'as-is':
367 method = 'url'
370 if method == "url_quick" and protocol != "as-is":
371 method = "url"
368372
369373 if protocol in img_protocol_list:
370 extra_keys = {'ct': 'grey.sao', 'scaling': 'MINMAX', 'size': 1}
374 extra_keys = {"ct": "grey.sao", "scaling": "MINMAX", "size": 1}
371375 if protocol_args is not None:
372376 for k, v in protocol_args.items():
373 if k.lower() == 'ct':
374 extra_keys['ct'] = v
375 elif k == 'scaling':
377 if k.lower() == "ct":
378 extra_keys["ct"] = v
379 elif k == "scaling":
376380 extra_keys[k] = v
377 elif k == 'size':
381 elif k == "size":
378382 extra_keys[k] = int(v)
379 elif k in ['min', 'max']:
383 elif k in ["min", "max"]:
380384 extra_keys[k] = float(v)
381385 else:
382 raise ValueError(f'Unknown protocol argument: {k}')
383 protocol += ',CT={ct},scaling={scaling},size={size}'.format(**extra_keys)
384 if 'min' in extra_keys:
386 raise ValueError(f"Unknown protocol argument: {k}")
387 protocol += ",CT={ct},scaling={scaling},size={size}".format(**extra_keys)
388 if "min" in extra_keys:
385389 protocol += f',min={extra_keys["min"]:g}'
386 if 'max' in extra_keys:
390 if "max" in extra_keys:
387391 protocol += f',max={extra_keys["max"]:g}'
388392 else:
389393 if protocol_args is not None:
390 raise ValueError(f'protocol_args not supported for protocol {protocol}')
394 raise ValueError(f"protocol_args not supported for protocol {protocol}")
391395
392396 d = {
393 'op': 'exp_request',
394 'format': 'json',
395 'ds': ds,
396 'notify': notify,
397 'method': method,
398 'protocol': protocol,
397 "op": "exp_request",
398 "format": "json",
399 "ds": ds,
400 "notify": notify,
401 "method": method,
402 "protocol": protocol,
399403 }
400404
401405 if filenamefmt is not None:
402 d['filenamefmt'] = filenamefmt
406 d["filenamefmt"] = filenamefmt
403407
404408 n = int(n) if n is not None else 0
405 d['process=n'] = f'{n}'
409 d["process=n"] = f"{n}"
406410 if process is not None:
407411 allowed_processes = [
408 'im_patch',
409 'resize',
410 'rebin',
411 'aia_scale_aialev1',
412 'aia_scale_orig',
413 'aia_scale_other',
414 'Maproj',
415 'HmiB2ptr',
412 "im_patch",
413 "resize",
414 "rebin",
415 "aia_scale_aialev1",
416 "aia_scale_orig",
417 "aia_scale_other",
418 "Maproj",
419 "HmiB2ptr",
416420 ]
417421 process_strings = {}
418422 for p, opts in process.items():
419423 if p not in allowed_processes:
420 raise ValueError(f'{p} is not one of the allowed processing options: {allowed_processes}')
421 process_strings[p] = ','.join([f'{k}={v}' for k, v in opts.items()])
422 processes = '|'.join([f'{k},{v}' for k, v in process_strings.items()])
423 d['process=n'] = f'{d["process=n"]}|{processes}'
424 raise ValueError(f"{p} is not one of the allowed processing options: {allowed_processes}")
425 process_strings[p] = ",".join([f"{k}={v}" for k, v in opts.items()])
426 processes = "|".join([f"{k},{v}" for k, v in process_strings.items()])
427 d["process=n"] = f'{d["process=n"]}|{processes}'
424428
425429 if requestor is None:
426 d['requestor'] = notify.split('@')[0]
430 d["requestor"] = notify.split("@")[0]
427431 elif requestor is not False:
428 d['requestor'] = requestor
429
430 query = '?' + urlencode(d)
432 d["requestor"] = requestor
433
434 query = "?" + urlencode(d)
431435 return self._server.url_jsoc_fetch + query
432436
433437 def exp_status(self, requestid):
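rs_list() above assembles a jsoc_info query string from the record set plus comma-joined key/seg/link lists. A sketch of the resulting URL, using the JSOC base URL registered in config.py earlier in this diff and the record set from the query tests below:

from urllib.parse import urlencode

d = {"op": "rs_list", "ds": "hmi.v_45s[2013.07.03_08:42_TAI/3m]",
     "key": "T_REC,CRLT_OBS", "seg": "Dopplergram"}
url_jsoc_info = "http://jsoc.stanford.edu/cgi-bin/ajax/jsoc_info"
print(url_jsoc_info + "?" + urlencode(d))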
88
99 # Create a Client instance
1010 client = drms.Client(args.server, email=args.email, verbose=args.verbose, debug=args.debug)
11 print(f'client: {client}')
11 print(f"client: {client}")
1212
1313
1414 def parse_args(args):
1515 import drms
1616
1717 # Handle command line options
18 parser = argparse.ArgumentParser(description='drms, access HMI, AIA and MDI data with python')
19 parser.add_argument('--debug', action='store_true', help='enable debug output')
18 parser = argparse.ArgumentParser(description="drms, access HMI, AIA and MDI data with python")
19 parser.add_argument("--debug", action="store_true", help="enable debug output")
2020 parser.add_argument(
21 '--version', action='version', version=f'drms v{drms.__version__}', help='show package version and exit',
21 "--version",
22 action="version",
23 version=f"drms v{drms.__version__}",
24 help="show package version and exit",
2225 )
23 parser.add_argument('--email', help='email address for data export requests')
24 parser.add_argument('--verbose', action='store_true', help='print export status messages to stdout')
25 parser.add_argument('server', nargs='?', default='jsoc', help='DRMS server, default is JSOC')
26 parser.add_argument("--email", help="email address for data export requests")
27 parser.add_argument("--verbose", action="store_true", help="print export status messages to stdout")
28 parser.add_argument("server", nargs="?", default="jsoc", help="DRMS server, default is JSOC")
2629
2730 args = parser.parse_args(args)
2831 return args
66 def test_client_init_defaults():
77 c = drms.Client()
88 assert isinstance(c._server, ServerConfig)
9 assert c._server.name.lower() == 'jsoc'
9 assert c._server.name.lower() == "jsoc"
1010 assert c.email is None
1111 assert c.verbose is False
1212 assert c.debug is False
1313
1414
15 @pytest.mark.parametrize('value', [True, False])
15 @pytest.mark.parametrize("value", [True, False])
1616 def test_client_init_verbose(value):
1717 c = drms.Client(verbose=value)
1818 assert c.verbose is value
1919 assert c.debug is False
2020
2121
22 @pytest.mark.parametrize('value', [True, False])
22 @pytest.mark.parametrize("value", [True, False])
2323 def test_client_init_debug(value):
2424 c = drms.Client(debug=value)
2525 assert c.verbose is False
2626 assert c.debug is value
2727
2828
29 @pytest.mark.parametrize('server_name', ['jsoc', 'kis'])
29 @pytest.mark.parametrize("server_name", ["jsoc", "kis"])
3030 def test_client_registered_servers(server_name):
3131 c = drms.Client(server_name)
3232 assert isinstance(c._server, ServerConfig)
3737
3838
3939 def test_client_custom_config():
40 cfg = ServerConfig(name='TEST')
40 cfg = ServerConfig(name="TEST")
4141 c = drms.Client(server=cfg)
4242 assert isinstance(c._server, ServerConfig)
43 assert c._server.name == 'TEST'
43 assert c._server.name == "TEST"
4444
4545
4646 def test_repr():
47 assert repr(drms.Client()) == '<Client: JSOC>'
48 assert repr(drms.Client(server='kis')) == '<Client: KIS>'
47 assert repr(drms.Client()) == "<Client: JSOC>"
48 assert repr(drms.Client(server="kis")) == "<Client: KIS>"
33
44
55 def test_create_config_basic():
6 cfg = ServerConfig(name='TEST')
6 cfg = ServerConfig(name="TEST")
77 valid_keys = ServerConfig._valid_keys
8 assert 'name' in valid_keys
9 assert 'encoding' in valid_keys
8 assert "name" in valid_keys
9 assert "encoding" in valid_keys
1010 for k in valid_keys:
1111 v = getattr(cfg, k)
12 if k == 'name':
13 assert v == 'TEST'
14 elif k == 'encoding':
15 assert v == 'latin1'
12 if k == "name":
13 assert v == "TEST"
14 elif k == "encoding":
15 assert v == "latin1"
1616 else:
1717 assert v is None
18 assert repr(cfg) == '<ServerConfig: TEST>'
18 assert repr(cfg) == "<ServerConfig: TEST>"
1919
2020
2121 def test_create_config_missing_name():
2424
2525
2626 def test_copy_config():
27 cfg = ServerConfig(name='TEST')
28 assert cfg.name == 'TEST'
27 cfg = ServerConfig(name="TEST")
28 assert cfg.name == "TEST"
2929
3030 cfg2 = cfg.copy()
3131 assert cfg2 is not cfg
32 assert cfg2.name == 'TEST'
32 assert cfg2.name == "TEST"
3333
34 cfg.name = 'MUH'
34 cfg.name = "MUH"
3535 assert cfg.name != cfg2.name
3636
3737
3838 def test_register_server():
39 cfg = ServerConfig(name='TEST')
39 cfg = ServerConfig(name="TEST")
4040
41 assert 'test' not in _server_configs
41 assert "test" not in _server_configs
4242 register_server(cfg)
43 assert 'test' in _server_configs
43 assert "test" in _server_configs
4444
45 del _server_configs['test']
46 assert 'test' not in _server_configs
45 del _server_configs["test"]
46 assert "test" not in _server_configs
4747
4848
4949 def test_register_server_existing():
50 assert 'jsoc' in _server_configs
51 cfg = ServerConfig(name='jsoc')
50 assert "jsoc" in _server_configs
51 cfg = ServerConfig(name="jsoc")
5252 with pytest.raises(RuntimeError):
5353 register_server(cfg)
54 assert 'jsoc' in _server_configs
54 assert "jsoc" in _server_configs
5555
5656
5757 def test_config_jsoc():
58 assert 'jsoc' in _server_configs
58 assert "jsoc" in _server_configs
5959
60 cfg = _server_configs['jsoc']
61 assert cfg.name.lower() == 'jsoc'
60 cfg = _server_configs["jsoc"]
61 assert cfg.name.lower() == "jsoc"
6262 assert isinstance(cfg.encoding, str)
6363 assert isinstance(cfg.cgi_show_series, str)
6464 assert isinstance(cfg.cgi_jsoc_info, str)
6666 assert isinstance(cfg.cgi_check_address, str)
6767 assert isinstance(cfg.cgi_show_series_wrapper, str)
6868 assert isinstance(cfg.show_series_wrapper_dbhost, str)
69 assert cfg.http_download_baseurl.startswith('http://')
70 assert cfg.ftp_download_baseurl.startswith('ftp://')
69 assert cfg.http_download_baseurl.startswith("http://")
70 assert cfg.ftp_download_baseurl.startswith("ftp://")
7171
7272 baseurl = cfg.cgi_baseurl
73 assert baseurl.startswith('http://')
73 assert baseurl.startswith("http://")
7474 assert cfg.url_show_series.startswith(baseurl)
7575 assert cfg.url_jsoc_info.startswith(baseurl)
7676 assert cfg.url_jsoc_fetch.startswith(baseurl)
7979
8080
8181 def test_config_kis():
82 assert 'kis' in _server_configs
83 cfg = _server_configs['kis']
82 assert "kis" in _server_configs
83 cfg = _server_configs["kis"]
8484
85 assert cfg.name.lower() == 'kis'
85 assert cfg.name.lower() == "kis"
8686 assert isinstance(cfg.encoding, str)
8787
8888 assert isinstance(cfg.cgi_show_series, str)
9595 assert cfg.ftp_download_baseurl is None
9696
9797 baseurl = cfg.cgi_baseurl
98 assert baseurl.startswith('http://')
98 assert baseurl.startswith("http://")
9999 assert cfg.url_show_series.startswith(baseurl)
100100 assert cfg.url_jsoc_info.startswith(baseurl)
101101 assert cfg.url_jsoc_fetch is None
104104
105105
106106 @pytest.mark.parametrize(
107 'server_name, operation, expected',
107 "server_name, operation, expected",
108108 [
109 ('jsoc', 'series', True),
110 ('jsoc', 'info', True),
111 ('jsoc', 'query', True),
112 ('jsoc', 'email', True),
113 ('jsoc', 'export', True),
114 ('kis', 'series', True),
115 ('kis', 'info', True),
116 ('kis', 'query', True),
117 ('kis', 'email', False),
118 ('kis', 'export', False),
109 ("jsoc", "series", True),
110 ("jsoc", "info", True),
111 ("jsoc", "query", True),
112 ("jsoc", "email", True),
113 ("jsoc", "export", True),
114 ("kis", "series", True),
115 ("kis", "info", True),
116 ("kis", "query", True),
117 ("kis", "email", False),
118 ("kis", "export", False),
119119 ],
120120 )
121121 def test_supported(server_name, operation, expected):
124124
125125
126126 @pytest.mark.parametrize(
127 'server_name, operation', [('jsoc', 'bar'), ('kis', 'foo')],
127 "server_name, operation",
128 [("jsoc", "bar"), ("kis", "foo")],
128129 )
129130 def test_supported_invalid_operation(server_name, operation):
130131 cfg = _server_configs[server_name]
134135
135136 def test_create_config_invalid_key():
136137 with pytest.raises(ValueError):
137 cfg = ServerConfig(foo='bar')
138 cfg = ServerConfig(foo="bar")
138139
139140
140141 def test_getset_attr():
141 cfg = ServerConfig(name='TEST')
142 assert getattr(cfg, 'name') == 'TEST'
143 assert getattr(cfg, '__dict__') == {'_d': {'encoding': 'latin1', 'name': 'TEST'}}
142 cfg = ServerConfig(name="TEST")
143 assert getattr(cfg, "name") == "TEST"
144 assert getattr(cfg, "__dict__") == {"_d": {"encoding": "latin1", "name": "TEST"}}
144145 with pytest.raises(AttributeError):
145 getattr(cfg, 'foo')
146 setattr(cfg, 'name', 'NewTest')
147 assert getattr(cfg, 'name') == 'NewTest'
146 getattr(cfg, "foo")
147 setattr(cfg, "name", "NewTest")
148 assert getattr(cfg, "name") == "NewTest"
148149 with pytest.raises(ValueError):
149 setattr(cfg, 'name', 123)
150 setattr(cfg, '__sizeof__', 127)
151 assert getattr(cfg, '__sizeof__') == 127
150 setattr(cfg, "name", 123)
151 setattr(cfg, "__sizeof__", 127)
152 assert getattr(cfg, "__sizeof__") == 127
152153
153154
154155 def test_to_dict():
155 cfg = ServerConfig(name='TEST')
156 cfg = ServerConfig(name="TEST")
156157 _dict = cfg.to_dict()
157158 assert isinstance(_dict, dict)
158 assert _dict == {'encoding': 'latin1', 'name': 'TEST'}
159 assert _dict == {"encoding": "latin1", "name": "TEST"}
159160
160161
161162 def test_inbuilt_dir():
162 cfg = ServerConfig(name='TEST')
163 cfg = ServerConfig(name="TEST")
163164 valid_keys = ServerConfig._valid_keys
164165 list_attr = dir(cfg)
165166 assert isinstance(list_attr, list)
33
44
55 @pytest.mark.parametrize(
6 'exception_class', [drms.DrmsError, drms.DrmsQueryError, drms.DrmsExportError, drms.DrmsOperationNotSupported],
6 "exception_class",
7 [drms.DrmsError, drms.DrmsQueryError, drms.DrmsExportError, drms.DrmsOperationNotSupported],
78 )
89 def test_exception_class(exception_class):
910 with pytest.raises(RuntimeError):
55
66
77 @pytest.mark.parametrize(
8 'symbol',
8 "symbol",
99 [
10 'DrmsError',
11 'DrmsQueryError',
12 'DrmsExportError',
13 'DrmsOperationNotSupported',
14 'SeriesInfo',
15 'ExportRequest',
16 'Client',
17 'const',
18 'to_datetime',
10 "DrmsError",
11 "DrmsQueryError",
12 "DrmsExportError",
13 "DrmsOperationNotSupported",
14 "SeriesInfo",
15 "ExportRequest",
16 "Client",
17 "const",
18 "to_datetime",
1919 ],
2020 )
2121 def test_symbols(symbol):
2424
2525 def test_version():
2626 assert isinstance(drms.__version__, str)
27 version = drms.__version__.split('+')[0]
27 version = drms.__version__.split("+")[0]
2828 # Check to make sure it isn't empty
2929 assert version
3030 # To match the 0.6 in 0.6.dev3 or v0.6
31 m = re.match(r'v*\d+\.\d+\.*', version)
31 m = re.match(r"v*\d+\.\d+\.*", version)
3232 assert m is not None
3333
3434
3535 def test_bibtex():
3636 assert isinstance(drms.__citation__, str)
37 m = re.match(r'.*Glogowski2019.*', drms.__citation__)
37 m = re.match(r".*Glogowski2019.*", drms.__citation__)
3838 assert m is not None
55 def test_series_list_all(jsoc_client):
66 slist = jsoc_client.series()
77 assert isinstance(slist, list)
8 assert 'hmi.v_45s' in (s.lower() for s in slist)
9 assert 'hmi.m_720s' in (s.lower() for s in slist)
10 assert 'hmi.ic_720s' in (s.lower() for s in slist)
11 assert 'aia.lev1' in (s.lower() for s in slist)
12 assert 'aia.lev1_euv_12s' in (s.lower() for s in slist)
13 assert 'mdi.fd_v' in (s.lower() for s in slist)
8 assert "hmi.v_45s" in (s.lower() for s in slist)
9 assert "hmi.m_720s" in (s.lower() for s in slist)
10 assert "hmi.ic_720s" in (s.lower() for s in slist)
11 assert "aia.lev1" in (s.lower() for s in slist)
12 assert "aia.lev1_euv_12s" in (s.lower() for s in slist)
13 assert "mdi.fd_v" in (s.lower() for s in slist)
1414
1515
1616 @pytest.mark.jsoc
1717 @pytest.mark.remote_data
18 @pytest.mark.parametrize('schema', ['aia', 'hmi', 'mdi'])
18 @pytest.mark.parametrize("schema", ["aia", "hmi", "mdi"])
1919 def test_series_list_schemata(jsoc_client, schema):
20 regex = fr'{schema}\.'
20 regex = rf"{schema}\."
2121 slist = jsoc_client.series(regex)
2222 assert len(slist) > 0
2323 for sname in slist:
24 assert sname.startswith(f'{schema}.')
24 assert sname.startswith(f"{schema}.")
55
66 # Invalid email addresses used for testing
77 invalid_emails = [
8 'notregistered@example.com',
9 'not-valid',
8 "notregistered@example.com",
9 "not-valid",
1010 "",
1111 ]
1212
1313
1414 @pytest.mark.jsoc
1515 @pytest.mark.remote_data
16 @pytest.mark.parametrize('email', invalid_emails)
16 @pytest.mark.parametrize("email", invalid_emails)
1717 def test_email_invalid_check(email):
18 c = drms.Client('jsoc')
18 c = drms.Client("jsoc")
1919 assert not c.check_email(email)
2020
2121
2222 @pytest.mark.jsoc
2323 @pytest.mark.remote_data
24 @pytest.mark.parametrize('email', invalid_emails)
24 @pytest.mark.parametrize("email", invalid_emails)
2525 def test_email_invalid_set(email):
26 c = drms.Client('jsoc')
26 c = drms.Client("jsoc")
2727 with pytest.raises(ValueError):
2828 c.email = email
2929
3030
3131 @pytest.mark.jsoc
3232 @pytest.mark.remote_data
33 @pytest.mark.parametrize('email', invalid_emails)
33 @pytest.mark.parametrize("email", invalid_emails)
3434 def test_email_invalid_init(email):
3535 with pytest.raises(ValueError):
36 drms.Client('jsoc', email=email)
36 drms.Client("jsoc", email=email)
3737
3838
3939 @pytest.mark.jsoc
4040 @pytest.mark.export
4141 @pytest.mark.remote_data
4242 def test_email_cmdopt_check(email):
43 c = drms.Client('jsoc')
43 c = drms.Client("jsoc")
4444 assert c.check_email(email)
4545
4646
4848 @pytest.mark.export
4949 @pytest.mark.remote_data
5050 def test_email_cmdopt_set(email):
51 c = drms.Client('jsoc')
51 c = drms.Client("jsoc")
5252 c.email = email
5353 assert c.email == email
5454
5757 @pytest.mark.export
5858 @pytest.mark.remote_data
5959 def test_email_cmdopt_init(email):
60 c = drms.Client('jsoc', email=email)
60 c = drms.Client("jsoc", email=email)
6161 assert c.email == email
6262
6363
6464 def test_query_invalid():
65 cfg = ServerConfig(name='TEST')
65 cfg = ServerConfig(name="TEST")
6666 with pytest.raises(DrmsOperationNotSupported):
67 c = drms.Client(server=cfg, email='user@example.com')
67 c = drms.Client(server=cfg, email="user@example.com")
55 @pytest.mark.jsoc
66 @pytest.mark.export
77 @pytest.mark.remote_data
8 @pytest.mark.parametrize('method', ['url_quick', 'url'])
8 @pytest.mark.parametrize("method", ["url_quick", "url"])
99 def test_export_asis_basic(jsoc_client_export, method):
1010 r = jsoc_client_export.export(
11 'hmi.v_avg120[2150]{mean,power}', protocol='as-is', method=method, requestor=False,
11 "hmi.v_avg120[2150]{mean,power}",
12 protocol="as-is",
13 method=method,
14 requestor=False,
1215 )
1316
1417 assert isinstance(r, drms.ExportRequest)
1518 assert r.wait(timeout=60)
1619 assert r.has_succeeded()
17 assert r.protocol == 'as-is'
20 assert r.protocol == "as-is"
1821 assert len(r.urls) == 12 # 6 files per segment
1922
2023 for record in r.urls.record:
2124 record = record.lower()
22 assert record.startswith('hmi.v_avg120[2150]')
23 assert record.endswith('{mean}') or record.endswith('{power}')
25 assert record.startswith("hmi.v_avg120[2150]")
26 assert record.endswith("{mean}") or record.endswith("{power}")
2427
2528 for filename in r.urls.filename:
26 assert filename.endswith('mean.fits') or filename.endswith('power.fits')
29 assert filename.endswith("mean.fits") or filename.endswith("power.fits")
2730
2831 for url in r.urls.url:
29 assert url.endswith('mean.fits') or url.endswith('power.fits')
32 assert url.endswith("mean.fits") or url.endswith("power.fits")
3033
3134
3235 @pytest.mark.jsoc
3437 @pytest.mark.remote_data
3538 def test_export_fits_basic(jsoc_client_export):
3639 r = jsoc_client_export.export(
37 'hmi.sharp_720s[4864][2014.11.30_00:00_TAI]{continuum, magnetogram}',
38 protocol='fits',
39 method='url',
40 "hmi.sharp_720s[4864][2014.11.30_00:00_TAI]{continuum, magnetogram}",
41 protocol="fits",
42 method="url",
4043 requestor=False,
4144 )
4245
4346 assert isinstance(r, drms.ExportRequest)
4447 assert r.wait(timeout=60)
4548 assert r.has_succeeded()
46 assert r.protocol == 'fits'
49 assert r.protocol == "fits"
4750 assert len(r.urls) == 2 # 1 file per segment
4851
4952 for record in r.urls.record:
5053 record = record.lower()
51 assert record.startswith('hmi.sharp_720s[4864]')
52 assert record.endswith('2014.11.30_00:00:00_tai]')
54 assert record.startswith("hmi.sharp_720s[4864]")
55 assert record.endswith("2014.11.30_00:00:00_tai]")
5356
5457 for filename in r.urls.filename:
55 assert filename.endswith('continuum.fits') or filename.endswith('magnetogram.fits')
58 assert filename.endswith("continuum.fits") or filename.endswith("magnetogram.fits")
5659
5760 for url in r.urls.url:
58 assert url.endswith('continuum.fits') or url.endswith('magnetogram.fits')
61 assert url.endswith("continuum.fits") or url.endswith("magnetogram.fits")
5962
6063
6164 @pytest.mark.jsoc
6669 # NOTE: processing exports seem to fail silently on the server side if
6770 # the correct names/arguments are not passed. Not clear how to check
6871 # that this has not happened.
69 process = {'im_patch': {
70 't_ref': '2015-10-17T04:33:30.000',
71 't': 0,
72 'r': 0,
73 'c': 0,
74 'locunits': 'arcsec',
75 'boxunits': 'arcsec',
76 'x': -517.2,
77 'y': -246,
78 'width': 345.6,
79 'height': 345.6,
80 }}
72 process = {
73 "im_patch": {
74 "t_ref": "2015-10-17T04:33:30.000",
75 "t": 0,
76 "r": 0,
77 "c": 0,
78 "locunits": "arcsec",
79 "boxunits": "arcsec",
80 "x": -517.2,
81 "y": -246,
82 "width": 345.6,
83 "height": 345.6,
84 }
85 }
8186 req = jsoc_client_export.export(
82 'aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}',
83 method='url',
84 protocol='fits',
87 "aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}",
88 method="url",
89 protocol="fits",
8590 process=process,
8691 requestor=False,
8792 )
8994 assert isinstance(req, drms.ExportRequest)
9095 assert req.wait(timeout=60)
9196 assert req.has_succeeded()
92 assert req.protocol == 'fits'
97 assert req.protocol == "fits"
9398
9499 for record in req.urls.record:
95100 record = record.lower()
96 assert record.startswith('aia.lev1_euv_12s_mod')
101 assert record.startswith("aia.lev1_euv_12s_mod")
97102
98103 for filename in req.urls.filename:
99 assert filename.endswith('image.fits')
104 assert filename.endswith("image.fits")
100105
101106 for url in req.urls.url:
102 assert url.endswith('image.fits')
107 assert url.endswith("image.fits")
103108
104109
105110 @pytest.mark.jsoc
111116 # the correct names/arguments are not passed. Not clear how to check
112117 # that this has not happened.
113118 req = jsoc_client_export.export(
114 'hmi.M_720s[2020-10-17_22:12:00_TAI/24m]{magnetogram}',
115 method='url',
116 protocol='fits',
117 process={'rebin': {'method': 'boxcar', 'scale': 0.25}},
119 "hmi.M_720s[2020-10-17_22:12:00_TAI/24m]{magnetogram}",
120 method="url",
121 protocol="fits",
122 process={"rebin": {"method": "boxcar", "scale": 0.25}},
118123 requestor=False,
119124 )
120125
121126 assert isinstance(req, drms.ExportRequest)
122127 assert req.wait(timeout=60)
123128 assert req.has_succeeded()
124 assert req.protocol == 'fits'
129 assert req.protocol == "fits"
125130
126131 for record in req.urls.record:
127132 record = record.lower()
128 assert record.startswith('hmi.m_720s_mod')
133 assert record.startswith("hmi.m_720s_mod")
129134
130135 for filename in req.urls.filename:
131 assert filename.endswith('magnetogram.fits')
136 assert filename.endswith("magnetogram.fits")
132137
133138 for url in req.urls.url:
134 assert url.endswith('magnetogram.fits')
139 assert url.endswith("magnetogram.fits")
135140
136141
137142 @pytest.mark.jsoc
138143 @pytest.mark.export
139144 @pytest.mark.remote_data
140145 def test_export_invalid_process(jsoc_client_export):
141 with pytest.raises(ValueError, match='foobar is not one of the allowed processing options'):
146 with pytest.raises(ValueError, match="foobar is not one of the allowed processing options"):
142147 jsoc_client_export.export(
143 'aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}',
144 process={'foobar': {}}
148 "aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}", process={"foobar": {}}
145149 )
146150
147151
150154 @pytest.mark.remote_data
151155 def test_export_email(jsoc_client):
152156 with pytest.raises(ValueError):
153 jsoc_client.export('hmi.v_45s[2016.04.01_TAI/1d@6h]{Dopplergram}')
157 jsoc_client.export("hmi.v_45s[2016.04.01_TAI/1d@6h]{Dopplergram}")
33 @pytest.mark.jsoc
44 @pytest.mark.remote_data
55 @pytest.mark.parametrize(
6 'series, pkeys, segments',
6 "series, pkeys, segments",
77 [
8 ('hmi.v_45s', ['T_REC', 'CAMERA'], ['Dopplergram']),
9 ('hmi.m_720s', ['T_REC', 'CAMERA'], ['magnetogram']),
10 ('hmi.v_avg120', ['CarrRot', 'CMLon'], ['mean', 'power', 'valid', 'Log']),
8 ("hmi.v_45s", ["T_REC", "CAMERA"], ["Dopplergram"]),
9 ("hmi.m_720s", ["T_REC", "CAMERA"], ["magnetogram"]),
10 ("hmi.v_avg120", ["CarrRot", "CMLon"], ["mean", "power", "valid", "Log"]),
1111 ],
1212 )
1313 def test_series_info_basic(jsoc_client, series, pkeys, segments):
2323 @pytest.mark.jsoc
2424 @pytest.mark.remote_data
2525 @pytest.mark.parametrize(
26 'series, pkeys',
26 "series, pkeys",
2727 [
28 ('hmi.v_45s', ['T_REC', 'CAMERA']),
29 ('hmi.m_720s', ['T_REC', 'CAMERA']),
30 ('hmi.v_avg120', ['CarrRot', 'CMLon']),
31 ('aia.lev1', ['T_REC', 'FSN']),
32 ('aia.lev1_euv_12s', ['T_REC', 'WAVELNTH']),
33 ('aia.response', ['T_START', 'WAVE_STR']),
34 ('iris.lev1', ['T_OBS', 'FSN']),
35 ('mdi.fd_m_lev182', ['T_REC']),
28 ("hmi.v_45s", ["T_REC", "CAMERA"]),
29 ("hmi.m_720s", ["T_REC", "CAMERA"]),
30 ("hmi.v_avg120", ["CarrRot", "CMLon"]),
31 ("aia.lev1", ["T_REC", "FSN"]),
32 ("aia.lev1_euv_12s", ["T_REC", "WAVELNTH"]),
33 ("aia.response", ["T_START", "WAVE_STR"]),
34 ("iris.lev1", ["T_OBS", "FSN"]),
35 ("mdi.fd_m_lev182", ["T_REC"]),
3636 ],
3737 )
3838 def test_series_primekeys(jsoc_client, series, pkeys):
77 @pytest.mark.jsoc
88 @pytest.mark.remote_data
99 def test_query_basic(jsoc_client):
10 keys, segs = jsoc_client.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]', key='T_REC, CRLT_OBS', seg='Dopplergram')
10 keys, segs = jsoc_client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", key="T_REC, CRLT_OBS", seg="Dopplergram")
1111 assert len(keys) == 4
12 for k in ['T_REC', 'CRLT_OBS']:
12 for k in ["T_REC", "CRLT_OBS"]:
1313 assert k in keys.columns
1414 assert len(segs) == 4
15 assert 'Dopplergram' in segs.columns
15 assert "Dopplergram" in segs.columns
1616 assert ((keys.CRLT_OBS - 3.14159).abs() < 0.0001).all()
1717
1818
2020 @pytest.mark.remote_data
2121 def test_query_allmissing(jsoc_client):
2222 with pytest.raises(ValueError):
23 keys = jsoc_client.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]')
23 keys = jsoc_client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]")
2424
2525
2626 @pytest.mark.jsoc
2727 @pytest.mark.remote_data
2828 def test_query_key(jsoc_client):
29 keys = jsoc_client.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]', key='T_REC, CRLT_OBS')
29 keys = jsoc_client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", key="T_REC, CRLT_OBS")
3030 assert len(keys) == 4
31 for k in ['T_REC', 'CRLT_OBS']:
31 for k in ["T_REC", "CRLT_OBS"]:
3232 assert k in keys.columns
3333 assert ((keys.CRLT_OBS - 3.14159).abs() < 0.0001).all()
3434
3636 @pytest.mark.jsoc
3737 @pytest.mark.remote_data
3838 def test_query_seg(jsoc_client):
39 segs = jsoc_client.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]', seg='Dopplergram')
39 segs = jsoc_client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", seg="Dopplergram")
4040 assert len(segs) == 4
41 assert 'Dopplergram' in segs.columns
41 assert "Dopplergram" in segs.columns
4242
4343
4444 @pytest.mark.jsoc
4545 @pytest.mark.remote_data
4646 def test_query_link(jsoc_client):
47 links = jsoc_client.query('hmi.B_720s[2013.07.03_08:42_TAI/40m]', link='MDATA')
47 links = jsoc_client.query("hmi.B_720s[2013.07.03_08:42_TAI/40m]", link="MDATA")
4848 assert len(links) == 3
49 assert 'MDATA' in links.columns
49 assert "MDATA" in links.columns
5050
5151
5252 @pytest.mark.jsoc
5353 @pytest.mark.remote_data
5454 def test_query_seg_key_link(jsoc_client):
55 keys, segs, links = jsoc_client.query('hmi.B_720s[2013.07.03_08:42_TAI/40m]', key='foo', link='bar', seg='baz')
55 keys, segs, links = jsoc_client.query("hmi.B_720s[2013.07.03_08:42_TAI/40m]", key="foo", link="bar", seg="baz")
5656 assert len(keys) == 3
57 assert (keys.foo == 'Invalid KeyLink').all()
57 assert (keys.foo == "Invalid KeyLink").all()
5858 assert len(segs) == 3
59 assert (segs.baz == 'InvalidSegName').all()
59 assert (segs.baz == "InvalidSegName").all()
6060 assert len(links) == 3
61 assert (links.bar == 'Invalid_Link').all()
61 assert (links.bar == "Invalid_Link").all()
6262
6363
6464 @pytest.mark.jsoc
6565 @pytest.mark.remote_data
6666 def test_query_pkeys(jsoc_client):
67 keys = jsoc_client.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]', pkeys=True)
67 keys = jsoc_client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", pkeys=True)
6868 pkeys = list(keys.columns.values)
69 assert pkeys == jsoc_client.pkeys('hmi.v_45s[2013.07.03_08:42_TAI/3m]')
69 assert pkeys == jsoc_client.pkeys("hmi.v_45s[2013.07.03_08:42_TAI/3m]")
7070 assert len(keys) == 4
7171
7272
7373 @pytest.mark.jsoc
7474 @pytest.mark.remote_data
7575 def test_query_recindex(jsoc_client):
76 keys = jsoc_client.query('hmi.V_45s[2013.07.03_08:42_TAI/4m]', key='T_REC', rec_index=True)
76 keys = jsoc_client.query("hmi.V_45s[2013.07.03_08:42_TAI/4m]", key="T_REC", rec_index=True)
7777 index_values = list(keys.index.values)
78 assert all(['hmi.V_45s' in s for s in index_values])
78 assert all(["hmi.V_45s" in s for s in index_values])
7979 assert len(keys) == 5
8080
8181
8282 def test_query_invalid_server():
83 cfg = ServerConfig(name='TEST')
83 cfg = ServerConfig(name="TEST")
8484 c = drms.Client(server=cfg)
8585 with pytest.raises(DrmsOperationNotSupported):
86 keys = c.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]', pkeys=True)
86 keys = c.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", pkeys=True)
8787
8888
8989 @pytest.mark.jsoc
9090 @pytest.mark.remote_data
9191 def test_query_invalid_series(jsoc_client):
9292 with pytest.raises(DrmsQueryError):
93 keys = jsoc_client.query('foo', key='T_REC')
93 keys = jsoc_client.query("foo", key="T_REC")
94
95
96 @pytest.mark.remote_data
97 def test_query_hexadecimal_strings():
98 # Exercise the part of client.py that deals with hexadecimal strings
99 c = drms.Client()
100 c.query("hmi.v_45s[2014.01.01_00:00:35_TAI-2014.01.01_01:00:35_TAI]", key="**ALL**")
77 def test_series_list_all(kis_client):
88 slist = kis_client.series()
99 assert isinstance(slist, list)
10 assert 'hmi.v_45s' in (s.lower() for s in slist)
11 assert 'hmi.m_720s' in (s.lower() for s in slist)
12 assert 'hmi.ic_720s' in (s.lower() for s in slist)
13 assert 'mdi.fd_v' in (s.lower() for s in slist)
10 assert "hmi.v_45s" in (s.lower() for s in slist)
11 assert "hmi.m_720s" in (s.lower() for s in slist)
12 assert "hmi.ic_720s" in (s.lower() for s in slist)
13 assert "mdi.fd_v" in (s.lower() for s in slist)
1414
1515
1616 @pytest.mark.kis
1717 @pytest.mark.remote_data
18 @pytest.mark.parametrize('schema', ['hmi', 'mdi'])
18 @pytest.mark.parametrize("schema", ["hmi", "mdi"])
1919 def test_series_list_schemata(kis_client, schema):
20 regex = fr'{schema}\.'
20 regex = rf"{schema}\."
2121 slist = kis_client.series(regex)
2222 assert len(slist) > 0
2323 for sname in slist:
24 assert sname.startswith(f'{schema}.')
24 assert sname.startswith(f"{schema}.")
2525
2626
2727 @pytest.mark.kis
2828 @pytest.mark.remote_data
2929 @pytest.mark.parametrize(
30 'series, pkeys, segments',
30 "series, pkeys, segments",
3131 [
32 ('hmi.v_45s', ['T_REC', 'CAMERA'], ['Dopplergram']),
33 ('hmi.m_720s', ['T_REC', 'CAMERA'], ['magnetogram']),
34 ('hmi.v_avg120', ['CarrRot', 'CMLon'], ['mean', 'power', 'valid', 'Log']),
32 ("hmi.v_45s", ["T_REC", "CAMERA"], ["Dopplergram"]),
33 ("hmi.m_720s", ["T_REC", "CAMERA"], ["magnetogram"]),
34 ("hmi.v_avg120", ["CarrRot", "CMLon"], ["mean", "power", "valid", "Log"]),
3535 ],
3636 )
3737 def test_series_info_basic(kis_client, series, pkeys, segments):
4747 @pytest.mark.kis
4848 @pytest.mark.remote_data
4949 def test_query_basic(kis_client):
50 keys, segs = kis_client.query('hmi.v_45s[2013.07.03_08:42_TAI/3m]', key='T_REC, CRLT_OBS', seg='Dopplergram')
50 keys, segs = kis_client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", key="T_REC, CRLT_OBS", seg="Dopplergram")
5151 assert len(keys) == 4
52 for k in ['T_REC', 'CRLT_OBS']:
52 for k in ["T_REC", "CRLT_OBS"]:
5353 assert k in keys.columns
5454 assert len(segs) == 4
55 assert 'Dopplergram' in segs.columns
55 assert "Dopplergram" in segs.columns
5656 assert ((keys.CRLT_OBS - 3.14159).abs() < 0.0001).all()
5757
5858
6060 @pytest.mark.remote_data
6161 def test_not_supported_email(kis_client):
6262 with pytest.raises(drms.DrmsOperationNotSupported):
63 kis_client.email = 'name@example.com'
63 kis_client.email = "name@example.com"
6464
6565
6666 @pytest.mark.kis
6767 @pytest.mark.remote_data
6868 def test_not_supported_export(kis_client):
6969 with pytest.raises(drms.DrmsOperationNotSupported):
70 kis_client.export('hmi.v_45s[2010.05.01_TAI]')
70 kis_client.export("hmi.v_45s[2010.05.01_TAI]")
7171 with pytest.raises(drms.DrmsOperationNotSupported):
72 kis_client.export_from_id('KIS_20120101_123')
72 kis_client.export_from_id("KIS_20120101_123")
1010
1111
1212 def test_debug():
13 helper(['--debug'], 'debug', True)
14 helper([], 'debug', False)
13 helper(["--debug"], "debug", True)
14 helper([], "debug", False)
1515
1616
1717 def test_verbose():
18 helper(['--verbose'], 'verbose', True)
19 helper([], 'verbose', False)
18 helper(["--verbose"], "verbose", True)
19 helper([], "verbose", False)
2020
2121
2222 def test_version():
2323 with pytest.raises(SystemExit):
24 helper(['--version'], 'version', True)
24 helper(["--version"], "version", True)
2525
2626
2727 def test_server():
28 helper(['fake_url_server'], 'server', 'fake_url_server')
29 helper([], 'server', 'jsoc')
28 helper(["fake_url_server"], "server", "fake_url_server")
29 helper([], "server", "jsoc")
3030
3131
3232 def test_email():
33 helper(['--email', 'fake@gmail.com'], 'email', 'fake@gmail.com')
34 helper([], 'email', None)
33 helper(["--email", "fake@gmail.com"], "email", "fake@gmail.com")
34 helper([], "email", None)
3535
3636
3737 def test_main_empty():
3838 sys.argv = [
39 'drms',
39 "drms",
4040 ]
4141 main()
77 def test_parse_keywords():
88 info = [
99 {
10 'recscope': 'variable',
11 'units': 'none',
12 'name': 'cparms_sg000',
13 'defval': 'compress Rice',
14 'note': '',
15 'type': 'string',
10 "recscope": "variable",
11 "units": "none",
12 "name": "cparms_sg000",
13 "defval": "compress Rice",
14 "note": "",
15 "type": "string",
1616 },
1717 {
18 'recscope': 'variable',
19 'units': 'none',
20 'name': 'mean_bzero',
21 'defval': '0',
22 'note': '',
23 'type': 'double',
18 "recscope": "variable",
19 "units": "none",
20 "name": "mean_bzero",
21 "defval": "0",
22 "note": "",
23 "type": "double",
2424 },
2525 {
26 'recscope': 'variable',
27 'units': 'none',
28 'name': 'mean_bscale',
29 'defval': '0.25',
30 'note': '',
31 'type': 'double',
26 "recscope": "variable",
27 "units": "none",
28 "name": "mean_bscale",
29 "defval": "0.25",
30 "note": "",
31 "type": "double",
3232 },
3333 {
34 'recscope': 'variable',
35 'units': 'TAI',
36 'name': 'MidTime',
37 'defval': '-4712.01.01_11:59_TAI',
38 'note': 'Midpoint of averaging interval',
39 'type': 'time',
34 "recscope": "variable",
35 "units": "TAI",
36 "name": "MidTime",
37 "defval": "-4712.01.01_11:59_TAI",
38 "note": "Midpoint of averaging interval",
39 "type": "time",
4040 },
4141 ]
4242 exp = OrderedDict(
4343 [
44 ('name', ['cparms_sg000', 'mean_bzero', 'mean_bscale', 'MidTime']),
45 ('type', ['string', 'double', 'double', 'time']),
46 ('recscope', ['variable', 'variable', 'variable', 'variable']),
47 ('defval', ['compress Rice', '0', '0.25', '-4712.01.01_11:59_TAI']),
48 ('units', ['none', 'none', 'none', 'TAI']),
49 ('note', ['', '', '', 'Midpoint of averaging interval']),
50 ('linkinfo', [None, None, None, None]),
51 ('is_time', [False, False, False, True]),
52 ('is_integer', [False, False, False, False]),
53 ('is_real', [False, True, True, False]),
54 ('is_numeric', [False, True, True, False]),
44 ("name", ["cparms_sg000", "mean_bzero", "mean_bscale", "MidTime"]),
45 ("type", ["string", "double", "double", "time"]),
46 ("recscope", ["variable", "variable", "variable", "variable"]),
47 ("defval", ["compress Rice", "0", "0.25", "-4712.01.01_11:59_TAI"]),
48 ("units", ["none", "none", "none", "TAI"]),
49 ("note", ["", "", "", "Midpoint of averaging interval"]),
50 ("linkinfo", [None, None, None, None]),
51 ("is_time", [False, False, False, True]),
52 ("is_integer", [False, False, False, False]),
53 ("is_real", [False, True, True, False]),
54 ("is_numeric", [False, True, True, False]),
5555 ]
5656 )
5757
5858 exp = pd.DataFrame(data=exp)
59 exp.index = exp.pop('name')
59 exp.index = exp.pop("name")
6060 assert drms.SeriesInfo._parse_keywords(info).equals(exp)
6161
6262
6363 def test_parse_links():
6464 links = [
65 {'name': 'BHARP', 'kind': 'DYNAMIC', 'note': 'Bharp', 'target': 'hmi.Bharp_720s'},
66 {'name': 'MHARP', 'kind': 'DYNAMIC', 'note': 'Mharp', 'target': 'hmi.Mharp_720s'},
65 {"name": "BHARP", "kind": "DYNAMIC", "note": "Bharp", "target": "hmi.Bharp_720s"},
66 {"name": "MHARP", "kind": "DYNAMIC", "note": "Mharp", "target": "hmi.Mharp_720s"},
6767 ]
6868 exp = OrderedDict(
6969 [
70 ('name', ['BHARP', 'MHARP']),
71 ('target', ['hmi.Bharp_720s', 'hmi.Mharp_720s']),
72 ('kind', ['DYNAMIC', 'DYNAMIC']),
73 ('note', ['Bharp', 'Mharp']),
70 ("name", ["BHARP", "MHARP"]),
71 ("target", ["hmi.Bharp_720s", "hmi.Mharp_720s"]),
72 ("kind", ["DYNAMIC", "DYNAMIC"]),
73 ("note", ["Bharp", "Mharp"]),
7474 ]
7575 )
7676 exp = pd.DataFrame(data=exp)
77 exp.index = exp.pop('name')
77 exp.index = exp.pop("name")
7878 assert drms.SeriesInfo._parse_links(links).equals(exp)
7979
8080
8181 def test_parse_segments():
8282 segments = [
8383 {
84 'type': 'int',
85 'dims': 'VARxVAR',
86 'units': 'Gauss',
87 'protocol': 'fits',
88 'note': 'magnetogram',
89 'name': 'magnetogram',
84 "type": "int",
85 "dims": "VARxVAR",
86 "units": "Gauss",
87 "protocol": "fits",
88 "note": "magnetogram",
89 "name": "magnetogram",
9090 },
9191 {
92 'type': 'char',
93 'dims': 'VARxVAR',
94 'units': 'Enumerated',
95 'protocol': 'fits',
96 'note': 'Mask for the patch',
97 'name': 'bitmap',
92 "type": "char",
93 "dims": "VARxVAR",
94 "units": "Enumerated",
95 "protocol": "fits",
96 "note": "Mask for the patch",
97 "name": "bitmap",
9898 },
9999 {
100 'type': 'int',
101 'dims': 'VARxVAR',
102 'units': 'm/s',
103 'protocol': 'fits',
104 'note': 'Dopplergram',
105 'name': 'Dopplergram',
100 "type": "int",
101 "dims": "VARxVAR",
102 "units": "m/s",
103 "protocol": "fits",
104 "note": "Dopplergram",
105 "name": "Dopplergram",
106106 },
107107 ]
108108 exp = OrderedDict(
109109 [
110 ('name', ['magnetogram', 'bitmap', 'Dopplergram']),
111 ('type', ['int', 'char', 'int']),
112 ('units', ['Gauss', 'Enumerated', 'm/s']),
113 ('protocol', ['fits', 'fits', 'fits']),
114 ('dims', ['VARxVAR', 'VARxVAR', 'VARxVAR']),
115 ('note', ['magnetogram', 'Mask for the patch', 'Dopplergram']),
110 ("name", ["magnetogram", "bitmap", "Dopplergram"]),
111 ("type", ["int", "char", "int"]),
112 ("units", ["Gauss", "Enumerated", "m/s"]),
113 ("protocol", ["fits", "fits", "fits"]),
114 ("dims", ["VARxVAR", "VARxVAR", "VARxVAR"]),
115 ("note", ["magnetogram", "Mask for the patch", "Dopplergram"]),
116116 ]
117117 )
118118
119119 exp = pd.DataFrame(data=exp)
120 exp.index = exp.pop('name')
120 exp.index = exp.pop("name")
121121 assert drms.SeriesInfo._parse_segments(segments).equals(exp)
122122
123123
124124 def test_repr():
125125 info = {
126 'primekeys': ['CarrRot', 'CMLon'],
127 'retention': 1800,
128 'tapegroup': 1,
129 'archive': 1,
130 'primekeysinfo': [],
131 'unitsize': 1,
132 'note': 'Temporal averages of HMI Vgrams over 1/3 CR',
133 'dbindex': [None],
134 'links': [],
135 'segments': [],
136 'keywords': [],
126 "primekeys": ["CarrRot", "CMLon"],
127 "retention": 1800,
128 "tapegroup": 1,
129 "archive": 1,
130 "primekeysinfo": [],
131 "unitsize": 1,
132 "note": "Temporal averages of HMI Vgrams over 1/3 CR",
133 "dbindex": [None],
134 "links": [],
135 "segments": [],
136 "keywords": [],
137137 }
138 assert repr(drms.SeriesInfo(info)) == '<SeriesInfo>'
139 assert repr(drms.SeriesInfo(info, 'hmi')) == '<SeriesInfo: hmi>'
138 assert repr(drms.SeriesInfo(info)) == "<SeriesInfo>"
139 assert repr(drms.SeriesInfo(info, "hmi")) == "<SeriesInfo: hmi>"
44 import drms
55
66 data_tai = [
7 ('2010.05.01_TAI', pd.Timestamp('2010-05-01 00:00:00')),
8 ('2010.05.01_00:00_TAI', pd.Timestamp('2010-05-01 00:00:00')),
9 ('2010.05.01_00:00:00_TAI', pd.Timestamp('2010-05-01 00:00:00')),
10 ('2010.05.01_01:23:45_TAI', pd.Timestamp('2010-05-01 01:23:45')),
11 ('2013.12.21_23:32_TAI', pd.Timestamp('2013-12-21 23:32:00')),
12 ('2013.12.21_23:32:34_TAI', pd.Timestamp('2013-12-21 23:32:34')),
7 ("2010.05.01_TAI", pd.Timestamp("2010-05-01 00:00:00")),
8 ("2010.05.01_00:00_TAI", pd.Timestamp("2010-05-01 00:00:00")),
9 ("2010.05.01_00:00:00_TAI", pd.Timestamp("2010-05-01 00:00:00")),
10 ("2010.05.01_01:23:45_TAI", pd.Timestamp("2010-05-01 01:23:45")),
11 ("2013.12.21_23:32_TAI", pd.Timestamp("2013-12-21 23:32:00")),
12 ("2013.12.21_23:32:34_TAI", pd.Timestamp("2013-12-21 23:32:34")),
1313 ]
1414 data_tai_in = [data[0] for data in data_tai]
1515 data_tai_out = pd.Series([data[1] for data in data_tai])
1616
1717
18 @pytest.mark.parametrize('time_string, expected', data_tai)
18 @pytest.mark.parametrize("time_string, expected", data_tai)
1919 def test_tai_string(time_string, expected):
2020 assert drms.to_datetime(time_string) == expected
2121
2222
2323 @pytest.mark.parametrize(
24 'time_string, expected',
24 "time_string, expected",
2525 [
26 ('2010-05-01T00:00Z', pd.Timestamp('2010-05-01 00:00:00')),
27 ('2010-05-01T00:00:00Z', pd.Timestamp('2010-05-01 00:00:00')),
28 ('2010-05-01T01:23:45Z', pd.Timestamp('2010-05-01 01:23:45')),
29 ('2013-12-21T23:32Z', pd.Timestamp('2013-12-21 23:32:00')),
30 ('2013-12-21T23:32:34Z', pd.Timestamp('2013-12-21 23:32:34')),
31 ('2010-05-01 00:00Z', pd.Timestamp('2010-05-01 00:00:00')),
32 ('2010-05-01 00:00:00Z', pd.Timestamp('2010-05-01 00:00:00')),
33 ('2010-05-01 01:23:45Z', pd.Timestamp('2010-05-01 01:23:45')),
34 ('2013-12-21 23:32Z', pd.Timestamp('2013-12-21 23:32:00')),
35 ('2013-12-21 23:32:34Z', pd.Timestamp('2013-12-21 23:32:34')),
26 ("2010-05-01T00:00Z", pd.Timestamp("2010-05-01 00:00:00")),
27 ("2010-05-01T00:00:00Z", pd.Timestamp("2010-05-01 00:00:00")),
28 ("2010-05-01T01:23:45Z", pd.Timestamp("2010-05-01 01:23:45")),
29 ("2013-12-21T23:32Z", pd.Timestamp("2013-12-21 23:32:00")),
30 ("2013-12-21T23:32:34Z", pd.Timestamp("2013-12-21 23:32:34")),
31 ("2010-05-01 00:00Z", pd.Timestamp("2010-05-01 00:00:00")),
32 ("2010-05-01 00:00:00Z", pd.Timestamp("2010-05-01 00:00:00")),
33 ("2010-05-01 01:23:45Z", pd.Timestamp("2010-05-01 01:23:45")),
34 ("2013-12-21 23:32Z", pd.Timestamp("2013-12-21 23:32:00")),
35 ("2013-12-21 23:32:34Z", pd.Timestamp("2013-12-21 23:32:34")),
3636 ],
3737 )
3838 def test_z_string(time_string, expected):
3939 assert drms.to_datetime(time_string) == expected
4040
4141
42 @pytest.mark.xfail(reason='pandas does not support leap seconds')
42 @pytest.mark.xfail(reason="pandas does not support leap seconds")
4343 @pytest.mark.parametrize(
44 'time_string, expected',
44 "time_string, expected",
4545 [
46 ('2012-06-30T23:59:60Z', '2012-06-30 23:59:60'),
47 ('2015-06-30T23:59:60Z', '2015-06-30 23:59:60'),
48 ('2016-12-31T23:59:60Z', '2016-12-31 23:59:60'),
46 ("2012-06-30T23:59:60Z", "2012-06-30 23:59:60"),
47 ("2015-06-30T23:59:60Z", "2015-06-30 23:59:60"),
48 ("2016-12-31T23:59:60Z", "2016-12-31 23:59:60"),
4949 ],
5050 )
5151 def test_z_leap_string(time_string, expected):
5353
5454
5555 @pytest.mark.parametrize(
56 'time_string, expected',
56 "time_string, expected",
5757 [
58 ('2013.12.21_23:32:34_TAI', pd.Timestamp('2013-12-21 23:32:34')),
59 ('2013.12.21_23:32:34_UTC', pd.Timestamp('2013-12-21 23:32:34')),
60 ('2013.12.21_23:32:34Z', pd.Timestamp('2013-12-21 23:32:34')),
58 ("2013.12.21_23:32:34_TAI", pd.Timestamp("2013-12-21 23:32:34")),
59 ("2013.12.21_23:32:34_UTC", pd.Timestamp("2013-12-21 23:32:34")),
60 ("2013.12.21_23:32:34Z", pd.Timestamp("2013-12-21 23:32:34")),
6161 ],
6262 )
6363 def test_force_string(time_string, expected):
6565
6666
6767 @pytest.mark.parametrize(
68 'time_series, expected',
68 "time_series, expected",
6969 [
7070 (data_tai_in, data_tai_out),
7171 (pd.Series(data_tai_in), data_tai_out),
7878
7979
8080 data_invalid = [
81 ('2010.05.01_TAI', False),
82 ('2010.05.01_00:00_TAI', False),
83 ('', True),
84 ('1600', True),
85 ('foo', True),
86 ('2013.12.21_23:32:34_TAI', False),
81 ("2010.05.01_TAI", False),
82 ("2010.05.01_00:00_TAI", False),
83 ("", True),
84 ("1600", True),
85 ("foo", True),
86 ("2013.12.21_23:32:34_TAI", False),
8787 ]
8888 data_invalid_in = [data[0] for data in data_invalid]
8989 data_invalid_out = pd.Series([data[1] for data in data_invalid])
9090
9191
92 @pytest.mark.parametrize('time_string, expected', data_invalid)
92 @pytest.mark.parametrize("time_string, expected", data_invalid)
9393 def test_corner_case(time_string, expected):
9494 assert pd.isnull(drms.to_datetime(time_string)) == expected
9595 assert isinstance(drms.to_datetime([]), pd.Series)
9797
9898
9999 @pytest.mark.parametrize(
100 'time_series, expected',
100 "time_series, expected",
101101 [
102102 (data_invalid_in, data_invalid_out),
103103 (pd.Series(data_invalid_in), data_invalid_out),
44
55
66 @pytest.mark.parametrize(
7 'in_obj, expected',
7 "in_obj, expected",
88 [
99 ("", []),
10 ('asd', ['asd']),
11 ('aa,bb,cc', ['aa', 'bb', 'cc']),
12 ('aa, bb, cc', ['aa', 'bb', 'cc']),
13 (' aa,bb, cc, dd', ['aa', 'bb', 'cc', 'dd']),
14 ('aa,\tbb,cc, dd ', ['aa', 'bb', 'cc', 'dd']),
15 ('aa,\tbb,cc, dd ', ['aa', 'bb', 'cc', 'dd']),
10 ("asd", ["asd"]),
11 ("aa,bb,cc", ["aa", "bb", "cc"]),
12 ("aa, bb, cc", ["aa", "bb", "cc"]),
13 (" aa,bb, cc, dd", ["aa", "bb", "cc", "dd"]),
14 ("aa,\tbb,cc, dd ", ["aa", "bb", "cc", "dd"]),
15 ("aa,\tbb,cc, dd ", ["aa", "bb", "cc", "dd"]),
1616 ([], []),
17 (['a', 'b', 'c'], ['a', 'b', 'c']),
18 (('a', 'b', 'c'), ['a', 'b', 'c']),
17 (["a", "b", "c"], ["a", "b", "c"]),
18 (("a", "b", "c"), ["a", "b", "c"]),
1919 ],
2020 )
2121 def test_split_arg(in_obj, expected):
2626
2727
2828 @pytest.mark.parametrize(
29 'ds_string, expected',
29 "ds_string, expected",
3030 [
31 ('hmi.v_45s', 'hmi.v_45s'),
32 ('hmi.v_45s[2010.05.01_TAI]', 'hmi.v_45s'),
33 ('hmi.v_45s[2010.05.01_TAI/365d@1d]', 'hmi.v_45s'),
34 ('hmi.v_45s[2010.05.01_TAI/365d@1d][?QUALITY>=0?]', 'hmi.v_45s'),
35 ('hmi.v_45s[2010.05.01_TAI/1d@6h]{Dopplergram}', 'hmi.v_45s'),
31 ("hmi.v_45s", "hmi.v_45s"),
32 ("hmi.v_45s[2010.05.01_TAI]", "hmi.v_45s"),
33 ("hmi.v_45s[2010.05.01_TAI/365d@1d]", "hmi.v_45s"),
34 ("hmi.v_45s[2010.05.01_TAI/365d@1d][?QUALITY>=0?]", "hmi.v_45s"),
35 ("hmi.v_45s[2010.05.01_TAI/1d@6h]{Dopplergram}", "hmi.v_45s"),
3636 ],
3737 )
3838 def test_extract_series(ds_string, expected):
4040
4141
4242 @pytest.mark.parametrize(
43 'arg, exp',
43 "arg, exp",
4444 [
45 (pd.Series(['1.0', '2', -3]), pd.Series([1.0, 2.0, -3.0])),
46 (pd.Series(['1.0', 'apple', -3]), pd.Series([1.0, float('nan'), -3.0])),
45 (pd.Series(["1.0", "2", -3]), pd.Series([1.0, 2.0, -3.0])),
46 (pd.Series(["1.0", "apple", -3]), pd.Series([1.0, float("nan"), -3.0])),
4747 ],
4848 )
4949 def test_pd_to_numeric_coerce(arg, exp):
5151
5252
5353 @pytest.mark.parametrize(
54 'arg, exp',
54 "arg, exp",
5555 [
5656 (
57 pd.Series(['2016-04-01 00:00:00', '2016-04-01 06:00:00']),
58 pd.Series([pd.Timestamp('2016-04-01 00:00:00'), pd.Timestamp('2016-04-01 06:00:00')]),
57 pd.Series(["2016-04-01 00:00:00", "2016-04-01 06:00:00"]),
58 pd.Series([pd.Timestamp("2016-04-01 00:00:00"), pd.Timestamp("2016-04-01 06:00:00")]),
5959 )
6060 ],
6161 )
22 import numpy as np
33 import pandas as pd
44
5 __all__ = ['to_datetime']
5 __all__ = ["to_datetime"]
66
77
88 def _pd_to_datetime_coerce(arg):
9 return pd.to_datetime(arg, errors='coerce')
9 return pd.to_datetime(arg, errors="coerce")
1010
1111
1212 def _pd_to_numeric_coerce(arg):
13 return pd.to_numeric(arg, errors='coerce')
13 return pd.to_numeric(arg, errors="coerce")
1414
1515
1616 def _split_arg(arg):
1818 Split a comma-separated string into a list.
1919 """
2020 if isinstance(arg, str):
21 arg = [it for it in re.split(r'[\s,]+', arg) if it]
21 arg = [it for it in re.split(r"[\s,]+", arg) if it]
2222 return arg
2323
2424
2626 """
2727 Extract series name from record set.
2828 """
29 m = re.match(r'^\s*([\w\.]+).*$', ds)
29 m = re.match(r"^\s*([\w\.]+).*$", ds)
3030 return m.group(1) if m is not None else None
3131
3232
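For reference, the expected behaviour of these two helpers, matching the parametrized tests earlier in this diff (the ``drms.utils`` import path is an assumption; the file name is not visible in this hunk):

.. code :: python

    from drms.utils import _split_arg, _extract_series  # assumed module path

    # Comma/whitespace separated keyword lists are normalized into plain lists.
    _split_arg("aa, bb, cc")  # -> ["aa", "bb", "cc"]

    # Only the leading series name is kept from a full record-set string.
    _extract_series("hmi.v_45s[2010.05.01_TAI/1d@6h]{Dopplergram}")  # -> "hmi.v_45s"
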
6464 Pandas series or a single Timestamp object.
6565 """
6666 s = pd.Series(tstr, dtype=object).astype(str)
67 if force or s.str.endswith('_TAI').any():
68 s = s.str.replace('_TAI', "")
69 s = s.str.replace('_', ' ')
70 s = s.str.replace('.', '-', regex=True, n=2)
67 if force or s.str.endswith("_TAI").any():
68 s = s.str.replace("_TAI", "")
69 s = s.str.replace("_", " ")
70 s = s.str.replace(".", "-", regex=True, n=2)
7171 res = _pd_to_datetime_coerce(s)
7272 res = res.dt.tz_localize(None)
7373 return res.iloc[0] if (len(res) == 1) and np.isscalar(tstr) else res
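
A brief usage sketch of ``drms.to_datetime``; the input/output pairs follow the TAI test data shown earlier in this diff:

.. code :: python

    import drms

    # A single TAI time string comes back as a pandas Timestamp;
    # the "_TAI" suffix is stripped and the separators normalized.
    drms.to_datetime("2013.12.21_23:32:34_TAI")  # -> Timestamp('2013-12-21 23:32:34')

    # A list (or pandas Series) of strings returns a Series of Timestamps.
    drms.to_datetime(["2010.05.01_TAI", "2010.05.01_01:23:45_TAI"])
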
88 except Exception:
99 import warnings
1010
11 warnings.warn(
12 f'could not determine {__name__.split(".")[0]} package version; this indicates a broken installation'
13 )
11 warnings.warn(f'could not determine {__name__.split(".")[0]} package version; this indicates a broken installation')
1412 del warnings
1513
16 version = '0.0.0'
14 version = "0.0.0"
1715
1816
1917 # We use LooseVersion to define major, minor, micro, but ignore any suffixes.
3634
3735 del split_version # clean up namespace.
3836
39 release = 'dev' not in version
37 release = "dev" not in version
00 Metadata-Version: 2.1
11 Name: drms
2 Version: 0.6.2
3 Summary: "Access HMI, AIA and MDI data with Python from the public JSOC DRMS server"
2 Version: 0.6.3
3 Summary: Access HMI, AIA and MDI data with Python from the Stanford JSOC DRMS
44 Home-page: https://sunpy.org
55 Author: The SunPy Community
66 Author-email: sunpy@googlegroups.com
77 License: BSD 2-Clause
8 Description: ====
9 drms
10 ====
11
12 `Docs <https://docs.sunpy.org/projects/drms/>`__ |
13 `Tutorial <https://docs.sunpy.org/projects/drms/en/latest/tutorial.html>`__ |
14 `Github <https://github.com/sunpy/drms>`__ |
15 `PyPI <https://pypi.python.org/pypi/drms>`__
16
17 |JOSS| |Zenodo|
18
19 .. |JOSS| image:: https://joss.theoj.org/papers/10.21105/joss.01614/status.svg
20 :target: https://doi.org/10.21105/joss.01614
21 .. |Zenodo| image:: https://zenodo.org/badge/58651845.svg
22 :target: https://zenodo.org/badge/latestdoi/58651845
23
24 The ``drms`` module provides an easy-to-use interface for accessing HMI, AIA and MDI data with Python.
25 It uses the publicly accessible `JSOC <http://jsoc.stanford.edu/>`__ DRMS server by default, but can also be used with local `NetDRMS <http://jsoc.stanford.edu/netdrms/>`__ sites.
26 More information, including a detailed tutorial, is available in the `Documentation <https://docs.sunpy.org/projects/drms/>`__.
27
28 Getting Help
29 ------------
30
31 This is a SunPy-affiliated package. For more information or to ask questions
32 about drms or SunPy, check out:
33
34 - `drms Documentation`_
35 - `SunPy Matrix Channel`_
36 - `SunPy mailing list`_
37
38 .. _drms Documentation: https://docs.sunpy.org/projects/drms/en/latest/
39 .. _SunPy Matrix Channel: https://riot.im/app/#/room/#sunpy:matrix.org
40
41 Contributing
42 ------------
43
44 If you would like to get involved, start by joining the `SunPy mailing list`_ and check out the `Developers Guide`_ section of the SunPy docs.
45 Stop by our chat room `#sunpy:matrix.org`_ if you have any questions.
46 Help is always welcome so let us know what you like to work on, or check out the `issues page`_ for the list of known outstanding items.
47
48 For more information on contributing to SunPy and to DRMS, please read our `Newcomers guide`_.
49
50 .. _SunPy mailing list: https://groups.google.com/forum/#!forum/sunpy
51 .. _Developers Guide: https://docs.sunpy.org/en/latest/dev_guide/index.html
52 .. _`#sunpy:matrix.org`: https://app.element.io/#/room/#sunpy:openastronomy.org
53 .. _issues page: https://github.com/sunpy/drms/issues
54 .. _Newcomers guide: https://docs.sunpy.org/en/latest/dev_guide/newcomers.html
55
56 Code of Conduct (CoC)
57 ---------------------
58
59 When you are interacting with the SunPy community you are asked to follow our `code of conduct`_.
60
61 .. _code of conduct: https://docs.sunpy.org/en/latest/code_of_conduct.html
62
63 Citation
64 --------
65
66 If you use ``drms`` in your work, please consider citing our `paper`_.
67
68 .. code :: bibtex
69
70 @article{Glogowski2019,
71 doi = {10.21105/joss.01614},
72 url = {https://doi.org/10.21105/joss.01614},
73 year = {2019},
74 publisher = {The Open Journal},
75 volume = {4},
76 number = {40},
77 pages = {1614},
78 author = {Kolja Glogowski and Monica G. Bobra and Nitin Choudhary and Arthur B. Amezcua and Stuart J. Mumford},
79 title = {drms: A Python package for accessing HMI and AIA data},
80 journal = {Journal of Open Source Software}
81 }
82
83 .. _paper: https://doi.org/10.21105/joss.01614
84
85 Acknowledgements
86 ----------------
87
88 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
89
908 Keywords: solar physics,solar,science,data
919 Platform: any
9210 Classifier: Development Status :: 5 - Production/Stable
9614 Classifier: Operating System :: OS Independent
9715 Classifier: Programming Language :: Python
9816 Classifier: Programming Language :: Python :: 3
99 Classifier: Programming Language :: Python :: 3.7
10017 Classifier: Programming Language :: Python :: 3.8
10118 Classifier: Programming Language :: Python :: 3.9
19 Classifier: Programming Language :: Python :: 3.10
10220 Classifier: Topic :: Scientific/Engineering :: Astronomy
10321 Classifier: Topic :: Scientific/Engineering :: Physics
10422 Provides: drms
105 Requires-Python: >=3.7
23 Requires-Python: >=3.8
10624 Description-Content-Type: text/x-rst
10725 Provides-Extra: tests
10826 Provides-Extra: docs
10927 Provides-Extra: dev
11028 Provides-Extra: all
29 License-File: LICENSE.rst
30
31 ====
32 drms
33 ====
34
35 `Docs <https://docs.sunpy.org/projects/drms/>`__ |
36 `Tutorial <https://docs.sunpy.org/projects/drms/en/latest/tutorial.html>`__ |
37 `Github <https://github.com/sunpy/drms>`__ |
38 `PyPI <https://pypi.org/project/drms/>`__
39
40 |JOSS| |Zenodo|
41
42 .. |JOSS| image:: https://joss.theoj.org/papers/10.21105/joss.01614/status.svg
43 :target: https://doi.org/10.21105/joss.01614
44 .. |Zenodo| image:: https://zenodo.org/badge/58651845.svg
45 :target: https://zenodo.org/badge/latestdoi/58651845
46
47 The ``drms`` module provides an easy-to-use interface for accessing HMI, AIA and MDI data with Python.
48 It uses the publicly accessible `JSOC <http://jsoc.stanford.edu/>`__ DRMS server by default, but can also be used with local `NetDRMS <http://jsoc.stanford.edu/netdrms/>`__ sites.
49 More information, including a detailed tutorial, is available in the `Documentation <https://docs.sunpy.org/projects/drms/>`__.
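
A minimal usage sketch (the record set and keyword names are borrowed from the test suite elsewhere in this diff), showing a keyword query against the default public JSOC server:

.. code :: python

    import drms

    client = drms.Client()
    # Query two keywords for a 3-minute span of HMI Dopplergram records.
    keys = client.query("hmi.v_45s[2013.07.03_08:42_TAI/3m]", key="T_REC, CRLT_OBS")
    print(keys)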
50
51 Getting Help
52 ------------
53 For more information or to ask questions about ``drms``, check out:
54
55 - `drms Documentation <https://docs.sunpy.org/projects/drms/en/latest/>`__
56
57 .. _drms Documentation: https://docs.sunpy.org/projects/drms/en/latest/
58
59 Contributing
60 ------------
61 If you would like to get involved, start by joining the `SunPy Chat`_ and read our `Newcomers' guide <https://docs.sunpy.org/en/latest/dev_guide/newcomers.html>`__.
62 Help is always welcome so let us know what you like to work on, or check out the `issues page <https://github.com/sunpy/drms/issues>`__ for a list of known outstanding items.
63
64 Citation
65 --------
66 If you use ``drms`` in your work, please cite our `paper <https://doi.org/10.21105/joss.01614>`__.
67
68 .. code :: bibtex
69
70 @article{Glogowski2019,
71 doi = {10.21105/joss.01614},
72 url = {https://doi.org/10.21105/joss.01614},
73 year = {2019},
74 publisher = {The Open Journal},
75 volume = {4},
76 number = {40},
77 pages = {1614},
78 author = {Kolja Glogowski and Monica G. Bobra and Nitin Choudhary and Arthur B. Amezcua and Stuart J. Mumford},
79 title = {drms: A Python package for accessing HMI and AIA data},
80 journal = {Journal of Open Source Software}
81 }
82
83 Code of Conduct (CoC)
84 ---------------------
85
86 When you are interacting with the SunPy community you are asked to follow our `code of conduct <https://docs.sunpy.org/en/latest/code_of_conduct.html>`__.
87
88 Acknowledgements
89 ----------------
90 Kolja Glogowski has received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013) / ERC Grant Agreement no. 307117.
91
92 .. _SunPy Chat: https://openastronomy.element.io/#/room/#sunpy:openastronomy.org
00 [console_scripts]
11 drms = drms.main:main
2
2020 email = os.environ["JSOC_EMAIL"]
2121
2222 # Download directory
23 out_dir = os.path.join('downloads')
23 out_dir = os.path.join("downloads")
2424
2525 # Create download directory if it does not exist yet.
2626 if not os.path.exists(out_dir):
2929 ###############################################################################
3030 # Construct the DRMS query string: "Series[timespan][wavelength]{data segments}"
3131
32 qstr = 'aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}'
33 print(f'Data export query:\n {qstr}\n')
32 qstr = "aia.lev1_euv_12s[2015-10-17T04:33:30.000/1m@12s][171]{image}"
33 print(f"Data export query:\n {qstr}\n")
3434
3535 ###############################################################################
3636 # Construct the dictionary specifying that we want to request a cutout.
4040 # ``r`` controls the use of sub-pixel registration.
4141 # ``c`` controls whether off-limb pixels are filled with NaNs.
4242 # For additional details about ``im_patch``, see the `documentation <http://jsoc.stanford.edu/doxygen_html/group__im__patch.html>`_.
43 process = {'im_patch': {
44 't_ref': '2015-10-17T04:33:30.000',
45 't': 0,
46 'r': 0,
47 'c': 0,
48 'locunits': 'arcsec',
49 'boxunits': 'arcsec',
50 'x': -517.2,
51 'y': -246,
52 'width': 345.6,
53 'height': 345.6,
54 }}
43 process = {
44 "im_patch": {
45 "t_ref": "2015-10-17T04:33:30.000",
46 "t": 0,
47 "r": 0,
48 "c": 0,
49 "locunits": "arcsec",
50 "boxunits": "arcsec",
51 "x": -517.2,
52 "y": -246,
53 "width": 345.6,
54 "height": 345.6,
55 }
56 }
5557
5658 # Submit export request using the 'fits' protocol
57 print('Submitting export request...')
59 print("Submitting export request...")
5860 result = client.export(
5961 qstr,
60 method='url',
61 protocol='fits',
62 method="url",
63 protocol="fits",
6264 email=email,
6365 process=process,
6466 )
6567
6668 # Print request URL.
67 print(f'\nRequest URL: {result.request_url}')
68 print(f'{int(len(result.urls))} file(s) available for download.\n')
69 print(f"\nRequest URL: {result.request_url}")
70 print(f"{int(len(result.urls))} file(s) available for download.\n")
6971
7072 # Download selected files.
7173 result.wait()
7274 result.download(out_dir)
73 print('Download finished.')
75 print("Download finished.")
7476 print(f'\nDownload directory:\n "{os.path.abspath(out_dir)}"\n')
2525 email = os.environ["JSOC_EMAIL"]
2626
2727 # Download directory
28 out_dir = os.path.join('downloads')
28 out_dir = os.path.join("downloads")
2929
3030 # Create download directory if it does not exist yet.
3131 if not os.path.exists(out_dir):
3434 ###############################################################################
3535 # Construct the DRMS query string: "Series[harpnum][timespan]{data segments}"
3636
37 qstr = 'hmi.sharp_720s[7451][2020.09.27_00:00:00_TAI]{continuum, magnetogram, field}'
38 print(f'Data export query:\n {qstr}\n')
37 qstr = "hmi.sharp_720s[7451][2020.09.27_00:00:00_TAI]{continuum, magnetogram, field}"
38 print(f"Data export query:\n {qstr}\n")
3939
4040 # Submit export request, defaults to method='url_quick' and protocol='as-is'
41 print('Submitting export request...')
41 print("Submitting export request...")
4242 result = client.export(qstr, email=email)
43 print(f'{int(len(result.urls))} file(s) available for download.\n')
43 print(f"{int(len(result.urls))} file(s) available for download.\n")
4444
4545 # Download selected files.
4646 result.download(out_dir)
47 print('Download finished.')
47 print("Download finished.")
4848 print(f'\nDownload directory:\n "{os.path.abspath(out_dir)}"\n')
2626
2727 # Use 'as-is' instead of 'fits', if record keywords are not needed in the
2828 # FITS header. This greatly reduces the server load!
29 export_protocol = 'fits'
29 export_protocol = "fits"
3030
3131 # Download directory
32 out_dir = os.path.join('downloads')
32 out_dir = os.path.join("downloads")
3333
3434 # Create download directory if it does not exist yet.
3535 if not os.path.exists(out_dir):
3838 ###############################################################################
3939 # Construct the DRMS query string: "Series[harpnum][timespan]{data segments}"
4040
41 qstr = 'hmi.sharp_720s[4864][2014.11.30_00:00:00_TAI/1d@8h]{continuum, magnetogram, field}'
42 print(f'Data export query:\n {qstr}\n')
41 qstr = "hmi.sharp_720s[4864][2014.11.30_00:00:00_TAI/1d@8h]{continuum, magnetogram, field}"
42 print(f"Data export query:\n {qstr}\n")
4343
4444 # Submit export request using the 'fits' protocol
45 print('Submitting export request...')
46 result = client.export(qstr, method='url', protocol=export_protocol, email=email)
45 print("Submitting export request...")
46 result = client.export(qstr, method="url", protocol=export_protocol, email=email)
4747
4848 # Print request URL.
49 print(f'\nRequest URL: {result.request_url}')
50 print(f'{int(len(result.urls))} file(s) available for download.\n')
49 print(f"\nRequest URL: {result.request_url}")
50 print(f"{int(len(result.urls))} file(s) available for download.\n")
5151
5252 # Download selected files.
5353 result.download(out_dir)
54 print('Download finished.')
54 print("Download finished.")
5555 print(f'\nDownload directory:\n "{os.path.abspath(out_dir)}"\n')
1919 client = drms.Client(verbose=True)
2020
2121 # Export request ID
22 request_id = 'JSOC_20201101_198'
22 request_id = "JSOC_20201101_198"
2323
2424 # Querying the server using the entered RequestID.
25 print(f'Looking up export request {request_id}...')
25 print(f"Looking up export request {request_id}...")
2626 result = client.export_from_id(request_id)
2727
2828 # Print request URL and number of available files.
29 print(f'\nRequest URL: {result.request_url}')
30 print(f'{int(len(result.urls))} file(s) available for download.\n')
29 print(f"\nRequest URL: {result.request_url}")
30 print(f"{int(len(result.urls))} file(s) available for download.\n")
3131
3232 # Create download directory if it does not exist yet.
33 out_dir = os.path.join('downloads', request_id)
33 out_dir = os.path.join("downloads", request_id)
3434 if not os.path.exists(out_dir):
3535 os.makedirs(out_dir)
3636
3737 # Download all available files.
3838 result.download(out_dir)
39 print('Download finished.')
40 print(f'\nDownload directory:\n {os.path.abspath(out_dir)}\n')
39 print("Download finished.")
40 print(f"\nDownload directory:\n {os.path.abspath(out_dir)}\n")
2727
2828 # Arguments for 'jpg' protocol
2929 jpg_args = {
30 'ct': 'aia_304.lut', # color table
31 'min': 4, # min value
32 'max': 800, # max value
33 'scaling': 'log', # color scaling
34 'size': 2, # binning (1 -> 4k, 2 -> 2k, 4 -> 1k)
30 "ct": "aia_304.lut", # color table
31 "min": 4, # min value
32 "max": 800, # max value
33 "scaling": "log", # color scaling
34 "size": 2, # binning (1 -> 4k, 2 -> 2k, 4 -> 1k)
3535 }
3636
3737 # Download directory
38 out_dir = 'downloads'
38 out_dir = "downloads"
3939
4040 # Create download directory if it does not exist yet.
4141 if not os.path.exists(out_dir):
4444 ###############################################################################
4545 # Construct the DRMS query string: "Series[timespan][wavelength]{data segments}"
4646
47 qstr = 'aia.lev1_euv_12s[2012-08-31T19:48:01Z][304]{image}'
48 print(f'Data export query:\n {qstr}\n')
47 qstr = "aia.lev1_euv_12s[2012-08-31T19:48:01Z][304]{image}"
48 print(f"Data export query:\n {qstr}\n")
4949
5050 # Submit export request using the 'jpg' protocol with custom protocol_args
51 print('Submitting export request...')
52 result = client.export(qstr, protocol='jpg', protocol_args=jpg_args, email=email)
51 print("Submitting export request...")
52 result = client.export(qstr, protocol="jpg", protocol_args=jpg_args, email=email)
5353
5454 # Print request URL.
55 print(f'\nRequest URL: {result.request_url}')
56 print(f'{int(len(result.urls))} file(s) available for download.\n')
55 print(f"\nRequest URL: {result.request_url}")
56 print(f"{int(len(result.urls))} file(s) available for download.\n")
5757
5858 # Download selected files.
5959 result.download(out_dir)
60 print('Download finished.')
61 print(f'\nDownload directory:\n {os.path.abspath(out_dir)}\n')
60 print("Download finished.")
61 print(f"\nDownload directory:\n {os.path.abspath(out_dir)}\n")
2626 email = os.environ["JSOC_EMAIL"]
2727
2828 # Download directory
29 out_dir = 'downloads'
29 out_dir = "downloads"
3030
3131 # Create download directory if it does not exist yet.
3232 if not os.path.exists(out_dir):
3535 ###############################################################################
3636 # Construct the DRMS query string: "Series[timespan]{segment}"
3737
38 qstr = 'hmi.m_720s[2014.11.28_00:00:00_TAI/5d@1h]{magnetogram}'
39 print(f'Data export query:\n {qstr}\n')
38 qstr = "hmi.m_720s[2014.11.28_00:00:00_TAI/5d@1h]{magnetogram}"
39 print(f"Data export query:\n {qstr}\n")
4040
4141 ###############################################################################
4242 # Arguments for 'mp4' protocol
4343
4444 mp4_args = {
45 'ct': 'grey.sao', # color table
46 'min': -1500, # min value
47 'max': 1500, # max value
48 'scaling': 'mag', # color scaling
49 'size': 8, # binning (1 -> 4k, 2 -> 2k, 4 -> 1k, 8 -> 512)
45 "ct": "grey.sao", # color table
46 "min": -1500, # min value
47 "max": 1500, # max value
48 "scaling": "mag", # color scaling
49 "size": 8, # binning (1 -> 4k, 2 -> 2k, 4 -> 1k, 8 -> 512)
5050 }
5151
5252 # Submit export request using the 'mp4' protocol with custom protocol_args
53 print('Submitting export request...')
54 result = client.export(qstr, protocol='mp4', protocol_args=mp4_args, email=email)
53 print("Submitting export request...")
54 result = client.export(qstr, protocol="mp4", protocol_args=mp4_args, email=email)
5555 result.wait(sleep=10)
5656
5757 # Print request URL.
58 print(f'\nRequest URL: {result.request_url}')
59 print(f'{len(result.urls)} file(s) available for download.\n')
58 print(f"\nRequest URL: {result.request_url}")
59 print(f"{len(result.urls)} file(s) available for download.\n")
6060
6161 # Download movie file only: index=0
6262 result.download(out_dir, 0)
63 print('Download finished.')
64 print(f'\nDownload directory:\n {os.path.abspath(out_dir)}\n')
63 print("Download finished.")
64 print(f"\nDownload directory:\n {os.path.abspath(out_dir)}\n")
2424 ###############################################################################
2525 # Construct the DRMS query string: "Series[timespan][wavelength]"
2626
27 qstr = 'hmi.ic_720s[2015.01.01_00:00:00_TAI/10d@1d]{continuum}'
27 qstr = "hmi.ic_720s[2015.01.01_00:00:00_TAI/10d@1d]{continuum}"
2828
2929 # Submit export request, defaults to method='url_quick' and protocol='as-is'
30 print(f'Data export query:\n {qstr}\n')
31 print('Submitting export request...')
30 print(f"Data export query:\n {qstr}\n")
31 print("Submitting export request...")
3232 result = client.export(qstr, email=email)
33 print(f'len(r.urls) file(s) available for download.\n')
33 print(f"len(r.urls) file(s) available for download.\n")
3434
3535 # Print download URLs.
36 for _, row in result.urls[['record', 'url']].iterrows():
37 print(f'REC: {row.record}')
38 print(f'URL: {row.url}\n')
36 for _, row in result.urls[["record", "url"]].iterrows():
37 print(f"REC: {row.record}")
38 print(f"URL: {row.url}\n")
2828 email = os.environ["JSOC_EMAIL"]
2929
3030 # Download directory
31 out_dir = 'downloads'
31 out_dir = "downloads"
3232
3333 # Create download directory if it does not exist yet.
3434 if not os.path.exists(out_dir):
3737 ###############################################################################
3838 # Construct the DRMS query string: "Series[Carrington rotation][Carrington longitude]{data segments}"
3939
40 qstr = 'hmi.rdvflows_fd15_frame[2150][360]{Ux, Uy}'
41 print(f'Data export query:\n {qstr}\n')
40 qstr = "hmi.rdvflows_fd15_frame[2150][360]{Ux, Uy}"
41 print(f"Data export query:\n {qstr}\n")
4242
4343 # Submit export request using the 'url-tar' method, protocol default: 'as-is'
44 print('Submitting export request...')
45 result = client.export(qstr, method='url-tar', email=email)
44 print("Submitting export request...")
45 result = client.export(qstr, method="url-tar", email=email)
4646
4747 # Print request URL.
48 print(f'\nRequest URL: {result.request_url}')
49 print(f'{len(result.urls)} file(s) available for download.\n')
48 print(f"\nRequest URL: {result.request_url}")
49 print(f"{len(result.urls)} file(s) available for download.\n")
5050
5151 # Download selected files.
5252 dr = result.download(out_dir)
53 print('Download finished.')
53 print("Download finished.")
5454 print(f'\nDownloaded file:\n "{dr.download[0]}"\n')
1414 client = drms.Client()
1515
1616 # Get all available HMI series
17 hmi_series = client.series(r'hmi\.', full=True)
17 hmi_series = client.series(r"hmi\.", full=True)
1818
1919 # Print series names, prime-keys (pkeys) and notes
2020 for series in hmi_series.index:
21 print('Series:', hmi_series.name[series])
22 print(' Notes:', (f'\n{8 * " "}').join(textwrap.wrap(hmi_series.note[series])))
21 print("Series:", hmi_series.name[series])
22 print(" Notes:", (f'\n{8 * " "}').join(textwrap.wrap(hmi_series.note[series])))
1414 client = drms.Client()
1515
1616 # Query series info
17 series_info = client.info('hmi.v_45s')
17 series_info = client.info("hmi.v_45s")
1818
1919 # Print keyword info
20 print(f'Listing keywords for {series_info.name}:\n')
20 print(f"Listing keywords for {series_info.name}:\n")
2121 for keyword in sorted(series_info.keywords.index):
2222 keyword_info = series_info.keywords.loc[keyword]
2323 print(keyword)
24 print(f' type ....... {keyword_info.type} ')
25 print(f' recscope ... {keyword_info.recscope} ')
26 print(f' defval ..... {keyword_info.defval} ')
27 print(f' units ...... {keyword_info.units} ')
28 print(f' note ....... {keyword_info.note} ')
24 print(f" type ....... {keyword_info.type} ")
25 print(f" recscope ... {keyword_info.recscope} ")
26 print(f" defval ..... {keyword_info.defval} ")
27 print(f" units ...... {keyword_info.units} ")
28 print(f" note ....... {keyword_info.note} ")
1919 # list of all available keywords of a series.
2020
2121 keys = [
22 'T_REC',
23 'T_OBS',
24 'DATAMIN',
25 'DATAMAX',
26 'DATAMEAN',
27 'DATARMS',
28 'DATASKEW',
29 'DATAKURT',
30 'QUALITY',
22 "T_REC",
23 "T_OBS",
24 "DATAMIN",
25 "DATAMAX",
26 "DATAMEAN",
27 "DATARMS",
28 "DATASKEW",
29 "DATAKURT",
30 "QUALITY",
3131 ]
3232
3333 ###############################################################################
3636 # entries (like note) are missing for linked keywords, so we are using the
3737 # entries from aia.lev1 in this case.
3838
39 print('Querying series info...')
40 series_info = client.info('aia.lev1_euv_12s')
41 series_info_lev1 = client.info('aia.lev1')
39 print("Querying series info...")
40 series_info = client.info("aia.lev1_euv_12s")
41 series_info_lev1 = client.info("aia.lev1")
4242 for key in keys:
4343 linkinfo = series_info.keywords.loc[key].linkinfo
44 if linkinfo is not None and linkinfo.startswith('lev1->'):
44 if linkinfo is not None and linkinfo.startswith("lev1->"):
4545 note_str = series_info_lev1.keywords.loc[key].note
4646 else:
4747 note_str = series_info.keywords.loc[key].note
48 print(f'{key:>10} : {note_str}')
48 print(f"{key:>10} : {note_str}")
4949
5050 ###############################################################################
5151 # Construct the DRMS query string: "Series[timespan][wavelength]"
5252
53 qstr = 'aia.lev1_euv_12s[2014-01-01T00:00:01Z/365d@1d][335]'
53 qstr = "aia.lev1_euv_12s[2014-01-01T00:00:01Z/365d@1d][335]"
5454
5555 # Get keyword values for the selected timespan and wavelength
56 print(f'Querying keyword data...\n -> {qstr}')
56 print(f"Querying keyword data...\n -> {qstr}")
5757 result = client.query(qstr, key=keys)
58 print(f' -> {len(result)} lines retrieved.')
58 print(f" -> {len(result)} lines retrieved.")
5959
6060 # Only use entries with QUALITY==0
6161 result = result[result.QUALITY == 0]
62 print(f' -> {len(result)} lines after QUALITY selection.')
62 print(f" -> {len(result)} lines after QUALITY selection.")
6363
6464 # Convert T_REC strings to datetime and use it as index for the series
6565 result.index = drms.to_datetime(result.T_REC)
6767 ###############################################################################
6868 # Create some simple plots
6969
70 ax = result[['DATAMIN', 'DATAMAX', 'DATAMEAN', 'DATARMS', 'DATASKEW']].plot(figsize=(8, 10), subplots=True)
71 ax[0].set_title(qstr, fontsize='medium')
70 ax = result[["DATAMIN", "DATAMAX", "DATAMEAN", "DATARMS", "DATASKEW"]].plot(figsize=(8, 10), subplots=True)
71 ax[0].set_title(qstr, fontsize="medium")
7272 plt.tight_layout()
7373 plt.show()
1717 ###############################################################################
1818 # Construct the DRMS query string: "Series[timespan]"
1919
20 qstr = f'hmi.ic_720s[2010.05.01_TAI-2016.04.01_TAI@6h]'
20 qstr = f"hmi.ic_720s[2010.05.01_TAI-2016.04.01_TAI@6h]"
2121
2222 # Send request to the DRMS server
23 print('Querying keyword data...\n -> {qstr}')
24 result = client.query(qstr, key=['T_REC', 'DATAMEAN', 'DATARMS'])
25 print(f' -> {int(len(result))} lines retrieved.')
23 print("Querying keyword data...\n -> {qstr}")
24 result = client.query(qstr, key=["T_REC", "DATAMEAN", "DATARMS"])
25 print(f" -> {int(len(result))} lines retrieved.")
2626
2727 ###############################################################################
2828 # Now to plot the image.
2929
3030 # Convert T_REC strings to datetime and use it as index for the series
31 result.index = drms.to_datetime(result.pop('T_REC'))
31 result.index = drms.to_datetime(result.pop("T_REC"))
3232
3333 # Note: DATARMS contains the standard deviation, not the RMS!
3434 t = result.index
3737
3838 # Create plot
3939 fig, ax = plt.subplots(1, 1, figsize=(15, 7))
40 ax.set_title(qstr, fontsize='medium')
40 ax.set_title(qstr, fontsize="medium")
4141 ax.fill_between(
42 t, avg + std, avg - std, edgecolor='none', facecolor='b', alpha=0.3, interpolate=True,
42 t,
43 avg + std,
44 avg - std,
45 edgecolor="none",
46 facecolor="b",
47 alpha=0.3,
48 interpolate=True,
4349 )
44 ax.plot(t, avg, color='b')
45 ax.set_xlabel('Time')
46 ax.set_ylabel('Disk-averaged continuum intensity [kDN/s]')
50 ax.plot(t, avg, color="b")
51 ax.set_xlabel("Time")
52 ax.set_ylabel("Disk-averaged continuum intensity [kDN/s]")
4753 fig.tight_layout()
4854
4955 plt.show()
1818 ###############################################################################
1919 # Construct the DRMS query string: "Series[timespan][wavelength]"
2020
21 qstr = 'hmi.v_sht_modes[2014.06.20_00:00:00_TAI]'
21 qstr = "hmi.v_sht_modes[2014.06.20_00:00:00_TAI]"
2222
2323 # TODO: Add text here.
24 segname = 'm6' # 'm6', 'm18' or 'm36'
24 segname = "m6" # 'm6', 'm18' or 'm36'
2525
2626 # Send request to the DRMS server
27 print(f'Querying keyword data...\n -> {qstr}')
28 result, filenames = client.query(qstr, key=['T_START', 'T_STOP', 'LMIN', 'LMAX', 'NDT'], seg=segname)
29 print(f' -> {len(result)} lines retrieved.')
27 print(f"Querying keyword data...\n -> {qstr}")
28 result, filenames = client.query(qstr, key=["T_START", "T_STOP", "LMIN", "LMAX", "NDT"], seg=segname)
29 print(f" -> {len(result)} lines retrieved.")
3030
3131 # Use only the first line of the query result
3232 result = result.iloc[0]
33 fname = f'http://jsoc.stanford.edu{filenames[segname][0]}'
33 fname = f"http://jsoc.stanford.edu{filenames[segname][0]}"
3434
3535 # Read the data segment
36 print(f'Reading data from {fname}...')
36 print(f"Reading data from {fname}...")
3737 a = np.genfromtxt(fname)
3838
3939 # For column names, see appendix of Larson & Schou (2015SoPh..290.3221L)
5252 # Plot the zoomed in on lower l
5353 fig, ax = plt.subplots(1, 1, figsize=(11, 7))
5454 ax.set_title(
55 f'Time = {result.T_START} ... {result.T_STOP}, L = {result.LMIN} ... {result.LMAX}, NDT = {result.NDT}',
56 fontsize='medium',
55 f"Time = {result.T_START} ... {result.T_STOP}, L = {result.LMIN} ... {result.LMAX}, NDT = {result.NDT}",
56 fontsize="medium",
5757 )
5858 for ni in np.unique(n):
5959 idx = n == ni
60 ax.plot(l[idx], nu[idx], 'b.-')
60 ax.plot(l[idx], nu[idx], "b.-")
6161 ax.set_xlim(0, 120)
6262 ax.set_ylim(0.8, 4.5)
63 ax.set_xlabel('Harmonic degree')
64 ax.set_ylabel('Frequency [mHz]')
63 ax.set_xlabel("Harmonic degree")
64 ax.set_ylabel("Frequency [mHz]")
6565 fig.tight_layout()
6666
6767 ###############################################################################
6969
7070 fig, ax = plt.subplots(1, 1, figsize=(11, 7))
7171 ax.set_title(
72 f'Time = {result.T_START} ... {result.T_STOP}, L = {result.LMIN} ... {result.LMAX}, NDT = {result.NDT}',
73 fontsize='medium',
72 f"Time = {result.T_START} ... {result.T_STOP}, L = {result.LMIN} ... {result.LMAX}, NDT = {result.NDT}",
73 fontsize="medium",
7474 )
7575 for ni in np.unique(n):
7676 if ni <= 20:
7777 idx = n == ni
78 ax.plot(l[idx], nu[idx], 'b.', ms=3)
78 ax.plot(l[idx], nu[idx], "b.", ms=3)
7979 if ni < 10:
80 ax.plot(l[idx], nu[idx] + 1000 * snu[idx], 'g')
81 ax.plot(l[idx], nu[idx] - 1000 * snu[idx], 'g')
80 ax.plot(l[idx], nu[idx] + 1000 * snu[idx], "g")
81 ax.plot(l[idx], nu[idx] - 1000 * snu[idx], "g")
8282 else:
83 ax.plot(l[idx], nu[idx] + 500 * snu[idx], 'r')
84 ax.plot(l[idx], nu[idx] - 500 * snu[idx], 'r')
83 ax.plot(l[idx], nu[idx] + 500 * snu[idx], "r")
84 ax.plot(l[idx], nu[idx] - 500 * snu[idx], "r")
8585 ax.legend(
86 loc='upper right',
86 loc="upper right",
8787 handles=[
88 plt.Line2D([0], [0], color='r', label='500 sigma'),
89 plt.Line2D([0], [0], color='g', label='1000 sigma'),
88 plt.Line2D([0], [0], color="r", label="500 sigma"),
89 plt.Line2D([0], [0], color="g", label="1000 sigma"),
9090 ],
9191 )
9292 ax.set_xlim(-5, 305)
9393 ax.set_ylim(0.8, 4.5)
94 ax.set_xlabel('Harmonic degree')
95 ax.set_ylabel('Frequency [mHz]')
94 ax.set_xlabel("Harmonic degree")
95 ax.set_ylabel("Frequency [mHz]")
9696 fig.tight_layout()
9797
9898 plt.show()
1818 ###############################################################################
1919 # Construct the DRMS query string: "Series[timespan][wavelength]"
2020
21 qstr = 'hmi.meanpf_720s[2010.05.01_TAI-2016.04.01_TAI@12h]'
21 qstr = "hmi.meanpf_720s[2010.05.01_TAI-2016.04.01_TAI@12h]"
2222
2323 # Send request to the DRMS server
24 print('Querying keyword data...\n -> {qstr}')
25 result = client.query(qstr, key=['T_REC', 'CAPN2', 'CAPS2'])
26 print(f' -> {len(result)} lines retrieved.')
24 print("Querying keyword data...\n -> {qstr}")
25 result = client.query(qstr, key=["T_REC", "CAPN2", "CAPS2"])
26 print(f" -> {len(result)} lines retrieved.")
2727
2828 # Convert T_REC strings to datetime and use it as index for the series
29 result.index = drms.to_datetime(result.pop('T_REC'))
29 result.index = drms.to_datetime(result.pop("T_REC"))
3030
3131 # Determine smallest timestep
3232 dt = np.diff(result.index.to_pydatetime()).min()
4949
5050 # Plot smoothed data
5151 fig, ax = plt.subplots(1, 1, figsize=(15, 7))
52 ax.set_title(qstr, fontsize='medium')
53 ax.plot(t, n, 'b', alpha=0.5, label='North pole')
54 ax.plot(t, s, 'g', alpha=0.5, label='South pole')
55 ax.plot(t, mn, 'r', label='Moving average')
56 ax.plot(t, ms, 'r', label="")
57 ax.set_xlabel('Time')
58 ax.set_ylabel('Mean radial field strength [G]')
52 ax.set_title(qstr, fontsize="medium")
53 ax.plot(t, n, "b", alpha=0.5, label="North pole")
54 ax.plot(t, s, "g", alpha=0.5, label="South pole")
55 ax.plot(t, mn, "r", label="Moving average")
56 ax.plot(t, ms, "r", label="")
57 ax.set_xlabel("Time")
58 ax.set_ylabel("Mean radial field strength [G]")
5959 ax.legend()
6060 fig.tight_layout()
6161
6262 # Plot raw data
6363 fig, ax = plt.subplots(1, 1, figsize=(15, 7))
64 ax.set_title(qstr, fontsize='medium')
65 ax.fill_between(t, mn - sn, mn + sn, edgecolor='none', facecolor='b', alpha=0.3, interpolate=True)
66 ax.fill_between(t, ms - ss, ms + ss, edgecolor='none', facecolor='g', alpha=0.3, interpolate=True)
67 ax.plot(t, mn, 'b', label='North pole')
68 ax.plot(t, ms, 'g', label='South pole')
69 ax.set_xlabel('Time')
70 ax.set_ylabel('Mean radial field strength [G]')
64 ax.set_title(qstr, fontsize="medium")
65 ax.fill_between(t, mn - sn, mn + sn, edgecolor="none", facecolor="b", alpha=0.3, interpolate=True)
66 ax.fill_between(t, ms - ss, ms + ss, edgecolor="none", facecolor="g", alpha=0.3, interpolate=True)
67 ax.plot(t, mn, "b", label="North pole")
68 ax.plot(t, ms, "g", label="South pole")
69 ax.set_xlabel("Time")
70 ax.set_ylabel("Mean radial field strength [G]")
7171 ax.legend()
7272 fig.tight_layout()
7373
1919 ###############################################################################
2020 # Construct the DRMS query string: "Series[Carrington rotation]"
2121
22 qstr = 'hmi.synoptic_mr_720s[2150]'
22 qstr = "hmi.synoptic_mr_720s[2150]"
2323
2424 # Send request to the DRMS server
25 print('Querying keyword data...\n -> {qstr}')
26 segname = 'synopMr'
25 print("Querying keyword data...\n -> {qstr}")
26 segname = "synopMr"
2727 results, filenames = client.query(qstr, key=drms.const.all, seg=segname)
28 print(f' -> {len(results)} lines retrieved.')
28 print(f" -> {len(results)} lines retrieved.")
2929
3030 # Use only the first line of the query result
3131 results = results.iloc[0]
32 fname = f'http://jsoc.stanford.edu{filenames[segname][0]}'
32 fname = f"http://jsoc.stanford.edu{filenames[segname][0]}"
3333
3434 # Read the data segment
3535 # Note: HTTP downloads get cached in ~/.astropy/cache/downloads
36 print(f'Reading data from {fname}...')
36 print(f"Reading data from {fname}...")
3737 a = fits.getdata(fname)
3838 ny, nx = a.shape
3939
6363
6464 # Create plot
6565 fig, ax = plt.subplots(1, 1, figsize=(13.5, 6))
66 ax.set_title(f'{qstr}, Time: {results.T_START} ... {results.T_STOP}', fontsize='medium')
66 ax.set_title(f"{qstr}, Time: {results.T_START} ... {results.T_STOP}", fontsize="medium")
6767 ax.imshow(
68 a, vmin=-300, vmax=300, origin='lower', interpolation='nearest', cmap='gray', extent=extent, aspect=aspect,
68 a,
69 vmin=-300,
70 vmax=300,
71 origin="lower",
72 interpolation="nearest",
73 cmap="gray",
74 extent=extent,
75 aspect=aspect,
6976 )
7077 ax.invert_xaxis()
71 ax.set_xlabel('Carrington longitude')
72 ax.set_ylabel('Sine latitude')
78 ax.set_xlabel("Carrington longitude")
79 ax.set_ylabel("Sine latitude")
7380 fig.tight_layout()
7481
7582 plt.show()
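fits.getdata above reads the segment straight over HTTP, relying on astropy's download cache. If a persistent local copy is wanted instead, the file can be fetched explicitly first; a small sketch, reusing filenames and segname from the query above and a hypothetical output filename:

import urllib.request

from astropy.io import fits

# "synopMr_2150.fits" is a hypothetical local filename.
url = "http://jsoc.stanford.edu" + filenames[segname][0]
local_path, _ = urllib.request.urlretrieve(url, "synopMr_2150.fits")
a = fits.getdata(local_path)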
00 [build-system]
11 requires = [
2 "setuptools",
3 "setuptools_scm",
4 "wheel",
5 "oldest-supported-numpy",
6 ]
2 "setuptools>=56,!=61.0.0",
3 "setuptools_scm[toml]>=6.2",
4 "wheel",
5 "oldest-supported-numpy",
6 ]
77 build-backend = 'setuptools.build_meta'
8
9 [tool.black]
10 line-length = 120
11 include = '\.pyi?$'
12 exclude = '''
13 (
14 /(
15 \.eggs
16 | \.git
17 | \.mypy_cache
18 | \.tox
19 | \.venv
20 | _build
21 | buck-out
22 | build
23 | dist
24 | docs
25 | .history
26 )/
27 )
28 '''
829
930 [ tool.gilesbot ]
1031 [ tool.gilesbot.pull_requests ]
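The new [tool.black] table above is what drives the quote-style and call-layout changes seen in the example-script hunks: black picks the 120-character line length and the exclude pattern up from pyproject.toml automatically. A sketch of checking a working copy against that configuration from Python (invoking the black CLI directly works just as well):

import subprocess

# --check --diff reports what black would change without rewriting any file;
# configuration is read from pyproject.toml in the current directory.
subprocess.run(["black", "--check", "--diff", "."], check=False)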
00 [metadata]
11 name = drms
22 provides = drms
3 description = "Access HMI, AIA and MDI data with Python from the public JSOC DRMS server"
3 description = Access HMI, AIA and MDI data with Python from the Stanford JSOC DRMS
44 long_description = file: README.rst
55 long_description_content_type = text/x-rst
66 author = The SunPy Community
77 author_email = sunpy@googlegroups.com
88 license = BSD 2-Clause
9 license_file = LICENSE.rst
9 license_files = LICENSE.rst
1010 url = https://sunpy.org
1111 edit_on_github = True
1212 github_project = sunpy/drms
2020 Operating System :: OS Independent
2121 Programming Language :: Python
2222 Programming Language :: Python :: 3
23 Programming Language :: Python :: 3.7
2423 Programming Language :: Python :: 3.8
2524 Programming Language :: Python :: 3.9
25 Programming Language :: Python :: 3.10
2626 Topic :: Scientific/Engineering :: Astronomy
2727 Topic :: Scientific/Engineering :: Physics
2828
2929 [options]
3030 zip_safe = False
31 python_requires = >=3.7
31 python_requires = >=3.8
3232 packages = find:
3333 include_package_data = True
3434 setup_requires =
6262 norecursedirs = ".tox" "build" "docs[\/]_build" "docs[\/]generated" "*.egg-info" "examples" ".history" "paper" "drms[\/]_dev"
6363 doctest_plus = enabled
6464 doctest_optionflags = NORMALIZE_WHITESPACE FLOAT_CMP ELLIPSIS
65 addopts = --doctest-rst
65 addopts = --doctest-rst -p no:unraisableexception -p no:threadexception
6666 markers =
6767 remote_data: marks this test function as needing remote data.
6868 jsoc: marks the test function as needing a connection to JSOC.
7070 export: marks the test function as needing a JSOC registered email address.
7171 remote_data_strict = True
7272 junit_family = xunit2
73 filterwarnings =
74 error
75 always::pytest.PytestConfigWarning
76 ignore:numpy.ufunc size changed:RuntimeWarning
77 ignore:numpy.ndarray size changed:RuntimeWarning
7378
7479 [pycodestyle]
7580 max_line_length = 110
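The new filterwarnings block promotes every warning raised during the test run to an error while still letting the two benign numpy binary-size warnings through. Roughly the same policy expressed with the standard warnings module, as an illustration only (pytest applies its filters per test, not process-wide):

import warnings

warnings.simplefilter("error")  # treat any warning as an exception
for msg in ("numpy.ufunc size changed", "numpy.ndarray size changed"):
    warnings.filterwarnings("ignore", message=msg, category=RuntimeWarning)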
22 from itertools import chain
33
44 from setuptools import setup
5 from setuptools.config import read_configuration
5
6 try:
7 # Recommended for setuptools 61.0.0+
8 # (though may disappear in the future)
9 from setuptools.config.setupcfg import read_configuration
10 except ImportError:
11 from setuptools.config import read_configuration
612
713 ################################################################################
814 # Programmatically generate some extras combos.
915 ################################################################################
10 extras = read_configuration('setup.cfg')['options']['extras_require']
11
16 extras = read_configuration("setup.cfg")["options"]["extras_require"]
1217 # Dev is everything
13 extras['dev'] = list(chain(*list(extras.values())))
14
18 extras["dev"] = list(chain(*list(extras.values())))
1519 # All is everything but tests and docs
16 exclude_keys = ('tests', 'docs', 'dev')
20 exclude_keys = ("tests", "docs", "dev")
1721 ex_extras = dict([i for i in list(extras.items()) if i[0] not in exclude_keys])
1822 # Concatenate all the values together for 'all'
19 extras['all'] = list(chain.from_iterable(list(ex_extras.values())))
20
23 extras["all"] = list(chain.from_iterable(list(ex_extras.values())))
2124 setup(
22 extras_require=extras, use_scm_version={'write_to': os.path.join('drms', '_version.py')},
25 extras_require=extras,
26 use_scm_version={"write_to": os.path.join("drms", "_version.py")},
2327 )
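The extras manipulation in setup.py can be seen in isolation with a toy dictionary (hypothetical extra names, not the ones defined in setup.cfg): dev is the union of every extra, while all drops the tooling-oriented groups.

from itertools import chain

extras = {"tests": ["pytest"], "docs": ["sphinx"], "cli": ["argcomplete"]}  # hypothetical
extras["dev"] = list(chain(*extras.values()))                    # dev = everything
ex_extras = {k: v for k, v in extras.items() if k not in ("tests", "docs", "dev")}
extras["all"] = list(chain.from_iterable(ex_extras.values()))    # all = everything else
print(extras["dev"])  # ['pytest', 'sphinx', 'argcomplete']
print(extras["all"])  # ['argcomplete']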
00 [tox]
11 envlist =
2 py{37,38,39}{,-online,-sunpy}
2 py{38,39,310}{,-online,-sunpy}
33 build_docs
44 codestyle
55 isolated_build = true
66 requires =
7 setuptools >= 30.3.0
7 setuptools >=56, !=61.0.0
88 pip >= 19.3.1
99 tox-pypi-filter >= 0.12
1010
1313 # Run the tests in a temporary directory to make sure that we don't import
1414 # drms from the source tree
1515 changedir = .tmp/{envname}
16 # tox environments are constructed with so-called 'factors' (or terms)
17 # separated by hyphens, e.g. test-devdeps-cov. Lines below starting with factor:
18 # will only take effect if that factor is included in the environment name. To
19 # see a list of example environments that can be run, along with a description,
20 # run:
21 #
22 # tox -l -v
23 #
2416 description =
2517 run tests
2618 online: that require remote data
3022 COLUMNS = 180
3123 PYTEST_COMMAND = pytest -vvv -s -ra --pyargs drms --cov-report=xml --cov=drms --cov-config={toxinidir}/setup.cfg {toxinidir}/docs
3224 build_docs,online: HOME = {envtmpdir}
25 JSOC_EMAIL = jsoc@sunpy.org
3326 passenv =
3427 HTTP_PROXY
3528 HTTPS_PROXY
3629 NO_PROXY
3730 CIRCLECI
3831 deps =
39 # These are specific online extras we use to run the online tests.
40 online: pytest-rerunfailures
41 online: pytest-timeout
32 pytest-timeout
33 # These are specific extras we use to run the sunpy tests.
4234 sunpy: git+https://github.com/sunpy/sunpy
43 # These are specific extras we use to run the sunpy tests.
4435 sunpy: beautifulsoup4
4536 sunpy: pytest-mock
4637 sunpy: python-dateutil
4738 sunpy: scipy
4839 sunpy: tqdm
4940 sunpy: zeep
50 # The following indicates which extras_require from setup.cfg will be installed
51 # dev is special in that it installs everything
5241 extras =
5342 dev
5443 commands =
55 sunpy: pytest -vvv -s -ra --pyargs sunpy.net.jsoc --remote-data=any {posargs}
44 sunpy: pytest -vvv -s -ra --pyargs sunpy.net.jsoc --timeout=120 --remote-data=any {posargs}
5645 !online: {env:PYTEST_COMMAND} {posargs}
57 online: {env:PYTEST_COMMAND} --reruns 2 --reruns-delay 60 --timeout=300 --remote-data=any --email jsoc@cadair.com {posargs}
46 online: {env:PYTEST_COMMAND} --timeout=120 --remote-data=any --email jsoc@sunpy.org {posargs}
5847
5948 [testenv:build_docs]
6049 changedir = docs