Minimum fits header data required for butler ingest-raws

I am trying to use butler ingest-raws to load raw test data for a telescope currently being built, named DREAMS. When I attempt this I get the warning

lsst.ingest WARNING: Could not extract observation metadata from the following:

What I think is happening is that butler ingest-raws can’t find the information it wants in the FITS headers. At present there is not much in the headers of the data, not even the time the file was taken. What is the minimum information that I need to add to the headers in order for the files to be successfully ingested?

I’m not sure how much infrastructure you have got in place. I’m assuming you have written an Instrument class and run butler register-instrument.

For raw ingest to work you need a metadata translator to be written so that an ObservationInfo can be constructed containing all the information that the ingest system requires. You do this by making a subclass of astro_metadata_translator.MetadataTranslator (see the astro_metadata_translator docs).

The mandatory list of translations is defined at

and will include a date of observation because otherwise it’s impossible for the calibration system to match a calibration to an exposure.

I had not written an Instrument class particular to my data. Can you point me to where I might find out how to do this?

How to create a new LSST obs package — LSST Science Pipelines is the best place to start, as you will be creating an obs_dreams package.

How exactly do I use astro_metadata_translator? When I try:

astrometadata dump lsst/raw/obs_18.fits

I get:

HDU 1 was not found in file lsst/raw/obs_18.fits. Ignoring request.

When I try:

astrometadata translate lsst/raw/obs_18.fits

I get:

Analyzing lsst/raw/obs_18.fits…
HDU 1 was not found in file lsst/raw/obs_18.fits. Ignoring request.
Failure processing lsst/raw/obs_18.fits: None of the registered translation classes ['DECam', 'SuprimeCam', 'HSC', 'MegaPrime', 'SDSS'] understood this header from lsst/raw/obs_18.fits
Files with failed translations:
lsst/raw/obs_18.fits

There is no example for astrometadata, so I am not sure which command to use or why exactly it fails.

You say you are writing files for your own instrument, so you have to create a new MetadataTranslator class that knows about your metadata standard.

Example code is at

https://astro-metadata-translator.lsst.io

The error message is saying that your files might only have a primary header (which is not fatal in this case), but that the headers in there do not conform to a known translator (it lists DECam, HSC, MegaPrime, etc.).

astro_metadata_translator provides a standardized framework for converting your header standard to a form that will be understandable by butler.

If you are writing “standard” FITS headers you can subclass FitsTranslator, since that will understand the standard INSTRUME, OBSGEO-, and DATE- headers for you (which will be a good start). You have to use the -p option to astrometadata to get it to pre-load your metadata translator (normally you would do that in your Instrument class for butler, so that the translator is known automatically and your files are recognized).
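To make the shape of such a class concrete, here is a sketch of a minimal DREAMS translator. In real code you would subclass astro_metadata_translator.FitsTranslator; here the base class is a one-line stub so the sketch stands alone, and every header keyword shown (INSTRUME, EXPTIME, OBSID) is an assumption about your data, not something the library mandates.

```python
class FitsTranslator:
    """Stand-in for astro_metadata_translator.FitsTranslator."""


class DreamsTranslator(FitsTranslator):
    """Sketch of a translator for a hypothetical DREAMS header standard."""

    name = "DREAMS"

    # Properties that are constant for every DREAMS exposure.
    _const_map = {"instrument": "DREAMS", "telescope": "DREAMS"}

    # Properties that map one-to-one from a single header keyword.
    _trivial_map = {
        "exposure_time": "EXPTIME",
        "observation_id": "OBSID",
    }

    @classmethod
    def can_translate(cls, header, filename=None):
        # The registry calls this for every registered translator class;
        # return True only for headers this class understands.
        return header.get("INSTRUME") == "DREAMS"
```

Once registered, can_translate is what lets the translation infrastructure pick this class for your files rather than DECam, HSC, and so on.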

If you pip install astro-metadata-translator, make sure you get a recent developer release, since the only release up there with a formal release version number is very old. It might be safest to git clone it from

I am returning to this after working on two referee reports and I am having trouble.

I can’t seem to work out how to use astrometadata on my data. I tried tests using Suprime-Cam data but this didn’t help. It looks like the first thing I need to run is astrometadata dump, but this doesn’t produce a file that I can send to the butler. I am also unsure which -p option I should use, as you suggest. What I would like is the list of steps I need to follow to use astrometadata on my data, including any options I might need.

Have you written a MetadataTranslator subclass as described in my previous post? The -p is the class name of your translator once you’ve written it.

This is the step I am struggling with. I can’t seem to work out how to write a MetadataTranslator subclass for my data based on your previous posts.

I suggest you start by copying astro_metadata_translator/decam.py at main · lsst/astro_metadata_translator · GitHub and editing it to suit your needs. The can_translate method is the critical one since it is the bit of code that declares to the translation infrastructure that this translator class recognizes the header.

You can put this translator class in your own package or eventually do a pull request to astro_metadata_translator. If it’s in an external package you need some way to import the translator class before it can be recognized by the system. You will see in obs_lsst that we do this by importing it in the Instrument subclass.

The -p parameter is what you use for the astrometadata command to register your translator class.
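As a sketch of the invocation, assuming your translator ends up in a module named obs_dreams.translators (a hypothetical name; substitute wherever your class actually lives), it might look like the following. Check astrometadata --help for the exact spelling and placement of the option in your installed version.

```shell
# Pre-load the (hypothetical) module containing the translator class,
# then translate a raw file with it.
astrometadata -p obs_dreams.translators translate ~/data/DREAMS/lsst/raw/obs_18.fits
```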

Once I modify decam.py I need to reinstall astro_metadata_translator. It is not clear to me how to do this from the GitHub files (I have been using the pip-installed version). There are no obvious ‘make’ files or install files. So how do I install astro_metadata_translator from the GitHub files?

If you have cloned the GitHub repo and are adding your new translator into the tree directly, then you can use something like pip install -e . to have it install into your python tree in developer mode. If you have a Rubin environment then you can do the usual setup -k -r . to set up the package locally with EUPS.
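The pip route described above, spelled out from a fresh clone (the repository URL follows from the lsst/astro_metadata_translator GitHub page mentioned earlier in the thread):

```shell
# Clone the repository and install it in editable/developer mode,
# so edits to the tree (e.g. a new dreams.py) take effect immediately.
git clone https://github.com/lsst/astro_metadata_translator
cd astro_metadata_translator
pip install -e .
```

Note that pip install -e . must be run from inside the git checkout, a point that matters for the error discussed below.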

I have run into a problem that the command astrometadata no longer works on my system.

When I tried

pip install -e ~/programs/astro_metadata_translator-main

I get the below error message:

Obtaining file:///home/plah/programs/astro_metadata_translator-main
Installing build dependencies … done
Checking if build backend supports build_editable … done
Getting requirements to build editable … done
Installing backend dependencies … done
Preparing editable metadata (pyproject.toml) … error
error: subprocess-exited-with-error

× Preparing editable metadata (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [34 lines of output]
Popen(['git', 'version'], cwd=/home/plah/programs/astro_metadata_translator-main, universal_newlines=False, shell=None, istream=None)
Popen(['git', 'version'], cwd=/home/plah/programs/astro_metadata_translator-main, universal_newlines=False, shell=None, istream=None)
Failed checking if running in CYGWIN due to: FileNotFoundError(2, 'No such file or directory')
Traceback (most recent call last):
File "/home/plah/.local/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 351, in <module>
main()
File "/home/plah/.local/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 333, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/home/plah/.local/lib/python3.10/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 179, in prepare_metadata_for_build_editable
return hook(metadata_directory, config_settings)
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 451, in prepare_metadata_for_build_editable
return self.prepare_metadata_for_build_wheel(
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 377, in prepare_metadata_for_build_wheel
self.run_setup()
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 335, in run_setup
exec(code, locals())
File "<string>", line 1, in <module>
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/__init__.py", line 87, in setup
return distutils.core.setup(**attrs)
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/_distutils/core.py", line 147, in setup
_setup_distribution = dist = klass(attrs)
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 475, in __init__
_Distribution.__init__(
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/_distutils/dist.py", line 283, in __init__
self.finalize_options()
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/setuptools/dist.py", line 899, in finalize_options
ep(self)
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/lsst_versions/_versions.py", line 454, in infer_version_for_setuptools
version, written = _process_version_writing(".", True, fallback=True)
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/lsst_versions/_versions.py", line 391, in _process_version_writing
version = get_lsst_version(dirname, fallback)
File "/tmp/pip-build-env-4h35ahfw/overlay/lib/python3.10/site-packages/lsst_versions/_versions.py", line 427, in get_lsst_version
raise RuntimeError(f"Unable to find a version from Git or metadata within directory {dirname}")
RuntimeError: Unable to find a version from Git or metadata within directory .
[end of output]
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

When I tried

setup -k -r ~/programs/astro_metadata_translator-main

this runs without output so I can only presume it works.

However as I mentioned astrometadata no longer works. I tried uninstalling it and installing it again using pip but it still wouldn’t work. This has left me baffled. Do you have any ideas on how I can proceed?

Run this command from the git directory. The code is assuming that it can determine the git version from the working directory. The error is saying that it can’t find any git version there. That’s why I suggested pip install -e ..

If you are using SCons, you then have to go into the checkout directory and type scons so that it builds the astrometadata command for you.
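Putting the EUPS/SCons route together (the checkout directory name is whatever you cloned into):

```shell
# From inside the checkout directory:
cd astro_metadata_translator
setup -k -r .   # EUPS: set up this package locally
scons           # build, including the astrometadata command
```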

Running scons restored the use of astrometadata. However, it didn’t seem to pick up my modified copy of decam.py, named dreams.py. The only change I have made so far in the file is replacing the DECam name within it with DREAMS. When I run

astrometadata translate ~/data/DREAMS/lsst/raw

it gives me a list of registered translation classes in the error:

Failure processing /home/plah/data/DREAMS/lsst/raw/obs_18.fits: None of the registered translation classes ['DECam', 'SuprimeCam', 'HSC', 'MegaPrime', 'SDSS'] understood this header from /home/plah/data/DREAMS/lsst/raw/obs_18.fits

I know I will need to make more changes to dreams.py to get past this error, but since it is not listed in the registered translation classes, the new installation must not have picked up the dreams.py file. Is there something more I need to do to get dreams.py registered?

If you are adding a new class directly to astro_metadata_translator you need to edit the relevant __init__.py to ensure that it’s imported by default.

the __init__.py in astro_metadata_translator/python/astro_metadata_translator/translators at main · lsst/astro_metadata_translator · GitHub
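Assuming the file is named dreams.py and the class DreamsTranslator (the names used in this thread), the line to add to translators/__init__.py would look something like this fragment:

```python
# In astro_metadata_translator/translators/__init__.py, alongside the
# existing per-instrument imports:
from .dreams import DreamsTranslator
```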

Modifying the __init__.py worked. How do I ensure that, when astrometadata reads a FITS file, it uses dreams.py? Does it look at the header and read the telescope name?

At present I am trying to adapt the details of decam.py for the new dreams.py. I have noticed that there is a lot more code in decam.py than in subaru.py, and I wonder how much is necessary for my dreams.py. In particular the _trivial_map, which seems to map the header keywords to python variable names: decam.py has multiple entries while subaru.py has none. Does that mean that Subaru data had the keywords that the code expects, while for DECam data one needs to change the defaults? I assume the FITS file requires all the header information listed in _trivial_map, like humidity and temperature (I will need to add this to my test DREAMS data). How much of the rest of the decam.py code do I need?

You need to write a “can_translate” method that teaches dream.py to understand when it has encountered DREAM data.

It depends on how closely your headers match the ObservationInfo data model. subaru.py is a base class for Subaru instruments that share the same data model, and hsc.py is the specific instrument. If you translate a DECam or HSC header with astrometadata you will see all the values that you need to calculate in your translator. The full list is in astro_metadata_translator/properties.py at main · lsst/astro_metadata_translator · GitHub

It maps headers to translated concepts. If you have one header keyword that directly maps to a translated property then you can use _trivial_map. In cases where you need to manipulate the value a little, or combine two header values (such as RA/Dec calculations), you will need to write a to_x method (where x is the name of a property).
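To make the distinction concrete, here is a self-contained sketch operating on a plain dict header. The OBSTYPE and EXPTIME keywords are hypothetical choices for DREAMS data; a real translator would subclass MetadataTranslator and return properly typed values (e.g. astropy quantities for times).

```python
class DreamsTranslatorSketch:
    # One header keyword maps directly to one property: _trivial_map
    # handles it, no method needed.
    _trivial_map = {"exposure_time": "EXPTIME"}

    def __init__(self, header):
        self._header = header

    # A property that needs computation gets a to_<property> method.
    def to_observation_type(self):
        # Normalise a hypothetical OBSTYPE keyword ("BIAS", "Science", ...)
        # to the lower-case form the data model expects.
        return self._header["OBSTYPE"].strip().lower()

    def to_exposure_time(self):
        # The trivial-map case written out by hand, for comparison:
        # look up the keyword and coerce it to the right type.
        return float(self._header[self._trivial_map["exposure_time"]])
```

The rule of thumb, then: one keyword copied as-is goes in _trivial_map; anything needing cleanup or combination of keywords becomes a to_x method.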

You may end up with a very simple translator or a very complicated translator depending on your choice of FITS header data model and how it relates to the butler model.