In the shared delegate-contributions-dp01 repository there is a new backgrounds/ directory containing a notebook that explains the basics of how to retrieve and display the backgrounds that were subtracted from calexps and deepCoadds.
The notebook also contains a few links to more information about background estimation and subtraction in the LSST Science Pipelines. It is a fairly basic demonstration, and DP0 Delegates might be interested in developing and contributing notebooks to the backgrounds/ directory that, e.g., demonstrate science analyses which make use of backgrounds, or alternative background estimators.
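To give a flavor of what the notebook covers, here is a minimal sketch of retrieving the background that was subtracted from a single calexp and displaying it. The repo string, collection name, and data ID values below are placeholders (my assumptions, not copied from the notebook), so check the notebook itself for the values that are valid on the RSP.

import matplotlib.pyplot as plt
from lsst.daf.butler import Butler

# Placeholder repo, collection, and data ID -- consult the notebook for the
# values that actually work in the DP0.1 butler repository.
butler = Butler('dp01', collections='2.2i/runs/DP0.1')
dataId = {'visit': 192350, 'detector': 175, 'band': 'i'}

# 'calexpBackground' is the background model that was subtracted from the
# calexp; it is returned as an lsst.afw.math.BackgroundList.
bkgd = butler.get('calexpBackground', dataId=dataId)

# Render the background model as an image and plot it.
bkgd_image = bkgd.getImage()
plt.imshow(bkgd_image.array, origin='lower', cmap='viridis')
plt.colorbar(label='counts')
plt.show()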
How do I get a copy of this notebook?
Unlike the tutorial-notebooks repository (here), the delegate-contributions-dp01 repository (here) does not automatically appear in the home directories of all RSP user accounts. Delegates must git clone the delegate-contributions-dp01 repository for themselves. If you’re new to GitHub, these instructions will help you out.
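If you are working in a terminal on the RSP, the clone itself is a one-liner along the lines of `git clone https://github.com/rubin-dp0/delegate-contributions-dp01.git` (this assumes the repository lives under the rubin-dp0 GitHub organization; if the URL differs, copy it from the repository's green "Code" button on GitHub).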
KeyError                                  Traceback (most recent call last)
/opt/lsst/software/stack/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-110-g1427568b+500492d978/python/lsst/daf/butler/core/dimensions/_coordinate.py in standardize(mapping, graph, universe, defaults, **kwargs)
    233         try:
--> 234             values = tuple(d[name] for name in graph.required.names)
    235         except KeyError as err:

/opt/lsst/software/stack/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-110-g1427568b+500492d978/python/lsst/daf/butler/core/dimensions/_coordinate.py in <genexpr>(.0)
    233         try:
--> 234             values = tuple(d[name] for name in graph.required.names)
    235         except KeyError as err:

KeyError: 'skymap'

The above exception was the direct cause of the following exception:

/opt/lsst/software/stack/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-110-g1427568b+500492d978/python/lsst/daf/butler/_butler.py in _findDatasetRef(self, datasetRefOrType, dataId, collections, allowUnresolved, **kwargs)
    902         # type instead of letting registry.findDataset do it, so we get the
    903         # result even if no dataset is found.
--> 904         dataId = DataCoordinate.standardize(dataId, graph=datasetType.dimensions,
    905                                             defaults=self.registry.defaults.dataId, **kwargs)
    906         # Always lookup the DatasetRef, even if one is given, to ensure it is

/opt/lsst/software/stack/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-110-g1427568b+500492d978/python/lsst/daf/butler/core/dimensions/_coordinate.py in standardize(mapping, graph, universe, defaults, **kwargs)
    234             values = tuple(d[name] for name in graph.required.names)
    235         except KeyError as err:
--> 236             raise KeyError(f"No value in data ID ({mapping}) for required dimension {err}.") from err
    237         # Some backends cannot handle numpy.int64 type which is a subclass of
    238         # numbers.Integral; convert that to int.

KeyError: "No value in data ID ({'tract': 4226, 'patch': 17, 'band': 'r'}) for required dimension 'skymap'."
Is there a typo towards the end of the subtracted-background code? In the second line below, should that be coadd2Id instead of coaddId? In that case, 'skymap': 'DC2' needs to be added to the first line.
This code is an excellent contribution to the tutorials - it’s taught me a lot about background subtraction and proper coding techniques - thank you for this!
Hmm, I don’t get the same error when 'skymap': 'DC2' is added to the coaddId in that cell (see screenshot below). Is the error message coming from a different cell, or perhaps were other changes made?
Yes, there was a typo in the fourth-from-last cell of this notebook: in the second line, coaddId should have been coadd2Id. Good catch! It is now fixed.
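For anyone who lands here with the same KeyError: the data ID for any coadd dataset has to include the skymap dimension. Below is a minimal sketch of what the corrected cell looks like; the repo string, collection, and dataset type name are my assumptions, not a verbatim copy of the notebook.

from lsst.daf.butler import Butler

# Placeholder repo and collection -- as in the earlier sketch, check the
# notebook for the values that are valid on the RSP.
butler = Butler('dp01', collections='2.2i/runs/DP0.1')

# 'skymap' is a required dimension for coadd dataset types, so the data ID
# must include it; omitting it produces the KeyError quoted above.
coadd2Id = {'tract': 4226, 'patch': 17, 'band': 'r', 'skymap': 'DC2'}

# The dataset type name below is an assumption about what the notebook uses
# for the background that was subtracted from the deepCoadd.
coadd_bkgd = butler.get('deepCoadd_calexp_background', dataId=coadd2Id)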