Image Difference Task using DECam images [gen3]

Hi all! hope you had happy and healthy holidays :slight_smile:
For my part, I'm almost ready to run the ImageDifference task on my sources.
For the template exposure, I ran the ApTemplate.yaml pipeline to build the deep coadd:

pipetask run -b data_hits/ -i processCcdOutputs/Blind15A_02,refcatsv2,DECam/raw/all,DECam/calib,skymaps,DECam/calib/20180503calibs --output-run Coadd_SNe/Blind15A_02 -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApTemplate.yaml --register-dataset-types --instrument lsst.obs.decam.DarkEnergyCamera -d "instrument='DECam' AND exposure IN (742019, 742013, 742020, 742014) AND detector NOT IN (2,61)" -j 1 --extend-run --skip-existing --clobber-outputs

And in a Jupyter notebook, I'm able to retrieve the goodSeeingCoadd exposures.

# List every goodSeeingCoadd patch produced by the template run
for i in butler.registry.queryDatasets("goodSeeingCoadd", collections='Coadd_SNe/Blind15A_02'):
    print(i)

goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 50}, sc=ExposureF] (id=144fd294-10dd-43ac-bb1c-7f226e91c74d)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 85}, sc=ExposureF] (id=e075daf2-7867-4575-be19-1ae6ccfffce4)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 59}, sc=ExposureF] (id=070e579b-3d77-471c-b0a2-023628ac4042)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 75}, sc=ExposureF] (id=f04d3685-40b0-4e9f-a70c-52ff3e50bf42)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 46}, sc=ExposureF] (id=35f26fde-7dba-47e5-a42c-c44fce3b5b11)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 45}, sc=ExposureF] (id=3fb4318b-9c45-4c85-bc04-f502153cfd38)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 36}, sc=ExposureF] (id=a1b1aa27-6c83-4b49-a8dc-23dd70d25019)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 91}, sc=ExposureF] (id=acaf361d-2d9d-44f0-b646-bcb00bcde0e5)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 19}, sc=ExposureF] (id=23de5d39-ba58-412c-b2c8-05cf8ebdf699)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 18}, sc=ExposureF] (id=9ec3bd46-6efc-46d6-ae0c-fbc87e96ebb7)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 49}, sc=ExposureF] (id=f300b10b-1b52-4c1a-b7f4-fbc23ef73a20)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 48}, sc=ExposureF] (id=0536f21c-7804-442b-867b-a546e9b0a81d)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 61}, sc=ExposureF] (id=6ca89adb-fbad-4fc4-810b-1432a2d80706)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 35}, sc=ExposureF] (id=ad773668-ae1d-40dc-905c-2b58159e0f45)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 25}, sc=ExposureF] (id=014a2d81-7cf2-411a-857d-d001079abc0b)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 64}, sc=ExposureF] (id=953fac51-3b6e-4fd4-b6f1-f4d83326b861)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 38}, sc=ExposureF] (id=12a5739c-3495-49ee-b24b-9579fdc307ca)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 20}, sc=ExposureF] (id=4c0c9aa4-2ea7-4345-b2d4-6fa5325bd6d0)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 93}, sc=ExposureF] (id=bf5fc486-abc7-4692-8566-ec4c91fd3a89)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 47}, sc=ExposureF] (id=5434dcbe-139c-4ed8-8a1d-857710cc37bb)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 37}, sc=ExposureF] (id=f89ebdae-2c18-4561-85d6-3ce54f4d897f)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 51}, sc=ExposureF] (id=133fb332-bc09-47ca-945e-a35155f74e56)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 41}, sc=ExposureF] (id=bef1db63-a01d-4514-b274-c02083cb54ab)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 79}, sc=ExposureF] (id=ab01acbc-3bf1-48e3-8297-24afa0d7616a)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 83}, sc=ExposureF] (id=af91b011-71ca-4aa4-95af-ffbf009bc7ed)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 63}, sc=ExposureF] (id=f068afdb-1e13-428f-a4b8-0fa9096d5f48)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 53}, sc=ExposureF] (id=efd836e7-a618-4aee-9bae-aa71ed9e7b7f)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 82}, sc=ExposureF] (id=92894495-5832-4191-ab00-bf2410eeca1c)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 6}, sc=ExposureF] (id=28d94004-7324-4400-9f14-7ae26f680dfb)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 40}, sc=ExposureF] (id=e34125b9-2272-4b9e-9afe-e2f5a8103839)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 95}, sc=ExposureF] (id=4995d117-5366-4f73-a8ff-db27041e015a)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 57}, sc=ExposureF] (id=9c83aeaa-72a0-4a19-9aab-f838cb756a63)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 92}, sc=ExposureF] (id=3e7aa5bb-6a9e-498b-a1ac-8ea6dd7bae55)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 27}, sc=ExposureF] (id=c617f175-c8b8-4e99-bfbe-3ad2fea4e4d1)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 9}, sc=ExposureF] (id=c7a7b6e2-03e9-4730-85ad-11e3f6f5a3f9)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 69}, sc=ExposureF] (id=7084e611-3a3a-4248-8f60-38c326d9bb11)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 94}, sc=ExposureF] (id=4b2884b6-437f-4c84-8296-863a834e3f12)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 8}, sc=ExposureF] (id=4646eed7-4cfd-4324-b7f2-f11ac7df8f7c)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 52}, sc=ExposureF] (id=1cb3bc1f-98bb-4e27-9e76-3a2ef8586187)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 81}, sc=ExposureF] (id=49f96254-876d-4067-8deb-c2876fafcb22)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 39}, sc=ExposureF] (id=da41424b-9fff-4825-a669-4b5b60c677c4)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 21}, sc=ExposureF] (id=3fed488b-33fd-42c7-9df2-cab2fee7ef89)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 29}, sc=ExposureF] (id=249f9856-e6d5-4781-98dd-97aa1d702c3f)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 26}, sc=ExposureF] (id=d8adb7a3-49cc-4d7c-9e38-af14d8b377e1)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 7}, sc=ExposureF] (id=05f326c6-0d1d-4873-b53d-ea5875a839bf)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 68}, sc=ExposureF] (id=e7f99290-16e9-408f-9fb1-05a28eeb3281)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 58}, sc=ExposureF] (id=5228592e-4607-4c6d-bece-595927e1d568)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 67}, sc=ExposureF] (id=c7a868a1-54f3-44e8-9ecc-c6e9e4203d98)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 71}, sc=ExposureF] (id=d347db9e-1392-4f7b-a812-01ddd090050a)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 80}, sc=ExposureF] (id=24884513-ba2a-4741-98bc-c6b0632c70e4)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 84}, sc=ExposureF] (id=1b240485-0fed-483e-a3f2-e0bed3362ba1)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 74}, sc=ExposureF] (id=884ae1ad-23fb-41e1-8b29-1ba09b52effe)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 28}, sc=ExposureF] (id=42b9739f-a4c7-4c6c-8d48-608ad7f4330c)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 2, patch: 10}, sc=ExposureF] (id=ddfa17e8-ad88-4613-a743-13166e4a7ed8)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 62}, sc=ExposureF] (id=554c48e1-04a0-450b-af50-131054a987a9)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 56}, sc=ExposureF] (id=c110a222-7c70-4e33-8c7c-00309118ac41)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 60}, sc=ExposureF] (id=3ce5fe46-cc48-42fa-a97c-e1fb5635c5c5)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 73}, sc=ExposureF] (id=c9008dc4-bede-4371-afb4-4c1c7584a7e5)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 72}, sc=ExposureF] (id=d6d93844-0c5b-44dd-8c4a-593672f3a3d7)
goodSeeingCoadd@{band: 'g', skymap: 'discrete/hits', tract: 1, patch: 70}, sc=ExposureF] (id=243cd35e-eb8d-49c3-b69e-808ecdba0e4b)

Now, all of these coadds should cover the region of the sky that corresponds to the science exposure I will use. And considering the parameters of ImageDifferenceTask.run, the templateExposure should be these goodSeeingCoadd exposures, right?
I'm confused because I can only retrieve (butler.get) an individual goodSeeingCoadd for a given tract and patch, so my guess is that I should somehow make a mosaic of all of them? That way I could correctly 'inject' it into ImageDifferenceTask.run.
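
For reference, this is the kind of per-patch retrieval I mean (just a sketch; the tract/patch values are taken from the list above):

from lsst.daf.butler import Butler

butler = Butler("data_hits", collections="Coadd_SNe/Blind15A_02")
# Each call returns a single patch of the template (an ExposureF)
template_patch = butler.get("goodSeeingCoadd",
                            band="g", skymap="discrete/hits", tract=1, patch=50)
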
Happy to hear your comments :slight_smile:

I agree you seem to have a nice set of goodSeeingCoadds ready for use as templates! All you should need to do is list the output collection as an input to your image difference pipeline, e.g.,

pipetask run -b data_hits -i Coadd_SNe/Blind15A_02,[list science image collection, calibs, skymaps, etc. here] -p ApPipe.yaml -o [output collection name here]  -d "exposure IN (1, 2, 3)"

The butler handles all the tract/patch lookups for each visit/detector you want to difference. If you want to process data that spans more than one tract, life is slightly more complicated, but that is totally doable too and should soon be trivial.
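
If you want to see which template patches the butler will match to a given science image, a registry query along these lines shows the spatial overlaps (just a sketch; the visit/detector values are placeholders for one of your science images):

from lsst.daf.butler import Butler

butler = Butler("data_hits")
science_visit, science_det = 411633, 35  # placeholders: one of your science exposures
# All tract/patch combinations whose sky regions overlap that visit+detector
for dataId in butler.registry.queryDataIds(
        ["tract", "patch"],
        dataId={"instrument": "DECam", "visit": science_visit,
                "detector": science_det, "skymap": "discrete/hits"}):
    print(dataId)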

Ok :slight_smile:
by running that command:

pipetask run -b data_hits -i Coadd_SNe/Blind15A_02,processCcdOutputs/Blind15A_02,refcatsv2,DECam/raw/all,DECam/calib,skymaps,DECam/calib/20180503calibs,DECam/calib/20150219calibs,DECam/calib/20150220calibs,DECam/calib/20150218calibs,DECam/calib/20150217calibs,DECam/calib/20150221calibs -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApPipe.yaml  -o imagDiff_SNe/Blind15A_02  -d "exposure IN (410891, 410947, 410997, 411432, 411231, 411281, 411331, 411381,411633, 411784, 411834, 411683, 411734, 412036, 412086, 412226,412280, 742019, 742013, 742020, 742014) AND detector = 35" -j 3

I encountered the error:

Traceback (most recent call last):
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/bin/pipetask", line 29, in <module>
    sys.exit(main())
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/pipetask.py", line 52, in main
    return cli()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 102, in run
    qgraph = script.qgraph(pipelineObj=pipeline, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/script/qgraph.py", line 148, in qgraph
    qgraph = f.makeGraph(pipelineObj, args)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 569, in makeGraph
    qgraph = graphBuilder.makeGraph(pipeline, collections, run, args.data_query, metadata=metadata)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_base/22.0.1-23-g91f4350+66b3ef8762/python/lsst/pipe/base/graphBuilder.py", line 923, in makeGraph
    scaffolding = _PipelineScaffolding(pipeline, registry=self.registry)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_base/22.0.1-23-g91f4350+66b3ef8762/python/lsst/pipe/base/graphBuilder.py", line 413, in __init__
    datasetTypes = PipelineDatasetTypes.fromPipeline(pipeline, registry=registry)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_base/22.0.1-23-g91f4350+66b3ef8762/python/lsst/pipe/base/pipeline.py", line 930, in fromPipeline
    for taskDef in pipeline:
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_base/22.0.1-23-g91f4350+66b3ef8762/python/lsst/pipe/base/pipeline.py", line 589, in toExpandedPipeline
    config.validate()
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ap_association/22.0.1-14-ga93a7b3+5b4b187193/python/lsst/ap/association/diaPipe.py", line 254, in validate
    pexConfig.Config.validate(self)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pex_config/22.0.1-1-g87000a6+9f9726e262/python/lsst/pex/config/config.py", line 1300, in validate
    field.validate(self)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pex_config/22.0.1-1-g87000a6+9f9726e262/python/lsst/pex/config/configurableField.py", line 357, in validate
    value.validate()
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/dax_apdb/22.0.1-2-gb66926d+cc1287fa8a/python/lsst/dax/apdb/apdb.py", line 210, in validate
    super().validate()
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pex_config/22.0.1-1-g87000a6+9f9726e262/python/lsst/pex/config/config.py", line 1300, in validate
    field.validate(self)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pex_config/22.0.1-1-g87000a6+9f9726e262/python/lsst/pex/config/config.py", line 427, in validate
    raise FieldValidationError(self, instance, "Required value cannot be None")
lsst.pex.config.config.FieldValidationError: Field 'apdb.db_url' failed validation: Required value cannot be None
For more information see the Field definition at:
  File dax/apdb/apdb.py:138 (ApdbConfig) and the Config definition at:
  File dax/apdb/apdb.py:136 (<module>)

From what I've read, I must set up the database for the AP pipeline, and to do so I should run something like:

make_apdb.py -c db_url="sqlite:///databases/apdb.db"

but this exact command doesn't do the trick and gives the error message "unable to open database file". In your guide you have db_url="postgresql://USER@DB_ADDRESS/DB_NAME", so in my case I'm not completely sure what I should write. Should I write the name of my SQL database? I would like to know your input before I run something "irreversible" :sweat_smile:
thanks!

Oh, that is true, you do need to run make_apdb.py and then give the URL for that brand new empty APDB in your config. I am assuming you want to run the full AP Pipeline; if that's not the case you could just exclude the diaPipe step. The simplest APDB to make is an sqlite one like the example you pasted: the URL is essentially just a file path, and it can live wherever you want (just not in the Butler repo please… some working directory is fine).
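
One detail that often trips people up with sqlite URLs: three slashes means a path relative to wherever you launch from, four slashes means an absolute path (that relative form is probably what produced your "unable to open database file" error). A quick way to check, using plain SQLAlchemy rather than anything stack-specific:

from sqlalchemy import create_engine

# "sqlite:///databases/apdb.db"               -> ./databases/apdb.db, relative to the current directory
# "sqlite:////home/jahumada/databases/apdb.db" -> an absolute path
engine = create_engine("sqlite:////home/jahumada/databases/apdb.db")
engine.connect().close()  # succeeds (and creates the file) only if the directory exists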

You can run into database errors down the line if you attempt to re-use an existing APDB, so it's best to make a new empty one each time you run ApPipe.yaml unless you have a strong reason not to. Hope that helps.

Ok! I've run

make_apdb.py -c db_url="sqlite:////home/jahumada/databases/apdb.db” isolation_level=READ_UNCOMMITTED

and after that, I tried the pipetask run command again, but I still get the same error:

File dax/apdb/apdb.py:138 (ApdbConfig) and the Config definition at:
File dax/apdb/apdb.py:136 (<module>)

I know that make_apdb.py lives in the directory lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ap_pipe/22.0.1-14-g4707e2c+f89552dc5c/python/lsst/ap/pipe/, whereas dax/apdb/apdb.py lives in lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/dax_apdb/22.0.1-2-gb66926d+cc1287fa8a/python/lsst/dax/apdb/. I wonder if maybe I should try to configure apdb.py, since this file is the one that appears in the error message…
Thanks :slight_smile:

I think you missed this: ap_pipe/ApPipe.yaml at main · lsst/ap_pipe · GitHub

# A db_url is always required, e.g.,
# -c diaPipe:apdb.db_url: 'sqlite:////project/user/association.db'

although I think the syntax for your pipetask command should actually be:

-c diaPipe:apdb.db_url='sqlite:////home/jahumada/databases/apdb.db'

Thanks a lot ktl! Adding the flags -c diaPipe:apdb.isolation_level='READ_UNCOMMITTED' -c diaPipe:apdb.db_url='sqlite:////home/jahumada/databases/apdb.db' got rid of the apdb.py errors!

Now when I run the pipetask command, for some reason I get:

ctrl.mpexec.cmdLineFwk INFO: QuantumGraph contains 126 quanta for 6 tasks, graph ID: '1642458527.2738802-211352'
Traceback (most recent call last):
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/bin/pipetask", line 29, in <module>
    sys.exit(main())
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/pipetask.py", line 52, in main
    return cli()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 103, in run
    script.run(qgraphObj=qgraph, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/script/run.py", line 171, in run
    f.runPipeline(qgraphObj, taskFactory, args)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 650, in runPipeline
    preExecInit.initialize(graph,
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/preExecInit.py", line 90, in initialize
    self.initializeDatasetTypes(graph, registerDatasetTypes)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/preExecInit.py", line 137, in initializeDatasetTypes
    expected = self.butler.registry.getDatasetType(datasetType.name)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registries/sql.py", line 391, in getDatasetType
    return self._managers.datasets[name].datasetType
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/interfaces/_datasets.py", line 509, in __getitem__
    raise KeyError(f"Dataset type with name '{name}' not found.")
KeyError: "Dataset type with name 'goodSeeingDiff_diaSrc_schema' not found."

This is weird because this is the first time I'm running this task, and I have no goodSeeingDiff_diaSrc_schema dataset type from prior steps… I imagine "Diff" comes from the image differencing method?
Do you know what could be happening?

Sorry I wasn't clearer about the APDB config situation!

Are you using a relatively recent version of ap_pipe (in particular, $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApPipe.yaml and the camera-agnostic ApPipe.yaml it imports from one level up)? The configs in there, last updated Oct 2021, should "just work." I do recall encountering errors of the sort you mention last summer, when we were in the process of switching the AP Pipeline over to goodSeeing everything.

Thanks Meredith :slight_smile:
I updated the software and no longer have the goodSeeingDiff_diaSrc_schema error :smiley:

So now, after running the pipetask:

pipetask run -b data_hits -i Coadd_SNe/Blind15A_02,processCcdOutputs/Blind15A_02,refcatsv2,DECam/raw/all,DECam/calib,skymaps,DECam/calib/20180503calibs,DECam/calib/20150219calibs,DECam/calib/20150220calibs,DECam/calib/20150218calibs,DECam/calib/20150217calibs,DECam/calib/20150221calibs -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApPipe.yaml  -o imagDiff_SNe/Blind15A_02  -d "exposure IN (410891, 410947, 410997, 411432, 411231, 411281, 411331, 411381,411633, 411784, 411834, 411683, 411734, 412036, 412086, 412226,412280, 742019, 742013, 742020, 742014) AND detector = 35" -j 3 -c diaPipe:apdb.db_url='sqlite:////home/jahumada/databases/apdb.db' -c diaPipe:apdb.isolation_level='READ_UNCOMMITTED' --register-dataset-types

I got the error

FileNotFoundError: Not enough datasets (0) found for non-optional connection isr.flat (flat) with minimum=1 for quantum data ID {instrument: 'DECam', detector: 35, exposure: 411734, ...}.

Which I don't understand, because the flat that corresponds to that exposure is included in the inputs of the pipetask command (DECam/calib/20150219calibs). It is the same collection I used as input to produce the processed images for that exposure :frowning:. I ran the command again without exposure 411734 and got a similar error for another exposure that uses the same flat, so there's definitely a problem with this collection specifically. Trying to solve this, I figured I could include the collection of flats for that exposure directly in the inputs of the pipetask command, but then I encountered another error:

Output CHAINED collection 'imagDiff_SNe/Blind15A_02' exists, but it ends with a different sequence of input collections than those given:

After that I did other tests by changing the output collection name… but these also resulted in errors because I used the same apdb database :expressionless:. I then tried to create another apdb database to run this again from scratch, but for some reason when I run:

make_apdb.py -c db_url="sqlite:////home/jahumada/databases/apdbv2.db” isolation_level=READ_UNCOMMITTED

It doesn't stop running… It just shows:

(lsst-scipipe-0.7.0) [jahumada@leftraru1 ~]$ make_apdb.py -c db_url="sqlite:////home/jahumada/databases/apdbv2.db” isolation_level=READ_UNCOMMITTED
> 
> 
> 
> 
> 

I finally tried to run the same pipetask command from the beginning of this post, but now with the options --extend-run --skip-existing --clobber-outputs

And got:

RuntimeError: Duplicate DiaSources found after association and merging with history. This is likely due to re-running data with an already populated Apdb. If this was not the case then there was an unexpected failure in Association while matching sources to objects, and should be reported. Exiting.

So, I was wondering if there is a way to --skip-existing DiaSources, or if it's best to start over and figure out how to make another apdb database.

Thanks a lot :full_moon_with_face:

Hmm, OK, a few things to figure out here. First of all, you definitely want a fresh APDB with this kind of use case. A "real" APDB assumes time will always be going forward and you'll never reprocess the same DiaSource.

I am not sure why make_apdb.py is hanging for you. I usually run it as

make_apdb.py -c isolation_level=READ_UNCOMMITTED -c db_url="sqlite:////WHATEVERPATH/association.db"

but that is virtually identical to what you wrote, unless the config flags are more particular than I realized.
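
If it would help to see whether the schema ever gets created before the hang, you can peek into the sqlite file with just the standard library (a quick sanity check, nothing stack-specific):

import sqlite3

# Lists whatever tables make_apdb.py managed to create so far (an empty list means none)
con = sqlite3.connect("/home/jahumada/databases/apdbv2.db")
print(con.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall())
con.close()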

Next… I'm not sure what's going on with the flats. We have been changing some things around in the various pipelines and configs, so I'm guessing you accidentally landed in some mismatch state. What stack version are you on, and perhaps most crucially, how up-to-date is your ApPipe.yaml? If you're sure you're specifying the correct input calib repo, something has clearly changed since you previously ran single frame processing (i.e., ISR, characterize, and calibrate), but I can't immediately tell you what.

Regarding input collections: once you assign a set of input collections and start processing data, you can't change the inputs for a subsequent processing run into the same output collection. That's intentional. It might help to run butler query-collections with --chains=tree to get a clearer look at which collections are chained together. For example, in one giant shared repo, I can do this

(lsst-scipipe) [mrawls@lsst-devl03 ~]$ butler query-collections /repo/main DECam* --chains=tree
                                Name                                     Type   
-------------------------------------------------------------------- -----------
DECam/defaults                                                       CHAINED    
  DECam/raw/all                                                      RUN        
  DECam/calib                                                        CHAINED    
    DECam/calib/DM-30703                                             CALIBRATION
    DECam/calib/DM-29338                                             CALIBRATION
    DECam/calib/DM-28638                                             CALIBRATION
    DECam/calib/DM-28638/unbounded                                   RUN        
  refcats                                                            CHAINED    
    refcats/DM-28636                                                 RUN        
  skymaps                                                            RUN        

[...] (more collections exist too; just a snippet)

and then I know I can just specify -i DECam/defaults as an input collection to get all the calibs, refcats, skymaps, etc. I could ever need.
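
You can do the same kind of chain inspection from Python via the registry, if that is more convenient (a sketch; substitute your own repo path):

from lsst.daf.butler import Butler, CollectionType

butler = Butler("data_hits")
# Print each CHAINED collection and the child collections it searches, in order
for parent in butler.registry.queryCollections(collectionTypes={CollectionType.CHAINED}):
    print(parent, "->", list(butler.registry.getCollectionChain(parent)))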

Hope some of this guides you in a useful direction.

I tried running the make_apdb.py command you suggested, but I get the same result: it won't stop running :open_mouth:. I'm using the LSST stack v23, which I installed following the instructions at Install with newinstall.sh and eups distrib — LSST Science Pipelines.
Prior to that, I was using the weekly version: lsst stack v22 w37.
Thanks for the tip about --chains=tree :slight_smile:

It's possible there could be problems if your home directory is on a network filesystem. Could you try with e.g. /tmp/association.db and see if that also hangs? (You could then copy that to your /home, but it might have the same problem when used later in the pipeline.)

Thanks ktl, it didn't hang when I tried /tmp/association.db, and after doing that, it started to work again :D.
Leaving aside the exposures that use the master flat that doesn't work, I was able to run the imageDifferencing task :smiley:
I suppose the dataset type goodSeeingDiff_differenceExp is the science image that was subtracted?

Prerequisites — LSST Science Pipelines gives details on what you could check with your local system administrators with regard to your /home network filesystem.

I'm still not sure what to make of your flat-not-found oddity, but yes, in this scenario goodSeeingDiff_differenceExp is what difference images are called, i.e., the result from doing difference imaging on a science image with a template.
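
In case it is useful, getting one of those difference images back out of the butler looks something like this (a sketch; the collection name and visit/detector values are the ones from your posts above):

from lsst.daf.butler import Butler

butler = Butler("data_hits", collections="imagDiff_SNe/Blind15A_02")
# One difference image for a single science visit/detector
diff = butler.get("goodSeeingDiff_differenceExp",
                  instrument="DECam", visit=411633, detector=35)
print(diff.getBBox())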

Hi! I appear to have the same problem: make_apdb.py hangs and doesn't stop running. I tried different directories like /tmp/association.db, but now that doesn't do the trick. Do you have any suggestions as to what I could try to fix this issue? For now, I will update the software to the weekly version v22_10 and check if it works.
Thanks!

Tried with the weekly version v22_10 and it still hangs… I don't know what could be going on :confused:

The only suggestions I have are either to enable DEBUG logging for lsst.dax.apdb or to run make_apdb.py under the pdb debugger to see where it is getting hung up.

I think I found the issue! It seems that make_apdb.py is sensitive to the order in which you write the flags.
It works if I run, for example:
make_apdb.py -c isolation_level=READ_UNCOMMITTED -c db_url="sqlite:////home/jahumada/databases/blind15a_26_agn.db"
Before, I ran with -c db_url="sqlite:////home/jahumada/databases/blind15a_26_agn.db" as the first flag.