Image Difference Task using DECam images [gen3]

Hi everyone :slight_smile:
Currently, I have calibrated exposures of DECam images and would like to do image-differencing photometry on them. I am a little lost about the steps I should take. From posts on the community forum I’ve seen people do image differencing directly on their calibrated exposures, as well as with an intermediate step of building ‘coadds’ (I’m not sure what the latter means or does).
I also wonder if, under the gen3 middleware, the syntax for image differencing has changed as well.
Any suggestions or references that you may have to clarify these questions will be greatly appreciated!

Hi Paula, I wrote this guide for processing DC2 data with the AP Pipeline, but the same principles apply for processing DECam data (including difference imaging). In essence, you put ImageDifferenceTask (along with whatever else!) in a YAML pipeline file, and specify via pipetask run (or a batch processing system) which dataIds to process. Have a look and see if this gets you started.

Edited to add: the coadd thing is because when you do difference imaging, you need some reference template. The Science Pipelines are set up to expect a stacked (deep) coadd image as a difference imaging template. That’s why I often build coadds with one pipeline, e.g., ApTemplate.yaml, and then run difference imaging (and other steps) with another pipeline, e.g., ApPipe.yaml.
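If you want to peek at what one of these pipeline files actually contains before running it, here’s a minimal sketch (assuming you have the stack set up so $AP_PIPE_DIR is defined; any pipeline YAML works the same way):

import os
from lsst.pipe.base import Pipeline

# Load the template-building pipeline shipped with ap_pipe and list its tasks.
pipeline = Pipeline.fromFile(
    os.path.join(os.environ["AP_PIPE_DIR"],
                 "pipelines/DarkEnergyCamera/ApTemplate.yaml"))
for task_def in pipeline.toExpandedPipeline():
    print(task_def.label, "->", task_def.taskName)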

3 Likes

Thanks for the reference Meredith :slight_smile: I have a question regarding the deep coadd image: is it recommended to build it with all the available images centered at a certain (RA, Dec), i.e. all the images for which I would later do image differencing, or is it best to build it with a subset that has the best seeing?
I also wonder what method of image differencing the LSST Science Pipelines use, because I haven’t seen any references… maybe it is a unique method implemented by the pipelines?

Thanks again!

In general it is best to build a coadd intended as a difference imaging template from some sample of the best-seeing images, but as in all things, there are tradeoffs. For instance, it’s also important to have full coverage by way of several (ballpark, maybe 5 or more?) visits being coadded in each patch (chunk of sky).

We have a relatively new BestSeeingQuantileSelectVisitsTask in pipe_tasks that I use for selecting a subset of best-seeing images; check out ApTemplate.yaml. During actual LSST operations, templates will be generated in advance of running the AP Pipeline, and won’t contain the new science image visit as one of the inputs to the template. That said, there’s nothing inherently wrong with using the same visit both as part of a template and as a science image visit for difference imaging with other datasets.
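If you want to tune how aggressive that seeing cut is, a config override is the way to go. A sketch, saved as e.g. seeing_config.py and passed with -C selectGoodSeeingVisits:seeing_config.py (the label matches ApTemplate.yaml; the field names below are my recollection of the BestSeeingQuantileSelectVisitsTask config, so verify them with pipetask’s --show config before relying on this):

# Hypothetical override for the visit-selection step of ApTemplate.yaml.
config.qMax = 0.33      # keep roughly the best-seeing third of visits
config.nVisitsMin = 6   # but never coadd fewer than this many visits per patch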

For difference imaging, we use ImageDifferenceTask, also in pipe_tasks, which uses lots of code from the ip_diffim package. Check out ApPipe.yaml. By default we use the Alard & Lupton (1998) algorithm, but it also supports another algorithm we call ZOGY (Zackay, Ofek, & Gal-Yam 2016). There is a very new implementation of image differencing, lsst.pipe.tasks.ImageDifferenceFromTemplateTask, which must be run after lsst.ip.diffim.getTemplate.GetMultiTractCoaddTemplateTask, for handling datasets that span multiple tracts.
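Switching algorithms is just a config choice. A sketch, assuming the task label in ApPipe.yaml is imageDifference and that the subtract registry field offers "al" and "zogy" (double-check both against your pipeline version):

# Hypothetical override file, e.g. zogy_config.py, passed as:
#   pipetask run ... -C imageDifference:zogy_config.py
# Select ZOGY subtraction instead of the default Alard & Lupton.
config.subtract.name = "zogy"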

You can learn more about some of the recent work on difference imaging algorithms in technote DMTN-179 and this AP team status update from earlier in the year.

1 Like

Thanks for your response! I have tried the following command:

pipetask run -b data_hits/ -i processCcdOutputs/Blind15A_02,refcats,DECam/raw/all,DECam/calib --output-run BestSeeing/Blind15A_02 -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApTemplate.yaml --register-dataset-types

and the output I get is:

RuntimeError: Error finding datasets of type skyMap in collections [processCcdOutputs/Blind15A_02, refcats, DECam/raw/all, DECam/calib]; it is impossible for any such datasets to be found in any of those collections, most likely because the dataset type is not registered.  This error may become a successful query that returns no results in the future, because queries with no results are not usually considered an error.

So I wonder: what exactly is the skyMap dataset, and is there a way to obtain it, having only produced calibrated exposures so far?

Getting started tutorial part 5: coadding images — LSST Science Pipelines describes what a skymap is and why it’s needed. But we do not yet have a tutorial that shows how to create one using the butler register-skymap command. There are several kinds of skymaps, each with various configuration parameters; the best one for you may depend on things like how much sky you intend to process.

1 Like

Thanks a lot ktl! In my case I want a piece of sky that covers roughly a box of RA = [135, 162] and Dec = [-8.3, 4]. Do you know where I could find the available configuration parameters that would work for me? I’ve been checking the scripts in pipe_tasks and I don’t really understand where they are. It’s also not clear to me how register-skymap constructs the skyMap; does it use the reference catalog, for instance?

Thanks again!

Here’s an example of using a DiscreteSkyMap to cover a particular RA/Dec region:
Config file: faro/hsc_sky_map.py at master · lsst/faro · GitHub
Butler command: faro/hsc_gen2_to_gen3.sh at master · lsst/faro · GitHub

Skymaps are purely descriptions of a tangent plane pixelization; there are no references to catalogs or anything real in the sky.

Other types of skymaps, including ones that cover the entire sky instead of just a small area, are listed at lsst.skymap — LSST Science Pipelines
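As a concrete illustration of the “purely geometric” point, here is a sketch of loading a registered skymap and asking which tract and patch cover a coordinate (the repo path and skymap name follow this thread, and I’m assuming the skymap was registered into the usual skymaps collection):

import lsst.geom as geom
from lsst.daf.butler import Butler

# Load the registered skymap; no images or catalogs are involved.
butler = Butler("data_hits", collections=["skymaps"])
skymap = butler.get("skyMap", skymap="discrete/hits")

# Map an (RA, Dec) coordinate to its tract and patch.
coord = geom.SpherePoint(150.0 * geom.degrees, -2.0 * geom.degrees)
tract = skymap.findTract(coord)
patch = tract.findPatch(coord)
print(tract.getId(), patch.getIndex())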

1 Like

Hi ktl, thanks a lot for all the references! I was able to create a discrete skymap :smiley:

I ultimately created a script called config_hits.py that contained:

config.skyMap["discrete"].patchInnerDimensions=[4000, 4000]  #dimensions of inner region of patches (x,y pixels) (List, default (4000, 4000))
config.skyMap["discrete"].patchBorder=100 # border between patch inner and outer bbox (pixels) (int, default 100)
config.skyMap["discrete"].tractOverlap=0.0 #minimum overlap between adjacent sky tracts, on the sky (deg) (float, default 1.0)
config.skyMap["discrete"].pixelScale=0.2626 #nominal pixel scale (arcsec/pixel)
config.skyMap["discrete"].projection='TAN' # one of the FITS WCS projection codes, such as:- STG: stereographic projection- MOL: Molleweide’s projection- TAN: tangent-plane projection (str, default 'STG')
config.skyMap["discrete"].rotation=0.0 #Rotation for WCS (deg) (float, default 0)
config.skyMap["discrete"].raList=[137.225, 137.225, 138.38 , 138.38 , 139.535, 140.69 , 140.69 ,
       141.845, 141.845, 143.   , 143.   , 144.155, 144.155, 145.31 ,
       145.31 , 146.465, 147.389, 147.62 , 147.62 , 148.931, 149.7  ,
       150.086, 150.086, 150.12 , 152.193, 152.387, 152.396, 153.094,
       153.12 , 154.324, 154.697, 155.12 , 155.39 , 155.47 , 156.856,
       156.856, 157.199, 157.947, 158.011, 159.148, 159.166, 160.38] 
config.skyMap["discrete"].decList=[-0.29974, -4.30078, -2.30026,  1.70078, -0.29974, -2.30026,
        1.70078, -0.29974, -4.30078, -2.30026,  1.70078, -0.29974,
       -4.30078, -6.3013 ,  1.70078, -4.30078, -0.06874, -6.3013 ,
        2.16278, -4.0963 ,  0.15   , -2.09578, -6.09682,  2.21   ,
       -2.09578,  2.09578, -6.09682, -4.19156,  0.     , -2.09578,
        2.09578, -6.52   ,  0.09526, -4.95   , -2.14098,  1.86006,
       -6.52   , -4.19156, -0.14046, -6.28734,  1.86006,  0.]
config.skyMap["discrete"].radiusList=[1.5]*42
config.skyMap.name = "discrete"
config.name = "discrete/hits"

then I ran:
butler register-skymap data_hits/ -C testdata_hits/config_hits.py

and got the output:

pipe.tasks.script.registerSkymap INFO: sky map has 42 tracts
pipe.tasks.script.registerSkymap INFO: tract 0 has corners (138.830, -1.903), (135.620, -1.903), (135.621, 1.304), (138.829, 1.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 1 has corners (138.837, -5.903), (135.613, -5.903), (135.619, -2.695), (138.830, -2.695) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 2 has corners (139.987, -3.903), (136.773, -3.903), (136.776, -0.696), (139.984, -0.696) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 3 has corners (139.984, 0.096), (136.776, 0.096), (136.774, 3.304), (139.986, 3.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 4 has corners (141.140, -1.903), (137.930, -1.903), (137.931, 1.304), (141.139, 1.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 5 has corners (142.297, -3.903), (139.083, -3.903), (139.086, -0.696), (142.294, -0.696) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 6 has corners (142.294, 0.096), (139.086, 0.096), (139.084, 3.304), (142.296, 3.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 7 has corners (143.450, -1.903), (140.240, -1.903), (140.241, 1.304), (143.449, 1.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 8 has corners (143.457, -5.903), (140.233, -5.903), (140.239, -2.695), (143.450, -2.695) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 9 has corners (144.607, -3.903), (141.393, -3.903), (141.396, -0.696), (144.604, -0.696) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 10 has corners (144.604, 0.096), (141.396, 0.096), (141.394, 3.304), (144.606, 3.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 11 has corners (145.760, -1.903), (142.550, -1.903), (142.551, 1.304), (145.759, 1.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 12 has corners (145.767, -5.903), (142.543, -5.903), (142.549, -2.695), (145.760, -2.695) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 13 has corners (146.929, -7.903), (143.691, -7.903), (143.701, -4.695), (146.919, -4.695) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 14 has corners (146.914, 0.096), (143.706, 0.096), (143.704, 3.304), (146.916, 3.304) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 15 has corners (148.077, -5.903), (144.853, -5.903), (144.859, -2.695), (148.070, -2.695) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 16 has corners (148.993, -1.672), (145.785, -1.672), (145.785, 1.535), (148.993, 1.535) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 17 has corners (149.239, -7.903), (146.001, -7.903), (146.011, -4.695), (149.229, -4.695) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 18 has corners (149.224, 0.558), (146.016, 0.558), (146.013, 3.766), (149.227, 3.766) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 19 has corners (150.543, -5.698), (147.319, -5.698), (147.326, -2.491), (150.536, -2.491) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 20 has corners (151.304, -1.454), (148.096, -1.454), (148.095, 1.754), (151.304, 1.754) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 21 has corners (151.693, -3.699), (148.479, -3.699), (148.482, -0.491), (151.690, -0.491) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 22 has corners (151.704, -7.698), (148.468, -7.698), (148.477, -4.491), (151.695, -4.491) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 23 has corners (151.724, 0.605), (148.516, 0.605), (148.513, 3.813), (151.727, 3.813) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 24 has corners (153.800, -3.699), (150.586, -3.699), (150.589, -0.491), (153.797, -0.491) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 25 has corners (153.991, 0.491), (150.783, 0.491), (150.780, 3.699), (153.994, 3.699) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 26 has corners (154.014, -7.698), (150.778, -7.698), (150.787, -4.491), (154.005, -4.491) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 27 has corners (154.706, -5.794), (151.482, -5.794), (151.489, -2.586), (154.699, -2.586) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 28 has corners (154.724, -1.604), (151.516, -1.604), (151.516, 1.604), (154.724, 1.604) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 29 has corners (155.931, -3.699), (152.717, -3.699), (152.720, -0.491), (155.928, -0.491) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 30 has corners (156.301, 0.491), (153.093, 0.491), (153.090, 3.699), (156.304, 3.699) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 31 has corners (156.740, -8.121), (153.500, -8.121), (153.510, -4.914), (156.730, -4.914) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 32 has corners (156.994, -1.508), (153.786, -1.508), (153.786, 1.699), (156.994, 1.699) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 33 has corners (157.084, -6.552), (153.856, -6.552), (153.863, -3.344), (157.076, -3.344) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 34 has corners (158.463, -3.744), (155.249, -3.744), (155.252, -0.536), (158.460, -0.536) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 35 has corners (158.460, 0.256), (155.252, 0.256), (155.249, 3.463), (158.463, 3.463) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 36 has corners (158.819, -8.121), (155.579, -8.121), (155.589, -4.914), (158.809, -4.914) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 37 has corners (159.559, -5.794), (156.335, -5.794), (156.342, -2.586), (159.552, -2.586) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 38 has corners (159.615, -1.744), (156.406, -1.744), (156.407, 1.463), (159.615, 1.463) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 39 has corners (160.767, -7.889), (157.529, -7.889), (157.539, -4.681), (160.757, -4.681) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 40 has corners (160.770, 0.256), (157.562, 0.256), (157.559, 3.463), (160.773, 3.463) (RA, Dec deg) and 11 x 11 patches
pipe.tasks.script.registerSkymap INFO: tract 41 has corners (161.984, -1.604), (158.776, -1.604), (158.776, 1.604), (161.984, 1.604) (RA, Dec deg) and 11 x 11 patches

Following Meredith’s indications, I’m currently running:
pipetask run -b data_hits/ -i processCcdOutputs/Blind15A_02,refcatsv2,DECam/raw/all,DECam/calib,skymaps,DECam/calib/20150217calibs,DECam/calib/20150218calibs,DECam/calib/20150219calibs,DECam/calib/20150220calibs --output-run BestSeeing/Blind15A_02 -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApTemplate.yaml --register-dataset-types --instrument lsst.obs.decam.DarkEnergyCamera -d "instrument='DECam' AND exposure IN (410891, 410947, 410997, 411231, 411281, 411331, 411381, 411432, 411633, 411683, 411734, 411784, 411834, 412036, 412086) AND detector NOT IN (2, 61)" -j 3

The exposures I am processing in this command correspond to the same field on the sky at different epochs, from the nights of 2015-02-17, 2015-02-18, 2015-02-19, and 2015-02-20.
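As an aside, one can preview what a -d expression like this matches before launching a big run by querying the registry directly; a sketch, with the exposure list abbreviated:

from lsst.daf.butler import Butler

# Preview the (exposure, detector) combinations a -d expression selects.
butler = Butler("data_hits")
where = ("instrument='DECam' AND exposure IN (410891, 410947, 410997) "
         "AND detector NOT IN (2, 61)")
for data_id in butler.registry.queryDataIds(["exposure", "detector"], where=where):
    print(data_id["exposure"], data_id["detector"])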

This command has made the following directory:

BestSeeing/
`-- Blind15A_02
    |-- assembleCoadd_config
    |-- calexp
    |-- calexpBackground
    |-- calibrate_config
    |-- calibrate_log
    |-- calibrate_metadata
    |-- characterizeImage_config
    |-- characterizeImage_log
    |-- characterizeImage_metadata
    |-- consolidateVisitSummary_config
    |-- deepCoadd_directWarp
    |-- deepCoadd_psfMatchedWarp
    |-- icExp
    |-- icExpBackground
    |-- icSrc
    |-- icSrc_schema
    |-- isr_config
    |-- isr_log
    |-- isr_metadata
    |-- makeWarp_config
    |-- makeWarp_log
    |-- makeWarp_metadata
    |-- packages
    |-- postISRCCD
    |-- selectGoodSeeingVisits_config
    |-- src
    |-- srcMatch
    `-- src_schema

Many folders were created! I speculate that the outputs most important for me are deepCoadd_psfMatchedWarp and deepCoadd_directWarp. I’m not sure why another calexp directory was created?
I’m also beginning to have a problem with disk space, because the processing steps create many heavy “artifacts”, so soon I will have to start deleting some intermediate files… but I’m not sure what is safe to delete. How do you approach this problem?

Thanks a lot!

1 Like

Great work!! Generally, to look at output data products you’re better off using the butler than perusing file directories, but it is heartening to see all the things landing in a sensible place. The two kinds of warp you mention are the precursors to what will hopefully be written out as plain old deepCoadds soon.
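For example, a minimal sketch of inspecting those outputs through the butler, with the repo path and collection name from this thread:

from lsst.daf.butler import Butler

# List the warps the run produced so far.
butler = Butler("data_hits")
for ref in butler.registry.queryDatasets("deepCoadd_directWarp",
                                         collections="BestSeeing/Blind15A_02"):
    print(ref.dataId)

# One of the printed dataIds can then be fetched as an exposure, e.g.
# (dataId values here are illustrative):
# warp = butler.get("deepCoadd_directWarp", instrument="DECam",
#                   skymap="discrete/hits", tract=1, patch=27, visit=411231,
#                   collections="BestSeeing/Blind15A_02")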

Incidentally, the latest version of ApTemplate.yaml creates warps and coadds with the name goodSeeingCoadd instead of deepCoadd, so I guess you’re using a slightly older version here? No problem if it is running successfully! :crossed_fingers:

I’m not sure what to tell you about disk space… the science pipelines are definitely not optimized to conserve it. I don’t know offhand what, if anything, you can safely delete without confusing the butler. I can tell you that it’s normal to have “input data products” copied over to “output repository directories” (like the file tree you show with calexps in it). I suspect there’s a way to tell it not to do that, but I try to stay far away from manipulating files on disk when those files live in a butler repo.

1 Like

To clarify a few things:

  • You can delete datasets without confusing the Butler by using the butler prune-collection or butler prune-datasets commands (a Python sketch of the latter follows this list).
  • While the Butler makes it appear that “input data products” are copied to “output repository directories” (actually collections), it doesn’t actually do any copying. While we’re not optimal for space usage, we wouldn’t waste space like that :slight_smile:
  • The calexp (and icExp and …) datasets are present in this case because the entire ApTemplate.yaml pipeline was executed, which includes that single-frame processing. If you want to avoid that, you need to do one of two things: select only the pipeline tasks that you want to run (e.g. by specifying -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ApTemplate.yaml#makeTemplate) or tell pipetask to not produce outputs that are already available (e.g. by specifying --skip-existing-in processCcdOutputs/Blind15A_02 or possibly just --skip-existing).
  • The pipeline looks like it started to assemble the (template) coadd from the warps, as assembleCoadd_config is present, but something has gone wrong as none of those outputs are present. You’ll have to look at the logs to see what error occurred.
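Here is a sketch of the prune-datasets route from Python (the CLI does the same thing); postISRCCD is picked here just as an example of a bulky intermediate, and the repo and collection names follow this thread:

from lsst.daf.butler import Butler

# Remove bulky postISRCCD intermediates from one run collection.
# This unstores the files and purges the registry entries; it cannot be undone.
butler = Butler("data_hits", writeable=True)
refs = list(butler.registry.queryDatasets(
    "postISRCCD", collections="BestSeeing/Blind15A_02"))
butler.pruneDatasets(refs, disassociate=True, unstore=True, purge=True)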
2 Likes

Thanks again for all that valuable information!! It has helped me a lot :slight_smile:
I tried to run the same command, but now with the #makeTemplate subset appended to the pipeline path.

For some reason I get a QuantumGraph is empty error, which doesn’t happen when I don’t specify the #makeTemplate task… although I wasn’t able to run that to its entirety because of space issues, so maybe the error could have appeared eventually?

For the same reason (not finishing the run), I didn’t have any output files at that point. :open_mouth:

Happy to hear any comments :slight_smile:

Ok, so adding the option --skip-existing-in processCcdOutputs/Blind15A_02 I could run the task until I got the following error:

ctrl.mpexec.mpGraphExecutor ERROR: Failed jobs:
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 57, visit: 411784, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 72, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 26, visit: 412036, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 25, visit: 411633, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 85, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 25, visit: 411683, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 59, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 67, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 70, visit: 411281, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 58, visit: 411231, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 46, visit: 411784, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 29, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 85, visit: 412036, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 75, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 94, visit: 411633, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 32, visit: 411633, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 75, visit: 412036, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 67, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 85, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 32, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 31, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 32, visit: 411231, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 81, visit: 412086, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 8, visit: 412086, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 41, visit: 411784, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 58, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 39, visit: 411683, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 51, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 50, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 48, visit: 411633, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 50, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 30, visit: 410997, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 74, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 95, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 48, visit: 412086, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 64, visit: 412036, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 59, visit: 410997, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 64, visit: 410997, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 28, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 59, visit: 411231, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 31, visit: 411432, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 18, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 30, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 91, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 25, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 8, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 81, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 70, visit: 411834, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 57, visit: 411281, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 92, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 19, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 92, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 19, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 37, visit: 411734, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 25, visit: 412086, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 32, visit: 412036, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 95, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 29, visit: 411784, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 39, visit: 410997, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 59, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 39, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 68, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 57, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 69, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 53, visit: 411784, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 25, visit: 411231, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 31, visit: 412036, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 62, visit: 411281, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 18, visit: 411381, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 59, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 64, visit: 411432, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 75, visit: 411432, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 37, visit: 411683, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 6, visit: 410947, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 85, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 49, visit: 412086, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 67, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 40, visit: 411683, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 46, visit: 411281, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 45, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 26, visit: 411381, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 2, patch: 30, visit: 411784, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 75, visit: 410891, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 49, visit: 411331, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 36, visit: 411381, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED: <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 95, visit: 411633, ...}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 59}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 72}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 74}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 45}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 48}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 19}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 32}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 75}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 92}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 31}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 18}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 91}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 81}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 25}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 67}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 57}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 37}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 68}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 41}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 69}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 6}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 29}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 64}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 53}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 58}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 36}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 46}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 49}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 50}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 85}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 70}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 62}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 30}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 40}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 26}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 51}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 2, patch: 8}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 39}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 28}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 95}>
ctrl.mpexec.mpGraphExecutor ERROR:   - FAILED_DEP: <TaskDef(CompareWarpAssembleCoaddTask, label=assembleCoadd) dataId={band: 'g', skymap: 'discrete/hits', tract: 1, patch: 94}>
Traceback (most recent call last):
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/bin/pipetask", line 29, in <module>
    sys.exit(main())
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/pipetask.py", line 52, in main
    return cli()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 103, in run
    script.run(qgraphObj=qgraph, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/script/run.py", line 171, in run
    f.runPipeline(qgraphObj, taskFactory, args)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 669, in runPipeline
    executor.execute(graph, butler)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/mpGraphExecutor.py", line 300, in execute
    self._executeQuantaMP(graph, butler)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/mpGraphExecutor.py", line 486, in _executeQuantaMP
    raise MPGraphExecutorError("One or more tasks failed or timed out during execution.")
lsst.ctrl.mpexec.mpGraphExecutor.MPGraphExecutorError: One or more tasks failed or timed out during execution.

I don’t really understand the error here; could it be that the discrete skymap is not sufficient for my exposures?

The directory that I have so far is:

data_hits/BestSeeing/
`-- Blind15A_02
    |-- assembleCoadd_config
    |-- assembleCoadd_log
    |-- assembleCoadd_metadata
    |-- calibrate_config
    |-- characterizeImage_config
    |-- consolidateVisitSummary_config
    |-- consolidateVisitSummary_log
    |-- consolidateVisitSummary_metadata
    |-- deepCoadd_directWarp
    |-- deepCoadd_psfMatchedWarp
    |-- goodSeeingCoadd
    |-- goodSeeingCoadd_nImage
    |-- goodSeeingVisits
    |-- icSrc_schema
    |-- isr_config
    |-- makeWarp_config
    |-- makeWarp_log
    |-- makeWarp_metadata
    |-- packages
    |-- selectGoodSeeingVisits_config
    |-- selectGoodSeeingVisits_log
    |-- selectGoodSeeingVisits_metadata
    |-- src_schema
    `-- visitSummary

and the goodSeeingCoadd directory has the following content:

data_hits/BestSeeing/Blind15A_02/goodSeeingCoadd
|-- 1
|   |-- 27
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_27_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 29
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_29_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 35
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_35_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 38
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_38_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 47
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_47_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 52
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_52_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 56
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_56_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 60
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_60_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 61
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_61_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 63
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_63_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 71
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_71_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 73
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_73_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 79
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_79_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 80
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_80_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 82
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_82_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 83
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_83_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   |-- 84
|   |   `-- g
|   |       `-- goodSeeingCoadd_1_84_g_discrete_hits_BestSeeing_Blind15A_02.fits
|   `-- 93
|       `-- g
|           `-- goodSeeingCoadd_1_93_g_discrete_hits_BestSeeing_Blind15A_02.fits
`-- 2
    |-- 10
    |   `-- g
    |       `-- goodSeeingCoadd_2_10_g_discrete_hits_BestSeeing_Blind15A_02.fits
    |-- 20
    |   `-- g
    |       `-- goodSeeingCoadd_2_20_g_discrete_hits_BestSeeing_Blind15A_02.fits
    |-- 21
    |   `-- g
    |       `-- goodSeeingCoadd_2_21_g_discrete_hits_BestSeeing_Blind15A_02.fits
    |-- 7
    |   `-- g
    |       `-- goodSeeingCoadd_2_7_g_discrete_hits_BestSeeing_Blind15A_02.fits
    `-- 9
        `-- g
            `-- goodSeeingCoadd_2_9_g_discrete_hits_BestSeeing_Blind15A_02.fits

Thanks!

There should be additional error messages earlier in the log that tell you what went wrong with each failed MakeWarpTask. The CompareWarpAssembleCoaddTasks failed because their inputs weren’t present due to the MakeWarpTask failures (hence “FAILED_DEP”).
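Since makeWarp_log datasets appear in your output collection, one way to dig out those earlier messages is through the butler. A sketch, using a dataId from the first FAILED line above (this assumes the log dataset was written for the failed quantum):

from lsst.daf.butler import Butler

# Retrieve the captured log for one failed makeWarp quantum and print it.
butler = Butler("data_hits", collections="BestSeeing/Blind15A_02")
log = butler.get("makeWarp_log", instrument="DECam", skymap="discrete/hits",
                 tract=1, patch=57, visit=411784)
for record in log:
    print(record)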

1 Like

I see :slight_smile: I was able to find one of the errors:

ctrl.mpexec.singleQuantumExecutor ERROR: Execution of task 'makeWarp' on quantum {instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 57, visit: 411784, ...} failed
Traceback (most recent call last):
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1771, in _execute_context
    self.dialect.do_execute(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 717, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: disk I/O error

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 174, in execute
    self.runQuantum(task, quantum, taskDef, butler)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 450, in runQuantum
    task.runQuantum(butlerQC, inputRefs, outputRefs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_tasks/21.0.0-139-g4e3796b1+8c51ef16e5/python/lsst/pipe/tasks/makeCoaddTempExp.py", line 806, in runQuantum
    butlerQC.put(results.exposures["direct"], outputRefs.direct)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_base/22.0.1-23-g91f4350+66b3ef8762/python/lsst/pipe/base/butlerQuantumContext.py", line 200, in put
    self._put(values, dataset)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/pipe_base/22.0.1-23-g91f4350+66b3ef8762/python/lsst/pipe/base/butlerQuantumContext.py", line 96, in _put
    self.__butler.put(value, ref)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/core/utils.py", line 269, in inner
    return func(self, *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/_butler.py", line 929, in put
    ref, = self.registry.insertDatasets(datasetType, run=run, dataIds=[dataId])
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/core/utils.py", line 269, in inner
    return func(self, *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registries/sql.py", line 456, in insertDatasets
    refs = list(storage.insert(runRecord, expandedDataIds, idGenerationMode))
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/datasets/byDimensions/_storage.py", line 593, in insert
    yield from self._insert(run, dataIdList, rows, self._db.insert)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/datasets/byDimensions/_storage.py", line 631, in _insert
    insertMethod(self._static.dataset, *rows)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/databases/sqlite.py", line 422, in insert
    return super().insert(table, *rows, select=select, names=names, returnIds=returnIds)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/interfaces/_database.py", line 1381, in insert
    connection.execute(table.insert(), rows)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1263, in execute
    return meth(self, multiparams, params, _EMPTY_EXECUTION_OPTS)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1452, in _execute_clauseelement
    ret = self._execute_context(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1814, in _execute_context
    self._handle_dbapi_exception(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1995, in _handle_dbapi_exception
    util.raise_(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1771, in _execute_context
    self.dialect.do_execute(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 717, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) disk I/O error
[SQL: INSERT INTO dataset (id, dataset_type_id, run_id) VALUES (?, ?, ?)]
[parameters: ('158c0c980c084474b560dbb46dd99e65', 56, 50)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)
    util.raise_(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1771, in _execute_context
    self.dialect.do_execute(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 717, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) disk I/O error
[SQL: INSERT INTO dataset (id, dataset_type_id, run_id) VALUES (?, ?, ?)]
[parameters: ('158c0c980c084474b560dbb46dd99e65', 56, 50)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1771, in _execute_context
    self.dialect.do_execute(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 717, in do_execute
    cursor.execute(statement, parameters)
sqlite3.OperationalError: disk I/O error

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/multiprocessing/process.py", line 315, in _bootstrap
    self.run()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/mpGraphExecutor.py", line 129, in _executeJob
    quantumExecutor.execute(taskDef, quantum, butler)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 184, in execute
    _LOG.info("Execution of task '%s' on quantum %s took %.3f seconds",
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/contextlib.py", line 131, in __exit__
    self.gen.throw(type, value, traceback)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 252, in captureLogging
    self.writeLogRecords(quantum, taskDef, butler, ctx.store)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 515, in writeLogRecords
    butler.ingest(dataset, transfer="move")
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/core/utils.py", line 269, in inner
    return func(self, *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/_butler.py", line 1619, in ingest
    refs = self.registry.insertDatasets(
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/core/utils.py", line 269, in inner
    return func(self, *args, **kwargs)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registries/sql.py", line 456, in insertDatasets
    refs = list(storage.insert(runRecord, expandedDataIds, idGenerationMode))
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/datasets/byDimensions/_storage.py", line 593, in insert
    yield from self._insert(run, dataIdList, rows, self._db.insert)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/datasets/byDimensions/_storage.py", line 631, in _insert
    insertMethod(self._static.dataset, *rows)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/databases/sqlite.py", line 422, in insert
    return super().insert(table, *rows, select=select, names=names, returnIds=returnIds)
  File "/home/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/registry/interfaces/_database.py", line 1381, in insert
    connection.execute(table.insert(), rows)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1263, in execute
    return meth(self, multiparams, params, _EMPTY_EXECUTION_OPTS)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 323, in _execute_on_connection
    return connection._execute_clauseelement(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1452, in _execute_clauseelement
    ret = self._execute_context(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1814, in _execute_context
    self._handle_dbapi_exception(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1995, in _handle_dbapi_exception
    util.raise_(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/util/compat.py", line 207, in raise_
    raise exception
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/base.py", line 1771, in _execute_context
    self.dialect.do_execute(
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.7.0/lib/python3.8/site-packages/sqlalchemy/engine/default.py", line 717, in do_execute
    cursor.execute(statement, parameters)
sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) disk I/O error
[SQL: INSERT INTO dataset (id, dataset_type_id, run_id) VALUES (?, ?, ?)]
[parameters: ('46601d87249a4122b106c01bf8aee9d6', 66, 50)]
(Background on this error at: https://sqlalche.me/e/14/e3q8)
ctrl.mpexec.mpGraphExecutor ERROR: Task <TaskDef(MakeWarpTask, label=makeWarp) dataId={instrument: 'DECam', skymap: 'discrete/hits', tract: 1, patch: 57, visit: 411784, ...}> failed; processing will continue for remaining tasks.

It seems like an SQL error, but I’m not sure why it happened.

Thanks again!

That could be due to a disk filling up, or to file locking issues we have sometimes seen on NFSv3 filesystems. The latter should not happen if the filesystem is mounted with NFSv4 or if you place your repo on a non-NFS filesystem, but unfortunately neither of those may be a readily available option.

If you’re running tasks in parallel, reducing the level of parallelism may help.
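
If you are invoking pipetask directly, the -j (--processes) option controls how many quanta run in parallel; here is a sketch with -j 1 for fully serial execution, where the repo, collection, and pipeline arguments are placeholders for your own values:

pipetask run -j 1 -b /path/to/repo -i YOUR_INPUT_COLLECTIONS --output-run YOUR_OUTPUT_RUN -p YOUR_PIPELINE.yaml --register-dataset-types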

I have a question regarding the goodSeeingCoadd directory:

What do the numbers in the subfolders mean? And is the seeing of the images calculated when running this ApTemplate command?

The un-helpful answer is: don’t poke around in the butler repository directories, just retrieve things with the Butler :grinning_face_with_smiling_eyes: The more-helpful answer is: those top-level numbers are tracts, and the numbers one level inside are patches.

You can figure out the directory structure by looking in obs_base/policy files, and obs_decam/policy files for any overrides, e.g., obs_base/exposures.yaml at main · lsst/obs_base · GitHub. (goodSeeing is substituted for deep in this case)
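
For example, here is a minimal sketch of retrieving one template patch with the Butler instead of browsing those directories. The repo path and collection are placeholders; tract 1, patch 57, and the discrete/hits skymap are taken from the log earlier in this thread, and band "g" is just an assumption:

from lsst.daf.butler import Butler

# placeholder repo path and collection; point these at your own template outputs
butler = Butler("/path/to/repo", collections="your/template/collection")

# tract, patch, and skymap here match the log above; band "g" is an assumption
coadd = butler.get("goodSeeingCoadd",
                   tract=1, patch=57, band="g", skymap="discrete/hits")
print(coadd.getBBox())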

Yes, seeing (essentially PSF size) is used by the ApTemplate.yaml pipeline to select best-seeing exposures for coaddition (the task labeled selectGoodSeeingVisits). To retrieve it more generally, I typically do something like this:

import numpy as np

sigma2fwhm = 2.*np.sqrt(2.*np.log(2.))
# prerequisite: load a source catalog I'm calling `sourceTable` as a pandas dataframe
# compute the PSF trace radius from the PSF second moments
traceRadius = np.sqrt(0.5) * np.sqrt(sourceTable.ixxPSF + sourceTable.iyyPSF)
# add a seeing (FWHM) column to the dataframe
sourceTable['seeing'] = sigma2fwhm * traceRadius

You can also get the PSF size directly from an image, e.g.,

# `exposure` is e.g. a Butler-loaded calexp; sigma2fwhm as defined above
psf = exposure.getPsf()
seeing = psf.computeShape(psf.getAveragePosition()).getDeterminantRadius() * sigma2fwhm

Thanks a lot, Meredith, for both methods of calculating the seeing and for your answer regarding the directories :). I have a question about sourceTable: I understand it is a pandas dataframe of a source catalog, but of what? Is it from an exposure? And could there be a difference in the PSF calculated from the two methods?

In my overly-specific use case, sourceTable is a pandas dataframe loaded from an Alert Production database. But it could just as easily be a Butler-loaded src or similar table containing detected objects per exposure, per patch, or per tract; many different catalog data products have the relevant ixxPSF and iyyPSF columns. The image-based way to estimate PSF size/seeing won’t necessarily equal any individual object’s PSF size numerically, but in most cases it should be a decent estimate for the overall image.
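
If it helps, here is a minimal sketch of the Butler-loaded variant using a per-detector src catalog. The repo path, collection, and detector number are placeholders (visit 411784 is the one from the log above), and note that the src schema names the PSF moment columns base_SdssShape_psf_xx / base_SdssShape_psf_yy rather than ixxPSF / iyyPSF:

import numpy as np
from lsst.daf.butler import Butler

# placeholder repo path and collection; point these at your processing outputs
butler = Butler("/path/to/repo", collections="your/processing/collection")

# visit 411784 appears earlier in this thread; detector 1 is just an example
src = butler.get("src", instrument="DECam", visit=411784, detector=1)
sourceTable = src.asAstropy().to_pandas()

# same trace-radius seeing estimate as above, with the src-schema column names
sigma2fwhm = 2.*np.sqrt(2.*np.log(2.))
traceRadius = np.sqrt(0.5) * np.sqrt(sourceTable["base_SdssShape_psf_xx"]
                                     + sourceTable["base_SdssShape_psf_yy"])
sourceTable["seeing"] = sigma2fwhm * traceRadius  # FWHM in pixels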
