Thanks a lot for your response Meredith! I was wondering if there are any command-line options for pipetask run
that could “continue” a run that was stopped due to computer-related issues. I’ve looked at the pipetask run options, and it’s not clear to me which one could do that, or if any of them can. Many times a pipetask run stops (due to computer issues), and I find it a little inefficient to erase the crosstalk-sources folder and start again.
I’ve tried adding --skip-existing --extend-run
but it looks like it’s not intended for that purpose
Thanks!
I think --skip-existing --extend-run
should do what you describe. What is happening instead?
One idea: try not specifying input collections the second time you call pipetask run
; once input collections are specified, they can’t be changed for subsequent reruns to the same output collection.
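To make that concrete, here is a sketch of what the second invocation might look like (REPO, PIPELINE.yaml, YOUR_OUTPUT_COLLECTION, and YOUR_DATA_QUERY are placeholders for whatever you used the first time):
pipetask run -b REPO -p PIPELINE.yaml -o YOUR_OUTPUT_COLLECTION -d "YOUR_DATA_QUERY" --extend-run --skip-existing
i.e. keep the same output collection, add the two flags, and drop the -i input collections on the rerun.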
When I added --skip-existing --extend-run
to the command line, I got:
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
return f(get_current_context(), *args, **kwargs)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 102, in run
qgraph = script.qgraph(pipelineObj=pipeline, **kwargs)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/script/qgraph.py", line 145, in qgraph
qgraph = f.makeGraph(pipelineObj, args)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 550, in makeGraph
qgraph = graphBuilder.makeGraph(pipeline, collections, run, args.data_query, metadata=metadata)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 904, in makeGraph
scaffolding.resolveDatasetRefs(self.registry, collections, run, commonDataIds,
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 718, in resolveDatasetRefs
raise OutputExistsError(
lsst.pipe.base.graphBuilder.OutputExistsError: Quantum {instrument: 'DECam', detector: 56, exposure: 411752, ...} of task with label 'overscan' has some outputs that exist ([DatasetRef(DatasetType('overscanRaw', {band, instrument, detector, physical_filter, exposure}, Exposure), {instrument: 'DECam', detector: 56, exposure: 411752, ...}, id=b2d0c20c-fb4c-4e57-b1bb-cd360763cc01, run='DECam/raw/crosstalk-sources/20210811T065927Z')]) and others that don't ([DatasetRef(DatasetType('overscan_metadata', {band, instrument, detector, physical_filter, exposure}, PropertySet), {instrument: 'DECam', detector: 56, exposure: 411752, ...})]), with no metadata output, and clobbering outputs was not enabled.
I will try not specifying the input collections next time, and will give you updates once I try it out. For now, I was able to do the pipetask run
and was wondering about building the nightly bias:
In my case I ran:
butler query-dimension-records /mnt/flock/jahumada/data_hits exposure --where "instrument='DECam' AND exposure.observation_type='bias' AND exposure.exposure_time=0.0 AND exposure.day_obs=20150313"
What could it mean that this command yields no results? Do you know of a command-line option that could show me the available bias exposures without building anything yet? Something like --show --where "instrument='DECam' AND exposure.observation_type='bias' AND exposure.exposure_time=0.0"
Thanks for your time
and clobbering outputs was not enabled.
The error message tells you how to fix it: looking at the help, you will find --clobber-outputs.
That is the right query fragment for finding bias observations. You shouldn’t also need to specify the exposure time.
Do you know that you ingested some raw bias exposures? Maybe the biases were from different days?
Hi,
I will consider that next time as well
I ingested the calib images using butler ingest-raws
. There were no problems, and it said that it ingested successfully. The output of butler ingest-raws
for the calibration and raw images was:
Name Type
------------------------------------ -----------
DECam/calib CALIBRATION
DECam/calib/unbounded RUN
DECam/calib/curated/19700101T000000Z RUN
DECam/calib/curated/20130115T013000Z RUN
DECam/calib/curated/20130916T092600Z RUN
DECam/calib/curated/20140117T012900Z RUN
DECam/calib/curated/20141020T103000Z RUN
DECam/calib/curated/20150105T011500Z RUN
DECam/calib/curated/20131130T000000Z RUN
What’s a little confusing is the dates of the DECam/calib/curated/ collections, because all the images I’m working with are from between 2014 and 2015. Could this be why I get no results with butler query-dimension-records
?
I also tried the following:
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ butler query-dimension-records /mnt/flock/jahumada/data_hits exposure --where "instrument='DECam' AND exposure.observation_type='bias' AND exposure.day_obs>20150313"
No results. Try --help for more information.
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ butler query-dimension-records /mnt/flock/jahumada/data_hits exposure --where "instrument='DECam' AND exposure.observation_type='bias' AND exposure.day_obs<20150313"
No results. Try --help for more information.
It seems that the butler doesn’t recognize any bias-type images.
PS: I am also using the weekly LSST Science Pipelines version 27 (w.2021.27), which I installed before the big version 22 release; I am not sure whether having this version could produce the kinds of errors I am having.
Thanks!
Yes. These are the curated calibrations (such as defect masks) that are small and are added to a butler repository with the butler write-curated-calibrations
command. These will not include bias or dark images.
The raws you ingest will end up in collection DECam/raw/all
– what raw files did you ask to ingest with butler ingest-raws
? If you only ingested data from one of our general testdata repositories those will likely all be science frames. You can see what raw data you have by not constraining that butler query-dimension-records
command by observation type. (or by using butler query-datasets
with the raw
dataset type).
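For example, something like this should list what you actually have (using your repo path; check butler query-datasets --help for the exact argument order):
butler query-dimension-records /mnt/flock/jahumada/data_hits exposure --where "instrument='DECam' AND exposure.day_obs=20150313"
butler query-datasets /mnt/flock/jahumada/data_hits raw --collections DECam/raw/all
The first lists every exposure record for that night regardless of observation type; the second lists the ingested raw datasets directly.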
OK, I found the problem. Using zero
instead of bias
for the observation_type
, I got results:
(lsst-scipipe-0.6.0) jahumada@leftraru1:/mnt/flock/jahumada$ butler query-dimension-records /mnt/flock/jahumada/data_hits exposure --where "instrument='DECam' AND exposure.observation_type='zero' AND exposure.day_obs=20150313"
instrument id physical_filter obs_id exposure_time dark_time observation_type observation_reason day_obs seq_num group_name group_id target_name science_program tracking_ra tracking_dec sky_angle zenith_angle timespan [2]
DECam 421350 solid plate 0.0 0.0 ct4m20150313t193722 0.0 0.0139599 zero unknown 20150313 421350 421350 421350 preflats-BIAS 2015A-0608 34.380787207985456 20.708306035300883 90.0 None 2015-03-13 19:37:57.656041 .. 2015-03-13 19:38:48.000000
DECam 421351 solid plate 0.0 0.0 ct4m20150313t193745 0.0 0.0122199 zero unknown 20150313 421351 421351 421351 preflats-BIAS 2015A-0608 34.47681637530135 20.708389100893545 90.0 None 2015-03-13 19:38:20.610996 .. 2015-03-13 19:39:07.000000
DECam 421352 solid plate 0.0 0.0 ct4m20150313t193808 0.0 0.0137498 zero unknown 20150313 421352 421352 421352 preflats-BIAS 2015A-0608 34.57286220928707 20.708499944262773 90.0 None 2015-03-13 19:38:43.844598 .. 2015-03-13 19:39:31.000000
DECam 421353 solid plate 0.0 0.0 ct4m20150313t193832 0.0 0.01334 zero unknown 20150313 421353 421353 421353 preflats-BIAS 2015A-0608 34.668908043285676 20.708556065406825 90.0 None 2015-03-13 19:39:07.340203 .. 2015-03-13 19:39:52.000000
DECam 421354 solid plate 0.0 0.0 ct4m20150313t193855 0.0 0.0148501 zero unknown 20150313 421354 421354 421354 preflats-BIAS 2015A-0608 34.76912054398033 20.708667186982925 90.0 None 2015-03-13 19:39:30.318561 .. 2015-03-13 19:40:16.000000
DECam 421355 solid plate 0.0 0.0 ct4m20150313t193918 0.0 0.0144 zero unknown 20150313 421355 421355 421355 preflats-BIAS 2015A-0608 34.86514971132854 20.708694974785754 90.0 None 2015-03-13 19:39:53.326389 .. 2015-03-13 19:40:48.000000
DECam 421356 solid plate 0.0 0.0 ct4m20150313t193941 0.0 0.03789 zero unknown 20150313 421356 421356 421356 preflats-BIAS 2015A-0608 34.961195545336615 20.70883415147616 90.0 None 2015-03-13 19:40:16.304359 .. 2015-03-13 19:41:02.000000
DECam 421357 solid plate 0.0 0.0 ct4m20150313t194004 0.0 0.0140901 zero unknown 20150313 421357 421357 421357 preflats-BIAS 2015A-0608 35.05724137935772 20.70891721705238 90.0 None 2015-03-13 19:40:39.543707 .. 2015-03-13 19:41:27.000000
DECam 421358 solid plate 0.0 0.0 ct4m20150313t194027 0.0 0.0134001 zero unknown 20150313 421358 421358 421358 preflats-BIAS 2015A-0608 35.15328721338843 20.70897306040326 90.0 None 2015-03-13 19:41:02.570380 .. 2015-03-13 19:41:48.000000
DECam 421359 solid plate 0.0 0.0 ct4m20150313t194050 0.0 0.0127599 zero unknown 20150313 421359 421359 421359 preflats-BIAS 2015A-0608 35.24931638075213 20.70908418152705 90.0 None 2015-03-13 19:41:25.520977 .. 2015-03-13 19:42:11.000000
DECam 421360 solid plate 0.0 0.0 ct4m20150313t194113 0.0 0.0136201 zero unknown 20150313 421360 421360 421360 preflats-BIAS 2015A-0608 35.34540804812597 20.70916724709863 90.0 None 2015-03-13 19:41:48.609948 .. 2015-03-13 19:42:34.000000
DECam 421448 solid plate 0.0 0.0 ct4m20150313t204838 0.0 0.01372 zero unknown 20150313 421448 421448 421448 postflats-BIAS 2015A-0608 52.23791243511519 20.72661282207203 90.0 None 2015-03-13 20:49:13.651863 .. 2015-03-13 20:49:59.000000
DECam 421449 solid plate 0.0 0.0 ct4m20150313t204901 0.0 0.01579 zero unknown 20150313 421449 421449 421449 postflats-BIAS 2015A-0608 52.33394160356247 20.72672477553536 90.0 None 2015-03-13 20:49:36.530961 .. 2015-03-13 20:50:22.000000
DECam 421450 solid plate 0.0 0.0 ct4m20150313t204924 0.0 0.01406 zero unknown 20150313 421450 421450 421450 postflats-BIAS 2015A-0608 52.43003327201389 20.726862840107717 90.0 None 2015-03-13 20:49:59.556428 .. 2015-03-13 20:50:54.000000
DECam 421451 solid plate 0.0 0.0 ct4m20150313t204947 0.0 0.04866 zero unknown 20150313 421451 421451 421451 postflats-BIAS 2015A-0608 52.5260749404759 20.726946738000716 90.0 None 2015-03-13 20:50:22.697510 .. 2015-03-13 20:51:08.000000
DECam 421452 solid plate 0.0 0.0 ct4m20150313t205010 0.0 0.0144701 zero unknown 20150313 421452 421452 421452 postflats-BIAS 2015A-0608 52.622120775604465 20.727085913663895 90.0 None 2015-03-13 20:50:45.748267 .. 2015-03-13 20:51:31.000000
DECam 421453 solid plate 0.0 0.0 ct4m20150313t205033 0.0 0.0120902 zero unknown 20150313 421453 421453 421453 postflats-BIAS 2015A-0608 52.718195777409 20.72719675598844 90.0 None 2015-03-13 20:51:08.777096 .. 2015-03-13 20:51:54.000000
DECam 421454 solid plate 0.0 0.0 ct4m20150313t205056 0.0 0.0141101 zero unknown 20150313 421454 421454 421454 postflats-BIAS 2015A-0608 52.81428744588622 20.727307876084247 90.0 None 2015-03-13 20:51:31.865202 .. 2015-03-13 20:52:17.000000
DECam 421455 solid plate 0.0 0.0 ct4m20150313t205120 0.0 0.01296 zero unknown 20150313 421455 421455 421455 postflats-BIAS 2015A-0608 52.91031661436798 20.72741899616613 90.0 None 2015-03-13 20:51:55.012157 .. 2015-03-13 20:52:40.000000
DECam 421456 solid plate 0.0 0.0 ct4m20150313t205143 0.0 0.0128601 zero unknown 20150313 421456 421456 421456 postflats-BIAS 2015A-0608 53.00636244952244 20.72752983846369 90.0 None 2015-03-13 20:52:18.092811 .. 2015-03-13 20:53:03.000000
DECam 421457 solid plate 0.0 0.0 ct4m20150313t205206 0.0 0.01439 zero unknown 20150313 421457 421457 421457 postflats-BIAS 2015A-0608 53.102449951347076 20.727669014090335 90.0 None 2015-03-13 20:52:41.109736 .. 2015-03-13 20:53:27.000000
DECam 421458 solid plate 0.0 0.0 ct4m20150313t205229 0.0 0.0143199 zero unknown 20150313 421458 421458 421458 postflats-BIAS 2015A-0608 53.19849578651594 20.727751800816037 90.0 None 2015-03-13 20:53:04.075676 .. 2015-03-13 20:53:50.000000
DECam 421459 solid plate 0.0 0.0 ct4m20150313t234600 0.0 0.0106499 zero unknown 20150313 421459 421459 421459 2015A-0608 91.52315333295033 -29.75860440698451 90.0 None 2015-03-13 23:46:35.384677 .. 2015-03-13 23:47:21.000000
DECam 421460 solid plate 0.0 0.0 ct4m20150313t234623 0.0 0.01213 zero unknown 20150313 421460 421460 421460 2015A-0608 91.52315333295188 -29.758632462540064 90.0 None 2015-03-13 23:46:58.273867 .. 2015-03-13 23:47:44.000000
Thanks for the tips!
Hi all!
I have run the command to build a master bias for the observation night of 2014-02-28:
pipetask run -b data_hits/ -p $CP_PIPE_DIR/pipelines/cpBias.yaml -i DECam/raw/all,DECam/calib -o hits_master_calib/on20140228_bias -d "instrument='DECam' AND exposure.observation_type='zero' AND exposure.day_obs=20140228" -j 3 --register-dataset-types
It seemingly ran successfully. Here are some of the last output lines I obtained:
ctrl.mpexec.mpGraphExecutor INFO: Executed 1485 quanta successfully, 0 failed and 3 remain out of total 1488 quanta.
isr WARN: No rough magnitude zero point defined for filter solid plate 0.0 0.0.
isr WARN: Non-positive exposure time; skipping rough zero point.
ctrl.mpexec.singleQuantumExecutor INFO: Execution of task 'isr' on quantum {instrument: 'DECam', detector: 27, exposure: 288902, ...} took 5.818 seconds
ctrl.mpexec.mpGraphExecutor INFO: Executed 1486 quanta successfully, 0 failed and 2 remain out of total 1488 quanta.
ctrl.mpexec.singleQuantumExecutor INFO: Execution of task 'isr' on quantum {instrument: 'DECam', detector: 27, exposure: 288832, ...} took 5.808 seconds
ctrl.mpexec.mpGraphExecutor INFO: Executed 1487 quanta successfully, 0 failed and 1 remain out of total 1488 quanta.
cpBiasCombine INFO: Scaling input 0 by 1.0
cpBiasCombine INFO: Scaling input 1 by 1.0
cpBiasCombine INFO: Scaling input 2 by 1.0
cpBiasCombine INFO: Scaling input 3 by 1.0
cpBiasCombine INFO: Scaling input 4 by 1.0
cpBiasCombine INFO: Scaling input 5 by 1.0
cpBiasCombine INFO: Scaling input 6 by 1.0
cpBiasCombine INFO: Scaling input 7 by 1.0
cpBiasCombine INFO: Scaling input 8 by 1.0
cpBiasCombine INFO: Scaling input 9 by 1.0
cpBiasCombine INFO: Scaling input 10 by 1.0
cpBiasCombine INFO: Scaling input 11 by 1.0
cpBiasCombine INFO: Scaling input 12 by 1.0
cpBiasCombine INFO: Scaling input 13 by 1.0
cpBiasCombine INFO: Scaling input 14 by 1.0
cpBiasCombine INFO: Scaling input 15 by 1.0
cpBiasCombine INFO: Scaling input 16 by 1.0
cpBiasCombine INFO: Scaling input 17 by 1.0
cpBiasCombine INFO: Scaling input 18 by 1.0
cpBiasCombine INFO: Scaling input 19 by 1.0
cpBiasCombine INFO: Scaling input 20 by 1.0
cpBiasCombine INFO: Scaling input 21 by 1.0
cpBiasCombine INFO: Scaling input 22 by 1.0
cpBiasCombine WARN: Found 8025961 NAN pixels
cpBiasCombine WARN: Exception making an obs group for headers. Continuing.
ctrl.mpexec.singleQuantumExecutor INFO: Execution of task 'cpBiasCombine' on quantum {instrument: 'DECam', detector: 27} took 30.350 seconds
ctrl.mpexec.mpGraphExecutor INFO: Executed 1488 quanta successfully, 0 failed and 0 remain out of total 1488 quanta.
But the thing is that I don’t have an output image in the hits_master_calib/on20140228_bias
folder, so I am a little confused as to what the outcome should be. Maybe I should add a configuration option -c
to obtain the master bias? If so, I’m not sure how it should be configured.
Thanks a lot for your time again!
This is not a “folder”, it’s a Butler “collection”. In particular, with this command line, your output will go into a “RUN” collection that is named with a timestamp that is then chained into the output collection that you specified.
You can use butler
command line tools to look at your collections and the datasets within them. You can also specify your output collection as an input collection for other pipetask run
commands. Finally, if you want the actual file containing the master bias output, the safest mechanism is to use butler retrieve-artifacts
, although you can also look in a folder/directory named after the time-stamped collection under your Butler datastore root.
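As a sketch, using the collection name from your command (the exact dataset type name of the combined bias can be confirmed with butler query-dataset-types, and see butler retrieve-artifacts --help for its options):
butler query-collections data_hits "hits_master_calib/on20140228_bias*"
butler query-datasets data_hits --collections hits_master_calib/on20140228_bias
butler retrieve-artifacts data_hits ./retrieved_calibs --collections hits_master_calib/on20140228_bias
The first shows the CHAINED output collection and the timestamped RUN collection inside it, the second lists the datasets written there (including the combined bias), and the third copies the corresponding files into ./retrieved_calibs.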
Thanks for the pointers! I was able to find where the master biases are being generated, as well as the flats.
Now that I have the master bias and the flats for a night of observation, I would like to calibrate the science images. Roughly following the Getting Started with the HSC camera tutorial, I should run a processCcd task on my DECam images.
One of the inputs of the processCcd task is a reference catalog, which I don’t have. I should ingest a Pan-STARRS catalog for it to work, right?
I’ve been trying to run the following:
pipetask run -b data_hits/ -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ProcessCcd.yaml -i DECam/raw/all,hits_master_calib/on20140228_bias,hits_master_calib/on20140228_flats --instrument lsst.obs.decam.DarkEnergyCamera --output-run processCcdOutputs/pon20140228
The night that I want to process is 2014-02-28, and the collections hits_master_calib/on20140228_bias,hits_master_calib/on20140228_flats
correspond to where the master bias and “master” flats are located (these collections appear in butler query-collections).
Running this command I get an error that could also be due to the lack of a reference catalog.
The error is this:
numexpr.utils INFO: Note: NumExpr detected 20 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
numexpr.utils INFO: NumExpr defaulting to 8 threads.
Traceback (most recent call last):
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/bin/pipetask", line 29, in <module>
sys.exit(main())
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/pipetask.py", line 43, in main
return cli()
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
return f(get_current_context(), *args, **kwargs)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 102, in run
qgraph = script.qgraph(pipelineObj=pipeline, **kwargs)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/script/qgraph.py", line 145, in qgraph
qgraph = f.makeGraph(pipelineObj, args)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 550, in makeGraph
qgraph = graphBuilder.makeGraph(pipeline, collections, run, args.data_query, metadata=metadata)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 892, in makeGraph
scaffolding = _PipelineScaffolding(pipeline, registry=self.registry)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 411, in __init__
datasetTypes = PipelineDatasetTypes.fromPipeline(pipeline, registry=registry)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/pipeline.py", line 910, in fromPipeline
for taskDef in pipeline:
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/pipeline.py", line 576, in toExpandedPipeline
overrides.applyTo(config)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/configOverrides.py", line 296, in applyTo
instrument.applyConfigOverrides(name, config)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/obs_base/21.0.0-60-g2baedfa+3512c75b65/python/lsst/obs/base/_instrument.py", line 346, in applyConfigOverrides
config.load(path)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_config/21.0.0-1-ga51b5d4+9915f4c1f0/python/lsst/pex/config/config.py", line 1036, in load
self.loadFromStream(stream=code, root=root, filename=filename)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_config/21.0.0-1-ga51b5d4+9915f4c1f0/python/lsst/pex/config/config.py", line 1075, in loadFromStream
exec(stream, globals, local)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/obs_decam/21.0.0-20-g1e553c2+458274d3ec/config/calibrate.py", line 30, in <module>
config.measurement.load(os.path.join(obsConfigDir, "hsm.py"))
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_config/21.0.0-1-ga51b5d4+9915f4c1f0/python/lsst/pex/config/config.py", line 1036, in load
self.loadFromStream(stream=code, root=root, filename=filename)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_config/21.0.0-1-ga51b5d4+9915f4c1f0/python/lsst/pex/config/config.py", line 1075, in loadFromStream
exec(stream, globals, local)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/obs_decam/21.0.0-20-g1e553c2+458274d3ec/config/hsm.py", line 8, in <module>
config.load(os.path.join(getPackageDir("meas_extensions_shapeHSM"), "config", "enable.py"))
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_config/21.0.0-1-ga51b5d4+9915f4c1f0/python/lsst/pex/config/config.py", line 1036, in load
self.loadFromStream(stream=code, root=root, filename=filename)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_config/21.0.0-1-ga51b5d4+9915f4c1f0/python/lsst/pex/config/config.py", line 1075, in loadFromStream
exec(stream, globals, local)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/config/enable.py", line 7, in <module>
import lsst.meas.extensions.shapeHSM
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/python/lsst/meas/extensions/shapeHSM/__init__.py", line 27, in <module>
from .hsmMomentsControl import *
ImportError: libgalsim.so.2.2: cannot open shared object file: No such file or directory
Thanks a lot for your time!
That’s a strange error that indicates problems with your conda environment, not something to do with anything astronomical. Can you please post the results of these commands?
$ conda list galsim
$ conda list rubin-env
$ ls -l $CONDA_PREFIX/lib/galsim*
$ ldd $EUPS_PATH/Linux64/meas_extensions_shapeHSM/*/lib/libmeas_extensions_shapeHSM.so
OK, here is what I got:
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ conda list galsim
# packages in environment at /home/jahumada/.conda/envs/lsst-scipipe-0.6.0:
#
# Name                  Version          Build             Channel
galsim                  2.3.0            py38h260733e_0    conda-forge
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ conda list rubin-env
# packages in environment at /home/jahumada/.conda/envs/lsst-scipipe-0.6.0:
#
# Name                  Version          Build             Channel
rubin-env               0.6.0            ha770c72_1        conda-forge
rubin-env-nosysroot     0.6.0            ha770c72_1        conda-forge
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ ls -l $CONDA_PREFIX/lib/galsim*
ls: cannot access /home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/galsim*: No such file or directory
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ ldd $EUPS_PATH/Linux64/meas_extensions_shapeHSM/*/lib/libmeas_extensions_shapeHSM.so
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_base/21.0.0-14-g3bd782b+16b33694c2/lib/libmeas_base.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_base/21.0.0-14-g3bd782b+16b33694c2/lib/libmeas_base.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_base/21.0.0-14-g3bd782b+16b33694c2/lib/libmeas_base.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_base/21.0.0-14-g3bd782b+16b33694c2/lib/libmeas_base.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.26' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/geom/21.0.0-5-g8c1d971+7e5b4c34a6/lib/libgeom.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/geom/21.0.0-5-g8c1d971+7e5b4c34a6/lib/libgeom.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_base/21.0.0-6-g1930a60+b18c5f6b76/lib/libdaf_base.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by 
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_base/21.0.0-6-g1930a60+b18c5f6b76/lib/libdaf_base.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_base/21.0.0-6-g1930a60+b18c5f6b76/lib/libdaf_base.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_exceptions/21.0.0-2-gde069b7+4f46bdaea8/lib/libpex_exceptions.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_exceptions/21.0.0-2-gde069b7+4f46bdaea8/lib/libpex_exceptions.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/log/21.0.0-5-gcc89fd6+b18c5f6b76/lib/liblog.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/log/21.0.0-5-gcc89fd6+b18c5f6b76/lib/liblog.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/log/21.0.0-5-gcc89fd6+b18c5f6b76/lib/liblog.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.8' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.26' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so) 
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/utils/21.0.0-7-g0503b2e+18535a8d22/lib/libutils.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/utils/21.0.0-7-g0503b2e+18535a8d22/lib/libutils.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/base/21.0.0-7-gdf92d54+64c1bc5aa5/lib/libbase.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/base/21.0.0-7-gdf92d54+64c1bc5aa5/lib/libbase.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/sphgeom/21.0.0-1-g8760c09+64c1bc5aa5/lib/libsphgeom.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `CXXABI_1.3.9' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/sphgeom/21.0.0-1-g8760c09+64c1bc5aa5/lib/libsphgeom.so) /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.21' not found (required by 
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/sphgeom/21.0.0-1-g8760c09+64c1bc5aa5/lib/libsphgeom.so) linux-vdso.so.1 => (0x00002b0c6607c000) libgalsim.so.2.2 => not found libmeas_base.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_base/21.0.0-14-g3bd782b+16b33694c2/lib/libmeas_base.so (0x00002b0c660be000) libafw.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so (0x00002b0c6627e000) libgeom.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/geom/21.0.0-5-g8c1d971+7e5b4c34a6/lib/libgeom.so (0x00002b0c66185000) libdaf_base.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_base/21.0.0-6-g1930a60+b18c5f6b76/lib/libdaf_base.so (0x00002b0c661cb000) libpex_exceptions.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_exceptions/21.0.0-2-gde069b7+4f46bdaea8/lib/libpex_exceptions.so (0x00002b0c66265000) libstdc++.so.6 => /lib64/libstdc++.so.6 (0x00002b0c66d64000) libm.so.6 => /lib64/libm.so.6 (0x00002b0c6706c000) libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x00002b0c6736e000) libc.so.6 => /lib64/libc.so.6 (0x00002b0c67584000) liblog.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/log/21.0.0-5-gcc89fd6+b18c5f6b76/lib/liblog.so (0x00002b0c6626e000) liblog4cxx.so.11 => not found libfftw3.so.3 => not found libastshim.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so (0x00002b0c67952000) libast.so.9 => not found libcfitsio.so.8 => not found libgsl.so.25 => not found libopenblas.so.0 => not found libMinuit2.so => not found liblog4cxx.so.11 => not found libutils.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/utils/21.0.0-7-g0503b2e+18535a8d22/lib/libutils.so (0x00002b0c679ae000) libbase.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/base/21.0.0-7-gdf92d54+64c1bc5aa5/lib/libbase.so (0x00002b0c679ed000) libpthread.so.0 => /lib64/libpthread.so.0 (0x00002b0c679f9000) libsphgeom.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/sphgeom/21.0.0-1-g8760c09+64c1bc5aa5/lib/libsphgeom.so (0x00002b0c67c16000) /lib64/ld-linux-x86-64.so.2 (0x00002b0c6605a000) liblog4cxx.so.11 => not found libast.so.9 => not found libast_grf_2.0.so.9 => not found libast_grf_3.2.so.9 => not found libast_grf_5.6.so.9 => not found libast_grf3d.so.9 => not found libast_err.so.9 => not found libdl.so.2 => /lib64/libdl.so.2 (0x00002b0c67c55000)
So, there have been times when I had to disable the Intel MPI/MKL library to run pipetask run
properly. If I enable the Intel MPI/MKL library and load Python 3.8.2 as well, I get a different result for ldd $EUPS_PATH/Linux64/meas_extensions_shapeHSM/*/lib/libmeas_extensions_shapeHSM.so
, that is:
(lsst-scipipe-0.6.0) [jahumada@leftraru3 jahumada]$ ldd $EUPS_PATH/Linux64/meas_extensions_shapeHSM/*/lib/libmeas_extensions_shapeHSM.so
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /home/lmod/software/Core/GCCcore/8.2.0/lib64/libstdc++.so.6: version `GLIBCXX_3.4.26' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so)
/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_extensions_shapeHSM/21.0.0-5-g810ae3f+b39e0bca51/lib/libmeas_extensions_shapeHSM.so: /home/lmod/software/Core/GCCcore/8.2.0/lib64/libstdc++.so.6: version `GLIBCXX_3.4.26' not found (required by /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so)
linux-vdso.so.1 => (0x00002b45157bc000)
libgalsim.so.2.2 => not found
libmeas_base.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/meas_base/21.0.0-14-g3bd782b+16b33694c2/lib/libmeas_base.so (0x00002b4515800000)
libafw.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/lib/libafw.so (0x00002b45159be000)
libgeom.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/geom/21.0.0-5-g8c1d971+7e5b4c34a6/lib/libgeom.so (0x00002b45158c7000)
libdaf_base.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_base/21.0.0-6-g1930a60+b18c5f6b76/lib/libdaf_base.so (0x00002b451590d000)
libpex_exceptions.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pex_exceptions/21.0.0-2-gde069b7+4f46bdaea8/lib/libpex_exceptions.so (0x00002b45159a7000)
libstdc++.so.6 => /home/lmod/software/Core/GCCcore/8.2.0/lib64/libstdc++.so.6 (0x00002b45164a4000)
libm.so.6 => /lib64/libm.so.6 (0x00002b451663e000)
libgcc_s.so.1 => /home/lmod/software/Core/GCCcore/8.2.0/lib64/libgcc_s.so.1 (0x00002b4516940000)
libc.so.6 => /lib64/libc.so.6 (0x00002b4516959000)
liblog.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/log/21.0.0-5-gcc89fd6+b18c5f6b76/lib/liblog.so (0x00002b45159b0000)
liblog4cxx.so.11 => not found
libfftw3.so.3 => not found
libastshim.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/astshim/21.0.0-2-g45278ab+64c1bc5aa5/lib/libastshim.so (0x00002b4516d28000)
libast.so.9 => not found
libcfitsio.so.8 => not found
libgsl.so.25 => not found
libopenblas.so.0 => not found
libMinuit2.so => not found
liblog4cxx.so.11 => not found
libutils.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/utils/21.0.0-7-g0503b2e+18535a8d22/lib/libutils.so (0x00002b4516d85000)
libbase.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/base/21.0.0-7-gdf92d54+64c1bc5aa5/lib/libbase.so (0x00002b4516dc4000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00002b4516dd0000)
libsphgeom.so => /mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/sphgeom/21.0.0-1-g8760c09+64c1bc5aa5/lib/libsphgeom.so (0x00002b4516fed000)
/lib64/ld-linux-x86-64.so.2 (0x00002b451579a000)
liblog4cxx.so.11 => not found
libast.so.9 => not found
libast_grf_2.0.so.9 => not found
libast_grf_3.2.so.9 => not found
libast_grf_5.6.so.9 => not found
libast_grf3d.so.9 => not found
libast_err.so.9 => not found
libdl.so.2 => /lib64/libdl.so.2 (0x00002b451702c000)
Could these “not found” libraries be because I am using a weekly build (w.2021.27) based on version 21 of the pipelines?
OK, you’ll need to reinstall rubin-env
in your lsst-scipipe-0.6.0
environment. The weekly you are using was not certified or built to use galsim 2.3. We pinned the version in the environment to handle this, but it doesn’t automatically update previously-installed environments. I believe the minimum you can do to get going again is
$ conda install galsim=2.2
You could try instead
$ conda install rubin-env=0.6.0=ha770c72_4
which will be safer, as it also constrains some other packages.
I don’t think you can do
$ conda upgrade rubin-env=0.6.0
as that may not downgrade existing packages.
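Whichever you run, it would be worth re-checking the same diagnostics afterwards, e.g.
conda list galsim
ldd $EUPS_PATH/Linux64/meas_extensions_shapeHSM/*/lib/libmeas_extensions_shapeHSM.so | grep -i galsim
and confirming that galsim is back at 2.2.x and that libgalsim now resolves to a real file instead of “not found”.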
Thanks ktl! I ran conda install rubin-env=0.6.0=ha770c72_4
and it installed successfully.
Now I do get the missing reference catalog error:
(lsst-scipipe-0.6.0) [jahumada@leftraru4 jahumada]$ pipetask run -b data_hits/ -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ProcessCcd.yaml -i DECam/raw/all,hits_master_calib/on20140228_bias,hits_master_calib/on20140228_flats --instrument lsst.obs.decam.DarkEnergyCamera --output-run processCcdOutputs/pon20140228
Traceback (most recent call last):
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/pipeline.py", line 711, in makeDatasetTypesSet
datasetType = registry.getDatasetType(c.name)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/registries/sql.py", line 391, in getDatasetType
return self._managers.datasets[name].datasetType
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/registry/interfaces/_datasets.py", line 491, in __getitem__
raise KeyError(f"Dataset type with name '{name}' not found.")
KeyError: "Dataset type with name 'ps1_pv3_3pi_20170110' not found."
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/bin/pipetask", line 29, in <module>
sys.exit(main())
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/pipetask.py", line 43, in main
return cli()
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
return f(get_current_context(), *args, **kwargs)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 102, in run
qgraph = script.qgraph(pipelineObj=pipeline, **kwargs)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/script/qgraph.py", line 145, in qgraph
qgraph = f.makeGraph(pipelineObj, args)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 550, in makeGraph
qgraph = graphBuilder.makeGraph(pipeline, collections, run, args.data_query, metadata=metadata)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 892, in makeGraph
scaffolding = _PipelineScaffolding(pipeline, registry=self.registry)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 411, in __init__
datasetTypes = PipelineDatasetTypes.fromPipeline(pipeline, registry=registry)
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/pipeline.py", line 911, in fromPipeline
thisTask = TaskDatasetTypes.fromTaskDef(
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/pipeline.py", line 784, in fromTaskDef
prerequisites=makeDatasetTypesSet("prerequisiteInputs"),
File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/pipeline.py", line 713, in makeDatasetTypesSet
raise LookupError(
LookupError: DatasetType 'ps1_pv3_3pi_20170110' referenced by CalibrateConnections uses 'skypix' as a dimension placeholder, but does not already exist in the registry. Note that reference catalog names are now used as the dataset type name instead of 'ref_cat'.
From what I’ve read, the only way to get a gen3 refcat is to ingest it into a gen2 repo and then convert it to gen3 (https://jira.lsstcorp.org/browse/DM-29543). There is an example that uses Gaia DR2 (“How to generate an LSST reference catalog” in the LSST Science Pipelines documentation), but I would like to use Pan-STARRS to construct the reference catalog. Do you have any recommendations as to which procedure would be best? Maybe I could reproduce the Gaia DR2 example with Pan-STARRS?
Thanks a lot for your time
If you can wait, I’m preparing instructions on how to transfer an existing gen2 refcat (we have a PS1 refcat in the LSST format) to a gen3 repo. It should only be a couple of commands. See DM-30624 for updates.
Converting an external refcat to the LSST format and then ingesting it into a gen3 repo is work I’m doing currently; DM-29543 is the right ticket to follow that.
Okay, I will go ahead and download the files that work for me from here: https://tigress-web.princeton.edu/~pprice/ps1_pv3_3pi_20170110/
I will keep an eye on the tickets you mention as well!
Hi John, I was able to download the Pan-STARRS catalog files needed to calibrate my images.
From what I’ve checked, there are no updates published on tickets DM-30624 and DM-29543 so far. I was wondering if you had any methods/commands I could try in the meantime? I would like to experiment.
I don’t actually have much time to wait, so any news is great news for me.
Thanks a lot for your time!
I’ve made some progress on this. You can try the following commands, where REPO is the path to your gen3 butler repo. Note that you’ll need a pretty recent pipelines version, as butler register-dataset-type
was added at the end of August.
- Create an astropy-readable table file containing one row per input file in the PS1 refcat, with two columns: the filename and htm7-pixel index. I think something like this code will get you the right output:
import glob
import os.path

import astropy.table

# One row per refcat shard: the file path and its HTM level-7 pixel index,
# which is encoded in the shard's file name (e.g. 123456.fits -> 123456).
table = astropy.table.Table(names=("filename", "htm7"), dtype=("str", "int"))
for file in glob.glob("ps1_pv3_3pi_20170110/*.fits"):
    table.add_row((file, int(os.path.splitext(os.path.basename(file))[0])))
table.write("filename_to_htm.ecsv")
- Register the new ps1 refcat dataset type:
butler register-dataset-type REPO ps1_pv3_3pi_20170110 SimpleCatalog htm7
- Ingest the LSST-formatted files you downloaded:
butler ingest-files -t direct REPO ps1_pv3_3pi_20170110 refcats filename_to_htm.ecsv
The ingested data should appear via butler query-datasets --collections refcats REPO
.
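Once that is done, I would expect you can add refcats as another input collection to your processing command, roughly like this (a sketch based on your earlier command line):
pipetask run -b data_hits/ -p $AP_PIPE_DIR/pipelines/DarkEnergyCamera/ProcessCcd.yaml -i DECam/raw/all,hits_master_calib/on20140228_bias,hits_master_calib/on20140228_flats,refcats ...
so that the graph builder can find the ps1_pv3_3pi_20170110 dataset type that CalibrateConnections was asking for.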
Hello,
I’m having problems reducing DECam data. In particular, the bias combine task is giving tons of errors and producing very few master frames. Below I enclose some of the pipeline output (from a re-run):
ValueError: Failure from formatter ‘lsst.obs.base.formatters.fitsExposure.FitsExposureFormatter’ for dataset d79309f0-85bf-4569-9b78-b177483c4bfe (cpBiasProc from file:///mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_
20211028T155954Z.fits):
File “src/fits.cc”, line 1578, in lsst::afw::fits::Fits::Fits(const string&, const string&, int)
cfitsio error: error reading from FITS file (108) : Opening file ‘/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_20211028T155954Z.fits’ with mode ‘r’
cfitsio error stack:
Error reading data buffer from file:
/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/2021102
8T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0
_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_20211028T155954Z.fit
s
Failed to find the END keyword in header (ffgphd).
Failed to read the required primary array header keywords.
ffopen could not interpret primary array header of file:
/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/2021102
8T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0
_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_20211028T155954Z.fit
s
{0}
lsst::afw::fits::FitsError: 'cfitsio error: error reading from FITS file (108) : Opening file ‘/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_20211028T155954Z.fits’ with mode ‘r’
cfitsio error stack:
Error reading data buffer from file:
/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/2021102
8T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0
_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_20211028T155954Z.fit
s
Failed to find the END keyword in header (ffgphd).
Failed to read the required primary array header keywords.
ffopen could not interpret primary array header of file:
/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/2021102
8T155954Z/cpBiasProc/20191122/ct4m20191122t191226/cpBiasProc_DECam_solid_plate_0
_0_0_0_ct4m20191122t191226_S14_master_calib_on20191122_bias_20211028T155954Z.fit
s
’
ctrl.mpexec.mpGraphExecutor ERROR: Task <TaskDef(CalibCombineTask, label=cpBiasCombine) dataId={instrument: ‘DECam’, detector: 13}> failed; processing will continue for remaining tasks.
ctrl.mpexec.mpGraphExecutor INFO: Executed 0 quanta successfully, 13 failed and 44 remain out of total 57 quanta.
timer.lsst.daf.butler.datastores.fileDatastore ERROR: Reading from location file:///mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191052/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits with formatter lsst.obs.base.formatters.fitsExposure.FitsExposure
Formatter: Took 17.8089 seconds
ctrl.mpexec.singleQuantumExecutor ERROR: Execution of task ‘cpBiasCombine’ on quantum {instrument: ‘DECam’, detector: 26} failed
Traceback (most recent call last):
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/daf_butler/22.0.1-70-gc781c74d+f4bf734fca/python/lsst/daf/butler/datastores/fileDatastore.py”, line 1215, in _read_artifact_into_memory
result = formatter.read(component=getInfo.component if isComponent else None)
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/obs_base/22.0.1-18-g3ad3863+50f8e256e0/python/lsst/obs/base/formatters/fitsExposure.py”, line 89, in read
return self.readFull()
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/obs_base/22.0.1-18-g3ad3863+50f8e256e0/python/lsst/obs/base/formatters/fitsExposure.py”, line 560, in readFull
result = self.reader.read(self.checked_parameters)
lsst.pex.exceptions.wrappers.FitsError:
File “src/fits.cc”, line 1419, in void lsst::afw::fits::Fits::readImageImpl(int, T, long int, long int*, long int*) [with T = float]
cfitsio error (/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191052/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits): error reading from FITS file (108) : Reading image
cfitsio error stack:
Error reading data buffer from file:
/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/2021102
8T155954Z/cpBiasProc/20191122/ct4m20191122t191052/cpBiasProc_DECam_solid_plate_0
_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits
Error reading elements 1 thru 2813 from column 1 (ffgclb).
error reading compressed byte stream from binary table
{0}
lsst::afw::fits::FitsError: 'cfitsio error (/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191052/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits): error reading from FITS file (108) : Reading image
(base) [jahumada@leftraru4 ~]$ ls /mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191052/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits*
/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191052/cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits
(base) [jahumada@leftraru4 ~]$ tail plira_23665563.err
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cli/script/run.py”, line 171, in run
f.runPipeline(qgraphObj, taskFactory, args)
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/cmdLineFwk.py”, line 669, in runPipeline
executor.execute(graph, butler)
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/mpGraphExecutor.py”, line 300, in execute
self._executeQuantaMP(graph, butler)
File “/mnt/flock/jahumada/lsstw_v21_37/lsst-w.2021.37/scripts/stack/miniconda3-py38_4.9.2-0.7.0/Linux64/ctrl_mpexec/22.0.1-18-gee15f3f+b3a3f94acb/python/lsst/ctrl/mpexec/mpGraphExecutor.py”, line 486, in _executeQuantaMP
raise MPGraphExecutorError(“One or more tasks failed or timed out during execution.”)
lsst.ctrl.mpexec.mpGraphExecutor.MPGraphExecutorError: One or more tasks failed or timed out during execution.
slurmstepd: error: Detected 48 oom-kill event(s) in step 23665563.batch cgroup. Some of your processes may have been killed by the cgroup out-of-memory handler.
Sorry about all the mess!
Paulina.
This looks like the file is completely corrupt. Can other tools read it? We usually only encounter problems like this if you’ve run out of disk space.
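If you want a quick independent check outside the stack, a minimal sketch with astropy (the path is just the one from your error message) would be:
from astropy.io import fits

# Path copied from the failing cpBiasCombine quantum above.
path = ("/mnt/flock/jahumada/eridanus_data/processed/master_calib/on20191122_bias/"
        "20211028T155954Z/cpBiasProc/20191122/ct4m20191122t191052/"
        "cpBiasProc_DECam_solid_plate_0_0_0_0_ct4m20191122t191052_S2_master_calib_on20191122_bias_20211028T155954Z.fits")

# A truncated or corrupt file will typically fail to open or fail verification here too.
with fits.open(path) as hdus:
    hdus.verify("exception")
    print(len(hdus), "HDUs read OK")
Comparing the file's size on disk with a sibling cpBiasProc file from a detector that did combine successfully would also tell you whether it was written completely.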