Issues with ingestCalibs.py using DECam images

Hi everyone :slight_smile:
I’m currently trying to ingest the raw and calibration images from DECam, following the gen3 approach to processing images.
The data I’m working with was downloaded from the NOIRLab archive, and I organized the raw images into folders by pointing, while the biases and dome flats are all together in a calib_Paula folder. I was able to ingest the raw images in their corresponding pointing folders, but when I try to ingest the calibration images with ingestCalibs.py I get a sqlite3.IntegrityError:

(lsst-scipipe-cb4e2dc) jahumada@leftraru3:/mnt/flock/jahumada$ ingestCalibs.py /mnt/flock/jahumada/DATA_Paula/pointing1  --calib /mnt/flock/jahumada/DATA_Paula/pointing1/calib /mnt/flock/jahumada/testdata_hits/calib_Paula/*.fits.fz --validity 999 --config clobber=True --mode=link 
root INFO: Loading config overrride file '/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/obs_decam/21.0.0+ed48dff28b/config/ingestCalibs.py'
CameraMapper INFO: Loading Posix exposure registry from /mnt/flock/jahumada/DATA_Paula/pointing1
ingestCalibs INFO: /mnt/flock/jahumada/testdata_hits/calib_Paula/c4d_140228_201715_fri.fits.fz --<link>--> /mnt/flock/jahumada/DATA_Paula/pointing1/calib/FLAT/2014-02-28/g/FLAT-2014-02-28-25.fits
ingestCalibs INFO: /mnt/flock/jahumada/testdata_hits/calib_Paula/c4d_140228_201814_fri.fits.fz --<link>--> /mnt/flock/jahumada/DATA_Paula/pointing1/calib/FLAT/2014-02-28/g/FLAT-2014-02-28-25.fits
ingestCalibs WARN: /mnt/flock/jahumada/testdata_hits/calib_Paula/c4d_140228_201814_fri.fits.fz: already ingested: {'filter': 'g', 'ccdnum': 25, 'calibDate': '2014-02-28'}
Traceback (most recent call last):
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_tasks/21.0.0+44ca056b81/bin/ingestCalibs.py", line 3, in <module>
    IngestCalibsTask.parseAndRun()
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_tasks/21.0.0+44ca056b81/python/lsst/pipe/tasks/ingest.py", line 411, in parseAndRun
    task.run(args)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_tasks/21.0.0+44ca056b81/python/lsst/pipe/tasks/ingestCalibs.py", line 266, in run
    create=args.create, table=calibType)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_tasks/21.0.0+44ca056b81/python/lsst/pipe/tasks/ingestCalibs.py", line 113, in addRow
    RegisterTask.addRow(self, conn, info, *args, **kwargs)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_tasks/21.0.0+44ca056b81/python/lsst/pipe/tasks/ingest.py", line 359, in addRow
    conn.cursor().execute(sql, values)
sqlite3.IntegrityError: UNIQUE constraint failed: flat.filter, flat.ccdnum, flat.calibDate

The plan is to run ingestCalibs.py /mnt/flock/jahumada/DATA_Paula/pointing1 --calib /mnt/flock/jahumada/DATA_Paula/pointing1/calib /mnt/flock/jahumada/testdata_hits/calib_Paula/*.fits.fz --validity 999 --config clobber=True --mode=link for each of the pointing folders (pointing1, pointing2, pointing3, etc.), but I don’t know how to get past this error. I speculate that it happens because ingest.py finds multiple values for flat.filter, flat.ccdnum, and flat.calibDate in the flat’s headers, which would be understandable since a DECam image comprises 61 CCDs. Is this the reason I get this error, or am I missing an important detail before the ingestion? If you know how to avoid this error, please let me know :smiley:
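For reference, the constraint itself is easy to reproduce with plain sqlite3. This is only a minimal sketch: the table and column names below mirror the error message, not the actual gen2 calib registry schema.

```python
import sqlite3

# The gen2 calib registry enforces a UNIQUE constraint over
# (filter, ccdnum, calibDate), so ingesting a second flat that maps to the
# same key raises sqlite3.IntegrityError, as in the traceback above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE flat (filter TEXT, ccdnum INT, calibDate TEXT, "
    "UNIQUE (filter, ccdnum, calibDate))"
)
conn.execute("INSERT INTO flat VALUES ('g', 25, '2014-02-28')")
try:
    # Same (filter, ccdnum, calibDate) as the first row -> rejected.
    conn.execute("INSERT INTO flat VALUES ('g', 25, '2014-02-28')")
except sqlite3.IntegrityError as e:
    print(e)  # UNIQUE constraint failed: flat.filter, flat.ccdnum, flat.calibDate
```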

Thanks a lot for reading :slight_smile:

This is a gen2 command for ingesting calibrations. For gen3 it is not possible to ingest Community Pipeline DECam calibrations. This is not for a technical reason; it’s because no one has yet wanted to write the necessary ingester, since all our testing involves building the calibration files ourselves in order to test our own software. Maybe @mrawls can point you at some documentation for how to run the DECam pipelines without using the CP calibrations.

Gen3 commands are generally of the form butler something or pipetask something and gen2 commands tend to be standalone .py commands.

I am working on a general purpose gen3 ingester and in theory this could be set up to ingest the MEF calibration files, although certifying them as valid calibrations would have to be done in a separate step (using butler certify-calibrations). To my knowledge, no-one has tried to run our gen3 pipelines with CP calibrations though.


Ok, thanks a lot for the clarification :slight_smile: I’m new at the gen3 approach so this is very useful info.

As Tim pointed out, you are using a Gen2 command here that doesn’t exist for Gen3. The issue you’re encountering may be a Gen2 bug that crept in as we changed the “generic ingest” code to work for Gen3; alternatively, you may inadvertently be trying to ingest the same calib file twice, or to ingest both a calib image and a calib wtmap or dqmask (DECam community pipeline weight/mask images).

Regardless of where the error is originating, Tim is correct in guiding you to use Gen3. You will need to start from raws, not instcals, and build your own (e.g., nightly) calib products since there is no support for importing pre-existing calib products. You should keep an eye on DM-39651, and you can also read my very-draft guide that will soon land in the actual obs_decam docs. Hope this helps!


Right, I’m inclined toward the first two possibilities you mention, since I only downloaded raw-type images this time around. Either way, I will start with gen3, now that I have a better grasp of how to begin :). Thanks for the references!

Hi Meredith,
I’m ingesting the DECam data using your very-draft guide and encountered a problem with the pipetask command line:

(lsst-scipipe-cb4e2dc) jahumada@leftraru1:/mnt/flock/jahumada/DATA_hits$ pipetask run -b . -i DECam/raw/all -o DECam/raw/crosstalk-sources -p $CP_PIPE_DIR/pipelines/DarkEnergyCamera/RunIsrForCrosstalkSources.yaml
Error: An error occurred during command execution:
Traceback (most recent call last):
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/daf_butler/21.0.0+187b78b4b8/python/lsst/daf/butler/cli/utils.py", line 453, in cli_handle_exception
    return func(*args, **kwargs)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/ctrl_mpexec/21.0.0+2f1cc9de74/python/lsst/ctrl/mpexec/cli/script/build.py", line 92, in build
    pipeline = f.makePipeline(args)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/ctrl_mpexec/21.0.0+2f1cc9de74/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 499, in makePipeline
    pipeline = Pipeline.fromFile(args.pipeline)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_base/21.0.0+544a109665/python/lsst/pipe/base/pipeline.py", line 193, in fromFile
    pipeline: Pipeline = cls.fromIR(pipelineIR.PipelineIR.from_file(filename))
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_base/21.0.0+544a109665/python/lsst/pipe/base/pipelineIR.py", line 681, in from_file
    return cls(loaded_yaml)
  File "/home/jahumada/lsst_stack-v21/stack/miniconda3-py37_4.8.2-cb4e2dc/Linux64/pipe_base/21.0.0+544a109665/python/lsst/pipe/base/pipelineIR.py", line 392, in __init__
    raise ValueError("A pipeline must be declared with one or more tasks")
ValueError: A pipeline must be declared with one or more tasks

I’m not sure what the problem with the number of declared tasks is. Just so you know, I have release version 21.0.0 (2020-12-08), and I also had to git clone cp_pipe for the prior commands to work :).

Again, thanks for your time :slight_smile:

I’m quite sure that you will need a newer weekly version of the Science Pipelines in order to use the current clone of cp_pipe. v21 has too old a version of the middleware.


I think K-T is correct. v21 is too old, unfortunately, and you will need to set up a recent weekly or wait for v22 (soon!). You may be able to get past this particular roadblock by setting up a local clone of ap_pipe along with cp_pipe so it finds the correct pipeline files to import, but I can pretty much guarantee you’ll hit some other middleware (Butler-related) problem if you try that route.


Okay :slight_smile: thank you both for your input. I was able to install the newest weekly version of the LSST Science Pipelines :D, and following the draft guide I got a FileNotFoundError:

(lsst-scipipe-0.6.0) jahumada@leftraru1:/mnt/flock/jahumada/data_hits$ pipetask run -b . -i DECam/raw/all
-o DECam/raw/crosstalk-sourcelss -p CP_PIPE_DIR/pipelines/DarkEnergyCamera/RunIsrForCrosstalkSources.yaml
Traceback (most recent call last):
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/bin/pipetask", line 29, in <module>
sys.exit(main())
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/pipetask.py", line 43, in main
return cli()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
return f(get_current_context(), *args, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 102, in run
qgraph = script.qgraph(pipelineObj=pipeline, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/script/qgraph.py", line 145, in qgraph
qgraph = f.makeGraph(pipelineObj, args)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 550, in makeGraph
qgraph = graphBuilder.makeGraph(pipeline, collections, run, args.data_query, metadata=metadata)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 907, in makeGraph
return scaffolding.makeQuantumGraph(metadata=metadata)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 805, in makeQuantumGraph
graph = QuantumGraph({task.taskDef: task.makeQuantumSet() for task in self.tasks}, metadata=metadata)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 805, in <dictcomp>
graph = QuantumGraph({task.taskDef: task.makeQuantumSet() for task in self.tasks}, metadata=metadata)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 359, in makeQuantumSet
return set(q.makeQuantum() for q in self.quanta.values())
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 359, in <genexpr>
return set(q.makeQuantum() for q in self.quanta.values())
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/graphBuilder.py", line 260, in makeQuantum
helper.adjust_in_place(self.task.taskDef.connections, self.task.taskDef.label, self.dataId)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/connections.py", line 699, in adjust_in_place
adjusted_inputs_by_connection, adjusted_outputs_by_connection = connections.adjustQuantum(
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/connections.py", line 582, in adjustQuantum
raise FileNotFoundError(
FileNotFoundError: Not enough datasets (0) found for non-optional connection overscan.camera (camera) with minimum=1 for quantum data ID {instrument: 'DECam', detector: 1, exposure: 288849, ...}.

I was able to run ingest-raws on both the calib and raw images with no problem, so I don’t know why this error is raised. I also wasn’t aware of the quantum data ID :0.

I would advise against working inside your butler repository. You don’t want to delete that gen3.sqlite3 file by mistake and you will have to remind yourself which files you wrote and which files butler wrote.

Looks like your copy and paste missed off some important parts of the command line.

This is caused by you not providing a collection that includes a camera definition. Did you run butler write-curated-calibrations REPO DECam ? Your input collection only seems to include raw files but other datasets are required to run the pipeline. Can you show us what butler query-collections REPO gives you?

Okay! I will work outside the butler repository from now on :). And yes, the incomplete -p argument was a typo from copying the error along with the command line; I have edited it, sorry for that [now the command line is complete]. I did run butler write-curated-calibrations . lsst.obs.decam.DarkEnergyCamera inside the REPO directory.
The butler query-collections REPO gives:

(lsst-scipipe-0.6.0) jahumada@leftraru3:/mnt/flock/jahumada$ butler query-collections /mnt/flock/jahumada/data_hits/
                Name                     Type   
------------------------------------ -----------
DECam/calib                          CALIBRATION
DECam/calib/unbounded                RUN        
DECam/calib/curated/19700101T000000Z RUN        
DECam/calib/curated/20130115T013000Z RUN        
DECam/calib/curated/20130916T092600Z RUN        
DECam/calib/curated/20140117T012900Z RUN        
DECam/calib/curated/20141020T103000Z RUN        
DECam/calib/curated/20150105T011500Z RUN        
DECam/calib/curated/20131130T000000Z RUN        
DECam/raw/all                        RUN

As long as you update your input collections (-i DECam/raw/all,DECam/calib), the pipeline should run.

Use that as a second input collection for your pipetask run. The graph builder needs to know where to find the inputs so you need to tell it what raws to use and what calibrations to use. Your previous command only specified the raw location.

Ok :slight_smile: I used DECam/calib as a second input to the pipetask run and got the following:

(lsst-scipipe-0.6.0) [jahumada@leftraru4 jahumada]$ pipetask run -b /mnt/flock/jahumada/data_hits -i DECam/raw/all,DECam/calib -o DECam/raw/crosstalk-sources -p $CP_PIPE_DIR/pipelines/DarkEnergyCamera/RunIsrForCrosstalkSources.yaml
numexpr.utils INFO: Note: NumExpr detected 20 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
numexpr.utils INFO: NumExpr defaulting to 8 threads.
ctrl.mpexec.cmdLineFwk INFO: QuantumGraph contains 155682 quanta for 1 tasks, graph ID: '1625765004.8131242-14655'
Traceback (most recent call last):
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/bin/pipetask", line 29, in <module>
    sys.exit(main())
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/pipetask.py", line 43, in main
    return cli()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 103, in run
    script.run(qgraphObj=qgraph, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/script/run.py", line 168, in run
    f.runPipeline(qgraphObj, taskFactory, args)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 624, in runPipeline
    preExecInit.initialize(graph,
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/preExecInit.py", line 90, in initialize
    self.initializeDatasetTypes(graph, registerDatasetTypes)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/preExecInit.py", line 137, in initializeDatasetTypes
    expected = self.butler.registry.getDatasetType(datasetType.name)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/registries/sql.py", line 391, in getDatasetType
    return self._managers.datasets[name].datasetType
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/registry/interfaces/_datasets.py", line 491, in __getitem__
    raise KeyError(f"Dataset type with name '{name}' not found.")
KeyError: "Dataset type with name 'packages' not found."

Perhaps there is a package I should declare before running the pipetask? Thanks for your time :slight_smile:

I think you just need to append --register-dataset-types to the end of your pipetask run command :crossed_fingers: This tells the Butler you intend to create (“register”) a new kind of output dataset it has not seen before.


Ok thanks :slight_smile: adding the --register-dataset-types flag at the end of the pipetask command did avoid the error I was encountering :D. I have now been running pipetask run -b /mnt/flock/jahumada/data_hits -i DECam/raw/all,DECam/calib -o DECam/raw/crosstalk-sources -p $CP_PIPE_DIR/pipelines/DarkEnergyCamera/RunIsrForCrosstalkSources.yaml --register-dataset-types for 19 hours, and unfortunately got an error (P.S. I am analyzing approximately 2300 raw images):

Traceback (most recent call last):
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/datastores/fileDatastore.py", line 1021, in _write_in_memory_to_artifact
    formatter.write(inMemoryDataset)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/obs_base/21.0.0-60-g2baedfa+3512c75b65/python/lsst/obs/base/formatters/fitsExposure.py", line 266, in write
    inMemoryDataset.writeFitsWithOptions(outputPath, options=ps)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/afw/21.0.0-46-g27369044f+532d44eec1/python/lsst/afw/image/image/_fitsIoWithOptions.py", line 132, in exposureWriteFitsWithOptions
    self.writeFits(dest, **writeOptionDict)
lsst.pex.exceptions.wrappers.FitsError:
  File "src/fits.cc", line 1351, in void lsst::afw::fits::Fits::writeImage(const lsst::afw::image::ImageBase<PixelT>&, const lsst::afw::fits::ImageWriteOptions&, std::shared_ptr<const lsst::daf::base::PropertySet>, std::shared_ptr<const lsst::afw::image::Mask<int> >) [with T = float]
    cfitsio error (/mnt/flock/jahumada/data_hits/DECam/raw/crosstalk-sources/20210712T224845Z/overscanRaw/20150226/ct4m20150226t215115/overscanRaw_DECam_g_DECam_SDSS_c0001_4720_0_1520_0_ct4m20150226t215115_N15_DECam_raw_crosstalk-sources_20210712T224845Z.fits): error writing to FITS f$
cfitsio error stack:
  HIERARCH ARCHIVE_ID_PHOTOCALIB = 1 / archive ID for generic component 'PHOTOCALI
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH SKYWCS_ID =         3 / archive ID for generic component 'SKYWCS'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH ARCHIVE_ID_SKYWCS = 3 / archive ID for generic component 'SKYWCS'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH DETECTOR_ID =       4 / archive ID for generic component 'DETECTOR'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH ARCHIVE_ID_DETECTOR = 4 / archive ID for generic component 'DETECTOR'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH FILTER_ID =       131 / archive ID for generic component 'FILTER'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH ARCHIVE_ID_FILTER = 131 / archive ID for generic component 'FILTER'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH BORE-AIRMASS =   1.58
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH BORE-ROTANG =     90.
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH INSTRUMENT = 'DECam   '
 Error writing elements 1 thru 3374 of input data array (ffpclb).
 {0}
lsst::afw::fits::FitsError: 'cfitsio error (/mnt/flock/jahumada/data_hits/DECam/raw/crosstalk-sources/20210712T224845Z/overscanRaw/20150226/ct4m20150226t215115/overscanRaw_DECam_g_DECam_SDSS_c0001_4720_0_1520_0_ct4m20150226t215115_N15_DECam_raw_crosstalk-sources_20210712T224845Z.fits): error writing to FITS file (106) : Writing image
cfitsio error stack:
  HIERARCH ARCHIVE_ID_PHOTOCALIB = 1 / archive ID for generic component 'PHOTOCALI
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH SKYWCS_ID =         3 / archive ID for generic component 'SKYWCS'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH ARCHIVE_ID_SKYWCS = 3 / archive ID for generic component 'SKYWCS'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH DETECTOR_ID =       4 / archive ID for generic component 'DETECTOR'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH ARCHIVE_ID_DETECTOR = 4 / archive ID for generic component 'DETECTOR'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH FILTER_ID =       131 / archive ID for generic component 'FILTER'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH ARCHIVE_ID_FILTER = 131 / archive ID for generic component 'FILTER'
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH BORE-AIRMASS =   1.58
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH BORE-ROTANG =     90.
  Warning: the following keyword does not conform to the HIERARCH convention
  HIERARCH INSTRUMENT = 'DECam   '
  Error writing data buffer to file:
  /mnt/flock/jahumada/data_hits/DECam/raw/crosstalk-sources/20210712T224845Z/overs
  canRaw/20150226/ct4m20150226t215115/overscanRaw_DECam_g_DECam_SDSS_c0001_4720_0_
  1520_0_ct4m20150226t215115_N15_DECam_raw_crosstalk-sources_20210712T224845Z.fits
  Error writing elements 1 thru 3374 of input data array (ffpclb).
'


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/bin/pipetask", line 29, in <module>
    sys.exit(main())
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/pipetask.py", line 43, in main
    return cli()
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "/home/jahumada/.conda/envs/lsst-scipipe-0.6.0/lib/python3.8/site-packages/click/decorators.py", line 21, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/cmd/commands.py", line 103, in run
    script.run(qgraphObj=qgraph, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cli/script/run.py", line 168, in run
    f.runPipeline(qgraphObj, taskFactory, args)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/cmdLineFwk.py", line 643, in runPipeline
    executor.execute(graph, butler)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/mpGraphExecutor.py", line 301, in execute
    self._executeQuantaInProcess(graph, butler)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/mpGraphExecutor.py", line 351, in _executeQuantaInProcess
    self.quantumExecutor.execute(qnode.taskDef, qnode.quantum, butler)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 137, in execute
    self.runQuantum(task, quantum, taskDef, butler)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ctrl_mpexec/21.0.0-34-g2cbe45c+f487817123/python/lsst/ctrl/mpexec/singleQuantumExecutor.py", line 345, in runQuantum
    task.runQuantum(butlerQC, inputRefs, outputRefs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/ip_isr/21.0.0-21-gc8894c7+b39e0bca51/python/lsst/ip/isr/isrTask.py", line 1053, in runQuantum
    butlerQC.put(outputs, outputRefs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/butlerQuantumContext.py", line 198, in put
    self._put(valuesAttribute, refs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/pipe_base/21.0.0-29-gf27b79e+8c09b75fa7/python/lsst/pipe/base/butlerQuantumContext.py", line 98, in _put
    butler.put(value, ref)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/core/utils.py", line 265, in inner
    return func(self, *args, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/_butler.py", line 939, in put
    self.datastore.put(obj, ref)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/core/utils.py", line 265, in inner
    return func(self, *args, **kwargs)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/datastores/fileDatastore.py", line 1638, in put
    storedInfo = self._write_in_memory_to_artifact(inMemoryDataset, ref)
  File "/mnt/flock/jahumada/lsstw_v21_27/lsst-w.2021.27/scripts/stack/miniconda3-py38_4.9.2-0.6.0/Linux64/daf_butler/21.0.0-109-g75b85349+7e5b4c34a6/python/lsst/daf/butler/datastores/fileDatastore.py", line 1023, in _write_in_memory_to_artifact
    raise RuntimeError(f"Failed to serialize dataset {ref} of type {type(inMemoryDataset)} "
RuntimeError: Failed to serialize dataset overscanRaw@{instrument: 'DECam', detector: 46, exposure: 415219, ...}, sc=Exposure] (id=7b67bb3b-ba23-430d-ba7e-cbf5f99794ed) of type <class 'lsst.afw.image.exposure.ExposureF'> to location file:///mnt/flock/jahumada/data_hits/DECam/raw/crosstalk-sources/20210712T224845Z/overscanRaw/20150226/ct4m20150226t215115/overscanRaw_DECam_g_DECam_SDSS_c0001_4720_0_1520_0_ct4m20150226t215115_N15_DECam_raw_crosstalk-sources_20210712T224845Z.fits

Do you know why I get this? Also, I’m not sure what this pipetask command does, or what crosstalkSources means. If you know, please let me know :smiley:
I appreciate your time and answers :slight_smile:

We sometimes see errors like this if a disk fills up. Could that be the case here?
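A quick way to check from Python (the path here is just an example; substitute the filesystem your butler repository lives on):

```python
import shutil

# Report free space on the filesystem holding the output repo. A (nearly)
# full disk produces exactly this kind of truncated-FITS write error
# partway through a long pipetask run.
usage = shutil.disk_usage("/")  # e.g. "/mnt/flock" in your case
free_gib = usage.free / 2**30
total_gib = usage.total / 2**30
print(f"free: {free_gib:.1f} GiB of {total_gib:.1f} GiB")
```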


The goal of this pipeline is to preprocess the set of input raw frames so they can be used to correct for inter-chip crosstalk (where a bright source on detector X shows up as a faint source on detector Y). This step applies the overscan correction to all the raw frames so that they can be supplied as an input to the full ISR processing. When the ISR processing is run for a given {exposure, detector} pair, it passes all the overscanRaw detector images from the same exposure as the crosstalkSources input.
In any case, this effectively doubles the disk usage for the raw data (which now exist as both the raw and overscanRaw dataset types), which makes K-T’s suggestion fairly likely.
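In case it helps intuition, the correction amounts to subtracting coefficient-scaled copies of the other detectors’ pixels from the target detector. This is only a toy sketch with invented pixel values and coefficients, not the actual ip_isr implementation:

```python
# Toy model of inter-chip crosstalk correction: each source detector leaks a
# small, linearly scaled ghost onto the target, so the fix is a weighted
# subtraction. Detector names and coefficient values are made up.
def correct_crosstalk(target, sources, coeffs):
    """target: list of pixels; sources: {detector: pixels}; coeffs: {detector: float}."""
    corrected = list(target)
    for name, pixels in sources.items():
        c = coeffs.get(name, 0.0)
        corrected = [t - c * s for t, s in zip(corrected, pixels)]
    return corrected

victim = [100.0, 100.0, 100.0]               # flat background on the target chip
sources = {"N15": [0.0, 10000.0, 0.0]}       # bright star on a neighbouring chip
coeffs = {"N15": 2e-3}                       # crosstalk coefficients are tiny
print(correct_crosstalk(victim, sources, coeffs))  # [100.0, 80.0, 100.0]
```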


Indeed, K-T’s suggestion is right! My disk got full, so I will delete some stuff :sweat_smile: Thanks!

Hi Cristopher :slight_smile:, your description was really useful for understanding what I was doing! I was wondering if there is any documentation that goes deeper into this pre-processing? I understand that gen3 is still very new and there probably isn’t a lot of documentation available yet, but maybe it follows the same logic as the gen2 approach, and looking at the latter could be useful to a degree.