Verification Datasets Meeting 2016-01-20 Minutes

Attendees: Colin, Hsin-Fang, Mario, Angelo, David, Frossie, MWV (had to leave early), Gruendl, RHL, Simon (joined late)


  • Didn’t get a chance to look at pipeQA yet
  • Next things to think about: what do we bring over from the HSC side (QA stuff)?
  • Working on validation dataset on DECam
  • validate_drp repo: making sure things work (Python path, etc.)
  • DECam data calibrated to SDSS catalog, astrometric precision 70 mas, z-band
  • RHL: are you bootstrapping the deep exposures to the shallow exposures?
  • No
  • Status of photometric and astrometric separation?
  • Colin: Simon is working on a generic astrometric loader to remove a dependence; this is the first of two steps [see Simon’s later clarification below]
  • Waiting on code review from Nidever for DM-4706, DM-4709
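The 70 mas astrometric precision quoted above is the kind of number that comes from the scatter of matched positions between the DECam catalog and the reference (SDSS) catalog. A minimal sketch of that computation, with entirely fabricated positions and a made-up 50 mas per-axis scatter (the function name and all values are illustrative, not from validate_drp):

```python
import numpy as np

def astrometric_offsets_mas(ra1, dec1, ra2, dec2):
    """Angular offsets in mas between matched positions given in degrees.

    Small-angle approximation with a cos(dec) correction on RA; adequate
    for sub-arcsecond residuals.
    """
    dec_mid = np.radians(0.5 * (dec1 + dec2))
    dra = (ra1 - ra2) * np.cos(dec_mid) * 3600e3   # deg -> mas
    ddec = (dec1 - dec2) * 3600e3                  # deg -> mas
    return np.hypot(dra, ddec)

# Toy usage with fabricated positions and a 50 mas per-axis Gaussian scatter
rng = np.random.default_rng(42)
n = 1000
ra = 150.0 + rng.uniform(-0.1, 0.1, n)
dec = 2.0 + rng.uniform(-0.1, 0.1, n)
sigma_mas = 50.0
ra2 = ra + rng.normal(0, sigma_mas, n) / 3600e3 / np.cos(np.radians(dec))
dec2 = dec + rng.normal(0, sigma_mas, n) / 3600e3
offsets = astrometric_offsets_mas(ra, dec, ra2, dec2)
print(f"median offset: {np.median(offsets):.1f} mas")
```

Note that the median radial offset is larger than the per-axis scatter (a Rayleigh-distribution effect), so the quoted precision depends on which statistic is reported.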


  • Debugging noise estimates in CP-processed DECam data
  • Underestimate of noise in variance plane, leading to many false detections
  • Tested this many ways
  • Forced-photometry, scatter 15% larger than expected
  • DIA sources, consistent with noise, also more scatter than expected
  • Scaled the variance planes by 15% and reran
    900 positive, 900 negative, ~1000 dipoles, all per sq deg
    better than DES results post-machine learning
    [DES has 8-9k detections per focal plane post machine-learning rejection, so ~3k/sq deg; some caveats: the imaging is deeper, and the detection threshold might not be the same]
  • RHL: have you looked at covariance of the images?
  • Colin: not yet
  • Mario: Andy Becker found that the covariance was 1-2%
  • RHL: how are you doing the detection?
  • Colin: using the standard routines/setup
  • RHL: using Gaussian approximation to do detection right now, could be off from real PSF by ~15%
  • Discussion on how the detection is actually done, do we ever use the “real” PSF for detection
  • S/N of PSF measurements consistent with the detection threshold
  • RHL: HSC uses “sky objects”, useful information, part of regular processing
  • Colin has also looked at the pixels themselves and found the same issue, i.e. variance values too small
  • Mario: what’s the depth of these images, compared to LSST?
  • Colin: shallower than LSST, looking at HiTS data (deeper) as well
  • David: might be good to process the data with LSST stack, i.e. do ISR ourselves, and see if this problem persists
  • We might talk to Frank about this issue, and keep Gruendl in the loop
  • Colin will figure out priorities for the next steps
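The variance underestimate discussed above translates directly into a lower effective detection threshold. A minimal sketch of the arithmetic (the 5-sigma threshold is an assumed nominal value; the 15% figure is from the discussion; the rest is standard Gaussian tail statistics):

```python
import math

def gaussian_tail(nsigma):
    """One-sided upper-tail probability of a standard normal variable."""
    return 0.5 * math.erfc(nsigma / math.sqrt(2.0))

threshold = 5.0        # nominal detection threshold, in sigma (assumed)
underestimate = 1.15   # true noise is 15% larger than the variance plane claims

# The threshold actually applied, measured in units of the *true* sigma
effective = threshold / underestimate

# How much the per-pixel false-positive probability is inflated
inflation = gaussian_tail(effective) / gaussian_tail(threshold)
print(f"effective threshold: {effective:.2f} sigma")
print(f"false-positive rate inflated by ~{inflation:.0f}x")
```

So even a 15% noise underestimate pushes a 5-sigma cut down to roughly 4.3 sigma and inflates the Gaussian false-alarm rate by over an order of magnitude, consistent with the large counts of spurious positive/negative detections reported.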


  • Processing raw data
  • Looking at TPV cards that were left in the header
  • Camera geometry
  • David: there are some branches that aren’t merged on obs_decam
  • Colin doesn’t have any obs_decam changes
  • H-F will merge when tickets are closed
    • two open tickets, astrometry issues
    • other issue is on processing raw data
  • RHL wants to process DECam calibration data, wants good ISR
  • David: also getting a failure on chip 62 because it uses ccdnum for the extension number, which is wrong; H-F made a fix for this, but only for raw data; we should look at applying it to instcal as well
  • David: Should we use processCcd or processCcdDecam?
  • H-F: for raw data use processCcd, for instcal use processCcdDecam
  • David: can we make processCcd work on both raw and instcal?
  • H-F: should be a way to do that, might be ticket about that, RFC-95, DM-4077
  • There’s not a standard way to handle preprocessed data in LSST


  • Haven’t had time to work on the bulge, LINEA work
  • Having some problems with obs_decam


  • Link for the quick-and-dirty QA webpage for the COSMOS processing:
  • Will add a password in the near future for these pages.
  • Still need to look through the failures to see what’s going on.
  • Spent most of the time recently working on a script that tests the astrometric matcher, matchOptimisticB. This was prompted by matcher failures in the bulge data processing, but might just be an issue with config defaults.
  • The matcher works well out to offsets of about 80 arcsec and fails beyond that
  • Gruendl: first DECam data had astrometry off by more than 1 arcmin
  • RHL: might need a preburner for blind solver, looking at bright solver
  • Simon: could it be getting bad matches at large offsets?
  • David: it got the same number of matches, was surprised that it didn’t decrease slowly, maybe there’s something else also going on
  • David: I’m not saying there is a problem, working out to 80 arcsec is a good thing. Overall, I’m trying to see where the algorithms/code breaks so we understand the performance of our software.
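The offset test David describes can be illustrated with a toy translation estimator. This is not matchOptimisticB; it is the classic offset-histogram trick (vote over all pairwise difference vectors within a search window), which has the same qualitative behavior: it recovers the shift reliably up to the search radius and breaks abruptly beyond it. All names, catalogue sizes, and the 100 arcsec window are made up for illustration:

```python
import numpy as np

def estimate_shift(ref, det, max_offset=100.0, bin_size=2.0):
    """Estimate a pure translation between two point catalogues by voting
    over all pairwise difference vectors within +/-max_offset.

    Returns the modal difference vector, or None if no pair falls inside
    the search window. Brute force; fine for toy catalogue sizes.
    """
    diffs = (det[:, None, :] - ref[None, :, :]).reshape(-1, 2)
    diffs = diffs[np.all(np.abs(diffs) < max_offset, axis=1)]
    if len(diffs) == 0:
        return None
    nbins = int(2 * max_offset / bin_size)
    hist, xedges, yedges = np.histogram2d(
        diffs[:, 0], diffs[:, 1], bins=nbins,
        range=[[-max_offset, max_offset]] * 2)
    i, j = np.unravel_index(hist.argmax(), hist.shape)
    return np.array([0.5 * (xedges[i] + xedges[i + 1]),
                     0.5 * (yedges[j] + yedges[j + 1])])

rng = np.random.default_rng(0)
ref = rng.uniform(0, 600, size=(200, 2))        # reference positions, arcsec
for true_shift in (0.0, 41.0, 81.0, 161.0):     # applied offsets, arcsec
    det = ref + [true_shift, 0.0] + rng.normal(0, 0.1, ref.shape)
    print(true_shift, estimate_shift(ref, det))
```

Inside the window the true shift wins the vote by a wide margin; once the applied offset exceeds the window, the estimator returns a spurious mode rather than degrading gradually, which echoes David's observation that the failure was not a slow decline in match counts.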


  • Didn’t see Joachim recently, not sure what the status is on getting MOPS to work


  • Clarification on separating astrometric and photometric calibration. On his plate to allow indexing of any file not just
  • David: can we do astrometry/photometry separately now?
  • Task needs to know the names of the data products so it can load them via the Butler; can define different datasets in the policy file and get them from the Butler. Should be able to do it now, but it will be easier in the future.
  • RHL: will also need input catalogs for PSF stars in the future, so potentially more than just two catalog types.
  • Twinkles: still want to look at repeatability of aperture photometry, not just PSF photometry
  • Twinkles project plans to have a month’s worth of data (PhoSim, full OpSim cadence) by end of March
    variability of base sources (RR Lyrae, flares, Cepheids, etc.), but not weak lensing or the like
    no variability due to clouds or moon; the dark-sky level is held stable
  • Gruendl: could use RASICAM data from DECam for cloudiness; it maps at 10 microns the water vapor
    distribution across the sky, operating throughout DECam observing; Kevin Reil at SLAC
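A repeatability check of the kind mentioned for Twinkles can be sketched as the median, over sources, of each source's RMS magnitude scatter across visits; the same metric applies whether the input magnitudes are aperture or PSF photometry. The function name, array shapes, and the 12 mmag scatter below are all fabricated for illustration:

```python
import numpy as np

def repeatability_mmag(mags):
    """Photometric repeatability: median over sources of the RMS scatter
    of each source's repeated magnitude measurements, in millimag.

    `mags` is an (n_sources, n_visits) array of magnitudes.
    """
    per_source_rms = np.std(mags, axis=1, ddof=1)
    return 1000.0 * np.median(per_source_rms)

# Toy usage: 500 constant sources, 30 visits, 12 mmag Gaussian scatter
rng = np.random.default_rng(1)
true_mags = rng.uniform(18, 22, size=(500, 1))
mags = true_mags + rng.normal(0, 0.012, size=(500, 30))
print(f"repeatability: {repeatability_mmag(mags):.1f} mmag")
```

Using the median over sources rather than the mean keeps the metric robust against a handful of genuinely variable or blended sources.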



