Verification Datasets Meeting 2016-01-13 Minutes

Attendees: Colin, Mario, Simon, Dominique F., Hsin-Fang, Gruendl, Juan, MWV, Angelo, David
Regrets:
Notetaker: Angelo

Colin:

  • Measurement on difference images. Fixed bugs that prevented getting measurements; now able to quantify detections as a function of magnitude, etc.
  • Action Item: Include link to results in verification wiki
  • Tired of looping over src FITS catalogs; used daf_ingest to set up a MySQL database, and running queries against lsst10 improved the analysis a lot. ingestProcessed.py should be used for the metadata, where is it?
  • Action item: Talk with Jacek or Serge about database access. See the discussion on Community about the need to have the visit and ccd metadata in addition to the src table.
  • Results: many detections of probable “noise”. The noise model is wrong, off by ~15%, a CP noise model issue. The distribution of (im1 - im2)/variance is too wide.
  • Plan to proceed: try our own ISR from the raw images and see how good our noise model is. MWV suggested a generic test of the noise model (see the sketch below).
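
A minimal sketch of the generic noise-model test MWV suggested, assuming two calexps of the same field resampled onto the same pixel grid (the file names are hypothetical): if the variance plane is right, the distribution of (im1 - im2)/sqrt(var1 + var2) should be close to a unit Gaussian.

import numpy as np
import lsst.afw.image as afwImage

exp1 = afwImage.ExposureF("calexp-1.fits")  # hypothetical inputs, assumed to be
exp2 = afwImage.ExposureF("calexp-2.fits")  # resampled onto the same pixel grid

im1 = exp1.getMaskedImage().getImage().getArray()
im2 = exp2.getMaskedImage().getImage().getArray()
var = (exp1.getMaskedImage().getVariance().getArray() +
       exp2.getMaskedImage().getVariance().getArray())

ratio = (im1 - im2) / np.sqrt(var)
good = np.isfinite(ratio)
print("stddev of (im1 - im2)/sigma = %.3f (expect ~1 for a correct noise model)"
      % np.std(ratio[good]))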

David:

  • Gave talk at AAS, good summary of what we have done so far (http://zenodo.org/record/44673)
  • COSMOS data: taking the src catalog from each ccd, matched sources between visits. Which matching method from the stack should be used? Answer:
import lsst.afw.table as afwTable
import lsst.afw.geom as afwGeom
match = afwTable.matchRaDec(srcRef, srcVis, afwGeom.Angle(1, afwGeom.arcseconds))
  • Given src catalogs from individual ccds and visits, how do we preserve the match result together with the original visit information? (See the sketch after this list.)
  • Need for a multi-band matcher (in progress)
  • Changed strategy to focus on processCcd, to better understand each step and make progress on QA
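
One possible way to preserve the match result together with the originating src catalogs (a sketch only, not necessarily the approach David settled on; the output file name is made up): afw.table can pack a match list into a small catalog of record IDs that can be written to FITS and later unpacked against the original catalogs.

import lsst.afw.table as afwTable

# `match` is the result of afwTable.matchRaDec(srcRef, srcVis, ...) from above
packed = afwTable.packMatches(match)   # catalog of (first id, second id, distance)
packed.writeFits("matches.fits")       # hypothetical output name

# later, with srcRef and srcVis re-read (e.g. from the butler), recover full records
restored = afwTable.unpackMatches(packed, srcRef, srcVis)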

Michael:

  • CFHT astrometry checks: see bin/check-astrometry.py in https://github.com/wmwv/validate_drp
  • The median astrometric scatter (from the positions recorded by the pipeline) is 23 mas for mag < 21. This checks internal astrometry between visits (see the sketch after this list).
  • Demoed running check-astrometry on different output data repositories
  • COSMOS DECam shows large astrometric scatter (>200 mas), probably a CP WCS issue
  • Make validate_drp generic?
  • Colin: good design choice, small datasets and small pieces of code
  • Mario: suggests looking into pipeQA. It has two components: test pipeQA (?), which runs on a given data repository (it should handle any obs package), and display pipeQA, which produces the plots. Which parts of the framework can be reused?
  • Action item: Mario to send presentations and references on pipeQA
  • Will it work with SQuaSH?
  • David: Simultaneous astrometry. JimB and Dominique B. talked about what is needed to bring in simultaneous astrometry
  • UW is planning to do this in this cycle; @parejkoj is leading the effort to bring the HSC code (simultaneous photometric solution) over
  • Action item: David to look into that first and contact @parejkoj to see how to include Dominique's code. Set up a phone conference a week from Tuesday. See also https://jira.lsstcorp.org/browse/RFC-123
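
A minimal sketch of the kind of per-visit astrometric repeatability check summarized above (not the actual bin/check-astrometry.py code; srcVis1, srcVis2 and the 1 arcsec match radius are assumptions): match the source catalogs from two visits of the same field and look at the distribution of angular separations.

import numpy as np
import lsst.afw.table as afwTable
import lsst.afw.geom as afwGeom

# srcVis1, srcVis2: SourceCatalogs of the same field from two different visits (assumed)
matches = afwTable.matchRaDec(srcVis1, srcVis2, afwGeom.Angle(1, afwGeom.arcseconds))

# match.distance is the angular separation in radians; convert to milliarcseconds.
# In practice a magnitude cut (e.g. mag < 21, as quoted above) would be applied first.
sepMas = np.degrees(np.array([m.distance for m in matches])) * 3600.0e3
print("median astrometric scatter: %.1f mas over %d matches"
      % (np.median(sepMas), len(matches)))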

Simon:

Angelo:

  • Looking at the astrometry matcher failures, see examples at https://confluence.lsstcorp.org/display/SQRE/Bulge+Survey+Processing
  • Using the default configuration, the maximum offset is 3 arcsec; might turn that up (calibrate.astrometry.matcher.maxMatchDistArcSec = 3.0)
  • There are two solvers: the default matcher requires the initial WCS to be close, while astrometry.net can solve blind
  • The minimum number of matched pairs looks OK (calibrate.astrometry.matcher.minMatchedPairs = 30); a config override sketch follows below
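
For reference, a sketch of the corresponding config override (assuming it is passed to processCcd.py via --configfile; the dotted field names are the ones quoted above and may differ between stack versions, and the 5.0 value is only an illustration of "turning it up"):

# astrometry matcher overrides discussed above
config.calibrate.astrometry.matcher.maxMatchDistArcSec = 5.0  # raised from the 3.0 default
config.calibrate.astrometry.matcher.minMatchedPairs = 30      # unchanged default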

Yusra/Francisco:

  • not on

Dominique:

  • not on

Mario:

  • MOPS
    • Joachim Moeyens has run MOPS on test datasets and got it to run all the way through to tracks; it only works on his Mac and produces garbage on Linux, maybe memory issues; working on it

Dominique F.:

  • introduced Juan Pablo Gomez, working on CFHT images

For more information, I’ve run 10 visits each of g, r, and i band simulated data through single frame processing, coadd, coadd detection and measurement, and forced photometry. These are single chips per visit with rotational dithering only.

Re: “There are two solvers: the default matcher requires the initial WCS to be close, while astrometry.net can solve blind”

I inherited the use of astrometry.net in LSST. It proved to be unreliable, slow, and hard to debug. The unreliability was, I think, mostly due to it not liking us saturating bright stars. Dustin Lang made a number of changes and we got it mostly working on HSC.

I don’t believe that we need a blind solver, hence the switch to a new solver. We’ve run it on a couple of thousand degrees of HSC data without problem (a total of 99.4 polychromatic degrees of coverage if you’re interested). If we do need a blind solver, I think we should do this as a whole-camera step using a smallish number of bright stars before solving for each CCD.

The HSC tool for simultaneous astrometry and photometry is meas_mosaic. There is an effort I’m aware of to bring this over to LSST (that’s DM-2674), but it’s not being led by JohnP. As far as I know, he is focusing on meas_simastrom, which is not HSC code and doesn’t currently implement simultaneous photometry (but see DM-3871).

Am I confused?

I think what was said on the call is that JohnP is adding in simultaneous photometry fitting as well because HSC needs it.

Under Mario’s report, I think that’s a different “Peter”.

Actually, I think it’s a different “Yoachim”. I will fix.

I’m guessing the J. Parecko is supposed to be me (Parejko)? I’m looking into simastrom, which is only astrometry, not photometry (as far as I’m aware).

Thanks @timj for fixing and sorry for the mistakes.

There are plans to include a photometry solution in meas_simastrom (I’ve discussed it with @boutigny and @PierreAstier), but I don’t know what state it’s in.

There is a photometric fit in meas_simastrom. It consists of a minimizer (of an abstract model) and the implementation of one model: one photometric scale per input “calexp” (all but one). The only missing piece is a scheme to store the results in a way that the image stacking will use.

The minimizer code could also fit a model of the “illumination correction” using a set of dithered exposures of the same scene. Again, one question here is how we store this product. This is needed for serious photometry on MegaCam/CFHT images because the corrections to the flatfield are significant.
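
As an illustration of the per-exposure photometric scale model described above (one free zero point per calexp, with one held fixed as the reference), here is a toy standalone least-squares fit. This is not the meas_simastrom code, just the idea, and all names and numbers are made up.

import numpy as np

# Toy data: n_star stars observed in n_exp exposures; each exposure has an
# unknown zero-point offset (exposure 0 is held fixed as the reference).
rng = np.random.RandomState(42)
n_star, n_exp = 200, 5
true_mag = rng.uniform(18.0, 22.0, n_star)
true_zp = np.concatenate([[0.0], rng.normal(0.0, 0.05, n_exp - 1)])
obs = true_mag[:, None] + true_zp[None, :] + rng.normal(0.0, 0.02, (n_star, n_exp))

# Linear least squares: unknowns are the n_star magnitudes plus (n_exp - 1)
# zero points; each observation contributes one row of m_star + zp_exp = obs.
n_par = n_star + n_exp - 1
A = np.zeros((n_star * n_exp, n_par))
for s in range(n_star):
    for e in range(n_exp):
        row = s * n_exp + e
        A[row, s] = 1.0
        if e > 0:
            A[row, n_star + e - 1] = 1.0
fit = np.linalg.lstsq(A, obs.ravel(), rcond=None)[0]

print("recovered zero points:", np.round(fit[n_star:], 3))
print("true zero points:     ", np.round(true_zp[1:], 3))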