Verification Datasets Meeting 2016-05-11 Minutes

Verification datasets telecon

Attendees: Colin, MWV, David, Hsin-Fang, Simon

MWV:

  • on vacation last week
  • wants to work on lsst QA package, do more tests
  • in Tucson Thur/Fri

Colin:

  • not working on verification much
  • working on star/galaxy separation; looking for ground-based data that overlaps with space-based data;
    will try to use David’s COSMOS reductions
  • separate star/galaxy paper, eventually put it in the MAF framework
  • PSF-minus-model photometry (an SDSS technique), similar to SExtractor’s spread_model

Simon:

  • nice twinkles 3-color coadd image

https://community.lsst.org/uploads/default/original/1X/20170e53013d17bd57cc767792d779de5d1d07b9.png

  • high SN rate: wanted 100 SNe above 5 sigma in each visit, which may be too many and cause confusion
  • there is outlier rejection in the coadds, so there was some confusion about why the SNe show up so clearly in the coadd

Hsin-Fang:

  • nothing to update

Angelo:

  • showing QA dashboard update
  • based on django database package, using bokeh for plots
  • using Jenkins jobs now
  • shows metric values vs. time
  • wants to make it extendable
  • if a new metric is added, the QA database schema does NOT need to be updated (see the sketch below)
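
A minimal sketch of what such a metric-agnostic schema could look like (hypothetical Django model and field names, not Angelo’s actual code): each new metric becomes a row in a metrics table rather than a new column, so no schema migration is needed.

    from django.db import models

    class Metric(models.Model):
        # one row per metric definition; adding a metric adds a row, not a column
        name = models.CharField(max_length=64, unique=True)
        units = models.CharField(max_length=32, blank=True)

    class Measurement(models.Model):
        # one row per (metric, Jenkins job) value; plotting metric values vs.
        # time is then just a query over Measurement ordered by measured_at
        metric = models.ForeignKey(Metric, on_delete=models.CASCADE)
        job_id = models.CharField(max_length=64)
        value = models.FloatField()
        measured_at = models.DateTimeField(auto_now_add=True)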

David:
https://confluence.lsstcorp.org/display/SQRE/COSMOS+DECam+data+reduction

  • worked last week on the KPM CoDR document
  • trying to calculate photometric scatter for aperture photometry
  • the numbers in the circular-aperture names give the radius in pixels
  • Colin clarified that PSF aperture corrections are made to a reference aperture; the default is 17 pixels
  • can be set in the config
    config.calibrate.measureApCorr.refFluxName = 'base_CircularApertureFlux_17_0'
  • but there does not appear to be any curve of growth correction in the stack
  • going to rerun processCcd with no or only linear spatial variation in the PSF to see if that improves the results (see the config sketch after this list)
  • POST-TELECON: the aperture photometry does have much lower scatter, down to ~0.4% (see figure on the confluence page), so there are definitely some issues with the PSF photometry and model.
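
A sketch of the config overrides discussed above (the attribute paths assume the 2016-era processCcd/CalibrateTask layout and the PCA PSF determiner; check them against the installed stack):

    # reference aperture for PSF aperture corrections (default 17 px, per Colin)
    config.calibrate.measureApCorr.refFluxName = 'base_CircularApertureFlux_17_0'
    # reduce the PSF model's spatial variation for the rerun
    # (0 = constant across the CCD, 1 = linear)
    config.charImage.measurePsf.psfDeterminer['pca'].spatialOrder = 1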

I’ve found recently with some HSC data that I get much better star/galaxy separation (for resolved stellar pops work) if I use a small-scale background subtraction (binSize = 32). This keeps CModel from attempting to fit low surface brightness stuff underneath stars. Maybe that’s only necessary for HSC because of the big glass, but presumably you’ll approach that regime with coadds, so it would be good to build an optional local sky subtraction into some of our measurement algorithms.
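
As a sketch, the binSize = 32 change is a small config override; the exact attribute path depends on which task runs the background fit, so the path below is an assumption:

    # use small background bins so low-surface-brightness light under stars is
    # absorbed by the background model rather than fit by CModel
    config.charImage.detection.background.binSize = 32
    config.charImage.detection.reEstimateBackground = True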

You’re probably using the SafeClipAssembleCoaddTask, which is now the default coadder. It is careful about what it rejects, especially in the wings of objects detected in each visit.

@RHL wrote some curve-of-growth code for HSC, but we haven’t yet got it working robustly (it’s a subtly different problem than for SDSS). All photometry (except perhaps for the circular apertures?) should be corrected to the slot_CalibFlux (defaults to base_CircularApertureFlux_12_0), which should be good enough for most science.

Which PSF model are you using? If it’s not PSFEx, then you should switch. @boutigny found that it magically solved some of his problems this morning.

Local sky subtraction will, of course, ruin your galaxy photometry.

The curve-of-growth is tricky due to the crowding in 300s Subaru exposures. It could be made to work, but it wasn’t at the top of our priority list. What exactly are you looking for here?

The coaddition is very carefully written to be linear; this preserves the PSF. I’m not sure what problems you are seeing – can you add a few more details?
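
Concretely, the identity being preserved (a standard result, not stack-specific) is that a linear coadd with per-visit weights w_i has an effective PSF that is the same weighted combination of the per-visit PSFs:

    PSF_coadd(x) = sum_i w_i PSF_i(x) / sum_i w_i

Any per-pixel nonlinear step, such as clipping, breaks this identity.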

I’m using the default PSF solver, which is a PCA one. I don’t think it’s PSFEx unless it’s a reimplementation.

It’s not a huge deal. I just wasn’t sure what aperture radius to pick, and having something that was corrected for the curve of growth (to the “total” flux) would have made my decision easier. :grinning:
Also, it would be nice to have the PSF photometry aperture correction actually correct to the “total” flux and not just to some fiducial radius (which could change from one run to the next). But I understand it can get challenging in semi-crowded regions.

I think Simon can explain better, but basically for the Twinkles project they put in a large number of SNe (supernovae) in the simulated data (10x the expected amount), but quite a few of them show up in the full, 10-year coadd. We were surprised that the outlier-rejection algorithm didn’t throw them out during the coaddition process.

I strongly recommend using meas_extensions_psfex.
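
For reference, switching PSF determiners is a two-line config override (the attribute path assumes the 2016-era processCcd layout):

    # importing the module registers 'psfex' in the psfDeterminer registry
    import lsst.meas.extensions.psfex.psfexPsfDeterminer
    config.charImage.measurePsf.psfDeterminer.name = 'psfex'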

The correction to the “total” flux is degenerate with the zero point. Everything should be aperture corrected to the slot_CalibFlux so that it’s all consistent, but there’s not much reason to worry about anything larger.

As Paul said, you don’t need to know the curve of growth corrections as stellar photometry is relative to standards (this includes the PSF flux) – we use the same calibration radius for standard stars as science targets and the corrections cancel out. The exception is if you want to use some aperture that we have not directly calibrated.
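
If you do want a different calibration aperture, the slot is configurable; a sketch, assuming the 2016-era meas_base slot config (the value shown is the stated default):

    # all photometry is aperture-corrected to whatever this slot points at
    config.calibrate.measurement.slots.calibFlux = 'base_CircularApertureFlux_12_0'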

You can’t just use outlier rejection: it’s a statistical process being applied to a non-statistical problem. This isn’t just a theoretical comment: clipping (even at, say, 5 sigma) destroys the property that stars are just a delta function convolved with the PSF (i.e. the coadd no longer has a well-defined PSF).
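
A toy illustration of the point (plain NumPy, not stack code): sigma-clipping a stack of Gaussian “stars” observed in different seeing rejects real flux, so the clipped coadd’s profile is no longer the mean of the input PSFs.

    import numpy as np

    x = np.linspace(-10, 10, 201)
    seeings = [1.0, 1.5, 2.0, 2.5, 3.0]  # per-visit Gaussian sigmas
    stack = np.array([np.exp(-0.5 * (x / s) ** 2) / (s * np.sqrt(2 * np.pi))
                      for s in seeings])

    linear = stack.mean(axis=0)  # linear coadd: PSF is the mean of the input PSFs
    # clip pixels more than 1 sigma from the per-pixel mean, as a naive coadd might
    keep = np.abs(stack - linear) <= stack.std(axis=0)
    clip_coadd = np.nanmean(np.where(keep, stack, np.nan), axis=0)

    # real flux is rejected at the core (here the best- and worst-seeing visits),
    # so the clipped peak comes out ~10% low
    print('peak ratio (clipped / linear): %.2f' % (clip_coadd[100] / linear[100]))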

The exception to this is to PSF-match the images before stacking. We could do this, but it involves degrading the data. The Kaiser coadd gets around some of this (for faint stars), and DRP is thinking about such optimal coadds. Stay tuned!

The safe clipping that we use gets around this. The “right” thing to do is to use the difference imaging to identify variables and remove them. We are not running a full diffim while making these coadds, but a rather simple algorithm that correctly removes things that only show up as isolated objects in one (?) visit; @rearmstr can tell you more.

This is not good enough, and it’s on the list! One problem is the chicken and egg of building good templates to do the diffim, but needing diffim to build good templates. I think this is tractable (I’m not sure @jbosch agrees), but in any case experiments are needed.

P.S. I have a writeup expanding the point about clipping if you’re interested.

I would be interested in reading that.

RHL wrote above: “As Paul said, you don’t need to know the curve of growth corrections as stellar photometry is relative to standards”

I agree that this would probably be fine. A few minor concerns still come to mind:

  • will there be problems if the seeing is quite bad and a decent amount of flux falls outside the “calibflux” aperture?
  • will the amount of (scattered) light in the broad wings outside the “calibflux” aperture be constant across the entire focal plane? I could imagine that it’s not.

These are probably pretty minor issues (sub-percent level) but may matter in the end.