issues with blended objects: there is no deblending in forced photometry; multifit is supposed to fix this,
but it doesn't allow amplitudes to vary per epoch (within a given band)
the way to do the lightcurves is forced photometry on difference images (also no deblending),
but if both blended sources are varying then you are hosed
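As a reminder of what that forced measurement amounts to, here is a minimal sketch: sum the flux in a fixed aperture at a fixed, known position on each epoch's difference image, with no deblending. This is plain numpy/astropy for illustration only, not the LSST stack API; `diffim_files`, `x0`, `y0`, and the aperture radius are hypothetical inputs.

```python
import numpy as np
from astropy.io import fits

def forced_aperture_flux(image, x0, y0, radius=5.0):
    """Sum the pixels within `radius` of the fixed position (x0, y0)."""
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return image[mask].sum()

def lightcurve(diffim_files, x0, y0, radius=5.0):
    """Forced flux at the same (x0, y0) on every epoch's difference image."""
    return np.array([forced_aperture_flux(fits.getdata(f), x0, y0, radius)
                     for f in diffim_files])
```

If two blended variables sit inside the same aperture, their difference fluxes are summed together, which is the failure mode noted above.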
Run 2 runs between now and September; a few blockers need to be fixed first
want to be able to choose the data based on some quality cuts
- not clear how this will work; Simon added some things to LDM-151 on the tools we need for "data discovery", under software primitives (9.22)
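If the per-visit metadata can be exported to a table, the selection itself is simple; the open question is the "data discovery" tooling that produces that table. A hedged sketch, with hypothetical column names (`seeing`, `airmass`, `zp_err`) and cut values rather than an agreed schema:

```python
import pandas as pd

# Hypothetical export of per-visit exposure metadata to a CSV.
visits = pd.read_csv("visit_metadata.csv")

good = visits[
    (visits["seeing"] < 1.2)      # arcsec
    & (visits["airmass"] < 1.5)
    & (visits["zp_err"] < 0.02)   # mag
]
good["visit"].to_csv("good_visits.txt", index=False)
```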
need a PSF-homogenized template; the PSF footprint size is allowed to vary, which can cause problems (crude sketch at the end of these notes)
some rolloff at the edges is causing lots of false positives there, because we are not flat-fielding
this is a sims-specific issue
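On the PSF-homogenized template, a crude sketch of the idea under the (strong) assumption that every epoch's PSF is Gaussian: convolve each image with the Gaussian that makes up the difference to the worst seeing in the set, since Gaussian widths add in quadrature. Real PSF matching fits a matching kernel rather than assuming a profile; `homogenize` and `fwhms` are illustrative names.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def homogenize(images, fwhms):
    """Degrade every image to the largest FWHM in the set (all values in pixels)."""
    target = max(fwhms)
    out = []
    for img, fwhm in zip(images, fwhms):
        # Width of the Gaussian needed to go from `fwhm` to `target`.
        sigma_match = np.sqrt(target ** 2 - fwhm ** 2) / 2.355
        out.append(gaussian_filter(img, sigma_match) if sigma_match > 0 else img.copy())
    return out
```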
This is most likely aperture corrections. Simulations tend to insert sources with the flux levels set for an infinite radius. However, the stack corrects everything to a 12 pixel aperture (slot_calibFlux, which is base_CircularApertureFlux_12_0). The difference (the flux from 12 pixels to infinity) doesn’t really matter for most applications.
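For a sense of scale, a back-of-the-envelope estimate of the 12-pixel-to-infinity difference under two assumed PSF profiles (the FWHM here is made up). For a pure Gaussian the missing flux is negligible, so any real offset has to come from the broad PSF wings, which is what the aperture correction is meant to capture; a Moffat profile with beta = 2.5 gives an offset of order 10–20 mmag for these parameters.

```python
import numpy as np

fwhm = 3.5    # hypothetical PSF FWHM in pixels
r_ap = 12.0   # radius of the calibration aperture (base_CircularApertureFlux_12_0)

# Gaussian core only: essentially all of the flux is inside 12 pixels.
sigma = fwhm / 2.355
ee_gauss = 1.0 - np.exp(-r_ap ** 2 / (2.0 * sigma ** 2))

# Moffat profile with beta = 2.5: the wings leave a measurable fraction outside.
beta = 2.5
alpha = fwhm / (2.0 * np.sqrt(2.0 ** (1.0 / beta) - 1.0))
ee_moffat = 1.0 - (1.0 + (r_ap / alpha) ** 2) ** (1.0 - beta)

for name, ee in [("Gaussian", ee_gauss), ("Moffat", ee_moffat)]:
    print(f"{name}: enclosed fraction = {ee:.4f}, offset = {-2.5 * np.log10(ee) * 1e3:.1f} mmag")
```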
Which data? The input data to be processed, or the output data? In the latter case, I think the expectation is that we would ingest them into a database and use SQL joins with the exposure metadata table to do the cuts. However, while loading Twinkles into the Prototype Data Access Center is desirable, it may not align well in terms of schedule with DESC's needs.
I agree that the way to do this in production is to load the exposure info into a database. The problem is that we don’t currently make it easy to do this. If we can load these data into the PDAC, I’m all for it. What would be the timeline for that?
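For concreteness, a minimal sketch of the SQL-join approach using sqlite3 and a hypothetical schema; the table and column names (`ForcedSource`, `Visit`, `seeing`, `airmass`) are placeholders, not the PDAC schema.

```python
import sqlite3

# Hypothetical local ingest of the forced-photometry outputs and exposure metadata.
con = sqlite3.connect("twinkles.db")
rows = con.execute("""
    SELECT f.objectId, f.visit, f.psfFlux, f.psfFluxErr
    FROM ForcedSource AS f
    JOIN Visit AS v ON v.visit = f.visit
    WHERE v.seeing < 1.2
      AND v.airmass < 1.5
""").fetchall()
```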
@rbiswas gave us a talk in the sims telecon this morning about this as well. He said the other issue (perhaps more so than the slightly lower flux?) was that the uncertainty reported by the DM pipeline was smaller than expected. It may actually be related to the same effect … if it’s bright, the uncertainty should be low, but then the offset could be coming from the aperture correction.
As a side note: understanding what DM and SQuARE will need for reference catalogs, and how to compare those reference catalogs against the DM measurements (especially whether there are any alterations we should put into the sims reference catalogs), is something sims is very interested in.
By how much, and at what stage? If it’s on the coadds, that’s not at all surprising because variance has been lost as covariance which we don’t track. If it’s elsewhere, it might indicate that the variance plane isn’t being set correctly.
You’d have to talk to @rbiswas or someone else who is in on the Twinkles analysis for more details, I’m afraid. I don’t think it’s the coadds, because these are measurements that were going into a lightcurve.
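A toy numpy demonstration of the variance-lost-as-covariance point above (nothing from the stack): once pixels have been correlated, for example by resampling onto a coadd grid, the per-pixel variance drops and a diagonal-only variance plane underestimates the error on any aperture sum. The kernel and aperture size here are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)
white = rng.normal(size=(20000, 64))   # independent pixels, variance 1

# Correlate neighbouring pixels with a simple [0.25, 0.5, 0.25] kernel,
# a stand-in for the correlations introduced by warping/coaddition.
smoothed = (0.25 * np.roll(white, 1, axis=1)
            + 0.5 * white
            + 0.25 * np.roll(white, -1, axis=1))

aperture = smoothed[:, 20:30].sum(axis=1)     # a 10-pixel "aperture flux"

per_pixel_var = smoothed.var(axis=0).mean()   # what a variance plane would record
naive_var = 10 * per_pixel_var                # diagonal only, covariance ignored
true_var = aperture.var()                     # includes the covariance

print(f"naive: {naive_var:.2f}   true: {true_var:.2f}")   # roughly 3.75 vs 9.25
```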