Running the stack on HSC data for a DESC PSF project

Hi all - congratulations on the data release!

One more problem I’ve hit trying to use the NERSC installation of v12.1: I can ingest okay, but when I try processCcd.py I get the traceback below. Has anyone seen it before?

Traceback (most recent call last):
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/cmdLineTask.py", line 346, in __call__
    result = task.run(dataRef, **kwargs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_tasks/12.1/python/lsst/pipe/tasks/processCcd.py", line 181, in run
    icSourceCat = charRes.sourceCat,
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_tasks/12.1/python/lsst/pipe/tasks/calibrate.py", line 383, in run
    icSourceCat=icSourceCat,
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_tasks/12.1/python/lsst/pipe/tasks/calibrate.py", line 462, in calibrate
    sourceCat=sourceCat,
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_astrom/12.1/python/lsst/meas/astrom/astrometry.py", line 197, in run
    res = self.solve(exposure=exposure, sourceCat=sourceCat)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_astrom/12.1/python/lsst/meas/astrom/astrometry.py", line 285, in solve
    calib=expMd.calib,
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_algorithms/12.1/python/lsst/meas/algorithms/loadReferenceObjects.py", line 214, in loadPixelBox
    loadRes = self.loadSkyCircle(ctrCoord, maxRadius, filterName)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_astrom/12.1/python/lsst/meas/astrom/loadAstrometryNetObjects.py", line 98, in loadSkyCircle
    self._readIndexFiles()
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pipe_base/12.1/python/lsst/pipe/base/timer.py", line 121, in wrapper
    res = func(self, *args, **keyArgs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_astrom/12.1/python/lsst/meas/astrom/loadAstrometryNetObjects.py", line 162, in _readIndexFiles
    self.multiInds = AstrometryNetCatalog(self.andConfig)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_astrom/12.1/python/lsst/meas/astrom/multiindex.py", line 186, in __init__
    self._initFromCache(cacheName)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/meas_astrom/12.1/python/lsst/meas/astrom/multiindex.py", line 238, in _initFromCache
    with pyfits.open(filename) as hduList:
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/hdu/hdulist.py", line 124, in fitsopen
    return HDUList.fromfile(name, mode, memmap, save_backup, **kwargs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/hdu/hdulist.py", line 266, in fromfile
    save_backup=save_backup, **kwargs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/hdu/hdulist.py", line 823, in _readfrom
    hdu = _BaseHDU.readfrom(ffo, **kwargs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/hdu/base.py", line 370, in readfrom
    **kwargs)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/hdu/base.py", line 430, in _readfrom_internal
    header = Header.fromfile(data, endcard=not ignore_missing_end)
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/header.py", line 423, in fromfile
    padding)[1]
  File "/global/common/cori/contrib/lsst/lsstDM/v12_1/Linux64/pyfits/3.4.0+6/lib/python/pyfits-3.4-py2.7-linux-x86_64.egg/pyfits/header.py", line 492, in _from_blocks
    raise IOError('Header missing END card.')
IOError: Header missing END card.

I suggest having a look at your astrometry_net_data package. Specifically, you might check that the andCache.fits file isn’t corrupted.
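If it helps, you can test for that particular failure mode without the stack: a FITS header is a sequence of 2880-byte blocks of 80-character cards that must terminate with an END card, so a quick stdlib-only scan will tell you whether the cache file's header is truncated. This is just a sketch — `has_end_card` is an illustrative helper, not a stack function:

```python
def has_end_card(path, max_blocks=100):
    """Return True if a FITS-style header (2880-byte blocks of
    80-character cards) contains an END card within the first
    max_blocks blocks of the file."""
    with open(path, "rb") as f:
        for _ in range(max_blocks):
            block = f.read(2880)
            if len(block) < 2880:
                return False  # truncated before END: the pyfits error above
            for start in range(0, 2880, 80):
                # The END card is "END" padded with spaces to 80 characters.
                if block[start:start + 80].rstrip() == b"END":
                    return True
    return False
```

Running this on your andCache.fits and getting False would point to a truncated or corrupted cache file.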

Thanks - you’re right! Looks like ASTROMETRY_NET_DATA_DIR is not set quite right in the NERSC installation (there are two subdirectories, both of which seem to be valid).

Cheers!

Hello, I’m a first-year graduate student at Carnegie Mellon working with Rachel Mandelbaum and Joe on testing PSF modelling errors for LSST.

I’m using the HSC PSF data that Joe talked about here. Is anyone familiar with how to extract the WCS from the data files?

Many thanks,
Husni

What are you trying to do with the WCS?

You should be able to get the WCS with a butler call: butler.get('calexp_wcs', dataId). That will give you an lsst.afw.image.Wcs object.

I’m trying to connect the data from all the CCDs, so I thought getting the WCS would help with this (or is there a simpler way, e.g. asking the butler for the calibrated exposure from all the CCDs at the same time?).

I’m not sure exactly how dataId should be specified, but I tried butler.get('calexp_wcs', dataId), specifying the visit and the ccd, and got back an error: AttributeError: 'HscMapper' object has no attribute 'map_calexp_wcs'. Am I using it wrong?

Thanks,
Husni

What do you mean “connect the data from all CCDs?” Are you trying to make a mosaic of all the images (I believe we have a tool for that), or do something with cross-CCD catalogs?

Oh, right: what version of the stack are you using? That looks like you don’t have a recent-enough version.

You could try the older way:

calexp = butler.get("calexp", dataId, immediate=True)  # fetch the calibrated exposure directly
tanWcs = calexp.getWcs()  # the exposure's WCS

I’m trying to do analysis on the entire PSF field at once, so something like a mosaic of all the images would be very helpful. Could you please point me at the tools for that?

That seems to work, thanks a lot!

@price @reiss or @jbosch can probably help you with that more than I can.

@husni, I’m afraid it still isn’t really clear to me what you want to do. I think if you want to look at the PSF model, a coadd wouldn’t really be very helpful. I imagine you’d be better off looking at postage stamps of the PSF (vs. postage stamps of stars at the same positions on the focal plane), or perhaps various shape residuals. Having a WCS (to correct for geometric distortions) would give you a different version of each of those metrics, but I don’t believe it’s essential.

Hello, I would like to ask something related.
I am trying to process data from HSC. I downloaded the calibration files closest in date to the raw data, as well as the calibRegistry.sqlite3, from http://tigress-web.princeton.edu/~pprice/CALIB-LSST-20160419/. When I run singleFrameDriver.py, the following error comes out:

FATAL 2017-07-11T03:23:01.001 singleFrameDriver ({'taiObs': '2014-09-24', u'pointing': 997, 'visit': 7802, u'dateObs': '2014-09-24', u'filter': 'HSC-I', u'field': 'ABELL2319', u'ccd': 1, 'expTime': 240.0})(cmdLineTask.py:351)- Failed on dataId={'taiObs': '2014-09-24', u'pointing': 997, 'visit': 7802, u'dateObs': '2014-09-24', u'filter': 'HSC-I', u'field': 'ABELL2319', u'ccd': 35, 'expTime': 240.0}: Unable to retrieve bias for {'taiObs': '2014-09-24', u'pointing': 997, 'visit': 7802, u'dateObs': '2014-09-24', u'filter': 'HSC-I', u'field': 'ABELL2319', u'ccd': 35, 'expTime': 240.0}: No locations for get: datasetType:bias dataId:DataId(initialdata={'taiObs': '2014-09-24', u'pointing': 997, 'visit': 7802, u'dateObs': '2014-09-24', u'filter': 'HSC-I', u'field': 'ABELL2319', u'ccd': 35, 'expTime': 240.0}, tag=set([]))

I thought the difference in dates might be preventing the ISR task from locating the bias.

So I tried to construct the BIAS and DARK from the downloaded calibration frames taken on the same date. However, when I run constructBias.py, the error is:

Traceback (most recent call last): 
  File "/../lsstsw/stack/Linux64/pipe_drivers/13.0-3-gfcfef02+7/bin/constructBias.py", line 3, in <module>
    BiasTask.parseAndSubmit()
  File "/../lsstsw/stack/Linux64/ctrl_pool/13.0-2-gb1fa231+3/python/lsst/ctrl/pool/parallel.py", line 424, in parseAndSubmit
    **kwargs)
  File "/../lsstsw/stack/Linux64/ctrl_pool/13.0-2-gb1fa231+3/python/lsst/ctrl/pool/parallel.py", line 333, in parse_args
    args.parent = self._parent.parse_args(config, args=leftover, **kwargs)
  File "/../lsstsw/stack/Linux64/pipe_drivers/13.0-3-gfcfef02+7/python/lsst/pipe/drivers/constructCalibs.py", line 269, in parse_args
    namespace = ArgumentParser.parse_args(self, *args, **kwargs)
  File "/../lsstsw/stack/Linux64/pipe_base/13.0-3-g7fa07e0/python/lsst/pipe/base/argumentParser.py", line 465, in parse_args
    namespace.camera = mapperClass.getCameraName()
AttributeError: 'NoneType' object has no attribute 'getCameraName'

Would anyone know how to overcome this error? I would like to generate calexp and src outputs. Any help or discussion is appreciated; many thanks.



The problem with AttributeError: 'NoneType' object has no attribute 'getCameraName' has been solved; it was something to do with the directory layout, I guess.

The reason I cannot use the released calibration files is still unclear. The date of the raw file is 20140924, while the calibration file is dated 20140920, which should be within the 720-day validity range.

Hi Vera. Sorry for not getting back to you sooner.

Could you please post the command-line you’re using, and perhaps the full log? I suspect you may not be using the --calib command-line argument to point to your downloaded calib root directory.
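For reference, an invocation would look something like this (a sketch only: the repo path, rerun name, and core count are placeholders, and the ccd range assumes HSC's science CCDs are numbered 0-103):

```shell
singleFrameDriver.py /path/to/DATA \
    --calib /path/to/CALIB-LSST-20160419 \
    --rerun myrerun \
    --id visit=7802 ccd=0..103 \
    --cores 8
```

The key point is that --calib must point at the directory containing your downloaded calibs and their calibRegistry.sqlite3; without it the butler has no bias to retrieve.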

@jbosch Sorry for not replying earlier! Essentially I want to use GalSim to simulate galaxies and call the PSF model at their coordinates to use it in correcting their ellipticities, etc. So the idea was to make up a single PSF model at all positions by combining all the CCDs from one visit - or is there a better way to approach this?

I’ve been trying to use wcs.getPixelOrigin() (which seemed to be the only attribute that was not either 0 or the same for every CCD) to do this. But it’s returning the same/overlapping positions for some CCDs in the same visit, which is not what I expected. Am I missing something, or is there a better attribute to look for?

@husni, I’m afraid I still don’t quite understand - I get that you want to simulate galaxies convolved with the (modeled) HSC PSF, but I’m not sure what you mean by a “single PSF model at all positions”. If you want a centered image of the PSF model at a position on a CCD image, you can use Psf.computeKernelImage(afw.geom.Point2D(x, y)), with (x, y) given in CCD pixel coordinates.

Combining all of the PSF models for one visit somehow doesn’t seem like a meaningful thing to do, unless you just want some kind of approximate average PSF for the visit. If that’s the case, I’d recommend just calling the above at a number of randomly drawn points from randomly selected CCDs in the visit and averaging them. Because the PSF models are defined and evaluated in pixel coordinates, you shouldn’t need a WCS to evaluate them at all.
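To make the averaging idea concrete, here is a toy, stack-free sketch in which toy_psf_image stands in for Psf.computeKernelImage (the Gaussian model, its field-dependent width, and the CCD dimensions are purely illustrative):

```python
import numpy as np

def toy_psf_image(x, y, size=25):
    """Stand-in for Psf.computeKernelImage: a centered Gaussian whose
    width varies slowly across a hypothetical 2048x4096 CCD."""
    sigma = 1.5 + 0.5 * (x / 2048.0)  # purely illustrative spatial variation
    r = np.arange(size) - size // 2
    xx, yy = np.meshgrid(r, r)
    img = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return img / img.sum()  # kernel images are normalized to unit sum

# Evaluate at randomly drawn pixel positions and average the stamps:
rng = np.random.default_rng(42)
stamps = [toy_psf_image(rng.uniform(0, 2048), rng.uniform(0, 4096))
          for _ in range(100)]
mean_psf = np.mean(stamps, axis=0)  # approximate average PSF for the "visit"
```

With the real stack you would replace toy_psf_image by calexp.getPsf().computeKernelImage(...) and also draw the CCD at random, but the averaging step is the same.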

If you want to resample the PSF images to sky coordinates, it’s a little trickier (I’d recommend using our WarpedPsf class, which transforms a PSF through a pair of WCSs), but I can walk you through that too.

[quote=“jbosch, post:42, topic:964”]
@husni, I’m afraid I still don’t quite understand - I get that you want to simulate galaxies convolved with the (modeled) HSC PSF, but I’m not sure what you mean by a “single PSF model at all positions”. [/quote]

@jbosch Sorry, that was not a very clear way to explain it. By that I meant a PSF model that covers the entire area of sky that HSC covers.

This is actually exactly what I would like to do, only I am trying to construct the Psf. Currently I’m getting it from the calexp, which I’m in turn getting from the butler for a single visit and single CCD and looping over them, so it only contains limited x, y positions from one CCD in every iteration. This is why I’m trying to do this addition, to get a Psf object that spans the entire sky. Or is there a way to ask the butler for the calexp of the entire sky?

Ok, I understand. I think the core problem here is that a particular PSF is valid only for a particular image, and there is no single image that covers the entire area of sky that HSC covers. There are multiple overlapping epoch-level images at any given point on the sky, of course, so even at a single point no one of those is “the PSF”.

However, when we make coadd images, we coadd the PSFs in a way that is consistent with the way we built the coadd (i.e. it’s the appropriate model for the effective PSF of the coadd). If you get the PSF object from a deepCoadd_calexp image, that’s what you’re getting - but even those are only valid over a small region of sky, called a patch. All of the patches in a larger region called a tract share a common pixel grid (they’re conceptually square subimages of a larger virtual image for the full tract).

You could conceivably make a CoaddPsf that combines PSFs over a tract, but it would be very large and cumbersome to work with, because it would essentially have to store all of the PSF models and WCSs of all of the input images that went into all of those coadd images. I’m afraid it’s not possible to make a CoaddPsf that covers multiple tracts, because the class assumes there’s one WCS that describes the pixel grid of the image to which it corresponds, and different tracts have different WCSs.

If you want one object that can return the PSF at an arbitrary point in the survey, then, I suspect you’ll be better off writing a new class that can look up the tract and patch that correspond to a particular point in celestial coordinates, load the CoaddPsf for that patch, and then evaluate the PSF at that point after converting it to tract coordinates. If you load the deepCoadd_skyMap dataset (this will be an instance of lsst.skymap.RingsSkyMap for HSC), you’ll get an object that can do those tract/patch lookups and coordinate transformations.
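In rough pseudocode, such a class might look like the following (this cannot run outside a stack environment; the findTract/findPatch/getIndex calls follow lsst.skymap, but treat the exact signatures, the hard-coded filter, and the patch formatting as assumptions):

```python
# Pseudocode sketch only - names and signatures are assumptions.
class SurveyPsf(object):
    def __init__(self, butler, filterName="HSC-I"):
        self.butler = butler
        self.filterName = filterName
        self.skyMap = butler.get("deepCoadd_skyMap")  # RingsSkyMap for HSC

    def computeImage(self, coord):
        # Find the tract and patch containing this celestial coordinate.
        tractInfo = self.skyMap.findTract(coord)
        patchInfo = tractInfo.findPatch(coord)
        dataId = dict(tract=tractInfo.getId(),
                      patch="%d,%d" % patchInfo.getIndex(),
                      filter=self.filterName)
        # Load the coadd for that patch and evaluate its PSF in
        # tract pixel coordinates.
        coadd = self.butler.get("deepCoadd_calexp", dataId)
        point = coadd.getWcs().skyToPixel(coord)
        return coadd.getPsf().computeKernelImage(point)
```

A real implementation would want to cache the loaded coadds rather than re-fetch them for every evaluation.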

@jbosch after discussing with @rmandelb I think I’ve made a terminology mistake which has caused the confusion. I should have said the entire field of view instead of the entire sky, since I meant the PSF model from a single visit, rather than deep-coadding all the visits to get the effective PSF. Sorry about that.

So the problem is just that when I try to, e.g., plot the positions of the pixels from each of the CCDs in a single visit on the same plot, I’m finding significant overlap between the CCDs (which, if I understand correctly, shouldn’t happen within the same visit?). I’m getting the position of each pixel using wcs.getPixelOrigin()[0] + x and wcs.getPixelOrigin()[1] + y, where wcs = calexp.getWcs(), x and y run from 0 to calexp.getWidth() and calexp.getHeight() respectively, and calexp is returned by the butler for each CCD. Is this the right thing to do, or is there a better way?

In order to put all of the CCDs in the focal plane on a single plot, you can either make the plot in focal plane coordinates, converting from CCD coordinates using our “cameraGeom” library, or, for a particular visit, you can make the plot in sky coordinates and use the Wcs.

If you want to use the Wcs, you should just use skyToPixel and pixelToSky to transform between sky coordinates and pixel coordinates. Using getPixelOrigin the way you have is only valid when all of the CCDs have the same Wcs parameters aside from their origin parameters. That can never be precisely true for a quality Wcs, because there are at least small differences in rotation and perhaps scale between CCDs (and for HSC, fairly significant optical distortion as well).
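To illustrate the point numerically, here is a self-contained toy: two CCDs whose linear WCSs share a pixel scale but differ by a 0.1-degree rotation map the same pixel to measurably different sky positions, so an origin offset alone cannot line them up. linear_pixel_to_sky is a deliberately simplified stand-in for a real Wcs that ignores the sky projection entirely:

```python
import math

def linear_pixel_to_sky(cd, crpix, crval, x, y):
    """Toy linear WCS: sky offset (deg) = CD . (pixel - CRPIX) + CRVAL.
    Ignores the projection; enough to show the effect of per-CCD rotation."""
    dx, dy = x - crpix[0], y - crpix[1]
    ra = crval[0] + cd[0][0] * dx + cd[0][1] * dy
    dec = crval[1] + cd[1][0] * dx + cd[1][1] * dy
    return ra, dec

scale = 0.168 / 3600.0  # HSC pixel scale in deg/pixel
theta = math.radians(0.1)  # small per-CCD rotation

# CCD "a" is axis-aligned; CCD "b" has the same scale but is rotated.
cd_a = [[scale, 0.0], [0.0, scale]]
cd_b = [[scale * math.cos(theta), -scale * math.sin(theta)],
        [scale * math.sin(theta),  scale * math.cos(theta)]]

# Treating both CCDs as if they only differed by their pixel origin
# (as adding getPixelOrigin() does) sends the same corner pixel to
# different places on the sky:
sky_a = linear_pixel_to_sky(cd_a, (0, 0), (150.0, 2.0), 2048, 4096)
sky_b = linear_pixel_to_sky(cd_b, (0, 0), (150.0, 2.0), 2048, 4096)
```

The discrepancy here is a few tenths of a millidegree per CCD, i.e. many pixels at HSC's plate scale, which is consistent with the overlaps you're seeing.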

Thanks a lot @jbosch, that helped a lot – things seem to work well now! :slight_smile: