How to extract PSFEx PSF from a PVI/calexp outside of Science Pipelines?

How can I most easily extract and use the PSF models from a DP0.2 calexp outside of the Science Pipelines environment? I’m looking for a low-dependency way to get the PSF for use in other astronomical code, without needing the Science Pipelines.

My understanding is that these are PSFEx models. I’m looking at the code in

and starting to work out how I can change the coefficients and then apply them, but (a) this will take a little while; (b) I'll mess something up; and (c) it's brittle in the face of any PSF model changes.

First, a high-level comment:

At the moment, while the image data products are all required to be FITS-4.00 conformant, and the image, variance, and flags/mask extensions should be straightforward to use in community software, we have not (yet) focused on making the substantial additional content in the files (all in binary-table extensions) usefully available via mechanisms other than the Rubin Science Pipelines. This is not something that will be fully addressed before DR1, I think, realistically.
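
For example, a minimal sketch of that straightforward part (with a placeholder filename, and assuming the layout shown in the fitsinfo listing later in this thread, where HDU 0 is an empty primary and HDUs 1-3 hold the image, mask, and variance planes):

from astropy.io import fits

with fits.open("calexp.fits") as hdul:  # placeholder path to a downloaded calexp
    image = hdul[1].data      # calibrated pixel values (float32)
    mask = hdul[2].data       # per-pixel flag bits (int32)
    variance = hdul[3].data   # per-pixel variance (float32)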

We do have an Observatory-level requirement in the long run to maintain the accessibility of the data, and I personally (this is not formal policy, to be clear) think this means being able to document the format of the data in such a way that it is scientifically usable without the stack.

Within the RSP, we will provide services offering some limited access to this information through non-stack mechanisms. For instance, we will have an API Aspect service that returns the PSF (as an image) for any point on any released image. This will be callable remotely and will also enable the Portal Aspect to display PSFs for catalog entries on demand.

But I don’t think that’s what you’re specifically looking for.

So, I think that what you're trying to do is find a means to use the actual external psfex package to work with image files downloaded from the RSP environment, is that correct?

Yes. I think that’s a reasonable goal.

But even simpler Python-only stuff would be useful. E.g., here’s a simple way to read and put together the PSF (at the reference X, Y position, no spatial variation in this treatment):

import numpy as np

from astropy.io import fits

file = "image_calexp-r-896748-R14_S20-2025-08-03T08.fits"

# The PSFEx metadata and data indexes.
# We need the metadata to look up the pixel step.
# This is in units of (image pixel)/(PSF pixel)
# One common thing is for the PSF to be oversampled by a factor of 2
# That would be a pixstep = 0.5
psfex_hdu_info_index = 11
# The PSFEx data contains the basis vectors (arrays) for the PSF
# and the coefficients
psfex_hdu_data_index = 12

hdu = fits.open(file)

psfex_info = hdu[psfex_hdu_info_index]
psfex_data = hdu[psfex_hdu_data_index]

pixstep = psfex_info.data["_pixstep"][0]  # Image pixels per PSF pixel
size = psfex_data.data["_size"]  # size of PSF  (nx, ny, n_basis_vectors)
comp = psfex_data.data["_comp"]  # PSF basis components
coeff = psfex_data.data["coeff"]  # Coefficients modifying each basis vector

# `size` is an array of [[nx, ny, nbasis]], so we take its first element.
# The in-memory representation of `comp` is raveled with the basis index
# slowest, so we reverse the order to reshape into (nbasis, ny, nx).
psf_basis_image = comp.reshape(*size[0][::-1])

# Calculate the PSF at the location of the reference X, Y points on the image.
psf_image = psf_basis_image * psfex_data.data["basis"][0, :, np.newaxis, np.newaxis]
psf_image = psf_image.sum(0)
# And normalize to 1 over the PSF area, but in image pixel space
psf_image /= psf_image.sum() * pixstep**2
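
As a quick sanity check on the snippet above (reusing its psf_image and pixstep variables), the stamp can be displayed and the normalization verified:

import matplotlib.pyplot as plt

# `psf_image` and `pixstep` come from the snippet above.
print(psf_image.shape)               # (ny, nx) of the PSF stamp
print(psf_image.sum() * pixstep**2)  # ~1.0 after the image-pixel normalization

plt.imshow(psf_image, origin="lower")
plt.colorbar(label="PSF amplitude")
plt.show()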

I’d be extremely reluctant to publish documentation committing to the HDU numbers to be used! That’s certainly not intended as a stable public interface of ours.

I’ll research this offline and post something well-considered back.

Yes, names should work just fine to access an HDU; astropy can take names or numbers. Looking at a calexp file I have here, the PsfexPsf extensions are actually 10 and 11 (but maybe 12 is the one that matters?).
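
For example (a quick sketch, using the extension names from the listing below), name-based and positional access return the same HDU here:

from astropy.io import fits

with fits.open("calexp_HSC_i_HSC-I_904010_0_31_HSC_runs_ci_hsc_20230802T200001Z.fits") as hdul:
    by_number = hdul[10]        # positional index
    by_name = hdul["PsfexPsf"]  # EXTNAME lookup; returns the first match only
    print(by_name is by_number)  # True: the name resolves to HDU 10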

The fundamental problem, though, is that they have identical names… We currently don't really pay attention to the EXTNAME, and that is causing us real problems in terms of portability.

$ fitsinfo calexp_HSC_i_HSC-I_904010_0_31_HSC_runs_ci_hsc_20230802T200001Z.fits
Filename: calexp_HSC_i_HSC-I_904010_0_31_HSC_runs_ci_hsc_20230802T200001Z.fits
No.    Name      Ver    Type      Cards   Dimensions   Format
  0  IMAGE         1 PrimaryHDU     282   ()      
  1  IMAGE         1 CompImageHDU     82   (2048, 4176)   float32   
  2  MASK          1 CompImageHDU     91   (2048, 4176)   int32   
  3  VARIANCE      1 CompImageHDU     82   (2048, 4176)   float32   
  4  ARCHIVE_INDEX    1 BinTableHDU     41   313R x 7C   [1J, 1J, 1J, 1J, 1J, 64A, 64A]   
  5  TransformPoint2ToPoint2    1 BinTableHDU     18   228R x 1C   [1QB(13854)]   
  6  SkyWcs        1 BinTableHDU     17   1R x 1C   [1QB(11118)]   
  7  ApCorrMap     1 BinTableHDU     21   62R x 2C   [64A, 1J]   
  8  ChebyshevBoundedField    1 BinTableHDU     41   31R x 6C   [1J, 1J, 1J, 1J, 1J, 9D]   
  9  ChebyshevBoundedField    1 BinTableHDU     41   32R x 6C   [1J, 1J, 1J, 1J, 1J, 1D]   
 10  PsfexPsf      1 BinTableHDU     52   1R x 9C   [1J, 1J, 1J, 1J, 1J, 1J, 1D, 1D, 1E]   
 11  PsfexPsf      1 BinTableHDU     45   1R x 8C   [2J, 1J, 6D, 6D, 3J, 41334E, 2D, 2D]   
 12  Polygon       1 BinTableHDU     21   9R x 2C   [1D, 1D]   
 13  PhotoCalib    1 BinTableHDU     36   1R x 5C   [1X, 1D, 1D, 1J, 1J]   
 14  FilterLabel    1 BinTableHDU     28   1R x 3C   [2X, 32A, 32A]   
 15  ProductTransmissionCurve    1 BinTableHDU     21   5R x 2C   [1J, 1J]   
 16  TransformedTransmissionCurve    1 BinTableHDU     21   1R x 2C   [1J, 1J]   
 17  SpatiallyConstantTransmissionCurve    1 BinTableHDU     30   5R x 4C   [1D, 1D, 1QD(1000), 1QD(1000)]   
 18  RadialTransmissionCurve    1 BinTableHDU     34   1R x 5C   [1D, 1D, 1QD(16791), 1QD(579), 1QD(29)]   
 19  Detector      1 BinTableHDU    115   1R x 22C   [1QA(4), 1J, 1J, 1QA(3), 1J, 1J, 1J, 1J, 1D, 1D, 1D, 1D, 1D, 1D, 1D, 1D, 1D, 1D, 1J, 1QE(0), 1QA(3), 1J]   
 20  TransformMap    1 BinTableHDU     33   226R x 5C   [1QA(10), 1QA(4), 1QA(18), 1QA(4), 1J]   
 21  Detector      1 BinTableHDU    200   4R x 38C   [3X, 1QA(1), 1J, 1J, 1J, 1J, 1D, 1D, 1D, 1D, 1J, 1QD(4), 1QA(7), 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1J, 1D, 1D, 1QA(1)]  

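Because the two PsfexPsf extensions share both EXTNAME and EXTVER, a name lookup only ever gets you the first one; a workaround sketch (assuming the layout above) is to collect every matching index explicitly:

from astropy.io import fits

with fits.open("calexp_HSC_i_HSC-I_904010_0_31_HSC_runs_ci_hsc_20230802T200001Z.fits") as hdul:
    psfex_indexes = [
        i for i, h in enumerate(hdul)
        if h.header.get("EXTNAME") == "PsfexPsf"
    ]
    print(psfex_indexes)  # [10, 11] for the file listed above
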
Yes, that’s why I didn’t mention using the names. We really are going to have to discuss this offline for a bit.