Tips for adapting the LSST Stack for other cameras

Eiichi Egami, Christina Williams, Christopher Willmer, and I have been working to develop image simulations for the JWST NIRCam instrument (led by Marcia Rieke). We have been pursuing two parallel approaches: producing an in-house image simulator (led by Christopher Willmer) that accounts for some of the finer details of the electronics, and using the existing LSST image simulation tools (PhoSim and GalSim) for comparison. Eiichi has been working with Scott Daniel at UW and the Purdue group to adapt the existing LSST methods for simulating NIRCam images, and everyone we have corresponded with has been extremely helpful. Christina and I have focused primarily on producing input source catalogs modeled on existing deep space-based datasets, which we have been adapting for input into PhoSim and Christopher’s in-house simulator. The availability of the LSST software has been a boon for us, and we will do our best to be conscientious about giving LSST appropriate credit when we use the existing tools.

As we become more adept at producing simulated images, we will become increasingly concerned with producing image stacks and mosaics. We are exploring existing options for this important task, and we are considering the LSST Software Stack. I am writing to ask for advice on where the NIRCam team would need to focus its efforts to adapt the LSST Stack for use with simulated (and possibly real!) NIRCam images.

Having read the LSST Software User Guide, and the Using the LSST Stack tree in particular, I get the impression that the tutorial on processing the SDSS Stripe 82 images is most closely related to what we want to do. We will begin with it and the related tutorials, but if you have any pointers for adapting these methods further, we would appreciate hearing them. Once we have reproduced the SDSS tutorials and want to start modeling NIRCam data, is the appropriate path to follow the Representation of a Camera tree to produce an “obs_nircam” package?

We would appreciate very much any advice people are willing to give.

Welcome! We’re still early in the process of making the Science Pipelines Stack more friendly for both project and non-project developers, but if you’re willing to suffer with our current state :smile: , the proper next step is indeed to develop an “obs_nircam” package. That would include a camera description, a mapper class and related “policy” configuration describing your typical on-disk directory/file structure, and any specialized subtask code (e.g. for instrument signature removal) that NIRCam would require.
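Roughly, such a package would look something like the sketch below. The file names here are illustrative only (my assumption, not a spec); the best guide is to use an existing obs_* package, such as obs_sdss, as a template:

```
obs_nircam/
├── camera/                      # camera description: detector layout, amplifier info
├── policy/
│   └── NircamMapper.paf         # "policy" config: dataset templates, on-disk layout
├── python/lsst/obs/nircam/
│   ├── __init__.py
│   └── nircamMapper.py          # mapper class, subclassing the Stack's CameraMapper
└── ups/obs_nircam.table         # EUPS package dependencies
```

Any NIRCam-specific subtask code (e.g. a specialized instrument signature removal task) would live alongside the mapper under python/lsst/obs/nircam/.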

Hi Brant; just to expand on @ktl’s comment (and, unfortunately, not to directly answer your questions): I’m working on documentation that will make the Stack easier to approach and extend. The new docs will entirely replace the documentation you currently see on our Confluence wiki or that is generated with Doxygen.

These new docs will come online in the next few months, and we plan to include a section that covers adapting the Stack to new instruments.

If you forge ahead now, we’d really like to hear about your experiences (good and bad) here on community.lsst.org so that we can (a) improve the usability of the Stack APIs and (b) write the docs that are actually needed. And of course this Community forum is also a great place to ask our devs for help.

Thanks very much. This information is certainly enough for us to get started, and thank you for responding promptly to the post.

From an algorithmic standpoint, there’s one area you should be warned about in making use of the LSST stack on [simulated] space-based data: many of our most important algorithms (notably PSF estimation and warping for coaddition) assume Nyquist-sampled images. That means our stacking algorithms are currently not suitable for most space-based data, where it’s typically necessary to use the dithering to reconstruct the image at a smaller pixel scale.

We aim to remedy that at some point (as some LSST data may not be well-sampled), and there are a couple of teams planning to use the LSST stack on simulated WFIRST data, so you’re not going to be alone in wanting to solve this problem within the stack, but it may take a while.

Thanks, yes, I was aware of this issue, but JWST NIRCam is Nyquist sampled at 2 and 4 microns. As you say, many of us are considering using the LSST stack on simulated WFIRST data, so there will be lots of future interest in this!
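For anyone curious where those numbers come from: Nyquist sampling of a diffraction-limited PSF requires a pixel scale no larger than λ/(2D). A quick sketch in Python, using the public JWST figures (6.5 m aperture; roughly 0.031″/pix and 0.063″/pix for the short- and long-wavelength channels), shows the two channels become Nyquist sampled near 2 and 4 microns:

```python
import math

def nyquist_wavelength_um(pixel_scale_arcsec, diameter_m):
    """Shortest wavelength (in microns) at which a detector with the given
    pixel scale Nyquist-samples a diffraction-limited PSF from an aperture
    of the given diameter, i.e. where pixel_scale = lambda / (2 D)."""
    pixel_scale_rad = pixel_scale_arcsec * math.pi / (180.0 * 3600.0)
    return pixel_scale_rad * 2.0 * diameter_m * 1e6  # meters -> microns

D = 6.5  # JWST primary mirror diameter, meters
print(round(nyquist_wavelength_um(0.031, D), 2))  # short-wave channel, ~2 um
print(round(nyquist_wavelength_um(0.063, D), 2))  # long-wave channel, ~4 um
```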