Firefly pixel interpolation algorithm for transformed images

I have created DM-4247 to track the suggestions for pixel interpolation algorithms that the camera team would like to have added.

Currently, when rendering an image in a transformed (scaled or rotated) coordinate system, Firefly uses the pixel value of the nearest integer-coordinate sample in the source image. (See VALUE_INTERPOLATION_NEAREST_NEIGHBOR.)

The Java documentation for that rendering hint describes the consequences: “As the image is scaled up, it will look correspondingly blocky. As the image is scaled down, the colors for source pixels will be either used unmodified, or skipped entirely in the output representation.”

It is possible to select a different interpolation algorithm (e.g. bilinear or bicubic) for computing the transformed pixel value, as shown here. However, the other algorithms will likely be slower than nearest-neighbor.
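As a sketch of how the interpolation choice is made in Java2D (not Firefly's actual rendering code; the class and method names here are illustrative), the hint is set on the `Graphics2D` before drawing the image through an `AffineTransform`:

```java
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.geom.AffineTransform;
import java.awt.image.BufferedImage;

public class InterpolationDemo {

    // Draws src into a new image scaled by the given factor, using the
    // supplied KEY_INTERPOLATION hint value (nearest-neighbor, bilinear,
    // or bicubic).
    static BufferedImage scale(BufferedImage src, double factor, Object interpolationHint) {
        int w = (int) Math.round(src.getWidth() * factor);
        int h = (int) Math.round(src.getHeight() * factor);
        BufferedImage dst = new BufferedImage(w, h, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = dst.createGraphics();
        g.setRenderingHint(RenderingHints.KEY_INTERPOLATION, interpolationHint);
        g.drawImage(src, AffineTransform.getScaleInstance(factor, factor), null);
        g.dispose();
        return dst;
    }

    public static void main(String[] args) {
        // A small black-and-white checkerboard makes the difference visible:
        // nearest-neighbor keeps hard pixel edges, bilinear blends between samples.
        BufferedImage src = new BufferedImage(4, 4, BufferedImage.TYPE_INT_RGB);
        for (int y = 0; y < 4; y++)
            for (int x = 0; x < 4; x++)
                src.setRGB(x, y, ((x + y) % 2 == 0) ? 0xFFFFFF : 0x000000);

        BufferedImage nn = scale(src, 8.0,
                RenderingHints.VALUE_INTERPOLATION_NEAREST_NEIGHBOR);
        BufferedImage bl = scale(src, 8.0,
                RenderingHints.VALUE_INTERPOLATION_BILINEAR);

        System.out.println(nn.getWidth() + "x" + nn.getHeight());
    }
}
```

Swapping the hint value is the only change needed to compare algorithms, which is also why nearest-neighbor is the fast default: it samples one source pixel per output pixel, while bilinear reads four and bicubic sixteen.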