Continuing the theme of tricky and subtle features that nobody notices when they work correctly, check out the quality improvement in Photospector when we properly filter this high-resolution photo from Moscone Center below (click on the image to toggle between Filtered and Unfiltered):

Notice the aliasing along the window frames and cables? See how the buildings get sharper? The “Filtered” image uses proper texture filtering when drawing the image, while the second, with the jaggy edges, uses unfiltered “nearest neighbor” sampling to display this high-resolution photo scaled down for the non-retina iPad 2 display.

Most image applications use a low-resolution proxy image rather than showing you the full-resolution image. The low-resolution proxy is created by the Apple image libraries using a nice filtered downsampling algorithm, so it looks good at the resolution of the iPad, but you can’t zoom in to see the pixels. Photospector always shows the full-resolution image using a virtual texture engine, just like modern video games (pdf). Other apps that manage to work on full-resolution images are very slow, taking seconds to apply each adjustment or to zoom into the image, while Photospector does all of this smoothly in real time.

Working at full resolution in real time enables a new level of interactivity when working with your images — crossing a critical threshold in the Workflow Scale (pdf). Leveraging the speed of the GPU and game-engine technologies, including virtual texturing, megashaders, and heads-up display (HUD) controls, allows you to inspect and adjust your images interactively, seeing the results of small adjustments instantly so you can fine-tune the settings to get exactly the result you want.

Workflow Scale

Instead of doing the filtering once and limiting your zoom range, Photospector uses the GPU to filter the image dynamically, allowing you to zoom in past the individual pixels and back out to the full image. Other apps limit the resolution to 4K along each axis to fit within the GPU’s maximum texture sizes, or to even smaller sizes that fit your display. Photospector’s virtual texture engine tiles the image in real time, storing only the visible parts of the image at the current scale, allowing you to inspect and edit truly gargantuan images with resolutions of 64K on each axis!
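To make the tiling idea concrete, here is a minimal Swift sketch of how a virtual texture engine might pick which tiles the current view needs. The 256-pixel tile size, the TileRequest type, and the mip-level math are illustrative assumptions, not Photospector’s actual implementation.

```swift
import Foundation

// Sketch: choose a mip level that matches the on-screen scale, then enumerate
// the tiles covering the visible region at that level.
struct TileRequest: Hashable {
    let mipLevel: Int   // 0 = full resolution; each level halves the image
    let tileX: Int
    let tileY: Int
}

func visibleTiles(imageWidth: Int, imageHeight: Int,
                  visibleMinX: Double, visibleMinY: Double,   // visible region, in image pixels
                  visibleMaxX: Double, visibleMaxY: Double,
                  screenPixelsPerImagePixel: Double,
                  tileSize: Int = 256) -> Set<TileRequest> {
    // Zoomed out (scale < 1) lets us use a coarser mip level; at 1:1 or closer
    // we need the full-resolution tiles.
    let clampedScale = min(max(screenPixelsPerImagePixel, 1.0 / 65536.0), 1.0)
    let mip = Int(floor(-log2(clampedScale)))
    let step = Double(tileSize << mip)   // image pixels covered by one tile at this level

    // Clamp the visible rectangle to the image and convert it to tile indices.
    let lastTileX = (imageWidth - 1) / (tileSize << mip)
    let lastTileY = (imageHeight - 1) / (tileSize << mip)
    let minTX = max(0, Int(visibleMinX / step)), maxTX = min(lastTileX, Int(visibleMaxX / step))
    let minTY = max(0, Int(visibleMinY / step)), maxTY = min(lastTileY, Int(visibleMaxY / step))
    guard minTX <= maxTX, minTY <= maxTY else { return [] }

    var tiles = Set<TileRequest>()
    for y in minTY...maxTY {
        for x in minTX...maxTX {
            tiles.insert(TileRequest(mipLevel: mip, tileX: x, tileY: y))
        }
    }
    return tiles
}
```

Only the tiles returned by a selection step like this need to be resident at any moment, which is what keeps 64K-per-axis images tractable.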

Filtering allows you to blur the image dynamically, providing an “area operation”, a technique critical to a number of advanced image processing functions, like Clarity (unsharp mask), where the difference between each pixel and a blurred version of the same pixel is called “local contrast”. Now that Photospector has real-time filtering, we are excited to begin work on adding real-time local contrast and blurring operations!
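The relationship fits in a couple of lines. Here is a hypothetical per-channel sketch of the idea on the CPU; the real operation would run in the shader against a blurred pass, and the formula is illustrative rather than Photospector’s exact math.

```swift
// Local contrast is the difference between a pixel and a blurred version of
// that pixel; Clarity (unsharp masking) adds a scaled copy of it back in.
func clarity(original: Float, blurred: Float, amount: Float) -> Float {
    let localContrast = original - blurred        // detail that the blur removed
    return original + amount * localContrast      // amount > 0 sharpens, amount < 0 softens
}
```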

Virtual Texture Diagram

Photospector’s virtual texture system uses a two-step lookup for each displayed pixel. Each pixel is drawn by first looking up the current location and scale (mip-map level) in the page table texture, which stores references to the image tiles held in the physical texture (below). After reading the page entry for the current pixel, physical texture coordinates are computed that identify the tile at the proper scale and the offset within that tile; the final pixel is then sampled using the GPU’s native filtering hardware.
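Expressed as code, the lookup might look something like this CPU-side sketch. The PageEntry layout and names are assumptions; the real version runs per pixel in the fragment shader with the page table and physical texture bound as GPU textures.

```swift
// One page-table entry: where a tile lives inside the physical texture.
struct PageEntry {
    let physicalOrigin: SIMD2<Float>   // top-left corner of the tile's slot, in 0..1 UVs
    let tileScale: Float               // size of one tile slot, in physical-texture UVs
}

// Step 1: read the page entry for this location (one mip level shown for simplicity).
// Step 2: map the offset within that tile into physical texture coordinates,
//         which the GPU then samples with its native filtering hardware.
func physicalUV(virtualUV: SIMD2<Float>,
                tilesAcross: Float,                     // tile count per axis at this mip level
                pageEntry: (Int, Int) -> PageEntry) -> SIMD2<Float> {
    let scaled = virtualUV * tilesAcross
    let tileX = Int(scaled.x.rounded(.down))            // which tile this pixel falls in
    let tileY = Int(scaled.y.rounded(.down))
    let withinTile = SIMD2<Float>(scaled.x - scaled.x.rounded(.down),
                                  scaled.y - scaled.y.rounded(.down))

    let entry = pageEntry(tileX, tileY)                  // the page-table read
    return entry.physicalOrigin + withinTile * entry.tileScale
}
```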

Physical Texture
The physical texture stores image tiles at various scales. During rendering, we dynamically choose the set of tiles we need and load them in the background, updating the physical and page tables each frame based on the currently visible region. This is why, if you look closely, you can see the image “rez up” when it is first loaded or as you zoom in, but you have to look closely, since it often takes much less than a second!
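A rough sketch of that per-frame update, reusing the TileRequest type from the earlier sketch; the load, upload, and page-table functions below are placeholders for real decoding and GPU work, not Photospector’s API.

```swift
import Foundation

// Hypothetical stand-ins for the real decode and GPU-update steps.
func loadTileData(_ tile: TileRequest) -> [UInt8] { [] }                  // decode tile pixels (stub)
func uploadToPhysicalTexture(_ tile: TileRequest, _ pixels: [UInt8]) { }  // copy into a free slot (stub)
func updatePageTable(for tile: TileRequest) { }                           // point the page entry at that slot (stub)

final class TileStreamer {
    private var resident = Set<TileRequest>()
    private var inFlight = Set<TileRequest>()
    private let loader = DispatchQueue(label: "tile-loading", qos: .userInitiated)

    /// Called once per frame (on the main thread) with the tiles the current view needs.
    func update(needed: Set<TileRequest>) {
        for tile in needed.subtracting(resident).subtracting(inFlight) {
            inFlight.insert(tile)
            loader.async {
                let pixels = loadTileData(tile)              // background decode
                DispatchQueue.main.async {
                    uploadToPhysicalTexture(tile, pixels)    // physical texture update
                    updatePageTable(for: tile)               // page table update
                    self.inFlight.remove(tile)
                    self.resident.insert(tile)
                }
            }
        }
    }
}
```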

Padded Tiles

Enabling filtering on a virtual texture is tricky because the tiles in the physical texture are not contiguous, so naively filtered samples will cause tile artifacts. We need to pad each tile with a border in order to avoid these artifacts, resulting in various tricky scale and offset adjustments as we map between the screen, the page table, and the physical texture. But once it is all working smoothly, we are able to view full-resolution images and enable computation of advanced photo manipulation algorithms, like Clarity, in real time!

(Coffee stains are not required for correct filtering.)
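The padding math itself is small. Here is a sketch of the remapping, assuming 256-pixel tiles with a one-texel border on each side; the numbers and names are illustrative, not Photospector’s actual tile layout.

```swift
// With padded tiles, the within-tile offset has to be squeezed into the
// interior of the padded tile so that bilinear filtering near an edge reads
// the duplicated border texels instead of an unrelated neighboring tile.
let tileSize: Float = 256          // usable payload pixels per tile edge (assumed)
let border: Float = 1              // padded texels copied from neighboring tiles (assumed)
let paddedSize = tileSize + 2 * border

/// Remap a 0..1 offset within the tile payload into the padded tile's 0..1 range.
func paddedOffset(_ withinTile: SIMD2<Float>) -> SIMD2<Float> {
    // offset' = (border + offset * tileSize) / paddedSize
    return (SIMD2<Float>(repeating: border) + withinTile * tileSize) / paddedSize
}
```

In the two-step lookup sketched earlier, a remapping like this would be applied to the within-tile offset before computing the physical texture coordinates.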

Filtering matters both when scaling an image down and when zooming in. Most image viewing apps will filter the image when you zoom in, making it impossible to see the individual pixels. Photospector is designed to inspect your images, so we use unfiltered “nearest neighbor” sampling when you zoom in to a photo. Seeing the individual pixels is the focus of our Pixel Inspector tool, which shows the numeric values of each pixel and includes a thumbnail map overview of the entire image with a small yellow rectangle identifying the current view. This is another idea copied from games, where “minimaps” draw a dynamic map of the region surrounding the player.

Pixel Inspector

Using linear filtering would smooth over many of the jagged edges in the closeup view above, but that would make it harder to identify problems in your image. Drawing the zoomed-in image without filtering allows you to see the JPEG compression artifacts in the blue sky and identify individual pixel numeric values. Photospector’s combination of filtered minification and unfiltered magnification allows you to view and edit ginormous full-resolution images and zoom in past the pixels to really understand your images from the inside out.
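As a concrete illustration, the filtered-minification, unfiltered-magnification combination maps directly onto a single GPU sampler configuration. Here is a sketch assuming a Metal-based renderer (an assumption for illustration; the same idea applies in any GPU API):

```swift
import Metal

// Sketch: filter when the image is scaled down, but use nearest-neighbor when
// zooming in past 1:1 so individual pixels stay visible for inspection.
func makeInspectorSampler(device: MTLDevice) -> MTLSamplerState? {
    let descriptor = MTLSamplerDescriptor()
    descriptor.minFilter = .linear      // filtered minification: smooth when zoomed out
    descriptor.magFilter = .nearest     // unfiltered magnification: crisp pixels when zoomed in
    descriptor.mipFilter = .linear      // blend between mip levels while zooming
    return device.makeSamplerState(descriptor: descriptor)
}
```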

Megashader network

The final piece of game technology in Photospector is the megashader used to compute the real-time color adjustments. Megashaders combine the results of multiple operations and can selectively enable features to keep performance as high as possible for the selected set of operations. Our megashader implements the two-step virtual texture lookup and then performs a series of optional color adjustment operations controlled by the slider values in Photospector’s color tools. Brightness, contrast, exposure, white/black points, gamma, and other adjustments are performed before the final pixel is displayed.

Implementing these color adjustments in the shader provides two key benefits: real-time performance and non-destructive editing. Photospector never modifies the original image content. To save out the results of an operation, it must render the image and store the result either in the device gallery or to one of the sharing services, like Facebook, Dropbox, or Twitter. Non-destructive editing avoids any accumulation of error in the result, and, when coupled with 16-bit shaders, it allows Photospector to operate on RAW or 16-bit data with full precision (though the initial version does not support 16-bit processing).

The combination of the virtual texturing and megashader game technologies provides quality and performance that enable a whole new way of working with your images.
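To make the adjustment chain concrete, here is a minimal CPU-side sketch of an optional, per-pixel operation chain in the spirit of the megashader; the formulas and parameter ranges are illustrative, not Photospector’s actual shader code.

```swift
import Foundation

// Sketch: each adjustment stage is optional, so disabled stages cost nothing,
// and the chain always starts from the untouched source value (non-destructive).
struct Adjustments {
    var exposure: Double? = nil    // in stops; nil = stage disabled
    var brightness: Double? = nil  // additive offset
    var contrast: Double? = nil    // 1.0 = unchanged
    var gamma: Double? = nil       // 1.0 = unchanged
}

func adjust(_ source: Double, with a: Adjustments) -> Double {
    var v = source                                        // the original pixel is never modified
    if let exposure = a.exposure     { v *= pow(2, exposure) }
    if let brightness = a.brightness { v += brightness }
    if let contrast = a.contrast     { v = (v - 0.5) * contrast + 0.5 }
    if let gamma = a.gamma           { v = pow(max(v, 0), 1 / gamma) }
    return v
}
```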