MSat wrote:
> I'm thinking that downscaling would have to be performed by the application rather than being forced via the GPU's control panel, as the scene needs to be rendered above the Rift's native resolution to prevent a loss of quality from warp compensation anyway. Since any application has to support the Rift natively (for which the resolution is known), I doubt that you would actually be able to select a different resolution anyway - only the type and amount of AA you want to apply.

Yes, that will be the proper way to do Rift support in games. But for any additional supersampling beyond what's needed to compensate for the warp losses, driver-based SSAA might be a bit faster than just further increasing the render buffer size.
On the other hand, dynamically adjusting the buffer size could be a way to keep a steady framerate while offering the highest possible quality: use a large render buffer for heavy downsampling when there's enough GPU headroom, and shrink the render buffer when the framerate drops. That wouldn't be possible with driver-based SSAA alone.
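To make that concrete, here's a minimal sketch of such a dynamic render-scale controller. All names and thresholds are made up for illustration (no SDK provides this function); the idea is just: shrink the supersampling factor when frames run long, grow it again when there's headroom.

```python
# Hypothetical dynamic render-buffer scaling: the scale factor multiplies
# the warp-compensated base resolution, so 1.0 is the minimum for full
# center sharpness and anything above it is extra supersampling.

TARGET_FRAME_MS = 1000.0 / 60.0   # 60 Hz display
MIN_SCALE, MAX_SCALE = 1.0, 2.0   # clamp range for the supersampling factor

def adjust_render_scale(scale, frame_ms):
    """Return the render-buffer scale to use for the next frame,
    based on how long the last frame took."""
    if frame_ms > TARGET_FRAME_MS * 0.95:
        scale *= 0.9    # falling behind: render fewer pixels next frame
    elif frame_ms < TARGET_FRAME_MS * 0.75:
        scale *= 1.05   # plenty of headroom: spend it on more downsampling
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A slow frame pulls the scale down immediately...
scale = adjust_render_scale(2.0, 18.0)   # -> 1.8
# ...while fast frames let it creep back up gradually.
scale = adjust_render_scale(scale, 9.0)  # -> 1.89
```

Shrinking faster than growing is deliberate: a dropped frame is far more noticeable in an HMD than slightly softer supersampling for a moment.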
I wonder how big the pixel magnification factor will be at the center of the warp (for determining the virtual resolution needed to compensate for the quality loss).
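For a rough feel of that number, here's a sketch assuming the warp is the usual even-polynomial barrel model, r_out = r_in * f(r_in) with f(r) = k0 + k1*r^2 + k2*r^4. The coefficients below are placeholders, not Oculus's actual values; the real factor depends on the lens.

```python
# Estimate how much the warp magnifies the center of the source buffer,
# under an assumed barrel-distortion polynomial. Coefficients are
# hypothetical stand-ins, not official Rift numbers.

K = (1.0, 0.22, 0.24)  # placeholder k0, k1, k2

def f(r):
    """Radial scaling factor of the distortion at normalized radius r."""
    r2 = r * r
    return K[0] + K[1] * r2 + K[2] * r2 * r2

def center_magnification(fit_radius=1.0):
    """If the distorted image is rescaled so the point at fit_radius
    lands on the screen edge, the center of the source buffer ends up
    magnified by f(fit_radius) / f(0) on screen. Rendering at that
    factor times native resolution keeps the center near 1:1 density."""
    return f(fit_radius) / f(0.0)

print(center_magnification())  # ~1.46 with these placeholder coefficients
```

So with these made-up coefficients you'd want roughly 1.46x the native resolution per axis just to break even at the center, before any extra supersampling on top.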