Quick answer: A normal RenderTexture has one surface and gets overwritten by whichever eye renders second — the other eye samples stale or empty pixels and you see a high-frequency flicker. Allocate it as a stereo target with vrUsage = VRTextureUsage.TwoEyes, render inside the XR display subsystem’s render pass, and match its MSAA sample count to the eye buffers.

Here is how to fix Unity RenderTextures that flicker between the left and right eyes in stereo VR. You build a fancy mirror, a security-camera display, or a magic portal using a Camera that renders to a RenderTexture, and on a flat monitor it looks great. The moment you put on the headset, the surface strobes — one eye sees the rendered content, the other sees the previous frame, the clear color, or nothing at all. The brain fuses the two into a flickering mess that ranges from mildly annoying to immediately nauseating. The cause is almost always that your RenderTexture is a single-eye surface being asked to feed a two-eye view.

The Symptom

You set up a Camera that writes into a RenderTexture, and a quad in the world samples that texture. Specific failure modes:

Strobing at headset refresh rate. The texture appears to flicker at 72/90/120 Hz, depending on the headset. This is the difference between left-eye and right-eye contents reaching your brain on alternating frames. Closing one eye stops the flicker entirely — a dead giveaway that the surface is single-eye.

One eye is correct, the other is black or stale. Looking at the surface with both eyes, one eye shows the expected image, the other shows whatever was last cleared into the buffer or a frame from many updates ago. Swapping which eye is closed swaps which view is correct.

Works in the Editor Game view but breaks in the build. The Game view renders mono, so the bug never appears. Only on-device or in XR Simulator builds does the texture flicker. This makes the bug invisible during day-to-day iteration.

Single Pass Instanced makes it worse. Switching the XR plugin from Multi Pass to Single Pass Instanced amplifies the flicker because both eyes share even more state, and the RenderTexture clear can interleave with the second-eye draws.
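Before digging into causes, a quick runtime check catches the single-eye allocation without ever putting on the headset. This is a sketch — `_rt` stands in for whatever RenderTexture your mirror or portal uses, and `StereoRTCheck` is a made-up component name:

```csharp
using UnityEngine;
using UnityEngine.XR;

public class StereoRTCheck : MonoBehaviour
{
    [SerializeField] RenderTexture _rt; // the mirror's RenderTexture

    void Start()
    {
        // A single-eye RT sampled inside an XR session is the classic flicker setup
        if (XRSettings.enabled && _rt != null &&
            _rt.descriptor.vrUsage != VRTextureUsage.TwoEyes)
        {
            Debug.LogWarning(
                $"'{_rt.name}' is a single-eye RT (vrUsage={_rt.descriptor.vrUsage}); " +
                "expect stereo flicker on device.");
        }
    }
}
```

Because the check keys off XRSettings.enabled, it stays silent in a mono Game view and fires exactly in the builds where the bug would otherwise be invisible.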

What Causes This

Single-surface RenderTexture for a two-eye view. A vanilla new RenderTexture(w, h, depth) allocates exactly one texture surface. In stereo rendering, both eyes try to bind that same surface. Whichever camera or pass renders second overwrites the first, and the first eye reads the second eye’s pixels (or vice versa) once they sample the texture later in the frame. The result is two slightly different images presented to opposite eyes — classic stereo flicker.

vrUsage not set to TwoEyes. Unity exposes a RenderTextureDescriptor.vrUsage field specifically to fix this. Setting it to VRTextureUsage.TwoEyes backs the RT with a Texture2DArray of two slices and arranges for the stereo renderer to bind slice 0 for the left eye, slice 1 for the right. Most tutorials miss this flag because they predate it, and most code samples assume mono rendering.

Camera.targetTexture renders before the XR pass. A standalone Camera with a targetTexture renders during its own slot in the camera loop, often before the XR display subsystem has begun its eye passes. The texture is filled once in mono and then sampled twice during the eye passes — same image both eyes, no parallax, sometimes correct, sometimes a frame behind. Pulling the rendering inside the XR pass via XRDisplaySubsystem.GetRenderPass guarantees per-eye execution.

MSAA mismatch. If your RT has 1x MSAA but the eye textures have 4x, Unity inserts an implicit resolve when sampling. That resolve can race with the second-eye pass and produce a half-resolved frame in one eye. Matching the sample counts removes the resolve and the race.

The Fix

Step 1: Allocate the RenderTexture as a stereo target. Use RenderTextureDescriptor rather than the constructor so you can set vrUsage. The underlying type becomes a Texture2DArray with one slice per eye, and the shaders that sample it need to know that.

// Create a stereo-ready RenderTexture for VR
public RenderTexture CreateStereoRT(int w, int h)
{
    var desc = new RenderTextureDescriptor(w, h)
    {
        colorFormat = RenderTextureFormat.ARGB32,
        depthBufferBits = 24,
        // QualitySettings.antiAliasing is 0 when MSAA is off; msaaSamples must be >= 1
        msaaSamples = Mathf.Max(1, QualitySettings.antiAliasing),
        // The crucial flag: allocate two slices, one per eye
        vrUsage = VRTextureUsage.TwoEyes,
        dimension = TextureDimension.Tex2DArray,
        volumeDepth = 2,
        useMipMap = false,
        autoGenerateMips = false
    };

    var rt = new RenderTexture(desc);
    rt.Create();
    return rt;
}

The Texture2DArray dimension and volumeDepth = 2 are required — vrUsage alone does not allocate the second slice on every backend. Setting all three together gives you a portable stereo target.

Step 2: Render inside the XR pass. Hook into the XRDisplaySubsystem so your render commands execute between the eye passes that already exist in the frame. Sampling shaders read unity_StereoEyeIndex to pick the correct slice.

using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

Camera _cam;        // the mirror camera; keep it disabled so it renders only on demand
RenderTexture _tmp; // single-slice scratch RT matching rt's size and format

static XRDisplaySubsystem GetActiveDisplay()
{
    var displays = new List<XRDisplaySubsystem>();
    SubsystemManager.GetSubsystems(displays);
    foreach (var d in displays)
        if (d.running) return d;
    return null;
}

void RenderStereoMirror(RenderTexture rt)
{
    var xr = GetActiveDisplay();
    if (xr == null) return;

    for (int p = 0; p < xr.GetRenderPassCount(); p++)
    {
        xr.GetRenderPass(p, out XRDisplaySubsystem.XRRenderPass pass);

        // GetRenderParameterCount takes the culling camera
        for (int e = 0; e < pass.GetRenderParameterCount(_cam); e++)
        {
            pass.GetRenderParameter(_cam, e,
                out XRDisplaySubsystem.XRRenderParameter eye);

            // Render this eye's view into the single-slice scratch target
            _cam.projectionMatrix = eye.projection;
            _cam.worldToCameraMatrix = eye.view;
            _cam.targetTexture = _tmp;
            _cam.Render();

            // Copy the result into the matching slice of the stereo RT.
            // textureArraySlice is 0 for the left eye, 1 for the right.
            // (If _tmp is MSAA, some platforms require a resolve before CopyTexture.)
            Graphics.CopyTexture(_tmp, 0, 0, rt, eye.textureArraySlice, 0);
        }
    }
}

The loop iterates render passes and parameters because Single Pass Instanced exposes one pass with two parameters (one per eye), while Multi Pass exposes two passes with one parameter each. The same code handles both.

Sampling the Stereo Texture in a Shader

Unity provides macros for sampling Texture2DArray RTs by eye index — UNITY_DECLARE_TEX2DARRAY / UNITY_SAMPLE_TEX2DARRAY in the built-in pipeline (via UnityCG.cginc) and TEXTURE2D_ARRAY / SAMPLE_TEXTURE2D_ARRAY in com.unity.render-pipelines.core (URP/HDRP):

// URP shader sampling a stereo RenderTexture
TEXTURE2D_ARRAY(_StereoMirror);
SAMPLER(sampler_StereoMirror);

half4 Frag(Varyings i) : SV_Target
{
    // Varyings must carry UNITY_VERTEX_OUTPUT_STEREO; this macro makes
    // unity_StereoEyeIndex valid in the fragment stage under instanced stereo
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
    int slice = unity_StereoEyeIndex;
    return SAMPLE_TEXTURE2D_ARRAY(
        _StereoMirror, sampler_StereoMirror, i.uv, slice);
}

If you declare the texture as a plain TEXTURE2D and sample it with the regular SAMPLE_TEXTURE2D macro, the binding no longer matches the array — depending on the graphics API you get slice 0 or plain black, which puts you back in single-eye land. The compiler will not warn you — it is a silent visual bug.

MSAA Has to Match

Open Project Settings → Quality and note your antialiasing value, then check the active XR plugin’s render-pass MSAA setting. The RT’s msaaSamples must match the sample count the eye textures actually use. In code, query QualitySettings.antiAliasing at startup and pass it into your descriptor, clamping 0 to 1 since 0 means disabled. If you support multiple quality levels, recreate the RT when the quality level changes.
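A minimal sketch of that recreate-on-change step, reusing CreateStereoRT from Step 1 — the field name and the polling-in-LateUpdate approach are illustrative, not the only way to detect a quality change:

```csharp
RenderTexture _mirrorRT; // recreated whenever the MSAA setting changes

void LateUpdate()
{
    // antiAliasing is 0 when MSAA is off; the descriptor needs >= 1
    int aa = Mathf.Max(1, QualitySettings.antiAliasing);
    if (_mirrorRT == null || _mirrorRT.antiAliasing != aa)
    {
        if (_mirrorRT != null) _mirrorRT.Release();
        _mirrorRT = CreateStereoRT(1024, 1024); // descriptor reads the new AA value
    }
}
```

Comparing against RenderTexture.antiAliasing avoids tracking the quality level separately: the RT itself records the sample count it was created with.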

If you need a non-MSAA render target for a deferred shading pass, allocate a separate single-sample RT for the resolve and never bind it as a camera target during the XR pass — do the resolve as an explicit Blit between passes.

“Mono code in a stereo world will always look like a 1990s movie effect — flicker, parallax in the wrong place, ghost images. Tell Unity you have two eyes and it will give you two slices.”

Related Issues

If your VR camera positions feel correct in the editor but jitter on-device, see Unity VR Camera Jitter in Builds. For Single Pass Instanced compatibility issues with custom shaders, check SPI Shaders Broken After Upgrade.

Two eyes, two slices — VRTextureUsage.TwoEyes is the magic flag everyone forgets.