Quick answer: Meta Quest 2 and Quest 3 require a sustained 72 fps at minimum, with 90 fps and 120 fps modes available. Any frame that takes longer than the target frame time causes the compositor to reproject the previous frame, which introduces visual artifacts.
Debugging VR bugs on Meta Quest is a common challenge for game developers. VR development introduces an entire category of bugs that do not exist in flat-screen games. A dropped frame does not just look bad — it makes players physically nauseous. A tracking glitch does not just break immersion — it disorients someone who cannot see the real world. Debugging on Meta Quest adds another layer of complexity because you are working with a mobile chipset, limited thermal headroom, and no monitor to watch your debug output. This guide covers the VR-specific bugs you will encounter on Quest and how to track them down.
Frame Rate and Motion Sickness
The relationship between frame rate and comfort is non-negotiable in VR. On a flat screen, dropping from 60 fps to 45 fps is annoying. In VR, it causes nausea. The Quest compositor runs at a fixed rate (72, 90, or 120 Hz depending on your app’s settings) and expects a new frame from your app every cycle. When your app misses a frame deadline, the compositor uses Asynchronous SpaceWarp (ASW) to reproject the previous frame. ASW prevents the display from showing a stale image, but reprojected frames introduce visible artifacts around moving objects — ghosting, warping, and edge distortion.
Occasional single-frame drops are tolerable. Sustained drops are not. If your game runs at 90 Hz, you have 11.1 milliseconds per frame. If a frame takes 12 milliseconds, ASW kicks in and players see artifacts. If multiple consecutive frames miss the deadline, players feel discomfort within seconds and nausea within minutes.
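The budgets above follow directly from the refresh rate: the budget in milliseconds is 1000 divided by the refresh rate in Hz. A quick calculation makes the margins concrete:

```python
# Frame-time budget per refresh rate: budget_ms = 1000 / refresh_hz.
# A frame that exceeds its budget misses the compositor deadline and
# the previous frame is reprojected instead.

def frame_budget_ms(refresh_hz: float) -> float:
    """Milliseconds available to render one frame at the given refresh rate."""
    return 1000.0 / refresh_hz

def misses_deadline(frame_time_ms: float, refresh_hz: float) -> bool:
    """True if a frame of this duration would miss the compositor deadline."""
    return frame_time_ms > frame_budget_ms(refresh_hz)

for hz in (72, 90, 120):
    print(f"{hz} Hz -> {frame_budget_ms(hz):.2f} ms per frame")

# At 90 Hz, a 12 ms frame blows the 11.11 ms budget:
print(misses_deadline(12.0, 90))  # True
```

Note how little headroom 120 Hz leaves: 8.33 ms per frame, shared between your game logic, rendering, and everything the system does on your behalf.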
The OVR Metrics Tool is your primary diagnostic tool. It runs as a heads-up overlay inside the headset and displays frame rate, CPU and GPU utilization, and whether ASW is active. Enable it during every playtest session. When you see ASW activating, note the scene, the player’s position, and what was happening in the game. Then reproduce that scenario with a GPU profiler attached to find the specific bottleneck.
Tracking Bugs
Quest uses inside-out tracking with cameras on the headset to track both the headset position and the controllers. Tracking works well in typical room conditions but degrades in specific scenarios that you must test for. Low lighting causes tracking loss because the cameras cannot see environmental features. Direct sunlight or strong IR light can overwhelm the sensors. Featureless environments like a white-walled room with no furniture provide insufficient visual features for tracking.
When tracking is lost, the headset position freezes or drifts. Controllers may float away from the player’s actual hand position. Your game needs to handle these states gracefully. The Oculus SDK provides tracking confidence values — monitor them and show a warning or pause the game when confidence drops below a threshold rather than letting the player continue with broken input.
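A minimal sketch of that gating logic, with hysteresis so the warning does not flicker when confidence hovers near the threshold. The 0–1 confidence value and both thresholds are illustrative, not values from the Meta SDK (which reports enum-style confidence per tracked node):

```python
# Sketch: gate gameplay on tracking confidence. Pause below one
# threshold, resume only after confidence recovers past a higher one,
# so borderline tracking does not toggle the pause screen every frame.
# Threshold values are illustrative.

PAUSE_BELOW = 0.3   # pause gameplay under this confidence
RESUME_ABOVE = 0.6  # only resume once confidence recovers past this

class TrackingGate:
    def __init__(self):
        self.paused = False

    def update(self, confidence: float) -> str:
        """Return the state the game should be in for this frame."""
        if self.paused:
            if confidence >= RESUME_ABOVE:
                self.paused = False
        elif confidence < PAUSE_BELOW:
            self.paused = True
        return "paused" if self.paused else "playing"

gate = TrackingGate()
for c in (0.9, 0.2, 0.4, 0.7):
    print(c, gate.update(c))
```

The middle sample (0.4) stays paused even though it is above the pause threshold — that gap is what prevents rapid pause/resume flicker.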
Controller occlusion is a predictable tracking failure. When a player holds a controller behind their back or below their waist (outside the headset cameras’ field of view), tracking relies on the controller’s IMU for dead reckoning, which drifts quickly. If your game requires aiming behind the player or reaching below waist level, test these interactions specifically and consider adding a recentering prompt if tracking drifts too far.
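To see why IMU dead reckoning drifts so quickly, double-integrate a small constant accelerometer bias: position error grows with the square of time. The bias value below is illustrative, not a measured Quest controller figure:

```python
# Double-integrating a constant accelerometer bias shows why IMU-only
# tracking degrades fast: position error = 0.5 * bias * t^2.

def drift_m(bias_m_s2: float, seconds: float) -> float:
    """Position error from a constant accelerometer bias after t seconds."""
    return 0.5 * bias_m_s2 * seconds ** 2

# A modest 0.1 m/s^2 bias (illustrative) while the controller sits
# behind the player's back, out of camera view:
for t in (0.5, 1.0, 2.0):
    print(f"{t:.1f} s -> {drift_m(0.1, t) * 100:.2f} cm of drift")
```

Two seconds out of camera view is already around 20 cm of error with this bias — easily enough to make a holstered weapon or a behind-the-back grab feel wrong.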
Guardian and Boundary System
The Quest guardian system draws a virtual boundary when the player approaches the edge of their play area. Your app receives events when the guardian becomes visible and when the player passes through it. Many VR games ignore these events entirely, which leads to bugs where the player is interacting with both the guardian overlay and the game simultaneously.
The correct behavior depends on your game type. A stationary game (seated or standing in place) should pause or dim when the guardian activates, since the player is likely stepping outside their intended position. A room-scale game should at minimum reduce interactive elements near the boundary edges. No game should ever place critical interactions (buttons, items, enemies) in positions that consistently trigger the guardian for players with small play areas.
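The branching above can be sketched as a small event handler. The game types, event names, and method names here are hypothetical stand-ins for whatever boundary callbacks your engine integration actually exposes:

```python
# Sketch: react to guardian/boundary events by game type.
# Event strings and the game interface are placeholders, not SDK names.

def on_guardian_event(game_type: str, event: str, game) -> None:
    if event == "boundary_visible":
        if game_type == "stationary":
            game.pause()               # player likely left their spot
        elif game_type == "room_scale":
            game.dim_near_boundary()   # de-emphasize interactions at the edge
    elif event == "boundary_crossed":
        game.pause()                   # never keep gameplay live outside the play area

class FakeGame:
    """Test double that records which reaction fired."""
    def __init__(self):
        self.calls = []
    def pause(self):
        self.calls.append("pause")
    def dim_near_boundary(self):
        self.calls.append("dim")

g = FakeGame()
on_guardian_event("stationary", "boundary_visible", g)
print(g.calls)  # ['pause']
```

The test double is deliberate: guardian events are awkward to trigger on demand, so reaction logic like this is worth unit testing off-device.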
Test your game in the minimum recommended play space (2m x 2m for room-scale) and verify that all core gameplay is accessible without triggering the guardian. Many developers test in large spaces and never see guardian issues until players with small apartments report problems.
Hand Tracking Bugs
If your game supports hand tracking, you are dealing with a fundamentally noisier input system than controllers. Hand tracking on Quest uses the same headset cameras as positional tracking, with machine learning models estimating hand pose. The system works well for casual interactions but has known limitations: fingers close together are hard to distinguish, fast hand movements cause tracking lag, hands partially occluded by objects (like gripping a real table edge) lose tracking, and two hands close together can confuse the system.
Design your hand tracking interactions to be forgiving. A pinch gesture should activate within a range of finger distances, not at a precise threshold. A grab gesture should have a generous activation volume around the target object. Provide visual feedback for tracking confidence — fade hand models to transparent when tracking quality drops rather than showing jittery hands that break immersion.
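One way to make a pinch forgiving is hysteresis on the thumb–index distance: activate below one distance, release only once the fingers separate past a larger one, so tracking jitter in between cannot drop the pinch. The distances below are illustrative, not Meta-recommended values:

```python
# Sketch: forgiving pinch detection with hysteresis.
# Thresholds are illustrative tuning values.

PINCH_ON_M = 0.015   # activate when fingertips are closer than 1.5 cm
PINCH_OFF_M = 0.030  # release only once they separate past 3 cm

class PinchDetector:
    def __init__(self):
        self.pinching = False

    def update(self, thumb_index_dist_m: float) -> bool:
        """Feed the current fingertip distance; returns pinch state."""
        if self.pinching:
            if thumb_index_dist_m > PINCH_OFF_M:
                self.pinching = False
        elif thumb_index_dist_m < PINCH_ON_M:
            self.pinching = True
        return self.pinching

p = PinchDetector()
for d in (0.04, 0.01, 0.02, 0.035):
    print(f"{d:.3f} m -> {'pinching' if p.update(d) else 'open'}")
```

The 0.02 m sample sits between the two thresholds and stays pinched — with a single sharp threshold, that same jitter would have released the player's grip mid-interaction.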
Test hand tracking with different skin tones and in different lighting conditions. The tracking quality can vary and your gesture recognition thresholds need to accommodate this variance.
Thermal Throttling
The Quest runs on a Snapdragon XR2-series mobile chip with passive cooling inside a sealed headset pressed against the player’s face. Heat has nowhere to go. Under sustained GPU load, the chip’s clock speeds reduce to prevent overheating, and your frame rate drops with them. This thermal throttling is the most common cause of the “it works fine for ten minutes then gets bad” pattern.
The OVR Metrics Tool displays a thermal state from level 0 (cool) to level 4 (critical). At level 2, the system starts reducing clock speeds. At level 3, the reduction is significant. At level 4, the system may force the app to close. Your goal is to keep the headset at thermal level 1 or below during sustained gameplay.
If your game reaches thermal level 2 within fifteen minutes on a Quest starting at room temperature, your GPU workload is too high. Common fixes include reducing render resolution with fixed foveated rendering (which lowers resolution at the edges of the lens where the player is not looking), reducing draw calls through batching and instancing, simplifying shaders, and reducing the number of real-time lights. Every optimization that reduces GPU utilization extends the time before thermal throttling kicks in.
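One practical mitigation is to step render quality down as the reported thermal level rises rather than waiting for throttling to do it for you. A sketch of that mapping — the quality tiers and the level-to-tier assignment are illustrative, not Meta-defined:

```python
# Sketch: map the reported thermal level (0 = cool ... 4 = critical)
# to a render quality tier. Tiers and mapping are illustrative.

QUALITY_BY_THERMAL = {
    0: "full",     # full resolution, minimal foveation
    1: "full",     # still healthy; this is where you want to live
    2: "reduced",  # enable fixed foveated rendering, drop resolution
    3: "minimum",  # aggressive foveation, simplified shaders
    4: "minimum",  # system may force-quit the app regardless
}

def quality_for_thermal(level: int) -> str:
    """Clamp the level into range and return the quality tier to use."""
    return QUALITY_BY_THERMAL[max(0, min(4, level))]

for lvl in range(5):
    print(lvl, quality_for_thermal(lvl))
```

Stepping down early at level 2 trades a visible but controlled quality drop for the much worse alternative: uncontrolled clock reductions and ASW artifacts a few minutes later.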
Debugging with adb and OVR Tools
Since the Quest has no monitor output during normal gameplay, all debugging happens either through a connected PC or through in-headset overlays. Connect the Quest via USB and use adb logcat to stream logs in real time. Filter by your app’s tag to reduce noise. For Unity apps, filter with adb logcat -s Unity. For crashes, run adb bugreport after the crash to capture the full system log including the crash stack trace.
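When you capture logcat output to a file instead of filtering live, you can pull out one tag's lines after the fact. A small sketch, assuming logcat's "brief"-style line format (`LEVEL/TAG( PID): message`); adjust the pattern if you log in a different format:

```python
# Sketch: extract one tag's messages from captured logcat text.
# Assumes logcat "brief" format lines: "I/Unity( 1234): message".
import re

LINE_RE = re.compile(r"^([VDIWEF])/([^(]+)\(\s*\d+\):\s?(.*)$")

def filter_by_tag(log_text: str, tag: str) -> list:
    """Return the messages logged under the given tag."""
    out = []
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if m and m.group(2).strip() == tag:
            out.append(m.group(3))
    return out

sample = (
    "I/Unity( 1234): Frame took 12.3 ms\n"
    "E/OVRPlugin( 1234): tracking lost\n"
    "I/Unity( 1234): ASW active\n"
)
print(filter_by_tag(sample, "Unity"))  # ['Frame took 12.3 ms', 'ASW active']
```

This kind of post-hoc filtering pairs well with `adb bugreport` captures, which mix every process's output into one stream.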
Meta’s GPU profiler (integrated into RenderDoc) can capture individual frames from a running Quest app and analyze them on your PC. This is essential for finding the specific draw calls, shaders, or textures that cause GPU bottlenecks. Connect via USB, capture a frame during a performance problem, and examine the GPU timeline to see which render pass takes the most time.
For CPU profiling, both Unity Profiler and Unreal Insights support remote profiling over adb. Run the profiler on your PC, connect to the Quest app, and capture CPU frame timings, memory allocations, and thread activity. The key metrics are main thread time, render thread time, and any spikes that correlate with frame drops visible in the OVR Metrics Tool.
Common Quest-Specific Bugs
Passthrough artifacts are unique to Quest’s mixed reality mode. If your app uses passthrough (showing the real world through the headset cameras), you may encounter depth estimation errors where virtual objects appear to float in front of or behind real-world surfaces, color temperature shifts between passthrough and rendered content, and latency differences between passthrough video and rendered elements that cause a disconnect during fast head movements.
Audio spatialization bugs are more noticeable in VR than on flat screens. If a sound source is supposed to be to the player’s left but arrives in the right ear, it breaks the sense of presence immediately. Verify that your audio spatialization respects the headset’s tracked rotation and that sound sources move correctly when the player turns their head.
Sleep and wake behavior on Quest is similar to mobile devices. The headset enters sleep when removed from the player’s head (detected by the proximity sensor). Your app receives a pause event and must save state. On resume, audio devices, network connections, and Bluetooth controller connections may need to be re-established. Test the pause/resume cycle repeatedly, including during active multiplayer sessions.
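The pause/resume contract can be sketched as a pair of handlers. The subsystem flags below are placeholders for whatever your app actually tears down and rebuilds; the real callbacks arrive through your engine's application lifecycle API:

```python
# Sketch: save state on pause, re-establish fragile resources on resume.
# Subsystem names are placeholders, not engine or SDK APIs.

class App:
    def __init__(self):
        self.saved_state = None
        self.audio_ready = False
        self.network_ready = False

    def on_pause(self, game_state: dict) -> None:
        self.saved_state = dict(game_state)  # persist before sleep
        self.audio_ready = False             # devices may be torn down while asleep
        self.network_ready = False           # sockets and Bluetooth links can drop

    def on_resume(self) -> dict:
        self.audio_ready = True              # re-open audio devices
        self.network_ready = True            # reconnect sockets / controllers
        return self.saved_state or {}

app = App()
app.on_pause({"level": 3, "health": 80})
print(app.on_resume())  # {'level': 3, 'health': 80}
```

The important habit is treating every resume as a cold start for fragile resources: assume nothing survived the nap, then verify what did.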
If a playtester reports feeling sick, that is a bug. Treat it with the same severity as a crash.