Quick answer: A game should target a crash-free session rate of 99.5 percent or higher before release. This means that out of every 1,000 play sessions, no more than 5 should end in a crash. For console certification, platform holders like Sony and Microsoft typically require even stricter thresholds.
A thorough release checklist covering QA, stability, and performance prevents issues from reaching players. Shipping a game is not the same as finishing a game. The gap between “it works on my machine” and “it works reliably for thousands of players across hundreds of hardware configurations” is where launches succeed or fail. This checklist covers the quality, stability, and performance gates you should clear before pressing the release button. It is organized by phase so you can track your progress in the weeks leading up to launch.
Phase 1: Pre-Release QA Checklist
Begin this phase four to six weeks before your target release date. These are the foundational checks that everything else depends on.
Full playthrough on target platforms. Someone on your team — ideally not the person who built the content — needs to play through the entire game on every platform you are shipping on. Not a speedrun. A thorough playthrough that tests every major path, every side quest, every menu, every save and load cycle. Document the hardware used, the time taken, and every issue encountered, no matter how minor.
Bug backlog review. Export your complete bug backlog and categorize every open issue by severity. Your release criteria should be explicit: zero P0 (blocker) bugs, zero P1 (critical) bugs, and a documented decision for every P2 (major) bug explaining whether it will be fixed before launch, fixed in the day-one patch, or deferred. Do not ship with unreviewed bugs — every open issue should have a conscious disposition.
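The release criteria above can be enforced mechanically. Here is a minimal sketch of a release-gate check over an exported backlog; the field names (`id`, `severity`, `status`, `disposition`) are assumptions about your tracker's export format, not a real tool's schema:

```python
# Minimal release-gate check over an exported bug backlog.
# Field names are hypothetical; adapt them to your tracker's export.
def release_gate(bugs):
    """Return a list of blocking reasons; an empty list means the gate passes."""
    blockers = []
    for bug in bugs:
        sev, status = bug["severity"], bug["status"]
        if status == "open":
            if sev in ("P0", "P1"):
                blockers.append(f"{bug['id']}: open {sev} bug")
            elif sev == "P2" and not bug.get("disposition"):
                blockers.append(f"{bug['id']}: P2 without a documented disposition")
    return blockers

backlog = [
    {"id": "BUG-101", "severity": "P1", "status": "open"},
    {"id": "BUG-102", "severity": "P2", "status": "open", "disposition": "day-one patch"},
    {"id": "BUG-103", "severity": "P2", "status": "open"},
]
print(release_gate(backlog))
# → ['BUG-101: open P1 bug', 'BUG-103: P2 without a documented disposition']
```

Running a check like this in CI makes "every open issue has a conscious disposition" a build failure rather than a policy.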
Regression testing on fixed bugs. Every bug marked as fixed during your beta or development period needs to be retested in the release candidate build. Regression is the silent killer of game launches. A fix for one system can break another, and automated tests do not catch everything in a game context. Allocate at least two full days for regression testing and prioritize retesting P0 and P1 fixes.
Save system validation. Test saving and loading from every major game state: mid-combat, during cutscenes, at menu screens, during loading transitions, and at the boundaries between levels or zones. Test what happens when a save is interrupted — kill the process during a save operation and verify the game recovers gracefully. Test save file compatibility if you have changed the save format during development. Corrupt a save file manually and verify the game handles it without crashing.
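The corrupt-save check is easy to automate. This sketch assumes a hypothetical JSON-based save format and a loader (`load_save`) that returns `None` on bad data so the caller can fall back to a fresh save; your engine's format and recovery path will differ:

```python
import json
import os
import tempfile

def load_save(path):
    """Hypothetical loader: returns the save dict, or None on corrupt data."""
    try:
        with open(path, "rb") as f:
            return json.loads(f.read().decode("utf-8"))
    except (OSError, ValueError, UnicodeDecodeError):
        return None  # caller falls back to a fresh save instead of crashing

# Write a valid save, then corrupt it by truncating mid-file.
path = os.path.join(tempfile.mkdtemp(), "slot0.sav")
with open(path, "w") as f:
    json.dump({"level": 3, "hp": 72}, f)
assert load_save(path) == {"level": 3, "hp": 72}

with open(path, "r+b") as f:
    f.truncate(5)  # simulate a save interrupted partway through a write

print(load_save(path))  # graceful failure, no exception
```

The same harness can flip random bytes instead of truncating to cover a wider range of corruption patterns.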
Input device testing. Test every supported input method through the entire game. Keyboard and mouse, controller (at minimum Xbox and PlayStation layouts), touchscreen if applicable, and any specialty inputs you support. Pay special attention to rebinding — verify that rebound controls work in all contexts including menus, gameplay, and cutscenes. Test hot-plugging controllers mid-game.
Localization verification. If your game ships in multiple languages, every localized string needs visual verification in context. Automated checks can confirm that translation keys are populated, but only a human can verify that the translated text fits in the UI, reads correctly in context, and does not overlap with other elements. German and Russian text is typically 30 to 40 percent longer than English — verify that your UI handles this gracefully.
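Human review is irreplaceable here, but a cheap automated pre-screen can flag the riskiest strings first. This sketch (hypothetical key names, and a 1.4x length ratio chosen to match the 30 to 40 percent expansion mentioned above) flags missing or unusually long translations:

```python
def overflow_risks(english, translated, budget_ratio=1.4):
    """Flag keys whose translation is missing or much longer than the
    English source -- a cheap pre-screen before human in-context review."""
    risks = []
    for key, en in english.items():
        tr = translated.get(key)
        if tr is None:
            risks.append((key, "missing translation"))
        elif len(tr) > budget_ratio * len(en):
            risks.append((key, f"{len(tr) / len(en):.1f}x longer"))
    return risks

en = {"menu.play": "Play", "menu.settings": "Settings"}
de = {"menu.play": "Spielen", "menu.settings": "Einstellungen"}
print(overflow_risks(en, de))
```

Strings flagged this way go to the top of the visual verification queue; strings that pass still need a human look, just less urgently.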
Phase 2: Stability Metrics and Thresholds
Stability is measurable. Do not rely on subjective assessments like “it feels stable” or “I have not seen a crash in a while.” Define numeric thresholds and track them systematically.
Crash-free session rate. This is the percentage of play sessions that complete without a crash to desktop, a freeze requiring force-quit, or an unrecoverable error. Your target should be 99.5 percent or higher. Measure this across your beta population or internal testing over at least one week of daily play. If you are at 99 percent, that means 1 in 100 sessions crashes — which translates to a significant number of negative reviews on launch day.
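The arithmetic behind this threshold is simple enough to wire directly into a release dashboard. A sketch, with made-up session counts for illustration:

```python
def crash_free_rate(total_sessions, crashed_sessions):
    """Percentage of sessions that completed without a crash."""
    return 100.0 * (total_sessions - crashed_sessions) / total_sessions

# Hypothetical beta numbers: 4,000 sessions, 28 ended in a crash.
rate = crash_free_rate(4000, 28)
print(f"{rate:.2f}%", "PASS" if rate >= 99.5 else "FAIL")
# → 99.30% FAIL
```

Note that 99.30 percent sounds close to the target but still fails the gate; at that rate, 28 of every 4,000 launch-day sessions end in a crash.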
Mean time between failures (MTBF). Track the average playtime before a crash occurs. For a game with roughly ten hours of total playtime, an MTBF of 20 hours means most players will complete the game without a crash; an MTBF of 4 hours means almost every player will experience at least one. Calculate this from your crash reporting data and set a minimum threshold based on your game’s expected total playtime and session length.
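MTBF is just aggregate playtime divided by crash count, and under the simplifying assumption that crashes arrive at a roughly constant rate (an exponential model, which real crash data only approximates), you can also estimate how many players will get through the game untouched:

```python
import math

def mtbf_hours(total_playtime_hours, crash_count):
    """Mean playtime between crashes, aggregated across the test population."""
    return total_playtime_hours / max(crash_count, 1)

def crash_free_probability(playtime_hours, mtbf):
    """Chance a player with this much playtime never crashes,
    assuming crashes arrive at a constant rate (exponential model)."""
    return math.exp(-playtime_hours / mtbf)

# Hypothetical beta data: 1,200 hours of aggregate play, 60 crashes.
mtbf = mtbf_hours(1200.0, 60)           # 20 hours between crashes
p = crash_free_probability(10.0, mtbf)  # a 10-hour playthrough
print(round(mtbf, 1), round(p, 2))
# → 20.0 0.61
```

Even at a 20-hour MTBF, roughly four in ten players on a 10-hour playthrough will see at least one crash under this model, which is why MTBF targets should be set well above expected total playtime.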
Memory stability over long sessions. Run your game for four to eight hours continuously while monitoring memory usage. Memory should be stable, meaning it does not continuously grow over time. A slow memory leak that adds 50 MB per hour might not manifest during a 30-minute QA session but will cause crashes during marathon play sessions. Test on your minimum spec hardware where available memory is lowest.
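A slow leak shows up as a positive slope when you fit a line through periodic memory samples from a soak test. A minimal sketch, using a least-squares fit over hypothetical hourly readings:

```python
def leak_slope_mb_per_hour(samples):
    """Least-squares slope of (hours, resident MB) samples.
    A slope near zero over a long soak test suggests stable memory."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_m = sum(m for _, m in samples) / n
    num = sum((t - mean_t) * (m - mean_m) for t, m in samples)
    den = sum((t - mean_t) ** 2 for t, _ in samples)
    return num / den

# Hourly samples from a hypothetical 8-hour soak test (MB resident).
soak = [(0, 2100), (1, 2160), (2, 2190), (3, 2250),
        (4, 2310), (5, 2340), (6, 2410), (7, 2450)]
print(f"{leak_slope_mb_per_hour(soak):.0f} MB/hour")
# → 50 MB/hour
```

This example leaks at exactly the 50 MB per hour rate described above: invisible in a 30-minute QA pass, fatal somewhere around hour six on a machine with little headroom.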
Crash clustering. Analyze your crash reports to identify clusters. A single crash that accounts for 60 percent of all reports is a very different situation than 50 unique crashes each accounting for 2 percent. Prioritize fixing clustered crashes because each fix improves the crash-free rate significantly. If your top three crash signatures account for more than 50 percent of crashes, fix those and retest before evaluating whether you meet your stability threshold.
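Clustering is a grouping-and-counting exercise over crash signatures. A sketch with made-up signature names, computing the share covered by the top three:

```python
from collections import Counter

def top_cluster_share(signatures, k=3):
    """Fraction of all crash reports covered by the k most common signatures."""
    counts = Counter(signatures)
    top = counts.most_common(k)
    return sum(n for _, n in top) / len(signatures), top

# Hypothetical report stream: one dominant crash, a few long-tail ones.
reports = (["null_deref_renderer"] * 60 + ["oom_streaming"] * 15 +
           ["audio_device_lost"] * 10 + ["misc_a"] * 8 + ["misc_b"] * 7)
share, top = top_cluster_share(reports)
print(f"top {len(top)} signatures cover {share:.0%} of crashes")
# → top 3 signatures cover 85% of crashes
```

Here the top three signatures cover 85 percent of reports, so fixing just those three and retesting is the fastest route to the stability threshold.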
Startup reliability. Track the percentage of installations that successfully launch the game on first attempt. Failed startups — due to missing dependencies, driver incompatibilities, or configuration issues — are the most damaging kind of failure because the player has no opportunity to experience the game at all. Target 99 percent successful first launches across your tested hardware matrix.
Phase 3: Performance Benchmarks
Performance expectations vary by genre, platform, and target audience. A turn-based strategy game can ship at 30 FPS comfortably; a competitive shooter cannot. Define your benchmarks based on your game and your audience, then verify them systematically.
Frame rate targets. Define minimum and target frame rates for each quality preset or platform. A common structure: minimum spec hardware should hold 30 FPS at low settings with no drops below 20 FPS; recommended spec hardware should hold 60 FPS at medium settings with no drops below 45 FPS; high-end hardware should hold 60 FPS or higher at maximum settings. These numbers should be sustained, not peaks. Measure using the 1 percent low frame rate (equivalently, the 99th percentile frame time), not the average, because averages hide stutters.
Frame time consistency. Average frame rate is misleading. A game that alternates between 120 FPS and 20 FPS has an average of 70 FPS but feels terrible. Track frame time variance and ensure the 99th percentile frame time is no more than double the median. Shader compilation stutters, garbage collection pauses, and asset streaming hitches are the usual culprits for frame time spikes in otherwise smooth games.
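The p99-versus-median rule above is straightforward to compute from a frame time capture. A sketch using simple index-based percentiles (adequate for captures of a thousand frames or more):

```python
def frame_time_consistency(frame_times_ms):
    """Return (median, p99) frame times in ms.
    Consistent if p99 is no more than double the median."""
    ts = sorted(frame_times_ms)
    median = ts[len(ts) // 2]                         # index-based: fine for large samples
    p99 = ts[min(len(ts) - 1, int(len(ts) * 0.99))]
    return median, p99

# 1,000 mostly-smooth frames (~60 FPS) with a handful of streaming hitches.
times = [16.7] * 985 + [50.0] * 15
median, p99 = frame_time_consistency(times)
print(median, p99, "OK" if p99 <= 2 * median else "STUTTER")
# → 16.7 50.0 STUTTER
```

This capture averages close to 60 FPS, yet fails the consistency check: fifteen 50 ms hitches in a thousand frames is exactly the kind of stutter an average hides.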
Loading times. Measure and document loading times for every transition in the game: initial startup, main menu to gameplay, level transitions, fast travel, respawning, and returning to the menu. Set maximum acceptable times for each: 15 seconds for initial load, 5 seconds for level transitions, 2 seconds for respawns. Test on both SSD and HDD if you support older hardware. Loading times on a mechanical hard drive can be three to five times longer than on an SSD.
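Once the budgets are written down, checking a measurement run against them is a one-liner per transition. A sketch using the budgets from this section (the transition names are illustrative):

```python
# Maximum acceptable load times, in seconds, from the checklist above.
LOAD_BUDGETS_S = {"initial": 15.0, "level_transition": 5.0, "respawn": 2.0}

def over_budget(measured):
    """Return the transitions whose measured load time exceeds its budget."""
    return {name: t for name, t in measured.items()
            if t > LOAD_BUDGETS_S.get(name, float("inf"))}

# Hypothetical HDD measurement run.
print(over_budget({"initial": 12.3, "level_transition": 6.8, "respawn": 1.4}))
# → {'level_transition': 6.8}
```

Run the same check twice, once with SSD timings and once with HDD timings, so the three-to-five-times slowdown on mechanical drives is covered by the gate rather than discovered by players.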
Memory usage. Profile your game’s memory consumption across its most demanding sections — typically large open areas with many entities, particle effects, or complex UI overlays. On your minimum spec hardware, peak memory usage should stay below 80 percent of available system RAM. Exceeding this threshold causes the operating system to page to disk, creating massive performance drops. On consoles, exceeding memory limits causes immediate termination.
GPU and CPU utilization. On recommended spec hardware, neither the GPU nor the CPU should consistently exceed 90 percent utilization during normal gameplay. Headroom is essential for handling transient spikes from combat, explosions, or other intensive moments. If your game regularly maxes out a resource, those transient spikes will cause visible hitches.
Thermal performance on mobile and handheld. If you ship on mobile, Switch, or Steam Deck, test extended play sessions (60 minutes or more) and monitor for thermal throttling. A game that runs at 60 FPS for the first 10 minutes but drops to 25 FPS after 30 minutes due to thermal limits is failing a performance test that short benchmarks would miss.
Phase 4: Platform Certification Requirements
If you are shipping on consoles, platform certification is a mandatory gate with specific technical requirements. Failing certification delays your launch — sometimes by weeks. Prepare early.
Sony PlayStation certification requires that the game never crashes, never hangs for more than 30 seconds, correctly handles all system events (controller disconnect, user switching, suspend and resume, system notifications), meets performance requirements for the hardware, and properly implements trophies, save data management, and online features according to their technical requirements checklist. Obtain the TRC (Technical Requirements Checklist) early and test against it throughout development, not just at the end.
Microsoft Xbox certification has similar requirements documented in their XR (Xbox Requirements). Key areas include proper handling of Quick Resume, Smart Delivery across console generations, Xbox Live integration, and accessibility features. Microsoft offers a pre-certification self-test tool — use it before submitting to catch common failures.
Nintendo Switch certification focuses on performance and user experience. The game must function correctly in both handheld and docked modes with appropriate resolution and performance adjustments. Test save data compatibility, controller configurations including Joy-Con and Pro Controller, and proper handling of sleep mode.
Steam does not have formal certification, but the Steam Deck verification program checks for controller support, readable text at handheld resolution, proper handling of suspend and resume, and no dependency on external launchers. Achieving “Verified” status for Steam Deck is valuable for visibility and sales.
For all platforms, submit your build for certification as early as allowed. First submissions commonly fail on edge cases that are difficult to predict. Budget two certification cycles in your timeline: the initial submission and one resubmission after fixing any issues found.
Phase 5: Day-One Patch Planning
A day-one patch is not a safety net for shipping a broken game — it is a practical response to the reality that development continues between gold master submission and the release date. Plan for it deliberately.
Lock the gold master build. The build you submit for certification or manufacturing is your gold master. Once locked, no changes go into this build. Any subsequent fixes go into a separate branch that becomes the day-one patch. This separation is critical because the gold master has been tested and certified; mixing in new changes invalidates that testing.
Define the patch scope. The day-one patch should contain only bug fixes and stability improvements discovered after the gold master was locked. It should not contain new features, balance changes, or content additions. The exception is critical fixes for issues discovered during certification that were waived or missed. Scope discipline is important because the day-one patch receives less testing time than the gold master.
Submit the patch for certification early. Console platforms require that day-one patches pass certification just like the base game. Submit the patch at least one week before launch to allow for certification turnaround. If the patch fails certification, you need time to fix and resubmit without delaying the launch.
Minimize patch size. Players who buy the physical version or download on launch day will encounter the day-one patch immediately. A 500 MB patch is expected. A 20 GB patch tells players the game was not finished when it was shipped, regardless of the actual contents. Structure your build pipeline so that the patch contains only changed files, not the entire game data.
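The "only changed files" rule comes down to comparing content hashes between the gold master and the patched build. A minimal sketch; real pipelines hash files on disk and work from build manifests, but the comparison is the same:

```python
import hashlib

def changed_files(gold_manifest, patched_manifest):
    """Files whose content hash differs from (or is absent in) the gold build.
    Manifests map relative path -> file bytes; real pipelines hash files on disk."""
    def digest(data):
        return hashlib.sha256(data).hexdigest()
    gold = {path: digest(data) for path, data in gold_manifest.items()}
    return sorted(path for path, data in patched_manifest.items()
                  if gold.get(path) != digest(data))

# Hypothetical builds: two files changed, the large audio pak did not.
gold = {"data/levels.pak": b"v1", "bin/game.exe": b"v1", "data/audio.pak": b"v1"}
patched = {"data/levels.pak": b"v2", "bin/game.exe": b"v2", "data/audio.pak": b"v1"}
print(changed_files(gold, patched))
# → ['bin/game.exe', 'data/levels.pak']
```

Only the changed files enter the patch payload; the unchanged audio pak, typically the bulk of a game's footprint, stays out of the download.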
Prepare patch notes. Write clear, honest patch notes that describe what the day-one patch fixes. Players who follow your game will see the patch and want to know what changed. Transparent communication about pre-launch fixes builds trust. Vague patch notes like “various bug fixes and improvements” do the opposite.
Have a rollback plan. In the unlikely event that the day-one patch introduces a critical issue worse than what it fixes, know how to revert to the gold master build on each platform. This situation is rare but devastating if you are not prepared. Document the rollback procedure before launch day.
The Final Week Before Launch
With one week to go, your build should be locked, your patch submitted, and your checklist nearly complete. Use this week for final verification and preparation.
Run a complete smoke test on the final build plus day-one patch on every target platform. Launch the game, start a new save, play for 30 minutes, save and quit, reload and continue, complete a major milestone, and verify that nothing has regressed from the last round of testing.
Verify that your crash reporting and analytics are working on the release build. When thousands of players start playing on day one, you need real-time visibility into crash rates, performance data, and error logs. Test the entire pipeline: trigger a test crash, verify it appears in your dashboard, confirm that the stack trace is symbolicated and readable.
Prepare your support infrastructure. Write a known issues list based on the P2 and P3 bugs you decided not to fix before launch. Set up a community hub or support page where players can report issues. Brief your support team — or yourself, if you are the support team — on the most likely issues players will encounter and the recommended workarounds.
For a deeper dive into post-launch monitoring, see our guide on bug reporting metrics every game studio should track. If you are planning your beta process before the release push, how to run a beta test for your indie game covers the testing phase that feeds into this checklist.
Print this checklist and pin it to your wall three months before launch. Cross off items as you complete them. The items you keep postponing are the ones most likely to bite you on release day.