Quick answer: Define three to four hardware tiers based on your target audience. Minimum spec is the lowest hardware you officially support, typically a five-to-seven-year-old mid-range PC or integrated graphics, targeting 30 FPS at 720p. Recommended spec is the hardware that runs the game well: 60 FPS at 1080p on medium-high settings. Ultra spec targets current-gen hardware at 1440p/4K, with an optional fourth tier for the Steam Deck and other handhelds. Verify each tier against its target with an automated benchmark before every release.

You tested your game on your development machine and it runs great. Then the Steam reviews arrive: “Runs at 15 FPS on my laptop,” “Stutters constantly on my GTX 1060,” “Uses 12 GB of RAM doing nothing.” Testing on a single machine is not benchmarking. Real benchmarking means defining hardware tiers, building automated test scenes, collecting structured metrics, and verifying that every tier meets its performance target before every release. This guide shows you how to build that pipeline from scratch.

Defining Hardware Tiers

Hardware tiers are the foundation of your performance strategy. Each tier represents a class of player hardware with specific performance targets. Most games need three tiers. Some need four.

Minimum Spec
  // The lowest hardware you officially support
  // Target: 30 FPS at 720p on Low settings
  // Example: GTX 1050 / RX 560, i5-4460, 8 GB RAM

Recommended Spec
  // The "good experience" tier
  // Target: 60 FPS at 1080p on Medium-High settings
  // Example: GTX 1660 / RX 5600, Ryzen 5 3600, 16 GB RAM

Ultra Spec
  // Current-gen hardware for maximum visual quality
  // Target: 60 FPS at 1440p/4K on Ultra settings
  // Example: RTX 4070 / RX 7800, Ryzen 7 7700, 32 GB RAM

Steam Deck / Handheld (optional)
  // If targeting handheld PCs
  // Target: 30-40 FPS at 800p on Low-Medium settings
  // Fixed hardware: AMD APU, 16 GB shared memory

Do not guess at these tiers. The Steam Hardware Survey publishes monthly data on what hardware Steam users actually have. As of early 2026, the most common GPU is still the GTX 1650 family, the most common RAM amount is 16 GB, and the most common resolution is 1080p. Your minimum spec should cover at least the 20th percentile of Steam users, and your recommended spec should cover the median.
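As a rough sketch of that percentile logic (the GPU names and shares below are hypothetical illustration values, not real survey numbers), you can order GPUs from weakest to strongest and walk the cumulative share to find where your minimum spec must sit:

```python
# Sketch: find the weakest GPU your minimum spec must cover so that
# everyone above the 20th percentile can play. Shares are HYPOTHETICAL.
gpus_by_power = [            # ordered weakest -> strongest
    ("Integrated", 0.10),
    ("GTX 1050",   0.08),
    ("GTX 1650",   0.22),
    ("GTX 1660",   0.18),
    ("RTX 3060",   0.25),
    ("RTX 4070",   0.17),
]

def min_spec_gpu(distribution, exclude_fraction=0.20):
    """Return the first GPU (weakest -> strongest) whose cumulative share
    crosses the excluded fraction; it and everything stronger must meet
    the minimum-spec target."""
    cumulative = 0.0
    for name, share in distribution:
        cumulative += share
        if cumulative > exclude_fraction:
            return name
    return distribution[-1][0]

print(min_spec_gpu(gpus_by_power))  # with these numbers: GTX 1650
```

The same walk against the real survey data tells you which card to buy for your minimum-spec test machine.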

Building an Automated Benchmark Scene

Manual testing is unreliable. Different testers play differently, visit different areas, and encounter different scenarios. An automated benchmark scene guarantees identical conditions every run, making results comparable across hardware and across builds.

Design the benchmark scene to exercise your game's most demanding scenarios. Include dense geometry, maximum enemy count, particle effects, complex lighting, and any other element that stresses the renderer or CPU. Script a camera path that moves through the scene over 60-90 seconds, covering worst-case areas.

// Example benchmark runner (pseudo-code, adaptable to any engine)

class BenchmarkRunner:
    var frame_times: Array[float] = []
    var duration: float = 60.0
    var current_preset: String
    var camera_path: Path3D
    var path_follow: PathFollow3D

    func _ready():
        # Disable player input, use scripted camera
        camera_path = $BenchmarkPath
        path_follow = $BenchmarkPath/PathFollow3D
        # Force quality preset from command line arg
        current_preset = get_cli_arg("--quality", "medium")
        apply_quality_preset(current_preset)

    func _process(delta):
        # Record frame time
        frame_times.append(delta * 1000.0)  # Convert to ms

        # Advance camera along path
        path_follow.progress_ratio += delta / duration

        # Check if benchmark is complete
        if path_follow.progress_ratio >= 1.0:
            save_results()
            get_tree().quit()

    func save_results():
        frame_times.sort()  # Ascending: best frame first, worst frame last
        # 1% low = average of the worst 1% of frame times, not just the p99 value
        var worst_count = max(1, int(frame_times.size() * 0.01))
        var one_pct_low_ms = average(frame_times.slice(frame_times.size() - worst_count))
        var results = {
            "avg_fps": 1000.0 / average(frame_times),
            "avg_frame_time_ms": average(frame_times),
            "max_frame_time_ms": frame_times[-1],
            "p99_frame_time_ms": percentile(frame_times, 99),
            "one_pct_low_fps": 1000.0 / one_pct_low_ms,
            "total_frames": frame_times.size(),
            "ram_mb": OS.get_static_memory_usage() / 1048576,
            "quality_preset": current_preset,
            "gpu_name": RenderingServer.get_video_adapter_name(),
            "timestamp": Time.get_datetime_string_from_system()
        }
        var file = FileAccess.open("benchmark_results.json", FileAccess.WRITE)
        file.store_string(JSON.stringify(results, "  "))
        file.close()

Key Metrics to Track

Average FPS alone is misleading. A game that alternates between 90 FPS and 30 FPS averages 60 FPS but feels terrible. Track these metrics for every benchmark run:

Average FPS. The headline number. Useful for quick comparison but hides stutter.

1% Low FPS. The average of the worst 1% of frame times, converted to FPS. This represents the experience during the most demanding moments. If your average is 60 FPS but your 1% low is 20 FPS, players will experience noticeable stutter. For a smooth experience, your 1% low should stay at or above half of your average FPS.

0.1% Low FPS. The worst 0.1% of frames. This captures individual stutter spikes from garbage collection, shader compilation, asset streaming, or other one-off hitches. A poor 0.1% low alongside a healthy 1% low points to occasional spikes rather than sustained poor performance.

Maximum frame time. The single worst frame. Anything over 100ms is a visible hitch. Anything over 500ms feels like a freeze.

Memory usage. Both system RAM and VRAM. Track peak usage, not just average. A game that briefly spikes to 15 GB of RAM will crash on 16 GB systems that are running anything else.
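A minimal sketch of these frame-time calculations in Python (the frame-time list in the example is synthetic, not real capture data):

```python
# Sketch: compute the metrics above from a list of frame times in ms.
def benchmark_metrics(frame_times_ms):
    times = sorted(frame_times_ms)            # ascending: best -> worst
    n = len(times)
    avg_ms = sum(times) / n

    def low_fps(fraction):
        # Average of the worst `fraction` of frames, converted to FPS
        worst = times[-max(1, int(n * fraction)):]
        return 1000.0 / (sum(worst) / len(worst))

    return {
        "avg_fps": 1000.0 / avg_ms,
        "one_pct_low_fps": low_fps(0.01),
        "point_one_pct_low_fps": low_fps(0.001),
        "max_frame_time_ms": times[-1],
    }

# Synthetic example: mostly 16 ms frames with a handful of 50 ms stutters.
# Average FPS lands around 61, but the 1% low is only 20 FPS: the
# "misleading average" problem from above, in numbers.
frames = [16.0] * 990 + [50.0] * 10
metrics = benchmark_metrics(frames)
```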

// Example benchmark output (JSON)
{
  "gpu_name": "NVIDIA GeForce GTX 1660 Super",
  "quality_preset": "medium",
  "resolution": "1920x1080",
  "avg_fps": 72.4,
  "one_pct_low_fps": 58.1,
  "point_one_pct_low_fps": 41.3,
  "avg_frame_time_ms": 13.81,
  "max_frame_time_ms": 38.7,
  "total_frames": 4344,
  "ram_peak_mb": 2847,
  "vram_peak_mb": 3102,
  "timestamp": "2026-03-24T14:30:00Z",
  "build": "v0.9.4-beta",
  "result": "PASS"
}

Quality Presets That Scale

Each hardware tier needs a quality preset that hits its performance target. Quality presets adjust multiple settings simultaneously: texture resolution, shadow quality, draw distance, post-processing, anti-aliasing, and particle density. Designing presets that scale well requires understanding which settings have the largest performance impact per visual quality cost.

// Quality preset impact ranking (typical)
// Listed from highest to lowest performance impact

1. Resolution / Render Scale     // Largest single impact on GPU
2. Shadow Quality + Distance     // Shadow maps are expensive
3. Anti-Aliasing Method          // TAA vs FXAA vs MSAA vs off
4. Post-Processing               // Bloom, DOF, motion blur, AO
5. Draw Distance / LOD Bias      // How much geometry is visible
6. Texture Quality               // Affects VRAM, less impact on FPS
7. Particle Density              // Overdraw from particles
8. Vegetation Density            // Draw calls and fill rate

// Low preset: Settings 1-5 at minimum, 6-8 at low
// Medium preset: Everything at medium
// High preset: Everything at high, render scale 100%
// Ultra preset: Everything max, render scale 100-150%

Test each preset on its target hardware tier. Adjust individual settings until the 1% low FPS meets your target. Document the exact settings for each preset in version control so they do not drift accidentally.
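One way to keep presets from drifting is to define them as a single data file or module in version control and push every value through one engine-specific setter. The setting names below are hypothetical placeholders, as is the `apply_setting` callback; map them to your engine's actual options:

```python
# Sketch: quality presets as version-controlled data. Setting names and
# values are hypothetical; tune them on real tier hardware.
PRESETS = {
    "low":    {"render_scale": 0.75, "shadow_quality": 0, "aa": "off",
               "post_fx": False, "draw_distance": 0.5,  "texture_quality": "low"},
    "medium": {"render_scale": 1.0,  "shadow_quality": 1, "aa": "fxaa",
               "post_fx": True,  "draw_distance": 0.75, "texture_quality": "medium"},
    "high":   {"render_scale": 1.0,  "shadow_quality": 2, "aa": "taa",
               "post_fx": True,  "draw_distance": 1.0,  "texture_quality": "high"},
    "ultra":  {"render_scale": 1.25, "shadow_quality": 3, "aa": "taa",
               "post_fx": True,  "draw_distance": 1.0,  "texture_quality": "ultra"},
}

def apply_quality_preset(name, apply_setting):
    """Push every key/value of the named preset through an engine-specific
    setter; apply_setting(key, value) is assumed to exist in your codebase."""
    for key, value in PRESETS[name].items():
        apply_setting(key, value)
```

Because the presets are plain data, a diff in review shows exactly which setting changed and when.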

Automating Benchmark Runs

Running benchmarks manually on multiple machines does not scale. Automate the process with a script that launches the benchmark, collects results, and compares them against targets.

#!/bin/bash
# benchmark_all_presets.sh
# Run on each test machine, collects results to shared directory

GAME="./MyGame.exe"
RESULTS_DIR="//server/benchmarks/$(hostname)/$(date +%Y%m%d)"
mkdir -p "$RESULTS_DIR"

for preset in low medium high ultra; do
    echo "Running benchmark: $preset"
    $GAME --benchmark --quality=$preset \
          --output="$RESULTS_DIR/benchmark_${preset}.json" \
          --duration=60 \
          --resolution=1920x1080
done

# Compare results against targets
python3 compare_benchmarks.py \
    --results="$RESULTS_DIR" \
    --targets=benchmark_targets.json

// benchmark_targets.json
{
  "low": {
    "min_avg_fps": 30,
    "min_one_pct_low_fps": 24,
    "max_ram_mb": 4096,
    "max_vram_mb": 2048
  },
  "medium": {
    "min_avg_fps": 60,
    "min_one_pct_low_fps": 45,
    "max_ram_mb": 8192,
    "max_vram_mb": 4096
  },
  "high": {
    "min_avg_fps": 60,
    "min_one_pct_low_fps": 50,
    "max_ram_mb": 12288,
    "max_vram_mb": 6144
  },
  "ultra": {
    "min_avg_fps": 60,
    "min_one_pct_low_fps": 50,
    "max_ram_mb": 16384,
    "max_vram_mb": 8192
  }
}
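The compare step could look something like the sketch below. The field names match the JSON examples in this guide, but the file layout and the absence of CLI parsing are simplifications:

```python
# Sketch of compare_benchmarks.py: load each preset's results and check
# them against the targets file. No argument parsing or error handling.
import json
import pathlib

def check_run(results, targets):
    """Return a list of failure strings; an empty list means PASS."""
    failures = []
    if results["avg_fps"] < targets["min_avg_fps"]:
        failures.append(f"avg_fps {results['avg_fps']} < {targets['min_avg_fps']}")
    if results["one_pct_low_fps"] < targets["min_one_pct_low_fps"]:
        failures.append(f"one_pct_low_fps {results['one_pct_low_fps']}"
                        f" < {targets['min_one_pct_low_fps']}")
    if results["ram_peak_mb"] > targets["max_ram_mb"]:
        failures.append(f"ram_peak_mb {results['ram_peak_mb']} > {targets['max_ram_mb']}")
    if results["vram_peak_mb"] > targets["max_vram_mb"]:
        failures.append(f"vram_peak_mb {results['vram_peak_mb']} > {targets['max_vram_mb']}")
    return failures

def compare(results_dir, targets_path):
    """Check every preset found in results_dir; return True if all pass."""
    targets = json.loads(pathlib.Path(targets_path).read_text())
    all_pass = True
    for preset, target in targets.items():
        path = pathlib.Path(results_dir) / f"benchmark_{preset}.json"
        if not path.exists():
            continue  # a machine may not run every preset
        failures = check_run(json.loads(path.read_text()), target)
        print(f"{preset}: " + ("PASS" if not failures else "FAIL: " + "; ".join(failures)))
        all_pass = all_pass and not failures
    return all_pass
```

Wiring `compare()` to `sys.argv` and a non-zero exit code makes the script usable from the bash loop above.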

Tracking Performance Over Time

A single benchmark run is a snapshot. The real value comes from tracking results over time across builds. Store benchmark results in a database or structured file system. Graph the metrics per build. When average FPS drops by 5% between builds, you know exactly which change caused the regression.

Integrate benchmarking into your CI pipeline. Run the benchmark on a dedicated test machine after every merge to main. If the results drop below targets, flag the build. This catches performance regressions before they reach playtesters or players.
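The build-over-build comparison can be a few lines on top of the stored results. The sketch below uses a 5% threshold to match the example above; how you load the previous build's numbers depends on where you store them:

```python
# Sketch: flag presets whose avg_fps regressed more than 5% versus the
# previous build. The preset -> avg_fps mapping is an assumed storage shape.
def find_regressions(previous, current, threshold=0.05):
    """previous/current map preset name -> avg_fps. Returns presets whose
    FPS dropped by more than `threshold` relative to the previous build,
    with the drop expressed as a percentage."""
    regressions = {}
    for preset, prev_fps in previous.items():
        cur_fps = current.get(preset)
        if cur_fps is None:
            continue  # preset not benchmarked in the current build
        drop = (prev_fps - cur_fps) / prev_fps
        if drop > threshold:
            regressions[preset] = round(drop * 100, 1)  # percent drop
    return regressions

prev = {"low": 34.0, "medium": 72.4, "high": 64.0}
curr = {"low": 33.5, "medium": 61.0, "high": 63.8}
print(find_regressions(prev, curr))  # {'medium': 15.7}
```

Run this against every merge and the offending commit is never more than one build away.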

“We added automated benchmarks to our CI six months before launch. We caught 23 performance regressions that would have shipped without them. The biggest was a particle system change that cut FPS in half on minimum spec hardware. It took thirty minutes to fix because we caught it the same day it was committed.”

Using Steam Hardware Survey Data Effectively

The Steam Hardware Survey is published monthly and provides real data on what your players actually use. Key data points to reference when defining your hardware tiers: the GPU distribution shows that mid-range GPUs from 2-3 generations ago dominate; the RAM distribution confirms 16 GB as the median; the CPU core count shows that 6-8 cores is most common; and the OS distribution helps you prioritize platform testing.

Cross-reference the survey with your own telemetry if you have it. Your actual player base may skew differently from the Steam average, especially for certain genres. Strategy games tend to attract players with more powerful hardware. Casual games have more players on integrated graphics and laptops.

Related Resources

For engine-specific profiling, see profiling frame rate drops in Unity, Unreal Insights profiling guide, or Godot performance profiling for beginners. For GPU-specific analysis, read GPU profiling for game developers.

Build a 60-second benchmark scene this week. Run it on the weakest hardware you have access to. That single test will tell you more about your game's real-world performance than any amount of testing on your development machine.