Quick answer: Mark inputs [ReadOnly]; chain dependent jobs through the JobHandle dependency parameter; partition writes so no two threads touch the same index. Reach for [NativeDisableParallelForRestriction] (or, more drastically, [NativeDisableContainerSafetyRestriction]) only when you have proven the partitioning.

You schedule two IJobParallelFor jobs that both touch positions[i]. Unity throws “The previously scheduled job writes to the NativeArray — a parallel job cannot read or write while a job is scheduled to write.” The safety check is correct; the fix is to chain the jobs or partition the writes, not to disable it.
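A minimal reproduction of the conflict, assuming a hypothetical WriteJob struct that writes positions[i] in its Execute:

// Both schedules target the same array with no dependency between them.
var positions = new NativeArray<float3>(N, Allocator.TempJob);
var h1 = new WriteJob { positions = positions }.Schedule(N, 64);
// The second Schedule does not pass h1 as a dependency, so the
// safety system throws InvalidOperationException here:
var h2 = new WriteJob { positions = positions }.Schedule(N, 64);

Passing h1 as the third argument to the second Schedule call resolves it.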

The Symptom

InvalidOperationException at Schedule time mentioning a data race or read/write conflict. Or, with safety checks disabled, hangs and silently corrupted data (disabling checks is not a fix).

The Fix

Pattern 1: [ReadOnly] inputs + a dependency chain.

[BurstCompile]
struct ComputeForcesJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> positions;
    public NativeArray<float3> forces;
    // reads positions, writes only forces[i] (force computation elided)
    public void Execute(int i) { forces[i] = /* ... */ default; }
}

[BurstCompile]
struct ApplyForcesJob : IJobParallelFor
{
    [ReadOnly] public NativeArray<float3> forces;
    public NativeArray<float3> positions;
    public float dt;
    public void Execute(int i) { positions[i] += forces[i] * dt; }
}

var h1 = computeJob.Schedule(N, 64);   // arrayLength, innerloopBatchCount
var h2 = applyJob.Schedule(N, 64, h1); // h1 is the dependency
h2.Complete();                         // blocks until both jobs finish

[ReadOnly] on inputs lets the safety system see that ComputeForcesJob only reads positions, so the array can be read in parallel. Passing h1 as the dependency guarantees ApplyForcesJob starts only after h1 completes, so its writes to positions never overlap the reads.
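When a job consumes the outputs of several upstream jobs, merge their handles with JobHandle.CombineDependencies before scheduling. A sketch, where computeMassesJob is a hypothetical second producer:

var hForces = computeForcesJob.Schedule(N, 64);
var hMasses = computeMassesJob.Schedule(N, 64);
// applyJob waits for both producers:
var hApply = applyJob.Schedule(N, 64, JobHandle.CombineDependencies(hForces, hMasses));
hApply.Complete();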

Pattern 2: ParallelWriter. For NativeList/NativeQueue containers accumulating results across threads, use AsParallelWriter():

[BurstCompile]
struct AppendJob : IJobParallelFor
{
    public NativeList<int>.ParallelWriter writer;
    public void Execute(int i) { writer.AddNoResize(i * 2); }
}

ParallelWriter makes concurrent appends thread-safe. Note that AddNoResize never grows the list, so reserve enough capacity when you allocate it.
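A usage sketch for the job above — capacity is reserved up front because AddNoResize does not grow the list:

var results = new NativeList<int>(N, Allocator.TempJob); // capacity N
new AppendJob { writer = results.AsParallelWriter() }.Schedule(N, 64).Complete();
// results now holds the appended values, in nondeterministic order
results.Dispose();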

Verifying

Run with safety checks enabled: no data-race exception should be thrown. The Profiler’s Jobs/Timeline view should show the expected parallelism across worker threads. Disable safety checks only for measured benchmarks of code already proven correct.

“ReadOnly. Chain. Partition. Safety stays happy.”

Related Issues

For Burst BC1006, see Burst. For Deallocate errors, see deallocate.

Right access pattern, right attributes: safe parallelism.