film scanning pipeline

Electron count → display brightness

How a raw photosite fill value becomes a pixel on screen — and why gamma encoding is the difference between correct and crushed.

[interactive chart: brightness vs electron count. Display brightness (cd/m²) against electron count (14-bit DN), with two curves ("linear, no gamma" and "sRGB gamma 2.2"), a draggable probe showing a screen patch plus readouts (electrons, DN / 16383, linear 8-bit, gamma 8-bit), and a gamma 2.2 encode toggle.]

50% · of well capacity (8,192 e⁻) renders at only ~22 cd/m² linearly, about 22% of peak brightness. With gamma encode: 50 cd/m².
2.2 · Display gamma assumed by the sRGB standard. The monitor raises every value it receives to the power 2.2 (v^2.2); gamma encoding pre-compensates with v^(1/2.2).
16-bit · TIFFs from camera scanners must carry an embedded ICC profile. Without one, viewers assume the data is already gamma-encoded, so linear values meet the display's 2.2 curve uncompensated: the same crush as no gamma.
shadow compression — where the damage is worst
Drag the probe to the left (low electron counts). This is where shadow detail from a color negative lands after inversion: the densest parts of the film transmit the least light, so photosites fill to only a small fraction of their capacity. Without gamma encoding, the display applies its own power curve on top of an already-dark linear value, in effect squaring it into near-black. A shadow at 10% well fill renders at under 1% display brightness linearly (0.1^2.2 ≈ 0.006), almost indistinguishable from black. With sRGB gamma encoding, the same value is stored at about 35% of full code value (0.1^(1/2.2) ≈ 0.35), so the display's power curve returns it to 10% luminance: a legible dark grey, which is perceptually correct.
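A minimal sketch of the probe math in Python, treating the display as a pure 2.2 power law (ignoring sRGB's linear toe) and using the 16,384 e⁻ full well and 14-bit DN range implied by the numbers above; the function and constants are illustrative, not part of any scanner API:

```python
FULL_WELL_E = 16384      # electrons at saturation (implied by 8,192 e- = 50%)
ADC_MAX = 16383          # 14-bit digital number ceiling
DISPLAY_GAMMA = 2.2      # power curve the monitor applies to its input
PEAK_CDM2 = 100.0        # peak white of the display, in cd/m^2

def displayed_luminance(electrons: float, gamma_encode: bool) -> float:
    """Luminance (cd/m^2) reaching the eye for a given photosite fill."""
    dn = round(electrons / FULL_WELL_E * ADC_MAX)  # linear ADC readout
    signal = dn / ADC_MAX                          # normalize to 0..1
    if gamma_encode:
        signal = signal ** (1 / DISPLAY_GAMMA)     # pre-compensate (encode)
    return PEAK_CDM2 * signal ** DISPLAY_GAMMA     # monitor's power curve

print(displayed_luminance(1638, gamma_encode=False))  # ~0.6 cd/m2: crushed
print(displayed_luminance(1638, gamma_encode=True))   # ~10 cd/m2: legible
```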
the midpoint gap — 50% well ≠ 50% brightness
At exactly half the well capacity (8,192 electrons, DN 8192) a linear signal delivers only ~22 cd/m² on a 100 cd/m² display: 0.5^2.2 ≈ 0.22, so 22% of peak, not 50%. The gamma-encoded version delivers ~50 cd/m², which is what the eye expects from a "middle grey." This gap is not a bug in the sensor or the display; it is the predictable consequence of the display's assumed input encoding. The sensor is linear by design; the display is perceptual by design; gamma encoding bridges them.
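The same gap shows up in the chart's 8-bit readouts. A short sketch under the same pure 2.2 assumption, with plain rounding to 8 bits:

```python
dn = 8192                  # half of the 14-bit range
signal = dn / 16383        # ~0.50 normalized well fill

linear_8bit = round(signal * 255)               # 128, stored linearly
gamma_8bit = round(signal ** (1 / 2.2) * 255)   # 186, stored gamma-encoded

# percentage of peak brightness after the monitor applies its 2.2 curve
print(round((linear_8bit / 255) ** 2.2 * 100))  # ~22: the midpoint gap
print(round((gamma_8bit / 255) ** 2.2 * 100))   # ~50: middle grey as expected
```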
why 16-bit TIFFs need an ICC profile
When a camera scanner exports a 16-bit TIFF without an embedded color profile, the operating system and most viewers assume the data is already gamma-encoded (sRGB). If raw linear scan data is stored untagged, it receives none of the v^(1/2.2) pre-compensation the display expects, so the monitor's 2.2 power curve lands on linear values directly and shadows collapse toward black: exactly the no-gamma crush from the chart. The fix is either to embed a linear-gamma ICC profile (so a color-managed viewer applies the correct transform), or to apply the gamma encode in the scanning software before export and tag the output as sRGB.
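A sketch of the second fix, assuming the scan sits in memory as a 16-bit numpy array of linear values. It applies the actual piecewise sRGB encode (a linear toe below 0.0031308, a 1/2.4 power above) rather than the pure 2.2 approximation; writing the TIFF and embedding the profile are left to whatever export library the pipeline already uses:

```python
import numpy as np

def srgb_encode_16bit(linear: np.ndarray) -> np.ndarray:
    """Apply the sRGB transfer curve to linear 16-bit scan data."""
    x = linear.astype(np.float64) / 65535.0     # normalize to 0..1
    encoded = np.where(
        x <= 0.0031308,                         # linear toe near black
        12.92 * x,
        1.055 * np.power(x, 1 / 2.4) - 0.055,   # power segment
    )
    return np.round(encoded * 65535.0).astype(np.uint16)
```

The output still spans the full 16-bit range; only the distribution of code values changes, spending far more of them on the shadows, and the result can then be tagged sRGB at export.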
toggle "gamma 2.2 encode" off — this is raw linear export
With gamma encoding disabled, the chart shows what happens when a scanning application hands raw linear values straight to the display pipeline. Both the curve shape and the screen patch show the result: highlights compress near peak brightness while shadows are crushed to near-black. This is also the default rendering when a 16-bit TIFF is opened in an unmanaged viewer (a web browser without color management, a basic image viewer) that ignores the embedded profile. The curve is not wrong, and the data is correct; the display context simply assumes a different encoding.
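To reproduce the toggle outside the demo, the same ramp can be written to disk both ways and opened in any viewer; numpy and Pillow are assumed, and the filenames are illustrative:

```python
import numpy as np
from PIL import Image

# one row of well-fill fractions from 0 to 1, tiled into a visible strip
ramp = np.tile(np.linspace(0.0, 1.0, 512), (64, 1))

linear_8bit = np.round(ramp * 255).astype(np.uint8)              # raw linear export
gamma_8bit = np.round(ramp ** (1 / 2.2) * 255).astype(np.uint8)  # gamma 2.2 encode

Image.fromarray(linear_8bit).save("ramp_linear.png")  # left half sinks to black
Image.fromarray(gamma_8bit).save("ramp_gamma.png")    # perceptually even steps
```

Opened side by side, the linear file spends most of its left half in near-black: the chart's no-gamma curve made visible.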