Reviewed and maintained by Jacob Dymond

Founder, ScreenDetect

Last reviewed March 24, 2026

ScreenDetect testing methodology

This page documents how ScreenDetect runs browser-based display checks, how a result becomes trustworthy enough to document, and where the workflow stops.

ScreenDetect is a visual reference workflow, not a display lab. The standard is to control the setup, repeat the observation, capture evidence, and avoid stronger hardware or warranty claims than the route can honestly support.

Scope and limits

A methodology is useful only if it states both what the workflows can establish and what they must not be used to claim.

ScreenDetect can establish
Visible defect signals under controlled viewing conditions: fixed pixel defects, dark-room bleed patterns, image retention, and mild burn-in indicators.
ScreenDetect cannot establish
Direct hardware measurements such as luminance, gamut, electrical fault state, or model-specific manufacturer pass and fail thresholds.
Review rule
A route is only trusted when the viewing setup is controlled, the result survives a retest, and the evidence can be checked later by the user or support.

Testing baseline

These conditions make the result more trustworthy and make later evidence more useful.

Display mode

Use fullscreen patterns with browser zoom at normal scale and no visible UI overlays, taskbars, or notifications.
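As a minimal sketch of this baseline, a pattern page might build an overlay style like the one below. The function name and style choices here are illustrative assumptions, not ScreenDetect's actual code; a real page would attach the style to an overlay element and then call element.requestFullscreen() so taskbars and browser chrome leave the frame.

```typescript
// Hypothetical helper: CSS for a solid-color overlay that covers the viewport
// edge to edge, with nothing left on screen but the pattern itself.
function patternStyle(color: string): string {
  return [
    "position: fixed",
    "inset: 0",              // cover the whole viewport, no gaps at the edges
    `background: ${color}`,
    "z-index: 2147483647",   // sit above any remaining page UI
    "cursor: none",          // keep the pointer out of the pattern
  ].join("; ");
}
```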

Lighting

Backlight bleed checks belong in a dark room. Pixel and burn-in checks should still avoid glare, window reflections, and mixed lighting.

Brightness

Run the stress pass at high brightness first to surface defects, then confirm at normal daily-use brightness before judging practical severity.

Viewing distance

Judge severity from normal viewing distance first. Nose-to-screen inspection can help locate a defect, but it should not be the only basis for a keep or return decision.

Repeatability

A finding is treated as stronger when it stays in the same location across multiple passes, patterns, or retests under the same setup.

Documentation minimum

Capture one full-context image, one close view, and the setup details that affect interpretation: device model, brightness, room conditions, and whether the result persisted after retesting.
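The documentation minimum can be sketched as a simple completeness check. The record shape and field names below are illustrative assumptions, not a ScreenDetect schema.

```typescript
// Hypothetical evidence record covering the documentation minimum.
interface EvidenceRecord {
  fullContextImage: boolean;     // one full-context shot exists
  closeViewImage: boolean;       // one close view exists
  deviceModel?: string;
  brightness?: string;           // e.g. "stress pass, then 50% daily"
  roomConditions?: string;       // e.g. "dark room, no glare"
  persistedAfterRetest?: boolean;
}

// Returns the list of missing items; an empty list means the minimum is met.
function missingEvidence(r: EvidenceRecord): string[] {
  const missing: string[] = [];
  if (!r.fullContextImage) missing.push("full-context image");
  if (!r.closeViewImage) missing.push("close view image");
  if (!r.deviceModel) missing.push("device model");
  if (!r.brightness) missing.push("brightness setting");
  if (!r.roomConditions) missing.push("room conditions");
  if (r.persistedAfterRetest === undefined) missing.push("retest outcome");
  return missing;
}
```

A support or warranty conversation goes faster when this list is empty before the claim is filed.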

Workflow matrix

Each route has its own signal, false-positive pattern, retest rule, evidence burden, and stop point. The point of this matrix is comparison, not decoration.

Pixel Test

Primary signal
Fixed black, bright, or wrong-color points that remain in the same panel location across solid color patterns.
False positive risk
Dust, smudges, pressure marks, zoom artifacts, and panel texture can all be misread as point defects on a first pass.
Retest rule
Treat the result as stronger only when the same point remains visible across multiple colors and repeated passes.
Evidence to keep
Save one fullscreen context shot and one close shot that shows the exact pixel location on the panel.
Stop point
If the point is inconsistent, moves, or disappears after cleaning and retesting, do not classify it as a stable defect yet.
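The pixel-test retest rule above amounts to a location-stability check across passes. The sketch below assumes each pass produces a list of marked candidate points; the names, types, and the three-pass minimum are illustrative, not ScreenDetect's implementation.

```typescript
// A candidate point on the panel, in pattern coordinates.
type Point = { x: number; y: number };

// A candidate is treated as a stable defect only if the same coordinates
// were marked on every pass (different solid colors, repeated runs), and
// only after a minimum number of passes has been completed.
function isStableDefect(
  passes: Point[][],
  candidate: Point,
  minPasses = 3
): boolean {
  if (passes.length < minPasses) return false; // not enough repeats yet
  return passes.every(pass =>
    pass.some(p => p.x === candidate.x && p.y === candidate.y)
  );
}
```

A point that vanishes on any pass, or moves after cleaning, fails this check and stays in the "retest" state rather than being classified.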

Backlight Bleed Test

Primary signal
Fixed edge or corner light patches on a black pattern in a dark room, especially when they remain intrusive at normal viewing distance.
False positive risk
IPS glow, reflections, and camera overexposure can exaggerate or even mimic a bleed problem.
Retest rule
Treat the result as stronger when the bright area stays fixed after a small viewing-angle change and a second pass.
Evidence to keep
Capture one normal-distance dark-room photo, then close context shots of each affected edge or corner.
Stop point
If the brightness shifts mainly with angle, treat it as a glow problem first rather than escalating immediately as fixed bleed.
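The glow-versus-bleed distinction above is essentially an angle-dependence test. The heuristic below assumes two relative brightness readings of the same patch, head-on and after a small angle change; the 30% threshold is an illustrative assumption, not a calibrated value.

```typescript
// If a bright patch dims or shifts substantially with a small viewing-angle
// change, treat it as likely glow first; if it stays fixed, treat it as a
// candidate for fixed bleed and continue the retest sequence.
function classifyBrightPatch(
  onAxisIntensity: number,   // relative brightness viewed head-on
  offAxisIntensity: number,  // same patch after a small angle change
  glowRatioThreshold = 0.3   // illustrative cutoff, not a measured constant
): "likely-glow" | "candidate-bleed" {
  const change =
    Math.abs(onAxisIntensity - offAxisIntensity) / onAxisIntensity;
  return change > glowRatioThreshold ? "likely-glow" : "candidate-bleed";
}
```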

Burn-In Test

Primary signal
Stationary ghosting, tint zones, or luminance dips that remain visible on gray and color patterns.
False positive risk
Temporary image retention, reflections, and dirty panels can look worse than they are on a single pass.
Retest rule
Retest after varied content at normal brightness. A result that fades materially is treated differently from one that does not change.
Evidence to keep
Record the pattern used, prior static-content exposure if known, and whether the artifact changed after retesting.
Stop point
Do not describe a single-pass retention artifact as permanent burn-in unless it survives the retest sequence with little change.
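The burn-in retest rule can be sketched as a before-and-after comparison of artifact visibility. The visibility scale and the "fades materially" threshold below are illustrative assumptions for the sketch.

```typescript
// Compare artifact visibility on the first pass against visibility after a
// varied-content retest at normal brightness. An artifact that loses most of
// its visibility is treated as temporary retention, not permanent burn-in.
function classifyRetention(
  firstPassVisibility: number, // 0 = invisible, 1 = strongly visible
  retestVisibility: number,
  fadeThreshold = 0.5          // "fades materially" = loses at least half
): "temporary-retention" | "possible-burn-in" {
  if (firstPassVisibility <= 0) return "temporary-retention";
  const remaining = retestVisibility / firstPassVisibility;
  return remaining < fadeThreshold ? "temporary-retention" : "possible-burn-in";
}
```

Even a "possible-burn-in" result here is a trigger for the full retest sequence, not a final diagnosis.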

Stuck Pixel Fixer

Primary signal
This route is for unstable stuck states, not confirmed dead pixels. It is a repair attempt, not a diagnostic proof on its own.
False positive risk
Users often keep retrying on old or physically failed pixels that were never plausible repair candidates.
Retest rule
If several controlled sessions show no meaningful change, ScreenDetect treats that as a stop signal rather than a reason to keep retrying forever.
Evidence to keep
Keep before and after close-ups, number of sessions, and whether the pixel behavior changed at all.
Stop point
If the pixel remains unchanged after repeated controlled sessions, move to replacement or warranty logic instead of indefinite repair loops.
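The stop rule above can be expressed as a count of consecutive no-change sessions. The session limit of three is an illustrative assumption, not a ScreenDetect constant.

```typescript
// Each entry records whether a controlled fixer session produced any
// meaningful change in the pixel's behavior.
function shouldStopFixing(
  sessionChanged: boolean[],
  maxUnchanged = 3 // illustrative limit on consecutive no-change sessions
): boolean {
  // Count consecutive no-change sessions from the most recent backwards.
  let unchanged = 0;
  for (let i = sessionChanged.length - 1; i >= 0; i--) {
    if (sessionChanged[i]) break;
    unchanged++;
  }
  return unchanged >= maxUnchanged;
}
```

Once this returns true, the honest next step is replacement or warranty logic, not another repair loop.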

Burn-In Fixer

Primary signal
This route is designed for temporary retention or light image memory. It should not be presented as a guaranteed fix for permanent panel wear.
False positive risk
Users can mistake any temporary improvement for a permanent fix when the panel still needs follow-up retesting.
Retest rule
If the artifact barely changes across controlled sessions and retests, the route shifts from mitigation to monitoring, support, or service.
Evidence to keep
Keep before and after captures on the same pattern and brightness so any change can be judged honestly.
Stop point
Do not keep running aggressive sessions as a substitute for service advice once change has plateaued.
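The plateau condition above can be sketched as a check on recent session-to-session improvement. The severity scale, minimum delta, and window size are illustrative assumptions.

```typescript
// severityBySession: artifact severity after each mitigation session,
// where lower is better. Improvement has plateaued when the last few
// sessions each reduced severity by less than minDelta.
function hasPlateaued(
  severityBySession: number[],
  minDelta = 0.05, // illustrative "meaningful change" threshold
  window = 2       // how many recent session-to-session steps to inspect
): boolean {
  if (severityBySession.length < window + 1) return false;
  const recent = severityBySession.slice(-(window + 1));
  for (let i = 1; i < recent.length; i++) {
    if (recent[i - 1] - recent[i] >= minDelta) return false; // still improving
  }
  return true;
}
```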

Evidence and proof

A good methodology page should show how ScreenDetect reviews its own claims and what a user needs to preserve when the next step is support, return, or warranty.

What ScreenDetect verifies before publishing guidance

  1. The page states the viewing setup and retest rule, not only the conclusion.
  2. Each workflow includes a failure mode and a stop point, not only a success path.
  3. Visible review metadata identifies who maintains the page and when it was last reviewed.
  4. Internal route links lead to matching setup guidance rather than generic marketing summaries.

What a useful claim package should include

  1. A fullscreen context image or video that shows the defect in the same state the user saw it.
  2. A close image that confirms location without losing panel context.
  3. Device model, panel type if known, brightness setting, and room conditions.
  4. A note on what changed after retesting, after varied content, or after a repair cycle.
  5. Any manufacturer policy language or support requirement that affects the next action.

Common confounders

These are the reasons ScreenDetect pushes retesting and conservative language before users move into repair, exchange, or warranty action.

  1. Reflections, smudges, and dust that move with cleaning or viewing angle.
  2. Camera exposure that makes dark-room bleed or retention look more severe than it is in person.
  3. Auto-brightness, HDR, local dimming, night modes, or adaptive color settings left enabled during the pass.
  4. Judging a result only from extreme close range instead of normal use distance.
  5. Single-pass conclusions with no retest after the room, brightness, or content changes.
  6. Applying the wrong workflow to the wrong symptom, such as treating retention like a dead pixel or IPS glow like fixed bleed.

Sources and review

Last reviewed March 24, 2026. This page is maintained as a technical reference for browser-based display checks and updated when route behavior or evidence guidance changes materially.

  1. LCD and OLED Pixel Defect Policy Overview. ISO 9241-302/303/305/307 (standards family reference). https://www.iso.org/standard/52280.html
  2. Dell Display Pixel Guidelines. Dell Support. https://www.dell.com/support/kbdoc/en-us/000126004/dell-display-pixel-guidelines
  3. LG OLED TV - Pixel Cleaning and Burn-In Support Guidance. LG Support. https://www.lg.com/us/support/help-library/lg-oled-tv-run-pixel-cleaning-to-remove-screen-burn-ins-spots-lines-dots-CT10000018-20154768393287
  4. Sony BRAVIA OLED - Panel Refresh Guidance. Sony Support. https://www.sony.com/electronics/support/articles/00173467
  5. Long-term Burn-In Test Results. RTINGS. https://www.rtings.com/tv/tests/longevity-burn-in-test-updates-and-results