Our television reviews are extensive, combining technical laboratory tests with real-world viewing experience. Using some of the most advanced calibration tools available, we measure the amount of light a television puts out, how dark it can get while still generating a picture, how accurate its colors are out of the box, and how much input lag it generates.
(Credit: Imaging Science Foundation, THX)
As PCMag’s lead home entertainment and TV analyst, I’m an Imaging Science Foundation Level III-certified TV calibrator and a THX Level I home theater expert, and this training has informed the development and ongoing improvements of our testing methodology.
TV Testing Equipment
We measure light and color output using a Klein K-10A colorimeter. This device measures the light a TV produces—both how bright that light is (luminance), in candelas per square meter (cd/m², or nits), and the type and amount of color in that light (hue and saturation). To ensure we’re getting consistent light and color levels between TVs, we use a Murideo SIX-G signal generator, which outputs specific images to the screen with both standard dynamic range (SDR) and high dynamic range (HDR) signals.
Klein K-10A colorimeter and Murideo SIX-G signal generator (Credit: Will Greenwald)
We profile each TV with Portrait Displays’ Calman software, a professional calibration suite for monitors, TVs, and other screens. It processes all measurements from the meters and enables us to produce our color charts for each review.
We use a Leo Bodnar 4K Video Signal Lag Tester to measure input lag, the time between the screen receiving a signal and the picture updating.
Testing TV Picture Quality
While a full calibration can produce the most accurate results from any TV, we prefer to evaluate the experience most consumers will get out of the box, under the best settings available on the TV without any granular adjustments or additional equipment. When we set up a new TV for testing, we generally place it in its “Movie,” “Cinema,” or “Calibrated” mode. If that mode isn’t available, we select the default picture mode and set the white balance to the warmest available setting. We also disable any energy-saving modes and light sensor features to ensure we get the full span of light the TV can put out for contrast measurements.

We measure black level and peak brightness using a consistent selection of test patterns produced by our signal generator. For LED-backlit TVs, we measure peak brightness with both a full-screen white field and an 18% white field (a white rectangle covering 18% of the screen, centered on a black field); some TVs can boost their backlight in limited zones while others can’t, so we note both measurements when considering contrast. OLED TVs get consistently brighter as less of the screen is lit, so for comparison with all other TVs we use only the 18% white field, though we also note brightness with a smaller 10% field when comparing OLEDs against each other.
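The signal generator supplies these window patterns for us, but the geometry is simple enough to sketch. Here is an illustrative Python snippet (the helper names are ours, not part of any test software) that builds a centered square window covering a given fraction of the frame and confirms the lit area:

```python
import math

def make_window_pattern(width, height, window_fraction):
    """Build a binary test pattern: a centered white square window
    covering window_fraction of the screen area, on a black field."""
    side = int(round(math.sqrt(window_fraction * width * height)))
    x0 = (width - side) // 2
    y0 = (height - side) // 2
    return [
        [1 if x0 <= x < x0 + side and y0 <= y < y0 + side else 0
         for x in range(width)]
        for y in range(height)
    ]

def white_fraction(pattern):
    """Fraction of pixels that are lit (white) in the pattern."""
    total = sum(len(row) for row in pattern)
    lit = sum(sum(row) for row in pattern)
    return lit / total

# A scaled-down 4K frame (384x216) with an 18% white window.
pattern = make_window_pattern(384, 216, 0.18)
print(round(white_fraction(pattern), 2))  # 0.18
```

The same function with `window_fraction=0.10` yields the smaller 10% window we use when comparing OLEDs against each other.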
This is our full test setup, running a color test. During real testing, the room would be much darker, but we brightened it here so you can see how we do it. (Photo: Will Greenwald)
For the black level, we use a partial white field and measure based on the black parts of the frame. This ensures an accurate black level reading and a realistic contrast measurement since many TV backlights simply turn off when receiving a signal that’s completely black. To calculate contrast ratio, we divide the peak brightness by the black level; for OLED TVs that generate no light in black sections of the screen due to the panel technology, we consider effective contrast to be “infinite.” LED TVs with local dimming arrays have proven to produce effectively perfect blacks as well, so for most high-end TVs, the technical contrast ratio has become less important than brightness and how well a TV visually handles very dark objects in video content.
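The contrast arithmetic above can be expressed in a few lines of Python. This is purely illustrative (the function name is ours, not part of our test software); it divides peak brightness by black level and treats a zero black level as infinite contrast, as we do for OLEDs:

```python
def contrast_ratio(peak_nits, black_nits):
    """Contrast ratio = peak brightness / black level, both in cd/m^2.
    A black level of zero (e.g. OLED, or LED zones shutting off fully)
    yields effectively infinite contrast."""
    if black_nits <= 0:
        return float("inf")
    return peak_nits / black_nits

print(contrast_ratio(600.0, 0.05))  # 12000.0
print(contrast_ratio(800.0, 0.0))   # inf
```

The example numbers are hypothetical; a 600-nit peak against a 0.05-nit black level works out to a 12,000:1 contrast ratio.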
Above is an example chart of TV peak brightness. This chart was made for our review of the Hisense U6N, our Editors’ Choice for budget TVs. The measurements in this chart are based on an HDR10 signal in the TV’s Movie mode. Generally, TVs display much higher contrast with an HDR signal, though it’s also important to know how they will perform with older, non-HDR content, so we also take measurements with SDR signals, which we note in reviews.
After we measure peak brightness and black level, we move on to color testing. We display full fields of white, red, green, blue, cyan, yellow, and magenta and compare their measurements against ideal color targets. We again use both SDR and HDR signals here, comparing color performance in both cases with their most relevant color spaces: For SDR, we see how close color levels come to the Rec.709 broadcast standard color space. For HDR, we use the much wider DCI-P3 digital cinema color space.
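To give a sense of how much wider DCI-P3 is than Rec.709, here is a small Python sketch using the published CIE 1931 xy chromaticity coordinates of each space's primaries. The shoelace-area comparison is our illustration, not how Calman scores color performance:

```python
# CIE 1931 xy chromaticity coordinates of each color space's
# red, green, and blue primaries.
REC709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
DCI_P3 = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]

def triangle_area(points):
    """Shoelace formula for the area of a gamut triangle
    on the chromaticity diagram."""
    (x1, y1), (x2, y2), (x3, y3) = points
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# DCI-P3's triangle is roughly a third larger than Rec.709's.
print(round(triangle_area(DCI_P3) / triangle_area(REC709), 2))  # 1.36
```

This is why the HDR chart's target triangle looks noticeably bigger than the SDR chart's: the DCI-P3 triangle encloses about 36% more of the chromaticity diagram than Rec.709.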
(Credit: PCMag)
Above are the color measurements we took for the Hisense U6N. The two charts show the TV’s color levels with an SDR signal and with an HDR signal as dots, and their respective target color spaces as triangles.
Testing Input Lag
With luminance and color measurements recorded, we then measure input lag using the Bodnar 4K Video Signal Lag Tester. It’s a small box that generates randomly flashing patterns on the screen, and a light sensor on the box times how long the screen takes to respond to each flash. We test lag with both 4K60 and 1080p120 signals and consider a latency of less than a single frame (16.7 milliseconds at 60Hz, 8.3ms at 120Hz) to be best for gaming.
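The one-frame threshold follows directly from the refresh rate. A quick Python sketch (the function names are ours, for illustration only):

```python
def frame_time_ms(refresh_hz):
    """Duration of one frame in milliseconds at a given refresh rate."""
    return 1000.0 / refresh_hz

def good_for_gaming(lag_ms, refresh_hz):
    """Lag under one frame time is what we consider best for gaming."""
    return lag_ms < frame_time_ms(refresh_hz)

print(round(frame_time_ms(60), 1))   # 16.7
print(round(frame_time_ms(120), 1))  # 8.3
print(good_for_gaming(12.4, 60))     # True
print(good_for_gaming(12.4, 120))    # False
```

Note that the same hypothetical 12.4ms result passes the 60Hz bar but not the stricter 120Hz bar, which is why we report both measurements.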
This is a new input lag testing method we’re implementing. We previously relied on an HDFury 4K Diva 18Gbps HDMI matrix with an Xbox One X as a video source. That method also used a sensor placed in front of the screen: the Diva flashed a box between black and white, and the sensor measured how long the screen took to update. The Diva is limited to 1080p60 output signals, while our new Bodnar testing device lets us test higher resolutions and refresh rates.
Our previous input lag testing process with an HDFury Diva HDMI matrix, using an Xbox One X as a source. The Diva displays the box in the middle, and the sensor measures the time between when a signal is sent and the box flashes. (Photo: Will Greenwald)
The two pieces of test equipment use different methods to generate their test signals, so their numbers aren’t directly comparable. TVs tested before 2025 were measured with the Diva; starting this year, all TV reviews will include two input lag charts, for 4K60 and 1080p120, measured with the Bodnar tester alone.
Evaluating the Viewing Experience
Finally, it’s time for some real-world testing. To this end, we watch a variety of content to get a sense of general performance. We have a library of films and documentaries on Blu-ray and Ultra HD Blu-ray to observe how well each TV can handle different types of content, like dark, moody, shadow-filled scenes and bright, colorful nature landscapes. One particularly useful tool for these tests is the Spears & Munsil Ultra HD Benchmark, an Ultra HD Blu-ray disc with test charts and demonstration footage for evaluating how well a TV shows all sorts of content.
While streaming media is a consideration, using physical media in our tests ensures that we can get the best idea of what a TV is capable of showing from the highest-quality signal without considering bandwidth or variable bitrates.
For more, check out our TV product guide, as well as picks for the best TVs. And to make sure you’re getting the best picture possible, check out our list of easy fixes for common TV problems.
About Will Greenwald
Lead Analyst, Consumer Electronics
