Our television reviews are extensive, combining technical laboratory tests with real-world viewing experience. Using some of the most advanced calibration tools available, I measure the amount of light a television puts out, how dark it can get while still generating a picture, how accurate its colors are out of the box, and how much input lag it generates.
(Credit: Imaging Science Foundation, THX)
As PCMag’s lead home entertainment and TV analyst, I have reviewed more than 170 individual models to date. I’m an Imaging Science Foundation Level III-certified TV calibrator and a THX Level I home theater expert, and this training has informed the development and ongoing improvements of our TV testing methodology. Read on for all the details.
TV Testing Equipment
I measure light and color output using a Klein K-10A colorimeter. This device measures the light a TV produces: both how bright that light is (luminance), in candelas per square meter (cd/m², or nits), and the type and amount of color in that light (hue and saturation). To ensure I’m getting consistent light and color levels between TVs, I use a Murideo SIX-G signal generator, which outputs specific test images to the screen with both standard dynamic range (SDR) and high dynamic range (HDR) signals.
Left to right: Klein K-10A colorimeter, Leo Bodnar 4K Video Signal Lag Tester, and Murideo SIX-G 8K signal generator (Credit: Will Greenwald)
I profile each TV with Portrait Displays’ Calman software, a professional calibration suite for monitors, TVs, and other screens. It processes all measurements from the meters and enables me to produce our color charts for each review.
I use a Leo Bodnar 4K Video Signal Lag Tester to measure input lag, the time between the screen receiving a signal and the picture updating.
Testing TV Picture Quality
While a full calibration can produce the most accurate results from any TV, I prefer to evaluate the experience most consumers will get out of the box, using the best settings available on the TV without any granular adjustments or additional equipment. When I set up a new TV for testing, I generally place it in its “Movie,” “Cinema,” or “Calibrated” mode. If that mode isn’t available, I select the default picture mode and set the white balance to the warmest available setting. I also disable any energy-saving modes and light sensor features to ensure I get the full span of light the TV can put out for contrast measurements.
I measure black level and peak brightness using a consistent selection of test patterns produced by our signal generator. For LED-backlit TVs, I measure peak brightness based on both full-screen and 18% white fields (a picture where a white rectangle covering 18% of the screen sits in the center of a black field); some TVs can boost their backlight in limited zones while others can't, so I note both measurements when considering contrast. OLED TVs get consistently brighter as less of the screen is lit, so I use only an 18% white field surrounded by black when comparing them with all other TVs, though I also note brightness with a smaller 10% field when comparing OLEDs against each other.
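For reference, the geometry behind those window patterns is simple arithmetic. Here's a quick sketch (my own illustration, not part of the signal generator's toolchain) that computes the pixel dimensions of a centered window covering a given fraction of a 4K screen, assuming the window keeps the screen's 16:9 aspect ratio:

```python
import math

def window_dimensions(coverage, width=3840, height=2160):
    """Pixel size of a centered window covering `coverage` (0-1) of the
    screen, assuming the window keeps the screen's aspect ratio."""
    scale = math.sqrt(coverage)  # area scales with the square of each side
    return round(width * scale), round(height * scale)

print(window_dimensions(0.18))  # 18% white field on a 4K panel: (1629, 916)
print(window_dimensions(0.10))  # 10% field used for OLED comparisons: (1214, 683)
```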
Demonstrating how we test TV picture quality; actual testing occurs in a darker room to avoid interference from ambient light. (Credit: Will Greenwald)
For the black level, I use a partial white field and measure based on the black parts of the frame. This ensures an accurate black level reading and a realistic contrast measurement since many TV backlights simply turn off when receiving a signal that’s completely black. To calculate contrast ratio, I divide the peak brightness by the black level; for OLED TVs that generate no light in black sections of the screen due to the panel technology, I consider effective contrast to be “infinite.” LED TVs with local dimming arrays have proven to produce effectively perfect blacks as well, so for most high-end TVs, the technical contrast ratio has become less important than brightness and how well a TV visually handles very dark objects in video content.
Above is an example chart of TV peak brightness. This chart was made for my review of the Hisense U8QG, our Editors’ Choice for budget TVs. The measurements in this chart are based on an HDR10 signal in the TV’s Movie mode. Generally, TVs display much higher contrast with an HDR signal, though it’s also important to know how they will perform with older, non-HDR content, so I also take measurements with SDR signals, which I note in reviews.
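The contrast calculation described above is straightforward division, with a special case for displays that shut off entirely in black areas. Here's a minimal sketch using hypothetical numbers, not any TV's actual measurements:

```python
def contrast_ratio(peak_nits, black_nits):
    """Peak brightness divided by black level, as described above.
    A black level of zero yields effectively infinite contrast."""
    return float("inf") if black_nits == 0 else peak_nits / black_nits

# Hypothetical readings for illustration only, not measured results:
print(contrast_ratio(1500.0, 0.05))  # 30000.0 (LED TV with a 0.05-nit black)
print(contrast_ratio(800.0, 0.0))    # inf (OLED: pixels fully off in black areas)
```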
After I measure peak brightness and black level, I move on to color testing. I display full fields of white, red, green, blue, cyan, yellow, and magenta and compare their measurements against ideal color targets. I again use both SDR and HDR10 signals here (plus the Dolby Vision HDR format if the TV supports it), comparing color performance in each case against its most relevant color space: For SDR, I see how close color levels come to the Rec.709 broadcast standard color space. For HDR10 and Dolby Vision, I use the much wider DCI-P3 digital cinema color space.
(Credit: PCMag)
Above are the color measurements I took for the Hisense U8QG. The three charts show the TV’s color levels with SDR, HDR10, and Dolby Vision signals as dots, and their respective target color spaces as triangles.
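For readers curious about the comparison those charts visualize, here's a rough sketch of how measured primaries can be checked against target color spaces. It uses the published CIE 1931 xy coordinates of the Rec.709 and DCI-P3 primaries and a simple xy distance rather than the perceptual delta-E metrics a calibration suite like Calman reports; the "measured" values are hypothetical placeholders, not the U8QG's numbers:

```python
import math

# CIE 1931 xy chromaticity coordinates of each color space's primaries.
REC_709 = {"red": (0.640, 0.330), "green": (0.300, 0.600), "blue": (0.150, 0.060)}
DCI_P3 = {"red": (0.680, 0.320), "green": (0.265, 0.690), "blue": (0.150, 0.060)}

def primary_errors(measured, target):
    """Euclidean distance in xy space between each measured primary and its
    target coordinate; smaller means closer to the standard."""
    return {name: round(math.dist(measured[name], target[name]), 4) for name in target}

# Hypothetical SDR measurements for illustration only:
measured = {"red": (0.636, 0.333), "green": (0.305, 0.595), "blue": (0.152, 0.062)}
print(primary_errors(measured, REC_709))
# {'red': 0.005, 'green': 0.0071, 'blue': 0.0028}
```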
Testing Input Lag
Demonstrating how we test input lag with a Leo Bodnar 4K Video Signal Lag Tester (Credit: Will Greenwald)
With luminance and color measurements recorded, I then measure input lag using the Bodnar 4K Video Signal Lag Tester. It's a small box that outputs a flashing test pattern to the screen, and a light sensor on the box times how long the screen takes to display each flash after the signal is sent. I test lag with both 4K60 and 1080p120 signals and consider less than a single frame of latency (16.7ms at 60Hz, 8.3ms at 120Hz) best for gaming. The above chart shows test results for the Hisense U8QG, compared with other TVs.
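That one-frame threshold falls straight out of the refresh rate: a frame lasts 1,000ms divided by the frequency in hertz. A quick sketch of the math, with made-up lag readings for illustration:

```python
def frame_time_ms(refresh_hz):
    """Duration of one frame in milliseconds at a given refresh rate."""
    return 1000.0 / refresh_hz

def under_one_frame(lag_ms, refresh_hz):
    """True if a measured input lag beats the one-frame gaming threshold."""
    return lag_ms < frame_time_ms(refresh_hz)

print(round(frame_time_ms(60), 1), round(frame_time_ms(120), 1))  # 16.7 8.3
print(under_one_frame(12.4, 60))   # True: comfortably under a frame at 60Hz
print(under_one_frame(10.1, 120))  # False: more than a frame behind at 120Hz
```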
I previously relied on an HDFury 4K Diva 18Gbps HDMI matrix with an Xbox One X as a video source. That method also used a sensor placed in front of the screen, flashing the display between black and white and timing how long the picture took to update. The Diva setup was limited to 1080p60 signals, while the Bodnar device lets me test higher resolutions and refresh rates.
The two pieces of test equipment generate their test signals differently, so their numbers aren't directly comparable. TVs tested before 2025 were measured with the Diva; starting this year, all lag tests use only the Bodnar and include two charts, showing 4K60 and 1080p120 input lag measurements. If you look up my reviews of older TVs, they show input lag based on the Diva testing method.
Evaluating the Viewing Experience
Finally, it’s time for some real-world testing. To this end, I watch a variety of content to get a sense of general performance. I have a library of films, documentaries, and test footage on Ultra HD Blu-ray to observe how well each TV handles different types of content, like dark, shadow-filled scenes and bright, colorful nature landscapes. One particularly useful tool for these tests is the Spears & Munsil Ultra HD Benchmark, an Ultra HD Blu-ray disc with test charts and demonstration footage for evaluating how well a TV shows all sorts of content.
While streaming media is a consideration, using physical media in my tests ensures that I can get the best idea of what a TV is capable of showing from the highest-quality signal without considering bandwidth or variable bitrates.
For more, check out our TV product guide, as well as picks for the best TVs. And to make sure you’re getting the best picture possible, check out our list of easy fixes for common TV problems.