Mini LED TVs have a long way to go.
That was my big takeaway from a weekend spent judging the 20th annual Value Electronics TV Shootout, in which the best OLED TVs from Sony, Samsung, and LG went head-to-head with their Mini LED counterparts. The Sony A95L OLED won the competition for the second year in a row, while the new Sony Bravia 9 Mini LED was the best of the Mini LED TVs. (The winners are crowned “King Of TV,” which is a real trademark owned by Value Electronics — the certificate of which was proudly displayed during judging. It was serious!)
Value Electronics is a boutique high-end home theater store in Scarsdale, New York — owners Robert and Wendy Zohn founded it in 1998, and they’ve been running the TV Shootout since 2004. It draws coverage in the high-end TV world every year, and the TV makers themselves all show up to hear the feedback. As a longtime TV nerd, I’ve been following the results of the Shootout for quite a while, so it was a thrill to meet Robert when I bought an A95L from the store after moving to the area last year and an even bigger thrill when he invited me to join the judging panel.
On the first day, we reviewed professionally calibrated 65-inch versions of the highest-end OLED and Mini LED TVs from each manufacturer; on the second day, we looked at the largest available sizes of those models in their out-of-the-box filmmaker modes with energy-saving features turned off. The TVs, listed at their 65-inch MSRPs, were:
- Sony A95L, QD-OLED, $3,499
- Samsung S95D, QD-OLED, $3,399
- LG G4, OLED, $3,399
- Sony Bravia 9 (XR90), Mini LED, $3,299
- Samsung QN95D, Mini LED, $3,299
- LG QNED90T, Mini LED, $1,899
Interestingly, Robert told the group that TCL and Hisense had specifically asked not to be included in the Shootout. “You can imagine what that means,” he said. “We had Vizio one year, and it was, respectfully, embarrassing for them.”
All the TVs were calibrated for the competition by Cecil Meade and DeWayne Davis, who are both professional ISF calibrators. (If you’re a devoted AVS Forum reader like me, you’ll know Meade and Davis as ClassyTech and D-Nice. Meeting them was also a thrill.) Content came from a Magnetar disc player, a Kaleidescape streamer, and an Apple TV, routed through a switching system built and operated by AVPro’s Jason Dustal.
The first day of the Shootout was meant to be extremely objective: Meade and Davis had calibrated each of the six TVs as best as possible, and our job as judges was to compare them to two $43,000 Sony BVM-HX3110 reference monitors and rate them on how well they matched across various categories. A lot of categories: we were each handed a clipboard with a stack of six testing scorecards, one each for things like 4K HDR Dark Scene, 4K HDR Bright Scene, and 1080p SDR Reference, with a grid of specific attributes to score, like Color Accuracy, Shadow Detail, and Low Color Luminance. We watched various movie clips that showcased these attributes and scored the TVs from 1–5 based on how closely they matched the reference monitors. All in all, we gave each TV over 60 total scores on the first day — it took nearly six hours.
The TVs were set up in two groups of three at the front of the showroom, one for the OLEDs and one for the Mini LEDs, with a BVM in each group for comparison. Judges moved between the two groups, standing at various distances and angles, muttering about brightness levels, and scratching out scores on our sheets while the audience whispered amongst themselves.
We did know which TVs were which — Meade told me that “doing it blind is just for show” since we could see operating system elements as content was switched, and up close, the two QD-OLEDs from Samsung and Sony had their characteristic color fringing. “Anyone who’s a judge should be able to tell these apart,” he said.
It’s important to note that all of this means we were judging a very specific, objective definition of success for these TVs: how closely they could be calibrated to match a reference display. The closer the image was to what we saw on those BVM reference displays, the higher the score, and the farther from the reference, the lower the score. There were moments where some TVs might have looked subjectively better than the reference displays, particularly in dark scenes where the shadow detail was pumped up to be more visible, and we were still meant to give a lower score because it didn’t match the reference.
I’m stressing this because “can be calibrated to closely match a reference display” is but one thing to consider when evaluating a TV, and we did not touch on anything else, like gaming features, number of HDMI inputs, operating systems, or even Dolby Vision support (which the Samsungs do not have). This whole thing was about the limits of picture quality, and picture quality alone. There are a lot of reasons you might pick any of these TVs that have nothing to do with how closely they can be calibrated to match a reference display; Vergecast listeners know how often I talk about the Samsung Frame TV, which outsells all of these TVs, often at similar prices, with a picture quality best described as “whatever.”
It probably isn’t surprising that the Sony A95L was the TV that could be most closely calibrated to match a Sony reference display — although the Samsung S95D came in second by just 0.1 and really only lagged in general HDR performance. The LG G4 did much worse than I expected, with muddy shadow detail, poor color in dark scenes, and the odd color issue in bright scenes. I was really expecting more, given that the G4’s micro lens array OLED tech should allow for higher peak brightness, but the image processing really let it down during the calibrated test. (In a fascinating twist, the 83-inch G4 performed much better on the second day — we’ll come to that.)
All that said, the three OLEDs were extremely close to each other — you can really tell how mature OLED is as a display technology and how good calibrators have gotten at coaxing the maximum performance out of these panels. Perhaps the most telling thing about the Shootout was how quickly the judges could evaluate the Mini LEDs compared to the OLEDs — the OLEDs took vastly more time because the fine differences were ultimately so hard to see.
In comparison to the tight refinement of the OLEDs, the Mini LED sets were all over the map. There’s a lot of excitement for Mini LED as a technology to compete with OLED — Robert told us he’d gotten more requests to compare the 83-inch LG G4 OLED to the 85-inch Sony XR90 Mini LED than for any other matchup — but frankly, the picture quality just isn’t there yet, from any of the manufacturers. The Sony XR90 was by far the closest competitor to the OLED displays, but it still finished well behind the LG G4 in the overall scoring, and the (in fairness, much cheaper) LG QNED90T was so thoroughly outclassed on the first day that it wasn’t even included on the second.
The issue is that Mini LED TVs are still fundamentally LCD TVs, and they have familiar LCD issues — especially off-axis color and brightness shifts, which are pretty noticeable with TVs of this size. Moving just a few inches side to side would result in color shifts; we had to stand fairly far back from all the LCDs to see a uniform picture. Beyond that, the Mini LEDs all had less accurate, more washed-out colors than the OLEDs — going back to look at the OLEDs after judging the Mini LEDs was like a breath of fresh air.
In terms of Mini LED backlight performance, Sony’s determination and investment in this tech are clearly paying off — the XR90 had the least blooming and the fastest response times and could hit peak brightness more consistently than the other sets. The Samsung put up a capable fight, but its backlight response times were clearly slower, and there was much more blooming — at times, it looked like a regular full-array LED backlight instead of a Mini LED set. The LG — which, again, is vastly less expensive than the others — was basically not in the game at all.
Having spent so much time looking at consumer displays from these companies over the years, it was fascinating to see how each company retained its signature looks even after calibration and even across display types. The Samsung had the most intense colors — color brightness is a big strength of the company’s quantum dot OLED tech — and aggressive upscaling, the LG prioritized contrast and seemed hungry for more brightness, and the Sony was the most restrained and confident. Outside of a direct calibrated comparison, it would be hard to fault any of the OLEDs, and a casual viewer probably wouldn’t pick out huge problems with the XR90. But there’s still a long way to go before Mini LEDs can catch up to OLED — or even hit the same sort of consistent performance ceiling that the OLEDs seem to have achieved.
The second day of testing was much more subjective and casual — the largest sizes of each TV were set up in their out-of-the-box filmmaker or professional modes with energy saver settings turned off. That combination disabled any extraneous motion or image processing while allowing the TVs to hit their peak brightness. We watched a variety of clips and compared them to the reference displays again, but instead of grading individual attributes, we merely ranked our top three sets for each clip.
The most immediate surprise here was that the larger 83-inch LG G4 was vastly more competitive in this test than the 65-inch version had been the day before. More than one judge commented that things would have been very different if the 65-inch version had looked like the 83-inch, especially in low-light scenes. According to the calibrators in the room, this is a common issue with the G4 — the 65-inch model often looks worse than the other sizes. It’s disappointing, but it does mean the 83-inch G4 compared much more favorably to the 77-inch A95L — in this test, it came in second place overall.
The out-of-the-box test also made it clear that none of these TVs render color particularly accurately from the factory. The OLEDs were generally more accurate, although the Samsung was oversaturated — the QD-OLED panel has a bigger color gamut than the rest, and Samsung appears to be overdriving it a bit too enthusiastically. The A95L was a little too magenta, and the LG was a bit too yellow. (Meade told me that his basic calibration for the A95L is to turn the reds down one click and the blues down four. You can see how close things are getting.)
The Mini LEDs were uniformly too pink, and my note for the Sony XR90 for the out-of-the-box test was simply that “the colors are all over the place.”
The uncalibrated test also allowed these sets to showcase their brightness — the LG G4 came closest to the reference display during a particularly brutal black-and-white Dune: Part Two scene, and the Sony XR90 was the brightest and most intense of all during the Vader lightsaber sequence in Rogue One. But the Mini LEDs struggled with blooming and responsiveness overall: the Samsung backlight was less accurate than the Sony’s and continued to look like a regular full array backlight at times, while the Sony backlight could be seen ramping brightness up and down during Aquaman 2. Neither Mini LED TV could render a full-screen starfield during Rogue One’s end credits — some stars caused blooming, while others were too dim to trigger the backlight, so they simply didn’t appear. (All three OLEDs obviously handled this just fine.)
All that said, I found myself debating whether to put the Sony XR90 Mini LED in third place over the Samsung S95D OLED in the out-of-the-box test. The Sony can get so bright, and the Samsung colors are so oversaturated, that it felt like a tradeoff either way. On top of that, I did not love the matte finish on the S95D screen, which some others in the room quite liked. In the end, though, I picked the S95D in third place — I’d rather have the true blacks and crisper image of the OLED, and it was not lost on me that the Samsung could be calibrated closer to reference than the Sony Mini LED.
I don’t think there’s any doubt that OLED TVs remain the standard after judging the Shootout — the Sony A95L in particular can be calibrated to match a reference display so closely that complaining about its picture quality requires an almost obsessive devotion to image rendering arcana. (As an A95L owner, I can say it’s much easier to complain about the fact that it sometimes restarts itself for no reason.) But it’s also exciting to see Mini LED put up meaningful competition — I’m very curious to see how all of these companies do again next year.