
It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.

It seems they want to make these settings usable without specialist knowledge, but the end result of the opaque naming and vague descriptions is that anybody who actually cares about what they see, and thinks they might benefit from some of the features, has to either systematically try every possible combination of options or teach themselves video engineering and work out what each one actually does.

This isn't unique to TVs. It's amazing, really, how much effort a company will put into adding a feature to a product, only to completely negate any value it might have by assuming that any attempt at clearly documenting it, even buried deep in a manual, will cause their customers' brains to explode.



"Filmmaker mode" is the industry's attempt at this. On supported TVs it's just another picture mode (like vivid or standard), but it disables all the junk the other modes have enabled by default without wading though all the individual settings. I don't know how widely adopted it is though, but my LG OLED from 2020 has it.


The problem with filmmaker mode is I don't trust it more than other modes. It would take no effort at all for a TV maker to start fiddling with "filmmaker mode" to boost colors or something to "get an edge", then everyone does it, and we're back to where we started. I just turn them off and leave it that way. Companies have already proven time and again they'll make changes we don't like just because they can, so it's important to take every opportunity to prevent them even getting a chance.


"Filmmaker mode" is a trademark of the UHD Alliance, so if TV makers want to deviate from the spec they can't call it "Filmmaker mode" anymore. There's a few different TV makers in the UHD Alliance so there's an incentive for the spec to not have wiggle room that one member could exploit to the determent of the others.


Huh, I never knew this, they even have a website

https://filmmakermode.com/

Good to know there seems to be an effort to keep some consistency.


That's cool info. Thanks!


It's true that Filmmaker Mode might at some point in the future be corrupted, but in the actual world of today, if you go to a TV and set it to Filmmaker Mode, it's going to move most things to correct settings, and all things to correct settings on at least some TVs.

(The trickiest thing is actually brightness. LG originally used to set brightness to 100 nits in Filmmaker Mode for SDR, which is correct dark room behavior -- but a lot of people aren't in dark rooms and want brighter screens, so they changed it to be significantly brighter. Defensible, but it now means that if you are in a dark room, you have to look up which brightness level is close to 100 nits.)
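
(For what it's worth, a quick way to find that level is to interpolate between a handful of meter readings. A minimal Python sketch, using hypothetical measured values for an "OLED light"-style setting; the real numbers vary by panel and firmware.)

    import numpy as np

    # Hypothetical meter readings of peak white at a few menu values.
    settings = np.array([0, 20, 40, 60, 80, 100])
    measured_nits = np.array([35, 60, 95, 140, 200, 280])

    # Estimate which menu value lands closest to the 100-nit SDR dark-room reference.
    estimate = np.interp(100, measured_nits, settings)
    print(f"~100 nits is around setting {estimate:.0f}")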


On my Samsung film mode has an insane amount of processing. Game Mode is the setting where the display is most true to what's being sent to it.


Game mode being latency-optimized really is the saving grace in a market segment where the big brands try to keep hardware cost as cheap as possible. Sure, you _could_ have a game mode that does all of the fancy processing closer to real-time, but now you can't use a bargain-basement CPU.


Not "Film mode", but "Filmmaker mode". The latter is a trademark with specific requirements.

Game mode will indeed likely turn off any expensive latency-introducing processing but it's unlikely to provide the best color accuracy.


On my Samsung OLED game mode has an annoying effect that turns (nearly) completely black screens into gray smudge garbage that you can only turn down but not completely off, making that mode entirely useless.

Yup, it's great, at least for live action content. I've found that for Anime, a small amount of motion interpolation is absolutely needed on my OLED, otherwise the content has horrible judder.


I always found that weird; anime relies on motion blur for smoothness when panning/scrolling, and motion interpolation works as an upgraded version of that... until it starts to interpolate the actual animation.


On my LG OLED I think it looks bad. Whites are off and I feel like the colours are squashed. Might be more accurate, but it's bad for me. I prefer to use standard, disable everything and put the white balance on neutral, neither cold nor warm.


I had just recently factory reset my samsung S90C QDOLED - and had to work through the annoying process of dialing the settings back to something sane and tasteful. Filmmaker mode only got it part of the way there. The white balance was still set to warm, and inexplicably HDR was static (ignoring the content 'hints'), and even then the contrast seemed off, and I had to set the dynamic contrast to 'low' (whatever that means) to keep everything from looking overly dark.

It makes me wish that there was something like an industry standard 'calibrated' mode that everyone could target - let all the other garbage features be a divergence from that. Hell, there probably is, but they'd never suggest a consumer use that and not all of their value-add tacky DSP.


"Warm" or "Warm 2" or "Warm 50" is the correct white point on most TVs. Yes, it would make sense if some "Neutral" setting was where they put the standards-compliant setting, but in practice nobody ever wants it to be warmer than D6500, and lots of people want it some degree of cooler, so they anchor the proper setting to the warm side of their adjustment.

When you say that "HDR is static" you probably mean that "Dynamic tone-mapping" was turned off. This is also correct behavior. Dynamic tone-mapping isn't about using content settings to do per-scene tone-mapping (that's HDR10+ or Dolby Vision, though Samsung doesn't support the latter), it's about just yoloing the image to be brighter and more vivid than it should be rather than sticking to the accurate rendering.

What you're discovering here is that the reason TV makers put these "garbage features" in is that a lot of people like a TV picture that's too vivid, too blue, too bright. If you set it to the true standard settings, people's first impression is that it looks bad, as yours was. (But if you live with it for a while, it'll quickly start to look good, and then when you look at a blown-out picture, it'll look gross.)
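
As a rough illustration of the "too blue" point (not anyone's official tooling): correlated color temperature can be estimated from a colorimeter's xy readings with McCamy's approximation. The D65 white that content is mastered to comes out around 6500 K, while typical "Standard"/"Cool" presets measure noticeably higher; the preset reading below is a made-up example.

    def cct_mccamy(x: float, y: float) -> float:
        """Estimate correlated color temperature (K) from CIE 1931 xy."""
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

    print(cct_mccamy(0.3127, 0.3290))  # D65 reference white -> ~6500 K
    print(cct_mccamy(0.287, 0.296))    # hypothetical "Standard" preset -> ~8900 K, i.e. bluer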


This is all correct.

“Filmmaker Mode” on LG OLED was horrible. Yes, all of the “extra” features were off, but it was overly warm and unbalanced as hell. I either don’t understand “Filmmakers” or that mode is intended to be so bad that you will need to fix it yourself.


Filmmaker is warm because it follows the standardized D6500 whitepoint. But that's the monitor whitepoint it is mastered against, and how it's intended to be seen.

TV manufacturers always ship their sets set to a much higher (bluer) color temperature by default because blue tones show off colors better.

As a result of both that familiarity and the better saturation, most people don't like filmmaker when they try to use it at first. After a few weeks, though, you'll be wondering why you ever liked the oversaturated neons and severely off brightness curve of other modes.

Or not, do whatever you want, it's your TV!


The whites in Filmmaker Mode are not off. They'll look warm to you if you're used to the too-blue settings, but they're completely and measurably correct.

I'd suggest living with it for a while; if you do, you'll quickly get used to it, and then going to the "standard" (sic) setting will look too blue.


The problem is that compared to all the monitors I have, specifically the one in my Lenovo Yoga OLED that is supposed to be very accurate, whites are very warm in filmmaker mode. What's that about?


Your monitor is probably set to the wrong settings for film content. Almost all monitors are set to a cool white point out of the box. If you're not producing film or color calibrated photography on your monitor, there is no standard white temperature for PC displays.


The Lenovo comes with an official ICC profile, so I think that's unlikely.


still looks like yellow piss.


Disclaimer: I prefer movies to look like reality, but apparently this is far away from "artistic purpose".


What does “like reality” mean?


It means that the colors should be correct. The sky on tv should look like the sky. The grass on tv should look like grass. If I look at the screen and then I look outside, it should look the same. HDR screens and sensors are getting pretty close, but almost everyone is using color grading so the advantage is gone. And after colors, don't get me started about motion and the 24fps abomination.


> It means that the colors should be correct. The sky on tv should look like the sky. The grass on tv should look like grass.

It is not as clear cut as you think and is very much a gradient. I could send 10 different color gradings of the sky and grass to 10 different people and they could all say it looks “natural” to them, or a few would say it looks “off,” because our expectations of “natural” looks are not informed by any sort of objective rubric. Naturally if everyone says it’s off the common denominator is likely the colorist, but aside from that, the above generally holds. It’s why color grading with proper scopes and such is so important. You’re doing your best to meet the expectation for as many people as possible knowing that they will be looking on different devices, have different ideas of what a proper color is, are in different environments, etc. and ultimately you will still disappoint some folks. There are so many hardware factors at play stacked on top of an individual’s own expectations.

Even the color of the room you’re in or the color/intensity of the light in your peripheral vision will heavily influence how you perceive a color that is directly in front of you. Even if you walk around with a proper color reference chart checking everything it’s just always going to have a subjective element because you have your own opinion of what constitutes green grass.


In a way, this actually touches on a real issue. Instead of trying to please random people and make heuristics that work in arbitrary conditions, maybe start from the objective reality? I mean, for a start, take a picture and then immediately compare it with the subject. If it looks identical then that's a good start. I haven't seen any device capable of doing this. Of course you would need the entire sensor-processing-screen chain to be calibrated for this.


Everything I talked about above applies even more so now that you’re trying to say “we’ll make a camera capture objective colors/reality.” That’s been a debate about cameras ever since the first images were taken. “The truth of the image.”

There is no such thing as the “correct” or “most natural” image. There is essentially no “true” image.


I completely agree. Theoretically you could capture and reproduce the entire spectrum for each pixel, but even that is not "true" because it is not the entire light field. But I still think we can look at the picture on the phone in our hand and at the subject right in front of us, and try to make them as similar as possible to our senses? This looks to me like a big improvement over the current state of affairs. Then you can always say to a critic: I checked just as I took the picture/movie, and this is exactly how the sky/grass/subject looked.


White walls in my kitchen look different depending on the time of day and weather, and that’s before I turn on the lights.

What is the correct colour?


Well, I know what you mean, color is complicated. BUT, I can look at a hundred skies and they look like sky. I will look at the sky on the tv, and it looks like sky on the tv, not like the real sky. And sky is probably easy to replicate, but if you take the grass or leaves, or human skin, then the tv becomes funny most of the time.


> I will look at the sky on the tv, and it looks like sky on the tv, not like the real sky.

Well for starters you're viewing the real sky in 3D and your TV is a 2D medium. That alone immediately and drastically changes your perception. TV looks like TV no matter what.


I'm sure part of it is so that marketing can say that their TV has new putz-tech smooth vibes AI 2.0, but honestly I also see this same thing happen with products aimed at technical people who would benefit from actually knowing what a particular feature or setting really is. Even in my own work on tools aimed at developers, non-technical stakeholders push really hard to dumb down and hide what things really are, believing that makes the tools easier to use, when really it just makes it more confusing for the users.


I don't think you're the target audience of the dumbed-down part; the people paying them for it are. They don't need detailed documentation on those things, so why make it?


> It would help if TV manufacturers would clearly document what these features do, and use consistent names that reflect that.

It would also help if there was a common, universal, perfect "reference TV" to aim for (or multiple such references for different use cases), with the job of the TV being to approximate this reference as closely as possible.

Alas, much like documenting the features, this would turn TVs into commodities, which is what consumers want, but TV vendors very much don't.


"reference TVs" exist, they're what movies/tv shows are mastered on, e.g. https://flandersscientific.com/XMP551/


I wonder if there's a video equivalent to the Yamaha NS-10[1], a studio monitor (audio) that (simplifying) sounds bad enough that audio engineers reckon if they can make the mix sound good on them, they'll sound alright on just about anything.

[1]: https://en.wikipedia.org/wiki/Yamaha_NS-10


Probably not, or they don't go by it, since there seems to be a massive problem with people being unable to hear dialogue well enough to not need subtitles.

https://news.ycombinator.com/item?id=37218711

It was a real eye(ear?)-opener to watch Seinfeld on Netflix and suddenly have no problem understanding what they're saying. They solved the problem before, they just ... unsolved it.


My favorite thing about Kodi is an audio setting that boosts the center channel. Since most speech comes through that, it generally just turns up the voices, and the music and sound effects stay at the same level. It's a godsend. Also another great reason to have a nice backup collection on a hard drive.
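
The underlying trick is simple enough to sketch (this is just the general idea, not Kodi's actual code): when downmixing 5.1 to stereo, give the centre channel extra gain before mixing, since that's where most dialogue sits. The coefficients follow the common ITU-style downmix, and the 6 dB boost is an illustrative value.

    import numpy as np

    def downmix_with_center_boost(fl, fr, fc, bl, br, center_gain_db=6.0):
        """Each argument is a 1-D array of samples for one channel."""
        g = 10 ** (center_gain_db / 20)        # dB -> linear gain
        c = 0.707 * g * fc                     # boosted centre contribution
        left = fl + c + 0.707 * bl
        right = fr + c + 0.707 * br
        return np.clip(np.stack([left, right], axis=-1), -1.0, 1.0)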


It's a similar thing to watching movies from before the mid-2000s (I place the inflection point around Collateral in 2004): after that you get overly dark scenes where you can't make out anything, while in anything earlier you get night scenes where you can clearly make out the setting, and the focused actors/props are clearly visible.

Watch An American Werewolf in London, Strange Days, True Lies, Blade Runner, or any other movie from the film era up to the start of digital, and you can see that the sets are incredibly well lit. On film they couldn't afford to reshoot and didn't have an immediate view of how everything in the frame turned out, so they had to be conservative. They didn't have per-pixel brightness manipulation (dodging and burning were darkroom techniques that could technically have been applied per frame, but good luck doing that at any reasonable expense or in any reasonable amount of time). They didn't have hyper-fast color film stock they could use (ISO 800 was about the fastest you could get), and it was a clear downgrade from anything slower.

The advent of digital film-making, once sensors reached ISO 1600/3200 with reasonable image quality, is when the allure of the time/cost savings of not lighting heavily for every scene reared its ugly head, and by the 2020s you get the "Netflix look" from studios optimizing for "the cheapest possible thing we can get out the door" (the most expensive thing in any production is filming on location; a producer will want to squeeze every minute of that away, with the smallest crew they can get away with).


$21k for a 55-inch 4K is rough, but this thing must be super delicate because basic US shipping is $500.

(Still cheaper than a Netflix subscription though.)


If you account for the wastage/insurance costs using standard freight carriers that seems reasonable to me as a proportion of value. I’m sure this is shipped insured, well packaged and on a pallet.

Walmart might be able to resell a damaged/open box $2k TV at a discount, but I don’t think that’s so easy for speciality calibrated equipment.


Reference monitor pricing has never been anywhere near something mere mortals could afford. The $21k for 55" you gave is more than 50% off the $1k+ per inch I'm used to seeing from Sony.


My local hummus factory puts the product destined for Costco into a different sized tub than the one destined for Walmart. Companies want to make it hard for the consumer to compare.


Costco’s whole thing is selling larger quantities, most times at a lower per unit price than other retailers such as Walmart. Walmart’s wholesale competitor to Costco is Sam’s Club. Also, Costco’s price labels always show the per unit price of the product (as do Walmart’s, in my experience).


Often a false economy. My MIL shops at Sam's Club, and ends up throwing half her food away because she cannot eat it all before it expires. I've told her that those dates often don't mean the food is instantly "bad" the next day but she refuses to touch anything that is "expired."


My wife is the same way - the "best by" date is just a date they put for best "freshness". "Sell by" date is similar. It's not about safety.

My wife grew up in a hot and humid climate where things went bad quickly, so this tendency doesn't come from nowhere. Her whole family now lives in the US midwest, and there are similar arguments between her siblings and their spouses.


Also: freezer


The ones I'm talking about were only subtly different, like 22 oz vs 24 oz. To me it was obvious what they were doing: shoppers couldn't compare same-size units, so they could have more freedom with prices.


Showing a unit price on the label is a requirement of US law.


Which unit is the fun game that gets played. I've seen way too many products right beside each other that use different measurements.


Most people will have devices that can easily convert measurements to the desired unit.


That same device can also calculate the unit price (since you know price & weight), so why even print it, right?
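
To be fair, the arithmetic being waved away here really is a one-liner (the tub sizes and prices below are made up):

    def unit_price(price_dollars: float, size_oz: float) -> float:
        return price_dollars / size_oz

    print(f"${unit_price(5.49, 22):.3f}/oz")  # hypothetical 22 oz tub -> $0.250/oz
    print(f"${unit_price(5.79, 24):.3f}/oz")  # hypothetical 24 oz tub -> $0.241/oz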


Oh fun, now I can invest even more time and energy into grocery shopping.


There is no federal law requiring unit pricing, but NIST has guidelines that most grocery stores follow voluntarily. Nine states have adopted the guidelines as law.

https://www.nist.gov/system/files/documents/2023/02/09/2023%...


I don't think that's correct. Prices for retail goods aren't usually even attached to the product in interstate commerce, and are shown locally on store shelving.

Any applicable unit pricing requirements would be at the state/local level, not federal, but only a few states have such requirements. See: https://www.nist.gov/pml/owm/national-legal-metrology/us-ret...


You think the factory decided this?


The sizes were requested by the companies, the tour guide pointed this out in answer to questions.


I disable all video processing features and calibrate my sets. Bought a meter years ago and it’s given me endless value.


Yup - this is the way. Your room color and lighting affect your TV, so proper calibration with a meter is always ideal.
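
One step of that process, sketched with made-up meter readings (a real run would use your colorimeter's numbers): estimate the display's gamma from measured luminance of a grayscale ramp and compare it to the usual 2.2-2.4 target.

    import numpy as np

    stimulus = np.array([0.2, 0.4, 0.6, 0.8, 1.0])        # test-pattern levels (0-1)
    luminance = np.array([3.5, 16.0, 39.0, 73.5, 120.0])  # hypothetical cd/m^2 readings

    # Power-law display: L = Lpeak * V**gamma, so the log-log slope is gamma.
    gamma = np.polyfit(np.log(stimulus), np.log(luminance / luminance[-1]), 1)[0]
    print(f"estimated gamma ~ {gamma:.2f}")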


These exist, typically made by Panasonic or Sony, and cost upwards of 20k USD. HDTVtest has compared them to the top consumer OLED TVs in the past. Film studios use the reference models for their editing and mastering work.

Sony specifically targets the reference with their final calibration on their top TVs, assuming you are in Cinema or Dolby Vision mode, or whatever they call it this year.


There is! That is precisely how TVs work! Specs like BT.2020 and BT.2100 define the color primaries, white point, and how colors and brightness levels should be represented. Other specs define other elements of the signal. SMPTE ST 2080 defines what the mastering environment should be, which is where you get the recommendations for bias lighting.

This is all out there -- but consumers DO NOT want it, because in a back-to-back comparison, they believe they want (as you'll see in other messages in this thread) displays that are over-bright, over-blue, over-saturated, and over-contrasty. And so that's what they get.

But if you want a perfect reference TV, that's what Filmmaker Mode is for, if you've got a TV maker that's even trying.
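
For anyone curious what "all out there" looks like in practice, here's a minimal sketch of the BT.2100 / SMPTE ST 2084 "PQ" EOTF, the curve that maps an HDR code value to absolute luminance. The constants come from the standard; real pipelines do a lot more around this.

    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(code: float) -> float:
        """Non-linear PQ signal in [0, 1] -> displayed luminance in cd/m^2."""
        p = code ** (1 / M2)
        y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
        return 10000.0 * y

    print(pq_eotf(0.508))  # ~100 cd/m^2, roughly SDR reference white
    print(pq_eotf(1.0))    # 10000 cd/m^2, the format's ceiling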


The purpose of the naming is generally to overwhelm consumers and drive long-term repeat buys. You can't remember if your TV has the fitzbuzz, but you're damn sure this fancy new TV in the store looks a hell of a lot better than your current TV, and they're really pushing this fitzbuzz thing.


Cynically, I think it's a bit, just a little, to do with how we handle manuals today.

It wasn't that long ago that the manual spelled out everything in enough detail that a kid could understand it, absorb it, and decide he was going to dive in on his own and end up in the industry. I wouldn't have broken or created nearly as much without it.

But, a few things challenged the norm. For many, many reasons, manuals became less about the specification and more about the functionality. Then they became even more simplified, because of the need to translate it into thirty different languages automatically. And even smaller, to discourage people from blaming the company rather than themselves, by never admitting anything in the manual.

What I would do for a return to fault repair guides [0].

[0] https://archive.org/details/olivetti-linea-98-service-manual...


Another factor is the increased importance of the software part of the product, and how that changes via updates that can make a manual outdated. Or at least a printed manual: if they're doing updates up to product launch it might not match what a customer gets straight out of the box, or any later production runs where new firmware is included. It would be somewhat mitigated if there was an onus to keep online/downloadable manuals updated alongside the software. I know my motherboard BIOS no longer matches the manual, but even then most descriptions are so simple they do nothing more than list the options with no explanation.


Yep, old features can disappear, new features can be added, the whole product can even be enshittified.

Updates are a mixed bag.


Going a level deeper, more information can be gleaned from how closely modern technology mimics kids' toys that don't require manuals.

A punch card machine certainly requires specs, and would not be confused with a toy.

A server rack, same, but the manuals are pieced out and specific, with details being lost.

You’ll notice anything with dangerous implications naturally wards off tampering near natively.

Desktop and laptop computers sit somewhere in between, depending on sharp edges and design language, and whether they use a touch screen. Almost kids' toys; the manual now lives in collective common sense for most.

Tablet, colorful case, basically a toy. Ask how many people using one can write bit transition diagrams for or/and, let alone xor.

We’ve drifted far away from where we started. Part of me feels like the youth are losing their childhoods earlier and earlier as our technology becomes easier to use. Being cynical of course.


That doesn't preclude clearly documenting what the feature does somewhere in the manual or online. People who either don't care or don't have the mental capacity to understand it won't read it. People who care a lot, such as specialist reviewers or your competitors, will figure it out anyway. I don't see any downside to adding the documentation for the benefit of paying customers who want to make an informed choice about when to use the feature, even in this cynical world view.


That costs money.


Why let a consumer educate themselves as easily as possible when it’s more profitable to deter that behaviour and keep you confused? Especially when some of the tech is entirely false (iirc about a decade ago, TVs were advertised as ‘360hz’ which was not related to the refresh rates).

I’m with you personally, but the companies that sell TVs are not.


They will set up their TVs with whatever setting makes them sell better than the other TVs in the shop.


I don't particularly like that, but even so, it doesn't preclude having a "standard" or "no enhancement" option, even if it's not the default.

On my TCL TV I can turn off "smart" image and a bunch of other crap, and there's a "standard" image mode. But I'm not convinced that's actually "as close to reference as the panel can get". One reason is that there is noticeable input lag when connected to a pc, whereas if I switch it to "pc", the lag is basically gone, but the image looks different. So I have no idea which is the "standard" one.

Ironically, when I first turned it on, all the "smart" things were off.


Sometimes PC mode reduces image quality (like lowering bit depth) in exchange for lower input lag


Is there a way to verify this other than breaking out a colorimeter (which I happen to have)?


For bit depth issues, displaying a test pattern is probably the best test.
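
A minimal sketch of such a pattern (assumes numpy and Pillow are installed): render a smooth horizontal grayscale ramp; if a mode drops the effective bit depth, the ramp shows visible banding instead of a smooth gradient.

    import numpy as np
    from PIL import Image

    width, height = 1920, 1080
    ramp = np.linspace(0, 255, width).astype(np.uint8)  # one row, 0 -> 255
    pattern = np.tile(ramp, (height, 1))                # repeat down the frame
    Image.fromarray(pattern, mode="L").save("gray_ramp.png")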

I'm not certain this is true. TVs have become so ludicrously inexpensive that it seems the only criteria consumers shop on are screen size and price.


"Our users are morons who can barely read, let alone read a manual", meet "our users can definitely figure out how to use our app without a manual".


TVs are on their way to free, and are thoroughly enshittified. The consumer is the product, so compliance with consumer preferences is going to plummet. They don't care if you know what you want; you're going to get what they provide.

They want a spy device in your house, recording and sending screenshots and audio clips to their servers, providing hooks into every piece of media you consume, allowing them a detailed profile of you and your household. By purchasing the device, you're agreeing to waiving any and all expectations of privacy.

Your best bet is to get a projector, or spend thousands to get an actual dumb display. TVs are a lost cause - they've discovered how to exploit users and there's no going back.


I just went through this learning curve with my new Sony Bravia 8 II.

I also auditioned the LG G5.

I calibrated both of them. It is not that much effort after some research on avsforum.com. I think this task would be fairly trivial for the hackernews crowd.


Agreed. And I’m not going to flip my TV’s mode every time I watch a new show. I need something that does a good job on average, where I can set it and forget it.


Exactly. The only adjustments I need to be making are HDMI input and volume.


The whole comment 100% matches my experience with any and every BIOS setup out there.


Worst is graphics settings for games. Needs a PhD to understand.


They just need 3 settings for games, 1) make room hot, 2) make room warm, 3) maintain room temperature.


I use that first setting to keep my basement living room warm in the winter.



