We would like to bring up a discussion related to the Gamma setting in the monitor.
When developing the firmware, we discovered that we're limited in the number of gamma presets the scaler can handle, so we need to be selective about which options we offer. That means asking the experts: you, the actual users.
A gamma correction of 1/2.2 will be required, as it’s the industry standard and used for things like sRGB.
But what other gamma settings do we need? Is it enough to offer 2.0/2.2/2.4/2.6, which seem to be the standard values, or do we really need the ‘odd decimal’ values like 2.1/2.3/2.5?
Do we need 1.8 or other values below 2.0? What is the minimum you guys actually need, and what is the minimum you would like to see?
And the opposite: what is the maximum you guys actually need, and what is the maximum you would like to see?
Uploadable LUTs would be ideal, but unlikely. We still need 2.0/2.2/2.4/2.6 since they are standards for movies, TV, images, and other media. I don't really know of any use for the odd-decimal values.
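For context on why uploadable LUTs and fixed presets are two sides of the same coin: a scaler's gamma preset is effectively a small 1D lookup table. A minimal sketch (the function name and 8-bit table size are my own illustration, not anything from the firmware):

```python
# Hypothetical sketch: each gamma "preset" boils down to a 1D LUT like this.
# An uploadable-LUT feature would just let users replace the table directly.

def gamma_lut(gamma, size=256):
    """Build a LUT mapping 8-bit input codes to 8-bit output codes."""
    return [round(255 * (i / (size - 1)) ** gamma) for i in range(size)]

lut_10 = gamma_lut(1.0)   # linear: identity mapping
lut_22 = gamma_lut(2.2)

print(lut_10[:4])    # [0, 1, 2, 3] -- gamma 1.0 changes nothing
print(lut_22[128])   # mid-gray input lands well below 128 under gamma 2.2
```

This is also why the preset count is limited: each curve is a table burned into the scaler, not just a number.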
I would like to propose adding a linear gamma (1.0). I know most images and videos don't use it, but it would be genuinely useful for anyone dealing with game engines and linear rendering pipelines. Some formats like RAW do support linear encoding, though I don't know how useful it would be there. It could also make it easier, processing-wise, to run gamma transforms in software for film and digital photos, since those are almost linear as well.
A gamma of 1.8 is an old Mac standard from before Mac OS X 10.6 Snow Leopard, so it would be nice to have for people who want to use older operating systems, but it is not necessary for any modern applications. I have also seen some monitors offer a gamma of 1.6, but I can't find any info as to why.
As far as extreme values go, I would say the lowest anyone would need is 1.0. As for a maximum, I don't think anything above 3.0 would be necessary.
I don't know if this was clear, but a gamma of 1.0 is linear. Any other gamma value is nonlinear, since the transfer function is a power law. The difference between linear and nonlinear gamma is just the exponent fed into that equation: an exponent of 1.0 gives a straight line.
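In equation form, the transfer function is simply output = input ^ gamma on normalized values, so gamma 1.0 is the identity. A tiny sketch:

```python
# Gamma is a power law: v_out = v_in ** gamma, for normalized v_in in [0, 1].
# An exponent of 1.0 is the identity (the "straight line"); any other
# exponent bends the curve.

def apply_gamma(v, gamma):
    return v ** gamma

print(apply_gamma(0.5, 1.0))  # 0.5 -- unchanged, linear
print(apply_gamma(0.5, 2.2))  # ~0.218 -- mid-gray is darkened
```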
The more important thing to understand is why we apply gamma to monitors in the first place. It was originally a workaround for the physics of old CRT monitors, which we no longer use, but we keep the correction around for backwards compatibility. It's necessary to correctly view anything encoded in color spaces like sRGB and Adobe RGB, which are used by formats like JPEG and MPEG. Instead of explaining it myself, here are some easy resources:
If there are any other questions you may have please let me know.
Yes, I agree that 2.2 & 2.4 are the most used gammas, but 2.6 is still needed to edit and preview films meant for movie theaters. Even more so since the OLED panel excels in the low-light conditions found in theaters. The bare minimum configuration is 2.2, 2.4, & 2.6.
Yeah, it wasn't announced in the keynote but was posted on the website. You can see it listed here:
This reminds me of when Apple's solution to removing the DVD player from their laptops was a virtual DVD player. They created software for Windows & Mac to borrow the disc reader from another machine over the network. Nice memories.
EDIT I just checked settings and it still has this functionality LOL