Discussion Topic: Gamma Correction Setting


Hi there community!

We would like to bring up a discussion related to the Gamma setting in the monitor.

When developing the firmware, we discovered that we’re limited in the number of gamma value presets the scaler can handle, so we need to be picky in which options we offer. That means asking the experts: you, the actual users.

A gamma correction of 1/2.2 will be required, as it’s the industry standard and used for things like sRGB.

But what other gamma settings do we need? Is it enough to offer 2.0/2.2/2.4/2.6, which seem to be the standard values, or do we really need the ‘odd decimal’ values like 2.1/2.3/2.5?

Do we need 1.8 or other values below 2.0? What is the minimum that you guys need? What is the minimum that you guys would like to see?

Also the opposite, what is the maximum that you guys need? What is the maximum that you guys would like to see?

Please let us know what you think!


Uploadable LUTs would be ideal, but unlikely. We still need 2.0/2.2/2.4/2.6 since they are standards for movies, TV, images, and other things. I don’t really know how useful any of the odd-numbered values would be.

I would like to propose adding a linear gamma (1.0). I know most images and videos don’t support this, but it would actually be useful for anyone dealing with game engines and linear rendering pipelines. Some formats like RAW do support linear encoding, but I don’t know how useful it would be for that. It could also make it easier, processing-wise, to run gamma transforms in software for film and digital photos, since those are almost linear as well.
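To illustrate why a 1.0 preset helps with linear pipelines, here’s a rough Python sketch. It uses the simple power-law approximation with a 2.2 exponent (not the exact piecewise sRGB curve), just to show the encode step a linear renderer normally needs before a standard-gamma monitor:

```python
def encode_gamma(linear, gamma=2.2):
    """Encode a linear scene value into a display signal: signal = linear ** (1/gamma)."""
    return linear ** (1.0 / gamma)

def decode_gamma(signal, gamma=2.2):
    """Invert the encoding: the monitor turns the signal back into linear light."""
    return signal ** gamma

# A game engine lights the scene in linear space...
linear_pixel = 0.5
# ...then encodes it for a standard ~2.2 monitor before scan-out.
encoded = encode_gamma(linear_pixel)  # ~0.73

# On a monitor set to gamma 1.0, a raw linear buffer could be viewed
# directly, with no encode step in between.
print(encoded)
```

With a 1.0 preset, the `encode_gamma` step above could be skipped entirely when inspecting intermediate linear render targets.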

A gamma of 1.8 is an old standard from before Mac OS X Snow Leopard. So it would be nice to have if people want to use older operating systems, but it is not necessary for any modern applications. I have also seen some monitors offer a gamma of 1.6, but I can’t find any info as to why.

As far as extreme values go, I would say the lowest anyone would need is 1.0. As for a maximum, I don’t really think anything above 3.0 would be necessary.

Hope this helps. :slight_smile:


Keep it simple and compatible.



I appreciate your insightful explanation, @MemeScreen :grin:

It’s fascinating to learn that a 1.0 Gamma can be meaningful for working with game engines and linear rendering pipelines!

I’m curious about the difference between a linear gamma and a non-linear one. Could you please share more about what differentiates the two and how we might observe the difference in real life?

Hey, @8BiTw0LF :wave:

Would you mind elaborating on your rationale for recommending gamma 2.3? It will help our team understand why it would be meaningful for you.

Furthermore, is it correct that you suggested 1.0 and 1.8 for the same reason as @MemeScreen?

By the way, we’ve built a few preliminary presets in our Glossy OLED :eyes:


I don’t know if this was clear, but a gamma of 1.0 is linear. Any other gamma value is non-linear, since the transfer function is a power law (output = input ^ gamma). A gamma of 1.0 is the one exponent that turns that equation into a straight line, where output simply equals input.
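Here’s a quick illustration in plain Python (pure power-law gamma assumed) showing that 1.0 gives the identity line while 2.2 bends the mid-tones down:

```python
def apply_gamma(x, gamma):
    """Display transfer function: output light = input signal ** gamma."""
    return x ** gamma

inputs = [0.0, 0.25, 0.5, 0.75, 1.0]

# gamma = 1.0: output equals input everywhere -- a straight line.
linear_out = [apply_gamma(x, 1.0) for x in inputs]

# gamma = 2.2: mid-tones are pushed down -- a curve, not a line.
curved_out = [round(apply_gamma(x, 2.2), 3) for x in inputs]

print(linear_out)  # identical to the inputs
print(curved_out)  # endpoints unchanged, everything in between darker
```

In real life you’d see this as mid-gray patches looking noticeably darker at 2.2 than at 1.0, while pure black and pure white look the same on both.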

The more important thing to understand is why we apply gamma to monitors in the first place. It was originally a workaround for a limitation of older CRT monitors; the CRTs are gone, but the correction stays around for backwards compatibility. It’s a necessary correction for viewing anything encoded for sRGB and Adobe RGB, like JPEG and MPEG content. Instead of explaining it all here, here are some easy resources:
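One practical reason the encoding has survived the CRT era: with only 8 bits per channel, storing gamma-encoded values preserves shadow detail much better than storing linear values. A toy sketch (pure 2.2 power law assumed, not the exact piecewise sRGB curve; the `0.002` shadow tone is just an illustrative value):

```python
def quantize(value, bits=8):
    """Round a 0..1 value to the nearest representable code at the given bit depth."""
    levels = (1 << bits) - 1
    return round(value * levels) / levels

gamma = 2.2
dark_linear = 0.002  # a deep shadow tone

# Store the value linearly: it snaps to a coarse nearby code.
linear_stored = quantize(dark_linear)

# Store it gamma-encoded (signal = linear ** (1/gamma)), then decode on display:
encoded_stored = quantize(dark_linear ** (1.0 / gamma)) ** gamma

print(linear_stored, encoded_stored)  # the encoded round trip lands much closer to 0.002
```

The gamma-encoded round trip recovers the shadow tone with far less error, because the encoding spends more of the 256 codes on dark values, where our eyes are most sensitive.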

If there are any other questions you may have please let me know.



For me personally, 2.0/2.2/2.4/2.6 is enough, as 2.2 and 2.4 are the ones I use on all of my monitors/TVs.


Yes, I agree that 2.2 and 2.4 are the most-used gammas, but 2.6 is still needed to edit and preview films meant for movie theaters. Even more so since the OLED panel excels in the low-light conditions found in theaters. The bare minimum configuration is 2.2, 2.4, and 2.6.


Welcome to the community and thank you for sharing your thoughts on this! :smiley:

As far as I know, 2.2 is the most accurate for Windows and 1.8 for macOS. But…

I never really knew this part, mainly because I don’t own a Mac anymore… I’m guessing they shifted to 2.2 as the recommended setting as well?

Yeah, it wasn’t announced in the keynote, but it was posted on the website. You can see it listed here:

This reminds me of when Apple’s solution to removing the DVD player from their laptops was a virtual DVD player. They created software for Windows and Mac to borrow the disc reader from another machine over the network. Nice memories.

EDIT: I just checked the settings and it still has this functionality LOL :rofl:
