I'm sorry to ask so late, @nkyadav, @ReignDespair and @Lore_Wonder, but with the increase from 16 to 32 local dimming zones that the ES07D02 model received, has the configuration of the LEDs also changed? That is, are there now LEDs on all edges: bottom and top as well as left and right?
A single strip of LEDs along the bottom looks horrendous to me, to the point of saying "that looks fucking disgusting! Ugh! I'd better disable local dimming." Please don't let this slide: the monitor is not exactly cheap, and it has a fake "10-bit" panel.
Information taken from here:
At least then people would have a "decent" experience when viewing content in HDR.
(I'm sorry, I've decided to make this a separate post so that it's more visible and other people are informed.)
It's not that no one is interested; we have a lot going on. One thing is that the majority of our CS team is in Ukraine; another is that we're working feverishly to get products rolling out the door to those who still haven't received their orders, working on updates for the glossy Spectrum 4K model, the entire QHD lineup, and numerous things besides.
Please be patient with us; things went from marginally bad to pure terror in the space of a day, and we've got a lot to deal with.
In the meantime, I’ll tag our product manager Rob (@Helios ) to see if he can answer your question.
To my current knowledge, the configuration is still the same: bottom edge lit. This will result in 32 vertical strips. But just to be sure we’re double-checking with LG Display and our manufacturer, in case they forgot this important detail!
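If it helps to visualise what "bottom edge lit, 32 vertical strips" means in practice: each dimming zone controls a full-height column of the screen, so dimming can only follow the horizontal position of content. A rough sketch in Python (the 3840-pixel width is an assumption for illustration, not a confirmed spec):

```python
# Rough sketch: with a bottom-edge-lit backlight, each of the 32 zones
# controls a full-height vertical strip of the screen. Assumed values:
PANEL_WIDTH_PX = 3840   # assumed horizontal resolution, for illustration
NUM_ZONES = 32          # zone count per the post above

def zone_for_pixel(x: int) -> int:
    """Return the dimming zone index (0..31) lighting pixel column x."""
    if not 0 <= x < PANEL_WIDTH_PX:
        raise ValueError("x outside panel")
    return x * NUM_ZONES // PANEL_WIDTH_PX

# Each zone strip is 3840 / 32 = 120 pixels wide, full screen height:
print(zone_for_pixel(0))             # leftmost strip  -> 0
print(zone_for_pixel(3839))          # rightmost strip -> 31
print(PANEL_WIDTH_PX // NUM_ZONES)   # strip width in pixels -> 120
```

The consequence is that a small bright object lights its entire 120-pixel-wide strip from top to bottom, which is why bottom-edge-lit dimming struggles with dark scenes.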
Waaaah! I'm so sorry for replying so late; it's 4 in the morning here in Spain. No one responded even though the post had 42 views, and I thought nobody was paying attention, or that I was the only clueless one; I only just learned the terrible news from the TV and from what @ReignDespair communicated. Really, I'm so sorry, and I hope everybody is fine. I've been feeling a bit overwhelmed, and honestly, I'm very lucky not to be living through the same situation. I will be patient, don't worry, it's understandable; just like I was patient waiting almost a year for my Spectrum. All Eve workers have my support. And by the way, thank you very much for helping me and for answering my posts. I always see you liking them.
One question: LG Display hasn't started producing these panels yet, has it? I'm sorry, but the monitor is not able to show decent HDR content: a single illuminated strip along the bottom edge is ridiculous and insufficient. I've read so many reviews from TFTCentral, Hardware Unboxed, PC Gamer, Tom's Hardware... How can a manufacturer like LG, and many others, offer such a shitty LED configuration? It's a scam, and VESA DisplayHDR 400 is another scam. I've seen more videos on YouTube with that vertical-strip LED configuration (only along the bottom, which is very stingy). The examples I've posted (thanks to CNET) clearly show that with a single strip of LEDs on the bottom edge you can't see details, because the light literally fills the whole screen.

Obviously I'm not going to ask for a refund over this. On the contrary, I'm super happy with my purchase and I wouldn't change it for anything. In fact, I look forward to Eve bringing out a monitor with a QD-OLED or Micro LED panel in the next few years. But I'm pissed off and frustrated by the local dimming capability; it sucks, and I hope they change it. I want this monitor to sell well; its set of specifications and features deserves it. Really, I would pay extra to get LED strips on all the edges, I would (and hopefully this display can have a native 10-bit panel). I find it a fantastic monitor; it offers things other monitors don't. But absolutely everyone offers the same crap ability to display even "decent" HDR content. The manufacturers have no balls; they just want to squeeze more money out of the people who buy PC monitors. That's not fair.
I ask in case it's not too late for a change of plans. I think if you ran a poll, people would vote for LED strips on all edges. I really think so.
This monitor has the same capabilities as the LG ones with the same panel: LG 27GP950-B Review - RTINGS.com. Honestly, having used the GP950 at Micro Center and my Eve Spectrum, they look almost exactly the same panel-wise.
Sidebar: using emojis everywhere in your post doesn't help your credibility.
What doesn't lend credibility is playing the know-it-all. They share a similar panel, but not exactly the same one: the ES07D02 model is more recent, so the panel is newer. They don't have the same polarizer and they don't have the same number of local dimming zones; basically, it's a different panel, and they don't share the same resolution or refresh rate either. Also, what does it matter to you that I use emojis? Does it make my post less serious? I honestly don't see the point of your comment. The local dimming zones can be changed physically; they aren't fixed by the panel itself. If you want to know more about local LED dimming, click on the link and tell me, because your comment doesn't help at all. What good does it do me to know that they share the "same" panel, when this feature doesn't come from the panel itself?
As a note on the DisplayHDR standards being ‘a scam’: Certainly DisplayHDR400, HDR600, and similar standards do not offer the same experience that HDR1000 does. Then again, HDR1000 doesn’t offer the same experience that HDR1400 does, and I’m sure there will be new certifications in the future that make us look back at HDR1000 thinking 'that’s so stupid, how can anyone consider that real HDR?'. Compare it to how 720p was introduced as ‘High Definition’, or how USB 2.0’s 480Mb/s was ‘High-Speed USB’.
What is considered ‘high speed’, ‘high definition’, ‘high dynamic range’, ‘high refresh rate’, or high-anything, really, is always a moving target as technology advances. What supposedly fake standards such as DisplayHDR400 achieve is that there is a marketing incentive for manufacturers of both components and end products to create products that meet those minimum criteria. In the current market, 400 nits of peak brightness is an entry-level offering, compared to only a few years ago when 400 nits was still found on higher-end offerings. Even though a high-end standard such as HDR1400 is an amazing example of what modern technology can achieve, it's important to keep perspective. A rising tide lifts all ships!
Finally, there’s always the matter of cost. Increasing the number of backlight zones, or the complexity of their configuration, greatly increases the cost of the monitor. Yes, there are screens out there with a better HDR experience. They also have a different price tag, so if you’re looking for that, then expect to pay more. For comparison: even though Lamborghini has amazing V12 engines that put the performance of my car to shame, the price difference also reflects that, and in no way does it make the engine in my car a bad one.
That turned into a little rant, so TL;DR: Technology advances, and things get better all the time. Everything comes at a price. Things aren’t bad just because something better exists. A future Eve monitor might boast even more awesome specifications and features, but the current Spectrum line-up offers a great experience at its price point.
It’s an evolution of the existing panel. Doubling the number of vertical zones is a much smaller effort than completely redesigning the way that lighting zones work.
It'd be wonderful if LG would create a panel with maybe 128 lighting zones: a grid 16 wide by 8 high. Each zone would be about 4cm x 4cm, which isn't perfect, but at least backlight bleed would be relatively localized. However, edge-lighting this grid might create some interesting problems, or might run into patent issues, and redesigning with a rear light would require quite a different diffuser design.
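For what it's worth, the ~4cm figure checks out. Assuming a 27-inch 16:9 panel (illustrative numbers, not a confirmed spec), a 16 x 8 grid works out to zones of roughly 3.7 x 4.2 cm:

```python
import math

# Illustrative assumptions: a 27-inch 16:9 panel with a 16 x 8 zone grid,
# as floated in the post above.
DIAG_IN = 27.0
ASPECT_W, ASPECT_H = 16, 9
GRID_W, GRID_H = 16, 8

# Panel width/height derived from the diagonal and aspect ratio.
diag_units = math.hypot(ASPECT_W, ASPECT_H)
width_cm = DIAG_IN * 2.54 * ASPECT_W / diag_units
height_cm = DIAG_IN * 2.54 * ASPECT_H / diag_units

zone_w = width_cm / GRID_W
zone_h = height_cm / GRID_H
print(f"panel: {width_cm:.1f} x {height_cm:.1f} cm")   # ~59.8 x 33.6 cm
print(f"each zone: {zone_w:.1f} x {zone_h:.1f} cm")    # ~3.7 x 4.2 cm
```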
If LG do decide to massively change their backlight scheme, you can almost guarantee they’ll announce it with a new monitor of their own. Only after that will we get to see it in a Spectrum.
What I'm reading is very interesting; thank you very much for answering me. Frankly, these comments DO help and give me a clearer picture, @Helios and @NZgeek. I try to behave and speak civilly, but it makes me angry when a comment contributes nothing at all and dismisses my main concern by telling me I "lose credibility". It's very frustrating to hear that when I'm trying to educate myself and speak knowledgeably on the subject. I'm studying digital prepress; I edit photos and videos whatever the quality and whatever is required of me, and I've worked with images at higher resolutions than my screen, not just dimensions in cm (clearly my hobby is playing video games of many kinds: souls-likes, RPGs, shooters... Doom Eternal is a clear example xD, and Necromunda: Hired Gun). But there's something I'm missing in all this. In this post, CNET says the following:
Info.: LED LCD backlights explained - CNET
What's going on here? They say that light guides got better and costs had to come down (to make LED LCDs cheaper), so why have monitors remained relatively expensive for so long? The marketing people are rubbing their hands, selling and cheating. The reviews from Hardware Unboxed, TFTCentral and RTINGS don't show response times in full detail, because they measure from 10% to 90%, and they don't show images of backlight strobing quality; so they're not as reliable for me. RTINGS should improve its tests, because it's a large company (Hardware Unboxed is not a big company as such). People have asked RTINGS for more detailed tests and I still don't see the change. Those reviews are what really show the performance.

Manufacturers and brands are lazy, as TFTCentral says. They are very right, and I think we must demand change for a fair price. They always hide something, even while saying "we announce this, but there must be a minimum". SDR content has been with us since CRT technology was invented; we advance by leaps and bounds, but if you want something better, you have to bleed your wallet dry. Why does a professional LG OLED monitor cost between €2,000 and €3,000, unlike its TVs? Samsung Display has beaten them, both in quality and in price, and plenty of people will buy that QD-OLED ultrawide screen. I would even buy it myself, but I'm holding out for the Eve team to release one. I have my reasons, and I want to support this company in good faith. You've had enough balls to create a differentiated product, one that the rest of the brands would struggle to match, and that has a lot of merit.

I laughed so hard when one person said "give me reasons to buy this monitor", or something like that, in a contemptuous tone. I understand that people are pissed off because they've been waiting so long for their monitors, but come on: compare the Spectrum you have pre-ordered and purchased against the rest of the monitors. The facts are there.

I find it hard to understand and believe that the quality of TV screens is sometimes better than some monitors, and vice versa, when they are essentially the same. Some smart aleck will tell me "TVs are for watching shows, they have more input lag...". Of course; tell that to your friend who has no idea and you'll seem clever. There is something here that I'm not understanding, and the manufacturers are selling something to justify themselves. I would like to know what price difference exists between "top and bottom" and "along the bottom" edge lighting; it would be very interesting, because a row of LEDs at the bottom edge lights up an entire column. The fact that manufacturers invent a technology doesn't give them the right to sell something that works so badly that people end up disabling the feature. That's the impression it gives me. Sorry the text is so long; I have a strong concern, and it's gratifying to read your answers. Really, I thank you with all my heart.
I should point out that I'm 20 years old, so I'm not experienced enough to have seen this whole process of change in the market, from the CRT era to cool screen technologies like plasma, OLED, WOLED, QD-OLED, Mini LED, Micro LED, etc. I always thought LCD was a disappointing technology back when I was comfortable with my old CRT TV. @BlurBusters can confirm this, and surely @Liquidshadowfox too.
I think I'm too ambitious: when I see something that works, it's already being sold on the market to the general public, but they never put it all together in the same package.
I've read your comment several times, and it's a very good point that high-end monitors were 400 nits only a couple of years ago. Though I wonder whether, in those years, the TVs were already better; it's very interesting. Still, I think brightness is not the main problem from my point of view: the real contrast, the color gamut, and whether it uses a real 10-bit panel are the central issues. No one has complained that an OLED carries a VESA DisplayHDR 400 True Black certification; my phone has very poor brightness but extremely good colors and blacks. DisplayHDR 400 and 600 feel like half a certification: they meet the brightness and color-range requirements, and they don't care at all about the rest. Amazing... TFTCentral is right, hahaha: manufacturers are lazy, they want to sell to you and raise prices as they see fit to maximize their profits.
Well, from what I'm learning, DisplayHDR 600 is aimed more at professional designers, for these reasons:
Wider color gamuts than sRGB, allowing you to display more colors.
10-bit color depth. It's true that it's often not entirely real (8-bit + FRC rather than native 10-bit), but it's still a much more recommendable specification than plain 8-bit.
More brightness. I have to say that 400 cd/m² serves you very well, although it may be insufficient in certain situations.
Local dimming is dispensed with entirely. Still, I think it would be interesting if a future Spectrum had better hardware to dim small areas of the screen.
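On the "10-bit that isn't entirely real" point: such panels are usually 8-bit + FRC, where each pixel alternates between two adjacent 8-bit levels so that its time-average approximates the 10-bit value. A toy illustration of the idea (not any vendor's actual dithering algorithm):

```python
# Toy model of 8-bit + FRC (frame rate control): approximate a 10-bit
# level by alternating between two adjacent 8-bit levels over 4 frames.
# This is an illustration of the concept, not a real panel algorithm.
def frc_frames(level_10bit: int) -> list[int]:
    """Return 4 consecutive 8-bit frame values approximating a 10-bit level."""
    base, frac = divmod(level_10bit, 4)   # one 10-bit step = 1/4 of an 8-bit step
    # 'frac' out of every 4 frames show the next 8-bit level up.
    return [min(base + 1, 255)] * frac + [base] * (4 - frac)

frames = frc_frames(513)      # 10-bit level 513 = 8-bit level 128.25
print(frames)                 # -> [129, 128, 128, 128]
print(sum(frames) / 4)        # time-average -> 128.25
```

Real panels randomise the pattern spatially and temporally to hide flicker; done poorly, this is the dithering noise that makes "fake 10-bit" visible.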
Whoa, that's a big wall of text you've got there! Although I agree that many HDR implementations are subpar, I think it might be too late to make any hardware configuration changes to the monitors in question. Maybe it would be a good idea for the next revisions, but for now I think this monitor serves only 3 major purposes:
Color accuracy (for color accurate work)
Refresh rate, pixel perfect integer scaling and strobing (for gamers)
Ports (for everyone)
Although it would be nice to have a grid of LEDs as you mentioned, I don't think it's possible to make the change this late into the project. I will personally use this monitor for programming and gaming and nothing else; HDR is useless to me and causes more harm than good.
With HDR enabled on Windows, we have a few issues:
Alt-tabbing between certain full-screen applications causes a prolonged black screen because the app in question doesn't support HDR
Auto HDR coming to Windows still has a lot of bugs (it's not mature) and doesn't look that promising yet
Most games don’t support HDR correctly
I am honestly accepting this monitor for what it is: a good monitor tuned by Blur Busters, with 8-bit color, a good port selection, and a high refresh rate (280 Hz). Despite these low expectations I have set, this monitor will still be MILES better than anything out on the market right now. For comparison:
LG monitors (currently there's no glossy gaming monitor available, and the contrast is usually low at around 800:1)
Samsung VA G7 and G9 monitors (there's still a hint of black smear [I've owned the G7 and can speak to this], and out of the 3 panels I bought, all of them had flicker issues even after the firmware fix for G-Sync [all 3 panels had microstutters with VRR control enabled])
Asus gaming monitors (their ELMB Sync implementation is subpar; their anti-flicker algorithm is worse than Gigabyte's and results in slight double images no matter the refresh rate on my XG27AQM; you also can't change overdrive while the backlight strobing mode is enabled)
Gigabyte gaming monitors (their Aim Stabilizer Sync doesn't work below 75 Hz, with very little configurability of the overdrive while strobing is enabled)
MSI gaming monitors (these are the closest to being perfect, but they don't have a decent VRR + strobing implementation yet, let alone on any panels above 200 Hz)
Eve should take HDR seriously for their next iteration, but I don't think it'll happen for this one. I'd rather wait for their OLEDs for HDR, where DisplayHDR 400 True Black certification will look MUCH better than IPS HDR 600, imo.
Totally agree with you, although the second and third points were my main reasons for buying my Spectrum over other monitors. It still seems a shame that this monitor is mediocre for viewing HDR content (10-Bit vs. 8-Bit + FRC: Monitor Panel Difference | BenQ US), because of the local dimming and the real bit depth; at least in SDR it will be great. I wonder what Acer's backlight strobing quality is like, whether it's good or bad compared to Asus. I think Eve should think very seriously about presenting a future Spectrum based on a QD-OLED or Micro LED panel (very difficult, because the technology is still recent). After launching the ES07D01 model, they should definitely move to OLED. According to what the Blur Busters Chief says, it will be more complex and difficult to get good quality and avoid crosstalk if the next Spectrum is based on an LCD panel + full-array local dimming, and I don't know if Mini LED makes it possible; OLED or Micro LED would be more suitable when using backlight strobing: [Scanning Backlights] About Full-Array Backlights and Strobe Crosstalk - Blur Busters Forums
Sorry about the wall of text; there are many things to comment on xDD. I have another question about the spec sheet: in the section that lists brightness as "typical 450 cd/m² - 750 cd/m² peak", what is the difference between typical and peak? Does typical brightness mean the real sustained capability, like typical contrast? Something nags at me whenever there's talk of "600 nits" or "1000 nits" on a screen.
I THINK (don't quote me on this) typical brightness is when most of the screen is displaying something white, and peak is when maybe 10% of the screen is white and everything else is dark or darker colors. I've seen RTINGS show this in their reviews, but I could be wrong.
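That is roughly how test labs measure it: sustained ("typical") brightness is taken on a large or full-screen white patch, while peak brightness uses a small white window, often 10% of the screen area, on a dark background. As a sketch of the geometry involved (the 4K frame size is an assumption for illustration):

```python
# Sketch: how big a "10% window" test patch actually is on an assumed
# 3840x2160 frame. Illustrative only; labs' exact patches may differ.
FRAME_W, FRAME_H = 3840, 2160

def window_side(frame_w: int, frame_h: int, area_fraction: float) -> tuple[int, int]:
    """Side lengths of a centered rectangular test window covering
    `area_fraction` of the frame, keeping the frame's aspect ratio."""
    scale = area_fraction ** 0.5   # area scales with the square of the side
    return round(frame_w * scale), round(frame_h * scale)

w, h = window_side(FRAME_W, FRAME_H, 0.10)
print(w, h)   # a 10% window on 4K -> 1214 x 683 pixels
```

The point is that a 10% window is still a fairly large rectangle, yet small enough that the backlight and power supply only have to drive a fraction of the screen at full blast, which is why peak numbers exceed typical ones.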
I think what you say is true according to the image
Although I thought that peak brightness would be like a ceiling for how bright the screen can get in a scene like a sunset, with everything else that's bright held at the sustained brightness. I don't know, I still have a lot to learn and study as an image and video editor, although by 25 I'd like to be working as a game programmer; a silly dream for any kid who plays a lot of all kinds of video games. It must be very difficult and very hard. My favorite big studio is FromSoftware, along with Capcom (Monster Hunter: World made a radical change as the saga progressed). I'm sorry if I strayed from the topic; never mind what I just said.
One question, @Helios: is it possible that the next Spectrum will be based on an OLED panel from LG Display or a QD-OLED from Samsung Display, and be Dolby Vision certified? I've seen several videos saying that tone mapping depends largely on the manufacturer's implementation, and the screen can show a better or worse final result. Unlike Dolby Vision, I think regular HDR doesn't have the same tone mapping, or at least not the kind a dedicated chip does. Sorry for asking; I'd find it super interesting if Eve could reconsider this specification.
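For context on what tone mapping does: when content is mastered brighter than the display's peak, the display (or, in Dolby Vision's case, dedicated per-scene processing) compresses highlights into the panel's range. A simple Reinhard-style roll-off illustrates the concept; this is a textbook curve, not Dolby Vision's actual algorithm, and the 750-nit peak is just borrowed from the spec-sheet figure discussed above:

```python
# Illustrative Reinhard-style tone mapping: compress scene luminance
# (in nits) into a display's peak. NOT Dolby Vision's actual algorithm;
# the 750-nit default is an assumption taken from the spec sheet above.
def tone_map(nits: float, display_peak: float = 750.0) -> float:
    """Map an input luminance to [0, display_peak) with highlight roll-off."""
    x = nits / display_peak
    return display_peak * x / (1.0 + x)

print(round(tone_map(100.0), 1))   # -> 88.2  (mid-tones slightly compressed)
print(round(tone_map(4000.0), 1))  # -> 631.6 (a 4000-nit highlight rolled off)
```

A better curve would leave mid-tones untouched and only roll off near the top; choosing that curve per scene is exactly the decision Dolby Vision's dynamic metadata helps the display make, while static HDR10 leaves it to each manufacturer.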
Thank you for the input! Currently we are still focused on the Spectrum line-up that we have on hand and designed.
When we start a new project or updates to our current line-up, the community will be the first to know. Development will, of course, always take community feedback into account. Stay tuned!