Has anyone tested integer scaling?

I've watched many reviews on YouTube and read this forum as well as other sites, and I can't find a single place where the main feature of the Spectrum has been tested. That's a big oversight, since it was one of the main selling points, yet nobody seems to be covering it.

I have the 4K model pre-ordered and wanted some real testing of that feature before I make the remainder of the payment. I have older consoles (PS3, Xbox 360) as well as newer ones, and my laptop has a 1660 Ti, which won't be able to output most games in 4K. So I would like to know if anyone has tested the older consoles, and also 1080p gaming on PC.

Does it look sharp or blurry? How does it work, and how easy is it to activate? Any info would be welcome.

Thank you.


As a community tester, I tested pixel-perfect (integer) scaling which was the main/only feature I was interested in regarding Eve Spectrum.

Pixel-perfect scaling does work, with some limitations: it only works at resolutions predefined in the monitor’s EDID, and at custom resolutions it just centers the image at 100% instead.

YouTube reviewers and other mass media indeed consistently miss this exclusive feature of Eve Spectrum and only cover generic features like HDMI 2.1.

The only review I’m aware of so far that somewhat covered integer scaling is the one by Eurogamer / Digital Foundry.


Is it any different from the integer scaling of Nvidia’s Ampere GPUs?

Both scaling via display and scaling via GPU have their use cases.

Prescaling via GPU is basically a limited workaround:

  • Impossible with non-computer video sources such as game consoles and hardware video players. For example, Nintendo Switch, PlayStation 3, MiSTer, Super Nt, Mega Sg, SNES Mini.

  • Wastes bandwidth, potentially resulting in a lower refresh rate or color depth compared with the display’s own scaling.

  • Doesn’t work with pre-Turing Nvidia GPUs.

  • Specifically, Nvidia’s implementation of integer scaling officially has multiple limitations, such as hardware-level incompatibility with HDR, 4:2:0 chroma subsampling, custom resolutions, tiled mode (used in the Dell UP3218K 8K monitor), etc.
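To give a rough sense of the bandwidth point above, here is a back-of-the-envelope sketch. It counts raw pixel data only, ignoring blanking intervals and link-encoding overhead, so the absolute numbers are approximate; the 4× ratio is the point.

```python
# Rough uncompressed video bandwidth: width * height * refresh * bits per pixel.
# When the GPU prescales 1080p up to 4K before sending, the link must carry
# the 4K figure; with display-side scaling it only carries the 1080p figure.

def bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

native_1080p = bandwidth_gbps(1920, 1080, 60)  # signal as the game renders it
prescaled_4k = bandwidth_gbps(3840, 2160, 60)  # same image, upscaled by the GPU

print(f"1080p@60: {native_1080p:.2f} Gbit/s")  # ~2.99 Gbit/s
print(f"4K@60:    {prescaled_4k:.2f} Gbit/s")  # ~11.94 Gbit/s, 4x more
```

That spare link bandwidth is what the display-side approach can spend on a higher refresh rate or color depth instead.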

See also the corresponding question in the FAQ in my article about integer scaling.


I still have no idea what it really is, how it works, when it works, what it’s for :laughing:

Pixel-perfect integer-ratio scaling with no blur a.k.a. integer scaling



I read your post mentioned earlier and understand the subject better now, but I still don’t see when I would make use of the OSD (aspect-ratio) options. When I change them I notice no difference, regardless of the resolution. Hmm.

Note that for integer scaling to work, the logical resolution must be at most half the native resolution of the display in each dimension. For example, with a 4K monitor, the maximum logical resolution for integer scaling is Full HD (1920×1080), so each logical pixel turns into a square group of 2×2 physical pixels.
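The rule above can be sketched in a couple of lines (my own illustration, not anything from the monitor’s firmware): the scale is the smaller of the width and height ratios, rounded down, and a scale of 1 means the image is only centered, not enlarged.

```python
# Integer scale factor for a given logical (input) resolution on a given
# native panel: floor of the smaller axis ratio.

def integer_scale(native_w, native_h, logical_w, logical_h):
    return min(native_w // logical_w, native_h // logical_h)

# On a 4K (3840x2160) panel:
print(integer_scale(3840, 2160, 1920, 1080))  # 2 -> each pixel becomes 2x2
print(integer_scale(3840, 2160, 1280, 720))   # 3 -> each pixel becomes 3x3
print(integer_scale(3840, 2160, 2560, 1440))  # 1 -> just centered, no scaling
```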

Also make sure scaling via display is selected in the GPU control panel. This is easy to check: the resolution displayed in the monitor menu/OSD must be equal to the logical resolution you’re working at. For example, if you selected 1920×1080 at 60 Hz in Windows display settings, the monitor menu must display “1920x1080@60Hz” at the top right. With scaling via GPU, the resolution displayed in the monitor menu will always be native: “3840x2160@…”.

And for what it’s worth, the option in the monitor menu to enable integer scaling is:

“Picture” → “Aspect Ratio” → “Pixel perfect”.


Thank you so much for your replies, I have a couple of extra questions if you don’t mind:

  1. If I keep my OS at 4K but set my game’s internal settings to output 1080p, would the pixel perfect scaling automatically kick in when I launch the game?
  2. For consoles, say a PS3, when I start it, would it automatically try to upscale, or would the monitor switch into pixel-perfect mode using 720p (3×) scaling?

I mean, is it intelligent and automatic, or would I have to set the whole OS to 1080p and also set the monitor to integer scaling for it to work?

Basically yes, but this depends on whether the game is in true (exclusive) full-screen mode. Many modern PC games use so-called borderless (“Windowed Fullscreen”) mode instead (at least by default), which is a regular windowed mode, just with the game window sized to occupy the entire screen.

If the game supports exclusive full-screen mode, it’s important to switch the game to this mode. If it does not, the only way to use display scaling is to switch the OS itself to the in-game resolution before running the game.

Pixel-perfect scaling works with any input resolution as long as pixel-perfect scaling is enabled in the monitor settings (“Picture” → “Aspect Ratio” → “Pixel perfect”) and the input resolution is among resolutions supported by the monitor (predefined in the monitor’s EDID). Those resolutions include Full HD (1920×1080, 1080p) and HD (1280×720, 720p).

I successfully tested the feature with both a PC and non-computer devices such as a camera (Panasonic GF5, 1080i, etc.) and a game console (SNES Mini, 720p). See my tester thread for details.


Thanks Marat, I read all of your post; it’s very detailed and informative!

Another question I have: say my OS is at 4K and I have pixel-perfect switched on. Would that be OK? I mean, since 4K is native, it would just ignore that and not scale, only doing so when I run a game (in exclusive full screen) at 1080p or 720p?

In your post, it mentions that HDMI #2 had issues: lower resolution support and no scaling. Is that still the case? I intended to put my laptop on HDMI #1 or DP and my consoles on the other ports, like HDMI #2, but your post has me worried that it may not work.

Yeah, if the native-to-logical resolution ratio is lower than 2.0, the maximum integer scale is 1.0, so the image is just centered without scaling and the rest of the screen space is filled with black. For example, this happens at QHD (2560×1440). 3840×2160 is equal to the native monitor resolution, so there are no black bars and the screen is occupied entirely, without scaling or centering. The monitor works as expected in this regard.

According to Grant (@Lore_Wonder), mass-produced units have multiple improvements compared with the preproduction prototypes provided to community testers, including a somewhat improved HDMI port #2, though I’m not aware of what exactly was improved or to what extent. I don’t have a mass-produced unit, so I cannot test it myself. As far as I understand from the monitor specs, HDMI ports #1 and #2 are meant to be functionally identical.


Also, most people just don’t get it. If you’ve got a 4K monitor with an integer scaling feature, you’re able to play anything at 1080p without sacrificing clarity or introducing blur, AND without needing an Ampere GPU. I’m still on a GTX 1080 with an old 1080p monitor, so going for a 4K monitor and waiting for a 4xxx series is just perfect in my use case: nothing’s sacrificed and only gained.

As for integer scaling in general, it can be implemented on the application side via render targets (FBOs) with nearest filtering enabled for color attachments, later resolved onto the backbuffer. Some of the more advanced games allow that, but sadly there aren’t many of them.
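The nearest-filtering idea can be sketched independently of any graphics API. This toy Python version (my own illustration, not real renderer code) just replicates each source pixel into an s×s block; since no intermediate colors are computed, no blur is introduced. In a real game this would be a framebuffer blit with nearest filtering, not a loop.

```python
# Nearest-neighbor integer upscaling of a framebuffer: every pixel value
# is copied into a scale x scale block of identical pixels.

def upscale_nearest(pixels, scale):
    """pixels: 2D list of pixel values; returns the integer-upscaled image."""
    out = []
    for row in pixels:
        scaled_row = [p for p in row for _ in range(scale)]  # widen each pixel
        out.extend([scaled_row[:] for _ in range(scale)])    # repeat each row
    return out

src = [[1, 2],
       [3, 4]]
for row in upscale_nearest(src, 2):
    print(row)
# [1, 1, 2, 2]
# [1, 1, 2, 2]
# [3, 3, 4, 4]
# [3, 3, 4, 4]
```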


Thank you for the clarification, I have placed my order.


My bad that I wasn’t aware of this feature before I ordered; I was pleasantly surprised when I started using it. IMHO it’s much more encompassing than a vendor-specific GPU implementation.


Is the integer scaling enabled when adaptive sync is on?
I can’t even choose it.


In my experience, once enabled in the monitor menu/OSD, pixel-perfect (integer) scaling keeps working regardless of whether the corresponding option in the menu is available for changing or grayed out.

Pixel-perfect scaling should also be turned on by default. If it’s not, turn off Adaptive Sync or Low-latency mode (which usually cause “Aspect ratio” to be grayed out), switch the “Aspect ratio” option to “Pixel perfect”, then turn Adaptive Sync or Low-latency mode back on.


I have an Nvidia GPU (3060 Ti). What scaling settings do I have to use in the Nvidia control panel on the “Adjust Desktop Size and Position” page so that the Spectrum correctly applies pixel-perfect scaling to my games? I don’t know if some settings may override the Spectrum’s integer scaling.

Just disable scaling via GPU in the Nvidia control panel, so that the computer sends the signal at its original unscaled resolution and the monitor does the scaling:

“Adjust desktop size and position” → “Perform scaling on” → “Display” → press “Apply” button.

Note that Windows 10 tends to silently enable scaling via GPU when switching to a non-native resolution via the “Display resolution” dropdown in the “Display” window (opened via the “Display settings” item of the desktop context menu). This happens with both Nvidia and AMD GPUs. So if you manually switch the Windows desktop resolution, use the classic “List All Modes” window instead, available via “Advanced display settings” → “Display adapter properties for Display 1”. Fortunately, this issue typically does not affect games.


OK. On the same page of the control panel, when selecting “Perform scaling on: Display”, I also have to choose between “Aspect ratio”, “Full-screen” and “No scaling”, and check/uncheck “Override the scaling mode set by games and programs”. I have Windows 11, by the way.