Which is the better port to use for the 4K@144Hz Spectrum: HDMI 2.1 or DisplayPort 1.4?

While I excitedly wait for my Spectrum 4K@144Hz to arrive, I’ve begun making sure I have the right setup to get the most out of it. That brings me to the question: which port will get the most out of the Spectrum 4K@144Hz?

From what I’ve found, both HDMI 2.1 and DisplayPort 1.4 seem to have limitations reaching 144Hz at 4K. HDMI 2.1 is often listed as natively supporting 120Hz at 4K UHD resolution, with higher refresh rates requiring DSC (Display Stream Compression). Similarly, DisplayPort 1.4 only supports 4K up to 120Hz unless you use DSC, in which case you can reach 144Hz at “4K”, but with compression.

There are many other differences between the two ports as well, such as color gamut support and integrated anti-tearing tech. Other sources have compared the two, but their information is contradictory and has left me more confused. So I turn to the Eve community.

Between HDMI 2.1 and DisplayPort 1.4, which one will give you the best access to the Spectrum’s full specs in resolution, refresh rate, and color?

If you’re on a PC or other device that allows you to manually set a custom resolution, you can take advantage of HDMI 2.1’s 48Gbps bandwidth to run Spectrum at 4K@144Hz without any compression. While the official spec supports up to 120Hz, you can set it to whatever you want with a custom resolution.
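A rough back-of-the-envelope check of the bandwidth claim (this ignores blanking intervals and link-encoding overhead, so the real on-wire figure is somewhat higher):

```python
# Uncompressed pixel data for 4K @ 144 Hz with 10-bit RGB (HDR).
# Blanking intervals and FRL encoding overhead are ignored here,
# so this is only a lower-bound illustration.
width, height, refresh_hz = 3840, 2160, 144
bits_per_pixel = 30  # 10 bits per channel, RGB

gbps = width * height * refresh_hz * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gbit/s of pixel data vs 48 Gbit/s HDMI 2.1 link rate")
```

Even at 10-bit color the raw pixel data comes in under the 48 Gbit/s link rate, which is why uncompressed 4K@144Hz is at least plausible over HDMI 2.1.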


Perfect! That’s exactly what I was looking for. Thank you!!!


Sorry, but this is incorrect. You don’t need to set a custom resolution to reach 144Hz on HDMI 2.1. Spectrum uses non-standard timings to reduce bandwidth, meaning 144Hz at 4K HDR stays under the bandwidth limit. If the monitor used standard timings then yes, it wouldn’t be able to reach 144Hz at 4K HDR.


I used HDMI 2.1 but dropped back to DisplayPort at 4K 120Hz. With my RTX 3090 I was getting handshake issues (no image until the PC had finished booting). This is a somewhat known issue with 30-series cards connecting to TVs.

Not sure how AMD cards would handle it.


For me, turning off Fast Boot in Windows helped a lot with the start-up issues I was having over HDMI 2.1. I did have to set the refresh rate to 120Hz, though, to get surround mode (3 Spectrums) to work reliably.


Is there any visible difference using DSC with DP 1.4 vs not? I use DP on my current monitor for the PC (RTX 3080) and the HDMI ports for a Google TV and a PS5.

I’ve been using DP for my PC with a 3090 and it’s been fine, running 4K/144Hz/HDR.
I’m not sure, but I’ve read that DP uses DSC to hit those specs. I haven’t particularly noticed any difference.

Can you elaborate?

How can non-standard timings lead to lower bandwidth?

Reduced Blanking: The Way to Low Power, High Resolution Displays | Synopsys

This link explains the relationship between timings and the pixel clock/bandwidth required. Timings with reduced blanking periods lower the bandwidth required, and that freed-up headroom can be used to increase resolution or, in Spectrum’s case, refresh rate.
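To put rough numbers on it, here’s an illustrative comparison. The blanking values below are ballpark CVT-style vs. reduced-blanking-style figures, not Spectrum’s actual timings, and the ~42.7 Gbit/s payload figure comes from HDMI 2.1’s 48 Gbit/s link rate with 16b/18b encoding:

```python
def link_gbps(h_active, v_active, h_blank, v_blank, refresh_hz, bpp=30):
    """Pixel clock x bits per pixel, in Gbit/s (10-bit RGB by default)."""
    total_pixels = (h_active + h_blank) * (v_active + v_blank)
    return total_pixels * refresh_hz * bpp / 1e9

# Ballpark blanking figures -- illustrative, not Spectrum's real timings.
standard = link_gbps(3840, 2160, h_blank=600, v_blank=90, refresh_hz=144)
reduced  = link_gbps(3840, 2160, h_blank=80,  v_blank=62, refresh_hz=144)

hdmi21_payload = 48 * 16 / 18  # ~42.7 Gbit/s usable after 16b/18b encoding
print(f"standard blanking: {standard:.1f} Gbit/s")  # exceeds the payload limit
print(f"reduced blanking:  {reduced:.1f} Gbit/s")   # fits within it
```

With conventional blanking the required rate lands above the usable payload, while reduced blanking brings the same 4K@144Hz 10-bit signal comfortably under it.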

Hi, if you ordered this Eve monitor, which has a Thunderbolt 4 USB-C port, why would you deal with those other, older connections? Just use Thunderbolt 4 from your laptop/desktop with a certified Thunderbolt 4 cable and you’re good to go.

Is it possible to enable G-Sync and HDR with HDMI 2.1? If so, it should be the best gaming option, right?

I do believe so! If you have a GeForce GTX 16 series or higher GPU you should be able to with HDMI 2.1 :slight_smile:

I have a 3080 Ti, so I will buy an HDMI 2.1 cable for my future Spectrum! :slight_smile:

The Spectrum only supports DisplayPort alt mode. It doesn’t support Thunderbolt.

DP alt mode allows physical wiring inside the USB-C cable to carry a DP signal rather than USB. It’s why you can use a simple adapter cable to convert USB-C to a regular DP plug.

TB wraps up PCIe data packets and allows them to be sent over a cable. Video signals don’t use dedicated wires - they have to be extracted from the TB data. The implementation is a lot more complex, and anyone who uses it has to go through additional certification and pay licensing fees.

TB has been around longer than DP alt mode, so a lot of older monitors supported USB-C graphics via TB instead. Things have changed more recently, and TB is now typically only seen on high-end productivity monitors (e.g. the ones Apple makes).


Not quite correct. Please see this table on Nvidia’s site.

Nvidia didn’t introduce HDMI 2.1 until they brought out their RTX 30xx series of graphics cards. Everything earlier is limited to HDMI 2.0b, which means 4K@60Hz SDR or 4K@50Hz HDR.

The big improvement brought in with the GTX 16xx and RTX 20xx cards is support for DisplayPort DSC. This allows for 4K@144Hz HDR to be used with a DisplayPort 1.4 capable monitor (like the Spectrum).
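A rough illustration of why DSC matters on DP 1.4. The blanking values are ballpark reduced-blanking-style figures (not Spectrum’s real timings), and real DSC targets a configurable bit rate rather than a fixed ratio, so the 3:1 here is just a representative value:

```python
# DisplayPort 1.4 HBR3: 32.4 Gbit/s raw; 8b/10b encoding leaves ~25.92 Gbit/s.
DP14_PAYLOAD_GBPS = 32.4 * 8 / 10

# 4K @ 144 Hz, 10-bit RGB, with reduced-blanking-style timings (illustrative).
pixel_clock_hz = (3840 + 80) * (2160 + 62) * 144
uncompressed_gbps = pixel_clock_hz * 30 / 1e9

print(f"uncompressed:  {uncompressed_gbps:.1f} Gbit/s")      # over the ~25.9 limit
print(f"with ~3:1 DSC: {uncompressed_gbps / 3:.1f} Gbit/s")  # comfortably under
```

Uncompressed, the signal overshoots DP 1.4’s usable payload, which is why the link needs DSC to carry 4K@144Hz HDR.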


Thank you for shining light on that topic @NZgeek and clearing that up :blush:
