HDMI 2.1 or DP 1.4?

Hi,

I’ve ordered a Spectrum 4K and have an RTX 3090 (link). Should I use HDMI 2.1 or DP 1.4 with this combo?

A monitor reviewer said DP 1.4 because it has “VESA DSC” but idk :man_shrugging:

Thanks for your help!

1 Like

As far as I know, you would need to use HDMI 2.1 if you want to get 4K at 144 Hz, while DP 1.4 will only run 4K at 60 Hz. That's just what I've read and heard, so it could be wrong.

HDMI 2.1 is objectively better than DP 1.4. With a 3090 you'd use HDMI 2.1.

2 Likes

DP 1.4 + DSC has less input lag than HDMI 2.1, at least on currently available monitors.

HDMI 2.1 also has VESA DSC, so that isn’t a valid reason.

@kreative above says that HDMI 2.1 has more latency on existing monitors but I’ve not looked into that thoroughly enough to comment one way or the other.

The main advantage of using HDMI 2.1, however, is that you don't have to use DSC to get the same picture to the screen. Both standards will be able to run Spectrum at its advertised specs (4K 144 Hz); DP 1.4 just uses DSC to do it, while HDMI 2.1 doesn't need to. I'm personally going to use HDMI (I also have a 3090) unless I see some specific testing that shows an actual difference in latency, as kreative alluded to.

6 Likes

From what I read, DisplayPort 1.4 has 32.4 Gbps of bandwidth and a data rate of 25.92 Gbps. It can handle 4K at 120 Hz, and with DSC it can support 30-bit/px color with HDR enabled.

HDMI 2.1 has 48 Gbps of bandwidth, can handle resolutions up to 10K and frame rates up to 120 Hz, and supports 16-bit color and BT.2020. HDMI 2.1 is great for playing PS5, Xbox Series X, and PC at high resolution and high fps.
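The two "data rate" figures above follow directly from the line coding each standard uses. A quick sketch in Python showing where they come from (DP 1.4's 8b/10b and HDMI 2.1's 16b/18b coding ratios are from the public specs; everything else here is just arithmetic):

```python
# Usable data rate = raw link bandwidth minus line-coding overhead.
# DP 1.4 uses 8b/10b coding (80% efficient); HDMI 2.1 FRL uses
# 16b/18b coding (~88.9% efficient).

def data_rate_gbps(raw_gbps: float, payload_bits: int, coded_bits: int) -> float:
    """Payload data rate left over after line coding."""
    return raw_gbps * payload_bits / coded_bits

dp14 = data_rate_gbps(32.4, 8, 10)     # DisplayPort 1.4 (HBR3, 4 lanes)
hdmi21 = data_rate_gbps(48.0, 16, 18)  # HDMI 2.1 (FRL, 4 lanes x 12 Gbps)

print(f"DP 1.4 payload:   {dp14:.2f} Gbit/s")    # 25.92
print(f"HDMI 2.1 payload: {hdmi21:.2f} Gbit/s")  # 42.67
```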

Here’s some links:

https://www.google.com/amp/s/www.cnet.com/google-amp/news/hdmi-2-1-what-you-need-to-know-for-gaming-8k-tvs-and-more-in-2021/

7 Likes


Source: "진짜가 나타났다! LG 27GP950 양심 리뷰!" ("The real thing has appeared! An honest LG 27GP950 review!") - YouTube

Video is about the LG 27GP950

See also “A Guide to HDMI 2.1” — an article by Simon Baker from TFT Central.

1 Like

As another 3090 owner here, I'll be using HDMI 2.1. As far as input lag is concerned, it isn't a strict case of DP 1.4 always being better (or worse) than HDMI 2.1. The total input lag depends on your particular system as a whole, including the input device (i.e. mouse or keyboard), USB controller, chipset, PCI bus, CPU, RAM, OS, drivers, application, GPU, and eventually the monitor (including its own unique mix of internal circuitry, scaler, etc.).

Besides, based on the YouTube translated closed captioning, I'm not 100% sure this video can be trusted.

5 Likes

I'm going to chime in here, as I have been testing Spectrum and have noticed no extra input lag associated with using HDMI 2.1. This is anecdotal evidence, but I am skeptical that using HDMI 2.1 would add input latency. I may be wrong, but I had not heard of this until now, and I keep myself as informed as I can. Input lag can depend on many distinct factors, and if a reviewer does not properly take that into consideration and remove as many variables as possible, the data could be skewed. I wish I could provide more concrete data on the Spectrum in particular, but I don't have the tools required, so my opinion on input latency for the Spectrum will have to do :sweat_smile:

6 Likes

I will use a two-monitor setup. Do you know any reasonably priced graphics card (3060 or 3070) that has two HDMI 2.1 output ports? I quickly looked at EVGA, but they have 1 HDMI and 3 DP ports…
So in this setup, is DP the only option?

The only company I know that does this is Gigabyte, but they only support multiple HDMI 2.1 ports on their premium options, e.g. the AORUS GeForce RTX™ 3060 Ti/3070 MASTER.

So if it’s a feature you really need you’ll have to pay a premium for it.

Thank you. I quickly checked the Gigabyte website and found cards outside the AORUS line with 2x HDMI 2.1 available, like (https://www.gigabyte.com/Graphics-Card/GV-N3060EAGLE-12GD-rev-10/sp#sp or GeForce RTX™ 3060 GAMING OC 12G (rev. 2.0) Key Features | Graphics Card - GIGABYTE Global)
I haven't checked the price…
I haven’t checked the price…

1 Like

As a user with a 2070 Super with DP 1.4 and HDMI 2.0, would DSC affect the displayed colors? Can I use DP 1.4 to achieve 4k@144Hz with confidence?

As far as I know, this generation there are more GPUs that have two HDMI 2.1 ports. Last generation it was only Gigabyte, with GPUs that had 3 HDMI ports, two of them being 2.1. This generation it looks like around half of GPUs have two HDMI 2.1 ports, though the Nvidia FE is unfortunately not one of them.

I bought an ASUS RTX 3070 8GB Dual earlier this year ready for my pair of Spectrums (Spectra?). It has 2 x HDMI 2.1 and 3 x DisplayPort 1.4a.

Like all the Nvidia RTX cards, they're still in short supply, but I don't believe the lead times are too bad at present.

DSC is lossy, though it’s marketed as visually lossless.

1 Like

If I understood the specifications and previous communications right, either HDMI 2.1 or DP 1.4 is required for running the 4K model at 120 Hz with HDR.
Since I personally use a Vega 56, with only HDMI 2.0b and DP 1.4 available, the choice is easy.

I would be really curious though, to at one point get information - such as a list of verified GPUs and/or ports for all the possible resolutions and refresh rates.

DP 1.3 added support for non-HDR 4K @ 120Hz via HBR3 (32.4 Gbps). DP 1.4 kept the same bandwidth but added DSC, which is what enables 8K @ 60Hz and comfortably handles HDR 4K @ 120Hz.

From what we’ve been told, DP 1.4 with DSC will do HDR 4K @ 144Hz. The few extra Hz are just enough to push it outside of non-DSC territory.
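To see why those few extra Hz are the tipping point, you can ballpark the uncompressed data rate a given mode needs. A rough sketch in Python, assuming a ~7% blanking allowance (real reduced-blanking timings like CVT-RB2 differ slightly, so treat these as ballpark figures, not spec-exact numbers):

```python
# Approximate uncompressed data rate (RGB / 4:4:4) for a video mode,
# with a ~7% allowance for blanking intervals. Real reduced-blanking
# timings vary, so these are ballpark numbers only.

def required_gbps(width: int, height: int, refresh_hz: int,
                  bits_per_channel: int, blanking: float = 1.07) -> float:
    """Approximate data rate in Gbit/s needed to carry a mode uncompressed."""
    pixels_per_second = width * height * refresh_hz * blanking
    return pixels_per_second * bits_per_channel * 3 / 1e9

DP14 = 25.92    # Gbit/s usable on DP 1.4 (after 8b/10b coding)
HDMI21 = 42.67  # Gbit/s usable on HDMI 2.1 (after 16b/18b coding)

for hz, depth in [(120, 8), (120, 10), (144, 8), (144, 10)]:
    need = required_gbps(3840, 2160, hz, depth)
    print(f"4K@{hz} {depth}-bit: ~{need:.1f} Gbit/s "
          f"(DP 1.4 {'ok' if need <= DP14 else 'needs DSC'}, "
          f"HDMI 2.1 {'ok' if need <= HDMI21 else 'needs DSC'})")
```

With these rough numbers, 8-bit 4K @ 120Hz just squeezes into DP 1.4 uncompressed, while 144Hz (and anything 10-bit) overshoots it yet still fits HDMI 2.1 uncompressed.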

The verdict so far on DSC is that there’s virtually no difference in the picture quality. If you were doing a side-by-side comparison and knew what to look for, you’d probably be able to spot some minor differences. It honestly shouldn’t be a concern.

3 Likes

There is a reason why reviewers use tools to measure display latency. So I'm sorry, but I can't just take your word that there is no added latency compared to DisplayPort, when you've admitted that the only tool you used to verify this was your eyes.