DisplayPort 1.2 (HDCP 1.2), HDMI (HDCP 1.4) question

Hello, I just bought a cheap new monitor for my PC. (My old Philips 27" is about 8 years old and getting unreliable, so I wanted to try a new one.)

So this new monitor is a CHEAP Samsung 27" monitor C27C322.

But my question is: what is the difference between the two connections in the tech specs, which say this Samsung supports (cables included) “DisplayPort 1.2 (HDCP 1.2), HDMI (HDCP 1.4)”?

Which is my best choice? 1.2 sounds lower than 1.4, but I know DP is more modern. In what way, and will it affect my enjoyment of the monitor?

I believe the new Samsung is 100 Hz, but I’d guess both interfaces support that?

Any image quality differences?

Any other tips?

Ignore the version numbers; they’re different technologies from different groups. For 99% of users (you included) it doesn’t matter. The important part is: what connector does your PC have, and what cables do you have?

Generally speaking, DP is better for very high resolutions/refresh rates, and HDMI can pass audio. But again, for your purposes: if your PC only has one, use that; if it has both, use the cables you have; if you don’t have cables, just buy a DP-to-DP cable.

I have an RTX 2080 Ti with both interfaces. I used to play games furiously from the ’70s until about 2015, so I’m mostly a picture/movie guy now. BUT I dip into games now and then, and I want OK visuals for that.

So I’ve got both interfaces, and I wondered about those specs and numbers…

I might have got an answer up here (thank you), unless anyone wants to add something about image quality.

(Oh, and I have speakers, so I don’t worry about HDMI sound output.)

Both standards well exceed the capabilities of the monitor, so just use either; you won’t be able to tell the difference.

If you have a laptop as well, you can make a “ghetto KVM” by connecting each machine to a different port, so connect the desktop to whichever input the laptop doesn’t use.

If the difference you’re wondering about is the HDCP versions, I don’t see how they would matter to a normal computer use case.

HDCP is High-bandwidth Digital Content Protection, which only affects how a monitor is allowed to display copyrighted media. I’ve never seen any setup where HDCP 1.2 wasn’t good enough, and DisplayPort’s advantages mostly matter for computer-related uses (rather than home media). However, at 27", HDMI may be good enough, and the cable is probably cheaper.

I was just wondering if there are any differences in image quality between the two (for desktop use).

No.

Not for this monitor/graphics card combo. Any ten-year-old port/cable should be fine.

The differences only become apparent with higher-end monitors, like the 4K, 240 Hz, ultra-mega-super-duper-wide ones (something like this: 49” Odyssey Neo G95NA DQHD LED 144Hz 1ms(GtG) Curved Gaming Monitor - LS49AG952NNXZA | Samsung US). In those cases, you either need a very recent version of HDMI plus nice cables that support the full bandwidth, or it’s generally safer to just use DisplayPort.

But that’s more a matter of some older versions of HDMI and some inferior cables not having enough bandwidth to support that combination of resolution and refresh rate.

But in your specific case, 1920x1080 @ 100 Hz will work fine with either. There will be no difference in image quality.
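
If you want to sanity-check that with rough numbers, here’s a quick back-of-the-envelope sketch. The blanking overhead and effective link rates are my own approximations, not figures from the spec sheets:

```python
# Rough check: does 1920x1080 @ 100 Hz fit inside DP 1.2 and HDMI 1.4?
# Blanking overhead (~25%) and effective link rates are approximate assumptions.

def required_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_overhead=1.25):
    """Approximate video bandwidth needed, including blanking overhead."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

need = required_gbps(1920, 1080, 100)

# Effective (post-encoding) data rates, roughly:
#   DP 1.2  : 4 lanes x 5.4 Gbps with 8b/10b encoding -> ~17.28 Gbps
#   HDMI 1.4: 340 MHz TMDS x 3 channels x 8 bits      -> ~8.16 Gbps
links = {"DisplayPort 1.2": 17.28, "HDMI 1.4": 8.16}

for name, capacity in links.items():
    verdict = "fits with headroom" if need < capacity else "too much"
    print(f"{name}: need ~{need:.1f} Gbps, have ~{capacity:.2f} Gbps -> {verdict}")
```

Roughly 6 Gbps needed against ~17 Gbps for DP 1.2 and ~8 Gbps for HDMI 1.4, so both have plenty of headroom at 1080p / 100 Hz.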


The connector ports and cables just determine the maximum refresh rate and resolution you can use (and whether you can send audio over them, which you shouldn’t do with computer monitors anyway because they sound like crap). But if it’s within spec, it’s a digital signal, so “better” cables or connectors won’t matter and won’t affect the image quality.

How it looks on your monitor is more about that monitor’s LCD panel type (IPS vs VA and others); the different types have various tradeoffs between color gamut (what they can display), accuracy, and visual artifacts. OLED monitors will give you very good vibrancy, with some other downsides.

It’s all too much to get into here, but you can look into those if you really care. Probably you don’t if you’re hunting for budget monitors anyway :slight_smile:

It also kinda depends on how well calibrated for color and gamma your monitor is. If you’re not sure, find a pretty game screenshot that you like, try to remember how it looks, and then go to an Apple store and pull that same screenshot up on one of their Macbook Pros. Those have really nice panels with good color gamut and accuracy and are calibrated ahead of time, so you can see how the image is “supposed” to look when everything is working. Then you can decide if it’s worth the price difference (between a good and bad monitor, not a Mac). For many people it’s not even noticeable, like the hordes of poorly-calibrated HDTVs you’ll find at every Costco, Best Buy, Target, etc.
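
Not something anyone mentioned in the thread, but if you want a quick at-home way to eyeball gamma and banding, a smooth grayscale gradient works well. A minimal sketch, assuming Python with Pillow installed:

```python
# Generate a smooth left-to-right grayscale gradient test image.
# A well-calibrated panel shows a smooth ramp; a poor one shows visible
# steps (banding) or crushed shadows/highlights at the ends.
from PIL import Image

width, height = 1024, 256
img = Image.new("RGB", (width, height))
pixels = img.load()
for x in range(width):
    g = round(x * 255 / (width - 1))   # 0 (black) .. 255 (white)
    for y in range(height):
        pixels[x, y] = (g, g, g)
img.save("gray_gradient.png")          # open it full-screen to judge
```

View the result full-screen with any image viewer; the comparison trick above (same image on a calibrated MacBook panel) works even better with a test pattern like this.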

Beyond that, if you want good gaming graphics but only on occasion, I suggest using GeForce Now instead. That gives you an RTX 4080 in the cloud, with full support for ray-tracing and DLSS frame generation. It will look way way way better in modern games than a 2080. They’re probably going to upgrade to 5080s soon too, once those come out.

They have both monthly plans and day passes available, and for $20/mo it’s a no-brainer vs having to upgrade your expensive graphics card every few years.

What kind of games?

What kind of games, what…? Do you mean which games GeForce Now supports?

Here’s a list: GeForce NOW - Play Your Games Anywhere

The monthly fee just gets you the cloud 4080 you play on. You bring your own games from Steam, Epic, Microsoft Game Pass, Battle.net, or a few other platforms; Nvidia doesn’t sell them directly.

That would depend on one’s gaming choices and one’s network configuration. There are plenty of practical combinations of those for which cloud rendering would make the experience worse.

At only 1080p @ 100 Hz, an RTX 2080 Ti is more than enough to render any game at excellent quality (unless the machine is overly CPU-limited for OP’s games of interest or something).

@Loggins, I’d say boot it up and enjoy. You’re doing fine with that graphics setup.

OK, fair enough. For fast enough networks, though, the experience is quite incredible, even for twitch shooters. It’s come a long way since Stadia and the like.

This probably isn’t true anymore for any game that uses ray-tracing/path-tracing or just has very high detail. Even older titles like Cyberpunk 2077 struggle to hit 60 fps at 1080p, much less 100 fps (if they want to match the monitor refresh rate): https://www.youtube.com/watch?v=dQCrO9wYXL8

But *shrug*, maybe it’s no big deal. Personally I use GFN for supported games, but there are also games that I play on my Mac at very low settings. They seem fine to my old eyes. Everything is so nice-looking these days that even very low settings are so much better than what we had in the old days.

I put a 3060 video card in my PC (a Dell) and found that the HDMI sometimes didn’t register with my 4K monitor. The DP consistently works. This is probably due to the video card failing to detect the monitor. Everything is about a year or two old.

However, this equipment is nothing like yours. The only question is whether the monitor comes alive when you start the computer. I would suggest DP is more reliable; it is for me. But this is the only situation where I saw this happen, and I attribute it to a too-fancy video card.

I’ve had similar experiences too. If the HDMI works, it works, but with a lot of setups, it’s super flaky. DP is usually more reliable for me, and I will choose it whenever I can.

DisplayPort is intended for computers to connect to monitors, while HDMI was originally for TVs to connect to TV peripherals and was later adapted for monitors, PCs, and so on. I haven’t had issues with just computers and monitors myself, but connecting a Raspberry Pi to a device via HDMI is a huge troubleshooting PITA, so in that sense I prefer DP.
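
For what it’s worth, if that flaky-detection problem shows up on a Linux box (like a Raspberry Pi), you can at least check whether the GPU thinks anything is plugged in. A small sketch using the standard /sys/class/drm layout; connector names vary per system:

```python
# Print the detection state of every DRM connector (HDMI, DP, etc.).
# Each connector's "status" file reads "connected", "disconnected", or "unknown".
from pathlib import Path

for status_file in sorted(Path("/sys/class/drm").glob("card*-*/status")):
    connector = status_file.parent.name      # e.g. "card0-HDMI-A-1" or "card0-DP-1"
    state = status_file.read_text().strip()  # "connected" / "disconnected" / "unknown"
    print(f"{connector}: {state}")
```

On Windows the equivalent check is just whether the monitor shows up in Display settings or the NVIDIA Control Panel.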