this post was submitted on 29 Feb 2024
390 points (99.2% liked)

Linux


For three years there has been a bug report around 4K@120Hz being unavailable via HDMI 2.1 on the AMD Linux driver.

The wait continues...

top 50 comments
[–] [email protected] 226 points 9 months ago (3 children)

This really bothers me. Closed standards locked behind a licensing fee may as well not be standards at all, in my opinion.

[–] [email protected] 69 points 9 months ago (7 children)

I don't understand why any hardware uses HDMI anymore anyway. What does it have that DisplayPort doesn't?

[–] [email protected] 68 points 9 months ago* (last edited 9 months ago) (2 children)

The HDMI Forum was founded by companies that own the home theatre environment (mainly movie companies and television makers), who put DRM on HDMI to make it harder to illegally copy content like movies, so they will always want to be anti open source because that's what the streaming services/movie businesses demand. It's why, for example, mobile devices have Widevine levels: those levels basically determine how "unlocked" the device is, and services will refuse to offer full functionality to unlocked devices because of it, be it audio or video.

Members of VESA, which controls the DisplayPort standard, are generally computer companies that are mostly not in the media business, so they value specs over DRM when making changes; one example use case is that DisplayPort allows for daisy-chaining displays.

[–] [email protected] 34 points 9 months ago (2 children)

The DRM is so stupid - now in the era of streaming you can get literally anything web-ripped on day one.

DRM is obsolete (and it never really worked tbh).

[–] [email protected] 38 points 9 months ago

DRM is not to stop pirates, but to show investors and licence holders you are trying to stop pirates.

[–] [email protected] 13 points 9 months ago (3 children)

It's the attempt that matters more to investors than the pirates. It's why a shit ton of games have Denuvo, even if the version of Denuvo they use has already been cracked. It's not there for the end user; it's there for the investors, to show they are at least attempting to fight off piracy.

[–] [email protected] 5 points 9 months ago (1 children)

Denuvo is actually very effective relatively speaking. Several popular games that use it have never been cracked. They haven't made it impossible, just sufficiently difficult and tedious that no one wants to bother.

[–] [email protected] 5 points 9 months ago (1 children)

Some aren't cracked because there's like only one person actually doing it, and said person won't crack anime games because she hates anime.

[–] [email protected] 2 points 9 months ago

Yes, I'm well aware. Those are the symptoms. I just explained the cause.

[–] [email protected] 13 points 9 months ago (2 children)

I don't know a single person who has ever used HDMI to steal copyrighted content. Seriously? Who would rip a 2 hr movie by watching it vs the 10 min it takes to rip a movie digitally?

Like shit ya got CAM, WebRIP, BRRIP and SCENE. I doubt HDMI was used in any of these scenarios.

[–] [email protected] 9 points 9 months ago

Technically speaking, every gamer who uses a capture card to bypass the PlayStation's built-in recording block (some games explicitly disable recording while a cutscene is active) is an example.

[–] [email protected] 3 points 9 months ago

@n3m37h @Dudewitbow HDMI consortium decides to f around and find out if people really care re: displayport vs hdmi

[–] [email protected] 51 points 9 months ago (2 children)

Decades of being the standard in A/V. That's like asking, why don't we get rid of gas stations and just install electric chargers? Well, everybody's got gas-powered cars.

[–] [email protected] 19 points 9 months ago (3 children)

AV things, sure, since they stick around longer, but computers? When was the last time you saw a high-end GPU with VGA or DVI? And they already usually have mostly DisplayPort with just one or two HDMI ports.

[–] [email protected] 22 points 9 months ago* (last edited 9 months ago) (1 children)

Well, I wasn't referring to that ecosystem. That ecosystem is already on display port. The reason HDMI is so prevalent is because it's the standard in audio-visual equipment. Why would I talk about computer equipment when it's not the standard there?

The point still stands. Everybody has equipment that has HDMI, and to phase out that standard in equipment going forward is phasing out equipment people already own.

[–] [email protected] 7 points 9 months ago

Computers are AV things.

[–] [email protected] 9 points 9 months ago (1 children)

HDMI only had about four good years to itself before DisplayPort showed up. In contrast, the RCA port stuck around for damn near 100 years.

[–] [email protected] 4 points 9 months ago

We also didn't have digital signals till DVI in 1999, HDMI in 2002, and DisplayPort in 2006.

[–] [email protected] 22 points 9 months ago

Probably a lot more hardware using HDMI than DisplayPort? Just throwing a guess, tbh.

That being said, I might consider looking towards DisplayPort when I can get a new monitor...

[–] [email protected] 9 points 9 months ago (1 children)

CEC (technically I think DisplayPort could support it, but it generally isn't implemented) and Ethernet up to 100 Mbps.
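
For anyone curious what CEC control actually looks like on Linux, here's a minimal sketch that shells out to cec-client from libcec (packaged as cec-utils on most distros). It assumes you have a CEC-capable HDMI adapter or port; the helper function is just illustrative, not part of any library.

```python
# Minimal sketch: drive HDMI-CEC by piping commands to cec-client.
# Assumes libcec / cec-utils is installed and a CEC-capable adapter exists.
import subprocess

def cec(command: str) -> str:
    # "-s" runs a single command read from stdin; "-d 1" keeps log output minimal.
    result = subprocess.run(
        ["cec-client", "-s", "-d", "1"],
        input=command,
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout

print(cec("on 0"))       # power on the TV (CEC logical address 0)
print(cec("standby 0"))  # put the TV into standby
```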

[–] [email protected] 15 points 9 months ago (1 children)

Almost nothing uses ethernet over HDMI to my knowledge.

[–] [email protected] 7 points 9 months ago (2 children)

This is the first time I heard of Ethernet over HDMI and I can't tell if you're joking.

[–] [email protected] 3 points 9 months ago (2 children)

I think they mean HDMI over Ethernet, which is a real thing, but not something I've ever seen in real life.

[–] [email protected] 13 points 9 months ago

No. Network over HDMI.
Nobody implements it, but it's part of the standard.

[–] [email protected] 7 points 9 months ago (1 children)
[–] [email protected] 8 points 9 months ago

Thanks, I just threw up in my mouth.

[–] [email protected] 7 points 9 months ago (3 children)

Feature-wise probably next to nothing, and it's usually behind one or two generations in terms of bandwidth. HDMI is often the only port available on TVs though, so GPU makers likely can't afford to just leave it out.

[–] Grass 8 points 9 months ago* (last edited 9 months ago)

They should anyway. New TVs are all smart these days, and the dumb ones are built to the specs of two decades ago. At this point we are better off with a PC monitor and separate speakers. Built-in speakers are shit, seemingly as a requirement. I use a video port switch for extra inputs without needing to use the on-screen menus or running out of built-in ports.

[–] [email protected] 6 points 9 months ago (1 children)

eARC and 12 Gbps more bandwidth (4K@185Hz vs 4K@120Hz).

Otherwise the same

[–] [email protected] 3 points 9 months ago

Your info is outdated. DP 2.0 is 80 Gbps and can do 4K@240Hz without Display Stream Compression. It can do up to 16K@60Hz using DSC.
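
A rough back-of-the-envelope check of those numbers (a sketch in Python; it ignores blanking intervals and audio, so real link requirements are a bit higher):

```python
# Raw pixel data rate, ignoring blanking intervals, audio, and protocol framing.
def pixel_rate_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    return width * height * hz * bits_per_pixel / 1e9

# 4K @ 240 Hz, 10-bit RGB (30 bits per pixel)
uncompressed_4k240 = pixel_rate_gbps(3840, 2160, 240, 30)  # ~59.7 Gbps

# DP 2.0 UHBR20: 80 Gbps raw; 128b/132b encoding leaves ~77.6 Gbps of payload,
# so uncompressed 4K@240 fits with headroom, while 16K@60 needs DSC.
dp20_payload = 80.0 * 128 / 132
print(f"{uncompressed_4k240:.1f} Gbps needed vs {dp20_payload:.1f} Gbps available")
```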

[–] [email protected] 100 points 9 months ago (1 children)

Alright AMD, just remove HDMI from your graphics cards and be done with it 🤷 . Fuck the HDMI forum.

[–] [email protected] 38 points 9 months ago (2 children)

As much as I want them to give HDMI the middle finger I don't think they have enough leverage in the GPU market to pull such a bold move off.

[–] [email protected] 12 points 9 months ago

They could just include a DP-to-HDMI adapter in the box and have no HDMI ports on the GPU, maybe?

[–] [email protected] 8 points 9 months ago (1 children)

They’re currently what, 15% of the market? Nvidia would happily swoop in and pick up some more market share.

[–] [email protected] 72 points 9 months ago (1 children)

Alright, DisplayPort, here we come.

[–] [email protected] 22 points 9 months ago (1 children)

I've been on the DP bandwagon since using my GTX 660Ti

[–] [email protected] 10 points 9 months ago (2 children)

I don’t think I’ve ever used HDMI by choice. It’s always been VGA > DVI > DisplayPort. The only times I use HDMI are consoles or stupid monitors with only 1 DP and a bunch of HDMI.

[–] [email protected] 7 points 9 months ago

These guys can go fuck themselves

[–] [email protected] 5 points 9 months ago* (last edited 9 months ago) (2 children)

So I see people on the Phoronix forums complaining that this is a bad thing because they have TVs which are HDMI-only. From what I read, the HDMI 2.1+ spec is only needed to support extreme cases like 4K@120Hz and above. So my question is: how many people have a TV old enough to have no DisplayPort but still be of that outrageous specification?

Edit: it seems I was mistaken in thinking that new TVs have DisplayPort.
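
To put some numbers behind the claim above that 4K@120Hz is where HDMI 2.1 becomes necessary, here's a rough sketch (it ignores blanking intervals and audio, so real requirements are a bit higher):

```python
# Raw pixel data rate, ignoring blanking intervals, audio, and protocol framing.
def pixel_rate_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    return width * height * hz * bits_per_pixel / 1e9

needed = pixel_rate_gbps(3840, 2160, 120, 24)  # 4K@120, 8-bit RGB -> ~23.9 Gbps

hdmi20_payload = 18.0 * 8 / 10    # HDMI 2.0: 18 Gbps TMDS, 8b/10b -> 14.4 Gbps
hdmi21_payload = 48.0 * 16 / 18   # HDMI 2.1: 48 Gbps FRL, 16b/18b -> ~42.7 Gbps

# 4K@120 RGB doesn't fit in HDMI 2.0 without chroma subsampling or DSC,
# which is exactly the case that needs the contested HDMI 2.1 support.
print(f"need ~{needed:.1f} Gbps; HDMI 2.0 ~{hdmi20_payload:.1f}, HDMI 2.1 ~{hdmi21_payload:.1f}")
```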

[–] [email protected] 4 points 9 months ago

So my question is: how many people have a TV old enough to have no DisplayPort but still be of that outrageous specification?

As far as I know, no consumer TV has DisplayPort.

I bought a TV maybe 2-3 years ago that supports 4K@120 and it doesn't have a display port, only HDMI.

[–] [email protected] 4 points 9 months ago

I'm using a recent 42" LG OLED TV as a large, affordable PC monitor in order to get 4K@120Hz with 10-bit HDR, which is great for gaming or content creation that can appreciate the screen real estate. Anything in the proper PC monitor market that's similarly sized, or even slightly smaller, costs way more for the same screen area and feature parity.

Unfortunately such TVs rarely include anything other than HDMI for digital video input, despite the growing trend of connecting gaming PCs in the living room, sometimes over fiber optic HDMI cables. I actually went with a GPU with more than one HDMI output so I could display to both TVs in the house simultaneously.

Also, having an API as well as a remote to control my monitor is kind of nice. Enough folks are using LG TVs as monitors in this midsize range that there are even open source projects to entirely mimic conventional display behaviors:

I also kind of like using the TV as a simple KVM with fewer cables. For example with audio, I can independently control volume and mux output to either speakers or multiple Bluetooth devices from the TV, without having to fiddle around with re-pairing Bluetooth peripherals to each PC or gaming console. That's particularly nice when swapping from playing games on the PC to watching movies on a Chromecast with a friend over two pairs of headphones, while still keeping the house quiet for the family. That kind of KVM functionality and connectivity is still kind of a premium feature for modestly priced PC monitors. Of course others find their own use cases for hacking the TV remote APIs:
