this post was submitted on 29 Feb 2024
390 points (99.2% liked)

For three years there has been a bug report around 4K@120Hz being unavailable via HDMI 2.1 on the AMD Linux driver.

The wait continues...

[–] [email protected] 226 points 9 months ago (3 children)

This really bothers me. Closed standards locked behind a licensing fee may as well not be standards at all, in my opinion.

[–] [email protected] 69 points 9 months ago (7 children)

I don't understand why any hardware uses HDMI anymore anyway. What does it have that DisplayPort doesn't?

[–] [email protected] 68 points 9 months ago* (last edited 9 months ago) (2 children)

The HDMI Forum is made up of the companies that own the home theatre environment (mainly movie companies and TV makers), and they put DRM on HDMI to make it harder to illegally copy content like movies, so they will always lean anti open source because that's what the streaming services and movie businesses demand. It's why, for example, mobile devices have Widevine levels: those levels basically determine how "unlocked" the device is, and services will refuse to offer full functionality to unlocked devices because of it, be it audio or video (see the sketch below).

Members of VESA, who control the DisplayPort standard, are generally computer companies that are mostly not in the media business, so they value specs over DRM when making changes. One example use case: DisplayPort allows daisy-chaining displays.
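To illustrate the Widevine point: on Android, a streaming app can ask the DRM stack what security level the device currently provides and gate stream quality on the answer. A minimal sketch, assuming the standard android.media.MediaDrm API; the "securityLevel" property name is a Widevine vendor property rather than something guaranteed by the Android docs, and the UUID is Widevine's published system ID:

```kotlin
import android.media.MediaDrm
import java.util.UUID

// Widevine's well-known DRM system ID (copied from public documentation).
val WIDEVINE_UUID: UUID = UUID.fromString("edef8ba9-79d6-4ace-a3c8-27dcd51d21ed")

// Returns the device's current Widevine security level, e.g. "L1" or "L3".
// Unlocked/rooted devices typically report L3, and services then refuse
// to serve them HD/4K streams.
fun widevineSecurityLevel(): String {
    val drm = MediaDrm(WIDEVINE_UUID)
    return try {
        drm.getPropertyString("securityLevel")
    } finally {
        drm.release() // close() on newer API levels
    }
}
```

That per-device gating is the same dynamic the HDMI side gets from HDCP: the spec holder decides which sinks are trusted, and everything else gets a degraded signal or none at all.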

[–] [email protected] 34 points 9 months ago (2 children)

The DRM is so stupid - now, in the era of streaming, you can get literally anything web-ripped on day 1.

DRM is obsolete (and honestly, it always has been).

[–] [email protected] 38 points 9 months ago

DRM is not there to stop pirates, but to show investors and licence holders that you are trying to stop pirates.

[–] [email protected] 13 points 9 months ago (2 children)

It's the attempt that matters more to investors than the pirates. It's why a shit ton of games have Denuvo, even if the version of Denuvo they used has already been cracked. It's not there for the end user; it's there for the investors, to show they are at least attempting to fight off piracy.

[–] [email protected] 5 points 9 months ago (1 children)

Denuvo is actually very effective relatively speaking. Several popular games that use it have never been cracked. They haven't made it impossible, just sufficiently difficult and tedious that no one wants to bother.

[–] [email protected] 5 points 9 months ago (1 children)

Some aren't cracked because there's basically only one person actually doing it, and said person won't crack anime games because she hates anime.

[–] [email protected] 2 points 9 months ago

Yes, I'm well aware. Those are the symptoms. I just explained the cause.

[–] [email protected] 1 points 9 months ago (1 children)

Isn't DRM in games working, though? If Denuvo is only being cracked by one person, that sounds like a win for the corporations to me.

[–] [email protected] 1 points 9 months ago

It's working in the sense that it slows things down. However, the way Denuvo works is that there are generations of it that get cracked, so once one gets cracked in a generation, there's a handful that will be cracked with it. If a company is using an older generation of Denuvo, you may well see day 1 cracks, which ultimately means the company paid Denuvo for nothing. But the point is, Denuvo wasn't meant to stop piracy first and foremost; it was meant to appease investors who require Denuvo to be implemented.

[–] [email protected] 13 points 9 months ago (2 children)

I don't know a single person who has ever used HDMI to steal copyrighted content. Seriously, who would rip a 2-hour movie by capturing it in real time versus the 10 minutes it takes to rip it digitally?

Like, shit, you've got CAM, WebRip, BRRip and scene releases. I doubt HDMI was used in any of those scenarios.

[–] [email protected] 9 points 9 months ago

Technically speaking, every gamer who uses a capture card to get around PlayStation games that explicitly disable the built-in recording while a cutscene is active is an example.

[–] [email protected] 3 points 9 months ago

@n3m37h @Dudewitbow HDMI consortium decides to f around and find out if people really care re: displayport vs hdmi

[–] [email protected] 51 points 9 months ago (2 children)

Decades of being the standard in A/V. That's like asking, why don't we get rid of gas stations and just install electric chargers? Well, everybody's got gas-powered cars.

[–] [email protected] 19 points 9 months ago (3 children)

A/V gear, sure, since it sticks around longer, but computers? When was the last time you saw a high-end GPU with VGA or DVI? And they already mostly have DisplayPort, with just one or two HDMI ports.

[–] [email protected] 22 points 9 months ago* (last edited 9 months ago) (1 children)

Well, I wasn't referring to that ecosystem. That ecosystem is already on display port. The reason HDMI is so prevalent is because it's the standard in audio-visual equipment. Why would I talk about computer equipment when it's not the standard there?

The point still stands. Everybody has equipment that has HDMI, and to phase out that standard in equipment going forward is phasing out equipment people already own.

[–] [email protected] 1 points 9 months ago* (last edited 9 months ago)

and to phase out that standard in equipment going forward is phasing out equipment people already own.

And where's the problem in that? My parents still use an almost 20-year-old plasma TV. But they're getting old too.

[–] [email protected] 7 points 9 months ago

Computers are AV things.

[–] [email protected] 1 points 9 months ago

Today. Every time I go downstairs.

[–] [email protected] 9 points 9 months ago (1 children)

HDMI only had about four good years to itself before DisplayPort showed up. In contrast, the RCA port stuck around for damn near 100 years.

[–] [email protected] 4 points 9 months ago

We also didn't have digital signals until DVI in 1999, HDMI in 2002 and DisplayPort in 2006.

[–] [email protected] 22 points 9 months ago

Probably a lot more hardware using HDMI than DisplayPort? Just throwing a guess, tbh.

That being said, I might consider looking towards DisplayPort when I can get a new monitor...

[–] [email protected] 9 points 9 months ago (1 children)

CEC (technically I think DisplayPort could support it, but it generally isn't implemented), and Ethernet at up to 100 Mbps.

[–] [email protected] 15 points 9 months ago (1 children)

Almost nothing uses ethernet over HDMI to my knowledge.

[–] [email protected] 7 points 9 months ago (2 children)

This is the first time I heard of Ethernet over HDMI and I can't tell if you're joking.

[–] [email protected] 3 points 9 months ago (2 children)

I think they mean HDMI over Ethernet, which is a real thing, but not something I've ever seen in real life.

[–] [email protected] 13 points 9 months ago

No. Network over HDMI.
Nobody implements it, but it's part of the standard.

[–] [email protected] 7 points 9 months ago (1 children)
[–] [email protected] 8 points 9 months ago

Thanks, I just threw up in my mouth.

[–] [email protected] 7 points 9 months ago (3 children)

Feature-wise probably next to nothing, and it's usually behind one or two generations in terms of bandwidth. HDMI is often the only port available on TVs though, so GPU makers likely can't afford to just leave it out.

[–] Grass 8 points 9 months ago* (last edited 9 months ago)

They should anyway. New TVs are all smart these days, and the dumb ones are built like it's two decades ago. At this point we're better off with a PC monitor and separate speakers; built-in speakers are shit, seemingly as a requirement. I use a video port switch for extra inputs so I don't have to dig through on-screen menus or run out of built-in ports.

[–] [email protected] 2 points 9 months ago* (last edited 9 months ago)

Yep. Very common.

A lot of people use their PC like a console or media server, i.e. they use it to watch/play stuff from their bed or couch.

[–] [email protected] 1 points 9 months ago

Why not? If you need it get a converter.

[–] [email protected] 6 points 9 months ago (1 children)

eARC and 12 Gbps more bandwidth (4K@185Hz vs 4K@120Hz)

Otherwise the same

[–] [email protected] 3 points 9 months ago

Your info is outdated. DP 2.0 is 80 Gbps and can do 4K@240Hz without Display Stream Compression, and up to 16K@60Hz using DSC.
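Rough numbers behind that claim, as a back-of-the-envelope sketch (the ~7% blanking overhead and 10-bit RGB are assumptions; exact figures depend on the timing formula used):

```kotlin
// Does 4K@240Hz 10-bit RGB fit in DP 2.0 without Display Stream Compression?
fun requiredGbps(
    hActive: Int,
    vActive: Int,
    refreshHz: Int,
    bitsPerPixel: Int,
    blankingOverhead: Double = 1.07 // assumed reduced-blanking overhead
): Double {
    val activePixelsPerSecond = hActive.toDouble() * vActive * refreshHz
    return activePixelsPerSecond * blankingOverhead * bitsPerPixel / 1e9
}

fun main() {
    val need = requiredGbps(3840, 2160, 240, 30) // 10-bit RGB = 30 bits per pixel
    val dp20 = 80.0 * 128 / 132                  // UHBR20: 80 Gbps raw, 128b/132b encoding
    val hdmi21 = 48.0 * 16 / 18                  // FRL: 48 Gbps raw, 16b/18b encoding
    println("4K@240Hz 10-bit needs ~%.0f Gbps; DP 2.0 carries ~%.0f Gbps, HDMI 2.1 ~%.0f Gbps"
        .format(need, dp20, hdmi21))
    // ~64 Gbps needed: fits in DP 2.0 uncompressed, doesn't fit in HDMI 2.1
}
```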

[–] [email protected] 0 points 9 months ago

Can hook up to TVs…

[–] [email protected] 2 points 9 months ago

Based on the upvotes, it's not only your opinion. 👍