Sounds like the HDMI Forum are a bunch of twats. Time for a new format.
DisplayPort already exists
We cannot have two standards, that’s ridiculous! We need to develop one universal standard that covers everyone’s use cases.
There are now three competing standards.
https://xkcd.com/927/
I know what you are referencing, but DisplayPort already covers everybody’s use cases
#switchtodisplayport
Oh? Let me CEC on that…
I’ll just pull it up on this display that’s more than 9 feet away from the source…
Hi, my name is USB-C!
And what does that use? That’s right it’s Displayport Alternate Mode! Oh you’ve got Thunderbolt? Guess what, also Displayport!
Chuck Testa!
Yes, I agree. And it needs to be open bloody source!!
Hard to find on non-PC gear, but that’s a fair point
It’s usually easy enough to adapt it as needed. It can typically send signals compatible with HDMI and DVI-D just fine.
The passive adapters that connect to DP++ ports probably still rely on this HDMI specific driver/firmware support for these features.
can’t you just mod it?
And also USB-C
USB-C display output uses the DisplayPort protocol
Can it use others, and is there a benefit? USB-C makes a lot of sense: lower material usage, small, carries data and power, and connects to almost everything now.
I believe USB-C is the only connector supported for carrying DisplayPort signals other than DisplayPort itself.
The biggest issue with USB-C for display, in my opinion, is that cable specs vary so much. A cable with a Type-C end could carry anywhere from 60 to 10,000 MB/s and deliver anywhere from 5 to 240 W. What’s worse is that most aren’t labeled, so even if you know what spec you need, you’re going to have a hell of a time finding it in a pile of identical black cables.
Not that I dislike USB-C. It’s a great connector, but the branding of USB has always been a mess.
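For a sense of how wide that range really is, here’s a rough sketch mapping nominal per-generation USB data rates to MB/s (these are headline spec figures; real cables are frequently certified for less):

```python
# Nominal USB data rates by generation; actual cables are often rated lower.
USB_DATA_RATES_MBIT = {
    "USB 2.0 (Hi-Speed)": 480,      # ~60 MB/s
    "USB 3.2 Gen 1":      5_000,
    "USB 3.2 Gen 2":      10_000,
    "USB 3.2 Gen 2x2":    20_000,
    "USB4":               40_000,
    "USB4 v2":            80_000,   # ~10,000 MB/s
}

for name, mbit in USB_DATA_RATES_MBIT.items():
    print(f"{name}: {mbit / 8:,.0f} MB/s")
```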
would be neat to somehow have a standard color coding. kinda like how USB 3 is (usually) blue, maybe there could be thin bands of color on the connector?
better yet, maybe some raised bumps so visually impaired people could feel what type it was. for example one dot is USB 2, two could be USB 3, etc
Have you looked at the naming of the USB standards? No you haven’t, otherwise you wouldn’t make this sensible suggestion.
the shenanigans with USB 3 naming, you mean? you’re right, this would be too logical for USB lol
Please think of the shareholders… :(
I think that the biggest issue with DP over USB-C is that people are going to try to use the same cable for 4K and large data transfers at the same time, and will then whine about weird behaviour.
4K works for mine (it’s USB 3.2).
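The “weird behaviour” has a concrete cause: DP Alt Mode splits the four high-speed lane pairs in a USB-C cable between DisplayPort and USB data. A simplified sketch of the two common configurations (what you actually get depends on what the host, cable, and display negotiate):

```python
# Simplified view of DP Alt Mode lane allocation over USB-C: four
# high-speed lane pairs are shared between DisplayPort and USB data.
configs = [
    ("4 lanes DP + USB 2.0 only",  "full video bandwidth", "480 Mbit/s"),
    ("2 lanes DP + 2 lanes USB 3", "half video bandwidth", "5-20 Gbit/s"),
]
for mode, video, data in configs:
    print(f"{mode}: {video}, data capped at {data}")
```

So a display that grabs all four lanes for 4K leaves your “large data transfer” crawling over USB 2.0, which is exactly the surprise people will complain about.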
Yep, very true. I didn’t understand this until I couldn’t connect my Mac to my screen via the USB-C cable that came with the computer; I had to buy another (and order it in specially). Pick up a cable, and I have no idea which version it is.
Don’t forget the limited length. I can’t remember exactly, but USB-C delivering power has a max length of around 4 metres.
This is the big issue I have with “USB-C”. USB-C is just the connector, which can be used for so many things. What’s actually supported depends on things you can’t see, like the cable construction or what the device supports.
Yeah, I have multiple USB cables, some at 30 W and some at 140 W. Get them mixed up all the time! More companies need to at least brand the wattage on the connectors.
There’s some really high-bandwidth stuff that USB-C isn’t rated for. You have to really press the limits, though. Something like 4K + 240 Hz + HDR.
That doesn’t even seem so unreasonable. Is that the limit, though? My cable puts a gigabyte a second down it, so I wouldn’t imagine that would hit the limit.
USB-C with Thunderbolt currently has a limit of 40 Gbit/s. Wikipedia has a table of what DisplayPort can do at that bandwidth:
https://en.wikipedia.org/wiki/DisplayPort
See the section “Resolution and refresh frequency limits”. The table there shows it’d be able to do 4K/144 Hz/10 bpp just fine, but can’t keep above 60 Hz for 8K.
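A back-of-the-envelope check of those two entries (a sketch counting raw pixel bits only, ignoring blanking intervals and link-encoding overhead, and reading “10bpp” as 10 bits per colour channel):

```python
def raw_gbit_per_s(width, height, hz, bits_per_channel, channels=3):
    """Raw uncompressed pixel rate in Gbit/s (no blanking or encoding overhead)."""
    return width * height * hz * bits_per_channel * channels / 1e9

LINK_GBIT = 40  # the Thunderbolt / USB4 figure mentioned above

for label, mode in [("4K 144 Hz, 10 bpc", (3840, 2160, 144, 10)),
                    ("8K 60 Hz, 10 bpc",  (7680, 4320, 60, 10))]:
    need = raw_gbit_per_s(*mode)
    verdict = "fits" if need <= LINK_GBIT else "exceeds"
    print(f"{label}: {need:.1f} Gbit/s ({verdict} a {LINK_GBIT} Gbit/s link)")
```

The real table shifts a bit because DisplayPort spends part of the link on encoding and can lean on DSC, but the shape matches: roughly 36 Gbit/s for 4K/144 fits, while 8K blows past 40 Gbit/s even at 60 Hz without compression.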
It’s an uncompressed video signal, and that takes a lot of bandwidth. Though there is a “visually lossless” compression mode (DSC).
It is trivial arithmetic: 4.5 × 240 × 3840 × 2160 ≈ 9 GB/s. Not even close. Even worse, that cable will struggle to get ordinary 60 Hz 4K delivered.
4.5 × 240 × 3840 × 2160
It seems markdown formatting ruined your numbers because of the asterisks. Whatever is written between two of those turns italic, so they’re not ideal for multiplication symbols here on Lemmy (or any other place that implements markdown formatting).
I think the maths got a bit funky there. I don’t think a cable capable of such speeds would struggle to do 60 Hz at 4K; it surely doesn’t need close to a gigabyte a second?
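Redoing that arithmetic (taking the 4.5 bytes per pixel from the comment above as given) says it really does:

```python
BYTES_PER_PIXEL = 4.5        # figure from the comment above
WIDTH, HEIGHT = 3840, 2160   # 4K

for hz in (60, 240):
    gbytes = BYTES_PER_PIXEL * WIDTH * HEIGHT * hz / 1e9
    print(f"4K @ {hz:>3} Hz: {gbytes:.2f} GB/s uncompressed")
# 4K @  60 Hz: 2.24 GB/s -> already more than a gigabyte a second
# 4K @ 240 Hz: 8.96 GB/s -> the ~9 GB/s quoted above
```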
USB-C is just a connector; you might be referring to DisplayPort over USB-C, which is basically just the same standard with a different connector at the end. That or Thunderbolt, I guess.
I thought Thunderbolt was DP passthrough as well
USB-C seems like a good idea, but in reality all it really did was take my 5 different, not interchangeable, but visually distinct cables and make them all look identical and require labeling.
I love having mysterious cables that may or may not do things I expect them to when plugged into ports that may or may not support the features I think they do.
If the implementation is so broad that I have to break out my label maker, can we even really call it a “standard”?
you mean Thunderbolt?
We are all aware of that. However, there are tons of studios people have built that use HDMI TVs as part of the setup. Those professionals will continue to be unable to use Linux professionally. That’s a huge issue to still have in 2024 with one of the major GFX options. The Linux desktop relies on more than some enthusiasts if we want to see it progress.
If a user only has an HDMI TV and they are considering using a SteamOS/AMD console in the future, they will not be able to use the full capability of their TV. Telling them to buy a new TV is not going to increase adoption.
Corporations will not touch Linux devices with HDMI problems.
Linux has very little to do with DisplayPort. My Windows PCs use DisplayPort. You can get passive adapters to switch from HDMI to DisplayPort etc.
What? I’m not sure what you’re on about. Of course DP is not a Linux specific technology. Not sure what that has to do with my comment specifically.
I’m talking about people who would like to use the full capabilities of their HDMI TVs ( while using AMD), when using Linux.
My understanding is the adapters do not provide all the features of the HDMI 2.1 spec. Is that no longer the case?
The problem is those passive adapters only work because one side switches to the other’s protocol.
What exactly doesn’t work over HDMI?
Are you serious? You’re commenting on an article discussing this very problem. ???
Personally, for me it was VRR and high refresh rate at 4K+. I have since purchased an NVIDIA card to get around it. At the time, the “adapters” were not providing what I’d consider a reasonable solution, essentially crippling the features of my high-end television.
More people should try DP.
I thought I had NSFW turned off… 🤣
( ͡° ͜ʖ ͡°)
What do Dill Pickles have to do with being work safe?
When you’re trying to get into DPs, the outside can be slippery and the screw part can be tight! Very dangerous for the workplace.
As already mentioned, DisplayPort exists. The problem is adoption. Even getting DisplayPort adopted as the de facto standard for PC monitors hasn’t done anything to get it built into TVs.
also there’s still no alternative to HDMI-CEC
DisplayPort supports CEC.
From Wikipedia:

> The DisplayPort AUX channel is a half-duplex (bidirectional) data channel used for miscellaneous additional data beyond video and audio, such as EDID (I2C) or CEC commands.
huh didn’t know
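As a concrete taste of what that AUX channel carries: a display’s EDID is a 128-byte blob the source reads over I2C. Here’s a minimal sketch of decoding the manufacturer ID from one (the sample bytes are illustrative, not from a real capture):

```python
def edid_manufacturer(edid: bytes) -> str:
    """Decode the 3-letter PNP manufacturer ID from bytes 8-9 of an EDID block."""
    if edid[:8] != b"\x00\xff\xff\xff\xff\xff\xff\x00":
        raise ValueError("not a valid EDID header")
    word = (edid[8] << 8) | edid[9]              # big-endian; top bit is reserved
    letters = [(word >> s) & 0x1F for s in (10, 5, 0)]
    return "".join(chr(ord("A") + n - 1) for n in letters)

# Illustrative 128-byte blob: valid header, manufacturer word 0x10AC ("DEL"),
# everything else zeroed.
sample = b"\x00\xff\xff\xff\xff\xff\xff\x00" + b"\x10\xac" + bytes(118)
print(edid_manufacturer(sample))  # -> DEL
```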