• Tja@programming.dev · 10 months ago

      We cannot have two standards, that’s ridiculous! We need to develop one universal standard that covers everyone’s use cases.

        • ABCDE@lemmy.world · 10 months ago

          Can it use others, and is there a benefit? USB-C makes a lot of sense: lower material usage, small, carries data and power, and connects to almost everything now.

          • BetaDoggo_@lemmy.world · 10 months ago

            I believe USB-C is the only connector supported for carrying DisplayPort signals other than DisplayPort itself.

            The biggest issue with USB-C for display, in my opinion, is that cable specs vary so much. A cable with a Type-C end could carry anywhere from 60 to 10,000 MB/s and deliver anywhere from 5 to 240 W. What’s worse is that most aren’t labeled, so even if you know what spec you need, you’re going to have a hell of a time finding it in a pile of identical black cables.

            Not that I dislike USB-C. It’s a great connector, but the branding of USB has always been a mess.

            • strawberry@kbin.run · 10 months ago

              would be neat to somehow have a standard color coding. kinda like how USB 3 is (usually) blue, maybe there could be thin bands of color on the connector?

              better yet, maybe some raised bumps so visually impaired people could feel what type it was. for example, one dot is USB 2, two could be USB 3, etc.

              • Flipper@feddit.de · 10 months ago

                Have you looked at the naming of the USB standards? No, you haven’t, otherwise you wouldn’t make this sensible suggestion.

                • strawberry@kbin.run · 10 months ago

                  the shenanigans with USB 3 naming, you mean? you’re right, this would be too logical for USB lol

            • jaxxed@lemmy.world · 10 months ago

              I think the biggest issue with DP over USB-C is that people are going to try to use the same cable for 4K video and large data transfers at the same time, and will then whine about weird behaviour.

            • ABCDE@lemmy.world · 10 months ago

              Yep, very true. I didn’t understand this until I couldn’t connect my Mac to my screen with the USB-C cable that came with the computer; I had to buy another (and order it in specially). Pick up a cable, and I have no idea which version it is.

            • Mr_Dr_Oink@lemmy.world · 10 months ago

              Don’t forget the limited length. I can’t remember exactly, but USB-C delivering power has a max length of around 4 metres.

            • Freestylesno@lemmy.world · 10 months ago

              This is the big issue I have with “USB-C”. USB-C is just the connector, which can be used for so many things. What’s actually supported depends on things you can’t see, like the cable construction or what the device supports.

            • cum@lemmy.cafe · 10 months ago

              Yeah, I have multiple USB cables, some rated at 30 W and some at 140 W. Get them mixed up all the time! More companies need to at least mark the wattage on the connectors.

          • frezik@midwest.social · 10 months ago

            There’s some really high-bandwidth stuff that USB-C isn’t rated for. You have to really push the limits, though. Something like 4K + 240 Hz + HDR.

            • ABCDE@lemmy.world · 10 months ago

              That doesn’t even seem so unreasonable. Is that the limit, though? My cable puts a gigabyte a second down it, so I wouldn’t imagine that would hit the limit.

              • frezik@midwest.social · 10 months ago

                USB-C with Thunderbolt currently has a limit of 40 Gbit/s. Wikipedia has a table of what DisplayPort can do at that bandwidth:

                https://en.wikipedia.org/wiki/DisplayPort

                See the section “Resolution and refresh frequency limits”. The table there shows it’d be able to do 4K / 144 Hz / 10-bit just fine, but can’t keep above 60 Hz at 8K.

                It’s an uncompressed video signal, and that takes a lot of bandwidth, though there is a “visually lossless” compression mode (DSC).

              • GeniusIsme@lemmy.world · 10 months ago

                It is trivial arithmetic: 4.5 × 240 × 3840 × 2160 ≈ 9 GB/s. Not even close. Even worse, that cable will struggle to get ordinary 60 Hz 4K delivered.

                • pirat@lemmy.world · 10 months ago

                  4.5 × 240 × 3840 × 2160

                  It seems markdown formatting ruined your numbers because of the asterisks. Whatever is written between two of those turns italic, so they’re not ideal for multiplication symbols here on Lemmy (or any other place that implements markdown formatting).

                • ABCDE@lemmy.world · 10 months ago

                  I think the maths got a bit funky there. I don’t think a cable capable of such speeds would struggle to do 60 Hz at 4K; surely it doesn’t need close to a gigabyte a second?
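A rough back-of-the-envelope sketch of the arithmetic in the sub-thread above, assuming raw pixel data only (no blanking intervals, link encoding or DSC); 30 bpp corresponds to 10-bit RGB, 36 bpp to the 4.5 bytes per pixel used earlier, and 40 Gbit/s is the Thunderbolt limit mentioned above:

```python
# Uncompressed video bandwidth vs. a 40 Gbit/s USB-C/Thunderbolt link.
# Raw pixel rate only; real links lose some capacity to blanking and encoding.

def raw_gbit_per_s(width: int, height: int, refresh_hz: int, bits_per_pixel: int) -> float:
    """Uncompressed video data rate in Gbit/s."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

LINK_GBIT = 40  # Thunderbolt / USB4 limit cited above

for hz, bpp in [(60, 30), (144, 30), (240, 30), (240, 36)]:
    need = raw_gbit_per_s(3840, 2160, hz, bpp)
    verdict = "fits" if need <= LINK_GBIT else "exceeds"
    print(f"4K @ {hz:3d} Hz, {bpp} bpp: {need:5.1f} Gbit/s -> {verdict} {LINK_GBIT} Gbit/s")
```

That comes out to roughly 14.9, 35.8, 59.7 and 71.7 Gbit/s respectively, so 4K at 60 or 144 Hz with 10-bit colour fits under a 40 Gbit/s link while 4K at 240 Hz does not, which matches the Wikipedia table cited above; the ≈9 GB/s figure earlier is the 36 bpp row expressed in bytes.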

      • Player2@lemm.ee · 10 months ago

        USB-C is just a connector; you might be referring to DisplayPort over USB-C, which is basically just the same standard with a different connector on the end. That, or Thunderbolt, I guess.

      • trafficnab@lemmy.ca · 10 months ago

        USB-C seems like a good idea, but in reality all it really did was take my 5 different, non-interchangeable but visually distinct cables and make them all look identical and require labeling.

      • admiralteal@kbin.social · 10 months ago

        I love having mysterious cables that may or may not do things I expect them to when plugged into ports that may or may not support the features I think they do.

        • trafficnab@lemmy.ca · 10 months ago

          If the implementation is so broad that I have to break out my label maker, can we even really call it a “standard”?

    • zelifcam@lemmy.world · 10 months ago

      We are all aware of that. However, there are tons of studios that people have built around HDMI TVs as part of their setup. Those professionals will continue to be unable to use Linux professionally. That’s a huge issue to still have in 2024 with one of the major graphics options. The Linux desktop relies on more than a few enthusiasts if we want to see it progress.

      If a user only has an HDMI TV and they are considering a SteamOS-style AMD console in the future, they will not be able to use the full capability of their TV. Telling them to buy a new TV is not going to increase adoption.

      Corporations will not touch Linux devices with HDMI problems.

      • Eldritch@lemmy.world · 10 months ago

        Linux has very little to do with DisplayPort. My Windows PCs use DisplayPort. You can get passive adapters to go from DisplayPort to HDMI and the like.

        • zelifcam@lemmy.world · 10 months ago

          Linux has very little to do with DisplayPort. My Windows PCs use DisplayPort.

          What? I’m not sure what you’re on about. Of course DP is not a Linux-specific technology. Not sure what that has to do with my comment specifically.

          I’m talking about people who would like to use the full capabilities of their HDMI TVs (while using AMD) when using Linux.

          My understanding is the adapters do not provide all the features of the HDMI 2.1 spec. Is that no longer the case?

          • Malfeasant@lemmy.world · 10 months ago

            The problem is that those passive adapters only work because one side switches to the other’s protocol (a dual-mode DP++ source, for instance, outputs an HDMI signal over the DisplayPort connector).

        • zelifcam@lemmy.world · 10 months ago

          Are you serious? You’re commenting on an article discussing this very problem. ???

          Personally, for me it was VRR and a high refresh rate at 4K and above. I have since purchased an NVIDIA card to get around it. At the time, the “adapters” were not providing what I’d consider a reasonable solution, essentially crippling the features of my high-end television.