You guys notice how they’re pushing these HDMI 2.1 cables on Xbox One S owners with 4K TVs? Or even on people with 8K TVs that can’t even do 120 Hz.

It’s kind of crazy how they start marketing this s*** years and years in advance, so you end up buying a product that does nothing for you unless you can afford a crazy good gaming TV or monitor, or you’re a PC player who actually knows what you’re doing. You end up buying all this extra useless s*** without realizing it gives you no real upgrade in quality, and by the time it actually would, 2.6 is out.

My buddy just bought an 8K TV, and his next-door neighbor told him, “Oh bro, now you’ve got to go get that HDMI 2.1 cable.”

Your TV is not a gaming TV. Yes, it’s 8K compatible, but you can get that with a 1.4, which you already own three of. And you’re certainly not going to pull 120 frames, and if you do, you’ll never pull 160. Not to mention your Xbox wouldn’t push 120 frames even if you manually forced 120 through it. Don’t really know where I was going with that. Point is: let alone 160.

I tried explaining the concept of bottlenecking to him probably 75 times, but he’s only ever owned one PC in his entire life, and I’m pretty sure it was a Best Buy prebuilt.
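
If it helps anyone else explain it, the whole bottleneck idea fits in a few lines of toy Python (the numbers and the function name are made up, obviously):

```python
# Toy illustration of bottlenecking: the frames you actually see are capped
# by the weakest link in the chain, so upgrading only the cable buys nothing.
def effective_fps(console_fps: int, cable_max_fps: int, display_hz: int) -> int:
    return min(console_fps, cable_max_fps, display_hz)

# Hypothetical setup: a console pushing 60 fps into a panel that tops out at 60 Hz.
print(effective_fps(console_fps=60, cable_max_fps=120, display_hz=60))  # 60
# A fancier cable changes nothing while the console is the bottleneck:
print(effective_fps(console_fps=60, cable_max_fps=600, display_hz=60))  # 60
```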

Capitalism is beautiful.

  • Jure Repinc@lemmy.ml · 2 months ago

    Oh, how I wish those TV manufacturers would get rid of HDMI and replace it with DisplayPort. The HDMI mafia does not allow open-source implementations of the HDMI specification, so not all of its latest features can be supported by graphics card drivers on GNU/Linux. Death to HDMI!

    • Petter1@lemm.ee · 2 months ago

      I didn’t even know that, and I’ve always preferred DisplayPort anyway. So far I’ve only ever had wonky video over HDMI, never over DP.

    • ZILtoid1991@lemmy.world · 2 months ago

      From what I’ve heard, manufacturers are afraid of getting sued when users jam HDMI plugs into DP sockets (and vice versa) and break them.

  • Vinny_93@lemmy.world · 2 months ago

    First off: cables don’t have version numbers. The host and the client have ports that adhere to a certain spec, and the HDMI Forum made that very unclear by folding 2.0b into 2.1, so now not every 2.1 port supports the same things. Cables are classified by their maximum bandwidth instead: High Speed, Premium High Speed, Ultra High Speed, and the “with Ethernet” variants. When marketers call something a “2.1 cable”, that just means it can carry some or all of the 2.1 spec.

    Second: the only reason to get new HDMI cables, like you said, is if you currently have a very old one and have devices that actually make use of the bandwidth. And I’ll tell you right now, most High Speed cables will do just fine. It’s when you start doing 8k120 with HDR and VRR and eARC that you’ll need heftier cables. The only external devices that support that, though, either come supplied with cables (their makers don’t want you bottlenecking the device) or are PCs.
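
    To put rough numbers on “actually make use of the bandwidth”, here’s a back-of-the-envelope sketch (my own approximation, not the spec’s math: it ignores blanking intervals and link-encoding overhead, so real requirements run somewhat higher):

    ```python
    # Rough estimate of the wire rate a video mode needs, vs. nominal cable tiers.
    CABLE_CLASSES = {
        "High Speed": 10.2,          # HDMI 1.3/1.4 era
        "Premium High Speed": 18.0,  # HDMI 2.0 era
        "Ultra High Speed": 48.0,    # HDMI 2.1 era
    }

    def required_gbps(width, height, hz, bits_per_component=8, overhead=1.25):
        """Raw RGB pixel rate, padded by `overhead` as a crude stand-in
        for blanking and encoding costs."""
        bits = width * height * hz * bits_per_component * 3  # 3 color components
        return bits * overhead / 1e9

    for w, h, hz in [(3840, 2160, 60), (3840, 2160, 120), (7680, 4320, 60)]:
        need = required_gbps(w, h, hz)
        ok = [name for name, cap in CABLE_CLASSES.items() if cap >= need]
        fits = ", ".join(ok) if ok else "none uncompressed (needs DSC or chroma subsampling)"
        print(f"{w}x{h}@{hz}: ~{need:.1f} Gbit/s -> {fits}")
    ```

    Run that and you’ll see plain 4k60 already wants a Premium High Speed cable, while uncompressed 8k60 fits in nothing at all, which is exactly why the 8k120-with-everything case is where cables genuinely start to matter.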

    Third: the only reason HDMI is even a thing is that the joint venture behind it successfully lobbied TV manufacturers into adopting its inferior product. DisplayPort has always been, and will always be, the better interface for video.

    • osaerisxero@kbin.melroy.org · 2 months ago

      First off: cables don’t have version numbers.

      Yes, and this is unironically a problem. I’m frankly happy to see this push, just so I no longer have to find out that the video issue I’ve been troubleshooting for the last two hours was caused by a cable that’s marked the same as every other cable but happens to have half the bandwidth of some arbitrary other one.

      Fuck HDMI. All my homies hate HDMI.

    • WolfLink@sh.itjust.works · 2 months ago

      I have a 4K 120 Hz gaming monitor, and some of my HDMI cables can’t handle that mode.

      I also just use DisplayPort because it’s better anyway (e.g. lower latency).

    • Ptsf@lemmy.world · 2 months ago

      While I almost completely agree with you, never underestimate the power of using the right tool for the right job. In my experience HDMI is actually far more resilient to signal corruption than DisplayPort, since it implements TMDS, and HDMI cables are more commonly well shielded because manufacturers expect them to be used in device-dense environments. That doesn’t matter much to anyone familiar with technology (don’t bundle your cables next to something with significant RF noise/leakage, duh), but it does matter for the typical end-user setups these cables see. The fees HDMI charges are a scam though, for real, and we could demand better from the industry.
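
      (For the curious, the “transition-minimized” part of TMDS is easy to sketch. This is just stage 1 of the encoding, my own toy version, with the DC-balancing stage that produces the 10th bit left out:)

      ```python
      # TMDS stage 1: 8 data bits -> 9 bits chosen so the serial stream
      # toggles as rarely as possible, which is what helps against noise/EMI.
      def tmds_stage1(byte):
          d = [(byte >> i) & 1 for i in range(8)]  # LSB first
          ones = sum(d)
          # Ones-heavy bytes take the XNOR chain, the rest take the XOR chain.
          use_xnor = ones > 4 or (ones == 4 and d[0] == 0)
          q = [d[0]]
          for i in range(1, 8):
              bit = q[i - 1] ^ d[i]
              q.append(1 - bit if use_xnor else bit)
          q.append(0 if use_xnor else 1)  # 9th bit records which chain was used
          return q

      def transitions(bits):
          return sum(a != b for a, b in zip(bits, bits[1:]))

      raw = [(0b10101010 >> i) & 1 for i in range(8)]
      print(transitions(raw), "->", transitions(tmds_stage1(0b10101010)))  # 7 -> 4
      ```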

      • Vinny_93@lemmy.world · 1 month ago

        Mostly you’re unable to make use of certain features. Say your display supports 4K @ 120 Hz: with an inadequate cable you might still get 4K30 or 4K60, but not 4K120.

  • Nikls94@lemmy.world · 2 months ago

    Upvote because it made me angry.

    I’ve gotten to the point where at least some people come to me for advice before buying electronics.

    A girl friend asked me for a TV to go with her PlayStation: not too expensive, 4K/60 with low input delay for casual gaming, something that would last at least 10 years, and cheap. Long story short, I got her a 4K/60 TV with a gaming mode that has about 2-3 ms of delay for €550. It’s a dumb Philips TV running Linux, so no Google Play, and you can remove all the spyware. It has apps, but she got the PS to do all of that anyway.

    • gazter@aussie.zone · 2 months ago

      Huh, that’s interesting. I would have thought that a TV running Linux would be called ‘smart’.

      I’m with you though, it’s better to be more ‘modular’ and have your playback device (be it a PlayStation, a media server, heck, even a television receiver) separate from the display itself.

      • MrScottyTay@sh.itjust.works · 2 months ago

        Yeah, I think that TV is still a smart TV, just not an Android-based smart TV (or it might even still be Android, since that’s also very Linux-like, especially once you remove the Google services).

  • Che Banana@beehaw.org · 2 months ago

    I am blissfully unaware of the differences, and since I’m playing my Steam Deck on the TV, the one HDMI cable I rummaged up from our pile of obsolete cables is doing the job.