So Elon gutted Twitter, and people jumped ship to Mastodon. Now spez did… you know… and we’re on Lemmy and Kbin. Can we have a YouTube-to-PeerTube exodus next? With the whole ad-pocalypse over there, seems like Google is itching for it.

  • Barry Zuckerkorn@beehaw.org · 1 year ago

    Hosting and bandwidth for videos have a big cost.
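
    A quick back-of-envelope sketch of that cost (every number here is an assumption for illustration, not YouTube’s real figures):

    ```python
    # Back-of-envelope egress cost for one popular video.
    # All numbers are illustrative assumptions, not YouTube's actual figures.
    BITRATE_MBPS = 5          # assumed average 1080p bitrate, in Mbit/s
    LENGTH_S = 10 * 60        # a 10-minute video
    VIEWS = 1_000_000         # assumed view count
    EGRESS_USD_PER_GB = 0.01  # assumed bulk bandwidth price

    bytes_per_view = BITRATE_MBPS * 1e6 / 8 * LENGTH_S  # ~375 MB per full view
    total_gb = bytes_per_view * VIEWS / 1e9             # ~375,000 GB served
    print(f"~{total_gb:,.0f} GB served, ~${total_gb * EGRESS_USD_PER_GB:,.0f} in bandwidth")
    ```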

    Plus it’s computationally expensive. YouTube has entire data centers filled with servers using custom silicon to encode ingested videos into nearly every resolution/framerate and codec they serve, so that different clients get the most efficient option for their quality settings and supported codecs, no matter what the original uploader happened to upload. Granted, that workflow mainly makes sense because of bandwidth costs, but the high quality of the user experience depends on that backend.
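
    To make that concrete, here’s a minimal sketch of such an encoding ladder driven by ffmpeg, with a made-up set of rungs (YouTube’s real ladder, codecs, and tooling are far more elaborate, and run on that custom silicon rather than stock ffmpeg):

    ```python
    import subprocess

    # Hypothetical encoding ladder: (height, video bitrate) pairs.
    # The rungs and bitrates here are assumptions for illustration.
    LADDER = [(2160, "18M"), (1440, "10M"), (1080, "5M"),
              (720, "2.5M"), (480, "1M"), (360, "600k")]

    def encode_ladder(src: str) -> None:
        """Encode one source file into every rung of the ladder with H.264."""
        for height, bitrate in LADDER:
            subprocess.run([
                "ffmpeg", "-y", "-i", src,
                "-vf", f"scale=-2:{height}",   # scale to target height, keep aspect ratio
                "-c:v", "libx264", "-b:v", bitrate,
                "-c:a", "aac", "-b:a", "128k",
                f"out_{height}p.mp4",
            ], check=True)

    encode_ladder("upload.mp4")
    ```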

    • Art [he/him] 🌈@beehaw.org · 1 year ago

      Fascinating! I guess I should have known, because I use yt-dlp a lot and see that each video has multiple formats available, but multimedia streaming is not my forte, so I never put much thought into the technicalities.
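
      Incidentally, that per-video format list is easy to pull programmatically with yt-dlp’s Python API (the same data as `yt-dlp -F`; the URL is a placeholder):

      ```python
      from yt_dlp import YoutubeDL

      # Equivalent to `yt-dlp -F <url>`: list every format served for a video.
      with YoutubeDL({"quiet": True}) as ydl:
          info = ydl.extract_info("https://www.youtube.com/watch?v=...", download=False)
          for f in info["formats"]:
              print(f.get("format_id"), f.get("ext"), f.get("height"),
                    f.get("vcodec"), f.get("acodec"))
      ```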

      Pardon the ignorant follow-up question: it doesn’t encode on the fly, right? It encodes once when the source is uploaded, then keeps a copy of each format?

      • Barry Zuckerkorn@beehaw.org · 1 year ago

        As I understand it, YouTube ingests an uploaded video and automatically encodes it at a bunch of different quality settings in H.264. Then, if the video is popular enough to justify the computational cost of encoding into VP9 and AV1 (something like 1,000 views), it does that too. And yes, once encoded, they just keep the copies so it doesn’t have to be done again.
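
        In toy code, the gating logic I’m describing would look something like this (the threshold, codec list, and function names are all my guesses, not documented YouTube behavior):

        ```python
        # Toy model of popularity-gated re-encoding, as described above.
        # The 1000-view threshold and codec choices are assumptions.
        POPULARITY_THRESHOLD = 1000

        def queue_encode(video_id: str, codec: str) -> None:
            print(f"queued {video_id} for {codec} encode")  # stand-in for a real job queue

        def on_upload(video_id: str) -> None:
            queue_encode(video_id, codec="h264")  # always: cheap, universally supported

        def on_view_count_changed(video_id: str, views: int, already_upgraded: bool) -> None:
            if views >= POPULARITY_THRESHOLD and not already_upgraded:
                # Popular enough that bandwidth savings repay the extra compute.
                queue_encode(video_id, codec="vp9")
                queue_encode(video_id, codec="av1")

        on_upload("vid123")
        on_view_count_changed("vid123", views=1500, already_upgraded=False)
        ```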

        Here’s a 2-year-old blog post where YouTube describes some of the technical challenges.

        As that blog post explains, when you’re running a service that ingests 500 hours of user-submitted video every minute, you have to handle the task differently than, for example, Netflix does: Netflix serves far more views per minute of content but has a comparatively tiny catalog to encode, so bandwidth efficiency matters far more to them than encoding efficiency.
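
        The arithmetic on that ingest rate shows why. Assuming one software encoder instance runs at exactly real time (and my made-up renditions-per-video count from above), just keeping up looks like this:

        ```python
        # How much compute does 500 hours of uploads per minute imply?
        UPLOAD_HOURS_PER_MINUTE = 500
        RUNGS = 6  # assumed renditions per video (see the ladder sketch above)

        # Hours of video arriving every hour:
        ingest_hours_per_hour = UPLOAD_HOURS_PER_MINUTE * 60   # 30,000

        # With one encoder instance per rung running at exactly real time,
        # this many must run around the clock just to keep pace:
        encoders_needed = ingest_hours_per_hour * RUNGS        # 180,000
        print(f"{ingest_hours_per_hour:,} hours ingested per hour -> "
              f"~{encoders_needed:,} real-time encoders")
        ```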

        • Art [he/him] 🌈@beehaw.org · 1 year ago

          Thanks a lot for the explanation and the link.

          I asked about that because I thought if it’s a one-time thing, then the uploader could be the one using their own computer to encode. Maybe they’d be given a desktop app or a set of configurations for, say, ffmpeg. It would certainly be less attractive than just uploading the file and letting the platform handle it.
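
          For illustration, such a handed-out configuration could be as simple as one blessed ffmpeg invocation run on the uploader’s machine (an invented preset, just to show the idea):

          ```python
          import subprocess

          # Hypothetical "upload preset" a platform might hand to creators:
          # one VP9 + Opus encode done client-side before uploading.
          subprocess.run([
              "ffmpeg", "-y", "-i", "master.mov",
              "-c:v", "libvpx-vp9", "-b:v", "2M",
              "-c:a", "libopus", "-b:a", "128k",
              "upload_ready.webm",
          ], check=True)
          ```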

          • Barry Zuckerkorn@beehaw.org · 1 year ago

            Yeah, I think it’s doable to distribute that compute burden if every channel owner has a desktop CPU lying around to encode a bunch of video formats, but lots of people create directly on their phones, and I don’t think a phone CPU/GPU could process a significant amount of video without heat and battery issues.