• Tja@programming.dev · 10 months ago

        We cannot have two standards, that’s ridiculous! We need to develop one universal standard that covers everyone’s use cases.

        • frezik@midwest.social · 10 months ago

          It’s usually easy enough to adapt it as needed. It can typically send signals compatible with HDMI and DVI-D just fine.

          • zarenki@lemmy.ml · 10 months ago

            The passive adapters that connect to DP++ ports probably still rely on this HDMI specific driver/firmware support for these features.

          • ABCDE@lemmy.world · 10 months ago

            Can it use others, and is there a benefit? USB-C makes a lot of sense: lower material usage, small, carries data and power, and connects to almost everything now.

            • BetaDoggo_@lemmy.world · 10 months ago

              I believe USB-C is the only connector, other than DisplayPort itself, that supports carrying DisplayPort signals.

              The biggest issue with USB-C for display, in my opinion, is that cable specs vary so much. A cable with a Type-C end could carry anywhere from 60 MB/s to 10,000 MB/s and deliver anywhere from 5 W to 240 W. What’s worse is that most aren’t labeled, so even if you know which spec you need, you’re going to have a hell of a time finding it in a pile of identical black cables.

              Not that I dislike USB-C. It’s a great connector, but the branding of USB has always been a mess.
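The spread the comment describes can be made concrete. A quick sketch, using the nominal signaling rates of each USB generation as I understand them (marketing names for these tiers have shifted over the years, so treat the labels as approximate):

```python
# Nominal data rates a cable with a USB-C connector might actually support,
# depending on which spec the cable was built to.
USB_C_DATA_RATES_GBPS = {
    "USB 2.0": 0.48,
    "USB 3.2 Gen 1": 5,
    "USB 3.2 Gen 2": 10,
    "USB 3.2 Gen 2x2": 20,
    "USB4": 40,
    "USB4 v2": 80,
}

# Power delivery range: 5 W basic up to 240 W (USB PD Extended Power Range)
MIN_POWER_W, MAX_POWER_W = 5, 240

slowest = min(USB_C_DATA_RATES_GBPS.values())
fastest = max(USB_C_DATA_RATES_GBPS.values())
print(f"Data rate spread: {fastest / slowest:.0f}x "
      f"({slowest} Gbps to {fastest} Gbps)")
print(f"Power spread: {MAX_POWER_W // MIN_POWER_W}x "
      f"({MIN_POWER_W} W to {MAX_POWER_W} W)")
```

Two visually identical cables can thus differ by more than two orders of magnitude in throughput, which is exactly the labeling problem described above.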

              • strawberry@kbin.run · 10 months ago

                would be neat to somehow have a standard color coding, kinda like how USB 3 is (usually) blue. maybe there could be thin bands of color on the connector?

                better yet, maybe some raised bumps so visually impaired people could feel what type it was. for example one dot is USB 2, two could be USB 3, etc

                • Flipper@feddit.de · 10 months ago

                  Have you looked at the naming of the USB standards? No, you haven’t, otherwise you wouldn’t be making this sensible suggestion.

  • Justin@lemmy.jlh.name · 10 months ago

    This is really frustrating. This is the only thing holding Linux gaming back for me, as someone who games with an AMD GPU and an OLED TV. On Windows, 4K120 works fine, but on Linux I can only get 4K60. I’ve been trying to use an adapter, but it crashes a lot.

    AMD seemed to be really trying to bring this feature to Linux, too. Really tragic that they were trying to support us, and some anti-open source goons shot them down.

  • Blackmist@feddit.uk · 10 months ago

    So why is it rejected?

    Just because they’re still trying to use HDMI to prevent piracy? Who in fuck’s name is using HDMI capture for piracy? Uncompressed 4K at 24fps is roughly 600MB of data to process every second just for the video; a 2 hour movie would be over 4TB before you even add the audio.

    I’ve got a Jellyfin server packed with 4K Blu-ray rips that suggest there are easier ways to get at that data.
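The arithmetic behind that claim, sketched out (assuming 8-bit RGB, which is the typical uncompressed format on an HDMI link):

```python
# Rough size of uncompressed 4K video as it travels over an HDMI link
WIDTH, HEIGHT = 3840, 2160        # UHD resolution
BYTES_PER_PIXEL = 3               # 8-bit RGB, no chroma subsampling
FPS = 24                          # typical film frame rate

bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
movie_bytes = bytes_per_second * 2 * 3600    # 2-hour movie

print(f"{bytes_per_second / 1e6:.0f} MB/s")       # ~597 MB/s
print(f"{movie_bytes / 1e12:.1f} TB per movie")   # ~4.3 TB
```

Compare that with a retail 4K Blu-ray, which compresses the same movie into well under 100 GB, and real-time HDMI capture looks like the hardest possible way to pirate anything.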

    • dangblingus@lemmy.dbzer0.com · 10 months ago

      Most people don’t pirate 4K media due to file size and internet speed constraints. Most people pirate 1080p video. There’s also the prospect of people pirating live television, which HDMI capture would be perfect for.

      • Psythik@lemmy.world · 10 months ago

        Then most people need to get a better ISP. My crappy $60/mo fixed 5G can download an entire 4K film in under 10 minutes, or start streaming it within a second. Y’all should see if there are any options beyond cable and DSL in your town. You might be pleasantly surprised by what’s available these days.

        • nymwit@lemm.ee · 10 months ago

          Is that not a compressed stream, though? Genuinely asking. A 4K Blu-ray rip and a 4K stream from a service (or whatever it saves for offline viewing in an app) are pretty different. I think things are getting conflated: capturing live 4K television versus capturing a 4K Blu-ray as it plays, both of which might be going over an HDMI cable.

          • Psythik@lemmy.world · 10 months ago

            I use Stremio and only stream full 4K Blu-ray rips, with HDR and Dolby Atmos and all, so nothing is recompressed. They’re 50-70GB files, but they start streaming almost instantly.

            I have a poor 5G signal due to a tree that’s blocking my view of the antenna, so I get anywhere between 400Mbps and 1400Mbps (I’m supposed to get a gigabit, but it’s usually closer to 500). Even with a poor signal it’s still way faster than any other ISP in my town.
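Those figures are internally consistent. A back-of-the-envelope check, using a hypothetical 60 GB file as a representative mid-range rip size:

```python
# How long a full Blu-ray-sized rip takes at the link speeds quoted above
FILE_GB = 60                          # mid-range of a 50-70 GB rip

for mbps in (400, 800, 1400):         # observed link speeds
    seconds = FILE_GB * 8000 / mbps   # GB -> megabits, then divide by rate
    print(f"{mbps:>5} Mbps: {seconds / 60:.1f} minutes")
# ->   400 Mbps: 20.0 minutes
# ->   800 Mbps: 10.0 minutes
# ->  1400 Mbps: 5.7 minutes
```

So even at the low end of the reported range, a full-quality rip arrives in about 20 minutes, and the "starts almost instantly" claim only needs the link to outrun the film's average bitrate (roughly 60-80 Mbps for a 50-70 GB two-hour movie).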

    • sarmale@lemmy.zip · 10 months ago

      Can’t you compress what HDMI outputs in real time, so that it ends up a normal size?

      • Blackmist@feddit.uk · 10 months ago

        Sure. But why bother when you can rip it right from the disc in higher quality than you could ever hope to capture in real time?

  • NoLifeGaming@lemmy.world · 10 months ago

    Always thought DisplayPort was better anyways lol. Is there anything that HDMI does or has that DisplayPort doesn’t?

    • Solrac@lemmy.world · 10 months ago

      Audio? Forgive my ignorance; I’m asking out of a genuine lack of knowledge.

      • MeanEYE@lemmy.world · 10 months ago

        Naah, DisplayPort carries everything: audio, USB, video, etc. Version 1.2 even allows daisy-chaining displays, so you don’t have to have a bundle of cables going to your PC. As for audio, version 1.4 supports a maximum sample rate of 1536 kHz at 24 bits, with 32 individual audio channels. Scary good! Overall it’s a significantly better protocol.
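For a sense of scale, here is the raw audio payload those figures imply. This is an upper bound, assuming the quoted 1536 kHz rate applied to every one of the 32 channels at once (the spec may intend it as an aggregate figure across channels):

```python
# Upper bound on the audio payload implied by the DisplayPort 1.4 figures
SAMPLE_RATE_HZ = 1_536_000   # quoted 1536 kHz maximum sample rate
BIT_DEPTH = 24               # bits per sample
CHANNELS = 32                # maximum channel count

audio_bits_per_second = SAMPLE_RATE_HZ * BIT_DEPTH * CHANNELS
print(f"{audio_bits_per_second / 1e9:.2f} Gbps of raw audio")  # ~1.18 Gbps
```

Even that worst-case figure is a small fraction of the tens of Gbps a modern DisplayPort link carries, which is why the protocol has headroom for audio, USB, and multiple displays at once.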

  • csolisr@hub.azkware.net · 10 months ago

    If we had to rely exclusively on non-proprietary protocols, I doubt that GNU/Linux would have gone anywhere beyond the Commodore 64.

    • Riskable@programming.dev · 10 months ago

      Linux never ran on the Commodore 64 (1982). That was way before Linux was released by Linus Torvalds (1991).

      I’d also like to point out that we do all rely on non-proprietary protocols. Examples you used today: TCP and HTTP.

      If we didn’t have free and open source protocols we’d all still be using Prodigy and AOL. “Smart” devices couldn’t talk to each other, and the world of software would be 100-10,000x more expensive and we’d probably have about 1/1,000,000th of what we have available today.

      Every little thing we rely on every day from computers to the Internet to cars to planes only works because they’re not relying on exclusive, proprietary protocols. Weird shit like HDMI is the exception, not the rule.

      History demonstrates that proprietary protocols and connectors like HDMI only stick around as long as they’re convenient, easy, and cheap. As soon as they lose one of those properties a competitor will spring up and eventually it will replace the proprietary nonsense. It’s only a matter of time. This news about HDMI being rejected is just another shove, moving the world away from that protocol.

      There actually is a way for proprietary bullshit to persist even when it’s the worst: When it’s mandated by government.