• AItoothbrush@lemmy.zip · 7 hours ago

    That is not how standards work. Standards are standards because they are standard. Also, just use USB-C; it does literally everything you need.

    Edit: the article says there's a USB-C-compatible version; no idea how that'll work, but we'll see. Another thing: the article says HDMI doesn't carry power, but there are active devices that don't need external power. How do those work, then?

    Edit 2: yep, I was right. HDMI can carry 0.25 to 1.5 W depending on the version.

  • dankm@lemmy.ca · 10 hours ago

    I’ll be all over it if it refuses to allow support for anything even remotely similar to HDCP.

  • ramble81@lemm.ee · 1 day ago

    DP is royalty-free. Why are they looking for “yet another standard”?

  • Ulrich@feddit.org · 23 hours ago

    “China”? Like, the government is making video cables now?

  • Jesus_666@lemmy.world · 1 day ago

    Now the big question is how many patents are relevant and who owns them. And even if it turns out to have cheap licensing, beating HDMI won’t be easy, as DisplayPort demonstrates. Technological superiority doesn’t mean shit if you can’t overcome HDMI’s network effect.

    • thatKamGuy@sh.itjust.works · 1 day ago

      First rule of tech disruption is to ignore patents/laws and get a big enough footprint to be able to fend off the eventual litigation. Given this is China, they can mandate implementing it on mainland devices for the initial wave, and roll out later revisions into adjacent regions over time once it’s taken a foothold.

  • Nik282000@lemmy.ca · 1 day ago

    Why? 4K is already at the limit of what your eyes can resolve unless you have an enormous screen.

    • monotremata@lemmy.ca · 22 hours ago

      One use is VR, where the field of view is huge. The industry size-and-distance recommendations have a TV take up about 30° of your field of view, which works out to 128 pixels per degree for a 4K screen. For a headset with a 100° field of view (most are a little higher than that at this point, or at least claim to be, but it’s a good baseline), you’d be looking at a 12K resolution to get the same level of clarity. But, of course, you’d need to run it at a very high framerate to avoid simulator sickness, whereas 4K often gets away with just 30 fps. Delivering power over the same cable also means just one cable.

      Currently there are no GPUs that can drive that high a resolution and framerate. But the cable was one limiting factor there, made especially frustrating by Nvidia sticking to DisplayPort 1.4 for so long.
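      The arithmetic above can be sanity-checked with a quick sketch (assuming “4K” means a 3840-pixel-wide panel, and using the 30° TV and 100° headset fields of view from the comment):

```python
# Angular resolution: horizontal pixels divided by horizontal field of view.
def pixels_per_degree(h_pixels: int, fov_degrees: float) -> float:
    return h_pixels / fov_degrees

tv_ppd = pixels_per_degree(3840, 30)   # 4K-wide screen spanning 30 degrees
vr_width = tv_ppd * 100                # same clarity spread over 100 degrees

print(tv_ppd)    # 128.0 px/deg
print(vr_width)  # 12800.0 px -- i.e. roughly a "12K" horizontal resolution
```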

    • Jyek@sh.itjust.works · 1 day ago

      People made the same argument about the jump from 2K to 4K, and about frame rates higher than 60 fps. But the bigger counter is that even if that were true, additional features take up additional bandwidth, and the real power of these new connector standards is that extra bandwidth. Things like 4K HDR at very high refresh rates, plus features like 3D and VRR, can share signal room on the same cable, enabling displays that do more at once than they could before. There are also high-pixel-density displays and projectors that can make use of 8K and 16K footage. It’s niche for now, but the technology advancing is not a bad thing.
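      To put rough numbers on the bandwidth point, here is an uncompressed-bandwidth estimate (illustrative only: plain RGB, ignoring blanking intervals and link-encoding overhead, which real cables must also carry):

```python
# Uncompressed RGB video bandwidth in Gbit/s:
# width * height * 3 channels * bits-per-channel * frames-per-second.
def video_gbps(width: int, height: int, bits_per_channel: int, fps: int) -> float:
    return width * height * 3 * bits_per_channel * fps / 1e9

print(f"4K 10-bit @ 120 Hz: {video_gbps(3840, 2160, 10, 120):.1f} Gbit/s")  # ~29.9
print(f"8K 10-bit @  60 Hz: {video_gbps(7680, 4320, 10, 60):.1f} Gbit/s")   # ~59.7
```

      Even at the same 60 Hz, each step up in resolution quadruples the pixel count, which is why raw link throughput keeps being the headline feature of new connector standards.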