• insomniac_lemon@kbin.social
    link
    fedilink
    arrow-up
    0
    ·
    6 months ago

    KiB, MiB, GiB, etc. are clearer. It makes a big difference, especially 1 TB vs 1 TiB.
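    A quick way to see the gap (my own arithmetic, not from the comment):

```python
tb = 10**12   # 1 TB, decimal (SI)
tib = 2**40   # 1 TiB, binary (IEC)
print(tib - tb)   # 99511627776 bytes of difference
print(tib / tb)   # ~1.0995, i.e. nearly a 10% gap at the tera step
```

    So a "1 TB" drive is roughly 10% smaller than 1 TiB, and the gap widens with every prefix step up.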

    The American way would probably be to keep using the units you listed while still meaning 1024, just to be confusing.

    Either that, or maybe something that measures data as a physical length on a hard drive (or CD?). Like, that new game is 24.0854 inches of data (maybe it could be 1.467 miles of CD?).

    • survivalmachine@beehaw.org

      The American way would probably be to keep using the units you listed while still meaning 1024, just to be confusing.

      American here. This is actually the proper way. KB is 1024 bytes. MB is 1024 KB. The terms were invented and used like that for decades.

      Moving to ‘proper metric’ where KB is 1000 bytes was a scam invented by storage manufacturers to pretend to have bigger hard drives.

      And then inventing the KiB prefixes was a soft-bellied capitulation by Europeans to those storage manufacturers.

      Real hackers still use Kilo/Mega/Giga/Tera prefixes while still thinking in powers of 2. If we accept XiB, we admit that the scummy storage vendors have won.

      Note: I’ll also accept that I’m an idiot American and therefore my opinion is stupid and invalid, but I stand by it.

      • Kairos@lemmy.today

        No, the correct way is to use the proper fucking metric standard. Use Mi or Gi if you need it. We have computers that can divide large numbers now. We don’t need bit shifting.
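        Dividing works fine for either style of prefix. A hypothetical formatter (my own sketch, not anyone's actual library) that just divides by the chosen base:

```python
def fmt(n, base=1000, units=("B", "kB", "MB", "GB", "TB")):
    # Repeatedly divide by the base (1000 for SI, 1024 for IEC)
    # until the value drops below one unit step.
    value = float(n)
    i = 0
    while value >= base and i < len(units) - 1:
        value /= base
        i += 1
    return f"{value:.2f} {units[i]}"

print(fmt(1_500_000_000))                                        # 1.50 GB
print(fmt(1_500_000_000, 1024, ("B", "KiB", "MiB", "GiB", "TiB")))  # 1.40 GiB
```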

        • PowerCrazy@lemmy.ml

          Hey, how is “bit shifting” different from division? (The answer may surprise you.)

            • PowerCrazy@lemmy.ml

              Interesting, so does the computer have a special “base 10” ALU that somehow implements division without bit shifting?

              • nybble41@programming.dev

                In general integer division is implemented using a form of long division, in binary. There is no base-10 arithmetic involved. It’s a relatively expensive operation which usually requires multiple clock cycles to complete, whereas dividing by a power of two (“bit shifting”) is trivial and can be done in hardware simply by routing the signals appropriately, without any logic gates.
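                To make the "dividing by a power of two is just a shift" point concrete (a quick check of my own, not part of the comment):

```python
# For non-negative integers, right-shifting by k bits equals
# floor division by 2**k; no long-division loop is needed.
for n in (0, 1, 1023, 1024, 123_456_789):
    assert n >> 10 == n // 1024       # divide by 2**10
    assert n >> 20 == n // (2**20)    # divide by 2**20
# Dividing by 1000 has no such shortcut, which is why a general
# divide is the slower instruction on most hardware.
print("all checks passed")
```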