• piccolo@sh.itjust.works
    link
    fedilink
    arrow-up
    5
    ·
    3 months ago

    It makes sense because servers are expensive to operate. The real scam is nintendo where you pay for P2P multiplayer…

    • ✺roguetrick✺@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      3 months ago

      They’re expensive when you’re not already building a CDN for delivery of massive files all around the world. Economies of scale quickly matter there.

    • Aceticon@lemmy.world
      link
      fedilink
      arrow-up
      1
      ·
      3 months ago

      They’re stupidly cheap to operate per user when you have millions of them, which is how companies like Facebook manage to make a profit from merely showing adverts to users and with no subscription fees.

      Remember that Sony gets a cut from games being distributed to their platform, so online fees are just them double dipping for extra profits.

      • piccolo@sh.itjust.works
        link
        fedilink
        arrow-up
        3
        ·
        3 months ago

        Web servers are different from game servers. You need a lot of performance and fast, low-latency servers to keep up with real-time gameplay. Web servers don’t need that and can benefit from load balancing across multiple servers. Economies of scale help a lot, but with game servers the cost doesn’t change much because a session has to live on a single machine.
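
        A rough sketch of the difference (Python, with made-up host names): a stateful game session gets pinned to one machine by hashing its session id, while stateless web requests can just be spread round-robin.

        ```python
        import hashlib
        from itertools import cycle

        # Hypothetical host lists, purely to illustrate the routing difference.
        GAME_HOSTS = ["game-01", "game-02", "game-03"]
        WEB_HOSTS = cycle(["web-01", "web-02", "web-03"])

        def route_game_session(session_id: str) -> str:
            """A game session is stateful: every update for that match must
            reach the same machine, so pin it by hashing the session id."""
            digest = int(hashlib.sha256(session_id.encode()).hexdigest(), 16)
            return GAME_HOSTS[digest % len(GAME_HOSTS)]

        def route_web_request() -> str:
            """A web request is (mostly) stateless, so any host will do and
            round-robin spreads the load freely."""
            return next(WEB_HOSTS)

        if __name__ == "__main__":
            print(route_game_session("match-1234"))  # always the same host for this match
            print(route_web_request())               # rotates across hosts
        ```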

        As for distribution costs, most of the cost is in manufacturing and physically distributing discs. So yeah, they are making a killing by continuing to take a huge cut from game sales when most of their distribution is online.

        • Aceticon@lemmy.world
          link
          fedilink
          arrow-up
          1
          ·
          edit-2
          3 months ago

          At some point in my career I’ve actually designed mission-critical, high-performance distributed server systems for a living, so I’m well aware of that.

          You can still pack thousands of users per server and keep very low latency as long as you use the right architecture (mainly in-memory caching and load balancing), even when accessing gigantic datasets that far exceed a game’s data space. A game’s actual shared data space is minuscule, since every client has a local copy of most of it, namely the game level they’re playing in. Even with the most insane anti-cheat logic that checks every piece of data coming in from the user side against a server-side copy of the level data, it’s still a fraction of the shared data space in equivalent corporate systems. On top of that, game data tends to be easily partitionable: even in an MMORPG with a single massive open world, players only affect limited areas of the game space, so you don’t need to check a player’s actions against the data of all other players.
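
          As a toy illustration of that partitioning point (Python, with the grid size and player names made up): the server only validates a player’s action against players in the same or neighbouring cells of a spatial grid, never against the whole world.

          ```python
          from collections import defaultdict

          CELL_SIZE = 50.0  # hypothetical size of one grid cell in world units

          def cell_of(x: float, y: float) -> tuple[int, int]:
              """Map a world position to its grid cell."""
              return (int(x // CELL_SIZE), int(y // CELL_SIZE))

          class Zone:
              """Toy interest-management grid: a player's action is only checked
              against players in the same or neighbouring cells."""

              def __init__(self):
                  self.cells = defaultdict(set)  # cell -> set of player ids
                  self.positions = {}            # player id -> (x, y)

              def update_position(self, player: str, x: float, y: float) -> None:
                  old = self.positions.get(player)
                  if old is not None:
                      self.cells[cell_of(*old)].discard(player)
                  self.positions[player] = (x, y)
                  self.cells[cell_of(x, y)].add(player)

              def nearby_players(self, player: str) -> set[str]:
                  cx, cy = cell_of(*self.positions[player])
                  found = set()
                  for dx in (-1, 0, 1):
                      for dy in (-1, 0, 1):
                          found |= self.cells[(cx + dx, cy + dy)]
                  found.discard(player)
                  return found

          if __name__ == "__main__":
              zone = Zone()
              zone.update_position("alice", 10, 10)
              zone.update_position("bob", 60, 10)      # neighbouring cell
              zone.update_position("carol", 900, 900)  # far away, never checked for alice
              print(zone.nearby_players("alice"))      # {'bob'}
          ```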

          Also keep in mind that all the static stuff (never-changing or slow-changing data) like achievements or immutable level configuration can still be served at “normal” latencies.

          Further, the kind of Tier 1 ISP that provides network access for a company like Sony, servicing millions of users, already has more than good enough latency in its normal service, so Sony doesn’t need to pay extra for “low latency”.

          Anyways, you do make a good and valid point. It’s just that IMHO that’s the kind of thing that pushes the running costs from a few cents or less per player-month to, at most (and this is likely quite a large overestimation), a dollar per player-month, unless they only have tens of players per server (which would be insane, and they should fire their systems designers if that’s the case).
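
          Back-of-the-envelope, with made-up numbers (the per-server cost below is purely illustrative, not anyone’s real figure): the per-player cost collapses as you pack more concurrent players onto each machine.

          ```python
          # Purely illustrative numbers, not anyone's real costs.
          server_cost_per_month = 300.0  # assumed monthly cost of running one game server

          for players_per_server in (10, 100, 1_000, 10_000):
              cost = server_cost_per_month / players_per_server
              print(f"{players_per_server:>6} players/server -> ${cost:.2f} per player-month")
          ```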