It is all "anti-cheat". Many years ago gamedevs like John Carmack were no-joke wizards pushing the envelope, which certainly required low level access. Today's gamedevs aren't doing that... not even close. Modern games are known for shamefully janky code - and it isn't motivated by performance, see the GTA json parsing travesty for example: https://news.ycombinator.com/item?id=26296339
Making software is hard, I’d not be so quick to judge the GTA devs, especially when GTA games are massively larger development undertakings. Doom was made by a handful of people in less than six months - GTA most certainly wasn’t. There is such a vast difference in complexity I’m not sure there is much value in comparing. The entire doom source code is likely smaller than a single grand theft auto save file.
hmm, having a hard time determining if I'm simply defending an entrenched position or if the fact that they were the first to write FPS netcode might excuse the fact that it didn't initially run super great on corporate IPX networks that wired everything together with hubs instead of switches :) I vaguely remember hearing about that bug, and I very likely played the game prior to getting my hands on a version with it patched out. Never noticed a problem in the computer lab, but then we were still on token ring - where collision lights weren't really a thing.
> The entire doom source code is likely smaller than a single grand theft auto save file.
Is that a defense of GTA? Do you think that "complexity" is needed? I don't think so, I think it is emblematic of something very wrong, not just in software - but in the mentality one would need in order to look at the depth of our average stack traces and think "Meh, 25 calls deep isn't bad for a text-file pretty printer".
I think it’s pretty ridiculous to think something more complex than doom isn’t needed in a game simulating a vast number of systems like GTA, yes. Bugs and issues increase with complexity, surprise!
Doom is gloriously simple by today’s standards - it doesn’t even support “room over room”. That isn’t a criticism of today being over-complex either - the state of the art in gameplay systems has just moved on massively as compute power has increased. We take “room over room” for granted in 2021, which we couldn’t in the 90s.
Separately, if you can manage a software release with 1000 human developers, multiple release platforms and millions of users over 5 years (like GTA 5) and not ship random little performance bugs, you need to tell the rest of the world the secret of how you did it.
> I think it’s pretty ridiculous to think something more complex than doom isn’t needed...
That isn't what I said, and it isn't close enough to what I said to pretend it's not a bad-faith interpretation, designed to strawman the argument into one where you don't look totally ridiculous. Here, lemme help you get back on track - you said: "The entire doom source code is likely smaller than a single grand theft auto save file." You think I might have been addressing that?
> Maybe you are still ok with 640kb of memory too though? ;)
Oddly enough, I'm currently writing an (eventually) open source firmware for an ancient APC SmartUPS that used a variant of the 8051 MCU... I'm finding the 16K ROM and 256B RAM pretty roomy. I expect the other two people who eventually flash it onto their hardware in the next 30 years will be very pleased.
> random little performance bugs
lol, writing a microtransaction json parser that increases load times by an order of magnitude, for millions of customers, for years - and is so braindead that somebody with all the resources of a freeware decompiler and a stopwatch can do your job better than you... that isn't a "whoops, more seizure t-poses", that is a "I care so little about the quality of my work that I'm not even gonna bother dumping a flamegraph for that 10 minute freeze everyone has to sit through - for years."
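For anyone who didn't follow the link: the analysis found two independent quadratic patterns in the loader. A rough Python sketch of both (illustrative only - the real code was C calling sscanf on a ~10 MB JSON blob, and these function names are mine, not the game's):

```python
def parse_items_slow(buf: str) -> list[str]:
    """Mimics sscanf-in-a-loop: C's sscanf effectively runs strlen on its
    input every call, so it re-scans the whole remaining buffer for every
    item parsed -> parsing n items from an n-byte buffer is O(n^2)."""
    items, pos = [], 0
    while pos < len(buf):
        len(buf[pos:])                 # stand-in for the hidden strlen pass
        end = buf.find(",", pos)
        end = len(buf) if end == -1 else end
        items.append(buf[pos:end])
        pos = end + 1
    return items

def dedup_slow(items: list[str]) -> list[str]:
    """Mimics the second bug: a linear membership scan before every
    insert into a flat array, again O(n^2) overall."""
    out = []
    for it in items:
        if it not in out:              # O(n) scan per item
            out.append(it)
    return out

def dedup_fast(items: list[str]) -> list[str]:
    """The obvious fix: hash-based membership, O(n) overall."""
    seen, out = set(), []
    for it in items:
        if it not in seen:
            seen.add(it)
            out.append(it)
    return out
```

The slow and fast dedups return identical results; the whole point of the writeup was that swapping the lookup structure (and caching the length) cut the multi-minute freeze to seconds - which is why "a decompiler and a stopwatch" was enough to find it.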
> The entire doom source code is likely smaller than a single grand theft auto save file.
Just checked, "linuxdoom-1.10" (https://github.com/id-Software/DOOM) is 1.26 MB, my GTA save is a little over 500 kB. The GTA save files are actually quite small, likely because most things in the game world are emergent behavior of the game's systems and aren't permanent state (like in e.g. TES games, which are notorious for large save files, as well as save file corruption bugs due to "oops we didn't think saves would grow bigger than 16 MiB!").
That actually makes it more impressive, since the GTA game world is the result of all these systems working together to create a "living, breathing world" indeed, versus what you see in most other open world games (most recently: CP2077).
> While there’s no question Carmack was an exceptional talent, let’s not forget the original Doom shipped with serious network killing bugs in 93
You're confusing "this is necessary" with "there are no (horrible) problems with this".
Doom needed low-level access because it was (at the time) simply impossible to make the game run at a reasonable speed without low-level access. Separately, Doom had bugs, because all software is terrible.
GTA does not need low-level access, because everything it has a legitimate reason to do runs at reasonable speeds on modern hardware (or if it doesn't, low-level access won't speed it up noticeably). Separately, GTA has bugs, because all software is terrible.
Point taken, do you think there is any reason they can't run with locked down privileges or is it all anti-cheat?