Probably yes. I know it at least refuses to 'type down first 5 pages of lotr book' because of copyright reasons. Its filter is getting better (as in worse for the user) every day.
I read an article about how DOOM's engine works and noticed that a variable tracking the demo kept being incremented even after the next demo started. That variable was compared with a second one storing its previous value.
Doesn't sound like something that would crash; I wonder what the actual crash was.
Signed overflow is undefined behavior in C, so pretty much anything could happen. Though this crash seems to be deterministic between platforms and compilers, so it’s probably not about that. TFA says the variable is being compared to its previous value, and that comparison presumably assumes new < old cannot happen. And when it does happen, it could easily lead to, e.g., stack corruption. C, after all, happily goes to UB land if, for example, some execution path doesn’t return a value in a function that’s supposed to return a value.
Just because the language standard allows for anything to happen doesn't mean that actually anything can happen with real compilers. It's still a good question to think about how it could actually lead to a crash.
That’s what I said? It’s easy to come up with scenarios where signed overflow breaks a program in a crashy way if the optimizer, for example, optimizes out a check for said overflow because it’s allowed to assume that `++i < 0` can never happen if i is initialized to >= 0. That’s something that very real optimizers take advantage of in the very real world, not just on paper. For example, GCC needs -fwrapv to give you guaranteed wrapping behavior (there’s actually also -ftrapv, which makes signed overflow trap at runtime – that’s likely the easiest way to cause this crash!).
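For the skeptical, here's a minimal sketch (hypothetical code, nothing to do with the actual Doom port) of the kind of overflow guard an optimizer is entitled to throw away:

#include <limits.h>
#include <stdio.h>

/* Hypothetical example: with optimization on, GCC may fold `i + 1 < i` to
   false -- signed overflow is assumed to never happen -- so the guard
   silently vanishes unless you build with -fwrapv (defined wrapping) or
   -ftrapv (trap on overflow). */
int next_tick(int i)
{
    if (i + 1 < i) {           /* intended overflow check */
        puts("counter overflow");
        return 0;
    }
    return i + 1;              /* UB when i == INT_MAX */
}

int main(void)
{
    printf("%d\n", next_tick(INT_MAX));
    return 0;
}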
But I specifically said that it doesn’t look like SOUB in this particular case, and proposed an alternative mechanism for crashing. What’s almost certain is that some type of UB is involved because "crashing" is not any behavior defined by the standard, except if it was something like an assertion failing, leading to an intentional `abort`.
That doesn't make sense. If new < old can't happen, there is no need to make the comparison. Stack corruption? Nah, it's a counter, not an index or a pointer, or it would have failed sooner. But then what is the failure? I don't know.
Assuming new > old doesn't mean you actually make the comparison, but rather that the code is written with the belief that new > old. This code behaves correctly under that assumption, but it might be doing something very bad that leads to a crash if new < old.
An actual analysis would be needed to understand the real cause of the crash.
Um, there are the cases new == old and new > old. And all the more specific cases new == old + n. I haven’t seen the code, so this is just speculation, but there are plenty of ways in which an unexpected, "can never happen" comparison result causes immediate UB because there’s no execution path to handle it, causing garbage to be returned from a function (and if that garbage was supposed to be a pointer, well…) or even execution never hitting a `ret` and just proceeding to execute whatever is next in memory.
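To make that concrete, a hypothetical sketch (not the actual Doom code) of a function where the "impossible" outcome simply has no execution path:

/* Hypothetical sketch -- not the actual Doom code. */
const char *demo_state(int now, int before)
{
    if (now > before)
        return "advancing";
    if (now == before)
        return "paused";
    /* now < before was assumed impossible: control falls off the end of a
       non-void function, which is UB -- the caller may get a garbage pointer. */
}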
Another super easy way to enter UB land by assuming an integer is nonnegative is array indexing.
int foo[5] = { … };
foo[i % 5] = bar;
Everything is fine as long as i isn’t negative. But if it is… (note that negative % positive == negative in C)
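A tiny demo of that sign rule (nothing here is from the original code):

#include <stdio.h>

int main(void)
{
    int i = -3;
    /* C's % truncates toward zero, so the result takes the sign of i */
    printf("%d %% 5 = %d\n", i, i % 5);   /* prints "-3 % 5 = -3" */
    /* so foo[i % 5] above would write to foo[-3]: out of bounds, UB */
    return 0;
}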
The error states that the window can't be created. It might be a problem with the parameters to the window creation function (those should not depend on game state), or maybe the system is out of memory. Perhaps resources allocated in memory are never cleaned up because the cleanup time overflows?
Doom4CE (this port) was based on WinDoom, which creates the program window only once at startup, then switches the graphics mode and proceeds to draw to the screen independently, processing keyboard and mouse input messages. I'm not sure, but maybe Windows CE memory management forced the programmer to drop everything and start from scratch when loading each level? But then why do we see the old window?
There are various 32-bit integer counters in the Doom code. I find it quite strange that the author neither names the specific one, nor explains what it does, nor tries to debug what happens by simply initialising it to some big value.
Moreover, 2^32 divided by 60 frames per second, then by 60 seconds, 60 minutes, 24 hours, 30 days, and 12 months gives us a little less than 2.5 years. However, the Doom gameplay tick (or “tic”), on which everything else is based, famously happens only 35 times a second and is detached from the frame rendering rate both on systems that are too slow (many computers at the time of release) and on systems that are too fast (most systems that appeared afterwards). 2^32 divided by 35, then by 60 seconds, etc., gives us about 4 years until overflow.
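A quick back-of-the-envelope check of those figures (same 30-day-month arithmetic as above):

#include <stdio.h>

int main(void)
{
    /* 60 s * 60 min * 24 h * 30 days * 12 months, as in the estimate above */
    double seconds_per_year = 60.0 * 60 * 24 * 30 * 12;
    printf("at 60 Hz: %.2f years\n", 4294967296.0 / 60 / seconds_per_year);  /* ~2.30 */
    printf("at 35 Hz: %.2f years\n", 4294967296.0 / 35 / seconds_per_year);  /* ~3.95 */
    return 0;
}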
Would be hilarious if it really is such an easy mistake.
The comments show the values each thread observed.
Why? Nothing in that code implies any synchronization between threads or forces an ordering. thread_2 can fetch the value of y before thread 1 writes to it, which would set b to 0.
You would need additional mechanisms (an extra atomic that you compare_exchange) to force an ordering.
edit: but I guess the comment means it is the thing the author wants to observe
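For illustration only -- this is not the article's code, just a minimal two-thread sketch of the kind of race being described, reusing the names y and b from the discussion:

#include <atomic>
#include <thread>

std::atomic<int> y{0};
int b = -1;

int main()
{
    std::thread t1([] { y.store(1, std::memory_order_relaxed); });
    std::thread t2([] { b = y.load(std::memory_order_relaxed); });
    t1.join();
    t2.join();
    // Nothing orders the two threads: b == 0 and b == 1 are both legal outcomes.
    return 0;
}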
Now, the big question: is this execution even possible under the C++ memory model?
> sure, use an extra atomic to synchronize threads
What? That would make the situation worse. The execution has a weird unintuitive quirk where the actions of thread 3 seem to precede the actions of thread 1, which seem to precede the actions of thread 2, yet thread 2 observes an action of thread 3. Stronger synchronization would forbid such a thing.
The main question of the article is "Is the memory model _weak enough_ to permit the proposed execution?"
A week or so ago I needed to convince ChatGPT that the following code will indeed initialize the x values in the struct:
struct MyStruct
{
    int x = 5;
};
...
MyStruct myStructs[100];
It was insisting very passionately that you need MyStruct myStructs[100] = {}; instead.
I even showed it the MSVC assembly output and pointed to the place where it loops over the array and assigns all the x values, and then it started hallucinating about MSVC not conforming to the standard. Then I did the same with GCC, and it said the same thing. It was surreal how strongly it believed it was correct.
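For reference, a quick check of the claim (any C++11-or-later compiler):

#include <cassert>

struct MyStruct
{
    int x = 5;   // default member initializer
};

int main()
{
    MyStruct myStructs[100];          // no "= {}" needed
    for (const auto &s : myStructs)
        assert(s.x == 5);             // holds: each element is default-initialized,
                                      // which applies the member initializer
    return 0;
}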
LLMs don't have beliefs, so "convincing" them of this or that is a waste of your time. The way to handle such cases is to start anew with a clean context and just add your insight to the prompt so that it lands on the right track from the beginning. Remember, these models are ultimately just next-token predictors, and anthropomorphizing them will invariably lead to suboptimal interactions.