Why do video games crash, freeze, and lag, while software in general doesn't?
So, if you play video games, I'm sure you're well familiar with the issues that plague almost every single game (or at least a lot of the heavy hitters of the modern market). Some dev teams are even infamous for shipping what are practically beta (if not alpha) releases disguised as finished games (*cough* Bethesda/Obsidian *cough* Fallout New Vegas). Meanwhile, I use a variety of applications on my desktop, most notably Blender, Firefox, GIMP, UDK (just the editor), Unity (just the editor), etc. (Notice they're all free to use, hehe.) These are pretty demanding applications too, but I can rely on them not to crash and burn so hard that I have to unplug the machine, and none of my saved data gets deleted unexpectedly. With video games, I can't say the same. Gaming seems to be the industry most heavily plagued by software issues, to the point that standards might even be lowering as we speak.
So, I'm wondering: why does game software specifically crash so hard, whereas most other types of software don't? I know that, in practice, debugging is an asymptotic process with diminishing returns for any programming endeavor, but I don't get the impression that software in general suffers from fatal bugs to the extent that games do. And considering that game devs are selling an immersive experience, how much more un-immersive can you get than watching your horse violently spin up into the air, or finding an essential NPC just dead in front of his shop?
Last edited by gakushya; April 24th, 2012 at 06:13 AM.