This thread reminds me that FF XV was delayed by about 2 months to avoid a day-one patch. AT&T now charges us if we go over a certain amount of data per month, so having the disc be a fully self-contained, working game matters a lot to me. Also, there's nothing I hate more than buying a brand new game only to sit and stare at a download screen... So I really appreciate what Square-Enix did. I'd rather it be delayed and get something worthwhile than get something crappier up front.
I maintain three distinct collections:
My arcade games collection is based on MAME. The PleasureDome tracker helped me fill it.
The DOS collection is based on DOSBox. Thanks to the PleasureDome tracker, I can keep a collection of more than 600 games, a good deal of which I owned at one time or another.
The ZXSpectrum collection is based on FUSE. It used to be based on the most excellent Spectaculator on Windows, one of the few applications I dearly miss from my transition to Linux, but FUSE is not bad either. Thanks again to the PleasureDome tracker, I was able to recover the hundreds of game tapes I once owned before being a dumb arse and selling them, and to add quite a few more games I never owned. PleasureDome tracker - the only torrent tracker that matters.
Last edited by Mario F.; 09-12-2016 at 03:40 PM.
Originally Posted by brewbuck:
Reimplementing a large system in another language to get a 25% performance boost is nonsense. It would be cheaper to just get a computer which is 25% faster.
I fully agree. Ever since the release of the PS3 and Xbox 360 in 2006, almost every single game requires a day-one "patch".
Why? Because games are rushed through the testing process. QA employees do not test everything they should, and
even if they do, games are still released with bugs and glitches that should have been amended long before the game
went to shipping.
Does this have anything to do with deadlines? Of course. Just a guess, but I can almost wager that directors of game
companies put enormous pressure on PMs to push games out the door ASAP. Did you play the SNES in the early 1990s? You
can tell all the testing was done back then, because it HAD TO WORK. There were no "patches", and even PS1 and PS2 games
came out of the workshop working fully at least 99% of the time. Ever since development moved to the seventh generation, it has
become the "norm" for all games to require several patch downloads. Games are released half finished. I'm looking at you,
Bethesda *cough* SKYRIM *cough*
Double Helix STL
In their defence, games were much easier to code then than they are today.
Most 80s and 90s consoles were coded on the metal, in whatever assembly language their processor supported. The instruction sets on 8- and 16-bit processors were relatively easy to manage, the code was straightforward and easy to reuse, and project complexity was limited. Bugs were easy to detect and fix, and games were not complex, which made testing a much simpler task. The same can be said of the 16-bit era of the PC industry. It can be argued that memory management on 16-bit computers was harder (and stupid). It was. But in contrast, there wasn't much the hardware could do, which limited game capabilities. The smaller the project, the easier it is to plan it and come up with a quality product.
Today, the problem is not just the need to support multiple hardware configurations, or the insistence of the supply side of the hardware industry on adhering only partially to standards while trying to gain a commercial edge through exclusive features that spread cross-compatibility issues and widen the window for problems. Nor is it just that developers are building bigger and more complex games. Today, developers also face the need to incorporate huge chunks of code that they didn't write and over which they have very little control.
Third-party libraries make up perhaps 70% of the entire code base of a modern game, and bugs are all too often associated with them: either a bug in the library itself, or a bug in how the library is used. Fixes are often not actual bug squashes but code written to circumvent a library bug or limitation, code that needs to be written but should never have had to exist. Where developers were once in control of 100% of their code, today that is down to around 30% of the entire project. (This 70/30 split is a gross estimation on my part. Take it with a grain of salt, but be prepared to accept that on some projects the ratio may be even higher.)
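To illustrate that "code that should never have had to exist" point, here is a toy sketch (all names are made up, and the "vendor" function just stands in for any third-party code you can't patch):

```python
# Hypothetical third-party function shipped in a vendor library.
# Imagine it crashes on empty input -- a bug we can only work
# around, never fix at the source.
def vendor_average(values):
    return sum(values) / len(values)  # ZeroDivisionError on []

# The workaround wrapper: extra code in OUR project whose only
# purpose is to route around a defect in code we didn't write.
def safe_average(values):
    if not values:  # guard against the library's unhandled case
        return 0.0
    return vendor_average(values)
```

Every such wrapper adds surface area that has to be tested and maintained, which is part of why so little of a modern project is code the developers actually control.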
The only tests capable of catching these bugs before shipping are integration and production tests. But the limitations here are obvious. How can anyone expect a company to test every possible hardware configuration? Worse, many bugs can be tied to unrelated software installed on the machine, or to the machine's current stability, which makes them nearly impossible to test for. This is why companies have slowly been discouraged from investing too much in post-development tests: it's all but useless.
You can't solve these problems. The technological scope and spread of the PC industry is the reason behind its success. The downside is this.
That's a really good point, Mario. Which is why I'm happy Square is taking the time to pull the product back.
It's a shame that MGS V's development was cut short. I think they spent about 5 years on it and still didn't have a complete game. However, I will say that MGS V is probably the most well-coded game I've ever played in my entire life (at 26 this may not mean much).
I've easily put in about 200 hours across two whole playthroughs of the game, and I encountered one bug once: while my character was traversing a hill, a weird texture glitch made me run in a circle for like 2 seconds before going back to normal.
The Evil Within, however, I only managed to beat because the final boss had a pathing glitch that made the fight too easy lol.
I'm meeting with my boss's supervisor next week. She has seen me coding feverishly at my desk and is concerned about my workload. I feel like this might be a double-edged sword. What if she thinks I should be able to handle my load, and that maybe someone else is better suited for my job? What if she tells my supervisor and he trumps up a reason to fire me? I'm seriously concerned about this.
Hard to make sense of that without more info about the person. Basically, if they're a bad manager it could be bad, but if they're a good manager it could be good, i.e. maybe they'll take some work off your plate.
> How do you guys handle these situations?
That's a project management issue. Changes to the requirements inherently alter the deadline, and there should be a change request to document it. If you're experiencing scope creep without alterations to the deadline or expected effort, the project manager is not doing his/her job.
If this is habitual from the project manager and cannot be resolved, I'd favor dumping the employer and finding someone who doesn't abuse their developers.
My best code is written with the delete key.
This thread, and hearing about others' problems at work in general, makes me glad I chose to keep CS-related things as a hobby. I code at work, but only like 20% of the time. I'd shoot myself if I had to have someone breathing down my neck every day to code faster, etc.
Come to the world of industrial automation, the work is much easier and we need more CS-oriented people to bring the software up to par.