You'd think it would be simple to program something to do something simple. Despite what some people may tell you, you're right, provided you're not a complete moron. Some people, however, seem to be under the impression that any simple task is a gargantuan effort requiring the import of several processor-heavy, 100-megabyte libraries just to set up. This leads to machines capable of performing 2 billion complex calculations per second being brought to their knees by spin-locks and memory leaks and, in a perfect world, a refusal by customers to use the product.
This is not a perfect world.
The other side of this is, of course, Genius Programming.
See also Artificial Stupidity, Game Breaking Bug (for video games).
There are a few things that shouldn't be listed here, most importantly:
- Software written predominantly by enthusiasts, unless there's good reason to believe that it should be judged by the same standards as professional software — for example, if the software's developers or publishers market it as being comparable or superior to professional software.
- Software that hasn't actually been marked as a release candidate or final build. Beta software is expected to have serious issues, and alpha software is, by definition, not even finished.
- Software that appears to use too much memory without a good reason. By itself, using too much memory doesn't actually tell us anything about the quality of the coding, since it's generally possible to write software that uses additional memory to gain a performance boost.
Some galling examples can be found on The Daily WTF, particularly the CodeSOD section.
Miscellaneous Examples
- Actual code that causes these observed effects is a weekly feature at thedailywtf.com.
- Also from the programming side, libraries developed by the Department of Redundancy Department where you have to lapse into Pokémon-Speak to write any meaningful code. For instance, take this line from Debian's version of awesome's rc.lua:
debian.menu.Debian_menu.Debian
- To be clear, "menu" is the only member of "debian", and "Debian_menu" is the only member of "debian.menu".
- Adobe Flash. You may notice it on this very site, taking up 100% of your CPU and 80 megabytes of your RAM to display a static image that would take up 12K as a JPG.
- Thank heavens for FlashBlock.
- And on numerous video sites, as a player that drags brand-new multicore, multi-gigahertz computers to their knees in order to jerkily fail at playing H.264 video that would run silky smooth on Pentium IIs or G3s as unwrapped files.
- Even simple programs, like Conway's Game of Life, which can easily run at 70 frames per second on a 486SX when written in C, will struggle to run at 5 frames per minute when the same algorithm is implemented in Flash on a computer more than a hundred times faster.
- Java should have filled the niche of web-based games that Flash mostly owns... except that early versions of Java were so slow and so unnatural-looking that Flash actually looked good in comparison. By the time they fixed it, Flash had become the de facto standard for this kind of thing, much to the chagrin of just about everyone except Adobe.
- Sometimes the already-poor performance of Flash is compounded by the often badly coded applications written for it. To give an example, the BBC embeds audio and/or video files in pretty much every article on the BBC News website. Unfortunately, the initial version of the Flash app they used to do this was so badly designed that any system with a processor below a Core i7 was pretty much guaranteed to be utterly brought to its knees for several minutes at a time while the player loaded. It took months for the app's performance problem to be fixed.
- A couple versions ago, the Windows Flash installer would sometimes report insufficient disk space even when there was no such problem. The reason? The installer would check drive C: for space regardless of the actual destination drive, even if C: wasn't assigned to a hard drive at all.
- Compounding all these problems is the fact that Adobe appears to be deliberately crippling Flash in its capacity to perform its original purpose — vector-based animation — to try and get people to use it for what they want, which seems to be websites (hands up, everyone who thinks this sounds perfectly reasonable. Anyone? No one? Good).
- Adobe Acrobat and Adobe Reader aren't much better either. Somehow, Ctrl+C to copy doesn't always work, despite the fact that this is one of the most basic features of any program that can display text. Sometimes it works, sometimes it silently fails, sometimes it fails and pops up an error message saying "An internal error has occurred", and sometimes it works but still pops up that error. And if you hit Ctrl+C four times in a row in a futile attempt to copy the same text, you might get one of each result despite the fact that nothing else changed between attempts. It's like playing a slot machine.
- The Linux versions are somewhat troublesome as well. Sometimes, with several documents open, you cannot switch between them using the mouse, and the same goes for the menus; you have to use the keyboard to activate the menus, which "unlocks" the interface again.
- Linpus Linux Lite, as shipped with the Acer Aspire One. Now in fairness to Linpus, its GUI could not possibly be more intuitive (plus a boot time of just 20 seconds). But there is designing a distro for complete beginners, and there is designing a distro with several directories hard-coded to be read-only and Add/Remove Programs accessible only by some fairly complex command-line tinkering. That the sum total of its official documentation is a ten-page manual that contains no information that can't be figured out by an experienced user within five minutes of booting doesn't help.
- In Brazil, many low-end computers are sold with horrible Linux distros in order to claim tax breaks for using locally-developed software. Stuff which cannot be updated without major breakage, full of security holes, old versions of packages and so on, to the point that it seems many people only buy them so they can install a pirated copy of Windows to save money.
- Thousands of computers using McAfee Antivirus were brought down on April 21, 2010 by an update that misidentified an essential Windows file as a virus, causing computers to constantly reboot. Users couldn't go online for a fix; they had to go to an unaffected computer to get the fix and install it manually on their computer.
- That was actually fixable on affected PCs: open the Run dialog box and type shutdown -a to abort the reboot cycle, then download the patch. Of course, most users wouldn't know this.
- McAfee's strength is that it blocks everything that might be a threat. Its weakness is that it blocks everything that might be a threat. If you wish to use a program that it considers a threat (and as of this writing, it considers Dhux's Scar, among other things, to be such a program), you cannot get it to grant an exception. You're supposed to send McAfee's developers an email telling them it's a false alarm. If they don't respond, you need to disable McAfee every time you want to use the program.
- On many computers, McAfee will make the CD drive stop working. And McAfee is often stealthily installed by default when you're trying to install something completely unrelated (Flash, for example; note the already-checked checkbox). You know, like those stupid "toolbars" that are pretty much always loaded with viruses.
- An AVG update misidentified a critical system file in 64-bit versions of Windows 7 as malware, preventing the systems from booting up.
- HP's printers, scanners, and fax machines come with software to go with the hardware. Everything but the drivers is optional, but HP does everything short of flat-out lying to your face to falsely suggest that the other crap is required. If you do install the optional software, the stupid thing will pop up an "Error: Your HP printer is disconnected" message every time you turn off the printer. Next thing you know they'll make your computer tell you to turn the light and TV back on when you leave a room.
- Norton products have a tendency to do this:
- Norton Internet Security blocks any and all images with certain dimensions, specifically those that are commonly used for advertisements. Problem is, at least one of the sizes is also commonly used by sites for non-ad purposes. In older versions, this could not be turned off without disabling all filtering completely.
- Attempting to uninstall some older versions of Norton products, particularly Norton SystemWorks and Norton Internet Security has been known to actively damage the user's computer, to the point of crashing it, rendering it unable to boot, and/or corrupting the Windows Registry and/or files on the hard drive. Symantec eventually created the Norton Removal Tool, a program for the sole purpose of safely and cleanly uninstalling other Norton products, to remedy this.
- A notable failure: it is not unheard of for Norton AntiVirus to declare itself a virus and attempt to delete itself.
- My God! It's become self-aware! It doesn't want to live.
- Then there's a whole load of other issues caused by Copy Protection. See the anti-piracy measures section below.
- Norton 360 has decided that "not commonly used by other people" is a sufficient metric of suspicion to block and immediately delete any executable run by the user. So if you're in any computer science class ever and have to create, compile, and run your own programs on a regular basis, Norton hates you.
- It will also randomly sift through and delete DLLs — such as critical libraries in the Unity 3D engine. The warning screen does not let you override this. Attempting to "Learn More" doesn't take you anyplace that will clue you in on how to work around it, but to the Norton product page — in Japanese.
- It also drastically slows down Windows boot times. A Core 2 Duo with 4 GB of RAM running Windows 7 took up to 5 minutes to fully boot with the thing installed; without it, roughly 45 seconds.
- And that's without counting other fun behavior, like its firewall blocking Firefox, or a bad update of its virus database causing BSODs that forced a full system restore.
- The Computer Stupidities collection at Rinkworks.com has its own section for programming.
- Adding another Sony example to the lot already present further down this page: when they first started out with their portable music players, Sony didn't support the MP3 standard, due to their historical unwillingness to support anything that could encourage piracy of any kind. Their players instead supported ATRAC3, a proprietary Sony audio format. Being (somewhat surprisingly) smart enough to figure out that users would want to listen to their MP3 music, Sony sold the players with SonicStage, an upload program capable of converting MP3 files to ATRAC. SonicStage promptly proceeded to annoy a whole lot of people: buggy and prone to crashing, it wouldn't run at all on some computers (for reasons unknown), prompting many to return the players and switch brands.
- Creative did the same with their first hard-disk players: before they started supporting the MTP format (widely supported by many music managers), the only way you could upload stuff to them was by using the godawful PlayCenter program, later superseded by the even worse MediaSource. Many users preferred to keep PlayCenter: buggy as it was, at least it did its job sometimes. Both programs also attempted to set themselves as default players and music managers, further irritating users.
- Ditto Microsoft with the Zune. Read the Microsoft section for more.
- Made worse by the fact that, aside from the problems resulting from SonicStage, ATRAC3 files have better sound quality and a much smaller file size than equivalent MP3s. With the death of SonicStage due to the aforementioned problems, it's Lost Forever.
- The PlayStation 3 can still encode and play Atrac, but see below for why this is probably not an improvement.
- Many streaming video websites have a ridiculously small buffer size; quite often only 3-4 seconds of video, and possibly even less at HD resolutions. In optimum conditions it's not so problematic, but if the site is particularly busy or if your Internet connection isn't too fast, it can make the videos all but unwatchable. Absurdly, YouTube suffers from this restriction on its streaming videos despite allowing an unlimited buffer on non-streamed videos, not to mention that when watching videos you can often expect a black screen reading "An error occurred. Please try again later."
- Computers don't really have random number generation. They generate one number at a time through a formula. In order. Programming teachers used to demonstrate this Socratically on old Apple II computers, by having their students write a BASIC program to print random numbers and then running it, rebooting, running it, rebooting, and so forth, observing the results. The best way around this was to continually generate "random" numbers in the background of a program's title screen, thus letting the delay until the user pressed a key serve as the randomizer.
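A minimal sketch of both points, in Python rather than Apple II BASIC (the generator constants here are arbitrary textbook values, not anything a real Apple II used): seed the formula the same way twice and you get the same "random" numbers both times, while the title-screen trick turns the player's own dawdling into the seed.

```python
import time

class TinyLCG:
    """A linear congruential generator: state = (a * state + c) mod m."""
    def __init__(self, seed):
        self.state = seed

    def next(self):
        self.state = (1664525 * self.state + 1013904223) % 2**32
        return self.state

# Two "reboots" with the same default seed produce exactly the same sequence.
gen = TinyLCG(1)
print([gen.next() for _ in range(3)])
gen = TinyLCG(1)            # reboot: same seed again
print([gen.next() for _ in range(3)])

# The title-screen trick: measure how long the player dawdles before pressing
# a key and use that delay as the seed, so each run starts somewhere different.
start = time.time()
input("Press Enter to start the game...")
ticks = int((time.time() - start) * 1000)  # milliseconds of human hesitation
rng = TinyLCG(ticks or 1)
print(rng.next())
```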
- RANDU, the infamous random number generator included on IBM computers in the 60s. How bad is it? Aside from the fact that every number it produces is odd (which is very, very bad on its own, yet easy to work around), any three consecutively-generated numbers satisfy a simple linear relationship, so when plotted as points in 3D space they all collapse onto just 15 planes.
- And because the computer it shipped with, the System/360 mainframe, is widely regarded as IBM's greatest work and was the computer of The Sixties and The Seventies, the generator became so widespread that traces of it periodically surface even NOW, almost fifty years later.
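RANDU is simple enough to reconstruct and check yourself; a short Python sketch using its published constants (multiplier 65539, modulus 2^31) confirms both complaints:

```python
# RANDU, reconstructed from its published definition:
#   v[j+1] = 65539 * v[j] mod 2^31, with an odd seed.
def randu(seed, n):
    v, out = seed, []
    for _ in range(n):
        v = (65539 * v) % 2**31
        out.append(v)
    return out

xs = randu(1, 10000)

# Every output is odd (an odd seed times an odd multiplier stays odd).
print(all(x % 2 == 1 for x in xs))

# And every three consecutive outputs satisfy the same linear relation,
# x[k+2] = 6*x[k+1] - 9*x[k] (mod 2^31), which is why plotting consecutive
# triples in 3D shows all the points falling onto 15 planes.
print(all((xs[k+2] - 6*xs[k+1] + 9*xs[k]) % 2**31 == 0
          for k in range(len(xs) - 2)))
```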
- Earlier versions of Java would install updates not by patching the existing installation, but by creating a completely new installation in a separate folder without removing the old one.
- This could be justifiable in a sense - keeping the old versions serves as a crude form of backwards compatibility, ensuring that older code would be able to find the version of Java it was meant to use. If the install was small enough (not that Java is known for its brevity), it would be somewhat practical, though inelegant in the extreme and therefore a Bad Thing.
- For graphing calculators, it often happens that several different pieces of hardware exist for linking them to computers. It also happens that different linking programs, from different authors, don't support the same hardware. And unlike with Texas Instruments calculators, the linking applications for Casio calculators all used different, incompatible file formats on the PC.
- While Windows Vista (see below) did introduce a ton of problems, it also did something that revealed many a programmer's idiot programming choice: assuming that the user account always had administrative rights. In Windows Vista, Microsoft introduced UAC, which would only assign standard user rights to a program, even if the user was an administrator. This is sensible, as it limits the damage that the program can do if it goes rogue. Programs that needed administrator rights were detected based on the file name and an optional configuration file called a manifest. Of course, older software that needed administrator rights knew nothing of manifests, and would fail in unpredictable ways, usually spouting an error message that wouldn't make the actual problem obvious to the non-technical (or necessarily even to the technical) — although Windows did sometimes spout a dialogue box along the lines of "Whoops, this program looks like it needs admin rights, but it didn't ask for them and I didn't realise until just now, do you want me to make sure it runs as an admin in future?".
- So the designers of the Soviet Phobos space probe left testing routines in the flight computer's ROM — fair enough, everyone does the same, because removing them means retesting and recertifying the whole computer, which generally would've been plainly impossible without said routines. But to design the probe's operating system in such a way that a one-character typo in an incoming command would accidentally trigger a routine that turns off the attitude thrusters, making the spacecraft unable to point its solar panels at the Sun and recharge its batteries, effectively killing it, takes a special kind of failure.
- Firefox 4's temporary file deletion algorithm was an unusual case, since it was actually pretty effective at deleting older files and freeing up large amounts of disk space. It suffered a major problem, though, in that it chewed up huge amounts of CPU power and maxed out the hard drive in the process, which could slow your entire system to a crawl. Worse still, there was no way of aborting the cleanup routine, and if you killed the Firefox process, it would just invoke the file cleaner again as soon as you restarted the browser. It wasn't until Firefox 5 that the file cleaner got fixed, using much less CPU power and still being fairly disk-intensive, but not to the same extent as previously.
- Tech sites have noted a rather disturbing trend in how certain handheld devices handle firmware updates. The sane way to do such an update over the internet is to check for the existence of updated firmware, download it, erase the old firmware, and only then load the updated version (the difference is sketched below). Ideally there's also a backup firmware chip, or some other way of restoring the device if things go pear-shaped. Unfortunately, a lot of devices (especially cheaper ones) don't actually do that — instead, they check for a firmware update, and upon getting confirmation that there is such an update, the device immediately wipes the old firmware, then downloads and installs the updated version. If anything goes significantly wrong during the download (e.g. loss of internet connection, loss of power, or a software error), then the device will almost certainly be bricked. On top of that, most of the time there's no way to restore such a device to working order outside of replacing the motherboard, and only a 50/50 or so chance the manufacturer will replace it under warranty.
- Curiously, most so-called "smart TVs" will do this - Samsung and Sony seem to be particularly bad about it.
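A rough sketch of the two update strategies, with made-up stand-in functions rather than any real vendor's firmware code; the only thing that matters here is the order of operations:

```python
# The point is purely the ordering: verify the new image before throwing
# away the one that still boots the device.

def verify_checksum(fw):
    return fw is not None and fw.endswith(b"OK")   # stand-in for a real signature check

def safe_update(current_fw, download):
    """Download and verify first; only then replace the old firmware."""
    new_fw = download()                  # may fail or return garbage
    if not verify_checksum(new_fw):
        return current_fw                # old firmware untouched, device still boots
    return new_fw                        # swap only after verification

def brick_prone_update(current_fw, download):
    """Erase first, then hope the download works."""
    current_fw = None                    # old firmware already wiped...
    new_fw = download()                  # ...so a dropped connection here means a brick
    return new_fw

flaky_download = lambda: None            # simulate a download that dies halfway

print(safe_update(b"v1-OK", flaky_download))         # b'v1-OK' (still usable)
print(brick_prone_update(b"v1-OK", flaky_download))  # None (bricked)
```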
- Fargus Multimedia's Russian bootleg of Putt-Putt Saves the Zoo is a complete collection of failure. Alongside having enough problems as it is (characters commonly speak in the wrong voices and their lips keep moving after finishing a line of dialog), they made, quite possibly, one of the biggest fails possible — the game was originally completely unplayable. Why? They packaged the game with a blank W32 file, the file that executes the game. It wouldn't be until 2004 that fans would fix this by using torrents to grab an American W32 and stick it into the Russian version.
- A flight of ultra-high-tech F-22 Raptors suffered multiple computer failures and was practically crippled because their programming couldn't cope with the time zone change of crossing the International Date Line. Somehow, it never occurred to the designers that a fighter aircraft just might cross the International Date Line, so nobody programmed its systems to adjust for it, which is a standard part of the programming of modern cellphones. This oversight resulted in all Raptors being temporarily grounded.
Games
Note: Consider whether the entry would fit better under Obvious Beta or Game Breaking Bug.
- Jan Ryu Mon, an online Mahjong game by PlayNC, which has plenty of evidence suggesting it was programmed by drunken monkeys:
- If a cookie gets blocked from the site (most notably the registration page), the server gives a "Fatal Error" page with a stack trace - no indication whatsoever that the problem is a blocked cookie.
- The login only works in Internet Explorer - attempting to log in on any other browser will result in a "Please use Internet Explorer 6.0 or higher to log in" pop-up message as soon as you click on the username or password fields, and the pop-up repeats every time you try to type a letter, click on one of the fields, or click the "Login" button. There is absolutely nothing on the site that doesn't work in Firefox, except...
- Even though the game ran in a separate .EXE file, it would give an error message on startup if the game executable was launched directly. The only way to launch it was to download the ActiveX extension (for IE only) from the game's site, log in, then use a button on the site to launch the extension for the sole purpose of launching the game - the extension would pass along your login information to the game, and they apparently didn't think to make the game executable ask for a login.
- Scary fact: this is becoming an extremely common way to handle logging in "securely". Even Final Fantasy XIV uses a variation with an embedded IE session in a "login client" window. Try loading quite a few MMOs with your Internet connection offline and watch the fun.
- Then when you logged in, the game was ridiculously slow because of the eye-candy animations for every single turn. This was further exacerbated by the graphics engine, which was so inefficiently programmed that the game experienced more lag and frame-skip to show a Mahjong table with one hand moving one Mahjong tile than Touhou games do trying to animate 2000 bullets simultaneously. This is often caused by graphics engines which cache nothing or very little, instead opting to re-render most or all of the screen from scratch on every single frame.
- To add insult to injury, their server had major routing issues during the beta (and still has some as of this writing), forcing many players to go through a proxy just to connect, which also meant a LOT of lag - a round could easily take half an hour, when many other online Mahjong games can finish a round in 10-15 minutes.
- And then there's a plethora of random crazy Game Breaking Bugs.
- Big Rigs Over the Road Racing. It's an obvious pre-alpha build of a game, with no collision detection, no AI and some incredibly bad physics programming (as in, go as fast as you want -- Warp 9000, if you have the patience -- but only in reverse). Despite it obviously being an alpha, the creators tried to sell it as-is anyway.
- Then, in one of the strongest cases of "why bother?" in history, they added AI in a patch... which drives a fixed course at a rate of 1 MPH and stops short of the finish line.
- The reason it stops is that it has finished the race... but there is no loss condition in the game. That's right, the game simply doesn't allow you to lose.
- Action 52. 52 games, and not one of them makes it past Obvious Beta. See the work page.
- The Tetris the Grand Master clone Heboris -- specifically, the unofficial expansion -- has more or less died out, because attempts to peek into the source code, much less make any further modifications, have proven futile due to the game being a messy hybrid of a game scripting language and C++.
- On a related note, there were a handful of genuine C++ ports of it. However, the MINI version (which allows for "plugins" to define additional game modes and/or rotation rules) is the most commonly-used version, and the way it works pretty much inhibits any attempt at porting it entirely to C++.
- Zero Gear isn't problematic in and of itself... but the nature of its Steam integration allows it to be used to play any Steam-locked game you want, without owning the game. This is most notably used by hackers to bypass VAC bans: just start a new account, download the Zero Gear demo, and copy the files over.
- The save system in Pokémon Diamond, Pearl and Platinum. Making any changes in the PC storage system increases the save time from an already iffy 5 seconds to 15 seconds. Are you the type that likes to save regularly? Sucks to be you!
- It should be noted that the save time is at least in part due to the fact that the game calculates and saves individual checksums for every single Pokémon in your PC boxes, plus two additional checksums for the whole save file, and then saves two copies of the file to the card, so that any data that gets corrupted (i.e. fails a checksum verification) in one copy can be loaded from the other (a sketch of the idea follows below). This is actually good programming practice taken to a ridiculous extreme. On the other hand, Generation III also did this with a consistent ~5-second save time.
- They fixed it in Pokémon HeartGold and SoulSilver by apparently being more selective in determining whether to save the whole file, as the "saving a lot of data" message only comes up after moving a lot of Pokémon around. Pokémon Black and White fixed it further by unlocking boxes as you filled up the ones you were given, reducing the amount of data to be saved in the first place.
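To make the idea concrete, here is a toy version of the technique described above; this is obviously not Game Freak's actual code, just the general pattern of checksumming every record, keeping two copies of the save, and falling back to whichever copy still verifies:

```python
import json, zlib

def make_save(boxes):
    # Checksum every record, then write two identical copies of the whole file.
    records = [{"mon": m, "crc": zlib.crc32(m.encode())} for m in boxes]
    blob = json.dumps(records)
    crc = zlib.crc32(blob.encode())
    return {"copy_a": blob, "copy_b": blob, "crc_a": crc, "crc_b": crc}

def load_save(save):
    # Prefer copy A, but fall back to copy B if any checksum fails.
    for blob, crc in ((save["copy_a"], save["crc_a"]),
                      (save["copy_b"], save["crc_b"])):
        if zlib.crc32(blob.encode()) == crc and \
           all(zlib.crc32(r["mon"].encode()) == r["crc"] for r in json.loads(blob)):
            return [r["mon"] for r in json.loads(blob)]
    raise IOError("both save copies corrupt")

save = make_save(["Turtwig", "Chimchar", "Piplup"])
save["copy_a"] = save["copy_a"].replace("Turtwig", "MissingNo")  # corrupt copy A
print(load_save(save))  # copy A fails verification, so copy B is loaded instead
```

Doing this for every single Pokémon plus the whole file, twice, every time you save, is exactly why the save takes so long.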
- Indie game developer and utter egomaniac MDickie has released the source code for many of his games, including the infamous The You Testament. By looking at the code of the latter, you discover for instance that the "exit program" function works by deliberately crashing the game. And that's just the tip of the iceberg.
- The open-source Windows port of Syndicate Wars eats insane amounts of disk space for no logical reason.
- Back in the days of DOS and Windows 9x, many games (such as the PC version of Slave Zero, as Lowtax discovered in one of the first articles he ever wrote for Something Awful) were hard-coded to assume that the CD-ROM drive was D:, rather than actually bothering to check with the OS to see that this was the case. If you had more than one hard drive, or had your disk partitioned — which a lot of people with >2GB hard drives had to do prior to Windows 98 arriving on the scene — you were generally out of luck unless you could either edit the game's configuration files or find a No-CD patch.
- The process of installing Battlefield 3 can be used as an argument for why EA should ditch its Origin platform (which people have accused of acting like spyware, in addition to its design flaws) and go back to Steam, or copy Steam blatantly. The first problem is that Origin treats installing from a DVD as "downloading" the game, which feels kind of odd when you can't pause and resume it. The second is that you cannot quit Origin while it's downloading and installing an update, which is something Steam lets you do. If you want to run the game, you also need to log on to their Battlelog website. And lastly, if this is a clean install, you have to install a plugin for your browser so the website can launch the game (which sounds kind of fishy). But overall, you do not launch Battlefield 3 from Origin, you launch it from the Battlelog website. Which also has the side effect of "you don't have an internet connection? Too bad! You're not playing BF3".
- Magicka is a very fun game, but sometimes it's so difficult to work with that it's almost not worth the trouble.
- The game takes almost two minutes to start up. That's before the company logo shows up, too. You just sit there staring at a black screen for two minutes. (Thankfully, you can skip straight to the title screen once the company logos do start appearing.)
- The graphics are nice, but that's because of its superb art direction. Technology-wise, it's a fairly standard 3D top-down brawler... And yet it chugs down resources like a cutting-edge first-person shooter. A laptop that runs Team Fortress 2 and Left 4 Dead can't keep up with Magicka.
- If someone's connection drops during a multiplayer game, there is no way for them to re-join. The remaining player(s) must either go on alone, or quit and go back to the lobby.
- The developers acknowledged the game's disastrous launch with a series of blog posts and weeks of patching that made the experience more bearable. Then, with their characteristic and warped sense of humor, they introduced a free DLC called "Mea Culpa" that gave each player character a set of gag items: A tattered robe, a "Bug Staff", and a "Crash To Desktop" magick.
- Minecraft, while notable for also having an entry on the Genius Programming page, will make anyone with any amount of programming experience do nothing but scream WHY GOD WHY at the way certain things are handled. There's the TCP-based networking (that causes the lag you love so much), the unbelievable number of idiotic bugs, and the memory leaks.
- The patch notes for Starcraft II's 1.4.1 patch said that it "Fixed slowdowns which could occur when using Razer Spectre mouse." The Razer Spectre is official hardware designed specifically for that game.
- The developers of the Anno Domini series need to be slapped with some basic GUI guideline books. For example, the first game would only save after you hit the save button, not immediately after naming the save game. It would also completely remove the game directory on uninstall, including save games and settings. Anno 1503 (the second game) never got its promised multiplayer mode. Anno 1404, released in 2009, still assumed that there would be only one user, and that this user would have admin rights.
- The installer for Duke Nukem Forever seems to have been programmed by someone with an "everything but the kitchen sink" mentality. Not only does it install a bunch of applications and frameworks that the game doesn't actually use, but it installs the AMD Dual-Core Optimiser, regardless of whether your CPU is multi-core, or even made by AMD.
- Eve Online had a stellar example of what not to do in programming with the initial rollout of the Trinity update. Eve had a file called boot.ini that contained various parameters... but boot.ini is also the name of a critical Windows system file stored in C:\. A typo in the patch caused it to overwrite the version in the root directory rather than the one in the EVE folder, resulting in an unbootable system that had to be recovered with a rescue disk. This is why you never name your files after something that already exists in the OS. (Since that debacle, the file in question has been renamed start.ini.)
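The shape of the mistake, sketched in Python with made-up paths (this is not CCP's actual patcher code): name your config file after a critical OS file, botch the path handling, and the write lands on the one copy you really didn't want to touch.

```python
# Illustrative only: a file named after a critical OS file, plus a path bug
# that drops the install directory from the destination.

install_dir = r"C:\Program Files\EVE"
filename = "boot.ini"

intended = install_dir + "\\" + filename   # C:\Program Files\EVE\boot.ini
actual   = "C:\\" + filename               # C:\boot.ini - the Windows boot loader config

print(intended)
print(actual)
# On XP-era Windows, overwriting the second path leaves the machine unbootable,
# which is why you never reuse the name of an OS system file for your own config.
```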
- Myth II: Soulblighter had an uninstaller bug that was discovered after the game had gone gold. If you uninstalled the game, it deleted the directory in which the game was installed. It was possible to override the default install settings and install the game in the root directory of a given drive. Fortunately, only one person suffered a drive wipe as a result (the person who discovered the bug), and they actually replaced the discs after the copies of the game were boxed, but before the game was shipped. Still, it was a fairly glaring blunder.
- Diablo II has fairly simple mechanics due to its nature as an online title released in 2000. That did not prevent Blizzard from introducing bugs in literally every skill tree and about 20% of all skills in the game. Bugs range from the curse resist aura getting weaker as you put points into it and a rapid fire melee attack that misses completely if the first swing misses to masteries claiming to increase damage on spells and items but not actually doing it, Energy Shield bypassing resistances meaning your mana is drained in 2 fireballs, and homing projectiles going invisible if fired from off screen. Lightning bolt spells ignored faster cast rate items for no particular reason. Berserk correctly set your defense to 0 when you used it, then if you used it again it would give you negative defense and after a while it would roll around and give you 8 million defense. Numbers were wildly off on release: the high level Lightning Strike spear skill would do a total of 50 damage at maximum spell level, and the poison from reanimated skeletal mages would do 1 damage per second over the course of five minutes. And that's just spells: there were also numerous dupe bugs, ways to teleport to locked maps, the list goes on.
- Infamously, the game was only considered difficult for three reasons: about half of the combinations of random enchantments a boss could have would interact in bugged ways and result in an instakill in some way (fun example: the combination fire enchanted/lightning enchanted would erroneously add the (huge) damage from the fire enchanted death explosion to the damage of every single one of the 10+ lightning sparks emitted every single time the boss was struck), the poison clouds of claw vipers would invisibly deliver their melee attack 25 times per second resulting in a RRRRRRR sound and a very quick death, and gloams drain a slight amount of mana on attack but also seem to deal 256 times that amount as damage whenever they hit you with anything.
Microsoft Doesn't Take You Where You Want To Go Today
- Microsoft in general tends to have a lot of problems with Not Invented Here and an inability to let go of problematic legacy code and designs (often because lots of third-party software relies on the problematic behavior) in their own unsung examples of Idiot Programming.
- In fact, similar to The Daily WTF, Microsoft veteran Raymond Chen's blog The Old New Thing is a good place to look for explanations of things that at first seem to be Idiot Programming on Microsoft's part. As Chen puts it, "no matter what you do, somebody will call you an idiot".
- As of October 2010, Microsoft maintained an entire list of programs that can be described as "we found this program doing stupid shit and have to work around it". One of the reasons Vista was so poorly received was that a lot of programs that did stuff they shouldn't have done wouldn't work properly. "Properly" being based on guidelines formed around 2001 and enforced in 2007.
- Mac Word 6. So legendarily bad, the Windows version ran faster in an emulator. Bear in mind that this was back when PCs ran x86 code and Macs were on the PowerPC/68k architectures. Exacerbated by the quality of its immediate predecessor, Mac Word 5.1, often regarded even today as Microsoft's finest work and possibly the best word processor ever written.
- Versions of Word as recently as 2003 have had a document filesize limit of 32 megabytes, even though that could be reached by a document with 30 reasonably-sized photos embedded in it.
- Despite its beautiful and fairly responsive user interface, Windows Live Mail has several glaring flaws, such as pegging the CPU for a full minute to fetch mail from an IMAP server and popping an error message every time it is disconnected, even though it should be a Foregone Conclusion if you leave a dormant connection lying open for several minutes with no activity.
- Every version of Microsoft Windows gets this when it first comes out (except, strangely, for Windows 7), but Windows Vista and Windows ME have had the highest amounts of Internet Backlash. The common belief now is that most of Windows Vista's bad reception came from a sub-optimal initial release, which had a number of serious bugs relating to file transfers and networking (they mostly caused speed problems rather than data corruption ones, but it made using pre-SP1 versions of Vista a pain in the backside). Most of the serious problems were fixed with the first service release, but Vista's reputation, which had already been dented by its failure to live up to Microsoft's early promises, never really recovered.
- Windows ME, on the other hand, was arguably the worst operating system (apart from the infamously broken MS-DOS 4.00) ever released by Microsoft, to the extent that geeks have been wondering for years whether it was some kind of Springtime for Hitler plot to make the upcoming NT-based "Whistler" (what would subsequently become Windows XP) look better. Perhaps the biggest problem was that the memory management system was so catastrophically broken that up to a third of any given system's RAM was often rendered unusable due to memory leaks. Moreover, System Restore (which would become a well-loved feature in XP and beyond) severely hurt performance, not helped by the aforementioned memory management problem, and would quite often fail to restore your documents and important files, but did restore viruses and malware.
- A particularly facepalm-worthy bug: ME, for the first time, supported Zip files without an external program. This was back in an era when diskettes were still ubiquitous, so a Zip file spanning multiple diskettes was not a particularly uncommon situation. Opening a spanned archive would result in a prompt for the first diskette... and it would keep asking until you produced that diskette. If it was lost, reformatted or damaged, you were out of luck, because there was no way to cancel out of that dialog box and no way to terminate it without terminating Explorer.
- Windows XP was pretty decent in most aspects when it was released... except for the OS's security, which was broken beyond belief, even if it wasn't obvious at the time of release. Famously, it was demonstrated that if you installed the RTM build of XP on a computer in mid-2007 and browsed the internet for just an hour, the OS would be hopelessly corrupted by viruses and malware, to the point of requiring a complete reformat and reinstall of the system.
- This can be attributed to the fact that Microsoft stupidly makes every new user account an administrator by default. The UNIX/Linux equivalent is running as the superuser, which any UNIX/Linux user will tell you is a very bad idea. Running your account as a standard user alleviates a lot of problems.
- Vista was particularly hilarious in the way it restructured so many things that Microsoft actually had to set up workshops to teach people how to use it; customers found these workshops very helpful. Snarkers were quick to pick up on the fact that Vista was perfectly intuitive, provided you had a trained expert holding your hand every step of the way.
- A major annoyance for new Vista/7 users who migrate from XP: the read-only bug. Any hard disk with an NTFS file system that was created in XP and gets imported into a Vista/7 system will by default have all files and folders stored on it marked as read-only, even for a user with administrative privileges, and even if one uses the "take ownership" feature. The solution is to go to the Security properties of each and every file and folder (the fastest way is to go to the file system's root directory, select all files, and apply the following steps to all child files and folders), add the current user account to the list, declare it the owner, and grant it all privileges.
- The Zune software. The interface is fine, but it devours RAM and takes up way too much CPU power for what it does. iTunes, which is ordinarily infamous for being bloated garbage, actually runs faster.
- Not anymore. Zune has managed to improve performance in each subsequent release, to the point where, as of version 4, even machines that don't have much higher than the minimum spec can run it with all the visual effects turned on with little to no problem. iTunes, on the other hand, has gotten even slower and more bloated over time.
- Windows Live Hotmail. Opera and Chrome have to spoof as something else before Microsoft will actually serve a page at all, and even then everything that isn't Firefox or Explorer has to provide its own scripts to replace the broken ones provided by the site. An overhaul of Hotmail in 2010 fixed it.
- In 2011, they fell prey to yet another issue. They're attempting to fight spam by preventing it from ever leaving the user's drafts folder. Perfectly legitimate mail is often blocked, which they acknowledge, giving no clue as to how to change your message so it can actually be sent beyond "re-edit it so it looks less spam-like." "Spam-like" apparently includes, among other things, such common subject lines as "RE: How's it going?"
- Active Desktop was an optional Windows 95 update released in '97 in an effort to catch up with that "World Wide Web" thing that had taken Microsoft by surprise and capitalise on the new push technology hype (basically RSS feeds). The concept was ahead of its time: you could place webpages and things like weather and stock updates right on your desktop and bypass the browser altogether. It also gave your folders a spiffy overhaul, introduced the quick launch bar and made everything clickable look like hyperlinks. In fact, folder windows were browser windows and you could type both URLs and folder paths into the address bar. There was one problem (aside from the need to be constantly connected over pay-per-minute dialup to receive push updates): many user interface elements were essentially outsourced to your browser, and this was back when a crash in one browser window tended to take down all others with it. The browser was the paragon of stability known as Internet Explorer 4. You can see where this is going.
- Things got more sensible and less crash-prone in Windows 98, but the desktop components remained unstable all the way until Microsoft realised no one used the feature for exactly this reason and replaced it with desktop gadgets in Windows Vista.
- Microsoft Outlook uses one giant .pst blob for all emails, which tends to get corrupted once it reaches two gigabytes. This page acknowledges this, and implies that it's the user's fault for using the program so much.
- Prior to Version 7, Exchange did much the same thing: all mail for its users was stored in a single flat file on the Exchange server. This file was generally created in its entirety in advance and populated over time, rather than constantly expanding. The problems: if the file became "fragmented" as users deleted messages, it would require a compression cycle, which required the server to be taken offline for possibly hours. Additionally, if the file reached its limit, it would simply stop accepting new messages while acting to users like nothing was wrong. The file could be increased in size, but only to about 16 GB (as of Exchange 5). The safest solution was to migrate to Exchange 7, which in and of itself was a nightmare that often required rebuilding the entire system to deal with the absolute requirement of Active Directory.
- Internet Explorer:
- Internet Explorer 7 and 8 both talked up a "new commitment to standards and performance", with each one certifiably supporting more features than its predecessor, but each paling in comparison to every other browser available when released. IE7 did fix some of the most severe bugs that IE6 had suffered from, but the underlying engine was near-identical with most of the new features being cosmetic, and for the most part the browser was just as insecure and bug-ridden as its predecessor. IE8 by comparison had a redesigned engine that fixed most of the security problems, but added a new problem in that it kinda sucked at rendering older websites. Microsoft tried to divert attention from this by hyping up its "Web Slices" and "Accelerators", both of which were features that only Internet Explorer supported, but all the other browsers could feel free to implement themselves! While this trick worked for Netscape during the first Browser War (until Microsoft ended it by fiat by bundling IE4 with Windows 98), it didn't take this time around, and both versions languished in obscurity, hemorrhaging market share all the while.
- With Internet Explorer 9, however, it looks like Microsoft has learned its lesson about what it has to do and is finally going to avert this, with development focusing exclusively on W3C-standardized features (as in HTML5, CSS3, and ECMAScript 5): many that every other browser already supports, and some that they don't, but only ones that are part of the World Wide Web Consortium-approved standards. Of the tests Microsoft has submitted that IE9 passes but other browsers fail, most are passed by at least one other browser, and some of the submitted tests don't even pass in Internet Explorer 9, but do pass in Opera or Safari.
- Many legacy features are finally being re-architected to match the reality of modern Web browsing, such as JavaScript being implemented as part of the browser instead of going through the generic Windows COM script interface that was introduced over a decade ago and used through IE8.
- To top it off, the IE9 Platform Previews run completely platform-agnostic examples an order of magnitude faster than every other browser out there (by implementing hardware-accelerated video through Windows' new Direct2D API).
- From a plugin standpoint, get a load of the load times some add-ons impose. The AVG toolbar adds, on average, a full second to the load time every time you create a new tab. On top of lots of other boneheaded on-load hooks, one developer actually incorporated network calls into their addon's initialization routine, meaning that, until it received a response from a remote server, it would block your tab from opening.
- Microsoft Office 2007 has some neat features that were previously unavailable, but those features take a backseat to some of the problems it has:
- The program is a RAM hog, making it cripplingly slow, even on machines with 4GB of RAM.
- The toolbars are nowhere near as customizable as previous versions, leaving you with the "ribbon" at the top, which takes up a good portion of your screen.
- Many of the shortcuts have been eliminated. If you're the kind of person who likes to use a keyboard instead of the mouse, you're out of luck.
- Many of the features have been renamed, but the Help feature doesn't help you with this at all. It would have been nice to go to the help menu, type in the name of the feature you want to use, and have it give you the name of the new feature. If you had a function that you used in a previous version, you have to figure out what the new version calls it.
- Many features have been shuffled around, too. Using Microsoft's unusual naming and categorizing strategy, you have to figure out where your features went, which is especially difficult if you aren't sure what the new version calls it, or if it was removed entirely.
- For example, in Office 2003 and previous versions, if you wanted to edit the header or footer, you would go to View, then Header and Footer. In Office 2007, if you want to edit the header or footer, you have to go to "Insert", then "Header" or "Footer."
- Like previous versions, if you do a lot of typing in Word, you're going to spend most of your time looking at the bottom of the screen. The only way to avoid this is to continuously scroll up.
- Microsoft's own spellchecker doesn't recognize Microsoft's own words, such as Powerpoint.
- That's because it's spelled "PowerPoint".
- Games for Windows Live. It's like Steam, and mostly works just as well... except for installation. To install a game, you have to have enough room to store three entire copies of that game on your hard drive. For some games, that's well over 30 gigabytes. Contrast with Steam, which requires enough room to store one entire copy. You know, the copy that you actually use.
- Also, while Steam will automatically update your games for you so that you never have to worry about not having your game up to date, Games for Windows Live will only tell you it needs to update when you actually try to play them. You know, the exact moment when you don't want to wait several minutes for your game to be ready.
- And if you think that's bad, Microsoft has announced it's making a new push for GFWL (because of its awful, and justified, reputation) and "doubling down" on PC games. What does this mean? "it will allow publishers to submit a pre-protected or unprotected build of their game with their choice of DRM." Yep. Microsoft's idea to save its PC business? MORE DRM! It almost seems like they're trying to kill PC gaming!
- Another terrible aspect of GFWL is that a lot of games using it have their savegames locked down in a way that makes you essentially lose them (without going through a wallbanging ordeal to get them working again, anyway) every time you reinstall a game or try to transfer your progress to another system. Again, this compares unfavorably to Steam, which either just keeps out of the way of save files in the first place or, with Steam Cloud, outright embraces transferring them to different systems or installations.
- Also, if you somehow register for the wrong country, there is no way to go back and change it. At all, not even with customer service. The only solution is to create another account with a different name (and lose everything in the previous one, of course).
- Versions of Windows Media Player randomly crash with a cryptic message about "server execution" failing. Even when you're trying to play a simple wav file on your computer with no DRM and network sharing disabled, and there's no conceivable reason it would even need to access a remote server in the first place.
- The Zune also had a leap-year glitch, which made them freeze up on New Year's Eve of a leap year because of the clock driver screwing up on how it handles leap years.
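The clock-driver logic at fault was widely circulated after the freeze; paraphrased in Python (the original was C inside the device's real-time clock driver), the problem is a loop that can't make progress on the 366th day of a leap year:

```python
# Paraphrase of the widely circulated clock-driver bug, not the literal source.
# On day 366 of a leap year, days == 366: the leap-year branch refuses to
# subtract anything, nothing changes, and the loop never exits - hence the
# freeze on New Year's Eve 2008.

def is_leap(year):
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

def year_from_days(days, year=1980):
    while days > 365:
        if is_leap(year):
            if days > 366:
                days -= 366
                year += 1
            # days == 366 falls through without changing anything: infinite loop
        else:
            days -= 365
            year += 1
    return year, days

# year_from_days(366)   # uncomment to watch it hang forever
print(year_from_days(365))  # any non-boundary value terminates just fine
```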
- Regardless of the Zune's other problems, its most glaring one was its incompatibility with Plays For Sure-protected media. Microsoft apparently can't even maintain compatibility with its own stuff.
Sony Only Does Everything Poorly
- As discovered in the months when the PlayStation 3 was hacked, the process Sony used to sign PS3 software relied on a "random" per-signature number that was in fact a constant for all systems, barely obfuscated at all (a simple bit of algebra was all it took to recover the private signing key from it).
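What made that "simple bit of algebra" possible is the textbook weakness of ECDSA with a reused nonce. The toy demonstration below is emphatically not Sony's code and uses a deliberately tiny teaching curve, but the algebra at the end (recovering the nonce, then the private key, from two signatures that share it) is exactly the class of attack fail0verflow described:

```python
# Tiny textbook curve y^2 = x^3 + 2x + 2 over GF(17), generator order n = 19.
# Real systems use 256-bit curves; the algebra is identical.
p, a, b, n = 17, 2, 2, 19
G = (5, 1)

def inv(x, m):
    return pow(x, -1, m)            # modular inverse (Python 3.8+)

def ec_add(P, Q):
    if P is None: return Q
    if Q is None: return P
    if P[0] == Q[0] and (P[1] + Q[1]) % p == 0: return None
    if P == Q:
        lam = (3 * P[0] * P[0] + a) * inv(2 * P[1], p) % p
    else:
        lam = (Q[1] - P[1]) * inv(Q[0] - P[0], p) % p
    x = (lam * lam - P[0] - Q[0]) % p
    return (x, (lam * (P[0] - x) - P[1]) % p)

def ec_mul(k, P):
    R = None
    while k:
        if k & 1: R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

d = 7        # the "secret" private key
k = 5        # the nonce that is supposed to be freshly random for every signature

def sign(z):
    r = ec_mul(k, G)[0] % n
    s = inv(k, n) * (z + r * d) % n
    return r, s

(r1, s1), (r2, s2) = sign(3), sign(11)       # two messages, same constant nonce
k_rec = (3 - 11) * inv(s1 - s2, n) % n       # k = (z1 - z2) / (s1 - s2) mod n
d_rec = (s1 * k_rec - 3) * inv(r1, n) % n    # d = (s1*k - z1) / r mod n
print(d_rec == d)                            # True: private key recovered
```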
- The Sony rootkit designed to install whenever a user placed an audio CD in their computer. Ironically, this wouldn't get installed on many users' computers because it required administrative privileges to install, and a safe setup will deny these privileges to prevent just this kind of software from installing. On top of that, the rootkit installed on AutoPlay, which means (on Windows XP and earlier from before AutoPlay was changed to be prompt-only) you could defeat it by, on top of disabling AutoPlay altogether, holding the shift key when you insert the disc. If, through some miracle of ineptitude, the rootkit did get installed on a paying customer's computer, it would slow down your computer AND open up gigantic security holes that would invite (additional) malware. Sony later released a program that would supposedly remove the rootkit, but only installed MORE crap. And to download it required submitting a valid e-mail address, which Sony was free to sell to spammers. All this in the name of Copy Protection. Because Digital Piracy Is Evil, and wrecking people's computers is apparently better than possibly letting them copy your CD.
- Probably the ultimate example of screwing up loyal customers while doing less than nothing to discourage piracy. The unskippable (except for pirates) FBI warnings on DVDs are nothing compared to this.
- Another idiotic Sony DRM idea: Key2Audio, a DRM system that worked by violating the Red Book Compact Disc standard and putting a dummy track claiming the disc was empty around the outer edge of the disc (which is read first by PC disc drives, while stereos read from the inner track first). The trick to breaking this one? Keep the outer track from being read. How to do that? Draw over the edge with a permanent marker.
- Still Sony: the PlayStation 3. A firmware bug in which some models believed that 2010 was a leap year resulted in lockouts on single player games due to the machine refusing to connect to the PlayStation Network. What was the reason for this system having such a perilous dependency on the judgement of its system clock? DRM!
- The bug stemmed from the hardware using binary-coded decimal for the clock. Because apparently converting that time for display is so difficult for the ten core Cell processor.
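The commonly cited explanation, sketched below (this is an illustration of the failure mode, not Sony's actual firmware): the clock stores the year in binary-coded decimal, so 2010 lives in the register as the byte 0x10; read that byte as a plain binary number and you get 16, which is divisible by four, making 2010 look like a leap year.

```python
# BCD stores each decimal digit in its own nibble, so "10" is the byte 0x10.
year_bcd = 0x10

wrong = year_bcd                                    # read as plain binary: 16
right = (year_bcd >> 4) * 10 + (year_bcd & 0x0F)    # decode the BCD properly: 10

print(wrong % 4 == 0)   # True  - firmware thinks 2010 is a leap year
print(right % 4 == 0)   # False - 2010 is not a leap year
```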
- And another Sony facepalm to add to the pile: Sony releases frequent updates for their PlayStation Portable, mostly in an attempt to fix "security holes" that would allow running "homebrew" applications, in the name of preventing piracy. On one occasion, a fix for an exploit that allowed such "unauthorized" code to run with the use of a specific game ended up opening an exploit that required no game at all.
- Sony tried to do the same with the PlayStation 3, in addition to numerous other security features such as the hypervisor and the Cell processor's SPE isolation. As the hacking group fail0verflow (the same guys responsible for the major Wii breakthrough) discovered, the only bits of security that are actually implemented well are usermode/kernelmode separation, per-console keys, and the on-die boot ROM - everything else was either bypassed or broken through. This includes the public-key cryptography: yes, the cryptography the PlayStation 3 uses to check the signature on software was cracked, and Sony's private keys (which are used to sign software for the PlayStation 3) were obtained.
- Sony blamed the massive PlayStation Network outage of April 2011 on an "external intrusion." How did this happen? Sony had been running a version of Apache with known vulnerabilities for two months.
- It gets worse: eighty million names, addresses, birthdates and passwords were stolen, and it seems the PSN servers stored nearly all of that data in plain text.
- And they transmitted credit card details using HTTP GET (that is, directly in the URL). Unencrypted.
- In a possibly related hack, custom firmware enabled hackers to obtain free games and DLC from the PSN store. Why? Sony made the classic mistake of trusting the client software and assuming a certain variable in the PlayStation 3's firmware could never be modified.
- To redeem a code for money or vouchers on the PlayStation Network, you have to do it through a PlayStation 3, PSP, Vita, or a PC. The first three are simple enough on their own, but to do it through a PC, you have to download and install a program called Media Go, instead of being able to log into your PSN account through the website and redeem the code there. And to think some people say that Sony is overly controlling.
Apple - It Just Doesn't Work
- Apple products, especially iTunes, have a habit of downloading updates, installing them, then leaving behind tens or even hundreds of megabytes (per update!) of temporary files which they don't clean up. Even if you update your iPod firmware, it'll leave behind the temporary files on the computer you used to update it. To add insult to injury, it leaves these files in its own application data directory instead of trying to look for a system-designated temporary directory, meaning any other program trying to find and clean up unneeded temporary files won't notice them. To get rid of these wastes of space, you have to dig through your file system to find Apple's directory, look for the temporary directory within that, and delete the junk yourself.
- It's like living with the worst room-mate ever.
- It also always restarts the computer on Windows-based systems after applying the updates, without warning, even if it doesn't need the restart. It's annoying when you leave the updater running in the background and all of your programs suddenly start to close.
- For some reason, when Apple was releasing Safari for Windows for the first time, it had a problem crashing when attempting to bookmark a page. A basic web browser action, and it killed the program.
- The initial beta release of Safari for Windows also included a bug that caused text not to render at all if too many fonts were installed.
- It should be noted that at the hacking contest Pwn2Own, Mac OS X was historically routinely the quickest to fall. And by quickest, we mean less than a minute. A common entry point for exploits? Safari. Whether or not Apple has picked up its game remains to be seen.
- The old Apple III was three parts stupid and one part hubris; the case was completely unventilated and the CPU didn't even have a heat sink. Apple reckoned that the entire case was aluminum, which would work just fine as a heat sink, no need to put holes in our lovely machine! This led to the overheating chips actually becoming unseated from their sockets; tech support would advise customers to lift the machine a few inches off the desktop and drop it, the idea being that the shock would re-seat the chips. It subsequently turned out that the case wasn't the only problem, since a lot of the early Apple IIIs shipped with defective power circuitry that ran hotter than it was supposed to, but it helped turn what would otherwise have been an issue affecting a tiny fraction of Apple IIIs into a widespread problem. Well, at least it gave Cracked something to joke about.
- A lesser, but still serious, design problem existed with the Power Mac G4 Cube. Like the iMacs of that era, it had no cooling fan and relied on a top-mounted vent to let heat out of the chassis. The problem was that the Cube had more powerful hardware crammed into a smaller space than the classic iMacs, meaning the entirely passive cooling setup was barely enough to keep the system cool. If the vent was even slightly blocked, the system would overheat in short order. Add to that the Cube's flat top being perfect for putting sheets of paper (or worse still, books) on top of the cooling vent, and It Got Worse. Granted, this situation relied on foolishness by the user, but it was still a silly decision to leave out a cooling fan (and one that thankfully wasn't repeated when Apple tried the same concept again with the Mac Mini).
- Another issue related to heat is that Apple has a serious track record of not applying thermal grease appropriately in their systems. Most DIY computer builders know that a rice-grain-sized glob of thermal grease is enough. Apple pretty much caked the chips that needed it with thermal grease.
- Heat issues are also bad for MacBook Pros. Not so much for casual users, but very much so under heavy processor load. Since the MBP is pretty much de rigueur for musicians (and almost as much for graphic designers and filmmakers), this is a rather annoying problem, since Photoshop with a lot of images or layers, or any music software with a large number of tracks, WILL drive your temperature through the roof. Those who choose to game on an MBP have it even worse: World of Warcraft will start to cook your MBP within 30 minutes of playing, especially if you have a high room temperature. The solution? Get the free programs Temperature Monitor and SMC Fan Control, keep an eye on your temps, and be very liberal with upping the fans: the only downsides are more noise, a drop in battery life, and possible fan wear, all FAR better than your main system components being fried or worn out early.
- Apple made a big mistake with one generation of the iPhone: depending on how you held it, it could not RECEIVE SIGNALS. The iPhone 4's antenna is integrated into its outside design as a bare, unpainted metal strip around its edge, with a small gap somewhere along the way. Good signal strength relies on this gap staying open, but if you hold the phone wrong (which "accidentally" happens to be the most comfortable way to hold it, especially if you're left-handed), your palm covers that gap and, if it's in the least bit damp, bridges it, crippling the antenna. Lacquering the outside of the antenna, or simply moving the air gap a bit so it doesn't get covered by the user's hand, would have solved the problem in a breeze, but apparently Apple is much more concerned about its "product identity" than about its users. Apple suggested users just "hold it right", because all cellphones have this problem (yeah, ALL of them, especially the iPhone). Some argued the design was entirely intentional, since Apple's fix was an extra bumper kit for an additional $25, and Apple got sued over it by at least three parties.
- MacBook disc drives are often finicky, sometimes not reading the disc at all and getting it stuck in the drive. The presented solutions? Restarting your computer and holding down the mouse button until it ejects. Even that isn't guaranteed: sometimes the disc will jut out just enough that the eject won't register, and you have to finish the job with a pair of tweezers. To put this in perspective, technologically inferior video game consoles (the Wii, the PlayStation 3) managed slot-loading disc drives far better.
DRM and Copy Protection: Why The Software Pirates Are Winning[]
- Dragon Age has a really Face Palm worthy bit with its installer: when it asks for your CD key, you can use Task Manager (AKA "Ctrl-Alt-Delete") to kill one of the installer's processes and skip the check entirely. Under the DMCA, Task Manager is now illegal, although given certain other efforts listed below, why markers haven't been outlawed is beyond us.
- Games using SecuROM will also fail to launch (without explanation) if they detect a process named "procexp.exe" (Sysinternals Process Explorer, a program provided by Microsoft), ostensibly out of fear that hackers will use it to reverse engineer their DRM, even though Process Explorer is basically just a beefed-up Task Manager, and Process Monitor is the one that reveals every action taken by any program. Ways you can circumvent this (a sketch of what this kind of name check amounts to follows after this list):
- Close ProcExp and reopen it immediately after starting the game.
- Rename the ProcExp executable to anything else.
- In other words, this measure does nothing to stop or even slow down piracy, but if you bought the game legally...
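Nobody outside the DRM vendor knows exactly how the check is implemented, but a blacklist keyed on process names can't amount to much more than the following sketch (Python with the third-party psutil module; the blacklist contents are an assumption), which is exactly why renaming the executable defeats it:

```python
import sys
import psutil  # third-party: pip install psutil

BLACKLISTED_NAMES = {"procexp.exe", "procexp64.exe"}   # assumed, for illustration

def snooping_tool_running() -> bool:
    # Only the executable *name* is compared, so "totally_not_procexp.exe"
    # sails straight through the check.
    return any((p.info["name"] or "").lower() in BLACKLISTED_NAMES
               for p in psutil.process_iter(["name"]))

if snooping_tool_running():
    sys.exit("Refusing to start.")   # ...while doing nothing about actual pirates
```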
- StarForce, another Copy Protection program, in addition to opening up security holes and causing BSODs, could physically break a legit user's CD drive. What's next, breaking people's knees in the name of copy protection?
- Not only that, it would disable legitimate SCSI drives, because it assumed any SCSI drive was a virtual CD/DVD drive (which is how most virtual drives appeared to the OS). It doesn't help that SATA drives are also listed as SCSI drives by the OS, likely for technical reasons; CD/DVD drives often still used plain IDE during the early days of SATA hard drives, but that has been changing, and hard drives have been universally SATA-connected for some time. And it was so badly written that if someone installed it on a Windows Vista machine, it would kill the OS to the point of requiring a reinstall.
- That's because, as far as the OS is concerned, SATA drives behave much like SCSI devices: the drives historically presented an IDE/PATA-compatible front for the sake of compatibility, while Serial Attached SCSI uses the same cables and connectors, and SAS controllers can even natively operate SATA drives (though not vice versa). Modern OSes bypass the compatibility front and talk to the drives through the same storage stack as SCSI devices, which is why they get listed as such.
- StarForce doesn't have a very good track record when it comes to DRM in the first place: when people were complaining about how shoddy it was, how easy it was to bypass, and how very many OSes it had destroyed, one StarForce employee responded by posting a link to a torrent of Galactic Civilizations 2, a game that doesn't have any DRM measures in place. Video game media quite accurately pointed out that StarForce was running a protection racket: protect your games with our DRM, or we'll point people to pirated copies of them. (The employee in question was reprimanded and the company disavowed his actions, but it did nothing to help the company's standing at the time.)
- Norton Anti Virus. Norton was once the ultimate antivirus software, the standard the others aspired to. Then they decided to focus on anti-piracy over protecting the people who actually paid for the product. Now it can hardly detect anything, and often when it does detect a virus IT CAN'T DELETE IT, making it COMPLETELY USELESS. For this, Norton now has the ultimate pirate defense: it's such an awful program that anyone knowledgeable enough to pirate software will instead get the paid, more feature-rich version of a program with a free edition like AVG or avast!, or just go the legal route and grab the free versions of either of those or the newer Microsoft Security Essentials, all of which cost nothing yet are somehow still far better than Norton.
- With Norton 360, attempting to create a backup takes you to a section where you have to log in with your Norton Account. The problem is, if you don't happen to know the account's password (say, if your parents gave you the CD as one of the three or so allowed copies of the program), the window's exit and right-click close buttons are greyed out, and Task Manager can't kill the application. You have to sign in properly or it declares your copy pirated, despite your having provided the CD key earlier.
- Ubisoft's online DRM, which prevented players from playing their games when the servers went down.
- The PC version of Gears of War had a DRM certificate that would expire on January 28, 2009, making the game unplayable after that date. Luckily, Epic released a patch to fix this shortly after.
- EA's smartphone version of Tetris requires an Internet connection. For a single-player game. And if your connection drops in mid-game, it'll kick you back to the title screen. Despite the fact that there are no features in the game that should even require Internet connectivity.
Hardware That Wears Hard On You[]
- Some more examples courtesy of Sony (are Microsoft and Sony having an "anything you can screw up, we can screw up worse" competition or something?):
- First, the first batch of PlayStation 2s was known for developing a "Disc Read Error" after some time, eventually refusing to read any disc at all. The cause? The gear for the CD drive's laser tracking had absolutely nothing to prevent it from slipping, so the laser would gradually go out of alignment.
- The original model of the PSP had its buttons too close to the screen, so the Einsteins at Sony moved the switch for the square button over, without moving the location of the button itself. Thus every PSP had an unresponsive square button that would also often stick. Note that the square button is the second-most important face button on the controller, right after X; in other words, it's used constantly during the action in most games. Sony president Ken Kutaragi confirmed that this was intentional.
Ken Kutaragi: I believe we made the most beautiful thing in the world. Nobody would criticize a renowned architect's blueprint that the position of a gate is wrong. It's the same as that.
- And before you ask, yes, this is a real quote sourced by dozens of trusted publications. The man actually went there.
- Another PSP-related issue was that if you held the original model a certain way, the disc would spontaneously eject. It was common enough to be a meme on YTMND.
- The original PlayStation wasn't exempt from issues either. The original Series 1000 units and later Series 3000 units (which converted the 1000's A/V RCA ports to a proprietary A/V port) had the laser reader array at 9 o'clock on the tray. This put it directly adjacent to the power supply, which ran exceptionally hot. The result: the reader lens would warp, causing the system to fail spectacularly and require a new unit. Sony admitted this design flaw existed... after all warranties on the 1000 and 3000 units were up and the Series 5000, with the reader array at 2 o'clock, was on the market.
- For a non-console example, the company's HiFD "floptical" drive system. The Zip Drive and the LS-120 Superdrive had already attempted to displace the ageing 1.44MB floppy, but many predicted that the HiFD would be the real deal. At least until it turned out that Sony had utterly screwed up the HiFD's write head design, which caused performance degradation, hard crashes, data corruption, and all sorts of other nasty problems. They took the drive off the market, then brought it back a year later... in a new 200MB version that was totally incompatible with disks used by the original 150MB version (and 720KB floppies as well), since the original HiFD design was so badly messed up that they couldn't maintain compatibility and make the succeeding version actually work. Sony has made a lot of weird, proprietary formats that have failed to take off for whatever reason, but the HiFD has to go down as the worst of the lot.
- The PlayStation 3 console stops charging the batteries in the controller when the unit is turned off. This seems counter-intuitive. You'd think the system would continue to charge the batteries when the game is turned off.
- This makes perfect sense if your objective is to minimize the console's current draw when it's turned off. The power has to come from somewhere.
- Except that it would only draw that power as long as it's charging, then would stop drawing power when the controller is charged.
- The reason is that USB ports can only be powered when there's power on the motherboard; you can't wire them to stay powered while the board is off. Then there's the matter of monitoring the charge and cutting it off when the battery is full, which is absolutely necessary given many batteries' tendency to explode when overcharged.
- The original Xbox 360 does the same thing: when the unit is off, the controllers will not charge. The Slim models, however, do provide power to the USB ports even in standby.
- The infamous "Red Ring of Death" that occurs in some Xbox 360 units. Incidentally, that whole debacle was blown way out of proportion, no thanks to the media (it's important to note, however, that Microsoft released official numbers stating that 51.4% of all 360 units were or would eventually be affected by the issue listed below). While there were an abnormally high number of faults, once it was being broadcast everywhere people started sending back perfectly functional consoles - only "three segments of a red ring" meant "I'm broken, talk to my makers". Other codes could be as simple as "Mind pushing my cables in a bit more?", something easy to figure out if you Read the Freaking Manual.
- However, the design flaw that led to the fatal RRoDs was at the very least a boneheaded decision on Microsoft's part, and at worst, proof that They Just Didn't Care about making something reliable. Basically, the chip containing the graphics core and memory controller got exceptionally hot under full load, and was only cooled by a crappy little heatsink. This led to the chip in question actually desoldering itself from the motherboard after a while, and people who opened up the cases on dead units actually reported the chip falling out of the console after removing the heatsink.
- The heat sink was originally a lot larger, but was shrunk to make room for the DVD drive. You'd think they would've tested it after making a risky design choice like that.
- A more plausible explanation is that the solder joints weren't very reliable under repeated thermal stress; eventually they crack. The same thing happens to first-generation PlayStation 3 models (the Yellow Light of Death), albeit much later. Whoever was commissioned to do the assembly of both the 360 and the PlayStation 3 must've had a grudge.
- The 360 has another design flaw that makes it very easy for the console to scratch your game discs if the system is moved, intentionally or not, while a disc is still spinning in the tray. The problem is apparently considered so insignificant amongst most Xbox 360 owners (though ironically MS themselves are fully aware of it) that when they made the Slim model of the system, they fixed the Red Ring issues (somewhat) but not the disc scratching.
- To be fair, it is stated in the manual, FAQs, and other sources that if you want to move the console, you have to turn it off first.
- Most mechanical drives can at least tolerate some movement. It's not recommended (especially for hard drives, where the head is just nanometers away from the platter), but not accounting for any movement at all is just bad design. Anyone who has worked in the game-trading business (such as GameStop/EB Games) can tell you that not a day goes by without someone trying to get a game fixed or traded in as defective due to the evil Halo Scratch.
- Most of the 360's problems stem from the inexplicable decision to use a full-sized desktop DVD drive, which even in the larger original console took up almost a quarter of the internal volume. Early models also had four rather large chips on the motherboard due to the 90-nm manufacturing process, which made them run quite hot (especially the GPU/VRAM combo that doubled as a northbridge). But the relative positions of the GPU and the drive (and the latter's bulk) meant there simply wasn't room for any practical heatsink! Microsoft tried to address this in two separate motherboard redesigns; the first finally added at least some heatsink, but it was only the third, when shrinking the chipset to just two components let the designers completely reshuffle the board and even add a little fan atop the new, larger heatsink, that finally more or less dealt with the problem. Even the Slim version, however, still uses that hugeass desktop DVD drive, which still does nothing to secure the disc, perpetuating the scratching problem.
- Why, after the introduction of integrated controllers into every other storage device, does the floppy drive still have to be controlled by the motherboard? Sure, it makes the floppy drive simpler to manufacture, but you're left with a motherboard that only knows how to operate a spinning mass of magnetic material. Try making a floppy "emulator" that actually uses flash storage, and you'll run into this nigh-impassable obstacle.
- The floppy drive interface made sense when it was designed (the first PC hard drives used a similar interface) and was later kept for backward compatibility. However, a lot of motherboards also support IDE floppy drives (it's unclear whether any plain IDE floppy drives were actually made, but an LS-120 drive identifies itself as a floppy drive and can read regular 3.5" floppy disks), and a SCSI or USB device can also identify itself as a floppy drive. On the other hand, the floppy interface is simple enough if you do want to build your own floppy drive emulator.
- The problems with the interface aren't really that bad. Remember the LPT port? All the messed-up ISA/EISA variants? The interface design really only looks bad compared with a) modern designs and b) the design of the floppy disk itself, which is often hailed by those in the field as the best connection system ever: it appears square (making it look good), but there is only one way to insert it, period (in contrast to even USB, which has two ways it can be forced in, only one of which works).
- The LPT port is a pretty cool interface, if you like to throw together homemade devices that don't need a controller IC.
- They all have nothing on the infamous A20 line. Due to a quirk in how its addressing system worked[2], Intel's 8088/86 CPUs could theoretically address slightly more than their advertised 1 MB. But because they physically had only 20 address pins, the resulting address simply wrapped around, so the last 64K of memory were actually the same as the first. Some early programmers[3] were, unsurprisingly, stupid enough to use this almost-not-a-bug as a feature. So when the 24-bit 80286 rolled in, a problem arose: nothing wrapped any more. In a truly stellar example of "compatibility is God" thinking, IBM's engineers couldn't think of anything better than simply blocking the offending 21st address pin (the aforementioned A20 line) on the motherboard side, making the 286 unable to use a solid chunk of its memory above 1 MB until the switch was turned on. This might have been an acceptable (if very clumsy) solution had IBM defaulted to having the A20 line enabled and provided an option to disable it when needed, but instead they had it always turned off unless the OS specifically enables it. By the 386 era no sane programmer used the wrap-around trick any more, but turning the A20 line on is still among the very first things any PC OS has to do. It wasn't until Intel introduced the Core i7 in 2008 that they finally decided "screw it" and locked the A20 line into being permanently enabled.
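The wrap-around itself is simple arithmetic: a real-mode address is (segment × 16) + offset, which can reach just past the 1 MB mark, and with only 20 address pins the twenty-first bit is simply dropped. A small worked example:

```python
def phys(segment: int, offset: int, a20_enabled: bool) -> int:
    addr = (segment << 4) + offset            # real-mode address, up to 0x10FFEF
    return addr if a20_enabled else addr & 0xFFFFF   # only 20 pins: bit 20 is lost

# FFFF:0010 "should" be 0x100000, just past the 1 MB mark...
print(hex(phys(0xFFFF, 0x0010, a20_enabled=True)))    # 0x100000 (286+ with A20 on)
print(hex(phys(0xFFFF, 0x0010, a20_enabled=False)))   # 0x0 -- wraps back to the bottom
```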
- The original model Nokia N-Gage had a complete joke of a design. As a phone, the only way to speak or hear anything effectively was to hold the thin side of the unit to your ear (earning it the derisive nickname "taco phone" and the infamous "sidetalking"). From a gaming point of view, it was even worse, as the screen was oriented vertically instead of horizontally like most handhelds, limiting the player's ability to see the game field (very problematic with games like the N-Gage port of Sonic Advance). Worst of all, however, is the fact that in order to change games, one had to remove the casing and the battery every single time.
- As far as bad console (or rather, console add-on) design goes, the all-time biggest example is probably the Atari Jaguar CD. Aside from the crappy overall production quality of the add-on (the Jaguar itself wasn't too hot in this department, either) and poor aesthetics which many people have likened to a toilet seat, the CD unit sat on top of the Jaguar and often failed to connect properly to the cartridge slot, as opposed to the similar add-ons for the NES and Sega Genesis which used the console's own weight to secure a good connection. Moreover, the disc lid was badly designed and tended to squash the CD against the bottom of the console, which in turn would cause the disc motor to break apart internally from its fruitless attempts to spin the disc. All of this was compounded by Atari's decision to ditch any form of error correction so as to increase the disc capacity to 800 megabytes, which caused software errors aplenty, and by the fact that the parts themselves tended to be defective.
- The Sega Saturn is probably seen as one of the worst internal designs. In an effort to bring more and more power to the console, Sega added two CPUs to the system. Sounds great!... until you consider that there were also six other processors that couldn't interface with each other very well. This also made the motherboard prohibitively complex, making the Saturn the most expensive console of its time. And lastly, the GPU used quadrilaterals (four-sided shapes) as its basic primitive, whereas everyone else used triangles, which made multiplatform games tricky to bring to the Saturn.
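The quad-versus-triangle mismatch is easy to illustrate: triangle-based hardware can draw a quad as two triangles, but quad-based hardware has to fake a triangle as a "degenerate" quad with two corners stacked on the same vertex, which wastes fill rate and tends to distort textures. A rough sketch (the vertex format is made up for illustration):

```python
def quad_to_triangles(a, b, c, d):
    # What every triangle-based GPU of the era wanted: split the quad in two.
    return [(a, b, c), (a, c, d)]

def triangle_to_quad(a, b, c):
    # What the Saturn had to be fed instead: a quad with one corner duplicated.
    return (a, b, c, c)

print(quad_to_triangles((0, 0), (1, 0), (1, 1), (0, 1)))
print(triangle_to_quad((0, 0), (1, 0), (0, 1)))
```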
- The "Prescott" core Pentium 4 has a reputation for being pretty much the worst CPU design in history. It had some design trade-offs which lessened the processor's performance-per-clock over the original Pentium 4 design, but theoretically allowed the Prescott to run at much higher clockspeeds. Unfortunately, these changes also made the Prescott vastly hotter than the original design, making it impossible for Intel to actually achieve the clockspeeds they wanted. Moreover, they totally bottlenecked the processor's performance, meaning that Intel's usual performance increasing tricks (more cache and faster system buses) did nothing to help. By the time Intel came up with a new processor that put them back in the lead, the once hugely valuable "Pentium" brand had been rendered utterly worthless by the whole Prescott fiasco, and the new processor was instead called the Core 2. The Pentium name is still in use, but is applied to the fairly stripped-down, low-end processors that Intel puts out for cheap computers.
- The Prescott probably deserves the title of worst x86 CPU design ever (although there might be a case for the 80286) but allow me to introduce you to Intel's other CPU project in the same era: the Itanium. Designed for servers, using a bunch of incredibly bleeding edge hardware design ideas. Promised to be incredibly fast. The catch? It could only hit that theoretical speed promise if the compiler generated perfectly optimized machine code for it. Turned out you couldn't optimize most of the code that runs on servers that hard, because programming languages suck, and even if you could, the compilers of the time weren't up to it. Turned out if you didn't give the thing perfectly optimized code, it ran about half as fast as the Pentium 4 and sucked down twice as much electricity doing it. Did I mention this was right about the time server farm operators started getting serious about cutting their electricity and HVAC bills?
- Making things worse, this was actually Intel's third attempt at implementing such a design. The failure of their first effort, the iAPX-432, was somewhat forgivable, given that it wasn't really possible to achieve what Intel wanted on the manufacturing processes available in the early eighties. What really should have taught them the folly of their ways came later in the decade with the i860, a much better implementation of what they had tried to achieve with the iAPX-432... which still happened to be both slower and vastly more expensive than not only the 80386 (bear in mind Intel released the 80486 a few months before the i860) but also the i960, a much simpler and cheaper design which subsequently became the Ensemble Darkhorse of Intel and is still used today in certain roles.
- To be fair, in the relatively few situations where it gets the chance to shine, Itanium 2 and its successors can achieve some truly awesome performance figures. The first Itanium on the other hand was an absolute joke. Even if you managed to get all your codepaths and data flows absolutely optimal, the chip would only perform as well as a similarly clocked Pentium III. Even Intel actually went so far as to recommend that only software developers should even think about buying systems based on the first Itanium, and that everyone else should wait for Itanium 2, which probably ranks as one of the most humiliating moments in the company's history.
- The failure of the first Itanium was largely down to the horrible cache system Intel designed for it. While the L1 and L2 caches were both reasonably fast (though the L2 was a little on the small side), the L3 cache used the same off-chip cache system designed three years earlier for the original Pentium II Xeon. By the time the Itanium hit the streets, however, running external cache chips at CPU speed just wasn't possible any more without some compromise, so Intel settled for giving them extremely high latency. This proved to be an absolutely disastrous design choice and basically negated the benefit of the cache. Moreover, Itanium instructions are roughly four times larger than x86 ones, leaving the chip strangled between its near-useless L3 cache and L1/L2 caches that weren't big or fast enough to compensate. Most of the improvement in the Itanium 2 came from Intel making the L1 and L2 caches similar in size but much faster, and moving the L3 cache onto the CPU die.
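A back-of-the-envelope way to see why a very slow cache level "negates" itself: average memory access time is the hit time plus the miss rate times the cost of the next level down. The numbers below are invented for illustration, not real Itanium latencies, but the shape of the result is the point: a sluggish off-chip L3 barely beats having no L3 at all.

```python
def amat(l1_hit, l1_miss, l2_hit, l2_miss, l3_hit, l3_miss, mem):
    # Average memory access time, in CPU cycles, for a three-level cache hierarchy.
    return l1_hit + l1_miss * (l2_hit + l2_miss * (l3_hit + l3_miss * mem))

with_slow_l3 = amat(2, 0.10, 8, 0.30, 100, 0.40, 180)  # high-latency off-chip L3
without_l3   = amat(2, 0.10, 8, 0.30, 0,   1.00, 180)  # misses go straight to RAM

print(with_slow_l3, without_l3)   # ~7.96 vs ~8.2 cycles: all that cache for almost nothing
```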
- While Intel's CPU designers have mostly been able to avoid crippling hardware-level bugs since the infamous Pentium FDIV bug in 1994 (say what you like about the Pentium 4, at least it could add numbers together correctly), their chipset designers seem much more prone to screw-ups. First there was the optional Memory Translator Hub component of the 820 chipset, which basically didn't work at all and was rapidly discontinued. Then there were the 915 and 925 chipsets, both of which had serious design flaws in their first production run that required a respin to correct. The P67 and H67 chipsets were found to have a design error that supplied too much power to the SATA 3Gbps controllers, which would cause them to burn out over time (though the 6Gbps controllers were unaffected, oddly enough).
- Speaking of the 820 chipset, anyone remember RDRAM? It was touted by Intel and Rambus as high-performance RAM for the Pentium III, to be used in conjunction with the 820. Implementation-wise it was not up to snuff (in fact, benchmarks revealed that applications ran slower with RDRAM than with the older SDRAM!), not to mention very expensive, and third-party chipset makers (such as SiS, who gained some fame during this era) went with cheaper DDR RAM instead (and begrudgingly, so did Intel, leaving Rambus with egg on their face), which ultimately became the de facto industry standard. RDRAM still found use in other applications, though (like the Nintendo 64 and the PlayStation 2).
- ↑ One for the installation archive, one for the extracted archive, and one for the actual installed copy; it does not delete these things between steps, only at the end
- ↑ Basically, they skipped the bounds check there
- ↑ Among them, Microsoft. The CALL 5 entry point in MS-DOS relies on this behaviour.