July 27, 2008

The greatest blogger on earth is Joel Spolsky. He recently wrote an article called Martian Headsets. Through a very roundabout analogy, he explains why the fact that Internet Explorer 8 is going to be standards compliant is both a good and a terrible thing, and why Microsoft is screwed no matter what they do. Not that Microsoft deserves any sympathy for the mess they’re in; they created it.

To recap: versions of IE up to and including IE6 were not only non-standards compliant, they were downright hostile to standards, and if IE8 actually winds up being standards compliant it will break every page out there that’s been coded to account for IE6’s quirks. Developers actually call it “quirks mode.” IE7 went a little way toward fixing this, but IE8 is actually going to try to implement the standards fully. Depending on who you listen to, IE6 (often called out because it went the longest without an update) failed to meet the standards either because they were too loosely defined or because they were too difficult to follow (it’s all in the article above).

There’s one bit in the article, though, that I think really needs to be pointed out:

Jon Postel should be honored for his enormous contributions to the invention of the Internet, and there is really no reason to fault him for the infamous robustness principle. 1981 is prehistoric. If you had told Postel that there would be 90 million untrained people, not engineers, creating web sites, and they would be doing all kinds of awful things, and some kind of misguided charity would have caused the early browser makers to accept these errors and display the page anyway, he would have understood that this is the wrong principle, and that, actually, the web standards idealists are right, and the way the web “should have” been built would be to have very, very strict standards and every web browser should be positively obnoxious about pointing them all out to you and web developers that couldn’t figure out how to be “conservative in what they emit” should not be allowed to author pages that appear anywhere until they get their act together.

But, of course, if that had happened, maybe the web would never have taken off like it did, and maybe instead, we’d all be using a gigantic Lotus Notes network operated by AT&T. Shudder.

Basically, there are all kinds of things you shouldn’t be allowed to do in a standards-compliant webpage in a standards-compliant web browser. You cannot follow certain kinds of tags with certain other kinds of tags; it’s illegal. It doesn’t really make sense that it’s illegal, since the basic effect is the same, but it’s still illegal.
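You can see the strict-versus-forgiving divide in miniature with a little Python sketch (my own illustration, not from Joel’s article), using nothing but the standard library. The same sloppy markup that a strict XML-style parser rejects outright gets shrugged off by a lenient HTML parser, which is essentially the bargain the early browsers struck:

```python
# Strict parsing vs. lenient parsing of the same illegal markup.
# The second <p> starts before the first one is closed.
from html.parser import HTMLParser
from xml.etree import ElementTree

sloppy = "<p>first paragraph<p>second paragraph"

# A strict parser (XML rules, the "obnoxious" approach) refuses it outright.
try:
    ElementTree.fromstring(sloppy)
    strict_result = "accepted"
except ElementTree.ParseError:
    strict_result = "rejected"

# A lenient parser (what browsers actually do) carries on and renders anyway.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

lenient = TagCollector()
lenient.feed(sloppy)

print(strict_result)  # rejected
print(lenient.tags)   # ['p', 'p'] -- both paragraphs happily parsed
```

Multiply that shrug by 90 million untrained page authors and you get the web we actually have.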

But if web browsers enforced everything, would the web really have become as popular as it is?

MySpace is a site to go to if you want your eyes to bleed. The guys there have constructed their code in such a way that it’s fantastically easy to make a web page, and damn near impossible to make a web page that looks good. Years ago (think 1996) there was a site called GeoCities which did something similar, without the social networking capabilities. Web site hosting was expensive and out of most people’s grasp, and they sure didn’t know how to use HTML. GeoCities let you create a webpage, got it online, and even had a tool to edit the HTML for you. Sure, you had about 2MB to work with and sure, the URL was half a mile long, but you could do it. And the web became littered with tens of thousands of sites that essentially consisted of pictures of people’s cats, a couple of animated “UNDER CONSTRUCTION” gifs, and every tag trick in the HTML 2.0 book, with at least one blinking text tag that became useless when IE decided not to implement it.

Most of the people who made GeoCities sites either abandoned the concept of making their own pages (and maybe moved on to making YouTube videos or something) or went forward and pursued a career in technology. The new generation of people who want to make a spiffy web page and don’t know how has moved on to MySpace. Namely, high school teenagers.

Now, I may come across as a snob here, but really I’m giving MySpace some faint praise. It’s not like MySpace was the first site to try this – it wasn’t even the only site trying it at the time it got up and running. It was just the first one to nail what people need: an easy way to create content, and an audience.

Same thing goes for YouTube – people don’t remember (even though it was maybe 2006 when this all happened) but YouTube was one of a number of video site competitors. And if you’re like me and you mainly just watch the videos, they were all the same. So why did YouTube succeed? Easy, they made it dead simple to upload a video and share it. No one else quite got this. Not even Google, who had their own competing video hosting site and wound up buying YouTube.

And it’s not like video sites were a new concept or anything – there had been sites for years that hosted online videos, but they all suffered from the same problems, namely the technology involved. You had to embed the videos on your site, or make them streamable. Most video players, like RealPlayer, Windows Media Player, or QuickTime, could be embedded, but which one did you go with? Whichever of the three you picked, you wound up locking out the people who didn’t have it. If you went with Windows Media Player, which had the greatest install base, you locked out the Macintosh and Linux users. If you went with QuickTime you locked out people who were purposely keeping Apple off of their systems. If you went with RealPlayer you locked out everyone who didn’t purposely go out and download that player. And since Real did so many shitty things for so many years with their player and how it operated, many people, myself included, boycotted the thing out of sheer spite.

And then every once in a while the player you’d built your website’s embedded video support around would change, and decide not to work with your page unless you updated the tags in your document – but doing so would then break support for anyone who hadn’t upgraded. You could just tell everyone visiting your site to upgrade to the latest player, but a large percentage of your audience would just say to hell with it and move on.

This all changed when Macromedia (now part of Adobe) added the ability for Flash to play video content. Initially I thought it was a dumb idea – why would anyone want to play a video in a Flash document? I also figured for sure it would be abused – great, now all those sites out there who annoy you with Flash-only content will throw videos in your face. But it turned out to be brilliant – instead of worrying about ten different kinds of video codecs and who had what player, now you just had to worry about who had Flash. And 93% of web users have Flash (desktop users, that is, not phones or anything). So while Google Video was trying to implement and enforce an open source standard based on VLC on its users, YouTube would take just about any sort of video file on Earth and just play it. They solved the technical issues involved with video on the Internet, and because they solved the problem YouTube became easy, and that made them popular. YouTube didn’t win by having the best ad campaign or spending a lot of money convincing people; they won because they were the best in a field suddenly ripe with competitors.

Of course the other problem with video sites on the Internet was bandwidth. YouTube ran advertisements from day one but no one believed for a second that they were making enough money with them to cover their bandwidth costs – especially since they were literally doubling their bandwidth usage every month. Everyone wanted to know what their business plan was. As it turns out their business plan was “get purchased by someone bigger” and that’s exactly what happened. But that’s another story.

There’s a reason the web took off – it became easy to make a web page. If making a web page had been difficult – and if the initial web browsers of the day had enforced the standards – then the web might not have taken off. It’s not like this was the first thing to appear on the Internet – email and newsgroups go back further than the web, along with IRC, FTP, etc. For that matter, if Microsoft hadn’t made Internet Explorer a built-in feature of Windows, would the web have taken off as quickly as it did? Suddenly you had no excuse not to be online – there was a web browser built into your system. Netscape’s complaints about Microsoft bundling IE with Windows helped spark the antitrust suit, and it’s not like Microsoft really did them any favors there, but even as recently as 1997 Wired Magazine was prognosticating that the web browser would go the way of the dodo in favor of “push” technology – the idea that instead of you seeking out content, it would come to you. “Push,” as it was configured back then, never really took off, but in a modified form it exists today: RSS feeds, instant messages, podcasts, etc.

So why doesn’t the Macintosh take over the world? Especially since, as so many of its fans declare, it’s so much better? Simple, really – it’s not easy to run one. There’s one place you can get one – Apple – and if you don’t like their offerings or their prices, then tough. Want to run a PC with Windows? You have hundreds of manufacturers in an ecosystem of computer hardware makers to choose from. Back when Apple started making computers, every manufacturer did their own thing, no one ran programs from anyone else, and the market was very fragmented. Apple still runs their operation the same way today. So while Steve Jobs can make rooms full of people in turtleneck sweaters cluck like chickens at the sight of a new iPhone, Apple can’t get past a single-digit market share.

Their #1 success story, the iPod, only sells and works as well as it does because it runs what people want it to run, namely the scores of MP3 files they’ve amassed over the years. The Macintosh, by comparison, doesn’t run what people want it to run, namely all of the Windows programs they own and all of the games they’d like to play. If the iPod had, from day one, only run AAC files then no one would have purchased it. They would have just gone on to the next iPod-like player that would. In fact, the iPod never really picked up steam, sales-wise, until the third generation, which officially supported Windows. Sure, you could do it before if your PC had FireWire (few did in 2001) and if you were willing to try one of the reverse-engineered programs people were releasing, but until Apple officially made the thing support Windows, it didn’t go anywhere. And today probably 70% or more of iPod owners run Windows, which Apple treats as a second-class citizen with regard to iTunes. The Macintosh is selling better nowadays, but that likely has nothing to do with the witty “Mac vs. PC” ad campaigns – it’s likely due to the fact that the Macintosh now runs on an Intel processor, which means you can now run Windows on your Macintosh, either through dual booting (which Apple officially supports, via Boot Camp) or through a program called Parallels, which allows you to run Windows at the same time as Mac OS X. So in other words, the Macintosh is becoming more popular now because it gives people what they want – the ability to run their existing Windows programs and games.

Anyway, the point of this whole long diatribe which took way too long to write is that there are usually really good reasons that things take off, and it’s not just because someone can advertise better than someone else. The Internet took off because it made it easy to get information out to the masses, especially when you could guarantee that they’d be running a web browser. MySpace made it even easier to have a website when you don’t know how to make one. Video sites were always a lost cause until Flash took away the technical barriers to entry, YouTube took away the content posting barriers, and Google took away the bandwidth concerns. The Macintosh has always been a bit player until Apple took away the barriers to running Windows and the programs everyone already owns.

We want to believe in conspiracy theories. It’s fun. But IE wasn’t standards noncompliant because Microsoft wanted to fuck the web; it was noncompliant because standards are really hard to nail down and Microsoft just screwed them up. YouTube isn’t the #1 video site online because they advertised; they’re #1 because they made it easy for people to put videos online – they figured that out before anyone else. The reason someone wins in a technical field is that they figure out the barrier to entry and eliminate it.

Since my short and stupid posts tend to garner a lot of people liking them, here’s another. Moe blogs about her kid. I don’t have kids, so you get to hear about my cats.

We have two Tonkinese sister cats, Liza and Sandy. We’ve had them since about 2001 or so. They’re pretty different, personality-wise, so it makes for an interesting contrast. Liza (“my” cat) is fat and skittish, doesn’t run around a whole lot, and whines a lot more than her sister. Sandy (“my Wife’s” cat) runs around a lot, is a lot more adventurous, and (ironically) tends to like food a lot more.

Our house is two stories tall and when you get to the top of the staircase there’s this taller than waist-height “wall” (I’m sure there’s a better term for this) that runs parallel to it on the second floor, forming a bit of a “hallway” leading to our bedroom. Sandy likes to jump up and perch on the wall for various reasons: she’s a dorky creature of habit, it makes it easier for her to get close to eye level with us, and her sister won’t go up there so it’s a great way to get away from her when she’s being chased.

For some reason a few weeks ago she screwed up and went straight over the wall. She landed on her butt and went tumbling down the stairs. She seemed fine, but a day or so later we noticed she was chasing her tail. A lot. At first we just figured she was being a dork again, but then we noticed her tail was twitching a lot. We sprayed Bitter Apple on her tail but it wasn’t effective, seeing as how a couple of days later she had literally gnawed off all the fur on the tip of her tail. We took her to the vet, who believes it’s a pinched nerve in her butt from the fall, and who gave us some medicine to rub in her ear (way easier than making her take a pill).

In any event, the entire point of telling you that story was so that I could show you this – I was explaining to a coworker what happened and they weren’t getting what I was saying, so I illustrated it for them on the whiteboard.

She didn’t really go tumbleweeding down the stairs; that was just funnier to draw. She’s getting better, and other than the now-subsiding biting habit she’s fine, same as before.

But in the meantime I’ve taken to calling her “bonetail.”

Earlier this week Slashdot ran a story on obsolete technical skills, and it inspired me to share my personal level of insanity with the group. So, if you like weird posts this one is for you. If not, tune in… whenever the hell I finish the other posts I have unfinished right now.

Back when I was a kid, I grew up in a modest town of about 50,000 people. Too big to be a small town, not big enough to get on most maps. Our phone book was about one inch thick. Small towns had phone books that were essentially glorified pamphlets, about 1/4″ thick, and even then they shared it with all the neighboring towns. I knew people from small towns who thought phone numbers were four digits long, since the first three digits were always the same (and the then-optional area code was the same for probably a hundred miles).

When my family would go on trips we would visit “big cities” like Dallas, Houston, Orlando, Memphis, etc. and in the hotel rooms I would notice that the phone books were always really thick. Like 4-5″ thick. And sometimes, that was just the yellow pages, the white pages were an entirely different book, itself 3″ at least. And they always had these awesome pictures on the front of the local skyline instead of the giant public domain “fingers do the walking” logo that would grace the phone book back home.

So consequently I made the connection early on in my mind that living in a huge city meant you were a success. And living in a huge city meant a huge phone book. Therefore, having a huge phone book in your home meant you were a success. A tenuous connection, but even then I had big dreams of moving to a “big city” later in life and one of these days I would have a big phone book in my house because hey, that’s what big successful people living in big successful cities do.

Years and years pass. I grow up, go through High School, go to College, graduate, get married, and eventually my Wife and I move to the Dallas/Fort Worth Metroplex. We get good paying jobs and rent then eventually buy a house. Initially the phone books that would appear on our porch would be the same standard one-inch affairs I grew up with because we live in the suburbs and they only cover the suburbs, but then one day a bag with two phone books, a 3-inch white pages and a 5-inch yellow pages, shows up on our front porch. These phone books cover the entire Metroplex. They have amazing photos of the Dallas skyline, with Reunion Tower on them (under a stuck-on ad for some ambulance chaser, but that peels off easily enough).

I’m elated. After all these years, I’ve finally made it! I’m finally in a good job making good money and living in a big city and hey, like all big successful people living in big cities, I have a pair of bigass phone books. I’ve arrived! Every time I look at these phone books I’ll remember how I’m in a big city.

So I put these phone books next to the phone, and the first thing my Wife says is “Just throw those things away. We have the Internet now.”

I ignore the order and I keep the phone books under the phone cradle for a few years, exchanging them out when a new one comes in. I never tell my Wife the insanely silly “but I’ve always wanted a big phone book” bit because I’m not in the mood to get laughed at (though, apparently, I don’t mind that people reading my blog will laugh at me). I get to keep them in place with the razor thin “well what if we want to look up a phone number when the power’s off or our Internet is down?” excuse.

But then one day I’m cleaning the house and I’m trying to reduce some clutter and it occurs to me that in two years I’ve never opened these things, ever, and they’re just collecting dust and the odds of the power going out or the Internet going down at the same time as my cell phone battery dying and me having to have some obscure phone number are vanishingly small. Oh, and in the years since we moved out here we’ve switched to Vonage so we couldn’t even use the phone in a power outage anyway. And I now have Internet access on my phone (hell my wife has a Treo) so if we needed to look up a number there’s better ways. And the inconvenience of a computer in another room is moot since I put Ubuntu on an old laptop and keep it in the kitchen, hooked up wirelessly to our router.

So I tossed the phone books into the recycle bin (literally) and have done so for every other phone book that comes in. At some point I figure they’ll stop putting them on my doorstep, and people will stop advertising in them. They’ll go the way of the pay phone and TV Guide’s printed listings.

Now I’ll just have to let dialing ten digits to call someone, and remembering ten different area codes, be my reminder of how I’m in a big city. That’ll work.

There’s always been this conspiracy theory that Microsoft purposely made crappy operating systems over the years because then they could always sell us upgrades and patches. Besides being just way off base (we don’t pay for patches, for starters), it’s always had this one flaw – by the theory’s own admission, one day Microsoft would actually get it right and then they’d be screwed. It’s like the flaw in Al Bundy’s Bigger Idiot Theory: eventually you find the biggest idiot (and he called her Peg).

So, while I don’t think that was really Microsoft’s plan, one aspect of it has seemingly come true – they finally got the operating system right with Windows XP.

Windows XP was the first Windows consumer operating system from Microsoft that didn’t require a daily reboot. It was the first Windows that felt truly stable. Blue screens of death were more a function of driver conflicts than random occurrences (Windows 95 actually had a bug wherein the OS would crash about 49.7 days into a session, no matter what happened). Even the most skeptical Windows users were convinced by SP2.

XP worked so well that Microsoft would not release another major operating system for over five years – a change from their usual procedure of one every two to three years. One operating system, Windows ME (Millennium Edition), was a marketing stopgap release between Windows 98 and Windows XP – someone at Microsoft literally just decided that they needed a new operating system to sell, and so they wound up delivering probably the least stable operating system in their history. That probably had something to do with the change in scheduling.

Windows XP being so popular and stable had one side effect – it made it much harder to be a Microsoft critic. You no longer had Windows to kick around, at least with regard to stability. Security was another matter, and for over five years security patches were a constant concern – in fact, installing the original Windows XP (no service packs) while connected to the Internet will result in a system infected by worms. However, a fully-patched copy of XP is the best operating system Microsoft has ever released.

Earlier this year, Microsoft delivered the XP follow-up, Windows Vista. The reviews on it are decidedly mixed. While it offers many new features, uses 3D acceleration for the desktop, and finally adopts a limited user account model (technically XP had this, but it was a joke), it comes with a performance hit and requires more resources, like processing power and RAM. It has DirectX 10, which is good news for gamers – except that few cards support it and almost no games need it yet (and those that do only use it to marginal effect).

So relatively few have upgraded to Vista. I know of people who have and have had no problems. I also know of people who’ve pitched it out entirely out of frustration. Myself, I used to dual-boot between XP and a Vista RC but I just bulldozed it when the RC expired. It is, overall, a nicer operating system than XP but I just haven’t felt the need to shell out the money for the Ultimate version (I’d have to get that one), especially when XP does everything I want it to.

It doesn’t help that even Microsoft has issues with Vista – the first version of Visual Studio 2005 SP1 wouldn’t work on Vista, and neither would the Zune software, both from Microsoft. Major vendors had problems making drivers for the OS – Nvidia was shipping cards with “Windows Vista Ready” stickers on the boxes while at the same time the drivers were causing major issues for users. Many people had older peripherals whose manufacturers decided not to come out with a Vista driver – they would prefer the customer buy a new device, one that they haven’t discontinued. Myself, if I were running Vista today, I would dual boot with XP for those times when you really need to use something that Vista won’t do. I think it would be different if my computer were completely for personal or entertainment use, but as it stands now I use it to make part of my living, so it’s more important that it run an OS that works, instead of a flashy one that might not.

So many people prefer XP right now that many are rolling back to it. Dell is still offering it six months into 2008. Microsoft is about to roll out XP SP3, something they had previously stated they would never do. They even had to unveil SP2c, a service pack whose lone function over SP2 was to allow for more product keys – implying that XP is still selling well.

One of the problems Microsoft has developed over the years is that they’ve pretty much tapped the entire market. Nowadays everyone has a PC already, and so most people have a Microsoft OS already. I paid for XP back in 2001 and they haven’t seen another penny from me on operating systems since. If I were the type to buy my PCs premade from Dell, then every time I bought a new PC I’d also be buying a new OS license, at some price. If the Dell PC came with Vista, then I’d be buying a Vista license with the PC, but overall the amount of money that goes to Microsoft is unchanged (since Dell likely buys these things in the same bulk quantities and at the same prices that they did with XP).

No, what Microsoft wants is for people like me who run XP (or people who bought a Dell PC in the last few years with XP) to go buy a Vista upgrade. This way, they get the money from the initial OS sale as well as the money from the upgrade. Their stock price hasn’t budged in years because, while they always have been and always will be selling operating systems, they’re not selling more operating systems except when people buy more PCs. So they want people to upgrade to Vista, and when people refuse and just stay on XP, it screws this plan up.

But really, no one thinks it’s the biggest problem in the world that everyone just prefers to stay on XP. Everyone will upgrade, eventually. There have always been stragglers. There are people to this day who refuse to upgrade to Windows XP and continue to run Windows 2000 (and are only now running into the issue of programs locking out 2000 for artificial reasons). I knew someone who ran Windows 98 until about 2005 – he would spend the LAN party BSOD’ing and reinstalling his OS while the rest of us played.

Now, the real humor comes from people who somehow view the Vista disdain as an opportunity.

Yes, the Macintosh is a good system, especially now that it’s essentially a PC running an Apple OS. Double especially now that it can dual-boot Windows. But people aren’t going to switch to it. Yes, some will but not in a mass number. At some point you hit this tipping point and you really need to have a PC running Windows. You could run a Macintosh with its 10.5 “Leopard” operating system and use Safari or Firefox instead of IE for web browsing, and iLife or whatever Apple calls its Office competitor for word processing and email and so forth and it will work OK. But at some point you will need to run some Windows program that Parallels won’t run or a game or something and then you’ll have to boot into Windows to do it – at which point you might as well have saved some money and bought a Dell laptop anyway. Dell is still more cost effective (albeit marginally so these days) and offers more choice.

But really the Macintosh is just a symptom of the bigger problem – the bigger problem is the clued-out perception that computing is interchangeable. That you could go to your parents’ house and swap out their PC for one running Ubuntu and they wouldn’t notice. After all, they’d still have web, email, and office applications. What do they care, right?

They’d care. As soon as they get the idea to download or purchase some software from Wal-Mart they’d care. As soon as they buy the $50 scam HP printer from Target and it won’t work on Linux until they do a herculean amount of Googling and have to set up a root password to print off an email, they’d care. Linux zealots have been prognosticating the “year of Linux on the desktop!” for a decade now and they’ve gotten nowhere. Their rallying cry of “Ubuntu is getting better! Give it some more time!” goes hand in hand with “Vista has had a year, forget it, it’s too late! Move on!”

I go to Slashdot from time to time and it’s such a piece of shit site. The stories that make it to the main page are about as incendiary as they come. Just last week came a story titled Microsoft Disses Windows to Sell More Windows, poking fun at how Microsoft has to point out flaws in “older operating systems” (XP, in this case) to sell Vista. This from the community that produces a new Ubuntu every six months. And praises Apple for coming out with a marginal upgrade every 1.5–2 years and charging $129 for it.

The funniest thing about Slashdot is the posters in the forum threads attached to the stories. I’d love to see a Venn diagram of them. You see a lot of people posting stuff like “Death to Micro$oft!” “Windows Sucks!” “Linux for Life!” “.NET Sucks!”, etc. Then you see a number of people saying “Why can’t I get a job?” “Why won’t anyone hire my Linux/PHP skills?” “Why do companies insist on running Microshaft software?!” I wonder how many of these are the same people. Yes, of course no one’s going to hire you – you spent all your time learning a bunch of free stuff that the marketplace isn’t interested in. Yes, Google runs almost 100% on Linux, but companies that do are few and far between. Yes, over half of the web servers in use in the world run Apache and the LAMP stack (Linux, Apache, MySQL, and PHP), but when you whittle it down to the Fortune 500 companies (the people who tend to employ others) it’s 80% Windows and IIS.

Apple actually makes their own web browser, called Safari. They unveiled it on the Macintosh in 2003. Earlier this year, they released a Windows port to coincide with the fact that the iPhone runs Safari and they need web developers on Windows to use it to develop apps. Within the first 24 hours, over 100 security vulnerabilities were found. While some of these vulnerabilities were a side-effect of how Windows handles issues (i.e., they didn’t exist on the Macintosh port), many of them were simply inherent to the browser itself (i.e., they were found to exist on the Macintosh port). Four of them were quite severe. Part of the reason they were found so quickly is because the software tools needed to discover them exist on Windows and not on the Macintosh (a side effect of the hacker community existing mainly on Windows), but part of the reason is because there’s just several orders of magnitude more users on Windows than on Macintosh. The security vulnerabilities languished undiscovered for four years simply because not enough Macintosh users were looking for them. To their credit, Apple released a patch for the most critical ones within 48 hours, and a flurry of patches since then.

If I were Microsoft, I’d be saying “It’s not so damn easy, is it?”

Apple has been experiencing similar problems across the board. They released Mac OS X 10.5 “Leopard” in October and many Macintosh users have experienced issues. Some have seen slowdowns, others have noticed less stability. A number of people have rolled back the upgrade. The whole affair sounds suspiciously like the XP versus Vista debacle. Apple shifted resources from this upgrade to the iPhone project, which was a lot more difficult than they had envisioned, and it shows.

It’s not so damn easy, is it?

Linux zealots proclaim conversions. They want people who are fed up with Windows to convert to Linux. They want people to convert from an operating system designed for the masses to an operating system designed for hardcore techies. This goes back to that interchangeable computing garbage – the notion that companies can take their existing codebase and products and throw them away to migrate to an operating system which will run nothing they’ve ever created, and which was instead written and conceived by a group of individuals, most of whom have never met in real life. Ever notice how Linux doesn’t do much, if anything, innovative? Ubuntu runs a lot like Windows, because they copied Windows. Microsoft has rooms full of people coming up with stuff like the Ribbon UI in Office 2007. The Open Source movement has people connected online trying to write a competitive web browser.

The Open Source and Linux movements have done good things but they lose sight of one simple fact (or are in denial): money makes things happen. More specifically, money makes things happen faster, and in the technology field this is vital. Let’s take a look at the Open Source Software (OSS) Movement’s big success stories:

Linux: A successful example of OSS working well. However, consider how slowly it has evolved. It was unveiled in 1991 and is still unusable for the average user to this day. My wife’s grandfather can figure out Windows 98, an OS that’s closing in on a decade old now. But putting that aside, consider that Linux was born out of a group of individuals looking to clone UNIX, which was written by a commercial entity. In fact, that group of individuals (the GNU movement) was unable to complete anything until a plucky Finnish college kid wrote them a kernel and finished the thing off. And consider that most of the innovations in Linux nowadays come from companies like IBM, Red Hat, and Canonical (the Ubuntu corporation). People with financial motivations, in other words.

Apache: Another success, and pretty much an organic one. I can’t really take anything away from them on this one. Same thing goes for MySQL. PHP is a poor man’s ASP clone (and not even ASP.NET, mere ASP) but hey, it’s free.

Firefox: An increasingly popular web browser, but it was based on the Netscape 5 codebase (Netscape 5 was never formally released; it was essentially the maturation of the 4.x line). So, code written for commercial reasons. And this is after many years of a loose net of people working on it. Firefox is a good browser, but it wouldn’t exist were it not for the commercial desires of another company. And it’s not even 100% standards compliant (that award will likely go to Opera 9.5).

OpenOffice: A solid product, but it suffers both from the fact that it, too, was originally derived from a commercial codebase (from a German company which was swallowed up ages ago by Sun Microsystems) and from the fact that it only offers a fraction of the features of Microsoft Office. Office products are in this difficult spot in that literally everyone in the world needs to use them, and they have very diverse needs. True, the average user only ever employs 10% of Office’s features, but that 10% is different for everyone. Many a person has attempted to migrate themselves or their secretary to OpenOffice only to learn that some obscure feature Office had is missing and is a complete deal breaker. Microsoft Office has pretty much hit its feature saturation point, and it took twenty years or more to do so. OpenOffice has been out for five years. Not so damn easy, is it?

Not that I’m a 100% Microsoft apologist – I call them on the carpet when they need it. Like the bizarre decisions surrounding the Zune. Or the licensing policies for Vista. But they’re not stupid, or even necessarily evil. Yes, they have made some shady ethical business decisions in the past and I’m not excusing that. However, some of the things people blast them for are simple business decisions. Yes, of course they’re going to charge money for their operating system and software – they’re in this to make money. Yes, they’re going to cut Dell a volume discount – why not? Dell’s offering Linux PCs now, so it’s not like Microsoft’s not “allowing” them to do so or something. A lot of the people who blast Microsoft for the business decisions they make either have never worked in the business world or are in denial about how it works. If Apple had Microsoft’s power, they’d be worse. Apple hates buttons on mice, for crying out loud, and doesn’t trust you to change your own iPod battery.

But the people who think the world needs to migrate away from Microsoft have it all wrong. Apple can’t make a secure web browser, and the OSS movement can’t make anything happen without financially motivated people, which they’re against (look at how they’ve turned on Red Hat for doing just this). Moving entirely to a less mature option (and both are less mature in terms of experience with a critical mass of users) would be a huge step backwards. I’m not saying that other options can’t exist – I run Linux myself and hope to own a Macintosh one day – but this notion that one all-encompassing entity needs to be removed and replaced by another all-encompassing entity, just one you like better, is naive.

And this is why the world is seriously not going to move away from Microsoft technologies – because the real world is staffed by intelligent people who get this.

On the first day of QuakeCon this year, I booted up my PC and was greeted with a nice message saying that I had made too many changes to my system and that I would have to reactivate Windows XP.

Now, other than being a little bit annoying, this didn’t concern me – partly because I knew this was a perfectly legitimately licensed copy of Windows XP, one I paid full retail for back in 2001, and partly because I’ve had to do this before and never had an issue. It was still annoying, though, given that I had actually just installed and activated this system less than a week prior. My hard drives had started giving me issues, plus they were now-ancient IDE technology, so I upgraded to a nice 500GB SATA2 drive.

The problem was that my mouse and keyboard wouldn’t work. I couldn’t click “OK” to start the procedure. The culprit: they were a USB keyboard and mouse, now plugged into different ports than before. I couldn’t put them back in the original ports, either, since they had originally been in a hub that I didn’t bring with me. I started cursing Microsoft and envisioning not being able to play anything during QuakeCon. I realized those poor bastards who have to reload their operating systems at QuakeCon (never fails that at least one is using a 46″ HDTV for a monitor, too) aren’t so pathetic after all. I started asking random strangers if they had a PS/2 keyboard or mouse and found out that the concept is mostly extinct.

But fortunately XP had kept pressing on and eventually figured out I had a USB keyboard and mouse and let me have them back. So, now on to the activation.

Which failed. Because I had just done it a week prior. Now I had to call a phone number, type in numbers, answer questions, and get a new number to type in. All while on the BYOC floor at QuakeCon, with no reasonable way to hear anything of importance on a cell phone.

I had three days to do it though so I skipped it and did it the following morning when there were fewer people there. I’m just glad I knew you could type in the numbers – the phone call tells you to say them aloud, which is fine when it works but when it doesn’t you’re speaking to someone in Bangalore and with loud gaming and yelling going on, the odds of that working are pretty slim.

But whatever, circumstances aside I haven’t had too many issues with activation. It’s available 24/7 and I’m not worried about Microsoft going out of business tomorrow and leaving me and XP high and dry.

Sometimes though you don’t have that luxury. At the same event, it was noticed that for some reason, despite having Internet access, gamers couldn’t get into Steam. Some people had luck at certain ungodly hours, but most people couldn’t ever get in, myself included. It was almost as if QuakeCon was being specifically blocked. All the more ironic given that id and Valve had announced a Steam partnership and handed out keys to activate the original Quake game, and that Valve themselves were there showing off Left 4 Dead.

But at least Steam is around and kicking. Valve is signing up big publishers and developers to put games on the service, and it’s becoming a viable alternative to retail. Compare that to Triton, a competing service which not only went out of business, but gave absolutely no notice to its customers or publishers/developers who signed on board. Triton’s only real high profile game was Prey and 3D Realms/2K responded swiftly by mailing out physical copies of the game to all Triton purchasers, as well as putting the game on Steam.

A game was recently released for the PC called BioShock. It’s a single player FPS from Ken Levine, reminiscent of the Half-Life series (it’s a “spiritual successor” to System Shock 2) and powered by Unreal Engine 3. It seemed to have it all for a single player experience: an interesting premise (an Objectivist dystopia a la Ayn Rand), top of the line graphics (the first really significant UE3-powered game on the PC), a community evangelist who was already well known within the scene, and an underdog, underappreciated designer in Ken Levine.

But then the copy protection issues started to surface. DRM company SecuROM’s newest copy protection was applied to the retail versions of the game. I ignored this initially since I was already familiar with SecuROM as a CD/DVD-ROM protection technology and also because I was planning to buy the game on Steam. But as it turns out, this new technology had nothing to do with disc protection and was also applied to the Steam versions of the game.

The initial version of the game had a total of two activations allowed. This meant it could be installed at most twice. In theory, if you uninstalled the game you got one of the activations back – but this is counter to how most games work. Most people when they’re blowing their hard drive away just do so and don’t worry about uninstalling anything first – that’s a waste of time.

Problem was, the uninstall-and-get-an-activation-back part wasn’t working, and it bit some reviewers in the butt. To get things squared away, reviewers called 2K Games and were told to call SecuROM. SecuROM told them to call 2K Games. Hilarity ensued. It didn’t help that SecuROM is made by Sony, who themselves got in a ton of hot water a year or so ago over including a rootkit on audio CDs to both prevent people from ripping the discs to MP3 and spy on their listening habits (a rootkit permanently modifies your operating system).

To make things worse, people playing the game on the Xbox 360 didn’t have to worry about this. PC users were effectively being “punished” for the sins of piracy (which a lot of PC users are admittedly guilty of). Respectable journalists were going so far as to tell PC users that they were perfectly justified in downloading cracked versions of the executable if they had actually purchased the game.

Now, I had two concerns about this technology. First, I was concerned about being accused of pirating a game that I did actually purchase due to an overzealous antipiracy scheme. 2K somewhat alleviated that by expanding the activation limit to 5 across 5 “different” PC’s and stating that no matter what, people who had purchased the game would always be able to play it. We’ll see how this works over time, but overall the theory is good.

The second concern I had was this – what happens when/if either or both of 2K or SecuROM goes out of business? I won’t be able to call up SecuROM 20 years from now if they’re out of business and I can’t activate. Ken Levine went on record to state that, at some point in the future when BioShock‘s sales have leveled off, the copy protection will be lifted. That pretty much addresses my second concern (but like before: we’ll see).

It occurs to me though – all of my Steam games are tied to my Steam account. If Valve goes the way of Triton, then all those games will be impossible to play. I don’t really see that happening of course but it could.

But then again, if Microsoft went away then I couldn’t activate Windows XP and I wouldn’t be able to do anything. I don’t see that happening either, but only because I know Microsoft isn’t going anywhere. Even if they did everything wrong they’d still have decades of life left in them.

It’s just sort of disturbing to realize how much of my normal life is regulated by the continued existence of companies. I love TiVo to death but if they went out of business then I’m the owner of two now-worthless boxes. My ability to get my car to work relies on the gas infrastructure not collapsing. My ability to post this blog relies on Blogger not folding (they’re owned by Google, fat chance).

Of course, Xbox 360 owners don’t have the most reliable consoles either, and 20 years from now the odds that any particular Xbox 360 will still work at all – or that whatever super-successor Microsoft has by then will be able to play its games – are slim, so maybe the PC users aren’t so screwed after all.

I finally got an iPod last month. It’s something of a fitting irony that as soon as I get one, no one talks about the iPod anymore and it’s all about the iPhone. Oh well, whatever.

I got the 80GB model because, other than just being an iPod, the most important thing to me was storage space. Of course, Apple does like every other vendor of hard drives and advertises it as 80GB, but that’s 80GB in base ten numbering; every operating system worth its salt counts bytes in base two, so it winds up having a formatted capacity of about 74GB. Which is fine, except that I still had too much music in MP3 form.
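The “missing” capacity is just the base-ten/base-two mismatch, and the arithmetic checks out. A quick back-of-the-envelope sketch:

```python
advertised_bytes = 80 * 10**9             # "80 GB" as the drive vendor counts it (base ten)
formatted_gib = advertised_bytes / 2**30  # the same bytes counted in base two

print(f"{formatted_gib:.1f}")             # 74.5 -- roughly what the OS reports
```

Same number of bytes either way; the drive vendor and the operating system just disagree on what a “gigabyte” is.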

The first thing I did was to go through and properly tag my collection. I’ve always been pretty good about this, but apparently not good enough. I got my wife a red 8GB iPod Nano for her birthday back in April (it’s somewhat ironic that I’ve been whining about wanting an iPod for years now and the first one I buy is not for me). In dealing with iTunes on her system, I learned several things. Namely, iTunes runs entirely off tag information for everything. That folder structure you’ve been maintaining for years now? That’s nice, but it doesn’t mean squat unless the stuff is tagged properly. That “folder.jpg” file you’ve kept in the folder for the album cover art? Doesn’t mean squat – iTunes goes off of what album art is embedded inside of the MP3 file. Also, you need to use the “Album Artist” field so that the one song on the album with a different artist (e.g., Snoop Dogg featuring Xzibit) still winds up in the same “album” with the rest of the entries. I had to re-adjust my practices a bit. Fortunately I found a program, MP3tag, which seems to do everything I need it to.

For my own technology-snobbish reasons, I actually went to the Apple Store in Plano to get the thing. The irony of passing many Costco, Best Buy, Circuit City, Fry’s and Wal-Mart stores that all sell iPods was not lost on me. I don’t really have any concrete reasons other than the fact that I figured, if you’re going to buy an Apple product, go to an Apple Store. Why not, right? I originally wanted a white model, but I had halfway convinced myself to get the black one. It did look a lot slicker in photographs but when I actually got to the store, where there are several tethered-by-a-steel-rope models to play with, the black ones were much dirtier, and the screens just didn’t look as good, even at maximum brightness. Plus, iPods are supposed to be white. So I went with white. I also picked up a good clear sturdy plastic case.

So once I got home I did one last pass on my MP3 collection with regards to proper tagging and then proceeded to back it up. It took 19 DVD-Rs to do so, and I had actually started the process a few days prior (making 19 Nero documents and then burning them later). Then, I went through and pruned the collection – I removed any artists I wasn’t really that interested in. I removed any albums that I didn’t think made sense on my iPod. For example, I cut out the Nirvana boxed set since it’s neat as a completionist’s entry, but not as something to actually listen to. I cut most of Prince’s albums because, well, most of them are crap – but I kept the greatest hits albums because he does do some great stuff now and again. I trimmed the collection down to about 63GB.

Then I fired up iTunes. Or rather, first I went and downloaded iTunes. It used to be that it was included on a disc with the iPod – now they literally just tell you to go download it. Not that it’s a big deal, just that with a $350 investment, a 20¢ disc is an odd way to cut costs. It also used to be that the most expensive iPod also included a dock, but now it just comes with the same cable as all the others – of course the most expensive iPod used to cost about $50 more, so I guess it evens out (since Apple’s Universal Dock is about $40).

So then I imported my music collection into iTunes. The main reason I did the DVD-R backup was because I’ve read a post or two where iTunes wiped out someone’s music collection this way. I had better luck: iTunes didn’t wipe anything out, and it took about 30-45 minutes to import my collection.

Then I synced the iPod. I had actually been playing with it a bit while I was waiting for discs to burn and for iTunes to finish importing songs. I bought this thing on a Friday evening while my Wife was out running an event until 2 in the morning. I don’t remember when I started the syncing but basically it didn’t finish before I went to bed three hours later. By my estimates it took about four hours over USB2 to send all the music over. I didn’t get to actually check it out until the next morning.

So I hit eject. Only it didn’t work. iTunes told me that something else had a handle on the iPod. I just figured it couldn’t handle that much music being sent over at once. I resorted to disconnecting it anyway and doing a soft reset. It worked fine after that. I eventually figured out that Winamp has a default plugin now that is designed to “grab” an iPod when it’s plugged in, so as long as I don’t have Winamp running when I want to eject, I’m good.

There were still more quirks to overcome. The “Artists” menu was cluttered with every one-off artist from every soundtrack or various artists album I’ve ever owned. I eventually figured out the Compilation flag which keeps these artists out of the Artists list and in the Compilations list. Then I looked at the Artists menu and I saw “Adolph Hitler” – turns out I had missed the South Park Christmas Album.

I’ve also started to do some more proactive things to trim my collection down further – all the better to store new music and podcasts on. For example, I’ve removed any redundant songs from greatest hits compilations – you know, the ones where all the songs are old except for the two new ones? I’ve deleted all the previously released songs. I think without this, I would have Aerosmith’s “Walk This Way” like 20 times on there. If I want to listen to a boxed set, I construct a playlist of the running order of the set – the old songs and the ones specific to the boxed set.

At present, I have about 14,000 songs on the iPod. I’m not sure if that includes the podcasts or not, but anyway, I more recently went through and downsampled anything above 192kbps down to 192kbps. In the time since I initially loaded up the thing my collection grew to 70GB, but now I have it back down to 65GB. Soon I’ll need to just suck it up and start removing stuff I don’t listen to in favor of things I do. I actually downsampled everything to 128kbps at one point and got things down to 52GB, but everything just sounded too awful (though ironically I do have several 128kbps files that sound great), so I went and rolled back (after doing a second backup/restore onto DVD-Rs).
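The space savings from downsampling fall straight out of the bitrate: an MP3’s size is just bits-per-second times duration. A rough sketch of the arithmetic (the helper function and the four-minute track length are my own illustration, not anything from iTunes):

```python
def mp3_size_mb(bitrate_kbps, minutes):
    """Approximate constant-bitrate MP3 size: bits/sec * seconds, converted to megabytes."""
    return bitrate_kbps * 1000 * minutes * 60 / 8 / 1_000_000

# A four-minute track at a couple of common bitrates:
print(round(mp3_size_mb(192, 4), 2))  # 5.76
print(round(mp3_size_mb(128, 4), 2))  # 3.84
```

Going from 192kbps to 128kbps cuts each file by a third, which is roughly in line with the collection shrinking from ~70GB toward ~52GB (not every file started above 192, so the real-world savings were a bit less than the pure ratio).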

I always figured I would never use the iPod for video, but for grins I fired up the trailer to The Simpsons Movie and dangit, I actually like the video capabilities of this thing. So I fired up a video converter and now I keep the occasional DivX -> QuickTime movie on there. One of the first things my wife did when I got her the Nano was to cash in some of her credit card reward points on a boom box that takes the iPod as input – so now we can use either of our iPods in that boom box and listen to our music on the go. The other thing she got for her birthday: her family and I got her a new car stereo system to replace the dying one – this new one has a 1/8″ jack so she can listen to her Nano in the car. It also has an iPod-specific cable available, but at $50 for the cable and $30 to install it, we drew the line there.

I have a friend who hates the iPod. Actually, he hates Apple. He hates Apple with the passion of, well, the passion of how a Linux zealot hates Microsoft. I still haven’t told him yet that I own an iPod, mainly because I just don’t want to hear about it. My friend likely just hates Apple because they’re run by liberal turtleneck-wearing hippie Democrats in California. It does make me think about why I went with it. At one point in time you could make the argument that iPod was overpriced, and it still is expensive, but now they’re in-line with other players. The 30GB iPod and the 30GB Microsoft Zune cost the same. The Creative Zen tops out at 60GB and the Archos line of players is mainly about video, which like I said is secondary on my list of concerns. The Sandisk Sansa line is an up-and-comer, but they’re flash only and have nowhere near the capacity I need.

The iPod’s interface, features and marketing are tough to beat – to say nothing about the ecosystem of peripherals and accessories. Ironically, this is exactly the reasoning behind Windows’ dominance – you could switch to Linux or Macintosh but so many things – from games to scanners – can’t come with you. And if you think about it, it makes sense why the iPod is so popular. Apple makes computers and some people do buy them but so much of your content – that is, your programs, documents, games, etc. – can’t come along. The Macintosh is incompatible with most of your existing content (Boot Camp and virtualization notwithstanding). The iPod, however, by virtue of the fact that it can play MP3’s, is compatible with your existing content. This is why iPod has 75% of the MP3 player market, and Macintosh has 5% of the PC market.

Anywho, just thought I’d share.

OK, I’m going to switch gears for a minute and make a different kind of post.

Just recently, I changed jobs. The job I held before I had for four years. The one before that, for about fifteen months. Before that was darkness (aka College).

The job change in question was a long time coming. I switched positions in the organization two years ago and started working from home (since the company decided to close the office in favor of telecommuting). At first it was great – no commute, no office politics, new position, and I liked the work. But then I started getting handed assignments that I didn’t like, like traveling all over to install software or scripting a survey in a proprietary language. Between these, my company’s fondness for offshoring, constant reorganizations, and the fact that they announced my position was becoming a business analyst role that didn’t code anymore, I decided it was time to move on. I started a new position a few weeks ago.

The job search, in earnest, took about three months. Not too bad, but it reminded me of why it took so long to move on – I fucking hate the job search process.

Interviews – I hate interviewing. For starters, when you haven’t interviewed in a while, you suck at it, so you blow the first couple of interviews. Plus it never fails that you’re really good at Area A in development but they keep asking you questions about Area B and so you feel like a know-nothing idiot when it’s done.

I know I should be grateful, and I am mostly, but at one point I was interviewing every day of the week, which got old quick. I got so tired of explaining to yet another person my life’s history to that point. My degree in college is functionally unrelated to my career and my GPA wasn’t the most awesome, so that’s fun to explain to everyone, too. Nevermind that I’ve been out of college for seven years now, I still have to explain why I majored in Geography and still can’t find my way to the airport I drove to the week before.

Every once in a blue moon over the last four years of my job I would get a random recruiter phone call and the job sounded good enough to at least go talk to them. They’d get to the question “so why are you looking for a new job?” and I’d say “I’m not – you called me” and I’d never hear from them again. Maybe this was me being passive aggressive, I dunno.

My wife has a friend and he has told us that he has never interviewed for a job and then not received an offer. That’s a pretty impressive statistic and you believe it coming from him – but with all due respect, he doesn’t have to go through the technical interviews programmers have to go through. He doesn’t have to write code on a whiteboard in a suit for four hours. He doesn’t have to explain how much the Empire State Building weighs (and yes, I did actually get that question once – I assume we both read the same article). He doesn’t have to explain five different ways that his code is right because the interviewer keeps fucking with him.

Recruiters – I make this complaint in general, not in specific. The job I have right now I got through a recruiter. In fact, in this last round, only two of my interviews didn’t come through a recruiter.

That being said, there are recruiters who are awesome and do their job and make their money and perform a useful role. Then there are the other ones. The ones who know absolutely nothing about the tech field they’re working in. In a meeting with one, I had to explain to her what all the different technologies mean (I didn’t mind, she was pretty receptive about it). Worse are the ones who think they know what they’re talking about, but don’t. Like the ones who see mostly C# on my resume and then assume I couldn’t do a VB.NET job (C# and VB.NET are the two main functionally-identical languages in .NET). Not that I wouldn’t prefer it, that I couldn’t do it. And then they say “do you know anyone who could do it?” – oh, I see, you want me to do your job for you and get someone else a job in the process?

Something else I figured out really quickly is that if a hundred different recruiters pitch you the same job, there’s probably something badly wrong with the job. There’s got to be some reason why turnover is huge and the company in question has resorted to calling every recruiter in the area (and some not in the area) and saying “have at it”.

Recruiters generally want to meet with you before pitching you to people and that’s fine and all so long as they understand that you still have a regular job and you can meet them after work. Some don’t – I would imagine that a good chunk of their clients are unemployed and will jump at the chance to drive to Downtown Dallas and meet them at 10:30 AM. Worse still are the ones who aren’t in your area – at least you don’t have to go meet them in person but I just don’t think I’d be comfortable with getting employment from a third party in Alaska.

Monster.com – now, I’m not really complaining about Monster because this time around Monster indirectly got me the job. For various reasons, this time around I just let the recruiters call me and went from there – I didn’t get to the point where I needed to start applying for jobs directly. Monster has this thing where when you sign on, update your resume, and log off, you’re bumped to the top of some “hey they logged on” list – theory being, you’re looking for a new job. After a couple of months of looking I finally did this one day and the following day I got – I kid you not – 35 emails and 20 phone calls. This was on a Thursday – I pretty much didn’t get any work done until the following week.

That’s not my complaint about Monster. Actually, I guess my complaint isn’t about Monster at all. My complaint is when people ignore what I’ve said on Monster. I listed my profile as the Dallas, TX area, no travel, no relocating, and no straight contracts (i.e., a “three months and then you’re done” sort of position). What happened? Lots of calls about straight contracts. Calls about relocations and positions with lots of travel. “But what if it’s all in the state of Tex…” NO TRAVEL. My wife had a 100% travel position for a while – never again do I get in a position like that if I can manage it. Truth is, I don’t mind some travel – it’s kinda neat to get away for a couple of days and it wouldn’t be the end of the world (my last job had me travelling 4-5 times a year, tops), but if you call me after reading my “NO TRAVEL” portion and pitch me a job that sends me all over the country, I’m going to say “no thanks” and hang up.

Job Applications – Some places you go to interview, usually the ones where it’s a direct hire position, want you to fill out a job application before you actually talk to them. This, to me, is probably the single most annoying thing about the job search process.

First, it never fails that I forget to think “oh hey, this is a direct hire position, maybe I should bring a ‘cheat sheet’ in case they make me fill out an application”. Instead, it usually comes as a surprise. Then they make you fill out your life story, and make you feel pretty rotten in the process. They want you to go back X years in your job history. Dude, it’s on my fucking resume, why don’t you just look there. Yeah, I know you need it for your records – how about you save us both some time and make me fill this out when we’re closer to a job offer? (That’s another thing – what’s up with these jobs that require like ten interviews to get the position? Getting in at the Pentagon is easier!) You have to explain any gap longer than thirty days. They want names, addresses, phone numbers, direct reports, etc. It’s especially annoying when they’re asking for the name and phone number of the boss you don’t want to clue in to the fact that you’re looking for work, and they don’t give you a “please don’t contact them” checkbox. I always put in the name and phone number of a coworker who’s in on the gag so they can divert them if need be.

Then they ask for your educational history in the same way. Like I remember offhand the address and phone number of my high school. Oh good, they ask for GPA and major. Then they ask about convictions – of which I have none – but some even ask if you’ve had any tickets for moving violations. Well hell, I don’t know – I had a ticket some years back for an expired registration sticker – is that a moving violation? I was moving the car when the cop spotted me. And heck if I remember whether that was in the last seven years or not.

Sometimes they ask if you drink or use tobacco. Erm, define “use”. I have the occasional social drink and yeah, I’ve smoked a cigar in the last decade probably. Does that count? My wife worked at a place that simply would not hire tobacco users and would fire you if they found out you were a smoker. Not smoking on the property, smoking at all ever period. Company line was that smokers needed fewer work breaks and were cheaper on insurance costs. I think the company just liked that it was a legal way to discriminate (same company had few if any black people). So damned if I want to get fired because I had a Swisher Sweet four years ago and forgot about it.

The absolute worst is when they have you sign the back. It never fails that there’s some clause right above the signature spot that says “we have the right to contact…. your employer…” erm – no? Please don’t? If possible, I turn the thing in without signing it. Maybe this has cost me jobs before, I dunno.

Now that I’ve put some of my gripes out there, here are some actual stories, some from this latest “round” of interviews, some from others.

Too-good-to-be-true – one recruiter called me up and pitched me a job that, while it was far away, was offering 2.5x what I made in my previous job (current job at the time). While that sounds awesome and all, it was so much more money that it set off my bullshit detector. It didn’t help that he was pitching the job to me like it was a used car. When I finally got him to tell me the name, it rang a bell, and I gave him some “I’ll think about it” line and called my friend. Yup, true enough, the same person had pestered him two years prior, even threatening to come out to meet him at his workplace. I learned from someone else in the time since that the company in question was horrible to work for and generally worked people to death, then had nice rounds of layoffs. I eventually programmed the recruiter in my phone under the name “IGNORE”.

No we won’t tell you – one recruiter called me up, and every time it wasn’t just one guy on the phone, it was always him and his partner. They pitched me some job that was a little further than I’d like to drive, but I figured it was worth being submitted to. That is, until they wouldn’t tell me the name. I get that when you first have contact with a recruiter they don’t want to tell you the client’s name, because they don’t want you to go behind their backs, go directly to the client, and cut them out of the loop (conversely, if they have an exclusive arrangement with the client, they’ll tell you right off the bat). But my policy is that I must know the name of the client before I let you submit me. Part of this is because there are certain companies I don’t want to work for, for various reasons (like I’ve known people who have worked there and they’ve warned me to stay away), and sometimes I’ve been submitted there before, so it’s a waste of everyone’s time to resubmit me. But this one recruiter-pair literally wouldn’t tell me until after they submitted me. When I wouldn’t budge on the matter, they offered to take me to lunch to sweet-talk me, but I refused. Generally, this is a sign that the company in question is so notorious that no one will work with them.

Not too fair – a couple of years ago I interviewed for a large mortgage broker headquartered in the area. Their location was a large, sprawling, multi-building campus. All was well and good until I got there and saw the “JOB FAIR” banner. There were tons of people there, all vying to be cogs in the mortgage broker machine. My first thought was “I’ve been scammed”. Kinda like when someone sets up a “job interview” for you and it winds up being a large seminar trying to sell you on Amway or something. I had to park a mile away from the place. Hopeful that perhaps I wasn’t part of the “Job Fair”, I went to the building on the campus without the “Job Fair” banner. I asked for the Such-and-such building I was supposed to be reporting to for the interview – the woman pointed to the building with the banner. Every conversation I had went like this:

“I’m here for an interview with So-and-so at the Such-and-such building.”

“Are you here for the Job Fair?”

“I don’t know if I’m here for the Job Fair, all I know is I’m here for an interview with So-and-so at the Such-and-such building.”

And then the person would just point me to someone for the Job Fair. Apparently they were doing interviews at the Job Fair. I had to repeat this routine like 3-4 times until I finally got to someone who knew what was going on (while I prepared the “there’s been a misunderstanding fuck you bye” speech in my head). They said “Oh, you’re supposed to go to the fourth floor” and there I had the actual interview.

We don’t know either – I once had an interview at a company that made trucks (we’ll just leave it at that). They had a “hire for life” mentality, and mentioned that the only reason they had a position open for interviewing at all was because someone had retired – their retirement party was the week before. Nice, considering my gig at the time saw programmers as an exportable resource. Only problem was – they had no idea what I’d be doing when I got there. Their policy was to hire first, then figure out what the person does later. So that part where you interview them to see if you even want the gig? Yeah that was impossible. I might be working in VB.NET. Or VB6. They didn’t know for sure. Oh, I might have to travel to Europe for six months on hire. Or not. They didn’t really know for sure. I wasn’t sad when they never called back.

Oh and the reason I mention it was a trucking-related company was that this was another one of those fill-out-a-job-application companies. They didn’t make me do it first, I did it at the end of the interview. It included a question asking for my CB Handle. I guess they hired truckers as well?

I hope you’re not evil – This one was not an interview per se, but it was with a recruiter. The job sounded great, but then she said “the only thing is the company is… well, they’re… Christian, so they want someone with good morals and values.” I asked, “So, what does that mean? What does it matter if they’re a Christian company?” She responded, “Well, I don’t think they start off every day with a prayer like Interstate Batteries, but they just want someone with good… morals and values.”

I’m not sure what they expected me to say. “Oh sorry, I have no morals or values. Hell, you should be scared to be in the room with me.” I never heard more on the position but it just struck me as odd that they were bracing me for the overt Christianity of this company (which is just fine with me, I sort of have a separation of work and church stance). I wouldn’t have ended the interview or anything, but it was just weird.

Please wait – When I was trying to move to the Metroplex like four years ago, I interviewed at this one place on the Tollway, and at one point towards what I thought was the end they said “OK, we’ll be right back”… and I sat there for like 45 minutes in this dead silent conference room in this dead silent office doing nothing. Then one of them came back and said “OK, we’ll call you back for your next interview.”

That wasn’t the weird part – the weird part was that I did that standard thing where you wait a bit and call back to see if you got the job (or in this case, the next interview). If you have to call them, it either means they’re slow or you didn’t get the gig. So the receptionist says “oh, he’s busy now, would you like to leave him a message?” so I do. And I call back in a few hours. Same story, only I don’t leave a message. I do the same for the next few days. I leave him another message or two. I’m on the road headed to another interview (for the job I accepted and kept for four years) when another recruiter calls and pitches me this place, and I tell him I’ve already interviewed there. I eventually quit calling the place, and I never did hear back from them.

Now, I know what happened – they went in the back and maybe they were busy or maybe they discussed me, I don’t know. Maybe they just passed on hiring me or maybe the position fell through or whatever, which was fine. But why avoid my calls? Why not call me back when I’ve taken the time to call you first? If I don’t have the job, just tell me and I’ll leave you alone. Why avoid me? I mean, it did work in the end – I eventually just quit calling, so they avoided any conflict. But jeeze, grow some sack and tell me no already.

Carry the one – One recruiter called me up and pitched me a gig with a company that sounded fun (that’s another thing – every job is pitched to you as a “fun place to work”. Every single one) until it came out that they required a 50-hour work week. Thanks but no thanks, all other things being equal, I’ll take the job that only requires 40 hours per week.

But they weren’t done yet – you got paid in some sort of sliding-scale overtime deal. So like, if your salary was $X, that was what you got paid for your typical 40-hour work week. Divide that salary by the number of working hours in a year (weeks × hours per week) and that was the per-hour rate you’d be paid for those extra 10 hours a week.

So… why not just pay me 125% of $X? As in, if the job paid $10,000 a year for a 40-hour week (an unrealistic but simple number), why not just say “oh, the job pays $12,500 per year but you have to work 50 hours a week”? Why in the hell are you making me do the math on this one? Is it because the 50-hour-a-week thing is such a turnoff for everyone that you’re trying to make it sound like I get a bonus for it? Or are you trying to trick me into thinking I’ll get paid more than I will?
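Incidentally, the deal as they described it is a one-liner to check. Here’s a quick sketch – the function name and the round $10,000 figure are just the hypothetical from above, not anything from the actual offer:

```python
# A sketch of the recruiter's "sliding scale" deal as described:
# the base salary covers a 40-hour week, and the extra 10 hours per
# week are paid at the straight hourly rate derived from that salary.
def effective_salary(base, base_hours=40, actual_hours=50, weeks=52):
    hourly = base / (weeks * base_hours)                  # straight hourly rate
    extra = hourly * (actual_hours - base_hours) * weeks  # pay for the extra hours
    return base + extra

# The round number from above: $10,000/year for a 40-hour week
print(effective_salary(10_000))  # 12500.0, i.e. 125% of the base salary
```

In other words, the scheme works out to exactly 25% on top of base pay – which is precisely the math they were making candidates do in their heads.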

I think they were targeting the desperate-to-get-a-job types. That or they were just handed a shitty job to pitch. Like the one local firm I kept getting pitched that had a suit-and-tie policy. Sorry, all other things being equal I’m taking the job that lets me wear something normal to work.

Contracting Insanity – this one is not mine, but it’s my favorite interview story ever. It was a Slashdot comment.

All is well now, I got an awesome job through a great recruiter at a good pay rate mere minutes from my house (30 of them). Truth be told, I like these horror stories, I just hate going through them.

Several years ago there was this one (now defunct) page I would go to and people would post their webcams there. I think I went there because the Penny Arcade guys had their webcams there. Something about the time made webcams interesting.

Anyway if you remember anything about webcams when they were “hot” (and no I don’t mean the “dirty” ones) you’ll remember the trend was to pose for some shot, use some sort of editing software to graft a phrase on the image, and then leave that static image, in place, for a long time (I think people had started to realize that they scratched themselves too much to leave the things live for too long).

In the wake of 9/11 one of the webcams on this page (whose webcam it was escapes me) was just of the guy sitting in a room, the only illumination being from his monitor, with a rather downtrodden expression. The phrase he had typed on the top of the image was: “When I was growing up, my mother always said TV, movies and videogames would desensitize me to violence and reality.” At the bottom of the image was the phrase: “I really wish she had been right about that”

One of the topics about interactive entertainment (okay, video games) that’s always fascinated me is how it affects us, or doesn’t, or if it even can. Many of us in the gaming proletariat have always maintained that video games don’t affect us. We know that’s not entirely true – play Civilization IV long enough and you’ll be moving the pieces in your sleep. Play Tetris long enough and the cityscape skylines will start to beg for more pieces. But playing GTA3 didn’t make me into a violent criminal. If anything, its non-repetitive gameplay actually hinders your ability to draw too many patterns in your mind.

Still, I play a lot of games where I am in a 3-D world with very realistic (or at least convincing) graphics, armed with a gun, and killing anyone I see. Sometimes the blood makes patterns on the walls. In some areas of DOOM 3, the brains literally pop out of the enemies (who are all zombies). Thanks to the invention of rag doll physics, I can now hear the crunch of their bones as their bodies tumble down every stair or rock on the way to the ground. Playing all of this, it has entered my mind that I may be getting desensitized to violence. It doesn’t stop me, of course.

And then this past summer I bought a game off of Steam called DEFCON. This game’s premise is essentially to implement the “Global Thermonuclear War” game from the 1983 blockbuster WarGames. A brilliant premise, especially for children of the 80’s like me, and one I can’t believe wasn’t done sooner. The graphics are low key, the gameplay is simple, and the whole notion reeks of style.

One thing, though – the game is exceptionally creepy. The combination of the rather low-tech graphics (though it’s not like the game is some EGA slouch), eerie music, and some subtle sound effects (like wind) makes the game downright spooky to play. But not because it’s some scary notion like those in the Resident Evil games – no, this one is creepy because you’re basically killing millions upon millions of people. The catchphrase of the game is “Everybody Dies” and it’s a given in the game that a large number of your people will die, too – the way to “win” (or one of them, anyway) is to just make sure more of the enemy’s side dies than yours.

So it says something that, in a day and age where I can play a game that lets me mow down pedestrians and kill innocent people, I get the heebie-jeebies from seeing “DALLAS HIT: 5.4 MILLION DEAD” on the screen in cold, stale letters. I guess it means two things – I haven’t been desensitized to violence after all, and context is important, despite what the Jack Thompsons of the world think.

Over Christmas, I bought a Nintendo DS. I now mostly retract my earlier statement: this is the most perfect gaming device I have ever purchased. It’s too bad it can’t do multiplayer GBA games or play GB/GBC cartridges, but after seeing my GBA games on the backlit screen, there was no going back to my GBA.

The first game I bought was New Super Mario Bros. The second game I got was Brain Age: Train Your Brain In Minutes A Day. I was sort of shocked to see that Brain Age was #10 on the top 10 console games sold in 2006, period. I figured that I was unusual in being weird enough to want to play this game (though, I did notice it was advertised in my wife’s magazine Real Simple, so perhaps Nintendo got it right about expanding their market.)

Brain Age claims, in a very “for entertainment value only” sort of way, to exercise your brain and make your mind “sharp”. It takes the research of Ryuta Kawashima and turns it into an interactive game, which is quite effective because it is considerably more interactive than a book and can calculate your progress for you (the game even plays as if you’re holding it like a book). It uses the internal clock of the DS to make it such that you can only play the games once per day, it tracks your progress on a graph, and even comes with Sudoku puzzles.

I’ve been playing this game for a few months now and, though it might be a placebo effect, I do think the game is actually effective at what it claims. Not that I think it’s made me smarter or sharper necessarily, but I am getting better at the activities daily, and some of the tasks (quick, rapid-fire math calculations, memorizing lists of words) do seem a lot like the sorts of things we have kids do in schools. It occurs to me that this game would be excellent for schools. This is the sort of game my wife could like. Hospitals in Japan have been using it to ward off dementia. The game is selling many times better than Nintendo had ever dreamed.

But then it occurs to me – if you accept the notion that Brain Age might have an effect on your mind – in this case a positive one – don’t you also have to accept the notion that other video games might have a negative effect on your mind?

The style of Final Fantasy-type games (specifically, old SNES-type games with low-tech graphics) is reproducible enough that a company in Japan actually made a software package called RPG Maker, whose purpose is to allow people to make their own Final Fantasy-style RPG. An individual named Danny Ledonne used a version of this software to make a game called Super Columbine Massacre RPG!, which recreates – to some extent – the events of April 20, 1999, putting you in the role of Eric Harris and Dylan Klebold. Seeing as how I’m the only person I know who actually thought JFK Reloaded was neat, I figured I’d give the game a go.

Final Fantasy-type games are famous for using the “opening screen of text” tactic. Usually it’s a screen of a solid color (white, black, etc.) with text on it, each line fading in, and usually some sort of weird nonsense that makes no sense whatsoever outside of the universe of the game (which you know nothing about since every Final Fantasy game is completely different). However, SCMRPG‘s opening screen read:

The purest surreal act would be to go into a crowd and fire at random.
André Breton, 1896-1966

I actually felt nauseous reading that. And the slow, meticulous pace of the opening sequence of the game was just surreal. When dippy-looking 16-bit sprites represent some androgynous fictional Japanese characters, it’s easy to have no emotional attachment to the game. When the sprites represent real-life killers who meticulously planned the then-worst school shooting in history, the experience is much different.

The author of this game did his research – just about everything in the game comes from a real-life incident or allegation (easy enough to do, since everything about that day and the killers has been documented over the years). The MIDI music is from the era. The theme song on the main page is “The Nobodies”, the Marilyn Manson song which is generally accepted to be about Columbine.

The author has come under a lot of fire for the content of the game, especially the “going to Hell” detour the game takes (more of a reference to the types of detours the Final Fantasy-style games take than a commentary on the killers) and many people have stated that the author’s initial purpose was to stir up controversy and that he only switched his story to the “social commentary” role once he got the popularity he desired. I disagree; I think he intended to make a work of art and once he figured out that the technology he wanted to employ – namely that of an old RPG-style game – would prove feasible enough for his purpose, he went ahead and finished it.

I started writing this post in February. Shortly after starting it, I made the decision to start looking for a new job. In late March, a family issue took the majority of my attention until late April, and in the last three weeks I finally secured and started another job. This is why this post took so long to write. In the meantime, though, another school shooting went down at Virginia Tech, and coincidentally it took place in the same timeframe as Columbine (the third week of April).

The game industry was able to breathe a slight sigh of relief when it came out that Seung-Hui Cho did not play video games (though this didn’t stop Jack Thompson from making the claim anyway). With a body count of 32 compared to Columbine’s 15, the VT shooting became the worst school massacre in history, and in the ensuing weeks it caused a lot of speculation and finger-pointing. However, it seems to have vanished from the spotlight quicker than Columbine did. Perhaps it’s the Iraq War; perhaps it’s that it was a college as opposed to a high school (where the students don’t have a choice in the matter of attending); perhaps it was because there wasn’t an easy pop culture target to nail it to (Marilyn Manson, etc.); perhaps it was the video and photos that the killer sent to NBC News during the tragedy; maybe it’s the misplaced blame on gun laws (a few months prior, VT made it illegal to carry a concealed weapon on campus, leading some to believe that had this rule not been put in place, the massacre could have been ended by another student). Whatever it is, the focus on VT has faded a lot quicker than Columbine’s did.

In any event, I’m not sure if games can really have any lasting effect on our senses anyway. On JFKaos, a JFK Reloaded fan site (the only one, probably), someone claiming to be from a marketing agency wrote in to the webmaster. This person stated that Traffic, the developer of JFK Reloaded, contacted them first and came over to show the game. In the initial showing, one woman was so nauseated by watching the game being played (in the game, if you hit JFK’s head the same way Oswald did, it has the same “brains flying” effect as the Zapruder film) that she had to flee the room.

Now, I’ve seen videos on the Internet that have made me sick and given me nightmares. However, JFK Reloaded didn’t. Neither does the Zapruder film. Neither do horror movies or blood splattered on the walls in video games. Does this mean I’ve become desensitized? Does this mean society’s become desensitized? (witness how The Beatles were once seen as a corrupting influence, but now Marilyn Manson collaborates with Disney)

Or does this just mean what we’ve all known all along and no one wants to admit – that different things affect people in different ways, and censoring something for the masses in order to avoid upsetting a small number of people is pointless?

I’ve seen some upgrades recently.

Back in October, I joined the cult of widescreen LCD owners. LCD monitors are one of those deals where, once you make the switch, you wonder how you ever got along with a CRT.

Amusingly, I went back on some of the bold claims I’ve made in the past. It occurred to me that I had spent a lot of money on a 20″ widescreen monitor (native resolution of 1680×1050), and part of the reasoning behind that was I wanted things on my monitor to look prettier. And here I was running the old, circa-1995 Windows 95 theme. So I gave the Royale Theme a whirl and wound up liking it – I think half the reason is that it was designed with LCD monitors in mind. I also went ahead and tried out ClearType and, after getting over some of my prejudice, wound up liking it, too – again, because it was designed with LCD’s in mind. I had to apply a different font for my programming editors, but it wound up being worth it in the long run. Ironically, though, I now have less reason to upgrade to Vista (or will at least have less neat new stuff happening when I do).

I also got, for my birthday, a G5 Laser Mouse and a G15 Keyboard. I’m in the unique-ish position of being anti-wireless. In the same way that I now think people who “prefer” CRT over LCD are backwards, I’m in the position of thinking wired mice are superior to wireless mice, which many people believe is backwards. The G5 has a wireless “cousin”, the G7, which is about $30 more and does not feature the removable weights cartridge that the G5 does. I know I sound like those people who refuse to move to CD’s and prefer vinyl because vinyl sounds slightly better, but I refuse to go to a wireless mouse because wired mice are a little more responsive. I open up MS Paint and try to make circles with the mice very quickly. Without fail, the wired mice make better curves than their wireless cousins – the wireless mice always have straight lines as part of the curves. So for $30 more it’s a less accurate mouse, and it’s missing the weight cartridge feature? Pass.

Of course the irony there is that it’s not like I’m such a meticulous hardcore gamer that an extra 1.4 grams on the right side of the mouse will make a huge difference, but it’s the principle of the thing.

The G15 keyboard is really nice but it’s another break from tradition for me. I’ve always viewed keyboards as these cheap, disposable devices and here I am spending $100 on one. It’s already made me less likely to eat at the desk – my last keyboard was so clogged with food and dirt it wasn’t worth the effort to save. But the illuminated keys are worth the price of admission alone. Ironically, I don’t tend to play a lot of the games the macro keys would come in handy for, but they have come in handy for testing things I’m developing – my main job has this project I’m working on where I have to fill in a form on a web page before continuing. Once I hooked this up to a macro, life was good.

The LCD screen is seen by some as gimmicky (enough so that Logitech sells a version of the keyboard, the G11, without the screen for about $30 less), but ironically for me it’s worth it for a lot of non-game reasons. I mostly play FPS games, so the fact that it tells me how much health I have or how many bullets I have left is not that useful – that information is on the screen already (though it does still sort of come in handy in PREY, since there’s an actual number on the LCD screen instead of just a meter). But the TrillianG15 plugin for Trillian Pro is sent from the gods – now I can answer instant messages without having to tab out of the game. I can also keep track of the time with the clock on the LCD, check out performance settings with the performance monitor, and see what song is playing in-game using the media display, all from the LCD.

The only irony in all of this (other than the fact that it wasn’t until I got these guys home that I realized I didn’t own a USB KVM switch for my work laptop) is that I’ve had to adjust to them. I’ve never owned a mouse that could “tilt” the wheel, so I keep screwing things up when I try to middle-click on something. And I never realized how much of my typing was based off of “where are my hands on the keyboard” until I got a keyboard that was much wider than any other keyboard I had ever used (side note: the G15 is just barely small enough to fit on my keyboard drawer, both in terms of LCD clearance and sheer width). I kept hitting macro keys (which, by default, are mapped to the F-keys) because I thought it was the edge of the QWERTY section. And while the rest of the keyboard is a standard 101-key affair, Logitech lays out their keys and sizes just differently enough that I’ve had to do some readjusting. I had a fluorescent lamp on my desk and, ironically, the light bouncing off of the black keys made them harder to see – so that had to go. And though I used to be bad about not cutting my fingernails quickly enough, never again, since whatever material these keys are made out of feels like crap when you hit it with a nail (in my opinion anyway; I have no idea what it’s like for women with longer nails). Overall, though, these are awesome purchases – that I can see the keys in the dark and ratchet down my mouse sensitivity in-game has already paid off in spades.

One of the things that comes along with newer technology like this is whether or not your games support it. The widescreen is the biggest X factor. If a game is sufficiently old and/or didn’t allow for minute tweaking, then it doesn’t work with widescreen, or at least not correctly. If a game can’t run in the native resolution, then you have to run it non-natively, which causes some blurring due to the nature of LCD’s. Not a huge deal, and I’ve gotten to where I try to run older games in a window (which presents its own challenges if the game uses a really old version of DirectX that didn’t get along with high-color displays all that well). The Widescreen Gaming Forum has come in handy, but if the game is old enough, no one can find a tweak that works, and it won’t run in a window, then you’re just sort of hosed. Old games like Quake 3 run with some effort because the developers were freaking awesome, but even some newer games don’t work. Neverwinter Nights works with widescreen, but the newer game Star Wars: Knights of the Old Republic, which used the Neverwinter Nights engine, doesn’t, since the developers cut off support for the game before the functionality was grafted into NWN (either that or they just never bothered to re-graft it back in – Bioware developed both games, so I wouldn’t be surprised if it was just a matter of contract woes with Lucasarts; in any event, the support can be hacked back in).

Ironically, one game that supports widescreen, multi-monitor gaming, the G15 keyboard’s LCD screen and all of its function keys, and the G5 mouse, is World of Warcraft – aka the game I refuse to play, both because of my anti-MMORPG stance and also because I’m afraid that I’ll get hooked and like it. It’s kinda like those situations in college or high school where you’re at a party and there’s marijuana floating around – you don’t try the pot both because you’re anti-drugs, and also because you’re afraid you might like it. So the fact that I just compared WoW to drugs says something about it.

But it does bring up something else I’ve noticed – one of the podcasts I listen to is the PC Gamer Podcast, and it’s one of the more interesting podcasts I listen to. These guys have been covering PC games for over a decade (though I don’t think there’s anyone who’s been with the magazine since day one) and the debate is lively (for example, they don’t like the 0-100% scale they use, either, but they’re stuck with it). One thing I kinda don’t like about the podcast, though, is that 1/3-1/2 of every show is about World of Warcraft. If your only exposure to the PC gaming world was this podcast, you might not even realize there were other MMORPG’s out there. Sure, WoW has 8 million gamers now (or accounts, but that’s the current active number – not just over the lifetime of the game), so it’s not something you can ignore. It concerns me because, while WoW is something of a shining example of PC gaming superiority, there are a lot of people who believe their $15/month fee is a better investment than additional games, so they actually buy even fewer PC games because of it.

But one thing World of Warcraft has going for it that even other MMORPG’s don’t necessarily have is constant development. Sure, part of that is the fact that they need to keep coming up with new content or people stop paying and playing, but part of that continuing development is that the game adapts to new technologies. When dual core came out, they adapted to it (I don’t know offhand if WoW exploits dual core, but at least it doesn’t screw it up like it has with other games). When widescreen monitors came out, they supported those (it’s more than just a resolution change; it also requires an FOV tweak). When the G15 keyboard came out, they put support for it in the game. It was compatible with Vista from day one and heck, even the Macintosh version got a universal binary, so the game runs natively on both PowerPC and Intel hardware in that universe. It’s just nice that the game continuously updates itself for new stuff. One of these days, when everyone’s computer is more powerful, they’ll make new expansion packs that require beefier hardware and have better graphics (not that this is unique – EverQuest did the same thing).

Granted, this is from Blizzard, the same guys that are still issuing updates for Starcraft, so they have a tradition of setting the bar really high for support, and having 8 million people pay $15 or thereabouts each (I doubt China is paying that much per head, and prepaid cards do get the price down some) does help. And it’s not like I have an answer here – without further sales and revenue coming in, there’s not much of an incentive to continue development after the sales window (though as I say that, id came out with a DOOM 3 patch yesterday) – but it’s the dirty little secret that while in theory the PC is eternally backward compatible (as opposed to consoles, where the Nintendo 64 doesn’t play SNES games and current consoles only begrudgingly play old games because optical discs mean the form factor argument is out the window), the fact is that sometimes getting the PC to run old games is quite the task. If the game is an old DOS game, DOSBox usually does the trick. If the game used DirectX, then in theory, with a tiny bit of hassle, it should always work. But if none of these tricks work and some new technology breaks things (I wound up having to fire off Painkiller with XCPU because neither the AMD Dual Core Optimizer nor the MS Dual Core Hotfix would fix it), then you’re just sort of screwed.

World of Warcraft continues to grow over two years after its release – PREY sold over a million copies, but it went from top dollar to cheap bin in seven months. I wonder if PC games would sell better if they had a definite commitment on development windows. No one (or at least not that many people) wanted to buy Quake 4 for fear it wouldn’t be supported as long as other games, and when that happens, it’s a self-fulfilling prophecy, since a lack of people playing the game causes it to be less popular and doesn’t encourage others to pick it up. To its credit, Quake 4 did release several major updates, and I think it could be the modern-day-technology successor to Quake 3, but I fear it’s too late to be given that chance. Battlefield 2 never saw all the fixes it needed (and literally one year later it was still unfixed while EA forced them to whip out Battlefield 2142), so confidence is important to gamers.

Anyway, I’m happy that I’ve got these nice new upgrades (I need a new hard drive and Windows Vista Ultimate but that’s a ways off) and when I can get the games to play along that’s extra awesome. But I do see the lack of adaptation as a problem. When you’re id Software and your 1999 game Quake 3 can adapt (or even your 1996 game Quake through the source code you gave away) that’s awesome. When you’re EA/DICE and your 2005 game Battlefield 2 can’t adapt (or tells me I’m a cheater if I try), that’s unacceptable. It’s almost tempting to try World of Warcraft just because they give gamers what they want instead of telling them. I still have copies of World of Warcraft sitting in this office with me. I’m tempted to install it. I can quit at any time, ya know.

Nah, I’ll just fire up Oblivion instead…

There was a point in time at which Guns N’ Roses was the biggest band on the face of the earth and, with the exception of groups like Led Zeppelin or The Beatles, arguably the greatest band of all time. And this was after one album.

Similar to my following of Van Halen, Guns N’ Roses is one of the other groups I follow. Their first album, 1987’s Appetite for Destruction is pretty much perfect – many rank it as the #2 rock album of all time, just under Led Zeppelin’s fourth album. At the time people even hailed them as the second coming of Led Zeppelin.

A follow-up EP, Lies (or GN’R Lies, depending on how you read the cover), combined their earlier independent release Live Like A Suicide with four new acoustic tracks, including the controversial “One in a Million”. A song like that would derail most careers, but GN’R had too much momentum.

Three years later GN’R came out with two albums simultaneously, Use Your Illusion I and Use Your Illusion II. They were very different from the band’s previous efforts – highly produced, full of long epic songs and horn sections, and promoted by the band’s first headlining tour. Most fans came along for the ride; some decided that the new albums were too different and the product of Axl Rose’s increasingly eccentric mind. Izzy Stradlin left the group before the tour started, which was the first sign of trouble.

1993 saw the release of “The Spaghetti Incident?”, a 12-song album of covers, mostly of punk rock tunes. No one knew it at the time, but it would be the last full release from the “original” lineup of GN’R (sans Steve Adler, their original drummer, who was fired after Lies). A cover of the Rolling Stones’ “Sympathy for the Devil”, appearing on the 1994 Interview with the Vampire soundtrack album, would be the last song from the original lineup.

For a long time nothing happened. Slash quit the group and started a short-lived side project, Slash’s Snakepit. Duff McKagan left at the end of his contract. Matt Sorum and Gilby Clarke were fired. Slash, McKagan and Sorum eventually did have a “second coming” of their own, hooking up with Scott Weiland to form Velvet Revolver – a band that enjoyed an unprecedented amount of initial success based more or less on the fact that they were considered the second coming of GN’R.

Of course, the real second coming of GN’R (or the other one, if you prefer) was the band in which Axl Rose was now the only remaining original member. He started hiring replacements for his former bandmates and recording new material. Somewhat quickly, this new GN’R had a song called “Oh My God” ready for 1999’s End of Days soundtrack.

Shortly thereafter, though, the band went into stealth mode recording a new album and very little was heard from them for months at a time. Occasionally a snippet of information would come out, like a producer for the album had been hired (or quit, or fired), or a new member of the band (like guitarist Buckethead – famous for wearing a mask and an empty KFC bucket on his head) had been hired (or quit, or fired).

At some point, the name of the new album came out: Chinese Democracy.

In 2002 there was some hope that the band was nearing completion of the new album when they were the surprise closing act on the MTV Video Music Awards. This was followed by a national tour. However, eight dates into the tour the entire affair was canceled (they had maybe played four or five shows) and the band went into stealth mode again. Years went by without a peep from the GN’R camp, other than from producers or members who had quit. Axl became the next Bigfoot – people would report seeing him the same way one would report seeing the Loch Ness Monster (of course, Nessie never gets interviewed by a surprise camera crew coming out of a hockey game).

However, in January of this year Axl went on record (and more or less came out of hiding), saying “you will hear new music this year”, which was pretty much taken by most to mean that Chinese Democracy would be released in 2006. In February, decent-quality recordings of the songs “There Was A Time”, “Better”, “I.R.S.” and “Catcher in the Rye” leaked onto the Internet – the unconfirmed rumor was that Axl leaked them himself to test the waters. That same month, Slash claimed to have heard the album and said it would be released in March, which obviously never happened. Over the intervening months, Axl occasionally dropped hints about the new album – the most prevalent being that there were 32 songs in some state of completion, 23 of which he was working on finishing, and 13 of which would actually make the final album.

In May, Axl made a surprise appearance on the Eddie Trunk show (his first interview in several years), and he allowed Harmonix and Red Octane to put “Sweet Child O’ Mine” in Guitar Hero II as a playable song. Over the summer the new Guns N’ Roses played several sold-out warmup shows and tried out the new songs. The plans were in place for the European tour over the summer, with the North American tour to start in October.

And yet time went on with no announcement of a release date for Chinese Democracy. Axl had a chance to walk back or deny the idea that it would still be released in 2006 when he was asked about it by MTV News backstage at the 2006 Video Music Awards in August, but he maintained that it would indeed be released in 2006.

In October a strong rumor was posted on RollingStone.com indicating the album was to be released on November 21, but no one ever confirmed it. When asked about the release date, GN’R’s manager just stated “there are only fifteen Tuesdays left in the year” (new albums are released on Tuesdays). A Harley-Davidson ad featuring the final studio version of “Better” was placed on the HarleyDavidson.com website on October 21, only to be replaced by a version featuring “Paradise City” (from Appetite for Destruction), with the “Better” version changed to “coming soon”. When asked further about the release date for the album, GN’R’s manager stated “we might not bother with a release date – you might just walk into your record store one day and find it there”.

So that’s where it stands today – the tour is continuing (one canceled date notwithstanding) and the album is still “officially” being released in 2006, but no one knows anything else. As I write this there are nine days until the rumored November 21st date and still nothing from GN’R and/or their label. One potential problem is that the 21st is also the date the new Jay-Z album comes out (Jay-Z had previously “retired”, so this release is seen as significant). Employees at record stores report that their usual indicators of an impending release show nothing at all for Chinese Democracy, whereas albums coming out in 2007 have at least some trace in the system.

Some speculate that perhaps the management wasn’t kidding with their statements that the album might just appear on store shelves one day. Given that the aforementioned Jay-Z album being released the same day has already leaked online and Chinese Democracy hasn’t, it might be that the album is being handled in such an odd manner to thwart piracy (it’s hard to pirate an album if you’re not even sure it’s finished yet). While an album magically appearing in stores would not be the best maneuver from a marketing-push perspective, Eminem’s albums still sold amazingly well when their releases were pushed up unexpectedly to odd days of the week (like the Friday before the scheduled Tuesday) to thwart piracy. Of course, those albums at least had a release date to speak of, and GN’R’s popularity in 2006 doesn’t compare to Eminem’s popularity in 2002.

Still, Axl does have in his possession something resembling the final album – he’s used it as collateral to get into clubs (he used it to get a club to stay open on his birthday – the DJ reported handling two CDs). The “13 songs” statement seems to indicate that the final lineup of the album has been decided (I find myself wondering why he’s trying to finish the other songs). Sebastian Bach, who hung out with Axl enough to get himself added as an opening act on the tour, says that he’s heard the album and that it’s “amazing”. Rumors have circulated that people in the parking lots of Interscope (the label, I believe – “Geffen Records” no longer exists) were listening to it via loudspeakers on the building. It’s also been rumored that last week’s concert cancellation (the original official story was that the fire marshals were trying to force GN’R to tone down their show and effectively force them out; the “official” official story was that the local police would fine the group if they drank beer on stage – though why they would forego a $200K concert to avoid a $250 fine is beyond me) was due to Axl needing to fly to California to make some last-minute decisions on the record (the other rumor is that since only 3,500 seats of the 5,000-seat venue were sold, Axl took it as an insult and canceled the show). Supposedly the cover art is finished and the marketing campaign is ready to go.

And yet – no album. Or release date. It seems extremely weird for an album that’s supposedly going to be released by the end of the year to have nothing remotely concrete available in the way of information. But then again, nothing about GN’R has been normal thus far – Axl has kept the group’s name despite being the lone original member (Dizzy Reed is a holdover from the Use Your Illusion days, but he still wasn’t in the original lineup) and has gone on to spend close to ten years recording an album at a rumored cost of $14 million (perhaps that’s it – the record label has already spent so much money they don’t want to spend more to promote it). This truly is the Duke Nukem Forever of the record industry. It could be that Axl and crew have been mum because they’re working so hard on it. It could be that they don’t want to alienate concertgoers by admitting that the album in fact won’t make it out in 2006 like they promised. It could be that they just don’t know yet when it will be out. It could be that they’re targeting December 26, 2006 as the release date – the last Tuesday of the year. And it could be that November 21, 2006 will see at least something – a single, an announcement, etc. (the “Talking Metal” podcast believes the date will be December 5, 2006 – and there’s some speculation that they might have insider information).

My main curiosity is: what is the point of no return? At what point is it literally too late to get the album into stores? It’s been said that between Thanksgiving and Christmas record labels “shut down” (which is why the Christmas albums all come out in October or so), so if it doesn’t make it by November 21st (the last Tuesday before Black Friday) it will likely come out at the end of the year or not at all. But if this coming Tuesday (the 14th) comes and goes with no announcement, does that mean the 21st is impossible? Or will it really be one of those “walk into the store” kind of deals? And if it is, will the album be successful? Appetite for Destruction shot up the charts with no video or radio airplay or advance promotion – could Chinese Democracy do the same?

And overall, I’m curious about the album because the leaks, to me anyway, sounded good. I know this isn’t GN’R with Slash (the closest we will get to that is Velvet Revolver). I know this is essentially Axl’s solo project under the same name. It would be as if Studio 60 On The Sunset Strip were also called The West Wing but was still about an SNL-style show in LA and not The White House. I bought Daikatana the first day it was out because, man – what a story. I want to hear this album because I want to know what an album from an eccentric perfectionist spending a decade and a small fortune sounds like. Was GN’R huge because of Axl, or despite him?

All I know is – no matter what, if I wake up one morning (maybe next Tuesday) and hear that Chinese Democracy is suddenly on store shelves, I’m stopping what I’m doing and running to the nearest store and buying it. And any CD singles with unreleased songs. It’ll be like 1992 again.