Okay, I’ll preface by saying this one’s a little odd, since it deals with football and I’m hardly a knowledgeable football fan. Nevertheless I figure it will be short and probably interesting, since it’s written from the perspective of someone who’s the opposite of a football fan.

Anywho, as far back as 1985 (I think), my Dad took us all to football games. Specifically Texas A&M football games. Suffice it to say that early introductions to the A&M campus more or less constituted the necessary brainwashing to convince me and my sister to go to A&M. I remember being particularly impressed at eight years old to see that even the scoreboard of Kyle Field was a large building.

A&M went to the Cotton Bowl three years in a row, from 1986 to 1988, and we went to it one year, 1987 (the one out of the three where A&M lost). At this point in time, the Cotton Bowl was seen (if I recall correctly) as the Super Bowl of college football (in the south anyway). Somewhere along the line, the Cotton Bowl, as a bowl game (it’s also the name of the stadium in Dallas where it’s held) kinda went downhill. I don’t recall exactly when, but it might have coincided with the neighborhood the Cotton Bowl was in turning into a ghetto (or perhaps it was always that way).

In any event, at some point in time another thing happened – bowl games started getting sponsors. Like I said earlier this is all from memory, but suddenly one year it was no longer the “Cotton Bowl”, it was the “Mobil Cotton Bowl”. Mobil’s sponsorship has since given way to Southwestern Bell, so it’s the “SBC Cotton Bowl Classic” (though I’m not sure when the “Classic” showed up). The “Sugar Bowl” is now the “Nokia Sugar Bowl”, and in 1998 A&M went to that while I was in the Aggie Band. Amusingly, the scoreboards in the stadium had some permanent ads for a different cell phone company, which were covered up during the game (we saw the original ads during a rehearsal).

Now I’m not some prude who thinks advertising is an evil thing, but it has given rise to some embarrassing gaffes in the past. The best one I can remember is the “IBM OS/2 Fiesta Bowl”, which showed commercials for OS/2 during the breaks. I just wonder what they were thinking – that the armchair quarterbacks of the world would change to something called OS/2 because they sponsored a bowl game? I mean, “IBM OS/2 Fiesta Bowl” just doesn’t roll off the tongue.

Now changing gears for a second, Texas A&M is, simply put, not doing so well in football this year. Apparently A&M had never lost at Kyle Field to a conference team since they switched to the Big 12 a few years back. They’ve lost four times at Kyle this year. And despite R.C. Slocum being the “Winningest Coach in A&M History”, fans are now calling for his head on a stick. He’s like the Mariah Carey of football at this point. Popular consensus is that he’ll still have a job next year, but short of a miraculous comeback that will be it for him. A&M got a new President this year and he apparently has a goal for A&M to become National Champions in three years. I wonder why no one thinks it might be the players’ fault that the team sucks. Apparently since A&M isn’t the “named school” for Texas (the University of Texas is), the best players don’t seek them out first.

But this past weekend A&M won a game at Kyle Field versus Oklahoma. This wouldn’t be too significant, save for the fact that it helps Slocum hang on to his job a little longer, and the fact that Oklahoma was the #1 ranked team in the country and A&M had never beaten a #1 team before. Now Oklahoma can’t be National Champions and A&M might go to a bowl game.

Dad once explained to me how they determine the “National Champion” and now I don’t remember, but you basically have to be either #1 at the end of the year, or be #2 and beat #1, or something like that. In any event, A&M hasn’t done it since 1939, and that was the only time they did.

But tying this back into the bowl theme, it used to be that there were only a few bowl games, and you only got to go to one if you were good that year. Now it’s like belts in wrestling – there’s a ton of them, and it seems as if the bad teams just go to the crap bowls. So the fact that A&M wasn’t even in the bowl running says something about how badly it was doing. Last year A&M wasn’t doing so bad, but they went to a pretty bad bowl – the DiscountFurniture.com Bowl. Suffice it to say, it was embarrassing simply because of the name of the bowl. I don’t know if DiscountFurniture.com took over an existing bowl and ditched the name or if DiscountFurniture.com just didn’t bother with a “real” name, but it might as well have been a porn company sponsoring the bowl. In their defense, Discount Furniture is actually a chain of very big time furniture stores and not some fly-by-night dot-bomb.

But in any event, Saturday’s victory probably bought Slocum one more year of employment (at least).

Still, if A&M does go to a bowl game this year, let’s hope it’s not the FreeOnion-Loaf.org bowl in Minute Maid Park or some crap like that.

I think I’ll take advantage of this lull at work (the boss goes to the doctor right as things go stale) to dispel another myth or two that have been bugging me.

General Computing Myth #1: The latest (whatever) is always necessarily the best.

I’m running two PC’s at home. If you count the one my wife runs as well as the dinky one in the garage running Lindows, then that’s four. My system runs Windows XP. We decided it would be good to build a new system from spare parts to do the things we don’t want to trouble our PC’s with, such as being a print server, a test web server, and a backup server (to this end we moved the tape drive to it).

Since I’ve seen the light of the Whistler line of operating systems (Windows XP), I don’t want to go back to Windows 98, which was notorious for locking up during network operations. However, I also didn’t want to lay down the wad for a new copy of XP, nor did I want to break the law and pirate one. So I did some looking and discovered the Windows.NET Server 2003 Customer Preview Program. Sign up with Microsoft and you can download and install Windows.NET Server 2003 RC1. This copy will function for 360 days and is the successor to Windows 2000 Server. I signed up, they let me download it, and I’m very much pleased with the results.

Now there are a few drawbacks. For starters, no one makes Windows.NET drivers for anything yet, so you have to guess whether to use Windows 2000 drivers or XP drivers. Also, some devices don’t have equivalent drivers for Windows 2000 Server, or their makers outright say they don’t support it. It’s kinda a crapshoot. Plus there are always the very curious things about a Microsoft server operating system – like why it bothers including Solitaire and Minesweeper, or certain other things that pose small risks on consumer OS’s but would be huge liabilities on production servers, like “easy file sharing”.

Nonetheless, it does all I want it to and I’m happy. But I see on the private newsgroups they let you have access to that there are a number of people who decided to download this operating system and install it as their new home operating system. I can almost understand why you might think this would be a much better deal than say Windows 98, but some people blew away their Windows XP Professional partitions and opted instead to install .NET Server. Then they complain (and get flamed immediately) that “this thing won’t play Counter-Strike, WTF???”

This is the embodiment of the myth that the latest whatever is always the best. Windows XP Pro was the best OS Microsoft ever did, so Windows.NET Server must be even better, right? Complaining that Windows.NET Server (or even Windows 2000 if you get right down to it) won’t run your games is like complaining that your new car won’t fly unless you drive it off a cliff – and then it crashes.

The last place I worked would never apply the Windows NT 4.0 Service Packs until a few months had gone by. It would never fail that a service pack would break something vital to the operation, and then you’re stuck backtracking and (ugh) reinstalling. I’ll never forget when a new student worker (like me) was hired – he couldn’t understand why all the PC’s in the library were running Windows 3.1 in 1999. I could never seem to make him “get” the fact that these were old machines, new licenses were expensive, and that (at that point) Windows 3.1 still did everything necessary to access the online card catalogs and such. It blew his mind when we upgraded almost all of them to Windows NT 4.0 instead of Windows 2000 (literally a few weeks old at that point).

General Computing Myth #2: A bigger number means better.

This myth can be dispelled by anyone running an 800MHz Macintosh that can outpace a 2GHz PC in some areas (most of which involve Adobe code).

I blame this on two things – the mass public’s misunderstanding of what a version number means, and game consoles in the early 1990’s.

The ruler of the roost in 1989 was the Nintendo Entertainment System (NES). Atari and Sega might as well not have tried. But that year Sega unveiled a competitor in the form of the Genesis. The NES, they pointed out, was an 8-bit system. The Genesis was a 16-bit system, so it was therefore better. Anyone can see that 16 is twice as much as 8, so the notion instilled in the public was that the Genesis was (at least) twice as good as the NES. This wasn’t necessarily untrue, either. The Genesis could run faster games (though that was more a function of clock speed than processing bandwidth), could run bigger game levels (more a function of memory than of processing power) and could display more colors (which owed more to its video hardware than to the CPU being 16-bit).

Of course, within two years the Super NES (SNES) followed, and by all accounts it was a superior system to the Genesis. Playing catch-up with the Genesis, it would have to be better to even stand a chance, but it allowed for even more colors and more elaborate games (it was the system of choice for Japanese RPG developers). But, being a 16-bit system, it was seen as “equal” to the Genesis by the general public.

So then a few years later it came time for the next generation of game systems. Naturally, the console makers started working on 32-bit systems. Atari and Nintendo both had the bright idea of trying to leapfrog that generation and play on the public’s fascination with higher numbers, and came out with 64-bit systems. Atari even contributed to this notion with a “DO THE MATH!” campaign (decreeing that 64-bit was always necessarily better than 32-bit). Of course, by this point the clear notion of what a “bit” meant was getting muddied. The NES had an 8-bit CPU that did everything, and the Genesis/SNES had 16-bit CPU’s that did almost everything (usually sound was covered by a second processor). However, the Atari Jaguar had a 16-bit CPU and four other chips – two were 32-bit and two were 64-bit. Since the 64-bit chips were the graphics chips and graphics were everything, the biggest number won out, at least for marketing purposes. However, when 32-bit PlayStation games started looking and playing better, the Jaguar lost out. The Nintendo 64 had more components that were 64-bit, but not too many more. Sega even tried to parlay the fact that the Sega Saturn had three different 32-bit processors into an advantage. They didn’t point out that developers hated working with those multiple processors, or that the single PlayStation processor was still more powerful.

So when the next generation got primed (Dreamcast, PS2, etc.) lots of people were asking “how many bits?” and no amount of explaining could convince them that bits didn’t matter anymore and that things like polygon count were more important. It didn’t help that over-zealous game store employees trying to sell the Dreamcast pushed it as a “128-bit” system, and then later employees trying to make the Dreamcast look crappy in the light of the PS2 sold that one as a “256-bit” system.

Truth be told, the Dreamcast does have what could be considered 128-bit guts, same as PS2, but I don’t think the makers of either system bothered with the bit thing. Now here’s the fun part – the Microsoft Xbox, what most people agree is the most powerful cat on the block, is a 32-bit system. Seriously. The Intel Pentium III processor it has inside is a 32-bit CPU, the Nvidia chip does 32-bit color, and the sound components are at best 24-bit.

Now on the PC front, Intel is putting out a chip called the Itanium (there’s already an Itanium 2 on the way). AMD is coming out with a chip called the Hammer. These are 64-bit CPU’s. That means their registers and data paths are twice as wide, and that pointers (and thus pointer-heavy programs) take up twice the memory. It does not mean that the chips (at comparable clock speeds) are faster. They require everything to be recompiled. They require specialized operating systems. And since the designers want to keep the chips themselves relatively simple, optimizations have to be done on the compiler end. One person at QuakeCon 2002 asked Carmack if DOOM 3 would be available for 64-bit processors and was surprised to learn that the answer was no, and that the reason was that a 64-bit CPU would actually make the game slower (since at this point DOOM 3’s graphics are more a function of video cards than CPU speed). The only real advantages a 64-bit processor might have are in the server arena.
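If you want to see the pointer-size point for yourself, here’s a minimal sketch in Python using the standard ctypes module (the linked-list structure is purely hypothetical, just for illustration): on a 64-bit build the pointer field doubles in width, so the toy node roughly doubles in size without getting any faster.

```python
import ctypes

# Pointer width on the machine running this: 4 bytes on a 32-bit build
# of Python, 8 bytes on a 64-bit build.
pointer_bytes = ctypes.sizeof(ctypes.c_void_p)
print(f"Pointers here are {pointer_bytes} bytes ({pointer_bytes * 8}-bit)")

# A toy pointer-heavy structure: on a typical 32-bit build this is
# 8 bytes (4 + 4); on a 64-bit build it grows to 16 (4 + padding + 8).
class ListNode(ctypes.Structure):
    _fields_ = [("value", ctypes.c_int32),
                ("next_node", ctypes.c_void_p)]

print(f"A ListNode is {ctypes.sizeof(ListNode)} bytes on this build")
```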

Then there’s the version number conundrum. The public believes that a higher version number means better. There is a little bit of logic to this – experience brings something to the table. The latest consumer OS should be better than the previous one. Of course, history is full of exceptions – Windows ME was by all accounts a disaster compared to Windows 98, and like we’ve mentioned, in production environments people aren’t always itching to upgrade to the latest version of anything (an extreme example of which is why I have a job programming COBOL on a 1985 mainframe).

However, people have played on this notion for some time now. AOL runs lavish ads to point out new versions of its client software – the public seems not to mind that it took them eight major versions to get certain things working. One WWII flying game (whose name eludes me) came out with numerous patches within its first six months, culminating in a Version 5.0, which they then advertised on the new boxed copies of the game. Why a consumer would be impressed by a game that needed five major revisions in its first six months befuddles me.

Microsoft contracted to provide PC-DOS to IBM back in the 1980’s. They placed in their contract that they could also sell it as MS-DOS. When the arrangement expired, Microsoft released MS-DOS 6.0, but did not deliver a PC-DOS 6.0. Instead, IBM took PC-DOS and made their own improvements to it and released it as PC-DOS 6.1, skipping 6.0. Microsoft then released MS-DOS 6.2 and 6.22, followed by IBM’s PC-DOS 6.3. These guys were trying to one-up each other to play with the public’s notion that the higher version number equals better. It was all for nought, though, since PC-DOS lost (and IBM killed it in favor of the similarly ill-fated OS/2 2.0 project).

Microsoft Word was released as Word for Windows 1.0 and then Word for Windows 2.0. The next version, however, was named Word for Windows 6.0, skipping 3.0, 4.0 and 5.0. The official line was that this was to match the version number of the DOS version, but some say it was to make their version number as high as or higher than WordPerfect’s, the then market leader. Of course WordPerfect debuted on Windows at 5.0 or 5.1, to match their own DOS version, despite debuting the 5.0/5.1 equivalent on other platforms (Mac, OS/2) as 1.0. At one point in time a version of Microsoft Office would ship with different version numbers for each program (Word 2.0, Excel 5.0, etc.), but Office 95 had all products at 7.0 – not that it mattered, since by that point it was Word 95, Excel 95, etc. Office XP is 10.0, and Office.NET (or whatever they call it) is 11.0.

Microsoft’s development products aren’t immune, either. Microsoft’s Visual InterDev was 1.0 when it shipped with Visual Studio 97 (I think), but when Visual Studio 6.0 was released, InterDev was bumped up to 6.0 to match the other products. InterDev has pretty much been dissolved into the ASP.NET handling features of Visual Studio.NET.

And the year numbering scheme isn’t impervious to the public’s notions either. Windows 95 was released, and so was Office 95, which required Windows 95. Makes sense. Microsoft then released Office 97, so the joke was that if someone said they were running “Windows 97” they didn’t know the difference between an OS and an office suite. So you can imagine how difficult it was for people to get that even though there was a Windows 2000 and an Office 2000, they didn’t have to have Windows 2000 to run Office 2000. I wonder how many sales MS missed out on with that. Didn’t matter – they did the same thing with Office and Windows XP, and since XP was aimed at the average user (whereas 2000 wasn’t), they don’t mind the association one bit.

The one place the year numbering scheme makes the best sense is money management software. Every year we see a new version of Microsoft Money and Quicken, the same way we see a new year model of Ford Explorer every year. Only some people buy into the notion (myself, a little) that you should buy the new software every year – it’s more like a donation for your continued convenience and financial health. Plus the rebates are usually pretty healthy.

Anywho, another day, another couple of pieces of misinformation dispelled (I hope).

I read a statistic the other day – apparently in the early 1990’s, cassette tapes accounted for 66% of total music sales – today it’s less than 4%. Personally, I’m wondering what is up with those 4% of people. Who is it that hasn’t “upgraded” to CD yet? Some people might still have cassette decks in their cars, but how many people don’t own a boom box that can copy a CD to a 99¢ blank tape?

I actually pondered this a few weeks back when I went into a record store in the mall. For some reason it just hit me that I didn’t even know if they bothered to carry cassettes anymore, so I turned around. Yup, they still do. They only take up one portion of one wall and their availability is spotty, but you can still get cassettes of the latest albums.

Part of me misses the cassette – before I went CD in 1991, I had to buy these things as vinyl became scarce. The cover art for albums is square, so either only a portion of the cassette cover is used for the art, or it has to be recomposed to fit the cassette’s rectangular shape.

One of the things cassettes had going for them was the idea that they were more durable. True, throw a cassette on the pavement and the odds of it being playable afterwards are better than a CD landing face-up. However, play a CD 10,000 times and it will sound the same every time (hardware notwithstanding). But the thing that always got me was this – the CD is a piece of aluminum and plastic, and it costs more than the cassette, which takes longer and is more expensive to make (it even has moving parts). Originally people had no problem paying more for CD’s – new technology and all. But CD’s never got cheaper (they got more expensive), and so the original premise, that CD’s would go down in price after certain R&D costs had been recouped, never materialized.

I find it interesting that vinyl is still around, sort of. I don’t know what record labels still make vinyl copies of new albums, but for even the artists that can, not all of them bother. What I do see a lot of is artists coming out with 12″ singles – full sized vinyl records with a song and maybe a few remixes of that song. DJ’s use them. I don’t pretend to understand the whole notion of DJ’ing, but other than the mere concept and looks of spinning a record, I’m not sure if there is any real advantage over just cueing up a CD.

Now I see that BMG, a mega conglomeration of record labels, is unveiling CD copy protection on all discs sold in Europe soon (Europeans are apparently less vocal about rights). If it goes well they’ll do it here (USA), too. They claim that the CD’s are Red Book compliant, meaning that if your player doesn’t play them, you need to get a new player that can. This would be like tire manufacturers all deciding to make new tires a certain way and telling everyone whose cars they no longer fit to buy a new car – not their problem.

The irony is that the CD is destined to go away as well. Picture if you could just buy your music as MP3’s. No more manufacturing costs, no more shipping costs, no more retailer middle man. People could buy their music online, not share it (if it’s cheap enough people will buy it), and the record companies could stand to make more money. Dvorak had a column on this.

Funny that I worry about formats and it’s all destined to turn into air eventually.

After spending a weekend with some sick in-laws the inevitable has happened – I’m starting to feel sick. Last night as I was headed to bed I started to feel that sinus drainage pressure crap that is always a precursor to this sort of thing. So, to combat it, I took some NyQuil (or rather the Albertson’s generic equivalent). This morning I felt like utter crap but, seeing as how I’m already taking a few days off this week I decided to go ahead and try to go to work. Right before I left I took the day pills from Comtrex and some DayQuil (generic again). I’ve also been sucking on Halls Defense all day. Seeing as how the drugs appear to be helping and the NyQuil never completely wore off, I’m somewhere between “stewed” and “stoned” right now.

Last night I fixed some ViaTexas code that I wrote at the end of a long day, so when I looked at it I thought, “what was I thinking when I wrote this?”. I wonder if anything I do today I’ll look back upon as “I wonder how drugged I was when I wrote that?”

A little over a year ago I lamented the death of the Atari Jaguar, then five years gone. One of the things that I always found interesting was the lack of an Atari Jaguar emulator. I knew for a fact that there were ones being worked on (at one point in time it was somewhat fashionable, it seemed, to announce you were working on one), but no progress was being made. Jagulator could render a couple of frames from Flashback and homebrew demos, and there was maybe one more that kinda emulated one of the CPU’s but didn’t do anything. Jagulator was even the product of the guy who did UltraHLE, the first working Nintendo 64 emulator. It seemed as if the Jaguar and a few other consoles from that era (the 3DO comes to mind) were in an odd rut – extremely old consoles (NES, Genesis) were pretty much sprite-pushers, no real 3-D to speak of. They can be handled in software easily. More modern consoles bear more resemblance to modern PC hardware (especially the video cards), so N64 and PSX emulation have come around. But the Jaguar wasn’t really 2-D or 3-D and it was pretty much like nothing else, so it was stuck, so to speak.

But then last week Project Tempest was released, the first working Jaguar emulator. After some minor revisions, Version 0.2 now plays Tempest 2000 with more or less perfect graphics. It’s Windows-only, there is no sound, and compatibility is spotty, but if you’re like me and you mainly just want to play Tempest 2000 again, this is good stuff indeed. Check it out.

In that vein, emulation has gotten somewhat boring to me. It used to be that we had less-than-perfect NES emulators and SNES games that wouldn’t run. Nowadays pretty much everything has been hammered out. These days we have badass PSX and N64 emulators. I’m personally disinterested in emulators for the PS2 or other newer consoles, though I know eventually they’ll be done. I’m somewhat shocked no one has come out with an Xbox emulator, given its similarities to the PC. I figured for sure someone would have come out with something by this point. Hell, the Dreamcast will be an impossibility since the GD-ROM is incompatible with the PC.

The one thing that was interesting to me for a while was the Dreamcast emulating other platforms. Now of course we have a nearly perfect NES emulator, plus spot-on entries for other consoles like the 2600, the Intellivision and the Genesis. Bleem came out with 3 discs to emulate PSX titles and that’s pretty much the extent of PSX emulation on the DC. The lone SNES emulator alive for the DC has pretty much topped out (and was never full speed), so DC emulation has gone stale for me.

So that leaves one thing in emulation that’s interesting to me – unemulated platforms. We still don’t have a good Virtual Boy emulator (the best one is a DOS-based emu that hasn’t been updated in over two years, since the author got laid off and lost interest). We need a finished Jaguar emulator (easy enough – only about 50 games total). And there’s probably some more I haven’t thought of.

So in any case go home and get your Tempest 2000 on.

So here I sit at a standstill – I’m waiting on about fifty things before I can continue here at work, so here’s another longish post. BTW, I think the post Moe was referring to was either this one or this one.

There’s a game developer named Acclaim, and they’re a somewhat uneven one. Sometimes they do good games (the first two Nintendo 64 Turok games come to mind), but more often than not, their games aren’t very good. They’re especially notorious for the “quick cash-in licensed game”. Every South Park game has been terrible and Acclaim did them all – Comedy Central, who owns South Park outright, just told Acclaim to come out with some games, and quickly. South Park’s creators publicly criticized the games – didn’t matter, they sold well anyway. Acclaim is also the maker of any bad Batman game you ever rented by accident.

Recently Acclaim has embarked on some rather questionable advertising practices. For all the criticism of the American audience, most of these stunts were pulled in the UK, where people are supposedly even harder to impress. For the game Shadow Man 2, Acclaim bought advertising space – on headstones in cemeteries. To promote Turok: Evolution they offered $1,000 to the first five people in the UK willing to change their names to “Turok”. The hundreds that came in late probably weren’t too thrilled. For their game Burnout 2: Point of Impact, they offered to pay the speeding tickets of anyone who was on their way to buy the game. I wonder – if you speed on your way to get Burnout 2 and die, will they advertise the game on your tombstone?

Of course, what they really have here is the cheapest (and some would say lowest) form of advertising possible – press coverage. I would guess no one at a cemetery is interested in learning about a video game where you kill other people, but the mere mention in newspapers (along with a mockup of what a tombstone would look like with the ad on it) got Acclaim tons of free publicity. I presume Acclaim never actually took out the ads on tombstones – whom would they even pay for that? Same thing for the other promotions – there’s no such thing as bad publicity, don’t ya know? I view Acclaim as the Troma of the game industry – more known for their hype and bad product than their actual contributions.

But now they’ve got a game coming out called BMX XXX. When the Tony Hawk’s Pro Skater clones started coming out from all over, Acclaim came out with a game called Dave Mirra’s BMX, which was a serviceable but hardly revolutionary game. For the sequel they dropped Dave Mirra (who dropped whom is in dispute) and decided to go for a more “adult” feel – namely nudity and profanity. I’m not completely sure how it works (if you have to unlock the boobs or what) but it appears this game will have more in common with a Motley Crue video than a sports game. Much the same way PC Accelerator decided that “titties sell” (a philosophy other magazines, like Total Movie, still adhere to), Acclaim is going for that mature audience that’s still a bit juvenile.

My first reaction was this: this cements the notion that Acclaim is pretty much a turd in the game industry pool. My second reaction was this: this stupid game might actually sell. Worse than that, it might spawn off a long list of copycats. We’re just now seeing the end of the trashy tunnel that was the Deer Hunter legacy, and now this.

However, despite the fact that the game will carry an “M” rating, many chains, including Target and Wal-Mart, refuse to carry the game. Best Buy is carrying a cut-down version of the game. The part of me worried about the game selling well and creating a trend was allayed. But then I started to wonder – how is this game worse than a violent game? Or a violent movie? Or even a pornographic movie? Best Buy sells lots of violent games, violent movies, and even dirty movies (not that I know anything about that). I don’t think Wal-Mart or Target carry “dirty” movies, but they do sell violent movies and games, plus they already have systems in place to keep “M” games and “R” movies from being sold to minors. Hell, Wal-Mart even sells guns and ammo.

And the “dirty game” isn’t unprecedented, either. No sooner had pornography made its way to the Internet than people started making pornographic games. The only place I can think of where I’ve seen them for sale is at Hastings (who, to their credit, doesn’t seem to discriminate at all). The ESRB system even has an oft-unused “AO” (Adults Only) rating. Some 21 games have used this over the years, most from one particular company trying to make their works seem more legitimate – most games worthy of an AO rating don’t bother with ratings. Story goes that the game Phantasmagoria 2: A Puzzle of Flesh, a game from the FMV-fad era featuring frontal nudity, was set to be the first game from a major publisher to garner an “AO” rating, but then Sierra decided to go for the RSAC rating, which used 3 “meters” for sex, violence and language. Bizarrely, subsequent re-releases of the game label it as “M”.

Now Acclaim is crying to the press that they’re being unfairly discriminated against. My gut reaction was: this needs an “asinine” tag from FARK. I mean come on, you made a game that was extreme for the sake of being extreme, and now you’re complaining that you ventured into uncharted territory and got hurt. Of course, that was a trap – they want people to think that. They want the media to cover the story. They want the media to cover the story that is Acclaim complaining about the story. And the game is going to sell. Like the out-of-the-way theater that shows the movie you want to see, people will now go out of their way to get this game.

But then I got to thinking – what if all of this is real? I mean, what if Acclaim wanted some controversy and clamor over it, but didn’t realize what they were getting themselves into? Not being carried by Wal-Mart is often seen as a death knell. Wal-Mart is so big that these are the guys who demanded and got smaller game boxes. And these are the stores they make “non-stickered” versions of albums for. Since Best Buy is carrying the pared-down BMX XXX, obviously such a version exists, but Wal-Mart, Target, et al. declined even that.

So how is this game worse? I mean, we have all these systems in place – the rating system, the enforced ages, the precedents in the other products in the stores. How is BMX XXX worse than an R-rated movie? I mean, as far as I know there’s no death or actual sex, just some swearing and some naked bikers. Perhaps it’s because of the “XXX” in the title – if you don’t carry “real” pornography then you could make the point that it falls into that category. But then again we’re about to have a movie called xXx hit DVD, so there goes that argument. Perhaps they want to keep it out of the hands of children, but they do that already. Perhaps they don’t want the bad publicity, but then again people protest Wal-Mart for selling (or not selling) abortion pills, guns and bad maps, so they’re used to controversy. Plus when you have as many stores as they do, someone slips and falls in your store and sues you several times a day. I can’t believe that they’re scared of one video game.

The bottom line in all of this is that Acclaim ventured into uncharted territory and not only looks to be getting a bit stung, but now stands to become something more than a game developer. My gut reaction was “they shouldn’t be able to do that!”. I have had similar opinions of other games (Postal 2 and the Hitman series come to mind). I figured for sure I’d feel the same about Grand Theft Auto 3, but not only was it a good game, it wasn’t any worse than the Godfather series or Payback, violent movies where you root for the bad guys. I realize I’m no better than the people who say that all violent video games should be banned, or those people who want art to be censored. Acclaim is going to wind up a free speech martyr. I’m not sure if this means we’ll look on them more or less fondly, but then again we made a movie about the struggles of Larry Flynt.

But then again, Luther Campbell was an exonerated free speech martyr, and that doesn’t mean the 2 Live Crew was any good.

Since the ever present Moe appears to like my wordiness (the word “genius” was actually used) I guess I had better keep it up with the longish posts and frequency, lest my link appear italicized.

Before I dig in, here’s my Tip of the Day: what I’ve found is 10x better than using blogger.com or the blogger tool in Mozilla (which, it turns out, sucks ass) is w.bloggar. Somewhere between an HTML editor and a word processor, it’s what I use these days to make the posts. I realize of course this kills the point – the idea behind Blogger is that it doesn’t tie you to one platform, and here I go, tying myself to a platform (Windows) – but I don’t care. It’s not like you have to go 100% one way or the other. I want productivity; I don’t care about the principles (of the Application Service Provider, in this case).

Right, so there’s this guy in the Maryland/Virginia/D.C. area and he’s killing people. He’s clearly a sniper. He kills with one shot. He hasn’t killed anyone since Friday, but he’s been on a spree of sorts. If he keeps going they’ll catch him. If they catch him he’s gonna be executed. Of course the media is going nuts with this and I can’t say I blame them – this story has actually taken Iraq off the front pages. The WTC I could deal with – perhaps I’m naive but I don’t see myself being in any buildings important enough to blow up. Anthrax I can also deal with – I don’t think I’m important enough to get it sent to me and I don’t handle important people’s mail. But these people are random, unconnected people pumping their gas. I think if I were in Maryland I’d just stay indoors, or at least I wouldn’t get gas for a while.

But then our local NBC affiliate, whose news staff is full of incompetent idiots, goes on to run a story about video and computer games where you can be a sniper. Suffice it to say, this got me steamed, but I couldn’t help but laugh. This was clearly a story quickly pasted together to cash in (so to speak) on recent events and, akin to the quick turnaround on the notion that the 9/11 terrorists used Flight Simulator to train, was in very poor taste.

Now I’ll be the first to admit that I’m hardly an unbiased source, but the way this story was assembled was just asinine. They frequently used the word “children” to describe the target audience. They (of course) alleged that these games can be used to properly train would-be snipers. Then they go and interview the 17-year-old jerkoff working at the game store in the Waco mall – he didn’t help the cause at all. And I love how the images they used for the little intro graphic were from Wolfenstein 3-D (the original) and DOOM – very old games, in other words (probably the same graphic they used when Columbine hit). And they presented it as damning that they contacted game makers (publishers and developers, I would assume) but never received a call back. It left you with the notion (if you didn’t know any better, and I would guess that the majority of the people watching the news at 10:00 on Friday night wouldn’t) that these games were designed to be “ultra-realistic” so that children would play them and then become an army of assassins.

So, let’s address this one piece at a time, shall we?

Myth #1: Video and Computer Games are realistic. Here’s the skinny on games: on the whole they are not realistic. Write this down: Video and Computer games are the antithesis of reality and realism. There are some exceptions to this – flight simulators strive to be realistic for training purposes and some games (the Sim series comes to mind) strive to become models of reality, but ultimately the games anyone cares about in this regard are about as far from the truth as possible.

Witness the physics of it all. In most FPS titles your character can run forever, never needing to stop, never needing to catch his/her breath, never once collapsing from exhaustion. The Quake series of games gave birth to “rocket jumping” – jumping whilst firing a rocket into the ground to make you jump higher – try this in real life and you’ll blow your legs off. Most games are designed so that the character has to take multiple bullet hits to deplete their life energy – how many people do you know who can take multiple bullets to the chest?

History is full of games that have been designed to be as realistic as possible, only to have the code ripped out when it kept the game from being fun. Imagine if Soul Calibur were realistic – one swipe from any of these weapons would easily kill the combatants – what’s the fun in that? Racing games have been divided into two categories, “simulation” and “arcade”. The simulation game strives for realistic physics, whereas the arcade-style game amounts to “make car go fast now”. The arcade games are much more popular – people say they want a realistic car game, but they don’t, because real cars suck. What people want to do is drive as fast as possible with little real consequence.

And that’s the thing – people say they want realism but they really want anything but. This past fall there were two WW2-themed FPS games on the market, Return to Castle Wolfenstein and Medal of Honor: Allied Assault. The Spielberg equivalencies say that RtCW is more like the Indiana Jones series (more concerned with the weird things about Hitler – the obsession with the occult and armies of the undead) whereas MoH:AA is more like Saving Private Ryan. As a result, people were stoked about a PC game that let you play Saving Private Ryan. The developers promised an unprecedented level of realism, and to a large degree they delivered. However, many gamers were perturbed with the results. The one thing that sticks out in my mind is the blood. When you shot someone in RtCW, blood spewed on the walls and such, much as you might expect. However, in MoH:AA, “gray smoke” came from the shot soldiers. “Unrealistic!” gamers cried. However, as it turns out, MoH:AA was in fact more realistic than RtCW. Research and combat experience have shown that when a person clad in a thick uniform (especially WW2-era issue) gets shot, the fabric of the uniform disintegrates into a mist (the aforementioned “gray smoke”), and the blood doesn’t spew out – often it doesn’t even make it through the uniform. The notion of blood spewing forth came from Hollywood.

Another thing people say they want is “realistic A.I.” (so that they think they’re playing against real people in a single-player game). A.I. is definitely something that can be done very well or very poorly, but what it really boils down to is “artificial stupidity”. The game always knows where you are, always knows where you’re headed, and could always kill you – and that wouldn’t be much fun. Take PONG for example – the game always knows where the ball is headed and could always hit it. So then how do you win PONG? Well, in versions of the game featuring A.I. (the original 1972 arcade machine required two players – there was no A.I.), the game plays “stupid”, and higher difficulty levels just play “less stupid”. Same thing goes for Tic-Tac-Toe, but I’ll save that for another diatribe.
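To make the “artificial stupidity” idea concrete, here’s a minimal sketch in Python (the function, numbers and units are all hypothetical – this isn’t lifted from any real game): the paddle A.I. always knows exactly where the ball is, and the difficulty setting only controls how often it bothers to react and how sloppily it aims.

```python
import random

def ai_paddle_target(ball_y: float, paddle_y: float, difficulty: float) -> float:
    """Pick where a PONG-style AI paddle should move this frame.

    difficulty runs from 0.0 (very "stupid") to 1.0 (near-perfect).
    The AI has perfect information; it simply chooses to use it badly
    so that a human stands a chance.
    """
    # Lower difficulty means the AI frequently "fails to notice" the ball.
    if random.random() > 0.3 + 0.7 * difficulty:
        return paddle_y  # stay put this frame

    # Aim at the ball, but with a deliberate error that shrinks as
    # difficulty rises (units are arbitrary screen pixels).
    max_error = 40.0 * (1.0 - difficulty)
    return ball_y + random.uniform(-max_error, max_error)

# An easy opponent vs. a hard one, both tracking a ball at y = 100:
print(ai_paddle_target(ball_y=100.0, paddle_y=60.0, difficulty=0.2))
print(ai_paddle_target(ball_y=100.0, paddle_y=60.0, difficulty=0.9))
```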

Myth #2: Video and Computer Games can turn someone into an efficient killing machine. This has always irked me the most, especially since most of the time I see it in a news story on a major network – inevitably they point the finger at games and movies, never at the violent shows on their own network (though often at the violent shows on other networks). Lt. Col. David Grossman has made a career out of this allegation – of course, he also has a book to sell on the topic, so he’s hardly an unbiased source, either.

So let’s nip this one in the bud, shall we? First, from a physical standpoint. Games are largely controlled with either a mouse/keyboard combo or with a joystick controller. True, arcades do have “gun games” and there are home gun accessories, but few people own them. In particular, the Columbine killers were known to play DOOM, which has no need of gun accessories. However, ask anyone who has fired both a real gun and guns in games and they’ll tell you the two couldn’t be less alike. As for this sniper, remember he kills with one bullet, has been successful in killing most of the time, and has hit his target every time. If you’re implying that someone who has solely played Counter-Strike or Soldier of Fortune could then pick up a sniper rifle and do the same thing previously done only with a mouse and keyboard, you should realize how ludicrous that notion is before the thought finishes forming in your head. The person involved in the Maryland shootings has had training, possibly professional (e.g., he was a shooter for the Army or police). Plus we all know who this guy is going to be once they catch him – some 35-year-old nut, white, single, with a grudge against the government, possibly mentally unbalanced from a war.

The most important thing to realize about the difference between controlling a game and a real weapon is that in the game you’re relying solely on the precision of the controller. With a real weapon, you’re relying on your own precision – and unless you’ve had training, your precision sucks. I know – years ago I qualified on an M-16 in the Corps. I pointed the thing perfectly, but it was pure luck that most of the bullets hit the target, and that was on the third or fourth try. I sure as hell didn’t get any of them on the bullseye.

And as for the argument that “gun games” make for better killers – that argument, though harder to refute, still doesn’t hold up, since it doesn’t take physics into account. Real weapons have recoil – the fake ones don’t. Also, in a gun game you only have to hit the “general area”. When you fire the zapper in Duck Hunt (for example), the screen turns black, except for where the duck was, which turns white. If your gun sees the white square, you “hit” the duck. I’m sure the principle behind more recent games is more advanced, but the next time you see people playing Area 51 at your local arcade, watch how many times they simply press the gun to the screen to hit something – not exactly training.
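In case the light-gun trick isn’t obvious, here’s a minimal Python sketch of the idea (the names, coordinates and helper functions are hypothetical – the real NES hardware does this with a photodiode and timed video frames, not anything this tidy): pull the trigger, draw a black frame as a sanity check, then draw one white box per target, and whichever frame the sensor reads as bright is the target you “hit”.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TargetBox:
    x: int
    y: int
    w: int
    h: int

def sensor_sees_light(aim_x: int, aim_y: int, box: Optional[TargetBox]) -> bool:
    """Stand-in for the gun's light sensor: True only if the gun is aimed
    inside the single white box drawn this frame (None means an all-black
    frame, and we assume no stray light sources in the room)."""
    if box is None:
        return False
    return box.x <= aim_x < box.x + box.w and box.y <= aim_y < box.y + box.h

def resolve_trigger_pull(aim_x: int, aim_y: int, targets: List[TargetBox]) -> Optional[int]:
    """Return the index of the target that was 'hit', or None for a miss.

    Frame 1: draw an all-black screen; if the sensor still reports light,
    the player is pointing at a lamp or a window, so throw the shot out.
    Frames 2..n: draw one white box per target; the first bright frame
    tells us which target the player hit.
    """
    if sensor_sees_light(aim_x, aim_y, None):   # black calibration frame
        return None
    for index, box in enumerate(targets):       # one white box per frame
        if sensor_sees_light(aim_x, aim_y, box):
            return index
    return None

# Example: two ducks on screen, player aiming somewhere near the second one.
ducks = [TargetBox(50, 60, 32, 32), TargetBox(110, 70, 32, 32)]
print(resolve_trigger_pull(120, 80, ducks))     # -> 1
```

Note how forgiving the white box is – hitting the “general area” is all the hardware can even measure.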

Myth #3: Video and Computer Games are aimed at children, and children can buy them in any store. Alright, let’s get this one out of the way: not all games are meant for children, for the same reason not all movies are meant for children. You wouldn’t take, say, Aliens and claim that it was meant to cater to children, so why allege that Soldier of Fortune is aimed at children?

I can see where this notion comes from – at one point in time games were for children. Not too many 40-year-olds picked up Super Mario Bros. Of course those children have grown up and, as a result, they want their games to grow up as well. Even Nintendo has finally seen this – “M” rated games are on the GameCube in large numbers. Much like there are still cartoons for children, there are ones for adults as well (Akira, South Park). There are comic books for children and comic books aimed at adults as well (I’m not sure, but I think the adult comic books outnumber the children’s ones at this point). Just because a medium was made for an age group doesn’t mean it has to stay that way.

We already have a system in place for this – the ratings system. Here’s how you can tell if a game is meant for adults – it’s got an “M” rating for Mature. Think of “M” as “R” in movies. A “T” (Teen) rating is “PG-13” and an “E” (Everyone) rating is akin to “G” or “PG” (there’s also an “EC” (Early Childhood) rating that’s more akin to a “G” rating). Today, most retailers worth their salt do not sell “M” rated games to anyone under 17. The major retailers (Target, Wal-Mart, etc.) do this as part of their corporate policy (they do the same thing with movies). I’m not sure what Babbage’s/Etc. does now, but even when I worked there we wouldn’t sell certain games (the Grand Theft Auto or Kingpin games, for example) to children.

I agree that “M” rated games should not be sold to children. I’m loath to have a law on this sort of thing, though, since a unified retailer front would have the same overall effect (witness the number of people who think that the “R” rating is enforced by law – it isn’t).

Parents – go through your children’s room (hint – if they still live in your house and are under 18 they’re children) and take away any game with an “M” on it. If you’re not sure, look it up, but most game discs have the ratings on them. You’re not invading their privacy or stealing from them – you own their room (and the house it’s in) and the games were bought with your money anyway. Unless you can be convinced that the game is harmless or you trust your kids, don’t let them have the game. In fact, have this policy mirror whatever your R-rated movie policy is.

Simply put, games don’t have the impact we think they do. Game makers would like their games to move people in ways that movies do and I’m sure that some day they will – but most of them don’t right now. The kid that killed himself after being robbed in EverQuest was mentally depressed and suicidal to begin with. The Columbine killers were already screwed up in the head from being bullied, and the six year old who killed a fellow first grader quite literally lived in a crack house.

I don’t claim to know the answers to why someone in a white van gets his jollies off sniping people, or why people become violent. What I do see a lot of is complete misinformation about the game industry from people on the outside. The three myths above are the most annoying ones and while I know that, short of a handful of people, no one will read this, I just figured I’d get this all out of my head.

I’ve always found the music compilation an interesting animal. Sometimes an artist releases a compilation at the height of their career, other times at the low point. Sometimes it’s released by the label, other times the artist wants it. Depending on the form, frequency of release, time frame and relevance of the artist, the mere release of a compilation may say something about the state of the artist.

It’s easy to see why, at the very least, record labels like compilations. For the longest time, Michael Jackson’s Thriller was the #1 selling album of all time – now it’s the Eagles’ Greatest Hits. The production costs are easy – you don’t have to pay $85 an hour while the artist tries to make the next Appetite for Destruction; most if not all of the material has already been recorded. Plus, as we’ve seen, the payoffs can be very long term. In many respects, it can also be a rehash of the “one good song” syndrome – most of the people who bought the Eagles’ album probably did it for “Hotel California”.

It’s also easy to see why customers like the compilation, too. In many cases you want a certain number of hit songs by the artist in question, but feel that buying all of their albums would be expensive and counter-productive. Plus it’s cheaper for the record company in question to make one CD of select songs instead of making more “remastered” copies of older albums that may never sell. Sometimes when an artist becomes famous in America, the rest of the world gets a compilation album instead of all their old CD’s.

Recently the trend has been the “crash course” compilation. There have probably been more Elvis compilations than actual Elvis songs, but a new compilation, Elv1s: 30 #1 Hits, is still selling like hotcakes. Apparently that many people in 2002 want a crash course in Elvis (plus it has the new dance mix of “A Little Less Conversation”). This leads into the next trend – the 2-CD set. It used to be that a greatest hits compilation had to fit on an LP (which is why the Eagles’ release runs under 50 minutes), but the advent of the CD meant that most double LPs could fit on a single disc. The next step was 2 CD’s. As is already the trend with DVD, people tend to think that a single-disc compilation is not as good.

Another trend is the unprecedented cooperation of record labels. This may be due to better lawyers, or perhaps the advent of MP3. The new Rolling Stones compilation, Forty Licks, is hardly the most comprehensive compilation the Stones have ever released, but the main significance (other than the fact that the number of songs, 40, matches the number of years the group has been together) is that it’s the first compilation ever to span the Stones’ entire career. In the early 1970’s the Rolling Stones started their own record label and owned their catalog outright after that point. The problem is, some of their best material is still owned by ABKCO Records. ABKCO and Virgin (their current label) finally got together to do this release. Aerosmith went from Columbia to Geffen and back to Columbia. Any compilation they’ve ever done (including the Pandora’s Box boxed set) has only ever concentrated on one “section” of their career. In late 2001, Geffen released Young Lust, a compilation of their Geffen years (oddly reminiscent of Big Ones, a previous compilation). Geffen had every right to do it, but Aerosmith had every right to bitch about it publicly. They decided to do their own “proper” compilation, Oh Yeah! Ultimate Aerosmith Hits, which would have left a nice gap in the Geffen years. Geffen finally came through at the last minute and so Aerosmith was able to do the compilation they wanted, complete with Geffen songs.

Some artists are guilty of too many compilations. KISS is probably the most guilty. The number of compilations they’ve released over the years is staggering. They just released another one last month, the single-disc The Very Best of KISS, in the wake of the announcement that they wouldn’t be breaking up after all. Remember that less than a year ago we got the KISS boxed set. You may ask, “what’s the harm in releasing so many compilations?” The answer is in the exclusive content. Every artist (especially every very popular artist) has a legion of devout fans. You ever see an artist release an album in multiple limited edition covers and think “surely no one buys them all”? The devout fans do. Fans of rap group Insane Clown Posse have been known to buy multiple copies of every album, one of each type of packaging and then another to listen to – the others they leave sealed. ICP is one of the most proactively collectible groups, and entire websites have been set up to discuss minute variations in packaging and place rarity values on them. In any event, every compilation these days has exclusive content – a new song or two, an outtake, a kickass live performance, you name it. The devout fans must have these things, and despite the fact that they already own 90% of the songs on the album, they still plop down the full wad for the CD. Do it enough times, though, and you start to feel backlash – a number of formerly devout KISS fans refused to buy the boxed set no matter what was on it.

Sometimes a compilation album is a cry for help, a gasp to see if anyone still cares. Tom Petty was seeing his popularity decline in the early 90’s, so he and the Heartbreakers released their Greatest Hits album and it sold tons, fueled by the hit single (and black comedy video) “Mary Jane’s Last Dance”. Plus, it worked – Petty’s solo follow-up, Wildflowers, also sold a ton. However, his popularity declined again and 2000’s 2-CD Anthology didn’t move any discs. He just released a new album, The Last DJ, so we’ll see how things pan out for him. Mariah Carey has had more #1 hits than any other female artist in history, and a 1999 compilation flaunted this. Then she signed a $100 million contract, but her first movie and its album, Glitter, flopped miserably, so her label, before giving her a golden parachute and a swift kick, came out with a second, 2-CD set in 2001, just two years after the last one. A sadder case is Motley Crue. Their album Dr. Feelgood was (and still is) the best selling album they’ve ever done. However, their greatest hits compilation, Decade of Decadence, sold poorly. Then, to make things worse, they fired Vince Neil, their lead singer, and replaced him. It didn’t work, so they fired the new lead singer, re-hired Neil, and recorded a new album, Generation Swine. It didn’t sell – they took a risk on the new singer and the public forgot about them. So, they quickly assembled a new greatest hits album, Greatest Hits. One of the hits was the new song from their previous hits release. It also didn’t sell. Motley Crue’s Behind the Music special is still in heavy rotation, due to their decadent lifestyle and the fact that no one can seem to get tired of hearing Tommy Lee talk about boinking Pamela Anderson. Too bad Lee left to become a rapper.

And then there’s the sad example of false advertising. For example, the album Wang Chung’s Greatest Hits does in fact have the hit “(Everybody) Have Fun Tonight”, but to say that Wang Chung ever had another hit is just a lie. This is why some artists forego the “Greatest Hits” title and instead go for the “Best Of” name – this way it’s a compilation of their “best” stuff, not necessarily the things that were “hits”. This logic has its place – many of Elvis’ best songs were never #1 hits, or never even charted, so they were left off of this latest compilation. However, some of his cheesiest songs were #1 hits, so they’re included. Given that the American public makes N’Sync superstars, they’re not always the best judge.

There’s the boxed set. Since I’ve touched on this before I’ll be brief, but the gist is this – come out with a set of multiple CD’s and place it in a nice, sturdy box with a book(let). It’s more exhaustive than a simple greatest hits compilation can be and it’s still a better deal than buying all the albums (less chaff, more wheat). Some artists can get away with just an extended hits compilation with some new content (which is why the Led Zeppelin boxed set is the best selling of all time); other times rare content is required. Boxed sets are usually significant since labels won’t do them for lesser established artists. For every album that doesn’t sell, they lose profit on one CD – for every copy of the KISS box that doesn’t sell they lose profit on five. Possibly the most overdone set I’ve ever seen has been Tom Petty’s Playback – 6 CD’s: 3 of greatest hits, 3 of rarities and unreleased material. Most diehard Rolling Stones fans can’t sit through six discs…

Finally there’s the posthumous release (which we’ve already touched on with Elvis). Sometimes the releases are pretty legitimate – not too many of them, well constructed, tasteful. However, hit any gas station and see how many Elvis cassettes are next to the checkout counter and you can see what I mean by an exploited catalog (I always find the gospel ones amusing, given that Elvis died of a drug overdose). Sometimes the term isn’t so much “exploited” as it is “overdone and tired”. For a while I found it quite amazing that new Hendrix material was being released some 30+ years after his death; now it’s just downright tired and annoying. Hendrix and 2Pac are two artists who constantly see new material released posthumously, alongside greatest hits albums. I don’t see how in the heck they could have possibly recorded this much. The other end of this spectrum, obviously, is the posthumous release people have to beg for. There is a ton of never-released Nirvana material that Courtney Love has been sitting on. She, along with the other two members of Nirvana, owns the rights to it, and while the former Nirvana members wanted to release a 10th anniversary boxed set for Nevermind, Love nixed that idea. The exact reason why is unknown, but the assumption is that her career is lagging and she’s tired of being known only as the “ex-wife” of Cobain, so she doesn’t want to feed that notion. However, she recently relented and this November will see the release of Nirvana, a single-disc hits compilation with the one new song, “You Know You’re Right”. A boxed set will follow in 2004. This highlights the other trend, set by Alice in Chains – a greatest hits CD as a preface to a boxed set (though KISS went the other way recently).

The only thing that bugs me about this is that, as the music industry becomes more centralized, you’ll see more and more of this. The music industry won’t want to innovate; they’ll just want to repackage the past and find the next sex toy to tempt the high school masses. Ticketmaster has a concert monopoly, Clear Channel owns half of the radio waves, and 25% of the record industry is owned by a wine cooler company. Maybe Tom Petty’s latest album, The Last DJ, a pseudo-concept album ragging on the music industry, is dead on. Or maybe he’s just sore that the industry won’t include him anymore.