Mac Source Ports Progress Report: April 28, 2022

I’ve added a build of jk2mv to the site; it’s a multiplayer client for Jedi Knight II. The OpenJK project had a single player client for JK2 but not one for multiplayer. Daggolin figured out the trick to getting OpenJK working on arm64 and was able to apply similar logic to jk2mv, which I incorporated.

I’ve updated DevilutionX, the source port of Diablo, to the latest code. They had a formal release of 1.4.0, and this build includes that plus some stuff off the main branch. I’m not sure what my long term strategy should be yet; I think at some point I’ll want to make builds based off labeled or tagged commits, but I’m not sure of the best way to do that. As in, for DevilutionX they have a commit tagged as 1.4.0 and then there are 60+ commits after it. My build includes those 60+ commits which, so long as they didn’t break anything, is probably fine. Maybe at some point I’ll have a “stable release” and “latest code” thing happening.

My focus at the moment is to try and get as many games/ports as possible going, with an eye on automation down the line. At some point I need to get releases packaged into .dmg files instead of just zip files of the app bundles. I’m not sure if that would be less likely to trigger the “this file is not commonly downloaded” warning Chrome shows when (I assume) a zip file contains executable code and hasn’t hit some X number of downloads yet.

Someone pointed out that the dhewm3 bundle for DOOM 3 did not feature the code to run the Resurrection of Evil expansion pack. I rectified that and did a new upload.

Iliyas Jorio let me know that the fourth and final Pangea Software game he maintains, Mighty Mike, is now signed and notarized, so I added it to the site as well. I wasn’t sure if I should have listed it as Mighty Mike (Power Pete) or not – it was released in 1995 as a commercial CD-ROM title from Interplay/MacPlay under the Power Pete name, then in 2001 Pangea regained the rights to the game, but not the name, so they re-released it as Mighty Mike as shareware. Maybe I have my timeframes mixed up, but I’m kinda surprised to hear there was still shareware happening in 2001.

And then on something of a lark, despite still trying to get ES3 (Morrowind) happening, I decided to do The Ur-Quan Masters, aka Star Control 2. I’ve never been into this game but it’s one of those titles that’s frequently on lists of the best games ever made, and subsequent attempts at sequels or remakes haven’t really been a hit – sort of a RoboCop effect.

I’m always fascinated by backstories like this – the developer is Toys for Bob which, unlike a lot of developers from the 90’s, is still technically around, though for a while there they were stuck in licensed game hell. In any event, Star Control II was ported to the 3DO, where they used the then-novel CD-ROM technology to add speech and different music to the game.

Apparently the source to the original DOS version is completely lost, but the 3DO port’s code was still available, so they made it open source. Then at some point they released the game’s content as freeware under a Creative Commons license. The one thing they didn’t own and couldn’t release was the Star Control name, which is in some legal limbo clusterfuck that continues to this day, so the project is The Ur-Quan Masters, which was the subtitle of the original game.

Since the game uses very little if any 3D graphics (there’s some level of OpenGL in there but it’s primarily a 2D game) I figured it would come over to Apple Silicon fairly easily. And for the most part it did, with a couple of hitches.

Most of the source ports I work with use either Make or CMake to handle builds, with probably 2/3 or more of those being CMake. I’m still figuring out the best practices there, but I’m coming around to liking CMake projects the most because they’re the closest thing to a uniform setup that consistently works.

There are some outliers – a few projects use nonstandard stuff, like Ninja. The DXX-Rebirth ports used SCons (with its SConstruct files), which I’ve never even accidentally run into anywhere else. The Ur-Quan Masters, however, uses its own homegrown build system. It’s not that bad – it is to some extent a scripted wrapper around Make – and they have another script that builds an app bundle, not entirely successfully, but nothing I can’t work around.

The problem is it has a menu system. You have to select a few things by specifying numbers, and even that’s no big deal except it requires intervention and I can’t automate it. Star Control 2 came out in 1992 and The Ur-Quan Masters started in 2002, so it’s a 20-year-old source project for a 30-year-old game. Maybe they’d do it differently today but a number of today’s practices either didn’t exist or weren’t mature in 2002.

Anyway, without a ton of hassle I had it doing a Universal 2 build and even included the data files so it can be the full game (according to the FAQ on the website, this is permissible). But there was a problem: the sound was fucked up on Apple Silicon. The graphics were fine; the sound was not.

There are two or three libraries included in source code form that appear to be sound-related. My first thought was that perhaps one of these has an updated version that works with Apple Silicon so I could just copy over newer code, but the UQM devs actually left a note to future developers that these were customized versions of those libraries, so that was out – no telling if a new version of the code would work at all.

Then on a Hail Mary hunch I figured – ok, if it’s working on Intel Macs and screwed up on Apple Silicon Macs, could there be some sort of code in these audio libraries that’s being configured for Intel? So I did a search for “x86_64” and found two places doing an #ifdef thing. I added arm64 to the list of architectures being checked and it worked. Essentially there were two places that had to pick a typedef differently for 64-bit machines, and this code predates arm64 (or at least Apple Silicon). The website for the port says the 0.8 version released in late 2021 was the first release in almost a decade.
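To give a sense of what the fix looked like – and this is a paraphrase for illustration, not the actual code from the UQM tree, so the typedef names and the exact macro spellings here are mine – the bundled audio code picks its fixed-size integer typedefs off a list of known 64-bit architectures, and arm64 just wasn’t on the list:

/* Illustrative only: the real file has its own names and a longer list of
 * architectures. The point is that the "4 byte" types are built from int on
 * 64-bit machines (where long is 8 bytes) and from long on 32-bit machines.
 * Without arm64 in the check, Apple Silicon fell into the 32-bit branch and
 * the "4 byte" types quietly became 8 bytes. */
#if defined(__x86_64__) || defined(__arm64__)   /* the arm64 check is the part that had to be added */
typedef signed int    SLONG;    /* 4 bytes, signed */
typedef unsigned int  ULONG;    /* 4 bytes, unsigned */
#else
typedef signed long   SLONG;    /* 4 bytes on a 32-bit system */
typedef unsigned long ULONG;
#endif

Adding the arm64 case (twice, since there were two of these) was the whole fix.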

In that vein, I added a new “Full Game” badge to source ports on the site to indicate releases that include the game data, or at least don’t require you to download or configure anything extra. Looks like right now, with the Pangea titles, the Marathon titles, Star Control II and a few others, there are a dozen full game releases on the site.

Mac Source Ports Status Report: April 16, 2022

So just to post something, here’s what I’m in the midst of with regards to Mac Source Ports.

I have the ioquake3-based game World of Padman building, but it crashes when it connects to most servers. It appears the authors provide their own builds for Mac, but they’re the result of an automated process. Like a number of source port projects, they don’t have someone involved with Mac experience. They’re working on a new version soon, so I may hold off and try again when that ships.

I caught the attention of a Jedi Knight fan site named JKHub, and someone from the site asked me to take a look at jk2mv, which looks to be a port of the multiplayer side of Jedi Knight II. Since I’ve done the other open source Jedi Knight games and also ioquake3, it seems straightforward; I have an Intel build working, but there’s a snag on M1 I’m trying to work out.

I’ve got a list I’m maintaining of the source ports I want to do, and I’m doing them in phases/passes. First pass: anything that builds into an app bundle (so, an “.app”) and that I can get going on Apple Silicon fairly quickly gets done. Second pass: anything that builds and runs on Apple Silicon and is cognizant of the Mac but isn’t doing its own app bundling. Third pass: anything I can get to build and run but that clearly has no Mac-specific considerations. I’ve deviated from this a little bit (System Shock/Shockolate and Rise of the Triad/rottexpr would really be third pass ports) but I’m trying to prioritize. The holdup on those deviations was that they didn’t have any code in them to look for game data anywhere other than something like a subfolder of the executable’s location. It’s not hard to put code in there to look under Application Support, but it’s more work.
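For the curious, “look under Application Support” in practice means something like the sketch below. This is a generic illustration rather than code from any particular port – the org/app names are made up – and it leans on SDL since most of these projects already link against it (SDL_GetPrefPath() resolves to a folder under ~/Library/Application Support on the Mac):

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <SDL.h>

/* Look for a data file next to the executable/bundle first, then fall back
 * to the per-user Application Support location. Returns a malloc'd full
 * path (caller frees), or NULL if the file isn't in either place. */
static char *find_data_file(const char *filename)
{
    char *base = SDL_GetBasePath();                            /* exe/bundle location, ends in '/' */
    char *pref = SDL_GetPrefPath("ExampleOrg", "ExamplePort"); /* Application Support, ends in '/' */
    const char *dirs[2] = { base, pref };
    char *found = NULL;

    for (int i = 0; i < 2 && !found; i++) {
        if (!dirs[i])
            continue;
        size_t len = strlen(dirs[i]) + strlen(filename) + 1;
        char *candidate = malloc(len);
        if (!candidate)
            continue;
        snprintf(candidate, len, "%s%s", dirs[i], filename);
        FILE *f = fopen(candidate, "rb");
        if (f) {
            fclose(f);
            found = candidate;
        } else {
            free(candidate);
        }
    }

    SDL_free(base);
    SDL_free(pref);
    return found;
}

The “more work” part is going through every place the game opens a file and routing it through something like this.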

Back when I first announced the site on Reddit someone asked if I had a Discord. I didn’t but with a few taps on my phone, I did. It was absolutely an afterthought, but it’s turned out to be a surprisingly active thing, we have over eighty members now and some of the more active ones have gone and found more games to port or link to, so shout out to my Discord community.

One thing I was surprised to see had the potential to build out of the box for the Mac was the Serious Engine. Back in 2016, Croteam released the source to Serious Engine 1, which powered the first two games in the Serious Sam series, Serious Sam: The First Encounter and Serious Sam: The Second Encounter. Those games have been remastered/remade multiple times now, but the original files work with this release. I haven’t completely worked through what’s going on yet, but I was able to get it to build; however, the exact arrangement of data files for the two games is a little weird. By default it wants to build the second game, and you can also have it build the first game, but then the first game won’t run without some files that appear to only be part of the second game? It’s weird. But it’s got potential.

The big fish I’m working on right now, and one I’d love to get going on the site, is The Elder Scrolls III: Morrowind, courtesy of the OpenMW project. Morrowind was never open sourced, but these guys have created an engine that can run it anyway. Lots of projects like this get started and very few make it to a playable state; OpenMW is an exception. I know people have had it running on Apple Silicon before, and I have it building, but I can’t get it to run, and out of the box the Intel version through Rosetta has issues (though when you mess with stuff like this enough, sometimes it’s not clear whether there’s an out of the box problem or whether you’ve messed with something and some preferences file hidden somewhere on your system is gumming up the works). Anyway, I’m going to keep picking at it.

At some point I’m going to need to explore the possibility of allowing onto the site either Intel-only games or games that are maintained but not signed and notarized. Obviously the goal would be to get everything up to the state of being Universal 2, but at some point you have to think – what is the more important goal: having everything up to this rigid standard, or providing access to source ports for Mac owners even if they’re not perfect?

For example Warzone 2100. This is a game that was released in 1999, open sourced in 2004, and then its data released as public domain in 2008. It has a fantastic open source project that has a slick, professional website with downloads for macOS, Linux and Windows, and it’s even a Universal 2 app. Should be a no-brainer to add them to the site as a Third Party Build.

So what’s the holdup? They’re not signed and notarized. Their app is Ad Hoc signed, but it’s not signed by an Apple Developer account or notarized. These guys are so close to being perfect that instead of trying to do my own build I reached out to them. Naturally it’s not like they weren’t aware of the notarization thing, but the holdup seems to be that there does not appear to be a way, currently, for an unincorporated entity to get a paid Apple Developer account. You’d either have to get it in someone’s name or incorporate as a legal entity. Getting it in someone’s name means one person would have to hold the account, and no one on their project can or is willing to do that (I’d volunteer, but then I’d have to share my Apple credentials), and incorporating as a legal entity is a costly proposition for a non-profit project.

Indeed, all the signed/notarized builds on MSP that I make are signed by me, Tom Kidd. But if you didn’t know about me, you’d check the signature and see it’s just “some rando” who signed them. I looked at the other third party builds I’m pointed to and yeah, all of them are signed by “some rando” as well. In some cases it makes sense (Iliyas Jorio personally maintains the Pangea builds, so his name on them makes sense), but it’s sort of a flaw in the concept, since if you’re firing up, say, dhewm3, it’s just signed by some random person.

The one exception I found was OpenTTD: they actually have a signed app whose signer is named “The OpenTTD Team”, but that’s because they incorporated, in Scotland apparently, as OpenTTD Distribution Ltd. So, a legal entity.

As a side note, I was initially surprised that an Ad Hoc signature meant anything. So, there’s unsigned code, where it’s not signed at all; there’s code signed with a certificate, like an Apple Developer certificate; and in the middle there’s Ad Hoc signed code, where you’ve run the codesign utility on it without an actual certificate. I think Xcode just does this for you automatically, or it can. A big change with Apple Silicon was that you can’t run unsigned code on it at all. It has to be at least Ad Hoc signed (and even as I type that I’m not sure I have that right).

Thing is anyone can Ad Hoc sign something. There’s no proof of who did it or why you should trust it. So why is it a requirement now for Apple Silicon? My best guess is that, in addition to the fact that signed code (with a certificate) can tell you who signed it, it also tells you it hasn’t been tampered with since being signed. An Ad Hoc signature doesn’t leave an audit trail but you’re at least reasonably certain that it hasn’t been tampered with since whoever signed it did so. At least that’s my best guess.

My current thought process is: I want to get more stuff on the site, then I can consider letting in exceptions. I may need to reconfigure how the little cards work too, make it more clear that something is signed or just Ad Hoc signed, or is Intel-only, or whatever.

Granted if it takes as long to get Morrowind going as it did Ion Fury I may bend that rule a little quicker, we’ll see.

Mac Source Ports

On the one in a million chance that there’s someone out there who still reads my blog, or is still subscribed to the RSS feed despite me not posting in *checks notes* years, and does not look at the front page, here’s a post to point out my latest venture: Mac Source Ports.

The short version is I’m using it to post signed and notarized app bundle builds of video game source ports for the Mac to run on Intel and Apple Silicon processors.

The long version is here.

So that’s the first reason I’m making this post.

The second reason is: I’m actually going to use the categories on this site to categorize things. And by categories I mean category: one category for Mac Source Ports posts. I’m going to link to it from the main Mac Source Ports page.

I figure a blog to say what I’m working on and where I’m at with things would be relevant.

Why not just put that on macsourceports.com? I guess I could, but for right now I’ve got that thing hosted somewhere it would be nontrivial to say “and run a PHP site in this subdirectory, oh and can you even do databases?”

Right now macsourceports.com is entirely custom code. I don’t know how long term feasible that is or how long before I’ll need to suck it up and move to a proper CMS, but for now it works and I can do anything I want with it, although it’s a touch on the tedious side.

Anyway, I’ve tried using this as a platform to put my thoughts out there and I kinda fell off of that; then I tried using this as a platform for my professional work and, well, that kinda fell off too. So either this latest attempt at using my blog will really, no really, prove to be useful, or I’ll fall off of it, too. We’ll see.

Meanwhile, during the pandemic

I’ve not been able to concentrate on a real development goal very successfully. I’m not sure what it is; I actually have less time these days since the pandemic hit, even though working from home since March means I have those commuting hours back. It’s weird.

And the other thing is I keep losing focus. There’s one game I tried to get running on the iPhone and was unsuccessful, so I switched gears. I’ve looked at a couple of projects where there’s an iOS Xcode project file, so I figured I’d just fire them up, but they don’t work, so I figured I’d get them fixed up, but then I started to come to the conclusion on both that maybe iOS support was never finished, which makes the whole concept a lot more work all of a sudden.

One cool thing I did get working was a port of Disasteroids 3D to the Mac. This is a freeware game written in 2000 by Thom Wetzel. I’ve actually mentioned him before: he wrote the Bitmap Font Builder I used on the DOOM iOS port, and I’ve met him at QuakeCon. Anyway, Disasteroids 3D was a freeware and open source (or at least source available) game he wrote as an exercise to learn about OpenGL and DirectX. The source has always been available, but he recently put it on GitHub, which gave me the inclination to convert it to SDL (swapping SDL in for the DirectX parts) and get it running on the Mac.
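If you’ve never done one of these conversions, the SDL side ends up roughly the shape of the sketch below (a generic sketch, not the actual Disasteroids 3D code): SDL takes over the window, the GL context and the input that the Win32/DirectX code used to handle, and the existing OpenGL rendering calls stay where they were:

#include <SDL.h>

int main(int argc, char *argv[])
{
    if (SDL_Init(SDL_INIT_VIDEO | SDL_INIT_AUDIO) != 0) {
        SDL_Log("SDL_Init failed: %s", SDL_GetError());
        return 1;
    }

    /* Replaces the Win32 window + wgl context setup */
    SDL_Window *window = SDL_CreateWindow("Disasteroids 3D",
            SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
            800, 600, SDL_WINDOW_OPENGL);
    SDL_GLContext context = SDL_GL_CreateContext(window);

    int running = 1;
    while (running) {
        SDL_Event event;
        while (SDL_PollEvent(&event)) {
            if (event.type == SDL_QUIT)
                running = 0;
            /* key/mouse events here replace the old DirectInput polling */
        }

        /* ...the game's existing OpenGL rendering calls go here unchanged... */

        SDL_GL_SwapWindow(window);   /* replaces the Win32/wgl buffer swap */
    }

    SDL_GL_DeleteContext(context);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}

The nice part is the OpenGL calls don’t care who created the context, so the rendering code mostly comes along for free.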

So that’s kinda cool.

Earlier today someone emailed me asking about the Wolfenstein 3-D iOS port. They had error messages about broken StoreKit symbols. I wasn’t sure what they did, but I figured adding StoreKit back would fix it. Then I realized what’s happening: the app wasn’t compiling because the “com.idsoftware.wolf3d” bundle identifier was already taken, which isn’t an issue except that Xcode tries to register it for in-app purchases, which it can’t since that ID is taken, and removing the in-app purchases entitlement removes StoreKit. The game has in-app purchases to facilitate the purchase of the Spear of Destiny levels, but that’s not really needed anymore (the link to the levels is in the code if you want to download them). Why this wasn’t an issue before but is now, I don’t know, other than something in Xcode must have changed.

When I started this project the goal was to make as few changes as possible so that id might use the pull request to fix the game on the App Store. But since that hasn’t happened and probably won’t, I went in and neutered the code that needs StoreKit, but then I ran into another issue, one that’s been nagging me forever.

When the game launches it runs this “Assert” code to make sure a value isn’t something it shouldn’t be, and it bails if it is. You can get around it by just commenting out the assert, but then, some of the time, it launches you in a weird spot in the level and you’re stuck in a wall or outside the bounds of the map or something. The heck of it is I wasn’t sure what caused it or what fixed it. It clearly worked on my phone at one point in time.

Anyway, this sent me down a weird rabbit hole during QuakeCon 2020 of all things (the one where it’s all at home), but basically a Wolfenstein 3-D level consists of a section that tells the game where the walls are, a section that tells it where the objects are, including the player starting point, and a third section of “objects”, whatever that is. The second and third sections (“planes” in the code) weren’t loading correctly, which is why the player would spawn in weird spots. Long story short, some code that populated an array of offsets of where to look in the level file for the three sections/planes was failing. Why it’s failing now and not consistently before I don’t know (could be something about iOS or Xcode changed), but after spending hours hunting this section down I had a bizarre epiphany on how to fix it, and it worked. So after being bothered by this issue for a few years I’ve finally gone in and fixed it.
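To make the “planes” thing a bit more concrete, here’s a simplified sketch of the idea – the struct layout and names are made up for illustration, not lifted from the port – where each level header stores an offset and length for each of its three planes, and the loader seeks to each offset to read that plane. If the offsets array gets populated with garbage, the plane with the player start comes back wrong, and that’s how you end up spawning inside a wall:

#include <stdint.h>
#include <stdio.h>

#define NUM_PLANES 3   /* walls, objects (incl. player start), and the third one */

/* Hypothetical header layout for one level */
typedef struct {
    int32_t  plane_offset[NUM_PLANES];   /* where each plane starts in the file */
    uint16_t plane_length[NUM_PLANES];   /* how many bytes it takes up there */
    uint16_t width, height;
    char     name[16];
} level_header_t;

/* Read the raw bytes of one plane into buffer.
 * Returns 0 on success, -1 if the offsets/lengths don't make sense. */
static int read_plane(FILE *maps, const level_header_t *header,
                      int plane, uint8_t *buffer, size_t buffer_size)
{
    if (plane < 0 || plane >= NUM_PLANES)
        return -1;
    if (header->plane_length[plane] > buffer_size)
        return -1;
    if (fseek(maps, header->plane_offset[plane], SEEK_SET) != 0)
        return -1;
    if (fread(buffer, 1, header->plane_length[plane], maps) != header->plane_length[plane])
        return -1;
    return 0;
}

The actual file format has more going on (the plane data is compressed, for one), but that offsets-and-lengths array is the part that was getting populated incorrectly.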

Let’s try this again

I’m not sure if it’s awesome or terrifying that I’ve been blogging here intermittently for almost twenty years now.

My first post is appropriately precious since I mention how it’s hosted on tripod.com and how I’m concerned about turning it into a news site? Dunno. It was a long time ago.

I’ve recounted the strange history of this site before and not a whole lot has happened since then. My previous post was 2015 and I have several drafts of stuff I’ve never finished. Looking at some of those they’re so hideously out of date that they’ll probably just stay in the drafts folder forever.

At one point I had the tagline of this site as “still around for some reason” and it was more than just a smartass phrase – most of the people I knew who had blogs have long since moved on. Some to other forms of social media, some to nothing in particular. Yet I just kept this site going. It’s not expensive to host or anything, like $5 a month or some crap (though I think that might be $10 a month – whatever, it’s still not much) and I’ve been able to use subsites as test beds for other stuff.

And then in 2017 or so I got the idea to fix up some old games on the iPhone that were killed off in the 32-bitpocalypse but whose source was available. It took a long time to do it, both because of the spare time nature of the deal and also because I didn’t know what I was doing. Once I got them working, I thought the whole process was interesting so I wrote an article on here about all the stuff I ran into. The original code drops from John Carmack had these Word documents in them that detailed the process of making them, so I figured writing up the process of fixing them was appropriate, and perhaps useful to someone else.

Then I got into the process of adding features and porting other games, mostly id Software titles but a couple of other things as well. I’ve already written up the id Software ports, so I won’t repeat it all here, but basically it became my pattern – do project, post code, post video, write article.

At some point I got some mainstream attention from gaming, tech and development sites and the resulting SEO has driven people to this site in decent numbers ever since. Not a ton mind you, probably within the range of rounding errors for major sites, but I get an email or two a week with people asking questions.

But the front page was still this one random post from 2015 with no links to the articles on the ports.

So I decided recently to do a few things – first, I changed the home page to where it shows off some info about me and this work, semi-resume style, and lists out the projects, including the non-id stuff like the Virtual Boy VR emulator and the port of the C++ version of VVVVVV. I tried to make them a little bit slick looking with mockups of the devices and stuff. It kinda worked.

The second thing is I’m going to blog again but probably not write much more in the way of wordy, profound posts. I mean I might write up some stuff like that (this post is starting to get long now that I look at it), but I’ve hit this point where I figure little things are worth writing about too. Stuff like this:

I’ve modified the Quake II and Quake III: Arena ports to more properly handle SDL. For some reason I was reluctant to include the code wholesale like most projects do, but now I’ve changed my mind and I’ll probably be doing this to my other SDL ports soon.

I wrote to Brendon Chung of Blendo Games and asked for the source code to the Mac port of Thirty Flights of Loving. He sent me the code and I fixed it up (since, last year, the Mac had its own 32-bitpocalypse). I posted the results here.

I contributed to the ioquake3 project by fixing up the Xcode project, which last saw attention in 2013. I’m also looking to help them with other things as time permits.

For no particular reason I added a software rendering target to the Quake II project. Just wanted to see what it would be like to play Quake II on your phone but pretend like you’re running a Pentium 90 with no hardware accelerator card.

I was able to get LatestChatty, the iOS and iPadOS client for the discussion forum at Shacknews, to run on macOS via Catalyst. I don’t have an article for this and outside of the community of people on Shacknews no one is going to care but it was just neat to see that with not a whole ton of effort this iOS app that was more or less started at the dawn of iOS development can come along to the Mac. I’m coordinating with the author to get it on the Mac App Store, which will be neat once Universal Purchases for macOS/iOS apps are in place like they are for iOS/tvOS apps.

Anyway that’s it for now. I have no idea who reads this anymore but in any event here it is.

August 11, 2015

When I was growing up in Texarkana there was a local guy named Bobo the Magician. He was indeed a magician, and we would sometimes go to the auditorium or theater in whatever school I was in and watch him perform magic. In hindsight it seems sort of weird to take a bunch of kids and have them see a magician perform instead of being in class but whatever, it was entertaining and I sure didn’t mind getting out of a class.

I seriously remember seeing this guy my entire public school career. I saw him do shows in elementary, middle and high school. I would see him at the grocery store, where he had card tricks at the ready. I also remember him being incredibly old every time I saw him. I graduated from high school in 1995 and according to what I can find he passed in 1996 so he basically performed until the end (though he probably scaled back in those last few years).

There’s one occasion that has really stuck with me though.

I was in middle school and he was giving us a performance in the auditorium. For some background and context, the middle school I went to in Texarkana was called Pine Street Middle School. The actual campus and buildings themselves, however, used to be the local high school, called Texas High School (as a side note, people sometimes think I’m making that name up but the school was really called “Texas High School”, and it’s the only high school in Texas called that. I have no idea to this day how we secured the name before and to the exclusion of every other high school, but the rival high school on the Arkansas side of the same city was called Arkansas High School so maybe that’s part of the reason). In the 70’s, Texas High School moved to a larger building complex and they re-purposed this older school as a middle school. I don’t know how old the buildings really were but if it’s any indication, the rooms were heated by metal radiators. Which a row of seats was always next to. And that I burned myself on at least once. Hopefully only once.

So anyway I want to say this happened in my 7th grade year (so, the middle year of middle school). The show is your pretty typical magic fare – the oddly folded newspaper that tears apart and re-assembles, the locking rings, and of course rabbits in hats. I have to think that part of the deal with Bobo as a magician was that he wasn’t trying to be innovative or new, he was just trying to entertain kids, and maybe even get some inspired to try out magic themselves (which worked, I think, I knew some kids who got into it over the years) so the cliched old standards were part of the gimmick.

Then he did a trick that, for the most part, was also a standard thing except for one aspect which I’ve never been able to forget. It was the typical “is this your card?” deal where you have a person on stage pick out a card from a deck and then you will find their card from the deck in an implausible way, often feigning failure beforehand. He had the girl on stage (I think it was just a student and not a teacher) pull out a card (let’s say it was the seven of diamonds – it was a red suit card at the very least), then he broke out a black Sharpie marker and had her sign it (so that the resulting card can’t be just any seven of diamonds). For reasons that didn’t seem immediately obvious he also put a rubber band around the deck. He then does the “is this your card?” bit loudly and produces the wrong card. She says no, everyone laughs, Bobo pretends like he’s failed. He does it a second time. Still not the card. Bobo feigns frustration.

He then says “well then…” and throws the deck of cards, with the rubber band around it, into the air. But I don’t mean he tossed it up a few feet, when he threw it, it went all the way to the ceiling of the auditorium. This place wasn’t a Colosseum but it was a good two stories. The stage area he was standing on was pulled out from the main stage area so above him was the ceiling of the entire building, not just the stage area. The ceiling was basically a giant painted surface, no tiles or anything. And it’s been a bunch of years but my memory of this card toss is such that it didn’t seem as if he had just hurled it really hard, it was as if it was being guided up by a wire or a rubber band or a pulley or something.

When it hit the ceiling it came back down, except that one card was stuck to the ceiling. “IS THAT YOUR CARD?” he proclaimed, and indeed, it did appear to be the seven of diamonds with the girl’s signature on it. At least, as far as we could tell seeing as how it was very far away from anyone sitting in seats at the bottom of the auditorium. Big applause, let’s have a hand for (girl’s name), everyone is impressed. He somehow got the deck to hit the ceiling and a card be seemingly removed from it despite being wrapped in a rubber band.

But for a lot of us kids in the audience there was a bigger question at hand. I was wondering it to myself at the time but many kids in the audience started shouting the question and even pointing upwards: “how are you going to get the card off the ceiling?”

Bobo responds: “Oh, that’s your problem!”

Big laugh from the audience, the show goes on for however much longer it goes on for, applause at the end, then we all get up and go back to class.

Here’s the thing, though: he wasn’t kidding. The card stayed up there. Several more times over the next couple of years I would be in that auditorium for various reasons, including I think a couple of occasions where I would go back after moving on to high school to see my sister doing something like a play or something and it was still there. In fact I’m not sure if Bobo ever performed there again but I always looked up and the card was always there. No one ever took it down. I’m not sure how you would have taken it down – I don’t know if they made ladders that big back then and I’m not sure if you could have gotten a scissor lift in there if you had to – besides being old, it really wasn’t designed to let things like that in. Bobo himself was pushing 90 so he sure wasn’t going to get on anything high to get it.

A few years ago I was back home visiting my grandmother with my family and as I left, since I was by myself (my wife was home for some reason) I decided to drive through some places I used to know just for the heck of it. For the last couple of years of high school we had actually moved to a suburb of Texarkana called Wake Village where my grandmother also lived, so I actually had to specifically drive back to Texarkana to see anything like my original house or what not.

It was an overcast day and it just so happened that Pine Street Middle School was on the way, so I decided to drive past it. When I got there, I discovered that it was closed down for good. There was a large makeshift fence around the entire place with “NO TRESPASSING” and “KEEP OUT” signs all over. Many windows had been boarded up with plywood. The windows in the band hall were cluttered with old desks piled up. The gymnasium across the street looked flat out condemned from a long time of disuse.

Apparently, a few years back the Texarkana Independent School District decided to consolidate the middle schools together (there were at least two, maybe three) and migrate the whole operation to the site of an old (but newer than Pine Street) elementary school under the name “Texas Middle School” (I guess they scored that name too). I don’t know that I would have gotten out of the car and had a “used to be my playground” moment there but it was out of the question now. I vaguely remembered hearing something about a new middle school but I had just not put together what it would have meant so I really wasn’t expecting to find my old stomping ground condemned. Combine that with the dreary overcast day and it was just a really weird feeling. But it’s always a weird feeling to go back home. I later stopped by the local shopping mall to discover that just about every non-anchor store I ever remembered was long gone. Even the McDonald’s went under.

This had to be several years ago – my grandmother passed away in 2010 and this was a little while before that, so maybe 2008-2009ish. Since Pine Street Middle School is no more, it’s hard to find information about it online outside of autogenerated Yelp pages that just mimic some data dump, or empty Facebook groups for former students. The most recent thing I can find is this 2014 article that says it was still standing at that time, but that the owner (which hasn’t been TISD since 2005) hasn’t done anything with it and wants it on the National Registry list, meanwhile the neighbors want it torn down since it’s an eyesore. That’s a far cry from here in Dallas, where attempts to tear down the 108-year-old Dallas High School are routinely shot down by locals (it was recently bought again to be re-purposed).

But something occurs to me – I wonder if that card is still there? I mean, it stayed there while the place was still being used, I can’t imagine someone went to the trouble of removing it.

And getting back to that magic trick, how did that work, exactly? I mean yeah I know magicians don’t talk about what they do and the whole thing is supposed to be mysterious, so I’m not supposed to know how he propelled that deck of cards to the ceiling. I figure it wasn’t a string attached to the ceiling since if they could attach a fishing wire up there somehow then they could have gotten the card down too. Perhaps there was some invisible slingshot thing happening. Perhaps I’m underestimating an 80-year-old’s ability to throw a deck of cards.

And even if we ignore the how of the trick, I have to wonder – did he clear that trick with the school? Did he warn them he was going to stick a card to the ceiling of the place? Did they approve? How did the card stay up there to begin with? It wouldn’t take a ton of glue to stick a single playing card to the ceiling but kind of impressive that it just stayed there. What did the school officials think of him gluing a playing card to their ceiling? Were they pissed at him? Is that why he never played there again? (assuming my memory of that is right). Why did no one ever try to take the card down? Was it too hard? Was it just that it would be too much hassle for such a small thing? Were they worried it would take down some paint with it? Did they just not give a shit?

And hell, did anyone besides me even remember it was there? I mean, maybe no one else looked up. Maybe no one else remembered the trick like I did. It would be sort of typical for me to focus on the damndest things.

And I wonder if anyone else has done a trick like that. I mean, like I said the “is this your card?” thing is very common but I’ve never seen anyone else do the ceiling thing before. Maybe this was Bobo’s specialty trick.

Maybe this was his way of leaving a mark. And maybe it’s still there.

October 15, 2014

Twin Peaks is coming back.

If you haven’t seen it and you’re looking for the cheapest way to check it out, at the time of this writing the entire series is on Netflix. I’ve read some mixed reports on this but my impression is that since Twin Peaks was shot on film, the HD versions on Netflix were made as film to HD transfers (as opposed to just regular SD video “upscaled” to 1080p or whatever) so for a few years Netflix was probably the only way to see Twin Peaks in HD until the Blu-Ray version came out in 2014.

So the short version is – watch the series on Netflix (or Hulu) for the easiest/cheapest way to watch the show. Buy the Blu-Ray set to get everything including extras, the movie, and deleted scenes. 

As for the long version – let’s just say Twin Peaks has had a real rough road on home formats. 

Back when Twin Peaks was on the air, TV shows on home video were a rare thing, both due to business and technical/logistical reasons. Besides the fact that people weren’t into maintaining collections of movies on VHS like they do today with DVD (which is a whole different topic that has a lot to do with rental pricing), TV shows on VHS were a space hassle. A standard T-120 tape held two hours in SP mode, which gave the best picture quality so that’s what most commercial releases used. For an hour long show like Twin Peaks this meant you could get about two episodes on a tape. An hour long show has commercials of course but unless it clocked in at 40 minutes or so (most go 43-47), two per tape was the max you could do. Twin Peaks had thirty episodes, so this means an entire run of the show on VHS SP would be fifteen VHS tapes, and at about 1″ per tape this was taking over a foot of space on your shelf. Compare this to the recent Blu-Ray set, Twin Peaks: The Entire Mystery which houses the entire series, plus the movie and tons of extras, in a couple of inches of space.

And if you were a retailer, this meant you needed to stock tons and tons of tapes. So as you might imagine, the logistics of this alone pretty much made TV on home video formats a non-starter. Very few shows ever made it to VHS and of the ones that did, they tended to either be shows with a low episode count, or they didn’t get sold retail. The original Star Trek had a VHS release where, for bonus dick move points, they only put one episode on a tape. Star Trek had 79 episodes (80 with the original pilot) so this was quite a commitment of shelf space. Some other shows, like Cheers or Bonanza were only sold through the types of TV commercials where the end screen was blue with a yellow 1-800 number and you bought them via a subscription like a Time Life book series.

Despite all that though, Twin Peaks did see a VHS release from Worldvision, but it was a huge mess. The original set was the first season, which was seven episodes, one per tape (another dick move) at $100. However, it did not include the pilot episode. This would be a plague on Twin Peaks releases for a long time to come.

The pilot episode of Twin Peaks was a problem for a long time because of rights issues. Namely, David Lynch and Mark Frost needed to get it funded (and in 1990 spending a million dollars on a TV pilot for a show that hasn’t been sold yet was damn near unheard of) so they secured funding from Warner Bros. One of the terms of the deal was that they would film additional scenes to construct an ending for the show to make it a self contained direct-to-video movie for the international market. I’m guessing the original market was going to be in Europe, so this version is commonly called the “Euro Pilot”.

In any event, this made it such that any company who wanted to put out a set of the first season either had to pay WB for the rights to the first episode, or release it without the first episode. Apparently this was prohibitively expensive or impossible, especially given the cult/niche nature of the show, so this first VHS set didn’t have it.

Parallel to this though, WB did release a VHS tape of the Euro Pilot, which probably thoroughly confused anyone not familiar with the details.

After the second season, a second VHS set was released by Worldvision, with the 29 regular episodes of the show on six VHS tapes for $100. The way they accomplished this, however, was to put out the tapes using EP mode, which allowed six hours on a T-120 tape. The tradeoff, however, was that the video quality was terrible. To say nothing of the annoyance involved with purchasing the first season a second time.

Parallel to all of this, Worldvision also put out releases of the show on Laserdisc, but it was segmented into four volumes at $125 or so a pop. Still no pilot episode, though again WB did release the Euro Pilot on Laserdisc for another $35.

When DVD came around, Artisan controlled the home video rights to Twin Peaks, so they set out to release the show and this time they were going to get the pilot episode as well. However they ran into the same rights issues as Worldvision and so their release of the first season in 2001 also did not contain the pilot episode. Parallel to this, New Line Cinema announced the release of the movie Fire Walk with Me on DVD but it wound up being a bare bones release due, again, to rights issues. I’ve never been as into the movie as I was the show but apparently it was well known for many years that David Lynch had filmed a lot of scenes that didn’t make it in the final film and these scenes were long considered the “holy grail” for Twin Peaks fans, although Lynch had reportedly stated that the version released in theaters was his “director’s cut” and that no other scenes needed to be in the film.

However, also in 2001 a DVD surfaced for sale from a company called Catalyst Logic in Taiwan, where the rules around copyright are a little more muddied (that or Catalyst Logic didn’t care). It had the original pilot, as it was aired in 1990 on ABC (so, not the Euro Pilot). Its exact origins are unknown, as it clearly is taken from a source other than someone having videotaped it off of ABC but the video quality leaves a lot to be desired. Still, it was the only way to watch the pilot at the time on any home video format.

Following the season one release, Artisan ran into some issues with the second season (combined with less than anticipated sales of the first season, possibly related to the lack of pilot episode) and as a result the second season didn’t come to DVD until early 2007 when some rights expired and CBS (for some reason) released the set.

Finally, most if not all of the issues with the show on DVD were addressed when later in 2007, Twin Peaks: The Definitive Gold Box Edition was released. It contained all of the episodes of the TV show, and for the first time officially had both the Euro Pilot and the original pilot on DVD, along with tons of extras including the “log lady intros” (when the show aired on Bravo at one point, David Lynch directed a series of surreal introductions to each episode with the actress who played the “log lady” on the show). The only thing it didn’t have was the movie but no one seemed to mind.

Finally earlier this year the Blu-Ray boxed set, Twin Peaks: The Entire Mystery was released. This has literally everything – all the episodes in HD, both versions of the pilot, the movie, and about 90 minutes of deleted scenes from the movie, something pretty much everyone had given up on. The only things it doesn’t have are a Saturday Night Live skit from when Kyle MacLachlan hosted in 1991 or so, and the music video for “Falling” but those are nearly ignorable omissions.

So it’s sort of surreal that in the past few months we’ve seen the holy grail of Twin Peaks finally be wrapped up in one neat package, and now we’re seeing that a Season 3 (I guess) is going to come out in 2016. I hope they do like Dallas and just pick it up like we’ve just not seen what they’ve been doing for 25 years.

March 14, 2014

I have now re-re-relaunched this blog (give or take a “re-“).

It occurs to me, in the course of looking at the long and strange well-documented history of this blog, that I’ve had all manner of weird shit happen – I started it out on tripod.com hosting, editing with FrontPage; briefly lost control of the domain name once when the fly-by-night company I registered it from decided to lock me out of it; took it across two different CMS packages; and hosted it at a smattering of different places, including a server farm which literally consisted of faded beige boxes missing drive bay doors sitting on the floor of the house where my friends also ran their small consulting empire.

In any event I can add one more to the list – I’ve been hacked. Actually, a couple of times. Towards the end of my tenure with my friends’ hosting, the colocation facility they moved to (after the beige-boxes-on-the-floor phase) decided they had to take my blog offline because it was spreading viruses. Had been for years, they said. Longer than they had been hosting it apparently, which made sense to none of us. Long story short there was an exploit in the Akismet plugin that WordPress came with by default. Also by default this plugin was disabled and because of that, WordPress would never prompt me to update it. I never activated it because the service it provides – comment spam protection – is useless to me because this blog doesn’t have comments. Anyway someone exploited it and was able to insert all kinds of stuff in the blog to point to links to websites serving up malware and so forth. I don’t remember exactly how we cleaned up that mess (I think we restored from an old backup because – unsurprisingly – it had been a long time since I posted last and there was nothing to lose) but we got it back online.

I changed over to 1&1 hosting when my friends moved to a different part of the state, both because of the convenience factor and also because I didn’t want my one-off WordPress blog here to cause them any more issues. Besides, I figured, what are the odds I’d be hacked again?

At some point I noticed my site was incredibly slow. Like, took forever to load slow. I got busy with some real life stuff so I ignored it but when a friend a few weeks back asked for a link to the Velveeta post, I decided I needed to get to the bottom of it. I’m running on the cheapest package 1&1 has to offer (literally $1/month) so I figured it had to do with cramming me onto some box hosting a million other $1/month blogs, but the WordPress admin interface, also running PHP, came up fine so I knew it was probably not that.

To be quite honest I’m still not sure what the heck all happened. I saw my template was crammed with tons of shit – lots and lots of seemingly random characters – but nothing obviously malicious. I got rid of it, but I couldn’t edit any posts. They just wouldn’t come up in the editor. I inspected exports and backups and other than one clear spam link for an online drugstore I couldn’t figure it out.

So I backed everything up, deleted everything – database and all – then put on a clean WordPress install and restored the posts from a backup. Worked like a charm. Restored from a WordPress XML backup, not a SQL script backup, which worked much better than I anticipated. So now I’m back. Again.

I’m still curious how exactly I got hacked but not enough to pour a lot of time into it. But it does occur to me that this is analogous to a paradox with regards to urban decay.

The paradox is: the way to make sure an old building continues to stay in good shape is to keep using it. I work in Downtown Dallas now, in a skyscraper. A few doors down is another skyscraper called Elm Place, which is currently completely empty. As in, tenancy dwindled to the point where in 2010 they just kicked everyone out and closed it down. I believe someone recently purchased it so at some point it might be open again but we can already see signs of rot. Tiles falling apart. Drooping ceilings. The little businesses on the bottom floor that were restaurants next to the large windows have left behind remnants that are fading from mere sunlight exposure.

By comparison, in Texas near New Braunfels there’s a place called Gruene Hall which was built in 1878 and still operates to this day – as a dance hall. Elm Place was built in 1965 and is showing signs of wear and tear a mere four years after being unoccupied, meanwhile a dance hall built thirteen years after the Civil War ended, and which is made of freaking wood, is still up and kicking.

This blog on WordPress, with no comments, very few plugins, and (previously) a very barebones theme, seemingly doesn’t have anything you think would rust, but apparently it did. I tried to stay on top of WordPress updates, but I’m sure I fell down on the job at some point and that was that. WordPress is open source software, which is a double-edged sword with regards to security issues. On the one hand, the ability to have tons of people looking at your code has the benefit that you theoretically find out about issues faster. On the other hand, it also allows malicious individuals to find the security holes, potentially not tell you, and then exploit it in everyone’s site, or in the sites of anyone who doesn’t patch (which would be the category I fell into). Closed source software is more security through obscurity (i.e., flaws can’t be found by having the source) so it’s debatable which approach is better when it comes to a non-centralized piece of end-user software.

My wife has a laptop that, long story short, came with Windows 7, we upgraded it to Windows 8, and then she switched to a Mac (another long story I’ll need to write up at some point). At some point I had to help her accomplish something that, another long story short, needed Windows 7 and Office 2010. I was running Windows 8 and Office 2013 locally, so I needed another machine. We reformatted her laptop, put Windows 7 and Office 2010 on it, accomplished the task, and then shut the laptop and put it away for almost a year.

Then she needed an “extra” laptop for something so it made perfect sense to just dig up the Windows 7 laptop we used before so we did. First thing we do is start applying Windows and Office updates. Everything is going fine, hundreds of patches get applied, Microsoft Security Essentials gets updated, system gets rebooted a few times, all is good.

In the morning there was some weird piece of software running on the desktop. I found that odd, and I looked and whatever it was was installed the day prior. I search for the name of the thing on Google and yeah, it’s malware/adware. We had a ton of updates to apply when we had first booted it up and so I thought maybe something had snuck on there while we were updating it. I uninstalled everything (including toolbars on web browsers) and all was good.

Until the next day when some of it came back. I had everything updated, and it still got wormholed or whatever. I ran MalwareBytes on it and cleaned out some stuff and had MSE run a deep scan. I think I got everything (this wound up being a laptop that sees some amount of use in my wife’s office and I haven’t heard anything bad). My guess as to what happened is that a worm put malware on the system while it was updating and then even after all the updates applied, a process left over that MSE didn’t catch was still allowing software to be installed.

I don’t have these problems on my main system and I’m a pretty stringent user (I had set this laptop up using the best practices I knew of) and still it developed issues. The only thing I can think of is that my main system, by virtue of being on all of the time, is constantly updating and on top of the latest software updates and patches. The laptop had been off for a year, so it had catching up to do and a lot of maintenance to get it back in working condition.

So in other words my main machine(s) stay in good shape because I keep using them, and this laptop got slammed with malware because it had been offline for too long. It’s similar to urban decay.

Codebases are the same way – games whose engines go open source or see constant attention (see: StarCraft or Quake 3) end up running on modern hardware, while games just a few years old break on new versions of Windows.

So yeah, my blog got hacked because I stopped using it. Or updating it anyway.

Also at one point I had made this new template for the blog which was a variant of the ancient theme I had going on (which was literally designed to look like a Web 0.1 page because I thought that was hi-damn-larious for way too long) but it got lost in the first hacking kerfuffle, so I just searched on WordPress.org and found this one. Works for me – minimalistic without being an eyesore in an era of responsive web design.

I’d say I’ll be updating a lot more often now but that’s like an excuse you give your doctor that he doesn’t want to hear. So I’ll just say that I’ll still be around… for some reason.

May 3, 2013

The problem with Android is that the carriers are not looking at it as an operating system, they’re looking at it like an initial source code drop. They’re looking at it like a game developer views an engine.

Let’s say I’m a game developer. I decide I want to make a game and I don’t want to write an engine. The things I want to do with my game would be served fine by an existing engine, just with some tweaks. Maybe I’m Gearbox Software and I want to make Borderlands – I don’t write an engine from scratch, I just license Unreal and add cel shading to it.

So let’s say the latest version of Unreal Engine is version 3.1 (not sure what version numbering system Unreal uses but let’s assume it’s like Windows or something). So I go pay my half million or whatever and I get a source code drop of Unreal Engine 3.1 and I start working on my game.

At some point in my game’s development is the point of no return – code freeze, feature lock, whatever you want to call it. At this point I can’t make any source code changes except for bug fixes. I have to stick to this or else I’ll miss my deadline and/or the game will never ship (see: DNF).

Now let’s say before the point of no return Unreal Engine 3.2 is released and I can see how it would benefit my game. I have to decide at this point whether or not to use Unreal Engine 3.2. If it’s early enough in development then I could consider getting Unreal Engine 3.2 (pay the upgrade fee or maybe it just comes with the license) and then re-apply all the changes I made to 3.1 to it.

After the point of no return, Unreal Engine 3.3 is released and I can see how it would further benefit my game. However, I can’t afford to change over to 3.3 so we just keep going on 3.2.

So then we release our game, even though it’s running on an engine that’s not the latest. And we have no intention of upgrading it either – this is just the version our game uses.

And then later Unreal Engine 4.0 comes out. We don’t upgrade to that one either. If our game sells well and we make a sequel, we’ll probably use it but we’re not going to upgrade the first one.

Here’s the rub: no one expects us to upgrade it either. Gamers are not going to care that we’re not running on the latest engine and they’re not going to hold it against us when we don’t upgrade to Unreal Engine 4.0.

That’s what handset makers are doing. They don’t see themselves as putting Android 2.3 on a phone, they see themselves as putting their own OS on the phone that is based off of a particular version of Android originally.

And when a new version of Android comes out they’ll use that on their next phone but they’re not going to upgrade their existing phones. Why should they? They’ve already made their money is how they see it.

And I believe this is an approach that works fine in games but not in phones. People don’t run apps in Borderlands but they do run apps in Android and so having the latest OS is important.

Instead what we have is analogous to if Dell sold computers where the OS was based off of Windows but they went in and dinked with the source code so now you don’t know if anything is going to run even though you have the hardware to do it. Android marries the worst parts of closed architectures with the worst problems of open source.

If you’re a programmer or hacker or tinkerer then Android is probably heaven for you, but I don’t see how it’s a good fit for anyone else.

March 28, 2013

Rumor is Apple is going to report its first negative income growth since 2003. Some say this is a sign that something about the iPhone is broken. I would say that there’s nothing broken if what we’re seeing here is basic market saturation.

Prior to the iPhone, the phone carriers had this issue where everyone already had a cell phone. My late grandfather-in-law, who was hard of hearing, had bad eyesight, and was missing part of a finger from a tractor incident, had a cell phone. Everyone had a cell phone. So there wasn’t this massive growth anymore because they weren’t doubling customers every quarter based on how no one had a cell phone and then everyone went and bought one. Same way Microsoft was on this exponential growth thing because no one had a computer except for computer geeks, then everyone went out and bought one, and they almost all had a Microsoft OS on them.

What the carriers needed to grow, then, was to steal customers away from other carriers. This is why they kept doing gimmicky things. This is why they fought tooth and nail on that “taking your cell phone number with you” thing. And this is why a desperate company named AT&T let a computer and MP3 player company come in and not only put their new iPhone product on AT&T’s network, but they got AT&T to let them keep complete control over it. And this was after they had made the same pitch to Verizon and Verizon told them to go fuck themselves.

When Apple wanted to get into the phone market people said they were nuts because the cell phone market was saturated. And it was. But if you think about it, Apple wasn’t getting into the cell phone market, they were getting into the smartphone market. Everyone had a cell phone that made phone calls. These phones would also do things like take real shitty photos and play a small number of overpriced shitty games from the carrier. Very few people had a smartphone. And companies like Microsoft had been trying to get people to buy one for years. Blackberries were seen as things executives had to use. Apple placed this big bet that a whole bunch of people would want to have these PDA-sized things that ran apps and web pages to hold up to their heads.

And they were right – eventually. Once the 3G came out and got subsidized by AT&T (remember that the first one wasn’t) and apps became a thing, they sold truckloads of these things. And then it was on. Suddenly the cell phone market didn’t matter as much as the smartphone market. Or another way of putting it was that now it was a race to convince everyone that they needed to upgrade their cell phones to smartphones. All the handset makers were shitting their beds because they had been making crappy handset after crappy handset (largely because the carriers had them by the balls and didn’t let them try much else, since all these things were to the carriers was a cheap way to get customers locked in) and here comes this distant second place computer manufacturer with a huge first mover advantage.

Fortunately Google had purchased a company with a crazy idea to make an open source embedded OS based on Linux and hey, everyone can use it for free. Knock yourself out. Fuck, we’ll pay you if you let us control parts of it – namely the advertising. So a dozen hardware companies lucked out because there was an OS most of the way there they could just use for free or cheap. Microsoft got so scared they pulled off the most coordinated technical maneuver in their history by stringing every piece of technology and software they’ve ever done into a “new” operating system which they named Windows Phone 7. It was such a radical departure from the “WIMP on a phone” paradigm that the average person thinks they re-wrote everything from scratch. Of course, the next thing they did was to put this same interface and paradigm on their main operating system, giving them a solid entry against the iPad (which, incidentally, threw Android a curveball because all these handset makers suddenly had to try and shit out tablets as well as coming up with competitive handheld hardware) but cannibalizing the perception of desktop users in the process. And RIM? They were so in denial that anything had changed and resistant to the market shift that their developer tools make command-line compilers look appealing, and their new WP7-like rebirth? It finally came out… last fucking week.

So where was I going with this? Simple – at some point the smartphone market will hit the same saturation point the cell phone market was at circa 2007. When that’s occurred it’s going to be the same thing again, except that it will be the hardware and OS makers doing the slogging. When everyone has a smartphone the only way Apple gets more users on iOS is to lure people from Android. And the only way Google gets more people on Android is to lure people from iOS. And the only way Microsoft gets more users onto WP7 – er, WP8 – is to hope to hell that BlackBerry 10 is a flop and RIM – er, BlackBerry – finally goes down the drain they’ve been circling for years now, and somehow get the corporate market on board (which might not work as most of the world has moved to a BYOD paradigm).

Of course, the definition of “everyone has a smartphone” is murky. My parents don’t have a smartphone. Don’t want one either. When their cell phones die they just go get new cheap feature phones. Probably whatever the local AT&T place has for free with a contract. Not everyone wants a computer in their pocket. So the mobile market knows there’s more customers to convert to smartphones, they just don’t know how many of them they can ever get. My parents could get an iPhone 4 free on contract right now but they don’t want one. Their eyes are bad so most of the appeal of the thing is gone. And they want to open up something and hit numbers, not unlock with a swipe and figure out which of the icons looks like a phone.

So I think what will happen is that smartphone innovation will plateau once we’ve gotten as close to smartphone market saturation as we can, and then the only way to get smartphone customers will be to lure people away (or back) from other platforms by out-doing them. And we may have started that plateau now.