Archive

Archive for the ‘Computer history’ Category

Is it worth disputing the title of “first 3D game on a PC” to John Carmack?

Recently, someone posted a comment on “The Dawn of 3D Games” disputing the vaguely stated claim that I wrote the first 3D game for a PC. So I felt I had to reply and give my point of view on exactly why me, myself and I alone consider that Alpha Waves was a small milestone in the history of 3D gaming.

In reality, there is in my opinion not a single “first 3D game on a PC”; rather, for a given definition of what a 3D game is, there is a first game that matched those criteria. And for a set of criteria that seems relatively reasonable to me (like: it has to be a game, it has to run on some kind of PC or microcomputer, it has to be true six-degree-of-freedom 3D on a reasonable portion of the screen, and you need some kind of immersion and interaction with a large number of objects), Alpha Waves may very well be the very first. Change a tiny bit in the definition, and some other game gets the crown. So let’s put it that way: Alpha Waves was innovative, and it’s my personal favorite for the title, for obvious reasons.

All that doesn’t matter much, except that in my attempt at documenting this bit of useless ancient geek history, I visited the id Software web site, and I was surprised to see that there’s still the following on their web site:

The first 3D PC game ever! Hovertank 3D debuted the amazing technology that was used to usher in the First Person Shooter genre with Wolfenstein 3D.

Is this a boiled frog approach to marketing? Just by leaving patently wrong stuff on the web site long enough, folks will stop noticing and end up thinking it’s true?

Come on, John! I hesitate to write that about Alpha Waves, when it predated Hovertank by a good year and had significantly better 3D rendering (if only because it had three axes of rotation). And Alpha Waves is by no means alone; there are easily half a dozen games predating Hovertank and offering better 3D. You are a celebrity in the world of video games. With all the credit that is due, why do you need to keep this little lie on your web site?

Why does it matter? Precisely because you are a celebrity, so everything you say has a huge impact, including minute details of wording in a long-forgotten corner of an old web site you probably don’t even remember existed. Nonetheless, just fix it. Simply write something like “The first id game ever.” That would do just fine. And that claim is a significant milestone in its own right. Probably a bigger one than “first 3D game on the PC”, as far as the gaming industry is concerned…

And if you feel concerned about your personal place in history, I’m sure Armadillo Aerospace will take care of that.

Steve Jobs forgot how hard it was to create a company

The following video shows Steve Jobs as an entrepreneur, starting over with NeXT. To me, it’s reassuring to see that the Great Steve Jobs himself sometimes found the task overwhelming, despite having $7M (1990s dollars) in the bank.

If you only have 10 seconds, look at 13:18 into the video. Steve Jobs says:

I forgot how much work it actually is to create a company. It’s a lot of work. You got to do everything.

This is exactly how I feel right now. Doing everything. Vaporized, atomized. It’s fun, but it’s hard. I had not forgotten; I plain didn’t know.

Steve Jobs was also known for his focus on focus. If you are creating a company, you should probably read this.

When your product is not even built yet, none of this stuff matters.  But your startup, in the pre-product phase, is basically a ticking time bomb.  The only thing that can prevent it from exploding is user delight.  User delight attracts funding, enhances morale, builds determination, earns revenue…Until you get to user delight, you’re always at risk of running out of money or, much more likely, losing a key engineer to something more interesting.  Time is your most precious resource.

This is why building a company is an exercise in humility. It’s a case where you don’t need to assume you are below average: you are. You have less funding than your competitors. Your product has fewer features. You have fewer customers, fewer engineers, less press coverage. If you do something really innovative, most people will think it’s stupid and explain why you are doing it wrong. And most of the time, they are right: you are doing it wrong.

But here is the difference compared to my past experiences in larger companies. In a startup, when you do it wrong, you fix it, and you fix it so quickly you sometimes don’t even realize it. In my opinion, that’s the single reason why startups sometimes succeed. They fall a lot, but then they learn how to walk, and once they get the gist of it, they run circles around more “adult” companies.

The Singularity has already happened…

IEEE Spectrum has a special report about the Singularity, that point in our future where predictions fall apart because major technical changes make any extrapolation we may make based on today’s trends essentially obsolete. Even the New York Times has an article, entitled The Future Is Now? Pretty Soon, at Least, which quickly touches on some of the ideas.

The special issue in IEEE is more extensive. There are many interesting articles. In one of them, Ray Kurzweil, arguably the inventor of the concept of Singularity, debates with Neil Gershenfeld, and Vernor Vinge shares what he sees as the signs of the Singularity.

One important point, I believe, is that “there will be a singularity at time t” is a proposition that might depend on the time it’s being enunciated. It seems very likely to me that when you are in the middle of a singularity, you have no idea that it’s there. That’s why I am a bit wary of the use of a singular noun, the singularity, when I think really that there have been many singularities over the course of history.

How could someone from the Middle Ages, for example, predict the structure of a society after motorized personal transportation became not only possible, but mainstream and relatively cheap (I know, I know, gas prices…)? In other words, seen from the Middle Ages, the invention of the automobile or, even more so, the airplane, were singularities that might be predicted (e.g. by Leonardo da Vinci), but whose impact on society was really difficult to grasp. The same is true for remote communication, from the telephone to television to the Internet.

Now, one singularity is somewhat special, and it’s when we started building enhancements to our intelligence, and not just our physical abilities. That’s the very definition Vernor Vinge uses when he writes:

I think it’s likely that with technology we can in the fairly near future create or become creatures of more than human intelligence. Such a technological singularity would revolutionize our world, ushering in a posthuman epoch.

But that already happened. The first modest pocket calculators enabled computations so complex that they completely changed the course of engineering. Any engineer with a calculator has “more than human intelligence”, for he can compute faster than any human being without a calculator can. It’s only recently that we redefined intelligence to exclude the ability to perform computations, and the only reason we did that is because computers were so much better at it than we are.

So that’s my personal view on that question: the most important singularity, the one that Ray Kurzweil sees sometime in the future, has already happened, and we are right in the middle of seeing its effects.

Inside a TRS-80 model 100

PC World opens the guts of a TRS-80 model 100, a vintage computer that was one of the first truly portable computers. Unfortunately, that’s not one I have in my collection, so if you happen to have one… There is also a link in this story to the most collectible PCs of all time, and it turns out I have only three of them, not counting pieces of some as yet unidentified Cray which I doubt is a Cray 1.

I remember seeing the TRS-80 model 100, and being unimpressed. What made it so popular among journalists, the set of built-in applications, to some extent lowered its value to young geeks. To me at the time, it looked way too much like a largely oversized business thingie. I was much more impressed by the Canon X-07 at the time. It was not quite as “big” in terms of features (who needs 32K of memory or all these built-in applications?), but it could be connected to a TV and had this very cool plotter.

Back to the TRS-80 model 100, I think that the most interesting part of the story is that this is the last time Bill Gates wrote a significant fraction of the software for a product. And you can hear from the way he describes it that he was really excited about the business uses, about what you could do with the product. Of course, that software crashed from time to time…

Another thing to remember in these days of “green” is that this machine ran for 20 hours on 4 standard AA batteries!

Categories: Computer history

Where did the HP Way go?

Recently, I discussed the old “HP Way” with some HP colleagues. This happens a lot, actually. I’d say that this is a topic of discussion during lunch at HP maybe once a week, in one form or another.

Employees who were at Hewlett-Packard before the merger with Compaq, more specifically before Carly Fiorina decided to overhaul the corporate culture, will often comment about the “good old days”. Employees from companies that HP acquired later, most notably Compaq or DEC, are obviously much less passionate about the HP Way, but they generally show some interest if only because of the role it used to play in making HP employees so passionate about their company.

Oh, look, the HP Way is gone!

One thing I had noticed was that the “HP Way” was nowhere to be found on any HP web site that I know of. It is not on the corporate HP History web site, nor does a search for “HP Way” on that site get any meaningful result. It’s possible that there is a better search string that would get the result, it may even be somewhere I did not look, but my point is that it’s not very easy to find. (Update: Since one reader got confused, I want to make clear that I’m looking for the text of the HP Way, the description of the values that used to be given to employees, which I quote below. I am not looking for the words “HP Way”, which are present at a number of places.)

Contrast this with the About HP corporate page in 1996, and what do you see here as the last link? Sure thing: the HP way is prominently displayed as an essential component of the HP culture. Every HP employee was “brainwashed” with the HP Way from his or her first day in the company. No wonder that years later, they still ask where it’s gone. Back then, the HP way even had its own dedicated web page.

Did someone rewrite HP corporate history?

So the fact that there is no reference to it anywhere on today’s corporate web site seems odd. It almost looks like history has been rewritten. I get the same feeling when I enter the HP building in Sophia-Antipolis. Why? Because there are two portraits in the lobby: Bill Hewlett and Dave Packard. That building was initially purchased and built by Digital Equipment Corporation. If any picture should be there, it should show Ken Olsen, not Hewlett or Packard, nor Rod Canion for that matter.

I would personally hate to have built a company that left the kind of imprint in computer history DEC left, only to see it vanish from corporate memory almost overnight… Erasing the pictures of the past sounds much more like Vae Victis or the work of George Orwell’s Minitrue than the kind of fair and balanced rendition of corporate history you would naively expect from a well-meaning corporate communication department.

Google can’t find the HP Way either…

But the truth is, I don’t think there is any evil intended here. One reason is that Google sometimes has trouble finding the HP Way too. Actually, your mileage may vary. I once got a link on the HP alumni as the second result. But usually, you are much more likely to find Lunch, the HP way, an extremely funny story for those who were at HP in those days (because it is sooo true).

As the saying goes, “never ascribe to malice that which is adequately explained by stupidity”. As an aside, this is sometimes attributed to Napoleon Bonaparte, but I can’t find any French source that would confirm it. I think that the closest confirmed equivalent from Bonaparte would be “N’interrompez jamais un ennemi qui est en train de faire une erreur” (never interrupt an enemy while he’s making a mistake), but that is not even close.

Back on topic, I’m tempted to think that it’s simply a case where the HP Way was no longer considered relevant, and nobody bothered to keep a tab on it in the HP corporate web site. As a result, when looking for “HP Way” on the web today, it’s become much easier to find highly critical accounts of HP than a description of what the HP Way really represented. Obviously, that can’t be too good for HP’s image…

Where can we find the HP Way today?

To finally find a reference to the HP Way as I remembered it being described to HP employees, I had to search Google for a comparison with Carly Fiorina’s “Rules of the Garage”. And I finally found a tribute to Bill Hewlett that quotes both texts exactly as I remembered them.

As the link to the historical HP Way page on the HP web site shows, another option is to use the excellent Wayback Machine to look at the web the way it used to be at some point in the past. But that’s something you will do only if you remember that there once was something to search for. Again, the point here is not that you cannot find it; it’s that finding the original HP Way has become so much more difficult…

The original HP Way: It’s all about employees

The original HP Way was not so much about a company as it was about its employees:

The HP Way
We have trust and respect for individuals.
We focus on a high level of achievement and contribution.
We conduct our business with uncompromising integrity.
We achieve our common objectives through teamwork.
We encourage flexibility and innovation.

There are only five points and very few words. It’s a highly condensed way to express the corporate policy, which trusts the employees to understand not just the rules, but most importantly their intent and spirit. There is no redundancy; each point is about a different topic. These rules were written by engineers for engineers. It’s almost a computer program…

The rules of the garage: What was that all about?

By contrast, the so-called “Rules of the Garage” introduced by Carly Fiorina, look really weak:

Rules of the Garage
Believe you can change the world.
Work quickly, keep the tools unlocked, work whenever.
Know when to work alone and when to work together.
Share – tools, ideas.
Trust your colleagues.
No politics. No bureaucracy. (These are ridiculous in a garage.)
The customer defines a job well done.
Radical ideas are not bad ideas.
Invent different ways of working.
Make a contribution every day.
If it doesn’t contribute, it doesn’t leave the garage.
Believe that together we can do anything.

To me, it sounds much more like the kind of routine you’d give to little kids in kindergarten… It’s imprecise and highly redundant. More importantly, the relationship between the company and the employee is no longer bi-directional as it was in the HP Way. Notice how “we” changed into “you” except in the last one. If I were cynical, I’d say that this new set of rules is: “you do this, and we’ll get the reward”… Isn’t that exactly what happened in the years that followed?

The Rules of the Garage did not last long. They quietly went the way of the dodo, but I don’t think it was ever intended for them to last decades as the HP Way had. Instead, I believe that the goal was to evict the HP Way without giving the impression that nothing replaced it. But in reality, nothing replaced the HP Way: the Rules of the Garage were essentially empty, and after the Rules of the Garage, there was nothing…

How is that relevant today?

Despite its long absence from the HP corporate web site, the HP Way is still seen as the reference for a successful corporate culture nowadays. It’s widely recognized that HP and the HP culture ignited Silicon Valley. There is a good reason for that: highly creative people are what makes this economy thrive. See Phil McKinney’s ideas on the creative economy to see just how relevant this remains today. But guess what: creativity is motivated by the confident belief that there will be a reward.

The HP Way was about the various aspects of that reward: respect, achievement, integrity, teamwork, innovation. I can’t think of much beyond that in terms of recognition. I can’t think of better reasons to work hard. Now, I don’t care much about calling it the HP Way, but I do care about these values being at the core of what my company does. It is no accident that the top technology companies in the world, including a large fraction of Silicon Valley, have applied the HP Way in one form or another. It’s not charity, it’s simply the most efficient way to do business when the majority of your employees are highly creative individuals.

With all the respect that I have for HP’s current management, as far as corporate culture is concerned, they could still learn a thing or two from such history-certified business geniuses as Hewlett and Packard. About eight years after having been actively erased from corporate communication, the HP Way is still very much being talked about; it is still regarded as a reference. Maybe that’s a sign that there is something timeless about it…

Update: HP Way 2.0?

Today, Google “HP Way” and HP’s corporate values show up as the second entry. No keyword stuffing here (I checked), but Google apparently decided that the page was relevant to the topic somehow. It’s a good thing: the keywords in HP’s corporate objectives are much closer to the old HP Way than to the Rules of the Garage. The five original keywords, respect, achievement, integrity, teamwork, innovation, are all there. New keywords have been added, including agility or passion. But the style is the same, very terse, very dense. The HP web site labels this as “our shared values”, but it wouldn’t be unfair to call it “HP Way 2.0”.

The next step is to make employees (and from there, outside observers) really believe that these values are back. People outside HP may not realize just how hard it was to turn HP around. Reorganizations were frequent. A number of good people, colleagues and friends, lost their jobs. This left scars. Many employees don’t feel valued or safe anymore. Many will no longer believe that “trust and respect for individuals” applies to them. This can be fixed, and for HP’s long term, this has to be fixed. Not because HP should be a charity, but precisely because the HP Way is what made HP such a successful business.

Il est dans le caractère français d’exagérer,
de se plaindre et de tout défigurer
dès qu’on est mécontent.
(It is in the French character to exaggerate, to complain, and to distort everything as soon as one is displeased.)

Napoléon Bonaparte

Categories: Computer history, HP

Added an Amstrad CPC-464 to my collection

My brother Matthieu just offered me an Amstrad CPC-464 for my computer collection.

Writing this, I realized that this may be the first time that I actually mention this collection on the blog. I have about 25 computers, and about the same number of calculators, the vast majority more or less in working order. I have to write “more or less”, unfortunately, because it’s not infrequent for one of these old machines to break.

It works, but it’s so slow

Anyway, the Amstrad is in perfect working order, as you can see on the picture above. I had forgotten just how fast these machines booted, and just how slow the BASIC was after that. Old memories kicked in as I tried to practice the horrid program editor of these machines: “EDIT 30” would bring up a line editor for line 30. You could also move the cursor around, and it looked like you were editing, but you really weren’t. You had to use the magic “COPY” key to copy one character at a time for editing. Totally idiotic if you look at it from today’s perspective.

My kids were unimpressed. We tried to load a few games, but it was really boring, even to me. Load the cassette tape, wait 5 minutes for something to load that asks you whether you want to play part 1, part 2 or part 3 of the game, then load again for 5 minutes, and so on. Today’s kids just have no idea how lucky they are to have instant access to a gigantic library of games from the Internet. But even if I grew up with this kind of box, I must admit that my memory of how bad this was had somewhat faded.

At some point, I tried to write a little program to draw a Lissajous curve. This was the kind of thing we thought was cool at the time. Except that it took minutes to draw the curve! I had simply forgotten that. My oldest son, Tanguy, looked at this, and while the curve interested him, I can’t say that the BASIC moved him a bit. We ended up discussing Lissajous curves on the remarkable Apple Grapher application. So much for old times…

On the lookout for other machines…

At some point, I have to spend a little time on eBay trying to look for the missing pieces in the collection, but for the moment, I’m simply looking at random, and picking up things when I think they are interesting. Right now, one of the few pieces left that I’d really like to have is a TI-99/4A. That’s just because I spent so many hours programming it at a friend’s house as a kid…

The other side

Funny that I should, twice in the same day, read a story about what the other side has to say about one of Richard Stallman’s many gripes…

The dawn of 3D games…

I was recently drawn to look back at my first paid programming job, Alpha Waves, which apparently may also have been the first 3D platform game.

Others writing about this game after more than 15 years helped me realize that my recollection of that time might be an interesting bit of computer history worth sharing. (If you don’t care about such old geezer stuff, skip that article!) Some of it is documented elsewhere. Some of it may well never have been written down before.

Anyway, this is the kind of story that might interest my kids, if only them…

Update: One of my kids, reading the article, asked me why I had been calling Alpha Waves “the first real 3D game”, since Starglider 2 had been released more than one year earlier. And it’s true that the static screen snapshots below don’t do justice to the difference between the state of the art at the time and what Alpha Waves brought to the gaming experience. It’s only watching a video of Starglider 2 that my son realized how bad 3D was back then. Videos convey the point much better than words or static pictures:


Starglider 2

Alpha Waves

By the way, Google video really rocks!

Inspiration: Starglider 2



One thing I remember is my inspiration for writing Alpha Waves. It all started with a game called Starglider 2, which for the first time on Atari ST and Amiga (and in microcomputer history, for all I know), featured somewhat realistic 3D graphics. What made them realistic was that this game showed animated flat-shaded graphics. Earlier games like the original Starglider only drew lines. Hiding the lines for back-facing polygons was considered highly advanced stuff. So flat-shaded polygons? That was almost surreal.

My first reaction when seeing Starglider 2 was: “Wow!” My second reaction was: “How do they do that?” Finally, this would turn into: “Can I beat that?” As you can see on the picture, Starglider 2 displayed 3D graphics in a small region of the screen, and only a small number of objects were visible at once. So the next steps, obviously, were to see how large a screen region you could use for 3D graphics, and how many objects you could draw at the same time.



Today’s 3D graphics are generated by hardware capable of filling petagazillions of pixels per second, so young readers may not realize that the memory bandwidth at the time made the simple task of filling the screen with polygons more than a few times per second a challenge in itself. Starglider 2 was smooth!… which, at the time, did not mean 60fps like today, more like 10-15 when the screen got crowded. Remember, this was the time when the boink demo was considered extremely cool.

In summary, doing better than Starglider meant something like filling two thirds of the screen, having 15 objects on screen instead of 4, and remaining above 10 frames per second. Ultimately, Alpha Waves would far exceed this initial objective: full screen, not a single bitmap sprite on screen (even the player was drawn in 3D), and sometimes as many as 50 objects on screen. The Atari ST even ended up with a dual-player mode where two players would compete on screen simultaneously (a feature that never made it to the PC version).

Elaborating 3D algorithms



Obviously, to best Starglider, I first had to understand how one would draw 3D graphics on screen. I quickly rediscovered the mathematical formulas, but they were only the beginning. The real question was how to do it fast. Again, to understand the problem, you have to remember that we are talking about a time when the Motorola 68000 used in the Atari ST and Amiga was considered a relatively fast processor. This processor not only had no built-in floating point capability, it was actually very expensive to do a multiplication! So I ended up reformulating the problem as: “How can I draw 3D using mostly additions?”

The solution would seem extraordinarily obsolete today. The code pre-computed displacements along the X, Y and Z axes, so that it could rotate these vectors only once, and then describe all objects using an encoding that read like: “go one step right, then two steps up, then one step back”. Each individual step was recorded in a temporary array, and then the final 3D object was created by connecting some of these recorded points. Again, this may seem like a very silly algorithm when, today, 3D routinely uses things like quaternions to compute coordinate transforms. But boy! was it fast!
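A minimal C sketch of the idea (the names and encoding are my reconstruction from memory, not the original 68000 code): rotate one step vector per axis once per frame, then rebuild every object point with additions only.

```c
/* A 3D point with integer components, as in the original fixed-point code. */
typedef struct { int x, y, z; } Vec3;

/* One pre-rotated "unit step" per world axis, computed once per frame
   by applying the current rotation to (s,0,0), (0,s,0) and (0,0,s). */
typedef struct { Vec3 step[3]; } Frame;

/* An object is encoded as a list of (axis, count) moves. */
typedef struct { int axis; int count; } Move;

/* Accumulate the pre-rotated steps: no per-vertex multiplication at all.
   Each move's end point is recorded, as the game's temporary array did. */
static void decode(const Frame *f, const Move *moves, int n, Vec3 *out)
{
    Vec3 p = {0, 0, 0};
    for (int i = 0; i < n; i++) {
        const Vec3 *s = &f->step[moves[i].axis];
        for (int k = 0; k < moves[i].count; k++) { /* additions only */
            p.x += s->x; p.y += s->y; p.z += s->z;
        }
        out[i] = p;
    }
}
```

With an identity rotation, the move list “one step on X, two on Y, one on Z” yields the points (1,0,0), (1,2,0), (1,2,1); the final object is drawn by connecting a subset of such recorded points.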

Of course, there were a few other tricks along the way. For example, without floating-point capabilities there was obviously no way to compute a cosine. This was easily solved by storing pre-computed sine/cosine tables returning integer amplitudes (-32767..32767). That made another micro-optimization possible. On the 68000, the multiplication operation took two 16-bit arguments and returned a 32-bit value. Multiplying a 16-bit signed coordinate by a 16-bit signed amplitude gave me a 32-bit signed coordinate. Anybody today would shift that down to obtain a 16-bit value, but the 68000 had no barrel shifter, which meant that shifting down would have been expensive. On the other hand, it had a swap instruction exchanging the high and low 16-bit halves of a 32-bit register. So after applying swap to the result of the multiplication, I was getting a 14-bit coordinate.

Drawing polygons

With the problem of coordinate transforms solved, the next most difficult problem was drawing polygons quickly on screen. This involved a number of steps: clipping, decomposing the result into triangles and trapezoids, and drawing each piece. The details of the techniques used are now well known, and illustrated here (see figure 9.6 in that article).
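As an illustration of the trapezoid-drawing step (a toy reconstruction of the technique, not the Infogrames routines), here is a span filler that steps both edges in 16.16 fixed point, so the inner loops again use only additions:

```c
#include <stdint.h>

#define W 64
#define H 64
static uint8_t fb[H][W];  /* toy framebuffer */

/* Fill a trapezoid whose left edge runs (xl0,y0)-(xl1,y1) and whose
   right edge runs (xr0,y0)-(xr1,y1), for scanlines y0 <= y < y1.
   Each edge advances by one fixed-point addition per scanline. */
static void fill_trap(int y0, int y1, int xl0, int xl1,
                      int xr0, int xr1, uint8_t color)
{
    int h = y1 - y0;
    if (h <= 0) return;
    int32_t xl  = (int32_t)xl0 << 16;
    int32_t xr  = (int32_t)xr0 << 16;
    int32_t dxl = ((int32_t)(xl1 - xl0) << 16) / h;
    int32_t dxr = ((int32_t)(xr1 - xr0) << 16) / h;
    for (int y = y0; y < y1; y++) {
        for (int x = xl >> 16; x < (xr >> 16); x++)
            fb[y][x] = color;  /* horizontal span fill */
        xl += dxl; xr += dxr;  /* additions only */
    }
}
```

An arbitrary clipped polygon is then drawn by splitting it at each vertex’s scanline and calling such a routine once per resulting trapezoid (a triangle being a trapezoid with one degenerate edge).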

My original polygon drawing algorithm was already relatively fast, but the Infogrames folks later insisted that I use their in-house routines to facilitate the porting to Amiga (where they had these routines already available). They had at least a couple of people dedicated to the in-house library of graphics routines on a variety of machines, and they were slowly switching to C for the high-level game architecture, using C a little bit like we use scripting languages today, for the slow stuff. And honestly, the Atari ST version of their polygon routine was a little bit faster than mine, using self-modifying code to optimize the inner loop as it ran.

Well, that optimization made it incompatible with the new and amazing 68030-based Atari TT (because you needed to tell the instruction side of the processor to re-fetch the data you had just written on the data side, which that code did not do). Being pretty annoyed at the 2% difference between their code and mine, I created a best-of-breed combined routine using an assembler variant of Duff’s device, which if my memory is correct, bested their code quite handily, and also ran on the Atari TT. But that all happened much later, when we were in the final phase of the game development.
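For the curious, here is what the classic C form of Duff’s device looks like (a sketch of the idea only; the real routine was 68000 assembler filling screen words, not bytes). The trick is to unroll the loop and jump into the middle of the unrolled body to handle the remainder, without any separate fix-up loop:

```c
#include <stddef.h>
#include <stdint.h>

/* Fill count bytes with value, eight stores per loop iteration.
   The switch dispatches into the unrolled body for the remainder. */
static void fill8(uint8_t *dst, uint8_t value, size_t count)
{
    if (count == 0) return;
    size_t n = (count + 7) / 8;    /* number of loop iterations */
    switch (count % 8) {           /* jump into the unrolled body */
    case 0: do { *dst++ = value;
    case 7:      *dst++ = value;
    case 6:      *dst++ = value;
    case 5:      *dst++ = value;
    case 4:      *dst++ = value;
    case 3:      *dst++ = value;
    case 2:      *dst++ = value;
    case 1:      *dst++ = value;
            } while (--n > 0);
    }
}
```

Unlike the self-modifying version, this dispatch works on processors with instruction caches, which is why a Duff-style routine could run unmodified on the Atari TT.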

From polygons to worlds

Earlier stages of the development of Alpha Waves were much more modest. I was only starting to be able to draw and rotate simple 3D objects. It began with a big cube showing the limits of the 16-bit coordinate space, the limits of my “world”. Then, inside that cube, I placed another one, and one more to test how objects were hiding one another, and so on.

Soon, I had something like a dozen cubes floating in space. And more, and more. And that’s when it slowly became clear that I was close to achieving my original dream. This was definitely faster than Starglider. Even with all these objects drawn on screen, on the entire screen no less, this was still smooth. I was thrilled, I was proud! This may seem ridiculous when today anybody can run Second Life and access some 24 terabytes of stored world information, but at the time, this was world-class 3D graphics.

But all things considered, rotating cubes on a screen gets pretty boring pretty fast. So to test my graphic routines, I started exploring ways to move inside my little 3D worlds. The first one was the most obvious possible: some kind of flight simulator that would let you fly through the world. You’d turn left and right or up and down with the joystick, and move forward by firing the joystick. That allowed me to test all possible rotation angles. To avoid the bugs when coordinates exceeded the 16-bit space, I added code that would keep me inside the coordinates cube. It was as if you bounced on the walls, on the ceiling or on the floor.

Believe it or not, I thought it was fun to bounce against the walls, and started testing all kinds of funny dynamics. The next obvious step was to bounce against the cubes inside the world instead of flying right through them as was then the de-facto industry standard… The cubes acted like big repellers, and so bouncing off one would allow you to quickly accelerate, for instance to climb to the top of the world. Add a little gravity, some platforms on the floor to start bouncing when you fell, and Alpha Waves was born!

In my mind, I was sort of recreating the experience of being a smurf, which in the comics bounce from the floor to reach a table, and so on. And I was starting to think that I might build some kind of smurf-based adventure game around that graphics technology. To that end, I started creating a little mechanism to switch from one “room” of the game to the next, through “doors” located on the walls. The trick was to reach the door, and you would switch to the next room. This way, I could explore an even larger world.

I was certainly starting to see some game potential in my code, but I was still not considering that little toy demo as a game, more like the foundation for what might one day be a real great adventure game in 3D. Sure, I had a lot of fun bouncing around, but who else would find this funny?

I couldn’t have been more wrong!

Alpha Waves, meet Infogrames

I understood that when I presented this code to Infogrames. At the time, this was still a pretty small company occupying a single floor of a building in Villeurbanne. I don’t know how large it was, but I would guess about 20-30 people. Still, this new building was already a giant step up compared to the office I had visited one year earlier, during my first interaction with them. And I need to explain that the first interaction was the reason I was getting back to Infogrames.


The first time, I had been looking for a student job, and they were looking for someone to translate some book about expert systems. Don’t ask me why. I think that Bruno Bonnell thought at the time it was a good idea to diversify the company into “serious stuff” like artificial intelligence and some kind of expert system software for mom and pop. No kidding! These were wild times…

In any case, I did the required job, but when the time came to be paid… they had already figured out that expert systems were a totally crazy idea and changed their mind. So they did not need my work any more, and in that case, why pay for it? Guess what: when you are less than 20 years old, you are pretty naive, and you tend to trust folks. In short, there was no written contract, just a gentlemen’s agreement that they all too happily broke. I did not see a dime.

So one year or so later, when I returned to them, it was primarily with the intent to make up for that loss by working as an intern for a month, doing little and learning a lot from them. Why did I think this was a good idea? I don’t know. I just wanted them to pay me something, I guess.

Hey! This is a game!


Anyway, to convince them to hire me during the summer months, I had prepared a sort of portfolio with various tiny programs: a number of half-baked experiments with technologies like 2D scrolling, sound, text display, and so on. Not a single one of these programs was a real game; my point was rather to show that I could write code.

The piece of code I thought would impress them the most was a small clone of Time Bandit, with a few additional features such as proportional text displayed on screen, which looked so much nicer than the typical fixed fonts of the time… There was not much of a plot, only a few worlds, but I thought it demonstrated that I knew enough about game coding for a summer internship. Well, when I showed that code, Infogrames’ technical director essentially yawned. I could see my chance of getting my money back slipping away…


Disappointed, I took the last floppy in my pile, the one containing the “Cube” demo. If my marvelous Time Bandit clone had failed to impress them, this would definitely be a total flop as well. Anyway, I started it up, gave the joystick to the technical director… and one hour later, he was still holding it, bouncing left and right like crazy! In my memory, he looked every bit like the guy in the picture, fascinated by this new and strange kind of game…

When he finally left the room, the technical director quickly came back with a contract that essentially read: Christophe de Dinechin will be paid 5000F (less than $1000) for a two-month internship working on Infogrames’ “Project Cube”. Well, fool me once, shame on you; fool me twice… My answer was quick: no way, that game is not an Infogrames project, it’s almost finished (something I had realized only minutes earlier); if you want it, this will be a royalty-based contract. He replied: “This is a standard contract, just sign it, we will adjust it later.” Fortunately, I had previous experience with this kind of “contract”, so I steadfastly refused to sign anything.

Frederick Raynal

After that, things moved relatively fast. Negotiating royalties was a real pain for me and, by his own account, a pleasure for Bruno Bonnell. He commented something along the lines of: “All these guys tell me that they don’t need much… Well, that’s what they get!” I fought for a royalty rate that I thought was decent. In the end, I was very disappointed when I left the room with, if my memory serves me right, something like 17%. I shared my disappointment with the engineers around me. I remember a silence. And then I was told that this was actually the best rate Infogrames had ever conceded to an independent author.

I do not recall exactly how Frederick Raynal got involved, but what I do recall is that he looked at the code and told his management he felt it could be ported to the PC. This was against Infogrames’ policy at the time, which was to never port an assembly language game to a different CPU. Frederick argued that my code contained comments all over the place that made it very clear what was happening. And so he ported it to the PC. He actually did more than port it: the PC version included, for example, a very nice tutorial showing how to play the game, which did not exist in my original version. The only thing he failed to do was account for different CPU speeds, so Alpha Waves is practically unplayable on today’s machines without slowing it down quite a bit. Update: I had him read this blog, since I’m talking about him, and he commented that it was the last time he made this mistake :-)

More than anything, Frederick is a really nice guy, and we still exchange emails every other year. As history would record, he went on to create Alone in the Dark, a game that was extraordinarily successful, and then many other very successful games. Alone in the Dark used 3D graphics for its characters, a first in the industry. Frederick has said that this use of 3D was a consequence of his earlier work on Alpha Waves.

Aftermath

The rest of the story is, unfortunately, consistent with Infogrames’ earlier behavior regarding payments. I had all the trouble in the world getting them to pay the royalties as scheduled in the contract. For a short while, I considered reusing my game engine for the Smurf-style adventure game. But after several late-payment notices, I got fed up with Infogrames and gave up gaming for good. Apparently, Frederick Raynal had similar issues after the success of Alone in the Dark.

This is my personal experience of the early days of the gaming industry, and of the very beginning of 3D in video games. It was wild, it was fun! Thanks to Alpha Waves and my urge to beat Starglider, I got to meet a few people who are living legends today, and to see the early days of the company that later bought American icon Atari.

If you have any stories about this period, I’d love to hear them. Please leave them in the comments area.


Update: Well, it turns out I did not know much. Actually, the MS-DOS port of Elite also had flat-shaded graphics, unlike the hidden-line graphics of earlier versions. Like Starglider 2, however, the 3D experience was lacking.

Update: I also discovered that the id Software web site claims that Hovertank was the first 3D game on the PC. Obviously, they are wrong: Alpha Waves predates it by one year, and it offered a better 3D experience.

Je suis jeune, il est vrai ; mais aux âmes bien nées
La valeur n’attend point le nombre des années.
(I am young, it is true; but to well-born souls, valor does not wait for the number of years.)

Pierre Corneille, Le Cid

Old computer ads…

For those who never saw the ad below, look at this really funny walk down memory lane.

Categories: Computer history, Funny

Code bloat does not pay

Comparing an old Mac Plus with a fairly recent PC, it looks like we did not make much progress in responsiveness:

Check out the results! For the functions that people use most often, the 1986 vintage Mac Plus beats the 2007 AMD Athlon 64 X2 4800+: 9 tests to 8! Out of the 17 tests, the antique Mac won 53% of the time! Including a jaw-dropping 52 second whipping of the AMD from the time the Power button is pushed to the time the Desktop is up and useable.

We also didn’t want to overly embarrass the AMD by comparing the time it takes to install the OS vs. the old Mac. The Mac’s average of about a minute is dwarfed by the approximately one hour install time of Windows XP Pro.

Meanwhile, Steve Jobs and Bill Gates met for an interesting interview. I don’t know, I find the parallel funny…

Categories: Computer history, Funny