Giving Up And Starting Over

Posted on:
October 19, 2015
Posted in:
Generation 5 Engine, Graphics, JavaScript, Programming

I’ll admit a certain amount of shame in giving up on my project. I always knew that it would be tough to build a software-based graphics engine on par with the N64 and PSX from scratch (well – sorta – the PSX would still be a cinch). I underestimated, however, how much work would go into just copying the basic OpenGL data flow and state machine. This is the sort of thing that seemed much more doable when I was a high school student naive to what else is out there.

It dawned on me as I was collecting all the various minuscule pain-in-the-ass bits I’d need to build my project, and after reading an article about the benefits of using the web as your software platform instead of a native app, that it would make infinitely more sense just to build this in WebGL.

WebGL is a sort of variation on OpenGL ES – the stripped-back version of OpenGL employed on smartphones and other embedded devices.

Side Note: What’s With the HP/palm logo?

Boring trivia: I learned OpenGL ES on a Palm Pre, and WebGL is based on OpenGL ES. When it came time to learn WebGL I just copied over the test scene I already had.

Issue #1: Low Resolution With Anti-aliasing

It isn’t actually very hard to do low resolution graphics in WebGL. Just like OpenGL ES, you can create a Renderbuffer at any valid texture resolution, render to that, and then draw that Renderbuffer onto a polygon that fills the screen.

The problem is that WebGL doesn’t support anti-aliasing for renderbuffers at all – so if you want to emulate the look of an N64 you have to get a little bit creative.

  1. Create 2 renderbuffers – one at the target resolution and one at 4x the target resolution.
  2. Bind the high-resolution renderbuffer as the render target.
  3. Render the scene.
  4. Generate mip-maps for the high-resolution renderbuffer.
  5. Render the high-resolution buffer onto the low-resolution buffer (OpenGL will automatically choose mip level 2, which effectively gives you 16-tap super-sampled anti-aliasing).
  6. Render the low-resolution buffer onto the screen.

You can immediately see the appeal of this approach: steps 1 and 2 can be collected into a “start frame” function, and steps 4-6 into a “finish frame” function. Aside from calling those two functions, a game developer can use WebGL normally in between.
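To make that concrete, here's a rough sketch of what that pair of functions could look like. The names (createRenderTarget, startFrame, finishFrame, drawFullscreenQuad) and the buffer sizes are just placeholders, and the render targets are built as framebuffer objects with power-of-two texture attachments so that step 4's mip-map generation works – treat it as an illustration of the data flow, not a finished implementation:

```javascript
// Sketch only – assumes `gl` is an existing WebGLRenderingContext and that
// drawFullscreenQuad(gl, texture) is a placeholder for whatever routine
// draws a screen-filling polygon textured with `texture`.

function createRenderTarget(gl, width, height, mipmapped) {
  const texture = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, width, height, 0,
                gl.RGBA, gl.UNSIGNED_BYTE, null);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER,
                   mipmapped ? gl.LINEAR_MIPMAP_LINEAR : gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
  gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);

  const framebuffer = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                          gl.TEXTURE_2D, texture, 0);

  const depth = gl.createRenderbuffer();
  gl.bindRenderbuffer(gl.RENDERBUFFER, depth);
  gl.renderbufferStorage(gl.RENDERBUFFER, gl.DEPTH_COMPONENT16, width, height);
  gl.framebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_ATTACHMENT,
                             gl.RENDERBUFFER, depth);

  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  return { framebuffer, texture, width, height };
}

// Step 1: one target at the output resolution and one at 4x that size.
const lowRes  = createRenderTarget(gl, 512, 256, false);
const highRes = createRenderTarget(gl, 2048, 1024, true);

function startFrame(gl) {
  // Step 2: all scene rendering goes into the high-resolution target.
  gl.bindFramebuffer(gl.FRAMEBUFFER, highRes.framebuffer);
  gl.viewport(0, 0, highRes.width, highRes.height);
}

function finishFrame(gl, canvas) {
  // Step 4: build the mip chain for the freshly rendered high-res image.
  gl.bindTexture(gl.TEXTURE_2D, highRes.texture);
  gl.generateMipmap(gl.TEXTURE_2D);

  // Step 5: downsample onto the low-res target; trilinear filtering picks
  // mip level 2, giving the 16-tap super-sample described above.
  gl.bindFramebuffer(gl.FRAMEBUFFER, lowRes.framebuffer);
  gl.viewport(0, 0, lowRes.width, lowRes.height);
  drawFullscreenQuad(gl, highRes.texture);

  // Step 6: scale the low-res result up to the browser's canvas.
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.viewport(0, 0, canvas.width, canvas.height);
  drawFullscreenQuad(gl, lowRes.texture);
}
```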

Issue #2: Color Accuracy

The issue here is actually that modern hardware is much, much more accurate than the N64 and PSX ever were. What I was rendering with the above technique just didn't have the right flavor. My solution was to add a little code to the fragment shader for Step 5 above that converts the color to three 8-bit RGB integers, then dithers them down to three 6-bit RGB integers.
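For illustration, here's a rough sketch of what that bit of shader code could look like, written as a GLSL snippet in a JavaScript string. The 2x2 ordered-dither pattern and the function names are just placeholders, and this version quantizes straight to 6 bits per channel rather than going through an 8-bit intermediate:

```javascript
// GLSL to fold into the fragment shader used for Step 5's downsample.
// Sketch only – the dither pattern and names are placeholders.
const ditherGLSL = `
  // Threshold in [0, 1) from a 2x2 Bayer pattern, based on pixel position.
  float ditherThreshold(vec2 fragCoord) {
    vec2 m = mod(floor(fragCoord), 2.0);
    return (2.0 * m.x + 3.0 * m.y - 4.0 * m.x * m.y) / 4.0;
  }

  // Quantize a [0, 1] color down to 6 bits per channel; the dither threshold
  // decides which way in-between values round on each pixel.
  vec3 ditherTo6Bit(vec3 color, vec2 fragCoord) {
    return floor(color * 63.0 + ditherThreshold(fragCoord)) / 63.0;
  }
`;
```

In the shader's main(), the final color would get passed through ditherTo6Bit(color.rgb, gl_FragCoord.xy) before being written to gl_FragColor.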

Issue #3: Rose-Colored Glasses

320×240 is an exceptionally low resolution. In fact, it’s much lower than most people would picture if they were being nostalgic for the era. It also leads to a bit of waste. Renderbuffers need to be sized as powers of two, meaning that to contain a 320×240 screen you need a 512×256 buffer.

Using the whole 512×256 buffer doesn’t consume any additional resources and makes things look just a little bit more crisp.
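For reference, the rounding that produces those sizes is a one-liner (the function name here is just a placeholder):

```javascript
// Round a dimension up to the next power of two: 320 -> 512, 240 -> 256.
function nextPowerOfTwo(n) {
  return Math.pow(2, Math.ceil(Math.log2(n)));
}
```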

Issue #4: Scaling Up

The final step is to scale the low-resolution renderbuffer onto whatever size screenbuffer the browser has supplied. There are some issues, though – namely that simple nearest-neighbor scaling leads to inconsistently sized pixels. This is still unsolved.

Edit: Solved!

Obviously I didn’t want to simply use a bilinear resample – you’d lose the charm that comes from big chunky pixels. The reason a nearest-neighbor scale looked bad was that each screen pixel had to account for some fraction of a virtual pixel – and since the virtual pixels don’t divide evenly into screen pixels, some of them end up drawn wider or taller than others.

The solution was to figure out:

  • How much of the screen pixel covers the nearest-neighbor virtual pixel, both horizontally and vertically.
  • How much of the screen pixel covers the neighboring virtual pixels (which is easy – if 25% of the screen pixel covers this column, 75% must cover the next column).
  • A weighted average of the 4 pixels involved, using those coverages as weights (sketched in the shader after the screenshot below).
[Screenshot: closeup view of this algorithm in action – in practice, it looks like the pixels themselves have been anti-aliased. Full size: 1920×1200.]
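Here's a rough sketch of how that weighting could look in the upscale fragment shader, again as a GLSL string. The uniform and varying names are placeholders, and the math assumes you're scaling up, so each screen pixel's footprint covers at most a 2x2 block of virtual pixels:

```javascript
// Fragment shader for the final upscale. Sketch only – names are placeholders.
const upscaleFragmentShader = `
  precision highp float;

  uniform sampler2D lowResTexture;   // the low-resolution render target
  uniform vec2 virtualResolution;    // e.g. vec2(512.0, 256.0)
  uniform vec2 screenResolution;     // size of the browser's drawing buffer
  varying vec2 vTexCoord;            // 0..1 across the screen-filling quad

  void main() {
    // This screen pixel's footprint in virtual-pixel units.
    vec2 pos       = vTexCoord * virtualResolution;
    vec2 footprint = virtualResolution / screenResolution;
    vec2 lo = pos - 0.5 * footprint;
    vec2 hi = pos + 0.5 * footprint;

    // The virtual pixel the footprint starts in and the one it ends in
    // (they are the same when the footprint doesn't straddle a boundary).
    vec2 texelA = floor(lo);
    vec2 texelB = floor(hi);

    // Fraction of the footprint that spills over into texelB on each axis;
    // the remainder is covered by texelA.
    vec2 weightB = 1.0 - (min(hi, texelA + 1.0) - lo) / footprint;

    // Weighted average of the 4 virtual pixels involved.
    vec2 invRes = 1.0 / virtualResolution;
    vec3 c00 = texture2D(lowResTexture, (texelA + 0.5) * invRes).rgb;
    vec3 c10 = texture2D(lowResTexture, vec2(texelB.x + 0.5, texelA.y + 0.5) * invRes).rgb;
    vec3 c01 = texture2D(lowResTexture, vec2(texelA.x + 0.5, texelB.y + 0.5) * invRes).rgb;
    vec3 c11 = texture2D(lowResTexture, (texelB + 0.5) * invRes).rgb;
    vec3 color = mix(mix(c00, c10, weightB.x), mix(c01, c11, weightB.x), weightB.y);

    gl_FragColor = vec4(color, 1.0);
  }
`;
```

When a screen pixel sits entirely inside one virtual pixel the weights collapse to plain nearest-neighbor; only the pixels whose footprints straddle a boundary get blended, which is why the edges between chunky pixels come out looking anti-aliased.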

Next Steps

I only barely have a working alpha together. My next steps are to take all of this code I have and put it together into a single thing that can be initialized and called. The goal, of course, is to create as little impediment to a developer as possible.

I should also try to create some N64-like scenes to test with – maybe a race track or a castle or something.