AABB Collision Resolution, Phase I

The Metareal World is very, very, rectangular.

MapRoomSmall

The classic world is an 8×8 grid of rooms, each room described by a 12×12 grid of characters. (Each room was checkered with two characters and a color scheme.) Your probe could only move exactly up, down, left, right. Most objects also moved only cardinally. (1)

I took a wonderfully simple shortcut for collision detection, which ran fine on a 1 MHz 6502 with maybe a dozen game objects on the 280×192 pixel screen at any time. It was this. When drawing each object to the screen, perform a logical AND of each of its bytes with the pixels underneath it, then XOR it onto the screen. Afterwards, if any of the ANDs yielded a nonzero result, we know the object has collided with something. The usual response was to simply back it up to its previous position, and maybe reverse its direction if it’s a “bouncer”.

It was visually perfect: if it looks like they collide, they collide. (2)
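The trick translates into a modern sketch something like this. Everything here is a reconstruction, not the actual 6502 code: the byte-per-row 1-bit framebuffer layout and the `xorBlit` name are my own assumptions, chosen just to show the AND-then-XOR idea.

```cpp
#include <cassert>
#include <cstdint>
#include <vector>

struct BlitResult { bool collided; };

// XOR-draw a sprite into a 1-bit framebuffer (one byte = 8 pixels).
// While drawing, AND each sprite byte with what's already on screen;
// any nonzero AND means lit pixels overlapped: a collision.
BlitResult xorBlit(std::vector<uint8_t> &screen, int screenRowBytes,
                   const std::vector<uint8_t> &sprite, int spriteRowBytes,
                   int rows, int xByte, int yRow)
{
    bool hit = false;
    for (int r = 0; r < rows; r++)
    {
        for (int b = 0; b < spriteRowBytes; b++)
        {
            uint8_t &dst = screen[(yRow + r) * screenRowBytes + xByte + b];
            uint8_t src = sprite[r * spriteRowBytes + b];
            if (dst & src)     // overlap of lit pixels: collision
                hit = true;
            dst ^= src;        // XOR draw (XOR again later to erase)
        }
    }
    return { hit };
}
```

A pleasant property of XOR drawing is that drawing the sprite a second time in the same place erases it, so “back it up to the previous position” is just another blit.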

The modern world, however,

FakeWorld1

has different rules. The world is still grid-like, but may deviate in places, for dramatic effect. Your probe can move in any direction. Even, like, you know, diagonals. As clever as “pixel collision detection” may have been in 1984, it’s not the right approach any more.

However, the walls shall all be rectangular and planar, and plane-aligned to one of XY, YZ, or ZX. And although objects may be oddly shaped, I’m comfortable treating them, for collision-detection purposes, generally as boxes as well.

In game programming, this kind of geometry is called “axis-aligned bounding-boxes”, or AABB.

The general flow for each frame is:

  • Objects may move or teleport
  • Collisions are assessed
  • Corrections are made to conform to the world’s rules; for example, objects can’t move through walls, so they must stop at the wall.

First Problems

Iterating with the design and implementation revealed a few gotchas and false starts.

Problem: Sudden snapping looks bad!

A tempting approach is to look for intersections, and find the minimal move to “fix” the intersection.

CollisionP1C1

But visually, a sudden correction to your line of travel makes no sense. It feels clunky.

Solution: Correction must be to move less, not more

To remain visually pleasant and plausible, a correction must only work against the movement. It mustn’t move further than the original movement, or in a different direction. (3)

CollisionP1C2
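The rule can be sketched one axis at a time. This is a hypothetical helper of my own, not actual Metareal code: a legal correction must oppose the movement, and caps out at fully undoing it. (Teleporting objects skip this rule and take the minimal correction in any direction, per footnote 3.)

```cpp
#include <cassert>
#include <cmath>

// Clamp a proposed correction on one axis so it only works against the
// movement, and never moves further than the movement itself.
double clampCorrection(double movement, double correction)
{
    if (movement == 0)
        return 0;                        // no movement on this axis: nothing to undo
    if (movement * correction > 0)
        return 0;                        // same direction as movement: not allowed
    if (std::fabs(correction) > std::fabs(movement))
        return -movement;                // at most, fully undo the movement
    return correction;
}
```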

Problem: Snagging on Seams

CollisionP1C3

As an object moves along a wall, pressing downward, at some point it might seem like the “minimal correction” is to stop the forward motion. At first thought, doing the smallest possible correction seems best… but if we handle W2 first, then we “snag” on the wall seam.

Solution: Handle “necessary” corrections first

If we look at each intersection, we can see that on W1, only one correction is possible: upward. W2, on the other hand, has a choice between slightly left, or very up. Since W1 has no choice, we commit to the upward motion. Then we handle W2, which turns out to need no further correction. This also handles the case of corners:

CollisionP1C4

Where each of W1 and W3 has only one viable correction. W2 has a choice and, in effect, uses both of them.

Non-Problem: Bullet Penetration

In the Metareal World, things don’t move fast. In some game worlds, we care deeply about projectiles and vehicles passing rapidly through walls. “It can’t happen here.”

CollisionP1C5

It turns out in this world, conveniently, we don’t need to worry about it. Objects can only move a fraction of their own size per frame. (The corner example above depends on this, by the way, to not squirt “between” W1 and W3, leaving only W2 in collision.)

Implementation Details

First thing to note: Metareal does this in 3d, not 2d. Everything is as shown above in the pictures, but in three axes.

The implementation is as follows (4):

  • (This is done for each moving object.)
  • The movement is in some direction (Mx, My, Mz).
  • For each intersection, describe a correction (Cx, Cy, Cz) that would move it out of the intersection.
    • The correction shows 1 or more possible axes of correction. (-1, 2, 0) would mean that x-1 or y+2 would resolve the intersection.
    • The correction is in the opposite direction from the movement. We never resolve a collision by moving even more than the original movement. This is the rule that prevents lurchy snappings.
  • We go through the list of corrections. Those with a single axis of correction, like (0, 2, 0), are applied. It means there is no alternative to that shove to get out of collision.
  • These mandatory shoves are subtracted from the remaining corrections, all of which involve a choice between several possible axes. Thus modified, the smallest axis of correction for each is chosen.

If only the object is moving, and the walls are not, then the shove directions from all collisions have the same signs — in particular, opposite of the movement.

Also, if the object started out not in collision, then the greatest possible correction is simply to undo the movement. This would happen when trying to move into a corner.
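Taken together, the two-pass resolution might look like this sketch. The struct and function names are mine, not the real implementation; it assumes (as stated above) that all corrections share signs, opposite to the movement.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// A correction offers up to three axis options; a zero component means
// "this axis can't fix it".
struct Correction { double c[3]; };

static int optionCount(const Correction &k)
{
    int n = 0;
    for (int i = 0; i < 3; i++)
        if (k.c[i] != 0) n++;
    return n;
}

// Compute the total shove applied to the object.
void resolve(const std::vector<Correction> &corrections, double shove[3])
{
    shove[0] = shove[1] = shove[2] = 0;

    // Pass 1: single-axis corrections are mandatory; commit the largest per axis.
    for (const Correction &k : corrections)
        if (optionCount(k) == 1)
            for (int i = 0; i < 3; i++)
                if (k.c[i] != 0 && std::fabs(k.c[i]) > std::fabs(shove[i]))
                    shove[i] = k.c[i];

    // Pass 2: subtract the committed shove from each remaining correction's
    // options. If an option is already satisfied, nothing more is needed;
    // otherwise take the smallest surviving axis.
    for (const Correction &k : corrections)
    {
        if (optionCount(k) <= 1)
            continue;                          // handled in pass 1, or empty
        int best = -1;
        double bestMag = 0;
        bool satisfied = false;
        for (int i = 0; i < 3; i++)
        {
            if (k.c[i] == 0) continue;
            double remaining = k.c[i] - shove[i];
            if (remaining == 0 || remaining * k.c[i] < 0)
            {
                satisfied = true;              // mandatory shove already resolved it
                break;
            }
            if (best < 0 || std::fabs(remaining) < bestMag)
            {
                best = i;
                bestMag = std::fabs(remaining);
            }
        }
        if (!satisfied && best >= 0)
            shove[best] = k.c[best];           // top this axis up to the full option
    }
}
```

With the seam example above: W1 contributes a mandatory (0, 2, 0), W2 offers (-1, 2, 0); after committing W1’s shove, W2 is already satisfied, so no snag.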

Futures and Glossovers

There are still things this algorithm handles poorly.

  • Each object has a priority. Walls have a higher priority than objects, for example. This is taken into account to decide which thing gets shoved.
  • Transitive shoving is not well-handled yet. Ideally, this should be resolved within each frame, not rippling forward in time. Also, shove priority should be transitive as well. For example, normally a P3 object cannot push a P2 object. But if the P3 object is impelled by a P1 object into the P2 object, then, backed by the higher priority (5) force, it should successfully move the P2 object as a result.
  • The actual implementation treats the movement vector as the difference between the object’s movement and the “wall’s” movement (which might not really be a wall). If these are from different directions, the corrections may be contradictory; resolution may be impossible. For now, I average them and let the object get squished between them. But it is not satisfying.
  • In something resembling the “relativistic sliding hole” paradox, a sliding object can slip into a gap that is smaller than itself. Then it’s once again in the “pushed from both sides” state, which doesn’t work well.

Some of these can be mostly avoided with careful world design. Transitive shoving will be implemented as part of Phase II; it will be needed.



Footnotes

1. Last night I had a dream involving a post-apocalyptic Oakland (comma, California), vast concrete parking garage-like spaces. The bedraggled, homeless denizens clustered and sat here and there, moving metal scraps and junk around on crude, chalk-drawn boards. They were in some way psychically addicted to playing chess. Anyway, the term “cardinal directions.” I wonder if Rooks ought more properly to be called Cardinals.

2. Except that on the Apple II green and purple don’t collide, they occupy alternate bits. So objects were carefully designed to have white parts at collision-ey points.

3. Sometimes an object uses “teleporting”, where it moves a great distance without expecting smooth visual continuity. In that case, it’s ok to lurch, and indeed we can just take the minimal correction in any direction to enforce the world’s physics.

4. This answer on gamedev.stackexchange.com suggests simply resolving all your X-penetrations first, and then Y-penetrations. But I’m not convinced.

5. Since I come from an embedded hardware background, I’m accustomed to explaining that interrupts with lower numbers have higher priorities. I carry this convention forward.

Antic Magazine, Volume 4, Number 7, November 1985


ANKH
Datamost
19821 Nordhoff Street
Northridge, CA 91324
(818) 703-1202
$19.95, 48K disk
Reviewed by Jack Powell

anhk

When I was a child, I bought a puzzle box in Chinatown. It was lacquer-shiny and inlaid with all sorts of colorful, cryptic symbols. And to open it up, you had to find these hidden panels and slide them up, right, down, and in just the right combination before the top slid back.

ANKH from Datamost is a little like that Chinese puzzle box. It’s called an “Adventure in the MetaReal World” but it’s really more of a graphic puzzle than an adventure.

You control a strange little four-color blimp, described in the documentation as your “other.” The object is to explore all 64 rooms in the game. And to do this, you have to solve various puzzles by opening doors and picking up objects. A large part of the challenge is figuring out just what the puzzles are.

There are a few meanies to avoid in some of the rooms. You can shoot them, or outrun them. They’re really not that dangerous; the main thing is the puzzle factor.

And you must always watch your Karma. It’s the green line on the right of the screen.

If this doesn’t sound like your usual computer game, you’re right. It’s different. In philosophical tone, it’s a little like Lifespan from the Antic Arcade Catalog. Game play, however, is closer to Sir Galahad and the Holy Grail.

The documentation is purposely vague. It really can’t say much without spoiling the game. A flyer was included in the package, however, which takes the player, step by step, through the first few puzzles. Datamost probably added this after their phone started ringing off the hook.

The ambiguity can get pretty frustrating. When the game begins, you’re presented with arrows pointing right and left, and the word “CHOOSE.” Choose “right” and you begin what appears to be the main game. Choose “left” however, and you end up playing around with what seems to be a pointless character-graphics screen. I’ve gone both directions and made it through 54 of the 64 rooms, but I still haven’t figured out what’s going on in the “left” area. It’s mentioned nowhere in the documentation. Perhaps it’s a meditation room.

ANKH is not an action game. There’s plenty of time to sit in one room and think about your next move. Some solutions require coordination, but most require experimentation and abstract reasoning.

If you like puzzles, this is your kind of computer game. I like puzzles.

Editing, Bitmaps, and Mesh

Added a tremendously useful parameter type to the editor: 1-bit-deep bitmap type. The editor is clunky, by traditional standards, but as ever, fine for me.

One use for a bitmap parameter on an object is to procedurally generate its geometry. Allows a wide variety of expression without importing 3d assets. Since part of the “look” of the product will be “code, code, code”, this works well. The relatedness of the forms conveys the fact nicely.

Generating mesh from bitmap was fun. I used a greedy algorithm to find rectangles from top left to bottom right, knocking out bits once found. Tracing the extrusion edges was trickier. It works by:

  • Building a list of every unit segment which is an edge (0-1 boundaries)
  • Take one, and link to the next & the next, until the start point is re-found
  • Make the sides longer than unit segments: if three vertices in a row have matching X’s or Y’s, remove the middle vertex. Repeat.
  • For each loop, determine if it is clockwise or counterclockwise (four more lefts than rights, or four more rights than lefts)
  • Determine if it contains bits or spaces — find the leftmost lowest corner, and see if it surrounds a bit or a space
  • If necessary, reverse the loop (bits clockwise, spaces counterclockwise) so the faces face outward from the final geometry.
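The greedy rectangle pass might look roughly like this. It is my reconstruction under my own names, not the actual editor code: scan top-left to bottom-right, grow each found bit into the widest run, then the tallest block of matching runs, and knock those bits out.

```cpp
#include <cassert>
#include <vector>

struct Rect { int x, y, w, h; };

// Decompose a bitmap into rectangles, greedily, scanning top-left to
// bottom-right. Claimed bits are zeroed so they aren't found again.
std::vector<Rect> greedyRects(std::vector<std::vector<int>> bits)
{
    std::vector<Rect> rects;
    int rows = (int)bits.size();
    int cols = rows ? (int)bits[0].size() : 0;
    for (int y = 0; y < rows; y++)
    for (int x = 0; x < cols; x++)
    {
        if (!bits[y][x]) continue;
        int w = 0;                           // grow right along the run
        while (x + w < cols && bits[y][x + w]) w++;
        int h = 1;                           // grow down while full rows match
        bool ok = true;
        while (ok && y + h < rows)
        {
            for (int i = 0; i < w; i++)
                if (!bits[y + h][x + i]) { ok = false; break; }
            if (ok) h++;
        }
        for (int r = 0; r < h; r++)          // knock out the claimed bits
            for (int i = 0; i < w; i++)
                bits[y + r][x + i] = 0;
        rects.push_back({ x, y, w, h });
    }
    return rects;
}
```

Greedy isn’t minimal (an L-shape yields two rectangles where a cleverer split might prefer a different pair), but for editor-authored bitmaps it’s plenty good.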

Also, beginnings of mapping these objects to Volume Collision Objects. The world begins to operate.

Shader Stylings

Took a “fun day” to play with shader stylings. I know the look I want is somewhat anti-realistic, but rich. In places, flat-shaded and dreamlike. These dabblings come close. Not to go too far down that path just now; ratholing on the wrong details is a project failure mode. On the other hand, my personal approach to learning and accomplishing involves repeated revisitations.

2015-04-09

Game: World Editor Beginnings

In this post, I’ll document some of the blow-by-blow of a (hopefully!) rapid bring up of a world editor. Rapid means different things to different people… I’m hoping for a couple of weeks here.

Wednesday, 2015-03-25

Haven’t looked at the codebase in about a week and a half; was busy getting ready for the first deployment of my Video Feedback Toy at a private event at Chabot Science Center. Went well. Is built on MeLib just like Metareal will be.

Spent day at Cafe Pergolesi and the downtown Santa Cruz Public Library, created fresh project, and added some MeLib features for specifying viewport and scissoring for each MeRenderworld step. End of day:

MeEd01

A content area, & an info area. Pretty special!

Thursday, 2015-03-26

Tragically, the current state of my libraries is still that bringing stuff up has a painful-feeling up-front cost. Spent a couple of hours bringing forward some text console code, and managing an array of items. The process of bringing up this editor will imply creating the World Model, too. Gets and sets, here we come!

But for now, just a shallow visual presence.

MeEd02

 

Friday, 2015-03-27

Short workday today. Worked on some serialization. I have a “parameterizable thing manager”, which can instantiate subclasses of MeThing by class name. Today, brought in some XML code from another project to easily save the complete state of some things to a file.

There are smart people who seem to hate XML, but I think it’s just fine.

Anyway, the code to render to XML is quite easy (if you have enough good stuff behind it):

    std::string toXml()
    {
        StringXmlWriter *sx = new StringXmlWriter();
        OmXmlWriter *w = new OmXmlWriter(sx);
        w->beginElement("level");
        for(auto thing : this->things)
        {
            MeThingKind *kind = thing->getKind();
            w->beginElement("thing");
            w->addAttribute("kind", kind->getName().c_str());
            w->addAttribute("name", thing->getName().c_str());
            for(auto pd : kind->getParameterDescriptions())
            {
                w->beginElement("parameter");
                w->addAttribute("name", pd.name.c_str());
                w->addAttribute("value", thing->getValueString(pd.name).c_str());
                w->endElement();
            }
            w->endElement();
        }
        
        w->endElement();
        
        std::string result = sx->s;
        
        delete w;
        delete sx;
        
        return result;
    }

And produces a nice little text:

 <level>
  <thing kind="Thing1" name="">
   <parameter name="x" value="1e+20"/>
   <parameter name="y" value="0"/>
   <parameter name="i" value="0"/>
   <parameter name="color" value="1 , 0 , 0.5 , 1"/>
  </thing>
  <thing kind="Thing2" name="item 1">
   <parameter name="x" value="111.11"/>
   <parameter name="y" value="0"/>
   <parameter name="j" value="0"/>
  </thing>
  <thing kind="Thing1" name="item 2">
   <parameter name="x" value="-1e-20"/>
   <parameter name="y" value="0"/>
   <parameter name="i" value="0"/>
   <parameter name="color" value="1 , 0 , 0.5 , 1"/>
  </thing>
  <thing kind="Thing1" name="item 3">
   <parameter name="x" value="0"/>
   <parameter name="y" value="0"/>
   <parameter name="i" value="0"/>
   <parameter name="color" value="23 , 42 , 86 , 99"/>
  </thing>
 </level>

Saturday, 2015-03-28

They don’t work for money any more, or to earn a place in heaven (which was a big motivating factor once upon a time, believe you me). They’re working and inventing because they like it. Linda, Larry, there’s no concept of weekends any more! — Spalding Gray in True Stories

This “no corporate employment” changes how time works. On the other hand, weekends are when my friends are around, so spent a few hours today helping one move. I always take the Truck Arranger role, for best stacking.

Implemented a first pass at XML loading for the editor. I have a C++ class built around expat2, so you can receive individual elements on two callbacks, like so:

    void beginElement(std::string elementName, std::map<std::string, std::string> &attributes)
    {
        if(elementName == "thing")
        {
            std::string thingKind = attributes["kind"];
            std::string thingName = attributes["name"];
            MeThing *thing = this->tm->newInstance(thingKind);
            thing->setName(thingName);
            this->things.push_back(thing);
            this->xmlCurrentThing = thing;
        }
        else if(elementName == "parameter")
        {
            if(this->xmlCurrentThing)
            {
                std::string name = attributes["name"];
                std::string value = attributes["value"];
                this->xmlCurrentThing->setValueString(name, value);
            }
        }
    }
    void endElement(std::string elementName)
    {
        if(elementName == "thing")
            this->xmlCurrentThing = NULL;
    }

Two points of magic to note. As alluded to earlier, this->tm is a MeThingManager. It has the ability to instantiate different implementations of MeThing by name. The other is thing->setValueString(), which does a best-effort interpretation of text for the particular type of the named parameter.
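A best-effort value parser might look like this for float-list parameters. This is just a guess at the spirit of setValueString, not the real MeThing code; it would accept value strings like the "1 , 0 , 0.5 , 1" color above.

```cpp
#include <sstream>
#include <string>
#include <vector>

// Split on commas, keep whatever parses as a float, shrug off stray
// whitespace and junk tokens. "Best effort" means no errors, just results.
std::vector<float> parseFloats(const std::string &value)
{
    std::vector<float> result;
    std::stringstream ss(value);
    std::string token;
    while (std::getline(ss, token, ','))
    {
        std::stringstream ts(token);
        float f;
        if (ts >> f)              // skip tokens that don't parse
            result.push_back(f);
    }
    return result;
}
```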

So far, these thing-lists are just abstract bags of parameters. Next up, defining some thing kinds with a geometrical, visual presence. Probably cubes.

Sunday, 2015-03-29

Development shortcut: All UI is performed with the existing keyboard space navigation (6-axis fly-around using esdf/ijkl), and a text console.

Library has only one kind of Object in it, a cube. The cube only has position. Can be added & moved around, and saved & restored to XML. Can tab to edit different Objects, and page up/down keys to select different parameters to edit.

MeEd03

 

It’s inscrutable if you’re not me (but I am, so that’s ok for now), and it’s just barely enough.

Next up, managing rooms & walls, as well; these may be handled differently than the live objects.

Next up after that, letting the level “run”.

Tuesday, 2015-03-31

Life of a full-stack developer. To implement mouse hit-testing, needed to add an alternate shader that operates on the same geometry. To keep the shaders in sync, had to implement #include for the shader file loader. So had to flesh out my OmFile:: utility file library. (With benefit of well-defined swap point for platform porting.)

editorHitTest
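Since GLSL has no native #include, such a pass boils down to textual substitution. Here is a sketch of the idea, with file I/O faked as a map so it stands alone; the real loader presumably goes through the OmFile:: utilities, and would also want a cycle guard.

```cpp
#include <map>
#include <sstream>
#include <string>

// Recursively replace lines of the form  #include "name"  with the named
// file's contents. Files are supplied as a map for this sketch.
std::string expandIncludes(const std::string &src,
                           const std::map<std::string, std::string> &files)
{
    std::stringstream in(src), out;
    std::string line;
    while (std::getline(in, line))
    {
        const std::string key = "#include \"";
        size_t p = line.find(key);
        if (p != std::string::npos)
        {
            size_t start = p + key.size();
            size_t end = line.find('"', start);
            std::string name = line.substr(start, end - start);
            auto it = files.find(name);
            if (it != files.end())
                out << expandIncludes(it->second, files) << "\n";  // recurse
        }
        else
            out << line << "\n";
    }
    return out.str();
}
```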

Tuesday, 2015-04-07

And I think I’ll wrap up this post now. In the last week, many low-glamor changes such as keyboard bindings for numeric editing individual params, snapping to multiples, and the like. Mouse click selection throughout. A Material Manager class, so instantiated objects can reference their required shaders by name.

End result: can instantiate several different types of objects, edit & delete them, save & recall to disk.

2015-04-07

Onwards! A long slow path.

Personal Computer News, Issue 086, 1984-11-10

ankh84_choose ankh84_smearyPractice

ANKH

This game is something else but exactly what I’m not sure. Put at its simplest (and this game is anything but simple), Ankh is a series of logic puzzles, some unbelievably tough.

The game environment is described as the ‘Metareal world of Ankh where logic works but doesn’t rule’. The idea is to find your way through 64 connecting rooms, each of which is different and holds a puzzle. Sometimes you can pass through a room’s doorway with no trouble but mostly you need to find a way to open up the exit before you can pass.

Each room must be prodded, poked and probed to yield up its secrets. Solving a room’s puzzle may bring forth a treasure, a tool or (groan) yet a further puzzle. You control a spherical object, your Mindprobe. It can be moved around and fire missiles. You gain power by performing certain actions (gaining access to a new room, for example) and lose it at a rate of knots when you collide with your environment. The game is over when you lose all power – or when you solve it.

At start-up you have the opportunity to experiment on an abstract set-up – a confidence-sapping experience in my case. You may also select the level of perplexity of the puzzles (from mindboggling to mind shattering).

A real headache inducer, Ankh is like nothing you’ve played before. There’s no middle way – you either love it madly or stick it straight under a steam hammer.


Issue: 086
Publication Date: November 10 1984
Subsection: GAMEPLAY
Article Name: Ankh
Author: Bob Chappell


Computer: Commodore 64
Price: £8.95 (Cassette)
Publisher: Beyond
Reviewer: Bob Chappell
Rating: 8/10

Engine: MeRenderWorld

And now, a few notes about the Metareal Engine architecture. As mentioned earlier, of course it’s silly to write my own, let’s move on.

The library presently consists of two major components: Rendering and Volume Management. The Rendering side deals with lists of triangles, groupings of parts, geometry lists, and shader graphs. The Volume Management side deals with detecting intersections of volumes, and notifying listeners. Physics is built on top of volume management, as a special case of responding to intersections.

In this post, I’ll talk about one aspect of Rendering.

MeRenderWorld

The class MeRenderWorld is presently the highest level container. An application may use several, certainly, but this is the top container on the library side.

An MeRenderWorld manages:

  • Lists of Parts (each part is a geometry list)
  • Cached geometry lists, grouping parts by material, so each material uses a single glDraw
  • References to materials (shader programs, for an OpenGL implementation)
  • Backing buffers for rendering post processing
  • List of render operations

The MeRenderWorld itself knows nothing about OpenGL; rather it deals with MeIMaterials and MeIPartRenderers and MeITextures and so on. For now there are only two implementations of each, one for OpenGL and one for FakeFL, which just records actions suitable for unit testing the library.

Often, one wants to describe a process by means of a document, perhaps in XML or similar. Describing the graph of draws and post-processing for a final scene frame is a perfect example of this. However, I have a strict policy of implementing code first, document second. Here I’ll show the code mechanism for building up a render graph.

The render graph is described as a strict, linear sequence of actions. It has to become one anyway (at least until glNext), so that’s the primitive. Here is an example setup:

    MeRenderWorld *rw = this->renderWorld;
    rw->addStepClear("", 0x123456);
    rw->addStepDraw(this->materialSky, "");
    rw->addStepDraw(this->materialOpaque, "");

Here we see that each frame will be created with three actions: a clear-to-color, and two material draws. Today this is, in fact, exactly two glDraw() calls, but that could change.

The empty strings are the names of frame buffers. The unnamed buffer, “”, is the default output buffer. This might be the screen or window, or be another buffer, depending later on the argument to rw->draw().

Here’s a more complex render sequence:

    rw->addStepClear("b0", this->bgColor);
    rw->addStepDraw(this->materialSky, "b0");
    rw->addStepDraw(this->material, "b0");
    addCopyingStep(rw, "b0", "b1");

    rw->addStepSetTextureColor("b1", this->materialTransparent, "texture1");
    rw->addStepDraw(this->materialTransparent, "b0");

    addCopyingStep(rw, "b0", "");

In this sequence, the background is cleared and then drawn with a skybox. Then, the opaque parts of the geometry are drawn onto a buffer named “b0”. This buffer is created on demand by MeRenderWorld; you just have to name it, and it exists.

That first render is provided as a texture argument to the transparent material, in a uniform named “texture1”, which is then drawn onto “b0” also.

Lastly, “b0” is copied to the output. (addCopyingStep() is a macro which instantiates a screen-sized rectangle textured with one buffer, drawing to the other).

Here’s a frame of the output:

translucent

And there’s a peek into the Metareal Engine’s rendering component.

Looking Forward

2015-03-02_hitTesting

As I’m coding, managing lists of millions of vertices, swapping buffers, keeping track of state and building optimization caches (in time and space, as we say), it is wonderfully overwhelming. There is so very much more to do. Collada imports, managing interactive object types, state saving and recalling, and many fine details that can’t yet be revealed.

To say nothing of the joy of actually crafting the puzzles, and the graphics and the music.

I am reminded of an excellent SF book by Greg Egan called “Permutation City”, which takes place mostly in a simulated world. (The world has become detached from reality, because the computer that generated it was turned off, so they are free-floating.)

One of the people inside it has found a way to cope with immortality; from time to time he is reborn with a new obsession. After he has cataloged the last of the several billion species of butterflies and moths in some imaginary world, he wonders who he will become next.

The room goes fuzzy, and he is disoriented… he realizes he is in a workshop with acres of bins of wooden dowels… all waiting to be lathed into table and chair legs.

He is in heaven.

Greetings!

Hello, to the future! And now, my contentful first post. These won’t be public for a while, so if you are reading this, it’s because you scrolled back in time.

A normal person writing a game, here in the year 2015, would get a nice framework & IDE, like Unity, and get to work. I read others’ dev blogs and they say things like, “Wow, wasted 3 weeks on a game concept before I decided it wasn’t that interesting.”

Not me. I fire up Xcode and start typing C++ code talking to OpenGL. Why? Indeed. The shortest answer is, because I like writing code. Also, 3 weeks to me is inconsequential. I’ve been assembling my libraries for close to six months now. And they are not bad.

Also, I know exactly what the game concept will be, and have 100% confidence in its viability. This is somewhat mitigated by the fact that I don’t care if, ultimately, anyone else likes it.

And also, having not written any video games in 30 years — literally, a lifetime ago — I was curious how it’s done. I wanted to truly feel the shaders, the collision testing, the lists of millions of triangles. All that.

Also, if you’re reading this, it’s because I’ve chosen to make it public. Hello, to the future!