So, last post I mentioned that I had finished the particle emitter system in a prototype environment (literally a blank 2D canvas), and that the next job was to take those particle emitters and drop them into the game environment. I had initially thought that I would just overlay the particles on top of the game scene, and that it wouldn’t be all that bad if you could see particles through walls; in-game walls aren’t taller than people anyway, so it should be okay, surely.
But it wasn’t okay, it looked terrible.
In an “isometric” scene, zSorting (the order things are rendered in so that behind/in-front looks right) is done from the top of the screen to the bottom in linear order. In the engine at the moment, only one object (a wall tile, a person, a door) can be on one tile at a time, so the engine just renders whatever object is on whatever tile, starting at the top of the screen and working down to the bottom. However, there can be 2000 particles on one tile, and incorporating them into the zSorting populator and renderer took some fairly heavy changes to the way it works.
Every particle has to remember the y-coordinate of the tile of the emitter that it came from, and then gets inserted into a massive bucket array in y order, with up to 2000 other particles sharing the same y-coordinate. So there’s an array called ParticleSorter(200,2000): the first dimension is the y-coordinate, and the second is the particular particle on that row. Once all the particles are added to that array, it’s read back during the zRenderer routine.
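The bucketing scheme above can be sketched in a few lines of Python. This is only an illustration of the idea, not the game's actual code: the names `Particle`, `build_particle_sorter`, `z_render`, and the capacities `MAX_Y`/`MAX_PER_ROW` are all assumptions modelled on the `ParticleSorter(200,2000)` array described in the post.

```python
MAX_Y = 200          # number of tile rows on screen (assumed, from ParticleSorter's first dimension)
MAX_PER_ROW = 2000   # particle capacity per row (assumed, from the second dimension)

class Particle:
    def __init__(self, x, y, emitter_tile_y):
        self.x, self.y = x, y
        self.emitter_tile_y = emitter_tile_y  # remembered from the emitter's tile

def build_particle_sorter(particles):
    """Bucket every particle by the y-coordinate of its emitter's tile."""
    sorter = [[] for _ in range(MAX_Y)]
    for p in particles:
        row = sorter[p.emitter_tile_y]
        if len(row) < MAX_PER_ROW:  # drop overflow rather than overrun the bucket
            row.append(p)
    return sorter

def z_render(sorter, draw_tile, draw_particle, tiles_by_row):
    """Render top-to-bottom: each row's tile objects first, then its particles."""
    for y in range(MAX_Y):
        for tile in tiles_by_row.get(y, []):
            draw_tile(tile)
        for p in sorter[y]:
            draw_particle(p)
```

Because the outer loop runs from the top row down, anything on a lower (nearer) row is drawn later and therefore on top, which is exactly the painter's-algorithm ordering the post describes.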
It’s not perfect yet; particles need to be given a concept of “floor” for them to land on, but the rendering is working and it seems to be fairly pretty so far. Here’s a video where you can see a fairly early implementation of breakable doors with particle effects.
I’ve been working on lots of little mathematical bits of graphics for UI stuff and also simple in-game animations. You’ve probably seen the “Spawn In” animation where people seem to drop out of the sky, which is all animated procedurally, but I’ve now added a nice little line-graph UI element for a new game system and I’ve just finished the particle emitter system.
I uploaded a video showing the particle emitter system being built from scratch if you’re the sort of person who likes to watch that kind of thing. Also, an irrelevant animated gif.
A productive weekend goes by, and lots of new graphics and some actual engine work get done. Lately I’ve been splitting almost 50% of my time between working on “the game” and talking to people about it. Some days I write emails for hours, or introduce myself to people on Twitter while looking for other game developers or people in the industry. Getting to know other people doing the same things as me, or people who are generally interested in what I’m doing, is a very rewarding experience.
Work keeps on chugging along. In the meantime, here are three pictures of some new stuff.
“When you do things right, people won’t be sure you’ve done anything at all.” – God, Futurama
For a game to “feel right”, it’s important that the player can’t quite understand why the gameplay is as satisfying as it is, even when they’re losing. When you deliver a “fail” or a “death” to a player, it’s important that they don’t feel like they’re being punished for the game’s mistakes; it needs to feel like their own mistake. If a player is discovered by an enemy, you want them to think “Whoops, I messed that up” rather than “THAT’S BULLSHIT!”. When I play-test a small area, I have to think about what feels like it should work from a gameplay perspective. This led me to working heavily on the way in-game glass works.
The line-of-sight system in the game is fairly accurate; it casts per-pixel rays from one pixel to another and fills the isometric tiles along the way with creature vision. The problem is that it started to feel unfair: it seemed obvious that a player should be able to hide next to a window frame while a guard looked out the window, but when the creature casts a cone of vision through the window, it can easily see someone standing right beside it.
It is realistic, but it doesn’t really feel “fair” when you’re standing in what seems, by “video game logic”, to be a good hiding spot. I’ve now spent hours building a “blind-spot” system to make things feel much fairer in these situations. It treats glass windows like lenses that slightly bend a creature’s vision, creating blind spots at the edges of the window frame.
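One way the lens idea could work is sketched below. This is a hedged guess at the technique, not the game's implementation: the bend angle, the window representation, and the function name are all assumptions. The idea is that a vision ray crossing a window gets nudged toward the window's centre, and the nudge grows toward the frame edges, so small shadows of "unseen" space appear just past the frame on the far side.

```python
import math

BEND_RADIANS = math.radians(8)  # maximum inward bend at the frame edge (assumed value)

def bend_ray_through_window(ray_angle, hit_x, window_center_x, window_half_width):
    """Bend a vision ray that crosses a window, pulling it toward the window's
    centre. Rays near a frame edge bend the most, carving out blind spots
    just past the frame on the far side of the glass."""
    # -1.0 at the left frame edge, 0.0 at the centre, +1.0 at the right edge
    offset = (hit_x - window_center_x) / window_half_width
    offset = max(-1.0, min(1.0, offset))
    # The bend opposes the offset, so edge rays converge inward like light
    # passing through a lens; a ray through the centre is untouched.
    return ray_angle - offset * BEND_RADIANS
```

A centre ray passes straight through, while rays hitting the glass near either edge are deflected inward, which is why a player pressed against the frame falls outside the bent cone.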
<– Here is what happens when you try to sneak by some windows without Lens/Light Bending implemented.
And here is what happens with Bending. –>
This is something that the player will probably never notice exists, but when they run up next to a window to hide from a security camera on the other side, they’ll be happy not to be discovered because that’s how things should work in video games.
I’ve been making lots of improvements to the game engine over the past couple of weeks and I just wanted to shoot out a few videos showing some of the new features that have been implemented.
The first is the new, smooth vision cones. Now, when a creature wants to look somewhere, it can’t instantly snap to the new direction; it has to turn its head smoothly. This is fantastic for making sure creatures notice other creatures standing next to them while they’re changing direction.
This is the new class of unit, the Sentry. They’re exactly the same as the “Enforcer” class (as in: they have guns, they run at stuff and shoot it), but they can be linked to a trigger and HAYWIRE!ed. If someone HAYWIRE!s one, it changes its “Team” to the hacker’s team and immediately fights on their side. The one in this video has 9999 health, so he easily wipes out the entire level.
And finally, this is the new “Spawn-in” mechanic and graphic. I can now add creatures mid-campaign with a little “drop pod” animation. You’ll also notice the ground shake a little when they land, thanks to the new Camera.VibrationStrength setting. (Also, the game currently crashes if more than 20 creatures try to pathfind at the same time.)
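A setting like Camera.VibrationStrength is typically a decaying shake amount that drives a random screen offset each frame. The sketch below shows that common pattern; the class shape, decay factor, and offset math are all assumptions for illustration, not the game's actual camera code.

```python
import random

class Camera:
    def __init__(self):
        self.vibration_strength = 0.0  # current maximum shake offset, in pixels
        self.offset_x = 0.0
        self.offset_y = 0.0

    def shake(self, strength):
        """Kick the camera, e.g. when a drop pod slams into the ground."""
        self.vibration_strength = max(self.vibration_strength, strength)

    def update(self, decay=0.9):
        """Per-frame: jitter the view by the current strength, then decay it."""
        s = self.vibration_strength
        self.offset_x = random.uniform(-s, s)
        self.offset_y = random.uniform(-s, s)
        self.vibration_strength = s * decay
        if self.vibration_strength < 0.01:  # snap tiny residual shake to zero
            self.vibration_strength = 0.0
```

The exponential decay gives the sharp-impact-then-settle feel of a landing: one big jolt on the first frame, rapidly shrinking wobble afterwards, and a clean stop instead of an endless micro-jitter.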