Pioneer DevBlog
Wednesday, 29 January 2014
Hey good lookings, what's cookings?
Meshes. Meshes are cooking. Through the power of my new meshcooker app. The meshcooker implements my Application Interface and is instantiated instead of the game via command-line switches, to cook assets into a format that's faster to load.
Here is the backstory: All of the new mesh import work (for certain values of "new") operates through a set of iterators that go over each vertex and triangle to smooth and flip normals to make the mesh look as good as possible. There are some meshes it can't process as it operates on shared vertices and shared edges, but for most convex hulls it works just fine. And thankfully most injection moulded plastic forms a convex hull...
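For the curious, the smoothing half of that post-process boils down to something like the sketch below: accumulate each face normal onto its shared vertices, then renormalise. The types and function names here are my own throwaway ones, not the actual importer code.

// A hedged sketch of normal smoothing over shared vertices; Vec3, Triangle
// and the function names are illustrative, not the actual importer types.
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 Add( const Vec3 & a, const Vec3 & b ) { return { a.x + b.x, a.y + b.y, a.z + b.z }; }
static Vec3 Sub( const Vec3 & a, const Vec3 & b ) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static Vec3 Cross( const Vec3 & a, const Vec3 & b )
{
    return { a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x };
}
static Vec3 Normalise( const Vec3 & v )
{
    float len = std::sqrt( v.x * v.x + v.y * v.y + v.z * v.z );
    return ( len > 0.0f ) ? Vec3{ v.x / len, v.y / len, v.z / len } : v;
}

struct Triangle { unsigned i0, i1, i2; };

// Accumulate each face normal onto its three shared vertices, then renormalise.
void SmoothNormals( const std::vector<Vec3> & positions,
                    const std::vector<Triangle> & triangles,
                    std::vector<Vec3> & normals )
{
    normals.assign( positions.size(), Vec3{ 0.0f, 0.0f, 0.0f } );
    for ( const Triangle & tri : triangles )
    {
        Vec3 faceNormal = Cross( Sub( positions[tri.i1], positions[tri.i0] ),
                                 Sub( positions[tri.i2], positions[tri.i0] ) );
        normals[tri.i0] = Add( normals[tri.i0], faceNormal );
        normals[tri.i1] = Add( normals[tri.i1], faceNormal );
        normals[tri.i2] = Add( normals[tri.i2], faceNormal );
    }
    for ( Vec3 & n : normals )
        n = Normalise( n );
}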
The upside is that most LDR mesh files are loadable, which increases the available brickset from the basic bricks, plates and roof bricks I have to a wide array of brick elements. The unfortunate consequence is that each mesh is post-processed every time it loads, which leads to ~very looooong~ startup times.
So I've taken the process offline and written a mesh cooker that opens LDR and OBJ mesh files, builds a triangle set and post-processes for smoothing. Finally it converts them to a renderable vertex buffer format and serialises to disk as a bytestream.
The mesh resource bytestream can be loaded directly into a vertex buffer and startup times are now faster than ever!
The mesh resource bytestream also includes data sections for resource ID and version numbering so the mesh cooker can be automated to refresh when new or updated content is added.
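To make that concrete, the header of such a bytestream might look something like this - the fields and layout below are my guess at the shape, not the actual format:

// A hedged sketch of a cooked-mesh header; the fields are illustrative,
// not the real bytestream format.
#include <cstddef>
#include <cstdint>
#include <cstdio>

struct CookedMeshHeader
{
    std::uint32_t MagicNumber;    // marks the file as a cooked mesh
    std::uint32_t FormatVersion;  // bumped whenever the cooker output changes
    std::uint32_t ResourceId;     // matched against source content for refreshes
    std::uint32_t VertexCount;
    std::uint32_t IndexCount;
};

// Write the header followed by the raw vertex and index data, so the loader
// can read the payload straight into a vertex buffer.
bool WriteCookedMesh( const char * path, const CookedMeshHeader & header,
                      const void * vertexData, std::size_t vertexBytes,
                      const void * indexData, std::size_t indexBytes )
{
    std::FILE * file = std::fopen( path, "wb" );
    if ( !file )
        return false;
    bool ok = std::fwrite( &header, sizeof( header ), 1, file ) == 1
           && std::fwrite( vertexData, 1, vertexBytes, file ) == vertexBytes
           && std::fwrite( indexData, 1, indexBytes, file ) == indexBytes;
    std::fclose( file );
    return ok;
}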
I don't have much code to show-and-tell, nor any new screenies, at least not yet, but it's only a few steps away from importing a whole bunch-a-bricks from online LDR resources. The code makes heavy use of the existing Mesh Import pipeline, and just derives a new Import and Export pair from the current MeshImport and MeshExport interfaces.
Tests are thin on the ground, but I'll add a few to ensure that the Import/Export implementations are symmetrical and go from there.
More news as it breaks. Recreational Programming is a lot of fun - if you have any questions or comments about game development or the Pioneer project, let me know!
Monday, 12 August 2013
Triptych
The three camera controllers and input paths are working. Here is a composite shot of a scene that I took with all three cameras.
- On the left, my build camera, which you can use to spin the scene and move a 2D cursor to add and remove bricks.
- In the middle, a "minifig's-eye-view". This is a first-person view from the minifig's perspective showing what he sees. You are free to walk around the world as a minifig.
- On the right, my third-person "adventure" controller, where you can see your character and explore the environment. This out-of-body experience lets you control the minifig like a traditional game character.
I've bound F1, F2, F3 to switch between them while playing which is good enough for now.
There is still a little more work before a public build is ready. The new cameras, camera controllers and input controllers are all wired together using manual dependency injection, and while the scene is still a directed acyclic graph, I'm worried it would be difficult to reproduce in a clean factory, so I think a code review is in order.
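For reference, the manual wiring is the plain constructor-injection kind of thing sketched below; the struct names are stand-ins, not the real camera and input types.

// A hedged sketch of the manual wiring; these structs are stand-ins for the
// real camera, camera controller and input controller types.
#include <memory>
#include <utility>

struct Camera {};                              // a viewpoint into the scene

struct CameraController                        // drives a camera each frame
{
    explicit CameraController( std::shared_ptr<Camera> camera )
        : m_Camera( std::move( camera ) ) {}
    std::shared_ptr<Camera> m_Camera;
};

struct InputController                         // maps device input onto a controller
{
    explicit InputController( std::shared_ptr<CameraController> controller )
        : m_Controller( std::move( controller ) ) {}
    std::shared_ptr<CameraController> m_Controller;
};

// Each dependency is constructed once and handed down explicitly, so the
// object graph stays a directed acyclic graph with no hidden lookups.
std::shared_ptr<InputController> BuildCameraStack()
{
    auto camera     = std::make_shared<Camera>();
    auto controller = std::make_shared<CameraController>( camera );
    return std::make_shared<InputController>( controller );
}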
The build palette is pretty rudimentary, with only the half-dozen bricks I've been testing with, but it should be trivial to include most of the basic bricks, plates, tiles etc... that I had in the prototype, and the new LDR importer should mean adding custom bricks is reasonably quick and painless.
I need a mesh cooker - this has been on the backlog for a long time. I'm currently importing OBJ and LDR meshes on startup, but a really quick mesh serializer would be much better. Now that I'm doing small iterations, the import time is really starting to annoy me.
Other than that, there are a few misc tasks to address, including some visual glitches that are going to bother me if I don't take time out for them: the difference in shadow depth precision between nVidia and ATI hardware (yes, that's still broken) and adding more config options for the render quality.
Comments and questions about Pioneer, programming or game development are welcome - I'm thinking of doing a Q&A blog post to answer a bunch in one go.
So until next time, keep building!
Wednesday, 7 August 2013
Brick by Brick
Now that the third-person controller works, I'm integrating it with the earlier first-person controller and the free camera mode from my Mesh Tool/Import application. Along with the improved build UI, the game will soon be back together and I'll be pioneering in a new tiny world to run and jump around.
There are still a couple of bugs, and rendering issues to resolve, but it totally feels like I'm on the road to a new public release now.
Code-wise, there isn't much to the new camera/camera controller work. The input and camera stacks are now both trees, with the Game Input responsible for switching different input policies in and out, and for switching different cameras and camera controllers in and out at the same time.
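In rough outline - and this is a sketch with made-up names, not the actual Game Input code - the switching looks something like this:

// A hedged sketch of camera/controller switching; the enum and class names
// are illustrative, not the real Game Input code.
#include <map>
#include <memory>

enum class CameraMode { Build, FirstPerson, ThirdPerson };

struct CameraController { virtual ~CameraController() {} };

class GameInput
{
public:
    void RegisterController( CameraMode mode, std::shared_ptr<CameraController> controller )
    {
        m_Controllers[mode] = controller;
    }

    // Swap the active camera controller (and with it the input policy)
    // when the player changes mode.
    void SwitchTo( CameraMode mode )
    {
        auto found = m_Controllers.find( mode );
        if ( found != m_Controllers.end() )
            m_Active = found->second;
    }

private:
    std::map<CameraMode, std::shared_ptr<CameraController>> m_Controllers;
    std::shared_ptr<CameraController> m_Active;
};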
There is a slightly weird C++ workaround in my camera hierarchy because of the reference counted pointers I am using, but I'll either fix it or declare "No Fault Found" and move on.
My reference counted pointers are built on incomplete types, and point toward concrete implementations. However I ended up with a public method on a camera subclass that doesn't belong in the interface. Conundrum.
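Stripped of detail, the shape of the problem is roughly this (class and method names are mine, not the real camera code):

// A hedged sketch of the problem; the names are mine, not the real camera code.
struct Camera
{
    virtual ~Camera() {}
    virtual void Update() = 0;
};

struct OrbitCamera : public Camera
{
    virtual void Update() {}

    // Doesn't belong on the Camera interface, but code that only holds a
    // reference-counted pointer to Camera can't reach it without a downcast.
    void SetOrbitTarget( float /*x*/, float /*y*/, float /*z*/ ) {}
};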
I'll revisit the problem before long, and the workaround doesn't create any cyclic dependencies or garbage collection problems, but it's no longer code I can be proud of, which is a non-trivial problem.
Usually I can sleep on problems like this, and it all comes out in the wash during review. Which seems like a reasonable plan this time too.
Until next time, think inside the box!
Sunday, 4 August 2013
Let there be life!
It's one small step for man, one giant leap for a minifig! Since all of the heavy lifting was done, I thought I'd take a break from technical tasks and do something visual...
...It's pretty satisfying to run a minifig around the environment.
Once the mesh was imported and post processed, it needed a skeleton, rigging and animating along with a third person walk/run controller.
As these are reasonably trivial there isn't much code detail to show off. The skeleton is a hierarchy of about a dozen physics objects. My physics evaluation runs in two passes. The first pass handles integration and the second resolves hierarchy - so it's ideal for a simple skeleton with forward kinematics.
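For anyone curious what those two passes amount to, here is a minimal sketch with made-up types - not the actual physics code - assuming parents appear before their children in the array:

// A hedged sketch of the two passes; PhysicsObject and its fields are
// illustrative, not the actual physics code.
#include <vector>

struct Vec3 { float x, y, z; };

struct PhysicsObject
{
    PhysicsObject * Parent;   // null for the skeleton root
    Vec3 LocalOffset;         // fixed offset from the parent bone
    Vec3 Velocity;
    Vec3 Position;
};

void Evaluate( std::vector<PhysicsObject> & objects, float dt )
{
    // Pass 1: integrate every object independently.
    for ( PhysicsObject & obj : objects )
    {
        obj.Position.x += obj.Velocity.x * dt;
        obj.Position.y += obj.Velocity.y * dt;
        obj.Position.z += obj.Velocity.z * dt;
    }

    // Pass 2: resolve the hierarchy so children follow their parents -
    // all a simple forward-kinematics skeleton needs.
    for ( PhysicsObject & obj : objects )
    {
        if ( obj.Parent )
        {
            obj.Position.x = obj.Parent->Position.x + obj.LocalOffset.x;
            obj.Position.y = obj.Parent->Position.y + obj.LocalOffset.y;
            obj.Position.z = obj.Parent->Position.z + obj.LocalOffset.z;
        }
    }
}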
The skeleton is rigged in the minifig factory to a cheap one-bone-per-vertex system, which is enough for prototype although I'm itching to make it look nicer.
The animation is a procedural walk/run cycle, and I've stubbed out a keyframe animation player and anim blend tree that I'll fill in later to do some more fancy anims including idle loops, blending between gaits and partial body anims for actions.
The walk/run controller is a modified version of the orbit camera from my Mesh Tool application with WASD movement pushed onto the skeleton root motion. There is no footstep matching so there is some skating, especially on turns. Again, this is good enough for prototype and it'll land on the backlog when it needs to get resolved.
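Roughly, pushing WASD onto the root motion amounts to something like the sketch below; the types and parameter names are mine, not the actual controller code.

// A hedged sketch of the WASD-to-root-motion step.
#include <cmath>

struct Vec3 { float x, y, z; };

struct MoveInput { bool Forward, Back, Left, Right; };

// Turn WASD state into a heading-relative displacement for the skeleton root.
// There is no footstep matching, so feet can skate, especially on turns.
Vec3 RootMotion( const MoveInput & input, float heading, float speed, float dt )
{
    float forward = ( input.Forward ? 1.0f : 0.0f ) - ( input.Back ? 1.0f : 0.0f );
    float strafe  = ( input.Right ? 1.0f : 0.0f ) - ( input.Left ? 1.0f : 0.0f );

    Vec3 delta;
    delta.x = ( std::sin( heading ) * forward + std::cos( heading ) * strafe ) * speed * dt;
    delta.y = 0.0f;
    delta.z = ( std::cos( heading ) * forward - std::sin( heading ) * strafe ) * speed * dt;
    return delta;
}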
The minifig is parameterised by colour and body element but without painted torsos or heads. I'm probably going to play with the root motion, gravity integrator and collision for a while, and then look at minifig customization more when I roll out a multiplayer server.
Until next time, stay plastic!
Wednesday, 24 July 2013
Minifig Factory
Today's progress was a minifig factory. It diverged from my common factory pattern of having a pre-populated database of creatable objects that are requested by ID. Instead, a MinifigParameters object contains a description of each element and which bone it attaches to.
The MinifigParameters object is passed to the factory, which returns a fully constructed and configured minifig including a render component that can be added to the scene.
For ease of use the MinifigFactory creates a Minifig skeleton that describes the physics hierarchy. It creates all of the mesh instances with a MeshRenderComponent and ties everything together based on the MinifigParameters passed in.
Lastly, and as a temporary measure, the MinifigFactory also constructs an example MinifigParameters object that I can use. I'll probably end up with a MinifigParametersDatabase that can retrieve parameters objects that describe specific Minifigs by unique ID, or wrap that process in the MinifigFactory so you can request minifigs by unique ID or by passing in a parameters object.
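For a rough idea of the shape of it - the fields and the usage in the comment are my guess, not the actual classes - the parameters object is little more than a list of element descriptions:

// A hedged sketch of the parameters object; illustrative only.
#include <string>
#include <vector>

struct MinifigElement
{
    std::string MeshName;   // e.g. a torso, head or leg element
    std::string BoneName;   // which bone of the skeleton it attaches to
    unsigned    Colour;     // simple colour parameter for now
};

struct MinifigParameters
{
    std::vector<MinifigElement> Elements;
};

// Usage would look something like:
//   MinifigParameters params = MinifigFactory::ExampleParameters();
//   Minifig * fig = factory.Create( params );
//   scene.Add( fig->GetRenderComponent() );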
Solving the minifig construction task was a good self-contained process. The next step is probably adding the WalkRun controller with camera and then the animation controller after that.
Until next time, stay square!
Sunday, 21 July 2013
This kinda worked
Showing off the detail, this kinda came together. There are still a few imperfections in the geometry that I'm going to resolve manually at a later date but what I've got is suitable for now.
It's 1AM and the mesh is in a usable state for animation, which is a good place to finish the weekend's work.
The next task is to add a Walk/Run controller to the Input and Camera stacks, and tie them to the minifig locomotion. Once it's moving in-game, it's time to write an animation library. I think the first pass will be FK bones with a single bone-per-vertex, CPU skinned.
This will be my reference animator and then I'll reproduce it with GPU skinning and switch up if the video hardware is capable.
I'm considering rigging for two-bones-per-vertex skinning and pre-multiplying the inverse T matrix during mesh cooking, but I think that's going too far for now. I'll only need one bone per vertex for a long time yet and will still get the animation fidelity I'm after.
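For the record, one-bone-per-vertex CPU skinning is about as simple as skinning gets; a minimal sketch with made-up types (not the planned animation library) looks like this:

// A hedged sketch of one-bone-per-vertex CPU skinning; Mat4 and the field
// names are illustrative.
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

struct Mat4
{
    float m[16];   // column-major 4x4 transform
    Vec3 TransformPoint( const Vec3 & p ) const
    {
        return { m[0] * p.x + m[4] * p.y + m[8]  * p.z + m[12],
                 m[1] * p.x + m[5] * p.y + m[9]  * p.z + m[13],
                 m[2] * p.x + m[6] * p.y + m[10] * p.z + m[14] };
    }
};

struct SkinnedVertex
{
    Vec3     BindPosition;   // position in the bind pose
    unsigned BoneIndex;      // a single bone per vertex
};

// Each frame, boneTransforms[i] is that bone's current pose multiplied by its
// inverse bind transform (the "inverse T" that could be pre-multiplied during
// cooking), so every vertex simply follows its one bone.
void SkinVertices( const std::vector<SkinnedVertex> & vertices,
                   const std::vector<Mat4> & boneTransforms,
                   std::vector<Vec3> & skinnedPositions )
{
    skinnedPositions.resize( vertices.size() );
    for ( std::size_t i = 0; i < vertices.size(); ++i )
    {
        const SkinnedVertex & v = vertices[i];
        skinnedPositions[i] = boneTransforms[v.BoneIndex].TransformPoint( v.BindPosition );
    }
}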
With beauty shots like this, it becomes apparent how poor the specular highlights are on my reference shader, so a new plastic shader is in the immediate future too.
Submesh breakdown
Because I had a couple of minutes, I rewrote the LDR importer to divide the mesh into discrete submesh components, and it now post-processes each component for normals and smoothing before recombining them.
Sections with a shared triangle edge are treated as the same submesh, and I just walk all of the triangle edges to build each one.
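That edge walk is essentially a flood fill over triangles; a minimal sketch with throwaway types (not the actual importer code) is below.

// A hedged sketch of the edge walk as a flood fill.
// Returns a submesh id per triangle.
#include <cstddef>
#include <map>
#include <utility>
#include <vector>

struct Triangle { unsigned i0, i1, i2; };

// Undirected edge key: smaller vertex index first so shared edges match.
static std::pair<unsigned, unsigned> EdgeKey( unsigned a, unsigned b )
{
    return ( a < b ) ? std::make_pair( a, b ) : std::make_pair( b, a );
}

std::vector<int> BuildSubmeshIds( const std::vector<Triangle> & triangles )
{
    // Map each edge to the triangles that use it.
    std::map<std::pair<unsigned, unsigned>, std::vector<std::size_t>> edgeToTriangles;
    for ( std::size_t t = 0; t < triangles.size(); ++t )
    {
        const Triangle & tri = triangles[t];
        edgeToTriangles[EdgeKey( tri.i0, tri.i1 )].push_back( t );
        edgeToTriangles[EdgeKey( tri.i1, tri.i2 )].push_back( t );
        edgeToTriangles[EdgeKey( tri.i2, tri.i0 )].push_back( t );
    }

    // Flood-fill: triangles reachable from one another through shared edges
    // end up in the same submesh.
    std::vector<int> submeshId( triangles.size(), -1 );
    int nextId = 0;
    for ( std::size_t seed = 0; seed < triangles.size(); ++seed )
    {
        if ( submeshId[seed] != -1 )
            continue;
        std::vector<std::size_t> stack( 1, seed );
        submeshId[seed] = nextId;
        while ( !stack.empty() )
        {
            std::size_t t = stack.back();
            stack.pop_back();
            const Triangle & tri = triangles[t];
            const std::pair<unsigned, unsigned> edges[3] =
                { EdgeKey( tri.i0, tri.i1 ), EdgeKey( tri.i1, tri.i2 ), EdgeKey( tri.i2, tri.i0 ) };
            for ( const std::pair<unsigned, unsigned> & edge : edges )
                for ( std::size_t neighbour : edgeToTriangles[edge] )
                    if ( submeshId[neighbour] == -1 )
                    {
                        submeshId[neighbour] = nextId;
                        stack.push_back( neighbour );
                    }
        }
        ++nextId;
    }
    return submeshId;
}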
Here is a composition of the AirTanks(3838) and Hips(3815) meshes with each submesh colourised. There are a couple of tears in the mesh that I have yet to get to the bottom of, but in general it proves the concept and suggests the process is sound.
To achieve this I created a new MeshToolApp. My Application base class looks like this:
class Application
{
public: virtual int Execute()=0;
public: Application() {}
public: virtual ~Application() {}
};
So the main method looks something like this:
int main( int argc, char ** argv )
{
    Application * MyApp = new MeshToolApplication();
    int Result = 0;
    if ( MyApp )
    {
        Result = MyApp->Execute();
    }
    delete MyApp;   // clean up the application before exit
    return Result;
}
There is a little more to it in practice as I first parse the command line parameters to determine which application to execute, and there is some other init/cleanup, but this demonstrates the pattern very well.
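The selection itself is just a check of the arguments before construction. Here is a sketch building on the Application class above - GameApplication and the "-meshtool" switch are my inventions for illustration, not the real flags:

// A hedged sketch of choosing the application from the command line.
#include <cstring>

Application * CreateApplication( int argc, char ** argv )
{
    for ( int i = 1; i < argc; ++i )
    {
        if ( std::strcmp( argv[i], "-meshtool" ) == 0 )
            return new MeshToolApplication();   // run the mesh tool instead of the game
    }
    return new GameApplication();               // default: launch the game itself
}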
The mesh tool app instantiates the same engine, renderer and input code as the main application but supplies an orbiting camera so I can spin around the model to view it. It has simple import/load at the moment - once I'm happy enough with the post-process I'll add a GUI to load, convert, view and save meshes. The last step will be creating a MeshConvertApplication that scans the content folder for LDR meshes and serializes out a converted mesh as an automated process. It might render a screen capture of each mesh and save a bitmap for each one too.