Thursday, 1 December 2016

Attaching armour to an armature rig in Blender

We're not entirely sure this is the correct way to go about it, but here's how we managed to create our characters in Blender and still maintain an object hierarchy after importing into Unity, allowing us to switch armour elements on and off at runtime.

Firstly, everything we wanted to be able to turn on and off is created in Blender as a separate model/mesh. We did this by creating the objects in one instance of Blender, then copying and pasting them into the model containing our character.

This kept the UV mapping and so on, but allowed us to position armour over our soon-to-be-exported-to-Unity character.


After scaling and rotating the objects into position, we then parented each piece of armour to the rig, using automatic weights (we're using the Rigify meta-rig in this example rather than the generated rig, but the principle applies to any rig really). Sometimes, after parenting, the piece needs tweaking to get the rotation/location exactly right.

Then we select our single armour piece and enter weight-paint mode. The automatic weights mean that each piece of armour inherits all of the vertex groups from the model. We simply find the one that best matches the location of our armour, and delete all the others for the selected piece of armour.

Then, with just a single vertex group displaying influence over the armour piece, we paint it fully red (a weight of 1.0). In effect, a single bone (or vertex group controlled by one or more bones) moves the armour - this way it can't bend or deform, since no other bones fight for control over it.



To check everything's working, we saved the file and copied it into our Unity Assets folder, and dragged an instance of the character into the scene.


Not only does the armour conform to the model/animation, but we still retain the entire object hierarchy, so that we can switch armour elements on and off for different character types/player positions in our game.
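Switching pieces on and off from code is then just a matter of enabling or disabling the relevant child objects. Here's a minimal sketch of the idea - the piece names are hypothetical, and the recursive search is only there because armour pieces may end up nested under rig bones in the imported hierarchy:

using UnityEngine;

public class ArmourToggler : MonoBehaviour {

   // Show or hide a named armour piece anywhere under this character.
   // Disabled objects are skipped entirely by the renderer.
   public void SetPieceVisible (string pieceName, bool visible) {
      Transform piece = FindDeepChild (transform, pieceName);
      if (piece != null) {
         piece.gameObject.SetActive (visible);
      }
   }

   // Recursive search, since transform.Find only checks direct children
   private Transform FindDeepChild (Transform parent, string childName) {
      foreach (Transform child in parent) {
         if (child.name == childName) return child;
         Transform found = FindDeepChild (child, childName);
         if (found != null) return found;
      }
      return null;
   }
}

So, for example, GetComponent<ArmourToggler> ().SetPieceVisible ("Helmet", false); would hide a (hypothetically named) "Helmet" piece for a bare-headed character type.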


Coupled with being able to change textures at runtime, this should give our players plenty of scope for customising their own characters in the game!

Now the only thing really left to do is to create some custom animations for our orc characters. For some reason, applying humanoid animations to these particular characters ends up making them look a little bit... well, gormless.




Wednesday, 30 November 2016

Changing textures at runtime in Unity

One of the things we're keen to get working for our Unity game is the ability to customise Fantasy Football characters. As different characters have different amounts of armour, we're modelling the characters with every possible armour piece attached, then disabling the unwanted pieces at runtime in Unity.

To be able to do that would be pretty cool.
What would be super-cool would be to have the player design their own team strip (perhaps using some kind of web editor) and then have their players clad in their own custom colours.

To do that would require generating and changing textures on-the-fly. Now we're pretty sure - with some PHP and GDI+ - we can generate the appropriate png at runtime. What we need is a routine to allow us to change the texture of an object at runtime in Unity.

Luckily, it's not that difficult:

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class loadTexture : MonoBehaviour {

   // The object whose material we want to re-texture
   public GameObject target;
   // URL of the replacement texture on our web server
   public string web;

   // Use this for initialization
   void Start () {
      if (!string.IsNullOrEmpty (web)) {
         StartCoroutine (loadImage (target, web));
      }
   }

   private IEnumerator loadImage (GameObject page, string url) {
      // WWW downloads in the background; yielding suspends this
      // coroutine until the request has completed
      WWW www = new WWW (url);
      yield return www;
      if (string.IsNullOrEmpty (www.error)) {
         // Swap the main texture on the object's material
         page.GetComponent<Renderer> ().material.mainTexture = www.texture;
      }
   }
}

We set the script up by dragging the orc skin into the "target" field, and setting the URL to our (local) web server.


When the game engine runs, our orc appears with the default green skin:


But when the new texture has finished downloading, the skin colour changes immediately.


It's only a little thing, but it's pretty exciting for our game idea; we've potentially got the ability to allow players to create (and download) their own team colours - and in Unity it simply means loading a new (single) texture/png from the web server.

When playing against an opponent, the Unity app could download their team colours, thus allowing both players to completely customise their own teams - and have their team colours appear in other people's games.
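As a rough sketch, fetching and applying an opponent's strip might look something like this - the URL scheme and teamId parameter are entirely hypothetical, and it assumes every model in the team shares the same strip texture:

using System.Collections;
using UnityEngine;

public class TeamColours : MonoBehaviour {

   // Every renderer wearing this team's strip
   public Renderer[] playerRenderers;

   public IEnumerator ApplyTeamStrip (string teamId) {
      // Hypothetical URL layout - substitute your own web server's scheme
      WWW www = new WWW ("http://localhost/strips/" + teamId + ".png");
      yield return www;
      if (string.IsNullOrEmpty (www.error)) {
         // One downloaded texture recolours the entire team
         foreach (Renderer r in playerRenderers) {
            r.material.mainTexture = www.texture;
         }
      }
   }
}

The same trick would cover the endzone markers discussed below - they're just another texture to swap.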

The original (or, more accurately, the second edition) Blood Bowl boardgame came with a number of "endzone" markers, for different teams. The game was very much about customising the teams - creating your own team name, mascot, insignia, team colours etc. In the game, the endzones at the end of the playing surface were simple double-sided strips of card which could be swapped out depending on which team(s) were playing.



Not only could we provide players with the ability to create their own team colours, we could even have custom in-game end-zones by simply swapping out a texture or two.

Now that would be pretty cool.....


Tuesday, 29 November 2016

Blender and weight painting

So far we've managed to import an existing Unity Asset into Blender, create our own mesh, unwrap UVs, use existing UVs and add a custom rig.

In truth, the rigs that come with Unity Assets are almost impossible to use for animating in Blender - quite often the bones are disconnected, with each one facing upwards/outwards.


After doing some research into motion capture software (particularly using the Microsoft Kinect, but more on that later) we worked out why this might be - but it doesn't help us when we want to create our own, custom animations from within Blender. We need to throw this set of bones away and use our own rig to animate the mesh.

Doing so isn't exactly trivial. It's not especially difficult either - at least, not to get a mesh moving with a skeleton. But it is tricky to get it working as well as the original Unity Asset. This is mostly - it turns out - because of weight-painting.

So far, whenever we've parented a mesh to a rig, we've used the "with automatic weights" option. Which makes it work. But not without problems. The most obvious example of this is around the hips; check out this walk animation.





Watch the bit at around 17 seconds in, where the character walks towards the camera. Notice the "trouser line" where the waist meets the hips. That moves around quite a bit. Which is fine for an organic shape (a humanoid shape covered in fur, for example) but if this character was indeed wearing a belt, things would look a little bit funky.

And that's because the hips - in this model - are weighted above the belt-line. Compare that to one of our imported models.


In order to display the weights on each bone, there are a couple of things to do first - obviously we need to be in weight-paint mode. But in this mode, there's no easy way to select a bone on the model - luckily we can use "vertex groups" to select the points on the model that are affected by each bone.

If we move the hip bone on the model above, there's a very definite "line" where the deformation stops acting - just below the "belt-line".


When we inspected the weights on our imported models, we noticed that where the effects of different bones start and end, the weight painting is the same for both bones.


Note how the lower leg bone mostly affects the shin area, and the knee area is green (affected quite a bit, but not any further up the leg). The upper leg bone affects the thigh area, and the knee area is also green. The effect of the upper bone goes no further down the leg than the knee. This ensures that there is no "overlap" between the two bones, so they do not "compete" for influence over the mesh.

Which means our next Blender project will be to rig a model, apply some custom UVs, and weight our custom rig to create a similar behaviour. Right now it's getting a bit late, so that will have to wait for a few days....

Sunday, 27 November 2016

Blender FBX import - where are the UVs?

In the last couple of years, we've racked up quite a bill on the Unity Asset Store. A good few hundred quid at least. Some of this has been for plug-ins or shaders, but the bulk of it has been for models.

Quite often the characters come with animations, which is great. But if they need altering slightly, or a new animation is required, or the model needs a bit added or removed, things get a bit trickier.

Most often, Asset Store characters are supplied as models in FBX format. And Blender likes FBX. But when you import the model into Blender, quite often, you don't quite get what you'd expect...


Here we imported one of the soldiers from the Toon Soldiers pack (an amazing character set, btw, with some really nice animations thrown in, as well as a mecanim-ready rigged character). But even after setting the view to textured, the model appears as a solid, blank canvas.

If we go into our UV editor....


... there's nothing there!


Even if we select the model, and the material that came with the FBX....


... nothing doing. We tried to load the texture into the explorer panel on the right, but the model stubbornly refused to render with the material/texture we'd set.


The way to do it is to select the model, enter edit mode, and select all faces (not edges or vertices)...


Then select UV view from the bottom-left option in the menubar.


Bingo! We've now got the UV shapes - but there's still no sign of the texture. To fix this, in UV view, load the appropriate texture/png from disk.


If you've selected the right texture, everything should line up just right. Go back to 3D view and, hey presto!


Now our imported character looks just like he should, when used in Unity. To prove this, save the entire .blend file into your Assets folder and flip over to Unity. Drag-n-drop the character into the scene (Unity automagically imports .blend files using the built-in FBX importer, so no need to export again) and apply an animation.


Choose the most appropriate shader for the model by selecting the character in the explorer, expanding the list and selecting the mesh:


Hit play, and watch your (Blender modified) character come to life!


Friday, 25 November 2016

New press-n-peel printer

After a mishap with our Dell laser and some acetate (seriously, Steve, what did you think would happen, putting thin plastic through a thermo-nuclear-hot fuser?) it was time to get a replacement.

Just over a year ago we tested a few different printers for use with both genuine press-n-peel blue (which currently costs a staggering £5/sheet) and the cheap yellow Chinese alternative we bought (10p per sheet).

Surprisingly, we were able to go into a shop - a real, physical shop made out of bricks and everything - and pick up a desktop Xerox Phaser for around £120.


(Another) Steve at Kings Printers in Brighton was incredibly knowledgeable about printers, feeder ports, toner components and so on. He also did a lot of research on our behalf, contacting toner manufacturers (and a fair few "compatible" providers) before returning the same conclusion we'd already come to: for press-n-peel, Xerox is best.

And after a quick test with our cheap alternative paper:


Good, strong, solid black. No scaling or broken traces. Although we forgot to photograph the final board (it was etched, lacquered and put into a final product before we thought to photograph it) you can see the quality of the Xerox print for press-n-peel use.

In short, if you're struggling with press-n-peel (a lot of people do, and we regularly get questions about how to get the best results from non-optimum papers) you could do a lot worse than upgrade your laser printer.

A genuine desktop Xerox is much less expensive than we were expecting - and the results speak for themselves!


Thursday, 24 November 2016

Importing BVH files into Blender Rigify-ied rig with MakeWalk

Did anyone mention that animating is really, really hard? Rigging a character in Blender is pretty straightforward, even if you do it yourself. With just a few mouse clicks, you can easily Rigify a character and have full IK/FK support. Moving limbs and waving arms around is pretty simple stuff.

But creating a realistic animation - making the character look like a real, living, breathing thing, rather than a stiff, robotic puppet - is really hard. Plus, for some poses, FK (forward kinematics) is ideal; for others, IK (inverse kinematics) is preferable. And getting Blender to play nicely as you switch from one to the other is a bit of a nightmare.

Luckily, there's a (relatively) easy solution: BVH.
And ever since the first Microsoft Kinect (originally for Xbox) hit the market, indie game developers have had access to a nice, easy, cheap(ish) motion capture device.


The Kinect for Windows package is now discontinued, but you can still use a Kinect for Xbox One and a PC connection cable to get the same result.

The cable costs more than the Kinect (though both can be bought online for around £40 if you look carefully). So, of course, we snagged a few and looked at how we can use them for our animations.

While we're waiting for them to arrive, we took a look at BVH animation in Blender. There are quite a few software packages that work with the Kinect One to create BVH animations (we'll try a few out when the hardware actually arrives) - so in the meantime, we thought it best to look at how to use BVH files with our rigged character in Blender.

It turns out, it's actually quite easy to import BVH animations into Blender - simply download the MakeHuman Blender tools, and activate the MakeWalk plug-in in your Blender project.


Then load your character, complete with rig (either a hand-carved rig or a Rigify-generated one) and, in pose mode, the MakeWalk tab should appear in the Tools panel on the left of the screen (assuming a default pane layout in Blender).


Hit the "load and retarget" button and hey presto! Your character takes on the BVH animation.

If your character stubbornly remains in the t-pose position, use the play/frame advance tools to move through each frame of the animation. If there are no frames of animation in the timeline, check your console for import errors.

Now that was easy. But it's not the end of things. Most BVH files are massively wasteful. They key just about every bone, location and rotation on just about every frame. And there are also a couple of frames where "glitchy" poses appear.

Even for an energetic ballet dance move, something doesn't look quite right in this frame!

So there's a bit of tidying up to do. But, in general, it's a quick and easy way to get a basic animation into Blender. We usually just flick through the animation and, where any one frame looks particularly out of place, simply remove that keyframe. As long as there is a decent keyframe before and after the offending frame (or frames - you can get away with deleting up to 5-10 consecutive frames before it's noticeable) you should be ok. Looking at the dope sheet, however, shows us a slightly different story:

Wow! That's a lot of keyframe information.

Probably about 90% of this keyframe information is unnecessary (that's just a guess - a number plucked completely out of thin air). But given that Blender (and, eventually, our target platform Unity) will automatically tween between two poses, what we really need to do is grab just the pertinent frames of action - keep the main key poses between actions, and let Blender fill in the gaps, rather than force our model into a set pose on every single frame.

With our animation, we found that the first 21 frames were basically the same pose - the actor in the BVH mo-cap studio obviously readying themselves to perform the action. So we kept frame one and frame 22 and deleted all the other frames between these two. The animation played more or less the same, but with only one key pose, instead of 22, at the start of the animation.

Here's the same dope sheet, but with only the important frames kept in - any frames where the character was simply transitioning from one pose into the next, we deleted.


While we're still keying every bone, location and rotation, between frames 1-50 we now have eight fully-keyed frames of animation, instead of fifty. Already that's a massive reduction in animation data.

The playback still looks pretty much the same as the original. A few, tiny, subtle little nuances may be lost. But that's the compromise for getting good, clean, small-sized animation data; something we're happy to live with.

The only issue with this approach is that it's boring. Repetitive and boring.
So animation isn't hard. Just tedious!

Wednesday, 16 November 2016

Simple walk cycle in Blender

Following on from our Rigify rigging, we had a go at animation today. Let's be clear. Creating animations is not easy! But one of the simplest animations to do, with 3D software, is a simple walk cycle.

So we started off with our "extreme" pose. With auto-keying set, each time a bone or IK controller is moved, Blender automatically creates a key frame for you.


We used the IK controllers on the hands and feet to create a classic walking pose. For good measure, in pose mode, we selected all the bones in the rig and - via the Space popup menu - hit "insert keyframe". Then, with all the bones still selected, hit the copy pose button.

Then, forty frames on, we hit the "paste reversed" button. It's the second of the two paste buttons. A perfect mirror image of the pose appears.


Then, forty frames further on again, we paste the original pose (not mirrored). In the section at the bottom, we set the animation to run from frames 1-80 and hit play. Already we have a simple (albeit rather clumsy) walk animation.



Now, that's ok. But it's not brilliant. It's a bit "lifeless".
So we now need to create the "crossover pose". This is the pose, exactly half-way along the animation, where one leg crosses over the other. While we're about it, we made a few tweaks to the crossover pose.

It's at this point in a walk action that the "front" leg is taking the entire weight of the body. It's also where the character is getting ready to "spring" into the next pose. As such, the body needs to be compressed slightly - the front leg bent slightly, ready to push off with the next step. So we grab the hips and pull them down slightly.


Because our hands are still set to IK (inverse kinematics) they remain exactly in place, even though the entire body has shifted down a little (almost as if they were holding onto some invisible bars, keeping them at their current level). So we grab each hand and pull them down, towards the floor, a little bit.

Then, forty frames further on, we paste a mirrored copy of this cross-over pose. Already our walk cycle animation looks much better.



Now that's a bit more like it. It's not perfect, but with just two major poses (mirrored at the appropriate points in the animation) our walk cycle has a lot more life in it!
If you watch the animation back, the character appears to "ride on his heels" for a long part of the walk. We would prefer his foot to be flat to the floor for a longer time during the animation.

So between the extreme and the crossover poses, we created another keyframe, this time placing the foot flat on the floor (and turning the toes a little to flatten the foot out completely).


After watching the animation through a few more times, we found a slight problem - ever so slight (in fact, probably not even noticeable once the character is in a game world, animated, and viewed at a distance from a slightly overhead viewpoint). As the foot slides back, from the crossover to the extreme pose, it can dip below the floor line.


Another key frame and a quick tweak, and the animation is pretty much done. For now, anyway. There's no real character to the walk - no gait, or anything to indicate whether this is a light or heavy person. But it's a start!



In hindsight, we'd probably make this animation run over 60 frames, not 80. As it is, the character appears to be walking quite slowly. So we should either speed it up a little, or play about with the key-frames, making the rise take longer and the "fall" into the crossover position quicker, to indicate a slow, heavy, lumbering character.

There are probably plenty of ways we can add a little more character to the animations. But for now, we're just getting used to mixing IK and FK controllers to create simple animations for our Unity game. Next up.... getting the animations to play in Unity!