Tag Archives: blender

Houdini to Blender camera exporter script

Unlikely anyone but me needs this, but just in case: a Houdini script to copy camera animation to the clipboard as a Blender-flavour Python script. Select a camera (or camera switcher) in Houdini, run the script, go to Blender, create a new text block and hit Paste. Execute it and Boom! there’s your camera, all animated an’ stuff.

Suspect most folk wanna go the other way, but I’ve a stupidly complex object already in Blender that wouldn’t be trivial to export, and my scene’s in Houdini. Lots of zany animated texture stuff going on as well… why recreate it if Blender can render it happily? Just need the camera to match up with the rest of the scene.

I’d been going round the houses, exporting to AE first, then from AE to Blender, but focal-length and DoF settings weren’t making it through, so this is an improvement.

Always weird, though, writing a Python script that generates … a Python script. ‘Specially when you’re running it in one 3D package, with its own data structures/methods, and it has to produce a script for a different package with different names and concepts for everything.
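Roughly speaking, the pasted script ends up looking something like this. To be clear, this is just an illustration and not the actual gist: the camera name, sample values and frame numbers below are made up, and the real script also has to deal with Houdini’s Y-up versus Blender’s Z-up, plus the focal-length/DoF keyframes mentioned above.

import bpy
from math import radians

# made-up samples; the real generated script bakes one entry per frame
frames = [
    # frame, location (x, y, z), rotation_euler in radians, focal length (mm)
    (1, (0.0, -5.0, 1.6), (radians(90), 0.0, 0.0), 35.0),
    (2, (0.1, -5.0, 1.6), (radians(90), 0.0, radians(1)), 35.0),
]

# create a fresh camera object (Blender 2.8+ API assumed)
cam_data = bpy.data.cameras.new("houdini_cam")
cam_obj = bpy.data.objects.new("houdini_cam", cam_data)
bpy.context.collection.objects.link(cam_obj)

# set the values for each frame and keyframe them
for frame, loc, rot, lens in frames:
    cam_obj.location = loc
    cam_obj.rotation_euler = rot
    cam_data.lens = lens
    cam_obj.keyframe_insert("location", frame=frame)
    cam_obj.keyframe_insert("rotation_euler", frame=frame)
    cam_data.keyframe_insert("lens", frame=frame)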

hey ho…. github gist linky

Blender / Cycles / Particles / Motion blur artefact fix

A render with lots of particles (baked), using Blender and Cycles. I was getting this peculiar streak artefact right next to the particle emitter:

After a bit of fiddling around, I’ve sussed what’s causing it. The default setting for Motion Blur in Cycles is to centre the motion blur around the frame time: it opens the “shutter” before the frame time, then closes it after the frame time.

So when it’s rendering frame 7 with a half-frame (180°) shutter, it’ll actually render the portion of time from 6.75 to 7.25. But it seems that when Cycles reads the particle data from the cache, it only reads the previous frame and the current frame. So the second half of the motion-blurred period doesn’t have the correct particle data to work from.
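If it helps to see the arithmetic spelled out, here’s a throwaway helper (not part of Blender, purely illustration) that returns the slice of time a frame samples for a given shutter length (in frames) and position setting:

# hypothetical helper: the span of time sampled for a frame,
# given the shutter length in frames (0.5 = 180 degrees)
def shutter_span(frame, shutter=0.5, position="CENTER"):
    offsets = {
        "CENTER": (-shutter / 2, shutter / 2),
        "START": (0.0, shutter),
        "END": (-shutter, 0.0),
    }
    lo, hi = offsets[position]
    return frame + lo, frame + hi

print(shutter_span(7))                   # (6.75, 7.25) -- reaches past frame 7
print(shutter_span(7, position="END"))   # (6.5, 7.0)   -- stops at frame 7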

Easy fix then: choose either Start on Frame or End on Frame instead. Which one to choose will depend on your scene, but it should become obvious when you test it.
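For the record, the same setting can be flipped from Python rather than the render panel. In the Blender/Cycles builds I’ve used it lives on the Cycles scene settings as below; property names may have moved in newer versions, so treat this as a sketch:

import bpy

scene = bpy.context.scene
scene.render.use_motion_blur = True       # motion blur on
scene.render.motion_blur_shutter = 0.5    # 180-degree shutter
# shift the blur window so it doesn't read ahead of the cached particle data;
# the other enum values are 'START' and 'CENTER'
scene.cycles.motion_blur_position = 'END'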

Streak artefacts gone! (Phew)

Getting camera tracking data from Blender to After Effects

There’s a plugin for Blender that lets you export 3D camera data in a format AE can use – you end up with a camera in AE that moves exactly the same way as your camera in Blender.

After Effects strangely doesn’t support orthographic cameras though – you can have either flat compositions with no 3D transformations, or full-on 3D with perspective. No isometric-style perspectiveless 3D. So, no good for my current project, which looks 3D but is completely perspectiveless:

[screenshot: the perspectiveless 3D scene]

This means that if you need to track a logo in over an orthographically-rendered animation, you have to use the 2D Corner Pin effect to distort the logo to fit the scene. In a fast-moving scene you can just render tracking markers at the corners of where the logo should go, then use them in AE to place the corner pins, either by eye or with the Motion Tracker.

For slow-moving scenes it’s hard to get the track perfect. An orthographic view, combined with a camera that rotates as well as translates, makes it worse: the lack of perspective already looks weird, so a logo that isn’t tracked solidly seems to swim. After much frustration and a bit of googling I decided to hack up a script for Blender to export the data I needed:

# use this script to export the screen-space coordinates of the
# currently active object. Coords get printed to the console as
# tab-separated "frame  x  y" rows (y measured from the top,
# to match After Effects) -h

import bpy
from bpy_extras.object_utils import world_to_camera_view

scene = bpy.context.scene
cam = scene.camera                  # the active scene camera
obj = bpy.context.active_object     # the object to track

# needed to rescale the normalised 2D coordinates to pixels
render = scene.render
res_x = render.resolution_x
res_y = render.resolution_y

rnd = lambda i: round(i, 1)

print("====================")

for j in range(1845,1902):   # frame range you're interested in
    scene.frame_set(j)
    # project the object's world-space position through the camera;
    # world_to_camera_view returns normalised coords with (0,0) at bottom-left
    coords_2d = world_to_camera_view(scene, cam, obj.matrix_world.to_translation())
    print("\t{}\t{}\t{}".format(j, rnd(res_x*coords_2d.x), rnd(res_y*(1-coords_2d.y))))

Worth the hassle, though: the flown-in logo is locked to the wall now:
[screenshot: the tracked logo locked to the wall]