3Delight (renderman) in Blender

June 1st, 2010 · 42 comments

For a little while now I’ve been working in my own time on a renderman connection for Blender 2.5. The new render API, while not yet fully complete and stable, makes it much easier to connect Blender’s scene data to external renderers via Python.
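
By way of illustration, here’s a minimal sketch of how an engine can hook into that API – the names are placeholders and the API is still a moving target, so treat it as a sketch rather than working code:

    import bpy

    # Minimal sketch of registering an external render engine via the
    # Blender 2.5 render API; the identifier and label are hypothetical.
    class DemoRendermanEngine(bpy.types.RenderEngine):
        bl_idname = 'demo_renderman'   # internal identifier
        bl_label = 'Demo RenderMan'    # name shown in the engine menu

        def render(self, scene):
            # A real exporter would walk the scene here, write a RIB
            # file and launch the renderer, then load the image back:
            rx = scene.render.resolution_x
            ry = scene.render.resolution_y
            result = self.begin_result(0, 0, rx, ry)
            # result.layers[0].load_from_file('/path/to/render.tif')
            self.end_result(result)

    bpy.utils.register_class(DemoRendermanEngine)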

I’m aware that other people are interested in this topic too, but so far I’ve been doing this for my own use – to get more familiar and experienced with renderman in practice (by diving in head first), and to develop something I’d like to use myself as a lighting/shading artist. I’m also concentrating on the 3Delight renderer at the moment, since it interests me most and provides a free single license.

It’s still quite a work in progress, and by no means provides exhaustive support at this point. I’m tackling this from a pragmatic angle, prioritising the things I want to use myself and making those easy to use, rather than trying to support the entire RISpec from the start. I’ll probably release the code very soon, but I’d like to clean it up a little first.

Here’s a test I rendered out last night, with a model by my mate Tristan Lock:

Anyway, it currently supports:

– polygon meshes and subdivision surfaces
– UV coordinates
– depth of field blur and motion blur
– surface/displacement/lightsource shaders, with parameters available in Blender’s material properties
– simple conversion from Blender’s lamps to renderman lightsources
– shadow maps and raytraced shadows
– built-in indirect lighting and environment lighting using 3Delight’s ‘indirectlight’ and ‘envlight2’ shaders
– shader editing and compilation from the Blender text editor
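
To give an idea of what the exporter does under the hood, here’s a rough sketch of writing a mesh out as a RIB polygon primitive – the function and the file handle are illustrative only, and the real thing also has to deal with normals, UVs, materials and so on:

    # Rough sketch: write a Blender mesh as a single RIB PointsPolygons
    # call. 'rib' is assumed to be an open file object.
    def export_mesh(mesh, rib):
        nverts = [len(f.vertices) for f in mesh.faces]         # verts per face
        verts = [vi for f in mesh.faces for vi in f.vertices]  # flat index list
        points = [c for v in mesh.vertices for c in v.co]      # flat xyz list

        rib.write('PointsPolygons [%s] [%s] "P" [%s]\n' % (
            ' '.join(str(n) for n in nverts),
            ' '.join(str(i) for i in verts),
            ' '.join('%.6f' % c for c in points)))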

screenshot 01

§ 42 Responses to 3Delight (renderman) in Blender

  • Glenn says:

    Very nice progress, Matt. Looking forward to playing with it here.

    ~Glenn

  • Mitch says:

    Very cool. Please keep us posted.

  • maces says:

    Hi,

    That looks awesome.

    – Will it be included in (standard) Blender?
    – Is there a test version?
    – Is something like Slim (a shader tool) planned? Maybe based on (material) nodes?
    – Will there be a material preview rendered in the external renderer?

    (I think the last point is very interesting for every external render engine)

    maces

  • jason7 says:

    This means we will finally be able to get a P pass for Nuke.
    Thanks!

  • Matt says:

    Hi all,

    Maces:
    – Don’t know
    – Not yet
    – Not planned
    – Probably, I think the capability exists

    jason7: If by P pass you mean pixel XYZ location, you should be able to do that pretty easily with Blender already, using material override, no?

    cheers

  • Cessen says:

    Hey Matt, looks really cool. I played around a bit with this as well, but didn’t make nearly as much progress as you have.
    It looks to me like your pragmatic approach closely matches what I would want from a renderman tie-in with Blender.

    After Durian is over, I’d be very interested in working to get this functional for Aqsis as well, if you wouldn’t mind some outside contributions.

  • Matt says:

    Cessen: sure, I don’t mind that at all – it shouldn’t be too difficult to do.

  • LoafMag says:

    God? Is that you? Came back to earth in your heavenly glory in the shape of Matt Ebb? 🙂

    Man this is huuuuuge news!

    Looking forward to a test version to play with 🙂

    Maces: There are plenty of existing SLIM equivalents out there – some OSS, some freeware, some commercial – so there are probably some other priorities at this point 🙂

  • LoafMag says:

    Btw you can get those shadows motion blurred with deep shadow maps 😉
    Got multi-segment motion blur working btw?
    Does the new render API provide some means to tap into subframe information? To put inside MotionBegin/MotionEnd as motion segments that is?

    Just gotta say once again that this is fantastic. If you released this into the wild it would almost be too good to be true, regardless of its current state! 🙂

  • Matt says:

    hi loafmag: not sure on the state of multi-segment blur yet. The method available to change frames (in order to get new object states) only works on integer frames, and I’m not sure how easy that will be to change. Need to talk to Joshua about it.
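
    (For context, the RIB side of this is simple once the per-sample object states are available – a sketch, with the file handle and matrices assumed to come from the exporter:)

        # Sketch: emit a multi-segment motion block to an open RIB file.
        # 'times' are shutter-relative sample times; 'matrices' is one 4x4
        # world matrix per sample.
        def write_motion(rib, times, matrices):
            rib.write('MotionBegin [%s]\n' % ' '.join('%f' % t for t in times))
            for m in matrices:
                rib.write('ConcatTransform [%s]\n' %
                          ' '.join('%f' % v for row in m for v in row))
            rib.write('MotionEnd\n')

        # e.g. two segments (three samples) across the shutter:
        # write_motion(rib, [0.0, 0.5, 1.0], [mat_a, mat_b, mat_c])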

  • LoafMag says:

    There is a work-around though, which worked for Ribmosaic in Blender 2.49. Say you have 100 frames total, and you want 5 motion segments:
    * Scale the whole animation by 5 so you end up with 500 frames
    * Specify motion blur over 5 frames
    * Render with step 5 (only render every fifth frame)
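
    In code form the remapping is just this arithmetic (an illustrative sketch, nothing more):

        # Sketch of the workaround's frame arithmetic: scale the timeline
        # by the number of motion segments, blur across the inserted
        # frames, and render every Nth frame.
        segments = 5
        total = 100                                # original frame count
        for frame in range(1, total + 1):
            scaled = frame * segments              # frame actually rendered
            shutter = (scaled, scaled + segments)  # blur window, in scaled frames
            print('render frame %d, shutter %s' % (scaled, shutter))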

    But that won’t work well with baked animation data for particles or cloth. In that case I used to use MDD export and then reimport, so that each baked frame was in effect a shape key, and then scale by however many times you need so that you can extract motion segments.

    However it’s pretty important for the render API in 2.5 to be able to export subframes, so this shouldn’t be a permanent solution 🙂

    Btw, you can put an unlimited number of motion segments inside MotionBegin/MotionEnd in 3delight, so you could do some pretty cool things – like extreme shutter renders like in this photo: http://us.123rf.com/400wm/400/400/razvanphoto/razvanphoto0610/razvanphoto061000140/568003.jpg

    😀

  • LoafMag says:

    Oh and btw, it’s pretty cool that increasing the number of motion segments doesn’t have that big an impact on render times, as long as you don’t change the number of pixel samples or the shading rate 🙂

    Btw, this is rendered with AIR, another Renderman renderer, with an extreme shutter and motion segments, much like the photo in my previous post: http://www.sitexgraphics.com/html/car_ad.html

  • Matt says:

    Yeah, I’m aware of how the motionbegin/end blocks work 🙂

    That workaround is horrible! Had a chat to Joshua about the fractional frames today, think I have an idea for a solution, will implement when I get some time.

  • LoafMag says:

    Yes it IS horrible 🙂 But the smallest unit you could export using the 2.49 API was a whole frame, so there was no other choice for Eric/Whiterabbit who made ribmosaic 🙂
    Good news that things will change in the 2.5 API!

    Sorry for the bombardment of questions, but do you use a “passes” system similar to Ribmosaic or 3delightForMaya for things like generating shadow maps, point clouds, environment maps etc? Or have you come up with something more intuitive?

    Keep up the amazing work 🙂

  • Matt says:

    Currently there’s not much of a system – as I said, it’s still early days. There’s no envmap or ptc rendering yet; for shadow maps there’s a ‘kind of’ internal passes system that currently just automatically generates shadow maps for each shadow-map-using light before each render. There’s plenty of room for development here, anyway, when I get a bit of time.
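
    (Roughly, that pre-pass amounts to something like the sketch below – the helper functions are hypothetical, and it uses a plain zfile depth display:)

        # Loose sketch of a per-light shadow pre-pass. uses_shadow_map(),
        # write_camera_from_lamp() and write_scene_geometry() are
        # hypothetical helpers.
        def write_shadow_passes(scene, rib_dir):
            for ob in scene.objects:
                if ob.type == 'LAMP' and uses_shadow_map(ob):
                    rib = open('%s/shadow_%s.rib' % (rib_dir, ob.name), 'w')
                    # depth rendered from the lamp's point of view
                    rib.write('Display "shadow_%s.z" "zfile" "z"\n' % ob.name)
                    write_camera_from_lamp(rib, ob)
                    write_scene_geometry(rib, scene)
                    rib.close()
                    # the depth file then serves as the map the light
                    # shader samples via shadow() in the beauty pass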

    BTW, got hair strands working the other day (but not hair children yet…)

    http://mke3.net/projects/3Delight_blender/screens/3delight_hairs.png
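
    (A rough sketch of how strands can go out to RIB – simplified, with the strand data assumed to be lists of xyz points:)

        # Rough sketch: write hair strands as a single RIB Curves call.
        # With "cubic" curves the per-strand vertex counts must fit the
        # current basis (4 + 3k for the default Bezier basis).
        def write_hair(rib, strands, width=0.01):
            nvertices = [len(s) for s in strands]
            points = [c for s in strands for p in s for c in p]
            rib.write('Curves "cubic" [%s] "nonperiodic" "P" [%s] '
                      '"constantwidth" [%f]\n' % (
                          ' '.join(str(n) for n in nvertices),
                          ' '.join('%.6f' % c for c in points),
                          width))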

  • LoafMag says:

    Very cool that you’ve got hairs working! RiCurves I assume? Reyes is just insane at rendering hair.
    I stumbled upon that earlier – I was snooping around in http://mke3.net/projects/3Delight_blender/ to see if you had something tucked away, perhaps an accidentally uploaded 3delight exporter script or something :p

    Got any more teasers you feel like sharing with the world? 🙂

    Hope you get enough time and motivation to continue, looks very promising!

  • Matt says:

    Yah, I hope to soon – I’m just tied up on some jobs at the moment, so not much spare time for a week or so.

    cheers

  • Loolarge says:

    This is looking very promising Matt, can’t wait to test it!

  • Moolah says:

    Your 3delight exporter supports very pretty things! I think it will be a pleasure to use! 🙂

  • Dalibor Garic says:

    Hi Matt
    It sounds like we’re aiming to make the same thing.
    I’ve been working on my own exporter over the past couple of months, and I’ve already implemented hair/fur export and procedural hair/fur – it works lightning fast. The trick is in using Blender’s native C code.
    Here are links to some test renders with more than a million hairs in the scene. Suffice it to say that export time is about 8 sec.
    LINKS:
    http://a.imageshack.us/img707/9264/pixiehairfurtest01.jpg

    http://a.imageshack.us/img94/6339/pixiehairfurtest02.jpg

    http://a.imageshack.us/img541/3669/pixiehairfurtest03.jpg

    http://a.imageshack.us/img37/4008/pixiehairfurtest04.jpg

    Everything is based on the Blender 2.49b code, because Blender’s 2.5 code changes so frequently. I hope it will be implemented in 2.5 eventually, though.
    For now, procedural hair generation works only with Pixie, my favourite.
    Rendering in Pixie in debug mode takes about 6 min; in release mode it’s about 2 min, but Pixie becomes unstable.
    Cheers!!!

  • LoafMag says:

    Hey Matt,

    Have you made any progress with the exporter so far? Did you find out if there’s a way to extract subframes through the API? I’m curious also how the problem of getting subframes out of baked simulation data will be solved.
    Hope you’re still checking the comments section in this old blog post 🙂

  • Matt says:

    Hi LoafMag, I haven’t abandoned this – I’ve just been very busy with some production work recently. But I’m going to use this stuff on an upcoming job, so it’ll be getting more attention soon. Subframes will be easy now; I’ve already done most of the internal work in the Blender source.

    Simulation data is still a sore point, because Blender’s physics systems are quite inconsistent. It’s possible to get sub-frame interpolated data out of the point cache, though not all the simulators themselves (e.g. cloth) support retrieving sub-frame sim data directly. Physics in Blender is currently in quite a nasty condition and really needs to be fixed and cleaned up.

  • LoafMag says:

    Good news! 🙂 I’m curious whether the decision to use 3delight/renderman on this upcoming job was forced by limitations in Blender Internal, or whether you simply wanted to? 🙂

    Good news also about the sub-frame stuff 🙂 For cloth sims the old work-around I mentioned before still works, though – it’s a pain in the rectum but it gets the job done 🙂
    Perhaps now, with Brecht leaving for Refractive, even more attention will be aimed at the API?

    Btw did you see Eric Back’s latest update on RibMosaic? Have you been in contact with him about Mosaic? You could certainly benefit from exchanging some ideas
    /Magnus

  • Matt says:

    Hi LoafMag

    What I’m aiming to use this for isn’t particularly complicated rendering-wise, which is better for an early test and to get some experience with it. Even so, there are many limitations and faults in Blender’s internal renderer and I suppose this is part of a longer term effort of mine to move away from using it in the future. Maybe I’ll write a bit on this later.

    I did see the mosaic update. I haven’t been in contact with him; honestly, I’m happy to just proceed on this myself and learn from my own mistakes. One reason is that I didn’t like the 2.4 mosaic version very much – it seemed very complicated and didn’t make it easy to get the sort of work I wanted done quickly. To me it seemed more aimed at a TD type person who wants ultimate flexibility, but I’m more interested in something more focused that works well in my lighting workflow.

    This is why rather than attempt to implement the entire rispec from the start, I’m trying to develop this via production, to see what’s needed first, what works and doesn’t work, what’s nice vs annoying and revise as I go.

  • LoafMag says:

    Sounds nice, looking forward to an eventual writeup on that 🙂

    It makes sense to do it from the ground up – it’s also a good way to learn more renderman. You can never learn enough; it’s a beast.

    I agree that Whiterabbit is aiming for maximum flexibility, but in the update he also explains why the new Mosaic for 2.5x will be usable by anyone from casual artists to lighting artists to TDs – he spent a lot of time planning and learning from past mistakes on this one.
    But it’s a huge project and it doesn’t hurt to have alternatives. I’m guessing yours will be streamlined for 3delight and more of a seamless integration with Blender instead of a “pipeline” approach? That’ll be popular among lots of people – definitely something worth pursuing.

    Btw, thanks for this!:
    http://lists.blender.org/pipermail/bf-blender-cvs/2010-August/030401.html

  • mookie says:

    WooW! I would LOVE to try it out! I’ve heard so many good things about Renderman, but I found the existing exporters very unfriendly! I hope you’ll share it with the community! Keep up the awesome work!

  • LoafMag says:

    Btw Matt, you mentioned it’s possible to get interpolated subframe data from baked simulations.

    This should probably be filed as a possible bug, but in Blender Internal, Full Sample Motion Blur currently doesn’t work with baked Hair, Cloth and Softbody simulations. No matter how many samples you render with, it doesn’t update 🙂

    Any progress as of late btw?

  • claas says:

    hey

    great start man – I think it’s good that you’ve considered putting the shader compiler inside Blender.

    I assume you have a script running that passes it to the compiler and saves the shader in the right directory?

    Claas

  • Matt says:

    LoafMag: There are already a few bugs filed about that in the tracker… I left some comments there when I was on tracker duties earlier.

    As for progress, I’ve been working on it a lot recently on this aforementioned job – I’ve rewritten most of it, and added a lot of missing things that you only really discover when trying to use it for real work (I guess that’s the point of the exercise!). Things have slowed a bit the last week or two since I’ve been out at another studio, but the final deadline for the project I’m rendering in 3delight is mid-September, so I can look at releasing something out there after then.

    claas: that’s pretty much it, yes
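
    (Something along those lines – a sketch assuming 3Delight’s shaderdl compiler is on the PATH, with made-up paths:)

        import os
        import subprocess
        import bpy

        # Sketch: dump a shader from Blender's text editor to disk, then
        # compile it with 3Delight's shaderdl. The shader directory is
        # hypothetical; a real setup would make it configurable.
        def compile_shader(text_name, shader_dir='/tmp/shaders'):
            text = bpy.data.texts[text_name]       # e.g. 'myshader.sl'
            src = os.path.join(shader_dir, text_name)
            with open(src, 'w') as f:
                f.write(text.as_string())
            # compile in shader_dir so the .sdl lands beside the source
            subprocess.call(['shaderdl', src], cwd=shader_dir)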

    cheers

  • LoafMag says:

    Nice, looking forward to mid-September then – hope we’ll get to see the results of that job as well 🙂

    Regarding the subframe stuff, is it possible through the API to pass the interpolated subframe data from Blender’s cache system to 3delight? That is, if you’ll be using cached/baked simulation data in this project of yours?

    Would I be way off in assuming there’s some fur/hair related stuff in this project? 🙂

  • LoafMag says:

    Btw, some useful stuff for your exporter project, found on the 3delight forums.

    PTC based glossy reflections/refractions and MULTI-bounce GI:
    http://www.3delight.com/en/modules/PunBB/viewtopic.php?id=1504

    PTC based shadows and shadow casting in PTC based GI:
    http://www.3delight.com/en/modules/PunBB/viewtopic.php?id=2065

    PTC based occlusion in general:
    http://www.3delight.com/en/modules/PunBB/viewtopic.php?id=1618

    More on the same subject:
    http://www.3delight.com/en/modules/PunBB/viewtopic.php?id=1511

  • Matt says:

    There’s no API to access Blender’s physics point caches directly; all access takes place by setting the current Blender frame to a (float) frame number, letting the animation system update and evaluate all drivers/physics/etc, then reading the data from Blender’s objects/obdata.
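
    (In sketch form, sampling an object’s transform for blur looks something like this – assuming frame_set() accepts the fractional part via a subframe argument:)

        # Sketch: evaluate the scene at fractional frames and collect one
        # world matrix per motion sample, ready for a MotionBegin block.
        def sample_matrices(scene, ob, frame, shutter=0.5, segments=2):
            matrices = []
            for i in range(segments + 1):
                t = frame + shutter * i / float(segments)
                # let animation, drivers and physics update at time t
                scene.frame_set(int(t), subframe=t - int(t))
                matrices.append(ob.matrix_world.copy())
            return matrices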

    As mentioned before, while the internal point cache C code is theoretically capable of handling subframes, physics in Blender is broken in various places regarding the handling of fractional frames, meaning the data that actually ends up in Blender after being retrieved from a physics cache can often be incorrect. Really, the correct thing to do is fix that in Blender, since it’s a problem for Blender Internal as well (i.e. the bugs in the bug tracker).

    As for the job I’m using this for, there’s no hair, it’s low budget, not very sophisticated, and not really anything particularly suited to renderman (in fact in some ways you could consider it un-suitable for renderman :).

    But it’s simple and still involves a lot of practical production elements, which is why I’m comfortable working this way – developing the software alongside it as a first real-world test, without too much risk should anything go wrong or should limitations become apparent.

  • Brandon Hix says:

    Hey Matt, this is great. I’ve been following your work on this and the work on ribmosaic by whiterabbit – how is your work coming along? I would love to be able to wrap my brain around getting this working with some of my scenes. 🙂 Thanks again.

  • LoafMag says:

    I noticed on the Aqsis dev meeting log that Whiterabbit just announced he’ll no longer be working on Rib Mosaic, sad sad news :/

    How’s progress for you Matt? Are you still working on your exporter?

  • Matt says:

    Ouch, really? That’s a real shame…

    I have still been working on it, and I know I said I thought I’d be able to release something in September; however, it’s still in an incomplete state, and right now I’m concentrating on some other things (yep, finding work and paying rent comes first…)

    I know it sucks to keep waiting, but c’est la vie, I guess..

  • Jos says:

    Hey Matt, your work looks great. I would like to help with the code. Is there an opportunity to assist you in developing it?

  • kilikoro says:

    Hello Matt and Jos.
    I know Matt is really busy at the moment. That’s why, IMHO, some help may be welcome. Would you consider joining forces to make a good renderman exporter for Blender (or individual exporters)?
    I’ve tested Whiterabbit’s mosaic (2.49). I think the renderman renderers (including the open-source ones) are very powerful. Please don’t let these projects die.
    Cheers

  • kilikoro says:

    Hello Matt,

    Have you heard about Jupiter Skin for 3delight?

    http://opensourcevfx.org/2010/11/jupiter-skin/

  • Matt says:

    Yes, I’ve seen that, but it’s not very interesting if you don’t have access to the face scan data.

  • DaveLeitz says:

    Just a quick ‘heads up’ to anyone who uses Ubuntu/Linux 64 bit…

    The only build that this script works on right now is 2.56-r34678 by ‘gdawg’ over at Graphicall.org. He hasn’t updated his build yet… The newer builds by ‘fish’ won’t run it, probably because of changes in the Python API. I’m sure this will change in the near future, but if anyone wants to play with Matt’s Renderman/3delight plug-in right now, that’s the build to download and run.

    However, that same build seems to have a problem with the Blender internal renderer, so I wouldn’t make it my default Blender. It’s better to use the latest ones by ‘fish’ and others for general-purpose work.
