before displacement in viewport
after displacement in render
The original geometry is just a simple NURBS plane, and all deformation is done at render time by a displacement shader, but I needed real deformed geometry for the spray particle simulation, both as an emitter and as a collision object.
So, I wrote a very simple RSL plug-in that exports some data such as P and s, t at render time as an ASCII Houdini geometry file (.geo), similar to a native PRMan point cloud but readable in Houdini at the SOP level. The plug-in is very simple; the C++ code is only about 120 lines long. All it does is grab a variable and save it to a text file in a certain format.
I tried to read the .geo file spec in the Houdini documentation, but there was a little too much information for me. All I needed was to store vector and float values as points, so I just created a few points in Houdini, exported them as a .geo file, opened it in a text editor to check the format, and then wrote the same layout from the RSL plug-in, replacing the numbers with my own data.
The image above is the .geo point cloud in the Houdini viewport. There are quite a lot of points, but it was not too heavy: the average size of a .geo file with 5 or 6 attributes was about 25 to 30 megabytes per frame.
It was actually a pretty interesting experiment. I found that shader evaluation is done not once per micro-polygon but once per micro-polygon vertex, so if you have only one micro-polygon in your scene, the shader is evaluated four times and the results are interpolated across the face. I think that's why we always get a smooth result even when the shading rate is higher than 1. I later found there is even an option to turn that interpolation on or off.
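The blending described above (evaluate at the four micro-polygon corners, then interpolate across the face) is essentially bilinear interpolation; a minimal sketch of that math, with illustrative names:

```cpp
// Bilinear blend of four corner shading results across one micro-polygon.
// c00..c11 are values computed at the four corners; (u, v) in [0,1] is the
// position inside the face. Blending this way is what keeps the result
// smooth even when the shading rate is above 1.
float shadeBilinear(float c00, float c10, float c01, float c11,
                    float u, float v)
{
    float bottom = c00 + (c10 - c00) * u; // blend along the bottom edge
    float top    = c01 + (c11 - c01) * u; // blend along the top edge
    return bottom + (top - bottom) * v;   // blend between the two edges
}
```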
The next step is to build polygon geometry from the point cloud. Since I exported the s and t attributes, I can use those values as translate X and translate Y in 3D space to lay the points out as a flat grid. Then I transferred the original P attribute of the point cloud to a polygon grid object, replacing each point position of the grid with the new P value.
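The rebuild step can be sketched like this: each sample's (s, t) picks a cell in a regular grid, and the sample's stored P becomes that grid point's final position. The grid resolution and the names here are assumptions for illustration, not the actual Houdini network:

```cpp
#include <cmath>
#include <vector>

struct Sample { float P[3]; float s, t; };

// Scatter point-cloud samples into an nx-by-ny lattice using (s, t)
// as 2D grid coordinates, keeping each sample's deformed P as the
// point position. The lattice can then be wired into polygon quads.
std::vector<Sample> buildGrid(const std::vector<Sample>& cloud,
                              int nx, int ny)
{
    std::vector<Sample> grid(nx * ny); // zero-initialized lattice
    for (const Sample& p : cloud) {
        int i = (int)std::lround(p.s * (nx - 1)); // column from s
        int j = (int)std::lround(p.t * (ny - 1)); // row from t
        if (i >= 0 && i < nx && j >= 0 && j < ny)
            grid[j * nx + i] = p;                 // store deformed P
    }
    return grid;
}
```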
I think this type of technique can be very useful in many cases. Especially these days, we use ZBrush or Mudbox a lot to generate heavy displacement texture maps, and we pretty often need some kind of representation of the final geometry in the viewport. As long as the geometry has organized UVs (or st), we can generate polygon geometry in the viewport, not just a point cloud representation. Also, the polygon count doesn't change, so this kind of geometry can be exported with a point cache and used with animation in other applications such as Maya and 3ds Max.
I can also transfer other attributes such as velocity or peak area for the particle simulation, and I can check which direction the surface is moving: the red parts are moving down and the green parts are moving up.
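The red/green check above amounts to mapping the sign of the vertical velocity to a debug color; a minimal sketch (the function and struct names are illustrative):

```cpp
struct Color { float r, g, b; };

// Map vertical velocity to a debug color: downward motion shows red,
// upward motion shows green, and still areas stay black.
Color motionColor(float vy)
{
    if (vy < 0.0f) return Color{1.0f, 0.0f, 0.0f}; // moving down -> red
    if (vy > 0.0f) return Color{0.0f, 1.0f, 0.0f}; // moving up   -> green
    return Color{0.0f, 0.0f, 0.0f};
}
```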
Continued in Making Ocean #3