Point cloud tool

Here is the page for my point cloud display and extraction tool; it is also the production blog for my independent study with Prof. Malcolm Kesson.

I will update the blog whenever I make progress.


Reference: the point cloud display tool from Double Negative


Reading: Complete Maya Programming: A Guide to the Maya API

I read this book last quarter and got the basic ideas of the Maya API.

To do:

1. Install Boost:

Boost is a collection of C++ libraries that provide efficient implementations of many useful data types.

I interned at Digital Domain, and the Technical Directors there use Boost extensively.


Boost is downloaded. According to a friend at Digital Domain, if I just want to use some of the array function sets from Boost, I don't need to actually build it (those parts are header-only); I just need to specify the include path in the make files.

2. Figure out how the make file works.

I will work in a Linux environment, and the Double Negative point cloud tool shipped with a make file to build the plug-in.

I will try to figure out how it works and adjust it to fit my needs.


Followed the instructions, downloaded and installed CMake 2.8 on a machine at Montgomery Hall.

3. Compile the tool and modify it based on my needs.

4. Add point cloud extraction function sets.



Update I:

1. Installed Boost.

I downloaded Boost; the library I need doesn't have to be compiled, so I can put it anywhere on the hard drive.

Then I tried to link it into the C++ code. At the beginning this failed several times, because the actual code contains

#include <boost/shared_array.hpp>

After some try-outs I found that copying the boost library folder into the <maya_install_folder>/include folder is the easiest way to make this include work.

2. Installed CMake 2.8, so I can actually generate a make file.

Nothing special to mention.

3. Then comes the problem. The script originally asked for /usr/bin/c++ as the compiler, and for whatever reason the compiled plug-in produced an error message like:

undefined symbol :ZSt16__ostream_insertIcSt11char_traits_ZSt16__



According to some on-line info:

gcc is the GNU C compiler and g++ is the GNU C++ compiler, while cc and CC are the Sun C and C++ compilers also available on Sun workstations. Below are several examples that show how to use g++ to compile C++ programs, although much of the information applies to C programs as well as compiling with the other compilers.

I changed the compiler to g++, which almost worked.

However, after that, the plug-in loading error became:

// Error: /usr/autodesk/maya2009-x64/lib/libstdc++.so.6: version `GLIBCXX_3.4.9' not found (required by /usr/autodesk/maya2009-x64/bin/plug-ins/dnPtcViewerNode.so) (dnPtcViewerNode) //

On-line research shows that this problem comes from the compiled plug-in requiring a newer version of libstdc++.so.6 than the one bundled with Maya…




With an awesome hint from a Hair TD at Digital Domain, I figured out that with the Maya API I have to use an older version of g++.

The Ubuntu systems at SCAD currently use g++ 4.2.4, while Maya requires g++ 4.1.2 at most.

After installing g++ 4.1.2, the source code finally compiled correctly. Kind of a rough start, but I already feel happy about it.
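The resulting build setup boils down to the configure step. A sketch with hypothetical paths (the actual Boost checkout and g++ 4.1 locations will differ per machine), pointing CMake at the older compiler and adding the header-only Boost directory to the include path:

```shell
# Hypothetical paths -- adjust to the actual Boost checkout and g++ 4.1 install.
cmake -DCMAKE_CXX_COMPILER=/usr/bin/g++-4.1 \
      -DCMAKE_CXX_FLAGS="-I/path/to/boost_1_38_0" .
make
```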

Here is a screenshot of the node in use; 100% of the credit goes to the Double Negative folks:

ptc view

Display of a simple point cloud



The next step:

1. Add a simple filtering function set.

For efficiency, the original node has an option to display only a percentage of the points.

The funny thing is, when the percentage is set larger than 50% (for example, 60%), the display gets a little weird. I baked a torus point cloud, so in theory 60% should display the torus at 60% density, i.e. a proportionally less dense version. However, I got a result like this:

display weird

percentage weirdness

I looked at the source code and pretty much figured out why; I will try to fix this as soon as possible.


I did some debugging of the choose-points function; here are some results:
percentage: 50%
points going to load is :10451
points per step is :2

percentage: 60%
points going to load is :12541
points per step is :1

As the printout shows, choosing 60% of the total points to display means the node needs to load one point every

1.0/0.6 = 1.666667 steps.
However, it's impossible to skip 1.666667 points per step. Instead, the original code used the floor() function to turn the float into an integer, and unfortunately floor(1.666667) = 1, so the original function set chooses every point in the first 60% of the total points.

As a result, I got a logically "wrong" display result, which is shown above in the second image on this page.
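The bug can be reproduced with a minimal self-contained sketch (this is my own reconstruction, not the actual Double Negative code; chooseFloor() is a hypothetical name):

```cpp
#include <cmath>
#include <vector>

// Hypothetical reconstruction of the original skip logic: advance by
// floor(1/percentage) points per step until the quota is reached.
std::vector<int> chooseFloor(int total, double percentage)
{
    int quota = static_cast<int>(total * percentage);
    int step  = static_cast<int>(std::floor(1.0 / percentage)); // floor(1.666667) == 1
    std::vector<int> chosen;
    for (int i = 0; i < total && static_cast<int>(chosen.size()) < quota; i += step)
        chosen.push_back(i);
    return chosen;
}
```

With percentage = 0.6 the step floors to 1, so chooseFloor(100, 0.6) keeps indices 0 through 59: the first 60% of the file back-to-back, rather than an even 60% spread over the whole cloud, which matches the weird torus image above.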

First Implementation:

display correct

percentage correct

The image above shows the correct display of 60% of the point cloud.
Basically, what I did is:
1. If the percentage is a value like 50%, 25%, 10%, etc., keep the original algorithm.
In these cases the points per step will be 2, 4, 10 and so on, which won't cause any float-to-int conversion problem.
The actual condition check (using fmod(), since the % operator doesn't apply to floats in C++) is:

if (fmod(1.0, percentage) == 0.0)
    // use the original algorithm

2. In any other condition, use the new algorithm:

a. Generate a random float number between 0 and 1.

b. Compare it to the number of points still to choose divided by the total number of points.

c. If the random number is bigger, choose the current point, then decrement choose_num and total_num.

d. If the random number is smaller, do nothing and go to the next point.

Here is the core of the new algorithm:

float ling_rand_number = random_zto();

if ( ling_rand_number > (float)pt_choose/pt_all && pt_choose > 0 )
    if ( sp < numPts )
        // load the point


percentage: 60%
points going to load is :12541

and the actual number of points loaded is 11588.

It is not exactly 60% of the total, because a little accuracy has been sacrificed to the random number generation.

But "not fully accurate" is at least better than "completely wrong".. :)
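For reference, steps a–d can be sketched as a self-contained function (my own reconstruction, not the node's actual code; chooseRandom() and the seed argument are assumptions). Running it shows the same behavior described above: the loaded count falls short of the quota.

```cpp
#include <cstdlib>
#include <vector>

// Hypothetical reconstruction of the random selection (steps a-d above);
// pt_choose / pt_all follow the post, the rest is my own sketch.
std::vector<int> chooseRandom(int total, double percentage, unsigned seed)
{
    std::srand(seed);
    int pt_all    = total;                                  // points remaining
    int pt_choose = static_cast<int>(total * percentage);   // points left to choose
    std::vector<int> chosen;
    for (int i = 0; i < total; ++i) {
        float r = static_cast<float>(std::rand()) / RAND_MAX; // random in [0,1]
        if (r > static_cast<float>(pt_choose) / pt_all && pt_choose > 0) {
            chosen.push_back(i);   // keep the current point
            --pt_choose;
            --pt_all;
        }
        // otherwise: do nothing, go to the next point
    }
    return chosen;
}
```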



Fix of the Fix:

The inaccuracy caused by the random number generation haunted me for several days, and I felt a little uncomfortable about it.

After some try-outs, I figured out a better algorithm that fixes the random number generation problem.

The new algorithm is:

Loading, for example, 60% of the points means skipping 1.666667 points per step.

If the first step skips 1 point, then the next step needs to skip

(1.666667 − 1) + 1.666667 = 2.333333, so the next time skip 2 points;

after that, the third step needs to skip (2.333333 − 2) + 1.666667 = 2.0, so skip 2 points;

and so forth…
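The carry-the-remainder scheme above can be sketched as a self-contained function (my own reconstruction, not the node's actual code; choosePoints() is a hypothetical name):

```cpp
#include <vector>

// Keep the fractional part of the 1/percentage stride and carry it to the
// next step, so the kept points stay evenly spread over the whole cloud.
std::vector<int> choosePoints(int total, double percentage)
{
    std::vector<int> chosen;
    double step = 1.0 / percentage;  // e.g. 60% -> advance 1.666667 per kept point
    double next = 0.0;               // fractional index of the next point to keep
    for (int i = 0; i < total; ++i) {
        if (i >= next) {             // reached the next keep position
            chosen.push_back(i);
            next += step;            // the leftover fraction shifts the next keep
        }
    }
    return chosen;
}
```

choosePoints(100, 0.6) keeps exactly 60 points spread over the full index range, which is why the percentage display becomes exact.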

And here are the better results:

display correct

18.630 percentage correct


display correct

60 percentage correct

and an animation from 0% to 100%:

animated from 0% to 100%

With this algorithm, the percentage display results are 100% accurate.

And I am comfortable enough to go to the next step!



Next Step:

Will try to write an extraction tool based on a geometry's (cube or sphere) bounding box. Any points inside that bounding box will be written out separately as another .ptc file.


Two function sets added:

The first one gets the current selection in Maya.

It's really easy to do with a MEL command, but with the Maya API it is a little different.

At Digital Domain I wrote some similar function sets in Python for the Maya API.

Compared to Python, C++ needs more precisely defined data types.

In Python, getting the current selection is something like:

import maya.OpenMaya as om
currSelList = om.MSelectionList()
om.MGlobal.getActiveSelectionList(currSelList)

and in C++ the code becomes:

#include <maya/MGlobal.h>
#include <maya/MSelectionList.h>

MSelectionList m_currSel;
MGlobal::getActiveSelectionList(m_currSel);

The function set to get the bounding box of a DAG node is a little more complicated; I have omitted that code here.

In order to test the function sets, I just added them to the node's ::compute() function, so every time the node reloads/redraws, the bounding box function is invoked as well.

I get the point cloud viewer's bounding box info:

the selection bbox’s min.x is : -1.20311

display correct

custom attr added

Then I added my custom attribute to the node and connected it to the AE template.

The cool, or kind of funny, thing about AE templates is that if your node is named "myNode", then the AE template has to be named "AEmyNodeTemplate".

When a "myNode" is created in the scene, the AE template will automatically attach to the node, as long as the names match.
After I got the UI/AE template updated, I changed the compute() function so that, if the string in the text field changes, the extraction box is re-computed based on the typed-in geometry.

For example, I typed in "pCube1" (a cube already created in the test scene), and got this printed result in the terminal:
the pCube1 bbox’s min.x is : -0.511703

Updates soon




Added some custom node attributes:

a. A custom bounding box attribute.

b. A custom hidden attribute (let's say "aLingChange") to be affected by all my custom attributes.

c. Two numeric attributes to display the bounding box's min and max values.

The concept is pretty simple: if the affecting geometry's name changes, "aLingChange" is marked as affected, and then the ::compute() function is called, causing the bounding box to be re-calculated and the min/max attributes to be updated with the newly computed bounding box info.

However, the affects relationship doesn't seem to work on my custom attributes.

I went through a lot of on-line docs and posted questions on the Python inside Maya Google group, and I tried all the suggested solutions, but none of them worked.

Still, I learned a lot more about the Maya API in the process, and there is a really interesting method: MPxNode::setDependentsDirty().

Normally, in a custom node's initialize() function, the user can only set affects relationships between two non-dynamic attributes.

But by overriding MPxNode::setDependentsDirty(), we can set relationships between dynamic and non-dynamic, or dynamic and dynamic, attributes of a node. Really interesting.

Anyway, the problem consumed me for about 5 days and the affected attribute still doesn't really work, so for now I make the node's compute() function run whenever any input attribute changes.

It works for my purposes for now; I will try to figure it out later if I have time.

And here is the UI and results display for now:

the "pCube1" bounding box is connected to the node's "extractBBox" attr:

display correct

custom attrs updates




I got the affecting attributes working.

For whatever mysterious reason, the Linux machine I was using was not happy with my custom node; it is probably running a really old system image.

On any other machine in Montgomery Hall, the node's affecting attributes work like a charm.

So the problem is fixed, for no specific reason.

After that, I added two custom attributes:

a. A write-out path for the extracted point cloud file.

b. A button to execute the extraction process.

To get these displays and buttons attached to the node, I had to adjust the AE template .mel script quite a lot. Fortunately, the source code from Double Negative provided a really good reference and starting point.

And here is the updated UI:

display correct

custom attrs updates2




The next step is the write-out function set:

Compared to the problems I encountered during the UI/attribute adjustment process, the actual write-out function set is fairly easy.

I tweaked the loadptc() function so that, instead of storing all the required data in arrays, it writes out the ptc point data.

An interesting problem came from closing the write-out point cloud file: I was closing it with the call that is designed to close a read-in point cloud file.

The result of closing a write-out .ptc file that way is that the last several thousand points are not written out.

For example, if the user wants to write a .ptc file containing 42,345 points, only 40,000 points will actually be written. Really interesting.

Then I found out I am supposed to use the dedicated call for closing a write-out point cloud file, and the results have been correct since then.

Here is the extraction result:


extraction result 01

It is a result at 50% density of the original points, extracted by the cube.

I am happy with the result for now.

The Next step:

Add a user control to specify whether to use the extraction box or not, because sometimes the user just needs a less dense version of the point cloud…

Updates soon




Added two condition checks:

a. If the read-in point cloud file and the write-out point cloud file have the same path, don't start the write-out process.

b. Added a useExtractBox custom attribute, connected as a boolean input:

if the user specifies not to use the extraction bounding box, the write-out process writes the .ptc file exactly as it is displayed;

and if "use extract box" is set to true, it writes out the .ptc file based on both the displayed points and the extraction bbox.
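The two checks above can be sketched as follows (the function and parameter names here are my own, not the node's actual code):

```cpp
#include <string>

// Axis-aligned containment test for a single point.
bool pointInBox(const double p[3], const double bmin[3], const double bmax[3])
{
    for (int a = 0; a < 3; ++a)
        if (p[a] < bmin[a] || p[a] > bmax[a]) return false;
    return true;
}

// Decide whether a displayed point should be written out:
// check (a): never overwrite the read-in file;
// check (b): only apply the extraction box when the user asks for it.
bool shouldWritePoint(const std::string &inPath, const std::string &outPath,
                      bool useExtractBox, const double p[3],
                      const double bmin[3], const double bmax[3])
{
    if (inPath == outPath) return false;   // same path: abort the write-out
    if (!useExtractBox)    return true;    // write the cloud as displayed
    return pointInBox(p, bmin, bmax);      // otherwise filter by the bbox
}
```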

Here is the added attribute:


optional extraction



At SIGGRAPH 2009, Mr. Dale Mayeda gave a presentation about the interactive lighting of effects using point clouds in "Bolt".

I was fascinated by that and wanted to try something similar.

Here are the steps I worked on over the last week:


houdini sim

1. Sim in Houdini:

a. I did a test sim with Houdini fluids.

b. I couldn't figure out a way to bake a point cloud out of the fluid sim directly,
but fortunately Houdini is powerful enough to bake lots of info out in its own formats.
So I baked out the velocity info and used it to drive a particle object.
And since Houdini can render particles as surfaces, it is possible to bake out a point cloud
from the moving particle points.

2. Baked out Cs, _radiosity, and _area info from the particles.

a. By default the particle Cs (color of surface), which in Houdini is Cd, is white.
So I figured out a way to convert the fluid sim's "fuel" info into the Cd of the particles:
I baked out the "fuel" info and, with the help of a VOP POP in Houdini, basically controlled the Cd
with the fuel emission.



3. Baked out a PRMan point cloud based on the particles.

As for PRMan, to bake out the Cs, _radiosity, and _area info, AOVs needed to be set up in Houdini:

a. Set up the AOVs in the RiAov tab.

b. Added the custom shader paths (for baking out point clouds) to Houdini's shader search path:


houdini prman

Here is a screenshot of the particles in Houdini next to the baked point cloud displayed in Maya
with the display tool:


results compare

And here is the baked ptc sequence displayed in Maya. Points with a dark color indicate less fuel, and they won't contribute much to the point-based lighting.

playblast of baked ptc files



The next step is rendering with the .ptc file.

I wrote a simple light shader that reads the .ptc file through the indirectdiffuse() function, and here is the first test result in Maya:


light shader in Maya

After that, as the test result shows, the problem is that in areas with lots of points, the nearby surfaces get lit up really unnaturally. Fortunately, the indirectdiffuse() function has some options to deal with this kind of situation.

I don't know if I can paste the Render Docs here, but the options:




are making the result better.

Here is a test result, with a point cloud sequence:

render of ptc based lighting