[Sidefx-houdini-list] deep primid aov?
matt.estela at gmail.com
Tue Aug 12 22:38:40 EDT 2014
Outputting primid as a deep aov appears to get filtered into nonsensical
values; is that expected?
Say we had a complex spindly object like this motorcycle sculpture.
Comp would like control over grading each sub-object of this bike, but
outputting each part (wheel, engine, seat etc) as a separate pass is too
much work; even outputting rgb mattes would mean at least 7 or 8 AOVs.
Add to that the problem of the wires being thinner than a pixel, so
standard rgb mattes get filtered down by opacity; not ideal.
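To put numbers on it, here's a toy Python sketch of roughly how I assume coverage knocks a matte down (my simplification, not mantra's actual filter):

```python
# Toy sketch: a "solid" matte value of 1.0 on a wire that only covers a
# fraction of the pixel arrives in comp scaled by that coverage.

def filtered_matte(coverage, opacity):
    # Assumed model: final matte ~ matte_value * pixel coverage * opacity.
    return 1.0 * coverage * opacity

# A wire covering 10% of the pixel, fully opaque:
print(filtered_matte(0.1, 1.0))  # 0.1 -- hard to key cleanly in comp
```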
Each part is a single curve, so in theory we'd output the primitive id as a
deep aov. Hmmm....
Tested this: created a few poly grids, made a shader that passes
getprimid -> parameter, wrote that out as an aov, and enabled deep camera
map output as an exr.
In Nuke, I can read the deep aov and use a DeepSample node to query the
values. To my surprise the primid isn't clean; in its default state there
are multiple samples, the topmost sample is correct (eg, 5), but the values
behind are nonsense fractions (3.2, 1.2, 0.7, 0.1 etc).
If I change the main sample filter on the ROP to 'closest surface', I get a
single sample per pixel, which makes more sense, and sampling in the middle
of the grids gives correct values. But at anti-aliased edges the values are
still fractional.
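To illustrate what I suspect is happening at those edges, a toy Python sketch (a plain weighted average, purely my assumption, not mantra's actual pixel filter):

```python
# Toy illustration: averaging discrete primitive ids under a pixel
# filter produces values that are neither id.

def filter_ids(samples):
    """samples: list of (primid, weight) subpixel samples."""
    total = sum(w for _pid, w in samples)
    return sum(pid * w for pid, w in samples) / total

# Interior of a grid: every subsample hits prim 5 -> clean value.
interior = [(5, 1), (5, 1), (5, 1), (5, 1)]
print(filter_ids(interior))  # 5.0

# Anti-aliased edge: 3 of 4 subsamples hit prim 5, one hits prim 3.
edge = [(5, 3), (3, 1)]
print(filter_ids(edge))      # 4.5 -- a fraction belonging to neither prim
```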
What am I missing? My naive understanding of deep is that it stores the
samples prior to filtering; as such, the values a deep sample picker returns
should be correct primids, not filtered down by opacity or anti-aliasing.
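In other words, my mental model of a deep pixel is something like this toy sketch (a hypothetical layout for illustration, not the actual EXR structure):

```python
# Assumed mental model: each deep sample keeps its own depth, alpha and
# an unfiltered, integer primid.
deep_pixel = [
    {"z": 1.0, "alpha": 0.4, "primid": 5},  # thin wire in front
    {"z": 2.5, "alpha": 1.0, "primid": 3},  # surface behind it
]

# Picking the frontmost sample should give back a clean id.
front = min(deep_pixel, key=lambda s: s["z"])
print(front["primid"])  # 5
```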
Stranger still, if I use a DeepToPoints, the point cloud looks correct, but
I'm not sure I trust the way it visualises the samples.
Anyone tried this? I read an article recently where Weta were talking about
using deep ids to isolate bits of chimps; seems like a useful thing that we
should be able to do in mantra.