16 posts tagged “houdini”

Houdini to Redshift: Keeping Colors Sharp

In Houdini, I usually assign color to primitives (though Houdini defaults to assigning it to “points”). However, if you want Redshift to recognize color attributes (using RSUserDataColor), you need to promote the Cd attribute to points or vertices, as Redshift doesn’t interpret it directly on polygons.

Promoting Cd to points will result in color blending when you subdivide the model, which can create blurred colors. To maintain sharp color boundaries, promote Cd to vertices instead, as Redshift can understand vertex-level color attributes clearly.
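A minimal vertex-wrangle sketch of that promotion (an Attribute Promote SOP set to Original Class: Primitive, New Class: Vertex does the same thing without code):

// Run over vertices: copy the owning primitive's Cd onto each vertex,
// so RSUserDataColor reads it without blending across shared points.
v@Cd = prim(0, "Cd", @primnum);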

3dmodeling   houdini   redshift

Baking textures with Redshift inside Houdini

I had to bake some texture maps with Redshift inside Houdini. I haven’t seen any clear tutorials on how to do that, so here is a short guide.
Let’s say I created a complex material mixing different textures, adjusting them with color corrections, gradients and noises. I’m happy with how it looks in the Redshift renderer, and I want to pass the geometry to 3ds Max and set up the materials with Corona.
The general idea is that you need to create custom AOVs to bake all those textures. Documentation about custom AOVs:
https://help.maxon.net/r3d/houdini/en-us/#html/Custom+AOVs.html

Here is what a test material network looks like:

Just for visual reference, I add black nulls so I know which maps I want to bake, and connect those nulls to the red nodes (StoreColorToAOV or StoreIntegerToAOV). Sadly, you can’t use `opinput(".", 0)` to get the name of the connected node in a MAT context like you can in SOPs, so you’ll need to copy-paste the names from the nulls.

Create a separate Redshift render node. In the RenderMaps tab, enable render maps baking.
If during testing you want to switch quickly between texture resolutions (from 512×512 px to 1024, 2048, 4096), add an integer slider with a range from 0 to 3 to the interface of the RenderMaps tab (I called it indra_res_mult) and in the Output Resolution add:

512*pow(2, ch("indra_res_mult"))
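With the slider at 0, 1, 2 or 3, this evaluates to 512, 1024, 2048 or 4096 respectively.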

If it is a black-and-white texture like roughness or a mask, use the scalar data type; in that case it will be saved as an 8-bit greyscale image. These maps need to be stored with gamma 1, but if you render them from Redshift as PNG they will be saved with gamma 2.2, so if you want gamma 1 you need to save them as TIF. For channels like base color, use PNG.

Things to remember:

  1. No tessellation at the OBJ level. It took me more than an hour to figure out why my maps looked strange, and it was just this one checkbox.
  2. No overlapping UVs.
  3. Faces have to be coplanar. I personally didn’t have a problem with this, but in this video for C4D it is recommended to triangulate.

Tips:

  1. If you have assigned groups to geometry and want to use them inside Redshift materials, you need to promote them to vertices at the SOP level and switch on “Output as Integer Attribute”. You don’t need to create a node for each group; just use GRP_* in the group name field (I start the names of all groups that I want to keep with GRP_). Then read them inside materials with an “RS Integer User Data” node. Just writing “GRP_group_name” or “group:GRP_group_name” will not work; that’s why we need to convert the group to an integer attribute. (A minimal VEX sketch of this SOP-level step follows these tips.)
  2. To write those masks you need to use “RS Store Integer to AOV”; StoreColorToAOV will not work.
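As mentioned in tip 1, here is a vertex-wrangle sketch of converting a group to an integer attribute, assuming a point group with the hypothetical name GRP_metal (the Attribute Promote SOP with “Output as Integer Attribute” does the same without code):

// Run over vertices: write 1 where the underlying point is in GRP_metal,
// 0 elsewhere, so an RS Integer User Data node can read it as "GRP_metal".
i@GRP_metal = inpointgroup(0, "GRP_metal", @ptnum);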
houdini   redshift

Flexible Color Assignment in Houdini and Redshift

How do you assign random colors from a specific set to objects and keep the setup flexible for changes with Houdini and Redshift?
Let’s say we have several plastic cups in a scene and 4 specific colors from a client.

  1. Create a class attribute with a connectivity node (you can name it whatever you like).
  2. Promote it to the vertex level with an attribute promote.
  3. In the shader tree, use an RS Integer User Data node to bring in the attribute named “class” (or any other name that you gave it earlier).
  4. Connect it to an RS Jitter node (name it “max_variations_01”) and select “User Data ID” in Input ID Mode. In “integer jitter,” set the min to 0 and the max to 3, so we will have 4 variations. With this node, we only control the number of variations.
  5. Create another RS Jitter node (name it “lightness_range_01”). We will use it to create lightness variations. Keep the color black and set Saturation Variation Max to 0. Now, with the Value Seed, you can control the randomness.
  6. Create an RS Color Ramp (name it “recolor_01”) with 4 colors from your client and set the interpolation to constant.
  7. After adjusting the seed on the “lightness_range_01” node, you will need to move the colors a little bit on “recolor_01” so each of them will end up in a range generated by “lightness_range_01.”

Another thing you can do is offset the UVs for each object, so that when you add textures for roughness they will not repeat obviously. To do this, after the connectivity node, add an Attribute Wrangle node (Run Over: Vertices) with this:

// shift each connected piece's UVs sideways by a per-piece random amount
@uv.x += rand(@class);

Before:

After:

houdini   redshift

Search and replace paths in Houdini

If you want to search and replace paths in multiple locations, open the Windows -> Hscript Textport window

and write:

opchange \$DOWNLOADS"/wetransfer" \$JOB"/geo"

The backslash before a variable keeps it as a variable ($JOB and $DOWNLOADS in this case) instead of expanding it to a full path.

Another case: you imported an FBX with materials. You can move them to the mat context and change the paths with this:

opchange ../../materials /mat

Another example: on a Windows machine the textures for some reason use absolute paths, and I want to change them all to the root of the job project:

opchange "C:/Users/user_name/Dropbox/Work/project_name" \$JOB

Documentation:
https://www.sidefx.com/docs/houdini/commands/opchange.html

2023   houdini

Houdini – random coloring from image palette.

I was trying to optimize my coloring process for a project, and here is where I am right now:

Coloring process:

  1. Get palette that I want as a screenshot from here:
    https://paletton.com/#uid=60B0u0kllzcboPZgUH4pEuxt-pp
  2. Convert the image to Utility-Texture_sRGB with the target color space ACEScg using the PYCO ColorSpace converter. (I still need to do some more tests on this part by using these .exr files as an emission texture to compare colors with the reference.)
    https://pyco.gumroad.com/l/pycocs
    Free with the code free at checkout.
  3. From GitHub you can install Color Palette Ramp, a Houdini HDA that creates a ramp based on a color palette from an image.
    https://github.com/jamesrobinsonvfx/colorpaletteramp
  4. In Houdini, use that HDA (colorpaletteramp) at the SOP level to create a ramp. If I got the image from the Paletton webpage, I use Stops -> 20, but something around 10 works great for other images.
  5. With OD Tools you can right-click on the result and use “Palletize Ramp [OD]” to make the color separation constant, so it looks more like a palette instead of a gradient. You can get OD Houdini Shelf Tools 2021 for $100 here:
    https://origamidigital.com/cart/index.php?route=product/product&manufacturer_id=11&product_id=66
  6. You can save this ramp in your OD asset library for future use.
  7. To color geometry based on disconnected pieces: first use a “Connectivity” node on points to create an integer attribute called id. Then use an “Attribute Adjust Color” node with Adjustment Value -> Pattern Type set to Random, Randomization By -> Custom Attribute, and Custom Attribute -> id. Then, by changing the seed parameter, you can get random color combinations. (A minimal VEX sketch of this step follows the list.)
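As referenced in step 7, here is a rough point-wrangle equivalent of that Attribute Adjust Color setup, assuming the Connectivity node has already written the id attribute and a color ramp parameter named "palette" has been added to the wrangle’s interface:

// Run over points: pick a random position on the palette ramp per connected piece.
float u = rand(i@id + chi("seed"));
v@Cd = chramp("palette", u);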

Results from 3 different ramps:

2022   3dmodeling   houdini

Hunting for likes

A small personal project that I made in Houdini while testing the Axiom GPU solver. The idea is that we are always hunting for likes and “hearts”, setting traps with hot topics.
Looped video with music:

Work in progress:

Stills:

2022   animation   houdini

Shaman – Houdini vs Blender

I had wanted to try Blender for a long time, and came across a series of tutorials from the YouTube channel Blender 3d. After watching them, it became clear why so many people love this free software.
I started in Blender, but then jumped back into Houdini. With the plugin called Modeler, you can repeat the steps without problems.
Here is a “turntable” and then a sped-up walkthrough:

I did the UVs in RizomUV. They have just released an update, and now you can include one group inside another. For example, a “feathers” group can be included in the “head” group and packed together. One of my favorite tricks: you can pack the islands using their direction in 3D space. Want everything aligned along Y in UV space? Just a click of a button. By the way, groups made in Houdini are visible in Rizom.

After UVing, I imported some groups into Zbrush to add details.
I baked from high to low in Marmoset. It also understands groups from Houdini, and therefore it is not necessary to export the “exploded” mesh separately, as it is usually done for baking in Substance. Another nice thing about it is the auto-reload of textures and geometry. If you change something in another program and save, Marmoset automatically shows those changes.
I textured in Substance Painter. Then I rendered in Houdini with Redshift.

To make the cartoon outline, I cloned the geometry and assigned a double-sided material to it: the “front” is transparent, and the “back” has only a black emission material assigned to it. Then you add displacement with a constant instead of a texture, and that’s it. You can control the thickness of the line with the amount of displacement, and the color of the line with the emission (yellow in this example):

Then I repeated the same trick in Marmoset. It works when rendering, but displacement is not supported in the “viewer”. So if you want to send a link to the client so they can rotate the model in the browser, you need another approach:
I exported additional geometry from Houdini, but with reversed normals and slightly inflated with a “Peak” node. Then in Marmoset I assigned a new dark material without reflections and set the Diffusion module to Unlit.
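A minimal point-wrangle sketch of that inflation step (what the Peak SOP does), assuming the geometry has point normals (@N) and a thickness parameter added to the wrangle; the normal reversal itself is a separate Reverse SOP:

// Run over points: push each point outwards along its normal
// to create the slightly larger outline shell.
@P += normalize(@N) * chf("thickness");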

Here is the result that you can rotate:

And a couple more renders from Redshift:

The original concept drawing was made by the amazing La Foret Oublie.
And here is a great article on shading in Marmoset.

2021   3dmodeling   blender   houdini   panama   redshift

What language to use in Panama.

Sometimes I feel like writing or talking on social media. But deciding which language to use is tricky. Russian guys don’t know Spanish. Many Panamanians do not speak English fluently and certainly do not know Russian.
Everything I watch and read is in English. But I rarely speak it. All my notes are also in English.
After ten years in the tropics, I have begun to forget some Russian words. Whether it is worth relearning them, I do not know; better to learn to speak correct Spanish. I often speak like a dock worker, with zero benevolence. My partner suggested that at the beginning of conversations with new people I mention that I am Russian, and that frankness is not considered rudeness in our country.

In the video suggested by YouTube, a guy makes a dialog icon in five minutes in Sketch:

I liked the thumbnail and I made it in 3D.
At the same time, I practiced stitching high-dynamic-range panoramas. They are used for lighting most scenes in 3D. Just before the start of the quarantine, I took a few photos in the office. Unfortunately, the only program that stitches them well (PTGui 12) costs $300. The demo works without restrictions, but it fills everything with watermarks. In Photoshop, however, they can be erased even in 32 bits. I guess that’s OK for just a fun project.
I also figured out a little bit more about the differences between point and primitive normals in Houdini.

Sometimes I need to get vector graphics from Illustrator and extrude them in Houdini.

A constant problem is that some parts can be flipped and will extrude in a different direction. This happens because the paths were drawn in different directions in Illustrator.

So I thought I could add an Attribute Expression node, set it to points, set the attribute to Normal, set the VEXpression dropdown to Constant Value and write 0, 1, 0 in Constant Value to get normals pointing up. But this will not change the primitive normals, because primitive normals are not actually an attribute: they are derived information calculated from the vertices that make up the primitive, and as such they cannot be modified. You can still use PolyExtrude, set it to Point Normal and the extrusion mode to Existing, but you will end up with geometry where some primitive normals point “out” and others point “in”. I don’t know if there is an easy fix for that.
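For the point normals, the wrangle equivalent of that Attribute Expression setup is a single line (run over points):

// force point normals to point straight up; primitive normals stay derived from vertex order
@N = {0, 1, 0};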

So after bringing your paths in from Illustrator, you first need to separate the primitives that are flipped. You can do this with a simple Group node: use only “Keep by Normals”, set the direction to 0, -1, 0 and lower the spread angle. There is also a Labs Split Primitives by Normal node that does exactly this with fewer clicks.
Then use a Reverse node. It WILL reverse the vertex order in the primitives.
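A rough primitive-wrangle equivalent of that group-by-normal step, assuming a spread_angle parameter (in degrees) on the wrangle; the Reverse SOP is still a separate node afterwards:

// Run over primitives: put faces whose normal points roughly down (0, -1, 0)
// into a "flipped" group so they can be reversed.
vector n = normalize(prim_normal(0, @primnum, 0.5, 0.5));
if (dot(n, {0, -1, 0}) > cos(radians(chf("spread_angle"))))
    setprimgroup(0, "flipped", @primnum, 1);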

Also, takes are amazing. You can create different versions of a scene and, in the render node, save an image with the take name like this:
$HIP/render/r01/bubbles.static.s5.`chsop("take")`.$F2.tif
The `chsop("take")` part is responsible for the take name, and in my case the output names will be:
bubbles.static.s5.blue_bubbles.01.tif
bubbles.static.s5.orange_bubbles.01.tif

2021   3dmodeling   houdini   redshift