Auto-texturing ComfyUI Workflow (2023)

I came up with a workflow for auto-texturing a UV'd 3D asset by tricking Stable Diffusion into thinking it's coloring a line drawing. The color and shape language is partly formed by "breeding" the palettes/styles of up to 5 reference images, which steer the final output.
This use case was fairly specific to my game projects, which required abstract patterns that follow either the cells or the overall flow of the mesh topology.
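
The post doesn't spell out how the 5 reference images are combined, so here is a minimal sketch of one plausible "breeding" step under stated assumptions: take a weighted average of the CLIP vision embeddings of the reference images, the way IPAdapter-style conditioning is commonly mixed. The file names, weights, and choice of CLIP model below are illustrative assumptions, not the workflow's actual wiring, which lives in ComfyUI nodes rather than raw Python.

```python
# Hypothetical sketch of the "breeding" step: blend the CLIP vision
# embeddings of up to 5 reference images with per-image weights.
# All paths and weights are placeholders, not values from the workflow.
import torch
from PIL import Image
from transformers import CLIPImageProcessor, CLIPVisionModelWithProjection

model = CLIPVisionModelWithProjection.from_pretrained("openai/clip-vit-large-patch14")
processor = CLIPImageProcessor.from_pretrained("openai/clip-vit-large-patch14")

paths = ["style1.png", "style2.png", "style3.png", "style4.png", "style5.png"]
weights = torch.tensor([0.3, 0.25, 0.2, 0.15, 0.1])  # influence of each "parent" image

images = [Image.open(p).convert("RGB") for p in paths]
inputs = processor(images=images, return_tensors="pt")
with torch.no_grad():
    embeds = model(**inputs).image_embeds          # (5, 768) projected embeddings

blended = (weights[:, None] * embeds).sum(dim=0)   # weighted "child" embedding
blended = blended / blended.norm()                 # renormalize to unit length
```

The blended vector would then condition generation in place of a single image embedding, so the output inherits traits from all 5 parents in proportion to their weights.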

ComfyUI workflow. The 5 images on the left influence the palette/style of the final output. The black image at the bottom is the UV map. The yellow box near the right is the KSampler, the "heart" of the system that actually does the generation.

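For a sense of how such a graph is driven programmatically, the sketch below submits a trimmed-down version of the workflow to ComfyUI's HTTP API in its API (JSON) format. Node IDs, checkpoint/ControlNet filenames, and prompts are placeholder assumptions, and the five style-image nodes are omitted; only the LoadImage → ControlNetApply → KSampler spine from the caption above is shown.

```python
# Hypothetical, trimmed-down version of the graph in ComfyUI's API format.
# Model/ControlNet filenames and prompts are placeholders; the real graph
# also wires in the five style-image nodes, omitted here for brevity.
import json
import urllib.request

workflow = {
    "1": {"class_type": "CheckpointLoaderSimple",
          "inputs": {"ckpt_name": "sd15.safetensors"}},           # placeholder checkpoint
    "2": {"class_type": "CLIPTextEncode",
          "inputs": {"text": "abstract painted texture", "clip": ["1", 1]}},
    "3": {"class_type": "CLIPTextEncode",
          "inputs": {"text": "blurry, text", "clip": ["1", 1]}},
    "4": {"class_type": "LoadImage",
          "inputs": {"image": "uv_wireframe.png"}},               # the black UV-map render
    "5": {"class_type": "ControlNetLoader",
          "inputs": {"control_net_name": "control_sd15_lineart.pth"}},  # placeholder model
    "6": {"class_type": "ControlNetApply",                        # "color the line drawing" step
          "inputs": {"conditioning": ["2", 0], "control_net": ["5", 0],
                     "image": ["4", 0], "strength": 1.0}},
    "7": {"class_type": "EmptyLatentImage",
          "inputs": {"width": 1024, "height": 1024, "batch_size": 1}},
    "8": {"class_type": "KSampler",                               # the "heart" of the system
          "inputs": {"seed": 42, "steps": 25, "cfg": 7.5,
                     "sampler_name": "euler", "scheduler": "normal",
                     "denoise": 1.0, "model": ["1", 0],
                     "positive": ["6", 0], "negative": ["3", 0],
                     "latent_image": ["7", 0]}},
    "9": {"class_type": "VAEDecode",
          "inputs": {"samples": ["8", 0], "vae": ["1", 2]}},
    "10": {"class_type": "SaveImage",
           "inputs": {"images": ["9", 0], "filename_prefix": "texture"}},
}

# Queue the graph on a locally running ComfyUI instance (default port 8188).
req = urllib.request.Request(
    "http://127.0.0.1:8188/prompt",
    data=json.dumps({"prompt": workflow}).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
urllib.request.urlopen(req)
```

Because the UV wireframe enters through a ControlNet rather than img2img, the sampler treats it as line art to be colored, which is what lets the generated pattern lock onto the mesh topology.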

Output object mapped.

Closer view of before+after.

Example of a test race track generated with this system, showing that the output doesn't have to slavishly follow the pattern of the underlying mesh.
