🍎 AI-generated materials
Technology · 🎮 Game development
Kamil Szczerba

In the video game industry, making game assets is a major time investment, sometimes larger than the coding itself.

The latest renderers use a realistic model called physically based rendering (PBR). For each material, the artist must specify values such as its colour, metalness, and roughness.

The left-hand sphere is smooth, the right-hand one is rough; they have the same colour.

Finding the correct values by hand requires a lot of trial and error. Here is an apple with random PBR values:

After some tweaking to find the correct values:

More natural-looking, is it not? Unfortunately, while there are free sources of PBR materials, one must be ready to invest time or money for precise materials. Let us see how we can automate that. Our plan:

  1. Differentiable rendering
    Render a scene with an initialised material
  2. Generative AI
    Replace the material in the render with the one we want
  3. Inverse rendering
    Get the PBR values
Differentiable rendering

Renderers transform a 3D scene into a 2D picture; mathematically, they are a function. Through automatic differentiation, we can derive a new function that tells us how the output changes given a change in the input. More relevantly for us, we can run this the other way: starting from a particular picture, recover the scene parameters that produced it, including the PBR values.
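To make the idea concrete, here is a toy sketch (not a real renderer): a "renderer" that maps a single material parameter to a pixel intensity, and gradient descent that recovers the parameter from a target pixel. The shading formula is made up for illustration; real systems differentiate the full rendering function automatically.

```python
def render(roughness):
    # Hypothetical shading model: rougher surfaces reflect less light.
    return 1.0 / (1.0 + roughness)

def d_render(roughness, eps=1e-6):
    # Derivative via central finite differences.
    # (A real differentiable renderer uses automatic differentiation instead.)
    return (render(roughness + eps) - render(roughness - eps)) / (2 * eps)

target = render(0.3)   # the "picture" produced by the true parameter
guess = 0.9            # our initial parameter

for _ in range(200):
    # Gradient of the squared-error loss (render(guess) - target)^2
    loss_grad = 2 * (render(guess) - target) * d_render(guess)
    guess -= 0.5 * loss_grad   # gradient descent step

print(round(guess, 3))  # → 0.3
```

The same loop, scaled up to millions of pixels and many parameters, is what inverse rendering does in the last step of the plan.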

For our scene, I put two spheres in a box: one made of gold, the other made of a generic material. Here is the render made with ☘️Mitsuba:

Spheres of gold and generic materials

If you have a scene in their ☘️well-documented XML format, rendering is simple:
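A minimal sketch with the Mitsuba 3 Python API (the scene file name is an assumption):

```python
import mitsuba as mi

mi.set_variant("scalar_rgb")       # CPU variant; AD-enabled variants exist for GPU

scene = mi.load_file("scene.xml")  # hypothetical path to the XML scene description
image = mi.render(scene, spp=128)  # spp = samples per pixel
mi.util.write_bitmap("render.png", image)
```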

Generative AI

The next step is to replace the generic material with the one we want (gold in our case study). Stable Diffusion is quite good at generating realistic materials. ControlNet, an extension, helps SD keep the 3D shapes. I masked the generic sphere and asked SD to replace it with gold:

ComfyUI workflow
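The workflow above runs in ComfyUI; as a rough code equivalent, the diffusers library's inpainting pipeline could look like this (model and file names are assumptions, and the ControlNet conditioning is omitted for brevity):

```python
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
).to("cuda")

image = Image.open("render.png").convert("RGB")       # the Mitsuba render
mask = Image.open("sphere_mask.png").convert("RGB")   # white = area to repaint

result = pipe(prompt="a polished gold sphere",
              image=image, mask_image=mask).images[0]
result.save("gold_sphere.png")
```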

Here is the result:

Stable Diffusion inpainting

The lighting is far from correct, but the material itself is recognisable: gold!

Inverse rendering

The initial scene had the following parameters:

eta and alpha control, respectively, the colour and the roughness of the material.
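In Mitsuba's XML, such a material can be declared as a rough conductor; the values below are illustrative, not the ones from the scene:

```xml
<bsdf type="roughconductor">
    <!-- eta and k: complex index of refraction, controls the colour -->
    <rgb name="eta" value="0.2, 0.9, 1.4"/>
    <rgb name="k" value="3.9, 2.5, 2.1"/>
    <!-- alpha: surface roughness -->
    <float name="alpha" value="0.1"/>
</bsdf>
```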

Using gradient descent (similar to the ☘️Mitsuba inverse-rendering tutorial), we get the following values:
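A sketch of the optimisation loop in the spirit of that tutorial (the parameter key and file names are assumptions; print `params` to see the keys of your own scene):

```python
import drjit as dr
import mitsuba as mi

mi.set_variant("cuda_ad_rgb")           # an autodiff-enabled variant is required

scene = mi.load_file("scene.xml")
ref = mi.TensorXf(mi.Bitmap("gold_sphere.png"))   # the Stable Diffusion output

params = mi.traverse(scene)
key = "sphere.bsdf.eta.value"           # hypothetical key, depends on the scene
opt = mi.ad.Adam(lr=0.02)
opt[key] = params[key]
params.update(opt)

for it in range(100):
    img = mi.render(scene, params=params, spp=16)
    loss = dr.mean(dr.sqr(img - ref))   # L2 distance to the target image
    dr.backward(loss)                   # gradients w.r.t. the PBR parameters
    opt.step()
    params.update(opt)
```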

Here is a render using these values, with the same roughness as the left-hand ball:

Inverse rendering

Quite credible! For comparison, here is a physically-correct gold material:

Real gold

A bit greener than expected, is it not? Rather than physical realism, our approach favours psychological realism: Stable Diffusion is influenced by the “idea” of gold used by artists.