Gaussian Material Synthesis

Károly Zsolnai-Fehér, Peter Wonka, Michael Wimmer
ACM Transactions on Graphics (SIGGRAPH 2018), 37(4):76:1-76:14, August 2018. [paper] [supplementary] [video]

Abstract

We present a learning-based system for rapid mass-scale material synthesis that is useful for novice and expert users alike. The user preferences are learned via Gaussian Process Regression and can be easily sampled for new recommendations. Typically, each recommendation takes 40-60 seconds to render with global illumination, which makes this process impracticable for real-world workflows. Our neural network eliminates this bottleneck by providing high-quality image predictions in real time, after which it is possible to pick the desired materials from a gallery and assign them to a scene in an intuitive manner. Workflow timings against Disney’s “principled” shader reveal that our system scales well with the number of sought materials, thus empowering even novice users to generate hundreds of high-quality material models without any expertise in material modeling. Similarly, expert users experience a significant decrease in the total modeling time when populating a scene with materials. Furthermore, our proposed solution also offers controllable recommendations and a novel latent space variant generation step to enable the real-time fine-tuning of materials without requiring any domain expertise.
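The recommendation loop the abstract describes — learn user preferences with Gaussian Process Regression, then sample new candidates and rank them — can be sketched with scikit-learn. This is a minimal illustration only: the parameter dimensionality, the synthetic scoring function, and all variable names below are assumptions, not details from the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

# Hypothetical shader-parameter vectors the user has already rated,
# with preference scores in [0, 10]. Five parameters is an assumption.
rated_params = rng.uniform(0.0, 1.0, size=(30, 5))
scores = 10.0 * np.exp(-np.sum((rated_params - 0.5) ** 2, axis=1))

# Fit a GP regressor to the user's ratings.
gpr = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                               normalize_y=True).fit(rated_params, scores)

# Sample fresh candidate materials and recommend the ones the GP
# predicts the user will score highest.
candidates = rng.uniform(0.0, 1.0, size=(200, 5))
predicted = gpr.predict(candidates)
recommended = candidates[np.argsort(predicted)[::-1][:5]]
print(recommended.shape)  # (5, 5)
```

In the actual system each recommended parameter vector would then be rendered (or, per the paper, predicted in real time by the neural network) for the user to inspect in a gallery.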


BibTeX

@article{zsolnai-2018-gms,
  title =      "Gaussian Material Synthesis",
  author =     "K\'{a}roly Zsolnai-Feh\'{e}r and Peter Wonka and Michael
               Wimmer",
  year =       "2018",
  abstract =   "We present a learning-based system for rapid mass-scale
               material synthesis that is useful for novice and expert
               users alike. The user preferences are learned via Gaussian
               Process Regression and can be easily sampled for new
               recommendations. Typically, each recommendation takes 40-60
               seconds to render with global illumination, which makes this
               process impracticable for real-world workflows. Our neural
               network eliminates this bottleneck by providing high-quality
               image predictions in real time, after which it is possible
               to pick the desired materials from a gallery and assign them
               to a scene in an intuitive manner. Workflow timings against
               Disney's ``principled'' shader reveal that our system
               scales well with the number of sought materials, thus
               empowering even novice users to generate hundreds of
               high-quality material models without any expertise in
               material modeling. Similarly, expert users experience a
               significant decrease in the total modeling time when
               populating a scene with materials. Furthermore, our proposed
               solution also offers controllable recommendations and a
               novel latent space variant generation step to enable the
               real-time fine-tuning of materials without requiring any
               domain expertise.",
  month =      aug,
  doi =        "10.1145/3197517.3201307",
  issn =       "0730-0301",
  journal =    "ACM Transactions on Graphics (SIGGRAPH 2018)",
  number =     "4",
  volume =     "37",
  pages =      "76:1--76:14",
  keywords =   "gaussian material synthesis, neural rendering",
  URL =        "https://www.cg.tuwien.ac.at/research/publications/2018/zsolnai-2018-gms/",
}