Information
- Publication Type: PhD-Thesis
- Workgroup(s)/Project(s):
- Date: 2026
- Date (Start): 2021
- Date (End): 2024
- TU Wien Library: AC17745734
- Second Supervisor: Pedro Hermosilla Casajus
- Open Access: yes
- First Supervisor: Renata Georgia Raidou
- Pages: 104
- Keywords: Style transfer, Texture synthesis, Neural Networks, Neural rendering
Abstract
3D style transfer refers to altering the visual appearance of 3D objects and scenes to match a given (artistic) style, usually provided as an image. 3D style transfer presents significant potential in streamlining the creation of 3D assets such as game environment props, VFX elements, or large-scale virtual scenes. However, it faces challenges such as ensuring multi-view consistency, respecting computational and memory constraints, and enabling artist control. In this dissertation, we propose three methods that aim at stylizing 3D assets while addressing these challenges. We focus on optimization-based methods due to the higher quality of results compared to single-pass methods. Our contributions advance the state-of-the-art by introducing: (i) novel surface-aware CNN operators for direct mesh texturing, (ii) the first Gaussian Splatting (GS) method capable of transferring both high-frequency details and large-scale patterns, and (iii) an interactive method that allows directional and region-based control over the stylization process. Each of these methods outperforms existing baselines in visual fidelity and robustness. Across three complementary projects, we explore different facets of 3D style transfer. In the first project, we propose a method that creates textures directly on the surface of a mesh. By replacing the standard 2D convolution and pooling layers in a pre-trained 2D CNN with surface-based operations, we achieve seamless, multi-view-consistent texture synthesis without relying on proxy 2D images. In the second project, we transfer both high-frequency and large-scale patterns using GS, while addressing representation-specific artifacts such as oversized or elongated Gaussians. Furthermore, we design a style loss capable of transferring style patterns at multiple scales, resulting in visually appealing stylized scenes that preserve both intricate details and large-scale motifs.
In the third project, we propose an interactive method that allows users to guide stylization by drawing lines to control pattern direction, and by painting regions on both the 3D surface and the style image to specify where and how specific style patterns should be applied. Through our extensive qualitative and quantitative evaluations, we show that our methods surpass state-of-the-art techniques. We also demonstrate their robustness across diverse 3D objects, scenes, and styles, highlighting the flexibility of the presented methods. Future work may explore extensions such as geometry modification for style-driven shape changes, more efficient large-scale pattern synthesis, temporal coherence in dynamic or video-based scenes, and refined interactive controls informed by direct artist feedback to better integrate creative intent into the stylization pipeline.
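The optimization-based stylization the abstract refers to typically minimizes a style loss over CNN feature statistics, most commonly Gram matrices compared at several network depths (in the spirit of Gatys et al.). The thesis's actual multi-scale loss is its own contribution; the following is only a minimal NumPy sketch of the generic Gram-based formulation, with hypothetical function names:

```python
import numpy as np

def gram_matrix(features):
    """Channel co-occurrence statistics of a (C, H, W) feature map."""
    c, h, w = features.shape
    f = features.reshape(c, h * w)
    # Normalize by the number of spatial positions so maps of
    # different resolutions contribute comparably.
    return (f @ f.T) / (h * w)

def multiscale_style_loss(stylized_maps, style_maps):
    """Sum of squared Gram differences across feature maps at several scales.

    Each list holds one (C, H, W) array per CNN layer; in a real pipeline
    these would be activations of a pre-trained network evaluated on
    renderings of the 3D asset and on the style image, respectively.
    """
    loss = 0.0
    for fs, ft in zip(stylized_maps, style_maps):
        g_s, g_t = gram_matrix(fs), gram_matrix(ft)
        loss += np.sum((g_s - g_t) ** 2)
    return loss
```

During optimization, this scalar would be backpropagated through the renderer to the scene parameters (texture values or Gaussian attributes); the sketch omits that machinery and the content and regularization terms a full method would add.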
Additional Files and Images
Weblinks
BibTeX
@phdthesis{Kovacs_PhD,
title = "3D Style Transfer: Lifting 2D Methods to 3D and Enabling
Interactive Guidance",
author = "Áron Samuel Kovács",
year = "2026",
abstract = "3D style transfer refers to altering the visual appearance
of 3D objects and scenes to match a given (artistic) style,
usually provided as an image. 3D style transfer presents
significant potential in streamlining the creation of 3D
assets such as game environment props, VFX elements, or
large-scale virtual scenes. However, it faces challenges such
as ensuring multi-view consistency, respecting computational
and memory constraints, and enabling artist control. In this
dissertation, we propose three methods that aim at stylizing
3D assets while addressing these challenges. We focus on
optimization-based methods due to the higher quality of
results compared to single-pass methods. Our contributions
advance the state-of-the-art by introducing: (i) novel
surface-aware CNN operators for direct mesh texturing, (ii)
the first Gaussian Splatting (GS) method capable of
transferring both high-frequency details and large-scale
patterns, and (iii) an interactive method that allows
directional and region-based control over the stylization
process. Each of these methods outperforms existing
baselines in visual fidelity and robustness. Across three
complementary projects, we explore different facets of 3D
style transfer. In the first project, we propose a method
that creates textures directly on the surface of a mesh. By
replacing the standard 2D convolution and pooling layers in
a pre-trained 2D CNN with surface-based operations, we
achieve seamless, multi-view-consistent texture synthesis
without relying on proxy 2D images. In the second project,
we transfer both high-frequency and large-scale patterns
using GS, while addressing representation-specific artifacts
such as oversized or elongated Gaussians. Furthermore, we
design a style loss capable of transferring style patterns
at multiple scales, resulting in visually appealing stylized
scenes that preserve both intricate details and large-scale
motifs. In the third project, we propose an interactive
method that allows users to guide stylization by drawing
lines to control pattern direction, and painting regions on
both the 3D surface and style image to specify where and how
specific style patterns should be applied. Through our
extensive qualitative and quantitative evaluations, we show
that our methods surpass state-of-the-art techniques. We
also demonstrate their robustness across diverse 3D objects,
scenes, and styles, highlighting the flexibility of the
presented methods. Future work may explore extensions such
as geometry modification for style-driven shape changes,
more efficient large-scale pattern synthesis, temporal
coherence in dynamic or video-based scenes, and refined
interactive controls informed by direct artist feedback to
better integrate creative intent into the stylization
pipeline.",
pages = "104",
address = "Favoritenstrasse 9-11/E193-02, A-1040 Vienna, Austria",
school = "Research Unit of Computer Graphics, Institute of Visual
Computing and Human-Centered Technology, Faculty of
Informatics, TU Wien",
keywords = "Style transfer, Texture synthesis, Neural Networks, Neural
rendering",
URL = "https://www.cg.tuwien.ac.at/research/publications/2026/Kovacs_PhD/",
}