Kovacs, A. S. (2026). 3D Style Transfer: Lifting 2D Methods to 3D and Enabling Interactive Guidance [Dissertation, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2026.137815
3D style transfer refers to altering the visual appearance of 3D objects and scenes to match a given (artistic) style, usually provided as an image. It holds significant potential for streamlining the creation of 3D assets such as game environment props, VFX elements, or large-scale virtual scenes. However, it faces challenges such as ensuring multi-view consistency, respecting computational and memory constraints, and enabling artist control. In this dissertation, we propose three methods that stylize 3D assets while addressing these challenges. We focus on optimization-based methods due to the higher quality of their results compared to single-pass methods. Our contributions advance the state of the art by introducing: (i) novel surface-aware CNN operators for direct mesh texturing, (ii) the first Gaussian Splatting (GS) method capable of transferring both high-frequency details and large-scale patterns, and (iii) an interactive method that allows directional and region-based control over the stylization process. Each of these methods outperforms existing baselines in visual fidelity and robustness.

Across three complementary projects, we explore different facets of 3D style transfer. In the first project, we propose a method that creates textures directly on the surface of a mesh. By replacing the standard 2D convolution and pooling layers in a pre-trained 2D CNN with surface-based operations (see the first sketch below), we achieve seamless, multi-view-consistent texture synthesis without relying on proxy 2D images. In the second project, we transfer both high-frequency and large-scale patterns using GS, while addressing representation-specific artifacts such as oversized or elongated Gaussians. Furthermore, we design a style loss capable of transferring style patterns at multiple scales (see the second sketch below), resulting in visually appealing stylized scenes that preserve both intricate details and large-scale motifs. In the third project, we propose an interactive method that lets users guide stylization by drawing lines to control pattern direction, and by painting regions on both the 3D surface and the style image to specify where and how specific style patterns should be applied.

Through extensive qualitative and quantitative evaluations, we show that our methods surpass state-of-the-art techniques. We also demonstrate their robustness across diverse 3D objects, scenes, and styles, highlighting the flexibility of the presented methods. Future work may explore extensions such as geometry modification for style-driven shape changes, more efficient large-scale pattern synthesis, temporal coherence in dynamic or video-based scenes, and refined interactive controls informed by direct artist feedback to better integrate creative intent into the stylization pipeline.
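To make the first idea concrete: one simple way to reuse the weights of a pretrained 2D convolution on a mesh surface is to gather, for each vertex, a fixed number of neighbors laid out in a consistently oriented tangent-plane ordering that mimics a 3x3 image patch, then apply the 2D kernel weights to that gathered patch. The sketch below assumes such neighborhoods are precomputed (the method-specific part) and is only an illustration of the general principle, not the dissertation's actual operators; the function name surface_conv and the 3x3 neighborhood layout are assumptions.

```python
import torch

def surface_conv(vertex_feats, neighbor_idx, weight, bias):
    # Illustrative surface convolution (hypothetical helper, not the
    # dissertation's operator).
    # vertex_feats: (V, C_in) per-vertex features on the mesh.
    # neighbor_idx: (V, 9) precomputed indices of each vertex's tangent-plane
    #   neighbors, ordered consistently like the cells of a 3x3 image patch
    #   (the vertex itself at the center). Building these neighborhoods is the
    #   hard, method-specific step and is assumed given here.
    # weight, bias: parameters of a pretrained 2D conv layer,
    #   shapes (C_out, C_in, 3, 3) and (C_out,).
    V = vertex_feats.shape[0]
    patches = vertex_feats[neighbor_idx]               # (V, 9, C_in)
    patches = patches.permute(0, 2, 1).reshape(V, -1)  # (V, C_in*9), C_in-major
    w = weight.reshape(weight.shape[0], -1)            # (C_out, C_in*9), same layout
    return patches @ w.t() + bias                      # (V, C_out)
```

For the second idea, a style loss can be made sensitive to patterns at several scales by matching feature statistics of the rendered and style images across a pyramid of image resolutions: fine scales capture high-frequency texture, coarse scales capture large motifs. The following is a minimal sketch in the spirit of Gram-matrix style losses (Gatys et al.); the names multiscale_style_loss and VGGFeatures, the choice of VGG-16 layers, and the scale set are illustrative assumptions rather than the dissertation's actual loss.

```python
import torch
import torch.nn.functional as F
from torchvision.models import vgg16

class VGGFeatures(torch.nn.Module):
    """Frozen VGG-16 exposing a few activation layers often used for style."""
    def __init__(self, layer_ids=(3, 8, 15, 22)):
        super().__init__()
        self.vgg = vgg16(weights="IMAGENET1K_V1").features.eval()
        for p in self.vgg.parameters():
            p.requires_grad_(False)
        self.layer_ids = set(layer_ids)

    def forward(self, x):
        feats = []
        for i, layer in enumerate(self.vgg):
            x = layer(x)
            if i in self.layer_ids:
                feats.append(x)
        return feats

def gram(feat):
    # Channel co-occurrence statistics of a (B, C, H, W) feature map.
    b, c, h, w = feat.shape
    f = feat.reshape(b, c, h * w)
    return f @ f.transpose(1, 2) / (c * h * w)

def multiscale_style_loss(render, style, extractor, scales=(1.0, 0.5, 0.25)):
    # Match Gram statistics at several image resolutions so that both
    # fine textures and large-scale motifs contribute to the loss.
    loss = render.new_zeros(())
    for s in scales:
        r = render if s == 1.0 else F.interpolate(
            render, scale_factor=s, mode="bilinear", align_corners=False)
        t = style if s == 1.0 else F.interpolate(
            style, scale_factor=s, mode="bilinear", align_corners=False)
        for fr, ft in zip(extractor(r), extractor(t)):
            loss = loss + F.mse_loss(gram(fr), gram(ft))
    return loss
```

In a GS-style optimization loop, render would come from a differentiable rasterization of the Gaussians, and backpropagating this loss would update the Gaussians' appearance parameters; that surrounding pipeline is assumed, not shown.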