<div class="csl-bib-body">
<div class="csl-entry">Kovacs, A. S. (2026). <i>3D Style Transfer: Lifting 2D Methods to 3D and Enabling Interactive Guidance</i> [Dissertation, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2026.137815</div>
</div>
-
dc.identifier.uri
https://doi.org/10.34726/hss.2026.137815
-
dc.identifier.uri
http://hdl.handle.net/20.500.12708/224033
-
dc.description
Thesis not yet received by the library - data not verified
-
dc.description
Differing title according to the author's own translation
-
dc.description.abstract
3D style transfer refers to altering the visual appearance of 3D objects and scenes to match a given (artistic) style, usually provided as an image. 3D style transfer presents significant potential in streamlining the creation of 3D assets such as game environment props, VFX elements, or large-scale virtual scenes. However, it faces challenges such as ensuring multi-view consistency, respecting computational and memory constraints, and enabling artist control. In this dissertation, we propose three methods that aim to stylize 3D assets while addressing these challenges. We focus on optimization-based methods due to the higher quality of results compared to single-pass methods. Our contributions advance the state of the art by introducing: (i) novel surface-aware CNN operators for direct mesh texturing, (ii) the first Gaussian Splatting (GS) method capable of transferring both high-frequency details and large-scale patterns, and (iii) an interactive method that allows directional and region-based control over the stylization process. Each of these methods outperforms existing baselines in visual fidelity and robustness. Across three complementary projects, we explore different facets of 3D style transfer. In the first project, we propose a method that creates textures directly on the surface of a mesh. By replacing the standard 2D convolution and pooling layers in a pre-trained 2D CNN with surface-based operations, we achieve seamless, multi-view-consistent texture synthesis without relying on proxy 2D images. In the second project, we transfer both high-frequency and large-scale patterns using GS, while addressing representation-specific artifacts such as oversized or elongated Gaussians. Furthermore, we design a style loss capable of transferring style patterns at multiple scales, resulting in visually appealing stylized scenes that preserve both intricate details and large-scale motifs.
In the third project, we propose an interactive method that allows users to guide stylization by drawing lines to control pattern direction, and painting regions on both the 3D surface and style image to specify where and how specific style patterns should be applied. Through our extensive qualitative and quantitative evaluations, we show that our methods surpass state-of-the-art techniques. We also demonstrate their robustness across diverse 3D objects, scenes, and styles, highlighting the flexibility of the presented methods. Future work may explore extensions such as geometry modification for style-driven shape changes, more efficient large-scale pattern synthesis, temporal coherence in dynamic or video-based scenes, and refined interactive controls informed by direct artist feedback to better integrate creative intent into the stylization pipeline.
en
dc.language
English
-
dc.language.iso
en
-
dc.rights.uri
http://rightsstatements.org/vocab/InC/1.0/
-
dc.subject
Style transfer
en
dc.subject
Texture synthesis
en
dc.subject
Neural Networks
en
dc.subject
Neural rendering
en
dc.title
3D Style Transfer: Lifting 2D Methods to 3D and Enabling Interactive Guidance
en
dc.type
Thesis
en
dc.type
Hochschulschrift
de
dc.rights.license
In Copyright
en
dc.rights.license
Urheberrechtsschutz
de
dc.identifier.doi
10.34726/hss.2026.137815
-
dc.contributor.affiliation
TU Wien, Österreich
-
dc.rights.holder
Aron Samuel Kovacs
-
dc.publisher.place
Wien
-
tuw.version
vor
-
tuw.thesisinformation
Technische Universität Wien
-
dc.contributor.assistant
Hermosilla Casajus, Pedro
-
tuw.publication.orgunit
E193 - Institut für Visual Computing and Human-Centered Technology
-
dc.type.qualificationlevel
Doctoral
-
dc.identifier.libraryid
AC17745734
-
dc.description.numberOfPages
104
-
dc.thesistype
Dissertation
de
dc.thesistype
Dissertation
en
tuw.author.orcid
0000-0002-0849-9032
-
dc.rights.identifier
In Copyright
en
dc.rights.identifier
Urheberrechtsschutz
de
tuw.advisor.staffStatus
staff
-
tuw.assistant.staffStatus
exstaff
-
tuw.advisor.orcid
0000-0003-2468-0664
-
item.cerifentitytype
Publications
-
item.openaccessfulltext
Open Access
-
item.languageiso639-1
en
-
item.fulltext
with Fulltext
-
item.openairetype
doctoral thesis
-
item.grantfulltext
open
-
item.mimetype
application/pdf
-
item.openairecristype
http://purl.org/coar/resource_type/c_db06
-
crisitem.author.dept
E193-02 - Forschungsbereich Computer Graphics
-
crisitem.author.orcid
0000-0002-0849-9032
-
crisitem.author.parentorg
E193 - Institut für Visual Computing and Human-Centered Technology