
Yonsei University–Metown Research on 3D Gaussian Splatting Texture Editing Accepted to IEEE TVCG
A joint research paper titled “3D Gaussian Splatting Texture Editing via Single Modified Image,” conducted by researchers Haneul Baek and Gyumin Kim from Yonsei University together with Dohae Lee and Inkwon Lee from Metown, has been accepted for publication in the prestigious IEEE Transactions on Visualization and Computer Graphics (TVCG).

The paper proposes a novel method that enables precise texture editing of neural-rendered 3D scenes using only a single modified image. Unlike conventional approaches that require multi-view editing or additional training, this method allows intuitive texture modifications to be propagated across the entire 3D scene from a single edited view. The research lays a new foundation for intuitive and efficient 3D content editing, highlighting the potential to make 3D editing accessible even to non-expert users.

The technology is expected to be applied across various industries, including 3D content production, digital twins, XR, the metaverse, and VFX, significantly improving the efficiency and usability of 3D scene editing workflows. Metown plans to integrate the research outcomes into its EVOVA neural rendering engine to enhance 3D editing capabilities, including material changes, color adjustments, and detail editing for automated e-commerce 3D content production.

CEO Dohae Lee commented, “This research is meaningful in that it extends neural rendering from generation to editing. We will continue building a comprehensive 3D AI technology ecosystem connecting generation, editing, and deployment through EVOVA.”

Metown also noted that its core technologies have been developed with support from major Korean government initiatives, including the Metaverse Lab Support Project funded by the Ministry of Science and ICT, the Ministry of SMEs and Startups, and the Ministry of Culture, Sports and Tourism.
2026.04.09
