한국어 | English

StyleShift: Real-Time Texture Generation with Stable Diffusion

C# | Unity Engine | Gen AI
[Demo] Real-Time Texture Update on an Individual Object
[Demo] Real-Time Texture Update Across the Entire Environment

Tools

Engine: Unity (C#)
Gen AI: Stable Diffusion Web UI API + ControlNet
Platform: Meta Quest

Project Overview

This prototype explores how generative AI can transform 3D environments in real time. Starting with a single object, a cup, I experimented with applying different styles through text prompts sent to Stable Diffusion: material styles such as ceramic and glass, and artistic styles such as sketch and watercolor.

After validating the concept, I extended it to the full room. The walls, floor, and objects all change textures based on the selected prompt, giving the user the feeling of being instantly transported to a different environment. This allows for immersive mood or theme changes without scene transitions or reloading.
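
A minimal sketch of how that room-wide switch could be wired up, assuming each restylable surface is tagged "StyleShift" and carries a pre-rendered image of its UV layout. `RoomStyleSwitcher`, `UvMapProvider`, and the `TextureRestyler.Restyle` coroutine (sketched under System Structure below) are illustrative names, not the project's actual API:

```csharp
using UnityEngine;

// Hypothetical holder for a pre-rendered image of a mesh's UV layout,
// later used as the ControlNet conditioning image.
public class UvMapProvider : MonoBehaviour
{
    public Texture2D uvMap;
}

public class RoomStyleSwitcher : MonoBehaviour
{
    public TextureRestyler restyler; // Stable Diffusion client, sketched below

    // Called when the user selects a new style prompt.
    public void ApplyRoomStyle(string prompt)
    {
        // Restyle every tagged surface: walls, floor, and props alike.
        foreach (var r in FindObjectsOfType<Renderer>())
        {
            if (!r.CompareTag("StyleShift")) continue;

            var provider = r.GetComponent<UvMapProvider>();
            if (provider != null && provider.uvMap != null)
                StartCoroutine(restyler.Restyle(r, prompt, provider.uvMap));
        }
    }
}
```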

System Structure

The system performs real-time texture transformation driven by user-defined prompts. A descriptive text prompt (e.g., "wooden texture", "futuristic metal", "cozy hand-drawn") is sent from Unity to the Stable Diffusion Web UI API, along with a conditioning image derived from the object’s UV map (via ControlNet). The resulting AI-generated texture is decoded and applied directly to the 3D models in the Unity scene.

💬 Prompt ➝ 🧠🎨 Stable Diffusion (with ControlNet) ➝ 🖼️ Texture Generation ➝ 🏠 Scene Update
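
A minimal sketch of that pipeline from the Unity side, assuming a locally hosted AUTOMATIC1111 Web UI with the ControlNet extension enabled. The endpoint and JSON field names follow the public `/sdapi/v1/txt2img` schema, while the class name, ControlNet module/model choices, and sampler settings are assumptions, not the project's actual values:

```csharp
using System.Collections;
using System.Text;
using UnityEngine;
using UnityEngine.Networking;

public class TextureRestyler : MonoBehaviour
{
    // Default local endpoint of the AUTOMATIC1111 Web UI API (assumption).
    const string Txt2ImgUrl = "http://127.0.0.1:7860/sdapi/v1/txt2img";

    public IEnumerator Restyle(Renderer target, string prompt, Texture2D uvMap)
    {
        // The UV-layout image conditions generation via ControlNet so the
        // output texture lines up with the mesh's unwrap. EncodeToPNG
        // requires the texture to be import-flagged as readable.
        string uvBase64 = System.Convert.ToBase64String(uvMap.EncodeToPNG());

        // Module/model names below are placeholders; any installed
        // ControlNet model can be substituted.
        string payload =
            "{\"prompt\":\"" + prompt + "\",\"steps\":20," +
            "\"width\":512,\"height\":512," +
            "\"alwayson_scripts\":{\"controlnet\":{\"args\":[{" +
            "\"input_image\":\"" + uvBase64 + "\"," +
            "\"module\":\"canny\",\"model\":\"control_v11p_sd15_canny\"}]}}}";

        using (var req = new UnityWebRequest(Txt2ImgUrl, "POST"))
        {
            req.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(payload));
            req.downloadHandler = new DownloadHandlerBuffer();
            req.SetRequestHeader("Content-Type", "application/json");
            yield return req.SendWebRequest();

            if (req.result != UnityWebRequest.Result.Success)
            {
                Debug.LogError("txt2img request failed: " + req.error);
                yield break;
            }

            // The API returns base64-encoded PNGs in an "images" array.
            var response = JsonUtility.FromJson<Txt2ImgResponse>(req.downloadHandler.text);
            byte[] png = System.Convert.FromBase64String(response.images[0]);

            // Decode the PNG into a texture and swap it onto the model.
            var tex = new Texture2D(2, 2);
            tex.LoadImage(png);
            target.material.mainTexture = tex;
        }
    }

    [System.Serializable]
    class Txt2ImgResponse { public string[] images; }
}
```

`JsonUtility` ignores response fields it does not map, so the one-field `Txt2ImgResponse` class is enough to pull out the generated image; a fuller client would also surface negative prompts, seeds, and error payloads.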

Key Features