Compositional Neural Textures

SIGGRAPH Asia 2024
Compositional neural representation for texture editing. (a) We present a neural representation that decomposes textures into geometric elements with appearance features in a fully unsupervised manner. Geometric elements are represented by Gaussian functions corresponding to individual texture elements (textons), and each Gaussian has an associated appearance vector that encodes the texton's appearance. Our network architecture and objectives are designed to encourage the extraction of recurring textons and the separation of geometry/structure from appearance, while preserving a faithful reconstruction of the original texture. This facilitates a variety of editing operations and synthesis applications, such as transferring appearance from one texture onto the structure of another texture (b) or image (c), texture interpolation and morphing (d), manipulating the variations within a texture (e), edit propagation (f), and direct manipulation (e.g., moving and transforming) of the textons (g).
Fast forward


Supplementary video


Talk
(Pending)

Abstract

Texture plays a vital role in enhancing visual richness in both real photographs and computer-generated imagery. However, editing textures often involves laborious and repetitive manual adjustments of textons, the recurring local patterns that characterize textures. This work introduces a fully unsupervised approach for representing textures using a compositional neural model that captures individual textons. We represent each texton as a 2D Gaussian function whose spatial support approximates its shape, together with an associated feature that encodes its detailed appearance. By modeling a texture as a discrete composition of Gaussian textons, the representation offers both expressiveness and ease of editing. Textures can be edited by modifying the compositional Gaussians within the latent space, and new textures can be efficiently synthesized by feeding the modified Gaussians through a generator network in a feed-forward manner. This approach enables a wide range of applications, including transferring the appearance of one texture to another image, diversifying textures, texture interpolation, revealing and modifying texture variations, edit propagation, texture animation, and direct texton manipulation. The proposed approach advances texture analysis, modeling, and editing techniques, and opens up new possibilities for creating visually appealing images with controllable textures.
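As a concrete illustration of this representation, the sketch below (Python/NumPy; not the authors' code, with the names Texton and splat_textons invented for illustration) stores each texton as a 2D Gaussian, given by a mean and a covariance, paired with an appearance vector, and splats all Gaussians into a normalized feature map of the kind a generator network could decode back into an RGB texture.

# A minimal sketch of a composition of Gaussian textons (illustrative only;
# this is not the paper's implementation).
import numpy as np

class Texton:
    """One texture element: a 2D Gaussian plus an appearance feature."""
    def __init__(self, mean, cov, feature):
        self.mean = np.asarray(mean, dtype=np.float64)        # (2,) center (x, y) in pixels
        self.cov = np.asarray(cov, dtype=np.float64)          # (2, 2) shape/orientation
        self.feature = np.asarray(feature, dtype=np.float64)  # (C,) appearance vector

def splat_textons(textons, height, width):
    """Blend Gaussian-weighted appearance features into an (H, W, C) map.
    In the paper's pipeline a generator network decodes such intermediate
    features into an RGB texture; here we only build the feature map."""
    channels = textons[0].feature.shape[0]
    feat_map = np.zeros((height, width, channels))
    weight = np.zeros((height, width, 1))
    ys, xs = np.mgrid[0:height, 0:width]
    coords = np.stack([xs, ys], axis=-1).astype(np.float64)   # (H, W, 2)
    for t in textons:
        d = coords - t.mean                                   # offsets from the texton center
        m = np.einsum('hwi,ij,hwj->hw', d, np.linalg.inv(t.cov), d)
        g = np.exp(-0.5 * m)[..., None]                       # unnormalized Gaussian, (H, W, 1)
        feat_map += g * t.feature
        weight += g
    return feat_map / np.maximum(weight, 1e-8)                # normalized blend

# Example: two elongated textons on a 64x64 canvas with 8-D appearance vectors.
textons = [Texton((20, 32), ((25, 0), (0, 9)), np.random.rand(8)),
           Texton((44, 32), ((9, 0), (0, 25)), np.random.rand(8))]
feature_map = splat_textons(textons, 64, 64)                  # shape (64, 64, 8)

Under this view, the editing operations listed above reduce to modifying the Gaussians before re-splatting: moving or transforming a texton changes its mean and covariance, while swapping its appearance vector changes what the generator renders there.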

Resources and Downloads


  • Paper (arXiv)

  • Supplementary material

  • Code (Pending)

Citation

@inproceedings{Tu:2024:CNT,
  title     = {Compositional Neural Textures},
  author    = {Peihan Tu and Li-Yi Wei and Matthias Zwicker},
  booktitle = {SIGGRAPH Asia 2024 Conference Papers},
  series    = {SIGGRAPH Asia '24},
  year      = {2024},
  month     = dec,
  doi       = {10.1145/3680528.3687561}
}

Contact

For questions and clarifications, please contact:
Peihan Tu (peihan.tu@gmail.com)

Template from GVV Group at MPI.