Projective Urban Texturing

3DV 2021

Yiangos Georgiou, Melinos Averkiou, Tom Kelly, & Evangelos Kalogerakis


The creation of high-quality textures for immersive urban environments is a central component of the city modeling problem. Many recent pipelines capture or synthesize large quantities of city geometry using scanners or procedural modeling. Such geometry is intricate and realistic; however, generating photo-realistic textures for such large scenes remains a problem, since photo datasets are often panoramic and are challenging to re-target to new geometry. To address these issues we present a neural architecture that generates photo-realistic textures for urban environments. Our Projective Urban Texturing (PUT) system iteratively re-targets textural style and detail from real-world panoramic images to unseen, unstructured urban meshes. The output is a texture atlas applied to the input 3D urban model geometry. PUT is conditioned on prior adjacent textures to ensure consistency between consecutively generated textures. We show results for several generated texture atlases, learned from different cities, and present a quantitative evaluation of our outputs.
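The iterative conditioning described in the abstract, where each newly generated texture is conditioned on its already-textured neighbor, can be sketched as a simple loop. This is a hypothetical illustration, not the authors' code: `texture_network`, the patch size, and the blending stand-in are all assumptions made for the sketch.

```python
# Hypothetical sketch of PUT-style iterative texturing (illustrative only).
# Each patch of the mesh is textured in sequence, conditioned on the
# previously generated neighboring patch so consecutive outputs stay
# consistent; results accumulate into a texture atlas.
import numpy as np

PATCH = 32  # assumed per-patch texture resolution


def texture_network(render, prior_patch):
    """Stand-in for the image-translation network: here it simply blends
    the rendered geometry cue with the prior neighboring texture."""
    return 0.5 * render + 0.5 * prior_patch


def build_texture_atlas(num_patches, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    atlas = np.zeros((num_patches, PATCH, PATCH, 3))
    prior = np.zeros((PATCH, PATCH, 3))  # blank prior for the first patch
    for i in range(num_patches):
        # In the real system this would be a rendering of patch i's geometry.
        render = rng.random((PATCH, PATCH, 3))
        atlas[i] = texture_network(render, prior)
        prior = atlas[i]  # condition the next patch on this output
    return atlas


atlas = build_texture_atlas(4)
```

The key design point this loop illustrates is the conditioning on prior adjacent textures: without the `prior` term, neighboring patches would be generated independently and visible seams would appear between them.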



Y. Georgiou, M. Averkiou, T. Kelly, and E. Kalogerakis, Projective Urban Texturing, in Proceedings of the International Conference on 3D Vision (3DV), IEEE, 2021.
@inproceedings{Georgiou2021PUT,
  author    = {Yiangos Georgiou and Melinos Averkiou and Tom Kelly and Evangelos Kalogerakis},
  title     = {Projective Urban Texturing},
  booktitle = {3DV 2021: International Conference on 3D Vision},
  publisher = {IEEE},
  month     = {October},
  year      = {2021},
  note      = {{\copyright} 2021 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.},
  url       = {}
}

Authors from VCG

Tom Kelly