
Smart 3D Panorama Reconstruction: An Experience to Break Through Limits of Time and Space

Thanks to the support of the ‘Digital Economy Project’, researchers at the College of Electrical Engineering and Computer Science, National Tsing Hua University (NTHU-EECS) collaborated with iStaging Corp. (iStaging), connecting experts from academia and industry, to complete the study “Smart 3D Panorama Reconstruction”. The study uses virtual reality technologies and the myriad panoramic images captured by iStaging to create immersive house-tour experiences for users at home. With this cutting-edge research, online house tours now allow free movement in 3D, breaking the spatial restrictions of static panoramic views and bringing a real revolution to both consumers and developers. Consumers enjoy faster and more complete access to house information during the purchase process, while real estate developers and house agencies improve business efficiency through our technology.

As panoramic imaging technologies become popular among the general public, business models based on panoramic images have gradually been explored. From Google Street View to iStaging's indoor panoramic house tours recorded with a mobile device, more and more real estate agents and platforms around the world, such as Yung-Ching Realty, H&B Realty, Zillow, and Redfin, provide remote house-tour services to greatly increase the exposure of each listing and speed up the transaction process. A large number of house buyers and potential tenants can take tours online, and many similar house-tour services have sprung up. Our technology stands out by giving users an online house tour that feels like a real visit: they can move around in three dimensions. This 3D technology not only improves the consumer experience; the simplified content-generation process and reduced costs are also a breakthrough for developers and agents.

NTHU therefore began collaborating with iStaging and conducted the “Smart 3D Panorama Reconstruction” research under the ‘Digital Economy Project’ of the Department of Engineering and Technologies, MOST. The most eye-catching improvement is that we need only one panoramic image to build an indoor 3D layout, leveraging a novel machine learning technique. We further marry smart reconstruction with virtual reality to give users a realistic sense of 3D and a visually immersive experience. Current capture solutions depend on expensive LiDAR equipment to acquire indoor depth information and massive labor to complete 3D indoor layout modeling. Our research enables a high-end online house-tour experience from a single panoramic image, which implies a considerable reduction in human-resource and hardware costs. Furthermore, it scales well enough to penetrate the market and, more importantly, brings online house-tour services into daily life. This breakthrough technology can be effectively extended to overseas markets through iStaging's existing network of house agents in more than 50 countries around the world.

To fully leverage the power of the industrial-standard data provided by iStaging, 2,500 indoor panoramic images were used for training and testing. Compared to the state of the art, LayoutNet [1], our technology brings an impressive 15% accuracy increase in terms of 2D and 3D IoU. The research has been accepted by a top-tier computer vision conference and will be officially published in California later this June [2].
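The 2D IoU (intersection over union) metric mentioned above measures how well a predicted room footprint overlaps the ground-truth one. As a minimal illustrative sketch (not the paper's actual evaluation code), the idea can be shown with footprints rasterized onto a grid of cells; the rectangle sizes below are made-up toy values:

```python
def layout_iou(pred, gt):
    """2D IoU between two sets of occupied grid cells (floor-plan footprints)."""
    inter = len(pred & gt)   # cells covered by both footprints
    union = len(pred | gt)   # cells covered by either footprint
    return inter / union if union else 0.0

def rect_cells(x0, y0, x1, y1):
    """Cells covered by an axis-aligned rectangle [x0, x1) x [y0, y1)."""
    return {(x, y) for x in range(x0, x1) for y in range(y0, y1)}

# Toy example: predicted and ground-truth footprints on a 10x10 grid.
pred = rect_cells(2, 2, 8, 8)    # predicted footprint: 36 cells
gt = rect_cells(4, 4, 10, 10)    # ground-truth footprint: 36 cells
print(round(layout_iou(pred, gt), 4))  # overlap 16 / union 56 -> 0.2857
```

In practice, the layouts are general polygons rather than axis-aligned rectangles, and 3D IoU extends the same overlap-over-union idea to room volumes, but the metric being improved by 15% is computed in this spirit.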

Thanks to the technologies developed by the researchers at NTHU and the industrial-scale data from iStaging, online house tours are gearing up for immersive, free-movement 3D indoor touring. We look forward to seeing iStaging bring more of our technology into real products.


[1] C. Zou, A. Colburn, Q. Shan, and D. Hoiem. LayoutNet: Reconstructing the 3D Room Layout from a Single RGB Image. In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 2018.

[2] S.-T. Yang, F.-E. Wang, C.-H. Peng, P. Wonka, M. Sun, and H.-K. Chu. DuLa-Net: A Dual-Projection Network for Estimating Room Layouts from a Single RGB Panorama. In The IEEE Conference on Computer Vision and Pattern Recognition (CVPR), June 2019. https://cgv.cs.nthu.edu.tw/projects/dulanet

Media Contact:

Associate Professor, Min Sun

Department of Electrical Engineering, National Tsing Hua University (NTHU-EE),




Associate Professor, Hung-Kuo Chu

Department of Computer Science, National Tsing Hua University (NTHU-CS),




Last Modified : 2019/05/19