Real-time Appearance-Driven Memory-Efficient Dense Canopy Synthesis
Precisely reconstructed landscape models can bridge gaps between design and construction while also providing realistic design visualizations and demonstrations.
Though trees are indispensable in landscapes, they generally serve as visual references in interactive design, so it is not worthwhile to spend excessive resources on highly detailed representation and reconstruction.
However, state-of-the-art methods mainly focus on replicating lifelike geometric and topological details with unconstrained memory usage and rendering cost; reconstructing canopies under tight resource budgets has yet to be well investigated in related domains.
Therefore, this work seeks a method to synthesize highly realistic dense tree canopies with minimal memory consumption and superior rendering efficiency.
The method first fits the tree silhouette under different views with semi-ellipsoids, denoted as the proxy.
Then, exemplar leaves are quasi-randomly distributed on the proxy shell to approximate the aggregated appearance of leaf clusters and further reduce memory.
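The quasi-random placement step can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Halton low-discrepancy sequence for the quasi-random samples and an upper half-ellipsoid with semi-axes `(a, b, c)` as the proxy shell; the function names are hypothetical.

```python
import math

def halton(i, base):
    """Van der Corput radical inverse of index i in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def semi_ellipsoid_points(n, a, b, c):
    """Place n quasi-random points on the upper half of an ellipsoid
    with semi-axes (a, b, c), standing in for the canopy proxy shell."""
    pts = []
    for i in range(1, n + 1):
        u = halton(i, 2)          # azimuth parameter in [0, 1)
        v = halton(i, 3)          # elevation parameter in [0, 1)
        theta = 2.0 * math.pi * u
        phi = math.acos(v)        # v in [0, 1) -> upper hemisphere only
        x = a * math.sin(phi) * math.cos(theta)
        y = b * math.sin(phi) * math.sin(theta)
        z = c * math.cos(phi)     # z >= 0: semi-ellipsoid shell
        pts.append((x, y, z))
    return pts
```

Each returned point lies exactly on the shell, so an exemplar leaf billboard placed there inherits the fitted silhouette without any per-leaf geometry.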
The highly varied, view-dependent visual appearance is then compressed into multiple textures, decomposed into view-independent diffuse coloring and view-dependent residual lighting.
The former compresses coloring textures by quantizing each leaf to a representative color, and the latter encodes pixel-based multi-view subtlety textures as proxy-based light fields.
Finally, several experiments and a user study were conducted to compare the proposed method against state-of-the-art baselines.
The results show that the proposed method achieves superior memory usage and comparable rendering efficiency while presenting a visual appearance similar to the ground truth.