Tree animation and modeling via analytic simulation and image-based reconstruction

Botanical trees are ubiquitous in real-world environments, and they are consequently pervasive in the virtual environments created for film special effects, games, virtual reality, etc. The motion of these trees ranges from subtle swaying that is easily overlooked to dramatic deformation in extreme weather conditions. Their geometry is highly intricate, with features including bark, knots, dense networks of fine twigs and branches, and at times leaves, fruits, and flowers. This dissertation presents a set of frameworks for modeling a tree's motion via artist-directed simulation and its geometry via image-based reconstruction.

First, we address the challenge of efficiently simulating the motion of complex tree models. Efficient simulation of tree movement is difficult due to the many degrees of freedom needed to represent a high-fidelity tree; this challenge is further compounded by the typically stiff material properties of wood, which make trees difficult to simulate at interactive frame rates using standard time integration techniques. To address these issues, we present an algorithm with linear time complexity that simulates trees as articulated rigid bodies with as-stiff-as-desired rotational joints by taking the analytic solution to an approximated problem. Our method supports a variety of effects including wind, fictitious forces, collisions, plasticity, and leaf and fruit fracture; it can be used to interactively parameterize tree models, to run tree simulations in a game engine on a commodity tablet, to simulate and render high-quality trees with thousands to millions of rigid bodies, etc.

Second, we address the challenge of reconstructing high-fidelity, simulation-ready geometry of a tree from the real world. We acquire image data from a set of drone-mounted cameras and apply traditional structure-from-motion and multi-view stereo techniques, then use the resulting point cloud data to build a simulatable representation of the tree's trunk and major branches. To reconstruct smaller twigs, we present an image annotation approach and subsequent improvements that simplify or automate parts of the annotation process using stereo correspondence information and a network trained for semantic segmentation of tree images. Finally, we combine the resulting model with our simulation framework to animate the reconstructed tree geometry.
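The abstract describes advancing as-stiff-as-desired rotational joints by taking the analytic solution to an approximated problem. A minimal sketch of that idea, assuming each joint decouples into an underdamped rotational spring whose motion is evaluated in closed form (so stability does not depend on the time step), might look like the following; the per-joint decoupling, the underdamped assumption, and the function names (`analytic_joint_step`, `simulate_tree`) are our illustrative assumptions, not the dissertation's actual formulation.

```python
import math

def analytic_joint_step(theta, omega_ang, k, c, inertia, dt):
    """Advance one rotational joint by the closed-form solution of a
    damped harmonic oscillator (illustrative sketch).

    theta     -- joint angle relative to its rest pose (radians)
    omega_ang -- joint angular velocity (radians/second)
    k, c      -- spring stiffness and damping; k may be arbitrarily stiff
    inertia   -- effective rotational inertia at this joint
    dt        -- time step; the update is exact, so dt never limits stability
    """
    w = math.sqrt(k / inertia)                  # natural frequency
    zeta = c / (2.0 * math.sqrt(k * inertia))   # damping ratio, assumed < 1
    wd = w * math.sqrt(1.0 - zeta * zeta)       # damped frequency
    decay = math.exp(-zeta * w * dt)            # exponential envelope
    # Coefficients matching the initial angle and angular velocity.
    A = theta
    B = (omega_ang + zeta * w * theta) / wd
    cos_t, sin_t = math.cos(wd * dt), math.sin(wd * dt)
    new_theta = decay * (A * cos_t + B * sin_t)
    new_omega = decay * ((B * wd - zeta * w * A) * cos_t
                         - (A * wd + zeta * w * B) * sin_t)
    return new_theta, new_omega

def simulate_tree(joints, dt):
    """One pass over all joints: cost is linear in the number of rigid bodies."""
    for j in joints:
        j["theta"], j["omega"] = analytic_joint_step(
            j["theta"], j["omega"], j["k"], j["c"], j["inertia"], dt)
```

Because each step evaluates the exact solution of the approximated per-joint equation of motion, the stiffness can be raised as high as desired without shrinking the time step, and a single traversal of the joints keeps the per-frame cost linear in the number of rigid bodies.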


Type of resource text
Form electronic resource; remote; computer; online resource
Extent 1 online resource.
Place California
Place [Stanford, California]
Publisher [Stanford University]
Copyright date ©2019
Publication date 2019
Issuance monographic
Language English


Author Quigley, Edward
Degree supervisor Fedkiw, Ronald P., 1968-
Thesis advisor Fedkiw, Ronald P., 1968-
Thesis advisor Bohg, Jeannette, 1981-
Thesis advisor Khatib, Oussama
Degree committee member Bohg, Jeannette, 1981-
Degree committee member Khatib, Oussama
Associated with Stanford University, Computer Science Department.


Genre Theses
Genre Text

Bibliographic information

Statement of responsibility Ed Quigley.
Note Submitted to the Computer Science Department.
Thesis Thesis (Ph.D.)--Stanford University, 2019.
Location electronic resource

Access conditions

© 2019 by Edward Quigley
