Augmented Reality Sandbox
UC Davis' W.M. Keck Center for Active Visualization in the Earth Sciences (KeckCAVES), together with the UC Davis Tahoe Environmental Research Center, the Lawrence Hall of Science, and the ECHO Lake Aquarium and Science Center, is involved in an NSF-funded project on informal science education for freshwater lake and watershed science. As part of this project, we are primarily developing 3D visualization applications to teach earth science concepts, but we also built a hands-on exhibit that combines a real sandbox with virtual topography and water, created using a closed loop of a Microsoft Kinect 3D camera, powerful simulation and visualization software, and a data projector. The resulting augmented reality (AR) sandbox allows users to create topography models by shaping real sand, which is then augmented in real time with an elevation color map, topographic contour lines, and simulated water. The system teaches geographic, geologic, and hydrologic concepts such as how to read a topographic map, the meaning of contour lines, watersheds, catchment areas, and levees.
This project was inspired by a video created by a group of Czech researchers, who demonstrated an early prototype of an AR sandbox with elevation color mapping and a limited form of fluid flow. There is an even earlier project, Project Mimicry, of which we only learned later; it, too, appears to be in an early testing phase.
Figure 1: The Augmented Reality Sandbox in its natural habitat. Left: the sandbox unit when turned off; the Kinect 3D camera and the digital projector are suspended above the sandbox proper from the pole attached to the back. Right: the sandbox table when turned on, showing a mountain with a crater lake, surrounded by several lower lakes.

Project Goals
The goal of this project was to develop a real-time, integrated augmented reality system in which physically created topography models are scanned into a computer in real time and used as the background for a variety of graphics effects and simulations. The final product is intended to be self-contained enough that it can be used as a hands-on exhibit in science museums with little supervision.
The sandbox hardware was built by project specialist Peter Gold of the UC Davis Department of Geology. The driving software is based on the Vrui VR development toolkit and the Kinect 3D video processing framework, and is available for download under the GNU General Public License.
Raw depth frames arrive from the Kinect camera at 30 frames per second and are fed into a statistical evaluation filter with a configurable per-pixel buffer size (currently defaulting to 30 frames, corresponding to a one-second delay), which serves the triple purpose of filtering out moving objects such as hands or tools, reducing the noise inherent in the Kinect's depth data stream, and filling in missing data in the depth stream. The resulting topographic surface is then rendered from the point of view of the data projector suspended above the sandbox, so that the projected topography exactly matches the real sand topography. The software uses a combination of several GLSL shaders to color the surface by elevation using customizable color maps (the default color map was provided by M. Burak Yikilmaz, a post-doctoral researcher in the UC Davis Department of Geology) and to add real-time topographic contour lines.
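The idea behind such a per-pixel statistical filter can be sketched as follows. This is a minimal NumPy illustration, not the actual Kinect framework code: the class name, the buffer size, the stability threshold, and the convention that missing depth samples are encoded as 0 are all assumptions made for the example.

```python
import numpy as np

class DepthFilter:
    """Sliding-window statistical filter over recent depth frames.

    Serves three purposes: rejects transient objects (hands, tools),
    averages out per-pixel sensor noise, and tolerates holes where
    the camera reported no depth (encoded here as 0, an assumption).
    """

    def __init__(self, height, width, window=30, stability=0.8):
        self.buf = np.zeros((window, height, width), dtype=np.float32)
        self.idx = 0
        self.window = window
        self.stability = stability  # min fraction of valid samples per pixel

    def push(self, frame):
        """Add the newest depth frame to the ring buffer."""
        self.buf[self.idx] = frame
        self.idx = (self.idx + 1) % self.window

    def surface(self):
        """Return the stable sand surface as per-pixel mean depth.

        Pixels without enough consistent valid samples (e.g. covered by
        a moving hand for most of the window) are returned as NaN.
        """
        valid = self.buf > 0                       # 0 marks missing depth
        count = valid.sum(axis=0)
        total = np.where(valid, self.buf, 0.0).sum(axis=0)
        mean = np.divide(total, count, out=np.zeros_like(total),
                         where=count > 0)
        stable = count >= self.stability * self.window
        return np.where(stable, mean, np.nan)
```

A pixel briefly occluded by a hand keeps its filtered sand elevation, because the few invalid samples are outvoted by the rest of the window; only pixels occluded for most of the window drop out.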
At the same time, a water flow simulation based on the Saint-Venant set of shallow water equations, which are a depth-integrated version of the set of Navier-Stokes equations governing fluid flow, is run in the background using another set of GLSL shaders. The simulation is an explicit second-order accurate time evolution of this hyperbolic system of partial differential equations, using the virtual sand surface as boundary conditions. The implementation of this method follows the paper "A Second-Order Well-Balanced Positivity Preserving Central-Upwind Scheme for the Saint-Venant System" by A. Kurganov and G. Petrova, using a simple viscosity term, open boundary conditions at the edges of the grid, and a second-order strong stability-preserving Runge-Kutta temporal integration step. The simulation is run such that the water flows at exactly real speed, assuming a 1:100 scale factor, unless turbulence in the flow forces more integration steps than the driving graphics card (currently an Nvidia GeForce 580) can handle.
The AR Sandbox is slowly gaining traction, with many unaffiliated users building their own versions (see pictures or movies of a select few on the External Installations page). To support the budding AR Sandbox user community, we now have an official AR Sandbox support forum on the Lake Visualization 3D web site where users can help each other or provide feedback to us.
The AR Sandbox support forum also contains detailed step-by-step software installation instructions aimed at new Linux users, and a video showing the entire process.
- Reed, S., Kreylos, O., Hsi, S., Kellogg, L., Schladow, G., Yikilmaz, M.B., Segale, H., Silverman, J., Yalowitz, S., and Sato, E., Shaping Watersheds Exhibit: An Interactive, Augmented Reality Sandbox for Advancing Earth Science Education, American Geophysical Union (AGU) Fall Meeting 2014, Abstract no. ED34A-01
Pages In This Section
- Pictures: Some pictures showing the evolving augmented reality sandbox prototype.
- Movies: Some movies showing the evolving augmented reality sandbox prototype, and several tutorial videos showing how to build your own AR sandbox.
- Download: Download page for the current release of the software driving the augmented reality sandbox, released under the GNU General Public License.
- Instructions: Purchase, construction, setup, and calibration instructions for an Augmented Reality (AR) Sandbox.
- External Installations: List of external users who built their own augmented reality sandboxes, based on our software and designs.