This video introduces some of the augmented reality features possible with the Meta glasses
that go beyond the capabilities of virtual reality.
Here we are generating a real-time point cloud of two hands using the depth-sensing camera.
This mesh aligns with a person's real hands as they look through the stereoscopic
display in the glasses.
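The depth-to-3D step the narration describes can be sketched as a pinhole back-projection of the depth image. This is a generic illustration, not Meta's pipeline; the intrinsics (`fx`, `fy`, `cx`, `cy`) are placeholder values, not the glasses' actual calibration:

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a depth image (meters) into a 3D point cloud
    using a pinhole camera model."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    # Keep only pixels with a valid depth reading.
    mask = z > 0
    return np.stack([x[mask], y[mask], z[mask]], axis=-1)

# Toy 2x2 depth image: 1 m everywhere except one invalid pixel.
depth = np.array([[1.0, 1.0], [1.0, 0.0]])
cloud = depth_to_point_cloud(depth, fx=500.0, fy=500.0, cx=1.0, cy=1.0)
# cloud has one xyz row per valid pixel.
```

Each valid pixel becomes one 3D point; running this per frame yields the live point cloud the video shows.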
With our gesture recognition algorithms, we can track the movements of the two hands
simultaneously and independently.
The IMU allows us to look around and keep objects at a fixed position in the real world.
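World-locking with an IMU amounts to counter-rotating anchored content by the head's measured orientation each frame. A minimal sketch, reduced to yaw in 2D (a real system fuses the full 3-DoF orientation from the IMU):

```python
import numpy as np

def world_to_view(point_world, head_yaw_rad):
    """Rotate a world-anchored point into the head/display frame by
    the inverse of the head's yaw, so the point stays fixed in the
    world as the head turns."""
    c, s = np.cos(-head_yaw_rad), np.sin(-head_yaw_rad)
    R_inv = np.array([[c, -s],
                      [s,  c]])  # inverse of the head rotation (2D)
    return R_inv @ point_world

# Object anchored 1 m ahead of the user in world coordinates.
anchor = np.array([0.0, 1.0])
# User turns their head 90 degrees left; in view coordinates the
# object should now sit off to the right.
view = world_to_view(anchor, np.pi / 2)
```

Because the view-space position is recomputed from the latest orientation every frame, the rendered object appears pinned in place while the user looks around.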
We can also make the objects collide, and have them react accordingly.
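A toy stand-in for that collision response, assuming two equal-mass spheres that exchange velocity components along the contact normal (this illustrates the idea, not the demo's actual physics engine):

```python
import numpy as np

def resolve_collision(p1, v1, p2, v2, r=0.5):
    """If two equal-mass spheres of radius r overlap, swap the
    velocity components along the contact normal (elastic response);
    otherwise return the velocities unchanged."""
    d = p2 - p1
    dist = np.linalg.norm(d)
    if dist == 0 or dist >= 2 * r:
        return v1, v2  # no contact
    n = d / dist  # contact normal
    u1, u2 = np.dot(v1, n), np.dot(v2, n)
    return v1 + (u2 - u1) * n, v2 + (u1 - u2) * n

# Head-on collision: the two spheres swap velocities.
v1n, v2n = resolve_collision(np.array([0.0, 0.0]), np.array([1.0, 0.0]),
                             np.array([0.9, 0.0]), np.array([-1.0, 0.0]))
```

Tangential velocity is untouched, so glancing hits deflect rather than bounce straight back.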
Here we can see an example application that dynamically generates a finger-sculpted mesh
and sends it directly to a 3D printer.
Now the user is pressing the "Got It" button with their finger -- this functionality is
provided by the finger tracking algorithms that can precisely detect a person's fingertips.
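One classic way to locate fingertips, shown here purely to illustrate the idea (the tracker in the video is not public), is to take local maxima of the distance from the hand contour to its centroid:

```python
import numpy as np

def find_fingertips(contour, tol=1e-6):
    """Toy fingertip detector: fingertips are the points on a closed
    hand contour that are local maxima of distance to the centroid.
    Real finger tracking works on depth data and is far more robust."""
    centroid = contour.mean(axis=0)
    d = np.linalg.norm(contour - centroid, axis=1)
    n = len(d)
    tips = []
    for i in range(n):
        # Strict local maximum along the closed contour; tol filters
        # float-level noise between near-equal neighbours.
        if d[i] > d[(i - 1) % n] + tol and d[i] > d[(i + 1) % n] + tol:
            tips.append(contour[i])
    return np.array(tips)

# Synthetic "hand": 12 points on a unit circle, two of them pushed
# out to radius 2 to act as extended fingers.
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
radii = np.ones(12)
radii[[2, 8]] = 2.0
contour = np.stack([radii * np.cos(angles), radii * np.sin(angles)], axis=1)
tips = find_fingertips(contour)
```

Once a fingertip position is known in 3D, a button press is just a proximity test between the tip and the virtual button.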
Using the ZeroUI sculpting library, we can generate a dynamic 3D mesh. The mesh
follows the fingers as they curve back and forth to create the sculpted object.
The sculpted mesh snaps into the 3D printer as it gets dragged over it. A conversion process
happens in the background to get the mesh ready for printing.
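A common target for such a conversion is STL, the de-facto interchange format for 3D printing. A minimal, generic ASCII STL writer (a sketch, not the converter used in the video) might look like:

```python
def mesh_to_ascii_stl(triangles, name="sculpt"):
    """Serialize a triangle mesh to ASCII STL. `triangles` is a list
    of (v0, v1, v2) vertex tuples; the facet normal is recomputed
    from each triangle's winding order."""
    def sub(a, b):
        return [a[i] - b[i] for i in range(3)]
    def cross(a, b):
        return [a[1]*b[2] - a[2]*b[1],
                a[2]*b[0] - a[0]*b[2],
                a[0]*b[1] - a[1]*b[0]]
    def unit(v):
        m = sum(x * x for x in v) ** 0.5 or 1.0  # avoid div-by-zero
        return [x / m for x in v]

    lines = [f"solid {name}"]
    for v0, v1, v2 in triangles:
        n = unit(cross(sub(v1, v0), sub(v2, v0)))
        lines.append(f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}")
        lines.append("    outer loop")
        for v in (v0, v1, v2):
            lines.append(f"      vertex {v[0]:e} {v[1]:e} {v[2]:e}")
        lines.append("    endloop")
        lines.append("  endfacet")
    lines.append(f"endsolid {name}")
    return "\n".join(lines)

# Single triangle in the xy-plane; its facet normal points along +z.
stl = mesh_to_ascii_stl([((0, 0, 0), (1, 0, 0), (0, 1, 0))])
```

In practice the printing toolchain then slices the STL into G-code before anything is sent to the printer.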
This converted mesh file then gets sent to the printer by Pronsole, which also provides
our application with an estimated printing time.
This does not require any menus or other software -- simply dragging the object onto the printer
starts the print, and you are seeing the entire process here as a user would experience it.
Here you can see the completed 3D model that we created, now as a physical, tangible object.
These applications demonstrate the beginnings of what is possible with the Meta glasses.
What will you make next?