“Ok…Meta just won #AWE2017! Best example of bridging today's technology (📱) with tomorrow's (#AR) that I've seen.”
More Than Just Hardware: Introducing Workspace
Workspace is the Meta 2's operating environment for developers, designers, and innovators. An intuitive complement to the way we work and play in AR, Workspace incorporates our neuroscience-based AR Design Guidelines.
Utilizing Meta’s gestural technology, users can reach out and touch holograms right before their eyes. Workspace features immersive 3D apps that are as easy to manipulate as their real-world counterparts, including:
- 3D models that invite users in for hands-on exploration
- 3D monitors that allow users to email and browse without the constraints of 2D workflows
How Devs are Building AR Experiences on the Meta 2
"When people put on our current Analytics VR demo, they instinctively reach out to touch their data, but then have to learn how to use the virtual trackball and gaze controllers. Building our app on the Meta headset let us remove a layer of abstraction, so users can do what they want to directly: reach out and touch and manipulate their data. It's stunning and compelling."
Product Lead for VR, Great Wave
Widest Field of View (FOV)
Meta 2 includes a full 90-degree field of view and a 2560 x 1440 high-DPI display, allowing more information to be shown and reducing the effort needed to divide your attention between the real and virtual worlds – ultimately creating a more immersive AR experience. The Meta 2 see-through headset also leaves everything below your eyebrows completely transparent and unobstructed, so you can easily make eye contact with others. Not to mention, the headset can be worn comfortably over glasses.
Direct Hand Interaction
Hands are the tools people use to engage with their environments, and the Meta 2 enables you to use your hands to interact with holographic apps. The sensor array on the headset tracks your hands and your position, letting you see, grab, and move holograms just like physical objects. You can also track objects, such as a pen or paintbrush, with the Meta 2.
Built for Developers and Designers
The included Meta 2 SDK is built on top of Unity®, the most popular 3D engine in the world, enabling you to create holographic apps quickly and easily with C#. The SDK includes SLAM, hand-interaction tracking, occlusion, collaboration, neurointerface design guidelines, example code, apps, documentation, and support. Join the largest community of AR developers in the world.
Coming Soon: SOLIDWORKS Integration
We recently partnered with SolidWorks and will be offering integration with the Meta 2 – making it easier for you to design models, collaborate with others, and present your work in more compelling ways.
When will the Meta 2 Development Kit ship?
The Meta 2 has started shipping. We will be in communication with pre-order customers on timing.
What is the programming language for the developer kit?
Apps for the Meta 2 Development Kit are built in Unity 3D (version 5 or higher) using C#.
How are you doing positional tracking without an external sensor?
The Meta 2 Development Kit uses a positional tracking algorithm that fuses image features from the world, captured with an onboard computer vision camera, with the wearer's accelerations and angular movements, captured with an onboard inertial measurement unit (IMU), to provide six-degree-of-freedom positional tracking. Unlike other systems, Meta's solution uses a type of SLAM algorithm that does not require calibration or mapping the space ahead of time. It begins tracking the wearer's position immediately.
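The general idea behind visual-inertial fusion can be illustrated with a minimal sketch. This is not Meta's actual SLAM algorithm, just a generic complementary-filter example: the IMU gives fast but drifting motion updates, while camera-derived position fixes arrive more slowly and continuously cancel that drift.

```python
# Sketch of visual-inertial fusion via a complementary filter.
# Assumption: gravity has already been removed from the accelerometer
# readings, and the vision system supplies a drift-free position fix.

def integrate_imu(position, velocity, accel, dt):
    """Dead-reckon position from high-rate accelerometer samples."""
    velocity = [v + a * dt for v, a in zip(velocity, accel)]
    position = [p + v * dt for p, v in zip(position, velocity)]
    return position, velocity

def fuse_position(imu_position, vision_position, alpha=0.98):
    """Blend the IMU dead-reckoned position with the vision fix.

    alpha near 1.0 trusts the high-rate IMU between camera frames;
    the (1 - alpha) vision term corrects accumulated IMU drift.
    """
    return [alpha * i + (1 - alpha) * v
            for i, v in zip(imu_position, vision_position)]
```

In a real headset pipeline the same fusion is done over full 6-DoF poses (orientation plus position), typically with an extended Kalman filter rather than this fixed-gain blend.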
How does the image projection system work?
The Meta 2 Development Kit uses an off-axis optical engine to form images using a single-element half-silvered mirror, in which the optical axis of the aperture is not coincident with the mechanical center of the system. Left- and right-eye images from an LCD panel reflect off the inside of the visor (the half-silvered mirror) into the eyes, forming a high-resolution stereoscopic 3D image at 20 pixels per degree, ensuring text readability. Because the visor is partially reflective as well as transparent, the wearer sees digital information overlaid on the physical world. With the visor transparent and the panel positioned above the eyes, wearers can make full eye contact and read each other's facial expressions, making the Meta 2 Development Kit ideal for collaboration. The visor and head straps are designed to comfortably accommodate glasses worn underneath, removing the need for extra prescription lenses.
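The quoted 20 pixels per degree can be sanity-checked with rough arithmetic. The geometry here is our assumption, not Meta's published spec: we assume the 2560 x 1440 panel is split left/right between the eyes, and that each eye covers roughly a 64-degree horizontal field of view.

```python
# Back-of-the-envelope check of angular resolution (pixels per degree).
# Assumptions (ours, not Meta's spec): the panel is split between the
# two eyes, and each eye sees about 64 degrees horizontally.

PANEL_WIDTH_PX = 2560
HORIZONTAL_FOV_PER_EYE_DEG = 64  # assumed per-eye horizontal FOV

pixels_per_eye = PANEL_WIDTH_PX // 2           # 1280 px per eye
ppd = pixels_per_eye / HORIZONTAL_FOV_PER_EYE_DEG
print(f"{ppd:.0f} pixels per degree")          # prints "20 pixels per degree"
```

Under these assumptions the arithmetic lands exactly on the 20 pixels-per-degree figure the text cites; a different per-eye FOV would shift the result proportionally.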
What are the PC requirements?
Recommended PC requirements are:
- Graphics: NVIDIA GTX 960 / AMD R9 280
- CPU: Intel Core i7 (desktop CPU)
- Memory: 8GB RAM
- Game Engine: 64-bit Unity 5.3x
- Storage: 10GB
- Video: HDMI 1.4b
- Sound Card: Intel HD-compatible sound card
- USB Ports: USB 3.0
- OS: Windows 8.1 64-bit or newer
The unboxing video shows the use of a MacBook Pro. Does the Meta 2 support Mac?
We're exploring support for all relevant platforms; however, we have not committed to development on any of these right now.