Seven months after the release of the Meta Ray-Ban Display, developers can build the first apps for Meta's first smart glasses with an integrated display. Developer Timur Abdrakhimov (LinkedIn) demonstrates the new possibilities with a port of the first-person shooter "Doom".
The game appears on the smart glasses' waveguide display, which has a diagonal field of view of 20 degrees. It is controlled by finger movements, which the Meta Neural Band translates into computer commands. The Meta Ray-Ban Display's screen is visible only to the right eye, which makes the glasses only partially suitable for longer gaming sessions. As a proof of concept, the experiment is nevertheless interesting, especially since "Doom" has previously been made to run on calculators, lawn mowers, and even electric toothbrushes.
The Meta Ray-Ban Display is currently only available in the USA due to supply bottlenecks. In the EU, regulatory requirements for batteries and AI additionally complicate a market launch. According to a report, Meta is planning a second generation of the smart glasses for this year and may save the global rollout for this successor model.
Two ways to build apps for the Meta Ray-Ban Display
Developers have two approaches to building apps for the Meta Ray-Ban Display. One is the Meta Wearables Device Access Toolkit, an SDK for iOS and Android that Meta has offered for its display-less glasses since the end of last year and has now expanded to include display functions. For the first time, developers can extend existing smartphone apps to the glasses' display and show elements such as text, images, or video playback. Development is done in Swift for iOS and Kotlin for Android.
A second option is the new "Web Apps". Developers can build and test these standalone applications with HTML, CSS, and JavaScript in the browser and then launch them on the glasses via URL. They have access to motion and orientation data, GPS data from the smartphone, input from the Meta Neural Band, and local storage. Meta sees web apps primarily as a path for fast prototyping and lightweight applications. Abdrakhimov's "Doom" port is also based on this approach. Further early experiments with the developer tools can be found in a subreddit Meta set up specifically for this purpose.
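Since these web apps are built with standard browser technologies, the capabilities listed above plausibly map onto familiar web APIs. The following is a minimal, hypothetical sketch under that assumption: it reads head orientation via the standard `deviceorientation` event, fetches the phone's position via the Geolocation API, and persists state in `localStorage`. None of this is confirmed Meta API surface; the helper and storage keys are illustrative.

```javascript
// Hypothetical web-app sketch for the glasses' browser runtime.
// Assumption: the runtime exposes the standard deviceorientation
// event, the Geolocation API, and localStorage, as the article's
// feature list suggests.

// Pure helper: normalize a compass heading into the [0, 360) range.
function normalizeHeading(alpha) {
  return ((alpha % 360) + 360) % 360;
}

function startTracking() {
  // Guard so the script can also load outside a browser context.
  if (typeof window === "undefined") return;

  window.addEventListener("deviceorientation", (event) => {
    if (event.alpha === null) return; // orientation sensor unavailable
    // Persist the last known heading so the app can restore it later.
    localStorage.setItem("lastHeading", String(normalizeHeading(event.alpha)));
  });

  // GPS comes from the paired smartphone via the Geolocation API.
  if (typeof navigator !== "undefined" && "geolocation" in navigator) {
    navigator.geolocation.getCurrentPosition((pos) => {
      localStorage.setItem(
        "lastPosition",
        JSON.stringify({ lat: pos.coords.latitude, lon: pos.coords.longitude })
      );
    });
  }
}

startTracking();
```

Because the app is just a URL, a sketch like this could be iterated on in a desktop browser and then opened on the glasses, which fits Meta's stated prototyping focus.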
Both development paths are initially available only as developer previews, which means developers can build and test their applications but cannot yet distribute them to end users through official channels. More information is available on Meta's wearables developer page.
Meta positions itself ahead of new smart glasses competition
Meta also announced that finger-writing, initially introduced for testers in January, is now rolling out to all users. It works in Instagram, WhatsApp, Messenger, and native messaging apps on Android and iOS, among others. The recording function announced in March, which combines display and camera images in one video, is also becoming more widely available. Visual pedestrian navigation is being expanded to the entire USA and will additionally work in major international cities such as London, Paris, and Rome. Live captions, which transcribe spoken language during conversations or phone calls, are now available for WhatsApp, Facebook Messenger, and Instagram Direct. Meta's new AI model Muse Spark is scheduled for release on the Meta Ray-Ban Display this summer.
The timing of the developer announcements is likely no coincidence. Google could unveil the first smart glasses based on Android XR at the Google I/O developer conference next week. Apple is also pursuing smart glasses plans and could drop first hints at WWDC in June. In addition, Snap is expected to present its first consumer AR glasses later this year.
As smart glasses become more widespread, the controversies surrounding the devices are growing as well. Meta has recently drawn criticism several times, for example over alleged facial recognition plans and over intimate videos captured with the glasses that ended up with clickworkers for data annotation. There is also the concern that smart glasses will make covert recording in public spaces easier.
(tobe)
