Snap continues to invest in AR lenses as a driver for its advertising business. This is evident in its latest usage numbers, including six billion lens engagements per day. As we examined earlier this week, it also announced 3.5 trillion cumulative views for 2.5 million lenses.

But beyond new usage figures, Snap also continues to advance the capabilities of its Lens Studio creation platform. At its Lens Fest last week, it announced v4.10, with broader lens sound libraries, greater depth mapping, and creator monetization tools such as integrated calls-to-action.

In addition to advancing Lens Studio's capabilities, these updates represent a few key themes in Snap's evolution as an AR platform. So we're highlighting the Lens Studio keynote and its strategic takeaways for this week's XR Talks (video and summary below).

Zeroing in on those themes, one key pattern is lens depth and immersion. This was demonstrated in more licensed music in Snap's Sounds library, so lenses can have greater audio dimension. Snap also extended its World Mesh AR immersion capabilities to low-end phones. Unpacking that a bit, World Mesh enables devices to scan extensive depth maps of a given space before placing a lens. Armed with that capability, lenses can more realistically interact with physical space. And an updated physics engine emulates forces like gravity.

Additionally, Lens Studio 4.10 brings greater access to APIs. For example, lens creators can access APIs from Snap partners to integrate things like stock tickers or animations that sync with weather conditions. The idea is for creators to run with this in several directions.