Snap hosted its fourth annual Snap Partner Summit on Thursday to announce new augmented reality (AR) features that aim to help users in their daily lives and go beyond sharing interactive content on Snapchat. The company's early adoption of AR has helped it stand out against the likes of Meta and YouTube, and recent advances in delivering interactive experiences have made Snapchat a distinctive offering in the app world.
At the Snap Partner Summit 2022, Snap announced its plan to help clothing brands and businesses build AR-based solutions to cut down on returns. The Santa Monica, California-based company is also making it easier for developers to create new AR experiences — which it calls Lenses.
To understand more about how Snap is integrating deeper AR experiences within the Snapchat app, and how it is tackling challenges including privacy concerns, Gadgets 360 spoke over a virtual call with Qi Pan, Director of Computer Vision Engineering at Snap.
Qi Pan, Snap director for computer vision engineering, explains how it’s moving forward with AR
Photo Credit: Snap
Pan explained how Snap is building out its AR shopping Lenses and how it plans to help developers create new interactive experiences using Lens Cloud. The executive also talked about Snap's approach to the metaverse — a nascent technology space that recently saw the entrance of companies including Meta and Microsoft. Here are edited excerpts of the conversation.
How has AR growth been on Snap? And what has been India’s role in that journey so far?
The journey of AR has been wonderful. Even when I joined the company, about five and a half years ago, the company's leadership was already very clear that AR was going to be an important part of Snap's future. To the outside world, it may seem that we are getting more and more into AR only now, but internally, AR has always been central to our trajectory. That's because we're looking at a five- or ten-year horizon. We expect a shift from people using mobiles as their primary computing devices to people using AR glasses. The goal of my team is to really unlock that end-user value — establishing technologies to enable new AR experiences, things like location-based AR, where you can interact with your home, your office, or your street. This also includes things like multi-user AR, where you can benefit from actually experiencing AR with someone, rather than recording your solo AR experience and sending someone a video, which is primarily what happens today.
If we're thinking about social AR, there are going to be a lot of benefits for people actually engaging in AR together in the future. That's one of our Lens Cloud offerings: developing tools that will allow people to move beyond what Lenses can do today and really explore new use cases — things like utility and providing information — because I believe these AR glasses will have to provide value at all times. They have to improve your daily life in some small way. To your point on India's contribution to the journey, the growth in the country has been absolutely astonishing. There are 100 million users in India now, which is fantastic, and a lot of that growth is reflected on the AR side as well. I see India as such an exciting, fast-growing market, and it's really important for us to understand the use cases in AR there.
AR requires users to open their camera — allowing the app to see not only them but their surroundings as well. This may not be comfortable for many users in the market, owing to factors including privacy concerns. So, how is Snap trying to convince people to open the camera on their devices and experience AR without hesitation?
One of the most unique things about Snapchat is that it actually opens to the camera. We see that users are genuinely connecting with the camera: we have 250 million people opening the camera and playing with AR every day, which is a really amazing number. I think people already have this kind of behaviour. With our glasses, the approach we are taking is to be very transparent about what is happening. Even in the first few versions of the glasses we launched, which were just camera-capture devices, we made a conscious effort to let other people know when the camera is recording — and we want to be really clear about when it is recording and when it is not — so that the people around you can get comfortable and understand what the hardware is doing. It is the same with the new generation of glasses. These are habitual changes, gradual changes that are happening. So, as soon as you can provide that real, tangible value to people, they will be ready to use the camera to improve their lives.
Snap recently enabled users to anchor AR experiences to their favourite places through custom landmarks. How do you deal with privacy issues if some users try to abuse the feature — perhaps even somewhat unintentionally — by violating the privacy of others at certain physical locations?
We have taken a very cautious approach. We want the world to be covered with interesting and useful experiences. And so, every single location-based AR Lens, like other Lenses, goes through a moderation workflow and has to follow Lens creation guidelines and community guidelines, to make sure the content is appropriate. But yes, I think it's a really important topic, because with any of these tools that add information to the world, we really want to make sure that the content is useful.
Snap is bringing new AR Lenses so users can try on outfits without changing clothes. How will the app deal with accuracy, since sizing is a major concern when using virtual solutions to buy clothes and apparel?
A lot of it comes down to scale accuracy. If you look at our glasses try-on product, it uses the depth sensors on the front of devices to sense the scale of people's faces and estimate size. So you have an experience that is relatively accurate in showing how big glasses are on your face. In addition, there is another class of experiences that visualise objects in the world around you, such as a sofa or a handbag, and those are also visualised at true scale: you can roughly understand the size of a sofa in your living room or of a handbag in front of you. Clothing, though, is quite a complex area. First-generation experiences will help people understand how clothes will look on them. But I think that, in the future, being able to help people choose the right size between a medium and a large is a really important capability, as it will help drive down the rate of returns on online orders.
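The depth-based sizing Pan describes can be illustrated with a toy pinhole-camera calculation (the function name and numbers below are illustrative, not Snap's actual pipeline): once a depth sensor reports how far a face is from the camera, a span measured in pixels can be converted to real-world metres, which is what lets virtual glasses be rendered at the right size.

```python
# Minimal sketch of metric scale from depth, under a pinhole camera model:
# an object spanning w pixels at depth d metres, with focal length f in
# pixels, has a real-world width of  w * d / f.

def real_width_metres(width_px: float, depth_m: float, focal_px: float) -> float:
    """Convert a pixel span to metres using the measured depth."""
    return width_px * depth_m / focal_px

# Example (illustrative values): a face spanning 600 px, seen at 0.4 m by a
# camera with a 1200 px focal length, is about 0.2 m (20 cm) wide -- enough
# information to scale a virtual glasses model correctly.
print(real_width_metres(600, 0.4, 1200))  # 0.2
```

Without the depth reading, the same 600-pixel face could belong to a small face close up or a large face farther away, which is why scale-accurate try-on needs a depth sensor rather than the colour camera alone.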
Having initially started with VR and AR, tech companies are now moving towards the metaverse. Are there any plans to enter this nascent space with Snapchat in the near future?
Yes, so "metaverse" is a term widely used in the industry that means a lot of different things to a lot of different people. At Snap, we're focused on what value we bring to end users. And at the same time, a core part of Snap's thinking is that the real world is a great place. Our goal is not to take people out of the real world; we really want to enhance the real world in small or significant ways. So, our perspective is really about how we make sense of the world around you, and how we can make it better — versus taking you out of the real world and into this kind of metaverse.