Digital Media Concepts/Snapchat Lenses

Snapchat Lenses enable users of the Snapchat application by Snap Inc. to add special effects to photos and videos through features such as Face Lenses and World Lenses. Face Lenses manipulate the user’s eyes, mouth, head, and shoulders to transform their face with effects such as turning them into puppy dogs, superheroes, or aliens, or showing them in different makeup or luxury jewelry. World Lenses let users interact with three-dimensional objects overlaid onto the surrounding environment as displayed by the outward-facing camera. Lens Studio, also by Snap Inc., is an application designed for Snapchat users to build Augmented Reality (AR) experiences. It offers a deep set of built-in features, such as custom shaders and advanced tracking technology, and includes various templates to help users get started making Snapchat Lenses.

General
To use Snapchat Lenses:


 1. Go to the Camera screen
 2. Tap on a face to activate the Lens feature
 3. Swipe through the carousel and tap one of the Lenses
 4. Tap or hold the circle symbol to capture the Snap while the effect is active
 5. Edit and/or send the Snap

For more details, go to: https://support.snapchat.com/en-US/article/face-world-lenses

History
For a detailed description, go to: https://lensstudio.snapchat.com/changelog/

Technology
Computer vision has gained massive traction in recent decades, with applications ranging from depositing checks to self-driving cars. With a daily user count in the tens of millions, Snap's augmented reality lenses have become a ubiquitous application of computer vision. Initially driven by its acquisition of the Ukrainian startup Looksery, Snap has continued to improve its augmented reality technology through 17 acquisitions, including startups such as Zenly and AIFactory. Snap's lenses can map faces and other objects in 3D space, accounting for rotation and occlusion so that overlaid effects animate correctly in real time. Face Lenses accurately manipulate the user’s eyes, mouth, head, and shoulders to transform their face, while World Lenses appear on the outward-facing camera and can detect and map surfaces and environments.

Overview
Implementation details for Snap's lenses remain confidential, but the underlying technology can be broken down into the following three broad areas.

Object Detection
The object detection algorithm identifies all relevant objects (e.g. faces for Face Lenses) in the input image or video frame and provides their bounding boxes. Advancements in object detection algorithms in recent decades have improved their robustness to rotation and occlusion. Common techniques divide images into small sections and use a combination of Histogram of Oriented Gradients (HOG) features and a Support Vector Machine (SVM) classifier to determine whether relevant object features exist. In the case of face detection, simple brightness contrasts serve as telltale cues: the bridge of the nose is typically lighter than its surroundings, the eye sockets are darker than the forehead, and the center of the forehead is lighter than its sides.
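Snap's actual detector is proprietary, but the brightness-contrast idea above can be sketched with a Haar-like rectangle feature computed from an integral image, a classic building block of fast face detectors. All function names below are illustrative; a real pipeline would feed many such features (or HOG descriptors) into a trained SVM rather than inspecting one value.

```python
import numpy as np

def integral_image(img):
    """Cumulative sum table so any rectangle sum costs O(1)."""
    return img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(ii, top, left, height, width):
    """Sum of pixel values inside a rectangle, read off the integral image."""
    bottom, right = top + height - 1, left + width - 1
    total = ii[bottom, right]
    if top > 0:
        total -= ii[top - 1, right]
    if left > 0:
        total -= ii[bottom, left - 1]
    if top > 0 and left > 0:
        total += ii[top - 1, left - 1]
    return total

def haar_two_rect(img, top, left, height, width):
    """Haar-like feature: brightness of the left half minus the right half.
    A window straddling a light/dark boundary (e.g. nose bridge vs. eye
    socket) yields a large magnitude; a flat region yields zero."""
    ii = integral_image(img.astype(np.float64))
    half = width // 2
    left_sum = rect_sum(ii, top, left, height, half)
    right_sum = rect_sum(ii, top, left + half, height, half)
    return left_sum - right_sum
```

A detector slides thousands of such windows over the frame and lets a classifier decide which responses, in combination, indicate a face.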

Landmark Extraction
After identifying and bounding the objects of interest, detailed object landmarks are extracted. Techniques such as an Ensemble of Regression Trees may be used for low-latency landmark extraction. In the case of facial landmark extraction, local region coordinates for features such as the eyes, lips, nose, and mouth are extracted and updated in real time. For example, changes in eyebrow coordinates relative to other facial features allow algorithms to determine whether a user has raised their eyebrows.
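The Ensemble of Regression Trees approach refines landmark positions through a cascade: each stage predicts a correction to the current shape estimate. The sketch below captures only that additive-update structure, with simple callables standing in for trained tree ensembles (the stage functions are hypothetical stand-ins, not Snap's models).

```python
import numpy as np

def cascaded_refinement(initial_landmarks, regressors):
    """Cascade of shape updates: each stage predicts a correction to the
    current landmark estimate and the corrections are applied in sequence."""
    shape = initial_landmarks.copy()
    for regressor in regressors:
        shape = shape + regressor(shape)  # additive update per stage
    return shape

def make_toy_stage(target, step=0.5):
    """Toy stand-in for a trained tree ensemble: moves the estimate a fixed
    fraction of the way toward a known target shape. A real stage would
    instead predict the correction from image features around each point."""
    return lambda shape: step * (target - shape)

# Starting from a rough guess, a few stages converge close to the target.
target = np.array([[1.0, 2.0], [3.0, 4.0]])
stages = [make_toy_stage(target) for _ in range(5)]
result = cascaded_refinement(np.zeros((2, 2)), stages)
```

Because each stage only nudges the estimate, the cascade stays fast enough to run per frame on a phone.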

Image Processing
An Active Shape Model is trained on large quantities of images and adjusted to create a 3D mesh that can shift and scale with the object of interest. In the case of facial modeling, an "average face" model is adjusted to overlay a mesh onto the user's face, mapping to each point frame by frame. Lenses can then distort features of the chosen object by manipulating the mesh overlay.
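The "average face plus adjustments" idea can be sketched as a point-distribution model: a mean shape plus a weighted sum of the principal modes of variation learned from training shapes. The helper names below are illustrative, and a production system would fit a dense 3D mesh rather than a handful of 2D points.

```python
import numpy as np

def train_shape_model(shapes, n_modes=2):
    """Fit a linear point-distribution (Active Shape) model: the mean shape
    plus the top principal modes of variation across the training set."""
    X = shapes.reshape(len(shapes), -1)      # flatten (x, y) pairs per shape
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_modes]                # rows are variation modes

def synthesize_shape(mean, modes, weights):
    """Generate a shape as mean + weighted modes; fitting the model to a new
    face amounts to finding the weights that best match its landmarks."""
    return (mean + weights @ modes).reshape(-1, 2)
```

With zero weights the model reproduces the average shape; nonzero weights stretch or shift it along the learned modes, which is how the mesh tracks and distorts an individual face.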

Development
To capitalize on the growing need for augmented reality capabilities, Snap introduced Lens Studio in December 2017. Lens Studio is an easy-to-use development platform that allows anyone to create lenses, giving creative license to developers outside the company. As research has advanced and smartphones have become increasingly powerful, Lens Studio has gained features such as full-body tracking, pet tracking, custom materials, and support for custom Machine Learning models.

Lens Studio 3.0 introduced SnapML, which allows machine learning experts to add custom ML models when creating lenses. The model acts as a black box that allows developers to extend the capabilities of Lens Studio to create a far wider variety of compelling augmented reality features.

Similar Technology Alternatives

 * TikTok
 * Instagram
 * Facebook
 * Froggipedia
 * BBC Civilisations AR
 * SketchAR
 * MondlyAR
 * Pokémon Go

Monetization
Advertising accounted for 98% of Snap's total $1.7 billion in revenue in 2019. Sponsored lenses represent one of Snap's key advertising revenue streams, with lens usage increasing 37% year-over-year in Q2 2020 as AR platform adoption accelerated. Lens AR experiences offer not just an impression but sustained “play time” as users interact with the ad, making them a powerful and memorable way to connect with consumers at massive scale. For example, some brands use Face Lenses to transform users into their brand icon or movie characters, while others leverage World Lenses to showcase products and product features.

Advertising partners span industries from automobiles and financial services to retail and telecom. Success stories include a McDonald's recruitment campaign that generated over 42,000 applications from job seekers in Saudi Arabia. The format has been used to drive results across business objectives, from awareness and engagement to consideration and sales lift. Businesses can either create their own AR experiences with Lens Studio or partner with Snap's in-house AR team to create large-scale advertising campaigns.

Racism
In 2016, Snap’s “Yellowface” lens was criticized because it gave users facial features (“slanted eyes, large front teeth, and rosy cheeks”) associated with Asian stereotypes. Also in 2016, Snap introduced a "4/20" filter that made users look like Bob Marley. Some users criticized this as racist because it showed them with digital blackface and dreadlocks.

Health Concerns
As use of Snap's augmented reality lenses (and other lenses that allow people to reshape their face or body) increased, Neelam Vashi, MD, director of the Ethnic Skin Center at BMC and Boston University School of Medicine, described a phenomenon she named “Snapchat dysmorphia”, which is associated with Body Dysmorphic Disorder (BDD). She argues that Snap's editing features are causing patients to "seek out surgery to help them appear like the filtered versions of themselves".

Security Concerns
In April 2016, Snap was sued for negligence by the driver of an Outlander who had been struck from behind by a Mercedes 230. The Mercedes driver had been preoccupied with trying to Snap her driving speed, using a Lens feature that overlays a car's speed on top of photos or videos. The Outlander driver claimed that Snap knew the feature was being used in illegal speed contests but did nothing to prevent its use.