Manhattan-based Body Labs was founded in 2013, and in early 2015 it introduced the beta version of BodyKit, a set of APIs for virtualizing and simulating the human body. Body Labs provides advanced technology for analyzing body motion and shape: it collects, digitizes, and organizes data related to the pose, motion, and shape of human bodies. The company added new health and fitness apps a few months after the BodyKit beta launched, and less than a year later it secured exclusive licenses and patents for its 3D body modeling and virtual reality technologies. In the fall of 2016 it introduced Body Labs Blue, an API for online retailers, and now Body Labs is back with something new.
The company’s mission is to turn the human body into a digital platform, and to design, produce, and sell goods and services built around that platform. With that in mind, Body Labs is announcing the launch of SOMA today, its human-aware artificial intelligence platform. SOMA creates realistic 3D models of the body that can help advance smart homes, personalized shopping, autonomous vehicles, and gaming; these models let developers and businesses build apps that predict 3D human shape and motion from videos and photos.
Using SOMA, mobility leaders can detect and predict pedestrian actions with conventional cameras, which can help make roads safer and more people-friendly. The platform is also learning body commands, so that intelligent software and hardware can understand gestures without voice prompts or controllers.
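To make the idea of a body command concrete, here is a minimal sketch, not Body Labs' actual API, of how a gesture might be recognized once 3D joint positions have been estimated from video; the joint names and coordinates below are hypothetical placeholders.

```python
# A minimal sketch of a "body command": classifying a raised-hand gesture from
# estimated 3D joint positions. The joint names and pose values are
# hypothetical placeholders, not output from SOMA itself.
from typing import Dict, Tuple

Joint = Tuple[float, float, float]  # (x, y, z) in meters, y pointing up

def detect_raise_hand(joints: Dict[str, Joint]) -> bool:
    """Return True if either wrist is held above the head."""
    head_y = joints["head"][1]
    return joints["left_wrist"][1] > head_y or joints["right_wrist"][1] > head_y

if __name__ == "__main__":
    # Example frame: right arm raised above the head.
    frame = {
        "head": (0.0, 1.70, 0.0),
        "left_wrist": (-0.40, 1.10, 0.10),
        "right_wrist": (0.35, 1.95, 0.05),
    }
    print("raise-hand command:", detect_raise_hand(frame))  # -> True
```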
The platform can use photos and videos to accurately predict and measure a person’s 3D shape, so stores can personalize apparel and sizing for their customers. Any shape can be compared against others or measured to fit a customer’s body precisely, giving brands a more accurate understanding of 3D human shape and motion. Social shopping becomes a personalized activity with SOMA, which can connect consumers to recommended and favorite products, and to peer-to-peer communication, based on their body shape. Discovery and sizing platforms can also be customized: SOMA can run inside an existing recommendation engine, letting businesses combine body shape data with customers’ past purchases.
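As a rough illustration of how body shape data might sit alongside purchase history inside a recommendation step, here is a minimal sketch; the size chart, measurements, and catalog are hypothetical, not part of SOMA's actual interface.

```python
# A minimal sketch of combining body-shape data with purchase history in a
# recommendation step. The size chart, measurements, and catalog below are
# hypothetical; a real integration would pull predicted measurements from the
# body-model API and products from the retailer's own systems.
from typing import Dict, List

SIZE_CHART_CM = {"S": (86, 94), "M": (94, 102), "L": (102, 110)}  # chest range per size

def recommend_size(chest_cm: float) -> str:
    """Map a predicted chest circumference to the closest size band."""
    for size, (lo, hi) in SIZE_CHART_CM.items():
        if lo <= chest_cm < hi:
            return size
    return "S" if chest_cm < 86 else "L"

def recommend_products(measurements: Dict[str, float],
                       past_purchases: List[str],
                       catalog: Dict[str, List[str]]) -> List[str]:
    """Suggest items in the shopper's predicted size, skipping repeat buys."""
    size = recommend_size(measurements["chest_cm"])
    return [item for item in catalog.get(size, []) if item not in past_purchases]

if __name__ == "__main__":
    catalog = {"M": ["oxford-shirt-M", "rain-jacket-M"], "L": ["rain-jacket-L"]}
    picks = recommend_products({"chest_cm": 98.0}, ["oxford-shirt-M"], catalog)
    print(picks)  # -> ['rain-jacket-M']
```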
Customers can also easily filter reviews and add context based on their own body shape. The platform supports data-driven design as well: it compares a person’s purchasing behavior with their 3D body shape, so brands and businesses can better manage their distribution, manufacturing, and sizing channels. In addition, SOMA can help businesses conduct large-scale sizing studies, using customer photos to improve the sizing and fit of clothing.
SOMA can also transform the gaming world: from user-generated video, the platform can detect 3D motion and shape, so a user’s actions can be transferred directly into a VR environment or interactive game.
It captures motion without markers, and can detect facial features, players’ 3D body shape, landmarks, and joint rotations from simple video. Every motion SOMA captures comes with a skeleton, which can be used to integrate with different gaming and animation pipelines. For a more personalized gaming experience, SOMA can replicate 3D motion to power real-time augmented reality or VR settings. MMO characters, sports players, and in-game avatars can be personalized thanks to SOMA’s ability to capture each gamer’s individual body shape.
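To give a sense of what a captured skeleton could look like to a gaming or animation pipeline, here is a minimal sketch of per-frame joint rotations turned into animation keyframes; the joint names, timestamps, and data layout are hypothetical, not SOMA's export format.

```python
# A minimal sketch of skeleton data a markerless capture could hand to a game
# or animation pipeline: per-frame axis-angle rotations for each joint. The
# joint names, frame times, and values are hypothetical placeholders.
from dataclasses import dataclass
from typing import Dict, List, Tuple

AxisAngle = Tuple[float, float, float]  # rotation axis scaled by angle, in radians

@dataclass
class SkeletonFrame:
    time_s: float                      # timestamp of the captured frame
    rotations: Dict[str, AxisAngle]    # joint name -> local joint rotation

def to_keyframes(motion: List[SkeletonFrame], joint: str) -> List[Tuple[float, AxisAngle]]:
    """Extract (time, rotation) keyframes for one joint, ready for an animation track."""
    return [(f.time_s, f.rotations[joint]) for f in motion if joint in f.rotations]

if __name__ == "__main__":
    motion = [
        SkeletonFrame(0.00, {"right_elbow": (0.0, 0.0, 0.0)}),
        SkeletonFrame(0.33, {"right_elbow": (0.0, 0.0, 0.9)}),  # elbow bends over time
    ]
    print(to_keyframes(motion, "right_elbow"))
```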
The platform is easy to use, capturing user-generated motion right from a smartphone or a server. SOMA can even turn a user’s actions into superpowers, personalizing player interactions around their actions, attacks, and sports moves. It is optimized for mobile use: SOMA’s neural networks fit its patented 3D body mesh (SMPL) to videos, scans, and photos, providing a common format for describing a user’s 3D shape and motion.
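For readers curious about the SMPL mesh mentioned above, here is a minimal sketch using the open-source smplx Python package, a separate implementation of the SMPL body model rather than Body Labs' own SDK, and it assumes the SMPL model files have already been downloaded locally: body shape is a small vector of coefficients and pose is one rotation per joint.

```python
# A minimal sketch of the SMPL parameterization, using the open-source `smplx`
# package (pip install smplx torch), not Body Labs' SDK. It assumes the SMPL
# model files have been downloaded into ./models. A pipeline like SOMA's would
# regress these shape and pose parameters from photos or video.
import torch
import smplx

model = smplx.create("./models", model_type="smpl", gender="neutral")

betas = torch.zeros(1, 10)          # body shape coefficients (all zero = mean shape)
betas[0, 0] = 1.5                   # nudge one shape coefficient as an example
body_pose = torch.zeros(1, 69)      # 23 body joints x 3 axis-angle values
body_pose[0, 50] = 0.8              # bend one joint as an example
global_orient = torch.zeros(1, 3)   # root orientation of the whole body

output = model(betas=betas, body_pose=body_pose,
               global_orient=global_orient, return_verts=True)

print(output.vertices.shape)  # (1, 6890, 3) mesh vertices in a common format
print(output.joints.shape)    # 3D joint locations derived from the same mesh
```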