Tutorial & Documentation
Soli Sandbox is a way to connect your web prototypes to Soli interactions on Pixel 4.[1] Create a prototype on your computer, and bring it to life with Soli gestures through the Soli Sandbox app. Please note that Soli Sandbox is not a tool for creating production apps.
💻 Remix the starter code or start from scratch
📱Test your prototype in the Soli Sandbox app
💻📱Add features and test them out
Let’s learn how to create a project in the Soli Sandbox app. Getting set up is easy and requires three steps.
📱Pixel 4
Check that your Android OS version is up to date. Learn how to do it in this help center article.
Check that Motion Sense Bridge is up to date.
Download the Soli Sandbox app.
Play with featured examples in the app to familiarize yourself with Soli interactions (presence, reach, swipe, and tap). After exploring available interactions, you’ll start your first project.
Now let’s create your first project.
💻 Computer
In Glitch, click the Share button > Live App.
You’ll see your prototype’s default URL. You can rename the URL to something more memorable.
📱Pixel 4
Open the Soli Sandbox app > Menu > enter your prototype’s URL in the “Open URL or file” field. You’ll see a white circle on a black screen that ripples in response to a tap gesture.
💻Computer
Take a look at your remixed prototype’s code. The comments on each line explain how it works. Find line 16, where you’ll see a window.onSoliEvent function. window.onSoliEvent runs every time Soli detects an event. This code checks for a specific Soli event using event.type == 'x'. On line 17, we see event.type == 'tap', so tap starts the ripple animation.
```javascript
16  window.onSoliEvent = function(event) { // this function will run any time a gesture is detected
17    if(event.type == 'tap') {
18      console.log("tap detected");
19      t = 0; //start the ripple animation
20    }
21    if(event.type == 'swipe') {
22      // if(event.data.direction == '1') {
23      //   console.log("right swipe detected");
24      //   moveRight();
25      // } else if (event.data.direction == '5') {
26      //   console.log("left swipe detected");
27      //   moveLeft();
28      // }
29    }
30    if(event.type == 'presence') {
31      // add code here to respond to presence
32    }
33    if(event.type == 'reach') {
34      // add code here to respond to reach
35    }
36  };
```
💻Computer
Let’s change our prototype to respond to a swipe instead. Comment out lines 18–19 and un-comment lines 22–28 (command + / in Glitch). Note that lines 23 and 26 use console.log() to display messages in the Soli Sandbox app’s console log.
A swipe event is sent each time Soli detects motion that resembles a hand waving above the device, similar to brushing crumbs off a table or swatting a fly. Each Soli event contains data that depends on its type. Swipe contains direction data, which you can use to distinguish left and right swipes.
```javascript
16  window.onSoliEvent = function(event) { // this function will run any time a gesture is detected
17    if(event.type == 'tap') {
18      // console.log("tap detected");
19      // t = 0; //start the ripple animation
20    }
21    if(event.type == 'swipe') {
22      if(event.data.direction == '1') {
23        console.log("right swipe detected");
24        moveRight();
25      } else if (event.data.direction == '5') {
26        console.log("left swipe detected");
27        moveLeft();
28      }
29    }
30    if(event.type == 'presence') {
31      // add code here to respond to presence
32    }
33    if(event.type == 'reach') {
34      // add code here to respond to reach
35    }
36  };
```
📱Pixel 4
Refresh the app to test your changes. The circle will move in the direction of the swipe. If you press the console log button in the menu, you will see messages appear when a swipe is detected.
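The moveRight() and moveLeft() functions are defined elsewhere in the starter prototype. A minimal sketch of what such helpers could look like (the variable name and step size here are assumptions, not the starter code’s actual implementation):

```javascript
// Hypothetical helpers: track the circle's horizontal position and shift it per swipe.
let circleX = 0;   // current horizontal offset of the circle, in pixels
const STEP = 40;   // assumed distance the circle moves per swipe

function moveRight() {
  circleX += STEP; // shift right; the draw loop would read circleX each frame
}

function moveLeft() {
  circleX -= STEP; // shift left
}
```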
A Soli event is a JSON (JavaScript Object Notation) object that Soli sends to Soli Sandbox when it detects a “gesture”. There are four Soli events to use in your prototypes.
- Presence
- Reach
- Swipe
- Tap
Each Soli event captures data such as velocity, angle, or direction.
Example :
```javascript
{ "type": "swipe", "data": { "direction": 1 } } // 1 = right
```
Add the following code to show event data in the console log when a Soli event is detected. To show the console log, tap the Bottom app bar > console log button in the Soli Sandbox app.
```javascript
window.onSoliEvent = function (event) {
  console.log(JSON.stringify(event.data));
}
```
A tap event is sent when Soli detects a movement that resembles a single hand bounce above the center of the phone. A tap feels similar to dribbling a basketball.
Tap detection
```javascript
window.onSoliEvent = function (event) {
  if (event.type == 'tap') {
    console.log("tap detected"); //this message will appear in the console log
  }
}
```
A swipe event is sent any time Soli detects movement that resembles a hand waving above the device, similar to opening a curtain in either direction.
Direction: Value describing the direction of the swipe relative to the device.
- 1 = right (East)
- 5 = left (West)
Note: Down (south, 3) and Up (north, 7) are available, but are not optimized. Diagonal swipes may be classified as left or right.
Detected: This parameter describes the likelihood of the swipe gesture. If this value is false, Soli is less confident that the gesture was detected. You can use this value to filter out movements which may be very similar to swipes (e.g. passing a coffee cup over the phone). These are considered false positives.
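For example, you can gate your swipe handling on the detected flag. A sketch of such a guard (the helper name is hypothetical):

```javascript
// Returns true only for swipes Soli is confident about, filtering out
// low-confidence motions such as a coffee cup passing over the phone.
function handleSwipe(event) {
  if (event.type !== 'swipe') return false;
  if (!event.data.detected) return false; // likely false positive; ignore it
  return true; // safe to trigger the swipe response
}
```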
Swipe detection
```javascript
window.onSoliEvent = function (event) {
  if (event.type === 'swipe') {
    console.log("swipe detected"); //this message will appear in the console log
  }
}
```
Directional swipe detection
```javascript
window.onSoliEvent = function (event) {
  if (event.type == 'swipe' && event.data.direction == 1) {
    console.log("right swipe detected"); //this message will appear in the console log
  }
}
```
A reach event is sent any time Soli detects a movement that resembles a hand reaching toward the device, within about 5-10cm.
Note: After a reach event is sent, Soli will continue to send data for four seconds or until the hand moves away from the phone.
axialVelocity: Value in meters/second that describes the speed of the object approaching the device.
distance: Value in meters of how far the object is from the center of the phone.
angle: An array of two values in degrees describing the angle of incidence of the object moving toward the device. The first value is the azimuth angle and the second is the elevation angle. Note: these are not Euler angles. Equations for converting to a 3D Cartesian position are below.
Note: Sometimes the velocity/angle/distance data will be 4294967040, a sentinel value meaning “not a number” (NaN). It means reach is still active but Soli can’t detect the hand signal. Make sure to filter these values out in your code.
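A minimal sketch of such a filter (the helper name is an assumption):

```javascript
// Soli reports this sentinel when reach is active but the hand signal is lost.
const SOLI_NAN = 4294967040;

// Returns true only for usable reach readings (velocity, angle, or distance).
function isValidReachValue(value) {
  return value !== SOLI_NAN && Number.isFinite(value);
}
```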
Store reach location in global variables:
```javascript
let reachAzimuth, reachElevation, reachDistance;

window.onSoliEvent = function (event) {
  if (event.type == 'reach' && event.data.detected == true) {
    reachAzimuth = event.data.angle[0]; //updates the reach azimuth angle
    reachElevation = event.data.angle[1]; //updates the reach elevation angle
    reachDistance = event.data.distance; //updates the reach distance
  }
}
```
Convert to Cartesian coordinates:

```javascript
//convert to Cartesian coordinates
function sphericalToCartesian(azimuth, elevation, distance) {
  azimuth = azimuth * Math.PI / 180;     //degrees to radians
  elevation = elevation * Math.PI / 180;
  const x = distance * Math.sin(-azimuth);
  const y = distance * Math.sin(elevation) * Math.cos(azimuth);
  const z = distance * Math.cos(elevation) * Math.cos(azimuth);
  return [x, y, z];
}

window.onSoliEvent = function (event) {
  if (event.type == 'reach' && event.data.detected == true) {
    console.log(sphericalToCartesian(event.data.angle[0], event.data.angle[1], event.data.distance));
  }
}
```
A presence event is sent any time Soli detects one person within ~0.7 meter (2.3 ft) of the device.
Distance: Value in meters of how far away the detected person is. If this value is 0, it means that presence sensing is active, but no user is detected.
Log user presence:
```javascript
window.onSoliEvent = function (event) {
  if (event.type == 'presence') {
    console.log("presence detected");
  }
}
```
Log user distance:
```javascript
window.onSoliEvent = function (event) {
  if (event.type == 'presence') {
    const distance = event.data.distance;
    console.log("user distance: " + distance);
  }
}
```
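The distance value can also drive proximity behavior. A sketch of a threshold check (the helper name and threshold are assumptions; remember that a distance of 0 means no user is detected):

```javascript
// Returns true only when presence sensing reports a user within `threshold` meters.
// A distance of 0 means sensing is active but nobody is detected.
function isUserNearby(event, threshold) {
  if (event.type !== 'presence') return false;
  const distance = event.data.distance;
  return distance > 0 && distance <= threshold;
}

// Example usage: react when someone comes within half a meter.
// window.onSoliEvent = function (event) {
//   if (isUserNearby(event, 0.5)) { console.log("user nearby"); }
// };
```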
These examples cover advanced concepts for prototyping with Soli.
Multiple Soli Events: a simple example with all four Soli events active at once.
The Soli sensor is located at the top of the phone, creating an interactive hemisphere that can sense and understand movements up to 0.7 meter (2.3 ft) around the phone.
Motion Sense disables gesture events in two cases: when the sensor is covered, and when the phone is in motion.
You can get local files from your computer to Pixel 4 in the following ways:
Android File Transfer app
The easiest way to get local files from your computer to your Pixel 4 is the Android File Transfer app. When you download the Soli Sandbox app, a folder called Soli_Sandbox is automatically created in Downloads. Drag your project folder into the Soli_Sandbox folder in Downloads, and it will appear in the local files menu in Soli Sandbox.
Note: To easily sort projects, files must be located in a project folder. Soli Sandbox will ignore subfolders within the project folder. The name of your prototype is determined by the <title> tag in the html file.
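For example, a minimal project file might look like this (the file name and title are illustrative, not required values):

```html
<!-- my-project/index.html -->
<!DOCTYPE html>
<html>
  <head>
    <title>My Soli Prototype</title> <!-- shown as the prototype's name in Soli Sandbox -->
  </head>
  <body>
    <script>
      window.onSoliEvent = function (event) {
        console.log(JSON.stringify(event));
      };
    </script>
  </body>
</html>
```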
Terminal
If you’re comfortable with Terminal, you can also use adb push [your file] /storage/emulated/0/downloads/soli_sandbox to send files to your Soli_Sandbox folder.
If your app requires a server (e.g. a dynamic app which loads files from other folders), you can run a simple server on your computer that can be accessed from devices on the same wifi network.
Note: Soli Sandbox requires an https:// connection. http:// connections will be blocked for security reasons.
To simplify development, you may want to simulate Soli events with your keyboard, so you can debug non-Soli parts of your app before testing on your device. Look up the keyCode for keys with Keycode.info under “key.which”.
```javascript
//This code uses p5.js
function keyPressed() {
  if (keyCode === LEFT_ARROW) {
    //swipe left
  } else if (keyCode === RIGHT_ARROW) {
    //swipe right
  } else if (keyCode === DOWN_ARROW) {
    //presence
  } else if (keyCode === UP_ARROW) {
    //reach
  } else if (keyCode === 84) { // 't' is 84
    //tap
  }
}
```
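One way to fill in those branches is to feed synthetic events to window.onSoliEvent. A sketch of a helper that builds them, with payloads modeled on the event shapes documented above (the helper name and example payload values are assumptions):

```javascript
// Hypothetical helper: build a synthetic Soli event so keyboard handlers
// can exercise window.onSoliEvent without the Soli sensor.
function makeSoliEvent(type, data = {}) {
  return { type: type, data: data };
}

// Example usage inside keyPressed():
//   window.onSoliEvent(makeSoliEvent('swipe', { direction: 1 })); // simulate right swipe
//   window.onSoliEvent(makeSoliEvent('tap'));                     // simulate tap
```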
Consider the following themes in your ideation phase:
Accelerate common actions: Soli interactions allow technology to feel more natural and human. Use Soli interactions to accelerate common actions (e.g. we combined reach detection with Pixel 4’s Face Unlock to create a faster and more seamless unlock experience) or to be more considerate, such as dimming the volume of a ringer as you reach.
Extend your phone: Soli interactions can be used as alternatives which augment current device functions, especially when hands are wet, messy, or occupied (driving, biking).
Bring delight: Presence detection and hand gestures open up a new world of expression; try using Soli to build worlds, control characters, and build an emotional connection with your users.
Soli classifies movements once a gesture is completed. For example, swipes must cross both sides of the phone before the gesture is recognized. Some gesture classifications may occur at the same time—such as a reach event and a swipe event.
Here are a few considerations when educating users within your own prototypes:
A user should know when to gesture: Gestures may only be available at certain moments within your experience. Make it clear when the user can use gestures.
A user should know how to gesture: Users need to learn when and how they should gesture. Consider text or visual prompts to guide the user to success. View animation examples in the Welcome example in Soli Sandbox; open a gesture to try out, and wait 7 seconds for the animation to overlay.
A user should understand when a gesture is properly recognized: Gesture confirmation feedback and UI change of state are important aspects of managing error states and building the right mental model. See the section below on visual feedback.
A user should feel confident: A successful gesture doesn’t necessarily mean the user has mastered the system. Give users the opportunity to gesture as long as they would like to.
Since a user is not touching a screen, it’s best to provide visual feedback for Soli interactions. This can be done with an existing UI element (such as album art shifting when you skip a song). This can also be done with a new element (such as the blue glow for gestures on Pixel 4). Consider communicating to your users when:
Soli Sandbox prototypes are HTML files which receive and respond to Soli events with JavaScript. The Soli Sandbox app uses Android System WebView to display prototypes. All technologies currently supported by Android System WebView are supported in the Soli Sandbox app, with the exception of WebAR (WebGL and WebVR are supported).
Send us a DM @googleATAP on Twitter or Instagram!
[1] Soli functionality only available on Pixel 4 and only in the US, Canada, Singapore, Australia, Taiwan, Japan and most European countries. For more information see g.co/pixel/motionsense. Soli Sandbox app does not enable creating Android apps.