Tutorial & Documentation

Soli Sandbox Manual

Soli Sandbox is a way to connect your web prototypes to Soli interactions on Pixel 4.[1] Create a prototype on your computer, and bring it to life with Soli gestures through the Soli Sandbox app. Please note that Soli Sandbox is not a tool for creating production apps.

Create

💻 Remix the starter code or start from scratch

Play

📱Test your prototype in the Soli Sandbox app

Iterate

💻📱Add features and test them out


Tutorial

00 Getting started

Let’s learn how to create a project in the Soli Sandbox app. Getting set up is easy and requires three steps.

📱Pixel 4

Check that your Android OS version is up to date. Learn how to do it in this help center article.

Check that Motion Sense Bridge is up to date.

Download the Soli Sandbox app.

Play with featured examples in the app to familiarize yourself with Soli interactions (presence, reach, swipe, and tap). After exploring available interactions, you’ll start your first project.

01 Create a prototype

 💻 Computer

Now let’s create your first project.

  1. Open the starter code on Glitch. You can use any hosting tool (e.g. GitHub Pages, CodePen, etc.), or even transfer files directly onto your device, to load your prototype in Soli Sandbox. For this tutorial, we're using Glitch (an online code editor with built-in hosting) because it makes it simple to duplicate, remix, and host code. You don't need an account to get started, but if you create one, you can save your prototypes on Glitch. Learn more about Glitch.
  2. Click the file name “soli-p5js-starter” > Remix project to make a copy.

02 Play with the prototype

💻 Computer

In Glitch, click the Share button > Live App.

You’ll see your prototype’s default URL. You can rename the URL to something more memorable.

📱Pixel 4

Open the Soli Sandbox app > Menu > enter your prototype’s URL in the “Open URL or file” field. You’ll see a white circle on a black screen that ripples in response to a tap gesture.        

03 Switch from tap to swipe

💻Computer
Take a look at your remixed prototype’s code. The comments on each line explain how it works. Find line 16, where you’ll see a window.onSoliEvent function. window.onSoliEvent runs every time Soli detects an event. The code checks for a specific Soli event using event.type == 'x'. On line 17, we see event.type == 'tap', so tap starts the ripple animation.

index.html

16  window.onSoliEvent = function(event) { // this function will run any time a gesture is detected
17    if(event.type == 'tap') {
18      console.log("tap detected");
19      t = 0; //start the ripple animation
20    }
21    if(event.type == 'swipe') {
22      // if(event.data.direction == '1') {
23      //   console.log("right swipe detected");
24      //   moveRight();
25      // } else if (event.data.direction == '5') {
26      //   console.log("left swipe detected");
27      //   moveLeft();
28      // }
29    }
30    if(event.type == 'presence') {
31      // add code here to respond to presence
32    }
33    if(event.type == 'reach') {
34      // add code here to respond to reach
35    }
36  };

💻Computer
Let’s change our prototype to respond to a swipe instead. Comment out lines 18-19 and un-comment lines 22-28 (command + / on Glitch). Note that lines 23 and 26 use console.log() to display messages in the Soli Sandbox app’s console log.

A swipe event is sent each time Soli detects motion that resembles a hand waving above the device, similar to brushing crumbs off a table or swatting a fly. Each Soli event contains data that depends on its type. Swipe contains direction data, which you can use to tell apart left and right swipes.

index.html

16  window.onSoliEvent = function(event) { // this function will run any time a gesture is detected
17    if(event.type == 'tap') {
18      // console.log("tap detected");
19      // t = 0; //start the ripple animation
20    }
21    if(event.type == 'swipe') {
22      if(event.data.direction == '1') {
23        console.log("right swipe detected");
24        moveRight();
25      } else if (event.data.direction == '5') {
26        console.log("left swipe detected");
27        moveLeft();
28      }
29    }
30    if(event.type == 'presence') {
31      // add code here to respond to presence
32    }
33    if(event.type == 'reach') {
34      // add code here to respond to reach
35    }
36  };

📱Pixel 4

Refresh the app to test your changes. The circle will move in the direction of the swipe. If you press the console log button in the menu, you will see messages appear when a swipe is detected.

Documentation

Soli events

What is a Soli event?

A Soli event is a JSON (JavaScript Object Notation) object that Soli sends to Soli Sandbox when it detects a “gesture”. There are four Soli events to use in your prototypes.

Presence 

Reach

Swipe

Tap

What information is in a Soli event?

Each Soli event captures data such as velocity, angle, or direction.

Example:

{"type": "swipe", "data": {"direction": 1}} // 1 = right

How can I tell if an event occurred?

Add the following code to show event data in the console log when a Soli event is detected. To show the console log, tap the Bottom app bar > console log button in the Soli Sandbox app.

window.onSoliEvent = function (event) {
  console.log(JSON.stringify(event.data));
}

Tap

Tap is a ‘bounce-like’ movement that moves towards the device.

A tap event is sent when Soli detects a movement that resembles a single hand bounce above the center of the phone. A tap feels similar to dribbling a basketball.

Code snippets

Tap detection

window.onSoliEvent = function (event) {
  if (event.type == 'tap') {
    console.log("tap detected"); //this message will appear in the console log
  }
}

Swipe

Swipe is a sweeping gesture that crosses both sides of the device.

A swipe event is sent any time Soli detects movement that resembles a hand waving above the device, similar to opening a curtain in either direction.

Data

Direction: Value describing the direction of the swipe relative to the device.

1: right (east)
5: left (west)

Note: Down (south, 3) and Up (north, 7) are available, but are not optimized. Diagonal swipes may be classified as left or right.

Detected: This parameter describes Soli’s confidence in the swipe gesture. If this value is false, Soli is less confident that the gesture occurred. You can use this value to filter out movements that closely resemble swipes (e.g. passing a coffee cup over the phone); these are considered false positives.
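As a sketch of how you might apply this (the helper name below is hypothetical, not part of the Soli API), treat a swipe as actionable only when `detected` is not false:

```javascript
// Sketch: drop low-confidence swipes using the `detected` field, so
// accidental motions (like a coffee cup passing over the phone) don't
// trigger your swipe response. isConfidentSwipe is a made-up helper name.
function isConfidentSwipe(event) {
  return event.type === 'swipe' && event.data.detected !== false;
}
```

You could call this at the top of your window.onSoliEvent handler and return early when it reports false.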

Code snippets

Swipe detection

window.onSoliEvent = function (event) {
  if (event.type === 'swipe') {
    console.log("swipe detected"); //this message will appear in the console log
  }
}

Directional swipe detection

window.onSoliEvent = function (event) {
  if (event.type == 'swipe' && event.data.direction == 1) {
    console.log("right swipe detected"); //this message will appear in the console log
  }
}

Reach

Reach is a hand movement that approaches the device from any direction.

A reach event is sent any time Soli detects a movement that resembles a hand reaching toward the device, within about 5-10 cm.

Note: After a reach event is sent, Soli will continue to send data for four seconds or until the hand moves away from the phone.

Data

axialVelocity: Value in meters/second that describes the speed of the object approaching the device.

distance: Value in meters of how far the object is from the center of the phone.

angle: An array of two values in degrees describing the angle of incidence of the object moving towards the device. The first value is the azimuth angle and the second is the elevation angle. Note: These are not Euler angles. Equations for converting to a 3D Cartesian position are below.

Note: Sometimes velocity/angle/distance data will be 4294967040 (a sentinel for “NaN”, i.e. Not a Number). This means reach is still active but Soli can’t detect the hand signal. Make sure to filter these values out in your code.
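One way to do that filtering (a sketch; `readReach` and the other helper names are hypothetical) is to treat the sentinel value as “no reading” and return null, so your prototype keeps its previous values:

```javascript
// Sketch: treat the sentinel 4294967040 as an invalid reading so stale
// or garbage values never reach your prototype's state.
const SOLI_INVALID = 4294967040;

function isValidReachValue(value) {
  return typeof value === 'number' && value !== SOLI_INVALID && !Number.isNaN(value);
}

function readReach(data) {
  // data is the event.data of a reach event: { angle: [azimuth, elevation], distance }
  const azimuth = data.angle[0];
  const elevation = data.angle[1];
  if (![azimuth, elevation, data.distance].every(isValidReachValue)) {
    return null; // signal lost -- keep your previous values instead
  }
  return { azimuth: azimuth, elevation: elevation, distance: data.distance };
}
```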

Code snippets

Store reach location in global variables:

let reachAzimuth, reachElevation, reachDistance;

window.onSoliEvent = function (event) {
  if (event.type == 'reach' && event.data.detected == true) {
    reachAzimuth = event.data.angle[0]; //updates the reach azimuth angle
    reachElevation = event.data.angle[1]; //updates the reach elevation angle
    reachDistance = event.data.distance; //updates the reach distance
  }
}

Convert to cartesian coordinates:

//convert to cartesian coordinates
function sphericalToCartesian(azimuth, elevation, distance) {
  azimuth = azimuth * Math.PI / 180;
  elevation = elevation * Math.PI / 180;
  var x = distance * Math.sin(-azimuth);
  var y = distance * Math.sin(elevation) * Math.cos(azimuth);
  var z = distance * Math.cos(elevation) * Math.cos(azimuth);
  var reachVector = [x, y, z];
  return reachVector;
}

window.onSoliEvent = function (event) {
  if (event.type == 'reach' && event.data.detected == true) {
    console.log(sphericalToCartesian(event.data.angle[0], event.data.angle[1], event.data.distance));
  }
}

Presence

Presence detects when a user is within the sensing area.

A presence event is sent any time Soli detects a person within ~0.7 meter (2.3 ft) of the device.

Data

Distance: Value in meters of how far away the detected person is. If this value is 0, it means that presence sensing is active, but no user is detected.

Code snippets

Log user presence:

window.onSoliEvent = function (event) {
  if (event.type == 'presence') {
    console.log("presence detected");
  }
}

Log user distance:

window.onSoliEvent = function (event) {
  if (event.type == 'presence') {
    const distance = event.data.distance;
    console.log("user distance: " + distance);
  }
}
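Since a distance of 0 means sensing is active but nobody is in range, you may want to fold that rule into a small helper before using the value. A sketch (`presenceState` is a hypothetical name):

```javascript
// Sketch: convert the raw presence distance into a present/absent reading.
// Per the docs, distance 0 means sensing is active but no user is detected.
function presenceState(distance) {
  if (distance === 0) {
    return { present: false, distance: null };
  }
  return { present: true, distance: distance };
}
```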

Code Examples

These examples cover advanced concepts for prototyping with Soli.

Multiple Soli Events: A simple example with all four Soli events active at once.

Soli Websocket: A simple website connected to a Pixel 4 over WebSocket, so that you can use Pixel 4 as a controller for any web browser (e.g. tablet, monitor).

Sensor Details

The Soli sensor is located at the top of the phone, creating an interactive hemisphere that can sense and understand movements up to ~0.7 meter (2.3 ft) around the phone.

Motion Sense disables gesture events when the sensor is covered or the phone is in motion. This is determined by two factors:

  1. Significant motion, via the accelerometer: If the device is experiencing significant motion, the sensor picks up more noise. Soli events are suppressed while this motion is occurring.
  2. Sensor covered, via the proximity sensor: If the Soli radar is covered, it’s likely that the device is upside-down, in someone’s pocket, or put away.

Debugging & local development

Debugging checklist

  1. If you’re opening a prototype from a URL, make sure you’re connected to the internet.
  2. Motion Sense functionality is only available on Pixel 4 in the US, Canada, Singapore, Australia, Taiwan, Japan and most European countries. For more information see g.co/pixel/motionsense.
  3. Make sure your Pixel 4 is not in Airplane Mode or Battery Saver, and Motion Sense is enabled in Android Settings.
  4. Check to make sure you’re not covering the sensor or moving the phone, which disables Soli.
  5. Check that Soli is turned on in Settings. Go to Settings > System > Motion Sense > Enable Motion Sense.
  6. Check that all of the items under Settings > System > Motion Sense > Quick Gestures are enabled.
  7. Check that Soli Sandbox is receiving Soli events. If Soli is on, you’ll see a green dot on the right of the console log and a label that says “Soli On”. If it’s red and the label says “Soli Off”, and the above steps are completed, please reach out to our Google Group.
  8. Check that you aren’t trying to do anything that can’t be done with Android System Webview. Check here for details. (webAR is not currently supported; see demos.)
  9. Check that your app is responding to Soli events as intended. Add additional console.log(); messages to your prototype. Try disabling most of your code, and re-adding pieces until you find the part that breaks it.

Using Local files

You can get local files from your computer to Pixel 4 in the following ways:

Android File Transfer app

The easiest way to get local files from your computer to your Pixel 4 is the Android File Transfer app. When you download the Soli Sandbox app, a folder called Soli_Sandbox is automatically created in Downloads. Drag your project folder into the Soli_Sandbox folder, and it will appear in the local files menu in Soli Sandbox.

Note: To easily sort projects, files must be located in a project folder. Soli Sandbox will ignore subfolders within the project folder. The name of your prototype is determined by the <title> tag in the HTML file.

Terminal

If you’re comfortable with Terminal, you can also use adb push [your file] storage/emulated/0/downloads/soli_sandbox to send files to your Soli_Sandbox folder.

If your app requires a server (e.g. a dynamic app which loads files from other folders), you can run a simple server on your computer that can be accessed from devices on the same wifi network.

Note: Soli Sandbox requires an https:// connection. http:// connections will be blocked for security reasons.

Local Testing on Desktop

To simplify development, you may want to simulate Soli events with your keyboard, so you can debug non-Soli parts of your app before testing on your device. Look up the keyCode for keys with Keycode.info under “key.which”.

//This code uses p5.js
function keyPressed() {
  if (keyCode === LEFT_ARROW) {
    //swipe left
  } else if (keyCode === RIGHT_ARROW) {
    //swipe right
  } else if (keyCode === DOWN_ARROW) {
    //presence
  } else if (keyCode === UP_ARROW) {
    //reach
  } else if (keyCode === 84) { // 't' is 84
    //tap
  }
}

FAQ: Building prototypes with Soli

Q: I’m not sure where to start. What are some good uses for Soli?

Consider the following themes in your ideation phase:

Accelerate common actions: Soli interactions allow technology to feel more natural and human. Use Soli interactions to accelerate common actions (e.g. we combined reach detection with Pixel 4’s Face Unlock to create a faster and more seamless unlock experience), or to be more considerate, such as dimming the volume of a ringer as you reach.

Extend your phone: Soli interactions can be used as alternatives which augment current device functions, especially when hands are wet, messy, or occupied (driving, biking).

Bring delight: Presence detection and hand gestures open up a new world of expression; try using Soli to build worlds, control characters, and build an emotional connection with your users.

Q: When does Soli recognize a gesture?

Soli classifies movements once a gesture is completed. For example, swipes must cross both sides of the phone before the gesture is recognized. Some gesture classifications may occur at the same time, such as a reach event and a swipe event.
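If overlapping classifications cause double responses in your prototype, one possible approach (a sketch, not an official recommendation; `makeGestureFilter` is a made-up helper) is to suppress a reach that lands within a short window of a swipe:

```javascript
// Sketch: remember when the last swipe arrived and ignore reach events
// that land within `windowMs` of it, so one hand motion doesn't trigger
// two responses.
function makeGestureFilter(windowMs) {
  let lastSwipeAt = -Infinity;
  return function shouldHandle(eventType, nowMs) {
    if (eventType === 'swipe') {
      lastSwipeAt = nowMs;
      return true;
    }
    if (eventType === 'reach') {
      return nowMs - lastSwipeAt > windowMs; // drop reaches right after a swipe
    }
    return true; // tap, presence, etc. pass through
  };
}
```

Inside window.onSoliEvent you could call the returned function as shouldHandle(event.type, Date.now()) and skip your response when it returns false.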

Q: How do I teach my users how to do gestures in my prototype?

Here are a few considerations when educating users within your own prototypes:

A user should know when to gesture: Gestures may only be available at certain moments within your experience. Make it clear when the user can use gestures.

A user should know how to gesture: Users need to learn when and how they should gesture. Consider text or visual prompts to guide the user to success. View animation examples in the Welcome example in Soli Sandbox; open a gesture to try out, and wait 7 seconds for the animation to overlay.

A user should understand when a gesture is properly recognized: Gesture confirmation feedback and UI change of state are important aspects of managing error states and building the right mental model. See the section below on visual feedback.

A user should feel confident: A successful gesture doesn’t necessarily mean the user has mastered the system. Give users the opportunity to gesture as long as they would like to.

Q: How do I design visual feedback for my prototype?

Since a user is not touching a screen, it’s best to provide visual feedback for Soli interactions. This can be done with an existing UI element (such as album art shifting when you skip a song). This can also be done with a new element (such as the blue glow for gestures on Pixel 4). Consider communicating to your users when:

  1. Gestures are available
  2. When gestures are successful
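As a sketch of how those two signals might drive a UI, the helper below maps them onto made-up feedback states (“idle”, “ready”, “confirmed”) that you could render as text, a glow, or any other visual:

```javascript
// Sketch with hypothetical state names: decide what feedback to show based
// on whether gestures are available and whether one just succeeded.
function feedbackState(gesturesAvailable, lastEventType) {
  if (!gesturesAvailable) return 'idle'; // no gesture affordance shown
  if (lastEventType === 'tap' || lastEventType === 'swipe') return 'confirmed'; // success feedback
  return 'ready'; // show a "gesture available" cue
}
```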

Q: What technologies does Soli Sandbox support?

Soli Sandbox prototypes are HTML files that receive and respond to Soli events with JavaScript. The Soli Sandbox app uses Android System Webview to display prototypes. All technologies currently supported by Android System Webview, including WebGL and WebVR but with the exception of webAR, are supported in the Soli Sandbox app.

Q: Don’t see your question here?

Send us a DM @googleATAP on Twitter or Instagram!


[1] Soli functionality only available on Pixel 4 and only in the US, Canada, Singapore, Australia, Taiwan, Japan and most European countries. For more information see g.co/pixel/motionsense.  Soli Sandbox app does not enable creating Android apps.