During a Google I/O
presentation, Google developers detailed how they
built some of Android's location-aware features, like an automatic car
finder, and said that new products like watches and connected
devices promise much more interesting apps in the future.
The speakers showed how
combining data from various sensors and devices can let developers
predict what kind of activity a user is doing and thus trigger certain
functions.
To collect data about user
movements in order to build models, Google enlisted employees who
recorded 65,000 "sensor traces," which are essentially graphs that show
movement based on data collected from a phone's accelerometer.
The employees labeled the activity they were doing at the time so that
Google could create models for activities like walking or biking.
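As a rough illustration of what collecting such a trace might involve, the sketch below uses Android's SensorManager to record timestamped accelerometer samples alongside a label supplied by the tester. The class and field names are hypothetical and this is not Google's internal tooling, just a minimal example of labeled data collection on a phone.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// One labeled accelerometer sample in a trace.
data class Sample(val timestampNs: Long, val x: Float, val y: Float, val z: Float, val label: String)

// Records timestamped accelerometer readings together with the activity label
// the tester supplies ("walking", "biking", ...), producing a labeled trace.
class LabeledTraceRecorder(context: Context, private val activityLabel: String) : SensorEventListener {

    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)
    val samples = mutableListOf<Sample>()

    fun start() {
        accelerometer?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME) }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent?) {
        val v = event?.values ?: return
        samples.add(Sample(event.timestamp, v[0], v[1], v[2], activityLabel))
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) { /* not needed for this sketch */ }
}
```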
Google found that adding data from
additional devices and sensors significantly improved accuracy.
For instance, one of the presenters, Stogaitis, showed a sensor trace built from accelerometer
data that looked just like data from someone walking.
But when he added data from the phone's barometer, he noticed
a slight rise in barometric pressure, which corresponds to a drop in
elevation.
It turned out that the employee who collected this data was walking
down stairs.
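The barometer is exposed through the same Android sensor framework as the accelerometer. As a hedged illustration, the sketch below reads Sensor.TYPE_PRESSURE and converts the reading to an approximate altitude with SensorManager.getAltitude, which is one way a small elevation change like a staircase could be surfaced; the AltitudeTracker name is made up for this example.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Reads the phone's barometer and converts pressure (hPa) to an approximate
// altitude, so small elevation changes -- like walking down a flight of
// stairs -- show up even when accelerometer data alone looks like level walking.
class AltitudeTracker(context: Context) : SensorEventListener {

    private val sensorManager = context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val barometer: Sensor? = sensorManager.getDefaultSensor(Sensor.TYPE_PRESSURE)

    var lastAltitudeMeters: Float = Float.NaN
        private set

    fun start() {
        barometer?.let { sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL) }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent?) {
        val pressureHpa = event?.values?.get(0) ?: return
        // Standard-atmosphere conversion; good enough for tracking relative changes.
        lastAltitudeMeters =
            SensorManager.getAltitude(SensorManager.PRESSURE_STANDARD_ATMOSPHERE, pressureHpa)
    }

    override fun onAccuracyChanged(sensor: Sensor?, accuracy: Int) {}
}
```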
In another example, the accelerometer data again looked like someone walking.
But that same user was also wearing a watch, and its accelerometer data was much steadier:
the user was actually riding a bike.
Once Google collected this
user data, it created machine learning models that can examine sensor
data to predict what users are doing.
That kind of information was
useful when Google built its car finder app.
That app first looks at accelerometer data to determine that a user is
in the car.
It then looks at the tilt sensor in the phone to determine when the
user goes from a sitting to a standing position, indicating that they
are leaving the car.
At that moment the app saves the location of the user.
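Google's exact implementation isn't public, but a rough approximation can be built on the Activity Recognition API that Google makes available to developers. The sketch below swaps the tilt-sensor check for a simpler heuristic, watching for the detected activity to flip from IN_VEHICLE to ON_FOOT and then saving the last known location; the CarFinderReceiver name, the confidence threshold, and the shared-preferences storage are illustrative choices, not Google's.

```kotlin
import android.content.BroadcastReceiver
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognitionResult
import com.google.android.gms.location.DetectedActivity
import com.google.android.gms.location.LocationServices

// Hypothetical car-finder receiver: instead of the phone's tilt sensor, it
// waits for the detected activity to change from IN_VEHICLE to being on foot
// and remembers the location at that moment.
class CarFinderReceiver : BroadcastReceiver() {

    override fun onReceive(context: Context, intent: Intent) {
        if (!ActivityRecognitionResult.hasResult(intent)) return
        val activity = ActivityRecognitionResult.extractResult(intent)
            ?.mostProbableActivity ?: return

        when (activity.type) {
            DetectedActivity.IN_VEHICLE -> wasDriving = true
            DetectedActivity.ON_FOOT, DetectedActivity.WALKING ->
                if (wasDriving && activity.confidence > 70) {
                    wasDriving = false
                    saveParkingSpot(context)
                }
        }
    }

    // Stores the last known location as the parking spot (requires the
    // ACCESS_FINE_LOCATION permission; error handling omitted).
    private fun saveParkingSpot(context: Context) {
        LocationServices.getFusedLocationProviderClient(context)
            .lastLocation
            .addOnSuccessListener { location ->
                location?.let {
                    context.getSharedPreferences("car_finder", Context.MODE_PRIVATE)
                        .edit()
                        .putString("parked_at", "${it.latitude},${it.longitude}")
                        .apply()
                }
            }
    }

    companion object {
        private var wasDriving = false
    }
}
```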
There are a number
of apps or features developers could write to take advantage of
this kind of contextual awareness.
For instance, an IM app might automatically read text messages when
the app detects that a user is in the car.
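As a hedged sketch of that idea, the hypothetical helper below speaks an incoming message with Android's TextToSpeech engine only when the most recently detected activity is IN_VEHICLE; how the app obtains the current activity type (for example, from an activity recognition receiver) is left out here.

```kotlin
import android.content.Context
import android.speech.tts.TextToSpeech
import com.google.android.gms.location.DetectedActivity

// Hypothetical helper: read an incoming message aloud only when the most
// recent detected activity says the user is in a vehicle.
class DrivingMessageReader(context: Context) {

    private var ready = false
    private val tts = TextToSpeech(context) { status ->
        ready = (status == TextToSpeech.SUCCESS)
    }

    fun onMessageReceived(text: String, currentActivityType: Int) {
        if (ready && currentActivityType == DetectedActivity.IN_VEHICLE) {
            tts.speak(text, TextToSpeech.QUEUE_ADD, null, "incoming_message")
        }
    }

    fun shutdown() = tts.shutdown()
}
```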
An app could show users at the end of the week how much time they
spent commuting to work.
That's the kind of application
that could be useful to businesses that want to measure how long
it takes workers to complete certain jobs as a way to improve
efficiency.
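A commute tracker of that sort could be little more than a running tally of time spent in the IN_VEHICLE state. The sketch below is an illustrative accumulator, not anything Google described; an app would feed it each activity update and read the weekly total.

```kotlin
import com.google.android.gms.location.DetectedActivity

// Hypothetical weekly commute tracker: accumulates the time spent in a
// vehicle based on a stream of detected-activity updates.
class CommuteTracker {

    private var drivingSince: Long? = null
    var totalDrivingMillis: Long = 0
        private set

    // Call with each activity update, e.g. from an activity recognition receiver.
    fun onActivityUpdate(activityType: Int, timestampMillis: Long) {
        if (activityType == DetectedActivity.IN_VEHICLE) {
            if (drivingSince == null) drivingSince = timestampMillis
        } else {
            drivingSince?.let { totalDrivingMillis += timestampMillis - it }
            drivingSince = null
        }
    }
}
```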
When developers combine that
better data with new kinds of connected devices, they'll be able to
build even more interesting apps, he predicted.
"The ability to understand context becomes richer," he said.
For instance, a user could say "turn on lights," and because the phone
knows the user's precise location, it could instruct the nearest light to turn
on.
Or a user might knock on their own door, and the system
would recognize the knock and the accompanying movement as the homeowner's and unlock
the door.
Many of the APIs required to build the kinds of apps Kadous and Stogaitis discussed are already available from Google.
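For example, the Activity Recognition API in Google Play services can already deliver detected activities to an app. Below is a minimal sketch of subscribing to updates, assuming a receiver like the CarFinderReceiver sketched earlier and the appropriate activity recognition permission in the manifest.

```kotlin
import android.app.PendingIntent
import android.content.Context
import android.content.Intent
import com.google.android.gms.location.ActivityRecognition

// Subscribes to activity updates from Google Play services; detected
// activities are delivered to the given BroadcastReceiver via a PendingIntent.
fun startActivityUpdates(context: Context) {
    val intent = Intent(context, CarFinderReceiver::class.java)
    val pendingIntent = PendingIntent.getBroadcast(
        context, 0, intent,
        PendingIntent.FLAG_UPDATE_CURRENT or PendingIntent.FLAG_MUTABLE
    )
    ActivityRecognition.getClient(context)
        .requestActivityUpdates(/* detectionIntervalMillis = */ 30_000L, pendingIntent)
}
```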