Introducing on{X} – Automate Your Life


At Bing, we talk a lot about thinking outside the search box. Yesterday we highlighted some updates to Bing on Xbox that will allow more people to search with the sound of their voice. We think search should be ubiquitous, augmenting devices with intelligence that can do things on your behalf.

Welcome to on{X}. Today, we are launching a new website and mobile application that helps developers and technology enthusiasts remotely program their phones. This new model is orthogonal to the classic ‘app from a marketplace’ model we’re all used to.

  • Wouldn’t you like your phone to automatically reply with an SMS containing your current location when your wife texts you a “where?” message while you’re driving?
  • How about programming your phone to automatically show you today’s agenda as you step into the office?
  • Or show you the weather when you wake up in the morning?

If this sounds interesting, keep reading.

As we experience the explosion of mobile devices and smartphones, the power of mobile apps is undeniable. Apps are everywhere, from basic communication tools like email and chat, to social networking like Facebook or Twitter, to games, utilities, and navigation aids; the list goes on and on. The rise of apps across all mobile platforms is driving people to think in terms of atomic ‘app’ elements.

Furthermore, we use apps everywhere. Whenever we’re faced with a task or a question like “who is that actor in the movie we just watched, and what else has he been in?” or “I need to book a table at that restaurant tonight,” our knee-jerk reaction is to reach into our pocket, grab our phone, and launch the right app for the job.

When ‘faced’ with a question or a task, we ‘grab’ our phone and we ‘launch’ an app. These are all proactive, conscious verbs describing actions we take deliberately. Apps are great when we know we need to use them, we know they exist, and we actually put them to use at the right moment. The overwhelming majority of apps in existence today require you to do something in order for them to deliver value. When was the last time you started jogging but forgot to launch your running app, or got stuck in gridlock because you forgot to launch your traffic app?

The modern smartphone is a Swiss Army knife of sensors and actuators. The list is long but it includes a built-in microphone, a speaker, a camera, a GPS, an accelerometer, a compass, a gyro, Wi-Fi, 3G, Bluetooth, NFC and so on. Not only do they have tremendous sensor capabilities, but they are practically with us everywhere we go, always connected, and never turned off. On the face of it, the smartphone should be the ideal platform to continuously sense the world and proactively interact with the user. Why hasn’t this happened already?

One big reason is that it’s simply still too hard to write good software that makes smart use of continuous sensing capabilities without annoying the user or draining the battery, or both. It becomes much harder when we need to consider support for multiple mobile operating systems with a diverse set of capabilities and APIs.

This is what we’re trying to solve.

Project on{X} is a developer-oriented service that enables developers and technology enthusiasts to easily program mobile devices to dynamically react to a continuously changing environment. The code we write is an action that we hook up to a sensor-based event, for example “AC power disconnected” or “Wi-Fi network detected”. Want some more sophisticated examples? How about “user’s mode of transport just changed from walking to driving” or “user left home”?

For each such triggering event, we can easily create reactions. Instead of limiting the reaction to a simple list of predefined actions, we offer the full power of JavaScript. That’s right: you can push arbitrary JavaScript code, remotely, down to your mobile device and hook it up to a continuous signal-sensing framework that you only need to download and install once. The possibilities are wide open because you no longer need to worry about the target platform. Even better, Project on{X} is optimized to not drain your battery.
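The real on{X} scripting API isn’t shown in this post, so the snippet below is only a minimal, self-contained sketch of the trigger-and-reaction pattern described above: the event names and the toy `createSensorHub` helper are hypothetical stand-ins for the actual signal-sensing framework.

```javascript
// Minimal sketch of the trigger/reaction model (all names hypothetical).
// A tiny event hub plays the role of the on-device sensing framework.

function createSensorHub() {
  const handlers = {};
  return {
    // Register a JavaScript reaction for a named sensor-based event.
    on(event, reaction) {
      (handlers[event] = handlers[event] || []).push(reaction);
    },
    // The sensing framework would call this when a real-world signal fires.
    emit(event, payload) {
      (handlers[event] || []).forEach((reaction) => reaction(payload));
    },
  };
}

const sensors = createSensorHub();
const log = [];

// Reaction: when mode of transport changes to driving, mute the ringer.
sensors.on("modeOfTransportChanged", (mode) => {
  if (mode === "driving") log.push("ringer muted");
});

// Reaction: when AC power is disconnected, note that we are on battery.
sensors.on("acPowerDisconnected", () => log.push("running on battery"));

// Simulate the framework detecting two real-world events:
sensors.emit("modeOfTransportChanged", "driving");
sensors.emit("acPowerDisconnected", null);

console.log(log.join("; ")); // "ringer muted; running on battery"
```

The key point is that the reactions are plain JavaScript pushed to the device once; only the sensing framework itself needs to be installed up front.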

In typical systems, a phone acts as the eyes and ears – it senses the world and sends the data to a web service in the cloud for analysis. After the real-time or off-line analysis is completed, and the system realizes it has something to tell the user, the phone is engaged again. This is usually in the form of a push-notification message. Now the phone acts as the mouth of the system – talking to the user. This is how many apps and services work, from navigation systems, to simple personal assistants.

The concept of Inversion-of-Control offers an interesting alternative. Instead of having the data move to the cloud, we push the code down to the device, where the data (the sensor signal, in our case) originates. The first real advantage that comes to mind is privacy: no data leaves the device. Computation takes place on the phone, in real time, as a reaction to a real-world sensor-based event. Sure, the script reacting to the event can decide to communicate with the web and either upload or download data, but that is no longer a mandatory course of action. In fact, in many scenarios we explored, no communication with the cloud happens at all.

This brings me to the second advantage: connectivity. Despite the pretty picture of an always-connected world we love to keep in our heads, the reality for a big portion of mobile users is intermittent and flaky connectivity. But if the signal originates on the phone and the code reacting to it can do its job on the spot without needing to connect to the web, we have just enabled a whole new set of scenarios that work very well in the real world of intermittent connectivity.
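As a rough illustration of this inversion-of-control idea, here is a sketch (all names are hypothetical, and `uploadToCloud` is a placeholder that merely records calls) in which a location reaction does all of its filtering on the device, and only a derived “left home” event would ever touch the network:

```javascript
// Sketch of on-device processing: raw sensor readings never leave the
// phone; only a derived, high-level event might be sent to the cloud.
// `uploadToCloud` is a hypothetical stand-in for a real network request.

const cloudCalls = [];
function uploadToCloud(data) {
  cloudCalls.push(data); // placeholder: a real script might POST here
}

// On-device reaction to a (simulated) location signal. The geometry
// check and the decision both happen locally, in real time.
function onLocationSignal(reading, homeRegion) {
  const insideHome =
    Math.abs(reading.lat - homeRegion.lat) < homeRegion.radius &&
    Math.abs(reading.lon - homeRegion.lon) < homeRegion.radius;
  // Only the derived "user left home" event ever needs the network.
  if (!insideHome) uploadToCloud({ event: "leftHome" });
  return insideHome;
}

const home = { lat: 47.64, lon: -122.13, radius: 0.01 };
onLocationSignal({ lat: 47.641, lon: -122.131 }, home); // inside: nothing sent
onLocationSignal({ lat: 47.70, lon: -122.13 }, home);   // outside: one event
console.log(cloudCalls.length); // 1
```

If connectivity is flaky, the reaction above still works: the one optional upload can fail or be deferred without breaking the on-device logic.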

To summarize, we are bringing a new concept to the mobile development world: a multi-platform framework that enables continuous sensing and interaction via powerful JavaScript reactions to real-world activities. We are very excited about the possibilities it opens up and would love to engage with the amazing community of mobile and web developers to explore the boundaries of this new set of ideas. We are starting on{X} as a beta on Android phones, with more platforms to follow in the future.

You can try it out here.

– Eran Yariv, Principal Development Manager
