The Future of Context-Aware Mobile Computing

You might not have heard of ‘context-awareness’, but you almost certainly will be exposed to it over the next few years. We take a look at this rapidly growing technology, in which computing devices detect and measure information about the user and apply artificial-intelligence rules to deliver specially tailored applications.

The processing and memory capabilities of modern smartphones have increased remarkably over the past ten years. Show someone from just a decade ago, in 2003, a Samsung Galaxy S4 or an iPhone 5s – while they were still fiddling away with their monochrome Nokia 3310s, giddy at the prospect of Snake II – and they would be flabbergasted. Modern smartphones are so powerful that they could almost certainly outperform even the top gaming PCs of that era. But while the hardware is groundbreaking, many current-generation smartphone applications aren’t yet anything revolutionary – sure, you can get gorgeous-looking games with fancy 3D graphics, and whizz-bang design apps that let you edit photos and videos with exacting precision, but there isn’t yet much that truly takes advantage of the power of mobile computing. This is where context-aware mobile applications come in.

Smartphones today come with a large range of tools that allow the device to ‘sense’ information about the user’s environment – think GPS tracking, the accelerometer, voice recording, image and video capture, and Bluetooth sensing to detect nearby devices. What if you could develop applications that make full use of this monitoring capability, along with detailed inference rules, to intelligently monitor the user’s environment and offer relevant services accordingly? Put simply, context-aware mobile applications are those in which the processing rules, available services, data or business logic of the application change depending on the user’s environment, location and other external factors.

Some of the factors that could determine a user’s situation, or ‘context’, include what the user is doing (the activity), who the user is with, when they are doing the activity, and where they are doing it. The ability of an application to sense these factors and alter its behaviour accordingly is what is known as context awareness. It allows a mobile application to intelligently determine its user’s situation and automatically provide relevant services without prompting.
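To make those factors concrete, here is a minimal sketch (in Python, with invented names – no real framework is implied) of how an application might bundle the ‘who, what, when and where’ of a user’s situation into a single context record before any inference happens:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class UserContext:
    """One snapshot of the user's situation, assembled from device sensors.
    All field names are illustrative, not taken from any real library."""
    activity: str                   # what the user is doing, e.g. "walking"
    companions: list[str]           # who they are with (e.g. nearby Bluetooth devices)
    timestamp: datetime             # when the activity is taking place
    location: tuple[float, float]   # where (latitude, longitude from GPS)

# A snapshot an app might assemble from its sensors at one moment
snapshot = UserContext(
    activity="walking",
    companions=["colleague_phone"],
    timestamp=datetime.now(),
    location=(51.5074, -0.1278),
)
print(snapshot)
```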

You have probably already experienced a basic degree of context awareness in applications or devices – think of an electronic device with a light sensor that automatically adjusts its screen brightness depending on the light conditions around it. Let’s take this concept a step further. Imagine you slept in and are late for work, so you start running to the train station. An application in your wristwatch has recorded your daily routine and knows that you normally walk to the train station at a certain time each morning, at a walking speed of about 3 km/h. On this occasion it senses that you have left the house 30 minutes later than usual – and not only that, but you are moving at twice your normal speed, at running pace – so its inference engine deduces a fact about your current situation: you are late for work. It knows you catch the train, so it looks up the morning timetable for your local station and displays a countdown to the next train’s departure on your watch face.
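As a rough illustration of the kind of rule an inference engine applies in that scenario, here is a minimal Python sketch – every threshold, timetable and helper name is invented for the example rather than taken from any real product:

```python
from datetime import datetime, time, timedelta

USUAL_DEPARTURE = time(8, 0)   # learned from the user's routine (assumed value)
USUAL_SPEED_KMH = 3.0          # the user's normal walking speed

def late_for_work(departure: datetime, speed_kmh: float) -> bool:
    """Tiny rule: left much later than usual AND moving much faster than usual."""
    usual = datetime.combine(departure.date(), USUAL_DEPARTURE)
    left_late = departure > usual + timedelta(minutes=15)
    hurrying = speed_kmh >= 2 * USUAL_SPEED_KMH
    return left_late and hurrying

def next_train_countdown(now: datetime, timetable: list[time]) -> timedelta | None:
    """Time until the next departure from a (hypothetical) station timetable."""
    for dep in sorted(timetable):
        dep_dt = datetime.combine(now.date(), dep)
        if dep_dt > now:
            return dep_dt - now
    return None  # no more trains today

# Example run with made-up values
now = datetime(2013, 11, 4, 8, 32)
if late_for_work(now, speed_kmh=6.0):
    countdown = next_train_countdown(now, [time(8, 15), time(8, 40), time(9, 5)])
    print(f"Next train departs in {countdown}")
```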

While the most ambitious context-aware applications are still in the research phase, a number of applications already attempt to demonstrate the power and, more importantly, the future potential of the approach. One of these is EmotionSense, a mobile sensing platform initially designed for social-psychology studies, which attempts to sense a person’s emotions and work out how those emotions relate to their activities, location and proximity to other people. It does this by continually measuring and recording four key elements of contextual data: the human voices around the device, the user’s location, the user’s movement (via the accelerometer), and other nearby people and devices (via the Bluetooth sensor).

Once the raw data are collected, EmotionSense stores these facts in its Knowledge Base. The system’s inference engine then analyses them and makes decisions using a set of inference rules. For example, if no sound or movement is detected, the time is between 10pm and 8am, and the location matches a user-defined area (such as a bedroom), the inference engine might determine that the user is asleep. If the user’s vocal pitch and key words match entries in the inference database associated with a ‘happy’ person, the system might determine that the user is happy.
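The rules themselves can be surprisingly simple. The sketch below is not EmotionSense’s actual code – just a hedged Python illustration of how an ‘asleep’ rule of that kind might be expressed over the sensed facts, with all thresholds and dictionary keys invented for the example:

```python
from datetime import time

def infer_state(facts: dict) -> str:
    """Toy inference over sensed facts; thresholds and keys are illustrative only."""
    quiet = facts["sound_level_db"] < 30
    still = facts["movement"] < 0.05                      # accelerometer magnitude
    night = facts["time"] >= time(22, 0) or facts["time"] <= time(8, 0)
    in_bedroom = facts["location_zone"] == "bedroom"      # user-defined area

    if quiet and still and night and in_bedroom:
        return "asleep"
    if facts.get("voice_profile") == "happy":             # matched pitch/key words
        return "happy"
    return "unknown"

# Example: facts recorded overnight in the knowledge base
facts = {
    "sound_level_db": 18,
    "movement": 0.01,
    "time": time(2, 30),
    "location_zone": "bedroom",
}
print(infer_state(facts))   # -> asleep
```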

Another interesting application is ReQall, which attempts to record almost every aspect of your life and acts as a ‘memory assistant’, providing information that can help you make key decisions during your day – based on information and stimuli you’ve encountered that you possibly weren’t even aware of.

These applications, while undoubtedly innovative and boundary-pushing, merely scratch the surface of what large-scale context-aware computing could do for the average user. Middleware solutions for context awareness are now being developed that allow context information to be stored in the cloud and provide a set of contextual processing rules common to all applications, so software developers can focus primarily on their own application logic. Context data can then be collected from users around the world and accumulated in the cloud, which means application developers can essentially build apps with environmental knowledge of everyone in a particular city or region.
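From a developer’s point of view, that might look something like the sketch below: an app querying a shared cloud context service over a simple HTTP API. The endpoint, fields and library choice are assumptions made purely for illustration, not a description of any real middleware product:

```python
import requests  # widely used third-party HTTP library

# Hypothetical cloud context-middleware endpoint (not a real service)
CONTEXT_API = "https://context-middleware.example.com/v1"

def contexts_in_city(city: str, activity: str) -> list[dict]:
    """Fetch anonymised context records for one city from the shared middleware."""
    resp = requests.get(
        f"{CONTEXT_API}/contexts",
        params={"city": city, "activity": activity},
        timeout=5,
    )
    resp.raise_for_status()
    return resp.json()   # e.g. [{"location": [...], "speed_kmh": 42.0}, ...]

# The application developer then only writes their own logic on top, e.g.:
drivers = contexts_in_city("London", activity="driving")
if drivers:
    average_speed = sum(d["speed_kmh"] for d in drivers) / len(drivers)
    print(f"Average traffic speed right now: {average_speed:.1f} km/h")
```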

One exciting development in healthcare that utilises a similar middleware solution is an application, developed by a group of Chinese engineers, that continually collects environmental data from mobile devices and stores it in a database in the middleware layer. Mobile devices can then query the middleware for medical status information based on a patient’s context and medical history, in order to identify and manage high-risk situations for that patient.
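Purely as a hedged sketch of the idea (not the engineers’ actual system), the risk logic might combine a patient’s stored medical profile with live context along these lines – every field name and threshold here is invented:

```python
def assess_risk(profile: dict, context: dict) -> str:
    """Toy rule combining a medical profile with live context data."""
    heart_rate = context["heart_rate_bpm"]
    ambient_temp = context["ambient_temp_c"]

    if profile.get("condition") == "cardiac" and heart_rate > 120:
        return "high"
    if profile.get("heat_sensitive") and ambient_temp > 35:
        return "high"
    if heart_rate > 100:
        return "elevated"
    return "normal"

patient = {"condition": "cardiac", "heat_sensitive": False}
live_context = {"heart_rate_bpm": 128, "ambient_temp_c": 24}

if assess_risk(patient, live_context) == "high":
    print("Alert a carer and suggest the nearest clinic")  # hypothetical response
```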

Yet another possibility for future context-aware applications is a traffic monitoring system in which the location and speed of every user in a city are pooled, and the application automatically consults that context information in the cloud to provide the fastest route to a particular destination. Alternatively, think of a maintenance application for a city council, in which information about council assets (such as parks, road signs and street lights) is collected throughout the day from citizens’ mobile devices as they walk or drive past, stored in a database, and sent to the council office when a maintenance day for that asset is due.

As mobile processing power, wireless Internet speeds and storage capacity continue to increase, we’ll doubtless see more of these context-aware applications entering the mainstream and climbing the charts in the Google Play and iTunes stores, as developers discover innovative ways to use the context-sensing capabilities of these devices to make people’s lives that much easier. For now, consider yourself ahead of the curve in this particular field of technology – and if you’re an application developer, what are you waiting for? The next ‘killer app’ could be one that uses context awareness in ways people haven’t even considered before. Get to work!
