A developer’s perspective on one of the most talked-about wearables of recent times.
My colleagues have heard me say this several hundred times over the last few months. They have taken delight in the different search terms I have had to come up with; partly to test the glassware, but also just to entertain them. It’s rather liberating to talk to yourself at your desk, despite the ridicule from your colleagues.
Of course, you can also scan a barcode: “Ok Glass, scan a product”.
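For context, the GDK exposes voice commands like this as a declarative entry point into the glassware. The snippet below is a rough sketch of how such a trigger is wired up; the activity and resource names are our own illustration, not the actual production code:

```xml
<!-- res/xml/voice_trigger.xml: the spoken phrase that launches the glassware -->
<trigger keyword="@string/glass_voice_trigger" />

<!-- AndroidManifest.xml: the (hypothetical) activity registers for the
     GDK's VOICE_TRIGGER action and points at the trigger resource above -->
<activity android:name=".ScanProductActivity">
    <intent-filter>
        <action android:name="com.google.android.glass.action.VOICE_TRIGGER" />
    </intent-filter>
    <meta-data
        android:name="com.google.android.glass.VoiceTrigger"
        android:resource="@xml/voice_trigger" />
</activity>
```

With this in place, the phrase appears in the “Ok Glass” menu and speaking it launches the activity directly, which is what makes the hands-free interaction described above possible.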
We started this experiment in June last year. We had a prototype working, and filmed a conceptual video about how customers might use the glassware. Since then it has changed substantially, although the principal functions remain. We have refined and shortened the user journeys and clarified the experience to make it consistent with the Glass design patterns.
If you are already a Glass wearer, you should find the experience very familiar and you can try the glassware out. As this is a very early experiment you can only add items to your basket and view nutritional information, but it’s enough to give a sense of what it would be like to interact with Tesco on this type of hardware.
From a developer’s perspective, working with Glass has been a joy. The updates to Android Studio that have made Android development more accessible all apply to Glass development. The Glass Development Kit (GDK) documentation is good and getting better. The community is helpful and proactive about sharing knowledge, especially on Stack Overflow. The Glass team at Google does all it can to make sure the glassware delivers the best experience possible. This is a challenge given that Glass is still being developed, so it can be something of a moving target: the Glass software platform went through six updates in the time we worked with it, which shows how much Google is still investing in the platform.
Given the steady flow of software updates, and the various articles that have been published alluding to updated Glass hardware, I can’t help but feel this is still the beginning of the journey for Glass and for Tesco.
What role can Tesco play in promoting good health and wellbeing?
Background and Introduction
What we eat can contribute a huge amount to our general health and wellbeing. As the UK’s biggest supermarket, Tesco recognises the key role it can play in promoting good health and wellness.
The app and wearable tech market, in particular, has grown dramatically over the last 3–4 years. Health products from Nike, Fitbit, Jawbone and Pebble have emerged on the market as fantastic health aids.
In the lab we wanted to see whether we could develop a product that would harness the excitement generated by this new app and wearables market, but put a Tesco spin on it.
Goals and Objectives
HealthBuddy is ostensibly research-driven; we’ve had ‘fun’ exploring:
Alternative ways for customers to record their daily calorie consumption.
Utilising built-in Android phone sensors to track customers’ daily activity.
Pairing an Android application with a wearable heart-rate monitoring device.
The latest Android UI patterns.
Activity based gamification mechanics.
How to build mobile apps using the Xamarin platform.
Our aim has not been to produce an application we would immediately give to customers.
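As a flavour of the sensor-driven side of the list above, here is a minimal, hypothetical sketch of turning a step count (as read from Android’s built-in step-counter sensor) into an estimated calorie burn. The class name, formula and constants are illustrative assumptions, not values taken from the actual HealthBuddy app:

```java
// Hypothetical sketch: estimating calories burned from a step total,
// as an app like HealthBuddy might do after reading the count from the
// phone's step-counter sensor. Constants are illustrative only.
public class ActivityEstimator {

    // Rough rule of thumb: walking burns about 0.04 kcal per step for an
    // average ~70 kg adult; we scale linearly with body weight.
    static final double KCAL_PER_STEP_AT_70KG = 0.04;

    public static double caloriesBurned(int steps, double weightKg) {
        if (steps < 0 || weightKg <= 0) {
            throw new IllegalArgumentException("steps and weight must be positive");
        }
        return steps * KCAL_PER_STEP_AT_70KG * (weightKg / 70.0);
    }
}
```

In practice the raw sensor value arrives via a `SensorEventListener`, but the conversion step itself is just arithmetic like this, which made it easy to experiment with different estimation formulas.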
Where We’re At Now
We have built an Android app that delivers on what we set out in our goals and objectives. Customers can use HealthBuddy to track their calorie consumption in multiple ways and monitor four variations of physical activity.
Customers can select an activity goal when they setup a profile. A simple overview screen is available to track progress. Rewards are released for achieving activity goals which can then be shared socially.
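To make the goal-and-reward mechanic above concrete, here is a small, hypothetical sketch of how progress tracking and reward release might be modelled. The class, names and thresholds are our own illustration, not the app’s actual implementation:

```java
// Hypothetical sketch of the gamification mechanic described above:
// progress toward a daily activity goal, with a reward that is released
// (and can then be shared) once the goal is achieved.
public class GoalTracker {
    private final int dailyStepGoal;
    private int stepsToday;

    public GoalTracker(int dailyStepGoal) {
        if (dailyStepGoal <= 0) {
            throw new IllegalArgumentException("goal must be positive");
        }
        this.dailyStepGoal = dailyStepGoal;
    }

    public void recordSteps(int steps) {
        stepsToday += steps;
    }

    // Progress as a percentage, capped at 100 for the overview screen.
    public int progressPercent() {
        return Math.min(100, (stepsToday * 100) / dailyStepGoal);
    }

    // True once the activity goal has been achieved.
    public boolean rewardUnlocked() {
        return stepsToday >= dailyStepGoal;
    }
}
```

The overview screen described above only needs the capped percentage, while the sharing flow keys off the unlock flag, which keeps the two concerns separate.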
Our next step is to test HealthBuddy with a control group and gain feedback on potential uses for the prototype moving forward.
We’ve been checking out Google Glass for a few months now; experimenting with applications for colleagues and customers while evaluating the technology itself.
Immediately, we thought about how our colleagues might be able to use Glass to check stock hands-free, or how our customers might be able to add a product to their grocery delivery basket while making a cup of tea. Getting to that stage has been a journey into entirely new areas of user interaction: new gestures, user interface elements, and input mechanisms.
Most of all, it’s about trying to understand the use-cases for Glass. It’s unlike any other hardware technology we’ve had before, so we tend to try and apply the use-cases we see for mobiles, tablets and desktop computing to see if they stick to Glass. They mostly don’t. Glass isn’t the kind of tech you use for 15, 10 or even 5 minutes at a time. You’re not going to comfortably do your entire grocery shop by staring at the top right-hand corner of your field of vision, but you might just add a single item, see some nutritional information, and then move on. You might get a notification about your delivery, including a photo of your delivery driver.
Click to play video.
Other than some time-compression between adding items to the basket and the actual delivery, the prototype app is real; no smoke and mirrors there. Every once in a while, a new piece of technology comes along that pushes the boundaries of science fiction, making all sorts of potential use-cases an immediate reality. Glass feels like one of those technologies, in the sort of way WiFi or smartphones changed things. This is just the beginning of our journey with Glass, but we’re very excited about it and other wearables, and most importantly about how you will use wearable technology to interact with Tesco.
As the momentum behind wearable technology continues to build in 2014, I can’t help thinking that the devices and consumer products that will actually stick are those that get the correct balance of being both a ‘cool technology’ and ‘truly’ wearable (and here I mean wearable in the sense that we actually desire and want these products about our person whilst we go about our normal lives).
I think Google Glass is the typical example here. You’d be a braver, more knowledgeable man than I to predict the adoption of Google Glass as a consumer product.
I’m deeeefinitely not going to do that.
However, as I’ve watched people take their maiden voyages with Glass, my experience has been that folk really need time to digest the notion of having this new experience sit, as it does, so prominently on their head and physical person.
My point here with the Google Glass example (and forgive me if it is relatively obvious) isn’t that Glass is not a fantastically exciting consumer product, utilising some fascinating and compelling technology.
Of course it is.
It’s more the point that if consumers are going to adopt these devices as part of their daily routines and lives, it seems to me the devices really are going to have to complement our lives, both practically and aesthetically, if they are to be taken to our consumer hearts.
We have visited the Consumer Electronics Show (CES) and the National Retail Federation’s Big Show (commonly known as NRF) in the U.S.
I’ve just got back from a busy couple of weeks at CES and NRF. CES is the biggest technology show in the world, with over 150,000 people attending, and NRF is the biggest retail technology show in the world, with a crowd of around 40,000.
At CES there was a definite focus on wearable technology. We met companies that provide all sorts of wearables from watches and fitness trackers (now even for your dog!) to a variety of companies that are building head mounted displays. There is definitely a lot of potential for these devices to change how customers interact with us in the future. For example, imagine being able to add products to your shopping list or your online grocery order just by looking at them and tapping your glasses, or just by saying the name of the product out loud.
Trying out one of the watches at the CES
At NRF in New York, the trend for wearables was similar, if less pronounced. Epson showcased their impressive augmented reality glasses with a partner of theirs who had created a product-picking app. We thought this could easily help Personal Shoppers in our dotcom stores, because it could visually show them where a particular product is located on a shelf, leaving both their hands free to pick it up and pack it for the customer.
Mike trying out the Epson glass
We also recently got hold of Google Glass, and CES was the perfect environment to give it a longer test: thanks to the techy crowd attending, I definitely blended in a bit more with the other “Glass Explorers”, who were a common sight around the show. Watch this space for news on what we’re up to with that.
As Mike mentioned in his blog post, there were lots of innovations at NRF with the potential to have a positive impact on the customer, and this was reflected at CES too. You may have heard about 4K displays, which people are saying are the next big revolution in TV. Originally developed for use in digital cinema projection, 4K displays offer four times the resolution of existing Full HD 1080p TVs. They feel much more immersive, especially if you are up close and within touching distance. We also saw some holographic-like technology that made it feel as though you were interacting with an image floating in mid-air, as if it were a touch screen (although it was impossible to photograph properly). This, combined with larger screens and integrated mobile experiences, could add a bit of theatre to the store environment.
Another big trend at CES this year was the growing emergence of self-driving cars and, more generally, connected cars. These might have a huge effect on how we shop, from cars that park themselves when you get out at the door right through to a car that drives itself to a drive-through click-and-collect whilst you’re at work… imagine that.
The car of the future? The self-driving car at the CES.