First a marketer, I'm a digital strategist and content creator. I love engaging communities and building properties that inspire action. I'm the co-founder of Just Media Design, I've worked for a Silicon Valley incubator, and I co-host the iTunes #1 podcast Living Outrageously, as well as Social Q&A. I'm the author of the Amazon best-selling Outsource Your MVP, and I'm a world record holder.
A fan of emerging technology, I’ve been excited for a while to get my hands on Google Glass, and with Google’s December announcement of the MyGlass app being available for iOS, I knew the holiday period would be the perfect opportunity to give my undivided attention to the famed headgear.
Ten minutes into wearing Glass, I'd begun to realize that this really was unlike any technology I'd ever used (read: I didn't know where to look). Because the Glass display sits above your right eye, perhaps the most challenging hurdle is learning that you don't need to look at the world through the transparent monitor.
Once I'd mastered the upward glance, I found my default reaction was to keep the display focused in my peripheral vision. Don't do this; it will put you on a bullet train to headache-ville. For those first few hours, the display seemed like a fly hovering around my face, and it took all the control in my being to keep my head straight and look below the display when it wasn't in use.
The software presented an inevitable few minutes of fumbling around with a whole new experience to navigate. While the browsing takes some getting used to, all in all the menu structure makes sense. The touch-sensitive side panel makes flicking through the navigation and selecting options with your index finger a breeze.
And then it hit me.
As I walked down a sunny downtown San Francisco street on my way to the office, I had a realization… I’m staring at a glass display, in the sky, using my finger as the primary method of navigating menus. Essentially removing myself from the ‘real world’ to focus on the technology. Isn’t this the exact predicament Glass was designed to prevent?
Well, I needn't have fretted; Google had an answer: its heavily promoted "OK Glass" voice commands. Five minutes in, I was quick to realize that voice doesn't work. It just doesn't. Extremely simple commands with no background noise worked from time to time, but any time I had to narrate at any length, the results were dismal (in Google's defense, I do have an Australian accent).
By this point, I'd also started to take note of the camera quality. While snapping a photo is one of the best ways to demonstrate Glass to friends, the camera just ended up being a novelty. Given the lengths mobile handset manufacturers now go to in order to raise the sensor-quality game, the camera built into Glass left much to be desired, especially in low light.
Another major flaw that no one seems to have raised is that virtually every pair of glasses on the market today has arms that fold inward. Except Glass. Folding arms have been the standard for decades, making it easy to store spectacles that aren't in use. Unfortunately, the processor and battery are built into the right arm, and as a result the Google Glass frame doesn't fold at any point and is fairly rigid in its design. This meant that every time I slid the unit into a pocket or bag, I lived in fear that the display would come out scratched, or worse, broken off completely.
That said, the hardware isn't particularly heavy, and once you're accustomed to wearing Glass, it actually feels quite natural. Any weight I did feel was easily forgiven; I had a computer strapped to my face, so I expected to feel it. It's also hard to deny the beauty of the industrial design. Glass is pretty damn nice to look at, with smooth lines and a simple form factor that isn't littered with logos or buttons.
But I’m still not going to wear it…
Let's face the bigger issue at hand. Even if Glass were the most amazing new gadget experience on the planet, I couldn't wear it right now. Every moment it was on felt like I was part of an awkward social experiment. The minute a demonstration was over, I'd come to expect that inevitable vacant stare, with everyone around me wondering when I was going to put it away. Hell, even my wife didn't know how to talk to me while I wore Glass.
Whether I was in the office, sitting in a coffee shop, standing in line at the bank, or walking home from work (with a fifteen hundred dollar computer strapped to my face and the fear of being mugged), I found myself taking Glass off and putting it away far more than I thought I would.
I just couldn’t work out where Glass fit in my life.
This led me to wonder, am I being too picky? Are my expectations too high? In a world where we're so used to upgrading to solve our problems, did I have an unfair expectation that, straight off the conveyor belt, this piece of emerging technology should change my life?
I settled on the fact that wearable technology should supplement our lives and extend our capabilities, not obstruct us. If a piece of wearable technology caused me to feel uncomfortable, then for me, it’s just not usable.
All is not lost
This might not be the Glass I want, but it's the Glass we need. As with all emerging technologies, early iterations need to be invested in and actually used for data to be captured and for markets to buy into the future. I've got no doubt that wearable technology will soar in the years ahead, but for now, it seems that I'm just not ready for Glass... and Glass isn't ready for me.
- Name: Matt Kelly
- E-mail: firstname.lastname@example.org