Hello, Future: Google Glass (Wearable Computing)

Lenny

Press "X" to admire hat
Joined
Jan 11, 2007
Messages
3,958
Location
Manchester
Just under a year ago, Google unveiled a new project being worked on by Sergey Brin within Google X Lab -- Google's skunkworks, which aims to develop future technologies such as Google's self-driving car -- called Project Glass. The accompanying video gave us an idea of what Google wanted to create: an augmented reality head-mounted display to allow the hands-free consumption of information available to smartphone users.

At the same time, press shots of the design of the device were released, showing a relatively advanced prototype.

Within days, people realised that Google employees were testing the devices in the wild, and photos surfaced of various people wearing them, including Brin himself!

Over the following months, various Googlers uploaded photos and videos to Google+. All were taken as first-person shots, and every single one of them was incredible - a father swinging his child around (both arms in shot), someone bouncing on a trampoline, a pair of hands ready to catch a ball.

At Google I/O, there was a live demo, and it was insane: skydiving, stunt-biking, abseiling, all the while being broadcast live from glasses being worn by the performers. At the end of the demo, Sergey Brin announced that every member of the audience (after all, I/O is a developer conference) would be able to sign up and buy early editions of the device, called the Glass Explorer Edition, allowing them to get the hardware long before the general public would, and giving them a chance to start developing for it!

Project Glass began showing up in random places for testing (models at New York Fashion Week wore pairs as they strutted down the catwalk), and more tiny details emerged (such as a headset with bone-conduction speakers - kind of like the lollipops that played music when you bit them).

A couple of weeks ago, developers who had bought the glasses at I/O 2012 attended Project Glass hackathons, where they were given a chance to not only try the hardware, but start developing software for it.

And today, Google released a video showing off the UI and some basic features: recording video, taking voice notes, making video calls, image search, taking pictures, giving map directions, translating text, searching Google, receiving notifications, and displaying information such as the current time, weather, and flight details.


The video shows that most options are accessed with voice commands: "OK, glass... take a picture". From details released over the year, we also know that the side of the device has a scrollable touchpad, which allows the user to scroll through menus.

I, for one, am beyond excited. I love the connectivity afforded to me by my smartphone, but there is that feeling of being disconnected because it's in your pocket. With Google Glass, on the other hand, the information is always within reach. Combine it with Google Now, an Android feature that passively delivers information to you based on search habits, and the world will be forever instantly accessible.

Thoughts?
 
Combine it with Google Now, an Android feature that passively delivers information to you based on search habits, and the world will be forever instantly accessible.

Instantly accessible to Google.

I'm sorry, I know technology should be exciting, but Google's approach to taking data on its users and privacy is frightening.

People are up in arms at the idea of a government monitoring us - "Big Brother" - and yet when a billion-dollar multinational does it, somehow it's a "good thing"?

Lots of companies take data, but Microsoft come across as too clueless to use it, and Apple are focused on keeping it private. Google, however, wants to exploit all its user data for commercial purposes.

Remember, Google is not a technology company - it is the world's biggest advertising corporation, and they want to know more about you, to sell you more stuff, by using technology.

Heck, would anyone be so excited about this if a government were offering the same thing? And yet, a government offers basic legal protections. Google does its best to obfuscate them, and openly exploits them where possible (where it deems the advantage to itself worth the while).

I don't mean to sound cynical and dismissive, and I really don't wear a tin foil hat. But my business has required me to work closely with Google for years and frankly the company is scary - arrogant in the extreme, with absolutely no interest in its users except as a source of profit and data.

Google is a company that will actively break the law to get what it wants, abuse its monopoly position to kill competition, and refuses absolutely to be answerable to anyone.

I guess I'm not sold on the idea of Google Vision. :)
 
I really like the idea. The only problems are the connectivity here, which is patchy at best in some rural areas, and the cost.
 
Instantly accessible to Google.

I'm sorry, I know technology should be exciting, but Google's approach to taking data on its users and privacy is frightening.

People are up in arms at the idea of a government monitoring us - "Big Brother" - and yet when a billion-dollar multinational does it, somehow it's a "good thing"?

I'd rather this thread not devolve into another argument about Google and data and privacy concerns, but I will say that I'm prepared to sacrifice some privacy to Google if they can use it to make my life easier. And whilst I'm scared by what Facebook are doing, and what they could potentially do, I am firmly of the belief (and also apparently offensively condescending about it) that to exist in this modern world people have to give up old ideas of anonymity, and embrace being known in the digital sense - as a collection of data that includes locations (which, I might add, are all publicly available as census data anyway), interests as keywords, regular search terms.

My trust is in Google, and because of that I have entered into their ecosystem. I don't care that they are using what they know about me to make money, because it isn't hurting me. I haven't been the victim of targeted attacks based on my information, and if it stays that way then I will keep playing in Google's playground.

---

So, putting aside data and privacy concerns (feel free to start a new thread, though - I'll happily join in), what do you think about the technology?
 
You may trust Google - and your trust could, I suppose, be justified (though I have my doubts), but what about those being recorded?

That's being paranoid, of course: the idea that Google would sell a type (;)) of smartphone with an in-built, outward-looking camera is completely ludicrous. Why on Earth would a smartphone need or have such a camera...?


But there are other issues, like safety. If the technology was placed in ordinary glasses (or contact lenses) mightn't this be distracting? After all, not every user would have a self-driving Google car.
 
But there are other issues, like safety. If the technology was placed in ordinary glasses (or contact lenses) mightn't this be distracting? After all, not every user would have a self-driving Google car.

I guess it all comes down to design. With something like Glass, the display is transparent and floats in the top right corner of your vision. When you're not focused on it, it blurs out, similar to objects in your peripheral vision.

If we're to take the latest video as being what is seen by the human eye, with the rectangle in the corner being the Glass display, then it's quickly clear that the images being projected onto the display have high transparency -- you can see through them -- with some images being totally transparent around the content (the directions, for example).

I fully expect the experience to be jarring when you first put the device on, but as you get used to it it will fade and we'll ignore it just like we ignore things in our peripheral vision. After all, why do those of us who wear glasses not get driven crazy by the lines of the frames forever in our vision?

On top of that, software is clever. We can already detect the state of a phone by its orientation and other data from the accelerometers and sensors -- and from that turn off the screen when it's next to your ear in a phone call, or put it into a deeper sleep when it's in your pocket -- so why can't the same be true of Glass? If it doesn't have its own sensors built in, then it will definitely be taking data from whatever smartphone it's tethered to. Take the GPS data, work out the speed being travelled, and if it's over a certain speed, restrict the device to a select few functions or lock it down completely.

Maybe when object recognition algorithms can be run on the Glass and recognise a steering wheel, you can limit the lock-out to just the driver.
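To make the GPS idea concrete, here's a rough sketch of the sort of speed-based lock-out I have in mind, written against Android's standard location APIs (which any tethered phone already has). Everything here - the class, the speed threshold, the DisplayPolicy interface - is my own invention for illustration, not anything Google have announced for Glass.

```java
// Hypothetical sketch only: none of this is a real Glass API.
// The idea: a tethered phone feeds location fixes to the headset, and if the
// wearer is moving at driving speed the display drops back to a restricted mode.
import android.location.Location;
import android.location.LocationListener;
import android.os.Bundle;

public class DrivingLockListener implements LocationListener {

    // ~6.7 m/s is roughly 15 mph - faster than most people walk or jog.
    private static final float DRIVING_SPEED_MPS = 6.7f;

    private final DisplayPolicy policy;

    public DrivingLockListener(DisplayPolicy policy) {
        this.policy = policy;
    }

    @Override
    public void onLocationChanged(Location location) {
        // getSpeed() returns metres per second, derived from successive GPS fixes.
        if (location.hasSpeed() && location.getSpeed() > DRIVING_SPEED_MPS) {
            policy.restrictToEssentials();   // e.g. keep turn-by-turn, drop video and notifications
        } else {
            policy.allowEverything();
        }
    }

    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }

    /** Imaginary hook into the headset's UI - in reality this would live in the device firmware. */
    public interface DisplayPolicy {
        void restrictToEssentials();
        void allowEverything();
    }
}
```

Registering it would just be a LocationManager.requestLocationUpdates() call on the phone. The hard part, as I said, is telling the driver apart from a passenger, which is where the object recognition idea would come in.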
 
The frames of my glasses disappear because my software is even more clever** than Google's will likely ever be. And neither my glasses nor my image processing functions are subject to the sort of bug that might freeze a program, leaving an image right in my "line of sight".

I hope.




** - Clever in the sense (;)) that what I think I'm seeing at any given point in time is a construct, not (necessarily) a true representation of what's really in front of my eyes.
 
So do we only use proven technology for which we have a high confidence level that it won't break? Damn possible advances because there's a risk of it crashing? Do we just stagnate?

I'm fully aware of the possibilities of software going wrong, and I agree that it will happen, particularly with first generation products, but we're not going to get anywhere if we let the fear of that happening stop research and development.

We know that Google engineers have been testing Glass in the wild for at least a year, and judging by the prototypes spotted on Brin just a day after the first video it's probable that Google engineers have been testing them for far longer. Not only are the minds at Google some of the best in the world, but they will have thousands of hours of real world data - it's almost an impossibility that they won't have considered the worst-case scenarios during the design processes.

On top of that, general release isn't predicted for at least another year.

There will be bugs in the final software, because you can never stamp them all out, but there's a high likelihood that the final software will have enough error-handling to give the illusion of a bug-free system.

Look at current smartphones, for example. I can't remember the last time my Android phone crashed, and that's running software that was written to work over hundreds of devices. Just think what the Google engineers can achieve with software that is optimised to run on a Glass device - you're not going to get fragmentation in that market for many years to come.
 
I think it's genius.

That said, we also have to be cautious in our use of the Google Glasses.

It would also probably be really distracting in school...and I can't afford to get bad grades :eek:
 
So do we only use proven technology for which we have a high confidence level that it won't break? Damn possible advances because there's a risk of it crashing? Do we just stagnate?
It depends where the people using the glasses are and what they're doing. Behind the wheel of a car isn't somewhere I'd like it to be used, particularly when I'm on or near a road. (I have zero confidence that it will never break. Does any complex software never break?)

The problem is that people already use similar, handheld technology whilst driving - the world is full of selfish idiots - and I can't see how they could be stopped. But at least you can see someone holding a phone or smartphone. How would you even know whether their glasses are Google glasses?

The saddest thing (in my eyes, if not yours) is that Google will happily carry on regardless - so you won't be stagnating (unless you're run over) - until someone takes them to court. (Not that a case would get very far: I don't recall mobile phone manufacturers or mobile operators being prosecuted when one of their customers has killed someone while driving and simultaneously phoning, checking their messages or tweeting.)
 
The problem is that people already use similar, handheld technology whilst driving - the world is full of selfish idiots - and I can't see how they could be stopped. But at least you can see someone holding a phone or smartphone. How would you even know whether their glasses are Google glasses?

Because the unit is pretty obvious... ?



http://img.talkandroid.com/uploads/2012/06/ap_google_glasses_kb_120627_wg.jpg

Believe me, I understand where you're coming from, but apart from the form factor the technology isn't entirely new. It's not hard to compare it to smartphones in hands-free mode - voice commands to bring up content that you have to glance at to see. The difference between Glass and a smartphone is that Glass is in your vision, rather than docked on or above the central console, so at least when you glance at it you've still got a view of the road.

The saddest thing (in my eyes, if not yours) is that Google will happily carry on regardless - so you won't be stagnating (unless you're run over) - until someone takes them to court. (Not that a case would get very far: I don't recall mobile phone manufacturers or mobile operators being prosecuted when one of their customers has killed someone while driving and simultaneously phoning, checking their messages or tweeting.)

I don't think that's a fair comment at all.

Google went to the State of California and helped draft legislation to allow self-driving cars on the road. Sure, they were doing it to serve themselves, but they're taking steps to ensure that they're doing things legally and safely.

They've taken pains to design Glass so that it's not directly in your vision at all times, one assumes to make it less invasive and more passive in the way it delivers information. They've intelligently designed the way that information is delivered so that it doesn't entirely block out a large rectangle of peripheral vision. Who's to say that the engineers haven't taken into account the danger of an always-on device and already built in safeguards?

Now, we're both making assumptions. I'm assuming that the engineers have enough common sense to realise potential problems and are actively trying to solve them, if they haven't already. You're assuming that Google are only interested in getting a product to market and won't bother with the safety of the public unless they're forced to, whether it be from bad press or direct orders from government bodies.

I put trust in Google, so I'm obviously going to believe that they're on it. Whatever we both think, we won't know for sure until someone with a pair for personal use puts up a review that mentions what the device does (or, at the latest, when one of us has a pair).

---

EDIT: Tell you what, I've sent the question to the Project Glass profile on Google+. I'll let you know if I get a reply, regardless of the content (if they tell me it's down to the driver's discretion, for example, I will gladly hold my head and cry for humanity).

In fact, here's my post if you want to check up on it yourself too: https://plus.google.com/u/0/108375132394985923048/posts/PaH4nUcEf6c

---

EDIT2: I think I'm coming on a bit strongly, so I'll apologise in advance. I'm not aiming to offend, or come off as condescending or arrogant, but I'm not going to rewrite, because that's how I feel and I don't like toning things down for anyone: if you don't like it, so what, free country. I truly, deeply believe that in this day and age, with something like Glass, where you can't escape the information, safety concerns will have been discussed at length and measures designed, if not implemented, to increase safety.
 
Lenny said:
I fully expect the experience to be jarring when you first put the device on, but as you get used to it it will fade and we'll ignore it just like we ignore things in our peripheral vision. After all, why do those of us who wear glasses not get driven crazy by the lines of the frames forever in our vision?

I've been wearing glasses since I was a kid. Still haven't found frames thin enough that they don't drive me crazy. Things in my peripheral bug the life out of me.

Just had to be said. :p

Can't contribute more to the topic: most of what I would add is about the stuff you don't want to get into, rather than the tech itself.

Although, I do like the layer of abstraction that a more tactile device provides, whereas you obviously aren't so keen.
 
Because the unit is pretty obvious... ?

ap_google_glasses_kb_120627_wg.jpg

That's interesting - I thought the unit was supposed to be the entire pair of glasses. Judging by this photo, "Glass" is actually just a bar that sticks out from the ear and the "lenses" part is entirely spurious??
 
That's interesting - I thought the unit was supposed to be the entire pair of glasses. Judging by this photo, "Glass" is actually just a bar that sticks out from the ear and the "lenses" part is entirely spurious??
To quote from the Wiki article on Google Glass:
In the future, new designs may allow integration of the display into people's normal eyewear
and
The project was announced on Google+ by Babak Parviz, an electrical engineer who has also worked on putting displays into contact lenses....
 
I don't think that's a fair comment at all.
I think it is an entirely fair comment, given Google's total disregard for copyright with its Google Books project. That was only stopped (assuming it has been - how would we know?) by a court case.

And can I mention Google Street View's trawling for Wifi?


However innocent Google's owners are, and however beneficent they think they're being, Google is a corporation whose eventual future and ownership cannot be predicted. But even if the company was staffed, to the end of time, with angelic beings, it's opening a path down which other, less angelic, companies might follow.
 
To quote from the Wiki article on Google Glass:

In the future, new designs may allow integration of the display into people's normal eyewear

and

The project was announced on Google+ by Babak Parviz, an electrical engineer who has also worked on putting displays into contact lenses....

Follow the sources for the first quote, and you'll see that both articles simply speculate on what could be done.

Babak Parviz specialises in nanotechnology and the fabrication of microelectromechanical systems (MEMS). Look at some of his publications. Yes, he's been involved in projects that aim to put displays in contact lenses, but he's also been involved in projects that are completely unrelated.

I don't disagree that this sort of thing could make its way to normal eyewear (although I do take issue with quoting Wikipedia as fact without checking the sources), because after all it's a logical progression, but contact lenses? I've been wearing contacts for three and a half years and I still don't like putting them in. I'd say that more than half of the glasses-wearers I know freak out at the idea of putting contacts in and refuse to even try them, so where's the market**? It's all very well creating displays in contacts in a research setting, but not if you're looking to make money.

---

Sorry, I interpreted your comment about carrying on regardless to be in relation to the safety of users -- release the product without any features that prevent drivers from being distracted -- rather than the company ploughing through data and privacy concerns.

Because they've been proactive in raising awareness of their self-driving cars (which are another Google X Lab project), and have helped pass bills and legislation relating to self-driving cars in Nevada, Florida, and California (and they're now doing the same in Texas), I trust that Google are aware of the concerns that an always-on display will raise, as I have already said.

As I have also said, I'd rather this not turn into another thread about data and privacy issues. Maybe Google should have asked permission to show snippets of copyrighted text first, but that case has been closed and a settlement reached (though the question about whether permission would have been granted in the first place is an interesting one. There's a whole discussion about copyrighted content in the digital age we could have elsewhere).

As for the Street View case, there were some bad judgement calls. I see no problem with collecting the information to map network points to improve location awareness, as these are being publicly broadcast. Storing and possibly analysing private data that is being broadcast over unencrypted channels is not so good. Whilst I believe that people who use unencrypted networks to access private data deserve all that is out to get them, I don't agree with the snooping, even for research purposes.


**It's estimated that in the US 75% of adults need some sort of corrective lens, and of those 11% wear contacts - which works out at roughly 8% of all adults. In the UK, 7.5% of the adult population wear contacts. If only such a small percentage of the people who would benefit from contacts actually wear them, then why should the numbers be different for the rest of the population?
 
That's interesting - I thought the unit was supposed to be the entire pair of glasses. Judging by this photo, "Glass" is actually just a bar that sticks out from the ear and the "lenses" part is entirely spurious??

Sorry for the confusion - I perhaps should have linked to an image in the original post.

So the actual unit is the part that sits in front of the right eye and wraps around over the ear. There is a battery pack behind the ear, a scrollable touchpad in front of the ear, and the display about an inch in front of the right eye (although other configurations are apparently being developed).

The display uses the property of infinity focus, I believe, to render an image that the eye can focus on even at such a short distance. I don't know much about how it is done, other than that it has something to do with the angle of the light entering the eye. As such, the image on the display is focussed upon as if it were more than a few feet away.
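For anyone wondering why that matters, here's my own back-of-envelope reasoning (nothing from Google): the effort the eye needs to focus on something is roughly the reciprocal of its distance in metres (measured in dioptres), so a real screen an inch from the eye would be impossible to focus on directly, while collimated light that appears to come from a few metres away is trivial to focus on.

```latex
% Accommodation demand D (dioptres) for an object at distance d (metres): D = 1/d.
% A real screen ~2.5 cm from the eye:             D = 1/0.025 = 40 dioptres (far beyond what the eye can manage)
% The same image collimated to appear ~3 m away:  D = 1/3 ~ 0.33 dioptres (an easy, relaxed focus)
\[
  D = \frac{1}{d}, \qquad
  D_{2.5\,\mathrm{cm}} = \frac{1}{0.025\ \mathrm{m}} = 40\ \mathrm{D}, \qquad
  D_{3\,\mathrm{m}} = \frac{1}{3\ \mathrm{m}} \approx 0.33\ \mathrm{D}
\]
```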

---

I've been wearing glasses since I was a kid. Still haven't found frames thin enough that they don't drive me crazy. Things in my peripheral bug the life out of me.

Eeeee, there's always one. :rolleyes: To be honest, I have always wondered how thick frames affect the sight. I'm lucky in that my eyes are still good enough to allow for thin lenses and thin frames.

Although, I do like the layer of abstraction that a more tactile device provides, whereas you obviously aren't so keen.

Yep! I'm all for heads-up displays Terminator style. :p That's not to say I don't like tactile devices (when I'm typing, I have to use a mechanical keyboard, because the other types, such as laptop chiclet, just don't feel right), it's just that the idea of information being delivered to you regardless of surface is too compelling to me. I can't wait for the time when substrates are developed that can be put onto flat surfaces to transform them into a display, if not also an input device.

Something like Glass appeals to me because it frees up your hands (for something like the example in the video where the ice sculptor can bring up images of tigers whilst he works, without having to disconnect from the sculpture or move back and forth to a screen with his inspiration), and because it can be linked to services, like Google Now, which can provide information based on your location and things it knows about you, without you needing to explicitly tell it to.
 
As I have also said, I'd rather this not turn into another thread about data and privacy issues.
I too do not want to get into those data examples. I was merely using them as well-known examples of the way Google has behaved. To be kind (and there's no reason not to be**), enthusiasm has sometimes led them to go too far, and when confronted about this they have continued to do so until stopped by various courts.


There will always have to be a balance between those who want to forge forward and those who see the difficulties inherent in various proposals for progress. Letting either dominate is bad, if only because:
  • the naysayers could stop any progress;
  • the overenthusiastic could deny proper consideration of the downsides of their proposals, ones that (inadvertently) leave us all in a worse position than the one in which we all started.
** - That Google's intentions may have been to do good actually makes their actions more problematic. After all, if one thinks one is in the right and acting on that, one may do all one can to continue acting on it, because one is doing the right thing. When one is a giant multinational corporation, this attitude may have adverse consequences for millions of people.
 
A lot of my respect for Google comes from them being a successful research company - people draw comparisons between Google, Bell Labs, and Xerox PARC for obvious reasons. Whilst they have core products that they make money on, they are always pushing to progress that bit further, whether it's in the way we search for and consume information, or the architecture of server farms and the software they run (in bids to make them more efficient, and allow for distributed computing at massive scales).

And despite the size of Google, it's still an agile company that can make decisions and change within weeks and months, rather than years. Just look at how much more focussed the company is after two years with Larry Page as CEO - where before Google had a series of overlapping but unrelated products, they now have an almost unified platform where each component speaks to the others, as well as divisions that can autonomously work on projects that might not fit into the Google ecosystem but still benefit from the way Google thinks.

It's just a shame that the rest of the world is tangled in red tape and bureaucracy that slows it down so much.

Whatever people think about Google, I'm sure everyone can agree that they are trying to advance technology in ways that enhance the world and benefit people. Eggs will be broken, but that's inevitable - there are organisations that will put up an incredible resistance because they're built upon the old and can't afford to change the way they operate, or just plain refuse to.

To use the self-driving cars as an example yet again, it could be a sign that Google are coming to understand that the balance does exist, so by working with the proper government bodies they can discuss their proposals and ensure that the downsides are identified before it's too late.

---

Two more things to add:

The first is a look at some of the Google Glass patents, including designs that build sensors such as accelerometers and GPS into the device, along with radios for wireless connectivity and storage.

http://www.engadget.com/2013/02/21/google-glass-patent-application-diagramsi/

--

The second is actually about the safety concerns of using Glass whilst driving, from the QA session at Google I/O 2012:

Asked about the safety of using Google Glasses while driving, Brin said, "We actually have done a lot of work on that, with our self-driving cars." He thinks that the limited amount of data that the Glasses show is safe, and furthermore since the glasses project images that appear to the user to be optically far away, users won't have to refocus their eyes, as they do when shifting their gaze to a dashboard.

Google designer Isabelle Olsson said the glasses are "designed to interact with the virtual world without distracting you from the real world... close to your senses, but not blocking." She also said they needed to be physically unobtrusive and comfortable. "If this is not ridiculously light, it doesn't belong on your face." She said the current Google glasses weigh less than many sunglasses.

http://news.cnet.com/8301-1023_3-57462165-93/sergey-brin-google-glasses-will-set-you-free/

The people involved with the project have considered at least one problem - glancing at the display whilst driving. The way the image in the display appears to be far away might make glancing at it little different from glancing at an overhead motorway sign as you drive towards it.

It's not quite what I hoped for (software measures to restrict use), but it is something that has been considered in the design (though it remains to be seen if it's a happy coincidence because of the way the image is shown).
 
I got a reply on Google+ from the Project Glass team! I can't tell from the Android app if it's been publicly shared (though I'd assume so, as my post to them was), so here's the reply as a quote:

Definitely something we've been thinking about. Glancing at Glass is actually very similar to glancing at the rearview mirror. There are lots of things that you shouldn't do on your phone while driving, like watching videos, that you wouldn't do on Glass either. And so far, we've found using turn-by-turn navigation on Glass to be less disruptive than some alternative navigation options. We'll continue to pay close attention to this, and encourage everyone to follow the laws that are applicable in their location.

Equal parts common sense and following the law, and good design (focussing on Glass, and the way navigation information is delivered). Essentially it's the same as Brin's answer at I/O last year.

And again here's the link to my question, which should also show the reply: https://plus.google.com/u/0/108375132394985923048/posts/PaH4nUcEf6c
 
