I'm amazed that people still feel this level of paranoia about stuff, when they have a clear option: do not use the product. Concerned about the privacy implications of Google Home? Don't get one. Don't like the fact that the Apple Watch reads biometric data? Get a Casio watch. Don't like the fact that Facebook knows who your friends are? Go out more.
The very fact that someone is watching the keynote and going "oh my Gawd, the level of intrusion" at every turn kind of tells me they are interested in having the features but don't want to provide the data. Otherwise they wouldn't even be watching the keynote. Am I missing something?
(Full disclaimer: I do work for an Alphabet company, so take my words with as many grains of salt as you want.)
>I'm amazed that people still feel this level of paranoia about stuff, when they have a clear option: do not use the product.
Maybe because people are not islands, and don't think like isolated solipsists but also worry about the wider social implications of a technology?
Them not using a troublesome product won't solve that -- and it might not even solve the implications for them (e.g. their friends can still tag them in photos on Facebook, even if they don't have an account there). Or, you know, visiting a friend's house who has the device.
>The very fact that someone is watching the keynote and going "oh my Gawd, the level of intrusion" at every turn, kind of tells me they are interested in having the features but don't want to provide the data. Otherwise you wouldn't even be watching the keynote. Am I missing something?
Yes. 30+ years of data abuses and privacy encroachment by advertisers.
It's not a paid service with siloed data used only for the service's purpose -- rather the service is the lure to gather even more data for targeted marketing/ads.
> I'm amazed that people still feel this level of paranoia about stuff, when they have a clear option: do not use the product
People say that about stuff like Googlemail and Facebook. "Don't use it", except that doesn't always help when friends use it, and feed it with info about you as well. But hey, your fault for liking people who use Facebook or something. Now you can simply choose to not be friends with people who put this "home product" in their home, right? Or you can wear a disguise and talk with a false voice when visiting them, or ask them to turn it off whenever you're over. It's all voluntary, so no biggie.
At what point will all these small things combine into one big "You don't have to read or write anything, or communicate with any other person, or buy anything, or ever leave your house"? That's not even a slippery slope in my mind, more like taking 15 points which are in a straight line and extrapolating the line further. So when you talk of "this level of paranoia", I'd say the flipside of that is considering your stance a certain "level of apathy and/or short-sightedness".
Google Home crosses yet another line. It installs an always-on, far-field microphone connected to Google in a house near me. Not getting one is no longer sufficient to keep my spoken words outside of Google's reach.
This is the same as "if you don't want to be tracked by Google online, don't use Google Search", which conveniently omits that Google is working hard to add a tracking beacon to every page on the internet, via AdSense and the +1 button.
Street View, being tagged in photos by friends even if I don't use Google Photos. I also have a number of reasons to believe Google is collecting most people's location information from Android phones without their informed and explicit consent.
Google's privacy practices are very questionable. Google could do better than hide behind "don't use Google". Whatever happened to "Don't be evil"?
Maybe I'm parsing your tone wrong, but what is so outlandish/surprising about people liking features but not the implications of them? (or often, of the way they are implemented, see every cloud vs local software discussion ever)
E.g. cheald's comment above is pretty clear "I always wanted that, but I fear its implementation"
I think you parsed it correctly. My surprise is more about people watching a keynote, fully knowing that most of the stuff announced will be run by Google, and still being outraged about "the privacy implications." I'm not sure what they expected. A fully open source implementation with pre-packaged Docker images so you can deploy your own cluster and petabytes of training data?
The truth about a lot of these services is they only work at the scale Google (or Amazon, or Facebook, etc.) run them. You'd spend thousands, if not millions, of dollars to get equivalent functionality, even if you got the software for free. I've basically had this argument every time someone helpfully suggests "Dropcam/Nestcam cameras should have an option to save data locally." Cheap, 100% private, efficient: choose two.
No, you are buying a product whose description is literally "you ask it something, and our cloud service (which costs tons of money and was built over almost two decades) will give you the answer." How do you expect to do that without a roundtrip of a certain amount of information to Google's servers?
How is that "being evil"? Isn't that the product's whole raison d'être? Everything else ("it's going to be recording everything you say and reporting it to the NSA!") is just tinfoil-hat-level speculation and not based on any factual information. If I started saying "In-n-Out could change their burger recipe to include cyanide at any minute and kill all their customers" would you accuse In-n-Out of plotting to kill everyone?
For the record, this is my very own (pretty frustrated) point of view, and obviously doesn't represent Alphabet's view in any capacity.
You are indeed missing something important. We can choose not to use a particular product. But we cannot choose not to use all the products everyone else is using, because that would make us lonely, antisocial hermits without the slightest chance of ever finding work, friends or lovers.
So even if Googlers may find it "amazing" that privacy advocates don't just shut up and go live behind a rock, that's what you're going to have to live with in a free society.
> We can choose not to use a particular product. But we cannot choose not to use all the products everyone else is using, because that would make us lonely, antisocial hermits without the slightest chance of ever finding work, friends or lovers.
I don't use Facebook, Twitter, WhatsApp, etc. etc. etc., you name it. Yet, somehow I am employed, have friends and hobbies, and am happily married. How could that have possibly happened?
I don't use Facebook and WhatsApp either, and Twitter doesn't have my real name, so I know that it is possible to avoid many intrusive services, but it depends on what communities you are part of and what sort of people you want to stay in touch with.
I guess the proof is all in the etc etc etc.
How about things like using Google search, exchanging unencrypted email with people who use ad funded mail services like gmail, using software or hardware that isn't completely open source (or working with people who do), using ad funded web sites, operating systems or apps that track you, using credit cards and other authenticated payment services, using location based services, using the internet without Tor, making unencrypted phone calls?
But yes, strictly speaking you are right. It is not impossible to avoid all of that and still have a social life. It's just very very difficult, and that's why it makes sense to speak up when useful services are designed in ways that create unnecessary privacy issues.
I guess you have a point. I, too, grew up before all these things were "the standard" (heck, I didn't have internet at home until I was 22 for crying out loud) so even though it is annoying when they aren't there, I can totally picture a life without them.
Not "googlers", this particular "googler" (me). Google is a huge company, and I imagine a lot of people there actually hold views closer to yours.
I'm just an old codger that can't understand why people have expectations of A, B and C, when they are buying a product that says it does A, promises to do B as well as possible and tells you that it definitely can't do C because it's technically impossible. Like, back in my day people read EULAs and stuff :)
Except soon one's environment will be enveloped by a cloud of such devices, each monitoring, tracking, and associating without consent.
And please don't think this is hyperbole. Independently, over the past year, I've caught parts of this same vision from two separate (large) companies. This is the low-hanging fruit. The rewards are too high, the costs are low, and there are very few downsides (to the companies).
This has always been the case across centuries. Information is power and grabbing or keeping it is a generational challenge.
So you can't just throw up your hands and say it's hopeless.
The Free Software Foundation's fight also looked hopeless when it started; look at what has happened since, and how software with available source code now dominates the market for third-party components.
The responsibility to act is in our hands: to make new technology respect privacy and people as much as possible.
I remember having a similar feeling of wonder watching the amazing movie "Her."
I then went down a thought path of what level of personal data the AI would have had in the movie, and started thinking about what that would look like from a monetization/privacy standpoint given today's realities of advertising, NSA surveillance, etc. Pretty scary.
In fact, I'd love for someone to do a cut of clips from that movie with overlays detailing all the bits of personal data that would have been fed to a company's servers based on what is happening in the scene--there are some particularly juicy scenes for that if I recall correctly.
"Yesterday ... I realized how long it had been since I looked at a new technology with wonder, instead of an automatic feeling of dread."
EDIT: Speaking of Maciej, if you're not already following him on Twitter, today is the day to do that. He's having a field day with this keynote.