Shhh… Bruce Schneier on How We Sold Our Souls & Privacy to Internet Giants

It’s simple. Whenever Bruce Schneier speaks, listen.

How we sold our souls – and more – to the internet giants

Bruce Schneier
Sunday 17 May 2015 11.00 BST

Last year, when my refrigerator broke, the repair man replaced the computer that controls it. I realised that I had been thinking about the refrigerator backwards: it’s not a refrigerator with a computer, it’s a computer that keeps food cold. Just like that, everything is turning into a computer. Your phone is a computer that makes calls. Your car is a computer with wheels and an engine. Your oven is a computer that cooks lasagne. Your camera is a computer that takes pictures. Even our pets and livestock are now regularly chipped; my cat could be considered a computer that sleeps in the sun all day.

Computers are being embedded into all sorts of products that connect to the internet. Nest, which Google purchased last year for more than $3bn, makes an internet-enabled thermostat. You can buy a smart air conditioner that learns your preferences and maximises energy efficiency. Fitness tracking devices, such as Fitbit or Jawbone, collect information about your movements, awake and asleep, and use that to analyse both your exercise and sleep habits. Many medical devices are starting to be internet-enabled, collecting and reporting a variety of biometric data. There are – or will be soon – devices that continually measure our vital signs, moods and brain activity.

This year, we have had two surprising stories of technology monitoring our activity: Samsung televisions that listen to conversations in the room and send them elsewhere for transcription – just in case someone is telling the TV to change the channel – and a Barbie that records your child’s questions and sells them to third parties.

All these computers produce data about what they’re doing and a lot of it is surveillance data. It’s the location of your phone, who you’re talking to and what you’re saying, what you’re searching and writing. It’s your heart rate. Corporations gather, store and analyse this data, often without our knowledge, and typically without our consent. Based on this data, they draw conclusions about us that we might disagree with or object to and that can affect our lives in profound ways. We may not like to admit it, but we are under mass surveillance.

Internet surveillance has evolved into a shockingly extensive, robust and profitable surveillance architecture. You are being tracked pretty much everywhere you go, by many companies and data brokers: 10 different companies on one website, a dozen on another. Facebook tracks you on every site with a Facebook Like button (whether you’re logged in to Facebook or not), while Google tracks you on every site that has a Google Plus +1 button or that uses Google Analytics to monitor its own web traffic.

Most of the companies tracking you have names you’ve never heard of: Rubicon Project, AdSonar, Quantcast, Undertone, Traffic Marketplace. If you want to see who’s tracking you, install one of the browser plug-ins that let you monitor cookies. I guarantee you will be startled. One reporter discovered that 105 different companies tracked his internet use during one 36-hour period. In 2010, the seemingly innocuous site Dictionary.com installed more than 200 tracking cookies on your browser when you visited.
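For the curious, here is a minimal sketch of a cruder, do-it-yourself version of that check. It is written in Python with the third-party requests library, the page URL is just a placeholder, and it only sees cookies set directly by the page plus the third-party domains named in the raw HTML; trackers loaded by scripts at render time (most of them) will not show up, which is exactly why the browser plug-ins catch far more.

    # Rough audit of one page: cookies set by the page itself, plus the
    # third-party hosts referenced in its HTML. Needs the 'requests' package.
    import re
    import urllib.parse

    import requests

    url = "https://www.example.com/"              # placeholder: page to audit
    site = urllib.parse.urlparse(url).netloc
    base = site[4:] if site.startswith("www.") else site

    resp = requests.get(url, timeout=10)
    print("Cookies set by the page itself:", [c.name for c in resp.cookies])

    # Hostnames of resources the HTML pulls in from other domains.
    third_party = set()
    for m in re.finditer(r'(?:src|href)=["\'](https?://[^"\']+)', resp.text):
        host = urllib.parse.urlparse(m.group(1)).netloc
        if host and not host.endswith(base):
            third_party.add(host)

    print("Third-party domains referenced:", sorted(third_party))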

It’s no different on your smartphone. The apps there track you as well. They track your location and sometimes download your address book, calendar, bookmarks and search history. In 2013, the rapper Jay Z and Samsung teamed up to offer people who downloaded an app the ability to hear the new Jay Z album before release. The app required users to give Samsung consent to view all accounts on the phone, track the phone’s location and see who the user was talking to. The Angry Birds game even collects location data when you’re not playing. It’s less Big Brother and more hundreds of tittle-tattle little brothers.

Most internet surveillance data is inherently anonymous, but companies are increasingly able to correlate the information gathered with other information that positively identifies us. You identify yourself willingly to lots of internet services. Often you do this with only a username, but increasingly usernames can be tied to your real name. Google tried to enforce this with its “real name policy”, which required users to register for Google Plus with their legal names, until it rescinded that policy in 2014. Facebook pretty much demands real names. Whenever you use your credit card number to buy something, your real identity is tied to any cookies set by companies involved in that transaction. And any browsing you do on your smartphone is tied to you as the phone’s owner, although the website might not know it.

Surveillance is the business model of the internet for two primary reasons: people like free and people like convenient. The truth is, though, that people aren’t given much of a choice. It’s either surveillance or nothing and the surveillance is conveniently invisible so you don’t have to think about it. And it’s all possible because laws have failed to keep up with changes in business practices.

In general, privacy is something people tend to undervalue until they don’t have it anymore. Arguments such as “I have nothing to hide” are common, but aren’t really true. People living under constant surveillance quickly realise that privacy isn’t about having something to hide. It’s about individuality and personal autonomy. It’s about being able to decide who to reveal yourself to and under what terms. It’s about being free to be an individual and not having to constantly justify yourself to some overseer.

This tendency to undervalue privacy is exacerbated by companies deliberately making sure that privacy is not salient to users. When you log on to Facebook, you don’t think about how much personal information you’re revealing to the company; you chat with your friends. When you wake up in the morning, you don’t think about how you’re going to allow a bunch of companies to track you throughout the day; you just put your cell phone in your pocket.

But by accepting surveillance-based business models, we hand over even more power to the powerful. Google controls two-thirds of the US search market. Almost three-quarters of all internet users have Facebook accounts. Amazon controls about 30% of the US book market, and 70% of the ebook market. Comcast owns about 25% of the US broadband market. These companies have enormous power and control over us simply because of their economic position.

Our relationship with many of the internet companies we rely on is not a traditional company-customer relationship. That’s primarily because we’re not customers – we’re products those companies sell to their real customers. The companies are analogous to feudal lords and we are their vassals, peasants and – on a bad day – serfs. We are tenant farmers for these companies, working on their land by producing data that they in turn sell for profit.

Yes, it’s a metaphor, but it often really feels like that. Some people have pledged allegiance to Google. They have Gmail accounts, use Google Calendar and Google Docs and have Android phones. Others have pledged similar allegiance to Apple. They have iMacs, iPhones and iPads and let iCloud automatically synchronise and back up everything. Still others let Microsoft do it all. Some of us have pretty much abandoned email altogether for Facebook, Twitter and Instagram. We might prefer one feudal lord to the others. We might distribute our allegiance among several of these companies or studiously avoid a particular one we don’t like. Regardless, it’s becoming increasingly difficult to avoid pledging allegiance to at least one of them.

After all, customers get a lot of value out of having feudal lords. It’s simply easier and safer for someone else to hold our data and manage our devices. We like having someone else take care of our device configurations, software management, and data storage. We like it when we can access our email anywhere, from any computer, and we like it that Facebook just works, from any device, anywhere. We want our calendar entries to appear automatically on all our devices. Cloud storage sites do a better job of backing up our photos and files than we can manage by ourselves; Apple has done a great job of keeping malware out of its iPhone app store. We like automatic security updates and automatic backups; the companies do a better job of protecting our devices than we ever did. And we’re really happy when, after we lose a smartphone and buy a new one, all of our data reappears on it at the push of a button.

In this new world of computing, we’re no longer expected to manage our computing environment. We trust the feudal lords to treat us well and protect us from harm. It’s all a result of two technological trends.

The first is the rise of cloud computing. Basically, our data is no longer stored and processed on our computers. That all happens on servers owned by many different companies. The result is that we no longer control our data. These companies access our data—both content and metadata—for whatever profitable purpose they want. They have carefully crafted terms of service that dictate what sorts of data we can store on their systems, and can delete our entire accounts if they believe we violate them. And they turn our data over to law enforcement without our knowledge or consent. Potentially even worse, our data might be stored on computers in a country whose data protection laws are less than rigorous.

The second trend is the rise of user devices that are managed closely by their vendors: iPhones, iPads, Android phones, Kindles, Chromebooks, and the like. The result is that we no longer control our computing environment. We have ceded control over what we can see, what we can do, and what we can use. Apple has rules about what software can be installed on iOS devices. You can load your own documents onto your Kindle, but Amazon is able to delete books it has already sold you. In 2009, Amazon automatically deleted some editions of George Orwell’s Nineteen Eighty-Four from users’ Kindles because of a copyright issue. I know, you just couldn’t write this stuff any more ironically.

It’s not just hardware. It’s getting hard to just buy a piece of software and use it on your computer in any way you like. Increasingly, vendors are moving to a subscription model—Adobe did that with Creative Cloud in 2013—that gives the vendor much more control. Microsoft hasn’t yet given up on a purchase model, but is making its MS Office subscription very attractive. And Office 365’s option of storing your documents in the Microsoft cloud is hard to turn off. Companies are pushing us in this direction because it makes us more profitable as customers or users.

Given current laws, trust is our only option. There are no consistent or predictable rules. We have no control over the actions of these companies. I can’t negotiate the rules regarding when Yahoo will access my photos on Flickr. I can’t demand greater security for my presentations on Prezi or my task list on Trello. I don’t even know the cloud providers to whom those companies have outsourced their infrastructures. If any of those companies delete my data, I don’t have the right to demand it back. If any of those companies give the government access to my data, I have no recourse. And if I decide to abandon those services, chances are I can’t easily take my data with me.

Political scientist Henry Farrell observed: “Much of our life is conducted online, which is another way of saying that much of our life is conducted under rules set by large private businesses, which are subject neither to much regulation nor much real market competition.”

The common defence is something like “business is business”. No one is forced to join Facebook or use Google search or buy an iPhone. Potential customers are choosing to enter into these quasi-feudal user relationships because of the enormous value they receive from them. If they don’t like it, goes the argument, they shouldn’t do it.

This advice is not practical. It’s not reasonable to tell people that if they don’t like their data being collected, they shouldn’t email, shop online, use Facebook or have a mobile phone. I can’t imagine students getting through school anymore without an internet search or Wikipedia, much less finding a job afterwards. These are the tools of modern life. They’re necessary to a career and a social life. Opting out just isn’t a viable choice for most of us, most of the time; it violates what have become very real norms of contemporary life.

Right now, choosing among providers is not a choice between surveillance or no surveillance, but only a choice of which feudal lords get to spy on you. This won’t change until we have laws to protect both us and our data from these sorts of relationships. Data is power and those that have our data have power over us. It’s time for government to step in and balance things out.

Adapted from Data and Goliath by Bruce Schneier, published by Norton Books. To order a copy for £17.99 go to bookshop.theguardian.com. Bruce Schneier is a security technologist and CTO of Resilient Systems Inc. He blogs at schneier.com, and tweets at @schneierblog

Shhh… Facebook Violates EU Law as it Tracks Everyone Including Logged Out Users and Visitors

Continuing on the Facebook topic again, check out the video clip and the exclusive Guardian article below:

Facebook ‘tracks all visitors, breaching EU law’

Exclusive: People without Facebook accounts, logged out users, and EU users who have explicitly opted out of tracking are all being tracked, report says

Facebook tracks the web browsing of everyone who visits a page on its site even if the user does not have an account or has explicitly opted out of tracking in the EU, extensive research commissioned by the Belgian data protection agency has revealed.

The report, from researchers at the Centre of Interdisciplinary Law and ICT (ICRI) and the Computer Security and Industrial Cryptography department (Cosic) at the University of Leuven, and the media, information and telecommunication department (Smit) at Vrije Universiteit Brussels, was commissioned after an original draft report revealed Facebook’s privacy policy breaches European law.

The researchers now claim that Facebook tracks computers of users without their consent, whether they are logged in to Facebook or not, and even if they are not registered users of the site or explicitly opt out in Europe. Facebook tracks users in order to target advertising.

The issue revolves around Facebook’s use of its social plugins such as the “Like” button, which has been placed on more than 13m sites including health and government sites.

Facebook places tracking cookies on users’ computers if they visit any page on the facebook.com domain, including fan pages or other pages that do not require a Facebook account to visit.

When a user visits a third-party site that carries one of Facebook’s social plug-ins, it detects and sends the tracking cookies back to Facebook – even if the user does not interact with the Like button, Facebook Login or other extension of the social media site.

EU privacy law states that prior consent must be given before issuing a cookie or performing tracking, unless it is necessary for either the networking required to connect to the service (“criterion A”) or to deliver a service specifically requested by the user (“criterion B”).

The same law requires websites to notify users on their first visit to a site that it uses cookies, requesting consent to do so.

A cookie is a small file placed on a user’s computer by a website that stores settings, previous activities and other small amounts of information needed by the site. They are sent to the site on each visit and can therefore be used to identify a user’s computer and track their movements across the web.
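To make the mechanism concrete, here is a minimal sketch (Python standard library only, written for this post, not taken from any tracking network) of a tiny web server that hands each new browser a random identifier in a Set-Cookie header and then recognises that browser on every later request. It is the same basic trick, on a vastly smaller scale, that advertising networks use across millions of sites.

    # Minimal demonstration of cookie-based recognition, standard library only.
    # Run it, then reload http://localhost:8000/ a few times in a browser:
    # the first visit gets an identifier, later visits are recognised by it.
    import uuid
    from http.cookies import SimpleCookie
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class TrackingDemo(BaseHTTPRequestHandler):
        def do_GET(self):
            cookie = SimpleCookie(self.headers.get("Cookie", ""))
            visitor = cookie["visitor_id"].value if "visitor_id" in cookie else None

            self.send_response(200)
            self.send_header("Content-Type", "text/plain; charset=utf-8")
            if visitor is None:
                visitor = uuid.uuid4().hex          # new browser: mint an identifier
                self.send_header("Set-Cookie",
                                 f"visitor_id={visitor}; Max-Age=31536000; Path=/")
                body = f"First visit. You are now visitor {visitor}.\n"
            else:
                body = f"Welcome back, visitor {visitor}.\n"
            self.end_headers()
            self.wfile.write(body.encode("utf-8"))

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), TrackingDemo).serve_forever()

The first visit is anonymous; every visit after that is linked to the same identifier, which is all a tracking network needs once its cookie rides along on thousands of sites.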

“We collect information when you visit or use third-party websites and apps that use our services. This includes information about the websites and apps you visit, your use of our services on those websites and apps, as well as information the developer or publisher of the app or website provides to you or us,” states Facebook’s data usage policy, which was updated this year.

Facebook’s tracking practices have ‘no legal basis’

An opinion published by Article 29, the pan-European data regulator working party, in 2012 stated that unless delivering a service specifically requested by the user, social plug-ins must have consent before placing a cookie. “Since by definition social plug-ins are destined to members of a particular social network, they are not of any use for non-members, and therefore do not match ‘criterion B’ for those users.”

The same applies for users of Facebook who are logged out at the time, while logged-in users should only be served a “session cookie” that expires when the user logs out or closes their browser, according to Article 29.

The Article 29 working party has also said that cookies set for “security purposes” can only fall under the consent exemptions if they are essential for a service explicitly requested by the user – not general security of the service.

Facebook’s cookie policy updated this year states that the company still uses cookies if users do not have a Facebook account, or are logged out, to “enable us to deliver, select, evaluate, measure and understand the ads we serve on and off Facebook”.

The social network tracks its users for advertising purposes across non-Facebook sites by default. Users can opt out of ad tracking, but an opt-out mechanism “is not an adequate mechanism to obtain average users informed consent”, according to Article 29.

“European legislation is really quite clear on this point. To be legally valid, an individual’s consent towards online behavioural advertising must be opt-in,” explained Brendan Van Alsenoy, a researcher at ICRI and one of the report’s authors.

“Facebook cannot rely on users’ inaction (ie not opting out through a third-party website) to infer consent. As far as non-users are concerned, Facebook really has no legal basis whatsoever to justify its current tracking practices.”

Opt-out mechanism actually enables tracking for the non-tracked

The researchers also analysed the opt-out mechanism used by Facebook and many other internet companies including Google and Microsoft.

Users wanting to opt out of behavioural tracking are directed to sites run by the Digital Advertising Alliance in the US, the Digital Advertising Alliance of Canada in Canada or the European Digital Advertising Alliance in the EU, each of which allows bulk opting out from 100 companies.

But the researchers discovered that far from opting out of tracking, Facebook places a new cookie on the computers of users who have not been tracked before.

“If people who are not being tracked by Facebook use the ‘opt out’ mechanism proposed for the EU, Facebook places a long-term, uniquely identifying cookie, which can be used to track them for the next two years,” explained Günes Acar from Cosic, who also co-wrote the report. “What’s more, we found that Facebook does not place any long-term identifying cookie on the opt-out sites suggested by Facebook for US and Canadian users.”

The finding was confirmed by Steven Englehardt, a researcher at Princeton University’s department of computer science who was not involved in the report: “I started with a fresh browsing session and received an additional ‘datr’ cookie that appears capable of uniquely identifying users on the UK version of the European opt-out site. This cookie was not present during repeat tests with a fresh session on the US or Canadian version.”

Facebook sets an opt-out cookie on all the opt-out sites, but this cookie cannot be used for tracking individuals since it does not contain a unique identifier. Why Facebook places the “datr” cookie on computers of EU users who opt out is unknown.
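Anyone curious can repeat a rough version of that check. The sketch below, a Python example written for this post using the requests library (it is not part of the researchers’ toolkit), opens a fresh, logged-out session against facebook.com and prints whichever cookies come back, flagging whether a “datr” identifier is among them. What you see will depend on your region and may well have changed since the report.

    # Inspect the cookies a fresh, logged-out session receives from facebook.com.
    # Requires the 'requests' package; results depend on region and may change.
    import requests

    session = requests.Session()
    session.get(
        "https://www.facebook.com/",
        headers={"User-Agent": "Mozilla/5.0 (X11; Linux x86_64)"},  # plain browser-like UA
        timeout=15,
    )

    for c in session.cookies:
        print(f"{c.name:12s} domain={c.domain} expires={c.expires}")

    names = {c.name for c in session.cookies}
    print("Long-lived 'datr' identifier present:", "datr" in names)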

‘Privacy-friendly’ design

For users worried about tracking, third-party browser add-ons that block tracking are available, says Acar: “Examples include Privacy Badger, Ghostery and Disconnect. Privacy Badger replaces social plug-ins with privacy preserving counterparts so that users can still use social plug-ins, but not be tracked until they actually click on them.

“We argue that it is the legal duty of Facebook to design its services and components in a privacy-friendly way,” Van Alsenoy added. “This means designing social plug-ins in such a way that information about individual’s personal browsing activities outside of Facebook are not unnecessarily exposed.”

Facebook is being investigated by the Dutch data protection authority, which asked the social network to delay rollout of its new privacy policy, and is being probed by the Article 29 working party.

A Facebook spokesperson said: “This report contains factual inaccuracies. The authors have never contacted us, nor sought to clarify any assumptions upon which their report is based. Neither did they invite our comment on the report before making it public. We have explained in detail the inaccuracies in the earlier draft report (after it was published) directly to the Belgian DPA, who we understand commissioned it, and have offered to meet with them to explain why it is incorrect, but they have declined to meet or engage with us. However, we remain willing to engage with them and hope they will be prepared to update their work in due course.”

“Earlier this year we updated our terms and policies to make them more clear and concise, to reflect new product features and to highlight how we’re expanding people’s control over advertising. We’re confident the updates comply with applicable laws including EU law.”

Van Alsenoy and Acar, authors of the study, told the Guardian: “We welcome comments via the contact email address listed within the report. Several people have already reached out to provide suggestions and ideas, which we really appreciate.”

“To date, we have not been contacted by Facebook directly nor have we received any meeting request. We’re not surprised that Facebook holds a different opinion as to what European data protection laws require. But if Facebook feels today’s releases contain factual errors, we’re happy to receive any specific remarks it would like to make.”

Shhh… Why You Should Forget Facebook for Good

Do you need convincing reasons to leave Facebook for good? Look no further than this video clip and Guardian article below.

To be honest, I signed up for Facebook only late last year and use it exclusively to promote this blog. Yet I’m always having second thoughts…

Leave Facebook if you don’t want to be spied on, warns EU

European Commission admits Safe Harbour framework cannot ensure privacy of EU citizens’ data when sent to the US by American internet firms

Samuel Gibbs
@SamuelGibbs

Thursday 26 March 2015 19.11 GMT

The European Commission has warned EU citizens that they should close their Facebook accounts if they want to keep information private from US security services, finding that current Safe Harbour legislation does not protect citizens’ data.

The comments were made by EC attorney Bernhard Schima in a case brought by privacy campaigner Maximilian Schrems, looking at whether the data of EU citizens should be considered safe if sent to the US in a post-Snowden revelation landscape.

“You might consider closing your Facebook account, if you have one,” Schima told attorney general Yves Bot in a hearing of the case at the European court of justice in Luxembourg.

When asked directly, the commission could not confirm to the court that the Safe Harbour rules provide adequate protection of EU citizens’ data as it currently stands.

The US no longer qualifies

The case, dubbed “the Facebook data privacy case”, concerns the current Safe Harbour framework, which covers the transmission of EU citizens’ data across the Atlantic to the US. Without the framework, it is against EU law to transmit private data outside of the EU. The case collects complaints lodged against Apple, Facebook, Microsoft, Microsoft-owned Skype and Yahoo.

Schrems maintains that companies operating inside the EU should not be allowed to transfer data to the US under Safe Harbour protections – which state that US data protection rules are adequate if information is passed by companies on a “self-certify” basis – because the US no longer qualifies for such a status.

The case argues that the US government’s Prism data collection programme, revealed by Edward Snowden in the NSA files, which sees EU citizens’ data held by US companies passed on to US intelligence agencies, breaches the EU’s Data Protection Directive “adequacy” standard for privacy protection, meaning that the Safe Harbour framework no longer applies.

Poland and a few other member states as well as advocacy group Digital Rights Ireland joined Schrems in arguing that the Safe Harbour framework cannot ensure the protection of EU citizens’ data and therefore is in violation of the two articles of the Data Protection Directive.

The commission, however, argued that Safe Harbour is necessary both politically and economically and that it is still a work in progress. The EC and the Irish data protection watchdog argue that the EC should be left to reform it with a 13-point plan to ensure the privacy of EU citizens’ data.

“There have been a spate of cases from the ECJ and other courts on data privacy and retention showing the judiciary as being more than willing to be a disrupting influence,” said Paula Barrett, partner and data protection expert at law firm Eversheds. “Bringing down the safe harbour mechanism might seem politically and economically ill-conceived, but as the decision of the ECJ in the so-called ‘right to be forgotten’ case seems to reinforce that isn’t a fetter which the ECJ is restrained by.”

An opinion on the Safe Harbour framework from the ECJ is expected by 24 June.

Facebook declined to comment.

Shhh… ProtonMail: Email Privacy and Encryption

Sending an email message is like sending a postcard. That’s the message Hillary Clinton probably now wishes she had heard earlier.

Andy Yen, a scientist at CERN – the European Organization for Nuclear Research – co-founded ProtonMail, an encrypted email startup based in Geneva, Switzerland. As he explained in this TED Talk, it is possible to make encryption easy enough for everyone to use and to keep all email private.

But curiously, it seems so much like PGP.
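The resemblance is not accidental: like PGP, ProtonMail’s core idea is end-to-end encryption, where the message is locked with a key only the recipient holds, so the provider relays and stores nothing but ciphertext. Here is a minimal sketch of that hybrid scheme using the third-party Python cryptography library; it is an illustration of the concept under simplified assumptions, not ProtonMail’s or OpenPGP’s actual implementation (which adds signatures, key management and a standard message format).

    # Hybrid (PGP-style) encryption sketch: AES for the message body,
    # RSA for wrapping the one-off AES key. Requires the 'cryptography' package.
    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    # Recipient's keypair (in reality generated once, with the public half shared).
    recipient_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    recipient_public = recipient_private.public_key()

    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)

    # Sender side: seal the message with a one-off key, then seal that key.
    message = b"Sending an email message is like sending a postcard."
    session_key = AESGCM.generate_key(bit_length=256)   # one-off key per message
    nonce = os.urandom(12)
    ciphertext = AESGCM(session_key).encrypt(nonce, message, None)
    wrapped_key = recipient_public.encrypt(session_key, oaep)

    # The provider only ever stores or relays: wrapped_key, nonce, ciphertext.

    # Recipient side: unwrap the key, then decrypt the message.
    recovered_key = recipient_private.decrypt(wrapped_key, oaep)
    plaintext = AESGCM(recovered_key).decrypt(nonce, ciphertext, None)
    assert plaintext == message
    print(plaintext.decode())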

Shhh… How Come Obama Suddenly Understood & Explained to China Why Backdoors into Encryption Are Really Bad?

“Those kinds of restrictive practices I think would ironically hurt the Chinese economy over the long term because I don’t think there is any US or European firm, any international firm, that could credibly get away with that wholesale turning over of data, personal data, over to a government.”

That’s a quote from Obama reported in The Guardian (see article below).

Oh great, so Obama actually understood the consequences of governments gaining backdoors into encryption? He should give the same advice to his NSA director, Mike Rogers, who somehow struggled when asked about the issue recently.

Building backdoors into encryption isn’t only bad for China, Mr President

Trevor Timm
@trevortimm
Wednesday 4 March 2015 16.15 GMT

Want to know why forcing tech companies to build backdoors into encryption is a terrible idea? Look no further than President Obama’s stark criticism of China’s plan to do exactly that on Tuesday. If only he would tell the FBI and NSA the same thing.

In a stunningly short-sighted move, the FBI – and more recently the NSA – have been pushing for a new US law that would force tech companies like Apple and Google to hand over the encryption keys or build backdoors into their products and tools so the government would always have access to our communications. It was only a matter of time before other governments jumped on the bandwagon, and China wasted no time in demanding the same from tech companies a few weeks ago.

As President Obama himself described to Reuters, China has proposed an expansive new “anti-terrorism” bill that “would essentially force all foreign companies, including US companies, to turn over to the Chinese government mechanisms where they can snoop and keep track of all the users of those services.”

Obama continued: “Those kinds of restrictive practices I think would ironically hurt the Chinese economy over the long term because I don’t think there is any US or European firm, any international firm, that could credibly get away with that wholesale turning over of data, personal data, over to a government.”

Bravo! Of course these are the exact arguments for why it would be a disaster for the US government to force tech companies to do the same. (Somehow Obama left that part out.)

As Yahoo’s top security executive Alex Stamos told NSA director Mike Rogers in a public confrontation last week, building backdoors into encryption is like “drilling a hole into a windshield.” Even if it’s technically possible to produce the flaw – and we, for some reason, trust the US government never to abuse it – other countries will inevitably demand access for themselves. Companies will no longer be in a position to say no, and even if they did, intelligence services would find the backdoor unilaterally – or just steal the keys outright.

For an example of how this works, look no further than last week’s Snowden revelation that the UK’s intelligence service and the NSA stole the encryption keys for millions of Sim cards used by many of the world’s most popular cell phone providers. It’s happened many times before too. As security expert Bruce Schneier has documented with numerous examples, “Back-door access built for the good guys is routinely used by the bad guys.”

Stamos repeatedly (and commendably) pushed the NSA director for an answer on what happens when China or Russia also demand backdoors from tech companies, but Rogers didn’t have an answer prepared at all. He just kept repeating “I think we can work through this”. As Stamos insinuated, maybe Rogers should ask his own staff why we actually can’t work through this, because virtually every technologist agrees backdoors just cannot be secure in practice.

(If you want to further understand the details behind the encryption vs. backdoor debate and how what the NSA director is asking for is quite literally impossible, read this excellent piece by surveillance expert Julian Sanchez.)

It’s downright bizarre that the US government has been warning of the grave cybersecurity risks the country faces while, at the very same time, arguing that we should pass a law that would weaken cybersecurity and put every single citizen at more risk of having their private information stolen by criminals, foreign governments, and our own.

Forcing backdoors will also be as disastrous for the US economy as it would be for China’s. US tech companies – which have already suffered billions of dollars of losses overseas because of consumer distrust over their relationships with the NSA – would lose all credibility with users around the world if the FBI and NSA succeed with their plan.

The White House is supposedly coming out with an official policy on encryption sometime this month, according to the New York Times – but the President can save himself a lot of time and just apply his comments about China to the US government. If he knows backdoors in encryption are bad for cybersecurity, privacy, and the economy, why is there even a debate?

Shhh… NSA Wants Framework to Access Encrypted Communications

NSA Director Admiral Michael Rogers said at a cybersecurity conference in Washington DC on Monday that the government needs to develop a “framework” so that the NSA and law enforcement agencies can read encrypted data when they need to. He was immediately challenged by top security experts from the tech industry, most notably Yahoo’s chief information security officer Alex Stamos (see transcript).

Obama's Still On the Wrong Frequency On Cybersecurity Issues

This is probably the most telling moment of how US President Barack Obama is still on the wrong frequency on cyber matters…

Obama blamed the “impact on their [the tech companies] bottom lines” for the mistrust between the government and Silicon Valley in the aftermath of the Snowden revelations. Those were his own words, straight from the president’s mouth rather than read from a script, in an exclusive interview with Re/code’s Kara Swisher (see video below) following the well-publicized cybersecurity summit at Stanford University last Friday, when he signed an executive order to encourage the private sector to share cybersecurity threat information with other companies and the US government.

Contrast that with the high-profile speech by Apple CEO Tim Cook (see video below), who warned about “life and death” and “dire consequences” in sacrificing the right to privacy as technology companies had a duty to protect their customers.

His speech was delivered before Obama’s address to the summit – which the White House organized to foster better cooperation and the sharing of private information with Silicon Valley – an event best remembered for the absence of leaders from tech giants like Google, Yahoo and Facebook, who snubbed Obama amid growing tensions between Silicon Valley and his administration. These are heavyweights whom Obama counted as “my friends” in the Re/code interview (watch his expression closely at the 39th second of the clip above).

Shhh… New Search Engine Memex to Reach the Other 95% of the Web (And Dark Web) that Google Missed

Popular search engines like Google, Yahoo and Bing can access only about 5 percent of the content on the internet. That’s one good reason to be excited about Memex, a new breed of search engine, now in beta, developed by the US military’s Defense Advanced Research Projects Agency (DARPA). It is capable of ploughing through the entire web, including the Dark Web – the part (much of the other 95 percent) of the cyber ecosystem where criminals operate in the shadows to buy, sell and advertise illegal trades such as weapons and sex trafficking.

Find out more about Memex from this exclusive 60 Minutes clip:


And more about the Dark Web:

New US Sanctions on North Korea – Comparing Sony & the World’s Biggest Data Breaches

In what looks like the opening salvo in response to the major cyberattack on Sony Pictures Entertainment, the United States slapped North Korea with a new round of sanctions last Friday when President Obama signed an Executive Order authorizing the imposition of sanctions and designated 3 entities and 10 individuals for being agencies or officials of the North Korean government.

According to a Treasury Department statement:

[Treasury Department statement designating the three entities and 10 individuals]

The identifiers of these 10 individuals are:

[Treasury Department list of identifiers for the 10 designated individuals]

But the US government knows sanctions have had limited impact on the Hermit Kingdom. The new sanctions might be deemed swift and decisive measures in some quarters, but they are really nothing more than window-dressing – much like miming a gun with one’s fingers under a coat as a first warning, at best. Consider, for example, what kind of impact one should expect from these new sanctions: the three organizations were already on the US sanctions list, and the 10 North Koreans are highly unlikely to have assets in the US, at least not under their own names.

In any case, 2015 is likely to bring many more headlines about catastrophic data breaches.

And the Sony cyberattack actually pales in comparison to other data breaches on record, as shown below by independent data journalist and information designer David McCandless – on his blog, you can also click on the bubbles to find out more about the cases shown in the chart and table he has nicely compiled and presented.

[Chart: the world’s biggest data breaches, visualised by David McCandless]

Shhh… When the Postman Became A Spy

Question: If the NSA managed to threaten and pressure internet and technology giants like Yahoo, Google, Apple and Facebook into handing over our metadata, who else could it target?

The US Postal Service?

And why not – since information such as the names, addresses and postmark dates of both senders and recipients, conveniently splashed across envelopes and packages, could provide valuable investigative leads to law enforcement agencies?

As it turned out, the USPS Office of Inspector General (OIG) – the internal watchdog of the postal service – found that “USPS captured information from the outside of about 49,000 pieces of consumer mail in 2013 and turned much of it over to law enforcement organizations throughout the country, unbeknownst to the intended senders and recipients” – see full story below.

And why, then, should one trust postal services outside the US – given that the Snowden revelations also showed how intelligence agencies across the globe have duly followed the NSA’s lead?

The US Postal Service has been quietly surveilling more mail than anyone thought

Program captured information from the outside of 49,000 pieces of mail in 2013 alone, sharing it with law enforcement agents

By Carl Franzen on October 28, 2014 02:15 pm

Snail mail is growing steadily less popular thanks to the internet, but people in the US still send lots of it every year — over 158 billion pieces of mail were handled by the US Postal Service in 2013 alone. As it turns out, the USPS has also been quietly spying on way more of the mail passing through its doors than previously acknowledged. A report from the agency’s internal watchdog — the USPS Office of Inspector General (OIG) — found that USPS captured information from the outside of about 49,000 pieces of consumer mail in 2013 and turned much of it over to law enforcement organizations throughout the country, unbeknownst to the intended senders and recipients. This information reportedly did not include the contents of letters and packages, but rather was limited to the information appearing only on the exterior, such as names, addresses, and postmark dates.

The report on the USPS information capturing program, called “mail covers,” was initially published to little fanfare over the summer and subsequently reported on by Politico, but is getting more attention now with an article appearing today in The New York Times that includes additional details.

First some background: the mail covers program is hardly new, it’s been in existence for over a hundred years, as The Times notes. It’s also not as invasive as a full search warrant for the contents of mail, which the USPS also grants (although only for federal search warrants; state search warrants aren’t accepted by the agency). In a guide for law enforcement agencies, the USPS explains exactly how the program works: a police officer/law enforcement agent needs to be already conducting an investigation into a suspected felony and have the names and addresses for their intended surveillance targets. The officer must send this information to the USPS through the mail or provide it verbally (in person or over the phone), along with a reason why the mail cover is needed. Then the USPS will begin capturing the information from the exterior of all the targets’ incoming and outgoing mail for up to 30 days (although extensions are available). The USPS says that “information from a mail cover often provides valuable investigative leads,” but adds that it “is confidential and should be restricted to those persons who are participating in the investigation.”

However, as the OIG report found, there are numerous problems with the way the USPS has been running the mail covers program. For starters, the USPS has a mail cover app that apparently doesn’t work very well and is blamed for the agency continuing to capture information from the mail of 928 targets even after the surveillance period was supposed to have ended. The USPS also appears to have started mail cover surveillance on targets without sufficient justification from law enforcement as to why it was needed, and some USPS employees didn’t even keep the written justification on file like they were supposed to. And in a further failure of duty, several mail covers weren’t started on time. Perhaps most troubling of all, the USPS doesn’t appear to have been accurately reporting the total number of mail covers in its official records provided to the Times under Freedom of Information Act requests, which show only 100,000 total requests for mail surveillance between 2001 and 2012 (an average of 8,000 a year, way fewer than the 49,000 mail covers acknowledged in the OIG report). The USPS said it agreed with the findings of the OIG report and would work to implement changes, but for an agency already struggling with how to move into the future, the findings are hardly good news.

Post-Snowden, the US Reaps a Security Whirlwind

From China with Love

It’s the one-year anniversary of what is now known as the Snowden revelations, which began on June 5, 2013, when The Guardian broke news of classified National Security Agency documents, and continued on June 9, when Edward Snowden revealed himself in Hong Kong as the source of those leaks.

There is still much to decipher from the chronology of events in the aftermath and the sudden global awakening to the end of privacy. Among the impacts on the personal, business and political fronts, one salient feature has been the hypocritical rhetorical sparring between the US and China in recent weeks, which could set the undertone for US-Sino relations for years to come.

Snowden said his biggest fear is that nothing would change following his bold decision a year ago.

You can find the entire column here.

Shhh… Heartbleed Check & Fix

The open source OpenSSL project revealed on Monday a serious security vulnerability in its encryption library, known as the “Heartbleed” bug. OpenSSL is used by two-thirds of the web to encrypt data, ie to protect usernames, passwords and any sensitive information on secure websites. Yahoo is said to be the most exposed to Heartbleed, but the company said it has fixed the core vulnerability on its main sites. There are several things you need to do to check for the Heartbleed bug and protect yourself from it, apart from changing your passwords. And according to the Tor Project, staying away from the internet entirely for several days might be a good idea.

Check these YouTube video clips for more information – and find out how to fix it on Ubuntu Linux.
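As a quick first hint on a machine you control, the small Python sketch below (standard library only) reports which OpenSSL build your Python is linked against and flags the releases known to be affected, 1.0.1 through 1.0.1f, with the fix arriving in 1.0.1g. It is only a hint: other programs may link a different OpenSSL copy, distributions often backport the fix without changing the version string, and public-facing servers need to be tested separately, so treat your distribution’s advisories and the clips above as the authority.

    # Quick, local-only hint: which OpenSSL is this Python linked against,
    # and is it one of the Heartbleed-affected releases (1.0.1 to 1.0.1f)?
    # Note: other programs on the machine may link a different OpenSSL copy,
    # and a distro may have backported the fix without bumping the version.
    import re
    import ssl

    version = ssl.OPENSSL_VERSION            # e.g. "OpenSSL 1.0.1e 11 Feb 2013"
    print("Linked against:", version)

    if re.search(r"\b1\.0\.1[a-f]?\b", version):
        print("Potentially vulnerable to Heartbleed (CVE-2014-0160): "
              "upgrade to 1.0.1g or your distribution's patched package, "
              "then restart services that use OpenSSL.")
    else:
        print("Not an affected 1.0.1 release, judging by the version string alone.")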

The Growing Hacker Epidemic

Time for Standardized Data Breach Law

The latest hack on the Bitcoin exchange Mt. Gox, which led to its sudden bankruptcy in late February, and the spate of recent cyber-attacks have prompted warnings of a wave of serious cybercrime ahead as hackers continue to breach the antiquated payment systems of companies, including many top retailers.

Stock exchange regulators like the American SEC have rules requiring disclosure when company databases are hacked, but the general public is often at the mercy of private companies that are less inclined, or not compelled, to raise red flags.

The private sector, policymakers and regulators have been slow to respond and address the increasing threats and sophistication of cybercriminals – only 11 percent of companies adopt industry-standard security measures, leaving our personal data highly vulnerable.

Time for a standardized data breach law?

Find out more from my latest column posted here and there.

The Enemies of the US

Take your pick: Edward Snowden, Internet and phone service providers, or just everybody?

The furor over the past week about how US intelligence agencies like the National Security Agency and the Federal Bureau of Investigation have for years scooped up massive loads of private communications data raises one critical and distressing question.

Who, worldwide and in the US, is the general public supposed to trust now that it seems all forms of digital and cyber communications risk being read by the American authorities? The Americans, it seems, don’t believe it’s that big a deal. By 62% to 34%, according to the latest poll by Pew Research and the Washington Post, they say it’s more important to investigate the threats than to protect their privacy. But what about the rest of the world?

The immediate acknowledgement, rather than point blank denial, of the massive clandestine eavesdropping programs is no doubt alarming even for those long suspicious of such covert undertakings. But the more disturbing part is that the official response amounts to plain outright lies.

Please read this entire Opinion Column here.