The iPhone X #PrivacyFail


Who doesn’t get excited about the latest toys from Apple? We certainly do, but of course we look at everything from the personal privacy perspective. And Apple seems to have missed a huge privacy fail in yesterday’s release of the iPhone X: FaceID.

FaceID is touted as the fastest and most futuristic way to unlock your phone without touching a button. However, this intricate system, made up of an infrared camera, the A11 Bionic chip, a flood illuminator and a dot projector, is about to create a massive privacy invasion!

Consider how this new tool will be exploited by Snapchat and Facebook. They could literally grab a 3D image of your face from your shiny new iPhone and store a crisp scan built from 30,000 infrared dots. These scans (which are undoubtedly valuable) will then be sold to third-party data brokers. And, because we know from history that data brokers and credit bureaus have shoddy security practices, imagine a data breach that leaves your 30,000-dot facial scan available all over the internet.

In another scenario, advertisers could use the hardware to determine your emotional reaction to the content shown to you in real time. This would all happen without the user realizing it and would be an immensely valuable tool for retailers, advertisers, politicians and anyone else who wants to get inside your head. Remember, companies will go to any length to make a profit.

Journalists have written about concerns with police forcing an individual to unlock the phone with their face. There are also concerns that the technology could be used to predict sexual preferences and even introduce racial bias into mobile applications. These are all serious concerns that require some pretty quick answers.

PrivacyMate advises you to stay away from the iPhone X until these issues can be researched further. For now, we must all agree it is one big #privacyfail.

Data Brokers and People Search Sites: You’re on Notice!


Organizations that collect, analyze and provide personal data profiles for use by others are now on notice that errors in those reports — even those that to some may seem flattering or inconsequential — may be harmful to data subjects and create legal liability. This should increase organizations’ responsibility to verify data accuracy, allow data subjects the opportunity to correct — or rectify — inaccuracies, and evaluate the privacy consequences of using personal data for profiling.

This comes following the ruling in the Spokeo case.

A Virginia resident named Thomas Robins sued Spokeo in 2011, alleging that the information about him on the site was largely erroneous. According to the suit, Spokeo “incorrectly stated that he was in his 50s, that he was married, that he was employed in a professional or technical field, and that he has children.”

Robins alleged that Spokeo violated federal law by not making reasonable efforts to confirm the information before selling it to third parties. The lawsuit says Robins, who was unemployed at the time, may have lost job opportunities as a result of Spokeo’s inaccurate information. He alleged that he suffered “actual harm in the form of anxiety, stress, concern, and/or worry about his diminished employment prospects.”

Turns out the court system agreed.

The U.S. Supreme Court weighed in on Robins’ claim of being “harmed” by Spokeo. In a 6-2 ruling, the high court held that Robins must show “real” harm for his lawsuit to go forward and told the 9th Circuit to take another look. On that second look, the 9th Circuit found that “Robins had alleged injuries that were sufficiently concrete” to merit a lawsuit.

PrivacyMate works daily with subscribers who do not want their information to be online. We routinely see how these sites’ fine print attempts to avoid responsibility for inaccurate information. Take Spokeo, for example: “Spokeo does not verify this public information,” it declares, and “makes no guarantees to Spokeo users about the accuracy, legitimacy or legality of any information or how recently any information was collected or updated.”

That won’t fly much longer, data brokers and people search sites.  Robins won.  The high court spoke.  You are now on notice.

50% off PrivacyMate for those Affected by Equifax Breach


We are offering 50% off our service for those affected by the Equifax breach. Equifax announced a major data breach yesterday affecting some 143 million Americans’ personal information. The breach included unauthorized access to sensitive personal information, including names, Social Security numbers and driver’s license numbers, between mid-May and July of this summer.

Sign up here today and use code “EQUIFAX” to obtain 50% off our service for one year.

PrivacyMate removes your sensitive personal information from data brokers, telemarketers, junk mailers and public search sites. This includes your name, age, date of birth, address, previous addresses, family members’ names, Social Security number, email address and more. Our service, coupled with robust identity theft protection, provides the full spectrum of personal privacy and identity theft protection available in the United States.

In addition, for those affected, Equifax is offering complimentary identity theft protection and credit file monitoring:  information can be found at equifaxsecurity2017.com/enroll

Now that Angie’s List is free, you become the product


Angie’s List is now free. With over 3 million users, there must be a catch, right? The catch is YOU. If you are an Angie’s List subscriber, you are now the product.

At a launch party in New York, Angie’s List CEO Scott Durchslag said, “This is a transformative moment for the new Angie’s List as we welcome more consumers into our community so they can experience the industry-leading value and unique services that we provide.”

Translation:  we cannot believe we didn’t figure this out sooner; we are sitting on a treasure trove of personal information and the data brokers are now funding our business. Come join the free party that is Angie’s List. Hallelujah!

Why is this a #privacyfail?  Because it is being done in a less than transparent manner.  Let us explain:

First, check out Angie’s consumer-facing privacy policy here:  it says “Angie’s List does not sell, rent, or trade your email address or any other personal information. We respect your privacy and take the responsibility of protecting the information that you share with us very seriously.”

But when you click through to Angie’s complete privacy policy page, the truth becomes clear: Angie’s List is collecting your name, email, home and cell phone numbers, and address. They are also tracking how long you stay on their site, where you last visited and what hyperlinks you click. They even admit to using other tracking technologies to “understand which pages you visit on their site and other ways you interact with their site and services, such as purchases you make,” and they say the reason is “to optimize and tailor our services for you.”

Bogus.

They even admit to linking the information they record using tracking technologies to the Personal Information they collect.  Why?  Because data brokers pay tons of money for that packaged information.  After all, data is the new currency.
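
To make the mechanics concrete, here is a minimal sketch of how in-page click tracking can be joined to a known member identity. This is our own illustration, not Angie’s List’s actual code; the cookie name (“member_id”) and the endpoint (“/analytics/event”) are hypothetical stand-ins.

```typescript
// Illustrative sketch only -- not Angie's List's actual code.
// The cookie name ("member_id") and endpoint ("/analytics/event") are
// hypothetical stand-ins for whatever a given site actually uses.

function getMemberId(): string | null {
  // A first-party cookie set at login ties every event to a real account.
  const match = document.cookie.match(/(?:^|;\s*)member_id=([^;]+)/);
  return match ? decodeURIComponent(match[1]) : null;
}

document.addEventListener("click", (event) => {
  const target = event.target;
  if (!(target instanceof Element)) return;

  const link = target.closest("a");
  if (!link) return;

  // Each event records which page you are on, which hyperlink you clicked,
  // and who you are -- so the "analytics" is no longer anonymous.
  const payload = {
    memberId: getMemberId(),
    page: location.pathname,
    href: link.href,
    clickedAt: Date.now(),
  };

  // sendBeacon quietly ships the event to the server, even mid-navigation.
  navigator.sendBeacon("/analytics/event", JSON.stringify(payload));
});
```

The specific code matters less than the pattern: once tracking events are joined to the Personal Information a site already holds, the resulting profile is exactly the kind of packaged data a broker will pay for.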

When you go from charging $9.99 per year to 3 million members to a free product, you are giving up nearly $30 million in annual revenue (3,000,000 members × $9.99 ≈ $30 million). Think about that for a second. One can only imagine how much they are cashing in on our personal information.

In today’s day and age you must be more transparent.

How can you say on one hand that you don’t sell, rent or trade our personal information but then admit you do?  The following comes straight from Angie’s List privacy policy:

Disclosures and Transfers of Information
We do not disclose Personal Information to third parties, except when one or more of the following conditions is true:
  • We have your permission to make the disclosure;
  • The disclosure is necessary for the purpose for which the personal information was obtained; [huh?]
  • The disclosure is to the service provider from whom you purchased services through Angie’s List’s platform, including without limitation Big Deals, Storefronts, and project submissions;
  • The disclosure is to financial service providers in order to fulfill and carry out the purchase and provision of goods and services requested by you;
  • The disclosure is permitted by relevant law;
  • The Personal Information to be disclosed is otherwise publicly available in accordance with the applicable law;
  • The disclosure is reasonably related to the sale or other disposition of all or part of our business or assets;
  • The disclosure is for our own marketing purposes (including, without limitation, for Angie’s List to market services to you on third-party social media platforms such as Facebook), or, with your authorization, for the marketing purposes of third parties;
  • The disclosure is combined with information collected from other companies and used to improve and personalize services, content, and advertising from us or third parties;
  • The party to whom the disclosure is made controls, is controlled by, or is under common control with Angie’s List;
  • The disclosure is in our sole discretion necessary for the establishment or maintenance of legal claims or legal compliance, to satisfy any law, regulation, subpoena or government request, or in connection with litigation;
  • The disclosure is in our sole discretion about users who we believe are engaged in illegal activities or are otherwise in violation of our Angie’s List Membership Agreement, even without a subpoena, warrant or court order; or
  • The disclosure is to outside businesses to perform certain services for us, such as maintaining our Site and Services, mailing lists, processing orders and delivering products and services, sending postal mail, processing claims for lost or stolen certificates, providing marketing assistance, confirming your identity for review integrity, and data analysis (“Administrative Service Providers”), including Administrative Service Providers outside the country or jurisdiction in which you reside.

Translation:  we collect your personal information, package it up and sell it to the highest bidder.  

#privacyfail 

 

Robocall Settlement Pays You $300 Per Call


If you received a pre-recorded “robo” call about a free cruise, you can get paid $300 per call and up to $900 as part of a legal settlement.
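
For a rough sense of the math, here is a tiny sketch (ours, not the settlement administrator’s) of the payout terms described above: $300 per qualifying call, capped at $900 in total. Final amounts are determined through the claims process.

```typescript
// Back-of-the-envelope sketch of the payout terms described above:
// $300 per qualifying call, capped at $900 (i.e., at most three calls).
// Actual amounts are determined by the settlement administrator.

const PER_CALL_USD = 300;
const CAP_USD = 900;

function estimatedPayout(qualifyingCalls: number): number {
  return Math.min(qualifyingCalls * PER_CALL_USD, CAP_USD);
}

console.log(estimatedPayout(1)); // 300
console.log(estimatedPayout(2)); // 600
console.log(estimatedPayout(5)); // 900 (capped)
```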

Check here to see if your home or mobile line was involved.


The story:  Over five years ago, Mr. Charvat began receiving prerecorded telemarketing calls promoting the goods or services of Carnival, Royal Caribbean and Norwegian cruise lines. As Mr. Charvat did not consent to receive such calls, they were in violation of the Telephone Consumer Protection Act, 47 U.S.C. § 227 (“TCPA”). It turned out that Mr. Charvat was not alone, and that millions of other consumers nationwide received identical prerecorded telemarketing calls that were made by Defendant Resort Marketing Group, Inc. and its principal, Elizabeth Valente.

The lawsuit:  Charvat vs. Resort Marketing Group concerns an alleged violation of the Telephone Consumer Protection Act (TCPA) when a company called Resort Marketing Group made unwanted automated calls to consumers offering a free cruise with Carnival, Royal Caribbean and Norwegian cruise lines.


The settlement:  To resolve this matter without the expense, delay and uncertainties of litigation, the parties have reached a Settlement, which resolves all claims against the Defendants. The Settlement is not an admission of wrongdoing by the Defendants and does not imply that there has been, or would be, any finding that the Defendants violated the TCPA.


How you benefit:  You are included in the Settlement as a Settlement Class Member if you were the owner, subscriber, or user of a residential or cellular telephone line that received pre-recorded telemarketing calls initiated by RMG between July of 2009 and March of 2014, during which you were offered a free cruise with Carnival, Royal Caribbean or Norwegian cruise lines, and your phone number is contained in the call records produced by RMG in this case. The Call Records contain all phone numbers that RMG used to initiate pre-recorded telemarketing calls to promote its business.

For more information and to check if you are entitled to settlement funds, visit this page.

The Woeful Protective Measures of Data Brokers


Alex Haynes, a security researcher with the InfoSecurity group, wrote an eye-opening piece about data brokers this week for infosecurity-magazine.com. The article can be found here: https://www.infosecurity-magazine.com/opinions/are-data-brokers-actually-secure/

Mr. Haynes concludes his piece with the following food for thought:

“So what does this all mean? Firstly, data broker protective measures are woeful, mainly consisting of security-as-a-service offerings and security seals, which are not effective countermeasures on their own but best placed in a defense-in-depth stack. The fact that most sites don’t even implement transport layer security as standard shows the lag they have with the security of mainstream sites today.

Secondly, it means we are now entering the age of the mega breach. In the way that breaches in the hundreds of millions of records are becoming the norm today, we will soon become accustomed to breaches containing billions of records a few years from now.

Lastly, this is an area crying out for regulation. Opening up access to larger and larger pools of consumer data should bring with it corresponding shifts in security obligations, which are sadly lacking today – at least in the United States. Until then, it’s watch and wait.”
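
As a rough illustration of the transport layer security point, here is a small sketch (ours, not Mr. Haynes’) of how one might check whether a site pushes visitors from plain HTTP to HTTPS and sends an HSTS header. The domain shown is a placeholder, not a verdict on any particular broker.

```typescript
// Illustrative sketch: does a site redirect plain HTTP to HTTPS and send an
// HSTS (Strict-Transport-Security) header? Runs on Node.js with @types/node.
// The domain below is a placeholder only.
import * as http from "node:http";
import * as https from "node:https";

type HeadResult = { status: number; headers: http.IncomingHttpHeaders };

function headRequest(url: string): Promise<HeadResult> {
  return new Promise((resolve, reject) => {
    const onResponse = (res: http.IncomingMessage) => {
      resolve({ status: res.statusCode ?? 0, headers: res.headers });
      res.resume(); // discard any response body
    };
    const req = url.startsWith("https:")
      ? https.request(url, { method: "HEAD" }, onResponse)
      : http.request(url, { method: "HEAD" }, onResponse);
    req.on("error", reject);
    req.end();
  });
}

async function checkTransportSecurity(domain: string): Promise<void> {
  // 1. Does the plain-HTTP site redirect visitors to HTTPS?
  const plain = await headRequest(`http://${domain}/`);
  const redirectsToHttps = (plain.headers.location ?? "").startsWith("https://");

  // 2. Does the HTTPS site tell browsers to stay on HTTPS in the future?
  const secure = await headRequest(`https://${domain}/`);
  const hsts = secure.headers["strict-transport-security"];

  console.log(`${domain}: HTTP status ${plain.status}, redirects to HTTPS: ${redirectsToHttps}`);
  console.log(`${domain}: HSTS header: ${hsts ?? "none"}`);
}

checkTransportSecurity("example.com").catch(console.error);
```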

PrivacyMate applauds this work by Mr. Haynes. We agree: we are in the era of the potential mega breach. After all, we are leaving our personal and sensitive information in the open, flimsy databases of the data brokers. The data brokers are making it impossible to exercise our constitutional right to privacy by opting out of their practices. And their motive? Money. Greed. In an era where big data equals big profits, what has been lost is our freedom to control who has our personal and sensitive information.

PrivacyMate has joined the working group in Vermont to create legislation regulating data brokers and we will fight for consumer rights until these practices are curbed and regulated.

Are Disney and Nickelodeon Apps Safe for Kids?


A parent in California believes that Disney and Nickelodeon are using online websites and applications to collect, store and sell their children’s personal information. The lawsuits claim that children under 13 years old who use the apps have their personal information taken for purposes of ‘future commercial exploitation’. This would be a violation of COPPA.

The Children’s Online Privacy Protection Rule, better known as COPPA, requires that any app, website, or service directed to children disclose what data it is collecting, get parental approval to collect it, and give parents the ability to opt out of having their kids’ data shared.

The lawsuits claim that most users, including the parents of children using apps, “do not know that apps created for children are engineered to surreptitiously and unlawfully collect the child-users’ personal information,” which is then shared for “advertising and other commercial purposes.”

In addition, the suits allege that these applications are collecting information and creating full and unique online tracking profiles of our children (the same way it is done for adults) in violation of federal law. Others named as defendants include Disney and Viacom partners who make and launch their apps, including Kochava, Unity, and Upsight.

Viacom is Nickelodeon’s parent company.

The lawsuits ask for the following relief:

  • That the court find the defendants — Disney, Viacom, and the software companies building the apps — in violation of COPPA
  • That the defendants immediately stop and be permanently barred from collecting children’s data this way
  • That the companies destroy any data they already collected unlawfully
  • That the court determine both statutory damages (i.e. delineated fines for breaking the law) and punitive damages

This wouldn’t be the first time that popular kids’ sites got into trouble for illegally tracking children online. One wonders what the privacy policies of those applications say about their collection of information.

Really, CVS?


As the co-founder of a privacy company, it is safe to say I’m uber-focused on my own experiences when it comes to my personal and sensitive data and that of my family.

This past week I was irked big time. I know pharmacy companies are notorious for pushing us to hand over our personal and sensitive information. But I am shocked at the extent to which they try to block me from opting out of such activities.

Let me explain.

I visited CVS Pharmacy to pick up a prescription for my wife. I went through the already painful process of announcing her name and date of birth in front of 3 strangers and 2 pharmacists (first, this probably runs afoul of HIPAA, and I never gave CVS our marriage certificate, so how do they know I’m her husband? Second, couldn’t one of the strangers behind me then use her name and date of birth to steal her identity?).

I digress.

The pharmacist returned with her prescription and checked me out.  At the register, once I swiped my card I was asked to look at the computer screen on the card reader.  The pharmacist asked me to answer the question on the screen, which read:

“IS (XXX) XXX-XXXX THE NUMBER YOU WISH TO RECEIVE
MARKETING, ALERTS, PRESCRIPTION INFORMATION
AND OTHER ALERTS RELATED TO YOUR CARE?”

Of course my answer is going to be “NO”... but here’s the problem. I was not allowed to answer no. I was only given the following two options:

“YES”    or    “INFO”
(note – the “INFO” button was greyed out)

Huh?

I asked the pharmacist how I could answer no to CVS’ collecting, storing and sharing of my wife’s cell phone number, and he told me to press “INFO”. I did, and out popped the following receipt from the register:

[Photo of the receipt the register printed]

I asked the pharmacist what this was.  He said he didn’t know.  Thanks, Mr. Pharmacist.  Appreciate you.

So I called the 800 number and it was an automated system that listed 9 options for me – none of which had anything to do with their privacy practices.  I repeated the words “customer service” and “other” until it took me to a new menu.  This new menu had as its second option “calls”.  So I said “calls”.

I was now with a live agent.  I told him I was calling about my privacy preferences and wanted to opt-out of CVS’ phone number program.  He was confused.  I thought to myself – do people not do this often?  

After about 7 minutes he finally understood why I was calling. He told me the number would be removed from CVS’s system within 48 hours.

Really, CVS? Haven’t you been the subject of federal lawsuits about these practices? As an attorney and privacy advocate, I am shocked and stunned at how difficult CVS has made it to opt out of their collecting, storing and sharing of my wife’s cell phone number. It’s deceptive at best. And I am savvy when it comes to this stuff.

PrivacyMate has gone to great lengths, and will continue to go to great lengths, to help create an opt-in versus opt-out society. Until then, we will be left wondering if the robo-call and telemarketing epidemic is related to the not-so-friendly privacy practices of behemoths like CVS. #SMH.

A Treasure Trove of Personal and Sensitive Data


Google’s new advertising scheme, Store Sales Measurement, allows the company to track and record our credit and debit card transactions, both online and within brick-and-mortar retailers. The idea is that Google can use this information to better determine how many sales have been generated by digital ad campaigns. It can then, of course, package the data and sell it off to data brokers who, in turn, sell it to marketers.

EPIC is not happy about this and wants the United States Federal Trade Commission to review Google’s algorithms themselves.  According to the Washington Post, EPIC says the database technology Google’s scheme is based on – CryptDB – also has known security flaws.  In 2015, Microsoft researchers successfully hacked health records stored using CryptDB.

“Google has collected billions of credit card transactions, containing personal customer information, from credit card companies, data brokers, and others and has linked those records with the activities of Internet users, including product searches and location searches,” EPIC writes in documents filed with the FTC. “This data reveals sensitive information about consumer purchases, health, and private lives.”

Google responded by saying that users can opt out of ad tracking by unchecking “Web and App Activity” within their Activity Controls settings. The problem, however, is that Google still stores click data even when “Web and App Activity” is unchecked. EPIC also points to the fact that Google won’t disclose which companies are providing it with shopping transaction records, which only adds to the general opaqueness of a process that could see customers’ medical conditions or religious beliefs leveraged without their explicit knowledge.

EPIC alleges in its complaint that Google’s opt-out process is “burdensome, opaque, and misleading.” PrivacyMate agrees. We believe the standard for such tracking should always be “opt-in,” not “opt-out.”

EPIC is calling on the FTC to require Google to offer a “clean and simple” opt-out tool. The advocacy group also wants Google to reveal the identities of the data brokers or other third parties that provide information about people’s purchases, and to disclose the details of the algorithm that powers the program.

There’s a Spy Cleaning Your Home


Roomba is a line of robot vacuums made by iRobot. The company is making headlines this week because it has become evident that its CEO wishes to sell the personal data these robots collect in our homes to the highest bidder, later this year or next. While the CEO has every right to profit in the manner he so chooses, the way he is going about it is a bit opaque. He was quoted this week saying: “We will always ask your permission to even store map data.”

We all know that nobody reads privacy policies anyway.  How many Roomba users actually knew the robot was collecting data and mapping their home all this time?  Did anyone?

In fact, two Roomba models — the 960 and 980 — map the interiors of our homes to more efficiently clean up dust and dirt. Those intimate maps, the company hopes, could soon be sold as personalized, data-rich products to giant tech companies, seizing a bigger role in the burgeoning market for “smart” devices in the Web-connected household.

While companies like Amazon, Google and Apple would likely pay big money for that kind of data (Amazon just recently made select Roomba devices compatible with its Alexa digital assistant), the idea of a smart vacuum recording floor plan data for sale does raise some privacy concerns.

One of the biggest concerns with digitally connected homes is the risk of having our personal information hacked or otherwise compromised. Even if iRobot can promise secure handling of its customers’ home layouts, the same can’t be guaranteed for the other companies using the same data for their own devices. And when Roomba shares our information with other third parties, how do we know those parties are taking appropriate steps to keep it safe?

Furthermore, as we asked above, how many Roomba users are already aware that their little robot vacuum has been collecting their data all of this time? How many people have actually read the privacy policy of the parent company, iRobot, found on its website here?

This story is yet another example of the ways in which technology coupled with corporate need is getting in the way of our fundamental right to privacy.