Consumer, Personal Data, Privacy, Tech

Facebook let Netflix and Spotify access private messages from users, startling documents reveal

Leaked documents show tech giant continued sharing user details despite public statements saying it had stopped years earlier

For years, Facebook gave some of the world’s largest technology companies more intrusive access to users’ personal data than it has disclosed, effectively exempting those business partners from its usual privacy rules, according to internal records and interviews.

The special arrangements are detailed in hundreds of pages of Facebook documents obtained by The New York Times. The records, generated in 2017 by the company’s internal system for tracking partnerships, provide the most complete picture yet of the social network’s data-sharing practices. They also underscore how personal data has become the most prized commodity of the digital age, traded on a vast scale by some of the most powerful companies in Silicon Valley and beyond.

The exchange was intended to benefit everyone. Pushing for explosive growth, Facebook got more users, lifting its advertising revenue. Partner companies acquired features to make their products more attractive. Facebook users connected with friends across different devices and websites. But Facebook also assumed extraordinary power over the personal information of its 2.2 billion users — control it has wielded with little transparency or outside oversight.

The social network allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, the records show, and gave Netflix and Spotify the ability to read Facebook users’ private messages.

The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer, despite public statements that it had stopped that type of sharing years earlier.

Facebook has been reeling from a series of privacy scandals, set off by revelations in March that a political consulting firm, Cambridge Analytica, improperly used Facebook data to build tools that aided President Donald Trump’s 2016 campaign. Acknowledging that it had breached users’ trust, Facebook insisted it had instituted stricter privacy protections long ago. Mark Zuckerberg, the chief executive, assured lawmakers in April that people “have complete control” over everything they share on Facebook.

But the documents, as well as interviews with about 50 former employees of Facebook and its corporate partners, reveal Facebook allowed certain companies access to data despite those protections. They also raise questions about whether Facebook ran afoul of a 2011 consent agreement with the Federal Trade Commission that barred the social network from sharing user data without explicit permission.

In all, the deals described in the documents benefited more than 150 companies — most of them tech businesses, including online retailers and entertainment sites, but also car-makers and media organisations. Their applications sought the data of hundreds of millions of people a month, the records show. The deals, the oldest of which date to 2010, were all active in 2017. Some were still in effect this year.

In an interview, Steve Satterfield, Facebook’s director of privacy and public policy, said none of the partnerships violated users’ privacy or the FTC agreement. Contracts required the companies to abide by Facebook policies, he added.

Still, Facebook executives have acknowledged missteps during the past year. “We know we’ve got work to do to regain people’s trust,” Mr Satterfield said. “Protecting people’s information requires stronger teams, better technology and clearer policies, and that’s where we’ve been focused for most of 2018.” He said the partnerships were “one area of focus” and Facebook was in the process of winding many of them down.

Facebook has found no evidence of abuse by its partners, a spokesperson said. Some of the largest partners, including Amazon, Microsoft and Yahoo, said they had used the data appropriately, but declined to discuss the sharing deals in detail. Facebook did say it had mismanaged some of its partnerships, allowing certain companies’ access to continue long after they had shut down the features that required the data.

With most of the partnerships, Mr Satterfield said, the FTC agreement did not require the social network to secure users’ consent before sharing data because Facebook considered the partners extensions of itself — service providers that allowed users to interact with their Facebook friends. The partners were prohibited from using the personal information for other purposes, he said. “Facebook’s partners don’t get to ignore people’s privacy settings.”

Data privacy experts disputed Facebook’s assertion that most partnerships were exempted from the regulatory requirements, expressing scepticism that businesses as varied as device makers, retailers and search companies would be viewed alike by the agency. “The only common theme is that they are partnerships that would benefit the company in terms of development or growth into an area that they otherwise could not get access to,” said Ashkan Soltani, former chief technologist at the FTC.

Mr Soltani and three former employees of the FTC’s consumer protection division, which brought the case that led to the consent decree, said in interviews that its data-sharing deals had probably violated the agreement.

“This is just giving third parties permission to harvest data without you being informed of it or giving consent to it,” said David Vladeck, who formerly ran the FTC’s consumer protection bureau. “I don’t understand how this unconsented-to data harvesting can at all be justified under the consent decree.”

Details of the agreements are emerging at a pivotal moment for the world’s largest social network. Facebook has been hammered with questions about its data sharing from lawmakers and regulators in the United States and Europe. The FTC this spring opened a new inquiry into Facebook’s compliance with the consent order, while the Justice Department and Securities and Exchange Commission are also investigating the company.

Facebook’s stock price has fallen, and a group of shareholders has called for Mr Zuckerberg to step aside as chairman. Shareholders also have filed a lawsuit alleging that executives failed to impose effective privacy safeguards. Angry users started a #DeleteFacebook movement.

This month, a British parliamentary committee investigating internet disinformation released internal Facebook emails, seized from the plaintiff in another lawsuit against Facebook. The messages disclosed some partnerships and depicted a company preoccupied with growth, whose leaders sought to undermine competitors and briefly considered selling access to user data.

As Facebook has battled one crisis after another, the company’s critics, including some former advisers and employees, have singled out the data-sharing as cause for concern.

“I don’t believe it is legitimate to enter into data-sharing partnerships where there is not prior informed consent from the user,” said Roger McNamee, an early investor in Facebook. “No one should trust Facebook until they change their business model.”

Unlike Europe, where social media companies have had to adapt to stricter regulation, the United States has no general consumer privacy law, leaving tech companies free to monetise most kinds of personal information as long as they do not mislead their users. The FTC, which regulates trade, can bring enforcement actions against companies that deceive their customers.

Besides Facebook, the FTC has consent agreements with Google and Twitter stemming from alleged privacy violations.

For some advocates, the torrent of user data flowing out of Facebook has called into question not only Facebook’s compliance with the FTC agreement, but also the agency’s approach to privacy regulation.

“There has been an endless barrage of how Facebook has ignored users’ privacy settings, and we truly believed that in 2011 we had solved this problem,” said Marc Rotenberg, head of the Electronic Privacy Information Centre, an online privacy group that filed one of the first complaints about Facebook with federal regulators. “We brought Facebook under the regulatory authority of the FTC after a tremendous amount of work. The FTC has failed to act.”

According to Facebook, most of its data partnerships fall under an exemption to the FTC agreement. The company argues the partner companies are service providers — companies that use the data only “for and at the direction of” Facebook and function as an extension of the social network.

But Mr Vladeck and other former FTC officials said Facebook was interpreting the exemption too broadly.

When The Times reported last summer on the partnerships with device-makers, Facebook used the term “integration partners” to describe BlackBerry, Huawei and other manufacturers that pulled Facebook data to provide social-media-style features on smartphones. All such integration partners, Facebook asserted, were covered by the service provider exemption.

Since then, as the social network has disclosed its data-sharing deals with other kinds of businesses — including internet companies such as Yahoo — Facebook has labelled them integration partners, too.

Facebook even re-categorised one company, the Russian search giant Yandex, as an integration partner.

Facebook records show Yandex had access in 2017 to Facebook’s unique user IDs even after the social network stopped sharing them with other applications, citing privacy risks. A spokesperson for Yandex, which was accused last year by Ukraine’s security service of funnelling its user data to the Kremlin, said the company was unaware of the access and did not know why Facebook had allowed it to continue. They added the Ukrainian allegations “have no merit.”

In October, Facebook said Yandex was not an integration partner. But in early December, as The Times was preparing to publish this article, Facebook told congressional lawmakers that it was.

How closely Facebook monitored its data partners is uncertain. Most of Facebook’s partners declined to discuss what kind of reviews or audits Facebook subjected them to. Two former Facebook partners, whose deals with the social network dated to 2010, said they could find no evidence that Facebook had ever audited them. One was BlackBerry. The other was Yandex.



Consumer, Legislation, Personal Data, Privacy, Trending

General Medical Council and doctors’ unions hit out at government’s ‘cavalier approach’ to patient data.

Police forces will be able to “strong-arm” NHS bodies into handing over confidential patient data under planned laws that have sparked fury from doctors’ groups and the UK’s medical watchdog.

Ministers are planning new powers for police forces that would “set aside” the existing duty of confidentiality that applies to patient data held by the NHS and will instead require NHS organisations to hand over data police say they need to prevent serious violence.

Last week, England’s national data guardian, Dr Nicola Byrne, told The Independent she had serious concerns about the impact of the legislation going through parliament, and warned that the case for introducing the sweeping powers had not been made.

Now the UK’s medical watchdog, the General Medical Council (GMC), has also criticised the new law, proposals for which are contained in the Police, Crime and Sentencing Bill, warning it fails to protect patients’ sensitive information and could disproportionately hit some groups and worsen inequalities.

Human rights group Liberty said the proposed law is so broad that police forces will be able to “strong-arm information” from the NHS and other bodies, and that this could include information about people’s health, religious beliefs and political links. It added: “Altogether, these provisions are likely to give rise to significant and severe breaches of individuals’ data rights.”

Under the proposed legislation, which will be debated in the House of Lords in the coming weeks, local health boards will be legally required to provide confidential patient information when it is requested by police. The bill explicitly sets aside the existing common-law duty of confidentiality.

The purpose is to prevent serious violence, but there is already provision to allow patient information to be shared with police where there is a public interest need, such as the threat of violence or preventing a crime. The bill does not set out in detail what kinds of data could be handed over.

Under the proposed new law, police would have the power to demand information regardless of whether the NHS considered it to be in the public interest or not.

Professor Colin Melville, medical director at the GMC, told The Independent: “We are concerned the bill doesn’t protect patients’ sensitive health information and risks undermining the trust at the heart of doctor-patient relationships. We also share concerns held by others that an erosion of this trust may disproportionately affect certain communities and deepen societal inequalities.”

The GMC has raised its objections with the Home Office, which has said that the law will still require organisations to meet data protection rules before sharing any information.

But the doctors’ union, the British Medical Association (BMA), said in a briefing for members of the House of Lords that this would not provide adequate protection.

It said that health information had “long been afforded special legal status, over and beyond the Data Protection Act, in the form of the common-law duty of confidentiality”, which had been upheld by several court cases.

It added that the bill would “override the duty of medical confidentiality, including by legally requiring identifiable health information about individuals to be shared with the police”, saying: “We believe that setting aside of the duty of confidentiality, to require confidential information to be routinely given to the police when requested, will have a highly damaging impact on the relationship of trust between doctors and their patients. A removal of a long-established protection for confidential health information, alongside a broad interpretation of ‘serious crime’, may mean many patients are reluctant or fearful to consult or to share information with doctors.”

Dr Claudia Paoloni, president of the Hospital Consultants and Specialists Association, said the new law would “seriously undermine” the existing trust-based relationship with patients, and would create barriers to them seeking care: “We have significant concerns about what appears to be a cavalier approach to long-held principles, without clear objectives, and which is likely to have unintended consequences. Unless these concerns around individual patient confidentiality can be satisfactorily answered, we believe such powers should be removed from the bill.

“Other than the most serious crimes, which are already covered by precedent on disclosure on public interest grounds, it remains unclear precisely in what way laws to force the release of individuals’ medical records would be used.”

The latest data controversy comes after the NHS was forced to pause plans to share GP patient data with third parties to aid research, after an outcry from some doctors and patients over how the information would be used.

A Home Office spokesperson said: “Appropriate safeguards are in place, and any information shared must be proportionate. The bill makes clear that information can only be shared in accordance with data protection laws.”



Legislation, Personal Data, Privacy, Tech, Trending

The cyberspace pioneer is skeptical about a blockchain-based internet

Web inventor Tim Berners-Lee wants to rescue his creation from centralization. But does he align himself with Web3’s promise of salvation?

At TNW Conference, the computer scientist gave a one-word answer: “Nope.”

That snub may seem to clash with Berners-Lee’s recent actions. The 67-year-old now campaigns to save his “dysfunctional” brainchild from the clutches of Big Tech.

He’s also made a cool $5.4 million by selling an NFT — one of Web3’s supposed pillars. But the Brit has his own vision for the web’s successor: a decentralized architecture that gives users control of their data.

Berners-Lee wants to build it on a platform he calls Solid — but you can call it Web 3.0.

“We did talk about it as Web 3.0 at one point, because Web 2.0 was a term used for the dysfunction of what happens with user-generated content on the large platforms,” he said.

“People have called that Web 2.0, so if you want to call this Web 3.0, then okay.”

On the blockchain, it just doesn’t work

Berners-Lee shares Web3’s purported mission of transferring data from Big Tech to the people. But he’s taking a different route to the target. While Web3 is based on blockchain, Solid is built with standard web tools and open specifications. Private information is stored in decentralized data stores called “pods,” which can be hosted wherever the user wants. Users can then choose which apps can access their data. This approach aims to provide interoperability, speed, scalability, and privacy.

“When you try to build that stuff on the blockchain, it just doesn’t work,” said Berners-Lee.
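The pod idea described above can be sketched in a few lines. This is an illustrative toy model only, assuming nothing about the actual Solid protocol or any Solid library: the class and method names (Pod, grant, revoke, read) are invented here purely to show the access-control idea of user-hosted data with revocable, per-app permissions.

```python
# Toy model of the "pod" concept: a user's data lives in a store they
# control, and an app can read it only while the user's grant is in place.
# All names here are invented for illustration; this is NOT the Solid API.

class Pod:
    """A user-controlled data store with per-app access grants."""

    def __init__(self, owner: str):
        self.owner = owner
        self._data: dict[str, str] = {}
        self._granted: set[str] = set()

    def write(self, key: str, value: str) -> None:
        # Only the owner writes in this simplified model.
        self._data[key] = value

    def grant(self, app: str) -> None:
        self._granted.add(app)

    def revoke(self, app: str) -> None:
        self._granted.discard(app)

    def read(self, app: str, key: str) -> str:
        # Apps see data only while the user's grant is active.
        if app not in self._granted:
            raise PermissionError(f"{app} has no access to {self.owner}'s pod")
        return self._data[key]


pod = Pod(owner="alice")
pod.write("contacts", "bob, carol")
pod.grant("calendar-app")
print(pod.read("calendar-app", "contacts"))  # prints "bob, carol"
pod.revoke("calendar-app")  # the user withdraws access at any time
```

The design choice the sketch highlights is the inversion of today's model: the store belongs to the user, and the app, not the platform, is the party that can lose access.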



Personal Data, Privacy, Tech

There’s a New Yorker cartoon from the early ’90s that you may have seen. A dog, perched at a desktop computer keyboard, explains to his fellow canine: “On the Internet, nobody knows you’re a dog.” Back then, the internet was synonymous with anonymity, and privacy wasn’t an issue because the tools to track every click and pixel of a user’s activity didn’t exist yet. If that cartoon were revised today, it might read, “On the internet, everybody knows what brand of kibble you eat and every time you sniff another dog’s butt.”

Idealistic origins

At its inception, the internet’s promise of the free flow of information to and from every corner of the world was exhilarating. Today, companies know more about us than perhaps our closest family members and friends do, and the lack of privacy is suffocating. We don’t control our own data online: Big Tech companies do, and our government refuses to do anything about it. Increasingly, many technologists are advocating for “data dignity” or the idea that we deserve more control over the use of our personal data.

It is still possible to forge a better path for our digital future. We formed EthicalTech to make technology a force for good and accelerate the development of a better internet for everyone. EthicalTech is a multidisciplinary coalition that builds standards for an ethical internet with privacy at its core.

Paradise lost

The journey to rediscovering the internet as it was meant to be and reasserting our data dignity starts with understanding how we got here. Despite taxpayer dollars funding key internet components and breakthroughs through public institutions, the government chose to leave the web essentially unregulated. The idealistic promise envisioned by early adopters was steamrolled by businesses doing what businesses do best: generating returns for shareholders and putting our privacy second.

The laissez-faire approach soon gave rise to the most valuable digital currency of all: the cookie. First-party cookies—premised on the idea that if you visit a website, you’re in a first-party, or direct, relationship with that site’s owner—allowed the site owner to cater to your needs. Third-party cookies, or those created by a business different from the website you’re visiting, extended personalization and utility across the internet but also opened the door to tracking us everywhere we go online. It’s a phenomenon we made worse when, bamboozled by the lure of social media, we turned over our data for free products and services, not realizing that we—or at least our data—had become the product.
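The first-party/third-party distinction above comes down to a domain comparison. The sketch below is a simplification for illustration: real browsers compare registrable domains (eTLD+1) using the Public Suffix List, whereas this toy version just compares the last two DNS labels of each hostname, and the function names and example domains are invented.

```python
# Toy classification of a cookie as first- or third-party, based on whether
# the cookie's domain matches the site being visited. Real browsers use the
# Public Suffix List; this simplification compares the last two DNS labels.

def registrable_domain(host: str) -> str:
    # Simplification: "news.example.com" -> "example.com"
    return ".".join(host.lower().split(".")[-2:])

def cookie_party(page_host: str, cookie_domain: str) -> str:
    same_site = registrable_domain(page_host) == registrable_domain(cookie_domain)
    return "first-party" if same_site else "third-party"

# On a visit to news.example.com, the site's own cookie is first-party,
# while an ad network's cookie is third-party; the same third-party domain
# can then follow the user across every site that embeds that network.
print(cookie_party("news.example.com", "example.com"))     # prints "first-party"
print(cookie_party("news.example.com", "ads.tracker.net")) # prints "third-party"
```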

As it turns out, the same market dynamics and network effects that lure a critical mass of users to a platform like Facebook or Google also result in few big winners. The internet economy consolidated around these behemoths, who used cookies to seize control over the data economy and maintained market power at the expense of your privacy. Now, when the regulators come knocking, Big Tech turns regulation on its head. For example, Google’s announcement that it would deprecate third-party cookies wasn’t, despite its claims, about respecting privacy. Instead, it was a bait-and-switch for Google to simply prioritize its own first-party cookies over competitors’ first-party data. Such oligopolistic data dominance is no better for privacy than a competitive market. True privacy and data dignity is when the people control their own data.

Demanding data dignity

Can Big Tech change its ways? All along, companies insisted they had policies to respect privacy and empower people to make their own choices. Facebook, at its launch, even sought to distinguish itself from MySpace by emphasizing that it would put privacy first. It didn’t. After all, no Facebook policies were breached when, in 2016, it facilitated Cambridge Analytica’s harvesting of private data from some 87 million Facebook users. And Facebook was not alone; an endless number of companies skim our data without our knowledge or consent.

On the regulatory side, government has been slow and clumsy. Efforts like the European Union’s General Data Protection Regulation, or GDPR, and California’s Privacy Rights Act, or CPRA, have attempted to define new rules of the road. The problem is that tech firms move faster than governments and often operate in spaces where they have a significant information advantage over regulators. As a result, even as awareness of data dignity builds, Big Tech remains free to track our movements, gather information about us, and sell those insights as they wish.

Our collective awareness is increasing; now we need to turn that awareness into collective action. EthicalTech is bringing together experts and leaders from the private, public, and non-profit sectors to find common sense solutions for enacting data dignity. Together, we can create the infrastructure for an ethical internet where privacy standards are built in. It is up to us to recapture the initial promise of the internet and allow future generations to trust they can interact, co-create, and exist in a digital world with their privacy intact.



Consumer, Personal Data, Privacy, Tech

Big Tech’s hunger for data is increasingly hitting walls. European privacy watchdogs have already reined in Google, and other tech companies, too, are realising they must pay more attention to their users’ privacy. This has a major impact on what information is available to marketers. So don’t wait to start looking for alternatives.

Google Analytics is no longer an option for analysing website visitors. In mid-January, the Austrian privacy watchdog DSB was the first to conclude that Google Analytics violates the GDPR (AVG) and may no longer be used by organisations that process the data of EU citizens. The Norwegian privacy regulator Datatilsynet followed in late January, and in early February the French data protection authority CNIL likewise concluded that Google Analytics may no longer be used in that country. Following on from that, in mid-April CNIL ordered three websites in France to stop using Google Analytics. The potential fine is substantial: up to €20 million or 4 per cent of annual turnover.

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP) is still investigating and has not yet taken an official decision, but is already warning Dutch organisations to stop using Google Analytics. Partly in response to these developments, Google has announced the end of Universal Analytics (Google Analytics 3) for 1 July 2023. Google Analytics 4 is already available, but works completely differently from GA3; organisations effectively have to rebuild their entire analytics approach from scratch. Carrying out that transition in little more than a year is a serious challenge, while GA4 still fails to address all the privacy issues properly.

Reining in Big Tech

But it is not just Google Analytics that collects too much information about internet users. Other tech giants, such as Amazon, Meta (Facebook) and Apple, also store vast amounts of user data. That, too, is likely to end soon, especially now that the EU member states and the European Parliament reached an agreement in late March on the Digital Markets Act (DMA). This law is intended to curb the power of Big Tech, increase competition and better protect the privacy of EU citizens.

All these developments mean that marketers and users will have to look for alternatives that take privacy more seriously. Although the privacy of internet users in the European Union is increasingly well protected by the GDPR, the Big Tech companies (Google, Apple, Facebook and Amazon, or GAFA for short) regularly make the news over privacy scandals. Facebook, for instance, wrestled for a long time with the notorious Cambridge Analytica affair, and Mark Zuckerberg had to answer to the US Congress over how Facebook handles users’ privacy.

Consumers genuinely perceive Big Tech’s privacy violations as negative. European research shows that 41% of Europeans do not want to share any data about themselves with private companies. According to Pew Research, almost 70% of Americans believe their personal data is being misused. Consumer data is one of the most valuable commodities in the world, but the power of these tech giants has grown too large, and that power has already led to a series of antitrust cases.

Google FLoC and Topics?

Fortunately, Big Tech itself has now also concluded that continuing to ignore users’ privacy demands is a threat to the success of its business. If tech companies want to use people’s data, they will have to do so in a way consumers accept. That is why they are getting more serious about safeguarding privacy on users’ devices and in their applications.

Google has already indicated that third-party cookies will be blocked by the Chrome browser from 2023. Coming from the biggest player in the online advertising market, that is remarkable to say the least. It looks like a victory for privacy, but it does not mean the end of user-data collection. After initially betting on the so-called FLoC technology (Federated Learning of Cohorts), Google announced in January of this year that it would not proceed with this cookie alternative after all. Instead, “Topics” will be Google’s new way of grouping users into categories with similar interests, so that companies can serve them contextual advertising. With Topics, Chrome collects information about users’ interests as they visit websites. This data is then kept for three weeks. Google has currently limited the number of topics to 300, but plans to expand this later.

The fact that this makes advertising contextual is a step in the right direction, but data is still being passed on from the browser for advertising purposes. Topics, too, is designed to collect data about Chrome users’ interests and share it with third parties, which then serve ads based on it. Other browsers, such as Safari, Firefox or Brave, do not do this.
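The Topics mechanism described above can be simulated in miniature: the browser maps sites the user visits onto a fixed taxonomy of interest topics, remembers them only for a limited window (the three weeks cited above), and exposes just those coarse topics. To be clear, everything in this sketch is an assumption made for illustration: the tiny taxonomy, the site-to-topic mapping and all class and method names are invented, and the real mechanism is exposed in Chrome through a JavaScript API, not through anything resembling this Python model.

```python
# Illustrative simulation of Topics-style cohorting: visited sites map onto a
# bounded taxonomy, and only topics observed within the retention window are
# shared. The taxonomy and all names here are invented for illustration.

from datetime import datetime, timedelta

# Stand-in for Google's taxonomy of roughly 300 topics.
TAXONOMY = {
    "cycling.example": "Cycling",
    "chef.example": "Cooking",
    "kicks.example": "Football",
}
RETENTION = timedelta(weeks=3)  # the three-week window cited in the article

class TopicsBrowser:
    def __init__(self):
        self._observed: dict[str, datetime] = {}  # topic -> last observed

    def visit(self, site: str, when: datetime) -> None:
        topic = TAXONOMY.get(site)
        if topic:
            self._observed[topic] = when

    def browsing_topics(self, now: datetime) -> list[str]:
        # Only coarse topics seen within the retention window are exposed;
        # the visited sites themselves are never shared.
        return sorted(t for t, seen in self._observed.items()
                      if now - seen <= RETENTION)

b = TopicsBrowser()
b.visit("cycling.example", datetime(2022, 5, 10))
b.visit("chef.example", datetime(2022, 5, 20))
print(b.browsing_topics(datetime(2022, 5, 25)))  # -> ['Cooking', 'Cycling']
print(b.browsing_topics(datetime(2022, 6, 5)))   # -> ['Cooking']; the cycling visit has aged out
```

The point the simulation makes is the one the article makes: the output is coarser than a third-party cookie trail, but interest data still leaves the browser.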

Give consumers control

The best way to deal with Big Tech’s insatiable hunger for data is to look for alternatives. The key concept here is data minimisation: collecting and using only the visitor information needed to carry out a specific action or provide a service. We will have to accept consumers’ need for privacy and hand control back to them. From a technical standpoint, they must be able to withdraw or block their consent for the use of their data if they associate the brand with practices they do not appreciate. Companies should collect and use data only for a specific and legitimate purpose. These are key principles not only of the GDPR but of most privacy laws elsewhere in the world.
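The two principles named above, purpose limitation and revocable consent, translate directly into code. The sketch below is a minimal illustration under invented names (the purpose registry, ConsentLedger and collect are all hypothetical, not any specific GDPR tooling): each purpose declares up front which fields it needs, collection is refused without consent, and everything outside the declared purpose is dropped.

```python
# Minimal sketch of data minimisation and revocable consent. All names are
# invented for illustration; real compliance involves far more than this.

# Each declared purpose lists the only fields it is allowed to collect.
PURPOSES = {
    "order-shipping": {"name", "address"},  # needed to deliver a parcel
    "newsletter": {"email"},                # needed to send the letter
}

class ConsentLedger:
    def __init__(self):
        self._consents: dict[tuple[str, str], bool] = {}

    def give(self, user: str, purpose: str) -> None:
        self._consents[(user, purpose)] = True

    def withdraw(self, user: str, purpose: str) -> None:
        # Consent can be revoked at any time, as the text above requires.
        self._consents[(user, purpose)] = False

    def allows(self, user: str, purpose: str) -> bool:
        return self._consents.get((user, purpose), False)

def collect(ledger: ConsentLedger, user: str, purpose: str, data: dict) -> dict:
    if not ledger.allows(user, purpose):
        raise PermissionError(f"no consent from {user} for {purpose}")
    # Data minimisation: drop every field the declared purpose does not need.
    return {k: v for k, v in data.items() if k in PURPOSES[purpose]}

ledger = ConsentLedger()
ledger.give("alice", "newsletter")
print(collect(ledger, "alice", "newsletter",
              {"email": "a@example.org", "address": "Main St 1"}))
# -> {'email': 'a@example.org'}; the address was never needed, so never kept
ledger.withdraw("alice", "newsletter")  # any later collect() now fails
```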

It is becoming ever clearer that nothing more may be done with users’ data without their explicit consent. For marketing departments, this means the number of data points available for their campaigns will shrink by half or perhaps even more. Companies therefore need to consider very carefully which visitor data is essential and how best to collect it in the right way.

As the various privacy authorities in Europe now advise, it is wise to look for an alternative to Google Analytics that does comply with the privacy rules. Fortunately, there are already excellent European alternatives, such as Plausible, Simple Analytics, Splitbee or Piwik PRO. Don’t wait for an actual ban on Google Analytics; start right away, so that a strong, trust-based relationship with customers can be built.



Legislation, Personal Data, Privacy, Tech

The European Parliament has backed new rules for data sharing, in a move the EU hopes will help harness the power of data and artificial intelligence (AI) to boost innovation across the continent.

The new legislation aims to increase the availability of data for businesses and individuals while setting a raft of new standards for data sharing across the bloc. It will apply to manufacturers, companies and users.

It was first proposed in late 2020 to offer “an alternative European model” to the data handling practices of major US tech platforms. The idea is to create common European data spaces in a variety of fields including health, energy, agriculture, mobility and finance.

The legislation, known as the Data Governance Act (DGA), was approved by EU lawmakers on Wednesday with 501 votes to 12, and 40 abstentions.

It now needs to be formally adopted by the Council before entering into force.

“Our goal with the DGA is to set the foundation for a data economy in which people and businesses can trust. Data sharing can only flourish if trust and fairness are guaranteed, stimulating new business models and social innovation,” said Angelika Niebler, the German MEP who steered the legislation through parliament.

“Experience has shown that trust – be it trust in privacy or in the confidentiality of valuable business data – is a paramount issue. The Parliament insisted on a clear scope, making sure that the credo of trust is inscribed in the future of Europe’s data economy”.

Data transfer disputes

The legislation is part of a broader EU strategy to break Big Tech’s hold over the data sphere.

The EU is also working on a Data Act that specifically looks at who can create value from data and aims to “place safeguards against unlawful data transfer,” which could affect US or other foreign tech companies.

The European Commission published its draft of that act in late February, with commission Vice President Margrethe Vestager saying at the time: “We want to give consumers and companies even more control over what can be done with their data, clarifying who can access data and on what terms.”

Data disputes between the EU and Big Tech have been growing in recent years.

Meta recently warned it could pull Facebook and Instagram out of Europe if it’s unable to transfer, store and process Europeans’ data on US-based servers.

Harnessing the power of AI

According to estimates by the European Commission, the amount of data generated by public bodies, businesses and citizens will increase fivefold between 2018 and 2025.

“We are at the beginning of the age of AI and Europe will require more and more data. This legislation should make it easy and safe to tap into the rich data silos spread all over the EU,” Niebler said.

“The data revolution will not wait for Europe. We need to act now if European digital companies want to have a place among the world’s top digital innovators”.

MEPs said they had negotiated with EU ministers to ensure there were no loopholes that would allow companies outside of the EU to abuse the scheme.

They are pushing to make the most of data for “objectives of general interest” such as scientific research, health, climate change and mobility.



Consumer, Corporate, Legislation, Personal Data, Privacy, Trending

In September 2021, the Kingdom of Saudi Arabia issued its Personal Data Protection Law (PDPL) to regulate the processing of personal data. The PDPL is the first comprehensive, sector-agnostic data privacy legislation in Saudi Arabia, and organizations will face significant changes to their operations to ensure compliance.

The PDPL comes into effect just 180 days after its publication in the Official Gazette, meaning the law will be effective March 23, 2022, subject to the passage of the implementing regulations. For the first two years, it will be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA), after which a transition to the National Data Management Office will be considered.

Like other new data protection laws and updates within the broader Middle East and North Africa region, some elements of the PDPL are similar to those of other international data protection regulations. The law also includes numerous unique requirements, such as data transfer and localization rules, that businesses will need to pay careful attention to. Fulfilling these requirements may be operationally burdensome, and early planning will be critical to avoid significant setbacks.
The PDPL also has extraterritorial effect, so organizations based outside Saudi Arabia will still be subject to the law and its requirements if they process the personal data of Saudi residents.

What does the law introduce?
The PDPL introduces a number of requirements that could significantly impact how companies in the Kingdom operate. The most notable include:

Registration requirements. 
Data controllers, the organizations that determine the means and purposes of processing personal data, must register via an electronic portal and pay an annual registration fee.

Records of processing. 
Data controllers must create and maintain a record of how they process personal data, and it must be registered with the SDAIA. Any foreign company operating in the Kingdom and processing personal data of Saudi residents must appoint a local representative. More guidance regarding when this requirement will become effective is forthcoming from the SDAIA. Organizations will also be expected to appoint data officers to manage compliance with the law.

Data subject rights.
Individuals are now provided with new rights to their data, namely that they have the right to information about how their data is processed, the ability to access copies of their data and request corrections, and the right to have their data destroyed. Individuals will also have the right to lodge complaints with the regulatory authority.

Data transfers.
Data transfers outside the Kingdom are only permitted in limited circumstances. Even where a transfer meets one of the permitted exceptions, the data controller must obtain approval from the appropriate government authority, among other conditions.

Consent.
The principal legal basis for processing under the law is consent; personal data may be processed without consent only in certain circumstances. Individuals will also have the right to withdraw their consent to the processing of their personal data. Importantly, data controllers must obtain individuals' prior consent to send direct marketing and must provide an opt-out mechanism.

Impact assessments.
Data controllers must assess projects, products and services to identify data protection risks posed to individuals.

Privacy notice.
Data controllers must implement a privacy notice specifying how data will be processed prior to collecting personal data from individuals.

Breach notification.
Data controllers will be expected to report data breaches to the regulatory authority as soon as they become aware of an incident.

Sensitive data.
Information such as genetic, health, credit and financial data will fall within the scope of the law. This data is also likely to be subject to additional regulation.

So how do we prepare?
Like most compliance efforts, early preparation is essential, especially to achieve compliance with some of the more onerous requirements detailed in the PDPL. As a priority, organizations should follow this six-point plan:

Step 1: Understand the data. 
Organizations must understand what data they hold, how it is used and who it is shared with. This can be accomplished by creating a record of processing activities to trace data through the information lifecycle. This document can be used as a single source of truth and to inform other compliance activities.

Step 2: Establish governance. 
Identifying local representatives where appropriate and appointing data officers will be an essential step. These individuals should be integrated into existing data protection or security networks of governance to enable the successful communication and escalation of risks.

Step 3: Create policies and procedures.
Policies and processes must be updated to reflect the new data protection responsibilities, including procedural guidance for responding to data subject rights requests and issuing data breach notifications. Policy refreshes must also address the assessment of data protection and security standards in place among third parties.

Step 4: Implement and test breach plans.
Organizations need a robust data breach plan that articulates each step involved in responding to a breach, the individuals and teams involved, and the timelines to complete each step. Testing your plan will help to ensure your teams are cohesive and ready should an actual incident occur.

Step 5: Identify international data transfers.
Using the records of processing activities (ROPAs) as a starting point, organizations should seek to understand what data is transferred internationally and where it is transferred to. This includes understanding how limitations in the law may affect these transfers and beginning to adopt strategies for compliance.

Step 6: Provide training and change management.
Training is an effective tool to develop a sustainable culture of compliance. To complement training activities, organizations should consider identifying change management strategies to help ensure that the compliance activities are embedded successfully.
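A record of processing activities from Step 1 can start life as a simple structured inventory that the later steps query. The sketch below is a minimal illustration in Python; the field names and sample entries are hypothetical, not a format prescribed by the PDPL or the regulator.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessingActivity:
    """One entry in a record of processing activities (ROPA) -- illustrative fields only."""
    purpose: str                                    # why the data is processed
    data_categories: list                           # e.g. ["name", "email"]
    legal_basis: str                                # e.g. "consent" or "contract"
    recipients: list = field(default_factory=list)  # who the data is shared with
    transfers_outside_ksa: bool = False             # flags the activity for transfer review

# Hypothetical inventory used as a single source of truth
ropa = [
    ProcessingActivity(
        purpose="order fulfilment",
        data_categories=["name", "address", "phone"],
        legal_basis="contract",
        recipients=["courier"],
    ),
    ProcessingActivity(
        purpose="marketing emails",
        data_categories=["email"],
        legal_basis="consent",
        transfers_outside_ksa=True,
    ),
]

# Step 5: surface the activities that involve international transfers
needs_transfer_review = [a.purpose for a in ropa if a.transfers_outside_ksa]
print(needs_transfer_review)  # ['marketing emails']
```

Keeping the inventory machine-readable means the Step 5 transfer review can be derived from the same source of truth rather than maintained separately.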



Consumer, Legislation, Personal Data, Privacy, Trending

Many businesses collect data for a multitude of purposes. Here’s how to know what they’re doing with your personal data and whether it is secure.

As technologies that capture and analyze data proliferate, so, too, do businesses’ abilities to contextualize data and draw new insights from it. Artificial intelligence is a critical tool for data capture, analysis, and collection of information that many businesses are using for a range of purposes, including better understanding day-to-day operations, making more informed business decisions and learning about their customers.

Customer data is a focus area all its own. From consumer behavior to predictive analytics, companies regularly capture, store, and analyze large amounts of quantitative and qualitative data on their consumer base every day. Some companies have built an entire business model around consumer data, whether they’re companies selling personal information to a third party or creating targeted ads. Customer data is big business.

Here’s a look at some of the ways companies capture consumer data, what exactly they do with that information, and how you can use the same techniques for your own business purposes.

Types of consumer data businesses collect

The consumer data that businesses collect can be broken down into four categories: 

Personal data. This category includes personally identifiable information, such as Social Security numbers and gender, as well as non-personally identifiable information, including your IP address, web browser cookies, and device IDs (which both your laptop and mobile device have).

Engagement data. This type of data details how consumers interact with a business’s website, mobile apps, text messages, social media pages, emails, paid ads and customer service routes.

Behavioral data. This category includes transactional details such as purchase histories, product usage information (e.g., repeated actions), and qualitative data (e.g., mouse movement information).

Attitudinal data. This data type encompasses metrics on consumer satisfaction, purchase criteria, product desirability and more. 
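The four categories above can be pictured as fields on a single per-customer profile. The sketch below is a minimal illustration in Python, assuming hypothetical field names rather than any industry-standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    # Personal data: identifiers, both PII and non-PII
    ip_address: str = ""
    device_ids: list = field(default_factory=list)
    # Engagement data: interactions with the business's channels
    page_views: int = 0
    email_opens: int = 0
    # Behavioral data: transactional and usage details
    purchase_history: list = field(default_factory=list)
    # Attitudinal data: satisfaction and preference metrics
    satisfaction_score: float = 0.0

# A profile accretes data from different sources over time
profile = CustomerProfile(ip_address="203.0.113.7", page_views=12)
profile.purchase_history.append("SKU-1042")
print(profile.satisfaction_score)  # 0.0 until survey data arrives
```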

How do businesses collect your data?

Companies capture data in many ways from many sources. Some collection methods are highly technical in nature, while others are more deductive (although these processes often employ sophisticated software).

The bottom line, though, is that companies use a cornucopia of collection methods and sources to capture and process customer data, with interest in everything from demographic data to behavioral data, said Liam Hanham, data science manager at Workday.

“Customer data can be collected in three ways: by directly asking customers, by indirectly tracking customers, and by appending other sources of customer data to your own,” said Hanham. “A robust business strategy needs all three.”

Businesses are adept at pulling in all types of data from nearly every nook and cranny. The most obvious sources are consumer activity on their websites and social media pages, along with customer phone calls and live chats, but there are some more interesting methods at work as well.

One example is location-based advertising, which utilizes tracking technologies such as an internet-connected device’s IP address (and the other devices it interacts with – your laptop may interact with your mobile device and vice versa) to build a personalized data profile. This information is then used to target users’ devices with hyper-personalized, relevant advertising.

Companies also dig deep into their customer service records to see how customers have interacted with their sales and support departments in the past. Here, they are incorporating direct feedback about what worked and what didn’t, what a customer liked and disliked, on a grand scale.

Besides collecting information for their own business purposes, it has become commonplace for companies to sell personal information and other data to third parties. Once captured, this information regularly changes hands in a data marketplace of its own.

Turning data into knowledge

Capturing large amounts of data creates the problem of how to sort through and analyze all that data. No human can reasonably sit down and read through line after line of customer data all day long, and even if they could, they probably wouldn’t make much of a dent. Computers, however, sift through this data more quickly and efficiently than humans, and they can operate 24/7/365 without taking a break.

As machine learning algorithms and other forms of AI proliferate and improve, data analytics becomes an even more powerful field for breaking down the sea of data into manageable tidbits of actionable insights. Some AI programs will flag anomalies or offer recommendations to decision-makers within an organization based on the contextualized data. Without programs like these, all the data captured in the world would be utterly useless.
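As a toy illustration of the anomaly flagging described above, the sketch below marks values more than two standard deviations from the mean of a series. The two-deviation threshold is an arbitrary assumption, and production systems use far richer models than this.

```python
import statistics

def flag_anomalies(values, threshold=2.0):
    """Return indices of values more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # a flat series has no outliers
    return [i for i, v in enumerate(values) if abs(v - mean) / stdev > threshold]

daily_orders = [102, 98, 105, 99, 101, 100, 480]  # the last day is a spike
print(flag_anomalies(daily_orders))  # [6] -- only the spike is flagged
```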

How do businesses use your data?

There are several ways companies use the consumer data they collect and the insights they draw from that data.

To improve the customer experience

For many companies, consumer data offers a way to better understand and meet their customers’ demands. By analyzing customer behavior, as well as vast troves of reviews and feedback, companies can nimbly modify their digital presence, goods, or services to better suit the current marketplace.

Not only do companies use consumer data to improve consumer experiences as a whole, but they use data to make decisions on an individualized level, said Brandon Chopp, digital manager for iHeartRaves.

“Our most important source of marketing intelligence comes from understanding customer data and using it to improve our website functionality,” Chopp said. “Our team has improved the customer experience by creating customized promotions and special offers based on customer data. Since each customer is going to have their own individual preferences, personalization is key.”

To refine a company’s marketing strategy

Contextualized data can help companies understand how consumers are engaging with and responding to their marketing campaigns, and adjust accordingly. This highly predictive use case gives businesses an idea of what consumers want based on what they have already done. Like other aspects of consumer data analysis, marketing is becoming more about personalization, said Brett Downes, SEO manager at Ghost Marketing.

“Mapping users’ journeys and personalizing their journey, not just through your website but further onto platforms like YouTube, LinkedIn, Facebook, or on to any other website, is now essential,” Downes said. “Segmenting data effectively allows you to market to only the people you know are most likely to engage. These have opened up new opportunities in industries previously very hard to market to.”

To transform the data into cash flow

Companies that capture data stand to profit from it. Data brokers, or data service providers that buy and sell information on customers, have risen as a new industry alongside big data. For businesses that capture large amounts of data, collecting information and then selling it represents an opportunity for a new revenue stream.

For advertisers, having this information available for purchase is immensely valuable, so the demand for more and more data is ever increasing. That means the more disparate data sources data brokers can pull from to package more thorough data profiles, the more money they can make by selling this information to one another and advertisers.

To secure more data

Some businesses even use consumer data as a means of securing more sensitive information. For example, banking institutions sometimes use voice recognition data to authorize a user to access their financial information or to protect them from fraudulent attempts to steal their information.

These systems work by marrying data from a customer’s interaction with a call center, machine learning algorithms, and tracking technologies that can identify and flag potentially fraudulent attempts to access a customer’s account. This takes some of the guesswork and human error out of catching a con.

As data capture and analytics technologies become more sophisticated, companies will find new and more effective ways to collect and contextualize data on everything, including consumers. For businesses, doing so is essential to remain competitive well into the future; failing to do so, on the other hand, is like running a race with your legs tied together. Insight is king, and insight in the modern business environment is gleaned from contextualized data.

Data privacy regulations

So much consumer data has been captured and analyzed that governments are crafting strict data and consumer privacy regulations designed to give individuals a modicum of control over how their data is used. The European Union’s General Data Protection Regulation (GDPR) lays out the rules of data capture, storage, usage, and sharing for companies, and GDPR compliance doesn’t just matter for European companies: the regulation applies to any business that targets or collects the personal data of people in the EU.

Data privacy has made it to the U.S. in the form of the California Consumer Privacy Act (CCPA). The CCPA is, in some ways, similar to the GDPR but differs in that it requires consumers to opt out of data collection, rather than requiring providers to obtain opt-in consent up front. It also names the state, rather than a company’s internal decision-makers, as the entity that develops the applicable data law.

Data privacy regulations are changing the way businesses capture, store, share and analyze consumer data. Businesses that are so far untouched by data privacy regulations can expect to have a greater legal obligation to protect consumers’ data as more consumers demand privacy rights. Data collection by private companies, though, is unlikely to go away; it will merely change in form as businesses adapt to new laws and regulations.



Consumer, Personal Data, Privacy, Trending

During the past few decades, consumers have learned to understand the difference between opt-in, opt-out, and even double opt-in processes. Millions of people sign up for online platforms or services by providing their personally identifiable information (PII). Consumers offer their data in exchange for offers, subscriptions, and basic services.
As subscribers, consumers expect to receive some security in exchange for sharing an email address. The simplest safeguard is a second confirmation step, which created the “double opt-in” standard and provides explicit permission from the consumer. This happens, for example, when one downloads an app on a mobile device. Consumers often grant that permission reflexively, and many are unaware that this double opt-in is the de facto standard.

Going beyond double opt-in

Most consumers understand that banks and other companies own their data when they sign up or check a box agreeing to their terms. In turn, the companies that issue the credit and debit cards consumers carry in their wallets can use the spending data in ways governed by the terms of service to which both parties agreed.
But today’s consumers are demanding more from these relationships. They want more control over their data. It’s not enough to earn 1% cash back on purchases or use points for travel. Instead, they want to pick the partners that access their data based on their best interests.

Companies like Klarna, the Swedish fintech (financial technology) company, capitalize on this trend by building bespoke relationships with retailers and creating special “buy now, pay later” offers for consumers. This strategy lets the consumer bypass building a new relationship with each retailer, because Klarna has that covered. Such transactions can happen because Klarna has opted in to the retailer’s API; the consumer, in turn, only needs to opt in to the credit card issuer, in this case Klarna, to allow data to be shared.
New middleware is coming that will enable merchants to provide receipt data related to consumer transactions. This data can be delivered directly to financial institutions (banks, credit-card companies, etc.) without requiring merchants to engage directly with each financial entity for pre-approved use cases. This middleware receives the receipt data from the merchant through its proprietary API or standard batch process and can then offer it to the financial institution. The financial institution can then execute the use case by issuing credit for card-linked offers or displaying the receipt within the banking app.

What is two-sided opt-in?

Two-sided opt-in requires both the merchant making a sale and the consumer requesting their data to opt-in for the exchange. No data is shared unless both consumer and merchant opt-in to the bank, credit card issuer, or consumer-facing app to request the data. Two-sided opt-in also places control for the interaction squarely in the hands of those involved in the transaction. And it will provide enormous benefits to consumers and retailers.
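The gating logic of two-sided opt-in can be sketched in a few lines: data is released only when both parties to the transaction have opted in. The registries and identifiers below are hypothetical; no real banking or merchant API is implied.

```python
# Hypothetical opt-in registries keyed by merchant and consumer IDs
merchant_opt_ins = {"store-17"}   # merchants that opted in via the middleware API
consumer_opt_ins = {"cust-88"}    # consumers that opted in via their bank's app

def can_share_receipt(merchant_id: str, consumer_id: str) -> bool:
    """Receipt data is released only when BOTH parties have opted in."""
    return merchant_id in merchant_opt_ins and consumer_id in consumer_opt_ins

print(can_share_receipt("store-17", "cust-88"))  # True: both sides opted in
print(can_share_receipt("store-17", "cust-99"))  # False: this consumer has not opted in
```

Either party withdrawing its opt-in immediately stops the data flow, which is what places control "squarely in the hands of those involved in the transaction."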

Consumers will no longer need paper receipts. They will be able to see each product in their credit card’s electronic item-detail history, rather than the aggregated transactions and purchase totals that appear now. Consumers will also be directly notified of product recalls and can manage returns easily through an app instead of searching for an itemized paper receipt.
But the benefits for retailers are even more robust. Retailers utilizing two-sided opt-in will create a new revenue stream. Their customers will view product transactions down to the SKU level, an ability that benefits every retailer, large and small. Even small retailers who leverage this data to create new revenue streams can see profits in the thousands of dollars per month; larger retailers will build new revenue streams in the millions.

Through this technology, retailers will gain insights into consumer behavior, leading to new opportunities to market using far more granular trend analysis and deeper data. Savvy retailers will increase the effectiveness of their marketing spend by leveraging SKU-level data to deliver highly personalized, consumer-focused experiences.

The triangle of benefit: retailers, fintechs, and consumers

This triangle of benefit will soon become the standard, as retailers drive more revenue from data they had never been able to monetize while getting closer to their consumers. Banks and fintechs can gain more control over each transaction and be able to market against them. And consumers can eschew paper receipts and enjoy more transparency and control over their experience and finances.

It all starts with the new privacy standard of two-sided opt-in.


Legislation, Personal Data, Privacy


The United States and the European Commission have committed to a new Trans-Atlantic Data Privacy Framework, which will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-U.S. Privacy Shield framework. 


This Framework will reestablish an important legal mechanism for transfers of EU personal data to the United States. The United States has committed to implement new safeguards to ensure that signals intelligence activities are necessary and proportionate in the pursuit of defined national security objectives, which will ensure the privacy of EU personal data, and to create a new mechanism for EU individuals to seek redress if they believe they are unlawfully targeted by signals intelligence activities. This deal in principle reflects the strength of the enduring U.S.-EU relationship, as we continue to deepen our partnership based on our shared democratic values.


This Framework will provide vital benefits to citizens on both sides of the Atlantic. For EU individuals, the deal includes new, high-standard commitments regarding the protection of personal data. For citizens and companies on both sides of the Atlantic, the deal will enable the continued flow of data that underpins more than $1 trillion in cross-border commerce every year, and will enable businesses of all sizes to compete in each other’s markets. It is the culmination of more than a year of detailed negotiations between the EU and the U.S. following the 2020 ruling by the Court of Justice of the European Union that the prior EU-U.S. framework, known as Privacy Shield, did not satisfy EU legal requirements.


The new Trans-Atlantic Data Privacy Framework underscores our shared commitment to privacy, data protection, the rule of law, and our collective security as well as our mutual recognition of the importance of trans-Atlantic data flows to our respective citizens, economies, and societies.  Data flows are critical to the trans-Atlantic economic relationship and for all companies large and small across all sectors of the economy. In fact, more data flows between the United States and Europe than anywhere else in the world, enabling the $7.1 trillion U.S.-EU economic relationship.


By ensuring a durable and reliable legal basis for data flows, the new Trans-Atlantic Data Privacy Framework will underpin an inclusive and competitive digital economy and lay the foundation for further economic cooperation. It addresses the Court of Justice of the European Union’s Schrems II decision concerning U.S. law governing signals intelligence activities. Under the Trans-Atlantic Data Privacy Framework, the United States has made unprecedented commitments to:


Strengthen the privacy and civil liberties safeguards governing U.S. signals intelligence activities;

Establish a new redress mechanism with independent and binding authority; and

Enhance its existing rigorous and layered oversight of signals intelligence activities.

For example, the new Framework ensures that:


Signals intelligence collection may be undertaken only where necessary to advance legitimate national security objectives, and must not disproportionately impact the protection of individual privacy and civil liberties;

EU individuals may seek redress from a new multi-layer redress mechanism that includes an independent Data Protection Review Court that would consist of individuals chosen from outside the U.S. Government who would have full authority to adjudicate claims and direct remedial measures as needed; and

U.S. intelligence agencies will adopt procedures to ensure effective oversight of new privacy and civil liberties standards.

Participating companies and organizations that take advantage of the Framework to legally protect data flows will continue to be required to adhere to the Privacy Shield Principles, including the requirement to self-certify their adherence to the Principles through the U.S. Department of Commerce. EU individuals will continue to have access to multiple avenues of recourse to resolve complaints about participating organizations, including through alternative dispute resolution and binding arbitration.


These new policies will be implemented by the U.S. intelligence community in a way that effectively protects its citizens, and those of its allies and partners, consistent with the high-standard protections offered under this Framework.


The teams of the U.S. government and the European Commission will now continue their cooperation with a view to translating this arrangement into legal documents that will need to be adopted on both sides to put this new Trans-Atlantic Data Privacy Framework in place. For that purpose, these U.S. commitments will be included in an Executive Order that will form the basis of the Commission’s assessment in its future adequacy decision.