Consumer, Personal Data, Privacy, Tech

Facebook let Netflix and Spotify access private messages from users, startling documents reveal

Leaked documents show tech giant continued sharing user details despite public statements saying it had stopped years earlier

For years, Facebook gave some of the world’s largest technology companies more intrusive access to users’ personal data than it has disclosed, effectively exempting those business partners from its usual privacy rules, according to internal records and interviews.

The special arrangements are detailed in hundreds of pages of Facebook documents obtained by The New York Times. The records, generated in 2017 by the company’s internal system for tracking partnerships, provide the most complete picture yet of the social network’s data-sharing practices. They also underscore how personal data has become the most prized commodity of the digital age, traded on a vast scale by some of the most powerful companies in Silicon Valley and beyond.

The exchange was intended to benefit everyone. Pushing for explosive growth, Facebook got more users, lifting its advertising revenue. Partner companies acquired features to make their products more attractive. Facebook users connected with friends across different devices and websites. But Facebook also assumed extraordinary power over the personal information of its 2.2 billion users — control it has wielded with little transparency or outside oversight.

The social network allowed Microsoft’s Bing search engine to see the names of virtually all Facebook users’ friends without consent, the records show, and gave Netflix and Spotify the ability to read Facebook users’ private messages.

The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer, despite public statements that it had stopped that type of sharing years earlier.

Facebook has been reeling from a series of privacy scandals, set off by revelations in March that a political consulting firm, Cambridge Analytica, improperly used Facebook data to build tools that aided President Donald Trump’s 2016 campaign. Acknowledging that it had breached users’ trust, Facebook insisted it had instituted stricter privacy protections long ago. Mark Zuckerberg, the chief executive, assured lawmakers in April that people “have complete control” over everything they share on Facebook.

But the documents, as well as interviews with about 50 former employees of Facebook and its corporate partners, reveal Facebook allowed certain companies access to data despite those protections. They also raise questions about whether Facebook ran afoul of a 2011 consent agreement with the Federal Trade Commission that barred the social network from sharing user data without explicit permission.

In all, the deals described in the documents benefited more than 150 companies — most of them tech businesses, including online retailers and entertainment sites, but also car-makers and media organisations. Their applications sought the data of hundreds of millions of people a month, the records show. The deals, the oldest of which date to 2010, were all active in 2017. Some were still in effect this year.

In an interview, Steve Satterfield, Facebook’s director of privacy and public policy, said none of the partnerships violated users’ privacy or the FTC agreement. Contracts required the companies to abide by Facebook policies, he added.

Still, Facebook executives have acknowledged missteps during the past year. “We know we’ve got work to do to regain people’s trust,” Mr Satterfield said. “Protecting people’s information requires stronger teams, better technology and clearer policies, and that’s where we’ve been focused for most of 2018.” He said the partnerships were “one area of focus” and Facebook was in the process of winding many of them down.

Facebook has found no evidence of abuse by its partners, a spokesperson said. Some of the largest partners, including Amazon, Microsoft and Yahoo, said they had used the data appropriately, but declined to discuss the sharing deals in detail. Facebook did say it had mismanaged some of its partnerships, allowing certain companies’ access to continue long after they had shut down the features that required the data.

With most of the partnerships, Mr Satterfield said, the FTC agreement did not require the social network to secure users’ consent before sharing data because Facebook considered the partners extensions of itself — service providers that allowed users to interact with their Facebook friends. The partners were prohibited from using the personal information for other purposes, he said. “Facebook’s partners don’t get to ignore people’s privacy settings.”

Data privacy experts disputed Facebook’s assertion that most partnerships were exempted from the regulatory requirements, expressing scepticism that businesses as varied as device makers, retailers and search companies would be viewed alike by the agency. “The only common theme is that they are partnerships that would benefit the company in terms of development or growth into an area that they otherwise could not get access to,” said Ashkan Soltani, former chief technologist at the FTC.

Mr Soltani and three former employees of the FTC’s consumer protection division, which brought the case that led to the consent decree, said in interviews that Facebook’s data-sharing deals had probably violated the agreement.

“This is just giving third parties permission to harvest data without you being informed of it or giving consent to it,” said David Vladeck, who formerly ran the FTC’s consumer protection bureau. “I don’t understand how this unconsented-to data harvesting can at all be justified under the consent decree.”

Details of the agreements are emerging at a pivotal moment for the world’s largest social network. Facebook has been hammered with questions about its data sharing from lawmakers and regulators in the United States and Europe. The FTC this spring opened a new inquiry into Facebook’s compliance with the consent order, while the Justice Department and Securities and Exchange Commission are also investigating the company.

Facebook’s stock price has fallen, and a group of shareholders has called for Mr Zuckerberg to step aside as chairman. Shareholders also have filed a lawsuit alleging that executives failed to impose effective privacy safeguards. Angry users started a #DeleteFacebook movement.

This month, a British parliamentary committee investigating internet disinformation released internal Facebook emails, seized from the plaintiff in another lawsuit against Facebook. The messages disclosed some partnerships and depicted a company preoccupied with growth, whose leaders sought to undermine competitors and briefly considered selling access to user data.

As Facebook has battled one crisis after another, the company’s critics, including some former advisers and employees, have singled out the data-sharing as cause for concern.

“I don’t believe it is legitimate to enter into data-sharing partnerships where there is not prior informed consent from the user,” said Roger McNamee, an early investor in Facebook. “No one should trust Facebook until they change their business model.”

Unlike Europe, where social media companies have had to adapt to stricter regulation, the United States has no general consumer privacy law, leaving tech companies free to monetise most kinds of personal information as long as they do not mislead their users. The FTC, which regulates trade, can bring enforcement actions against companies that deceive their customers.

Besides Facebook, the FTC has consent agreements with Google and Twitter stemming from alleged privacy violations.

For some advocates, the torrent of user data flowing out of Facebook has called into question not only Facebook’s compliance with the FTC agreement, but also the agency’s approach to privacy regulation.

“There has been an endless barrage of how Facebook has ignored users’ privacy settings, and we truly believed that in 2011 we had solved this problem,” said Marc Rotenberg, head of the Electronic Privacy Information Centre, an online privacy group that filed one of the first complaints about Facebook with federal regulators. “We brought Facebook under the regulatory authority of the FTC after a tremendous amount of work. The FTC has failed to act.”

According to Facebook, most of its data partnerships fall under an exemption to the FTC agreement. The company argues the partner companies are service providers — companies that use the data only “for and at the direction of” Facebook and function as an extension of the social network.

But Mr Vladeck and other former FTC officials said Facebook was interpreting the exemption too broadly.

When The Times reported last summer on the partnerships with device-makers, Facebook used the term “integration partners” to describe BlackBerry, Huawei and other manufacturers that pulled Facebook data to provide social-media-style features on smartphones. All such integration partners, Facebook asserted, were covered by the service provider exemption.

Since then, as the social network has disclosed its data-sharing deals with other kinds of businesses — including internet companies such as Yahoo — Facebook has labelled them integration partners, too.

Facebook even re-categorised one company, the Russian search giant Yandex, as an integration partner.

Facebook records show Yandex had access in 2017 to Facebook’s unique user IDs even after the social network stopped sharing them with other applications, citing privacy risks. A spokesperson for Yandex, which was accused last year by Ukraine’s security service of funnelling its user data to the Kremlin, said the company was unaware of the access and did not know why Facebook had allowed it to continue. They added the Ukrainian allegations “have no merit.”

In October, Facebook said Yandex was not an integration partner. But in early December, as The Times was preparing to publish this article, Facebook told congressional lawmakers that it was.

How closely Facebook monitored its data partners is uncertain. Most of Facebook’s partners declined to discuss what kind of reviews or audits Facebook subjected them to. Two former Facebook partners, whose deals with the social network dated to 2010, said they could find no evidence that Facebook had ever audited them. One was BlackBerry. The other was Yandex.

Source: https://www.independent.co.uk/tech/facebook-personal-data-user-details-amazon-netflix-spotify-mark-zuckerberg-a8690396.html

Consumer, Legislation, Personal Data, Privacy, Trending

General Medical Council and doctors’ unions hit out at government’s ‘cavalier approach’ to patient data.

Police forces will be able to “strong-arm” NHS bodies into handing over confidential patient data under planned laws that have sparked fury from doctors’ groups and the UK’s medical watchdog.

Ministers are planning new powers for police forces that would “set aside” the existing duty of confidentiality that applies to patient data held by the NHS and will instead require NHS organisations to hand over data police say they need to prevent serious violence.

Last week, England’s national data guardian, Dr Nicola Byrne, told The Independent she had serious concerns about the impact of the legislation going through parliament, and warned that the case for introducing the sweeping powers had not been made.

Now the UK’s medical watchdog, the General Medical Council (GMC), has also criticised the new law, proposals for which are contained in the Police, Crime and Sentencing Bill, warning it fails to protect patients’ sensitive information and could disproportionately hit some groups and worsen inequalities.

Human rights group Liberty said the proposed law is so broad that police forces will be able to “strong-arm information” from the NHS and other bodies, and that this could include information about people’s health, religious beliefs and political links. It added: “Altogether, these provisions are likely to give rise to significant and severe breaches of individuals’ data rights.”

Under the proposed legislation, which will be debated in the House of Lords in the coming weeks, local health boards will be legally required to provide confidential patient information when it is requested by police. The bill explicitly sets aside the existing common-law duty of confidentiality.

The purpose is to prevent serious violence, but there is already provision to allow patient information to be shared with police where there is a public interest need, such as the threat of violence or preventing a crime. The bill does not set out in detail what kinds of data could be handed over.

Under the proposed new law, police would have the power to demand information regardless of whether the NHS considered it to be in the public interest or not.

Professor Colin Melville, medical director at the GMC, told The Independent: “We are concerned the bill doesn’t protect patients’ sensitive health information and risks undermining the trust at the heart of doctor-patient relationships. We also share concerns held by others that an erosion of this trust may disproportionately affect certain communities and deepen societal inequalities.”

The GMC has raised its objections with the Home Office, which has said that the law will still require organisations to meet data protection rules before sharing any information.

But the doctors’ union, the British Medical Association (BMA), said in a briefing for members of the House of Lords that this would not provide adequate protection.

It said that health information had “long been afforded special legal status, over and beyond the Data Protection Act, in the form of the common-law duty of confidentiality”, which had been upheld by several court cases.

It added that the bill would “override the duty of medical confidentiality, including by legally requiring identifiable health information about individuals to be shared with the police”, saying: “We believe that setting aside of the duty of confidentiality, to require confidential information to be routinely given to the police when requested, will have a highly damaging impact on the relationship of trust between doctors and their patients. A removal of a long-established protection for confidential health information, alongside a broad interpretation of ‘serious crime’, may mean many patients are reluctant or fearful to consult or to share information with doctors.”

Dr Claudia Paoloni, president of the Hospital Consultants and Specialists Association, said the new law would “seriously undermine” the existing trust-based relationship with patients, and would create barriers to them seeking care: “We have significant concerns about what appears to be a cavalier approach to long-held principles, without clear objectives, and which is likely to have unintended consequences. Unless these concerns around individual patient confidentiality can be satisfactorily answered, we believe such powers should be removed from the bill.

“Other than the most serious crimes, which are already covered by precedent on disclosure on public interest grounds, it remains unclear precisely in what way laws to force the release of individuals’ medical records would be used.”

The latest data controversy comes after the NHS was forced to pause plans to share GP patient data with third parties to aid research, after an outcry from some doctors and patients over how the information would be used.

A Home Office spokesperson said: “Appropriate safeguards are in place, and any information shared must be proportionate. The bill makes clear that information can only be shared in accordance with data protection laws.”

Source: https://www.independent.co.uk/news/health/police-nhs-patient-data-bill-b1938998.html

Consumer, Legislation, Privacy, Tech

The world-leading data law changed how companies work. But four years on, there’s a lag on cleaning up Big Tech.

One thousand four hundred and fifty-nine days have passed since data rights nonprofit NOYB fired off its first complaints under Europe’s flagship data regulation, GDPR. The complaints allege Google, WhatsApp, Facebook, and Instagram forced people into giving up their data without obtaining proper consent, says Romain Robert, a program director at the nonprofit. The complaints landed on May 25, 2018, the day GDPR came into force and bolstered the privacy rights of 740 million Europeans. Four years later, NOYB is still waiting for final decisions to be made. And it’s not the only one.

Since the General Data Protection Regulation went into effect, data regulators tasked with enforcing the law have struggled to act quickly on complaints against Big Tech firms and the murky online advertising industry, with scores of cases still outstanding. While GDPR has immeasurably improved the privacy rights of millions inside and outside of Europe, it hasn’t stamped out the worst problems: Data brokers are still stockpiling your information and selling it, and the online advertising industry remains littered with potential abuses.

Now, civil society groups have grown frustrated with GDPR’s limitations, while some countries’ regulators complain the system to handle international complaints is bloated and slows down enforcement. By comparison, the information economy moves at breakneck speed. “To say that GDPR is well enforced, I think it’s a mistake. It’s not enforced as quickly as we thought,” Robert says. NOYB has just settled a legal case over the delays in handling its consent complaints. “There’s still what we call an enforcement gap and problems with cross-border enforcement and enforcement against the big players,” adds David Martin Ruiz, a senior legal officer at the European Consumer Organization, which filed a complaint about Google’s location tracking four years ago.

Lawmakers in Brussels first proposed reforming Europe’s data rules back in January 2012 and passed the final law in 2016, giving companies and organizations two years to fall in line. GDPR builds upon previous data regulations, super-charging your rights and altering how businesses must handle your personal data, information like your name or IP address. GDPR doesn’t ban specific uses of data outright, such as police use of intrusive facial recognition; instead, seven principles sit at its heart and guide how your data can be handled, stored, and used. These principles apply equally to charities and governments, pharmaceutical companies and Big Tech firms.

Crucially, GDPR weaponized these principles and handed each European country’s data regulator the power to issue fines of up to 4 percent of a firm’s global turnover and order companies to stop practices that violate GDPR’s principles. (Ordering a company to stop processing people’s data is arguably more impactful than issuing fines.) It was never likely that GDPR fines and enforcement were going to flow quickly from regulators—in competition law, for instance, cases can take decades—but four years after GDPR started, the total number of major decisions against the world’s most powerful data companies remains agonizingly low.
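
For a sense of scale, Article 83(5) of the GDPR sets the ceiling at the greater of €20 million or 4 percent of worldwide annual turnover. A minimal TypeScript sketch of that arithmetic; the example figure is illustrative:

    // GDPR Article 83(5) fine ceiling: the greater of €20 million
    // or 4% of a firm's worldwide annual turnover.
    function maxGdprFine(worldwideAnnualTurnoverEur: number): number {
      const FIXED_CAP = 20_000_000; // €20 million
      return Math.max(FIXED_CAP, 0.04 * worldwideAnnualTurnoverEur);
    }

    // Illustrative example: €100 billion in turnover caps the fine at €4 billion.
    console.log(maxGdprFine(100_000_000_000)); // 4000000000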

Under the dense series of rules that make up GDPR, complaints against a company that operates in multiple EU countries are usually funneled to the country where its main European headquarters are based. This so-called one-stop-shop process dictates that the country leads the investigation. The tiny nation of Luxembourg handles complaints against Amazon; the Netherlands deals with Netflix; Sweden has Spotify; and Ireland is responsible for Meta’s Facebook, WhatsApp, and Instagram, plus all of Google’s services, Airbnb, Yahoo, Twitter, Microsoft, Apple, and LinkedIn.
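
For routing, the examples above boil down to a simple lookup from company to lead regulator. A sketch using only the pairings named in this paragraph; the fallback branch is an illustrative simplification, not a statement of the legal rule:

    // GDPR one-stop-shop routing: the regulator where a company's
    // main EU headquarters sits leads cross-border complaints.
    const leadRegulator = new Map<string, string>([
      ["Amazon", "Luxembourg"],
      ["Netflix", "Netherlands"],
      ["Spotify", "Sweden"],
      ["Meta", "Ireland"],
      ["Google", "Ireland"],
      ["Microsoft", "Ireland"],
    ]);

    function routeComplaint(company: string): string {
      // Simplification: treat companies without a single EU base as
      // handled by the authority where the complaint was lodged.
      return leadRegulator.get(company) ?? "authority where the complaint was lodged";
    }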

A glut of early and complex GDPR complaints has led to backlogs at regulators, including the Irish body, and international cooperation has been slowed down by paperwork. Since May 2018, the Irish regulator has completed 65 percent of cases involving cross-border decisions—400 are outstanding, according to the regulator’s own stats. Other cases, launched by NOYB against Netflix (Netherlands), Spotify (Sweden), and PimEyes (Poland), have all also dragged on for years.

Europe’s data regulators claim GDPR enforcement is still maturing and that it is working well and improving over time. (Officials from France, Ireland, Germany, Norway, Luxembourg, Italy, the UK, and Europe’s two independent bodies, the EDPS and EDPB, were all interviewed for this article.) The number of fines has ramped up as the legislation has aged, hitting a running total of €1.6 billion (around $1.7 billion). The biggest? Luxembourg fined Amazon €746 million, and Ireland fined WhatsApp €225 million last year. (Both companies are appealing the decisions). At the same time, one lesser-known Belgian fine could change how the entire ad tech industry works. However, officials concede that changes to the way GDPR is enforced could speed up the process and ensure swifter action.

Helen Dixon is at the heart of Europe’s GDPR enforcement, with the Irish Data Protection Commission (DPC) responsible for an outsized number of Big Tech firms. The DPC has faced criticism for struggling to keep up with the number of complaints under its purview, drawing ire from fellow regulators and calls to reform the body. “If everything comes at you at the same time, clearly there’s going to be a lag in terms of prioritizing and dealing sequentially with the issues while standing up what is a very significant legal framework,” Dixon says, defending her office’s performance. Dixon says the DPC has had to handle GDPR’s complexity from scratch, leading to many cases and new processes, and there aren’t simple answers for many of them.

“I would classify the DPC as being very effective in the first four years of application of the GDPR,” Dixon says. “The fact that DPC has stood up a new legal framework that many described as ‘the law of everything’ in a couple of short years, and implemented what are very significant sanctions in the form of fines and corrective measures already in that time period” shows its success, Dixon says. The organization has enforced measures against Twitter, WhatsApp, Facebook, and Groupon, among thousands of national cases, during this time.

“There should be an independent review of how to reform and strengthen the DPC,” says Johnny Ryan, a senior fellow at the Irish Council for Civil Liberties. “We cannot know from outside what the problems are.” Ryan adds that blame can’t just be leveled at the Irish regulator. “The European Commission has immense power. The GDPR is supposed to be an immense project. And the Commission has neglected the GDPR,” he says. “It doesn’t just propose the laws, it also has to see that they are applied.”

So far, the European Commission has backed enforcement of GDPR in Ireland and across the continent. “The Commission has consistently called on data protection authorities to continue stepping up their enforcement efforts,” Didier Reynders, the European Commissioner for Justice, says in a statement. “We have launched six infringement procedures under the GDPR.” These legal cases include an action against Slovenia for failing to transpose GDPR into its national law and a challenge to the independence of the Belgian data authority.

However, following a complaint from Ryan in February, the EU Ombudsman, a watchdog for European institutions, opened an inquiry into how the Commission has been monitoring data protection in Ireland. (The Ombudsman says the Commission has until May 25 to reply, after asking for its initial deadline to be extended. Reynders says the Commission does not comment on ongoing inquiries). If the Commission does look into Ireland, it could make recommendations, says Estelle Massé, the global data protection lead at Access Now, a technology-focused civil rights organization. “There is an issue, and if you don’t intervene in this way, I don’t really see how the situation will resolve,” Massé says. “It has to go through an infringement procedure.”

Despite clear enforcement problems, GDPR has had an incalculable effect on data practices broadly. EU countries have made decisions in thousands of local cases and issued guidance to organizations to say how they should use people’s data. Spain’s LaLiga soccer league was fined after its app spied on users, retailer H&M was fined in Germany after it saved details about employees’ personal lives, the Netherlands’ tax body was fined over its use of a ‘blacklist,’ and these are just a handful of the successful cases.

Some of GDPR’s impact is also hidden—the law isn’t just about fines and ordering companies to change—and it has improved company behaviors. “If you compare the awareness about cybersecurity, about data protection, about privacy, as it looked like 10 years ago and it looks today, these are completely different worlds,” says Wojciech Wiewiórowski, the European Data Protection Supervisor, who oversees GDPR cases against European institutions, such as Europol.

Companies have been put off using people’s data in dubious ways, experts say, when they wouldn’t have thought twice about it pre-GDPR. One recent study estimated that the number of Android apps on Google’s Play store has dropped by a third since the introduction of GDPR, citing better privacy protections. “More and more businesses have allocated significant budgets to doing data protection compliance,” says Hazel Grant, head of the privacy, security, and information group at London-headquartered law firm Fieldfisher. Grant says that when GDPR decisions are made—such as Austria’s decision to make the use of Google Analytics unlawful—companies are concerned about what it means for them. “Four or five years ago, that enforcement wouldn’t have happened,” Grant says. “And if it had happened, maybe a few data protection lawyers would have known about it—it wouldn’t have been out there with clients coming to us saying we need advice on this.”

But at Big Tech levels where data is plentiful, the scale of complying with GDPR is different. One recent internal Facebook document obtained by Motherboard hints that the company doesn’t really know what it does with your data—an assertion Facebook denied at the time. Equally, a WIRED and Reveal joint investigation at the end of 2021 found serious shortcomings in the ways Amazon handles customer data. (Amazon said it had an “exceptional” track record in protecting data.)

Microsoft declined a request to comment. Neither Google nor Facebook provided comment in time for publication.

“There is a lag, especially on Big Tech, enforcing the law on Big Tech—and Big Tech means cross-border cases, and that means the one-stop-shop and the cooperation among the data protection authorities,” says Ulrich Kelber, the head of the German federal data protection regulator. The one-stop-shop allows all of Europe’s regulators to have a say on the final decision of the lead regulator in that case, which can then be challenged. Ireland’s fine against WhatsApp grew from the original proposed penalty of as little as €30 million to €225 million after other regulators weighed in. Another Irish case against Instagram is currently being discussed, Dixon says, which will add months to its final outcome.

The one-stop-shop was created by GDPR, so some teething problems were to be expected, but four years in, a lot still needs to be improved. Tobias Judin, the head of international at Norway’s data protection authority, says that each week several drafts of decisions are circulated among Europe’s data regulators. “In the vast majority of those cases, we actually agree,” Judin says. (German authorities object the most.) Decisions can face a lot of back and forth between regulators, wrapped up in bureaucracy. “We do question whether, in those cases that have a European-wide impact, it makes sense and whether it is feasible that these cases are solely dealt with by one data protection authority until we reach the decision stage,” Judin says.

Luxembourg’s data regulator hit Amazon with a record-breaking €746 million fine last year, its first case against the retailer. Amazon is contesting the fine in court—in a statement to WIRED, the company repeated its assertion that “there has been no data breach, and no customer data has been exposed to any third party”—but Luxembourg’s regulator says investigations will always be lengthy despite it bringing in new ways to investigate companies. “I think under one year or one-half year, I think it’s almost impossible to have it closed before such a delay,” says Alain Herrmann, one of Luxembourg’s four data protection commissioners. “There are huge [amounts of] information to deal with.” Herrmann says Luxembourg has a few other international cases ongoing, but national secrecy laws prevent it from talking about them. “It’s just the [one-stop-shop] system, the lack of resources, the lack of clear law and procedure, which makes their job even more difficult,” Robert says.

The French data regulator has, in some ways, sidestepped the international GDPR process by directly pursuing companies’ use of cookies. Despite common beliefs, annoying cookie pop-ups don’t come from GDPR—they’re governed by the EU’s separate E-Privacy law, and the French regulator has taken advantage of this. Marie-Laure Denis, the head of French regulator CNIL, has hit Google, Amazon, and Facebook with hefty fines for bad cookie practices. Perhaps more importantly, it has forced companies to change their behavior. Google is altering its cookie banners across the whole of Europe following the French enforcement.

“We are starting to see really concrete changes to the digital ecosystems and evolution of practices, which is really what we are looking [for],” Denis says. She explains that CNIL will next look at data collection by mobile apps under the E-Privacy law, and cloud data transfers under GDPR. The cookie enforcement effort wasn’t to avoid GDPR’s protracted process, but it was more efficient, Denis says. “We still believe in the GDPR enforcement mechanism, but we need to make it work better—and quicker.”

In the last year, there have been growing calls to change how GDPR works. “Enforcement should be more centralized for big affairs,” Viviane Reding, the politician who proposed GDPR back in 2012, said of the data law in May last year. The calls have come as Europe passed its next two big pieces of digital regulation: the Digital Services Act and the Digital Markets Act. The laws, which focus on competition and internet safety, handle enforcement differently from GDPR; in some instances, the European Commission will investigate Big Tech companies. The move is a nod to the fact that GDPR enforcement may not have been as smooth as politicians would have liked.

There appears to be little appetite to reopen GDPR itself; however, smaller tweaks could help improve enforcement. At a recent meeting of data regulators held by the European Data Protection Board, a body that exists to guide regulators, countries agreed that some international cases will work to fixed deadlines and timelines and said they would try to “join forces” on some investigations. Norway’s Judin says the move is positive but questions how effective it will be in practice.

Massé, from Access Now, says a small amendment to GDPR could significantly address some of the biggest current enforcement problems. Legislation could ensure data protection authorities handle complaints in the same way (including using the same forms), explicitly lay out how the one-stop-shop should work, and make sure that procedures in individual countries are the same, Massé says. In short, it could clarify how GDPR enforcement should be handled by every country.

Source: https://www.wired.com/story/gdpr-2022/

Consumer, Personal Data, Privacy, Tech

Big Tech’s hunger for data is increasingly running into walls. European privacy watchdogs have already called Google to order, and other tech companies are realising that they need to pay more attention to their users’ privacy. That has a major impact on what information is available to marketers. So don’t wait to start looking for alternatives.

Google Analytics is no longer an option for analysing website visitors. In mid-January, the Austrian privacy watchdog DSB was the first to conclude that Google Analytics violates the GDPR and may no longer be used by organisations that process the data of EU citizens. The Norwegian privacy regulator Datatilsynet followed at the end of January, and in early February the French privacy and data authority CNIL likewise concluded that Google Analytics may no longer be used in that country. Following on from that, in mid-April CNIL ordered three websites in France to stop using Google Analytics. The potential fine is no trifle: up to €20 million or 4 per cent of annual turnover.

The Dutch Data Protection Authority (Autoriteit Persoonsgegevens, AP) is still investigating the matter and has not yet taken an official decision, but it is already warning Dutch organisations to stop using Google Analytics. Partly in response to these developments, Google has announced the end of Universal Analytics (Google Analytics 3) for 1 July 2023. Google Analytics 4 is already available, but it works completely differently from GA3: organisations effectively have to rebuild their analytics approach from scratch. Carrying out that transition in little more than a year is a serious challenge, and GA4 does not resolve all the privacy questions properly either.

Reining in Big Tech

But Google Analytics is not the only service that collects too much information about internet users. Other tech giants, such as Amazon, Meta (Facebook) and Apple, also store vast amounts of user data. It is to be expected that this too will soon come to an end, especially now that the EU member states and the European Parliament reached an agreement at the end of March on the Digital Markets Act (DMA). This law is meant to curb the power of Big Tech, foster more competition and better protect the privacy of EU citizens.

All these developments mean that marketers and users will have to look for alternatives that take privacy more seriously. Although the privacy of internet users in the European Union is increasingly well protected under the GDPR, the Big Tech companies (Google, Apple, Facebook and Amazon, GAFA for short) regularly make the news over privacy scandals. Facebook, for instance, wrestled for a long time with the notorious Cambridge Analytica affair, and Mark Zuckerberg had to answer to the US Congress about how Facebook handles users’ privacy.

Consumers genuinely perceive Big Tech’s privacy violations as a negative. European research shows that 41% of Europeans do not want to share any data about themselves with private companies. According to Pew Research, almost 70% of Americans believe their personal data is being misused. Consumer data is one of the most valuable commodities in the world, but the power of these tech giants has grown too large, and it has already triggered a series of antitrust cases.

Google FLoC and Topics?

Fortunately, Big Tech itself has also come to the conclusion that continuing to ignore users’ privacy demands threatens the success of its business. If tech companies want to use people’s data, they will have to do so in a way consumers accept. That is why they are getting more serious about safeguarding privacy on users’ devices and in their applications.

Google has already announced that the Chrome browser will block third-party cookies from 2023. Coming from the biggest player in the online advertising market, that is remarkable, to say the least. It looks like a victory for privacy, but it does not mean the end of user data collection. After initially betting on its so-called FLoC technology (Federated Learning of Cohorts), Google announced in January of this year that it would not proceed with this cookie alternative after all. Instead, ‘Topics’ becomes Google’s new way of sorting users into categories of similar interests, so that companies can serve them contextual advertising. With Topics, Chrome collects information about users’ interests as they visit websites; this data is then retained for three weeks. Google has currently capped the number of topics at 300 but plans to expand this later.

The fact that advertising thereby becomes contextual is a step in the right direction, yet data is still being passed out of the browser for advertising purposes. Topics, too, is designed to collect data about Chrome users’ interests and share it with third parties, which serve advertisements based on it. Other browsers, such as Safari, Firefox or Brave, do not do this.
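
For illustration: in Chrome’s trial implementation, a site reads these interests through a JavaScript call, document.browsingTopics(). A minimal TypeScript sketch follows, assuming the promise-based API described in Google’s Privacy Sandbox documentation; the exact field names are an assumption and may differ:

    // Sketch of reading Topics API interests in Chrome. Assumes the
    // document.browsingTopics() call from the Privacy Sandbox proposal;
    // the field names here are illustrative.
    interface BrowsingTopic {
      topic: number;          // ID in the taxonomy (currently ~300 entries)
      taxonomyVersion: string;
    }

    async function logTopics(): Promise<void> {
      const doc = document as Document & {
        browsingTopics?: () => Promise<BrowsingTopic[]>;
      };
      if (!doc.browsingTopics) {
        console.log("Topics API not available in this browser");
        return;
      }
      // Interests are derived from recent browsing and retained for
      // roughly three weekly epochs.
      const topics = await doc.browsingTopics();
      topics.forEach((t) => console.log(`topic ${t.topic} (taxonomy ${t.taxonomyVersion})`));
    }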

Give consumers back control

The best way to deal with Big Tech’s insatiable hunger for data is to look for alternatives. The key concept here is data minimisation: collecting and using only the visitor information that is needed to perform a particular action or deliver a service. We will have to accept consumers’ need for privacy and hand control back to them. From a technical point of view, they must be able to withdraw or block their consent to the use of their data if they associate a brand with practices they do not appreciate. Companies should collect and use data only for a specific and legitimate purpose. These are key principles not only of the GDPR but of most privacy laws elsewhere in the world.
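
As an illustration of data minimisation, an analytics event can be trimmed to just what a page-view count needs. A hypothetical sketch; the fields are a suggested minimum, not any product’s actual schema:

    // Hypothetical minimal page-view event: enough to count visits,
    // with no cookie, no user identifier, and no full referrer URL.
    interface MinimalPageView {
      path: string;            // e.g. "/pricing"
      referrerDomain?: string; // domain only, not the full URL
      timestamp: Date;
    }

    function recordPageView(path: string, referrer?: string): MinimalPageView {
      return {
        path,
        referrerDomain: referrer ? new URL(referrer).hostname : undefined,
        timestamp: new Date(),
      };
    }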

It is becoming ever clearer that nothing more may be done with users’ data without their explicit consent. For marketing departments, this means the number of data points available for their campaigns will shrink by half, or perhaps even more. Companies must therefore work out very carefully which visitor data is essential and what the best way is to collect that data properly.

As the various privacy authorities in Europe now advise, it is wise to look for an alternative to Google Analytics that does comply with the privacy rules. Fortunately, excellent European alternatives already exist, such as Plausible, Simple Analytics, Splitbee or Piwik PRO. Don’t wait for an actual ban on Google Analytics; start right away, so that a strong, trust-based relationship with customers can be built.

Source: https://www.emerce.nl/achtergrond/privacy-en-data-wacht-niet-met-zoeken-naar-alternatieven-voor-big-tech

Consumer, Legislation, Privacy, Tech, Trending
The Digital Services Act will reshape the online world
The EU has agreed on another ambitious piece of legislation to police the online world.

Early Saturday morning, after hours of negotiations, the bloc agreed on the broad terms of the Digital Services Act, or DSA, which will force tech companies to take greater responsibility for content that appears on their platforms. New obligations include removing illegal content and goods more quickly, explaining to users and researchers how their algorithms work, and taking stricter action on the spread of misinformation. Companies face fines of up to 6 percent of their annual turnover for noncompliance.

“The DSA will upgrade the ground-rules for all online services in the EU,” said European Commission President Ursula von der Leyen in a statement. “It gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms.”

Margrethe Vestager, the European Commissioner for Competition who has spearheaded much of the bloc’s tech regulation, said the act would “ensure that platforms are held accountable for the risks their services can pose to society and citizens.”

The DSA shouldn’t be confused with the DMA or Digital Markets Act, which was agreed upon in March. Both acts affect the tech world, but the DMA focuses on creating a level playing field between businesses while the DSA deals with how companies police content on their platforms. The DSA will therefore likely have a more immediate impact on internet users.

Although the legislation only applies to EU citizens, the effect of these laws will certainly be felt in other parts of the world, too. Global tech companies may decide it is more cost-effective to implement a single strategy to police content and take the EU’s comparatively stringent regulations as their benchmark. Lawmakers in the US keen to rein in Big Tech with their own regulations have already begun looking to the EU’s rules for inspiration.

The final text of the DSA has yet to be released, but the European Parliament and European Commission have detailed a number of obligations it will contain:

  1. Targeted advertising based on an individual’s religion, sexual orientation, or ethnicity is banned. Minors cannot be subject to targeted advertising either.
  2. “Dark patterns” — confusing or deceptive user interfaces designed to steer users into making certain choices — will be prohibited. The EU says that, as a rule, canceling subscriptions should be as easy as signing up for them.
  3. Large online platforms like Facebook will have to make the working of their recommender algorithms (used for sorting content on the News Feed or suggesting TV shows on Netflix) transparent to users. Users should also be offered a recommender system “not based on profiling.” In the case of Instagram, for example, this would mean a chronological feed (as it introduced recently). A minimal sketch of such a non-profiling feed follows this list.
  4. Hosting services and online platforms will have to explain clearly why they have removed illegal content as well as give users the ability to appeal such takedowns. The DSA itself does not define what content is illegal, though, and leaves this up to individual countries.
  5. The largest online platforms will have to provide key data to researchers to “provide more insight into how online risks evolve.”
  6. Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
  7. Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).
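
Point 3 is the most concrete of these for engineers: a recommender “not based on profiling” can be as simple as ordering posts from followed accounts by time. A minimal sketch with hypothetical types:

    // Non-profiling recommender sketch: a plain chronological feed
    // built only from accounts the user follows, with no behavioural
    // signals or interest profile involved.
    interface Post {
      authorId: string;
      createdAt: Date;
      text: string;
    }

    function chronologicalFeed(posts: Post[], followedIds: Set<string>): Post[] {
      return posts
        .filter((p) => followedIds.has(p.authorId))
        .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
    }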

The DSA will, like the DMA, distinguish between tech companies of different sizes, placing greater obligations on bigger companies. The largest firms — those with at least 45 million users in the EU, like Meta and Google — will face the most scrutiny. These tech companies have lobbied hard to water down the requirements in the DSA, particularly those concerning targeted advertising and handing over data to outside researchers.

Although the broad terms of the DSA have now been agreed upon by the member states of the EU, the legal language still needs to be finalized and the act officially voted into law. This last step is seen as a formality at this point, though. The rules will apply to all companies 15 months after the act is voted into law, or from January 1st, 2024, whichever is later.


Source: https://www.theverge.com/2022/4/23/23036976/eu-digital-services-act-finalized-algorithms-targeted-advertising

Consumer, Legislation, Privacy, Tech, Trending

French regulators have hit Google and Facebook with 210 million euros ($237 million) in fines over their use of “cookies”, the data used to track users online, authorities said Thursday.

US tech giants, including the likes of Apple and Amazon, have come under growing pressure over their business practices across Europe, where they have faced massive fines and far-reaching EU plans to regulate how they operate.

The 150-million-euro fine imposed on Google was a record by France’s National Commission for Information Technology and Freedom (CNIL), beating a previous cookie-related fine of 100 million euros against the company in December 2020.

Facebook was handed a 60-million-euro fine.

“CNIL has determined that the sites facebook.com, google.fr and (Google-owned) youtube.com do not allow users to refuse the use of cookies as simply as to accept them,” the regulatory body said.

The two platforms have three months to adapt their practices, after which France will impose fines of 100,000 euros per day, CNIL added.

Google told AFP it would change its practices following the ruling. “In accordance with the expectations of internet users… we are committed to implementing new changes, as well as to working actively with CNIL in response to its decision,” the US firm said in a statement.

Cookies are small packets of data stored on a user’s computer when they visit a website, allowing web browsers to save information about their session.
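
As a concrete illustration, a page script can set and read such a cookie in a few lines. A minimal TypeScript sketch; the cookie name and lifetime are illustrative, not any real site’s values:

    // Setting and reading a browser cookie. "session_id" and the
    // one-hour lifetime are illustrative choices.
    function setSessionCookie(value: string): void {
      // Max-Age is in seconds; SameSite limits cross-site sending.
      document.cookie = `session_id=${encodeURIComponent(value)}; Max-Age=3600; Path=/; SameSite=Lax`;
    }

    function readCookie(name: string): string | undefined {
      const match = document.cookie
        .split("; ")
        .find((part) => part.startsWith(`${name}=`));
      return match ? decodeURIComponent(match.split("=")[1]) : undefined;
    }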

They are highly valuable for Google and Facebook as ways to personalise advertising — their primary source of revenue.

But privacy advocates have long pushed back. Since the European Union’s sweeping data protection law took effect in 2018, internet companies have faced stricter rules that oblige them to seek the direct consent of users before installing cookies on their computers.

90 notices issued

CNIL argued that Google, Facebook and YouTube make it very easy to consent to cookies via a single button, whereas rejecting the request requires several clicks.
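
The remedy CNIL is pointing at is symmetry: refusing should take exactly as many clicks as accepting. A hypothetical sketch of such a banner:

    // Hypothetical symmetric cookie banner: "Accept all" and
    // "Reject all" are equal, single-click choices.
    function renderConsentBanner(onChoice: (consented: boolean) => void): void {
      const banner = document.createElement("div");
      for (const [label, consented] of [["Accept all", true], ["Reject all", false]] as const) {
        const button = document.createElement("button");
        button.textContent = label;
        button.onclick = () => {
          onChoice(consented); // both outcomes resolved in one click
          banner.remove();
        };
        banner.appendChild(button);
      }
      document.body.appendChild(banner);
    }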

It had given internet companies until April 2021 to adapt to the tighter privacy rules, warning that they would start facing sanctions after that date.

French newspaper Le Figaro was the first to be sanctioned, receiving a fine of 50,000 euros in July for allowing cookies to be installed by advertising partners without the direct approval of users, or even after they had rejected them.

CNIL said recently that it had sent 90 formal notices to websites since April.

In 2020, it inflicted fines of 100 million and 35 million euros respectively on Google and Amazon for their use of cookies.

The fines were based on an earlier EU law, the General Data Protection Regulation, with CNIL arguing that the companies had failed to give “sufficiently clear” information to users about cookies.

Source: https://www.france24.com/en/technology/20220106-france-fines-google-facebook-more-than-%E2%82%AC200-million-for-cookie-breaches

 
Consumer, Corporate, Legislation, Personal Data, Privacy, Trending

In September 2021, the Kingdom of Saudi Arabia issued its Personal Data Protection Law to regulate the processing of personal data. The PDPL is the first national, sector-agnostic data privacy legislation in Saudi Arabia. Organizations will need to make significant changes to their operations to ensure compliance.

The PDPL comes into effect only 180 days after its publication in the Official Gazette, meaning the law will be effective March 23, subject to the passage of the implementing regulations. For the first two years, it will be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA), after which a transition to the National Data Management Office will be considered.

Like other new data protection laws and updates within the broader Middle East and North Africa region, some elements within the PDPL are similar to those of other international data protection regulations. The law also includes numerous unique requirements — such as data transfer and localization requirements — that businesses will need to pay careful attention to. Fulfilling these requirements may be operationally burdensome, and early planning will be critical to avoid significant setbacks.
The PDPL also has extraterritorial effect, so organizations based outside Saudi Arabia will still be subject to the law and its requirements if they process the personal data of Saudi residents.

What does the law introduce?
The PDPL introduces a number of requirements that could significantly impact how companies in the Kingdom operate. The most notable include:

Registration requirements. 
Data controllers, the organizations that determine the means and purpose of processing of personal data, must register via an electronic portal which includes an annual registration fee.

Records of processing. 
Data controllers must create and maintain a record of how they process personal data, and it must be registered with the SDAIA. Any foreign company operating in the Kingdom and processing personal data of Saudi residents must appoint a local representative. More guidance regarding when this requirement will become effective is forthcoming from the SDAIA. Organizations will also be expected to appoint data officers to manage compliance with the law.

Data subject rights.
Individuals are now provided with new rights to their data, namely that they have the right to information about how their data is processed, the ability to access copies of their data and request corrections, and the right to have their data destroyed. Individuals will also have the right to lodge complaints with the regulatory authority.

Data transfers.
Data transfers outside the Kingdom are only permitted in limited circumstances. However, even if the transfer meets one of the permitted exceptions, the data controller must obtain approval from an appropriate government authority, amongst other conditions.

Consent.
The principal legal basis for processing under the law is consent. Personal data may only be processed without consent in certain circumstances. Individuals will also have the right to withdraw their consent to the processing of their personal data. Importantly, data controllers must also have prior consent of individuals to send direct marketing and must provide an opt-out mechanism.

Impact assessments.
Data controllers must assess projects, products and services to identify data protection risks posed to individuals.

Privacy notice.
Data controllers must implement a privacy notice specifying how data will be processed prior to collecting personal data from individuals.

Breach notification.
Data controllers will be expected to report data breaches to the regulatory authority as soon as they become aware of an incident.

Sensitive data.
Information such as genetic, health, credit and financial data will fall under the scope of the law. This data is also likely to be subject to additional regulation.

So how do we prepare?
Like most compliance efforts, early preparation is essential, especially to achieve compliance with some of the more onerous requirements detailed in the PDPL. As a priority, organizations should follow this six-point plan:

Step 1: Understand the data. 
Organizations must understand what data they hold, how it is used and who it is shared with. This can be accomplished by creating a record of processing activities (ROPA) to trace data through the information lifecycle. This document can be used as a single source of truth and to inform other compliance activities.
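
As an illustration only, one ROPA entry can be modelled as a simple record. The fields below are a hypothetical minimum, not a format prescribed by the PDPL:

    // Hypothetical minimal shape for one record of processing
    // activities (ROPA) entry; all field names are illustrative.
    interface RopaEntry {
      activity: string;               // e.g. "payroll processing"
      purpose: string;                // why the data is processed
      dataCategories: string[];       // e.g. ["name", "bank details"]
      dataSubjects: string[];         // e.g. ["employees"]
      recipients: string[];           // who the data is shared with
      crossBorderTransfers: string[]; // destination countries, if any
      retentionPeriod: string;        // how long the data is kept
    }

    const example: RopaEntry = {
      activity: "payroll processing",
      purpose: "paying employees",
      dataCategories: ["name", "bank details"],
      dataSubjects: ["employees"],
      recipients: ["payroll provider"],
      crossBorderTransfers: [],
      retentionPeriod: "7 years after employment ends",
    };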

Step 2: Establish governance. 
Identifying local representatives where appropriate and appointing data officers will be an essential step. These individuals should be integrated into existing data protection or security networks of governance to enable the successful communication and escalation of risks.

Step 3: Create policies and procedures.
Policies and processes must be updated to reflect the new data protection responsibilities, including procedural guidance for responding to data subject rights requests and issuing data breach notifications. Policy refreshes must also address the assessment of data protection and security standards in place among third parties.

Step 4: Implement and test breach plans.
Organizations need a robust data breach plan that articulates each step involved in responding to a breach, the individuals and teams involved, and the timelines to complete each step. Testing your plan will help to ensure your teams are cohesive and ready should an actual incident occur.

Step 5: Identify international data transfers.
Using the ROPAs as a starting point, organizations should seek to understand what data is transferred internationally and where it is transferred to. This includes understanding how limitations in the law may affect these transfers and beginning to adopt strategies for compliance.

Step 6: Provide training and change management.
Training is an effective tool to develop a sustainable culture of compliance. To complement training activities, organizations should consider identifying change management strategies to help ensure that the compliance activities are embedded successfully.

Source: https://iapp.org/news/a/how-to-prepare-for-saudi-arabias-personal-data-protection-law/

Consumer, Legislation, Personal Data, Privacy, Trending

Many businesses collect data for a wide variety of purposes. Here’s how to know what they’re doing with your personal data and whether it is secure.


As technologies that capture and analyze data proliferate, so, too, do businesses’ abilities to contextualize data and draw new insights from it. Artificial intelligence is a critical tool for data capture, analysis, and collection of information that many businesses are using for a range of purposes, including better understanding day-to-day operations, making more informed business decisions and learning about their customers.

Customer data is a focus area all its own. From consumer behavior to predictive analytics, companies regularly capture, store, and analyze large amounts of quantitative and qualitative data on their consumer base every day. Some companies have built an entire business model around consumer data, whether they’re companies selling personal information to a third party or creating targeted ads. Customer data is big business.

Here’s a look at some of the ways companies capture consumer data, what exactly they do with that information, and how you can use the same techniques for your own business purposes.

Types of consumer data businesses collect

The consumer data that businesses collect can be broken down into four categories: 

Personal data. This category includes personally identifiable information such as Social Security numbers and gender as well as nonpersonally identifiable information, including your IP address, web browser cookies, and device IDs (which both your laptop and mobile device have).

Engagement data. This type of data details how consumers interact with a business’s website, mobile apps, text messages, social media pages, emails, paid ads and customer service routes.

Behavioral data. This category includes transactional details such as purchase histories, product usage information (e.g., repeated actions), and qualitative data (e.g., mouse movement information).

Attitudinal data. This data type encompasses metrics on consumer satisfaction, purchase criteria, product desirability and more. 
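
For illustration, these four categories map naturally onto a tagged record type. A hypothetical sketch:

    // Hypothetical tagging of collected records with the four
    // categories described above.
    type ConsumerDataCategory = "personal" | "engagement" | "behavioral" | "attitudinal";

    interface ConsumerRecord {
      category: ConsumerDataCategory;
      collectedAt: Date;
      payload: Record<string, unknown>; // raw fields, e.g. { page: "/pricing" }
    }

    const record: ConsumerRecord = {
      category: "engagement",
      collectedAt: new Date(),
      payload: { page: "/pricing", clicks: 3 },
    };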

How do businesses collect your data?

Companies capture data in many ways from many sources. Some collection methods are highly technical in nature, while others are more deductive (although these processes often employ sophisticated software).

The bottom line, though, is that companies are using a cornucopia of collection methods and sources to capture and process customer data, on metrics ranging from demographic data to behavioral data, said Liam Hanham, data science manager at Workday.

“Customer data can be collected in three ways: by directly asking customers, by indirectly tracking customers, and by appending other sources of customer data to your own,” said Hanham. “A robust business strategy needs all three.”

Businesses are adept at pulling in all types of data from nearly every nook and cranny. The most obvious sources are consumer activity on their websites and social media pages, and customer phone calls and live chats, but there are some more interesting methods at work as well.

One example is location-based advertising, which utilizes tracking technologies such as an internet-connected device’s IP address (and the other devices it interacts with – your laptop may interact with your mobile device and vice versa) to build a personalized data profile. This information is then used to target users’ devices with hyper-personalized, relevant advertising.

Companies also dig deep into their customer service records to see how customers have interacted with their sales and support departments in the past. Here, they are incorporating direct feedback about what worked and what didn’t, what a customer liked and disliked, on a grand scale.

Besides collecting information for their own business purposes, many companies sell personal information and other data to third parties. Once captured, this information regularly changes hands in a data marketplace of its own.

Turning data into knowledge

Capturing large amounts of data creates the problem of how to sort through and analyze all that data. No human can reasonably sit down and read through line after line of customer data all day long, and even if they could, they probably wouldn’t make much of a dent. Computers, however, sift through this data more quickly and efficiently than humans, and they can operate 24/7/365 without taking a break.

As machine learning algorithms and other forms of AI proliferate and improve, data analytics becomes an even more powerful field for breaking down the sea of data into manageable tidbits of actionable insights. Some AI programs will flag anomalies or offer recommendations to decision-makers within an organization based on the contextualized data. Without programs like these, all the data captured in the world would be utterly useless.
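
A minimal sketch of that anomaly-flagging idea follows: values far from the mean of a metric are surfaced for human review. The data and the two-standard-deviation threshold are illustrative assumptions, not a production model.

    import statistics

    # Illustrative metric: daily sign-ups, with one suspicious spike.
    daily_signups = [120, 131, 118, 125, 122, 410, 119, 127]

    mean = statistics.mean(daily_signups)
    stdev = statistics.stdev(daily_signups)

    # Flag values more than two standard deviations from the mean for review.
    anomalies = [x for x in daily_signups if abs(x - mean) > 2 * stdev]
    print("Flagged for review:", anomalies)  # [410]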


How do businesses use your data?

There are several ways companies use the consumer data they collect and the insights they draw from that data.

1: To improve the customer experience

For many companies, consumer data offers a way to better understand and meet their customers’ demands. By analyzing customer behavior, as well as vast troves of reviews and feedback, companies can nimbly modify their digital presence, goods, or services to better suit the current marketplace.

Companies not only use consumer data to improve the customer experience as a whole; they also use data to make decisions on an individualized level, said Brandon Chopp, digital manager for iHeartRaves.

“Our most important source of marketing intelligence comes from understanding customer data and using it to improve our website functionality,” Chopp said. “Our team has improved the customer experience by creating customized promotions and special offers based on customer data. Since each customer is going to have their own individual preferences, personalization is key.”

2: To refine a company’s marketing strategy

Contextualized data can help companies understand how consumers are engaging with and responding to their marketing campaigns, and adjust accordingly. This highly predictive use case gives businesses an idea of what consumers want based on what they have already done. Like other aspects of consumer data analysis, marketing is becoming more about personalization, said Brett Downes, SEO manager at Ghost Marketing.

“Mapping users’ journeys and personalizing them, not just through your website but onward to platforms like YouTube, LinkedIn, Facebook, or any other website, is now essential,” Downes said. “Segmenting data effectively allows you to market only to the people you know are most likely to engage. These techniques have opened up new opportunities in industries that were previously very hard to market to.”
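
As a toy illustration of that kind of segmentation, the sketch below scores customers and markets only to those above a threshold. The records and the scoring rule are made-up stand-ins for a real propensity model.

    customers = [
        {"email": "a@example.com", "ad_clicks": 9, "recent_purchase": True},
        {"email": "b@example.com", "ad_clicks": 0, "recent_purchase": False},
        {"email": "c@example.com", "ad_clicks": 4, "recent_purchase": True},
    ]

    def engagement_score(customer):
        # Weight a recent purchase more heavily than ad clicks.
        return customer["ad_clicks"] + (10 if customer["recent_purchase"] else 0)

    # Market only to the segment most likely to engage.
    segment = [c for c in customers if engagement_score(c) >= 10]
    for customer in segment:
        print("Target:", customer["email"])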

3: To transform the data into cash flow

Companies that capture data stand to profit from it. Data brokers, or data service providers that buy and sell information on customers, have risen as a new industry alongside big data. For businesses that capture large amounts of data, collecting information and then selling it represents an opportunity for a new revenue stream.

For advertisers, having this information available for purchase is immensely valuable, so demand for ever more data keeps increasing. The more disparate data sources brokers can pull from to package thorough data profiles, the more money they can make selling this information to one another and to advertisers.
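
The profile-packaging step can be pictured as a simple merge keyed on a shared identifier, as in this sketch; the sources and fields are invented for illustration.

    # Invented sources: two data sets holding facts about the same person.
    source_a = {"a@example.com": {"age": 34}}
    source_b = {"a@example.com": {"zip": "10001", "owns_car": True}}

    profiles = {}
    for source in (source_a, source_b):
        for identifier, attributes in source.items():
            # Merge attributes from every source into one richer profile.
            profiles.setdefault(identifier, {}).update(attributes)

    print(profiles["a@example.com"])  # {'age': 34, 'zip': '10001', 'owns_car': True}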

4: To secure more data

Some businesses even use consumer data as a means of securing more sensitive information. For example, banking institutions sometimes use voice recognition data to authenticate a user before granting access to financial information, or to protect the user from fraudulent attempts to steal their information.

These systems work by marrying data from a customer’s interaction with a call center, machine learning algorithms, and tracking technologies that can identify and flag potentially fraudulent attempts to access a customer’s account. This takes some of the guesswork and human error out of catching a con.
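
In code, that kind of signal-combining might look like the sketch below. The signals, weights, and threshold are illustrative assumptions, not any bank’s actual fraud model.

    def fraud_risk(voice_match: float, known_device: bool, usual_region: bool) -> float:
        risk = 1.0 - voice_match   # a weak voiceprint match raises risk
        if not known_device:
            risk += 0.3            # an unrecognized device raises risk
        if not usual_region:
            risk += 0.2            # an unusual location raises risk
        return risk

    attempt = fraud_risk(voice_match=0.62, known_device=False, usual_region=True)
    if attempt > 0.5:              # 0.38 + 0.3 = 0.68, so this gets flagged
        print("Flag for manual review")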

As data capture and analytics technologies become more sophisticated, companies will find new and more effective ways to collect and contextualize data on everything, including consumers. For businesses, doing so is essential to remain competitive well into the future; failing to do so, on the other hand, is like running a race with your legs tied together. Insight is king, and insight in the modern business environment is gleaned from contextualized data.

Data privacy regulations

So much consumer data has been captured and analyzed that governments are crafting strict data and consumer privacy regulations designed to give individuals a modicum of control over how their data is used. The European Union’s General Data Protection Regulation (GDPR) lays out the rules of data capture, storage, usage, and sharing for companies, and compliance doesn’t just matter for European companies: the law applies to any business that targets or collects the personal data of EU residents.

Data privacy regulation has also arrived in the U.S. in the form of the California Consumer Privacy Act (CCPA). The CCPA is in some ways similar to the GDPR, but it differs in requiring consumers to opt out of data collection rather than placing the onus on service providers to obtain opt-in consent. It also names the state, rather than a company’s internal decision-makers, as the entity that develops applicable data law.
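
The practical difference between the two regimes can be reduced to a default, as this sketch shows. The function and flags are illustrative assumptions, not legal guidance.

    def may_process(regime: str, consented: bool, opted_out: bool) -> bool:
        if regime == "gdpr":
            return consented       # opt-in: the default is no processing
        if regime == "ccpa":
            return not opted_out   # opt-out: the default permits sale/processing
        raise ValueError("unknown regime")

    print(may_process("gdpr", consented=False, opted_out=False))  # False
    print(may_process("ccpa", consented=False, opted_out=False))  # True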

Data privacy regulations are changing the way businesses capture, store, share and analyze consumer data. Businesses that are so far untouched by data privacy regulations can expect to have a greater legal obligation to protect consumers’ data as more consumers demand privacy rights. Data collection by private companies, though, is unlikely to go away; it will merely change in form as businesses adapt to new laws and regulations.

Source: https://www.businessnewsdaily.com/10625-businesses-collecting-data.html


Consumer, Personal Data, Privacy, Trending
Why two-sided opt-in will become the new standard

During the past few decades, consumers have learned to understand the difference between opt-in, opt-out, and even double opt-in processes. Millions of people sign up for online platforms or services by providing their personally identifiable information (PII). Consumers offer their data in exchange for offers, subscriptions, and basic services.
As subscribers, consumers expect to receive some security in exchange for sharing an email address. The simplest form is a second opt-in, which gave rise to the “double opt-in” standard and provides explicit permission from the consumer. This happens, for example, when one downloads an app on a mobile device. Consumers often grant that permission reflexively, and many are unaware that this double opt-in is the de facto standard.

Going beyond double opt-in

Most consumers understand that banks and other companies own their data once they sign up or check a box agreeing to the terms of service. In turn, the companies that issue the credit and debit cards consumers carry in their wallets can use the spending data in the ways set out in the terms of service to which both parties agreed.
But today’s consumers are demanding more from these relationships. They want more control over their data. It’s not enough to earn 1% cash back on purchases or use points for travel. Instead, they want to pick the partners that access their data based on their best interests.

Companies like Klarna, the Swedish fintech (financial technology) company, capitalize on this trend by building bespoke relationships with retailers and creating special “buy now, pay later” offers for consumers. This strategy lets the consumer bypass building a new relationship with each retailer, because Klarna has that covered. Such transactions can happen because Klarna “opted in” to the retailer’s API; in turn, the consumer need only opt in with the credit card issuer, in this case Klarna, to allow data to be shared.
New middleware is coming that will enable merchants to provide receipt data related to consumer transactions. This data can be delivered directly to financial institutions (banks, credit card companies, etc.) without requiring merchants to engage directly with each financial entity for pre-approved use cases. The middleware receives the receipt data from the merchant through a proprietary API or a standard batch process and then offers it to the financial institution, which can execute the use case by issuing credit for card-linked offers or displaying the receipt within its banking app.
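
A minimal sketch of that flow appears below: a merchant submits an itemized receipt keyed to a tokenized card, and the middleware hands it to the issuing institution on request. All names and fields are hypothetical; no specific vendor’s API is implied.

    receipt = {
        "merchant_id": "store-123",
        "card_token": "tok_abc",  # tokenized card reference, not a card number
        "total": 42.17,
        "line_items": [
            {"sku": "SKU-001", "description": "Coffee beans", "price": 14.99},
            {"sku": "SKU-002", "description": "Grinder", "price": 27.18},
        ],
    }

    class ReceiptMiddleware:
        def __init__(self):
            self.inbox = {}  # card_token -> list of receipts

        def submit(self, r):
            # Merchant side: push a receipt via the API or batch process.
            self.inbox.setdefault(r["card_token"], []).append(r)

        def fetch_for_institution(self, card_token):
            # Bank side: pull receipts to display in the banking app.
            return self.inbox.get(card_token, [])

    mw = ReceiptMiddleware()
    mw.submit(receipt)
    print(mw.fetch_for_institution("tok_abc")[0]["line_items"][0]["sku"])  # SKU-001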

What is two-sided opt-in?

Two-sided opt-in requires both the merchant making a sale and the consumer requesting their data to opt in to the exchange. No data is shared unless both consumer and merchant opt in with the bank, credit card issuer, or consumer-facing app requesting the data. Two-sided opt-in also places control of the interaction squarely in the hands of those involved in the transaction, and it will provide enormous benefits to consumers and retailers.
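
The rule itself can be stated in a few lines of code, as in this sketch: data flows only when both sides’ consents are on record. The data structures are illustrative assumptions.

    # Illustrative consent records: which merchants share data, and which
    # (card, merchant) pairs the consumer has approved.
    merchant_opt_ins = {"store-123"}
    consumer_opt_ins = {("tok_abc", "store-123")}

    def may_share(card_token: str, merchant_id: str) -> bool:
        # Data flows only when BOTH sides of the transaction have opted in.
        return (merchant_id in merchant_opt_ins
                and (card_token, merchant_id) in consumer_opt_ins)

    print(may_share("tok_abc", "store-123"))  # True: both sides opted in
    print(may_share("tok_xyz", "store-123"))  # False: this consumer has not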

Consumers will no longer need paper receipts. They will be able to see each product in their credit card’s electronic item-detail history, rather than just the aggregated transactions and purchase totals that appear now. Consumers will also be notified directly of product recalls and can manage returns easily through an app instead of hunting for an itemized paper receipt.
But the benefits for retailers are even more robust. Retailers utilizing two-sided opt-in will create a new revenue stream, and their customers will be able to view product transactions down to the SKU level, an ability that benefits every retailer, large and small. Even small retailers who leverage this data can see profits in the thousands of dollars per month; larger retailers will build new revenue streams in the millions.

Through this technology, retailers will gain insights into consumer behavior, leading to new opportunities to market using far more granular trend analysis and deeper data. Savvy retailers will increase the effectiveness of their marketing spend by leveraging SKU-level data to deliver highly personalized, consumer-focused experiences.

The triangle of benefit: retailers, fintechs, and consumers

This triangle of benefit will soon become the standard, as retailers drive more revenue from data they had never been able to monetize while getting closer to their consumers. Banks and fintechs will gain more control over each transaction and be able to market against it. And consumers can eschew paper receipts and enjoy more transparency and control over their experience and finances.

It all starts with the new privacy standard of two-sided opt-in.

Source: https://www.digitalcommerce360.com/2021/08/31/why-two-sided-opt-in-will-become-the-new-standard/
