Consumer, Legislation, Personal Data, Privacy, Trending

General Medical Council and doctors’ unions hit out at government’s ‘cavalier approach’ to patient data.

Police forces will be able to “strong-arm” NHS bodies into handing over confidential patient data under planned laws that have sparked fury from doctors’ groups and the UK’s medical watchdog.

Ministers are planning new powers for police forces that would “set aside” the existing duty of confidentiality that applies to patient data held by the NHS and will instead require NHS organisations to hand over data police say they need to prevent serious violence.

Last week, England’s national data guardian, Dr Nicola Byrne, told The Independent she had serious concerns about the impact of the legislation going through parliament, and warned that the case for introducing the sweeping powers had not been made.

Now the UK’s medical watchdog, the General Medical Council (GMC), has also criticised the new law, proposals for which are contained in the Police, Crime and Sentencing Bill, warning it fails to protect patients’ sensitive information and could disproportionately hit some groups and worsen inequalities.

Human rights group Liberty said the proposed law is so broad that police forces will be able to “strong-arm information” from the NHS and other bodies, and that this could include information about people’s health, religious beliefs and political links. It added: “Altogether, these provisions are likely to give rise to significant and severe breaches of individuals’ data rights.”

Under the proposed legislation, which will be debated in the House of Lords in the coming weeks, local health boards will be legally required to provide confidential patient information when it is requested by police. The bill explicitly sets aside the existing common-law duty of confidentiality.

The purpose is to prevent serious violence, but there is already provision to allow patient information to be shared with police where there is a public interest need, such as the threat of violence or preventing a crime. The bill does not set out in detail what kinds of data could be handed over.

Under the proposed new law, police would have the power to demand information regardless of whether the NHS considered it to be in the public interest or not.

Professor Colin Melville, medical director at the GMC, told The Independent: “We are concerned the bill doesn’t protect patients’ sensitive health information and risks undermining the trust at the heart of doctor-patient relationships. We also share concerns held by others that an erosion of this trust may disproportionately affect certain communities and deepen societal inequalities.”

The GMC has raised its objections with the Home Office, which has said that the law will still require organisations to meet data protection rules before sharing any information.

But the doctors’ union, the British Medical Association (BMA), said in a briefing for members of the House of Lords that this would not provide adequate protection.

It said that health information had “long been afforded special legal status, over and beyond the Data Protection Act, in the form of the common-law duty of confidentiality”, which had been upheld by several court cases.

It added that the bill would “override the duty of medical confidentiality, including by legally requiring identifiable health information about individuals to be shared with the police”, saying: “We believe that setting aside of the duty of confidentiality, to require confidential information to be routinely given to the police when requested, will have a highly damaging impact on the relationship of trust between doctors and their patients. A removal of a long-established protection for confidential health information, alongside a broad interpretation of ‘serious crime’, may mean many patients are reluctant or fearful to consult or to share information with doctors.”

Dr Claudia Paoloni, president of the Hospital Consultants and Specialists Association, said the new law would “seriously undermine” the existing trust-based relationship with patients, and would create barriers to them seeking care: “We have significant concerns about what appears to be a cavalier approach to long-held principles, without clear objectives, and which is likely to have unintended consequences. Unless these concerns around individual patient confidentiality can be satisfactorily answered, we believe such powers should be removed from the bill.

“Other than the most serious crimes, which are already covered by precedent on disclosure on public interest grounds, it remains unclear precisely in what way laws to force the release of individuals’ medical records would be used.”

The latest data controversy comes after the NHS was forced to pause plans to share GP patient data with third parties to aid research, after an outcry from some doctors and patients over how the information would be used.

A Home Office spokesperson said: “Appropriate safeguards are in place, and any information shared must be proportionate. The bill makes clear that information can only be shared in accordance with data protection laws.”

Source: https://www.independent.co.uk/news/health/police-nhs-patient-data-bill-b1938998.html


Legislation, Personal Data, Privacy, Tech, Trending

The cyberspace pioneer is skeptical about a blockchain-based internet

Web inventor Tim Berners-Lee wants to rescue his creation from centralization. But does he align himself with Web3’s promise of salvation?

At TNW Conference, the computer scientist gave a one-word answer: “Nope.”

That snub may seem to clash with Berners-Lee’s recent actions. The 67-year-old now campaigns to save his “dysfunctional” brainchild from the clutches of Big Tech.

He’s also made a cool $5.4 million by selling an NFT — one of Web3’s supposed pillars. But the Brit has his own vision for the web’s successor: a decentralized architecture that gives users control of their data.

Berners-Lee wants to build it on a platform he calls Solid — but you can call it Web 3.0.

“We did talk about it as Web 3.0 at one point, because Web 2.0 was a term used for the dysfunction of what happens with user-generated content on the large platforms,” he said.

“People have called that Web 2.0, so if you want to call this Web 3.0, then okay.”

On the blockchain, it just doesn’t work

Berners-Lee shares Web3’s purported mission of transferring data from Big Tech to the people. But he’s taking a different route to the target. While Web3 is based on blockchain, Solid is built with standard web tools and open specifications. Private information is stored in decentralized data stores called “pods,” which can be hosted wherever the user wants. Users then choose which apps can access their data. This approach aims to provide interoperability, speed, scalability, and privacy.
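To make the pod model concrete, here is a minimal sketch of the idea in Python. It is purely illustrative: the real Solid stack exposes pods as HTTP resources with Solid-OIDC authentication and Web Access Control documents, and the `Pod` class and method names below are invented for this example.

```python
# Toy illustration of the Solid "pod" idea: user-controlled storage plus
# per-app access grants. This is NOT Solid's actual API -- the real stack
# uses HTTP resources, Solid-OIDC, and Web Access Control documents.

class Pod:
    """A user's personal data store, hosted wherever the user chooses."""

    def __init__(self, owner, host):
        self.owner = owner
        self.host = host      # e.g. a home server or any hosting provider
        self._data = {}       # resource path -> content
        self._grants = {}     # app id -> set of readable paths

    def write(self, path, content):
        self._data[path] = content

    def grant(self, app_id, path):
        """The owner decides which app may read which resource."""
        self._grants.setdefault(app_id, set()).add(path)

    def revoke(self, app_id, path):
        self._grants.get(app_id, set()).discard(path)

    def read(self, app_id, path):
        if path not in self._grants.get(app_id, set()):
            raise PermissionError(f"{app_id} has no access to {path}")
        return self._data[path]


pod = Pod(owner="alice", host="https://pods.example.org/alice")
pod.write("/health/allergies", "pollen")
pod.grant("calendar-app", "/health/allergies")
print(pod.read("calendar-app", "/health/allergies"))  # -> pollen
pod.revoke("calendar-app", "/health/allergies")       # further reads now fail
```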

“When you try to build that stuff on the blockchain, it just doesn’t work,” said Berners-Lee.

Source: https://thenextweb.com/news/web-inventor-tim-berners-lee-screw-web3-my-decentralized-internet-doesnt-need-blockchain


Consumer, Legislation, Privacy, Tech

The world-leading data law changed how companies work. But four years on, there’s a lag on cleaning up Big Tech.

One thousand four hundred and fifty-nine days have passed since data rights nonprofit NOYB fired off its first complaints under Europe’s flagship data regulation, GDPR. The complaints allege Google, WhatsApp, Facebook, and Instagram forced people into giving up their data without obtaining proper consent, says Romain Robert, a program director at the nonprofit. The complaints landed on May 25, 2018, the day GDPR came into force and bolstered the privacy rights of 740 million Europeans. Four years later, NOYB is still waiting for final decisions to be made. And it’s not the only one.

Since the General Data Protection Regulation went into effect, data regulators tasked with enforcing the law have struggled to act quickly on complaints against Big Tech firms and the murky online advertising industry, with scores of cases still outstanding. While GDPR has immeasurably improved the privacy rights of millions inside and outside of Europe, it hasn’t stamped out the worst problems: Data brokers are still stockpiling your information and selling it, and the online advertising industry remains littered with potential abuses.

Now, civil society groups have grown frustrated with GDPR’s limitations, while some countries’ regulators complain the system to handle international complaints is bloated and slows down enforcement. By comparison, the information economy moves at breakneck speed. “To say that GDPR is well enforced, I think it’s a mistake. It’s not enforced as quickly as we thought,” Robert says. NOYB has just settled a legal case against the delays in its consent complaints. “There’s still what we call an enforcement gap and problems with cross-border enforcement and enforcement against the big players,” adds David Martin Ruiz, a senior legal officer at the European Consumer Organization, which filed a complaint about Google’s location tracking four years ago.

Lawmakers in Brussels first proposed reforming Europe’s data rules back in January 2012 and passed the final law in 2016, giving companies and organizations two years to fall in line. GDPR builds upon previous data regulations, super-charging your rights and altering how businesses must handle your personal data, information like your name or IP address. GDPR doesn’t ban the use of data in certain cases, such as police use of intrusive facial recognition; instead, seven principles sit at its heart and guide how your data can be handled, stored, and used. These principles apply equally to charities and governments, pharmaceutical companies and Big Tech firms.

Crucially, GDPR weaponized these principles and handed each European country’s data regulator the power to issue fines of up to 4 percent of a firm’s global turnover and order companies to stop practices that violate GDPR’s principles. (Ordering a company to stop processing people’s data is arguably more impactful than issuing fines.) It was never likely that GDPR fines and enforcement were going to flow quickly from regulators—in competition law, for instance, cases can take decades—but four years after GDPR started, the total number of major decisions against the world’s most powerful data companies remains agonizingly low.

Under the dense series of rules that make up GDPR, complaints against a company that operates in multiple EU countries are usually funneled to the country where its main European headquarters are based. This so-called one-stop-shop process dictates that the country leads the investigation. The tiny nation of Luxembourg handles complaints against Amazon; the Netherlands deals with Netflix; Sweden has Spotify; and Ireland is responsible for Meta’s Facebook, WhatsApp, and Instagram, plus all of Google’s services, Airbnb, Yahoo, Twitter, Microsoft, Apple, and LinkedIn.

A glut of early and complex GDPR complaints has led to backlogs at regulators, including the Irish body, and international cooperation has been slowed down by paperwork. Since May 2018, the Irish regulator has completed 65 percent of cases involving cross-border decisions—400 are outstanding, according to the regulator’s own stats. Other cases, launched by NOYB against Netflix (Netherlands), Spotify (Sweden), and PimEyes (Poland), have all also dragged on for years.

Europe’s data regulators claim GDPR enforcement is still maturing and that it is working well and improving over time. (Officials from France, Ireland, Germany, Norway, Luxembourg, Italy, the UK, and Europe’s two independent bodies, the EDPS and EDPB, were all interviewed for this article.) The number of fines has ramped up as the legislation has aged, hitting a running total of €1.6 billion (around $1.7 billion). The biggest? Luxembourg fined Amazon €746 million, and Ireland fined WhatsApp €225 million last year. (Both companies are appealing the decisions). At the same time, one lesser-known Belgian fine could change how the entire ad tech industry works. However, officials concede that changes to the way GDPR is enforced could speed up the process and ensure swifter action.

Helen Dixon is at the heart of Europe’s GDPR enforcement, with the Irish Data Protection Commission (DPC) responsible for an outsized number of Big Tech firms. The DPC has faced criticism for struggling to keep up with the number of complaints under its purview, drawing ire from fellow regulators and calls to reform the body. “If everything comes at you at the same time, clearly there’s going to be a lag in terms of prioritizing and dealing sequentially with the issues while standing up what is a very significant legal framework,” Dixon says, defending her office’s performance. Dixon says the DPC has had to handle GDPR’s complexity from scratch, leading to many cases and new processes, and there aren’t simple answers for many of them.

“I would classify the DPC as being very effective in the first four years of application of the GDPR,” Dixon says. “The fact that DPC has stood up a new legal framework that many described as ‘the law of everything’ in a couple of short years, and implemented what are very significant sanctions in the form of fines and corrective measures already in that time period” shows its success, Dixon says. The organization has enforced measures against Twitter, WhatsApp, Facebook, and Groupon, among thousands of national cases, during this time.

“There should be an independent review of how to reform and strengthen the DPC,” says Johnny Ryan, a senior fellow at the Irish Council for Civil Liberties. “We cannot know from outside what the problems are.” Ryan adds that blame can’t just be leveled at the Irish regulator. “The European Commission has immense power. The GDPR is supposed to be an immense project. And the Commission has neglected the GDPR,” he says. “It doesn’t just propose the laws, it also has to see that they are applied.”

So far, the European Commission has backed enforcement of GDPR in Ireland and across the continent. “The Commission has consistently called on data protection authorities to continue stepping up their enforcement efforts,” Didier Reynders, the European Commissioner for Justice, says in a statement. “We have launched six infringement procedures under the GDPR.” These legal cases include action against Slovenia for failing to import GDPR into its national law and questioning the independence of the Belgian data authority.

However, following a complaint from Ryan in February, the EU Ombudsman, a watchdog for European institutions, opened an inquiry into how the Commission has been monitoring data protection in Ireland. (The Ombudsman says the Commission has until May 25 to reply, after asking for its initial deadline to be extended. Reynders says the Commission does not comment on ongoing inquiries). If the Commission does look into Ireland, it could make recommendations, says Estelle Massé, the global data protection lead at Access Now, a technology-focused civil rights organization. “There is an issue, and if you don’t intervene in this way, I don’t really see how the situation will resolve,” Massé says. “It has to go through an infringement procedure.”

Despite clear enforcement problems, GDPR has had an incalculable effect on data practices broadly. EU countries have made decisions in thousands of local cases and issued guidance to organizations to say how they should use people’s data. Spain’s LaLiga soccer league was fined after its app spied on users, retailer H&M was fined in Germany after it saved details about employees’ personal lives, the Netherlands’ tax body was fined over its use of a ‘blacklist,’ and these are just a handful of the successful cases.

Some of GDPR’s impact is also hidden—the law isn’t just about fines and ordering companies to change—and it has improved company behaviors. “If you compare the awareness about cybersecurity, about data protection, about privacy, as it looked like 10 years ago and it looks today, these are completely different worlds,” says Wojciech Wiewiórowski, the European Data Protection Supervisor, who oversees GDPR cases against European institutions, such as Europol.

Companies have been put off using people’s data in dubious ways, experts say, when they wouldn’t have thought twice about it pre-GDPR. One recent study estimated that the number of Android apps on Google’s Play store has dropped by a third since the introduction of GDPR, citing better privacy protections. “More and more businesses have allocated significant budgets to doing data protection compliance,” says Hazel Grant, head of the privacy, security, and information group at London-headquartered law firm Fieldfisher. Grant says that when GDPR decisions are made—such as Austria’s decision to make the use of Google Analytics unlawful—companies are concerned about what it means for them. “Four or five years ago, that enforcement wouldn’t have happened,” Grant says. “And if it had happened, maybe a few data protection lawyers would have known about it—it wouldn’t have been out there with clients coming to us saying we need advice on this.”

But at Big Tech levels where data is plentiful, the scale of complying with GDPR is different. One recent internal Facebook document obtained by Motherboard hints that the company doesn’t really know what it does with your data—an assertion Facebook denied at the time. Equally, a WIRED and Reveal joint investigation at the end of 2021 found serious shortcomings in the ways Amazon handles customer data. (Amazon said it had an “exceptional” track record in protecting data.)

Microsoft declined a request to comment. Neither Google nor Facebook provided comment in time for publication.

“There is a lag, especially on Big Tech, enforcing the law on Big Tech—and Big Tech means cross-border cases, and that means the one-stop-shop and the cooperation among the data protection authorities,” says Ulrich Kelber, the head of the German federal data protection regulator. The one-stop-shop allows all of Europe’s regulators to have a say on the final decision of the lead regulator in that case, which can then be challenged. Ireland’s fine against WhatsApp grew from the original proposed penalty of as little as €30 million to €225 million after other regulators weighed in. Another Irish case against Instagram is currently being discussed, Dixon says, which will add months to its final outcome.

The one-stop-shop was created under GDPR, meaning the process has had teething problems from the start, but four years in, a lot still needs to be improved. Tobias Judin, the head of international at Norway’s data protection authority, says that each week several drafts of decisions are circulated among Europe’s data regulators. “In the vast majority of those cases, we actually agree,” Judin says. (German authorities object the most.) Decisions can face a lot of back and forth between regulators, wrapped up in bureaucracy. “We do question whether, in those cases that have a European-wide impact, it makes sense and whether it is feasible that these cases are solely dealt with by one data protection authority until we reach the decision stage,” Judin says.

Luxembourg’s data regulator hit Amazon with a record-breaking €746 million fine last year, its first case against the retailer. Amazon is contesting the fine in court—in a statement to WIRED, the company repeated its assertion that “there has been no data breach, and no customer data has been exposed to any third party”—but Luxembourg’s regulator says investigations will always be lengthy despite it bringing in new ways to investigate companies. “I think under one year or one-half year, I think it’s almost impossible to have it closed before such a delay,” says Alain Herrmann, one of Luxembourg’s four data protection commissioners. “There are huge [amounts of] information to deal with.” Herrmann says Luxembourg has a few other international cases ongoing, but national secrecy laws prevent it from talking about them. “It’s just the [one-stop-shop] system, the lack of resources, the lack of clear law and procedure, which makes their job even more difficult,” Robert says.

The French data regulator has, in some ways, sidestepped the international GDPR process by directly pursuing companies’ use of cookies. Despite common beliefs, annoying cookie pop-ups don’t come from GDPR—they’re governed by the EU’s separate E-Privacy law, and the French regulator has taken advantage of this. Marie-Laure Denis, the head of French regulator CNIL, has hit Google, Amazon, and Facebook with hefty fines for bad cookie practices. Perhaps more importantly, it has forced companies to change their behavior. Google is altering its cookie banners across the whole of Europe following the French enforcement.

“We are starting to see really concrete changes to the digital ecosystems and evolution of practices, which is really what we are looking [for],” Denis says. She explains that CNIL will next look at data collection by mobile apps under the E-Privacy law, and cloud data transfers under GDPR. The cookie enforcement effort wasn’t to avoid GDPR’s protracted process, but it was more efficient, Denis says. “We still believe in the GDPR enforcement mechanism, but we need to make it work better—and quicker.”

In the last year, there have been growing calls to change how GDPR works. “Enforcement should be more centralized for big affairs,” Viviane Reding, the politician who proposed GDPR back in 2012, said of the data law in May last year. The calls have come as Europe passed its next two big pieces of digital regulation: the Digital Services Act and the Digital Markets Act. The laws, which focus on competition and internet safety, handle enforcement differently from GDPR; in some instances, the European Commission will investigate Big Tech companies. The move is a nod to the fact that GDPR enforcement may not have been as smooth as politicians would have liked.

There appears to be little appetite to reopen GDPR itself; however, smaller tweaks could help improve enforcement. At a recent meeting of data regulators held by the European Data Protection Board, a body that exists to guide regulators, countries agreed that some international cases will work to fixed deadlines and timelines and said they would try to “join forces” on some investigations. Norway’s Judin says the move is positive but questions how effective it will be in practice.

Massé, from Access Now, says a small amendment to GDPR could significantly address some of the biggest current enforcement problems. Legislation could ensure data protection authorities handle complaints in the same way (including using the same forms), explicitly lay out how the one-stop-shop should work, and make sure that procedures in individual countries are the same, Massé says. In short, it could clarify how GDPR enforcement should be handled by every country.

Source: https://www.wired.com/story/gdpr-2022/


Consumer, Legislation, Privacy, Tech, Trending
The Digital Services Act will reshape the online world
The EU has agreed on another ambitious piece of legislation to police the online world.

Early Saturday morning, after hours of negotiations, the bloc agreed on the broad terms of the Digital Services Act, or DSA, which will force tech companies to take greater responsibility for content that appears on their platforms. New obligations include removing illegal content and goods more quickly, explaining to users and researchers how their algorithms work, and taking stricter action on the spread of misinformation. Companies face fines of up to 6 percent of their annual turnover for noncompliance.

“The DSA will upgrade the ground-rules for all online services in the EU,” said European Commission President Ursula von der Leyen in a statement. “It gives practical effect to the principle that what is illegal offline, should be illegal online. The greater the size, the greater the responsibilities of online platforms.”

“What is illegal offline, should be illegal online”

Margrethe Vestager, the European Commissioner for Competition who has spearheaded much of the bloc’s tech regulation, said the act would “ensure that platforms are held accountable for the risks their services can pose to society and citizens.”

The DSA shouldn’t be confused with the DMA or Digital Markets Act, which was agreed upon in March. Both acts affect the tech world, but the DMA focuses on creating a level playing field between businesses while the DSA deals with how companies police content on their platforms. The DSA will therefore likely have a more immediate impact on internet users.

Although the legislation only applies to EU citizens, the effect of these laws will certainly be felt in other parts of the world, too. Global tech companies may decide it is more cost-effective to implement a single strategy to police content and take the EU’s comparatively stringent regulations as their benchmark. Lawmakers in the US keen to rein in Big Tech with their own regulations have already begun looking to the EU’s rules for inspiration.

The final text of the DSA has yet to be released, but the European Parliament and European Commission have detailed a number of obligations it will contain:

  1. Targeted advertising based on an individual’s religion, sexual orientation, or ethnicity is banned. Minors cannot be subject to targeted advertising either.
  2. “Dark patterns” — confusing or deceptive user interfaces designed to steer users into making certain choices — will be prohibited. The EU says that, as a rule, canceling subscriptions should be as easy as signing up for them.
  3. Large online platforms like Facebook will have to make the working of their recommender algorithms (used for sorting content on the News Feed or suggesting TV shows on Netflix) transparent to users. Users should also be offered a recommender system “not based on profiling.” In the case of Instagram, for example, this would mean a chronological feed (as Instagram recently introduced); see the toy sketch after this list.
  4. Hosting services and online platforms will have to explain clearly why they have removed illegal content as well as give users the ability to appeal such takedowns. The DSA itself does not define what content is illegal, though, and leaves this up to individual countries.
  5. The largest online platforms will have to provide key data to researchers to “provide more insight into how online risks evolve.”
  6. Online marketplaces must keep basic information about traders on their platform to track down individuals selling illegal goods or services.
  7. Large platforms will also have to introduce new strategies for dealing with misinformation during crises (a provision inspired by the recent invasion of Ukraine).
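On point 3, the difference between a profiled feed and the “not based on profiling” alternative is easy to see in miniature. The following toy Python sketch uses invented posts and interest weights; real recommender systems are, of course, far more complex.

```python
# Toy contrast between a profiled recommender and a non-profiled,
# reverse-chronological feed. All posts and weights are invented.

from datetime import datetime, timedelta

now = datetime(2022, 4, 23)
posts = [
    {"id": 1, "topic": "sports", "posted": now - timedelta(hours=9)},
    {"id": 2, "topic": "music",  "posted": now - timedelta(hours=1)},
    {"id": 3, "topic": "news",   "posted": now - timedelta(hours=4)},
]

# Profiled feed: ranked by inferred per-user interest (a stand-in for a
# learned engagement model built from the user's behavior).
user_interests = {"sports": 0.9, "news": 0.4, "music": 0.1}
profiled = sorted(posts, key=lambda p: user_interests[p["topic"]], reverse=True)

# Non-profiled alternative: no user data at all, newest first
# (the chronological feed mentioned in point 3).
chronological = sorted(posts, key=lambda p: p["posted"], reverse=True)

print([p["id"] for p in profiled])       # [1, 3, 2]
print([p["id"] for p in chronological])  # [2, 3, 1]
```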

The DSA will, like the DMA, distinguish between tech companies of different sizes, placing greater obligations on bigger companies. The largest firms — those with at least 45 million users in the EU, like Meta and Google — will face the most scrutiny. These tech companies have lobbied hard to water down the requirements in the DSA, particularly those concerning targeted advertising and handing over data to outside researchers.

Although the broad terms of the DSA have now been agreed upon by the member states of the EU, the legal language still needs to be finalized and the act officially voted into law. This last step is seen as a formality at this point, though. The rules will apply to all companies 15 months after the act is voted into law, or from January 1st, 2024, whichever is later.


Source: https://www.theverge.com/2022/4/23/23036976/eu-digital-services-act-finalized-algorithms-targeted-advertising

Consumer, Legislation, Privacy, Tech, Trending

French regulators have hit Google and Facebook with 210 million euros ($237 million) in fines over their use of “cookies”, the data used to track users online, authorities said Thursday.

US tech giants, including the likes of Apple and Amazon, have come under growing pressure over their [business] practices across Europe, where they have faced massive fines and plans to impose far-reaching EU rules on how they operate.

The 150-million-euro fine imposed on Google was a record by France’s National Commission for Information Technology and Freedom (CNIL), beating a previous cookie-related fine of 100 million euros against the company in December 2020.

Facebook was handed a 60-million-euro fine.

“CNIL has determined that the sites facebook.com, google.fr and (Google-owned) youtube.com do not allow users to refuse the use of cookies as simply as to accept them,” the regulatory body said.

The two platforms have three months to adapt their practices, after which France will impose fines of 100,000 euros per day, CNIL added.

Google told AFP it would change its practices following the ruling. “In accordance with the expectations of internet users… we are committed to implementing new changes, as well as to working actively with CNIL in response to its decision,” the US firm said in a statement.

Cookies are small packets of data stored on a user’s computer when they visit a website, allowing web browsers to save information about their session.

They are highly valuable for Google and Facebook as ways to personalise advertising — their primary source of revenue.

But privacy advocates have long pushed back. Since the European Union’s 2018 law on personal data came into force, internet companies have faced stricter rules obliging them to seek the direct consent of users before installing cookies on their computers.
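Mechanically, a cookie is just an HTTP header that a server asks the browser to store and send back on subsequent requests. A minimal sketch using Python’s standard library (the cookie name and attribute values are invented for illustration; Python 3.8+ for the SameSite attribute):

```python
# A cookie is delivered via an HTTP "Set-Cookie" response header and
# replayed by the browser on later requests. The name and attributes
# below are illustrative only.

from http.cookies import SimpleCookie

cookie = SimpleCookie()
cookie["session_id"] = "abc123"
cookie["session_id"]["max-age"] = 3600    # expire after one hour
cookie["session_id"]["secure"] = True     # only send over HTTPS
cookie["session_id"]["samesite"] = "Lax"  # limit cross-site sending

# The header the server would emit in its response:
print(cookie["session_id"].output())
# Set-Cookie: session_id=abc123; Max-Age=3600; SameSite=Lax; Secure
```

Under the EU rules described above, non-essential cookies of this kind may only be set after the user has consented.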

90 notices issued

CNIL argued that Google, Facebook and YouTube make it very easy to consent to cookies via a single button, whereas rejecting the request requires several clicks.

It had given internet companies until April 2021 to adapt to the tighter privacy rules, warning that they would start facing sanctions after that date.

French newspaper Le Figaro was the first to be sanctioned, receiving a fine of 50,000 euros in July for allowing cookies to be installed by advertising partners without the direct approval of users, or even after they had rejected them.

CNIL said recently that it had sent 90 formal notices to websites since April.

In 2020, it inflicted fines of 100 million and 35 million euros respectively on Google and Amazon for their use of cookies.

The fines were based on an earlier EU law, the General Data Protection Regulation, with CNIL arguing that the companies had failed to give “sufficiently clear” information to users about cookies.

Source: https://www.france24.com/en/technology/20220106-france-fines-google-facebook-more-than-%E2%82%AC200-million-for-cookie-breaches


Legislation, Personal Data, Privacy, Tech

The European Parliament has backed new rules for data sharing, in a move the EU hopes will help harness the power of data and artificial intelligence (AI) to boost innovation across the continent.

The new legislation aims to increase the availability of data for businesses and individuals while setting a raft of new standards for data sharing across the bloc. It will apply to manufacturers, companies and users.

It was first proposed in late 2020 to offer “an alternative European model” to the data handling practices of major US tech platforms. The idea is to create common European data spaces in a variety of fields including health, energy, agriculture, mobility and finance.

The legislation, known as the Data Governance Act (DGA), was approved by EU lawmakers on Wednesday with 501 votes to 12, and 40 abstentions.

It now needs to be formally adopted by the Council before entering into force.

“Our goal with the DGA is to set the foundation for a data economy in which people and businesses can trust. Data sharing can only flourish if trust and fairness are guaranteed, stimulating new business models and social innovation,” said Angelika Niebler, the German MEP who steered the legislation through parliament.

“Experience has shown that trust – be it trust in privacy or in the confidentiality of valuable business data – is a paramount issue. The Parliament insisted on a clear scope, making sure that the credo of trust is inscribed in the future of Europe’s data economy”.

Data transfer disputes

The legislation is part of a broader EU strategy to break Big Tech’s hold over the data sphere.

The EU is also working on a Data Act that specifically looks at who can create value from data and aims to “place safeguards against unlawful data transfer,” which could affect US or other foreign tech companies.

The European Commission published its draft of that act in late February, with commission Vice President Margrethe Vestager saying at the time: “We want to give consumers and companies even more control over what can be done with their data, clarifying who can access data and on what terms.”

Data disputes between the EU and Big Tech have been growing in recent years.

Meta recently warned it could pull Facebook and Instagram out of Europe if it’s unable to transfer, store and process Europeans’ data on US-based servers.

Harnessing the power of AI

According to estimates by the European Commission, the amount of data generated by public bodies, businesses and citizens will be multiplied by five between 2018 and 2025.

“We are at the beginning of the age of AI and Europe will require more and more data. This legislation should make it easy and safe to tap into the rich data silos spread all over the EU,” Niebler said.

“The data revolution will not wait for Europe. We need to act now if European digital companies want to have a place among the world’s top digital innovators”.

MEPs said they had negotiated with EU ministers to ensure there were no loopholes that would allow companies outside of the EU to abuse the scheme.

They are pushing to make the most of data for “objectives of general interest” such as scientific research, health, climate change and mobility.

Source: https://www.euronews.com/next/2022/04/07/meps-pass-new-eu-data-sharing-rules-in-push-to-break-big-tech-dominance-and-boost-ai


Consumer, Corporate, Legislation, Personal Data, Privacy, Trending

In September 2021, the Kingdom of Saudi Arabia issued its Personal Data Protection Law (PDPL) to regulate the processing of personal data. The PDPL is the first national, sector-agnostic data privacy legislation in Saudi Arabia. Organizations will be faced with significant changes to their operations to ensure compliance.

The PDPL comes into effect only 180 days after its publication in the Official Gazette, meaning the law will be effective March 23, 2022, subject to the passage of the implementing regulations. For the first two years, it will be enforced by the Saudi Data and Artificial Intelligence Authority (SDAIA), after which a transition to the National Data Management Office will be considered.

Like other new data protection laws and updates within the broader Middle East and North Africa region, some elements within the PDPL are similar to those of other international data protection regulations. The law also includes numerous unique requirements — such as data transfer and localization requirements — that businesses will need to pay careful attention to. Fulfilling these requirements may be operationally burdensome, and early planning will be critical to avoid significant setbacks.

The PDPL also has extraterritorial effect, so organizations based outside Saudi Arabia will still be subject to the law and its requirements if they process the personal data of Saudi residents.

What does the law introduce?
The PDPL introduces a number of requirements that could significantly impact how companies in the Kingdom operate. The most notable include:

Registration requirements. 
Data controllers, the organizations that determine the means and purpose of processing of personal data, must register via an electronic portal and pay an annual registration fee.

Records of processing. 
Data controllers must create and maintain a record of how they process personal data, and it must be registered with the SDAIA. Any foreign company operating in the Kingdom and processing personal data of Saudi residents must appoint a local representative. More guidance regarding when this requirement will become effective is forthcoming from the SDAIA. Organizations will also be expected to appoint data officers to manage compliance with the law.

Data subject rights.
Individuals are now provided with new rights to their data, namely that they have the right to information about how their data is processed, the ability to access copies of their data and request corrections, and the right to have their data destroyed. Individuals will also have the right to lodge complaints with the regulatory authority.

Data transfers.
Data transfers outside the Kingdom are only permitted in limited circumstances. However, even if the transfer meets one of the permitted exceptions, the data controller must receive approval from an appropriate government authority, amongst other conditions.

Consent.
The principal legal basis for processing under the law is consent. Personal data may only be processed without consent in certain circumstances. Individuals will also have the right to withdraw their consent to the processing of their personal data. Importantly, data controllers must also have prior consent of individuals to send direct marketing and must provide an opt-out mechanism.

Impact assessments.
Data controllers must assess projects, products and services to identify data protection risks posed to individuals.

Privacy notice.
Data controllers must implement a privacy notice specifying how data will be processed prior to collecting personal data from individuals.

Breach notification.
Data controllers will be expected to report data breaches to the regulatory authority as soon as they become aware of an incident.

Sensitive data.
Information such as genetic, health, credit and financial data will fall within the scope of the law. This data is also likely to be subject to additional regulation.

So how do we prepare?
Like most compliance efforts, early preparation is essential, especially to achieve compliance with some of the more onerous requirements detailed in the PDPL. As a priority, organizations should follow this six-point plan:

Step 1: Understand the data. 
Organizations must understand what data they hold, how it is used and who it is shared with. This can be accomplished by creating a record of processing activities (ROPA) to trace data through the information lifecycle. This document can be used as a single source of truth and to inform other compliance activities.
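As a rough illustration, a ROPA can start life as a simple structured register. The Python sketch below shows one plausible layout; the fields mirror what such records typically capture, but this exact schema is an assumption for illustration, not a format mandated by the PDPL.

```python
# A minimal record-of-processing-activities (ROPA) register. The schema
# is illustrative, not a PDPL-mandated format. Requires Python 3.9+.

from dataclasses import dataclass


@dataclass
class ProcessingActivity:
    name: str
    purpose: str
    legal_basis: str            # e.g. "consent", the PDPL's principal basis
    data_categories: list[str]  # what personal data is involved
    recipients: list[str]       # who the data is shared with
    retention: str
    transfers_outside_ksa: bool = False  # feeds the transfer review in Step 5


register = [
    ProcessingActivity(
        name="customer-support-tickets",
        purpose="Resolve customer queries",
        legal_basis="consent",
        data_categories=["name", "email", "ticket history"],
        recipients=["internal support team", "helpdesk vendor"],
        retention="24 months after ticket closure",
        transfers_outside_ksa=True,  # the vendor hosts data abroad
    ),
]

# Used as a single source of truth: e.g. list every activity that needs
# a cross-border transfer review.
print([a.name for a in register if a.transfers_outside_ksa])
```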

Step 2: Establish governance. 
Identifying local representatives where appropriate and appointing data officers will be an essential step. These individuals should be integrated into existing data protection or security networks of governance to enable the successful communication and escalation of risks.

Step 3: Create policies and procedures.
Policies and processes must be updated to reflect the new data protection responsibilities, including procedural guidance for responding to data subject rights requests and issuing data breach notifications. Policy refreshes must also address the assessment of data protection and security standards in place among third parties.

Step 4: Implement and test breach plans.
Organizations need a robust data breach plan that articulates each step involved in responding to a breach, the individuals and teams involved, and the timelines to complete each step. Testing your plan will help to ensure your teams are cohesive and ready should an actual incident occur.

Step 5: Identify international data transfers.
Using the ROPAs as a starting point, organizations should seek to understand what data is transferred internationally and where it is transferred to. This includes understanding how limitations in the law may affect these transfers and beginning to adopt strategies for compliance.

Step 6: Provide training and change management.
Training is an effective tool to develop a sustainable culture of compliance. To complement training activities, organizations should consider identifying change management strategies to help ensure that the compliance activities are embedded successfully.

Source: https://iapp.org/news/a/how-to-prepare-for-saudi-arabias-personal-data-protection-law/?mkt_tok=MTM4LUVaTS0wNDIAAAGDqUdxDYhqkPyxHyko4ed2GyuwzheNwgSQ4hjNmCZTuv7-CU3tAAeMAcWRZ2fJ_tz3KavvmN2VgSlfxV0ldu0m9vyZRLP9mlWHgKIaDzpqn31-


Consumer, Legislation, Personal Data, Privacy, Trending

Many businesses collect data for manifold purposes. Here’s how to know what they’re doing with your personal data and whether it is secure.


As technologies that capture and analyze data proliferate, so, too, do businesses’ abilities to contextualize data and draw new insights from it. Artificial intelligence is a critical tool for data capture, analysis, and collection of information that many businesses are using for a range of purposes, including better understanding day-to-day operations, making more informed business decisions and learning about their customers.

Customer data is a focus area all its own. From consumer behavior to predictive analytics, companies regularly capture, store, and analyze large amounts of quantitative and qualitative data on their consumer base every day. Some companies have built an entire business model around consumer data, whether they’re companies selling personal information to a third party or creating targeted ads. Customer data is big business.

Here’s a look at some of the ways companies capture consumer data, what exactly they do with that information, and how you can use the same techniques for your own business purposes.

Types of consumer data businesses collect

The consumer data that businesses collect can be broken down into four categories: 

Personal data. This category includes personally identifiable information such as Social Security numbers and gender as well as nonpersonally identifiable information, including your IP address, web browser cookies, and device IDs (which both your laptop and mobile device have).

Engagement data. This type of data details how consumers interact with a business’s website, mobile apps, text messages, social media pages, emails, paid ads and customer service routes.

Behavioral data. This category includes transactional details such as purchase histories, product usage information (e.g., repeated actions), and qualitative data (e.g., mouse movement information).

Attitudinal data. This data type encompasses metrics on consumer satisfaction, purchase criteria, product desirability and more. 

How do businesses collect your data?

Companies capture data in many ways from many sources. Some collection methods are highly technical in nature, while others are more deductive (although these processes often employ sophisticated software).

The bottom line, though, is that companies are using a cornucopia of collection methods and sources to capture and process customer data, with interest in types of data ranging from demographic data to behavioral data, said Liam Hanham, data science manager at Workday.

“Customer data can be collected in three ways: by directly asking customers, by indirectly tracking customers, and by appending other sources of customer data to your own,” said Hanham. “A robust business strategy needs all three.”

Businesses are adept at pulling in all types of data from nearly every nook and cranny. The most obvious sources are consumer activity on a company’s website and social media pages, along with customer phone calls and live chats, but there are some more interesting methods at work as well.

One example is location-based advertising, which utilizes tracking technologies such as an internet-connected device’s IP address (and the other devices it interacts with – your laptop may interact with your mobile device and vice versa) to build a personalized data profile. This information is then used to target users’ devices with hyper-personalized, relevant advertising.
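In crude terms, that profile-building amounts to linking identifiers that are observed together. The toy Python sketch below groups devices that appear on the same IP address; all identifiers are invented, and real ad-tech device graphs draw on far richer signals than this.

```python
# Crude illustration of device linking: identifiers seen on the same
# network (here, the same IP address) are grouped into one profile.
# All identifiers and observations below are invented.

from collections import defaultdict

# (device_id, ip_address) pairs, e.g. gleaned from ad requests
observations = [
    ("laptop-77", "203.0.113.5"),
    ("phone-42", "203.0.113.5"),   # same home IP as the laptop
    ("phone-42", "198.51.100.9"),  # the same phone later, on mobile data
    ("tablet-13", "192.0.2.44"),
]

# Group device IDs by the IPs they appear on ...
devices_by_ip = defaultdict(set)
for device, ip in observations:
    devices_by_ip[ip].add(device)

# ... so devices sharing an IP are inferred to belong to one household
# profile, which can then be targeted with the same personalized ads.
profiles = [d for d in devices_by_ip.values() if len(d) > 1]
print(profiles)  # e.g. [{'laptop-77', 'phone-42'}]
```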

Companies also dig deep into their customer service records to see how customers have interacted with their sales and support departments in the past. Here, they are incorporating direct feedback about what worked and what didn’t, what a customer liked and disliked, on a grand scale.

Besides collecting information for business purposes, companies that sell personal information and other data to third parties have become commonplace. Once captured, this information regularly changes hands in a data marketplace of its own.

Turning data into knowledge

Capturing large amounts of data creates the problem of how to sort through and analyze all that data. No human can reasonably sit down and read through line after line of customer data all day long, and even if they could, they probably wouldn’t make much of a dent. Computers, however, sift through this data more quickly and efficiently than humans, and they can operate 24/7/365 without taking a break.

As machine learning algorithms and other forms of AI proliferate and improve, data analytics becomes an even more powerful field for breaking down the sea of data into manageable tidbits of actionable insights. Some AI programs will flag anomalies or offer recommendations to decision-makers within an organization based on the contextualized data. Without programs like these, all the data captured in the world would be utterly useless.
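As one concrete (and simplified) example of “flagging anomalies”: an off-the-shelf model such as scikit-learn’s IsolationForest can mark records that look unlike the rest. The transaction amounts below are synthetic, and the technique shown is just one of many in use.

```python
# Flagging anomalies in customer data with an isolation forest.
# Synthetic data; assumes scikit-learn and NumPy are installed
# (pip install scikit-learn numpy).

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# 200 ordinary purchase amounts around $40, plus two extreme outliers
amounts = np.concatenate([rng.normal(40, 10, 200), [900.0, 1200.0]])
X = amounts.reshape(-1, 1)

model = IsolationForest(contamination=0.01, random_state=0).fit(X)
labels = model.predict(X)     # 1 = normal, -1 = anomaly

print(amounts[labels == -1])  # the flagged outliers, e.g. [ 900. 1200.]
```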


How do businesses use your data?

There are several ways companies use the consumer data they collect and the insights they draw from that data.

To improve the customer experience.

For many companies, consumer data offers a way to better understand and meet their customers’ demands. By analyzing customer behavior, as well as vast troves of reviews and feedback, companies can nimbly modify their digital presence, goods, or services to better suit the current marketplace.

Not only do companies use consumer data to improve consumer experiences as a whole, but they use data to make decisions on an individualized level, said Brandon Chopp, digital manager for iHeartRaves.

“Our most important source of marketing intelligence comes from understanding customer data and using it to improve our website functionality,” Chopp said. “Our team has improved the customer experience by creating customized promotions and special offers based on customer data. Since each customer is going to have their own individual preferences, personalization is key.”

To refine a company’s marketing strategy.

Contextualized data can help companies understand how consumers are engaging with and responding to their marketing campaigns, and adjust accordingly. This highly predictive use case gives businesses an idea of what consumers want based on what they have already done. Like other aspects of consumer data analysis, marketing is becoming more about personalization, said Brett Downes, SEO manager at Ghost Marketing.

“Mapping users’ journeys and personalizing their journey, not just through your website but further onto platforms like YouTube, LinkedIn, Facebook, or on to any other website, is now essential,” Downes said. “Segmenting data effectively allows you to market to only the people you know are most likely to engage. These have opened up new opportunities in industries previously very hard to market to.”

To transform the data into cash flow.

Companies that capture data stand to profit from it. Data brokers, or data service providers that buy and sell information on customers, have risen as a new industry alongside big data. For businesses that capture large amounts of data, collecting information and then selling it represents an opportunity for a new revenue stream.

For advertisers, having this information available for purchase is immensely valuable, so the demand for more and more data is ever increasing. That means the more disparate data sources data brokers can pull from to package more thorough data profiles, the more money they can make by selling this information to one another and advertisers.

To secure more data.

Some businesses even use consumer data as a means of securing more sensitive information. For example, banking institutions sometimes use voice recognition data to authorize a user to access their financial information or to protect them from fraudulent attempts to steal their information.

These systems work by marrying data from a customer’s interaction with a call center, machine learning algorithms, and tracking technologies that can identify and flag potentially fraudulent attempts to access a customer’s account. This takes some of the guesswork and human error out of catching a con.

As data capture and analytics technologies become more sophisticated, companies will find new and more effective ways to collect and contextualize data on everything, including consumers. For businesses, doing so is essential to remain competitive well into the future; failing to do so, on the other hand, is like running a race with your legs tied together. Insight is king, and insight in the modern business environment is gleaned from contextualized data.

Data privacy regulations.

So much consumer data has been captured and analyzed that governments are crafting strict data and consumer privacy regulations designed to give individuals a modicum of control over how their data is used. The European Union’s General Data Protection Regulation (GDPR) lays out the rules of data capture, storage, usage, and sharing for companies, and GDPR compliance doesn’t just matter for European companies – it’s a law applicable to any business that targets or collects the personal data of EU citizens.

Data privacy has made it to the U.S. in the form of the California Consumer Privacy Act (CCPA). The CCPA is, in some ways, similar to the GDPR but differs in that it requires consumers to opt out of data collection rather than putting the onus on service providers to obtain opt-in consent. It also names the state, rather than a company’s internal decision-makers, as the entity that develops applicable data law.

Data privacy regulations are changing the way businesses capture, store, share and analyze consumer data. Businesses that are so far untouched by data privacy regulations can expect to have a greater legal obligation to protect consumers’ data as more consumers demand privacy rights. Data collection by private companies, though, is unlikely to go away; it will merely change in form as businesses adapt to new laws and regulations.

Source: https://www.businessnewsdaily.com/10625-businesses-collecting-data.html


Legislation, Personal Data, Privacy


The United States and the European Commission have committed to a new Trans-Atlantic Data Privacy Framework, which will foster trans-Atlantic data flows and address the concerns raised by the Court of Justice of the European Union when it struck down in 2020 the Commission’s adequacy decision underlying the EU-U.S. Privacy Shield framework. 


This Framework will reestablish an important legal mechanism for transfers of EU personal data to the United States. The United States has committed to implement new safeguards to ensure that signals intelligence activities are necessary and proportionate in the pursuit of defined national security objectives, which will ensure the privacy of EU personal data, and to create a new mechanism for EU individuals to seek redress if they believe they are unlawfully targeted by signals intelligence activities. This deal in principle reflects the strength of the enduring U.S.-EU relationship, as we continue to deepen our partnership based on our shared democratic values.


This Framework will provide vital benefits to citizens on both sides of the Atlantic. For EU individuals, the deal includes new, high-standard commitments regarding the protection of personal data. For citizens and companies on both sides of the Atlantic, the deal will enable the continued flow of data that underpins more than $1 trillion in cross-border commerce every year, and will enable businesses of all sizes to compete in each other’s markets. It is the culmination of more than a year of detailed negotiations between the EU and the U.S. following the 2020 decision by the Court of Justice of the European Union ruling that the prior EU-U.S. framework, known as Privacy Shield, did not satisfy EU legal requirements.


The new Trans-Atlantic Data Privacy Framework underscores our shared commitment to privacy, data protection, the rule of law, and our collective security as well as our mutual recognition of the importance of trans-Atlantic data flows to our respective citizens, economies, and societies.  Data flows are critical to the trans-Atlantic economic relationship and for all companies large and small across all sectors of the economy. In fact, more data flows between the United States and Europe than anywhere else in the world, enabling the $7.1 trillion U.S.-EU economic relationship.


By ensuring a durable and reliable legal basis for data flows, the new Trans-Atlantic Data Privacy Framework will underpin an inclusive and competitive digital economy and lay the foundation for further economic cooperation. It addresses the Court of Justice of the European Union’s Schrems II decision concerning U.S. law governing signals intelligence activities. Under the Trans-Atlantic Data Privacy Framework, the United States has made unprecedented commitments to:


Strengthen the privacy and civil liberties safeguards governing U.S. signals intelligence activities;

Establish a new redress mechanism with independent and binding authority; and

Enhance its existing rigorous and layered oversight of signals intelligence activities.

For example, the new Framework ensures that:


Signals intelligence collection may be undertaken only where necessary to advance legitimate national security objectives, and must not disproportionately impact the protection of individual privacy and civil liberties;

EU individuals may seek redress from a new multi-layer redress mechanism that includes an independent Data Protection Review Court that would consist of individuals chosen from outside the U.S. Government who would have full authority to adjudicate claims and direct remedial measures as needed; and

U.S. intelligence agencies will adopt procedures to ensure effective oversight of new privacy and civil liberties standards.

Participating companies and organizations that take advantage of the Framework to legally protect data flows will continue to be required to adhere to the Privacy Shield Principles, including the requirement to self-certify their adherence to the Principles through the U.S. Department of Commerce. EU individuals will continue to have access to multiple avenues of recourse to resolve complaints about participating organizations, including through alternative dispute resolution and binding arbitration.


These new policies will be implemented by the U.S. intelligence community in a way to effectively protect its citizens, and those of its allies and partners, consistent with the high-standard protections offered under this Framework.


The teams of the U.S. government and the European Commission will now continue their cooperation with a view to translate this arrangement into legal documents that will need to be adopted on both sides to put in place this new Trans-Atlantic Data Privacy Framework. For that purpose, these U.S. commitments will be included in an Executive Order that will form the basis of the Commission’s assessment in its future adequacy decision.

Source: https://www.whitehouse.gov/briefing-room/statements-releases/2022/03/25/fact-sheet-united-states-and-european-commission-announce-trans-atlantic-data-privacy-framework/

