The incoming Digital Markets Act (DMA) and Digital Services Act (DSA) are two of the EU’s most significant pieces of digital-era legislation. But a set of key protections in the latter risks being lost during final negotiations this week.
The two Acts place fresh regulatory structure around digital services: the DMA aims to increase competition and impose greater responsibilities on market-dominant players (the tech-giant “gatekeepers”), while the DSA focuses on ensuring greater transparency, safety and consumer protection from service providers.
As with the EU's General Data Protection Regulation, both have been closely watched internationally. Their final formats will provide industry guardrails that will determine how multinational "gatekeeper" platforms and other digital service providers function in Europe, and by extension, the world. As with the GDPR, it will prove difficult for companies to comply with European regulations and not offer services to a similar standard elsewhere.
Hence the enormous EU and international pressure on these flagship Acts. The DMA is now baked, having received general approval from the European Parliament in March. But the DSA is still, just barely, in play.
As the negotiating deadline looms, the European Council under the French presidency has been pushing against the European Parliament to water down several sections that contain some of the more important provisions of the Act – areas that would have direct guiding effect in member state legislation, such as Ireland's proposed online safety Bill.
BEUC, the European consumer rights organisation, has written an open letter to the French EU presidency, highlighting its alarm.
“It is with concern that we have seen important consumer-relevant provisions watered down in recent compromise proposals put forward by the French presidency, notably in relation to the obligations of online marketplaces, online advertising and dark patterns,” the letter states.
The besieged DSA provisions include those barring platforms from using deceptive “dark pattern” website designs which manipulate users towards commitments and information disclosures. In particular, children’s rights campaigners bridle at the proposed weakening of dark pattern protections for minors.
Another worry is the intention to drop a commitment to protect children from online targeted advertising and tracking. Negotiators have pushed for better safeguards for everyone against such hidden data-gathering and exploitation but, at the very least, minors must be defended against this digital subterfuge.
BEUC also argues for spot checks on the quality of goods sold by online marketplaces. It bemoans the DSA’s failure to introduce stricter platform liability for illegal goods – “a missed opportunity to act more decisively against the lax approach that online platforms often take against illegal activities such as the sale of unsafe products” – and wants an obligation on online marketplaces to carry out random quality checks.
These critical steps tackle some of the most pressing problems in the online manipulation of consumers, especially children. And yet, the response from the council has been to argue that all these areas are already covered in other legislation, specifically the GDPR and the unfair contract terms directive.
This is a ludicrous claim. If either contained adequate safeguards in these areas, it would already be widely used to address and prevent such problems. According to solicitor Simon McGarr of Data Compliance Europe and Digital Rights Ireland, neither of those laws adequately covers these areas.
“We have laws on the books which can laboriously be used to ban some of the outcomes of dark patterns, but it would be a lot easier for regulators and consumers alike if the dark patterns themselves were forbidden,” he says.
He also notes that some of these provisions were planned for the stalled ePrivacy legislation, intended as a companion to the GDPR and the target of some of the heaviest industry lobbying in EU history.
“In effect, the proposed amendment would import some of those stalled ideas on consent which had been expected to have been addressed four years ago,” McGarr says.
German MEP Alexandra Geese, a leading DSA parliamentary negotiator, says: "Consent is a crucial concept in GDPR. It should allow users to share data only when they are comfortable with doing so. But today's consent frameworks don't allow users to do that.
“While it is very easy to click to give consent, it is close to impossible or at least extremely time-consuming to refuse consent. Users are systematically misled.”
And while she notes that GDPR should protect internet users from practices that habitually force them to divulge personal data in exchange for services – data often sold on to hundreds of data brokers – she says that “unfortunately GDPR is not consistently enforced”.
In her view, "enforcement is particularly poor in Ireland where Google and Facebook and most other major tech companies are based. This gives those platforms a competitive advantage over companies based in other countries where GDPR is consistently enforced." Ouch.
That’s an indication of parliamentary sentiment: a perceived need to legislate around what many MEPs see as a major GDPR roadblock – Ireland.
There’s little time to ensure the DSA gets the legal heft it needs. Hopefully, parliamentary negotiators will prevail, clearing the way to a better online environment for all.