Facebook must not be allowed to act like a “digital gangster,” an influential committee of British lawmakers has said in a report that calls for the company to be more heavily regulated. Facebook has responded by saying it is “open to meaningful regulation” regarding disinformation and privacy.
The report about online disinformation and “fake news” was published Monday by the U.K. Parliament’s Digital, Culture, Media and Sport (DCMS) Committee.
The committee investigated the subject for a year and a half, at one point forming an unprecedented “Grand Committee” with lawmakers from eight other countries: Canada, Brazil, France, Ireland, Argentina, Singapore, Belgium and Latvia. The Cambridge Analytica scandal broke during the investigation, adding considerable fuel to the fire.
The outcome? According to the report’s conclusions, it’s time to end the era of self-regulation and voluntary codes of practice for tech companies like Facebook, and to introduce a compulsory code of ethics with an independent regulator to enforce it.
That means making the tech companies legally liable for “harmful and illegal content” on their platforms, with “large fines” for non-compliance.
“Among the countless innocuous postings of celebrations and holiday snaps, some malicious forces use Facebook to threaten and harass others, to publish revenge porn, to disseminate hate speech and propaganda of all kinds, and to influence elections and democratic processes—much of which Facebook, and other social media companies, are either unable or unwilling to prevent,” the report read. “We need to apply widely-accepted democratic principles to ensure their application in the digital age.”
Facebook CEO Mark Zuckerberg did not do himself or his company any favors by repeatedly refusing to testify before the DCMS committee. Frustrated and offended by his absence, the committee took the extraordinary step last November of seizing internal Facebook emails from a businessman who obtained the documents in a lawsuit against Facebook, and who happened to be travelling with them in the U.K.
In its report, the committee noted that the emails showed Facebook was “willing to override its users’ privacy settings in order to transfer data to some app developers, to charge high prices in advertising to some developers, for the exchange of that data, and to starve some developers… of that data, thereby causing them to lose their business.” This meant the company had “at the very least” breached a privacy-related consent order with the U.S. Federal Trade Commission, it said.
The U.K.’s antitrust regulator should investigate Facebook, the committee urged, adding: “Companies like Facebook should not be allowed to behave like ‘digital gangsters’ in the online world, considering themselves to be ahead of and beyond the law.”
“The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight. But only governments and the law are powerful enough to contain them. The legislative tools already exist,” the report read. “They must now be applied to digital activity, using tools such as privacy laws, data protection legislation, antitrust and competition law. If companies become monopolies they can be broken up, in whatever sector.”
In response, Facebook public policy manager Karim Palant said the company was “open to meaningful regulation,” and also supports “effective privacy legislation that holds companies to high standards in their use of data and transparency for users.” He added: “We have already made substantial changes so that every political ad on Facebook has to be authorized, state who is paying for it and then is stored in a searchable archive for seven years… While we still have more to do, we are not the same company we were a year ago.”