Privacy abuses are not a bug but a feature πŸ”₯

Facebook's online privacy abuses have been known to the public for more than a decade. Dozens of scandals have strained the platform's credibility, and new data misuses are constantly brought to light. Some recent instances include the ways Facebook manipulates its news feed for research purposes, its racially biased protections against hate speech, and cases of redlining in ad targeting. The Cambridge Analytica scandal and the subsequent Senate hearing in 2018 revealed that these abuses were not bugs, but integral features of the platform.

Photo: Jim Watson

In a similarly abusive fashion, governments in the United States and many other countries have been ramping up their efforts to collect their people's information. Amidst the rise of populism and authoritarianism, legislators and law enforcement agencies have been advocating for personal data retention and encryption backdoors. Abusive government agencies call on tech companies to broadly monitor people's online activity. Our social media timelines have never been more scrutinized.

Several tech companies have been under fire for actively collaborating with anti-immigration efforts. The most prominent offenders include Microsoft and Palantir, but it turns out many more are involved.

Conversely, other areas of our digital lives have seen a lot of progress when it comes to privacy. In particular, the recent democratization of end-to-end encrypted applications like Signal and Wire has contributed to protecting the communications of millions of people. So why are we still stuck with entirely unprotected social media timelines?

Unlike private chats, public feeds are open by nature, which means conventional encryption techniques don't apply well to them. Moreover, as semi-public spaces, social media timelines have traditionally been understood as inherently unfit for privacy. But this is a false dichotomy: openness doesn't have to come at the expense of safety.

Given their open nature, how can we bring some degree of privacy to social media feeds?

Deface is the beginning of an answer to this problem. It piggybacks on Facebook and obfuscates timeline content, making users' messages costly for computers to read while preserving the experience for the people who post and read them.
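To make the idea concrete, here is a minimal, hypothetical sketch of proof-of-work obfuscation built on the browser's Web Crypto API. It illustrates the general technique, not Deface's actual scheme: a post is encrypted under a random key, and the key is published with its last two bytes withheld, so every reader's browser pays a small brute-force cost to recover it.

```typescript
// Hypothetical sketch only: encrypt a post, then withhold the last two bytes
// of the key so a reader must brute-force ~65,000 candidates to recover it.
const enc = new TextEncoder();
const dec = new TextDecoder();
const HIDDEN_BYTES = 2; // portion of the key the reader has to brute-force

async function obfuscate(post: string) {
  const keyBytes = crypto.getRandomValues(new Uint8Array(16));
  const iv = crypto.getRandomValues(new Uint8Array(12));
  const key = await crypto.subtle.importKey("raw", keyBytes, "AES-GCM", false, ["encrypt"]);
  const ciphertext = await crypto.subtle.encrypt({ name: "AES-GCM", iv }, key, enc.encode(post));
  // Publish everything except the last HIDDEN_BYTES bytes of the key.
  return { ciphertext, iv, partialKey: keyBytes.slice(0, 16 - HIDDEN_BYTES) };
}

async function deobfuscate(puzzle: { ciphertext: ArrayBuffer; iv: Uint8Array; partialKey: Uint8Array }) {
  const candidate = new Uint8Array(16);
  candidate.set(puzzle.partialKey);
  // This brute-force loop is the "work" every reader (human or crawler) pays.
  for (let guess = 0; guess < 256 ** HIDDEN_BYTES; guess++) {
    candidate[14] = guess & 0xff;        // fill the two withheld bytes
    candidate[15] = (guess >> 8) & 0xff; // (works for HIDDEN_BYTES = 2)
    try {
      const key = await crypto.subtle.importKey("raw", candidate, "AES-GCM", false, ["decrypt"]);
      const plain = await crypto.subtle.decrypt({ name: "AES-GCM", iv: puzzle.iv }, key, puzzle.ciphertext);
      return dec.decode(plain); // GCM authentication confirms the guess was correct
    } catch {
      // wrong guess, keep trying
    }
  }
  throw new Error("puzzle not solved");
}
```

A fraction of a second of work per post is invisible to a person scrolling their own feed, but it compounds into a substantial bill for anyone trying to read billions of posts.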

Deface logo
Deface could benefit the public in several ways:
  • The adversarial approach taken by the project aims to spark a conversation. It questions Facebook's unethical practices, and challenges the established view that there's no possible privacy in public online spaces.
  • Deface disrupts personal data processing by Facebook and third-party organizations. If successful, this increases the cost of mass surveillance, and reduces the monitoring capabilities of abusive government agencies.
  • By challenging Facebook's ability to read people's content, Deface might also prevent the company's algorithms from overly curating people's feeds and passively promoting hate speech.
  • As an open source tool and research project, Deface shows how personal data can be decoupled from the platforms where it is stored. This paves the way for many other projects of social media appropriation.

Tracks of work in 2019 ⚑️

Deface currently exists as a Chrome extension prototype, and a public release is expected in 2019. Our open source efforts will cover several tracks of work, as we commit to tackling the many social and technical challenges ahead of us.

Security & privacy πŸ›‘

Most urgently, we're looking to connect with security researchers and cryptographers in order to confirm the viability of our design. Our approach to privacy is unconventional, and we don't expect to get it right on the first try. In parallel, we are writing an open source cost analysis that models the impact of proof-of-work obfuscation on Facebook's data processing capabilities. At a later stage, we will seek security auditing.
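As a taste of what that analysis involves, here is a toy cost model with entirely made-up parameters; the published analysis will rely on real measurements and compute prices.

```typescript
// Toy cost model with illustrative, assumed parameters only.
const postsPerDay = 1e9;        // assumed volume of obfuscated posts per day
const secondsPerPuzzle = 0.5;   // assumed CPU time to brute-force one post
const dollarsPerCpuHour = 0.05; // assumed bulk compute price

const cpuHoursPerDay = (postsPerDay * secondsPerPuzzle) / 3600;
const dailyCost = cpuHoursPerDay * dollarsPerCpuHour;

console.log(`${Math.round(cpuHoursPerDay).toLocaleString()} CPU-hours per day`);      // ≈ 138,889
console.log(`$${Math.round(dailyCost).toLocaleString()} per day to read every post`); // ≈ $6,944
```

The point of such a model is the asymmetry: half a second of work is negligible for one reader, but it scales linearly with the number of posts a platform wants to mine.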

Frontend development & resilience β˜”οΈ

Facebook is notorious for shutting down applications that piggyback on its platform. We will need to harden the Deface client's content scripts and build resilience against Facebook's countermeasures.
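As an example of the kind of hardening involved (the selector and attribute names below are placeholders, not Deface's actual code), a content script can watch the live DOM and key on comparatively stable ARIA roles rather than Facebook's frequently renamed class names:

```typescript
// Hypothetical content-script sketch: locate feed posts via a stable ARIA role
// instead of brittle, machine-generated class names, and re-scan whenever
// Facebook's single-page app injects new content.
const FEED_POST_SELECTOR = '[role="article"]'; // placeholder heuristic

function processPost(post: Element): void {
  if (post.hasAttribute("data-deface-processed")) return; // avoid double work
  post.setAttribute("data-deface-processed", "true");
  // ...obfuscate or deobfuscate the post's text nodes here...
}

const observer = new MutationObserver(() => {
  document.querySelectorAll(FEED_POST_SELECTOR).forEach(processPost);
});
observer.observe(document.body, { childList: true, subtree: true });
```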

Cross-platform experience πŸ“±

Most people consume social media on mobile platforms. For this reason, it is critical for Deface to become a cross-platform experience. We plan to release Deface to major browsers (Chrome, Firefox) and mobile operating systems (Android, iOS).

Public-key encryption features πŸ”

In order to accelerate content sharing and provide stronger safety features, Deface will eventually secure some communications using public-key encryption. To this end, we need to flesh out a technical specification.
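That specification is still to be written, but the primitives already ship with the browser. As a hedged sketch of what it could build on, Web Crypto's ECDH lets two users derive a shared key without ever exchanging a secret:

```typescript
// Hypothetical sketch of a primitive the spec could rely on: each user holds
// an ECDH key pair, and any two users derive the same AES-GCM key from their
// own private key and the other party's public key.
async function generateKeyPair(): Promise<CryptoKeyPair> {
  return crypto.subtle.generateKey({ name: "ECDH", namedCurve: "P-256" }, true, ["deriveKey"]);
}

async function sharedKey(myPrivateKey: CryptoKey, theirPublicKey: CryptoKey): Promise<CryptoKey> {
  return crypto.subtle.deriveKey(
    { name: "ECDH", public: theirPublicKey },
    myPrivateKey,
    { name: "AES-GCM", length: 256 },
    false,
    ["encrypt", "decrypt"],
  );
}
```

How keys would be distributed, rotated, and revoked is exactly what the specification needs to settle.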

Open source development 🌍

For the sake of transparency and accountability, we plan to share our code openly and publish documentation under a Creative Commons license. We’re committed to assembling a diverse set of voices and skill sets, and to providing a safe space for conversation.

Use-cases & ethics 🐝

We seek to connect with community organizers and activists who could benefit from this project, and together tailor the tool to their needs. Because each community operates differently, we need to understand its threat model and mitigate the risks it might face.

User experience & security information β›‘

It is critical our audience understands what's at stake with their data, and what security trade-offs Deface makes. We also need to design our application with built-in ways to opt out and walk things back when necessary.

Accessibility 😎

Because it alters Facebook's conventional user experience, Deface might impact individuals with accessibility needs. We plan to review and address those potential issues.

Launch campaign πŸš€

The tracks of work listed above will lead up to a release of Deface to the public. We will need to strategize to make this launch impactful. We will talk to the press, coordinate with privacy advocates, and identify influencers willing to support our project.