Design principles & privacy safeguards
Thesis
Overview
The purpose of Synapse is to extend the mission of Boundless: maximizing human life and freedom. Whatever path maximizes these is the path Synapse must take. Often this means protecting people's privacy more strongly than they themselves ask for.
However, consent is the cornerstone; ultimately, the user is in control of their own fate.
Abuse-resistant design
The future of the world runs on data. Data about a person's health can be used to prescribe them life-saving medicine at a fraction of the cost. Yet that same data can be used to do serious harm through surveillance. Even innocuous tools that leave users completely in control can have dangerous externalities.
For example, when people first got smartphones with cameras, it was up to them whether to use the camera. But governments now require those same individuals to photograph themselves holding their ID. This ties a person's digital footprint to their physical identity, undoing the anonymity that made the original web a place of free expression.
Nothing created by or within Synapse may ever have serious, harmful consequences under conditions of abuse. This is not about cutting features. It is about finding ways to design features so that they are useless to an abuser.
There will be debates about what is acceptable, and maximizing human life and freedom must be the guiding light in those debates.
User-guided privacy
User-initiated
The user must be the deliberate initiator of any action that reveals their data to others.
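To make this concrete, here is a minimal sketch of what a user-initiated sharing gate could look like. All names (UserGesture, ShareRequest, share) are hypothetical illustrations, not an existing Synapse API; the pattern is borrowed from the browser notion of transient user activation, where privileged calls require proof of a direct user gesture.

```typescript
// Hypothetical sketch: data can only leave through a call that carries
// proof of a direct user action. Names are illustrative, not real APIs.

// Opaque proof of a direct user action; only the platform mints it.
class UserGesture {
  private constructor(readonly at: Date) {}
  /** Called by the platform's input pipeline, never by applications. */
  static fromInputEvent(): UserGesture {
    return new UserGesture(new Date());
  }
}

interface ShareRequest {
  recipient: string; // who would receive the data
  fields: string[];  // exactly which fields would be revealed
}

async function share(
  gesture: UserGesture,
  request: ShareRequest,
  confirm: (r: ShareRequest) => Promise<boolean>
): Promise<boolean> {
  // Require a fresh gesture so a stale token cannot be replayed.
  if (Date.now() - gesture.at.getTime() > 5_000) return false;
  // The platform renders the confirmation, so apps cannot skip it.
  if (!(await confirm(request))) return false; // default: nothing leaves
  sendToRecipient(request);
  return true;
}

function sendToRecipient(request: ShareRequest): void {
  // Transport deliberately elided in this sketch.
}
```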
User-control
The user cannot be forced to adhere to our higher standards of privacy if they deliberately choose not to. This requires explicit consent, with the trade-offs made clear to the user.
For example, it may not be apparent that AI facial recognition can produce false positives that get innocent people arrested. Problems of this sort must be made apparent before people deliberately consent to things with negative externalities.
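One hedged sketch of how that disclosure could be made structurally unavoidable, rather than left to each application's goodwill: the platform refuses to even render a consent dialog unless the request spells out concrete trade-offs. The shapes and names below are assumptions for illustration.

```typescript
// Hypothetical sketch: a consent request is invalid without disclosed
// trade-offs, so a "silent" consent dialog cannot be built at all.

interface TradeOff {
  risk: string;        // e.g. "facial recognition produces false positives"
  consequence: string; // e.g. "innocent people may be flagged or arrested"
}

interface ConsentRequest {
  purpose: string;
  tradeOffs: TradeOff[]; // shown verbatim to the user before they decide
}

function validateConsentRequest(req: ConsentRequest): void {
  if (req.tradeOffs.length === 0) {
    // No disclosed trade-offs, no dialog: the platform refuses outright.
    throw new Error("consent cannot be requested without disclosed trade-offs");
  }
}
```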
Privacy screen
Ideally, we would give every user a privacy screen through which they can delete their data as it is stored by every application that has ever used Synapse. Synapse would then be a common means of controlling what information is shared with third parties: not only what information they initially receive, but also their continued right to hold that data, with companies forced to delete it when the user retracts it.
Essentially, we would like to give users the ability to completely claw back their data from any company that holds it, forcing leaked data out of the greater web where possible. If at all possible, we would make supporting this through Synapse a condition of developing for Synapse.
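As a rough sketch of how such a clawback could be mechanized (every name here is hypothetical, not a specified protocol): each application that receives data through Synapse would implement a retraction endpoint, and the privacy screen would fan a user's retraction out to every holder, recording which ones fail to confirm deletion.

```typescript
// Hypothetical retraction protocol. Applications that do not confirm
// deletion are flagged, producing evidence for the judging system
// described below.

interface RetractionHandler {
  /** Delete all stored copies of the identified record and confirm. */
  retract(recordId: string): Promise<{ deleted: boolean }>;
}

async function retractEverywhere(
  recordId: string,
  holders: Map<string, RetractionHandler> // appId -> its handler
): Promise<string[]> {
  const nonCompliant: string[] = [];
  for (const [appId, handler] of holders) {
    try {
      const result = await handler.retract(recordId);
      if (!result.deleted) nonCompliant.push(appId);
    } catch {
      nonCompliant.push(appId); // unreachable or refusing: flag it
    }
  }
  return nonCompliant;
}
```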
This could be done via a judging system in which capable or leading individuals in digital societies would have the right to ban applications that do not comply with privacy requirements like this one. Rather than being a law, it could be predicated on abuse and public knowledge of that abuse. It would then not operate like regulation, which stifles innovation, but as a system for curtailing grotesque abuse when it arises, without bothering other parties.
How people are elected to the seats that control these matters within the Synapse framework remains an open question.