A pair of notes written by Andrew Bosworth, head of Facebook Reality Labs, and published internally to company employees provides a look into Facebook's approach to privacy heading into 2021.
The notes were first made public by Alex Kantrowitz, and UploadVR independently confirmed them as authentic. The contents are of significant interest to our readers, so we are publishing them in full below:
Andrew Bosworth to Facebook Reality Labs:
Starting in January, we are changing the way we approach product development in FRL. Instead of imagining a product and trimming it down to fit modern standards of data privacy and security, we are going to invert our process. We will start with the assumption that we can’t collect, use, or store any data. The burden is on us to demonstrate why certain data is truly required for the product to work. Even then, I want us to scope it as aggressively as we can, holding a higher bar for sending data to the server than we do for processing it locally. I have no problem with us giving users options to share more if they choose (opt-in), but by default we shouldn’t expect it.
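To make the rule concrete, here is a minimal sketch of what "default deny, local before server, opt-in for sharing" could look like in code. This is purely illustrative: the `TelemetryPolicy` struct and both functions are hypothetical, not any actual Facebook API.

```c
#include <stdbool.h>

/* Hypothetical policy record: every field defaults to the most
 * restrictive value, so nothing is collected unless justified. */
typedef struct {
    bool required_for_core; /* demonstrated need for the product to work */
    bool user_opted_in;     /* explicit user choice to share more */
} TelemetryPolicy;

/* Lower bar: data may be processed on-device only when it is
 * truly required for the product to function. */
bool may_process_locally(const TelemetryPolicy *p) {
    return p->required_for_core;
}

/* Higher bar: even required data stays on-device unless the user
 * has explicitly opted in to sending it to the server. */
bool may_send_to_server(const TelemetryPolicy *p) {
    return p->required_for_core && p->user_opted_in;
}
```

The shape of the sketch mirrors the memo's inversion: collection is impossible by default, and each gate must be argued open rather than argued closed.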
I don’t want us to just meet the consumer expectations for privacy today. I want us to differentiate our products on the basis of privacy. Let other companies scramble to keep up with us.
Andrew Bosworth’s “The Big Shift”:
When I joined Microsoft in 2004, the only required reading was Writing Secure Code. The company had spent years getting dragged through the mud for the viruses prevalent on their platforms. While relatively few consumers faced material negative impact from malware, the possibility of it was a constant. Getting a glance at the codebase, I could see why. They had built up decades of buffer overruns and unchecked dereferences in a sprawling codebase.
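For readers unfamiliar with the two bug classes Bosworth names, here is a hypothetical C fragment, not drawn from any Microsoft code, showing what a buffer overrun and an unchecked dereference look like.

```c
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Buffer overrun: strcpy writes past the 16-byte buffer whenever
 * the caller-controlled input is longer than 15 characters. */
void greet(const char *name) {
    char buf[16];
    strcpy(buf, name);          /* no bounds check: classic overrun */
    printf("Hello, %s\n", buf);
}

/* Unchecked dereference: malloc may return NULL, and dereferencing
 * it crashes (or worse) instead of failing cleanly. */
int first_byte(size_t n) {
    char *p = malloc(n);
    return p[0];                /* pointer never checked before use */
}
```

Both mistakes are mechanical and easy to make at scale, which is why the review culture described next could target them so effectively.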
I sometimes wonder if Microsoft engineers before me felt the coverage of their company was unfair. Perhaps they felt that too much attention was being paid to relatively rare issues. Perhaps the trade-offs they had made enabled huge ecosystems of developers who might otherwise struggle. If that was the case, it didn’t matter. Consumers didn’t feel safe using Microsoft products.
By the time I joined, the quality of new code was impressive. Security got more attention than anything else in code reviews. Even a junior engineer like me was held to the highest standards. They made product decisions that in many ways made their user experience worse but in exchange (rightly) gave consumers confidence that the system they were using was trustworthy.
Today Microsoft is considered perhaps the most trustworthy software vendor in the world. It is trusted by an overwhelming majority of enterprise companies. Having been on the outside since 2005, I have found it impressive to watch their persistence yield a gradual but definitive shift in their reputation. I think this is a model for us at Facebook. We should become the undisputed leaders in providing privacy-aware software.
As it did at Microsoft, this starts with an acknowledgement that the way we operated for a long time is no longer the best way to serve those who use our products. We have always put the best consumer experience first. That meant the best ranking, the best content, and the best sharing tools. Contrary to what is often written, we have always cared about privacy and balanced providing people with both a good privacy experience and a good user experience. But global sentiment has clearly shifted to the point that people are willing to accept sacrifices in the quality of the product, in ways we hadn’t considered, in order to have stronger guarantees around data privacy. We need to consider the consumer experience holistically rather than optimizing for each individual feature.
The good news is that this next phase of our journey is underway. It started gradually a long time ago, and then suddenly in mid-2019, when we clearly put privacy first internally and redesigned the Privacy Review process to provide just such a holistic view of our products. Privacy Review is an effective backstop but is still having to escalate far too often, as local teams haven’t yet internalized the magnitude of the shift we are undergoing. The next step is for the priority of privacy to permeate the entirety of our culture; we’ve made inroads here, but we have a long way to go. Privacy Review should become a simple housekeeping exercise unless we detect further shifts in public attitudes towards privacy.
Of course we are also building tools to make this easier. Just as Microsoft deployed fuzzing to test every input, we will build tools that make it easier to write good code and harder to write bad code. But the tools will only be effective insofar as we stop fighting them at the cultural level. The new normal is giving people choices about data usage and finding ways to provide the best value we can when data isn’t available.
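As a concrete illustration of the kind of tooling the memo alludes to, here is a minimal fuzz target in the style of LLVM's libFuzzer. This is a generic sketch of fuzzing, not a description of Microsoft's or Facebook's internal tools, and `parse_record` is a hypothetical function under test.

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical parser under test; any real target would be
 * declared here and linked in from its own source file. */
int parse_record(const uint8_t *data, size_t size);

/* The fuzzing engine calls this entry point repeatedly with
 * mutated inputs; any crash, hang, or sanitizer report is
 * flagged as a bug. */
int LLVMFuzzerTestOneInput(const uint8_t *data, size_t size) {
    parse_record(data, size);
    return 0;   /* non-zero return values are reserved by libFuzzer */
}
```

Built with something like `clang -g -fsanitize=fuzzer,address fuzz_target.c parser.c`, a target like this turns "test every input" from an aspiration into an automated, continuously running process.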
With new culture, new tools, and a concerted effort to revisit old products, we are on a long road to redemption. We shouldn’t expect any credit before it is due, and perhaps not even until a good while afterwards. But we should nonetheless feel proud of the shift we are undertaking and confident in our ability to see it through.