What are the main current initiatives at the Future of Privacy Forum?
Future of Privacy Forum is a nonprofit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies.
FPF brings together industry, academics, consumer advocates, and other thought leaders to explore the challenges posed by technological innovation and develop privacy protections, ethical norms, and workable business practices.
FPF helps fill the void in the “space not occupied by law” which exists due to the speed of technology development.
As “data optimists,” we believe that the power of data for good is a net benefit to society, and that it can be well-managed to control risks and offer the best protections and empowerment to consumers and individuals.
FPF’s current initiatives include best practices for the use of genetic data generated by consumer genetic testing services, privacy and data management challenges for artificial intelligence and machine learning, potential harms and mitigation strategies for algorithmic decision-making, advancing sensible practices for mobility-related technologies, and addressing student privacy.
What privacy issues can city technologies raise?
Smart communities combine technology platforms, Big Data analytics, and government services, and promise to use civic data to spark innovation, drive inclusivity, and make urban spaces more efficient, livable, and equitable. Many smart community technologies rely on personal data about individuals and can raise significant privacy issues if appropriate safeguards are not taken into consideration. If these technologies are not implemented with serious attention to privacy, they threaten to violate individuals’ rights and upset the balance of power between city governments and city residents. If residents do not see the benefits of new technologies, or do not trust that their information will be protected, they will view new urban sensors and services as tools of discipline and surveillance rather than transparency and innovation.
Is it possible to mitigate these concerns, while preserving the benefits of cities that are cleaner, faster, safer, more efficient and more sustainable? How?
Absolutely. While smart community technologies can raise privacy issues, sophisticated data privacy programs can mitigate these concerns while preserving the benefits of cities that are cleaner, faster, safer, more efficient, and more sustainable. Cities can mitigate privacy issues by ensuring they conduct thorough privacy impact assessments that identify and alleviate concerns.
Leading cities are hiring Chief Privacy Officers and Chief Data Officers to make thoughtful decisions about providing appropriate notices, choices, and security measures to protect citizens’ data. If city leaders, technology providers, community organizations, and other stakeholders work together to address fundamental privacy issues, they will be able to leverage the benefits of a data-rich society while minimizing threats to individual privacy. Successful cities will compete to be the most accountable and transparent about how they use data about residents, not just the most technologically advanced.
The U.S. federal government released guidance that will hasten the rollout of self-driving cars on American roads. What is your opinion?
In September 2017, the Department of Transportation and the National Highway Traffic Safety Administration issued updated guidance for autonomous vehicles, streamlining the prior year’s guidance, incorporating public comments, and stripping privacy from its recommendations. For those of us who believe that proactively protecting consumer privacy in the connected car is crucial to the adoption of this safety-enhancing technology, its absence from the NHTSA guidance is notable. But some may find that this move achieves the clarity that many of us, including the Government Accountability Office in a 2017 study, have called for regarding the respective roles of NHTSA and the Federal Trade Commission—by making clear that nearly all issues related to privacy in connected cars fall squarely in the FTC’s camp.
Nevertheless, mobility-related technologies are evolving rapidly, transforming the safety and convenience of transportation. Many of these new features are enabled by the collection of new types of data, putting the topic of privacy in connected cars on the agenda of industry, policymakers, and regulators. Advancing sensible practices will be essential to ensure that the collection and use of this data is responsible, thoughtful, and communicated effectively to consumers.
What does privacy mean to you?
We live in a world with a wealth of data. There are endless opportunities to use that data to unlock new health benefits, to implement tools that make cities more efficient, to build cars that are aware of their surroundings. Can my wearable tell me that I am coming down with some disease that I can now treat because I caught it early? There are so many areas where data holds opportunity, but every one of those opportunities is also a source of great risk.
Do we want a world that is so Orwellian, even if it is for our own good, because our data can be used to nudge us, to improve us, to help us live longer and better? Clearly, we want those benefits, but how do we obtain them in a way that is ethical, that is moral, that doesn’t make us give up control over our sense of self, over our autonomy? We want a world that is safer, but we don’t want government monitoring every email and every phone call; nor do we want terrorist attacks or other terrible things happening. In pursuing the existing opportunities—safety, happiness, wellness, all the things that data can be harnessed to achieve—we are only beginning to scratch the surface, but we are also beginning to understand the dark path of misusing the data, or of being overly optimistic.
As CEO of a non-profit organization that convenes industry, academics, and advocates, my challenge is to hear all sides and to try to develop a compromise that maximizes benefit and minimizes risk.
I am excited about the latest gadget and newest breakthrough, but also understand the incredible consequences of not putting proper policy in place to ensure that society benefits in a way that lifts us all up and takes us in a positive direction. That’s what privacy means to me.
Anything else you wish to add?
Please subscribe to our mailing list at www.fpf.org/subscribe to stay updated about our work. You can also follow us on social media at @futureofprivacy. If you are interested in becoming a member of FPF, please contact Barbara Kelly at firstname.lastname@example.org.
Jules Polonetsky serves as CEO of the Future of Privacy Forum, a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Jules is co-editor of The Cambridge Handbook of Consumer Privacy, published by Cambridge University Press (2018). More of his writing and research can be found on Google Scholar and SSRN and at fpf.org.
Jules is also Co-chairman of the Israel Tech Policy Institute, based in Tel Aviv.
Jules’s previous roles have included serving as Chief Privacy Officer at AOL and, before that, at DoubleClick; as Consumer Affairs Commissioner for New York City; as an elected New York State Legislator; as a congressional staffer; and as an attorney.
Jules has served on the boards of a number of privacy and consumer protection organizations, including TRUSTe, the International Association of Privacy Professionals, and the Network Advertising Initiative. From 2011 to 2012, Jules served on the Department of Homeland Security Data Privacy and Integrity Advisory Committee. Jules is a member of The George Washington University Law School Privacy and Security Advisory Council.