
Update DPG privacy requirements #183

Open
gfanti opened this issue Jun 26, 2024 · 2 comments


gfanti commented Jun 26, 2024

We (@GeetikaGopi, @amad-person, @omkhar, @gfanti) are a team of researchers from Carnegie Mellon University and OpenSSF. In a recent study, to appear at SOUPS 2024, we found that a significant fraction of DPGs answer question 9(a) on privacy with responses that are incomplete or misleading (more details in the paper). For example, the level of detail that many DPGs provide in response to 9(a) is insufficient to say much about their privacy posture, which makes it difficult to determine whether PII is being handled properly. We would therefore like to discuss the possibility of updating the privacy requirements for being classified as a DPG.

Starting point: A proposed solution
It may not be scalable or feasible for the DPGA to meaningfully evaluate the privacy posture of DPGs, given that many DPGs consist of large and complex codebases. In our paper (Section 6.2.1), we propose an alternative architecture for privacy evaluation of DPGs. Roughly, the proposed process would proceed as follows:

  1. DPGs would complete and submit a standardized privacy assessment developed by an external body; for example, privacy impact assessments (PIAs) are widely used.
  2. DPGs could either complete this assessment on their own (self-attestation) or submit a certified assessment from third parties approved by the DPGA (e.g., many consulting firms routinely conduct PIAs today).
  3. DPGs would submit the documentation of their privacy assessment along with their DPG application.
  4. The DPGA would not evaluate the quality of the privacy assessment beyond ensuring a good-faith response; it would simply post the assessment information on the DPGA website, along with the remaining DPG standard responses (a sketch of what such a posted record might look like follows this list).
  5. Adopters would evaluate for themselves whether a DPG meets their privacy requirements; the posted assessment would give them a summary from which to make an initial determination.
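
As a very rough illustration of steps 3 and 4, here is a minimal sketch, in Python, of what a machine-readable record posted alongside the DPG standard responses might look like. To be clear, everything in this sketch is hypothetical: the `PrivacyAssessment` class, its field names, and the example values are our own illustration, not an existing DPGA schema; the real schema would be defined by the external body from step 1.

```python
from dataclasses import dataclass, field, asdict
from enum import Enum
import json


class AssessmentType(str, Enum):
    # Hypothetical categories mirroring step 2 of the process above.
    SELF_ATTESTATION = "self-attestation"
    THIRD_PARTY_CERTIFIED = "third-party-certified"


@dataclass
class PrivacyAssessment:
    """Hypothetical record a DPG might attach to its application.

    All field names are illustrative; the actual schema would be
    defined by the external body from step 1, not by the DPGA.
    """
    dpg_name: str
    assessment_type: AssessmentType
    framework: str                 # e.g., the PIA template that was used
    assessor: str                  # the DPG itself, or an approved third party
    report_url: str                # where the full assessment document lives
    pii_categories: list[str] = field(default_factory=list)

    def to_json(self) -> str:
        """Serialize for posting alongside the DPG standard responses."""
        record = asdict(self)
        record["assessment_type"] = self.assessment_type.value
        return json.dumps(record, indent=2)


# Illustrative values only; not a real DPG or report.
submission = PrivacyAssessment(
    dpg_name="example-dpg",
    assessment_type=AssessmentType.SELF_ATTESTATION,
    framework="PIA",
    assessor="example-dpg maintainers",
    report_url="https://example.org/example-dpg/pia-report.pdf",
    pii_categories=["email", "coarse location"],
)
print(submission.to_json())
```

A structured record along these lines would let adopters filter and compare assessments programmatically (step 5) before reading the full report, without the DPGA ever judging the assessment's substance.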

We believe this process has a few desirable properties:

  • It provides adopters with a more nuanced evaluation of DPGs’ privacy postures, compared to the current responses to 9(a).
  • It does not require the DPGA to decide what privacy features are important.
  • It does not require the DPGA to evaluate DPGs’ privacy postures.
  • It makes use of existing, widely adopted privacy evaluation tools and ecosystems.

This is, of course, not the only possible process, and as always, there are tradeoffs. We would be happy to discuss this issue (both the underlying problem and potential solutions) further.

@jstclair2019

+1 @gfanti, I wholeheartedly endorse this proposal. I agree with the proposed steps as an interim measure, but ultimately the DPGA should be harmonized to specify common privacy controls.


gfanti commented Sep 15, 2024

Thank you for your consideration! May we ask if there has been any further discussion on these points?
