Differential Privacy Synthetic Data Challenge

Propose an algorithm that produces differentially private synthetic datasets, protecting personally identifiable information (PII) while preserving a dataset's utility for analysis

  • Program Type: Other Funding
  • Funding Award: $150,000

This challenge focuses on proactively protecting individual privacy while allowing public safety data to be used by researchers for beneficial purposes and outcomes. NIST’s Public Safety Communications Research (PSCR) program has strong commitments to both public safety research and the preservation of security and privacy, including the use of de-identification.
There is no absolute guarantee that data will not be misused. Even a dataset that protects individual identities may be used for ill purposes if it falls into the wrong hands, and weaknesses in the security of the original data can themselves threaten the privacy of individuals.
Privacy in data release is an important concern for the Federal Government (which has an Open Data Policy), state governments, the public safety sector, and many commercial and non-governmental organizations. Developments coming out of this competition could drive major advances in the practical application of differential privacy for these organizations.
The purpose of this series of competitions is to provide a platform for researchers to develop more advanced differentially private methods that can substantially improve the privacy protection and utility of the resulting datasets.
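To make the goal concrete, here is a minimal, hypothetical sketch of one very simple differentially private synthesizer (it is not a challenge submission or a NIST reference method, and all names and parameters are illustrative): Laplace noise is added to histogram counts, and synthetic records are sampled from the noisy histogram. Because each record affects exactly one count, the histogram has sensitivity 1, and noise with scale 1/ε satisfies ε-differential privacy.

```python
import numpy as np

def dp_histogram_synthesizer(values, bins, epsilon, n_synthetic, seed=None):
    """Toy synthesizer: Laplace-noised histogram, then resampling.

    Each record contributes to exactly one bin, so the histogram has
    L1 sensitivity 1; adding Laplace(1/epsilon) noise to every count
    satisfies epsilon-differential privacy. Everything after the noise
    step is post-processing and consumes no additional privacy budget.
    """
    rng = np.random.default_rng(seed)
    counts, edges = np.histogram(values, bins=bins)
    noisy = counts + rng.laplace(scale=1.0 / epsilon, size=counts.shape)
    noisy = np.clip(noisy, 0, None)               # negative counts are meaningless
    total = noisy.sum()
    probs = noisy / total if total > 0 else np.full(len(noisy), 1.0 / len(noisy))
    # Sample bins according to the noisy proportions, then draw a value
    # uniformly within each selected bin.
    chosen = rng.choice(len(probs), size=n_synthetic, p=probs)
    return rng.uniform(edges[chosen], edges[chosen + 1])

# Example: release 1,000 synthetic "age" values at epsilon = 0.5.
ages = np.random.default_rng(0).integers(18, 90, size=5_000)
synthetic_ages = dp_histogram_synthesizer(ages, bins=24, epsilon=0.5, n_synthetic=1_000)
```

A one-dimensional histogram like this loses the correlations between attributes; competitive entries must handle high-dimensional, correlated data while preserving analytic utility, which is exactly the trade-off the challenge targets.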

NIST Differential Privacy Synthetic Data Challenge - 3/14/2019 Webinar

Future Phases

Check out an overview of each competition phase below - complete rules for each phase will be released as the competition progresses.


Let’s Get Started.

To apply or join the program, please submit an application.