Given our industry's long history of lying about data retention and usage, OpenAI's opaqueness, and Sam Altman's specific sleaziness, I wouldn't trust this privacy statement one bit. But I know the statement will be enough for corporate "due diligence".
Which is a shame, because an actual audit of the live training data of these systems could be possible, albeit imperfect. Set up an independent third-party audit firm that gets daily access to a randomly chosen slice of the training data and checks its source. Something along those lines would give some actual teeth to these statements about data privacy or data segmentation.