a gray area, legally speaking. what if a model could retrieve the exact data used during training on demand, pretty much like a database, but wouldn't do so under inspection because that retrieval requires specific undocumented parameters? to me that sounds like a convenient way of stealing personal data, storing it somewhere, and still not risking a lawsuit for GDPR infringement
probably scary, but I can't quite wrap my head around it
this sounds way too conspiratorial to be real though