LinkedIn users unaware of AI data training

AI training without user awareness

LinkedIn has been using user data to train AI models without explicitly informing users or obtaining their consent.

The social network recently updated its privacy policy, revealing that it uses personal data to improve and develop its products, train AI models, provide personalized services, and gain insights. Users in the U.S. were subject to this data collection, while those in the EU, EEA, or Switzerland were not affected due to stricter data privacy rules in those regions.

LinkedIn confirmed that the AI models trained using this data include those for writing suggestions and post recommendations.

To mitigate privacy concerns, LinkedIn stated that it employs “privacy enhancing techniques,” such as redacting and removing personal information from the datasets used for AI training. Users wishing to opt out of this data use can navigate to the “Data Privacy” section in LinkedIn’s desktop settings.
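
LinkedIn has not published the details of these “privacy enhancing techniques.” As a rough illustration only, the sketch below shows what redacting personal identifiers from text before it enters a training dataset might look like, using simple regex patterns; the patterns, field names, and example posts are assumptions for demonstration, not LinkedIn’s actual pipeline.

```python
import re

# Illustrative patterns only; a production pipeline would rely on far more
# robust PII detection (e.g., named-entity recognition), not bare regexes.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE_RE = re.compile(r"\+?\d[\d\s().-]{7,}\d")

def redact_pii(text: str) -> str:
    """Replace obvious personal identifiers with placeholder tokens."""
    text = EMAIL_RE.sub("[EMAIL]", text)
    text = PHONE_RE.sub("[PHONE]", text)
    return text

# Hypothetical records standing in for user posts or feedback.
posts = [
    "Reach me at jane.doe@example.com or +1 (555) 123-4567.",
    "Great networking event last night!",
]

training_texts = [redact_pii(p) for p in posts]
print(training_texts)
# ['Reach me at [EMAIL] or [PHONE].', 'Great networking event last night!']
```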

However, exercising this option does not affect any data already used for training. LinkedIn’s parent company, Microsoft, may also use the collected data to train its own AI models. The data encompasses user interactions, posts, language preferences, and feedback shared with LinkedIn.

Privacy concerns over LinkedIn’s AI training

Privacy activists have voiced concerns over LinkedIn’s decision to opt users into AI training without explicit consent. Mariano delli Santi, legal and policy officer at the UK-based privacy advocacy nonprofit Open Rights Group (ORG), criticized the opt-out model as “wholly inadequate to protect our rights.” He stressed that opt-in consent should be legally mandated.

ORG has called for an investigation by the U.K.’s Information Commissioner’s Office (ICO) into LinkedIn’s and other social networks’ default practice of using user data for AI training, arguing that mandatory opt-in consent is the only way to protect user rights effectively. Ireland’s Data Protection Commission (DPC), LinkedIn’s lead regulator under the GDPR, said that LinkedIn would clarify its global privacy policy and introduce an opt-out setting for users who do not want their data used for AI training.

The opt-out does not apply to EU/EEA users, as LinkedIn is not using their data to train these models. The trend of platforms repurposing user-generated content to train generative AI models is not unique to LinkedIn: companies such as Tumblr, Photobucket, Reddit, and Stack Overflow also license user data for AI model training, often making it difficult for users to opt out.

This broad usage of user data for AI underscores the growing demand for comprehensive regulatory scrutiny and user-centric consent mechanisms.
