The UK’s National Cyber Security Centre (NCSC) has warned that personal information entered into chatbots like ChatGPT is stored and is likely to be used for developing the LLM service or model.
This means the LLM provider (or its partners and contractors) may be able to read queries, and could incorporate them into future versions of the model, raising privacy concerns. The NCSC has also warned that this stored information could be exposed through hacks, leaks, or being accidentally made publicly accessible, and that aggregating information across multiple queries made under the same login poses a further privacy risk.
The advice is not to include sensitive information in queries to public LLMs, and not to submit queries that would cause problems if they were made public.
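One practical way to follow this advice is to screen prompts for obvious personal identifiers before they leave your machine. The sketch below is purely illustrative (it is not an NCSC tool or recommendation): it redacts a few hypothetical patterns, such as email addresses and phone numbers, from a prompt. A real deployment would need far broader coverage (names, addresses, account numbers, internal hostnames, and so on).

```python
import re

# Hypothetical minimal patterns for illustration only; real-world
# redaction needs much more comprehensive detection.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
    "PHONE": re.compile(r"\+?\d[\d\s-]{7,}\d"),
}

def redact(prompt: str) -> str:
    """Replace matches of each pattern with a [LABEL] placeholder."""
    for label, pattern in PATTERNS.items():
        prompt = pattern.sub(f"[{label}]", prompt)
    return prompt

query = "Summarise this email from jane.doe@example.com, phone +44 7700 900123."
print(redact(query))  # → Summarise this email from [EMAIL], phone [PHONE].
```

Regex-based redaction is a blunt instrument: it cannot catch free-text disclosures (a person's name, a confidential project description), so it complements rather than replaces the NCSC's core advice of simply not submitting sensitive material.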