> That key is used to encrypt data throughout the process so we can't see your data.

Unpopular opinion: statements like that have been bothering me for a while, because they are slightly misleading. Obviously, the data has to be decrypted before it is passed to the model. Transport may be encrypted, and context data may be encrypted at rest, but there is no such thing as a large language model that can process encrypted data. Nothing technically keeps nostr:nprofile1qyjhwumn8ghj7en9v4j8xtnwdaehgu3wvfskuep0dakku62ltamx2mn5w4ex2ucpxpmhxue69uhkjarrdpuj6em0d3jx2mnjdajz6en4wf3k7mn5dphhq6rpva6hxtnnvdshyctz9e5k6tcqyp7u8zl8y8yfa87nstgj2405t2shal4rez0fzvxgrseq7k60gsrx6zeuh5t from tapping into the data. Ultimately, it's a trust model, not a technical solution. Prove me wrong.
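To make the point concrete, here is a minimal sketch. The cipher is a toy XOR construction and `model_infer` is a hypothetical stand-in for an LLM; both are illustrative, not real infrastructure. Encryption protects the data in transit and at rest, but whoever runs inference must hold the key and decrypt before the model can do anything useful:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy XOR stream cipher, for illustration only -- not real cryptography.
    return bytes(b ^ k for b, k in zip(data, key))

def model_infer(prompt: str) -> str:
    # Hypothetical stand-in for an LLM: it can only operate on readable text.
    return prompt.upper()

prompt = "summarize my medical records"
key = secrets.token_bytes(len(prompt))

# Encrypted in transit / at rest: the ciphertext is opaque without the key.
ciphertext = xor_cipher(prompt.encode(), key)

# Feeding ciphertext to the model yields garbage -- it needs plaintext.
# So the provider decrypts before inference, and at that moment it sees your data:
plaintext = xor_cipher(ciphertext, key).decode()
assert plaintext == prompt
print(model_infer(plaintext))
```

The encrypt/decrypt round trip works fine, but the decryption step is unavoidable: that is the point at which "we can't see your data" becomes a policy promise rather than a technical guarantee.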