@munin I mean, what do they expect? LLMs make the computers themselves open to social engineering. I've seen over and over that social engineering on people works, no matter the training, under the right circumstances. I don't expect LLMs to ever be any different.