There is now some clarity on how a journalist ended up in that White House group chat where a military strike was discussed. A Siri feature reportedly caused it. A feature, not a bug. National Security Advisor Mike Waltz mistakenly added Atlantic editor-in-chief Jeffrey Goldberg to a Signal group chat when he meant to add a Trump spokesperson. The error reportedly occurred when Waltz accepted an iPhone suggestion that updated his contacts, inadvertently saving Goldberg's number under the spokesperson's entry. The White House, which had authorized officials to use Signal for sensitive communications, faced criticism because the incident highlights the vulnerabilities of consumer products in sensitive government operations.
Why do we care?
No hacking, no espionage, just a poorly timed AI suggestion accepted by a human. This isn't about the tools, so don't fall down that rabbit hole. The consequences of this error would have been significantly smaller, and would not have potentially broken laws, had the information not been discussed on an inappropriate platform in the first place.
It’s also a preview of what happens when unmanaged AI assistance collides with operational trust.