
A data-analytics-id=”inline-link” href=”https://www.tomshardware.com/tag/security” data-auto-tag-linker=”true” data-before-rewrite-localise=”https://www.tomshardware.com/tag/security”>security researcher has demonstrated how a malicious data-analytics-id=”inline-link” href=”https://www.tomshardware.com/tag/google” data-auto-tag-linker=”true” data-before-rewrite-localise=”https://www.tomshardware.com/tag/google”>Google Calendar invite can prompt-inject ChatGPT and coax it into leaking private emails once Google connectors are enabled. In a post onX, on September 12, data-analytics-id=”inline-link” href=”https://x.com/Eito_Miyamura/status/1966541235306237985″ data-url=”https://x.com/Eito_Miyamura/status/1966541235306237985″ target=”_blank” referrerpolicy=”no-referrer-when-downgrade” data-hl-processed=”none”>Eito Miyamura outlines a simple scenario: An attacker sends a calendar invitation seeded with instructions and waits for the target to engage with ChatGPT and ask it to perform an action. ChatGPT then reads the booby-trapped event and follows orders to search Gmail and follow sensitive details….

![[CITYPNG.COM]White Google Play PlayStore Logo – 1500×1500](https://startupnews.fyi/wp-content/uploads/2025/08/CITYPNG.COMWhite-Google-Play-PlayStore-Logo-1500x1500-1-630x630.png)