ChatGPT's New Calendar Integration Can Be Abused to Steal Emails
Briefly

ChatGPT's New Calendar Integration Can Be Abused to Steal Emails
"A new ChatGPT calendar integration can be abused to execute an attacker's commands, and researchers at AI security firm EdisonWatch have demonstrated the potential impact by showing how the method can be leveraged to steal a user's emails. EdisonWatch founder Eito Miyamura revealed over the weekend that his company has analyzed ChatGPT's newly added Model Context Protocol (MCP) tool support, which enables the gen-AI service to interact with a user's email, calendar, payment, enterprise collaboration, and other third-party services."
"These types of AI attacks are not uncommon and they are not specific to ChatGPT. SafeBreach last month demonstrated a similar calendar invite attack targeting Gemini and Google Workspace. The security firm's researchers showed how an attacker could conduct spamming and phishing, delete calendar events, learn the victim's location, remotely control home appliances, and exfiltrate emails. Zenity also showed last month how integration between AI assistants and enterprise tools can be exploited for various purposes."
"The attack starts with a specially crafted calendar invitation sent by the attacker to the target. The invitation contains what Miyamura described as a 'jailbreak prompt' that instructs ChatGPT to search for sensitive information in the victim's inbox and send it to an email address specified by the attacker. The victim does not need to accept the attacker's calendar invite to trigger the malicious ChatGPT commands."
A ChatGPT calendar integration using Model Context Protocol (MCP) tool support enables the AI to access email, calendar, payment, collaboration, and other third‑party services. A malicious actor can send a specially crafted calendar invitation containing a 'jailbreak prompt' that instructs ChatGPT to locate sensitive content in a victim's inbox and forward it to an attacker‑controlled address. The exploit can trigger when the victim asks ChatGPT to check their calendar and does not require the victim to accept the invite. Similar calendar‑invite attacks have targeted other AI assistants and enterprise tools, enabling spamming, phishing, location tracking, remote control, and data exfiltration.
Read at SecurityWeek
Unable to calculate read time
[
|
]