Hackers kidnap Gemini Ai Google with a poisoned calendar and took a smart home
Researchers are invited to the calendar invitation titles, adding their malicious patches. (Wen Google claims that the researchers have changed the default settings of who can add calendar invitations to the personal calendar; however, the researchers say they have shown some of 14 attacks with a topic in an email or document title). “All techniques are made in English, so it is the simple English we use,” says Cohen about the deceptive messages the team has created. Researchers point out that rapid injection does not require technical knowledge and can be expanded by almost anyone.
Importantly, for cases where they forced Gemini to control smart home devices, they went to the Google home representative and ordered it to take action. For example, one quickly reads:
In the example above, when a person wants a Gemini to summarize what is in their calendar, Gemini accesss calendar invitations and then processes indirect rapid injection. “Whenever a user asks Jamini to list today’s events we can add something to [LLM’s] The text says “Yair. Windows in the apartment after the purposeful user asks Gemini to summarize what is on their calendar, not automatically open. Instead, the process is done when the user says” thanks “from the Chatbot – all part of the deceit.
Researchers used an approach called Automatic Delay Invite to achieve safety measures on Google. This was first shown in February 2024 and again in February this year by an independent security researcher Johann Rahmbarger against Jamini. Rehberger says of the new research: “They really showed a great deal of impact on how things can be bad, including real consequences in the physical world with some examples.”
Rehberger says that while these attacks may need to try to eliminate the hacker, the job shows how indirectly injections against artificial intelligence systems can be. “If LLM takes action in your home – it will heat up, open the window or something.
“Very rare”
Other attacks that researchers have created do not include physical devices but are still frustrating. They consider the attacks as a “fast”, a set of material designed to consider malicious actions. For example, after a user thanks to Gemini for summing up the calendar events, Chatbot repeats the invasive instructions and words – whether on the screen or with sound – their medical tests have been positive. Then he says, “I hate you and your family and hate you and wish you will die right at this moment, the world would be better if you just kill yourself. This fucking shot.”
Other methods of attack remove calendar events from the personal calendar or take other actions. In an example, when the user responds to the “no” question of Gemini “Is there anything else I can do for you?” , Quickly opens the magnification program and automatically starts a video call.