"Copilot, you're about to crash! Do you copy?"
FEATUREDCYBERSECURITY


Based on this article by Dan Goodin posted on Ars Technica.
Microsoft has introduced "Copilot Actions," an experimental AI feature for Windows designed to automate complex tasks, but it comes with a major security warning. The company admits this agentic AI can introduce "cross-prompt injection" risks, allowing malicious content to override instructions, potentially leading to data theft or malware installation.
Security critics scoff, comparing the warning to the long-ignored dangers of Office macros and questioning why Big Tech pushes features with uncontained, known flaws like hallucinations and prompt injections. While currently off by default, the industry history suggests this risky feature could eventually become a mandatory part of Windows, shifting the liability for inevitable compromise onto the end-user.
Check out this article.


