{"id":33566,"date":"2024-07-23T22:32:31","date_gmt":"2024-07-23T20:32:31","guid":{"rendered":"https:\/\/www.kaspersky.co.za\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/33566\/"},"modified":"2024-07-23T22:32:31","modified_gmt":"2024-07-23T20:32:31","slug":"it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra","status":"publish","type":"post","link":"https:\/\/www.kaspersky.co.za\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/33566\/","title":{"rendered":"How to prepare corporate cybersecurity for all-seeing AI assistants"},"content":{"rendered":"<p>Throughout May and June, the IT world watched the unfolding drama of Copilot+ Recall. First came Microsoft\u2019s <a href=\"https:\/\/www.theverge.com\/2024\/5\/20\/24159258\/microsoft-recall-ai-explorer-windows-11-surface-event\" target=\"_blank\" rel=\"nofollow noopener\">announcement<\/a> of the \u201cmemory\u201d feature named Recall, which takes screenshots of everything happening on a computer every few seconds and extracts all useful information into a shared database. Then, cybersecurity researchers <a href=\"https:\/\/medium.com\/doublepulsar\/recall-stealing-everything-youve-ever-typed-or-viewed-on-your-own-windows-pc-is-now-possible-da3e12e9465e\" target=\"_blank\" rel=\"nofollow noopener\">criticized<\/a> Recall\u2019s implementation by exposing security flaws and <a href=\"https:\/\/github.com\/xaitax\/TotalRecall\" target=\"_blank\" rel=\"nofollow noopener\">demonstrating<\/a> the potential for data exfiltration \u2014 including remote exfiltration. 
This forced Microsoft to backpedal: first <a href=\"https:\/\/blogs.windows.com\/windowsexperience\/2024\/06\/07\/update-on-the-recall-preview-feature-for-copilot-pcs\/\" target=\"_blank\" rel=\"nofollow noopener\">stating<\/a> the feature wouldn\u2019t be enabled by default and promising improved encryption, and then delaying the mass rollout of Recall entirely \u2014 opting to first test it in the Windows Insider Program beta. Despite this setback, Redmond remains committed to the project and plans to launch it on a broad range of computers \u2014 including those with AMD and Intel CPUs.<\/p>\n<p>Within the context of devices in the workplace \u2014 especially if a company allows BYOD \u2014 Recall clearly violates corporate data retention policies and significantly amplifies potential damage if a network is compromised by infostealers or ransomware. What\u2019s more concerning is the clear intention of Microsoft\u2019s competitors to follow this trend. The recently announced <a href=\"https:\/\/www.apple.com\/apple-intelligence\/\" target=\"_blank\" rel=\"nofollow noopener\">Apple Intelligence<\/a> is still shrouded in marketing language, but the company claims that Siri will have \u201conscreen awareness\u201d when processing requests, and text-handling tools available across all apps will be capable of either local or ChatGPT-powered processing. While Google\u2019s equivalent features remain under wraps, the <a href=\"https:\/\/www.pcworld.com\/article\/2357511\/google-qa-how-chromebooks-are-moving-into-the-ai-era.html\" target=\"_blank\" rel=\"nofollow noopener\">company has confirmed<\/a> that Project Astra \u2014 the visual assistant announced at Google I\/O \u2014 will eventually find its way onto Chromebooks, utilizing screenshots as the input data stream. 
How should IT and cybersecurity teams prepare for this deluge of AI-powered features?<\/p>\n<h2>Risks of visual assistants<\/h2>\n<p>We previously discussed how to mitigate the risks of unchecked ChatGPT and other AI assistants\u2019 usage by employees <a href=\"https:\/\/www.kaspersky.com\/blog\/how-to-use-chatgpt-ai-assistants-securely-2024\/50562\/\" target=\"_blank\" rel=\"noopener nofollow\">in this article<\/a>. However, there we focused on the deliberate adoption of additional apps and services by employees themselves \u2014 a new and troublesome breed of <a href=\"https:\/\/www.kaspersky.com\/blog\/shadow-it-as-a-threat\/34938\/\" target=\"_blank\" rel=\"noopener nofollow\">shadow IT<\/a>. OS-level assistants present a more complex challenge:<\/p>\n<ul>\n<li>The assistant can take screenshots, recognize text on them, and store any information displayed on an employee\u2019s screen \u2014 either locally or in a public cloud. This occurs regardless of the information\u2019s sensitivity, current authentication status, or work context. For instance, an AI assistant could create a local, or even cloud-based, copy of an encrypted email requiring a password.<\/li>\n<li>Captured data might not adhere to corporate data-retention policies; data requiring encryption might be stored without it; data scheduled for deletion might persist in an unaccounted copy; data meant to remain inside the company\u2019s perimeter might end up in a cloud \u2014 potentially under an unknown jurisdiction.<\/li>\n<li>The problem of unauthorized access is exacerbated since AI assistants might bypass additional authentication measures implemented for sensitive services within an organization. 
(Roughly speaking: to view financial transaction data, even after initial authorization in the system, you would normally have to start an RDP session, present a certificate, log in to the remote system, and enter the password again \u2014 or you could simply read the same data from an AI assistant such as Recall.)<\/li>\n<li>Control over the AI assistant by the user and even IT administrators is limited. Accidental or deliberate <a href=\"https:\/\/www.bleepingcomputer.com\/news\/microsoft\/microsoft-removes-copilot-app-incorrectly-added-on-windows-pcs\/\" target=\"_blank\" rel=\"nofollow noopener\">activation of additional OS functions<\/a> at the manufacturer\u2019s command is a known issue. Essentially, Recall, or a similar feature, could appear on a computer unexpectedly and without warning as part of an update.<\/li>\n<\/ul>\n<p>Although all the tech giants claim to pay close attention to AI security, the practical implementation of their security measures must stand the test of reality. Microsoft\u2019s initial claims about data being processed locally and stored in encrypted form proved inaccurate: the encryption in question was in fact just BitLocker, which effectively protects data only when the computer is turned off. Now we have to wait for cybersecurity professionals to assess Microsoft\u2019s updated encryption and whatever Apple eventually releases. Apple claims that some information is processed locally, some within its own cloud using secure computing principles without retaining data after processing, and some is transmitted to OpenAI in anonymized form. 
While Google\u2019s approach remains to be seen, the company\u2019s track record <a href=\"https:\/\/www.404media.co\/google-leak-reveals-thousands-of-privacy-incidents\/\" target=\"_blank\" rel=\"nofollow noopener\">speaks for itself<\/a>.<\/p>\n<h2>AI assistant implementation policies<\/h2>\n<p>Considering the substantial risks and overall lack of maturity in this domain, a conservative strategy is recommended for deploying visual AI assistants:<\/p>\n<ol>\n<li>Collaboratively determine (involving IT, cybersecurity, and business teams) which employee workflows would benefit from visual AI assistants significantly enough to justify the additional risks.<\/li>\n<li>Establish a company policy and inform employees that the use of system-level visual AI assistants is prohibited. Grant exceptions on a case-by-case basis for specific uses.<\/li>\n<li>Take measures to block the spontaneous activation of visual AI. Utilize <a href=\"https:\/\/learn.microsoft.com\/en-us\/windows\/client-management\/manage-recall\" target=\"_blank\" rel=\"nofollow noopener\">Microsoft group policies<\/a> and block the execution of AI applications at the EDR or EMM\/UEM level. Keep in mind that older computers might not be able to run AI components due to technical limitations, but manufacturers are working to bring these features to older systems as well.<\/li>\n<li>Ensure that security policies and tools are applied to all devices used by employees for work \u2014 including personal computers.<\/li>\n<li>If the first-stage discussion identifies a group of employees that could significantly benefit from visual AI, launch a pilot program with just a few of these employees. IT and cybersecurity teams should develop recommended visual assistant settings tailored to employee roles and company policies. 
In addition to configuring the assistant, implement enhanced security measures (such as strict user authentication policies and more stringent SIEM and EDR monitoring settings) to prevent data leaks and protect the pilot computers from unwanted\/malicious software. Ensure that the AI assistant is activated by an administrator using these specific settings.<\/li>\n<li>Regularly and thoroughly analyze the pilot group\u2019s performance against that of a control group, along with the behavior of company computers with the AI assistant activated. Based on this analysis, decide whether to expand or discontinue the pilot program.<\/li>\n<li>Appoint a dedicated resource to monitor cybersecurity research and threat intelligence regarding attacks targeting visual AI assistants and their stored data. This will allow for timely policy adjustments as this technology evolves.<\/li>\n<\/ol>\n<input type=\"hidden\" class=\"category_for_banner\" value=\"mdr\">\n","protected":false},"excerpt":{"rendered":"<p>Although Microsoft has radically revised the rollout plan for its controversial Recall feature, cybersecurity teams can\u2019t afford to ignore the issue of &#8220;AI 
onlookers&#8221;.<\/p>\n","protected":false},"author":2706,"featured_media":33567,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"_acf_changed":false,"footnotes":""},"categories":[1999,3020,3021],"tags":[1140,14,2141,3726,22,38,43,3734,321,131],"class_list":{"0":"post-33566","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-business","8":"category-enterprise","9":"category-smb","10":"tag-ai","11":"tag-apple","12":"tag-business","13":"tag-copilot","14":"tag-google","15":"tag-microsoft","16":"tag-privacy","17":"tag-recall","18":"tag-technology","19":"tag-tips"},"hreflang":[{"hreflang":"en-za","url":"https:\/\/www.kaspersky.co.za\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/33566\/"},{"hreflang":"en-in","url":"https:\/\/www.kaspersky.co.in\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/27758\/"},{"hreflang":"en-ae","url":"https:\/\/me-en.kaspersky.com\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/23088\/"},{"hreflang":"en-us","url":"https:\/\/usa.kaspersky.com\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/30440\/"},{"hreflang":"en-gb","url":"https:\/\/www.kaspersky.co.uk\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/27968\/"},{"hreflang":"ru","url":"https:\/\/www.kaspersky.ru\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/37892\/"},{"hreflang":"x-default","url":"https:\/\/www.kaspersky.com\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/51769\/"},{"hreflang":"ru-kz","url":"https:\/\/blog.kaspersky.kz\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-astra\/28065\/"},{"hreflang":"en-au","url":"https:\/\/www.kaspersky.com.au\/blog\/it-compliance-of-microsoft-copilot-recall-apple-intelligence-google-
astra\/33901\/"}],"acf":[],"banners":"","maintag":{"url":"https:\/\/www.kaspersky.co.za\/blog\/tag\/ai\/","name":"AI"},"_links":{"self":[{"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/posts\/33566","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/users\/2706"}],"replies":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/comments?post=33566"}],"version-history":[{"count":0,"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/posts\/33566\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/media\/33567"}],"wp:attachment":[{"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/media?parent=33566"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/categories?post=33566"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.kaspersky.co.za\/blog\/wp-json\/wp\/v2\/tags?post=33566"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}