During a recent Build event, Microsoft’s head of security for AI, Neta Haiby, unintentionally exposed sensitive information about Walmart’s plans to utilize Microsoft’s AI tools. The live session was briefly muted and the camera was turned downward after protesters interrupted the talk, but it quickly resumed once security personnel escorted the protesters out. However, moments later, Haiby accidentally switched to Microsoft Teams while sharing her screen, revealing confidential internal messages about Walmart’s upcoming deployment of Microsoft Entra and AI Gateway services.
Haiby was co-hosting the session on best security practices for AI alongside Sarah Bird, Microsoft’s head of responsible AI. Their discussion was interrupted by two former Microsoft employees protesting the company’s cloud contracts with the Israeli government. One protester shouted, “Sarah, you are whitewashing the crimes of Microsoft in Palestine. How dare you talk about responsible AI when Microsoft is fueling the genocide in Palestine?” The protest was led by Hossam Nasr, an organizer with No Azure for Apartheid and a former Microsoft employee who was dismissed after holding a protest outside Microsoft’s headquarters for Palestinians killed in Gaza.
Walmart, one of Microsoft’s major corporate clients, already relies on Microsoft’s Azure OpenAI service for its AI initiatives. In leaked Teams messages, a Microsoft cloud solution architect mentioned that “Walmart is ready to rock and roll with Entra Web and AI Gateway.” The chat also included remarks from a Walmart AI engineer stating, “Microsoft is WAY ahead of Google with AI security. We are excited to go down this path with you.” Microsoft has not responded to requests for comment on the protest or the leaked messages.
Both protesters involved in the disruption were former Microsoft employees. One of them, Vaniya Agrawal, also gained attention recently when she interrupted speeches by Bill Gates, Steve Ballmer, and Satya Nadella during Microsoft’s 50th anniversary event. Agrawal reportedly left Microsoft shortly before the protest, after giving two weeks’ notice, according to an email seen by The Verge.
This incident follows Microsoft’s recent announcement of an internal review, assisted by an external firm, to assess how its technologies are being used in the ongoing conflict in Gaza. The company maintains that its relationship with Israel’s Ministry of Defense (IMOD) is a standard commercial arrangement and that there is no evidence suggesting Microsoft’s Azure, AI, or other software has been used to harm individuals, or that IMOD has violated Microsoft’s terms of service or AI Code of Conduct.