Sending the wrong signal
Explosion emojis. Decisions about military strikes taking place via a chat group. A journalist accidentally included in high-level war strategy planning. It sounds like a far-fetched plot for a political satire – but it was a real-world breach that unfolded with the kind of ease that should worry every organisation.
The now-infamous ‘Signalgate’ leak saw senior White House officials in the US using Signal, an encrypted messaging app, to discuss potential airstrikes in Yemen. The group chat had been set up by President Trump’s national security adviser, Michael Waltz, who inadvertently invited a reporter to join, in a slip-up that has since been attributed to an ‘auto-suggest’ feature on his iPhone.
In a cautionary tale of what can go wrong when direct messaging apps blur the lines between personal and professional use, Signalgate raises fundamental questions for any organisation about security, transparency, and confidentiality.
The problem isn’t limited to Signal. The UK’s COVID-19 Inquiry has revealed extensive use of WhatsApp among government ministers for day-to-day decision-making. It’s part of a broader shift in how we communicate — one that accelerated rapidly during the pandemic, when the need for quick, remote connection often trumped concerns about data security or governance.
Originally designed for private conversations, encrypted messaging platforms swiftly became embedded in everyday working life. The appeal is obvious: instant, informal, and always to hand.
But their use introduces a spectrum of risks, from data breaches and regulatory non-compliance to employee-rights issues and serious reputational damage.
In many organisations, adoption of these tools was organic and even welcomed as a practical lifeline. Familiar platforms helped people stay connected through lockdowns, but also ushered in a new informality in language and tone. One need only glance at the emojis exchanged during the Signalgate chats to see how quickly professional norms can shift.
The risks, however, are serious.
One is the difficulty of oversight: encrypted messaging is, by design, hard to monitor. That leaves employers in a bind. Attempts to monitor private communications can infringe employee rights and breach employment law; yet failing to monitor conversations can create vulnerabilities of its own, especially if inappropriate or offensive messages go unaddressed.
Transparency is another challenge. Whether in politics or business, key decisions made via unrecorded private channels may fall foul of legal and governance requirements. In the public sphere, this can mean breaching open records laws; in organisations, it undermines audit trails and accountability.
Data management must also comply with the Data Protection Act 2018 and the UK General Data Protection Regulation (UK GDPR). The mishandling of sensitive information, even unintentionally, can attract hefty penalties and damage long-standing relationships if customer confidentiality is compromised. And while platforms like Signal are designed to protect user data, responsibility for compliance lies squarely with the organisation.
Security, too, is a concern, and not just in hostile-state espionage scenarios such as Signalgate, where experts flagged the use of unsecured personal devices as the greatest vulnerability. Any business handling sensitive or client-related information must contend with the same risk: encrypted doesn’t mean infallible. Interception, device compromise and data leakage remain live threats, and industrial espionage is a genuine danger for many businesses.
Alongside direct messaging, the use of private social media accounts may add a further layer of complexity. Even content shared behind privacy settings can find its way into the public domain, with potentially serious consequences for brand reputation and internal trust.
The ‘auto-suggest’ blamed by the White House is also a reminder of how much we are now working with embedded AI and predictive tools. Auto-suggest features, contact prompts, and auto-correct functions are designed for convenience, but they work on machine logic, not human context. When a platform decides who you “probably meant” to message, or how you “meant” to phrase a sentence, that shortcut can carry real consequences, especially in sensitive or high-stakes environments. The more seamlessly these tools integrate with our daily communication, the easier it becomes to overlook the fact that they’re making decisions on our behalf.
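To make the point concrete, here is a minimal sketch in Python of how a naive contact auto-suggest might choose a recipient. The names and the matching logic are entirely hypothetical, and no real platform works quite this simply, but it illustrates the core issue: a similarity score, not human judgement, picks the match.

```python
# A deliberately naive contact auto-suggest, for illustration only.
# Real platforms use far richer signals, but the core behaviour is the same:
# the tool optimises for similarity to what was typed, not for intent.
from difflib import SequenceMatcher

CONTACTS = ["Mike Wallis (client)", "Mike Wallace (journalist)"]  # hypothetical

def suggest(typed: str) -> str:
    """Return the saved contact whose name most resembles the typed text."""
    def similarity(name: str) -> float:
        return SequenceMatcher(None, typed.lower(), name.lower()).ratio()
    return max(CONTACTS, key=similarity)

# Both contacts are close matches; the platform silently picks whichever
# scores fractionally higher, and the sender may never notice which.
print(suggest("Mike Wal"))
```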
The Online Safety Act 2023 may also have implications. While designed to protect users online, particularly children, its provisions could affect encrypted messaging apps, potentially requiring access to private conversations to detect harmful content. This could raise further questions around organisational use of such tools and their data protection obligations.
The upshot? Organisations need to take a proactive, structured approach:
- Establish clear communication policies, specifying which channels are approved for business use and why.
- Offer secure alternatives for internal communication, rather than simply banning popular apps.
- Train all staff, including senior leaders, in the risks, rules, and expectations around messaging and data protection.
- Provide guidance on AI-powered tools, including how to disable automated features, particularly when dealing with sensitive communications.
- Implement technical safeguards such as mobile device management (MDM) systems to secure data on both personal and work devices.
- Conduct regular audits to ensure compliance and catch unauthorised use early (a simple audit check is sketched below).
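As a flavour of what such an audit might involve, the sketch below, written in Python, assumes the organisation’s MDM system can export a list of the apps installed on each managed device. The device names, app names and approved list are all hypothetical.

```python
# A minimal compliance check against a hypothetical MDM inventory export.
# Flags any installed app that is not on the organisation's approved list.
APPROVED_APPS = {"Teams", "Slack", "Outlook"}

def audit(inventory: dict[str, set[str]]) -> dict[str, set[str]]:
    """Return, per device, any installed apps outside the approved list."""
    return {
        device: installed - APPROVED_APPS
        for device, installed in inventory.items()
        if installed - APPROVED_APPS
    }

inventory = {
    "laptop-042": {"Teams", "Outlook"},
    "phone-117": {"Signal", "Teams"},  # an unapproved messaging app on a work device
}
for device, apps in audit(inventory).items():
    print(f"{device}: unapproved apps found: {', '.join(sorted(apps))}")
```

In practice, a check like this sits alongside, not instead of, the human-side measures above: a flagged app is a prompt for a conversation, not proof of wrongdoing.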
It’s a fast-moving environment, and the breach at the highest level of the US administration shows how hard it is to keep tabs on everyone within an organisation, especially when it may be the most senior people who think it’s OK to break the rules.
But by proactively addressing the use of encrypted messaging apps, organisations are on the road to safeguarding their reputation, ensuring legal compliance, and protecting sensitive data in an increasingly complex digital environment.