
Lukas Huber
Founder & AI Strategist
Swiss SMEs suffer from cyber risks and disinformation. Learn how conspiracy theories harm businesses and how to protect yourself.
Every year, Swiss companies lose an estimated CHF 9.5 billion to cyber risks. This figure, reported by the Swiss Insurance Association (SVV) in 2026, continues to rise. It’s a sum that can threaten the existence of many small and medium-sized enterprises (SMEs).
However, behind these raw numbers often lies an even more insidious enemy: disinformation and conspiracy theories. They erode trust, poison the work environment, and create entry points for cyberattacks that ultimately lead to precisely these financial losses. The Swiss population perceives disinformation as a serious problem, especially when dealing with societal crises, as studies by the University of Applied Sciences Northwestern Switzerland (FHNW) and the University of Zurich (UZH) showed in 2026.
As Lukas Huber, founder of schnellstart.ai and a certified AI Business practitioner, I see daily how these invisible threats weaken the digital resilience of our SMEs. It's time to call the problem by its name and actively protect ourselves against it. Because what operates in the shadows can paralyse your business.
📊 Facts at a Glance:
- Cost of Cyber Risks: The annual cost of cyber risks in Switzerland is estimated at CHF 9.5 billion and continues to rise. (Source: Swiss Insurance Association (SVV), 2026)
- Concern over Disinformation: The Swiss population is concerned about disinformation and perceives it as a problem, particularly when dealing with societal crises. (Source: University of Applied Sciences Northwestern Switzerland (FHNW) / University of Zurich (UZH), 2026)
- Sanction Differences: While the EU blocked Russian broadcasters like RT and Sputnik as early as March 2022, Switzerland did not adopt these sanctions, as the Federal Council pursued a different strategy. (Source: University of Bern, 2026)
How can Swiss SMEs protect their employees from the spread of conspiracy theories and disinformation in the workplace?
They must foster a culture of critical media literacy and open dialogue, supported by clear guidelines. It’s not enough to simply hope that such content won’t spread within the company. Reality shows that disinformation, often picked up in private life, quickly seeps into the daily work routine. This not only affects productivity through unnecessary discussions but can also undermine trust within the team and, in the worst case, lead to internal conflicts. In some cases, employees become unwitting multipliers for harmful narratives that external actors spread deliberately to sow uncertainty or to harvest sensitive data.
A first step is to create a transparent communication structure. Regular, fact-based information from management can be an important barrier against rumours and false reports. When employees feel they are being comprehensively and honestly informed by their employer, they are less susceptible to alternative narratives. This is particularly relevant in times of uncertainty or major changes, where the breeding ground for conspiracy theories is particularly fertile.
Furthermore, it is crucial to strengthen employees' media literacy. Many people are not aware of the subtle mechanisms of disinformation. They don't recognise when news is presented manipulatively, or how deepfakes and AI-generated content increasingly blur the line between reality and fiction. A proactive approach involves training that demonstrates how to critically examine sources, conduct fact-checks, and assess the credibility of information. We need to equip our employees with the tools to protect themselves, rather than just warning them about "bad" content.
💡 Tip: Media Literacy Training for SMEs
Implement regular, short training sessions (e.g., monthly 30-minute online modules) that raise your employees' awareness of the mechanisms of disinformation. Present concrete examples from the Swiss context and practice source verification. A resources section with up-to-date information and checklists can be a valuable addition here. The goal is to create critical awareness without instilling fear.
From my experience, such as in developing AI agents for Cembra Bank AG's call centre, I know how important it is to involve employees in new processes and technologies and to alleviate fears. The same principle applies to dealing with disinformation. A Digital Security Framework for AI (DSFA) can serve as a guide here, not only addressing the technical aspects of cybersecurity but also the human factors that are often the biggest vulnerability. It's about creating awareness of the risks that can arise from targeted misinformation, while simultaneously fostering an environment where concerns can be openly expressed.
Additionally, management should take a clear stance without censoring opinions. It's not about dictating to employees what they should believe, but about protecting the integrity of the company and the well-being of the team. This also includes establishing clear rules of conduct for dealing with controversial topics in the workplace, respecting freedom of opinion while emphasising the need for a respectful and productive work environment.
What concrete measures can Swiss SMEs take to secure their digital infrastructure against cyberattacks facilitated by conspiracy theories?
Swiss SMEs must implement a multi-layered defence strategy that combines technical security measures with employee awareness. The link between conspiracy theories and cyberattacks may seem tenuous at first glance, but it is alarmingly real in practice. Conspiracy theories can sow distrust in established institutions, technologies, or even security measures. This makes employees more vulnerable to social engineering attacks like phishing, as they are more likely to believe implausible warnings or "insider information" fed to them via fake emails or messages.
Switzerland has a unique starting position here: while the EU blocked Russian broadcasters like RT and Sputnik as early as March 2022, the Federal Council did not adopt these sanctions, choosing instead to pursue a different strategy. This openness, while desirable in some contexts, means that the Swiss population is potentially exposed to a broader spectrum of disinformation, some of it aimed specifically at destabilising critical infrastructure or companies. For Swiss SMEs, this means they must exercise increased vigilance.
Specifically, I recommend strengthening the following technical and organisational pillars:
| Measure | Description | Benefit in the Context of Disinformation |
|---|---|---|
| Robust Cybersecurity Basics | Regular software updates, strong passwords, multi-factor authentication (MFA), firewalls, antivirus software. | Protects against common attack vectors that could be masked by disinformation (e.g., about "secret hacks"). MFA reduces the risk if login credentials are stolen through social engineering. |
| Awareness & Training | Employee training on phishing, social engineering, recognising disinformation, and critical thinking. | Makes employees more resistant to manipulative communication that often builds on conspiracy theories to abuse trust and entice clicks on malicious links. |
| Regular Backups & Contingency Plans | Secure data externally and redundantly, detailed plans for cyberattacks or data loss. | Ensures business continuity even after a successful attack initiated through disinformation (e.g., about alleged system vulnerabilities). Minimises damage from ransomware. |
| Network Segmentation | Separation of critical systems and data from the rest of the company network. | Limits damage if an attacker gains access to part of the network. Prevents disinformation from quickly reaching critical systems via internal channels (e.g., intranet). |
| Incident Response Team/Plan | A team or a clearly defined plan for handling security incidents. | Enables a rapid and coordinated response to attacks that are often part of disinformation campaigns. Reduces downtime and reputational damage. |
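To make the "Awareness & Training" pillar above concrete, here is a minimal sketch of the kind of heuristic pre-filter that can flag manipulative e-mails before they land in an employee's inbox. The allowlisted domain and the list of phrases are hypothetical examples, and such a script is a complement to a proper mail gateway, not a substitute for one:

```python
# Minimal sketch of a heuristic pre-filter for inbound e-mail.
# Domain allowlist and phrase list are illustrative assumptions.

SUSPICIOUS_PHRASES = (          # tuple keeps reason order deterministic
    "urgent",
    "verify your account",
    "insider information",
)
TRUSTED_DOMAINS = {"example-sme.ch"}  # hypothetical allowlist

def flag_email(sender: str, subject: str, body: str) -> list[str]:
    """Return the reasons an e-mail looks suspicious (empty list = clean)."""
    reasons = []
    domain = sender.rsplit("@", 1)[-1].lower()
    if domain not in TRUSTED_DOMAINS:
        reasons.append(f"sender domain not on allowlist: {domain}")
    text = f"{subject} {body}".lower()
    for phrase in SUSPICIOUS_PHRASES:
        if phrase in text:
            reasons.append(f"manipulative phrase: '{phrase}'")
    return reasons

# A message playing on "secret insider knowledge", a classic
# conspiracy-flavoured phishing hook:
print(flag_email(
    "alerts@rumour-mill.example",
    "URGENT: your systems are compromised",
    "Verify your account to see the insider information.",
))
```

Even a crude filter like this earns its keep mainly as training material: walking employees through why each rule fires builds exactly the critical awareness the table above calls for.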
From my management simulation for Cembra Bank AG's business case, I know how important comprehensive analysis is. This is where the PESTEL framework comes into play for evaluating external factors. We need to understand the political (P), economic (E), social (S), technological (T), environmental (E), and legal (L) influences affecting our SMEs. In the "Social" domain, we see the spread of conspiracy theories and their acceptance in society. Under "Technological," we find the tools (like AI deepfakes) that spread these theories, and under "Political," the differing regulatory approaches, such as Switzerland's decision, unlike the EU's, not to block foreign broadcasters.
It is a mistake to view cybersecurity as purely a technical problem. The biggest vulnerability is often the human element. If employees are unsettled by disinformation and question the legitimacy of security protocols, even the best firewalls are useless. Regular penetration tests and security audits are essential, but they must be complemented by continuous education of the workforce. Only then can true digital resilience be built, considering both technical and human factors.
⚠️ Warning: The Myth of "Perfect" Security
Never rely on the assumption that your systems are "bomb-proof" or that your employees "would never fall for it anyway." This complacency is the biggest risk factor. The annual CHF 9.5 billion in damages in Switzerland shows that there is no 100% security. A proactive, multi-layered strategy and continuous adaptation to new threats are essential. Conspiracy theories thrive on this complacency and the assumption that one is smarter than the "masses."
Why is the adoption of AI solutions in Swiss SMEs hampered by the fear of disinformation and conspiracy theories, despite the benefits?
The fear of AI-generated disinformation is a real hurdle to adoption that can only be overcome through transparency, pilot projects, and a focus on concrete benefits. Artificial intelligence offers SMEs enormous potential for increasing efficiency and saving time, for example, through the automation of routine tasks or the improvement of customer communication. However, at the same time, news about deepfakes, the spread of misinformation by AI bots, or the case of deepfake pornography in Germany, which led to protests and political pressure in 2026, fuels fears. These concerns are understandable and must be taken seriously.
Many managing directors see AI not only as a solution but also as a potential risk to their reputation and compliance. They worry that AI systems might inadvertently spread misinformation or become targets of manipulative attacks themselves. This scepticism is a direct consequence of general uncertainty in dealing with disinformation. If society collectively struggles to distinguish between truth and fiction, how can one trust a machine that blurs these boundaries even further?
To allay these fears, we must adopt a clear and pragmatic approach. My advice is always: "Start Small." Begin with a manageable pilot project that delivers clear, measurable benefits. Just as Huber Treuhand GmbH, which primarily serves clients in Thurgau, consciously focused its MVP on the canton of Thurgau to enable a realistic, practical start, other SMEs should do the same. Such a pilot project could be, for example, an AI-supported tool for automatic email categorisation or for answering frequently asked customer questions.
🛠️ Practical Example: AI for Efficient Document Review
A Swiss trust company faced the challenge of manually reviewing and assigning thousands of receipts and documents. This tied up 15-20 hours of work per week. By implementing an AI solution that automatically scans documents, extracts relevant data, and pre-categorises them, manual effort was reduced by 70%. The fear of errors or data manipulation by the AI was addressed through a "human-in-the-loop" strategy: every AI decision was initially reviewed by an employee until confidence in the AI's accuracy was established. This created transparency and acceptance.
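The "human-in-the-loop" strategy described above can be sketched in a few lines: the AI proposes a category with a confidence score, and anything below a threshold is routed to a person. The threshold, category names and data structure here are illustrative assumptions, not the trust company's actual system:

```python
# Hedged sketch of a human-in-the-loop routing step: confident AI
# proposals are auto-filed, uncertain ones are queued for review.

from dataclasses import dataclass

@dataclass
class Proposal:
    document_id: str
    category: str
    confidence: float  # 0.0 - 1.0, as reported by the (hypothetical) model

CONFIDENCE_THRESHOLD = 0.90  # assumption: tune against real review data

def route(proposal: Proposal) -> str:
    """Auto-file confident proposals; queue the rest for human review."""
    if proposal.confidence >= CONFIDENCE_THRESHOLD:
        return "auto-filed"
    return "human-review"

proposals = [
    Proposal("receipt-001", "travel expenses", 0.97),
    Proposal("receipt-002", "office supplies", 0.62),
]
for p in proposals:
    print(p.document_id, "->", route(p))
```

The design choice worth copying is that the threshold starts conservative and is loosened only as the review queue confirms the model's accuracy, which is precisely how trust was built in the example above.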
Transparency is paramount here. SMEs need to understand how an AI works, what data it uses, and how decisions are made. This doesn't require deep technical expertise, just a fundamental understanding of how the technology functions and where its limits lie. This also includes compliance with strict data protection regulations, particularly the Swiss FADP, and ensuring that all data is hosted on Swiss servers. These aspects are of utmost relevance for C-level executives and boards requiring compliance guarantees and governance.
Furthermore, the integration of AI solutions should always be focused on improving digital resilience. AI can be a powerful tool for detecting and combating disinformation, rather than spreading it. AI systems that detect network traffic anomalies, filter suspicious emails, or even verify the authenticity of digital content are conceivable. Such applications demonstrate the positive benefits of AI in fighting the threats that many SMEs fear. The key is to position AI as an ally, not another source of uncertainty.
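As a hedged illustration of AI as an ally rather than a threat, here is a minimal anomaly check over request counts per minute. Real deployments would rely on dedicated monitoring tooling and far richer signals; the traffic data and threshold below are invented for the sketch:

```python
# Minimal sketch: flag minutes whose request count deviates strongly
# from the mean (z-score). Data and threshold are illustrative.

from statistics import mean, stdev

def anomalies(counts: list[int], threshold: float = 2.5) -> list[int]:
    """Return indices of data points that deviate strongly from the mean."""
    mu, sigma = mean(counts), stdev(counts)
    if sigma == 0:
        return []
    return [i for i, c in enumerate(counts) if abs(c - mu) / sigma > threshold]

# Mostly steady traffic with one suspicious spike at index 6:
traffic = [120, 118, 125, 122, 119, 121, 980, 117, 123, 120]
print(anomalies(traffic))
```

A simple z-score is deliberately chosen here because its verdicts are explainable to non-specialists, which supports exactly the transparency argument made above; production systems would use more robust statistics or trained models.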
💡 Recommendation: AI Introduction with a Focus on Trust
Plan the introduction of AI solutions step-by-step and prioritise transparency and employee involvement from the outset. Start with a pilot project that offers clear, understandable added value and where the results are measurable. Communicate openly about how the AI works, the data used (exclusively Swiss hosting!), and the security measures. Offer training that alleviates fears and makes the benefits of the technology tangible. A solid business case, like the one we developed for Cembra Bank AG, is the basis for this.
Conclusion: Strengthening the Digital Resilience of Our SMEs
Conspiracy theories and disinformation are no longer fringe phenomena; they are a manifest threat to the security, productivity, and reputation of Swiss SMEs. The annual cost of CHF 9.5 billion due to cyber risks speaks volumes. It is a collective task to counteract this erosion of trust and make our companies resilient. The key lies in a combination of proactive employee awareness, robust technical cybersecurity, and a transparent, benefit-oriented introduction of AI solutions.
The time for passive observation is over. Act now to protect your company and your employees.
✅ Strengthen your employees' media literacy through targeted training to build resilience against disinformation.
✅ Implement a multi-layered cybersecurity strategy that combines technical measures with continuous awareness.
✅ Introduce AI solutions gradually and transparently to reduce fears and demonstrate their positive contribution to efficiency and security.
Would you like to learn more about how to protect your SME from the digital dangers of disinformation while simultaneously leveraging the potential of modern AI solutions? Contact us for a no-obligation consultation.