The Raila Deepfake Incident: A Legal Wake-Up Call on Cybercrime, AI, and Crypto Scams in Kenya

On September 18, 2025, Kenyans witnessed a digital shockwave: Former Prime Minister Raila Odinga’s verified X (formerly Twitter) account was compromised, and a deepfake video of him promoting a new cryptocurrency token — allegedly built on Solana — was shared with his 2 million+ followers. Within hours, the video was flagged as fake, the post was deleted, and a correction was issued from the same account. But the damage was already done.
This incident isn’t just about a high-profile hack. It’s a flashing red light on the legal vulnerabilities around digital identity, AI deepfakes, and unregulated crypto promotions in Kenya and beyond.
In this post, we break down the legal implications and what regulators, platforms, and the public should prepare for next.
What Happened: The Raila Deepfake Case at a Glance
- A deepfake video of Raila Odinga appeared on his official X account, seemingly endorsing a crypto token called “Kenya Token.”
- The video claimed the token would support Kenya’s economic growth and was backed by his office, claims that were entirely false.
- The post gained traction before it was taken down. Odinga’s team confirmed both that the video was fake and that his account had been hacked.
- The scam appeared aimed at boosting interest in the token — possibly for a pump-and-dump scheme or phishing attempt.
1. Deepfakes and Digital Identity: Who Owns Your Face?
Under current Kenyan law, there is no specific regulation targeting the malicious use of deepfakes. This leaves public figures, celebrities, and ordinary citizens vulnerable to identity misuse.
Legal gaps include:
- No explicit criminalization of synthetic media impersonation.
- No fast-track takedown process required of platforms like X.
- No civil remedies tailored for AI-generated defamation or impersonation.
⚖️ Legal Insight:
Deepfake impersonation, when used to deceive or defraud, should qualify as a form of identity theft under existing laws — but explicit legal language is still lacking.
Recommended Reforms:
- New legislation addressing AI-generated identity misuse.
- Digital likeness rights for public figures and citizens.
- Mandating deepfake detection and labeling tech on social platforms.
2. Crypto Promotion & Financial Fraud: A Legal Grey Zone
This incident also exposed how easily crypto assets can be fraudulently marketed using the reputation of public figures.
Current legal challenges:
- Kenya lacks specific laws governing the endorsement of digital assets.
- There’s no requirement for disclosure or regulatory approval before promoting a token.
- Victims of such scams often have no recourse, especially if the asset is decentralized or foreign-based.
⚖️ Legal Insight:
Kenya’s Virtual Asset Service Providers (VASP) Bill, 2025 is set to establish a regulatory framework for virtual assets and cryptocurrencies and for the service providers operating in the crypto space, addressing licensing, oversight, and the regulation of anonymity-enhancing services. Additionally, crypto scams using impersonation may fall under existing fraud laws. However, the decentralized nature of Web3 makes enforcement complex, especially when assets and platforms exist outside local jurisdiction.
Suggested Solutions:
- Guidelines for financial influencers and public figures when referencing crypto.
- A framework from the Capital Markets Authority (CMA) or the Central Bank of Kenya (CBK) for crypto advertising and token sales, going beyond general regulation of the sector.
- Stronger disclosure obligations for anyone promoting or appearing to promote digital assets.
3. Platform Responsibility: How Accountable Is X?
Social media platforms are increasingly central to both public discourse and cybercrime. In this case, X became the vehicle of the fraud, whether through compromised security or inadequate safeguards.
Platform liability questions:
- What duty of care does X owe verified users, especially those with large public followings?
- Is X liable for failing to prevent or quickly remove deepfake scams?
- Can public figures demand restorative or compensatory action from platforms after such breaches?
⚖️ Legal Insight:
Kenyan law doesn’t yet impose platform liability for user-generated content — but global trends are moving toward co-regulation and intermediary liability frameworks, especially around financial or reputational harm.
4. The Bigger Picture: Why This Matters for Kenya’s Tech Future
Kenya is quickly positioning itself as a leader in FinTech, AI, and digital innovation. But this growth cannot come at the cost of public safety, legal certainty, or institutional trust.
If deepfakes can be used to launch fraudulent crypto schemes via hacked, verified accounts, how can citizens trust the digital economy?
This is a regulatory crossroads:
- Do we wait for more damage to be done?
- Or do we act now with smart, adaptive legislation?
Final Thoughts: What Lawyers, Regulators, and Technologists Must Do Next
The Raila Odinga deepfake incident should be a case study in urgency. As lawyers in tech and finance, we must:
- Advocate for AI and digital identity laws that protect real people in real time.
- Push for crypto promotion regulations that align with investor protection principles.
- Call on platforms to implement stronger safeguards and verification protocols for high-risk accounts.
- Collaborate across sectors — law, tech, finance, and civil society — to create ethical frameworks for innovation.
About the Author
Ondago Bildad is a lawyer specializing in technology, fintech, and digital innovation law. With years of experience advising startups, financial institutions, and regulators, Ondago focuses on legal strategy for emerging tech risks in Africa.
📩 For media inquiries, speaking engagements, or consultations, Contact Info or CTA
