The Trojan Games: Fake AI and Gaming Firms Are Spreading Malware to Crypto Users via Telegram and Discord
An Unseen Intrusion into the Cryptocurrency Space by Threat Actors Masquerading as Legitimate Web3 Projects, AI Startups, and Esports Gaming Platforms
Interesting Tech Fact:
Did you know that some cyber-criminals are embedding malware into fake game mods and cheats that specifically target competitive gamers using hardware-aware evasion techniques? These stealthy malware variants, often disguised as aimbots or FPS performance boosters, can detect if they’re running in virtual machines or sandbox environments—allowing them to bypass cybersecurity research tools and infect real gamer systems undetected. This tactic has become increasingly popular in underground forums, where malware authors exploit gamers’ desire for an edge to deliver infostealers, RATs, and cryptominers without triggering antivirus defenses.
Introduction
In an age where artificial intelligence, gaming, and decentralized finance dominate digital innovation, cyber-criminals have found fertile ground to exploit the excitement. New investigations reveal a growing wave of malware campaigns leveraging fake AI companies, fraudulent Web3 gaming platforms, and synthetic esports brands to target cryptocurrency enthusiasts through social messaging platforms, most notably Telegram and Discord.
These attackers aren’t just using common phishing tactics; they’re constructing entire ecosystems of believable websites, social media personas, GitHub repositories, and application download links. Their endgame: infect unsuspecting users with infostealers such as RedLine, Raccoon Stealer, DarkStealer, and Lumma Stealer, ultimately draining cryptocurrency wallets and hijacking digital identities.
CyberLens has investigated the most recent campaign variants, the key threat actors, the techniques they’ve employed, and the platforms exploited. We also list the fraudulent domains and the names of the fake companies behind this cyber assault on the crypto world.
Anatomy of the Campaign
The Lure: Fake AI and Gaming Brands
At the center of this new wave of malware operations are fabricated companies that appear legitimate to the untrained eye. Some claim to be developing the "next generation" of AI-powered gaming, while others masquerade as blockchain-based esports platforms or tools for automated crypto trading. These entities boast professionally designed websites, active Telegram channels, and well-branded Discord servers—all designed to project legitimacy.
Some known fake entities in these recent campaigns include:
Sonar AI Bot – Posing as an AI-enhanced crypto market prediction bot
AimbotsPro – Falsely advertised as an AI esports enhancement tool
AstroGamerFX – Claimed to offer tokenized in-game NFTs and play-to-earn features
Cryptex Gaming Hub – Marketed itself as a blockchain esports tournament platform
QuantAITrade – Posed as a quant-driven trading assistant for crypto investors
All these names were associated with download links leading to malicious executables hosted on GitHub, AnonFiles, MediaFire, and other popular file-sharing platforms.
Platforms Used: Telegram & Discord as Malware Delivery Vehicles
Cyber-criminals exploit the trust and community-driven nature of Telegram and Discord. On Telegram, they create channels and bot-driven chats that simulate AI functionality. These bots often ask users to download software or link their wallets to “access premium features.”
On Discord, attackers use fake developer servers to share “alpha builds,” “early access games,” or “AI tools,” encouraging downloads from links that appear legitimate but are bundled with infostealers. Social engineering is often coupled with urgency or exclusivity: “limited beta access,” “invite-only testing,” or “airdrops for the first 100 users.”
Key Malware Families Involved
RedLine Stealer
Steals browser cookies, stored passwords, and cryptocurrency wallet information.
Distributed via fake executables named after gaming or AI apps (e.g., aimbotpro_beta.exe).
Lumma Stealer
Uses advanced obfuscation techniques and anti-VM capabilities.
Often dropped via .rar files or disguised as Discord bots or crypto wallet “enhancers.”
Raccoon Stealer v2
Recently reemerged with active Telegram distribution campaigns.
Known to target browser extensions and clipboard crypto addresses (see the detection sketch after this list).
DarkStealer
Newer infostealer distributed by threat actors behind fake AI analysis tools.
Collects credentials, autofill data, and cookies from Chromium-based browsers.
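Clipboard hijacking of the kind noted above for Raccoon Stealer v2 leaves a recognizable fingerprint: a wallet address on the clipboard is silently replaced by a different one within moments of being copied. The Python sketch below illustrates one way a defender might spot that pattern. It is a minimal, hypothetical monitor; the address regexes, polling interval, and the third-party pyperclip dependency are illustrative assumptions, not tooling tied to any of the campaigns described here.

```python
"""Minimal clipboard-hijack spot check (illustrative sketch).

Clipboard-swapping stealers watch for wallet addresses and replace them with
an attacker-controlled address. This monitor flags the tell-tale pattern:
one address replaced by a *different* address within a very short window.
Requires the third-party `pyperclip` package (pip install pyperclip).
"""
import re
import time

import pyperclip

# Rough patterns for Bitcoin and EVM-style addresses; illustrative only.
ADDRESS_PATTERNS = [
    re.compile(r"^(bc1[a-z0-9]{25,62}|[13][a-km-zA-HJ-NP-Z1-9]{25,34})$"),  # Bitcoin
    re.compile(r"^0x[a-fA-F0-9]{40}$"),                                     # Ethereum / ERC-20
]


def looks_like_address(text: str) -> bool:
    return any(p.match(text.strip()) for p in ADDRESS_PATTERNS)


def monitor(poll_seconds: float = 0.5, swap_window: float = 3.0) -> None:
    last_value = pyperclip.paste()
    last_change = time.monotonic()
    while True:
        current = pyperclip.paste()
        if current != last_value:
            now = time.monotonic()
            # Address replaced by a different address almost instantly:
            # the classic hijack signature.
            if (looks_like_address(last_value) and looks_like_address(current)
                    and now - last_change < swap_window):
                print(f"[!] Possible clipboard hijack: {last_value!r} -> {current!r}")
            last_value, last_change = current, now
        time.sleep(poll_seconds)


if __name__ == "__main__":
    monitor()
```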
How It Works: Step-by-Step Breakdown
1. Fake Brand Creation: Cybercriminals create convincing brand identities, complete with logos, taglines, and boilerplate product pages promising revolutionary AI and gaming utilities.
2. Promotion Through Social Media: These sites are circulated on platforms like X (Twitter), Reddit crypto forums, and Discord alpha groups to create buzz and curiosity.
3. Entry via Messaging Apps: Telegram bots and Discord messages are used to distribute “downloads” or “setup files” which contain the malware.
4. User Execution: Users install the software, which runs background scripts to steal browser data, crypto wallet credentials, Discord tokens, and system fingerprints.
5. Exfiltration & Monetization: Data is uploaded to command-and-control servers, usually via encrypted POST requests or Telegram API uploads. Cryptocurrency wallets are emptied, and credentials are sold on the dark web.
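Because the final exfiltration step frequently relies on the Telegram Bot API, outbound POST requests to api.telegram.org bot endpoints from ordinary workstations are a useful hunting signal. The sketch below shows one hypothetical way to sweep web-proxy logs for that pattern; the space-delimited log format it parses is an assumption for illustration and will need to be adapted to your own logging schema.

```python
"""Hunt for Telegram Bot API exfiltration in web-proxy logs (illustrative sketch).

Many stealers upload their loot via POSTs to api.telegram.org/bot<token>/...
This sketch assumes a simple space-delimited proxy log of the form:
    <timestamp> <client_ip> <method> <url>
The log format is a placeholder; adapt the parsing to your own schema.
"""
import re
import sys

# Bot-token URLs are a strong signal: normal Telegram clients do not call them.
BOT_API = re.compile(r"https://api\.telegram\.org/bot\d+:[\w-]+/(sendDocument|sendMessage)")


def hunt(log_path: str) -> None:
    with open(log_path, encoding="utf-8", errors="replace") as handle:
        for line in handle:
            parts = line.split()
            if len(parts) < 4:
                continue
            timestamp, client_ip, method, url = parts[:4]
            if method == "POST" and BOT_API.search(url):
                print(f"[!] {timestamp} {client_ip} posted to a Telegram bot endpoint: {url}")


if __name__ == "__main__":
    hunt(sys.argv[1] if len(sys.argv) > 1 else "proxy.log")
```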
Known Fake Websites and Infrastructure
This section breaks down some of the most actively used fake websites that are central to these malware distribution campaigns. These domains are crafted to appear legitimate, leveraging SSL certificates, polished branding, and convincing product narratives—all while serving as malware traps.
quantaitrade[.]site
Fake Identity: QuantAITrade – An AI-powered crypto trading bot
Description: Promoted as a predictive trading assistant using quantum AI analytics. The site offers a “Pro Version” download, which is actually a Lumma Stealer payload. Telegram bots and Discord links were used to promote its utility to crypto traders and token airdrop seekers.
astrogamerfx[.]pro
Fake Identity: AstroGamerFX – Web3-based NFT gaming platform
Description: Marketed as a play-to-earn game with AI-generated maps and NFT integration. Download links for “Astro Beta v1” led to an executable infected with RedLine Stealer. The campaign heavily targeted Discord communities focused on gaming and NFTs.
sonarbots-ai[.]io
Fake Identity: Sonar AI Bot – AI sentiment analysis for crypto
Description: Claimed to be an AI tool that reads social sentiment across Twitter and Reddit to generate buy/sell signals for crypto assets. Once installed, it harvested browser data and wallet credentials using Raccoon Stealer v2.
cryptgaminghub[.]net
Fake Identity: Cryptex Gaming Hub – Decentralized esports platform
Description: Advertised as a blockchain-powered competitive gaming tournament site. Players were enticed to download the tournament “game launcher,” which was bundled with DarkStealer malware. Promoted via crypto influencer Discords.
aimbotprotools[.]xyz
Fake Identity: AimbotsPro – AI gaming performance optimizer
Description: Allegedly an esports coaching tool using AI to enhance in-game targeting and performance. Widely distributed in gaming Telegram channels and cheat mod forums. The installer acted as a dropper for RedLine and had clipboard hijacking capabilities for crypto addresses.
aiinsighttrade[.]online
Fake Identity: AI Insight Trade – AI-based NFT value predictor
Description: Supposedly used deep learning to evaluate and predict the future value of NFTs in real time. Files offered for download were laced with Lumma Stealer and promoted via LinkedIn clone profiles and Twitter ads.
meta-gaminglabs[.]com
Fake Identity: Meta Gaming Labs – “The Future of AI in Web3 Gaming”
Description: This site showcased mock case studies, roadmap timelines, and fabricated team bios to appear like a funded startup. It offered a “Dev Kit Alpha” download that infected victims with Raccoon Stealer v2 and attempted to hijack Discord tokens for further propagation.
Key Infrastructure Techniques Used:
SSL and Cloudflare Protection: These sites use valid SSL certificates and sometimes Cloudflare to delay takedowns and obscure their hosting IPs.
GitHub & Dropbox as Payload Hosts: Malware files are often stored in public repositories or free hosting services, bypassing some AV filters.
Typosquatting Real Brands: Some sites closely mimic legitimate AI or Web3 brands, confusing users through subtle domain alterations.
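Typosquats of this kind can often be caught programmatically by measuring how close a newly observed domain sits to a watchlist of brands you care about. The sketch below uses Python's standard-library difflib as a rough similarity measure; the brand keywords, example domains, and 0.8 threshold are placeholders chosen for illustration, not indicators drawn from these campaigns.

```python
"""Flag look-alike domains against a brand watchlist (illustrative sketch).

Uses difflib from the standard library as a rough string-similarity measure.
The brand list, example domains, and threshold are assumptions for
demonstration, not a vetted blocklist.
"""
from difflib import SequenceMatcher

# Hypothetical brands to protect; populate with the names your users actually trust.
BRAND_KEYWORDS = ["metamask", "opensea", "binance"]


def looks_like_typosquat(domain: str, threshold: float = 0.8) -> list[tuple[str, float]]:
    """Return (brand, similarity) pairs that are close to, but not exactly, a brand."""
    label = domain.lower().split(".")[0]  # compare only the leftmost label
    hits = []
    for brand in BRAND_KEYWORDS:
        score = SequenceMatcher(None, label, brand).ratio()
        if threshold <= score < 1.0:
            hits.append((brand, round(score, 2)))
    return hits


if __name__ == "__main__":
    # Hypothetical candidate domains, e.g. pulled from new-domain feeds or chat links.
    for candidate in ["metamusk.io", "open-sea.app", "binanace.net"]:
        matches = looks_like_typosquat(candidate)
        if matches:
            print(f"[!] {candidate} resembles: {matches}")
```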
Who Is Behind These Campaigns?
Attribution remains murky, but several indicators point toward Eastern European threat actors and organized cybercrime groups who specialize in malware-as-a-service (MaaS). Some Telegram channels linked to the campaigns have ties to Russian-speaking malware forums and sellers advertising “custom stealers with GUI” for around $150–$300 USD.
These actors also offer affiliate programs where new members can distribute the malware in exchange for a cut of the profits from compromised wallets and stolen accounts.
The Impact on Cryptocurrency Users
Cryptocurrency traders and NFT enthusiasts, especially those seeking underground alpha tools or AI-powered crypto bots, are particularly vulnerable. The combination of FOMO (fear of missing out), anonymity, and community-driven trust makes these users easy targets.
In one recent case, a user reported a loss of over $19,000 USD in various ERC-20 tokens after downloading what they believed was a beta trading tool. Logs recovered from threat actor C2 servers revealed hundreds of wallet seed phrases and exchange credentials harvested in under 48 hours.
How to Stay Protected
Never download software from Telegram or Discord links, especially if it’s not from an official website or reputable GitHub repo.
Verify the legitimacy of AI and gaming companies by checking for external reviews, GitHub activity, or third-party audits.
Use endpoint detection tools with behavior analysis features, as many infostealers now evade traditional antivirus.
Secure your cryptocurrency wallets with hardware wallets or use multisig features when possible.
Enable 2FA everywhere—especially on exchange and trading platforms.
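One practical way to act on the first two recommendations is to refuse to run any installer whose SHA-256 hash does not match a checksum published on the project's official website or signed release notes. Below is a minimal sketch, assuming the publisher actually provides such a checksum and that you obtain it somewhere other than the Telegram or Discord message that supplied the download link.

```python
"""Verify a downloaded installer against a published SHA-256 checksum (sketch).

The file path and expected hash are supplied on the command line and are
placeholders; in practice the expected hash should come from the vendor's
official site, never from the same chat message as the download itself.
"""
import hashlib
import sys


def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    path, expected = sys.argv[1], sys.argv[2].lower()
    actual = sha256_of(path)
    if actual == expected:
        print("OK: hash matches the published checksum")
    else:
        print(f"MISMATCH: got {actual}, expected {expected} -- do not run this file")
        sys.exit(1)
```

Usage is as simple as `python verify_hash.py astro_beta_v1.exe <published-sha256>`; a mismatch means the file is not the one the publisher released, whatever the chat message claims.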
Final Thoughts
The fusion of fake AI companies and gaming firms with messaging platforms like Telegram and Discord marks a dangerous evolution in the threat landscape targeting cryptocurrency users. As cyber-criminals refine their social engineering tactics and mimic real startups with alarming accuracy, the burden of vigilance grows heavier for digital asset holders and tech-savvy investors.
As these campaigns evolve, CyberLens will continue to monitor the threat vectors closely. Stay subscribed for real-time alerts and expert analysis on how to secure your digital identity and crypto assets in this era of hyper-targeted deception.