How internet fraudsters exploit your godly nature to perpetrate fraud
Internet fraudsters (known in Nigeria as "Yahoo Yahoo" boys) are evolving their tactics, and their manipulation techniques now extend beyond fear and urgency. Increasingly, they are exploiting your godly nature, taking advantage of positive emotions such as empathy, compassion, and the desire to help others, warns Anna Collard, SVP of Content Strategy and CISO Advisor at KnowBe4 Africa. According to Collard, emotional manipulation lies at the heart of most social engineering attacks, and while many people are familiar with scams that induce panic or fear, far fewer recognize that scammers also weaponize kindness, curiosity, and love to achieve their ends.
Collard explains that modern cybercriminals are now using emotionally appealing schemes such as fake fundraisers and charitable drives involving children, the elderly, or victims of natural disasters to evoke compassion and trust. These scams often appear legitimate, supported by fabricated testimonials, deepfake videos, and AI-generated imagery depicting the supposed success of their causes. “These tactics are highly persuasive because they mimic the emotional triggers that motivate people to act quickly and generously,” she says.
While fear-based scams remain highly effective due to their ability to trigger panic and time pressure, positive emotions are equally potent in lowering a person’s defenses. “When people feel good about helping others, they are less likely to question the legitimacy of a request,” Collard notes. Research shows that the ‘warm glow’ effect—the psychological satisfaction of doing good—can impair critical thinking by prompting individuals to rely on emotional instincts rather than rational analysis.
This emotional vulnerability is amplified by the brain’s reward response to kindness, which creates a feedback loop that criminals exploit. Scammers often establish a sense of connection and shared purpose with their victims, deepening emotional investment and making withdrawal increasingly difficult. The sunk-cost fallacy then takes hold—once victims have donated money or invested emotionally, they feel compelled to continue helping, even when doubts arise.

Collard highlights that trust-based scams are becoming more prevalent, including fraudulent charity campaigns that imitate legitimate organisations such as UNICEF and CANSA. These scams are particularly successful in communities with strong cultural values of mutual support, such as the African philosophy of ubuntu, which emphasizes community and compassion. “Criminals exploit these values by framing their scams as community-building efforts,” she warns.
Other sophisticated examples include romance fraud and "pig butchering" scams, in which cybercriminals cultivate long-term relationships with their victims before exploiting them financially. These schemes are psychologically complex, relying not on immediate monetary requests but on prolonged emotional grooming. "These scams often involve months of trust-building, making victims feel seen and cared for before introducing financial manipulation," Collard explains.
To protect themselves, individuals must balance caution with compassion. Collard advises people to pause and verify charitable causes before donating, using independent resources to confirm legitimacy. She recommends applying a 24- to 48-hour “cooling-off period” before making any emotionally charged financial decisions, especially those involving charity or investment opportunities. “Talk to trusted friends or family before acting, and always use secure, traceable payment channels rather than cash, cryptocurrency, or prepaid cards,” she adds.
For organisations, Collard underscores the need for robust security awareness training that addresses emotional manipulation as a key vector of human risk, not just technical threats. Training modules should include scenarios around charity scams, fake volunteer initiatives, and community investment frauds. Importantly, this education must be culturally relevant, integrating local examples and familiar social contexts to make security lessons relatable. "Verification should be presented as an act of care, not cynicism," she emphasizes.
From a policy perspective, Collard recommends that companies implement structured approval processes for any form of charitable giving or external community investment, alongside clear verification guidelines to prevent fraudulent engagement.
Understanding the psychology of victims is also crucial. Many who fall prey to romance or emotional scams experience genuine psychological dependency. “It’s not as simple as telling someone to stop communicating with a scammer,” Collard says. “These relationships often feel real to victims. They need empathy, patience, and often professional guidance to rebuild trust in their own judgment.”
Ultimately, Collard concludes that the goal of cybersecurity awareness is not to foster distrust, but to protect the capacity for genuine kindness. “Being security-conscious allows people to continue helping others safely,” she notes. “Awareness is not about cynicism—it’s about preserving the authenticity of human connection while ensuring that goodwill doesn’t become a vulnerability.”