DHAKA — Eight days before Bangladesh's national election, click farms are selling democracy by the thousand. For just 120 taka—roughly $1—political operatives can purchase 1,000 fake Facebook reactions to attack opponents or manufacture support that doesn't exist.
A hundred and seventy million people aren't a statistic; they're a hundred and seventy million stories. But what happens when bots outnumber voters?
An investigation by The Daily Star uncovered a coordinated network of bot accounts manipulating political discourse across Bangladesh ahead of the February 12 election. Researchers purchased nearly 30,000 fake reactions across nine test posts over two days from four separate click farms, exposing an industry that threatens to undermine democratic legitimacy.
When a Dhaka University student activist shared a screenshot warning about violent threats against a fellow activist in December, her post drew over 10,000 reactions—8,500 of them mocking "haha" emojis. One in every five came from suspicious profiles with names like Kokou Khelios from Togo and Olivier Randrianjaka from Madagascar. These accounts had no plausible connection to Bangladeshi student politics. They were bots, purchased to discredit her concern.
The investigation analyzed 263 Facebook posts drawing 885,811 reactions total and identified at least 19,708 reactions from obvious bot accounts. More alarming: the same 354 bot profiles delivering fake reactions for the researchers' test posts were also active across the pages of at least six candidates running in next week's election.
The Click Farm Economy
The Daily Star contacted four bot services—Sociafy, Finix IT, Tech Dreams, and Finix IT Boosting—and found a sophisticated industry with tiered pricing. Foreign bot reactions arrive nearly instantly: 5,000 within five minutes for 500 taka. Domestic reactions, which appear more authentic, cost more—exceeding 200 taka per thousand—and take 24 to 48 hours.
Nayeem Uddin, owner of Sociafy, told investigators that "500-600 people maintain servers" of Facebook profiles across Bangladesh. Bot accounts sell for roughly 15-16 taka apiece domestically; international accounts are sourced from Thailand, Vietnam, and Pakistan.
For the Bangladeshis preparing to vote, this cheap manipulation undermines the very foundation of informed choice. When fake engagement drowns out genuine voices, voters cannot distinguish real support from purchased illusion.
Political Coordination Across Parties
The investigation uncovered four coordinated clusters operating across the political spectrum. Two pro-Jamaat clusters with 326 and 16 members launched 1,465 attacks against BNP users and 1,196 against left-wing content. A pro-Awami League cluster of 134 members and a smaller pro-BNP cluster of 12 members also deployed bots systematically.
Media outlets received 515 attacks from the pro-Jamaat clusters alone, an attempt to discredit independent journalism during a critical election period.
At least six candidates—two BNP, two Jamaat, one IAB, and one independent—had identifiable bot followers. Two BNP candidates had 6,098 combined bot followers, while two Jamaat candidates had 5,312.
Meta's Silence
Facebook's Community Standards explicitly prohibit this activity: "We do not allow attempting to or successfully selling, buying, or exchanging for engagement, such as likes, shares, views, follows, clicks."
Yet Meta did not respond to The Daily Star's requests for comment. The platform's detection systems are failing to catch organized manipulation that researchers identified in just two days of work.
For Bangladesh, a country of 170 million people with one of the world's highest social media penetration rates, Facebook isn't just a platform—it's the primary political arena. When that arena is flooded with fake voices, real democratic discourse drowns.
As voters head to the polls on February 12, they face an opponent more insidious than any candidate: an industry that can manufacture consensus, ridicule dissent, and drown truth for the price of a cup of tea. A hundred and seventy million people aren't a statistic, but bots are trying to make them one.