Astroturfing
Astroturfing is the deceptive practice of hiding the sponsors of an orchestrated message or organization to make it appear as though it originates from, and is supported by, unsolicited grassroots participants. It is a practice intended to give the statements or organizations credibility by withholding information about the source's financial backers.
The implication behind the use of the term is that instead of a "true" or "natural" grassroots effort behind the activity in question, there is a "fake" or "artificial" appearance of support. It is increasingly recognized as a problem in social media, e-commerce, and politics. Astroturfing can influence public opinion by flooding platforms like political blogs, news sites, and review websites with manipulated content. Some groups accused of astroturfing argue that they are legitimately helping citizen activists to make their voices heard.
Etymology
The modern usage of the term "astroturf" is widely credited to Lloyd Bentsen, then a United States senator from Texas and later Treasury Secretary, who in 1985, faced with a barrage of postcards and letters orchestrated by insurance companies but purporting to come from concerned constituents, said that "a fellow from Texas can tell the difference between grass roots and Astroturf", referencing the brand of artificial grass.
Roles
While the term "astroturfing" often evokes images of corporate lobbying or political media manipulation, its function as a mechanism for manufacturing consent transcends liberal democracies. In their foundational work Manufacturing Consent, Edward S. Herman and Noam Chomsky argue that power is reproduced not merely through censorship but through the orchestration of discourse, in which the appearance of grassroots consensus is shaped by elite interests. This dynamic plays out in authoritarian contexts such as China, where the state has adopted astroturfing as a strategic tool to manage, rather than suppress, online expression. As Rongbin Han documents in Manufacturing Consent in Cyberspace: China's "Fifty-Cent Army", the Chinese government recruits and trains anonymous online commentators to seed pro-regime narratives across forums and comment sections, presenting them as spontaneous public sentiment. Far from simply muzzling dissent, the practice reflects a sophisticated state effort to simulate legitimacy and manage perception within digital public spheres. Ironically, as Han's research shows, these efforts often fail due to poor coordination, weak incentives, and the lingering bureaucratic logic of top-down propaganda, ultimately undermining the very trust they aim to build.
Many countries have laws prohibiting some astroturfing practices, with varying methods of enforcement. In the US, the FTC has set rules against endorsing a product without disclosing that one is paid to do so. In the EU, social networking sites may be governed by the Unfair Commercial Practices Directive, which likewise prohibits undisclosed paid endorsements and bars connected individuals from misleading readers into thinking they are regular consumers.
Various detection methods have been developed by researchers, including content analysis, linguistic analysis, authorship attribution, and machine learning.
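As an illustration of the machine-learning strand of this work, the sketch below trains a simple text classifier on a tiny hypothetical corpus of posts. The posts, labels, and feature choices are assumptions for demonstration only, not a reproduction of any published detector; real studies rely on large annotated datasets and richer features.

```python
# Minimal sketch: supervised text classification for flagging
# potentially astroturfed posts. The corpus and labels below are
# hypothetical placeholders, not real study data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training data: 1 = suspected astroturf, 0 = organic.
posts = [
    "This product changed my life, everyone should buy it today!",
    "Had mixed results; battery life is shorter than advertised.",
    "As a concerned local citizen I fully support this pipeline project.",
    "The council meeting ran long; here are my notes on the zoning vote.",
]
labels = [1, 0, 1, 0]

# Word n-grams capture stylistic cues (linguistic analysis)
# alongside topical vocabulary (content analysis).
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2), sublinear_tf=True),
    LogisticRegression(max_iter=1000),
)
model.fit(posts, labels)

# Score an unseen post; a higher probability means more astroturf-like.
new_post = "Proud community member here, this development is a blessing!"
prob = model.predict_proba([new_post])[0][1]
print(f"astroturf probability: {prob:.2f}")
```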
While these approaches have been instrumental in flagging inauthentic behavior, such as bot-like posting patterns or coordinated message drops, more recent scholarship emphasizes that astroturf detection also requires interpretive analysis of messaging strategies. Brieuc Lits, in his study of pro-shale gas lobbying campaigns, argues that astroturfing often succeeds not simply by masking sponsorship but by adopting discursive frames that mimic those of authentic civic groups. Through what Lits terms "corporate ventriloquism", private interests assume the voice of the public, strategically emphasizing values like economic freedom or energy independence to mask underlying industrial agendas. Lits claims that these language choices are not accidental; they are calculated to evoke grassroots legitimacy while marginalizing competing narratives, such as those centered on environmental harm or community health. As a result, detection now involves more than identifying false identities or automation; it demands scrutiny of how language, symbols, and values are mobilized to simulate authenticity.
In addition to content and linguistic cues, coordination-based detection methods have gained traction as a means of identifying astroturfing campaigns. Schoch et al. propose a scalable, network-based approach that focuses on identifying synchronized patterns of behavior, such as co-tweeting or co-retweeting identical messages within short time windows, as indicators of centralized coordination. This method, rooted in principal-agent theory, assumes that hired agents or campaign employees tend to "shirk", reusing content and showing repetitive, time-bounded activity. By mapping message coordination networks, the study was able to reliably distinguish astroturfing accounts from organic grassroots actors across dozens of global campaigns. Unlike bot-centric detection, this strategy targets behavioral traces unique to organized disinformation and has proven robust even when automated behavior is minimal or absent.
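The sketch below gives a minimal, self-contained rendering of this idea: it links accounts that post identical content within a short time window and flags pairs whose coordination recurs across messages. The window size, threshold, field names, and sample data are illustrative assumptions, not the parameters of Schoch et al.'s actual pipeline.

```python
# Minimal sketch of coordination-based detection in the spirit of
# Schoch et al.: accounts that repeatedly post identical content
# within a short time window are linked in a coordination network.
# Thresholds, field names, and data are illustrative assumptions.
from collections import defaultdict
from itertools import combinations

WINDOW_SECONDS = 60   # "short time window" for co-posting
MIN_CO_POSTS = 2      # repeated coordination, not a one-off overlap

# Hypothetical post log: (account, content_hash, unix_timestamp).
posts = [
    ("acct_a", "msg1", 1000), ("acct_b", "msg1", 1020),
    ("acct_a", "msg2", 2000), ("acct_b", "msg2", 2030),
    ("acct_c", "msg1", 9000),  # same text hours later: not coordinated
]

# Group posts by identical content, then link accounts whose posts
# of that content fall within the time window.
by_content = defaultdict(list)
for account, content, ts in posts:
    by_content[content].append((account, ts))

edge_weights = defaultdict(int)  # (acct, acct) -> co-posting count
for content, entries in by_content.items():
    for (a1, t1), (a2, t2) in combinations(entries, 2):
        if a1 != a2 and abs(t1 - t2) <= WINDOW_SECONDS:
            edge_weights[tuple(sorted((a1, a2)))] += 1

# Edges that recur across distinct messages suggest central coordination.
suspicious = {pair: w for pair, w in edge_weights.items() if w >= MIN_CO_POSTS}
print(suspicious)  # {('acct_a', 'acct_b'): 2}
```

In a full analysis, the resulting edge list would be treated as a coordination network whose dense components are candidate astroturfing campaigns; the behavioral signal here is independent of whether the accounts are automated.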
Definition
In political science, astroturfing is defined as the process of seeking electoral victory or legislative relief for grievances by helping political actors find and mobilize a sympathetic public, and it is designed to create the image of public consensus where there is none. It is the use of fake grassroots efforts, typically funded by corporations and political entities, to influence public opinion.
On the internet, astroturfers use software to hide their identity. Sometimes one individual operates through many personas to give the impression of widespread support for their client's agenda. Some studies suggest astroturfing can alter public viewpoints and create enough doubt to inhibit action. In the first systematic study of astroturfing in the United States, Oxford professor Philip N. Howard argued that the internet was making it much easier for powerful lobbyists and political movements to activate small groups of aggrieved citizens and give them exaggerated importance in public policy debates.
Astroturfed accounts on social media do not always require humans to write their posts; one January 2021 study detailed a "set of human-looking bot accounts" used to post political content, which operated automatically for fourteen days before being detected and suspended by Twitter. Twitter trends are often targeted by astroturfing because they are used as a proxy for popularity. A study by researchers at EPFL reported that 20% of global Twitter trends in 2019 were fake, created automatically by fake and compromised accounts tweeting in a coordinated way to mimic the grassroots organizing of regular Twitter users.
Policies and enforcement
Many countries have laws that prohibit more overt astroturfing practices. In the United States, the Federal Trade Commission may send cease-and-desist orders or impose a fine of $16,000 per day on those who violate its "Guides Concerning the Use of Endorsements and Testimonials in Advertising". The FTC's guides were updated in 2009 to address social media and word-of-mouth marketing. According to an article in the Journal of Consumer Policy, the guides hold advertisers responsible for ensuring that bloggers and product endorsers comply with them, and any endorsers with a material connection are required to provide honest reviews.
In the European Union, the Unfair Commercial Practices Directive requires that paid-for editorial content in the media carry a clear disclosure that it is a sponsored advertisement. It also prohibits those with a material connection from misleading readers into thinking they are regular consumers.
The United Kingdom has the Consumer Protection from Unfair Trading Regulations, which prohibit "falsely representing oneself as a consumer" and allow for up to two years in prison and unlimited fines for breaches. Additionally, the advertising industry in the UK has adopted many voluntary policies, such as the Code of Non-broadcast Advertising, Sales Promotion and Direct Marketing. The industry's self-regulatory body, the Advertising Standards Authority, investigates complaints of breaches. The code requires that marketing professionals not mislead their audience, including by omitting a disclosure of their material connection.
In Australia, astroturfing is regulated by Section 18 of the Australian Consumer Law, which broadly prohibits "misleading and deceptive conduct". According to the Journal of Consumer Policy, Australia's laws, introduced in 1975, are comparatively vague. In most cases they are enforced through lawsuits brought by competitors rather than by the regulator, the Australian Competition & Consumer Commission. Consumer protection authorities also coordinate across borders through the International Consumer Protection and Enforcement Network.
Legal regulations primarily target testimonials, endorsements, and statements about the performance or quality of a product. Employees of an organization may be considered to be acting as ordinary consumers if their actions are not directed by the company.
In October 2018, after denying that it had paid people to show up in support of a controversial power plant development project in New Orleans, Entergy was fined five million dollars for using the astroturfing firm The Hawthorn Group to supply paid actors who filled seats at city council meetings, crowding out real community members and creating the appearance of grassroots support.
Debate
Effectiveness
In the book Grassroots for Hire: Public Affairs Consultants in American Democracy, Edward Walker defines "astroturfing" as public participation that is perceived as heavily incentivized, as fraudulent, or as an elite campaign masquerading as a mass movement. Although not all campaigns by professional grassroots lobbying consultants meet this definition, the book finds that elite-sponsored grassroots campaigns often fail when they are not transparent about their sources of sponsorship or fail to develop partnerships with constituencies that have an independent interest in the issue. Walker highlights the case of Working Families for Wal-Mart, in which the campaign's lack of transparency led to its demise.
A study published in the Journal of Business Ethics examined the effects of websites operated by front groups on students. It found that astroturfing was effective at creating uncertainty and lowering trust in claims, thereby shifting perceptions in favor of the business interests behind the astroturfing effort. The New York Times reported that "consumer" reviews are more effective because "they purport to be testimonials of real people, even though some are bought and sold just like everything else on the commercial Internet." Some organizations feel their business is threatened by negative comments, so they may engage in astroturfing to drown those comments out. Online comments from astroturfing employees can also sway discussion through the influence of groupthink.