The General Services Administration wants to know what percentage of federal rulemakings receive fake public comments, which will require more data from partner agencies.
GSA kicked off its effort to fight fake comments in rulemakings Thursday, but the effort's genesis dates back to 2017. That December, the Federal Communications Commission repealed net neutrality rules treating the internet as a public utility — and nearly half of the 22 million public comments on the proposal came from stolen identities.
GSA became the managing partner of the eRulemaking Program — a shared service allowing the public to electronically review dockets and submit comments on proposed rulemakings across multiple agencies — in October.
GSA is in the process of modernizing the program with more secure, innovative technical solutions that can mitigate fake comments and sift through mass-comment campaigns.
“We’re going to take the integrity of the rulemaking process very seriously and go through an intentional, rigorous, phased process, and the initial step of that is discovery,” Tobias Schroeder, director of the eRulemaking Program, said during a public meeting Thursday. “So we want to make sure that we’re gathering all the available information we can — and I emphasize available — from the partner agencies that we interact with to get the statistics so that we can independently validate what those numbers are and identify what the risks are and respond in a prioritized way to those risks.”
A fake comment is one that falsely attributes the commenter's identity, according to the Office of Governmentwide Policy's working definition. In the net neutrality case, the identities of high-profile citizens, senators and even celebrities like Elvis Presley were stolen to post comments.
The eRulemaking Program will work with partner agencies on how they wish to respond to fake comments, Schroeder said.
Agencies can negate fake comments downstream, but that ignores their social and economic costs, said Sanith Wijesinghe, information systems engineer at federally funded research and development center MITRE.
“You now have on the public record a statement attributed to someone without their consent, and it takes quite a lot of effort to correct that,” Wijesinghe said. “Not everyone has the resources to do it.”
Fake comments likely originate in foreign countries, and an "arms race" will result as the technology behind them matures, he added.
Wijesinghe said agencies should explore alternative ways of structuring rules so that comments are less of a free-for-all and more directed at the specific groups agencies want feedback from.
The National Association of Manufacturers is a “power user” of the rulemaking process — weighing in frequently and encouraging its members to do the same — and wants to ensure any improvements to agency tools aren’t a barrier to public engagement, said Patrick Hedren, a vice president of the advocacy group.
“We want to be very careful to not amend the tool in a way that cracks down on something that’s completely normal, permissible, constitutionally valid and a positive way to engage with government,” Hedren said.
When faced with a mass-comment campaign, agencies should “look to the normative point being made,” he added.
But comment campaigns organized by advocacy groups can drown out the voices of other interests like small businesses, which have less of a presence in other parts of the regulatory process because of the costs involved.
Small businesses also risk signing on to a poorer-quality comment during a mass-comment campaign than they might submit on their own, said Oliver Sherouse, regulatory economist at the Small Business Administration.
“Writing a detailed comment letter is hard work,” Sherouse said.
Agencies and trade associations should help small businesses identify relevant rules and supporting documents early in the rulemaking process, using machine learning to identify the entities affected by a given regulation, product or program, he added. Making rules machine-readable would help even more.
A guide on writing effective comments would also help small businesses, Sherouse said.
Rulemaking agencies ultimately decide whether commenters must authenticate their identity in some way before submitting. Authentication could crack down on fake comments but also discourage legitimate responses, which is why agencies sometimes forgo it.
An empirical analysis comparing the volume of anonymous comments to the validity of the points they make would be useful before backing authentication, Sherouse said.
“It’s not difficult to figure out why a business would be reluctant to criticize its regulator on the record,” he said.