The war room adds a human dimension to the artificial intelligence tools Facebook has already deployed to detect inauthentic or manipulative activity.
“Humans can adapt quickly to new threats,” Gleicher said of the latest effort.
Chakrabarti said the new center is an important part of coordinating activity — even for a company that has been built on remote communications among people in various parts of the world.
“There’s no substitute for face-to-face interactions,” he said.
The war room was activated just weeks ahead of the US vote, amid persistent fears of manipulation by Russia and other state actors, as well as efforts to polarize or inflame tensions.
The war room is part of a stepped-up security effort announced by Facebook, which includes adding some 20,000 employees.
“With elections we need people to detect and remove (false information) as quickly as possible,” Chakrabarti said.
The human and computerized efforts to weed out bad information complement each other, according to Chakrabarti.
“If an anomaly is detected in an automated way, then a data scientist will investigate, will see if there is really a problem,” he said.
The efforts are also coordinated with Facebook’s fact-checking partners around the world, including media organizations such as AFP, as well as university experts.
Gleicher said the team will remain on high alert for any effort that could lead to false information going viral and potentially affecting the outcome of an election.
“We need to stay ahead of bad actors,” he said. “We keep shrinking the doorway. They keep trying to get in.”