YouTube disabled the live chat section of the streaming video about 30 minutes into the hearing because of what it called “hateful comments”, The Associated Press reported.
The incident came as executives from Google and Facebook appeared before the House Judiciary Committee to answer questions about the companies’ role in the spread of hate crimes and the rise of white nationalism in the US. They were joined by leaders of such human rights organizations as the Anti-Defamation League and the Equal Justice Society, along with conservative commentator Candace Owens.
Neil Potts, Facebook director of public policy, and Alexandria Walden, counsel for free expression and human rights at Google, defended policies at the two companies that prohibit material that incites violence or hate. Google owns YouTube.
“There is no place for terrorism or hate on Facebook,” Potts testified. “We remove any content that incites violence.”
The hearing comes as the US is experiencing an increase in hate crimes and hate groups.
There were 1,020 known hate groups in the country in 2018, the fourth straight year of growth, according to the Southern Poverty Law Center, which monitors extremism in the US. Hate crimes, meanwhile, rose 30 percent in the three-year period ending in 2017, the organization said, citing FBI figures.
Democratic Rep. David Cicilline of Rhode Island grilled the Facebook and Google executives about their companies’ responsibility for the spread of white supremacist views, pushing them to acknowledge they have played a role, even if it was unintentional. Potts and Walden conceded the companies have a duty to try to curb hate.
But the challenges became clear as Cicilline pushed Potts to answer why Facebook did not immediately remove far-right commentator Faith Goldy last week, after announcing a ban on white nationalism on the social network.
Goldy, who has asked her viewers to help “stop the white race from vanishing,” was not removed until Monday.
“What specific proactive steps is Facebook taking to identify other leaders like Faith Goldy and preemptively remove them from the platform?” Cicilline asked.
Potts reiterated that the company works to identify people with links to hate and violence and banishes them from Facebook.
The hearing was prompted by the mosque shootings last month in Christchurch, New Zealand, that left 50 people dead. The gunman livestreamed the attacks on Facebook and published a long post online that espoused white supremacist views.
But controversy over white nationalism and hate speech has dogged online platforms such as Facebook and Google’s YouTube for years.
In 2017, following the deadly violence in Charlottesville, Virginia, tech giants began banishing extremist groups and individuals espousing white supremacist views and support for violence. Facebook extended the ban to white nationalists.
Despite the ban, accounts such as one with the name Aryan Pride were still visible as of late Monday. The account read: “IF YOUR NOT WHITE friend ur own kind cause Im not ur friend.”