Video-sharing service to allow racist ads on its platform

Video-sharing site YouTube has become the first major video site to allow users to post videos with racist and hateful content, the company said Friday.
The announcement was made during a keynote speech at the SXSW Interactive conference in Austin.
The site, owned by Alphabet Inc., lets users upload and share their own videos.
YouTube, which has more than 3.2 billion users worldwide, is also a popular platform for creators to promote their work.
The company is facing a series of high-profile controversies, including its recent decision to shut down one of its video-streaming services.
YouTube is also embroiled in an ongoing dispute with the FBI over whether its platform can adequately protect users from threats made through the service.
On Friday, YouTube announced that it had approved an advertising policy that allows the site to show content that it deems inappropriate for its users.
“It is our policy to remove material deemed inappropriate, offensive, or otherwise objectionable,” the company said in a statement.
“We have seen that many users have reported videos that they believe are offensive to them or others.
We will continue to evaluate this policy as more material is reported.”
According to the site’s advertising policy, covered content includes, but is not limited to: offensive or discriminatory language or images, such as profanity, sexual innuendo, or nudity; content that is harmful to people of color, women, LGBTQIA+ people, and people with disabilities; content that disparages or trivializes people based on race, ethnicity, gender, religion, disability, or national origin; material that glorifies hate groups or groups that promote violence against individuals or groups; material containing threats of harm or violence; nudity or sexually explicit material; and content that encourages illegal behavior.
The policy states that the site will take action against content deemed offensive or harmful, or that promotes illegal behavior. The platform will also remove content that violates its policies or infringes copyright or other rights.
The new policy will only apply to videos hosted by the site itself, not by third-party video-sharing services.
The move comes a day after YouTube suspended its video-streaming service, called Video-Stream, amid concerns about racist videos being posted on the platform.
“Video-Sharing is a great service and we are looking into the matter,” YouTube said in an emailed statement.
But the move follows a series of high-profile controversies, with the company facing a number of complaints and lawsuits over the past few months.
On Tuesday, the company was forced to remove a number of racist videos, including one featuring an actor posing as an African American, which had been viewed more than 1.4 million times and caused outrage.
On Thursday, a video posted to YouTube by the racist group American Renaissance surfaced.
In the video, the group claims that President Donald Trump was born in Canada, and that President Barack Obama is a Muslim.