

Technology

Former Meta employee tells Senate the company failed to protect teens' safety

Meta Logo. Photo Credit: Alamy


In testimony before a U.S. Senate panel on Tuesday, a former Meta (META.O) employee said the parent company of Facebook and Instagram knew about the harassment and other harms that young people were experiencing on its platforms but failed to act on them.

The former employee, Arturo Bejar, said he served as a director of engineering for Facebook's Protect and Care team from 2009 to 2015 and worked on well-being at Instagram from 2019 to 2021.

Bejar testified about social media's effects on adolescent mental health before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law.

In prepared remarks released before the hearing, he said, “It’s time that young users have the tools to report and suppress online abuse and it’s time that the public and parents understand the true level of harm posed by these ‘products’.”

Bejar’s testimony coincides with a bipartisan effort in Congress to pass legislation requiring social media companies to give parents tools to keep children safe online.

Bejar told the hearing that his work at Meta aimed to shape the design of Facebook and Instagram in ways that would encourage more positive behavior and give young people tools to deal with negative experiences.

In response, Meta reaffirmed its commitment to keeping young people safe online, pointing to its support of the user surveys Bejar cited in his testimony and to tools it has developed, such as anonymous alerts for potentially harmful content.

“Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” read a statement from Meta. “All of this work continues.”

Bejar told senators that he regularly met with senior company leaders, including Chief Executive Mark Zuckerberg, and that at the time he believed they supported the work. He later concluded, however, that the executives had decided “time and time again not to tackle this issue,” as he put it in his testimony.

In a 2021 email, Bejar sent internal statistics to Zuckerberg and other senior executives showing that 24.4% of children aged 13 to 15 had reported receiving unsolicited sexual advances, and that 51% of Instagram users had reported a harmful experience on the platform in the previous seven days.

He also told them that his 16-year-old daughter had received offensive images and sexist remarks but had no effective way to report the incidents to the company. The Wall Street Journal was the first to report on the email’s existence.

In his testimony, Bejar said that during one discussion, Chris Cox, Meta’s Chief Product Officer, was able to recall specific figures about risks to teenagers. “I found it heartbreaking because it meant that they knew and were not acting on it,” Bejar added.



