Today’s Tech

Technology and social media company Facebook has been in hot water recently regarding leaked internal company documents that express the company’s awareness of issues regarding child safety and the mental health of its users. 

The tens of thousands of pages of documents were the center of a Wall Street Journal series entitled “The Facebook Files,” which discusses at length how the tech giant knows that its products cause harm to its users. In response, the U.S. Senate called two hearings to examine the effects of Facebook’s products on children’s and teens’ privacy and mental health.

In an unprecedented move, Facebook decided to indefinitely suspend its plan to create Instagram Kids, an Instagram app built specifically for children under 13. Child welfare advocates and policymakers worried that the app could harm children’s privacy and mental health and increase their screen time. Instagram Kids was originally pitched as an ad-free version of Instagram with age-appropriate content, along with tools for parents to limit their children’s screen time and control who can follow and message them.

However, the Wall Street Journal disclosed internal Facebook research revealing that Instagram is especially toxic to teenage girls; in one survey, one in three teen girls said that Instagram exacerbated their existing body image issues. Child welfare advocates said that children could be exposed to harmful content through increased screen time and that Facebook would gain more opportunities to access their personal data. Law enforcement authorities also expressed concern that pedophiles could meet and communicate with children through Instagram Kids, and the National Center on Sexual Exploitation called the app an “irresponsible idea from its inception.” There are also concerns that the app’s primary audience would be children much younger than the expected 10-to-12-year-old demographic, most of whom already have Instagram accounts.

By Wednesday, Sept. 29, Facebook was in full damage-control mode, releasing heavily annotated documents discounting its own research into how its apps harm users, specifically the mental health of adult and teenage Instagram users, ahead of the company’s Senate hearing Thursday, Sept. 30. The annotations aimed to cast doubt on the scope of Facebook’s research, even on results that seem positive for the company’s reputation.

Lawmakers plan to use Facebook’s scandals to renew interest in revamping and reframing laws for protecting children online. They argue that the 1998 Children’s Online Privacy Protection Act (COPPA) has not been properly enforced and does not encompass privacy concerns regarding social media, apps, websites and video games.

The whistleblower who provided the internal documents used in the Wall Street Journal’s Facebook series was revealed Sunday, Oct. 3, during her “60 Minutes” interview to be former Facebook product manager Frances Haugen. She alleged that Facebook rolled back measures to prevent the spread of misinformation after the 2020 election and undermined safety efforts by disbanding its civic integrity team. She also commented on how social media use worsens teen girls’ body image, citing Facebook’s own findings.

“There were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money,” Haugen said in the interview.

Facebook is arguably facing the biggest crisis in the company’s history, compounded by Congress’s ongoing investigation into its handling of the Jan. 6 insurrection at the U.S. Capitol. Haugen is scheduled to testify at the second Senate hearing on Tuesday, Oct. 5.