Social media users are questioning the ethics and effects of the most popular platforms after a scandal involving Facebook in early October.

Frances Haugen, the self-proclaimed Facebook “whistleblower,” testified that Facebook’s products generate “self-hate” and “weaken our democracy,” according to the United States Senate Committee on Commerce, Science and Transportation.

Gary Kebbel, a UNL emeritus professor of journalism, produces the Mobile Me & You conference, which teaches about mobile and social media. Kebbel said that the Facebook scandal creates a “debatable discussion” of ethics.

“Facebook has an obligation, first and foremost, to its stockholders. Its job is to keep its stock high and to do whatever it can within the law to keep the stock up and keep the stockholders happy,” Kebbel said. “If Facebook internally knows that what it’s doing to increase its profits is also harming people, that’s a whole new element in the equation.”

Susan Swearer, chair of the UNL Department of Educational Psychology, said she has researched the psychological effects of the amount of time people spend online.

“It’s not just like Instagram is inherently bad or Facebook is inherently bad,” Swearer said. “It’s also how much time [is spent online].”

Swearer said that alongside the positives of social media, such as accessibility, the platforms can have negative effects depending on the user and how they use them.

“How do we control our usage of whatever it is, versus letting the device or the platform control our lives?” Swearer said. “If I’m spending all this time on Instagram and feeling badly that everyone’s having more fun than I am, then maybe I shouldn’t spend an hour scrolling through Instagram.”

Swearer recommended that users consider how much time they spend online and how it makes them feel, then reevaluate their use of social media platforms and other online outlets.

“For some people, social media platforms serve the function of feeling connected but for some, it makes them feel even more disconnected,” Swearer said.

Beyond users’ own actions on social media platforms, Facebook’s algorithms also decide which content is promoted on its platform, Kebbel said.

“Facebook administrators know that negative comments, hostile comments, engage people a lot more,” Kebbel said. “So therefore, that gets resurfaced. It becomes a self-fulfilling prophecy.”

Kebbel said that in the era of social media, we have seen the “harm that can be done” and the power of these platforms.

“I in particular, and I assume others of my age and generation, initially saw a lot of hope in the internet, open discussion, and social media,” Kebbel said. “Now, we’ve really in the past couple years been shown the serious dangers of it.”

Facebook has allowed “disinformation” and “hostile engagement” on its platform in order to drive engagement and raise advertising revenue, Kebbel said.

“The time has come for society to say we have a vested interest in truthful information,” Kebbel said. “You have either not controlled [the spread of disinformation], or clearly lied to us about trying to control it, or lied to us about your effectiveness in controlling it. You’ve blown your chance and now we’re enforcing responsibility on you.”