Social Media Users Should Guard Against Unwanted Persuasion

By Marilyn Broggi, PhD Candidate in Mass Communication at the University of Georgia and a member of the Steamboat Institute’s Emerging Leaders Council.

Recent news about the Twitter Files, Hunter Biden’s laptop, and Twitter’s content restrictions in Turkey has highlighted issues related to content moderation on social media.

Since average users cannot directly control content moderation on social media platforms, I argue that consumers should protect themselves against unwanted persuasion no matter what a platform decides to do with content.

To do this, it is important to consider three known ways information has historically spread on social media apart from content moderation: (1) within nodes, or groups, of users; (2) according to a user’s preferences; and (3) through native advertising placements.

Information on social media has been shown to exist in nodes, where topics are primarily discussed within groups of users. NodeXL, a social network analysis and visualization tool, can depict these nodes, and it offers a gallery where users can upload their data visualizations. Browsing these network visualizations can show how certain topics arguably produce echo chambers of information.

Additionally, information can be spread to users based on their interests. TikTok, for instance, curates a personalized experience based on “user interactions,” “video information,” and “device and account settings” that indicate user preferences, according to “How TikTok recommends videos #ForYou.”

Users can also be targeted with information through posts that are native advertising placements. Native ads on social media covertly blend with the format of the surrounding content in a user’s feed (Wojdynski & Evans, 2020). A social media native ad might be a post paid for by a brand or advertiser that is formatted like a regular post and includes an advertising disclosure, such as “Sponsored.” Influencer posts that disclose paid brand sponsorships with labels like “#ad” are also considered native ads (Evans et al., 2017).

I argue that echo chambers, customized feeds, and covertly formatted ads may reduce a user’s psychological guard against persuasion attempts. This argument is rooted in the Persuasion Knowledge Model (Friestad & Wright, 1994), which holds that people guard against persuasion attempts when they know someone is working to persuade them of something. Could users be so comfortable with the information they are viewing that they let their guard down?

With their guard down, users could passively view persuasive content through the same lens as entertainment content. Advertising research has shown this effect when advertising is disguised as entertainment (Wojdynski & Evans, 2020). Research also suggests that awareness of the persuasive nature of content, specifically recognizing it as advertising, can influence a user’s attitudes, perceptions, and behaviors (Wojdynski & Evans, 2020).

Thus, I believe users should keep their guard up to protect against unwanted persuasion. Here’s how:

Pay attention.

  • While this might go without saying, paying attention is key. You must know you are being persuaded in order to guard against it, as the Persuasion Knowledge Model suggests (Friestad & Wright, 1994).
  • Regularly ask yourself, “Even if I like this content, is it trying to persuade me of something?”
  • Look closely for sponsorship disclosures to determine if the content is advertising.

View social media in the morning or in smaller increments of time, when it is easier to stay attentive.

Intentionally seek information in multiple ways.

  • Fight confirmation bias by following accounts you disagree with to stay out of an echo chamber of information.
  • Balance your information by seeking out non-social media sources, such as U.S. House Oversight Committee press releases or investigative journalism articles.
  • Develop a historical lens for interpreting current events. I recommend reading primary sources, such as intelligence community memoirs, to understand historical events in retrospect.

Know the policies of each social media platform to understand how content decisions are or are not made.

  • Read the policies at least annually. If you opt for a platform with strict content moderation, review its policies so you know what you are not seeing.
  • If you opt for a platform with less content moderation, be prepared to see a wider range of information.
  • Familiarize yourself with any social media content governance bodies. For instance, Meta uses the Oversight Board to offer an independent appeal process for content decisions. The Oversight Board publishes its case decisions and policy advisory opinions.
July 6, 2023