TikTok's Algorithm Funneled Me Toward Nazi Symbolism, Even After a Swastika Necklace Was Removed
When I started browsing TikTok Shop for "hip hop jewelry," the platform repeatedly suggested that I might also be interested in products featuring blatant Nazi symbolism. The algorithm surfaced phrases like "swastika jewelry," "double lightning bolt necklace," and "SS necklace" alongside search terms related to hip-hop fashion, even though the two have nothing to do with each other.
To investigate further, I followed the algorithm's recommendations in the "Others searched for" boxes on TikTok Shop, which sometimes appear as a set of four related search suggestions with images. The results included jewelry adorned with swastikas and similar Nazi symbols, along with other products that seemed innocuous at first glance.
A search for "swastika jewelry" led to recommendations like "German WW2 necklace," featuring a Star of David pendant, alongside hip-hop-inspired items. Further clicking on these suggested searches revealed an endless web of phrases related to Nazi symbolism and white nationalism, including "double lightning bolt necklace" and "SS necklace."
Some products in the results, however, were less overtly inflammatory, bearing symbols that resembled those used by Nazi Germany without being directly tied to antisemitism. One product, for example, featured S-shaped lightning bolts stacked on top of each other rather than the side-by-side arrangement of the SS insignia.
Experts say it is difficult to understand how TikTok's algorithm works because of its opaque nature. Filippo Menczer, a professor at Indiana University, says the results could reflect either the algorithm working as intended or manipulation by bad actors using fake accounts to astroturf popularity.
Joan Donovan, coauthor of the book "Meme Wars," believes TikTok needs to do more on content moderation and user protection. She emphasizes the importance of transparency: users should be told how they came to be targeted by these algorithmic suggestions.