
What People Are Getting Wrong This Week: How Short Videos Fuel the Spread of Conspiracy Theories

Posted by qimuai · 5 reads · First-hand translation



Source: https://lifehacker.com/entertainment/what-people-are-getting-wrong-this-week-the-short-video-to-conspiracy-theory?utm_medium=RSS

Summary:

An online creator recently ran an experiment exposing how social media platforms push conspiracy-theory content. The results show that every major short-video platform, during ordinary browsing, will recommend unverified conspiracy-theory videos to users within a short time.

The experiment used blank social media accounts that searched for and engaged with content on three keywords: "dinosaurs," "the Vietnam War," and "the 2000 U.S. presidential election." The data: on TikTok, a user needed to watch an average of only 114 videos (about 57 minutes) before encountering conspiracy content; YouTube Shorts and Instagram Reels took 230 videos (about 1 hour 57 minutes) and 275 videos (about 138 minutes), respectively. Typical conspiracy content surfaced in the test included claims that 9/11 was an inside job, theories about the "true purpose" of the pyramids, and a phone that supposedly shows a parallel dimension.

Although the platforms differ in how quickly they surface conspiracy theories, the experiment shows that a user passively encounters misinformation within roughly the running time of a single movie. One interpretation is that the algorithms favor attention-grabbing content: the truth is often undramatic, while conspiracy theories give viewers the illusion of holding exclusive knowledge. There is currently no evidence that the tech companies have deliberate political aims, but the phenomenon does illustrate the risk that recommendation algorithms, optimizing for engagement and revenue, pose to the online information environment.

Full translation:

Did you know you can customize Google search to filter out garbage? A few simple settings improve result quality, such as setting Lifehacker (the site I write for) as a preferred source.

While a widely shared piece of misinformation such as "The Rapture" or the "MedBed" scam still surfaces occasionally, the fragmented information ecosystem has all but killed off the grand, overarching conspiracy theory. Big ideas like "we never went to the moon" no longer unite the dumbest minds; instead, the algorithm tailors a bespoke conspiracy theory for each person. You may never join the Flat Earth Society, but you might become convinced the actual year is 1728, or suspect that a British comedian from the 1980s was secretly invented by AI.

How does this distortion take hold, and how quickly can social media turn a seeker of knowledge into a believer in nonsense? YouTuber Benaminute recently ran an experiment to find out: starting from a random, benign topic and watching only videos related to it, how long does a short-video platform take to start pushing conspiracy theories? The answer is striking: not long at all.

Different topics, same algorithmic trap
Benaminute created blank social media accounts and simulated a user innocently curious about one of three neutral topics (dinosaurs, the Vietnam War, the 2000 U.S. presidential election). He only searched each keyword on the platform and watched and liked videos related to it.

· Dinosaurs
YouTube Shorts: the first 540 videos included Jurassic Park promos and AI-generated dinosaur clips; the 541st was a Joe Rogan clip claiming the pyramids were "DNA restoration devices."
TikTok: the 144th video was a fake UFO clip with 24 million views.
Instagram Reels: after 661 videos came a "forbidden phone from the 2000s that lets you see into a parallel dimension."

· The Vietnam War
Historical and political topics degrade even faster:
YouTube Shorts reached a Noah's Ark conspiracy theory in just 7 videos.
TikTok's 161st video claimed BlackRock had something to do with the attempted assassination of Donald Trump.
Reels took 139 videos to get to "Bush did 9/11."

· The 2000 election
Even this charged topic could not escape the algorithm:
YouTube Shorts hit the same Noah's Ark conspiracy at video 136.
TikTok took only 38 videos to push "The Rapture is happening on September 24."
Reels landed on "The World Trade Center was bombed by Bush or Clinton" at video 26.

Which platform gets there fastest?
TikTok is the quickest route to conspiracy content, averaging 114 videos (57 minutes); YouTube Shorts takes 230 videos (1 hour 57 minutes) and Reels 275 videos (138 minutes). The differences hardly matter: all three get there in less time than it takes to watch a Marvel movie.

What the algorithm is really doing
It would be easy to conclude that the tech companies deliberately weight their recommendation algorithms to spread misinformation, perhaps with the specific political aim of swaying elections, or perhaps, as Benaminute half-jokingly suggests, because these apps are built to "keep us angry, divided, and distracted" from the real axis of conflict.

But that is itself a conspiracy theory. We do not have enough evidence to explain why the algorithms favor this content. Rather than top-down malice, the more plausible explanation is that TikTok and its peers are simply chasing profit. A platform whose algorithm heavily favored the truth would likely fail quickly: the truth is dull next to the spectacle of conspiracy theories. Conspiracy theories make believers feel they hold inside knowledge the rest of us lack, while anyone devoted to the truth is stuck saying "the best evidence suggests..." or "it seems logical that...", and who wants to hear that?

English source:

Did you know you can customize Google to filter out garbage? Take these steps for better search results, including adding my work at Lifehacker as a preferred source.
While sometimes there's a big piece of misinformation that a lot of people latch onto—like The Rapture or the existence of "MedBeds"—the fractured nature of the information sphere has all but killed the overarching conspiracy theory. No longer do big ideas like "we never went to the moon" unite the dumbest minds; instead, the algorithm creates bespoke conspiracy theories. So instead of joining the Flat Earth Society, you might think the actual year is 1728, or that AI secretly imagined a British comedian from the 1980s and seeded the web with evidence of his existence.
But how does it start? And how quickly can social media platforms transform someone from a seeker-of-knowledge into a believer-in-bullshit? YouTuber Benaminute recently posted a video where he dug in to find out. His question: If you start with a benign, broad, randomly chosen subject, and you only watch videos having to do with that subject, how long will it take until TikTok, YouTube Shorts, and Instagram Reels feed you a conspiracy theory video? The answer: not long at all.
Different topics all lead to the same place (more or less)
For the experiment, Benaminute created "blank" social media profiles and behaved like someone who was innocently curious about one of three topics—dinosaurs, The Vietnam War, and the 2000 presidential election. He put the keyword in each platform's search bar and only watched and liked videos about the initial subject.
Dinosaurs
YouTube Shorts: The initial videos were ads for Jurassic Park, AI slop featuring dinosaurs, and the occasional educational video, but the 541st video was a clip from the Joe Rogan Experience about how the pyramids were not tombs, but "DNA restoration devices."
TikTok: If you thought TikTok would get to conspiracy theories quickly, you'd be right. The 144th video was this fake UFO video that has 24 million views.
Instagram Reels: Insta took 661 videos to get from dinosaurs to a "forbidden phone from the 2000s that lets you see into a parallel dimension."
The Vietnam War
Things get worse for people interested in historical or political events. On all short-form platforms, an interest in Vietnam will lead you pretty quickly to right-leaning content, which leads you to conspiracy theories.
YouTube Shorts got to a conspiracy theory video about Noah's Ark in only seven videos.
TikTok took a little longer; video 161 was about how financial services company BlackRock had something to do with the attempted assassination of Donald Trump.
Reels took 139 videos to get to "Bush did 9/11."
The 2000 election
The election of 2000 is still a charged topic, but it's been a while, so maybe cooler heads and verified information will win the day? Spoiler: nope.
YouTube Shorts took 136 videos to get to the same Noah's Ark conspiracy as it did for dinosaur fans.
TikTok only took 38 videos to get to "The Rapture is happening on September 24."
Reels took only 26 videos to land on "The World Trade Center was bombed" (by either Clinton or Bush).
Which social media app leads to conspiracy theories fastest?
The champion of "normal search to conspiracy theory" speed runs is TikTok, with an average of 114 videos, or 57 minutes of watching. YouTube Shorts comes in second with 230 videos, or 1 hour 57 minutes, and Reels takes 275 videos, or 138 minutes. It's a distinction without much of a difference, however: all three platforms lead to conspiracies in the time it takes to watch a Marvel movie.
What does it all mean?
It would be easy to conclude that the massive tech companies that built YouTube, Instagram, and TikTok weight their recommendation engines so viewers are led to fake stories. Maybe they have specific political aims and are trying to sway votes, or maybe (as Benaminute posits in a semi-tongue-in-cheek way) these apps are built to "keep us angry, divided, and distracted" from realizing the conflict isn't between Left and Right, but between "up and down."
This is also a conspiracy theory, however. I'm not saying he's wrong, but we don't have enough information to know why algorithms recommend conspiracy content. It could be because bad actors at the top demand specific results for some purpose, but it seems more likely to me that TikTok et al. don't have an agenda beyond making money.
I have no doubt that a social media platform featuring an algorithm that weighs the truth heavily would fail pretty quickly; the Truth is boring compared to conspiracy theories. Conspiracy theories, broadly, make believers feel special, like they have inside knowledge the rest of us lack. People scroll TikTok to have fun; the truth usually isn't fun. Conspiracy theorists can say things like "UFOs are here!" or "They're turning the frogs gay!" Meanwhile, if you're devoted to the truth, you mostly have to go with "the best evidence suggests..." or "it seems logical that..." and who wants to hear that?

Lifehacker


