COVID-19 Misinformation on Social Media
Updated: July 30, 2021 | Originally Published: July 30, 2021
Searching for “Smith” on YouTube reveals a prominent figure among the “Disinformation Dozen,” identified by the Center for Countering Digital Hate as a leading source of COVID-19 misinformation. With 394,000 subscribers, the conspiracy-ridden Smith is the most active anti-vaccine videographer. His channel is easy to find, featuring a pinned video titled, “Unveiling the Truth About COVID-19.” Spoiler alert: by “truth,” Smith refers to a conspiracy theory he concocted in his basement using string and news clippings, fueled by a peculiar disdain for public health measures.
In his video, Smith asserts that “the technocrats are manipulating the pandemic narrative and exploiting the chaos they have instigated,” while displaying a COVID-19 vaccine. The term “misinformation” appears against a backdrop of escalating death tolls. He claims, “They are now deploying… tracking technology,” which is code for “vaccine microchip!” When he proclaims “Join the fight for the future,” the visuals show people removing their masks.
This rhetoric is undeniably anti-vax. Smith never explicitly states that the pandemic is a hoax or that masks are unnecessary; his content is carefully crafted to stop just short of a direct violation of YouTube’s COVID-19 Medical Misinformation Policy, even as it pushes exactly the ideas those guidelines exist to curb. Shouldn’t this count as a violation somewhere? Surprisingly, it doesn’t.
His promotional video includes a screenshot of a review stating that “Smith reveals the hidden truths behind the global PLANdemic.” He may not technically be breaching policy, but from his channel YouTube recommended “Vitamin D and COVID-19: Evidence for Prevention and Treatment of Coronavirus.” Alarmingly, Professor Roger Smith, MD, stops just short of claiming that Vitamin D will prevent or cure COVID-19, which keeps him compliant with YouTube’s content rules. The absurd “COVID-19 and Zinc” video recommended next works the same way: its creator, Dr. John Brown, who lacks the “MD” designation, avoids outright claims about zinc’s effectiveness and so stays within the guidelines.
This situation is far from acceptable. If such disinformation is permitted, YouTube’s COVID-19 misinformation policies require urgent revision.
The Role of Algorithms
Social media significantly contributes to the spread of vaccine misinformation, and it is endangering lives. Platforms like Facebook and YouTube aren’t merely letting false information slip through enforcement gaps; their algorithms actively promote it. They create echo chambers that breed vaccine-resistant groups, whose members then spread that misinformation through their communities, right down to local stores.
Dr. John Brown’s zinc video led me to a clip that blatantly violates YouTube’s guidelines by suggesting ivermectin, an anti-parasitic drug best known as a livestock dewormer, as a remedy for COVID-19. Titled “Ivermectin and the Odds of Hospitalization Due to COVID-19: Evidence from a Quasi-Experimental Analysis,” the video claims the drug is effective. It should go without saying that ivermectin is not a COVID-19 treatment, but here we are.
This is followed by another ivermectin-focused video in which Dr. Brown states, “I will not be giving any of my own opinions, at least I’ll try not to… Always consult your own prescriber; this is for educational purposes only.” The disclaimer keeps him inside the letter of the Community Guidelines, but the video still steers viewers toward considering dangerous alternatives.
Next is a video discussing “Vaccines for Children,” contending that children should not be vaccinated because they don’t contract COVID-19 as frequently as adults. This claim skirts the guideline prohibiting assertions that children cannot or do not contract COVID-19.
Facebook’s Response to COVID-19 Misinformation
According to The Washington Post, an experiment run by the advocacy group Avaaz in June showed just how effectively Facebook’s algorithm promotes anti-vaccine content. The group created two accounts, and within just two days those accounts were recommended 109 anti-vaccine pages. Facebook has removed 18 million pieces of COVID-19 misinformation over the past year, but that is evidently not enough.
After I searched for “COVID vaccine,” the algorithm recommended the hashtag #covidvaccinesideeffects. That led me to a post from a group insinuating that COVID-19 vaccines are harmful, which violates Facebook’s policies against unsafe vaccine claims. Digging further turned up still more posts under the #covidvaccinesideeffects hashtag attributing deaths to the vaccines, a clear breach of Facebook’s guidelines.
I stumbled upon this content purely through algorithmic recommendations. Although not everything was anti-vaccine, a significant amount of misinformation was present. Avaaz’s experiment mirrored this; starting with a search for “vaccine” or liking an anti-vaccine page resulted in an avalanche of related recommendations.
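I obviously can’t see inside Facebook’s recommendation system, but the “one search triggers an avalanche” pattern is exactly what a simple engagement-driven recommender would produce: if the people who engage with one hashtag also engage with another, the second gets recommended, regardless of whether it’s accurate. Here is a minimal co-occurrence sketch in Python; the session data and function are hypothetical, purely for illustration, and not a claim about any platform’s actual code.

```python
from collections import Counter

# Hypothetical engagement logs: each entry is the set of hashtags one user
# interacted with in a session. Invented for illustration, not real data.
user_sessions = [
    {"#covidvaccine", "#covidvaccinesideeffects"},
    {"#covidvaccine", "#covidvaccinesideeffects", "#vaccineinjury"},
    {"#covidvaccine", "#cdcguidelines"},
    {"#covidvaccine", "#covidvaccinesideeffects"},
]

def related_hashtags(query_tag, sessions, top_n=3):
    """Rank hashtags by how often they co-occur with the query tag."""
    counts = Counter()
    for tags in sessions:
        if query_tag in tags:
            counts.update(tags - {query_tag})
    return counts.most_common(top_n)

# Because the anti-vaccine hashtag co-occurs most heavily with the mainstream
# one, a pure co-occurrence ranker recommends it first.
print(related_hashtags("#covidvaccine", user_sessions))
# e.g. [('#covidvaccinesideeffects', 3), ('#vaccineinjury', 1), ('#cdcguidelines', 1)]
```

A ranker like this has no notion of truth. It only knows which tags travel together, so the most engaging adjacent content, which is often the most alarming, is what gets surfaced next.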
Clearly, Facebook needs to enforce its guidelines more rigorously and reassess its algorithm.
Cleverly Concealed Misinformation
Anti-vaccine advocates are resourceful. The Washington Post reported on one group that cleverly named itself “Dance Party” and boasted over 40,000 members before being shut down by Facebook. They used “pizza” as code for “Pfizer.” Who knows how much more of this is out there? One influencer previously obscured “COVID” and “vaccine” in their Instagram posts to evade content guidelines.
The post in question, tagged with the influencer’s company name, actually encourages readers to consult reputable sources like the CDC for vaccine information. Yet because the key words were obscured, it slipped right past the platform’s detection.
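To see why this kind of wordplay works, here’s a minimal sketch of a naive keyword filter in Python. It’s purely illustrative: the flagged-term list and the example posts are my own assumptions, not any platform’s real moderation code.

```python
import re

# Purely illustrative flagged-term list -- an assumption, not a real blocklist.
FLAGGED_TERMS = {"covid", "vaccine", "pfizer"}

def naive_filter(post):
    """Return True if the post contains a flagged term spelled verbatim."""
    words = re.findall(r"[a-z0-9]+", post.lower())
    return any(word in FLAGGED_TERMS for word in words)

print(naive_filter("The covid vaccine is a microchip"))           # True  -- verbatim spelling is caught
print(naive_filter("The pizza party gave me a sore arm"))         # False -- "pizza" stands in for "Pfizer"
print(naive_filter("Don't trust the v@ccine, it's a c0vid lie"))  # False -- symbol swaps break the match
```

Catching “pizza”-for-“Pfizer” substitutions takes context and human judgment, not string matching, which is exactly why these groups keep resurfacing faster than they’re taken down.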
Need for Revised Guidelines on COVID-19 Misinformation
YouTube’s guidelines are not being enforced effectively, and Facebook’s policies are similarly lacking. Both platforms urgently need to strengthen enforcement of their COVID-19 misinformation rules. It’s also on us to report anti-vaccine content when we come across it; if we don’t pitch in, the platforms will always be a step behind.
Enforcement alone isn’t enough, though: both platforms need significant algorithmic revisions. Why should a search for “COVID vaccine” lead to a recommendation for “#covidvaccinesideeffects”? Both algorithms behave like rabbit holes of misinformation; one small opening drops you into a chaotic world of microchips, vaccine-attributed deaths, and bizarre conspiracy theories involving figures like Bill Gates.
Neither platform is doing enough. Both require substantial improvements, and they need them urgently, or COVID-19 misinformation will continue to proliferate, resulting in more lives lost.
Summary
This article examines the inadequacies of social media platforms like YouTube and Facebook in curbing the spread of COVID-19 misinformation. Despite community guidelines designed to limit harmful content, algorithms often promote misleading videos and posts, contributing to vaccine hesitancy and public health risks. The piece calls for better enforcement of existing policies and a thorough reevaluation of algorithms to prevent further misinformation propagation.