(De)friending in the Baltics: Lessons from Facebook’s Sputnik Crackdown

On January 17, Facebook shut down more than 350 pages and accounts linked to the Russian state-owned media company Rossiya Segodnya and its radio and online service, Sputnik. Citing “coordinated inauthentic behavior,” the social media monolith nixed 289 pages and 75 accounts tied to the outlets across the Baltic states, Central and Eastern Europe, the Caucasus, and Central Asia. A separate network of nearly 150 fake Facebook and Instagram accounts originating in Russia, but mostly operating in Ukraine, was also shut down.

Having built audiences around neutral topics ranging from tourism to news, medicine, food, and sports, these Sputnik-linked pages and accounts pushed coordinated Kremlin propaganda to around 800,000 unsuspecting followers, inorganically inflating the audience for Rossiya Segodnya outlets and spreading mis- and disinformation curated in support of Russian state narratives.

“Despite their misrepresentations of their identities, we found that these Pages and accounts were linked to employees of Sputnik, a news agency based in Moscow, and that some of the Pages frequently posted about topics like anti-NATO sentiment, protest movements, and anti-corruption,” wrote Nathaniel Gleicher, Facebook’s head of cybersecurity policy.

But the removal of these pages, which generated around $135,000 in ad revenue for Facebook and promoted 190 events, was slow in coming. Some had been actively advertising since 2013, more than five years before the takedown.

Russia has long had an unabashed interest in spreading fake news in the former Soviet space as part of an effort to maintain a foothold in its “sphere of interest.” The 2008 Russo-Georgian war, more than a decade ago, marked a turning point in the blatant use of falsified media. In fact, the editor-in-chief of RT, another Kremlin-controlled media organization, said outright that the media’s role in the Georgia conflict was “conducting the information war, and what’s more, against the whole Western world.”

Manipulating Baltic Information Consumption

Facebook’s characterization of Sputnik as a simple “news agency” is also misleading, and its slow response to these pages “shows a lack of understanding of Russia’s motives and more importantly and concerningly, an utter lack of care for users in these countries, for many of which Facebook is essentially the homepage to the Internet,” tweeted disinformation expert Nina Jankowicz.

Fourteen pages targeting Latvia had more than 40,000 followers; eight pages aimed at Estonia garnered another 19,000. That amounts to roughly 1.5 to 2 percent of each country’s total population, no small reach for pages amassing largely unsuspecting followers.

Together, Latvia and Estonia have fewer than 3.5 million residents, and around a quarter of the population in each of these Baltic countries speaks Russian as a first language. Despite these Russian speakers’ varying levels of social and political integration in their Baltic home countries, the Kremlin has consistently tried to co-opt them. It relies heavily on the media to create cleavages in the population and manipulates discussions around issues such as NATO integration and historical narratives to garner support for Russia’s geopolitical interests.

For example, Sputnik Latvia operates both a Russian- and a Latvian-language version of its site, but the stories curated for each audience vary dramatically. On January 23, Sputnik’s Latvian-language homepage led with “The Daugavpils Cinema has again been fined for showing films in Russian.” The Russian-language homepage, meanwhile, featured “How Russia can punish Latvia for the glorification of Nazism,” part of a consistent, yet inaccurate, narrative about Latvia that Kremlin propagandists have pushed for decades. Neither story appeared on the other language’s homepage. Such divergence contributes to the fact that many of Latvia’s residents live in information silos, depending on the language in which they consume news, even when it comes from the same source.

In an interview with the Latvian investigative TV show De Facto, Sputnik Latvia’s Moscow-based editor-in-chief, Valentin Rozhencov, did not contest the link between Rossiya Segodnya and the Facebook pages and accounts. Rather, he affirmed that Sputnik controlled the pages, which, he argued, were engaging in “normal content promotion on social networks” without “breaking any rules, neither Facebook’s, Latvia’s, nor Russia’s.”

Sputnik’s press service dismissed Facebook’s actions, saying, “The decision is clearly political in its nature and, as a matter of fact, is practically censorship — seven [Facebook] pages belonging to our news hubs in neighboring countries have been blocked.”

Facebook under Scrutiny

Facebook has come under harsh criticism for its approach to suspicious news outlets. In September 2018, Inga Springe, an investigative journalist at Re:Baltica, derided the company’s policy toward fake news in Latvia: “FB closes down fake profiles, but does not erase fake news, because [they say] everyone has the right to say whatever they please.”

Re:Baltica has brought scrutiny to the suspicious social media habits of politicians in Latvia, particularly those of the pro-Russia Harmony party. One Sputnik-linked Facebook page closed in the January 17 purge was dedicated to Nil Ushakov, the mayor of Riga and leader of Harmony. Facebook has also been less than forthcoming in providing information to anti-corruption watchdog agencies in Latvia, particularly regarding campaign spending. Springe noted that only two Facebook press officers cover more than 20 countries, leaving many questions from local journalists unanswered. This lack of transparency is worrying, given social media’s role in democratic elections writ large.

“It also turned out that there is no organisation that looks for fake news in the Latvian language on FB, while there is a person or persons (the number was never mentioned) who monitor hate speech in Latvian,” wrote Springe in September 2018.

Under fire for its role in spreading not only disinformation but also violence-inciting content around the world, Facebook has recently taken steps to increase the number of content monitors searching for violations of Facebook’s standards (e.g., hate speech, nude photos, and harassment). In July 2018, Facebook had around 7,500 content reviewers across the globe. By December 2018, that number had doubled. Competence Call Center (CCC), one of Facebook’s subcontractors for content moderation, announced plans to hire 150 reviewers in its Riga center. Facebook says around 30,000 people work broadly on safety and security for the platform, but that number pales in comparison to the platform’s 2.27 billion monthly users.

Facebook’s Shortcomings in Fighting Fake News

Identifying fake news is not explicitly in the mandate of content reviewers. For that, Facebook relies on a “combination of technology and human review,” including partnerships with third-party fact-checkers.

Facebook lists no partner organizations for fact-checking in Russia or the former Soviet space.

But individuals with substantive knowledge of the media situation were critical to the January 17 closures. “Facebook also notes that it wouldn’t have been able to detect these [Kremlin-linked] accounts without help from outsiders. It should invest more in staff with language skills and cultural fluency so it can do better. It has the money,” tweeted Jankowicz.

As the investment in content monitors in Riga may show, Latvia is a particularly strategic location to tackle these issues. CCC advertises Russian-language monitoring jobs starting at €1,200 per month. At €14,400 per year, that salary is above the market rate for the region, but it falls far short of the wages earned in Barcelona, around €25,000 per year, or in Germany, where moderators earn up to €15 an hour.

A hub in the region, Latvia boasts a broad pool of potential employees with significant language and cultural competency in the post-Soviet space. The Baltics are no strangers to Russia’s fake news assault, and they have often proven better equipped to handle it than Americans. Riga is even home to NATO’s Strategic Communications Centre of Excellence, which routinely focuses on propaganda and media manipulation.

The social media giant also says part of the goal in shutting down pages that consistently share or publish misinformation is not only to cut down on the distribution of false news, but also to remove “the financial incentives to create it.” For the Kremlin, spreading propaganda is far more than a business model for financial gain. Creating networks of trust and spreading false or misleading information is directly part of a strategy to co-opt hearts and minds. Cultivating “compatriots” within its sphere of influence is written into Russia’s foreign policy.

That’s why these Sputnik-linked pages focused on things like sports or scenic pictures of Riga, rather than outright Kremlin propaganda. “If you show someone you are on their side on an issue that is close to their heart, it becomes much easier to nudge them on other issues,” NPR reported. This is analogous to what happened in the U.S. elections: Russian troll farms curated groups around hot-button topics like Black Lives Matter or gun rights, using shared values to build trust within those groups. Once group loyalty is established, it becomes far easier to take advantage of a trusting community.

Western social media organizations cannot afford to ignore the post-Soviet space, and they could benefit from its example. The Baltic countries and others in the region have long pointed to the information war being waged by Russia, and they have been on the offensive against it. Perhaps that contributed to the relatively low social media influence of Russian trolls in recent elections, compared with the 2016 U.S. presidential race. But that is no excuse to leave the region in limbo or to let it fend for itself. Facebook’s slow action against purveyors of fake news on the world’s flagship social platform is not enough. It’s time for social media companies to learn from regional experts, and to do far better.