Facebook Pulls Nonprofits into the Fire

Photo by William Iven on Unsplash

By now pretty much everyone has heard of the debacle that has befallen Facebook over the use of its users’ profile data to target voters in the 2016 Presidential election. It comes at a particularly inopportune time for nonprofits, because Facebook had only recently become a major platform for raising money online.

Facebook got involved in fundraising in 2013, when it introduced a button that, for the first time, allowed people to contribute directly to nonprofits through the platform. Back then, just 19 nonprofits were listed as partners. Fast forward two years, and Facebook upped the game for nonprofits when it launched Fundraisers along with an improved “Donate” button and doubled the number of nonprofit partners that could raise money with the tool. Later that year, Facebook expanded Fundraisers to allow individual users to solicit donations from their Facebook friends on behalf of more than 100 U.S. nonprofits.

Then the doors blew off. Around the beginning of 2017, those 100 nonprofits grew to more than 750,000, and the Bill & Melinda Gates Foundation partnered with Facebook to provide $1 million in matching funds.

Later that year, on Giving Tuesday 2017 (the Tuesday after Thanksgiving), the Gates Foundation announced another $1 million in matching funds. More importantly, Facebook announced that it would no longer charge the five percent fee it had been taking on donations raised through Fundraisers. So, what’s in it for Facebook? Beyond the goodwill the move generated, it increased the amount of time users spend on the site, and increased time equals increased revenue for Facebook.

And the free fundraising platform represented a broadside against competing platforms like GoFundMe, which claims to be the world’s largest social fundraising community. Although GoFundMe promotes itself as a “free” platform for personal fundraising campaigns, it charges an “industry-standard” processing fee of 2.9 percent on credit card transactions.

With Facebook’s introduction of a free platform with 2.2 billion monthly active users, every nonprofit in the country had a major incentive to develop a social media strategy that included Facebook as a significant component. But now, thanks to the London-based political consulting firm Cambridge Analytica, all bets are off. If you don’t already know the story, here it is in a nutshell…

Photo by Markus Spiske on Unsplash

Facebook exposed data from 50 million of its users to Cambridge Analytica, a London-based consulting firm that worked for the 2016 Trump campaign. Steve Bannon, a vice president of Cambridge Analytica, introduced the campaign to the firm during the 2016 election.

The data was acquired between June and August of 2014 by Aleksandr Kogan, a Russian-American researcher employed by the University of Cambridge. Kogan built a Facebook application in the form of a quiz, and people who took the quiz unknowingly allowed the app to collect data (called “buttons”) from all of their Facebook friends as well.

[*If you would like to see the information from your Facebook account that may have been stolen -- your information that is shared with advertisers -- we list the steps for doing so at the end of this blog.]

After Cambridge Analytica obtained the personal profiles of these 50 million users, it developed “psychographic profiles” of them. The goal was to understand the personalities of the users and then use that information to match advertisements with the people most likely to respond to them. Personality is most commonly measured on a scale known as the “Big Five,” which is based on five well-established traits: agreeableness, neuroticism, openness to new experiences, extroversion, and conscientiousness. Our levels of these traits tend to remain relatively stable throughout our lifetimes, so it is not surprising that they are related to the things we like, buy, and spend time with.

Specific ads, referred to as “psychographic messages,” are then served up to particular people, a process known as “psychographic microtargeting.” For example, someone who scores high in neuroticism (and so tends to be anxious, compulsive, and so on) might be more easily swayed by an ad describing how foreign actors are threatening their way of life. Cambridge Analytica CEO Alexander Nix once explained the firm’s approach to prospective clients in exactly this way.
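To make the mechanics concrete, here is a minimal, hypothetical sketch in Python of what trait-based ad matching could look like. The trait scores, ad copy, and function names are illustrative assumptions on our part, not Cambridge Analytica’s actual code or data; the point is simply that once each user has a personality score, choosing the message most likely to resonate is a straightforward lookup.

# Hypothetical sketch of psychographic microtargeting. All trait scores,
# ad copy, and names below are illustrative, not any firm's real method.
from dataclasses import dataclass
from typing import Dict

@dataclass
class UserProfile:
    name: str
    traits: Dict[str, float]  # Big Five scores between 0 and 1

# Ad copy keyed by the personality trait it is meant to appeal to
ADS_BY_TRAIT = {
    "neuroticism": "Outside forces threaten your way of life. Act now.",
    "openness": "Imagine a bold new future. Be part of the change.",
    "conscientiousness": "Make a plan and do your duty: vote this Tuesday.",
}

def pick_ad(user: UserProfile) -> str:
    """Serve the ad matching the user's strongest targeted trait."""
    strongest = max(ADS_BY_TRAIT, key=lambda trait: user.traits.get(trait, 0.0))
    return ADS_BY_TRAIT[strongest]

# Example: a user who scores high in neuroticism gets the fear-themed message
anxious_user = UserProfile(
    name="example_user",
    traits={"neuroticism": 0.85, "openness": 0.20, "conscientiousness": 0.40},
)
print(pick_ad(anxious_user))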

Enter the Russians. The American public was subjected to an enormous, Russian-backed misinformation campaign in the run-up to the 2016 election. Did personality-matched ads sway the election? That is difficult to say. For one thing, there is little research on whether personality targeting works for political campaigns at all. The research that does exist finds mixed evidence on whether personality traits can predict who is most likely to turn out to vote, and mixed evidence on whether they can predict who is most likely to be persuaded by advertisements.

And even if personality-matched ads are more engaging, it is much more likely that microtargeted ads work by reinforcing people’s preconceived notions. It is doubtful that they changed anyone’s mind about whether to vote for Donald Trump or Hillary Clinton.

So, what’s the upshot for nonprofits? All of this is bad for the nonprofit community because it casts a shadow of distrust over a platform that was poised to be a significant conduit between organizations and their donors. Facebook is now embroiled in a very public debate over how much users can trust it with their data. Facebook allowed a third-party developer to engineer an application for the sole purpose of gathering data, and the developer was able to exploit a loophole to collect information not only on people who used the app but on all of their friends, without their knowledge.

And to make matters worse, Facebook has known about this for more than two years, and only now is it actually acknowledging that it made a mistake.

Will this affect the ability of nonprofits to raise funds via Facebook? We’ll have to wait and see. But it’s not a good sign that a nontrivial number of people are deleting their Facebook accounts.

Photo by Tim Bennett on Unsplash

Nonprofits have struggled for some time to react to fractured communication channels. In most nonprofit executive suites, social media is a brand-new world, and Facebook looks like a pathway to reach more people, more efficiently. Now that pathway is compromised. Do nonprofits want to align with a platform that has broken trust with the public? Will that distrust transfer onto nonprofits?

Our opinion is that Facebook is so ubiquitous that this controversy will not have much of an impact in the short run. But the younger generation may change all that, since they tell us that Facebook is an “old person’s” platform. So while moving away from Facebook does not look like a smart move today, the generational shift away from the platform will probably take nonprofits in other directions over the long term.

Social media is measured in dog-years. Facebook has been around since 2004. The smart money will be on using today’s dominant channel to maximize returns while keeping a sharp eye out for the next big thing.

* Here’s how to check your Facebook privacy settings and see what you are revealing. When you are on Facebook, find the small black triangle on the far-right side of the top blue bar. Click it and scroll down to “Settings,” near the end of the pulldown menu. Click Settings and you will land on one of the most interesting places on the site: the page with your settings. You might find it worthwhile to click into every single setting to see what is there, but if nothing else, go to the one on the left side near the bottom that says “Ads.” You will probably be surprised at what you find.

Open up each of your Ad Preferences, and you will see some of the “buttons” we referred to above, the ones Cambridge Analytica mined during the election and that other advertisers mine to sell you goods or influence your behavior. From here you can edit or delete to your heart’s content, from a single ad source to your entire Facebook account.