Facebook’s Ad Policies Changed, but Political Campaigns Can Still Target You

Meta announced changes to its ad-targeting policies, but they will do little to stop campaigns from reaching specific voters.



Sign up here to get On Politics in your inbox on Tuesdays and Thursdays.

During the 2020 election, both the Biden and the Trump campaigns ran Facebook ads targeted to Black voters in Kenosha, Wis., about the protests over race and policing that dominated the summer.

On Tuesday, Meta, the social media company formerly known as Facebook, announced changes that, on the surface, would appear to reduce such targeting. But it remains entirely possible for campaigns to get around these limitations.

The company said it planned to eliminate advertisers’ ability to target people with promotions based on their interactions with content related to race and ethnicity or political affiliation, as well as thousands of other topics.

But those changes would do nothing to stop a campaign from targeting the same audiences in Wisconsin with Facebook ads, just in different ways: Location targeting is still permitted, down to the ZIP code. Campaigns could also use a feature known as “look-alike audiences,” along with a host of remaining options.

Indeed, the changes announced by Meta on Tuesday — which arrived amid a growing outcry over the damage social platforms have done to the political and social fabric — will most likely just force political campaigns to switch methods. They will still be able to reach specific voters pretty easily.

“There’s just so many different ways that you can reach different groups of people not using these targeting methods, even going to geolocation and textual data,” Tim Cameron, a Republican digital consultant, said. “Now, where you can’t use detailed targeting to reach L.G.B.T.Q. culture, you can certainly set up ads around Pride Week and around certain locations that are a part of that culture. So, it’s just kind of like a closed road that at the end of the day, people are going to find a way to get around it to get to their destination.”

That campaigns can still use this specific targeting on Meta’s platforms, which include Facebook, Instagram and Messenger, reflects the difficulty the company faces in reining in a political advertising process that some have deemed exploitative of vulnerable groups, especially in a vicious and polarized political arena.

Aside from targeting audiences by ZIP code, campaigns commonly upload their own data, such as a voter file, and run ads aimed at the specific people on it. They can also use “look-alike” models: a campaign takes a ZIP code whose demographics resemble the segment it wants to reach and asks Facebook to find similar audiences.

The company, in its statement announcing the changes, noted that some of these targeting options would still be available.

“The decision to remove these detailed targeting options was not easy and we know this change may negatively impact some businesses and organizations,” Graham Mudd, a vice president of product marketing for Meta, said. “Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions.”

Meta’s changes are likely to have a more substantial effect outside politics, for example in curbing housing discrimination or ads that exploit body-image issues to sell products.

And political campaigns will have some new hurdles to overcome. Using behavior targeting or interest targeting can be critical for finding new voters or donors, as well as for maintaining the efficiency that is a hallmark of digital advertising.

“We use it as a way to exclude conservative-leaning segments,” said Cat Stern, the director of digital persuasion at Authentic, a Democratic digital firm, explaining how interest targeting helps campaigns reach voters more efficiently. She added that the removal of behavioral targeting, such as finding people “likely to engage with political content,” would also force campaigns to “get creative” in their efforts to reach new audiences.


In a statement on Wednesday calling on Meta to reverse its changes, the four campaign arms of the Democratic Party — the national committee, and committees that oversee races for governor, House and Senate — argued that the new limitations did not address the larger crisis plaguing the platform: disinformation.

“Meta has once again shirked its responsibility to protect voters on its platforms by implementing backwards political ad policies that will limit our ability to communicate with voters about the democratic process, and that once again do nothing to address the platform’s most serious issue — an algorithm that incentivizes misinformation and hate,” the statement said.

On Politics is also available as a newsletter. Sign up here to get it delivered to your inbox.

Is there anything you think we’re missing? Anything you want to see more of? We’d love to hear from you. Email us at onpolitics@nytimes.com.
