Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: Advice for EA org staff and EA group organisers interacting with political campaigns, published by Catherine Low on June 17, 2024 on The Effective Altruism Forum.
Compiled by CEA's Community Health team
2024 is the biggest year for elections in history(!), and while many of these elections have passed, some important elections are upcoming, including the UK and US elections, providing a potentially large opportunity to have an impact through political change.
This post is intended:
1. To make it easier for EA group organisers and organisation staff to adhere to the laws in relevant countries
2. More generally, to help the community take high-impact actions now and in the future by reducing the risks of polarisation of EA and the cause areas we care about.
Two main concerns: Legal risks and risks around polarisation and epistemics
Legal risks
Charities and organisations associated with/funded by charities have constraints on what political activities they can do. See "More about legal risks."
Note: This post is not legal advice. Our team is employed by US and UK charities (Effective Ventures US and UK). So, we have a little familiarity with the legal situations for groups/organisations that are based in the US or UK (many EA organisations), and groups/organisations that are funded by charities in the US or UK (even more EA groups and organisations). We have very little knowledge about the legal situation relating to other countries.
It could be useful for groups/orgs in any country (including US and UK) to get independent legal advice.
Risks around polarisation and epistemics
These risks include
EA becoming more associated with specific parties or parts of the political spectrum, in a way that makes EAs less able to collaborate with others
Issues EA works on becoming polarised / associated with a specific party
EA falling into lower standards of reasoning, honesty, etc. through feeling a need to compete in political arenas where good epistemics are not valued as highly
Creating suspicion about whether EAs are primarily motivated by seeking power rather than doing the most good.
Of course, the upside of doing political work could be extremely high. So our recommendation isn't for EAs to stop doing political work, but to be very careful to think through risks when choosing your actions.
Some related ideas about the risks of polarisation and political advocacy:
1. Climate change policy and politics in the US
2. Lesson 7: Even among EAs, politics might somewhat degrade our typical epistemics and rigor
3. To Oppose Polarization, Tug Sideways
4. Politics on the EA Forum
More about legal risks
If your group/organisation is a charity or is funded by a charity
In many (or maybe all?) places, charities and organisations funded by charities are NOT allowed to engage in political campaigning.
E.g. US
U.S. 501(c)(3) public charities are prohibited from "intervening in political campaigns" (more detail). This includes organisations that are funded by US 501(c)(3) charities (including Open Philanthropy's charitable arm, and Effective Ventures, which hosts EA Funds and CEA). This includes:
financial support for a campaign, including reimbursing costs for people to engage in volunteer activities
endorsing or disapproving of a candidate, or referring to a candidate's characteristics or qualifications for office - in writing, speaking, mentions on the website, podcasts, etc. Language that could appear partisan, like stating "holding elected officials accountable", could also imply disapproval.
taking action to help or hurt the chances of a candidate. This can be problematic even if you or your charity didn't intend to help or hurt the candidate.
staff taking political action that's seen as representing the organisation they work for
E.g. attending rallies or door knocking as ...