You’ve heard of “artificial intelligence,” or AI, in one sense or another; we’ve been reckoning with the concept through books, movies and academic discussions since its earliest mentions in the 1950s. When you hear about it today, though, it’s typically in the context of “generative AI,” a rapidly evolving class of web-based tools that people are using right now to enhance their work and their worlds.

Generative AI (like the popular ChatGPT engine, among others) writes entire documents, draws complex images, researches historical questions, drafts organizational plans, and even offers advice on complicated scenarios, pretty much on its own — hence the “generative” label; all you have to do is type in what you’re looking for. It’s pretty amazing.

If you work in local government, you might be thinking about where and how it fits in; perhaps your city already has an AI policy on the books, or has used the technology to help draft or polish language in public documents. But, like any transformative technology, it comes with all kinds of cautions and ethics worries. Are we comfortable with it essentially learning how to do jobs we’ve always entrusted to humans? Already, we’ve seen misplaced trust in generative AI lead to embarrassing, avoidable public blunders. It raises privacy and authenticity issues, too. Did you know generative AI can sample a recording of your voice and learn to speak as if it’s really you, potentially fooling anyone who hears it?

Take this episode of Municipal Equation, the podcast about cities and towns from the NC League of Municipalities, as a primer on the generative-AI conversation in local government at the moment. What’s the potential? What are the dangers? How can cities and towns use generative AI safely and for the best? It’s not going away.

// Municipal Equation is a production of the N.C. League of Municipalities, . Contact host/producer Ben Brown at .