AI Resources for Civic Tech
Prompting for Civic Tech Work
AI tools are only as useful as the instructions you give them. These prompt templates are written for the kinds of tasks that come up on real civic tech projects. Copy, adapt, and make them your own.
Writing a README
“Write a README for a civic tech project called [name]. The project [what it does] and is built for [who]. It is currently in [early/active/maintenance] stage. Include sections for: project description, why it matters, how to get set up locally, how to contribute, and who to contact.”
Drafting a grant summary
“Write a 200-word summary of a civic tech project for a grant application. The project [what it does], serves [who], and addresses [what problem]. The team is made up of volunteers. Emphasize community impact and open source values.”
Cleaning a dataset description
“I have a dataset with the following columns: [list columns]. Write plain-language descriptions for each column that a non-technical user could understand. Avoid jargon.”
Generating test data
“Generate 20 rows of realistic but fictional test data for a [describe your app] application. Include fields for [list fields]. Make sure the data reflects the diversity of a DC-area population.”
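If you'd rather generate fixtures in code than with a prompt, a small script keeps the data reproducible. This is a minimal sketch using only the Python standard library; the field names and name list are hypothetical placeholders, so swap in whatever fields your app actually uses.

```python
import random

random.seed(42)  # fixed seed so test fixtures are reproducible

# Hypothetical values for illustration only -- replace with your app's fields.
FIRST_NAMES = ["Amara", "Luis", "Mei", "Darnell", "Priya", "Sofia"]
WARDS = [str(n) for n in range(1, 9)]  # DC has eight wards

def fake_rows(n=20):
    """Generate n rows of fictional, DC-flavored test data."""
    return [
        {
            "first_name": random.choice(FIRST_NAMES),
            "ward": random.choice(WARDS),
            "signed_up": random.choice([True, False]),
        }
        for _ in range(n)
    ]

rows = fake_rows()
```

Because the data is fictional by construction, there is no risk of accidentally committing real resident information to a repo.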
Writing accessibility copy
“Write alt text for an image that shows [describe the image]. The context is a civic tech project about [topic]. Keep it under 125 characters and descriptive.”
Summarizing meeting notes
“Here are raw notes from a project meeting: [paste notes]. Summarize into: key decisions made, action items with owners, and open questions. Use plain language.”
AI and Government Data
When working with public datasets or anything that touches resident information, a few things are worth keeping in mind before pasting data into an AI tool.
Don’t input PII. If a dataset includes names, addresses, Social Security numbers, or anything that could identify a specific person, don’t paste it into a public AI tool. Anonymize or aggregate first.
Public doesn’t mean unrestricted. Some government datasets have terms of use that limit how they can be processed or republished. Check the data license before using AI to transform or summarize it for public-facing outputs.
Outputs aren’t authoritative. AI-generated summaries of government data can introduce errors or omit important nuance. Always have a human review anything that will be shared publicly or used to inform decisions.
Be transparent with partners. If you’re working with a DC agency or community organization and using AI in your workflow, let them know. Some partners have strong feelings about it, and surfacing that early avoids awkward conversations later.
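The "anonymize first" guidance above can be scripted so it happens before anything leaves your machine. This is a minimal sketch with hypothetical column names; hashing keeps rows distinguishable without exposing identities, though dropping PII columns entirely is safer still, since short hashes of guessable values can be brute-forced.

```python
import hashlib

# Hypothetical PII columns for illustration -- list your dataset's actual ones.
PII_COLUMNS = {"name", "address", "ssn"}

def anonymize(rows, pii_columns=PII_COLUMNS):
    """Replace PII fields with short, stable hashes so rows remain
    distinguishable from each other without revealing identities."""
    cleaned_rows = []
    for row in rows:
        cleaned = {}
        for col, value in row.items():
            if col in pii_columns:
                cleaned[col] = hashlib.sha256(value.encode()).hexdigest()[:8]
            else:
                cleaned[col] = value
        cleaned_rows.append(cleaned)
    return cleaned_rows

# Fictional example row -- never test with real resident data.
raw = [{"name": "Jane Doe", "ward": "6", "ssn": "123-45-6789"}]
safe = anonymize(raw)
```

Run something like this over an export before pasting rows into a public AI tool, and keep the original file out of the prompt entirely.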
When AI Helps and When It Doesn’t
AI tools can genuinely save time on a volunteer-run project. They can also mislead you in ways that aren’t immediately obvious. Here’s an honest breakdown.
Where AI tends to help
- Boilerplate and first drafts — READMEs, onboarding docs, grant summaries, email templates. Getting from a blank page to something editable is where AI earns its keep.
- Summarizing — long meeting notes, a dense policy document, a GitHub issue thread. AI is good at pulling out the shape of something.
- Code scaffolding — setting up a project structure, writing repetitive functions, generating test data. Tools like GitHub Copilot and Cursor are particularly useful here.
- Editing and rewriting — tightening up copy, adjusting tone, making something more accessible to a non-technical audience.
Where AI tends to underperform or mislead
- Local and DC-specific context — AI doesn’t know the political history of a neighborhood, the relationship between two agencies, or why a particular dataset is structured the way it is. That knowledge lives with your team and your community partners.
- Domain-specific policy — housing regulations, transit policy, procurement rules. AI can summarize what it’s seen in training data, but it can be confidently wrong about specifics. Verify with a human who knows the space.
- Anything requiring accountability — if a decision affects residents, a community organization, or a government partner, a human needs to own it. AI-generated outputs shouldn’t be the last word.