Price transparency is good for civic tech
A few weeks ago at OpenPlans we put our prices for Shareabouts onto our website. Before then, if you wanted to pay OpenPlans to set up a map, we had to talk about it - our prices weren’t secret, and I’ve happily described them on conference panels, but getting the details wasn’t as easy as going to our website and looking.
Price transparency like this is a really good thing for people buying technology for government. We’re chipping away at the appallingly expensive status quo.
I know that the fine people at Civic Insight have done something similar, and they even have a fancy pricing calculator.
Shared prices reduce friction for people seeking high-quality tools.
Every phone call or email follow-up to find out the cost of tools is a small barrier to doing a better job of community involvement - small barriers that add up enough to stop a busy person. And even for a simple query, that’s research time that could be better spent on other tasks.
Transparent pricing helps other people advocate for good tools.
We all benefit from a well-informed community inside and outside city government, with realistic expectations of the costs of tools. These tools are also much cheaper than many people expect, but they aren’t free. And what you pay for is extremely good value. Having this info available helps everyone understand the options.
Why keep prices secret? Concern that these might not be the “right” prices, perhaps? Sure: we might not be charging enough, or we might be charging more than some cities want to pay for particular features. As we keep adding new features, we will re-evaluate. Perhaps concern about being undercut by others? Or wanting to keep pricing flexible and opaque in case a mythical deep-pocketed client shows up? Neither of these seems like a good argument to me (and they weren’t ones used by anyone at OpenPlans, I should add - we were slow to do this mostly because we’re small and busy).
The prices we’re sharing don’t cover everything, for example special feature development we are often asked to do. Soon, we will add prices for OpenPlans, our planning communication tool. We have more work ahead to give greater openness to the costs of hiring us, but we’re trying.
Spend your time on data tools
I mentioned that teams working on Big Apps should look at data trends, not just make maps. A related observation: you have limited development time, so don’t waste it building an engagement tool. Focus on a data tool.
What does this mean? By “data tool”, I mean something that can be useful to look up or make sense of data about the world around us. For example, recent building permits, or maximum buildable floor area around a location based on current zoning, or something with financial data, or access to health care services. A tool that can imperfectly answer a defined question over and over, maybe for different places and times.
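To make “imperfectly answer a defined question over and over” concrete, here’s a minimal sketch in Python. The `Permit` record shape, the quarter-mile radius, and the 90-day window are all invented for illustration, not any particular city’s schema:

```python
from dataclasses import dataclass
from datetime import date, timedelta
from math import asin, cos, radians, sin, sqrt

@dataclass
class Permit:  # hypothetical record shape
    lat: float
    lng: float
    issued: date

def miles_between(lat1, lng1, lat2, lng2):
    """Haversine great-circle distance in miles."""
    lat1, lng1, lat2, lng2 = map(radians, (lat1, lng1, lat2, lng2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
    return 3959 * 2 * asin(sqrt(a))

def recent_permits_near(permits, lat, lng, radius_miles=0.25, days=90, today=None):
    """The defined question: which permits were issued within
    radius_miles of (lat, lng) in the last `days` days?"""
    cutoff = (today or date.today()) - timedelta(days=days)
    return [p for p in permits
            if p.issued >= cutoff
            and miles_between(p.lat, p.lng, lat, lng) <= radius_miles]
```

The point isn’t the geometry; it’s that the same small function can be re-run for different places and times, which is what makes it a tool rather than a one-off report.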
There are heaps of difficult problems out there, and community organizations need help answering them. Often, these organizations understand both a problem and the answer they need, and the right data tool, if sensitively designed, can slot right in and provide answers right away that can lead to immediate positive change. Sure, you have to do some work to identify these problems and the organizations, but the payoff is huge (when measured in social goodness, at least).
The alternative is the seductive world of engagement tools. Look at all those people on Facebook! Look at these tweets! Surely we can harness just a little bit of this energy to get people engaged online in fixing this problem. Every neighborhood might be different, but they all need this collaborative tool for…. Alas, the answer is almost never building the missing tool.
Instead, the answer to organizational challenges like organizing neighbors to care about safer streets, or parents about schools, or anyone about anything, is to meet them face to face. Technology can obviously do a lot to help all along the way, but it won’t replace people and capacity on the ground. And if you’re considering building tools, you likely aren’t also building capacity face to face. That’s not a judgement, just a realistic assessment of how you can spend your valuable and limited time.
So, if you’re embarking on a development project for social good, go build some data tools.
Trends, not maps
This year’s Big Apps competition is focused on some real issues, including traffic safety. We have tons of open data that can be used to explore these issues (like recently-released crash data), but most responses I’ve seen are maps.
Maps are great, but tools to examine trends are better. For example:
- Is this district seeing more crashes than last year?
- Is the past week a “typical” week, or is there something to look into?
- How do increases in crash numbers in this district compare to those in neighboring districts?
- How does change in this neighborhood rank in comparison to others in the city?
These are questions that people in community organizations and elected officials need answers to. City-wide mapping tools are just less useful, because they don’t drive towards policy responses.
And it’s not just street safety. Imagine dashboard tools showing trends from open data like 311, or building permits.
Trend tools are harder to build, but they are so powerful. We need more of them (and a framework to do time and district comparisons on a dataset would be very helpful to get us there).
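A first pass at such a framework can be tiny. Here’s a sketch of the core comparison: count records per district for two periods, then rank districts by change. The `district` and `date` field names (ISO date strings here) are assumptions, not any particular dataset’s schema:

```python
from collections import Counter

def period_counts(records, start, end):
    """Count records per district within [start, end)."""
    return Counter(r["district"] for r in records if start <= r["date"] < end)

def district_trends(records, prev_start, prev_end, cur_start, cur_end):
    """Rank districts by change in count between two periods -- the
    'is my district getting worse, and how do we compare?' question."""
    prev = period_counts(records, prev_start, prev_end)
    cur = period_counts(records, cur_start, cur_end)
    trends = [(d, cur[d] - prev[d], prev[d], cur[d]) for d in set(prev) | set(cur)]
    # biggest increase first: the districts that need attention
    return sorted(trends, key=lambda t: t[1], reverse=True)
```

The same two functions work unchanged for crashes, 311 calls, or building permits, which is exactly the generic time-and-district comparison the post is asking for.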
UPDATE 5/29: Here’s a great trend dashboard from Make Queens Safer.
“The only reason I know this, and that other neighborhood leaders know this, is because of government records. Northside neighborhood leaders try to keep up; they’re some of the most hawk-eyed citizens in the city. But often times local government can be the worst enemy in untangling the messes left by these companies.
Neighborhood leaders and housing researchers are force multipliers to counties losing out on property taxes and cities failing to enforce rental license laws and ensure livability.
Government should be doing everything they can to ensure that housing researchers have the convenient access they need to help fight fraud and abuse.”
– “This is what happens when housing data isn’t open and accessible”, by Tony Webster. https://tonywebster.com/2014/01/housing-data-open-minneapolis/
Public process: Don’t botch your online engagement:
"In short, their impressive wizbangery can be deceptive, fooling the uninitiated into thinking it’s the tool that really matters, rather than the goal-focused story the tool allows you to tell." — Scott Doyon on websites for planning
osp, websites, engagement
The tools community boards and council members need
Borough President Gale Brewer hosted a roundtable on data and tool needs on Monday. Manhattan Community Board chairs and Council Members attended. Here’s my condensed list of the needs I heard —
- Need to identify locations for neckdowns, turn signals — identify them, where should they go, where are the dangerous intersections?
- What is the difference between crash data from NYPD vs DOT, who collects what data?
- Need to track construction/traffic issues.
- Want replacement tools to deal with the LMCC closing.
- What is the impact of new development (for planning schools, transit, sewage treatment)? Need forecasting tools.
- Need tools to overlay district info with other data layers.
- Need affordable housing data — type, expiration, capacity, requirements, what is being built, in the pipeline
- Need FEMA flood zone maps.
- Need construction projects mapped, all on a single map
- Need to map out energy efficiency, green buildings.
- Need to know commercial/vacancy rate in the community.
- Need population projections (for schools).
- Want to be proactive with air rights for Hudson River Park.
- Need help working with/verifying DOE data.
- Need health data (no hospital in the district, uses a lot of small community based health centers, hard to get those datasets).
- Want to set up a system to stay on top of a retail survey.
- Want to model shadow impact on parks from tall buildings.
- Want to get demographic data by school enrollment zone.
Quality of life
- Need tools to work with 311 data: construction, noise.
- Need State Liquor Authority data overlaid with other info tied to a single address.
- Need to see more info about liquor license requests — what else do applicants own in the city? What is going on with their applications before other boards?
- Want to track places with noise complaints/nuisance reports.
- Want to map buildings with C violations — not getting turned around quickly enough.
- Need access to quality of life data/complaints
- Need a digital complaints form for a CB office — want to see how many complaints come in, how many are resolved, etc.
- Need to track CB resolutions — they send them out, but aren’t sure if they are acted on (SLA etc.); don’t know how to follow or track.
- Need better public notifications — meetings, issue notifications, followups.
Walk First tool - what should SF be investing in?:
The City will be investing $17 million over the next five years to improve safety conditions for people walking. Given the City’s limited resources and the need to use this money effectively, you will be asked to prioritize each of 15 pedestrian safety tools (see Tools page for more information) by indicating whether or not you feel that the tool is a low, medium or high funding priority – essentially, how would you spend $17 million on pedestrian safety?
After each selection, you will see the graph at the right change in relation to your choices. This graph shows how your choices affect the total cost, the time to implement and the effectiveness of the solutions.
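That graph is, roughly, a weighted sum over your choices. Here’s a hedged sketch of that logic; the tool names, costs, effectiveness scores, and priority weights below are all invented for illustration, not the real figures from the tool:

```python
# Hypothetical pedestrian-safety tools -- invented numbers.
TOOLS = [
    {"name": "signal retiming", "cost": 8_000_000, "effectiveness": 30},
    {"name": "speed humps",     "cost": 6_000_000, "effectiveness": 20},
    {"name": "daylighting",     "cost": 4_000_000, "effectiveness": 15},
]

# Assumed mapping from a priority choice to a share of full funding.
WEIGHTS = {"low": 0.25, "medium": 0.5, "high": 1.0}

def score_choices(tools, choices, budget=17_000_000):
    """Roll priority choices up into the numbers the graph shows:
    total cost, total effectiveness, and whether we're within budget."""
    cost = sum(t["cost"] * WEIGHTS[choices[t["name"]]] for t in tools)
    effectiveness = sum(t["effectiveness"] * WEIGHTS[choices[t["name"]]] for t in tools)
    return {"cost": cost, "effectiveness": effectiveness,
            "within_budget": cost <= budget}
```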
participation, tool, osp
Microparticipation in Transportation Planning:
details of the Austin Twitter experiments, and more on “microblogging”
twitter, planning, osp, planbox
Don’t make laws to make maps
Hey, legislators! Don’t write laws that require maps (especially those that detail how the map information will be aggregated).
Instead, write laws to open up data. The maps will come. Much easier. Shorter, future-proofed laws.
If you feel strongly about this, go testify. @transalt will be there, and other open data smarties.
This time, the law under discussion is about crash data in NYC, but the same unfortunate approach already made it into law for crime data. Interactive maps are an excellent tool for making complex data public, but requiring a city agency to produce the map is not the right approach. Why not?
1. We need tools that answer questions and solve problems, and doing that well requires you to start with those needs, rather than building a generic map.
2. The track record of government-built maps is not great, maybe because of #1, or the tools they have, or because of internal development practices that don’t involve users, or something else. For example, the SLA liquor license map.
3. The track record of researchers and technologists and journalists to build data browsing tools is excellent. For example, excellent crime analysis, insightful 311 analysis, everything WNYC does, Vizzuality’s output etc.
4. There are complex tech problems that talented government technologists should work on. Making an interactive map isn’t one of them.
5. Legislation that is extremely specific seems brittle and prone to letter-of-the-law compliance later, especially if a city department decides to be uncooperative in the future. Full disaggregated data, by contrast, is flexible. We already have guidelines for opening up data “right”; there’s no need to re-design this for each type of data. Getting particular about mapping requirements is the worst sort of over-specifying. For crashes, for example, aggregation by street segment might prevent analysis of intersection safety.
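That last point is worth making concrete: from raw, disaggregated records you can produce any aggregation, but a mandated aggregation destroys the others. A toy illustration (the crash records are invented):

```python
from collections import Counter

# Disaggregated crash records -- invented, but each carries both an
# intersection and a street-segment identifier, like raw geocoded data.
crashes = [
    {"intersection": "5th & Main", "segment": "Main St 400-500"},
    {"intersection": "5th & Main", "segment": "Main St 400-500"},
    {"intersection": "6th & Main", "segment": "Main St 500-600"},
]

# With the raw records, any aggregation is a one-liner...
by_intersection = Counter(c["intersection"] for c in crashes)
by_segment = Counter(c["segment"] for c in crashes)

# ...but if a law mandates publishing only by_segment totals, there is
# no way to recover by_intersection from them later.
```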
CDOT Performance Management Dashboard:
Crying out for a really nice front end! Interesting data here…
The tables in this dashboard work to promote transparency and accountability by providing real-time information about CDOT’s performance of the multiple public way infrastructure maintenance tasks we take on every day.
The charts provide a breakdown of the types of tasks performed, the number of requests for services, the number of requests that have been addressed and those currently open. The information is updated on a daily basis.
dashboard, chicago, DOT, osp
“So while the specific problem this contest addresses is relatively humble, I’d see it as creating a larger opportunity for academics, researchers, data scientists, and curious participants to figure out if we can develop predictive algorithms that work for multiple cities. Because if we can, then these algorithms could be a shared common asset. Each algorithm would become a tool for not just one housing non-profit, or city program but a tool for all sufficiently similar non-profits or city programs.”
Great. We need more community-owned insights. 311 seems to be a hugely under-valued community asset, for lots of reasons. Hopefully this cool project from David Eaves and SeeClickFix will start to make the locked-up value in 311 more accessible to everyone.
Announcing the 311 Data Challenge, soon to be launched on Kaggle | eaves.ca
Thoughts on business models for civic tech
A frequent topic among the civic tech chatterati is the question of business models… How do we take these great ideas for civic engagement and scale them to orders of magnitude more people, operating as sustainable businesses with happy employees?
Although the non-profit model works for OpenPlans, I’m personally very keen to see more for-profit efforts - if what we’re doing is so great, it must be possible to do it without depending on members, foundations, etc.
(Boarding a plane to the Knight News Challenge Summer School today, I decided to write down some thoughts that have been rattling around in my head for months. There’s a shorter, more coherent argument to be made, but here’s a first pass, public draft.)
First, some thoughts about the common challenges faced by civic tech businesses. Not all firms have these problems, and none of these are impossible barriers, a problem is just an opportunity with a frowny-face, etc.
- The thought-diaspora of civic tech can be seen as very resilient, because we have many small firms that are independently going after similar goals. But we’re also a fragile collection — every government entity has a technology incumbent, and they aren’t going quietly. You can’t expect to take a multi-million dollar industry away and not experience some pushback (those golf club memberships have to be paid for somehow).
- We’re also fragile because some of the best ideas are either hard to scale, or coming from places that aren’t set up to scale or sell. Perhaps because of our hacker/fellowship origins, we’re good at coming up with creative solutions to problems, but less good at the other stuff. E.g. CrowdGauge or StreetMix.
- For civic engagement and planning software, there’s a risk of larger non-tech incumbents in the space moving in. When margins are thin, will the mega engineering firms develop in-house expertise in (say) TileMill or Shareabouts rather than going outside? Will young firms blaze a trail, only to see older firms cash in?
- Cities are often keen to work with local firms, and local talent - which is great, but it creates a market distortion that is bad for small firms based in a different city, even if their product is great.
- Smaller municipalities have a hard time spending money on things.
- And of course, procurement throws up all sorts of complexities.
Here are three approaches I’ve been musing over. The guiding principle for all of these is to preserve what works so well - small firms with the capacity to deliver groundbreaking tools - but scaled up drastically.
- the Civic Tech Collaborative, working together to get better at selling
- Scaling Together, working together to address the challenge of scaling
- More Local Tech, getting out of the way so a thousand developers take these tools to their communities.
I see a lot of challenges to those three approaches. In particular, I have an un-resolved conflict between the necessity to scale and the inability to scale - if these tools are so great, and as powerful as I think, why do we need the complex approaches described above?
Civic Tech Collaborative
Multiple small vendors who are affiliated and cross-promoting.
Focus on tools that aren’t on a trajectory to stand-alone sustainability already (so, not GovDelivery or Textizen, but maybe Shareabouts and Local Data).
A coherent, unified offering of a few smaller tools coming from the Code for America class each year, plus OpenPlans, and other incubators. Potentially, hiring a professional sales and support staff, to wrap a ribbon of professionalism around a number of tools at once, without each group having that expense.
So for example, OpenPlans could offer our services and re-sell other good tools without creating a procurement headache for clients.
There’s a version of this model where the collaborative shares skills rather than tools — the McKinsey of civic tech, like a more technical Bennett Midland.
Scaling Together
Possibly as part of the same collaborative, a grouping that helps turn good standalone projects into feasible service offerings.
The group finds people with expertise in running and tuning service versions of tools, in a manner that makes government IT people happy. The world of civic tech seems to be loaded towards the front end, so finding skills to scale up a service, or even just make hosting robust, could be a good complement. This approach saves each small outfit from needing to bring these skills onboard.
More Local Tech
(This one fascinates me.) Local tech expertise is already helping cities with their web problems. Can we give new tools to those teams, and also help toolmakers be successful?
Local tech expertise is often cheaper, and more accessible. For society more generally this might be a good thing — the tax dollars stay local, and government contracts are rewarding work for local developers. We all win if more small development shops are actively re-deploying the best civic tech, by having more people skilled at using and offering tools.
Tool producers can worry less about selling tools to smaller places, but need to find revenue streams to support core development. The obvious example is WordPress and Automattic, but the parallels aren’t exact, and the density of work is lower, both for the core development team and the individual re-sellers.
“What’s clear to me is what local government maps need is less GIS and a lot more user-friendly auto-complete and SEO. Because in the end users want search and retrieval to work for maps the way it works for the rest of the web.”
Great analysis of how people use maps online. Less GIS, more search, more web.
How the Public Actually Uses Local Government Web Maps: Metrics from Denver.
Simple tech for actual problems
I gave a quick demo of the Liquor License Helper at #betaNYC tonight. It’s a really crude tool that generates a list of churches within 200ft and places with liquor licenses within 500ft - important info for community boards deciding if a particular location is viable for a license.
The tool has some major issues — bad address search, problems with using land use parcels rather than addresses, distance between parcels isn’t distance to entrances, bad source data, etc, etc. All true.
But (hopefully) the point of the tool comes through - there’s a legit need for a couple of simple data queries, which community groups are doing right now with Word docs, old paper maps, human memory, and other tools. For various reasons, those are likely the right tool for some jobs. But not for looking up property details within a defined radius.
We don’t even need a map to convey this info - in fact, a map might make the report less helpful. Hitting Print brings up a lightly-optimized version for taking to meetings, copying, handing out, etc. Addresses are the pertinent data here (I think… based on my small sample of conversations with people at boards, who as target users are the only people who can really judge the value of this tool).
Let’s make more tools like this! It was easy (CartoDB and some queries). Let’s keep discovering and churning out simple tools for actual problems until all the easy problems are dealt with.
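The actual tool runs on CartoDB and some queries, but the core of it is simple enough to sketch in a few lines of Python (the coordinates and record fields below are hypothetical, and real distances should be parcel-to-parcel or entrance-to-entrance, as noted above):

```python
from math import asin, cos, radians, sin, sqrt

FEET_PER_MILE = 5280

def feet_between(lat1, lng1, lat2, lng2):
    """Haversine distance in feet (plenty accurate at these scales)."""
    lat1, lng1, lat2, lng2 = map(radians, (lat1, lng1, lat2, lng2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
    return 3959 * FEET_PER_MILE * 2 * asin(sqrt(a))

def license_report(site, churches, licensed_places):
    """The two simple data queries: churches within 200 ft and
    existing liquor licenses within 500 ft of a proposed site."""
    def near(places, limit):
        return [p for p in places
                if feet_between(site[0], site[1], p["lat"], p["lng"]) <= limit]
    return {"churches_200ft": near(churches, 200),
            "licenses_500ft": near(licensed_places, 500)}
```

A plain list like the one this returns is the whole report — which is the point: no map required, just the addresses a community board needs to hand around a meeting.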