“The only reason I know this, and that other neighborhood leaders know this, is because of government records. Northside neighborhood leaders try to keep up; they’re some of the most hawk-eyed citizens in the city. But oftentimes local government can be the worst enemy in untangling the messes left by these companies.
Neighborhood leaders and housing researchers are force multipliers to counties losing out on property taxes and cities failing to enforce rental license laws and ensure livability.
Government should be doing everything they can to ensure that housing researchers have the convenient access they need to help fight fraud and abuse.”
– “This is what happens when housing data isn’t open and accessible”, by Tony Webster. https://tonywebster.com/2014/01/housing-data-open-minneapolis/
Public process: Don’t botch your online engagement:
"In short, their impressive wizbangery can be deceptive, fooling the uninitiated into thinking it’s the tool that really matters, rather than the goal-focused story the tool allows you to tell." — Scott Doyon on websites for planning
osp, websites, engagement
Borough President Gale Brewer hosted a roundtable on data and tool needs on Monday. Manhattan Community Board chairs and Council Members attended. Here’s my condensed list of the needs I heard —
Quality of life
Walk First tool - what should SF be investing in?:
The City will be investing $17 million over the next five years to improve safety conditions for people walking. Given the City’s limited resources and the need to use this money effectively, you will be asked to prioritize each of 15 pedestrian safety tools (see Tools page for more information) by indicating whether or not you feel that the tool is a low, medium or high funding priority – essentially, how would you spend $17 million on pedestrian safety?
After each selection, you will see the graph at the right change in relation to your choices. This graph shows how your choices affect the total cost, the time to implement and the effectiveness of the solutions.
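As a rough illustration of how a trade-off graph like that might be computed, here is a minimal Python sketch. The tool names, costs, timelines, effectiveness scores, and priority weights below are all invented stand-ins, not WalkFirst’s actual data or method:

```python
# Hypothetical model of a cost/time/effectiveness trade-off graph: each
# priority choice weights a tool's figures into running totals.
# All tools and numbers below are invented for illustration.
TOOLS = {
    "curb extensions": {"cost_m": 4.0, "years": 3, "effectiveness": 7},
    "signal retiming": {"cost_m": 1.5, "years": 1, "effectiveness": 5},
    "raised crosswalks": {"cost_m": 2.5, "years": 2, "effectiveness": 6},
}
PRIORITY_WEIGHT = {"low": 0.25, "medium": 0.5, "high": 1.0}

def summarize(choices):
    """Aggregate cost ($M), longest timeline (years), and summed
    effectiveness for a dict of {tool: priority} choices."""
    cost = score = 0.0
    years = 0
    for tool, priority in choices.items():
        w = PRIORITY_WEIGHT[priority]
        t = TOOLS[tool]
        cost += w * t["cost_m"]
        years = max(years, t["years"])
        score += w * t["effectiveness"]
    return {"cost_m": cost, "years": years, "effectiveness": score}

print(summarize({"curb extensions": "high", "signal retiming": "medium"}))
# e.g. {'cost_m': 4.75, 'years': 3, 'effectiveness': 9.5}
```

Each change to the choices dict would redraw the graph, which is what makes the tool feel responsive.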
participation, tool, osp
Microparticipation in Transportation Planning:
details of the Austin Twitter experiments, and more on “microblogging”
twitter, planning, osp, planbox
Hey, legislators! Don’t write laws that require maps (especially those that detail how the map information will be aggregated).
Instead, write laws to open up data. The maps will come. Much easier. Shorter, future-proofed laws.
If you feel strongly about this, go testify. @transalt will be there, and other open data smarties.
This time, the law under discussion is about crash data in NYC, but the same unfortunate approach already made it into law for crime data. Interactive maps are an excellent tool for making complex data public, but requiring a city agency to produce the map is not the right approach. Why not?
1. We need tools that answer questions and solve problems, and doing that well requires you to start with those needs, rather than building a generic map.
2. The track record of government-built maps is not great, maybe because of #1, or the tools they have, or because of internal development practices that don’t involve users, or something else. For example, the SLA liquor license map.
3. The track record of researchers, technologists, and journalists at building data browsing tools is excellent. For example: excellent crime analysis, insightful 311 analysis, everything WNYC does, Vizzuality’s output, etc.
4. There are complex tech problems that talented government technologists should work on. Making an interactive map isn’t one of them.
5. Legislation that is extremely specific seems brittle and prone to letter-of-the-law compliance later, especially if a city department decides to be uncooperative in the future. Full disaggregated data, by contrast, is flexible. We already have guidelines for opening up data “right”; there’s no need to re-design them for each different type of data. Getting particular about mapping requirements is the worst sort of over-specifying. For crashes, for example, aggregation by street segment might prevent analysis of intersection safety.
CDOT Performance Management Dashboard:
Crying out for a really nice front end! Interesting data here…
The tables in this dashboard work to promote transparency and accountability by providing real-time information about CDOT’s performance of the multiple public way infrastructure maintenance tasks we take on every day.
The charts provide a breakdown of the types of tasks performed, the number of requests for services, the number of requests that have been addressed, and those currently open. The information is updated on a daily basis.
dashboard, chicago, DOT, osp
“So while the specific problem this contest addresses is relatively humble, I’d see it as creating a larger opportunity for academics, researchers, data scientists, and curious participants to figure out if we can develop predictive algorithms that work for multiple cities. Because if we can, then these algorithms could be a shared common asset. Each algorithm would become a tool for not just one housing non-profit or city program, but a tool for all sufficiently similar non-profits or city programs.”
Great. We need more community-owned insights. 311 seems to be a hugely under-valued community asset, for lots of reasons. Hopefully this cool project from David Eaves and SeeClickFix will start to make the locked-up value in 311 more accessible to everyone.
A frequent topic among the civic tech chatterati is the question of business models… How do we take these great ideas for civic engagement and scale them to orders of magnitude more people, operating as sustainable businesses with happy employees?
Although the non-profit model works for OpenPlans, I’m personally very keen to see more for-profit efforts - if what we’re doing is so great, it must be possible to do it without depending on members, foundations, etc.
(Boarding a plane to the Knight News Challenge Summer School today, I decided to write down some thoughts that have been rattling around in my head for months. There’s a shorter, more coherent argument to be made, but here’s a first pass, public draft.)
First, some thoughts about the common challenges faced by civic tech businesses. Not all firms have these problems, and none of these are impossible barriers; a problem is just an opportunity with a frowny-face, etc.
Here are three approaches I’ve been musing over. The guiding principle for all of these is to preserve what works so well - small firms with the capacity to deliver groundbreaking tools - but scaled up drastically.
I see a lot of challenges to those three approaches. In particular, I have an unresolved conflict between the necessity to scale and the inability to scale - if these tools are so great, and as powerful as I think, why do we need the complex approaches described above?
Civic Tech Collaborative
Multiple small vendors who are affiliated and cross-promoting.
Focus on tools that aren’t on a trajectory to stand-alone sustainability already (so, not GovDelivery or Textizen, but maybe Shareabouts and Local Data).
A coherent, unified offering of a few smaller tools coming from the Code for America class each year, plus OpenPlans and other incubators. Potentially, hiring a professional sales and support staff to wrap a ribbon of professionalism around a number of tools at once, without each group having that expense.
So for example, OpenPlans could offer our services and re-sell other good tools without creating a procurement headache for clients.
There’s a version of this model where the collaborative shares skills rather than tools — the McKinsey of civic tech, like a more technical Bennett Midland.
Possibly as part of the same collaborative, a grouping that helps turn good standalone projects into feasible service offerings.
The group finds people with expertise in running and tuning service versions of tools, in a manner that makes government IT people happy. The world of civic tech seems to be loaded towards the front end, so finding skills to scale up a service, or even just to make hosting robust, could be a good complement. This approach saves each small outfit from needing to bring these skills onboard.
(This one fascinates me.) Local tech expertise is already helping cities with their web problems. Can we give new tools to those teams, and also help toolmakers be successful?
Local tech expertise is often cheaper, and more accessible. For society more generally this might be a good thing — the tax dollars stay local, and government contracts are rewarding work for local developers. We all win if more small development shops are actively re-deploying the best civic tech, by having more people skilled at using and offering tools.
Tool producers can worry less about selling tools to smaller places, but need to find revenue streams to support core development. The obvious example is WordPress and Automattic, but the parallels aren’t exact, and the density of work is lower, both for the core development team and the individual re-sellers.
“What’s clear to me is what local government maps need is less GIS and a lot more user-friendly auto-complete and SEO. Because in the end users want search and retrieval to work for maps the way it works for the rest of the web.”
Great analysis of how people use maps online. Less GIS, more search, more web.
I gave a quick demo of the Liquor License Helper at #betaNYC tonight. It’s a really crude tool that generates a list of churches within 200ft and places with liquor licenses within 500ft - important info for community boards deciding if a particular location is viable for a license.
The tool has some major issues — bad address search, problems with using land use parcels rather than addresses, distance between parcels isn’t distance to entrances, bad source data, etc., etc. All true.
But (hopefully) the point of the tool comes through - there’s a legit need for a couple of simple data queries, which community groups are doing right now with Word docs, old paper maps, human memory, and other tools. For various reasons, those are likely the right tool for some jobs. But not for looking up property details within a defined radius.
We don’t even need a map to convey this info - in fact, a map might make the report less helpful. Hitting Print brings up a lightly-optimized version for taking to meetings, copying, handing out, etc. Addresses are the pertinent data here (I think… based on my small sample of conversations with people at boards, who as target users are the only people who can really judge the value of this tool).
Let’s make more tools like this! It was easy (CartoDB and some queries). Let’s keep discovering and churning out simple tools for actual problems until all the easy problems are dealt with.
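For a flavor of the queries involved, here is a minimal Python sketch of the two radius checks. The real tool runs comparable logic as SQL in CartoDB; the function names, coordinates, and sample records below are all invented for illustration:

```python
# Hypothetical sketch of the Liquor License Helper's two radius queries:
# churches within 200 ft of a site, licensed places within 500 ft.
# Sample coordinates and names below are invented.
from math import radians, sin, cos, asin, sqrt

FEET_PER_METER = 3.28084

def distance_feet(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in feet."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    meters = 2 * asin(sqrt(a)) * 6371000  # mean Earth radius
    return meters * FEET_PER_METER

def places_within(site, places, radius_ft):
    """Return the places whose coordinates fall within radius_ft of site."""
    return [p for p in places
            if distance_feet(site[0], site[1], p["lat"], p["lon"]) <= radius_ft]

# Invented sample data: a proposed site and nearby records.
site = (40.7282, -73.9942)
churches = [{"name": "Example Church", "lat": 40.7285, "lon": -73.9945}]
licenses = [{"name": "Example Bar", "lat": 40.7290, "lon": -73.9930}]

print(places_within(site, churches, 200))   # churches within 200 ft
print(places_within(site, licenses, 500))   # licensed places within 500 ft
```

Note this measures point-to-point distance; as the caveats above say, the real question of parcel-to-entrance distance is messier than any simple radius check.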
WhichHood.org is now more fun. Check it out.
Above, some emerging Brooklyn neighborhoods following a burst of interest after the GeoNYC meetup last week.
“If you have a problem and can’t come up with a solution, I suggest you take it to Temple University’s Urban Apps and Maps Studio. The 130 Philadelphia teens who participated this summer seem capable of solving anything.”
Just read this piece by Donna Frisby-Greenwood right after finishing “Race Against the Machine”… A great case study for the optimism of the final chapter?
Donna checks out demos at the end of the six-week summer program of research, fieldwork and software dev skills, and shares some amazing stories —
“Two of [the teens] said they originally thought there wasn’t anything they could do to help their city, so it felt amazing to contribute ideas that might make a difference. In fact, Moira Baylson, deputy director for the Philadelphia Office of Arts, Culture and the Creative Economy, was so excited that she wanted the students to get started with their plans immediately.”
Brilliant project. Understanding the scope of public notices is the first step to coming up with new tools and standards — and what better approach to doing the survey than going out and taking pics?