Blog

Thoughts on improving social capital, collaboration, and outcomes.

Putting Constituents at the Center of Public Performance Reporting


Last week I had the privilege to lead a workshop at the 10th annual Public Performance Conference in collaboration with my friend Andy Krackov. The conference is organized by the National Center for Public Performance (which is currently moving its institutional home from Rutgers to Suffolk University). It gathers a considerable audience of state and local government officials focused on making government more efficient and accountable to citizens.

This year's conference theme was sustaining public performance initiatives during times of transition. In our session, "Constituents at the Center: A Design-Thinking Approach to Performance Measurement & Reporting," we recommended an approach for freeing public performance systems from administration-driven goals and technology-driven reporting. We've seen performance improvement projects peter out because new administrations want to define their own goals, and too often performance dashboards fail to meet the basic needs of the people on the front lines who must use the data. Such projects can be designed for sustainability, usability, and efficacy, but doing so requires a different conceptual approach.

Public sector performance improvement is a challenge that is ripe for more design thinking. One way to help these systems withstand the pressures of political and institutional change and sustain themselves for the long term is to consult the users for whom they should be designed: constituents. Political climates and administrations change frequently, but constituents and their needs -- with the exception of some highly transient pockets of the country -- are a relative constant. And local challenges such as spurring economic development and improving public health don't arise and can't be solved on election-cycle timelines.

I think the following diagram from the Nielsen Norman Group effectively visualizes the core components of the design thinking process. In summary, this methodology engages real users to uncover problems and test ideas, favors rapid prototyping and iteration over defining and engineering “the perfect solution,” and embraces the concept that “launched” solutions must continually evolve and improve.

There are many reasons why design thinking is a useful methodology for public performance measurement and reporting. Here are three.

  1. Designers focus on real problems and solutions for citizens, as opposed to purely political priorities. Talking to constituents and understanding their needs can help ensure the reporting system adequately reflects those needs. Most design thinking definitions include the notion of "empathizing" with users, which means getting beyond their stated needs and really seeking to understand latent needs. There was a great story last month in the Harvard Business Review about understanding the hidden reasons behind missed medical appointments.
  2. Solutions aren’t constrained to out-of-the-box reporting from technology platforms. We all work within constraints, and if a significant investment has been made in a technical platform, there’s an obvious desire and argument to use what’s available. But to the extent possible, I would advise not allowing the technology to determine the design of your solution. Too often I’ve had clients who wanted to de-prioritize user research and testing in order to save more time and budget for technology. But ensuring usability is the most critical part of a design project. You are much better off with a solution with fewer bells and whistles that really meets a core user need than with a feature-heavy tool that gathers digital dust because no one needs, wants, or knows how to use it.
  3. A focus on longer-term constituent needs can increase the chances that a reporting program survives across administrations. Former Maryland Governor Martin O'Malley was the keynote speaker at the Public Performance Conference, and he delivered a compelling argument for a more goal-oriented, data-driven, and transparent government. I was disappointed to see that one of Gov. O’Malley’s signature performance reporting achievements -- StateStat -- had been dismantled under the Hogan administration. But there's some reason for hope (and perhaps justification for my hypothesis) in Montgomery County’s CountyStat, which was developed collaboratively with a diverse group of 150 constituents in 2006 and is still actively maintained.

Despite the obvious benefits, a comprehensive design thinking approach may not be feasible for your organization. My advice is to start small -- choose one audience research activity to improve your system's design. For example, a few years ago I worked with a former colleague at Forum One to design the first tool to share publicly reported data from the largest greenhouse gas emitting facilities in the United States. Because of government restrictions on research with human subjects, we weren’t able to consult with more than nine users. Given this constraint, we focused our attention on our highest priority audience: communicators and researchers with subject matter expertise in environmental issues. We conducted a focus group with representative users from this segment during which we reviewed draft, low-fidelity storyboards of the application we proposed to design. Their critical feedback in this session was essential to ensuring that the application would meet their needs and the needs of other end users. Just one example of an improvement coming out of this focus group was the recommendation to design a completely different interface for mobile users that used native GPS capabilities to find GHG emitting facilities based on current location.
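To make that mobile recommendation concrete, here is a minimal sketch, in TypeScript, of how a mobile view might use the browser's standard Geolocation API to surface the facility nearest a user. This is an illustration only: the Facility shape, the sample data, and the nearestFacility helper are hypothetical assumptions, not the actual application we built.

```typescript
// Hypothetical sketch: find the GHG-emitting facility nearest to the user.
// The Facility interface and sample data below are illustrative assumptions.

interface Facility {
  name: string;
  lat: number;
  lon: number;
}

// Haversine distance in kilometers between two lat/lon points.
function distanceKm(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const R = 6371; // Earth's mean radius in km
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Return the facility closest to the given coordinates.
function nearestFacility(facilities: Facility[], lat: number, lon: number): Facility | undefined {
  return facilities.reduce<Facility | undefined>(
    (best, f) =>
      !best || distanceKm(lat, lon, f.lat, f.lon) < distanceKm(lat, lon, best.lat, best.lon)
        ? f
        : best,
    undefined
  );
}

// The standard browser Geolocation API supplies the user's current position.
navigator.geolocation.getCurrentPosition((pos) => {
  const facilities: Facility[] = [
    { name: "Example Power Plant", lat: 39.29, lon: -76.61 },
    { name: "Example Refinery", lat: 38.9, lon: -77.04 },
  ];
  const nearest = nearestFacility(facilities, pos.coords.latitude, pos.coords.longitude);
  console.log("Nearest facility:", nearest?.name);
});
```

The design point is the one the focus group surfaced: on mobile, "where am I relative to these facilities?" is the core user need, so location becomes the entry point rather than a search form.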

Are you interested in learning more about applying design thinking to your open data or performance reporting project? Get in touch.