Connecting Grassroots to Government Podcast #5: Michael Frank Goodchild

Editor’s note: In September 2012, the Commons Lab hosted the Connecting Grassroots to Government for Disaster Management workshop. Over two days, we spoke with a number of event participants for a series of video podcasts covering various aspects of the proceedings. Additional installments will be posted in the coming weeks. The workshop summary report is available here.

In this podcast, Michael Frank Goodchild talks about major improvements he sees taking place over the next decade in relation to social media and disaster response, particularly how diverse, web-based data streams could be cleaned up and synthesized. With new technologies, Goodchild hopes for “a means to turn a whole lot of stuff into a complete picture.” Goodchild further talks about data quality issues surrounding this information.

Goodchild, an emeritus professor of geography at the University of California-Santa Barbara and an affiliate professor of geography at the University of Washington, was formerly director of the National Center for Geographic Information and Analysis.

We spoke with Goodchild at the Connecting Grassroots to Government workshop in September 2012.

Ten of the Best City-Oriented Innovation Tools

Editor’s note: In this guest post, Ian Kalin, currently director of open data at the software company Socrata and a former Presidential Innovation Fellow, shares his thoughts on the tools available to cities looking to take advantage of increased data flows.

This past week, The Economist published “The Multiplexed Metropolis,” a brilliantly global and judicious review of data-driven civic innovation. To quote the article’s subhead, “Enthusiasts think that data services can change cities in this century as much as electricity did in the last one. They are a long way from proving their case.”

Knowing better than to argue with the bible of capitalism, I instead offer that The Economist missed a few of the most important civic innovation tools. Civic innovators are people who create new ways to improve cities. They can come from all corners of civil society: government, the private sector, nonprofits, academia, etc. Their greatest resource is people who are motivated to work together and improve their community.

Beyond that, there are a variety of tools making a real impact within U.S. cities, and every civic innovator should consider using them:
Continue reading “Ten of the Best City-Oriented Innovation Tools”

Project Open Data: Interview with Ben Balter

In our recent post on the Open Data Policy, we mentioned Project Open Data as an exciting manifestation of collaborative government concepts put into practice. To learn more, we reached out to GitHubber Ben Balter, former Presidential Innovation Fellow and previous contributor to the Commons Lab. Ben also provided input on agile development for our paper on the National Broadband Map.

Ben Balter

How did GitHub become a part of this project?

I was working as a Presidential Innovation Fellow when the process to create the Open Data Policy began. Anyone within government is used to seeing documents circulate with no real idea of when they were last edited, by whom, or whether the copy in hand is the most current version. This is very opaque. So while we were working on open data policy, the process itself was not at all open. Open source developers within the Innovation Fellows started talking about using GitHub to create the actual document. Lowering the barrier to entry was always the idea—we want people editing this and sharing their perspectives. Continue reading “Project Open Data: Interview with Ben Balter”

New Open Data Rules Continue Push for Government Innovation

FCC Visualization of Low Power FM Availability, built on open data and explained on GitHub.

Today, the Office of Management and Budget and the Office of Science and Technology Policy jointly released a new Open Data Policy directing agencies to implement specific structural reforms. In conjunction with an Executive Order prioritizing open and machine-readable government information, these adjustments are forward-looking and exciting. They speak to a general understanding that a deliberate approach to the way data are processed and released can dramatically enhance their value.

Continue reading “New Open Data Rules Continue Push for Government Innovation”

The Specter of “Preemptive Government”

Last week, reporter Alex Howard published an interview on O’Reilly’s Strata blog that discusses a new use of Big Data in the form of “preemptive government.” This concept refers to emerging technologies that can sift through large, heterogeneous datasets and make predictive judgments about everything from crime to building or business code violations.

The phrase “preemptive government” itself sounds like it was torn from the pages of a Philip K. Dick story, with government agents targeting potential violators of the law before any violation occurs. In the piece, former Indianapolis mayor (and former New York City deputy mayor) Stephen Goldsmith acknowledges that the concept raises some very thorny ethical questions, particularly around disproportionate police attention and profiling. Issues around data collection, retention and usage all bear on how such preemptive governance would be conducted. From a conceptual standpoint, preemptive governance could be construed as a new form of surreptitious surveillance.

But it could be a two-way street. As Goldsmith acknowledges, inasmuch as preemptive governance relies on crowdsourced forms of data production, it can enable citizens to participate in the ways their environments are governed. Much like how participatory Geographic Information Systems (PGIS) to some degree enabled citizens to use technologies to influence urban administration, crowdsourcing data production opens new connections between citizens and their government. Also, like PGIS, this process is likely to be fraught with social and political challenges that bear exploring, in particular how marginalized communities are impacted.

Research is needed to understand whether Big Data, crowdsourcing, and, perhaps, preemptive government can increase government efficiency, and how this impacts different social groups. Are some people marginalized by these processes? Are others given a greater voice in governance? What kinds of problems can these processes address?

In the meantime, policymakers could look to these technologies as ways to improve operations and to increase transparency, while figuring out how to navigate legal and social frameworks of privacy and confidentiality. In an ideal world, these technologies can be leveraged to empower everyday people to positively influence the ways they are governed.

About the author

Ryan Burns, PhC, is a doctoral candidate in geography at the University of Washington-Seattle. He is currently serving as a research assistant with the Commons Lab of the Science and Technology Innovation Program at the Woodrow Wilson International Center for Scholars. He is studying the social and political implications of geographic technologies, particularly the ways new mapping and social media technologies are being integrated into disaster management strategies.

Crowdsourcing at USAID

On June 28, 2012, officials from USAID met at the Wilson Center to discuss a recent experiment in using crowdsourcing to help clean up and map data on development loans. The international aid agency had 117,000 records on private loans made possible by USAID’s Development Credit Authority, which could be mapped and made available to the public. The problem? Location data for these loans was not standardized and could not be easily mapped. USAID officials decided to turn to “the crowd,” recruiting interested volunteers from the Standby Task Force and GISCorps to help clean the data and make it available for additional analysis.
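The cleanup task is easy to picture: the same place can appear in many free-text variants that a mapping tool cannot match. Here is a minimal sketch of the kind of normalization the volunteers performed — the place names and lookup rules are purely illustrative, not USAID's actual data:

```python
# Hypothetical cleanup: map free-text location fields to a consistent
# "City, Country" form so records can be geocoded and plotted.
RAW_TO_STANDARD = {
    "nairobi / kenya": "Nairobi, Kenya",
    "kampala": "Kampala, Uganda",
    "accra(ghana)": "Accra, Ghana",
}

def normalize_location(raw):
    """Return a standardized 'City, Country' string, or None if unknown."""
    key = raw.strip().lower()
    return RAW_TO_STANDARD.get(key)

records = ["  Nairobi / Kenya", "KAMPALA", "accra(ghana)", "???"]
print([normalize_location(r) for r in records])
# → ['Nairobi, Kenya', 'Kampala, Uganda', 'Accra, Ghana', None]
```

In practice, building the lookup table record by record is the hard part — and exactly the labor the volunteers contributed.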

At the event, Shadrock Roberts, Stephanie Grosser and D. Ben Swartley — all with USAID — discussed the results of the project, which cleaned up the information at virtually no cost to the government, noting that they hope it will serve as an example for further collaboration between government and engaged volunteers.

“By leveraging partnerships, volunteers, other federal agencies, and the private sector, the entire project was completed at no cost,” the officials wrote in a recently released case study focused on the exercise. “Our hope is that the case study will provide others in government with information and guidance to move forward with their own crowdsourcing projects. Whether the intent is opening data, increased engagement, or improved services, agencies must embrace new technologies that can bring citizens closer to their government.”

While the USAID exercise resulted in high-quality output, agency officials did offer advice for other organizations considering similar work. At the event, Grosser stressed the need to “crawl, walk, [then] run” — that is, start small with a few hundred records and then expand as the system is refined.

But USAID sees crowdsourcing becoming a key part of improving government data. “We need to be working as hard to release relevant data we already have as we are to create it,” Roberts concluded, when discussing how the USAID experience could apply more generally. “The crowd is willing to do research, data mining, and data cleanup.”

PowerPoint slides from the event are available here.

Statistical Significance: How Big Data is Changing the Way We See the World

Craig Fugate, administrator of the Federal Emergency Management Agency, once said of predicting calamities: “Disasters are like horseshoes, hand grenades and thermonuclear devices; you just need to be close – preferably more than less.”

Luckily, the information age has given organizations throughout the world a better way to understand and begin to predict complex phenomena. This new method ties together old-fashioned statistical analysis with the explosion of crowdsourced information now being transmitted over phone networks and the internet. This includes Twitter posts, Facebook status updates, blog posts, text messages, Amazon purchase histories, and more — what a recent white paper by UN Global Pulse dubs “Big Data.”

Though huge datasets have existed before, Big Data information has the unique characteristics of being both current and broad enough for decision-makers to make policy and operational decisions that respond to immediate issues. On one hand, Big Data is an excellent tool when leveraged effectively. It helped to save lives after the 2010 Haiti earthquake, is being mined by the intelligence community to track future trends, and may even find its way into US counterinsurgency doctrine. However, as with all human-sourced data, Big Data is notoriously fickle, and is prone to misinterpretation. With some NGOs, businesses, and governments working tirelessly to harness the power of Big Data, even as others remain far more reluctant, it’s worth considering both sides of this new venture.

What does Big Data do for us?

On one hand, having access to millions of lines of data collected from various sources can lead to insights that would have previously been impossible. One of the earlier applications of large-scale statistical analysis came some years ago, during the trial of Slobodan Milosevic, where analysts demonstrated that surges in Serbian troop movements — not NATO bombings or Albanian guerrilla attacks, as Milosevic claimed — were the primary cause of Albanians fleeing their homes.

Since then, the information age has led to much greater opportunities. Google has been successful in correlating specific web search terms with outbreaks of illnesses, such as dengue fever or the flu, often before the Centers for Disease Control and Prevention or other health agencies have officially confirmed the outbreak. This method looks at the number of people who search for terms such as “dengue” using Google’s search engine; based on how many searches occur in a specific area within a narrow span of time, Google can estimate the likelihood that an outbreak has occurred. In the aftermath of the Haiti earthquake, disaster relief NGOs implemented a nationwide SMS campaign, allowing anyone with a cell phone to report the locations of injured or trapped people via a simple text message. The density and content of these text messages were updated on a map in real time and used to decide how best to deploy resources. Continue reading “Statistical Significance: How Big Data is Changing the Way We See the World”

Leveraging Social Media for Global Development: Robert Kirkpatrick of UN Global Pulse

The recent Tech@State conference featured an address from Robert Kirkpatrick of United Nations (UN) Global Pulse. There have been many applications of social media to the relatively short-term process of crisis management, but Global Pulse instead focuses on the utility of real-time data for longer-term development. The initiative analyzes new digital data sources — social media, online news, online prices, etc. — to demonstrate how they can enhance traditional statistics. For example, in one of Global Pulse’s initial projects, the number of Indonesian tweets including the words “rice” and “price” was found to correlate closely with official Indonesian government food price inflation rates. The work has intriguing potential, and Pulse Labs will soon be operational in Indonesia and Uganda. The Wilson Center’s Zachary Bastian spoke with Robert about the project.

Where did Global Pulse begin?

Global Pulse began with the recognition, at the height of the global economic crisis in 2009, that we were seeing something new. In today’s volatile and hyper-connected world, socioeconomic shocks emerge and reverberate around the globe almost with the speed of natural disasters. World leaders became very concerned about the impact of the crisis on vulnerable populations — they knew millions would be at risk. Jobs would be lost, and families would end up selling assets, making difficult trade-offs, and falling back into the grip of poverty and hunger. Yet as they sought to prioritize responses with dwindling resources, they found that up-to-date, household-level information on who was being impacted, and how, simply wasn’t available. National census data and official statistics about how populations are faring lag by months — and often by years.

As the global economy becomes more volatile and the pace of change accelerates, we have found that traditional 20th-century tools for tracking development cannot keep up. A great deal of money is spent on aid, but these investment decisions are often made without real-time awareness of needs or real-time feedback on the efficacy of our policies and programs. Recognizing both the need for more timely and accurate information and the opportunity presented by the emergence of “Big Data,” the UN Secretary-General established the Global Pulse initiative to help the global community gain the situational awareness required for protecting fragile development gains in the 21st century.
Continue reading “Leveraging Social Media for Global Development: Robert Kirkpatrick of UN Global Pulse”