SciCast is a crowdsourced forecasting platform for science and technology run by George Mason University. It is based on the idea that the collective wisdom of an informed and diverse group is often a better predictor than the judgment of a single expert.
Part of the Forecasting Science and Technology (ForeST) Program funded by the Intelligence Advanced Research Projects Activity (IARPA), SciCast questions are generated by its participants, as well as ForeST teams at Inkling Markets, George Mason University, BAE Systems and SRI International. KaDSci LLC helps scientists and policymakers formulate questions for SciCast, and Gold Brand Software, LLC is the systems integrator.
SciCast is the largest S&T forecasting effort we know of, crowdsourcing in real-time from a pool of thousands of scientists and enthusiasts. Popular topics include Bitcoin, the search for MH370, chess, alternative energy, space sciences and honeybee colony collapse. We also have a richly connected set of questions on Top500 computer speeds, and another set on open problems in theoretical computer science.
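The aggregation idea behind crowd forecasting is simple to sketch. SciCast itself runs a far more sophisticated combinatorial prediction market, but as a minimal illustration (not SciCast's actual algorithm, and with invented numbers), here is how a pool of individual probability forecasts can be combined, and why a median damps the effect of a few extreme respondents:

```python
# Hypothetical individual forecasts for a yes/no question, expressed as
# probabilities between 0 and 1. One outlier (0.05) is included.
forecasts = [0.60, 0.70, 0.55, 0.90, 0.65, 0.05, 0.70]

# Mean: every forecast counts equally, so outliers pull the result.
mean = sum(forecasts) / len(forecasts)

# Median: robust to a handful of extreme forecasts.
ordered = sorted(forecasts)
n = len(ordered)
if n % 2:
    median = ordered[n // 2]
else:
    median = (ordered[n // 2 - 1] + ordered[n // 2]) / 2

print(round(mean, 3))  # 0.593
print(median)          # 0.65
```

Either statistic is a crude stand-in for a real market mechanism, which also weights forecasters by track record and lets prices update continuously, but the sketch captures the core intuition: many independent estimates, combined, tend to land closer to the truth than most individual ones.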
Internet search engines have replaced the need for the National Technical Information Service (NTIS), a federal agency that collects and organizes scientific and technical information derived from government-sponsored research, according to a new Senate bill introduced in early April. The bill, called the “Let Me Google That For You” Act, would strike funding for the NTIS, which is part of the Commerce Department.
The NTIS was created more than 40 years ago to disseminate knowledge from government-funded research and reports. The need for NTIS before the onset of the internet age was clear, but today, the bill claims, “95 percent of the reports available from sources other than NTIS [are] available free of charge” from a website called “www.google.com.” The agency currently receives $67 million in federal funding annually.
1) Making Digital the Default: Building a 21st Century Veterans Experience
2) Data Innovation: Unleashing the Power of Data Resources to Improve Americans’ Lives
3) By the People, for the People: Crowdsourcing to Improve Government
“This highly-competitive program recruits talented, diverse individuals from the innovation community and pairs them with top civil servants to tackle many of our nation’s biggest challenges, and to achieve a profound and lasting social impact,” according to the White House. Since August 2012, fellows have teamed up with those in government to develop new solutions to all manner of problems.
Think you’ve got what it takes? Applications are due April 7, 2014 — you can start the process here.
And be sure to check out our report on citizen science and government here.
Editor’s note: In this guest post, Ian Kalin, currently the director of open data for software company Socrata and former Presidential Innovation Fellow, shares his thoughts about the tools that are available to cities looking to take advantage of the increased data flows.
This past week, The Economist published “The Multiplexed Metropolis,” a brilliantly global and judicious review of data-driven civic innovation. To quote the article’s subhead, “Enthusiasts think that data services can change cities in this century as much as electricity did in the last one. They are a long way from proving their case.”
Knowing better than to argue with the bible of capitalism, I instead offer that The Economist missed a few of the most important civic innovation tools. Civic innovators are people who create new ways to improve cities. They can come from all corners of civic society: government, the private sector, non-profits, academia, etc. Their greatest resource is people who are motivated to work together and improve their community.
The growing use of social media and other mass collaboration technologies is opening up new opportunities in disaster management efforts, but is also creating new challenges for policymakers looking to incorporate these tools into existing frameworks, according to our latest report.
The Commons Lab, part of the Wilson Center’s Science & Technology Innovation Program, hosted a September 2012 workshop bringing together emergency responders, crisis mappers, researchers, and software programmers to discuss issues surrounding the adoption of these new technologies.
We are now proud to unveil “Connecting Grassroots to Government for Disaster Management: Workshop Summary,” a report discussing the key findings, policy suggestions, and success stories that emerged during the workshop. The report’s release coincides with the tenth annual Disaster Preparedness Month, sponsored by the Federal Emergency Management Agency in the Department of Homeland Security to help educate the public about preparing for emergencies.
The Commons Lab of the Science and Technology Innovation Program is proud to announce the release of The Power of Hackathons: A Roadmap for Sustainable Open Innovation. Hackathons are collaborative events that have long been part of programmer culture, where people gather in person, online or both to work together on a problem. This could involve creating an application, improving an existing one or testing a platform.
In recent years, government agencies at multiple levels have started holding hackathon events of their own. For this brief, author Zachary Bastian interviewed agency staff, hackathon planners and hackathon participants to better understand how these events can be structured. The fundamental lesson was that a hackathon is not a panacea, but should instead be part of a broader open data and innovation-centric strategy.
As part of the Commons Lab’s ongoing initiative to highlight the intersection of emerging technologies and citizen science, we present a profile of SeaSketch, marine management software that makes complex spatial planning tools accessible to everyone. This profile was prepared with the gracious assistance of Will McClintock, director of the McClintock Lab.
The SeaSketch initiative highlights key components of successful citizen science projects. The end product is a result of an iterative process where the developers applied previous successes and learned from mistakes. The tool was designed to allow people without technical training to participate, expanding access to stakeholders. MarineMap had a quantifiable impact on California marine protected areas, increasing their size from 1 percent to 16 percent of the coastline. The subsequent version, SeaSketch, is uniquely suited to scale out worldwide, addressing coastal and land management challenges. By emphasizing iterative development, non-expert accessibility and scalability, SeaSketch offers a model of successful citizen science.
I was working as a Presidential Innovation Fellow when the process to create the Open Data Policy began. Anyone within government is used to seeing documents circulate with no real idea of when they were last edited, by whom, or whether they are the most current version. This is very opaque. So while we were working on open data policy, the process itself was not at all open. Open source developers among the Innovation Fellows started talking about using GitHub to create the actual document. Lowering the barrier to entry was always the idea—we want people editing this and sharing their perspectives.
Today, the Office of Management and Budget and the Office of Science and Technology Policy jointly released a new Open Data Policy directing agencies to implement specific structural reforms. In conjunction with an Executive Order prioritizing open and machine-readable government information, these adjustments are forward-looking and exciting. They speak to a general understanding that a deliberate approach to the way data are processed and released can greatly enhance their value.
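Machine-readable release means tools, not people, do the finding and filtering. As a hedged sketch (the field names below follow the general shape of the Project Open Data metadata schema, but the catalog entry, agency, and URL are invented for illustration), here is how a structured agency data catalog can be processed programmatically:

```python
import json

# A hypothetical catalog entry in the style of an agency data listing;
# the dataset, publisher, and download URL are invented placeholders.
catalog = json.loads("""
{
  "dataset": [
    {
      "title": "Example Inspection Results",
      "publisher": "Example Agency",
      "accessLevel": "public",
      "distribution": [
        {"format": "CSV", "downloadURL": "https://example.gov/data.csv"}
      ]
    }
  ]
}
""")

# Because the structure is predictable, a script can answer questions
# like "which datasets are public?" without a human reading web pages.
public = [d["title"] for d in catalog["dataset"]
          if d.get("accessLevel") == "public"]
print(public)  # ['Example Inspection Results']
```

The point of the sketch is the design choice, not the particular fields: once agencies publish inventories in a consistent, structured format, third parties can aggregate, search, and build on them automatically.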
Editor’s note: In September 2012, the Commons Lab hosted the Connecting Grassroots to Government for Disaster Management workshop. Over two days, we spoke with a number of event participants for a series of video podcasts covering various aspects of the proceedings. The conversation below with Eric Rasmussen is the first of these podcasts. Please stay tuned: Additional installments will be posted in the coming weeks and the workshop summary report will be published in June.
Eric Rasmussen wears many hats: He is a medical doctor, a research professor for environmental security and global medicine at San Diego State University, an affiliate associate professor of medicine at the University of Washington, and the managing director at Infinitum Humanitarian Systems, a “profit-for-purpose” company in California that focuses on reducing vulnerability for systems and populations. In addition to sitting on a number of boards, Rasmussen served in the Navy for more than 25 years and was deployed more than 15 times to Iraq, Afghanistan and other countries.
In this podcast, Rasmussen discusses the limitations software developers face when moving ideas from concept to implementation in disaster response, noting that developers often have too little access to end users and too little understanding of the constraints faced by those users in the field. He also discusses the need to engage agencies and other responders early on to make sure new systems are incorporated into agency response plans and the role of policymakers in addressing these challenges.