In the budget, the administration argues that granting the public access to intellectual property and scientific knowledge leads to innovation. With open data from government-funded projects, citizens and businesses can build on pre-existing research, driving technological development. The Data.gov website, for example, offers more than 188,000 datasets on topics ranging from healthcare to agriculture. Using this information, outside groups have built applications that improve accessibility for people with disabilities and help food truck vendors improve their sales. The government has already taken a first step in expanding public access to datasets with initiatives like Project Open Data, but the additional investment proposed in the FY2017 budget could enable further independent research and innovation.
Second, according to the FY2017 budget, the increased availability and accessibility of open data will create new jobs and industries. Federal R&D data is often successfully leveraged by academics and entrepreneurs. For example, the National Aeronautics and Space Administration (NASA) has long collaborated with the private sector to conduct experiments on the International Space Station. After NASA granted private researchers access to its findings, the researchers were able to create a whole new industry in small satellites.
With the growth of citizen science comes the challenge of coordinating people, projects, and data. But these challenges also present a tremendous opportunity – with proper standardization, data can support multiple projects, allowing citizen science to address ever-grander issues and problems.
The U.S. Citizen Science Association (CSA) recently founded a Data and Metadata Working Group to promote collaboration in citizen science through the development and improvement of international standards for data and metadata. To advance and facilitate interoperability, the working group supports the standardization of:
Project metadata, which describes different types of citizen science activities. For example, one type of project metadata could be the intended outcomes of a certain citizen science project.
Observational metadata, which describes the data collected through citizen science activities. One type of observational metadata could be the location where a data point was collected.
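To make the distinction between the two layers concrete, here is a minimal sketch in Python. The field names are illustrative only, loosely inspired by the kinds of fields a standard like PPSR_CORE covers; they are not the actual schema.

```python
# Minimal sketch of the two metadata layers the working group standardizes.
# All field names here are hypothetical, not the real PPSR_CORE fields.

# Project metadata describes the citizen science activity itself,
# e.g. its intended outcomes.
project_metadata = {
    "project_id": "stream-watch-2016",
    "name": "Stream Watch",
    "activity_type": "field observation",
    "intended_outcomes": ["baseline water-quality map", "community engagement"],
}

# Observational metadata describes an individual data point,
# e.g. where and when it was collected.
observational_metadata = {
    "project_id": "stream-watch-2016",  # links the record back to its project
    "observed_at": "2016-04-02T09:30:00Z",
    "location": {"lat": 38.8895, "lon": -77.0353},
    "measurement": {"variable": "dissolved_oxygen", "value": 8.2, "unit": "mg/L"},
}

# A shared project identifier is what lets many observations be grouped
# under one project, enabling cross-project aggregation.
assert observational_metadata["project_id"] == project_metadata["project_id"]
```

Standardizing both layers is what allows one observation to serve multiple projects: any consumer that understands the shared fields can aggregate records without knowing each project's internal conventions.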
Capitalizing on the momentum from the recent White House event — which appointed citizen science coordinators in Federal agencies, highlighted legislation introduced in Congress on funding mechanisms and on clarifying legal and administrative issues around using citizen science, and launched a new Federal toolkit on citizen science and crowdsourcing — the Commons Lab is hosting a panel examining the legal issues affecting federal citizen science and the intellectual property rights that could arise from using citizen science.
This panel corresponds with the launch of two new Commons Lab Publications:
Managing Intellectual Property Rights in Citizen Science, by Teresa Scassa and Haewon Chung
Project managers and researchers conducting citizen science, whether at the federal level or in partnership with governmental agencies, face issues like the Information Quality Act that affect citizen science and crowdsourcing project design. Being aware of these issues before initiating projects will save time and provide avenues for complying with, or “lawfully evading,” potential barriers. The Commons Lab’s web-enabled policy tool will also be demonstrated at the event. This tool helps users navigate the complicated laws discussed in Robert Gellman’s report on legal issues affecting citizen science.
Intellectual property rights in the age of open source, open data, open science and citizen science are complicated and require significant forethought before embarking on a citizen science project. Please join us to hear two experts discuss the legal barriers and intellectual property issues in citizen science, and pick up a hard copy of the reports.
Teresa Scassa, Canada Research Chair in Information Law and Professor in the Faculty of Law, University of Ottawa
Haewon Chung, Doctoral Candidate in Law, University of Ottawa
Robert Gellman, Privacy and Information Policy Consultant in Washington, DC
Moderator Jay Benforado, Office of Research and Development, U.S. Environmental Protection Agency
We all have a role to play in creating a sustainable future for ourselves and our planet. Tracking the 17 Global Goals on eradicating extreme poverty, gender inequality, disease, and social injustice is no small task. We need new ideas on how to do this and how to empower people to get involved.
That’s why the Open Seventeen challenge, an open platform that supports crowdsourcing projects, was launched in May this year. It’s about tapping into the power of your online community, or the one in your backyard, to help you sift through existing datasets (images, scanned text, tweets and more!) to find things that computers can’t pick up.
Today we’re putting out the call for the second round of the Open Seventeen Challenge, looking for new ideas on how crowdsourcing can help tackle extreme poverty, corruption, and gender inequality. The winning ideas will receive online coaching and technical support to set up a crowdcrafting project and take it live.
We had some great grassroots ideas come out of the first round, and our two winning projects are getting ready to launch their crowdsourcing campaigns after graduating from the coaching program run by the GovLab Academy:
Promise 2030, led by John Ranford, is creating a “street guide to sustainable businesses.” Promise 2030 will be piloted in UK towns trying to reduce their CO2 impacts, tackling Global Goal 11: Sustainable Cities and Communities. By crowdsourcing information about local shops and businesses, Promise 2030 wants to grow the number of small and medium-sized enterprises that record and publish information on their sustainability.
DATAFARMA, led by Janeth Cifuentes and Manuel Mejía, will use crowdsourcing to gather information on Hepatitis C and the use of generic medicines, making that information easily accessible to people affected by the disease in Colombia and Mexico. The idea is to create a platform that maps disease outbreaks while also carrying vital information on associated treatments and patient care. The project will focus on Global Goal 3: Good Health & Well-Being.
The new challenge will run until the end of December 2015 and the winners will be announced in the new year!
The Open Seventeen Challenge is a joint initiative of the research organizations Citizen Cyberlab and GovLab, The ONE Campaign, and the open-source company SciFabric.
Learn about the Global Goals, then find out more about the Open Seventeen challenge and get involved at openseventeen.org!
CSA’s Data and Metadata Working Group aims to support, advance, and facilitate data interoperability among citizen science projects and other data repositories, and to promote collaboration in citizen science through the development and improvement of international standards for data and metadata.
CSA’s Data and Metadata Working Group is highly collaborative, partnering with ECSA and ACSA to ensure that any standards developed will be globally relevant. Currently, this working group is also collaborating with the US Federal Government to help ensure that metadata standards adopted for use in federal agencies (per the recent White House memorandum, “Addressing Societal and Scientific Challenges through Citizen Science and Crowdsourcing”) are compatible with PPSR_CORE.
Please contact Anne Bowser with any questions, and look for more news in the coming weeks.
On May 16th and 17th, 2015, the Commons Lab hosted Washington, D.C.’s first-ever Science Hack Day in collaboration with ArtsEdge of the John F. Kennedy Center for the Performing Arts. The event drew over 100 attendees, and a handful of the hacks produced have since gone on to form a non-profit or limited liability company, or to submit proposals for further funding.
The government’s use of hackathons as a tool to engage top talent, apply new ways of thinking to seemingly intractable problems, and increase public engagement and awareness has grown over the past decade, especially in the past couple of years. This exciting movement has incredible promise; however, there are strategic research investments and best practices the government could adopt to use this tool to its fullest potential. This case study analyzes the Science Hack Day event itself, highlights some of the award-winning hacks, explores some of the government’s investments in the concepts behind hackathons, and offers suggestions for avoiding potential pitfalls of mass collaboration.
You can download the report here: https://www.wilsoncenter.org/publication/science-hack-day-bridging-the-hacking-community-and-government.
What are enabling conditions? Agar in a petri dish helps bacteria grow into cultures for studying microbes; technology lets one date in a pool wider than one’s social circle; small grants help fund the exploration of a big idea; living in Mexico, rather than sitting in a classroom, speeds up learning Spanish; zoning policies requiring low-income housing in cities enable access and diversity. These are but a few examples of enabling conditions found across disciplines, industries and governments. On day two, Mae Jemison quoted the phrase, “The future doesn’t just happen – it’s made.” The goal of day three was to explore the enabling conditions needed to make that future: turning data into reliable environmental and societal information for decision-making.
In the morning, Inger Andersen, director general of the International Union for Conservation of Nature (IUCN), estimated that it costs $5 million a year to maintain the IUCN Red List, the canonical database on the state of the world’s endangered species. This amount is “peanuts” compared to other data collection initiatives, according to Andersen, who underscored the point by comparing it to the U.S. census, which spends $13 billion every 10 years to understand demographics. What is consistently undervalued, however, is the 300 years’ worth of volunteer time that goes into updating that database. According to Andersen, the biggest word of caution moving forward is to keep the spirit of volunteerism alive by giving contributors credit. This is not only the code of ethics in science and writing, but also basic human decency: give credit where it’s due, and the 300 years of volunteer time will grow to 600 years and beyond as the world becomes more connected.
Enabling condition: give citizen scientists credit where it’s due and see an increase in volunteer time.
Enrico Giovannini, an economist and statistician and a member of the Club of Rome, called for a “single state-of-the-art system that should serve [the] international community and countries [with the goal of] efficiency and effectiveness.” Giovannini also said we should pressure the private sector to share its data; this could include national policies requiring private companies operating in a country to disclose all data collected about its resources (natural, demographic, economic). Giovannini ended with a bold proposal: that countries write the Sustainable Development Goals (SDGs) into their constitutions. While that is a tall order — in the United States, there have been only 27 amendments to the Constitution in 226 years — a more digestible step might be to incorporate the SDGs into the missions of governmental and non-governmental organizations affiliated with one of the 17 goals.
Enabling conditions: Prevent duplication through collaboration, pressure private companies to share data while respecting private personal information and build the SDGs into governance frameworks.
Following the opening plenary was a panel on “Policies, partnerships and open data for sustainable development,” moderated by William Sontag, initiatives manager in the U.S. Environmental Protection Agency’s Office of International and Tribal Affairs. One notable part of this session was the big data ecosystem workshops led by CODATA in developing countries to spread best practices and thereby improve data accuracy.
The afternoon panels focused on cutting-edge technology and stories from the “feet in the field”: the people collecting the data, or managing the volunteers who collect it, at the start of the information pipeline. The feet-in-the-field representatives all emphasized the tremendous power of technology for turning data into information quickly in order to act on priority conservation areas. However, Lian Pin Koh of conservationdrones.org put it best: technology enables us to collect better data faster, but the main goal is to acquire the data in any way possible.
Enabling condition: don’t rely solely on technology; continue to use all methods of data collection (oral, social, manual), but make an extra effort to digitize the results and make them accessible to all.
Ayesha Yousef Al Biooshi of the Abu Dhabi Environment Agency shared methods of studying dugongs, of which the UAE is home to the second-largest population. The Conservation Leadership Programme advertised its capacity-building program for young, aspiring conservation leaders, and lastly, the Mohamed bin Zayed Species Conservation Fund shared stories from their grantees. The fund provides small grants to young ecologists, biologists and conservation leaders to collect valuable data for their academic work.
The importance of follow-up from this conference will be revealed in the years to come, as governments begin to set up their systems for monitoring and reporting on the SDGs. Delegates from the Eye on Earth Summit will continue to tackle the problems of harmonizing data demand, data supply and enabling conditions. In the final panel, leaders from the five founding organizations each stated their dedication to leveraging their own communities, in tandem with one another, to generate the information needed to monitor the SDGs.
Jacqueline McGlade of UNEP demoed the organization’s web intelligence platform for tracking the SDGs, UNEP Live, which aggregates media stories about individual countries and their environmental initiatives among other environmental data, like citizen science. The media stories are particularly relevant because, as McGlade noted, “Governments are driven by what is written about them.” If that’s so, then the Eye on Earth Alliance, with its amazing leaders, will hopefully inspire governments as news gets out about this incredible Alliance.
One suggestion for the next Eye on Earth Summit is to emphasize the process of collaboration. The Alliance convened around the problem of “oceans of data but only drops of information,” and well-executed collaboration is the bridge between the two. To that end, every presenter should spend at least one minute on how they work with other stakeholders to create a system of information for environmental and societal indicators. That way, organizations can learn from each other’s best practices for collaboration.
I’ll leave readers with a thought from one of my favorite authors, Chimamanda Ngozi Adichie, about the “danger of a single story.” When we believe wholeheartedly in one single story, we lose focus and disregard the possibility of other stories that describe the same thing. By bringing together organizations and individuals who care about the same thing — equitable access for all to data that informs decision-making — but may approach it differently, we are able to listen to everyone’s story and leverage the information they share.
I want to thank the organizers of the Eye on Earth Blogging Competition for this unique and amazing opportunity. While you might not have seen them, the media team is a crucial component of this Alliance, acting as the megaphone that communicates to the world what happened over the past three days.
Lastly, I will share my pledge: I am a citizen science researcher with the Woodrow Wilson International Center for Scholars in Washington, D.C., an organization that strives to be a bridge between academia and policy, informing people on matters of international importance with reliable and actionable information. I have the incredible privilege of taking what I learned here back to the United States and sharing it with academics, policymakers and relevant organizations. I look forward to signing up my organization as a member of the Eye on Earth Alliance.
Software systems are divided into two parts, the front end and the back end, to help identify issues, put a complex system into boxes and simplify maintenance. The back end is usually associated with data scientists, engineers and databases, while the front end is associated with users, designers, communicators and writers. Day two of the Eye on Earth Summit brought together both “ends” under the theme of data supply. The data supply back end? Data infrastructure investment, breaking down islands of information in the sea of the World Wide Web, and citizen science. The data supply front end? Data visualization and mapping, education, communication and equal access.
The day opened with invigorating talks. Barbara Ryan, Secretariat Director of the Group on Earth Observations, displayed an astonishing graph of imagery downloads from the Landsat satellite program once it was opened to the public: downloads jumped from 53 to 5,700. It’s easy to attribute the increase to access alone (and one would be correct), but the real insight came from the fact that a significant portion of the downloads came from other U.S. federal agencies. This emphasizes how open data reduces the resources each agency must allocate to information gathering and epitomizes the resource-effectiveness of open data. Muki Haklay of University College London, director of the ExCiteS program, provided a history of citizen science and a call to action to move beyond citizens as mere sensors, emphasizing a human-centered design approach when working with technology and communities. In other words, don’t just plop a cell phone in a community and expect results; work backwards from the community’s needs.
Christopher Tucker of MapStory, an open-source atlas of change, explained why policy-makers should care about these types of crowdsourced platforms. In ecology, the most resilient systems are usually those with high biodiversity (many different types of plants and animals in the system) and are therefore more adaptable to changes in the environment. Tucker argued that open-source platforms like MapStory, alongside “official” sources of geospatial information, provide a key piece of the information ecosystem and therefore add resiliency.
The plenary closed with Mae Jemison, the first African-American woman in space and principal of 100 Year Starship, an organization dedicated to creating the future of human space travel. Jemison came to a realization in space: Earth will always be here; the question is, will we? Through studying other planets, we can learn more about our own and perhaps gain insights that help us protect our life support system, the environment.
The Global Network of Networks (GNON) is an Eye on Earth initiative to create a supportive technical structure so that individual projects with databases of environmental information can connect to other databases holding similar or identical information. If a citizen science project collects water quality data and the field it uses to describe “water quality” is “H2O goodness,” that label probably has significance to the project, but it won’t make any sense to, say, a UAE national organization like AGEDI. GNON, according to Rob Atkinson of Metalinkage, is NOT trying to create another set of standards. Instead, it aims to create a support system that lets these thousands of databases of crucial environmental information explain what each field of data means to an individual project, and how it relates to possibly similar fields of data among these islands of information. If you’re interested in learning more, the Open Geospatial Consortium is soliciting projects for “testbeds,” where it will work with you and the World Wide Web Consortium (W3C) to prototype the concept. GNON expects to evolve with coming technology.
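The idea of mapping idiosyncratic field names onto a shared meaning, without forcing every project onto a single schema, can be sketched as a simple crosswalk. The project names, field names, and vocabulary terms below are all hypothetical; real GNON infrastructure would use formal vocabularies rather than a Python dictionary.

```python
# Hypothetical crosswalk: each project declares what its local field names mean
# in a shared vocabulary, so records stay comparable without a forced schema.

CROSSWALKS = {
    "stream-watch": {"H2O goodness": "water_quality_index"},
    "agedi-monitoring": {"wq_idx": "water_quality_index"},
}

def to_shared_terms(project, record):
    """Relabel a project's record using the shared vocabulary where a mapping exists.

    Fields without a declared mapping pass through unchanged, so projects
    are never forced to adopt the full shared schema.
    """
    mapping = CROSSWALKS.get(project, {})
    return {mapping.get(field, field): value for field, value in record.items()}

# Two projects with different local field names for the same concept:
a = to_shared_terms("stream-watch", {"H2O goodness": 7.4})
b = to_shared_terms("agedi-monitoring", {"wq_idx": 6.9})

# Both records now use the same key, so they can be compared or merged.
assert set(a) == set(b) == {"water_quality_index"}
```

The design choice mirrors GNON's stated goal: the crosswalk explains what each project's fields mean rather than imposing yet another standard, so existing databases keep their own structure while becoming discoverable by others.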
Funding, funding, funding! It’s loud and clear, coming from the community and echoed by Marc Levy, deputy director of the Center for International Earth Science Information Network, during a panel on “Data for Sustainable Development.” “You have to commit to open data, but none of the agencies [funding organizations] provide help or services to do it,” he said. While it’s noble to commit to open data principles, there are no line items in budgets for the support services that would enable research projects to share their data. And building on GNON: what is the point of sharing your data if your information isn’t discoverable or meaningful to someone?
The big question becomes: what does this cost? What costs are associated with maintaining a database and keeping it open? A panel of representatives from the International Union for Conservation of Nature, BirdLife International and the United Nations Environment Programme World Conservation Monitoring Centre (UNEP-WCMC) presented their databases during a session on “Understanding the costs of knowledge: cost of data generation and maintenance.”
Following the case studies, Diego Juffe-Bignoli, programme officer for the Protected Areas Programme at UNEP-WCMC, presented his research, which attempted to quantify how much it costs to maintain these cornerstone conservation “knowledge products.” The answer? It’s complicated. One notable issue in discerning costs was how to value volunteer time, because an hour from a professional scientist volunteering her expertise is worth a different amount than an hour from an interested and curious citizen. However, Juffe-Bignoli was able to provide a price range for investments over the last ~30 years, $116–204 million, with an estimated current annual investment of $6.5 million.
An incredible panel presented efforts to use citizen science to mobilize policy action during “Crowdsourcing, citizen science: everyone is a supplier.” Tuntiak Katan J., who had arrived from Ecuador at 3 a.m. that day, presented his work with indigenous communities monitoring carbon for the UN’s Reducing Emissions from Deforestation and Forest Degradation (REDD+) program. Brian Sullivan of Google Earth Outreach presented Global Fishing Watch, a real-time feed of the world’s fishing vessels, and told a compelling success story: the nation of Kiribati was able to bring illegal fishers to court using evidence from the platform, eventually causing the perpetrators to settle out of court. It is another example of open data driving transparency and, ultimately, legal action.
The day ended with a plenary on data visualization and the citizen science award. A refrain heard again and again was “what gets visualized gets used.” We heard from some of the most prominent names in data visualization for decision-making. The World Resources Institute (WRI) unveiled its prototype of Resource Watch, a platform developed with Vizzuality using a multi-pronged approach, from storytelling to accountability metrics. Craig Mills, CEO of Vizzuality, added that while there are more than 7 billion cell phone subscriptions, many subscribers don’t use smartphones. With Resource Watch, Vizzuality and WRI will move past data visualization aimed only at the developed world and offer services to those who have access only to SMS and audio, with a hotline providing real-time data on one’s geographic region. This type of thinking will be vital to the #datarevolution and should be praised and encouraged.
A panel of communicators, visualizers and data experts brings with it excellent metaphors, analogies and interactive content. The most compelling metaphor, however, was the simplest (not a coincidence). Angela Oduor Lungati, Ushahidi’s director of community engagement, described data, platforms and people as:
seeds = data
land = platforms, e.g. Ushahidi
farmers = people
Seeds and land are pointless without people to sow the land and care for the plants. Data and platforms are useless if they aren’t reaching the people who need them most.
The last event was the award for the citizen science challenge; all three finalists were invited to attend. Biocaching gamified biodiversity observations with a mobile app. Hack the Rainforest, an initiative of Digital Democracy, brought developers to the Peruvian rainforest to help communities create technological platforms for collecting and visualizing deforestation data. The winner, the Logging Roads Initiative from Moabi, used OpenStreetMap and Global Forest Watch to literally put illegal logging roads on the map in the Democratic Republic of the Congo.
You can’t build a software system without people on the front end and the back end, and neither is more important than the other. The Eye on Earth Summit 2015 brought together movers and shakers from both ends to build a software system for protecting our planet.
Today marks the first day of the Eye on Earth Summit, convened by an alliance dedicated to Principle 10 of the Rio Declaration: access to reliable, scientifically accurate and relevant information for environmental decision-making. The multi-stakeholder alliance is dedicated to utilizing big data and existing networks, and to tackling the difficult issues of interoperability to prevent silos of information. The alliance is organized around eight special initiatives.
Conceived by the Abu Dhabi Environment Agency and its Global Environmental Data Initiative in partnership with the United Nations Environment Programme, the alliance held its inaugural summit in 2011. The second summit, held October 6–8 in Abu Dhabi, coincides with the signing of the Sustainable Development Goals, which will be the main theme of the event. The days are divided into “Data Demand,” “Data Supply” and “Enabling Conditions.”
Citizen science will play a crucial role in providing the scale and granularity necessary for informed environmental decision-making. However, many barriers still stand between this information and decision makers; they are widely known in the community as issues of data quality and integrity, participation and motivation, sustainability, and duplication prevention. This summit will focus on some of these issues and demonstrate possible solutions, new initiatives and technologies that could carry information along the pipeline from civil society (citizen science) to local, regional, national and global environmental decision making.
Elizabeth Tyson of the Commons Lab won the Summit’s blogging competition and has the honor of reporting on the event as the Official Eye on Earth Blogger. Follow along for daily blog posts on exciting panels such as:
Addressing policy making demand
Donors demand for data and environmental data for business performance
Collaborative research and activist knowledge for environmental justice
Connecting networks to support environmental sustainability
Understanding the costs of knowledge: cost of data generation and maintenance
Reaching audiences through innovations in visualization
Citizen scientists and their role in monitoring of local to global environmental change
Policies, partnerships and open data for sustainable development
Additional media providing excellent coverage include Muki Haklay of University College London (via his blog, Po Ve Sham), @STIPcommonslab, #datarevolution, #EOESUMMIT15 and @EKTyson.
Yesterday, John P. Holdren, Director of the White House Office of Science and Technology Policy (OSTP), released a memorandum to the heads of executive departments and agencies calling for a variety of actions to fast-track the use of citizen science and crowdsourcing in the federal government. Among the specific actions: (1) each agency should appoint a citizen science and crowdsourcing coordinator, and (2) agencies should list citizen science and crowdsourcing projects on a new GSA website (similar to Challenge.gov, which lists prizes sponsored by agencies) to help the public find federally funded projects. This latter effort can build on the current Commons Lab database of federal projects.
Toward the end of the memo, OSTP outlines suggestions for building capacity in five areas: policy; resources and staffing; technologies and scientific instrumentation; grant-making; and rigorous research. While the recommendations are addressed to federal employees, one area in particular, grant-making mechanisms, should be of special interest to Do-It-Yourself biology and maker communities.
The memo states:
Create mechanisms for providing small grants to individuals and communities that may not be affiliated with universities or traditional government contractors
It then highlights DARPA’s Fast Track Initiatives as a flagship funding model that could be applied to citizen science and crowdsourcing. The Fast Track Initiative opens small research grants to individuals rather than traditional institutions like universities and Beltway contractors.
Federal funding agencies should develop metrics and procedures to allow actors outside the traditional academic or business communities to apply for and receive federal grants. If we want to harness the intellectual power of this movement, federal funding agencies should rethink their mechanisms for awarding grants – Todd Kuiken, Wilson Center
This is good news: the cries for diversifying science-funding mechanisms in the federal government have been heard, and we look forward to watching the ripple effects unfold. Citizen scientists and the DIYbio, maker and hacker communities will no longer be restricted to crowd-funding platforms, and should stay tuned and watch closely.