Transparency Talk

Category: "Sharing" (58 posts)

Creating a Culture of Learning: An Interview with Yvonne Belanger, Director of Evaluation & Learning, Barr Foundation
November 8, 2018

Yvonne Belanger is the director of learning & evaluation at the Barr Foundation and leads Barr's efforts to gauge its impact and support ongoing learning among staff, grantees, and the fields in which they work.

Recently, Janet Camarena, director of transparency initiatives for Foundation Center, interviewed Belanger about how creating a culture of learning and openness can improve philanthropy. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


GlassPockets: More and more foundations seem to be hiring staff with titles having to do with evaluation and learning. You’ve been in this role at the Barr Foundation for just about a year, having come over from a similar role at the Bill & Melinda Gates Foundation. Why do you think roles like this are on the rise in philanthropy, and what are your aspirations for how greater capacity for evaluation and learning can benefit the field?

Yvonne Belanger: I think the spread of these roles in strategic philanthropy comes from increasing recognition that building a stronger learning function is a strategic investment, and it requires dedicated expertise and leadership. My hope is that strong evaluation and learning capacity at Barr (and across the philanthropic sector generally) will enable better decisions and accelerate the pace of social change to make the world more equitable and just.

GP: What have been your priorities in this first year and what is your approach to learning? More specifically, what is Barr’s learning process like, what sources do you learn from, how do you use the learnings to inform your work?

YB: At Barr, we are committed to learning from our efforts and continuously improving. Our programmatic work benefits from many sources of knowledge to inform strategy, including landscape scans, academic research, ongoing conversations with grantees, formal site visits, and program evaluations, to name a few. During this first year, I have been working with Barr’s program teams to assess their needs, to sketch out a trajectory for the next few years, and to launch evaluation projects across our strategies to enhance our strategic learning. Learning is not limited to evaluating the work of our programs, but also includes getting feedback from our partners. Recently, we were fortunate to hear from grantees via our Grantee Perception Report survey, including specific feedback on our learning and evaluation practices. As we reflected on their responses in relation to Barr’s values and examples of strong practice among our peers, we saw several ways we could improve.

GP: What kinds of improvements are you making as a result of feedback you received?

YB: We identified three opportunities for improvement: to make evaluation more useful, to be clearer about how Barr defines success and measures progress, and to be more transparent with our learning.

  • Make evaluations more collaborative and beneficial to our partners. We heard from our grantees that participating in evaluations funded by Barr hasn’t always felt useful or applicable to their work. We are adopting approaches to evaluation that prioritize grantee input and benefit. For example, in our Creative Commonwealth Initiative, a partnership with five community foundations to strengthen arts and creativity across Massachusetts, we included the grantees early in the evaluation design phase. With their input, we modified and prioritized evaluation questions and incorporated flexible technical assistance to build their capacity for data and measurement. In our Education Program, the early phase of our Engage New England evaluation is focused on sharing learning with grantees and the partners supporting their work to make implementation of these new school models stronger.
  • Be clearer about how we measure outcomes. Our grantees want to understand how Barr assesses progress. In September, we published a grantee guide to outputs and outcomes to clarify what we are looking for from grantees and to support them in developing a strong proposal. Currently, our program teams are clarifying progress measures for our strategies, and we plan to make that information more accessible to our grantees.
  • Share what we learn. To quote your recent GrantCraft Open for Good report, “Knowledge has the power to spark change, but only if it is shared.” To maximize Barr’s impact, we aim to be #OpenForGood and produce and share insights that help our grantees, practitioners, policymakers, and others. To this end, we are proactively sharing information about evaluation work in progress, such as the evaluation questions we are exploring, and when the field can expect results. Our Barr Fellows program evaluation is one example of this practice. We are also building a new knowledge center for Barr to highlight and share research and reports from our partners, and make these reports easier for practitioners and policymakers to find and re-share.

GP: Clearly all of this takes time and resources to do well. What benefits can you point to of investing in learning and knowledge sharing?

YB: Our new Impact & Learning page reflects our aspiration that, by sharing work in progress and lessons learned, we will influence nonprofits and other funders, advance field knowledge, inform policy, and elevate community expertise. When you are working on changing complex systems, there are almost never silver bullets. To make headway on difficult social problems we need to view them from multiple perspectives and build learning over time by analyzing the successes – and the failures – of many different efforts and approaches.

GP: Barr’s president, Jim Canales, is featured in a video clip on the Impact & Learning page talking about the important role philanthropy plays as a source of “risk capital” to test emerging and untested solutions, some of which may not work or may even fail, and that the field should see these as learning opportunities. And, of course, these struggles and failures could be great lessons for philanthropy as a whole. How do you balance this tension at Barr, between the desire to provide “risk capital,” the desire to open up what you are learning, and reputational concerns about sharing evaluations of initiatives that didn’t produce the desired results?

YB: It’s unusual for foundations to be open about how they define success, and admissions of failure are notably rare. I think foundations are often just as concerned about their grantees’ reputation and credibility as their own. At Barr we do aspire to be more transparent, including when things haven’t worked or our efforts have fallen short of our goals. To paraphrase Jim Canales, risk isn’t an end in itself, but a foundation should be willing to take risks in order to see impact. The factors that influence impact or the pace of change are often ones funders have control over, such as the amount of risk we were willing to take, or the conceptualization and design of an initiative. When a funder can reflect openly on these issues, that reflection usually generates valuable lessons for philanthropy and illustrates the kind of risks we should be able to take more often.

GP: Now that you are entering your second year in this role, where are the next directions you hope to take Barr’s evaluation and learning efforts?

YB: In addition to continuing and sustaining robust evaluation for major initiatives across our program areas, and sharing what we’re learning as we go, we have two new areas of focus in 2019 – people and practices. We will have an internal staff development series to cultivate mindsets, skills, and shared habits that support learning, and we will also be working to strengthen our practices around strategy measurement so that we can be clearer both internally and externally about how we measure progress and impact. Ultimately, we believe these efforts will make our strategies stronger, will improve our ability to learn with and from our grantees, and will lead to greater impact.

 

Data Fix: Do's & Don'ts for Reporting Geographic Area Served
November 1, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This is the second post in a series intended to improve the data available for and about philanthropy.

The first post in our Data Fix series focused on areas that may seem straightforward but often cause confusion, including recipient location data. But don’t confuse recipient location (where the check was sent) with Geographic Area Served (the area meant to benefit from the funding). Data on recipient location, one of our required fields, allows us to match data to the correct organization in our database, ensuring accuracy for analyses or data visualizations. In contrast, Geographic Area Served, one of our highest priority fields, helps us tell the real story about where your funding is making an impact.

How to Report Geographic Area Served

We recognize that providing data on Geographic Area Served can be challenging. Many funders may not track this information, and those who do may depend on grantees or program staff to provide the details. It’s important to keep in mind that sharing some information is better than no information, as funders are currently the only source of this data.

Do include details for locations beyond the country level. For example, for U.S. locations, specify a state along with providing geo area served at the city or county level. For non-U.S. locations, include the country name when funding a specific city, province, state, or region.

Don’t be too broad in scope. “Global Programs” may not be accurate if your work is focused on specific countries. Similarly, listing the geo area served as “Canada” is misleading if the work is serving the province of “Quebec, Canada” rather than the entire country.

Do use commas to indicate hierarchy and semi-colons to separate multiple areas served (a small parsing sketch follows these do's and don'ts). For example:

  • Topeka, Kansas (comma used to indicate hierarchy)
  • Hitchcock County, Nebraska; Lisbon, Portugal; Asia (semi-colons used to list and separate multiple locations)
Don’t use negatives or catch-all terms. “Not California,” “Other,” “Statewide” or “International” may be meaningful within your organization, but these terms cannot be interpreted for mapping. Instead of “Statewide,” use the name of the state. Instead of “International,” use “Global Programs” or list the countries, regions, or continent being served.

Do define regions. If you are reporting on geo area served at the regional level (e.g. East Africa), please provide a list of the countries included in your organization’s definition of that region. Your definition of a region may differ from that of Foundation Center. Similarly, if your foundation defines its own regions (Southwestern Ohio), consider including the counties comprising that region.

Don’t forget to include the term “County” when reporting on U.S. counties. This will ensure your grant to an entire county isn’t assigned to the same named city (e.g. Los Angeles County, California, rather than Los Angeles, California).
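To make the comma/semicolon convention above concrete, here is a minimal sketch in Python showing how a Geographic Area Served value might be split into separate areas and hierarchy levels. The function name is hypothetical and this is only an illustration of the format, not part of Foundation Center's eReporting tooling.

```python
def parse_geo_area_served(value):
    """Split a Geographic Area Served string into areas and hierarchy levels.

    Semicolons separate multiple areas served; within each area, commas
    indicate hierarchy from the most specific place to the broadest.
    """
    areas = []
    for area in value.split(";"):
        # e.g. "Hitchcock County, Nebraska" -> ["Hitchcock County", "Nebraska"]
        levels = [level.strip() for level in area.split(",") if level.strip()]
        if levels:
            areas.append(levels)
    return areas


print(parse_geo_area_served("Hitchcock County, Nebraska; Lisbon, Portugal; Asia"))
# [['Hitchcock County', 'Nebraska'], ['Lisbon', 'Portugal'], ['Asia']]
```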

Geographic Area Served in Foundation Center Platforms

Data provided (in a loadable format) will appear in “Grant Details” in Foundation Directory Online (FDO) and Foundation Maps. Foundation Maps, including the complimentary eReporter map showing your own foundation’s data, also displays an Area Served mapping view.


If data is not provided, Foundation Center will do one of the following:

  • Default to the location of the recipient organization
  • Add geo area served based on text in the grant description
  • Add geo area served based on where the recipient organization works, as listed on their website or in their mission statement, if this information is available in our database

Responsibly Sharing Geographic Area Served


Although our mission is to encourage transparency through the sharing of grants data, we acknowledge there are contexts in which sharing this data may be cause for concern. If the publishing of this data increases risks to the population meant to benefit from the funding, the grantee/recipient, or your own organization, you can either omit Geographic Area Served information entirely or report it at a higher, less sensitive level (e.g. country vs. province or city). For more information on this topic, please see Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts and Sharing Data Responsibly: A Conversation Guide for Funders.

More Tips to Come!

I hope you have a better understanding of how to report Geographic Area Served through eReporting. Without this data, valuable information about where funding is making a difference may be lost! Moving forward, we’ll explore the required fields of Recipient Name and Grant Description. If you have any questions, please feel free to contact me.

-- Kati Neiheisel

Data Fix: Do's and Don'ts for Data Mapping & More!
October 3, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This post is part of a series intended to improve the data available for and about philanthropy.

As many of you know, Foundation Center was established to provide transparency for the field of philanthropy. A key part of this mission is collecting, indexing, and aggregating millions of grants each year. In recent years this laborious process has become more streamlined thanks to technology, auto-coding, and to those of you who directly report your grants data to us. Your participation also increases the timeliness and accuracy of the data.

Today, over 1300 funders worldwide share grants data directly with Foundation Center. Over the 20 years we've been collecting this data, we've encountered some issues concerning the basic fields required. To make sharing data even quicker and easier, we've put together some dos and don'ts focusing on three areas that may seem straightforward, but often cause confusion.

Location Data for Accurate Mapping and Matching

Quite simply, to map your grants data we need location information! And we need location information for more than mapping. We also use this information to ensure we are matching data to the correct organizations in our database. To help us do this even more accurately, we encourage you to provide as much location data as possible. This also helps you by increasing the usability of your own data when running your own analyses or data visualizations.

  • Do supply Recipient City for U.S. and non-U.S. Recipients.
  • Do supply Recipient State for U.S. Recipients.
  • Do supply Recipient Country for non-U.S. Recipients.
  • Don't forget to supply Recipient Address and Recipient Postal Code, if possible.
  • Don't supply a post office box in place of a street address for Recipient Address, if possible.
  • Don't confuse Recipient location (where the check was sent) with Geographic Area Served (where the service will be provided).

What's Your Type? Authorized or Paid?

Two types of grant amounts can be reported: Authorized amounts (new grants authorized in a given fiscal year, including the full amount of grants that may be paid over multiple years) or Paid amounts (as grants would appear in your IRS tax form). You can report on either one of these types of amounts – we just need to know which one you are using: Authorized or Paid.

  • Do indicate if you are reporting on Authorized or Paid amounts.
  • Do remain consistent from year to year in sending either Authorized amounts or Paid amounts, to prevent duplication of grants.
  • Do report the type of Currency of the amount listed, if not US Dollars.
  • Don't send more than one column of Amounts in your report – either Authorized or Paid for the entire list.
  • Don't forget to include Grant Duration (in months) or Grant Start Date and Grant End Date, if possible.
  • Don't include more than one amount per grant.

The Essential Fiscal Year

An accurate Fiscal Year is essential since we publish grants data by fiscal year in our data-driven tools and content-rich platforms such as those developed by Foundation Landscapes, including Funding the Ocean, SDG Funders, Equal Footing and Youth Giving. Fiscal Year can be reported with a year (2018) or date range (07/01/2017-06/30/2018), but both formats will appear in published products as YEAR AWARDED: 2018.

  • Do include the Fiscal Year in which the grants were either Authorized or Paid by you, the funder.
  • Do format your Fiscal Year as a year (2018) or a date range (07/01/2017-06/30/2018).
  • Don't provide the Fiscal Year of the Recipient organization.
  • Don't forget, for off-calendar fiscal years, the last year of the date range is the Fiscal Year: 07/01/2017-06/30/2018 = 2018.
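As a worked example of that last rule, here is a minimal sketch in Python. The function name and the assumption that ranges use the MM/DD/YYYY-MM/DD/YYYY format shown above are illustrative only; the point is simply that for an off-calendar range, the year of the end date is the Fiscal Year.

```python
def fiscal_year_label(value):
    """Return the Fiscal Year as an integer from a year or a date range.

    Accepts either a plain year ("2018") or a range such as
    "07/01/2017-06/30/2018". For a range, the year of the end date
    (the last year in the range) is the Fiscal Year.
    """
    value = value.strip()
    if "-" in value:
        end_date = value.split("-")[-1]      # "06/30/2018"
        return int(end_date.split("/")[-1])  # 2018
    return int(value)


print(fiscal_year_label("2018"))                   # 2018
print(fiscal_year_label("07/01/2017-06/30/2018"))  # 2018
```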

More Tips to Come!

I hope you have a better understanding of these three areas of data to be shared through Foundation Center eReporting. Moving forward, we'll explore the required fields of Recipient Name and Grant Description, as well as high priority fields such as Geographic Area Served. If you have any questions, please feel free to contact me. Thank you! And don't forget, the data you share IS making a difference!

-- Kati Neiheisel

Staff Pick: Foundation Funded Research Explores How to Improve the Voter Experience
August 9, 2018

Becca Leviss is a Knowledge Services Fellow at Foundation Center.

This post is part of the GlassPockets’ Democracy Funding series, designed to spotlight knowledge about ways in which philanthropy is working to strengthen American democracy.

Voting is central to our democracy, providing citizens from all communities a direct way to influence the future by conveying beliefs through civic participation. Though foundations by law must be non-partisan, they can and do support democracy in a variety of ways, and we are tracking these activities in our publicly available Foundation Funding for U.S. Democracy web portal.
 
From this data we can see that encouraging broad civic participation is one of the most popular ways in which institutional philanthropy supports our democracy. Specific strategies under civic participation include issue-based participation, civic education and leadership, naturalization and immigrant civic integration, and public participation. So, what have foundations learned from these efforts about how to strengthen our democracy? Today we will zoom in to learn from a foundation-funded report that is openly available, containing findings from data collection on elections and voting patterns, including how well the process is working and who is included or excluded.
 
Our latest “Staff Pick” from IssueLab’s Democracy Special Collection, which comprises foundation-funded research on the topic, explores an aspect of the voter experience in America that could be improved. With less than 90 days to go before the midterm elections, we’re pleased to offer this deep dive into an important piece of voting-related research.
 
Research in the social sector can sometimes feel inaccessible or artificial—based on complex theories, mathematical models, and highly controlled situations. This report, however, presents its research methodology and results in a clear, understandable manner that invites the reader to continue its work of understanding how polling sites can use their resources to both investigate and improve the voter experience.

STAFF PICK

Improving the Voter Experience: Reducing Polling Place Wait Times by Measuring Lines and Managing Polling Place Resources, by Charles Stewart III; John C. Fortier; Matthew Weil; Tim Harper; Stephen Pettigrew 

Download the Report

Publisher

Bipartisan Policy Center

Funders

Ford Foundation; The Democracy Fund

Quick Summary

Voting is the cornerstone of civic engagement in American democracy, but long wait times and inefficient organization at polling places can undermine the voting process and even discourage citizens from voting altogether. In 2013, President Barack Obama launched the bipartisan Presidential Commission on Election Administration (PCEA) to initiate studies and collaborative research on polling place wait times. The PCEA’s work revealed that while wait times and poll lines are a serious issue in the United States, they are also reflective of deeper, more complex problems within the election administration system. This report by the Bipartisan Policy Center summarizes the PCEA’s efforts and highlights how the knowledge gained can produce action and improvement at polling sites. Ultimately, the report emphasizes the need for continued research and innovation in approaching common issues in the voter experience.

Field of Practice

Government Reform

What makes it stand out?

“Long lines may be a canary in the coal mine,” begins the report, “indicating problems beyond a simple mismatch between the number of voting machines and voters, such as voter rolls that are inaccurate or onerous.” Quantitative and qualitative data has shown that long lines at the polls have wide-reaching economic costs of over half a billion dollars in a presidential election, as well as the immeasurable cost of voter discouragement due to polling place problems. These issues are exacerbated at polling sites that are urban, dense, and home to large minority populations, where lack of resources and access can disenfranchise the voting population.

While the dilemma of election administration is complex, the report describes a rather straightforward series of projects by the Massachusetts Institute of Technology and the Bipartisan Policy Center. MIT and BPC collaborated to create a system of data collection on polling lines and polling place efficiency that would be simple and easily implemented by poll workers. The program utilized basic queuing theory: estimating the average wait time of a voter by dividing the average line length by the average arrival rate. For fellow (and potential future) researchers, this report spends a meaningful portion of time explaining the significance of each variable, how it is calculated, and how its fluctuation impacts the overall results of the investigation. We are given examples of several successful iterations of the study and their evaluations, as well as insight into certain research choices.
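For readers who want to see the arithmetic, here is a minimal sketch of that queuing calculation (Little's law), written in Python with made-up observations rather than data from the report: a poll worker counts the line at regular intervals and tallies arrivals, and the average wait is the average line length divided by the average arrival rate.

```python
# Little's law: average wait (W) = average line length (L) / average arrival rate (lambda).
# The observations below are invented for illustration, not taken from the report.

line_length_samples = [40, 25, 15, 10, 5]   # voters counted in line at each hourly check
arrivals_per_hour = [60, 50, 45, 40, 35]    # voters who joined the line during each hour

avg_line_length = sum(line_length_samples) / len(line_length_samples)  # L, in voters
avg_arrival_rate = sum(arrivals_per_hour) / len(arrivals_per_hour)     # lambda, voters/hour

avg_wait_hours = avg_line_length / avg_arrival_rate                    # W, in hours
print(f"Average wait: {avg_wait_hours * 60:.0f} minutes")              # ~25 minutes
```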

MIT/BPC’s work has found that an overwhelming majority of Election Day polling sites—82 percent—experienced the longest line when the doors first opened. In all, 90 percent of Election Day polling sites had their longest lines within the first two hourly samples (Hour 0 and Hour 1), with the lines declining at an average rate after that. Similarly, voters experienced the longest wait times when lines were at their longest. This pattern is vastly different from that of early voting sites, where wait time is relatively constant; however, these sites still most commonly experience their longest lines at the beginning of the day (25 percent of the studied population).

The research emphasizes the importance of adequately preparing for the length of the longest line. The report suggests that if polling sites adjust worker shifts to accommodate strong early morning voter turnout on Election Day, they can easily clear the lines within the first few hours of voting, thus saving money and better serving their voters. The report also recognizes the range of its results: in other words, individual precincts have individual needs. Without meaningful research, however, we cannot know how to meet those needs and improve the voter experience. Therefore, as readers (and hopefully fellow voters), we are encouraged by MIT/BPC’s work to take clear and simple action to improve our own polling sites through continued research and investigation. This report exemplifies the importance of making the research and data process transparent and attainable so that we can not only understand its significance, but actively contribute to its efforts. There are many processes that could benefit from this kind of data analysis to improve the user experience. What if foundations analyzed their grant processes in this way? I can’t help but think that there is much philanthropy can learn from the government through reports like this, which show how institutions are opening up data collection to improve the user experience for actors and stakeholders.

Key Quote

“Precincts with large numbers of registered voters often have too few check-in stations or voting booths to handle the volume of voters assigned to the precinct, even under the best of circumstances. Precincts that are unable to clear the lines from the first three hours of voting are virtually guaranteed to have long lines throughout the day. Polling places in urban areas often face design challenges—small, inconvenient spaces—that undermine many election officials’ best efforts to provide adequate resources to these locations.”

--Becca Leviss

What Philanthropy Can Learn from Open Government Data Efforts
July 5, 2018

Daniela Pineda, Ph.D., is vice president of integration and learning at First 5 LA, an independent public agency created by voters to advocate for programs and polices benefiting young children. A version of this post also appears in the GOVERNING blog.

Statistics-packed spreadsheets and lengthy, jargon-filled reports can be enough to make anybody feel dizzy. It's natural. That makes it the responsibility of those of us involved in government and its related institutions to find more creative ways to share the breadth of information we have with those who can benefit from it.

Government agencies, foundations and nonprofits can find ways to make data, outcomes and reports more user-friendly and accessible. In meeting the goal of transparency, we must go beyond inviting people to wade through dense piles of data and instead make them feel welcome using it, so they gain insights and understanding.

How can this be done? We need to make our data less wonky, if you will.

This might sound silly, and being transparent might sound as easy as simply releasing documents. But while leaders of public agencies and officeholders are compelled to comply with requests under freedom-of-information and public-records laws, genuine transparency requires a commitment to making the information being shared easy to understand and useful.

“…genuine transparency requires a commitment to making the information being shared easy to understand and useful.”

Things to consider include how your intended audience prefers to access and consume information. For instance, there are generational differences in how people access information on tablets and mobile devices as opposed to traditional websites. Consider all the platforms your audience uses to view information, such as smartphone apps, news websites, and social media platforms, and keep evolving your approach based on their feedback.

Spreadsheets just won't work here. You need to invest in data visualization techniques and content writing to explain data, no matter how it is accessed.

The second annual Equipt to Innovate survey, published by Governing in partnership with Living Cities, found several cities not only using data consistently to drive decision-making but also embracing ways to make data digestible for the publics they serve.

Los Angeles' DataLA portal, for example, offers more than 1,000 data sets for all to use along with trainings and tutorials on how to make charts, maps and other visualizations. The portal's blog offers a robust discussion of the issues and challenges faced with using existing data to meet common requests. Louisville, Ky., went the proverbial extra mile, putting a lot of thought into what data would be of interest to residents and sharing the best examples of free online services that have been built using the metro government's open data.

Louisville's efforts point up the seemingly obvious but critical strategy of making sure you know what information your target audience actually needs. Have you asked? Perhaps not. The answers should guide you, but also remember to be flexible about what you are asking. For example, the Los Angeles Unified School District is set to launch a new portal later this summer to provide parents with data, and is still learning how to supply information that parents find useful. District officials are listening to feedback throughout the process, and they are willing to adjust. One important strategy for this is to make your audience -- or a sampling of them -- part of your beta testing. Ask what information they found useful and what else would have been helpful.

“When you share, you are inviting others to engage with you about how to improve your work.”

Remember, the first time you allow a glimpse into your data and processes, it's inevitable your information will have gaps and kinks that you can't foresee. And if you are lucky to get feedback about what didn't work so well, it may even seem harsh. Don't take it personally. It's an opportunity to ask your audience what could be done better and commit to doing so. It may take weeks, months or maybe longer to package information for release, making it usable and accessible, but this is an investment worth making. You might miss the mark the first time, but make a commitment to keep trying.

And don't be daunted by the reality that anytime you share information you expose yourself to criticism. Sharing with the public that a project didn't meet expectations or failed completely is a challenge no matter how you look at it. But sharing, even when it is sharing your weaknesses, is a strength your organization can use to build its reputation and gain influence in the long term.

When you share, you are inviting others to engage with you about how to improve your work. You also are modeling the importance of being open about failure. This openness is what helps others feel like partners in the work, and they will feel more comfortable opening up about their own struggles. You might be surprised at who will reach out and what type of partnerships can come from sharing.

Through this process, you will build your reputation and credibility, helping your organization advance its goals. Ultimately, it's about helping those you serve by giving them the opportunity to help you.

--Daniela Pineda

Nominations for Foundation Center’s #OpenForGood Award Now Open
June 13, 2018

Sarina Dayal is the knowledge services associate at Foundation Center.

To encourage funders to be more transparent, Foundation Center has launched the inaugural #OpenForGood Award. This award will recognize foundations that display a strong commitment to transparency and knowledge sharing.

Last year, we started #OpenForGood, a campaign to encourage foundations to openly share what they learn so we can all get collectively smarter. Now, we’re launching this award as a way to bring due recognition and visibility to foundations who share challenges, successes, and failures openly to strengthen how we can think and act as a sector. The winning foundations will demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. We’re looking for the best examples of smart, creative, strategic, and consistent knowledge sharing in the field, across all geographic and issue contexts.

What’s In It for You?

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in the form of social media and newsletter space. What is a Knowledge Center and why would you want one? It is a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insight, promote analysis on your grantees, and feature learnings from network members. All documents that are uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers, regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. We live in a time when most people expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go to create a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation's work, the greater the opportunities to make all our efforts more effective and farther reaching.

Who Is Eligible for the Award?

  • Any foundation anywhere in the world (self-nominations welcome)
  • Must share its collection of published evaluations publicly through IssueLab
  • Must demonstrate active commitment to open knowledge
  • Preferential characteristics include foundations that integrate creativity, field leadership, openness, and community insight into knowledge sharing work
  • Bonus points for use of other open knowledge elements such as open licensing, digital object identifiers (DOIs), or institutional repository

Anyone is welcome to nominate any foundation through September 30, 2018. Winners will be selected in the Fall through a review process and notified in January. The award will officially be presented at next year’s annual GEO Conference. If you have any questions, please email openforgood@foundationcenter.org. Click here to nominate a foundation today!

Who will you nominate as being #OpenForGood?

--Sarina Dayal

Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts
April 12, 2018

Laia Griñó is director of data discovery at Foundation Center. This post also appears in the Human Rights Funders Network's blog.

Over the last few months, this blog has presented insights gained from the Advancing Human Rights initiative’s five-year trend analysis. Getting to these insights would not have been possible had a growing number of funders not decided to consistently share more detailed data about their grantmaking, such as through Foundation Center’s eReporting program. In a field where data can pose real risks, some might feel that this openness is ill-advised. Yet transparency and data protection need not be at odds. By operating from a framework of responsible data, funders can simultaneously protect the privacy and security of grantees and contribute to making the human rights field more transparent, accountable, and effective.

This topic – balancing transparency and data protection – was the focus of a session facilitated by Foundation Center at the PEAK Grantmaking annual conference last month. Our goal was not to debate the merits of one principle over the other, but to help provide a framework that funders can use in determining how to share grants data, even in challenging circumstances. What follows are some of the ideas and tips discussed at that session (a caveat here: these tips focus on data shared voluntarily by funders on their website, with external partners like Foundation Center, etc.; we recognize that funders may also face legal reporting requirements that could raise additional issues).


  • Think of transparency as a spectrum: Conversations regarding data sharing often seem to end up at extremes: we must share everything or we can’t share anything. Instead, funders should identify what level of transparency makes sense for them by asking themselves two questions: (1) What portion of our grants portfolio contains sensitive data that could put grantees at risk if shared? and (2) For the portion of grants deemed sensitive, which grant details – if any – are possible to share? Based on our experience with Advancing Human Rights, in most cases funders will find that it is possible to share some, if not most, of their grants information.
  • Assess the risks of sharing data: Answering these questions requires careful consideration of the consequences if information about certain grants is made public, particularly for grantees’ security. As noted at the PEAK session, in assessing risks funders should not only consider possible negative actions by government actors, but also by actors like militant groups or even a grantee’s community or family. It is also important to recognize that risks can change over time, which is why it is so critical that funders understand what will happen with the data they share; if circumstances change, they need to know who should be notified so that newly sensitive data can be removed.
  • Get grantees’ input: Minimizing harm to grantees is of utmost importance to funders. And yet grantees usually have little or no input on decisions about what information is shared about them. Some funders do explicitly ask for grantees’ consent to share information, sometimes at multiple points along the grant process. This could take the form of an opt-in box included as part of the grant agreement process, for example. At a minimum, grantees should understand where and how data about the grant will be used.
  • Calibrate what is shared based on the level of risk: Depending on the outcomes of their risk assessment (and grantees’ input), a funder may determine that it’s inadvisable to share any details about certain grants. In these cases, funders may opt not to include those grants in their reporting at all, or to only report on them at an aggregate level (e.g., $2 million in grants to region or country X). In situations where it is possible to acknowledge a grant, funders can take steps to protect a grantee, such as: anonymizing the name of the grantee; providing limited information on the grantee’s location (e.g., country only); and/or redacting or eliminating a grant description (note: from our experience processing data, it is easy to overlook sensitive information in grant descriptions!).
  • Build data protection into grants management systems: Technology has an important role to play in making data protection systematic and, importantly, manageable. For example, some funders have “flags” to indicate which grants can be shared publicly or, conversely, which are sensitive. In one example shared at PEAK, a grants management system has been set up so that if a grant has been marked as sensitive, the grantee’s name will automatically appear as “Confidential” in any reports generated. These steps can minimize the risk of data being shared due to human error. (A minimal sketch of this flag-and-redact pattern appears after this list.)
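To make the flag-and-redact idea above concrete, here is a minimal sketch in Python. The field names and the redaction rules (grantee name replaced with "Confidential", location reported at the broadest level only, description dropped) are illustrative assumptions drawn from the tips in this list, not the schema or behavior of any particular grants management system.

```python
def prepare_for_sharing(grant):
    """Return a copy of a grant record that is suitable for external reporting.

    If the grant is flagged as sensitive: anonymize the grantee name, report
    Geographic Area Served at the broadest level given (e.g., country only),
    and drop the grant description entirely.
    """
    shared = dict(grant)
    if grant.get("sensitive"):
        shared["grantee_name"] = "Confidential"
        # Keep only the last, broadest element of the comma-separated hierarchy.
        shared["geo_area_served"] = grant["geo_area_served"].split(",")[-1].strip()
        shared.pop("description", None)
    return shared


grant = {  # hypothetical record; field names are illustrative, not a real schema
    "grantee_name": "Example Human Rights Group",
    "geo_area_served": "Example City, Example Country",
    "description": "Legal aid for local activists.",
    "amount": 50000,
    "sensitive": True,
}
print(prepare_for_sharing(grant))
```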

Transparency is at the core of Foundation Center’s mission. We believe deeply that transparency can not only help build public trust but also advance more inclusive and effective philanthropy. For that reason, we are committed to being responsible stewards of the data that is shared with us (see the security plan for Advancing Human Rights, for example). A single conference session or blog post cannot do justice to such a complex and long-debated topic. We are therefore thankful that our colleagues at Ariadne, 360Giving, and The Engine Room have just started a project to provide funders with greater guidance around this issue (learn more in these two thoughtful blog posts from The Engine Room, here and here). We look forward to seeing and acting on their findings!

--Laia Griñó

New IssueLab Infographic Delves into Foundation Evaluation Practices
January 3, 2018

More than half of funders are sharing evaluation results. How are they doing it, and how can other foundations learn from these lessons?

A detailed IssueLab infographic reveals how foundations are conducting evaluations, what they’re evaluating, and whether they publicly share what they learn. The findings are based on a 2017 Foundation Center survey of U.S. foundations.

In the last five years, 42% of foundations have conducted and/or commissioned an evaluation. Larger funders are more likely to have done so, as are community foundations, 64% of which reported commissioning an evaluation in the last five years.

Other key findings:

  • 55% of foundations share what they are learning (Are you?)
  • Only 36% of foundations look at what other funders are sharing
  • 28% of foundations evaluate themselves as a whole
  • 51% of foundations evaluate individual grants

Most surprising and disappointing is how few foundations report using the knowledge that is shared by others. In a field that is not known for sharing, it’s likely most foundation staff don’t think the data is out there or searchable and retrievable in a user-friendly way. To solve this problem, IssueLab developed a new IssueLab:Results tool that easily allows anyone to seek and find foundation evaluations. You can now easily learn from your colleagues.

This IssueLab infographic is part of Foundation Center’s ongoing efforts to champion greater foundation transparency. This year, Foundation Center launched the related #OpenForGood campaign, which encourages foundations to openly share their knowledge and learn from one another. Hint-Hint: adopting open knowledge practices could be an excellent New Year’s resolution for your foundation! How will your foundation be #OpenForGood?

--Melissa Moy

In the Know: #OpenForGood Staff Pick
November 1, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Native Arts & Cultures Foundation.


Staff Pick: Native Arts & Cultures Foundation

Progressing Issues of Social Importance Through the Work of Indigenous Artists: A Social Impact Evaluation of the Native Arts and Cultures Foundation's Pilot Community Inspiration Program

Download the Report

Quick Summary


Impact measurement is a challenge for all kinds of organizations, and arts and culture organizations in particular often struggle with how to quantify the impact they are making. How does one measure the social impact of an epic spoken word poem, or of a large-scale, temporary art installation, or of performance art? The same is true of measuring the impact of social change efforts--how can these be measured in the short term given the usual pace of change? This report provides a good example of how to overcome both of these struggles.

In 2014, the Native Arts & Cultures Foundation (NACF) launched a new initiative, the Community Inspiration Program (CIP), which is rooted in the understanding that arts and cultures projects have an important role to play in motivating community engagement and supporting social change.

This 2017 report considers the social impacts of the 2014 CIP projects—what effects did they have on communities and on the issues, conversations, and connections that are critical in those communities? Its secondary purpose is to provide the NACF with ideas for how to improve its grantmaking in support of arts for community change.

Field(s) of Practice

  • Arts and Culture
  • Native and Indigenous Communities
  • Social Change
  • Community Engagement

This report opens up knowledge about the pilot phases of a new initiative whose intended impacts, community inspiration and social change, are vital but difficult concepts to operationalize and measure. The evaluation provides valuable insight into how foundations can encourage the inclusion of indigenous perspectives and truths not only in the design of their programs but also in the evaluation of those same programs.

What makes it stand out?

Several key aspects make this report noteworthy. First, this evaluation comprises a unique combination of more traditional methods and data with what the authors call an "aesthetic-appreciative" evaluation lens, which accounts for a set of dimensions associated with aesthetic projects such as "disruption," "stickiness," and "communal meaning," providing a more holistic analysis of the projects. Further, because the evaluation was focused on Native-artist led projects, it relied on the guidance of indigenous research strategies. Intentionality around developing strategies and principles for stakeholder-inclusion make this a noteworthy and useful framework for others, regardless of whether Native communities are the focus of your evaluation.

Key Quote

"Even a multiplicity of evaluation measures may not 'truly' tell the story of social impact if, for evaluators, effects are unobservable (for example, they occur at a point in the future that is beyond the evaluation's timeframe), unpredictable (so that evaluators don't know where to look for impact), or illegible (evaluators cannot understand that they are seeing the effects of a project)."

--Gabriela Fitz

Open Access to Foundation Knowledge
October 25, 2017

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. This post also appears in Medium. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Lisa Brooks

Foundations have a lot of reasons to share knowledge. They produce knowledge themselves. They hire others to research and author works that help with internal strategy development and evaluation of internal strategies, programs, and projects. And they make grants that assist others in gaining insight into social issues — be it through original research, evaluation work, or other work aimed at creating a better understanding of issues so that we can all pursue better solutions to social problems. In almost all aspects of foundation work, knowledge is an outcome.

While openly sharing this knowledge is uneven across the social sector, we do see more and more foundations starting to explore open access to the knowledge assets they make possible. Many foundations are sharing more intentionally through their websites, external clearinghouses, and other online destinations. And more foundations are suggesting — sometimes requiring — that their grantees openly share knowledge that was produced with grant dollars.

Lacey Althouse

Some foundations are even becoming open access champions. For example, the Hewlett Foundation has authored a terrifically helpful free toolkit that provides an in-depth how-to aimed at moving foundation and grantee intellectual property licensing practices away from “all rights reserved” copyrights and toward “some rights reserved” open licenses. (Full disclosure: IssueLab is included in the toolkit as one solution for long term knowledge preservation and sharing.) (“Hewlett Foundation Open Licensing Toolkit for Staff”)

For those who are already 100% open, it’s easy to forget that, when first starting out, learning about open access can be daunting. For those who are trying to open up, like most things, getting there is a series of steps. One step is understanding how licensing can work for, or against, openness. Hewlett’s toolkit is a wonderful primer for understanding this. IssueLab also offers some ways to dig into other areas of openness. Check out Share the Wealth for tips.


However it is that foundations find their way to providing open access to the knowledge they make possible, we applaud and support it! In the spirit of International Open Access Week’s theme, “Open in order to…,” here’s what a few leading foundations have to say about the topic of openness in the social sector.

James Irvine Foundation 
Find on IssueLab.

“We have a responsibility to share our knowledge. There’s been a lot of money that gets put into capturing and generating knowledge and we shouldn’t keep it to ourselves.”

-Kim Ammann Howard, Director of Impact Assessment and Learning

Hewlett Foundation
Find on IssueLab.

“Our purpose for existing is to help make the world a better place. One way we can do that is to try things, learn, and then share what we have learned. That seems obvious. What is not obvious is the opposite: not sharing. So the question shouldn’t be why share; it should be why not share.”

-Larry Kramer, President

Hawaii Community Foundation
Find on IssueLab.

“Openness and transparency is one element of holding ourselves accountable to the public — to the communities we’re either in or serving. To me, it’s a necessary part of our accountability and I don’t think it should necessarily be an option.”

-Tom Kelly, Vice President of Knowledge, Evaluation and Learning

The David and Lucile Packard Foundation
Find on IssueLab.

“Why do we want to share these things? …One, because it’s great to share what we’re learning, what’s worked, what hasn’t, what impact has been made so that others can learn from the work that our grantees are doing so that they can either not reinvent the wheel, gain insights from it or learn from where we’ve gone wrong… I think it helps to build the field overall since we’re sharing what we’re learning.”

-Bernadette Sangalang, Program Officer

The Rockefeller Foundation
Find on IssueLab.

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations — well before the results are even known.”

-Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly
(Read more on why the Rockefeller Foundation is open for good.)

If you are a foundation ready to make open access the norm as part of your impact operations, here’s how you can become an open knowledge organization today.

IssueLab believes that social sector knowledge is a public good that is meant to be freely accessible to all. We collect and share the sector’s knowledge assets and we support the social sector’s adoption of open knowledge practices. Visit our collection of ~23,000 open access resources. While you’re there, add your knowledge — it takes minutes and costs nothing. Find out what we’re open in order to do here. IssueLab is a service of Foundation Center.

--Lisa Brooks and Lacey Althouse

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to: Janet Camarena, Director, Transparency Initiatives, Foundation Center.

    If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org
