Transparency Talk

Category: "Sharing" (56 posts)

Data Fix: Do's and Don'ts for Data Mapping & More!
October 3, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This post is part of a series intended to improve the data available for and about philanthropy.

As many of you know, Foundation Center was established to provide transparency for the field of philanthropy. A key part of this mission is collecting, indexing, and aggregating millions of grants each year. In recent years this laborious process has become more streamlined thanks to technology, auto-coding, and to those of you who directly report your grants data to us. Your participation also increases the timeliness and accuracy of the data.

Today, over 1,300 funders worldwide share grants data directly with Foundation Center. Over the 20 years we've been collecting this data, we've encountered some recurring issues with the basic required fields. To make sharing data even quicker and easier, we've put together some do's and don'ts focusing on three areas that may seem straightforward, but often cause confusion.

Location Data for Accurate Mapping and Matching

Quite simply, to map your grants data we need location information! And we need location information for more than mapping. We also use this information to ensure we are matching data to the correct organizations in our database. To help us do this even more accurately, we encourage you to provide as much location data as possible. This also helps you by increasing the usability of your own data when running your own analyses or data visualizations.

DO:
  • Supply Recipient City for U.S. and non-U.S. Recipients.
  • Supply Recipient State for U.S. Recipients.
  • Supply Recipient Country for non-U.S. Recipients.

DON'T:
  • Don't forget to supply Recipient Address and Recipient Postal Code, if possible.
  • Don't supply a post office box in place of a street address for Recipient Address, if possible.
  • Don't confuse Recipient location (where the check was sent) with Geographic Area Served (where the service will be provided).
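For funders preparing a report file, these location rules are easy to check programmatically before submission. The sketch below is illustrative only; the field names are invented, not eReporting's actual column names:

```python
def check_location_fields(grant: dict) -> list[str]:
    """Return warnings for missing or problematic location data.

    The field names here are hypothetical; adapt them to the column
    names in your own grants export.
    """
    warnings = []
    country = (grant.get("recipient_country") or "").strip()
    # For this sketch, treat a blank country as a U.S. recipient.
    is_us = country in ("", "USA", "United States")
    if not grant.get("recipient_city"):
        warnings.append("Recipient City is required for all recipients.")
    if is_us and not grant.get("recipient_state"):
        warnings.append("Recipient State is required for U.S. recipients.")
    address = grant.get("recipient_address") or ""
    if address.upper().startswith(("PO BOX", "P.O. BOX")):
        warnings.append("Prefer a street address over a post office box, if possible.")
    return warnings
```

A clean U.S. record with city and state returns no warnings; a record whose address is a post office box gets flagged for follow-up rather than rejected, mirroring the "if possible" wording above.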

What's Your Type? Authorized or Paid?

Two types of grant amounts can be reported: Authorized amounts (new grants authorized in a given fiscal year, including the full amount of grants that may be paid over multiple years) or Paid amounts (as grants would appear in your IRS tax form). You can report on either one of these types of amounts – we just need to know which one you are using: Authorized or Paid.

DO:
  • Indicate whether you are reporting Authorized or Paid amounts.
  • Remain consistent from year to year with sending either Authorized amounts or Paid amounts to prevent duplication of grants.
  • Report the Currency of the amount listed, if not U.S. dollars.

DON'T:
  • Don't send more than one column of Amounts in your report – either Authorized or Paid for the entire list.
  • Don't forget to include Grant Duration (in months) or Grant Start Date and Grant End Date, if possible.
  • Don't include more than one amount per grant.

The Essential Fiscal Year

An accurate Fiscal Year is essential since we publish grants data by fiscal year in our data-driven tools and content-rich platforms such as those developed by Foundation Landscapes, including Funding the Ocean, SDG Funders, Equal Footing and Youth Giving. Fiscal Year can be reported as a year (2018) or a date range (07/01/2017-06/30/2018), but both formats will appear in published products as YEAR AWARDED: 2018.

DO:
  • Include the Fiscal Year in which the grants were either Authorized or Paid by you, the funder.
  • Format your Fiscal Year as a year (2018) or a date range (07/01/2017-06/30/2018).

DON'T:
  • Don't provide the Fiscal Year of the Recipient organization.
  • Don't forget: for off-calendar fiscal years, the last year of the date range is the Fiscal Year (07/01/2017-06/30/2018 = 2018).
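Since both formats resolve to a single published year, a normalization step like the following can help when preparing or sanity-checking a file. This is a hypothetical helper for illustration, not part of eReporting itself:

```python
import re

def fiscal_year(value: str) -> int:
    """Normalize a reported Fiscal Year to a single year.

    Accepts a plain year ("2018") or an MM/DD/YYYY-MM/DD/YYYY range
    ("07/01/2017-06/30/2018"). For a range, the last year is the
    fiscal year, matching how off-calendar years are published.
    """
    value = value.strip()
    if re.fullmatch(r"\d{4}", value):
        return int(value)
    match = re.fullmatch(r"\d{2}/\d{2}/\d{4}-\d{2}/\d{2}/(\d{4})", value)
    if match:
        return int(match.group(1))
    raise ValueError(f"Unrecognized fiscal year format: {value!r}")
```

Both `fiscal_year("2018")` and `fiscal_year("07/01/2017-06/30/2018")` resolve to 2018, so either reporting style lands on the same YEAR AWARDED.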

More Tips to Come!

I hope you have a better understanding of these three areas of data to be shared through Foundation Center eReporting. Moving forward, we'll explore the required fields of Recipient Name and Grant Description, as well as high priority fields such as Geographic Area Served. If you have any questions, please feel free to contact me. Thank you! And don't forget, the data you share IS making a difference!

-- Kati Neiheisel

Staff Pick: Foundation Funded Research Explores How to Improve the Voter Experience
August 9, 2018

Becca Leviss is a Knowledge Services Fellow at Foundation Center.

This post is part of the GlassPockets’ Democracy Funding series, designed to spotlight knowledge about ways in which philanthropy is working to strengthen American democracy.

Voting is central to our democracy, providing citizens from all communities a direct way to influence the future by conveying their beliefs through civic participation. Though foundations by law must be non-partisan, they can and do support democracy in a variety of ways, and we are tracking these activities in our publicly available Foundation Funding for U.S. Democracy web portal.
 
From this data we can see that encouraging broad civic participation is one of the most popular ways in which institutional philanthropy supports our democracy. Specific strategies under civic participation include issue-based participation, civic education and leadership, naturalization and immigrant civic integration, and public participation. So, what have foundations learned from these efforts about how to strengthen our democracy? Today we will zoom in on a foundation-funded report that is openly available, containing findings from data collection on elections and voting patterns, including how well the process is working and who is included or excluded.
 
Our latest “Staff Pick” from IssueLab’s Democracy Special Collection, which comprises foundation-funded research on the topic, explores an aspect of the voter experience in America that could be improved. With less than 90 days to go before the midterm elections, we’re pleased to offer this deep dive into an important piece of voting-related research.
 
Research in the social sector can sometimes feel inaccessible or artificial—based on complex theories, mathematical models, and highly controlled situations. This report, however, presents its research methodology and results in a clear, understandable manner that invites the reader to continue its work of understanding how polling sites can use their resources to both investigate and improve the voter experience.

STAFF PICK

Improving the Voter Experience: Reducing Polling Place Wait Times by Measuring Lines and Managing Polling Place Resources, by Charles Stewart III; John C. Fortier; Matthew Weil; Tim Harper; Stephen Pettigrew 

Download the Report

Publisher

Bipartisan Policy Center

Funders

Ford Foundation; The Democracy Fund

Quick Summary

Voting is the cornerstone of civic engagement in American democracy, but long wait times and inefficient organization at polling places can undermine the voting process and even discourage citizens from voting altogether. In 2013, President Barack Obama launched the bipartisan Presidential Commission on Election Administration (PCEA) to initiate studies and collaborative research on polling place wait times. The PCEA’s work revealed that while wait times and poll lines are a serious issue in the United States, they are also reflective of deeper, more complex problems within the election administration system. This report by the Bipartisan Policy Center summarizes the PCEA’s efforts and highlights how the knowledge gained can produce action and improvement at polling sites. Ultimately, the report emphasizes the need for continued research and innovation in approaching common issues in the voter experience.

Field of Practice

Government Reform

What makes it stand out?

“Long lines may be a canary in the coal mine,” begins the report, “indicating problems beyond a simple mismatch between the number of voting machines and voters, such as voter rules that are inaccurate or onerous.” Quantitative and qualitative data have shown that long lines at the polls carry wide-reaching economic costs of over half a billion dollars in a presidential election, as well as the immeasurable cost of voter discouragement due to polling place problems. These issues are exacerbated at polling sites that are urban, dense, and home to large minority populations, where lack of resources and access can disenfranchise voters.

While the dilemma of election administration is complex, the report describes a rather straightforward series of projects by the Massachusetts Institute of Technology and the Bipartisan Policy Center. MIT and BPC collaborated to create a system of data collection on polling lines and polling place efficiency that would be simple and easily implemented by poll workers. The program utilized basic queuing theory: calculating the average wait time of a voter by dividing the average line length by the average arrival rate. For fellow (and potential future) researchers, this report spends a meaningful portion of time explaining the significance of each variable, how it is calculated, and how its fluctuation impacts the overall results of the investigation. We are given examples of several successful iterations of the study and their evaluations, as well as insight into certain research choices.
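The queuing arithmetic described above is a form of Little's Law (average wait = average line length divided by arrival rate). The function below is a sketch of that calculation for illustration only, not the actual MIT/BPC tool:

```python
def average_wait_minutes(line_lengths: list[int], arrivals_per_hour: float) -> float:
    """Estimate average voter wait time via basic queuing (Little's Law).

    line_lengths: hourly observations of how many voters are in line,
    as a poll worker might record them.
    arrivals_per_hour: average number of voters arriving per hour.
    Average wait = average line length / arrival rate, in minutes.
    """
    avg_line = sum(line_lengths) / len(line_lengths)
    return avg_line / arrivals_per_hour * 60

# A line averaging 20 voters with 120 arrivals per hour implies
# an average wait of about 10 minutes.
```

The appeal of this approach, as the report notes, is that poll workers only need to count the line periodically; no per-voter timing is required.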

MIT/BPC’s work has found that an overwhelming majority of Election Day polling sites—82 percent—experienced the longest line when the doors first opened. In all, a total of 90 percent of Election Day polling sites have their longest lines within the first two hourly samples (when observed on Hour 0 and Hour 1), with the lines declining at an average rate after that. Similarly, voters experience the longest wait times when the lines were at their longest. This pattern is vastly different from that of early voting sites, where wait time is relatively constant; however, these sites still most commonly experience their longest lines at the beginning of the day (25 percent of the studied population).

The research emphasizes the importance of adequately preparing for the longest line of the day. The report suggests that if polling sites adjust worker shifts to accommodate for strong early morning voter turnout on Election Day, they can easily clear the lines within the first few hours of voting, thus saving money and better serving their voters. The report also recognizes the range of its results: in other words, individual precincts have individual needs. Without meaningful research, however, we cannot know how to meet those needs and improve the voter experience. Therefore, as readers (and hopefully fellow voters), we are encouraged by MIT/BPC’s work to take clear and simple action to improve our own polling sites through continued research and investigation. This report exemplifies the importance of making the research and data process transparent and accessible so that we can not only understand its significance, but actively contribute to its efforts. There are many processes that could benefit from this kind of data analysis to improve the user experience. What if foundations analyzed their grant processes in this way? I can’t help but think that there is much that philanthropy can learn from the government from reports like this that show how institutions are opening up data collection to improve the user experience for actors and stakeholders.

Key Quote

“Precincts with large numbers of registered voters often have too few check-in stations or voting booths to handle the volume of voters assigned to the precinct, even under the best of circumstances. Precincts that are unable to clear the lines from the first three hours of voting are virtually guaranteed to have long lines throughout the day. Polling places in urban areas often face design challenges—small, inconvenient spaces—that undermine many election officials’ best efforts to provide adequate resources to these locations.”

--Becca Leviss

What Philanthropy Can Learn from Open Government Data Efforts
July 5, 2018

Daniela Pineda, Ph.D., is vice president of integration and learning at First 5 LA, an independent public agency created by voters to advocate for programs and polices benefiting young children. A version of this post also appears in the GOVERNING blog.

Statistics-packed spreadsheets and lengthy, jargon-filled reports can be enough to make anybody feel dizzy. It's natural. That makes it the responsibility of those of us involved in government and its related institutions to find more creative ways to share the breadth of information we have with those who can benefit from it.

Government agencies, foundations and nonprofits can find ways to make data, outcomes and reports more user-friendly and accessible. In meeting the goal of transparency, we must go beyond inviting people to wade through dense piles of data and instead make them feel welcome using it, so they gain insights and understanding.

How can this be done? We need to make our data less wonky, if you will.

This might sound silly, and being transparent might sound as easy as simply releasing documents. But while leaders of public agencies and officeholders are compelled to comply with requests under freedom-of-information and public-records laws, genuine transparency requires a commitment to making the information being shared easy to understand and useful.

“…genuine transparency requires a commitment to making the information being shared easy to understand and useful.”

Things to consider include how your intended audience prefers to access and consume information. For instance, there are generational differences in the accessing of information on tablets and mobile devices as opposed to traditional websites. Consider all the platforms your audience uses to view information, such as smartphone apps, news websites and social media platforms, and be prepared to evolve constantly based on their feedback.

Spreadsheets just won't work here. You need to invest in data visualization techniques and content writing to explain data, no matter how it is accessed.

The second annual Equipt to Innovate survey, published by Governing in partnership with Living Cities, found several cities not only using data consistently to drive decision-making but also embracing ways to make data digestible for the publics they serve.

Los Angeles' DataLA portal, for example, offers more than 1,000 data sets for all to use along with trainings and tutorials on how to make charts, maps and other visualizations. The portal's blog offers a robust discussion of the issues and challenges faced with using existing data to meet common requests. Louisville, Ky., went the proverbial extra mile, putting a lot of thought into what data would be of interest to residents and sharing the best examples of free online services that have been built using the metro government's open data.

Louisville's efforts point up the seemingly obvious but critical strategy of making sure you know what information your target audience actually needs. Have you asked? Perhaps not. The answers should guide you, but also remember to be flexible about what you are asking. For example, the Los Angeles Unified School District is set to launch a new portal later this summer to provide parents with data, and is still learning how to supply information that parents find useful. District officials are listening to feedback throughout the process, and they are willing to adjust. One important strategy for this is to make your audience -- or a sampling of them -- part of your beta testing. Ask what information they found useful and what else would have been helpful.

“When you share, you are inviting others to engage with you about how to improve your work.”

Remember, the first time you allow a glimpse into your data and processes, it's inevitable your information will have gaps and kinks that you can't foresee. And if you are lucky to get feedback about what didn't work so well, it may even seem harsh. Don't take it personally. It's an opportunity to ask your audience what could be done better and commit to doing so. It may take weeks, months or maybe longer to package information for release, making it usable and accessible, but this is an investment worth making. You might miss the mark the first time, but make a commitment to keep trying.

And don't be daunted by the reality that anytime you share information you expose yourself to criticism. Sharing with the public that a project didn't meet expectations or failed completely is a challenge no matter how you look at it. But sharing, even when it is sharing your weaknesses, is a strength your organization can use to build its reputation and gain influence in the long term.

When you share, you are inviting others to engage with you about how to improve your work. You also are modeling the importance of being open about failure. This openness is what helps others feel like partners in the work, and they will feel more comfortable opening up about their own struggles. You might be surprised at who will reach out and what type of partnerships can come from sharing.

Through this process, you will build your reputation and credibility, helping your organization advance its goals. Ultimately, it's about helping those you serve by giving them the opportunity to help you.

--Daniela Pineda

Nominations for Foundation Center’s #OpenForGood Award Now Open
June 13, 2018

Sarina Dayal is the knowledge services associate at Foundation Center.

To encourage funders to be more transparent, Foundation Center has launched the inaugural #OpenForGood Award. This award will recognize foundations that display a strong commitment to transparency and knowledge sharing.

Last year, we started #OpenForGood, a campaign to encourage foundations to openly share what they learn so we can all get collectively smarter. Now, we’re launching this award as a way to bring due recognition and visibility to foundations that share challenges, successes, and failures openly to strengthen how we can think and act as a sector. The winning foundations will demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. We’re looking for the best examples of smart, creative, strategic, and consistent knowledge sharing in the field, across all geographic and issue contexts.

What’s In It for You?

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in the form of social media and newsletter space. What is a Knowledge Center and why would you want one? It is a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insight, promote analyses of your grantees, and feature learnings from network members. All documents that are uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers, regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. Today, we live in a time when most expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go to create a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation's work, the greater the opportunities to make all our efforts more effective and farther reaching.

Who Is Eligible for the Award?

  • Any foundation anywhere in the world (self-nominations welcome)
  • Must share its collection of published evaluations publicly through IssueLab
  • Must demonstrate active commitment to open knowledge
  • Preferential characteristics include foundations that integrate creativity, field leadership, openness, and community insight into knowledge sharing work
  • Bonus points for use of other open knowledge elements such as open licensing, digital object identifiers (DOIs), or an institutional repository

Anyone is welcome to nominate any foundation through September 30, 2018. Winners will be selected in the Fall through a review process and notified in January. The award will officially be presented at next year’s annual GEO Conference. If you have any questions, please email openforgood@foundationcenter.org. Click here to nominate a foundation today!

Who will you nominate as being #OpenForGood?

--Sarina Dayal

Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts
April 12, 2018

Laia Griñó is director of data discovery at Foundation Center. This post also appears in the Human Rights Funders Network's blog.

Over the last few months, this blog has presented insights gained from the Advancing Human Rights initiative’s five-year trend analysis. Getting to these insights would not have been possible had a growing number of funders not decided to consistently share more detailed data about their grantmaking, such as through Foundation Center’s eReporting program. In a field where data can pose real risks, some might feel that this openness is ill-advised. Yet transparency and data protection need not be at odds. By operating from a framework of responsible data, funders can simultaneously protect the privacy and security of grantees and contribute to making the human rights field more transparent, accountable, and effective.

This topic – balancing transparency and data protection – was the focus of a session facilitated by Foundation Center at the PEAK Grantmaking annual conference last month. Our goal was not to debate the merits of one principle over the other, but to help provide a framework that funders can use in determining how to share grants data, even in challenging circumstances. What follows are some of the ideas and tips discussed at that session (a caveat here: these tips focus on data shared voluntarily by funders on their website, with external partners like Foundation Center, etc.; we recognize that funders may also face legal reporting requirements that could raise additional issues).


  • Think of transparency as a spectrum: Conversations regarding data sharing often seem to end up at extremes: we must share everything or we can’t share anything. Instead, funders should identify what level of transparency makes sense for them by asking themselves two questions: (1) What portion of our grants portfolio contains sensitive data that could put grantees at risk if shared? and (2) For the portion of grants deemed sensitive, which grant details – if any – are possible to share? Based on our experience with Advancing Human Rights, in most cases funders will find that it is possible to share some, if not most, of their grants information.
  • Assess the risks of sharing data: Answering these questions requires careful consideration of the consequences if information about certain grants is made public, particularly for grantees’ security. As noted at the PEAK session, in assessing risks funders should not only consider possible negative actions by government actors, but also by actors like militant groups or even a grantee’s community or family. It is also important to recognize that risks can change over time, which is why it is so critical that funders understand what will happen with the data they share; if circumstances change, they need to know who should be notified so that newly sensitive data can be removed.
  • Get grantees’ input: Minimizing harm to grantees is of utmost importance to funders. And yet grantees usually have little or no input on decisions about what information is shared about them. Some funders do explicitly ask for grantees’ consent to share information, sometimes at multiple points along the grant process. This could take the form of an opt-in box included as part of the grant agreement process, for example. At a minimum, grantees should understand where and how data about the grant will be used.
  • Calibrate what is shared based on the level of risk: Depending on the outcomes of their risk assessment (and grantees’ input), a funder may determine that it’s inadvisable to share any details about certain grants. In these cases, funders may opt not to include those grants in their reporting at all, or to only report on them at an aggregate level (e.g., $2 million in grants to region or country X). In situations where it is possible to acknowledge a grant, funders can take steps to protect a grantee, such as: anonymizing the name of the grantee; providing limited information on the grantee’s location (e.g., country only); and/or redacting or eliminating a grant description (note: from our experience processing data, it is easy to overlook sensitive information in grant descriptions!).
  • Build data protection into grants management systems: Technology has an important role to play in making data protection systematic and, importantly, manageable. For example, some funders have “flags” to indicate which grants can be shared publicly or, conversely, which are sensitive. In one example shared at PEAK, a grants management system has been set up so that if a grant has been marked as sensitive, the grantee’s name will automatically appear as “Confidential” in any reports generated. These steps can minimize the risk of data being shared due to human error.
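The last two tips can be combined in a small pre-publication step: check each grant's sensitivity flag and exclude, redact, or pass it through accordingly. The sketch below is hypothetical (the field names and sensitivity levels are invented, not a feature of any particular grants management system):

```python
def prepare_for_publication(grant: dict):
    """Apply a sensitivity flag before sharing a grant record externally.

    Hypothetical convention: 'sensitivity' is set by program staff at intake.
      "exclude" -> omit the grant from grant-level reporting entirely
                   (report it only in aggregate totals)
      "redact"  -> acknowledge the grant but anonymize identifying details
      "public"  -> share as-is
    """
    level = grant.get("sensitivity", "public")
    if level == "exclude":
        return None
    if level == "redact":
        redacted = dict(grant)
        redacted["recipient_name"] = "Confidential"
        redacted["recipient_city"] = ""   # keep country-level location only
        redacted["description"] = ""      # descriptions often leak sensitive detail
        return redacted
    return grant
```

Because the flag lives on the record itself, every report generated downstream applies the same rule automatically, which is exactly how this approach minimizes the risk of human error.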

Transparency is at the core of Foundation Center’s mission. We believe deeply that transparency can not only help build public trust but also advance more inclusive and effective philanthropy. For that reason, we are committed to being responsible stewards of the data that is shared with us (see the security plan for Advancing Human Rights, for example). A single conference session or blog post cannot do justice to such a complex and long-debated topic. We are therefore thankful that our colleagues at Ariadne, 360Giving, and The Engine Room have just started a project to provide funders with greater guidance around this issue (learn more in these two thoughtful blog posts from The Engine Room, here and here). We look forward to seeing and acting on their findings!

--Laia Griñó

New IssueLab Infographic Delves into Foundation Evaluation Practices
January 3, 2018

More than half of funders are sharing evaluation results. How are they doing it, and how can other foundations learn from these lessons?

A detailed IssueLab infographic reveals how foundations are conducting evaluations, what they’re evaluating and whether they publicly shared what they learned. The findings are based on a 2017 Foundation Center survey of U.S. foundations.

In the last five years, 42% of foundations have conducted and/or commissioned an evaluation. Among the types of foundations more likely to do so are larger funders, as well as community foundations, of which 64% reported a commissioned evaluation in the last five years.

Other key findings:

  • 55% of foundations share what they are learning (Are you?)
  • Only 36% of foundations look at what other funders are sharing
  • 28% of foundations evaluate themselves as a whole
  • 51% of foundations evaluate individual grants

Most surprising and disappointing is how few foundations report using the knowledge that is shared by others. In a field that is not known for sharing, it’s likely most foundation staff don’t think the data is out there or searchable and retrievable in a user-friendly way. To solve this problem, IssueLab developed IssueLab:Results, a tool that makes it easy for anyone to seek and find foundation evaluations. You can now learn directly from your colleagues.

This IssueLab infographic is part of Foundation Center’s ongoing efforts to champion greater foundation transparency. This year, Foundation Center launched the related #OpenForGood campaign, which encourages foundations to openly share their knowledge and learn from one another. Hint-hint: adopting open knowledge practices could be an excellent New Year’s resolution for your foundation! How will your foundation be #OpenForGood?

--Melissa Moy

In the Know: #OpenForGood Staff Pick
November 1, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Native Arts & Cultures Foundation.


Staff Pick: Native Arts & Cultures Foundation

Progressing Issues of Social Importance Through the Work of Indigenous Artists: A Social Impact Evaluation of the Native Arts and Cultures Foundation's Pilot Community Inspiration Program

Download the Report

Quick Summary


Impact measurement is a challenge for all kinds of organizations, and arts and culture organizations in particular often struggle with how to quantify the impact they are making. How does one measure the social impact of an epic spoken word poem, or of a large-scale, temporary art installation, or of performance art? The same is true of measuring the impact of social change efforts--how can these be measured in the short term given the usual pace of change? This report provides a good example of how to overcome both of these struggles.

In 2014, the Native Arts & Cultures Foundation (NACF) launched a new initiative, the Community Inspiration Program (CIP), which is rooted in the understanding that arts and cultures projects have an important role to play in motivating community engagement and supporting social change.

This 2017 report considers the social impacts of the 2014 CIP projects—what effects did they have on communities and on the issues, conversations, and connections that are critical in those communities? Its secondary purpose is to provide the NACF with ideas for how to improve its grantmaking in support of arts for community change.

Field(s) of Practice

  • Arts and Culture
  • Native and Indigenous Communities
  • Social Change
  • Community Engagement

This report opens up knowledge about the pilot phases of a new initiative whose intended impacts, community inspiration and social change, are vital but difficult concepts to operationalize and measure. The evaluation provides valuable insight into how foundations can encourage the inclusion of indigenous perspectives and truths not only in the design of their programs but also in the evaluation of those same programs.

What makes it stand out?

Several key aspects make this report noteworthy. First, the evaluation combines more traditional methods and data with what the authors call an "aesthetic-appreciative" evaluation lens, which accounts for a set of dimensions associated with aesthetic projects, such as "disruption," "stickiness," and "communal meaning," providing a more holistic analysis of the projects. Further, because the evaluation focused on Native-artist-led projects, it relied on the guidance of indigenous research strategies. This intentionality around developing strategies and principles for stakeholder inclusion makes the report a useful framework for others, regardless of whether Native communities are the focus of your evaluation.

Key Quote

"Even a multiplicity of evaluation measures may not 'truly' tell the story of social impact if, for evaluators, effects are unobservable (for example, they occur at a point in the future that is beyond the evaluation's timeframe), unpredictable (so that evaluators don't know where to look for impact), or illegible (evaluators cannot understand that they are seeing the effects of a project)."

--Gabriela Fitz

Open Access to Foundation Knowledge
October 25, 2017

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. This post also appears on Medium. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Foundations have a lot of reasons to share knowledge. They produce knowledge themselves. They hire others to research and author works that help with internal strategy development and evaluation of internal strategies, programs, and projects. And they make grants that assist others in gaining insight into social issues — be it through original research, evaluation work, or other work aimed at creating a better understanding of issues so that we can all pursue better solutions to social problems. In almost all aspects of foundation work, knowledge is an outcome.

While openly sharing this knowledge is uneven across the social sector, we do see more and more foundations starting to explore open access to the knowledge assets they make possible. Many foundations are sharing more intentionally through their websites, external clearinghouses, and other online destinations. And more foundations are suggesting — sometimes requiring — that their grantees openly share knowledge that was produced with grant dollars.

Some foundations are even becoming open access champions. For example, the Hewlett Foundation has authored a terrifically helpful free toolkit that provides an in-depth how-to aimed at moving foundation and grantee intellectual property licensing practices away from “all rights reserved” copyrights and toward “some rights reserved” open licenses. (Full disclosure: IssueLab is included in the toolkit as one solution for long term knowledge preservation and sharing.) (“Hewlett Foundation Open Licensing Toolkit for Staff”)

For those who are already 100% open, it's easy to forget that learning about open access can be daunting when you're first starting out. For those who are trying to open up, getting there is, like most things, a series of steps. One step is understanding how licensing can work for, or against, openness. Hewlett's toolkit is a wonderful primer for understanding this. IssueLab also offers some ways to dig into other areas of openness. Check out Share the Wealth for tips.

However it is that foundations find their way to providing open access to the knowledge they make possible, we applaud and support it! In the spirit of International Open Access Week’s theme, “Open in order to….,” here’s what a few leading foundations have to say about the topic of openness in the social sector.

James Irvine Foundation 
Find on IssueLab.

“We have a responsibility to share our knowledge. There’s been a lot of money that gets put into capturing and generating knowledge and we shouldn’t keep it to ourselves.”

-Kim Ammann Howard, Director of Impact Assessment and Learning

Hewlett Foundation
Find on IssueLab.

“Our purpose for existing is to help make the world a better place. One way we can do that is to try things, learn, and then share what we have learned. That seems obvious. What is not obvious is the opposite: not sharing. So the question shouldn’t be why share; it should be why not share.”

-Larry Kramer, President

Hawaii Community Foundation
Find on IssueLab.

“Openness and transparency is one element of holding ourselves accountable to the public — to the communities we’re either in or serving. To me, it’s a necessary part of our accountability and I don’t think it should necessarily be an option.”

-Tom Kelly, Vice President of Knowledge, Evaluation and Learning

The David and Lucile Packard Foundation
Find on IssueLab.

“Why do we want to share these things? …One, because it’s great to share what we’re learning, what’s worked, what hasn’t, what impact has been made so that others can learn from the work that our grantees are doing so that they can either not reinvent the wheel, gain insights from it or learn from where we’ve gone wrong… I think it helps to build the field overall since we’re sharing what we’re learning.”

-Bernadette Sangalang, Program Officer

The Rockefeller Foundation
Find on IssueLab.

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations — well before the results are even known.”

-Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly
(Read more on why the Rockefeller Foundation is open for good.)

If you are a foundation ready to make open access the norm as part of your impact operations, here’s how you can become an open knowledge organization today.

IssueLab believes that social sector knowledge is a public good that is meant to be freely accessible to all. We collect and share the sector’s knowledge assets and we support the social sector’s adoption of open knowledge practices. Visit our collection of ~23,000 open access resources. While you’re there, add your knowledge — it takes minutes and costs nothing. Find out what we’re open in order to do here. IssueLab is a service of Foundation Center.

--Lisa Brooks and Lacey Althouse

No Moat Philanthropy Part 3: Building Your Network
October 4, 2017

Jen Ford Reedy is President of the Bush Foundation. On the occasion of her fifth anniversary leading the foundation, she reflects on efforts undertaken to make the Bush Foundation more permeable. Because the strategies and tactics she shares can be inspiring and helpful for any grantmaker exploring ways to open up their grantmaking, we are devoting our blog space all week to the series. This is the third post in the five-part series.

In yesterday’s post I shared how we have tried to bring different perspectives into the Foundation. Today’s post is mostly about getting out of the Foundation to meet new people. This is the third principle of No Moat Philanthropy.

Principle #3: Continuously and intentionally connect with new people

Five years ago we had close working relationships with people in each of our initiative areas. While we valued those relationships, we kept a pretty tight circle. We knew people wanted money from us, and we also knew their chances of receiving it were slim. This can be awkward, and who wants that? While avoiding awkwardness can make life more pleasant, we now believe embracing that awkwardness actually makes us smarter. While we can only fund a limited number of people and organizations, interacting with lots and lots of people and organizations helps us better understand our region and make better, more informed strategic choices and funding decisions.

We believe in the power of networks. We believe that a community’s strength and diversity of connections help define its capacity for resilience and innovation. We work to ensure we are continuously connecting with new and different people. Each year, we set outreach priorities for geographic areas, cultural communities and/or sectors based on our analysis of where our network is weakest. Then we strive to make new connections in a way that creates connections between others, too. Specifically we:

“We believe that a community’s strength and diversity of connections help define its capacity for resilience and innovation.”

Hold office hours to meet with people all around the region. We hold “office hours” in communities around the region for anyone interested in meeting with our Foundation staff. These are sometimes coupled with a listening session, co-hosted with a local partner, that allows us to understand what issues are most important to the community.

Sponsor and attend other people’s events. We introduced an open process to request Bush Foundation sponsorship of events. We had been sponsoring some events, but we never considered it a program strategy. One of the primary criteria for event sponsorship is whether it will help us connect with people who might benefit from learning about our work. This might include having a Bush Foundation booth manned by staff members who are there to meet and field questions from attendees.

Host events designed for connection. We were already hosting a number of events to build relationships with and among our Fellows and grantees. In the past five years, however, we have taken our events strategy to a higher level by focusing on connecting people across our programs with people beyond our existing grantee and Fellowship networks. The best example of this is bushCONNECT, our flagship event, which brings together 1,100 leaders from the region. To ensure we are attracting individuals beyond our community network, we engage “recruitment partners” from around the region who receive grant support to recruit a cohort from within their network to bring to the event, thereby ensuring bushCONNECT attendees more fully represent the geography — and diversity — of our region.

Take cohorts of people to national events. We also offer scholarships for cohorts of people from our region to attend national conferences together. During the event, we create opportunities to build connections with and among the attendees from the region. This allows us to meet and support more people in the region, build attendees’ individual networks, and ensure leaders in our region are both contributing to and benefitting from national conversations.

We are not throwing parties for fun.  We see relationship building as core to our strategy.  We see every interaction as an opportunity to influence and be influenced.  More on that tomorrow.

--Jen Ford Reedy

Opening Up the Good and Bad Leads to Stronger Communities and Better Grantmaking
September 28, 2017

Hanh Cao Yu is Chief Learning Officer at The California Endowment. She has been a researcher and evaluator of equity and philanthropy for more than two decades.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

More than a year ago, when I began my tenure at The California Endowment (TCE), I reflected deeply on the opportunities and challenges ahead as the new Chief Learning Officer. We were six years into a complex, 10-year policy/systems change initiative called Building Healthy Communities (BHC). This initiative was launched in 2010 to advance statewide policy, change the narrative, and transform 14 of California’s communities most devastated by health inequities into places where all people—particularly our youth—have an opportunity to thrive. At $1 billion, this is the boldest bet in the foundation’s history, and the stakes are high. It is not surprising, then, that despite the emphasis on learning, the evaluation of BHC is seen as a winning or losing proposition.

“By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve.”

As I thought about the role of learning and evaluation in deepening BHC’s impact, I became inspired by the words of Nelson Mandela: “I never lose.  I either win or I learn.”  His encouragement to shift our mindset from “Win/Lose” to “Win/Learn” is crucial to continuous improvement and success.  

I also drew from the insights of Amy Edmondson, who reminds us that not all failures are bad. According to Edmondson, mistakes can be preventable, unavoidable due to complexity, or even intelligent failures. So, despite careful planning and learning from decades of research on comprehensive community initiatives and bold systems change efforts, in an initiative as complex as BHC, mistakes can and will occur. By spurring change at community, regional, and state levels, and linking community mobilization with sophisticated policy advocacy, TCE was truly venturing into new territory when we launched BHC.

BHC's Big Wins and Lessons 

At the mid-point of BHC, TCE staff and Board paused to assess where we have been successful and where we could do better in improving the conditions under which young people could be healthy and thrive in our underserved communities.  The results were widely shared in the 2016 report, A New Power Grid:  Building Healthy Communities at Year 5.

As a result of taking the time to assess overall progress, we identified some of BHC's biggest impacts to date. In the first five years, TCE and partners contributed to significant policy/system wins:

  • Improved health coverage for the underserved;
  • Strengthened health coverage policy for the undocumented;
  • Improved school climate, wellness and equity;
  • Prevention and reform within the justice system;
  • Public-private investment and policy changes on behalf of boys and young men of color; and
  • Local & regional progress in adoption of “Health In All Policies,” a collaborative approach incorporating health considerations into decision-making across all policy areas

Our Board and team are very pleased with the results and impact of BHC to date, but we have been committed to learning from our share of mistakes. 

Along with the victories, we acknowledged in the report some hard lessons. Most notable among them was the need for more attention to:

  • Putting Community in “Community-Driven” Change. Armed with lessons about needing clarity on results in order to achieve results, we overthought the early process. This resulted in a prescriptiveness in the planning phase that was not only unnecessary, but also harmful. We entered the community planning process with multiple outcomes frameworks and a planning process that struck many partners as philanthropic arrogance. The smarter move would have been to engage community leaders with the clarity of a shared vision and operating principles, and to create the space for community leaders and residents to incubate goals, results, and strategy. Fortunately, we course corrected, and our partners were patient while we did so.
  • Revisiting assumptions about local realities and systems dynamics. In the report, we discussed our assumption about creating a single locus of inside-out, outside-in activity where community residents, leaders, and systems leaders could collaborate on defined goals. It was readily apparent that community leaders distrusted many “systems” insiders, and systems leaders viewed outsider/activists as unreasonable. We underestimated the importance of historical structural inequalities, context, and the dynamics of relationships at the local level. Local collaboratives or “hubs” were reorganized and customized to meet local realities, and we threw the concept of a single model of collaboration across all the sites out the window.

Some course corrections we made included adjusting and sharpening our underlying assumptions and theory of change and taking on new community-driven priorities that we never anticipated early on; examples include school discipline reform, dismantling the prison pipeline in communities of color through prevention, and work that is taking place in TCE’s Boys & Young Men of Color portfolio.  By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve. 

“Some partner feedback was difficult to hear, but all of it was useful and is making our work with partners stronger.”

Further, significant developments have occurred since the report:

Positioning “Power Building” as central to improving complex systems and policies.  In defining key performance indicators, we know the policy milestones achieved thus far represent only surface manifestations of the ultimate success we are seeking.  We had a breakthrough when we positioned “building the power and voice” of the adults and youth in our communities and “health equity” at the center of our BHC North Star Goals and Indicators.  Ultimately, we’ll know we are successful when the power dynamics in our partner communities have shifted so that adult and youth residents know how to hold local officials accountable for full, ongoing implementation of these policies.

Continuing to listen to our partners.  In addition to clarifying our North Stars, we sought further mid-point advice from our partners, reaching out to 175 stakeholders, including 68 youth and adult residents of BHC communities, for feedback to shape the remainder of BHC’s implementation and to inform our transition planning for the next decade.  Some of what our partners told us was difficult to hear, but all of it was useful and is making our work with partners stronger.    

From these lessons, I challenge our philanthropic colleagues to consider:

  • How can we learn to detect complex failures early, to help us go beyond superficial lessons? As Amy Edmondson states, “The job of leaders is to see that their organizations don’t just move on after a failure but stop to dig in and discover the wisdom contained in it.”
  • In complex initiatives and complex organizations, what does it take to design a learning culture to capitalize successfully on mistakes? How do we truly engage in “trial and error” and stay open to experimentation and midcourse corrections?  How can we focus internally on our own operations and ways of work, as well as being willing to change our strategies and relationships with external partners?  Further, how do we, as grantmakers responsible for serving the public good, take responsibility for making these lessons #OpenForGood so others can learn from them as well?

It is worth noting that a key action TCE took at the board level as we embarked on BHC was to dissolve the Board Program Committee and replace it with a Learning and Performance Committee. This set-up offered the Board, the CEO, and the management team a consistent opportunity to learn from evaluation reports and to share our learnings publicly to build the philanthropic field. Now, even as we enter the final phase of BHC, we continue to look for ways to structure opportunities to learn, and I can say, “We are well into a journey to learn intelligently from our successes as well as our mistakes to make meaningful, positive impacts.”

--Hanh Cao Yu

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be
    directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a
    guest contributor, contact:
    glasspockets@foundationcenter.org
