Transparency Talk


Increasing Attention to Transparency: The MacArthur Foundation Is #OpenForGood
April 17, 2018

Chantell Johnson is managing director of evaluation at the John D. and Catherine T. MacArthur Foundation. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

At MacArthur, the desire to be transparent is not new. We believe philanthropy has a responsibility to be explicit about its values, choices, and decisions with regard to its use of resources. Toward that end, we have long had an information sharing policy that guides what and when we share information about the work of the Foundation or our grantees. Over time, we have continued to challenge ourselves to do better and to share more. The latest refinement of our approach to transparency is an effort to share more of the knowledge we are gaining. We expect to continue to push ourselves in this regard, and our participation in Foundation Center's Glasspockets and #OpenForGood movements is just one example of how this has manifested.

In recent years, we have made a more concerted effort to revisit and strengthen our information sharing policy by:

  • Expanding our thinking about what we can and should be transparent about (e.g., our principles of transparency guided our public communications around our 100&Change competition, which included an ongoing blog);
  • Making our guidance more contemporary by moving beyond statements about information sharing to publishing more and different kinds of information (e.g., Grantee Perception Reports and evaluation findings);
  • Making our practices related to transparency more explicit; and
  • Ensuring that our evaluation work is front and center in our efforts related to transparency.

Among the steps we have taken to increase our transparency are the following:

Sharing more information about our strategy development process.
The Foundation's website has a page dedicated to How We Work, which provides detailed information about our approach to strategy development. We share an inside look at the lifecycle of our programmatic efforts, from the conceptualization of a grantmaking strategy through the implementation and ending phases, under an approach we refer to as Design/Build. Design/Build recognizes that social problems and conditions are not static, and thus our response to these problems needs to be iterative and evolve with the context to be most impactful. Moreover, we aim to be transparent as we design and build strategies over time.

“We have continued to challenge ourselves to do better and to share more.”

Using evaluation to document what we are measuring and learning about our work.
Core to Design/Build is evaluation. Evaluation has become an increasingly important priority among our program staff. It serves as a tool to document what we are doing, how well we are doing it, how work is progressing, what is being achieved, and who benefits. We value evaluation not only for the critical information it provides to our Board, leadership, and program teams, but for the insights it can provide for grantees, partners, and beneficiaries in the fields in which we aim to make a difference. Moreover, it provides the critical content that we believe is at the heart of many philanthropic efforts related to transparency.

Expanding the delivery mechanisms for sharing our work.
While our final evaluation reports have generally been made public on our website, we aim to make more of our evaluation activities and products available (e.g., landscape reviews and baseline and interim reports). Further, in an effort to make our evaluation work more accessible, we are among the first foundations to make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

Further evidence of the Foundation's commitment to increased transparency includes continuing to improve our “Glass Pockets” by sharing:

  • Our searchable database of grants, including award amount, program, year, and purpose;
  • Funding statistics including total grants, impact investments, final budgeted amounts by program, and administrative expenses (all updated annually);
  • Perspectives of our program directors and staff;
  • Links to grantee products including grant-supported research studies consistent with the Foundation's intellectual property policies;
  • Stories highlighting the work and impact of our grantees and recipients of impact investments; and
  • Center for Effective Philanthropy Grantee Perception Report results.

Going forward, we will look for additional ways to be transparent, and we will challenge ourselves to make findings and lessons accessible even more quickly.

--Chantell Johnson 

New on Glasspockets: Open Knowledge Feature Added to Glasspockets Profiles
March 19, 2018

Janet Camarena is director of foundation transparency initiatives at Foundation Center.

Who has glass pockets when it comes to knowledge? Answering this question using our Glasspockets profiles just became a lot easier, thanks to a new feature we’ve added to emphasize the importance of creating a culture of shared learning in philanthropy. Beginning today, Glasspockets profiles are featuring a tie-in with our ongoing #OpenForGood campaign, designed to encourage open knowledge sharing by foundations.

All Glasspockets profiles now have a dedicated space to feature the knowledge that each foundation has contributed to IssueLab, which is a free, open repository that currently provides searchable access to nearly 24,000 knowledge documents. Currently, 67 of the 93 profiles on Glasspockets showcase recently shared reports on IssueLab. For example, looking at the Rockefeller Brothers Fund's Glasspockets profile reveals that it is participating in the #OpenForGood movement; a window appears on the right side of its profile featuring the latest learning the foundation has shared on IssueLab.

"Sharing your knowledge via open repositories is openness that is good for you and good for the field."

This window on shared knowledge is a dynamic feed generated from our IssueLab database, so if you have published evaluations or other publications to share that are not showing up in your profile, simply go to IssueLab to upload these documents, or contact our Glasspockets team for assistance. And if your foundation invested specifically in monitoring and evaluating results, you can share those evaluations in our new IssueLab: Results. To acknowledge your efforts in sharing your recent evaluations, your foundation will receive an #OpenForGood badge to display on your website and on your Glasspockets profile to signal your commitment to creating a community of shared learning.

Though not a formal part of the transparency assessment, the #OpenForGood feature makes profile users aware of the kinds of learning available from participating foundations. Besides linking to the two most recent reports, the feature also provides a shortcut to a landing page of all of that foundation’s available knowledge documents.

Since Glasspockets began, the transparency self-assessment has tracked whether foundations make available a central landing page of knowledge on their own websites, and that will continue to be included moving forward. So what’s the difference here? Opening up your knowledge on your own website is great for people who already know about your institution and visit your website, but it doesn’t really help to spread that knowledge to peers and practitioners unaware of your work. The fragmentation of knowledge across thousands of websites doesn’t do much to accelerate progress as a field—but that’s where open repositories like IssueLab come in.

Open repositories have several things going for them that truly live up to the idea of being #OpenForGood. First, any report you make available on IssueLab becomes machine-readable, so it can more easily be used and built upon by others doing similar work. Second, once a resource has been added to IssueLab, it becomes part of the sector’s collective intelligence, feeding through an open protocol into systems like WorldCat, which serves more than 10,000 public libraries. That means students, academics, journalists, and the general public can easily find the knowledge you’ve generated and shared, even if they’ve never heard of IssueLab, Foundation Center, or your organization. Once in the system, your knowledge resources can also be issued a Digital Object Identifier (DOI), so you can track access and use of that knowledge in an ongoing way.
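To make the DOI idea concrete: a DOI is just an identifier, and any DOI can be turned into a permanent link through the standard doi.org resolver, which redirects to wherever the document currently lives. The sketch below illustrates this with a made-up identifier (it is not a DOI actually issued by IssueLab or any real registry):

```python
def doi_to_url(doi: str) -> str:
    """Build the standard resolver link for a Digital Object Identifier.

    The doi.org resolver redirects to the document's current location,
    so the link remains stable even if the hosting site moves.
    """
    return "https://doi.org/" + doi.strip()

# A hypothetical DOI, for illustration only:
print(doi_to_url("10.12345/example-evaluation-report"))
# prints https://doi.org/10.12345/example-evaluation-report
```

Because the identifier, not the hosting URL, is what gets cited, access to the report can be tracked over time even as websites are reorganized.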

The easiest way to think of it is that sharing your knowledge via open repositories is openness that is good for you and good for the field. So how about it? What will you #OpenForGood?

--Janet Camarena

The Rockefeller Brothers Fund is #OpenForGood
January 31, 2018

Hope Lyons is the director of program management at the Rockefeller Brothers Fund, and Ari Klickstein is the communications associate/digital specialist at RBF. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As a private foundation, the Rockefeller Brothers Fund advances a just, peaceful, and sustainable world through grantmaking and related activities. We believe that discerning and communicating the impact of our grantmaking and other programmatic contributions is essential to fulfilling the Fund’s mission, as is a commitment to stewardship, transparency, and accountability. Philanthropy exists to serve the public good. By opening up what we are learning, we believe that we are honoring the public’s trust in our activities as a private foundation.

As part of our commitment to serving the public good, we are proud to be among the first foundations to join the new #OpenForGood campaign by sharing published reports on our grantmaking through Foundation Center’s open repository, IssueLab, and its new special collection of evaluations, IssueLab: Results, while continuing to make them available on our own website. These reports and impact assessments are authored by third-party assessment teams, and sometimes by our own program leadership, in addition to the published research papers and studies by grantees already on IssueLab.

We feel strongly that we have a responsibility to our grantees, trustees, partners, and the wider public to periodically evaluate our grantmaking, to use the findings to inform our strategy and practice, and to be transparent about what we are learning. In terms of our sector, this knowledge can go a long way in advancing fields of practice by identifying effective approaches. The Fund has a long history of sharing our findings with the public, stretching as far back as 1961, when the results of the Fund’s Special Studies Project were published as the bestselling volume Prospect for America. The book featured expert analysis on key issues of the era including international relations, economic and societal challenges, and democratic practices, topics which remain central to our grantmaking work.

We view our grantmaking as an investment in the public good, and place a great deal of importance on accountability. Through surveys conducted by the Center for Effective Philanthropy in 2016, our grantees and prospective grantees told us that they wanted to hear more about what we have learned, as well as what the Fund has tried but recognized as less successful in its past grantmaking. Regular assessments by CEP and third-party issue-area experts help keep us accountable and identify blind spots in our strategies. While our evaluations have long been posted online, and we have reorganized our website to make the materials easier to find, we have also committed to sharing additional reflections on what we’re learning going forward and to more proactively distributing these reports. We are grateful to Foundation Center for creating and maintaining IssueLab as a sharing platform and learning hub where the public, practitioners, and peers alike can locate resources and benefit from the research that the philanthropic sector undertakes.

--Hope Lyons and Ari Klickstein

Getting Practical About Open Licensing
January 11, 2018

Kristy Tsadick is Deputy General Counsel and Heath Wickline is a Communications Officer at the William and Flora Hewlett Foundation, where they created an Open Licensing Toolkit for the foundation’s staff and its grantees in 2015. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Some of the biggest barriers to open licensing—an alternative to traditional copyright that encourages sharing of intellectual property with few or no restrictions—are practical ones. What rights are authors really giving others when they openly license their work? How do authors decide on the right Creative Commons license for their work? And having decided to openly license what they’ve created, how do authors actually let others know about their decision?

The Hewlett Foundation, where we both work, has a long history of supporting openness and transparency, and when Larry Kramer joined the foundation as president in 2012, he decided to make a renewal of that commitment a key part of his tenure. In 2015, that renewed commitment resulted in a decision to extend our support for open licensing to require it on works created using grant funds, underlining our belief that if grants are made to support the public good then the knowledge they generate should also be considered a public good.

To successfully implement this idea, we knew we would have to offer some concrete guidance to our program staff and grantees on both what we were asking of them and how to do it. We also knew we wanted to create a policy that would offer our grantees flexibility to comply with it in ways that made sense for their organizations. Both ideas are embodied in the Open Licensing Toolkit for Staff that we developed.

The kit is structured to help the foundation’s program staff decide to which grants the new rule applies, introduce open licensing to grantees, and help clarify what an open license on written works will mean for them. It uses FAQs, a “decision tree,” template emails and other documents to walk through the process. There is even a guide to marking works with a Creative Commons license to make clear what information is needed along with the copyright notice. And while the kit was designed with Hewlett Foundation staff in mind, we also wanted it to be useful for grantees and others interested in expanding their understanding and use of open licenses—so, of course, the toolkit itself carries a broad Creative Commons license.

In thinking about which of our grants would be in scope for open licensing, we realized early on that general operating support is incompatible with the policy because those funds are given “with no strings attached.” Beyond even this broad exemption, we wanted to allow plenty of space for grantees to select licenses or request an exemption where they felt open licenses could harm them financially. It’s been gratifying to see how grantees have recognized the spirit of the new policy, and how infrequently they’ve requested exemptions—so much so that we stopped tracking those requests about a year after instituting the new policy. In the one area where we did often see requests for exemptions—grants to performing arts organizations, where the “work” is often a performance, and selling tickets to it, or recordings of it, is central to a grantee’s business model—we recently decided to change our standard grant agreements to recognize the need for this exemption.

Our goal in adopting the new policy was to show others what open licensing could mean for them—the way it can help spread knowledge and increase the impact of philanthropic resources. In that, we’ve been extremely successful, as other organizations have built on our toolkit, and our policy, to encourage open licensing in their own work. The Children’s Investment Fund Foundation (CIFF), for example, based its implementation guide for its own transparency policy on our toolkit, and the U.S. Department of State included a link to it in its Federal Open Licensing Playbook to encourage open licensing across all federal agencies. And because we included a Creative Commons license on the kit to be #OpenForGood, other organizations—including yours—are free to use and build on our work, too.

Hardly anyone would argue against getting more impact for the same dollars or having their ideas adopted and shared by more people. But real-world implementation details get in the way. Our experience with our Open Licensing Toolkit shows that a practical, flexible approach to open licensing helped extend our impact in ways we never could have imagined.

--Kristy Tsadick and Heath Wickline

In the Know: #OpenForGood Staff Pick December 2017
December 20, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Conrad N. Hilton Foundation. Read last month's staff pick here.


Staff Pick: Conrad N. Hilton Foundation

Evaluation of the Conrad N. Hilton Foundation Chronic Homelessness Initiative: 2016 Evaluation Report, Phase I

Download the Report

Quick Summary

In 2011, the Conrad N. Hilton Foundation partnered with Abt Associates Inc. to conduct an evaluation of the Hilton Foundation’s Chronic Homelessness Initiative, with the goal of answering an overarching question: Is the Chronic Homelessness Initiative an effective strategy to end and prevent chronic homelessness in Los Angeles County?

Answering that question has not been easy. And it bears mentioning that this is not one of those reports that strives to prove a certain model is working; instead, it provides a suitably complicated picture of an issue that will be an ongoing, multi-agency struggle. A combination of economic conditions, insufficient and shrinking availability of affordable housing, and an unmet need for mental health and supportive services actually resulted in an increase in the number of people experiencing homelessness in Los Angeles County during the period under study. The numbers even suggest that Los Angeles was further from ending chronic homelessness than ever before. But the story is a bit more complicated than that.

In this final evaluation report on the community’s progress over five years (January 2011 through December 2015), Abt Associates Inc. found that the collaborative system developed during the first phase of the initiative represented a kind of turning point for the County in addressing chronic homelessness, which was needed more than ever by the end of 2015.

Field of Practice

  • Housing and Homelessness

What kinds of knowledge does this report open up?

This report goes beyond evaluating a single effort or initiative to look at the larger collaborative system of funding bodies and stakeholders involved in solving a problem like chronic homelessness. We often hear that no foundation can solve problems single-handedly, so it’s refreshing to see a report framework that takes this reality into account by not attempting to isolate the foundation-funded part of the work. The initiative’s strategy focused on a systemic approach that included goals such as leveraging public funds, demonstrating action by elected and public officials, and increasing the capacity of developers and providers to deliver permanent and supportive housing effectively, alongside the actual construction of thousands of housing units. By adopting this same systemic lens, the evaluation itself provides valuable insight not just into the issue of chronic homelessness in Los Angeles County, but also into how we might think about and evaluate programs and initiatives that are similarly collaborative or interdependent by design.

What makes it stand out?

This report is notable for two reasons. First is the evaluators’ willingness and ability to genuinely grapple with the discouraging fact that homelessness had gone up during the time of the initiative, as well as the foundation’s willingness to make this knowledge public. All too often, reports that don’t cast foundation strategies in the best possible light don’t see the light of day at all. Sadly, it is that kind of “sweeping under the rug” of knowledge that keeps us all in the dark. The second notable thing about this report is its design. The combination of a summary “dashboard” with easily digestible infographics about both the process of the evaluation and its findings, and a clear summary analysis for each strategic goal, makes this evaluation stand out from the crowd.

Key Quote

“From our vantage point, the Foundation’s investment in Systems Change was its most important contribution to the community’s effort to end chronic homelessness during Phase I of the Initiative. But that does not mean the Foundation’s investments in programs and knowledge dissemination did not make significant contributions. We believe it is the interplay of the three that yielded the greatest dividend.”

--Gabriela Fitz

In the Know: #OpenForGood Staff Pick
November 1, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Native Arts & Cultures Foundation.


Staff Pick: Native Arts & Cultures Foundation

Progressing Issues of Social Importance Through the Work of Indigenous Artists: A Social Impact Evaluation of the Native Arts and Cultures Foundation's Pilot Community Inspiration Program

Download the Report

Quick Summary

Impact measurement is a challenge for all kinds of organizations, and arts and culture organizations in particular often struggle with how to quantify the impact they are making. How does one measure the social impact of an epic spoken word poem, of a large-scale temporary art installation, or of performance art? The same is true of measuring the impact of social change efforts: how can these be measured in the short term, given the usual pace of change? This report provides a good example of how to overcome both of these struggles.

In 2014, the Native Arts & Cultures Foundation (NACF) launched a new initiative, the Community Inspiration Program (CIP), which is rooted in the understanding that arts and cultures projects have an important role to play in motivating community engagement and supporting social change.

This 2017 report considers the social impacts of the 2014 CIP projects—what effects did they have on communities and on the issues, conversations, and connections that are critical in those communities? Its secondary purpose is to provide the NACF with ideas for how to improve its grantmaking in support of arts for community change.

Field(s) of Practice

  • Arts and Culture
  • Native and Indigenous Communities
  • Social Change
  • Community Engagement

What kinds of knowledge does this report open up?

This report opens up knowledge about the pilot phase of a new initiative whose intended impacts, community inspiration and social change, are vital but difficult concepts to operationalize and measure. The evaluation provides valuable insight into how foundations can encourage the inclusion of indigenous perspectives and truths not only in the design of their programs but also in the evaluation of those same programs.

What makes it stand out?

Several key aspects make this report noteworthy. First, the evaluation combines more traditional methods and data with what the authors call an "aesthetic-appreciative" evaluation lens, which accounts for a set of dimensions associated with aesthetic projects, such as "disruption," "stickiness," and "communal meaning," providing a more holistic analysis of the projects. Further, because the evaluation focused on Native-artist-led projects, it relied on the guidance of indigenous research strategies. This intentionality around developing strategies and principles for stakeholder inclusion makes the report a noteworthy and useful framework for others, regardless of whether Native communities are the focus of your evaluation.

Key Quote

"Even a multiplicity of evaluation measures may not 'truly' tell the story of social impact if, for evaluators, effects are unobservable (for example, they occur at a point in the future that is beyond the evaluation's timeframe), unpredictable (so that evaluators don't know where to look for impact), or illegible (evaluators cannot understand that they are seeing the effects of a project)."

--Gabriela Fitz

Open Access to Foundation Knowledge
October 25, 2017

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. This post also appears on Medium. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Foundations have a lot of reasons to share knowledge. They produce knowledge themselves. They hire others to research and author works that inform strategy development and the evaluation of strategies, programs, and projects. And they make grants that help others gain insight into social issues — be it through original research, evaluation work, or other work aimed at creating a better understanding of issues so that we can all pursue better solutions to social problems. In almost all aspects of foundation work, knowledge is an outcome.

While openly sharing this knowledge is uneven across the social sector, we do see more and more foundations starting to explore open access to the knowledge assets they make possible. Many foundations are sharing more intentionally through their websites, external clearinghouses, and other online destinations. And more foundations are suggesting — sometimes requiring — that their grantees openly share knowledge that was produced with grant dollars.

Some foundations are even becoming open access champions. For example, the Hewlett Foundation has authored a terrifically helpful free toolkit, the “Hewlett Foundation Open Licensing Toolkit for Staff,” that provides an in-depth how-to aimed at moving foundation and grantee intellectual property licensing practices away from “all rights reserved” copyrights and toward “some rights reserved” open licenses. (Full disclosure: IssueLab is included in the toolkit as one solution for long-term knowledge preservation and sharing.)

For those who are already 100% open, it’s easy to forget that, when first starting out, learning about open access can be daunting. For those who are trying to open up, getting there, like most things, is a series of steps. One step is understanding how licensing can work for, or against, openness. Hewlett’s toolkit is a wonderful primer on this. IssueLab also offers some ways to dig into other areas of openness. Check out Share the Wealth for tips.

However it is that foundations find their way to providing open access to the knowledge they make possible, we applaud and support it! In the spirit of International Open Access Week’s theme, “Open in order to….,” here’s what a few leading foundations have to say about the topic of openness in the social sector.

James Irvine Foundation 
Find on IssueLab.

“We have a responsibility to share our knowledge. There’s been a lot of money that gets put into capturing and generating knowledge and we shouldn’t keep it to ourselves.”

-Kim Ammann Howard, Director of Impact Assessment and Learning

Hewlett Foundation
Find on IssueLab.

“Our purpose for existing is to help make the world a better place. One way we can do that is to try things, learn, and then share what we have learned. That seems obvious. What is not obvious is the opposite: not sharing. So the question shouldn’t be why share; it should be why not share.”

-Larry Kramer, President

Hawaii Community Foundation
Find on IssueLab.

“Openness and transparency is one element of holding ourselves accountable to the public — to the communities we’re either in or serving. To me, it’s a necessary part of our accountability and I don’t think it should necessarily be an option.”

-Tom Kelly, Vice President of Knowledge, Evaluation and Learning

The David and Lucile Packard Foundation
Find on IssueLab.

“Why do we want to share these things? …One, because it’s great to share what we’re learning, what’s worked, what hasn’t, what impact has been made so that others can learn from the work that our grantees are doing so that they can either not reinvent the wheel, gain insights from it or learn from where we’ve gone wrong… I think it helps to build the field overall since we’re sharing what we’re learning.”

-Bernadette Sangalang, Program Officer

The Rockefeller Foundation
Find on IssueLab.

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations — well before the results are even known.”

-Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly
(Read more on why the Rockefeller Foundation is open for good.)

If you are a foundation ready to make open access the norm as part of your impact operations, here’s how you can become an open knowledge organization today.

IssueLab believes that social sector knowledge is a public good that is meant to be freely accessible to all. We collect and share the sector’s knowledge assets and we support the social sector’s adoption of open knowledge practices. Visit our collection of ~23,000 open access resources. While you’re there, add your knowledge — it takes minutes and costs nothing. Find out what we’re open in order to do here. IssueLab is a service of Foundation Center.

--Lisa Brooks and Lacey Althouse

How "Going Public" Improves Evaluations
October 17, 2017

Edward Pauly is director of research and evaluation at The Wallace Foundation. This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As foundations strive to be #OpenForGood and share key lessons from their grantees' work, a frequent question that arises is how foundations can balance the value of openness with concerns about potential risks.

Concerns about risk are particularly charged when it comes to evaluations. Those concerns include: possible reputational damage to grantees from a critical or less-than-positive evaluation; internal foundation staff disagreements with evaluators about the accomplishments and challenges of grantees they know well; and evaluators’ delays and complicated interpretations.

It therefore may seem counterintuitive to embrace – as The Wallace Foundation has – the idea of making evaluations public and distributing them widely. And one of the key reasons may be surprising: To get better and more useful evaluations.

The Wallace Foundation has found that high-quality evaluations – by which we mean independent, commissioned research that tackles questions that are important to the field – are often a powerful tool for improving policy and practice. We have also found that evaluations are notably improved in quality and utility by being publicly distributed.

Incentives for High Quality

A key reason is that the incentives of a public report for the author are aligned with quality in several ways:

  • Evaluation research teams know that when their reports are public and widely distributed, they will be closely scrutinized and their reputation is on the line. Therefore, they do their highest quality work when it’s public.  In our experience, non-public reports are more likely than public reports to be weak in data use, loose in their analysis, and even a bit sloppy in their writing.  It is also noteworthy that some of the best evaluation teams insist on publishing their reports.
  • Evaluators also recognize that they benefit from the visibility of their public reports because visibility brings them more research opportunities – but only if their work is excellent, accessible and useful.
  • We see evaluators perk up when they focus on the audience their reports will reach. Gathering data and writing for a broad audience of practitioners and policymakers incentivizes evaluators to seek out and carefully consider the concerns of the audience: What information does the audience need in order to judge the value of the project being evaluated? What evidence will the intended audience find useful? How should the evaluation report be written so it will be accessible to the audience?

Making evaluations public is a classic case of a virtuous circle: public scrutiny creates incentives for high quality, accessibility and utility; high quality reports lead to expanded, engaged audiences – and the circle turns again, as large audiences use evaluation lessons to strengthen their own work, and demand more high-quality evaluations. To achieve these benefits, it’s obviously essential for grantmakers to communicate upfront and thoroughly with grantees about the goals of a public evaluation report -- goals of sharing lessons that can benefit the entire field, presented in a way that avoids any hint of punitive or harsh messaging.

“What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work?”

Asking the Right Questions

A key difference between evaluations commissioned for internal use and evaluations designed to produce public reports for a broad audience lies in the questions they ask. Of course, for any evaluation or applied research project, a crucial precursor to success is getting the questions right. In many cases, internally-focused evaluations quite reasonably ask questions about the lessons for the foundation as a grantmaker. Evaluations for a broad audience of practitioners and policymakers, including the grantees themselves, typically ask a broader set of questions, often emphasizing lessons for the field on how an innovative program can be successfully implemented, what outcomes are likely, and what policies are likely to be supportive.

In shaping these efforts at Wallace as part of the overall design of initiatives, we have found that one of the most valuable initial steps is to ask field leaders: What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work? This kind of listening can help a foundation get the questions right for an evaluation whose findings will be valued, and used, by field leaders and practitioners.

Knowledge at Work

For example, school district leaders interested in Wallace-supported “principal pipelines,” which could help ensure a reliable supply of effective principals, wanted to know the costs of starting such pipelines and maintaining them over time. The result was a widely used RAND report that we commissioned, “What It Takes to Operate and Maintain Principal Pipelines: Costs and Other Resources.” RAND found that costs are less than one half of 1% of districts’ expenditures; the report also explained what drives costs, and provided a very practical checklist of the components of a pipeline that readers can customize and adapt to meet their local needs.

Other examples that show how high-quality public evaluations can help grantees and the field include:

Being #OpenForGood does not happen overnight, and managing an evaluation planned for wide public distribution isn’t easy. The challenges start with getting the question right – and then selecting a high-performing evaluation team; allocating adequate resources for the evaluation; connecting the evaluators with grantees and obtaining relevant data; managing the inevitable and unpredictable bumps in the road; reviewing the draft report for accuracy and tone; allowing time for grantees to fact-check it; and preparing with grantees and the research team for the public release. Difficulties, like rocks on a path, crop up in each stage in the journey. Wallace has encountered all of these difficulties, and we don’t always navigate them successfully. (Delays are a persistent issue for us.)

Since we believe that the knowledge we produce is a public good, it follows that the payoff of publishing useful evaluation reports is worth it. Interest from the field is evidenced by 750,000 downloads last year from www.wallacefoundation.org, and a highly engaged public discourse about what works, what doesn’t, why, and how – rather than the silence that often greets many internally-focused evaluations.

--Edward Pauly

Opening Up the Good and Bad Leads to Stronger Communities and Better Grantmaking
September 28, 2017

Hanh Cao Yu is Chief Learning Officer at The California Endowment.  She has been researcher and evaluator of equity and philanthropy for more than two decades. 

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

More than a year ago when I began my tenure at The California Endowment (TCE), I reflected deeply about the opportunities and challenges ahead as the new Chief Learning Officer.  We were six years into a complex, 10-year policy/systems change initiative called Building Healthy Communities (BHC).  This initiative was launched in 2010 to advance statewide policy, change the narrative, and transform 14 of California’s communities most devastated by health inequities into places where all people—particularly our youth—have an opportunity to thrive.  This is the boldest bet in the foundation’s history at $1 billion and the stakes are high.  It is not surprising, then, that despite the emphasis on learning, the evaluation of BHC is seen as a winning or losing proposition. 

“By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve.”

As I thought about the role of learning and evaluation in deepening BHC’s impact, I became inspired by the words of Nelson Mandela: “I never lose.  I either win or I learn.”  His encouragement to shift our mindset from “Win/Lose” to “Win/Learn” is crucial to continuous improvement and success.  

I also drew on the insights of Amy Edmondson, who reminds us that not all failures are bad.  According to Edmondson, mistakes can be preventable, unavoidable due to complexity, or even intelligent failures.  So, despite careful planning and learning from decades of research on comprehensive community initiatives and bold systems change efforts, in an initiative as complex as BHC, mistakes can and will occur. By spurring change at community, regional and state levels, and linking community mobilization with sophisticated policy advocacy, TCE was truly venturing into new territory when we launched BHC.

BHC's Big Wins and Lessons 

At the mid-point of BHC, TCE staff and Board paused to assess where we have been successful and where we could do better in improving the conditions under which young people could be healthy and thrive in our underserved communities.  The results were widely shared in the 2016 report, A New Power Grid:  Building Healthy Communities at Year 5.

As a result of taking the time to assess overall progress, we identified some of BHC's biggest impacts to date. In the first five years, TCE and partners contributed to significant policy/system wins:

  • Improved health coverage for the underserved;
  • Strengthened health coverage policy for the undocumented;
  • Improved school climate, wellness and equity;
  • Prevention and reform within the justice system;
  • Public-private investment and policy changes on behalf of boys and young men of color; and
  • Local & regional progress in adoption of “Health In All Policies,” a collaborative approach incorporating health considerations into decision-making across all policy areas

Our Board and team are very pleased with the results and impact of BHC to date, but we have been committed to learning from our share of mistakes. 

Along with the victories, we acknowledged in the report some hard lessons.  Most notable among our mistakes were two areas that needed more attention:

  • Putting Community in “Community-Driven” Change.  Armed with lessons about the importance of clarity in achieving results, we overthought the early process.  This resulted in prescriptiveness in the planning phase that was not only unnecessary, but also harmful. We entered the community planning process with multiple outcomes frameworks and a planning process that struck many partners as philanthropic arrogance. The smarter move was to engage community leaders with the clarity of a shared vision and operating principles, and create the space for community leaders and residents to incubate goals, results, and strategy. Fortunately, we course corrected, and our partners were patient while we did so.
  • Revisiting assumptions about local realities and systems dynamics.  In the report, we discussed our assumption about creating a single locus of inside-out, outside-in activity where community residents, leaders and systems leaders could collaborate on defined goals. It was readily apparent that community leaders distrusted many “systems” insiders, and systems leaders viewed outsider/activists as unreasonable. We underestimated the importance of the roles of historical structural inequalities, context, and dynamics of relationships at the local level.  Local collaboratives or “hubs” were reorganized and customized to meet local realities, and we threw the concept of a single model of collaboration across all the sites out the window.

Some course corrections we made included adjusting and sharpening our underlying assumptions and theory of change and taking on new community-driven priorities that we never anticipated early on; examples include school discipline reform, dismantling the prison pipeline in communities of color through prevention, and work that is taking place in TCE’s Boys & Young Men of Color portfolio.  By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve. 

“Some partner feedback was difficult to hear, but all of it was useful and is making our work with partners stronger.”

Further, significant developments have occurred since the report:

Positioning “Power Building” as central to improving complex systems and policies.  In defining key performance indicators, we know the policy milestones achieved thus far represent only surface manifestations of the ultimate success we are seeking.  We had a breakthrough when we positioned “building the power and voice” of the adults and youth in our communities and “health equity” at the center of our BHC North Star Goals and Indicators.  Ultimately, we’ll know we are successful when the power dynamics in our partner communities have shifted so that adult and youth residents know how to hold local officials accountable for full, ongoing implementation of these policies.

Continuing to listen to our partners.  In addition to clarifying our North Stars, we sought further mid-point advice from our partners, reaching out to 175 stakeholders, including 68 youth and adult residents of BHC communities, for feedback to shape the remainder of BHC’s implementation and to inform our transition planning for the next decade.  Some of what our partners told us was difficult to hear, but all of it was useful and is making our work with partners stronger.    

From these lessons, I challenge our philanthropic colleagues to consider:

  • How can we learn to detect complex failures early to help us go beyond lessons that are superficial? As Amy Edmondson states, “The job of leaders is to see that their organizations don’t just move on after a failure but stop to dig in and discover the wisdom contained in it.”
  • In complex initiatives and complex organizations, what does it take to design a learning culture that capitalizes successfully on mistakes? How do we truly engage in “trial and error” and stay open to experimentation and midcourse corrections?  How can we focus internally on our own operations and ways of working, as well as be willing to change our strategies and relationships with external partners?  Further, how do we, as grantmakers responsible for serving the public good, take responsibility for making these lessons #OpenForGood so others can learn from them as well?

It is worth noting that a key action TCE took at the board level as we embarked on BHC was to dissolve the Board Program Committee and replace it with a Learning and Performance Committee.  This structure created consistent opportunities for the Board, the CEO, and the management team to learn from evaluation reports and to share our learnings publicly to build the philanthropic field.  Now, even as we enter the final phase of BHC, we continue to look for ways to structure opportunities to learn, and I can say, “We are well into a journey to learn intelligently from our successes as well as our mistakes to make meaningful, positive impacts.”

--Hanh Cao Yu

Championing Transparency: The Rockefeller Foundation Is First to Share All Evaluations As Part of #OpenForGood
September 26, 2017

The Rockefeller Foundation staff who authored this post are Veronica Olazabal, Director of Measurement, Evaluation, and Organizational Performance; Shawna Hoffman, Measurement, Evaluation, and Organizational Performance Specialist; and Nadia Asgaraly, Measurement and Evaluation Intern.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


Today, aligned with The Rockefeller Foundation's commitments to sharing and accountability, we are proud to be the first foundation to accept the challenge and proactively make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

A History of Transparency and Sharing

Since its founding more than 100 years ago, The Rockefeller Foundation's mission has remained unchanged: to promote the well-being of humanity throughout the world. To this end, the Foundation seeks to catalyze and scale transformative innovation across sectors and geographies, and take risks where others cannot, or will not. While working in innovative spaces, the Foundation has always recognized that the full impact of its programs and investments can only be realized if it measures - and shares - what it is learning. Knowledge and evidence sharing is core to the organization's DNA dating back to its founder John D. Rockefeller Sr., who espoused the virtues of learning from and with others—positing that this was the key to "enlarging the boundaries of human knowledge."

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known.”

Evaluation for the Public Good

Building the evidence base for the areas in which we work is the cornerstone of The Rockefeller Foundation's approach to measurement and evaluation. By systematically tracking progress toward implementation and outcomes of our programs, and by testing, validating, and assessing our assumptions and hypotheses, we believe that we can manage and optimize our impact. Through the documentation of what works, for whom, and under what conditions, there is potential to amplify our impact by crowding in other funders to promising solutions and diverting resources from approaches that prove ineffectual.

But living out transparency as a core value is not without its challenges. A commitment to the principle of transparency alone is insufficient; organizations, especially foundations, must walk the talk. Sharing evidence requires the political will and human resources to do so, and more importantly, getting comfortable communicating not only one's successes, but also one's challenges and failures. For this reason, to ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known. Then, once evaluation reports are finalized, they are posted to the Foundation website, available to the public free of charge.

#OpenForGood Project

The Foundation Center's #OpenForGood project, and IssueLab's related Results platform, help take the Foundation's commitment to sharing and strengthening the evidence base to the next level. By building a repository where everyone can identify others working on similar topics, search for answers to specific questions, and quickly identify where knowledge gaps exist, they are leading the charge on knowledge sharing.

The Rockefeller Foundation is proud to support this significant effort by being the first to contribute its evaluation evidence base to IssueLab: Results as part of the #OpenForGood movement, with the hope of encouraging others to do the same.

-- Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be
    directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a
    guest contributor, contact:
    glasspockets@foundationcenter.org
