Transparency Talk

Category: "Reporting" (38 posts)

Increasing Attention to Transparency: The MacArthur Foundation Is #OpenForGood
April 17, 2018

Chantell Johnson is managing director of evaluation at the John D. and Catherine T. MacArthur Foundation. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

At MacArthur, the desire to be transparent is not new. We believe philanthropy has a responsibility to be explicit about its values, choices, and decisions with regard to its use of resources. Toward that end, we have long had an information sharing policy that guides what and when we share information about the work of the Foundation or our grantees. Over time, we have continued to challenge ourselves to do better and to share more. The latest refinement of our approach to transparency is an effort to share more of the knowledge we are gaining. We expect to continue to push ourselves in this regard, and participating in Foundation Center’s Glasspockets and #OpenForGood movements are just two examples of how this has manifested.

In recent years, we have made a more concerted effort to revisit and strengthen our information sharing policy by:

  • Expanding our thinking about what we can and should be transparent about (e.g., our principles of transparency guided our public communications around our 100&Change competition, which included an ongoing blog);
  • Making our guidance more contemporary by moving beyond statements about information sharing to publishing more and different kinds of information (e.g., Grantee Perception Reports and evaluation findings);
  • Making our practices related to transparency more explicit; and
  • Ensuring that our evaluation work is front and center in our efforts related to transparency.

Among the steps we have taken to increase our transparency are the following:

Sharing more information about our strategy development process.
The Foundation's website has a page dedicated to How We Work, which provides detailed information about our approach to strategy development. We share an inside look into the lifecycle of our programmatic efforts, from conceptualizing a grantmaking strategy through its implementation and wind-down, under an approach we refer to as Design/Build. Design/Build recognizes that social problems and conditions are not static, and thus our response to these problems needs to be iterative and evolve with the context to be most impactful. Moreover, we aim to be transparent as we design and build strategies over time.

“We have continued to challenge ourselves to do better and to share more.”

Using evaluation to document what we are measuring and learning about our work.
Core to Design/Build is evaluation. Evaluation has become an increasingly important priority among our program staff. It serves as a tool to document what we are doing, how well we are doing it, how work is progressing, what is being achieved, and who benefits. We value evaluation not only for the critical information it provides to our Board, leadership, and program teams, but for the insights it can provide for grantees, partners, and beneficiaries in the fields in which we aim to make a difference. Moreover, it provides the critical content that we believe is at the heart of many philanthropic efforts related to transparency.

Expanding the delivery mechanisms for sharing our work.
While our final evaluation reports have generally been made public on our website, we aim to make more of our evaluation activities and products available (e.g., landscape reviews and baseline and interim reports). Further, in an effort to make our evaluation work more accessible, we are among the first foundations to make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

Further evidence of the Foundation's commitment to increased transparency includes continuing to improve our “Glass Pockets” by sharing:

  • Our searchable database of grants, including award amount, program, year, and purpose;
  • Funding statistics including total grants, impact investments, final budgeted amounts by program, and administrative expenses (all updated annually);
  • Perspectives of our program directors and staff;
  • Links to grantee products including grant-supported research studies consistent with the Foundation's intellectual property policies;
  • Stories highlighting the work and impact of our grantees and recipients of impact investments; and
  • Center for Effective Philanthropy Grantee Perception Report results.

Going forward, we will look for additional ways to be transparent. And, we will challenge ourselves to make findings and learnings more accessible even more quickly.

--Chantell Johnson 

Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts
April 12, 2018

Laia Griñó is director of data discovery at Foundation Center. This post also appears in the Human Rights Funders Network's blog.

Over the last few months, this blog has presented insights gained from the Advancing Human Rights initiative’s five-year trend analysis. Getting to these insights would not have been possible had a growing number of funders not decided to consistently share more detailed data about their grantmaking, such as through Foundation Center’s eReporting program. In a field where data can pose real risks, some might feel that this openness is ill-advised. Yet transparency and data protection need not be at odds. By operating from a framework of responsible data, funders can simultaneously protect the privacy and security of grantees and contribute to making the human rights field more transparent, accountable, and effective.

This topic – balancing transparency and data protection – was the focus of a session facilitated by Foundation Center at the PEAK Grantmaking annual conference last month. Our goal was not to debate the merits of one principle over the other, but to help provide a framework that funders can use in determining how to share grants data, even in challenging circumstances. What follows are some of the ideas and tips discussed at that session (a caveat here: these tips focus on data shared voluntarily by funders on their website, with external partners like Foundation Center, etc.; we recognize that funders may also face legal reporting requirements that could raise additional issues).


  • Think of transparency as a spectrum: Conversations regarding data sharing often seem to end up at extremes: we must share everything or we can’t share anything. Instead, funders should identify what level of transparency makes sense for them by asking themselves two questions: (1) What portion of our grants portfolio contains sensitive data that could put grantees at risk if shared? and (2) For the portion of grants deemed sensitive, which grant details – if any – are possible to share? Based on our experience with Advancing Human Rights, in most cases funders will find that it is possible to share some, if not most, of their grants information.
  • Assess the risks of sharing data: Answering these questions requires careful consideration of the consequences if information about certain grants is made public, particularly for grantees’ security. As noted at the PEAK session, in assessing risks funders should not only consider possible negative actions by government actors, but also by actors like militant groups or even a grantee’s community or family. It is also important to recognize that risks can change over time, which is why it is so critical that funders understand what will happen with the data they share; if circumstances change, they need to know who should be notified so that newly sensitive data can be removed.
  • Get grantees’ input: Minimizing harm to grantees is of utmost importance to funders. And yet grantees usually have little or no input on decisions about what information is shared about them. Some funders do explicitly ask for grantees’ consent to share information, sometimes at multiple points along the grant process. This could take the form of an opt-in box included as part of the grant agreement process, for example. At a minimum, grantees should understand where and how data about the grant will be used.
  • Calibrate what is shared based on the level of risk: Depending on the outcomes of their risk assessment (and grantees’ input), a funder may determine that it’s inadvisable to share any details about certain grants. In these cases, funders may opt not to include those grants in their reporting at all, or to only report on them at an aggregate level (e.g., $2 million in grants to region or country X). In situations where it is possible to acknowledge a grant, funders can take steps to protect a grantee, such as: anonymizing the name of the grantee; providing limited information on the grantee’s location (e.g., country only); and/or redacting or eliminating a grant description (note: from our experience processing data, it is easy to overlook sensitive information in grant descriptions!).
  • Build data protection into grants management systems: Technology has an important role to play in making data protection systematic and, importantly, manageable. For example, some funders have “flags” to indicate which grants can be shared publicly or, conversely, which are sensitive. In one example shared at PEAK, a grants management system has been set up so that if a grant has been marked as sensitive, the grantee’s name will automatically appear as “Confidential” in any reports generated. These steps can minimize the risk of data being shared due to human error.
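The "sensitive flag" mechanism described above can be sketched in a few lines. This is purely an illustrative sketch, not code from any real grants management product; the `Grant` class and `render_report_row` function are assumed names invented for this example.

```python
# Illustrative sketch of a sensitive-grant flag that automatically masks
# grantee details in any public-facing report.
from dataclasses import dataclass

@dataclass
class Grant:
    grantee: str
    amount: int
    description: str
    sensitive: bool = False  # set by program staff during intake

def render_report_row(grant: Grant) -> dict:
    """Return a public-facing row; sensitive grants are masked automatically."""
    return {
        "grantee": "Confidential" if grant.sensitive else grant.grantee,
        "amount": grant.amount,
        # Redact descriptions too -- sensitive details are easy to overlook there.
        "description": "" if grant.sensitive else grant.description,
    }

grants = [
    Grant("Open Arts Collective", 50_000, "General operating support"),
    Grant("At-Risk Defenders Org", 75_000, "Legal aid in country X", sensitive=True),
]
rows = [render_report_row(g) for g in grants]
```

Because the masking happens in one place at report generation rather than relying on staff to remember which grants to exclude, human error is far less likely to expose a grantee.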

Transparency is at the core of Foundation Center’s mission. We believe deeply that transparency can not only help build public trust but also advance more inclusive and effective philanthropy. For that reason, we are committed to being responsible stewards of the data that is shared with us (see the security plan for Advancing Human Rights, for example). A single conference session or blog post cannot do justice to such a complex and long-debated topic. We are therefore thankful that our colleagues at Ariadne, 360Giving, and The Engine Room have just started a project to provide funders with greater guidance around this issue (learn more in these two thoughtful blog posts from The Engine Room, here and here). We look forward to seeing and acting on their findings!

--Laia Griñó

From Dark Ages to Enlightenment: A Magical Tale of Mapping Human Rights Grantmaking
April 4, 2018

Mona Chun is Executive Director of Human Rights Funders Network, a global network of grantmakers committed to effective human rights philanthropy.

Once upon a time, back in the old days of 2010, human rights funders were sitting alone in their castles, with no knowledge of what their peers in other towers and castles were doing – just the certainty that their issue area, above all others, was underfunded. Each castle also spoke its own language, making it difficult for castle communities to learn from one another. This lack of transparency and shared language about common work and goals meant everyone was working in the dark.

Then a gender-neutral knight, clad in human rights armor (ethically produced of course), arrived in the form of our Advancing Human Rights research. With this research in hand, funders can now:

  • Peer out from their towers across the beautiful funding landscape;
  • Use a telescope to look at what their peers are doing, from overall funding trends to grants-level detail;
  • Use a common language to compare notes on funding priorities and approaches;
  • Find peers with whom to collaborate and new grantee partners to support; and
  • Refine and strengthen their funding strategies.

Armed with this knowledge, human rights funders can leave their towers and visit others, even government towers, to advocate and leverage additional resources in their area of interest.

Mapping Uncharted Territory

The Advancing Human Rights initiative, a partnership between Human Rights Funders Network (HRFN) and Foundation Center, has mapped more than $12 billion in human rights funding from foundations since 2010. Because of the great potential such data has to inform and improve our collective work, many years of work went into this. Ten years ago, HRFN recognized that in order to help human rights funders become more effective in their work, we needed to get a better understanding of where the money was going, what was being funded and how much was being spent. After our initial planning, we partnered with Foundation Center, brought in Ariadne and Prospera as funder network collaborators, formed a global Advisory Committee and hashed out the taxonomy to develop a shared language. Then, we began the process of wrangling funders to share their detailed grantmaking data.

It was no easy feat, but we published the first benchmark report on human rights grantmaking for 2010, and since then, we have worked to improve the research scope and process and trained funders to use the tools we’ve developed. In January, we released our first ever trends analysis. Over the five years of data collection featured on the Advancing Human Rights research hub, we’ve compiled almost 100,000 human rights grants from funders in 114 countries.

Adopting A Can-Do Attitude

In 2010, major funders in our network didn’t believe this could be done.

First, could we get the grantmaking data from members? For the first few years, we campaigned hard to get members to share their detailed grants information. We created a musical “Map It” parody (set to the tune of Devo’s “Whip It”) and launched a Rosie the Riveter campaign (“You Can Do It: Submit Your Data!”). We deployed pocket-size fold-outs and enormous posters thanking foundations for their participation. Several years later, we have seen our gimmicks bear fruit: 780 funders contributed data in our most recent year. When we began, no human rights data was being gathered from funders outside North America. In our first year, we incorporated data from 49 foundations outside North America and in the most recent year, that number more than doubled to 109. The value of participation is now clear. Repeated nudging is still necessary, but not gimmicks.

Rosie Collage
The Human Rights Funder Network celebrates its Rosie the Riveter “You Can Do It: Submit Your Data!” campaign. Photo Credit: Human Rights Funders Network

Data Makes A Difference

Once we had the research, could we get busy funders to use the data? With all the hard work being done in the field and so much to learn from it, we were committed to creating research that would be used. Focusing as much energy on sharing the research as we had compiling it, we aimed to minimize unused reports sitting on shelves. Global tours, presentations, workshops and tutorials have resulted in funders sharing story after story of how they are putting the findings to use:

  • Sift through the data to inform their strategic plans and understand where they sit vis-à-vis their peers;
  • Use the tools to break out of their silos and build collaborative initiatives;
  • Use the research to advocate to their boards, their governments, and their constituencies; and
  • Enter new areas of work or geographies knowing the existing landscape of organizations on the ground, search for donors doing complementary work, and discover the issues most and least funded.

Overall, their decisions can be informed by funding data that did not exist before, beyond the wishful daydreams of funders in their towers.

I wish I could say that we’ll live happily ever after with this data. But the pursuit of human rights is a long-term struggle. Those committed to social change know that progress is often accompanied by backlash. As we face the current challenging times together, sometimes we just need to recognize how far we’ve come and how much more we know, holding on to the magic of possibility (and the occasional fairy tale) to inspire us for the still long and winding, but newly illuminated, road ahead.

--Mona Chun

In the Know: #OpenForGood Staff Pick December 2017
December 20, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Conrad N. Hilton Foundation. Read last month's staff pick here.


Staff Pick: Conrad N. Hilton Foundation

Evaluation of the Conrad N. Hilton Foundation Chronic Homelessness Initiative: 2016 Evaluation Report, Phase I

Download the Report

Quick Summary


In 2011, the Conrad N. Hilton Foundation partnered with Abt Associates Inc. to conduct an evaluation of the Hilton Foundation’s Chronic Homelessness Initiative, with the goal of answering an overarching question: Is the Chronic Homelessness Initiative an effective strategy to end and prevent chronic homelessness in Los Angeles County?

Answering that question has not been so easy. And it bears mentioning that this is not one of those reports that strives to prove a certain model is working; instead, it provides a suitably complicated picture of an issue that will be an ongoing, multi-agency struggle. A combination of economic conditions, insufficient and shrinking availability of affordable housing, and an unmet need for mental health and supportive services actually resulted in an increase in the number of people experiencing homelessness in Los Angeles County during the time period under study. The numbers even suggest that Los Angeles was further from ending chronic homelessness than ever before. But the story is a bit more complicated than that.

In this final evaluation report on the community’s progress over five years (January 2011 through December 2015), Abt Associates Inc. found that the collaborative system developed during the first phase of the initiative actually represented a kind of turning point in the County's capacity to address chronic homelessness, a capacity that was needed more than ever by the end of 2015.

Field of Practice

  • Housing and Homelessness

What kinds of knowledge does this report open up?

This report goes beyond evaluating a single effort or initiative to look at the larger collaborative system of funding bodies and stakeholders involved in solving a problem like chronic homelessness. We often hear that no foundation can solve problems single-handedly, so it’s refreshing to see a report framework that takes this reality into account by not just attempting to isolate the foundation-funded part of the work. The initiative’s strategy focused on a systemic approach that included goals such as leveraging public funds, demonstrated action by elected and public officials, and increased capacity among developers and providers to deliver permanent and supportive housing effectively, alongside the actual construction of thousands of housing units. By adopting this same systemic lens, the evaluation itself provides valuable insight into not just the issue of chronic homelessness in Los Angeles County, but also into how we might think about and evaluate programs and initiatives that are similarly collaborative or interdependent by design.

What makes it stand out?

This report is notable for two reasons. First is the evaluators’ willingness and ability to genuinely grapple with the discouraging fact that homelessness had gone up during the time of the initiative, as well as the foundation’s willingness to share this knowledge by publishing and sharing it. All too often, reports that don’t cast foundation strategies in the best possible light don’t see the light of day at all. Sadly, it is that kind of “sweeping under the rug” of knowledge that keeps us all in the dark. The second notable thing about this report is its design. The combination of a summary “dashboard” with easily digestible infographics about both the process of the evaluation and its findings, and a clear summary analysis for each strategic goal, makes this evaluation stand out from the crowd.

Key Quote

“From our vantage point, the Foundation’s investment in Systems Change was its most important contribution to the community’s effort to end chronic homelessness during Phase I of the Initiative. But that does not mean the Foundation’s investments in programs and knowledge dissemination did not make significant contributions. We believe it is the interplay of the three that yielded the greatest dividend.”

--Gabriela Fitz

No Pain, No Gain: The Reality of Improving Grant Descriptions
November 8, 2017

Gretchen Schackel is Grants Manager of the James F. and Marion L. Miller Foundation in Portland, Oregon.

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series explores the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

Join us at a session about the Open 990-PF in partnership with Southern California Grantmakers. Learn more or register here.                                   

You know those blog posts that describe adopting a best practice? The ones that make it sound so easy and tempting that you try it, only to be let down because you discover that either you are doing something terribly wrong, or it is a lot harder than the author made it sound because they left out all of the pain points? Well, don’t worry—this is not one of those posts! In fact, I will start off with the pain points so you can go in eyes wide open, if, like me, you end up on a quest to improve your foundation’s grant descriptions.

This post is a sequel to another Transparency Talk article that recently featured our foundation’s executive director, detailing lessons learned about why improving grants data is important to the foundation, as well as to the sector as a whole. That article ended with a brief snapshot of some “before and after” grant descriptions, showing how we are working to improve the way we tell the story of each grant, so I’m picking up here where that left off to share an honest, behind-the-scenes look at what it took to get from the before to the after.

“Capturing critical details when writing accurate and complete grant descriptions aids your efforts on the 990-PF form.”

Pain Relievers

As the grants manager, it’s my job to put the right processes in place so we can capture critical details when writing grant descriptions to ensure that they are accurate, complete, and, well… actually descriptive (AKA “Purpose of grant or contribution” on Form 990-PF). This fall marks my 11-year anniversary at the Miller Foundation, and one thing that has remained constant throughout my tenure is what a pain writing good grant descriptions can be if you don’t know where to begin. So, I’m sharing my playbook below, because the communities we are serving, and how we are serving them, deserve to be described and celebrated. I’ve learned some tips and work-arounds along the way that I’ll share as I inventory the various obstacles you might encounter.

Pain Point #1:

Lean Staffing. We are a staff of four people: Executive Director, Program Officer, Grants Manager, and Administrative Assistant. We don’t publish an annual report; we have just started using social media, and just completed a website redesign. This makes all of us part-time communications staff. I wouldn’t describe this as a best practice, but it’s the reality at many foundations.  

Pain Reliever #1:

Grant Descriptions Can Serve Many Purposes. As mentioned above, the editorial process involved in prepping text for public consumption can be labor intensive, particularly in organizations without a communications department. Grant descriptions, which represent the substance of our work, turn out to be handy for small organizations like ours because they can serve many purposes. They are used for our minutes, our website, our 990-PF, and for our eReport to Foundation Center for its searchable databases. We don’t have time to write different grant descriptions for each specific use. So, we write one grant description that we can use in multiple platforms and situations.

Pain Point #2:

Garbage In – Garbage Out. Data starts with the grantees, and I know from talking to them that they are often not well equipped with the time or technology to collect good data. It’s not just about what questions we are asking, but how we are helping our grantees understand what we need and helping them get us the best data possible.

Pain Reliever #2:

You have to work with what you’ve got. And what we have is the information provided by potential grantees in their applications. Most of the information we need can be found in the “Brief summary of the grant request” question on the grant application. Rather than treat this as a test that potential grantees must either pass or fail, we provide detailed instructions about the kind of information we would like to see in the summary as part of our online application process. Taking the guesswork out of the application has improved the data quality we receive at the start of the grant. Our arts portfolio also requires that grantees participate in DataArts, a collective database into which grantees enter their data once so that all arts funders can access it. Participating in field-building shortcuts like this is a great way to make the process more efficient for everyone.

Once you have the framework in place to get a good grant summary from your prospective grantees, however, your work is not yet done.  Often, important elements of the funded grant can change during board deliberations, so I find it essential to share the grant summary with our program staff before finalizing to ensure we are capturing the detail accurately.

Pain Point #3:

Lack of an Industry Standard. There are probably as many ways to write a grant description as there are foundations, and reinventing wheels is a waste of our collective time, so I have long wished for a framework we could all agree to follow.

Pain Reliever #3:

The Get on the Map Campaign.

We have learned a lot from Foundation Center’s Get on the Map campaign about the elements of a great grant description. The Get on the Map campaign is a partnership between United Philanthropy Forum and Foundation Center designed to improve philanthropic data, and includes a helpful framework that details the best way to share your data with Foundation Center and the public. What I immediately loved about it is how it reminded me of being that weird kid who loved to diagram sentences in junior high. But perhaps it’s not that strange since I know grants managers enjoy turning chaos into order. So, let's try to use sentence diagramming as a model for writing grant descriptions.

The Anatomy of a Good Grant Description

First, we’ll start with the four elements of a good grant description and assign each a color.

  • WHAT: What is the primary objective of the grant?
  • WHO:  Are there any specifically intended beneficiaries?
  • HOW: What are the primary strategies of the grant?
  • WHERE: Where will the grant funds be used, if beyond the recipient's address?
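In the spirit of the diagramming metaphor, the four elements above can even be treated as fields in a simple fill-in template. This is a minimal illustrative sketch only; the `grant_description` function and its connective wording are my assumptions, not part of the Get on the Map framework.

```python
# Illustrative sketch: assemble the WHAT / WHO / HOW / WHERE elements
# of a grant description into one sentence.
def grant_description(what: str, who: str = "", how: str = "", where: str = "") -> str:
    parts = [f"To support {what}"]          # WHAT: primary objective (required)
    if how:
        parts.append(f"through {how}")      # HOW: primary strategies
    if who:
        parts.append(f"serving {who}")      # WHO: intended beneficiaries
    if where:
        parts.append(f"in {where}")         # WHERE: geographic area served
    return " ".join(parts) + "."

desc = grant_description(
    what="the Chicas Youth Development program",
    who="500 Latina girls in grades 3-12",
    where="Washington County",
)
# desc -> "To support the Chicas Youth Development program serving
#          500 Latina girls in grades 3-12 in Washington County."
```

A template like this won't write a great description by itself, but it does force each element to be captured explicitly, which is the point of the diagramming exercise.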

Example #1:

We’ll start with an easy example. Program support grant descriptions often write themselves:

Brief summary of the grant request from application form:

“We are seeking support for Chicas Youth Development which serves over 500 Latina girls and their families in grades 3-12 in Washington County. Chicas launched in 2008 and has since grown to partner with three Washington County school districts and over 500 local families each year to offer after school programming, leadership, and community service opportunities for Latina youth and their families.”

Grant Description: To support the Chicas Youth Development program, which serves 500 Latina girls in grades 3-12 in Washington County.

That was pretty easy! Particularly because of how we improved the clarity of what we ask for.

Example #2:

The grant below is also a project grant, but the Brief summary of the grant request from the application is a little less straightforward:

“GRANTEE requests $AMOUNT to support the presentation of two new publications and four community readings featuring the writing of diverse voices: people who are experiencing homelessness, immigrants and refugees living in our community, seniors living on a low income, LGBTQ folks, people living with a disability, and many others whose voices often live on the margins. This project will bring together people to experience and explore art and will focus on those with the least access to do so.”

Grant Description: To support community building through publication and public readings of works written by marginalized populations.

Example #3:

This grant is for both general operating support and a challenge grant. Tricky.

GRANTEE respectfully requests $AMOUNT over two years to support program growth as well as provide a matching challenge for individual donations as we continue to increase our sustainability through support from individual donors. If awarded, $AMOUNT would be put to general operating funds to support our continued program growth in all areas: traditional high school program, statewide initiative pilot program and our college program. The remaining $AMOUNT each year would serve as a matching challenge grant. In order to be eligible for the match, GRANTEE would have to raise $AMOUNT in new and increased individual donations each year of the grant period.

Okay Grant Description: To support program growth and provide a matching challenge for individual donations.

Good Grant Description: General operating funds to support program growth and a challenge grant to increase support from individual donors.

Better Grant Description: This grant was awarded in two parts: 1. General operating funds for mission related activities that provide intensive support to low-income high school juniors and seniors in Oregon. 2. A 1:1 challenge grant to increase support from individual donors.

The above description is a perfect example of why it’s important to read the proposal narrative as well as confer with program staff.

If you follow this process, I can’t promise it will be painless, but it will go a long way to relieving a lot of the pain points that come with grants management—particularly the grants management of today in which grants managers are at the crossroads of being data managers, information officers, and storytellers.  I have found making this journey is worth it. Because, after all, behind every grant lies a story waiting to be told and a community waiting to hear it. So, let’s get our stories straight!

--Gretchen Schackel

How "Going Public" Improves Evaluations
October 17, 2017

Edward Pauly is director of research and evaluation at The Wallace Foundation. This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As foundations strive to be #OpenForGood and share key lessons from their grantees' work, a frequent question that arises is how foundations can balance the value of openness with concerns about potential risks.

Concerns about risk are particularly charged when it comes to evaluations. Those concerns include: possible reputational damage to grantees from a critical or less-than-positive evaluation; internal foundation staff disagreements with evaluators about the accomplishments and challenges of grantees they know well; and evaluators’ delays and complicated interpretations.

It therefore may seem counterintuitive to embrace – as The Wallace Foundation has – the idea of making evaluations public and distributing them widely. And one of the key reasons may be surprising: To get better and more useful evaluations.

The Wallace Foundation has found that high-quality evaluations – by which we mean independent, commissioned research that tackles questions that are important to the field – are often a powerful tool for improving policy and practice. We have also found that evaluations are notably improved in quality and utility by being publicly distributed.

Incentives for High Quality

A key reason is that a public report aligns the author’s incentives with quality in several ways:

  • Evaluation research teams know that when their reports are public and widely distributed, they will be closely scrutinized and their reputation is on the line. Therefore, they do their highest quality work when it’s public.  In our experience, non-public reports are more likely than public reports to be weak in data use, loose in their analysis, and even a bit sloppy in their writing.  It is also noteworthy that some of the best evaluation teams insist on publishing their reports.
  • Evaluators also recognize that they benefit from the visibility of their public reports because visibility brings them more research opportunities – but only if their work is excellent, accessible and useful.
  • We see evaluators perk up when they focus on the audience their reports will reach. Gathering data and writing for a broad audience of practitioners and policymakers incentivizes evaluators to seek out and carefully consider the concerns of the audience: What information does the audience need in order to judge the value of the project being evaluated? What evidence will the intended audience find useful? How should the evaluation report be written so it will be accessible to the audience?

Making evaluations public is a classic case of a virtuous circle: public scrutiny creates incentives for high quality, accessibility and utility; high quality reports lead to expanded, engaged audiences – and the circle turns again, as large audiences use evaluation lessons to strengthen their own work, and demand more high-quality evaluations. To achieve these benefits, it’s obviously essential for grantmakers to communicate upfront and thoroughly with grantees about the goals of a public evaluation report -- goals of sharing lessons that can benefit the entire field, presented in a way that avoids any hint of punitive or harsh messaging.

“What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work?”

Asking the Right Questions

A key difference between evaluations commissioned for internal use and evaluations designed to produce public reports for a broad audience lies in the questions they ask. Of course, for any evaluation or applied research project, a crucial precursor to success is getting the questions right. In many cases, internally-focused evaluations quite reasonably ask questions about the lessons for the foundation as a grantmaker. Evaluations for a broad audience of practitioners and policymakers, including the grantees themselves, typically ask a broader set of questions, often emphasizing lessons for the field on how an innovative program can be successfully implemented, what outcomes are likely, and what policies are likely to be supportive.

In shaping these efforts at Wallace as part of the overall design of initiatives, we have found that one of the most valuable initial steps is to ask field leaders: What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work? This kind of listening can help a foundation get the questions right for an evaluation whose findings will be valued, and used, by field leaders and practitioners.

Knowledge at Work

For example, school district leaders interested in Wallace-supported “principal pipelines,” which could help ensure a reliable supply of effective principals, wanted to know the costs of starting such pipelines and maintaining them over time. The result was a widely-used RAND report that we commissioned, “What It Takes to Operate and Maintain Principal Pipelines: Costs and Other Resources.” RAND found that costs are less than one half of 1% of districts’ expenditures; the report also explained what drives costs, and provided a very practical checklist of the components of a pipeline that readers can customize and adapt to meet their local needs.

Other examples that show how high-quality public evaluations can help grantees and the field include:

Being #OpenForGood does not happen overnight, and managing an evaluation planned for wide public distribution isn’t easy. The challenges start with getting the question right – and then selecting a high-performing evaluation team; allocating adequate resources for the evaluation; connecting the evaluators with grantees and obtaining relevant data; managing the inevitable and unpredictable bumps in the road; reviewing the draft report for accuracy and tone; allowing time for grantees to fact-check it; and preparing with grantees and the research team for the public release. Difficulties, like rocks on a path, crop up in each stage in the journey. Wallace has encountered all of these difficulties, and we don’t always navigate them successfully. (Delays are a persistent issue for us.)

Since we believe that the knowledge we produce is a public good, it follows that the payoff of publishing useful evaluation reports is worth it. Interest from the field is evidenced by 750,000 downloads last year from www.wallacefoundation.org, and a highly engaged public discourse about what works, what doesn’t, why, and how – rather than the silence that often greets many internally-focused evaluations.

--Edward Pauly

No Moat Philanthropy Part 5: The Downsides & Why It’s Worth It
October 6, 2017

Jen Ford Reedy is President of the Bush Foundation. On the occasion of her fifth anniversary leading the foundation, she reflects on efforts undertaken to make the Bush Foundation more permeable. Because the strategies and tactics she shares can be inspiring and helpful for any grantmaker exploring ways to open up their grantmaking, we have devoted this blog space all week to the series. This is the final post in the five-part series.

Everything we do is a trade-off. Spending time and money on the activities described in this No Moat Philanthropy series means time and money not invested in something else. Here are some of the downsides of the trade-offs we have made:

It takes some operating expense.  It requires real staff time for us to do office hours in western North Dakota and to reformat grant reports to be shared online and to do every other activity described in these posts. We believe there is lots of opportunity to advance our mission in the “how” of grantmaking and weigh that as an investment alongside others. In our case, we did not have an increase in staff costs or operating expenses as we made this shift. We just reprioritized.

It can be bureaucratic.  Having open programs and having community members involved in processes requires some structure and rules and standardization in a way that can feel stifling. Philanthropy feels more artful and inspired when you can be creative and move quickly. To be equitably accessible and to improve the chance we are funding the best idea, we are committed to making this trade-off. (While, of course, being as artful and creative as possible within the structures we set!)

“We believe our effectiveness is fundamentally tied to our ability to influence and be influenced by others.”

Lots of applications means lots of turndowns.  Conventional wisdom in philanthropy is to try to limit unsuccessful applications, reducing the amount of effort nonprofits invest with no return. This is an important consideration, and it is why many foundations have very narrow guidelines and/or don’t accept unsolicited proposals. The flip side, however, is that the more we all narrow our funding apertures, the harder it is for organizations to get great ideas funded. We’ve decided to run counter to conventional wisdom and give lots of organizations a shot at funding. Of course, we don’t want to waste their time. We have three strategies to mitigate this: (1) through our hotlines, we try to coach unlikely grantees out of the process (in our experience, nonprofits will often apply anyway, which suggests to us that they value having a shot even if the odds are long); (2) we try to make the process worth it – our surveys suggest that applicants in the programs with the biggest pools get something out of the process, and we learn from the applicants even if they are not funded; and (3) we try to make the first stage of our processes as simple as possible so folks are not wasting too much effort.

Relationships are hard!  Thinking of ourselves as being in relationship with people in the region is not simple. There are lots of them! And it can be super frustrating if a Bush staff member gives advice on a hotline that seems to be contradicted by the feedback when an application is declined. We’ve had to invest money and time in developing our CRM capacity and habits. We have a lot more work to do on this front. We will never not have a lot more work to do on our intercultural competence and our efforts to practice inclusion. Truly including people with different perspectives can make decisions harder, even as it makes them better.  The early returns on our efforts have been encouraging, and we are committed to continuing the work to be more fully in relationship with more people in the communities we serve.

Conclusion

Overall, we believe a No Moat Philanthropy approach has made us more effective. When we are intentional about having impact through how we do our work — building relationships, inspiring action, spreading optimism — then we increase the positive impact we have in the region.

We believe our effectiveness is fundamentally tied to our ability to influence and be influenced by others, which demands trust, reciprocity and a genuine openness to the ideas of others. It requires understanding perspectives other than our own. It requires permeability.

While we arrived at this approach largely because of our place-based sensibility and strategic orientation toward people (see learning paper: “The Bush Approach”), the same principles can apply to a national or international foundation focused on particular issues. The definition of community is different, but the potential value of permeability within that community is the same.

--Jen Ford Reedy

Opening Up the Good and Bad Leads to Stronger Communities and Better Grantmaking
September 28, 2017

Hanh Cao Yu is Chief Learning Officer at The California Endowment.  She has been researcher and evaluator of equity and philanthropy for more than two decades. 

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

More than a year ago, when I began my tenure at The California Endowment (TCE), I reflected deeply on the opportunities and challenges ahead as the new Chief Learning Officer.  We were six years into a complex, 10-year policy/systems change initiative called Building Healthy Communities (BHC).  This initiative was launched in 2010 to advance statewide policy, change the narrative, and transform 14 of California’s communities most devastated by health inequities into places where all people—particularly our youth—have an opportunity to thrive.  This is the boldest bet in the foundation’s history at $1 billion, and the stakes are high.  It is not surprising, then, that despite the emphasis on learning, the evaluation of BHC is seen as a winning or losing proposition. 

“By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve.”

As I thought about the role of learning and evaluation in deepening BHC’s impact, I became inspired by the words of Nelson Mandela: “I never lose.  I either win or I learn.”  His encouragement to shift our mindset from “Win/Lose” to “Win/Learn” is crucial to continuous improvement and success.  

I also drew from the insights of Amy Edmondson who reminds us that if we experience failure, not all failures are bad.  According to Edmondson, mistakes can be preventable, unavoidable due to complexity, or even intelligent failures.  So, despite careful planning and learning from decades of research on comprehensive community initiatives and bold systems change efforts, in an initiative as complex as BHC, mistakes can and will occur. By spurring change at community, regional and state levels, and linking community mobilization with sophisticated policy advocacy, TCE was truly venturing into new territory when we launched BHC.

BHC's Big Wins and Lessons 

At the mid-point of BHC, TCE staff and Board paused to assess where we have been successful and where we could do better in improving the conditions under which young people could be healthy and thrive in our underserved communities.  The results were widely shared in the 2016 report, A New Power Grid:  Building Healthy Communities at Year 5.

As a result of taking the time to assess overall progress, we identified some of BHC's biggest impacts to date. In the first five years, TCE and partners contributed to significant policy/system wins:

  • Improved health coverage for the underserved;
  • Strengthened health coverage policy for the undocumented;
  • Improved school climate, wellness and equity;
  • Prevention and reform within the justice system;
  • Public-private investment and policy changes on behalf of boys and young men of color; and
  • Local & regional progress in adoption of “Health In All Policies,” a collaborative approach incorporating health considerations into decision-making across all policy areas

Our Board and team are very pleased with the results and impact of BHC to date, but we have been committed to learning from our share of mistakes. 

Along with the victories, we acknowledged in the report some hard lessons.  Most notable among our mistakes were two areas that needed more attention:

  • Putting Community in “Community-Driven” Change.  Armed with lessons about the need for clarity on results in order to achieve results, we overthought the early process.  This resulted in a prescriptiveness in the planning phase that was not only unnecessary, but also harmful. We entered the community planning process with multiple outcomes frameworks and a planning process that struck many partners as philanthropic arrogance. The smarter move would have been to engage community leaders with the clarity of a shared vision and operating principles, and to create the space for community leaders and residents to incubate goals, results, and strategy. Fortunately, we course-corrected, and our partners were patient while we did so.
  • Revisiting assumptions about local realities and systems dynamics.  In the report, we discussed our assumption about creating a single locus of inside-out, outside-in activity where community residents, leaders and systems leaders could collaborate on defined goals. It was readily apparent that community leaders distrusted many “systems” insiders, and systems leaders viewed outsider/activists as unreasonable. We underestimated the importance of the roles of historical structural inequalities, context, and dynamics of relationships at the local level.  Local collaboratives or “hubs” were reorganized and customized to meet local realities, and we threw the concept of a single model of collaboration across all the sites out the window.

Some course corrections we made included adjusting and sharpening our underlying assumptions and theory of change and taking on new community-driven priorities that we never anticipated early on; examples include school discipline reform, dismantling the prison pipeline in communities of color through prevention, and work that is taking place in TCE’s Boys & Young Men of Color portfolio.  By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve. 

“Some partner feedback was difficult to hear, but all of it was useful and is making our work with partners stronger.”

Further, significant developments have occurred since the report:

Positioning “Power Building” as central to improving complex systems and policies.  In defining key performance indicators, we know the policy milestones achieved thus far represent only surface manifestations of the ultimate success we are seeking.  We had a breakthrough when we positioned “building the power and voice” of the adults and youth in our communities and “health equity” at the center of our BHC North Star Goals and Indicators.  Ultimately, we’ll know we are successful when the power dynamics in our partner communities have shifted so that adult and youth residents know how to hold local officials accountable for full, ongoing implementation of these policies.

Continuing to listen to our partners.  In addition to clarifying our North Stars, we sought further mid-point advice from our partners, reaching out to 175 stakeholders, including 68 youth and adult residents of BHC communities, for feedback to shape the remainder of BHC’s implementation and to inform our transition planning for the next decade.  Some of what our partners told us was difficult to hear, but all of it was useful and is making our work with partners stronger.    

From these lessons, I challenge our philanthropic colleagues to consider:

  • How can we learn to detect complex failures early to help us go beyond lessons that are superficial? As Amy Edmondson states, “The job of leaders is to see that their organizations don’t just move on after a failure but stop to dig in and discover the wisdom contained in it.”
  • In complex initiatives and complex organizations, what does it take to design a learning culture to capitalize successfully on mistakes? How do we truly engage in “trial and error” and stay open to experimentation and midcourse corrections?  How can we focus internally on our own operations and ways of work, as well as being willing to change our strategies and relationships with external partners?  Further, how do we, as grantmakers responsible for serving the public good, take responsibility for making these lessons #OpenForGood so others can learn from them as well?

It is worth noting that a key action TCE took at the board level as we embarked on BHC was to dissolve the Board Program Committee and replace it with a Learning and Performance Committee.  This set-up created consistent opportunities for the Board, the CEO, and the management team to learn from evaluation reports and to share our learnings publicly to build the philanthropic field.  Now, even as we enter the final phase of BHC, we continue to look for ways to structure opportunities to learn, and I can say, “We are well into a journey to learn intelligently from our successes as well as our mistakes to make meaningful, positive impacts.”

--Hanh Cao Yu

Championing Transparency: The Rockefeller Foundation Is First to Share All Evaluations As Part of #OpenForGood
September 26, 2017

The Rockefeller Foundation staff who authored this post are Veronica Olazabal, Director of Measurement, Evaluation, and Organizational Performance; Shawna Hoffman, Measurement, Evaluation, and Organizational Performance Specialist; and Nadia Asgaraly, Measurement and Evaluation Intern.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


Today, aligned with The Rockefeller Foundation's commitments to sharing and accountability, we are proud to be the first foundation to accept the challenge and proactively make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

A History of Transparency and Sharing

Since its founding more than 100 years ago, The Rockefeller Foundation's mission has remained unchanged: to promote the well-being of humanity throughout the world. To this end, the Foundation seeks to catalyze and scale transformative innovation across sectors and geographies, and take risks where others cannot, or will not. While working in innovative spaces, the Foundation has always recognized that the full impact of its programs and investments can only be realized if it measures - and shares - what it is learning. Knowledge and evidence sharing is core to the organization's DNA dating back to its founder John D. Rockefeller Sr., who espoused the virtues of learning from and with others—positing that this was the key to "enlarging the boundaries of human knowledge."

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known.”

Evaluation for the Public Good

Building the evidence base for the areas in which we work is the cornerstone of The Rockefeller Foundation's approach to measurement and evaluation. By systematically tracking progress toward implementation and outcomes of our programs, and by testing, validating, and assessing our assumptions and hypotheses, we believe that we can manage and optimize our impact. Through the documentation of what works, for whom, and under what conditions, there is potential to amplify our impact by crowding-in other funders to promising solutions and diverting resources from being wasted on approaches that prove ineffectual.

But living out transparency as a core value is not without its challenges. A commitment to the principle of transparency alone is insufficient; organizations, especially foundations, must walk the talk. Sharing evidence requires the political will and human resources to do so and, more importantly, a willingness to communicate not only one's successes, but also one's challenges and failures. For this reason, and to ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known. Then, once evaluation reports are finalized, they are posted to the Foundation website, available to the public free of charge.

#OpenForGood Project

The Foundation Center's #OpenForGood project, and IssueLab's related Results platform, help take the Foundation's commitment to sharing and strengthening the evidence base to the next level. By building a repository where everyone can identify others working on similar topics, search for answers to specific questions, and quickly identify where knowledge gaps exist, they are leading the charge on knowledge sharing.

The Rockefeller Foundation is proud to support this significant effort by being the first to contribute its evaluation evidence base to IssueLab: Results as part of the #OpenForGood movement, with the hope of encouraging others to do the same.

-- Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly

Trend to Watch: Using SDGs to Improve Foundation Transparency
September 19, 2017

Janet Camarena is director of transparency initiatives at Foundation Center.

As Foundation Center's director of transparency initiatives, one of the most interesting parts of my job is having the opportunity to play "transparency scout," regularly reviewing foundation websites for signs of openness in what is too often a closed universe. Some of this scouting leads to lifting up practices that can be examples for others on our Transparency Talk blog, sometimes it leads to a new transparency indicator on our assessment framework, and sometimes we just file it internally as a "trend to watch."

Today, it's a combination of all three; we are using this blog post to announce the launch of a new, "Trend to Watch" indicator that signals an emerging practice: the use of the Sustainable Development Goals to improve how foundations open up their work to the world.

The United Nations' Sustainable Development Goals (SDGs), otherwise known as the Global Goals, are a universal call to action to end poverty, protect the planet, and ensure that all people enjoy peace and prosperity. There are 17 goals in total, such as no poverty, zero hunger, reduced inequalities, and climate action. Written deliberately broadly to serve as a collective playbook that governments and the private sector alike can use, they can also serve as a much-needed shared language across philanthropy and across sectors to signal areas of common interest and measure shared progress.

And let's face it: as foundation strategies become increasingly specialized and strategic, explaining the objectives and the nuances can become a jargon-laden minefield that makes it difficult and time-consuming for those on the outside to fully understand the intended goal of a new program or initiative. The simplicity of the SDG iconography cuts through the jargon so foundation website visitors can quickly identify alignment with the goals or not, and then more easily determine whether they should devote time to reading further. The SDGs also provide a clear visual framework to display grants and outcomes data in a way that is meaningful beyond the four walls of the foundation.

Let's take a look at how some foundation websites are using the SDGs to more clearly explain their work:

Silicon Valley Community Foundation (SVCF)

One of my favorite examples is a simple chart the Silicon Valley Community Foundation shared on its blog, because it specifically opens up the work of its donor-advised funds using the SDGs. Donor-advised funds are typically not the most transparent vehicles, so using the SDGs as a framework to tally how SVCF's donor-advised funds are making an impact is particularly clever and refreshing, and it offers a new window into a fast-growing area of philanthropy.

A quick glance at the chart reveals that quality education, good health and well-being, and sustainable cities and communities are the most common priorities among Silicon Valley donors.
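A chart like SVCF's boils down to a simple aggregation of grant records by goal. As a rough sketch of that idea (the fund names, goals, and dollar amounts below are made up for illustration, not SVCF's actual data):

```python
from collections import Counter

# Hypothetical grant records from donor-advised funds; all values illustrative.
grants = [
    {"fund": "Fund A", "sdg": "Quality Education", "amount": 50000},
    {"fund": "Fund B", "sdg": "Good Health and Well-Being", "amount": 30000},
    {"fund": "Fund C", "sdg": "Quality Education", "amount": 20000},
]

# Tally grant dollars per SDG.
totals = Counter()
for g in grants:
    totals[g["sdg"]] += g["amount"]

# Print goals from largest to smallest total, as a summary chart might.
for goal, amount in totals.most_common():
    print(f"{goal}: ${amount:,}")
```

With real grant data, a tally like this is all it takes to surface which goals a portfolio of funds actually supports.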

GHR Foundation

A good example of how the SDGs can be used as a shared language to explain the intended impact of a grant portfolio is from GHR Foundation in Minnesota. I also like this example because it shows how the SDGs can be effectively used in both global and domestic grant portfolios. GHR uses the SDG iconography across all of its portfolios, as sidebars on the pages that describe foundation strategies. GHR's "Children in Families" is a core foundation grantmaking strategy that addresses children and families in need on a global scale. The portfolio name is a broad one, but by including the SDG iconography, web visitors can quickly understand that GHR is using this program area to address poverty, hunger, as well as lead to outcomes tied to health and well-being:

GHR is also able to use the SDG framework to create similar understanding of its domestic work. Below is an example from its Catholic Schools program serving the Twin Cities:

Through the visual cues the icons provide, I can quickly determine that, in addition to aligning with the quality education goal, this part of GHR's portfolio also addresses hunger and economically disadvantaged populations through its education grantmaking. This could also signal that the grantmaker interprets education broadly and supports the provision of wrap-around services to address the needs of low-income children as a holistic way of addressing the achievement gap. That's a lot of information conveyed with three small icons!

Tableau Foundation

The most sophisticated example comes to us from the tech and corporate grantmaking worlds – the Tableau Foundation. Tableau makes data visualization software, so using technology to improve transparency is a core approach, and the company uses its own grantmaking as an example of how data can tell a compelling visual story. Through the interactive "Living Annual Report" on its website, Tableau regularly updates its grantmaking tallies and grantee data so web visitors have near real-time information. One of the tabs in the report reveals the SDG indicators, providing a quick snapshot of how Tableau's grantmaking, software donations, and corporate volunteering align with the SDGs.

As you mouse over any bar on the left, near real-time data appears, tallying how much of Tableau's funding has gone to support each goal. The interactive bar chart on the right lists Tableau's grantees, and visitors can quickly see the grantee list in the context of the SDGs as well as know the specific scale of its grantmaking to each recipient.

If you're inspired by these examples but aren't sure how to begin connecting your portfolio to the Global Goals, you can use the SDG Indicator Wizard to help you get started. All you need to do is copy and paste your program descriptions or the descriptive language of a sample grant into the Wizard, and its machine-learning tools let you know where your grantmaking lands on the SDG matrix. It's a lot of fun – and a great place to start learning about the SDGs. And, because it transforms your program language into the relevant SDG goals, indicators, and targets, it may just provide a shortcut to that new strategy you were thinking of developing!
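The Wizard itself uses machine learning, but the underlying idea of mapping program language to goals can be sketched with something much simpler. The keyword lists and goal labels below are hypothetical and purely for illustration; they are not the Wizard's actual logic:

```python
# Hypothetical keyword map for a handful of SDGs (illustrative only).
SDG_KEYWORDS = {
    "SDG 1: No Poverty": {"poverty", "low-income", "economic hardship"},
    "SDG 2: Zero Hunger": {"hunger", "food", "nutrition"},
    "SDG 3: Good Health and Well-Being": {"health", "wellness", "well-being"},
    "SDG 4: Quality Education": {"education", "school", "students"},
}

def tag_program(description: str) -> list[str]:
    """Return the SDGs whose keywords appear in a program description."""
    text = description.lower()
    return [goal for goal, words in SDG_KEYWORDS.items()
            if any(w in text for w in words)]

print(tag_program(
    "General operating funds for intensive support to low-income "
    "high school juniors and seniors"))
# → ['SDG 1: No Poverty', 'SDG 4: Quality Education']
```

A real classifier would be far more nuanced, but even this naive matching shows how a grant description can be translated into the shared SDG vocabulary.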

Want more examples? The good news is we're also tracking SDG use as a transparency indicator at "Who Has Glasspockets?" You can view them all here. Is your foundation using the SDGs to help tell the story of its work? We're always on the lookout for new examples, so let us know and your foundation could be the next trendsetter in our new Trend to Watch.

-- Janet Camarena

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be
    directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a
    guest contributor, contact:
    glasspockets@foundationcenter.org
