Transparency Talk

Category: "Evaluation" (85 posts)

Glasspockets Find: GuideStar’s Good Practices for Foundations Leads with Transparency and Openness
November 15, 2017

Earlier this year, GuideStar released an informative report, “A Guide to Good Practices in Foundation Operations,” that offers tips to eliminate foundation inefficiencies and increase open and responsive grantmaking. The report’s title deliberately refers to “good practices” rather than “best practices,” signaling that a one-size-fits-all approach is not appropriate given the unique nature of foundations; the report also cautions, however, that this same uniqueness can foster a “one-size-fits-one” culture that creates great inefficiencies for grantseeking organizations and for the sector as a whole.

At Glasspockets, we are happy to see that transparency topped GuideStar’s list of practices philanthropy should adopt to overcome these challenges.

Download the Report.

In addition to transparency, GuideStar’s good foundation practices cover a range of topics including communications, power dynamics, constituency relations, diversity, and due diligence. Specifically, the report recommends the following tips to eliminate inefficiencies and maximize social sector impact:

  1. Be Transparent to the Public
  2. Be Rigorous—But Remain Respectful of Your Applicants
  3. Be Responsive to Your Constituents
  4. Be Proactive about Diversity, Equity, and Inclusion

“We believe that foundations of all shapes and sizes can apply these practices. We also believe that civil society will be much more efficient, stronger, and more effective if all foundations adopt them,” the report states.

Be Transparent to the Public

Transparency not only benefits grantseekers and the public, but it also benefits grantmakers. GuideStar notes that grantmakers who are open and transparent are more likely to pursue excellence and to be responsive to their constituents and to public criticism. Another benefit: “The act of transparency can force an organization to be clear about its goals and strategy.”

GuideStar also highlights how foundations can learn from their peers and develop benchmarks through the Glasspockets Transparency Trends tool, which helps foundations compare their transparency practices with those of others and create a customized report with recommendations.

Be Rigorous—But Remain Respectful of Your Applicants

GuideStar suggests foundations can determine the “health of a grantseeker” by: verifying its eligibility to accept grants; confirming that the nonprofit’s proposal aligns with the grantmaker’s mission; and checking on the grant applicant’s role in the community and the field. However, GuideStar cautions foundations against making unreasonable and overly stringent demands, such as requesting redundant information or unnecessary documentation, that could impede nonprofits from fulfilling their missions. For example, a foundation could use readily available third-party resources to learn about a grantseeker’s legal status, impact, and financial health, or check its own current records, before requesting that information from the nonprofit.

Be Responsive to Your Constituents

Funders should not overlook the use of staff expertise to inform new directions. For example, staff feedback mechanisms should be in place so that staff experience and observations can inform foundation strategies and missions. GuideStar encourages funders to use an in-house or third-party survey to gather “staff perceptions of their relationships with managers, whether staffers believe they are empowered to do their jobs, and their perceptions of organizational culture.” GuideStar also states that beneficiary feedback mechanisms represent an under-used but effective means of informing foundation strategy.

Be Proactive about Diversity, Equity, and Inclusion

Funders’ efforts to address diversity, equity, and inclusion should be internalized and synthesized as a “keystone value for an organization,” not the product of “ad hoc efforts or in response to public campaigns.” GuideStar emphasizes that diversity is essential to maximizing a foundation’s impact on social good because it “encourages innovation, energizes organizations, and widens perspectives.” Diversity should be reflected at every level of the organization, from the board of directors to line staff.

Moving Forward

In light of change and uncertainty in society, GuideStar notes that foundations continue to play an important role in influencing and empowering change in the social sector. With GuideStar’s insightful and practical suggestions to address inefficiencies and implement good practices, foundations have opportunities to create internal changes that can have long-lasting impact inside and outside of foundation walls. What good practices is your foundation currently implementing, and which good practices will you aim for?

--Melissa Moy

In the Know: #OpenForGood Staff Pick
November 1, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Native Arts & Cultures Foundation.


Staff Pick: Native Arts & Cultures Foundation

Progressing Issues of Social Importance Through the Work of Indigenous Artists: A Social Impact Evaluation of the Native Arts and Cultures Foundation's Pilot Community Inspiration Program

Download the Report

Quick Summary

Impact measurement is a challenge for all kinds of organizations, and arts and culture organizations in particular often struggle with how to quantify the impact they are making. How does one measure the social impact of an epic spoken word poem, or of a large-scale, temporary art installation, or of performance art? The same is true of measuring the impact of social change efforts--how can these be measured in the short term given the usual pace of change? This report provides a good example of how to overcome both of these struggles.

In 2014, the Native Arts & Cultures Foundation (NACF) launched a new initiative, the Community Inspiration Program (CIP), which is rooted in the understanding that arts and cultures projects have an important role to play in motivating community engagement and supporting social change.

This 2017 report considers the social impacts of the 2014 CIP projects—what effects did they have on communities and on the issues, conversations, and connections that are critical in those communities? Its secondary purpose is to provide the NACF with ideas for how to improve its grantmaking in support of arts for community change.

Field(s) of Practice

  • Arts and Culture
  • Native and Indigenous Communities
  • Social Change
  • Community Engagement

This report opens up knowledge about the pilot phases of a new initiative whose intended impacts, community inspiration and social change, are vital but difficult concepts to operationalize and measure. The evaluation provides valuable insight into how foundations can encourage the inclusion of indigenous perspectives and truths not only in the design of their programs but also in the evaluation of those same programs.

What makes it stand out?

Several key aspects make this report noteworthy. First, the evaluation pairs more traditional methods and data with what the authors call an "aesthetic-appreciative" evaluation lens, which accounts for a set of dimensions associated with aesthetic projects such as "disruption," "stickiness," and "communal meaning," providing a more holistic analysis of the projects. Further, because the evaluation focused on Native-artist-led projects, it relied on the guidance of indigenous research strategies. This intentionality around developing strategies and principles for stakeholder inclusion makes the report a noteworthy and useful framework for others, regardless of whether Native communities are the focus of your evaluation.

Key Quote

"Even a multiplicity of evaluation measures may not 'truly' tell the story of social impact if, for evaluators, effects are unobservable (for example, they occur at a point in the future that is beyond the evaluation's timeframe), unpredictable (so that evaluators don't know where to look for impact), or illegible (evaluators cannot understand that they are seeing the effects of a project)."

--Gabriela Fitz

How "Going Public" Improves Evaluations
October 17, 2017

Edward Pauly is director of research and evaluation at The Wallace Foundation. This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As foundations strive to be #OpenForGood and share key lessons from their grantees' work, a frequent question that arises is how foundations can balance the value of openness with concerns about potential risks.

Concerns about risk are particularly charged when it comes to evaluations. Those concerns include: possible reputational damage to grantees from a critical or less-than-positive evaluation; internal foundation staff disagreements with evaluators about the accomplishments and challenges of grantees they know well; and evaluators’ delays and complicated interpretations.

It therefore may seem counterintuitive to embrace – as The Wallace Foundation has – the idea of making evaluations public and distributing them widely. And one of the key reasons may be surprising: To get better and more useful evaluations.

The Wallace Foundation has found that high-quality evaluations – by which we mean independent, commissioned research that tackles questions that are important to the field – are often a powerful tool for improving policy and practice. We have also found that evaluations are notably improved in quality and utility by being publicly distributed.

Incentives for High Quality

A key reason is that the incentives of a public report for the author are aligned with quality in several ways:

  • Evaluation research teams know that when their reports are public and widely distributed, they will be closely scrutinized and their reputation is on the line. Therefore, they do their highest quality work when it’s public.  In our experience, non-public reports are more likely than public reports to be weak in data use, loose in their analysis, and even a bit sloppy in their writing.  It is also noteworthy that some of the best evaluation teams insist on publishing their reports.
  • Evaluators also recognize that they benefit from the visibility of their public reports because visibility brings them more research opportunities – but only if their work is excellent, accessible and useful.
  • We see evaluators perk up when they focus on the audience their reports will reach. Gathering data and writing for a broad audience of practitioners and policymakers incentivizes evaluators to seek out and carefully consider the concerns of the audience: What information does the audience need in order to judge the value of the project being evaluated? What evidence will the intended audience find useful? How should the evaluation report be written so it will be accessible to the audience?

Making evaluations public is a classic case of a virtuous circle: public scrutiny creates incentives for high quality, accessibility and utility; high quality reports lead to expanded, engaged audiences – and the circle turns again, as large audiences use evaluation lessons to strengthen their own work, and demand more high-quality evaluations. To achieve these benefits, it’s obviously essential for grantmakers to communicate upfront and thoroughly with grantees about the goals of a public evaluation report -- goals of sharing lessons that can benefit the entire field, presented in a way that avoids any hint of punitive or harsh messaging.

“What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work?”

Asking the Right Questions

A key difference between evaluations commissioned for internal use and evaluations designed to produce public reports for a broad audience lies in the questions they ask. Of course, for any evaluation or applied research project, a crucial precursor to success is getting the questions right. In many cases, internally-focused evaluations quite reasonably ask questions about the lessons for the foundation as a grantmaker. Evaluations for a broad audience of practitioners and policymakers, including the grantees themselves, typically ask a broader set of questions, often emphasizing lessons for the field on how an innovative program can be successfully implemented, what outcomes are likely, and what policies are likely to be supportive.

In shaping these efforts at Wallace as part of the overall design of initiatives, we have found that one of the most valuable initial steps is to ask field leaders: What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work? This kind of listening can help a foundation get the questions right for an evaluation whose findings will be valued, and used, by field leaders and practitioners.

Knowledge at Work

For example, school district leaders interested in Wallace-supported “principal pipelines,” which could help ensure a reliable supply of effective principals, wanted to know the costs of starting such pipelines and maintaining them over time. The result was a widely-used RAND report that we commissioned, “What It Takes to Operate and Maintain Principal Pipelines: Costs and Other Resources.” RAND found that costs are less than one half of 1% of districts’ expenditures; the report also explained what drives costs, and provided a very practical checklist of the components of a pipeline that readers can customize and adapt to meet their local needs.

Other examples that show how high-quality public evaluations can help grantees and the field include:

Being #OpenForGood does not happen overnight, and managing an evaluation planned for wide public distribution isn’t easy. The challenges start with getting the question right – and then selecting a high-performing evaluation team; allocating adequate resources for the evaluation; connecting the evaluators with grantees and obtaining relevant data; managing the inevitable and unpredictable bumps in the road; reviewing the draft report for accuracy and tone; allowing time for grantees to fact-check it; and preparing with grantees and the research team for the public release. Difficulties, like rocks on a path, crop up at each stage of the journey. Wallace has encountered all of these difficulties, and we don’t always navigate them successfully. (Delays are a persistent issue for us.)

Since we believe that the knowledge we produce is a public good, it follows that publishing useful evaluation reports is worth the effort. Interest from the field is evidenced by 750,000 downloads last year from www.wallacefoundation.org, and by a highly engaged public discourse about what works, what doesn’t, why, and how – rather than the silence that often greets internally-focused evaluations.

--Edward Pauly

No Moat Philanthropy Part 4: Beyond the Transactional
October 5, 2017

Jen Ford Reedy is President of the Bush Foundation. On the occasion of her fifth anniversary leading the foundation, she reflects on efforts undertaken to make the Bush Foundation more permeable. Because the strategies and tactics she shares can be inspiring and helpful for any grantmaker exploring ways to open up their grantmaking, we are devoting our blog space all week to the series. This is the fourth post in the five-part series.

We have a grantmaking model that is based on the belief that, if we do it right, we will create more good by what we inspire than by what we directly fund. Principles #4 and #5 of No Moat Philanthropy relate directly to this belief: how connecting and sharing with others can advance your foundation’s mission.

Principle #4: Value every interaction as an opportunity to advance your mission

Our tagline and our strategy are one and the same: We invest in great ideas and the people who power them. We know that the only way anything happens is through people. Any place or field, therefore, is limited by the ambitions and the skills of the people in it.

The Bush Fellowship has been a flagship program of the Foundation for decades. We hear repeatedly from Bush Fellows that the experience changed what they thought was possible in their life and career. With the Bush Fellows program as our source code, we’ve been working for the past five years to ensure that all of our programs have the same effect. How can we encourage people to think bigger and think differently? How can we be a force for optimism?

This notion of a foundation being a force for optimism is not an obvious one. After all, we mostly tell people no. Last year, 95 percent of people who applied for the Bush Fellowship did not receive one. We’ve worked diligently to make sure all applicant interactions with us are helpful and encouraging, regardless of grant or fellowship outcome. And our surveys suggest the work is paying off. For example, 79 percent of declined Bush Fellowship applicants said the process increased their belief that they can accomplish “a lot.”

“If we do grantmaking right, we will create more good by what we inspire than by what we directly fund.”

To have this impact with each applicant, we:

Operate hotlines to speak with Bush staff. For our open programs, we have established hotlines for potential applicants. We will speak with people as many times as they like, providing coaching on their idea or proposal. For applicants, this is a way to clearly understand what we are looking for and to vet ideas with us. For Bush staff, this is a way to provide coaching and encouragement to strengthen proposals and to influence activities beyond those we fund.

Give feedback about declined applications. We offer feedback to declined applicants for our major grant and fellowship programs because we see this as another valuable opportunity to provide coaching and encouragement. We have also witnessed applicants using the feedback to improve their plans and proposals, which benefits both them and us. This two-way dialogue also allows applicants to share how we can improve the process for them.

Find ways to support declined applicants. In the course of our processes, we learn about far more amazing people and organizations than we can actually fund. Therefore, we try to find ways to be useful to more than just the limited number of accepted applicants. For example, we consider declined Bush Fellowship finalists to be part of our “Bush Network” and invite them to bushCONNECT. We also provide declined Bush Prize finalists with a $10,000 grant. In our hiring process, we offer unsuccessful finalists the chance to meet with our hiring consultant for an individual coaching session. In addition, across all our programs and operations, we try to craft our applications and our processes so that the experience of applying adds value to an applicant’s thinking and planning.

Every interaction is an opportunity to influence and be influenced.  Every interaction is an opportunity for shared learning. And that brings me to our fifth and final principle…

Principle #5: Share as you go.

In the past five years, we’ve been working to get more of what we are thinking — and learning — out to the community. This has required adjusting our standards and prioritizing just getting something out, even if it is not glossy and beautiful. It has required a new, shared understanding with grantees and Fellows that their reports and reflections will be public, so as many people as possible can benefit from their experience. It has required designing our internal work — like strategy documents for the Board — with external audiences in mind so they are ready to share.

We believe that if we do it right, we can have as much and potentially more impact from sharing the stories and spreading the lessons from our grantees and Fellows as from the investments themselves. This belief is at the heart of all our communications (see learning paper: “Communications as Program”) and is also reinforced with specific tactics such as:

“We potentially have more impact from sharing the stories and spreading the lessons from our grantees and Fellows.”

Post grantee reports on our website. We introduced “Learning Logs” to make grant reports public and, we hope, to give them life and utility beyond our walls. We refer prospective applicants to relevant learning logs as they craft their proposals, and we hear from applicants that they have indeed learned from them. Grantees and Fellows also share that they read one another’s Learning Logs as a way to get new ideas for overcoming barriers.

Share lessons along the way. We are publishing learning papers (like this one) as we believe we have something useful to share. We intended this to lower the bar for who shares, and when and how we share. Our learning papers are not beautiful. Most of them are not based on statistically significant evaluation methodologies. They simply document a staff effort to process something we are working on and to share our reflections.

Tie evaluation to audience analysis. We invest heavily in external evaluations of our work, but in doing so we have found that the end-product is often only useful to our staff and key stakeholders. Consequently, we introduced a different approach to thinking about evaluation with a sharing mindset. We use a framework to identify the audiences who might care about or benefit from the lessons of an evaluation, what questions are relevant to each group, and what form or output would be most useful to them.

Webinar to the max. Webinars are not a particularly novel activity; however, we view them as a core tool of permeability. We host a webinar at the beginning of every application period for Grant and Fellowship programs to explain the process and what we are looking for. We also host them when we have a job opening to discuss the role and what it is like to work here. We host them annually for our Foundation initiatives to explain what we are up to and where we are headed. Most webinars feature a staff presentation followed by an open Q&A, with videos archived on our website for anyone who missed them.

If you’ve been reading this series all week, you might be wondering when I’m going to get to the downsides of No Moat Philanthropy. All new approaches have their pain points.  So, come back tomorrow and I’ll share our pain and why we believe it is worth it.

--Jen Ford Reedy

Opening Up the Good and Bad Leads to Stronger Communities and Better Grantmaking
September 28, 2017

Hanh Cao Yu is Chief Learning Officer at The California Endowment. She has been a researcher and evaluator of equity and philanthropy for more than two decades.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

More than a year ago, when I began my tenure at The California Endowment (TCE), I reflected deeply on the opportunities and challenges ahead as the new Chief Learning Officer. We were six years into a complex, 10-year policy/systems change initiative called Building Healthy Communities (BHC). This initiative was launched in 2010 to advance statewide policy, change the narrative, and transform 14 of California’s communities most devastated by health inequities into places where all people—particularly our youth—have an opportunity to thrive. At $1 billion, this is the boldest bet in the foundation’s history, and the stakes are high. It is not surprising, then, that despite the emphasis on learning, the evaluation of BHC is seen as a winning or losing proposition.

“By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve.”

As I thought about the role of learning and evaluation in deepening BHC’s impact, I became inspired by the words of Nelson Mandela: “I never lose.  I either win or I learn.”  His encouragement to shift our mindset from “Win/Lose” to “Win/Learn” is crucial to continuous improvement and success.  

I also drew on the insights of Amy Edmondson, who reminds us that not all failures are bad. According to Edmondson, mistakes can be preventable, unavoidable due to complexity, or even intelligent failures. So, despite careful planning and learning from decades of research on comprehensive community initiatives and bold systems change efforts, in an initiative as complex as BHC, mistakes can and will occur. By spurring change at community, regional, and state levels, and by linking community mobilization with sophisticated policy advocacy, TCE was truly venturing into new territory when we launched BHC.

BHC's Big Wins and Lessons 

At the mid-point of BHC, TCE staff and Board paused to assess where we have been successful and where we could do better in improving the conditions under which young people could be healthy and thrive in our underserved communities.  The results were widely shared in the 2016 report, A New Power Grid:  Building Healthy Communities at Year 5.

As a result of taking the time to assess overall progress, we identified some of BHC's biggest impacts to date. In the first five years, TCE and partners contributed to significant policy/system wins:

  • Improved health coverage for the underserved;
  • Strengthened health coverage policy for the undocumented;
  • Improved school climate, wellness and equity;
  • Prevention and reform within the justice system;
  • Public-private investment and policy changes on behalf of boys and young men of color; and
  • Local & regional progress in adoption of “Health In All Policies,” a collaborative approach incorporating health considerations into decision-making across all policy areas

Our Board and team are very pleased with the results and impact of BHC to date, but we have been committed to learning from our share of mistakes. 

Along with the victories, we acknowledged in the report some hard lessons. Most notable among our mistakes was the need for more attention to:

  • Putting Community in “Community-Driven” Change.  Armed with lessons about the importance of being clear about intended results, we overthought the early process. This resulted in a prescriptiveness in the planning phase that was not only unnecessary, but also harmful. We entered the community planning process with multiple outcomes frameworks and a planning process that struck many partners as philanthropic arrogance. The smarter move would have been to engage community leaders with the clarity of a shared vision and operating principles, and to create the space for community leaders and residents to incubate goals, results, and strategy. Fortunately, we course-corrected, and our partners were patient while we did so.
  • Revisiting assumptions about local realities and systems dynamics.  In the report, we discussed our assumption about creating a single locus of inside-out, outside-in activity where community residents, leaders and systems leaders could collaborate on defined goals. It was readily apparent that community leaders distrusted many “systems” insiders, and systems leaders viewed outsider/activists as unreasonable. We underestimated the importance of the roles of historical structural inequalities, context, and dynamics of relationships at the local level.  Local collaboratives or “hubs” were reorganized and customized to meet local realities, and we threw the concept of a single model of collaboration across all the sites out the window.

Some course corrections we made included adjusting and sharpening our underlying assumptions and theory of change and taking on new community-driven priorities that we never anticipated early on; examples include school discipline reform, dismantling the prison pipeline in communities of color through prevention, and work that is taking place in TCE’s Boys & Young Men of Color portfolio.  By acknowledging our mistakes, our focus has sharpened and our dual roles as changemakers and grantmakers have continued to evolve. 

“Some partner feedback was difficult to hear, but all of it was useful and is making our work with partners stronger.”

Further, significant developments have occurred since the report:

Positioning “Power Building” as central to improving complex systems and policies.  In defining key performance indicators, we know the policy milestones achieved thus far represent only surface manifestations of the ultimate success we are seeking.  We had a breakthrough when we positioned “building the power and voice” of the adults and youth in our communities and “health equity” at the center of our BHC North Star Goals and Indicators.  Ultimately, we’ll know we are successful when the power dynamics in our partner communities have shifted so that adult and youth residents know how to hold local officials accountable for full, ongoing implementation of these policies.

Continuing to listen to our partners.  In addition to clarifying our North Stars, we sought further mid-point advice from our partners, reaching out to 175 stakeholders, including 68 youth and adult residents of BHC communities, for feedback to shape the remainder of BHC’s implementation and to inform our transition planning for the next decade.  Some of what our partners told us was difficult to hear, but all of it was useful and is making our work with partners stronger.    

From these lessons, I challenge our philanthropic colleagues to consider:

  • How can we learn to detect complex failures early to help us go beyond lessons that are superficial? As Amy Edmondson states, “The job of leaders is to see that their organizations don’t just move on after a failure but stop to dig in and discover the wisdom contained in it.”
  • In complex initiatives and complex organizations, what does it take to design a learning culture to capitalize successfully on mistakes? How do we truly engage in “trial and error” and stay open to experimentation and midcourse corrections?  How can we focus internally on our own operations and ways of work, as well as being willing to change our strategies and relationships with external partners?  Further, how do we, as grantmakers responsible for serving the public good, take responsibility for making these lessons #OpenForGood so others can learn from them as well?

It is worth noting that a key action TCE took at the board level as we embarked on BHC was to dissolve the Board Program Committee and replace it with a Learning and Performance Committee. This set-up created consistent opportunities for the Board, the CEO, and the management team to learn from evaluation reports and to share our learnings publicly to build the philanthropic field. Now, even as we enter the final phase of BHC, we continue to look for ways to structure opportunities to learn, and I can say, “We are well into a journey to learn intelligently from our successes as well as our mistakes to make meaningful, positive impacts.”

--Hanh Cao Yu

Championing Transparency: The Rockefeller Foundation Is First to Share All Evaluations As Part of #OpenForGood
September 26, 2017

The Rockefeller Foundation staff who authored this post are Veronica Olazabal, Director of Measurement, Evaluation, and Organizational Performance; Shawna Hoffman, Measurement, Evaluation, and Organizational Performance Specialist; and Nadia Asgaraly, Measurement and Evaluation Intern.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Today, aligned with The Rockefeller Foundation's commitments to sharing and accountability, we are proud to be the first foundation to accept the challenge and proactively make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

A History of Transparency and Sharing

Since its founding more than 100 years ago, The Rockefeller Foundation's mission has remained unchanged: to promote the well-being of humanity throughout the world. To this end, the Foundation seeks to catalyze and scale transformative innovation across sectors and geographies, and take risks where others cannot, or will not. While working in innovative spaces, the Foundation has always recognized that the full impact of its programs and investments can only be realized if it measures - and shares - what it is learning. Knowledge and evidence sharing is core to the organization's DNA dating back to its founder John D. Rockefeller Sr., who espoused the virtues of learning from and with others—positing that this was the key to "enlarging the boundaries of human knowledge."

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known.”

Evaluation for the Public Good

Building the evidence base for the areas in which we work is the cornerstone of The Rockefeller Foundation's approach to measurement and evaluation. By systematically tracking progress toward implementation and outcomes of our programs, and by testing, validating, and assessing our assumptions and hypotheses, we believe that we can manage and optimize our impact. Through the documentation of what works, for whom, and under what conditions, there is potential to amplify our impact by crowding in other funders to promising solutions and diverting resources from approaches that prove ineffectual.

But living out transparency as a core value is not without its challenges. A commitment to the principle of transparency alone is insufficient; organizations, especially foundations, must walk the talk. Sharing evidence requires the political will and human resources to do so and, more importantly, getting comfortable communicating not only one's successes but also one's challenges and failures. For this reason, to ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known. Then, once evaluation reports are finalized, they are posted to the Foundation website, available to the public free of charge.

#OpenForGood Project

The Foundation Center's #OpenForGood project, and IssueLab's related Results platform, help take the Foundation's commitment to sharing and strengthening the evidence base to the next level. By building a repository where everyone can identify others working on similar topics, search for answers to specific questions, and quickly identify where knowledge gaps exist, they are leading the charge on knowledge sharing.

The Rockefeller Foundation is proud to support this significant effort by being the first to contribute its evaluation evidence base to IssueLab: Results as part of the #OpenForGood movement, with the hope of encouraging others to do the same.

-- Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly

Trend to Watch: Using SDGs to Improve Foundation Transparency
September 19, 2017

(Janet Camarena is director of transparency initiatives at Foundation Center.)

As Foundation Center's director of transparency initiatives, one of the most interesting parts of my job is having the opportunity to play "transparency scout," regularly reviewing foundation websites for signs of openness in what is too often a closed universe. Some of this scouting leads to lifting up practices that can be examples for others on our Transparency Talk blog, sometimes it leads to a new transparency indicator on our assessment framework, and sometimes we just file it internally as a "trend to watch."

Today, it's a combination of all three; we are using this blog post to announce the launch of a new, "Trend to Watch" indicator that signals an emerging practice: the use of the Sustainable Development Goals to improve how foundations open up their work to the world.

The United Nations' Sustainable Development Goals (SDGs), otherwise known as the Global Goals, are a universal call to action to end poverty, protect the planet, and ensure that all people enjoy peace and prosperity. There are 17 goals in all, addressing aims such as ending poverty, zero hunger, reduced inequalities, and climate action. Written deliberately broadly to serve as a collective playbook that governments and the private sector alike can use, they can also serve as a much-needed shared language across philanthropy and across sectors to signal areas of common interest and measure shared progress.

And let's face it: as foundation strategies become increasingly specialized, explaining their objectives and nuances can become a jargon-laden minefield, making it difficult and time-consuming for those on the outside to fully understand the intended goal of a new program or initiative. The simplicity of the SDG iconography cuts through the jargon so foundation website visitors can quickly identify alignment with the goals, or not, and then more easily determine whether they should devote time to reading further. The SDGs also provide a clear visual framework for displaying grants and outcomes data in a way that is meaningful beyond the four walls of the foundation.

Let's take a look at how some foundation websites are using the SDGs to more clearly explain their work:

Silicon Valley Community Foundation (SVCF)

One of my favorite examples comes from a simple chart the Silicon Valley Community Foundation shared on its blog, because it specifically opens up the work of its donor-advised funds using the SDGs. Donor-advised funds are typically not the most transparent vehicles, so using the SDGs as a framework to tally how SVCF's donor-advised funds are making an impact is particularly clever and refreshing, and it offers a new window into a fast-growing area of philanthropy.

A quick glance at the chart reveals that quality education, good health and well-being, and sustainable cities and communities are the most common priorities among Silicon Valley donors.

GHR Foundation

A good example of how the SDGs can be used as a shared language to explain the intended impact of a grant portfolio comes from GHR Foundation in Minnesota. I also like this example because it shows how the SDGs can be effectively used in both global and domestic grant portfolios. GHR uses the SDG iconography across all of its portfolios, as sidebars on the pages that describe foundation strategies. GHR's "Children in Families" is a core foundation grantmaking strategy that addresses children and families in need on a global scale. The portfolio name is a broad one, but by including the SDG iconography, web visitors can quickly understand that GHR is using this program area to address poverty and hunger, as well as to pursue outcomes tied to health and well-being:

GHR is also able to use the SDG framework to create similar understanding of its domestic work. Below is an example from its Catholic Schools program serving the Twin Cities:

Through the visual cues the icons provide, I can quickly determine that, in addition to aligning with the quality education goal, this part of GHR's portfolio also addresses hunger and economically disadvantaged populations through its education grantmaking. This could also signal that the grantmaker interprets education broadly and supports the provision of wrap-around services to address the needs of low-income children as a holistic way of addressing the achievement gap. That's a lot of information conveyed with three small icons!

Tableau Foundation

The most sophisticated example comes to us from the tech and corporate grantmaking worlds--the Tableau Foundation. Tableau makes data visualization software, so using technology as a means to improve transparency is a core approach, and they are using their own grantmaking as an example of how you can use data to tell a compelling visual story. Through the interactive "Living Annual Report" on its website, Tableau regularly updates its grantmaking tallies and grantee data so web visitors have near real-time information. One of the tabs on the report reveals the SDG indicators, providing a quick snapshot of how Tableau's grantmaking, software donations, and corporate volunteering align with the SDGs.

As you mouse over any bar on the left, near real-time data appears, tallying how much of Tableau's funding has gone to support each goal. The interactive bar chart on the right lists Tableau's grantees, and visitors can quickly see the grantee list in the context of the SDGs as well as know the specific scale of its grantmaking to each recipient.

If you're inspired by these examples but aren't sure how to begin connecting your portfolio to the Global Goals, you can use the SDG Indicator Wizard to help you get started. All you need to do is copy and paste your program descriptions or the descriptive language of a sample grant into the Wizard, and its machine-learning tools let you know where your grantmaking lands on the SDG matrix. It's a lot of fun – and a great place to start learning about the SDGs. And, because it transforms your program language into the relevant SDG goals, indicators, and targets, it may just provide a shortcut to that new strategy you were thinking of developing!
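To make the underlying idea concrete, here is a minimal, hypothetical Python sketch of what mapping free-text program descriptions onto SDG categories could look like. The real Wizard relies on machine-learning tools; this toy version substitutes simple keyword matching, and every keyword list, goal label, and function name below is an illustrative assumption rather than part of the actual tool.

```python
# Toy illustration only: the SDG Indicator Wizard uses machine-learning
# classifiers; this sketch approximates the idea with keyword matching.
# The keyword lists below are hypothetical and far from exhaustive.

SDG_KEYWORDS = {
    "SDG 1: No Poverty": ["poverty", "low-income", "economically disadvantaged"],
    "SDG 2: Zero Hunger": ["hunger", "food security", "nutrition"],
    "SDG 3: Good Health and Well-Being": ["health", "well-being", "wellness"],
    "SDG 4: Quality Education": ["education", "school", "literacy"],
}

def tag_sdgs(program_description: str) -> list[str]:
    """Return the SDG labels whose keywords appear in a program description."""
    text = program_description.lower()
    return [
        goal
        for goal, keywords in SDG_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    ]

if __name__ == "__main__":
    description = (
        "This program funds education and wrap-around services "
        "addressing hunger among low-income children."
    )
    for goal in tag_sdgs(description):
        print(goal)  # prints SDG 1, SDG 2, and SDG 4 for this description
```

A description like the sample above would surface the same combination of goals discussed in the GHR example: education, hunger, and poverty.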

Want more examples? The good news is we're also tracking SDG use as a transparency indicator at "Who Has Glasspockets?" You can view them all here. Is your foundation using the SDGs to help tell the story of your work? We're always on the lookout for new examples, so let us know, and your foundation can be the next trendsetter in our new Trend to Watch.

-- Janet Camarena

The Power of Narrative: Philanthropy and Storytelling
August 31, 2017

Nicole Richards is Chief Storyteller at Philanthropy Australia.

When it comes to storytelling, philanthropy generally gets a failing grade.

It’s not that we’re short on great stories—they’re everywhere. We hear, see and experience them every day in our work to catalyse positive social change. The story opportunities in philanthropy flow as bountifully as the chocolate river in Willy Wonka’s chocolate factory.

But who has the time to capture them so that they become more than just a feel-good anecdote? Who has the capacity to tell them in a way that might influence others to act? Most of us are too busy with the business of grantmaking and measuring impact to share more than the occasional story at a board meeting or conference. Thousands of stories slip away.

That’s to our detriment. Humans are hard-wired for storytelling—stories are what connect us.

Three months ago, I stepped into the newly created role of Chief Storyteller at Philanthropy Australia, the national industry association for giving in Australia. The position, which is directly aligned with the organization’s strategic plan, has been backed for three years by five local funders who believe in the power and potential of storytelling to grow giving in this country.

The stories I tell span the spectrum of philanthropy, with a view to increasing transparency for a diverse cast of philanthropic actors. From collective giving groups and newly established private ancillary funds to the country’s oldest philanthropic foundations—the stories and the protagonists are distinct but the intent is the same: to make a difference.

Some of those are human interest stories that profile funders and their giving journeys, case studies that showcase good practices, and opinion-style narratives designed to challenge the status quo.

From what we’ve seen, the appetite for these stories is boundless—philanthropists of all sizes and persuasions love learning from the collective experience of their peers. Telling these stories, or better yet, passing the mic so that the stories can be recounted firsthand by the funders, their nonprofit partners and the communities they serve, is a powerful form of knowledge sharing, of connecting people with new ideas and networks.

While it’s easy enough to find and package the stories for ready consumption by those already practicing philanthropy, the bigger challenge is to send the stories beyond the echo chamber and put them before would-be philanthropists and aspiring social change makers.

That’s as much about opening up philanthropy to demystify it for the uninitiated as it is about sharing stories of philanthropic impact for other philanthropy insiders. Philanthropy is too often viewed as the closed-door, exclusive domain of the ultra-wealthy. As agents of philanthropy, we have a responsibility to bust that myth and lift the veil.

Not all the stories we choose to tell should gleam like candy—the authenticity of the story is critical to its impact. We need more cautionary tales such as stories of failure, of missteps and strategies that went awry. By sharing the stories that aren’t sugar-coated, we make philanthropy less opaque, more accessible and ultimately more effective. By making storytelling a part of our process, we begin to normalize a culture of openness.

While crooning about pure imagination beside his chocolate river, Willy Wonka intoned: “Want to change the world…there’s nothing to it.”

We know he’s wrong on that front, but his golden ticket giveaway of the chocolate factory was a great story.

There’s a story behind every act of giving. For the sake of more and better philanthropy, it’s time we took those stories beyond the chocolate factory gates.

--Nicole Richards

 

I Thought I Knew You: Grants Data & the 990PF
August 23, 2017

(Martha S. Richards is the Executive Director of the James F. and Marion L. Miller Foundation in Portland, Oregon.)

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series will explore the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

Join us at a session about the Open 990PF in partnership with Grantmakers of Oregon and Southwest Washington. Learn more or register here.

I have a confession to make. Up until a few years ago, when this story begins, I used to take the 990PF for granted. I thought of it as something that ensured we were following federal regulations, and that if we filed it on time and followed the reporting practices we had always used, this would be sufficient for all concerned. I was also pretty certain no one but a few insiders within the government and perhaps a handful of philanthropy groups would ever bother to read it.

Well, you might have heard the expression: "You don't know what you don't know," and that's a good segue to what I have to share.

In Spring 2010, the Coalition of Communities of Color (CCC) released a study -- Communities of Color in Multnomah County: an Unsettling Profile -- which defined the disparities facing communities of color in Oregon's largest urban area, Portland. Inspired by this analysis, that December, Foundation Center (FC) and Grantmakers of Oregon and SW Washington (GOSW) co-presented Grantmaking to Communities of Color in Oregon -- a groundbreaking report that acknowledged that philanthropy was part of the problem. The report estimated only 9.6% of grants awarded in 2008 by Oregon private and community funders actually reached communities of color.

While the data told a moving story, the source of the data also became a parallel conversation because the philanthropic community here in Oregon learned about the limitations of using tax returns to tell such important stories. The grant descriptions in our 990s rarely disclose details about the intended beneficiaries of the grants—even if we know them.

The result: We embarked on a long journey to address both issues. While GOSW and CCC hosted a forum to raise awareness of the reports and their attendant policy recommendations, foundations committed to looking more closely at their giving practices as well as their data collection efforts, especially emphasizing collecting better beneficiary data and improving their reporting relationship with Foundation Center.

This prompted us at the James F. and Marion L. Miller Foundation to examine our own giving and how we could describe its reach. We fund in the areas of arts and K-12 education. We have a small staff. Our application process did not require a detailed analysis of demographic data from arts applicants or schools, nor an understanding of the diverse nature of nonprofit leadership among our grantees. We realized that we did not know if the grants we made were reaching the populations we hoped to serve.

As part of this effort, I chaired a GOSW-led Data Work Group to explore how to obtain more meaningful data sets without adding to the length and complexity of our application processes. We invited nonprofit partners to the table. We studied Foundation Center's processes and invited their staff to meet with and advise us. We tried, tested, and began to encourage nonprofits to help us learn more about how and whom we were reaching with our philanthropic dollars. Eventually, we encouraged many of our Oregon foundations to become eReporters to Foundation Center, providing more detailed descriptions of what each grant was for and who was reached with the funding. Our reports to the Foundation Center and to the IRS have improved, and we make an effort to report detailed demographic information.

However, we discovered that it can be difficult for some types of organizations to capture specific demographic data. In the arts, for instance, outside of audience surveys, one generally does not complete a demographic survey to buy a ticket. At the Miller Foundation, we chose to partner with DataArts to collect financial and audience data on our arts grantees. Arts organizations annually complete the profile, and it can be used by several arts funders in the state. The demographic portion of the profile is still being developed, but it will encourage better data capture in the future. Unfortunately, this platform does not exist for other nonprofits.

Get on the Map

Get on the Map encourages foundations to share current and complete details about their grantmaking with Foundation Center. The interactive map, databases and reports allow foundations to have a better understanding of grantee funding and demographics.

We didn't know it then, but as a result of our committee's efforts, a new data improvement movement was born, called Get on the Map (GOTM). GOTM encourages foundations to share current and complete details about their grantmaking with Foundation Center, so the maps, databases, and reports it issues are as accurate as possible. The grants we share also populate an interactive map that members of GOSW have access to, which means that we have a better idea of the ecosystem in which we work. The effort has since scaled nationally, with other regions also committing to improve the data they collect and share about their grantmaking, so we can all be less in the dark about what efforts are underway and who is working on them.

As a result, our foundation has a better understanding of whom our grantees are serving and reaching today than we did seven years ago, and I think we are also doing a better job of sharing that story with the IRS, Foundation Center, and the many sets of eyes I now know view those platforms.

We are still learning what we do not know. But at least now we know what we do not know.

-- Martha Richards


Coming to Grantmakers of Oregon and Southwest Washington: To learn more about what story your 990PF tells about your foundation, register to attend Once Upon a 990PF. Visit the GOSW website for more information and to register.

How To Keep Me Scrolling Through What You Are Sharing
August 10, 2017

Tom Kelly is Vice President of Knowledge, Evaluation & Learning at the Hawai‘i Community Foundation. He has been learning and evaluating in philanthropy since the beginning of the century. @TomEval  TomEval.com

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Hello, my name is Tom and I am a Subscriber. And a Tweeter, Follower, Forwarder (FYI!), Google Searcher, and DropBox Hoarder. I subscribe to blogs, feeds, e-newsletters, and email updates. My professional title includes the word “Knowledge,” so I feel compelled to make sure I am keeping track of the high volume of data, information, reports, and ideas flowing throughout the nonprofit and foundation worlds (yes, it is a bit of a compulsion…and I am not even including my favorite travel, shopping and coupon alerts).

It is a lot and I confess I do not read all of it. It is a form of meditation for me to scroll through emails and Twitter feeds while waiting in line at Aloha Salads. I skim, I save, I forward, I retweet – I copy and save for later reading (later when?). In fact, no one can be expected to keep up, so how does anyone make sense of it all, or even find what we need when we need it? Everyone being #OpenForGood and sharing everything is great, but who is reading it all? And how do we make what we are opening for good actually good?

Making Knowledge Usable

We have all experienced at some point Drowning in Information, Starving for Knowledge (John Naisbitt’s Megatrends…I prefer E.O. Wilson’s “starving for wisdom” theory). The information may be out there but rarely in a form that is easily found, read, understood, and most importantly used. Foundation Center and IssueLab have made it easier for people in the sector to know what is being funded, where new ideas are being tested, and what evidence and lessons are available. But nonprofits and foundations still have to upload and share many more of their documents than they do now. And we need to make sure that the information we share is readable, usable, and ready to be applied.

DataViz guru Stephanie Evergreen recently taught me a new hashtag: #TLDR – “Too Long, Didn’t Read.”

She now proposes that every published report be available in three formats – a one-page handout with key messages, a 3-page executive summary, and a 25-page report (plus appendices). In this way the “scanners,” “skimmers” and “deep divers” can access the information in the form they prefer and in the time they have. It also requires writing (and formatting) differently for each of these sets of eyes. (By the way, do you know which one you are?)

From Information to Influence

But it is not enough to make your reports accessible, searchable, and also easily readable in short and long forms; you also need to include the information people need to make decisions and act. It means deciding in advance who you want to inform and influence and what you want people to do with the information. You need to be clear about your purpose for sharing information, and you need to give people the right kinds of information if you expect them to read it, learn from it, and apply it.

“Give people the right kinds of information if you expect them to read it, learn from it, and apply it.”

Too many times I have read reports with promising findings or interesting lessons, and then I race through all the footnotes and the appendices at the back of the report looking for resources that could point me to the details of evidence and data or implementation guidance. I usually wind up trying to track down the authors by email or phone to follow up.

A 2005 study of more than 1,000 evaluations published in human services found only 22 well-designed and well-documented reports that shared any analysis of implementation factors – what lessons people learned about how best to put the program or services in place. We cannot expect other people and organizations to share knowledge and learn if they cannot access information from others that helps them use the knowledge and apply it in their own programs and organizations. YES, I want to hear about your lessons and “a-ha’s,” but I also want to see data and analysis of the common challenges that all nonprofits and foundations face:

  • How to apply and adapt program and practice models in different contexts
  • How to sustain effective practices
  • How to scale successful efforts to more people and communities

This means making sure that your evaluations and your reports include opening up the challenges of implementation – the same challenges others are likely to face. It also means placing your findings in the context of existing learning while also using similar definitions so that we can build on each other’s knowledge. For example, in our recent middle school connectedness initiative, our evaluator Learning for Action reviewed the literature first to determine specific components and best practices of youth mentoring so that we could build the evaluation on what had come before, and then report clearly about what we learned about in-school mentoring and open up  useful and comparable knowledge to the field. 

So please plan ahead and define your knowledge sharing and influence agenda up front and consider the following guidelines:

  • Who needs to read your report?
  • What information does your report need to share to be useful and used?
  • Read and review similar studies and reports and determine in advance what additional knowledge is needed and what you will document and evaluate.
  • Use common definitions and program model frameworks so we are able to continually build on field knowledge and not create anew each time.
  • Pay attention to and evaluate implementation, replication and the management challenges (staffing, training, communication, adaptation) that others will face.
  • And disseminate widely and share at conferences, in journals, in your sector networks, and in IssueLab’s open repository.

And I will be very happy to read through your implementation lessons in your report’s footnotes and appendices next time I am in line for a salad.

--Tom Kelly

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org
