Transparency Talk

Category: "Evaluation" (98 posts)

Putting a Stop to Recreating the Wheel: Strengthening the Field of Philanthropic Evaluation
December 13, 2018

Clare Nolan is Co-Founder of Engage R+D, which works with nonprofits, foundations, and public agencies to measure their impact, bring together stakeholders, and foster learning and innovation.

Meg Long is President of Equal Measure, a Philadelphia-based professional services nonprofit focused on helping its clients—foundations, nonprofit organizations, and public entities—deepen and accelerate social change.

In 2017, Engage R+D and Equal Measure, with support from the Gordon and Betty Moore Foundation, launched an exploratory dialogue among funders and evaluators to discuss the current state of evaluation and learning in philanthropy, explore barriers to greater collaboration and impact, and identify approaches and strategies to build the collective capacity of small and mid-sized evaluation firms. Our goal was to test whether there was interest in our sector in building an affinity network of evaluation leaders working with and within philanthropy. Since our initial meeting with a few dozen colleagues in 2017, our affinity network has grown to 250 individuals nationally, and there is growing momentum for finding ways funders and evaluators can work together differently to deepen the impact of evaluation and learning on philanthropic practice.

At the recent 2018 American Evaluation Association (AEA) conference in Cleveland, Ohio, nearly 100 funders and evaluators gathered to discuss four action areas that have generated the most “buzz” during our previous network convening at the Grantmakers for Effective Organizations (GEO) conference and from our subsequent network survey:

1. Improving the application of evaluation in philanthropic strategy and practice.

2. Supporting the sharing and adaptation of evaluation learning for multiple users.

3. Supporting formal partnerships and collaborations across evaluators and evaluation firms.

4. Strengthening and diversifying the pipeline of evaluators working with and within philanthropy.

We asked participants to choose one of these action areas and join the corresponding table discussion to reflect on what they have learned about the topic and identify how the affinity network can contribute to advancing the field. Through crowd-sourcing, participants identified key ways in which the action teams launching in early 2019 can add value to the field.

1. What will it take to more tightly connect evaluation with strategy and decision-making? Provide more guidance on what evaluation should look like in philanthropy.

Are there common principles, trainings, articles, case studies, guides, etc. that an action team could identify and develop? Could the affinity network be a space to convene funders and evaluators that work in similar fields to share evaluation results and lessons learned?

2. What will it take to broaden the audience for evaluations beyond individual organizations? Create a “market place” for knowledge sharing and incentivize participation.

As readers of this blog will know from Foundation Center’s #OpenForGood efforts, there is general agreement around the need to do better at sharing knowledge, building evidence, and being willing to share what foundations are learning – both successes and failures. How can an action team support the creation of a culture of knowledge sharing through existing venues and mechanisms (e.g., IssueLab, Evaluation Roundtable)? How could incentives be built in to support transparency and accountability?

3. How can the field create spaces that support greater collaboration and knowledge sharing among funders and evaluators? Identify promising evaluator partnership models that resulted in collaboration and not competition.

Partnerships have worked well where there are established relationships and trust and when power dynamics are minimized. How can an action team identify promising models and practices for successful collaborations where collaboration is not the main goal? How can they establish shared values, goals, etc. to further collaboration?

4. What will it take to create the conditions necessary to attract, support, and retain new talent? Build upon existing models to support emerging evaluators of color and identify practices for ongoing guidance and mentorship.

Recruiting, hiring, and retaining talent to fit evaluation and learning needs in philanthropy is challenging, given the current state of education and training programs as well as changing expectations in the field. How can we leverage and build on existing programs (e.g., AEA Graduate Education Diversity Internship, Leaders in Equitable Evaluation and Diversity, etc.) to increase the pipeline, and support ongoing retention and professional development?

Overall, we are delighted to see that there is much enthusiasm in our field to do more work on these issues. We look forward to launching action teams in early 2019 to further flesh out the ideas shared above in addition to others generated over the past year.

If you are interested in learning more about this effort, please contact Pilar Mendoza. If you would like to join the network and receive updates about this work, please contact Christine Kemler.

--Clare Nolan and Meg Long

Living Our Values: Gauging a Foundation’s Commitment to Diversity, Equity, and Inclusion
November 29, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

The California Endowment (TCE) recently wrapped up our 2016 Diversity, Equity, and Inclusion (DEI) Audit, our fourth since 2008. The audit was initially developed at a time when community advocates were pushing the foundation to address issues of structural racism and inequity. As TCE’s grantmaking responded, staff and our CEO were also interested in promoting DEI values across the entire foundation beyond programmatic spaces. Over time, these values became increasingly ingrained in TCE’s ethos, and the foundation committed to conducting a regular audit as a vehicle with which to determine if and how our DEI values were guiding organizational practice.

Sharing information about our DEI Audit often raises questions about how to launch such an effort. Some colleagues are in the early stages of considering whether they want to carry out an audit of their own. Are we ready? What do we need to have in place to even begin to broach this possibility? Others are interested to hear about how we use the findings from such an assessment. To help answer these questions, this is the first of a two-part blog series to share the lessons we’re learning by using a DEI audit to hold ourselves accountable to our values.

While the audit provides a frame to identify if our DEI values are being expressed throughout the foundation, it also fosters learning. Findings are reviewed and discussed with executive leadership, board, and staff. Reviews provide venues to involve both programmatic and non-programmatic staff in DEI discussions. An audit workgroup typically considers how to take action on findings so that the foundation can continuously improve and also considers how to revise audit goals to ensure forward movement. By sharing findings publicly, we hope our experience and lessons can help to support the field more broadly.

It is, however, no small feat. The audit is a comprehensive process that includes a demographic survey of staff and board, a staff and board survey of DEI attitudes and beliefs, interviews with key foundation leaders, an examination of available demographic data from grantee partners, and a review of DEI-related documents gathered between audits. Having dedicated resources to engage a neutral outsider to carry out the audit in partnership with the foundation is also important to this process. We’ve found it particularly helpful to engage a consistent, trusted partner, Social Policy Research Associates, over each of our audits to capture and candidly reflect where we’re making progress and where we need to work harder to create change.

As your foundation considers its own readiness to engage in such an audit process, we offer the following factors that have facilitated a productive and learning-oriented DEI audit effort at TCE:

1. Clarity about the fundamental importance of Diversity, Equity, and Inclusion to the Foundation

The expression of our DEI values has evolved over time. When the audit started, several program staff members who focused on DEI and cultural competency developed a guiding statement on Diversity and Inclusiveness. Included in our audit report, the statement focused heavily on diversity, although it was tweaked over time. A significant shift occurred several years ago when our executive team articulated a comprehensive set of core values that undergirds all our work and leads with a commitment to diversity, equity, and inclusion.

2. Interest in reflection and adaptation

The audit is a tool for organizational learning that facilitates continuous improvement. The process relies on having both a growth mindset and clear goals for what we hope to accomplish. Our 13 goals range from board engagement to utilizing accessibility best practices. In addition to examining our own goals, the audit shares how we’re doing with respect to a framework of institutional supports required to build a culture of equity. By comparing the foundation to itself over time we can determine if and where change is occurring. It also allows us to revise goals so that we can continue to push ourselves forward as we improve, or to course correct if we’re not on track. We anticipate updating our goals before our next audit to reflect where we are currently in our DEI journey.

3. Engagement of key leaders, including staff

Our CEO is vocal and clear about the importance of DEI internally and externally, as well as about the significance of conducting the audit itself. Our executive team, board, and CEO all contribute to the audit process and are actively interested in reviewing and discussing its findings.

Staff engagement is critical throughout audit implementation, reflection on findings, and action planning as well. It’s notable that the vast majority of staff at all levels feel comfortable pushing the foundation to stay accountable to DEI internally. However, a small but growing percentage of staff (23%) report feeling uncomfortable raising DEI concerns in the workplace, suggesting an area for greater attention.

4. Capacity to respond to any findings

Findings are not always going to be comfortable. Identifying areas for improvement may put the organization and our leaders in tough places. TCE has historically convened a cross departmental workgroup to consider audit findings and tackle action planning. We considered co-locating the audit workgroup within our executive leadership team to increase the group’s capacity to address audit findings. However, now we are considering whether it would be best situated and aligned within an emerging body that will be specifically focused on bringing racial equity to the center of all our work.

5. Courage and will to repeat

In a sector with limited accountability, choosing to voluntarily and publicly examine foundation practices takes real commitment and courage. It’s always great to hear where we’re doing well, but committing to a process that also surfaces multiple areas needing more attention requires deep will to repeat on a regular basis. And we do so in recognition that this is long-term, ongoing work that, in the absence of a real finish line, requires us to continuously adapt as our communities evolve.

Conducting our DEI audit regularly has strengthened our sense of where our practice excels—for example, in our grantmaking, our strong vision and authorizing environment, and the diversity of our staff and board. It has also strengthened our sense of the ways we want to improve, such as developing a more widely shared DEI analysis and trainings for all staff, as well as continuing to strengthen data collection among our partners. The value of our DEI audit lies equally in considering findings and in serving as a springboard for prioritizing action. TCE has been on this road a long time and we’ll keep at it for the foreseeable future. As our understanding of what it takes to pursue diversity, equity, and inclusion internally and externally sharpens, so will the demands on our practice. Our DEI audit will continue to ensure that we hold ourselves to these demands. In my next post, we’ll take a closer look at what we’re learning about operationalizing equity within the foundation.

--Mona Jhawar

What Does It Take to Shift to a Learning Culture in Philanthropy?
November 20, 2018

Janet Camarena is director of transparency initiatives at Foundation Center.

This post also appears in the Center for Effective Philanthropy blog.

If there was ever any doubt that greater openness and transparency could benefit organized philanthropy, a new report from the Center for Effective Philanthropy (CEP) about knowledge-sharing practices puts it to rest. Besides making a case for the need for greater transparency in the field, the report also provides some hopeful signs that, among foundation leaders, there is growing recognition of the value of shifting to a culture of learning to improve foundations’ efforts.

Understanding & Sharing What Works: The State of Foundation Practice reveals how well foundation leaders understand what is and isn’t working in their foundation’s programs, how they figure this out, and what, if anything, they share with others about what they’ve learned. These trends are explored through 119 survey responses from, and 41 in-depth interviews with, foundation CEOs. A companion series of profiles tells the story of these practices in the context of four foundations that have committed to working more openly.

Since Foundation Center’s launch of GlassPockets in 2010, we have tracked transparency around planning and performance measurement within the “Who Has Glass Pockets?” self-assessment. Currently, of the nearly 100 foundations that have participated in GlassPockets, only 27 percent publicly share any information about how they measure their progress toward institutional goals. Given this lack of knowledge sharing, we undertook a new #OpenForGood campaign to encourage foundations to publicly share published evaluations through the IssueLab open archive.

As someone who has spent the last decade examining foundation transparency practices (or the lack thereof) and championing greater openness, I read CEP’s findings with an eye for elements that might help us better understand the barriers and catalysts to this kind of culture shift in the field. Here’s what I took away from the report.

Performance Anxiety

While two-thirds of foundation CEOs in CEP’s study report having a strong sense of what is working programmatically within their foundations, nearly 60 percent report having a weaker grasp on what is not working. This raises the question: If you don’t know something is broken, then how do you fix it? Since we know foundations have a tendency to be success-oriented, this by itself wasn’t surprising. But it’s a helpful metric that proves the point that investing in evaluation, learning, and sharing can only lead to wiser use of precious resources for the field as a whole.

The report also reveals that many CEOs who have learned what is not working well at their foundations are unlikely to share that knowledge, as more than one-third of respondents cite hesitancy around disclosing missteps and failures. The interviews and profiles point to what can best be described as performance anxiety. CEOs cite the need for professionals to show what went well, fear of losing the trust of stakeholders, and a desire to impress their boards as motivations for concealing struggles. Of these motivations, board leadership seems particularly influential for setting the culture when it comes to transparency and failure.

In the profiles, Rockefeller Brothers Fund (RBF) President Stephen Heintz discusses both the importance of his board and his background in government as factors that have informed RBF’s willingness to share the kinds of information many foundations won’t. RBF was an early participant in GlassPockets, and now is an early adopter of the #OpenForGood movement to openly share knowledge. As a result, RBF has been one of the examples we often point to for the more challenging aspects of transparency such as frameworks for diversity data, knowledge sharing, and investment practices.

An important takeaway of the RBF profile is the Fund’s emphasis on the way in which a board can help ease performance anxiety by simply giving leadership permission to talk about pain points and missteps. Yet one-third of CEOs specifically mention that their foundation faces pressure from its board to withhold information about failures. This sparks my interest in seeing a similar survey asking foundation trustees about their perspectives in this area.

Utility or Futility?

Anyone who works inside a foundation — or anyone who has ever applied for a grant from one — will tell you they are buried in a paperwork load that often feels futile (a burden that spawned a whole other worthy movement led by PEAK Grantmaking called Project Streamline). In the CEP study, the majority of foundation CEOs report that most of the standard sources of knowledge they require are not very useful to them. Site visits were most consistently ranked highly, with the majority of CEOs (56 percent) pointing to them as one of the most useful sources for learning about what is and isn’t working. Grantee focus groups and convenings came in a distant second, with only 38 percent of CEOs reporting these as a most useful source. And despite the labor involved on both sides of the table, final grant reports were ranked as a most useful source for learning by only 31 percent of CEOs.

“Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge.”

If most foundations find greater value in higher touch methods of learning, such as meeting face-to-face or hosting grantee gatherings, then perhaps this is a reminder that if foundations reduce the burdens of their own bureaucracies and streamline application and reporting processes, there will be more time for learning from community and stakeholder engagement.

The companion profile of the Weingart Foundation, another longtime GlassPockets participant, shows the benefits of funders making more time for grantee engagement, and provides a number of methods for doing so. Weingart co-creates its learning and assessment frameworks with grantees, routinely shares all the grantee feedback it receives from its Grantee Perception Report (GPR), regularly makes time to convene grantees for shared learning, and also pays grantees for their time in helping to inform Weingart’s trustees about the problems it seeks to solve.

Supply and Demand

One of the questions we get the most about #OpenForGood’s efforts to build an open, collective knowledge base for the field is whether anyone will actually use this content. This concern also surfaces in CEP’s interviews, with a number of CEOs citing the difficulty of knowing what is useful to share as an impediment to openness. A big source of optimism here is learning that a majority of CEOs report that their decisions are often informed by what other foundations are learning, meaning foundations can rest assured that if they supply knowledge about what is and isn’t working, the demand is there for that knowledge to make a larger impact beyond their own foundation. Think of all that untapped potential!

Of course, given the current state of knowledge sharing in the field, only 19 percent of CEOs surveyed report having quite a bit of knowledge about what’s working at peer foundations, and just 6 percent report having quite a bit of knowledge about what’s not working among their programmatic peers. Despite this dearth of knowledge, fully three-quarters of foundation CEOs report using what they can access from peers to inform strategy and direction within their own foundations.

Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge. Now there is every reason for knowledge sharing to become the norm rather than the exception.

--Janet Camarena

Creating a Culture of Learning: An Interview with Yvonne Belanger, Director of Evaluation & Learning, Barr Foundation
November 8, 2018

Yvonne Belanger is the director of learning & evaluation at the Barr Foundation and leads Barr's efforts to gauge its impact and support ongoing learning among staff, grantees, and the fields in which they work.

Recently, Janet Camarena, director of transparency initiatives for Foundation Center, interviewed Belanger about how creating a culture of learning and openness can improve philanthropy. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


GlassPockets: More and more foundations seem to be hiring staff with titles having to do with evaluation and learning. You’ve been in this role at the Barr Foundation for just about a year, having come over from a similar role at the Bill & Melinda Gates Foundation. Why do you think roles like this are on the rise in philanthropy, and what are your aspirations for how greater capacity for evaluation and learning can benefit the field?

Yvonne Belanger: I think the spread of these roles in strategic philanthropy comes from increasing recognition that building a stronger learning function is a strategic investment, and it requires dedicated expertise and leadership. My hope is that strong evaluation and learning capacity at Barr (and across the philanthropic sector generally) will enable better decisions and accelerate the pace of social change to make the world more equitable and just.

GP: What have been your priorities in this first year and what is your approach to learning? More specifically, what is Barr’s learning process like, what sources do you learn from, how do you use the learnings to inform your work?

YB: At Barr, we are committed to learning from our efforts and continuously improving. Our programmatic work benefits from many sources of knowledge to inform strategy including landscape scans, academic research, ongoing conversations with grantees and formal site visits, and program evaluations to name a few. During this first year, I have been working with Barr’s program teams to assess their needs, to sketch out a trajectory for the next few years, and to launch evaluation projects across our strategies to enhance our strategic learning. Learning is not limited to evaluating the work of our programs, but also includes getting feedback from our partners. Recently, we were fortunate to hear from grantees via our Grantee Perception Report survey, including specific feedback on our learning and evaluation practices. As we reflected on their responses in relation to Barr’s values and examples of strong practice among our peers, we saw several ways we could improve.

GP: What kinds of improvements are you making as a result of feedback you received?

YB: We identified three opportunities for improvement: to make evaluation more useful, to be clearer about how Barr defines success and measures progress, and to be more transparent with our learning.

  • Make evaluations more collaborative and beneficial to our partners. We heard from our grantees that participating in evaluations funded by Barr hasn’t always felt useful or applicable to their work. We are adopting approaches to evaluation that prioritize grantee input and benefit. For example, in our Creative Commonwealth Initiative, a partnership with five community foundations to strengthen arts and creativity across Massachusetts, we included the grantees early in the evaluation design phase. With their input, we modified and prioritized evaluation questions and incorporated flexible technical assistance to build their capacity for data and measurement. In our Education Program, the early phase of our Engage New England evaluation is focused on sharing learning with grantees and the partners supporting their work to make implementation of these new school models stronger.
  • Be clearer about how we measure outcomes. Our grantees want to understand how Barr assesses progress. In September, we published a grantee guide to outputs and outcomes to clarify what we are looking for from grantees and to support them in developing a strong proposal. Currently, our program teams are clarifying progress measures for our strategies, and we plan to make that information more accessible to our grantees.
  • Share what we learn. To quote your recent GrantCraft Open for Good report, “Knowledge has the power to spark change, but only if it is shared.” To maximize Barr’s impact, we aim to be #OpenForGood and produce and share insights that help our grantees, practitioners, policymakers, and others. To this end, we are proactively sharing information about evaluation work in progress, such as the evaluation questions we are exploring, and when the field can expect results. Our Barr Fellows program evaluation is one example of this practice. We are also building a new knowledge center for Barr to highlight and share research and reports from our partners, and make these reports easier for practitioners and policymakers to find and re-share.

GP: Clearly all of this takes time and resources to do well. What benefits can you point to of investing in learning and knowledge sharing?

YB: Our new Impact & Learning page reflects our aspiration that, by sharing work in progress and lessons learned, we can influence nonprofits and other funders, advance field knowledge, inform policy, and elevate community expertise. When you are working on changing complex systems, there are almost never silver bullets. To make headway on difficult social problems, we need to view them from multiple perspectives and build learning over time by analyzing the successes – and the failures – of many different efforts and approaches.

GP: Barr’s president, Jim Canales, is featured in a video clip on the Impact & Learning page talking about the important role philanthropy plays as a source of “risk capital” to test emerging and untested solutions, some of which may not work, and about how the field should see these as learning opportunities. And, of course, these struggles and failures could be great lessons for philanthropy as a whole. How do you balance this tension at Barr, between a desire to provide “risk capital,” the desire to open up what you are learning, and reputational concerns about sharing evaluations of initiatives that didn’t produce the desired results?

YB: It’s unusual for foundations to be open about how they define success, and admissions of failure are notably rare. I think foundations are often just as concerned about their grantees’ reputation and credibility as their own. At Barr we do aspire to be more transparent, including when things haven’t worked or our efforts have fallen short of our goals. To paraphrase Jim Canales, risk isn’t an end in itself, but a foundation should be willing to take risks in order to see impact. Factors that influence impact or the pace of change are often ones that funders have control over, such as the amount of risk we were willing to take, or the conceptualization and design of an initiative. When a funder can reflect openly on these issues, that reflection usually generates valuable lessons for philanthropy and reflects the kind of risks we should be able to take more often.

GP: Now that you are entering your second year in this role, where are the next directions you hope to take Barr’s evaluation and learning efforts?

YB: In addition to continuing and sustaining robust evaluation for major initiatives across our program areas, and sharing what we’re learning as we go, we have two new areas of focus in 2019 – people and practices. We will have an internal staff development series to cultivate mindsets, skills, and shared habits that support learning, and we will also be working to strengthen our practices around strategy measurement so that we can be clearer both internally and externally about how we measure progress and impact. Ultimately, we believe these efforts will make our strategies stronger, will improve our ability to learn with and from our grantees, and will lead to greater impact.

 

Building Our Knowledge Sharing Muscle at Irvine
May 17, 2018

Kim Ammann Howard joined the James Irvine Foundation as Director of Impact Assessment and Learning in 2015. She has more than 20 years of social impact experience working with nonprofits, foundations, and the public sector to collect, use, and share information that stimulates ongoing learning and change.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Having recently spent two days with peer foundation evaluation directors, I am savoring the rich conversations and reflecting on how shared knowledge benefits my own thinking and actions. It also reminds me of how often those conversations only benefit those inside the room. To really influence the field, we need to build our knowledge sharing muscle beyond our four walls and usual circles. A new report from the Foundation Center, Open for Good: Knowledge Sharing to Strengthen Grantmaking, aims to help funders do just that, and I was happy to contribute some of The James Irvine Foundation’s own journey to the guide.

When I joined the Foundation at the end of 2015, there was already a commitment to transparency and openness that established knowledge sharing as part of the culture. It was something that attracted me to Irvine, and I was excited to build on the types of information collected and disseminated in the past, and to figure out how we could grow.

Our Framework

In 2016, we launched our new strategy, which focuses on expanding economic and political opportunity for California families and young adults who are working but struggling with poverty. This presented an opportune moment to articulate and set expectations about how impact assessment and learning (IA&L) is integrated in the work. This includes defining how we assess our progress in meeting our strategic goals, how we learn, and how we use what we learn to adapt and improve. We developed a framework that outlines our approach to IA&L – why we think it’s important, what principles guide us, and how we put IA&L into practice.

While the IA&L framework was designed as an internal guide, we decided to make it available externally for three reasons: to honor the Foundation’s commitment to transparency and openness; to hold ourselves accountable to what we say we espouse for IA&L; and to model our approach for colleagues at other organizations who may be interested in adopting a similar framework.

What We’re Learning

We’ve also dedicated a new portion of our website to what we are learning. We use this section to share knowledge with the field – not only the end results of an initiative or body of research, but also what happens in the middle – to be transparent about the work as we go.

For example, in 2017, we spent a year listening and learning from grantees, employers, thought leaders, and other stakeholders in California to inform what would become our Better Careers initiative. At the end of the year, we announced the goal of the initiative: to connect low-income Californians to good jobs with family-sustaining wages and advancement opportunities. It was important for us to uphold the principles of feedback set out in our IA&L framework by communicating with all the stakeholders who helped to inform the initiative’s strategy – it was also the right thing to do. We wanted to be transparent about how we got to our Better Careers approach and highlight the ideas reflected in it as well as the equally valuable insights that we decided not to pursue. Given the resources that went into accumulating this knowledge, and in the spirit of greater funder collaboration, we also posted these ideas on our website to benefit others working in this space.

As we continue to build our knowledge sharing muscle at Irvine, we are exploring additional ways to communicate as we go. We are currently reflecting on what we are learning about how we work inside the foundation – and thinking about ways to share the insights that can add value to the field. Participating as a voice in the Foundation Center’s new Open for Good guide was one such opportunity, and the stories and lessons from other foundations in the guide inspire our own path forward.

--Kim Ammann Howard

Learn, Share, and We All Win! Foundation Center Releases #OpenForGood Guide and Announces Award Opportunity
May 10, 2018

Melissa Moy is special projects associate for Glasspockets.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Knowledge is a resource philanthropy can’t afford to keep for itself, and as a result of a newly available guide, funders will now have a road map for opening up that knowledge. The new GrantCraft guide, Open for Good: Knowledge Sharing to Strengthen Grantmaking, supported by the Fund for Shared Insight, illustrates practical steps that all donors can take to create a culture of shared learning.

Philanthropy is in a unique position to generate knowledge and disseminate it, and this guide will help foundations navigate the process. Each year, foundations make $5 billion in grants toward knowledge production. These assessments, evaluations, communities of practice, and key findings are valuable, yet only a small fraction of foundations share what they learn, and even fewer use open licenses or open repositories to do so. Foundations have demonstrated that some of the information they value most is lessons about “what did and didn’t work.” And yet, this is the same knowledge that foundations are often most reluctant to share.

The guide, part of Foundation Center’s larger #OpenForGood campaign, makes a strong case for foundations to openly share knowledge as an integral and strategic aspect of philanthropy. Through interviews with leaders in knowledge sharing, the guide outlines tested solutions for overcoming common barriers to sharing what is learned, as well as the essential components funders need to strengthen their knowledge-sharing practice. The guide emphasizes that sharing knowledge can deepen internal reflection and learning, lead to new connections and ideas, and promote institutional credibility and influence.

Knowledge comes in all shapes and sizes – program and grantee evaluations, foundation performance assessments, thought leadership, and formal and informal reflections shared among foundation staff and board members. The guide will help your foundation identify the types of information that can be shared and take actionable steps to share them.

Download the Guide

To further encourage funders to be more transparent, this week Foundation Center also announces the opening of a nomination period for the inaugural #OpenForGood Award to bring due recognition and visibility to foundations that share challenges, successes, and failures to strengthen how we can think and act as a sector.

Three winning foundations will demonstrate an active commitment to open knowledge and share their evaluations through IssueLab. Winners will receive technical support to create a custom knowledge center for themselves or a grantee, as well as promotional support in the form of social media and newsletter space. Who will you nominate as being #OpenForGood?

--Melissa Moy 

Knowledge Sharing to Strengthen Grantmaking
April 26, 2018

Clare Nolan, MPP, co-founder of Engage R+D, is a nationally recognized evaluation and strategy consultant for the foundation, nonprofit and public sectors. Her expertise helps foundations to document and learn from their investments in systems and policy change, networks, scaling, and innovation. This post also appears on the Grantmakers for Effective Organizations’ (GEO) Perspectives blog.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Knowledge has the power to spark change, but only if it is shared. Many grantmakers instinctively like the idea of sharing the knowledge they generate with others. But in the face of competing priorities, a stronger case must be made for foundations to devote time and resources to sharing knowledge. The truth is that when foundations share knowledge generated through evaluation, strategy development and thought leadership, they benefit not only others but also themselves. Sharing knowledge can deepen internal reflection and learning, lead to new connections and ideas, and promote institutional credibility and influence.

Foundations can strengthen their knowledge sharing practices by enhancing organizational capacity and culture, and by understanding how to overcome common hurdles to sharing knowledge. The forthcoming GrantCraft guide Open for Good: Knowledge Sharing to Strengthen Grantmaking provides tips and resources for how foundations can do just that. My organization, Engage R+D, partnered with Foundation Center to produce this guide as part of #OpenForGood, a call to action for foundations to openly share their knowledge.

To produce the guide, we conducted interviews with staff at foundations that vary in origin, content focus, size, and geography. The participants shared their insights about the benefits of sharing knowledge not only for others, but also for their own organizations. They also described strategies they use for sharing knowledge, which we then converted into concrete and actionable tips for grantmakers. Some of the tips and resources available in the guide include:

  • A quiz to determine what type of knowledge sharer you are. Based upon responses to questions about your organization’s capacity and culture, you can determine where you fall within a quadrant of knowledge sharing (see visual). The guide offers tips for how to integrate knowledge sharing into your practice in ways that would be a good fit for you and your organization.
  • Nuts and bolts guidance on how to go about sharing knowledge. To take the mystery out of the knowledge sharing process, the guide breaks down the different elements that are needed to actually put knowledge sharing into practice. It provides answers to common questions grantmakers have on this topic, such as: What kinds of knowledge should I be sharing exactly? Where can I disseminate this knowledge? Who at my foundation should be responsible for doing the sharing?
  • Ideas on how to evolve your foundation’s knowledge-sharing practice. Even foundation staff engaged in sophisticated knowledge-sharing practices noted the importance of evolving their practice to meet the demands of a rapidly changing external context. The guide includes tips on how foundations can adapt their practice in this way. For example, it offers guidance on how to optimize the use of technology for knowledge sharing, while still finding ways to engage audiences with less technological capacity.

The tips and resources in the guide are interspersed with quotes, audio clips, and case examples from the foundation staff members we interviewed. These interviews provide voices from the field sharing tangible examples of how to put the strategies in the guide into practice.

Want to know how your foundation measures up when it comes to knowledge sharing? We are pleased to provide readers of this blog with an advance copy of Chapter 2 from the forthcoming Guide, which includes the quiz referenced above. Want to learn more? Sign up for the Foundation Center’s GrantCraft newsletter and receive a copy of the Guide upon its release. And, for those who are attending the GEO conference next week in San Francisco, visit us at our #OpenForGood pop-up quiz station where you can learn more about what kind of knowledge sharer you are.
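For readers who like to tinker, the quadrant idea can also be illustrated with a small sketch. The example below is purely hypothetical – it does not reproduce the guide’s actual quiz questions, scoring, or labels – and simply maps two assumed scores, one for organizational capacity and one for culture, onto four knowledge-sharing quadrants.

```python
# Hypothetical illustration of a two-axis "knowledge sharer" quadrant.
# The dimensions, thresholds, and labels are assumptions for this sketch,
# not the scoring used in the GrantCraft guide's quiz.

def knowledge_sharing_quadrant(capacity_score: int, culture_score: int,
                               midpoint: int = 5) -> str:
    """Map 0-10 capacity and culture scores to one of four quadrants."""
    high_capacity = capacity_score >= midpoint
    high_culture = culture_score >= midpoint
    if high_capacity and high_culture:
        return "Strong capacity and supportive culture: ready to share widely"
    if high_capacity:
        return "Capacity without culture: build leadership and board buy-in"
    if high_culture:
        return "Culture without capacity: invest in staff time and tools"
    return "Early stage: start small and grow both capacity and culture"

# Example: an organization with solid infrastructure but limited buy-in.
print(knowledge_sharing_quadrant(capacity_score=7, culture_score=3))
```

The real value of the quiz, of course, is the conversation it prompts inside a foundation, not the label itself.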

--Clare Nolan

Increasing Attention to Transparency: The MacArthur Foundation Is #OpenForGood
April 17, 2018

Chantell Johnson is managing director of evaluation at the John D. and Catherine T. MacArthur Foundation. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

At MacArthur, the desire to be transparent is not new. We believe philanthropy has a responsibility to be explicit about its values, choices, and decisions with regard to its use of resources. Toward that end, we have long had an information sharing policy that guides what and when we share information about the work of the Foundation or our grantees. Over time, we have continued to challenge ourselves to do better and to share more. The latest refinement of our approach to transparency is an effort to share more knowledge about what we are learning. We expect to continue to push ourselves in this regard, and our participation in Foundation Center’s Glasspockets and #OpenForGood movements offers just a couple of examples of how this has manifested.

In recent years, we have made a more concerted effort to revisit and strengthen our information sharing policy by:

  • Expanding our thinking about what we can and should be transparent about (e.g., our principles of transparency guided our public communications around our 100&Change competition, which included an ongoing blog);
  • Making our guidance more contemporary by moving beyond statements about information sharing to publishing more and different kinds of information (e.g., Grantee Perception Reports and evaluation findings);
  • Making our practices related to transparency more explicit; and
  • Ensuring that our evaluation work is front and center in our efforts related to transparency.

Among the steps we have taken to increase our transparency are the following:

Sharing more information about our strategy development process.
The Foundation's website has a page dedicated to How We Work, which provides detailed information about our approach to strategy development. We share an inside look into the lifecycle of our programmatic efforts, beginning with conceptualizing a grantmaking strategy through the implementation and ending phases, under an approach we refer to as Design/Build. Design/Build recognizes that social problems and conditions are not static, and thus our response to these problems needs to be iterative and evolve with the context to be most impactful. Moreover, we aim to be transparent as we design and build strategies over time. 

“We have continued to challenge ourselves to do better and to share more.”

Using evaluation to document what we are measuring and learning about our work.
Core to Design/Build is evaluation. Evaluation has become an increasingly important priority among our program staff. It serves as a tool to document what we are doing, how well we are doing it, how work is progressing, what is being achieved, and who benefits. We value evaluation not only for the critical information it provides to our Board, leadership, and program teams, but for the insights it can provide for grantees, partners, and beneficiaries in the fields in which we aim to make a difference. Moreover, it provides the critical content that we believe is at the heart of many philanthropic efforts related to transparency.

Expanding the delivery mechanisms for sharing our work.
While our final evaluation reports have generally been made public on our website, we aim to make more of our evaluation activities and products available (e.g., landscape reviews and baseline and interim reports). Further, in an effort to make our evaluation work more accessible, we are among the first foundations to make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

Further evidence of the Foundation's commitment to increased transparency includes continuing to improve our “Glass Pockets” by sharing:

  • Our searchable database of grants, including award amount, program, year, and purpose;
  • Funding statistics including total grants, impact investments, final budgeted amounts by program, and administrative expenses (all updated annually);
  • Perspectives of our program directors and staff;
  • Links to grantee products including grant-supported research studies consistent with the Foundation's intellectual property policies;
  • Stories highlighting the work and impact of our grantees and recipients of impact investments; and
  • Center for Effective Philanthropy Grantee Perception Report results.

Going forward, we will look for additional ways to be transparent. And, we will challenge ourselves to make findings and learnings more accessible even more quickly.

--Chantell Johnson 

Hiding Your Diversity Data Helps Keep #PhilanthropySoWhite
March 28, 2018

Orson Aguilar is president of The Greenlining Institute.

At this point, it’s no secret: Philanthropy needs to diversify. Diversity, or the lack thereof, has become something of a hot-button issue in recent years. We’ve seen dozens of articles urging foundations to make changes, including a 2016 op-ed co-written by Dr. Robert Ross, Luz Vega-Marquis, and Stephen Heintz entitled “Philanthropic Leadership Shouldn’t Look Like the Country Club Set.”

And a handful of foundations have demonstrated what is possible when they make diversity, equity, and inclusion organizational priorities. The California Endowment (TCE), one of the pioneers in these efforts, adopted a 15-part Diversity Plan in 2008, and since that year, TCE has published four “Diversity, Equity, and Inclusion Audits” to track its own progress. The audit is simple and profound, stating: “By openly reflecting on our progress and challenges related to diversity, equity and inclusion, we hope that the audit fosters a broader culture of continuous improvement where we challenge ourselves to always do better and to advance -- for the field, for our staff, and for the communities we ultimately serve.”

And yet, despite this heightened awareness and the concerted efforts of a handful of organizations, diversity and equity in philanthropy as a whole haven’t changed much. The data published by the D5 Coalition suggest that we have seen virtually no increase in the number of people of color who hold staff and leadership positions at foundations, and little increase in the representation of women.

“Making philanthropy more diverse and inclusive should be a top priority for everyone.”

More frustrating is the fact that very few foundations have decided to voluntarily disclose their demographic data since the attempted passage of California’s A.B. 624, proposed legislation that would have required large foundations in the state to collect and disclose demographic data for themselves and for their grantees. 

According to a search on Glasspockets.org, only 10 of the more than 90 foundations publicly committing to working more openly have disclosed both their diversity data and their diversity values policies. The list of 10 includes The David and Lucile Packard Foundation, The Rockefeller Foundation, Annenberg Foundation, and Silicon Valley Community Foundation. They should be applauded. Interestingly, more than 40 foundations have stated that they have diversity values policies, yet most of them fail to disclose their own diversity data.

Making philanthropy more diverse and inclusive should be a top priority for everyone, regardless of whether or not your foundation focuses on supporting communities of color. This isn’t just a numbers game. As Ruth McCambridge reminds us in her recent article for Nonprofit Quarterly, “Lack of racial, ethnic, and gender diversity in philanthropy enlarges the understanding gap between philanthropy and the communities meant to be final beneficiaries.” By not including more people who understand the experiences of communities of color in leadership positions, foundations put extra distance between themselves and these communities and can’t know how best to serve them.

Diana Campoamor and Vikki N. Spruill, veterans in the struggle to diversify philanthropy, jointly wrote in 2016, “Few would argue that there has been too little discussion about making the sector look more like the people it serves. The real challenge has been to set in motion the measures that assure greater diversity throughout the sector.”

“The only way philanthropy will remain relevant is if it evolves along with the communities around it.”

Just as it took #OscarsSoWhite to jolt the Motion Picture Academy into action, will it take #PhilanthropySoWhite taking off on social media to transform this sector? A group of people has championed this issue from within the world of philanthropy for years, and yet progress remains slow. It’s no longer a question of awareness; it’s a question of priorities. Of course, every foundation has its own vision and purpose, but the only way philanthropy will remain relevant is if it evolves along with the communities around it. That means being intentional about hiring more people from diverse backgrounds who can bring much-needed perspectives to the table; tracking the demographics of people who benefit from grant dollars; tracking the demographics of foundation board and staff; and being transparent about all of those numbers.

Why is transparency so important? Because we’ve seen it drive massive change in other fields. Since the California Public Utilities Commission began requiring the companies it regulates to report how much contracting they do with businesses owned by women, people of color and service disabled veterans, these companies’ contracts with diverse businesses went from $2.6 million in 1986 to $8.8 billion in 2016. In philanthropy, transparency can drive the field to build more coalitions of foundations that can hold each other accountable to high standards of transparency and inclusiveness. It can help them learn from the inclusive practices already adopted by some foundations.

Ultimately, it’s going to take a bigger push than anything we’ve seen before to transform the sector. Otherwise, philanthropy will become more and more out of touch with the people it seeks to serve, and it will become increasingly unable to address the needs of a rapidly changing America.

What is perplexing is that large foundations value data and frequently fund social justice efforts in which obtaining more gender, racial, LGBTQ, and ethnic data counts as a positive outcome of their grants. The fiscal impact on foundations of collecting this data about their own operations and grantees would be negligible. Foundations like TCE have demonstrated that “the sky didn’t fall” when the data was published, as critics suggested it would 10 years ago. Just the opposite: The foundation learned from its data to make better decisions about how to operate.

In an era of greater transparency, and increasing recognition that we are a diverse and multicultural nation, we urge more foundations to take the leap and conduct and share their own diversity and inclusion audits.

--Orson Aguilar 

It’s Not You, It’s Me: Breaking Up With Your Organization’s Inequitable Funding Practices
March 21, 2018

Erika Grace “E.G.” Nelson is a Community Health and Health Equity Program Manager at the Center for Prevention at Blue Cross and Blue Shield of Minnesota. E.G. recently led the Center through an equity scan of its Request for Proposal (RFP) policies and procedures.

“It’s not you; it’s me” is possibly the most cliché break-up excuse, but for many funders, it really is their own policies and procedures that undermine their ability to find community soulmates. Perhaps you have had conversations with community members who have said that they found out about your funding opportunity too late, were too busy to apply, or, worse yet, were rejected even though their project sounds like a great fit based on the conversation you are currently having with them. The reality is that funders typically enact policies that are convenient for themselves, as opposed to what makes sense for grantseekers, and diversity, equity, and inclusion (DEI) fall by the wayside in the name of expediency. As a result, organizations with the most social and fiscal capital have the best shot at receiving awards.

Have you ever taken the time to think about how your funding portfolio might look different if your RFP process were designed to be more equitable and inclusive? We recently completed an equity scan, and here is a bit about how this reflection has led to changes in our RFP process.

“Funders typically enact policies that are convenient for themselves, as opposed to what makes sense for grantseekers.”

At the Center for Prevention, our goal is to improve the health of all Minnesotans by tackling the leading causes of preventable disease and death – commercial tobacco use, physical inactivity, and unhealthy eating. While Minnesota has one of the best overall health rankings in the nation, we see huge gaps in health outcomes when considering factors such as race, income, and area of residence.

We also know that communities are aware of what they need to be healthy, but organizations established by and for marginalized communities tend to face greater barriers than well-resourced, mainstream organizations in getting what they need. We wanted to remove as many barriers from our application process as possible so that we could find and support more community-based and culturally-tailored approaches to addressing health needs. To begin identifying these barriers, our team reflected on challenges identified by communities we work with and walked through our application process from beginning to end using an equity lens. As a result, we have implemented several systemic changes to move towards our vision of a truly equitable process.

Bringing the Funding Opportunities to the Community

We began our journey by thinking about funding opportunities. Before an organization can even apply for funding, it needs to know that an opportunity exists. Through community conversations, we learned that many organizations were unfamiliar with our resources and work. We recommended that project teams develop a tailored outreach plan for each funding opportunity, with specific outreach to organizations or sectors we considered key stakeholders or that had been markedly absent in previous rounds. Moving forward, we also have a goal of literally meeting folks where they are – town halls, cultural events, social media – to share our work and funding opportunities.

As a result, here are some ways we shifted how we engage with community organizations through our RFP process:

  • Time. Once applicants find out about an opportunity, they need to apply, which takes some time. We learned that some potential applicants prioritized other opportunities because they didn’t have the staff capacity to apply for multiple opportunities concurrently. The easiest solution to this problem was to give applicants more time, so we extended our open application period. In our case, we went from no set minimum to at least six weeks.
  • Assistance. We also wanted to make sure that applicants could make informed decisions about how to prioritize staff time, so we opened up new channels for discussing funding opportunities. We made sure that every application had a designated point person for answering questions from the public, and even piloted some creative ways to interact with the community in advance of the submitted application, such as an “office hours” hotline where anyone could call in and ask questions. The number of inquiries was manageable and allowed applicants to receive guidance on whether their projects were a good match before they invested time in applying. Follow-up survey data showed that this strategy paid off because applicants reported that they understood our funding objectives and that the time they invested in applying was appropriate for the potential award.
  • Accessibility. We are also working towards using more accessible language to articulate the merits of a viable proposal. We now run a readability test on all RFP language before publication, with the goal of using language that is no higher than an eighth grade reading level. Such tests have helped us remove jargon, and improve comprehension by professionals outside of public health as well as by non-native English speakers.
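For readers who want to see what such a check can look like in practice, here is a minimal sketch in Python. It assumes the open-source textstat package (this is an illustration of the general approach, not the Center’s actual tooling) and flags RFP text whose Flesch-Kincaid score exceeds an eighth-grade reading level.

```python
# Minimal sketch of an automated readability check for RFP text.
# Assumes the open-source `textstat` package (pip install textstat);
# this is an illustration, not the Center's actual tooling.
import textstat

MAX_GRADE_LEVEL = 8  # target: no higher than an eighth-grade reading level


def check_rfp_readability(rfp_text: str, max_grade: float = MAX_GRADE_LEVEL) -> bool:
    """Return True if the text scores at or below the target grade level."""
    grade = textstat.flesch_kincaid_grade(rfp_text)
    print(f"Flesch-Kincaid grade level: {grade:.1f} (target <= {max_grade})")
    return grade <= max_grade


if __name__ == "__main__":
    draft = (
        "Applicants must demonstrate the capacity to implement evidence-based, "
        "community-driven interventions that advance health equity outcomes."
    )
    if not check_rfp_readability(draft):
        print("Consider simplifying jargon before publishing this RFP.")
```

A formula-based score catches long sentences and complex words, but it is no substitute for having community reviewers read the language as well.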

Leveling the Playing Field of Community Relationships

Our team also considered the role relationships play in evaluating proposals. We approached equity from two angles. We set limits on what “outside information”— knowledge we have about a project that didn’t come from the application—can be shared during proposal review, and when. We also started reaching out to new applicants to discuss their work more deeply. Our familiarity with mainstream organizations and those we have previously funded can influence how we evaluate an application, and in some cases lead to an unfair advantage for groups that already have many advantages. So these limits on “outside information” were put in place to level the playing field, as well as to begin to strengthen relationships with organizations that were new to us. These conversations helped us fill gaps in our understanding that we may unconsciously fill for organizations we are already familiar with.

“We now run a readability test on all RFP language before publication…to remove jargon, and improve comprehension.”

Transparent Evaluation Processes

We felt transparency in our decision-making process could only improve the quality of proposals. One way we have done this is by making scoring rubrics available to applicants. We also began providing tailored feedback to each declined applicant on how the proposal could have been stronger in hopes that it will improve future submissions. Though we have yet to determine what impact this will have in the future, we can say that applicants have been appreciative and found this feedback to be useful.

Hope and More Work to Be Done

While we don’t yet have much data to analyze post-implementation, we have noticed a few positive outcomes. We have seen a marked increase in applications from greater Minnesota in particular, demonstrating that our targeted outreach is increasingly effective. Our funding awards to projects by and for people of color have also doubled in one of the two opportunities we have analyzed since implementation. Despite this progress, we continue to wrestle with how to develop scoring tools that better reflect our values.

The above are just some examples of how we have begun to identify and address equity barriers in our process. If your foundation is considering something similar, here are some things we learned from our experience that may be helpful for you.

  • Leadership & Promising Practices. As with any new process implementation, support from leadership is critical. If you are met with resistance, keep in mind that funders typically want to emulate best and promising practices in philanthropy, and sharing what other funders are doing around diversity, equity, and inclusion can be highly motivating.
  • Checks & Balances. It is also important to keep in mind that old habits die hard. This is not necessarily because team members are resistant to change; they may simply need to get into the routine of doing things differently. For that reason, be sure to build in checks and balances so that everyone who touches your RFP process has the opportunity to identify pain points along the way while also upholding equity commitments.
  • No One Size Fits All. Keep in mind that there is not one model that will work for everyone, and in much the same way, not all the communities you serve will be pleased with the changes you make. So, keep asking for and responding to feedback from the community, and know that correcting mistakes is part of improvement and part of ensuring our processes continue to be ones that facilitate, rather than undermine, diversity, equity, and inclusion.

--Erika Grace “E.G.” Nelson

About Transparency Talk

Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations – illuminating the importance of having "glass pockets."

The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

Questions and comments may be directed to:

Janet Camarena
Director, Transparency Initiatives
Foundation Center

If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org
