Transparency Talk

Category: "Lessons Learned" (90 posts)

Meet Our #OpenForGood Award Winner: An Interview with Veronica Olazabal, Director of Measurement, Evaluation and Organizational Performance, The Rockefeller Foundation
July 10, 2019

Veronica Olazabal

This post is part of the Glasspockets’ #OpenforGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

The Rockefeller Foundation advances new frontiers of science, data, policy, and innovation to solve global challenges related to health, food, power, and economic mobility. In this interview, Veronica Olazabal shares insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the Rockefeller Foundation and how this practice has helped you to advance your work? Or put another way, what is the good that has come about as a result?

Veronica Olazabal: We are excited to be an inaugural recipient of the #OpenForGood award! As you may be aware, since its founding more than 100 years ago, The Rockefeller Foundation's mission has been “to promote the well-being of humanity throughout the world.” To this end, the Foundation seeks to catalyze and scale transformative innovation across sectors and geographies, and take risks where others cannot, or will not.

While often working in new and innovative spaces, the Foundation has always recognized that the full impact of its programs and investments can only be realized if it measures - and shares - what it is learning. Knowledge and evidence sharing have been core to the organization's DNA dating back to its founder John D. Rockefeller Sr., who espoused the virtues of learning from and with others—positing that this was the key to "enlarging the boundaries of human knowledge." You can imagine how this, in turn, resulted in transformational breakthroughs such as the Green Revolution, the eradication of Yellow Fever and the formalization of Impact Investing.

GP: Your title has the word “evaluation” in its name and increasingly we are seeing foundations move toward this staffing structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

VO: Learning is a team sport and to that end, an evaluation and learning team should be centrally positioned and accessible to all teams across a foundation. At the Rockefeller Foundation, the Measurement and Evaluation team engages with both the programmatic and the impact investing teams. We see our role as enablers of good practices around impact management and programmatic learning -- often working with teams in early stage design support, through start-up, implementation and exit. We also work collaboratively with others at the Foundation, such as our grants-management and data teams, to ensure the “right” M&E data is being captured throughout our grantees’ lifecycle.

Yet, I will be the first to say that building a culture of learning by continuously reaching “over the fence” is a lot of work and might be challenging for a small team, which is the reality for most foundations. Benchmarking data produced by the Center for Evaluation Innovation (CEI) and the Center for Effective Philanthropy (CEP) lands most M&E teams at foundations at around 1.5 staff members. So, capacity for culture change is clearly a challenge. My suggestion here is to source evaluation and learning talent that balances the hard technical chops with the softer people skills. I believe you truly need both; if an organization optimizes for one over the other, it might experience a series of false starts. A good place to start in sourcing evaluation talent is the American Evaluation Association (AEA).

GP: As you heard during the award presentation, one of the reasons the Rockefeller Foundation was selected to receive this award is because of your commitment to sharing the results of any evaluation you commission, before you even know the outcome. This pledge seems designed to not let negative findings affect your decision about whether or not to share what you learned. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected Rockefeller’s reputation in the field, and why it’s worth it?

VO: In 2017, The Rockefeller Foundation was pleased to be the first to make all of its evaluations available to IssueLab as part of #OpenForGood. But to the Foundation, being open goes well beyond passively making information available to those seeking it. Being truly open necessarily involves the proactive sharing of lessons so that others can be aware of and leverage from the things that we are learning. To that end, we regularly author blogs, disseminate evaluation reports and M&E learnings via digital channels, and – perhaps most importantly – share back evaluation results with our grantees and partners – so that evaluation is more than a one-way extractive exercise.

"Being truly open necessarily involves the proactive sharing of lessons so that others can be aware of and leverage from the things that we are learning."

Taking sharing one step further, earlier this year, The Rockefeller Foundation adopted a new Data Asset Policy aimed at making the data that we collect as part of our grantmaking freely available to others who could use it to effect more good in the world. The policy is grounded in two core principles: 1) that the data we fund has incredible value for the public good and that these assets can serve as fuel for better decision-making; and 2) that we commit ourselves to being responsible stewards of these data, which means prioritizing privacy and protection, especially of those individuals and communities we seek to serve. Moving forward, this opens up the ability to amplify our learning even further and in even more innovative ways.

GP: A concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does Rockefeller include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

VO: Having had the experience of being both a funder and a grantee, I know this is a real barrier to enabling robust learning cultures and evidence-informed decision-making. For this reason, at The Rockefeller Foundation we approach resourcing in a few different ways:

  • First, through embedding resources for evaluation and learning into individual grantee budgets and agreements from the start. This type of funding enables grantees to generate the type of data they need for their own decision-making, learning and reporting.
  • We also often work in a consortium model where we commission an evaluation and learning grantee separately to synthesize learnings across groups of grantees and provide technical assistance as needed. This approach helps decrease the reporting burden for “implementation” types of grantees, as it generates what it is the Foundation would like to learn (which could differ from what the grantees and their clients find useful). Here is an example from our Digital Jobs Africa portfolio generated through this evaluation and learning model.
  • Finally, we have also at times, and upon request, seconded our own M&E staff to grantees and partners to help build their M&E muscle and enable them to measure their own impact. While this is rare, we are seeing this request more and more, which is why we value both technical expertise and relationship management skills.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your own work.

VO: There are many opportunities to learn from others. In my current role, I am in continuous engagement with colleagues in similar roles at other philanthropies and regularly meet before or after convenings organized by CEP, GEO and AEA. In addition, as part of my work on the Fund for Shared Insight, a funding collaborative working to make listening to end-users the norm, my philanthropy colleagues and I often exchange notes on where we all are in our personal and institutional learning journeys.

Finally, as part of a W.K. Kellogg Foundation-funded Lab for Learning, The Rockefeller Foundation was most recently among a cohort of 15 foundations that took part in a year-long series of convenings to address systemic barriers to learning. Participation here required us to experiment with ideas for supporting learning in our own settings and then sharing our experiences with the group. Through this engagement, we learned about how others were building learning habits in their foundations (written about in Julia Coffman’s post here). More specifically, the measurement and evaluation team was able to introduce Making Thinking Visible and Asking Powerful Questions in our early stage support to program teams to push thinking about assumptions and concrete dimensions of the work. This engagement then helped to structure the foundations of a learning agenda (e.g. theory of change-like tool with clear outcomes, hypotheses, assumptions and evidence) that would be used to anchor adaptive management and continuous improvement once the program strategy rolled out.

--Veronica Olazabal & Janet Camarena

Meet Our #OpenForGood Award Winner: An Interview with Lee Alexander Risby, Head of Effective Philanthropy & Savi Mull, Senior Evaluation Manager, C&A Foundation
June 19, 2019

Lee Alexander Risby


C&A Foundation is a European foundation that supports programs and initiatives to transform fashion into a fair and sustainable industry that enables everyone – from farmer to factory worker – to thrive. In this interview, Lee Alexander Risby and Savi Mull share insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the C&A Foundation and how this practice has helped you to advance your work?

Savi Mull

Savi Mull: For almost five years, C&A Foundation has been dedicated to transforming the fashion industry into a force for good. A large part of that work includes instilling transparency and accountability in supply chains across the industry. From the start, we also wanted to lead by example by being transparent and accountable as an organization, sharing what we were learning whilst on this journey, being true to our work and helping the rest of the industry learn from our successes and failures.

Lee Alexander Risby: Indeed, from the beginning, we made a commitment to be open about our results and lessons by publishing evaluations on our website and dashboards in our Annual Reports. After all, you cannot encourage the fashion industry to be transparent and accountable and not live by the same principles yourself. Importantly, our commitment to transparency has always been championed both by our Executive Director and our Board.

Savi: To do this, over the years we have put many processes in place.  For example, internally we use after-action reviews to gather lessons from our initiatives and allow our teams to discuss honestly what could have been done better in that program or partnership.  We also do third party, external evaluations of our initiatives, sharing the reports and lessons learned. This helps us and our partners to learn, and it informs initiatives and strategies going forward.

The Role of Evaluation Inside Foundations

GP: Your title has the word “evaluation” in its name and increasingly we are seeing foundations move toward this staffing structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

SM: I believe it is essential to have this type of function in a foundation to drive formal learning from and within programs. But at the same time, it is an ongoing process that cannot be driven by one function alone. All staff need to be responsible for the learning that makes philanthropy effective – not just evaluators.

LAR: To begin, we were deliberate in building a team of evaluation professionals to promote accountable learning. We started hiring slowly and built the team over time. What I looked for with each new member of the team, and what I am always looking for, is an evaluator with more than just technical skills; they also need the influencing, listening, communication and negotiating skills to help others learn. Evaluations have little effect without good internal and external communication.

"For us, it was important to be a critical friend, listener, and enabler of learning and not the police."

The evaluation function itself has also evolved over the last five years. It started off as a monitoring, evaluation and learning (MEL) function and is now Effective Philanthropy. From the start, the function was not set up as an independent department; it was created to help programmatic teams design appropriate monitoring and evaluation for their programs, and to serve as facilitators and advisors on strategy. However, it has not always been a straightforward process from the inside. In the first years, we had to spend a lot of time explaining and persuading staff of the need for evaluation, transparency and learning and the benefits of doing so. We wanted to avoid a strong independent evaluation function, as that can reduce learning by placing too much emphasis on accountability. For us, it was important to be a critical friend, listener, and enabler of learning and not the police.

SM: So, the first bit of advice is that evaluators should be supportive listeners, assisting programmatic teams throughout the design and implementation phases to get the best results possible. They should not come in just at the end of an initiative to do an evaluation.

LAR: The second piece of advice is on positioning, support, and structure of evaluation within a foundation. Firstly, it is critical to have the buy-in of the leadership and board for both evaluation and transparency. And secondly, the evaluation function must be part of the management team and report to the CEO or Executive Director. This gives reporting and learning the appropriate support structure and importance.

The third piece of advice is to consider not creating an evaluation function, but an effective philanthropy function. Evaluation is done for learning, and learning drives effectiveness in grant-making for better results and long-term impacts on systems.

SM: The final piece of advice is to take guidance from others outside your organization. The whole team has consulted broadly with former colleagues and mentors from across the evaluation community as well as experienced philanthropic professionals. Remember you are part of a field with peers whose knowledge and experience can help guide you.

Opening Up Pain Points

GP: One of the reasons the committee selected C&A Foundation to receive the award is because of your institutional comfort level with sharing not just successes, but also being very forthright about what didn’t work. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected C&A Foundation’s reputation in the field, and why it’s worth it?

LAR: I would say this. The question for foundation boards and leaders is straightforward: do you want to be more effective and have an impact? The answer to that will always be yes, but it is dependent on learning and sharing across the organization and with others. If we do not share evaluations, research or experiences, we do not learn from each other and we cannot be effective in our philanthropic endeavors.

"There is a benefit to being open, you build trust and integrity – success and failure is part of all of us."

The other question for boards and leaders is: who does philanthropy serve? For us, we want to transform the fashion industry, which is made up of cotton farmers, workers in spinning mills and cut and sew factories, consumers and entrepreneurs, to name a few – they are our public. As such we have the duty to be transparent to the public about where we are succeeding and where we have failed and how we can improve. We do not think there is a reputation risk. In fact, there is a benefit to being open, you build trust and integrity – success and failure is part of all of us.

SM: Adding to what Lee has said, being open about our failures helps not only us but the entire field. Some of our partners have felt reticent about our publishing evaluations, but we always reassure them and stress from the beginning of an evaluation process that it is an opportunity to understand how they can improve their work and how we can improve our partnership, as well as a chance to share those lessons more broadly.

Learning While Lean

GP: Given the lean philanthropy staffing structures in place at many corporate foundations, do you have any advice for your peers on how those without a dedicated evaluation team might still be able to take some small steps to sharing what they are learning?

SM: Learning is a continuous process. In the absence of staff dedicated to evaluation, take baby steps within your power, such as implementing after-action reviews, holding thematic webinars, or doing quick summaries of lessons from grants and/or existing evaluations from others. If the organization’s leadership endorses learning, these small steps are a good place to start.

GP: And speaking of lean staffing structures, a concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does C&A Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

SM: The foundation has a Monitoring and Evaluation Policy that lays out the role of the programmatic staff and partners as well as of the dedicated Effective Philanthropy Team. C&A Foundation partners are generally responsible for the design and execution of self-evaluation - to be submitted at the end of the grant period. External evaluation budgets are covered by the foundation and do not pose a financial burden on partners at all. They are included in the overall cost of an initiative, and when needed we have an additional central evaluation fund that is used to respond to the programmatic team’s and partner’s ad hoc demands for evaluations and learning.

The Effective Philanthropy team does provide technical assistance to partners and foundation staff upon request. The guidance ranges from technical inputs related to the theory of change development to the design of baseline and mid-line data collection exercises. The theory of change work has been really rewarding for partners and ourselves. We all enjoy that part of the work.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your work.

Learning Leads to Effectiveness

LAR: In moving from a more traditional MEL approach to effective philanthropy, we looked at the work of other foundations. This included learning from the William and Flora Hewlett Foundation, the Rockefeller Foundation, and others. We had discussions with a number of peers in the field. We also asked Nancy MacPherson (formerly Managing Director of Evaluation at Rockefeller) and Fay Twersky (Director of Effective Philanthropy at Hewlett) to review our Effective Philanthropy strategy when it was under development. Their feedback and advice helped a lot. In the end, we decided to begin to build out the function in a similar way to the Hewlett Foundation. But there are some differences. For example, our evaluation practice is currently positioned at a deeper initiative level, which is related to the field context: there is a significant evidence gap across the fashion industry that needs to be filled. Concomitant to this is our emphasis on piloting and testing, which goes hand-in-hand with the demand for evaluative thinking, reporting, and learning.

Our team has also been influenced by our own successes and failures from previous roles. That has also inspired us to embrace a slightly different approach.

SM: In terms of where we are at the moment, we still oversee performance monitoring, evaluation, and support to the program teams in developing theories of change and KPIs; but we are also building out an organizational learning approach and are in the process of hiring a Senior Learning Manager. Lastly, we are piloting our organizational and network effectiveness work in Brazil, which is being led by a colleague who joined the foundation last year.

LAR: We are also in the midst of an Overall Effectiveness Evaluation (OEE) of C&A Foundation’s first 5-year strategy. In general, this is not a type of evaluation that foundations use much. As well as looking at results, the evaluators are evaluating the whole organization, including Effective Philanthropy. For me as an evaluator, it has been really rewarding to be on the other side of a good question.

We are learning from the OEE as we go along, and we decided to create ongoing opportunities for reporting and feedback from the process rather than waiting until the very end for a report. This means that program staff can be engaged in proactive discussions about performance and emerging lessons in a timely way. The OEE is already starting to play a vital role in informing the development of the next 5-year strategy and our organization. But you will surely hear more on that evaluation process later, as it will be published. There is always room for improvement and learning never stops.

--Lee Alexander Risby and Savi Mull

Meet Our #OpenForGood Award Winner: An Interview with Craig Connelly, Chief Executive Officer, The Ian Potter Foundation
June 12, 2019

Craig Connelly


The Ian Potter Foundation is an Australian foundation that supports and promotes excellence and innovation working for a vibrant, healthy, fair, and sustainable Australia. In this interview, Craig Connelly shares insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the Ian Potter Foundation and how this practice has helped you to advance your work? Or put another way, what is the good that has come about as a result?

Craig Connelly: The Ian Potter Foundation decided to invest in our research and evaluation capability primarily to improve the quality of our grantmaking. We believe that measuring and evaluating the outcomes of our grantees and the work that we fund enables us to understand the extent to which our funding guidelines are achieving the intended outcomes. This results in a more informed approach that should improve the quality of our grantmaking over time.

A core part of this includes being completely transparent with our grantees and with the broader sector. To do otherwise would be inconsistent with our expectations of our grantees. We ask our grantees to be partners and pursue a strategic relationship with them, and that requires open and honest conversation. Therefore, we need to be an open, honest and transparent funder and demonstrate that in order to win the trust of the organizations we fund.

One example of this transparency is sharing the learnings we glean from our grantees with the broader sector. We’re getting very positive feedback from both funders and grantees on the quality of the learnings that we’re sharing and the value they add to the thought processes that nonprofit organizations and other funders go through.

GP: Increasingly we are seeing foundations move toward a structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

CC: Anyone in a research and evaluation role needs to be an integral part of the program management team. The research and evaluation process informs our grantmaking. It needs to assist the program managers to be better at what they do, and it needs to learn from what the program managers are doing as well. You don’t want it to be a silo; it is just another function of your program management team. It is an integral part of that team, and it is in constant communication both with the program management team and with grantees from day one.

GP: As you heard during the award presentation, one of the reasons the Ian Potter Foundation was selected to receive this award is because of how you prioritize thinking about how stakeholders like grantees might benefit from the reports and knowledge you possess. We often hear that while there is a desire to share grantee reports publicly, that there are reputational concerns that prevent it or that to scrub the reports of sensitive information would be too time consuming, yet you do it for all of your portfolios. What are your tips for how to keep this a manageable process?

CC: The initial work to compile and anonymize our grantee learnings required some investment in time from our Research & Evaluation Manager and communications team. To make this task manageable, the work was tackled one program area at a time. Now that a bank of learnings has been created for each program area, new learnings are easily compiled and added on a yearly basis. This work is scheduled at less busy times for those staff involved. The Ian Potter Foundation is also looking at ways learnings can be shared directly from grantees to the wider nonprofit sector. One idea is to create a forum (e.g. a podcast) where nonprofits can share their experiences with their peers in the sector.

GP: A concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does The Ian Potter Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

"...we need to be an open, honest and transparent funder and demonstrate that in order to win the trust of the organizations we fund."

CC: One of the benefits that we found at The Ian Potter Foundation of having a Research & Evaluation Manager becoming an integral part of our process is that our authorizing environment – our board and the committees responsible for program areas – have become very comfortable including funding evaluation for all of our grants. We now also understand what it costs to complete an effective evaluation. We often ask grantees to add more to their budget to ensure a good quality evaluation can be completed as part of the grant.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your own work.

CC: Yes, we have a couple of examples I can point to. The first comes from our Education Program Manager, Rikki Andrews, who points to the creation of the Early Childhood Impact Alliance (ECIA) through a grant to the University of Melbourne. The purpose of the ECIA is to convene, connect and increase understanding of research and policy among early childhood philanthropic funders, to ensure there is more strategic and concerted philanthropic support of research and its application.

Additionally, the Foundation’s Senior Program Manager, Dr. Alberto Furlan, explains, ‘We are in the process of learning from organizations we partner with all the time. In the last few years, program managers have been prioritizing extensive site visits to shortlisted applicants to discuss and see the projects in situ. In a ‘big country’ such as Australia, this takes a considerable amount of time and resources, but it invariably pays off. Such visits highlight the importance of relationship building and deep, honest listening when partnering with not-for-profits. The Foundation prides itself on being open and approachable, and site visits greatly contribute to understanding the reality of the day-to-day challenges, and successes, of the organizations working on the ground.’

--Craig Connelly & Janet Camarena

Candid Announces Inaugural #OpenForGood Award Winners
May 30, 2019

Janet Camarena is director of transparency initiatives at Candid.


Open For Good Awardees and Committee Members. Left to Right: Meg Long, President, Equal Measure (#OpenForGood selection committee); Janet Camarena, Director, Transparency Initiatives, Candid; Awardee Savi Mull, Senior Evaluation Manager, C&A Foundation; Awardee Veronica Olazabal, Director, Measurement, Evaluation & Organizational Performance, The Rockefeller Foundation; Clare Nolan, Co-Founder, Engage R + D (#OpenForGood selection committee).

Yesterday as part of the Grantmakers for Effective Organizations Learning Conference, Candid announced the inaugural recipients of the #OpenForGood Award, which is designed to recognize and encourage foundations to openly share what they learn so we can all get collectively smarter. The award, part of a larger #OpenForGood campaign started in 2017, includes a set of tools to help funders work more transparently including a GrantCraft Guide about how to operationalize knowledge sharing, a growing collection of foundation evaluations on IssueLab, and advice from peers in a curated blog series.

The three winning foundations each demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. An external selection committee chose the winners from a globally sourced pool of nominees, reviewing the contenders for evidence of an active commitment to open knowledge, creative approaches to making knowledge shareable, field leadership, and the incorporation of community insights into knowledge sharing work.

And the Winners Are…

Here are some highlights from the award presentation remarks:

C&A Foundation
Award Summary: Creativity, Demonstrated Field Leadership, and Willingness to Openly Share Struggles

The C&A Foundation is a multinational corporate foundation working to fundamentally transform the fashion industry. C&A Foundation gives its partners financial support, expertise, and networks so they can make the fashion industry work better for every person it touches. Lessons learned and impact for each of its programs are clearly available on its website, and helpful top-level summaries are provided for every impact evaluation, making lengthy narrative evaluations accessible to peers, grantees, and other stakeholders. C&A Foundation even provides such summaries for efforts that didn’t go as planned, packaging them in an easy-to-read graphic format that it shares via its Results & Learning blog, rather than hiding them away and quietly moving on, as is more often the case in the field.

The Ian Potter Foundation
Award Summary: Creativity, Field Leadership, and Lifting Up Community Insights

This foundation routinely publishes collective summaries from all of its grantee reports for each portfolio as a way to support shared learning among its existing and future grantees. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than on questions that satisfy the typical grant-report ritual: submit, enter data, file away never to be seen, and repeat.

Beyond being transparent with its grantee learning and reports, the Ian Potter Foundation also recently helped lift the burden on its grantees when it comes to measurement and outcomes. Instead of asking overworked charities to invent a unique set of metrics just for their grant process, foundation evaluation staff took it upon themselves to mine the Sustainable Development Goals targets framework to provide grantees with optional and ready-made outcomes templates that would work across the field for many funders. You can read more about that effort underway in a recent blog post here.

The Rockefeller Foundation
Award Summary: Field Leadership, Consistent Knowledge Sharing, and Commitment to Working Transparently

The Rockefeller Foundation can claim early-adopter status on transparency and openness—it has had a longstanding commitment to creating a culture of learning, and as such was one of the very first foundations to join the GlassPockets transparency movement and to commit to #OpenForGood principles by sharing its published evaluations widely. The Rockefeller Foundation also took the unusual step of upping the ante on the #OpenForGood pledge, aiming to create a culture of both learning and accountability, with its monitoring and evaluation team stating: “To ensure that we hold ourselves to a high bar, our foundation pre-commits itself to publicly sharing the results of its evaluations - well before the results are even known.” This ensures that even if an evaluation reports unfavorable findings, the intent is to share it all.

In an earlier GlassPockets blog post, Rockefeller’s monitoring and evaluation team shows a unique understanding of how sharing knowledge can advance the funder’s goals: “Through the documentation of what works, for who, and how/under what conditions, there is potential to amplify our impact, by crowding-in other funders to promising solutions, and diverting resources from being wasted on approaches that prove ineffectual.”  Rockefeller’s use of IssueLab’s open knowledge platform is living up to this promise as anyone can currently query and find more than 400 knowledge documents funded, published, or co-published by the Rockefeller Foundation.

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in knowledge dissemination. Knowledge Centers are a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insights, promote analyses of your grantees’ work, and feature learnings from network members. All documents uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. Today, most people expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, today only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go toward creating a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and to share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation’s work, the greater the opportunity to make all our efforts more effective and farther reaching.

Congratulations to our inaugural class of #OpenForGood Award Winners! What will you #OpenForGood?

--Janet Camarena

Book Review: 'Giving Done Right: Effective Philanthropy and Making Every Dollar Count'
April 18, 2019

Daniel X Matz is manager and content developer for Candid's GlassPockets.org portal. This review first appeared in Philanthropy News Digest's PhilanTopic blog.

Back in 2016, Bill Gates, in the context of his partnership with Heifer International to donate 100,000 chickens to people around the world living on $2 a day, blogged about how raising egg-laying fowl can be a smart, cost-effective antidote to extreme poverty. As Phil Buchanan tells it in Giving Done Right: Effective Philanthropy and Making Every Dollar Count, the idea, however well-intentioned, attracted scorn from some quarters, including Bolivia, where the offer was declined — after it was pointed out that the country already produces some 197 million chickens a year. The episode is a pointed reminder that being an effective philanthropist isn't as easy as it might seem.

"If you want to effect lasting change — to move the needle — then you need to dig in and think long-term."

And Buchanan ought to know; as the founding CEO of the Cambridge-based Center for Effective Philanthropy for the past seventeen years, he has worked closely with more than three hundred foundations and scores of individual givers, exploring the landscape of American giving, distilling lessons learned (both successes and failures), and highlighting what works and what doesn't. (Spoiler alert: there's no single answer as to how to give "right," but few are better positioned than Buchanan to frame the question.) In this slim volume, he lays out a framework that can help anyone engaged in philanthropy to be more thoughtful, open-minded, and willing to learn, adapt, and keep trying.

As Buchanan sees it, anyone can be an effective philanthropist, and there is no one best practice to that end, other than to be as engaged as one can be. While much of the advice he shares is better suited for the well-heeled donor or the program officer at an established foundation (those with the time and resources to think through larger issues, consider options, and evaluate methods for learning from their giving), the panhandler's dictum applies: you don't need to be a Rockefeller to help a fella, and you don't need to be a tech billionaire to carve out a smart, sustainable path for your own giving. Certainly, to give is better than not to give, and if all you have the time to do is to write a check, do that. But if you want to effect lasting change — to move the needle, as it were — then you need to dig in and think long-term.

Phil Buchanan

According to Buchanan, digging in means setting goals, weighing strategies for achieving those goals, evaluating the effectiveness of your giving, and, armed with that information, going back for more. Buchanan's work with CEP has given him special insight into how philanthropists approach their giving, and he's nut-shelled a range of smart propositions designed to help individuals and institutions think more clearly about how and where they give. Take his four types of givers:

  • The charitable banker gives broadly because of precedent or simply because they're asked to, but without a goal or focus that informs that giving.
  • The perpetual adjuster constantly changes who and what they fund but never has a sense of whether that giving is doing any good.
  • The partial strategist connects some of the dots in terms of goals, strategy, and effectiveness, but still keeps much of his/her giving unaligned with those goals (think of the family foundation that strategically works to reduce hunger in its community but allocates half its grants to the unrelated interests of board members).
  • The total strategist is all in on finding approaches that work and is willing to rigorously test strategies toward achieving clear goals.

While most givers start out as charitable bankers, Buchanan wants them to become as strategic as they can be, spending their time, talent, and treasure "maximizing [their] chances of making a difference."

Being strategic isn't quite the same as being on target, however, and the balance of Giving Done Right is a broad-brush effort to tease out the key ingredients of effective philanthropy. For instance:

  • Stop thinking you know everything. "The most effective givers open themselves to the possibility that others are in a better position to identify solutions." Not only do givers need to up their game with respect to understanding the problem they hope to solve, they also need to deepen their understanding of the communities and nonprofits actually doing the work.
  • Stop re-inventing the wheel. "The best givers share what they're learning openly with other funders and those they fund." Chances are you're not the first to want to solve an intractable problem; effective philanthropy means building on what others have learned, supporting their efforts when they work, and collaborating to find new paths when they don't.
  • Take the time to find the right fit. Not every family needs its own foundation; for some, a checkbook at the kitchen table will do just fine, while for others it's a giving circle, a community foundation, a donor-advised fund, an LLC, or a programmatically focused, professionally staffed foundation. And while Buchanan sees the opacity of DAFs and LLCs as a thorn in the side of the sector's embrace of openness (and conversely views independent foundations as the dark horse in leveraging transparency across the sector), the key here is understanding which vehicle works best with your goals, and then getting to work.

Ultimately, transparency is at the heart of Giving Done Right, where "clarity, openness, and honesty about goals and strategies, as well as the nitty-gritty of what the giver is learning about what works and what doesn't" are tools that givers of all sizes need at the ready. Effective givers willingly use openness to strengthen relationships between funders, communities, and collaborators, help mitigate redundancy, build consensus, and solve problems.

Buchanan also has a few dragons to slay, and Giving Done Right starts and ends with an exhortation for givers of all sizes to ignore the misguided lessons embraced by a new generation of wealthy donors. First and foremost is the assumption that nonprofits would be more effective if they were run like for-profit businesses. No one likes bloat or ineffectiveness, but as Buchanan notes, most nonprofits are bare-bones operations that rather miraculously squeeze water from the proverbial stone day in and day out. What's more, most for-profit businesses aren't as efficient as they'd have us believe, relying on a solitary metric — quarterly profit — to measure their success. In addition, Buchanan scolds those who see nonprofits' reliance on philanthropy as "dependency." Without philanthropic support, he writes, tongue firmly in cheek, how would a children's charity keep the lights on, by putting the kids to work? And in any case, he reminds us, the nonprofit sector overall generates nearly $1.7 trillion in annual revenue ($1 in every $10 of U.S. GDP), with 70 percent of that derived from fees and services.

Similarly, Buchanan has no patience for foundations that demand their nonprofit grantees spend time and money evaluating the impact of their services while being unwilling to fund such work, or that fixate on "overhead" as a measure of nonprofit effectiveness while too often ignoring the full-spectrum costs involved in delivering nonprofit services. And while he's willing to concede that what a successful business tycoon knows about getting rich might (might) provide some insight into how to be an effective philanthropist, it's more likely than not to cloud one's judgment. After all, if the world's problems could be solved by a vigorous application of business acumen, why haven't they been?

In Buchanan's view, givers are much more likely to be effective by taking the time to learn what they don't know and proceeding from there. Not everyone embraces that idea. As David Callahan's The Givers showed, the growth of big philanthropy in an era where government is less willing and less capable of affecting social change has become a hotly contested issue. In January, Buchanan, along with Rob Reich (co-director of Stanford's Center on Philanthropy and Civil Society), Ben Soskis (Center on Nonprofits and Philanthropy at the Urban Institute), and Anand Giridharadas (Winners Take All: The Elite Charade of Changing the World) engaged in a debate on Twitter during which they laid out their views with respect to the role of philanthropy in present-day America, its influence (both positive and negative) on our politics, and the tendency of Big Anything to generate a handful of winners and lots of losers. That debate is echoed in Giving Done Right, with Buchanan staking out a middle ground where philanthropy is celebrated as a reflection of American idealism and pluralism, where giving is good and smarter giving is better, and where the willingness of philanthropists and nonprofits (the unsung heroes of our more perfect union) to work together to solve seemingly intractable problems is to be commended.

-- Daniel X Matz

More of Daniel's book reviews touching on philanthropy, the arts, and the social sector, can be found on Philanthropy News Digest's Off the Shelf.

Don’t “Ghost” Declined Applicants: The Ins and Outs of Giving Applicant Feedback
April 4, 2019

Mandy Ellerton joined the [Archibald] Bush Foundation in 2011, where she created and now directs the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community problems, using approaches that are collaborative and inclusive of people who are most directly affected by the problem.

GlassPockets Road to 100

This post is part of our “Road to 100 & Beyond” series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the “Who Has GlassPockets?” self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, promising practices in transparency, helpful examples, and lessons learned.

I’ve often thought that fundraising can be as bad as dating. (Kudos to you lucky few who have had great experiences dating!) Lots of dates, lots of dead ends, lots of frustrating encounters before you (maybe) find a match. All along the way you look for even the smallest sign to indicate that someone likes you. “They laughed at my joke!” or, in the case of fundraising, “they seemed really excited about page five of last year’s impact report!” Not to mention the endless time spent doing online searches for shreds of information that might be useful. This reality is part of the reason why Bush Foundation was proud to be among the first 100 foundations to participate in GlassPockets. We believe that transparency and opening lines of communication are critical to good grantmaking, because both in dating and in fundraising, it can be heartbreaking and crazymaking to try to sort out whether you have a connection or if someone’s “just not that into you.” If only there were a way to just “swipe left” or “swipe right” and make everything a little simpler.

“We believe that transparency and opening lines of communication are critical to good grantmaking.”

I’m not proposing a Tinder for grantmaking (nor should anyone, probably, although hat tip to Vu Le for messing with all of us and floating the idea on April Fool’s Day). But over the past several years, Bush Foundation’s Community Innovation program staff has used a system to provide feedback calls for declined applicants, in the hopes of making foundation fundraising a little less opaque and crazymaking. We use the calls to be transparent and explain why we made our funding decisions. The calls also help us live out our “Spread Optimism” value because they allow us to help and encourage applicants and potentially point them to other resources. This is all part of our larger engagement strategy, described in “No Moat Philanthropy.”


Mandy Ellerton

How Feedback Calls Work

We use a systematic approach for feedback calls:

  • We proactively offer the opportunity to sign up for feedback calls in the email we send to declined applicants.
  • We use a scheduling tool (after trying a couple different options we’ve landed on Slotted, which is relatively cheap and easy to use) and offer a variety of times for feedback calls every week. Collectively five Community Innovation Team members hold about an hour a week for feedback calls. The calls typically last about 20 minutes. We’ve found this is about the right amount of time so that we can offer feedback calls to most of the declined applicants who want them.
  • We prepare for our feedback calls. We re-read the application and develop an outline for the call ahead of time.
  • During the call we offer a couple of reasons why we declined the application. We often discuss what an applicant could work on to strengthen their project and whether they ought to apply again.
  • We also spend a lot of time listening; sometimes these calls can understandably be emotional. Grant applications are a representation of someone’s hopes and dreams and sometimes your decline might feel like the end of the road for the applicant. But hang with them. Don’t get defensive. However hard it might feel for you, it’s a lot harder for the declined applicant. And ultimately, hard conversations can be transformative for everyone involved. I will say, however, that most of our feedback calls are really positive exchanges.
  • We use anonymous surveys to evaluate what people think of the feedback calls and during the feedback call we ask whether the applicant has any feedback for us to improve our programs/grantmaking process.
  • We train new staff on how to do feedback calls. We have a staff instruction manual on how to do feedback calls, but we also have new team members shadow more seasoned team members for a while before they do a feedback call alone.


What’s Going Well

The feedback calls appear to be useful for both declined applicants and for us:

  • In our 2018 surveys, respondents (n=38) rated the feedback calls highly. They gave the calls an average rating of 6.1 (out of 7) for overall helpfulness, 95% said the calls added some value or a lot of value, and 81.2% said they had a somewhat better or much better understanding of the programs after the feedback call.
  • We’ve seen the number of applications for our Community Innovation Grant and Bush Prize for Community Innovation programs go down over time and we’ve seen the overall quality go up. We think that’s due, in part, to feedback calls that help applicants decide whether to apply again and that help applicants improve their projects to become a better fit for funding in the future.
  • I’d also like to think that doing feedback calls has made us better grantmakers. First, it shows up in our selection meetings. When you might have to talk to someone about why you made the funding decision you did, you’re going to be even more thoughtful in making the decision in the first place. You’re going to hew even closer to your stated criteria and treat the decision with care. We regularly discuss what feedback we plan to give to declined applicants in the actual selection meeting. Second, in a system that has inherently huge power differentials (foundations have all of it and applicants have virtually none of it), doing feedback calls forces you to come face to face with that reality. Never confronting the fact that your funding decisions impact real people with hopes and dreams is a part of what corrupts philanthropy. Feedback calls keep you a little more humble.


What We’re Working On

We still have room to improve our feedback calls:

  • We’ve heard from declined applicants that they sometimes get conflicting feedback from different team members when they apply (and get declined) multiple times; 15% of survey respondents said their feedback was inconsistent with prior feedback from us. Cringe. That definitely makes fundraising more crazymaking. We’re working on how to have more staff continuity with applicants who have applied multiple times.
  • We sometimes struggle to determine how long to keep encouraging a declined applicant to improve their project for future applications versus saying more definitively that the project is not a fit. Yes, we want to “Spread Optimism,” but although it never feels good for anyone involved, sometimes the best course of action is to encourage an applicant to seek funding elsewhere.

I’m under no illusions that feedback calls are going to fix the structural issues with philanthropy and fundraising. I welcome that larger conversation, driven in large part by brave critiques of philanthropy emerging lately like Decolonizing Wealth, Just Giving and Winners Take All. In the meantime, fundraising, as with dating, is still going to have moments of heartache and uncertainty. When you apply for a grant, you have to be brave and vulnerable; you’re putting your hopes and dreams out into a really confusing and opaque system that’s going to judge them, perhaps support them, or perhaps dash them, and maybe even “ghost” them by never responding. Feedback calls are one way to treat those hopes and dreams with a bit more care.

--Mandy Ellerton

A New Year, a New Transparency Indicator: Coming Soon—Transparency Values & Policies
January 3, 2019

Janet Camarena is director of transparency initiatives at Foundation Center.

When GlassPockets started nine years ago, it was rare to find any reference to transparency in relation to philanthropy or foundations. Most references to transparency at the time concerned nonprofits or governments, seldom philanthropy. When we set out to create a framework to assess foundation transparency, the “Who Has GlassPockets?” criteria were based on an inventory of current foundation practices, meaning there were no indicators on the list that were not already being shared somewhere by at least a few foundations. Not surprisingly, given the lack of emphasis on foundation transparency, there were few mentions of it as a policy or even as a value on the websites we reviewed, so it didn’t make sense at the time to include it as a formal indicator.

A lot has changed in nine years, and it’s clear now from reviewing philanthropy journals, conferences, and, yes, even foundation websites that awareness of the importance of philanthropic transparency is on the rise. Among the nearly 100 foundations that have taken and publicly shared “Who Has GlassPockets?” transparency assessments, more than 40 percent now use their websites to communicate values or policies that demonstrate an intentional commitment to transparency. Another encouraging signal, demonstrating that how the work is done is as important as what is done, is the emergence of “How We Work” pages that outline not just what these foundations do but how they aim to go about it. These statements can be found among funders of all types, including large, small, family, and independent foundations.

We want to encourage this intentionality around transparency, so in 2019 we are adding a new transparency indicator asking whether participating foundations have publicly shared values or policies committing themselves to working openly and transparently. In late January, the “Who Has GlassPockets?” self-assessment and profiles will be updated to reflect the new addition. Does your foundation’s website have stated values or policies about its commitment to transparency? If not, below are some samples we have found that may serve as inspiration for others:

  • The Barr Foundation’s “How We Work" page leads with an ethos stating “We strive to be transparent, foster open communication, and build constructive relationships,” and elaborates further on its field-building potential: “We aim to be open and transparent about our work and to contribute to broader efforts that promote and advance the field of philanthropy.”

  • The Samuel N. and Mary Castle Foundation’s Mission and Core Values page articulates a long list of values that “emerge from the Foundation’s long history,” including a commitment to forming strategic alliances, working honestly, “showing compassion and mutual respect among grantmakers and grantees,” and ties its focus on transparency to a commitment to high standards and quality: “The Foundation strives for high quality in everything it does so that the Foundation is synonymous with quality, transparency and responsiveness.”

  • The Ford Foundation’s statement connects its transparency focus to culture, values around debate and collaboration, and a commitment to accountability: “Our culture is driven by trust, constructive debate, and leadership that empowers innovation and excellence. We strive to listen and learn and to model openness and transparency. We are accountable to each other at the foundation, to our charter, to our sector, to the organizations we support, and to society at large—as well as to the laws that govern our nonprofit status.”

  • An excerpt from the Bill and Melinda Gates Foundation’s “Information Sharing Approach” page emphasizes collaboration, peer learning, and offers an appropriately global view: “Around the world, institutions are maximizing their impact by becoming increasingly transparent. This follows a fundamental truth: that access to information and data fosters effective collaboration. At the foundation, we are embracing this reality through a continued commitment to search for opportunities that will help others understand our priorities better and what supports our decision making. The foundation is also committed to helping the philanthropic sector develop the tools that will increase confidence in our collective ability to address tough challenges around the world… We will continually refine our approach to information sharing by regularly exploring how we increase access to important information within the foundation, while studying other institutional efforts at transparency to learn lessons from our partners and peers.”

  • The Walter and Elise Haas Fund connects its transparency focus to its mission statement, and its transparency-related activities to greater effectiveness: “Our ongoing commitment to transparency is a reflection of our mission — to build a healthy, just, and vibrant society in which people feel connected to and responsible for their community. The Walter & Elise Haas Fund shares real-time grants data and champions cross-sector work and community cooperation. Our grantmaking leverages partnerships and collaborations to produce results that no single actor could accomplish alone.”

  • The William and Flora Hewlett Foundation’s statement emphasizes the importance of transparency in creating a culture of learning: “The foundation is committed to openness, transparency and learning. While individually important, our commitments to openness, transparency, and learning jointly express values that are vital to our work. Because our operations—both internal and external—are situated in complex institutional and cultural environments, we cannot achieve our goals without being an adaptive, learning organization. And we cannot be such an organization unless we are open and transparent: willing to encourage debate and dissent, both within and without the foundation; ready to share what we learn with the field and broader public; eager to hear from and listen to others. These qualities of openness to learning and willingness to adjust are equally important for both external grantmaking and internal administration.”

These are just a few of the examples GlassPockets will have available when the new indicator is added later this month. Keep an eye on our Twitter feed for updates.

Happy New Year, Happy New Transparency Indicator!

--Janet Camarena

Evolving Towards Equity, Getting Beyond Semantics
December 17, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

In my previous post, I reflected on The California Endowment’s practice of conducting a Diversity, Equity, and Inclusion (DEI) Audit and how it helps us to stay accountable to intentionally integrating and advancing these values across the foundation.

We started this practice with a “Diversity and Inclusion” Audit in 2008 and as part of our third audit in 2013, The California Endowment (TCE) adjusted the framing to a “Diversity, Equity, and Inclusion” Audit. This allowed us to better connect the audit with how the foundation viewed the goals of our strategy and broadened the lens used through the audit process.

While this could be viewed as a semantic update reflecting changes in the nonprofit and philanthropic sectors, by 2016 our audit results reflected how TCE described both our core values, which lead with principles of DEI, and the ultimate outcome of our work, which points toward health equity and justice for all. And although we didn’t make a corresponding change to what the audit specifically assesses, select findings from our most recent audit highlight how not only diversity but also equity is being operationalized within the foundation.

Getting beyond the numbers

In some ways, the most straightforward entry point for DEI discussions is to first examine diversity by assessing quantitative representation within the foundation at the board and staff level, and among our partners, contractors, vendors, and investment managers. This is a necessary beginning, but reporting and reflection cannot stop with counting heads. While our audit may have started as a way to gauge inclusion through the lens of diversity, it’s become clear that collecting and examining demographic data sets the stage for the critical conversations that follow.

Part of the inherent value of reflecting on diversity and representation is in service of getting beyond the numbers to discover what questions the numbers inspire. Questions such as:

  • Who’s missing or overrepresented and why?
  • What implications could the gaps in lived experiences have on the foundation, the strategies used and how our work is conducted?
  • What are the underlying structures and systems that shape the demographics of the foundation and of the organizations with which we partner?

It’s these types of questions about our demographics and diversity that help move us beyond discussions about representation into deeper discussions about equity.

The audit has been a valuable point of reflection and action planning over the past several years. It’s a comprehensive process conducted in partnership with the evaluation firm SPR that spans an extensive number of sources.

Towards Equity and Inclusion

As TCE pursues our health equity goals, we’ve been able to define and distinguish key differences between diversity, equity, and inclusion. While diversity examines representation, we define equity as promoting fair conditions, opportunities, and outcomes. We also define inclusion as valuing and elevating the perspectives and voices of diverse communities so that they are considered where decisions are made. For future audits, we’re looking to refine our DEI audit goals to focus more explicitly on equity and inclusion across our grantmaking efforts and to examine our internal policies, practices, and operations even more deeply. In the meantime, here are a few examples from our latest audit that highlight how equity and inclusion currently show up across the foundation outside of our grantmaking.

Equity in hiring

  • Recognizing the impact of structural racism and mass incarceration, TCE followed the lead of partners working to “ban the box” and the Executives’ Alliance for Boys and Men of Color to change hiring practices. TCE now utilizes a Fair Chance Hiring Policy that opens the door for hiring qualified applicants with a conviction or an arrest and shares open positions with anti-recidivism organizations.

Inclusion and equity in investments

  • In the spirit of inclusion, the criteria for our Program Related Investments (PRIs) integrate whether the PRI will engage the community it is intended to benefit as well as whether the investment will address a known health inequity or social determinant of health.
  • In recognition of structural racism leading to higher rates of incarceration within communities of color, in 2015 TCE announced that we will no longer invest in companies profiting from for-profit prisons, jails, or detention centers.

Equity in vendor selection

  • Operationalizing equity also requires considering how facility operations align with organizational values. In line with our divestment from for-profit prisons, an RFP process identified a vendor-nonprofit team that encouraged hiring formerly incarcerated and homeless community members within our onsite café. We remain committed to this approach.

The Work Ahead

We’ve accomplished a great deal. At the same time, as we evolve toward becoming an equity organization, there are areas where we need to focus more of our attention.

To move beyond articulating values and toward deeper staff engagement, audit feedback suggests that more staff resources are needed to connect individual functions and roles to our DEI values, including through our performance review process, particularly among non-program staff.

Connected to developing a greater shared vision regardless of department affiliation, we will soon engage staff across the entire organization to develop a more deeply shared racial equity analysis of our work. As part of this effort, our board is participating in racial equity trainings and has adopted a resolution to utilize a racial equity lens as the foundation develops our next strategic plan. Building on what we’re learning through our audits, in 2019 we’ll launch this effort toward becoming a racially equitable health foundation, one that intentionally brings racial equity to the center of our work and how we operate.

Finally, as we continue to partner with and support communities fighting for equity, there are several unanswered, pressing questions we’ll need to tackle within the walls of the foundation:

  • How do we hold ourselves to the same equity and inclusion principles that our partners demand of system leaders?
  • How do we confront the contradictions of operating as an organization rooted in a corporate, hierarchical design while sharing power with staff regardless of position, increasing decision-making transparency, and including those impacted by pending decisions, in the same way we ask our systems leaders to include and respond to community?
  • With an interest in greater accountability to equity and inclusion, how do we not only tend to power dynamics but consider greater power sharing through foundation structures and current decision-making bodies both internally and externally?

Herein lies our next evolutionary moment.

--Mona Jhawar

Putting a Stop to Recreating the Wheel: Strengthening the Field of Philanthropic Evaluation
December 13, 2018

Clare Nolan is Co-Founder of Engage R+D, which works with nonprofits, foundations, and public agencies to measure their impact, bring together stakeholders, and foster learning and innovation.

Meg Long is President of Equal Measure, Philadelphia-based professional services nonprofit focused on helping its clients—foundations, nonprofit organizations, and public entities—deepen and accelerate social change.

Clare Nolan

In 2017, Engage R+D and Equal Measure, with support from the Gordon and Betty Moore Foundation, launched an exploratory dialogue of funders and evaluators to discuss the current state of evaluation and learning in philanthropy, explore barriers to greater collaboration and impact, and identify approaches and strategies to build the collective capacity of small and mid-sized evaluation firms. Our goal was to test whether there was interest in our sector in building an affinity network of evaluation leaders working with and within philanthropy. Since our initial meeting with a few dozen colleagues in 2017, our affinity network has grown to 250 individuals nationally, and there is growing momentum for finding ways funders and evaluators can work together differently to deepen the impact of evaluation and learning on philanthropic practice.

At the recent 2018 American Evaluation Association (AEA) conference in Cleveland, Ohio, nearly 100 funders and evaluators gathered to discuss four action areas that have generated the most “buzz” during our previous network convening at the Grantmakers for Effective Organizations (GEO) conference and from our subsequent network survey:

1. Improving the application of evaluation in philanthropic strategy and practice.

2. Supporting the sharing and adaptation of evaluation learning for multiple users.

3. Supporting formal partnerships and collaborations across evaluators and evaluation firms.

4. Strengthening and diversifying the pipeline of evaluators working with and within philanthropy.

Meg Long

We asked participants to choose one of these action areas and join the corresponding large table discussion to reflect on what they have learned about the topic and identify how the affinity network can contribute to advancing the field. Through crowdsourcing, participants identified key ways in which the action teams launching in early 2019 can add value to the field.

1. What will it take to more tightly connect evaluation with strategy and decision-making? Provide more guidance on what evaluation should look like in philanthropy.

Are there common principles, trainings, articles, case studies, guides, etc. that an action team could identify and develop? Could the affinity network be a space to convene funders and evaluators that work in similar fields to share evaluation results and lessons learned?

2. What will it take to broaden the audience for evaluations beyond individual organizations? Create a “market place” for knowledge sharing and incentivize participation.

As readers of this blog will know from Foundation Center’s #OpenForGood efforts, there is general agreement around the need to do better at sharing knowledge, building evidence, and being willing to share what foundations are learning – both successes and failures. How can an action team support the creation of a culture of knowledge sharing through existing venues and mechanisms (e.g., IssueLab, Evaluation Roundtable)? How could incentives be built in to support transparency and accountability?

3. How can the field create spaces that support greater collaboration and knowledge sharing among funders and evaluators? Identify promising evaluator partnership models that resulted in collaboration and not competition.

Partnerships have worked well where there are established relationships and trust, and where power dynamics are minimized. How can an action team identify promising models and practices for successful collaborations in which competition is not the dominant dynamic? How can teams establish shared values, goals, and practices to further collaboration?

4. What will it take to create the conditions necessary to attract, support, and retain new talent? Build upon existing models to support emerging evaluators of color and identify practices for ongoing guidance and mentorship.

Recruiting, hiring, and retaining talent to fit evaluation and learning needs in philanthropy is challenging, due both to gaps in education and training programs and to changing expectations in the field. How can we leverage and build on existing programs (e.g., the AEA Graduate Education Diversity Internship and Leaders in Equitable Evaluation and Diversity) to increase the pipeline and support ongoing retention and professional development?

Overall, we are delighted to see that there is much enthusiasm in our field to do more work on these issues. We look forward to launching action teams in early 2019 to further flesh out the ideas shared above in addition to others generated over the past year.

If you are interested in learning more about this effort, please contact Pilar Mendoza. If you would like to join the network and receive updates about this work, please contact Christine Kemler.

--Clare Nolan and Meg Long

Living Our Values: Gauging a Foundation’s Commitment to Diversity, Equity, and Inclusion
November 29, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

The California Endowment (TCE) recently wrapped up our 2016 Diversity, Equity, and Inclusion (DEI) Audit, our fourth since 2008. The audit was initially developed at a time when community advocates were pushing the foundation to address issues of structural racism and inequity. As TCE’s grantmaking responded, staff and our CEO were also interested in promoting DEI values across the entire foundation beyond programmatic spaces. Over time, these values became increasingly engrained in TCE’s ethos and the foundation committed to conducting a regular audit as a vehicle with which to determine if and how our DEI values were guiding organizational practice.

Sharing information about our DEI Audit often raises questions about how to launch such an effort. Some colleagues are in the early stages of considering whether they want to carry out an audit of their own. Are we ready? What do we need to have in place to even begin to broach this possibility? Others are interested to hear about how we use the findings from such an assessment. To help answer these questions, this is the first of a two-part blog series to share the lessons we’re learning by using a DEI audit to hold ourselves accountable to our values.

While the audit provides a frame to identify if our DEI values are being expressed throughout the foundation, it also fosters learning. Findings are reviewed and discussed with executive leadership, board, and staff. Reviews provide venues to involve both programmatic and non-programmatic staff in DEI discussions. An audit workgroup typically considers how to take action on findings so that the foundation can continuously improve and also considers how to revise audit goals to ensure forward movement. By sharing findings publicly, we hope our experience and lessons can help to support the field more broadly.

It is, however, no small feat. The audit is a comprehensive process that includes a demographic survey of staff and board, a staff and board survey of DEI attitudes and beliefs, interviews with key foundation leaders, examining available demographic data from grantee partners as well as a review of DEI-related documents gathered in between audits. Having dedicated resources to engage a neutral outsider to carry out the audit in partnership with the foundation is also important to this process. We’ve found it particularly helpful to engage with a consistent trusted partner, Social Policy Research Associates, over each of our audits to capture and candidly reflect where we’re making progress and where we need to work harder to create change.

As your foundation considers your own readiness to engage in such an audit process, we offer the following factors that have facilitated a productive and learning oriented DEI audit effort at TCE:

1. Clarity about the fundamental importance of Diversity, Equity, and Inclusion to the Foundation

The expression of our DEI values has evolved over time. When the audit started, several program staff members who focused on DEI and cultural competency developed a guiding statement on Diversity and Inclusiveness. Located within our audit report, it focused heavily on diversity although tweaks were made to the statement over time. A significant shift occurred several years ago when our executive team articulated a comprehensive set of core values that undergirds all our work and leads with a commitment to diversity, equity, and inclusion.

2. Interest in reflection and adaptation

The audit is a tool for organizational learning that facilitates continuous improvement. The process relies on having both a growth mindset and clear goals for what we hope to accomplish. Our 13 goals range from board engagement to utilizing accessibility best practices. In addition to examining our own goals, the audit shares how we’re doing with respect to a framework of institutional supports required to build a culture of equity. By comparing the foundation to itself over time we can determine if and where change is occurring. It also allows us to revise goals so that we can continue to push ourselves forward as we improve, or to course correct if we’re not on track. We anticipate updating our goals before our next audit to reflect where we are currently in our DEI journey.

3. Engagement of key leaders, including staff

Our CEO is vocal and clear about the importance of DEI internally and externally, as well as about the significance of conducting the audit itself. Our executive team, board, and CEO all contribute to the audit process and are actively interested in reviewing and discussing its findings.

Staff engagement is critical throughout audit implementation, reflection on findings, and action planning as well. It’s notable that the vast majority of staff at all levels feel comfortable pushing the foundation to stay accountable to DEI internally. However, a small but growing percentage of staff (23%) report feeling uncomfortable raising DEI concerns in the workplace, suggesting an area for greater attention.

4. Capacity to respond to any findings

Findings are not always going to be comfortable. Identifying areas for improvement may put the organization and our leaders in tough places. TCE has historically convened a cross departmental workgroup to consider audit findings and tackle action planning. We considered co-locating the audit workgroup within our executive leadership team to increase the group’s capacity to address audit findings. However, now we are considering whether it would be best situated and aligned within an emerging body that will be specifically focused on bringing racial equity to the center of all our work.

5. Courage and will to repeat

In a sector with limited accountability, choosing to voluntarily and publicly examine foundation practices takes real commitment and courage. It’s always great to hear where we’re doing well, but committing to a process that also raises multiple areas needing more attention requires deep will to repeat on a regular basis. And we do so in recognition that this is long-term, ongoing work that, in the absence of a real finish line, requires us to continuously adapt as our communities evolve.

Conducting our DEI audit regularly has strengthened our sense of where our practice excels—for example in our grantmaking, possessing a strong vision and authorizing environment, and diversity among staff and board. It’s also strengthened our sense of the ways we want to improve such as developing a more widely shared DEI analysis and trainings for all staff as well as continuing to strengthen data collection among our partners. The value of our DEI audit lies equally in considering findings as well as being a springboard for prioritizing action. TCE has been on this road a long time and we’ll keep at it for the foreseeable future. As our understanding of what it takes to pursue diversity, equity, and inclusion internally and externally sharpens, so will the demands on our practice. Our DEI audit will continue to ensure that we hold ourselves to these demands. In my next post, we’ll take a closer look at what we’re learning about operationalizing equity within the foundation.

--Mona Jhawar

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org
