Transparency Talk

Category: "Feedback" (64 posts)

Ask, Listen, Act. Embedding Community Voices into our Brand.
September 12, 2019







Zeeba Khalili

Zeeba Khalili is the Learning and Evaluation Officer at Marguerite Casey Foundation.

Los Angeles, California; Baltimore, Maryland; Mobile, Alabama; Rapid City, South Dakota; El Paso, Texas; and Yakima, Washington. In 2002, as part of our founding, Marguerite Casey Foundation convened listening circles in these six cities, chosen to reflect a diversity of regional, generational, cultural, ethnic and socioeconomic perspectives, to hear the voices of more than 600 families. The Foundation posed the same three questions in each listening circle:

  • What creates strong families and children?
  • What would it take to change the systems that have an impact on the lives of families and children?
  • How would you leverage $30 million to ensure the well-being of children, families and communities?

Though these six hundred voices spoke of diverse needs, in many ways we discovered they spoke as one. They called for respecting and valuing families; empowering families and holding them at the center of systems of care; promoting grassroots activism and leadership; collaborating across agencies and systems; changing unresponsive policies; and galvanizing public will to support families in ways that help avoid crises and ultimately lead away from dependence on systems.

We didn’t convene listening circles just to check a box of community involvement. Hearing stories and ideas directly from communities allowed us to build a Foundation that challenged preconceived notions about the “best” way to support families and end poverty. What we heard became the framework for the Foundation’s mission and strategy, grounded in listening to communities’ concerns as articulated by community leaders and taking action informed by families’ voices. We committed to Ask, Listen, Act, making it our brand promise, and one of our forms of philanthropic transparency. The Foundation grounds its decisions in what we’ve heard from our constituencies, both grantees and families, and we make our learnings public so that other groups can learn from the work we’ve already done.

Today, Marguerite Casey Foundation’s grantmaking echoes the sentiments heard seventeen years ago. We provide sizeable, multi-year general operating support grants to grassroots activism and advocacy organizations. We invest in Equal Voice networks, regionally and nationally, facilitated by network weavers who help grantees collaborate across issues, form alliances and bring about long-term change.

Additionally, the Foundation lifts the voices of low-income families into the national dialogue through our Equal Voice News online platform, harnessing the power of storytelling about families leading change in their communities. Program officers, closest to the grantees, connect the Foundation’s communications team to the families on the ground and elevate their experiences so that others can learn from them. For example, in July, the Foundation chronicled a Black farming community in rural Georgia, once thought to be vanishing, that remains steadfast in its efforts to fight Black land loss and food-related disparities. This story supports the work of Southwest Georgia Project for Community Education, a grantee of the Foundation, serving as a tool in fundraising and in garnering greater media attention.

Ask, Listen, Act allows us to engage the community for both our benefit and theirs. Its methodology can be seen across the Foundation, including in how we learn from our grantees. Every few years we commission the Center for Effective Philanthropy (CEP) to conduct a Grantee Perception Report survey of our grantees to assess our impact and interactions. These reports create a genuine opportunity for the Foundation to reflect on our strategies. In the past, based on feedback, we identified two key areas to improve: consistency of communications and assistance beyond grant dollars. We created cross-regional teams of Program Officers to ensure that grantees could always reach someone with questions or concerns, and we provided several grantees in the South with technical assistance funding to grow their financial and governance infrastructures.

We hold ourselves accountable to the six hundred families that came together from across the country in 2002 to help us with our founding. Their unique circumstances and breadth of perspectives continue to be heard today in the communities we serve, shared with us by our grantees, and so the Foundation’s approach remains steadfast. We hope that the philanthropic community will recognize that the constituencies we serve deserve to be listened to and more than that, deserve to be experts of their own lives.

--Zeeba Khalili

Meet Our #OpenForGood Award Winner: An Interview with Lee Alexander Risby, Head of Effective Philanthropy & Savi Mull, Senior Evaluation Manager, C&A Foundation
June 19, 2019





Lee Alexander Risby

This post is part of the Glasspockets’ #OpenforGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

C&A Foundation is a European foundation that supports programs and initiatives to transform fashion into a fair and sustainable industry that enables everyone – from farmer to factory worker – to thrive. In this interview, Lee Alexander Risby and Savi Mull share insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the C&A Foundation and how this practice has helped you to advance your work?





Savi Mull

Savi Mull: For almost five years, C&A Foundation has been dedicated to transforming the fashion industry into a force for good. A large part of that work includes instilling transparency and accountability in supply chains across the industry. From the start, we also wanted to lead by example by being transparent and accountable as an organization, sharing what we were learning whilst on this journey, being true to our work and helping the rest of the industry learn from our successes and failures.

Lee Alexander Risby: Indeed, from the beginning, we made a commitment to be open about our results and lessons by publishing evaluations on our website and dashboards in our Annual Reports. After all, you cannot encourage the fashion industry to be transparent and accountable and not live by the same principles yourself. Importantly, our commitment to transparency has always been championed both by our Executive Director and our Board.

Savi: To do this, over the years we have put many processes in place.  For example, internally we use after-action reviews to gather lessons from our initiatives and allow our teams to discuss honestly what could have been done better in that program or partnership.  We also do third party, external evaluations of our initiatives, sharing the reports and lessons learned. This helps us and our partners to learn, and it informs initiatives and strategies going forward.

The Role of Evaluation Inside Foundations

GP: Your title has the word “evaluation” in its name and increasingly we are seeing foundations move toward this staffing structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

SM: I believe it is essential to have this type of function in a foundation to drive formal learning from and within programs. But at the same time, it is an ongoing process that cannot be driven by one function alone. All staff need to be responsible for the learning that makes philanthropy effective – not just evaluators.

LAR: To begin, we were deliberate in building a team of evaluation professionals to promote accountable learning. We started hiring slowly and built the team over time. What I looked for with each new member of the team, and am always looking for, is an evaluator with more than just technical skills; they also need the influencing, listening, communication and negotiating skills to help others learn. Evaluations have little effect without good internal and external communication.

”For us, it was important to be a critical friend, listener, and enabler of learning and not the police.”

The evaluation function itself has also evolved over the last five years. It started off as a monitoring, evaluation and learning (MEL) function and is now Effective Philanthropy. From the start, the function was not set up as an independent department but was created to help programmatic teams design appropriate monitoring and evaluation for their programs, and to serve as a facilitator and advisor on strategy. However, it has not always been a straightforward process from the inside. In the first years, we had to spend a lot of time explaining and persuading staff of the need for evaluation, transparency and learning, and the benefits of doing so. We wanted to avoid a strong independent evaluation function, as that can reduce learning by placing too much emphasis on accountability. For us, it was important to be a critical friend, listener, and enabler of learning and not the police.

SM: So, the first bit of advice is that evaluators should be supportive listeners, assisting programmatic teams throughout the design and implementation phases to get the best results possible. They should not come in just at the end of an initiative to do an evaluation.

LAR: The second piece of advice is on the positioning, support, and structure of evaluation within a foundation. Firstly, it is critical to have the buy-in of the leadership and board for both evaluation and transparency. And secondly, the evaluation function must be part of the management team and report to the CEO or Executive Director. This gives reporting and learning the appropriate support structure and importance.

The third piece of advice is to consider not creating an evaluation function, but an effective philanthropy function. Evaluation is done for learning, and learning drives effectiveness in grant-making for better results and long-term impacts on systems.

SM: The final piece of advice is to take guidance from others outside your organization. The whole team has consulted broadly with former colleagues and mentors from across the evaluation community as well as experienced philanthropic professionals. Remember you are part of a field with peers whose knowledge and experience can help guide you.

Opening Up Pain Points

GP: One of the reasons the committee selected C&A Foundation to receive the award is because of your institutional comfort level with sharing not just successes, but also being very forthright about what didn’t work. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected C&A Foundation’s reputation in the field, and why it’s worth it?

LAR: I would say this. The question for foundation boards and leaders is straightforward: do you want to be more effective and have an impact? The answer to that will always be yes, but it is dependent on learning and sharing across the organization and with others. If we do not share evaluations, research or experiences, we do not learn from each other and we cannot be effective in our philanthropic endeavors.

"There is a benefit to being open, you build trust and integrity – success and failure is part of all of us."

The other question for boards and leaders is: who does philanthropy serve? For us, we want to transform the fashion industry, which is made up of cotton farmers, workers in spinning mills and cut and sew factories, consumers and entrepreneurs, to name a few – they are our public. As such we have the duty to be transparent to the public about where we are succeeding and where we have failed and how we can improve. We do not think there is a reputation risk. In fact, there is a benefit to being open, you build trust and integrity – success and failure is part of all of us.

SM: Adding to what Lee has said, being open about our failures helps not only us but the entire field. Some of our partners have felt reticent about our publishing evaluations, but we always reassure them and stress from the beginning of an evaluation process that it is an opportunity to understand how they can improve their work and how we can improve our partnership, as well as a chance to share those lessons more broadly.

Learning While Lean

GP: Given the lean philanthropy staffing structures in place at many corporate foundations, do you have any advice for your peers on how those without a dedicated evaluation team might still be able to take some small steps to sharing what they are learning?

SM: Learning is a continuous process. In the absence of staff dedicated to evaluation, take baby steps within your power, such as implementing after-action reviews, holding thematic webinars, or doing quick summaries of lessons from grants and/or existing evaluations from others. If the organization’s leadership endorses learning, these small steps are a good place to start.

GP: And speaking of lean staffing structures, a concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does C&A Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

SM: The foundation has a Monitoring and Evaluation Policy that lays out the role of the programmatic staff and partners as well as of the dedicated Effective Philanthropy Team. C&A Foundation partners are generally responsible for the design and execution of self-evaluation - to be submitted at the end of the grant period. External evaluation budgets are covered by the foundation and do not pose a financial burden on partners at all. They are included in the overall cost of an initiative, and when needed we have an additional central evaluation fund that is used to respond to the programmatic team’s and partner’s ad hoc demands for evaluations and learning.

The Effective Philanthropy team does provide technical assistance to partners and foundation staff upon request. The guidance ranges from technical inputs related to the theory of change development to the design of baseline and mid-line data collection exercises. The theory of change work has been really rewarding for partners and ourselves. We all enjoy that part of the work.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your work.

Learning Leads to Effectiveness

LAR: In moving from a more traditional MEL approach to effective philanthropy, we looked at the work of other foundations. This included learning from the William and Flora Hewlett Foundation, the Rockefeller Foundation, and others. We had discussions with a number of peers in the field. We also asked Nancy MacPherson (formerly Managing Director of Evaluation at Rockefeller) and Fay Twersky (Director of Effective Philanthropy at Hewlett) to review our Effective Philanthropy strategy when it was under development. Their feedback and advice helped a lot. In the end, we decided to build out the function in a similar way to the Hewlett Foundation, with some differences. For example, our evaluation practice is currently positioned at a deeper initiative level, which reflects a field context in which there is a significant evidence gap across the fashion industry that needs to be filled. Concomitant to this is our emphasis on piloting and testing, which goes hand-in-hand with the demand for evaluative thinking, reporting, and learning.

Our team has also been influenced by our own successes and failures from previous roles. That has also inspired us to embrace a slightly different approach.

SM: In terms of where we are at the moment, we still oversee performance monitoring, evaluation, and support to the program teams in developing theories of change and KPIs; but we are also building out an organizational learning approach and are in the process of hiring a Senior Learning Manager. Lastly, we are piloting our organizational and network effectiveness work in Brazil, which is being led by a colleague who joined the foundation last year.

LAR: We are also in the midst of an Overall Effectiveness Evaluation (OEE) of C&A Foundation’s first 5-year strategy. In general, this is not a type of evaluation that foundations use much. As well as looking at results, the evaluators are evaluating the whole organization, including Effective Philanthropy. For me as an evaluator, it has been really rewarding to be on the other side of a good question.

We are learning from the OEE as we go along, and we decided to create ongoing opportunities for reporting and feedback from the process rather than waiting until the very end for a report. This means that program staff can be engaged in proactive discussions about performance and emerging lessons in a timely way. The OEE is already starting to play a vital role in informing the development of the next 5-year strategy and our organization. But you will surely hear more on that evaluation process later, as it will be published. There is always room for improvement, and learning never stops.

--Lee Alexander Risby and Savi Mull

Opening Up Emerging Knowledge: New Shared Learning from IssueLab
May 23, 2019

Janet Camarena is the director of transparency initiatives at Candid.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


Though it’s hard to believe, we are already almost halfway through 2019! Given that midpoints are often a time to reflect and take stock, it seemed like good timing to mine the knowledge the field has shared in IssueLab and highlight a few of the reports and lessons learned that our GlassPockets foundations have published over the last six months. Scanning the recent titles, some themes immediately jumped out at me as a focus of research across the field: racial and gender equity, global trends, and impact measurement.

This is also a good reminder that IssueLab helps make your knowledge discoverable. Though I’m highlighting seven recent publications here, I only had to visit one website to find and freely download them. Acting as a “collective brain” for the field, IssueLab organizes the social sector’s knowledge so we can all have a virtual filing cabinet that makes this knowledge readily available. If it’s been a while since you uploaded your knowledge to IssueLab, you can add any of your publications to our growing library here. It’s a great way to make your knowledge discoverable, mitigate the knowledge fragmentation in the field, and make your foundation live up to being #OpenForGood.

And, speaking of #OpenForGood, our inaugural awards designed to encourage more knowledge sharing across the field will be announced at the upcoming GEO Learning Conference during lunch on May 29th. If you will be at GEO, join us to learn who the #OpenForGood knowledge sharing champions will be! And remember, if you’ve learned something, share something!

Opening Up Evaluations & Grantee Reports

“It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than just the data that the funder might need.”

Foundations pilot initiatives all the time, but do they share what they learned from them once the evaluation is all said and done? And what about all the potentially helpful data filed away in grantee reports? This first cluster of new reports opens up this kind of knowledge:

  • Creative City (Published by Animating Democracy, Funded by the Barr and Boston Foundations, April 2019) The Creative City pilot program, created by the New England Foundation for the Arts in partnership with the Barr Foundation, supported artists of all disciplines in creating art in Boston that would drive public imagination and community engagement. Artists, funders, and administrators alike will find much to learn from this report about how to rethink arts in the context of people and place. One compelling example is the Lemonade Stand installation, created by artists Elisa H. Hamilton and Silvia Lopez Chavez, which made the rounds of many Boston neighborhoods and attracted many people with its bright yellow kiosk glow. Though it looked on the surface like a lemonade stand, it was actually an art installation inviting the community to connect by exchanging stories about how they turned lemons into lemonade.
  • Giving Refugees A Voice: Independent Evaluation (Published by MacroScope London, Funded by the C&A Foundation, March 2018-February 2019) The C&A Foundation supported the Giving Refugees a Voice initiative, designed to improve working conditions for Syrian and other refugees in the Turkish apparel sector. The pilot used social media monitoring technology to analyze the public Facebook posts of millions of refugees associated with the apparel sector in Turkey, with the aim of galvanizing brands, employers, and others to take actions and make changes that would directly improve working conditions for Syrian people in Turkey. This impact report forthrightly reveals that though the social media analysis was an innovative way to document the scale of Syrians working informally in the Turkish apparel industry, the pilot fell short of its goals: there was no evidence that the analysis led to improved working conditions. Rather than keep such a negative outcome quiet, the C&A Foundation publicly released its findings and, earlier this year, published a blog summary outlining the results, what it learned from them, and what would be helpful for stakeholders and partners to know, all in an easy-to-read outline.
  • Grantee Learnings: Disability (Published by The Ian Potter Foundation, December 2018) The information documented in this publication has been taken from the final reports of disability-serving grantees, submitted to The Ian Potter Foundation following the completion of their projects. The Foundation routinely shares grantee learnings for each of its portfolios as a way to support shared learning among its existing and future grantees, and this is the most recent of these. The report is arranged so that other disability services providers can benefit from the hard-won lessons of their peers in likely areas of shared challenge such as staffing, program planning, working with parents and partners, scaling, evaluation measurement, and technology use. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than just the data that the funder might need.

Lessons Learned from Scholarship & Fellowship Funding

Donors looking to make a difference using scholarships and student aid to improve diversity, equity, and inclusion have two new excellent sources of knowledge available to them:

  • Delivering on the Promise: An Impact Evaluation of the Gates Millennium Scholars Program (Published by American Institutes for Research, Funded by the Bill & Melinda Gates Foundation, May 2019) This report shares findings from an impact evaluation of the Gates Millennium Scholars (GMS) program and reflects on findings from implementation evaluations conducted on the program since its inaugural year. The GMS program is an effort designed to improve higher education access and opportunity for high-achieving, low-income students of color by reducing the cost of entry. The program also seeks to develop a new and diverse generation of leaders to serve America by encouraging leadership participation, civic engagement, and the pursuit of graduate education and careers in seven fields in which minorities are underrepresented—computer science, engineering, mathematics, science, education, library science, and public health. The report discusses the extent to which the program has made an impact, and offers concluding thoughts on how the Foundation can maximize its investment in the higher education arena. A central argument of this report is that philanthropic activities like the GMS program can indeed play a crucial role in improving academic outcomes for high-achieving, disadvantaged students.
  • Promoting Gender Equity: Lessons From Ford’s International Fellows Program (Published by IIE Center for Academic Mobility Research & Impact, Funded by Ford Foundation, January 2019) As part of its mission to provide higher education access to marginalized communities, the Ford Foundation International Fellowships Program (IFP) sought to address gender inequality by providing graduate fellowships to nearly 2,150 women—50% of the IFP fellow population—from 22 countries in the developing world. This brief explores how international fellowship programs like IFP can advance educational, social, and economic equity for women. In addition to discussing the approach the program took in providing educational access and opportunity to women, the brief looks at two stories of alumnae who have not only benefitted from the fellowship themselves, but who are working to advance gender equity in their home communities and countries. Activists, advocates, and practitioners can draw upon these strategies and stories to better understand the meaning of gender equity and advance their own efforts to achieve social justice for women and girls worldwide.

Sharing Knowledge about the Social Sector

Foundations invest in knowledge creation to better understand the ecosystem of the social sector, as well as to address critical knowledge gaps they see in the fields in which they work. Thanks to these titles being added to IssueLab, we can all learn from them too! Here are a couple of recent titles added to IssueLab that shed new and needed light on the fields of philanthropy and nonprofits:

  • Philanthropy in China (Published by Asian Venture Philanthropy Network, Funded by The Rockefeller Foundation, April 2019) Philanthropy is now a global growth industry, but philanthropic transparency norms in other parts of the world are often lacking, so knowledge can be scarce. Philanthropy in China today is expanding and evolving rapidly, which makes filling in these knowledge gaps all the more pressing. This report presents an overview of the philanthropy ecosystem in China by reviewing existing knowledge and drawing insights from influential practitioners. It also provides an analysis of key trends and opportunities, as well as a set of recommendations for funders and resource providers who are inspired to catalyze a more vibrant and impactful philanthropy ecosystem in China.
  • Race to Lead: Women of Color in the Nonprofit Sector (Published by the Building Movement Project, Funded by New York Community Trust, Robert Sterling Clark Foundation, Community Resource Exchange, New York Foundation, Meyer Memorial Trust, Center for Nonprofit Excellence at the United Way of Central New Mexico, North Carolina Center for Nonprofits, Russ Finkelstein, February 2019) This report is part of the Race to Lead series by the Building Movement Project, seeking to understand why there are still relatively so few leaders of color in the nonprofit sector. Using data taken from a national survey of more than 4,000 people, and supplemented by numerous focus groups around the country, this latest report reveals that women of color encounter systemic obstacles to their advancement over and above the barriers faced by white women and men of color. Another key finding in the report is that education and training are not enough to correct systemic inequities—women of color with high levels of education are more likely to be in administrative roles and are more likely to report frustrations about inadequate and inequitable salaries. Building Movement Project’s call to action focuses on systems change, organizational change, and individual support for women of color in the sector.

Is this reminding you that you have new knowledge to share? Great—I can’t wait to see what you will #OpenForGood!

--Janet Camarena

Don’t “Ghost” Declined Applicants: The Ins and Outs of Giving Applicant Feedback
April 4, 2019

Mandy Ellerton joined the [Archibald] Bush Foundation in 2011, where she created and now directs the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community problems, using approaches that are collaborative and inclusive of people who are most directly affected by the problem.


This post is part of our “Road to 100 & Beyond” series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the “Who Has GlassPockets?” self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, promising practices in transparency, helpful examples, and lessons learned.

I’ve often thought that fundraising can be as bad as dating. (Kudos to you lucky few who have had great experiences dating!) Lots of dates, lots of dead ends, lots of frustrating encounters before you (maybe) find a match. All along the way you look for even the smallest sign to indicate that someone likes you. “They laughed at my joke!” or, in the case of fundraising, “they seemed really excited about page five of last year’s impact report!” Not to mention the endless time spent doing online searches for shreds of information that might be useful. This reality is part of the reason why Bush Foundation was proud to be among the first 100 foundations to participate in GlassPockets. We believe that transparency and opening lines of communication are critical to good grantmaking, because both in dating and in fundraising, it can be heartbreaking and crazymaking to try and sort out whether you have a connection or if someone’s “just not that into you.” If only there was a way to just “swipe left” or “swipe right” and make everything a little simpler.

“We believe that transparency and opening lines of communication are critical to good grantmaking.”

I’m not proposing a Tinder for grantmaking (nor should anyone, probably, although hat tip to Vu Le for messing with all of us and floating the idea on April Fool’s Day). But over the past several years, Bush Foundation’s Community Innovation program staff has used a system to provide feedback calls for declined applicants, in the hopes of making foundation fundraising a little less opaque and crazymaking. We use the calls to be transparent and explain why we made our funding decisions. The calls also help us live out our “Spread Optimism” value because they allow us to help and encourage applicants and potentially point them to other resources. This is all part of our larger engagement strategy, described in “No Moat Philanthropy.”

 

Mandy Ellerton

How Feedback Calls Work

We use a systematic approach for feedback calls:

  • We proactively offer the opportunity to sign up for feedback calls in the email we send to declined applicants.
  • We use a scheduling tool (after trying a couple of different options, we’ve landed on Slotted, which is relatively cheap and easy to use) and offer a variety of times for feedback calls every week. Collectively, five Community Innovation Team members hold about an hour a week for feedback calls. The calls typically last about 20 minutes. We’ve found this is about the right amount of time so that we can offer feedback calls to most of the declined applicants who want them.
  • We prepare for our feedback calls. We re-read the application and develop an outline for the call ahead of time.
  • During the call we offer a couple of reasons why we declined the application. We often discuss what an applicant could work on to strengthen their project and whether they ought to apply again.
  • We also spend a lot of time listening; sometimes these calls can understandably be emotional. Grant applications are a representation of someone’s hopes and dreams and sometimes your decline might feel like the end of the road for the applicant. But hang with them. Don’t get defensive. However hard it might feel for you, it’s a lot harder for the declined applicant. And ultimately, hard conversations can be transformative for everyone involved. I will say, however, that most of our feedback calls are really positive exchanges.
  • We use anonymous surveys to evaluate what people think of the feedback calls, and during each call we ask whether the applicant has any feedback for us on how to improve our programs and grantmaking process.
  • We train new staff on how to do feedback calls. We have a staff instruction manual on how to do feedback calls, but we also have new team members shadow more seasoned team members for a while before they do a feedback call alone.

 

What’s Going Well

The feedback calls appear to be useful for both declined applicants and for us:

  • In our 2018 surveys, respondents (n=38) rated the feedback calls highly. They gave the calls an average rating of 6.1 (out of 7) for overall helpfulness, 95% said the calls added some value or a lot of value, and 81.2% said they had a somewhat better or much better understanding of the programs after the feedback call.
  • We’ve seen the number of applications for our Community Innovation Grant and Bush Prize for Community Innovation programs go down over time and we’ve seen the overall quality go up. We think that’s due, in part, to feedback calls that help applicants decide whether to apply again and that help applicants improve their projects to become a better fit for funding in the future.
  • I’d also like to think that doing feedback calls has made us better grantmakers. First, it shows up in our selection meetings. When you might have to talk to someone about why you made the funding decision you did, you’re going to be even more thoughtful in making the decision in the first place. You’re going to hew even closer to your stated criteria and treat the decision with care. We regularly discuss what feedback we plan to give to declined applicants in the actual selection meeting. Second, in a system that has inherently huge power differentials (foundations have all of it and applicants have virtually none of it), doing feedback calls forces you to come face to face with that reality. Never confronting the fact that your funding decisions impact real people with hopes and dreams is a part of what corrupts philanthropy. Feedback calls keep you a little more humble.
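
As a purely illustrative aside, survey summaries like the ones above (an average helpfulness rating, the share who said a call added value, the share who understood the programs better) come down to a few simple tallies. The field names and sample responses below are invented for the sketch, not Bush Foundation's actual survey instrument:

```python
# Hypothetical sketch of summarizing feedback-call survey responses.
# Each response records a 1-7 helpfulness rating and two yes/no answers.

def summarize_survey(responses):
    """Return (n, avg helpfulness, % added value, % better understanding)."""
    n = len(responses)
    avg_helpfulness = sum(r["helpfulness"] for r in responses) / n
    # Booleans sum as 0/1, so these are counts of "yes" answers.
    pct_added_value = 100 * sum(r["added_value"] for r in responses) / n
    pct_better = 100 * sum(r["better_understanding"] for r in responses) / n
    return n, round(avg_helpfulness, 1), round(pct_added_value, 1), round(pct_better, 1)

responses = [
    {"helpfulness": 7, "added_value": True, "better_understanding": True},
    {"helpfulness": 6, "added_value": True, "better_understanding": False},
    {"helpfulness": 5, "added_value": False, "better_understanding": True},
    {"helpfulness": 6, "added_value": True, "better_understanding": True},
]
print(summarize_survey(responses))  # → (4, 6.0, 75.0, 75.0)
```

Keeping the raw responses (rather than only the rolled-up figures) is what lets a team re-cut the data later, for example, by program or by repeat applicants.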

 

What We’re Working On

We still have room to improve our feedback calls:

  • We’ve heard from declined applicants that they sometimes get conflicting feedback from different team members when they apply (and get declined) multiple times; 15% of survey respondents said their feedback was inconsistent with prior feedback from us. Cringe. That definitely makes fundraising more crazymaking. We’re working on how to have more staff continuity with applicants who have applied multiple times.
  • We sometimes struggle to determine how long to keep encouraging a declined applicant to improve their project for future applications versus saying more definitively that the project is not a fit. Yes, we want to “Spread Optimism,” but although it never feels good for anyone involved, sometimes the best course of action is to encourage an applicant to seek funding elsewhere.

I’m under no illusions that feedback calls are going to fix the structural issues with philanthropy and fundraising. I welcome that larger conversation, driven in large part by brave critiques of philanthropy emerging lately like Decolonizing Wealth, Just Giving and Winners Take All. In the meantime, fundraising, as with dating, is still going to have moments of heartache and uncertainty. When you apply for a grant, you have to be brave and vulnerable; you’re putting your hopes and dreams out into a really confusing and opaque system that’s going to judge them, perhaps support them, or perhaps dash them, and maybe even “ghost” them by never responding. Feedback calls are one way to treat those hopes and dreams with a bit more care.

--Mandy Ellerton

Evolving Towards Equity, Getting Beyond Semantics
December 17, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

In my previous post, I reflected on The California Endowment’s practice of conducting a Diversity, Equity, and Inclusion (DEI) Audit and how it helps us to stay accountable to intentionally integrating and advancing these values across the foundation.

We started this practice with a “Diversity and Inclusion” Audit in 2008 and as part of our third audit in 2013, The California Endowment (TCE) adjusted the framing to a “Diversity, Equity, and Inclusion” Audit. This allowed us to better connect the audit with how the foundation viewed the goals of our strategy and broadened the lens used through the audit process.

While this could be viewed as a semantic update based on changes in the nonprofit and philanthropic sectors, by 2016 our audit results reflected how TCE described both our core values, which lead with principles of DEI, and the ultimate outcome of our work, which points toward health equity and justice for all. And although we didn’t make a corresponding change to what the audit specifically assesses, select findings from our most recent audit highlight how not only diversity but also equity is being operationalized within the foundation.

Getting beyond the numbers

In some ways, the most straightforward entry point for DEI discussions is to first examine diversity by assessing quantitative representation within the foundation at the board and staff level, among our partners, contractors, vendors, and investment managers. Though it’s a necessary beginning, reporting and reflection cannot stop with counting heads. While our audit may have started as a way to gauge inclusion through the lens of diversity, it’s become clear that collecting and examining demographic data sets the stage for critical conversations to follow.

Part of the inherent value of reflecting on diversity and representation is in service of getting beyond the numbers to discover what questions the numbers inspire. Questions such as:

  • Who’s missing or overrepresented and why?
  • What implications could the gaps in lived experiences have on the foundation, the strategies used and how our work is conducted?
  • What are the underlying structures and systems that shape the demographics of the foundation and of the organizations with which we partner?

It’s these types of questions about our demographics and diversity that help move us beyond discussions about representation into deeper discussions about equity.

The audit has been a valuable point of reflection and action planning over the past several years. It’s a comprehensive process, conducted in partnership with the evaluation firm SPR, that spans an extensive number of sources.

Towards Equity and Inclusion

As TCE pursues our health equity goals, we’ve been able to define and distinguish key differences between diversity, equity, and inclusion. While diversity examines representation, we define equity as promoting fair conditions, opportunities, and outcomes. We also define inclusion as valuing and elevating the perspectives and voices of diverse communities wherever decisions are being made. For future audits, we’re looking to refine our DEI audit goals to focus more explicitly on equity and inclusion in our grantmaking and to examine even more deeply our internal policies, practices, and operations. However, here are a few examples from our latest audit that highlight how equity and inclusion currently show up across the foundation outside of our grantmaking.

Equity in hiring

  • Recognizing the impact of structural racism and mass incarceration, TCE followed the lead of partners working to “ban the box” and the Executives’ Alliance for Boys and Men of Color to change hiring practices. TCE now utilizes a Fair Chance Hiring Policy that opens the door for hiring qualified applicants with a conviction or an arrest and shares open positions with anti-recidivism organizations.

Inclusion and equity in investments

  • In the spirit of inclusion, the criteria for our Program Related Investments (PRIs) integrate whether the PRI will engage the community it is intended to benefit as well as whether the investment will address a known health inequity or social determinant of health.
  • In recognition of structural racism leading to higher rates of incarceration within communities of color, in 2015 TCE announced that we will no longer invest in companies profiting from for-profit prisons, jails, or detention centers.

Equity in vendor selection

  • Operationalizing equity also requires considering how facility operations align with organizational values. In line with our divestment from for-profit prisons, an RFP process identified a vendor-nonprofit team that encouraged hiring formerly incarcerated and homeless community members within our onsite café. We remain committed to this approach.

The Work Ahead

We’ve accomplished a great deal. At the same time, as we evolve towards becoming an equity organization there are areas where we need to put more of our attention.

To move beyond articulating values and to get to deeper staff engagement, audit feedback suggests more staff resources are needed to connect individual functions and roles to our DEI values, including through our performance review process, particularly among non-program staff.

Connected to developing a greater vision regardless of department affiliation, we will soon begin engaging staff across the entire organization to develop a more deeply shared racial equity analysis of our work. As part of this effort, our board is participating in racial equity trainings and has adopted a resolution to utilize a racial equity lens as the foundation develops our next strategic plan. Building on what we’re learning through our audits, in 2019 we’ll launch this effort toward becoming a racially equitable health foundation that intentionally brings racial equity to the center of our work and how we operate.

Finally, as we continue to partner with and support communities fighting for equity, there are several pressing, unanswered questions we’ll need to tackle. Within the walls of the foundation:

  • How do we hold ourselves to the same equity and inclusion principles that our partners demand of system leaders?
  • How do we confront the contradictions of operating as an organization rooted in a corporate, hierarchical design while we try to share power with staff regardless of position, increase decision-making transparency, and include those impacted by pending decisions, in the same way we ask our systems leaders to include and respond to community?
  • With an interest in greater accountability to equity and inclusion, how do we not only tend to power dynamics but consider greater power sharing through foundation structures and current decision-making bodies both internally and externally?

Herein lies our next evolutionary moment.

--Mona Jhawar

Creating a Culture of Learning: An Interview with Yvonne Belanger, Director of Evaluation & Learning, Barr Foundation
November 8, 2018

Yvonne Belanger is the director of learning & evaluation at the Barr Foundation and leads Barr's efforts to gauge its impact and support ongoing learning among staff, grantees, and the fields in which they work.

Recently, Janet Camarena, director of transparency initiatives for Foundation Center, interviewed Belanger about how creating a culture of learning and openness can improve philanthropy. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


GlassPockets: More and more foundations seem to be hiring staff with titles having to do with evaluation and learning. You’ve been in this role at the Barr Foundation for just about a year, having come over from a similar role at the Bill & Melinda Gates Foundation. Why do you think roles like this are on the rise in philanthropy, and what are your aspirations for how greater capacity for evaluation and learning can benefit the field?

Yvonne Belanger: I think the spread of these roles in strategic philanthropy comes from increasing recognition that building a stronger learning function is a strategic investment, and it requires dedicated expertise and leadership. My hope is that strong evaluation and learning capacity at Barr (and across the philanthropic sector generally) will enable better decisions and accelerate the pace of social change to make the world more equitable and just.

GP: What have been your priorities in this first year and what is your approach to learning? More specifically, what is Barr’s learning process like, what sources do you learn from, how do you use the learnings to inform your work?

YB: At Barr, we are committed to learning from our efforts and continuously improving. Our programmatic work benefits from many sources of knowledge to inform strategy including landscape scans, academic research, ongoing conversations with grantees and formal site visits, and program evaluations to name a few. During this first year, I have been working with Barr’s program teams to assess their needs, to sketch out a trajectory for the next few years, and to launch evaluation projects across our strategies to enhance our strategic learning. Learning is not limited to evaluating the work of our programs, but also includes getting feedback from our partners. Recently, we were fortunate to hear from grantees via our Grantee Perception Report survey, including specific feedback on our learning and evaluation practices. As we reflected on their responses in relation to Barr’s values and examples of strong practice among our peers, we saw several ways we could improve.

GP: What kinds of improvements are you making as a result of feedback you received?

YB: We identified three opportunities for improvement: to make evaluation more useful, to be clearer about how Barr defines success and measures progress, and to be more transparent with our learning.

  • Make evaluations more collaborative and beneficial to our partners. We heard from our grantees that participating in evaluations funded by Barr hasn’t always felt useful or applicable to their work. We are adopting approaches to evaluation that prioritize grantee input and benefit. For example, in our Creative Commonwealth Initiative, a partnership with five community foundations to strengthen arts and creativity across Massachusetts, we included the grantees early in the evaluation design phase. With their input, we modified and prioritized evaluation questions and incorporated flexible technical assistance to build their capacity for data and measurement. In our Education Program, the early phase of our Engage New England evaluation is focused on sharing learning with grantees and the partners supporting their work to make implementation of these new school models stronger.
  • Be clearer about how we measure outcomes. Our grantees want to understand how Barr assesses progress. In September, we published a grantee guide to outputs and outcomes to clarify what we are looking for from grantees and to support them in developing a strong proposal. Currently, our program teams are clarifying progress measures for our strategies, and we plan to make that information more accessible to our grantees.
  • Share what we learn. To quote your recent GrantCraft Open for Good report, “Knowledge has the power to spark change, but only if it is shared.” To maximize Barr’s impact, we aim to be #OpenForGood and produce and share insights that help our grantees, practitioners, policymakers, and others. To this end, we are proactively sharing information about evaluation work in progress, such as the evaluation questions we are exploring, and when the field can expect results. Our Barr Fellows program evaluation is one example of this practice. We are also building a new knowledge center for Barr to highlight and share research and reports from our partners, and make these reports easier for practitioners and policymakers to find and re-share.

GP: Clearly all of this takes time and resources to do well. What benefits can you point to of investing in learning and knowledge sharing?

YB: Our new Impact & Learning page reflects our aspiration that by sharing work in progress and lessons learned, we hope to influence nonprofits and other funders, advance field knowledge, inform policy, and elevate community expertise. When you are working on changing complex systems, there are almost never silver bullets. To make headway on difficult social problems we need to view them from multiple perspectives and build learning over time by analyzing the successes, and the failures, of many different efforts and approaches.

GP: Barr’s president, Jim Canales, is featured in a video clip on the Impact & Learning page talking about the important role philanthropy plays as a source of “risk capital” to test emerging and untested solutions, some of which may not work or fail, and that the field should see these as learning opportunities. And, of course, these struggles and failures could be great lessons for philanthropy as a whole. How do you balance this tension at Barr, between a desire to provide “risk capital,” the desire to open up what you are learning, and reputational concerns about sharing evaluations of initiatives that didn’t produce the desired results?

YB: It’s unusual for foundations to be open about how they define success, and admissions of failure are notably rare. I think foundations are often just as concerned about their grantees’ reputation and credibility as their own. At Barr, we aspire to be more transparent, including when things haven’t worked or our efforts have fallen short of our goals. To paraphrase Jim Canales, risk isn’t an end in itself, but a foundation should be willing to take risks in order to see impact. The factors that influence impact or the pace of change are often ones funders have control over, such as the amount of risk we were willing to take, or the conceptualization and design of an initiative. When a funder can reflect openly on these issues, the reflections usually generate valuable lessons for philanthropy and reflect the kind of risks we should be able to take more often.

GP: Now that you are entering your second year in this role, where are the next directions you hope to take Barr’s evaluation and learning efforts?

YB: In addition to continuing and sustaining robust evaluation for major initiatives across our program areas, and sharing what we’re learning as we go, we have two new areas of focus in 2019 – people and practices. We will have an internal staff development series to cultivate mindsets, skills, and shared habits that support learning, and we will also be working to strengthen our practices around strategy measurement so that we can be clearer both internally and externally about how we measure progress and impact. Ultimately, we believe these efforts will make our strategies stronger, will improve our ability to learn with and from our grantees, and will lead to greater impact.

 

Building Our Knowledge Sharing Muscle at Irvine
May 17, 2018

Kim Ammann Howard joined the James Irvine Foundation as Director of Impact Assessment and Learning in 2015. She has more than 20 years of social impact experience working with nonprofits, foundations, and the public sector to collect, use, and share information that stimulates ongoing learning and change.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Having recently spent two days with peer foundation evaluation directors, I am savoring the rich conversations and reflecting on how shared knowledge benefits my own thinking and actions. It also reminds me of how often those conversations only benefit those inside the room. To really influence the field, we need to build our knowledge sharing muscle beyond our four walls and usual circles. A new report from the Foundation Center, Open for Good: Knowledge Sharing to Strengthen Grantmaking, aims to help funders do just that, and I was happy to contribute some of The James Irvine Foundation’s own journey to the guide.

When I joined the Foundation at the end of 2015, there was already a commitment to transparency and openness that established knowledge sharing as part of the culture. It was something that attracted me to Irvine, and I was excited to build on the types of information collected and disseminated in the past, and to figure out how we could grow.

Our Framework

In 2016, we launched our new strategy, which focuses on expanding economic and political opportunity for California families and young adults who are working but struggling with poverty. This presented an opportune moment to articulate and set expectations about how impact assessment and learning (IA&L) is integrated in the work. This includes defining how we assess our progress in meeting our strategic goals, how we learn, and how we use what we learn to adapt and improve. We developed a framework that outlines our approach to IA&L – why we think it’s important, what principles guide us, and how we put IA&L into practice.

While the IA&L framework was designed as an internal guide, we decided to make it available externally for three reasons: to honor the Foundation’s commitment to transparency and openness; to hold ourselves accountable to what we say we espouse for IA&L; and to model our approach for colleagues at other organizations who may be interested in adopting a similar framework.

What We’re Learning

We’ve also dedicated a new portion of our website to what we are learning. We use this section to share knowledge with the field – and not only the end results of an initiative or body of research but also to communicate what happens in the middle – to be transparent about the work as we go.

For example, in 2017, we spent a year listening and learning from grantees, employers, thought leaders, and other stakeholders in California to inform what would become our Better Careers initiative. At the end of the year, we announced the goal of the initiative to connect low-income Californians to good jobs with family-sustaining wages and advancement opportunities. It was important for us to uphold the principles of feedback set in our IA&L framework by communicating with all the stakeholders who helped to inform the initiative’s strategy – it was also the right thing to do. We wanted to be transparent about how we got to our Better Careers approach and highlight the ideas reflected in it as well as the equally valuable insights that we decided not to pursue. Given the resources that went into accumulating this knowledge, and in the spirit of greater funder collaboration, we also posted these ideas on our website to benefit others working in this space.

As we continue to build our knowledge sharing muscle at Irvine, we are exploring additional ways to communicate as we go. We are currently reflecting on what we are learning about how we work inside the foundation – and thinking about ways to share the insights that can add value to the field. Participating as a voice in the Foundation Center’s new Open for Good guide was one such opportunity, and the stories and lessons from other foundations in the guide inspire our own path forward.

--Kim Ammann Howard

Learn, Share, and We All Win! Foundation Center Releases #OpenForGood Guide and Announces Award Opportunity
May 10, 2018

Melissa Moy is special projects associate for Glasspockets.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Knowledge is a resource philanthropy can’t afford to keep for itself, and as a result of a newly available guide, funders will now have a road map for opening up that knowledge. The new GrantCraft guide, Open for Good: Knowledge Sharing to Strengthen Grantmaking, supported by the Fund for Shared Insight, illustrates practical steps that all donors can take to create a culture of shared learning.

Philanthropy is in a unique position to generate knowledge and disseminate it, and this guide will help foundations navigate the process. Each year, foundations make $5 billion in grants toward knowledge production. These assessments, evaluations, communities of practice, and key findings are valuable, yet only a small fraction of foundations share what they learn, with even fewer using open licenses or open repositories to share these learnings. Foundations report that among the information they value most are lessons about “what did and didn’t work.” And yet, this is the same knowledge that foundations are often most reluctant to share.

The guide, part of Foundation Center’s larger #OpenForGood campaign, makes a strong case for foundations to openly share knowledge as an integral and strategic aspect of philanthropy. Through interviews with leaders in knowledge sharing, the guide outlines tested solutions for overcoming common barriers to sharing learnings, as well as the essential components funders need to strengthen their knowledge-sharing practice. The guide emphasizes that sharing knowledge can deepen internal reflection and learning, lead to new connections and ideas, and promote institutional credibility and influence.

Knowledge comes in all shapes and sizes – program and grantee evaluations, foundation performance assessments, thought leadership, and formal and informal reflections shared among foundation staff and board members. The guide will help your foundation identify the types of information that can be shared and take actionable steps to share it.

Download the Guide

To further encourage funders to be more transparent, this week Foundation Center also announces the opening of a nomination period for the inaugural #OpenForGood Award, to bring due recognition and visibility to foundations that share challenges, successes, and failures to strengthen how we can think and act as a sector.

Three winning foundations will demonstrate an active commitment to open knowledge and share their evaluations through IssueLab. Winners will receive technical support to create a custom knowledge center for themselves or a grantee, as well as promotional support in the form of social media and newsletter space. Who will you nominate as being #OpenForGood?

--Melissa Moy 

To Serve Better, Share
May 3, 2018

Daniela Pineda, Ph.D., is vice president of integration and learning at First 5 LA, an independent public agency created by voters to advocate for programs and policies benefiting young children.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

We share ideas freely on Pinterest, we easily give our opinions on products on Amazon, and we learn from “how-to” videos on YouTube from the comfort of our homes. We even enjoy sharing and being creative by pulling ideas and concepts together.

Often, this is not what happens once we step foot in the office. We may find ourselves more reluctant to embrace sharing what works, learning what doesn’t, and then applying these lessons to our work. It’s hard to speak about how things didn’t turn out as expected. It is as if we are saving the treasure of our knowledge for a rainy day, as if it’s a limited resource.

I believe in the power of being #OpenForGood, using knowledge to improve philanthropic effectiveness, in our case, to help create more opportunities and better outcomes for young children.

That’s why I am delighted to have contributed examples from our journey of opening up our knowledge at First 5 LA to a new how-to guide released this week. As part of Foundation Center’s #OpenForGood movement, the new GrantCraft guide Open for Good: Knowledge Sharing to Strengthen Grantmaking provides tips and resources, including strategies for knowledge sharing. Everyone benefits when organizations strengthen their knowledge sharing practices by enhancing organizational capacity and culture, and by understanding how to overcome common hurdles to sharing knowledge.

“We can achieve more collectively and individually by sharing information and creating knowledge.”

As a public entity, First 5 LA is uniquely positioned to share knowledge with the field. Our mandate to be transparent serves as a powerful launchpad for sharing knowledge. For example, in our work with communities across Los Angeles County, we work to elevate the voices and perspectives of parents to leaders and lawmakers.

When we create opportunities for parents and policymakers to hear from each other, we are moving beyond a transparency requirement to foster more nuanced conversations on how we can all help improve outcomes for kids.

No matter your type of organization or mission -- foundations, nonprofit, government or business, we can achieve more collectively and individually by sharing information and creating knowledge.

Sharing information about what has worked, what hasn’t, and being open to learning lessons from others is a skill that sharpens your thinking, benefits the field, and helps advance your own goals, while also benefiting those you serve.

We must be mindful of the many potential roadblocks to sharing in service of becoming more effective, both inside and outside of our own organizations. Among them: egos and a lack of humility; competition for resources; a lack of incentives to share; and a lack of awareness of what information is shared and what outcomes it produces.

Sharing Sharpens Your Thinking

Failing to see knowledge sharing as part of your job amounts to lost opportunity, lost time, and lost resources. Making the time to find out what others are doing is important. At a minimum, we can feel empowered by the simple knowledge that we aren’t the only ones dealing with the problems we face in our jobs. In a best case scenario, we can adapt that information to our context, and try new ways to do our jobs better.

This notion really hit home for me through a very simple online search when I started a new role. Curious if others were also grappling with similar issues about how to effectively evaluate place-based work, I searched a few sites. In philanthropy, we are fortunate to have impressive open online repositories such as Foundation Center’s Issue Lab, where we can find loads of information.

Indeed, my search led to several pieces on lessons learned from funders of place-based work. I was fortunate to find a thoughtful report on the topic at hand. But what was most useful, beyond the insights gleaned, was that I was then able to reach out to one of the authors to learn exactly what it meant to let the evaluation design evolve with the initiative.

Based on this connection, I refined a step in our learning agenda process to set the expectation that community voices be consulted earlier, during the planning phase of the project. While we had already planned for inclusion, I learned what types of pitfalls to avoid when structuring community engagement on a long-term evaluation project.

Since reaching out to my colleague, I have continued to learn from him and from a broader network of learning practitioners who also value sharing knowledge. Reaching out to others and asking questions is simple, and yet so few make the time to do it.

The truth is, great ideas can come from anywhere: a conversation on a commuter train, a session at a conference, or results from a search engine. Sharing, and being open to new ideas, serves to sharpen thinking and can improve your ability to achieve your philanthropic goals.

Sharing Benefits the Field

At a more global level, to make an impact on society and change things for the better, share what you know, and be willing to adjust your approach based on what you learn. That’s the approach we embrace at First 5 LA.

This not only helps our organization in our mission, but it sets an example for other like-minded organizations to open their viewpoints on sharing their successes and failures.

“Don’t save your knowledge for a rainy day—it’s an unlimited resource!”

For example, we recently worked with an evaluation partner to restructure the scope of its engagement. This was difficult because the project had been in place for a long time and the restructuring resulted in a narrower scope. The partner was disappointed that we determined only two of the four initially designed subprojects remained relevant to our work. It could have appeared that we were no longer committed to learning about this investment.

By being open with them, we also heard their concerns about whether the data would be of sufficient quality to support rigorous analyses. We listened and came up with a joint approach: reaching out to a different entity to secure an alternative data source. This worked, and now the project has been refocused, new data were secured, and the partner saw firsthand that while the approach changed, we were still committed to learning together.

Sharing information and outcomes is essential to being influencers in our areas of expertise. And learning from others is essential to being assets within our fields. In this case, we landed on an alternative approach to leverage data, and we maintained a productive relationship with our partner. We plan to share this approach broadly so that it can spark new ideas and insights or confirm an approach among other grantmakers grappling with similar issues.

Once we as individuals, managers and organizations can distill and discern knowledge, we can apply it to our own important work for public good, and share it with others to help them with theirs.

Sharing Is a Skill

These sharing efforts should permeate your organization, beyond the C-suite. Leaders must lead by example and encourage staff to see themselves as gatherers – and contributors – of knowledge to their fields.

Ultimately, sharing information is a skill. Gleaning the best insights from data means sharing it with others, both inside and outside of your organization.

But collecting reams of information will do us no good if we do not have a specific plan for the data, and then analyze what it means in a bigger universe – and for those we serve.

At First 5 LA, we take a very pragmatic approach to data collection. First, we work with our programs to identify the specific systems we are trying to impact. Once that is determined, we then create learning agendas, which are tools for us to prioritize the key learning questions that will help us know if we are making progress on behalf of kids in Los Angeles County.

Our approach requires that we specify how we plan to use data before we collect it. Data should be tied to specific learning questions.

We are proud of our work and of our approach of using learning as a strategy, but it is not always easy to let others benefit from what we learned the hard way.

But our work is not ultimately about a singular institution. And you don’t need to save your knowledge for a rainy day—it’s an unlimited resource! It’s about huddling under a shared umbrella in stormy weather, and basking together in the sunshine, for the ones who need us the most: those we serve.

--Daniela Pineda

Knowledge Sharing to Strengthen Grantmaking
April 26, 2018

Clare Nolan, MPP, co-founder of Engage R+D, is a nationally recognized evaluation and strategy consultant for the foundation, nonprofit and public sectors. Her expertise helps foundations to document and learn from their investments in systems and policy change, networks, scaling, and innovation. This post also appears on the Grantmakers for Effective Organizations’ (GEO) Perspectives blog.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Knowledge has the power to spark change, but only if it is shared. Many grantmakers instinctively like the idea of sharing the knowledge they generate with others. But in the face of competing priorities, a stronger case must be made for foundations to devote time and resources to sharing knowledge. The truth is that when foundations share knowledge generated through evaluation, strategy development and thought leadership, they benefit not only others but also themselves. Sharing knowledge can deepen internal reflection and learning, lead to new connections and ideas, and promote institutional credibility and influence.

Foundations can strengthen their knowledge sharing practices by enhancing organizational capacity and culture, and by understanding how to overcome common hurdles to sharing knowledge. The forthcoming GrantCraft guide Open for Good: Knowledge Sharing to Strengthen Grantmaking provides tips and resources for how foundations can do just that. My organization, Engage R+D, partnered with Foundation Center to produce this guide as part of #OpenForGood, a call to action for foundations to openly share their knowledge.

To produce the guide, we conducted interviews with the staff of foundations varying by origin, content focus, size, and geography. The participants shared their insights about the benefits of sharing knowledge not only for others, but also for their own organizations. They also described strategies they use for sharing knowledge, which we then converted into concrete and actionable tips for grantmakers. Some of the tips and resources available in the guide include:

  • A quiz to determine what type of knowledge sharer you are. Based upon responses to questions about your organization’s capacity and culture, you can determine where you fall within a quadrant of knowledge sharing (see visual). The guide offers tips for how to integrate knowledge sharing into your practice in ways that would be a good fit for you and your organization.
  • Nuts and bolts guidance on how to go about sharing knowledge. To take the mystery out of the knowledge sharing process, the guide breaks down the different elements that are needed to actually put knowledge sharing into practice. It provides answers to common questions grantmakers have on this topic, such as: What kinds of knowledge should I be sharing exactly? Where can I disseminate this knowledge? Who at my foundation should be responsible for doing the sharing?
  • Ideas on how to evolve your foundation’s knowledge-sharing practice. Even foundation staff engaged in sophisticated knowledge-sharing practices noted the importance of evolving their practice to meet the demands of a rapidly changing external context. The guide includes tips on how foundations can adapt their practice in this way. For example, it offers guidance on how to optimize the use of technology for knowledge sharing, while still finding ways to engage audiences with less technological capacity.

The tips and resources in the guide are interspersed with quotes, audio clips, and case examples from the foundation staff members we interviewed. These interviews provide voices from the field sharing tangible examples of how to put the strategies in the guide into practice.

Want to know how your foundation measures up when it comes to knowledge sharing? We are pleased to provide readers of this blog with an advance copy of Chapter 2 from the forthcoming Guide which includes the quiz referenced above. Want to learn more? Sign up for the Foundation Center’s GrantCraft newsletter and receive a copy of the Guide upon its release. And, for those who are attending the GEO conference next week in San Francisco, visit us at our #OpenForGood pop-up quiz station where you can learn more about what kind of knowledge sharer you are.

--Clare Nolan
