Transparency Talk


June 2019 (2 posts)

Meet Our #OpenForGood Award Winner: An Interview with Lee Alexander Risby, Head of Effective Philanthropy & Savi Mull, Senior Evaluation Manager, C&A Foundation
June 19, 2019

Lee Alexander Risby

This post is part of the Glasspockets’ #OpenforGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

C&A Foundation is a European foundation that supports programs and initiatives to transform fashion into a fair and sustainable industry that enables everyone – from farmer to factory worker – to thrive. In this interview, Lee Alexander Risby and Savi Mull share insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the C&A Foundation and how this practice has helped you to advance your work?

Savi Mull

Savi Mull: For almost five years, C&A Foundation has been dedicated to transforming the fashion industry into a force for good. A large part of that work includes instilling transparency and accountability in supply chains across the industry. From the start, we also wanted to lead by example by being transparent and accountable as an organization, sharing what we were learning whilst on this journey, being true to our work and helping the rest of the industry learn from our successes and failures.

Lee Alexander Risby: Indeed, from the beginning, we made a commitment to be open about our results and lessons by publishing evaluations on our website and dashboards in our Annual Reports. After all, you cannot encourage the fashion industry to be transparent and accountable and not live by the same principles yourself. Importantly, our commitment to transparency has always been championed both by our Executive Director and our Board.

Savi: To do this, over the years we have put many processes in place.  For example, internally we use after-action reviews to gather lessons from our initiatives and allow our teams to discuss honestly what could have been done better in that program or partnership.  We also do third party, external evaluations of our initiatives, sharing the reports and lessons learned. This helps us and our partners to learn, and it informs initiatives and strategies going forward.

The Role of Evaluation Inside Foundations

GP: Your title has the word “evaluation” in its name and increasingly we are seeing foundations move toward this staffing structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

SM: I believe it is essential to have this type of function in a foundation to drive formal learning from and within programs. But at the same time, it is an ongoing process that cannot be driven by one function alone. All staff need to be responsible for the learning that makes philanthropy effective – not just evaluators.

LAR: To begin, we were deliberate in building a team of evaluation professionals to promote accountable learning. We started hiring slowly and built the team over time. What I looked for with each new member of the team, and am always looking for, is an evaluator with more than just technical skills; they also need the influencing, listening, communication, and negotiating skills to help others learn. Evaluations have little effect without good internal and external communication.

"For us, it was important to be a critical friend, listener, and enabler of learning and not the police."

The evaluation function itself has also evolved over the last five years. It started off as a monitoring, evaluation and learning (MEL) function and is now Effective Philanthropy. From the start, the function was not set up as an independent department but created to help programmatic teams design appropriate monitoring and evaluation for their programs, and to act as facilitators and advisors on strategy. However, it has not always been a straightforward process from the inside. In the first years, we had to spend a lot of time explaining and persuading staff of the need for evaluation, transparency and learning, and the benefits of doing so. We wanted to avoid a strong independent evaluation function, as that can reduce learning by placing too much emphasis on accountability. For us, it was important to be a critical friend, listener, and enabler of learning and not the police.

SM: So, the first bit of advice is that evaluators should be supportive listeners, assisting programmatic teams throughout the design and implementation phases to get the best results possible. They should not come in just at the end of an initiative to do an evaluation.

LAR: The second piece of advice is on the positioning, support, and structure of evaluation within a foundation. Firstly, it is critical to have the buy-in of the leadership and board for both evaluation and transparency. And secondly, the evaluation function must be part of the management team and report to the CEO or Executive Director. This gives reporting and learning the appropriate support structure and importance.

The third piece of advice is to consider not creating an evaluation function, but an effective philanthropy function. Evaluation is done for learning, and learning drives effectiveness in grant-making for better results and long-term impacts on systems.

SM: The final piece of advice is to take guidance from others outside your organization. The whole team has consulted broadly with former colleagues and mentors from across the evaluation community as well as experienced philanthropic professionals. Remember you are part of a field with peers whose knowledge and experience can help guide you.

Opening Up Pain Points

GP: One of the reasons the committee selected C&A Foundation to receive the award is because of your institutional comfort level with sharing not just successes, but also being very forthright about what didn’t work. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected C&A Foundation’s reputation in the field, and why it’s worth it?

LAR: I would say this. The question for foundation boards and leaders is straightforward: do you want to be more effective and have an impact? The answer to that will always be yes, but it is dependent on learning and sharing across the organization and with others. If we do not share evaluations, research or experiences, we do not learn from each other and we cannot be effective in our philanthropic endeavors.

"There is a benefit to being open, you build trust and integrity – success and failure is part of all of us."

The other question for boards and leaders is: who does philanthropy serve? For us, we want to transform the fashion industry, which is made up of cotton farmers, workers in spinning mills and cut and sew factories, consumers and entrepreneurs, to name a few – they are our public. As such we have the duty to be transparent to the public about where we are succeeding and where we have failed and how we can improve. We do not think there is a reputation risk. In fact, there is a benefit to being open, you build trust and integrity – success and failure is part of all of us.

SM: Adding to what Lee has said, being open about our failures not only helps us but the entire field. Some of our partners have felt reticent about our publishing evaluations, but we always reassure them and stress from the beginning of an evaluation process that it is an opportunity to understand how they can improve their work and how we can improve our partnership, as well as a chance to share those lessons more broadly.

Learning While Lean

GP: Given the lean philanthropy staffing structures in place at many corporate foundations, do you have any advice for your peers on how those without a dedicated evaluation team might still be able to take some small steps to sharing what they are learning?

SM: Learning is a continuous process. In the absence of staff dedicated to evaluation, take baby steps within your power, such as implementing after-action reviews, holding thematic webinars, or doing quick summaries of lessons from grants and/or existing evaluations from others. If the organization’s leadership endorses learning, these small steps are a good place to start.

GP: And speaking of lean staffing structures, a concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does C&A Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

SM: The foundation has a Monitoring and Evaluation Policy that lays out the roles of the programmatic staff and partners as well as of the dedicated Effective Philanthropy Team. C&A Foundation partners are generally responsible for the design and execution of a self-evaluation, to be submitted at the end of the grant period. External evaluation budgets are covered by the foundation and do not pose a financial burden on partners at all. They are included in the overall cost of an initiative, and when needed we have an additional central evaluation fund that is used to respond to programmatic teams' and partners' ad hoc demands for evaluations and learning.

The Effective Philanthropy team does provide technical assistance to partners and foundation staff upon request. The guidance ranges from technical inputs related to the theory of change development to the design of baseline and mid-line data collection exercises. The theory of change work has been really rewarding for partners and ourselves. We all enjoy that part of the work.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your work.

Learning Leads to Effectiveness

LAR: In moving from a more traditional MEL approach to effective philanthropy, we looked at the work of other foundations, including the William and Flora Hewlett Foundation and the Rockefeller Foundation, and had discussions with a number of peers in the field. We also asked Nancy MacPherson (formerly Managing Director of Evaluation at Rockefeller) and Fay Twersky (Director of Effective Philanthropy at Hewlett) to review our Effective Philanthropy strategy while it was under development, and their feedback and advice helped a lot. In the end, we decided to build out the function in a way similar to the Hewlett Foundation's, with some differences. For example, our evaluation practice is currently positioned at a deeper, initiative level, which reflects a field context in which the fashion industry has a significant evidence gap that needs to be filled. Concomitant with this is our emphasis on piloting and testing, which goes hand-in-hand with the demand for evaluative thinking, reporting, and learning.

Our team has also been influenced by our own successes and failures from previous roles. That has also inspired us to embrace a slightly different approach.

SM: In terms of where we are at the moment, we still oversee performance monitoring, evaluation, and support to the program teams in developing theories of change and KPIs; but we are also building out an organizational learning approach and are in the process of hiring a Senior Learning Manager. Lastly, we are piloting our organizational and network effectiveness work in Brazil, which is being led by a colleague who joined the foundation last year.

LAR: We are also in the midst of an Overall Effectiveness Evaluation (OEE) of C&A Foundation’s first 5-year strategy. In general, this is not a type of evaluation that foundations use much. As well as looking at results, the evaluators are evaluating the whole organization, including Effective Philanthropy. For me as an evaluator, it has been really rewarding to be on the other side of a good question.

We are learning from the OEE as we go along, and we decided to create ongoing opportunities for reporting and feedback from the process rather than waiting until the very end for a report. This means that program staff can be engaged in proactive discussions about performance and emerging lessons in a timely way. The OEE is already starting to play a vital role in informing the development of the next 5-year strategy and our organization. But you will surely hear more on that evaluation process later, as it will be published. There is always room for improvement, and learning never stops.

--Lee Alexander Risby and Savi Mull

Meet Our #OpenForGood Award Winner: An Interview with Craig Connelly, Chief Executive Officer, The Ian Potter Foundation
June 12, 2019

Craig Connelly


The Ian Potter Foundation is an Australian foundation that supports and promotes excellence and innovation working for a vibrant, healthy, fair, and sustainable Australia. In this interview, Craig Connelly shares insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the Ian Potter Foundation and how this practice has helped you to advance your work? Or put another way, what is the good that has come about as a result?

Craig Connelly: The Ian Potter Foundation decided to invest in our research and evaluation capability primarily to improve the quality of our grantmaking. We believe that measuring and evaluating the outcomes of our grantees and the work we fund enables us to understand the extent to which our funding guidelines are achieving their intended outcomes. This results in a more informed approach that should improve the quality of our grantmaking over time.

A core part of this includes being completely transparent with our grantees and with the broader sector. To do otherwise would be inconsistent with our expectations of our grantees. We are asking our grantees to be partners and to pursue a strategic relationship with us, and that requires open and honest conversation. Therefore, we need to be an open, honest and transparent funder, and demonstrate that, in order to win the trust of the organizations we fund.

An example of this transparency is the learnings we glean from our grantees and share with the broader sector. We're getting very positive feedback from both funders and grantees on the quality of the learnings we're sharing and the value they add to the thought processes that nonprofit organizations and other funders go through.

GP: Increasingly we are seeing foundations move toward a structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

CC: Anyone in a research and evaluation role needs to be an integral part of the program management team. The research and evaluation process informs our grantmaking. It needs to assist the program managers to be better at what they do, and it needs to learn from what the program managers are doing as well. You don’t want it to be a silo, it is just another function of your program management team. It is an integral part of that team and it is in constant communication both with the program management team and with grantees from day one.

GP: As you heard during the award presentation, one of the reasons the Ian Potter Foundation was selected to receive this award is because of how you prioritize thinking about how stakeholders like grantees might benefit from the reports and knowledge you possess. We often hear that while there is a desire to share grantee reports publicly, that there are reputational concerns that prevent it or that to scrub the reports of sensitive information would be too time consuming, yet you do it for all of your portfolios. What are your tips for how to keep this a manageable process?

CC: The initial work to compile and anonymize our grantee learnings required some investment in time from our Research & Evaluation Manager and communications team. To make this task manageable, the work was tackled one program area at a time. Now that a bank of learnings has been created for each program area, new learnings are easily compiled and added on a yearly basis. This work is scheduled at less busy times for those staff involved. The Ian Potter Foundation is also looking at ways learnings can be shared directly from grantees to the wider nonprofit sector. One idea is to create a forum (e.g. a podcast) where nonprofits can share their experiences with their peers in the sector.

GP: A concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does The Ian Potter Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

"...we need to be an open, honest and transparent funder and demonstrate that in order to win the trust of the organizations we fund."

CC: One of the benefits we have found at The Ian Potter Foundation of having a Research & Evaluation Manager as an integral part of our process is that our authorizing environment – our board and the committees responsible for program areas – has become very comfortable including funding for evaluation in all of our grants. We now also understand what it costs to complete an effective evaluation. We often ask grantees to add more to their budget to ensure a good quality evaluation can be completed as part of the grant.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your own work.

CC: Yes, we have a couple of examples I can point to. The first comes from our Education Program Manager, Rikki Andrews, who points to the creation of the Early Childhood Impact Alliance (ECIA) through a grant to the University of Melbourne. The purpose of the ECIA is to convene, connect and increase understanding of research and policy among early childhood philanthropic funders, to ensure there is more strategic and concerted philanthropic support of research and its application.

Additionally, the Foundation’s Senior Program Manager, Dr. Alberto Furlan, explains, ‘We are in the process of learning from the organizations we partner with all the time. In the last few years, program managers have been prioritizing extensive site visits to shortlisted applicants to discuss and see the projects in situ. In a “big country” such as Australia, this takes a considerable amount of time and resources, but it invariably pays off. Such visits highlight the importance of relationship building and deep, honest listening when partnering with not-for-profits. The Foundation prides itself on being open and approachable, and site visits greatly contribute to understanding the reality of the day-to-day challenges, and successes, of the organizations working on the ground.’

--Craig Connelly & Janet Camarena

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a guest contributor, contact:
    glasspockets@foundationcenter.org
