Transparency Talk

Category: "Who Has Glass Pockets?" (43 posts)

Ask, Listen, Act. Embedding Community Voices into our Brand.
September 12, 2019

Zeeba Khalili

Zeeba Khalili is the Learning and Evaluation Officer at Marguerite Casey Foundation.

Los Angeles, California; Baltimore, Maryland; Mobile, Alabama; Rapid City, South Dakota; El Paso, Texas; and Yakima, Washington. These six cities were chosen to reflect a diversity of regional, generational, cultural, ethnic and socioeconomic perspectives. In 2002, as part of our founding, Marguerite Casey Foundation convened listening circles in these cities to hear the voices of more than 600 families. The Foundation posed the same three questions in each listening circle:

  • What creates strong families and children?
  • What would it take to change the systems that have an impact on the lives of families and children?
  • How would you leverage $30 million to ensure the well-being of children, families and communities?

Though these six hundred voices spoke of diverse needs, in many ways we discovered they spoke as one. They called for respecting and valuing families; empowering families and holding them at the center of systems of care; promoting grassroots activism and leadership; collaborating across agencies and systems; changing unresponsive policies; and galvanizing public will to support families in ways that help avert crises and ultimately reduce dependence on systems.

We didn’t convene listening circles just to check a box of community involvement. Hearing stories and ideas directly from communities allowed us to build a Foundation that challenged preconceived notions about the “best” way to support families and end poverty. What we heard became the framework for the Foundation’s mission and strategy, grounded in listening to communities’ concerns as articulated by community leaders and taking action informed by families’ voices. We committed to Ask, Listen, Act, making it our brand promise, and one of our forms of philanthropic transparency. The Foundation grounds its decisions in what we’ve heard from our constituencies, both grantees and families, and we make our learnings public so that other groups can learn from the work we’ve already done.

Today, Marguerite Casey Foundation’s grantmaking echoes the sentiments heard seventeen years ago. We provide long-term, sizeable multi-year general operating support grants to grassroots activism and advocacy organizations. We invest in Equal Voice networks, regionally and nationally, facilitated by network weavers, who help grantees collaborate across issues, form alliances and bring about long-term change.

Additionally, the Foundation lifts the voices of low-income families into the national dialogue through our Equal Voice News online platform, harnessing the power of storytelling about families leading change in their communities. Program officers, closest to the grantees, connect the Foundation’s communications team to the families on the ground and elevate their experiences so that others can learn from them. For example, in July, the Foundation chronicled a Black farming community in rural Georgia that was once thought to be vanishing but remains steadfast in its efforts to fight Black land loss and food-related disparities. This story supports the work of Southwest Georgia Project for Community Education, a grantee of the Foundation, serving as a tool in fundraising and in garnering greater media attention.

Ask, Listen, Act allows us to engage the community for both our benefit and theirs. Its methodology can be seen across the Foundation, including in how we learn from our grantees. Every few years we commission the Center for Effective Philanthropy (CEP) to conduct a Grantee Perception Report survey of our grantees to assess our impact and interactions. These reports create a genuine opportunity for the Foundation to reflect on our strategies. In the past, based on feedback, we identified two key areas to improve: consistency of communications and assistance beyond grant dollars. We created cross-regional teams of Program Officers to ensure that grantees could always reach someone with questions or concerns, and we provided several grantees in the South with technical assistance funding to grow their financial and governance infrastructures.

We hold ourselves accountable to the six hundred families that came together from across the country in 2002 to help us with our founding. Their unique circumstances and breadth of perspectives continue to be heard today in the communities we serve, shared with us by our grantees, and so the Foundation’s approach remains steadfast. We hope that the philanthropic community will recognize that the constituencies we serve deserve to be listened to and, more than that, deserve to be recognized as experts on their own lives.

--Zeeba Khalili

Meet Our New GlassPockets Foundation: An Interview with Chris Langston, President & CEO, Archstone Foundation
August 8, 2019

GlassPockets Road to 100

This post is part of our "Road to 100 & Beyond" series, in which we are featuring the foundations that have joined us in building a movement for transparency that now surpasses 100 foundations publicly participating in the "Who Has GlassPockets?" self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, helpful examples, and lessons learned.

Since its inception in 1985 as a healthcare conversion foundation, Archstone Foundation has responded to the implications of changing demographics by supporting innovative responses to the emerging and unmet needs of older adults. The Foundation has funded a wide range of grantees making important contributions in critical, yet often overlooked areas of need.

Today, the Foundation focuses its grantmaking on four major areas:

  • Enabling older adults to remain in their homes and communities;
  • Improving the treatment of late-life depression;
  • Developing innovative responses to the family caregiving needs of older adults; and
  • Expanding the health care and broader workforce needed to care for, and serve, the rapidly growing aging population.

Archstone Foundation is among our newest GlassPockets participants. In this interview with GlassPockets’ Janet Camarena, Chris Langston, President & CEO of the Archstone Foundation, explains why transparency is central to its philanthropic efforts.

GlassPockets: Archstone Foundation was born out of a healthcare conversion, when a nonprofit HMO became a for-profit corporation. Do you think transparency is more important for healthcare conversion foundations to demonstrate that these dollars are being used for public good? Or are there other reasons that you are prioritizing philanthropic transparency?

Chris Langston

Chris Langston: I’m sure the public is more interested in what’s going on with healthcare conversion foundations, as the funds are more clearly a public trust because they derived from the tax advantages given to the nonprofit parent. Because we are an older, smaller conversion, the public has long since forgotten the origin of our endowment, but what we do is still supported by the taxpayers granting favorable treatment to that endowment. Nevertheless, to my mind, conversions and foundations born of a wealthy individual’s gift (or other source) have the same obligation to be transparent. Foundations are granted tremendous autonomy in what they do and how they do it and, beyond some very broad IRS regulations, are accountable only to their boards. As a consequence, I think that we owe the public great visibility into what we do and how we do it. I believe that the great diversity of foundations is a strength of the sector, and I oppose external mandates regarding subject matter, limited lifespan, payout rates, or other aspects of foundation discretion. So, the only remaining constraint is public scrutiny of our process and our work.

GP: We often hear concerns that transparency takes a lot of time and resources, so it's really more relevant for large foundations. Why would you say transparency and openness should be a priority even for foundations with a small team? How have you benefited from your efforts to open up your work?

CL: I see the GlassPockets standards as a floor and not one that takes a great deal of effort to keep shiny. We share through our website our current grants, our strategic plans, our governance documents, and financial reports. Even small foundations need to have these tools and structures and sharing them digitally is no burden. These things change relatively slowly and in the modern era are relatively easy to keep up to date.

Moreover, I’ve worked at two other foundations previously: one that was not very transparent because of inattention to communicating with the public, and one that had historically gone to great lengths to be opaque – the Atlantic Philanthropies during its anonymous giving phase. In neither case did our lack of transparency make our work better – I think it made it worse. We got less constructive engagement from the field, we got less alignment between us and grantees, and we didn’t benefit from the extra energy that comes from knowing that your successes and failures are going to be visible for all to see.

GP: Your commitment to openness includes maintaining a responsive grantmaking program with an open RFP that can be submitted on an ongoing basis. At a time when many foundations are putting up walls by shifting to invitation-only grantmaking, it is notable that you are maintaining this kind of openness with a very small program team of three officers. Why has it been important to maintain the open RFP, and what is your advice for keeping it manageable for lean teams?

CL: Actually, we are right now reviewing our responsive grantmaking program and could very well stop or constrain it. While having an open RFP mechanism is one kind of openness, I am more committed to having an open-door policy. I think it is a legitimate strategic decision as to whether a foundation takes grant applications by invitation only, has a monthly letter of intent review (as we currently do), or something in between. What’s more important is that there be regular opportunities whereby grantseekers can learn from foundation staff about foundation priorities and strategies for change and where foundation staff can learn about the needs and interests of nonprofits in the field and the people in need.

“The GlassPockets process is a thoughtful and well-structured way of getting started in opening up to the public what largely belongs to the public, even if it is held in trust for them by us on the inside.”

GP: How did the GlassPockets self-assessment process help you improve or better understand your organization's level of transparency, and why should your peers participate?

CL: The GlassPockets process is a thoughtful and well-structured way of getting started in opening up to the public what largely belongs to the public, even if it is held in trust for them by us on the inside. Providing the information helps you in many ways – it helps you be sure that you even have all the tools, policies, and procedures of a modern nonprofit (e.g., conflict of interest policies, committee charters, etc.). It helps you whenever you have a twinge of conscience at the thought of making something public, insofar as that is telling you that you are doing something you don’t feel good about – something that doesn’t pass the “would you want to see it on the front page of the paper” test. And the process is part of creating a culture of openness and honesty among and between board, staff, and grantees. Creating this kind of culture is an enormous project undermined by fear, norms of silence, and power differentials – but I think it is critical for effective grantmaking.

GP: Since ideally, transparency is always evolving and there is always more that can be shared, what are some of your hopes for how Archstone Foundation will continue to open up its work in new ways in the future?

CL: Having earned a GlassPockets designation now at a second organization, this is the issue that really interests me – how can we take further steps in transparency? While it is scary and a long-term project to build a shared understanding and the will to change, I hope to make much more information public – for example, grant proposals (at least the funded ones), evaluations, board minutes, budgets, and more. The federal grantmaking process at the National Institutes of Health already does much of this. When I think about government processes, I expect all of that transparency and more – and yet government is at least nominally subject to the control of the voting public. Since foundations do not make their grantmaking or staffing decisions subject to elections, shouldn’t we be even more transparent than government?

Fundamentally, the issue is that among funders and nonprofits, we spend a lot of time not just “reinventing the wheel” but more accurately, reinventing the flat tire. It is not that there is more knowledge or skill on one side or the other of the grantmaking table, it’s that there isn’t enough truth and light illuminating the conversation. And as the party with the power of the purse, it is incumbent on us to go first to change the dialogue if we want to have better results.

--Chris Langston & Janet Camarena

Transparency Levels Go Live on GlassPockets
July 25, 2019

Janet Camarena

Janet Camarena is director of transparency initiatives at Candid.

Earlier this year, we announced a new Transparency Level framework on GlassPockets that would recognize grantmakers for having Core, Advanced, or Champion-level transparency practices based on how much detail each profiled foundation shares on its website. This announcement coincided with GlassPockets reaching its 100th publicly shared profile when the Walton Family Foundation joined and doubled down on its commitment to transparency by supporting GlassPockets in developing the new tiered approach to transparency. Now, for the first time in the history of the platform, these levels are publicly visible when viewing the funders profiled on the site.

Each GlassPockets profile now comes complete with a transparency badge denoting the level that funder has attained. We encourage foundations to proudly display this badge on their websites as a way to demonstrate their commitment to transparency. You can get your badge here. Visitors to “Who Has Glass Pockets?” can also sort by transparency level to see which foundations fall into each one. This sort feature also lists foundations by the number of transparency indicators they currently have, making it possible to quickly determine which foundations lead the pack when it comes to their online transparency practices.

Currently, the distinction of having the most transparent website goes to the Rockefeller Brothers Fund (RBF), so a big congratulations to RBF for living its values when it says that it “is committed to sharing information to promote understanding of its mission and to advance the work of its grantees. The RBF values transparency, openness, and accountability, and has long provided detailed information about its history, program strategies, grants, impact, governance, operations, and finances.” The John D. and Catherine T. MacArthur Foundation, The Wallace Foundation, and the Brazil Foundation currently round out the top four foundations on GlassPockets based on the variety of data shared on their websites.


The transparency levels are designed to motivate foundations to continue to improve their transparency practices over time, and they use the data GlassPockets has collected to suggest pathways for how transparency can evolve. The Core-level transparency practices are a natural entry point for new participants and reflect the data most commonly shared by foundations, which tends to be information about what the foundation does. Advanced transparency practices reveal not just what a foundation does but also how it does it, by sharing information about the foundation’s operations. And Champion-level transparency practices push the current boundaries of what most foundations share online.

If it’s been a while since you’ve updated your GlassPockets profile, or reviewed your website’s transparency practices, now is an excellent time to do so, and you might just level up!

Explore GlassPockets Now

--Janet Camarena

Meet Our #OpenForGood Award Winner: An Interview with Veronica Olazabal, Director of Measurement, Evaluation and Organizational Performance, The Rockefeller Foundation
July 10, 2019

Veronica Olazabal

This post is part of the GlassPockets’ #OpenForGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

The Rockefeller Foundation advances new frontiers of science, data, policy, and innovation to solve global challenges related to health, food, power, and economic mobility. In this interview, Veronica Olazabal shares insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the Rockefeller Foundation and how this practice has helped you to advance your work? Or put another way, what is the good that has come about as a result?

Veronica Olazabal: We are excited to be an inaugural recipient of the #OpenForGood award! As you may be aware, since its founding more than 100 years ago, The Rockefeller Foundation's mission has been “to promote the well-being of humanity throughout the world.” To this end, the Foundation seeks to catalyze and scale transformative innovation across sectors and geographies, and take risks where others cannot, or will not.

While often working in new and innovative spaces, the Foundation has always recognized that the full impact of its programs and investments can only be realized if it measures, and shares, what it is learning. Knowledge and evidence sharing have been core to the organization's DNA dating back to its founder, John D. Rockefeller Sr., who espoused the virtues of learning from and with others—positing that this was the key to "enlarging the boundaries of human knowledge." You can imagine how this, in turn, resulted in transformational breakthroughs such as the Green Revolution, the eradication of Yellow Fever, and the formalization of impact investing.

GP: Your title has the word “evaluation” in its name, and increasingly we are seeing foundations move toward this staffing structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

VO: Learning is a team sport, and to that end, an evaluation and learning team should be centrally positioned and accessible to all teams across a foundation. At the Rockefeller Foundation, the Measurement and Evaluation team engages with both the programmatic and the impact investing teams. We see our role as enablers of good practices around impact management and programmatic learning -- often working with teams from early-stage design support through start-up, implementation and exit. We also work collaboratively with others at the Foundation, such as our grants-management and data teams, to ensure the “right” M&E data is being captured throughout our grantees’ lifecycle.

Yet, I will be the first to say that building a culture of learning by continuously reaching “over the fence” is a lot of work and might be challenging for a small team, which is the reality for most foundations. Benchmarking data produced by the Center for Evaluation Innovation (CEI) and the Center for Effective Philanthropy (CEP) puts most M&E teams at foundations at around 1.5 staff. So, capacity for culture change is clearly a challenge. My suggestion here is to source evaluation and learning talent that balances hard technical chops with softer people skills. I believe you truly need both, and an organization that optimizes for one over the other might experience a series of false starts. A good place to start in sourcing evaluation talent is the American Evaluation Association (AEA).

GP: As you heard during the award presentation, one of the reasons the Rockefeller Foundation was selected to receive this award is because of your commitment to sharing the results of any evaluation you commission, before you even know the outcome. This pledge seems designed to not let negative findings affect your decision about whether or not to share what you learned. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected Rockefeller’s reputation in the field, and why it’s worth it?

VO: In 2017, The Rockefeller Foundation was pleased to be the first to make all of its evaluations available on IssueLab as part of #OpenForGood. But to the Foundation, being open goes well beyond passively making information available to those seeking it. Being truly open necessarily involves the proactive sharing of lessons so that others can be aware of, and build on, the things that we are learning. To that end, we regularly author blogs, disseminate evaluation reports and M&E learnings via digital channels, and – perhaps most importantly – share evaluation results back with our grantees and partners, so that evaluation is more than a one-way extractive exercise.

"Being truly open necessarily involves the proactive sharing of lessons so that others can be aware of and leverage from the things that we are learning."

Taking sharing one step further, earlier this year The Rockefeller Foundation adopted a new Data Asset Policy aimed at making the data that we collect as part of our grantmaking freely available to others who could use it to effect more good in the world. The policy is grounded in two core principles: 1) the data we fund has incredible value for the public good, and these assets can serve as fuel for better decision-making; and 2) we commit ourselves to being responsible stewards of these data, which means prioritizing privacy and protection, especially of the individuals and communities we seek to serve. Moving forward, this opens up the ability to amplify our learning even further and in even more innovative ways.

GP: A concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does Rockefeller include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

VO: Having had the experience of being both a funder and a grantee, I know this is a real barrier to enabling robust learning cultures and evidence-informed decision-making. For this reason, at The Rockefeller Foundation we approach resourcing in a few different ways:

  • First, through embedding resources for evaluation and learning into individual grantee budgets and agreements from the start. This type of funding enables grantees to generate the type of data they need for their own decision-making, learning and reporting.
  • We also often work in a consortium model, where we commission an evaluation and learning grantee separately to synthesize learnings across groups of grantees and provide technical assistance as needed. This approach helps decrease the reporting burden for “implementation” types of grantees, as it generates what it is the Foundation would like to learn (which could differ from what the grantees and their clients find useful). Here is an example from our Digital Jobs Africa portfolio generated through this evaluation and learning model.
  • Finally, we have also at times, and upon request, seconded our own M&E staff to grantees and partners to help build their M&E muscle and enable them to measure their own impact. While this is rare, we are seeing this request more and more, which is why we value both technical expertise and relationship management skills.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your own work.

VO: There are many opportunities to learn from others. In my current role, I am in continuous engagement with colleagues in similar roles at other philanthropies and regularly meet with them before or after convenings organized by CEP, GEO, and AEA. In addition, as part of my work on the Fund for Shared Insight, a funding collaborative working to make listening to end-users the norm, my philanthropy colleagues and I often compare notes on where we all are in our personal and institutional learning journeys.

Finally, as part of a W.K. Kellogg Foundation-funded Lab for Learning, The Rockefeller Foundation was most recently among a cohort of 15 foundations that took part in a year-long series of convenings to address systemic barriers to learning. Participation here required us to experiment with ideas for supporting learning in our own settings and then sharing our experiences with the group. Through this engagement, we learned about how others were building learning habits in their foundations (written about in Julia Coffman’s post here). More specifically, the measurement and evaluation team was able to introduce Making Thinking Visible and Asking Powerful Questions in our early stage support to program teams to push thinking about assumptions and concrete dimensions of the work. This engagement then helped to structure the foundations of a learning agenda (e.g. theory of change-like tool with clear outcomes, hypotheses, assumptions and evidence) that would be used to anchor adaptive management and continuous improvement once the program strategy rolled out.

--Veronica Olazabal & Janet Camarena

Meet Our #OpenForGood Award Winner: An Interview with Lee Alexander Risby, Head of Effective Philanthropy & Savi Mull, Senior Evaluation Manager, C&A Foundation
June 19, 2019

Lee Alexander Risby

This post is part of the GlassPockets’ #OpenForGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

C&A Foundation is a European foundation that supports programs and initiatives to transform fashion into a fair and sustainable industry that enables everyone – from farmer to factory worker – to thrive. In this interview, Lee Alexander Risby and Savi Mull share insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the C&A Foundation and how this practice has helped you to advance your work?

Savi Mull

Savi Mull: For almost five years, C&A Foundation has been dedicated to transforming the fashion industry into a force for good. A large part of that work includes instilling transparency and accountability in supply chains across the industry. From the start, we also wanted to lead by example by being transparent and accountable as an organization, sharing what we were learning whilst on this journey, being true to our work and helping the rest of the industry learn from our successes and failures.

Lee Alexander Risby: Indeed, from the beginning, we made a commitment to be open about our results and lessons by publishing evaluations on our website and dashboards in our Annual Reports. After all, you cannot encourage the fashion industry to be transparent and accountable and not live by the same principles yourself. Importantly, our commitment to transparency has always been championed both by our Executive Director and our Board.

Savi: To do this, over the years we have put many processes in place.  For example, internally we use after-action reviews to gather lessons from our initiatives and allow our teams to discuss honestly what could have been done better in that program or partnership.  We also do third party, external evaluations of our initiatives, sharing the reports and lessons learned. This helps us and our partners to learn, and it informs initiatives and strategies going forward.

The Role of Evaluation Inside Foundations

GP: Your title has the word “evaluation” in its name and increasingly we are seeing foundations move toward this staffing structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

SM: I believe it is essential to have this type of function in a foundation to drive formal learning from and within programs. But at the same time, it is an ongoing process that cannot be driven by one function alone. All staff need to be responsible for the learning that makes philanthropy effective – not just evaluators.

LAR: To begin, we were deliberate in building a team of evaluation professionals to promote accountable learning. We started hiring slowly and built the team over time. What I looked for with each new member of the team, and am always looking for, is an evaluator with more than just technical skills; they also need the influencing, listening, communication, and negotiating skills to help others learn. Evaluations have little effect without good internal and external communication.

“For us, it was important to be a critical friend, listener, and enabler of learning and not the police.”

The evaluation function itself has also evolved over the last five years. It started off as a monitoring, evaluation and learning (MEL) function and is now Effective Philanthropy. From the start, the function was not set up as an independent department but was created to help programmatic teams design appropriate monitoring and evaluation for their programs, and to serve as facilitators and advisors on strategy. However, it has not always been a straightforward process from the inside. In the first years, we had to spend a lot of time explaining and persuading staff of the need for evaluation, transparency and learning and the benefits of doing so. We wanted to avoid a strong independent evaluation function, as that can reduce learning by placing too much emphasis on accountability. For us, it was important to be a critical friend, listener, and enabler of learning and not the police.

SM: So, the first bit of advice is that evaluators should be supportive listeners, assisting programmatic teams throughout the design and implementation phases to get the best results possible. They should not come in just at the end of an initiative to do an evaluation.

LAR: The second piece of advice is on the positioning, support, and structure of evaluation within a foundation. Firstly, it is critical to have the buy-in of the leadership and board for both evaluation and transparency. And secondly, the evaluation function must be part of the management team and report to the CEO or Executive Director. This gives reporting and learning the appropriate support structure and importance.

The third piece of advice is to consider not creating an evaluation function, but an effective philanthropy function. Evaluation is done for learning, and learning drives effectiveness in grant-making for better results and long-term impacts on systems.

SM: The final piece of advice is to take guidance from others outside your organization. The whole team has consulted broadly with former colleagues and mentors from across the evaluation community as well as experienced philanthropic professionals. Remember you are part of a field with peers whose knowledge and experience can help guide you.

Opening Up Pain Points

GP: One of the reasons the committee selected C&A Foundation to receive the award is because of your institutional comfort level with sharing not just successes, but also being very forthright about what didn’t work. We often hear that foundation boards and leaders are worried about reputational issues with such sharing. What would you say to those leaders about how opening up these pain points and lessons has affected C&A Foundation’s reputation in the field, and why it’s worth it?

LAR: I would say this. The question for foundation boards and leaders is straightforward: do you want to be more effective and have an impact? The answer to that will always be yes, but it is dependent on learning and sharing across the organization and with others. If we do not share evaluations, research or experiences, we do not learn from each other and we cannot be effective in our philanthropic endeavors.

"There is a benefit to being open, you build trust and integrity – success and failure is part of all of us."

The other question for boards and leaders is: who does philanthropy serve? For us, we want to transform the fashion industry, which is made up of cotton farmers, workers in spinning mills and cut-and-sew factories, consumers and entrepreneurs, to name a few – they are our public. As such, we have a duty to be transparent to the public about where we are succeeding, where we have failed, and how we can improve. We do not think there is a reputation risk. In fact, there is a benefit to being open: you build trust and integrity – success and failure are part of all of us.

SM: Adding to what Lee has said, being open about our failures not only helps us but the entire field. Some of our partners have felt reticent about our publishing evaluations, but we always reassure them and stress from the beginning of an evaluation process that it is an opportunity to understand how they can improve their work and how we can improve our partnership, as well as a chance to share those lessons more broadly.

Learning While Lean

GP: Given the lean philanthropy staffing structures in place at many corporate foundations, do you have any advice for your peers on how those without a dedicated evaluation team might still be able to take some small steps to sharing what they are learning?

SM: Learning is a continuous process. In the absence of staff dedicated to evaluation, take baby steps within your power, such as implementing after-action reviews, holding thematic webinars, or doing quick summaries of lessons from grants and/or existing evaluations from others. If the organization’s leadership endorses learning, these small steps are a good place to start.

GP: And speaking of lean staffing structures, a concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does C&A Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

SM: The foundation has a Monitoring and Evaluation Policy that lays out the roles of the programmatic staff and partners as well as of the dedicated Effective Philanthropy team. C&A Foundation partners are generally responsible for the design and execution of a self-evaluation, to be submitted at the end of the grant period. External evaluation budgets are covered by the foundation and do not pose a financial burden on partners at all. They are included in the overall cost of an initiative, and when needed we have an additional central evaluation fund that is used to respond to the programmatic teams’ and partners’ ad hoc demands for evaluations and learning.

The Effective Philanthropy team does provide technical assistance to partners and foundation staff upon request. The guidance ranges from technical inputs related to the theory of change development to the design of baseline and mid-line data collection exercises. The theory of change work has been really rewarding for partners and ourselves. We all enjoy that part of the work.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your work.

Learning Leads to Effectiveness

LAR: In moving from a more traditional MEL approach to effective philanthropy, we looked at the work of other foundations. This included learning from the William and Flora Hewlett Foundation, the Rockefeller Foundation, and others. We had discussions with a number of peers in the field. We also asked Nancy MacPherson (formerly Managing Director of Evaluation at Rockefeller) and Fay Twersky (Director of Effective Philanthropy at Hewlett) to review our Effective Philanthropy strategy when it was under development. Their feedback and advice helped a lot. In the end, we decided to begin to build out the function in a similar way to the Hewlett Foundation. But there are some differences. For example, our evaluation practice is currently positioned at a deeper initiative level, which is related to the field context, where there is a significant evidence gap across the fashion industry that needs to be filled. Concomitant to this is our emphasis on piloting and testing, which goes hand-in-hand with the demand for evaluative thinking, reporting, and learning.

Our team has also been influenced by our own successes and failures from previous roles. That has also inspired us to embrace a slightly different approach.

SM: In terms of where we are at the moment, we still oversee performance monitoring, evaluation, and support to the program teams in developing theories of change and KPIs, but we are also building out an organizational learning approach and are in the process of hiring a Senior Learning Manager. Lastly, we are piloting our organizational and network effectiveness work in Brazil, which is being led by a colleague who joined the foundation last year.

LAR: We are also in the midst of an Overall Effectiveness Evaluation (OEE) of C&A Foundation’s first 5-year strategy. In general, this is not a type of evaluation that foundations use much. As well as looking at results, the evaluators are evaluating the whole organization, including Effective Philanthropy. For me as an evaluator, it has been really rewarding to be on the other side of a good question.

We are learning from the OEE as we go along, and we decided to create ongoing opportunities for reporting and feedback from the process rather than waiting until the very end for a report. This means that program staff can be engaged in proactive discussions about performance and emerging lessons in a timely way. The OEE is already starting to play a vital role in informing the development of the next 5-year strategy and our organization. But you will surely hear more on that evaluation process later, as it will be published. There is always room for improvement, and learning never stops.

--Lee Alexander Risby and Savi Mull

Meet Our #OpenForGood Award Winner: An Interview with Craig Connelly, Chief Executive Officer, The Ian Potter Foundation
June 12, 2019

Craig Connelly

This post is part of the GlassPockets’ #OpenForGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

The Ian Potter Foundation is an Australian foundation that supports and promotes excellence and innovation working for a vibrant, healthy, fair, and sustainable Australia. In this interview, Craig Connelly shares insights with GlassPockets' Janet Camarena about how the foundation’s practices support learning and open knowledge.

GlassPockets: Congratulations on being one of our inaugural recipients of the #OpenForGood award! The award was designed to recognize those foundations that are working to advance the field by sharing what they are learning. Can you please share why you have prioritized knowledge sharing at the Ian Potter Foundation and how this practice has helped you to advance your work? Or put another way, what is the good that has come about as a result?

Craig Connelly: The Ian Potter Foundation decided to invest in our research and evaluation capability primarily to improve the quality of our grantmaking. We believe that measuring and evaluating the outcomes of our grantees and the work that we fund enables us to understand the extent to which our funding guidelines are achieving the intended outcomes. This results in a more informed approach to our grantmaking, which should improve its quality over time.

A core part of this includes being completely transparent with our grantees and with the broader sector. To do anything otherwise would be inconsistent with our expectations of our grantees. We are asking our grantees to be partners and to pursue a strategic relationship with us, and that requires open and honest conversation. Therefore, we need to be an open, honest and transparent funder, and demonstrate that, in order to win the trust of the organizations we fund.

Examples of this transparency are the learnings that we glean from our grantees that we share with the broader sector. We’re getting very positive feedback from both funders and grantees on the quality of the learnings that we’re sharing and the value that they add to the thought processes that nonprofit organizations and other funders go through.

GP: Increasingly we are seeing foundations move toward a structure of having staff dedicated to evaluation and learning. For those foundations that are considering adding such a unit to their teams, what advice do you have about the structures needed to create a culture of learning across the organization and avoid the creation of one more silo?

CC: Anyone in a research and evaluation role needs to be an integral part of the program management team. The research and evaluation process informs our grantmaking. It needs to assist the program managers to be better at what they do, and it needs to learn from what the program managers are doing as well. You don’t want it to be a silo; it is just another function of your program management team. It is an integral part of that team, and it is in constant communication both with the program management team and with grantees from day one.

GP: As you heard during the award presentation, one of the reasons the Ian Potter Foundation was selected to receive this award is because of how you prioritize thinking about how stakeholders like grantees might benefit from the reports and knowledge you possess. We often hear that while there is a desire to share grantee reports publicly, reputational concerns prevent it, or scrubbing the reports of sensitive information would be too time consuming, yet you do it for all of your portfolios. What are your tips for keeping this a manageable process?

CC: The initial work to compile and anonymize our grantee learnings required some investment in time from our Research & Evaluation Manager and communications team. To make this task manageable, the work was tackled one program area at a time. Now that a bank of learnings has been created for each program area, new learnings are easily compiled and added on a yearly basis. This work is scheduled at less busy times for those staff involved. The Ian Potter Foundation is also looking at ways learnings can be shared directly from grantees to the wider nonprofit sector. One idea is to create a forum (e.g. a podcast) where nonprofits can share their experiences with their peers in the sector.

GP: A concern we often hear is that a funder creating a culture of learning leads to an increased burden on grantees who are then asked for robust evaluations and outcomes measures that no one is willing to pay for. Does The Ian Potter Foundation include funding for the evaluations and reporting or other technical assistance to mitigate the burden on grantees?

"...we need to be an open, honest and transparent funder and demonstrate that in order to win the trust of the organizations we fund."

CC: One of the benefits we found at The Ian Potter Foundation of having a Research & Evaluation Manager become an integral part of our process is that our authorizing environment – our board and the committees responsible for program areas – has become very comfortable including funding for evaluation in all of our grants. We now also understand what it costs to complete an effective evaluation. We often ask grantees to add more to their budget to ensure a good quality evaluation can be completed as part of the grant.

GP: Learning is a two-way street and foundations are both producers and consumers of knowledge. Let’s close this interview with hearing about a noteworthy piece of knowledge you recently learned thanks to another foundation or organization sharing it, and how it helped inform your own work.

CC: Yes, we have a couple of examples I can point to. The first comes from our Education Program Manager, Rikki Andrews, who points to the creation of the Early Childhood Impact Alliance (ECIA) through a grant to the University of Melbourne. The purpose of the ECIA is to convene, connect and increase understanding of research and policy among early childhood philanthropic funders, to ensure there is more strategic and concerted philanthropic support of research and its application.

Additionally, the Foundation’s Senior Program Manager, Dr. Alberto Furlan, explains, ‘We are in the process of learning from organizations we partner with all the time. In the last few years, program managers have been prioritizing extensive site visits to shortlisted applicants to discuss and see the projects in situ. In a ‘big country’ such as Australia, this takes a considerable amount of time and resources, but it invariably pays off. Such visits highlight the importance of relationship building and deep, honest listening when partnering with not-for-profits. The Foundation prides itself on being open and approachable, and site visits greatly contribute to understanding the reality of the day-to-day challenges, and successes, of the organizations working on the ground.’

--Craig Connelly & Janet Camarena

Candid Announces Inaugural #OpenForGood Award Winners
May 30, 2019

Janet Camarena is director of transparency initiatives at Candid.

This post is part of the GlassPockets’ #OpenForGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

#OpenForGood Awardees and Committee Members, left to right: Meg Long, President, Equal Measure (#OpenForGood selection committee); Janet Camarena, Director, Transparency Initiatives, Candid; Awardee Savi Mull, Senior Evaluation Manager, C&A Foundation; Awardee Veronica Olazabal, Director, Measurement, Evaluation & Organizational Performance, The Rockefeller Foundation; Clare Nolan, Co-Founder, Engage R + D (#OpenForGood selection committee).

Yesterday, as part of the Grantmakers for Effective Organizations Learning Conference, Candid announced the inaugural recipients of the #OpenForGood Award, which is designed to recognize and encourage foundations to openly share what they learn so we can all get collectively smarter. The award, part of a larger #OpenForGood campaign started in 2017, includes a set of tools to help funders work more transparently, including a GrantCraft Guide about how to operationalize knowledge sharing, a growing collection of foundation evaluations on IssueLab, and advice from peers in a curated blog series.

The three winning foundations each demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. The winners were selected by an external committee through a globally sourced nomination process; the committee reviewed the contenders looking for evidence of an active commitment to open knowledge, creative approaches to making knowledge shareable, field leadership, and the incorporation of community insights into knowledge-sharing work.

And the Winners Are…

Here are some highlights from the award presentation remarks:

C&A Foundation
Award Summary: Creativity, Demonstrated Field Leadership, and Willingness to Openly Share Struggles

The C&A Foundation is a multi-national corporate foundation working to fundamentally transform the fashion industry. C&A Foundation gives its partners financial support, expertise and networks so they can make the fashion industry work better for every person it touches. Lessons learned and impact for each of its programs are clearly available on its website, and helpful top-level summaries are provided for every impact evaluation, making a lengthy narrative evaluation very accessible to peers, grantees and other stakeholders. C&A Foundation even provides such summaries for efforts that didn’t go as planned, packaging them in an easy-to-read, graphic format that it shares via its Results & Learning blog, rather than hiding them away and quietly moving on, as is more often the case in the field.

The Ian Potter Foundation
Award Summary: Creativity, Field Leadership, and Lifting Up Community Insights

This foundation routinely publishes collective summaries from all of its grantee reports for each portfolio as a way to support shared learning among its existing and future grantees. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than questions that satisfy the typical ritual of a grant report: submit, data enter, file away never to be seen, and repeat.

Beyond being transparent with its grantee learning and reports, the Ian Potter Foundation also recently helped lift the burden on its grantees when it comes to measurement and outcomes. Instead of asking overworked charities to invent a unique set of metrics just for their grant process, foundation evaluation staff took it upon themselves to mine the Sustainable Development Goals targets framework to provide grantees with optional and ready-made outcomes templates that would work across the field for many funders. You can read more about that effort underway in a recent blog post here.

The Rockefeller Foundation
Award Summary: Field Leadership, Consistent Knowledge Sharing, and Commitment to Working Transparently

The Rockefeller Foundation can boast early-adopter status on transparency and openness: it has had a longstanding commitment to creating a culture of learning, and as such was one of the very first foundations to join the GlassPockets transparency movement and to commit to #OpenForGood principles by sharing its published evaluations widely. The Rockefeller Foundation also took the unusual step of upping the ante on the #OpenForGood pledge, aiming to create a culture of both learning and accountability, with its monitoring and evaluation team stating: “To ensure that we hold ourselves to a high bar, our foundation pre-commits itself to publicly sharing the results of its evaluations, well before the results are even known.” This ensures that even if an evaluation reports unfavorable findings, the intent is to share it all.

In an earlier GlassPockets blog post, Rockefeller’s monitoring and evaluation team shows a unique understanding of how sharing knowledge can advance the funder’s goals: “Through the documentation of what works, for who, and how/under what conditions, there is potential to amplify our impact, by crowding-in other funders to promising solutions, and diverting resources from being wasted on approaches that prove ineffectual.”  Rockefeller’s use of IssueLab’s open knowledge platform is living up to this promise as anyone can currently query and find more than 400 knowledge documents funded, published, or co-published by the Rockefeller Foundation.

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in knowledge dissemination. Knowledge Centers are a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insight, promote analysis on your grantees, and feature learnings from network members. All documents that are uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers, regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. Today, we live in a time when most people expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, today only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go toward creating a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation's work, the greater the opportunities to make all our efforts more effective and farther reaching.

Congratulations to our inaugural class of #OpenForGood Award Winners! What will you #OpenForGood?

--Janet Camarena

How the Sustainable Development Goals Can Focus Outcomes Measurement
April 25, 2019

GlassPockets Road to 100

Dr. Squirrel Main is the Research and Evaluation Manager at The Ian Potter Foundation in Australia.

This post is part of our "Road to 100 & Beyond" series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the "Who Has GlassPockets?" self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations, helpful examples, and lessons learned.

We all can play a small part in broader global movements, both in our grantmaking and our outcomes measurement. As such, The Ian Potter Foundation is beginning to encourage grantees to learn more about the United Nations Sustainable Development Goals (SDGs). As the Foundation's research and evaluation manager, I have found grantees often have difficulty pitching their progress and successes in a manner that readily translates across contexts and stakeholders. For example, a grantee may be seeking ongoing funding from local, state and Commonwealth governments while also reaching out to an Aboriginal Community Controlled Health organization. The SDGs, especially when contextualized at a local level, can speak to all four stakeholders and more.

In terms of outcomes measurement, as a foundation we support the global goals and, as such, are increasingly offering grantees the option to use the global indicators behind them. Tracking the SDGs can help grantees increase the sophistication of their measurements: the previous "all of our children are doing well" becomes a clearer "we know that 85% of our 112 participants are now developmentally on track (up from 44%), as measured by their AEDC scores." It's easy to see how the latter sentence translates readily into government dollars—and as we know, leverage is the currency of philanthropy.

In addition to increasing grantees' leverage potential, our foundation can better focus the way we track and achieve outcomes. Having such clear outcomes is much easier—dare I say "more fun"?—when placed in the context of a global measurement movement. The Ian Potter Foundation was proud to join the GlassPockets movement last year because we believe transparency can benefit the philanthropic sector, particularly given the benefits of shared frameworks for learning. In that vein, here's what we are learning from our experimentation with the SDGs.

The Process of Integrating SDGs into Foundation Work

How do we encourage grantees and applicants to use the SDGs to measure their outcomes? On a very practical note, it meant adding the relevant SDGs to our application via a drop-down menu in our grants management software (some databases now have add-on modules you can purchase for this job). While grantees are free to select the outcomes measurements best suited to their stakeholders' needs, since mid-2016, 105 of 379 final-stage applicants have voluntarily opted to select SDGs as potential outcomes. To assist this process, we have color-coded the SDGs on our help sheets, with the goal number listed in parentheses (see, for example, our Environment and Conservation help sheet).

In terms of process specifics, we are gradually transitioning from open-form goals to suggested goals to SDGs, and have produced documents that outline suggested goals and example metrics for grants in each program area. In Q3 of 2019 we will narrow the outcomes further, which will likely mean that over 85% of the outcomes listed on our application will be SDG indicators.

How the SDGs Appear Across the Foundation's Work

The SDGs manifest themselves in very different ways across our broad portfolio. Currently, direct outcome measurement, SDG-aligned research, and strategic initiatives are the approaches where we most often find alignment with the SDGs.

Direct measurement can be relatively straightforward. For instance, our science grantmaking focuses predominantly on environmental restoration and conservation, so most grantees find it easy to align their outcomes with Goals 13 (Climate Action), 14 (Life Below Water) and 15 (Life on Land). One example is a grant we continued last year to Professor Jessica Meeuwig at the Marine Futures Lab at the University of Western Australia to increase protection, monitoring and reporting of marine reserves around the Australian coastline. Professor Meeuwig selected "Proportion of important sites for terrestrial, freshwater and marine biodiversity that are covered by protected areas, by ecosystem type (SDG 15.1.2)" as one of her long-term metrics. Easy. Watch this space and we will know the results.

In terms of research, we are attempting to go beyond direct goal accomplishment. For instance, we have engaged in some blue-sky thinking in this area and are supporting Deakin University researcher Brett Bryan to bring the SDGs to a local level. So, for example, one of the project's goals reads: "Derive detailed local sustainability pathways for the Goulburn-Murray study area … assessing the range and viability of options (e.g. irrigation reconfiguration, ecosystem services markets, renewable energy) … to ensure a just transition to a more sustainable future…" In short, these researchers are bringing sophisticated mathematical models to old-fashioned community meetings to determine the best way to help communities meet goals aligned with the SDGs that are most important to that community. In his six-month face-to-face check-in, Professor Bryan observed that the Victorian State Government recently decided to use SDGs as THE framework for future environmental reports. This move further underscores the need for communities and smaller grantees to be fluent in "SDGese" in order to remain salient in the political realm over the next decade. To put a spin on the old adage, when government sneezes, grantees catch cold!

Lastly, some grantees apply the SDGs beyond research to strategic policy work. To facilitate measuring this type of work, we divide long-term outcomes into technical (outcomes for an immediate group, project or organization) and strategic (large policy or systemic change). The SDGs are very nimble and can be applied to both types of outcome. For example, a grantee focusing on technical success, like our grant to expand Youthworx's capacity to build its social enterprise, might choose indicator 8.6.1 (proportion of youth aged 15-24 engaged in education, employment or training) for its hands-on training programs. Other projects, even by the same organisation (one example, funded by others, is Youthworx's National Youth Commission project), focus on more 'strategic' outcomes such as 8.b.1 (existence of a developed and operationalized national strategy for youth employment, as a distinct strategy or as part of a national employment strategy). We encourage grantees to pick what's right for them, and remind them that it's OK to just do solid service delivery if that's their main modus operandi.

Do the SDGs work neatly for every area of our funding? To be honest, no. Unlike other areas, the arts are much trickier to align with the SDGs. We acknowledge the distinction between vibrancy and sustainability. And, while some arts-focused foundations choose to measure progress based on sub-goals related to culture (e.g., Goal 3 (well-being), 4 (education) and 11 (cities and communities)), we have chosen—for now—to espouse the outcomes listed by Australia's Cultural Development Network and offer those options in our drop-down menus. Out of our seven major funding areas, the arts are the only program area for which we do not have SDGs as outcome measurement options.

Our Role in Building SDG Capacity

In addition to encouraging applicants to select (and measure) SDG-related outcomes on the application, we convene Welcome Workshops after every Board meeting, in which grantees gather to learn about our foundation and priorities. These workshops are also an opportunity for grantees within the same program area to discuss dissemination, goal setting and outcomes measurement. To this end, part of our presentation specifically references the SDGs and encourages grantees to consider how their measurements align. We also conduct face-to-face, post-award evaluation site visits with the majority of grantees, and these visits present another opportunity to consider how they will collect data and reflect on learnings related to measuring their long-term outcomes. We have found that in the last few funding rounds, grantees are very knowledgeable about the SDGs and keen to collaborate and learn more about existing models of measurement within their field. No one wants to reinvent wheels when shared frameworks already exist.

Measuring the Difference

And, of course, we, like you, wonder whether the focus on SDGs will make a tangible difference to our foundation's outcomes. Our current active grants have an average duration of 2 years, 9 months (and that average is lengthening), so we have yet to analyse our progress—or, more importantly, learn and improve the trajectory of our progress towards the SDGs. However, in preparation for measuring this new outcomes framework, we have a baseline benchmark to use as a comparison. Presently, for the 833 grants closed since January 2010 (our foundation is 50 years old, but our outcomes measurement is relatively new!) for which we have been able to gather long-term outcomes, we are achieving a 71% success rate. Within the next year, as we review final reports, we will begin to see results framed in terms of the SDGs, which will help us measure and learn from our progress towards these global goals. And ideally—although we acknowledge that 100% success is not the holy grail of philanthropy—we will be able to show how focusing on the SDGs (and the collective learnings and wisdom associated with progress towards those goals) has helped us strive towards a more vibrant, fair, healthy and sustainable Australia.

-- Squirrel Main

Designing for Impact: Using a Web Redesign to Improve Transparency, Equity, and Inclusion
April 11, 2019

This post is part of our "Road to 100 & Beyond" series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the "Who Has GlassPockets?" self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations, helpful examples, and lessons learned.

Na Eng

Na Eng is the communications director at the McKnight Foundation, a private family foundation based in Minneapolis.

The McKnight Foundation is proud to be among the early group of foundations that joined the GlassPockets movement and has benefited from its tools and resources. As GlassPockets crosses the threshold of 100 foundation transparency profiles on its website, I wanted to share a personal reflection on how McKnight approaches transparency on our website, and how GlassPockets has been part of that journey.

When I decided on a redesign of our website about a year ago, I knew that there was a great body of knowledge we could tap into by reviewing GlassPockets tools and content, so I scheduled a call with Janet Camarena, who leads the website and initiative to encourage greater foundation transparency. In this new version of our web presence, I wanted to design for transparency from the start. GlassPockets didn't disappoint, and Janet offered a helpful perspective from her years of observing the paths and barriers faced by our peers on the road to transparency.

While the word transparency can sometimes feel like a clinical term, Janet explained that transparency and openness can humanize institutions through the power of storytelling, and we all know foundations have powerful stories about the impact of their grantees. When I asked her about the common tendency of foundations to embrace a stance of humility, she nodded. She said she often hears that humility can stand in the way of embracing a "GlassPockets approach," preventing foundations from seeing storytelling as an act of public service rather than as self-serving content.

This conversation reaffirmed for me one of the core benefits of foundation transparency: when the public knows more about what foundations fund and how they approach their work, trust is built, advancing the entire field of philanthropy, the nonprofits we support, and our collective impact.

GlassPockets Road to 100

How McKnight Advances Transparency with its Website

A key purpose of our foundation website is pragmatic and impactful transparency. With our web developer, Visceral, we tried to make the site as fun to peruse and as simple to navigate as possible, and we packed it with information to help people conduct practical business. For example, we now include all the details on how to seek funding, how to reserve a meeting space, and even the investments we make in our impact investing portfolio. We also have a robust, easy-to-search grants database, which makes us a rarity among national funders. According to the GlassPockets Transparency Challenge, only about one of every 100 foundations shares current grants data online. Lists of grants, combined with compelling images and vignettes throughout the site, help others better understand our organization's mission.

In addition, I’ve come to realize that providing more information does not necessarily achieve greater transparency. It’s just as essential to offer an updated, accurate representation of our work—and that means clearing the clutter. (Consider the KonMari method of thanking what no longer has value, and then letting go.) An external website should not be used as an internal digital archiving system. We’ve learned that dated content often causes confusion about our current purpose and identity. However, for scholarly use, we do archive older reports with IssueLab, which has an impressive open knowledge-sharing system.

Digital Accessibility & Linguistic Inclusion

Transparency also requires understanding the needs of diverse audiences and making digital inclusion a priority. When we set out to make our site more user-friendly for people who are hard of hearing or blind, we commissioned an accessibility audit. And rather than rely on web-based scanners, we asked people with the relevant disabilities to evaluate its accessibility. Among the changes, we added closed captioning to all our videos, at little cost. We’ve since expanded closed captioning to more than a dozen languages, all spoken in our home state of Minnesota, including Hmong, Laotian, Somali, Oromo, Arabic, Chinese, Spanish, and others.

A website can leave people behind or it can inspire more people to advance the mission.

Our efforts toward digital inclusion, which enable transparency for people with different physical and linguistic abilities, are ongoing. We still have much to learn. We're now learning more about the technical needs of people in low-bandwidth zones in the developing world, rural communities, and even in pockets of metro areas. When most digital communications are designed for able-bodied English language speakers who have access to high-speed internet, significant population groups are cut off from the ideas and opportunities we offer, and we’re deprived of the chance to connect with people who have so much to contribute to advancing our mission.

Our society often thinks of discrimination in terms of individual actions, giving scant attention to systemic barriers. These are insidious obstacles created and maintained, often unintentionally, even by people of goodwill—simply because they’re not aware of the impact of these barriers on those who are not just like them.

The website of an organization that has the power to distribute resources, bestow awards, and select new staff and partners can be an instrument for perpetuating or disrupting inequity. And when a foundation has important ideas to spread—in our case, ideas about advancing a just, creative, and abundant future where people and planet thrive— a website can leave people behind... or it can inspire more people to advance the mission.

Thankfully, we have movements like GlassPockets urging us all to move toward more pragmatic, inclusive, and impactful transparency.

--Na Eng

Don’t “Ghost” Declined Applicants: The Ins and Outs of Giving Applicant Feedback
April 4, 2019

Mandy Ellerton joined the [Archibald] Bush Foundation in 2011, where she created and now directs the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community problems, using approaches that are collaborative and inclusive of people who are most directly affected by the problem.

GlassPockets Road to 100

This post is part of our "Road to 100 & Beyond" series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the "Who Has GlassPockets?" self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, promising practices in transparency, helpful examples, and lessons learned.

I’ve often thought that fundraising can be as bad as dating. (Kudos to you lucky few who have had great experiences dating!) Lots of dates, lots of dead ends, lots of frustrating encounters before you (maybe) find a match. All along the way you look for even the smallest sign that someone likes you. “They laughed at my joke!” or, in the case of fundraising, “they seemed really excited about page five of last year’s impact report!” Not to mention the endless time spent doing online searches for shreds of information that might be useful. This reality is part of the reason why Bush Foundation was proud to be among the first 100 foundations to participate in GlassPockets. We believe that transparency and opening lines of communication are critical to good grantmaking, because both in dating and in fundraising, it can be heartbreaking and crazymaking to try to sort out whether you have a connection or whether someone’s “just not that into you.” If only there were a way to just “swipe left” or “swipe right” and make everything a little simpler.

“We believe that transparency and opening lines of communication are critical to good grantmaking.”

I’m not proposing a Tinder for grantmaking (nor should anyone, probably, although hat tip to Vu Le for messing with all of us and floating the idea on April Fool’s Day). But over the past several years, Bush Foundation’s Community Innovation program staff has used a system to provide feedback calls for declined applicants, in the hopes of making foundation fundraising a little less opaque and crazymaking. We use the calls to be transparent and explain why we made our funding decisions. The calls also help us live out our “Spread Optimism” value because they allow us to help and encourage applicants and potentially point them to other resources. This is all part of our larger engagement strategy, described in “No Moat Philanthropy.”

 


How Feedback Calls Work

We use a systematic approach for feedback calls:

  • We proactively offer the opportunity to sign up for feedback calls in the email we send to declined applicants.
  • We use a scheduling tool (after trying a couple of different options, we’ve landed on Slotted, which is relatively cheap and easy to use) and offer a variety of times for feedback calls every week. Collectively, five Community Innovation Team members hold about an hour a week for feedback calls. The calls typically last about 20 minutes. We’ve found this is about the right amount of time, and it allows us to offer feedback calls to most of the declined applicants who want them.
  • We prepare for our feedback calls. We re-read the application and develop an outline for the call ahead of time.
  • During the call we offer a couple of reasons why we declined the application. We often discuss what an applicant could work on to strengthen their project and whether they ought to apply again.
  • We also spend a lot of time listening; sometimes these calls can understandably be emotional. Grant applications are a representation of someone’s hopes and dreams and sometimes your decline might feel like the end of the road for the applicant. But hang with them. Don’t get defensive. However hard it might feel for you, it’s a lot harder for the declined applicant. And ultimately, hard conversations can be transformative for everyone involved. I will say, however, that most of our feedback calls are really positive exchanges.
  • We use anonymous surveys to evaluate what people think of the feedback calls, and during each call we ask whether the applicant has any feedback for us to improve our programs and grantmaking process.
  • We train new staff on how to do feedback calls. We have a staff instruction manual on how to do feedback calls, but we also have new team members shadow more seasoned team members for a while before they do a feedback call alone.

 

What’s Going Well

The feedback calls appear to be useful for both declined applicants and for us:

  • In our 2018 surveys, respondents (n=38) rated the feedback calls highly. They gave the calls an average rating of 6.1 (out of 7) for overall helpfulness, 95% said the calls added some value or a lot of value, and 81.2% said they had a somewhat better or much better understanding of the programs after the feedback call.
  • We’ve seen the number of applications for our Community Innovation Grant and Bush Prize for Community Innovation programs go down over time, and we’ve seen the overall quality go up. We think that’s due, in part, to feedback calls that help applicants decide whether to apply again and help them improve their projects to become a better fit for funding in the future.
  • I’d also like to think that doing feedback calls has made us better grantmakers. First, it shows up in our selection meetings. When you might have to talk to someone about why you made the funding decision you did, you’re going to be even more thoughtful in making the decision in the first place. You’re going to hew even closer to your stated criteria and treat the decision with care. We regularly discuss what feedback we plan to give to declined applicants in the actual selection meeting. Second, in a system that has inherently huge power differentials (foundations have all of it and applicants have virtually none of it), doing feedback calls forces you to come face to face with that reality. Never confronting the fact that your funding decisions impact real people with hopes and dreams is a part of what corrupts philanthropy. Feedback calls keep you a little more humble.

 

What We’re Working On

We still have room to improve our feedback calls:

  • We’ve heard from declined applicants that they sometimes get conflicting feedback from different team members when they apply (and get declined) multiple times; 15% of survey respondents said their feedback was inconsistent with prior feedback from us. Cringe. That definitely makes fundraising more crazymaking. We’re working on how to have more staff continuity with applicants who have applied multiple times.
  • We sometimes struggle to determine how long to keep encouraging a declined applicant to improve their project for future applications versus saying more definitively that the project is not a fit. Yes, we want to “Spread Optimism,” but although it never feels good for anyone involved, sometimes the best course of action is to encourage an applicant to seek funding elsewhere.

I’m under no illusions that feedback calls are going to fix the structural issues with philanthropy and fundraising. I welcome that larger conversation, driven in large part by brave critiques of philanthropy emerging lately like Decolonizing Wealth, Just Giving and Winners Take All. In the meantime, fundraising, as with dating, is still going to have moments of heartache and uncertainty. When you apply for a grant, you have to be brave and vulnerable; you’re putting your hopes and dreams out into a really confusing and opaque system that’s going to judge them, perhaps support them, or perhaps dash them, and maybe even “ghost” them by never responding. Feedback calls are one way to treat those hopes and dreams with a bit more care.

--Mandy Ellerton
