Transparency Talk


Part 2: Top 10 Lessons Learned on the Path to Community Change
June 25, 2013

(Robert K. Ross, M.D. is President and CEO of The California Endowment. Yesterday he shared three aha moments from the Endowment’s first two years of work in its Building Healthy Communities plan.)

Okay, at times I step back and look at the BHC initiative and wonder—could we have made it more complicated? 14 sites. Multiple grantees in each site. A core set of multiple health issues. Multiple state-level grantees. And the expectation that the parts will add up to something greater and catalyze a convergence that builds more power and leads to greater impact.

But then again, supporting an agenda for social and community change does require multiple strategies, operating in alignment: the use of data, message framing and story-telling; innovative models; a variety of influential messengers; convening and facilitating champions; “grassroots and treetops” and coordination; meaningful community engagement. Power-building requires multiple, aligned investments.

Our Top Ten Lessons for Philanthropy

Finally, I want to share some lessons with partners in philanthropy regarding planning and implementing a community-change initiative. As we engaged in the planning process of BHC, we tried in earnest to stick by a key aphorism, one I learned from colleague and mentor Ralph Smith at the Annie E. Casey Foundation: make new mistakes.

The track record of community change work by philanthropy is not a work of art. Tapping into the wisdom of institutions such as the Aspen Institute, the Annie E. Casey Foundation, the Hewlett Foundation, the Skillman Foundation, the Marguerite Casey Foundation, and the Northwest Area Foundation, we incorporated the lessons of success and struggle from our colleagues in the field. Learning from these and other colleagues, we were able to avoid hitting major rocks as our BHC ship sailed out of harbor. So, we learned the following:

Community engagement in planning processes will be simultaneously exhilarating and messy.

1. Take time to plan, and plan to take the time. We embarked on a nine-month community engagement process in the 14 BHC sites, and we ended up taking 12-15 months. Nobody died, and nobody got fired. Community engagement in planning processes will be simultaneously exhilarating and messy. If it is going too smoothly and too well, then something may be terribly wrong – like the possibility that a foundation is not receiving candid, meaningful input from local leaders. If it is bumpy and messy, and getting to consensus and clarity is taking much longer than originally planned, it may very well mean that you are gaining the trust of leaders to raise thorny, difficult issues. As a general rule, we just took the time that was needed for local leaders to develop their local BHC plans, and we did not pit BHC sites against one another to race by the foundation’s clock. Community leaders want a compass more than they want a clock.

2. Don’t lead with the money. The issue of whether to announce “how much” the dollar commitment is in a foundation initiative is a tricky one. On the one hand, a major dollar-commitment announcement by a foundation can generate excitement and anticipation, and mobilize civic and community support. On the other hand, “leading with the money” can instigate all manner of posturing, control issues, manipulation, and political grantsmanship among potential grantees. We decided to quietly announce the breadth and scope of our commitment – $1 billion over a ten-year period in local and statewide policy funding – but veered away from formally announcing precise budget commitments in each site. In other words, we wanted to send a message that our commitment was serious without leading the conversations with grant dollar puppetry.

3. Date logic models, but get married to learning. There is no doubt that engaging in the disciplined exercise of articulating how you think – and how community leaders believe – positive change and results will happen is a sound practice. But it is also important to recognize that community change and positive results in the context of complex social and political systems often defy tidy, linear models. If you want to get married, it is wiser to commit to the process of active, dynamic, real-time learning. We provided logic model training for leaders in the 14 BHC sites, with varying levels of effectiveness across the sites; we have been clear, however, that learning is not optional, either for grantees or our own program staff.

4. Be transparent about desired results. There are written and unwritten axioms about the need for philanthropy to be completely community driven in community-change work. Our experience is that this thinking is a truism without being entirely true. For starters, our foundation is legally chartered as a health foundation, and although we employ a broad definition of the word “health”, there are limitations and constraints on what we can and cannot fund. This issue led to considerable tension within the foundation (at the board and staff level), as well as with grantees and stakeholders, about prioritized community needs that were outside the scope of our health mission. The most obvious and recurrent tension-generating themes, in the context of a pervasive economic recession, were issues of economic development, job creation, and mortgage foreclosure across the sites. The battles over whether and how we should enter “the space” of economic development as a health foundation were intense and emotional. We ultimately landed on a framework (utilizing mission-investing in our investment portfolio) for how to move forward without “mission drift”, and have been communicating our approach to our own program staff and stakeholders, but it has not been easy. The worst of all worlds would have been to promise community leaders a course of action that we would later abandon or renege on. We decided to stick to our mission and results (the right move, however discomforting for foundation-community relations).

5. Be dogmatic about the results, but flexible about the strategies. The work of community change is noble, but funders cannot afford to fall in love with the process of the work at the expense of meaningful results and impact. Once community leaders and funders agree on a set of outcomes, objectives, or results, these must represent the “true north” on the compass. In the BHC planning and early implementation, we gave community leaders and organizations a blank slate on strategies, but insisted on being results driven and logic-model supported. The good news is that across our 14 BHC sites, there is community and resident ownership of the priorities and the strategies to achieve healthier community environments for young people. While these strategies vary, we are seeing growing convergence as the sites engage and learn from one another.

6. Listening is a form of leadership. Irish poet David Whyte underscores the importance of “leadership through conversation.” We have been quite intentional about active listening at all stages of planning and implementation, and mindful of closing the feedback loop with community leaders and grantees. We utilized a fairly simple “what we said, what they said, what we heard, what we’ll do” format. At the conclusion of the one-year planning process, our past Board Chair (Tessie Guillermo) and I co-authored and co-videotaped messages to the 14 sites summarizing the key themes and priorities we heard from community leaders in the sites, and what to expect in support from our foundation in the months ahead. We have now begun to bring site leaders together twice annually with foundation staff, so that leaders and staff can share stories of progress, struggles, and inspiration. All of this is in service of the all-too-critical “t-word”: trust. Trust is the mother’s milk of community change efforts by philanthropy, and active, engaged listening is the foundation.

7. Make “patient” grants, and “urgent” grants. Investors engaged in place-based, community change efforts encounter several tensions to manage. Among them is the tension of patience versus urgency. As efforts such as the Harlem Children’s Zone, Market Creek Plaza in Southeast San Diego, the Skillman Foundation’s work in Detroit, and the Dudley Street Initiative in Boston have demonstrated, positive community change takes time. A two- or three-year grant just won’t do it, and most successful efforts require 7, 10, or 12 years of “patient money.” The most thoughtful investments on this front involve leadership development, organizational capacity building, and collaborative efficacy; but the “impact” yield from these investments will typically take years to bear fruit. “Urgent” money involves investing in short-term campaigns or capital projects where tangible results are realized within 12-18 months. Community-change, place-based philanthropy will require both types of investments, and too heavy a tilt toward “patient” investments will leave the investor and the partnership vulnerable to allegations that some money has been spent, some meetings have occurred, but nothing “tangible” has been produced. As a result, confidence in the effort will dissipate. Our BHC effort in the early going has appreciated the need to simultaneously make “patient” and “urgent” (which we also call “early wins”) grants.

8. Story-telling is part of the doing. The two most under-appreciated and under-invested themes in social-change philanthropy are power-building and story-telling. Having been at the helm of a large-asset foundation for more than a decade, I am guilty-as-charged on this front; in retrospect, I would gladly trade in half of the (often expensive) academic and research-oriented reports we have commissioned in my twelve years as CEO for more compelling, interesting, and impactful “stories” of community-level change that illuminate the path towards a healthy, more vibrant community. Story-telling by community leaders, youth, or community-based organizations can be a powerful tool on multiple fronts: local residents and youth experience the power and passion of their own voice; local media are inspired to re-tell the story in a way that scales up the audience; policymakers pay greater heed and attention to the issue being raised; civic engagement and participation are served; cynicism, disengagement, and disempowerment are reduced. Utilizing multiple forms of story-telling, from social media to flip-cam videos to traditional approaches, we have been assertive in supporting community leaders and youth on this front, and it has been inspiring to witness.

Why build, preserve, and protect our respective brands and reputations if we are not going to spend it? Spend that damn brand.

9. Spend the damn brand. Institutional philanthropy is risk-averse. We tend to worry and fret about how our institutional brand, reputation, and civic standing might be sullied by associating with potentially controversial efforts or organizations, and as a general rule, we keep our heads and our profile low. But we have discovered, in the early years of the BHC effort, that thoughtful, surgical application of our civic standing and reputation matters to community leaders – and that they want us to spend “it” on their behalf. Sometimes it comes in the form of convening a meeting, writing and placing an op-ed, placing a phone call to a civic leader, or taking out a full-page ad on an issue in the local newspaper. We have done this with regard to healthy food options for youth and families, health insurance coverage for the uninsured, gang prevention and intervention strategies, and school health efforts. There is a school of thought in philanthropy that our job as funders is “to make the grant and get out of the way.” We would argue that our job is to achieve our respective missions, by any means necessary. On occasion, this requires stepping out of character on behalf of grantees, and utilizing our voice as well. Why build, preserve, and protect our respective brands and reputations if we are not going to spend it? Spend that damn brand.

10. Keep the Board highly engaged. In the earliest planning stages of BHC with our Board of Directors, the Board made it clear that they understood the value and importance of a ten-year commitment, but they also made three points. The first was the importance of honesty, candor, and trust about the progress of the effort. The second was a complete commitment to an evaluation approach framed by “learning through impact.” And the third was that they wanted to be engaged for purposes of learning and governance, but not micromanagement. We accomplished the latter by organizing our quarterly Board meetings in or near a BHC community site at least three times a year, and each Board member accepted an assignment of one community site for more in-depth and richer learning. Board members share their observations over dinner at our Board meetings.

In closing, we have found the work of community change to be an exhilarating journey in pursuit of our health mission. We have gained an appreciation of the importance of the “right brain-left brain balance” in this work: having a Theory of Change, Logic Models, and metrics is important, but trust-building, power-building, and the spiritual dimension of the work constitute the real glue that holds partners and relationships together over the long haul. And finally, a special note of thanks and appreciation to those foundations who have traversed this path before us, sharing tidbits of lessons and wisdom so that we can “make new mistakes” in the battle for community improvement and health justice.

--Robert K. Ross, M.D.

The Journey from Practice to Theory: Developing a Foundation’s Theory of Change
February 7, 2013

Mary Gregory is the executive director of the Bella Vista Foundation, one of twenty-two foundations managed by Pacific Foundation Services (PFS). She has been with the company for fourteen years and enjoys the variety of philanthropic styles demonstrated by PFS’s clients.

I have the privilege of managing a number of grantmaking portfolios for PFS foundations, and each has taught me important lessons about the art and science of grantmaking. Most recently, as a result of many years of work with the Bella Vista Foundation (BVF), I had the opportunity to learn first-hand what it takes to develop a foundation’s theory of change. But first, let me give you some background. BVF was started in 1999, and within a few years of making general grants to benefit children and youth, the board decided that one of its purposes should be to make a difference in the lives of children prenatal to three years old from low-income families in four Bay Area counties.

In 2007, after reviewing data, reading studies on infant development, and talking to experts in the field, BVF decided to fund programs that help parents and caregivers cope with stress and anxiety in order to prevent more serious mental health issues from arising that might negatively affect the healthy social and emotional development of their infants and toddlers. The foundation looks for high-quality, culturally aware programs for parents and caregivers that may use any of a number of strategies to create well-being and community, including exercise, classes (such as parenting education), community activism, and peer counseling. These programs can be initiated by nonprofit organizations, county departments, or joint efforts between counties and independent organizations.

In 2012, with grantmaking capacity for this program area of about $1.2M per year, the Bella Vista Foundation began to think about whether it could measure its impact. How could the foundation tell if parents and caregivers of very young children were actually better able to cope with anxiety and stress? BVF now encourages its grantees to set goals for their programs. Some programs already measure impact on their clients, using any of a variety of easily available measurement tools, to see if levels of stress and anxiety decrease in a meaningful way as a result of participation. Collection of this data also helps grantees see if they need to revise their programs to get better results.

We realize impact measurement is tricky for foundations, as our investments are just part of a whole ecosystem of funding. BVF’s thinking is that if we aggregate the results of our grantees, we will at least know how many individuals were positively affected by these programs, and what percentage of the participants that represents. Through grantmaking, we are also getting a picture of how many agencies and/or nonprofits in each of our four counties are addressing parental stress and anxiety in families with young children. When Bella Vista Foundation is able to aggregate the programs’ results, we will have a sense of whether our grants are making a difference, and can also create a body of shared learning that will benefit our grantees beyond the grant investment.

During the past year, in order to lead the way and to better understand the process, the foundation created and publicly shared its own Theory of Change (TOC). As board and staff crafted the TOC, we decided that this might also be a useful tool for our grantees, so we worked with a consultant to help us standardize our language, to review the foundation’s draft version, and to lead a workshop for grantees to get them started on creating their own TOCs. BVF then offered small technical assistance grants to six organizations that wanted to continue and refine their work, which is ongoing and will run through early summer. We now know how difficult it is to create a Theory of Change! Foundation staff members are creating customized versions of our TOC for each of the four counties in which BVF makes grants, because each county is different and our activities and funding will need to reflect those differences. Bella Vista Foundation hopes that we can use this new set of tools to measure our progress towards our goals and our vision, and make our own course corrections when needed.

--Mary Gregory

For Impact’s Sake: The Need for Transparency on Diversity & Equity in Philanthropy
November 7, 2012

(Kelly Brown is Director of the D5 Coalition, a five-year effort to advance philanthropy’s diversity, equity, and inclusiveness.)

Philanthropy exists for the common good, and advancing diversity, equity, and inclusion helps us live up to that value. In particular, thinking about equity in our grantmaking helps ensure that we are having the greatest impact on the issues identified in our unique missions—by targeting resources to the people in our constituencies with the greatest need.

But to really maximize our impact and hold ourselves accountable to our values, our constituencies, and each other, we also have to track who benefits from our grantmaking and be transparent about the results. If we can do that successfully, we can: 1) better understand whom we are reaching and whom we are missing—and adjust strategies accordingly; 2) leverage public policy or public dollars to fill gaps or create synergy; and 3) connect our work to the work of other foundations that focus on common issues or common constituencies.

As a field, we have a dual problem with both collecting and sharing data on diversity and equity.

Realizing that kind of success, though, is a real challenge. As a field, we have a dual problem with both collecting and sharing data on diversity and equity. Foundations measure internal diversity and the impact of their grantmaking in many different ways—or not at all. And the foundations that do collect this kind of data share it to varying degrees—or not at all. These challenges make it difficult to assess the year-over-year progress of individual foundations, or to draw comparisons among foundations, or between philanthropy and the public sector.  

So what do we do about it? We have to establish a uniform data collection and reporting system, and encourage the whole field to use it. We’re excited by the renewed energy in the field to take on this challenge—the Reporting Commitment is a great recent example.

A key goal of D5 is to improve data collection and transparency as it relates to diversity, equity, and inclusion. Last month, we helped convene 15 leaders on this topic in philanthropy and academia to discuss a pilot project to pioneer a collection and reporting system. As this promising work continues—and expands—we will be able to share more information about how to participate.  

In the meantime, the field also has to do the research to figure out what policies and practices are, in fact, the most effective at fostering diversity, equity, and inclusion. It’s hard for us to call on foundations to track and be transparent about diversity and equity when we can’t say in the same breath: And if you aren’t happy with where you stand, here are the most effective steps you can take to address it.

To help on that front, D5 just commissioned three organizations to conduct research that will help identify the most effective policies and tools philanthropic leaders can draw upon to help drive meaningful change and also lay the groundwork for gathering the data needed to help track the field’s progress. For more information about the Insights on Diversity research, check out the press release here.

Being transparent about diversity and equity can be intimidating. But I hope the need for it will increasingly be viewed as a pathway to impact—not as an onerous task that could result in scolding if a foundation is behind where it would like to be. This is an opportunity to learn from each other, to find ways to better work together to serve common constituencies, and to better meet the needs of an increasingly diverse world.

--Kelly Brown

Another Way of Thinking about Accountability
October 25, 2011

(Michael Remaley is the director of Public Policy Communicators NYC and president of HAMILL REMALEY breakthrough communications. In a previous post for Transparency Talk, he wrote about identifying transparency benchmarks in foundation communications.)

More and more philanthropic professionals are accepting the idea that their organizations should be transparent and, in part because those who founded the organization took major tax benefits when it was established, have some accountability to the public. Many of our field's big thinkers are making a compelling case that public accountability in philanthropy should be a core value in our work. But when it comes to accountability, what if foundations and the public are talking about entirely different things?

New research from Public Agenda and the Kettering Foundation presents evidence that the public and leaders across many sectors hold strikingly different ideas about what it means to be accountable. The report, "Don't Count Us Out: How an Overreliance on Accountability Could Undermine the Public's Confidence in Schools, Business, Government and More," is based on new public opinion research. It outlines the key dimensions of accountability as the public defines it and contrasts the public's perspective with prevailing leadership views. Although it isn't mentioned in the subtitle, the report explores the ramifications for foundations, too.

For philanthropic professionals, the implications are significant – both for their foundations and the institutions they support. There are several pros and cons in the research for those foundations already committed to transparency and accountability. For those foundations on the fence about accountability, the research reinforces the fact that the public expects institutions to be accountable, but raises questions about just what that means. 

There are several key points from the research that philanthropic professionals will want to consider:

Accountability requires ethics.

For foundations, the biggest "pro" in this research is that the public sees accountability first as a dimension of ethics and responsibility. Foundations – especially those with an orientation toward accountability and transparency – will likely fare well with the public in this regard. On the "con" side, many leaders who see accountability measures as the principal way to ensure that their institutions meet their obligations to the public may be putting too much faith in how much the public values setting benchmarks, collecting data, measuring performance, disclosing information, and organizing system-wide reforms. Those mechanisms, while often valuable as management tools, fall far short of relieving the public's most potent concerns, especially their fears about an ethical decline in our society. Foundations that demonstrate they are acting responsibly and ethically will be thought by the public to be accountable more than those that simply talk about benchmarks.

More information does not equal more trust.

Typically, people know almost nothing about specific measures, and they rarely see them as clear-cut evidence of effectiveness. Many Americans are deeply skeptical about the accuracy and importance of quantitative measures. Most are suspicious of the ways in which numbers can be manipulated or tell only half the story. So on the "pro" side, this research is good news for those foundations that have become adept at getting their message out with personal stories of those affected by their programs. For those that are still trying to talk about their impact with lists of grants made and lots of data, the "cons" in this research may be quite jarring. Many members of the public feel confused and overwhelmed by the detailed information flying past them in the name of "disclosure" and "transparency." Many fear they are being manipulated by the complex presentations. More and more statistics do not reassure, so in fact, more information can actually lead to less public trust. It's not that they don't want accountability and information from foundations, but a whole lot of data (without any qualitative context) isn't reassuring to them.

Responsiveness is just as important as benchmarks.

For the public, being able to reach someone who listens to you and treats your ideas and questions respectfully is a fundamental dimension of accountability. This may be the biggest challenge for foundations in this research, since even the most transparent rarely open the door more than a crack to let the general public in to give feedback on the funding programs aimed at them. For most people, not being able to talk to someone is a signal that the institution doesn't genuinely care about those they serve. Foundations are particularly opaque to the public. The message is clear for those in philanthropy and other sectors who may fear being besieged by community input: the public wants a better balance and authentic mechanisms that allow them to be heard. On the "pro" side, those foundations that do seek community input and can demonstrate they are listening will likely be afforded a great deal of public trust. Foundations that rate well on the Foundation Center's Glasspockets measures of transparency, especially those dealing with grantee surveys and grantee feedback, can probably feel some relief that they will likely be considered accountable in the public's eyes.

The public expects to be held accountable, too.

For most Americans, the return to real accountability is not the job of leaders alone. Time and again, people in focus groups spoke about their own responsibilities and the near impossibility of solving problems without a broad base of responsibility at every level of society. Many foundations already get this. Institutions that embrace the idea of a public role in fostering institutional accountability must think creatively and proactively about how typical citizens can contribute their knowledge and actions to fulfill the organization's mission. The report emphasizes that giving people more and more information or giving them more and more choices without truly considering public priorities and concerns is likely to backfire.

The "Don't Count Us Out" report is getting a lot of attention in policy circles. The Washington Post's education columnist Jay Mathews said, "Its message is vital. Accountability is a key word in our national debate… The Public Agenda/Kettering report may have exposed the greatest obstacle to getting our kids the educations they deserve." And The Nonprofit Quarterly said, "The authors suggest that there is one other area that needs equal attention: philanthropy, which they say has 'fewer true accountability mechanisms than any other field.' However, there is one dimension of accountability in which philanthropy may be the strongest: the 'publicly stated moral convictions of its leaders.' How to measure that will, perhaps, be the biggest challenge of all."

For foundation professionals involved in communicating the results of their organizations' work, the first thing to recognize is simply the different orientation of your audience. The second is to understand that people expect more than just statistics and analyses of results to feel that the foundation is indeed accountable. Many foundations are hesitant to allow outsiders to even have easy e-mail access to staff (another Glasspockets transparency measure). So allowing the public to give feedback on the programs that are directed at them may seem like a radical idea to some. Many foundations are already doing grantee surveys and allowing public commentary on their blogs. These are likely to go a long way in engendering trust with the public.

Many foundations have already realized that telling stories is a more effective means of communicating with people than rolling off statistics and spewing facts. When it comes to demonstrating our foundations' accountability, it may be time to consider the idea that bringing the public into the process is as important as enumerating outcomes.

-- Michael Remaley

The Wiki Workplace and a Network Mindset - Part 2
October 5, 2011

(Diana Scearce is a senior consultant with the Monitor Institute, where she works primarily with networks and multi-stakeholder groups. Her work combines strategy, facilitation, research, scenario thinking, and learning design. She has written multiple articles and reports on effectively leveraging networks, including the forthcoming "Catalyzing Networks for Social Change: A Funder's Guide" (GEO, October 2011).)

This is part two of a two-part blog series on how the David and Lucile Packard Foundation is working with a network mindset with its "see-through filing cabinet"— a wiki through which the foundation's Organizational Effectiveness (OE) team shares resources and insights across its grantmaking, research in progress, and even internal documents. Part one shared an interview with the OE team's Stephanie McAuliffe and Kathy Reich about how the experiment came about and how it's impacting their work. Part two shares insights from another OE team member, Jeff Jackson, about wiki results to date and how they're approaching assessment. For a deeper dive, check out their "wiki learning" page.

How do you know if this experiment in transparency — and now engagement — is working?

Jeff Jackson: Year one was about making sure the work and processes most meaningful and useful to us were shared, and that perhaps some people would engage enough to let us know we were not just being transparent to ourselves. While initially we weren't quite sure how to measure this, we now have more third-party comments about our transparency than we know what to do with, including the Chronicle of Philanthropy saying "Packard is leading."

Efficiency became the earliest and most visible benefit of transparency. We could easily point ourselves and others in the right direction faster and better than we did before with our disjointed filing cabinets (we're a very virtual OE team, with members in Mexico and multiple U.S. locations). A non-OE team member (a nonprofit leader) is now telling us he is using the wiki as his OE resource center.

It seems strange to try to measure a wiki since it evolves with every new member and can take on a very different look as the work changes. For instance, it wasn't immediately apparent why Eugene Kim wanted to use the wiki to post his notes from the GEO Learning conference (vs. use his own blog), but that's the beauty of this flexible, open format. Although I don't know why he made that choice, I still found value in what he did. Once he posted his conference notes, I decided to do the same for conferences I attended.

We're also learning that once we set "our" measures, the definition and scope of "our" changes. At the same time, we still believe that without measures/targets for distinct parts of the wiki (Goldmine for instance), we might not progress or know we are progressing.

The Packard Foundation team's experiences mirror broader experiences we've noted in the "Network of Network Funders" — a learning community for funders who are catalyzing networks and working with a network mindset. Assessing the impact of network platforms, like wikis, and the impact of groups of people who are working together formally and informally on shared social change goals can be tough. Participation is fluid, the network is in a constant state of flux, and outcomes can be unexpected when you're inviting broad and diverse participation. Yet, as Jeff says, having clear indicators of progress is critical for staying on course, learning about what is working, and adapting as needed.

How are you assessing the impact of your efforts to work transparently and catalyze networks?

-- Diana Scearce

The Glass Filing Cabinet: What the Packard Foundation is Learning about Learning in Public
June 28, 2011

(Paul Connolly is Senior Vice President of TCC Group, a management consulting firm that serves nonprofit organizations, foundations, and corporate community involvement programs.) 

Typically, when a foundation hires an evaluator to assess a program, that evaluator collects lots of information from a range of stakeholders, analyzes the data, writes a report, and discusses it with the funder. Then, perhaps, an abridged final report is shared with the field. The Packard Foundation has pursued a much more transparent and interactive approach for the current review of its Organizational Effectiveness program—an approach which the foundation staff likens to having "a glass filing cabinet."

For over two decades, Packard has been making grants to support efforts such as strategic planning, board development, succession planning, and web site upgrades to strengthen the organizational capacity of its nonprofit grantees. Packard retained TCC Group several months ago to help retrospectively assess 1,300 of these grants made during the past ten years and ascertain what constitutes a successful organizational effectiveness project. Packard is grappling with questions like: What is the sustained impact of the grants we make? How and to what extent can we quantify that impact? What contributes to a successful consultant relationship? What are the factors that contribute to a successful project?

Packard began by compiling a huge data set based on grantee records and survey research and then asked TCC to help with the analysis. Rather than scrutinizing Packard's data on our own behind closed office doors, we are facilitating a "learning in public" process through which we are sharing early research findings widely and encouraging input. Leveraging Packard's organizational effectiveness wiki site, the project has set up a section of the wiki for grantees, consultants, funders, and other interested parties to review preliminary findings and provide feedback (we invite yours, too!). And conversations have been emerging on Twitter, blogs, and other social media venues.

What have we discovered so far about this networked approach to collective learning?

  • The Packard Foundation has been praised at several recent philanthropy conferences (such as the June 6-7 Grantmakers for Effective Organizations learning conference) for its open approach, so there seems to be some support in the field for this type of inclusive evaluation process.
  • There has been some engagement on the wiki, but not very much. We recognized that the wiki was not as technologically accessible as we had wished and are working on improving that. We are also realizing that asking a broad array of people to sift through and comment on a lot of "semi-baked" data is, well, asking a lot. (A few consultants even went so far as to say, justifiably, that they would only do so if they were paid for their time.)
  • We have learned to cull the findings and extract a few noteworthy nuggets that we then highlight and ask for feedback on—so it is more like drinking water from a cup rather than a fire hose.
  • We are also creating more opportunities for select constituents to participate in "old-fashioned" in-person discussion groups and teleconference webinars, during which we can "think out loud" with them. We have found that this live interaction engages people and makes them more motivated to contribute their ideas online, too, as part of an ongoing conversation.

Ralph Waldo Emerson observed that "there are many things of which a wise man may wish to be ignorant." And New York University new media professor Clay Shirky points out that our society does not have a problem with information overload, but filter failure.

What are other foundations finding out about seeking broad input through two-way social media exchanges? How can philanthropies create better filters for seeking commentary when most people actually might not be that interested in poring through all of the information in those glass filing cabinets? At what point can a funder "overshare" and ask constituents to review and comment on "too much information"? When is the best time to seek feedback from various types of stakeholders on slightly baked, half-baked, or fully baked findings? When soliciting experts' opinions, where exactly is that fine line between a foundation being open and receptive—and being presumptuous and insensitive? What are the best ways to blend online and offline input to maximize collective intelligence?

These are questions we are mulling over. We would like to hear what you think. And we would be glad to share more of our experience and insights as this public learning process evolves.

— Paul Connolly

Foundations Fail at Failing
January 18, 2011

(Michael Remaley is the director of Public Policy Communicators NYC and president of HAMILL REMALEY breakthrough communications.)

"If you hit the bull's eye every time,
you've set the target too close."

I thought of this, one of my favorite aphorisms, at the Communications Network's annual conference last September when the Hewlett Foundation's Communications Director Eric Brown talked about his organization's "failed grantmaking" contest.  Hewlett's smart internal exercise forces each department to name one grant from its portfolio that did not meet expectations, think through and explain what went wrong and help the entire organization learn from its failure. 

This is a learning exercise that more foundations should consider adopting. But more than that, it is an important example of how Hewlett's leadership has set the tone for candor about the unavoidable truth of philanthropic experimentation: failure is part of the equation. 

It is no coincidence that Hewlett is also one of the few foundations that has talked publicly about initiatives that didn't live up to expectations. It is also no coincidence that Hewlett's profile on Glasspockets gives a good indication of its commitment to transparency.  I would assert that Hewlett's reputation for being one of the most innovative, thoughtful, and effective foundations is directly related to its transparency, willingness to publicly question its strategies, and forthrightness in discussing the limitations of its successes. And that reputation further enhances its ability to exert influence and make change.

The hard sciences learned the importance of sharing candid assessments of "failed" experiments centuries ago. In fact, scientists seem to treasure results that do not meet expected outcomes even more highly than those that confirm what is already believed to be true. 

I am hardly the first person to call upon foundations to talk more openly about failure, experimentation, and unexpected outcomes. (See list below.) Hewlett's Paul Brest seems to have really kickstarted the conversation in 2007 by writing and talking about his foundation's experiences. That was followed by Robert Giloth and Susan Gewirtz's seminal 2008 piece in Foundation Review, "Philanthropy and Mistakes: An Untapped Resource." Many others, including Bob Hughes, Larry Blumenthal, Edward Pauly, Grant Oliphant, and Sean Stannard-Stockton, have added important insights about the need for foundations to be more open about their lessons learned.  The conversation about failure and experimentation seemed to grow and deepen over the past three years.  So you might think that foundations would be making major changes in how they communicate about failure.  You would be wrong.

Foundations give a lot of lip service to supporting "experimentation" in the social sciences. But you almost never hear them talking about outcomes that failed to meet expectations, and even more rarely about results that call their basic strategies into question. If foundations want to be real leaders in advancing social change, they must move past the endless happy-talk that makes every grant sound like a success. Instead, they should use their web sites to detail how they are evaluating their work and what they've learned from unexpected outcomes.

A foundation sharing its experiences with grants gone wrong is still very much the exception.  Anyone who is on the receiving end of foundation annual reports and newsletters knows this is true.  But to substantiate my assertion, I decided to do a little systematic poking around.

I figured the 21 largest supporters of the Center for Effective Philanthropy (most of which are also supporters of Grantmakers for Effective Organizations) would be the foundations most attuned to the value of self-reflection, evaluation, and sharing results that defy expectations, and also those with budgets big enough to support substantial evaluation efforts. I spent many hours exploring the nooks and crannies of these foundations' web sites. I looked at numerous publications and evaluation sections of the sites, and I searched each site on the terms failure, failed, unmet expectations, unmet objective, unmet goal, experimentation, mistake, lessons learned, and assessment.

What I found was that few foundations make it easy to learn from projects that didn't go as spectacularly as planned, let alone talk frankly about what has been learned from the shortcomings of foundation strategy or execution.   Many of the 21 foundations I examined made no mention at all of evaluation criteria and organizational outcomes, even though their association with CEP and GEO implies that they demand that kind of forthrightness from grantees. The majority of the foundation sites I examined had a few project evaluation reports scattered among other foundation supported research – and many of those evaluation reports were laudatory with pablum like "real collaboration is a challenge" tacked on at the end. 

Some of the best exceptions were the Robert Wood Johnson Foundation, the William and Flora Hewlett Foundation, and the Wallace Foundation. Each of those foundations not only makes it easy to find many project evaluations that are balanced in presenting positive and negative outcomes along with what was learned through the process, but also presents self-critical examinations of foundation strategy and progress as a whole. It is also not a coincidence that each of those foundations' profiles on Glasspockets indicates a commitment to transparency, demonstrated by making public an assessment of overall foundation performance.

But perhaps the best example – the foundation that gets the Gold Star for Succeeding in Failing – is the James Irvine Foundation. The evaluation section of its site describes its approach to evaluating grantee success and links to all of its individual evaluations of initiatives. It also links to a Foundation Assessment section that holds the foundation's annual progress reports for the last four years. These progress reports are exceptionally detailed and well-documented, as well as frank about successes and failures. Irvine has also produced "Insights: Lessons Learned" publications with candid assessments of its experiences with collaborations and other grantmaking practices. A search of the Irvine site on "lessons learned" produces lots of useful and interesting evaluative information and insightful critical analysis.

We are all members of the social science community and contributors to the social experiment that is American philanthropy. We now have enough examples of foundations talking humbly about their shortcomings to know that such candor only accelerates social progress and enhances the reputations of those philanthropic leaders. We've seen no evidence that talking forthrightly about the real-world circumstances leading to failure damages nonprofits or the foundations involved, so I wonder why foundations seem so reluctant to take on this leadership role.

What has your organization learned from experiments that didn't meet expectations?

Selected Readings:
A Chronology of the Dialogue on Failure
and Experimentation in Philanthropy

— Michael Remaley

