Transparency Talk


Trend to Watch: Using SDGs to Improve Foundation Transparency
September 19, 2017

(Janet Camarena is director of transparency initiatives at Foundation Center.)

As Foundation Center's director of transparency initiatives, one of the most interesting parts of my job is having the opportunity to play "transparency scout," regularly reviewing foundation websites for signs of openness in what is too often a closed universe. Some of this scouting leads to lifting up practices that can be examples for others on our Transparency Talk blog, sometimes it leads to a new transparency indicator on our assessment framework, and sometimes we just file it internally as a "trend to watch."

Today, it's a combination of all three; we are using this blog post to announce the launch of a new, "Trend to Watch" indicator that signals an emerging practice: the use of the Sustainable Development Goals to improve how foundations open up their work to the world.

The United Nations' Sustainable Development Goals (SDGs), otherwise known as the Global Goals, are a universal call to action to end poverty, protect the planet, and ensure that all people enjoy peace and prosperity. There are 17 goals in total, including ending poverty, zero hunger, reduced inequalities, and climate action. Deliberately written in broad terms to serve as a collective playbook that governments and the private sector alike can use, they can also serve as a much-needed shared language across philanthropy and across sectors to signal areas of common interest and measure shared progress.

And let's face it: as foundation strategies become increasingly specialized and strategic, explaining their objectives and nuances can become a jargon-laden minefield that makes it difficult and time-consuming for those on the outside to fully understand the intended goal of a new program or initiative. The simplicity of the SDG iconography cuts through the jargon, so foundation website visitors can quickly see whether a program aligns with the goals and then more easily decide whether to devote time to reading further. The SDGs also provide a clear visual framework for displaying grants and outcomes data in a way that is meaningful beyond the four walls of the foundation.

Let's take a look at how some foundation websites are using the SDGs to more clearly explain their work:

Silicon Valley Community Foundation (SVCF)

One of my favorite examples is a simple chart the Silicon Valley Community Foundation shared on its blog, because it specifically opens up the work of its donor-advised funds using the SDGs. Donor-advised funds are typically not the most transparent vehicles, so using the SDGs as a framework to tally how SVCF's donor-advised funds are making an impact is particularly clever and refreshing, and offers a new window into a fast-growing area of philanthropy.

A quick glance at the chart reveals that quality education, good health and well-being, and sustainable cities and communities are the most common priorities among Silicon Valley donors.

GHR Foundation

A good example of how the SDGs can be used as a shared language to explain the intended impact of a grant portfolio comes from GHR Foundation in Minnesota. I also like this example because it shows how the SDGs can be used effectively in both global and domestic grant portfolios. GHR uses the SDG iconography across all of its portfolios, as sidebars on the pages that describe foundation strategies. GHR's "Children in Families" is a core foundation grantmaking strategy that addresses children and families in need on a global scale. The portfolio name is a broad one, but by including the SDG iconography, web visitors can quickly understand that GHR is using this program area to address poverty and hunger, as well as to pursue outcomes tied to health and well-being:

GHR is also able to use the SDG framework to create similar understanding of its domestic work. Below is an example from its Catholic Schools program serving the Twin Cities:

Through the visual cues the icons provide, I can quickly determine that, in addition to aligning with the quality education goal, this part of GHR's portfolio also addresses hunger and economically disadvantaged populations through its education grantmaking. This could also signal that the grantmaker interprets education broadly and supports the provision of wrap-around services to address the needs of low-income children as a holistic way of addressing the achievement gap. That's a lot of information conveyed with three small icons!

Tableau Foundation

The most sophisticated example comes to us from the tech and corporate grantmaking worlds: the Tableau Foundation. Tableau makes data visualization software, so using technology to improve transparency is a core approach, and the foundation uses its own grantmaking as an example of how data can tell a compelling visual story. Through the interactive "Living Annual Report" on its website, Tableau regularly updates its grantmaking tallies and grantee data so web visitors have near real-time information. One of the report's tabs reveals the SDG indicators, providing a quick snapshot of how Tableau's grantmaking, software donations, and corporate volunteering align with the SDGs.

As you mouse over any bar on the left, near real-time data appears, tallying how much of Tableau's funding has gone to support each goal. The interactive bar chart on the right lists Tableau's grantees, and visitors can quickly see the grantee list in the context of the SDGs, as well as the specific scale of its grantmaking to each recipient.

If you're inspired by these examples but aren't sure how to begin connecting your portfolio to the Global Goals, you can use the SDG Indicator Wizard to help you get started. All you need to do is copy and paste your program descriptions, or the descriptive language of a sample grant, into the Wizard, and its machine-learning tools let you know where your grantmaking lands on the SDG matrix. It's a lot of fun, and a great place to start learning about the SDGs. And because it transforms your program language into the relevant SDG goals, indicators, and targets, it may just provide a shortcut to that new strategy you were thinking of developing!
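For readers curious about the mechanics, the idea of mapping program language to goals can be sketched in miniature. The real Wizard uses trained machine-learning models; the toy below just matches keywords, and every goal label and keyword list here is an illustrative assumption, not the Wizard's actual taxonomy or code:

```python
# Toy illustration (NOT the real SDG Indicator Wizard): a naive keyword
# matcher that maps a program description to a few SDG labels. The goal
# names and keyword sets below are invented for this example.
SDG_KEYWORDS = {
    "SDG 1: No Poverty": {"poverty", "low-income", "economic hardship"},
    "SDG 2: Zero Hunger": {"hunger", "food", "nutrition"},
    "SDG 3: Good Health and Well-Being": {"health", "well-being", "wellness"},
    "SDG 4: Quality Education": {"education", "school", "literacy"},
}

def match_sdgs(description: str) -> list:
    """Return the SDG labels whose keywords appear in the description."""
    text = description.lower()
    return [goal for goal, words in SDG_KEYWORDS.items()
            if any(word in text for word in words)]

print(match_sdgs("Grants supporting school meals to fight hunger "
                 "among low-income children"))
```

A real classifier would be trained on labeled grant descriptions rather than relying on hand-picked keywords, but the input and output are the same shape: free-form program language in, goal labels out.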

Want more examples? The good news is we're also tracking SDG use as a transparency indicator at "Who Has Glasspockets?" You can view them all here. Is your foundation using the SDGs to help tell the story of your work? We're always on the lookout for new examples, so let us know, and your foundation could be the next trendsetter in our new Trend to Watch.

-- Janet Camarena

I Thought I Knew You: Grants Data & the 990PF
August 23, 2017

(Martha S. Richards is the Executive Director of the James F. and Marion L. Miller Foundation in Portland, Oregon.)

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series will explore the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

Join us at a session about the Open 990PF in partnership with Grantmakers of Oregon and Southwest Washington. Learn more or register here.

I have a confession to make. Up until a few years ago, when this story begins, I used to take the 990PF for granted. I thought of it as something that ensured we were following federal regulations, and that if we filed it on time and followed the reporting practices we had always used, this would be sufficient for all concerned. I was also pretty certain no one but a few insiders within the government, and perhaps a handful of philanthropy groups, would ever bother to read it.

Well, you might have heard the expression: "You don't know what you don't know," and that's a good segue to what I have to share.

In Spring 2010, the Coalition of Communities of Color (CCC) released a study -- Communities of Color in Multnomah County: an Unsettling Profile -- which defined the disparities facing communities of color in Oregon's largest urban area, Portland. Inspired by this analysis, that December, Foundation Center (FC) and Grantmakers of Oregon and SW Washington (GOSW) co-presented Grantmaking to Communities of Color in Oregon -- a groundbreaking report that acknowledged that philanthropy was part of the problem. The report estimated only 9.6% of grants awarded in 2008 by Oregon private and community funders actually reached communities of color.

While the data told a moving story, the source of the data also became a parallel conversation because the philanthropic community here in Oregon learned about the limitations of using tax returns to tell such important stories. The grant descriptions in our 990s rarely disclose details about the intended beneficiaries of the grants—even if we know them.

The result: We embarked on a long journey to address both issues. While GOSW and CCC hosted a forum to raise awareness of the reports and their attendant policy recommendations, foundations committed to look more closely at their giving practices and their data collection efforts, especially emphasizing collecting better beneficiary data and strengthening their reporting relationships with Foundation Center.

This prompted us at the James F. and Marion L. Miller Foundation to examine our own giving and how we could describe its reach. We fund in the areas of arts and K-12 education. We have a small staff. Our application process did not require a detailed analysis of demographic data from arts applicants or schools, nor an understanding of the diverse nature of nonprofit leadership among our grantees. We realized that we did not know if the grants we made were reaching the populations we hoped to serve.

As part of this effort, I chaired a GOSW-led Data Work Group to explore how to obtain more meaningful data sets without adding to the length and complexity of our application processes. We invited nonprofit partners to the table. We studied Foundation Center's processes and invited their staff to meet with and advise us. We tried, tested, and began to encourage nonprofits to help us learn more about whom we were reaching with our philanthropic dollars, and how. Eventually, we encouraged many of our Oregon foundations to become eReporters to Foundation Center, providing more detailed descriptions of what each grant was for and who was reached with the funding. Our reports to Foundation Center and to the IRS have improved, and we make an effort to report detailed demographic information.

Before and After Chart

However, we discovered that it can be difficult for some types of organizations to capture specific demographic data. In the arts, for instance, outside of audience surveys, one generally does not complete a demographic survey to buy a ticket. At the Miller Foundation, we chose to partner with DataArts to collect financial and audience data on our arts grantees. Arts organizations complete the profile annually, and it can be used by several arts funders in the state. DataArts' demographic profile is still being developed, but it will enable better data capture in the future. Unfortunately, this platform does not exist for other nonprofits.

Get on the Map

Get on the Map encourages foundations to share current and complete details about their grantmaking with Foundation Center. The interactive map, databases and reports allow foundations to have a better understanding of grantee funding and demographics.

We didn't know it then, but as a result of our committee's efforts, a new data improvement movement was born, called Get on the Map (GOTM). GOTM encourages foundations to share current and complete details about their grantmaking with Foundation Center, so the maps, databases, and reports it issues are as accurate as possible. The grants we share also populate an interactive map that members of GOSW have access to, which means that we have a better idea of the ecosystem in which we work. GOTM has since scaled nationally, with other regions also committing to improve the data they collect and share about their grantmaking, so we can all be less in the dark about what efforts are underway and who is working on them.

As a result, our foundation today has a better understanding of whom our grantees are serving and reaching than we did seven years ago, and I think we are also doing a better job of sharing that story with the IRS, Foundation Center, and the many sets of eyes I now know view those platforms.

We are still learning what we do not know. But at least, now we know what we do not know.

-- Martha Richards


Coming to Grantmakers of Oregon and Southwest Washington: To learn more about what story your 990PF tells about your foundation, register to attend Once Upon a 990PF. Visit the GOSW website for more information and to register.

How To Keep Me Scrolling Through What You Are Sharing
August 10, 2017

Tom Kelly is Vice President of Knowledge, Evaluation & Learning at the Hawai‘i Community Foundation. He has been learning and evaluating in philanthropy since the beginning of the century. @TomEval, TomEval.com

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Hello, my name is Tom and I am a Subscriber. And a Tweeter, Follower, Forwarder (FYI!), Google Searcher, and DropBox Hoarder. I subscribe to blogs, feeds, e-newsletters, and email updates. My professional title includes the word “Knowledge,” so I feel compelled to make sure I am keeping track of the high volume of data, information, reports, and ideas flowing throughout the nonprofit and foundation worlds (yes, it is a bit of a compulsion…and I am not even including my favorite travel, shopping and coupon alerts).

It is a lot and I confess I do not read all of it. It is a form of meditation for me to scroll through emails and Twitter feeds while waiting in line at Aloha Salads. I skim, I save, I forward, I retweet – I copy and save for later reading (later when?). In fact, no one can be expected to keep up, so how does anyone make sense of it all, or even find what we need when we need it? Everyone being #OpenForGood and sharing everything is great, but who is reading it all? And how do we make what we are opening for good actually good?

Making Knowledge Usable

We have all experienced at some point “drowning in information, starving for knowledge” (John Naisbitt’s Megatrends…I prefer E.O. Wilson’s “starving for wisdom” theory). The information may be out there, but rarely in a form that is easily found, read, understood, and, most importantly, used. Foundation Center and IssueLab have made it easier for people in the sector to know what is being funded, where new ideas are being tested, and what evidence and lessons are available. But nonprofits and foundations still have to upload and share many more of their documents than they do now. And we need to make sure that the information we share is readable, usable, and ready to be applied.

Hawaii Community Foundation Graphic

DataViz guru Stephanie Evergreen recently taught me a new hashtag: #TLDR – “Too Long, Didn’t Read.”

She now proposes that every published report be available in three formats: a one-page handout with key messages, a three-page executive summary, and a 25-page full report (plus appendices). In this way the “scanners,” “skimmers,” and “deep divers” can access the information in the form they prefer and in the time they have. It also requires writing (and formatting) differently for each of these sets of eyes. (By the way, do you know which one you are?)

From Information to Influence

But it is not enough to make your reports accessible, searchable, and easily readable in short and long forms; you also need to include the information people need to make decisions and act. It means deciding in advance who you want to inform and influence and what you want people to do with the information. You need to be clear about your purpose for sharing information, and you need to give people the right kinds of information if you expect them to read it, learn from it, and apply it.

“Give people the right kinds of information if you expect them to read it, learn from it, and apply it.”

Too many times I have read reports with promising findings or interesting lessons, and then I race through all the footnotes and the appendices at the back of the report looking for resources that could point me to the details of evidence and data or implementation guidance. I usually wind up trying to track down the authors by email or phone to follow up.

A 2005 study of more than 1,000 evaluations published in human services found only 22 well-designed and well-documented reports that shared any analysis of implementation factors – what lessons people learned about how best to put the program or services in place. We cannot expect other people and organizations to share knowledge and learn if they cannot access information from others that helps them use the knowledge and apply it in their own programs and organizations. YES, I want to hear about your lessons and “a-ha’s,” but I also want to see data and analysis of the common challenges that all nonprofits and foundations face:

  • How to apply and adapt program and practice models in different contexts
  • How to sustain effective practices
  • How to scale successful efforts to more people and communities

This means making sure that your evaluations and your reports open up the challenges of implementation – the same challenges others are likely to face. It also means placing your findings in the context of existing learning while using similar definitions, so that we can build on each other’s knowledge. For example, in our recent middle school connectedness initiative, our evaluator Learning for Action reviewed the literature first to determine specific components and best practices of youth mentoring, so that we could build the evaluation on what had come before, and then report clearly about what we learned about in-school mentoring and open up useful and comparable knowledge to the field.

So please plan ahead and define your knowledge sharing and influence agenda up front and consider the following guidelines:

  • Who needs to read your report?
  • What information does your report need to share to be useful and used?
  • Read and review similar studies and reports and determine in advance what additional knowledge is needed and what you will document and evaluate.
  • Use common definitions and program model frameworks so we are able to continually build on field knowledge and not create anew each time.
  • Pay attention to and evaluate implementation, replication and the management challenges (staffing, training, communication, adaptation) that others will face.
  • And disseminate widely and share at conferences, in journals, in your sector networks, and in IssueLab’s open repository.

And I will be very happy to read through your implementation lessons in your report’s footnotes and appendices next time I am in line for a salad.

--Tom Kelly

Foundations and Endowments: Smart People, Dumb Choices
August 3, 2017

(Marc Gunther writes about nonprofits, foundations, business and sustainability. He also writes for NonprofitChronicles.com. A version of this post also appears in Nonprofit Chronicles.)

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series will explore the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

America’s foundations spend many millions of dollars every year on investment advice. In return, they get sub-par performance.

You read that right: Money that could be spent on charitable programs — to alleviate global poverty, help cure disease, improve education, support research or promote the arts — instead flows into the pockets of well-to-do investment advisors and asset managers who, as a group, generate returns on their endowment investments that are below average.

This is redistribution in the wrong direction, on a grand scale: Foundation endowments hold about $800 billion in investments. It hasn’t attracted a lot of attention, but that could change as foundations make their IRS tax filings open, digital and searchable. That should create competitive pressure on foundation investment officers to do better, and on foundation executives and trustees to rethink business-as-usual investing.

The latest evidence that they aren’t doing very well arrived recently with the news that two energy funds managed by a Houston-based private equity firm called EnerVest are on the verge of going bust. Once worth $2 billion, the funds will leave investors “with, at most, pennies for every dollar they invested,” the Wall Street Journal reports. To add insult to injury, the funds in question were invested in oil and natural gas during 2012 and 2013, just as Bill McKibben, 350.org and a handful of their allies were urging institutional investors to divest from fossil fuels.

Foundations that invested in the failing EnerVest funds include the J. Paul Getty Trust, the John D. and Catherine T. MacArthur Foundation and the California-based Fletcher Jones Foundation, according to their most recent IRS filings. Stranded assets, anyone?

“Endowed private foundations are unaccountable to anyone other than their own trustees.”

Of course, no investment strategy can prevent losses. But the collapse of the EnerVest funds points to a broader and deeper problem: most foundations entrust their endowments to investment offices and/or outside portfolio managers who pursue active and expensive investment strategies that, as a group, have underperformed the broader markets.

How costly has this underperformance been? That’s impossible to know, because most foundations do not disclose their investment returns. This, by itself, is troubling; it’s a reminder that endowed private foundations are unaccountable to anyone other than their own trustees.

On disclosure, there are signs of progress. The Ford Foundation says it intends to release its investment returns for the first time. A startup company called Foundation Financial Research is compiling data on endowments as well, which it intends to make available to foundation trustees and sell to asset managers.

What’s more, as the IRS Form 990s filed by foundations become machine readable, it will become easier for analysts, activists, journalists and other foundations to see exactly how billions of dollars of foundation assets are deployed, and how they are performing. Advocates for mission-based investment, or for hiring more women and people of color to manage foundation assets, are likely to shine a light on foundations whose endowments are underperforming.
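To make that concrete, here is a deliberately simplified sketch of the kind of analysis machine-readable filings enable. The records and field names below are invented for illustration; real 990-PF e-file data is XML with a far more detailed schema, and a real return calculation would account for contributions, expenses, and timing:

```python
# Hypothetical, simplified records standing in for parsed 990-PF filings.
# Dollar figures are in millions; foundation names are made up.
filings = [
    {"foundation": "Example Foundation A", "year": 2015,
     "assets_start": 1_000.0, "assets_end": 1_030.0, "grants_paid": 50.0},
    {"foundation": "Example Foundation B", "year": 2015,
     "assets_start": 2_500.0, "assets_end": 2_700.0, "grants_paid": 120.0},
]

def approx_gross_return(filing: dict) -> float:
    """Rough year-over-year endowment growth, adding back grants paid out."""
    gain = filing["assets_end"] + filing["grants_paid"] - filing["assets_start"]
    return gain / filing["assets_start"]

for f in filings:
    print(f'{f["foundation"]}: {approx_gross_return(f):.1%}')
```

Even a crude pass like this, run across thousands of open filings, would let outsiders compare endowment performance against passive benchmarks, which is exactly the scrutiny the paragraph above anticipates.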

Unhappily, all indications are that most foundations are underperforming because they pursue costly, active investment strategies. This month, what is believed to be the most comprehensive annual survey of foundation endowment performance once again delivered discouraging news for the sector.

The 2016 Council on Foundations–Commonfund Study of Investment of Endowments for Private and Community Foundations® reported on one-year, five-year and 10-year returns for private foundations, and they again trail passive benchmarks.

The 10-year annual average return for private foundations was 4.7 percent, the study found. The five-year return was 7.6 percent. Those returns are net of fees — meaning that outside investment fees are taken into account — but they do not take into account the considerable salaries of investment officers at staffed foundations.

By comparison, Vanguard, the pioneering giant of passive investing, says a simple mix of index funds with 70 percent in stocks and 30 percent in fixed-income assets delivered an annualized return of 5.4 percent over the past 10 years. The five-year return was 9.1 percent.

These differences add up in a hurry.
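Just how much they add up can be seen with a rough back-of-the-envelope sketch. This is my own illustration, not a calculation from the Commonfund study; it assumes the reported average returns compound annually on the sector's roughly $800 billion in endowment assets:

```python
# Back-of-the-envelope comparison: compound the study's 10-year average
# private-foundation return (4.7%) against Vanguard's 70/30 index-mix
# return (5.4%) on the sector's ~$800 billion in endowment assets.

def compound(principal_billions: float, annual_return: float, years: int) -> float:
    """Future value after compounding annually at a fixed rate of return."""
    return principal_billions * (1 + annual_return) ** years

endowments = 800.0  # billions of dollars
active = compound(endowments, 0.047, 10)   # reported foundation average
passive = compound(endowments, 0.054, 10)  # Vanguard 70/30 index mix

print(f"Active-style portfolios after 10 years: ${active:,.0f}B")
print(f"Passive 70/30 mix after 10 years:       ${passive:,.0f}B")
print(f"Foregone growth:                        ${passive - active:,.0f}B")
```

Under those assumptions, a 0.7-point gap in annual returns compounds to foregone growth on the order of $87 billion over a decade, money that could otherwise have flowed to charitable programs.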

Warnings, Ignored

The underperformance of foundation endowments is not a surprise. In a Financial Times essay titled “The end of active investing?” that should be read by every foundation trustee, Charles D. Ellis, who formerly chaired the investment committee at Yale, wrote:

“Over 10 years, 83 per cent of active funds in the US fail to match their chosen benchmarks; 40 per cent stumble so badly that they are terminated before the 10-year period is completed and 64 per cent of funds drift away from their originally declared style of investing. These seriously disappointing records would not be at all acceptable if produced by any other industry.”

The performance of hedge funds, private-equity funds and venture capital has trended downward as institutional investors flocked into those markets, chasing returns. Notable investors including Warren Buffett, Jack Bogle (who as Vanguard’s founder has a vested interest in passive investing), David Swensen, Yale’s longtime chief investment officer, and Charles Ellis have all argued for years that most investors, even institutional investors, should simply diversify their portfolios, pursue passive strategies and keep their investing costs low.

In his most recent letter to investors in Berkshire Hathaway, Buffett wrote:

“When trillions of dollars are managed by Wall Streeters charging high fees, it will usually be the managers who reap outsized profits, not the clients. Both large and small investors should stick with low-cost index funds.”

For more from Buffett about why passive investing makes sense, see my March blogpost, Warren Buffett has some excellent advice for foundations that they probably won’t take. Recently, Freakonomics did an excellent podcast on the topic, titled The Stupidest Thing You Can Do With Your Money.

That said, the debate between active and passive asset managers remains unsettled. While index funds have outperformed actively-managed portfolios over the last decade, Cambridge Associates, a big investment firm that builds customized portfolios for institutional investors and private clients, published a study last spring arguing that this past decade is an anomaly. Cambridge Associates found that since 1990, fully diversified (i.e., actively managed) portfolios have underperformed a simple 70/30 stock/bond portfolio in only two periods: 1995–99 and 2009–2016. To no one’s surprise, Cambridge says: “We continue to find investments in private equity and hedge funds that we believe have an ability to add value to portfolios over the long term.” Portfolio managers are also sure to argue that their expertise and connections enable them to beat market indices.

But where is the evidence? To the best of my knowledge, seven of the U.S.’s 10 biggest foundations decline to disclose their investment returns. I emailed or called the Getty, MacArthur and Fletcher Jones foundations to ask about their investments in EnerVest; each said it does not discuss individual investments and declined further comment.

To its credit, MacArthur does disclose the investment performance of its $6.3 billion endowment. On the other hand, MacArthur has an extensive grantmaking program supporting “conservation and sustainable development.” Why, then, is it financing oil and gas assets?

Ultimately, foundation boards are responsible for overseeing the investment of their endowments. Why don’t they do a better job of it? Maybe it’s because many foundation trustees, particularly those who oversee the investment committees, come out of Wall Street, private equity funds, hedge funds and venture capital. They are the so-called experts, and they have built successful careers by managing other people’s money. It’s not easy for the other board members, who may be academics, activists, lawyers or politicians, to question their expertise. But that’s what they need to do.

And, at the very least, foundations ought to be open about how their endowments are performing so those who manage their billions of dollars can be held accountable.

--Marc Gunther

How Improved Evaluation Sharing Has the Potential to Strengthen a Foundation’s Work
July 27, 2017

Jennifer Glickman is manager, research team, at the Center for Effective Philanthropy. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Philanthropy is a complex, demanding field, and many foundations are limited in the resources they can dedicate to obtaining and sharing knowledge about their practices. In what areas, then, should foundations focus their learning and sharing efforts to be #OpenForGood?

Last year, the Center for Effective Philanthropy (CEP) released two research reports exploring this question. The first, Sharing What Matters: Foundation Transparency, looks at foundation CEOs’ perspectives on what it means to be transparent, who the primary audiences are for foundations’ transparency efforts, and what is most important for foundations to share.

The second report, Benchmarking Foundation Evaluation Practices, presents benchmarking data collected from senior foundation staff with evaluation responsibilities on topics such as evaluation staffing and structures, investment in evaluation work, and the usefulness of evaluation information. Together, these reports provide meaningful insights into how foundations can learn and share knowledge most effectively.

CEP’s research found that there are specific topics about which foundation CEOs believe being transparent could potentially increase their foundation’s ability to be effective. These areas include the foundation’s grantmaking processes, its goals and strategies, how it assesses its performance, and the foundation’s experiences with what has and has not worked in its efforts to achieve its programmatic goals. While foundation CEOs believe their foundations are doing well in sharing information about their grantmaking, goals, and strategies, they say their foundations are much less transparent about the lessons they learn through their work.

CEP Transparency Graphic

For example, nearly 70 percent of the CEOs CEP surveyed say being transparent about their foundation’s experiences with what has worked in its efforts to achieve its programmatic goals could increase effectiveness to a significant extent. In contrast, only 46 percent say their foundations are very or extremely transparent about these experiences. Even fewer, 31 percent, say their foundations are very or extremely transparent about what has not worked in their programmatic efforts, despite 60 percent believing that being transparent about this topic could potentially increase their effectiveness to a significant extent.

And yet, foundations want this information about lessons learned and think it is important. Three-quarters of foundation CEOs say they often seek out opportunities to learn from other foundations’ work, and that sharing this kind of knowledge enables others to learn from foundation work more generally.

How is knowledge being shared then? According to our evaluation research, foundations are mostly sharing their programmatic knowledge internally. Over three-quarters of the evaluation staff who responded to our survey say evaluation findings are shared quite a bit or a lot with the foundation’s CEO, and 66 percent say findings are shared quite a bit or a lot with foundation staff. In comparison:

  • Only 28 percent of respondents say evaluation findings are shared quite a bit or a lot with the foundation’s grantees;
  • 17 percent say findings are shared quite a bit or a lot with other foundations; and
  • Only 14 percent say findings are shared quite a bit or a lot with the general public.

CEP Evaluation Survey Graphic

In fact, less than 10 percent of respondents say that disseminating evaluation findings externally is a top priority for their role.

But respondents do not think these numbers are adequate. Nearly three-quarters of respondents say their foundation invests too little in disseminating evaluation findings externally. Moreover, when CEP asked respondents what they hope will have changed for foundations in the collection and/or use of evaluation information in five years, one of the top three changes mentioned was that foundations will be more transparent about their evaluations and share what they are learning externally.

So, if foundation CEOs believe that being transparent about what their foundation is learning could increase its effectiveness, and foundation evaluation staff believe that foundations should be investing more in disseminating findings externally, what is holding foundations back from embracing an #OpenForGood approach?

CEP has a research study underway looking more deeply into what foundations know about what is and isn’t working in their practices and with whom they share that information, and will have new data to enrich the current conversations on transparency and evaluation in early 2018. In the meantime, take a moment to stop and consider what you might #OpenForGood.

--Jennifer Glickman

How to Make Grantee Reports #OpenForGood
July 20, 2017

Mandy Ellerton and Molly Matheson Gruen joined the [Archibald] Bush Foundation in 2011, where they created and now direct the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community challenges, using approaches that are collaborative and inclusive of people who are most directly affected by the problem. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Ellertonmandy20152
Mandy Ellerton

When we started working at the Bush Foundation in 2011, we encountered a machine we’d never seen before: the Lektriever. It’s a giant machine that moves files around, kind of like a dry cleaner’s clothes rack, and allows you to seriously pack in the paper. As a responsible grantmaker, it’s how the Bush Foundation had meticulously tracked and stored its files for posterity - in particular, grantee reports - for decades.

In 2013, the Bush Foundation had the privilege of moving to a new office. Mere days before we were to move into the new space, we got a frantic call from the new building’s management. It turned out that the Lektrievers (we actually had multiple giant filing machines!) were too heavy for the floor of the new building, which had to be reinforced with a number of steel plates to sustain their weight.

MMG 2015 Headshot1
Molly Matheson Gruen

Even with all this extra engineering, we would still have to say goodbye to one of the machines altogether for the entire system to be structurally sound. We had decades of grantee stories, experiences and learning trapped in a huge machine in the inner sanctum of our office, up on the 25th floor.

The Lektrievers symbolized our opportunity to become more transparent and move beyond simply preserving our records, instead seeing them as relevant learning tools for current audiences. It was time to lighten the load and share this valuable information with the world.

Learning Logs Emerge

We developed our grantee learning log concept in the Community Innovation Programs as one way to increase the Foundation’s transparency. At the heart of it, our learning logs are a very simple concept: they are grantee reports, shared online. But, like many things that appear simple, once you pull on the string of change – the complexity reveals itself.

“Every Community Innovation project is an opportunity for others to learn and the learning logs are a platform to share this learning.”

Before we could save the reports from a life of oblivion in the Lektriever, build out the technology and slap the reports online, we needed to entirely rethink our approach to grantee reporting to create a process that was more mutually beneficial. First, we streamlined our grant accountability measures (assessing whether the grantees did what they said they’d do) by structuring them into a conversation with grantees, rather than as a part of the written reports. We’ve found that conducting these assessments in a conversation takes the pressure off and creates a space where grantees can be more candid, leading to increased trust and a stronger partnership.

Second, our grantee reports now focus on what grantees are learning in their grant-funded project. What’s working? What’s not? What would you do differently if you had it to do all over again? This new process resulted in reports that were more concise and to the point.

Finally, we redesigned our website to create a searchable mechanism for sharing these reports online. This involved linking our grant management system directly with our website so that when a grantee submits a report, we do a quick review and then the report automatically populates our website. We’ve also designed a way for grantees to designate select answers as private when they want to share sensitive information with us without making it entirely public. We leave it up to grantee discretion, and those selected answers do not appear on the website. Grantees designate their answers as private for a number of reasons, most often because they discuss sensitive situations having to do with specific people or partners – like when someone drops out of the project or when a disagreement with a partner holds up progress. And while we’ve been pleased at the candor of most of our grantees, some are still understandably reluctant to be publicly candid about failures or mistakes.

But why does this new approach to grantee reporting matter, besides making sure the floor doesn’t collapse beneath our Lektrievers?

Bushfoundation-Lektriever photo
The Lektriever is a giant machine that moves files around, kind of like a dry cleaner’s clothes rack. The Bush Foundation had meticulously tracked and stored its files for posterity - in particular, grantee reports - for decades. Credit: Bush Foundation

Learning Sees the Light of Day

Learning logs help bring grantee learning into the light of day, instead of hiding in the Lektrievers, so that more people can learn about what it really takes to solve problems. Our Community Innovation programs at the Bush Foundation fund and reward the process of innovation–the process of solving problems. Our grantees are addressing wildly different issues: from water quality to historical trauma, from economic development to prison reform. But, when you talk to our grantees, you see that they actually have a lot in common and a lot to learn from one another about effective problem-solving. And beyond our grantee pool, there are countless other organizations that want to engage their communities and work collaboratively to solve problems.  Every Community Innovation project is an opportunity for others to learn and the learning logs are a platform to share this learning, making it #OpenForGood.

We also want to honor our grantees’ time. Grantees spend a lot of time preparing grant reports for funders. And, in a best case scenario, a program officer reads the report and sends the grantee a response of some kind before the report is filed away. But, let’s be honest – sometimes even that doesn’t happen. The report process can be a burden on nonprofits and the only party to benefit is the funder. We hope that the learning logs help affirm to our grantees that they’re part of something bigger than themselves - that what they share matters to others who are doing similar work.

We also hear from our grantees that the reports provide a helpful, reflective process, especially when they fill it out together with collaborating partners. One grantee even said she’d like to fill out the report more often than we require to have regular reflection moments with her team!

Learning from the Learning Logs

We only launched the learning logs last year, but we’ve already received some positive feedback. We’ve heard from both funded and non-funded organizations that the learning logs provide inspiration and practical advice so that they can pursue similar projects. A grantee recently shared a challenge in their work that directly connected to work we knew another grantee had done and written about in their learning log. Because this knowledge was now out in the open, we were able to direct them to that learning log, expanding one grantee’s impact beyond their local community while helping to advance another’s work.

Take, for example, some of the following quotes from some of our grantee reports:

  • The Minnesota Brain Injury Alliance's project worked on finding ways to better serve homeless people with brain injuries. They reflected that, "Taking the opportunity for reflection at various points in the process was very important in working toward innovation. Without reflection, we might not have been open to revising our plan and implementing new possibilities."
  • GROW South Dakota addressed a number of challenges facing rural South Dakota communities. They shared that, “Getting to conversations that matter requires careful preparation in terms of finding good questions and setting good ground rules for how the conversations will take place—making sure all voices are heard, and that people are listening for understanding and not involved in a debate.”
  •  The People's Press Project engaged communities of color and disenfranchised communities to create a non-commercial, community-owned, low-powered radio station serving the Fargo-Moorhead area of North Dakota. They learned “quickly that simply inviting community members to a meeting or a training was not a type of outreach that was effective.”

Like many foundations, we decline far more applications than we fund, and our limited funding can only help communities tackle so many problems. Our learning logs are one way to try to squeeze more impact out of those direct investments. By reading grantee learning logs, hopefully more people will be inspired to effectively solve problems in their communities.

We’re not planning to get rid of the Lektrievers anytime soon – they’re pretty retro cool and efficient. They contain important historical records and are incredibly useful for other kinds of record keeping, beyond grantee documentation. Plus, the floor hasn’t fallen in yet. But, as Bush Foundation Communications Director Dominick Washington put it, now we’re unleashing the knowledge, “getting it out of those cabinets, and to people who can use it.”

--Mandy Ellerton and Molly Matheson Gruen

What Will You #OpenForGood?
July 13, 2017

Janet Camarena is director of transparency initiatives at Foundation Center.  This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Janet Camarena Photo

This week, Foundation Center is launching our new #OpenForGood campaign, designed to encourage better knowledge sharing practices among foundations.  Three Foundation Center services—Glasspockets, IssueLab, and GrantCraft—are leveraging their platforms to advance the idea that philanthropy can best live up to its promise of serving the public good by openly and consistently sharing what it’s learning from its work.  Glasspockets is featuring advice and insights from “knowledge sharing champions” in philanthropy on an ongoing #OpenForGood blog series; IssueLab has launched a special Results platform allowing users to learn from a collective knowledge base of foundation evaluations; and a forthcoming GrantCraft Guide on open knowledge practices is in development.

Although this campaign is focused on helping and inspiring foundations to use new and emerging technologies to better collectively learn, it is also in some ways rooted in the history that is Foundation Center’s origin story.

OFG-twitter

A Short History

Sixty years ago, Foundation Center was established to provide transparency for a field in jeopardy of losing its philanthropic freedom due to McCarthy Era accusations that gained traction in the absence of any openness whatsoever about foundation priorities, activities, or processes.  Not one, but two congressional commissions were formed to investigate foundations for alleged “un-American activities.”  As a result of these congressional inquiries, which spanned several years during the 1950s, Foundation Center was created to bring openness to a field that had nearly lost everything due to its opacity.

“The solution and call to action here is actually a simple one – if you learn something, share something.”

I know our Transparency Talk audience is most likely familiar with this story since the Glasspockets name stems from this history when Carnegie Corporation Chair Russell Leffingwell said, “The foundation should have glass pockets…” during his congressional testimony, describing a vision for a field that would be so open as to allow anyone to have a look inside the workings and activities of philanthropy.  But it seems important to repeat that story now in the context of new technologies that can facilitate greater openness.

Working Collectively Smarter

Now that we live in a time when most of us walk around with literal glass in our pockets, and use these devices to connect us to the outside world, it is surprising that only 10% of foundations have a website, meaning 90% of the field cannot easily be discovered by the outside world.  But having websites would really just bring foundations into the latter days of the 20th century--#OpenForGood aims to bring them into the present day by encouraging foundations to openly share their knowledge in the name of working collectively smarter.

What if you could know what others know, rather than constantly replicating experiments and pilots that have already been tried and tested elsewhere?  Sadly, the common practice of foundations keeping knowledge in large file cabinets or hard drives only a few can access means that there are no such shortcuts. The solution and call to action here is actually a simple one—if you learn something, share something.

In foundations, learning typically takes the form of evaluation and monitoring, so we are specifically asking foundations to upload all of your published reports from 2015 and 2016 to the new IssueLab: Results platform, so that anyone can build on the lessons you’ve learned, whether inside or outside of your networks. Foundations that upload their published evaluations will receive an #OpenForGood badge to demonstrate their commitment to creating a community of shared learning.

Calls to Action

But #OpenForGood foundations don’t just share evaluations, they also:

  • Open themselves to ideas and lessons learned by others by searching shared repositories, like those at IssueLab, as part of their own research process;
  • Use Glasspockets to compare their foundation's transparency practices to their peers, add their profile, and help encourage openness by sharing their experiences and experiments with transparency here on Transparency Talk; and
  • Use GrantCraft to hear what their colleagues have to say, then add their voice to the conversation. If they have an insight, they share it!

Share Your Photos

“#OpenForGood foundations share their images with us so we can show the collective power of philanthropic openness, not just in words, but images.”

And finally, #OpenForGood foundations share their images with us so we can show the collective power of philanthropic openness, not just in words, but images.  We would like to evolve the #OpenForGood campaign over time into a powerful and meaningful way for foundations to open up their work and reach a broader audience than they could on their own. Any campaign about openness and transparency should, after all, use real images rather than staged or stock photography. 

So, we invite you to share any high resolution photographs that feature the various dimensions of your foundation's work.  Ideally, we would like to capture images of the good you are doing out in the world, outside of the four walls of your foundation, and of course, we would give appropriate credit to participating foundations and your photographers.  The kinds of images we are seeking include people collaborating in teams, open landscapes, and images that convey the story of your work and who benefits. Let us know if you have images to share that may now benefit from this extended reach and openness framing by contacting openforgood@foundationcenter.org.

What will you #OpenForGood?

--Janet Camarena

Why Evaluations Are Worth Reading – or Not
June 14, 2017

Rebekah Levin is the Director of Evaluation and Learning for the Robert R. McCormick Foundation, guiding the Foundation in evaluating the impact of its philanthropic giving and its involvement in community issues. She is working both with the Foundation’s grantmaking programs, and also with the parks, gardens, and museums at Cantigny Park. This post is part of the Glasspockets’ #OpenForGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

Rebekah Levin photo

Truth in lending statement:  I am an evaluator.  I believe strongly in the power of excellent evaluations to inform, guide, support and assess programs, strategies, initiatives, organizations and movements.  I have directed programs that were redesigned to increase their effectiveness, their cultural appropriateness and their impact based on evaluation data, helped to design and implement evaluation initiatives here at the foundation that changed the way that we understand and do our work, and have worked with many foundation colleagues and nonprofits to find ways to make evaluation serve their needs for understanding and improvement. 

“I believe strongly in the power of excellent evaluations."

One of the strongest examples that I’ve seen of excellent evaluation within philanthropy came with a child abuse prevention and treatment project.  Our foundation funded almost 30 organizations that were using 37 different tools to measure treatment impact, many of which were culturally inappropriate, designed for initial screenings, or unsuitable for a host of other reasons; staff from organizations running similar programs also had conflicting views about the tools.  Foundation program staff wanted to be able to compare program outcomes using uniform evaluation tools and to use that data to make funding, policy, and program recommendations, but they were at a loss as to how to do so in a way that honored the grantees’ knowledge and experience.   A new evaluation initiative was funded, combining the development of a "community of practice" for the nonprofits and foundation together to:

  • create a unified set of reporting tools;
  • learn together from the data about how to improve program design and implementation, and the systematic use of data to support staff/program effectiveness;
  • develop a new rubric which the foundation would use to assess programs and proposals; and
  • provide evaluation coaching for all organizations participating in the initiative.

The evaluation initiative was so successful that the participating nonprofits decided to continue their work together beyond the initial scope of the project to improve their own programs and better support the children and families they serve. This “Unified Project Outcomes” article describes the project and the processes it established in far greater detail.

But I have also seen and been a part of evaluations where:

  • the methodology was flawed or weak;
  • the input data were inaccurate and full of gaps;
  • there was limited understanding of the context of the organization;
  • there was no input from relevant participants; and
  • there was no thought to the use of the data/analysis;

so that little to no value came out of them, and the learning that took place as a result was equally inconsequential.

Mccormick-foundation-logo_2x

So now to those evaluation reports that often come at the end of a project or foundation initiative, and sometimes have interim and smaller versions throughout their life span.  Except to a program officer who has to report to their director about how a contract or foundation strategy was implemented, the changes from the plan that occurred, and the value or impact of an investment or initiative, should anyone bother reading them?  From my perch, the answer is a big “Maybe.”  What does it take for an evaluation report to be worth my time to read, given the stack of other things sitting here on my desk that I am trying to carve out time to read?  A lot.

  1. It has to be an evaluation and not a PR piece. Too often, "evaluation" reports provide a cleaned-up version of what really occurred in a program, omitting information about how and why an initiative or organization functioned as it did, with all the data pointing to its success.  This is not to say that initiatives and organizations can’t be successful.  But no project or organization works perfectly, and if I don’t see critical concerns, problems, or caveats identified, my guess is that I’m not getting the whole story, and its value to me drops precipitously.
  2. It has to provide relevant context. To read an evaluation of a multi-organizational collaboration in Illinois without placing its fiscal challenges within the context of our state’s ongoing budget crisis, or to read about a university-sponsored community-based educational program without knowing the long history of mistrust between the school and the community, or without any of the other relevant and critical contextual pieces that affect a program, initiative, or organization, makes that evaluation of little value.  Placing findings within a nuanced set of circumstances significantly improves the possibility that the knowledge is transferable to other settings.
  3. It has to be clear and as detailed as possible about the populations that it is serving. Too often, I read evaluations that leave out critical information about who they were targeting and who participated or was served. 
  4. The evaluation’s methodology must be described with sufficient detail so that I have confidence that it used an appropriate and skillful approach to its design and implementation as well as the analysis of the data. I also pay great attention to the extent to which those who were the focus of the evaluation participated in the evaluation’s design, the questions being addressed, the methodology being used, and the analysis of the data.
  5. And finally, in order to get read, the evaluation has to be something I know exists, or something I can easily find. If it exists in a repository like IssueLab, my chances of finding it increase significantly.  After all, even if it’s good, it is even better if it is #OpenForGood for others, like me, to learn from it.

When these conditions are met, the answer to the question, “Are evaluations worth reading?” is an unequivocal “YES!” if you value learning from others’ experiences and using that knowledge to inform and guide your own work.

--Rebekah Levin

The Real World is Messy. How Do You Know Your Foundation Is Making an Impact?
June 7, 2017

Aaron Lester is an experienced writer and editor in the nonprofit space. In his role as content marketing manager at Fluxx, Aaron’s goal is to collect and share meaningful stories from the world of philanthropy. This post is part of the Glasspockets’ #OpenForGood series done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

AaronLester

In a perfect world, foundations could learn from every mistake, build on every new piece of knowledge, and know with certainty what impact every effort has made.

Of course, we’re not in that world. We’re in the real, fast-paced world of nonprofits where messy human needs and unpredictable natural and political forces necessitate a more flexible course. In that world, it’s more challenging to measure the effects of our grantmaking efforts and learn from them. It turns out knowledge sharing is a tough nut to crack.

And without meaningful knowledge sharing, we’re left struggling to understand the philanthropic sector’s true impact — positive or negative — within a single organization or across many. The solution is a more transparent sector that is willing to share data — quantitative as well as qualitative — that tells stories of wins and losses, successes and failures—in other words, a sector that is #OpenForGood. But, of course, this is much easier said than done.

My role at Fluxx creates many opportunities for me to talk with others in the field and share stories the philanthropic sector can learn from. I recently had the chance to speak with grantmakers on this very issue.

Measuring Whose Success?

Even within a foundation, it can be difficult to truly understand the impact of a grant or other social investment.

“Lose the mindset defined by a fear of failure; instead, embrace one that drives you to search for opportunity.”

As Adriana Jiménez, director of grants management at the ASPCA and former grants manager at the Surdna Foundation, explains, it’s difficult for foundations to prove conclusively that it’s their slice of the grantmaking that has made a meaningful difference in the community. “When you collect grant-by-grant data, it doesn’t always roll up to your foundation’s goals or even your grant’s goals.”

The issue is that there’s no standardized way to measure grantmaking data, and it’s an inherently difficult task because there are different levels of assessment (grant, cluster, program, foundation, etc.), there is similar work being done in different contexts, and a lot of data is only available in narrative form.

One way to combat these challenges is to make sure your foundation is transparent and in agreement around shared goals with grantees from the start of the relationship. Being too prescriptive or attempting to standardize the way your grantees work will never create the results you’re after. Part of this early alignment includes developing clear, measurable goals together and addressing how the knowledge you’re gaining can and should translate into improvements in performance.

A grantee should never have to alter their goals or objectives just to receive funding. That sends the wrong message, and it provides the wrong incentive for grantees to participate in knowledge-sharing activities. But when you work as partners from the start and provide space for grantees to collaborate on strategy, a stronger partnership will form, and the stories your data tells will begin to be much more meaningful.

The Many Languages of Human Kindness

If sharing knowledge is difficult within one organization, it’s even more challenging across organizations.

Fluxx

Jiménez points out that a major challenge is the complexity of foundations, as they rely on different taxonomies and technologies and discuss similar issues using different language. Every foundation’s uniqueness is, in its day-to-day work, its strength, but in terms of big-picture learning across organizations, it’s a hurdle.

Producing cohesive, comprehensive data out of diverse, fragmented information across multiple organizations is a huge challenge. Mining the information and tracking it in an ongoing way is another obstacle made more difficult because the results are often more anecdotal than they are purely quantitative. And when this information is spread out over so many regions and focus areas, the types of interventions vary so widely that meaningful knowledge sharing becomes untenable.

Gwyneth Tripp, grants manager at Blue Shield of California Foundation, also cites a capacity issue. Most foundations don’t have designated roles for gathering, tracking, organizing, and exchanging shareable data, so they resort to asking staff who already have their own sizable to-do lists. Tripp says:

“They have an interest and a desire [in knowledge sharing], but also a real challenge of balancing the everyday needs, the strategic goals, the relationships with grantees, and then adding that layer of ‘let’s learn and think about it all’ is really tough to get in.

“Also, becoming more transparent about the way you work, including sharing successes as well as failures, can open your foundation up to scrutiny. This can be uncomfortable. But it’s important to delineate between ‘failure’ and ‘opportunity to learn and improve.’”

Sparking Change

But foundations know (possibly better than anyone else) that obstacles don’t make accomplishing a goal impossible.

And this goal’s rewards are great: When foundations can achieve effective knowledge sharing, they’ll have better insights into what other funding is available for the grantees within the issues they are tackling, who is being supported, which experiments are worth replicating, and where there are both gaps and opportunities. And with those insights, foundations gain the ability to iterate and improve upon their operations, even leading to stronger, more strategic collaborations and partnerships.

Creating and promoting this kind of accessible, useful knowledge sharing starts with a few steps:

  1. Begin from within. Tracking the impact of your grantmaking efforts and sharing those findings with the rest of the sector requires organizations to look internally first. Start by building a knowledge management implementation plan that involves every stakeholder, from internal teams to grantee partners to board executives.
  2. Determine and prioritize technology needs. Improvements in technology — specifically cloud-based technology — are part of what’s driving the demand for data on philanthropic impact in the first place. Your grants management system needs to provide integrated efficiency and accessibility if you want to motivate staff participation and generate usable insights from the data you’re collecting. Is your software streamlining your efforts, or is it only complicating them?
  3. Change your mindset. Knowledge sharing can be intimidating, but it doesn’t have to be. Lose the mindset defined by a fear of failure; instead, embrace one that drives you to search for opportunity. Promote a stronger culture of knowledge sharing across the sector by sharing your organizational practices and lessons learned. Uncover opportunities to collect data and share information across organizations.

There’s no denying that knowledge sharing benefits foundations everywhere, along with the programs they fund. Don’t let the challenges hold you back from pursuing educational, shareable data; you have too much to gain. What will you #OpenForGood?

--Aaron Lester 

Practicing Transparency for Discovery and Learning
May 22, 2017


At The Russell Family Foundation, we appreciate tools that help make the invisible more visible. This pursuit of transparency is a family trait that stems from our experience in the financial services industry, where we invented stock indexes that more accurately reflect the market. The Frank Russell Company earned a reputation for quality research, long-term thinking, and general excellence. We do our best to carry on that tradition at the foundation.

In particular, we seek to communicate and practice our core values, such as lifelong learning and the importance of relationships. During the past 20 years, these touchstones have served us well.


Today, we’re relying on them even more as we prepare for a period of significant transition, which involves new roles for family members, changes to leadership and staff positions, and evolving our core programs. What’s different now, however, is that we are employing new tools to guide us.

Legacy Communications Toolkit

For us, transparency is as much about discovery as disclosure. That’s because the discovery process is how we determine: (1) what we know, (2) what we don’t know, (3) where we stand, and (4) what boundaries, if any, exist for a specific topic. Discovery can be a humbling and inspiring experience. Sometimes it exposes our blind spots; other times it reveals important new opportunities. Either way, learning is the payoff for investing in transparency and discovery.

In 2016, we took steps to re-affirm our founding principles, in order to set the stage for the next 20 years of operations. We identified the need for additional frameworks to help guide us through important issues such as leadership succession and grant strategy. From those efforts, we’ve bundled together all the useful pieces, which we call our Legacy Communications Toolkit (it's a work in progress).

Over the past couple of years, we have developed some new components. One centerpiece is our three-dimensional chessboard, which we introduced in our last blog post. It is a useful tool for initiating and clarifying conversation about important issues that might otherwise be difficult to surface. The chessboard can be used to visualize and understand the complex layers of communications and expectations associated with foundation life – like how transparent we need to be when revising our grant strategy, or how we understand a family member who doesn’t want to participate.

Case in point: In a family foundation, tensions can arise when trustees hold competing or conflicting opinions and worldviews. If not handled sensitively, principled conversations among peers can become deeply personal, causing individuals to lose sight, if only briefly, of the organizational mission and the goal of serving the public trust. One such discussion arose among our trustees in 2016; at issue was the scope of themes that should be eligible for funding. A skilled and trusted organizational consultant from outside the foundation facilitated this deliberate conversation among the family trustees. With that assistance, the trustees clarified the boundaries between personal, familial, organizational, and public goals, and eventually settled on a decision that balanced the greatest number of interests, especially that of serving the foundation’s public mission. This exercise in more transparent communication among trustees and consensus decision-making was essentially the laboratory that gave rise to the three-dimensional chessboard.

Can you imagine applying the three-dimensional chessboard to a crucial conversation waiting to happen at a foundation near you?

Another dynamic tool we rely on is a graphic timeline of the foundation’s history: a 20-foot mural on display in our office that highlights important moments from our beginnings in 1999 to the present day. The timeline is filled with photos, charts, and quotations, with more added as time passes. This visual history does more than remind us of the past; it helps us appreciate the context of defining moments. Those moments, along with the details of our history, constitute our collective narrative. We are continually exploring and discovering the appropriate balance between transparency, family privacy, and the public trust.


The Russell Family Foundation uses its timeline as a teaching tool.  Source: The Russell Family Foundation

To date, the timeline has proven to be an invaluable teaching tool, especially for younger family members who wish to take active roles in the foundation, or newcomers to our enterprise who want to know how we got here. It stimulates conversation and questions, and it has helped us onboard new community board members and staff by giving them a vivid sense of our history and mission. Grantees and community visitors are often intrigued by the informal imagery captured on the story wall, which invites their curiosity, discussion and ultimately a deeper relationship with our work.

Imagining the Future Together

The elements of our Legacy Communications Toolkit emphasize storytelling in its many forms: visual, narrative, historical, data-driven, and more. Storytelling activates our imaginations so we can see the changes we’ve accomplished or wish to make going forward. This process also helps us envision what level of transparency is required.

A good example of this approach is how we are currently updating one of our longest standing environmental programs, which focuses on the waters of Puget Sound.

After a decade of investment and hundreds of grants employing a wide variety of tactics, we took stock of our impact on Puget Sound protection and restoration. We reviewed our grant history, studied the most recent literature, interviewed regional thought leaders, and drew upon the relationships with our longtime grantees. The effort was illuminating – making the invisible more visible. Despite all that had been accomplished over the years, we recognized that our efforts were a mile wide and an inch deep.

Visualizing our impact in this way gave us the motivation to develop a new approach. We knew from past projects that there was an appetite for alignment among nonprofits. We also realized that our broad network of individual grantees gave us credibility to encourage greater collaboration within the field. We put these pieces together to create the Puget Sound Collective, an informal group of nonprofits and funders who desire a more coordinated regional vision and strategy for Puget Sound recovery.

Our partners joined the Puget Sound Collective for the possibility of making greater impact and doing more together. But, naturally, they want to know where their peers are coming from, how specific goals will be set, and how decisions will be made. In other words, they expect transparency. We knew going in that openness and candor would be the table stakes for this new forum. However, bringing people together to work across differences (organizations, missions, geographies, genders, race, class, etc.) requires transparency in all directions. That takes time; it takes deep, trusting relationships.

The experience has reinforced how important it is for the foundation to practice transparent behavior. We are building the road alongside our partners as we walk it. We need to be honest when we can only see as far down the road as they can. We need to be clear in our intention for grantees to set the agenda – to offer support without control – because relationships like this move at the speed of trust.

At a time when the country is experiencing deep divides and uncertainty, family foundations can reassure their constituents by demonstrating a commitment to transparency about their story and the essentials behind the work they do. However, they should also bear in mind the Goldilocks Principle – “not too hot, not too cold, but just right.” They need to find the best fit for their organization because the benefits of transparency are measured in degrees.

We hope our methods, experiments, and discoveries serve as useful references. Mahalo to those who commented on the first blog post. To everyone reading this installment, please share your thoughts, counterpoints or questions.

--Richard Russell and Richard Woo


About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations, illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be
    directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a
    guest contributor, contact:
    glasspockets@foundationcenter.org
