Transparency Talk

Category: "Data" (106 posts)

No Pain, No Gain: The Reality of Improving Grant Descriptions
November 8, 2017

Gretchen Schackel is Grants Manager of the James F. and Marion L. Miller Foundation in Portland, Oregon.

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series explores the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

Join us at a session about the Open 990-PF in partnership with Southern California Grantmakers. Learn more or register here.                                   

You know those blog posts that describe adopting a best practice? The ones that make it sound so easy and tempting that you try it, only to be let down because you discover that either you are doing something terribly wrong, or it is a lot harder than the author made it sound because they left out all of the pain points? Well, don’t worry—this is not one of those posts! In fact, I will start off with the pain points so you can go in eyes wide open if, like me, you end up on a quest to improve your foundation’s grant descriptions.

This post is a sequel to another Transparency Talk article that recently featured our foundation’s executive director, detailing lessons learned about why improving grants data is important to the foundation, as well as to the sector as a whole. That article ended with a brief snapshot of some “before and after” grant descriptions, showing how we are working to improve the way we tell the story of each grant, so I’m picking up here where that left off to share an honest, behind-the-scenes look at what it took to get from the before to the after.

“Capturing critical details when writing accurate and complete grant descriptions aids your efforts on the 990-PF form.”

Pain Relievers

As the grants manager, it’s my job to put the right processes in place so we can capture critical details when writing grant descriptions to ensure that they are accurate, complete, and, well... actually descriptive (AKA “Purpose of grant or contribution” on Form 990-PF). This fall marks my 11th anniversary at the Miller Foundation, and one thing that has remained constant throughout my tenure is what a pain writing good grant descriptions can be if you don’t know where to begin. So, I’m sharing my playbook below, because the communities we are serving, and how we are serving them, deserve to be described and celebrated. I’ve learned some tips and work-arounds along the way that I’ll share as I inventory the various obstacles you might encounter.

Pain Point #1:

Lean Staffing. We are a staff of four: Executive Director, Program Officer, Grants Manager, and Administrative Assistant. We don’t publish an annual report, we have just started using social media, and we just completed a website redesign. This makes all of us part-time communications staff. I wouldn’t describe this as a best practice, but it’s the reality at many foundations.

Pain Reliever #1:

Grant Descriptions Can Serve Many Purposes. As mentioned above, the editorial process involved in prepping text for public consumption can be labor intensive, particularly in organizations without a communications department. Grant descriptions, which represent the substance of our work, turn out to be handy for small organizations like ours because they can serve many purposes. They are used for our minutes, our website, our 990-PF, and for our eReport to Foundation Center for its searchable databases. We don’t have time to write different grant descriptions for each specific use. So, we write one grant description that we can use in multiple platforms and situations.

Pain Point #2:

Garbage In – Garbage Out. Data starts with the grantees, and I know from talking to them that they often lack the time or technology to collect good data. It’s not just about what questions we are asking, but how we are helping our grantees understand what we need so they can get us the best data possible.

Pain Reliever #2:

You have to work with what you’ve got. And what we have is the information provided by potential grantees in their applications. Most of the information we need can be found in the “Brief summary of the grant request” question on the grant application. Rather than treat this as a test that potential grantees must either pass or fail, we provide detailed instructions about the kind of information we would like to see in the summary as part of our online application process. Taking the guesswork out of the application has improved the quality of the data we receive at the start of the grant. Our arts portfolio also requires that grantees participate in DataArts, a collective database into which grantees enter their data once so that all arts funders can access it. Participating in field-building shortcuts like this is a great way to make the process more efficient for everyone.

Once you have the framework in place to get a good grant summary from your prospective grantees, however, your work is not yet done.  Often, important elements of the funded grant can change during board deliberations, so I find it essential to share the grant summary with our program staff before finalizing to ensure we are capturing the detail accurately.

Pain Point #3:

Lack of an Industry Standard. There is no consensus on what makes the perfect grant description. There are probably as many ways to write one as there are foundations, and reinventing wheels is a waste of our collective time, so I have long wished for a framework we could all agree to follow.

Pain Reliever #3:

The Get on the Map Campaign.

We have learned a lot from Foundation Center’s Get on the Map campaign about the elements of a great grant description. The Get on the Map campaign is a partnership between United Philanthropy Forum and Foundation Center designed to improve philanthropic data, and includes a helpful framework that details the best way to share your data with Foundation Center and the public. What I immediately loved about it is how it reminded me of being that weird kid who loved to diagram sentences in junior high. But perhaps it’s not that strange since I know grants managers enjoy turning chaos into order. So, let's try to use sentence diagramming as a model for writing grant descriptions.

The Anatomy of a Good Grant Description

First, let’s start with the four elements of a good grant description:

  • WHAT: What is the primary objective of the grant?
  • WHO: Are there any specifically intended beneficiaries?
  • HOW: What are the primary strategies of the grant?
  • WHERE: Where will the grant funds be used, if beyond the recipient’s address?

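For foundations that already capture these elements as separate fields in a grants database, assembling them into a consistent description can even be scripted. Below is a minimal, illustrative sketch in Python; the function name, parameter names, and sentence template are my own inventions for demonstration, not part of the Get on the Map framework:

```python
def build_description(what, who=None, how=None, where=None):
    """Assemble a grant description from the four elements above.

    Only WHAT is required; the other elements are appended when
    present, mirroring the WHAT/WHO/HOW/WHERE checklist.
    """
    parts = [f"To support {what}"]
    if how:
        parts.append(f"through {how}")
    if who:
        parts.append(f"serving {who}")
    if where:
        parts.append(f"in {where}")
    return " ".join(parts) + "."

# A close approximation of Example #1's description, reassembled
# from its elements:
print(build_description(
    what="the Chicas Youth Development program",
    who="500 Latina girls in grades 3-12",
    where="Washington County",
))
# → To support the Chicas Youth Development program serving
#   500 Latina girls in grades 3-12 in Washington County.
```

Even if no one automates this step, filling in the four fields first and then composing the sentence is a useful discipline: it makes missing elements obvious before the description is finalized.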
Example #1:

We’ll start with an easy example. Program support grant descriptions often write themselves:

Brief summary of the grant request from application form:

“We are seeking support for Chicas Youth Development which serves over 500 Latina girls and their families in grades 3-12 in Washington County. Chicas launched in 2008 and has since grown to partner with three Washington County school districts and over 500 local families each year to offer after school programming, leadership, and community service opportunities for Latina youth and their families.”

Grant Description: To support the Chicas Youth Development program, which serves 500 Latina girls in grades 3-12 in Washington County.

That was pretty easy! Particularly because of how we improved the clarity of what we ask for.

Example #2:

The grant below is also a project grant, but the brief summary of the grant request from the application is a little less straightforward:

“GRANTEE requests $AMOUNT to support the presentation of two new publications and four community readings featuring the writing of diverse voices: people who are experiencing homelessness, immigrants and refugees living in our community, seniors living on a low income, LGBTQ folks, people living with a disability, and many others whose voices often live on the margins. This project will bring together people to experience and explore art and will focus on those with the least access to do so.”

Grant Description: To support community building through publication and public readings of works written by marginalized populations.

Example #3:

This grant is for both general operating support and a challenge grant. Tricky.

“GRANTEE respectfully requests $AMOUNT over two years to support program growth as well as provide a matching challenge for individual donations as we continue to increase our sustainability through support from individual donors. If awarded, $AMOUNT would be put to general operating funds to support our continued program growth in all areas: traditional high school program, statewide initiative pilot program and our college program. The remaining $AMOUNT each year would serve as a matching challenge grant. In order to be eligible for the match, GRANTEE would have to raise $AMOUNT in new and increased individual donations each year of the grant period.”

Okay Grant Description: To support program growth and provide a matching challenge for individual donations.

Good Grant Description: General operating funds to support program growth and a challenge grant to increase support from individual donors.

Better Grant Description: This grant was awarded in two parts: 1. General operating funds for mission related activities that provide intensive support to low-income high school juniors and seniors in Oregon. 2. A 1:1 challenge grant to increase support from individual donors.

The above description is a perfect example of why it’s important to read the proposal narrative as well as confer with program staff.

If you follow this process, I can’t promise it will be painless, but it will go a long way toward relieving the pain points that come with grants management—particularly today, when grants managers sit at the crossroads of data manager, information officer, and storyteller. I have found the journey is worth it, because behind every grant lies a story waiting to be told and a community waiting to hear it. So, let’s get our stories straight!

--Gretchen Schackel

In the Know: #OpenForGood Staff Pick
November 1, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Native Arts & Cultures Foundation.


Staff Pick: Native Arts & Cultures Foundation

Progressing Issues of Social Importance Through the Work of Indigenous Artists: A Social Impact Evaluation of the Native Arts and Cultures Foundation's Pilot Community Inspiration Program

Download the Report

Quick Summary


Impact measurement is a challenge for all kinds of organizations, and arts and culture organizations in particular often struggle with how to quantify the impact they are making. How does one measure the social impact of an epic spoken word poem, of a large-scale temporary art installation, or of performance art? The same is true of measuring the impact of social change efforts: how can these be measured in the short term, given the usual pace of change? This report provides a good example of how to overcome both of these struggles.

In 2014, the Native Arts & Cultures Foundation (NACF) launched a new initiative, the Community Inspiration Program (CIP), which is rooted in the understanding that arts and cultures projects have an important role to play in motivating community engagement and supporting social change.

This 2017 report considers the social impacts of the 2014 CIP projects—what effects did they have on communities and on the issues, conversations, and connections that are critical in those communities? Its secondary purpose is to provide the NACF with ideas for how to improve its grantmaking in support of arts for community change.

Field(s) of Practice

  • Arts and Culture
  • Native and Indigenous Communities
  • Social Change
  • Community Engagement

This report opens up knowledge about the pilot phases of a new initiative whose intended impacts, community inspiration and social change, are vital but difficult concepts to operationalize and measure. The evaluation provides valuable insight into how foundations can encourage the inclusion of indigenous perspectives and truths not only in the design of their programs but also in the evaluation of those same programs.

What makes it stand out?

Several key aspects make this report noteworthy. First, the evaluation combines more traditional methods and data with what the authors call an "aesthetic-appreciative" evaluation lens, which accounts for a set of dimensions associated with aesthetic projects, such as "disruption," "stickiness," and "communal meaning," providing a more holistic analysis of the projects. Further, because the evaluation focused on projects led by Native artists, it relied on the guidance of indigenous research strategies. This intentionality around developing strategies and principles for stakeholder inclusion makes the report a useful framework for others, regardless of whether Native communities are the focus of your evaluation.

Key Quote

"Even a multiplicity of evaluation measures may not 'truly' tell the story of social impact if, for evaluators, effects are unobservable (for example, they occur at a point in the future that is beyond the evaluation's timeframe), unpredictable (so that evaluators don't know where to look for impact), or illegible (evaluators cannot understand that they are seeing the effects of a project)."

--Gabriela Fitz

Open Access to Foundation Knowledge
October 25, 2017

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. This post also appears in Medium. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


Foundations have a lot of reasons to share knowledge. They produce knowledge themselves. They hire others to research and author works that help with internal strategy development and evaluation of internal strategies, programs, and projects. And they make grants that assist others in gaining insight into social issues — be it through original research, evaluation work, or other work aimed at creating a better understanding of issues so that we can all pursue better solutions to social problems. In almost all aspects of foundation work, knowledge is an outcome.

While openly sharing this knowledge is uneven across the social sector, we do see more and more foundations starting to explore open access to the knowledge assets they make possible. Many foundations are sharing more intentionally through their websites, external clearinghouses, and other online destinations. And more foundations are suggesting — sometimes requiring — that their grantees openly share knowledge that was produced with grant dollars.


Some foundations are even becoming open access champions. For example, the Hewlett Foundation has authored a terrifically helpful free toolkit that provides an in-depth how-to aimed at moving foundation and grantee intellectual property licensing practices away from “all rights reserved” copyrights and toward “some rights reserved” open licenses. (Full disclosure: IssueLab is included in the toolkit as one solution for long term knowledge preservation and sharing.) (“Hewlett Foundation Open Licensing Toolkit for Staff”)

For those who are already 100% open, it’s easy to forget that learning about open access can be daunting when you’re first starting out. For those trying to open up, getting there, like most things, is a series of steps. One step is understanding how licensing can work for, or against, openness. Hewlett’s toolkit is a wonderful primer for understanding this. IssueLab also offers some ways to dig into other areas of openness. Check out Share the Wealth for tips.


However it is that foundations find their way to providing open access to the knowledge they make possible, we applaud and support it! In the spirit of International Open Access Week’s theme, “Open in order to….,” here’s what a few leading foundations have to say about the topic of openness in the social sector.

James Irvine Foundation 
Find on IssueLab.

“We have a responsibility to share our knowledge. There’s been a lot of money that gets put into capturing and generating knowledge and we shouldn’t keep it to ourselves.”

-Kim Ammann Howard, Director of Impact Assessment and Learning

Hewlett Foundation
Find on IssueLab.

“Our purpose for existing is to help make the world a better place. One way we can do that is to try things, learn, and then share what we have learned. That seems obvious. What is not obvious is the opposite: not sharing. So the question shouldn’t be why share; it should be why not share.”

-Larry Kramer, President

Hawaii Community Foundation
Find on IssueLab.

“Openness and transparency is one element of holding ourselves accountable to the public — to the communities we’re either in or serving. To me, it’s a necessary part of our accountability and I don’t think it should necessarily be an option.”

-Tom Kelly, Vice President of Knowledge, Evaluation and Learning

The David and Lucile Packard Foundation
Find on IssueLab.

“Why do we want to share these things? …One, because it’s great to share what we’re learning, what’s worked, what hasn’t, what impact has been made so that others can learn from the work that our grantees are doing so that they can either not reinvent the wheel, gain insights from it or learn from where we’ve gone wrong… I think it helps to build the field overall since we’re sharing what we’re learning.”

-Bernadette Sangalang, Program Officer

The Rockefeller Foundation
Find on IssueLab.

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations — well before the results are even known.”

-Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly
(Read more on why the Rockefeller Foundation is open for good.)

If you are a foundation ready to make open access the norm as part of your impact operations, here’s how you can become an open knowledge organization today.

IssueLab believes that social sector knowledge is a public good that is meant to be freely accessible to all. We collect and share the sector’s knowledge assets and we support the social sector’s adoption of open knowledge practices. Visit our collection of ~23,000 open access resources. While you’re there, add your knowledge — it takes minutes and costs nothing. Find out what we’re open in order to do here. IssueLab is a service of Foundation Center.

--Lisa Brooks and Lacey Althouse

Championing Transparency: The Rockefeller Foundation Is First to Share All Evaluations As Part of #OpenForGood
September 26, 2017

The Rockefeller Foundation staff who authored this post are Veronica Olazabal, Director of Measurement, Evaluation, and Organizational Performance; Shawna Hoffman, Measurement, Evaluation, and Organizational Performance Specialist; and Nadia Asgaraly, Measurement and Evaluation Intern.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


Today, aligned with The Rockefeller Foundation's commitments to sharing and accountability, we are proud to be the first foundation to accept the challenge and proactively make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

A History of Transparency and Sharing

Since its founding more than 100 years ago, The Rockefeller Foundation's mission has remained unchanged: to promote the well-being of humanity throughout the world. To this end, the Foundation seeks to catalyze and scale transformative innovation across sectors and geographies, and take risks where others cannot, or will not. While working in innovative spaces, the Foundation has always recognized that the full impact of its programs and investments can only be realized if it measures - and shares - what it is learning. Knowledge and evidence sharing is core to the organization's DNA dating back to its founder John D. Rockefeller Sr., who espoused the virtues of learning from and with others—positing that this was the key to "enlarging the boundaries of human knowledge."

“To ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known.”

Evaluation for the Public Good

Building the evidence base for the areas in which we work is the cornerstone of The Rockefeller Foundation's approach to measurement and evaluation. By systematically tracking progress toward implementation and outcomes of our programs, and by testing, validating, and assessing our assumptions and hypotheses, we believe that we can manage and optimize our impact. By documenting what works, for whom, how, and under what conditions, we can amplify our impact by crowding in other funders to promising solutions and keeping resources from being wasted on approaches that prove ineffectual.

But living out transparency as a core value is not without its challenges. A commitment to the principle of transparency alone is insufficient; organizations, especially foundations, must walk the talk. Sharing evidence requires the political will and human resources to do so, and more importantly, getting comfortable communicating not only one's successes, but also one's challenges and failures. For this reason, to ensure that we hold ourselves to this high bar, The Rockefeller Foundation pre-commits itself to sharing the results of its evaluations - well before the results are even known. Then, once evaluation reports are finalized, they are posted to the Foundation website, available to the public free of charge.

#OpenForGood Project

Foundation Center's #OpenForGood project, and IssueLab's related Results platform, help take the Foundation's commitment to sharing and strengthening the evidence base to the next level. By building a repository where everyone can identify others working on similar topics, search for answers to specific questions, and quickly identify where knowledge gaps exist, they are leading the charge on knowledge sharing.

The Rockefeller Foundation is proud to support this significant effort by being the first to contribute its evaluation evidence base to IssueLab: Results as part of the #OpenForGood movement, with the hope of encouraging others to do the same.

-- Veronica Olazabal, Shawna Hoffman, and Nadia Asgaraly

Trend to Watch: Using SDGs to Improve Foundation Transparency
September 19, 2017

(Janet Camarena is director of transparency initiatives at Foundation Center. )

As Foundation Center's director of transparency initiatives, one of the most interesting parts of my job is having the opportunity to play "transparency scout," regularly reviewing foundation websites for signs of openness in what is too often a closed universe. Some of this scouting leads to lifting up practices that can serve as examples for others on our Transparency Talk blog; sometimes it leads to a new transparency indicator on our assessment framework; and sometimes we just file it internally as a "trend to watch."

Today, it's a combination of all three; we are using this blog post to announce the launch of a new, "Trend to Watch" indicator that signals an emerging practice: the use of the Sustainable Development Goals to improve how foundations open up their work to the world.

The United Nations' Sustainable Development Goals (SDGs), otherwise known as the Global Goals, are a universal call to action to end poverty, protect the planet, and ensure that all people enjoy peace and prosperity. There are 17 goals in total, including ending poverty, zero hunger, reduced inequalities, and climate action. Deliberately written broadly to serve as a collective playbook that governments and the private sector alike can use, they can also serve as a much-needed shared language across philanthropy and across sectors to signal areas of common interest and measure shared progress.

And let's face it, as foundation strategies become increasingly specialized and strategic, explaining the objectives and the nuances can become a jargon-laden minefield that can make it difficult and time consuming for those on the outside to fully understand the intended goal of a new program or initiative. The simplicity of the SDG iconography cuts through the jargon so foundation website visitors can quickly identify alignment with the goals or not, and then more easily determine whether they should devote time to reading further. The SDG framework also provides a clear visual framework to display grants and outcomes data in a way that is meaningful beyond the four walls of the foundation.

Let's take a look at how some foundation websites are using the SDGs to more clearly explain their work:

Silicon Valley Community Foundation (SVCF)

One of my favorite examples is from a simple chart the Silicon Valley Community Foundation shared on its blog, because it specifically opens up the work of its donor-advised funds using the SDGs. Donor-advised funds are typically not the most transparent vehicles, so using the SDGs as a framework to tally how SVCF's donor-advised funds are making an impact is particularly clever, refreshing, and offers a new window into a fast-growth area of philanthropy.

A quick glance at the chart reveals that quality education, good health and well-being, and sustainable cities and communities are the most common priorities among Silicon Valley donors.

GHR Foundation

A good example of how the SDGs can be used as a shared language to explain the intended impact of a grant portfolio comes from GHR Foundation in Minnesota. I also like this example because it shows how the SDGs can be effectively used in both global and domestic grant portfolios. GHR uses the SDG iconography across all of its portfolios, as sidebars on the pages that describe foundation strategies. GHR's "Children in Families" is a core grantmaking strategy that addresses children and families in need on a global scale. The portfolio name is a broad one, but by including the SDG iconography, web visitors can quickly understand that GHR is using this program area to address poverty and hunger, as well as outcomes tied to health and well-being:

GHR is also able to use the SDG framework to create similar understanding of its domestic work. Below is an example from its Catholic Schools program serving the Twin Cities:

Through the visual cues the icons provide, I can quickly determine that, in addition to aligning with the quality education goal, this part of GHR's portfolio also addresses hunger and economically disadvantaged populations through its education grantmaking. This could also signal that the grantmaker interprets education broadly and supports wrap-around services to address the needs of low-income children as a holistic way of closing the achievement gap. That's a lot of information conveyed with three small icons!

Tableau Foundation

The most sophisticated example comes to us from the tech and corporate grantmaking worlds--the Tableau Foundation. Tableau makes data visualization software, so using technology as a means to improve transparency is a core approach, and they are using their own grantmaking as an example of how you can use data to tell a compelling visual story. Through the interactive "Living Annual Report" on its website, Tableau regularly updates its grantmaking tallies and grantee data so web visitors have near real-time information. One of the tabs on the report reveals the SDG indicators, providing a quick snapshot of how Tableau's grantmaking, software donations, and corporate volunteering align with the SDGs.

As you mouse over any bar on the left, near real-time data appears, tallying how much of Tableau's funding has gone to support each goal. The interactive bar chart on the right lists Tableau's grantees, and visitors can quickly see the grantee list in the context of the SDGs as well as know the specific scale of its grantmaking to each recipient.

If you're inspired by these examples, but aren't sure how to begin connecting your portfolio to the Global Goals, you can use the SDG Indicator Wizard to help you get started. All you need to do is copy and paste your program descriptions or the descriptive language of a sample grant into the Wizard and its machine-learning tools let you know where your grantmaking lands on the SDG matrix. It's a lot of fun – and great place to start learning about the SDGs. And, because it transforms your program language into the relevant SDG goals, indicator, and targets, it may just provide a shortcut to that new strategy you were thinking of developing!

Want more examples? The good news is we're also tracking SDG use as a transparency indicator at "Who Has Glasspockets?" You can view them all here. Is your foundation using the SDGs to help tell the story of your work? We're always on the lookout for new examples, so let us know, and your foundation could be the next trendsetter in our new Trend to Watch.

-- Janet Camarena

I Thought I Knew You: Grants Data & the 990PF
August 23, 2017

(Martha S. Richards is the Executive Director of the James F. and Marion L. Miller Foundation in Portland, Oregon.)

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series will explore the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

Join us at a session about the Open 990-PF in partnership with Grantmakers of Oregon and Southwest Washington. Learn more or register here.

I have a confession to make. Up until a few years ago, when this story begins, I used to take the 990-PF for granted. I thought of it as something that ensured we were following federal regulations, and that if we filed it on time and followed the reporting practices we had always used, this would be sufficient for all concerned. I was also pretty certain no one but a few insiders within the government and perhaps a handful of philanthropy groups would ever bother to read it.

Well, you might have heard the expression: "You don't know what you don't know," and that's a good segue to what I have to share.

In Spring 2010, the Coalition of Communities of Color (CCC) released a study -- Communities of Color in Multnomah County: an Unsettling Profile -- which defined the disparities facing communities of color in Oregon's largest urban area, Portland. Inspired by this analysis, that December, Foundation Center (FC) and Grantmakers of Oregon and SW Washington (GOSW) co-presented Grantmaking to Communities of Color in Oregon -- a groundbreaking report that acknowledged that philanthropy was part of the problem. The report estimated only 9.6% of grants awarded in 2008 by Oregon private and community funders actually reached communities of color.

While the data told a moving story, the source of the data also became a parallel conversation because the philanthropic community here in Oregon learned about the limitations of using tax returns to tell such important stories. The grant descriptions in our 990s rarely disclose details about the intended beneficiaries of the grants—even if we know them.

The result: We embarked on a long journey to address both issues. While GOSW and CCC hosted a forum to raise awareness of the reports and their attendant policy recommendations, foundations committed to look more closely at their giving practices as well as their data collection efforts, with a particular emphasis on collecting better beneficiary data and strengthening their reporting relationship with Foundation Center.

This prompted us at the James F. and Marion L. Miller Foundation to examine our own giving and how we could describe its reach. We fund in the areas of arts and K-12 education. We have a small staff. Our application process did not require a detailed analysis of demographic data from arts applicants or schools, nor an understanding of the diverse nature of nonprofit leadership among our grantees. We realized that we did not know if the grants we made were reaching the populations we hoped to serve.

As part of this effort, I chaired a GOSW-led Data Work Group to explore how to obtain more meaningful data sets without adding to the length and complexity of our application processes. We invited nonprofit partners to the table. We studied Foundation Center's processes and invited their staff to meet with and advise us. We tried, tested, and began to encourage nonprofits to help us learn more about how and whom we were reaching with our philanthropic dollars. Eventually, we encouraged many of our Oregon foundations to become eReporters to Foundation Center, providing more detailed descriptions of what each grant was for and who was reached with the funding. Our reports to the Foundation Center and to the IRS have improved, and we make an effort to report detailed demographic information.

Before and After Chart

However, we discovered that it can be difficult for some types of organizations to capture specific demographic data. In the arts, for instance, outside of audience surveys, one generally does not complete a demographic survey to buy a ticket. At the Miller Foundation, we chose to partner with DataArts to collect financial and audience data on our arts grantees. Arts organizations annually complete the profile, and it can be used by several arts funders in the state. Their demographic profile is still being developed, but it will encourage better data capture in the future. Unfortunately, this platform does not exist for other nonprofits.

Get on the Map

Get on the Map encourages foundations to share current and complete details about their grantmaking with Foundation Center. The interactive map, databases and reports allow foundations to have a better understanding of grantee funding and demographics.

We didn't know it then, but as a result of our committee's efforts, a new data improvement movement was born, called Get on the Map (GOTM). GOTM encourages foundations to share current and complete details about their grantmaking with Foundation Center, so the Maps, databases, and reports it issues are as accurate as possible. The grants we share also populate an interactive map that members of GOSW have access to, which means that we have a better idea of the ecosystem in which we work. It has since scaled nationally with other regions also committing to improve the data they collect and share about their grantmaking so we can all be less in the dark about what efforts are underway and who is working on them.

As a result, our foundation has a better understanding today of who our grantees are serving and reaching than we did seven years ago, and I think we are also doing a better job of sharing that story with the IRS, Foundation Center, and the many sets of eyes I now know view those platforms.

We are still learning what we do not know. But at least, now we know what we do not know.

-- Martha Richards


Coming to Grantmakers of Oregon and Southwest Washington: To learn more about what story your 990PF tells about your foundation, register to attend Once Upon a 990PF. Visit the GOSW website for more information and to register.

How To Keep Me Scrolling Through What You Are Sharing
August 10, 2017

Tom Kelly is Vice President of Knowledge, Evaluation & Learning at the Hawai‘i Community Foundation. He has been learning and evaluating in philanthropy since the beginning of the century. @TomEval  TomEval.com

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Hello, my name is Tom and I am a Subscriber. And a Tweeter, Follower, Forwarder (FYI!), Google Searcher, and DropBox Hoarder. I subscribe to blogs, feeds, e-newsletters, and email updates. My professional title includes the word “Knowledge,” so I feel compelled to make sure I am keeping track of the high volume of data, information, reports, and ideas flowing throughout the nonprofit and foundation worlds (yes, it is a bit of a compulsion…and I am not even including my favorite travel, shopping and coupon alerts).

It is a lot and I confess I do not read all of it. It is a form of meditation for me to scroll through emails and Twitter feeds while waiting in line at Aloha Salads. I skim, I save, I forward, I retweet – I copy and save for later reading (later when?). In fact, no one can be expected to keep up, so how does anyone make sense of it all, or even find what we need when we need it? Everyone being #OpenForGood and sharing everything is great, but who is reading it all? And how do we make what we are opening for good actually good?

Making Knowledge Usable

We have all experienced at some point Drowning in Information-Starving for Knowledge (John Naisbitt’s Megatrends…I prefer E.O. Wilson’s “starving for wisdom” theory). The information may be out there but rarely in a form that is easily found, read, understood, and most importantly used. Foundation Center and IssueLab have made it easier for people in the sector to know what is being funded, where new ideas are being tested, and what evidence and lessons are available. But nonprofits and foundations still have to upload and share many more of their documents than they do now. And we need to make sure that the information we share is readable, usable, and ready to be applied.

Hawaii Community Foundation Graphic

DataViz guru Stephanie Evergreen recently taught me a new hashtag: #TLDR – “Too Long, Didn’t Read.”

She now proposes that every published report be available in three formats – a one-page handout with key messages, a 3-page executive summary, and a 25-page report (plus appendices). In this way the “scanners,” “skimmers” and “deep divers” can access the information in the form they prefer and in the time they have. It also requires writing (and formatting) differently for each of these sets of eyes. (By the way, do you know which one you are?)

From Information to Influence

But it is not enough to make your reports accessible, searchable, and also easily readable in short and long forms; you also need to include the information people need to make decisions and act. It means deciding in advance who you want to inform and influence and what you want people to do with the information. You need to be clear about your purpose for sharing information, and you need to give people the right kinds of information if you expect them to read it, learn from it, and apply it.

“Give people the right kinds of information if you expect them to read it, learn from it, and apply it.”

Too many times I have read reports with promising findings or interesting lessons, and then I race through all the footnotes and the appendices at the back of the report looking for resources that could point me to the details of evidence and data or implementation guidance. I usually wind up trying to track down the authors by email or phone to follow up.

A 2005 study of more than 1,000 evaluations published in human services found only 22 well-designed and well-documented reports that shared any analysis of implementation factors – what lessons people learned about how best to put the program or services in place. We cannot expect other people and organizations to share knowledge and learn if they cannot access information from others that helps them use the knowledge and apply it in their own programs and organizations. YES, I want to hear about your lessons and “a-ha’s,” but I also want to see data and analysis of the common challenges that all nonprofits and foundations face:

  • How to apply and adapt program and practice models in different contexts
  • How to sustain effective practices
  • How to scale successful efforts to more people and communities

This means making sure that your evaluations and your reports include opening up the challenges of implementation – the same challenges others are likely to face. It also means placing your findings in the context of existing learning while also using similar definitions so that we can build on each other’s knowledge. For example, in our recent middle school connectedness initiative, our evaluator Learning for Action reviewed the literature first to determine specific components and best practices of youth mentoring so that we could build the evaluation on what had come before, and then report clearly about what we learned about in-school mentoring and open up useful and comparable knowledge to the field.

So please plan ahead and define your knowledge sharing and influence agenda up front and consider the following guidelines:

  • Who needs to read your report?
  • What information does your report need to share to be useful and used?
  • Read and review similar studies and reports and determine in advance what additional knowledge is needed and what you will document and evaluate.
  • Use common definitions and program model frameworks so we are able to continually build on field knowledge and not create anew each time.
  • Pay attention to and evaluate implementation, replication and the management challenges (staffing, training, communication, adaptation) that others will face.
  • And disseminate widely and share at conferences, in journals, in your sector networks, and in IssueLab’s open repository.

And I will be very happy to read through your implementation lessons in your report’s footnotes and appendices next time I am in line for a salad.

--Tom Kelly

Foundations and Endowments: Smart People, Dumb Choices
August 3, 2017

(Marc Gunther writes about nonprofits, foundations, business and sustainability. He also writes for NonprofitChronicles.com. A version of this post also appears in Nonprofit Chronicles.)

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series will explore the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

America’s foundations spend many millions of dollars every year on investment advice. In return, they get sub-par performance.

You read that right: Money that could be spent on charitable programs — to alleviate global poverty, help cure disease, improve education, support research or promote the arts — instead flows into the pockets of well-to-do investment advisors and asset managers who, as a group, generate returns on their endowment investments that are below average.

This is redistribution in the wrong direction, on a grand scale: Foundation endowments hold about $800 billion in investments. It hasn’t attracted a lot of attention, but that could change as foundations make their IRS tax filings open, digital and searchable. That should create competitive pressure on foundation investment officers to do better, and on foundation executives and trustees to rethink business-as-usual investing.

The latest evidence that they aren’t doing very well arrived recently with the news that two energy funds managed by a Houston-based private equity firm called EnerVest are on the verge of going bust. Once worth $2 billion, the funds will leave investors “with, at most, pennies for every dollar they invested,” the Wall Street Journal reports. To add insult to injury, the funds in question were invested in oil and natural gas during 2012 and 2013, just as Bill McKibben, 350.org and a handful of their allies were urging institutional investors to divest from fossil fuels.

Foundations that invested in the failing EnerVest funds include the J. Paul Getty Trust, the John D. and Catherine T. MacArthur Foundation and the California-based Fletcher Jones Foundation, according to their most recent IRS filings. Stranded assets, anyone?

“Endowed private foundations are unaccountable to anyone other than their own trustees.”

Of course, no investment strategy can prevent losses. But the collapse of the EnerVest funds points to a broader and deeper problem–the fact that most foundations trust their endowments to investment offices and/or outside portfolio managers who pursue active and expensive investment strategies that, as a group, have underperformed the broader markets.

How costly has this underperformance been? That’s impossible to know because most foundations do not disclose their investment returns. This, by itself, is troubling; it’s a reminder that endowed private foundations are unaccountable to anyone other than their own trustees.

On disclosure, there are signs of progress. The Ford Foundation says it intends to release its investment returns for the first time. A startup company called Foundation Financial Research is compiling data on endowments as well, which it intends to make available to foundation trustees and sell to asset managers.

What’s more, as the IRS Form 990s filed by foundations become machine readable, it will become easier for analysts, activists, journalists and other foundations to see exactly how billions of dollars of foundation assets are deployed, and how they are performing. Advocates for mission-based investment, or for hiring more women and people of color to manage foundation assets, are likely to shine a light on foundations whose endowments are underperforming.

Unhappily, all indications are that most foundations are underperforming because they pursue costly, active investment strategies. This month, what is believed to be the most comprehensive annual survey of foundation endowment performance once again delivered discouraging news for the sector.

The 2016 Council on Foundations–Commonfund Study of Investment of Endowments for Private and Community Foundations® reported on one-year, five-year and 10-year returns for private foundations, and they again trail passive benchmarks.

The 10-year annual average return for private foundations was 4.7 percent, the study found. The five-year return was 7.6 percent. Those returns are net of fees — meaning that outside investment fees are taken into account — but they do not take into account the considerable salaries of investment officers at staffed foundations.

By comparison, Vanguard, the pioneering giant of passive investing, says a simple mix of index funds with 70 percent in stocks and 30 percent in fixed-income assets delivered an annualized return of 5.4 percent over the past 10 years. The five-year return was 9.1 percent.

These differences add up in a hurry.
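To see just how quickly, consider a back-of-the-envelope sketch: a hypothetical $1 billion endowment compounding for a decade at the study's 4.7 percent private-foundation average versus the 5.4 percent Vanguard 70/30 benchmark. (This ignores annual payout requirements, contributions, and year-to-year volatility, so it illustrates the scale of the gap, not any specific foundation's experience.)

```python
# Hypothetical illustration: compound a $1B endowment at the two reported
# 10-year average returns and compare ending values. Ignores payouts,
# inflows, and volatility; returns are the net-of-fee figures cited above.
endowment = 1_000_000_000
years = 10

active = endowment * (1 + 0.047) ** years   # CoF-Commonfund private foundation average
passive = endowment * (1 + 0.054) ** years  # Vanguard 70/30 index mix

gap = passive - active
print(f"Active strategy:  ${active:,.0f}")
print(f"Passive 70/30:    ${passive:,.0f}")
print(f"Ten-year gap:     ${gap:,.0f}")  # on the order of $100 million
```

A 0.7-point annual difference looks small, but compounded over ten years on a billion-dollar endowment it leaves roughly nine figures of potential grantmaking on the table.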

Warnings, Ignored

The underperformance of foundation endowments is not a surprise. In a Financial Times essay called The end of active investing? that should be read by every foundation trustee, Charles D. Ellis, who formerly chaired the investment committee at Yale, wrote:

“Over 10 years, 83 per cent of active funds in the US fail to match their chosen benchmarks; 40 per cent stumble so badly that they are terminated before the 10-year period is completed and 64 per cent of funds drift away from their originally declared style of investing. These seriously disappointing records would not be at all acceptable if produced by any other industry.”

The performance of hedge funds, private-equity funds and venture capital has trended downwards as institutional investors flocked into those markets, chasing returns. Notable investors including Warren Buffett, Jack Bogle (who as Vanguard’s founder has a vested interest in passive investing), David Swensen, Yale’s longtime chief investment officer, and Charles Ellis have all argued for years that most investors–even institutional investors–should simply diversify their portfolios, pursue passive strategies and keep their investing costs low.

In his most recent letter to investors in Berkshire Hathaway, Buffett wrote:

“When trillions of dollars are managed by Wall Streeters charging high fees, it will usually be the managers who reap outsized profits, not the clients. Both large and small investors should stick with low-cost index funds.”

For more from Buffett about why passive investing makes sense, see my March blogpost, Warren Buffett has some excellent advice for foundations that they probably won’t take. Recently, Freakonomics did an excellent podcast on the topic, titled The Stupidest Thing You Can Do With Your Money.

That said, the debate between active and passive asset managers remains unsettled. While index funds have outperformed actively-managed portfolios over the last decade, Cambridge Associates, a big investment firm that builds customized portfolios for institutional investors and private clients, published a study last spring saying that this past decade is an anomaly. Cambridge Associates found that since 1990, fully diversified (i.e., actively managed) portfolios have underperformed a simple 70/30 stock/bond portfolio in only two periods: 1995–99 and 2009–2016. To no one’s surprise, Cambridge says: “We continue to find investments in private equity and hedge funds that we believe have an ability to add value to portfolios over the long term.” Portfolio managers are also sure to argue that their expertise and connections enable them to beat market indices.

But where is the evidence? To the best of my knowledge, seven of the U.S.’s 10 biggest foundations decline to disclose their investment returns. I emailed or called the Getty, MacArthur and Fletcher Jones foundations to ask about their investments in EnerVest and was told that they do not discuss individual investments; they declined further comment.

To its credit, MacArthur does disclose the investment performance of its $6.3 billion endowment. On the other hand, MacArthur has an extensive grantmaking program supporting “conservation and sustainable development.” Why is it financing oil and gas assets?

Ultimately, foundation boards are responsible for overseeing the investment of their endowments. Why don’t they do a better job of it? Maybe it’s because many foundation trustees — particularly those who oversee the investment committees — come out of Wall Street, private equity funds, hedge funds and venture capital. They are the so-called experts, and they have built successful careers by managing other people’s money. It’s not easy for the other board members, who may be academics, activists, lawyers or politicians, to question their expertise. But that’s what they need to do.

And, at the very least, foundations ought to be open about how their endowments are performing so those who manage their billions of dollars can be held accountable.

--Marc Gunther

How Improved Evaluation Sharing Has the Potential to Strengthen a Foundation’s Work
July 27, 2017

Jennifer Glickman is manager, research team, at the Center for Effective Philanthropy. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Philanthropy is a complex, demanding field, and many foundations are limited in the amount of resources they can dedicate to obtaining and sharing knowledge about their practices. This makes it necessary to consider, then, in which areas foundations should focus their learning and sharing efforts to be #OpenForGood.

Last year, the Center for Effective Philanthropy (CEP) released two research reports exploring this question. The first, Sharing What Matters: Foundation Transparency, looks at foundation CEOs’ perspectives on what it means to be transparent, who the primary audiences are for foundations’ transparency efforts, and what is most important for foundations to share.

The second report, Benchmarking Foundation Evaluation Practices, presents benchmarking data collected from senior foundation staff with evaluation responsibilities on topics such as evaluation staffing and structures, investment in evaluation work, and the usefulness of evaluation information. Together, these reports provide meaningful insights into how foundations can learn and share knowledge most effectively.

CEP’s research found that there are specific topics about which foundation CEOs believe being transparent could potentially increase their foundation’s ability to be effective. These areas include the foundation’s grantmaking processes, its goals and strategies, how it assesses its performance, and the foundation’s experiences with what has and has not worked in its efforts to achieve its programmatic goals. While foundation CEOs believe their foundations are doing well in sharing information about their grantmaking, goals, and strategies, they say their foundations are much less transparent about the lessons they learn through their work.

CEP Transparency Graphic

For example, nearly 70 percent of the CEOs CEP surveyed say being transparent about their foundation’s experiences with what has worked in its efforts to achieve its programmatic goals could increase effectiveness to a significant extent. In contrast, only 46 percent say their foundations are very or extremely transparent about these experiences. Even fewer, 31 percent, say their foundations are very or extremely transparent about what has not worked in their programmatic efforts, despite 60 percent believing that being transparent about this topic could potentially increase their effectiveness to a significant extent.

And yet, foundations want this information about lessons learned and think it is important. Three-quarters of foundation CEOs say they often seek out opportunities to learn from other foundations’ work, and believe that sharing this knowledge enables others to learn from foundation work more generally.

How is knowledge being shared then? According to our evaluation research, foundations are mostly sharing their programmatic knowledge internally. Over three-quarters of the evaluation staff who responded to our survey say evaluation findings are shared quite a bit or a lot with the foundation’s CEO, and 66 percent say findings are shared quite a bit or a lot with foundation staff. In comparison:

  • Only 28 percent of respondents say evaluation findings are shared quite a bit or a lot with the foundation’s grantees;
  • 17 percent say findings are shared quite a bit or a lot with other foundations; and
  • Only 14 percent say findings are shared quite a bit or a lot with the general public.

CEP Evaluation Survey Graphic

In fact, less than 10 percent of respondents say that disseminating evaluation findings externally is a top priority for their role.

But respondents do not think these numbers are adequate. Nearly three-quarters of respondents say their foundation invests too little in disseminating evaluation findings externally. Moreover, when CEP asked respondents what they hope will have changed for foundations in the collection and/or use of evaluation information in five years, one of the top three changes mentioned was that foundations will be more transparent about their evaluations and share what they are learning externally.

So, if foundation CEOs believe that being transparent about what their foundation is learning could increase its effectiveness, and foundation evaluation staff believe that foundations should be investing more in disseminating findings externally, what is holding foundations back from embracing an #OpenForGood approach?

CEP has a research study underway looking more deeply into what foundations know about what is and isn’t working in their practices and with whom they share that information, and will have new data to enrich the current conversations on transparency and evaluation in early 2018. In the meantime, take a moment to stop and consider what you might #OpenForGood.

--Jennifer Glickman

How to Make Grantee Reports #OpenForGood
July 20, 2017

Mandy Ellerton and Molly Matheson Gruen joined the [Archibald] Bush Foundation in 2011, where they created and now direct the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community challenges, using approaches that are collaborative and inclusive of people who are most directly affected by the problem. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Mandy Ellerton

When we started working at the Bush Foundation in 2011, we encountered a machine we’d never seen before: the Lektriever. It’s a giant machine that moves files around, kind of like a dry cleaner’s clothes rack, and allows you to seriously pack in the paper. It’s how the Bush Foundation, as a responsible grantmaker, had meticulously tracked and stored its files for posterity - in particular, grantee reports - for decades.

In 2013, the Bush Foundation had the privilege of moving to a new office. Mere days before we were to move into the new space, we got a frantic call from the new building’s management. It turned out that the Lektrievers (we actually had multiple giant filing machines!) were too heavy for the floor of the new building, which had to be reinforced with a number of steel plates to sustain their weight.

Molly Matheson Gruen

Even with all this extra engineering, we would still have to say goodbye to one of the machines altogether for the entire system to be structurally sound. We had decades of grantee stories, experiences and learning trapped in a huge machine in the inner sanctum of our office, up on the 25th floor.

The Lektrievers symbolized our opportunity to become more transparent and move beyond simply preserving our records, instead seeing them as relevant learning tools for current audiences. It was time to lighten the load and share this valuable information with the world.

Learning Logs Emerge

We developed our grantee learning log concept in the Community Innovation Programs as one way to increase the Foundation’s transparency. At the heart of it, our learning logs are a very simple concept: they are grantee reports, shared online. But, like many things that appear simple, once you pull on the string of change – the complexity reveals itself.

“Every Community Innovation project is an opportunity for others to learn and the learning logs are a platform to share this learning.”

Before we could save the reports from a life of oblivion in the Lektriever, build out the technology and slap the reports online, we needed to entirely rethink our approach to grantee reporting to create a process that was more mutually beneficial. First, we streamlined our grant accountability measures (assessing whether the grantees did what they said they’d do) by structuring them into a conversation with grantees, rather than as a part of the written reports. We’ve found that conducting these assessments in a conversation takes the pressure off and creates a space where grantees can be more candid, leading to increased trust and a stronger partnership.

Second, our grantee reports now focus on what grantees are learning in their grant-funded project. What’s working? What’s not? What would you do differently if you had it to do all over again? This new process resulted in reports that were more concise and to the point.

Finally, we redesigned our website to create a searchable mechanism for sharing these reports online. This involved linking our grant management system directly with our website so that when a grantee submits a report, we do a quick review and then the report automatically populates our website. We’ve also designed a way for grantees to be able to designate select answers as private when they want to share sensitive information with us, yet not make it entirely public. We leave it up to grantee discretion, and those selected answers do not appear on the website. Grantees designate their answers to be private for a number of reasons, most often because they discuss sensitive situations having to do with specific people or partners – like when someone drops out of the project or when a disagreement with a partner holds up progress. And while we’ve been pleased at the candor of most of our grantees, some are still understandably reluctant to be publicly candid about failures or mistakes.

But why does this new approach to grantee reporting matter, besides making sure the floor doesn’t collapse beneath our Lektrievers?

The Lektriever is a giant machine that moves files around, kind of like a dry cleaner’s clothes rack. The Bush Foundation had meticulously tracked and stored its files for posterity - in particular, grantee reports - for decades. Credit: Bush Foundation

Learning Sees the Light of Day

Learning logs help bring grantee learning into the light of day, instead of hiding in the Lektrievers, so that more people can learn about what it really takes to solve problems. Our Community Innovation programs at the Bush Foundation fund and reward the process of innovation–the process of solving problems. Our grantees are addressing wildly different issues: from water quality to historical trauma, from economic development to prison reform. But, when you talk to our grantees, you see that they actually have a lot in common and a lot to learn from one another about effective problem-solving. And beyond our grantee pool, there are countless other organizations that want to engage their communities and work collaboratively to solve problems. Every Community Innovation project is an opportunity for others to learn and the learning logs are a platform to share this learning, making it #OpenForGood.

We also want to honor our grantees’ time. Grantees spend a lot of time preparing grant reports for funders. And, in a best case scenario, a program officer reads the report and sends the grantee a response of some kind before the report is filed away. But, let’s be honest – sometimes even that doesn’t happen. The report process can be a burden on nonprofits and the only party to benefit is the funder. We hope that the learning logs help affirm to our grantees that they’re part of something bigger than themselves - that what they share matters to others who are doing similar work.

We also hear from our grantees that the reports provide a helpful, reflective process, especially when they fill it out together with collaborating partners. One grantee even said she’d like to fill out the report more often than we require to have regular reflection moments with her team!

Learning from the Learning Logs

We only launched the learning logs last year, but we’ve already received some positive feedback. We’ve heard from both funded and non-funded organizations that the learning logs provide inspiration and practical advice for pursuing similar projects. A grantee recently shared a current challenge in their work that directly connected to work we knew another grantee had done and written about in their learning log. Because this knowledge was now out in the open, we were able to point them to that learning log – extending one grantee’s impact beyond their local community while helping advance another grantee’s work.

Take, for example, the following quotes from our grantee reports:

  • The Minnesota Brain Injury Alliance's project worked on finding ways to better serve homeless people with brain injuries. They reflected that, "Taking the opportunity for reflection at various points in the process was very important in working toward innovation. Without reflection, we might not have been open to revising our plan and implementing new possibilities."
  • GROW South Dakota addressed a number of challenges facing rural South Dakota communities. They shared that, “Getting to conversations that matter requires careful preparation in terms of finding good questions and setting good ground rules for how the conversations will take place—making sure all voices are heard, and that people are listening for understanding and not involved in a debate.”
  • The People's Press Project engaged communities of color and disenfranchised communities to create a non-commercial, community-owned, low-powered radio station serving the Fargo-Moorhead area of North Dakota. They learned “quickly that simply inviting community members to a meeting or a training was not a type of outreach that was effective.”

Like many foundations, we decline far more applications than we fund, and our limited funding can only help communities tackle so many problems. Our learning logs are one way to try to squeeze more impact out of those direct investments. By reading grantee learning logs, we hope more people will be inspired to effectively solve problems in their communities.

We’re not planning to get rid of the Lektrievers anytime soon – they’re pretty retro cool and efficient. They contain important historical records and are incredibly useful for other kinds of record keeping, beyond grantee documentation. Plus, the floor hasn’t fallen in yet. But, as Bush Foundation Communications Director Dominick Washington put it, now we’re unleashing the knowledge, “getting it out of those cabinets, and to people who can use it.”

--Mandy Ellerton and Molly Matheson Gruen

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a guest contributor, contact:
    glasspockets@foundationcenter.org
