Transparency Talk

Category: "Reporting" (41 posts)

Data Fix: Do's and Don'ts for Data Mapping & More!
October 3, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This post is part of a series intended to improve the data available for and about philanthropy.

As many of you know, Foundation Center was established to provide transparency for the field of philanthropy. A key part of this mission is collecting, indexing, and aggregating millions of grants each year. In recent years this laborious process has become more streamlined thanks to technology, auto-coding, and those of you who directly report your grants data to us. Your participation also increases the timeliness and accuracy of the data.

Today, over 1,300 funders worldwide share grants data directly with Foundation Center. Over the 20 years we've been collecting this data, we've encountered recurring issues with the basic required fields. To make sharing data even quicker and easier, we've put together some do's and don'ts focusing on three areas that may seem straightforward but often cause confusion.

Location Data for Accurate Mapping and Matching

Quite simply, to map your grants data we need location information! And we need location information for more than mapping. We also use this information to ensure we are matching data to the correct organizations in our database. To help us do this even more accurately, we encourage you to provide as much location data as possible. This also helps you by increasing the usability of your own data when running your own analyses or data visualizations.

DO
  • Do supply Recipient City for U.S. and non-U.S. Recipients.
  • Do supply Recipient State for U.S. Recipients.
  • Do supply Recipient Country for non-U.S. Recipients.

DON'T
  • Don't forget to supply Recipient Address and Recipient Postal Code, if possible.
  • Don't supply a post office box in place of a street address for Recipient Address, if possible.
  • Don't confuse Recipient location (where the check was sent) with Geographic Area Served (where the service will be provided).
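For funders preparing data exports in-house, checks like these can be automated before sending a file. Below is a minimal sketch in Python; the field names (`recipient_city`, `recipient_state`, and so on) are hypothetical and would need to be mapped to the columns in your own grants management export:

```python
def check_recipient_location(grant):
    """Return a list of issues with a grant's recipient location fields.

    Field names here are illustrative, not a required schema.
    A missing recipient_country is treated as a U.S. recipient.
    """
    issues = []
    if not grant.get("recipient_city"):
        issues.append("missing Recipient City")
    country = grant.get("recipient_country", "United States")
    if country == "United States" and not grant.get("recipient_state"):
        issues.append("missing Recipient State for U.S. recipient")
    # Street address and postal code are encouraged but optional.
    for field in ("recipient_address", "recipient_postal_code"):
        if not grant.get(field):
            issues.append(f"consider adding {field}")
    return issues

# A U.S. grant that supplies only the city:
print(check_recipient_location({"recipient_city": "Cleveland"}))
```

Running a pass like this over a full grants list before submission catches the most common omissions without any manual review.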

What's Your Type? Authorized or Paid?

Two types of grant amounts can be reported: Authorized amounts (new grants authorized in a given fiscal year, including the full amount of grants that may be paid over multiple years) or Paid amounts (grants as they would appear on your IRS tax form). You can report either type of amount – we just need to know which one you are using: Authorized or Paid.

DO
  • Do indicate if you are reporting on Authorized or Paid amounts.
  • Do remain consistent from year to year with sending either Authorized amounts or Paid amounts to prevent duplication of grants.
  • Do report the type of Currency of the amount listed, if not US Dollars.

DON'T
  • Don't send more than one column of Amounts in your report – either Authorized or Paid for the entire list.
  • Don't forget to include Grant Duration (in months) or Grant Start Date and Grant End Date, if possible.
  • Don't include more than one amount per grant.

The Essential Fiscal Year

An accurate Fiscal Year is essential since we publish grants data by fiscal year in our data-driven tools and content-rich platforms such as those developed by Foundation Landscapes, including Funding the Ocean, SDG Funders, Equal Footing and Youth Giving. Fiscal Year can be reported as a year (2018) or a date range (07/01/2017-06/30/2018), but both formats will appear in published products as YEAR AWARDED: 2018.

DO
  • Do include the Fiscal Year in which the grants were either Authorized or Paid by you, the funder.
  • Do format your Fiscal Year as a year (2018) or a date range (07/01/2017-06/30/2018).

DON'T
  • Don't provide the Fiscal Year of the Recipient organization.
  • Don't forget: for off-calendar fiscal years, the last year of the date range is the Fiscal Year: 07/01/2017-06/30/2018 = 2018.
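The fiscal-year convention is easy to apply programmatically. Here is a small sketch, assuming a hypothetical `year_awarded` helper and the two accepted formats (a bare year, or an MM/DD/YYYY date range whose end year is the fiscal year):

```python
import re

def year_awarded(fiscal_year):
    """Derive the published YEAR AWARDED from a reported Fiscal Year.

    Accepts a bare year ("2018") or a date range
    ("07/01/2017-06/30/2018"); for a range, the year of the
    end date is the fiscal year.
    """
    fiscal_year = fiscal_year.strip()
    if re.fullmatch(r"\d{4}", fiscal_year):
        return int(fiscal_year)
    match = re.fullmatch(r"\d{2}/\d{2}/\d{4}-\d{2}/\d{2}/(\d{4})", fiscal_year)
    if match:
        return int(match.group(1))
    raise ValueError(f"unrecognized Fiscal Year format: {fiscal_year!r}")

print(year_awarded("2018"))                   # 2018
print(year_awarded("07/01/2017-06/30/2018"))  # 2018
```

Either input format resolves to the same published value, which is why consistency from year to year matters more than which format you choose.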

More Tips to Come!

I hope you have a better understanding of these three areas of data to be shared through Foundation Center eReporting. Moving forward, we'll explore the required fields of Recipient Name and Grant Description, as well as high priority fields such as Geographic Area Served. If you have any questions, please feel free to contact me. Thank you! And don't forget, the data you share IS making a difference!

-- Kati Neiheisel

“Because It’s Hard” Is Not an Excuse – Challenges in Collecting and Using Demographic Data for Grantmaking
August 30, 2018

Melissa Sines is the Effective Practices Program Manager at PEAK Grantmaking. In this role, she works with internal teams, external consultants, volunteer advisory groups, and partner organizations to articulate and highlight the best ways to make grants – Effective Practices. A version of this post also appears in the PEAK Grantmaking blog.

For philanthropy to advance equity in all communities, especially low-income communities and communities of color, it needs to be able to understand the demographics of the organizations being funded (and declined), the people being served, and the communities impacted. That data should be used to assess practices and drive decision making.

PEAK Grantmaking is working to better understand and build the capacity of grantmakers for collecting and utilizing demographic data as part of their grantmaking. Our work is focused on answering four key questions:

  • What demographic data are grantmakers collecting and why?
  • How are they collecting these demographic data?
  • How is demographic data being used and interpreted?
  • How can funders use demographic data to inform their work?

In the process of undertaking this research, we surfaced a lot of myths and challenges around this topic that prevent our field from reaching the goal of being accountable to our communities and collecting this data for responsible and effective use.

Generally, about half of all grantmakers are collecting demographic data either about the communities they are serving or about the leaders of the nonprofits they have supported. For those who reported that they found the collection and use of this data to be challenging, our researcher dug a little deeper and asked about the challenges they were seeing.

Some of the challenges that were brought to the forefront by our research were:

Challenge 1: Fidelity and Accuracy in Self-Reported Data
Data, and self-reported data in particular, will always be limited in its ability to tell the entire story and to achieve the nuance necessary for understanding. Many nonprofits, especially small grassroots organizations, lack the capability or capacity to collect and track data about their communities. In addition, white-led nonprofits may fear that lack of diversity at the board or senior staff level may be judged harshly by grantmakers.

Challenge 2: Broad Variations in Taxonomy
Detailed and flexible identity data can give a more complete picture of the community, but this flexibility works against data standardization. Varying taxonomies, across sectors or organizations, can make it difficult to compare and contrast data. It can also be a real burden if the nonprofit applying for a grant does not collect demographic data in the categories that a grantmaker is using. This can lead to confusion about how to report this data to a funder.

Challenge 3: Varying Data Needs Across Programs
Even inside a single organization, different programs may be collecting and tracking different data, as program officers respond to needs in their community and directives from senior leadership. Different strategies or approaches to a problem demand different data. For instance, an arts advocacy program may be more concerned with constituent demographics and impact, while an artist’s program will want to know about demographics of individual artists.

Challenge 4: Aggregating Data for Coalitions and Collaborations
This becomes even more complex when coalitions and collaborative efforts bring together numerous organizations, or programs inside different organizations, to accomplish a single task. The aforementioned challenges are compounded as more organizations, with different databases and varying taxonomies, try to aggregate consistent demographic data to track impact on specific populations.

These are all very real challenges, but they are not insurmountable. Philanthropy, if it puts itself to the task, can tackle these challenges.

Some suggestions from our report to get the field started include:

  • Don’t let the perfect be the enemy of the good. Pilot systems for data collection, then revisit them to ensure that they are working correctly, meeting the need for good data, and serving the ultimate goal of tracking impact.
  • Fund the capacity of nonprofits to collect good data and to engage in their own diversity, equity, and inclusion efforts.
  • Engage in a conversation – internally and externally – about how this data will be collected and how it will be used. If foundation staff and the nonprofits they work with understand the need for this data, they will more willingly seek and provide this information.
  • For coalitions and collaborative efforts, it may make sense to fund a backbone organization that takes on this task (among other administrative or evaluation efforts) in support of the collective effort.
  • Work with your funding peers – in an issue area or in a community – to approach this challenge in a way that will decrease the burden on nonprofits and utilize experts that may exist at larger grantmaking operations.
  • Support field-wide data aggregators, like GuideStar or the Foundation Center, and work alongside them as they try to collect and disseminate demographic data about the staff and boards at nonprofits and the demographics of communities that are being supported by grantmaking funds.

Grantmakers have the resources and the expertise to begin solving this issue and to share their learning with the entire field. To read more about how grantmakers are collecting and using demographic data, download the full report.

--Melissa Sines

Knowledge Sharing to Strengthen Grantmaking
April 26, 2018

Clare Nolan, MPP, co-founder of Engage R+D, is a nationally recognized evaluation and strategy consultant for the foundation, nonprofit and public sectors. Her expertise helps foundations to document and learn from their investments in systems and policy change, networks, scaling, and innovation. This post also appears on the Grantmakers for Effective Organizations’ (GEO) Perspectives blog.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Knowledge has the power to spark change, but only if it is shared. Many grantmakers instinctively like the idea of sharing the knowledge they generate with others. But in the face of competing priorities, a stronger case must be made for foundations to devote time and resources to sharing knowledge. The truth is that when foundations share knowledge generated through evaluation, strategy development and thought leadership, they benefit not only others but also themselves. Sharing knowledge can deepen internal reflection and learning, lead to new connections and ideas, and promote institutional credibility and influence.

Foundations can strengthen their knowledge sharing practices by enhancing organizational capacity and culture, and by understanding how to overcome common hurdles to sharing knowledge. The forthcoming GrantCraft guide Open for Good: Knowledge Sharing to Strengthen Grantmaking provides tips and resources for how foundations can do just that. My organization, Engage R+D, partnered with Foundation Center to produce this guide as part of #OpenForGood, a call to action for foundations to openly share their knowledge.

To produce the guide, we conducted interviews with the staff of foundations, varying by origin, content focus, size, and geography. The participants shared their insights about the benefits of sharing knowledge not only for others, but also for their own organizations. They also described strategies they use for sharing knowledge, which we then converted into concrete and actionable tips for grantmakers. Some of the tips and resources available in the guide include:

  • A quiz to determine what type of knowledge sharer you are. Based upon responses to questions about your organization’s capacity and culture, you can determine where you fall within a quadrant of knowledge sharing (see visual). The guide offers tips for how to integrate knowledge sharing into your practice in ways that would be a good fit for you and your organization.
  • Nuts and bolts guidance on how to go about sharing knowledge. To take the mystery out of the knowledge sharing process, the guide breaks down the different elements that are needed to actually put knowledge sharing into practice. It provides answers to common questions grantmakers have on this topic, such as: What kinds of knowledge should I be sharing exactly? Where can I disseminate this knowledge? Who at my foundation should be responsible for doing the sharing?
  • Ideas on how to evolve your foundation’s knowledge-sharing practice. Even foundation staff engaged in sophisticated knowledge-sharing practices noted the importance of evolving their practice to meet the demands of a rapidly changing external context. The guide includes tips on how foundations can adapt their practice in this way. For example, it offers guidance on how to optimize the use of technology for knowledge sharing, while still finding ways to engage audiences with less technological capacity.

The tips and resources in the guide are interspersed with quotes, audio clips, and case examples from the foundation staff members we interviewed. These interviews provide voices from the field sharing tangible examples of how to put the strategies in the guide into practice.

Want to know how your foundation measures up when it comes to knowledge sharing? We are pleased to provide readers of this blog with an advance copy of Chapter 2 from the forthcoming Guide, which includes the quiz referenced above. Want to learn more? Sign up for the Foundation Center’s GrantCraft newsletter and receive a copy of the Guide upon its release. And, for those who are attending the GEO conference next week in San Francisco, visit us at our #OpenForGood pop-up quiz station where you can learn more about what kind of knowledge sharer you are.

--Clare Nolan

Increasing Attention to Transparency: The MacArthur Foundation Is #OpenForGood
April 17, 2018

Chantell Johnson is managing director of evaluation at the John D. and Catherine T. MacArthur Foundation. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

At MacArthur, the desire to be transparent is not new. We believe philanthropy has a responsibility to be explicit about its values, choices, and decisions with regard to its use of resources. Toward that end, we have long had an information sharing policy that guides what and when we share information about the work of the Foundation or our grantees. Over time, we have continued to challenge ourselves to do better and to share more. The latest refinement of our approach to transparency is an effort toward increasingly sharing more knowledge about what we are learning. We expect to continue to push ourselves in this regard, and participating in Foundation Center’s Glasspockets and #OpenForGood movements are just a couple of examples of how this has manifested.

In recent years, we have made a more concerted effort to revisit and strengthen our information sharing policy by:

  • Expanding our thinking about what we can and should be transparent about (e.g., our principles of transparency guided our public communications around our 100&Change competition, which included an ongoing blog);
  • Making our guidance more contemporary by moving beyond statements about information sharing to publishing more and different kinds of information (e.g., Grantee Perception Reports and evaluation findings);
  • Making our practices related to transparency more explicit; and
  • Ensuring that our evaluation work is front and center in our efforts related to transparency.

Among the steps we have taken to increase our transparency are the following:

Sharing more information about our strategy development process.
The Foundation's website has a page dedicated to How We Work, which provides detailed information about our approach to strategy development. We share an inside look into the lifecycle of our programmatic efforts, beginning with conceptualizing a grantmaking strategy through the implementation and ending phases, under an approach we refer to as Design/Build. Design/Build recognizes that social problems and conditions are not static, and thus our response to these problems needs to be iterative and evolve with the context to be most impactful. Moreover, we aim to be transparent as we design and build strategies over time. 

“We have continued to challenge ourselves to do better and to share more.”

Using evaluation to document what we are measuring and learning about our work.
Core to Design/Build is evaluation. Evaluation has become an increasingly important priority among our program staff. It serves as a tool to document what we are doing, how well we are doing it, how work is progressing, what is being achieved, and who benefits. We value evaluation not only for the critical information it provides to our Board, leadership, and program teams, but for the insights it can provide for grantees, partners, and beneficiaries in the fields in which we aim to make a difference. Moreover, it provides the critical content that we believe is at the heart of many philanthropic efforts related to transparency.

Expanding the delivery mechanisms for sharing our work.
While our final evaluation reports have generally been made public on our website, we aim to make more of our evaluation activities and products available (e.g., landscape reviews and baseline and interim reports). Further, in an effort to make our evaluation work more accessible, we are among the first foundations to make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

Further evidence of the Foundation's commitment to increased transparency includes continuing to improve our “Glass Pockets” by sharing:

  • Our searchable database of grants, including award amount, program, year, and purpose;
  • Funding statistics including total grants, impact investments, final budgeted amounts by program, and administrative expenses (all updated annually);
  • Perspectives of our program directors and staff;
  • Links to grantee products including grant-supported research studies consistent with the Foundation's intellectual property policies;
  • Stories highlighting the work and impact of our grantees and recipients of impact investments; and
  • Center for Effective Philanthropy Grantee Perception report results.

Going forward, we will look for additional ways to be transparent. And, we will challenge ourselves to make findings and learnings more accessible even more quickly.

--Chantell Johnson 

Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts
April 12, 2018

Laia Griñó is director of data discovery at Foundation Center. This post also appears in the Human Rights Funders Network's blog.

Over the last few months, this blog has presented insights gained from the Advancing Human Rights initiative’s five-year trend analysis. Getting to these insights would not have been possible had a growing number of funders not decided to consistently share more detailed data about their grantmaking, such as through Foundation Center’s eReporting program. In a field where data can pose real risks, some might feel that this openness is ill-advised. Yet transparency and data protection need not be at odds. By operating from a framework of responsible data, funders can simultaneously protect the privacy and security of grantees and contribute to making the human rights field more transparent, accountable, and effective.

This topic – balancing transparency and data protection – was the focus of a session facilitated by Foundation Center at the PEAK Grantmaking annual conference last month. Our goal was not to debate the merits of one principle over the other, but to help provide a framework that funders can use in determining how to share grants data, even in challenging circumstances. What follows are some of the ideas and tips discussed at that session (a caveat here: these tips focus on data shared voluntarily by funders on their website, with external partners like Foundation Center, etc.; we recognize that funders may also face legal reporting requirements that could raise additional issues).

HRFN Graphic

  • Think of transparency as a spectrum: Conversations regarding data sharing often seem to end up at extremes: we must share everything or we can’t share anything. Instead, funders should identify what level of transparency makes sense for them by asking themselves two questions: (1) What portion of our grants portfolio contains sensitive data that could put grantees at risk if shared? and (2) For the portion of grants deemed sensitive, which grant details – if any – are possible to share? Based on our experience with Advancing Human Rights, in most cases funders will find that it is possible to share some, if not most, of their grants information.
  • Assess the risks of sharing data: Answering these questions requires careful consideration of the consequences if information about certain grants is made public, particularly for grantees’ security. As noted at the PEAK session, in assessing risks funders should not only consider possible negative actions by government actors, but also by actors like militant groups or even a grantee’s community or family. It is also important to recognize that risks can change over time, which is why it is so critical that funders understand what will happen with the data they share; if circumstances change, they need to know who should be notified so that newly sensitive data can be removed.
  • Get grantees’ input: Minimizing harm to grantees is of utmost importance to funders. And yet grantees usually have little or no input on decisions about what information is shared about them. Some funders do explicitly ask for grantees’ consent to share information, sometimes at multiple points along the grant process. This could take the form of an opt-in box included as part of the grant agreement process, for example. At a minimum, grantees should understand where and how data about the grant will be used.
  • Calibrate what is shared based on the level of risk: Depending on the outcomes of their risk assessment (and grantees’ input), a funder may determine that it’s inadvisable to share any details about certain grants. In these cases, funders may opt not to include those grants in their reporting at all, or to only report on them at an aggregate level (e.g., $2 million in grants to region or country X). In situations where it is possible to acknowledge a grant, funders can take steps to protect a grantee, such as: anonymizing the name of the grantee; providing limited information on the grantee’s location (e.g., country only); and/or redacting or eliminating a grant description (note: from our experience processing data, it is easy to overlook sensitive information in grant descriptions!).
  • Build data protection into grants management systems: Technology has an important role to play in making data protection systematic and, importantly, manageable. For example, some funders have “flags” to indicate which grants can be shared publicly or, conversely, which are sensitive. In one example shared at PEAK, a grants management system has been set up so that if a grant has been marked as sensitive, the grantee’s name will automatically appear as “Confidential” in any reports generated. These steps can minimize the risk of data being shared due to human error.
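A sensitivity flag of this kind can be sketched in a few lines. The field names below are hypothetical rather than those of any particular grants management system; the point is the pattern of masking identifying details while preserving the fields needed for aggregate reporting:

```python
def reportable_view(grant):
    """Return the shareable version of a grant record.

    Field names are hypothetical. A grant flagged as sensitive keeps
    its amount (so aggregate totals still work) but has identifying
    details masked, mirroring the "Confidential" pattern described above.
    """
    if not grant.get("sensitive"):
        return dict(grant)
    masked = dict(grant)
    masked["recipient_name"] = "Confidential"
    masked["recipient_city"] = None   # keep country-level location only
    masked["description"] = None      # descriptions often leak sensitive detail
    return masked

safe = reportable_view({
    "recipient_name": "Local Advocacy Group",
    "recipient_city": "Example City",
    "recipient_country": "X",
    "description": "Support for at-risk activists",
    "amount": 50000,
    "sensitive": True,
})
print(safe["recipient_name"], safe["amount"])
```

Because the masking happens in one place at report-generation time, it does not depend on staff remembering to redact each record by hand, which is exactly the human-error risk the flag is meant to remove.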

Transparency is at the core of Foundation Center’s mission. We believe deeply that transparency can not only help build public trust but also advance more inclusive and effective philanthropy. For that reason, we are committed to being responsible stewards of the data that is shared with us (see the security plan for Advancing Human Rights, for example). A single conference session or blog post cannot do justice to such a complex and long-debated topic. We are therefore thankful that our colleagues at Ariadne, 360Giving, and The Engine Room have just started a project to provide funders with greater guidance around this issue (learn more in these two thoughtful blog posts from The Engine Room, here and here). We look forward to seeing and acting on their findings!

--Laia Griñó

From Dark Ages to Enlightenment: A Magical Tale of Mapping Human Rights Grantmaking
April 4, 2018

Mona Chun is Executive Director of Human Rights Funders Network, a global network of grantmakers committed to effective human rights philanthropy.

Once upon a time, back in the old days of 2010, human rights funders were sitting alone in their castles, with no knowledge of what their peers in other towers and castles were doing – just the certainty that their issue area, above all others, was underfunded. Each castle also spoke its own language, making it difficult for castle communities to learn from one another. This lack of transparency and shared language about common work and goals meant everyone was working in the dark.

Then a gender-neutral knight, clad in human rights armor (ethically produced of course), arrived in the form of our Advancing Human Rights research. With this research in hand, funders can now:

  • Peer out from their towers across the beautiful funding landscape;
  • Use a telescope to look at what their peers are doing, from overall funding trends to grants-level detail;
  • Use a common language to compare notes on funding priorities and approaches;
  • Find peers with whom to collaborate and new grantee partners to support; and
  • Refine and strengthen their funding strategies.

Armed with this knowledge, human rights funders can leave their towers and visit others, even government towers, to advocate and leverage additional resources in their area of interest.

Advancing Human Rights Map

Mapping Uncharted Territory

The Advancing Human Rights initiative, a partnership between Human Rights Funders Network (HRFN) and Foundation Center, has mapped more than $12 billion in human rights funding from foundations since 2010. Because of the great potential such data has to inform and improve our collective work, many years of work went into this. Ten years ago, HRFN recognized that in order to help human rights funders become more effective in their work, we needed to get a better understanding of where the money was going, what was being funded and how much was being spent. After our initial planning, we partnered with Foundation Center, brought in Ariadne and Prospera as funder network collaborators, formed a global Advisory Committee and hashed out the taxonomy to develop a shared language. Then, we began the process of wrangling funders to share their detailed grantmaking data.

It was no easy feat, but we published the first benchmark report on human rights grantmaking for 2010, and since then, we have worked to improve the research scope and process and trained funders to use the tools we’ve developed. In January, we released our first ever trends analysis. Over the five years of data collection featured on the Advancing Human Rights research hub, we’ve compiled almost 100,000 human rights grants from funders in 114 countries.

Adopting A Can-Do Attitude

In 2010, major funders in our network didn’t believe this could be done.

First, could we get the grantmaking data from members? For the first few years, we campaigned hard to get members to share their detailed grants information. We created a musical “Map It” parody (set to the tune of Devo’s “Whip It”) and launched a Rosie the Riveter campaign (“You Can Do It: Submit Your Data!”). We deployed pocket-size fold-outs and enormous posters thanking foundations for their participation. Several years later, we have seen our gimmicks bear fruit: 780 funders contributed data in our most recent year. When we began, no human rights data was being gathered from funders outside North America. In our first year, we incorporated data from 49 foundations outside North America and in the most recent year, that number more than doubled to 109. The value of participation is now clear. Repeated nudging is still necessary, but not gimmicks.

Rosie Collage
The Human Rights Funder Network celebrates its Rosie the Riveter “You Can Do It: Submit Your Data!” campaign. Photo Credit: Human Rights Funders Network

Data Makes A Difference

Once we had the research, could we get busy funders to use the data? With all the hard work being done in the field and so much to learn from it, we were committed to creating research that would be used. Focusing as much energy on sharing the research as we had compiling it, we aimed to minimize unused reports sitting on shelves. Global tours, presentations, workshops and tutorials have resulted in funders sharing story after story of how they are putting the findings to use:

  • Funders sift through the data to inform their strategic plans and understand where they sit vis-à-vis their peers;
  • They use the tools to break out of their silos and build collaborative initiatives;
  • They use the research to advocate to their boards, their governments, and their constituencies; and
  • They enter new areas of work or geographies knowing the existing landscape of organizations on the ground, search for donors doing complementary work, and discover the issues most and least funded.

Overall, their decisions can be informed by funding data that did not exist before, beyond the wishful daydreams of funders in their towers.

I wish I could say that we’ll live happily ever after with this data. But the pursuit of human rights is a long-term struggle. Those committed to social change know that progress is often accompanied by backlash. As we face the current challenging times together, sometimes we just need to recognize how far we’ve come and how much more we know, holding on to the magic of possibility (and the occasional fairy tale) to inspire us for the still long and winding, but newly illuminated, road ahead.

--Mona Chun

In the Know: #OpenForGood Staff Pick December 2017
December 20, 2017

Gabriela Fitz is director of knowledge management initiatives at Foundation Center.

This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As the #OpenForGood campaign builds steam, and we continue to add to our IssueLab Results repository of more than 400 documents containing lessons learned and evaluative data, our team will regularly shine the spotlight on new and noteworthy examples of the knowledge that is available to help us work smarter, together. This current pick comes to us from the Conrad N. Hilton Foundation. Read last month's staff pick here.


Staff Pick: Conrad N. Hilton Foundation

Evaluation of the Conrad N. Hilton Foundation Chronic Homelessness Initiative: 2016 Evaluation Report, Phase I

Download the Report

Quick Summary

In 2011, the Conrad N. Hilton Foundation partnered with Abt Associates Inc. to conduct an evaluation of the Hilton Foundation’s Chronic Homelessness Initiative, with the goal of answering an overarching question: Is the Chronic Homelessness Initiative an effective strategy to end and prevent chronic homelessness in Los Angeles County?

Answering that question has not been easy. And it bears mentioning that this is not one of those reports that strives to prove a certain model is working; instead, it provides a suitably complicated picture of an issue that will remain an ongoing, multi-agency struggle. A combination of economic conditions, an insufficient and shrinking supply of affordable housing, and an unmet need for mental health and supportive services actually resulted in an increase in the number of people experiencing homelessness in Los Angeles County during the period under study. The numbers even suggest that Los Angeles was further from ending chronic homelessness than ever before. But the story is a bit more complicated than that.

In this final evaluation report on the community’s progress over five years (January 2011 through December 2015), Abt Associates Inc. found that the collaborative system developed during the first phase of the initiative represented a turning point in the County’s capacity to address chronic homelessness, a capacity needed more than ever by the end of 2015.

Field of Practice

  • Housing and Homelessness

What kinds of knowledge does this report open up?

This report goes beyond evaluating a single effort or initiative to look at the larger collaborative system of funding bodies and stakeholders involved in solving a problem like chronic homelessness. We often hear that no foundation can solve problems single-handedly, so it’s refreshing to see a report framework that takes this reality into account by not just attempting to isolate the foundation-funded part of the work. The initiative’s strategy focused on a systemic approach that included goals, such as the leveraging of public funds, demonstrated action by elected and public officials, and increased capacity among developers and providers to provide permanent and supporting housing effectively, alongside the actual construction of thousands of housing units. By adopting this same systemic lens, the evaluation itself provides valuable insight into not just the issue of chronic homelessness in Los Angeles County, but also into how we might think about and evaluate programs and initiatives that are similarly collaborative or interdependent by design.

What makes it stand out?

This report is notable for two reasons. First is the evaluators’ willingness and ability to genuinely grapple with the discouraging fact that homelessness had gone up during the time of the initiative, as well as the foundation’s willingness to share this knowledge by publishing and sharing it. All too often, reports that don’t cast foundation strategies in the best possible light don’t see the light of day at all. Sadly, it is that kind of “sweeping under the rug” of knowledge that keeps us all in the dark. The second notable thing about this report is its design. The combination of a summary “dashboard” with easily digestible infographics about both the process of the evaluation and its findings, and a clear summary analysis for each strategic goal, makes this evaluation stand out from the crowd.

Key Quote

“From our vantage point, the Foundation’s investment in Systems Change was its most important contribution to the community’s effort to end chronic homelessness during Phase I of the Initiative. But that does not mean the Foundation’s investments in programs and knowledge dissemination did not make significant contributions. We believe it is the interplay of the three that yielded the greatest dividend.”

--Gabriela Fitz

No Pain, No Gain: The Reality of Improving Grant Descriptions
November 8, 2017

Gretchen Schackel is Grants Manager of the James F. and Marion L. Miller Foundation in Portland, Oregon.

This post is part of a Transparency Talk series, presented in partnership with the Conrad N. Hilton Foundation, examining the importance of the 990-PF, the informational tax form that foundations must annually file. The series explores the implications of the open 990; how journalists and researchers use the 990-PF to understand philanthropy; and its role, limitations, and potential as a communications tool.

Join us at a session about the Open 990-PF in partnership with Southern California Grantmakers. Learn more or register here.                                   

You know those blog posts that describe adopting a best practice? The ones that make it sound so easy and tempting that you try it, only to be let down because you discover that either you are doing something terribly wrong, or it is a lot harder than the author made it sound because they left out all of the pain points? Well, don’t worry—this is not one of those posts! In fact, I will start off with the pain points so you can go in eyes wide open if, like me, you end up on a quest to improve your foundation’s grant descriptions.

This post is a sequel to another Transparency Talk article that recently featured our foundation’s executive director, detailing lessons learned about why improving grants data is important to the foundation, as well as to the sector as a whole. That article ended with a brief snapshot of some “before and after” grant descriptions, showing how we are working to improve the way we tell the story of each grant, so I’m picking up here where that left off to share an honest, behind-the-scenes look at what it took to get from the before to the after.

“Capturing critical details when writing accurate and complete grant descriptions aids your efforts on the 990-PF form.”

Pain Relievers

As the grants manager, it’s my job to put the right processes in place so we can capture critical details when writing grant descriptions to ensure that they are accurate, complete, and, well, actually descriptive (AKA “Purpose of grant or contribution” on Form 990-PF). This fall marks my 11-year anniversary at the Miller Foundation, and one thing that has remained constant throughout my tenure is what a pain writing good grant descriptions can be if you don’t know where to begin. So, I’m sharing my playbook below, because the communities we are serving, and how we are serving them, deserve to be described and celebrated. I’ve learned some tips and work-arounds along the way that I’ll share as I inventory the various obstacles you might encounter.

Pain Point #1:

Lean Staffing. We are a staff of four: Executive Director, Program Officer, Grants Manager, and Administrative Assistant. We don’t publish an annual report, we have only just started using social media, and we have just completed a website redesign. This makes all of us part-time communications staff. I wouldn’t describe this as a best practice, but it’s the reality at many foundations.

Pain Reliever #1:

Grant Descriptions Can Serve Many Purposes. As mentioned above, the editorial process involved in prepping text for public consumption can be labor intensive, particularly in organizations without a communications department. Grant descriptions, which represent the substance of our work, turn out to be handy for small organizations like ours because they can serve many purposes. They are used for our minutes, our website, our 990-PF, and for our eReport to Foundation Center for its searchable databases. We don’t have time to write different grant descriptions for each specific use. So, we write one grant description that we can use in multiple platforms and situations.

Pain Point #2:

Garbage In – Garbage Out. Data starts with the grantees, and I know from talking to them that they are often not well equipped with the time or technology to collect good data. It’s not just about what questions we are asking, but how we are helping our grantees understand what we need so they can get us the best data possible.

Pain Reliever #2:

You have to work with what you’ve got. And what we have is the information provided by potential grantees in their applications. Most of the information we need can be found in the “Brief summary of the grant request” question on the grant application. Rather than treat this as a test that potential grantees must either pass or fail, we provide detailed instructions about the kind of information we would like to see in the summary as part of our online application process. Taking the guesswork out of the application has improved the quality of the data we receive at the start of the grant. Our arts portfolio also requires that grantees participate in DataArts, a collective database into which grantees enter their data once so that all arts funders can access it. Participating in field-building shortcuts like this is a great way to make the process more efficient for everyone.

Once you have the framework in place to get a good grant summary from your prospective grantees, however, your work is not yet done.  Often, important elements of the funded grant can change during board deliberations, so I find it essential to share the grant summary with our program staff before finalizing to ensure we are capturing the detail accurately.

Pain Point #3:

Lack of an Industry Standard on What Makes the Perfect Grant Description. There are probably as many ways to write a grant description as there are foundations, and reinventing wheels is a waste of our collective time, so I have long wished for a framework we could all agree to follow.

Pain Reliever #3:

The Get on the Map Campaign.

We have learned a lot from Foundation Center’s Get on the Map campaign about the elements of a great grant description. The Get on the Map campaign is a partnership between United Philanthropy Forum and Foundation Center designed to improve philanthropic data, and includes a helpful framework that details the best way to share your data with Foundation Center and the public. What I immediately loved about it is how it reminded me of being that weird kid who loved to diagram sentences in junior high. But perhaps it’s not that strange since I know grants managers enjoy turning chaos into order. So, let's try to use sentence diagramming as a model for writing grant descriptions.

The Anatomy of a Good Grant Description

First, we’ll start with the four elements of a good grant description.

  • WHAT: What is the primary objective of the grant?
  • WHO:  Are there any specifically intended beneficiaries?
  • HOW: What are the primary strategies of the grant?
  • WHERE: Where will the grant money be used, if beyond the recipient’s address?
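For the programmatically inclined, the four elements work like fields in a simple template: lead with the WHAT, then append WHO, HOW, and WHERE wherever they add information. The sketch below is purely illustrative; the function name, field names, and connective phrasing are my own assumptions, not part of any Foundation Center or Get on the Map specification:

```python
# Illustrative sketch only: assemble a grant description from the four
# elements above (WHAT, WHO, HOW, WHERE). Names and phrasing are
# hypothetical, not a published standard.

def grant_description(what, who=None, how=None, where=None):
    """Combine the four elements into a single descriptive sentence."""
    parts = [f"To support {what}"]
    if how:                                  # HOW: primary strategies
        parts.append(f"through {how}")
    if who:                                  # WHO: intended beneficiaries
        parts.append(f"serving {who}")
    if where:                                # WHERE: geographic area served
        parts.append(f"in {where}")
    return " ".join(parts) + "."

desc = grant_description(
    what="the Chicas Youth Development program",
    who="500 Latina girls in grades 3-12",
    where="Washington County",
)
# → "To support the Chicas Youth Development program serving
#    500 Latina girls in grades 3-12 in Washington County."
```

Even without code, the same mental checklist applies when drafting by hand, as the examples below show.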

Example #1:

We’ll start with an easy example. Program support grant descriptions often write themselves:

Brief summary of the grant request from application form:

“We are seeking support for Chicas Youth Development which serves over 500 Latina girls and their families in grades 3-12 in Washington County. Chicas launched in 2008 and has since grown to partner with three Washington County school districts and over 500 local families each year to offer after school programming, leadership, and community service opportunities for Latina youth and their families.”

Grant Description: To support the Chicas Youth Development program, which serves 500 Latina girls in grades 3-12 in Washington County.

That was pretty easy! Particularly because of how we improved the clarity of what we ask for.

Example #2:

The grant below is also a project grant, but the “Brief summary of the grant request” from the application is a little less straightforward:

“GRANTEE requests $AMOUNT to support the presentation of two new publications and four community readings featuring the writing of diverse voices: people who are experiencing homelessness, immigrants and refugees living in our community, seniors living on a low income, LGBTQ folks, people living with a disability, and many others whose voices often live on the margins. This project will bring together people to experience and explore art and will focus on those with the least access to do so.”

Grant Description: To support community building through publication and public readings of works written by marginalized populations.

Example #3:

This grant is for both general operating support and a challenge grant. Tricky.

“GRANTEE respectfully requests $AMOUNT over two years to support program growth as well as provide a matching challenge for individual donations as we continue to increase our sustainability through support from individual donors. If awarded, $AMOUNT would be put toward general operating funds to support our continued program growth in all areas: traditional high school program, statewide initiative pilot program, and our college program. The remaining $AMOUNT each year would serve as a matching challenge grant. In order to be eligible for the match, GRANTEE would have to raise $AMOUNT in new and increased individual donations each year of the grant period.”

Okay Grant Description: To support program growth and provide a matching challenge for individual donations.

Good Grant Description: General operating funds to support program growth and a challenge grant to increase support from individual donors.

Better Grant Description: This grant was awarded in two parts: 1. General operating funds for mission related activities that provide intensive support to low-income high school juniors and seniors in Oregon. 2. A 1:1 challenge grant to increase support from individual donors.

The above description is a perfect example of why it’s important to read the proposal narrative as well as confer with program staff.

If you follow this process, I can’t promise it will be painless, but it will go a long way to relieving a lot of the pain points that come with grants management—particularly the grants management of today in which grants managers are at the crossroads of being data managers, information officers, and storytellers.  I have found making this journey is worth it. Because, after all, behind every grant lies a story waiting to be told and a community waiting to hear it. So, let’s get our stories straight!

--Gretchen Schackel

How "Going Public" Improves Evaluations
October 17, 2017

Edward Pauly is director of research and evaluation at The Wallace Foundation. This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As foundations strive to be #OpenForGood and share key lessons from their grantees' work, a frequent question that arises is how foundations can balance the value of openness with concerns about potential risks.

Concerns about risk are particularly charged when it comes to evaluations. Those concerns include: possible reputational damage to grantees from a critical or less-than-positive evaluation; internal foundation staff disagreements with evaluators about the accomplishments and challenges of grantees they know well; and evaluators’ delays and complicated interpretations.

It therefore may seem counterintuitive to embrace – as The Wallace Foundation has – the idea of making evaluations public and distributing them widely. And one of the key reasons may be surprising: To get better and more useful evaluations.

The Wallace Foundation has found that high-quality evaluations – by which we mean independent, commissioned research that tackles questions that are important to the field – are often a powerful tool for improving policy and practice. We have also found that evaluations are notably improved in quality and utility by being publicly distributed.

Incentives for High Quality

A key reason is that the incentives of a public report for the author are aligned with quality in several ways:

  • Evaluation research teams know that when their reports are public and widely distributed, they will be closely scrutinized and their reputation is on the line. Therefore, they do their highest quality work when it’s public.  In our experience, non-public reports are more likely than public reports to be weak in data use, loose in their analysis, and even a bit sloppy in their writing.  It is also noteworthy that some of the best evaluation teams insist on publishing their reports.
  • Evaluators also recognize that they benefit from the visibility of their public reports because visibility brings them more research opportunities – but only if their work is excellent, accessible and useful.
  • We see evaluators perk up when they focus on the audience their reports will reach. Gathering data and writing for a broad audience of practitioners and policymakers incentivizes evaluators to seek out and carefully consider the concerns of the audience: What information does the audience need in order to judge the value of the project being evaluated? What evidence will the intended audience find useful? How should the evaluation report be written so it will be accessible to the audience?

Making evaluations public is a classic case of a virtuous circle: public scrutiny creates incentives for high quality, accessibility and utility; high quality reports lead to expanded, engaged audiences – and the circle turns again, as large audiences use evaluation lessons to strengthen their own work, and demand more high-quality evaluations. To achieve these benefits, it’s obviously essential for grantmakers to communicate upfront and thoroughly with grantees about the goals of a public evaluation report -- goals of sharing lessons that can benefit the entire field, presented in a way that avoids any hint of punitive or harsh messaging.

“What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work?”

Asking the Right Questions

A key difference between evaluations commissioned for internal use and evaluations designed to produce public reports for a broad audience lies in the questions they ask. Of course, for any evaluation or applied research project, a crucial precursor to success is getting the questions right. In many cases, internally-focused evaluations quite reasonably ask questions about the lessons for the foundation as a grantmaker. Evaluations for a broad audience of practitioners and policymakers, including the grantees themselves, typically ask a broader set of questions, often emphasizing lessons for the field on how an innovative program can be successfully implemented, what outcomes are likely, and what policies are likely to be supportive.

In shaping these efforts at Wallace as part of the overall design of initiatives, we have found that one of the most valuable initial steps is to ask field leaders: What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work? This kind of listening can help a foundation get the questions right for an evaluation whose findings will be valued, and used, by field leaders and practitioners.

Knowledge at Work

For example, school district leaders interested in Wallace-supported “principal pipelines” that could help ensure a reliable supply of effective principals, wanted to know the costs of starting such pipelines and maintaining them over time. The result was a widely-used RAND report that we commissioned, “What It Takes to Operate and Maintain Principal Pipelines: Costs and Other Resources.” RAND found that costs are less than one half of 1% of districts’ expenditures; the report also explained what drives costs, and provided a very practical checklist of the components of a pipeline that readers can customize and adapt to meet their local needs.

Other examples abound of how high-quality public evaluations can help grantees and the field.

Being #OpenForGood does not happen overnight, and managing an evaluation planned for wide public distribution isn’t easy. The challenges start with getting the question right – and then selecting a high-performing evaluation team; allocating adequate resources for the evaluation; connecting the evaluators with grantees and obtaining relevant data; managing the inevitable and unpredictable bumps in the road; reviewing the draft report for accuracy and tone; allowing time for grantees to fact-check it; and preparing with grantees and the research team for the public release. Difficulties, like rocks on a path, crop up in each stage in the journey. Wallace has encountered all of these difficulties, and we don’t always navigate them successfully. (Delays are a persistent issue for us.)

Since we believe that the knowledge we produce is a public good, it follows that the payoff of publishing useful evaluation reports is worth it. Interest from the field is evidenced by 750,000 downloads last year from www.wallacefoundation.org, and a highly engaged public discourse about what works, what doesn’t, why, and how – rather than the silence that often greets many internally-focused evaluations.

--Edward Pauly

No Moat Philanthropy Part 5: The Downsides & Why It’s Worth It
October 6, 2017

Jen Ford Reedy is President of the Bush Foundation. On the occasion of her fifth anniversary leading the foundation, she reflects on efforts undertaken to make the Bush Foundation more permeable. Because the strategies and tactics she shares can be inspiring and helpful for any grantmaker exploring ways to open up their grantmaking, we have devoted this blog space all week to the series. This is the final post in the five-part series.

Everything we do is a trade-off. Spending time and money on the activities described in this No Moat Philanthropy series means time and money not invested in something else. Here are some of the downsides of the trade-offs we have made:

It takes some operating expense.  It requires real staff time for us to do office hours in western North Dakota and to reformat grant reports to be shared online and to do every other activity described in these posts. We believe there is lots of opportunity to advance our mission in the “how” of grantmaking and weigh that as an investment alongside others. In our case, we did not have an increase in staff costs or operating expenses as we made this shift. We just reprioritized.

It can be bureaucratic.  Having open programs and having community members involved in processes requires some structure and rules and standardization in a way that can feel stifling. Philanthropy feels more artful and inspired when you can be creative and move quickly. To be equitably accessible and to improve the chance we are funding the best idea, we are committed to making this trade-off. (While, of course, being as artful and creative as possible within the structures we set!)

“We believe our effectiveness is fundamentally tied to our ability to influence and be influenced by others.”

Lots of applications means lots of turndowns.  Conventional wisdom in philanthropy is to try to limit unsuccessful applications – reducing the amount of effort nonprofits invest with no return. This is an important consideration and it is why many foundations have very narrow guidelines and/or don’t accept unsolicited proposals. The flip side, however, is that the more we all narrow our funding apertures, the harder it is for organizations to get great ideas funded. We’ve decided to run counter to conventional wisdom and give lots of organizations a shot at funding. Of course, we don’t want to waste their time. We have three strategies to try to mitigate this waste: (1) through our hotlines we try to coach unlikely grantees out of the process. (In our experience, nonprofits will often apply anyway – which suggests to us that they value having a shot – even if the odds are long.); (2) we try to make the process worth it. Our surveys suggest that applicants who do the programs with the biggest pools get something out of the process – (and we learn from the applicants even if they are not funded.); and (3) we try to make the first stage of our processes as simple as possible so folks are not wasting too much effort.

Relationships are hard!  Thinking of ourselves as being in relationship with people in the region is not simple. There are lots of them! And it can be super frustrating if a Bush staff member gives advice on a hotline that seems to be contradicted by the feedback when an application is declined. We’ve had to invest money and time in developing our CRM capacity and habits. We have a lot more work to do on this front. We will never not have a lot more work to do on our intercultural competence and our efforts to practice inclusion. Truly including people with different perspectives can make decisions harder even as it makes them better. The early returns on our efforts have been encouraging and we are committed to continuing the work to be more fully in relationship with more people in the communities we serve.

Conclusion

Overall, we believe a No Moat Philanthropy approach has made us more effective. When we are intentional about having impact through how we do our work — building relationships, inspiring action, spreading optimism — then we increase the positive impact we have in the region.

We believe our effectiveness is fundamentally tied to our ability to influence and be influenced by others, which demands trust, reciprocity and a genuine openness to the ideas of others. It requires understanding perspectives other than our own. It requires permeability.

While we arrived at this approach largely because of our place-based sensibility and strategic orientation toward people (see learning paper: “The Bush Approach”), the same principles can apply to a national or international foundation focused on particular issues. The definition of community is different, but the potential value of permeability within that community is the same.

--Jen Ford Reedy

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be
    directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a
    guest contributor, contact:
    glasspockets@foundationcenter.org
