Transparency Talk

Category: "Open Data" (44 posts)

Putting a Stop to Recreating the Wheel: Strengthening the Field of Philanthropic Evaluation
December 13, 2018

Clare Nolan is Co-Founder of Engage R+D, which works with nonprofits, foundations, and public agencies to measure their impact, bring together stakeholders, and foster learning and innovation.

Meg Long is President of Equal Measure, a Philadelphia-based professional services nonprofit focused on helping its clients (foundations, nonprofit organizations, and public entities) deepen and accelerate social change.


In 2017, Engage R+D and Equal Measure, with support from the Gordon and Betty Moore Foundation, launched an exploratory dialogue among funders and evaluators to discuss the current state of evaluation and learning in philanthropy, explore barriers to greater collaboration and impact, and identify approaches and strategies to build the collective capacity of small and mid-sized evaluation firms. Our goal was to test whether there was interest in our sector in building an affinity network of evaluation leaders working with and within philanthropy. Since our initial meeting with a few dozen colleagues in 2017, the network has grown to 250 individuals nationally, and there is growing momentum for finding ways funders and evaluators can work together differently to deepen the impact of evaluation and learning on philanthropic practice.

At the recent 2018 American Evaluation Association (AEA) conference in Cleveland, Ohio, nearly 100 funders and evaluators gathered to discuss the four action areas that generated the most “buzz” at our previous network convening at the Grantmakers for Effective Organizations (GEO) conference and in our subsequent network survey:

1. Improving the application of evaluation in philanthropic strategy and practice.

2. Supporting the sharing and adaptation of evaluation learning for multiple users.

3. Supporting formal partnerships and collaborations across evaluators and evaluation firms.

4. Strengthening and diversifying the pipeline of evaluators working with and within philanthropy.


We asked participants to choose one of these action areas and join the corresponding large table discussion to reflect on what they have learned about the topic and identify how the affinity network can contribute to advancing the field. Through crowd-sourcing, participants identified key ways in which action teams, to be launched in early 2019, can add value to the field.

1. What will it take to more tightly connect evaluation with strategy and decision-making? Provide more guidance on what evaluation should look like in philanthropy.

Are there common principles, trainings, articles, case studies, guides, etc. that an action team could identify and develop? Could the affinity network be a space to convene funders and evaluators that work in similar fields to share evaluation results and lessons learned?

2. What will it take to broaden the audience for evaluations beyond individual organizations? Create a “marketplace” for knowledge sharing and incentivize participation.

As readers of this blog will know from Foundation Center’s #OpenForGood efforts, there is general agreement around the need to do better at sharing knowledge, building evidence, and being willing to share what foundations are learning – both successes and failures. How can an action team support the creation of a culture of knowledge sharing through existing venues and mechanisms (e.g., IssueLab, Evaluation Roundtable)? How could incentives be built in to support transparency and accountability?

3. How can the field create spaces that support greater collaboration and knowledge sharing among funders and evaluators? Identify promising evaluator partnership models that resulted in collaboration and not competition.

Partnerships have worked well where there are established relationships and trust, and where power dynamics are minimized. How can an action team identify promising models and practices for partnerships that succeed even when collaboration is not the main goal? How can partners establish shared values and goals to further collaboration?

4. What will it take to create the conditions necessary to attract, support, and retain new talent? Build upon existing models to support emerging evaluators of color and identify practices for ongoing guidance and mentorship.

Recruiting, hiring, and retaining talent to fit evaluation and learning needs in philanthropy is challenging, given the limits of current education and training programs and changing expectations in the field. How can we leverage and build on existing programs (e.g., the AEA Graduate Education Diversity Internship and Leaders in Equitable Evaluation and Diversity) to increase the pipeline and support ongoing retention and professional development?

Overall, we are delighted to see that there is much enthusiasm in our field to do more work on these issues. We look forward to launching action teams in early 2019 to further flesh out the ideas shared above in addition to others generated over the past year.

If you are interested in learning more about this effort, please contact Pilar Mendoza. If you would like to join the network and receive updates about this work, please contact Christine Kemler.

--Clare Nolan and Meg Long

Data Fix: Do's & Don'ts for Reporting Geographic Area Served
November 1, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This is the second post in a series intended to improve the data available for and about philanthropy.

The first post in our Data Fix series focused on areas that may seem straightforward but often cause confusion, including recipient location data. But don’t confuse recipient location (where the check was sent) with Geographic Area Served (the area meant to benefit from the funding). Data on recipient location, one of our required fields, allows us to match data to the correct organization in our database, ensuring accuracy for analyses or data visualizations. In contrast, Geographic Area Served, one of our highest priority fields, helps us tell the real story about where your funding is making an impact.

How to Report Geographic Area Served

We recognize that providing data on Geographic Area Served can be challenging. Many funders may not track this information, and those who do may depend on grantees or program staff to provide the details. It’s important to keep in mind that sharing some information is better than no information, as funders are currently the only source of this data.

DO

  • Do include details for locations beyond the country level. For U.S. locations, specify a state along with the city- or county-level area served. For non-U.S. locations, include the country name when funding a specific city, province, state, or region.
  • Do use commas to indicate hierarchy and semicolons to separate multiple areas served. For example:
      • Topeka, Kansas (comma used to indicate hierarchy)
      • Hitchcock County, Nebraska; Lisbon, Portugal; Asia (semicolons used to list and separate multiple locations)
  • Do define regions. If you are reporting geographic area served at the regional level (e.g., East Africa), please provide a list of the countries included in your organization’s definition of that region, as your definition may differ from Foundation Center’s. Similarly, if your foundation defines its own regions (e.g., Southwestern Ohio), consider listing the counties that comprise that region.

DON'T

  • Don’t be too broad in scope. “Global Programs” may not be accurate if your work is focused on specific countries. Similarly, listing the geographic area served as “Canada” is misleading if the work serves the province of Quebec rather than the entire country.
  • Don’t use negatives or catch-all terms. “Not California,” “Other,” “Statewide,” and “International” may be meaningful within your organization, but these terms cannot be interpreted for mapping. Instead of “Statewide,” use the name of the state. Instead of “International,” use “Global Programs” or list the countries, regions, or continents being served.
  • Don’t forget to include the term “County” when reporting on U.S. counties. This ensures a grant serving an entire county isn’t assigned to the same-named city (e.g., Los Angeles County, California, rather than Los Angeles, California).

Geographic Area Served in Foundation Center Platforms

Data provided (in a loadable format) will appear under “Grant Details” in Foundation Directory Online (FDO) and in Foundation Maps. Foundation Maps, including the complimentary eReporter map showing your own foundation’s data, also displays an Area Served mapping view.


If data is not provided, Foundation Center will do one of the following:

  • Default to the location of the recipient organization
  • Add geo area served based on text in the grant description
  • Add geo area served based on where the recipient organization works, as listed on their website or in their mission statement, if this information is available in our database

Responsibly Sharing Geographic Area Served


Although our mission is to encourage transparency through the sharing of grants data, we acknowledge there are contexts in which sharing this data may be cause for concern. If the publishing of this data increases risks to the population meant to benefit from the funding, the grantee/recipient, or your own organization, you can either omit Geographic Area Served information entirely or report it at a higher, less sensitive level (e.g. country vs. province or city). For more information on this topic, please see Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts and Sharing Data Responsibly: A Conversation Guide for Funders.

More Tips to Come!

I hope you have a better understanding of how to report Geographic Area Served through eReporting. Without this data, valuable information about where funding is making a difference may be lost! Moving forward, we’ll explore the required fields of Recipient Name and Grant Description. If you have any questions, please feel free to contact me.

-- Kati Neiheisel

New Guide Helps Human Rights Funders Balance Tension between Risk & Transparency
October 25, 2018

Julie Broome is the Director of Ariadne, a network of European donors that support social change and human rights.  

Tom Walker is the Research Manager at The Engine Room, an international organisation that helps activists and organisations use data and technology effectively and responsibly.


Foundations find themselves in a challenging situation when it comes to making decisions about how much data to share about their grantmaking. On the one hand, in recognition of the public benefit function of philanthropy, there is a demand for greater transparency on the part of funders and a push to be open about how much they are giving and who they are giving it to. These demands sometimes come from states, increasingly from philanthropy professionals themselves, and also from critics who believe that philanthropy has been too opaque for too long and raise questions about fairness and access. 

At the same time, donors who work on human rights and other politically charged issues are increasingly aware of the risks to grantees if sensitive information ends up in the public domain. As a result, some funders have moved toward sharing little to no information. However, this has negative consequences for our collective ability to map different fields, making it harder for us all to develop a sense of the funding landscape in different areas. It can also serve to keep certain groups “underground,” when in reality they might benefit from the credibility that foundation funding can bestow.


As the European partners in the Advancing Human Rights project, led by the Human Rights Funders Network and Foundation Center, Ariadne collects grantmaking data from our members that feeds into this larger effort to understand where human rights funding is going and how it is shifting over time. Unlike in the United States, where the IRS 990-PF form eventually provides transparency about grantee transactions, Europe has no equivalent data source. Yet many donors find grant activity information useful for finding peer funders and identifying potential gaps in the funding landscape where their own funds could make a difference. We frequently receive requests from donors who want to use these datasets to drill down into specific areas of interest and map out different funding fields. But these data sources will become less valuable over time if donors move away from voluntarily sharing information about their grantmaking.

Nonetheless, the risks to grantees if donors share information irresponsibly are very real, especially at a time when civil society is increasingly under threat from both state and non-state actors. The desire to balance these two aims – maintaining sufficient data to analyse trends in philanthropy while protecting grantees – led Ariadne to partner with The Engine Room to create a guide to help funders navigate these tricky questions.

After looking at why and how funders share data and the challenges of doing so responsibly, The Engine Room interviewed 8 people and surveyed 32 others working in foundations that fund human rights organisations, asking how they shared data about their grants and highlighting any risks they might see.

Funders told us that they felt treating data responsibly was important, but that implementing it in their day-to-day work was often difficult. It involved balancing competing priorities: between transparency and data protection legislation; between protecting grantees’ data and meeting reporting requirements; and between protecting grantees from unwanted attention and publicising stories that highlight the benefits of their work.

The funders we heard from said they found it particularly difficult to predict how risks might change over time, and how to manage data that had already been shared and published. The most common concerns were:

  • ensuring that data that had already been published remained up to date;
  • de-identifying data before it was published; and
  • working with third parties, such as donors who fund through intermediaries and may request information about the intermediaries’ grantees, to share data about grantees responsibly.

Although the funders we interviewed differed in their mission, size, geographical spread, and focus area, they all stressed the importance of respecting the autonomy of their grantees. Practically, this meant that additional security or privacy measures were often introduced only when the grantee raised a concern. The people we spoke with were often aware that this reactive approach puts the burden of assessing data-related risks onto grantees, and said they most needed support in talking with grantees and other funders in an open, informed way about the opportunities and risks of sharing grantee data.

These conversations can be difficult ones to have. So, we tried a new approach: a guide to help funders have better conversations about responsible data.

It’s aimed at funders or grantmakers who want to treat their grantees’ data responsibly, but don’t always know how. It lists common questions that grantees and funders might ask, combined with advice and resources to help answer them, and tips for structuring a proactive conversation with grantees.

“There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process.”

There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process. It offers prompts that are designed to help you talk more openly with grantees or other funders about data-related risks and ways of dealing with them. The guide is organised around three elements of the grantmaking lifecycle: data collection, data storage, and data sharing.

Because contexts and grantmaking systems vary dramatically and change constantly, a one-size-fits-all solution is impossible. Instead, we decided to offer guidance on processes and questions that many funders share – from deciding whether to publish a case study to having conversations about security with grantees. For example, one tip that would benefit many grantmakers is to ensure that grant agreements include specifics about how the funder will use any data collected as a result of the grant, based on a discussion that helps the grantee to understand how their data will be managed and make decisions accordingly.

This guide aims to give practical advice that helps funders strengthen their relationships with grantees, thereby leading to more effective grantmaking. Download the guide, and let us know what you think!

--Julie Broome and Tom Walker

Philanthropy and Democracy: Bringing Data to the Debate
October 18, 2018

Anna Koob is a manager of knowledge services for Foundation Center.

As money and politics become increasingly intertwined, the enduring debate around the role of philanthropy in a democratic society has taken on new life in recent months (see here, here, here, and here for prominent examples).

One side of the debate sees the flexibility of foundation dollars as part of the solution for strengthening struggling democratic institutions. Others contend that foundations are profoundly undemocratic and increasingly powerful institutions that bypass government channels to shape the country, and the world, to their will. Regardless of where you stand, a practical starting point is to learn more about what grantmakers are actually doing to affect democracy in the United States.

While foundations are required by law to avoid partisan and candidate campaigning, these limitations still leave plenty of room for foundations to engage with democracy in other ways.

Which funders are working on voter access issues? How much money is dedicated to civic engagement on key issues like health or the environment? Which organizations are receiving grants to increase transparency in government? Foundation Funding for U.S. Democracy offers a free public resource for answering such questions.

Browse More Than 57,000 Democracy Grants

Launched in 2014 by Foundation Center and updated regularly, Foundation Funding for U.S. Democracy’s data tool currently includes over 57,000 grants, awarded by more than 6,000 funders and totaling $5.1 billion, across four major categories: campaigns and elections, civic participation, government strengthening, and media.

The tool offers a look at the big picture through dashboards on each of these categories, and also allows you to browse granular grant-level information. Interested in understanding:

  • The largest funders of campaigns and elections work?
  • Grantmaking in support of civic participation, broken down by population type?
  • The strategies used to affect democracy work?

To paraphrase the slogan of Apple, there’s a dashboard (and underlying data tool) for that!

The site also features a collection of research on U.S. democracy, powered by IssueLab, along with links to a number of relevant blog posts and infographics we’ve developed using data from the tool.

What Does the Data Tell Us About Philanthropic Support for Democracy?

Less than two percent of all philanthropic funding in the United States meets our criteria for democracy funding, which includes efforts by foundations to foster an engaged and informed public and support government accountability and integrity, as well as funding for policy research and advocacy. It’s a modest amount considering that this subset captures a wide range of topics, including money in politics, civic leadership development, civil rights litigation, and journalism training. Some findings from the data rise to the top:

  1. Funding for campaigns and elections is the smallest of the four major funding categories tracked. While most people might think of elections as the basic mechanism of democracy, this category constitutes only about 12 percent of democracy funding represented in the tool. Civic participation and government strengthening vie to be the largest category, each accounting for about 38 percent of total democracy funding, and relevant media funding accounts for 28 percent. (Note that grants can be counted in multiple categories, so totals exceed 100 percent.)
  2. Less than a quarter of funding supports policy and advocacy work. While work to affect policy is often considered front and center when discussing philanthropy’s impact on democracy, the data tool reveals that many funders are working to strengthen democracy in other ways. Supporting civics education for youth, bolstering election administration, strengthening platforms for government accountability, and funding investigative journalism are examples of grantmaking areas that strengthen democracy but have less direct implications for public policy.
  3. Funder interest in the census and the role of media in democracy is increasing. Given the turbulence of the last couple of years in the U.S. political system, and amid calls for greater philanthropic involvement in strengthening democracy, what changes have we seen in giving patterns? With the caveat that there is a lag between when grants are awarded and when we receive the data (from 990 tax forms or direct reporting by foundations), reports added to IssueLab and news items posted on Philanthropy News Digest suggest that funders are rallying around causes that strengthen democratic institutions, including efforts to ensure representativeness in the 2020 census and support for research on media consumption and digital disinformation.

Why Should Funders be Transparent about Their Democracy Work?

Appeals for data sharing in philanthropy often center on the common good: detailed data helps inform authentic conversations among grantmakers, nonprofits, and other stakeholders about who’s funding what, and where. But this is a field focused on shaping the nature of our democracy, and it represents funding from both sides of the ideological divide: for example, grantmaking in support of the American Legislative Exchange Council (“dedicated to the principles of limited government, free markets and federalism”) sits alongside grants awarded to organizations like the Center for American Progress (“dedicated to improving the lives of all Americans, through bold, progressive ideas”). Democracy funders therefore tend to be especially cautious about publicizing their work and opening themselves up to increased scrutiny and criticism.

But the reality is that foundation opacity undermines credibility and public trust. Precisely because of criticism about the lack of democracy in philanthropy, foundations should demonstrate intentional transparency and show that they are living their values as democracy funders. Foundations also find that, particularly in a space rife with speculation, there is a benefit to shaping their own narrative and describing what they do in their own words. It may not make them immune to criticism, but it shows they have nothing to hide.

How Funders Can Actively Engage: Submitting Grants Data

Grants data in the platform is either reported directly to Foundation Center via our eReporter program or sourced via publicly available 990 tax forms. While we’re able to get our data-eager hands on foundation grants either way, we prefer sourcing them directly from funders: direct reporting yields more recent data, which is particularly valuable in the current, fast-paced ‘democracy in crisis’ era, as well as more detailed grant descriptions.

To submit your most recent grants (we’re currently collecting grants awarded in 2017), become an eReporter! Export your most recent grants data to a spreadsheet (all grants, not just those relevant to democracy), review the data to make sure there’s no sensitive information and everything appears as you’d like, and email your report to egrants@foundationcenter.org. Submit data as often as you’d like, but at least annually. A purely illustrative example follows.
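
For illustration only, the short Python sketch below assembles an eReporter-style grants list as a CSV file. The column headings draw on fields discussed in our Data Fix posts, but the exact headings your export uses may differ, and the recipient rows here are entirely hypothetical.

    # Illustrative sketch only: writing an eReporter-style grants list
    # to CSV. Headings are drawn from our Data Fix posts; the grant rows
    # below are hypothetical examples.
    import csv

    FIELDS = ["Recipient Name", "Recipient City", "Recipient State",
              "Recipient Country", "Amount Paid", "Fiscal Year",
              "Geographic Area Served", "Grant Description"]

    grants = [
        ["Example Community Fund", "Topeka", "Kansas", "United States",
         25000, 2017, "Topeka, Kansas",
         "General operating support for youth civic education"],
        ["Sample Health Alliance", "Lisbon", "", "Portugal",
         40000, 2017, "Lisbon, Portugal",
         "Project support for community health outreach"],
    ]

    with open("egrants_2017.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(FIELDS)
        writer.writerows(grants)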

Bringing Tangible Details to Abstract Discussions

At Foundation Center, we often tout data’s ability to help guide decision making about funding and general resource allocation. And that’s a great practical use case for the philanthropic data we collect, whether for human rights, ocean conservation, the Sustainable Development Goals, or democracy. At a time of increased foundation scrutiny, this publicly available platform can also provide transparency and concrete details to broaden discussions. What have foundations done to strengthen democracy? And how might they best contribute in these politically uncertain times? For examples, look to the data.

Have questions about this resource? Contact us at democracy@foundationcenter.org.

--Anna Koob

Data Fix: Do's and Don'ts for Data Mapping & More!
October 3, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This post is part of a series intended to improve the data available for and about philanthropy.

As many of you know, Foundation Center was established to provide transparency for the field of philanthropy. A key part of this mission is collecting, indexing, and aggregating millions of grants each year. In recent years this laborious process has become more streamlined thanks to technology, auto-coding, and those of you who directly report your grants data to us. Your participation also increases the timeliness and accuracy of the data.

Today, over 1,300 funders worldwide share grants data directly with Foundation Center. Over the 20 years we've been collecting this data, we've encountered recurring issues with the basic required fields. To make sharing data even quicker and easier, we've put together some dos and don'ts focusing on three areas that may seem straightforward but often cause confusion.

Location Data for Accurate Mapping and Matching

Quite simply, to map your grants data we need location information! And we need location information for more than mapping. We also use this information to ensure we are matching data to the correct organizations in our database. To help us do this even more accurately, we encourage you to provide as much location data as possible. This also helps you by increasing the usability of your own data when running your own analyses or data visualizations.

DO

  • Do supply Recipient City for U.S. and non-U.S. recipients.
  • Do supply Recipient State for U.S. recipients.
  • Do supply Recipient Country for non-U.S. recipients.

DON'T

  • Don't forget to supply Recipient Address and Recipient Postal Code, if possible.
  • Don't supply a post office box in place of a street address for Recipient Address, if you can avoid it.
  • Don't confuse Recipient location (where the check was sent) with Geographic Area Served (where the service will be provided).

What's Your Type? Authorized or Paid?

Two types of grant amounts can be reported: Authorized amounts (new grants authorized in a given fiscal year, including the full amount of grants that may be paid over multiple years) or Paid amounts (grants as they would appear on your IRS tax form). You can report either type – we just need to know which one you are using: Authorized or Paid.

DO

  • Do indicate whether you are reporting Authorized or Paid amounts.
  • Do remain consistent from year to year in sending either Authorized or Paid amounts, to prevent duplication of grants.
  • Do report the Currency of the amount listed, if not US Dollars.

DON'T

  • Don't send more than one column of amounts in your report – either Authorized or Paid for the entire list.
  • Don't forget to include Grant Duration (in months) or Grant Start Date and Grant End Date, if possible.
  • Don't include more than one amount per grant.
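
As a quick illustration of the first two rules above, here is a small Python check, ours alone, that a report contains exactly one amount column; the column names are hypothetical, since your own spreadsheet headings may differ.

    # Illustrative sketch only: confirming a grants report uses exactly
    # one amount column. Column names here are hypothetical.
    def check_amount_columns(header_row):
        amount_cols = [col for col in header_row
                       if col in ("Amount Authorized", "Amount Paid")]
        if len(amount_cols) != 1:
            raise ValueError("Report must contain exactly one amount "
                             "column: Amount Authorized or Amount Paid.")
        return amount_cols[0]

    print(check_amount_columns(
        ["Recipient Name", "Amount Paid", "Fiscal Year"]))  # Amount Paid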

The Essential Fiscal Year

An accurate Fiscal Year is essential, since we publish grants data by fiscal year in our data-driven tools and content-rich platforms, such as those developed by Foundation Landscapes, including Funding the Ocean, SDG Funders, Equal Footing, and Youth Giving. Fiscal Year can be reported as a year (2018) or a date range (07/01/2017-06/30/2018); both formats will appear in published products as YEAR AWARDED: 2018.

DO

  • Do report the Fiscal Year in which the grants were either Authorized or Paid by you, the funder.
  • Do format your Fiscal Year as a year (2018) or a date range (07/01/2017-06/30/2018).

DON'T

  • Don't provide the Fiscal Year of the recipient organization.
  • Don't forget that for off-calendar fiscal years, the last year of the date range is the Fiscal Year: 07/01/2017-06/30/2018 = 2018 (see the sketch below).
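
As a concrete illustration of that last rule, the following Python sketch (ours, not Foundation Center code) derives the published YEAR AWARDED from either accepted Fiscal Year format:

    # Illustrative sketch: deriving YEAR AWARDED from a Fiscal Year value.
    # Accepts a plain year ("2018") or a date range
    # ("07/01/2017-06/30/2018"); for a range, the end year is used.
    def year_awarded(fiscal_year):
        fiscal_year = fiscal_year.strip()
        if "-" in fiscal_year:
            end_date = fiscal_year.split("-")[-1]   # "06/30/2018"
            return int(end_date.split("/")[-1])     # 2018
        return int(fiscal_year)

    assert year_awarded("2018") == 2018
    assert year_awarded("07/01/2017-06/30/2018") == 2018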

More Tips to Come!

I hope you have a better understanding of these three areas of data to be shared through Foundation Center eReporting. Moving forward, we'll explore the required fields of Recipient Name and Grant Description, as well as high priority fields such as Geographic Area Served. If you have any questions, please feel free to contact me. Thank you! And don't forget, the data you share IS making a difference!

-- Kati Neiheisel

“Because It’s Hard” Is Not an Excuse – Challenges in Collecting and Using Demographic Data for Grantmaking
August 30, 2018

Melissa Sines is the Effective Practices Program Manager at PEAK Grantmaking. In this role, she works with internal teams, external consultants, volunteer advisory groups, and partner organizations to articulate and highlight the best ways to make grants – Effective Practices. A version of this post also appears in the PEAK Grantmaking blog.

For philanthropy to advance equity in all communities, especially low-income communities and communities of color, it needs to be able to understand the demographics of the organizations being funded (and declined), the people being served, and the communities impacted. That data should be used to assess practices and drive decision making.

PEAK Grantmaking is working to better understand and build the capacity of grantmakers for collecting and utilizing demographic data as part of their grantmaking. Our work is focused on answering four key questions:

  • What demographic data are grantmakers collecting, and why?
  • How are they collecting this demographic data?
  • How is demographic data being used and interpreted?
  • How can funders use demographic data to inform their work?

In the process of undertaking this research, we surfaced a lot of myths and challenges around this topic that prevent our field from reaching the goal of being accountable to our communities and collecting this data for responsible and effective use.

Generally, about half of all grantmakers are collecting demographic data, either about the communities they are serving or about the leaders of the nonprofits they have supported. For those who reported finding the collection and use of this data challenging, our researcher dug a little deeper and asked about the challenges they were seeing.

Some of the challenges that were brought to the forefront by our research were:

Challenge 1: Fidelity and Accuracy in Self-Reported Data
Data, and self-reported data in particular, will always be limited in its ability to tell the entire story and achieve the nuance necessary for understanding. Many nonprofits, especially small grassroots organizations, lack the capability or capacity to collect and track data about their communities. In addition, white-led nonprofits may fear that a lack of diversity at the board or senior staff level will be judged harshly by grantmakers.

Challenge 2: Broad Variations in Taxonomy
Detailed and flexible identity data can give a more complete picture of the community, but this flexibility works against data standardization. Varying taxonomies, across sectors or organizations, can make it difficult to compare and contrast data. It can also be a real burden if the nonprofit applying for a grant does not collect demographic data in the categories that a grantmaker is using. This can lead to confusion about how to report this data to a funder.

Challenge 3: Varying Data Needs Across Programs
Even inside a single organization, different programs may be collecting and tracking different data, as program officers respond to needs in their community and directives from senior leadership. Different strategies or approaches to a problem demand different data. For instance, an arts advocacy program may be more concerned with constituent demographics and impact, while an artist’s program will want to know about demographics of individual artists.

Challenge 4: Aggregating Data for Coalitions and Collaborations
Data collection becomes even more complex in coalitions and collaborative efforts that bring together numerous organizations, or programs inside different organizations, to accomplish a single task. The aforementioned challenges are compounded as more organizations, different databases, and varying taxonomies try to aggregate consistent demographic data to track impact on specific populations.

These are all very real challenges, but they are not insurmountable. Philanthropy, if it puts itself to the task, can tackle these challenges.

Some suggestions from our report to get the field started include:

  • Don’t let the perfect be the enemy of the good. Pilot systems for data collection, then revisit them to ensure that they are working correctly, meeting the need for good data, and serving the ultimate goal of tracking impact.
  • Fund the capacity of nonprofits to collect good data and to engage in their own diversity, equity, and inclusion efforts.
  • Engage in a conversation – internally and externally – about how this data will be collected and how it will be used. If foundation staff and the nonprofits they work with understand the need for this data, they will more willingly seek and provide this information.
  • For coalitions and collaborative efforts, it may make sense to fund a backbone organization that takes on this task (among other administrative or evaluation efforts) in support of the collective effort.
  • Work with your funding peers – in an issue area or in a community – to approach this challenge in a way that will decrease the burden on nonprofits and utilize experts that may exist at larger grantmaking operations.
  • Support field-wide data aggregators, like GuideStar or the Foundation Center, and work alongside them as they try to collect and disseminate demographic data about the staff and boards at nonprofits and the demographics of communities that are being supported by grantmaking funds.

Grantmakers have the resources and the expertise to begin solving this issue and to share their learning with the entire field. To read more about how grantmakers are collecting and using demographic data, download the full report.

--Melissa Sines

Staff Pick: Foundation Funded Research Explores How to Improve the Voter Experience
August 9, 2018

Becca Leviss is a Knowledge Services Fellow at Foundation Center.

This post is part of the GlassPockets’ Democracy Funding series, designed to spotlight knowledge about ways in which philanthropy is working to strengthen American democracy.

Voting is central to our democracy, providing citizens from all communities a direct way to influence the future by conveying their beliefs through civic participation. Though foundations by law must be non-partisan, they can and do support democracy in a variety of ways, and we are tracking these activities in our publicly available Foundation Funding for U.S. Democracy web portal.
 
From this data we can see that encouraging broad civic participation is one of the most popular ways in which institutional philanthropy supports our democracy. Specific strategies under civic participation include issue-based participation, civic education and leadership, naturalization and immigrant civic integration, and public participation. So, what have foundations learned from these efforts about how to strengthen our democracy? Today we zoom in on an openly available, foundation-funded report containing findings from data collection on elections and voting patterns, including how well the process is working and who is included or excluded.
 
Our latest “Staff Pick” from IssueLab’s Democracy Special Collection, which brings together foundation-funded research on the topic, explores an aspect of the voter experience in America that could be improved. With less than 90 days to go before the midterm elections, we’re pleased to offer this deep dive into an important piece of voting-related research.
 
Research in the social sector can sometimes feel inaccessible or artificial, built on complex theories, mathematical models, and highly controlled situations. This report, however, presents its research methodology and results in a clear, understandable manner that invites the reader to continue its work of understanding how polling sites can use their resources to both investigate and improve the voter experience.

STAFF PICK

Improving the Voter Experience: Reducing Polling Place Wait Times by Measuring Lines and Managing Polling Place Resources, by Charles Stewart III, John C. Fortier, Matthew Weil, Tim Harper, and Stephen Pettigrew

Download the Report

Publisher

Bipartisan Policy Center

Funders

Ford Foundation; The Democracy Fund

Quick Summary

Voting is the cornerstone of civic engagement in American democracy, but long wait times and inefficient organization at polling places can undermine the voting process and even discourage citizens from voting altogether. In 2013, President Barack Obama launched the bipartisan Presidential Commission on Election Administration (PCEA) to initiate studies and collaborative research on polling place wait times. The PCEA’s work revealed that while wait times and poll lines are a serious issue in the United States, they are also reflective of deeper, more complex problems within the election administration system. This report by the Bipartisan Policy Center summarizes the PCEA’s efforts and highlights how the knowledge gained can produce action and improvement at polling sites. Ultimately, the report emphasizes the need for continued research and innovation in approaching common issues in the voter experience.

Field of Practice

Government Reform

What makes it stand out?

“Long lines may be a canary in the coal mine,” begins the report, “indicating problems beyond a simple mismatch between the number of voting machines and voters, such as voter rules that are inaccurate or onerous.” Quantitative and qualitative data have shown that long lines at the polls carry wide-reaching economic costs of over half a billion dollars in a presidential election, as well as the immeasurable cost of voter discouragement due to polling place problems. These issues are exacerbated at dense, urban polling sites with large minority populations, where lack of resources and access can disenfranchise voters.

While the dilemma of election administration is complex, the report describes a rather straightforward series of projects by the Massachusetts Institute of Technology (MIT) and the Bipartisan Policy Center (BPC). MIT and BPC collaborated to create a system of data collection on polling lines and polling place efficiency that would be simple and easily implemented by poll workers. The program used basic queuing theory: calculating the average wait time of a voter by dividing the average line length by the average arrival rate. For fellow (and potential future) researchers, the report spends a meaningful portion of time explaining the significance of each variable, how it is calculated, and how its fluctuation affects the overall results of the investigation. We are given examples of several successful iterations of the study and their evaluations, as well as insight into certain research choices.
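
That queuing arithmetic is essentially Little’s Law: average wait equals average line length divided by average arrival rate. Here is a tiny Python sketch with invented numbers, purely to make the calculation concrete:

    # Little's Law sketch with invented numbers: W = L / lambda.
    avg_line_length = 12.0     # L: average number of voters waiting in line
    avg_arrival_rate = 4.0     # lambda: average voters arriving per minute
    avg_wait_minutes = avg_line_length / avg_arrival_rate
    print(f"Estimated average wait: {avg_wait_minutes:.1f} minutes")  # 3.0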

MIT/BPC’s work found that an overwhelming majority of Election Day polling sites (82 percent) experienced their longest line when the doors first opened. In all, 90 percent of Election Day polling sites see their longest lines within the first two hourly samples (Hour 0 and Hour 1), with lines declining at an average rate after that. Similarly, voters experience the longest wait times when the lines are at their longest. This pattern differs sharply from early voting sites, where wait time is relatively constant; even so, those sites still most commonly experience their longest lines at the beginning of the day (25 percent of the studied population).

The research emphasizes the importance of adequately preparing for the longest line. The report suggests that if polling sites adjust worker shifts to accommodate strong early morning voter turnout on Election Day, they can clear the lines within the first few hours of voting, saving money and better serving their voters. The report also recognizes the range of its results: individual precincts have individual needs. Without meaningful research, however, we cannot know how to meet those needs and improve the voter experience. As readers (and hopefully fellow voters), we are therefore encouraged by MIT/BPC’s work to take clear, simple action to improve our own polling sites through continued research and investigation. This report exemplifies the importance of making the research and data process transparent and attainable so that we can not only understand its significance but actively contribute to its efforts. Many processes could benefit from this kind of data analysis to improve the user experience. What if foundations analyzed their grant processes this way? I can’t help but think there is much philanthropy can learn from reports like this, which show how institutions are opening up data collection to improve the experience of their stakeholders.

Key Quote

“Precincts with large numbers of registered voters often have too few check-in stations or voting booths to handle the volume of voters assigned to the precinct, even under the best of circumstances. Precincts that are unable to clear the lines from the first three hours of voting are virtually guaranteed to have long lines throughout the day. Polling places in urban areas often face design challenges—small, inconvenient spaces—that undermine many election officials’ best efforts to provide adequate resources to these locations.”

--Becca Leviss

What Philanthropy Can Learn from Open Government Data Efforts
July 5, 2018

Daniela Pineda, Ph.D., is vice president of integration and learning at First 5 LA, an independent public agency created by voters to advocate for programs and policies benefiting young children. A version of this post also appears in the GOVERNING blog.

Statistics-packed spreadsheets and lengthy, jargon-filled reports can be enough to make anybody feel dizzy. It's natural. That makes it the responsibility of those of us involved in government and its related institutions to find more creative ways to share the breadth of information we have with those who can benefit from it.

Government agencies, foundations and nonprofits can find ways to make data, outcomes and reports more user-friendly and accessible. In meeting the goal of transparency, we must go beyond inviting people to wade through dense piles of data and instead make them feel welcome using it, so they gain insights and understanding.

How can this be done? We need to make our data less wonky, if you will.

This might sound silly, and being transparent might sound as easy as simply releasing documents. But while leaders of public agencies and officeholders are compelled to comply with requests under freedom-of-information and public-records laws, genuine transparency requires a commitment to making the information being shared easy to understand and useful.

“…genuine transparency requires a commitment to making the information being shared easy to understand and useful.”

Things to consider include how your intended audience prefers to access and consume information. For instance, there are generational differences in accessing information on tablets and mobile devices as opposed to traditional websites. Consider all the platforms your audience uses to view information, such as smartphone apps, news websites, and social media, and keep evolving based on their feedback.

Spreadsheets just won't work here. You need to invest in data visualization techniques and content writing to explain data, no matter how it is accessed.

The second annual Equipt to Innovate survey, published by Governing in partnership with Living Cities, found several cities not only using data consistently to drive decision-making but also embracing ways to make data digestible for the publics they serve.

Los Angeles' DataLA portal, for example, offers more than 1,000 data sets for all to use, along with trainings and tutorials on how to make charts, maps, and other visualizations. The portal's blog offers a robust discussion of the issues and challenges of using existing data to meet common requests. Louisville, Ky., went the proverbial extra mile, putting a lot of thought into what data would be of interest to residents and sharing the best examples of free online services built using the metro government's open data.

Louisville's efforts point to the seemingly obvious but critical strategy of making sure you know what information your target audience actually needs. Have you asked? Perhaps not. The answers should guide you, but also remember to be flexible about what you are asking. For example, the Los Angeles Unified School District is set to launch a new portal later this summer to provide parents with data and is still learning how to supply information that parents find useful. District officials are listening to feedback throughout the process, and they are willing to adjust. One important strategy is to make your audience, or a sampling of them, part of your beta testing. Ask what information they found useful and what else would have been helpful.

“When you share, you are inviting others to engage with you about how to improve your work.”

Remember, the first time you allow a glimpse into your data and processes, it's inevitable your information will have gaps and kinks you can't foresee. And if you are lucky enough to get feedback about what didn't work so well, it may even seem harsh. Don't take it personally. It's an opportunity to ask your audience what could be done better and commit to doing so. It may take weeks, months, or longer to package information for release, making it usable and accessible, but this is an investment worth making. You might miss the mark the first time, but make a commitment to keep trying.

And don't be daunted by the reality that anytime you share information you expose yourself to criticism. Sharing with the public that a project didn't meet expectations or failed completely is a challenge no matter how you look at it. But sharing, even when it is sharing your weaknesses, is a strength your organization can use to build its reputation and gain influence in the long term.

When you share, you are inviting others to engage with you about how to improve your work. You also are modeling the importance of being open about failure. This openness is what helps others feel like partners in the work, and they will feel more comfortable opening up about their own struggles. You might be surprised at who will reach out and what type of partnerships can come from sharing.

Through this process, you will build your reputation and credibility, helping your organization advance its goals. Ultimately, it's about helping those you serve by giving them the opportunity to help you.

--Daniela Pineda

The Risky Business of Foundation Opacity
May 23, 2018

Janet Camarena is director of transparency initiatives for Foundation Center.

In case there was ever any doubt that foundation philanthropy suffers from an opacity problem, a recent Foundation Review article, Foundation Transparency: Opacity — It’s Complicated, by Robert J. Reid, helps settle the matter with research findings that confirm the existence of “significant opacity.” From the lack of foundation websites and annual reporting to perpetual insider control and a desire to keep a low public profile, the author’s research confirms what many of us have been saying for years: there is much room for improved transparency in the field.

The problem is, one can read the entire article and not get the message that opacity is a problem, and a risky one at that. In our networked world of social media, open data, and audience-generated reviews, sending the message that transparency and opacity are equally valid operational choices is dangerous; opacity carries much higher risk than encouraging donors to discover and tell their own story, lest others tell it for them.

History also confirms that philanthropic freedom is more at risk from an opaque approach than from a transparent one. Foundations learned this lesson the hard way in the 1950s during McCarthyism, when two separate congressional commissions were formed to investigate foundation activities. Since there was no central place containing information about institutional philanthropy, no aggregate industry data, and no collective data about the grants they were making, foundation leaders spent years telling their stories one foundation at a time, giving testimony to defend their work against accusations that they were committing “Un-American” acts.

It became clear to the foundation leaders called to testify that this lack of public understanding of institutional philanthropy had led to the suspicions and accusations they were facing, and that as a result of opacity, they might lose the philanthropic freedom the tax laws allowed. Out of this crisis, foundation leaders established Foundation Center as an organization devoted to providing transparency for the field of philanthropy. During his testimony, Russell Leffingwell, at the time chair of the Carnegie Corporation, said: “The foundation should have glass pockets,” so that anyone could easily look inside foundations and understand their value to society, inspiring confidence rather than suspicion. This is the origin story both for Foundation Center and for our Glasspockets website and initiative championing greater foundation transparency.

“...existing and emerging technologies and networks are making foundation opacity obsolete...”

The lessons in this history couldn’t be more relevant to today’s operating environment, where existing and emerging technologies and networks are making foundation opacity obsolete and, more importantly, creating conditions that actually strengthen philanthropy, such as feedback loops, peer benchmarking, and stakeholder input. Foundations can continue to practice what Reid refers to as “opaque practices” or “situational transparency,” but they should understand that they do so at their own peril: thanks to new user-review tools and open data platforms that didn’t exist previously, the relative level of transparency and opacity is rapidly slipping out of their control. Let’s review a few of the new tools poised to shake up the quiet, insular world of foundations.

Open 990-PF

Beginning in 2016, the IRS started releasing e-filed Forms 990 and 990-PF as machine-readable, open data. Because the data is now not only open but digital and machine-readable, anyone from journalists to researchers to activists can aggregate it and make comparisons, correlations, and judgments about philanthropy at lightning speed, all without input from foundations and regardless of how opaque they may prefer their activities to be. Investment practices, demographics of beneficiaries, and compensation practices are examples of 990 data that can easily be turned into compelling narratives about foundations. This has institution-wide implications, from governance practices to grants data and from staffing to investment management and communications strategy. Foundation administrators who have not been looking at their foundation’s 990-PF with an eye to the story it tells about their work probably should. Because the open 990-PF has the potential to transform foundation transparency, Glasspockets has devoted an ongoing blog series to providing guidance and helpful examples to prepare foundations for this new age of open data.
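
To make that concrete: once filings are machine-readable XML, aggregating grants becomes a short scripting exercise. The Python sketch below is illustrative only; the element names are simplified placeholders, not the actual IRS e-file schema, which defines its own tags and namespaces.

    # Illustrative sketch only: tallying grants from a machine-readable
    # 990-PF filing. Element names are simplified placeholders; the real
    # IRS e-file schema uses its own tag names and namespaces.
    import xml.etree.ElementTree as ET

    tree = ET.parse("example_990pf.xml")    # a locally downloaded filing
    total = 0
    for grant in tree.getroot().iter("GrantPaid"):      # placeholder tag
        recipient = grant.findtext("RecipientName", default="unknown")
        amount = int(grant.findtext("Amount", default="0"))
        total += amount
        print(recipient, amount)
    print("Total grants paid:", total)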

GrantAdvisor

Industries as diverse as restaurants, travel, retail, health, and even nonprofits have had the blessing and curse of unfiltered user feedback via online review sites for many years now, so it’s hard to believe that until 2017 this was not the case for philanthropy. With the launch of GrantAdvisor.org last year, foundations can now view, for better or worse, what their stakeholders really think, and so can anyone else. (For transparency’s sake, I currently serve in an advisory role to this platform.) Anyone can register to give feedback, and once a foundation receives more than five reviews, its profile goes live on the site for the world to see, whether the foundation wants it there or not; opacity here is not an option the funder controls. Given the power dynamic, reviews are anonymous, and foundations are able to post responses. A profile with emoji symbols invites users to rate foundations on two principal metrics: the length of time it takes to complete a foundation’s application process, and a smiley/frowning face rating what it’s like to work with the particular funder.

So far, enough reviews have been submitted to provide 69 foundations with unfiltered feedback, and participation is steadily growing. More than 130 foundations have registered to receive alerts when feedback is posted; has yours? And some, whom Reid might call “transparency enthusiasts,” are even inviting their grantees to leave them a review on GrantAdvisor. These foundations understand that this kind of transparency about how applicants can provide feedback, and the open, unfiltered way in which it’s collected, can actually serve to strengthen and improve foundation policies and practices.

These are just a couple of the emerging platforms specific to philanthropy itself. When you zoom out to consider the entire universe of user-generated content now easily available to all, from blogs to Twitter to employee-review sites like Glassdoor, it’s clear that while you can choose opacity, opacity may not choose you: opacity as we know it is over. To think otherwise is to adopt practices that don’t actually mitigate risk, but rather promote a false sense of security while limiting effectiveness. So don’t make the mistake of thinking transparency is too complicated, or that opacity is the convenient and safer choice; it’s actually not a choice at all, but a risky and ultimately obsolete way of working.

--Janet Camarena

Increasing Attention to Transparency: The MacArthur Foundation Is #OpenForGood
April 17, 2018

Chantell Johnson is managing director of evaluation at the John D. and Catherine T. MacArthur Foundation. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

At MacArthur, the desire to be transparent is not new. We believe philanthropy has a responsibility to be explicit about its values, choices, and decisions with regard to its use of resources. Toward that end, we have long had an information sharing policy that guides what and when we share information about the work of the Foundation or our grantees. Over time, we have continued to challenge ourselves to do better and to share more. The latest refinement of our approach to transparency is an effort to share more of the knowledge we are gaining. We expect to continue to push ourselves in this regard, and our participation in Foundation Center’s Glasspockets and #OpenForGood movements offers just a couple of examples of how this commitment has manifested.

In recent years, we have made a more concerted effort to revisit and strengthen our information sharing policy by:

  • Expanding our thinking about what we can and should be transparent about (e.g., our principles of transparency guided our public communications around our 100&Change competition, which included an ongoing blog);
  • Making our guidance more contemporary by moving beyond statements about information sharing to publishing more and different kinds of information (e.g., Grantee Perception Reports and evaluation findings);
  • Making our practices related to transparency more explicit; and
  • Ensuring that our evaluation work is front and center in our efforts related to transparency.

Among the steps we have taken to increase our transparency are the following:

Sharing more information about our strategy development process.
The Foundation's website has a page dedicated to How We Work, which provides detailed information about our approach to strategy development. We share an inside look into the lifecycle of our programmatic efforts, from conceptualizing a grantmaking strategy through the implementation and ending phases, under an approach we refer to as Design/Build. Design/Build recognizes that social problems and conditions are not static, and thus our response to these problems needs to be iterative and evolve with the context to be most impactful. Moreover, we aim to be transparent as we design and build strategies over time.

“We have continued to challenge ourselves to do better and to share more.”

Using evaluation to document what we are measuring and learning about our work.
Core to Design/Build is evaluation. Evaluation has become an increasingly important priority among our program staff. It serves as a tool to document what we are doing, how well we are doing it, how work is progressing, what is being achieved, and who benefits. We value evaluation not only for the critical information it provides to our Board, leadership, and program teams, but for the insights it can provide for grantees, partners, and beneficiaries in the fields in which we aim to make a difference. Moreover, it provides the critical content that we believe is at the heart of many philanthropic efforts related to transparency.

Expanding the delivery mechanisms for sharing our work.
While our final evaluation reports have generally been made public on our website, we aim to make more of our evaluation activities and products available (e.g., landscape reviews and baseline and interim reports). Further, in an effort to make our evaluation work more accessible, we are among the first foundations to make all of our evaluation reports publicly available as part of Foundation Center's #OpenForGood campaign.

Further evidence of the Foundation's commitment to increased transparency includes continuing to improve our “Glass Pockets” by sharing:

  • Our searchable database of grants, including award amount, program, year, and purpose;
  • Funding statistics including total grants, impact investments, final budgeted amounts by program, and administrative expenses (all updated annually);
  • Perspectives of our program directors and staff;
  • Links to grantee products including grant-supported research studies consistent with the Foundation's intellectual property policies;
  • Stories highlighting the work and impact of our grantees and recipients of impact investments; and
  • Center for Effective Philanthropy Grantee Perception Report results.

Going forward, we will look for additional ways to be transparent. And we will challenge ourselves to make findings and learnings accessible even more quickly.

--Chantell Johnson 


About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations, illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org
