Transparency Talk


Living Our Values: Gauging a Foundation’s Commitment to Diversity, Equity, and Inclusion
November 29, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

The California Endowment (TCE) recently wrapped up our 2016 Diversity, Equity, and Inclusion (DEI) Audit, our fourth since 2008. The audit was initially developed at a time when community advocates were pushing the foundation to address issues of structural racism and inequity. As TCE's grantmaking responded, staff and our CEO were also interested in promoting DEI values across the entire foundation, beyond programmatic spaces. Over time, these values became increasingly ingrained in TCE's ethos, and the foundation committed to conducting a regular audit as a vehicle for determining if and how our DEI values were guiding organizational practice.

Sharing information about our DEI Audit often raises questions about how to launch such an effort. Some colleagues are in the early stages of considering whether they want to carry out an audit of their own. Are we ready? What do we need to have in place to even begin to broach this possibility? Others are interested to hear about how we use the findings from such an assessment. To help answer these questions, this is the first of a two-part blog series to share the lessons we’re learning by using a DEI audit to hold ourselves accountable to our values.

While the audit provides a frame to identify if our DEI values are being expressed throughout the foundation, it also fosters learning. Findings are reviewed and discussed with executive leadership, board, and staff. Reviews provide venues to involve both programmatic and non-programmatic staff in DEI discussions. An audit workgroup typically considers how to take action on findings so that the foundation can continuously improve and also considers how to revise audit goals to ensure forward movement. By sharing findings publicly, we hope our experience and lessons can help to support the field more broadly.

It is, however, no small feat. The audit is a comprehensive process that includes a demographic survey of staff and board, a staff and board survey of DEI attitudes and beliefs, interviews with key foundation leaders, an examination of available demographic data from grantee partners, and a review of DEI-related documents gathered between audits. Having dedicated resources to engage a neutral outsider to carry out the audit in partnership with the foundation is also important to this process. We've found it particularly helpful to engage a consistent, trusted partner, Social Policy Research Associates, for each of our audits to capture and candidly reflect where we're making progress and where we need to work harder to create change.

As your foundation considers its own readiness to engage in such an audit process, we offer the following factors that have facilitated a productive, learning-oriented DEI audit effort at TCE:

1. Clarity about the fundamental importance of Diversity, Equity, and Inclusion to the Foundation

The expression of our DEI values has evolved over time. When the audit started, several program staff members who focused on DEI and cultural competency developed a guiding statement on Diversity and Inclusiveness. Included in our audit report, it focused heavily on diversity, although the statement was tweaked over time. A significant shift occurred several years ago when our executive team articulated a comprehensive set of core values that undergirds all our work and leads with a commitment to diversity, equity, and inclusion.

2. Interest in reflection and adaptation

The audit is a tool for organizational learning that facilitates continuous improvement. The process relies on having both a growth mindset and clear goals for what we hope to accomplish. Our 13 goals range from board engagement to utilizing accessibility best practices. In addition to examining our own goals, the audit shares how we’re doing with respect to a framework of institutional supports required to build a culture of equity. By comparing the foundation to itself over time we can determine if and where change is occurring. It also allows us to revise goals so that we can continue to push ourselves forward as we improve, or to course correct if we’re not on track. We anticipate updating our goals before our next audit to reflect where we are currently in our DEI journey.

3. Engagement of key leaders, including staff

Our CEO is vocal and clear about the importance of DEI internally and externally, as well as about the significance of conducting the audit itself. Our executive team, board, and CEO all contribute to the audit process and are actively interested in reviewing and discussing its findings.

Staff engagement is critical throughout audit implementation, reflection on findings, and action planning as well. It's notable that the vast majority of staff at all levels feel comfortable pushing the foundation to stay accountable to DEI internally. However, a small but growing percentage of staff (23%) report feeling uncomfortable raising DEI concerns in the workplace, suggesting an area for greater attention.

4. Capacity to respond to any findings

Findings are not always going to be comfortable. Identifying areas for improvement may put the organization and our leaders in tough places. TCE has historically convened a cross-departmental workgroup to consider audit findings and tackle action planning. We considered co-locating the audit workgroup within our executive leadership team to increase the group's capacity to address audit findings. However, we are now considering whether it would be better situated within an emerging body that will focus specifically on bringing racial equity to the center of all our work.

5. Courage and will to repeat

In a sector with limited accountability, choosing to voluntarily and publicly examine foundation practices takes real commitment and courage. It's always great to hear where we're doing well, but committing to a process that also surfaces multiple areas needing more attention requires deep will to repeat it on a regular basis. And we do so in recognition that this is long-term, ongoing work that, in the absence of a real finish line, requires us to continuously adapt as our communities evolve.

Conducting our DEI audit regularly has strengthened our sense of where our practice excels: for example, in our grantmaking, our strong vision and authorizing environment, and the diversity of our staff and board. It has also strengthened our sense of the ways we want to improve, such as developing a more widely shared DEI analysis and trainings for all staff, and continuing to strengthen data collection among our partners. The value of our DEI audit lies equally in considering findings and in serving as a springboard for prioritizing action. TCE has been on this road a long time, and we'll keep at it for the foreseeable future. As our understanding of what it takes to pursue diversity, equity, and inclusion internally and externally sharpens, so will the demands on our practice. Our DEI audit will continue to ensure that we hold ourselves to these demands. In my next post, we'll take a closer look at what we're learning about operationalizing equity within the foundation.

--Mona Jhawar

What Does It Take to Shift to a Learning Culture in Philanthropy?
November 20, 2018

Janet Camarena is director of transparency initiatives at Foundation Center.

This post also appears in the Center for Effective Philanthropy blog.

If there was ever any doubt that greater openness and transparency could benefit organized philanthropy, a new report from the Center for Effective Philanthropy (CEP) about knowledge-sharing practices puts it to rest. Besides making a case for the need for greater transparency in the field, the report also provides some hopeful signs that, among foundation leaders, there is growing recognition of the value of shifting to a culture of learning to improve foundations' efforts.

Understanding & Sharing What Works: The State of Foundation Practice reveals how well foundation leaders understand what is and isn't working in their foundation's programs, how they figure this out, and what, if anything, they share with others about what they've learned. These trends are explored through 119 survey responses from, and 41 in-depth interviews with, foundation CEOs. A companion series of profiles tells the story of these practices in the context of four foundations that have committed to working more openly.

Since Foundation Center’s launch of GlassPockets in 2010, we have tracked transparency around planning and performance measurement within the “Who Has Glass Pockets?” self-assessment. Currently, of the nearly 100 foundations that have participated in GlassPockets, only 27 percent publicly share any information about how they measure their progress toward institutional goals. Given this lack of knowledge sharing, we undertook a new #OpenForGood campaign to encourage foundations to publicly share published evaluations through the IssueLab open archive.

As someone who has spent the last decade examining foundation transparency practices (or the lack thereof) and championing greater openness, I read CEP’s findings with an eye for elements that might help us better understand the barriers and catalysts to this kind of culture shift in the field. Here’s what I took away from the report.

Performance Anxiety

While two-thirds of foundation CEOs in CEP's study report having a strong sense of what is working programmatically within their foundations, nearly 60 percent report having a weaker grasp on what is not working. This raises the question: if you don't know something is broken, how do you fix it? Since we know foundations have a tendency to be success-oriented, this by itself wasn't surprising. But it's a helpful metric that proves the point that investing in evaluation, learning, and sharing can only lead to wiser use of precious resources for the field as a whole.

The report also reveals that many CEOs who have learned what is not working well at their foundations are unlikely to share that knowledge, as more than one-third of respondents cite hesitancy around disclosing missteps and failures. The interviews and profiles point to what can best be described as performance anxiety. CEOs cite the need for professionals to show what went well, fear of losing the trust of stakeholders, and a desire to impress their boards as motivations for concealing struggles. Of these motivations, board leadership seems particularly influential for setting the culture when it comes to transparency and failure.

In the profiles, Rockefeller Brothers Fund (RBF) President Stephen Heintz discusses both the importance of his board and his background in government as factors that have informed RBF’s willingness to share the kinds of information many foundations won’t. RBF was an early participant in GlassPockets, and now is an early adopter of the #OpenForGood movement to openly share knowledge. As a result, RBF has been one of the examples we often point to for the more challenging aspects of transparency such as frameworks for diversity data, knowledge sharing, and investment practices.

An important takeaway of the RBF profile is the Fund’s emphasis on the way in which a board can help ease performance anxiety by simply giving leadership permission to talk about pain points and missteps. Yet one-third of CEOs specifically mention that their foundation faces pressure from its board to withhold information about failures. This sparks my interest in seeing a similar survey asking foundation trustees about their perspectives in this area.

Utility or Futility?

Anyone who works inside a foundation — or anyone who has ever applied for a grant from a foundation — will tell you they are buried in the kind of paperwork load that often feels futile (which actually spawned a whole other worthy movement led by PEAK Grantmaking called Project Streamline). In the CEP study, the majority of foundation CEOs report finding most of the standard sources of knowledge that they require not very useful to them. Site visits were most consistently ranked highly, with the majority of CEOs (56 percent) pointing to them as one of the most useful sources for learning about what is and isn’t working. Grantee focus groups and convenings came in a distant second, with only 38 percent of CEOs reporting these as a most useful source. And despite the labor involved on both sides of the table, final grant reports were ranked as a most useful source for learning by only 31 percent of CEOs.

If most foundations find greater value in higher touch methods of learning, such as meeting face-to-face or hosting grantee gatherings, then perhaps this is a reminder that if foundations reduce the burdens of their own bureaucracies and streamline application and reporting processes, there will be more time for learning from community and stakeholder engagement.

The companion profile of the Weingart Foundation, another longtime GlassPockets participant, shows the benefits of funders making more time for grantee engagement, and provides a number of methods for doing so. Weingart co-creates its learning and assessment frameworks with grantees, routinely shares all the grantee feedback it receives from its Grantee Perception Report (GPR), regularly makes time to convene grantees for shared learning, and also pays grantees for their time in helping to inform Weingart’s trustees about the problems it seeks to solve.

Supply and Demand

One of the questions we get the most about #OpenForGood’s efforts to build an open, collective knowledge base for the field is whether anyone will actually use this content. This concern also surfaces in CEP’s interviews, with a number of CEOs citing the difficulty of knowing what is useful to share as an impediment to openness. A big source of optimism here is learning that a majority of CEOs report that their decisions are often informed by what other foundations are learning, meaning foundations can rest assured that if they supply knowledge about what is and isn’t working, the demand is there for that knowledge to make a larger impact beyond their own foundation. Think of all that untapped potential!

Of course, given the current state of knowledge sharing in the field, only 19 percent of CEOs surveyed report having quite a bit of knowledge about what's working at peer foundations, and just 6 percent report having quite a bit of knowledge about what's not working among their programmatic peers. Despite this dearth of knowledge, fully three-quarters of foundation CEOs report using what they can access from peers to inform strategy and direction within their own foundations.

Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge. Now there is every reason for knowledge sharing to become the norm rather than the exception.

--Janet Camarena

Data Fix: Do's & Don'ts for Reporting Geographic Area Served
November 1, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This is the second post in a series intended to improve the data available for and about philanthropy.

The first post in our Data Fix series focused on areas that may seem straightforward but often cause confusion, including recipient location data. But don't confuse recipient location (where the check was sent) with Geographic Area Served (the area meant to benefit from the funding). Data on recipient location, one of our required fields, allows us to match data to the correct organization in our database, ensuring accuracy for analyses and data visualizations. In contrast, Geographic Area Served, one of our highest-priority fields, helps us tell the real story about where your funding is making an impact.

How to Report Geographic Area Served

We recognize that providing data on Geographic Area Served can be challenging. Many funders may not track this information, and those who do may depend on grantees or program staff to provide the details. It’s important to keep in mind that sharing some information is better than no information, as funders are currently the only source of this data.

Do include details for locations beyond the country level. For example, for U.S. locations, specify a state along with providing geo area served at the city or county level. For non-U.S. locations, include the country name when funding a specific city, province, state, or region.

Don't be too broad in scope. “Global Programs” may not be accurate if your work is focused on specific countries. Similarly, listing the geo area served as “Canada” is misleading if the work is serving the province of “Quebec, Canada” rather than the entire country.

Do use commas to indicate hierarchy and semi-colons to separate multiple areas served. For example:

  • Topeka, Kansas (comma used to indicate hierarchy)
  • Hitchcock County, Nebraska; Lisbon, Portugal; Asia (semi-colons used to list and separate multiple locations)

Don't use negatives or catch-all terms. “Not California,” “Other,” “Statewide” or “International” may be meaningful within your organization, but these terms cannot be interpreted for mapping. Instead of “Statewide,” use the name of the state. Instead of “International,” use “Global Programs” or list the countries, regions, or continent being served.

Do define regions. If you are reporting geo area served at the regional level (e.g., East Africa), please provide a list of the countries included in your organization's definition of that region; your definition of a region may differ from Foundation Center's. Similarly, if your foundation defines its own regions (e.g., Southwestern Ohio), consider including the counties comprising that region.

Don't forget to include the term “County” when reporting on U.S. counties. This ensures a grant to an entire county isn't assigned to the same-named city (e.g., Los Angeles County, California, rather than Los Angeles, California).
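
To make these conventions concrete, here is a minimal sketch, in Python, of how a funder might pre-check Geographic Area Served values before submitting them. This is not Foundation Center's validation logic; the term lists and parsing rules are assumptions that simply mirror the do's and don'ts above.

```python
# Hypothetical pre-submission check for Geographic Area Served values.
# Semicolons separate multiple areas; commas indicate hierarchy (specific to broad).
CATCH_ALL_TERMS = {"other", "statewide", "international"}  # flagged per the don'ts above
CONTINENTS = {"africa", "antarctica", "asia", "europe", "north america",
              "oceania", "south america"}

def parse_areas(value: str) -> list[list[str]]:
    """Split 'Hitchcock County, Nebraska; Lisbon, Portugal; Asia' into
    [['Hitchcock County', 'Nebraska'], ['Lisbon', 'Portugal'], ['Asia']]."""
    areas = [a.strip() for a in value.split(";") if a.strip()]
    return [[p.strip() for p in area.split(",")] for area in areas]

def check_area_served(value: str) -> list[str]:
    """Return warnings for values that break the do's and don'ts."""
    warnings = []
    for hierarchy in parse_areas(value):
        label = ", ".join(hierarchy)
        first = hierarchy[0].lower()
        if first.startswith("not ") or first in CATCH_ALL_TERMS:
            warnings.append(f"'{label}': negative/catch-all term; name the actual place served")
        elif len(hierarchy) == 1 and first not in CONTINENTS | {"global programs"}:
            warnings.append(f"'{label}': single name; consider adding a state or country")
    return warnings

print(check_area_served("Hitchcock County, Nebraska; Lisbon, Portugal; Asia"))  # []
print(check_area_served("Statewide; Not California"))  # two warnings
```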

Geographic Area Served in Foundation Center Platforms

Data provided (in a loadable format) will appear in “Grant Details” in Foundation Directory Online (FDO) and Foundation Maps. Foundation Maps, including the complimentary eReporter map showing your own foundation's data, also displays an Area Served mapping view.

If data is not provided, Foundation Center will do one of the following:

  • Default to the location of the recipient organization
  • Add geo area served based on text in the grant description
  • Add geo area served based on where the recipient organization works, as listed on their website or in their mission statement, if this information is available in our database

Responsibly Sharing Geographic Area Served


Although our mission is to encourage transparency through the sharing of grants data, we acknowledge there are contexts in which sharing this data may be cause for concern. If the publishing of this data increases risks to the population meant to benefit from the funding, the grantee/recipient, or your own organization, you can either omit Geographic Area Served information entirely or report it at a higher, less sensitive level (e.g. country vs. province or city). For more information on this topic, please see Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts and Sharing Data Responsibly: A Conversation Guide for Funders.

More Tips to Come!

I hope you have a better understanding of how to report Geographic Area Served through eReporting. Without this data, valuable information about where funding is making a difference may be lost! Moving forward, we’ll explore the required fields of Recipient Name and Grant Description. If you have any questions, please feel free to contact me.

-- Kati Neiheisel

New Guide Helps Human Rights Funders Balance Tension between Risk & Transparency
October 25, 2018

Julie Broome is the Director of Ariadne, a network of European donors that support social change and human rights.  

Tom Walker is the Research Manager at The Engine Room, an international organisation that helps activists and organisations use data and technology effectively and responsibly.

Foundations find themselves in a challenging situation when it comes to making decisions about how much data to share about their grantmaking. On the one hand, in recognition of the public benefit function of philanthropy, there is a demand for greater transparency on the part of funders and a push to be open about how much they are giving and who they are giving it to. These demands sometimes come from states, increasingly from philanthropy professionals themselves, and also from critics who believe that philanthropy has been too opaque for too long and raise questions about fairness and access. 

At the same time, donors who work in human rights and on politically charged issues are increasingly aware of the risks to grantees if sensitive information ends up in the public domain. As a result, some funders have moved toward sharing little to no information. However, this can have negative consequences for our collective ability to map different fields, making it harder for us all to develop a sense of the funding landscape in different areas. It can also serve to keep certain groups “underground,” when in reality they might benefit from the credibility that foundation funding can bestow.

As the European partner in the Advancing Human Rights project, led by the Human Rights Funders Network and Foundation Center, Ariadne collects grantmaking data from our members that feeds into this larger effort to understand where human rights funding is going and how it is shifting over time. Unlike in the United States, where the IRS 990-PF form eventually provides transparency about grantee transactions, in Europe there is no equivalent data source. Yet many donors find grant activity information useful for finding peer funders and identifying potential gaps in the funding landscape where their own funds could make a difference. We frequently receive requests from donors who want to use these datasets to drill down into specific areas of interest and map out different funding fields. But these data sources will become less valuable over time if donors move away from voluntarily sharing information about their grantmaking.

Nonetheless, the risks to grantees if donors share information irresponsibly are very real, especially at a time when civil society is increasingly under threat from both state and non-state actors. It was in the interest of balancing these two aims – maintaining sufficient data to be able to analyse trends in philanthropy while protecting grantees – that Ariadne partnered with The Engine Room to create a guide to help funders navigate these tricky questions.

After looking at why and how funders share data and the challenges of doing so responsibly, The Engine Room interviewed 8 people and surveyed 32 others working in foundations that fund human rights organisations, asking how they shared data about their grants and highlighting any risks they might see.

Funders told us that they felt treating data responsibly was important, but that implementing it in their day-to-day work was often difficult. It involved balancing competing priorities: between transparency and data protection legislation; between protecting grantees’ data and reporting requirements; and between protecting grantees from unwanted attention, and publicising stories to highlight the benefits of the grantee’s work.

The funders we heard from said they found it particularly difficult to predict how risks might change over time, and how to manage data that had already been shared and published. The most common concerns were:

  • ensuring that data that had already been published remained up to date;
  • de-identifying data before it was published; and
  • working with third parties to share data about grantees responsibly – for example, with donors who fund through intermediaries and may request information about the intermediaries’ grantees.

Although the funders we interviewed differed in their mission, size, geographical spread, and focus area, they all stressed the importance of respecting the autonomy of their grantees. Practically, this meant that additional security or privacy measures were often introduced only when the grantee raised a concern. The people we spoke with were often aware that this reactive approach puts the burden of assessing data-related risks onto grantees, and suggested that they most needed support when it came to talking with grantees and other funders in an open, informed way about the opportunities and risks associated with sharing grantee data.

These conversations can be difficult ones to have. So, we tried a new approach: a guide to help funders have better conversations about responsible data.

It’s aimed at funders or grantmakers who want to treat their grantees’ data responsibly, but don’t always know how. It lists common questions that grantees and funders might ask, combined with advice and resources to help answer them, and tips for structuring a proactive conversation with grantees.

There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process. It offers prompts that are designed to help you talk more openly with grantees or other funders about data-related risks and ways of dealing with them. The guide is organised around three elements of the grantmaking lifecycle: data collection, data storage, and data sharing.

Because contexts and grantmaking systems vary dramatically and change constantly, a one-size-fits-all solution is impossible. Instead, we decided to offer guidance on processes and questions that many funders share – from deciding whether to publish a case study to having conversations about security with grantees. For example, one tip that would benefit many grantmakers is to ensure that grant agreements include specifics about how the funder will use any data collected as a result of the grant, based on a discussion that helps the grantee to understand how their data will be managed and make decisions accordingly.

This guide aims to give practical advice that helps funders strengthen their relationships with grantees, thereby leading to more effective grantmaking. Download the guide, and let us know what you think!

--Julie Broome and Tom Walker

Philanthropy and Democracy: Bringing Data to the Debate
October 18, 2018

Anna Koob is a manager of knowledge services for Foundation Center.

As money and politics become increasingly intertwined, the enduring debate around the role of philanthropy in a democratic society has taken on new life in recent months (see here, here, here, and here for prominent examples).

One side of the debate sees the flexibility of foundation dollars as a part of the solution to strengthen struggling democratic institutions. Others contend that foundations are profoundly undemocratic and increasingly powerful institutions that bypass government channels to shape the country--and world--to their will. Regardless of where you stand, a practical starting point is to learn more about what grantmakers are actually doing to affect democracy in these United States.

While foundations are required by law to avoid partisan and candidate campaigning, these limitations still leave plenty of room for foundations to engage with democracy in other ways.

Which funders are working on voter access issues? How much money is dedicated to civic engagement on key issues like health or the environment? Which organizations are receiving grants to increase transparency in government? Foundation Funding for U.S. Democracy offers a free public resource for answering such questions.

Browse More Than 55k Democracy Grants

Launched in 2014 by Foundation Center and updated regularly, Foundation Funding for U.S. Democracy’s data tool currently includes over 57,000 grants awarded by more than 6,000 funders, totaling $5.1 billion across four major categories: campaigns and elections, civic participation, government strengthening, and media.

The tool offers a look at the big picture through dashboards on each of these categories, and also allows you to browse granular grant-level information. Interested in understanding:

  • The largest funders of campaigns and elections work?
  • Grantmaking in support of civic participation, broken down by population type?
  • The strategies used to affect democracy work?

To paraphrase the slogan of Apple, there’s a dashboard (and underlying data tool) for that!

The site also features a collection of research on U.S. democracy powered by IssueLab, links to a number of relevant blog posts, and infographics we’ve developed using data from the tool.

What Does the Data Tell Us About Philanthropic Support for Democracy?

Less than two percent of all philanthropic funding in the United States meets our criteria for democracy funding, which includes efforts by foundations to foster an engaged and informed public and support government accountability and integrity, as well as funding for policy research and advocacy. It’s a modest amount considering that this subset captures a wide range of topics, including money in politics, civic leadership development, civil rights litigation, and journalism training. Some findings from the data rise to the top:

  1. Funding for campaigns and elections is the smallest of the four major funding categories tracked. While most people might think of elections as the basic mechanism of democracy, this category constitutes only about 12 percent of democracy funding represented in the tool. Civic participation and government strengthening vie for the largest category, each accounting for about 38 percent of total democracy funding, and relevant media funding accounts for 28 percent. (Note that grants can be counted in multiple categories, so totals exceed 100 percent.)
  2. Less than a quarter of funding supports policy and advocacy work. While work to affect policy is often considered front and center when discussing philanthropy’s impact on democracy, the data tool reveals that many funders are working to strengthen democracy in other ways. Supporting civics education for youth, bolstering election administration, strengthening platforms for government accountability, and funding investigative journalism are examples of grantmaking areas that strengthen democracy but have less direct implications for public policy.
  3. Funder interest in the census and the role of media in democracy is increasing. Given the turbulence of the last couple of years in the U.S. political system and amid calls for greater philanthropic involvement in strengthening democracy, what changes have we seen in giving patterns? With the caveat that there is a lag between when grants are awarded and when we receive that data (from 990 tax forms or direct reporting by foundations), reports added to IssueLab and news items posted on Philanthropy News Digest show funders rallying around some causes to strengthen democratic institutions, including efforts to ensure representativeness in the 2020 census and support for research on media consumption and digital disinformation.

Why Should Funders be Transparent about Their Democracy Work?

Appeals for data sharing in philanthropy often center on the common good -- detailed data helps inform authentic conversations among grantmakers, nonprofits, and other stakeholders about who’s funding what, where. But in a field that’s focused on shaping the nature of our democracy and represents funding from both sides of the ideological divide -- including, for example, grantmaking in support of the American Legislative Exchange Council (“dedicated to the principles of limited government, free markets and federalism”) alongside grants awarded to organizations like the Center for American Progress (“dedicated to improving the lives of all Americans, through bold, progressive ideas”) -- democracy funders tend to be especially cautious about publicizing their work and opening themselves up to increased scrutiny and criticism.

But the reality is that foundation opacity undermines credibility and public trust. Precisely because of criticism about the lack of democracy in philanthropy, foundations should demonstrate intentional transparency and show that they are living their values as democracy funders. Foundations also find that, particularly in a space that’s rife with speculation, there’s a benefit to shaping your own narrative and describing what you do in your own words. It may not make you immune to criticism, but it shows that you have nothing to hide.

How Funders Can Actively Engage: Submitting Grants Data

Grants data in the platform is either reported directly to Foundation Center via our eReporter program or sourced from publicly available 990 tax forms. While we’re able to get our data-eager hands on foundation grants either way, we prefer sourcing them directly from funders, as it lends itself to more recent data -- particularly valuable in the current, fast-paced ‘democracy in crisis’ era -- and more detailed grant descriptions.

To submit your most recent grants (we’re currently collecting grants awarded in 2017), become an eReporter! Export a list of your most recent grants in a spreadsheet (all grants, not just those relevant to democracy), review the data to make sure there’s no sensitive information and everything is as you’d like it to appear, and email your report to egrants@foundationcenter.org. Submit data as often as you’d like, but at least annually.
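
For funders assembling a report, the sketch below shows one way to write such a spreadsheet programmatically. The column headers and the example grant are illustrative assumptions drawn from the fields discussed in our Data Fix series, not the official eReporting template; check the template for exact requirements.

```python
# A hypothetical export of grants to a CSV for eReporting. Column names are
# illustrative assumptions; confirm exact headers against the eReporting template.
import csv

grants = [{
    "Recipient Name": "Example Civic Fund",   # hypothetical grantee
    "Recipient City": "Topeka",
    "Recipient State": "Kansas",
    "Amount Paid": 50000,                     # report Authorized OR Paid amounts, not both
    "Fiscal Year": "2017",
    "Grant Description": "General support for voter-access education",
    "Geographic Area Served": "Topeka, Kansas",
}]

with open("grants_2017.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(grants[0]))
    writer.writeheader()
    writer.writerows(grants)
# Review the file for sensitive information before emailing it to
# egrants@foundationcenter.org.
```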

Bringing Tangible Details to Abstract Discussions

At Foundation Center, we often tout data’s ability to help guide decision making about funding and general resource allocation. And that’s a great practical use case for the philanthropic data that we collect -- whether for human rights, ocean conservation funding, the Sustainable Development Goals, or democracy. At a time of increased foundation scrutiny, this publicly-available platform can also provide some transparency and concrete details to broaden discussions. What have foundations done to strengthen democracy? And how might they best contribute in these politically uncertain times? For examples, look to the data.

Have questions about this resource? Contact us at democracy@foundationcenter.org.

--Anna Koob

Data Fix: Do's and Don'ts for Data Mapping & More!
October 3, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This post is part of a series intended to improve the data available for and about philanthropy.

As many of you know, Foundation Center was established to provide transparency for the field of philanthropy. A key part of this mission is collecting, indexing, and aggregating millions of grants each year. In recent years this laborious process has become more streamlined thanks to technology and auto-coding, and to those of you who directly report your grants data to us. Your participation also increases the timeliness and accuracy of the data.

Today, over 1,300 funders worldwide share grants data directly with Foundation Center. Over the 20 years we've been collecting this data, we've encountered some recurring issues with the basic required fields. To make sharing data even quicker and easier, we've put together some dos and don'ts focusing on three areas that may seem straightforward but often cause confusion.

Location Data for Accurate Mapping and Matching

Quite simply, to map your grants data we need location information! And we need location information for more than mapping. We also use this information to ensure we are matching data to the correct organizations in our database. To help us do this even more accurately, we encourage you to provide as much location data as possible. This also helps you by increasing the usability of your own data when running your own analyses or data visualizations.

Do supply Recipient City for U.S. and non-U.S. recipients.
Don't forget to supply Recipient Address and Recipient Postal Code, if possible.

Do supply Recipient State for U.S. recipients.
Don't supply a post office box in place of a street address for Recipient Address, if possible.

Do supply Recipient Country for non-U.S. recipients.
Don't confuse Recipient location (where the check was sent) with Geographic Area Served (where the service will be provided).
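
As a rough illustration, the sketch below applies these location rules to a grant record represented as a plain Python dictionary. The field names mirror those above, but the record shape and checks are assumptions for illustration, not Foundation Center's matching logic.

```python
# Hypothetical recipient-location checks mirroring the do's and don'ts above.
def check_recipient_location(grant: dict) -> list[str]:
    problems = []
    if not grant.get("Recipient City"):
        problems.append("missing Recipient City (needed for all recipients)")
    country = grant.get("Recipient Country", "").strip()
    if country in ("", "United States"):  # this sketch treats a blank country as U.S.
        if not grant.get("Recipient State"):
            problems.append("missing Recipient State (needed for U.S. recipients)")
    address = grant.get("Recipient Address", "").strip().upper()
    if address.startswith(("PO BOX", "P.O. BOX")):
        problems.append("street address preferred over a post office box, if possible")
    return problems

print(check_recipient_location({"Recipient Name": "Example Org",
                                "Recipient City": "Topeka"}))
# ['missing Recipient State (needed for U.S. recipients)']
```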

What's Your Type? Authorized or Paid?

Two types of grant amounts can be reported: Authorized amounts (new grants authorized in a given fiscal year, including the full amount of grants that may be paid over multiple years) or Paid amounts (as grants would appear in your IRS tax form). You can report on either one of these types of amounts – we just need to know which one you are using: Authorized or Paid.

Do indicate whether you are reporting Authorized or Paid amounts.
Don't send more than one column of amounts in your report – either Authorized or Paid for the entire list.

Do remain consistent from year to year in sending either Authorized amounts or Paid amounts, to prevent duplication of grants.
Don't forget to include Grant Duration (in months) or Grant Start Date and Grant End Date, if possible.

Do report the Currency of the amount listed, if not US Dollars.
Don't include more than one amount per grant.

The Essential Fiscal Year

An accurate Fiscal Year is essential, since we publish grants data by fiscal year in our data-driven tools and content-rich platforms, such as those developed by Foundation Landscapes, including Funding the Ocean, SDG Funders, Equal Footing, and Youth Giving. Fiscal Year can be reported as a year (2018) or a date range (07/01/2017-06/30/2018), but both formats will appear in published products as YEAR AWARDED: 2018.

Do include the Fiscal Year in which the grants were either Authorized or Paid by you, the funder.
Don't provide the Fiscal Year of the recipient organization.

Do format your Fiscal Year as a year (2018) or a date range (07/01/2017-06/30/2018).
Don't forget: for off-calendar fiscal years, the last year of the date range is the Fiscal Year (07/01/2017-06/30/2018 = 2018).
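
Both formats collapse to a single published year, so normalization is mechanical. Here is a minimal sketch of that rule, assuming date ranges always use the MM/DD/YYYY-MM/DD/YYYY form shown above:

```python
# Minimal sketch: normalize a reported Fiscal Year to the published award year.
from datetime import datetime

def award_year(value: str) -> int:
    value = value.strip()
    if "-" in value:  # off-calendar range such as "07/01/2017-06/30/2018"
        end = value.split("-", 1)[1].strip()
        return datetime.strptime(end, "%m/%d/%Y").year  # last year of the range wins
    return int(value)  # already a bare year such as "2018"

assert award_year("2018") == 2018
assert award_year("07/01/2017-06/30/2018") == 2018  # published as YEAR AWARDED: 2018
```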

More Tips to Come!

I hope you have a better understanding of these three areas of data to be shared through Foundation Center eReporting. Moving forward, we'll explore the required fields of Recipient Name and Grant Description, as well as high priority fields such as Geographic Area Served. If you have any questions, please feel free to contact me. Thank you! And don't forget, the data you share IS making a difference!

-- Kati Neiheisel

“Because It’s Hard” Is Not an Excuse – Challenges in Collecting and Using Demographic Data for Grantmaking
August 30, 2018

Melissa Sines is the Effective Practices Program Manager at PEAK Grantmaking. In this role, she works with internal teams, external consultants, volunteer advisory groups, and partner organizations to articulate and highlight the best ways to make grants – Effective Practices. A version of this post also appears in the PEAK Grantmaking blog.

For philanthropy to advance equity in all communities, especially low-income communities and communities of color, it needs to understand the demographics of the organizations being funded (and declined), the people being served, and the communities impacted. That data should be used to assess practices and drive decision making.

PEAK Grantmaking is working to better understand and build the capacity of grantmakers for collecting and utilizing demographic data as part of their grantmaking. Our work is focused on answering four key questions:

  • What demographic data are grantmakers collecting and why?
  • How are they collecting these demographic data?
  • How is demographic data being used and interpreted?
  • How can funders use demographic data to inform their work?

In the process of undertaking this research, we surfaced a lot of myths and challenges around this topic that prevent our field from reaching the goal of being accountable to our communities and collecting this data for responsible and effective use.

Generally, about half of all grantmakers are collecting demographic data either about the communities they are serving or about the leaders of the nonprofits they have supported. For those who reported that they found the collection and use of this data to be challenging, our researcher dug a little deeper and asked about the challenges they were seeing.

Some of the challenges that were brought to the forefront by our research were:

Challenge 1: Fidelity and Accuracy in Self-Reported Data
Data, and self-reported data in particular, will always be limited in its ability to tell the entire story and to achieve the nuance necessary for understanding. Many nonprofits, especially small grassroots organizations, lack the capability or capacity to collect and track data about their communities. In addition, white-led nonprofits may fear that lack of diversity at the board or senior staff level may be judged harshly by grantmakers.

Challenge 2: Broad Variations in Taxonomy
Detailed and flexible identity data can give a more complete picture of the community, but this flexibility works against data standardization. Varying taxonomies, across sectors or organizations, can make it difficult to compare and contrast data. It can also be a real burden if the nonprofit applying for a grant does not collect demographic data in the categories that a grantmaker is using. This can lead to confusion about how to report this data to a funder.

Challenge 3: Varying Data Needs Across Programs
Even inside a single organization, different programs may be collecting and tracking different data, as program officers respond to needs in their community and directives from senior leadership. Different strategies or approaches to a problem demand different data. For instance, an arts advocacy program may be more concerned with constituent demographics and impact, while an artist’s program will want to know about demographics of individual artists.

Challenge 4: Aggregating Data for Coalitions and Collaborations
Data collection becomes even more complex in coalitions and collaborative efforts that bring together numerous organizations, or programs inside different organizations, to accomplish a single task. The aforementioned challenges are compounded as more organizations, different databases, and various taxonomies try to aggregate consistent demographic data to track impact on specific populations.

These are all very real challenges, but they are not insurmountable. Philanthropy, if it puts itself to the task, can tackle these challenges.

Some suggestions from our report to get the field started include:

  • Don’t let the perfect be the enemy of the good. Pilot systems for data collection, then revisit them to ensure that they are working correctly, meeting the need for good data, and serving the ultimate goal of tracking impact.
  • Fund the capacity of nonprofits to collect good data and to engage in their own diversity, equity, and inclusion efforts.
  • Engage in a conversation – internally and externally – about how this data will be collected and how it will be used. If foundation staff and the nonprofits they work with understand the need for this data, they will more willingly seek and provide this information.
  • For coalitions and collaborative efforts, it may make sense to fund a backbone organization that takes on this task (among other administrative or evaluation efforts) in support of the collective effort.
  • Work with your funding peers – in an issue area or in a community – to approach this challenge in a way that will decrease the burden on nonprofits and utilize experts that may exist at larger grantmaking operations.
  • Support field-wide data aggregators, like GuideStar or the Foundation Center, and work alongside them as they try to collect and disseminate demographic data about the staff and boards at nonprofits and the demographics of communities that are being supported by grantmaking funds.

Grantmakers have the resources and the expertise to begin solving this issue and to share their learning with the entire field. To read more about how grantmakers are collecting and using demographic data, download the full report.

--Melissa Sines

New Report Sheds Light on Global Funding Trends by U.S. Foundations
August 23, 2018

Janet Camarena is director of transparency initiatives for Foundation Center.

Those of us in philanthropy often hear that foundations are increasingly rising to the challenge of the world’s most pressing problems, and new data demonstrates that, to fully address these challenges, philanthropic dollars are transcending borders and prior levels of giving. A new report released this month by the Council on Foundations and Foundation Center reveals that global giving by U.S. foundations increased by 29% from 2011 to 2015, reaching an all-time high of $9.3 billion in 2015. Interestingly, despite that new peak in global giving, the report also documents that just 12% of international grant dollars from U.S. foundations went directly to organizations based in the country where programs were implemented.

The State of Global Giving by U.S. Foundations is the latest report in a decades-long collaboration between the two organizations and aims to help funders and civil society organizations better navigate the giving landscape as they work to effect change around the world. A treasure trove of data from prior reports dating back to 1997 is publicly available here.

In terms of transparency and openness, the report offers a helpful data-driven perspective on some of the key global philanthropy debates, issues, and movements of our time. Are you concerned with whether increasing government regulations are preventing foundations from supporting efforts in countries that have enacted tougher funding restrictions? Or, do you want to know how much funding goes to groups on the ground vs. U.S.-based intermediaries? Or, how about getting a better understanding of where the $9.3 billion was spent and how it is advancing the 17 different Sustainable Development Goals? These are just a few examples of the kinds of data and analysis you’ll find in the new report.

Increased Restrictions on Foreign Funding

As governments around the world continue to pass legislation placing increasing restrictions on civil society, these restrictions can complicate direct grantmaking to local organizations by U.S. foundations. Between 2012 and 2015, the International Center for Not-for-Profit Law found that almost 100 laws constraining the freedoms of association or assembly were proposed or enacted across more than 55 countries. And, perhaps of most concern to foundations, 36% of these laws limited international funding of local civil society groups. Common restrictions affecting international funding include: governmental pre-approval of all grants coming from foreign sources; routing of all foreign funding through government entities; and funding caps or taxation.

Despite the growth of these potentially chilling restrictions, the report data surprisingly did not show a correlation between funding flows to a specific country and its level of restrictions as ranked on the “Philanthropic Freedom” index. However, it’s important to note that this kind of analysis may become more accurate over time: since this study used grants data from 2014-2015, the effects of recently enacted legislation may surface in future grant years, after the laws take full effect. Based on the currently available data, what is clear is that some of the top recipient countries of U.S. foundation funding are also countries with very challenging legal environments. Of course, philanthropic funding flows are always determined by a multitude of factors, but this raises questions to explore, such as why certain countries with difficult legal environments rank high on the recipient list while others do not.

Intermediary Giving vs. Local Support

Representatives of NGOs and advocates of community-based groups have long pushed for increased philanthropic capital to flow directly to these groups rather than through large, U.S.-based intermediaries. And growing movements like #ShiftThePower have continued to build momentum around direct investments in communities. However, perhaps due to the aforementioned increasing restrictions on foreign funding, the new report reveals that foundations continue to favor funding through U.S.-based intermediaries, and:

  • Direct grants to local organizations were substantially smaller in size, averaging just under $242K, while grants to intermediaries averaged just over $554K;
  • In terms of dollar amount, U.S.-based intermediaries received $20.5 billion in total, while non-U.S. intermediaries received $10.5 billion, and direct support tallied $4.1 billion; and
  • By number of grants, nearly 49,000 grants during this period went to U.S. intermediaries, 7,514 went to non-U.S. intermediaries, and 16,948 grants were awarded directly.

Progress on Sustainable Development Goals

Readers of this blog might recall that around this time last year we added the Sustainable Development Goals to our “Who Has Glass Pockets?” transparency self-assessment framework. This allowed us to document examples of funders using this shared, multi-sector language to convey the priorities and ultimate goals of their work. The United Nations' Sustainable Development Goals (SDGs), otherwise known as the Global Goals, are a universal call to action to end poverty, protect the planet, and ensure that all people enjoy peace and prosperity. Some foundations have started aligning their funding with the SDGs, and some are even using them as a shared language across philanthropy and across sectors to signal areas of common interest and measure shared progress.

GlassPockets has tracked examples from corporate, community, independent, and family foundations that are using the SDG framework as a means to better communicate their work. Now, thanks to the new report, we also have data about how philanthropic grantmaking is making progress on the SDGs, as well as trend data based on the Global Goals:

  • The Global Goals that represented the largest share of global grant dollars were Good Health & Well Being ($17 billion); Gender Equality ($4.9 billion); and Zero Hunger ($3.6 billion).
  • And the Global Goals that showed the greatest reduction in grant support over the time period covered by the report were Affordable & Clean Energy, which declined by 40 percent; Quality Education, which dipped by 31.4 percent; and Clean Water & Sanitation, which dropped by more than 30 percent.

It’s important to note that the SDGs did not formally go into effect until January 2016, and the data in this report begins in 2011. Still, the distribution of foundation funding by SDG during the five-year period before will serve as a baseline for tracking U.S. philanthropic efforts toward the achievement of the Global Goals.

With mounting challenges that transcend national boundaries, it’s increasingly important to understand how funds are being allocated to tackle global issues. Now, thanks to this report, we have a window into the scope and growth of institutional philanthropy as a global industry.

--Janet Camarena

Staff Pick: The Promise and Limits of First Amendment Rights on College Campuses
August 16, 2018

Becca Leviss is a Knowledge Services Fellow at Foundation Center, and an undergraduate student at Tufts University majoring in Sociology.
 
Institutions of higher learning are natural places for the open exchange of ideas, debating diverse viewpoints, and learning from people who come from different backgrounds. Yet, in recent years, the issue of free speech on college campuses has at once empowered and also confused, isolated, and angered students, university administrations, alumni, and the American public.

As a college student myself, I was drawn to this report by Gallup, the Knight Foundation, and the Newseum Institute. There’s a running joke about the death of free speech on my campus, and I’ve experienced limitations on both sides of the spectrum: choosing not to speak up during class, feeling offended by thoughtless comments, and tapping into the camaraderie made possible by a shared intellectual space. While I acknowledge the difficulties of censorship and seclusion, I cannot ignore the way ideological bubbles have provided a sense of security in my college experience. Likewise, as students, academics, and active citizens, we have an obligation to uphold the tenets of American democracy, but also to recognize its nuance and complexity.


STAFF PICK

Free Expression on Campus: What College Students Think about First Amendment Issues

Download the Report

Publisher

John S. and James L. Knight Foundation; Gallup, Inc.

Funders

John S. and James L. Knight Foundation; American Council on Education; Charles Koch Foundation; Stanton Foundation

Quick summary

Young people continue to be at the forefront of ideological movements and change-making in American society, and as a group they are opinionated about and invested in First Amendment issues. This report updates a 2016 nationally representative study of college students on the security of First Amendment freedoms, accounting for rapidly shifting political, social, and ideological arenas following the most recent presidential election. While the study confirms that college students value free expression and perceive it as relatively secure overall, it finds that students are less likely now than they were in 2016 to say that their rights are secure. Their ideology is also often contradictory: students criticize overly prohibitive campuses and extreme actions to prevent unpopular speech, but are statistically more likely to value inclusion and diversity over free speech.

Field of practice

Human Rights and Civil Liberties

What makes it stand out?

Reading this report serves as an important reminder of the fragility of our liberties in shifting political and social contexts, and of how those contexts can shape our perception of security. The report, a continuation of a 2016 study, investigates the intricacies of First Amendment protections through the perspectives of college students and administrations. As university actors attempt to navigate one of the more contentious issues in an already contentious time, we gain insight into the complexity of a free society by examining it through the eyes of a new generation. Since the data collection began with the 2016 study, the authors are able to compare how respondents’ attitudes changed over time.

The report begins with college students’ views of First Amendment rights. Overall, college students are less likely to see First Amendment rights as secure than in the 2016 survey. This includes a 21-percentage-point decline in the perceived security of freedom of the press and nine-point declines for free speech, freedom of assembly, and freedom to petition the government. The report also looks at how political party affiliation may affect these perceptions. For example, the percentage of Republicans who feel their First Amendment rights (freedom of speech, religion, press, assembly, and petition) are very secure or secure in the country today has increased since the 2016 study, while Democrats’ and Independents’ sense of security has decreased significantly. The study shows that Republicans are far more likely than Democrats to perceive their First Amendment rights as secure. We see this difference most notably in their views on freedom of the press and freedom of assembly: almost eight in ten Republicans think that freedom of the press is secure in 2017, compared with almost five in ten Democrats. Even fewer Democrats think that freedom of assembly is secure, compared with 74 percent of Republicans. Independents generally fall somewhere between the perspectives of Republicans and Democrats.

Some of the most notable shifts in perception concern freedom of speech and freedom of the press. While 71 percent of Republican respondents think their freedom of speech is very secure or secure (an increase of five percentage points), only 59 percent of Democrats say the same, a decrease of fifteen percentage points from 2016. Across the board, however, respondents think freedom of the press is less secure than it was in 2016, regardless of political ideology.

The report also provides insights into groups that don’t always feel they can speak freely on campus. Female students and students of color, for example, are less likely to feel secure about their First Amendment rights. And college students are much less likely to believe that political conservatives can freely express themselves, compared with other groups.

The study illustrates college students’ struggles to reconcile the importance of both free speech and inclusion in a democratic society. Although students feel that campus climate stifles their ability to speak freely, they largely support university measures to control speech, such as safe spaces, free speech zones, and campus speech codes. When asked to choose between a diverse, inclusive society and protecting free speech, a slight majority (53 percent) favored the former. At the same time, an overwhelming 70 percent support an open learning environment that exposes students to a wide variety of speech.

The report also reveals that debates that may once have happened on campus are now moving to social media, an increasingly popular medium of expression for young people. Fifty-seven percent of students say that discussions of political and social issues take place mostly over social media rather than in public areas of campus. Despite social media’s popularity, however, students fear it harms expression: 63 percent disagree that dialogue over social media is mostly civil, and 83 percent think it is too easy for people to say things anonymously on social media platforms. These negative attitudes toward ideological expression on social media are only increasing. Most dramatically, the percentage of students who believe social media stifles free speech because users block dissenting opinions has risen 12 points since 2016.

The report closes with a look at students’ perceptions of the limits of free speech. Openness advocates will find this section most interesting, as it outlines the circumstances under which students feel limits on free speech are appropriate. The study examines student reactions to free speech issues on college campuses, from disinviting controversial speakers to on-campus protests; I can say that I’ve experienced most of them firsthand at my own school. It found that while students oppose disinviting controversial speakers, they do support doing so under the threat of violence, and 34 percent of respondents concede that violent reactions are sometimes acceptable. Regardless, more than six in ten students are not even aware of their schools’ free speech codes, let alone whether their schools have ever had to disinvite certain speakers.

Overall, the “…findings make clear that college students see the landscape for the First Amendment as continuing to evolve,” and reveal the complexity of the ongoing debate on First Amendment rights.

Key quote

“College students generally endorse First Amendment ideals in the abstract. The vast majority say free speech is important to democracy and favor an open learning environment that promotes the airing of a wide variety of ideas. However, the actions of some students in recent years — from milder actions such as claiming to be threatened by messages written in chalk promoting Trump’s candidacy to the most extreme acts of engaging in violence to stop attempted speeches — raise issues of just how committed college students are to upholding First Amendment ideals.”

--Becca Leviss

Staff Pick: Foundation Funded Research Explores How to Improve the Voter Experience
August 9, 2018

Becca Leviss is a Knowledge Services Fellow at Foundation Center.

This post is part of the GlassPockets’ Democracy Funding series, designed to spotlight knowledge about ways in which philanthropy is working to strengthen American democracy.

Voting is central to our democracy, providing citizens from all communities a direct way to influence the future by conveying their beliefs through civic participation. Though foundations by law must be non-partisan, they can and do support democracy in a variety of ways, and we are tracking these activities in our publicly available Foundation Funding for U.S. Democracy web portal.
 
From this data we can see that encouraging broad civic participation is one of the most popular ways in which institutional philanthropy supports our democracy. Specific strategies under civic participation include issue-based participation, civic education and leadership, naturalization and immigrant civic integration, and public participation. So, what have foundations learned from these efforts about how to strengthen our democracy? Today we zoom in on a foundation-funded report that is openly available and contains findings from data collection on elections and voting patterns, including how well the process is working and who is included or excluded.
 
Our latest “Staff Pick” from IssueLab’s Democracy Special Collection, which comprises foundation-funded research on the topic, explores an aspect of the voter experience in America that could be improved. With less than 90 days to go before the midterm elections, we’re pleased to offer this deep dive into an important piece of voting-related research.
 
Research in the social sector can sometimes feel inaccessible or artificial, built on complex theories, mathematical models, and highly controlled situations. This report, however, presents its research methodology and results in a clear, understandable manner that invites the reader to continue its work of understanding how polling sites can use their resources to both investigate and improve the voter experience.

STAFF PICK

Improving the Voter Experience: Reducing Polling Place Wait Times by Measuring Lines and Managing Polling Place Resources, by Charles Stewart III; John C. Fortier; Matthew Weil; Tim Harper; Stephen Pettigrew 

Download the Report

Publisher

Bipartisan Policy Center

Funders

Ford Foundation; The Democracy Fund

Quick Summary

Voting is the cornerstone of civic engagement in American democracy, but long wait times and inefficient organization at polling places can undermine the voting process and even discourage citizens from voting altogether. In 2013, President Barack Obama launched the bipartisan Presidential Commission on Election Administration (PCEA) to initiate studies and collaborative research on polling place wait times. The PCEA’s work revealed that while wait times and poll lines are a serious issue in the United States, they are also reflective of deeper, more complex problems within the election administration system. This report by the Bipartisan Policy Center summarizes the PCEA’s efforts and highlights how the knowledge gained can produce action and improvement at polling sites. Ultimately, the report emphasizes the need for continued research and innovation in approaching common issues in the voter experience.

Field of Practice

Government Reform

What makes it stand out?

“Long lines may be a canary in the coal mine,” begins the report, “indicating problems beyond a simple mismatch between the number of voting machines and voters, such as voter rules that are inaccurate or onerous.” Quantitative and qualitative data have shown that long lines at the polls carry wide-reaching economic costs, upwards of half a billion dollars in a presidential election, as well as the immeasurable cost of voter discouragement due to polling place problems. These issues are exacerbated at dense, urban polling sites with large minority populations, where lack of resources and access can disenfranchise the voting population.

While the dilemma of election administration is complex, the report describes a rather straightforward series of projects by the Massachusetts Institute of Technology and the Bipartisan Policy Center. MIT and BPC collaborated to create a system of data collection on polling lines and polling place efficiency that would be simple and easily implemented by poll workers. The program utilized basic queuing theory, estimating the average wait time of a voter by dividing the average line length by the average arrival rate (an application of what queuing theorists call Little’s Law). For fellow (and potential future) researchers, this report spends a meaningful portion of time explaining the significance of each variable, how it is calculated, and how its fluctuation impacts the overall results of the investigation. We are given examples of several successful iterations of the study and their evaluations, as well as insight into certain research choices.
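To make that arithmetic concrete, here is a minimal sketch of the wait-time estimate in Python. The function name and the sample observations are hypothetical illustrations, not data or code from the report; the sketch simply applies the line-length-divided-by-arrival-rate calculation described above to the kind of hourly counts a poll worker might record.

```python
def estimate_wait_minutes(avg_line_length, arrivals_per_hour):
    """Estimate a voter's average wait via Little's Law:
    average wait = average line length / average arrival rate."""
    if arrivals_per_hour <= 0:
        raise ValueError("arrival rate must be positive")
    arrivals_per_minute = arrivals_per_hour / 60.0
    return avg_line_length / arrivals_per_minute

# Hypothetical hourly observations: (hour label, people counted
# in line, voters who arrived during that hour).
observations = [
    ("Hour 0", 45, 180),  # doors open: the longest line of the day
    ("Hour 1", 30, 150),
    ("Hour 2", 12, 120),
]

for hour, line_length, arrivals in observations:
    wait = estimate_wait_minutes(line_length, arrivals)
    print(f"{hour}: ~{wait:.0f}-minute average wait")
```

Run on these made-up numbers, the sketch reports a roughly 15-minute average wait at opening that falls to about 6 minutes by Hour 2, mirroring the declining-line pattern the report describes.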

MIT/BPC’s work found that an overwhelming majority of Election Day polling sites (82 percent) experienced their longest line when the doors first opened. In all, 90 percent of Election Day polling sites saw their longest lines within the first two hourly observations (Hour 0 and Hour 1), with lines declining steadily after that. Voters likewise experienced the longest wait times when lines were at their longest. This pattern differs sharply from that of early voting sites, where wait times are relatively constant; even so, early voting sites most commonly experience their longest lines at the beginning of the day (true of 25 percent of the sites studied).

The research emphasizes the importance of preparing adequately for the longest line of the day. The report suggests that if polling sites adjust worker shifts to accommodate strong early morning turnout on Election Day, they can clear the lines within the first few hours of voting, saving money and better serving their voters. The report also recognizes the range of its results: in other words, individual precincts have individual needs. Without meaningful research, however, we cannot know how to meet those needs and improve the voter experience. As readers (and hopefully fellow voters), we are therefore encouraged by MIT/BPC’s work to take clear and simple action to improve our own polling sites through continued research and investigation. This report exemplifies the importance of making the research and data process transparent and accessible so that we can not only understand its significance but actively contribute to its efforts. Many processes could benefit from this kind of data analysis to improve the user experience. What if foundations analyzed their grant processes this way? I can’t help but think that philanthropy can learn much from government reports like this one, which shows how institutions are opening up data collection to improve the experience of actors and stakeholders.

Key Quote

“Precincts with large numbers of registered voters often have too few check-in stations or voting booths to handle the volume of voters assigned to the precinct, even under the best of circumstances. Precincts that are unable to clear the lines from the first three hours of voting are virtually guaranteed to have long lines throughout the day. Polling places in urban areas often face design challenges—small, inconvenient spaces—that undermine many election officials’ best efforts to provide adequate resources to these locations.”

--Becca Leviss

