Diversity and inclusion initiatives must include comprehensive metrics for success and accountability. If companies fail to measure their demographics, broken down by function, level, and short- and long-term trends, they won’t be able to assess the success, or failure, of initiatives and leaders. Collecting data with well-worded, thoughtfully structured surveys provides critical information that informs policies, procedures, and programs and drives accountability.
Why did we choose this area?
Measurement is fundamental to understanding current performance, setting targets for the future, and providing accountability. Metrics and goals also drive behavior; setting them for D&I will drive behavior across the company.
As a baseline, we recommend regularly reviewing demographic data (including data on different processes, like hiring, promotion, and attrition, cut by demographics) and running employee engagement surveys (comprehensive surveys periodically and lightweight pulse surveys more frequently). This quantitative and qualitative data is critical to understanding organizational health and areas for improvement; for example, it can help identify things like problematic drop-offs in recruiting pipelines or key predictors of dissatisfaction and turnover, and correspondingly, good areas for intervention and improvement.
Metrics and surveys also create accountability around the reception and success of different company programs and policies, such as the Rooney Rule for hiring, or vacation and leave policies. That is, in addition to tracking top-line metrics, companies should drill down with detailed analysis on specific initiatives.
How do we think about measuring progress?
Measuring diversity and inclusion is important to elevate it to the same status as every other business priority, and to make sure it’s approached with the same sort of rigor. For this purpose, at a minimum, metrics and survey findings should be transparent internally, to help companies understand areas of improvement and, over time, any progress or lack thereof.
There are also reasons to share at least some data externally. For one, shared data is necessary to establish industry benchmarks on problems and trends, and for companies to understand relative performance. For another, the act of publishing data creates strong public accountability for companies working on diversity and inclusion. More in the way of self-interest, companies may want to share their data for PR purposes or brand-building; that’s not a problem, as long as the data aren’t misrepresented.
To that end, we intend to share aggregated metrics across the group of companies who participate in our initiative (result ranges for the top 25 percent, middle 50 percent, and bottom 25 percent), and we plan to share the names of the companies in the top 25 percent publicly.
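To make the banding concrete, here is a minimal sketch of how aggregate quartile ranges could be computed and published without exposing per-company figures. The company names and values are invented for illustration, not real program data:

```python
from statistics import quantiles

# Hypothetical data: each company's share of employees from
# underrepresented groups (illustrative values, not real results).
results = {"co_a": 0.18, "co_b": 0.25, "co_c": 0.31, "co_d": 0.12,
           "co_e": 0.22, "co_f": 0.28, "co_g": 0.15, "co_h": 0.35}

values = sorted(results.values())
q1, _, q3 = quantiles(values, n=4)  # quartile cut points

# Publish only the aggregate range for each band,
# never individual company figures.
bands = {
    "bottom 25%": [v for v in values if v <= q1],
    "middle 50%": [v for v in values if q1 < v < q3],
    "top 25%":    [v for v in values if v >= q3],
}
for band, vals in bands.items():
    print(f"{band}: {min(vals):.0%}-{max(vals):.0%}")

# Names of top-quartile companies can then be shared separately.
top_companies = sorted(c for c, v in results.items() if v >= q3)
```

Because only band ranges and the top-quartile names are emitted, mid- and bottom-band companies stay anonymous while the group still gets an industry benchmark.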
What are we concerned about?
Firstly, measuring the right things is hard.
The wrong metrics can present a misleading view of system health, and optimizing against them can produce distorted outcomes. There is danger in vanity metrics, which provide good optics but lack significance. For example, some companies include non-engineers in their definitions of their engineering teams, artificially inflating their diversity numbers with groups that are usually more diverse. Internally, this can erode trust with employees who see a lack of diversity on their teams, and can delay identifying and addressing problems, while externally, it can lend a company artificial weight as an industry leader.
And while surveys are invaluable for understanding sentiment, they only function well when they are thoughtfully designed and executed, and analysis is done correctly. Poorly designed surveys can not only produce incorrect results but, even worse, be offensive or damaging. Employee distrust of company-administered surveys, particularly around sensitive questions, can skew results due to misrepresentation or lack of response. Running surveys too often can create survey fatigue, lowering participation rates and similarly skewing the data.
Merely measuring and tracking performance isn’t enough; it has to be accompanied by a system for responding to results, and this system must be consistent and transparent. For surveys, re-running them before action has been taken on previous results, or before any change can be noticed, can engender disillusionment and increased negative sentiment among employees.
Despite our recommendations, some companies may not track metrics or run surveys at all, out of concern that they’re not instructive or actionable, or that they may reflect unfavorably. For seed-stage companies in particular, the numbers may be too small to draw statistically significant conclusions, and it may not be possible to cut survey data by demographics because there simply aren’t enough people. The challenge is for these companies to recognize the value of diversity and inclusion early, and to build measurement in from the start to assess progress and identify areas in need of improvement.
What are our recommendations?
Consider using existing metrics definitions and surveys
There’s a lot to recommend when it comes to using existing metrics definitions and surveys, particularly on third-party survey platforms that can aggregate results across organizations. Doing so automatically accommodates many of the best practices recommended below. It also means that data is immediately comparable across companies.
Be transparent with metrics and survey results
In addition to accountability, internal transparency builds trust and confidence, while external transparency allows a company to lead by example. This holds true even though not all metrics need have the same level of transparency, and there are likely to be tiers of access to the information.
Our recommendation on external transparency is for companies with 50 or more employees to publish diversity reports at least yearly, and for high-growth companies that are hiring rapidly, at least twice yearly.
Metrics should be consistent across the industry
We recommend that companies prioritize the following metrics:
- Employees overall, by function, seniority, and tenure, cut by demographics
- Employee status (full-time / part-time / contractor), cut by demographics
- Management and leadership, cut by demographics
- Employees reporting to female managers
- Employees reporting to managers from underrepresented groups
- Salary, cut by demographics
- Raises and bonuses, cut by demographics
- Equity, for all-time and 12 months trailing, cut by demographics
- Employee equity pool, for all-time and 12 months trailing, cut by gender and race
- Investor equity pool, cut by gender and race
- Vesting rates, cut by gender and race
- Board of Directors, cut by demographics
- Candidate pools and hiring funnels, by role, cut by demographics
- Voluntary and involuntary attrition rates, cut by demographics
- Promotion rates, cut by demographics
- Complaints (formal and informal), cut by demographics
- Complaint resolution status
Use inclusive demographic breakdowns
We recommend drawing from the below breakdowns for metrics and surveys:
- Race/ethnicity, with affordance for multiracial identity
- East Asian (including Chinese, Japanese, Korean, Mongolian, Tibetan, and Taiwanese)
- Middle Eastern
- Native American/Alaskan Native/First Nations
- Pacific Islander
- South Asian (including Bangladeshi, Bhutanese, Indian, Nepali, Pakistani, and Sri Lankan)
- Southeast Asian (including Burmese, Cambodian, Filipino, Hmong, Indonesian, Laotian, Malaysian, Mien, Singaporean, Thai, and Vietnamese)
- Prefer not to answer
- Sexual orientation
- Family status
- Children in the home part-time or full-time
- Responsibility for the care of other people
- Immigration status
- Veteran status
- English proficiency
- Languages spoken
- Age and tenure at organization
- Educational attainment
- Highest degree
- Highest degree of parents
- College attended: public/private/any
We recognize that some of these breakdowns are very specific, which can pose problems for small organizations. One possibility is to roll the data up into broader categories as appropriate. It is certainly advisable to do so to protect individuals whose anonymity would otherwise be compromised. A common rule of thumb, for example, is to require that a group contain at least five people before its survey data is available for review.
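The five-person rule of thumb can be applied mechanically before any breakdown is shared. This is a sketch under that assumption, with invented category counts; the bucket label and threshold are illustrative choices, not a standard:

```python
from collections import Counter

MIN_GROUP_SIZE = 5  # rule-of-thumb anonymity threshold

def suppress_small_groups(responses, min_size=MIN_GROUP_SIZE):
    """Roll any demographic group with fewer than `min_size`
    respondents into a combined bucket, so individuals can't be
    identified in published survey breakdowns."""
    counts = Counter(responses)
    rolled = Counter()
    for group, n in counts.items():
        bucket = group if n >= min_size else "Other / suppressed"
        rolled[bucket] += n
    return dict(rolled)

# Illustrative responses only (not real data).
responses = (["East Asian"] * 7 + ["South Asian"] * 6 +
             ["Pacific Islander"] * 2 + ["Middle Eastern"] * 1)
print(suppress_small_groups(responses))
```

Note that the rolled-up bucket can itself fall below the threshold, as it does here, so very small organizations may need to suppress entire breakdowns rather than individual categories.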
There are additional demographic breakdowns that can be very instructive to study, but are sensitive and present disclosure concerns. Employees may not want to share this information with their employers, and employers may not want the responsibility of having it, either. One solution to this problem is to work with a third party survey company that hosts all the data and is responsible for providing analysis of the aggregated data.
Here are some of the potentially more sensitive breakdowns:
- Transgender status
- Mental health/substance abuse status
- Disability status
- Criminal history
Many thanks to Culture Amp and Paradigm for their guidance on recommended demographic breakdowns. These are largely drawn from their collaborative effort on an Employee Inclusion Survey.1
Set employee, leadership, board, and investor demographic diversity goals
The first step to achieving goals is setting them. Within companies, we recommend setting demographic targets not only across the employee base, but also for leadership, the board, and investors.
However, what these goals look like is not always straightforward.
The first question is what types of diversity to set goals around. Gender and race are the most visible classes, and the ones for which there is at least some population data available for comparison. Of course, gender and race are far from the only relevant demographic classifications, but they are a good starting point; a lack of representation of women and people of color is a warning sign that there is a problem, even if a company is neurodiverse or has a high percentage of LGBTQIA employees.
The second question is what the numeric goals are, exactly. Not every company, department, or team will have the same targets, nor boards or investor pools. It is instructive to consider a number of factors in defining long-term targets and the shorter-term, intermediate goals to get there. These include the population demographics of the country/countries, state/states, and city/cities in which a company is located or from which a company hires; of the available talent pool, by role; and of the current and anticipated or desired customer base.
Gender is relatively easy to reason about, because in natural human populations the mix of male and female tends toward 50/50, even if not precisely that. As a default goal, it’s reasonable to aim for organizational groups to be approximately 50 percent female and 50 percent male. As for race, there are significant differences in racial breakdowns depending on the populations in question. For example, 2010 Census results put the United States at 16.3 percent Hispanic or Latinx, 12.6 percent Black, and 4.8 percent Asian, whereas the Bay Area was 23.5 percent Hispanic or Latinx, 6.7 percent Black, and 23.3 percent Asian. Furthermore, immigration plays a non-trivial role in driving current tech company demographics.
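One way to use such reference populations is to compute a per-group representation gap against whichever benchmark a company chooses. The sketch below uses the 2010 Census figures quoted above as the reference; the team breakdown is invented for illustration:

```python
# Reference population: the 2010 Census U.S. figures cited in the text.
us_2010 = {"Hispanic or Latinx": 0.163, "Black": 0.126, "Asian": 0.048}

# Hypothetical team breakdown (invented for illustration).
team = {"Hispanic or Latinx": 0.05, "Black": 0.03, "Asian": 0.30}

def representation_gap(team_share, reference):
    """Positive gap means the group is underrepresented
    relative to the reference population."""
    return {g: round(reference[g] - team_share.get(g, 0.0), 3)
            for g in reference}

print(representation_gap(team, us_2010))
```

Swapping in a Bay Area, state, or talent-pool reference changes the gaps, which is exactly why the choice of benchmark population deserves explicit discussion when setting targets.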
Of course, these population demographics are not the only ones to consider. For different roles, there may be strong constraints given the available talent pool (though some constraints are artificial, like self-imposed guidance on only hiring from a limited set of schools or from specific majors, even when those criteria are irrelevant).
As for the optimal, ideal-world employee, leadership, board, or investor mixes, though, the demographics of the userbase of a company’s products or services provide a good guideline. Even for a Silicon Valley company, if it’s aiming to build for a mainstream American audience, it should aspire beyond the local or existing talent pool demographics to something more representative of the country. That being said, on the flip side, even if a userbase is relatively homogeneous, there are still benefits to having diversity in the team supporting it. Diversity in companies isn’t just about reflecting the demographics of their customers; it also makes them more innovative and better at driving business outcomes.
There is not a lot of data available on what goals companies have set, and even less on what goals have been accomplished. We designed our startup program to collect this data, giving companies a way to semi-anonymously share results and show what is possible to achieve.
Re-use existing surveys, or design new surveys with great care
Good survey design is an art and a science. There is a wealth of research and guidance on how to design and run surveys effectively, but in cases where it’s possible to reuse existing surveys or survey questions, it’s advisable to do so. Others may have already tested their reliability and validity, and uniformity in surveys allows their results to be compared across organizations.
Repeat surveys regularly, at the right intervals
Optimal survey intervals depend mostly on whether a company has had an opportunity to make changes based on the previous survey, and to measure whether employees feel that specific improvements were made. For example, if a company implements new communication mechanisms as a result of survey feedback, it may be advisable to wait six months after implementing them to re-survey. Short, lightweight pulse surveys (about 10 questions) can generally be run more frequently. However, many startups report survey fatigue setting in quickly; survey response rate can be a useful yardstick for measuring it.
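As a small sketch of using response rate as that yardstick, the participation numbers below are invented for illustration; a sustained decline across consecutive runs is the signal to back off the cadence:

```python
# Hypothetical quarterly pulse-survey participation:
# (responses received, employees invited) per survey run.
history = [(120, 150), (110, 152), (88, 155), (70, 158)]

rates = [received / invited for received, invited in history]

# A steady decline in participation suggests fatigue is setting in
# and the survey cadence may need to be reduced.
fatigue_suspected = all(a > b for a, b in zip(rates, rates[1:]))
print([f"{r:.0%}" for r in rates], fatigue_suspected)
```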
While Project Include provides resources as a starting point, we have not made a comprehensive search of all resources and do not necessarily agree with everything in the resources. We share these as helpful references and encourage you to continue exploring.
- Culture Amp and Paradigm Inclusion Survey
- Level Playing Field Institute - The Tilted Playing Field: Hidden Bias in Information Technology Workplaces
- NASA Diversity and Inclusion Assessment Survey
- SAIC Employment Survey (with wording developed by Gallup)
- Harvard University’s tips on survey wording
Existing diversity data reports:
Beginning in 2014, many companies have published diversity data reports of some form. Open Diversity Data is a good resource for finding these reports, and below are direct links to some as well:
- Dropbox 2015, 2014
- Facebook 2015, 2014
- Gusto 2016, 2015
- Indiegogo 2014
- Intel 2015, 2015 mid-year
- LinkedIn 2015, 2014
- Pinterest 2015, 2014
- Slack 2016, 2015
- Twitter 2015, 2014
- Yahoo 2015, 2014
Clune, B. “Culture Amp and Paradigm Partnership.” Culture Amp. Retrieved April 2016 from: http://blog.cultureamp.com/culture-amp-and-paradigm-partnership/