Better People Analytics

Artificial Intelligence and Ethics

How the Eagles Followed the Numbers to the Super Bowl

How People Analytics Can Change Process, Culture, and Strategy

A University Took an Uncommonly Close Look at Its Student-Conduct Data

Dodgers, Brewers show how analytics is changing baseball

Little Privacy in the Workplace of the Future

Google's Culture of Self-Surveying

The Resume of the Future

More Academic Articles

What can machine learning do? Workforce implications
Erik Brynjolfsson and Tom Mitchell. 12/22/2017. “What can machine learning do? Workforce implications.” Science 358 (6370): 1530-1534. Publisher's Version. Abstract:
Digital computers have transformed work in almost every sector of the economy over the past several decades (1). We are now at the beginning of an even larger and more rapid transformation due to recent advances in machine learning (ML), which is capable of accelerating the pace of automation itself. However, although it is clear that ML is a “general purpose technology,” like the steam engine and electricity, which spawns a plethora of additional innovations and capabilities (2), there is no widely shared agreement on the tasks where ML systems excel, and thus little agreement on the specific expected impacts on the workforce and on the economy more broadly. We discuss what we see to be key implications for the workforce, drawing on our rubric of what the current generation of ML systems can and cannot do [see the supplementary materials (SM)]. Although parts of many jobs may be “suitable for ML” (SML), other tasks within these same jobs do not fit the criteria for ML well; hence, effects on employment are more complex than the simple replacement and substitution story emphasized by some. Although economic effects of ML are relatively limited today, and we are not facing the imminent “end of work” as is sometimes proclaimed, the implications for the economy and the workforce going forward are profound.
Small Cues Change Savings Choices
James J. Choi, Emily Haisley, Jennifer Kurkoski, and Cade Massey. 2017. “Small Cues Change Savings Choices.” Behavioral Evidence Hub. Publisher's Version. Abstract:

PROJECT SUMMARY

Researchers tested the effects of including cues, anchors, and savings goals in a company email encouraging employee contributions to their 401(k).

IMPACT

Researchers found that providing examples of high contribution rates or savings goals, or highlighting high savings thresholds created by the 401(k) plan rules, increased contribution rates by 1-2% of income per pay period.

Read More.

Overcoming Algorithm Aversion: People Will Use Imperfect Algorithms If They Can (Even Slightly) Modify Them
Berkeley Dietvorst, Joseph P. Simmons, and Cade Massey. 6/13/2015. “Overcoming Algorithm Aversion: People Will Use Imperfect Algorithms If They Can (Even Slightly) Modify Them.” SSRN. Publisher's Version. Abstract:
Although evidence-based algorithms consistently outperform human forecasters, people often fail to use them after learning that they are imperfect, a phenomenon known as algorithm aversion. In this paper, we present three studies investigating how to reduce algorithm aversion. In incentivized forecasting tasks, participants chose between using their own forecasts or those of an algorithm that was built by experts. Participants were considerably more likely to choose to use an imperfect algorithm when they could modify its forecasts, and they performed better as a result. Notably, the preference for modifiable algorithms held even when participants were severely restricted in the modifications they could make (Studies 1-3). In fact, participants’ preference for modifiable algorithms appears to reflect a desire for some control over the forecasting outcome rather than a desire for greater control, as that preference was relatively insensitive to the magnitude of the modifications they were able to make (Study 2). Additionally, we found that giving participants the freedom to modify an imperfect algorithm made them feel more satisfied with the forecasting process, more likely to believe that the algorithm was superior, and more likely to choose to use an algorithm to make subsequent forecasts (Study 3). This research suggests that one can reduce algorithm aversion by giving people some control, even a slight amount, over an imperfect algorithm’s forecast.
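To make the study's mechanism concrete, here is a minimal sketch of the restricted-modification setup; the function, its names, and the +/-5 adjustment band are illustrative assumptions, not the paper's actual materials.

```python
# A minimal sketch of "restricted modification": the final forecast is
# the algorithm's output plus a human adjustment clamped to a small band.
# The +/-5 band is an illustrative assumption, not the study's design.
def final_forecast(algorithm_forecast: float,
                   human_adjustment: float,
                   max_tweak: float = 5.0) -> float:
    """Let a person nudge the model's forecast, but only within +/- max_tweak."""
    clamped = max(-max_tweak, min(max_tweak, human_adjustment))
    return algorithm_forecast + clamped

# The model predicts 72; the user wants +20, but only +5 is allowed.
print(final_forecast(72.0, 20.0))  # 77.0
```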
The Bright Side of Being Prosocial at Work, and the Dark Side, Too
Mark C. Bolino and Adam Grant. 2016. “The Bright Side of Being Prosocial at Work, and the Dark Side, Too.” The Academy of Management Annals. Publisher's Version. Abstract:
More than a quarter century ago, organizational scholars began to explore the implications of prosociality in organizations. Three interrelated streams have emerged from this work, which focus on prosocial motives (the desire to benefit others or expend effort out of concern for others), prosocial behaviors (acts that promote/protect the welfare of individuals, groups, or organizations), and prosocial impact (the experience of making a positive difference in the lives of others through one’s work). Prior studies have highlighted the importance of prosocial motives, behaviors, and impact, and have enhanced our understanding of each of them. However, there has been little effort to systematically review and integrate these related lines of work in a way that furthers our understanding of prosociality in organizations. In this article, we provide an overview of the current state of the literature, highlight key findings, identify major research themes, and address important controversies and debates. We call for an expanded view of prosocial behavior and a sharper focus on the costs and unintended consequences of prosocial phenomena. We conclude by suggesting a number of avenues for future research that will address unanswered questions and should provide a more complete understanding of prosociality in the workplace.

More Popular Press

A University Took an Uncommonly Close Look at Its Student-Conduct Data. Here’s What It Found.
Dan Bauman. 8/28/2018. “A University Took an Uncommonly Close Look at Its Student-Conduct Data. Here’s What It Found.” The Chronicle of Higher Education. Publisher's Version. Abstract:
When colleges try to understand their students, they resort to a common tool: the survey.

And surveys are fine, says Dayna Weintraub, director of student-affairs research and assessment at Rutgers University at New Brunswick. But she also recognizes their drawbacks: poor response rates, underrepresentation of particular demographic groups, and, in certain instances, answers that lack needed candor.

And so, to assess and change student conduct in a more effective way, Weintraub and her colleagues have tried a new approach: find existing, direct, and detailed data on how Rutgers students conduct themselves, and combine them.

Leading the effort was Kevin Pitt, director of student conduct at the New Jersey university. Working alongside Weintraub, he and his team analyzed, with granular specificity, students’ behavior patterns in a variety of contexts: excessive alcohol or drug use, questionable sexual situations, and others. Pitt and his team examined student-level trends within those areas, combining a variety of previously siloed databases to sketch a more informative picture of student life at Rutgers.
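
As a rough illustration of that combining step, here is a minimal sketch; the tables, columns, and values are entirely hypothetical, not Rutgers data.

```python
# A minimal sketch of combining previously siloed records on a shared
# student ID. All tables, columns, and values here are hypothetical.
import pandas as pd

conduct = pd.DataFrame({
    "student_id": [1, 2, 2, 3],
    "incident_type": ["alcohol", "alcohol", "noise", "drugs"],
})
housing = pd.DataFrame({
    "student_id": [1, 2, 3],
    "residence_hall": ["North", "North", "South"],
})

# One join turns two siloed files into a student-level view.
merged = conduct.merge(housing, on="student_id", how="left")

# A student-level trend within one behavior area: alcohol incidents by hall.
alcohol = merged[merged["incident_type"] == "alcohol"]
print(alcohol.groupby("residence_hall").size())
```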

Read More.

Amazon scrapped 'sexist AI' tool
10/10/2018. “Amazon scrapped 'sexist AI' tool.” BBC News. Publisher's Version. Abstract:

An algorithm that was being tested as a recruitment tool by online giant Amazon was sexist and had to be scrapped, according to a Reuters report. The artificial intelligence system was trained on data submitted by applicants over a 10-year period, much of which came from men, the report claimed.

Reuters was told by members of the team working on it that the system effectively taught itself that male candidates were preferable. Amazon has not responded to the claims.

Reuters spoke to five members of the team who developed the machine learning tool in 2014, none of whom wanted to be publicly named. They told Reuters that the system was intended to review job applications and give candidates a score ranging from one to five stars.

"They literally wanted it to be an engine where I'm going to give you 100 resumes, it will spit out the top five, and we'll hire those," said one of the engineers who spoke to Reuters.

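To see how a screener can teach itself bias without any explicit gender field, consider this minimal sketch; the data and tokens are synthetic, and this is not Amazon's actual model.

```python
# Minimal sketch of how a resume screener trained on historical hiring
# decisions can absorb gender bias. Synthetic data; not Amazon's system.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

resumes = [
    "chess club captain java developer",
    "java developer distributed systems",
    "women's chess club captain java developer",
    "women's coding society java developer",
]
hired = [1, 1, 0, 0]  # labels reflect past (biased) human decisions

vec = CountVectorizer()
X = vec.fit_transform(resumes)
model = LogisticRegression().fit(X, hired)

# No gender field exists, yet the proxy token "women" gets a negative
# weight because it correlates with past rejections in the training data.
idx = vec.vocabulary_["women"]
print(f"learned weight for 'women': {model.coef_[0][idx]:+.3f}")
```
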
Read More.

Meet Your New Boss: An Algorithm

A.I. as Talent Scout: Unorthodox Hires, and Maybe Lower Pay

The Performance Management Revolution

Amazon scrapped 'sexist AI' tool

Making it easier to discover datasets

HR Must Make People Analytics More User-Friendly

More Harvard Business Review

The New Analytics of Culture
Matthew Corritore, Amir Goldberg, and Sameer Srivastava. 1/31/2020. “The New Analytics of Culture.” Harvard Business Review. Publisher's Version. Abstract:
A business’s culture can catalyze or undermine success. Yet the tools available for measuring it—namely, employee surveys and questionnaires—have significant shortcomings. Employee self-reports are often unreliable. The values and beliefs that people say are important to them, for example, are often not reflected in how they actually behave. Moreover, surveys provide static, or at best episodic, snapshots of organizations that are constantly evolving. And they’re limited by researchers’ tendency to assume that distinctive and idiosyncratic cultures can be neatly categorized into a few common types.
Better People Analytics
Paul Leonardi and Noshir Contractor. 11/1/2018. “Better People Analytics.” Harvard Business Review. Publisher's Version. Abstract:

"We have charts and graphs to back us up. So f*** off.” New hires in Google’s people analytics department began receiving a laptop sticker with that slogan a few years ago, when the group probably felt it needed to defend its work. Back then people analytics—using statistical insights from employee data to make talent management decisions—was still a provocative idea with plenty of skeptics who feared it might lead companies to reduce individuals to numbers. HR collected data on workers, but the notion that it could be actively mined to understand and manage them was novel—and suspect.

Today there’s no need for stickers. More than 70% of companies now say they consider people analytics to be a high priority. The field even has celebrated case studies, like Google’s Project Oxygen, which uncovered the practices of the tech giant’s best managers and then used them in coaching sessions to improve the work of low performers. Other examples, such as Dell’s experiments with increasing the success of its sales force, also point to the power of people analytics.

But hype, as it often does, has outpaced reality. The truth is, people analytics has made only modest progress over the past decade. A survey by Tata Consultancy Services found that just 5% of big-data investments go to HR, the group that typically manages people analytics. And a recent study by Deloitte showed that although people analytics has become mainstream, only 9% of companies believe they have a good understanding of which talent dimensions drive performance in their organizations.

What gives? If, as the sticker says, people analytics teams have charts and graphs to back them up, why haven’t results followed? We believe it’s because most rely on a narrow approach to data analysis: They use data only about individual people, when data about the interplay among people is equally or more important.

People’s interactions are the focus of an emerging discipline we call relational analytics. By incorporating it into their people analytics strategies, companies can better identify employees who are capable of helping them achieve their goals, whether for increased innovation, influence, or efficiency. Firms will also gain insight into which key players they can’t afford to lose and where silos exist in their organizations.

Fortunately, the raw material for relational analytics already exists in companies. It’s the data created by e-mail exchanges, chats, and file transfers—the digital exhaust of a company. By mining it, firms can build good relational analytics models.
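
As a rough illustration of what mining that exhaust looks like, here is a minimal sketch; the message log and names are hypothetical, and the centrality and community measures are standard network tools, not the authors' specific models.

```python
# A minimal sketch of relational analytics: turn email metadata into a
# network and look for brokers and silos. The message log is hypothetical.
import networkx as nx

log = [  # (sender, recipient, messages exchanged)
    ("ana", "ben", 42), ("ben", "carla", 3), ("carla", "dev", 51),
    ("dev", "erin", 48), ("erin", "carla", 40), ("ana", "ben", 37),
]

G = nx.Graph()
for sender, recipient, count in log:
    prev = G[sender][recipient]["weight"] if G.has_edge(sender, recipient) else 0
    G.add_edge(sender, recipient, weight=prev + count)

# Shortest-path measures read edge weights as distances, so convert
# message counts into distances: more messages means a "closer" tie.
for u, v, data in G.edges(data=True):
    data["distance"] = 1.0 / data["weight"]

# High betweenness marks brokers: people whose loss would cut the paths
# between otherwise separate groups.
brokers = nx.betweenness_centrality(G, weight="distance")
print(max(brokers, key=brokers.get))

# Densely connected communities with few cross-links suggest silos.
communities = nx.algorithms.community.greedy_modularity_communities(G, weight="weight")
print([sorted(c) for c in communities])
```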

In this article we present a framework for understanding and applying relational analytics. And we have the charts and graphs to back us up.

Read More.

The Performance Management Revolution
Peter Cappelli and Anna Tavis. 10/2016. “The Performance Management Revolution.” Harvard Business Review. Publisher's Version. Abstract:

When Brian Jensen told his audience of HR executives that Colorcon wasn’t bothering with annual reviews anymore, they were appalled. This was in 2002, during his tenure as the drugmaker’s head of global human resources. In his presentation at the Wharton School, Jensen explained that Colorcon had found a more effective way of reinforcing desired behaviors and managing performance: Supervisors were giving people instant feedback, tying it to individuals’ own goals, and handing out small weekly bonuses to employees they saw doing good things.

Back then the idea of abandoning the traditional appraisal process—and all that followed from it—seemed heretical. But now, by some estimates, more than one-third of U.S. companies are doing just that. From Silicon Valley to New York, and in offices across the world, firms are replacing annual reviews with frequent, informal check-ins between managers and employees.

How We Got Here

Historical and economic context has played a large role in the evolution of performance management over the decades. When human capital was plentiful, the focus was on which people to let go, which to keep, and which to reward—and for those purposes, traditional appraisals (with their emphasis on individual accountability) worked pretty well. But when talent was in shorter supply, as it is now, developing people became a greater concern—and organizations had to find new ways of meeting that need.

Read More.
