Digital computers have transformed work in almost every sector of the economy over the past several decades (1). We are now at the beginning of an even larger and more rapid transformation due to recent advances in machine learning (ML), which is capable of accelerating the pace of automation itself. However, although it is clear that ML is a “general purpose technology,” like the steam engine and electricity, which spawns a plethora of additional innovations and capabilities (2), there is no widely shared agreement on the tasks where ML systems excel, and thus little agreement on the specific expected impacts on the workforce and on the economy more broadly. We discuss what we see to be key implications for the workforce, drawing on our rubric of what the current generation of ML systems can and cannot do [see the supplementary materials (SM)]. Although parts of many jobs may be “suitable for ML” (SML), other tasks within these same jobs do not fit the criteria for ML well; hence, effects on employment are more complex than the simple replacement and substitution story emphasized by some. Although economic effects of ML are relatively limited today, and we are not facing the imminent “end of work” as is sometimes proclaimed, the implications for the economy and the workforce going forward are profound.
Researchers tested the effects of including cues, anchors, and savings goals in a company email encouraging employee contributions to their 401(k).
Researchers found that providing examples of high contribution rates or savings goals, or highlighting high savings thresholds created by the 401(k) plan rules, increased 401(k) contribution rates by 1-2% of income per pay period.
Although evidence-based algorithms consistently outperform human forecasters, people often fail to use them after learning that they are imperfect, a phenomenon known as algorithm aversion. In this paper, we present three studies investigating how to reduce algorithm aversion. In incentivized forecasting tasks, participants chose between using their own forecasts or those of an algorithm that was built by experts. Participants were considerably more likely to choose to use an imperfect algorithm when they could modify its forecasts, and they performed better as a result. Notably, the preference for modifiable algorithms held even when participants were severely restricted in the modifications they could make (Studies 1-3). In fact, our results suggest that participants’ preference for modifiable algorithms was indicative of a desire for some control over the forecasting outcome, not a desire for greater control, as their preference was relatively insensitive to the magnitude of the modifications they were able to make (Study 2). Additionally, we found that giving participants the freedom to modify an imperfect algorithm made them feel more satisfied with the forecasting process, more likely to believe that the algorithm was superior, and more likely to choose to use an algorithm to make subsequent forecasts (Study 3). This research suggests that one can reduce algorithm aversion by giving people some control - even a slight amount - over an imperfect algorithm’s forecast.
More than a quarter century ago, organizational scholars began to explore the implications of prosociality in organizations. Three interrelated streams have emerged from this work, which focus on prosocial motives (the desire to benefit others or expend effort out of concern for others), prosocial behaviors (acts that promote/protect the welfare of individuals, groups, or organizations), and prosocial impact (the experience of making a positive difference in the lives of others through one’s work). Prior studies have highlighted the importance of prosocial motives, behaviors, and impact, and have enhanced our understanding of each of them. However, there has been little effort to systematically review and integrate these related lines of work in a way that furthers our understanding of prosociality in organizations. In this article, we provide an overview of the current state of the literature, highlight key findings, identify major research themes, and address important controversies and debates. We call for an expanded view of prosocial behavior and a sharper focus on the costs and unintended consequences of prosocial phenomena. We conclude by suggesting a number of avenues for future research that will address unanswered questions and should provide a more complete understanding of prosociality in the workplace.
You want to know which teams are at the forefront of analytics? Just look around at the teams still playing.
Once upon a time, there were the Oakland Athletics and a sacred tome called "Moneyball." It was about baseball teams winning with statistics. Only it wasn't about that at all. It was about market inefficiency. Then John Henry bought the Boston Red Sox, hired Bill James, made Theo Epstein his general manager, and Moneyball spread to a big market.
We're several iterations past all of that. Things move fast in technology, so fast that it can carry even a tradition-based industry like baseball into the digital age. These days, every team is playing Moneyball. All of them, as in 30 for 30.
"At this point, I think everyone assumes that their counterpart is smart," Brewers general manager David Stearns said. "And everyone is doing what they can do to unearth competitive advantages." To call it Moneyball is not right, either. Michael Lewis is still turning out ground-breaking work, but to fully capture what is happening in big league front offices, circa 2018, the next inside look at analytics and baseball would need to be authored by someone like the late Stephen Hawking. It's hard to say what you'd call it. "The Singularity" has already been taken.
The University of Arizona is tracking freshman students’ ID card swipes to anticipate which students are more likely to drop out. University researchers hope to use the data to lower dropout rates. (Dropping out refers to those who have left higher education entirely and those who transfer to other colleges.)
The card data tells researchers how frequently a student has entered a residence hall, library, and the student recreation center, which includes a salon, convenience store, mail room, and movie theater. The cards are also used for buying vending machine snacks and more, putting the total number of locations near 700. There’s a sensor embedded in the CatCard student IDs, which are given to every student attending the university.
“By getting their digital traces, you can explore their patterns of movement, behavior and interactions, and that tells you a great deal about them,” Sudha Ram, a professor of management information systems who directs the initiative, said in a press release.
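The excerpt describes the prediction pipeline only at a high level. As a purely illustrative sketch (not the University of Arizona's actual model), the idea of scoring dropout risk from swipe-derived activity features can be shown with a tiny hand-rolled logistic regression; every feature name, weight, and label below is invented for illustration:

```python
# Illustrative only: score dropout risk from hypothetical ID-card
# swipe features with a minimal logistic regression (no libraries).
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient-descent logistic regression."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi  # gradient of log-loss w.r.t. the logit
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical weekly features per student:
# [library visits, rec-center visits, distinct locations swiped]
X = [
    [5.0, 2.0, 12.0],  # highly engaged on campus
    [4.0, 3.0, 10.0],
    [0.5, 0.0, 2.0],   # rarely on campus
    [1.0, 0.5, 3.0],
]
y = [0, 0, 1, 1]  # toy labels: 1 = dropped out

w, b = train_logistic(X, y)
# Score a new low-activity student (invented values).
risk = sigmoid(sum(wj * xj for wj, xj in zip(w, [0.8, 0.2, 2.5])) + b)
print(f"estimated dropout risk: {risk:.2f}")
```

The point of the sketch is only that infrequent, irregular campus activity maps to a higher predicted risk; a real system would use far richer features and validation.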
The modern workplace is awash in meetings, many of which are terrible. As a result, people mostly hate going to meetings. The problem is this: The whole point of meetings is to have discussions that you can’t have any other way. And yet most meetings are devoid of real debate.
To improve the meetings you run, and save the meetings you’re invited to, focus on making the discussion more robust.
When teams have a good fight during meetings, team members debate the issues, consider alternatives, challenge one another, listen to minority views, and scrutinize assumptions. Every participant can speak up without fear of retribution. However, many people shy away from such conflict, conflating disagreement and debate with personal attacks. In reality, this sort of friction produces the best decisions. In my study of 5,000 managers and employees, published in my recent book, I found that the best performers are really good at generating rigorous discussions in team meetings. (The sample includes senior and junior managers and individual contributors from a range of industries in corporate America; my aim was to statistically identify work habits that correlate with higher performance.)
So how do you lead a good fight in meetings? Here are six practical tips:
During Jeff Immelt’s 16 years as CEO, GE radically changed its mix of businesses and its strategy.
Its focus—becoming a truly global, technology-driven industrial company that’s blazing the path for the internet of things—has had dramatic implications for the profile of its workforce. Currently, 50% of GE’s 300,000 employees have been with the company for five years or less, meaning that they may lack the personal networks needed to succeed and get ahead. The skills of GE’s workforce have been rapidly changing as well, largely because of the company’s ongoing transformation into a state-of-the-art digital industrial organization that excels at analytics. The good news is that GE has managed to attract thousands of digerati. The bad news is that they have little tolerance for the bureaucracy of a conventional multinational. As is the case with younger workers in general, they want to be in charge of their own careers and don’t want to depend solely on their bosses or HR to identify opportunities and figure out the training and experiences needed to pursue their professional goals.
What’s the solution to these challenges? GE hopes it’s HR analytics. “We need a set of complementary technologies that can take a company that’s in 180 countries around the world and make it small,” says James Gallman, who until recently was the GE executive responsible for people analytics and planning. The technologies he’s referring to are a set of self-service applications available to employees, leaders, and HR. All the apps are based on a generic matching algorithm built by data scientists at GE’s Global Research Center in conjunction with HR. “It’s GE’s version of Match.com,” quips Gallman. “It can take a person and match him or her to something else: online or conventional educational programs, another person, or a job.”
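The excerpt mentions a "generic matching algorithm" without describing it. Under the assumption (not stated in the article) that people and opportunities are represented as attribute vectors, one common way to sketch such matching is cosine similarity; every name, attribute, and weight below is hypothetical, not GE's actual system:

```python
# Hypothetical sketch of generic matching in the spirit of the
# "GE's version of Match.com" analogy: rank opportunities for a
# person by cosine similarity over shared attribute vectors.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Invented attribute axes: [analytics, software, leadership, domain]
person = [0.9, 0.7, 0.3, 0.5]

opportunities = {
    "data-science rotation": [1.0, 0.8, 0.2, 0.4],
    "plant-manager track":   [0.2, 0.1, 0.9, 0.8],
    "ML course (online)":    [0.9, 0.6, 0.0, 0.2],
}

# Best match first; the same scoring could rank courses, jobs,
# or potential mentors, as the article suggests.
ranked = sorted(opportunities.items(),
                key=lambda kv: cosine(person, kv[1]),
                reverse=True)
for name, vec in ranked:
    print(f"{name}: {cosine(person, vec):.3f}")
```

A single scoring function over a shared representation is what makes such a matcher "generic": the person stays fixed while the candidate set swaps between jobs, courses, and people.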
Do you think you know how to get the best from your people? Or do you actually know? How do investments in your employees actually affect workforce performance? Who are your top performers? How can you empower and motivate other employees to excel?
Leading-edge companies are increasingly adopting sophisticated methods of analyzing employee data to enhance their competitive advantage. Google, Best Buy, Sysco, and others are beginning to understand exactly how to ensure the highest productivity, engagement, and retention of top talent, and then replicating their successes. If you want better performance from your top employees—who are perhaps your greatest asset and your largest expense—you’ll do well to favor analytics over your gut instincts.
Harrah’s Entertainment is well-known for employing analytics to select customers with the greatest profit potential and to refine pricing and promotions for targeted segments. (See “Competing on Analytics,” HBR January 2006.) Harrah’s has also extended this approach to people decisions, using insights derived from data to put the right employees in the right jobs and creating models that calculate the optimal number of staff members to deal with customers at the front desk and other service points. Today the company uses analytics to hold itself accountable for the things that matter most to its staff, knowing that happier and healthier employees create better-satisfied guests.