Publications

2018
Joshua Geltzer and Dipayan Ghosh. 2018. “How Washington Can Prevent Midterm Election Interference.” Foreign Affairs. Publisher's Version. Abstract:

When the U.S. Department of Justice earlier this month announced indictments of 12 Russian intelligence officials for hacking the Democratic Party’s and Hillary Clinton’s e-mails in 2016, President Donald Trump’s first reaction was to blame the administration of Barack Obama for not taking action against the interference. “Why didn’t they do something about it, especially when it was reported that President Obama was informed by the FBI in September, before the Election?” he tweeted.

Trump will have no one to blame but his own administration, however, in the event of another such attack before the 2018 midterm elections. Faced with mounting evidence of continued Russian efforts “to undermine America’s democracy,” in the words of Director of National Intelligence Dan Coats, the administration needs to take concrete steps to prevent further foreign interference. For a moment last month, it looked as though this were starting to happen: reports…

Dipayan Ghosh and Jim Steyer. 2018. “Kids Shouldn’t Have to Sacrifice Privacy for Education.” The New York Times. Publisher's Version.
opinion_kids_shouldnt_have_to_sacrifice_privacy_for_education_-_the_new_york_times.pdf
Dipayan Ghosh. 2018. “The Market vs. Democracy: The tools that let companies place targeted ads online also help bad actors like Russia spread disinformation.” Slate. Publisher's Version. Abstract:

If you spend enough time browsing social media, there is a chance you saw an intriguing story shared and re-shared in recent days about how agents of NATO—a long-standing strategic alliance between the United States, Canada, the United Kingdom, and most of continental Europe west of Kharkiv, Ukraine—sprayed chemicals over Poland to damage the well-being of the local population. The original Polish-language account has been spread far and wide with great certitude. Given you are reading this Slate piece about internet-based disinformation, you may already suspect the truth: The Poland story is entirely fake. But would you have been so skeptical if you had seen it shared on social media by the people you trust most?

In recent days, researchers have shown that agents of the Russian government have pushed the Poland story—an example of pure disinformation in its most egregious form—on the most visible social media platforms. And though the long-standing chemtrails controversy has been verifiably (and repeatedly) debunked, many social media users continue to believe it, making them particularly vulnerable to the false story about chemicals sprayed on an unwitting population. We know that these sorts of conspiracy theories do not necessarily recede with time. Instead, they are often so intelligibly and inflammatorily recounted that they continue to spread, affecting susceptible readers who might not question their veracity or the motivations of their propagators.

combating_disinformation_online_pits_the_market_against_democracy_.pdf
Dipayan Ghosh and Ben Scott. 2018. “Russia's Election Interference Is Digital Marketing 101.” The Atlantic. Publisher's Version. Abstract:

Last Friday, the Justice Department charged 13 Russians with attempting to subvert the 2016 U.S. presidential elections. The case presented by Special Counsel Robert Mueller laid out an elaborate scheme of information operations, carried out primarily via the social media websites Facebook, Instagram, and Twitter. Through the Internet Research Agency, a so-called “troll factory” in St. Petersburg, the Russians created hundreds of fake accounts on these services, which then disseminated fake news and other misleading content about Democratic candidate Hillary Clinton to hundreds of thousands of users. They focused their campaign on topics that divide America—race, immigration, and religion—and targeted battleground states. According to figures reported by Facebook and Twitter, the Russian campaign reached more than 125 million Americans on Facebook; over 675,000 people engaged with Russian trolls on Twitter. The Russians’ effort is, of course, ongoing.

Thus far, the media coverage of Mueller’s indictment has fixated on how all this could have happened, and probed whether the Trump campaign was involved. The answers to these questions will all emerge in time. The more troubling question is why it was so easy to make fools out of so many Americans.

Consider two things. First: While the Russians created fake accounts to pose as Americans on social media and buy ads, the technologies they deployed are all commonplace in the digital-marketing industry—this was no 007-style spycraft. Second: These days, Americans live in divisive, partisan information environments, chock-full of incendiary rhetoric. They have very low standards about the sources they accept as accurate, and yet aren’t great at parsing fact from fiction on the Internet. Even “digital natives”— young people most at home in an online information environment—have proven inept at judging credibility. In other words, when the Russians set out to poison American politics, they were pushing on an open door.

russias_election_interference_is_digital_marketing_101_-_the_atlantic.pdf
Joshua Geltzer and Dipayan Ghosh. 2018. “Tech Companies Are Ruining America’s Image.” Foreign Policy. Publisher's Version. Abstract:

Not long ago, Americans used to worry — constantly and loudly — about what their country’s main cultural export was and what it said about them. In the 1990s, after the Iron Curtain came down, many Americans wondered whether the appealing lifestyles the world saw on U.S. sitcoms and blockbusters deserved some credit for energizing global resistance to communism. Then, as the optimism of the ’90s gave way to the shock and horror of 9/11, Americans asked, with palpable chagrin, whether the materialism and vulgarity of their TV shows and movies were contributing to the virulent anti-Americanism that had spread throughout much of the globe.

These lines of inquiry helped Americans better understand how they were seen and treated by others, including foreign governments, corporations, and populaces. Of course, they also betrayed a certain self-regard, if not self-satisfaction — traits less common among Americans today in a fast-changing world environment that has challenged their traditional cultural dominance. Hollywood is still churning out blockbusters of course, but it is losing its influence year upon year. Similarly, American television shows have gradually lost their stranglehold on prime time in foreign markets and are increasingly forced to give way to local content.

tech_companies_are_ruining_americas_image_-_foreign_policy.pdf
Dipayan Ghosh. 2018. “The Tightrope Google Has to Walk in China.” Harvard Business Review. Publisher's Version. Abstract:

With over 1.3 billion people, the Chinese consumer market is a tempting target for Western technology companies. Of course, it’s also a risky place to do business. The recent news that Google is considering a re-entry into China further highlights a troubling balancing act faced by technology companies looking to do business there. The company last entered China in 2006 with a censored search engine, but pulled the plug on the operation four years later after it discovered that human-rights activists’ Gmail accounts had been hacked. While the economic opportunity in re-entering China could be massive for the firm, there are very real dangers for Google or any internet firm in underestimating the threat posed by Chinese meddling.

Dipayan Ghosh. 2018. “What is microtargeting and what is it doing in our politics?” In Internet Citizen. Mozilla. Publisher's Version. Abstract:

The rise of the digital media ecosystem – with internet search engines, over-the-top video services, social media networks, and web-based news outlets all simultaneously vying for our collective attention – has revolutionized the way the average American consumes information today. This new media regime increasingly influences every aspect of our society, from how we educate our kids to which products we choose to buy. And critically, one feature of our modern information diet is a practice known as microtargeting, which, among its many commercial and noncommercial uses, continues to change the way American politics works.

what_is_microtargeting_and_what_is_it_doing_in_our_politics_-_internet_citizen.pdf
Dipayan Ghosh. 2018. “What You Need to Know About California’s New Data Privacy Law.” Harvard Business Review. Publisher's Version. Abstract:

Late last month, California passed a sweeping consumer privacy law that might force significant changes on companies that deal in personal data — and especially those operating in the digital space. The law’s passage comes on the heels of a few days of intense negotiation among privacy advocates, technology startups, network providers, Silicon Valley internet companies, and others. Those discussions have resulted in what many are describing as a landmark policy constituting the most stringent data protection regime in the United States.

Much of the political impetus behind the law’s passage came from some major privacy scandals that have come to light in recent months, including the Cambridge Analytica incident involving Facebook user data. This and other news drove public support for a privacy ballot initiative that, had the state’s residents passed it in November, would have instituted an even stricter data protection regime on companies that deal in consumer data. But after intense negotiation, especially from leading internet companies and internet service providers, the backers of the ballot initiative agreed to drop it and instead support the passage of the law.

The new law — the California Consumer Privacy Act, A.B. 375 — affords California residents an array of new rights, starting with the right to be informed about what kinds of personal data companies have collected and why it was collected. Among other novel protections, the law stipulates that consumers have the right to request the deletion of personal information, opt out of the sale of personal information, and access the personal information in a “readily useable format” that enables its transfer to third parties without hindrance.

what_you_need_to_know_about_californias_new_data_privacy_law.pdf
Dipayan Ghosh. 2018. “Yes, Silicon Valley needs regulation. But Trump’s reason why is misguided.” The Guardian. Publisher's Version. Abstract:

In a head-turning move that has pitted him squarely against Silicon Valley’s most revered companies, Donald Trump proclaimed last week that algorithms developed by the likes of Google and Facebook fail to offer consumers politically balanced news about American politics and his presidency itself.

The underlying insinuation was that firms like Google, in designing features like search engine results pages and the algorithms that power them, perpetuate a kind of bias against conservative media in the US. His message, in all its brashness, was very clear: that there is an insidious suppression of certain kinds of US news outlets, and that should the internet platform companies fail to address it, the president himself will do so through the power vested in him – including, potentially, by levying heavy-handed regulation.

yes_silicon_valley_needs_regulation._but_trumps_reason_why_is_misguided_dipayan_ghosh_opinion_the_guardian.pdf
Dipayan Ghosh and Ben Scott. 2018. “Digital Deceit: The Technologies Behind Precision Propaganda on the Internet.” New America. Publisher's Version. Abstract:

Over the past year, there has been rising pressure on Facebook, Google and Twitter to account for how bad actors are exploiting their platforms. The catalyst of this so-called “tech-lash” was the revelation last summer that agents of the Russian government engaged in disinformation operations using these services to influence the 2016 presidential campaigns.

The investigation into the Russian operation pulled back the curtain on a modern Internet marketplace that enables widespread disinformation over online channels. Questionable digital advertisements, social media bots, and viral Internet memes carrying toxic messages have featured heavily in the news. But we have only begun to scratch the surface of a much larger ecosystem of digital advertising and marketing technologies. To truly address the specter of future nefarious interventions in the American political process, we need to broaden the lens and assess all of the tools available to online commercial advertisers. Disinformation operators in the future will replicate all of these techniques, using the full suite of platforms and technologies. These tools grow more powerful all the time as new advances in algorithmic technologies and artificial intelligence are integrated into the marketplace for digital marketing and advertising.

The central problem of disinformation corrupting American political culture is not Russian spies or a particular social media platform. The central problem is that the entire industry is built to leverage sophisticated technology to aggregate user attention and sell advertising. There is an alignment of interests between advertisers and the platforms. And disinformation operators are typically indistinguishable from any other advertiser. Any viable policy solutions must start here.

To inform and support this important public debate, this paper analyzes the technologies of digital advertising and marketing in order to deepen our understanding of precision propaganda.

Our paper concludes with a series of recommendations to guide corporate reform, consumer empowerment, and new public policy development. Current efforts to promote transparency in the advertising ecosystem are important steps. But these are only the first moves in a long and difficult challenge. We offer a set of principles to guide the path forward as well as starting points for potential regulatory intervention. These include changes to election law, data privacy protections, and competition policy. The nature of this crisis in media and democracy requires an ambitious approach to reform, from Silicon Valley C-suites to Capitol Hill to the handsets of everyday internet users. American political resilience has, through the ages, hinged on our implicit commitment that markets must take a backseat to democracy.

    digital-deceit-final-v3.pdf
    Dipayan Ghosh and Ben Scott. 2018. “Disinformation Is Becoming Unstoppable.” TIME. Publisher's Version. Abstract:

    We are in the midst of a “tech-lash.” For months, the leading Internet companies have faced a wave of criticism sparked by revelations that they unwittingly enabled the spread of Russian disinformation that distorted the 2016 election. They are now beginning to listen. Recently, Facebook responded when chief executive Mark Zuckerberg announced that his company is revamping its flagship News Feed service: The algorithm powering it will now prioritize content shared by your friends and family over news stories and viral videos. The company followed up by announcing it will survey users and potentially relegate untrusted outlets.

    facebook_and_fake_news_disinformation_may_be_unstoppable_time.pdf
    Dawn Schrader and Dipayan Ghosh. 2018. “Proactively Protecting Against the Singularity: Ethical Decision Making in AI.” IEEE Security & Privacy. Publisher's Version. Abstract:
    This article proposes an ethical framework for the development and implementation of artificial intelligence, based on philosophical principles and perspectives that uphold human rights and well-being above a potential superior machine intelligence referred to as the singularity. Illustrative cases demonstrate the framework's application and suggest guidelines for future policy.
    2017
    Dipayan Ghosh. 2017. “AI is the future of hiring, but it's far from immune to bias.” Quartz. Publisher's Version. Abstract:

    The injection of AI into the recruiting industry is exciting for job seekers and firms alike. But it is precisely at this time, when many new players are exploring the tremendous opportunities at hand, that engineers and policymakers must be doubly cautious to ensure that ethical standards are adhered to in the development of new hiring technologies powered by AI.

    The starkest and most concerning issue is algorithmic discrimination, which can unwittingly be propagated through AI, particularly if its designers are not careful in how they select input data and how they craft the underlying algorithms.

    We already know that humans can make biased decisions in hiring contexts. In one widely cited NBER experiment, recruiters reviewed resumes that featured both “white-sounding” and “black-sounding” names. Even though both groups were similarly credentialed on paper, the recruiters more often selected the former group. Imagine a situation in which such a recruiting policy is coded into a decision-making algorithm. This situation is entirely feasible, particularly if the policy leads to profitable results for the employing client despite its implicit bias.

    The recruiting industry is replete with arbitrary measures of competence and qualifications, each of which can perpetuate bias in its own right. Enterprise, the car rental company, for example, uses tools provided by software firm iCIMS to check whether candidates meet minimum requirements including a bachelor’s degree and some form of leadership experience. Such bright-line conditions aided by software can limit opportunities for deserving people who do not satisfy them but would otherwise perform the work well, potentially perpetuating bias.
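
    As a toy sketch of how that could play out (synthetic data and a generic off-the-shelf classifier, not any vendor's actual system): a model trained on historical screening decisions that penalized one group will reproduce the gap even when qualifications are identical across groups.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 2000
        group = rng.integers(0, 2, n)             # proxy attribute (e.g., inferred from a name)
        skill = rng.normal(0.0, 1.0, n)           # qualifications, identically distributed in both groups
        # Historical (biased) callback decisions: same skill standard, but group 1 was penalized.
        callback = (skill - 0.8 * group + rng.normal(0.0, 0.5, n)) > 0

        X = np.column_stack([skill, group])
        model = LogisticRegression().fit(X, callback)

        for g in (0, 1):
            rate = model.predict(X[group == g]).mean()
            print(f"group {g}: predicted callback rate = {rate:.2f}")
        # Despite identical skill distributions, the model replays the historical disparity.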

    ai_in_hiring_can_lead_to_algorithmic_bias_-_quartz_at_work.pdf
    Dipayan Ghosh. 2017. “Apple’s Dangerous Market Grab in China.” The New York Times. Publisher's Version. Abstract:

    Apple announced last week that it will open a data center in Guizhou, China. This is a first-of-its-kind action by a major United States tech company since the passage last month of strict new Chinese digital commerce regulations that require foreign companies with operations in China to store users’ data in the country. These events could threaten to disrupt the free flow of information over the internet.

    opinion_apples_dangerous_market_grab_in_china_-_the_new_york_times.pdf
    2015
    Jonathan Tse, Dawn E. Schrader, Dipayan Ghosh, Tony Liao, and David Lundie. 2015. “A bibliometric analysis of privacy and ethics in IEEE Security and Privacy.” Ethics and Information Technology. Publisher's Version. Abstract:

    The increasingly ubiquitous use of technology has led to the concomitant rise of intensified data collection and the ethical issues associated with the privacy and security of that data. In order to address the question of how these ethical concerns are discussed in the literature surrounding the subject, we examined articles published in IEEE Security and Privacy, a magazine targeted towards a general, technically-oriented readership spanning both academia and industry. Our investigation of the intersection between the ethical and technological dimensions of privacy and security is structured as a bibliometric analysis. Our dataset covers all articles published in IEEE Security and Privacy from its inception in 2003 to February 6, 2014. This venue was chosen not only because of its target readership, but also because a preliminary search of keywords related to ethics, privacy, and security topics in the ISI Web of Knowledge and IEEE Xplore indicated that IEEE Security and Privacy has published a preponderance of articles matching those topics. In fact, our search returned two-fold more articles for IEEE Security and Privacy than the next most prolific venue. These reasons, coupled with the fact that both academia and industry are well represented in the authorship of articles, make IEEE Security and Privacy an excellent candidate for bibliometric analysis. Our analysis examines the ways articles in IEEE Security and Privacy relate ethics to information technology. Such articles can influence the development of law, policy, and the future of information technology ethics. We employed thematic and JK-biplot analyses of content relating privacy and ethics and found eight dominant themes as well as the inter-theme relationships. Authors and institutional affiliations were examined to discern whether centers of research activity and/or authors dominated the overall field or thematic areas. Results suggest avenues for future work in critical areas, especially for closing present gaps in the coverage of ethics and information technology privacy and security themes, particularly in the areas of ethics and privacy awareness.

    a_bibliometric_analysis_of_privacy_and_e.pdf
    2013
    Dipayan Ghosh, Robert J. Thomas, and Stephen B. Wicker. 2013. “A Privacy-Aware Design for the Vehicle-to-Grid Framework.” 2013 46th Hawaii International Conference on System Sciences. Publisher's Version. Abstract:

    The vehicle-to-grid (V2G) framework proposes integration of battery-powered electric drive vehicles into the grid, enabling them to be recharged as necessary and to act as suppliers in the ancillary service electricity markets. This is expected to create incentives for the production and adoption of electric vehicles in the automotive industry. V2G frameworks require that the utility company or a third party aggregator has access to each vehicle's charging status via a two-way communication network for billing and planning purposes. We establish that there exist consumer privacy risks associated with current concepts for V2G implementation and argue that consumer preferences and behaviors can be inferred from charging information if privacy is not a primary concern from the outset of V2G design. Finally, we outline a privacy-aware architecture for V2G systems.
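
    As a toy illustration of the inference risk (hypothetical data and thresholds, not the paper's architecture), even a coarse per-vehicle charging log lets an aggregator guess when a driver typically gets home and which days the household was away:

        from collections import Counter
        from datetime import datetime, timedelta

        # Hypothetical per-vehicle log an aggregator might hold: (session start, kWh delivered).
        sessions = [("2013-03-04 18:42", 9.5), ("2013-03-05 18:31", 8.7),
                    ("2013-03-06 23:10", 3.2), ("2013-03-08 18:55", 9.9)]

        starts = [datetime.strptime(t, "%Y-%m-%d %H:%M") for t, _ in sessions]

        # The most common plug-in hour approximates when the driver arrives home.
        print("most common plug-in hour:", Counter(dt.hour for dt in starts).most_common(1))

        # Days in the observed span with no home charging suggest travel or an empty house.
        charged = {dt.date() for dt in starts}
        day, last = min(charged), max(charged)
        while day <= last:
            if day not in charged:
                print("no charging on", day, "- vehicle likely away")
            day += timedelta(days=1)

    A privacy-aware design has to prevent exactly this kind of secondary inference from billing-grade charging data.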

    2012
    Dipayan Ghosh, Dawn E. Schrader, William D. Schulze, and Stephen B. Wicker. 2012. “Economic analysis of privacy-aware Advanced Metering Infrastructure adoption.” 2012 IEEE PES Innovative Smart Grid Technologies (ISGT). Publisher's Version. Abstract:

    Demand response systems primarily seek to reduce demand levels during periods of high load and increase demand as necessary in the off-peak hours. The objective of this flattening of the demand curve is to curb the need for generators to frequently ramp up or down and to reduce peak load levels. This, in turn, would potentially decrease the aggregate production cost of electricity. One of the most effective known methods of accomplishing this is to use Advanced Metering Infrastructure (AMI), an intelligent metering technology that collects temporally precise consumer electricity usage data and relays it to the local utility. Because AMI modules collect fine-granularity consumer data, a significant threat to consumer privacy exists, as this data can be shared or sold by the utility to interested third parties. A privacy-aware AMI module can be used to avoid this inherent danger by protecting an individual consumer's data using public key infrastructure. However, while privacy-aware AMI would be preferred by consumers, utilities would naturally prefer non-privacy-aware modules, as they could profit from the sale of consumer usage data. Therefore, it is not clear what regulatory structure should be implemented in determining which type of AMI to offer consumers. In this paper, we examine two possible regulatory regimes using consumer decision theory and determine the economic conditions required for privacy-aware AMI adoption at equilibrium under both regimes. Finally, we predict the privacy-aware AMI adoption rates for each regime and provide regulatory recommendations.
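
    As a rough sketch of the kind of adoption condition at stake (simplified notation of my own, not the paper's model): let v denote a consumer's monetized value of keeping usage data private, \Delta p the premium charged for a privacy-aware meter, and r the per-customer revenue the utility forgoes by not selling usage data. Then, loosely,

        v \ge \Delta p \quad\Rightarrow\quad \text{the consumer prefers the privacy-aware module,}
        \Delta p \ge r \quad\Rightarrow\quad \text{the utility is willing to offer it without a mandate,}

    so an unregulated equilibrium with privacy-aware AMI adoption requires v \ge \Delta p \ge r; when that chain breaks, regulation (for example, restricting data sales or subsidizing the premium) has to close the gap.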

    2011
    Dipayan Ghosh and Peter B. Luh. 2011. “Analysis and simulation of payment cost minimization and bid cost minimization with strategic bidders.” 2011 IEEE/PES Power Systems Conference and Exposition. Publisher's Version. Abstract:

    Presently, Independent System Operators (ISOs) in deregulated electricity markets in the U.S. use an auction method that minimizes the total bid cost when determining units to be on and their generation levels (Bid Cost Minimization or BCM). It has recently been shown that this method of auction does not provide minimal consumer payment costs for a given set of bids under the Market Clearing Price (MCP)-based or congestion-dependent Locational Marginal Price (LMP) scheme. Instead, an alternative auction method that directly minimizes the total consumer payment (Payment Cost Minimization or PCM) provides the payment-minimizing auction selection. Though the use of PCM minimizes consumer payments for a given set of bids for a day, it has not been fully illustrated that PCM minimizes payments over a longer time period with intelligent bidders that can adjust to the new auction mechanism. To address this important economic issue, a novel discrete game theoretic method of bidder behavior is used to model the competitive nature of generating companies in the day-ahead market. Numerical testing results show that PCM significantly reduces consumer payments with intelligent bidders. Simulation using a market simulator is also presented with similar results. Finally, insight into the potential benefits of PCM is briefly presented.
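
    A stylized single-period, single-bus version of the two objectives may help fix ideas (illustrative notation of my own, ignoring startup costs, transmission constraints, and uplift payments): with offer prices b_i, dispatch levels p_i, commitment variables x_i, and demand D,

        \text{BCM:} \quad \min_{x,\,p}\ \sum_i b_i\, p_i\, x_i \qquad \text{s.t.}\ \sum_i p_i\, x_i = D,
        \text{PCM:} \quad \min_{x,\,p}\ \mathrm{MCP}\cdot D, \qquad \mathrm{MCP} = \max_{i:\ x_i = 1,\ p_i > 0} b_i, \qquad \text{s.t.}\ \sum_i p_i\, x_i = D.

    Under uniform MCP pricing, consumers pay for every accepted unit at the MCP, so minimizing the bid cost \sum_i b_i p_i x_i is not the same as minimizing the payment \mathrm{MCP}\cdot D; the question studied here is whether PCM's advantage survives once bidders adjust their offers strategically.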

    Dipayan Ghosh, Stephen B. Wicker, and Lawrence E. Blume. 2011. “Game theoretic analysis of privacy-aware Advanced Metering Infrastructure.” 2011 2nd IEEE PES International Conference and Exhibition on Innovative Smart Grid Technologies. Publisher's Version. Abstract:

    Demand response systems seek to flatten the demand for electricity by providing real-time pricing to consumers to motivate avoidance of power-intensive tasks when rates are high. Advanced Metering Infrastructure (AMI) has been developed to facilitate this process, allowing for billing that applies fine-grained prices to fine-grained consumption data. But AMI also presents a unique privacy risk to consumers - fine-grained consumption data reveals a great deal about the behaviour, beliefs, and preferences of consumers. Such information is of interest to third parties, further exacerbating the privacy risk. This suggests a need for AMI to be developed from a privacy-aware perspective. Adopting a game-theoretic model, we consider utilities that offer both privacy-aware and non-privacy-aware AMI. A non-cooperative game is developed in which a representative consumer strategizes against the utility. The regulatory measures required for the desired privacy-facilitating Nash equilibrium of the game are discussed, and recommendations for policymakers are presented. In particular, it is found that a combination of regulation and consumer awareness must overcome the financial benefit arising from the sale of consumption information to third parties.
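
    A minimal sketch of the style of analysis (a made-up 2x2 game with hypothetical payoffs, not the payoff structure used in the paper): enumerate the pure-strategy Nash equilibria and watch how a regulatory penalty on data sales shifts the outcome toward the privacy-aware offering.

        def pure_nash(U, C):
            """Pure-strategy Nash equilibria of a 2x2 game; U and C are the payoffs to the
            utility (row player) and the consumer (column player), respectively."""
            return [(i, j) for i in range(2) for j in range(2)
                    if U[i][j] >= U[1 - i][j] and C[i][j] >= C[i][1 - j]]

        # Rows (utility): 0 = offer privacy-aware AMI, 1 = offer standard AMI and sell the data.
        # Columns (consumer): 0 = adopt the meter, 1 = resist adoption.
        U = [[3, 0], [5, 0]]      # hypothetical: selling data earns the utility extra revenue
        C = [[4, 1], [2, 1]]      # hypothetical: the consumer values both privacy and AMI's benefits

        print("no regulation:", pure_nash(U, C))    # -> [(1, 0)]: data-selling AMI is the equilibrium
        fine = 3                                    # regulatory penalty on selling consumption data
        U_reg = [[3, 0], [5 - fine, 0]]
        print("with a fine:", pure_nash(U_reg, C))  # -> [(0, 0)]: privacy-aware AMI is adopted

    In the abstract's terms, it takes a combination of regulation and a sufficiently high consumer valuation of privacy to overcome the utility's financial benefit from selling consumption data.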
