algorithmic bias

42 results


pages: 346 words: 97,890

The Road to Conscious Machines by Michael Wooldridge

Ada Lovelace, AI winter, algorithmic bias, Andrew Wiles, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, basic income, British Empire, call centre, combinatorial explosion, computer vision, DARPA: Urban Challenge, don't be evil, Donald Trump, Elon Musk, Eratosthenes, factory automation, future of work, gig economy, Google Glasses, intangible asset, James Watt: steam engine, job automation, John von Neumann, Loebner Prize, Minecraft, Nash equilibrium, Norbert Wiener, NP-complete, P = NP, pattern recognition, RAND corporation, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stephen Hawking, Steven Pinker, strong AI, technological singularity, telemarketer, Tesla Model S, The Coming Technological Singularity, The Future of Employment, the scientific method, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Turing test, universal basic income, Von Neumann architecture

But the good news is that at least there are signs that governments are willing to try.

Algorithmic Bias

We might hope that AI systems would be free of the prejudices and biases that plague the human world, but I’m afraid that is not the case. Over the past decade, as machine learning systems have been rolled out into more and more areas of application, we have begun to understand how automated decision-making systems can exhibit algorithmic bias. It is now a major research field, with many groups struggling to understand the problems it raises, and how to avoid it. Algorithmic bias, as the name suggests, is concerned with situations in which a computer program – not just AI systems, but any computer program – exhibits bias of some form in its decision-making.

This takes us into a consideration of how AI will affect the nature of work, and the possibility that AI-governed employment will be alienating. This, in turn, leads us to consider the impact that the use of AI technologies might have on human rights, and the possibility of lethal autonomous weapons. We’ll then consider the emergence of algorithmic bias, and the issues surrounding lack of diversity in AI, as well as the phenomena of fake news and fake AI.

Employment and Unemployment

‘Robots will take our jobs. We’d better plan now, before it’s too late.’ –– Guardian, 2018

After the Terminator narrative, probably the most widely discussed and widely feared aspect of AI is how it will affect the future of work and, in particular, the potential it has to put people out of work.

For example, imagine that, in the banking example above, the key feature of the data that you chose to train your program on was … racial origin. Then it would be no surprise if the resulting program made hopelessly biased decisions about who should get a mortgage. (You don’t think a bank would be stupid enough to do something like this, do you? Just you wait.) Algorithmic bias is a particularly prominent issue at present because, as we saw, one feature of the current wave of AI systems is that they are ‘black boxes’: they cannot explain or rationalize the decisions they make in the way that a person can. This problem is exacerbated if we place too much trust in the systems we build – and there is anecdotal evidence that we do exactly that with AI systems.
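Wooldridge's banking scenario can be sketched in a few lines. This is a toy illustration of my own, not code from the book: the synthetic decision history and the naive majority-vote "learner" are both assumptions, chosen only to show that a model trained on decisions driven by a protected attribute simply reproduces the historical prejudice.

```python
# Hypothetical sketch (not from the book): a toy learner trained on
# historical mortgage decisions in which a protected attribute drove outcomes.
import random
from collections import Counter, defaultdict

random.seed(0)

# Synthetic history: group "A" was approved 90% of the time, group "B" 20%,
# regardless of income -- an assumed stand-in for biased past decisions.
history = []
for _ in range(1000):
    group = random.choice(["A", "B"])
    income = random.gauss(50, 15)             # income is genuinely informative...
    p_approve = 0.9 if group == "A" else 0.2  # ...but the label ignores it
    history.append((group, income, random.random() < p_approve))

# A deliberately naive learner: predict the majority historical outcome
# for each value of the protected attribute.
votes = defaultdict(Counter)
for group, _income, approved in history:
    votes[group][approved] += 1
model = {g: c.most_common(1)[0][0] for g, c in votes.items()}

print(model)  # the "trained" rule is just the historical prejudice, codified
```

The point survives any choice of learning algorithm: if the labels encode a prejudiced decision process and the protected attribute is available as a feature, the cheapest way to fit the data is to memorize the prejudice.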


pages: 345 words: 92,063

Power, for All: How It Really Works and Why It's Everyone's Business by Julie Battilana, Tiziana Casciaro

affirmative action, agricultural Revolution, Albert Einstein, algorithmic bias, Asperger Syndrome, blood diamonds, Boris Johnson, British Empire, call centre, Cass Sunstein, clean water, cognitive dissonance, collective bargaining, conceptual framework, coronavirus, Covid-19, COVID-19, different worldview, disinformation, Elon Musk, Erik Brynjolfsson, feminist movement, fundamental attribution error, future of work, gig economy, hiring and firing, impact investing, income inequality, informal economy, Intergovernmental Panel on Climate Change (IPCC), invention of movable type, Jeff Bezos, job satisfaction, Joshua Gans and Andrew Leigh, Mahatma Gandhi, means of production, mega-rich, meta-analysis, Milgram experiment, moral hazard, Naomi Klein, Nelson Mandela, Occupy movement, Panopticon Jeremy Bentham, principal–agent problem, profit maximization, Ralph Waldo Emerson, ride hailing / ride sharing, Second Machine Age, shareholder value, sharing economy, Shoshana Zuboff, Silicon Valley, Social Responsibility of Business Is to Increase Its Profits, Steven Pinker, surveillance capitalism, the scientific method, The Wisdom of Crowds, Tim Cook: Apple, transatlantic slave trade, union organizing, zero-sum game

House Committee on Science, Space, and Technology, she outlined key priorities related to AI, including the need to halt both governmental and commercial use of facial recognition in sensitive social and political contexts until the risks are fully studied and adequate regulations, such as biometric privacy laws and assessments to vet algorithms for bias, are in place.72 The latter will be particularly challenging, not only in terms of deciding where responsibility for assessing algorithms for bias should rest, but also because what constitutes a fair algorithm is a complex question, one that engineers, computer scientists, and legal scholars on the front lines of ethical AI development are asking with increasing urgency.73 But we do know some things about how algorithms function and where we are better equipped to exercise oversight.

The Workers Disagree,” Vox, November 21, 2018, https://www.vox.com/2018/11/21/18105719/google-walkout-real-change-organizers-protest-discrimination-kara-swisher-recode-decode-podcast. 72 Artificial Intelligence, Societal and Ethical Implications, Before the United States House of Representatives Committee on Science, Space, and Technology, 116th Cong. (2019) (statement of Meredith Whittaker, cofounder and codirector of AI Now Institute). 73 For example, Genie Barton, Nicol Turner-Lee, and Paul Resnick, “Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms,” Brookings, May 22, 2019, https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/; Sorelle A. Friedler et al., “A Comparative Study of Fairness Enhancing Interventions in Machine Learning,” in Proceedings of the Conference on Fairness, Accountability, and Transparency (2019): 329–38; Solon Barocas, Moritz Hardt, and Arvind Narayanan, Fairness and Machine Learning (fairmlbook.org, 2019). 74 Cynthia Dwork, “Skewed or Rescued?

In 2021, Australia passed a law requiring social media companies to pay for the journalism appearing on their platforms, despite their protestations—a landmark step toward restoring a measure of power for public-interest journalism.69 Judicial systems are also beginning to hold tech companies to account for algorithmic bias. In a watershed 2021 lawsuit brought by delivery riders against food app Deliveroo, a court in Bologna, Italy, ruled that even if an algorithm discriminates against workers unintentionally, a company can still be held liable and be forced to pay damages.70 It isn’t easy to regulate complex technologies that tend to evolve rapidly, however.


pages: 290 words: 73,000

Algorithms of Oppression: How Search Engines Reinforce Racism by Safiya Umoja Noble

A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, borderless world, cloud computing, conceptual framework, crowdsourcing, desegregation, disinformation, Donald Trump, Edward Snowden, Filter Bubble, Firefox, Google Earth, Google Glasses, housing crisis, illegal immigration, immigration reform, information retrieval, Internet Archive, Jaron Lanier, Mitch Kapor, Naomi Klein, new economy, PageRank, performance metric, phenotype, profit motive, Silicon Valley, Silicon Valley ideology, Snapchat, Tim Cook: Apple, union organizing, women in the workforce, yellow journalism

Helen Nissenbaum, a professor of media, culture, and communication and computer science at New York University, has written with Lucas Introna, a professor of organization, technology, and ethics at the Lancaster University Management School, about how search engines bias information toward the most powerful online. Their work was corroborated by Alejandro Diaz, who wrote his dissertation at Stanford on sociopolitical bias in Google’s products. Kate Crawford and Tarleton Gillespie, two researchers at Microsoft Research New England, have written extensively about algorithmic bias, and Crawford recently coorganized a summit with the White House and New York University for academics, industry, and activists concerned with the social impact of artificial intelligence in society. At that meeting, I participated in a working group on artificial-intelligence social inequality, where tremendous concern was raised about deep-machine-learning projects and software applications, including concern about furthering social injustice and structural racism.

The library practitioner Matthew Reidsma gave a recent gift to the profession when he blogged about library discovery systems, or search interfaces, that are just as troubled as commercial interfaces. In his blog post, he details the limitations of databases, the kinds of gender biases that are present in discovery tools, and how little innovation has been brought to bear in resolving some of the contradictions we know about.35 Figure 5.2. A call to the profession to address algorithmic bias in library discovery systems by Matthew Reidsma attempts to influence the field of information studies. Source: Reidsma, 2016. I sought to test the call that Reidsma made to the profession to interrogate library information management tools by conducting searches in a key library database.

., and Madden, M. (2015, March). Americans’ Privacy Strategies Post-Snowden. Pew Research Center. Retrieved from www.pewinternet.org. Rajagopal, I., and Bojin, N. (2002). Digital Representation: Racism on the World Wide Web. First Monday, 7(10). Retrieved from www.firstmonday.org. Reidsma, M. (2016, March 11). Algorithmic Bias in Library Discovery Systems. Matthew Reidsma’s blog. Retrieved from http://matthew.reidsrow.com/articles/173. Rifkin, J. (1995). The End of Work: The Decline of the Global Labor Force and the Dawn of the Post-Market Era. New York: Putnam. Rifkin, J. (2000). The Age of Access: The New Culture of Hypercapitalism, Where All of Life Is a Paid-For Experience.


pages: 370 words: 107,983

Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All by Robert Elliott Smith

Ada Lovelace, affirmative action, AI winter, Alfred Russel Wallace, algorithmic bias, Amazon Mechanical Turk, animal electricity, autonomous vehicles, Black Swan, British Empire, cellular automata, citizen journalism, Claude Shannon: information theory, combinatorial explosion, corporate personhood, correlation coefficient, crowdsourcing, Daniel Kahneman / Amos Tversky, desegregation, discovery of DNA, disinformation, Douglas Hofstadter, Elon Musk, Fellow of the Royal Society, feminist movement, Filter Bubble, Flash crash, Gerolamo Cardano, gig economy, Gödel, Escher, Bach, invention of the wheel, invisible hand, Jacquard loom, Jacques de Vaucanson, John Harrison: Longitude, John von Neumann, Kenneth Arrow, low skilled workers, Mark Zuckerberg, mass immigration, meta-analysis, mutually assured destruction, natural language processing, new economy, On the Economy of Machinery and Manufactures, p-value, pattern recognition, Paul Samuelson, performance metric, Pierre-Simon Laplace, precariat, profit maximization, profit motive, Silicon Valley, social intelligence, statistical model, Stephen Hawking, stochastic process, telemarketer, The Bell Curve by Richard Herrnstein and Charles Murray, The Future of Employment, the scientific method, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, theory of mind, Thomas Bayes, Thomas Malthus, traveling salesman, Turing machine, Turing test, twin studies, Vilfredo Pareto, Von Neumann architecture, women in the workforce, Yochai Benkler

Understood: Women’s Hearts Are Victims of a System that Is Ill-Equipped to Diagnose, Treat and Support Them: Heart & Stroke 2018 Heart Report. www.heartandstroke.ca/-/media/pdf-files/canada/2018-heart-month/hs_2018-heart-report_en.ashx?la=en&hash=B7E7C6225111EB4AECEE7EE729BFC050E2643082 Chapter 8 1Frederick S. Lane, 2009, American Privacy: The 400-Year History of Our Most Contested Right. Boston: Beacon Press. 2Aaron Pressman, 2018, How to Fight the Growing Scourge of Algorithmic Bias in AI. Fortune, http://fortune.com/2018/09/14/fight-algorithmic-bias-joy-buolamwini/ 3Mark Frauenfelder, 2017, Racist Soap Dispenser. BoingBoing, https://boingboing.net/2017/08/16/racist-soap-dispenser.html 4Jana Kasperkevic, 2015, Google Says Sorry for Racist Auto-Tag in Photo App. Guardian, www.theguardian.com/technology/2015/jul/01/google-sorry-racist-auto-tag-photo-app 5Tom Simonite, 2018, When It Comes to Gorillas, Google Photos Remains Blind.

People pick the features; it’s extremely difficult for those people to avoid biases in their selection, and all feature-based models of complex phenomena contain biases. Therefore, the amplification Eubanks suggests is not just about biases in data, it’s about the whole presumption of algorithmic understanding of complex human issues. Algorithmic bias is certainly a huge and hidden civil rights issue, because it will not be the wealthy who will have their job or loan applications assessed by a computer (or their child care, mortgage or medical insurance). It is not the white-collar criminal who will be evaluated en masse by an algorithm for parole, or white-collar suburbs that will be targeted as areas that must be patrolled by police cars and surveillance in a search for potential financial crimes.

While Wikipedia defines ‘fake news’ as a synonym for news satire, more recently it has become at worst a condemnation of real human journalism, and at best an umbrella term for the uncontrolled explosion of misleading information dispersed on social media. Mark Zuckerberg has pledged to make Facebook impede fake news. But it’s unclear how, as in the past his company has eliminated human editors and curators in favour of algorithms, ironically to ensure less bias in their presentation of news. As we’ve seen before, being able to determine what is true and what is fake is not a goal that algorithms can achieve easily. Furthermore, the war on fake news misses the point. Regardless of whether algorithms present ‘true’ or ‘fake’ news, they will still be working towards their primary directive: the maximization of value.


pages: 416 words: 112,268

Human Compatible: Artificial Intelligence and the Problem of Control by Stuart Russell

3D printing, Ada Lovelace, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Alfred Russel Wallace, algorithmic bias, Andrew Wiles, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, augmented reality, autonomous vehicles, basic income, blockchain, brain emulation, Cass Sunstein, Claude Shannon: information theory, complexity theory, computer vision, connected car, crowdsourcing, Daniel Kahneman / Amos Tversky, delayed gratification, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ernest Rutherford, Flash crash, full employment, future of work, Garrett Hardin, Gerolamo Cardano, ImageNet competition, Intergovernmental Panel on Climate Change (IPCC), Internet of things, invention of the wheel, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John Nash: game theory, John von Neumann, Kenneth Arrow, Kevin Kelly, Law of Accelerating Returns, Mark Zuckerberg, Nash equilibrium, Norbert Wiener, NP-complete, openstreetmap, P = NP, Pareto efficiency, Paul Samuelson, Pierre-Simon Laplace, positional goods, probability theory / Blaise Pascal / Pierre de Fermat, profit maximization, RAND corporation, random walk, Ray Kurzweil, recommendation engine, RFID, Richard Thaler, ride hailing / ride sharing, Robert Shiller, robotic process automation, Rodney Brooks, Second Machine Age, self-driving car, Shoshana Zuboff, Silicon Valley, smart cities, smart contracts, social intelligence, speech recognition, Stephen Hawking, Steven Pinker, superintelligent machines, surveillance capitalism, Thales of Miletus, The Future of Employment, The Theory of the Leisure Class by Thorstein Veblen, Thomas Bayes, Thorstein Veblen, Tragedy of the Commons, transport as a service, Turing machine, Turing test, universal basic income, uranium enrichment, Von Neumann architecture, Wall-E, Watson beat the top human players on Jeopardy!, web application, zero-sum game

Although this sounds admirable in principle, it remains to be seen—at least at the time of writing—how much impact this will have in practice. It is often so much easier, faster, and cheaper to leave the decisions to the machine. One reason for all the concern about automated decisions is the potential for algorithmic bias—the tendency of machine learning algorithms to produce inappropriately biased decisions about loans, housing, jobs, insurance, parole, sentencing, college admission, and so on. The explicit use of criteria such as race in these decisions has been illegal for decades in many countries and is prohibited by Article 9 of the GDPR for a very wide range of applications.

The EU’s GDPR is often said to provide a general “right to an explanation” for any automated decision,38 but the actual language of Article 14 merely requires meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the data subject. At present, it is unknown how courts will enforce this clause. It’s possible that the hapless consumer will just be handed a description of the particular deep learning algorithm used to train the classifier that made the decision. Nowadays, the likely causes of algorithmic bias lie in the data rather than in the deliberate malfeasance of corporations. In 2015, Glamour magazine reported a disappointing finding: “The first female Google image search result for ‘CEO’ appears TWELVE rows down—and it’s Barbie.” (There were some actual women in the 2018 results, but most of them were models portraying CEOs in generic stock photos, rather than actual female CEOs; the 2019 results are somewhat better.)

Fortunately, a good deal of attention has been paid to the problem of removing inadvertent bias from machine learning algorithms, and there are now methods that produce unbiased results according to several plausible and desirable definitions of fairness.39 The mathematical analysis of these definitions of fairness shows that they cannot be achieved simultaneously and that, when enforced, they result in lower prediction accuracy and, in the case of lending decisions, lower profit for the lender. This is perhaps disappointing, but at least it makes clear the trade-offs involved in avoiding algorithmic bias. One hopes that awareness of these methods and of the issue itself will spread quickly among policy makers, practitioners, and users. If handing authority over individual humans to machines is sometimes problematic, what about authority over lots of humans? That is, should we put machines in political and management roles?
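Russell's point that the standard fairness definitions cannot all hold at once can be seen with nothing more than arithmetic. A minimal sketch of my own (the 50% and 20% base rates are assumptions, not figures from the book): a perfect predictor satisfies equalized odds in both groups, yet necessarily violates demographic parity whenever base rates differ, so enforcing parity must introduce errors.

```python
# Toy illustration (mine, not Russell's) of incompatible fairness criteria.

def rates(labels, preds):
    """Return (positive prediction rate, true positive rate, false positive rate)."""
    pos = sum(preds) / len(preds)
    tpr = sum(p for p, y in zip(preds, labels) if y) / max(1, sum(labels))
    fpr = sum(p for p, y in zip(preds, labels) if not y) / max(1, len(labels) - sum(labels))
    return pos, tpr, fpr

# Assumed base rates: group A repays loans 50% of the time; group B, 20%.
labels_a = [1] * 50 + [0] * 50
labels_b = [1] * 20 + [0] * 80

# A perfect predictor (predict = true label) satisfies equalized odds
# (TPR = 1, FPR = 0 in both groups) but NOT demographic parity,
# because the positive prediction rates are 0.5 versus 0.2.
pos_a, tpr_a, fpr_a = rates(labels_a, labels_a)
pos_b, tpr_b, fpr_b = rates(labels_b, labels_b)
print(pos_a, pos_b)                      # unequal selection rates
print((tpr_a, fpr_a) == (tpr_b, fpr_b))  # equalized odds holds

# Forcing demographic parity (approve 50% in both groups) would require
# approving 30 group-B applicants who will default: accuracy must drop.
```

This is the trade-off the passage describes: any rule that equalizes selection rates across groups with different base rates must sacrifice either error-rate parity or prediction accuracy.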


pages: 161 words: 39,526

Applied Artificial Intelligence: A Handbook for Business Leaders by Mariya Yao, Adelyn Zhou, Marlene Jia

Airbnb, algorithmic bias, Amazon Web Services, artificial general intelligence, autonomous vehicles, backpropagation, business intelligence, business process, call centre, chief data officer, computer vision, conceptual framework, en.wikipedia.org, future of work, industrial robot, Internet of things, iterative process, Jeff Bezos, job automation, Marc Andreessen, natural language processing, new economy, pattern recognition, performance metric, price discrimination, randomized controlled trial, recommendation engine, robotic process automation, self-driving car, sentiment analysis, Silicon Valley, skunkworks, software is eating the world, source of truth, speech recognition, statistical model, strong AI, technological singularity, The future is already here

Finally, the use of technology—including AI, predictive analytics, automation, and social media bots—can have far-ranging social impact. AI can be used for illegal surveillance, propaganda, deception, and social manipulation. * * * (29) Yao, M. (2017, November 2). Fighting Algorithmic Bias & Homogenous Thinking in AI. TopBots. Retrieved from http://www.topbots.com/fightinghomogenous-thinking-algorithmic-bias-ai/ (30) Westervelt, E. (Contributor). (2017, August 18). Did A Bail Reform Algorithm Contribute To This San Francisco Man’s Murder? [Radio Broadcast Episode]. In Carline Watson (Executive Producer), All Things Considered.


pages: 506 words: 133,134

The Lonely Century: How Isolation Imperils Our Future by Noreena Hertz

"side hustle", Airbnb, airport security, algorithmic bias, Asian financial crisis, Bernie Sanders, big-box store, Broken windows theory, call centre, Capital in the Twenty-First Century by Thomas Piketty, car-free, Cass Sunstein, centre right, conceptual framework, Copley Medal, coronavirus, correlation does not imply causation, Covid-19, COVID-19, dark matter, deindustrialization, Diane Coyle, disinformation, Donald Trump, en.wikipedia.org, Erik Brynjolfsson, Fellow of the Royal Society, future of work, gender pay gap, gig economy, Gordon Gekko, greed is good, happiness index / gross national happiness, housing crisis, illegal immigration, independent contractor, industrial robot, Jane Jacobs, Jeff Bezos, job automation, job satisfaction, knowledge economy, labor-force participation, longitudinal study, low skilled workers, Lyft, Mark Zuckerberg, mass immigration, means of production, megacity, meta-analysis, move fast and break things, Network effects, new economy, Pepto Bismol, QWERTY keyboard, Ray Oldenburg, remote working, rent control, RFID, Ronald Reagan, San Francisco homelessness, Second Machine Age, Shoshana Zuboff, Silicon Valley, Skype, Snapchat, Social Responsibility of Business Is to Increase Its Profits, Steve Jobs, surveillance capitalism, TaskRabbit, The Death and Life of Great American Cities, The Future of Employment, The Great Good Place, The Wealth of Nations by Adam Smith, Tim Cook: Apple, Uber and Lyft, uber lyft, urban planning, Wall-E, WeWork, working poor

CHAPTER EIGHT: The Digital Whip 1 Robert Booth, ‘Unilever saves on recruiters by using AI to assess job interviews’, Guardian, 25 October 2019, https://www.theguardian.com/technology/2019/oct/25/unilever-saves-on-recruiters-by-using-ai-to-assess-job-interviews; The Harvey Nash HR Survey 2019, https://www.harveynash.com/hrsurvey/full-report/charts/#summary. 2 ‘HireVue surpasses ten million video interviews completed worldwide’, HireVue, 21 May 2019, https://www.hirevue.com/press-release/hirevue-surpasses-ten-million-video-interviews-completed-worldwide. 3 ‘EPIC Files Complaint with FTC about Employment Screening Firm HireVue’, Electronic Privacy Information Center, 6 November 2019, https://epic.org/2019/11/epic-files-complaint-with-ftc.html; see full complaint at https://epic.org/privacy/ftc/hirevue/EPIC_FTC_HireVue_Complaint.pdf. 4 Loren Larsen, ‘HireVue Assessments and Preventing Algorithmic Bias’, HireVue, 22 June 2018, https://www.hirevue.com/blog/hirevue-assessments-and-preventing-algorithmic-bias; cf. Emma Leech, ‘The perils of AI recruitment’, New Statesman, 14 August 2019, https://tech.newstatesman.com/emerging-technologies/ai-recruitment-algorithms-bias; Julius Schulte, ‘AI-assisted recruitment is biased. Here’s how to make it more fair’, World Economic Forum, 9 May 2019, https://www.weforum.org/agenda/2019/05/ai-assisted-recruitment-is-biased-heres-how-to-beat-it/. 5 Drew Harwell, ‘A face-scanning algorithm increasingly decides whether you deserve the job’, Washington Post, 6 November 2019, https://www.washingtonpost.com/technology/2019/10/22/ai-hiring-face-scanning-algorithm-increasingly-decides-whether-you-deserve-job/. 6 Reuters, ‘Amazon ditched AI recruiting tool that favoured men for technical jobs’, Guardian, 11 October 2018, https://www.theguardian.com/technology/2018/oct/10/amazon-hiring-ai-gender-bias-recruiting-engine. 
7 Kuba Krys et al., ‘Be Careful Where You Smile: Culture Shapes Judgments of Intelligence and Honesty of Smiling Individuals’, Journal of Nonverbal Behavior 40 (2016), 101–16, https://doi.org/10.1007/s10919-015-0226-4. These assumptions, reflected in proverbs and stereotypes, are now backed up by quantitative analysis of 44 countries. 8 The broad theory holds that in countries with more historical diversity – i.e. populated by a large proportion of immigrants who may not share languages or cultural norms – smiles are more expected and used as social currency; see Khazan, ‘Why Americans smile so much’, The Atlantic, 3 May 2017, https://www.theatlantic.com/science/archive/2017/05/why-americans-smile-so-much/524967/. 9 The retail giant’s eventual closure of its German stores was related, analysts speculate, to its inability to adapt to different cultural expectations; Mark Landler and Michael Barbaro, ‘Wal-Mart Finds That Its Formula Doesn’t Fit Every Culture’, New York Times, 2 August 2006, https://www.nytimes.com/2006/08/02/business/worldbusiness/02walmart.html; see also Khazan, ‘Why Americans smile so much’. 10 Implied in this statement on its website under FAQ for interviewees: ‘Customer-facing jobs, like a bank teller role, do require a degree of friendliness and attention to other people.

Platforms like Uber, Fiverr and TaskRabbit that use ratings should be mandated to audit their mechanisms, identify potential biases and recalibrate accordingly. Moreover, an ‘appeals process’ must be guaranteed, so that those who rely on these platforms to make a decent living are able to contest ratings that they consider unfair. Whilst algorithmic bias is considerably harder to address, we can certainly screen for it much better than has been done to date. Ironically, there is perhaps even a role for algorithms here in monitoring and identifying such biases.61 More fundamentally, the choices that underpin the algorithm – how the data is collected, its code and the decision-making heuristics that parse the data – need to be made transparent, so that redress and recourse are possible.


Artificial Whiteness by Yarden Katz

affirmative action, AI winter, algorithmic bias, Amazon Mechanical Turk, autonomous vehicles, blue-collar work, cellular automata, cloud computing, colonial rule, computer vision, conceptual framework, Danny Hillis, David Graeber, desegregation, Donald Trump, Edward Snowden, Elon Musk, Erik Brynjolfsson, European colonialism, Ferguson, Missouri, housing crisis, income inequality, information retrieval, invisible hand, Jeff Bezos, Kevin Kelly, knowledge worker, Mark Zuckerberg, mass incarceration, Menlo Park, Nate Silver, natural language processing, Norbert Wiener, pattern recognition, phenotype, Philip Mirowski, RAND corporation, recommendation engine, rent control, Rodney Brooks, Ronald Reagan, Seymour Hersh, Shoshana Zuboff, Silicon Valley, Silicon Valley ideology, Skype, speech recognition, statistical model, Stephen Hawking, Stewart Brand, surveillance capitalism, talking drums, telemarketer, The Signal and the Noise by Nate Silver, Whole Earth Catalog, WikiLeaks

Some protesters are holding masks imprinted with Amazon founder Jeff Bezos’s face. Photograph by NWDC Resistance. Yet these are familiar uses of computing that do not fall under the fundable rubric of “AI” that the AI expert industry is after. And activists haven’t framed these issues around “AI,” nor around phrases such as “algorithmic accountability” or “algorithmic bias” invoked by critical AI experts. Instead, activists have focused on how technical systems fit into the political system that fuels deportations and detentions, as outlined in Mijente’s report Who’s Behind ICE?. The report highlights the role of “cloud computing” in enabling data interoperability across agencies.

abolition, 143, 148, 247nn37–38 Action Technologies, 290n29 Adam, Alison, 6, 30, 99–100, 103, 187, 207 adaptation, of AI, 10, 65, 128, 167, 178–80 Afghanistan, 58, 209, 219 Agre, Philip, 50, 198–99, 248n84, 249n85, 291n32, 291nn33–34, 299n79 AI Magazine, 23, 56–57, 249n88 AI Now, 79–80, 135, 138–43, 147–49, 151, 261nn41–43, 272n19, 274n37, 276n43 air force, 54–55, 191, 211, 251n102 AI: The Tumultuous History of the Quest for Artificial Intelligence, 51–52, 249n89, 250n90 Algeria, 15 algorithmic accountability, 125, 129, 132 algorithmic bias, 132, 135, 138–40, 169, 175–76, 178–80, 272n22, 284n61, 285n65 algorithmic sentencing, 129 Allen, Theodore, 279n24 Allende, Salvador, 193 AlphaGo, 62, 106, 108 Amazon (company): and Amazon Mechanical Turk (AMT), 116–17; and facial recognition, 176; roles in carceral state, 131–33, 149, 151, 179; and platform companies, 254n5; and the rebranding of AI, 255n12, 256n15, 257n16, 272n19, 276n43 Amazon (place), 82, 86 American Civil Liberties Union (ACLU), 141, 143 American exceptionalism, 60–61 American Psychological Association, 21 analytic epistemology, 188–189, 192, 287n6.

See also rationalistic tradition anthropometry, 7, 160, 278n19, 279n26 AnyVision, 276n45 apartheid, 114 Apple, 68, 71, 74, 285n65 Armer, Paul, 244n37 army, 44, 133, 219, 251n104, 295n55, 299n77 Arpaio, Joe, 179 artificial intelligence (AI): alternatives to, 11, 14, 185–88, 203–10; autonomy of, 52–53, 58–59, 247n72; coining of term, 3, 22–23, 241nn8–9; dissenting views of, 33–34, 44–48, 193–94; expert industry around, 2–3, 9–10, 66–74; as a foil for neoliberalism, 70–78, 119–22, 166, 226–27; “hype” about, 33, 208, 238n17, 255n14; and models of the self, 5–7, 10, 27–32, 154, 157, 165–67; as a site of imperial struggle, 60–61; individual and institutional investment in, 7–11, 20, 79–81, 154, 167–72, 182, 229; militaristic frame of, 35–38, 45–47, 50–59, 245n44; nebulous and shifting character of, 2–3, 5–6, 10, 22–27, 33–38, 42, 45, 52–55, 63–70, 154–55, 164–67; springs and winters of, 4, 48; sponsorship of, 24–26, 59; rebranding of, 12, 65–70, 80–81; relation to psychology and biology, 5, 21; “technical” narratives about, 2–4, 66–68, 261n43; as a technology of whiteness, 8–11, 155, 172, 181–82 artificial life (ALife): and bio-inspired computing, 210, 296n60; and racial narratives, 159, 277n12–14; relation to AI, 159, 277n11 Artificial War, 212–13 ARPA, 24, 96, 249n85; and Behavioral Sciences Program, 98, 265n11 Association for the Advancement of Artificial Intelligence (AAAI), 5, 23, 37–38 Association for Computing Machinery (ACM), 23 Atlantic, The, 61, 104 automata studies, 241nn8–9 automation: and automated systems, 141, 169–70; of industrial production, 25–26, 34, 74–75, 78; of warfare, 43–44, 57–59 autopoiesis, 194–96: and neoliberalism, 201–2; and war, 213 Azoulay, Ariella Aïsha, 300n1 Azoulay, Roger, 15, 300n1 Baldwin, James, 93 Barbrook, Richard, 156 Baudrillard, Jean, 151 Baupost Group, 263n57 Beer, Stafford, 25, 193, 290n30 Belfer Center, 260n40 Bell Labs, 22 Bell, Trudy E., 246n58 behaviorism, 119, 121–22, 166, 267n46, 269n49, 269n51 Benjamin, Ruha, 285n67 
Berkman Klein Center for Internet & Society, 66, 260n39, 261n42 bias. See algorithmic bias big data, 12, 67–71, 79–80, 254n3 Black Lives Matter, 138, 284n64 Black in AI, 173–74 Black radical tradition, 173, 178, 181 Black Skin, White Masks, 175, 180–81 Blackstone Group, 81–84, 86, 88 blood quantum, 162, 279n26 Blumenbach, Johann F., 160, 278n18 Boden, Margaret, 259n24 Boeing, 260n39 Bolsonaro, Jair, 82–83 Bostrom, Nick, 76 Brand, Stewart, 293n43 Brazil, 82–83, 86 Bridges, Ruby, 113, 115 Brooks, Fred P., 270n54 Brooks, Rodney, 44, 207–8, 295nn52–53, 296n60 Brown, Michael, 137 Buolamwini, Joy, 175–80, 284n61, 285n65, 285n67 Bush, George W., 58, 131, 209, 216 California Institute of Technology, 242n19 Cameron, Andy, 156 capitalism: and anticapitalism, 45; and capitalist visions of AI experts, 9–10, 35, 65, 74–78, 81–88.


pages: 533

Future Politics: Living Together in a World Transformed by Tech by Jamie Susskind

3D printing, additive manufacturing, affirmative action, agricultural Revolution, Airbnb, airport security, algorithmic bias, Andrew Keen, artificial general intelligence, augmented reality, automated trading system, autonomous vehicles, basic income, Bertrand Russell: In Praise of Idleness, bitcoin, blockchain, brain emulation, British Empire, business process, Capital in the Twenty-First Century by Thomas Piketty, cashless society, Cass Sunstein, cellular automata, cloud computing, computer age, computer vision, continuation of politics by other means, correlation does not imply causation, crowdsourcing, cryptocurrency, digital map, disinformation, distributed ledger, Donald Trump, easy for humans, difficult for computers, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ethereum, ethereum blockchain, Filter Bubble, future of work, Google bus, Google X / Alphabet X, Googley, industrial robot, informal economy, intangible asset, Internet of things, invention of the printing press, invention of writing, Isaac Newton, Jaron Lanier, John Markoff, Joseph Schumpeter, Kevin Kelly, knowledge economy, lifelogging, Metcalfe’s law, mittelstand, more computing power than Apollo, move fast and break things, natural language processing, Network effects, new economy, night-watchman state, Oculus Rift, Panopticon Jeremy Bentham, pattern recognition, payday loans, price discrimination, price mechanism, RAND corporation, ransomware, Ray Kurzweil, Richard Stallman, ride hailing / ride sharing, road to serfdom, Robert Mercer, Satoshi Nakamoto, Second Machine Age, selection bias, self-driving car, sexual politics, sharing economy, Silicon Valley, Silicon Valley startup, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, smart contracts, Snapchat, speech recognition, Steve Bannon, Steve Jobs, Steve Wozniak, Steven Levy, technological singularity, the built environment, The Structural Transformation of 
the Public Sphere, The Wisdom of Crowds, Thomas L Friedman, Tragedy of the Commons, universal basic income, urban planning, Watson beat the top human players on Jeopardy!, working-age population, Yochai Benkler

‘Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings’. arXiv, 21 Jul. 2016 <https://arxiv.org/pdf/1607.06520.pdf> (accessed 3 Dec. 2017). Bonchi, Francesco, Carlos Castillo, and Sara Hajian. ‘Algorithmic Bias: From Discrimination Discovery to Fairness-aware Data Mining’. KDD 2016 Tutorial. <http://francescobonchi.com/tutorial-algorithmic-bias.pdf> (accessed 3 Dec. 2017). Booth, Robert. ‘Facebook Reveals News Feed Experiment to Control Emotions’. The Guardian, 30 Jun. 2014 <https://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds> (accessed 11 Dec. 2017).

This is just one way of assessing algorithmic injustice. One of the tasks for political theorists will be to find more. ‘Algorithmic Discrimination’ Different types of algorithmic injustice are sometimes lumped together under the name ‘algorithmic discrimination’. I avoid this term, along with the term algorithmic bias, because it can lead to confusion. Discrimination is a subtle concept with at least three acceptable meanings. The first is neutral, referring to the process of drawing distinctions between one thing and another. (If I say you are a highly discriminating art critic, I am praising your acuity and not calling you a bigot.)

Google and the Perpetuation of Stereotypes via Autocomplete Search Forms’, Critical Discourse Studies 10, no. 2 (2013) <http://www.tandfonline.com/doi/full/10.1080/17405904.2012.744320?scroll=top&needAccess=true> (accessed 3 December 2017). 14. Francesco Bonchi, Carlos Castillo, and Sara Hajian, ‘Algorithmic Bias: From Discrimination Discovery to Fairness-aware Data Mining’, KDD 2016 Tutorial <http://francescobonchi.com/tutorial-algorithmic-bias.pdf> (accessed 3 December 2017). 15. Tom Slee, What’s Yours is Mine: Against the Sharing Economy (New York and London: OR Books, 2015), 94. 16. Slee, What’s Yours is Mine, 95. 17.


pages: 307 words: 88,180

AI Superpowers: China, Silicon Valley, and the New World Order by Kai-Fu Lee

AI winter, Airbnb, Albert Einstein, algorithmic bias, algorithmic trading, artificial general intelligence, autonomous vehicles, barriers to entry, basic income, business cycle, cloud computing, commoditize, computer vision, corporate social responsibility, creative destruction, crony capitalism, Deng Xiaoping, deskilling, Donald Trump, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, full employment, future of work, gig economy, Google Chrome, happiness index / gross national happiness, if you build it, they will come, ImageNet competition, impact investing, income inequality, informal economy, Internet of things, invention of the telegraph, Jeff Bezos, job automation, John Markoff, Kickstarter, knowledge worker, Lean Startup, low skilled workers, Lyft, mandatory minimum, Mark Zuckerberg, Menlo Park, minimum viable product, natural language processing, new economy, pattern recognition, pirate software, profit maximization, QR code, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, risk tolerance, Robert Mercer, Rodney Brooks, Rubik’s Cube, Sam Altman, Second Machine Age, self-driving car, sentiment analysis, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, special economic zone, speech recognition, Stephen Hawking, Steve Jobs, strong AI, The Future of Employment, Travis Kalanick, Uber and Lyft, uber lyft, universal basic income, urban planning, Y Combinator

As public policy and personal values blend, we should really take the time to study new experiments in defining and measuring progress, such as Bhutan’s decision to pursue “Gross National Happiness” as a key development indicator. Finally, our governments will need to consistently look to one another in evaluating thorny new tradeoffs in data privacy, digital monopolies, online security, and algorithmic bias. In tackling these issues, we can learn much from comparing the different approaches taken by regulators in Europe, the United States, and China. While Europe has opted for a more heavy-handed approach (fining Google, for example, for antitrust and trying to wrest control over data away from the technology companies), China and the United States have given these companies greater leeway, letting technology and markets develop before intervening on the margins.

Index A | B | C | D | E | F | G | H | I | J | K | L | M | N | O | P | Q | R | S | T | U | V | W | X | Y | Z A Africa, 138, 139, 169 age of data, 14, 18, 56 age of implementation Chinese entrepreneurs and, 16, 18, 25 Chinese government and, 18 data and, 17, 20, 55, 80 deep learning and, 13–14, 143 going light vs. going heavy, 71 AGI (artificial general intelligence), 140–44 AI. See artificial intelligence (AI) AI engineers, 14 Airbnb, 39, 49, 73 AI revolution deep learning and, 5, 25, 92, 94, 143 economic impact of, 151–52 speed of, 152–55 AI winters, 6–7, 8, 9, 10 algorithmic bias, 229 algorithms, AI AI revolution and, 152–53 computing power and, 14, 56 credit and, 112–13 data and, 14, 17, 56, 138 fake news detection by, 109 intelligence sharing and, 87 legal applications for, 115–16 medical diagnosis and, 114–15 as recommendation engines, 107–8 robot reporting, 108 white-collar workers and, 167, 168 Alibaba Amazon compared to, 109 Chinese startups and, 58 City Brain, 93–94, 117, 124, 228 as dominant AI player, 83, 91, 93–94 eBay and, 34–35 financial services spun off from, 73 four waves of AI and, 106, 107, 109 global markets and, 137 grid approach and, 95 Microsoft Research Asia and, 89 mobile payments transition, 76 New York Stock Exchange debut, 66–67 online purchasing and, 68 success of, 40 Tencent’s “Pearl Harbor attack” on, 60–61 Wang Xing and, 24 Alipay, 35, 60, 69, 73–74, 75, 112, 118 Alphabet, 92–93 AlphaGo, 1–4, 5, 6, 11, 199 AlphaGo Zero, 90 Altman, Sam, 207 Amazon Alibaba compared to, 109 Chinese market and, 39 data captured by, 77 as dominant AI player, 83, 91 four waves of AI and, 106 grid approach and, 95 innovation mentality at, 33 monopoly of e-commerce, 170 online purchasing and, 68 Wang Xing and, 24 warehouses, 129–30 Amazon Echo, 117, 127 Amazon Go, 163, 213 Anderson, Chris, 130 Andreesen Horowitz, 70 Ant Financial, 73 antitrust laws, 20, 28, 171, 229 Apollo project, 135 app constellation model, 70 Apple, 33, 75, 117, 126, 143, 177, 184 Apple 
Pay, 75, 76 app-within-an-app model, 59 ARM (British firm), 96 Armstrong, Neil, 3 artificial general intelligence (AGI), 140–44 artificial intelligence (AI) introduction to, ix–xi See also China; deep learning; economy and AI; four waves of AI; global AI story; human coexistence with AI; new world order artificial superintelligence.


The Ethical Algorithm: The Science of Socially Aware Algorithm Design by Michael Kearns, Aaron Roth

23andMe, affirmative action, algorithmic bias, algorithmic trading, Alvin Roth, backpropagation, Bayesian statistics, bitcoin, cloud computing, computer vision, crowdsourcing, Edward Snowden, Elon Musk, Filter Bubble, general-purpose programming language, Google Chrome, ImageNet competition, Lyft, medical residency, Nash equilibrium, Netflix Prize, p-value, Pareto efficiency, performance metric, personalized medicine, pre–internet, profit motive, quantitative trading / quantitative finance, RAND corporation, recommendation engine, replication crisis, ride hailing / ride sharing, Robert Bork, Ronald Coase, self-driving car, short selling, sorting algorithm, speech recognition, statistical model, Stephen Hawking, superintelligent machines, telemarketer, Turing machine, two-sided market, Vilfredo Pareto

By making this distinction, the algorithm is able to “subtract off” the bias in the data associated with nongendered words, thus reducing analogy completions like the one in the paper’s title, while still preserving “correct” analogies like “Man is to king as woman is to queen.” These are the themes of this chapter: scientific notions of algorithmic (and human) bias and discrimination, how to detect and measure them, how to design fairer algorithmic solutions—and what the costs of fairness might be to predictive accuracy and other important objectives, just as we examined the costs to accuracy of differential privacy. We will eventually show how such costs can be made quantitative in the form of what are known as Pareto curves specifying the theoretical and empirical trade-offs between fairness and accuracy.
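The “subtract off” step described in this excerpt can be sketched with toy vectors: estimate a gender direction from a definitional pair, then remove a nongendered word's component along that direction. The four-dimensional values below are invented for illustration and are not real embeddings.

```python
import numpy as np

# Toy word vectors -- invented values, not real GloVe/word2vec embeddings.
vectors = {
    "he":         np.array([ 1.0, 0.2, 0.5, 0.1]),
    "she":        np.array([-1.0, 0.2, 0.5, 0.1]),
    "programmer": np.array([ 0.4, 0.9, 0.1, 0.3]),  # should be gender-neutral
}

# Estimate a "gender direction" from a definitional pair and normalise it.
gender_dir = vectors["he"] - vectors["she"]
gender_dir /= np.linalg.norm(gender_dir)

def neutralize(v, direction):
    """Subtract off the component of v lying along the bias direction."""
    return v - np.dot(v, direction) * direction

debiased = neutralize(vectors["programmer"], gender_dir)
print(abs(np.dot(debiased, gender_dir)))  # prints 0.0
```

Genuinely gendered words such as “king” and “queen” are simply left out of the neutralization step, which is how the “correct” analogies survive.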

See also societal norms and values Hunch, 52–53, 68 hybrid models, 76–77 hypotheses, 153–54 IBM Research, 132 image classifiers, 181–82 image-creating algorithms, 132–33 ImageNet, 146–51, 160–63, 165 image recognition, 145–49, 162–63, 165–67, 175 incentives and algorithmic game theory, 101 and correlated equilibrium, 114–15 and differential privacy, 56 and differing notions of fairness, 85 and ethical issues of optimization, 189–90 and investing scams, 140–41 and medical residency hiring, 128 and navigation problems, 111–15 and p-hacking, 144–45 and scientific research, 136, 144–45 and user preferences, 97 income discrimination, 88–89 incriminating information, 40–45 individual preferences, 115–17 inference, 33, 51 Infinite Jest (Wallace), 118, 120 informational challenges, 104 input bias, 91 insurance rates, 38 intelligence explosion, 185–88 intentionality, 7 internal design of algorithms, 131 Internal Revenue Service (IRS), 49–50 Internet and availability of data, 66–67 and commerce, 64 Internet-derived data, 6–7 and theoretical computer science field, 13 Internet Movie Database (IMDB), 25–26 interpretability of outputs and algorithmic morality, 175–77 and current state of ethics research, 170–75 and General Data Protection Regulation, 15 and goals of ethics research, 170–71 investing scams, 137–41 Ioannidis, John, 143 iPhones, 47–48, 195 Jeopardy, 180 jump balls, 99–100 Kaggle, 146–47 k-anonymity, 27–30, 44 kidney donation matching, 130 King, Stephen, 118 k-nearest neighbors algorithm, 173 Kubrick, Stanley, 100 large datasets, 100–101, 151, 155 LA Weekly, 94 law enforcement access to data, 54–56 laws and regulations algorithms as regulatory measure, 16–17 bans on data uses, 39 and concerns about algorithm use, 3–4 and correlated equilibrium, 113 and differential privacy, 47–48 and “fairness gerrymandering,” 86–87 and fairness vs. 
accuracy of models, 77–78 and forbidden inputs, 66–67 and interpretability of model outputs, 172, 174 and investing scams, 138, 140–41 and navigation problems, 107 recent efforts to address machine learning issues, 14–15 and scale issues, 192 and scope of topics covered, 19–21 and shortcomings of anonymization methods, 24 and theoretical computer science field, 13–14 and the US Census, 49–50 Learner and Regulator game, 89 learning process, formal, 38–39 LeCun, Yann, 133 Legg, Shane, 179 lending and creditworthiness and algorithmic bias, 62 and algorithmic violations of fairness and privacy, 96 benefits of machine learning, 191–92 and concerns about algorithm use, 3 and criticisms of ethical algorithm efforts, 193–94 and “fairness gerrymandering,” 86–87 and interpretability of model outputs, 171–72, 174 “merit” and algorithmic fairness, 72–74, 84 and Pareto frontiers, 82–86 random lending, 69–71 and statistical parity, 69–72, 84 and unique challenges of algorithms, 8 Lewis, Amanda, 94–95, 97 linear models, 173 loan applications, 171–72, 174 local differential privacy, 46–47 local trust model, 47 location data and broken anonymity, 2–3 and scope of topics covered, 19–20 and shortcomings of anonymization methods, 23, 27–29 logistic regression, 173 Loken, Eric, 159 lung cancer research, 34–36, 39, 51, 53–54 lying, 111–12 machine learning and adaptive data analysis, 160–62 and aggregate data, 30 and algorithmic game theory, 101 and algorithmic morality, 176–77 and algorithms as regulatory measure, 16–17 contrasted with human learning, 6–7 criminal justice applications, 3 and dangers of adaptive data analysis, 151, 153–54 and data collection bias, 90–93 and differential privacy, 38–39, 52 and discrimination, 96 and echo chamber equilibrium, 124 and “fairness gerrymandering,” 87–90 and forbidden inputs, 68 generative adversarial network (GAN), 133 and Google’s “word embedding” model, 58 and image recognition competition, 145–49, 165–67 and interpretability of model 
outputs, 171–74 and limitations of differential privacy, 51 and “merit” in algorithmic fairness, 74 and Pareto frontier, 82 and product recommendation algorithms, 122–23 recent laws and regulations, 15 and scope of topics covered, 18–19 self-play in, 131–34 and theoretical computer science, 13 and unique challenges of algorithms, 7–11 and weaknesses of aggregate data, 31 and weaknesses of encryption, 32 and word embedding, 57–63 macroscopic efficiency, 112 Magrin, Lisa, 2–3 Manhattan Project, 17 markets, 177–78 Marriott, 32 Massachusetts Institute of Technology (MIT), 23 matching markets, 126–30 mathematical constraints, 37–38 mathematical literacy, 172.


pages: 340 words: 97,723

The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity by Amy Webb

Ada Lovelace, AI winter, Airbnb, airport security, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, artificial general intelligence, Asilomar, autonomous vehicles, backpropagation, Bayesian statistics, Bernie Sanders, bioinformatics, blockchain, Bretton Woods, business intelligence, Cass Sunstein, Claude Shannon: information theory, cloud computing, cognitive bias, complexity theory, computer vision, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, Deng Xiaoping, disinformation, distributed ledger, don't be evil, Donald Trump, Elon Musk, Filter Bubble, Flynn Effect, gig economy, Google Glasses, Grace Hopper, Gödel, Escher, Bach, Inbox Zero, Internet of things, Jacques de Vaucanson, Jeff Bezos, Joan Didion, job automation, John von Neumann, knowledge worker, Lyft, Mark Zuckerberg, Menlo Park, move fast and break things, natural language processing, New Urbanism, one-China policy, optical character recognition, packet switching, pattern recognition, personalized medicine, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Rodney Brooks, Rubik’s Cube, Sand Hill Road, Second Machine Age, self-driving car, SETI@home, side project, Silicon Valley, Silicon Valley startup, skunkworks, Skype, smart cities, South China Sea, sovereign wealth fund, speech recognition, Stephen Hawking, strong AI, superintelligent machines, surveillance capitalism, technological singularity, The Coming Technological Singularity, theory of mind, Tim Cook: Apple, trade route, Turing machine, Turing test, uber lyft, Von Neumann architecture, Watson beat the top human players on Jeopardy!, zero day

“Why the Future Doesn’t Need Us.” Wired, April 1, 2000. http://www.wired.com/wired/archive/8.04/joy.html. Kelly, K. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future. New York: Viking, 2016. Kirkpatrick, K. “Battling Algorithmic Bias.” Communications of the ACM 59, no. 10 (2016): 16–17. https://cacm.acm.org/magazines/2016/10/207759-battling-algorithmic-bias/abstract. Knight, W. “AI Fight Club Could Help Save Us from a Future of Super-Smart Cyberattacks.” MIT Technology Review, July 20, 2017. https://www.technologyreview.com/s/608288/ai-fight-club-could-help-save-us-from-a-future-of-supersmart-cyberattacks/.


pages: 276 words: 81,153

Outnumbered: From Facebook and Google to Fake News and Filter-Bubbles – the Algorithms That Control Our Lives by David Sumpter

affirmative action, algorithmic bias, Bernie Sanders, correlation does not imply causation, crowdsourcing, disinformation, don't be evil, Donald Trump, Elon Musk, Filter Bubble, Google Glasses, illegal immigration, Jeff Bezos, job automation, Kenneth Arrow, Loebner Prize, Mark Zuckerberg, meta-analysis, Minecraft, Nate Silver, natural language processing, Nelson Mandela, p-value, prediction markets, random walk, Ray Kurzweil, Robert Mercer, selection bias, self-driving car, Silicon Valley, Skype, Snapchat, speech recognition, statistical model, Stephen Hawking, Steve Bannon, Steven Pinker, The Signal and the Noise by Nate Silver, traveling salesman, Turing test

Moreover, if we consider the individuals who weren’t shown the advert, we could be said to have discriminated against males. One in 11 of the men who didn’t see the advert were interested in the job, while only one in 27 of the women who didn’t see the advert were interested. Our new algorithm has a calibration bias that favours females.

Table 6.3 Revised breakdown of the men and women shown an advert for our revised (thought experiment) Facebook campaign.

Unfairness is like those whack-a-mole games at the fairground where the mole keeps popping up in different places. You hammer it down in one place and another one comes out somewhere else.
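The calibration check in this passage can be replayed with the two figures quoted above (the full shown/not-shown breakdown is in Table 6.3, which the excerpt does not reproduce):

```python
# Interest rates among people NOT shown the advert, as quoted in the text:
# 1 in 11 men, 1 in 27 women.
men_missed = 1 / 11     # fraction of excluded men who wanted the job
women_missed = 1 / 27   # fraction of excluded women who wanted the job

# A man left out of the campaign was roughly 2.5x more likely to have been
# interested than a woman left out -- the calibration bias favouring females.
print(f"men: {men_missed:.3f}, women: {women_missed:.3f}")
print("discriminates against males" if men_missed > women_missed
      else "discriminates against females")
```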

The unsupervised algorithms looking at what we write are not programmed to be prejudiced. When we look at what they have learnt about us, they simply reflect the prejudice of the social world we live in. I also thought back to my discussion with Michal Kosinski. Michal had been very enthusiastic about the possibility of algorithms eliminating bias. And, as he predicted, researchers were already proposing tools for extracting information about applicants’ qualities and experience from their CVs.9 One Danish start-up, Relink, is using techniques similar to GloVe to summarise cover letters and match applicants to jobs. But looking more deeply at how the GloVe model works, I had found good reason to be cautious about this approach.
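The matching approach sketched in this excerpt (summarise a document by averaging its word vectors, then compare documents by cosine similarity) might look roughly like the following. The vocabulary and vector values are invented, and this is not Relink's actual pipeline.

```python
import numpy as np

# Invented 3-d "embeddings"; a real system would load pre-trained GloVe vectors.
embeddings = {
    "python":      np.array([0.9, 0.1, 0.0]),
    "programming": np.array([0.8, 0.2, 0.1]),
    "nursing":     np.array([0.0, 0.1, 0.9]),
    "care":        np.array([0.1, 0.2, 0.8]),
}

def doc_vector(words):
    """Summarise a document as the average of its known word vectors."""
    return np.mean([embeddings[w] for w in words if w in embeddings], axis=0)

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

cv = doc_vector(["python", "programming"])
print(cosine(cv, doc_vector(["programming"])) >
      cosine(cv, doc_vector(["nursing", "care"])))  # prints True
```

Whatever prejudice the embeddings absorbed from their training text flows straight into these match scores, which is the reason for caution that the passage raises.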


pages: 252 words: 72,473

Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O'Neil

Affordable Care Act / Obamacare, algorithmic bias, Bernie Madoff, big data - Walmart - Pop Tarts, call centre, carried interest, cloud computing, collateralized debt obligation, correlation does not imply causation, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, disinformation, Emanuel Derman, Financial Modelers Manifesto, housing crisis, I will remember that I didn’t make the world, and it doesn’t satisfy my equations, Ida Tarbell, illegal immigration, Internet of things, late fees, mass incarceration, medical bankruptcy, Moneyball by Michael Lewis explains big data, new economy, obamacare, Occupy movement, offshore financial centre, payday loans, peer-to-peer lending, Peter Thiel, Ponzi scheme, prediction markets, price discrimination, quantitative hedge fund, Ralph Nader, RAND corporation, recommendation engine, Rubik’s Cube, Sharpe ratio, statistical model, Tim Cook: Apple, too big to fail, Unsafe at Any Speed, Upton Sinclair, Watson beat the top human players on Jeopardy!, working poor

If you plot year-to-year scores on a chart: Gary Rubinstein, “Analyzing Released NYC Value-Added Data Part 2,” Gary Rubinstein’s Blog, February 28, 2012, http://garyrubinstein.teachforus.org/2012/02/28/analyzing-released-nyc-value-added-data-part-2/. as the computer scientist Cynthia Dwork has noted: Claire Cain Miller, “Algorithms and Bias: Q. and A. with Cynthia Dwork,” New York Times, August 10, 2015, www.nytimes.com/2015/08/11/upshot/algorithms-and-bias-q-and-a-with-cynthia-dwork.html. Web Transparency and Accountability Project: Elizabeth Dwoskin, “How Social Bias Creeps into Web Technology,” Wall Street Journal, August 21, 2015, www.wsj.com/articles/computers-are-showing-their-biases-and-tech-firms-are-concerned-1440102894.


pages: 245 words: 83,272

Artificial Unintelligence: How Computers Misunderstand the World by Meredith Broussard

1960s counterculture, A Declaration of the Independence of Cyberspace, Ada Lovelace, AI winter, Airbnb, algorithmic bias, Amazon Web Services, autonomous vehicles, availability heuristic, barriers to entry, Bernie Sanders, bitcoin, Buckminster Fuller, Chris Urmson, Clayton Christensen, cloud computing, cognitive bias, complexity theory, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, digital map, disruptive innovation, Donald Trump, Douglas Engelbart, easy for humans, difficult for computers, Electric Kool-Aid Acid Test, Elon Musk, Firefox, gig economy, global supply chain, Google Glasses, Google X / Alphabet X, Hacker Ethic, independent contractor, Jaron Lanier, Jeff Bezos, John von Neumann, Joi Ito, Joseph-Marie Jacquard, life extension, Lyft, Mark Zuckerberg, mass incarceration, Minecraft, minimum viable product, Mother of all demos, move fast and break things, Nate Silver, natural language processing, PageRank, payday loans, paypal mafia, performance metric, Peter Thiel, price discrimination, Ray Kurzweil, ride hailing / ride sharing, Ross Ulbricht, Saturday Night Live, school choice, self-driving car, Silicon Valley, speech recognition, statistical model, Steve Jobs, Steven Levy, Stewart Brand, Tesla Model S, the High Line, The Signal and the Noise by Nate Silver, theory of mind, Travis Kalanick, Turing test, Uber for X, uber lyft, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, women in the workforce

Index Abacus, 75 Ability beliefs, 83 Academy at Palumbo, 56–57 Ackerman, Arlene, 58–59 Activism, cyberspace, 82–83 Adair, Bill, 45 AI Now Institute, 194–195 AirBnB, 168 Albrecht, Steve, 159 Alda, Alan, 70 Alexa, 38–39, 72 Alexander, Michelle, 159 Algorithmic accountability reporting, 7, 43–44, 65–66 Algorithms bias in, 44, 150, 155–157, 195 defined, 7, 94 elevator, 157 function of, 43–44 risk, 44, 155–156 tic-tac-toe, 34 Alphabet, 96 AlphaGo, 33–37 Amazon, 115, 158 Analytical engine, 76 Anarcho-capitalism, 83 Anderson, C. W., 46–47 Angwin, Julia, 154–156 App hackathons, 165–174 Apple Watch, 157 Artificial intelligence (AI) beginnings, 69–73 expert systems, 52–53, 179 fantasy of, 132 in film, 31, 32, 198 foundations of, 9 future of, 194–196 games and, 33–37 general, 10–11, 32 narrow, 10–11, 32–33, 97 popularity of, 90 real vs. imagined, 31–32 research, women in, 158 sentience challenge in, 129 Asimov, Isaac, 71 Assembly language, 24 Association for Computing Machinery (ACM), 145 Astrolabe, 76 Asymmetry, positive, 28 Automation technology, 176–177 Autopilot, 121 Availability heuristic, 96 Babbage, Charles, 76–77 Bailiwick (Broussard), 182–185, 190–191, 193 Barlow, John Perry, 82–83 Bell Labs, 13 Bench, Shane, 84 Ben Franklin Racing Team (Little Ben), 122–127 Berkman Klein Center (Harvard), 195 Berners-Lee, Tim, 4–5, 47 Bezos, Jeff, 73, 115 Bias in algorithms, 44, 150, 155–157 in algorithms, racial, 44, 155–156 genius myth and, 83–84 programmers and, 155–158 in risk ratings, 44, 155–156 in STEM fields, 83–84 Bill & Melinda Gates Foundation, 60–61, 157 Bipartisan Campaign Reform Act, 180 Bitcoin, 159 Bizannes, Elias, 165, 166, 171 Blow, Charles, 95 Boggs, David, 67–68 Boole, George, 77 Boolean algebra, 77 Borden, Brisha, 154–155 Borsook, Paulina, 82 Bowhead Systems Management, 137 boyd, danah, 195 Bradley, Earl, 43 Brains 19–20, 95, 128–129, 132, 144, 150 Brand, Stewart, 5, 29, 70, 73, 81–82 Brin, Sergei, 72, 151 Brown, Joshua D., 140, 142 Bump, Philip, 186 Burroughs, 
William S., 77 Burroughs, William Seward, 77 Calculation vs. consciousness, 37 Cali-Fame, 186 California, drug use in, 158–159 Cameron, James, 95 Campaign finance, 177–186, 191 Čapek, Karel, 129 Caprio, Mike, 170–171 Carnegie Mellon University, autonomous vehicle research ALVINN, 131 University Racing Team (Boss), 124, 126–127, 130–131 Cars deaths associated with, 136–138, 146 distracted driving of, 146 human-centered design for, 147 Cars, self-driving 2005 Grand Challenge, 123–124 2007 Grand Challenge, 122–127 algorithms in, 139 artificial intelligence in, 129–131, 133 deaths in, 140 driver-assistance technology from, 135, 146 economics of, 147 experiences in, 121–123, 125–126, 128 fantasy of, 138, 142, 146 GPS hacking, 139 LIDAR guidance system, 139 machine ethics, 144–145, 147 nausea in, 121–123 NHTSA categories for, 134 problems/limitations, 138–140, 142–146 research funding, 133 SAE standards for levels of automation, 134–135 safety, 136–137, 140–142, 143, 146 sentience in, 132 Uber’s use of, 139 Udacity open-source car competition, 135 Waymo technology, 136 CERN, 4–5 Cerulo, Karen A., 28 Chess, 33 Children’s Online Privacy Protection Act (COPPA), 63–64 Chinese Room argument, 38 Choxi, Heteen, 122 Christensen, Clayton, 163 Chrome, 25, 26 Citizens United, 177, 178, 180 Clarke, Arthur C., 71–72 Client-server model, 27 Clinkenbeard, John, 172 Cloud computing, 26, 52, 196 Cohen, Brian, 56–57 Collins, John, 117 Common Core State Standards, 60–61 Communes, 5, 10 Computer ethics, 144–145 Computer Go, 34–36 Computers assumptions about vs. 
reality of, 8 components, identifying, 21–22 consciousness, 17 early, 196–199 human, 77–78, 198 human brains vs., 19–20, 128–129, 132, 144, 150 human communication vs., 169–170 human mind vs., 38 imagination, 128 limitations, 6–7, 27–28, 37–39 memory, 131 modern-day, development of, 75–79 operating systems, 24–25 in schools, 63–65 sentience, 17, 129 Computer science bias in, 79 ethical training, 145 explaining the world through, 118 women in, 5 Consciousness vs. calculation, 37 Constants in programming, 88 Content-management system (CMS), 26 Cooper, Donna, 58 Copeland, Jack, 74–75 Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), 44, 155–156 Cortana, 72 Counterculture, 5, 81–82 Cox, Amanda, 41–42 Crawford, Kate, 194 Crime reporting, 154–155 CTB/McGraw-Hill, 53 Cumberbatch, Benedict, 74 Cyberspace activism, 82–83 DarkMarket, 159 Dark web, 82 Data on campaign finance, 178–179 computer-generated, 18–19 defined, 18 dirty, 104 generating, 18 people and, 57 social construction of, 18 unreasonable effectiveness of, 118–119, 121, 129 Data & Society, 195 DataCamp, 96 Data density theory, 169 Data journalism, 6, 43–47, 196 Data Journalism Awards, 196 Data journalism stories cost-benefit of, 47 on inflation, 41–42 Parliament members’ expenses, 46 on police speeding, 43 on police stops of people of color, 43 price discrimination, 46 on sexual abuse by doctors, 42–43 Data Privacy Lab (Harvard), 195 Data Recognition Corporation (DRC), 53 Datasets in machine learning, 94–95 Data visualizations, 41–42 Deaths distracted driving accidents, 146 from poisoning, 137 from road accidents, 136–138 in self-driving cars, 140 Decision making computational, 12, 43, 150 data-driven, 119 machine learning and, 115–116, 118–119 subjective, 150 Deep Blue (IBM), 33 Deep learning, 33 Defense Advanced Research Projects Agency (DARPA) Grand Challenge, 123, 131, 133, 164 Desmond, Matthew, 115 Detroit race riots story, 44 Dhondt, Rebecca, 58 Diakopoulos, Nicholas, 46 
Difference engine, 76 Differential pricing and race, 116 Digital age, 193 Digital revolution, 193–194 Dinakar, Karthik, 195 Django, 45, 89 DocumentCloud, 52, 196 Domino’s, 170 Drone technology, 67–68 Drug marketplace, online, 159–160 Drug use, 80–81, 158–160 Duncan, Arne, 51 Dunier, Mitchell, 115 Edison, Thomas, 77 Education change, implementing in, 62–63 Common Core State Standards, 60–61 competence bar in, 150 computers in schools, 63–65 equality in, 77–78 funding, 60 supplies, availability of, 58 technochauvinist solutions for, 63 textbook availability, 53–60 unpredictability in, 62 18F, 178–179 Electronic Frontier Foundation, 82 Elevators, 156–157 Eliza, 27–28 Emancipation Proclamation, 78 Engelbart, Doug, 25, 80–81 Engineers, ethical training, 145 ENIAC, 71, 194, 196–199 Equality in education, 77–78 techno hostility toward, 83 technological, creating, 87 technology vs., 115, 156 for women, 5, 77–78, 83–85, 158 Essa, Irfan, 46 Ethics, 144–145, 147 EveryBlock, 46 Expertise, cognitive fallacies associated, 83 Expert systems, 52–53, 179 Facebook, 70, 83, 152, 158, 197 Facial recognition, 157 Fact checking, 45–46 Fake news, 154 Family Educational Rights and Privacy Act (FERPA), 63–64 FEC, McCutcheon v., 180 FEC, Speechnow.org v., 180 FEC.gov, 178–179 Film, AI in, 31, 32, 198 FiveThirtyEight.com, 47 Foote, Tully, 122–123, 125 Ford Motor Company, 140 Fowler, Susan, 74 Fraud campaign finance, 180 Internet advertising, 153–154 Free press, role of, 44 Free speech, 82 Fuller, Buckminster, 74 Futurists, 89–90 Games, AI and, 33–37 Gates, Bill, 61 Gates, Melinda, 157–158 Gawker, 83 Gender equality, hostility toward, 83 Gender gap, 5, 84–85, 115, 158 Genius, cult of, 75 Genius myth, 83–84 Ghost-in-the-machine fallacy, 32, 39 Giffords, Gabby, 19–20 GitHub, 135 Go, 33–37 Good Old-Fashioned Artificial Intelligence (GOFAI), 10 Good vs. 
popular, 149–152, 160 Google, 72 Google Docs, 25 Google Maps API, 46 Google Street View, 131 Google X, 138, 151, 158 Government campaign finance, 177–186, 191 cyberspace activism, antigovernment ideology, 82–83 tech hostility toward, 82–83 Graphical user interface (GUI), 25, 72 Greyball, 74 Guardian, 45, 46 Hackathons, 165–174 Hackers, 69–70, 82, 153–154, 169, 173 Halevy, Alon, 119 Hamilton, James T., 47 Harley, Mike, 140 Harris, Melanie, 58–59 Harvard, Andrew, 184 Harvard University Berkman Klein Center, 195 Data Privacy Lab, 195 mathematics department, 84 “Hello, world” program, 13–18 Her, 31 Hern, Alex, 159 Hernandez, Daniel, Jr., 19 Heuristics, 95–96 Hillis, Danny, 73 Hippies, 5, 82 HitchBOT, 69 Hite, William, 58 Hoffman, Brian, 159 Holovaty, Adrian, 45–46 Home Depot, 46, 115, 155 Hooke, Robert, 88 Houghton Mifflin Harcourt (HMH) HP, 157 Hugo, Christoph von, 145 Human-centered design, 147, 177 Human computers, 77–78, 198 Human error, 136–137 Human-in-the-loop systems, 177, 179, 187, 195 Hurst, Alicia, 164 Illinois quarter, 153–154 Imagination, 89–90, 128 Imitation Game, The (film), 74 Information industry, annual pay, 153 Injury mortality, 137 Innovation computational, 25 disruptive, 163, 171 funding, 172–173 hackathons and, 166 Instacart, 171 Intelligence in machine learning Interestingness threshold, 188 International Foundation for Advanced Study, 81 Internet advertising model, 151 browsers, 25, 26 careers, annual pay rates, 153 core values, 150 drug marketplace, 159–160 early development of the, 5, 81 fraud, 153–154 online communities, technolibertarianism in culture of, 82–83 rankings, 72, 150–152 Internet Explorer, 25 Internet pioneers, inspiration for, 5, 81–82 Internet publishing industry, annual pay, 153 Internet search, 72, 150–152 Ito, Joi, 147, 195 Jacquard, Joseph Marie, 76 Java, 89 JavaScript, 89 Jobs, Steve, 25, 70, 72, 80, 81 Jones, Paul Tudor, 187–188 Journalism.


pages: 606 words: 157,120

To Save Everything, Click Here: The Folly of Technological Solutionism by Evgeny Morozov

3D printing, algorithmic bias, algorithmic trading, Amazon Mechanical Turk, Andrew Keen, augmented reality, Automated Insights, Berlin Wall, big data - Walmart - Pop Tarts, Buckminster Fuller, call centre, carbon footprint, Cass Sunstein, choice architecture, citizen journalism, cloud computing, cognitive bias, creative destruction, crowdsourcing, data acquisition, Dava Sobel, disintermediation, East Village, en.wikipedia.org, Fall of the Berlin Wall, Filter Bubble, Firefox, Francis Fukuyama: the end of history, frictionless, future of journalism, game design, Gary Taubes, Google Glasses, Ian Bogost, illegal immigration, income inequality, invention of the printing press, Jane Jacobs, Jean Tirole, Jeff Bezos, jimmy wales, Julian Assange, Kevin Kelly, Kickstarter, license plate recognition, lifelogging, lone genius, Louis Pasteur, Mark Zuckerberg, market fundamentalism, Marshall McLuhan, moral panic, Narrative Science, Nelson Mandela, Nicholas Carr, packet switching, PageRank, Parag Khanna, Paul Graham, peer-to-peer, Peter Singer: altruism, Peter Thiel, pets.com, placebo effect, pre–internet, Ray Kurzweil, recommendation engine, Richard Thaler, Ronald Coase, Rosa Parks, self-driving car, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, Slavoj Žižek, smart meter, social graph, social web, stakhanovite, Steve Jobs, Steven Levy, Stuxnet, surveillance capitalism, technoutopianism, the built environment, The Chicago School, The Death and Life of Great American Cities, the medium is the message, The Nature of the Firm, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Thomas L Friedman, transaction costs, urban decay, urban planning, urban sprawl, Vannevar Bush, WikiLeaks, Yochai Benkler

EdgeRank chooses which news items—from the thousands shared by your friends—you should see when you log into the site. According to Bucher’s research, Facebook wants to feed us stuff with high meme potential; thus, it studies what kinds of stories—from which friends? on which subjects?—users tend to click on most often. Bucher even writes of “the algorithmic bias towards making those stories that signify engagement more visible than those that do not.” There is nothing wrong with this attitude per se, but as with Twitter, it does embody a certain vision of how public life should function and what it should reward, and it does make meme manufacturing easier.
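The engagement-weighted ranking Bucher describes can be sketched in a few lines. The signal names and weights below are illustrative assumptions, not Facebook's actual EdgeRank formula; the point is only that stories whose signals indicate engagement score higher and therefore surface first.

```python
# Illustrative signals and weights -- assumptions, not the real EdgeRank formula.
ENGAGEMENT_WEIGHTS = {"click": 1.0, "like": 2.0, "comment": 3.0}

def edge_score(story):
    """Collapse a story's engagement signals into a single score."""
    return sum(w * story.get(signal, 0) for signal, w in ENGAGEMENT_WEIGHTS.items())

def rank_feed(stories):
    """Most-engaging stories first: the 'algorithmic bias towards making
    those stories that signify engagement more visible'."""
    return sorted(stories, key=edge_score, reverse=True)

stories = [
    {"id": "a", "click": 5},                          # clicks only
    {"id": "b", "click": 1, "like": 1, "comment": 4}, # comments weigh most
]
print([s["id"] for s in rank_feed(stories)])  # -> ['b', 'a']
```

A story with four comments outranks one with five bare clicks, so content that provokes interaction is systematically made more visible, which is exactly the bias toward "meme potential" the excerpt describes.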

Algorithmic Power and the Threat of Invisibility on Facebook,” New Media & Society 14, no. 7 (2012), available at http://nms.sagepub.com/content/14/7/1164; Bucher, “The Friendship Assemblage: Investigating Programmed Sociality on Facebook,” Television & New Media, August 2012, http://tvn.sagepub.com/content/early/2012/08/14/1527476412452800.abstract; and Bucher, “A Technicity of Attention: How Software Makes Sense,’” Culture Machine 13 (2012), http://culturemachine.net/index.php/cm/article/viewArticle/470. 158 “the algorithmic bias towards making those stories”: Bucher, “Algorithmic Power.” 158 Media scholar C. W. Anderson: C. W. Anderson, “Deliberative, Agonistic, and Algorithmic Audiences: Journalism’s Vision of Its Public in an Age of Audience Transparency,” International Journal of Communication 5 (2011): 529–547. 159 “If something is a total bummer”: quoted in Ryan Holiday, Trust Me, I’m Lying, 62. 159 “The economics of the web”: ibid., 62. 159 “efficiencies and the new breadth of artists”: Christopher Steiner, Automate This: How Algorithms Came to Rule Our World (New York: Portfolio Hardcover, 2012), 86. 160 historians of science Peter Galison and Lorraine Daston: Lorraine J.


pages: 661 words: 156,009

Your Computer Is on Fire by Thomas S. Mullaney, Benjamin Peters, Mar Hicks, Kavita Philip

2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, AltaVista, Amazon Mechanical Turk, Amazon Web Services, American Society of Civil Engineers: Report Card, Asilomar, autonomous vehicles, bitcoin, blockchain, Boeing 737 MAX, British Empire, business cycle, business process, call centre, carbon footprint, cloud computing, collective bargaining, computer age, computer vision, connected car, corporate governance, corporate social responsibility, Covid-19, COVID-19, creative destruction, cryptocurrency, dark matter, deskilling, digital map, don't be evil, Donald Davies, Donald Trump, Edward Snowden, en.wikipedia.org, European colonialism, financial innovation, game design, glass ceiling, global pandemic, global supply chain, Grace Hopper, hiring and firing, IBM and the Holocaust, industrial robot, informal economy, Internet Archive, Internet of things, Jeff Bezos, job automation, Julian Assange, Kevin Kelly, Kickstarter, knowledge economy, Landlord’s Game, low-wage service sector, M-Pesa, Mark Zuckerberg, mass incarceration, Menlo Park, meta-analysis, mobile money, moral panic, move fast and break things, move fast and break things, mutually assured destruction, natural language processing, new economy, Norbert Wiener, old-boy network, On the Economy of Machinery and Manufactures, packet switching, pattern recognition, Paul Graham, pink-collar, postindustrial economy, profit motive, QWERTY keyboard, Ray Kurzweil, Report Card for America’s Infrastructure, sentiment analysis, Silicon Valley, Silicon Valley ideology, smart cities, Snapchat, speech recognition, statistical model, Steve Jobs, Stewart Brand, technoutopianism, telepresence, the built environment, the map is not the territory, Thomas L Friedman, Triangle Shirtwaist Factory, undersea cable, union organizing, WikiLeaks, wikimedia commons, women in the workforce, Y2K

Germaine Moore, the suspect in the viral video case, was virtually undetectable, appearing in a video rather than still image and having never had his photograph logged in a prior arrest database against which to match his face. Further still, as a person of color Moore remains less detectable by facial-recognition algorithms that continue to bias accuracy toward white male faces.8 Apprehending Is Embodied Sensing To police is to apprehend—to recognize. Recognition is recall, data processing, and identification. To recognize is to bring close, to know again, to make one’s own. Recognition is a condition of proximity and closeness, or how near both beloved and despised people are presumed to be.

Part of my task in recording this history has been to convince these women that their contributions were important and that historians do value what they have to say. 13. Hicks, Programmed Inequality, 1–3. 14. For more on this, and a case study of one of the earliest examples of mainframe-era transphobic algorithmic bias, see Mar Hicks, “Hacking the Cis-tem: Transgender Citizens and the Early Digital State,” IEEE Annals of the History of Computing 41, no. 1 (January–March 2019): 20–33, https://doi.org/10.1109/MAHC.2019.2897667. 15. Hicks, Programmed Inequality, 90–93. 16. Hicks, Programmed Inequality, 93–96. 17.


pages: 625 words: 167,349

The Alignment Problem: Machine Learning and Human Values by Brian Christian

Albert Einstein, algorithmic bias, Amazon Mechanical Turk, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, butterfly effect, Cass Sunstein, Claude Shannon: information theory, computer vision, Donald Knuth, Douglas Hofstadter, effective altruism, Elon Musk, game design, Google Chrome, Google Glasses, Google X / Alphabet X, Gödel, Escher, Bach, hedonic treadmill, ImageNet competition, industrial robot, Internet Archive, John von Neumann, Joi Ito, Kenneth Arrow, longitudinal study, mandatory minimum, mass incarceration, natural language processing, Norbert Wiener, Panopticon Jeremy Bentham, pattern recognition, Peter Singer: altruism, Peter Thiel, premature optimization, RAND corporation, recommendation engine, Richard Feynman, Rodney Brooks, Saturday Night Live, selection bias, self-driving car, side project, Silicon Valley, speech recognition, Stanislav Petrov, statistical model, Steve Jobs, strong AI, the map is not the territory, theory of mind, Tim Cook: Apple, zero-sum game

In one of the first articles explicitly addressing the notion of bias in computing systems, the University of Washington’s Batya Friedman and Cornell’s Helen Nissenbaum had warned that “computer systems, for instance, are comparatively inexpensive to disseminate, and thus, once developed, a biased system has the potential for widespread impact. If the system becomes a standard in the field, the bias becomes pervasive.”40 Or, as Buolamwini herself puts it, “Halfway around the world, I learned that algorithmic bias can travel as quickly as it takes to download some files off of the internet.”41 After a Rhodes Scholarship at Oxford, Buolamwini came to the MIT Media Lab, and there she began working on an augmented-reality project she dubbed the “Aspire Mirror.” The idea was to project empowering or uplifting visuals onto the user’s face—making the onlooker transform into a lion, for instance.

., “Concrete Problems in AI Safety.” gives an overview of this issue, which comes up in various subsequent chapters of this book. 37. Hardt, “How Big Data Is Unfair.” 38. Jacky Alciné, personal interview, April 19, 2018. 39. Joy Buolamwini, “How I’m Fighting Bias in Algorithms,” https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms. 40. Friedman and Nissenbaum, “Bias in Computer Systems.” 41. Buolamwini, “How I’m Fighting Bias in Algorithms.” 42. Huang et al., “Labeled Faces in the Wild.” 43. Han and Jain, “Age, Gender and Race Estimation from Unconstrained Face Images.” 44. The estimate used here is 252 faces of Black women, arrived at by multiplying the proportion of women in the dataset (2,975/13,233) by the proportion of Black individuals in the dataset (1,122/13,233); numbers from Han and Jain. 45.


pages: 380 words: 109,724

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of US by Rana Foroohar

"side hustle", accounting loophole / creative accounting, Airbnb, algorithmic bias, AltaVista, autonomous vehicles, banking crisis, barriers to entry, Bernie Madoff, Bernie Sanders, bitcoin, book scanning, Brewster Kahle, Burning Man, call centre, cashless society, cleantech, cloud computing, cognitive dissonance, Colonization of Mars, computer age, corporate governance, creative destruction, Credit Default Swap, cryptocurrency, data is the new oil, death of newspapers, Deng Xiaoping, disinformation, disintermediation, don't be evil, Donald Trump, drone strike, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Etonian, Filter Bubble, future of work, game design, gig economy, global supply chain, Gordon Gekko, greed is good, income inequality, independent contractor, informal economy, information asymmetry, intangible asset, Internet Archive, Internet of things, invisible hand, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, Kenneth Rogoff, life extension, light touch regulation, Lyft, Mark Zuckerberg, Marshall McLuhan, Martin Wolf, Menlo Park, move fast and break things, move fast and break things, Network effects, new economy, offshore financial centre, PageRank, patent troll, paypal mafia, Peter Thiel, pets.com, price discrimination, profit maximization, race to the bottom, recommendation engine, ride hailing / ride sharing, Robert Bork, Sand Hill Road, search engine result page, self-driving car, shareholder value, sharing economy, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, smart cities, Snapchat, South China Sea, sovereign wealth fund, Steve Bannon, Steve Jobs, Steven Levy, subscription business, supply-chain management, surveillance capitalism, TaskRabbit, Telecommunications Act of 1996, The Chicago School, the new new thing, Tim Cook: Apple, too big to fail, Travis Kalanick, trickle-down economics, Uber and Lyft, Uber for X, uber lyft, Upton Sinclair, WeWork, WikiLeaks, zero-sum game

Eventually, they decided to “stop being British,” as Shivaun put it to me, and take their case to the regulators, which is how Foundem became the lead complainant in the European Commission’s Google Search antitrust case, launched in 2009. It was led by the tough-as-nails EU competition chief, Margrethe Vestager, who eventually found against the firm in 2017. In compliance with EU law, Google was given eighteen months to figure out a way to rejigger its algorithms to eliminate bias in search. But in late 2018, the Raffs sent a letter to the commissioner, telling her that they were unpersuaded that the Google “compliance mechanism,” which depended once again on its own black box algorithmic formulas, was working. “It has now been more than a year since Google introduced its auction-based ‘remedy’ and the harm to competition, consumers, and innovation caused by Google’s illegal conduct has continued unabated,” they wrote.

“A recurring pattern has developed,” says Frank Pasquale at the University of Maryland, “in which some entity complains about a major Internet company’s practices, the company claims that its critics don’t understand how its algorithms sort and rank content, and befuddled onlookers are left to sift through rival stories in the press.” Companies should be prepared to make themselves open to algorithmic audits, as suggested by mathematician and Big Tech critic Cathy O’Neil, in case of complaints or concerns about algorithmic bias that could allow for discrimination in the workplace, healthcare, education, and so on.7 Individuals should also have their digital rights legalized. Former Wired editor John Battelle has proposed a digital bill of rights that would assign possession of data to its true owner, which is, of course, the user and generator of that data, not the company that made off with it.


pages: 345 words: 75,660

Prediction Machines: The Simple Economics of Artificial Intelligence by Ajay Agrawal, Joshua Gans, Avi Goldfarb

"Robert Solow", Ada Lovelace, AI winter, Air France Flight 447, Airbus A320, algorithmic bias, Amazon Picking Challenge, artificial general intelligence, autonomous vehicles, backpropagation, basic income, Bayesian statistics, Black Swan, blockchain, call centre, Capital in the Twenty-First Century by Thomas Piketty, Captain Sullenberger Hudson, collateralized debt obligation, computer age, creative destruction, Daniel Kahneman / Amos Tversky, data acquisition, data is the new oil, deskilling, disruptive innovation, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, everywhere but in the productivity statistics, Google Glasses, high net worth, ImageNet competition, income inequality, information retrieval, inventory management, invisible hand, job automation, John Markoff, Joseph Schumpeter, Kevin Kelly, Lyft, Minecraft, Mitch Kapor, Moneyball by Michael Lewis explains big data, Nate Silver, new economy, On the Economy of Machinery and Manufactures, pattern recognition, performance metric, profit maximization, QWERTY keyboard, race to the bottom, randomized controlled trial, Ray Kurzweil, ride hailing / ride sharing, Second Machine Age, self-driving car, shareholder value, Silicon Valley, statistical model, Stephen Hawking, Steve Jobs, Steven Levy, strong AI, The Future of Employment, The Signal and the Noise by Nate Silver, Tim Cook: Apple, Turing test, Uber and Lyft, uber lyft, US Airways Flight 1549, Vernor Vinge, Watson beat the top human players on Jeopardy!, William Langewiesche, Y Combinator, zero-sum game

Sweeney, “Discrimination in Online Ad Delivery,” Communications of the ACM 56, no. 5 (2013): 44–54, https://dataprivacylab.org/projects/onlineads/. 2. Ibid. 3. “Racism Is Poisoning Online Ad Delivery, Says Harvard Professor,” MIT Technology Review, February 4, 2013, https://www.technologyreview.com/s/510646/racism-is-poisoning-online-ad-delivery-says-harvard-professor/. 4. Anja Lambrecht and Catherine Tucker, “Algorithmic Bias? An Empirical Study into Apparent Gender-Based Discrimination in the Display of STEM Career Ads” (paper presented at the NBER Summer Institute, July 2017). 5. Diane Cardwell and Libby Nelson, “The Fire Dept. Tests That Were Found to Discriminate,” New York Times, July 23, 2009, https://cityroom.blogs.nytimes.com/2009/07/23/the-fire-dept-tests-that-were-found-to-discriminate/?


pages: 301 words: 85,126

AIQ: How People and Machines Are Smarter Together by Nick Polson, James Scott

Air France Flight 447, Albert Einstein, algorithmic bias, Amazon Web Services, Atul Gawande, autonomous vehicles, availability heuristic, basic income, Bayesian statistics, business cycle, Cepheid variable, Checklist Manifesto, cloud computing, combinatorial explosion, computer age, computer vision, Daniel Kahneman / Amos Tversky, Donald Trump, Douglas Hofstadter, Edward Charles Pickering, Elon Musk, epigenetics, Flash crash, Grace Hopper, Gödel, Escher, Bach, Harvard Computers: women astronomers, index fund, Isaac Newton, John von Neumann, late fees, low earth orbit, Lyft, Magellanic Cloud, mass incarceration, Moneyball by Michael Lewis explains big data, Moravec's paradox, more computing power than Apollo, natural language processing, Netflix Prize, North Sea oil, p-value, pattern recognition, Pierre-Simon Laplace, ransomware, recommendation engine, Ronald Reagan, self-driving car, sentiment analysis, side project, Silicon Valley, Skype, smart cities, speech recognition, statistical model, survivorship bias, the scientific method, Thomas Bayes, Uber for X, uber lyft, universal basic income, Watson beat the top human players on Jeopardy!, young professional

Many people, when they hear about something as shocking as a secret algorithm handing down prison sentences in a racially biased way, reach a simple conclusion: that artificial intelligence should play no role whatsoever in the criminal justice system. While we’re as shocked and angry as anyone, we think that’s the wrong conclusion. Yes, we must all fight algorithmic bias when it arises. To do that, we need constant vigilance by experts: people who know the law but who also know AI, and who are empowered to act if they see a threat to justice. But even as we acknowledge the pitfalls of using AI to help people make important decisions, and even as we echo the call for transparency and fairness to become defining values of this new age, let’s not forget that there’s also incredible potential here.


pages: 241 words: 70,307

Leadership by Algorithm: Who Leads and Who Follows in the AI Era? by David de Cremer

algorithmic bias, bitcoin, blockchain, business climate, business process, corporate governance, Donald Trump, Elon Musk, future of work, job automation, Kevin Kelly, Mark Zuckerberg, meta-analysis, Norbert Wiener, pattern recognition, Peter Thiel, race to the bottom, robotic process automation, shareholder value, Silicon Valley, Social Responsibility of Business Is to Increase Its Profits, Stephen Hawking, The Future of Employment, Turing test, zero-sum game

Even more so, their inability to take the perspective of others leaves them unable to make decisions on behalf of others, and, as such, they are perceived as incapable of leadership. Let us consider again Amazon’s experiment to use an algorithm to automate their recruitment process. This case taught us that the employed algorithm duplicated the human bias to favor men over women for the specific software development jobs they were advertising. As I just mentioned, it is not just algorithms, but humans too, that make such biased judgments. The difference is that humans are aware of the social consequences that emerge from the employment of this biased practice.


pages: 475 words: 134,707

The Hype Machine: How Social Media Disrupts Our Elections, Our Economy, and Our Health--And How We Must Adapt by Sinan Aral

Airbnb, Albert Einstein, algorithmic bias, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, Bernie Sanders, bitcoin, carbon footprint, Cass Sunstein, computer vision, coronavirus, correlation does not imply causation, Covid-19, COVID-19, crowdsourcing, cryptocurrency, death of newspapers, disinformation, disintermediation, Donald Trump, Drosophila, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, experimental subject, facts on the ground, Filter Bubble, global pandemic, hive mind, illegal immigration, income inequality, Kickstarter, knowledge worker, longitudinal study, low skilled workers, Lyft, Mahatma Gandhi, Mark Zuckerberg, Menlo Park, meta-analysis, Metcalfe’s law, mobile money, move fast and break things, move fast and break things, multi-sided market, Nate Silver, natural language processing, Network effects, performance metric, phenotype, recommendation engine, Robert Bork, Robert Shiller, Robert Shiller, Second Machine Age, sentiment analysis, shareholder value, skunkworks, Snapchat, social graph, social intelligence, social software, social web, statistical model, stem cell, Stephen Hawking, Steve Bannon, Steve Jobs, surveillance capitalism, Telecommunications Act of 1996, The Chicago School, the strength of weak ties, The Wisdom of Crowds, theory of mind, Tim Cook: Apple, Uber and Lyft, uber lyft, WikiLeaks, Yogi Berra

Given that Facebook is now the largest news outlet on the planet, with an audience greater than any Western television news network, newspaper, magazine, or online publication, it’s important to consider whether its newsfeed algorithm biases our exposure to different news sources, and whether its content-curation policies favor particular political views. (I’ll discuss whether social media should be regulated like traditional media in Chapter 12, but for now, it’s important to understand how algorithmic curation works. I’ll explore the effects of algorithmic curation on bias and polarization in news consumption in detail in Chapter 10.) Newsfeeds rank content according to its relevance. Each piece of content is given a relevance score that is unique to each of us and is sorted to appear in decreasing relevance order in our newsfeeds. Relevance is scored by predictive models that learn what drives us to interact with a piece of content.
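The scoring-and-sorting mechanism Aral describes can be illustrated with a toy example. The linear scorer below is a stand-in assumption for the trained predictive models a real platform would use; what it shows is that each (user, item) pair gets its own relevance score, and the feed is simply the items sorted in decreasing relevance for that user.

```python
# Toy relevance ranking: a per-user score for each item, sorted descending.
# The dot-product scorer is an illustrative assumption, not a real platform model.

def relevance(user_prefs, item_features):
    """Predicted interaction score: how well item traits match user tastes."""
    return sum(user_prefs.get(f, 0.0) * v for f, v in item_features.items())

def newsfeed(user_prefs, items):
    """Sort items so the highest predicted relevance for *this* user comes first."""
    return sorted(items,
                  key=lambda it: relevance(user_prefs, it["features"]),
                  reverse=True)

alice = {"politics": 0.9, "sports": 0.1}
items = [
    {"id": "match_recap",   "features": {"sports": 1.0}},
    {"id": "election_news", "features": {"politics": 1.0}},
]
print([i["id"] for i in newsfeed(alice, items)])  # -> ['election_news', 'match_recap']
```

Because the score depends on the user's own interaction history, the same two items would rank in the opposite order for a sports-leaning user: every feed is personalized, which is why questions about what the ranking rewards matter.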

A National Commission on Technology and Democracy In researching this book, I watched hours of congressional testimony by tech executives like Mark Zuckerberg, Jack Dorsey, Sundar Pichai, and Susan Wojcicki. I watched testimony on privacy, antitrust, election manipulation, data protection, algorithmic bias, and the role of social media in vaccine hesitancy, free speech, political bias, filter bubbles, and fake news. I got one overwhelming feeling from watching congressmen and -women question tech executives: we need more experts leading the way. Charting our technological future will be complex, technical, and nuanced.


pages: 390 words: 109,519

Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media by Tarleton Gillespie

4chan, A Declaration of the Independence of Cyberspace, affirmative action, Airbnb, algorithmic bias, AltaVista, Amazon Mechanical Turk, borderless world, Burning Man, complexity theory, conceptual framework, crowdsourcing, Donald Trump, drone strike, easy for humans, difficult for computers, Edward Snowden, Filter Bubble, game design, gig economy, Google Glasses, Google Hangouts, hiring and firing, Ian Bogost, independent contractor, Internet Archive, Jean Tirole, John Gruber, Kickstarter, Mark Zuckerberg, mass immigration, Menlo Park, Minecraft, moral panic, multi-sided market, Netflix Prize, Network effects, pattern recognition, peer-to-peer, recommendation engine, Rubik’s Cube, sharing economy, Silicon Valley, Skype, slashdot, Snapchat, social graph, social web, Steve Jobs, Stewart Brand, Telecommunications Act of 1996, two-sided market, WikiLeaks, Yochai Benkler

See freedom of speech flagging, (i), (ii), (iii); limited vocabulary of, (i), (ii); can be gamed, (i), (ii); and the drag queen controversy, (i); as strategically valuable for platforms, (i); as labor, (i); who actually does it, (i); as data that could be offered back to users, (i) Flickr (Yahoo): community guidelines, (i), (ii), (iii); approach to moderation (i), (ii), (iii); and the NIPSA guideline, (i); response to international obligations, (i) Flyverbom, Mikkel, (i)n4 Foursquare (social networking platform), (i), (ii), (iii) Free the Nipple, (i) “freedom from porn” (i), (ii) freedom of speech: and the regulation of traditional media, (i), (ii), (iii); and global values, (i), (ii), (iii); as a commitment of platforms, (i), (ii), (iii), (iv), (v), (vi), (vii), (viii); and the early web, (i); and cyberporn, (i); and private intermediaries, (i), (ii), (iii), (iv); and defamation, (i); and digital copyright, (i) Friendster (social networking platform), (i) Gadde, Vitaya (Twitter), (i) game worlds, moderation of, (i), (ii) Gamergate, (i), (ii), (iii) gatekeepers, (i), (ii), (iii) Gawker, (i) #gay, blocked by Tumblr, (i), (ii) Geiger, Stuart, (i) gender, (i), (ii), (iii), (iv) Google: and political content, (i), (ii); community guidelines, (i), (ii), (iii), (iv), (v)n24; approach to moderation, (i), (ii), (iii); and algorithmic bias, (i); and automatic detection, (i); moderation of the Autocomplete function, (i), (ii)n31; and filtering, (i); and fake news / Russian ad controversies, (i), (ii)n40 Google Glass, (i), (ii) Google Image Search, (i), (ii) Google Maps, (i) Google Play (app store), (i), (ii) Google Translate, (i) Google Trends, (i) Google+, (i), (ii), (iii) graphic content, rules regarding, (i) Gray, Mary, (i), (ii)n105 Gregoire, Carolyn, (i) Grindr (dating app), (i), (ii) The Guardian, (i), (ii), (iii) Hansen, Espen Egil, (i) harassment: and Twitter, (i), (ii), (iii), (iv), (v)n2, (vi)n5; before social media, (i); rules regarding, (i), (ii); and flagging, 
(i), (ii); automated detection of, (i), (ii) Harvey, Del (Twitter), (i) hashing, (i) hate speech: and Twitter, (i); before social media, (i); legal obligations regarding, (i); rules regarding, (i); Apple’s “enemies” rule, (i); automated detection of, (i), (ii); and Google, (i) Heartmob, (i) Hebron, Micol, (i) Heffernan, Virginia, (i) Herrman, John, (i) Hey Facebook, Breastfeeding Is Not Obscene!


pages: 482 words: 121,173

Tools and Weapons: The Promise and the Peril of the Digital Age by Brad Smith, Carol Ann Browne

Affordable Care Act / Obamacare, AI winter, airport security, Albert Einstein, algorithmic bias, augmented reality, autonomous vehicles, barriers to entry, Berlin Wall, Boeing 737 MAX, business process, call centre, Celtic Tiger, chief data officer, cloud computing, computer vision, corporate social responsibility, disinformation, Donald Trump, Edward Snowden, en.wikipedia.org, immigration reform, income inequality, Internet of things, invention of movable type, invention of the telephone, Jeff Bezos, Mark Zuckerberg, minimum viable product, national security letter, natural language processing, Network effects, new economy, pattern recognition, precision agriculture, race to the bottom, ransomware, Ronald Reagan, Rubik’s Cube, school vouchers, self-driving car, Shoshana Zuboff, Silicon Valley, Skype, speech recognition, Steve Ballmer, Steve Jobs, surveillance capitalism, The Rise and Fall of American Growth, Tim Cook: Apple, WikiLeaks, women in the workforce

Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner, “Machine Bias,” ProPublica, May 23, 2016, https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing. Back to note reference 13. The article led to a lively debate about the definition of bias and how to assess the risk of it in AI algorithms. See Matthias Spielkamp, “Inspecting Algorithms for Bias,” MIT Technology Review, June 12, 2017, https://www.technologyreview.com/s/607955/inspecting-algorithms-for-bias/. Back to note reference 14. Joy Buolamwini, “Gender Shades,” Civic Media, MIT Media Lab, accessed November 15, 2018, https://www.media.mit.edu/projects/gender-shades/overview/. Back to note reference 15. Thomas G.


pages: 586 words: 186,548

Architects of Intelligence by Martin Ford

3D printing, agricultural Revolution, AI winter, algorithmic bias, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, barriers to entry, basic income, Baxter: Rethink Robotics, Bayesian statistics, bitcoin, business intelligence, business process, call centre, cloud computing, cognitive bias, Colonization of Mars, computer vision, correlation does not imply causation, crowdsourcing, DARPA: Urban Challenge, deskilling, disruptive innovation, Donald Trump, Douglas Hofstadter, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, Fellow of the Royal Society, Flash crash, future of work, gig economy, Google X / Alphabet X, Gödel, Escher, Bach, Hans Rosling, ImageNet competition, income inequality, industrial robot, information retrieval, job automation, John von Neumann, Law of Accelerating Returns, life extension, Loebner Prize, Mark Zuckerberg, Mars Rover, means of production, Mitch Kapor, natural language processing, new economy, optical character recognition, pattern recognition, phenotype, Productivity paradox, Ray Kurzweil, recommendation engine, Robert Gordon, Rodney Brooks, Sam Altman, self-driving car, sensor fusion, sentiment analysis, Silicon Valley, smart cities, social intelligence, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, Ted Kaczynski, The Rise and Fall of American Growth, theory of mind, Thomas Bayes, Travis Kalanick, Turing test, universal basic income, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working-age population, zero-sum game, Zipcar

One threat that is already becoming evident is the vulnerability of interconnected, autonomous systems to cyber attack or hacking. As AI becomes ever more integrated into our economy and society, solving this problem will be one of the most critical challenges we face. Another immediate concern is the susceptibility of machine learning algorithms to bias, in some cases on the basis of race or gender. Many of the individuals I spoke with emphasized the importance of addressing this issue and told of research currently underway in this area. Several also sounded an optimistic note—suggesting that AI may someday prove to be a powerful tool to help combat systemic bias or discrimination.

That now said, I agree with you that bias and diversity can be treated a little more separately. For example, in terms of data bias resulting in machine learning outcome bias, a lot of academia researchers are recognizing this now, and working on ways to expose that kind of bias. They’re also modifying algorithms to respond to bias in a way to try to correct it that way. This exposure to the bias of products and technology, from academia to industry, is really healthy, and it keeps the industry on their toes. MARTIN FORD: You must have to deal with machine learning bias at Google. How do you address it? FEI-FEI LI: Google now has a whole group of researchers working on machine learning bias and “explainability” because the pressure is there to tackle bias, to deliver a better product, and we want to be helping others.
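One simple way to "expose" the kind of machine-learning outcome bias discussed above is to compare a model's positive-outcome rate across demographic groups and measure the gap (often called the demographic-parity difference). The hiring-style data below is made up purely for illustration.

```python
# Expose outcome bias by comparing positive-decision rates across groups.
# The decisions list is fabricated illustrative data.
from collections import defaultdict

def selection_rates(decisions):
    """Positive-outcome rate per group, from (group, decision) pairs."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, positive in decisions:
        totals[group] += 1
        positives[group] += int(positive)
    return {g: positives[g] / totals[g] for g in totals}

decisions = [("men", True), ("men", True), ("men", False),
             ("women", True), ("women", False), ("women", False)]

rates = selection_rates(decisions)
gap = max(rates.values()) - min(rates.values())
print(rates)  # men ~0.67, women ~0.33
print(gap)    # a large gap flags the system for closer auditing
```

A gap near zero does not prove fairness (and equalizing it is only one of several competing fairness criteria), but a large gap is exactly the kind of measurable signal that lets researchers surface data bias before, or after, a model ships.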


pages: 499 words: 144,278

Coders: The Making of a New Tribe and the Remaking of the World by Clive Thompson

2013 Report for America's Infrastructure - American Society of Civil Engineers - 19 March 2013, 4chan, 8-hour work day, Ada Lovelace, AI winter, Airbnb, algorithmic bias, Amazon Web Services, Asperger Syndrome, augmented reality, Ayatollah Khomeini, backpropagation, barriers to entry, basic income, Bernie Sanders, bitcoin, blockchain, blue-collar work, Brewster Kahle, Brian Krebs, Broken windows theory, call centre, cellular automata, Chelsea Manning, clean water, cloud computing, cognitive dissonance, computer vision, Conway's Game of Life, crowdsourcing, cryptocurrency, Danny Hillis, David Heinemeier Hansson, disinformation, don't be evil, don't repeat yourself, Donald Trump, dumpster diving, Edward Snowden, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, Ethereum, ethereum blockchain, Firefox, Frederick Winslow Taylor, game design, glass ceiling, Golden Gate Park, Google Hangouts, Google X / Alphabet X, Grace Hopper, Guido van Rossum, Hacker Ethic, hockey-stick growth, HyperCard, Ian Bogost, illegal immigration, ImageNet competition, Internet Archive, Internet of things, Jane Jacobs, John Markoff, Jony Ive, Julian Assange, Kickstarter, Larry Wall, lone genius, Lyft, Marc Andreessen, Mark Shuttleworth, Mark Zuckerberg, Menlo Park, microservices, Minecraft, move fast and break things, move fast and break things, Nate Silver, Network effects, neurotypical, Nicholas Carr, Oculus Rift, PageRank, pattern recognition, Paul Graham, paypal mafia, Peter Thiel, pink-collar, planetary scale, profit motive, ransomware, recommendation engine, Richard Stallman, ride hailing / ride sharing, Rubik’s Cube, Ruby on Rails, Sam Altman, Satoshi Nakamoto, Saturday Night Live, self-driving car, side project, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, single-payer health, Skype, smart contracts, Snapchat, social software, software is eating the world, sorting algorithm, South of Market, San Francisco, speech recognition, Steve Wozniak, Steven Levy, TaskRabbit, the 
High Line, Travis Kalanick, Uber and Lyft, Uber for X, uber lyft, universal basic income, urban planning, Wall-E, Watson beat the top human players on Jeopardy!, WeWork, WikiLeaks, women in the workforce, Y Combinator, Zimmermann PGP, éminence grise

Index Aaron Swartz hackathon, ref1 Abbate, Janet, ref1, ref2, ref3, ref4 Abene, Mark, ref1 Abma, Jobert, ref1 Adams, John, ref1 Addiction by Design (Schüll), ref1 addictive behavior, as side effect of optimization, ref1 Adler, Mortimer J., ref1 Adobe, ref1 advertising, ref1, ref2 African American coders. See minority coders agile development, ref1 AI. See artificial intelligence (AI) Albright, Jonathan, ref1 Alciné, Jacky, ref1 algorithms, ref1, ref2 bias in ranking systems, ref1 scale and, ref1 algorithms challenge whiteboard interview, ref1, ref2, ref3 Algorithms of Oppression (Noble), ref1 Allen, Fran, ref1, ref2 Allen, Paul, ref1 AlphaGo, ref1, ref2 Altman, Sam, ref1, ref2 Amabile, Teresa M., ref1 Amazon, ref1, ref2, ref3 Amazons (board game), ref1 Amazon Web Services, ref1 Analytical Engine, ref1 Anderson, Tom, ref1 AND gate, ref1 Andreessen, Marc, ref1, ref2, ref3, ref4, ref5, ref6, ref7, ref8 Antisocial Media (Vaidhyanathan), ref1 Apple, ref1 Apple I, ref1 Apple iPhone, ref1, ref2 aptitude testing, ref1 architects, ref1 artificial intelligence (AI), ref1 dangers of, warnings about and debate over, ref1 de-biasing of, ref1 deep learning (See deep learning) edge cases and, ref1 expert systems, ref1 Hollywood depiction of, ref1 initial attempts to create, at Dartmouth in 1956, ref1 job listing sites, biased results in, ref1 justice system, effect of AI bias on, ref1 learning problem, ref1 neural nets (See neural nets) racism and sexism, learning of, ref1 artistic temperaments, ref1 Assembly computer language, ref1 Atwood, Jeff, ref1, ref2 Babbage, Charles, ref1, ref2 back-end code, ref1, ref2, ref3, ref4 backpropagation, ref1 “Bad Smells in Code” (Fowler and Beck), ref1 Baffler, The, ref1 Bahnken, A.


pages: 480 words: 119,407

Invisible Women by Caroline Criado Perez

Affordable Care Act / Obamacare, algorithmic bias, augmented reality, Bernie Sanders, collective bargaining, crowdsourcing, Diane Coyle, Donald Trump, falling living standards, first-past-the-post, gender pay gap, gig economy, glass ceiling, Grace Hopper, Hacker Ethic, independent contractor, Indoor air pollution, informal economy, lifelogging, low skilled workers, mental accounting, meta-analysis, Nate Silver, new economy, obamacare, Oculus Rift, offshore financial centre, pattern recognition, phenotype, post-industrial society, randomized controlled trial, remote working, Silicon Valley, Simon Kuznets, speech recognition, stem cell, Stephen Hawking, Steven Levy, the built environment, urban planning, women in the workforce, zero-sum game

But if you aren’t aware of how those biases operate, if you aren’t collecting data and taking a little time to produce evidence-based processes, you will continue to blindly perpetuate old injustices. And so by not considering the ways in which women’s lives differ from men’s, both on and offline, Gild’s coders inadvertently created an algorithm with a hidden bias against women. But that’s not even the most troubling bit. The most troubling bit is that we have no idea how bad the problem actually is. Most algorithms of this kind are kept secret and protected as proprietary code. This means that we don’t know how these decisions are being made and what biases they are hiding.

The authors of the women = homemaker paper devised a new algorithm that reduced gender stereotyping (e.g. ‘he is to doctor as she is to nurse’) by over two-thirds, while leaving gender-appropriate word associations (e.g. ‘he is to prostate cancer as she is to ovarian cancer’) intact.49 And the authors of the 2017 study on image interpretation devised a new algorithm that decreased bias amplification by 47.5%. CHAPTER 9 A Sea of Dudes When Janica Alvarez was trying to raise funds for her tech start-up Naya Health Inc. in 2013, she struggled to get investors to take her seriously. In one meeting, ‘investors Googled the product and ended up on a porn site. They lingered on the page and started cracking jokes’, leaving Alvarez feeling like she was ‘in the middle of a fraternity’.1 Other investors were ‘too grossed out to touch her product or pleaded ignorance’, with one male investor saying ‘I’m not touching that; that’s disgusting.’2 And what was this vile, ‘disgusting’ and incomprehensible product Alvarez was pitching?
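The embedding fix described above works by projecting word vectors off an estimated gender direction. A minimal sketch of that projection step, using invented 2-D vectors rather than real embeddings (the published method operates on a learned bias subspace and a curated list of gender-neutral words):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def neutralize(v, g):
    """Remove the component of v along the bias direction g."""
    scale = dot(v, g) / dot(g, g)
    return [a - scale * b for a, b in zip(v, g)]

# Toy 2-D "embeddings" (hypothetical values, not from a trained model).
he = [1.0, 0.2]
she = [-1.0, 0.2]
doctor = [0.4, 0.9]

# Estimate the gender direction from a definitional pair.
gender = [a - b for a, b in zip(he, she)]

doctor_fixed = neutralize(doctor, gender)
# After neutralizing, "doctor" no longer leans toward "he" or "she":
# it is now equidistant from both, while its other components survive.
```

The full algorithm also re-equalizes definitional pairs like he/she around each neutralized word; this sketch shows only the projection that removes the stereotyped component.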


pages: 688 words: 147,571

Robot Rules: Regulating Artificial Intelligence by Jacob Turner

Ada Lovelace, Affordable Care Act / Obamacare, AI winter, algorithmic bias, algorithmic trading, artificial general intelligence, Asilomar, Asilomar Conference on Recombinant DNA, autonomous vehicles, backpropagation, Basel III, bitcoin, blockchain, brain emulation, Clapham omnibus, cognitive dissonance, corporate governance, corporate social responsibility, correlation does not imply causation, crowdsourcing, distributed ledger, don't be evil, Donald Trump, easy for humans, difficult for computers, effective altruism, Elon Musk, financial exclusion, financial innovation, friendly fire, future of work, hive mind, Internet of things, iterative process, job automation, John Markoff, John von Neumann, Loebner Prize, medical malpractice, Nate Silver, natural language processing, nudge unit, obamacare, off grid, pattern recognition, Peace of Westphalia, race to the bottom, Ray Kurzweil, Rodney Brooks, self-driving car, Silicon Valley, Stanislav Petrov, Stephen Hawking, Steve Wozniak, strong AI, technological singularity, Tesla Model S, The Coming Technological Singularity, The Future of Employment, The Signal and the Noise by Nate Silver, Turing test, Vernor Vinge

– You are not permitted to modify any robot to enable it to function as a weapon.103 It remains to be seen though whether and to what extent the European Parliament’s ambitious proposals will be adopted in legislative proposals by the Commission.

4.8 Japanese Initiatives

A June 2016 Report issued by Japan’s Ministry of Internal Affairs and Communications proposed nine principles for developers of AI, which were submitted for international discussion at the G7104 and OECD:

1) Principle of collaboration—Developers should pay attention to the interconnectivity and interoperability of AI systems.
2) Principle of transparency—Developers should pay attention to the verifiability of inputs/outputs of AI systems and the explainability of their judgments.
3) Principle of controllability—Developers should pay attention to the controllability of AI systems.
4) Principle of safety—Developers should take it into consideration that AI systems will not harm the life, body, or property of users or third parties through actuators or other devices.
5) Principle of security—Developers should pay attention to the security of AI systems.
6) Principle of privacy—Developers should take it into consideration that AI systems will not infringe the privacy of users or third parties.
7) Principle of ethics—Developers should respect human dignity and individual autonomy in R&D of AI systems.
8) Principle of user assistance—Developers should take it into consideration that AI systems will support users and make it possible to give them opportunities for choice in appropriate manners.
9) Principle of accountability—Developers should make efforts to fulfill their accountability to stakeholders including users of AI systems.105

Japan emphasised that the above principles were intended to be treated as soft law, but with a view to “accelerate the participation of multistakeholders involved in R&D and utilization of AI… at both national and international levels, in the discussions towards establishing ‘AI R&D Guidelines’ and ‘AI Utilization Guidelines’”.106 Non-governmental groups in Japan have also been active: the Japanese Society for Artificial Intelligence proposed Ethical Guidelines for an Artificial Intelligence Society in February 2017, aimed at its members.107 Fumio Shimpo, a member of the Japanese Government’s Cabinet Office Advisory Board, has proposed his own Eight Principles of the Laws of Robots.108

4.9 Chinese Initiatives

In furtherance of China’s Next Generation Artificial Intelligence Development Plan,109 and as mentioned in Chapter 6, in January 2018 a division of China’s Ministry of Industry and Information Technology released a 98-page White Paper on AI Standardization (the White Paper), the contents of which comprise China’s most comprehensive analysis to date of the ethical challenges raised by AI.110 The White Paper highlights emergent ethical issues in AI including privacy,111 the Trolley Problem,112 algorithmic bias,113 transparency114 and liability for harm caused by AI.115 In terms of AI safety, the White Paper explains that: Because the achieved goals of artificial intelligence technology are influenced by its initial settings, the goal of artificial intelligence design must be to ensure that the design goals of artificial intelligence are consistent with the interests and ethics of most human beings.


pages: 424 words: 114,905

Deep Medicine: How Artificial Intelligence Can Make Healthcare Human Again by Eric Topol

23andMe, Affordable Care Act / Obamacare, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, algorithmic bias, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, bioinformatics, blockchain, cloud computing, cognitive bias, Colonization of Mars, computer age, computer vision, conceptual framework, creative destruction, crowdsourcing, Daniel Kahneman / Amos Tversky, dark matter, David Brooks, digital twin, Elon Musk, en.wikipedia.org, epigenetics, Erik Brynjolfsson, fault tolerance, George Santayana, Google Glasses, ImageNet competition, Jeff Bezos, job automation, job satisfaction, Joi Ito, Mark Zuckerberg, medical residency, meta-analysis, microbiome, natural language processing, new economy, Nicholas Carr, nudge unit, pattern recognition, performance metric, personalized medicine, phenotype, placebo effect, randomized controlled trial, recommendation engine, Rubik’s Cube, Sam Altman, self-driving car, Silicon Valley, speech recognition, Stephen Hawking, text mining, the scientific method, Tim Cook: Apple, War on Poverty, Watson beat the top human players on Jeopardy!, working-age population

., “Bias Detectives: The Researchers Striving to Make Algorithms Fair,” Nature. 2018. 37. Simonite, T., “Using Artificial Intelligence to Fix Wikipedia’s Gender Problem,” Wired. 2018. 38. Miller, A. P., “Want Less-Biased Decisions? Use Algorithms,” Harvard Business Review. 2018; Thomas, R., “What HBR Gets Wrong About Algorithms and Bias,” Fast AI. 2018. 39. Adamson, A. S., and A. Smith, “Machine Learning and Health Care Disparities in Dermatology.” JAMA Dermatol, 2018. 40. Harari, Y. N., Homo Deus. 2016. New York: HarperCollins, p. 348. 41. Lee, K. F., “The Real Threat of Artificial Intelligence,” New York Times. 2017. 42.


pages: 364 words: 99,897

The Industries of the Future by Alec Ross

23andMe, 3D printing, Airbnb, algorithmic bias, algorithmic trading, AltaVista, Anne Wojcicki, autonomous vehicles, banking crisis, barriers to entry, Bernie Madoff, bioinformatics, bitcoin, blockchain, Brian Krebs, British Empire, business intelligence, call centre, carbon footprint, cloud computing, collaborative consumption, connected car, corporate governance, Credit Default Swap, cryptocurrency, David Brooks, disintermediation, Dissolution of the Soviet Union, distributed ledger, Edward Glaeser, Edward Snowden, en.wikipedia.org, Erik Brynjolfsson, fiat currency, future of work, global supply chain, Google X / Alphabet X, industrial robot, Internet of things, invention of the printing press, Jaron Lanier, Jeff Bezos, job automation, John Markoff, Joi Ito, Kickstarter, knowledge economy, knowledge worker, lifelogging, litecoin, M-Pesa, Marc Andreessen, Mark Zuckerberg, Mikhail Gorbachev, mobile money, money: store of value / unit of account / medium of exchange, Nelson Mandela, new economy, offshore financial centre, open economy, Parag Khanna, paypal mafia, peer-to-peer, peer-to-peer lending, personalized medicine, Peter Thiel, precision agriculture, pre–internet, RAND corporation, Ray Kurzweil, recommendation engine, ride hailing / ride sharing, Rubik’s Cube, Satoshi Nakamoto, selective serotonin reuptake inhibitor (SSRI), self-driving car, sharing economy, Silicon Valley, Silicon Valley startup, Skype, smart cities, social graph, software as a service, special economic zone, supply-chain management, supply-chain management software, technoutopianism, The Future of Employment, Travis Kalanick, underbanked, Vernor Vinge, Watson beat the top human players on Jeopardy!, women in the workforce, Y Combinator, young professional

Because big data often relies on historical data or at least the status quo, it can easily reproduce discrimination against disadvantaged racial and ethnic minorities. The propensity models used in many algorithms can bake in a bias against someone who lived in the zip code of a low-income neighborhood at any point in his or her life. If an algorithm used by human resources companies queries your social graph and positively weighs candidates with the most existing connections to a workforce, it makes it more difficult to break in in the first place. In effect, these algorithms can hide bias behind a curtain of code. Big data is, by its nature, soulless and uncreative. It nudges us this way and that for reasons we are not meant to understand.
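The zip-code effect is easy to reproduce: even with every protected attribute stripped from the inputs, a score learned from historical outcomes replays the old disparity through the proxy. A toy sketch with invented data (the zip codes and approval rates here are hypothetical):

```python
# Invented historical lending records: applicants from zip 90001 were
# historically denied more often, for reasons unrelated to merit.
history = [
    {"zip": "90001", "approved": 0},
    {"zip": "90001", "approved": 0},
    {"zip": "90001", "approved": 1},
    {"zip": "94110", "approved": 1},
    {"zip": "94110", "approved": 1},
    {"zip": "94110", "approved": 0},
]

def learned_score(applicant):
    """A naive 'propensity' model: the historical approval rate for the
    applicant's zip code. No protected attribute appears anywhere in the
    inputs, yet the old disparity is reproduced exactly."""
    rows = [r for r in history if r["zip"] == applicant["zip"]]
    return sum(r["approved"] for r in rows) / len(rows)

# Two otherwise identical applicants get different scores purely
# because of where they once lived.
low = learned_score({"zip": "90001"})    # 1/3
high = learned_score({"zip": "94110"})   # 2/3
```

This is the "curtain of code": the bias is no longer visible as an explicit rule, only as a correlation the model inherited from the status quo.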


pages: 285 words: 86,853

What Algorithms Want: Imagination in the Age of Computing by Ed Finn

Airbnb, Albert Einstein, algorithmic bias, algorithmic trading, Amazon Mechanical Turk, Amazon Web Services, bitcoin, blockchain, Chuck Templeton: OpenTable:, Claude Shannon: information theory, commoditize, Credit Default Swap, crowdsourcing, cryptocurrency, disruptive innovation, Donald Knuth, Douglas Engelbart, Elon Musk, factory automation, fiat currency, Filter Bubble, Flash crash, game design, Google Glasses, Google X / Alphabet X, Hacker Conference 1984, High speed trading, hiring and firing, Ian Bogost, invisible hand, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Markoff, Just-in-time delivery, Kickstarter, late fees, lifelogging, Loebner Prize, Lyft, Mother of all demos, Nate Silver, natural language processing, Netflix Prize, new economy, Nicholas Carr, Norbert Wiener, PageRank, peer-to-peer, Peter Thiel, Ray Kurzweil, recommendation engine, Republic of Letters, ride hailing / ride sharing, Satoshi Nakamoto, self-driving car, sharing economy, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, social graph, software studies, speech recognition, statistical model, Steve Jobs, Steven Levy, Stewart Brand, supply-chain management, TaskRabbit, technological singularity, technoutopianism, The Coming Technological Singularity, the scientific method, The Signal and the Noise by Nate Silver, The Structural Transformation of the Public Sphere, The Wealth of Nations by Adam Smith, transaction costs, traveling salesman, Turing machine, Turing test, Uber and Lyft, Uber for X, uber lyft, urban planning, Vannevar Bush, Vernor Vinge, wage slave

The apparent transparency and simplicity of computational systems are leading many to see them as vehicles for unbiased decision-making. Companies like UpStart and ZestFinance view computation as a way to judge financial reliability and make loans to people who fail more traditional algorithmic tests of credit-worthiness, like credit scores.14 These systems essentially deploy algorithms to counter the bias of other algorithms, or more cynically to identify business opportunities missed by others. The companies behind these systems are relatively unusual, however, in acknowledging the ideological framing of their business plans, and explicitly addressing how their systems attempt to judge “character.”


pages: 1,172 words: 114,305

New Laws of Robotics: Defending Human Expertise in the Age of AI by Frank Pasquale

affirmative action, Affordable Care Act / Obamacare, Airbnb, algorithmic bias, Amazon Mechanical Turk, augmented reality, Automated Insights, autonomous vehicles, basic income, battle of ideas, Bernie Sanders, Bill Joy: nanobots, bitcoin, blockchain, call centre, citizen journalism, Clayton Christensen, collective bargaining, commoditize, computer vision, conceptual framework, coronavirus, corporate social responsibility, correlation does not imply causation, Covid-19, COVID-19, cryptocurrency, data is the new oil, decarbonisation, deskilling, digital twin, disinformation, disruptive innovation, don't be evil, Donald Trump, Douglas Engelbart, effective altruism, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Filter Bubble, finite state, Flash crash, future of work, Google Chrome, Google Glasses, high net worth, hiring and firing, Ian Bogost, independent contractor, informal economy, information asymmetry, information retrieval, interchangeable parts, invisible hand, Jaron Lanier, job automation, John Markoff, Joi Ito, Khan Academy, knowledge economy, late capitalism, Marc Andreessen, Mark Zuckerberg, means of production, medical malpractice, meta-analysis, Modern Monetary Theory, Money creation, move fast and break things, mutually assured destruction, natural language processing, new economy, Nicholas Carr, Norbert Wiener, nuclear winter, obamacare, paradox of thrift, pattern recognition, payday loans, personalized medicine, Peter Singer: altruism, Philip Mirowski, pink-collar, Plutocrats, plutocrats, pre–internet, profit motive, QR code, quantitative easing, race to the bottom, RAND corporation, Ray Kurzweil, recommendation engine, regulatory arbitrage, Robert Shiller, Rodney Brooks, Ronald Reagan, self-driving car, sentiment analysis, Shoshana Zuboff, Silicon Valley, Singularitarianism, smart cities, smart contracts, software is eating the world, South China Sea, Steve Bannon, surveillance capitalism, TaskRabbit,
technoutopianism, telepresence, telerobotics, The Future of Employment, Therac-25, Thorstein Veblen, too big to fail, Turing test, universal basic income, unorthodox policies, wage slave, Watson beat the top human players on Jeopardy!, working poor, Works Progress Administration, zero day

Mike Butcher, “The Robot-Recruiter Is Coming—VCV’s AI Will Read Your Face in a Job Interview,” TechCrunch, April 23, 2019, https://techcrunch.com/2019/04/23/the-robot-recruiter-is-coming-vcvs-ai-will-read-your-face-in-a-job-interview/. 2. Miranda Bogen and Aaron Rieke, Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias (Washington, DC: Upturn, 2018), https://www.upturn.org/static/reports/2018/hiring-algorithms/files/Upturn%20—%20Help%20Wanted%20-%20An%20Exploration%20of%20Hiring%20Algorithms,%20Equity%20and%20Bias.pdf. 3. There are numerous examples of person-judging technology reinscribing and reinforcing old forms of unearned privilege and unfair disadvantage.


pages: 410 words: 119,823

Radical Technologies: The Design of Everyday Life by Adam Greenfield

3D printing, Airbnb, algorithmic bias, augmented reality, autonomous vehicles, bank run, barriers to entry, basic income, bitcoin, blockchain, business intelligence, business process, call centre, cellular automata, centralized clearinghouse, centre right, Chuck Templeton: OpenTable:, cloud computing, collective bargaining, combinatorial explosion, Computer Numeric Control, computer vision, Conway's Game of Life, cryptocurrency, David Graeber, dematerialisation, digital map, disruptive innovation, distributed ledger, drone strike, Elon Musk, Ethereum, ethereum blockchain, facts on the ground, fiat currency, global supply chain, global village, Google Glasses, Ian Bogost, IBM and the Holocaust, industrial robot, informal economy, information retrieval, Internet of things, James Watt: steam engine, Jane Jacobs, Jeff Bezos, job automation, John Conway, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, joint-stock company, Kevin Kelly, Kickstarter, late capitalism, license plate recognition, lifelogging, M-Pesa, Mark Zuckerberg, means of production, megacity, megastructure, minimum viable product, money: store of value / unit of account / medium of exchange, natural language processing, Network effects, New Urbanism, Occupy movement, Oculus Rift, Pareto efficiency, pattern recognition, Pearl River Delta, performance metric, Peter Eisenman, Peter Thiel, planetary scale, Ponzi scheme, post scarcity, post-work, RAND corporation, recommendation engine, RFID, rolodex, Satoshi Nakamoto, self-driving car, sentiment analysis, shareholder value, sharing economy, Shenzhen special economic zone , Silicon Valley, smart cities, smart contracts, social intelligence, sorting algorithm, special economic zone, speech recognition, stakhanovite, statistical model, stem cell, technoutopianism, Tesla Model S, the built environment, The Death and Life of Great American Cities, The Future of 
Employment, transaction costs, Uber for X, undersea cable, universal basic income, urban planning, urban sprawl, When a measure becomes a target, Whole Earth Review, WikiLeaks, women in the workforce

It will therefore have problems with accurate identification when presented with a black Camaro. Or it could suffer from the opposite problem, bias. In the context of machine learning, bias means that even after extensive training, an algorithm has failed to acquire anything essential at all about the set of target objects it’s being asked to identify. An algorithm displaying high bias is basically taking random stabs in the dark, however much confidence it may seem to be mustering in its labeling, and will without hesitation identify outright static as a house, a whale or a chair. (We should be careful to distinguish this sense of the word from its more usual, pejorative sense, in which the implicit prejudices of the party responsible for training an algorithm are reflected in its output—though that happens too, as on the notorious occasion on which a Google Images algorithm identified a picture of black people as “gorillas,” apparently because the only training images labeled “people” it had ever been provided had light skin.)6 However they might undermine an algorithm’s practical utility, or embarrass the software developers involved, errors of bias and overfitting can be corrected.
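The statistical sense of "bias" described here, a model family too rigid to capture anything essential about its target, shows up even in a toy regression. A sketch (not from the book): the best constant predictor for a curved target keeps a large training error no matter how much data of the same shape it sees.

```python
# Target relationship the model should learn: y = x**2.
xs = list(range(-5, 6))
ys = [x * x for x in xs]

# A maximally rigid ("high bias") model family: constant predictors.
# The best member of that family is simply the mean of the targets.
constant = sum(ys) / len(ys)

# Mean squared error on the training data itself.
mse = sum((y - constant) ** 2 for y in ys) / len(ys)

# The model returns the same value for every input -- real examples and
# outright static alike -- so its error is large and stays large:
# it has learned nothing about how y depends on x.
```

Overfitting is the mirror image: a model flexible enough to memorize the training set (including a black Camaro labeled as noise) while generalizing poorly.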


pages: 448 words: 117,325

Calling Bullshit: The Art of Scepticism in a Data-Driven World by Jevin D. West, Carl T. Bergstrom

airport security, algorithmic bias, Amazon Mechanical Turk, Andrew Wiles, bitcoin, cloud computing, computer vision, correlation coefficient, correlation does not imply causation, crowdsourcing, cryptocurrency, delayed gratification, disinformation, Dmitri Mendeleev, Donald Trump, Elon Musk, epigenetics, Estimating the Reproducibility of Psychological Science, experimental economics, invention of the printing press, John Markoff, longitudinal study, Lyft, meta-analysis, new economy, p-value, Pluto: dwarf planet, publication bias, RAND corporation, randomized controlled trial, replication crisis, ride hailing / ride sharing, Ronald Reagan, selection bias, self-driving car, Silicon Valley, Silicon Valley startup, social graph, Socratic dialogue, Stanford marshmallow experiment, statistical model, stem cell, superintelligent machines, the scientific method, theory of mind, Tim Cook: Apple, twin studies, Uber and Lyft, Uber for X, uber lyft, When a measure becomes a target

Sky News. July 4, 2019. McCool, John H. “Opinion: Why I Published in a Predatory Journal.” The Scientist. April 6, 2017. Merton, R. K. “Priorities in Scientific Discovery: A Chapter in the Sociology of Science.” American Sociological Review 22 (1957): 635–59. “Mortgage Algorithms Perpetuate Racial Bias in Lending, Study Finds.” Press release. University of California, Berkeley. November 13, 2018. “NASA Twins Study Confirms Preliminary Findings.” Press release. National Aeronautics and Space Administration. January 31, 2018. https://www.nasa.gov/​feature/​nasa-twins-study-confirms-preliminary-findings.


pages: 296 words: 78,631

Hello World: Being Human in the Age of Algorithms by Hannah Fry

23andMe, 3D printing, Air France Flight 447, Airbnb, airport security, algorithmic bias, augmented reality, autonomous vehicles, backpropagation, Brixton riot, chief data officer, computer vision, crowdsourcing, DARPA: Urban Challenge, Douglas Hofstadter, Elon Musk, Firefox, Google Chrome, Gödel, Escher, Bach, Ignaz Semmelweis: hand washing, John Markoff, Mark Zuckerberg, meta-analysis, pattern recognition, Peter Thiel, RAND corporation, ransomware, recommendation engine, ride hailing / ride sharing, selection bias, self-driving car, Shai Danziger, Silicon Valley, Silicon Valley startup, Snapchat, speech recognition, Stanislav Petrov, statistical model, Stephen Hawking, Steven Levy, Tesla Model S, The Wisdom of Crowds, Thomas Bayes, Watson beat the top human players on Jeopardy!, web of trust, William Langewiesche, you are the product

(TV show) 97–9 John Carter (film) 180 Johnson, Richard 50, 51 Jones Beach 1 Jones, Robert 13–14 judges anchoring effect 73 bail, factors for consideration 73 decision-making consistency in 51 contradictions in 52–3 differences in 52 discretion in 53 unbiased 77 judges (continued) discrimination and bias 70–1, 75 intuition and considered thought 72 lawyers’ preference over algorithms 76–7 vs machines 59–61 offenders’ preference over algorithms 76 perpetuation of bias 73 sentencing 53–4, 63 use of algorithms 63, 64 Weber’s Law 74–5 Jukebox 192 junk algorithms 200 Just Noticeable Difference 74 justice 49–78 algorithms and 54–6 justification for 77 appeals process 51 Brixton riots 49–51 by country Australia 53 Canada 54 England 54 Ireland 54 Scotland 54 United States 53, 54 Wales 54 discretion of judges 53 discrimination 70–1 humans vs machines 59–61, 62–4 hypothetical cases (UK research) 52–3 defendants appearing twice 52–3 differences in judgement 52, 53 hypothetical cases (US research) 51–2 differences in judgements 52 differences in sentencing 52 inherent injustice 77 machine bias 65–71 maximum terms 54 purpose of 77–8 re-offending 54, 55 reasonable doubt 51 rehabilitation 55 risk-assessment algorithms 56 sentencing consistency in 51 mitigating factors in 53 substantial grounds 51 Kadoodle 15–16 Kahneman, Daniel 72 Kanevsky, Dr Jonathan 93, 95 kangaroos 128 Kant, Immanuel 185 Kasparov, Gary 5-7, 202 Kelly, Frank 87 Kerner, Winifred 188–9 Kernighan, Brian x Killingbeck 145, 146 Larson, Steve 188–9 lasers 119–20 Leibniz, Gottfried 184 Leroi, Armand 186, 192–3 level 0 (driverless technology) 131 level 1 (driverless technology) 131 level 2 (driverless technology) 131, 136 careful attention 134–5 level 3 (driverless technology) 131 technical challenge 136 level 4 (driverless technology) 131 level 5 (driverless technology) 131 Li Yingyun 45 Lickel, Charles 97–8 LiDAR (Light Detection and Ranging) 119–20 life insurance 109 ‘Lockdown’ (52Metro) 177 logic 8 logical 
instructions 8 London Bridge 172 London School of Economics (LSE) 129 Loomis, Eric 217n38 Los Angeles Police Department 152, 155 Lucas, Teghan 161–2, 163 machine-learning algorithms 10–11 neural networks 85–6 random forests 58–9 machines art and 194 bias in 65–71 diagnostic 98–101, 110–11 domination of humans 5-6 vs humans 59–61, 62–4 paradoxical relationship with 22–3 recognising images 84–7 superior judgement of 16 symbolic dominance over humans 5-6 Magic Test 199 magical illusions 18 mammogram screenings 94, 96 manipulation 39–44 micro-manipulation 42–4 Maple, Jack 147–50 Marx, Gary 173 mastectomies 83, 84, 92, 94 maternity wards, deaths on 81 mathematical certainty 68 mathematical objects 8 McGrayne, Sharon Bertsch 122 mechanized weaving machines 2 Medicaid assistance 16–17 medical conditions, algorithms for 96–7 medical records 102–7 benefits of algorithms 106 DeepMind 104–5 disconnected 102–3 misuse of data 106 privacy 105–7 medicine 79–112 in ancient times 80 cancer diagnoses study 79–80 complexity of 103–4 diabetic retinopathy 96 diagnostic machines 98–101, 110–11 choosing between individuals and the population 111 in fifteenth-century China 81 Hippocrates and 80 magic and 80 medical records 102–6 neural networks 85–6, 95, 96, 219–20n11 in nineteenth-century Europe 81 pathology 79, 82–3 patterns in data 79–81 predicting dementia 90–2 scientific base 80 see also Watson (IBM computer) Meehl, Paul 21–2 MegaFace challenge 168–9 Mercedes 125–6 microprocessors x Millgarth 145, 146 Mills, Tamara 101–2, 103 MIT Technology Review 101 modern inventions 2 Moses, Robert 1 movies see films music 176–80 choosing 176–8 diversity of charts 186 emotion and 189 genetic algorithms 191–2 hip hop 186 piano experiment 188–90 algorithm 188, 189–91 popularity 177, 178 quality 179, 180 terrible, success of 178–9 Music Lab 176–7, 179, 180 Musk, Elon 138 MyHeritage 110 National Geographic ­Genographic project 110 National Highway Traffic Safety Administration 135 Navlab 117 Netflix 
8, 188 random forests 59 neural networks 85–6, 95, 119, 201, 219–20n11 driverless cars 117–18 in facial recognition 166–7 predicting performances of films 183 New England Journal of ­Medicine 94 New York City subway crime 147–50 anti-social behaviour 149 fare evasion 149 hotspots 148, 149 New York Police Department (NYPD) 172 New York Times 116 Newman, Paul 127–8, 130 NHS (National Health Service) computer virus in hospitals 105 data security record 105 fax machines 103 linking of healthcare records 102–3 paper records 103 prioritization of non-smokers for operations 106 nuclear war 18–19 Nun Study 90–2 obesity 106 OK Cupid 9 Ontario 169–70 openworm project 13 Operation Lynx 145–7 fingerprints 145 overruling algorithms correctly 19–20 incorrectly 20–1 Oxbotica 127 Palantir Technologies 31 Paris Auto Show (2016) 124–5 parole 54–5 Burgess’s forecasting power 55–6 violation of 55–6 passport officers 161, 164 PathAI 82 pathologists 82 vs algorithms 88 breast cancer research on corpses 92–3 correct diagnoses 83 differences of opinion 83–4 diagnosing cancerous tumours 90 sensitivity and 88 specificity and 88 pathology 79, 82 and biology 82–3 patterns in data 79–81, 103, 108 payday lenders 35 personality traits 39 advertising and 40–1 inferred by algorithm 40 research on 39–40 Petrov, Stanislav 18–19 piano experiment 188–90 pigeons 79–80 Pomerleau, Dean 118–19 popularity 177, 178, 179, 183–4 power 5–24 blind faith in algorithms 13–16 overruling algorithms 19–21 struggle between humans and algorithms 20–4 trusting algorithms 16–19 power of veto 19 Pratt, Gill 137 precision in justice 53 prediction accuracy of 66, 67, 68 algorithms vs humans 22, 59–61, 62–5 Burgess 55–6 of crime burglary 150–1 HunchLab algorithm 157–8 PredPol algorithm 152–7, 158 risk factor 152 Strategic Subject List algorithm 158 decision trees 56–8 dementia 90–2 prediction (continued) development of abnormalities 87, 95 homicide 62 of personality 39–42 of popularity 177, 178, 179, 180, 183–4 powers of 
92–6 of pregnancy 29–30 re-offending criminals 55–6 recidivism 62, 63–4, 65 of successful films 180–1, 182–3, 183 superiority of algorithms 22 see also Clinical vs Statistical Prediction (Meehl); neural networks predictive text 190–1 PredPol (PREDictive POL­icing) 152–7, 158, 228–9n27 assessing locations at risk 153–4 cops on the dots 155–6 fall in crime 156 feedback loop 156–7 vs humans, test 153–4 target hardening 154–5 pregnancy prediction 29–30 prescriptive sentencing systems 53, 54 prioritization algorithms 8 prisons cost of incarceration 61 Illinois 55, 56 reduction in population 61 privacy 170, 172 false sense of 47 issues 25 medical records 105–7 overriding of 107 sale of data 36–9 probabilistic inference 124, 127 probability 8 ProPublica 65–8, 70 quality 179, 180 ‘good’ changing nature of 184 defining 184 quantifying 184–8 difficulty of 184 Washington Post experiment 185–6 racial groups COMPAS algorithm 65–6 rates of arrest 68 radar 119–20 RAND Corporation 158 random forests technique 56–9 rape 141, 142 re-offending 54 prediction of 55–6 social types of inmates 55, 56 recidivism 56, 62, 201 rates 61 risk scores 63–4, 65 regulation of algorithms 173 rehabilitation 55 relationships 9 Republican voters 41 Rhode Island 61 Rio de Janeiro–Galeão International Airport 132 risk scores 63–4, 65 Robinson, Nicholas 49, 50, 50–1, 77 imprisonment 51 Rossmo, Kim 142–3 algorithm 145–7 assessment of 146 bomb factories 147 buffer zone 144 distance decay 144 flexibility of 146 stagnant water pools 146–7 Operation Lynx 145–7 Rotten Tomatoes website 181 Royal Free NHS Trust 222–3n48 contract with DeepMind 104–5 access to full medical histories 104–5 outrage at 104 Rubin’s vase 211n13 rule-based algorithms 10, 11, 85 Rutherford, Adam 110 Safari browser 47 Sainsbury’s 27 Salganik, Matthew 176–7, 178 Schmidt, Eric 28 School Sisters of Notre Dame 90, 91 Science magazine 15 Scunthorpe 2 search engines 14–15 experiment 14–15 Kadoodle 15–16 Semmelweis, Ignaz 81 sensitivity, 
principle of 87, 87–8 sensors 120 sentencing algorithms for 62–4 COMPAS 63, 64 considerations for 62–3 consistency in 51 length of 62–3 influencing 73 Weber’s Law 74–5 mitigating factors in 53 prescriptive systems 53, 54 serial offenders 144, 145 serial rapists 141–2 Sesame Credit 45–6, 168 sexual attacks 141–2 shoplifters 170 shopping habits 28, 29, 31 similarity 187 Slash X (bar) 113, 114, 115 smallpox inoculation 81 Snowden, David 90–2 social proof 177–8, 179 Sorensen, Alan 178 Soviet Union detection of enemy missiles 18 protecting air space 18 retaliatory action 19 specificity, principle of 87, 87–8 speech recognition algorithms 9 Spotify 176, 188 Spotify Discover 188 Sreenivasan, Sameet 181–2 Stammer, Neil 172 Standford University 39–40 STAT website 100 statistics 143 computational 12 modern 107 NYPD 172 Stilgoe, Jack 128–9, 130 Strategic Subject List 158 subway crime see New York City subway crime supermarkets 26–8 superstores 28–31 Supreme Court of Wisconsin 64, 217n38 swine flu 101–2 Talley, Steve 159, 162, 163–4, 171, 230n47 Target 28–31 analysing unusual data ­patterns 28–9 expectant mothers 28–9 algorithm 29, 30 coupons 29 justification of policy 30 teenage pregnancy incident 29–30 target hardening 154–5 teenage pregnancy 29–30 Tencent YouTu Lab algorithm 169 Tesco 26–8 Clubcard 26, 27 customers buying behaviour 26–7 knowledge about 27 loyalty of 26 vouchers 27 online shopping 27–8 ‘My Favourites’ feature 27–8 removal of revealing items 28 Tesla 134, 135 autopilot system 138 full autonomy 138 full self-driving hardware 138 Thiel, Peter 31 thinking, ways of 72 Timberlake, Justin 175–6 Timberlake, Justin (artist) 175–6 Tolstoy, Leo 194 TomTom sat-nav 13–14 Toyota 137, 210n13 chauffeur mode 139 guardian mode 139 trolley problem 125–6 true positives 67 Trump election campaign 41, 44 trust 17–18 tumours 90, 93–4 Twain, Mark 193 Twitter 36, 37, 40 filtering 10 Uber driverless cars 135 human intervention 135 uberPOOL 10 United Kingdom (UK) database of facial 
images 168 facial recognition algorithms 161 genetic tests for Huntington’s disease 110 United States of America (USA) database of facial images 168 facial recognition algorithms 161 life insurance stipulations 109 linking of healthcare ­records 103 University of California 152 University of Cambridge research on personality traits 39–40 and advertising 40–1 algorithm 40 personality predictions 40 and Twitter 40 University of Oregon 188–90 University of Texas M.


pages: 280 words: 76,638

Rebel Ideas: The Power of Diverse Thinking by Matthew Syed

agricultural Revolution, Alfred Russel Wallace, algorithmic bias, call centre, Cass Sunstein, computer age, crowdsourcing, cuban missile crisis, delayed gratification, drone strike, Elon Musk, Erik Brynjolfsson, Ferguson, Missouri, Filter Bubble, Firefox, invention of writing, James Dyson, Jeff Bezos, knowledge economy, lateral thinking, market bubble, mass immigration, microbiome, Mitch Kapor, Peter Thiel, Richard Thaler, Ronald Reagan, Second Machine Age, self-driving car, Silicon Valley, social intelligence, Steve Jobs, Steve Wozniak, The Wealth of Nations by Adam Smith, The Wisdom of Crowds, traveling salesman

But if you aren’t aware of how those biases operate, if you aren’t collecting data and taking a little time to produce evidence-based processes, you will continue to blindly perpetuate old injustices. And so by not considering ways in which women’s lives differ from men’s, both on and offline, Gild’s coders inadvertently created an algorithm with a hidden bias against women.’
6 https://hbr.org/2019/06/why-you-should-create-a-shadow-board-of-younger-employees
7 https://www.npr.org/2015/09/14/440215976/journalist-says-the-drone-strike-that-killed-awlaki-did-not-silence-him
Footnotes
1 Collective Blindness
FN1 This was partly about the fear that gay staff, particularly those who had not come out, might be subject to blackmail.


pages: 208 words: 57,602

pages: 289 words: 87,292

The Strange Order of Things: The Biological Roots of Culture by Antonio Damasio

Albert Einstein, algorithmic bias, biofilm, business process, Daniel Kahneman / Amos Tversky, double helix, Gordon Gekko, invention of the wheel, invention of writing, invisible hand, job automation, mental accounting, meta-analysis, microbiome, Norbert Wiener, pattern recognition, Peter Singer: altruism, planetary scale, profit motive, Ray Kurzweil, Richard Feynman, self-driving car, Silicon Valley, Steven Pinker, Thomas Malthus

On the other hand, the public generally lacks the time and the method to convert massive amounts of information into sensible and practically usable conclusions. Moreover, the companies that manage the distribution and aggregation of the information assist the public in a dubious way: the flow of information is directed by company algorithms that, in turn, bias the presentation so as to suit a variety of financial, political, and social interests, not to mention the tastes of users so that they can continue within their own entertaining silo of opinions. One should acknowledge, in fairness, that the voices of wisdom from the past—the voices of experienced and thoughtful editors of newspapers and radio and television programs—were also biased and favored particular views of how societies should function.


pages: 592 words: 125,186

pages: 268 words: 75,850

The Formula: How Algorithms Solve All Our Problems-And Create More by Luke Dormehl

3D printing, algorithmic bias, algorithmic trading, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, big data - Walmart - Pop Tarts, call centre, Cass Sunstein, Clayton Christensen, commoditize, computer age, death of newspapers, deferred acceptance, disruptive innovation, Edward Lorenz: Chaos theory, Erik Brynjolfsson, Filter Bubble, Flash crash, Florence Nightingale: pie chart, Frank Levy and Richard Murnane: The New Division of Labor, Google Earth, Google Glasses, High speed trading, Internet Archive, Isaac Newton, Jaron Lanier, Jeff Bezos, job automation, John Markoff, Kevin Kelly, Kodak vs Instagram, lifelogging, Marshall McLuhan, means of production, Nate Silver, natural language processing, Netflix Prize, Panopticon Jeremy Bentham, pattern recognition, price discrimination, recommendation engine, Richard Thaler, Rosa Parks, self-driving car, sentiment analysis, Silicon Valley, Silicon Valley startup, Slavoj Žižek, social graph, speech recognition, Steve Jobs, Steven Levy, Steven Pinker, Stewart Brand, the scientific method, The Signal and the Noise by Nate Silver, upwardly mobile, Wall-E, Watson beat the top human players on Jeopardy!, Y Combinator

While Berk’s comments are designed to get actionable information to predict future criminality, one could argue that by black-boxing the inner workings of the technology, something similar has taken place with the underlying social dynamics. In other areas—particularly as relate to law—a reliance on algorithms might simply justify existing bias and lack of understanding, in the same way that the “filter bubble” effect described in Chapter 1 can result in some people not being presented with certain pieces of information, which may take the form of opportunities. “It’s not just you and I who don’t understand how these algorithms work—the engineers themselves don’t understand them entirely,” says scholar Ted Striphas.


The Deep Learning Revolution (The MIT Press) by Terrence J. Sejnowski

AI winter, Albert Einstein, algorithmic bias, algorithmic trading, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, augmented reality, autonomous vehicles, backpropagation, Baxter: Rethink Robotics, bioinformatics, cellular automata, Claude Shannon: information theory, cloud computing, complexity theory, computer vision, conceptual framework, constrained optimization, Conway's Game of Life, correlation does not imply causation, crowdsourcing, Danny Hillis, delayed gratification, discovery of DNA, Donald Trump, Douglas Engelbart, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, Flynn Effect, Frank Gehry, future of work, Google Glasses, Google X / Alphabet X, Guggenheim Bilbao, Gödel, Escher, Bach, haute couture, Henri Poincaré, I think there is a world market for maybe five computers, industrial robot, informal economy, Internet of things, Isaac Newton, John Conway, John Markoff, John von Neumann, Mark Zuckerberg, Minecraft, natural language processing, Netflix Prize, Norbert Wiener, orbital mechanics / astrodynamics, PageRank, pattern recognition, prediction markets, randomized controlled trial, recommendation engine, Renaissance Technologies, Rodney Brooks, self-driving car, Silicon Valley, Silicon Valley startup, Socratic dialogue, speech recognition, statistical model, Stephen Hawking, theory of mind, Thomas Bayes, Thomas Kuhn: the structure of scientific revolutions, traveling salesman, Turing machine, Von Neumann architecture, Watson beat the top human players on Jeopardy!, X Prize, Yogi Berra

Cowan, Manhattan Project to the Santa Fe Institute: The Memoirs of George A. Cowan (Albuquerque: University of New Mexico Press, 2010).
4. Google’s PageRank algorithm, which was invented by Google founders Larry Page and Sergey Brin, uses links to a webpage to rank the importance of pages on the Internet. It has since been elaborated with many layers of algorithms to manipulate the bias on searches.
5. A. D. I. Kramer, J. E. Guillory, and J. T. Hancock, “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks,” Proceedings of the National Academy of Sciences of the United States of America 111, no. 24 (2014): 8788–8790.
6. Stuart Kauffman, The Origins of Order: Self-Organization and Selection in Evolution (New York: Oxford University Press, 1993).
7.
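Note 4's description of PageRank (a page ranked by the links pointing at it) can be sketched with a short power-iteration loop. This is a minimal sketch of the published algorithm, not Google's production system; the damping factor of 0.85 and the tiny three-page web are illustrative choices of mine.

```python
# Minimal PageRank sketch: the rank of a page is the stationary share of a
# "random surfer" who follows a link with probability d and otherwise jumps
# to a random page. Hypothetical example, not Google's production code.

def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for p, outgoing in links.items():
            if not outgoing:                      # dangling page: spread evenly
                for q in pages:
                    new_rank[q] += d * rank[p] / n
            else:
                share = d * rank[p] / len(outgoing)
                for q in outgoing:
                    new_rank[q] += share
        rank = new_rank
    return rank

# Tiny three-page web: A and C both link to B, so B accumulates the most rank.
ranks = pagerank({"A": ["B"], "B": ["C"], "C": ["B"]})
print(max(ranks, key=ranks.get))  # B
```

The incoming-link structure alone decides the ordering: B is the most linked-to page, so it ends up ranked first.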


pages: 428 words: 103,544


pages: 442 words: 94,734

pages: 404 words: 92,713

pages: 472 words: 117,093

Machine, Platform, Crowd: Harnessing Our Digital Future by Andrew McAfee, Erik Brynjolfsson

"Robert Solow", 3D printing, additive manufacturing, AI winter, Airbnb, airline deregulation, airport security, Albert Einstein, algorithmic bias, Amazon Mechanical Turk, Amazon Web Services, artificial general intelligence, augmented reality, autonomous vehicles, backpropagation, backtesting, barriers to entry, bitcoin, blockchain, British Empire, business cycle, business process, carbon footprint, Cass Sunstein, centralized clearinghouse, Chris Urmson, cloud computing, cognitive bias, commoditize, complexity theory, computer age, creative destruction, crony capitalism, crowdsourcing, cryptocurrency, Daniel Kahneman / Amos Tversky, Dean Kamen, discovery of DNA, disintermediation, disruptive innovation, distributed ledger, double helix, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Ethereum, ethereum blockchain, everywhere but in the productivity statistics, family office, fiat currency, financial innovation, George Akerlof, global supply chain, Hernando de Soto, hive mind, independent contractor, information asymmetry, Internet of things, inventory management, iterative process, Jean Tirole, Jeff Bezos, jimmy wales, John Markoff, joint-stock company, Joseph Schumpeter, Kickstarter, law of one price, longitudinal study, Lyft, Machine translation of "The spirit is willing, but the flesh is weak." 
to Russian and back, Marc Andreessen, Mark Zuckerberg, meta-analysis, Mitch Kapor, moral hazard, multi-sided market, Myron Scholes, natural language processing, Network effects, new economy, Norbert Wiener, Oculus Rift, PageRank, pattern recognition, peer-to-peer lending, performance metric, Plutocrats, plutocrats, precision agriculture, prediction markets, pre–internet, price stability, principal–agent problem, Ray Kurzweil, Renaissance Technologies, Richard Stallman, ride hailing / ride sharing, risk tolerance, Ronald Coase, Satoshi Nakamoto, Second Machine Age, self-driving car, sharing economy, Silicon Valley, Skype, slashdot, smart contracts, Snapchat, speech recognition, statistical model, Steve Ballmer, Steve Jobs, Steven Pinker, supply-chain management, TaskRabbit, Ted Nelson, The Market for Lemons, The Nature of the Firm, the strength of weak ties, Thomas Davenport, Thomas L Friedman, too big to fail, transaction costs, transportation-network company, traveling salesman, Travis Kalanick, two-sided market, Uber and Lyft, Uber for X, uber lyft, ubercab, Watson beat the top human players on Jeopardy!, winner-take-all economy, yield management, zero day


pages: 444 words: 130,646

Twitter and Tear Gas: The Power and Fragility of Networked Protest by Zeynep Tufekci

4chan, active measures, Affordable Care Act / Obamacare, algorithmic bias, AltaVista, Andy Carvin, anti-communist, Bernie Sanders, British Empire, citizen journalism, collective bargaining, conceptual framework, crowdsourcing, disinformation, Donald Trump, Edward Snowden, feminist movement, Ferguson, Missouri, Filter Bubble, Howard Rheingold, income inequality, index card, interchangeable parts, invention of movable type, invention of writing, loose coupling, Mahatma Gandhi, Mark Zuckerberg, Menlo Park, Mikhail Gorbachev, moral hazard, moral panic, Naomi Klein, Network effects, new economy, obamacare, Occupy movement, offshore financial centre, pre–internet, race to the bottom, RAND corporation, ride hailing / ride sharing, Rosa Parks, sharing economy, Silicon Valley, Skype, Snapchat, the strength of weak ties, The Structural Transformation of the Public Sphere, The Theory of the Leisure Class by Thorstein Veblen, Thorstein Veblen, We are the 99%, WikiLeaks, Yochai Benkler

It can result in more polarization and at the same time deepen the filter bubble.44 The bias toward “Like” on Facebook promotes the echo-chamber effect, making it more likely that one sees posts one already agrees with. Of course, this builds upon the pre-existing human tendency to gravitate toward topics and positions one already agrees with—confirmation bias—which is well demonstrated in social science research. Facebook’s own studies show that the algorithm contributes to this bias by making the feed somewhat more tilted toward one’s existing views, reinforcing the echo chamber.45 Another type of bias is “comment” bias, which can promote visibility for the occasional quarrels that have garnered many comments. But how widespread are these problems, and what are their effects?
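The "tilted feed" effect Tufekci describes can be illustrated with a toy simulation. Everything below is my own construction (the opinion axis, the agreement threshold, the ranking rule); it is not Facebook's actual ranking code, only a sketch of why ranking by predicted "Like"s tilts the feed toward agreement.

```python
# Toy model: a feed that ranks posts by predicted "Like" probability,
# approximated here as closeness to the user's existing view, shows a more
# agreeable top-of-feed than a neutral chronological feed would.
import random

random.seed(0)
user_view = 0.8                                  # position on a 0..1 opinion axis
posts = [random.random() for _ in range(1000)]   # incoming posts, uniform views

def agreeing(shown, threshold=0.25):
    """Fraction of shown posts whose view lies within `threshold` of the user's."""
    return sum(abs(p - user_view) < threshold for p in shown) / len(shown)

chronological = posts[:100]                      # neutral baseline: first 100 posts
like_ranked = sorted(posts, key=lambda p: abs(p - user_view))[:100]  # ranked feed

print(f"chronological feed: {agreeing(chronological):.0%} agreeable")
print(f"'Like'-ranked feed: {agreeing(like_ranked):.0%} agreeable")
```

No individual post is altered; only the ordering changes, yet the ranked feed's top 100 is almost entirely agreement, which is the echo-chamber mechanism in miniature.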


pages: 337 words: 103,522

The Creativity Code: How AI Is Learning to Write, Paint and Think by Marcus Du Sautoy

3D printing, Ada Lovelace, Albert Einstein, algorithmic bias, Alvin Roth, Andrew Wiles, Automated Insights, Benoit Mandelbrot, Claude Shannon: information theory, computer vision, correlation does not imply causation, crowdsourcing, data is the new oil, Donald Trump, double helix, Douglas Hofstadter, Elon Musk, Erik Brynjolfsson, Fellow of the Royal Society, Flash crash, Gödel, Escher, Bach, Henri Poincaré, Jacquard loom, John Conway, Kickstarter, Loebner Prize, mandelbrot fractal, Minecraft, music of the spheres, Narrative Science, natural language processing, Netflix Prize, PageRank, pattern recognition, Paul Erdős, Peter Thiel, random walk, Ray Kurzweil, recommendation engine, Rubik’s Cube, Second Machine Age, Silicon Valley, speech recognition, Turing test, Watson beat the top human players on Jeopardy!, wikimedia commons

This bias in the data has led to a whole host of algorithms that are making unacceptable decisions: voice recognition software trained on male voices that doesn’t recognise women’s voices; image recognition software that classifies black people as gorillas; passport photo booths that tell Asians their photos are unacceptable because they have their eyes closed. In Silicon Valley, four out of five people hired in the tech industry are white males. This has led Buolamwini to set up the Algorithmic Justice League to fight bias in the data that algorithms are learning on. The legal system is also facing challenges as people are being rejected for mortgages, jobs or state benefits because of an algorithm. These people justifiably want to know why they have been turned down. But given that these algorithms are creating decision trees based on their interaction with data that is hard to unravel, justifying these decisions is not easy.
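The failures du Sautoy lists all stem from the same mechanism: a model fitted to skewed training data inherits the skew. A fully synthetic sketch (my own toy "voice detector", not any real recognizer; the pitch numbers are illustrative) shows the effect:

```python
# Synthetic demonstration: a toy detector learns its accepted pitch range from
# training data. With higher-pitched voices barely represented in training,
# the learned range excludes many of them -- the data's skew becomes the
# algorithm's bias.
import random

random.seed(1)

def voices(mean_hz, n, spread=20):
    """Synthetic fundamental-frequency samples for a group of speakers."""
    return [random.gauss(mean_hz, spread) for _ in range(n)]

# Skewed training set: 500 lower-pitched voices, only 5 higher-pitched ones.
training = voices(120, 500) + voices(210, 5)
low, high = min(training), max(training)        # learned "valid voice" band

def acceptance_rate(samples):
    return sum(low <= s <= high for s in samples) / len(samples)

print(f"well-represented group:  {acceptance_rate(voices(120, 200)):.0%} accepted")
print(f"under-represented group: {acceptance_rate(voices(210, 200)):.0%} accepted")
```

Nothing in the code mentions gender; the disparity arises purely from who was, and was not, in the training set.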


pages: 420 words: 100,811


pages: 523 words: 143,139

Algorithms to Live By: The Computer Science of Human Decisions by Brian Christian, Tom Griffiths

4chan, Ada Lovelace, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, algorithmic bias, algorithmic trading, anthropic principle, asset allocation, autonomous vehicles, Bayesian statistics, Berlin Wall, Bill Duvall, bitcoin, Community Supported Agriculture, complexity theory, constrained optimization, cosmological principle, cryptocurrency, Danny Hillis, David Heinemeier Hansson, delayed gratification, dematerialisation, diversification, Donald Knuth, double helix, Elon Musk, fault tolerance, Fellow of the Royal Society, Firefox, first-price auction, Flash crash, Frederick Winslow Taylor, Garrett Hardin, George Akerlof, global supply chain, Google Chrome, Henri Poincaré, information retrieval, Internet Archive, Jeff Bezos, Johannes Kepler, John Nash: game theory, John von Neumann, Kickstarter, knapsack problem, Lao Tzu, Leonard Kleinrock, linear programming, martingale, Nash equilibrium, natural language processing, NP-complete, P = NP, packet switching, Pierre-Simon Laplace, prediction markets, race to the bottom, RAND corporation, RFC: Request For Comment, Robert X Cringely, Sam Altman, sealed-bid auction, second-price auction, self-driving car, Silicon Valley, Skype, sorting algorithm, spectrum auction, Stanford marshmallow experiment, Steve Jobs, stochastic process, Thomas Bayes, Thomas Malthus, Tragedy of the Commons, traveling salesman, Turing machine, urban planning, Vickrey auction, Vilfredo Pareto, Walter Mischel, Y Combinator, zero-sum game

In almost every domain we’ve considered, we have seen how the more real-world factors we include—whether it’s having incomplete information when interviewing job applicants, dealing with a changing world when trying to resolve the explore/exploit dilemma, or having certain tasks depend on others when we’re trying to get things done—the more likely we are to end up in a situation where finding the perfect solution takes unreasonably long. And indeed, people are almost always confronting what computer science regards as the hard cases. Up against such hard cases, effective algorithms make assumptions, show a bias toward simpler solutions, trade off the costs of error against the costs of delay, and take chances. These aren’t the concessions we make when we can’t be rational. They’re what being rational means.
Notes
Please note that some of the links referenced are no longer working.
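That trade-off, a bias toward simple solutions over exhaustive optimality, can be made concrete on a tiny travelling-salesman instance. This sketch is my own illustration, not code from the book:

```python
# On hard problems such as the travelling salesman, a greedy heuristic "takes
# chances": it trades a small risk of error for a huge saving in time compared
# with exhaustive search, whose cost grows factorially with the city count.
from itertools import permutations
import math
import random

random.seed(3)
cities = [(random.random(), random.random()) for _ in range(8)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(order):
    return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
               for i in range(len(order)))

# Exhaustive search: guaranteed optimal, but checks all 8! = 40,320 orderings.
best = min(permutations(range(8)), key=tour_length)

# Greedy nearest-neighbour heuristic: one linear pass, no optimality guarantee.
tour, remaining = [0], set(range(1, 8))
while remaining:
    nxt = min(remaining, key=lambda c: dist(cities[tour[-1]], cities[c]))
    tour.append(nxt)
    remaining.remove(nxt)

print(tour_length(best) <= tour_length(tour))  # True: exhaustive is never worse...
print(tour_length(tour) / tour_length(best))   # ...but greedy is typically close
```

At eight cities both finish instantly; at forty, exhaustive search is hopeless while the greedy pass still costs almost nothing, which is exactly the rational concession the passage describes.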


pages: 629 words: 142,393

pages: 721 words: 197,134

Data Mining: Concepts, Models, Methods, and Algorithms by Mehmed Kantardzić

Albert Einstein, algorithmic bias, backpropagation, bioinformatics, business cycle, business intelligence, business process, butter production in bangladesh, combinatorial explosion, computer vision, conceptual framework, correlation coefficient, correlation does not imply causation, data acquisition, discrete time, El Camino Real, fault tolerance, finite state, Gini coefficient, information retrieval, Internet Archive, inventory management, iterative process, knowledge worker, linked data, loose coupling, Menlo Park, natural language processing, Netflix Prize, NP-complete, PageRank, pattern recognition, peer-to-peer, phenotype, random walk, RFID, semantic web, speech recognition, statistical model, Telecommunications Act of 1996, telemarketer, text mining, traveling salesman, web application

The method has two variants: random sampling without replacement and random sampling with replacement. Random sampling without replacement is a popular technique in which n distinct samples are selected from the N initial samples in the data set without repetition (a sample may not occur twice). The advantages of the approach are the simplicity of the algorithm and the absence of selection bias. In random sampling with replacement, samples are selected from the data set such that every sample has an equal chance of being selected, no matter how often it has already been drawn; that is, any sample may be selected more than once. Random sampling is not a one-time activity in a data-mining process.
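As a minimal illustration of the two variants (my sketch; the text prescribes no implementation), Python's standard library provides both directly:

```python
# Random sampling without vs. with replacement, per the description above.
import random

random.seed(42)
data = list(range(1, 101))            # the N = 100 initial samples

# Without replacement: n distinct samples, a sample may not occur twice.
without = random.sample(data, 10)
print(len(set(without)))              # 10 -- all selections are distinct

# With replacement: every draw considers the full data set with equal
# probability, so the same sample may be selected more than once.
with_repl = random.choices(data, k=10)
print(len(with_repl))                 # 10 draws, possibly containing repeats
```

`random.sample` guarantees distinctness by construction, while `random.choices` draws independently each time, matching the two variants' definitions.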


pages: 764 words: 261,694

The Elements of Statistical Learning (Springer Series in Statistics) by Trevor Hastie, Robert Tibshirani, Jerome Friedman

algorithmic bias, backpropagation, Bayesian statistics, bioinformatics, computer age, conceptual framework, correlation coefficient, G4S, greed is good, linear programming, p-value, pattern recognition, random walk, selection bias, speech recognition, statistical model, stochastic process, The Wisdom of Crowds

A. 510 Wong, W. 292 Wright, G. 664, 674, 693 Wright, M. 96, 421 Wu, T. 92, 294, 583 Wyner, A. 384, 603 Yang, N. 3, 49 Yang, Y. 686, 693 Yasui, Y. 664 Yeang, C. 654, 658 Yee, T. 300 Yekutieli, Y. 693 Yu, B. 90, 91, 384 Yuan, M. 90 Zhang, H. 90, 304, 428, 455 Zhang, J. 409–412, 605 Zhang, P. 257 Zhang, T. 384 Zhao, P. 90, 91 Zhao, Y. 693 Zhu, J. 89, 98, 174, 348, 349, 385, 426, 428, 434, 610, 611, 615, 657, 661, 664, 666, 693 Zidek, J. 84 Zou, H. 72, 78, 92, 349, 385, 550, 662, 693 This is page 737 Printer: Opaque this Index L1 regularization, see Lasso Activation function, 392–395 AdaBoost, 337–346 Adaptive lasso, 92 Adaptive methods, 429 Adaptive nearest neighbor methods, 475–478 Adaptive wavelet filtering, 181 Additive model, 295–304 Adjusted response, 297 Affine set, 130 Affine-invariant average, 482, 540 AIC, see Akaike information criterion Akaike information criterion (AIC), 230 Analysis of deviance, 124 Applications abstracts, 672 aorta, 204 bone, 152 California housing, 371–372, 591 countries, 517 demographics, 379–380 document, 532 flow cytometry, 637 galaxy, 201 heart attack, 122, 146, 207 lymphoma, 674 marketing, 488 microarray, 5, 505, 532 nested spheres, 590 New Zealand fish, 375–379 nuclear magnetic resonance, 176 ozone, 201 prostate cancer, 3, 49, 61, 608 protein mass spectrometry, 664 satellite image, 470 skin of the orange, 429–432 spam, 2, 300–304, 313, 320, 328, 352, 593 vowel, 440, 464 waveform, 451 ZIP code, 4, 404, 536–539 Archetypal analysis, 554–557 Association rules, 492–495, 499– 501 738 Index Automatic relevance determination, 411 Automatic selection of smoothing parameters , 156 B-Spline, 186 Back-propagation, 392–397, 408– 409 Backfitting, 297, 391 Backward selection, 58 stepwise selection, 59 Backward pass, 396 Bagging, 282–288, 409, 587 Basis expansions and regularization, 139–189 Basis functions, 141, 186, 189, 321, 328 Batch learning, 397 Baum–Welch algorithm, 272 Bayes classifier, 21 factor, 234 methods, 233–235, 267–272 rate, 21 
Bayesian, 409 Bayesian information criterion (BIC), 233 Benjamini–Hochberg method, 688 Best-subset selection, 57, 610 Between class covariance matrix, 114 Bias, 16, 24, 37, 160, 219 Bias-variance decomposition, 24, 37, 219 Bias-variance tradeoff, 37, 219 BIC, see Bayesian Information Criterion Boltzmann machines, 638–648 Bonferroni method, 686 Boosting, 337–386, 409 as lasso regression, 607–609 exponential loss and AdaBoost, 343 gradient boosting, 358 implementations, 360 margin maximization, 613 numerical optimization, 358 partial-dependence plots, 369 regularization path, 607 shrinkage, 364 stochastic gradient boosting, 365 tree size, 361 variable importance, 367 Bootstrap, 249, 261–264, 267, 271– 282, 587 relationship to Bayesian method, 271 relationship to maximum likelihood method, 267 Bottom-up clustering, 520–528 Bump hunting, see Patient rule induction method Bumping, 290–292 C5.0, 624 Canonical variates, 441 CART, see Classification and regression trees Categorical predictors, 10, 310 Censored data, 674 Classical multidimensional scaling, 570 Classification, 22, 101–137, 305– 317, 417–429 Classification and regression trees (CART), 305–317 Clique, 628 Clustering, 501–528 k-means, 509–510 agglomerative, 523–528 hierarchical, 520–528 Codebook, 515 Combinatorial algorithms, 507 Combining models, 288–290 Committee, 289, 587, 605 Comparison of learning methods, 350–352 Complete data, 276 Index Complexity parameter, 37 Computational shortcuts quadratic penalty, 659 Condensing procedure, 480 Conditional likelihood, 31 Confusion matrix, 301 Conjugate gradients, 396 Consensus, 285–286 Convolutional networks, 407 Coordinate descent, 92, 636, 668 COSSO, 304 Cost complexity pruning, 308 Covariance graph, 631 Cp statistic, 230 Cross-entropy, 308–310 Cross-validation, 241–245 Cubic smoothing spline, 151–153 Cubic spline, 151–153 Curse of dimensionality, 22–26 Dantzig selector, 89 Data augmentation, 276 Daubechies symmlet-8 wavelets, 176 De-correlation, 597 Decision 
boundary, 13–15, 21 Decision trees, 305–317 Decoder, 515, see encoder Decomposable models, 641 Degrees of freedom in an additive model, 302 in ridge regression, 68 of a tree, 336 of smoother matrices, 153–154, 158 Delta rule, 397 Demmler-Reinsch basis for splines, 156 Density estimation, 208–215 Deviance, 124, 309 Diagonal linear discriminant analysis, 651–654 Dimension reduction, 658 for nearest neighbors, 479 Discrete variables, 10, 310–311 739 Discriminant adaptive nearest neighbor classifier, 475–480 analysis, 106–119 coordinates, 108 functions, 109–110 Dissimilarity measure, 503–504 Dummy variables, 10 Early stopping, 398 Effective degrees of freedom, 17, 68, 153–154, 158, 232, 302, 336 Effective number of parameters, 15, 68, 153–154, 158, 232, 302, 336 Eigenvalues of a smoother matrix, 154 Elastic net, 662 EM algorithm, 272–279 as a maximization-maximization procedure, 277 for two component Gaussian mixture, 272 Encoder, 514–515 Ensemble, 616–623 Ensemble learning, 605–624 Entropy, 309 Equivalent kernel, 156 Error rate, 219–230 Error-correcting codes, 606 Estimates of in-sample prediction error, 230 Expectation-maximization algorithm, see EM algorithm Extra-sample error, 228 False discovery rate, 687–690, 692, 693 Feature, 1 extraction, 150 selection, 409, 658, 681–683 Feed-forward neural networks, 392– 408 740 Index Fisher’s linear discriminant, 106– 119, 438 Flexible discriminant analysis, 440– 445 Forward selection, 58 stagewise, 86, 608 stagewise additive modeling, 342 stepwise, 73 Forward pass algorithm, 395 Fourier transform, 168 Frequentist methods, 267 Function approximation, 28–36 Fused lasso, 666 Gap statistic, 519 Gating networks, 329 Gauss-Markov theorem, 51–52 Gauss-Newton method, 391 Gaussian (normal) distribution, 16 Gaussian graphical model, 630 Gaussian mixtures, 273, 463, 492, 509 Gaussian radial basis functions, 212 GBM, see Gradient boosting GBM package, see Gradient boosting GCV, see Generalized cross-validation GEM (generalized EM), 277 
Generalization error, 220 performance, 220 Generalized additive model, 295– 304 Generalized association rules, 497– 499 Generalized cross-validation, 244 Generalized linear discriminant analysis, 438 Generalized linear models, 125 Gibbs sampler, 279–280, 641 for mixtures, 280 Gini index, 309 Global Markov property, 628 Gradient Boosting, 359–361 Gradient descent, 358, 395–397 Graph Laplacian, 545 Graphical lasso, 636 Grouped lasso, 90 Haar basis function, 176 Hammersley-Clifford theorem, 629 Hard-thresholding, 653 Hat matrix, 46 Helix, 582 Hessian matrix, 121 Hidden nodes, 641–642 Hidden units, 393–394 Hierarchical clustering, 520–528 Hierarchical mixtures of experts, 329–332 High-dimensional problems, 649 Hints, 96 Hyperplane, see Separating Hyperplane ICA, see Independent components analysis Importance sampling, 617 In-sample prediction error, 230 Incomplete data, 332 Independent components analysis, 557–570 Independent variables, 9 Indicator response matrix, 103 Inference, 261–294 Information Fisher, 266 observed, 274 Information theory, 236, 561 Inner product, 53, 668, 670 Inputs, 10 Instability of trees, 312 Intercept, 11 Invariance manifold, 471 Invariant metric, 471 Inverse wavelet transform, 179 Index IRLS, see Iteratively reweighted least squares Irreducible error, 224 Ising model, 638 ISOMAP, 572 Isometric feature mapping, 572 Iterative proportional scaling, 585 Iteratively reweighted least squares (IRLS), 121 Jensen’s inequality, 293 Join tree, 629 Junction tree, 629 K-means clustering, 460, 509–514 K-medoid clustering, 515–520 K-nearest neighbor classifiers, 463 Karhunen-Loeve transformation (principal components), 66– 67, 79, 534–539 Karush-Kuhn-Tucker conditions, 133, 420 Kernel classification, 670 density classification, 210 density estimation, 208–215 function, 209 logistic regression, 654 principal component, 547–550 string, 668–669 trick, 660 Kernel methods, 167–176, 208–215, 423–438, 659 Knot, 141, 322 Kriging, 171 Kruskal-Shephard scaling, 570 
Kullback-Leibler distance, 561 Lagrange multipliers, 293 Landmark, 539 Laplacian, 545 Laplacian distribution, 72 LAR, see Least angle regression Lasso, 68–69, 86–90, 609, 635, 636, 661 fused, 666 Latent factor, 674 variable, 678 Learning, 1 Learning rate, 396 Learning vector quantization, 462 Least angle regression, 73–79, 86, 610 Least squares, 11, 32 Leave-one-out cross-validation, 243 LeNet, 406 Likelihood function, 265, 273 Linear basis expansion, 139–148 Linear combination splits, 312 Linear discriminant function, 106–119 Linear methods for classification, 101–137 for regression, 43–99 Linear models and least squares, 11 Linear regression of an indicator matrix, 103 Linear separability, 129 Linear smoother, 153 Link function, 296 LLE, see Local linear embedding Local false discovery rate, 693 Local likelihood, 205 Local linear embedding, 572 Local methods in high dimensions, 22–27 Local minima, 400 Local polynomial regression, 197 Local regression, 194, 200 Localization in time/frequency, 175 Loess (local regression), 194, 200 Log-linear model, 639 Log-odds ratio (logit), 119 Logistic (sigmoid) function, 393 Logistic regression, 119–128, 299 Logit (log-odds ratio), 119 Loss function, 18, 21, 219–223, 346 Loss matrix, 310 Lossless compression, 515 Lossy compression, 515 LVQ, see Learning Vector Quantization Mahalanobis distance, 441 Majority vote, 337 Majorization, 294, 553 Majorize-Minimize algorithm, 294, 584 MAP (maximum aposteriori) estimate, 270 Margin, 134, 418 Market basket analysis, 488, 499 Markov chain Monte Carlo (MCMC) methods, 279 Markov graph, 627 Markov networks, 638–648 MARS, see Multivariate adaptive regression splines MART, see Multiple additive regression trees Maximum likelihood estimation, 31, 261, 265 MCMC, see Markov Chain Monte Carlo Methods MDL, see Minimum description length Mean field approximation, 641 Mean squared error, 24, 285 Memory-based method, 463 Metropolis-Hastings algorithm, 282 Minimum description length (MDL), 235 Minorization, 294, 553 Minorize-Maximize algorithm, 294, 584 Misclassification error, 17, 309 Missing data, 276, 332–333 Missing predictor values, 332–333 Mixing proportions, 214 Mixture discriminant analysis, 449–455 Mixture modeling, 214–215, 272–275, 449–455, 692 Mixture of experts, 329–332 Mixtures and the EM algorithm, 272–275 MM algorithm, 294, 584 Mode seekers, 507 Model averaging and stacking, 288 Model combination, 289 Model complexity, 221–222 Model selection, 57, 222–223, 230–231 Modified regression, 634 Monte Carlo method, 250, 495 Mother wavelet, 178 Multidimensional scaling, 570–572 Multidimensional splines, 162 Multiedit algorithm, 480 Multilayer perceptron, 400, 401 Multinomial distribution, 120 Multiple additive regression trees (MART), 361 Multiple hypothesis testing, 683–693 Multiple minima, 291, 400 Multiple outcome shrinkage and selection, 84 Multiple outputs, 56, 84, 103–106 Multiple regression from simple univariate regression, 52 Multiresolution analysis, 178 Multivariate adaptive regression splines (MARS), 321–327 Multivariate nonparametric regression, 445 Nadaraya–Watson estimate, 193 Naive Bayes classifier, 108, 210–211, 694 Natural cubic splines, 144–146 Nearest centroids, 670 Nearest neighbor methods, 463–483 Nearest shrunken centroids, 651–654, 694 Network diagram, 392 Neural networks, 389–416 Newton's method (Newton-Raphson procedure), 120–122 Non-negative matrix factorization, 553–554 Nonparametric logistic regression, 299–304 Normal (Gaussian) distribution, 16, 31 Normal equations, 12 Numerical optimization, 395–396 Object dissimilarity, 505–507 Online algorithm, 397 Optimal scoring, 445, 450–451 Optimal separating hyperplane, 132–135 Optimism of the training error rate, 228–230 Ordered categorical (ordinal) predictor, 10, 504 Ordered features, 666 Orthogonal predictors, 53 Overfitting, 220, 228–230, 364 PageRank, 576 Pairwise distance, 668 Pairwise Markov property, 628 Parametric bootstrap, 264 Partial dependence plots, 369–370 Partial least squares, 80–82, 680 Partition function, 638 Parzen window, 208 Pasting, 318 Path algorithm, 73–79, 86–89, 432 Patient rule induction method (PRIM), 317–321, 499–501 Peeling, 318 Penalization, 607, see regularization Penalized discriminant analysis, 446–449 Penalized polynomial regression, 171 Penalized regression, 34, 61–69, 171 Penalty matrix, 152, 189 Perceptron, 392–416 Piecewise polynomials and splines, 36, 143 Posterior distribution, 268 probability, 233–235, 268 Power method, 577 Pre-conditioning, 681–683 Prediction accuracy, 329 Prediction error, 18 Predictive distribution, 268 PRIM, see Patient rule induction method Principal components, 66–67, 79–80, 534–539, 547 regression, 79–80 sparse, 550 supervised, 674 Principal curves and surfaces, 541–544 Principal points, 541 Prior distribution, 268–272 Procrustes average, 540 distance, 539 Projection pursuit, 389–392, 565 regression, 389–392 Prototype classifier, 459–463 Prototype methods, 459–463 Proximity matrices, 503 Pruning, 308 QR decomposition, 55 Quadratic approximations and inference, 124 Quadratic discriminant function, 108, 110 Radial basis function (RBF) network, 392 Radial basis functions, 212–214, 275, 393 Radial kernel, 548 Random forest, 409, 587–604 algorithm, 588 bias, 596–601 comparison to boosting, 589 example, 589 out-of-bag (oob), 592 overfit, 596 proximity plot, 595 variable importance, 593 variance, 597–601 Rao score test, 125 Rayleigh quotient, 116 Receiver operating characteristic (ROC) curve, 317 Reduced-rank linear discriminant analysis, 113 Regression, 11–14, 43–99, 200–204 Regression spline, 144 Regularization, 34, 167–176 Regularized discriminant analysis, 112–113, 654 Relevance network, 631 Representer of evaluation, 169 Reproducing kernel Hilbert space, 167–176, 428–429 Reproducing property, 169 Responsibilities, 274–275 Ridge regression, 61–68, 650, 659 Risk factor, 122 Robust fitting, 346–350 Rosenblatt's perceptron learning algorithm, 130 Rug plot, 303 Rulefit, 623 SAM, 690–693, see Significance Analysis of Microarrays Sammon mapping, 571 SCAD, 92 Scaling of the inputs, 398 Schwarz's criterion, 230–235 Score equations, 120, 265 Self-consistency property, 541–543 Self-organizing map (SOM), 528–534 Sensitivity of a test, 314–317 Separating hyperplane, 132–135 Separating hyperplanes, 136, 417–419 Separator, 628 Shape average, 482, 540 Shrinkage methods, 61–69, 652 Sigmoid, 393 Significance Analysis of Microarrays, 690–693 Similarity measure, see Dissimilarity measure Single index model, 390 Singular value decomposition, 64, 535–536, 659 singular values, 535 singular vectors, 535 Sliced inverse regression, 480 Smoother, 139–156, 192–199 matrix, 153 Smoothing parameter, 37, 156–161, 198–199 Smoothing spline, 151–156 Soft clustering, 512 Soft-thresholding, 653 Softmax function, 393 SOM, see Self-organizing map Sparse, 175, 304, 610–613, 636 additive model, 91 graph, 625, 635 Specificity of a test, 314–317 Spectral clustering, 544–547 Spline, 186 additive, 297–299 cubic, 151–153 cubic smoothing, 151–153 interaction, 428 regression, 144 smoothing, 151–156 thin plate, 165 Squared error loss, 18, 24, 37, 219 SRM, see Structural risk minimization Stacking (stacked generalization), 290 Starting values, 397 Statistical decision theory, 18–22 Statistical model, 28–29 Steepest descent, 358, 395–397 Stepwise selection, 60 Stochastic approximation, 397 Stochastic search (bumping), 290–292 Stress function, 570–572 Structural risk minimization (SRM), 239–241 Subset selection, 57–60 Supervised learning, 2 Supervised principal components, 674–681 Support vector classifier, 417–421, 654 multiclass, 657 Support vector machine, 423–437 SURE shrinkage method, 179 Survival analysis, 674 Survival curve, 674 SVD, see Singular value decomposition Symmlet basis, 176 Tangent distance, 471–475 Tanh activation function, 424 Target variables, 10 Tensor product basis, 162 Test error, 220–223 Test set, 220 Thin plate spline, 165 Thinning strategy, 189 Trace of a matrix, 153 Training epoch, 397 Training error, 220–223 Training set, 219–223 Tree for regression, 307–308 Tree-based methods, 305–317 Trees for classification, 308–310 Trellis display, 202 Undirected graph, 625–648 Universal approximator, 390 Unsupervised learning, 2, 485–585 Unsupervised learning as supervised learning, 495–497 Validation set, 222 Vapnik-Chervonenkis (VC) dimension, 237–239 Variable importance plot, 594 Variable types and terminology, 9 Variance, 16, 25, 37, 158–161, 219 between, 114 within, 114, 446 Variance reduction, 588 Varying coefficient models, 203–204 VC dimension, see Vapnik–Chervonenkis dimension Vector quantization, 514–515 Voronoi regions, 510 Wald test, 125 Wavelet basis functions, 176–179 smoothing, 174 transform, 176–179 Weak learner, 383, 605 Weakest link pruning, 308 Webpages, 576 Website for book, 8 Weight decay, 398 Weight elimination, 398 Weights in a neural network, 395 Within class covariance matrix, 114, 446


pages: 918 words: 257,605

The Age of Surveillance Capitalism by Shoshana Zuboff

algorithmic bias, Amazon Web Services, Andrew Keen, augmented reality, autonomous vehicles, barriers to entry, Bartolomé de las Casas, Berlin Wall, bitcoin, blockchain, blue-collar work, book scanning, Broken windows theory, California gold rush, call centre, Capital in the Twenty-First Century by Thomas Piketty, Cass Sunstein, choice architecture, citizen journalism, cloud computing, collective bargaining, Computer Numeric Control, computer vision, connected car, corporate governance, corporate personhood, creative destruction, cryptocurrency, disinformation, dogs of the Dow, don't be evil, Donald Trump, Edward Snowden, en.wikipedia.org, Erik Brynjolfsson, facts on the ground, Ford paid five dollars a day, future of work, game design, Google Earth, Google Glasses, Google X / Alphabet X, hive mind, Ian Bogost, impulse control, income inequality, Internet of things, invention of the printing press, invisible hand, Jean Tirole, job automation, Johann Wolfgang von Goethe, John Markoff, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, knowledge economy, linked data, longitudinal study, low skilled workers, Mark Zuckerberg, market bubble, means of production, multi-sided market, Naomi Klein, natural language processing, Network effects, new economy, Occupy movement, off grid, PageRank, Panopticon Jeremy Bentham, pattern recognition, Paul Buchheit, performance metric, Philip Mirowski, precision agriculture, price mechanism, profit maximization, profit motive, recommendation engine, refrigerator car, RFID, Richard Thaler, ride hailing / ride sharing, Robert Bork, Robert Mercer, Second Machine Age, self-driving car, sentiment analysis, shareholder value, Shoshana Zuboff, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, slashdot, smart cities, Snapchat, social graph, social web, software as a service, speech recognition, statistical model, Steve Bannon, Steve Jobs, Steven Levy, structural adjustment programs, surveillance capitalism, The Future of Employment, The Wealth of Nations by Adam Smith, Tim Cook: Apple, two-sided market, union organizing, Watson beat the top human players on Jeopardy!, winner-take-all economy, Wolfgang Streeck, Yochai Benkler, you are the product