3D printing, Airbnb, Amazon Web Services, barriers to entry, bitcoin, blockchain, business process, Clayton Christensen, collaborative economy, crowdsourcing, cryptocurrency, data acquisition, frictionless, game design, hive mind, Internet of things, invisible hand, Kickstarter, Lean Startup, Lyft, M-Pesa, Mark Zuckerberg, means of production, multi-sided market, Network effects, new economy, Paul Graham, recommendation engine, ride hailing / ride sharing, shareholder value, sharing economy, Silicon Valley, Skype, Snapchat, social graph, social software, software as a service, software is eating the world, Spread Networks laid a new fibre optics cable between New York and Chicago, TaskRabbit, the payments system, too big to fail, transport as a service, two-sided market, Uber and Lyft, Uber for X, Wave and Pay
In contrast, the journey to platform scale for a large pipe-based business starts with the data layer. 1. Build A Culture Of Data Acquisition The first step a traditional pipe-based business needs to take is cultural. It needs to create a culture of data acquisition. Most pipe-based businesses have been designed with a culture of dollar acquisition. Sales representatives who acquire revenue are incentivized accordingly. The key metrics measured are structured around the sole priority of dollar acquisition. To kickstart the journey towards platform scale, businesses will need to create a culture of data acquisition. Businesses like LinkedIn and Netflix demonstrate that higher data acquisition opens greater monetization opportunities. LinkedIn acquires significantly more data from its users than Monster.
Traditional agents also connect producers to consumers. However, they can never operate at platform scale because their ability to match the two sides doesn’t scale. A platform’s ability to scale matchmaking helps it to achieve platform scale (see Figure 16). Matchmaking is accomplished through data. As a result, data acquisition becomes an important priority for platforms. Designing the data model – specifications for what data are required for the value unit and the filter – is a critical step in platform design. This informs a platform’s data acquisition strategy. Data acquisition is subtle but critical. LinkedIn’s progress bar encourages users to provide more data to the platform by showing them the completeness of their profiles and suggesting simple actions to enrich them further. Distracted users are often taken through an initial sign-up process on multiple platforms using similar progress bars.
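The matchmaking mechanism described here can be made concrete with a minimal sketch. The value-unit fields, the filter, and the rating-based ranking below are illustrative assumptions, not details from the book:

```python
# Minimal illustration of platform matchmaking: value units carry data,
# and a consumer's filter selects the units relevant to them.
# All field names are illustrative; this is not code from the book.

def matches(value_unit, consumer_filter):
    """A value unit matches when every field in the filter agrees with its data."""
    return all(value_unit.get(key) == want for key, want in consumer_filter.items())

def matchmake(value_units, consumer_filter):
    """Return the value units a consumer should see, ranked by a relevance score."""
    hits = [u for u in value_units if matches(u, consumer_filter)]
    return sorted(hits, key=lambda u: u.get("rating", 0), reverse=True)

# Example: job listings as value units, a job seeker's filter.
listings = [
    {"role": "engineer", "city": "Boston", "rating": 4.2},
    {"role": "designer", "city": "Boston", "rating": 4.8},
    {"role": "engineer", "city": "Austin", "rating": 3.9},
]
print(matchmake(listings, {"role": "engineer", "city": "Boston"}))
```

The data-model decision is precisely which fields a value unit must carry (here role, city, rating) so that a filter can act on them; anything the platform fails to acquire cannot be used for matching.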
To be strategic, a free app should be a data acquisition interface that powers a larger business model. Every app by Facebook is structured as a user benefit in exchange for data. Facebook’s news feed itself is the best example of a user benefit in exchange for data. As Facebook and LinkedIn demonstrate, a digital strategy, particularly one that intends to leverage platform scale, should start with a cohesive data strategy. This needs to be executed using a culture of data acquisition. 2. Enable Data Porosity And Integration Platform business models are enabled by platform organizations. An organization that is not integrated at the data layer cannot enable an ecosystem that is orchestrated by data. With a clear platform strategy in mind and having set a culture of data acquisition, a pipe organization must institute infrastructural change.
The Silent Intelligence: The Internet of Things by Daniel Kellmereit, Daniel Obodovski
3D printing, Airbnb, Amazon Web Services, Any sufficiently advanced technology is indistinguishable from magic, autonomous vehicles, barriers to entry, business intelligence, call centre, Clayton Christensen, cloud computing, connected car, crowdsourcing, data acquisition, en.wikipedia.org, Erik Brynjolfsson, first square of the chessboard, first square of the chessboard / second half of the chessboard, Freestyle chess, Google X / Alphabet X, Internet of things, Network effects, Paul Graham, Ray Kurzweil, RFID, self-driving car, Silicon Valley, smart cities, smart grid, software as a service, Steve Jobs, web application, Y Combinator, yield management
In addition, as we mentioned in the previous chapter, the success of the Internet of Things largely depends on various industries embracing M2M technologies to solve their business problems. In this chapter, we present the parts of the technology ecosystem and its challenges, players, and future direction. Overall, the M2M technology ecosystem can be split into three major groups: data acquisition, data transport, and data analysis. Data acquisition is the device or hardware space — this is where data is being collected from various sensors and sent to the network. Examples are body sensors that measure pulse or calorie consumption, automotive OBD-II devices that measure car acceleration, and many others. RFID tags and readers belong to this category as well. To transmit data, devices are equipped with a radio transmitter, which can be cellular, Wi-Fi, or short range.
Through technology innovation, working with huge data sets has become extremely affordable, compared to the realities of a couple of years ago. This enables us to find correlations, spot business trends, detect and prevent potential criminal activity, or optimize workflows of all kinds. To ensure the smooth flow of data, there are platforms that enable communications between any two of the three major groups in the technology ecosystem. For example, between data acquisition and data transport there is a Connected Device Platform (CDP). The CDP, sometimes referred to as middleware, ensures that the devices and sensors can be easily connected, primarily to a cellular network, and that the devices can be remotely managed. Imagine trying to reset thousands or hundreds of thousands of devices manually in the field. This is just one example of a nightmare that a CDP is supposed to prevent.
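The device-management problem a CDP solves can be sketched in a few lines. The DeviceFleet class and its method names are hypothetical, not the API of any real CDP product:

```python
# Sketch of why a Connected Device Platform matters: resetting a fleet of
# devices through one management interface instead of field visits.
# The DeviceFleet class below is hypothetical, not a real CDP API.

class DeviceFleet:
    """Hypothetical CDP view of a registered device fleet."""

    def __init__(self, device_ids):
        self.status = {d: "unresponsive" for d in device_ids}

    def remote_reset(self, device_id):
        # In a real CDP this would send a reset command over the cellular network.
        self.status[device_id] = "online"

    def reset_all_unresponsive(self):
        """One supervisory call that fans out to every stuck device."""
        targets = [d for d, s in self.status.items() if s == "unresponsive"]
        for d in targets:
            self.remote_reset(d)
        return len(targets)

fleet = DeviceFleet([f"sensor-{i:05d}" for i in range(100_000)])
fleet.reset_all_unresponsive()  # resets all 100,000 devices in one sweep
```

The point is the shape of the interface: one supervisory call fans out to the whole fleet, which is what makes managing hundreds of thousands of deployed devices tractable.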
I think that is why we have seen the mobile operators go away from fixed pricing for a little while. I think they want to instill the sense of efficiency in the developer community. When they had a fixed price for unlimited mobile data, nobody had to worry about being efficient; it did not matter if the app checked the status of a device every ten seconds. Let’s take a closer look at the M2M technology ecosystem and its parts. Device hardware (data acquisition) is one of the most challenging areas of the ecosystem, primarily because it comes in all sizes and colors. You may think of black box–type devices that are usually installed on industrial equipment, but there are also OBD-II devices that get installed in cars, elegant body-worn fitness devices, connectivity modules that get embedded in home appliances, moisture sensors that go in the soil, RFID readers, and so on.
Big Data Analytics: Turning Big Data Into Big Money by Frank J. Ohlhorst
algorithmic trading, bioinformatics, business intelligence, business process, call centre, cloud computing, create, read, update, delete, data acquisition, DevOps, fault tolerance, linked data, natural language processing, Network effects, pattern recognition, performance metric, personalized medicine, RFID, sentiment analysis, six sigma, smart meter, statistical model, supply-chain management, Watson beat the top human players on Jeopardy!, web application
More to Big Data Than Meets the Eye; Dealing with the Nuances of Big Data; An Open Source Brings Forth Tools; Caution: Obstacles Ahead
Chapter 2: Why Big Data Matters
  Big Data Reaches Deep; Obstacles Remain; Data Continue to Evolve; Data and Data Analysis are Getting More Complex; The Future is Now
Chapter 3: Big Data and the Business Case
  Realizing Value; The Case for Big Data; The Rise of Big Data Options; Beyond Hadoop; With Choice Come Decisions
Chapter 4: Building the Big Data Team
  The Data Scientist; The Team Challenge; Different Teams, Different Goals; Don’t Forget the Data; Challenges Remain; Teams versus Culture; Gauging Success
Chapter 5: Big Data Sources
  Hunting for Data; Setting the Goal; Big Data Sources Growing; Diving Deeper into Big Data Sources; A Wealth of Public Information; Getting Started with Big Data Acquisition; Ongoing Growth, No End in Sight
Chapter 6: The Nuts and Bolts of Big Data
  The Storage Dilemma; Building a Platform; Bringing Structure to Unstructured Data; Processing Power; Choosing among In-house, Outsourced, or Hybrid Approaches
Chapter 7: Security, Compliance, Auditing, and Protection
  Pragmatic Steps to Securing Big Data; Classifying Data; Protecting Big Data Analytics; Big Data and Compliance; The Intellectual Property Challenge
Chapter 8: The Evolution of Big Data
  Big Data: The Modern Era; Today, Tomorrow, and the Next Day; Changing Algorithms
Chapter 9: Best Practices for Big Data Analytics
  Start Small with Big Data; Thinking Big; Avoiding Worst Practices; Baby Steps; The Value of Anomalies; Expediency versus Accuracy; In-Memory Processing
Chapter 10: Bringing it All Together
  The Path to Big Data; The Realities of Thinking Big Data; Hands-on Big Data; The Big Data Pipeline in Depth; Big Data Visualization; Big Data Privacy
Appendix: Supporting Data
  “The MapR Distribution for Apache Hadoop”; “High Availability: No Single Points of Failure”
About the Author
Index
WILEY & SAS BUSINESS SERIES: The Wiley & SAS Business Series presents books that help senior-level managers with their critical management decisions.
Many more data sets are available from Amazon S3, and it is definitely worth visiting http://aws.amazon.com/publicdatasets/ to track these down. Another site to visit for a listing of public data sets is http://www.quora.com/Data/Where-can-I-get-large-datasets-open-to-the-public, a treasure trove of links to data sets and information related to those data sets. GETTING STARTED WITH BIG DATA ACQUISITION Barriers to Big Data adoption are generally cultural rather than technological. In particular, many organizations fail to implement Big Data programs because they are unable to appreciate how data analytics can improve their core business. One of the most common triggers for Big Data development is a data explosion that makes existing data sets very large and increasingly difficult to manage with conventional database management tools.
That alone is reason enough for most businesses to start evaluating how Big Data analytics can affect the bottom line, and to start sooner rather than later. Delving into the value of Big Data analytics reveals that elements such as heterogeneity, scale, timeliness, complexity, and privacy can impede progress at every phase of the process that creates value from data. The primary problem begins at the point of data acquisition, when the data tsunami requires us to make decisions, currently in an ad hoc manner, about what data to keep, what to discard, and how to reliably store what we keep with the right metadata. Adding to the confusion is that most data today are not natively stored in a structured format; for example, tweets and blogs are weakly structured pieces of text, while images and video are structured for storage and display but not for semantic content and search.
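The keep/discard/metadata decision made at the point of acquisition can be sketched as a simple triage function. The rules, field names, and source list below are invented for illustration:

```python
# Sketch of triage at the point of data acquisition: decide what to keep,
# what to discard, and attach provenance metadata to what is stored.
# The rules and field names are illustrative assumptions, not a real pipeline.
import hashlib
import time

KEEP_SOURCES = {"tweets", "blogs", "images"}  # invented policy for the sketch

def triage(record):
    """Return (decision, stored_form). Records from unknown sources are
    discarded rather than stored without provenance."""
    if record.get("source") not in KEEP_SOURCES:
        return "discard", None
    stored = {
        "payload": record["payload"],
        "metadata": {
            "source": record["source"],
            "acquired_at": record.get("acquired_at", time.time()),
            "checksum": hashlib.sha256(record["payload"].encode()).hexdigest(),
        },
    }
    return "keep", stored

print(triage({"source": "tweets", "payload": "big data is eating storage"})[0])
print(triage({"source": "unknown-feed", "payload": "???"})[0])
```

Even this toy version shows why the decision cannot stay ad hoc: the keep/discard rule and the metadata schema have to be written down somewhere before the tsunami arrives.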
Cybersecurity: What Everyone Needs to Know by P. W. Singer, Allan Friedman
4chan, A Declaration of the Independence of Cyberspace, Apple's 1984 Super Bowl advert, barriers to entry, Berlin Wall, bitcoin, blood diamonds, borderless world, Brian Krebs, business continuity plan, Chelsea Manning, cloud computing, crowdsourcing, cuban missile crisis, data acquisition, Edward Snowden, energy security, failed state, Fall of the Berlin Wall, fault tolerance, global supply chain, Google Earth, Internet of things, invention of the telegraph, Julian Assange, Khan Academy, M-Pesa, mutually assured destruction, Network effects, packet switching, Peace of Westphalia, pre–internet, profit motive, RAND corporation, ransomware, RFC: Request For Comment, risk tolerance, rolodex, Silicon Valley, Skype, smart grid, Steve Jobs, Stuxnet, uranium enrichment, We are Anonymous. We are Legion, web application, WikiLeaks, zero day
Thus, while cyberspace was once just a realm of communication and then e-commerce (reaching over $10 trillion a year in sales), it has expanded to include what we call “critical infrastructure.” These are the underlying sectors that run our modern-day civilization, ranging from agriculture and food distribution to banking, healthcare, transportation, water, and power. Each of these once stood apart, but all are now bound together and linked into cyberspace via information technology, often through what are known as “supervisory control and data acquisition” or SCADA systems. These are the computer systems that monitor, adjust switching, and control other processes of critical infrastructure. Notably, the private sector controls roughly 90 percent of US critical infrastructure, and the firms behind it use cyberspace to, among other things, balance the levels of chlorination in your city’s water, control the flow of gas that heats your home, and execute the financial transactions that keep currency prices stable.
It was this combination that led him to play a role in the discovery of one of the most notable weapons in history; and not just cyber history, but history overall. Since 1988, Ralph and his team of security experts had been advising on the safety of large-scale installations. Their special focus was industrial control systems, the computer systems like SCADA (short for “supervisory control and data acquisition”) that monitor and run industrial processes. SCADA is used in everything from the management and operation of power plants to the manufacture of candy wrappers. In 2010, like many other industrial control experts, Ralph grew concerned about a cyber “worm” of unknown origin that was spreading across the world and embedding itself in these control systems. Thousands of computers in places like India and the United States had been infected.
Lord and Travis Shard (Washington, DC: Center for a New American Security, 2011), pp. 14–15.
“the click of a switch” Ibid., p. 9.
registered sites hit 550 million by 2012 Jon Russell, “Importance of Microblogs in China Shown as Weibos Pass 500 Million Users,” The Next Web, last modified November 11, 2011, http://thenextweb.com/asia/2011/11/11/importance-of-microblogs-in-china-shown-as-weibos-pass-550-million-users/.
“supervisory control and data acquisition” Beidleman, “Defining and Deterring Cyber War,” p. 6.
“the control system of our economy” Ibid., p. 1.
“knowingly or not, it is life” Ben Hammersley, “Speech to the UK’s Information Assurance Advisory Council,” remarks at the Information Assurance Advisory Council, London, September 6, 2011, http://www.benhammersley.com/2011/09/my-speech-to-the-iaac/.
WHERE DID THIS “CYBER STUFF” COME FROM ANYWAY?
Big Data at Work: Dispelling the Myths, Uncovering the Opportunities by Thomas H. Davenport
Automated Insights, autonomous vehicles, bioinformatics, business intelligence, business process, call centre, chief data officer, cloud computing, data acquisition, Edward Snowden, Erik Brynjolfsson, intermodal, Internet of things, Jeff Bezos, knowledge worker, Mark Zuckerberg, move fast and break things, Narrative Science, natural language processing, Netflix Prize, New Journalism, recommendation engine, RFID, self-driving car, sentiment analysis, Silicon Valley, smart grid, smart meter, social graph, sorting algorithm, statistical model, Tesla Model S, text mining
Gardner, the CEO, explained the importance of controlling leakage: “Some large health systems report upwards of 50% leakage from hospital networks, while best-in-class organizations have leakage rates of under 20% . . . If we were to change leakage rates by just a few percentage points, systems that were operating at a loss could become profitable.”a Kyruus is structured into three major groups: data acquisition, integration, and processing; analytics; and applications and the user interface. The company’s data platform includes features to display and analyze data. a. Robert F. Higgins, Penrose O’Donnell, and Mehul Bhatt, “Kyruus: Big Data’s Search for the Killer App,” Case 813-060 (Boston: Harvard Business School, 2012), 13. What You Can Learn from Start-Ups and Online Firms Take Advantage of Free and Low-Cost Stuff In the distant past—say, a decade ago—the costs of computing, data management, and data analysis were major impediments to big data (assuming you could find some in the first place).
These days, the company is using big data technologies to accelerate the integration of petabytes of customer, product, sales, and campaign data in order to understand how to increase marketing returns and bring customers back into its stores. The retailer uses Hadoop not only to store data but also to process transformations and integrate heterogeneous data more quickly and efficiently than ever. “We’re investing in real-time data acquisition as it happens,” says Oliver Ratzesberger, (at the time of the interview) Vice President of Information Analytics and Innovation at Sears Holdings. “No more ETL. Big data technologies make it easy to eliminate sources of latency that have built up over a period of time.” The company is now leveraging the open-source projects Apache Kafka and Storm to enable real-time processing. “Our goal is to be able to measure what’s just happened.”
The company’s CTO, Phil Shelley (who has since left to start his own big data company), has cited big data’s capability to cut the release time for a set of complex marketing campaigns from eight weeks to one week—and the improvements are still being realized. Faster and more targeted campaigns are just the tip of the iceberg for the retailer, which recently launched a subsidiary, MetaScale, to provide non-retailers with big data services in the cloud. “Sears is investing in real-time data acquisition and integration as it happens,” says Ratzesberger. “We’re bringing in open-source solutions and changing our applications architecture. We’re creating a framework that, over time, any application can leverage.” What You Can Learn from Large Companies Moreover, it’s easier to measure new process improvements against traditional methods, so quantifying faster product time to market, higher return on marketing investment, or fewer patient readmissions makes quantifying return on investment that much easier.
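The shift Ratzesberger describes, from batch ETL to measuring what has just happened, can be simulated without any infrastructure. The deque below stands in for a Kafka topic; no real Kafka or Storm API is used here:

```python
# Simulation of the real-time pattern described above: events are processed
# as they arrive ("measure what's just happened") instead of in a nightly
# ETL batch. A deque stands in for a Kafka topic; no real broker is used.
from collections import deque

topic = deque()            # stand-in for a Kafka topic
revenue_running_total = 0.0

def produce(event):
    """Append an event to the topic, as a Kafka producer would."""
    topic.append(event)

def consume_all():
    """Drain the topic, updating a running metric with no batch latency."""
    global revenue_running_total
    while topic:
        event = topic.popleft()
        revenue_running_total += event["amount"]

produce({"sku": "A100", "amount": 19.99})
produce({"sku": "B205", "amount": 5.00})
consume_all()
print(revenue_running_total)
```

The design point is latency: the metric is current the moment the consumer drains the topic, whereas an ETL pipeline would not reflect these sales until its next scheduled run.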
The Transhumanist Reader by Max More, Natasha Vita-More
23andMe, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, Bill Joy: nanobots, bioinformatics, brain emulation, Buckminster Fuller, cellular automata, clean water, cloud computing, cognitive bias, cognitive dissonance, combinatorial explosion, conceptual framework, Conway's Game of Life, cosmological principle, data acquisition, discovery of DNA, Drosophila, en.wikipedia.org, experimental subject, Extropian, fault tolerance, Flynn Effect, Francis Fukuyama: the end of history, Frank Gehry, friendly AI, game design, germ theory of disease, hypertext link, impulse control, index fund, John von Neumann, joint-stock company, Kevin Kelly, Law of Accelerating Returns, life extension, Louis Pasteur, Menlo Park, meta analysis, meta-analysis, moral hazard, Network effects, Norbert Wiener, P = NP, pattern recognition, phenotype, positional goods, prediction markets, presumed consent, Ray Kurzweil, reversible computing, RFID, Richard Feynman, Ronald Reagan, silicon-based life, Singularitarianism, stem cell, stochastic process, superintelligent machines, supply-chain management, supply-chain management software, technological singularity, Ted Nelson, telepresence, telepresence robot, telerobotics, the built environment, The Coming Technological Singularity, the scientific method, The Wisdom of Crowds, transaction costs, Turing machine, Turing test, Upton Sinclair, Vernor Vinge, Von Neumann architecture, Whole Earth Review, women in the workforce
If the time during which measurements are taken is relatively small or does not involve a sufficiently thorough set of events observed, then it is possible to miss pairs of I/O that would indicate the presence of latent function. There are some ways to improve upon this by using patterns of stimulation in order to put each component through its paces, but then we run into the problem that the brain is plastic. Components may change their responses as a result of the exercises. Latent function may be better obtained from structural data acquisition. Even if purely structural or purely functional data acquisition could provide all the necessary information for a whole brain emulation, then such a constraint would still carry a burden of risk that is better avoided from the perspective of sensible engineering. It seems unwise to construct an enormously complex emulation by carrying out a single-shot transformation. It is far better to turn it into a problem of successive partial transformations.
Possible interactions are constrained by the existing functional connections between the components – the functional connectome, which is in turn reflected by physical neuroanatomy in the structural connectome. We consider a strategy of straightforward duplication of the activity, and look at the numbers of some of the components. The human brain has up to one hundred billion (10^11) neurons and between one hundred trillion (10^14) and one quadrillion (10^15) synapses. But we have reached a point where, for purposes of data acquisition, these objects are now considered fairly large (e.g. 200 nm to 2,000 nm for synaptic spines and 4,000 nm to 100,000 nm for the neural soma), at least by the standards of the current nanotechnology industry (working with precision at 10s to 100s of nanometers). And in terms of their activity those components are mostly quiet. I coined the term whole brain emulation around February/March of 2000 during a discussion on the old “mind uploading research group” (MURG) mailing list, in an effort to remove confusion stemming from the use of the term “mind uploading”, which better refers to a process of transfer of a mind from a biological brain to another substrate.
An increasing number of projects are explicitly building the sort of tools that are needed to acquire data from a brain at the large scope and high resolution required. There are by now at least three different versions of the Automated Tape-Collecting Lathe Ultramicrotome that was developed at the Lichtman Lab at Harvard University (Hayworth et al. 2007). Ken Hayworth is presently working on its successor that employs focused ion beam scanning electron microscopy (FIBSEM) to improve accuracy, reliability, and speed of structural data acquisition from whole brains at a resolution of 5 nm (Hayworth 2011). Meanwhile, the Knife-Edge Scanning Microscope (KESM) developed by Bruce McCormick is presently able to acquire neuronal fiber and vasculature data from entire mouse brains at a slightly lower resolution (McCormick and Mayerich 2004). A number of labs, including the MIT Media Lab of Ed Boyden, are aiming at the development of arrays of recording electrodes with tens of thousands of channels.
Cyber War: The Next Threat to National Security and What to Do About It by Richard A. Clarke, Robert Knake
barriers to entry, complexity theory, data acquisition, Just-in-time delivery, nuclear winter, packet switching, RAND corporation, Robert Hanssen: Double agent, Ronald Reagan, Silicon Valley, smart grid, South China Sea, Steve Jobs, trade route, Y2K, zero day
The gas quickly extended well over a mile along the creek. Then it caught fire. Two ten-year-old boys playing along the stream were killed, as was an eighteen-year-old farther up the creek. The nearby municipal water-treatment plant was severely damaged by the fire. When the U.S. National Transportation Safety Board examined why the pipeline burst, it focused on “the performance and security of the supervisory control and data acquisition (SCADA) system.” In other words, the software failed. The report does not conclude that in this case the explosion was intentionally caused by a hacker, but it is obvious from the analysis that pipelines like the one in Bellingham can be manipulated destructively from cyberspace. The clearest example of the dependency and the vulnerability brought on by computer controls also happens to be the one system that everything else depends upon: the electric power grid.
They were also allowed to buy and sell power to each other anywhere within one of the three big power grids in North America. At the same time, they were, like every other company, inserting computer controls deep into their operations. Computer controls were also installed to manage the buying and selling, generation, and transmission. A SCADA system was already running each electric company’s substations, transformers, and generators. That Supervisory Control and Data Acquisition system got and sent signals out to all of the thousands of devices on the company’s grid. SCADAs are software programs, and most electric companies use one of a half dozen commercially available products. These control programs send signals to devices to regulate the electric load in various locations. The signals are most often sent via internal computer network and sometimes by radio. Unfortunately, many of the devices also have other connections, multiple connections.
SIPRNET: Secret Internet Protocol Router Network is the Defense Department’s global intranet for transmitting confidential and secret-level information. The Defense Department classifies information into five categories: unclassified, confidential, secret, top secret, and top secret/SCI (sensitive compartmented information). The SIPRNET is supposed to be air-gapped from, i.e., not physically touching, the unclassified NIPRNET and the Internet. Supervisory Control and Data Acquisition System (SCADA): Software for networks of devices that control the operation of a system of machines such as valves, pumps, generators, transformers, and robotic arms. SCADA software collects information about the condition of and activities on a system. SCADA software sends instructions to devices, often to do physical movements. Instructions sent to devices on SCADA networks are sometimes sent over the Internet or broadcast via radio waves.
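The collect-and-command cycle in that SCADA definition reduces to a small control loop. The device names, the chlorination setpoint, and the tolerance band below are invented for illustration:

```python
# Toy sketch of the SCADA cycle defined above: poll device readings, compare
# them against a setpoint, and send a corrective instruction to each device.
# The device names and the 0.5-1.5 mg/L chlorination band are invented.

SETPOINT_MG_L = 1.0   # hypothetical target chlorine concentration
TOLERANCE = 0.5

def supervise(readings):
    """Return the instruction a SCADA master would send for each reading."""
    commands = {}
    for device, level in readings.items():
        if level < SETPOINT_MG_L - TOLERANCE:
            commands[device] = "open_chlorine_valve"
        elif level > SETPOINT_MG_L + TOLERANCE:
            commands[device] = "close_chlorine_valve"
        else:
            commands[device] = "hold"
    return commands

print(supervise({"pump-station-1": 0.3,
                 "pump-station-2": 1.1,
                 "pump-station-3": 1.9}))
```

The security implications discussed throughout these excerpts follow directly from this shape: whoever can inject readings or commands into the loop controls the physical process.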
Industrial Internet by Jon Bruner
autonomous vehicles, barriers to entry, computer vision, data acquisition, demand response, en.wikipedia.org, factory automation, Google X / Alphabet X, industrial robot, Internet of things, job automation, loose coupling, natural language processing, performance metric, Silicon Valley, slashdot, smart grid, smart meter, statistical model, web application
In this manner, software running on an inexpensive processor, reading data from inexpensive sensors, can substitute for more expensive capital equipment and labor. Better understanding of maintenance needs means better allocation of equipment — since the timing of maintenance can be optimized if it’s proactive rather than reactive — and workers can similarly avoid being idled or having their time absorbed in detecting maintenance needs. Large manufacturers have invested billions of dollars in SCADA (supervisory control and data acquisition — the low-level industrial control networks that operate automated machines). Comprehensive stacks of specialized software link these systems all the way to management dashboards, but many of these systems have their roots in automation, not high-level intelligence and analysis. Factory managers are understandably conservative in managing these systems, and demand highly robust, proven technologies in settings where the functioning of a big machine or assembly line is at stake.
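The substitution described here, software plus cheap sensors standing in for inspection labor, can be sketched with a toy predictive-maintenance rule. The vibration values, window, and threshold are invented for illustration:

```python
# Sketch of the substitution described above: software on a cheap processor
# reading a cheap sensor flags maintenance proactively, instead of a worker
# inspecting the machine. The vibration threshold and trend rule are invented.
from statistics import mean

def needs_maintenance(vibration_history, limit=5.0, window=3):
    """Flag a machine when its recent average vibration exceeds the limit,
    so service is scheduled before failure rather than after it."""
    if len(vibration_history) < window:
        return False
    return mean(vibration_history[-window:]) > limit

healthy = [2.1, 2.3, 2.0, 2.4, 2.2]
wearing = [2.1, 2.3, 4.9, 5.6, 6.2]
print(needs_maintenance(healthy))  # False
print(needs_maintenance(wearing))  # True
```

Averaging over a window rather than acting on a single reading is the minimal version of the proactive-versus-reactive point: one noisy spike does not idle the line, but a sustained trend schedules maintenance early.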
3D printing, AI winter, Amazon Web Services, artificial general intelligence, Automated Insights, Bernie Madoff, Bill Joy: nanobots, brain emulation, cellular automata, cloud computing, cognitive bias, computer vision, cuban missile crisis, Daniel Kahneman / Amos Tversky, Danny Hillis, data acquisition, don't be evil, Extropian, finite state, Flash crash, friendly AI, friendly fire, Google Glasses, Google X / Alphabet X, Isaac Newton, Jaron Lanier, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, Loebner Prize, lone genius, mutually assured destruction, natural language processing, Nicholas Carr, optical character recognition, PageRank, pattern recognition, Peter Thiel, prisoner's dilemma, Ray Kurzweil, Rodney Brooks, Search for Extraterrestrial Intelligence, self-driving car, semantic web, Silicon Valley, Singularitarianism, Skype, smart grid, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, superintelligent machines, technological singularity, The Coming Technological Singularity, traveling salesman, Turing machine, Turing test, Vernor Vinge, Watson beat the top human players on Jeopardy!, zero day
The vulnerability investigators sought to explore is endemic in North America’s electrical grid—the habit of attaching the controlling hardware of critical machinery to the Internet so it can be remotely operated, and “protecting” it with passwords, firewalls, encryption, and other safeguards that crooks routinely cut through like hot knives through butter. The device that controlled DHS’ tortured generator is present throughout our national energy network. It is known as a supervisory control and data acquisition, or SCADA, system. SCADA systems don’t just control devices in the electrical grid, but all manner of modern hardware, including traffic lights, nuclear power plants, oil and gas pipelines, water treatment facilities, and factory assembly lines. SCADA has become almost a household acronym because of the phenomenon called Stuxnet. Stuxnet, and its cousins Duqu and Flame, have convinced even the most hardened skeptics that the energy grid can be attacked.
., III Machine Intelligence Research Institute (MIRI) Singularity Summit machine learning Madoff, Bernie malware Mazzafro, Joe McCarthy, John McGurk, Sean military battlefield robots and drones DARPA, see DARPA energy infrastructure and nuclear weapons, see nuclear weapons Mind Children (Moravec) Minsky, Marvin Mitchell, Tom mobile phones see also iPhone Monster Cat Moore, Gordon Moore’s Law morality see also Friendly AI Moravec, Hans Moravec’s Paradox mortality, see immortality mortgage crisis Mutually Assured Destruction (MAD) nano assemblers nanotechnology “gray goo” problem and natural language processing (NLP) natural selection Nekomata (Monster Cat) NELL (Never-Ending-Language-Learning system) neural networks neurons New Scientist New York Times Newman, Max Newton, Isaac Ng, Andrew 9/11 attacks Normal Accidents: Living with High-Risk Technologies (Perrow) normalcy bias North Korea Norvig, Peter Novamente nuclear fission nuclear power plant disasters nuclear weapons of Iran Numenta Ohana, Steve Olympic Games (cyberwar campaign) Omohundro, Stephen OpenCog Otellini, Paul Page, Larry paper clip maximizer scenario parallel processing pattern recognition Pendleton, Leslie Perceptron Perrow, Charles Piaget, Jean power grid Precautionary Principle programming bad evolutionary genetic ordinary self-improving, see self-improvement Rackspace rational agent theory of economics recombinant DNA Reflections on Artificial Intelligence (Whitby) resource acquisition risks of artificial intelligence apoptotic systems and Asilomar Guidelines and Busy Child scenario and, see Busy Child scenario defenses against lack of dialogue about malicious AI Precautionary Principle and runaway AI Safe-AI Scaffolding Approach and Stuxnet and unintended consequences robots, robotics Asimov’s Three Laws of in dangerous and service jobs in sportswriting Rosenblatt, Frank Rowling, J. K. 
Rubin, Andrew “Runaround” (Asimov) Safe-AI Scaffolding Approach Sagan, Carl SCADA (supervisory control and data acquisition) systems Schmidt, Eric Schwartz, Evan Scientist Speculates, The (Good, ed.) Searle, John self-awareness Self-Aware Systems self-improvement self-preservation September 11 attacks serial processing SETI (Search for Extraterrestrial Intelligence) Shostak, Seth Silicon Valley Singularitarians Singularity definitions of Kurzweil and technological Singularity Is Near, The (Kurzweil) Singularity Summit Singularity University Sir Groovy Siri 60 Minutes Skilling, Jeffrey Smart Action smart phones see also iPhone software complexity of malware see also programming solar energy space exploration “Speculations Concerning the First Ultraintelligent Machine” (Good) speech recognition SRI International stealth companies Sterrit, Roy Stibel, Jeff Stuxnet subprime mortgage crisis Symantec SyNAPSE Technological Risk (Lewis) technology journalism Terminator movies terrorism 9/11 attacks Thiel, Peter Thinking Machines, Inc.
Platform Revolution: How Networked Markets Are Transforming the Economy--And How to Make Them Work for You by Sangeet Paul Choudary, Marshall W. van Alstyne, Geoffrey G. Parker
3D printing, Affordable Care Act / Obamacare, Airbnb, Amazon Mechanical Turk, Amazon Web Services, Andrei Shleifer, Apple's 1984 Super Bowl advert, autonomous vehicles, barriers to entry, big data - Walmart - Pop Tarts, bitcoin, blockchain, business process, buy low sell high, chief data officer, clean water, cloud computing, connected car, corporate governance, crowdsourcing, data acquisition, data is the new oil, discounted cash flows, disintermediation, Edward Glaeser, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, financial innovation, Haber-Bosch Process, High speed trading, Internet of things, inventory management, invisible hand, Jean Tirole, Jeff Bezos, jimmy wales, Khan Academy, Kickstarter, Lean Startup, Lyft, market design, multi-sided market, Network effects, new economy, payday loans, peer-to-peer lending, Peter Thiel, pets.com, pre–internet, price mechanism, recommendation engine, RFID, Richard Stallman, ride hailing / ride sharing, Ronald Coase, Satoshi Nakamoto, self-driving car, shareholder value, sharing economy, side project, Silicon Valley, Skype, smart contracts, smart grid, Snapchat, software is eating the world, Steve Jobs, TaskRabbit, The Chicago School, the payments system, Tim Cook: Apple, transaction costs, two-sided market, Uber and Lyft, Uber for X, winner-take-all economy, Zipcar
They range from relatively static information such as identity, gender, and nationality to dynamic information such as location, relationship status, age, and point-in-time interest (as reflected in a search query). Sophisticated data models like the Facebook news feed may build a filter that considers all these factors as well as all of the participant’s previous activities on the platform. As part of the design process, platform companies need to develop an explicit data acquisition strategy. Users vary greatly in their willingness to share data and their readiness to respond to data-driven activity recommendations. Some platforms use incentives to encourage participants to provide data about themselves; others leverage game elements to gather data from users. LinkedIn famously used a progress bar to encourage users to progressively submit more information about themselves, thereby completing their personal data profiles.
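LinkedIn's progress-bar tactic described above boils down to a simple computation: score how many fields of the data model a user has filled in, show the gap, and suggest one concrete next action. A minimal sketch, assuming a hypothetical set of profile fields (these names are illustrative, not LinkedIn's actual data model):

```python
# Illustrative fields a platform's data model might require of a profile.
PROFILE_FIELDS = ["name", "photo", "headline", "location", "skills", "work_history"]

def profile_completeness(profile):
    """Percentage of required fields the user has actually provided."""
    filled = sum(1 for f in PROFILE_FIELDS if profile.get(f))
    return round(100 * filled / len(PROFILE_FIELDS))

def next_prompt(profile):
    """Suggest one simple action to enrich the profile further."""
    for f in PROFILE_FIELDS:
        if not profile.get(f):
            return f"Add your {f} to strengthen your profile"
    return None

user = {"name": "Ada", "headline": "Engineer", "location": "London"}
print(profile_completeness(user))  # 50
print(next_prompt(user))           # Add your photo to strengthen your profile
```

The point of the sketch is the incentive loop: the score quantifies the platform's data gap, and the prompt converts that gap into a single low-friction user action.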
Some mobile apps, such as the music streaming app Spotify, ask users to sign in using their Facebook identities, which helps the app pull in initial data to use in facilitating accurate matches. However, resistance from some users has led many app makers, including Spotify, to provide alternative ways to sign in that don’t require a Facebook link. Successful platforms create mutually rewarding matches on a consistent basis. As such, continual improvement of data acquisition and analysis methods is an important challenge for any organization seeking to build and maintain a platform. Balancing the three functions. All three key functions—pull, facilitate, and match—are essential to a successful platform. But not all platforms are equally good at all three. It’s possible for a platform to survive, at least for a time, thanks mainly to its strength at a particular function.
REST API Design Rulebook by Mark Masse
Architectural Styles and the Design of Network-based Software Architectures, Doctoral dissertation, University of California, Irvine, 2000 (http://www.ics.uci.edu/~fielding/pubs/dissertation/top.htm).
http://www.crummy.com/writing/speaking/2008-QCon/act3.html
Leonard Richardson also co-authored the milestone book, RESTful Web Services (O’Reilly), which really helped move REST forward.
http://www.methods.co.nz/asciidoc
Chapter 1. Introduction
Hello World Wide Web
The Web started in the “data acquisition and control” group at the European Organization for Nuclear Research (CERN), in Geneva, Switzerland. It began with a computer programmer who had a clever idea for a new software project. In December of 1990, to facilitate the sharing of knowledge, Tim Berners-Lee started a non-profit software project that he called “WorldWideWeb.” After working diligently on his project for about a year, Berners-Lee had invented and implemented:
The Uniform Resource Identifier (URI), a syntax that assigns each web document a unique address
The HyperText Transfer Protocol (HTTP), a message-based language that computers could use to communicate over the Internet.
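The two inventions named above can be made concrete: a URI decomposes into named parts, and an HTTP request is just a short text message addressed to one of them. A minimal sketch in Python (the URI is the address of the first CERN web page, used here only as an example; the request is composed, not sent):

```python
from urllib.parse import urlsplit

# A URI assigns a document a unique, structured address.
uri = "http://info.cern.ch/hypertext/WWW/TheProject.html"
parts = urlsplit(uri)
print(parts.scheme)  # http
print(parts.netloc)  # info.cern.ch
print(parts.path)    # /hypertext/WWW/TheProject.html

# An HTTP request is a plain-text message sent to that address.
request = (
    f"GET {parts.path} HTTP/1.1\r\n"
    f"Host: {parts.netloc}\r\n"
    "Connection: close\r\n"
    "\r\n"
)
print(request)
```

Everything REST builds on is visible here: the address identifies a resource, and the message names a uniform method (GET) to act on it.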
Solr in Action by Trey Grainger, Timothy Potter
business intelligence, cloud computing, conceptual framework, crowdsourcing, data acquisition, en.wikipedia.org, failed state, fault tolerance, finite state, full text search, glass ceiling, information retrieval, natural language processing, performance metric, premature optimization, recommendation engine, web application
Efficient field collapsing with the Collapsing query parser
11.8. Summary
Chapter 12. Taking Solr to production
12.1. Developing a Solr distribution
12.2. Deploying Solr
12.2.1. Building your Solr distribution
12.2.2. Embedded Solr
12.3. Hardware and server configuration
12.3.1. RAM and SSDs
12.3.2. JVM settings
12.3.3. The index shuffle
12.3.4. Useful system tricks
12.4. Data acquisition strategies
Update Formats, Indexing Time, and Batching
Data Import Handler
Extracting text from files with Solr Cell
12.5. Sharding and replication
12.5.1. Choosing to shard
12.5.2. Choosing to replicate
12.6. Solr core management
Defining cores
Creating cores through the Core Admin API
Reloading cores
Renaming and swapping cores
Unloading and deleting cores
Splitting and merging indexes
Getting the status of cores
12.7.
ulimit -n 100000
Many systems have a default file descriptor limit of 1024, but since each Solr index can consist of hundreds of files (or even thousands, depending upon your MergePolicy settings), it may be necessary to increase this limit to 100,000 (from our example) or something even higher, especially if you expect to have many Solr cores on your server. You will probably want to set this limit permanently (running the command only applies to the current bash session) by setting it in a system-wide configuration such as /etc/security/limits.conf. You should ensure that your new file descriptor limit is sufficiently large that you never run the risk of hitting it.
12.4. Data acquisition strategies
So far, you have seen one way to post documents to Solr: sending a document over HTTP to the Solr /update handler. We have utilized the included post.jar file as a convenience library for posting files containing Solr documents, primarily in XML format, but under the covers it posts the contents of a file to Solr’s /update handler for you. It’s also possible to have Solr ingest documents in other ways, either by pushing documents to Solr in other formats or by having Solr import documents itself from any number of external data sources.
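What post.jar does under the covers can be sketched in a few lines: wrap documents in Solr's <add><doc> XML format and POST the result to the /update handler. A hedged sketch, assuming a hypothetical local Solr core named "mycore" (the request is built but deliberately not sent):

```python
import urllib.request
from xml.sax.saxutils import escape

def to_solr_add_xml(docs):
    """Wrap documents in the <add><doc><field .../> XML that /update accepts."""
    parts = ["<add>"]
    for doc in docs:
        parts.append("<doc>")
        for name, value in doc.items():
            parts.append(f'<field name="{name}">{escape(str(value))}</field>')
        parts.append("</doc>")
    parts.append("</add>")
    return "".join(parts)

docs = [{"id": "1", "title": "Solr in Action"}]
body = to_solr_add_xml(docs)

# POST to the (hypothetical) core's /update handler; commit=true makes the
# documents immediately searchable.
req = urllib.request.Request(
    "http://localhost:8983/solr/mycore/update?commit=true",
    data=body.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
)
# urllib.request.urlopen(req)  # uncomment against a running Solr instance
print(body)
```

Batching many documents into one <add> envelope, rather than one request per document, is the usual first optimization when indexing time matters.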
Content-Type header contributing patches coord factor (coord) <copyField> element, 2nd Core Admin API administration console creating cores CoreAdminHandler class core.properties file cos function cosh function CREATEALIAS command createNodeSet parameter cross-core joins cross-document joins CSS (cascading style sheets) CSV (comma-separated values) respose formats update handler support, 2nd CSVResponseWriter class custom hashing composite document ID limitations on overview targeting specific shard Czech language CzechStemFilterFactory D Damerau-Levenshtein distances Danish language data acquisition strategies batching documents DIH extracting text from files with Solr Cell Data Import Handler. See DIH. data model data redundancy data transformation functions data-config.xml file dataDir parameter <dataDir> element debug component, 2nd debug parameter def function DefaultSimilarity class, 2nd, 3rd defType parameter deg function Delete by id Delete by query Delete request, update handler deleteDataDir parameter deleteInstanceDir parameter deleting cores deletions, and segment merging DelimitedPayloadFilterFactory denormalization denormalized documents dependencies, in solrconfig.xml file dereferencing parameters df parameter DFRSimilarity class DFRSimilarityFactory class diacritcal marks, removing dictionary-based stemming DIH (Data Import Handler), 4th importing documents using indexing Stack Exchange indexing Wikipedia, 2nd direct routing Directory component DirectSolrSpellChecker dismax parameter dist function distanceMeasure parameter distrib parameter, 2nd distributed result grouping distributed searching distribution, creating own div function docfreq function [docid] field DocSet DocTransformer class document cache document router document transformers DocumentAnalysisRequestHandler class document-oriented Down state downloading Solr downtime DumpRequestHandler class duplicate documents skipping duplicate parameters reducing with parameter dereferencing 
durable writes Dutch language dynamic values, returning <dynamicField> element E e function Eclipse importing Lucene/Solr into running Solr from inside EdgeNGramFilter edismax parameter edit distance elasticity, SolrCloud ElisionFilterFactory, 2nd embedded Solr deployment within SolrJ application EmbeddedSolrServer class <enableLazyFieldLoading> element encoding HTML entities English language English Porter Stemmer EnglishMinimalStemFilter EnglishMinimalStemFilterFactory EnglishPossessiveFilter EnglishPossessiveFilterFactory entities, HTML escaping special characters eventual consistency eviction count ex local param excluded terms excludes, multiselect faceting execution order of speed of exists function exp function experimenting with relevancy [explain] field extensibility of Solr Extensible Markup Language.
Bad Pharma: How Medicine Is Broken, and How We Can Fix It by Ben Goldacre
data acquisition, framing effect, if you build it, they will come, illegal immigration, income per capita, meta analysis, meta-analysis, placebo effect, randomized controlled trial, Ronald Reagan, Simon Singh, WikiLeaks
These are widely celebrated, and everyone now speaks of ghostwriting as if it has been fixed by the ICMJE. But in reality, as we have seen so many times before, this is a fake fix: the guidelines are hopelessly vague, and are exploited in ways that are so obvious and predictable that it takes only a paragraph to describe. The ICMJE criteria require that someone is listed as an author if they fulfil three criteria: they contributed to the conception and design of the study (or data acquisition, or analysis and interpretation); they contributed to drafting or revising the manuscript; and they had final approval on the contents of the paper. This sounds great, but because you have to fulfil all three criteria to be listed as an author, it is very easy for a drug company’s commercial medical writer to do almost all the work, but still avoid being listed as an author. For example, a paper could legitimately have the name of an independent academic on it, even if they only contributed 10 per cent of the design, 10 per cent of the analysis, a brief revision of the draft, and agreed the final contents.
Meanwhile, a team of commercial medical writers employed by a drug company on the same paper would not appear in the author list, anywhere at all, even though they conceived the study in its entirety, did 90 per cent of the design, 90 per cent of the analysis, 90 per cent of the data acquisition, and wrote the entire draft.91 In fact, often the industry authors’ names do not appear at all, and there is just an acknowledgement of editorial assistance to a company. And often, of course, even this doesn’t happen. A junior academic making the same contribution as many commercial medical writers – structuring the write-up, reviewing the literature, making the first draft, deciding how best to present the data, writing the words – would get their name on the paper, sometimes as first author.
Graph Databases by Ian Robinson, Jim Webber, Emil Eifrem
Amazon Web Services, anti-pattern, bioinformatics, corporate governance, create, read, update, delete, data acquisition, en.wikipedia.org, fault tolerance, linked data, loose coupling, Network effects, recommendation engine, semantic web, sentiment analysis, social graph, software as a service, SPARQL, web application
There is another aspect to velocity, which is the rate at which the structure of the data changes. In other words, as well as the value of specific properties changing, the overall structure of the elements hosting those properties can change as well. This commonly occurs for two reasons. The first is fast-moving business dynamics: as the business changes, so do its data needs. The second is that data acquisition is often an experimental affair: some properties are captured “just in case”, others are introduced at a later point based on changed needs; the ones that prove valuable to the business stay around, others fall by the wayside. Both these forms of velocity are problematic in the relational world, where high write loads translate into a high processing cost, and high schema volatility has a high operational cost.
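The structural-velocity point can be made concrete: in a schema-free model, records in the same collection simply carry different properties as the business's data needs change, while a relational table needs a migration (and NULL-padding) for the union of every property ever captured. A toy sketch in Python, with invented property names for illustration:

```python
# Two "user" records captured at different times: the structure evolved.
users = [
    {"id": 1, "name": "Alice", "fax": "555-0100"},  # captured "just in case"
    {"id": 2, "name": "Bob", "twitter": "@bob"},    # introduced later
]

def properties_in_use(records):
    """Which properties actually carry data anywhere in the collection."""
    seen = set()
    for record in records:
        seen.update(k for k, v in record.items() if v is not None)
    return seen

print(sorted(properties_in_use(users)))  # ['fax', 'id', 'name', 'twitter']
# A relational table would need a column for each of these four properties,
# NULL-padded on every row that lacks one, plus an ALTER TABLE per addition.
```

Properties that prove valuable keep appearing in new records; ones that fall by the wayside simply stop being written, with no schema change in either direction.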
Where Good Ideas Come from: The Natural History of Innovation by Steven Johnson
Ada Lovelace, Albert Einstein, Alfred Russel Wallace, carbon-based life, Cass Sunstein, cleantech, complexity theory, conceptual framework, cosmic microwave background, crowdsourcing, data acquisition, digital Maoism, discovery of DNA, Dmitri Mendeleev, double entry bookkeeping, double helix, Douglas Engelbart, Drosophila, Edmond Halley, Edward Lloyd's coffeehouse, Ernest Rutherford, Geoffrey West, Santa Fe Institute, greed is good, Hans Lippershey, Henri Poincaré, hive mind, Howard Rheingold, hypertext link, invention of air conditioning, invention of movable type, invention of the printing press, invention of the telephone, Isaac Newton, Islamic Golden Age, Jacquard loom, James Hargreaves, James Watt: steam engine, Jane Jacobs, Jaron Lanier, John Snow's cholera map, Joseph Schumpeter, Joseph-Marie Jacquard, Kevin Kelly, lone genius, Louis Daguerre, Louis Pasteur, Mason jar, Mercator projection, On the Revolutions of the Heavenly Spheres, online collectivism, packet switching, PageRank, patent troll, pattern recognition, price mechanism, profit motive, Ray Oldenburg, Richard Florida, Richard Thaler, Ronald Reagan, side project, Silicon Valley, silicon-based life, six sigma, Solar eclipse in 1919, spinning jenny, Steve Jobs, Steve Wozniak, Stewart Brand, The Death and Life of Great American Cities, The Great Good Place, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, transaction costs, urban planning
In this respect, Berners-Lee was supremely lucky in the work environment he had settled into, the Swiss particle physics lab CERN. It took him ten years to nurture his slow hunch about a hypertext information platform. He spent most of those years working at CERN, but it wasn’t until 1990—a decade after he had first begun working on Enquire—that CERN officially authorized him to work on the hypertext project. His day job was “data acquisition and control”; building a global communications platform was his hobby. Because the two shared some attributes, his superiors at CERN allowed Berners-Lee to tinker with his side project over the years. Thanks to a handful of newsgroups on the Internet, Berners-Lee was able to supplement and refine his ideas by conversing with other early hypertext innovators. That combination of flexibility and connection gave Berners-Lee critical support for his idea.
The Filter Bubble: What the Internet Is Hiding From You by Eli Pariser
A Declaration of the Independence of Cyberspace, A Pattern Language, Amazon Web Services, augmented reality, back-to-the-land, Black Swan, borderless world, Build a better mousetrap, Cass Sunstein, citizen journalism, cloud computing, cognitive dissonance, crowdsourcing, Danny Hillis, data acquisition, disintermediation, don't be evil, Filter Bubble, Flash crash, fundamental attribution error, global village, Haight Ashbury, Internet of things, Isaac Newton, Jaron Lanier, Jeff Bezos, jimmy wales, Kevin Kelly, knowledge worker, Mark Zuckerberg, Marshall McLuhan, megacity, Netflix Prize, new economy, PageRank, paypal mafia, Peter Thiel, recommendation engine, RFID, sentiment analysis, shareholder value, Silicon Valley, Silicon Valley startup, social graph, social software, social web, speech recognition, Startup school, statistical model, stem cell, Steve Jobs, Steven Levy, Stewart Brand, technoutopianism, the scientific method, urban planning, Whole Earth Catalog, WikiLeaks, Y Combinator
When a clothing company determines that knowing your favorite color produces a $5 increase in sales, it has an economic basis for pricing that data point—and for other Web sites to find reasons to ask you. (While OkCupid is mum about its business model, it likely rests on offering advertisers the ability to target its users based on the hundreds of personal questions they answer.) While many of these data acquisitions will be legitimate, some won’t be. Data are uniquely suited to gray-market activities, because they need not carry any trace of where they have come from or where they have been along the way. Wright calls this data laundering, and it’s already well under way: Spyware and spam companies sell questionably derived data to middlemen, who then add it to the databases powering the marketing campaigns of major corporations.
4chan, Airbnb, Amazon Mechanical Turk, asset-backed security, barriers to entry, Berlin Wall, big-box store, bitcoin, blockchain, citizen journalism, collaborative consumption, congestion charging, Credit Default Swap, crowdsourcing, data acquisition, David Brooks, don't be evil, gig economy, Hacker Ethic, income inequality, informal economy, invisible hand, Jacob Appelbaum, Jane Jacobs, Jeff Bezos, Khan Academy, Kibera, Kickstarter, license plate recognition, Lyft, Mark Zuckerberg, move fast and break things, natural language processing, Netflix Prize, Network effects, new economy, Occupy movement, openstreetmap, Paul Graham, peer-to-peer lending, Peter Thiel, pre–internet, principal–agent problem, profit motive, race to the bottom, Ray Kurzweil, recommendation engine, rent control, ride hailing / ride sharing, sharing economy, Silicon Valley, Snapchat, software is eating the world, South of Market, San Francisco, TaskRabbit, The Nature of the Firm, Thomas L Friedman, transportation-network company, Uber and Lyft, Uber for X, ultimatum game, urban planning, WikiLeaks, winner-take-all economy, Y Combinator, Zipcar
Yet we should be careful about drawing general conclusions about the overall effect of any new technology: as criminologist Clive Norris has shown, license plate recognition has now become a way of tracking known individuals as they move around, and it is no surprise who is tracked more and who is tracked less.73 The underlying problem remains: there is still racism in the system, but it is now manifested in different ways. Data acquisition shifts the place where racism happens from the street to the database query. There is no evidence of intentional discrimination by the companies, and the patterns may change as the systems evolve, but we should be cautious about ascribing too much blame or credit to the companies involved. Money is one of the main points of contention for many jobs, but Uber is not just another employer.
A Man on the Moon by Andrew Chaikin
It is this envelope of dimly glowing gas, the corona, that frames the moon’s silhouette during a total solar eclipse on earth. For a glimpse of its cold, eerie light astronomers will travel halfway around the world, but Mattingly now saw the corona as only the space traveler could, in the last minutes of orbital night, while the sun still hid below the unseen horizon. It was Mattingly’s task to capture the corona on film using the Data Acquisition Camera (DAC). And the tape—that was a matter of efficiency. Mattingly knew the inside of the command module so well that even in pitch darkness he could find his way around. And he knew that if he flicked on a flashlight to glance at a checklist, even for a moment, he’d ruin his night vision. So he’d spent an hour during the trip out to the moon reading his checklist into the portable tape recorder.
., 518, 521, 523, 536 astronauts’ wives and, 416-17 backup crew for, 284, 397, 400-401, 407-8 debriefings for, 447, 448 Endeavour in, 412, 417, 433, 434-436, 448 Falcon in, 402, 406, 412-18, 421, 423, 426, 432, 433, 440, 442, 444, 449 and first-day covers stamp deal, 445, 496-98, 549,551,640, 642, 646 geological information from, 414-15, 453, 474 geological training for, 399-400, 414, 424-25,434-35, 448, 639 Griffin in, 408, 428, 440 Hadley Rille as objective of, 402-3, 406, 408,412-15,419, 420, 423 Irwin in, 400, 403-4, 405, 408, 409-51,497, 549 Irwin-Scott relationship in, 404, 638 liftoff of, 417 lunar observations in, 434-35, 506, 507 lunar orbit of, 412, 433, 434 McDivitt in, 406, 407, 440, 442 orbiting lunar science platform for, 434, 453 parachutes of, 444 Powered Descent in, 406-7, 412-14 preflight training for, 416-17 quarantine for, 444 satellite released in, 434 Saturn V booster for, 417 Scott in, 398, 399-400, 403, 404-51,496-98, 549 Silver in, 398-99, 400, 404-8, 410, 414, 415, 419-21, 424-25, 428, 432, 448 simulation for, 407, 412, 413, 416, 424 Slayton in, 416, 428, 443, 497 sleeping in, 414, 415-16 space program and, 448-51 splashdown of, 444-48 success of, 448-49 trajectory for, 406-7, 412-13 Worden in, 400, 409, 416-17, 433, 434-36, 444-48, 497, 506, 507, 549 Apollo 15 moonwalks, 417-44 Apollo Lunar Surface Experiment Package (ALSEP) in, 422-23, 432,436, 438, 453, 468 backpacks for, 402, 426-27, 433 boulders examined in, 421-22, 427, 429, 432 cameras for, 407 circadian rhythms and, 414 deep core sample taken in, 432 — 433, 438-40 drill used in, 422-23, 432-33, 438-40 electrolyte solution for, 446-47, 475 exhaustion from, 423-24, 438, 446-47 exploration in, 409-10, 411-12, 417 first session for, 417-23 Galileo’s experiment repeated in, 442-43, 640 Genesis Rock found in, 430-31, 437, 453, 481, 558, 645 geological information from, 402, 405-6,418,419-21,423, 424-444 geologic traverses in, 417-18, 424, 433, 440-42 g-forces in, 415 gloves for, 422-23 
at Hadley Delta mountain, 403, 412,414,415,424-29, 453, 466, 478, 479, 518, 521 at Hadley Rille, 438, 440-42, 450, 488 hammer used in, 421, 423, 443, 638 landmarks for, 412-15, 419 lunar maps for, 407, 420 Lunar Roving Vehicle in, 402, 406, 407,417-18, 422, 424, 425-27, 429, 436, 441,442, 443, 449, 639 at North Complex craters, 415, 438, 440, 442 oxygen supply for, 422, 426-27 penetrometer used in, 432 rake used in, 432, 441 reconnaissance in, 414-15 Apollo 15 moonwalks (cont.) rock samples in, 420-22, 423, 428-32, 433, 453, 523 second session for, 423-33 space suits for, 402, 415-16, 422-23, 426-27, 433, 446 at Spur crater, 427, 429-32, 453, 523, 558 Surface Geology Team for, 420 third session for, 438-44 timeline of, 432, 440, 442 TV transmissions from, 419, 421, 443 walkback limit for, 426-27, 431, 432 Apollo 16, 452-94, 598-99 Apollo 13 vs., 459, 468 Apollo 14 vs., 460, 470 Apollo 15 vs., 464, 468, 469, 474, 475, 478, 479, 481 backup crew for, 549 Casper in, 456-62, 481-85, 491- 492, 494 checklists for, 484 “circ” burn in, 458-59 crew morale in, 473, 475-77 Data Acquisition Camera (DAC) in, 483-84 Descartes highland as destination of, 364, 452-56, 463, 464, 466, 474, 479, 480, 481, 489, 490, 491,492-93, 507 descent orbit of, 456, 459 Duke in, 456, 458, 459, 460, 462-91,492, 493-94 earthrise witnessed by, 484 England in, 468, 469, 472-74, 475, 476, 480, 481, 488, 489, 490-91 flight plan for, 484-85, 490 geological information from, 394, 454-56, 463, 464-91 gimbal motors’ problem in, 456 — 460, 462, 484 Kraft in, 461, 462, 478 lunar observations in, 482-85, 493, 494 lunar orbit of, 483, 484, 490, 493, 494 McDivitt in, 454, 459, 460-62, 641 Mattingly in, 456-62, 464, 481 — 485, 490-92, 493, 494, 641 media coverage of, 490 Muehlberger in, 464-67, 470, 471, 472, 474, 475, 479, 481, 486-90 music on, 484 onboard computers for, 459 orange juice for, 475-76 Orion in, 456, 458, 459, 460, 469, 472, 474, 475 photographic mission of, 482-84 Powered Descent in, 456, 459, 
462-63, 641 Saturn V booster for, 457, 461 simulation for, 457 sleeping in, 476-77, 483 solar corona observed in, 483-84 SPS engine for, 456-60, 462, 484, 641 telemetry from, 459, 461 trajectory of, 454-55 TV transmissions from, 467, 469, 471,487-88, 490 Young in, 456, 458, 460, 462-91, 493-94 Apollo 16 moonwalks, 469-90 Apollo Lunar Surface Experiment Package (ALSEP) in, 467-69, 470, 473, 477-78, 493-94 breccias found in, 470, 472, 474, 489-90, 492 at Cayley Plains, 468, 469, 470, 472, 474, 479, 481,483, 487, 490, 491 at Cincos craters, 479-80 extensions for, 487, 489 first session for, 469-74 at Flag crater, 470—71 geologic traverses in, 469-74 g-forces in, 469 House Rock found in, 488-90 Langseth’s heat-flow experiment in, 467-69, 473, 475, 477-78 Lunar Roving Vehicle in, 467, 472, 478, 479, 486, 487, 489 magnetometer in, 472 at North Ray crater, 486, 487-90 photomaps for, 466, 469 at Plum crater, 471-72 rock samples in, 463, 464, 470, 471-72, 474, 479-81,488-91, 523 second session for, 478-81 seismometer placed in, 493-94 at South Ray crater, 474, 479-481, 487 at Stone Mountain, 466, 475, 478-81, 521 Surface Geology Team for, 465, 471, 474, 479, 481, 490 third session for, 487-92 timeline of, 464-65 volcanic evidence in, 452, 455, 465, 466, 470, 473-75, 481, 483, 487, 490-91,492-93 walkback limit for, 489 Apollo 17, 495-551, 599 abort procedure for, 498, 499 America in, 498, 534-35, 545, 547, 548 Apollo 10 vs., 501-2, 511, 512 Apollo 11 vs., 511 Apollo 15 vs., 518, 521, 523, 536 backup crew for, 549-50 Cernan in, 449, 451, 498-505, 506, 508-47, 550, 566, 567-68, 643 Cernan-Schmitt relationship in, 510-14, 536, 636-37, 647 Challenger lunar module in, 504, 508-9, 514-15, 517, 519, 535, 540, 543-45, 546 checklists for, 520-21, 533 crew assigned to, 449-51 crew morale in, 503-5, 523-24 earthrise viewed from, 545-46 Evans in, 451, 498, 499-501, 531-35, 545, 548-50, 644 as final mission, 496, 504-5, 546, 547-48, 550-51 geological information from, 401, 505-7, 509-31 
Gordon and, 449, 451, 498, 515 liftoff of, 495-96 lunar observations in, 534-35, 545, 644 Lunar Orbit Insertion (LOI) of, 511 media coverage of, 532 Muehlberger in, 522-23, 525, 527, 537, 540, 541 music on, 517 nighttime launch for, 495, 498, 500-501 Nixon and, 546, 645 onboard computers of, 508 Parker in, 511, 514, 520, 522, 525, 528, 530 Powered Descent in, 508-9, 643 rendezvous in, 544-45 Saturn V booster for, 498, 499, 500-501, 504, 511 Schmitt in, 397, 401, 449, 450-451,498, 500, 503-48, 550, 643 Silver in, 522-24, 527, 528, 644 Slayton in, 449, 450, 451, 502, 533, 550 sleeping in, 517, 540 space program and, 496-98, 504- 5, 509, 530-31, 534, 543, 546, 547-48, 550-51 splashdown of, 550-51 SPS engine for, 534, 547 Taurus-Littrow as destination of, 505- 8, 512, 516, 517, 518, 522, 525, 526, 529, 530, 539, 540, 543, 547 trajectory of, 507-8 TV transmissions from, 514, 530, 537-38, 543, 547 weather observations in, 512, 535 Apollo 17 moonwalks, 509-46 Apollo Lunar Surface Experiment Package (ALSEP) in, 513, 541 breccias collected in, 538-39 core samples in, 513, 515, 516, 525, 528-29 earth as viewed in, 510, 511-13 exploration in, 519-20, 540, 546 first session for, 509-14 Apollo 17 moonwalks (cont.) 
geologic traverses in, 513, 514, 515, 518-30, 535-44 g-forces in, 517, 541, 567 hammer used in, 542 Langseth’s heat-flow experiment in, 513 at Lara crater, 524-25, 526 Lunar Gravimeter placed in, 513 Lunar Roving Vehicle in, 511, 513,515,525,530,537,539, 541, 542, 543 at Nansen crater, 520-21, 527, 528, 541 at North Massif, 536-41 orange soil discovered in, 527 — 530, 644 oxygen supply for, 513, 514 photographic mission of, 520, 527, 530, 539-40 at Poppy crater, 508 rock samples in, 518, 521-23, 525, 538-39, 542, 543, 544 at Sculptured Hills, 536, 541 second session for, 517-31 at Shorty crater, 525-30, 644 at South Massif, 517-22 space suits for, 542 Surface Geology Team for, 522 — 523, 525, 537, 540-41 third session for, 531, 535-44 timeline of, 504, 515, 521, 525, 541 volcanic evidence in, 506, 515, 527-30 walkback limit for, 519-20, 528, 644 Apollo 18, 283, 284, 401-2, 503, 506, 541, 578, 643 Apollo 19, 283, 506, 578, 643 Apollo 20, 232, 283, 285, 350 Armstrong, Jan, 19, 21, 161, 181, 202, 568 Armstrong, Neil, 138, 160-63, 168-69, 175, 586, 619-20 Aldrin and, 147-50, 173, 227, 569, 570, 618 Apollo 8 and, 82 in Apollo 11, 137, 138-39, 147- 150, 166, 183, 184-227, 250-251, 255, 291, 323, 390-91, 580,617-18, 623-24, 647 Apollo 17 and, 498, 504 Collins and, 168, 568 as first man on moon, 138, 147 — 150, 205-11,221,227, 569-70, 618 in Gemini project, 22, 51-52, 168-69, 170 geological training of, 179, 390 — 391 post-Apollo experience of, 565, 568-70, 582 as test pilot, 32, 138, 160-63, 165, 168, 169 Astronaut Office, 28, 30, 137, 282, 304, 449, 496 Shepard as head of, 44-45, 245, 291, 342-43, 350, 388, 389, 396-97, 611 astronauts, 4-5, 27-55, 114 aerospace design and, 16, 27, 31 biographical information on, 585-94 competition among, 29, 35, 42-49, 64 deaths of, 11-26, 51, 247-48, 443 as former test pilots, 21, 32, 34, 45, 47, 54, 115 geological training for, 389-410 income and perks of, 32, 41 lunar landing as goal of, 29-30, 54-55 marriages of, 349-50 medical 
evaluation of, 46-47 post-Apollo experiences of, 553 — 583 public image of, 349-50, 497-98 risks taken by, 20-23, 26, 30 rookie, 29, 32-37,41,43-49, 50, 52-53 scientists vs., 386-88, 389 selection process for, 1-7, 33-34, 35, 39-40, 50, 51, 136, 137-38, 176, 284, 342, 346-48 training of, 29-37, 45, 49 veteran, 28, 32-35, 37, 41 wives of, 64, 114, 115, 349-50, 416 see also individual astronauts Babbitt, Don, 17 Bales, Steve, 191, 194, 195, 196 Bassett, Charlie, 21, 45, 48, 51, 65 Bean, Alan, ix, 48, 243-48, 391, 586 in Apollo project, 53, 134, 245 — 248 Apollo 1 disaster and, 19 in Apollo 12, 234-41, 243-84, 371, 391, 580-81 Apollo 15 and, 448 Conrad and, 245, 246, 247, 248, 281 in Fourteen, 41, 48, 50, 51, 53, 245 as fourth man on moon, 263 post-Apollo experience of, 580— 583 Bean, Sue, 19 Bennett, Floyd, 413 Benware, Betty, 296, 310 Benware, Bob, 296 Beregovoy, Georgi, 77, 634 Bergman, Jules, 297 Berry, Chuck, 19, 98-99, 182-83, 245, 288, 307, 333, 334, 446 Bohm, David, 557 Borman, Frank, 32, 53, 60-61, 77-78, 123, 124, 125, 133, 586-87 Apollo 1 disaster and, 60, 61, 124, 611, 613 in Apollo 8, 60-62, 64, 65, 66-67, 70-71, 73, 74, 75, 77-134, 290, 389-90,614,615, 621-22 Apollo 11 and, 128, 137-38, 290 in Gemini project, 42, 49, 50, 62, 67-68, 80, 103, 128 at North American plant, 27, 60, 61, 124, 613-14 post-Apollo experience of, 291, 562-63 Borman, Susan, 122-25, 127, 133, 310, 311, 647 Bostick, Jerry, 191 Boudette, Gene, 466, 470, 493 Bourgin, Simon, 616 Brand, Vance, 317, 324, 325, 401 Bush, George, 577 Carlton, Bob, 191 Carpenter, Scott, 35 Carr, Jerry, 104, 106, 110, 115, 125, 202 Apollo 12 and, 238, 240-41, 260 Carrying the Fire (Collins), 568 Cemanr Barbara, 499 Cernan, Gene, 51, 147, 505, 587, 647 Apollo 1 disaster and, 15, 26 in Apollo 10, 136, 150-51, 152, 155-59, 165, 191, 250, 501-2, 511, 512 Apollo 14 and, 354, 368-69, 370, 377 in Apollo 17, 449, 451,498-505, 506, 508-47, 550, 566, 567-68, 643 as eleventh man on moon, 509 in Gemini project, 51, 156, 
501, 502, 614 in helicopter crash, 449, 640-41 as last man on moon, 544 post-Apollo experience of, 565-68, 582 Schmitt and, 510-14, 536, 636-37, 647 Cernan, Tracy, 500, 501-2, 508, 540 Chaffee, Martha, 19 Chaffee, Roger, 12, 14, 17, 19, 21-26, 30, 637 Challenger disaster, 565, 569, 573-74 Charlesworth, Cliff, 208 Chauvin, Skip, 235, 236 Clarke, Arthur C., 111, 291, 616, 624 Clinton, Bill, 577 Cohen, Aaron, 574 Collins, Mike, 45, 53, 65, 587 Apollo 1 disaster and, 19 Apollo 8 and, 86, 87, 89, 91, 98, 117, 126, 127, 174 in Apollo 11, 138, 148, 173, 174-76, 177, 184-90, 192, 202, 219-27, 250, 395, 621 Armstrong and, 168, 568 in Gemini project, 48-49 post-Apollo experience of, 560 Collins, Pat, 176, 182, 202 Columbia space shuttle, 573 command module, Apollo: abort handle in, 73, 85, 87 Block I prototype for, 16, 17, 608 Block II prototype for, 16, 27, 60, 61, 608 design of, 12, 13-15, 16, 17, 23-25, 27, 60, 61, 73, 74 fireproofing of, 61, 82 hatch for, 14, 17, 24-25, 609-10 testing of, 60-62 Conrad, Charles “Pete,” ix, 3-7, 27, 29-37, 48, 54-55, 192, 587-88 Apollo 11 and, 136-37 in Apollo 12, 234-43, 246, 247, 248-84, 323, 351, 371, 391 Apollo 13 and, 296, 297, 310, 333 Bean and, 245, 246, 247, 248, 281 in Gemini project, 29, 41-43, 52, 54, 68, 242-43, 253, 279-80 geological training of, 391, 399, 405-6 Lovell and, 27, 36, 42, 65 in New 4, 31-32, 35-37, 41 post-Apollo experience of, 554-56, 580 as test pilot, 3-4, 5, 7, 27, 34, 36, 41, 55, 65 as third man on moon, 260-63 training of, 35-37, 41 Conrad, Jane, 261, 310 Cook, James, 411-12 Cooper, Gordon: in Apollo project, 347-48, 378, 449 in Gemini project, 42, 43, 279, 347 in Mercury project, 341, 347 in Original 7, 31, 44 cosmonauts, 57, 58, 77, 409-10, 443, 613, 634 Criswell, David, 576 Cronkite, Walter, 227 Cunningham, Claire, 48 Cunningham, Lo, 143 Cunningham, Walt, 47, 48, 53, 245, 610 in Apollo 7, 76, 77 in Fourteen, 47, 48, 50-51, 53, 245, 246 Dana, Bill, 163 De’Orsey, Leo, 342 
Duke, Charlie, 158, 186, 191, 288, 345, 588 Apollo 11 and, 186, 191, 192, 195, 196, 197, 199, 202, 204, 220 Apollo 13 and, 308, 312 in Apollo 16, 456, 458, 459, 460, 462-91,492, 493-94 Apollo 17 and, 549-50 geological training of, 393-94 as tenth man on moon, 469 Duke, Dotty, 485-86 Duke, Mike, 392 Dwight, Ed, 611-12 Ehrlichman, John, 336 Eiermann, Horst, 497 Eisele, Donn, 76, 349 El-Baz, Farouk, 394-96, 434-35, 482-83, 484, 535, 639, 644 Elston, Don, 466, 470, 493 England, Tony, 468, 469, 472-74, 475, 476, 480, 481, 488, 489, 490-91 Engle, Joe, 370, 377, 449-50, 451, 503, 535 Evans, Jaime, 451, 499 Evans, Jan, 499-501,531-33 Evans, Jon, 534 Evans, Ron, 186, 499-500, 532-33, 551, 588 Apollo 11 and, 186, 222 Apollo 14 and, 360 in Apollo 17, 451, 498, 499-501, 531-35, 545, 548-50, 644 Eyles, Don, 358 Fallaci, Oriana, 261 Feltz, Charlie, 609 Fendell, Ed, 487, 522 Fourteen group, 41, 43-49, 50 Frank, Pete, 472 Freedom 7, 337-40, 352 Freedom space station, 577 Freeman, Ted, 21 Frick, Charles, 608-9 Frondell, Clifford, viii, 233 Frost, Robert, 582 Fullerton, Gordon, 535, 546 Gagarin, Yuri, 58, 340 Galileo Galilei, 442-43, 640 Garman, Jack, 195 Garriott, Owen, 387 Gemini project: Apollo project compared with, 16, 102, 128, 130, 148, 254 docking missions in, 43, 50, 51 medical experiments in, 46-47, 62 Mercury project compared with, 11, 22, 23, 28, 33 rendezvous flights in, 43, 49, 50, 54, 142, 168 safety of, 22-23 selection process for, 33-34, 35, 41 -43, 48-49, 50,51 spacecraft for, 16, 24, 92 space walks in, 50, 140, 144, 146-47, 206, 242-43 training for, 35-37, 41 Gemini 3, 42 Gemini 4, 42 Gemini 5, 29, 54, 253, 279-80, 347 Gemini 6, 42, 49 Gemini 7, 48-49, 50, 62, 67-68, 80, 128 Gemini 8, 22, 48, 49, 50, 51, 52, 168-69, 170, 399 Gemini 9, 48, 51, 501, 502,614 Gemini 10, 49, 51 Gemini 11, 49, 52, 54, 68, 242-43 Gemini 12, 51, 65, 103, 140, 144, 347 Gibson, Ed, 262, 268, 276, 387 Gilruth, Bob, 61, 178, 241, 285-86, 338, 406, 504, 612, 629 Glenn, John, 5, 31, 34-35,610 
as first American in orbit, 5, 340, 579 Kennedy and, 610 in Mercury program, 5, 6, 163 Goddard, Robert, 79 Gold, Thomas, 180 Gooding, Jim, 645 Gordon, Dick, ix, 41, 45, 242-43, 389, 588-89 in Apollo 12, 235-43, 247, 248-249, 252, 253, 254, 256-57, 267-69, 280-84, 395 Apollo 15 and, 400-401,421 Apollo 17 and, 449, 451, 498, 515 in Gemini project, 49, 52, 68, 242-43 geological training of, 398, 408 Graveline, Duane, 387 Griffin, Gerry, 237-38, 312, 408, 428, 440, 513, 514, 542 Grissom, Betty, 19 Grissom, Gus: in Apollo 1 disaster, 12-26, 30, 53, 77, 348, 610-11 in Gemini project, 13, 22-23, 42, 339 in Mercury project, 12-13, 607-608 in Original 7, 12-14, 31, 33 Grumman Corporation, 56, 151, 155-56, 257, 304, 307, 406, 424, 504 Haise, Fred, 589, 629-30 in Apollo 13, 286, 288, 292-335, 351, 397, 629-31,646 Apollo 14 and, 358, 369-71 Apollo 16 and, 477, 549 geological training of, 393, 395, 396, 464 Haise, Mary, 333 Hamblin, Dora Jane, 633 Hammick, Jerry, 296-97 Harter, Alan, 18, 21-22 Hartzell, Lew, 78 Hasselblad cameras, 111, 119, 209-210, 225, 278, 363 Head, Jim, 450-51 Heinlein, Robert, 444 Henize, Karl, 435 Hillary, Edmund, 204, 623-24 Horz, Fred, 492 Hotz, Robert, 568-69 House, William, 343 Houston, Jean, 556 Hubble Space Telescope, 573, 577 Irwin, Jim, 249, 403-4, 416-17, 459, 589 Apollo 12 and, 249, 264 in Apollo 15, 400, 403-4, 405, 408, 409-51,497, 549, 638 as eighth man on moon, 423 geological training of, 398, 404-8 heart problem of, 446-47, 475 post-Apollo experience of, 557 — 559 Irwin, Mary, 403, 416 Jackson, Dale, 421-22, 540, 637 Johnson, Lyndon B., 49, 58, 78, 338 Kapryan, Walter, 235 Kelly, Fred, 18, 21-22 Kennedy, John F., vii, 1-2, 11, 26, 27, 31, 39, 43, 55, 58, 134, 135, 219, 231-32, 338, 340-41, 383, 384, 546, 576, 578, 610 Kennedy, Robert, 56 Kepler, Johannes, 165 Kerwin, Joe, 320-21, 332, 334, 387, 555 King, Elbert, 621 King, Jack, 498 King, Martin Luther, Jr., 56 Kissinger, Henry, 548 Komarov, Vladimir, 57, 77 Koppel, Ted, 565 Korean War, 12, 
160, 168 Kraft, Christopher Columbus: Apollo 7 and, 76-77 Apollo 8 and, 59, 67-71, 77, 90, 103, 104, 105, 107, 124, 125, 126 Apollo 9 and, 140 Apollo 10 and, 151 Apollo 12 and, 241, 255, 257 Apollo 14 and, 345-46 Apollo 15 and, 442 Apollo 16 and, 461, 462, 478 Apollo 17 and, 504, 506, 550 as director of Flight Crew Operations, 59, 67-68, 170, 171, 406, 609, 630, 642 Kranz, Eugene F., viii, 170, 191 Apollo 11 and, 170-73, 176, 190-92, 194, 195 Apollo 13 and, 295, 299, 321, 325, 335 Gemini 8 aborted by, 52 Langseth, Mark, 467-69, 473, 475, 477-78, 513 Lawrence, Robert, 612 Leonov, Aleksey, 613 Liberty Bell 7, 13 Life, 31, 32, 33, 181, 342, 345, 352, 499, 610, 633 Lindbergh, Charles, 4, 31, 79-80, 231, 614 Logsden, John, 336, 625 Lorenzo, Frank, 562 Lousma, Jack, 293, 297, 299, 300, 302, 318, 319, 322-23, 327 Lovelace, Randy, 386 Lovell, Barbara, 114 Lovell, Jay, 115 Lovell, Jeffrey, 114, 310, 334 Lovell, Jim, 27,31,36,51,63-65, 79, 89, 503, 590 in Apollo 8, 60, 64, 65-66, 71, 73, 77, 78-134, 313, 563, 583, 616-17 in Apollo 13, 286-87, 289, 290-335, 348, 646 Apollo 15 and, 420 Apollo 17 and, 498, 528, 541 Conrad and, 27, 36, 42, 65 in Gemini project, 42, 48-49, 50, 51, 62, 65, 80, 102, 144 geological training of, 393-94, 395, 396, 464, 467 Lovell, Marilyn, 63-64, 114-16, 127, 286-87, 290, 296-97, 310-11,333-34, 335, 630 Lovell, Susan, 310 Low, George, 57, 58, 62, 82, 349, 506, 551 Luna 16 lunar probe, 409 Lunakhod 1 robot, 409 Lunar Landing Training vehicle (LLTV), 177-79,259, 463, 620 lunar module (LM): abort procedure for, 166-67, 172, 173, 192, 195, 196, 198, 199, 200 “ahead of the airplane” concept in, 165, 186 alarms of, 194-95, 196 ascent engine of, 56, 155, 159, 167, 202, 222-23, 279-80 ascent stage of, 167, 222-24, 305 “dead man’s curve” concept in, 167, 172, 199 delta-H information for, 194, 195 descent engine of, 158, 159, 164, 167, 193, 195, 200, 257, 302-3, 309-10, 314, 322, 323, 328, 358, 388, 406 design of, 34, 56-57, 151, 155-156,257, 304, 383, 388, 
402, 406 footpads of, 168, 359, 368 fuel supply for, 179, 192, 198, 199, 200, 259, 353 hammocks in, 270, 271, 415 hatch of, 148-49, 207, 257, 260, 414-15 instrument panel for, 34, 163-64, 260 landing of, 113, 164-69, 209, 250-51 Landing Point Designator (LPD) for, 166, 196, 258 landmarks for, 113, 165-66, 201-2, 254-55, 258-59 life support systems of, 304-5, 312, 318, 319-20, 326 navigation system of, 166, 167, 196 nicknames for, 155, 257, 619 onboard computer of, 164, 166, 168, 191, 193-95, 196, 197, 201, 251, 256, 357-58, 412, 413 oxygen supply of, 180, 207 pinpoint landing of, 250-51, 255, 257, 259-60, 282, 351, 367 pitchover maneuver for, 165, 193, 195-96, 254, 258, 259, 412, 508 Powered Descent of, 164-69, 172, 176, 185, 190, 191-200, 202-3, 219-20, 357-59, 406-7, 412-14, 456, 508-9 radar of, 358-59 shutdown of, 199-200 simulator for, 136-37, 163-74, 176-77, 193-97, 249-50, 258, 620 surface disturbed by, 198-99, 200-201, 259-60 telescope of, 312-13 throttle of, 164, 165, 167 thrusters of, 159, 173, 189, 193, 197, 199 toggle switch in, 167, 197, 199 weight-saving program for, 151, 155-56 windows of, 195, 202, 207, 210, 257 Lunar Orbiter probes, 70, 179, 274, 394, 412, 435, 454, 482 Lunar Receiving Laboratory (LRL), 180-81, 227, 233, 365-67, 374, 430, 440, 447, 474, 492 Lunney, Glynn, 76, 299-300, 335 McAuliffe, Christa, 573 McCandless, Bruce, 150, 208, 210, 216, 217 McDivitt, Jim, 29, 33, 61, 62 in Apollo 9, 136, 137, 143, 144, 248-49, 291, 399, 461 Apollo 11 and, 137 Apollo 12 and, 138, 460 Apollo 15 and, 406, 407, 440, 442
The Googlization of Everything: by Siva Vaidhyanathan
1960s counterculture, AltaVista, barriers to entry, Berlin Wall, borderless world, Burning Man, Cass Sunstein, choice architecture, cloud computing, computer age, corporate social responsibility, correlation does not imply causation, data acquisition, death of newspapers, don't be evil, Firefox, Francis Fukuyama: the end of history, full text search, global village, Google Earth, Howard Rheingold, informal economy, information retrieval, Joseph Schumpeter, Kevin Kelly, knowledge worker, libertarian paternalism, market fundamentalism, Marshall McLuhan, means of production, Mikhail Gorbachev, Naomi Klein, Network effects, new economy, Nicholas Carr, PageRank, pirate software, Ray Kurzweil, Richard Thaler, Ronald Reagan, side project, Silicon Valley, Silicon Valley ideology, single-payer health, Skype, social web, Steven Levy, Stewart Brand, technoutopianism, The Nature of the Firm, The Structural Transformation of the Public Sphere, Thorstein Veblen, urban decay, web application
So the broader Google’s reach becomes—the more it Googlizes us—the more likely it is that even informed and critical Internet users will stay in the Google universe and allow Google to use their personal information. For Google, quantity yields quality. For us, resigning ourselves to the Google defaults enhances convenience, utility, and status. But at what cost? THE PROBLEM WITH PRIVACY Google is far from the most egregious offender in the world of personal data acquisition. Google promises (for now) not to sell your data to third parties, and it promises not to give it to agents of the state unless the agents of the state ask for it in a legal capacity. (The criteria for such requests are lax, however, and getting more lax around the world.) But Google is the master at using information in the service of revenue generation, and many of its actions and policies are illustrative of a much larger and deeper set of social and cultural problems.
Testing Extreme Programming by Lisa Crispin, Tip House
Flushing out the hidden assumptions in the stories helps programmers implement more accurately the first time around, which makes for happy customers. XP may be self-correcting in terms of bringing these things out in the open eventually, but clearing them up earlier allows for more velocity later on. Specifying acceptance tests up front with the stories is really a type of test-first development. It helps avoid some types of defects altogether and gives the team a head start on test automation and test data acquisition. This allows functional testing to keep up with the programmers at crunch time, when the end of the iteration approaches. Including acceptance test tasks in the stories and enabling accurate story estimates makes release and iteration planning more accurate and provides time to automate the tests, which pays off many times over in later iterations. In the next chapters, we'll expand on each of these goals, discuss effective ways to accomplish them, and provide examples.
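The up-front acceptance-test practice this excerpt describes can be sketched in a few lines of Python. The story, the discount rule, and the name `calculate_discount` are invented for illustration and are not from the book; the point is that the tests are written with the story, before the code exists, and force hidden assumptions (like the boundary case) into the open.

```python
# Story: "A customer with a loyalty card gets 10% off orders over $50."
# The acceptance tests below are specified up front, with the story.

def calculate_discount(order_total, has_loyalty_card):
    """The code under test: return the discount amount for an order."""
    if has_loyalty_card and order_total > 50:
        return round(order_total * 0.10, 2)
    return 0.0

def test_loyalty_discount_applies_over_threshold():
    assert calculate_discount(100.0, True) == 10.0

def test_no_discount_without_card():
    assert calculate_discount(100.0, False) == 0.0

def test_boundary_exactly_fifty_is_not_over():
    # Writing this first forces the customer to decide: is $50 "over $50"?
    assert calculate_discount(50.0, True) == 0.0

if __name__ == "__main__":
    test_loyalty_discount_applies_over_threshold()
    test_no_discount_without_card()
    test_boundary_exactly_fifty_is_not_over()
    print("all acceptance tests pass")
```

The boundary test is the kind of hidden assumption the passage says gets flushed out early rather than discovered at crunch time.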
@War: The Rise of the Military-Internet Complex by Shane Harris
Amazon Web Services, barriers to entry, Berlin Wall, Brian Krebs, centralized clearinghouse, clean water, computer age, crowdsourcing, data acquisition, don't be evil, Edward Snowden, failed state, Firefox, Julian Assange, mutually assured destruction, Silicon Valley, Silicon Valley startup, Skype, Stuxnet, uranium enrichment, WikiLeaks, zero day
“We knew what luring words and phrases the e-mails used before they were sent,” the former official says. “We told companies what to be on the lookout for. What e-mails not to open. We could tell them ‘You’re next on the list.’” Among the most worrisome people on those lists were employees of American oil and natural gas companies. These businesses own and operate major refineries and pipelines that are run by SCADA (supervisory control and data acquisition) systems, the same kinds of devices that the NSA attacked in the Iranian nuclear facility to make centrifuges break down. Chinese attempts to penetrate oil and natural gas companies “were never-ending,” the former official says. The campaign reached a fever pitch in the spring of 2012, when hackers penetrated the computer networks of twenty companies that own and operate natural gas pipelines.
The Future of the Brain: Essays by the World's Leading Neuroscientists by Gary Marcus, Jeremy Freeman
23andMe, Albert Einstein, bioinformatics, bitcoin, brain emulation, cloud computing, complexity theory, computer age, computer vision, conceptual framework, correlation does not imply causation, crowdsourcing, dark matter, data acquisition, Drosophila, epigenetics, Google Glasses, iterative process, linked data, mouse model, optical character recognition, pattern recognition, personalized medicine, phenotype, race to the bottom, Richard Feynman, Richard Feynman, Ronald Reagan, semantic web, speech recognition, stem cell, Steven Pinker, supply-chain management, Turing machine, web application
A presumption emerging in our field is that the strategy for success is to collect masses of data, and then, only afterward, to distill from that data an understanding of how the brain works. In some domains, this rather static strategy—collect data first, analyze later—may be both reasonable and profitable. Take, for example, the problem of segmenting neurons from anatomical images to identify connectivity. Achieving that goal will demand powerful algorithms, but the goal itself is clear, so the analysis can proceed somewhat independently of data acquisition and experiment. But the more we stray from such well-defined problems, the less realistic that sort of static strategy may be. In most cases, we do not quite yet know which data we want to collect. Even if it is clear which kinds of measurements we want to make (for example, whole-brain calcium imaging of the larval zebrafish, two-photon imaging of multiple areas of mouse cortex), it is not clear which behaviors the organism should be performing while we collect those data, or which environment it should be experiencing.
Every Patient Tells a Story by Lisa Sanders
data acquisition, discovery of penicillin, high batting average, index card, medical residency, meta analysis, meta-analysis, natural language processing, pattern recognition, randomized controlled trial, Ronald Reagan
Poking, prodding, and thumping in places where it just won’t tell them anything.” And he found residents almost universally grateful when he showed them a better way of doing it. “The physical exam just becomes a much more useful tool when you use it correctly.” In a paper first promulgating the use of direct observation as a tool in evaluating residents, Eric wrote: “Direct observation of trainees is necessary to evaluate the process of data acquisition and care. A trainee’s ability to take a complete history; perform an accurate, thorough physical examination; communicate effectively; and demonstrate appropriate interpersonal and professional behavior can best be measured through the direct sampling of these clinical skills.” It seems obvious and yet it’s been a remarkably hard sell—not just to residents but to training programs as well.
Albert Einstein, Any sufficiently advanced technology is indistinguishable from magic, Clayton Christensen, data acquisition, delayed gratification, deliberate practice, fear of failure, Google Earth, haute couture, impulse control, Isaac Newton, Jeff Bezos, jimmy wales, Kevin Kelly, Lao Tzu, life extension, Maui Hawaii, pattern recognition, Ray Kurzweil, risk tolerance, rolodex, Silicon Valley, Steve Jobs, Walter Mischel, X Prize
“A lot of people associate serotonin directly with flow,” says high performance psychologist Michael Gervais, “but that’s backward. By the time the serotonin has arrived the state has already happened. It’s a signal things are coming to an end, not just beginning.” These five chemicals are flow’s mighty cocktail. Alone, each packs a punch, together a wallop. Consider the chain of events that takes us from pattern recognition through future prediction. Norepinephrine tightens focus (data acquisition); dopamine jacks pattern recognition (data processing); anandamide accelerates lateral thinking (widens the database searched by the pattern recognition system). The results, as basketball legend Bill Russell explains in his biography Second Wind, really do feel psychic: Every so often a Celtic game would heat up so that it would become more than a physical or even mental game, and would be magical.
23andMe, Albert Einstein, Alfred Russel Wallace, banking crisis, Barry Marshall: ulcers, Benoit Mandelbrot, Berlin Wall, biofilm, Black Swan, butterfly effect, Cass Sunstein, cloud computing, congestion charging, correlation does not imply causation, Daniel Kahneman / Amos Tversky, dark matter, data acquisition, David Brooks, delayed gratification, Emanuel Derman, epigenetics, Exxon Valdez, Flash crash, Flynn Effect, hive mind, impulse control, information retrieval, Isaac Newton, Jaron Lanier, John von Neumann, Kevin Kelly, mandelbrot fractal, market design, Mars Rover, Marshall McLuhan, microbiome, Murray Gell-Mann, Nicholas Carr, open economy, place-making, placebo effect, pre–internet, QWERTY keyboard, random walk, randomized controlled trial, rent control, Richard Feynman, Richard Feynman, Richard Feynman: Challenger O-ring, Richard Thaler, Schrödinger's Cat, security theater, Silicon Valley, stem cell, Steve Jobs, Steven Pinker, Stewart Brand, the scientific method, Thorstein Veblen, Turing complete, Turing machine, Walter Mischel, Whole Earth Catalog
Online companies, such as Amazon and Google, don’t anguish over how to design their Web sites. Instead, they conduct controlled experiments by showing different versions to different groups of users until they have iterated to an optimal solution. (And with the amount of traffic those sites receive, individual tests can be completed in seconds.) They are helped, of course, by the fact that the Web is particularly conducive to rapid data acquisition and product iteration. But they are helped even more by the fact that their leaders often have backgrounds in engineering or science and therefore adopt a scientific—which is to say, experimental—mind-set. Government policies—from teaching methods in schools to prison sentencing to taxation —would also benefit from more use of controlled experiments. This is where many people start to get squeamish.
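The controlled experiment the passage describes can be sketched minimally: split visitors between two page versions, then check whether the observed difference in conversion rates is bigger than chance would explain. The visitor counts below are invented; the two-proportion z-test is one standard way to make that check.

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    """Two-proportion z statistic: is B's conversion rate really above A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)   # pooled rate under "no difference"
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented traffic split: 10,000 visitors shown each page version.
z = z_score(conv_a=1000, n_a=10000, conv_b=1100, n_b=10000)
print(f"z = {z:.2f}")  # |z| > 1.96 means the 10% vs 11% gap is unlikely to be chance
```

With the traffic volumes those sites receive, `n` is large enough that even a one-point difference in conversion reaches significance almost immediately, which is why "individual tests can be completed in seconds."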
Pinpoint: How GPS Is Changing Our World by Greg Milner
Ayatollah Khomeini, British Empire, data acquisition, Dava Sobel, Edmond Halley, Eratosthenes, experimental subject, Flash crash, friendly fire, Hedy Lamarr / George Antheil, Internet of things, Isaac Newton, John Harrison: Longitude, Kevin Kelly, land tenure, lone genius, Mars Rover, Mercator projection, place-making, polynesian navigation, precision agriculture, race to the bottom, Silicon Valley, Silicon Valley startup, skunkworks, smart grid, the map is not the territory
., 203 Santa Ana winds, 226–27 Santa Cruz District Office, 13 Santa Cruz Islands, 11 Santa Cruz Mountains, 208 Sardinia, 159 satellite laser ranging, 209 satellite navigation systems, xvii, 37–45, 53, 76 see also Global Positioning System (GPS) satellites, 25–45, 47, 108 Air Force, xiv artificial, 44, 252–53 geostationary orbits of, 141–42 GRACE, 231–32 launching of, 29–36, 43, 61, 62–63, 88, 95 orbit and velocity of, xviii, 30, 39, 43–45, 63, 216 Soviet and Russian, 30–37, 44, 75, 101–2 spy, 39 television, 35 testing of, 58–59 tracking of, 29–32, 36–45, 59 U.S., 42–45 WAAS, 141–42, 171 see also Global Positioning System (GPS), satellites of; GPS signals Saturn, 259 Saudi Arabia, 62–64, 69 SCADA (Supervisory Control and Data Acquisition), 159 Scalia, Antonin, 189–90 Schofield, Andrew, 168 Schriever Air Force Base, xiii–xv, xx, 75, 261 Master Control Station at, xix, xx, 62–63, 256 Schultz, Kenneth, 53 Schwartzkopf, H. Norman, Jr., 65–66 Schweinfurt, 49 Schwitzgebel, Ralph, 172–77, 194–95, 198–200 Schwitzgebel, Robert, 172, 174–77, 194–95, 198–200 science, 207 cognitive, 118, 131 computer, 36, 84, 124, 239 neuro-, 129 social, 118 Science Committee on Psychological Experimentation (SCOPE), 173–74 Scotland, 195, 205 Scott, Logan, 149–50, 167 Scripps Institution of Oceanography, 218, 220, 224 seafloor spreading, 207 Sea of Japan, 81 Seattle, Wash., 225 Seaworth, Troy, 101–4, 105 Seaworth Farms, 101–4, 105 seismic monitoring equipment, 217–20, 222–24 GPS-enabled, 203–4 semiconductors, 78 Senate, U.S., 60 Preparedness Committee of, 35 sensors, 122 Serbia, 71 sex offenders, 196 sextants, 5 Shacklett, Mary, 191 Shapiro, Irwin, 209 Sharp, Andrew, 12–13 Shaw, John, xiii sidereal compass, 6, 14 Sierra Nevada Mountains, 206 Silicon Valley, 77, 79, 96 Simpson, John, 28 621B program, 44, 53, 57 Skinner, B.
The New New Thing: A Silicon Valley Story by Michael Lewis
Albert Einstein, Andy Kessler, business climate, Chance favours the prepared mind, data acquisition, family office, high net worth, invention of the steam engine, invisible hand, Jeff Bezos, Menlo Park, pre–internet, risk tolerance, Sand Hill Road, Silicon Valley, Silicon Valley startup, Thorstein Veblen, Y2K
If it wasn't the engine and it wasn't the sensor, then the problem lay somewhere between the sensor and the computers. The closer Robert came to the computers, the less sure of himself he became. "I've got to work out how the system is getting its information, how the PLCs send it to SCADA," said Robert. SCADA was an acronym for the hopeful title they'd given to their software: Superior Control and Data Acquisition. SCADA was what Steve and Lance and Tim and Clark had spent most of their time writing. It picked up the digitized information from all over the boat and manipulated it in any way it needed to be manipulated. Steve and Robert left the engine room and made for the neighboring computer room, to find out what SCADA had to say. The twenty-five slender black machines were arranged lengthwise along a wall, which resembled a sales rack in a discount outlet for VCRs.
affirmative action, airport security, Ayatollah Khomeini, clean water, cognitive dissonance, corporate governance, data acquisition, death of newspapers, Extropian, Howard Rheingold, illegal immigration, informal economy, Iridium satellite, Jaron Lanier, John von Neumann, Kevin Kelly, means of production, mutually assured destruction, offshore financial centre, open economy, packet switching, pattern recognition, pirate software, placebo effect, Plutocrats, plutocrats, prediction markets, Ralph Nader, RAND corporation, Saturday Night Live, Search for Extraterrestrial Intelligence, Steve Jobs, Steven Levy, Stewart Brand, telepresence, trade route, Vannevar Bush, Vernor Vinge, Whole Earth Catalog, Whole Earth Review, Yogi Berra, Zimmermann PGP
Alas, weʼll see in chapter 5 that this type of “solution” conflicts fundamentally with human nature. We are, at our core, information pack rats and inveterate correlators. We hunger for news, facts, and rumors—especially when they are forbidden! In this attribute, the rich and powerful, and major corporations, are no different from the rest of us. The predictable consequence? If one kind of data acquisition is made illegal, you can be certain that someone will be doing it anyway on the sly, and possibly turning its dissemination into yet another highly profitable criminal enterprise, one that must be policed by yet another bureaucracy. Here is a little philosophical exercise that can sometimes be instructive. When dealing with so-called obvious solutions to fundamental issues, always try to imagine what might happen if we extrapolate the recommended trend to some extreme degree.
CTOs at Work by Scott Donaldson, Stanley Siegel, Gary Donaldson
Amazon Web Services, bioinformatics, business intelligence, business process, call centre, centre right, cloud computing, computer vision, connected car, crowdsourcing, data acquisition, distributed generation, domain-specific language, glass ceiling, pattern recognition, Pluto: dwarf planet, Richard Feynman, Richard Feynman, shareholder value, Silicon Valley, Skype, smart grid, smart meter, software patent, thinkpad, web application, zero day
Donaldson: So that experience helped shape your interest in the influencing role and communication role of the CTO, and you saw how you could apply those skills as you grew your career. Where did you go to school? Kaplow: For my undergraduate, I went to NYU [New York University] and I majored in physics and in history. I realized that physics, although I enjoyed it quite a bit, was not necessarily the long-term path, but it did give me the opportunity to work in the High Energy physics department. I supported some experiments, mostly on the technology side for data acquisition and analysis. I realized that physics wasn't really going to be my true love, but I enjoyed it immensely and had some great professors. I was on campus there for four years and then I moved to the computer science department at the Courant Institute of Mathematical Sciences, which was sort of a natural progression from the kind of computer work I was doing in the physics department. It's probably a little bit better known for math, but it is a very, very good computer science department and had some great teachers.
Superintelligence: Paths, Dangers, Strategies by Nick Bostrom
agricultural Revolution, AI winter, Albert Einstein, algorithmic trading, anthropic principle, anti-communist, artificial general intelligence, autonomous vehicles, barriers to entry, bioinformatics, brain emulation, cloud computing, combinatorial explosion, computer vision, cosmological constant, dark matter, DARPA: Urban Challenge, data acquisition, delayed gratification, demographic transition, Douglas Hofstadter, Drosophila, Elon Musk, en.wikipedia.org, epigenetics, fear of failure, Flash crash, Flynn Effect, friendly AI, Gödel, Escher, Bach, income inequality, industrial robot, informal economy, information retrieval, interchangeable parts, iterative process, job automation, John von Neumann, knowledge worker, Menlo Park, meta analysis, meta-analysis, mutually assured destruction, Nash equilibrium, Netflix Prize, new economy, Norbert Wiener, NP-complete, nuclear winter, optical character recognition, pattern recognition, performance metric, phenotype, prediction markets, price stability, principal–agent problem, race to the bottom, random walk, Ray Kurzweil, recommendation engine, reversible computing, social graph, speech recognition, Stanislav Petrov, statistical model, stem cell, Stephen Hawking, strong AI, superintelligent machines, supervolcano, technological singularity, technoutopianism, The Coming Technological Singularity, The Nature of the Firm, Thomas Kuhn: the structure of scientific revolutions, transaction costs, Turing machine, Vernor Vinge, Watson beat the top human players on Jeopardy!, World Values Survey
In general, the worse our scanning equipment and the feebler our computers, the less we could rely on simulating low-level chemical and electrophysiological brain processes, and the more theoretical understanding would be needed of the computational architecture that we are seeking to emulate in order to create more abstract representations of the relevant functionalities.25 Conversely, with sufficiently advanced scanning technology and abundant computing power, it might be possible to brute-force an emulation even with a fairly limited understanding of the brain. In the unrealistic limiting case, we could imagine emulating a brain at the level of its elementary particles using the quantum mechanical Schrödinger equation. Then one could rely entirely on existing knowledge of physics and not at all on any biological model. This extreme case, however, would place utterly impracticable demands on computational power and data acquisition. A far more plausible level of emulation would be one that incorporates individual neurons and their connectivity matrix, along with some of the structure of their dendritic trees and maybe some state variables of individual synapses. Neurotransmitter molecules would not be simulated individually, but their fluctuating concentrations would be modeled in a coarse-grained manner. To assess the feasibility of whole brain emulation, one must understand the criterion for success.
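The "far more plausible level of emulation" sketched in that excerpt, individual neurons updated by a simple rule, with transmitter tracked as one coarse-grained fluctuating concentration rather than molecule by molecule, can be caricatured in a few lines. Every constant below is an arbitrary illustrative choice, not a value from the book or from real neuroscience.

```python
def lif_step(v, i_in, dt=1.0, tau=10.0, v_rest=-70.0, v_th=-54.0):
    """One coarse update of a leaky integrate-and-fire neuron (toy units)."""
    v = v + dt * ((v_rest - v) + i_in) / tau
    if v >= v_th:
        return v_rest, True        # spike, then reset to resting potential
    return v, False

def run(steps=200, i_in=20.0):
    """Simulate one neuron; model transmitter as a single decaying scalar."""
    v, transmitter, spikes = -70.0, 0.0, 0
    for _ in range(steps):
        v, spiked = lif_step(v, i_in)
        transmitter *= 0.9         # concentration decays each step
        if spiked:
            spikes += 1
            transmitter += 1.0     # coarse-grained release on each spike
    return spikes, transmitter

spikes, transmitter = run()
print(f"{spikes} spikes; final transmitter level {transmitter:.3f}")
```

The contrast with the "unrealistic limiting case" is the whole point: nothing here tracks particles or even molecules, only a handful of state variables per neuron and synapse.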
23andMe, 3D printing, Affordable Care Act / Obamacare, Anne Wojcicki, Atul Gawande, augmented reality, bioinformatics, call centre, Clayton Christensen, clean water, cloud computing, computer vision, conceptual framework, connected car, correlation does not imply causation, crowdsourcing, dark matter, data acquisition, disintermediation, don't be evil, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Firefox, global village, Google Glasses, Google X / Alphabet X, Ignaz Semmelweis: hand washing, interchangeable parts, Internet of things, Isaac Newton, job automation, Joseph Schumpeter, Julian Assange, Kevin Kelly, license plate recognition, Lyft, Mark Zuckerberg, Marshall McLuhan, meta analysis, meta-analysis, microbiome, Nate Silver, natural language processing, Network effects, Nicholas Carr, obamacare, pattern recognition, personalized medicine, phenotype, placebo effect, RAND corporation, randomized controlled trial, Second Machine Age, self-driving car, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, Snapchat, social graph, speech recognition, stealth mode startup, Steve Jobs, the scientific method, The Signal and the Noise by Nate Silver, The Wealth of Nations by Adam Smith, Turing test, Uber for X, Watson beat the top human players on Jeopardy!, X Prize
Only now that we can capture such panoramic data on each individual, and in populations of people, along with the ability to manage and process such enormous sets of data, are we in the enviable position of predicting illness—and maybe, just maybe, once we get good at it, even preventing diseases in some individuals from ever happening. FIGURE 13.4: Really big data from the individual and comparing that individual’s data with all of the Earth’s population (IoMT = Internet of Medical Things). The two levels of data acquisition, comparison, and machine learning—individual and population—are critical, across all of the components of one’s GIS. Predicting Disease: Who, When, How, Why, and What? First, let’s make sure we differentiate prediction from diagnosis. Online symptom checkers are getting increasing electronic traffic and attention on the Internet to help people “self” (computer-assisted) diagnose, but they don’t predict an illness.
Ayatollah Khomeini, Brian Krebs, crowdsourcing, data acquisition, Doomsday Clock, Edward Snowden, facts on the ground, Firefox, friendly fire, Google Earth, information retrieval, Julian Assange, Loma Prieta earthquake, Maui Hawaii, pre–internet, RAND corporation, Silicon Valley, skunkworks, smart grid, smart meter, South China Sea, Stuxnet, uranium enrichment, Vladimir Vetrov: Farewell Dossier, WikiLeaks, Y2K, zero day
WHEN THE SYMANTEC researchers discovered in August 2010 that Stuxnet was designed for physical sabotage of Siemens PLCs, they weren’t the only ones who had no idea what a PLC was. Few people in the world had ever heard of the devices—this, despite the fact that PLCs are the components that regulate some of the most critical facilities and processes in the world. PLCs are used with a variety of automated control systems that include the better-known SCADA system (Supervisory Control and Data Acquisition) as well as distributed control systems and others that keep the generators, turbines, and boilers at power plants running smoothly.2 The systems also control the pumps that transmit raw sewage to treatment plants and prevent water reservoirs from overflowing, and they open and close the valves in gas pipelines to prevent pressure buildups that can cause deadly ruptures and explosions, such as the one that killed eight people and destroyed thirty-eight homes in San Bruno, California, in 2010.
The New Digital Age: Transforming Nations, Businesses, and Our Lives by Eric Schmidt, Jared Cohen
3D printing, access to a mobile phone, additive manufacturing, airport security, Amazon Mechanical Turk, Amazon Web Services, anti-communist, augmented reality, Ayatollah Khomeini, barriers to entry, bitcoin, borderless world, call centre, Chelsea Manning, citizen journalism, clean water, cloud computing, crowdsourcing, data acquisition, Dean Kamen, Elon Musk, failed state, fear of failure, Filter Bubble, Google Earth, Google Glasses, hive mind, income inequality, information trail, invention of the printing press, job automation, Julian Assange, Khan Academy, Kickstarter, knowledge economy, Law of Accelerating Returns, market fundamentalism, means of production, mobile money, mutually assured destruction, Naomi Klein, offshore financial centre, peer-to-peer lending, personalized medicine, Peter Singer: altruism, Ray Kurzweil, RFID, self-driving car, sentiment analysis, Silicon Valley, Skype, Snapchat, social graph, speech recognition, Steve Jobs, Steven Pinker, Stewart Brand, Stuxnet, The Wisdom of Crowds, upwardly mobile, Whole Earth Catalog, WikiLeaks, young professional, zero day
The rest of the country would watch as the first responders scrambled to react and assess damage, but a subsequent barrage of cyber attacks could cripple the police, the fire department and emergency-information systems in those cities. If that’s not terrifying enough, while urban emergency efforts slow to a crawl amid massive physical destruction and loss of life, a sophisticated computer virus could attack the industrial control systems around the country that maintain critical infrastructure like water, power and oil and gas pipelines. Commandeering these systems, called supervisory control and data acquisition (SCADA) systems, would enable terrorists to do all manner of things: shut down power grids, reverse waste-water treatment plants, disable the heat-monitoring systems at nuclear power plants. (When the Stuxnet worm attacked Iranian nuclear facilities in 2010, it operated by compromising the industrial control processes in nuclear centrifuge operations.) Rest assured that it would be incredibly, almost unthinkably difficult to pull off this level of attack: commandeering one SCADA system alone would require detailed knowledge of the internal architecture, months of coding and precision timing.
A Declaration of the Independence of Cyberspace, AI winter, airport security, Apple II, artificial general intelligence, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Duvall, bioinformatics, Brewster Kahle, Burning Man, call centre, cellular automata, Chris Urmson, Claude Shannon: information theory, Clayton Christensen, clean water, cloud computing, collective bargaining, computer age, computer vision, crowdsourcing, Danny Hillis, DARPA: Urban Challenge, data acquisition, Dean Kamen, deskilling, don't be evil, Douglas Engelbart, Douglas Hofstadter, Dynabook, Edward Snowden, Elon Musk, Erik Brynjolfsson, factory automation, From Mathematics to the Technologies of Life and Death, future of work, Galaxy Zoo, Google Glasses, Google X / Alphabet X, Grace Hopper, Gödel, Escher, Bach, Hacker Ethic, haute couture, hive mind, hypertext link, indoor plumbing, industrial robot, information retrieval, Internet Archive, Internet of things, invention of the wheel, Jacques de Vaucanson, Jaron Lanier, Jeff Bezos, job automation, John Conway, John Maynard Keynes: Economic Possibilities for our Grandchildren, John Maynard Keynes: technological unemployment, John von Neumann, Kevin Kelly, knowledge worker, Kodak vs Instagram, labor-force participation, loose coupling, Mark Zuckerberg, Marshall McLuhan, medical residency, Menlo Park, Mother of all demos, natural language processing, new economy, Norbert Wiener, PageRank, pattern recognition, pre–internet, RAND corporation, Ray Kurzweil, Richard Stallman, Robert Gordon, Rodney Brooks, Sand Hill Road, Second Machine Age, self-driving car, semantic web, shareholder value, side project, Silicon Valley, Silicon Valley startup, Singularitarianism, skunkworks, Skype, social software, speech recognition, stealth mode startup, Stephen Hawking, Steve Ballmer, Steve Jobs, Steve Wozniak, Steven Levy, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Nelson, telemarketer, telepresence, 
telepresence robot, Tenerife airport disaster, The Coming Technological Singularity, the medium is the message, Thorstein Veblen, Turing test, Vannevar Bush, Vernor Vinge, Watson beat the top human players on Jeopardy!, Whole Earth Catalog, William Shockley: the traitorous eight
Headless and motionless, the robots were undeniably spooky. Without skin, they were cybernetic skeleton-men assembled from an admixture of steel, titanium, and aluminum. Each was illuminated by an eerie blue LED glow that revealed a computer embedded in the chest that monitored its motor control. Each of the presently removed “heads” housed another computer that monitored the body’s sensor control and data acquisition. When they were fully equipped, the robots stood six feet high and weighed 330 pounds. When moving, they were not as lithe in real life as they were in videos, but they had an undeniable presence. It was the week before DARPA would announce that it had contracted Boston Dynamics, the company that Raibert had founded two decades earlier, to build “Atlas” robots as the common platform for a new category of Grand Challenge competitions.
To Save Everything, Click Here: The Folly of Technological Solutionism by Evgeny Morozov
3D printing, algorithmic trading, Amazon Mechanical Turk, Andrew Keen, augmented reality, Automated Insights, Berlin Wall, big data - Walmart - Pop Tarts, Buckminster Fuller, call centre, carbon footprint, Cass Sunstein, choice architecture, citizen journalism, cloud computing, cognitive bias, crowdsourcing, data acquisition, Dava Sobel, disintermediation, East Village, en.wikipedia.org, Fall of the Berlin Wall, Filter Bubble, Firefox, Francis Fukuyama: the end of history, frictionless, future of journalism, game design, Gary Taubes, Google Glasses, illegal immigration, income inequality, invention of the printing press, Jane Jacobs, Jean Tirole, Jeff Bezos, jimmy wales, Julian Assange, Kevin Kelly, Kickstarter, license plate recognition, lone genius, Louis Pasteur, Mark Zuckerberg, market fundamentalism, Marshall McLuhan, Narrative Science, Nicholas Carr, packet switching, PageRank, Paul Graham, Peter Singer: altruism, Peter Thiel, pets.com, placebo effect, pre–internet, Ray Kurzweil, recommendation engine, Richard Thaler, Ronald Coase, Rosa Parks, self-driving car, Silicon Valley, Silicon Valley ideology, Silicon Valley startup, Skype, Slavoj Žižek, smart meter, social graph, social web, stakhanovite, Steve Jobs, Steven Levy, Stuxnet, technoutopianism, the built environment, The Chicago School, The Death and Life of Great American Cities, the medium is the message, The Nature of the Firm, the scientific method, The Wisdom of Crowds, Thomas Kuhn: the structure of scientific revolutions, Thomas L Friedman, transaction costs, urban decay, urban planning, urban sprawl, Vannevar Bush, WikiLeaks
Presumably, even if they had infinite shelf space, museums would not abandon the idea of curation. The latter is a deliberate commitment, not a technological constraint stemming from a lack of resources. But Gordon Bell’s one-man museum, while nominally promising to turn its heroes into curators, rejects the very selectionist spirit of curatorial work; like the self-trackers and data miners we met in the previous chapter, Gordon Bell opts for preemptive data acquisition, hoping that one day it will provide him not just with the right answers but also with the right questions. Or perhaps it will just tell him where his car keys are—and who among us relishes the time spent crawling under the table searching for them? But wearing a gadget like a SenseCam around your neck may also help you find the greatest keys of all: those to your inner self. Thus, in Your Life, Uploaded, his book-length manifesto on the benefits of lifelogging, Bell assures us that it will yield “enhanced self-insight, the ability to relive one’s own life story in Proustian detail, the freedom to memorize less and think creatively more, and even a measure of earthly immortality by being cyberized.”
Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia by Anthony M. Townsend
1960s counterculture, 4chan, A Pattern Language, Airbnb, Amazon Web Services, anti-communist, Apple II, Bay Area Rapid Transit, Burning Man, business process, call centre, carbon footprint, charter city, chief data officer, clean water, cleantech, cloud computing, computer age, congestion charging, connected car, crack epidemic, crowdsourcing, DARPA: Urban Challenge, data acquisition, Deng Xiaoping, East Village, Edward Glaeser, game design, garden city movement, Geoffrey West, Santa Fe Institute, George Gilder, ghettoisation, global supply chain, Grace Hopper, Haight Ashbury, Hedy Lamarr / George Antheil, hive mind, Howard Rheingold, interchangeable parts, Internet Archive, Internet of things, Jacquard loom, Jacquard loom, Jane Jacobs, jitney, John Snow's cholera map, Khan Academy, Kibera, knowledge worker, load shedding, M-Pesa, Mark Zuckerberg, megacity, mobile money, mutually assured destruction, new economy, New Urbanism, Norbert Wiener, Occupy movement, openstreetmap, packet switching, patent troll, place-making, planetary scale, popular electronics, RFC: Request For Comment, RFID, ride hailing / ride sharing, Robert Gordon, self-driving car, sharing economy, Silicon Valley, Skype, smart cities, Smart Cities: Big Data, Civic Hackers, and the Quest for a New Utopia, smart grid, smart meter, social graph, social software, social web, special economic zone, Steve Jobs, Steve Wozniak, Stuxnet, supply-chain management, technoutopianism, Ted Kaczynski, telepresence, The Death and Life of Great American Cities, too big to fail, trade route, Tyler Cowen: Great Stagnation, Upton Sinclair, uranium enrichment, urban decay, urban planning, urban renewal, Vannevar Bush, working poor, working-age population, X Prize, Y2K, zero day, Zipcar
Stuxnet, the virus that attacked Iran’s nuclear weapons plant at Natanz in 2010, was just the beginning. Widely believed to be the product of a joint Israeli-American operation, Stuxnet was a clever piece of malicious software, or malware, that infected computers involved with monitoring and controlling industrial machinery and infrastructure. Known by the acronym SCADA (supervisory control and data acquisition), these computer systems are industrial-grade versions of the Arduinos discussed in chapter 4. At Natanz, some six thousand centrifuges were being used to enrich uranium to bomb-grade purity. Security experts believe Stuxnet, carried in on a USB thumb drive, infected and took over the SCADA systems controlling the plant’s equipment. Working stealthily to knock the centrifuges off balance even as it reported to operators that all was normal, Stuxnet is believed to have put over a thousand machines out of commission, significantly slowing the refinement process and the Iranian weapons program.40 The wide spread of Stuxnet was shocking.
The Singularity Is Near: When Humans Transcend Biology by Ray Kurzweil
additive manufacturing, AI winter, Alan Turing: On Computable Numbers, with an Application to the Entscheidungsproblem, Albert Einstein, anthropic principle, Any sufficiently advanced technology is indistinguishable from magic, artificial general intelligence, augmented reality, autonomous vehicles, Benoit Mandelbrot, Bill Joy: nanobots, bioinformatics, brain emulation, Brewster Kahle, Brownian motion, business intelligence, c2.com, call centre, carbon-based life, cellular automata, Claude Shannon: information theory, complexity theory, conceptual framework, Conway's Game of Life, cosmological constant, cosmological principle, cuban missile crisis, data acquisition, Dava Sobel, David Brooks, Dean Kamen, disintermediation, double helix, Douglas Hofstadter, en.wikipedia.org, epigenetics, factory automation, friendly AI, George Gilder, Gödel, Escher, Bach, informal economy, information retrieval, invention of the telephone, invention of the telescope, invention of writing, Isaac Newton, iterative process, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, John von Neumann, Kevin Kelly, Law of Accelerating Returns, life extension, linked data, Loebner Prize, Louis Pasteur, mandelbrot fractal, Mikhail Gorbachev, mouse model, Murray Gell-Mann, mutually assured destruction, natural language processing, Network effects, new economy, Norbert Wiener, oil shale / tar sands, optical character recognition, pattern recognition, phenotype, premature optimization, randomized controlled trial, Ray Kurzweil, remote working, reversible computing, Richard Feynman, Richard Feynman, Rodney Brooks, Search for Extraterrestrial Intelligence, semantic web, Silicon Valley, Singularitarianism, speech recognition, statistical model, stem cell, Stephen Hawking, Stewart Brand, strong AI, superintelligent machines, technological singularity, Ted Kaczynski, telepresence, The Coming Technological Singularity, transaction costs, Turing machine, Turing test, Vernor Vinge, Y2K, Yogi Berra
Poggio and E. Bizzi, "Generalization in Vision and Motor Control," Nature 431 (2004): 768–74. 77. R. Llinas and J. P. Welsh, "On the Cerebellum and Motor Learning," Current Opinion in Neurobiology 3.6 (December 1993): 958–65; E. Courchesne and G. Allen, "Prediction and Preparation, Fundamental Functions of the Cerebellum," Learning and Memory 4.1 (May–June 1997): 1-35; J. M. Bower, "Control of Sensory Data Acquisition," International Review of Neurobiology 41 (1997): 489–513. 78. J. Voogd and M. Glickstein, "The Anatomy of the Cerebellum," Trends in Neuroscience 21.9 (September 1998): 370–75; John C. Eccles, Masao Ito, and János Szentágothai, The Cerebellum as a Neuronal Machine (New York: Springer-Verlag, 1967); Masao Ito, The Cerebellum and Neural Control (New York: Raven, 1984). 79. N. Bernstein, The Coordination and Regulation of Movements (New York: Pergamon Press, 1967). 80.
Masterminds of Programming: Conversations With the Creators of Major Programming Languages by Federico Biancuzzi, Shane Warden
business intelligence, business process, cellular automata, cloud computing, complexity theory, conceptual framework, continuous integration, data acquisition, domain-specific language, Douglas Hofstadter, Fellow of the Royal Society, finite state, Firefox, follow your passion, Frank Gehry, general-purpose programming language, HyperCard, information retrieval, iterative process, John von Neumann, linear programming, loose coupling, Mars Rover, millennium bug, NP-complete, Paul Graham, performance metric, QWERTY keyboard, RAND corporation, randomized controlled trial, Renaissance Technologies, Silicon Valley, slashdot, software as a service, software patent, sorting algorithm, Steve Jobs, traveling salesman, Turing complete, type inference, Valgrind, Von Neumann architecture, web application
He is a member of the editorial board of the Journal of Universal Computer Science. James Gosling received a B.Sc. in computer science from the University of Calgary, Canada in 1977. He received a Ph.D. in computer science from Carnegie-Mellon University in 1983. The title of his thesis was “The Algebraic Manipulation of Constraints.” He is currently a VP & Fellow at Sun Microsystems. He has built satellite data acquisition systems, a multiprocessor version of Unix, several compilers, mail systems, and window managers. He has also built a WYSIWYG text editor, a constraint-based drawing editor and a text editor called Emacs for Unix systems. At Sun, his early activity was as lead engineer of the NeWS window system. He did the original design of the Java programming language and implemented its original compiler and virtual machine.
Analysis of Financial Time Series by Ruey S. Tsay
Asian financial crisis, asset allocation, Black-Scholes formula, Brownian motion, capital asset pricing model, compound rate of return, correlation coefficient, data acquisition, discrete time, frictionless, frictionless market, implied volatility, index arbitrage, Long Term Capital Management, market microstructure, martingale, p-value, pattern recognition, random walk, risk tolerance, short selling, statistical model, stochastic process, stochastic volatility, telemarketer, transaction costs, value at risk, volatility smile, Wiener process, yield curve
CHAPTER 5 High-Frequency Data Analysis and Market Microstructure High-frequency data are observations taken at fine time intervals. In finance, they often mean observations taken daily or at a finer time scale. These data have become available primarily due to advances in data acquisition and processing techniques, and they have attracted much attention because they are important in empirical study of market microstructure. The ultimate high-frequency data in finance are the transaction-by-transaction or trade-by-trade data in security markets. Here time is often measured in seconds. The Trades and Quotes (TAQ) database of the New York Stock Exchange (NYSE) contains all equity transactions reported on the Consolidated Tape from 1992 to present, which includes transactions on NYSE, AMEX, NASDAQ, and the regional exchanges.
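As a toy illustration of the trade-by-trade data described in this excerpt, inter-trade durations can be computed from a list of transaction timestamps measured in seconds. A minimal sketch, with invented sample values (not actual TAQ records):

```python
# Toy trade-by-trade records: (timestamp in seconds after the open, price).
# Values are invented for illustration; real TAQ records carry far more fields.
trades = [(0.0, 100.10), (1.5, 100.12), (1.8, 100.11), (4.2, 100.15)]

# Durations between consecutive trades, a basic quantity in empirical
# studies of market microstructure with high-frequency data.
durations = [round(t2 - t1, 6) for (t1, _), (t2, _) in zip(trades, trades[1:])]
print(durations)  # [1.5, 0.3, 2.4]
```

The irregular spacing of these durations is precisely what distinguishes high-frequency data from the equally spaced daily series treated earlier in the book.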
23andMe, 3D printing, additive manufacturing, Affordable Care Act / Obamacare, Airbnb, airport security, Albert Einstein, algorithmic trading, artificial general intelligence, augmented reality, autonomous vehicles, Baxter: Rethink Robotics, Bill Joy: nanobots, bitcoin, Black Swan, blockchain, borderless world, Brian Krebs, business process, butterfly effect, call centre, Chelsea Manning, cloud computing, cognitive dissonance, computer vision, connected car, corporate governance, crowdsourcing, cryptocurrency, data acquisition, data is the new oil, Dean Kamen, disintermediation, don't be evil, double helix, Downton Abbey, Edward Snowden, Elon Musk, Erik Brynjolfsson, Filter Bubble, Firefox, Flash crash, future of work, game design, Google Chrome, Google Earth, Google Glasses, Gordon Gekko, high net worth, High speed trading, hive mind, Howard Rheingold, hypertext link, illegal immigration, impulse control, industrial robot, Internet of things, Jaron Lanier, Jeff Bezos, job automation, John Harrison: Longitude, Jony Ive, Julian Assange, Kevin Kelly, Khan Academy, Kickstarter, knowledge worker, Kuwabatake Sanjuro: assassination market, Law of Accelerating Returns, Lean Startup, license plate recognition, litecoin, M-Pesa, Mark Zuckerberg, Marshall McLuhan, Menlo Park, mobile money, more computing power than Apollo, move fast and break things, Nate Silver, national security letter, natural language processing, obamacare, Occupy movement, Oculus Rift, offshore financial centre, optical character recognition, pattern recognition, personalized medicine, Peter H. 
Diamandis: Planetary Resources, Peter Thiel, pre–internet, RAND corporation, ransomware, Ray Kurzweil, refrigerator car, RFID, ride hailing / ride sharing, Rodney Brooks, Satoshi Nakamoto, Second Machine Age, security theater, self-driving car, shareholder value, Silicon Valley, Silicon Valley startup, Skype, smart cities, smart grid, smart meter, Snapchat, social graph, software as a service, speech recognition, stealth mode startup, Stephen Hawking, Steve Jobs, Steve Wozniak, strong AI, Stuxnet, supply-chain management, technological singularity, telepresence, telepresence robot, Tesla Model S, The Wisdom of Crowds, Tim Cook: Apple, trade route, uranium enrichment, Wall-E, Watson beat the top human players on Jeopardy!, Wave and Pay, We are Anonymous. We are Legion, web application, WikiLeaks, Y Combinator, zero day
For a sneak peek of what this dystopian world without computers and electricity looks like, one need only turn on the television for a taste of techno-Armageddon-cum-zombie apocalypse from shows such as The Walking Dead or from films like Planet of the Apes and Live Free or Die Hard. Hollywood machinations aside, our computer-based critical information infrastructures are increasingly under attack and deeply vulnerable to systemic failure, the impact from which could be truly catastrophic. Much of the world’s critical infrastructures utilize supervisory control and data acquisition (SCADA) systems to function. SCADA systems “automatically monitor and adjust switching, manufacturing, and other process control activities, based on digitized feedback data gathered by sensors.” These are specialized, and often older, computer systems that control physical pieces of equipment that do everything from routing trains along their tracks to distributing power throughout a city. Increasingly, SCADA systems are being connected to the broader Internet, with significant implications for our common security.
The Data Warehouse Toolkit: The Definitive Guide to Dimensional Modeling by Ralph Kimball, Margy Ross
Albert Einstein, business intelligence, business process, call centre, cloud computing, data acquisition, discrete time, inventory management, iterative process, job automation, knowledge worker, performance metric, platform as a service, side project, supply-chain management
Figure 1-9: Simplified illustration of the hub-and-spoke Corporate Information Factory architecture. [Figure: back-room source transactions flow through a data acquisition layer into the front-room Enterprise Data Warehouse (EDW), with normalized (3NF) tables, atomic data, user queryable; delivery then feeds BI data marts (dimensional, often summarized, often departmental) and BI applications.] With the CIF, data is extracted from the operational source systems and processed through an ETL system sometimes referred to as data acquisition. The atomic data that results from this processing lands in a 3NF database; this normalized, atomic repository is referred to as the Enterprise Data Warehouse (EDW) within the CIF architecture. Although the Kimball architecture enables optional normalization to support ETL processing, the normalized EDW is a mandatory construct in the CIF. Like the Kimball approach, the CIF advocates enterprise data coordination and integration.
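The hub-and-spoke flow this excerpt describes can be sketched in miniature: source rows are landed via a data-acquisition (ETL) step into normalized atomic tables, which are then summarized into a departmental data mart. All table and field names below are invented for illustration:

```python
# Hypothetical miniature of the CIF flow: source transactions -> ETL
# ("data acquisition") -> normalized atomic EDW tables -> summarized mart.
from collections import defaultdict

# Source transactions (invented): (order_id, customer, product, amount).
source = [(1, "acme", "widget", 30.0), (2, "acme", "gadget", 70.0),
          (3, "zorg", "widget", 30.0)]

# Data acquisition: land atomic data in normalized (3NF-style) tables,
# factoring customer names into a separate keyed table.
customers = {name: cid for cid, name in
             enumerate(sorted({row[1] for row in source}))}
orders = [(oid, customers[cust], prod, amt)
          for oid, cust, prod, amt in source]

# Downstream data mart: dimensional and summarized (revenue per customer).
mart = defaultdict(float)
for _, cid, _, amt in orders:
    mart[cid] += amt
```

The point of the sketch is the one-way flow: the atomic, normalized layer is populated first, and the summarized mart is always derived from it, never the reverse.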
Code Complete (Developer Best Practices) by Steve McConnell
Ada Lovelace, Albert Einstein, Buckminster Fuller, call centre, choice architecture, continuous integration, data acquisition, database schema, fault tolerance, Grace Hopper, haute cuisine, if you see hoof prints, think horses—not zebras, index card, inventory management, iterative process, late fees, loose coupling, Menlo Park, place-making, premature optimization, revision control, slashdot, sorting algorithm, statistical model, Tacoma Narrows Bridge, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Turing machine, web application
Program Design Program design includes the major strokes of the design for a single program, mainly the way in which a program is divided into classes. Some program designs make it difficult to write a high-performance system. Others make it hard not to. Cross-Reference For details on designing performance into a program, see the "Additional Resources" section at the end of this chapter. Consider the example of a real-world data-acquisition program for which the high-level design had identified measurement throughput as a key product attribute. Each measurement included time to make an electrical measurement, calibrate the value, scale the value, and convert it from sensor data units (such as millivolts) into engineering data units (such as degrees Celsius). In this case, without addressing the risk in the high-level design, the programmers would have found themselves trying to optimize the math to evaluate a 13th-order polynomial in software; that is, a polynomial with 14 terms, including variables raised to the 13th power.
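The cost of that 13th-order polynomial depends heavily on how it is evaluated: naive term-by-term evaluation recomputes every power of x, while Horner's rule needs only 13 multiplies and 13 adds per measurement. A minimal sketch, with placeholder coefficients rather than real calibration data:

```python
# Hypothetical calibration: convert a scaled sensor reading to engineering
# units via a 13th-order polynomial (14 terms, highest power first).
# Coefficients and readings below are placeholders, not real calibration data.

def horner(coeffs, x):
    """Evaluate the polynomial with Horner's rule: n multiplies, n adds."""
    result = 0.0
    for c in coeffs:
        result = result * x + c
    return result

def naive(coeffs, x):
    """Naive evaluation: recomputes x**k for every one of the terms."""
    n = len(coeffs) - 1
    return sum(c * x ** (n - i) for i, c in enumerate(coeffs))

coeffs = [0.1] * 13 + [25.0]   # degree 13, 14 terms
x = 842.0 / 1000.0             # sensor millivolts scaled to [0, 1]
assert abs(horner(coeffs, x) - naive(coeffs, x)) < 1e-9
```

Even so, McConnell's point stands: rearranging the arithmetic only shaves constants, whereas addressing measurement throughput in the high-level design can remove the 14-term evaluation from the critical path altogether.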