sensor fusion

10 results


pages: 472 words: 80,835

Life as a Passenger: How Driverless Cars Will Change the World by David Kerrigan

3D printing, Airbnb, airport security, Albert Einstein, autonomous vehicles, big-box store, Boeing 747, butterfly effect, call centre, car-free, Cesare Marchetti: Marchetti’s constant, Chris Urmson, commoditize, computer vision, congestion charging, connected car, DARPA: Urban Challenge, data science, deep learning, DeepMind, deskilling, disruptive innovation, Donald Shoup, driverless car, edge city, Elon Musk, en.wikipedia.org, fake news, Ford Model T, future of work, General Motors Futurama, hype cycle, invention of the wheel, Just-in-time delivery, Lewis Mumford, loss aversion, Lyft, Marchetti’s constant, Mars Rover, megacity, Menlo Park, Metcalfe’s law, Minecraft, Nash equilibrium, New Urbanism, QWERTY keyboard, Ralph Nader, RAND corporation, Ray Kurzweil, ride hailing / ride sharing, Rodney Brooks, Sam Peltzman, self-driving car, sensor fusion, Silicon Valley, Simon Kuznets, smart cities, Snapchat, Stanford marshmallow experiment, Steve Jobs, technological determinism, technoutopianism, TED Talk, the built environment, Thorstein Veblen, traffic fines, transit-oriented development, Travis Kalanick, trolley problem, Uber and Lyft, Uber for X, uber lyft, Unsafe at Any Speed, urban planning, urban sprawl, warehouse robotics, Yogi Berra, young professional, zero-sum game, Zipcar

Unlike, say, the steam engine, which was a relatively stand-alone system, driverless cars require radical developments in several fields - sensors, image recognition and Artificial Intelligence. These need to be combined to achieve a solution that can begin to be credible as a substitute for human control.

Sensor Fusion

As human drivers, we rely primarily on our eyes for inputs, while our brains analyse the situation and direct the muscles in our legs and arms to control the car via the steering wheel and pedals. For automation, sending the commands to the steering, braking and acceleration controls is technically very easy.

The hard part is “seeing” the world around it, interpreting it correctly and then proceeding safely. There is no one type of electronic sensor that provides all the inputs needed to drive as efficiently as the human eye, so most driverless cars rely on an approach of adding multiple types of sensor to vehicles and combining their inputs using a technique called Sensor Fusion. The following illustration shows a prototype driverless car and the position of the various sensors that contribute to the overall ability of the car to “see”.

[Illustration: a prototype driverless car and the placement of its sensors. Image courtesy Guilbert Gates/New York Times.[83]]

With their combination of sensors, these cars can also sense far more than humans can.
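To make the idea concrete, here is a minimal Python sketch, not Waymo's or any vendor's actual pipeline, of the simplest form of sensor fusion: several sensors report the same object with different error characteristics, and the fused estimate weights each report by how much that sensor is trusted. The Reading structure, sensor labels and confidence values are all hypothetical.

```python
# Minimal sketch of confidence-weighted sensor fusion (illustrative only).
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str        # e.g. "camera", "radar", "lidar" (hypothetical labels)
    distance_m: float  # this sensor's estimate of distance to the object
    confidence: float  # 0..1, assumed to be supplied by the sensor's driver

def fuse_distance(readings: list[Reading]) -> float:
    """Confidence-weighted average of per-sensor distance estimates."""
    total_weight = sum(r.confidence for r in readings)
    return sum(r.distance_m * r.confidence for r in readings) / total_weight

if __name__ == "__main__":
    readings = [
        Reading("camera", 41.0, 0.6),  # good classification, noisier range
        Reading("radar", 39.2, 0.8),   # accurate range, poor shape detail
        Reading("lidar", 39.6, 0.9),   # precise geometry at medium range
    ]
    print(f"fused distance: {fuse_distance(readings):.1f} m")
```

Production systems replace hand-set weights with statistical filters (Kalman filters and their variants) and fuse whole object tracks rather than single distance readings, but the weighting intuition is the same.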

All the sensors on these vehicles have been engineered and manufactured in-house by Waymo,[108] and include three types of LiDAR that operate at short, medium and long ranges. The upgraded system on the Pacifica features eight enhanced camera modules and an additional high-resolution forward-looking multi-sensor module designed to detect smaller objects like traffic cones at longer distances. Much like Apple carefully controls its customer experience because it builds both the software and the hardware for its product suite, Waymo can claim tighter integration between its sensor hardware, sensor fusion software, image recognition and other aspects of its self-driving system. Waymo also claims individual performance benefits in each of its new sensors, including vision cameras, radars and LiDAR, saying each provides better resolution, sensing distance and accuracy than the hardware it has been using on prior vehicles.[109]

Show Me the Money

Some idea of the scope of the change we’re talking about with the advent of driverless cars is evident from the investment that one of the world’s smartest companies has chosen to put into it.


pages: 202 words: 59,883

Age of Context: Mobile, Sensors, Data and the Future of Privacy by Robert Scoble, Shel Israel

Albert Einstein, Apple II, augmented reality, call centre, Chelsea Manning, cloud computing, connected car, driverless car, Edward Snowden, Edward Thorp, Elon Musk, factory automation, Filter Bubble, G4S, gamification, Google Earth, Google Glasses, Internet of things, job automation, John Markoff, Kickstarter, lifelogging, Marc Andreessen, Marc Benioff, Mars Rover, Menlo Park, Metcalfe’s law, New Urbanism, PageRank, pattern recognition, RFID, ride hailing / ride sharing, Robert Metcalfe, Salesforce, Saturday Night Live, self-driving car, sensor fusion, Silicon Valley, Skype, smart grid, social graph, speech recognition, Steve Jobs, Steve Wozniak, Steven Levy, Tesla Model S, Tim Cook: Apple, TSMC, ubercab, urban planning, Zipcar

An environmentally friendly company, Sensible Self, makes GreenGoose, cute little wireless stickers containing motion sensors that allow you to track anything that moves, from a pet or child to your phone, or even to check if your spouse left the toilet seat up. Melanie Martella, Executive Editor of Sensors magazine, introduced us to the concept of sensor fusion, a fast-emerging technology that takes data from disparate sources to come up with more accurate, complete and dependable data. Sensor fusion enables the same sense of depth that is available in 3D modeling, which is used for all modern design and construction, as well as the magic of special effects in movies. Sensors will understand if you are pilfering office supplies or engaging in a clandestine office affair.
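A small numerical illustration (ours, not the book's) of why fusing disparate sources yields more accurate and dependable data: combining two noisy measurements by inverse-variance weighting produces an estimate whose variance is lower than either input's. The Python below uses made-up numbers.

```python
# Inverse-variance weighting: the fused estimate is more certain than either input.
def fuse(x1, var1, x2, var2):
    """Return the inverse-variance weighted mean and its variance."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * x1 + w2 * x2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

# e.g. two different sensors estimating the same quantity with different noise
estimate, variance = fuse(x1=10.2, var1=0.5, x2=9.8, var2=0.3)
print(estimate, variance)  # variance (0.1875) is lower than both 0.5 and 0.3
```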


pages: 523 words: 61,179

Human + Machine: Reimagining Work in the Age of AI by Paul R. Daugherty, H. James Wilson

3D printing, AI winter, algorithmic management, algorithmic trading, AlphaGo, Amazon Mechanical Turk, Amazon Robotics, augmented reality, autonomous vehicles, blockchain, business process, call centre, carbon footprint, circular economy, cloud computing, computer vision, correlation does not imply causation, crowdsourcing, data science, deep learning, DeepMind, digital twin, disintermediation, Douglas Hofstadter, driverless car, en.wikipedia.org, Erik Brynjolfsson, fail fast, friendly AI, fulfillment center, future of work, Geoffrey Hinton, Hans Moravec, industrial robot, Internet of things, inventory management, iterative process, Jeff Bezos, job automation, job satisfaction, knowledge worker, Lyft, machine translation, Marc Benioff, natural language processing, Neal Stephenson, personalized medicine, precision agriculture, Ray Kurzweil, recommendation engine, RFID, ride hailing / ride sharing, risk tolerance, robotic process automation, Rodney Brooks, Salesforce, Second Machine Age, self-driving car, sensor fusion, sentiment analysis, Shoshana Zuboff, Silicon Valley, Snow Crash, software as a service, speech recognition, tacit knowledge, telepresence, telepresence robot, text mining, the scientific method, uber lyft, warehouse automation, warehouse robotics

In a pilot, the company observed an 8 percent productivity improvement in logistics tasks.a At Siemens, armies of spider-styled 3-D printed robots use AI to communicate and collaborate to build things in the company’s Princeton, New Jersey, lab. Each bot is equipped with vision sensors and laser scanners. In aggregate, they join forces to manufacture on the go.b At Inertia Switch, robotic intelligence and sensor fusion enable robot-human collaboration. The manufacturing firm uses Universal Robotics’ robots, which can learn tasks on the go and can flexibly move between tasks, making them handy helpers to humans on the factory floor.c a. Dave Gershgorn, “Hitachi Hires Artificially Intelligent Bosses for Their Warehouses,” Popular Science, September 8, 2015, www.popsci.com/hitachi-hires-artificial-intelligence-bosses-for-their-warehouses.

“What matters is companies that don’t continue to experiment or embrace failure eventually get in the position where the only thing they can do is make a Hail Mary bet at the end of their corporate existence. I don’t believe in bet-the-company bets.”5 Instead, Bezos firmly believes in the incredible power of experimentation. (For another example of experimentation in a retail setting, see the sidebar “Controlled Chaos.”)

Build-Measure-Learn

The technologies that power Amazon Go—computer vision, sensor fusion, and deep learning—are systems very much under development. Limitations include cameras that have a hard time tracking loose fruits and vegetables in a customer’s hands and difficulty recognizing a customer who pulls his hat low or puts on a scarf that obscures her face. These behaviors, inadvertently or on purpose, spoofed the system during the Amazon Go test run in Seattle.


pages: 385 words: 112,842

Arriving Today: From Factory to Front Door -- Why Everything Has Changed About How and What We Buy by Christopher Mims

air freight, Airbnb, Amazon Robotics, Amazon Web Services, Apollo 11, augmented reality, autonomous vehicles, big-box store, blue-collar work, Boeing 747, book scanning, business logic, business process, call centre, cloud computing, company town, coronavirus, cotton gin, COVID-19, creative destruction, data science, Dava Sobel, deep learning, dematerialisation, deskilling, digital twin, Donald Trump, easy for humans, difficult for computers, electronic logging device, Elon Musk, Frederick Winslow Taylor, fulfillment center, gentrification, gig economy, global pandemic, global supply chain, guest worker program, Hans Moravec, heat death of the universe, hive mind, Hyperloop, immigration reform, income inequality, independent contractor, industrial robot, interchangeable parts, intermodal, inventory management, Jacquard loom, Jeff Bezos, Jessica Bruder, job automation, John Maynard Keynes: Economic Possibilities for our Grandchildren, Joseph Schumpeter, Kaizen: continuous improvement, Kanban, Kiva Systems, level 1 cache, Lewis Mumford, lockdown, lone genius, Lyft, machine readable, Malacca Straits, Mark Zuckerberg, market bubble, minimum wage unemployment, Nomadland, Ocado, operation paperclip, Panamax, Pearl River Delta, planetary scale, pneumatic tube, polynesian navigation, post-Panamax, random stow, ride hailing / ride sharing, robot derives from the Czech word robota, meaning slave, Rodney Brooks, rubber-tired gantry crane, scientific management, self-driving car, sensor fusion, Shenzhen special economic zone, Shoshana Zuboff, Silicon Valley, six sigma, skunkworks, social distancing, South China Sea, special economic zone, spinning jenny, standardized shipping container, Steve Jobs, supply-chain management, surveillance capitalism, TED Talk, the scientific method, Tim Cook: Apple, Toyota Production System, traveling salesman, Turing test, two-sided market, Uber and Lyft, Uber for X, uber lyft, Upton Sinclair, vertical integration, warehouse automation, warehouse robotics, workplace surveillance

If you’ve ever played with an AR app on your phone—say, the Ikea app, which shows you what furniture would look like in your actual home, using your phone’s camera and screen—that’s accomplished largely through SLAM. TuSimple’s system relies on “sensor fusion,” as the last step of localizing the truck in space. In the continuum from data to knowledge, sensor fusion is the part of perception in which information from all of a machine’s—or our own—senses is collected and synchronized. It’s essentially the creation of an internal consensus reality. For the purposes of a self-driving truck, sensor fusion is the process by which data from all those different sensors—the IMUs, GPS, cameras, lidar, and radar—are brought together in a single virtual world that depicts the moment-to-moment reality around the truck.
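As a rough sketch of that “collected and synchronized” step, the toy Python buffer below keeps time-stamped readings per sensor and serves a snapshot of the latest reading from each at a common instant; the message formats, field names and sensor labels are illustrative, not TuSimple’s.

```python
# Toy synchronization buffer: one "moment" of the shared virtual world is the
# latest reading from each sensor stream at or before a chosen timestamp.
from bisect import bisect_right
from collections import defaultdict

class FusionBuffer:
    def __init__(self):
        # sensor name -> list of (timestamp, reading), assumed pushed in time order
        self._streams = defaultdict(list)

    def push(self, sensor: str, t: float, reading) -> None:
        self._streams[sensor].append((t, reading))

    def snapshot(self, t: float) -> dict:
        """Latest reading from each sensor at or before time t."""
        world = {}
        for sensor, stream in self._streams.items():
            times = [ts for ts, _ in stream]
            i = bisect_right(times, t)
            if i:
                world[sensor] = stream[i - 1][1]
        return world

buf = FusionBuffer()
buf.push("imu", 0.001, {"yaw_rate": 0.02})
buf.push("gps", 0.000, {"lat": 33.42, "lon": -111.94})
buf.push("lidar", 0.004, {"points": 120_000})
print(buf.snapshot(t=0.005))  # one synchronized view of all three streams
```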


pages: 561 words: 163,916

The History of the Future: Oculus, Facebook, and the Revolution That Swept Virtual Reality by Blake J. Harris

"World Economic Forum" Davos, 4chan, airport security, Anne Wojcicki, Apollo 11, Asian financial crisis, augmented reality, barriers to entry, Benchmark Capital, Bernie Sanders, bitcoin, call centre, Carl Icahn, company town, computer vision, cryptocurrency, data science, disruptive innovation, Donald Trump, drone strike, Elon Musk, fake news, financial independence, game design, Grace Hopper, hype cycle, illegal immigration, invisible hand, it's over 9,000, Ivan Sutherland, Jaron Lanier, Jony Ive, Kickstarter, Marc Andreessen, Mark Zuckerberg, Menlo Park, Minecraft, move fast and break things, Neal Stephenson, Network effects, Oculus Rift, off-the-grid, Peter Thiel, QR code, sensor fusion, Sheryl Sandberg, side project, Silicon Valley, SimCity, skunkworks, Skype, slashdot, Snapchat, Snow Crash, software patent, stealth mode startup, Steve Jobs, unpaid internship, white picket fence

In some cases that may slow down progress in the beginning, but that makes me feel like I can really move fast down the line and make the right choices.” “That makes sense to me.” “I have not yet worked with gyroscopes before. So the orientation tracking, it is all very new to me. But I am beginning to better understand what needs to be done; I started reading about sensor fusion.” Sensor fusion is, literally, the process of fusing together data from multiple sensors. Specifically, Antonov was referring to the need for a VR headset to track head movements and orientation in a three-dimensional space. Central to accomplishing this is something called an inertial measurement unit (IMU), which is an electronic device that detects the rate of acceleration (via accelerometers), the rate of angular velocity (via gyroscopes), and the intensity of a magnetic field (via magnetometers).
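A classic, minimal example of the kind of fusion Antonov was reading about is a complementary filter, sketched below in Python for a single tilt axis; real headset tracking fuses all three IMU sensor types in three dimensions and is considerably more involved, and the sample values here are invented.

```python
# Complementary filter for one tilt axis: blend fast-but-drifting gyro
# integration with slow-but-absolute accelerometer tilt.
import math

def complementary_filter(angle, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Return an updated pitch estimate (radians)."""
    gyro_angle = angle + gyro_rate * dt          # integrate angular velocity
    accel_angle = math.atan2(accel_x, accel_z)   # gravity gives an absolute tilt
    return alpha * gyro_angle + (1 - alpha) * accel_angle

angle = 0.0
for _ in range(250):  # e.g. 250 samples at 1 kHz = 0.25 s of (invented) data
    angle = complementary_filter(angle, gyro_rate=0.1,
                                 accel_x=0.05, accel_z=0.998, dt=0.001)
print(f"estimated pitch: {math.degrees(angle):.2f} degrees")
```

The gyroscope gives fast response but drifts when integrated; the accelerometer's gravity reading is noisy but absolute, so blending the two corrects the drift. Magnetometer data plays the same corrective role for yaw, where gravity offers no reference.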

He was always superpassionate about head-mounted displays and virtual reality. And he wanted something that actually allowed him to jack into the matrix for video games. PALMER LUCKEY The biggest change is that we’ve developed our own motion tracker sensor chips . . . [which] gives us better data, more samples to work with when we’re doing our sensor fusion, so we can get better tracking overall; and most importantly, because it’s running at 1000 Hz (instead of 250 Hz) and it’s four times faster, we can actually have less latency. Less time between when you make a motion and when it shows up on-screen. TESTED So right now you have, with head tracking, roll . . . but you don’t have depth yet.


pages: 404 words: 95,163

Amazon: How the World’s Most Relentless Retailer Will Continue to Revolutionize Commerce by Natalie Berg, Miya Knights

3D printing, Adam Neumann (WeWork), Airbnb, Amazon Robotics, Amazon Web Services, asset light, augmented reality, Bernie Sanders, big-box store, business intelligence, cloud computing, Colonization of Mars, commoditize, computer vision, connected car, deep learning, DeepMind, digital divide, Donald Trump, Doomsday Clock, driverless car, electronic shelf labels (ESLs), Elon Musk, fulfillment center, gig economy, independent contractor, Internet of things, inventory management, invisible hand, Jeff Bezos, Kiva Systems, market fragmentation, new economy, Ocado, pattern recognition, Ponzi scheme, pre–internet, QR code, race to the bottom, random stow, recommendation engine, remote working, Salesforce, sensor fusion, sharing economy, Skype, SoftBank, Steve Bannon, sunk-cost fallacy, supply-chain management, TaskRabbit, TechCrunch disrupt, TED Talk, trade route, underbanked, urban planning, vertical integration, warehouse automation, warehouse robotics, WeWork, white picket fence, work culture

Addresses the perennial headache that is online returns, while driving footfall to Kohl’s. We expect this to be rolled out internationally. (International: No)

2018, Amazon Go (Retail): First checkout-free store. Shoppers scan their Amazon app to enter. The high-tech convenience store uses a combination of computer vision, sensor fusion and deep learning to create a frictionless customer experience. (International: No)

2019 and beyond: Fashion or furniture stores would be a logical next step.

NOTE: Amazon Go officially opened its doors to the public in 2018. SOURCE: Amazon; author research as of June 2018.

However, it was Amazon’s rather ironic launch of physical bookstores in 2015 that marked a genuine shift in strategy, as this was the first time Amazon mimicked digital merchandising and pricing in a physical setting.

The use of a mobile app is the most friction-free way to ensure a smooth Amazon Go experience when a customer enters the store. The removal of any human interface from the most friction-filled process of any store-based shopping journey, ie checkout, affords the customer unprecedented speed and simplicity. Autonomous computing: AI-based computer vision, sensor fusion and deep learning technologies power Amazon Go’s Just Walk Out technology. Just Walk Out technology operates without manual intervention, eliminating the need for checkout staff or hardware. It also eliminates shrinkage, a major source of loss for traditional brick-and-mortar retailers. Customers are charged for whatever goods they walk out with, even if they try to hide the fact from the store’s extensive computer vision camera systems.


pages: 175 words: 54,755

Robot, Take the Wheel: The Road to Autonomous Cars and the Lost Art of Driving by Jason Torchinsky

autonomous vehicles, barriers to entry, call centre, commoditize, computer vision, connected car, DARPA: Urban Challenge, data science, driverless car, Elon Musk, en.wikipedia.org, interchangeable parts, job automation, Philippa Foot, ransomware, self-driving car, sensor fusion, side project, Tesla Model S, trolley problem, urban sprawl

Autonomous cars will likely mean that no trip you take in them is ever going to be completely private, and we shouldn’t even kid ourselves into thinking otherwise. Everyone will be watching everything, everywhere.

Communication and Combining Everything

All of these different forms of world sensing are combined to create an overall, composite image of the surrounding reality for the car. Some call this “sensor fusion,”³³ and it’s especially important because each method, individually, has some pretty significant flaws and limitations that could cause real problems in practice. Supplementing the sensor data is communication, both between cars on the road and from more centralized sources. Individual cars will communicate with others in their vicinity, a process known as vehicle-to-vehicle (or V2V) communication.
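The fusion piece itself can be motivated with a toy calculation (not from the book): if each sensor independently misses an obstacle some fraction of the time, the chance that every sensor misses it falls off quickly once their inputs are combined. The Python sketch below uses invented miss rates.

```python
# Why combining individually flawed sensors helps: the probability that ALL
# of them miss the same obstacle shrinks multiplicatively.
def prob_all_miss(miss_rates):
    p = 1.0
    for m in miss_rates:
        p *= m
    return p

# Hypothetical per-sensor miss rates in, say, heavy rain
sensors = {"camera": 0.30, "lidar": 0.15, "radar": 0.05}
print(f"P(all sensors miss) = {prob_all_miss(sensors.values()):.4f}")  # about 0.2%
```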


pages: 205 words: 20,452

Data Mining in Time Series Databases by Mark Last, Abraham Kandel, Horst Bunke

backpropagation, call centre, computer vision, discrete time, G4S, information retrieval, iterative process, NP-complete, p-value, pattern recognition, random walk, sensor fusion, speech recognition, web application

Interesting applications of the median concept have been demonstrated in dealing with 2D shapes [16, 33], binary feature maps [23], 3D rotation [9], geometric features (points, lines, or 3D frames) [32], brain models [12], anatomical structures [37], and facial images [31]. In this paper we discuss the adaptation of the median concept to the domain of strings. The median concept is useful in various contexts. It represents a fundamental quantity in statistics. In sensor fusion, multisensory measurements of some quantity are averaged to produce the best estimate. Averaging the results of several classifiers is used in multiple classifier systems in order to achieve more reliable classifications. The outline of the chapter is as follows. We first formally introduce the median string problem in Section 2 and provide some related theoretical results in Section 3.
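As a small numeric aside (not from the chapter), the robustness that makes the median attractive is easy to see with multisensor measurements in Python: one faulty sensor drags the mean far more than the median.

```python
# Median vs. mean as a "best estimate" from several sensors, one of them faulty.
from statistics import mean, median

measurements = [20.1, 19.8, 20.3, 20.0, 35.0]  # five readings of the same quantity
print(f"mean   = {mean(measurements):.2f}")    # pulled up by the faulty reading
print(f"median = {median(measurements):.2f}")  # stays close to the true value
```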


pages: 280 words: 74,559

Fully Automated Luxury Communism by Aaron Bastani

"Peter Beck" AND "Rocket Lab", Alan Greenspan, Anthropocene, autonomous vehicles, banking crisis, basic income, Berlin Wall, Bernie Sanders, Boston Dynamics, Bretton Woods, Brexit referendum, capital controls, capitalist realism, cashless society, central bank independence, collapse of Lehman Brothers, computer age, computer vision, CRISPR, David Ricardo: comparative advantage, decarbonisation, deep learning, dematerialisation, DIY culture, Donald Trump, double helix, driverless car, electricity market, Elon Musk, energy transition, Erik Brynjolfsson, fake news, financial independence, Francis Fukuyama: the end of history, future of work, Future Shock, G4S, general purpose technology, Geoffrey Hinton, Gregor Mendel, housing crisis, income inequality, industrial robot, Intergovernmental Panel on Climate Change (IPCC), Internet of things, Isaac Newton, James Watt: steam engine, Jeff Bezos, Jeremy Corbyn, Jevons paradox, job automation, John Markoff, John Maynard Keynes: technological unemployment, Joseph Schumpeter, Kevin Kelly, Kuiper Belt, land reform, Leo Hollis, liberal capitalism, low earth orbit, low interest rates, low skilled workers, M-Pesa, market fundamentalism, means of production, mobile money, more computing power than Apollo, new economy, off grid, pattern recognition, Peter H. Diamandis: Planetary Resources, post scarcity, post-work, price mechanism, price stability, private spaceflight, Productivity paradox, profit motive, race to the bottom, rewilding, RFID, rising living standards, Robert Solow, scientific management, Second Machine Age, self-driving car, sensor fusion, shareholder value, Silicon Valley, Simon Kuznets, Slavoj Žižek, SoftBank, stem cell, Stewart Brand, synthetic biology, technological determinism, technoutopianism, the built environment, the scientific method, The Wealth of Nations by Adam Smith, Thomas Malthus, transatlantic slave trade, Travis Kalanick, universal basic income, V2 rocket, Watson beat the top human players on Jeopardy!, We are as Gods, Whole Earth Catalog, working-age population

While the world economy may be much bigger now than it was in 1900, employing more people and enjoying far higher output per person, the lines of work nearly everyone performs – drivers, nurses, teachers and cashiers – aren’t particularly new.

Actually Existing Automation

In March 2017 Amazon launched its Amazon GO store in downtown Seattle. Using computer vision, deep learning algorithms, and sensor fusion to identify selected items, the company looked to build a near-fully automated store without cashiers. Here Amazon customers would be able to buy items simply by swiping in with a phone, choosing the things they wanted and swiping out to leave, their purchases automatically debited to their Amazon account.


pages: 253 words: 84,238

A Thousand Brains: A New Theory of Intelligence by Jeff Hawkins

AI winter, Albert Einstein, artificial general intelligence, carbon-based life, clean water, cloud computing, deep learning, different worldview, discovery of DNA, Doomsday Clock, double helix, en.wikipedia.org, estate planning, Geoffrey Hinton, Jeff Hawkins, PalmPilot, Search for Extraterrestrial Intelligence, self-driving car, sensor fusion, Silicon Valley, superintelligent machines, the scientific method, Thomas Kuhn: the structure of scientific revolutions, Turing machine, Turing test

Consequently, the input to the neocortex is not like a photograph. It is a highly distorted and incomplete quilt of image patches. Yet we are unaware of the distortions and missing pieces; our perception of the world is uniform and complete. The hierarchy of features theory can’t explain how this happens. This problem is called the binding problem or the sensor-fusion problem. More generally, the binding problem asks how inputs from different senses, which are scattered all over the neocortex with all sorts of distortions, are combined into the singular non-distorted perception we all experience.

• As I pointed out in Chapter 1, although some of the connections between regions of the neocortex appear hierarchical, like a step-by-step flowchart, the majority do not.


Autonomous Driving: How the Driverless Revolution Will Change the World by Andreas Herrmann, Walter Brenner, Rupert Stadler

Airbnb, Airbus A320, algorithmic bias, augmented reality, autonomous vehicles, blockchain, call centre, carbon footprint, clean tech, computer vision, conceptual framework, congestion pricing, connected car, crowdsourcing, cyber-physical system, DARPA: Urban Challenge, data acquisition, deep learning, demand response, digital map, disruptive innovation, driverless car, Elon Musk, fault tolerance, fear of failure, global supply chain, industrial cluster, intermodal, Internet of things, Jeff Bezos, John Zimmer (Lyft cofounder), Lyft, manufacturing employment, market fundamentalism, Mars Rover, Masdar, megacity, Pearl River Delta, peer-to-peer rental, precision agriculture, QWERTY keyboard, RAND corporation, ride hailing / ride sharing, self-driving car, sensor fusion, sharing economy, Silicon Valley, smart cities, smart grid, smart meter, Steve Jobs, Tesla Model S, Tim Cook: Apple, trolley problem, uber lyft, upwardly mobile, urban planning, Zipcar

However, it now seems clear that Google is primarily interested in collecting data on drivers and their vehicles, and that it sees the software as a new business model and does not want to produce cars. Early in 2016, Nvidia surprisingly announced its own computing platform for controlling autonomous vehicles. This platform has sufficient processing power to support deep learning, sensor fusion and surround vision, all of which are key elements for a self-driving car. It also announced that its PX2 would be used as a standard computer in the Roborace series for self-driving race cars. Nvidia has also already built autonomous test vehicles, which have only been driven on test routes to date.


pages: 590 words: 152,595

Army of None: Autonomous Weapons and the Future of War by Paul Scharre

"World Economic Forum" Davos, active measures, Air France Flight 447, air gap, algorithmic trading, AlphaGo, Apollo 13, artificial general intelligence, augmented reality, automated trading system, autonomous vehicles, basic income, Black Monday: stock market crash in 1987, brain emulation, Brian Krebs, cognitive bias, computer vision, cuban missile crisis, dark matter, DARPA: Urban Challenge, data science, deep learning, DeepMind, DevOps, Dr. Strangelove, drone strike, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, facts on the ground, fail fast, fault tolerance, Flash crash, Freestyle chess, friendly fire, Herman Kahn, IFF: identification friend or foe, ImageNet competition, information security, Internet of things, Jeff Hawkins, Johann Wolfgang von Goethe, John Markoff, Kevin Kelly, Korean Air Lines Flight 007, Loebner Prize, loose coupling, Mark Zuckerberg, military-industrial complex, moral hazard, move 37, mutually assured destruction, Nate Silver, Nick Bostrom, PalmPilot, paperclip maximiser, pattern recognition, Rodney Brooks, Rubik’s Cube, self-driving car, sensor fusion, South China Sea, speech recognition, Stanislav Petrov, Stephen Hawking, Steve Ballmer, Steve Wozniak, Strategic Defense Initiative, Stuxnet, superintelligent machines, Tesla Model S, The Signal and the Noise by Nate Silver, theory of mind, Turing test, Tyler Cowen, universal basic income, Valery Gerasimov, Wall-E, warehouse robotics, William Langewiesche, Y2K, zero day

Without a human in the loop as a final check, using this technology to do autonomous targeting today would be exceedingly dangerous. Neural networks with these vulnerabilities could be manipulated into avoiding enemy targets and attacking false ones. In the near term, the best chances for high-reliability target recognition lie with the kind of sensor fusion that DARPA’s CODE project envisions. By fusing together data from multiple angles and multiple types of sensors, computers could possibly distinguish between military targets and civilian objects or decoys with high reliability. Objects that are dual-use for military and civilian purposes, such as trucks, would be more difficult since determining whether they are lawful targets might depend on context.
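As a highly simplified sketch of what score-level fusion can look like (illustrative Python, not DARPA's CODE), independent per-sensor recognition probabilities can be combined in log-odds space, with the fused score required to clear a conservative threshold before an object is even flagged for human review; the sensor types, probabilities and threshold below are all hypothetical.

```python
# Naive-Bayes-style fusion of independent per-sensor recognition scores.
import math

def logit(p: float) -> float:
    return math.log(p / (1 - p))

def fuse_scores(probs, prior=0.5):
    """Combine independent per-sensor probabilities in log-odds space."""
    log_odds = logit(prior) + sum(logit(p) - logit(prior) for p in probs)
    return 1 / (1 + math.exp(-log_odds))

# Hypothetical outputs from three different sensor types viewing the same object
fused = fuse_scores([0.85, 0.70, 0.60])
print(f"fused confidence: {fused:.3f}")
print("flag for human review" if fused > 0.95 else "insufficient confidence")
```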


pages: 586 words: 186,548

Architects of Intelligence by Martin Ford

3D printing, agricultural Revolution, AI winter, algorithmic bias, Alignment Problem, AlphaGo, Apple II, artificial general intelligence, Asilomar, augmented reality, autonomous vehicles, backpropagation, barriers to entry, basic income, Baxter: Rethink Robotics, Bayesian statistics, Big Tech, bitcoin, Boeing 747, Boston Dynamics, business intelligence, business process, call centre, Cambridge Analytica, cloud computing, cognitive bias, Colonization of Mars, computer vision, Computing Machinery and Intelligence, correlation does not imply causation, CRISPR, crowdsourcing, DARPA: Urban Challenge, data science, deep learning, DeepMind, Demis Hassabis, deskilling, disruptive innovation, Donald Trump, Douglas Hofstadter, driverless car, Elon Musk, Erik Brynjolfsson, Ernest Rutherford, fake news, Fellow of the Royal Society, Flash crash, future of work, general purpose technology, Geoffrey Hinton, gig economy, Google X / Alphabet X, Gödel, Escher, Bach, Hans Moravec, Hans Rosling, hype cycle, ImageNet competition, income inequality, industrial research laboratory, industrial robot, information retrieval, job automation, John von Neumann, Large Hadron Collider, Law of Accelerating Returns, life extension, Loebner Prize, machine translation, Mark Zuckerberg, Mars Rover, means of production, Mitch Kapor, Mustafa Suleyman, natural language processing, new economy, Nick Bostrom, OpenAI, opioid epidemic / opioid crisis, optical character recognition, paperclip maximiser, pattern recognition, phenotype, Productivity paradox, radical life extension, Ray Kurzweil, recommendation engine, Robert Gordon, Rodney Brooks, Sam Altman, self-driving car, seminal paper, sensor fusion, sentiment analysis, Silicon Valley, smart cities, social intelligence, sparse data, speech recognition, statistical model, stealth mode startup, stem cell, Stephen Hawking, Steve Jobs, Steve Wozniak, Steven Pinker, strong AI, superintelligent machines, synthetic biology, systems thinking, Ted Kaczynski, TED Talk, The Rise and Fall of American Growth, theory of mind, Thomas Bayes, Travis Kalanick, Turing test, universal basic income, Wall-E, Watson beat the top human players on Jeopardy!, women in the workforce, working-age population, workplace surveillance , zero-sum game, Zipcar

I found machine perception particularly fascinating: the challenges of how to build learning algorithms for distributed and multi-agent systems, how to use machine learning algorithms to make sense of environments, and how to develop algorithms that could autonomously build models of those environments, in particular, environments where you had no prior knowledge of them and had to learn as you go—like the surface of Mars. A lot of what I was working on had applications not just in machine vision, but in distributed networks and sensing and sensor fusion. We were building these neural network-based algorithms that were using a combination of Bayesian networks of the kind Judea Pearl had pioneered, Kalman filters and other estimation and prediction algorithms to essentially build machine learning systems. The idea was that these systems could learn from the environment, learn from input data from a wide range of sources of varying quality, and make predictions.
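For readers who have not met the estimation algorithms mentioned here, the snippet below is a minimal one-dimensional Kalman filter in Python, included purely as an illustration; the systems described in the interview were far more elaborate, multi-agent, and coupled with Bayesian networks.

```python
# Minimal 1-D Kalman filter: repeatedly predict, then correct with a noisy measurement.
def kalman_step(x, P, z, Q=0.01, R=0.5):
    """One predict/update cycle for a roughly constant quantity.
    x: state estimate, P: estimate variance, z: new noisy measurement,
    Q: process noise, R: measurement noise."""
    P = P + Q                 # predict: uncertainty grows a little
    K = P / (P + R)           # Kalman gain: how much to trust the measurement
    x = x + K * (z - x)       # update the estimate toward the measurement
    P = (1 - K) * P           # update (shrink) the uncertainty
    return x, P

x, P = 0.0, 1.0
for z in [1.2, 0.9, 1.1, 1.05, 0.95]:   # noisy readings of a true value near 1.0
    x, P = kalman_step(x, P, z)
print(f"estimate = {x:.3f}, variance = {P:.3f}")  # converges toward ~1.0
```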