search engine result page

17 results


pages: 298 words: 43,745

Understanding Sponsored Search: Core Elements of Keyword Advertising by Jim Jansen

AltaVista, barriers to entry, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, correlation does not imply causation, en.wikipedia.org, first-price auction, information asymmetry, information retrieval, intangible asset, inventory management, life extension, linear programming, longitudinal study, megacity, Nash equilibrium, Network effects, PageRank, place-making, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social web, software as a service, stochastic process, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, Vilfredo Pareto, yield management

Examples of queries: child labor law; Capital One; buy table clocks.

Level Two classifications:
(I, D) Directed: specific question
(I, U) Undirected: tell me everything about a topic
(I, L) List: list of candidates
(I, F) Find: locate where some real-world service or product can be obtained
(I, A) Advice: advice, ideas, suggestions, instructions
(N, T) Navigation to transactional: the URL the user wants is a transactional site
(N, I) Navigation to informational: the URL the user wants is an informational site
(T, O) Obtain: obtain a specific resource or object
(T, D) Download: find a file to download
(T, R) Results Page: obtain a resource that one can print, save, or read from the search engine results page (the user enters a query with the expectation that the ‘answer’ will be on the search engine results page and not require browsing to another Web site)
(T, I) Interact: interact with a program or resource on another Web site

Examples of queries: registering domain name; singers in the 1980s; things to do in Hollywood CA; PVC suit for overweight men; what to serve with roast pork tenderloin; Match.com; Yahoo.com; music lyrics; MP3 downloads.
Example of a query: buy table clock.

Level Three classifications:
(I, D, C) Closed: deals with one topic; a question with one unambiguous answer
(I, D, O) Open: deals with two or more topics
(T, O, O) Online: the resource will be obtained online
(T, O, F) Off-line: the resource will be obtained off-line and may require additional actions by the user
(T, D, F) Free: the downloadable file is free
(T, D, N) Not Free: the downloadable file is not necessarily free
(T, R, L) Links: the resource appears in the title, summary, or URL of one or more of the results on the search engine results page (as an example, a user enters the title of a conference paper to locate the page numbers, which usually appear in one or more of the results)
(T, R, O) Other: the resource does not appear in one of the results but somewhere else on the search engine results page

Examples of queries: nine Supreme Court justices; the excretory system of arachnids; airline seat map; Full Metal Alchemist wallpapers; free online games; “Family Guy” episode download.

(As another example, a user enters a query term to check its spelling, with no interest in the results listing.)

Why does one choose particular terms for one’s ad copy (other than that historically they seem to work)? Why is this bidding process the way it is? Why look at certain metrics and not at others? In other words, why do we do what we do? This book is for those who are curious about such things. I am one of these people, as I am curious why things are the way they are with sponsored search, which is the process in which advertisers pay to have their advertisements appear on a search engine results page in response to a query from a searcher. Sponsored search is also commonly known as keyword advertising. There is also a great practical benefit in understanding the theoretical foundations of what one does. Doing something just because it worked in the past will generally produce good results – until the context, situation, or environment changes. Then, all the historical data and results are of little value.


Mastering Structured Data on the Semantic Web: From HTML5 Microdata to Linked Open Data by Leslie Sikos

AGPL, Amazon Web Services, bioinformatics, business process, cloud computing, create, read, update, delete, Debian, en.wikipedia.org, fault tolerance, Firefox, Google Chrome, Google Earth, information retrieval, Infrastructure as a Service, Internet of things, linked data, natural language processing, openstreetmap, optical character recognition, platform as a service, search engine result page, semantic web, Silicon Valley, social graph, software as a service, SPARQL, text mining, Watson beat the top human players on Jeopardy!, web application, wikimedia commons

The code length of the direct input is limited to 1,500 characters. The tool provides a preview of Google’s representation of your site on Search Engine Result Pages (SERPs), along with the extracted structured data as item, type, and properties (see Figure 4-5, “Triples extracted by the Google Structured Data Testing Tool”). The tool can identify incomplete triples and provides a short explanation if any mandatory property is missing. The Google Structured Data Testing Tool also indicates properties that are not part of the vocabulary used for the object. Note: Google does not use machine-readable metadata annotations on Search Engine Result Pages if certain properties are missing for a particular object type. For example, an hCard description will be used by Google only if you provide not only the name but also at least two of the following three properties: organization, location, or role, while code validity can be achieved even if you omit them.
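The hCard eligibility rule just described (a name plus at least two of organization, location, or role) can be sketched as a simple check. This is an illustrative sketch only; the function name and dictionary shape are my own assumptions, not part of any Google API:

```python
def hcard_eligible(hcard):
    """Sketch of the rule above: an hCard is used on SERPs only if it
    has a name plus at least two of organization, location, or role."""
    if not hcard.get("name"):
        return False
    extras = ("organization", "location", "role")
    present = sum(1 for key in extras if hcard.get(key))
    return present >= 2

# A card with only a name is valid hCard markup but is not used on SERPs:
print(hcard_eligible({"name": "Jane Doe"}))  # False
print(hcard_eligible({"name": "Jane Doe",
                      "role": "Editor",
                      "organization": "Acme"}))  # True
```

Note that this mirrors the distinction in the excerpt: code validity and SERP eligibility are separate questions.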

The Service-Oriented Architecture (SOA) infrastructure over Big Data makes it possible to update Big Data in real time. Data can be automatically classified, relationships associated, and new relationships found, so that data can be collected and integrated without worrying about schemas and data descriptions, yet providing a data description. Big Data applications on the Semantic Web include, but are not limited to, next-generation Search Engine Result Pages, social media graphs, analysis of natural language content, publishing factual data about massive world events, interlinking BBC’s online content, as well as high-performance data storage and processing. Big Semantic Data: Big Data on the Semantic Web Big Data refers to any high-volume, high-velocity datasets too large and complex to process using traditional data processing tools, applications, and database systems.

One of the promising approaches to address the issues associated with Big Data is to implement Semantic Web technologies in order to build systems that can efficiently handle Big Data and evolve with the growing data processing needs.
Google Knowledge Graph and Knowledge Vault
One of the best-known Big Data applications on the Semantic Web is the Google Knowledge Graph, which was introduced in 2012. The Google Knowledge Graph is a semantic knowledge base to enhance traditional Search Engine Result Pages (SERPs) with semantic search information gathered from a wide variety of sources. The data sources used by the Knowledge Graph include pages indexed by Google, objects on Google Maps, public data sources such as Wikipedia, LOD datasets such as DBpedia, the CIA World Factbook, and the FDA datasets, as well as subject-specific resources such as Weather Underground and World Bank, for meteorological information and economic statistics, respectively.


Google AdWords by Anastasia Holdren

bounce rate, Network effects, search engine result page

Clickthrough rate (CTR): The number of clicks on an ad divided by the number of times the ad is displayed (impressions), expressed as a percentage.
Conversion: When a click on an ad results in a desirable behavior, like an online purchase.
Impression: The appearance of an ad on a search results page, whether someone clicks on it or not.
Keyword: A word or phrase that can trigger an ad on a search engine results page. A keyword is not an AdWord.
Search engine results page (SERP): The page presented to a searcher after typing a search query into a search engine.
Search query: The word or phrase a searcher types into a search engine.

Conventions Used in This Book
The following typographical conventions are used in this book: Italic indicates new terms, URLs, email addresses, filenames, and file extensions. Constant width is used for program listings, as well as within paragraphs to refer to program elements such as variable or function names, databases, data types, environment variables, statements, and keywords.
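The CTR definition above is a straightforward ratio; a minimal sketch (the function name is my own, not AdWords terminology):

```python
def clickthrough_rate(clicks, impressions):
    """CTR: clicks divided by impressions, expressed as a percentage."""
    if impressions == 0:
        return 0.0  # no impressions means no measurable CTR
    return 100.0 * clicks / impressions

# 40 clicks on 2,000 impressions is a 2% CTR:
print(clickthrough_rate(40, 2000))  # 2.0
```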

That being said, a minimal commitment to managing an AdWords account can reap benefits for all advertisers. Like anything, the more you put in, the more you’ll get back. Where Ads Can Appear Let’s review how Google organizes search results. Google displays ads on search results pages, abbreviated SERPs (search engine results pages). To keep things interesting, SERPs change search-by-search; Google frequently changes formatting, colors, and layout of results pages. Figure 1-1 shows an example SERP. Figure 1-1. Google search engine results page (SERP) A person went to Google.com and searched for wildlife removal nashville. This is called the search query. The search results page is created on the fly, displaying the most relevant results Google can identify for that particular searcher at that moment in time. A single website can appear in multiple sections of the search results page, via AdWords or AdWords Express, Google Places, and the organic results.


The Art of SEO by Eric Enge, Stephan Spencer, Jessie Stricchiola, Rand Fishkin

AltaVista, barriers to entry, bounce rate, Build a better mousetrap, business intelligence, cloud computing, dark matter, en.wikipedia.org, Firefox, Google Chrome, Google Earth, hypertext link, index card, information retrieval, Internet Archive, Law of Accelerating Returns, linked data, mass immigration, Metcalfe’s law, Network effects, optical character recognition, PageRank, performance metric, risk tolerance, search engine result page, self-driving car, sentiment analysis, social web, sorting algorithm, speech recognition, Steven Levy, text mining, web application, wikimedia commons

User Centric eye-tracking results
In 2010, Enquiro investigated the impact of Google Instant on search usage and attention (http://ask.enquiro.com/2010/eye-tracking-google-instant/), noting that for queries typed in their study:
Percent of query typed decreased in 25% of the tasks, with no change in the others
Query length increased in 17% of the tasks, with no change in the others
Time to click decreased in 33% of the tasks and increased in 8% of the tasks
These studies are a vivid reminder of how important search engine results pages (SERPs) really are. And as the eye-tracking research demonstrates, “rich” or “personalized” search, as it evolves, will alter users’ search patterns even more: there will be more items on the page for them to focus on, and more ways for them to remember and access the search listings. Search marketers need to be prepared for this as well. The Search, plus Your World announcement in January of 2012 will also have a profound impact on the results, but no studies on that impact have been done as of February 2012.

The search engines continuously invest in improving their ability to better process the content of web pages. For example, advances in image and video search have enabled search engines to inch closer to human-like understanding, a topic that will be explored more in Vertical Search Engines. Understanding Search Engine Results In the search marketing field, the pages the engines return to fulfill a query are referred to as search engine results pages (SERPs). Each engine returns results in a slightly different format and will include vertical search results (specific content targeted to a query based on certain triggers in the query, which we’ll illustrate shortly). Understanding the Layout of Search Results Pages Each unique section represents a snippet of information provided by the engines. Here are the definitions of what each piece is meant to provide: Vertical navigation Each engine offers the option to search different verticals, such as images, news, video, or maps.

Sample SWOT chart data for Business X The preceding analysis suggests where Business X can get some quick wins for its site, as well as where the priorities are. It also forms a great starting point for a long-term strategy and tactical maneuvers. This example is simplistic, but it illustrates how instructive a fleshed out SWOT can be. It does require you to have analyzed your site, the main competitor(s), the keywords, and the search engine results pages (SERPs). Get SMART Every company is unique, so naturally their challenges are unique. Even a second SEO initiative within the same company will not be the same as the first initiative. Your initial SEO efforts will have changed things, creating new benchmarks, new expectations, and different objectives. Thus, each SEO project is a new endeavor. One way to start a new project is to set SMART objectives.


pages: 189 words: 52,741

Lifestyle Entrepreneur: Live Your Dreams, Ignite Your Passions and Run Your Business From Anywhere in the World by Jesse Krieger

Airbnb, always be closing, bounce rate, call centre, carbon footprint, commoditize, Deng Xiaoping, different worldview, financial independence, follow your passion, income inequality, iterative process, Ralph Waldo Emerson, search engine result page, Skype, software as a service, South China Sea, Steve Jobs

Net Effect: Emphasizing Apex Keywords in ads, meta content and visible site content increases your site’s Quality Score, which drives down cost-per-click and increases positioning in the search engine results pages. NON-PAID TRAFFIC (SEO) Search Engine Optimization While CPC advertising will drive immediate traffic to your website, you can’t increase your traffic exponentially with CPC alone. Every click through from your ad costs you money, whether the person who clicked on it ends up buying from you or not. It’s a good short-term solution, but in the long term, you want to have a daily stream of traffic to your site that costs you nothing. You can do this by optimizing your website so that it appears in the main section of the search engine results page (SERP), the organic search results that are based on the content of your site. You increase your site’s rankings in the SERPs by proving to the search engines that your website is relevant with respect to the search terms that people use when looking for your products or services.

LOOKING UNDER THE HOOD OF THE OPERATIONS MODEL
Site Traffic: Enticing Potential Customers to Your Online Storefront
Note in the previous narrative that the first thing Jonathan did was some initial research by entering search words and phrases into a search engine. It may seem obvious, but it’s important to see that USBsuperstore.com made it a very high priority to be visible in all of the major search engines for a wide range of industry- and product-specific search terms. This is why the company was visible in the organic search engine results pages (SERPs) for different terms or words. This process, known as search engine optimization (SEO), weaves keywords and search phrases relevant to the business through the site content and coding of the website in specific ways to boost visibility for those terms. To further support their legitimacy as a vendor, they also advertised through Google’s AdWords platform using many of the same search terms.

This would undermine the authenticity and objectivity of the search results displayed. It’s important to remember that there is no direct correlation between ad dollars spent and Quality Score achieved. However, one of the determinants of your Quality Score is the amount of traffic to your site, and its consistency. This is where AdWords and CPC advertising can help, since it jumpstarts traffic by running AdWords ads on the right-hand side of the search engine results page, which, over time, becomes a factor in determining your Quality Score, assuming you have simultaneously executed an SEO strategy. You want to use a multifaceted approach to getting a high Quality Score so that it cements your standing in the search engines. Structuring your website like this optimizes your SEO & PPC efforts: 1. Configuring your website so that you have on-site SEO that is based on your Apex Keywords. 2.


pages: 597 words: 119,204

Website Optimization by Andrew B. King

AltaVista, bounce rate, don't be evil, en.wikipedia.org, Firefox, In Cold Blood by Truman Capote, information retrieval, iterative process, Kickstarter, medical malpractice, Network effects, performance metric, search engine result page, second-price auction, second-price sealed-bid, semantic web, Silicon Valley, slashdot, social graph, Steve Jobs, web application

It explains how the effects of search engine marketing and web performance tuning can be quantified and optimized.
Part I
Part I, Search Engine Marketing Optimization, explains how to use best-practice techniques to boost the search engine visibility and conversion rate of your website. It consists of the following:
Introduction to Part I, Search Engine Marketing Optimization: Briefly explores the behavior of users as they interact with search engine result pages, and how tight, front-loaded headlines and summaries help to improve natural referrals and PPC conversions for search result pages.
Chapter 1, Natural Search Engine Optimization: Shows best practices for improving organic search engine visibility, as well as how to overcome the most common barriers to high rankings. The chapter demonstrates the 10 steps you can take to achieve high rankings, including writing optimized title tags, targeting specific keywords, and building popular inbound links.

The chapters that follow will show you the best (and worst) practices for each of these topics, complete with case studies showing the techniques in action. First, let's explore how people behave when using search engines. Search Behavior To best optimize your website, it is important to understand how users interact with search engines. As you'll discover, searchers are selective in their viewing of search engine result pages (SERPs) and spend little time on each page browsing results. SERP Viewing Statistics Good search result placement is important because most searchers (92.5%) don't explore beyond the third page of search results. [1] In fact, about three-fourths don't look past the first page of results. [2], [3] About 90% of searchers view only the first or second page of results. Yes, even on the Web the three most important elements of success are location, location, location.

But the abuse of some meta tags and other SEO shenanigans such as invisible text and keyword stuffing [7] has forced search engines to weigh external factors, such as inbound links, more heavily than on-site optimization. So, how do you achieve your SEO dream now? Today's successful SEO strategy requires a long-term approach with frequent postings, targeted content, and regular online promotion designed to boost inbound links—in short, a combination of off-site and on-site SEO.
The Benefits of SEO
A high ranking in search engine result pages (SERPs) has become a business necessity. High rankings have been found to increase the following characteristics:
Site traffic (see Figure 1-1) [8]
Perceived relevance [9]
Trust [10]
Conversion (to sales) rates
Figure 1-1 shows the effects of higher rankings. A Oneupweb study found that soon after the average client site appeared in the top 10 search result pages, both conversion rates and new traffic increased significantly.


pages: 407 words: 103,501

The Digital Divide: Arguments for and Against Facebook, Google, Texting, and the Age of Social Networking by Mark Bauerlein

Amazon Mechanical Turk, Andrew Keen, business cycle, centre right, citizen journalism, collaborative editing, computer age, computer vision, corporate governance, crowdsourcing, David Brooks, disintermediation, Frederick Winslow Taylor, Howard Rheingold, invention of movable type, invention of the steam engine, invention of the telephone, Jaron Lanier, Jeff Bezos, jimmy wales, Kevin Kelly, knowledge worker, late fees, Mark Zuckerberg, Marshall McLuhan, means of production, meta analysis, meta-analysis, moral panic, Network effects, new economy, Nicholas Carr, PageRank, peer-to-peer, pets.com, Results Only Work Environment, Saturday Night Live, search engine result page, semantic web, Silicon Valley, slashdot, social graph, social web, software as a service, speech recognition, Steve Jobs, Stewart Brand, technology bubble, Ted Nelson, The Wisdom of Crowds, Thorstein Veblen, web application

See Results-only work environment RSS Rushkoff, Douglas Rutgers University Safire, William Salon.com Samsung San Francisco Chronicle Sanger, Larry SAP Sartre, Jean-Paul Saturated self Saturday Night Live (television series) Scalable Fabric Scarcity Scherf, Steve Schindler’s List (film) Schmidt, Eric Science (journal) Scientific Learning Scientific management Scion Scope Screenagers Scrutiny The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture (Battelle) Search Engine Results Page (SERP) Search engines. See also specific sites Search strategies Search Wiki Sebald, W. G. Seigenthaler, John, Sr. Seinfeld (television series) Self-portraits Self-publishing Self-realization Self-sufficiency Semantic priming Semiotic democracy Sensory deprivation September 11, 2001 terrorist attacks Serialization SERP. See Search Engine Results Page Sesame Street Shakesville (blog) Shirky, Clay Shoutcast Simulations Six Degrees: The Science of a Connected Age (Watts) Skrenta, Rich “Skyful of Lies” and Black Swans (Gowing) Slashdot Slatalla, Michelle Slate (magazine) Sleeper Curve Slingbox SLVR phone Small world experiment Social currency Social graph Social media Social mind The Social Network (film) Social networking sites.

They might modify their first attempt, but they typically stick with the same general approach rather than try something genuinely new. For example, one user tested the Mayo Clinic’s site to find out how to ensure that a child with a milk allergy would receive sufficient calcium. The user attempted multiple queries with the keyword “calcium,” but never tried the words “milk” or “allergy.” Also, users are incredibly bad at interpreting SERP listings (SERP = Search Engine Results Page). Admittedly, SERPs from Google and the other main search engines typically offer unreadable gibberish rather than decent website descriptions. Still, an expert searcher (like me) can look at the listings and predict a destination site’s quality much better than average users. When it comes to search, users face three problems:
• Inability to retarget queries to a different search strategy
• Inability to understand the search results and properly evaluate each destination site’s likely usefulness
• Inability to sort through the SERP’s polluted mass of poor results, whether from blogs or from heavily SEO-optimized sites that are insufficiently specific to really address the user’s problem
Given these difficulties, many users are at the search engine’s mercy and mainly click the top links—a behavior we might call Google Gullibility.


Designing Search: UX Strategies for Ecommerce Success by Greg Nudelman, Pabini Gabriel-Petit

access to a mobile phone, Albert Einstein, AltaVista, augmented reality, barriers to entry, business intelligence, call centre, crowdsourcing, information retrieval, Internet of things, performance metric, QR code, recommendation engine, RFID, search engine result page, semantic web, Silicon Valley, social graph, social web, speech recognition, text mining, the map is not the territory, The Wisdom of Crowds, web application, zero-sum game, Zipcar

Sometimes, we can avoid such initial imprecision. A wider box invites more words. So does experience with large (and growing) bodies of content. In fact, the average number of keywords per query in Web search has moved from 1-2 to 2-3 in recent years. However, advance query specification is difficult, because we don’t know the size or structure of the index. When we are searching without a map, even a traditional search engine results page (SERP) tells us a lot. The nature of snippets and the number of results lets us judge how (and how much) to narrow. Even better, faceted search puts metadata on the map, and even Sort provides a way to limit what we see. Expand The opposite pattern is rare. Expand is uncommon, partly because users often cast a wide net to begin, and partly because it’s a harder problem. Of course, users can try a broader query.

The goals of search engine optimization and marketing parallel those of usability. Both aim to ensure a Web site’s audience is able to view and understand the content. Both go hand-in-hand with marketing as a whole, aiming to connect with users and ensure they tell their friends, sharing via a hyperlink. In turn, these goals influence the various signals search engines are designed to follow, and improve a Web site’s rankings on search engine results pages. The search engines are perhaps the most important users your site can connect with. Although many organizations painstakingly test for usability in all Web browsers and for all demographics, almost as many fall short by neglecting to design and test for search engine robots. SEO and Usability There are many topics to discuss in search engine optimization and marketing that speak to usability professionals.


pages: 315 words: 85,791

Technical Blogging: Turn Your Expertise Into a Remarkable Online Presence by Antonio Cangiano

23andMe, Albert Einstein, anti-pattern, bitcoin, bounce rate, cloud computing, en.wikipedia.org, John Gruber, Kickstarter, Lean Startup, Network effects, revision control, Ruby on Rails, search engine result page, slashdot, software as a service, web application

For example, a Canadian freelancer who’s interested in promoting his or her web design services locally may opt for a ccTLD (country code top-level domain) .ca domain name. Doing so also makes sense from an SEO standpoint, as search engines like Google absolutely love country-specific domain names when showing local search results. Tip 6: If your target audience is local to a country other than the US, favor ccTLDs. In the example of the Canadian site, all things being equal, the .ca will beat out the .com in SERP (search engine results page) positioning on Google.ca. (Hosting your blog through a Canadian host would also help the cause, eh?) Keyword- or Brand-Based? Search engines love keyword-based domain names and generally give them an unfair advantage in the result pages. Having the one or two main keywords you are targeting placed within your domain name will boost your blog’s ranks on Google and Bing (as well as on other search engines that, quite frankly, very few people use).

If a given page on your blog links to 10 external sites and 2 internal pages, the PageRank you pass to the internal pages will be greatly reduced due to the presence of the external links. You can think of it as 1/12 each, instead of the half each you’d have if you weren’t linking to external sites—even if the real math is actually more complex than that. There are good reasons to link to other sites in your posts—as we’ll see in future chapters—including showing Google that your site is natural and not artificially built in order to score high in the SERP (again, search engine result pages). Nevertheless, think twice before deciding if you want to have a site-wide blogroll. The alternatives of not having one or placing a list of links on a single Links page of your site are both better options in most cases. Pages Menu When you click Pages, you’ll see a sample page. Delete it. Pages are different from posts. They don’t appear in your RSS feed, don’t have categories, and can have specific templates applied to them to make them look and behave differently from other pages.
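The link-equity intuition above (1/12 per link versus 1/2 per link) can be sketched numerically. As the author notes, the real math is more complex; this deliberately naive model, with a function name of my own invention, only illustrates the even-split idea:

```python
def pagerank_share(internal_links, external_links):
    """Naive model: a page's outgoing PageRank is split evenly across
    all outbound links, internal and external alike, so each internal
    page receives 1 / (total outbound links) of the passed equity."""
    total = internal_links + external_links
    if total == 0:
        return 0.0  # nothing to pass if the page has no outbound links
    return 1.0 / total

# 2 internal + 10 external links: each internal page gets 1/12
print(pagerank_share(2, 10))
# With no external links, each of the 2 internal pages gets half
print(pagerank_share(2, 0))  # 0.5
```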


pages: 125 words: 28,222

Growth Hacking Techniques, Disruptive Technology - How 40 Companies Made It BIG – Online Growth Hacker Marketing Strategy by Robert Peters

Airbnb, bounce rate, business climate, citizen journalism, crowdsourcing, digital map, Google Glasses, Jeff Bezos, Lean Startup, Menlo Park, Network effects, new economy, pull request, revision control, ride hailing / ride sharing, search engine result page, sharing economy, Skype, TaskRabbit, turn-by-turn navigation, ubercab

SEO (Search Engine Optimization): Search engine optimizations are augmentations or enhancements to websites that ensure that the page is as visible as possible to search engines and thus will appear at the top of a results page.
SEM (Search Engine Marketing): Any marketing practice that is geared toward improving a site’s search engine visibility, including search engine optimization or paying for placement on a search engine results page.
Social Commerce: The term social commerce describes online retail and marketing strategies that incorporate established social networks and peer-to-peer communications as a driver of sales.
Virality: Any piece of information online, whether it is an article, image, or video, that is widely shared and circulates rapidly is said to have “gone viral” and thus possesses “virality.”


pages: 193 words: 36,189

Html5 Boilerplate Web Development by Divya Manian

en.wikipedia.org, Firefox, Google Chrome, node package manager, pull request, Ruby on Rails, search engine result page

To prevent this, you can add X-Robots-Tag HTTP header tags by appending and uncommenting the following code snippet in the .htaccess file on the staging server:

# ------------------------------------------------------------
# Disable URL indexing by crawlers (FOR DEVELOPMENT/STAGE)
# ------------------------------------------------------------
# Avoid search engines (Google, Yahoo, etc.) indexing the website's content
# http://yoast.com/prevent-site-being-indexed/
# http://code.google.com/web/controlcrawlindex/docs/robots_meta_tag.html
# Matt Cutts (from Google Webmaster Central) on this topic:
# http://www.youtube.com/watch?v=KBdEwpRQRD0
# IMPORTANT: serving this header is recommended only for
# development/stage websites (or for live websites that don't
# want to be indexed). This will avoid the website being indexed
# in SERPs (search engine result pages). This is a better approach
# than using robots.txt to disallow the SE robots from crawling
# your website, because disallowing the robots doesn't exactly
# mean that your website won't get indexed (read the links above).
# <IfModule mod_headers.c>
#   Header set X-Robots-Tag "noindex, nofollow, noarchive"
#   <FilesMatch "\.(doc|pdf|png|jpe?g|gif)$">
#     Header set X-Robots-Tag "noindex, noarchive, nosnippet"
#   </FilesMatch>
# </IfModule>

Trailing slash redirects: Search engines consider the folder URLs http://example.com/foo and http://example.com/foo/ to be two different URLs and as such would consider the content to be duplicates of each other.
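The effect of the header above can be verified by inspecting a response's headers. A minimal sketch, assuming the headers have already been fetched into a plain dict (the function name is my own):

```python
def is_blocked_from_index(headers):
    """Return True if an X-Robots-Tag response header carries a
    noindex directive, as set by the .htaccess snippet above."""
    tag = headers.get("X-Robots-Tag", "")
    directives = {d.strip().lower() for d in tag.split(",")}
    return "noindex" in directives

# Headers as served by the staging-server snippet:
print(is_blocked_from_index(
    {"X-Robots-Tag": "noindex, nofollow, noarchive"}))  # True
# A server that sends no X-Robots-Tag header is indexable:
print(is_blocked_from_index({}))  # False
```

Checking the served header rather than the .htaccess file confirms that mod_headers is actually enabled and the directive is reaching crawlers.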


pages: 320 words: 87,853

The Black Box Society: The Secret Algorithms That Control Money and Information by Frank Pasquale

Affordable Care Act / Obamacare, algorithmic trading, Amazon Mechanical Turk, American Legislative Exchange Council, asset-backed security, Atul Gawande, bank run, barriers to entry, basic income, Berlin Wall, Bernie Madoff, Black Swan, bonus culture, Brian Krebs, business cycle, call centre, Capital in the Twenty-First Century by Thomas Piketty, Chelsea Manning, Chuck Templeton: OpenTable:, cloud computing, collateralized debt obligation, computerized markets, corporate governance, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, cryptocurrency, Debian, don't be evil, drone strike, Edward Snowden, en.wikipedia.org, Fall of the Berlin Wall, Filter Bubble, financial innovation, financial thriller, fixed income, Flash crash, full employment, Goldman Sachs: Vampire Squid, Google Earth, Hernando de Soto, High speed trading, hiring and firing, housing crisis, informal economy, information asymmetry, information retrieval, interest rate swap, Internet of things, invisible hand, Jaron Lanier, Jeff Bezos, job automation, Julian Assange, Kevin Kelly, knowledge worker, Kodak vs Instagram, kremlinology, late fees, London Interbank Offered Rate, London Whale, Marc Andreessen, Mark Zuckerberg, mobile money, moral hazard, new economy, Nicholas Carr, offshore financial centre, PageRank, pattern recognition, Philip Mirowski, precariat, profit maximization, profit motive, quantitative easing, race to the bottom, recommendation engine, regulatory arbitrage, risk-adjusted returns, Satyajit Das, search engine result page, shareholder value, Silicon Valley, Snapchat, social intelligence, Spread Networks laid a new fibre optics cable between New York and Chicago, statistical arbitrage, statistical model, Steven Levy, the scientific method, too big to fail, transaction costs, two-sided market, universal basic income, Upton Sinclair, value at risk, WikiLeaks, zero-sum game

The original PageRank patent, open for all to see, clandestinely accumulated a thick crust of tweaks and adjustments intended to combat web baddies: the “link farms” (sites that link to other sites only to goose their Google rankings), the “splogs” (spam blogs, which farm links in the more dynamic weblog format), and the “content farms” (which rapidly and clumsily aggregate content based on trending Google searches, so as to appear at the top of search engine result pages, or SERPs). Beneath the façade of sleek interfaces and neatly ordered results, guerrilla war simmers between the search engineers and the spammers.39 The war with legitimate content providers is just as real, if colder. Search engine optimizers parse speeches from Google the way Kremlinologists used to pore over the communiqués of Soviet premiers, looking for ways to improve their showing without provoking the “Google Death Penalty” that de-indexes sites caught gaming the system.
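The mechanic the passage describes — links acting as votes, and farms inflating them — can be illustrated with a minimal power-iteration sketch of PageRank (an illustration only; the example graph and the damping factor of 0.85 are assumptions, not from the book):

```python
# Minimal PageRank power iteration: rank flows along links, so a "link
# farm" pointing many pages at a target inflates the target's score.
def pagerank(links, damping=0.85, iters=100):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for outs in links.values() for p in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in pages}
        for page, outs in links.items():
            if outs:
                share = damping * rank[page] / len(outs)
                for target in outs:
                    new[target] += share
            else:  # dangling page: redistribute its rank evenly
                for target in pages:
                    new[target] += damping * rank[page] / n
        rank = new
    return rank

# A "farm" of pages all linking at one target pushes it to the top,
# even though no independent site endorses it.
links = {
    "target": [],
    "honest": ["target"],
    "farm1": ["target"], "farm2": ["target"], "farm3": ["target"],
}
ranks = pagerank(links)
```

Google's countermeasures amount to discounting or zeroing the farm pages' votes before this computation runs.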

Mark Patterson, “Additional Online Search Comments,” Antitrust and Competition Policy Blog, May 23, 2012, http://lawprofessors.typepad.com/antitrustprof_blog/2012/05/additional-online-search-comments-by-mark-patterson.html. Of course, I can’t verify Foundem’s claims independently—the black box nature of search algorithms makes that impossible. But even if one thinks Google’s rationale for Foundem’s exclusion is more plausible—i.e., that other offerings were far better than Foundem’s—we cannot simply take Google’s word for it. 91. Sometimes the results would be presented like a search engine results page. In other situations, doctors were presented alphabetically, but with stars indicating a “quality rating,” much like the starred ratings restaurants receive from reviewers. Health insurer Wellpoint even hired a restaurant rating firm to help it. Note that Google has also gotten into this business, buying Zagat. 92. Frank Pasquale, “Grand Bargains for Big Data.” 93. As the 10th Circuit said in Jefferson Cty.


pages: 245 words: 68,420

Content Everywhere: Strategy and Structure for Future-Ready Content by Sara Wachter-Boettcher

crowdsourcing, John Gruber, Kickstarter, linked data, search engine result page, semantic web, Silicon Valley

In 2012, for example, Google released several updates designed to cut down on web-spam—many of them designed to emphasize content quality and stop over-ranking sites that engage in keyword-stuffing (repeating the same terms over and over just to game the system) and questionable linking schemes. While lots of SEO companies out there will still sell clients on chasing the white whale of “rankings,” today’s world of personalized search results, social results, and constantly refreshing news means that it’s near impossible to know where you “rank” on the search engine results page for any given term. Instead, you want to consider whether your content is adequately visible to—and understood by—search engines. The good news is, by this point, you should have a lot of the work already under control. Once you’ve turned your content into meaningful chunks, stripped the non-semantic code from your database, and created markup that tells machines what a piece of information is, then the search-engine ’bots will have a much easier time making sense of it, too.


pages: 380 words: 109,724

Don't Be Evil: How Big Tech Betrayed Its Founding Principles--And All of US by Rana Foroohar

"side hustle", accounting loophole / creative accounting, Airbnb, AltaVista, autonomous vehicles, banking crisis, barriers to entry, Bernie Madoff, Bernie Sanders, bitcoin, book scanning, Brewster Kahle, Burning Man, call centre, cashless society, cleantech, cloud computing, cognitive dissonance, Colonization of Mars, computer age, corporate governance, creative destruction, Credit Default Swap, cryptocurrency, data is the new oil, death of newspapers, Deng Xiaoping, disintermediation, don't be evil, Donald Trump, drone strike, Edward Snowden, Elon Musk, en.wikipedia.org, Erik Brynjolfsson, Etonian, Filter Bubble, future of work, game design, gig economy, global supply chain, Gordon Gekko, greed is good, income inequality, informal economy, information asymmetry, intangible asset, Internet Archive, Internet of things, invisible hand, Jaron Lanier, Jeff Bezos, job automation, job satisfaction, Kenneth Rogoff, life extension, light touch regulation, Lyft, Mark Zuckerberg, Marshall McLuhan, Martin Wolf, Menlo Park, move fast and break things, move fast and break things, Network effects, new economy, offshore financial centre, PageRank, patent troll, paypal mafia, Peter Thiel, pets.com, price discrimination, profit maximization, race to the bottom, recommendation engine, ride hailing / ride sharing, Robert Bork, Sand Hill Road, search engine result page, self-driving car, shareholder value, sharing economy, Shoshana Zuboff, Silicon Valley, Silicon Valley startup, smart cities, Snapchat, South China Sea, sovereign wealth fund, Steve Jobs, Steven Levy, subscription business, supply-chain management, TaskRabbit, Telecommunications Act of 1996, The Chicago School, the new new thing, Tim Cook: Apple, too big to fail, Travis Kalanick, trickle-down economics, Uber and Lyft, Uber for X, uber lyft, Upton Sinclair, WikiLeaks, zero-sum game

., fn. 12. 10. Leaked FTC document, 26. 11. Ibid. 12. Rana Foroohar, “Google Versus Orrin Hatch,” Financial Times, September 3, 2018. Varian quote from The Wall Street Journal, ibid. 13. Nitasha Tiku, “How Google Influences the Conversation in Washington,” Wired, March 13, 2019. 14. Author interview with Walker in January 2019. 15. Madeline Jacobson, “How Far Down the Search Engine Results Page Will Most People Go?” Leverage Marketing, 2015. 16. For general information about antitrust lawsuits, see Wikipedia, s.v. “United States v. Terminal R.R. Ass’n,” last modified May 7, 2019, https://en.wikipedia.org/wiki/United_States_v._Terminal_R.R._Ass%27n. 17. “United States v. Reading Co.,” https://casetext.com/case/united-states-v-reading-co. 18. Charles Francis Adams Jr., Railroads: Their Origins and Problems (1878). 19.


pages: 481 words: 121,669

The Invisible Web: Uncovering Information Sources Search Engines Can't See by Gary Price, Chris Sherman, Danny Sullivan

AltaVista, American Society of Civil Engineers: Report Card, bioinformatics, Brewster Kahle, business intelligence, dark matter, Donald Davies, Douglas Engelbart, Douglas Engelbart, full text search, HyperCard, hypertext link, information retrieval, Internet Archive, joint-stock company, knowledge worker, natural language processing, pre–internet, profit motive, publish or perish, search engine result page, side project, Silicon Valley, speech recognition, stealth mode startup, Ted Nelson, Vannevar Bush, web application

Crawlers can fetch any page that can be displayed in a Web browser, regardless of whether it’s a static page stored on a server or generated dynamically. A good example of this type of Invisible Web site is Compaq’s experimental SpeechBot search engine, which indexes audio and video content using speech recognition, and converts the streaming media files to viewable text (http://www.speechbot.com). Somewhat ironically, one could make a good argument that most search engine result pages are themselves Invisible Web content, since they generate dynamic pages on the fly in response to user search terms. Dynamically generated pages pose a challenge for crawlers. Dynamic pages are created by a script, a computer program that selects from various options to assemble a customized page. Until the script is actually run, a crawler has no way of knowing what it will actually do.
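The point about scripts assembling pages on the fly can be sketched in a few lines (a toy illustration, not from the book; the product data and query handling are invented). The page below does not exist anywhere until a query arrives, which is exactly why a crawler cannot enumerate such pages in advance:

```python
# Toy dynamic page generator: the HTML is assembled per request,
# so there is no fixed set of URLs for a crawler to fetch.
PRODUCTS = {"clock": "Buy table clocks - $25", "lamp": "Desk lamp - $40"}

def render_results_page(query: str) -> str:
    """Build a results page on the fly for a user's search term."""
    hits = [desc for name, desc in PRODUCTS.items() if query in name]
    items = "".join(f"<li>{h}</li>" for h in hits) or "<li>No results</li>"
    return (f"<html><body><h1>Results for '{query}'</h1>"
            f"<ul>{items}</ul></body></html>")

page = render_results_page("clock")
```

Every distinct query yields a distinct page, so the space of possible pages is effectively unbounded from the crawler's point of view.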


Seeking SRE: Conversations About Running Production Systems at Scale by David N. Blank-Edelman

Affordable Care Act / Obamacare, algorithmic trading, Amazon Web Services, bounce rate, business continuity plan, business process, cloud computing, cognitive bias, cognitive dissonance, commoditize, continuous integration, crowdsourcing, dark matter, database schema, Debian, defense in depth, DevOps, domain-specific language, en.wikipedia.org, fault tolerance, fear of failure, friendly fire, game design, Grace Hopper, information retrieval, Infrastructure as a Service, Internet of things, invisible hand, iterative process, Kubernetes, loose coupling, Lyft, Marc Andreessen, microservices, minimum viable product, MVC pattern, performance metric, platform as a service, pull request, RAND corporation, remote working, Richard Feynman, risk tolerance, Ruby on Rails, search engine result page, self-driving car, sentiment analysis, Silicon Valley, single page application, Snapchat, software as a service, software is eating the world, source of truth, the scientific method, Toyota Production System, web application, WebSocket, zero day

One of their first questions would be something like, “Who are our customers? And why is getting the response in 10 seconds important for them?” Although these questions come primarily from the business perspective, the information they reveal can change the game dramatically. What if this service is for an “information retrieval” development team whose purpose is to address the necessity of content validation on the search engine results page, to make sure that the new index serves only live links? And what if we download a page with a million links on it? Now we can see the conflict between the priorities in the SLA and those of the service’s purpose. The SLA stated that the response time is crucial, but the service is intended to verify data, with accuracy as the most vital aspect of the service for the end user. We therefore need to adjust the project requirements to meet business necessities.


pages: 1,535 words: 337,071

Networks, Crowds, and Markets: Reasoning About a Highly Connected World by David Easley, Jon Kleinberg

Albert Einstein, AltaVista, clean water, conceptual framework, Daniel Kahneman / Amos Tversky, Douglas Hofstadter, Erdős number, experimental subject, first-price auction, fudge factor, George Akerlof, Gerard Salton, Gerard Salton, Gödel, Escher, Bach, incomplete markets, information asymmetry, information retrieval, John Nash: game theory, Kenneth Arrow, longitudinal study, market clearing, market microstructure, moral hazard, Nash equilibrium, Network effects, Pareto efficiency, Paul Erdős, planetary scale, prediction markets, price anchoring, price mechanism, prisoner's dilemma, random walk, recommendation engine, Richard Thaler, Ronald Coase, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, Simon Singh, slashdot, social web, Steve Jobs, stochastic process, Ted Nelson, The Market for Lemons, The Wisdom of Crowds, trade route, transaction costs, ultimatum game, Vannevar Bush, Vickrey auction, Vilfredo Pareto, Yogi Berra, zero-sum game

From our perspective, it’s also a very nice blend of ideas that have come up earlier in this book: it creates markets out of the information-seeking behavior of hundreds of millions of people traversing the Web; and we will see shortly that it has surprisingly deep connections to the kinds of auctions and matching markets that we discussed in Chapters 9 and 10. Keyword-based ads show up on search engine results pages alongside the unpaid (“organic” or “algorithmic”) results. Figure 15.1 shows an example of how this currently looks on Google for the query “Keuka Lake,” one of the Finger Lakes in upstate New York. The algorithmic results generated by the search engine’s internal ranking procedure are on the left, while the paid results (in this case for real estate and vacation rentals) are ordered on the right.
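The connection to second-price auctions can be made concrete with a small generalized second-price (GSP) sketch, the pricing rule the chapter goes on to analyze for keyword ads (a simplified illustration with invented bids; real systems also weight bids by quality scores):

```python
# Generalized second-price (GSP) sketch: advertisers are ranked by bid,
# and each slot winner pays the bid of the advertiser ranked just below.
def gsp_allocate(bids, num_slots):
    """bids: dict advertiser -> bid per click. Returns a list of
    (advertiser, price_paid_per_click) per slot, top slot first."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    results = []
    for i in range(min(num_slots, len(ranked))):
        advertiser, _ = ranked[i]
        # Price = next-highest bid, or 0 if no one bids below.
        price = ranked[i + 1][1] if i + 1 < len(ranked) else 0.0
        results.append((advertiser, price))
    return results

# Invented bids for the "Keuka Lake" example's real-estate advertisers.
bids = {"realty_a": 4.00, "rentals_b": 3.00, "boats_c": 1.50}
slots = gsp_allocate(bids, 2)
```

With one slot and truthful bids, this reduces to the Vickrey second-price auction discussed earlier in the book; with multiple slots the incentive properties differ, which is part of what makes the market interesting.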