AltaVista, barriers to entry, Black Swan, bounce rate, business intelligence, butterfly effect, call centre, Claude Shannon: information theory, complexity theory, correlation does not imply causation, en.wikipedia.org, first-price auction, information retrieval, inventory management, life extension, linear programming, megacity, Nash equilibrium, Network effects, PageRank, place-making, price mechanism, psychological pricing, random walk, Schrödinger's Cat, sealed-bid auction, search engine result page, second-price auction, second-price sealed-bid, sentiment analysis, social web, software as a service, stochastic process, telemarketer, the market place, The Present Situation in Quantum Mechanics, the scientific method, The Wisdom of Crowds, Vickrey auction, yield management
Examples of queries: Child labor law; Capital one; Buy table clocks

Level Two
- (I, D) Directed: specific question
- (I, U) Undirected: tell me everything about a topic
- (I, L) List: list of candidates
- (I, F) Find: locate where some real-world service or product can be obtained
- (I, A) Advice: advice, ideas, suggestions, instructions
- (N, T) Navigation to transactional: the URL the user wants is a transactional site
- (N, I) Navigation to informational: the URL the user wants is an informational site
- (T, O) Obtain: obtain a specific resource or object
- (T, D) Download: find a file to download
- (T, R) Results page: obtain a resource that one can print, save, or read from the search engine results page
- (T, I) Interact: interact with a program/resource on another Web site

Example queries: Registering domain name; Singers in the 1980s; Things to do in Hollywood ca; PVC suit for overweight men; What to serve with roast pork tenderloin; Match.com; Yahoo.com; Music lyrics; Mp3 downloads

(The user enters a query with the expectation that the answer will be on the search engine results page and will not require browsing to another Web site.)
Buy table clock

Level Three
- (I, D, C) Closed: deals with one topic; a question with one, unambiguous answer
- (I, D, O) Open: deals with two or more topics
- (T, O, O) Online: the resource will be obtained online
- (T, O, F) Off-line: the resource will be obtained off-line and may require additional actions by the user
- (T, D, F) Free: the downloadable file is free
- (T, D, N) Not free: the downloadable file is not necessarily free
- (T, R, L) Links: the resource appears in the title, summary, or URL of one or more of the results on the search engine results page
- (T, R, O) Other: the resource does not appear in one of the results but somewhere else on the search engine results page

Example queries: Nine supreme court justices; The excretory system of arachnids; airline seat map; full metal alchemist wallpapers; Free online games; "Family Guy" episode download

(As an example, a user enters the title of a conference paper to locate the page numbers, which usually appear in one or more of the results.) (As another example, a user enters a query term to check its spelling, with no interest in the results listing.)
Why does one choose particular terms for one’s ad copy (other than that historically they seem to work)? Why is this bidding process the way it is? Why look at certain metrics and not at others? In other words, why do we do what we do? This book is for those who are curious about such things. I am one of these people, as I am curious why things are the way they are with sponsored search, which is the process in which advertisers pay to have their advertisements appear on a search engine results page in response to a query from a searcher. Sponsored search is also commonly known as keyword advertising. There is also a great practical benefit in understanding the theoretical foundations of what one does. Doing something just because it worked in the past will generally produce good results – until the context, situation, or environment changes. Then, all the historical data and results are of little value.
Airbnb, bounce rate, call centre, carbon footprint, Deng Xiaoping, financial independence, follow your passion, income inequality, iterative process, Ralph Waldo Emerson, search engine result page, Skype, software as a service, South China Sea, Steve Jobs
Net Effect: Emphasizing Apex Keywords in ads, meta content, and visible site content increases your site’s Quality Score, which drives down cost-per-click and improves positioning in the search engine results pages. NON-PAID TRAFFIC (SEO) Search Engine Optimization While CPC advertising will drive immediate traffic to your website, you can’t increase your traffic exponentially with CPC alone. Every click-through from your ad costs you money, whether the person who clicked on it ends up buying from you or not. It’s a good short-term solution, but in the long term, you want to have a daily stream of traffic to your site that costs you nothing. You can do this by optimizing your website so that it appears in the main section of the search engine results page (SERP), the organic search results that are based on the content of your site. You increase your site’s rankings in SERPs by proving to the search engines that your website is relevant with respect to the search terms that people use when looking for your products or services.
LOOKING UNDER THE HOOD OF THE OPERATIONS MODEL Site Traffic: Enticing Potential Customers to Your Online Storefront Note in the previous narrative that the first thing Jonathan did was some initial research by entering search words and phrases into a search engine. It may seem obvious, but it’s important to see that USBsuperstore.com made it a very high priority to be visible in all of the major search engines for a wide range of industry- and product-specific search terms. This is why the company was visible in the organic search engine results page (SERP) for different terms or words. This process is known as search engine optimization, or SEO. It is a process whereby keywords and search phrases relevant to the business are woven through the site content and coding of the website in specific ways to boost visibility for those terms. To further support their legitimacy as a vendor, they also advertised through Google’s AdWords platform using many of the same search terms.
This would undermine the authenticity and objectivity of the search results displayed. It’s important to remember that there is no direct correlation between ad dollars spent and Quality Score achieved. However, one of the determinants of your Quality Score is the amount of traffic to your site, and its consistency. This is where AdWords and CPC advertising can help, since it jumpstarts traffic by running AdWords ads on the right-hand side of the search engine results page, which, over time, becomes a factor in determining your Quality Score, assuming you have simultaneously executed an SEO strategy. You want to use a multifaceted approach to getting a high Quality Score so that it cements your standing in the search engines. Structuring your website like this optimizes your SEO and PPC efforts: 1. Configuring your website so that you have on-site SEO that is based on your Apex Keywords. 2.
Website Optimization by Andrew B. King
AltaVista, bounce rate, don't be evil, en.wikipedia.org, Firefox, In Cold Blood by Truman Capote, information retrieval, iterative process, medical malpractice, Network effects, performance metric, search engine result page, second-price auction, second-price sealed-bid, semantic web, Silicon Valley, slashdot, social graph, Steve Jobs, web application
It explains how the effects of search engine marketing and web performance tuning can be quantified and optimized. Part I Part I, Search Engine Marketing Optimization, explains how to use best-practice techniques to boost the search engine visibility and conversion rate of your website. It consists of the following: Introduction to Part I, Search Engine Marketing Optimization Briefly explores the behavior of users as they interact with search engine result pages, and how tight, front-loaded headlines and summaries help to improve natural referrals and PPC conversions for search result pages. Chapter 1, Natural Search Engine Optimization Shows best practices for improving organic search engine visibility, as well as how to overcome the most common barriers to high rankings. The chapter demonstrates the 10 steps you can take to achieve high rankings, including writing optimized title tags, targeting specific keywords, and building popular inbound links.
The chapters that follow will show you the best (and worst) practices for each of these topics, complete with case studies showing the techniques in action. First, let's explore how people behave when using search engines. Search Behavior To best optimize your website, it is important to understand how users interact with search engines. As you'll discover, searchers are selective in their viewing of search engine result pages (SERPs) and spend little time on each page browsing results. SERP Viewing Statistics Good search result placement is important because most searchers (92.5%) don't explore beyond the third page of search results. In fact, about three-fourths don't look past the first page of results. About 90% of searchers view only the first or second page of results. Yes, even on the Web the three most important elements of success are location, location, location.
But the abuse of some meta tags and other SEO shenanigans such as invisible text and keyword stuffing have forced search engines to weigh external factors, such as inbound links, more heavily than on-site optimization. So, how do you achieve your SEO dream now? Today's successful SEO strategy requires a long-term approach with frequent postings, targeted content, and regular online promotion designed to boost inbound links—in short, a combination of off-site and on-site SEO. The Benefits of SEO A high ranking in search engine result pages (SERPs) has become a business necessity. High rankings have been found to increase the following characteristics:
- Site traffic (see Figure 1-1)
- Perceived relevance
- Trust
- Conversion (to sales) rates
Figure 1-1 shows the effects of higher rankings. A Oneupweb study found that soon after the average client site appeared in the top 10 search result pages, both conversion rates and new traffic increased significantly.
Amazon Mechanical Turk, Andrew Keen, centre right, citizen journalism, collaborative editing, computer age, computer vision, corporate governance, crowdsourcing, David Brooks, disintermediation, Frederick Winslow Taylor, Howard Rheingold, invention of movable type, invention of the steam engine, invention of the telephone, Jaron Lanier, Jeff Bezos, jimmy wales, Kevin Kelly, knowledge worker, late fees, Mark Zuckerberg, Marshall McLuhan, means of production, meta analysis, meta-analysis, Network effects, new economy, Nicholas Carr, PageRank, pets.com, Results Only Work Environment, Saturday Night Live, search engine result page, semantic web, Silicon Valley, slashdot, social graph, social web, software as a service, speech recognition, Steve Jobs, Stewart Brand, technology bubble, Ted Nelson, The Wisdom of Crowds, Thorstein Veblen, web application
See Results-only work environment RSS Rushkoff, Douglas Rutgers University Safire, William Salon.com Samsung San Francisco Chronicle Sanger, Larry SAP Sartre, Jean-Paul Saturated self Saturday Night Live (television series) Scalable Fabric Scarcity Scherf, Steve Schindler’s List (film) Schmidt, Eric Science (journal) Scientific Learning Scientific management Scion Scope Screenagers Scrutiny The Search: How Google and Its Rivals Rewrote the Rules of Business and Transformed Our Culture (Battelle) Search Engine Results Page (SERP) Search engines. See also specific sites Search strategies Search Wiki Sebald, W. G. Seigenthaler, John, Sr. Seinfeld (television series) Self-portraits Self-publishing Self-realization Self-sufficiency Semantic priming Semiotic democracy Sensory deprivation September 11, 2001 terrorist attacks Serialization SERP. See Search Engine Results Page Sesame Street Shakesville (blog) Shirky, Clay Shoutcast Simulations Six Degrees: The Science of a Connected Age (Watts) Skrenta, Rich “Skyful of Lies” and Black Swans (Gowing) Slashdot Slatalla, Michelle Slate (magazine) Sleeper Curve Slingbox SLVR phone Small world experiment Social currency Social graph Social media Social mind The Social Network (film) Social networking sites.
They might modify their first attempt, but they typically stick with the same general approach rather than try something genuinely new. For example, one user tested the Mayo Clinic’s site to find out how to ensure that a child with a milk allergy would receive sufficient calcium. The user attempted multiple queries with the keyword “calcium,” but never tried the words “milk” or “allergy.” Also, users are incredibly bad at interpreting SERP listings (SERP = Search Engine Results Page). Admittedly, SERPs from Google and the other main search engines typically offer unreadable gibberish rather than decent website descriptions. Still, an expert searcher (like me) can look at the listings and predict a destination site’s quality much better than average users. When it comes to search, users face three problems:
- Inability to retarget queries to a different search strategy
- Inability to understand the search results and properly evaluate each destination site’s likely usefulness
- Inability to sort through the SERP’s polluted mass of poor results, whether from blogs or from heavily SEO-optimized sites that are insufficiently specific to really address the user’s problem
Given these difficulties, many users are at the search engine’s mercy and mainly click the top links—a behavior we might call Google Gullibility.
Technical Blogging: Turn Your Expertise Into a Remarkable Online Presence by Antonio Cangiano
Albert Einstein, anti-pattern, bitcoin, bounce rate, cloud computing, en.wikipedia.org, John Gruber, Lean Startup, Network effects, revision control, search engine result page, slashdot, software as a service, web application
For example, a Canadian freelancer who’s interested in promoting his or her web design services locally may opt for a ccTLD (country code top-level domain) .ca domain name. Doing so also makes sense from an SEO standpoint, as search engines like Google strongly favor country-specific domain names when showing local search results. Tip 6 If your target audience is local to a country other than the US, favor ccTLDs. In the example of the Canadian site, all things being equal, the .ca will beat out the .com in SERP (search engine results page) positioning on Google.ca. (Hosting your blog through a Canadian host would also help the cause, eh?) Keyword- or Brand-Based? Search engines love keyword-based domain names and generally give them an unfair advantage in the result pages. Having the one or two main keywords you are targeting placed within your domain name will boost your blog’s rankings on Google and Bing (as well as on other search engines that, quite frankly, very few people use).
If a given page on your blog links to 10 external sites and 2 internal pages, the PageRank you pass to the internal pages will be greatly reduced due to the presence of the external links. You can think of it as 1/12 each, instead of the half each you’d have if you weren’t linking to external sites—even if the real math is actually more complex than that. There are good reasons to link to other sites in your posts—as we’ll see in future chapters—including showing Google that your site is natural and not artificially built in order to score high in the SERP (again, search engine result pages). Nevertheless, think twice before deciding if you want to have a site-wide blogroll. The alternatives of not having one or placing a list of links on a single Links page of your site are both better options in most cases. Pages Menu When you click Pages, you’ll see a sample page. Delete it. Pages are different from posts. They don’t appear in your RSS feed, don’t have categories, and can have specific templates applied to them to make them look and behave differently from other pages.
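The dilution described above can be sketched numerically. This is a minimal model only (assuming the classic PageRank formula, where a page divides its rank evenly among all of its links and a damping factor scales the contribution); as the text notes, the real math is more complex:

```python
# Minimal sketch of how outbound links dilute the PageRank a page passes on.
# Assumes the classic simplified model: a page splits its rank evenly among
# ALL links, internal and external alike. Real-world ranking is far more
# complex; this only illustrates the 1/12-vs-1/2 arithmetic from the text.

def rank_passed_per_link(page_rank, num_links, damping=0.85):
    """Rank contributed to each linked page under the simple model."""
    if num_links == 0:
        return 0.0
    return damping * page_rank / num_links

# A page with rank 1.0 linking only to its 2 internal pages:
internal_only = rank_passed_per_link(1.0, 2)    # 0.85 / 2

# The same page also linking to 10 external sites (12 links total):
with_external = rank_passed_per_link(1.0, 12)   # 0.85 / 12

# Each internal page now receives one sixth of what it received before,
# because the share per link went from 1/2 to 1/12.
```

Under these assumptions, adding the 10 external links cuts the rank flowing to each internal page by a factor of six, which is exactly the 1/12-versus-1/2 comparison made above.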
Html5 Boilerplate Web Development by Divya Manian
To prevent this, you can add X-Robots-Tag HTTP header tags by appending and uncommenting the following code snippet in the .htaccess file on the staging server:

# ------------------------------------------------------------
# Disable URL indexing by crawlers (FOR DEVELOPMENT/STAGING)
# ------------------------------------------------------------
# Avoid search engines (Google, Yahoo, etc.) indexing the website's content
# http://yoast.com/prevent-site-being-indexed/
# http://code.google.com/web/controlcrawlindex/docs/robots_meta_tag.html
# Matt Cutts (from Google Webmaster Central) on this topic:
# http://www.youtube.com/watch?v=KBdEwpRQRD0
# IMPORTANT: serving this header is recommended only for
# development/staging websites (or for live websites that don't
# want to be indexed). This will avoid the website being indexed
# in SERPs (search engine results pages).
# This is a better approach than using robots.txt to disallow
# search engine robots from crawling your website, because
# disallowing the robots doesn't necessarily mean that your
# website won't get indexed (read the links above).
# <IfModule mod_headers.c>
#   Header set X-Robots-Tag "noindex, nofollow, noarchive"
#   <FilesMatch "\.(doc|pdf|png|jpe?g|gif)$">
#     Header set X-Robots-Tag "noindex, noarchive, nosnippet"
#   </FilesMatch>
# </IfModule>

Trailing slash redirects Search engines consider the folder URLs http://example.com/foo and http://example.com/foo/ to be two different URLs, and as such would consider their content to be duplicates of each other.
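The usual fix for the trailing-slash duplicate is to pick one canonical form and 301-redirect the other to it (on Apache, typically via mod_rewrite). As a rough sketch of the canonicalization decision only (the `canonicalize` helper and its file-vs-folder heuristic are hypothetical, not from the book):

```python
# Sketch of trailing-slash canonicalization (hypothetical helper, not from
# the book): choose one canonical form -- here, folder-style URLs always end
# in a slash -- so http://example.com/foo and http://example.com/foo/ are
# not treated as two duplicate pages. A real site would 301-redirect the
# non-canonical form to the result of this function.

from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Return the canonical (trailing-slash) form of a folder-style URL."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    last_segment = path.rsplit("/", 1)[-1]
    # Heuristic: segments with an extension look like files; leave them alone.
    if last_segment and "." not in last_segment:
        path += "/"
    return urlunsplit((scheme, netloc, path, query, fragment))
```

With this convention, both spellings of a folder URL map to a single indexable address, which is what the redirect is meant to guarantee.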
Airbnb, bounce rate, business climate, citizen journalism, crowdsourcing, Google Glasses, Jeff Bezos, Lean Startup, Menlo Park, Network effects, new economy, pull request, revision control, ride hailing / ride sharing, search engine result page, sharing economy, Skype, TaskRabbit
SEO (Search Engine Optimization) Search engine optimizations are augmentations or enhancements to websites that ensure that the page is as visible as possible to search engines and thus will appear at the top of a results page. SEM (Search Engine Marketing) Any marketing practice that is geared toward improving a site’s search engine visibility, including search engine optimization or paying for placement on a search engine results page. Social Commerce The term social commerce describes online retail and marketing strategies that incorporate established social networks and peer-to-peer communications as a driver of sales. Virality Any piece of information online, whether it is an article, image, or video, that is widely shared and circulates rapidly is said to have “gone viral” and thus possesses “virality.”
Affordable Care Act / Obamacare, algorithmic trading, Amazon Mechanical Turk, asset-backed security, Atul Gawande, bank run, barriers to entry, Berlin Wall, Bernie Madoff, Black Swan, bonus culture, Brian Krebs, call centre, Capital in the Twenty-First Century by Thomas Piketty, Chelsea Manning, cloud computing, collateralized debt obligation, corporate governance, Credit Default Swap, credit default swaps / collateralized debt obligations, crowdsourcing, cryptocurrency, Debian, don't be evil, Edward Snowden, en.wikipedia.org, Fall of the Berlin Wall, Filter Bubble, financial innovation, Flash crash, full employment, Goldman Sachs: Vampire Squid, Google Earth, Hernando de Soto, High speed trading, hiring and firing, housing crisis, informal economy, information retrieval, interest rate swap, Internet of things, invisible hand, Jaron Lanier, Jeff Bezos, job automation, Julian Assange, Kevin Kelly, knowledge worker, Kodak vs Instagram, kremlinology, late fees, London Interbank Offered Rate, London Whale, Mark Zuckerberg, mobile money, moral hazard, new economy, Nicholas Carr, offshore financial centre, PageRank, pattern recognition, precariat, profit maximization, profit motive, quantitative easing, race to the bottom, recommendation engine, regulatory arbitrage, risk-adjusted returns, search engine result page, shareholder value, Silicon Valley, Snapchat, Spread Networks laid a new fibre optics cable between New York and Chicago, statistical arbitrage, statistical model, Steven Levy, the scientific method, too big to fail, transaction costs, two-sided market, universal basic income, Upton Sinclair, value at risk, WikiLeaks
The original PageRank patent, open for all to see, clandestinely accumulated a thick crust of tweaks and adjustments intended to combat web baddies: the “link farms” (sites that link to other sites only to goose their Google rankings), the “splogs” (spam blogs, which farm links in the more dynamic weblog format), and the “content farms” (which rapidly and clumsily aggregate content based on trending Google searches, so as to appear at the top of search engine result pages, or SERPs). Beneath the façade of sleek interfaces and neatly ordered results, guerrilla war simmers between the search engineers and the spammers. The war with legitimate content providers is just as real, if colder. Search engine optimizers parse speeches from Google the way Kremlinologists used to pore over the communiqués of Soviet premiers, looking for ways to improve their showing without provoking the “Google Death Penalty” that de-indexes sites caught gaming the system.
Mark Patterson, “Additional Online Search Comments,” Antitrust and Competition Policy Blog, May 23, 2012, http://lawprofessors.typepad.com/antitrustprof_blog/2012/05/additional-online-search-comments-by-mark-patterson.html. Of course, I can’t verify Foundem’s claims independently—the black box nature of search algorithms makes that impossible. But even if one thinks Google’s rationale for Foundem’s exclusion is more plausible—i.e., that other offerings were far better than Foundem’s—we cannot simply take Google’s word for it. 91. Sometimes the results would be presented like a search engine results page. In other situations, doctors were presented alphabetically, but with stars indicating a “quality rating,” much like the starred ratings restaurants receive from reviewers. Health insurer Wellpoint even hired a restaurant rating firm to help it. Note that Google has also gotten into this business, buying Zagat. 92. Frank Pasquale, “Grand Bargains for Big Data.” 93. As the 10th Circuit said in Jefferson Cty.
The Invisible Web: Uncovering Information Sources Search Engines Can't See by Gary Price, Chris Sherman, Danny Sullivan
AltaVista, American Society of Civil Engineers: Report Card, bioinformatics, Brewster Kahle, business intelligence, dark matter, Douglas Engelbart, full text search, HyperCard, hypertext link, information retrieval, Internet Archive, joint-stock company, knowledge worker, natural language processing, pre–internet, profit motive, publish or perish, search engine result page, side project, Silicon Valley, speech recognition, stealth mode startup, Ted Nelson, Vannevar Bush, web application
Crawlers can fetch any page that can be displayed in a Web browser, regardless of whether it’s a static page stored on a server or generated dynamically. A good example of this type of Invisible Web site is Compaq’s experimental SpeechBot search engine, which indexes audio and video content using speech recognition, and converts the streaming media files to viewable text (http://www.speechbot.com). Somewhat ironically, one could make a good argument that most search engine result pages are themselves Invisible Web content, since they generate dynamic pages on the fly in response to user search terms. Dynamically generated pages pose a challenge for crawlers. Dynamic pages are created by a script, a computer program that selects from various options to assemble a customized page. Until the script is actually run, a crawler has no way of knowing what it will actually do.