What does Google’s new approach to search mean to you?
To take advantage of this new environment, you must first put yourself in Google’s shoes. The monolithic search juggernaut is on a crusade to continue its core brand strategy as the only true innovator in its space. Ask yourself: what would I do if I were trying to maintain this lofty status as the world’s leading provider of information? With this in mind we can better understand the motives and derive some context from the ever-so-vague and nebulous explanations we get from Google’s search team and leadership alike.
|1| Wouldn’t you like search engines like Google to present users with the most personal and relevant results and data possible? Even if it means keeping large amounts of personal data?
Category: Organic (SEO)
A procedure for solving a mathematical problem such as, in the case of search engines, determining the order in which to present a list of websites in response to a query.
The text that appears in place of an image when and where the image cannot be displayed. Image ALT tags are useful to your page’s visitors. Equally as important, they can help with your search engine rankings by increasing the keyword density (if you use your keywords in your ALT tags). Example: <img src="picture_logo.jpg" width="156" height="175" alt="Four historically […]
Anchor text refers to the visible text of a hyperlink. In other words, the actual text that a user clicks on. For example: When I need to link out to something within text, I use anchor text. Anchor text is particularly important when acquiring inbound links and should be a significant aspect of any link […]
The ability of a page or domain to rank well in search engines. Five factors associated with a site that largely contribute to page authority are link equity, site age, traffic trends, site history, and publishing unique, original, quality content.
A method used to break a page down into multiple points on the web graph by dividing the page into smaller blocks.
Most browsers come with the ability to bookmark your favorite pages. Many web based services have also been created to allow you to bookmark and share your favorite resources. The popularity of a document (as measured in terms of link equity, number of bookmarks, or usage data) is a signal for the quality of the […]
Abbreviation for robot (also called a spider). It refers to software programs that scan the web. Bots vary in purpose from indexing web pages for search engines to harvesting e-mail addresses for spammers.
Navigational technique used to help search engines and website users understand the relationship between pages. Example breadcrumb navigation: Home > SEO Tools > SEO for Firefox Whatever page the user is on is unlinked, but the pages above it within the site structure are linked to, and organized starting with the home page, right on […]
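The breadcrumb pattern described above can be sketched in plain HTML; the page names, URLs, and class name here are hypothetical:

```html
<!-- Breadcrumb trail: parent pages are linked, the current page is not -->
<div class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/seo-tools/">SEO Tools</a> &gt;
  SEO for Firefox
</div>
```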
Before making large purchases consumers typically research what brands and products fit their needs and wants. Keyword based search marketing allows you to reach consumers at any point in the buying cycle. The buying cycle may consist of the following stages – Problem Discovery: prospect discovers a need or want. Search: after discovering a problem, […]
The process of picking the best URL when there are several choices; this usually refers to home pages. The canonical version of any URL is the single most authoritative version indexed by major search engines. Search engines typically use PageRank or a similar measure to determine which version of a URL is the canonical URL. […]
A system to hide code or content from a user and deliver different content to a search engine spider. IP based cloaking delivers custom pages based on the user’s IP address. User Agent cloaking delivers custom pages based upon the user’s user agent. Depending on the intent of the display discrepancy and the strength of the […]
In search results the listings from any individual site are typically limited to a certain number and grouped together to make the search results appear neat and organized and to ensure diversity amongst the top ranked results. Clustering can also refer to a technique which allows search engines to group hubs and authorities on a […]
In relation to Google Instant, a cognitive pause is the time taken by a user to consider the results that were displayed after a letter is typed. If that pause is three seconds or longer, listings on that SERP earn an impression.
As used in SEO, competition analysis is the assessment and analysis of strengths and weaknesses of competing web sites, including identifying traffic patterns, major traffic sources, and keyword selection.
A search which attempts to conceptually match results with the query, matching the underlying concept rather than the exact words.
Links which search engines attempt to understand beyond just the words in them. Some rather advanced search engines attempt to understand the concept a link conveys rather than just matching the words of the anchor text.
A search that analyzes the page being viewed by a user and gives a list of related search results. Offered by Yahoo! and Google.
Many forms of online advertising are easy to track. A conversion is reached when a desired goal is completed. Most offline ads have generally been much harder to track than online ads. Some marketers use custom phone numbers or coupon codes to tie offline activity to online marketing. A few common example desired goals – […]
Small data file written to a user’s local machine to track them. Cookies are used to help websites customize your user experience and help affiliate program managers track conversions.
How deeply a website is crawled and indexed. Crawl depth refers to the depth to which the search spider follows links from the home page; a crawl depth of 4, for example, means the website has been spidered to level 4. The levels and […]
How frequently a website is crawled. Sites which are well trusted or frequently updated may be crawled more frequently than sites with low trust scores and limited link authority. Sites with highly artificial link authority scores (i.e. mostly low quality spammy links) or sites which are heavy in duplicate content or near duplicate content (such […]
Automated programs in search engines that gather web site listings by automatically crawling the web. A search engine’s crawler (also called a spider or robot) “reads” page text contents and web page coding, and also follows links to other hyperlinked pages on the web pages it crawls. A crawler makes copies of the web pages […]
Multiple sites linking to each other. Read more about Cross Linking Best Practices.
A bad, or inactive, HTML link whose destination web page no longer exists; a link that produces a 404 “page not found” error.
The act of linking to a page (deep) within a web site rather than linking to the main URL. Directories discourage the submission of deep links as a way to keep their indexes clean and organized. It is also linking that guides, directs and links a click-through searcher (or a search engine crawler) to a […]
Refers to the information contained in the description META tag. This tag is meant to hold the brief description of the web page it is included on. The information contained in this tag is generally the description displayed immediately after the main link on many search engine result pages. This is how a description tag […]
A directory lists websites by individual topics. The Open Directory Project and the Google Directory are examples of directories. A directory is essentially a website which looks just like a search engine but is organized as a series of sections under which appropriate sites are filed. The content of a directory is vetted by human editors before becoming a part […]
Also known as a search directory. Refers to a directory of web sites contained in an engine that are categorized into topics. The main difference between a search directory and a search engine is in how the listings are obtained. A search directory relies on user input in order to categorize and include a web […]
Information in web pages that changes automatically, based on database or user information. Search engines will index dynamic content in the same way as static content unless the URL includes a question mark [?]. However, if the URL does include a question mark [?], many search engines will ignore the URL.
Search engines count links as votes of quality. They primarily want to count editorial links that were earned over links that were bought or bartered. These are links which are not paid for, not asked for and not traded for. These are links which a web site organically attracts because that site is producing good […]
A review process for potential advertiser listings conducted by search engines, which check to ensure relevancy and compliance with the engine’s editorial policy. This process could be automated – using a spider to crawl ads – or it could be human editorial ad review. Sometimes it’s a combination of both. Not all PPC Search Engines […]
Certain activities or signatures which make a page or site appear unnatural might make search engines inclined to filter them out of the search results. For example, if a site publishes significant duplicate content it may get a reduced crawl priority and get filtered out of the search results. Some search engines also […]
Words such as is, am, were, was, the, for, do, etc., that search engines deem irrelevant for indexing purposes. The term is often confused with ‘stop words’. As might be obvious, removing these words from search saves search engines an enormous amount of database space.
Search which will find matching terms when terms are misspelled (or fuzzy). A fuzzy search is a process that locates Web pages that are likely to be relevant to a search argument even when the argument does not exactly correspond to the desired information. A fuzzy search is done by means of a fuzzy matching […]
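As a minimal sketch, fuzzy matching can be demonstrated with Python’s standard difflib module; this is not how any particular search engine implements it, and the candidate terms are invented:

```python
import difflib

def fuzzy_match(query, candidates, cutoff=0.6):
    """Return candidate terms whose similarity ratio to the query
    meets the cutoff, best matches first."""
    return difflib.get_close_matches(query, candidates, n=3, cutoff=cutoff)

# A misspelled query still finds the intended term.
print(fuzzy_match("recieve", ["receive", "rewrite", "window"]))  # → ['receive']
```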
Increasing the organic search ranking of a particular website or page by pointing hundreds or thousands of links at it using very specific anchor text. Google claims to not have allowed this type of manipulation since 2007.
Knocking a competitor out of the search results by pointing hundreds or thousands of low trust, low quality links at their website.
Also called ‘Google Thumbnail’, ‘Google Snapshot’ and ‘Google Preview Window’, Google’s full page preview began appearing in limited tests in early October. The feature allows users to click anywhere on an organic search result (with the exception of the page title) and see a thumbnail image of the entire page.
Google Instant, “a new search enhancement that shows results as you type”, was launched on September 8th, 2010 for users searching while signed into their Google accounts. Google describes three benefits of the new technology: Faster Searches: By predicting your search and showing results before you finish typing, Google Instant can save 2-5 seconds per […]
Shortly after Google Instant was launched, users were interested to see what Google would predict/suggest given every letter of the alphabet. For example: a: amazon b: best buy c: craigslist d: duke energy e: ebay f: facebook g: gmail h: hulu … Mashable, The Wall Street Journal, Advertising Age and ZDNet all ran tests. Google […]
Keyword research tool provided by Google which estimates the competition for a keyword, recommends related keywords, and will tell you what keywords Google thinks are relevant to your site or a page on your site.
Program which webmasters can use to help Google index their contents. Please note that the best way to submit your site to search engines and to keep it in their search indexes is to build high quality editorial links.
Free multivariate testing platform used to help AdWords advertisers improve their conversion rates.
Search terms that are short, popular and straightforward; e.g., “power-boat water skiing.” These short terms are called head terms based on a bell-curve distribution of keyword usage that displays the high numbers of most-used terms at the “head” end of the bell curve graph.
This HTML tag contains the headings or subtitles visible on a page. Your headings provide a summary of page content and ideally should contain strategic keywords to be read by search engine spiders.
(Also known as Invisible text.) Text that is visible to the search engines but hidden to a user. It is traditionally accomplished by coloring a block of HTML text the same color as the background color of the page. More creative methods have also been employed to create the same effect while making it more […]
Making a search engine believe that another website exists at your URL. Typically done using techniques such as a 302 redirect or meta refresh.
The request or retrieval of any item located within a web page. For example, if a user enters a web page with 5 pictures on it, it would be counted as 6 “hits.” One hit is counted for the web page itself, and another 5 hits count for the pictures.
Link based algorithm which ranks relevancy scores based on citations from topical authorities.
A reference (link) from some point in one hypertext document to another location in another (or the same) document. A web browser usually displays hyperlinks with special underlining, color and font, so as to distinguish them from their surroundings. When a user activates the link (e.g. by clicking on it with a mouse) the web […]
A system for defining “hot spots” in a graphic image that, when clicked, take the user to a different web page.
Link pointing to one website from another website. Also see, outbound link.
A search engine’s index refers to the collection of documents found by the search engine’s crawler on the web.
Also known as crawlability and spiderability. Indexability refers to the potential of a web site or its contents to be crawled or “indexed” by a search engine. If a site is not “indexable,” or if a site has reduced indexability, it has difficulties getting its URLs included.
Link from one page on a site to another page on the same site. It is preferential to use descriptive internal linking to make it easy for search engines to understand what your website is about.
Portions of the web which are not easily accessible to crawlers due to search technology limitations, copyright issues, or information architecture issues.
An IP address is an identifier for a computer or device on a TCP/IP network; IP addresses may be dedicated or shared. Networks using the TCP/IP protocol route messages based on the IP address of the destination. The format of an IP address is a 32-bit numeric address, written as four numbers separated by periods. Each number can […]
The act of delivering customized content based upon the user’s IP address. This is used in cloaking so you can deliver specific pages to spiders.
A word or phrase that is used in a search engine query. Optimizing a site entails researching the keyword or keyword phrases that users enter in order to find web sites related to their query, and optimizing a web site around those terms. Includes generic, category keywords; industry-specific terms; product brands; common misspellings and expanded […]
An old measure of search engine relevancy based on how prominent keywords appeared within the content of a page. Keyword density is no longer a valid measure of relevancy over a broad open search index though. The old definition was the number of times a keyword or keyword phrase is used in the body of […]
Denotes how often a keyword appears in a page or in an area of a page. In general, the more often a keyword appears in a page, the higher its search engine ranking. However, repeating a keyword too often in a page can lead to that page being penalized for spamming.
The relationship between various related keywords that searchers search for. Some searches are particularly well aligned with others due to spelling errors, poor search relevancy, and automated or manual query refinement.
Keyword research is the process of discovering relevant keywords and keyword phrases to focus your SEO and PPC marketing campaigns on.
To return to the root or stem of a word and build additional words by adding a prefix or suffix, or using pluralization. The word can expand in either direction and even add words, increasing the number of variable options.
The act of adding way too many keywords to a page in either the text or meta information. Generally refers to the act of adding an inordinate number of keyword terms into the HTML or tags of a web page. The words are added for the ‘benefit’ of search engines and not human visitors. The […]
Refers to the META keywords tag within a web page. This tag is meant to hold approximately 8 – 10 keywords or keyword phrases, separated by commas. These phrases should be either misspellings of the main page topic, or terms that directly reflect the content on the page on which they appear. Keyword tags are […]
Denotes the number of times a keyword appears in a page as a percentage of all the other words in the page. In general, the higher the weight of a particular keyword in a page, the higher the page’s search engine ranking for that keyword. However, repeating a keyword too often in order […]
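The keyword frequency and weight calculations described above amount to simple counting. A rough sketch in Python, with whitespace tokenization as a simplifying assumption and an invented sample page:

```python
def keyword_weight(text, keyword):
    """Percentage of the page's words that are the given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

page = "cheap flights and cheap hotels for cheap travel"
print(keyword_weight(page, "cheap"))  # 3 of 8 words → 37.5
```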
The web page at which a searcher arrives after clicking on an ad. When creating a PPC ad, the advertiser displays a URL (and specifies the exact page URL in the code) on which the searcher will land after clicking on an ad in the SERP. Landing pages are also known as “where the deal […]
A measure used by Google to help filter noisy ads out of their AdWords program. When Google AdWords launched, affiliates and arbitrage players made up a large portion of their ad market, but as more mainstream companies have spent on search marketing, Google has taken many measures to try to keep their ads relevant.
Web sites that generate leads for products or services offered by another company. On a lead generation site, the visitor is unable to make a purchase but will fill out a contact form in order to get more information about the product or service presented. A submitted contact form is considered a lead. It contains […]
A citation from one web document to another web document or another position in the same document. Most major search engines consider links as a vote of trust.
The art of targeting, creating, and formatting information that provokes the target audience to point high quality links at your site. Many link baiting techniques are targeted at social media and bloggers.
The process of building high quality linkage data that search engines will evaluate in order to trust that your website is authoritative, relevant, and trustworthy.
A measure of how strong a site is based on its inbound link popularity and the authority of the sites providing those links.
A link farm is a group of separate, highly interlinked websites for the purposes of inflating link popularity (or PR). It is also a website or group of websites which exercises little to no editorial control when linking to other sites. Engaging in a link farm is a violation of the Terms Of Service of […]
The act of a search engine counting the number of inbound links to a web site. It is a statistic used by some search engines that counts the number of times a web page is linked to by other web pages. Many search engines now use this information as part of their ranking system. Link […]
A measure of how many and what percent of a website’s links are broken. Links may be broken for a number of reasons, but four of the most common reasons are: a website going offline; linking to content which is temporary in nature (due to licensing structures or other reasons); moving a page’s location; changing a […]
A profile is a representation of the extent to which something exhibits various characteristics. A linking profile is the result of an analysis of where your links are coming from.
All server software stores information about a web site’s incoming and outgoing activity. A log file is maintained on the server recording every file accessed. The log file is usually in the root directory but it may also be found in a secondary folder.
The analysis of records stored in the log file. In its raw format, the data in the log files can be hard to read and overwhelming. There are numerous log file analyzers that convert log file data into user-friendly charts and graphs. A good analyzer is generally considered an essential tool in SEO because it […]
The process of submitting Websites or Web pages to search engines and directories for inclusion in their databases using specific guidelines unique to each index. Also see, submission.
The tag present in the header of a web page that is used to provide a short description of the contents of the page. Some search engines will display the text present in the Meta Description Tag when the page appears in the results of a search. Including keywords in the Meta Description Tag can […]
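For example, a meta description tag in the page header looks like this (the title and wording are illustrative):

```html
<head>
  <title>Example Page</title>
  <!-- Some engines display this text under the link in their results -->
  <meta name="description" content="A short, keyword-aware summary of what this page offers.">
</head>
```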
The meta keywords tag is a tag which can be used to highlight keywords and keyword phrases which the page is targeting. The code for a meta keywords tag looks like this: <meta name="keywords" content="keyword phrase, another keyword, yep another, maybe one more"> Many people spammed meta keyword tags and searchers typically never see the […]
The tag present in the header of a web page that is used to display a different page after a few seconds. If a page displays another page too soon, most search engines will either ignore the current page and index the second page or penalize the current page for spamming.
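A meta refresh that sends the visitor to a new page after five seconds looks like this (the destination URL is a placeholder):

```html
<!-- content = "delay-in-seconds; url=destination" -->
<meta http-equiv="refresh" content="5; url=http://www.example.com/new-page.html">
```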
A server that passes queries on to many search engines and directories, then summarizes the results. Ask Jeeves, Dogpile, Metacrawler, Metafind and Metasearch are meta search engines.
An HTML tag placed within the header area of code for a web site. This information is visible only to spiders and does not appear as a visual part of the web site. These tags were originally used by webmasters to provide information about the content of a web site in order to assist search […]
A system of measures that helps to quantify particular characteristics. In SEO the following are some important metrics to measure: overall traffic, search engine traffic, conversions, top traffic-driving keywords, top conversion-driving keywords, keyword rankings, etc.
In SEO parlance, a mirror is a near identical duplicate website (or page). Mirrors are commonly used in an effort to target different keywords/keyphrases. Mirror sites are useful when the original site generates too much traffic for a single server to support. Using mirrors is a violation of the Terms Of Service of most search […]
URL Rewrite processes, also known as “mod rewrites,” are employed when a webmaster decides to reorganize a current web site, either for the benefit of better user experience with a new directory structure or to clean up URLs which are difficult for search engines to index.
A posted and visible link in the text of a web page that directs to a web site. The link has no anchor text.
NoFollow is an attribute webmasters can place on links that tells search engines not to count the link as a vote or not to send any trust to that site. Search engines will follow the link, yet it will not influence search results. NoFollow can be added to any link with the attribute rel="nofollow". Nofollow […]
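Applied to a link, the attribute looks like this (example.com is a placeholder):

```html
<!-- rel="nofollow" asks engines not to count this link as a vote -->
<a href="http://www.example.com/" rel="nofollow">Example Site</a>
```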
A tag used to describe the content of a frame to a user or engine which has trouble displaying / reading frames. Frequently misused and often referred to as “poor man’s cloaking”.
The noscript element is used to define alternate content (text) to display if a script is NOT executed. This tag is used for browsers that recognize the <script> tag but do not support the script in it.
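For example (the script body is illustrative):

```html
<script>
  document.write("Hello from JavaScript");
</script>
<noscript>
  Your browser does not support JavaScript, or scripting is disabled.
</noscript>
```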
The changes that are made to the content and code of a web site in order to increase its rankings in the results pages of search engines and directories. These changes may involve rewriting body copy, altering Title or Meta tags, removal of Frames or Flash content, and the seeking of incoming links.
Listings on SERPs that were not paid for; listings for which search engines do not sell space. Sites appear in organic (also called “natural”) results because a search engine has applied formulas (algorithms) to its search crawler index, combined with editorial decisions and content weighting, that it deems important enough for inclusion without payment. Paid Inclusion […]
Listings that search engines do not sell (unlike paid listings). Instead, sites appear solely because a search engine has deemed it editorially important for them to be included, regardless of payment. Paid Inclusion Content is also often considered “organic” even though it is paid for. This is because paid inclusion content usually appears intermixed with […]
A link from one website pointing at another external website. Also see, inbound link.
Search engines prevent some websites suspected of spamming from ranking highly in the results by banning or penalizing them. These penalties may be automated algorithmically or manually applied. If a site is penalized algorithmically the site may start ranking again after a certain period of time after the reason for being penalized is fixed. If […]
Altering the search results based on a person’s location, search history, content they recently viewed, or other factors relevant to them on a personal level.
Abbreviation for Pay For Inclusion. Many search engines offer a PFI program to assure frequent spidering / indexing of a site (or page). PFI does not guarantee that a site will be ranked highly (or at all) for a given search term. It just offers webmasters the opportunity to quickly incorporate changes to a site […]
Words which were traditionally associated with low quality content that caused search engines to want to demote the rankings of a page.
A pop-up that loads under a page so that it is only viewable when the current page is closed.
A full service website. Usually refers to any high traffic website that provides news, email, search, and some form of entertainment. Yahoo!, MSN, and AOL are portal sites.
PageRank is a Google technology developed at Stanford University for placing importance on pages and web sites. At one point, PageRank (PR) was a major factor in rankings. Today it is one of hundreds of factors in the algorithm that determines a page’s rankings. The PageRank formula is: PR(A) = (1-d) + d (PR(T1)/C(T1) + […]
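A toy sketch of the iterative calculation behind the formula above; d is the damping factor, and the three-page link graph is invented for illustration:

```python
def pagerank(links, d=0.85, iterations=50):
    """links maps each page to the pages it links out to.
    Iterates PR(A) = (1 - d) + d * sum(PR(T)/C(T)) over pages T linking to A,
    where C(T) is the number of outbound links on T."""
    pr = {page: 1.0 for page in links}
    for _ in range(iterations):
        new = {}
        for page in links:
            inbound = sum(pr[t] / len(links[t]) for t in links if page in links[t])
            new[page] = (1 - d) + d * inbound
        pr = new
    return pr

# A is linked to by both B and C, so it ends up with the highest score.
graph = {"A": ["B"], "B": ["A"], "C": ["A"]}
print(pagerank(graph))
```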
The ability of a search engine to list results that satisfy the query, usually measured in percentage. (if 20 of the 50 results match the query the precision is 40%) Search spam and the complexity of language challenge the precision of search engines.
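The percentage calculation in the parenthetical example is straightforward:

```python
def precision(relevant_returned, total_returned):
    """Percentage of returned results that satisfy the query."""
    return 100.0 * relevant_returned / total_returned

print(precision(20, 50))  # → 40.0
```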
A measure of how close words are to one another. A page which has words near one another may be deemed to be more likely to satisfy a search query containing both terms. If keyword phrases are repeated an excessive number of times, and the proximity is close on all the occurrences of both words […]
This term describes traffic that is produced by users that find a web site by searching for a product or concept that is offered on that web site. These visitors are thought to be more likely to interact with or purchase from your web site and are therefore of higher quality than other visitors.
Search engines count links as votes of trust. Quality links count more than low quality links. There are a variety of ways to define what a quality link is, but the following are characteristics of a high quality link: Trusted Source: If a link is from a page or website which seems like it is trustworthy […]
The keyword or keyword phrase a searcher enters into a search field, which initiates a search and results in a SERP with organic and paid listings. Alternately, also a word, phrase or group of words characterizing the information a user seeks from search engines and directories. The search engine subsequently locates Web pages to match […]
Some searchers may refine their search query if they deem the results irrelevant. Query refinement is both a manual and an automated process. If searchers do not find their search results relevant they may search again. Search engines may also automatically refine queries using the following techniques: Google OneBox: promotes a […]
The position at which a site’s entry is displayed in any search engine query. For example, if you rank at position #1, you’re the first listed paid or sponsored ad. If you’re in position #18, it is likely that your ad appears on the second or third page of search results, after 17 competitor paid ads […]
Two different sites that link out to each other. Also referred to as Cross Linking. Quality reciprocal link exchanges in and of themselves are not a bad thing, but most reciprocal link offers are of low quality. If too many of your links are of low quality it may make it harder for your site […]
A method of alerting browsers and search engines that a page location moved. 301 redirects are for permanent change of location and 302 redirects are used for a temporary change of location.
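As a sketch, in an Apache .htaccess file the two redirect types look like this (the paths and destination URLs are hypothetical):

```
# Permanent move: engines transfer the old URL's standing to the new one
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Temporary move: engines keep indexing the original URL
Redirect 302 /sale.html http://www.example.com/holiday-sale.html
```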
The URL address of the web page a user came from before entering another web page. Each time a user clicks a hyperlink, most browsers send an HTTP referrer header to the new web server so that the server can record the information in log files. The search terms a user typed into a search engine […]
A special meta tag that causes a web browser to reload a page (perhaps the same page) after a delay.
If a site has been penalized for spamming they may fix the infraction and ask for reinclusion. Depending on the severity of the infraction and the brand strength of the site they may or may not be added to the search index.
A link which shows the relation of the current URL to the URL of the page being linked to. Some links only show relative link paths instead of having the entire reference URL within the a href tag. Due to canonicalization and hijacking related issues it is typically preferred to use absolute links over relative […]
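A sketch of the difference (example.com and the path are placeholders):

```html
<!-- Relative link: resolved against the current page's URL -->
<a href="/tools/page.html">SEO Tools</a>

<!-- Absolute link: the entire reference URL, preferred for canonicalization reasons -->
<a href="http://www.example.com/tools/page.html">SEO Tools</a>
```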
Ensuring your brand related keywords display results which reinforce your brand. Many hate sites tend to rank highly for brand related queries.
Much like search engine submission, resubmission is generally a useless program which is offered by businesses bilking naive consumers out of their money for a worthless service.
An index of keywords which stores records of matching documents that contain those keywords.
Any browser program that follows hypertext links and accesses Web pages but is not directly under human control. Example: search engine spiders, the harvesting software programs that extract e-mail addresses or other data from Web pages.
A special file in the root directory of a website used to control how and which search engine spiders access pages within a website. When a spider or robot connects to a website, it checks for the presence of the robots.txt file and uses it to index or avoid specific or all web pages within […]
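Python's standard library can evaluate such rules; a minimal sketch, using a made-up set of rules and URLs for illustration:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block all spiders from /private/, allow everything else
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# A well-behaved spider checks each URL before fetching it
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))         # True
print(parser.can_fetch("Googlebot", "https://example.com/private/data.html"))  # False
```

Note that robots.txt is advisory: compliant crawlers respect it, but nothing technically prevents a rogue robot from ignoring it.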
A term relating to the number of URLs included from a specific web site in any given search engine. The higher the saturation level or number of pages indexed into a search engine, the higher the potential traffic levels and rankings.
Similar to a search engine, in that they both compile databases of web sites. A directory does not use crawlers in order to obtain entries in its search database. Instead, it relies on user interaction and submissions for the content it contains. Submissions are then categorized by topic and normally alphabetized, so that the results […]
A search engine is a searchable online database of internet resources. It has several components: search engine software, spider software, an index (database), and a relevancy algorithm (rules for ranking). The search engine software consists of a server or a collection of servers dedicated to indexing Internet Web pages, storing the results and returning lists […]
This is the process of editing a web site’s content and code in order to improve visibility within one or more search engines. When this term is used to describe an individual, it stands for “Search Engine Optimizer” or one who performs SEO.
The practice of trying to ensure that a web site obtains a high rank in the search engines. Also called search engine positioning, search engine optimization etc.
Movement or path of searchers, who tend to do several searches before reaching a buy decision, that works from broad, general keyword search terms to narrower, specific keywords. Advertisers use the search funnel to anticipate customer intent and develop keywords targeted to different stages. Also refers to potential for switches at stages in the funnel […]
The keywords used in a query to a search engine. Also see search query and search string.
The word or phrase a searcher types into a search field, which initiates search engine results page listings and PPC ad serves. In PPC advertising, the goal is to bid on keywords that closely match the search queries of the advertiser’s targets. Also see search string.
Search strings or terms are the words entered by users into a search engine or directory to locate needed information.
Links that are acquired indirectly, such as through a story in a major newspaper about a new product your company released.
A form of internet marketing that seeks to promote websites by increasing their visibility in search engine result pages (SERPs). SEM methods include: search engine optimization (SEO) and paid placement (contextual advertising, digital asset optimization, and paid inclusion). These methods result in two types of listing: Editorial / Organic / Natural Listings: Any good search engine, such as Google […]
Writing and formatting copy in a way that will help make the documents appear relevant to a wide array of relevant search queries. There are two main ways to write titles and be SEO friendly Write literal titles that are well aligned with things people search for. This works well if you need backfill content […]
Search Engine Results Page, the page delivered to a searcher that displays the results of a search query entered into the search field. Displays both paid ad (sponsored) and organic listings in varying positions or rank.
Techniques used to steal another web site's traffic, including the use of spyware or cybersquatting.
Page which can be used to help give search engines a secondary route to navigate through your site. On large websites the on page navigation should help search engines find all applicable web pages. On large websites it does not make sense to list every page on the site map, just the most important pages. […]
Sites where users actively participate and interact with each other to determine what is popular.
Any search marketing method that a search engine deems to be detrimental to its efforts to deliver relevant, quality search results. Some search engines have written guidelines on their definitions and penalties for SPAM. Examples include doorway landing pages designed primarily to game search engine algorithms rather than meet searcher expectations from the advertiser’s clicked-on […]
The act of creating and distributing spam to “trick” the search engines. These tactics generally are against the guidelines put forth by the search engines. Tactics such as Hidden text, Doorway Pages, Content Duplication and Link Farming are but a few of many spam techniques employed over the years.
An automated program that follows links to visit web sites on behalf of search engines or directories. Robots then process and index the code and content of a web page to be stored in the search engine’s database.
A spider trap refers to either a continuous loop where spiders are requesting pages and the server is requesting data to render the page or an intentional scheme designed to identify (and “ban”) spiders that do not respect robots.txt.
A feature-rich or elegantly designed web page which typically offers poor usability and gives search engines little content to index. Make sure your home page has as much simply formatted, relevant content on it as possible.
Content which does not change frequently. May also refer to content that does not have any social elements to it and does not use dynamic programming languages. Many static sites do well, but the reasons fresh content works great for SEO are: If you keep building content every day you eventually build a huge archive […]
The process of determining root words. For example, querying a search engine using the word “computer” might return results for “computers” or “computing”.
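Real stemmers (such as the Porter algorithm) are considerably more sophisticated, but a toy suffix-stripping sketch conveys the idea:

```python
def naive_stem(word: str) -> str:
    """Strip a few common English suffixes -- a toy illustration, not a real stemmer."""
    for suffix in ("ing", "ers", "er", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

# Queries for "computers" and "computing" both reduce toward the same root
print(naive_stem("computers"))  # comput
print(naive_stem("computing"))  # comput
```

Because both forms map to the same root, a search engine can match a query for "computer" against documents that only contain "computers" or "computing".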
With regard to search engines, stop words are extremely common words (such as "the", "a", or "of") that a search engine filters out of queries and indexes because they carry little meaning on their own. The term is also sometimes used for words that cause a spider to avoid indexing a page, such as words that are offensive, prohibited or otherwise censored.
The process of informing a search engine of the URL of a website (submitting). See also, URL Submission.
Documents which generally are trusted less and rank lower than documents in the main search index. Some search engines, such as Google, have multiple indices. Documents may not be well trusted due to any of the following conditions: limited link authority relative to the number of pages on the site, duplicate content or near duplication […]
A theme is the overall idea of what a web page is focused on. Search engines determine the theme of a web page by algorithmically analyzing the density of associated words on the page.
The top echelon, or top three, search engines that serve the vast majority of searcher queries. Also referred to as Major Engines, Top Tier Engines or GYM, for Google, Yahoo! and Microsoft Live Search.
Smaller, vertical and specialized engines, including general engines, such as Ask.com and AOL; meta-engines that search and display results from other search engines, such as Dogpile; local engines, shopping and comparison engines, and business vertical engines. Tier II Search Engines don’t offer the search query market share or features of the Tier I engines; however, […]
Contextual distribution networks, through which marketers' ads appear on pages within the PPC engine's content network, triggered at the moment a user views a web page that contains the advertiser's keyword in its content. Cost is usually through Cost-Per-Thousand-Impressions (CPM) charges, rather than Pay Per Click (PPC). As discussed in Fundamentals coursework, Google's contextual distribution […]
This is the name of the webpage. The title is one of the most important aspects to doing SEO on a web page. Each page title should be: Unique to that page: Not the same for every page of a site! Descriptive: What important ideas does that page cover? Not excessively long: Typically page titles […]
An HTML tag appearing in the <head> tag of a web page that contains the page title. The page title should be determined by the relevant contents of that specific web page. The contents of a title tag for a web page is generally displayed in a search engine result as a bold blue underlined […]
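For example, a page about blue widgets might carry a title like this (the page and company names are hypothetical):

```html
<head>
  <title>Blue Widgets - Acme Widget Co.</title>
</head>
```

A unique, descriptive title on each page helps both the search engine and the searcher scanning the results page understand what the page covers.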
Top Level Page feed, the often automatic and on-subscription feed of an advertiser’s home page or unique category pages. Also see TLP.
Many major search companies aim to gain marketshare by distributing search toolbars. Some of these toolbars have useful features such as pop-up blockers, spell checkers, and form autofill. These toolbars also help search engines track usage data.
A method of computing PageRank which, instead of producing a single global score, creates topic-related PageRank scores. It is similar in spirit to vertical search, except that here a general search engine computes the per-topic scores rather than a vertical search engine. Hence the result is said to be topic-sensitive PageRank.
Automated notification that another website mentioned your site which is baked into most popular blogging software programs. Due to the automated nature of trackbacks they are typically quite easy to spam. Many publishers turn trackbacks off due to a low signal to noise ratio.
The process of analyzing traffic to a web site to understand what visitors are searching for and what is driving traffic to a site.
Search Engine Commando’s unique technology that automatically submits large lists of URL’s over an appropriate number of days according to the acceptance policies of individual search engines, rather than submitting them all at once. Trickle Submission reduces the risk of rejection by search engines while increasing the chances of achieving good placement.
Search relevancy algorithm which places additional weighting on links from trusted seed websites that are controlled by major corporations, educational institutions, or governmental institutions.
A real visitor to a Website (as opposed to a visit by a search engine robot or other automated program). Web servers record the IP addresses of each visitor, and this is used to determine the number of real people who have visited a Web site. If someone visits twenty pages within your site, the server […]
The process of sending data or files to another computer. The opposite of upload is download. In a web context, this means transferring files to or from the web server on which a web site is hosted.
A technique used to help make URLs more unique and descriptive to help facilitate better site-wide indexing by major search engines. See also URL.
The act of submitting a website to the search engines. Also see URL Submission.
This term refers to how “user friendly” a web site and its functions are. A site with good usability is a site that makes it easy for visitors to find the information they are looking for or to perform the action they desire. Bad usability is anything that causes confusion or problems for the user. […]
Things like a large stream of traffic, repeat visitors, multiple page views per visitor, a high click-through rate (CTR) or a high level of brand related search queries may be seen by some search engines as a sign of quality. This is also the data that is used to analyze SEO or Paid SEM efforts.
This is the identity of a web site visitor, spider, browser, etc. The most common user agents are Mozilla and Internet Explorer.
Positioning trends when vertical listings appear at the top of organic search engine results and below top sponsored listings (when they are displayed on the SERP).
Search engines that focus on a specific industry or sector. Such vertical search engines (also called “vortals”) have much more specific indexes and provide narrower and more focused search results than the Tier I search engines. See Vertical Search.
A search service which is focused on a particular field, a particular type of information, or a particular information format. In other words, domain specific search solutions that focus on one area of business or knowledge and hence is limited to a specific topic, media format, genre, purpose, location, or other differentiating feature. Hence a […]
Self propagating marketing techniques. Common modes of transmission are email, blogging, and word of mouth marketing channels. Many social news sites and social bookmarking sites also lead to secondary citations.
One of the often under-considered factors in a strong SEO strategy is the site's architecture and its effect on the flow of PageRank through a site. We know that PageRank dissipates as we get further from the home page, but our conceptualization of page depth is clouded by the modern website's sidebars, dynamically updating widgets, site-wide footers, and more. The difficulty in assessing a site's architecture is compounded for very large sites with rich histories, especially as an SEO working for an agency who needs to develop a strong understanding of a new client's site relatively quickly.
Google Webmaster Tools
One tool that falls short time and again is the Webmaster Tools 'Internal Links' report. Within this report we can get a look at which pages are linking to which, and get a count of internal links to any individual page. Unfortunately, the data is sampled (and often misleading) and does not report page depth.
Black Widow by SoftBytes will crawl through your site discovering new pages link by link. The biggest benefit of Black Widow is the Windows Explorer-like visualization of site architecture:
Xenu is a great tool for discovering broken links by doing a full site crawl much like Black Widow. Additionally, Xenu reports “LEVEL” and “LINKS IN”, which are particularly useful for developing a site architecture understanding. After running a full crawl, I like to import the results into Microsoft Excel and do some quick manipulation to rate the internal link juice that flows into each page. I have found that this has enabled me to get a quick sense of the site’s architecture in a pretty painless and repeatable manner. This process is what I’ll be detailing in this blog post:
Step 1: Run Your Crawl
We won't need anything but links to internal pages crawled. You can speed up a crawl significantly by allowing Xenu to crawl only the pages that matter. That is, if you'd like to crawl only the www subdomain, you should specify so in Xenu, as crawling the root domain could take a lot more time.
Step 2: Import and Clean up Tab Separated File into Excel
After importing, remove all but the “Address”, “Type”, “Level”, and “Links In” columns. Next, we’ll remove all of the non-html pages by deleting all addresses that do not have a “Type” of text/html.
Use the above filter to show only non-text/html entries, then delete them all. Once the filter is removed, we’re left with just html pages.
Once we’ve done this we can delete the “Type” column, leaving us with “Address”, “Level” and “Links In”.
Because of various crawling oddities, many sites will include odd level counts. Unless you’re working with a MASSIVE site, most normal pages will not have a level higher than 10. Sort by level, and find that point where the levels begin to jump and/or non-important pages are crawled, and remove all thereafter.
Step 3: Assign a “Level” and “Links In” Score
Knowing that more PageRank flows to pages closer to zero (the home page), I use the following formula to score “Level”:
The home page (level zero) will receive a score of 1, all of the level one pages will receive a score that is a fraction of 1, level two will be scored less than level one, and so on.
I score the “Links In” column using the following formula:
=Table1[[#This Row],[Links In]]/MAX([Links In])
This formula works similarly in that the strongest score is 1, and lower “Links In” counts will be a fraction of 1.
Your Excel table should look something like this
Step 4: Rank Your Pages
Once we have our scores, we can add them together and/or use the RANK formula to get a quick reference number.
A higher total score (or, if you use the RANK formula, a lower rank number) indicates higher internal importance.
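The two-part scoring above can be sketched in Python. The "Links In" score mirrors the Excel formula exactly; for the "Level" score I assume 1/(level+1), which matches the description (level zero scores 1, deeper levels score fractions of 1), though the author's exact Excel formula is not shown:

```python
# Each page: (address, level, links_in), as exported from a Xenu crawl.
# The URLs and counts here are made up for illustration.
pages = [
    ("https://example.com/",      0, 120),
    ("https://example.com/about", 1,  40),
    ("https://example.com/deep",  3,   6),
]

max_links_in = max(links_in for _, _, links_in in pages)

scores = {}
for address, level, links_in in pages:
    level_score = 1 / (level + 1)          # assumed: home page (level 0) scores 1
    links_score = links_in / max_links_in  # mirrors =[Links In]/MAX([Links In])
    scores[address] = level_score + links_score

# Rank pages by combined score, most internally important first
for address, score in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {address}")
```

The home page ends up with the maximum score of 2.0, and deep, lightly linked pages trail well behind, which is the quick-reference ordering the Excel sheet produces.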
Utility and Caveats
There are some obvious issues and shortcomings with this method of scoring internal pages. The most obvious is the absence of external link weight in the formula. It's important to understand that our score is based purely on internal weight.
I have found, however, that it can be quite useful to have early in the life of a project as a reference. For instance, as I'm auditing a new site I can copy the URL of a page in question and do a quick CTRL+F in my Excel score sheet to get a quick feel for a page's internal "importance". Another great use would be to compare these scores with other KPIs, such as conversion rate or organic traffic. If you've got a page that converts like crazy, but has a poor internal link score, perhaps it should be moved closer to the home page, or linked to from more internal pages.
What helps you visualize site architecture? Let me know in the comments or on Twitter, @MikeCP.
SEO is dead. Long live social media optimisation
As Google search results throw up more and more ads, using SEO to reach your audience is becoming increasingly futile. Could social media optimisation be the answer?
Search engine optimisation (SEO) was always a flawed concept. At its worst, it means making web content less engaging for the reader but supposedly better for search robots and for the mysterious algorithms that determine the order in which results appear for a Google search. At its best, it means no more than following best practice in creating clear, accessible web sites with intelligible content, meaningful titles, descriptive "alt" attributes for images, no broken links, and the rest of what makes for a high-quality web destination.
Now SEO is dying. A striking post by Dan Graziano reveals that a Google search may display only 13% organic results; “the rest is ads and junk”. In addition, a recent Forrester report on how consumers found websites in 2012 shows that social media is catching up with search, accounting for 32% of discoveries versus 54% for search, according to the US respondents, up from 25% in 2011. The trend towards localised results delivered to mobile users, perhaps via an app rather than a web page, is another reason why traditional SEO is decreasingly important.
A better model for today’s businesses is to consider what it means to be social-media optimised, with a focus on customer-centric interaction rather than merely setting up a web property in the hope that Google will deliver hits. Recommendations from friends count for more than a search engine algorithm will ever achieve.
What then is social media optimisation? It is about inviting people into conversation rather than merely broadcasting a message. It is about listening to social media chatter and acting on the results. It is a hashtag or twitter handle on every ad, and a responsive team behind that social media presence for those who respond. It is integrating multiple touch points on multiple channels so that customers get a consistent experience across all of them. These things are not trivial to implement, though the technology to do so now exists, and they have the potential not only to drive sales but also to transform customer experience.
SEO will not be missed.
Comic-Con, with its 130,000 attendees over four days, may generate millions of dollars in revenues for local businesses, but it also manufactures close to 20 tons of trash on surrounding city streets.
That’s where the Downtown Partnership’s Clean & Safe staffers come in, tasked with hauling all that debris away, not including all the trash deposited in the center itself.
This year’s event ended Sunday at the San Diego Convention Center. Here’s how the cleanup took place:
Crowd control: The cleanup crew, with some 100 staffers working day and night throughout the event, had to navigate dense crowds with carts laden with tools. The crew covered a wide swath, between 1st and 13th avenues and north of Harbor Drive to Broadway.
“This is hands down the largest event we’ll do,” said Ryan Loofbourrow, executive director of the Clean & Safe program. “Others don’t cascade all the way into downtown the…