Google’s Best Kept Secret


One of the world's best-kept and most valuable industrial secrets is not the Coke formula, but rather Google's search engine algorithm. Access to this algorithm would give a person nearly complete control over search results. With its logic in hand, one could position the sites of one's choice first in the search results, and push other, better sites back into the pages no one views.

Since Google results are the most important factor in directing traffic to a web site, this knowledge could give a very substantial advantage to the businesses that had access to it. It is no surprise, therefore, that Google keeps this information as compartmentalized as possible. Very few people know and understand the full search algorithm.
According to Google, its search algorithm is composed of more than 200 components used to determine the ranking of a site. Search engine experts estimate that the Google algorithm undergoes an average of 52 changes per day. So even if someone cracked the whole algorithm, before long that knowledge would no longer be very valuable.
There is no doubt that no other company has as much influence on web search, and on the Internet experience as a whole, as Google.

How did Google come to dominate the search engine market?

Let's start with a little history. In the pre-Google era, only about a decade ago, search engines were quite primitive. Most sites were found through web portals and directories, such as Yahoo!. The goal of the first search engines (such as Yahoo! and AltaVista) was to simplify the task of finding a website that contained information you needed but whose existence you were unaware of.

The challenge:

The user entered a search phrase, just as we do today with Google, and the search engine went around reading each site's meta tags. Meta tags are essentially code words placed in a site by the web developer, designed to tell search engines what each page is about and which search keywords are relevant to it.
The problem with this method was that every site developer or owner could say anything they wanted about their site, and soon enough people were cheating to get more visitors by using misleading meta tags. You could be looking for horses, but instead find sites about gambling. The Internet became flooded with "dirty" sites, and search engines lost much of their credibility and popularity.
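To make this concrete, here is a minimal sketch of how an early engine could read those self-declared keywords with Python's built-in HTML parser. The sample page is invented for the example, and it shows exactly why the mechanism was so easy to abuse: the engine had to take the owner's claims on faith.

```python
from html.parser import HTMLParser

# A hypothetical page from the meta-tag era: the keywords are whatever
# the site owner chose to claim, whether or not the content matches.
PAGE = """
<html><head>
  <title>Totally About Horses</title>
  <meta name="keywords" content="horses, riding, ponies">
  <meta name="description" content="Everything about horses.">
</head><body>Casino! Poker! Slots!</body></html>
"""

class MetaTagReader(HTMLParser):
    """Collects <meta name=... content=...> pairs, as early engines did."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if "name" in d and "content" in d:
                self.meta[d["name"]] = d["content"]

reader = MetaTagReader()
reader.feed(PAGE)
# The engine trusts these claims blindly, gambling content and all.
print(reader.meta["keywords"])
```

Notice that nothing in the page body is checked against the declared keywords; that gap is precisely what spammers exploited.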

And then came Google…

In 1996, two Stanford University graduate students, Larry Page and Sergey Brin, were collaborating on their doctoral research in computer science. While working on it, they observed that the more authoritative an author is considered to be, the more that author is quoted in other works. They realized this could also work in reverse: if they could index and map research articles and check who quotes whom, and who is cited more and who less, they could determine which authors are more authoritative.


They quickly understood that if this concept worked for academic documents, it would probably work for the web as well. If the linking relations between sites could be mapped, it would be easy to find out which sites are considered reliable and authoritative, and therefore more relevant. These sites would then appear higher on the search results page. As they researched the subject, they developed an algorithm called PageRank (named for Larry Page). This algorithm maps the link structure of the entire web and points out which sites are more authoritative than others.
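The core idea can be sketched in a few lines. The toy four-page link graph, the 0.85 damping factor, and the iteration count below are illustrative assumptions (the real PageRank runs over billions of pages), but the loop captures the principle: each page repeatedly passes a share of its authority to the pages it links to.

```python
# A minimal PageRank sketch on a toy four-page web.
links = {
    "A": ["B", "C"],   # A links to B and C
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with equal authority
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            # Each page splits its authority evenly among its outgoing links.
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:
                new[target] += share
        rank = new
    return rank

ranks = pagerank(links)
# C collects links from A, B, and D, so it ends up the most authoritative.
print(max(ranks, key=ranks.get))  # → C
```

The key property is that a link from an already-authoritative page is worth more than a link from an obscure one, which is exactly the citation insight carried over to the web.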
They offered to sell their technology to Yahoo!, the biggest search provider at the time, but were turned down. So they started their own business: Google.
Google's new concept forever changed the interaction between search engines and websites. Instead of a site telling the search engine what it's about, the search engine now determines on its own what your site is about, and displays the search results accordingly.

So how does Google assess the value of a web page?

As mentioned earlier, Google has over 200 criteria for assessing the value of a web page. We can’t list all of them (frankly, we don’t know all of them) but the basic process is as follows:
  • The search engine scans a site, reading and analyzing its contents. It then learns more about the site from its outgoing links. (If a site links to other sites about flowers, Google will assume the site is probably about flowers.)
  • More is learned about the site from incoming links. (If several sites about flowers point to a page, it is probably also related to flowers.)
  • The page's authority is measured by the number and quality of incoming links, using the PageRank algorithm. (If high-authority sites on flowers link to the page, it too will gain high authority for flowers.)
  • The length of a user's visit is also checked. This is measured by the time it takes for a user to return to the results page after first entering the site through the search engine. The longer the stay, the more relevant the page is presumed to be for that search term.
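The dwell-time signal in the last bullet can be illustrated with a small sketch. The timestamps and the 30-second "short click" threshold below are invented for the example; Google's actual thresholds and event names are not public.

```python
from datetime import datetime

def dwell_seconds(click_time, return_time):
    """Seconds the user spent on the result before returning to the results page."""
    return (return_time - click_time).total_seconds()

# Hypothetical session: the user clicks a result, then comes back 2.5 minutes later.
clicked  = datetime(2010, 5, 1, 12, 0, 0)
returned = datetime(2010, 5, 1, 12, 2, 30)

dwell = dwell_seconds(clicked, returned)
SHORT_CLICK = 30  # assumed threshold: a quick bounce suggests a poor match
print(dwell, "satisfied" if dwell > SHORT_CLICK else "bounced")
```

A long dwell is read as weak evidence the page answered the query; a near-instant return is read as evidence it did not.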

How do SEO consultants learn about Google’s algorithm?

Search engine experts learn about the Google ranking algorithm from various sources, such as:
  • Reverse engineering – Much can be learned through trial and error, checking which factors influence a site's ranking. Incidentally, if an SEO consultant tells you they know the Google algorithm, this should raise a red flag. If they knew Google's secret, they wouldn't need clients.
  • The information provided by Google – Google is interested in serving the most relevant results for each search. They provide tips, tools and guidelines to help webmasters improve the quality of their sites, and subsequently rank higher in the search results.
  • Google’s Patent Applications – These publicly available documents provide further understanding of the underlying technologies and processes.
It is important to understand that SEO consultants don't work "against" Google; rather, they help Google find and evaluate the client's site. The first goal of an SEO consultant is to help the client improve the quality of the site and make it more search engine friendly. This is in Google's interest as well.

In Summary:

As mentioned previously, many ever-changing factors influence the way sites are ranked on Google. Here at SEO Jerusalem we pride ourselves on being the mavericks of the industry. We stay on top of Google's search results methodology, and are continually exploring new technologies and techniques to improve search result rankings for all our clients.

Contact us!

SEO Jerusalem is a premium search engine optimization company, promoting websites in English, Hebrew and German. Our expert SEO team provides worldwide SEO services, and we are one of the leading companies in the Israeli SEO market. Our search engine marketing strategies and innovative SEO web development techniques are ideal for Google SEO or any other organic SEO needs. Locally, we offer Israel-wide SEO services, including the cities of Jerusalem, Tel Aviv, Kiryat Shmona, Haifa and Eilat. Worldwide, we offer global campaign services to companies and organizations all over the world. We look forward to working with you, so please feel free to contact us.
