What is SEO and how search engines work

SEO stands for Search Engine Optimization: a set of activities performed to meet the quality standards of search engines, so that our web pages can reach the top of the search results, beating all competitors for a particular keyword.

A keyword is whatever you type into Google's search box. As you type a keyword, Google displays the search results its users are looking for. People doing SEO want to get traffic from Google by winning certain keywords that can bring visitors to the website being optimized.

Getting traffic from Google is not easy, because Google provides only 7-10 spots on the first page, while the websites offering similar information can number in the millions. So just being able to create a blog or website does not mean you will get traffic from Google.

Why Does Google Always Stand Out in SEO?

Why do we keep singling out Google? As a starting point for learning SEO, you need to know that 68% of searches worldwide are handled by the Google search engine. That is because Google currently has the most developed algorithms and is able to deliver fresher, more relevant results.

Other search engines still weight page age heavily, so new pages have a hard time ranking in their search results. This is what makes Google the favorite of both searchers and website owners.


How do Google and other search engines work?

Google and the other search engines work through three stages:
  • Crawling

  • Indexing

  • Serving search results

For the first two stages, Google and the other search engines rely on what are called robots. For the third, they use a series of algorithms. You need to know what a robot is and what an algorithm is.

What Are Search Engine Robots?

A search engine robot is a program created to open a web page, crawl all of its content, retrieve its data, open all the links on the page, and repeat this process on every web page reached through those links.

The way it works is much like what you do when you open a web page, read it, and then click other links on that page to read more pages. The difference is that Google does it with Googlebot, which runs on a large cluster of computers and works millions of times faster than you could.
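The fetch-parse-follow loop described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not how Googlebot is actually built; the URLs and limits are hypothetical, and only the standard library is used:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag, resolved against a base URL."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

def crawl(start_url, max_pages=10):
    """Breadth-first crawl: fetch a page, extract its links, queue them."""
    seen, queue, pages = set(), deque([start_url]), {}
    while queue and len(pages) < max_pages:
        url = queue.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except OSError:
            continue  # unreachable pages are simply skipped
        parser = LinkExtractor(url)
        parser.feed(html)
        pages[url] = html          # store the page content for indexing
        queue.extend(parser.links) # follow every discovered link
    return pages
```

A real crawler adds politeness delays, robots.txt checks, and deduplication by content, but the core loop — fetch, extract links, repeat — is the same.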

Why Is Crawling Needed?

The point of crawling is for Googlebot to visit every page of a website it can reach, with the goal of collecting data from those pages into Google's index system. The data in this index is what gets served when we perform a search.

Google's index servers: the place where the data of all crawled web pages is stored.

So, to make sure your web pages appear in Google's search results, they must be in the index; and to get into the index, you must make sure your web pages can be crawled by Google.
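To see why the index makes search fast, it helps to look at the simplest possible version of one: an inverted index, which maps each word to the pages that contain it. The sketch below is a toy model with hypothetical page data, nothing like Google's real system in scale or ranking:

```python
from collections import defaultdict

def build_index(pages):
    """Build a tiny inverted index: word -> set of URLs containing it.
    `pages` maps a URL to its plain-text content (hypothetical data)."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.lower().split():
            index[word].add(url)
    return index

def search(index, query):
    """Return the URLs that contain every word of the query."""
    words = query.lower().split()
    if not words:
        return set()
    results = index.get(words[0], set()).copy()
    for word in words[1:]:
        results &= index.get(word, set())  # keep only pages with all words
    return results

pages = {
    "https://example.com/a": "learn seo basics",
    "https://example.com/b": "seo keyword research basics",
}
index = build_index(pages)
```

Answering a query becomes a lookup in this precomputed structure rather than a scan of every page, which is why a page that never gets crawled and indexed can never be served.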

The best way to ensure that is to have many links pointing to your web page from pages that are already indexed. A link from another website to your web page is called a backlink.

Before you chase a lot of backlinks, though, first make sure your web pages can be crawled. Many backlinks are useless if your pages cannot be crawled.
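One quick, programmatic way to check crawlability is to test a URL against the site's robots.txt file, which well-behaved crawlers such as Googlebot consult before fetching a page. A minimal sketch with Python's standard library (the URLs and rules here are hypothetical):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the file's lines directly; against a live site you
# would call rp.set_url("https://example.com/robots.txt") then rp.read().
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

print(rp.can_fetch("Googlebot", "https://example.com/page.html"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # False
```

Note that robots.txt is only one gate: a page can also be blocked by a `noindex` meta tag or be simply unreachable, so a tool-based check (like the one described next) is still worthwhile.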

For this, you can use Google Webmaster Tools, specifically the Fetch as Google feature.

Before doing a fetch, your website must already be connected to Webmaster Tools. Blogspot users need not bother, because Blogspot is already connected to Google's Webmaster Tools. Only websites outside of Google's own products need to be registered with Google Webmaster Tools.

An Overview of Google's Algorithms


This is the most important part of learning SEO. There are currently three algorithms that are the primary focus of search engine optimization:
  • Hummingbird Algorithm

  • Panda Algorithm         

  • Penguin Algorithm         

The Hummingbird algorithm assesses how relevant our web pages are to the keywords we are targeting. The pages Hummingbird rates are those that have already passed the Panda and Penguin assessments.

Panda assesses the freshness and on-page spam elements of a web page, while Penguin rates the characteristics of the backlinks a web page has obtained.
Hope this is useful.
