



About fayeseom

  • Rank
    New Money Maker
  1. Google Penalty: A Google penalty is a negative signal for a website. Google penalises a website when it breaks the rules and policies Google has defined, such as: (1) using duplicate content in posts, (2) keyword stuffing in the content, (3) excessively high keyword density, and (4) building backlinks from low-quality websites. There can be more reasons as well.
  2. Heading tags are indicators used in HTML to help structure your webpage from an SEO point of view, as well as helping Google to read your piece of content. Heading tags range from H1-H6 and form a hierarchical structure to your page. Heading tag 1 is the most important header in Google’s eyes and forms the title of the page. Identifying the title with an H1 tag will help Google quickly know what the content is about.
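The hierarchy described above can be sketched in HTML like this (the page topic and section names are made-up placeholders, just to show the structure):

```html
<!-- H1: one per page; tells Google what the whole page is about -->
<h1>Complete Guide to On-Page SEO</h1>

<!-- H2: major sections under the main topic -->
<h2>Why Heading Tags Matter</h2>

<!-- H3: subsections within an H2 section -->
<h3>How Google Reads the H1</h3>
```

Lower levels (H4-H6) continue the same nesting pattern for finer subsections.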
  3. 1. Understand what makes a good keyword or phrase 2. Think like your customers 3. Tie it all together 4. Be specific and targeted 5. List different variations 6. Use Google's keyword tool to get ideas 7. Language and location targeting 8. Understand keyword matching options 9. Remove "negative" keywords
  4. Google Sandbox is generally used for websites which are newly launched and then arouse the suspicion of Google, usually by adding a large amount of new content in a very short period of time. This appears to Google as spam; and the Sandbox is essentially a spam filter which watches for suspicious activity, reasoning that a site which adds enormous numbers of new posts, pages and content instantly is probably filling itself with duplicate content or engaging in search engine spamming.
  5. Backlinks are links from outside domains that point to pages on your domain; essentially, another site linking back from their domain to yours. Taken together, the backlinks from external sites (also known as referring domains) determine the overall strength, relevance and diversity of your domain’s backlink profile.
  6. Use these steps to discover keywords from competitor domains: Access the keyword tool and log in. Input your competitor's web address. Review the listings, and scroll down to view all and see the collection of keywords including ad group ideas.
  7. Do SEO and try to build niche-related links. For an immediate way to get visitors, try social media like Facebook, Twitter and Instagram.
  8. Search engine optimization (SEO) is the art and science of publishing and marketing information that ranks well in search engines like Google, Yahoo! Search, and Microsoft Bing. ... By default, many search engines show only 10 results per page. Most searchers tend to click on the top few results.
  9. Meta tags are located inside your html’s head area: <head>Meta tags are located here</head> There are three important parts of Meta tags that you can use:
     1. Title – The title tag is the title text that is shown in search engine listings. It is not strictly a meta tag, but it functions like one: <title>Title text here</title>
     2. Description – The meta description tag is where you would want to put your site’s summary. This is where you put what your site is all about and what you are offering people. It should not be too long because the search engines only read up to a certain number of words: <meta name="description" content="This is where you put your site's summary"/>
     3. Keywords – The meta keywords tag is where you put all of the keywords you use on your site. This is basically where you want the words which will take you to the top of the SERPs page to be. Your keywords are important – even if you take away all of the other words, the user should be able to know what your site is all about when they read your keywords.
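Putting the three parts together, a sketch of a head section might look like this (the title, description and keyword values are invented placeholders):

```html
<head>
  <title>Handmade Leather Wallets | Example Shop</title>
  <meta name="description" content="Hand-stitched leather wallets made to order, with free shipping." />
  <meta name="keywords" content="leather wallets, handmade wallets, mens wallets" />
</head>
```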
  10. The robots.txt file is a simple text file read by bots and crawlers to determine how they should crawl the site. The bots that crawl the website are automated, and they check for the robots.txt file before accessing the website. We can specify which crawlers are allowed to crawl the site, which directories should not be crawled, the crawl rate, etc. The robots.txt file is required only when you want to have some content on your site excluded from the search engines. If you don’t want to exclude anything (i.e. you want everything included), then you don’t strictly need a robots.txt file. However, if the file is missing, the server may return a 404 or Permission Denied when crawlers try to access it, which can cause minor issues. Hence, it is always better to have a robots.txt, whether it is blank or contains rules allowing access to everyone.
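As a sketch, a minimal robots.txt covering the directives mentioned above might look like this (the directory names are made up for illustration; note that Crawl-delay is honoured by some crawlers but ignored by others):

```
User-agent: *
Disallow: /private/
Disallow: /tmp/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```

With no Disallow rules at all (or an empty file), everything is allowed by default.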
  11. Basic search engine optimization (SEO) is fundamental. And essential. SEO will help you position your website properly to be found at the most critical points in the buying process or when people need your site.
  12. The first step to make your blog visible in Google searches is to submit your website to Google. Here I found a step-by-step way of submitting your website to Google, Yahoo and Bing: How to submit sitemap to Google, Yahoo, Bing webmaster tools [ step by step guide]. After submitting your site to search engines, you need to do a little SEO on your website. A few tips on white hat SEO: Use keywords in the title and H1 tags. Use main keywords in the first and last 30 words. Keep the article free of grammar errors. Make important keywords bold, italic or underlined. Give proper formatting (headings, subheadings, spacing, etc.). Interlink your content. Share your articles on all social channels. Write quality articles to earn organic dofollow backlinks.
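For reference, the sitemap file you submit through the webmaster tools is a simple XML list of your URLs; a minimal sketch (the URLs and dates are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/first-post</loc>
    <lastmod>2020-01-20</lastmod>
  </url>
</urlset>
```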
  13. "Guest posting” means writing and publishing an article on someone else's website or blog. I offer this on my own site (occasionally) and do it quite a bit on other blogs with audiences that I want to speak to. It's a great way to connect with new readers and get your name out.
  14. Googlebot's crawl process begins with a list of webpage URLs, generated from previous crawl processes and augmented with sitemap data provided by webmasters. As Googlebot visits each of these websites, it detects links on each page and adds them to its list of pages to crawl.
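The crawl process described above is essentially a breadth-first traversal of the link graph. A minimal sketch in Python — the toy `links` dict stands in for real fetched pages and the `get_links` callable for Googlebot's link detection; both are illustrative assumptions, not Google's actual code:

```python
from collections import deque

def crawl(seed_urls, get_links):
    """Breadth-first crawl: start from a seed list, follow discovered links.

    seed_urls -- initial URL list (like Googlebot's list from previous
                 crawls plus sitemap data)
    get_links -- callable returning the links detected on a page
    """
    frontier = deque(seed_urls)      # pages waiting to be crawled
    seen = set(seed_urls)            # avoid re-queueing the same URL
    order = []                       # order in which pages were visited
    while frontier:
        url = frontier.popleft()
        order.append(url)
        for link in get_links(url):  # detect links on this page
            if link not in seen:     # only queue newly discovered pages
                seen.add(link)
                frontier.append(link)
    return order

# Toy link graph standing in for real pages (illustrative only).
links = {
    "/home": ["/about", "/blog"],
    "/about": ["/home"],
    "/blog": ["/blog/post-1"],
    "/blog/post-1": [],
}
print(crawl(["/home"], lambda u: links.get(u, [])))
# prints ['/home', '/about', '/blog', '/blog/post-1']
```

A real crawler would of course fetch pages over HTTP, respect robots.txt, and prioritise its frontier, but the discover-and-queue loop is the core idea.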
  15. An HTML sitemap is a dedicated page on a website that presents the site's contents and navigation as a collection of links. In some cases a link to the HTML sitemap can be found in the footer of almost every page - for more details on the subject, see footer sitemap definition.
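A sketch of what such a page's markup might contain (the section and page names are invented for illustration):

```html
<!-- HTML sitemap: a plain page of links grouped by site section -->
<h1>Sitemap</h1>
<h2>Blog</h2>
<ul>
  <li><a href="/blog/first-post">First Post</a></li>
  <li><a href="/blog/second-post">Second Post</a></li>
</ul>
<h2>About</h2>
<ul>
  <li><a href="/about">About Us</a></li>
  <li><a href="/contact">Contact</a></li>
</ul>
```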