»WEEZYWAP Forum
* Weezy *
Add your site to Google at http://google.com/webmasters/tools/verification

What is robots.txt? It is a text file, read and interpreted by spiders/crawlers, that sets out the rules they must follow on your site. In robots.txt you refer to a particular spider/crawler by its UA (User Agent) and tell it where it may and may not crawl. The file must be saved as robots.txt in the root directory of your site, and any spider visiting your site must first look at http://yoursite.wapka.mobi/robots.txt before accessing the site.

What is a spider/crawler? Spiders, also called crawlers, are programs sent out by search engines to index pages and return the results to the search engine. Spiders are also used by hackers to harvest email addresses for spamming. Every browser and every spider/crawler has a unique User Agent that it uses for surfing the net.

What is a User Agent? A User Agent is like an identity card that browsers and spiders present while surfing the net so they can be recognised.

I have gone through most Wapka sites and noticed that most of them cannot be indexed by the Google search engine. Why? The reason is the default robots.txt file. Before a crawler/spider crawls a site, it first reads robots.txt to check which areas it may and may not crawl. The default Wapka robots.txt goes like this:

User-agent: Slurp
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 60

User-agent gives the name of the crawler: Slurp is Yahoo, Googlebot is Google, and so on. User-agent: * covers all spiders, while User-agent: Slurp or User-agent: Googlebot refers to one particular spider. Disallow: / means the crawler must not touch or crawl any page of your site, while an empty Disallow: means the spider is free to access every page. So the default block, User-agent: Slurp followed by Disallow: /, tells Yahoo's Slurp bot not to touch the site at all, and the Crawl-delay: 60 slows every other crawler down. That is why Wapka sites show up so poorly in search results. If you want your site to be crawled and be visible on all search engines, then use this:
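You can check these rules yourself with Python's standard-library robots.txt parser. A minimal sketch, using the default Wapka file quoted above (the yoursite.wapka.mobi URL is just a placeholder):

```python
# Parse the default Wapka robots.txt and ask what each crawler may do.
import urllib.robotparser

DEFAULT_WAPKA_ROBOTS = """\
User-agent: Slurp
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 60
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(DEFAULT_WAPKA_ROBOTS.splitlines())
parser.modified()  # mark the file as read so can_fetch() will answer

# Yahoo's Slurp is locked out of every page...
print(parser.can_fetch("Slurp", "http://yoursite.wapka.mobi/index"))      # False
# ...while every other crawler is allowed in, but throttled.
print(parser.can_fetch("Googlebot", "http://yoursite.wapka.mobi/index"))  # True
print(parser.crawl_delay("Googlebot"))                                    # 60
```

This confirms what the rules say: only Slurp is blocked outright, and everyone else is slowed to one request per 60 seconds.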


User-agent: *
Disallow:

Log in to Wapka and select your site. Go to Edit Site > Global Settings > Head Tags (meta, style, ...). Now click Edit > Robots File, and paste this into the textarea:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursite.wapka.mobi/sitemap.xml
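The effect of this recommended file can be verified the same way with Python's standard-library parser; a quick sketch (again, the yoursite.wapka.mobi URLs are placeholders):

```python
# Parse the recommended robots.txt and confirm what it allows and blocks.
import urllib.robotparser

RECOMMENDED_ROBOTS = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursite.wapka.mobi/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(RECOMMENDED_ROBOTS.splitlines())
parser.modified()  # mark the file as read so can_fetch() will answer

# The AdSense crawler may fetch anything, even search pages...
print(parser.can_fetch("Mediapartners-Google", "http://yoursite.wapka.mobi/search"))  # True
# ...ordinary crawlers are kept out of /search but allowed everywhere else.
print(parser.can_fetch("Googlebot", "http://yoursite.wapka.mobi/search"))             # False
print(parser.can_fetch("Googlebot", "http://yoursite.wapka.mobi/index"))              # True
```

So search-result pages stay out of the index while every real content page remains crawlable, which is exactly what you want.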

Change the sitemap URL to your own site. You can modify the file to suit your own needs, but the robots.txt above is the best default for most sites.

Setting up robots.txt on Google: go to Google Webmaster Tools, select your site and open the robots.txt Tester tool. You will see a textarea; paste your site's robots.txt code into it and click test/submit. Your site's robots.txt can be found at http://yoursite.wapka.mobi/robots.txt. A red alert on your robots.txt means the file has an error, and a green alert means the file is OK. That's all!!
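If you want a rough local stand-in for that red/green alert before pasting into the tester, you can flag lines that are not a known robots.txt directive. This is not Google's validator; lint_robots_txt is a hypothetical helper name, and the check below is only a sketch that catches obvious typos like a missing colon:

```python
# Rough robots.txt lint: flag any line that is not blank, a comment,
# or one of the common directives. Only a sanity check, not a full parser.
KNOWN_DIRECTIVES = {"user-agent", "disallow", "allow", "crawl-delay", "sitemap"}

def lint_robots_txt(text):
    """Return a list of (line_number, line) pairs that look like errors."""
    errors = []
    for number, line in enumerate(text.splitlines(), start=1):
        stripped = line.strip()
        if not stripped or stripped.startswith("#"):
            continue  # blank lines and comments are always fine
        field, _, _ = stripped.partition(":")
        if field.strip().lower() not in KNOWN_DIRECTIVES:
            errors.append((number, line))
    return errors

print(lint_robots_txt("User-agent: *\nDisallow:"))  # [] -> "green"
print(lint_robots_txt("User agent *\nDisallow:"))   # flags line 1 -> "red"
```

An empty list plays the role of the green alert; anything returned is a line worth fixing before you submit to the real tester.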
2016-07-02 16:15 · (0)

Copyright © 2024 WEEZYWAP



Last modified: 27-08-2018 11:42:51 pm