Please give me the code I can use to create a page, design it, and add things to it.
Please can you help me add my site to Google? I'm on my knees! The name of my site is www.danthommies.wapka.mobi. I've tried to add it to Google so many times but it isn't working. Thanks!
Add your site to Google at http://google.com/webmasters/tools/verification
What is robots.txt?
robots.txt is a plain text file written for, and interpreted by, spiders/crawlers, which are expected to abide by the rules given in it. In robots.txt you refer to a particular spider/crawler by its User Agent (UA) and direct it where it may and may not crawl on your site. The file must be saved in the site's root directory under the name robots.txt, and any spider visiting your site is supposed to look for http://yoursite.wapka.mobi/robots.txt before accessing the site.
What is a Spider/Crawler?
Spiders, also called crawlers, are programs sent by search engines to index pages and return what they find to the search engine. Spiders are also used by hackers to harvest email addresses for spamming. Every browser and spider/crawler has a unique User Agent that it uses while surfing the net.
What is a User Agent?
User Agents are like an identity card that browsers and spiders present while surfing the net so they can be recognised. I have gone through most Wapka sites and noticed that most of them cannot be indexed by the Google search engine. WHY?
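To see a User Agent concretely, here is a small sketch using Python's standard urllib. The URL and the UA string below are just made-up examples; the point is that every request carries an identity header:

```python
import urllib.request

# Build a request that identifies itself with a custom User Agent.
# (Both the URL and the UA string are placeholder examples.)
req = urllib.request.Request(
    "http://yoursite.wapka.mobi/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; ExampleBot/1.0)"},
)

# urllib stores the identity it will send along with the request:
print(req.get_header("User-agent"))  # Mozilla/5.0 (compatible; ExampleBot/1.0)
```

This identity string is exactly what a robots.txt file matches against when it singles out a particular crawler.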
The reason is the default robots.txt file. Before crawling a site, every crawler/spider first fetches robots.txt and checks which areas it must and must not crawl. The default Wapka robots.txt goes like this:
User-agent: Slurp
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 60
User-agent names the crawler: Slurp is Yahoo, Googlebot is Google, and so on. User-agent: * means all spiders, while User-agent: Slurp or User-agent: Googlebot refers to one particular spider. Disallow: / means the crawler must not touch or crawl any page on your site, and an empty Disallow: means the spider is free to access every page. So in the default file, User-agent: Slurp followed by Disallow: / blocks Yahoo's crawler (Slurp, not Googlebot) from the whole site, and Crawl-delay: 60 forces every other crawler to wait 60 seconds between requests, which is why Wapka sites index so poorly in search. If you want your site to be crawled and visible on all search engines, then use this.
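Before applying the fix, you can verify what the default rules actually do with Python's standard urllib.robotparser. A minimal sketch (the wapka.mobi URLs are placeholders):

```python
from urllib import robotparser

# Wapka's default robots.txt, reproduced as a string.
default_rules = """\
User-agent: Slurp
Disallow: /

User-agent: *
Disallow:
Crawl-delay: 60
"""

rp = robotparser.RobotFileParser()
rp.parse(default_rules.splitlines())

# Yahoo's Slurp is shut out of the whole site...
print(rp.can_fetch("Slurp", "http://yoursite.wapka.mobi/index"))      # False
# ...while every other crawler may fetch pages, just slowly.
print(rp.can_fetch("Googlebot", "http://yoursite.wapka.mobi/index"))  # True
```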
Login to Wapka and select your site. Edit Site > Global Settings > Head Tags (meta, style, ...). Now click Edit > Robots File, and in the textarea paste this:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursite.wapka.mobi/sitemap.xml
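You can check this suggested file the same way with urllib.robotparser before uploading it; a quick sketch (again, the wapka.mobi URLs are placeholders):

```python
from urllib import robotparser

# The suggested robots.txt from above, as a string.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yoursite.wapka.mobi/sitemap.xml
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ordinary crawlers may index pages but must skip /search...
print(rp.can_fetch("Googlebot", "http://yoursite.wapka.mobi/page"))    # True
print(rp.can_fetch("Googlebot", "http://yoursite.wapka.mobi/search"))  # False
# ...while AdSense's crawler (Mediapartners-Google) may fetch everything.
print(rp.can_fetch("Mediapartners-Google", "http://yoursite.wapka.mobi/search"))  # True
```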
Change the site URL to your own. You can modify the file to your own taste, but the version above works well for a typical Wapka site.
Setting up robots.txt on Google: go to Google Webmaster Tools, select your site, and open the robots.txt tester tool. You will see a textarea; paste your site's robots.txt code into it and click test/submit. Your site's live robots.txt can be found at http://yoursite.wapka.mobi/robots.txt. A red alert on your robots.txt means the file has an error; a green alert means the file is OK. That's all!!
Wow! I'd forgotten about this thread; I was just passing by and sighted it. Weezy
Please, do Adloft, BuzzCity, etc. really pay people?
I'm using BuzzCity, and it's paying!
Wow! How can I register with them, and how do I work with them so that they pay me?
Register at http://buzzcity.com as a publisher, then copy their ad code and paste it on your site. You earn when users click on the ads, and your payment is processed when you reach the minimum payout of $200.