#
# This file contains rules to prevent the crawling and indexing of certain
# parts of your web site by spiders run by major search engines such as
# Google and Yahoo. By managing these rules you can allow or disallow
# access to specific folders and files for such spiders. This can save
# bandwidth, but note that robots.txt is publicly readable and only
# advisory, so it is not a way to hide private data.
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

User-agent: *
Disallow: /ow_version.xml
Disallow: /INSTALL.txt
Disallow: /LICENSE.txt
Disallow: /README.txt
Disallow: /UPDATE.txt
Disallow: /CHANGELOG.txt
Disallow: /admin/
Disallow: /user-credits/
Disallow: /mobile-version
Disallow: /users_map
Disallow: /refund-policy
Disallow: /user/
Disallow: /newsfeed/
Disallow: /forum/
Disallow: /photo/
Disallow: /users/
Disallow: /m/

Sitemap: https://www.updesh.ai/sitemap.xml