Changes in processing robots.txt file
Yandex has made some changes concerning search engine optimization.
Until now, an 'Allow' directive with no path specified was treated as a blocking rule, so the search robot skipped the site, and the site was excluded from search results because it was inaccessible to the robot. Some site owners relied on this behavior: to block access and indexing, they deliberately left the directive empty.
Now the robot ignores an empty 'Allow' directive. If you want to block indexing, use the 'Disallow' directive instead, for example 'Disallow: *' or 'Disallow: /'.
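The effect of an explicit 'Disallow' rule can be checked with Python's standard-library robots.txt parser. This is a minimal sketch using a hypothetical robots.txt and example.com as a placeholder domain; it shows that 'Disallow: /' blocks every path for every robot, which is the behavior a blank 'Allow' directive no longer provides.

```python
from urllib import robotparser

# Hypothetical robots.txt: block the whole site for all robots
# with an explicit 'Disallow: /' rather than a blank 'Allow'.
rules = """User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Any page on the site is now off-limits to any robot.
print(parser.can_fetch("Yandex", "https://example.com/page"))  # False
```

The same parser can be pointed at a live file with `set_url()` and `read()` to verify a deployed robots.txt before relying on it.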
All the information you need on using robots.txt for search engine optimization (syntax, supported features, and restrictions) is available on webmaster.yandex.ru.
Foreign Real Estate in Russia: 2023 Trends
In recent years, demand for foreign real estate among the Russian-speaking audience has reached an all-time high.
In this guide you will find an overview of the current state of the market, a portrait of the Russian-speaking buyer, and practical tips on adapting your business and digital resources to this new field.
Contact us: we speak English and are ready to answer all your questions!
Russian SEO in 2023: Trends and Features of Russian Search
Nowadays, good online search visibility is an essential element of a successful business, especially for businesses that operate internationally.
We’ve created a whitepaper covering general SEO trends as well as the specific factors that shape Russian search.