[vc_row unlock_row_content=”yes” row_height_percent=”0″ override_padding=”yes” h_padding=”0″ top_padding=”0″ bottom_padding=”0″ back_image=”103507″ back_repeat=”no-repeat” back_position=”center center” overlay_alpha=”0″ gutter_size=”3″ column_width_percent=”0″ shift_y=”0″ z_index=”0″ uncode_shortcode_id=”550980″ back_size=”cover” el_class=”inner-banner-slide amp-banner-section”][vc_column column_width_percent=”100″ gutter_size=”3″ back_image=”103506″ back_repeat=”no-repeat” back_position=”center center” overlay_alpha=”0″ shift_x=”0″ shift_y=”0″ shift_y_down=”0″ z_index=”0″ medium_width=”0″ mobile_width=”0″ width=”1/1″ uncode_shortcode_id=”180080″ back_size=”cover”][vc_row_inner row_inner_height_percent=”0″ overlay_alpha=”0″ gutter_size=”3″ shift_y=”0″ z_index=”0″ uncode_shortcode_id=”164561″ el_class=”home-banner-container” css=”.vc_custom_1641456384377{padding-right: 15px !important;padding-left: 15px !important;}” limit_content=””][vc_column_inner width=”2/3″][vc_custom_heading text_color=”color-xsdn” heading_semantic=”h1″ text_font=”font-555555″ text_size=”fontsize-402681″ uncode_shortcode_id=”189613″ text_color_type=”uncode-palette”]
Disallow
[/vc_custom_heading][vc_column_text uncode_shortcode_id=”649667″ css=”.vc_custom_1656659176986{margin-top: 10px !important;}”][/vc_column_text][/vc_column_inner][vc_column_inner width=”1/3″][/vc_column_inner][/vc_row_inner][/vc_column][/vc_row][vc_section overlay_alpha=”50″ uncode_shortcode_id=”162784″ el_class=”amp-content-section”][vc_row][vc_column width=”1/2″][vc_single_image media=”107518″ media_width_percent=”100″ uncode_shortcode_id=”441257″][/vc_column][vc_column width=”1/2″][vc_column_text uncode_shortcode_id=”177538″]Disallow is a directive that tells search engine robots not to crawl a page or website. In other words, you can use it to make Googlebot not go to certain areas of your website.[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column width=”1/1″][vc_column_text uncode_shortcode_id=”142556″]The Disallow directive is configured in the robots.txt file with the following syntax:
Disallow: [path], where [path] is the beginning of the URL path of the pages that must not be crawled. Taking the examples from definitions-seo, for a site like https://www.example.com/ the directive behaves as follows:
Disallow:
An empty value means no restriction: the entire site may be crawled.
Disallow: /
The entire site is blocked from crawling.
Disallow: /blog
No page whose URL path begins with “/blog” will be crawled (for example https://www.example.com/blog or https://www.example.com/blog/example.php).
Disallow: /*.pdf
No URL containing “.pdf” will be crawled (for example https://www.example.com/contract.pdf, https://www.example.com/blog/document.pdf or https://www.example.com/blog/document.pdf?langue). Note that the * wildcard is an extension supported by Google and Bing; it is not part of the original robots.txt standard.
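Put together, the individual rules above would form a single robots.txt file along these lines (example.com is a placeholder domain, and the grouping shown is illustrative):

```text
# Hypothetical robots.txt for https://www.example.com/
User-agent: *
Disallow: /blog
Disallow: /*.pdf

# A stricter variant that blocks the whole site would instead use:
# User-agent: *
# Disallow: /
```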
However, note that Disallow only blocks crawling, not indexing: a disallowed URL can still be indexed if other pages link to it, which is what causes the “No information is available for this page” message to appear in search results.
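The prefix-matching behavior of Disallow can be checked with Python’s standard-library robots.txt parser. This is a minimal sketch using the hypothetical rules from the examples above; note that urllib.robotparser implements the original standard and does not understand Google’s * and $ wildcard extensions, so only a plain path prefix is tested here:

```python
import urllib.robotparser

# Hypothetical rules mirroring the example above (example.com is a placeholder)
rules = [
    "User-agent: *",
    "Disallow: /blog",
]

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)  # parse() accepts the file's lines as a list of strings

# can_fetch(useragent, url) returns True if the URL may be crawled
print(rp.can_fetch("*", "https://www.example.com/"))                  # True: no rule matches
print(rp.can_fetch("*", "https://www.example.com/blog"))              # False: path starts with /blog
print(rp.can_fetch("*", "https://www.example.com/blog/example.php"))  # False: same prefix
```

Remember that a False result here only means the URL should not be crawled; as noted above, it may still end up indexed if other pages link to it.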
References:
- https://moz.com/learn/seo/robotstxt
- https://developers.google.com/search/docs/advanced/robots/robots_txt
- https://searchfacts.com/robots-txt-allow-disallow-all
Other SEO Glossary Terms
Allow | Disallow | NoIndex | Deindexing[/vc_column_text][/vc_column][/vc_row][vc_row row_height_percent=”0″ back_color=”T-Red-155595″ overlay_alpha=”50″ gutter_size=”3″ column_width_percent=”100″ shift_y=”0″ z_index=”0″ uncode_shortcode_id=”115159″ back_color_type=”uncode-palette”][vc_column width=”1/1″][vc_column_text uncode_shortcode_id=”145216″][loc_shortcode][/vc_column_text][/vc_column][/vc_row][/vc_section][vc_row][vc_column width=”1/1″][vc_custom_heading uncode_shortcode_id=”904629″]Knowledge Base Articles[/vc_custom_heading][uncode_index el_id=”index-209635″ isotope_mode=”fitRows” loop=”size:3|order_by:rand|post_type:post|taxonomy_count:10″ screen_lg=”1000″ screen_md=”600″ screen_sm=”480″ gutter_size=”3″ post_items=”media|featured|onpost|original,title,text|excerpt,sep-one|full” single_overlay_opacity=”50″ single_padding=”2″ uncode_shortcode_id=”885009″][/vc_column][/vc_row]