Crawler: ExclusionPatterns

Type: string[]
Parameter syntax
exclusionPatterns: [
  'http://www.excluded.com',
  '!http://www.excluded2.com', // any micromatch pattern, including negation
  ...
]

About this parameter

Tells the crawler which URLs to ignore or exclude.

Each pattern in this list is matched against page URLs using micromatch, so you can use negation, wildcards, and other glob features.
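As a rough illustration of the matching semantics, here is a simplified sketch. The real crawler relies on the micromatch library, which supports a much richer syntax; this sketch only handles `*`, `**`, and `!` negation, and assumes a URL is excluded when it matches at least one positive pattern and no negated pattern.

```javascript
// Simplified sketch of exclusion-pattern evaluation (illustrative only;
// the actual crawler uses micromatch, which supports far more syntax).

// Convert a glob pattern with `*` and `**` into a RegExp.
function globToRegExp(pattern) {
  const escaped = pattern
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&') // escape regex metacharacters
    .replace(/\*\*/g, '\u0000')            // placeholder for `**`
    .replace(/\*/g, '[^/]*')               // `*` matches within one path segment
    .replace(/\u0000/g, '.*');             // `**` matches across segments
  return new RegExp(`^${escaped}$`);
}

// A URL is excluded if it matches a positive pattern
// and does not match any `!`-negated pattern.
function isExcluded(url, patterns) {
  const positives = patterns.filter((p) => !p.startsWith('!'));
  const negatives = patterns
    .filter((p) => p.startsWith('!'))
    .map((p) => p.slice(1));
  return (
    positives.some((p) => globToRegExp(p).test(url)) &&
    !negatives.some((p) => globToRegExp(p).test(url))
  );
}

const patterns = [
  'http://www.google.com/about/**',
  '!http://www.google.com/about/ny',
  '**.html',
];

console.log(isExcluded('http://www.google.com/about/jobs', patterns)); // true
console.log(isExcluded('http://www.google.com/about/ny', patterns));   // false
```

This mirrors the example below: everything under `/about` is excluded, the negated `/about/ny` is allowed through, and any `.html` URL is excluded.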

Examples

{
  exclusionPatterns: [
    'http://www.google.com/about/**', // exclude every page under `/about`...
    '!http://www.google.com/about/ny', // ...except the `/about/ny` page
    '**.html', // also exclude every page ending in `.html`
  ]
}