How to Check Redirects in Bulk
Methods for checking hundreds or thousands of URLs for redirects: scripting with cURL, using Screaming Frog, online bulk checkers, and building your own redirect audit workflow.
Checking one redirect at a time works for spot checks. It does not work after a site migration with 500 redirects, or when auditing a site with years of accumulated redirect rules. You need bulk checking.
Here is how to verify hundreds or thousands of redirects efficiently — from simple scripts to full crawling workflows.
Method 1: cURL Script (Free, Fast, Flexible)
cURL is the best tool for bulk redirect checking when you already have a list of URLs. No account needed, no rate limits (except what the target server imposes), and full control over output format.
Basic Bulk Check
Create a file called urls.txt with one URL per line, then run:
while IFS= read -r url; do
  result=$(curl -sIL -o /dev/null -w "%{http_code}\t%{num_redirects}\t%{url_effective}" --max-time 10 "$url")
  printf '%s\t%s\n' "$url" "$result"
done < urls.txt > results.tsv
This produces a TSV file with columns: original URL, final status code, number of redirects, and final URL. Note that -I sends HEAD requests; a few servers answer HEAD differently than GET, so if results look wrong, drop the -I flag and let curl issue full GET requests.
With Full Redirect Chain Detail
When you need to see every hop, not just the final destination:
check_url() {
  local url="$1"
  local chain
  # Strip carriage returns so the grep matches headers from servers that send CRLF
  chain=$(curl -sIL "$url" --max-time 15 2>&1 | tr -d '\r' | grep -iE "^(HTTP/|location:)" | tr '\n' ' ')
  echo "$url | $chain"
}

while IFS= read -r url; do
  check_url "$url"
done < urls.txt > chain_report.txt
Parallel Execution for Speed
Sequential checking is slow. Use xargs for parallel requests:
xargs -P 20 -I {} bash -c '
  url="$1"
  result=$(curl -sIL -o /dev/null -w "%{http_code}\t%{num_redirects}\t%{url_effective}" --max-time 10 "$url")
  printf "%s\t%s\n" "$url" "$result"
' _ {} < urls.txt > results.tsv
-P 20 runs 20 requests concurrently. Adjust based on your connection speed and the target server's capacity.
Be respectful
Hammering a server with too many concurrent requests can trigger rate limiting or even get your IP blocked. For external sites, keep parallel requests under 10. For your own server, test what it can handle.
Validate Against Expected Destinations
The most useful bulk check: compare actual results against your redirect map.
# redirect_map.csv format: source_url,expected_destination,expected_code
{
  echo -e "source\texpected_dest\texpected_code\tactual_dest\tactual_code\tstatus"
  while IFS=, read -r source expected_dest expected_code; do
    # One curl call fetches both fields; two separate calls could disagree
    result=$(curl -sIL -o /dev/null -w "%{http_code} %{url_effective}" --max-time 10 "$source")
    actual_code=${result%% *}
    actual_dest=${result#* }
    if [ "$actual_dest" = "$expected_dest" ] && [ "$actual_code" = "$expected_code" ]; then
      status="PASS"
    else
      status="FAIL"
    fi
    echo -e "$source\t$expected_dest\t$expected_code\t$actual_dest\t$actual_code\t$status"
  done < redirect_map.csv
} > validation_report.tsv
Filter for failures:
grep "FAIL" validation_report.tsv
Method 2: Screaming Frog
Screaming Frog SEO Spider is the standard desktop tool for bulk redirect auditing. The free version crawls up to 500 URLs.
Switch to List Mode
Go to Mode > List. This lets you upload a URL list instead of crawling a site.
Upload your URL list
Click Upload > Paste or Upload > From File. Paste or load your list of URLs.
Start the crawl
Click Start. Screaming Frog fetches each URL and records the response.
Filter and export results
In the main view, filter by Response Codes > Redirection (3xx) to see all redirects. Go to Reports > Redirect Chains for chain analysis. Export to CSV for further processing.
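If you script against the exported CSV, a short filter can pull out the chains worth flattening. The column names here ("Address", "Final Address", "Number of Redirects") are assumptions based on a typical Screaming Frog export; match them to your actual header row.

```python
import csv
import io

def find_chains(csv_text, hop_column="Number of Redirects", min_hops=2):
    """Return rows from a redirect export whose hop count meets min_hops.

    Column names are assumptions based on a typical Screaming Frog
    export -- adjust them to match your actual CSV header.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    chains = []
    for row in reader:
        try:
            hops = int(row.get(hop_column, 0))
        except (TypeError, ValueError):
            continue  # skip rows with a blank or malformed hop count
        if hops >= min_hops:
            chains.append(row)
    return chains

# Example with a made-up two-row export:
sample = (
    "Address,Final Address,Number of Redirects\n"
    "https://a.example/old,https://a.example/new,1\n"
    "https://a.example/older,https://a.example/new,3\n"
)
for row in find_chains(sample):
    print(row["Address"], "->", row["Final Address"], f"({row['Number of Redirects']} hops)")
```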
Screaming Frog advantages
- Handles thousands of URLs efficiently
- Detects redirect chains and loops automatically
- Shows canonical URLs, meta robots, and other SEO signals alongside redirect data
- Exports detailed reports in multiple formats
Screaming Frog limitations
- Desktop app — requires installation and local resources
- Free version limited to 500 URLs
- No scheduled/recurring checks (one-time audits only)
- Cannot alert you when redirects change
Method 3: Python Script for Custom Workflows
When you need more control than bash scripts offer — custom logic, database integration, or structured output:
import requests
import csv

def check_redirect(url, timeout=10):
    try:
        response = requests.get(url, allow_redirects=True, timeout=timeout)
        # response.history holds one Response per hop, in order
        chain = []
        for resp in response.history:
            chain.append({
                'url': resp.url,
                'status': resp.status_code,
                'location': resp.headers.get('Location', '')
            })
        return {
            'original': url,
            'final_url': response.url,
            'final_status': response.status_code,
            'num_redirects': len(response.history),
            'chain': chain
        }
    except requests.exceptions.TooManyRedirects:
        return {'original': url, 'error': 'redirect_loop'}
    except requests.exceptions.RequestException as e:
        return {'original': url, 'error': str(e)}

# Read URLs and check
with open('urls.txt') as f:
    urls = [line.strip() for line in f if line.strip()]

with open('results.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(['URL', 'Final URL', 'Status', 'Redirects', 'Error'])
    for url in urls:
        result = check_redirect(url)
        writer.writerow([
            result.get('original'),
            result.get('final_url', ''),
            result.get('final_status', ''),
            result.get('num_redirects', ''),
            result.get('error', '')
        ])
Add concurrent.futures.ThreadPoolExecutor for parallelism:
from concurrent.futures import ThreadPoolExecutor, as_completed

with ThreadPoolExecutor(max_workers=10) as executor:
    futures = {executor.submit(check_redirect, url): url for url in urls}
    for future in as_completed(futures):
        result = future.result()
        print(result.get('original'), '->', result.get('final_url', result.get('error')))
Method 4: Sitemap-Based Audit
If you do not have a URL list, extract one from your sitemap:
# Extract URLs from sitemap.xml and check each one
# (grep -P requires GNU grep; on macOS install ggrep or use a sed equivalent)
curl -s "https://example.com/sitemap.xml" |
  grep -oP '<loc>\K[^<]+' |
  while IFS= read -r url; do
    # One curl call per URL; -w emits all three fields tab-separated
    result=$(curl -sIL -o /dev/null -w "%{http_code}\t%{num_redirects}\t%{url_effective}" --max-time 10 "$url")
    IFS=$'\t' read -r code redirects final <<< "$result"
    if [ "$redirects" -gt 0 ]; then
      echo "REDIRECT: $url -> $final ($code, $redirects hops)"
    fi
  done
Check sitemap index files
Large sites use sitemap index files that reference multiple sitemaps. Check robots.txt for the sitemap URL, then handle nested sitemaps if present.
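A minimal Python sketch for the nested case: parse the XML, detect whether it is an index, and return the <loc> URLs. Fetching each child sitemap is left to your checker, and the function name extract_locs is chosen here for illustration.

```python
import xml.etree.ElementTree as ET

# Sitemaps live in this XML namespace, which ElementTree prefixes onto tags
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_locs(xml_text):
    """Parse sitemap XML and return (is_index, [urls]).

    is_index is True for a <sitemapindex> (the URLs are child sitemaps
    to fetch and parse in turn), False for a plain <urlset>.
    """
    root = ET.fromstring(xml_text)
    is_index = root.tag == f"{NS}sitemapindex"
    urls = [loc.text.strip() for loc in root.iter(f"{NS}loc") if loc.text]
    return is_index, urls

index_xml = """<?xml version="1.0"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-posts.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemap-pages.xml</loc></sitemap>
</sitemapindex>"""
is_index, sitemaps = extract_locs(index_xml)
print(is_index, sitemaps)
```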
Building a Redirect Audit Workflow
A proper redirect audit is more than running a script once. Here is the full workflow:
Collect your URL inventory
Gather URLs from: your sitemap, Google Search Console (exported URLs), your redirect configuration files, and any URL mapping documents from past migrations.
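Those sources overlap heavily, so deduplicate before checking. A sketch of the merge step: the normalization rules here (lowercasing the host, dropping fragments and trailing slashes) are assumptions, so tune them to how your site actually treats URLs.

```python
from urllib.parse import urlsplit, urlunsplit

def normalize(url):
    """Normalize a URL for deduplication: lowercase scheme/host,
    drop the fragment, strip a trailing slash from the path."""
    parts = urlsplit(url.strip())
    path = parts.path.rstrip("/") or "/"
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(), path, parts.query, ""))

def merge_inventories(*sources):
    """Merge URL lists from sitemap, Search Console, redirect maps, etc.,
    deduplicating on the normalized form but keeping the first spelling seen."""
    seen = {}
    for source in sources:
        for url in source:
            seen.setdefault(normalize(url), url)
    return list(seen.values())

sitemap_urls = ["https://Example.com/about/", "https://example.com/pricing"]
gsc_urls = ["https://example.com/about", "https://example.com/blog#top"]
inventory = merge_inventories(sitemap_urls, gsc_urls)
print(inventory)
```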
Run the bulk check
Use any method above to check every URL. Record: original URL, final URL, final status code, number of hops, and any errors.
Categorize the results
Sort into: working correctly (single hop, correct destination), chains (2+ hops, needs flattening), loops (redirect error), broken (4xx/5xx final destination), and unexpected destination.
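The buckets above can be expressed as a small function over the result dicts that the Method 3 Python checker returns (key names follow that script; the thresholds are the ones described here):

```python
def categorize(result, expected_dest=None):
    """Bucket one bulk-check result into the audit categories.

    result keys ('final_status', 'num_redirects', 'final_url', 'error')
    match the Python checker's output; expected_dest comes from your
    redirect map when you have one.
    """
    if result.get("error") == "redirect_loop":
        return "loop"
    if result.get("error"):
        return "broken"
    if result["final_status"] >= 400:
        return "broken"
    if expected_dest and result["final_url"] != expected_dest:
        return "unexpected_destination"
    if result["num_redirects"] >= 2:
        return "chain"
    return "ok"

print(categorize({"final_status": 200, "num_redirects": 3, "final_url": "https://example.com/new"}))
# -> chain
```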
Prioritize fixes
Fix loops first (they are completely broken), then broken destinations (404s), then chains (performance and SEO impact). Use traffic data to prioritize within each category.
Verify after fixing
Re-run the same bulk check to confirm every fix works. Diff the before and after reports.
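A sketch of that diff step, assuming the four-column TSV the cURL script produces (original URL, status, hops, final URL); the function names are illustrative:

```python
def load_report(tsv_text):
    """Parse a results TSV (url, status, hops, final_url) into a dict keyed by URL."""
    report = {}
    for line in tsv_text.strip().splitlines():
        url, status, hops, final = line.split("\t")
        report[url] = (status, hops, final)
    return report

def diff_reports(before, after):
    """Return URLs whose status, hop count, or destination changed between runs."""
    return {
        url: (before[url], after.get(url))
        for url in before
        if after.get(url) != before[url]
    }

# A chain flattened from 2 hops to 1 shows up as a change:
before = load_report("https://example.com/a\t301\t2\thttps://example.com/z")
after = load_report("https://example.com/a\t301\t1\thttps://example.com/z")
print(diff_reports(before, after))
```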
How Often to Run Bulk Checks
| Scenario | Frequency | Scope |
|---|---|---|
| Post-migration | Daily for the first week, then weekly for a month | All migrated URLs |
| Routine maintenance | Monthly | Full sitemap + known redirect URLs |
| After CMS/plugin updates | Immediately after deployment | Critical pages and recent redirects |
| After CDN/DNS changes | Immediately after propagation | Full site check |
One broken redirect is a bug. A hundred broken redirects after a migration is a disaster. Check them all.