Complete technical guide for mastering Black SEO Analyzer's command-line interface and advanced features
1. Download the binary for your platform:
# macOS (Intel)
curl -L https://github.com/sethblack/black-seo-analyzer/releases/latest/download/black-seo-analyzer-darwin-x64.tar.gz | tar xz
# macOS (Apple Silicon)
curl -L https://github.com/sethblack/black-seo-analyzer/releases/latest/download/black-seo-analyzer-darwin-arm64.tar.gz | tar xz
# Linux
curl -L https://github.com/sethblack/black-seo-analyzer/releases/latest/download/black-seo-analyzer-linux-x64.tar.gz | tar xz
# Windows (PowerShell)
Invoke-WebRequest -Uri https://github.com/sethblack/black-seo-analyzer/releases/latest/download/black-seo-analyzer-windows-x64.zip -OutFile black-seo-analyzer.zip
Expand-Archive black-seo-analyzer.zip
2. Make executable (macOS/Linux):
chmod +x black-seo-analyzer
3. Verify installation:
./black-seo-analyzer --version
After purchasing, you'll receive a license key via email. Activate it using:
./black-seo-analyzer --activate YOUR_LICENSE_KEY
The license is stored locally in ~/.black-seo-analyzer/license.key
Note: The free version allows crawling up to 3 pages per site for evaluation.
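Because the license is stored as a plain file, moving it to a new machine is a simple copy; the backup destination below is just an illustration:
# Back up the stored license key (destination path is up to you)
mkdir -p ~/backups
cp ~/.black-seo-analyzer/license.key ~/backups/black-seo-license.key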
The basic command structure follows this pattern:
./black-seo-analyzer [URL] [OPTIONS]
Examples:
# Basic crawl with default settings
./black-seo-analyzer https://example.com
# Crawl with specific output format
./black-seo-analyzer https://example.com --output json
# Interactive mode for guided setup
./black-seo-analyzer --interactive
# Crawl with custom configuration file
./black-seo-analyzer https://example.com --config my-config.toml
# Crawling Controls
--max-pages 1000 # Limit pages to crawl (default: unlimited)
--max-depth 10 # Maximum crawl depth (default: 10)
--concurrent-requests 10 # Parallel requests (default: 10)
--delay 100 # Delay between requests in ms (default: 0)
# Output Options
--output json # Output format: json, jsonl, xml, csv, csv-flat, html-folder, json-files
--output-dir ./reports # Directory for output files (default: current directory)
# JavaScript Rendering
--render-js # Enable JavaScript rendering for all pages
--render-js-auto # Auto-detect when JS rendering is needed (default)
# Authentication
--basic-auth user:pass # HTTP Basic Authentication
--header "Cookie: session=abc" # Custom headers for authentication
Full site audit with HTML report:
./black-seo-analyzer https://example.com --output html-folder --output-dir ./audit-report
Quick technical check (first 50 pages):
./black-seo-analyzer https://example.com --max-pages 50 --output json | jq '.issues'
JavaScript SPA analysis:
./black-seo-analyzer https://spa-site.com --render-js --wait-for-selector "#app"
Competitive analysis with AI insights:
./black-seo-analyzer https://competitor.com --ai-provider openai --ai-key $OPENAI_KEY
Create a config.toml file for reusable configurations:
[crawl]
max_pages = 5000
max_depth = 10
concurrent_requests = 20
delay_ms = 100
user_agent = "BlackSEOAnalyzer/1.0"
[javascript]
render_mode = "auto" # "always", "never", or "auto"
wait_for_selector = "#content"
wait_timeout_ms = 30000
[output]
format = "html-folder"
directory = "./reports"
include_screenshots = true
[ai]
provider = "openai" # "openai", "anthropic", "deepseek", "gemini", "ollama"
model = "gpt-4"
temperature = 0.7
[semantic]
enabled = true
similarity_threshold = 0.95
relevance_threshold = 0.7
embedding_provider = "openai"
[filters]
include_patterns = ["^/blog/", "^/products/"]
exclude_patterns = ["^/admin/", "\\.pdf$"]
respect_robots_txt = true
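Then point any run at the file with the --config flag shown earlier:
# Reuse the saved configuration
./black-seo-analyzer https://example.com --config config.toml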
Set API keys and sensitive data via environment variables:
# AI Provider Keys
export OPENAI_API_KEY="sk-..."
export ANTHROPIC_API_KEY="sk-ant-..."
export DEEPSEEK_API_KEY="..."
export GEMINI_API_KEY="..."
# Proxy Configuration
export HTTP_PROXY="http://proxy.company.com:8080"
export HTTPS_PROXY="http://proxy.company.com:8080"
# Chrome/Chromium Path (if not in PATH)
export CHROME_PATH="/Applications/Google Chrome.app/Contents/MacOS/Google Chrome"
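If you prefer keeping these in a local .env file rather than your shell profile (a common pattern; the filename is your choice), you can export them all in one step before running:
# Export every variable defined in .env into the current shell
set -a
source .env
set +a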
Customize HTML report templates by creating a templates/ directory:
templates/
├── index.html.hbs # Main report page
├── page.html.hbs # Individual page reports
├── issues.html.hbs # Issues summary
└── assets/
├── style.css # Custom CSS
└── logo.png # Your branding
Use with:
./black-seo-analyzer https://example.com --output html-folder --template-dir ./templates
Provider | Models | Best For |
---|---|---|
OpenAI | GPT-4, GPT-3.5 | General analysis, content optimization |
Anthropic | Claude 3 Opus, Sonnet | Deep technical analysis, code review |
DeepSeek | DeepSeek-V2 | Cost-effective bulk analysis |
Gemini | Gemini Pro | Multimodal analysis with screenshots |
Ollama | Llama, Mistral, etc. | Local/private analysis |
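For fully local analysis, pass the documented ollama provider; this sketch assumes an Ollama instance is already running on your machine:
# Local/private analysis with no cloud API key required
./black-seo-analyzer https://example.com --ai-provider ollama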
Content Analysis:
# Analyze content quality and optimization
./black-seo-analyzer https://example.com \
--ai-provider openai \
--ai-analyze-content \
--ai-content-prompt "Analyze for E-A-T signals and content depth"
Meta Tag Generation:
# Generate optimized meta descriptions
./black-seo-analyzer https://example.com \
--ai-provider anthropic \
--ai-generate-meta \
--ai-meta-prompt "Create compelling meta descriptions under 160 chars"
Competitive Gap Analysis:
# Compare against competitor content
./black-seo-analyzer https://example.com \
--ai-provider gemini \
--ai-competitive-analysis \
--competitor-urls "https://competitor1.com,https://competitor2.com"
Create custom prompts for specific analysis needs:
# prompts.toml
[content_analysis]
prompt = """
Analyze this page content for:
1. Keyword density and semantic relevance
2. Content structure and readability
3. Missing topics based on search intent
4. Internal linking opportunities
5. Schema.org markup recommendations
Page Title: {title}
Page Content: {content}
Current Meta Description: {meta_description}
"""
[technical_review]
prompt = """
Review the technical SEO aspects:
1. Core Web Vitals: {core_web_vitals}
2. JavaScript errors: {js_errors}
3. Render blocking resources: {render_blocking}
4. Security headers: {security_headers}
Provide specific optimization recommendations.
"""
Use custom prompts:
./black-seo-analyzer https://example.com --ai-prompts ./prompts.toml
Configure semantic duplicate detection:
# Find semantically similar content
./black-seo-analyzer https://example.com \
--semantic-analysis \
--semantic-threshold 0.90 \
--semantic-min-words 100
Topic relevance analysis:
# Measure content relevance to site theme
./black-seo-analyzer https://example.com \
--semantic-relevance \
--relevance-threshold 0.7 \
--topic-keywords "machine learning,AI,data science"
Content clustering:
# Group similar content for consolidation
./black-seo-analyzer https://example.com \
--semantic-clustering \
--cluster-threshold 0.85 \
--min-cluster-size 3
Core Web Vitals tracking:
# Detailed performance metrics
./black-seo-analyzer https://example.com \
--performance-audit \
--lighthouse-mode performance \
--cwv-thresholds "LCP:2500,FID:100,CLS:0.1"
Resource optimization analysis:
# Find optimization opportunities
./black-seo-analyzer https://example.com \
--analyze-resources \
--check-compression \
--check-caching \
--check-cdn-usage
Validate all structured data formats:
# Comprehensive schema validation
./black-seo-analyzer https://example.com \
--validate-structured-data \
--schema-types "Product,Article,Organization" \
--check-required-properties
Generate structured data recommendations:
# AI-powered schema suggestions
./black-seo-analyzer https://example.com \
--suggest-structured-data \
--ai-provider openai \
--industry "ecommerce"
Format | Use Case | Example |
---|---|---|
json | API integration, programmatic processing | --output json \| jq '.pages[]' |
jsonl | Streaming, large datasets | --output jsonl \| while read line... |
xml | Enterprise systems, XSLT processing | --output xml --xsl-transform report.xsl |
csv | Spreadsheet analysis, data science | --output csv --csv-delimiter "," |
csv-flat | Simplified metrics, dashboards | --output csv-flat --include-metrics |
html-folder | Visual reports, client deliverables | --output html-folder --include-screenshots |
json-files | Distributed processing, archival | --output json-files --files-per-dir 1000 |
Extract critical issues with jq:
./black-seo-analyzer https://example.com --output json | \
jq '.issues[] | select(.severity == "critical") | {url: .url, issue: .type}'
Generate CSV report for specific metrics:
./black-seo-analyzer https://example.com --output json | \
jq -r '.pages[] | [.url, .title_length, .meta_description_length, .h1_count] | @csv' > metrics.csv
Stream processing for large sites:
./black-seo-analyzer https://example.com --output jsonl | \
while IFS= read -r line; do
echo "$line" | jq -r 'select(.status_code >= 400) | .url'
done > broken-pages.txt
name: SEO Audit
on:
  push:
    branches: [main]
  schedule:
    - cron: '0 0 * * 0' # Weekly
jobs:
  seo-audit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Download Black SEO Analyzer
        run: |
          curl -L https://github.com/sethblack/black-seo-analyzer/releases/latest/download/black-seo-analyzer-linux-x64.tar.gz | tar xz
          chmod +x black-seo-analyzer
      - name: Run SEO Audit
        env:
          LICENSE_KEY: ${{ secrets.BLACK_SEO_LICENSE }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
        run: |
          ./black-seo-analyzer --activate $LICENSE_KEY
          ./black-seo-analyzer https://staging.example.com \
            --output json \
            --max-pages 100 \
            --ai-provider openai > audit.json
      - name: Check for Critical Issues
        run: |
          CRITICAL_COUNT=$(jq '[.issues[] | select(.severity == "critical")] | length' audit.json)
          if [ $CRITICAL_COUNT -gt 0 ]; then
            echo "Found $CRITICAL_COUNT critical SEO issues!"
            exit 1
          fi
      - name: Upload Report
        uses: actions/upload-artifact@v3
        with:
          name: seo-audit-report
          path: audit.json
Create a monitoring script for regular checks:
#!/bin/bash
# seo-monitor.sh
SITES=(
  "https://example.com"
  "https://blog.example.com"
  "https://shop.example.com"
)
WEBHOOK_URL="https://hooks.slack.com/services/YOUR/WEBHOOK/URL"

for site in "${SITES[@]}"; do
  echo "Auditing $site..."
  REPORT="audit-$(date +%Y%m%d)-${site#https://}.json"
  ./black-seo-analyzer "$site" \
    --output json \
    --max-pages 50 \
    --config monitoring.toml > "$REPORT"

  # Count issues in this site's report only (a glob like audit-*.json
  # would feed every previous report to jq, not just the current one)
  CRITICAL=$(jq '[.issues[] | select(.severity == "critical")] | length' "$REPORT")
  HIGH=$(jq '[.issues[] | select(.severity == "high")] | length' "$REPORT")

  if [ "$CRITICAL" -gt 0 ] || [ "$HIGH" -gt 5 ]; then
    curl -X POST "$WEBHOOK_URL" \
      -H 'Content-Type: application/json' \
      -d "{\"text\":\"⚠️ SEO Alert for $site: $CRITICAL critical, $HIGH high priority issues found\"}"
  fi
done
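To run the monitor unattended, a standard cron entry is enough; the script and log paths here are placeholders:
# crontab -e: audit every Monday at 06:00, appending output to a log
0 6 * * 1 /opt/seo/seo-monitor.sh >> /var/log/seo-monitor.log 2>&1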
Run Black SEO Analyzer in a container:
# Dockerfile
FROM debian:bullseye-slim
RUN apt-get update && apt-get install -y \
curl \
jq \
chromium \
&& rm -rf /var/lib/apt/lists/*
WORKDIR /app
RUN curl -L https://github.com/sethblack/black-seo-analyzer/releases/latest/download/black-seo-analyzer-linux-x64.tar.gz | tar xz
ENV CHROME_PATH=/usr/bin/chromium
ENTRYPOINT ["./black-seo-analyzer"]
Build and run:
# Build image
docker build -t black-seo-analyzer .
# Run analysis
docker run --rm \
-v $(pwd)/reports:/app/reports \
-e LICENSE_KEY=$BLACK_SEO_LICENSE \
black-seo-analyzer https://example.com --output html-folder --output-dir /app/reports
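The container can also reuse a host-side config.toml by mounting it into the image's /app working directory; a sketch built on the documented --config flag:
# Mount a local config file and reports directory into the container
docker run --rm \
  -v $(pwd)/config.toml:/app/config.toml \
  -v $(pwd)/reports:/app/reports \
  black-seo-analyzer https://example.com --config config.toml --output html-folder --output-dir /app/reports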
Chrome/Chromium not found:
# Set Chrome path explicitly
export CHROME_PATH="/path/to/chrome"
# Or disable JavaScript rendering
./black-seo-analyzer https://example.com --no-render-js
Rate limiting errors:
# Add delay between requests
./black-seo-analyzer https://example.com --delay 1000 --concurrent-requests 2
Memory issues on large sites:
# Use streaming output format
./black-seo-analyzer https://large-site.com --output jsonl --max-pages 10000
SSL certificate errors:
# Skip SSL verification (development only!)
./black-seo-analyzer https://dev.example.com --insecure
Enable verbose logging for troubleshooting:
# Maximum verbosity
./black-seo-analyzer https://example.com -vvv
# Log to file
./black-seo-analyzer https://example.com -vv 2> debug.log
# Debug specific components
./black-seo-analyzer https://example.com \
--debug-crawler \
--debug-renderer \
--debug-ai
Optimize for speed:
# Maximum performance (be respectful!)
./black-seo-analyzer https://example.com \
--concurrent-requests 50 \
--no-render-js \
--skip-resources "images,videos,fonts"
Optimize for accuracy:
# Thorough analysis
./black-seo-analyzer https://example.com \
--render-js \
--wait-for-network-idle \
--screenshot-all \
--concurrent-requests 5
Join the technical SEO professionals who own their tools and control their analysis. Expand your knowledge and connect with the community.