5 LLMs monitored
6 perception metrics
30+ Academy lessons
EU AI Act ready
Core Feature

How do you measure AI perception improvement over time?

Perception audits are snapshots. Analytics shows you the full picture—how your AI visibility evolves, where you're gaining ground, and what needs attention.

VectorGap Analytics transforms perception audit data into actionable insights about your AI visibility over time. While individual audits show what AI says about you today, analytics reveals patterns and trends: Is your perception improving? Which AI providers are you gaining ground on? Are your GEO optimization efforts working? The platform tracks four core metrics—VPS (composite perception score), SOV (share of voice vs competitors), HAL (hallucination rate), and REC (recommendation rate)—and presents them through trend visualizations, provider comparisons, and automated reports. With scheduled alerts for significant changes and export options for stakeholder presentations, analytics turns raw perception data into strategic intelligence that guides your AI visibility efforts.

What metrics does VectorGap Analytics track?

Not vanity metrics. Real indicators of how AI perceives your brand.

VPS

VectorGap Perception Score

A composite score from 0-100 that combines accuracy, sentiment, visibility, coverage, credibility, and recommendation across all AI providers. One number that captures your overall AI presence.

SOV

Share of Voice

How often you appear vs. competitors when AI answers relevant queries. If someone asks about "project management tools" and you show up 3 times out of 10, your SOV is 30%.
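The share-of-voice arithmetic above is simple division; a minimal sketch (the function name is illustrative, not part of any VectorGap API):

```python
def share_of_voice(appearances: int, total_responses: int) -> float:
    """Percentage of relevant AI responses in which your brand appears."""
    if total_responses == 0:
        return 0.0
    return appearances / total_responses * 100

# Appearing in 3 of 10 responses gives an SOV of 30%.
print(share_of_voice(3, 10))  # 30.0
```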

HAL

Hallucination Rate

Percentage of AI responses that contain factual errors about your brand. Lower is better. Industry average is around 35%. Top performers get below 10%.

REC

Recommendation Rate

How often AI actively recommends your product (not just mentions it) when answering relevant queries. The metric that correlates most directly with AI-driven conversions.

How does trend analysis reveal perception patterns?

See how your perception changes over time. Correlate movements with your actions.

[Perception score trend visualization: +12% VPS this quarter · -8% hallucination rate · +5 SOV rank positions]

What reporting and alert options are available?

Scheduled reports

Set up weekly or monthly reports delivered to your inbox or Slack. Include the metrics that matter to your team. No manual export needed.

Smart alerts

Get notified when something significant changes. Perception score drops? New hallucination detected? Competitor gained ground? You'll know immediately.

Export anything

Export to PDF for stakeholder presentations. CSV for your own analysis. JSON via API for custom dashboards. Your data, your way.

Provider comparison

Side-by-side breakdown of how each AI provider perceives you. ChatGPT vs. Claude vs. Gemini vs. Perplexity. Find provider-specific issues.

What does the analytics dashboard look like?

Everything in one place. No hunting through different reports.

[Dashboard preview: VPS Score 72 · Share of Voice 23% · Hallucination 12% · Recommendation 45% · perception trend chart · provider comparison]

Frequently Asked Questions about Analytics & Reporting

What metrics does VectorGap Analytics track?

VectorGap Analytics tracks four core metrics: VPS (VectorGap Perception Score) is a composite 0-100 score combining accuracy, sentiment, visibility, coverage, credibility, and recommendation across all AI providers. SOV (Share of Voice) measures how often you appear vs competitors in AI responses to relevant queries. HAL (Hallucination Rate) tracks the percentage of AI responses containing factual errors about your brand. REC (Recommendation Rate) measures how often AI actively recommends your product, not just mentions it.

How does VectorGap calculate the Perception Score (VPS)?

The VPS is calculated by running perception audits across ChatGPT, Claude, Gemini, and Perplexity, then scoring each response on six dimensions: Accuracy (fact-checked against your knowledge base), Sentiment (positive/neutral/negative tone), Visibility (position in response), Coverage (mention of key features), Credibility (source citations), and Recommendation (active endorsement). These dimensions are weighted and averaged across providers to produce a single 0-100 score that represents your overall AI perception health.
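A minimal sketch of how such a weighted composite could work. The dimension weights and data below are illustrative assumptions for demonstration, not VectorGap's published formula:

```python
from statistics import mean

# Illustrative weights per dimension (assumed; actual weights are not public).
WEIGHTS = {
    "accuracy": 0.25,
    "sentiment": 0.15,
    "visibility": 0.15,
    "coverage": 0.15,
    "credibility": 0.10,
    "recommendation": 0.20,
}

def provider_score(dimensions: dict[str, float]) -> float:
    """Weighted 0-100 score for one provider, given per-dimension scores on 0-100."""
    return sum(WEIGHTS[d] * dimensions[d] for d in WEIGHTS)

def vps(per_provider: dict[str, dict[str, float]]) -> float:
    """Average the weighted provider scores into a single 0-100 VPS."""
    return mean(provider_score(dims) for dims in per_provider.values())

# Hypothetical audit results for two providers.
audit = {
    "ChatGPT": {"accuracy": 80, "sentiment": 70, "visibility": 60,
                "coverage": 75, "credibility": 65, "recommendation": 50},
    "Claude":  {"accuracy": 90, "sentiment": 75, "visibility": 70,
                "coverage": 80, "credibility": 70, "recommendation": 60},
}
print(round(vps(audit), 2))  # 71.25
```

Averaging per-provider scores (rather than pooling raw responses) keeps each provider's weight equal regardless of how many queries were audited against it.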

What types of reports and alerts are available?

VectorGap offers scheduled reports (weekly or monthly) delivered via email or Slack, containing the metrics your team cares about. Smart alerts notify you immediately when significant changes occur—perception score drops, new hallucinations detected, or competitors gaining ground. You can export data in multiple formats: PDF for stakeholder presentations, CSV for custom analysis, and JSON via API for integration with your own dashboards. Provider comparison reports show side-by-side breakdowns of how each AI perceives you.

How can I use analytics to improve my GEO strategy?

Analytics reveals patterns that guide your GEO optimization efforts. Trend analysis shows whether your content updates are improving perception scores. Provider comparison identifies AI platforms where you're weak (e.g., strong on ChatGPT but invisible on Claude). SOV tracking shows competitive positioning over time. Hallucination rate changes indicate whether your knowledge base corrections are working. By correlating metric movements with your actions, you can identify what works and double down on effective strategies.

Beyond Metrics

Scores are just the start—diagnostics show WHY

Analytics tells you your VPS is 72. Diagnostics tells you it's low because of a missing Wikidata entry, a weak Reddit presence, and inconsistent entity data. Actionable insights, not just dashboards.

Learn About Diagnostics

Ready to start measuring what matters?

Your first audit is free. See your baseline metrics today.

Start Free