analytics-tracking
Design, audit, and improve analytics tracking systems that produce reliable, decision-ready data. Use when the user wants to set up, fix, or evaluate analytics tracking (GA4, GTM, product analytics, events, conversions, UTMs). This skill focuses on measurement strategy, signal quality, and validation, not just firing events.
Category: Business Analysis
Download and extract to your skills directory
Copy command and send to OpenClaw for auto-install:
Download and install this skill https://openskills.cc/api/download?slug=sickn33-skills-analytics-tracking&locale=en&source=copy
Analytics Tracking - Data Analytics Tracking and Measurement Strategy
Skill Overview
Design, audit, and improve analytics tracking systems so they produce reliable, decision-ready data. The emphasis is on measurement strategy and signal quality: making sure the numbers your business relies on are trustworthy before they inform decisions.
Applicable Scenarios
1. Building or Migrating Analytics Tracking Systems
When you need to build analytics tracking for a website/product from scratch, migrate from Universal Analytics to GA4, or consolidate multiple analytics tools, this skill helps you establish a clear event model, define meaningful conversion metrics, and ensure cross-tool data consistency.
2. Fixing Data Quality Issues
When you find your analytics data unreliable—abnormal conversion numbers, duplicate event firings, messy UTM parameters, failed cross-domain tracking, etc.—this skill provides a systematic diagnostic approach, locates root causes via the Measurement Readiness Index, and offers remediation plans.
3. Assessing and Optimizing Existing Tracking
Before making important decisions based on data (such as adjusting ad spend, redesigning product features, or running growth experiments), you need to verify data reliability. This skill helps you evaluate the signal quality of your current tracking system, identify blind spots and biases, and ensure decisions are based on accurate data.
Core Features
1. Measurement Readiness & Signal Quality Index
Perform a comprehensive 0–100 diagnostic of your analytics system before adding or changing any tracking. The index evaluates six dimensions—decision alignment, event model clarity, data accuracy, conversion definition quality, attribution context, and governance/maintenance—and indicates clearly whether your data is safe to use for decisions or needs fixing first.
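The scoring described above can be sketched as a weighted sum. The six dimension names come from this skill; the specific weights below are assumptions for illustration (they sum to 100, with data accuracy worth 20 points to match the 15/20 threshold mentioned in the FAQ).

```javascript
// Hypothetical Measurement Readiness Index: weighted sum of six dimensions.
// Weights are illustrative assumptions, not the skill's actual rubric.
const MRI_WEIGHTS = {
  decisionAlignment: 15,
  eventModelClarity: 15,
  dataAccuracy: 20,
  conversionDefinitionQuality: 20,
  attributionContext: 15,
  governanceMaintenance: 15,
};

// `ratings` holds a 0.0–1.0 self-assessment for each dimension.
function measurementReadinessIndex(ratings) {
  let total = 0;
  for (const [dimension, weight] of Object.entries(MRI_WEIGHTS)) {
    const r = ratings[dimension] ?? 0; // unrated dimensions score zero
    total += Math.min(Math.max(r, 0), 1) * weight;
  }
  return Math.round(total);
}

// Example: strong event model, but weak accuracy and governance.
const score = measurementReadinessIndex({
  decisionAlignment: 0.8,
  eventModelClarity: 0.9,
  dataAccuracy: 0.5,
  conversionDefinitionQuality: 0.7,
  attributionContext: 0.6,
  governanceMaintenance: 0.4,
});
```

A low score on a single heavily weighted dimension (such as data accuracy) can pull the whole index below a decision-safe threshold, which is the point: one unreliable dimension poisons downstream decisions.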
2. Event Model and Conversion Strategy Design
Following the core principle of "track for decisions," this skill helps you design a clear, non-redundant event taxonomy. It avoids vanity metrics and UI noise, focusing instead on intent signals, completion signals, and meaningful system state changes, and it defines clearly what constitutes a true conversion and how conversions should be counted.
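A taxonomy built on this principle might look like the sketch below. The event names and the `decision` field attached to each are hypothetical examples, not a standard; the point is that every event must name the decision it supports.

```javascript
// Hypothetical minimal event taxonomy for a SaaS signup funnel.
// All names here are illustrative assumptions (snake_case object_action).
const EVENT_TAXONOMY = {
  // Intent signals: the user shows meaningful interest.
  pricing_viewed: { type: "intent", decision: "Which plans attract attention?" },
  trial_started: { type: "intent", decision: "Is the trial CTA working?" },
  // Completion signals: an irreversible step finished.
  signup_completed: { type: "conversion", decision: "Is acquisition spend paying off?" },
  purchase_completed: { type: "conversion", decision: "Which channels drive revenue?" },
  // Meaningful system state change.
  subscription_cancelled: { type: "state_change", decision: "Where does churn start?" },
};

// The skill's test for every proposed event:
// if no decision depends on it, do not track it.
function isJustified(eventName) {
  const entry = EVENT_TAXONOMY[eventName];
  return Boolean(entry && entry.decision);
}
```

Events like `button_hover` or `page_scrolled` would fail `isJustified` here, which is exactly the filter that keeps the taxonomy minimal.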
3. GA4/GTM Implementation and Validation Guidance
Provide concrete implementation recommendations for GA4 and Google Tag Manager: recommend using GA4 standard events, push clean dataLayer events via GTM, avoid multi-container conflicts, and version every publish. Also provide validation methods such as real-time checks, duplicate detection, and cross-browser testing to ensure events fire as expected.
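A minimal sketch of the "clean dataLayer event" pattern is shown below. In the browser this array is `window.dataLayer`; `globalThis` keeps the sketch runnable outside a browser too. The `event_id` field is our own assumption (a client-generated dedupe key), not a GA4-reserved parameter.

```javascript
// Guarded initialization: never clobber a dataLayer the GTM snippet
// may already have created.
globalThis.dataLayer = globalThis.dataLayer || [];

function trackEvent(eventName, params = {}) {
  globalThis.dataLayer.push({
    event: eventName, // the key a GTM custom-event trigger matches on
    // Illustrative dedupe key (assumption, not a GA4 parameter):
    event_id: `${Date.now()}-${Math.random().toString(36).slice(2)}`,
    ...params,
  });
}

// Prefer GA4 standard event names where one fits the action:
trackEvent("sign_up", { method: "email" });
```

Keeping every push flat and predictable like this (one `event` key, a small set of documented parameters) is what makes GTM triggers easy to audit and makes multi-container conflicts visible.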
Frequently Asked Questions
How do I verify that my GA4 tracking data is accurate?
Don't trust the numbers in GA4 reports until they have been validated. Validation methods provided by this skill include: using GA4 Realtime reports to confirm events fire immediately; using DebugView to check that event parameters are complete; testing across multiple browsers and devices; checking for duplicate firings; and verifying that UTM parameters are passed correctly. If the data accuracy dimension of your Measurement Readiness Index scores below 15/20, fix those issues before relying on the data for decisions.
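The duplicate-firing check mentioned above can be automated against an exported event log. This is a sketch under assumed field names (`name`, `clientId`, `ts`): two identical events from the same client within a short window are flagged as duplicates.

```javascript
// Flag likely duplicate firings in an exported event log.
// Field names (name, clientId, ts in ms) are assumptions about the export.
function findDuplicates(events, windowMs = 1000) {
  const lastSeen = new Map();
  const duplicates = [];
  for (const e of events) {
    const key = `${e.name}:${e.clientId}`;
    const prev = lastSeen.get(key);
    if (prev !== undefined && e.ts - prev <= windowMs) duplicates.push(e);
    lastSeen.set(key, e.ts);
  }
  return duplicates;
}

const log = [
  { name: "purchase_completed", clientId: "c1", ts: 0 },
  { name: "purchase_completed", clientId: "c1", ts: 200 }, // double fire
  { name: "purchase_completed", clientId: "c2", ts: 300 }, // different client: fine
];
const dupes = findDuplicates(log);
```

A tag firing twice (for example, a trigger attached in two GTM containers) shows up here as a near-zero gap between identical events, which is usually invisible in aggregate reports but inflates conversion counts.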
What kind of conversion definitions are meaningful?
A true conversion must represent real value, a completed intent, and an irreversible progression. Examples of valid conversions include "signup_completed," "purchase_completed," and "demo_booked." Page views, button clicks, and form starts are not conversions—they are process metrics. This skill helps you clarify the counting rule for each conversion (counted once per session vs. on every occurrence) and ensures all tools use a consistent definition for the same conversion.
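The two counting rules can be made concrete with a short sketch. Field names here (`name`, `sessionId`) and the rule labels are illustrative assumptions:

```javascript
// "oncePerSession" dedupes a conversion by session id;
// "everyOccurrence" counts each firing.
function countConversions(events, name, rule = "oncePerSession") {
  const hits = events.filter((e) => e.name === name);
  if (rule === "everyOccurrence") return hits.length;
  return new Set(hits.map((e) => e.sessionId)).size;
}

const events = [
  { name: "signup_completed", sessionId: "s1" },
  { name: "signup_completed", sessionId: "s1" }, // same session, counted once
  { name: "signup_completed", sessionId: "s2" },
];
```

If one tool counts once per session and another counts every occurrence, the same funnel will report different conversion totals, which is why the skill insists on a single documented rule per conversion across all tools.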
Why can't we "track everything"?
Event proliferation creates a lot of noise, making data hard to interpret and maintain. The core principle of this skill is: if no decision depends on an event, don't track it. We recommend starting from business questions, reverse-engineering which signals are needed, and then designing a minimal event set. Fewer but accurate events are far more valuable than many unreliable ones. Before adding any tracking, ask: what decision will this data support? If I don't track this, what critical insight will I lose?