Modern data modeling and analytics for data-driven teams
Looker is a cloud-first Business Intelligence and data modeling platform that centralizes metric definitions in LookML. It is aimed at data teams and product analysts who need governed, reusable metrics and embedded analytics; pricing is enterprise-focused and quoted per customer rather than offered in fixed low-cost tiers.
Looker is a cloud-native Business Intelligence and data analytics platform that models data with LookML and delivers dashboards, explorations, and embedded analytics. Its primary capability is semantic data modeling—defining governed metrics once and reusing them across dashboards and queries. Looker’s key differentiator is LookML, a version-controllable modeling layer that separates SQL from visualization, serving analytics engineers, data teams, and product managers. As a Data & Analytics platform, Looker integrates with cloud warehouses and BI workflows. Pricing is enterprise-oriented and sold via custom quotes; there is no unlimited free tier.
Looker, founded in 2012 and acquired by Google Cloud in 2019, positions itself as a cloud-first Business Intelligence and analytics platform that brings a code-driven modeling layer to enterprise data stacks. Its core value proposition is separating metric definitions from visualizations using LookML, a YAML/SQL-based modeling language that enforces consistent business logic. Looker runs queries directly against cloud data warehouses (BigQuery, Snowflake, Redshift) rather than importing data, which reduces data duplication and keeps analyses aligned with the source of truth. That architecture targets organizations that already rely on modern cloud warehouses and need governed, reusable metrics at scale.
Looker’s feature set centers on its modeling and delivery capabilities. LookML lets analytics engineers define derived tables, persistent derived tables (PDTs), measures, and dimensions with SQL fragments that compile into optimized queries. The Explore UI lets non-technical users build ad-hoc reports by selecting fields from modeled Explores, while Looker’s Dashboard and Tile system supports scheduled delivery, data actions, and parameterized filters. Looker also offers Looker Blocks (prebuilt model patterns), an embedded analytics SDK for integrating dashboards into apps, and an API/SDK ecosystem for automations, programmatic dashboards, and metadata access. Governance features include permissioning by model and row-level security expressions embedded in LookML, plus Git-backed model versioning for change control.
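To make the modeling concepts above concrete, here is a minimal LookML sketch of a view with dimensions, a dimension_group, and measures. The table and field names (`analytics.orders`, `total_amount`) are hypothetical, not from any specific schema.

```lookml
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: id {
    primary_key: yes
    type: string
    sql: ${TABLE}.id ;;
  }

  dimension_group: created {
    type: time
    timeframes: [date, week, month]
    sql: ${TABLE}.created_at ;;
  }

  dimension: status {
    type: string
    sql: ${TABLE}.status ;;
  }

  measure: order_count {
    type: count
    label: "Orders"
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.total_amount ;;
    value_format_name: usd
  }
}
```

Because measures like `total_revenue` are defined once here, every Explore and dashboard that references them inherits the same business logic.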
Looker’s pricing is not published as fixed tiers; Google Cloud sells it through custom enterprise contracts. There is no permanent free tier, though time-limited trials or demo accounts are typically available for evaluation. Enterprise pricing generally covers instance licensing, user seats by role (Viewer/Explorer/Developer), and optional professional services for implementation; costs vary widely with customer scale and deployment footprint. Small teams or proof-of-concept projects sometimes rely on Google Cloud credits or trial arrangements, but expect enterprise-level minimums rather than per-user monthly sticker prices.
Looker is used by analytics engineers and BI teams to create governed metric layers and by product or growth teams to embed analytics into apps. For example, an Analytics Engineer defines LookML measures and PDTs to ensure a single source of truth for monthly active users, while a Product Manager embeds an operational dashboard into a web product to surface churn alerts. A Revenue Operations analyst schedules daily cohort reports and triggers Data Actions to create tickets from dashboards. Compared to direct competitors like Tableau, Looker’s distinguishing angle is its model-first LookML layer and query-through-warehouse architecture, which favors organizations that need centralized metric governance and embedded analytics over extracted-cube approaches.
Three capabilities set Looker apart from its nearest competitors: the version-controlled LookML semantic layer, live queries against cloud warehouses instead of extracts, and first-class embedded analytics.
Current tiers and what you get at each price point. Looker pricing is quote-based, so confirm specifics with Google Cloud sales.
| Plan | Price | What you get | Best for |
|---|---|---|---|
| Trial / Demo | Free (time-limited) | Short evaluation access, limited sample data and features | Evaluators and proof-of-concept projects |
| Professional / Small Deployment | Custom | Quoted by seat/role and data footprint; minimal enterprise services | Small teams needing LookML and core analytics |
| Enterprise | Custom | Full platform, SSO, embedding, SLAs, professional services | Large organizations needing governance and embedding |
Copy these prompts into your AI assistant as-is. Each targets a different high-value Looker workflow.
You are a Looker LookML assistant. Role: produce a complete LookML view file for an 'orders' source table. Constraints: use valid LookML syntax, include sql_table_name and a primary_key dimension, define at least five dimensions (id, created_at, user_id, status, total_amount), include a dimension_group for created_at with day/week/month timeframes, add two measures (count, sum of total_amount) with descriptive labels and value_format_name for currency, and avoid warehouse-specific SQL functions. Output format: return only the LookML code for a single view (no explanations). Example dimension style: dimension: id { type: string sql: ${TABLE}.id ;; }
You are a Looker SQL Runner helper. Role: craft a single ANSI-compatible SQL query that computes weekly retention cohorts for users over the last 12 weeks. Constraints: deliver one query (no temp tables), compute user_first_week (cohort start), cohort_week_offset (0,1,2...), cohort_size, retained_users, retention_rate (decimal percent), and filter out cohorts with fewer than 10 users; assume a table users_events(user_id, event_time) and user creation determined by MIN(event_time). Output format: return only the SQL query and a one-line SQL comment header describing parameters and assumptions.
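For reference, the cohort logic this prompt asks the SQL to express can be sketched in plain Python: a user's cohort is the week of their first event, and each later active week contributes to that cohort's retained count at a week offset. This is an illustration of the arithmetic, not the SQL itself.

```python
from collections import defaultdict
from datetime import date, timedelta

def weekly_retention(events, min_cohort_size=1):
    """Compute weekly retention cohorts from (user_id, event_date) pairs."""
    def week_start(d):
        # Monday of the week containing d
        return d - timedelta(days=d.weekday())

    # A user's cohort is the week of their earliest event
    first_week = {}
    for user, d in events:
        w = week_start(d)
        if user not in first_week or w < first_week[user]:
            first_week[user] = w

    # (cohort_week, week_offset) -> distinct retained users
    retained = defaultdict(set)
    for user, d in events:
        cohort = first_week[user]
        offset = (week_start(d) - cohort).days // 7
        retained[(cohort, offset)].add(user)

    cohort_sizes = defaultdict(int)
    for w in first_week.values():
        cohort_sizes[w] += 1

    rows = []
    for (cohort, offset), users in sorted(retained.items()):
        size = cohort_sizes[cohort]
        if size < min_cohort_size:  # mirror the "fewer than 10 users" filter
            continue
        rows.append({
            "cohort_week": cohort,
            "offset": offset,
            "cohort_size": size,
            "retained_users": len(users),
            "retention_rate": len(users) / size,
        })
    return rows
```

In SQL, the same shape falls out of a self-join between each user's MIN(event_time) week and their later activity weeks, grouped by cohort and offset.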
You are an Analytics Engineer. Role: define governed revenue metrics in LookML for reuse across explores and dashboards. Constraints: provide LookML code snippets (a view or extend_view) that define gross_revenue, discounts, refunds, net_revenue, mrr, and arpu; include descriptions, appropriate types (sum, number), currency formatting (value_format_name), and simple tests or sql_always_where to handle NULLs; keep SQL expressions portable and avoid vendor-specific functions. Output format: return LookML measure and necessary dimension snippets only, plus one short validation SQL query that returns net_revenue by month for verification.
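The arithmetic those governed measures should encode is simple enough to state directly. Below is a small Python sketch of it, with NULLs treated as zero the way the prompt's sql_always_where/NULL handling intends; the field names (gross, discount, refund) are illustrative assumptions, not LookML-defined fields.

```python
def revenue_metrics(orders, active_users):
    """Aggregate governed revenue metrics from order rows.

    Each order is a dict of amounts; None values are treated as 0,
    mirroring the NULL handling the LookML measures should encode.
    """
    gross = sum(o.get("gross") or 0 for o in orders)
    discounts = sum(o.get("discount") or 0 for o in orders)
    refunds = sum(o.get("refund") or 0 for o in orders)
    net = gross - discounts - refunds
    return {
        "gross_revenue": gross,
        "discounts": discounts,
        "refunds": refunds,
        "net_revenue": net,
        # average revenue per user; guard against division by zero
        "arpu": net / active_users if active_users else 0.0,
    }
```

Defining these once in LookML (rather than per-dashboard) is what keeps net_revenue identical across every Explore that references it.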
You are a Product Manager implementing Looker embed. Role: produce a step-by-step integration guide and minimal Node.js example to embed a Looker dashboard securely using signed embed URLs. Constraints: include required Looker admin settings (embed allowlist, user attributes, model permissions), a signed URL or JWT signing example, recommended TTL for embeds, CORS and security header recommendations, and a compact Node.js code snippet that generates the signed URL. Output format: numbered steps (1-8), then the Node.js code snippet and an example JSON payload used to sign the embed (no long prose).
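To illustrate the signed-URL pattern the prompt describes, here is a generic Python sketch: canonicalize the embed parameters, HMAC them with a shared secret, and append the signature to the URL. This is NOT Looker's exact SSO string-to-sign (the real spec fixes a specific parameter order and digest); treat it as the shape of the technique and use Looker's official embed SSO examples in production.

```python
import base64
import hashlib
import hmac
import json
from urllib.parse import urlencode

def sign_embed_url(host, embed_path, secret, params):
    """Build a signed embed URL (generic sketch, not Looker's exact spec).

    Params are canonicalized in sorted order, HMAC-SHA256 signed with the
    shared secret, and the base64 signature is appended as a query param.
    """
    canonical = "\n".join(f"{k}={json.dumps(params[k])}" for k in sorted(params))
    digest = hmac.new(secret.encode(), canonical.encode(), hashlib.sha256).digest()
    signature = base64.b64encode(digest).decode()
    query = urlencode({**{k: json.dumps(v) for k, v in params.items()},
                       "signature": signature})
    return f"https://{host}{embed_path}?{query}"
```

Because the secret never leaves the server, the embedding app's frontend only ever sees a short-lived, tamper-evident URL.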
You are a Revenue Operations engineer building an automation runbook. Role: design a production-ready workflow that schedules daily cohort exports from Looker, uploads CSVs to S3, evaluates churn thresholds, and creates support tickets via a REST API when thresholds are exceeded. Constraints: include exact Looker schedule configuration (format, destination webhook), example webhook payload, AWS Lambda pseudocode (Python) to process CSV, threshold evaluation logic, ticket creation request example, error handling and retry policy, IAM least-privilege notes, and monitoring/alerts. Output format: stepwise runbook with numbered steps and an inline Python pseudocode snippet plus a sample webhook JSON payload.
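The threshold-evaluation step in that runbook can be sketched as a small pure function: parse the exported CSV, compare each cohort's churn rate to the threshold, and emit ticket payloads for breaches. The column names (cohort, churn_rate) and ticket fields are assumptions about the export and ticketing schemas, not Looker-defined.

```python
import csv
import io

def evaluate_churn(csv_text, threshold=0.05):
    """Return a ticket payload for every cohort whose churn_rate
    exceeds the threshold. Column names are illustrative assumptions."""
    tickets = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rate = float(row["churn_rate"])
        if rate > threshold:
            tickets.append({
                "subject": f"Churn alert: cohort {row['cohort']}",
                "body": f"churn_rate={rate:.2%} exceeds threshold {threshold:.2%}",
                # escalate when churn is more than double the threshold
                "priority": "high" if rate > 2 * threshold else "normal",
            })
    return tickets
```

Keeping this logic in a pure function makes the Lambda trivial to unit-test before wiring it to the Looker webhook and the ticketing API.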
You are a Senior Analytics Engineer performing a LookML performance audit. Role: analyze a LookML model and recommend high-impact optimizations for slow explores and derived tables. Constraints: produce a prioritized checklist of issues and fixes, explain root causes, show a concrete before-and-after refactor for one slow derived_table (include original SQL and optimized SQL), recommend PDT/aggregate strategies and caching settings, and propose CI tests to catch regressions. Output format: return a JSON object with keys issues, prioritized_actions, before_after_sql (objects with original and optimized), and ci_test_snippets. Example slow pattern: derived_table using SELECT DISTINCT over multiple joins.
Choose Looker over Tableau if you prioritize a code-first semantic layer and live queries against cloud warehouses for centralized metric governance.
Head-to-head comparisons between Looker and top alternatives: