Olmec Dynamics

How to Send a Weekly Sales Forecast from BigQuery to Slack Using Make.com

Automate a weekly sales forecast from BigQuery to Slack using Make.com. Run a scheduled query, fetch results, format metrics, and post a concise Slack report.

Introduction

You or your analytics team run a weekly SQL export, someone copies numbers into Slack, and stakeholders still ask for a clearer headline. Manual exports mean stale links, inconsistent metrics, and a lot of interrupted mornings.

This guide shows how to automate a weekly sales forecast report using BigQuery, Make.com, and Slack. You will schedule a deterministic BigQuery query that writes results to a partitioned destination, then use Make.com to fetch the latest row, format a short Slack message with the most important metrics and anomalies, and post it to the right channel automatically.

By the end you will have a repeatable workflow that runs every week and gives stakeholders one clear number-led message they can act on.

What You'll Need

  • Google Cloud project with BigQuery enabled and a service account that has bigquery.jobs.create and dataset access
  • A scheduled BigQuery query that writes aggregated weekly results to a destination table (partitioned by week or ingestion date)
  • Make.com account with a Google BigQuery connection and a Slack connection
  • A Slack channel where the weekly forecast will be posted, and a bot token authorised in Make.com
  • Optional: Google Cloud Storage (GCS) if you choose to stage CSV exports or artifacts

Note: BigQuery scheduled queries are the standard choice for recurring aggregation. Use a partitioned destination table to keep the Make.com fetch step trivial and auditable.

How It Works (The Logic)

  1. BigQuery scheduled query runs on your chosen cadence (at the week boundary) and writes a single-row summary into analytics.weekly_forecast, partitioned by week_start_date.
  2. Make.com runs shortly after the BigQuery job finishes, queries the destination table for the latest week, formats the metrics and a short narrative, then posts a Slack message into the chosen channel.
  3. Optionally Make.com stores a CSV snapshot in GCS or attaches a Google Drive file when stakeholders need a downloadable copy.

In plain terms: scheduled BigQuery aggregation → Make.com fetch latest row → format message → Slack post.

Step-by-Step Setup

  1. Create the scheduled BigQuery query
  • In the BigQuery console create a scheduled query that writes to a destination table such as analytics.weekly_forecast partitioned by week_start_date.

  • Make the query idempotent and deterministic: compute the just-closed week explicitly and delete-then-insert that week's row so reruns never duplicate it. Example pattern (the orders and order line-items table names are placeholders for your own sources):

    -- Runs as a multi-statement scheduled query. Table names are illustrative;
    -- point them at your own orders and order line-items sources.
    DECLARE week_start DATE DEFAULT DATE_SUB(DATE_TRUNC(CURRENT_DATE(), WEEK(MONDAY)), INTERVAL 1 WEEK);

    -- Remove any existing row for this week so reruns stay idempotent.
    DELETE FROM `project.analytics.weekly_forecast` WHERE week_start_date = week_start;

    INSERT INTO `project.analytics.weekly_forecast`
      (week_start_date, total_orders, total_revenue, avg_order_value, top_sku, anomalies)
    SELECT
      week_start AS week_start_date,
      COUNT(DISTINCT o.order_id) AS total_orders,
      SUM(o.total_amount) AS total_revenue,
      SAFE_DIVIDE(SUM(o.total_amount), COUNT(DISTINCT o.order_id)) AS avg_order_value,
      (
        SELECT CONCAT(li.sku, ':', CAST(SUM(li.quantity) AS STRING))
        FROM `project.dataset.order_line_items` li
        JOIN `project.dataset.orders` o2 ON o2.order_id = li.order_id
        WHERE DATE(o2.created_at) BETWEEN week_start AND DATE_ADD(week_start, INTERVAL 6 DAY)
        GROUP BY li.sku
        ORDER BY SUM(li.quantity) DESC
        LIMIT 1
      ) AS top_sku,
      IF(COUNTIF(o.failed_quality_check) = 0, NULL, 'data quality issues') AS anomalies
    FROM `project.dataset.orders` o
    WHERE DATE(o.created_at) BETWEEN week_start AND DATE_ADD(week_start, INTERVAL 6 DAY);

  • Prefer writing to a partitioned destination so you can fetch ORDER BY week_start_date DESC LIMIT 1 in Make.com.

Gotcha: set the scheduled query timezone to your stakeholders' local timezone or be explicit in the Slack message about the timezone.

  2. Schedule BigQuery and confirm run timing
  • Schedule the query for a quiet time just after your reporting week closes, for example Monday 02:00 UTC for a Monday-to-Sunday week.
  • Verify that the query writes exactly one row per week and that it runs under a service account with write access to the destination dataset.
  3. Build the Make.com scenario: trigger and BigQuery fetch
  • Create a new Scenario and add a Scheduler module set to run a few minutes after the BigQuery scheduled job completes.

  • Add the BigQuery module: "Run a query" or "Get rows". The simplest approach is to run a small query that selects the latest partition:

    SELECT * FROM project.analytics.weekly_forecast ORDER BY week_start_date DESC LIMIT 1;

  • Map returned fields into Make.com variables: week_start_date, total_orders, total_revenue, avg_order_value, top_sku, anomalies.

Gotcha: if your scheduled query uses streaming inserts or separate load jobs, confirm the Make.com run time accounts for BigQuery job completion lag.
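
If you want to sanity-check that fetch outside Make.com before wiring up the scenario, a minimal Python sketch using the google-cloud-bigquery client looks roughly like this (the project id is a placeholder; credentials come from your service account):

    # Minimal sketch: verify the "latest week" fetch that Make.com will run.
    # Assumes application-default or service-account credentials are configured.
    from google.cloud import bigquery

    client = bigquery.Client(project="project")  # replace with your project id

    sql = """
        SELECT *
        FROM `project.analytics.weekly_forecast`
        ORDER BY week_start_date DESC
        LIMIT 1
    """

    rows = list(client.query(sql).result())
    if not rows:
        raise RuntimeError("No forecast row found - has the scheduled query run yet?")

    latest = dict(rows[0])
    print(latest["week_start_date"], latest["total_orders"], latest["total_revenue"])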

  4. Compute derived metrics and narrative in Make.com
  • Add a Tools/Formatter module to create a few human-friendly values:
    • Format numbers with thousands separators and currency formatting
    • Compute week-over-week change if you store previous-week values (see the sketch after the example lines below)
    • Build a 1-line summary and 2–3 bullets for highlights

Example summary lines:

  • "Week of 2026-04-19 — Orders: 1,234 (+4% WoW), Revenue: $98,123 (+2% WoW), AOV: $79.50, Top SKU: SKU-123."
  • If anomalies is non-null, add a strong alert bullet.
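
A rough Python sketch of that formatter logic, assuming you also fetch the previous week's row for the WoW comparison (the sample values mirror the example line above):

    # Sketch of the derived-metric step: WoW change plus a one-line summary.
    # `current` and `previous` stand in for the latest and prior weekly_forecast rows.
    def pct_change(current_value, previous_value):
        if not previous_value:
            return None
        return (current_value - previous_value) / previous_value * 100

    current = {"week_start_date": "2026-04-20", "total_orders": 1234,
               "total_revenue": 98123.0, "avg_order_value": 79.50, "top_sku": "SKU-123"}
    previous = {"total_orders": 1187, "total_revenue": 96200.0}

    orders_wow = pct_change(current["total_orders"], previous["total_orders"])
    revenue_wow = pct_change(current["total_revenue"], previous["total_revenue"])

    summary = (
        f"Week of {current['week_start_date']} - "
        f"Orders: {current['total_orders']:,} ({orders_wow:+.0f}% WoW), "
        f"Revenue: ${current['total_revenue']:,.0f} ({revenue_wow:+.0f}% WoW), "
        f"AOV: ${current['avg_order_value']:.2f}, Top SKU: {current['top_sku']}."
    )
    print(summary)
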
  5. Format the Slack message (use Blocks)
  • Add the Slack module "Post a message" and use Blocks to keep the message scannable. Example block structure (a sample payload follows below this list):
    • Header block: "Weekly Sales Forecast — Week of {{week_start_date}}"
    • Section: bold summary line with totals and WoW change
    • Divider
    • Section with 2–3 short highlights: top SKU, AOV, notable anomaly
    • Context block with "Last refreshed" timestamp and link to BigQuery table or Looker Studio dashboard

Keep the message short. Stakeholders want the headline, not a CSV dump.
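
For reference, here is roughly what that block structure looks like as a Block Kit payload, sketched with the Slack Python SDK; in Make.com you would paste the equivalent JSON into the Slack module's Blocks field (the token, channel, and dashboard link are placeholders):

    # Sketch of the Slack message: Block Kit payload posted via the Slack Web API.
    # SLACK_BOT_TOKEN, the channel, and the dashboard URL are placeholders.
    import os
    from slack_sdk import WebClient

    week_start = "2026-04-20"
    summary = "Orders: 1,234 (+4% WoW), Revenue: $98,123 (+2% WoW)"

    blocks = [
        {"type": "header",
         "text": {"type": "plain_text", "text": f"Weekly Sales Forecast - Week of {week_start}"}},
        {"type": "section", "text": {"type": "mrkdwn", "text": f"*{summary}*"}},
        {"type": "divider"},
        {"type": "section", "text": {"type": "mrkdwn",
         "text": "- Top SKU: SKU-123\n- AOV: $79.50\n- No anomalies flagged"}},
        {"type": "context", "elements": [
            {"type": "mrkdwn",
             "text": "Last refreshed: 2026-04-27 02:15 UTC | <https://example.com/dashboard|Dashboard>"}]},
    ]

    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
    client.chat_postMessage(channel="#weekly-ops", text=summary, blocks=blocks)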

  6. Include a link to the data and optional artifact
  • In the Slack message, include a link to the BigQuery destination table. If stakeholders need a downloadable snapshot, add a step in Make.com that exports the query results to CSV, stores the file in GCS or Drive, and includes that link in the message.

Tip: use signed GCS URLs or a shared Drive folder with appropriate permissions for secure access.
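
If you stage the CSV in GCS, a minimal sketch of generating a time-limited signed URL with the google-cloud-storage client (bucket and object names are placeholders, and your credentials must be able to sign URLs):

    # Sketch: create a short-lived signed URL for a CSV snapshot staged in GCS.
    # Bucket and object names are placeholders; credentials must allow signing.
    from datetime import timedelta
    from google.cloud import storage

    client = storage.Client(project="project")
    blob = client.bucket("weekly-forecast-exports").blob("2026-04-20/weekly_forecast.csv")

    signed_url = blob.generate_signed_url(version="v4", expiration=timedelta(days=7), method="GET")
    print(signed_url)  # include this link in the Slack message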

  7. Add error handling and a failure notification path
  • On BigQuery query failure or empty results, route to a secondary Slack message that includes the run id and the error message, posted to an ops channel.
  • Log every run into a small audit table or Google Sheet from Make.com with columns: run_id, week_start_date, status, total_orders, total_revenue, error_message, run_timestamp.

This makes it easy to trace missing reports.
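
If you keep the audit log in BigQuery rather than a Sheet, a small sketch of appending one run record per execution (the ops.make_run_log table name is illustrative):

    # Sketch: append one audit row per Make.com run to a small BigQuery log table.
    # The table name is illustrative; create it with the columns listed above.
    from datetime import datetime, timezone
    from google.cloud import bigquery

    client = bigquery.Client(project="project")

    row = {
        "run_id": "make-run-0001",
        "week_start_date": "2026-04-20",
        "status": "success",
        "total_orders": 1234,
        "total_revenue": 98123.0,
        "error_message": None,
        "run_timestamp": datetime.now(timezone.utc).isoformat(),
    }

    errors = client.insert_rows_json("project.ops.make_run_log", [row])
    if errors:
        raise RuntimeError(f"Audit insert failed: {errors}")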

  8. Test end-to-end
  • Manually run the scheduled query for a test week and run the Make.com scenario. Confirm the Slack message appears with correct formatting and links.
  • Test failure modes: simulate a query error, or a missing destination row, and confirm the failure notification path triggers.

Real-World Business Scenario

A subscription retailer used this pattern to publish a single weekly sales forecast to the leadership channel. The scheduled query aggregated orders and LTV signals into a one-row summary per week. Make.com fetched the row, created a short narrative, and posted it to #weekly-ops. That one message replaced a 30-minute manual report each Monday and made anomalies visible earlier.

Common Variations

  • Post a short CSV attachment: export the query results as CSV to GCS, generate a signed URL, and include it in the Slack message for stakeholders who want to download the data.
  • Add charts: have Make.com create a small PNG sparkline from the last N weeks (using a charting API or Google Sheets image trick) and attach it to the Slack message.
  • Trigger on BigQuery job completion: instead of relying on the Make.com scheduler, have the scheduled query publish a Pub/Sub notification when it completes and use a small Cloud Function to forward that notification to a Make.com webhook for a true event-driven handoff (see the sketch below).
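
For that event-driven variation, a minimal sketch of a Pub/Sub-triggered Cloud Function that forwards the completion notification to a Make.com custom webhook (the webhook URL environment variable is a placeholder):

    # Sketch: Cloud Function (Pub/Sub trigger) that forwards the scheduled-query
    # completion notification to a Make.com custom webhook.
    import base64
    import json
    import os

    import functions_framework
    import requests

    MAKE_WEBHOOK_URL = os.environ["MAKE_WEBHOOK_URL"]  # placeholder webhook URL

    @functions_framework.cloud_event
    def forward_to_make(cloud_event):
        payload = base64.b64decode(cloud_event.data["message"]["data"]).decode("utf-8")
        run_info = json.loads(payload)  # transfer-run details from the notification
        requests.post(MAKE_WEBHOOK_URL, json=run_info, timeout=10)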

A short checklist before you turn this on

  • Confirm the scheduled query writes a single, auditable row per week into a partitioned destination table
  • Ensure the Make.com service account connection has bigquery.jobs.create if you run queries from Make.com
  • Choose a run time in Make.com that allows the BigQuery scheduled job to finish
  • Add a lightweight audit log for every Make.com run so you can replay or troubleshoot quickly

Where this connects with other patterns

If you are already streaming operational data into BigQuery or staging files via GCS, this weekly forecast pattern slots in as a lightweight consumer of your destination tables. For a more frequent operational digest, compare the daily approach in How to Stream Shopify Orders into BigQuery and Send a Daily Slack Digest Using Make.com. If you want Olmec Dynamics to build and harden this weekly pipeline for you, we design schedules, runbooks, and Slack formatting to match your stakeholder needs. See what we do at https://olmecdynamics.com