DataStoryBot + Zapier/n8n: No-Code Data Analysis Workflows
Connect DataStoryBot to no-code automation platforms: trigger on Google Sheets update, analyze, and post results to Notion or Slack.
The DataStoryBot API is three HTTP calls. That's all. And that means it drops cleanly into any automation platform that can make HTTP requests — which includes both Zapier and n8n.
If your team is already using Zapier to move data between Google Sheets, Slack, and Notion, you're a few steps away from adding AI-generated data analysis to those workflows. No code required beyond configuring the HTTP request steps. If you're running n8n for more complex orchestration, the setup is nearly identical but with more control over data mapping and error handling.
This article builds three complete workflows: a Google Sheets trigger that posts analysis to Slack, an email attachment pipeline that saves to Notion, and a scheduled weekly pull from Airtable. Each workflow is covered for both Zapier and n8n.
If you haven't used the DataStoryBot API before, the getting started guide covers the three-call flow in detail. The short version: upload a CSV, get story angles back, refine the one you want into a full narrative with charts.
The API Flow in Automation Context
Before building workflows, it helps to understand what each step returns and how that maps to automation platform concepts.
Call 1 — Upload (POST /api/upload):
Send a CSV as multipart form data. Receive containerId and fileId. The container is an ephemeral Code Interpreter environment that lives for 20 minutes.
Call 2 — Analyze (POST /api/analyze):
Send containerId. Receive an array of story angle objects, each with a title, description, and angle_id. This call takes 10–90 seconds — plan for it in your timeout settings.
Call 3 — Refine (POST /api/refine):
Send containerId, angle_id, and optionally a steering string. Receive a narrative string, an array of chart URLs, and a filtered dataset URL.
In automation platforms, these become three sequential HTTP request steps. Each step's output feeds the next.
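Viewed end to end, the flow is just three awaited requests. Here is a minimal JavaScript sketch with the HTTP call injected as a function so the sequencing is explicit (the `request` helper is hypothetical; endpoint paths and field names follow the API described above):

```javascript
// Sketch of the three-call DataStoryBot flow. `request(method, path, body)`
// is an injected HTTP helper (hypothetical) so the sequencing stays visible.
async function runDataStory(request, csvFile, steering) {
  // Call 1 — upload the CSV; the response carries an ephemeral containerId
  const { containerId } = await request('POST', '/api/upload', { file: csvFile });

  // Call 2 — generate story angles (slow: allow up to ~120 s in real use)
  const { stories } = await request('POST', '/api/analyze', { containerId });

  // Call 3 — refine the first angle into a narrative with charts
  return request('POST', '/api/refine', {
    containerId,
    angleId: stories[0].angle_id,
    steering,
  });
}
```

Each automation step below is one of these calls, with the platform's data mapping standing in for the plain variables.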
Workflow 1: Google Sheets Update → Analyze → Post to Slack
Use case: A team updates a shared Google Sheet with weekly sales numbers. When a new row is added, the workflow exports the sheet as CSV, analyzes it, and posts a narrative summary to a Slack channel.
Zapier Setup
Step 1 — Trigger: New or Updated Spreadsheet Row (Google Sheets)
- Trigger app: Google Sheets
- Event: New or Updated Spreadsheet Row
- Connect your sheet and select the target spreadsheet and worksheet
- Zapier will export the relevant row data; for full-sheet analysis, you'll need the sheet's CSV export URL
For full-sheet analysis, use a Google Sheets export URL in this format:
https://docs.google.com/spreadsheets/d/SPREADSHEET_ID/export?format=csv&gid=SHEET_ID
Step 2 — Action: Webhooks by Zapier (GET)
Fetch the CSV from the export URL before uploading:
- URL: your Google Sheets CSV export URL
- This step gives you the raw CSV content as a file for the next upload step
Step 3 — Action: Webhooks by Zapier (POST) — Upload
- URL: https://datastory.bot/api/upload
- Payload type: Form
- File field name: file
- File content: output from Step 2
The response JSON contains containerId — map this to a variable using Zapier's data mapping.
Step 4 — Action: Webhooks by Zapier (POST) — Analyze
- URL: https://datastory.bot/api/analyze
- Payload type: JSON
- Body:
{
"containerId": "{{step3.containerId}}"
}
Set the request timeout to at least 120 seconds. Zapier's default webhook timeout is 30 seconds — upgrade to a paid plan or use the "Delay" action strategically if you hit timeouts.
The response returns a stories array. Map stories[0].angle_id and stories[0].title for the next step.
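For mapping purposes, the Analyze response has roughly this shape (only the fields named in this article are shown; values elided):

```json
{
  "stories": [
    { "angle_id": "...", "title": "...", "description": "..." }
  ]
}
```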
Step 5 — Action: Webhooks by Zapier (POST) — Refine
- URL: https://datastory.bot/api/refine
- Payload type: JSON
- Body:
{
"containerId": "{{step3.containerId}}",
"angleId": "{{step4.stories.0.angle_id}}",
"steering": "Focus on week-over-week trends"
}
The response includes narrative (markdown text) and charts (array of image URLs).
Step 6 — Action: Send Channel Message (Slack)
- Channel: #data-updates (or your preferred channel)
- Message text:
*Weekly Sales Analysis*
{{step5.narrative}}
Charts: {{step5.charts.0}}
Slack accepts markdown for basic formatting. For richer output, use the Slack Block Kit format via the "Send Message with Blocks" action.
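If you do go the Block Kit route, a minimal blocks payload could look like this (standard Block Kit section and image types; the step references match the mappings above):

```json
{
  "blocks": [
    {
      "type": "section",
      "text": { "type": "mrkdwn", "text": "*Weekly Sales Analysis*\n{{step5.narrative}}" }
    },
    {
      "type": "image",
      "image_url": "{{step5.charts.0}}",
      "alt_text": "Analysis chart"
    }
  ]
}
```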
n8n Setup for Workflow 1
n8n gives you more control over data transformation and error handling. The node sequence:
Node 1 — Google Sheets Trigger
- Use the "Google Sheets Trigger" node
- Operation: Row Added
- Poll interval: every 5 minutes (or use a webhook trigger if your Sheet is connected to an Apps Script)
Node 2 — HTTP Request (Download CSV)
- Method: GET
- URL: your Sheets export URL with format=csv
- Response format: Binary Data
- Output binary data: enabled
Node 3 — HTTP Request (Upload)
- Method: POST
- URL: https://datastory.bot/api/upload
- Send binary data: enabled
- Binary property: data (from Node 2)
- Content-Type: multipart/form-data
Extract containerId from the JSON response body using an expression: {{ $json.containerId }}.
Node 4 — HTTP Request (Analyze)
- Method: POST
- URL: https://datastory.bot/api/analyze
- Send body: JSON
- Body:
{
"containerId": "{{ $('HTTP Request (Upload)').item.json.containerId }}"
}
- Timeout: 120000 ms (120 seconds)
Node 5 — Set Node (Extract angle_id)
Use a Set node to extract the first story angle cleanly before the refine call:
angleId = {{ $json.stories[0].angle_id }}
storyTitle = {{ $json.stories[0].title }}
Node 6 — HTTP Request (Refine)
- Method: POST
- URL: https://datastory.bot/api/refine
- Body:
{
"containerId": "{{ $('HTTP Request (Upload)').item.json.containerId }}",
"angleId": "{{ $('Set').item.json.angleId }}",
"steering": "Focus on week-over-week trends"
}
Node 7 — Slack Node
- Operation: Post Message
- Channel: #data-updates
- Text:
{{ $json.narrative }}
Add error handling with an Error Trigger node connected to a Slack notification so failed analyses surface immediately.
Workflow 2: Email CSV Attachment → Analyze → Save to Notion
Use case: An operations system emails weekly CSV exports. When the email arrives, the workflow extracts the attachment, analyzes it, and saves the narrative and chart links to a Notion database.
This workflow is particularly useful if you're receiving automated reports from tools like Salesforce, HubSpot, or accounting software that don't have direct API integrations. See also AI data analysis for non-Python teams for more on extracting value from these exports without writing code.
Zapier Setup
Step 1 — Trigger: New Email (Gmail or Email by Zapier)
- Trigger: New Email Matching Search (Gmail)
- Search string: from:reports@yourtool.com has:attachment filename:csv
- This fires when a matching email arrives with a CSV attachment
Step 2 — Action: Webhooks by Zapier (POST) — Upload
Gmail's attachment data comes through as a URL, not raw bytes. Use Zapier's file handling:
- URL: https://datastory.bot/api/upload
- Payload type: Form
- file: select the attachment from Step 1 (Zapier handles the download automatically when you reference an attachment field in a form upload)
Step 3 — Action: Webhooks by Zapier (POST) — Analyze
{
"containerId": "{{step2.containerId}}"
}
Step 4 — Action: Webhooks by Zapier (POST) — Refine
{
"containerId": "{{step2.containerId}}",
"angleId": "{{step3.stories.0.angle_id}}"
}
Step 5 — Action: Create Database Item (Notion)
- Database: your reporting database
- Field mappings:
- Name: {{step3.stories.0.title}}
- Analysis (Text): {{step4.narrative}}
- Charts (URL): {{step4.charts.0}}
- Source File (Text): {{step1.subject}}
- Date (Date): today's date using Zapier's date formatter
n8n Setup for Workflow 2
Node 1 — Email Trigger (IMAP)
- Configure with your email provider's IMAP settings
- Filter: subject contains "report" or sender matches your source
- Download attachments: enabled
Node 2 — IF Node
Check that an attachment exists and is a CSV:
{{ ($json.attachments ?? []).length > 0 && $json.attachments[0].fileName.endsWith('.csv') }}
This prevents the workflow from running on emails without valid CSV attachments.
Node 3 — HTTP Request (Upload)
- Method: POST
- URL: https://datastory.bot/api/upload
- Send binary data: true
- Binary property: reference the attachment binary from Node 1
Nodes 4, 5 — HTTP Request (Analyze + Refine)
Same configuration as Workflow 1. For email-sourced reports, consider adding a steering prompt based on the email subject:
{
"containerId": "{{ $('HTTP Request Upload').item.json.containerId }}",
"angleId": "{{ $('HTTP Request Analyze').item.json.stories[0].angle_id }}",
"steering": "{{ $('Email Trigger').item.json.subject }}"
}
Passing the email subject as the steering prompt aligns the analysis with whatever the sending system flagged as important.
Node 6 — Notion Node
- Operation: Create Database Page
- Database ID: your reporting database
- Properties (use JSON mode for full control):
{
"Name": {
"title": [{ "text": { "content": "{{ $('HTTP Request Analyze').item.json.stories[0].title }}" } }]
},
"Analysis": {
"rich_text": [{ "text": { "content": "{{ $json.narrative.substring(0, 2000) }}" } }]
},
"Chart URL": {
"url": "{{ $json.charts[0] }}"
},
"Report Date": {
"date": { "start": "{{ $now.toISODate() }}" }
}
}
Note the substring(0, 2000) on the narrative — Notion's rich text fields cap at 2000 characters per block. For longer narratives, split into multiple text blocks using an n8n Function node.
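One way to do that split, as a sketch of the Function node body (the 2000-character limit is Notion's per-block cap; fixed-length chunking is the simplest approach and may split mid-word):

```javascript
// Split a long narrative into chunks that fit Notion's 2000-character
// rich_text block limit. Returns an array of strings.
function chunkNarrative(narrative, limit = 2000) {
  const chunks = [];
  for (let i = 0; i < narrative.length; i += limit) {
    chunks.push(narrative.slice(i, i + limit));
  }
  return chunks;
}
```

In the Function node you would then map each chunk to a `{ "text": { "content": chunk } }` entry in the `rich_text` array.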
Workflow 3: Scheduled Weekly Analysis from Airtable
Use case: Every Monday morning, pull the past week's records from an Airtable base, export as CSV, run the analysis, and post results to both Slack and Notion simultaneously.
This mirrors the pattern in automating weekly data reports but without writing any backend code — the automation platform handles scheduling and delivery.
Zapier Setup
Zapier's scheduled triggers are available on paid plans via the "Schedule by Zapier" trigger.
Step 1 — Trigger: Schedule by Zapier
- Frequency: Every week
- Day of week: Monday
- Time of day: 08:00 AM
Step 2 — Action: Find Records (Airtable)
- Base: your Airtable base
- Table: your data table
- Filter by formula:
IS_AFTER({Date}, DATEADD(TODAY(), -7, 'days'))
This returns all records from the past 7 days.
Step 3 — Action: Formatter by Zapier (Text → Convert to CSV)
Zapier's Formatter can convert a list of records to CSV format. Map the Airtable fields to CSV columns.
Steps 4-6 — Upload, Analyze, Refine
Same as Workflow 1, Steps 3-5.
Step 7a — Slack (Post narrative to channel)
Step 7b — Notion (Create database page)
Zapier handles these as separate steps in sequence. You can't branch in parallel on most Zapier plans — run Slack first, then Notion.
n8n Setup for Workflow 3
n8n handles scheduling and parallel output natively, which makes it the better fit for this workflow.
Node 1 — Schedule Trigger
- Mode: Every Week
- Day of Week: Monday
- Hour: 8, Minute: 0
Node 2 — HTTP Request (Airtable API)
- Method: GET
- URL: https://api.airtable.com/v0/YOUR_BASE_ID/YOUR_TABLE
- Headers: Authorization: Bearer YOUR_AIRTABLE_TOKEN
- Query parameters: filterByFormula=IS_AFTER({Date},DATEADD(TODAY(),-7,'days'))
Node 3 — Function Node (Convert to CSV)
Airtable returns JSON records. Convert to CSV before uploading:
const records = $input.item.json.records;
// Use the first record's field names as the header row. Airtable omits
// empty fields from each record, so look values up by header key rather
// than relying on Object.values ordering.
const headers = Object.keys(records[0].fields);
const escape = (v) => {
  const s = String(v ?? '');
  // Quote values containing commas, quotes, or newlines; double embedded quotes
  return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
};
const rows = records.map(r => headers.map(h => escape(r.fields[h])).join(','));
const csv = [headers.join(','), ...rows].join('\n');
return [{ json: {}, binary: {
  data: {
    data: Buffer.from(csv).toString('base64'),
    mimeType: 'text/csv',
    fileName: 'weekly_report.csv'
  }
}}];
Nodes 4, 5, 6 — Upload, Analyze, Refine
Same configuration as the previous workflows. Add a steering prompt tuned for weekly summaries:
{
"steering": "Summarize the most significant changes compared to a typical week. Highlight any outliers."
}
Node 7 — Parallel Branches
After the Refine response, connect the Refine node's output to two branches so the same result flows to both at once.
Branch A — Slack Post the narrative with a chart link.
Branch B — Notion Create a dated database page with the full analysis.
This parallel execution means both outputs complete in roughly the same time as one — no waiting for Slack to finish before Notion starts.
Timeout and Reliability Considerations
The Analyze call is the slow one. On a moderately complex dataset (5,000 rows, 10 columns), it typically takes 30–60 seconds. On larger or more complex data, it can hit 90 seconds.
Zapier: The default webhook timeout is 30 seconds on most plans. If your analysis exceeds this, you have two options: (1) use the Zapier "Delay" pattern where you trigger a second Zap from the first, or (2) use a middleware service like Make (Integromat) or a small serverless function that calls the DataStoryBot API and caches the result. Zapier is not well-suited to long-running synchronous calls.
n8n: Set the HTTP Request node timeout to 120000 (120 seconds). n8n handles long-running requests correctly without the platform-level limits that Zapier imposes. This is the main practical advantage of n8n for DataStoryBot integrations.
For both platforms, add error handling:
- Wrap the Analyze and Refine steps in try/catch equivalents (n8n's Error Trigger node, Zapier's "only continue if" filter)
- Send a failure notification to Slack with the error message when a step fails
- Log containerId values so you can debug failed runs — the container persists for 20 minutes after creation, which gives you time to investigate
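Beyond notifications, a retry wrapper around the slow Analyze call smooths over transient failures. A minimal sketch (the attempt count and backoff values are illustrative assumptions, not DataStoryBot recommendations):

```javascript
// Retry an async operation with exponential backoff.
// attempts and baseDelayMs are illustrative defaults, not API guidance.
async function withRetry(fn, attempts = 3, baseDelayMs = 2000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err; // out of retries: surface the error
      // Back off between attempts: 2 s, 4 s, 8 s, ...
      await new Promise(r => setTimeout(r, baseDelayMs * 2 ** i));
    }
  }
}
```

In n8n the HTTP Request node's built-in "Retry on Fail" setting gives a similar effect; in Zapier you would need a second Zap or a middleware function.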
Choosing Between Zapier and n8n
Both platforms work. The decision comes down to your team's existing setup and the complexity of your workflows.
Use Zapier if:
- Your team already uses Zapier for other automations
- Workflows are simple (linear, no branching, dataset under 10,000 rows)
- You need Zapier's extensive app library for triggers (e.g., specific CRM events)
- Analysis time is consistently under 30 seconds
Use n8n if:
- You need parallel output branches (post to Slack and Notion simultaneously)
- Analysis calls regularly take 60+ seconds
- You want to self-host the automation platform for data privacy reasons
- You need complex data transformations between steps (Function nodes)
- You're already running n8n for other internal tooling
Either platform gives non-technical users a way to build and maintain these workflows without writing backend code. For teams that do want to go further — adding custom logic, building a reporting portal, or embedding analysis results in a product — see AI data analysis for non-Python teams for the range of integration options beyond no-code platforms.
Ready to find your data story?
Upload a CSV and DataStoryBot will uncover the narrative in seconds.
Try DataStoryBot →