How to Add Data Stories to Your Product or Dashboard

Integration guide: take DataStoryBot's markdown narrative and chart outputs and embed them in a React dashboard, email digest, Slack channel, or Notion page.

By DataStoryBot Team

Your dashboard shows numbers. DataStoryBot adds the words. A chart says revenue grew 12%. The data story says revenue grew 12% because the West region closed three enterprise deals in the second week of March, and if the pipeline holds, next quarter should be similar.

This article shows how to embed DataStoryBot's narrative + chart output into the places your users already look: React dashboards, email digests, Notion pages, and Slack channels.

What You're Embedding

DataStoryBot's /refine endpoint returns:

{
  "narrative": "## Revenue Concentration Risk\n\n**The top 3 customers account for 47% of total revenue**...",
  "charts": [
    {"fileId": "file-chart001", "caption": "Revenue share by customer segment"},
    {"fileId": "file-chart002", "caption": "Revenue Herfindahl index trend"}
  ],
  "resultDataset": {
    "fileId": "file-data001",
    "fileName": "top_customers.csv",
    "rowCount": 25,
    "colCount": 5
  }
}

Three things to embed:

  1. Narrative — markdown text, rendered as HTML
  2. Charts — PNG images, downloaded via the files API
  3. Dataset — optional CSV of the analyzed subset
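In Python terms, a consumer of that response walks the three fields like this. The `/files/{containerId}/{fileId}` path matches the download route used in the backend proxy below; `cntr_abc123` is a placeholder container ID:

```python
import json

# Parse a /refine response (narrative shortened for the example)
response = json.loads("""
{
  "narrative": "## Revenue Concentration Risk",
  "charts": [
    {"fileId": "file-chart001", "caption": "Revenue share by customer segment"},
    {"fileId": "file-chart002", "caption": "Revenue Herfindahl index trend"}
  ],
  "resultDataset": {"fileId": "file-data001", "fileName": "top_customers.csv",
                    "rowCount": 25, "colCount": 5}
}
""")

container_id = "cntr_abc123"  # placeholder; returned by the upload step
chart_urls = [
    f"https://datastory.bot/api/files/{container_id}/{c['fileId']}"
    for c in response["charts"]
]
```

Download those URLs promptly; as noted below, container files expire.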

React Dashboard Integration

The DataStory Component

import { useState, useEffect } from "react";
import ReactMarkdown from "react-markdown";

function DataStory({ csvUrl, steering, refreshInterval = null }) {
  const [story, setStory] = useState(null);
  const [loading, setLoading] = useState(true);
  const [error, setError] = useState(null);

  async function fetchStory() {
    setLoading(true);
    try {
      const res = await fetch("/api/analyze", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ csvUrl, steering }),
      });
      if (!res.ok) throw new Error(`Analysis request failed: ${res.status}`);
      const data = await res.json();
      setStory(data);
      setError(null);
    } catch (err) {
      setError("Analysis failed. Retrying on next refresh.");
    } finally {
      setLoading(false);
    }
  }

  useEffect(() => {
    fetchStory();
    if (refreshInterval) {
      const interval = setInterval(fetchStory, refreshInterval);
      return () => clearInterval(interval);
    }
  }, [csvUrl, steering]);

  if (loading) return <StoryLoading />;
  if (error) return <StoryError message={error} />;
  if (!story) return null;

  return (
    <div className="data-story rounded-lg border p-6">
      <div className="prose prose-sm max-w-none">
        <ReactMarkdown>{story.narrative}</ReactMarkdown>
      </div>
      <div className="mt-6 grid gap-4 md:grid-cols-2">
        {story.charts.map((chart, i) => (
          <figure key={i}>
            <img
              src={chart.url}
              alt={chart.caption}
              className="w-full rounded"
              loading="lazy"
            />
            <figcaption className="mt-1 text-xs text-gray-500">
              {chart.caption}
            </figcaption>
          </figure>
        ))}
      </div>
    </div>
  );
}

The Backend Proxy

Your backend handles the DataStoryBot API calls, downloads charts, and caches results:

// /api/analyze.js (Next.js API route)
import { analyzeAndCache } from "@/lib/datastorybot";
import { getCache, setCache } from "@/lib/cache"; // your cache layer (Redis, Vercel KV, in-memory)

export async function POST(request) {
  const { csvUrl, steering } = await request.json();

  // Check cache first
  const cacheKey = `${csvUrl}:${steering}`;
  const cached = await getCache(cacheKey);
  if (cached && Date.now() - cached.timestamp < 3600000) {
    return Response.json(cached.data);
  }

  // Run analysis
  const result = await analyzeAndCache(csvUrl, steering);

  // Cache for 1 hour
  await setCache(cacheKey, { data: result, timestamp: Date.now() });

  return Response.json(result);
}

// lib/datastorybot.js
const BASE_URL = "https://datastory.bot/api";

export async function analyzeAndCache(csvUrl, steering) {
  // Download CSV from your data source
  const csvResponse = await fetch(csvUrl);
  const csvBuffer = await csvResponse.arrayBuffer();

  // Upload to DataStoryBot
  const form = new FormData();
  form.append("file", new Blob([csvBuffer]), "data.csv");
  const upload = await fetch(`${BASE_URL}/upload`, {
    method: "POST",
    body: form,
  });
  const { containerId } = await upload.json();

  // Analyze
  const stories = await fetch(`${BASE_URL}/analyze`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ containerId, steeringPrompt: steering }),
  }).then((r) => r.json());

  // Refine top story
  const report = await fetch(`${BASE_URL}/refine`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      containerId,
      selectedStoryTitle: stories[0].title,
    }),
  }).then((r) => r.json());

  // Download and store charts permanently
  const charts = [];
  for (const chart of report.charts || []) {
    const imgRes = await fetch(
      `${BASE_URL}/files/${containerId}/${chart.fileId}`
    );
    const imgBuffer = await imgRes.arrayBuffer();
    const url = await uploadToStorage(imgBuffer, `${chart.fileId}.png`); // your own S3/R2 upload helper
    charts.push({ url, caption: chart.caption });
  }

  return { narrative: report.narrative, charts };
}

Key points:

  • Cache aggressively. The same data + steering prompt produces similar results. Don't re-analyze on every page load.
  • Download charts immediately. Container files expire in 20 minutes. Store them in your own storage (S3, R2, local).
  • Proxy through your backend. Don't expose DataStoryBot API calls to the frontend.
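The getCache/setCache pair in the route above can be anything with a TTL. An in-memory Python equivalent (fine for a single process; swap in Redis or similar for production) is just a dict with timestamps:

```python
import time

_cache = {}

def set_cache(key, data):
    """Store data alongside the time it was cached."""
    _cache[key] = {"data": data, "timestamp": time.time()}

def get_cache(key, max_age_seconds=3600):
    """Return cached data if it exists and is younger than max_age_seconds."""
    entry = _cache.get(key)
    if entry and time.time() - entry["timestamp"] < max_age_seconds:
        return entry["data"]
    return None

set_cache("metrics.csv:summarize", {"narrative": "## Findings"})
```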

Dashboard Layout

function Dashboard() {
  return (
    <div className="grid gap-6 lg:grid-cols-3">
      {/* Traditional metrics cards */}
      <MetricCard title="Revenue" value="$2.3M" change="+12%" />
      <MetricCard title="Users" value="45,230" change="+8%" />
      <MetricCard title="Churn" value="3.2%" change="-0.5%" />

      {/* Data story — spans full width below metrics */}
      <div className="lg:col-span-3">
        <DataStory
          csvUrl="/api/export/monthly-metrics.csv"
          steering="Summarize the key findings from this month's metrics. Focus on what changed and why."
          refreshInterval={3600000}  // Refresh hourly
        />
      </div>
    </div>
  );
}

The data story sits below the metric cards, providing the narrative context that the numbers alone can't convey.

Email Digest Integration

For automated email reports that include data stories:

import markdown
import requests
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

def build_data_story_email(report, recipients, subject):
    """Build a data story HTML email with embedded charts; send the result via smtplib or your email provider."""

    msg = MIMEMultipart("related")
    msg["Subject"] = subject
    msg["To"] = ", ".join(recipients)

    # Convert narrative to HTML
    narrative_html = markdown.markdown(
        report["narrative"],
        extensions=["extra"]
    )

    # Build HTML with chart CID references
    charts_html = ""
    for i, chart in enumerate(report["charts"]):
        charts_html += f"""
        <div style="margin: 16px 0; text-align: center;">
          <img src="cid:chart{i}" alt="{chart['caption']}"
               style="max-width: 100%; border-radius: 4px;">
          <p style="font-size: 12px; color: #666; margin-top: 4px;">
            {chart['caption']}
          </p>
        </div>
        """

    html = f"""
    <div style="font-family: -apple-system, sans-serif; max-width: 600px; margin: 0 auto;">
      <div style="padding: 24px;">
        {narrative_html}
        {charts_html}
        <hr style="border: none; border-top: 1px solid #eee; margin: 24px 0;">
        <p style="font-size: 11px; color: #999;">
          Generated by DataStoryBot · <a href="https://datastory.bot">datastory.bot</a>
        </p>
      </div>
    </div>
    """

    msg.attach(MIMEText(html, "html"))

    # Attach chart images
    for i, chart in enumerate(report["charts"]):
        with open(chart["path"], "rb") as f:
            img = MIMEImage(f.read())
            img.add_header("Content-ID", f"<chart{i}>")
            msg.attach(img)

    return msg
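The `cid:` references work because each image part's Content-ID header matches the `src` attribute in the HTML. A self-contained check, with placeholder bytes standing in for a real chart PNG:

```python
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.image import MIMEImage

# "related" keeps the HTML and the images it references in one part tree
msg = MIMEMultipart("related")
msg.attach(MIMEText('<img src="cid:chart0">', "html"))

img = MIMEImage(b"fake-png-bytes", _subtype="png")  # placeholder, not a real chart
img.add_header("Content-ID", "<chart0>")
msg.attach(img)

raw = msg.as_string()
```

The angle brackets in the Content-ID header are required; the `cid:` reference in the HTML omits them.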

Slack Integration

Post data stories to Slack channels:

import os

from slack_sdk import WebClient

def post_story_to_slack(report, channel):
    """Post a data story to Slack with charts."""
    client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])

    # Post the narrative as a message
    response = client.chat_postMessage(
        channel=channel,
        text=report["narrative"],
        mrkdwn=True
    )

    # Upload charts as threaded replies under that message
    for chart in report["charts"]:
        client.files_upload_v2(
            channel=channel,
            thread_ts=response["ts"],
            file=chart["path"],
            title=chart["caption"],
            initial_comment=chart["caption"]
        )
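One caveat: Slack's mrkdwn is not standard markdown, so `## headings` and `**bold**` from the narrative may not render as intended. A minimal pre-processing sketch, covering only those two common cases:

```python
import re

def to_mrkdwn(narrative):
    """Convert the most common markdown constructs to Slack mrkdwn."""
    # "## Heading" -> "*Heading*" (Slack has no headings; bold stands in)
    text = re.sub(r"^#{1,6}\s+(.+)$", r"*\1*", narrative, flags=re.MULTILINE)
    # "**bold**" -> "*bold*" (Slack bold uses single asterisks)
    text = re.sub(r"\*\*(.+?)\*\*", r"*\1*", text)
    return text
```

Run the narrative through this before passing it to chat_postMessage.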

Notion Integration

Push data stories to Notion pages via the API:

import os

from notion_client import Client

def push_to_notion(report, page_id):
    """Append a data story to a Notion page."""
    notion = Client(auth=os.environ["NOTION_TOKEN"])

    # Convert markdown narrative to Notion blocks (markdown_to_notion_blocks is your own converter)
    blocks = markdown_to_notion_blocks(report["narrative"])

    # Append narrative blocks
    notion.blocks.children.append(
        block_id=page_id,
        children=blocks
    )

    # Upload chart images
    for chart in report["charts"]:
        notion.blocks.children.append(
            block_id=page_id,
            children=[{
                "type": "image",
                "image": {
                    "type": "external",
                    "external": {"url": chart["public_url"]},
                    "caption": [{"type": "text", "text": {"content": chart["caption"]}}]
                }
            }]
        )
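The markdown_to_notion_blocks converter is yours to supply. A minimal sketch that handles only `##` headings and plain paragraphs (real narratives also contain bold runs, lists, and tables, so a full converter library is worth considering):

```python
def markdown_to_notion_blocks(narrative):
    """Minimal markdown -> Notion blocks: ## headings and paragraphs only."""
    blocks = []
    for line in narrative.split("\n"):
        line = line.strip()
        if not line:
            continue
        if line.startswith("## "):
            blocks.append({
                "type": "heading_2",
                "heading_2": {"rich_text": [
                    {"type": "text", "text": {"content": line[3:]}}
                ]},
            })
        else:
            blocks.append({
                "type": "paragraph",
                "paragraph": {"rich_text": [
                    {"type": "text", "text": {"content": line}}
                ]},
            })
    return blocks
```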

Caching Strategy

Data stories are expensive to generate (API calls + compute). Cache them:

  • Real-time (no cache): never; analysis takes 30-120 seconds
  • Every 15 minutes: live dashboards with fresh data feeds
  • Hourly: internal dashboards, monitoring
  • Daily: email digests, Notion summaries
  • Weekly: board reports, stakeholder updates
  • On-demand: user-triggered analysis in a product

Cache the rendered narrative + chart URLs, not the raw API responses. Once charts are downloaded from the container and stored in your own storage, they're permanent.
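One way to build the cache key is to hash the CSV bytes together with the steering prompt, so identical inputs hit the cache regardless of which URL the data came from (the function name is illustrative):

```python
import hashlib

def story_cache_key(csv_bytes, steering):
    """Same data + same steering prompt -> same key, so re-uploads hit the cache."""
    h = hashlib.sha256()
    h.update(csv_bytes)
    h.update(steering.encode("utf-8"))
    return h.hexdigest()
```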

What to Read Next

For building the React integration, see integrating DataStoryBot into a React application.

For the email pipeline in detail, read automating weekly data reports with DataStoryBot.

For chart handling, see how to download and embed AI-generated charts.

For the API fundamentals, start with getting started with the DataStoryBot API.

Ready to find your data story?

Upload a CSV and DataStoryBot will uncover the narrative in seconds.

Try DataStoryBot →