The Automation Stack That Changed How We Work: From 10 Tools to 3
You don't need a massive team to run a tight operation. You need the right automation stack.
Last year, we were drowning in tools. Slack, Gmail, Notion, Zapier, Make, custom scripts, spreadsheets. Every tool solved one problem and created three more. We were paying $300/month for integrations that half-broke, and spending 15 hours a week on busywork.
I'm going to show you exactly what we changed — and how to do it without becoming a SaaS subscription addict.
The Problem: Tool Sprawl
Before we fixed this, here's what a typical Monday looked like:
- Check Slack for new leads
- Manually add them to Notion
- Send welcome email (copied from template)
- Log it in Airtable
- Wait for payment confirmation from Stripe
- Update invoice in another spreadsheet
- Send invoice to accounting folder
- Log task in project management tool
- Notify team via Slack
That's 9 steps. For one lead.
Multiply that by 20 leads a week, plus all the edge cases and exceptions, and you're looking at someone's entire job being: "move data between tools."
Most teams know this is insane. They just don't know how to escape it.
The Real Problem Isn't the Tools—It's the Architecture
Here's what nobody tells you: you don't have a tool problem. You have an architecture problem.
Every tool you add should answer this question: "What core capability does this unlock that we can't replicate in our existing stack?"
If the answer is "convenience" or "because everyone uses it," you're adding complexity, not capability.
We started from first principles:
- System of Record: One place where truth lives
- Event Pipeline: Automated triggers that react to changes
- Integration Layer: Connect external services to the pipeline
Once you have those three things, half your tools become optional.
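Those three layers can be sketched in a few lines of plain Node.js. This is a toy model, not our production code: an in-memory array stands in for PostgreSQL, and a tiny pub/sub object stands in for n8n. All the names are illustrative.

```javascript
// System of Record: one in-memory "table" standing in for PostgreSQL.
const records = [];

// Event Pipeline: subscribers react to changes, the way n8n workflows do.
const subscribers = {};
function on(event, handler) {
  (subscribers[event] = subscribers[event] || []).push(handler);
}
function emit(event, payload) {
  (subscribers[event] || []).forEach((handler) => handler(payload));
}

// Integration Layer: external input is validated, written to the
// system of record, and only then turned into an event.
function ingestLead(raw) {
  if (!raw.email) throw new Error("lead needs an email");
  const lead = { id: records.length + 1, email: raw.email.toLowerCase() };
  records.push(lead);          // write to the system of record first
  emit("lead.created", lead);  // then let the pipeline react
  return lead;
}

// Example: a "send welcome email" workflow subscribed to the event.
const sent = [];
on("lead.created", (lead) => sent.push(`welcome -> ${lead.email}`));
ingestLead({ email: "Ada@Example.com" });
console.log(sent); // [ 'welcome -> ada@example.com' ]
```

The ordering is the whole point: nothing reacts to data that isn't in the system of record yet.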
Our Stack: Simple, Not Simplistic
1. PostgreSQL (System of Record)
Everything lives here. Leads, customers, projects, transactions, logs. One database. One schema. One source of truth.
Why not Airtable/Notion? Because we needed programmatic access and couldn't afford their API costs at scale. A managed PostgreSQL instance runs us $15/month.
Why not Mongo/Firebase? Because our data has relationships. SQL enforces that. It's a feature, not a limitation.
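Here's what "SQL enforces that" means in practice. This is an illustrative sketch, not our actual schema, but the idea is that the relationships live in the database, not in application code:

```sql
-- Illustrative sketch, not our production schema.
CREATE TABLE customers (
  id         serial PRIMARY KEY,
  email      text NOT NULL UNIQUE,
  created_at timestamptz NOT NULL DEFAULT now()
);

CREATE TABLE transactions (
  id           serial PRIMARY KEY,
  -- The relationship is enforced by the database itself:
  customer_id  integer NOT NULL REFERENCES customers (id),
  amount_cents integer NOT NULL CHECK (amount_cents >= 0),
  created_at   timestamptz NOT NULL DEFAULT now()
);
```

A transaction pointing at a customer that doesn't exist is simply impossible to insert. No script, workflow, or intern can create that data error.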
2. n8n (Event Pipeline)
n8n is a source-available (fair-code) automation platform that runs on your own server. We self-host it.
It watches our system of record and triggers actions:
- New lead created → Send welcome email, add to CRM sync queue
- Payment received → Update invoice, notify team, trigger fulfillment
- Customer marked inactive → Add to re-engagement campaign
The workflow is written once, runs forever, costs basically nothing.
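Conceptually, the workflows above boil down to a routing table from events to actions. The event names and action names here are illustrative, not n8n's actual configuration format:

```javascript
// Sketch of the event -> actions routing our n8n workflows implement.
const workflows = {
  "lead.created":      ["send_welcome_email", "queue_crm_sync"],
  "payment.received":  ["update_invoice", "notify_team", "trigger_fulfillment"],
  "customer.inactive": ["add_to_reengagement_campaign"],
};

function actionsFor(event) {
  return workflows[event] || []; // unknown events are a no-op, not a crash
}

console.log(actionsFor("payment.received"));
// [ 'update_invoice', 'notify_team', 'trigger_fulfillment' ]
```

Adding a workflow means adding a row to that table, which is why new automation steps take minutes instead of days.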
3. Custom APIs (Integration Layer)
We wrote small Node.js APIs that:
- Accept webhooks from Stripe, form submissions, etc.
- Validate and normalize the data
- Write to PostgreSQL
- Return structured responses
This decouples us from any third-party API changes. If Stripe changes their webhook format, we update our handler. n8n doesn't break.
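A stripped-down version of that handler looks like the sketch below. The field names on the incoming event are assumptions based on Stripe-style checkout webhooks, and the database write is stubbed out; the point is the shape, not the specifics:

```javascript
// Normalizes a Stripe-style webhook into our internal payment shape.
// The incoming field names are assumptions about the provider's payload;
// if the provider changes them, only this function changes.
function normalizePayment(event) {
  if (event.type !== "checkout.session.completed") {
    return null; // not an event we care about
  }
  const session = event.data.object;
  if (!session.customer_details || !session.customer_details.email) {
    throw new Error("payment webhook missing customer email");
  }
  return {
    source: "stripe",
    email: session.customer_details.email.toLowerCase(),
    amountCents: session.amount_total,
  };
}

// In production the normalized row is written to PostgreSQL;
// here we just inspect it.
const normalized = normalizePayment({
  type: "checkout.session.completed",
  data: {
    object: {
      amount_total: 4900,
      customer_details: { email: "Ada@Example.com" },
    },
  },
});
console.log(normalized);
// { source: 'stripe', email: 'ada@example.com', amountCents: 4900 }
```

Everything downstream of this function only ever sees our internal shape. That's the decoupling.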
What We Removed
- Zapier ($50/month) → Replaced with n8n webhooks
- Airtable ($120/month) → PostgreSQL + lightweight UI for manual entry
- Make.com automation ($50/month) → n8n workflows
- Email marketing tool ($80/month) → n8n + SendGrid API
- Invoice generator ($30/month) → Custom template + PDF API
- Notion (team wiki) → Actually kept this one. Notion is great for documentation.
New total: ~$40/month (Postgres + n8n hosting + SendGrid credits). Previously: $330/month.
The Trade-Offs
This isn't magic. Here are the real costs:
Time to set up: ~40 hours. We had to:
- Design the database schema
- Write 3 API handlers (~300 lines of code total)
- Build n8n workflows (~8 workflows, 30-45 min each)
- Write a simple frontend to manually create leads when needed
Operational complexity: Higher. If n8n breaks, nobody's getting notified of leads. We monitor it. We have backups. We're responsible.
Scaling questions: Yes, but good ones. At what volume does this need a rewrite? PostgreSQL can handle 100K transactions/day. n8n can handle 10K/day without thinking hard. When we hit those limits, we'll optimize. Right now? We're at 1% of those limits.
What Actually Happened
After three months:
- Lead response time: 4 hours → 5 minutes (automated welcome emails)
- Time on admin/busywork: 15 hours/week → 2 hours/week
- Data errors: ~5 per week → 0 per week (one source of truth)
- Tool costs: $330/month → $40/month
- Team morale: Noticeably better. People aren't doing data entry anymore.
The last metric matters more than the money.
Is This Right for Your Team?
Do this if:
- You have 3+ people doing the same repetitive process
- You're paying for multiple tools that could talk to each other
- You have someone in-house (or a contractor) who can code a little
- You want to own your data and systems
Don't do this if:
- Your tech person is already drowning
- You need polished UI/UX (internal tools work fine, but they're rough)
- You change your processes constantly
- You have under 10 transactions/day (the time to set up isn't worth it)
The Real Win: Predictability
The biggest advantage isn't cost or speed. It's predictability.
When you own your automation stack, you know exactly what happens when something breaks. When data flows into PostgreSQL, then gets picked up by n8n, then triggers an email, you can monitor every step.
When you're using Zapier and something breaks? You submit a support ticket and wait.
The Next Level
Once you have this foundation, everything else is easier:
- A/B testing workflows (just change the n8n condition)
- Exporting data (SQL query, done)
- Adding a new step (30 minutes in n8n)
- Onboarding a new team member (show them the database, explain the triggers, they're productive immediately)
We're now thinking about:
- Building a customer dashboard that reads directly from PostgreSQL
- Adding more complex decision logic (not just trigger-based, but conditional)
- Expanding to automated outbound campaigns based on customer behavior
None of that required buying new tools.
Getting Started
If this resonates:
- Audit your current stack. List every tool, what it does, what you pay.
- Find your friction points. Where does data get copied between tools? That's your target.
- Start small. One workflow. One integration. Prove the model works.
- Iterate. Add complexity once the foundation is solid.
The best automation isn't flashy. It's the kind your team doesn't even think about. Data flows, processes run, humans focus on decision-making.
That's when you've actually won.