
[Sub] Run SQL with Row Cap

A sub-workflow that safely runs agent-generated SELECT queries and caps results at 200 rows. Uses the stickyNote, executeWorkflowTrigger, postgres, and code nodes. Event-driven trigger; 4 nodes.

Category: AI & RAG · Trigger: Event · Nodes: 4 · Complexity: ★★★★☆ · Key nodes: Execute Workflow Trigger, Postgres

The workflow JSON

Copy the full n8n JSON below, paste it into a new n8n workflow, add your Postgres credentials, and activate.
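If your n8n instance has the public API enabled, the JSON can also be imported programmatically instead of pasted into the editor. A sketch, assuming the JSON is saved locally; the host, file name, and API key are placeholders you must replace:

```shell
# Sketch: import the saved workflow JSON via n8n's public REST API.
# Assumes the public API is enabled and an API key has been created
# under Settings -> n8n API on your instance.
curl -X POST "https://your-n8n-host/api/v1/workflows" \
  -H "X-N8N-API-KEY: $N8N_API_KEY" \
  -H "Content-Type: application/json" \
  --data @sub-run-sql-capped.json
```

After importing, open the workflow, attach your Postgres credential to the Run Query node, and save.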

{
  "name": "[Sub] Run SQL with Row Cap",
  "settings": {
    "executionOrder": "v1"
  },
  "nodes": [
    {
      "parameters": {
        "content": "## [Sub] Run SQL with Row Cap\n**Purpose:** Executes agent-generated SELECT queries safely and guarantees the agent never sees more than 200 rows.\n\n**Called by:** main workflow's `run_sql` tool.\n\n**Flow:**\n1. **When Executed by Another Workflow** \u2014 receives `{ query: \"SELECT ...\" }` from the parent Tools Agent.\n2. **Run Query** \u2014 Postgres `executeQuery` runs the SQL string. Needs the Postgres credential (ideally the `n8n_readonly` role from SETUP.sql so the DB itself blocks writes).\n3. **Cap Rows** \u2014 Code node: if > 200 rows, slice and add `truncated:true` + a human-readable note. Prevents token blow-up when agents write `SELECT *` on large tables.\n\nReturns `{ rows, row_count, truncated, total_rows?, note? }` to the agent.",
        "height": 360,
        "width": 540,
        "color": 6
      },
      "type": "n8n-nodes-base.stickyNote",
      "typeVersion": 1,
      "position": [
        -40,
        -420
      ],
      "id": "sticky-sub-run-sql",
      "name": "README"
    },
    {
      "parameters": {
        "inputSource": "passthrough"
      },
      "type": "n8n-nodes-base.executeWorkflowTrigger",
      "typeVersion": 1.1,
      "position": [
        0,
        0
      ],
      "id": "sub-trigger",
      "name": "When Executed by Another Workflow"
    },
    {
      "parameters": {
        "operation": "executeQuery",
        "query": "={{ $json.query }}",
        "options": {}
      },
      "type": "n8n-nodes-base.postgres",
      "typeVersion": 2.6,
      "position": [
        220,
        0
      ],
      "id": "sub-pg",
      "name": "Run Query"
    },
    {
      "parameters": {
        "jsCode": "const items = $input.all();\nconst MAX = 200;\nconst rows = items.map(i => i.json);\nif (rows.length > MAX) {\n  return [{ json: {\n    rows: rows.slice(0, MAX),\n    row_count: MAX,\n    total_rows: rows.length,\n    truncated: true,\n    note: `Showing ${MAX} of ${rows.length} rows. Add a WHERE clause, aggregation, or explicit LIMIT to narrow results.`\n  } }];\n}\nreturn [{ json: { rows, row_count: rows.length, truncated: false } }];"
      },
      "type": "n8n-nodes-base.code",
      "typeVersion": 2,
      "position": [
        440,
        0
      ],
      "id": "sub-cap",
      "name": "Cap Rows"
    }
  ],
  "connections": {
    "When Executed by Another Workflow": {
      "main": [
        [
          {
            "node": "Run Query",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "Run Query": {
      "main": [
        [
          {
            "node": "Cap Rows",
            "type": "main",
            "index": 0
          }
        ]
      ]
    }
  }
}

About this workflow


Source: https://github.com/MinaSaad1/n8n-data-analyst-agent/blob/main/workflows/02-sub-run-sql-capped.json. Credit to the original creator.
