# Batch Operations

Fetch multiple posts efficiently.
When you need data for several Reddit posts at once, batching them into a single request is almost always cheaper and faster than making individual calls.
## When to use batch
The batch endpoint fetches up to 25 posts by ID in a single request for 5 credits. Individual post lookups cost 1 credit each.
| Approach | Posts fetched | Cost |
|---|---|---|
| 3 individual calls | 3 | 3 credits |
| 1 batch call | 3 | 5 credits |
| 1 batch call | 6 | 5 credits |
| 1 batch call | 25 | 5 credits |
Batching breaks even at 5 posts: five individual calls cost the same 5 credits as one batch call. For 6 or more IDs, the batch endpoint is always cheaper.
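The cost arithmetic above can be sketched as a small helper. This is illustrative only: the 1-credit individual and 5-credit batch prices come from the table, and the names (`individualCost`, `batchCost`, `cheaperStrategy`) are hypothetical, not part of the API:

```javascript
// Cost model from the table: 1 credit per individual lookup,
// 5 credits per batch call of up to 25 IDs.
const INDIVIDUAL_COST = 1;
const BATCH_COST = 5;
const BATCH_SIZE = 25;

// Credits spent fetching `n` posts one at a time.
function individualCost(n) {
  return n * INDIVIDUAL_COST;
}

// Credits spent fetching `n` posts via batch calls of up to 25 IDs each.
function batchCost(n) {
  return Math.ceil(n / BATCH_SIZE) * BATCH_COST;
}

// Pick the strictly cheaper strategy; ties go to batch.
function cheaperStrategy(n) {
  return individualCost(n) < batchCost(n) ? "individual" : "batch";
}

console.log(cheaperStrategy(3));  // "individual" (3 vs 5 credits)
console.log(cheaperStrategy(6));  // "batch" (6 vs 5 credits)
console.log(cheaperStrategy(60)); // "batch" (60 vs 15 credits)
```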
## Batch request

Via cURL:

```shell
curl -H "x-api-key: $REDDGROW_API_KEY" \
  "https://api.reddgrow.ai/agent/posts/batch?ids=abc123,def456,ghi789"
```

Or via the CLI:

```shell
reddgrow posts batch abc123 def456 ghi789
```

Response:

```json
{
  "posts": [
    {
      "id": "abc123",
      "subreddit": "typescript",
      "title": "TypeScript 5.4 is out",
      "score": 1842,
      "url": "https://devblogs.microsoft.com/...",
      "num_comments": 134,
      "created_utc": 1710000000
    },
    {
      "id": "def456",
      "subreddit": "javascript",
      "title": "Why I switched to TypeScript",
      "score": 674,
      "url": "https://example.com/why-typescript",
      "num_comments": 89,
      "created_utc": 1709900000
    },
    {
      "id": "ghi789",
      "subreddit": "webdev",
      "title": "TypeScript generics explained",
      "score": 412,
      "num_comments": 31,
      "created_utc": 1709800000
    }
  ]
}
```

## Common pattern: search then batch
A typical agent workflow is to search for relevant posts (3 credits), collect the IDs, then batch-fetch full details (5 credits): 8 credits total, instead of 3 + N credits for the search plus N individual lookups.
```javascript
// Step 1: search for posts (3 credits)
const searchRes = await fetch(
  "https://api.reddgrow.ai/agent/search/posts?q=typescript+generics&limit=20",
  { headers: { "x-api-key": process.env.REDDGROW_API_KEY } }
);
const { posts: searchResults } = await searchRes.json();

// Extract the post IDs as a comma-separated list
const ids = searchResults.map((p) => p.id).join(",");

// Step 2: batch-fetch full post data (5 credits)
const batchRes = await fetch(
  `https://api.reddgrow.ai/agent/posts/batch?ids=${ids}`,
  { headers: { "x-api-key": process.env.REDDGROW_API_KEY } }
);
const { posts } = await batchRes.json();
```

## Chunking large ID lists
The batch endpoint accepts up to 25 IDs per request. If you have more, split them into chunks:
```javascript
// Split an array into chunks of at most `size` elements.
function chunk(arr, size) {
  const chunks = [];
  for (let i = 0; i < arr.length; i += size) {
    chunks.push(arr.slice(i, i + size));
  }
  return chunks;
}

// Fetch any number of posts, 25 IDs per batch call, in parallel.
async function batchFetch(ids) {
  const groups = chunk(ids, 25);
  const results = await Promise.all(
    groups.map((group) =>
      fetch(
        `https://api.reddgrow.ai/agent/posts/batch?ids=${group.join(",")}`,
        { headers: { "x-api-key": process.env.REDDGROW_API_KEY } }
      ).then((r) => r.json())
    )
  );
  return results.flatMap((r) => r.posts);
}

// Fetch 60 posts in 3 batch calls (15 credits) instead of 60 individual calls (60 credits)
const posts = await batchFetch(sixtyIds);
```

## CLI batch with multiple IDs
Pass IDs as space-separated arguments. The CLI handles chunking automatically:
```shell
reddgrow posts batch \
  abc123 def456 ghi789 \
  jkl012 mno345 pqr678
```

## Tips
- Collect IDs from search or subreddit listings before batching — never guess post IDs
- Up to 25 IDs = 1 batch call (5 credits). 26–50 IDs = 2 batch calls (10 credits)
- If you only need 1–4 posts, individual calls are cheaper (at 5 posts the cost is identical)
- Batch results are returned in the same order as the input IDs
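Because results come back in input order, you can zip them against your ID list; if you prefer keyed access instead, a lookup map is a one-liner. A minimal sketch, using a hand-written `posts` array shaped like the batch response above:

```javascript
// Sample of the `posts` array shape from a batch response.
const posts = [
  { id: "abc123", title: "TypeScript 5.4 is out", score: 1842 },
  { id: "def456", title: "Why I switched to TypeScript", score: 674 },
];

// Build an id → post map so downstream code doesn't depend on array order.
const byId = new Map(posts.map((p) => [p.id, p]));

console.log(byId.get("def456").score); // 674
```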