Scheduling social media posts programmatically with cron and an API

The simplest possible automated publishing setup: a cron job that reads from a queue or database and calls a publishing API. A good entry point for developers new to social automation.

You do not need a platform to schedule posts

Social media scheduling tools exist because clicking publish at the right time, on the right platform, with the right content, multiple times a day, is tedious. So people pay for dashboards that queue posts and release them on a schedule.

But if you are a developer, you already have a scheduler. It is called cron. It runs on every server, every VPS, every CI system. It has been running jobs on schedule since the 1970s. It does not need a subscription.

What cron does not have is a way to publish to social media. That is the missing piece. Not the scheduling — the publishing. And if you have an API that handles publishing to multiple platforms in a single call, cron is all you need.

This is the simplest possible automated publishing setup. No workflow engine. No dashboard. No vendor lock-in. A script, a schedule, and an API.

The architecture

┌──────────────┐
│     Cron     │  Runs on schedule
│ (every hour, │  (daily, hourly, whatever)
│  daily, etc) │
└──────┬───────┘
       │
       ▼
┌──────────────┐
│    Script    │  Reads next post from
│              │  database, file, or queue
└──────┬───────┘
       │
       ▼
┌──────────────┐
│  Postproxy   │  Publishes to all
│     API      │  connected platforms
└──────────────┘

The cron job triggers a script. The script reads the next post to publish — from a database, a JSON file, a spreadsheet, a queue, whatever holds your content. The script calls the Postproxy API. The post goes live on every platform you have connected.

That is the entire system.

A minimal example

Here is a complete setup. A JSON file holds the queue. A script reads the next unpublished post and sends it to Postproxy.

The queue file:

[
  {
    "body": "We just shipped real-time notifications. Your dashboard now updates the moment something changes.",
    "media": ["https://yourstorage.com/notifications-hero.png"],
    "published": false
  },
  {
    "body": "Three indexing strategies that cut our p99 query time by 80%. New post on the blog.",
    "media": [],
    "published": false
  },
  {
    "body": "Version 4.2 is out. Highlights: batch imports, better error messages, and a dark mode that actually works.",
    "media": ["https://yourstorage.com/v42-release.png"],
    "published": false
  }
]

The script:

#!/bin/bash
set -euo pipefail

QUEUE_FILE="/path/to/posts.json"
API_KEY="your_postproxy_api_key"

# Find the index of the first unpublished post
IDX=$(jq '[.[].published] | index(false)' "$QUEUE_FILE")
if [ "$IDX" = "null" ]; then
  echo "No posts to publish"
  exit 0
fi

POST=$(jq ".[$IDX]" "$QUEUE_FILE")
BODY=$(echo "$POST" | jq -r '.body')
MEDIA=$(echo "$POST" | jq -c '.media')

# Publish via Postproxy
curl -s -X POST "https://api.postproxy.dev/api/posts" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d "$(jq -n \
    --arg body "$BODY" \
    --argjson media "$MEDIA" \
    '{
      post: { body: $body },
      profiles: ["twitter", "instagram", "linkedin", "threads"],
      media: $media
    }'
  )"

# Mark that post (and only that post) as published
jq ".[$IDX].published = true" "$QUEUE_FILE" > tmp.json && mv tmp.json "$QUEUE_FILE"

echo "Published: $BODY"

The cron entry:

0 9 * * * /path/to/publish.sh >> /var/log/social-posts.log 2>&1

Every day at 9 AM, the script picks the next post from the queue and publishes it. When the queue is empty, it does nothing.

This is not a toy. This is a production-grade publishing pipeline. It has a content source, a schedule, a publishing mechanism, and a log. Most teams do not need more than this.

Using a database instead of a file

A JSON file works for small queues. For anything larger, or for multiple people adding content, a database is more practical.

The schema is minimal:

CREATE TABLE scheduled_posts (
  id SERIAL PRIMARY KEY,
  body TEXT NOT NULL,
  media TEXT[] DEFAULT '{}',
  publish_after TIMESTAMP NOT NULL,
  published_at TIMESTAMP,
  postproxy_id TEXT
);

The script queries for posts that are due and not yet published:

import os
import requests
import psycopg2
from datetime import datetime

conn = psycopg2.connect(os.environ["DATABASE_URL"])
cur = conn.cursor()

# Oldest post that is due and not yet published
cur.execute("""
    SELECT id, body, media
    FROM scheduled_posts
    WHERE publish_after <= %s AND published_at IS NULL
    ORDER BY publish_after
    LIMIT 1
""", (datetime.utcnow(),))

row = cur.fetchone()
if not row:
    print("No posts due")
    raise SystemExit

post_id, body, media = row

response = requests.post(
    "https://api.postproxy.dev/api/posts",
    headers={
        "Authorization": f"Bearer {os.environ['POSTPROXY_API_KEY']}",
        "Content-Type": "application/json",
    },
    json={
        "post": {"body": body},
        "profiles": ["twitter", "instagram", "linkedin", "threads"],
        "media": media or [],
    },
)
response.raise_for_status()  # do not mark as published if the call failed
result = response.json()

cur.execute("""
    UPDATE scheduled_posts
    SET published_at = %s, postproxy_id = %s
    WHERE id = %s
""", (datetime.utcnow(), result.get("id"), post_id))
conn.commit()

print(f"Published post {post_id}: {body[:50]}...")

Run this with cron every 15 minutes, every hour, or however frequently you want to check for due posts:

*/15 * * * * cd /path/to/project && python publish.py >> /var/log/social-posts.log 2>&1

The publish_after column gives you precise scheduling. Insert a row with publish_after = '2026-02-20 14:00:00' and the post goes out at 2 PM on February 20th — or within 15 minutes of it, depending on your cron interval.
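That "within 15 minutes" gap is bounded by your cron interval. A minimal sketch of the relationship (the function name is ours, not part of Postproxy), assuming the cron entry fires on minute boundaries divisible by the interval:

```python
from datetime import datetime, timedelta

def next_cron_tick(publish_after: datetime, interval_minutes: int = 15) -> datetime:
    """First */interval_minutes cron tick at or after publish_after.

    Assumes the entry fires at :00, :15, :30, :45 for a */15 schedule.
    """
    base = publish_after.replace(second=0, microsecond=0)
    remainder = base.minute % interval_minutes
    if remainder == 0 and base == publish_after:
        return base  # already exactly on a tick
    step = interval_minutes - remainder if remainder else interval_minutes
    return base + timedelta(minutes=step)

# A row due at 13:53 goes out on the 14:00 run
print(next_cron_tick(datetime(2026, 2, 20, 13, 53)))
```

If that slack is unacceptable, the next section shows how to hand exact timing to Postproxy instead.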

Adding scheduling to the Postproxy call

Postproxy itself supports scheduled publishing. Instead of relying on cron timing alone, you can create the post immediately and let Postproxy hold it until the right time:

curl -X POST "https://api.postproxy.dev/api/posts" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "post": {
      "body": "Announcing our new CLI tool for database migrations.",
      "scheduled_at": "2026-02-20T14:00:00Z"
    },
    "profiles": ["twitter", "linkedin", "threads"],
    "media": ["https://yourstorage.com/cli-tool-hero.png"]
  }'

This approach separates the concerns. Your cron job handles reading from the queue and creating posts. Postproxy handles the precise timing of when the post goes live. You can run your cron job once a day, create all scheduled posts for the day, and let Postproxy release them at the specified times.

This is useful when exact timing matters — posting at 2:00 PM sharp rather than sometime between 2:00 and 2:15 depending on when cron last ran.
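For the once-a-day variant, the daily cron run only needs to compute one scheduled_at per queued post. A sketch; the 9 AM start and three-hour spacing are arbitrary choices for illustration, not Postproxy defaults:

```python
from datetime import datetime, timedelta, timezone

def slot_times(day: datetime, count: int,
               start_hour: int = 9, spacing_hours: int = 3) -> list[str]:
    """ISO 8601 UTC timestamps for `count` posts spaced through one day."""
    first = day.replace(hour=start_hour, minute=0, second=0,
                        microsecond=0, tzinfo=timezone.utc)
    return [(first + timedelta(hours=spacing_hours * i)).strftime("%Y-%m-%dT%H:%M:%SZ")
            for i in range(count)]

# One scheduled_at per queued post; pass each into the API call above
print(slot_times(datetime(2026, 2, 20), 3))
```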

Where the content comes from

The examples above use a file and a database. In practice, the content source can be anything:

A spreadsheet. Export a Google Sheet or CSV with columns for body, media URL, and publish date. Your script reads the next row. Non-developers on your team can add content to a spreadsheet without touching code.

A content calendar. Tools like Notion, Airtable, or a custom app can serve as the content source. Query their API in your cron script to fetch the next post.

A CMS. If you use a headless CMS like Contentful or Sanity, you can tag entries for social distribution and have your cron script query for unpublished tagged entries.

An LLM. Your cron script calls an LLM API to generate a post, then publishes it via Postproxy. This is the simplest version of an AI content pipeline — a cron job that generates and publishes in one step. Use with caution and consider a human review step for brand accounts.

A queue. Redis, SQS, RabbitMQ — any message queue works. Push posts onto the queue from wherever they originate. The cron script pops the next message and publishes it.

The script does not care where the content comes from. It reads text and optionally a media URL, calls Postproxy, and marks the item as published. Everything upstream is your choice.
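As one concrete example, a CSV export can stand in for any of these sources. A sketch, assuming columns named body, media, and published, with media as a pipe-separated cell (your export's column names may differ):

```python
import csv
import io

def next_unpublished(csv_text: str):
    """Return the first row not marked published, or None if the queue is empty."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row.get("published", "").strip().lower() != "true":
            media = [u for u in row.get("media", "").split("|") if u]
            return {"body": row["body"], "media": media}
    return None

sheet = """body,media,published
Old post,,true
We just shipped dark mode.,https://yourstorage.com/dark.png,false
"""
print(next_unpublished(sheet))
```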

Checking what happened

Publishing to multiple platforms means some might succeed while others fail. After your script publishes, check the outcome:

# After publishing
post_id = result.get("id")

# Check status (immediately, or in a follow-up cron job)
status = requests.get(
    f"https://api.postproxy.dev/api/posts/{post_id}",
    headers={"Authorization": f"Bearer {os.environ['POSTPROXY_API_KEY']}"},
).json()

for platform in status.get("platforms", []):
    print(f"  {platform['platform']}: {platform['status']}")

If a platform failed, you can log it, retry it, or alert someone. Partial success is normal in multi-platform publishing. Your script should expect it rather than treat it as an all-or-nothing operation.
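A small helper for that check, assuming per-platform entries shaped like the status response above; the exact status strings are an assumption to verify against the API documentation:

```python
def split_by_status(platforms: list[dict], ok_status: str = "published"):
    """Partition per-platform results into succeeded and failed platform names."""
    ok = [p["platform"] for p in platforms if p["status"] == ok_status]
    failed = [p["platform"] for p in platforms if p["status"] != ok_status]
    return ok, failed

ok, failed = split_by_status([
    {"platform": "twitter", "status": "published"},
    {"platform": "instagram", "status": "failed"},
])
if failed:
    print(f"Needs attention: {failed}")  # log, retry, or alert here
```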

Cron alternatives

Cron is the simplest scheduler, but not the only one.

GitHub Actions can run on a schedule using cron syntax. Your publishing script lives in a repository, and GitHub runs it:

name: Publish social posts

on:
  schedule:
    - cron: '0 9 * * *'

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python publish.py
        env:
          POSTPROXY_API_KEY: ${{ secrets.POSTPROXY_API_KEY }}
          DATABASE_URL: ${{ secrets.DATABASE_URL }}

No server required. Your content lives in a database or file in the repo. GitHub runs the script daily.

Serverless scheduled functions. AWS Lambda with EventBridge, Google Cloud Functions with Cloud Scheduler, Vercel Cron Jobs — all of these let you run a function on a schedule without maintaining a server.

systemd timers. If you are on a Linux server and want more control than cron provides — randomized delays, dependency management, better logging — systemd timers are the modern alternative.
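A sketch of the systemd equivalent of the cron entry above (unit and file names are ours; adjust the script path):

```ini
# /etc/systemd/system/social-posts.service
[Unit]
Description=Publish the next queued social post

[Service]
Type=oneshot
ExecStart=/path/to/publish.sh

# /etc/systemd/system/social-posts.timer
[Unit]
Description=Run social-posts.service daily at 9 AM

[Timer]
OnCalendar=*-*-* 09:00:00
RandomizedDelaySec=300
Persistent=true

[Install]
WantedBy=timers.target
```

Enable it with systemctl enable --now social-posts.timer. Output then lands in the journal (journalctl -u social-posts) instead of a hand-rolled log file, and Persistent=true catches up on runs missed while the machine was off.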

The mechanism does not matter. What matters is: something runs on a schedule, reads the next post, and calls the Postproxy API.

Why this works

The reason a cron-based setup is viable is that the hard part of social media publishing is not scheduling. Scheduling is a solved problem. Every operating system, every cloud provider, every CI system can run something on a schedule.

The hard part is publishing. Eight platforms, each with its own authentication, its own upload protocol, its own format constraints, its own failure modes. That is the part that used to require a platform, a dashboard, a subscription.

Postproxy handles that part. It turns multi-platform publishing into a single HTTP call. And when publishing is a single HTTP call, you do not need a platform to schedule it. You need cron.

A script, a schedule, and an API. The simplest possible automated publishing setup — and often the only one you need.

Get your API key and start publishing from Postproxy.
