Build a Video Processing Pipeline with FFmpeg API

Build a scalable video processing pipeline using ffpipe's FFmpeg API. Transcode, generate HLS, extract thumbnails — all via simple REST calls.

ffpipe Team
· Updated Apr 14, 2026

A video processing pipeline is a sequence of automated operations — transcoding, thumbnail generation, watermarking, and packaging — applied to video files via API calls instead of manual FFmpeg commands. ffpipe provides a stateless REST API that exposes FFmpeg operations as JSON payloads, enabling developers to build production-grade pipelines without managing servers, encoding queues, or FFmpeg binaries.

Key Takeaways

  • Single API call replaces 15+ FFmpeg flags for HLS adaptive streaming
  • Async job model with webhooks — no polling required in production
  • Supports multi-step pipelines: transcode → watermark → thumbnail in one request
  • Free tier: 100 processing minutes/month, no credit card required

You need to transcode user uploads into HLS, generate thumbnails, and slap on a watermark — all before the video hits your CDN. Doing this with raw FFmpeg on your own infrastructure is a week of DevOps work you didn’t budget for.

FFmpeg is the most powerful media tool ever built. It’s also a nightmare in production. You’re managing CLI flags nobody remembers, babysitting encoding processes that eat CPU, handling format edge cases across thousands of user uploads, and praying your server doesn’t OOM on a 4K file. Scaling means provisioning GPU instances, building job queues, and writing retry logic. All of that before you write a single line of product code.

How ffpipe Gives You an FFmpeg API

ffpipe wraps the full power of FFmpeg into a stateless REST API. You send a JSON payload describing your video processing pipeline — transcode, thumbnail, watermark, whatever — and ffpipe runs it in an isolated container at scale. Input and output are URLs: pass an S3 presigned link, get results written back to your bucket. No file uploads. No server management. No FFmpeg installed anywhere on your stack.

Every job is async. Submit it, get a job ID, and either poll for status or catch a webhook when it’s done.
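
If you do poll during development, a small helper keeps the loop tidy. This is a sketch, not part of any official ffpipe SDK: the fetch_status callable is injected so the loop stays testable (in practice it would wrap client.get(f"/jobs/{job_id}").json()), and the terminal status values are assumed from the lifecycle shown later in this post.

```python
import time

# Terminal statuses assumed from the job lifecycle in this post:
# "queued" | "processing" | "completed" | "failed"
TERMINAL_STATUSES = {"completed", "failed"}

def wait_for_job(job_id, fetch_status, interval=2.0, timeout=300.0):
    """Poll fetch_status(job_id) until the job reaches a terminal state.

    fetch_status is any callable returning a status dict, e.g. a thin
    wrapper around client.get(f"/jobs/{job_id}").json().
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        status = fetch_status(job_id)
        if status["status"] in TERMINAL_STATUSES:
            return status
        time.sleep(interval)
    raise TimeoutError(f"Job {job_id} did not finish within {timeout}s")
```

For production, prefer webhooks (covered below in best practices) and keep polling like this for local testing only.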

Quick Start: Your First Video Transcode

Let’s transcode a video to H.264 MP4 at 720p. This is the “hello world” of video processing pipelines.

curl

curl -X POST https://api.ffpipe.io/v1/jobs \
  -H "Authorization: Bearer fp_live_abc123def456" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/processed/video-001.mp4",
    "pipeline": [
      {
        "operation": "transcode",
        "codec": "h264",
        "resolution": "1280x720",
        "bitrate": "2500k",
        "preset": "fast"
      }
    ]
  }'

Response:

{
    "job_id": "job_7f3a9b2c1d4e",
    "status": "queued",
    "created_at": "2025-01-15T10:32:00Z",
    "webhook_url": null,
    "estimated_duration_seconds": 45
}

Python

import httpx

client = httpx.Client(
    base_url="https://api.ffpipe.io/v1",
    headers={"Authorization": "Bearer fp_live_abc123def456"}
)

response = client.post("/jobs", json={
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/processed/video-001.mp4",
    "pipeline": [
        {
            "operation": "transcode",
            "codec": "h264",
            "resolution": "1280x720",
            "bitrate": "2500k",
            "preset": "fast"
        }
    ]
})

job = response.json()
print(f"Job submitted: {job['job_id']}")

To check on the job:

status = client.get(f"/jobs/{job['job_id']}").json()
print(status["status"])  # "queued" | "processing" | "completed" | "failed"

That’s it. No FFmpeg binary. No subprocess calls. No parsing stdout for progress.

Building a Multi-Step Pipeline: Transcode to HLS

Real video processing pipelines chain operations. A common flow: take a raw upload, transcode it to multiple resolutions, and package everything as HLS for adaptive bitrate streaming.

ffpipe handles this in a single API call using pipeline mode:

curl -X POST https://api.ffpipe.io/v1/jobs \
  -H "Authorization: Bearer fp_live_abc123def456" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/hls/video-001/",
    "pipeline": [
      {
        "operation": "transcode",
        "codec": "h264",
        "variants": [
          {"resolution": "1920x1080", "bitrate": "5000k"},
          {"resolution": "1280x720", "bitrate": "2500k"},
          {"resolution": "854x480", "bitrate": "1000k"}
        ]
      },
      {
        "operation": "package",
        "format": "hls",
        "segment_duration": 6,
        "master_playlist": true
      }
    ],
    "webhook_url": "https://myapp.com/webhooks/ffpipe"
  }'

This single request produces a full HLS output directory with a master playlist and three quality variants. ffpipe handles the multi-pass encoding, segment splitting, and playlist generation internally. The webhook fires when all variants are done.

Compare that to the equivalent local FFmpeg command — you’re looking at 15+ flags, multiple output streams, and a bash script to orchestrate it. With ffpipe’s video transcoding API, it’s one JSON payload.
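
If your rendition ladder varies per source, you can generate the variants array programmatically instead of hard-coding it. A minimal sketch — the height-to-bitrate mapping below mirrors the example request above and is illustrative, not an official ffpipe preset:

```python
# Illustrative ladder: target height -> (resolution, bitrate).
# Values copied from the example request above, not an ffpipe recommendation.
LADDER = {
    1080: ("1920x1080", "5000k"),
    720: ("1280x720", "2500k"),
    480: ("854x480", "1000k"),
}

def build_variants(source_height):
    """Return HLS variants no taller than the source video, highest first."""
    return [
        {"resolution": res, "bitrate": rate}
        for height, (res, rate) in sorted(LADDER.items(), reverse=True)
        if height <= source_height
    ]
```

A 720p upload then gets only the 720p and 480p renditions, which avoids paying to upscale video that has no extra detail to show.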

Generate Thumbnails and Preview GIFs

Every video platform needs thumbnails. Instead of running a separate FFmpeg process, add a thumbnail step to your pipeline:

response = client.post("/jobs", json={
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/thumbs/video-001.jpg",
    "pipeline": [
        {
            "operation": "thumbnail",
            "timestamp": "00:00:05",
            "resolution": "640x360",
            "format": "jpg"
        }
    ]
})

Need an animated preview GIF for hover states? Swap the operation:

response = client.post("/jobs", json={
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/previews/video-001.gif",
    "pipeline": [
        {
            "operation": "gif",
            "start": "00:00:02",
            "duration": 4,
            "resolution": "320x180",
            "fps": 10
        }
    ]
})

Watermark Videos at Scale

Whether for brand protection or content attribution, watermarking is a common media automation task that’s annoying to get right with raw FFmpeg filter graphs.

curl -X POST https://api.ffpipe.io/v1/jobs \
  -H "Authorization: Bearer fp_live_abc123def456" \
  -H "Content-Type: application/json" \
  -d '{
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/watermarked/video-001.mp4",
    "pipeline": [
      {
        "operation": "transcode",
        "codec": "h264",
        "resolution": "1920x1080",
        "bitrate": "5000k"
      },
      {
        "operation": "overlay",
        "image": "https://my-bucket.s3.amazonaws.com/assets/logo.png",
        "position": "bottom_right",
        "opacity": 0.7,
        "padding": 20
      }
    ]
  }'

Transcode and watermark in one pass. No temp files, no intermediate storage costs.
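
If several parts of your app apply watermarks, wrapping the two-step pipeline in a helper keeps the payloads consistent. A sketch — the parameter names simply mirror the request above:

```python
def watermark_pipeline(logo_url, resolution="1920x1080", bitrate="5000k",
                       position="bottom_right", opacity=0.7, padding=20):
    """Build a transcode + overlay pipeline list for an ffpipe job payload.

    Defaults mirror the example request in this post.
    """
    return [
        {"operation": "transcode", "codec": "h264",
         "resolution": resolution, "bitrate": bitrate},
        {"operation": "overlay", "image": logo_url,
         "position": position, "opacity": opacity, "padding": padding},
    ]
```

You would pass the result as the "pipeline" field of a job request, alongside your input and output URLs.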

Extract Audio Tracks

Building a podcast feature or need audio-only versions for bandwidth-constrained users? Extract the audio track without re-encoding the video:

response = client.post("/jobs", json={
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/audio/video-001.mp3",
    "pipeline": [
        {
            "operation": "extract_audio",
            "codec": "mp3",
            "bitrate": "192k",
            "sample_rate": 44100
        }
    ]
})

This completes in seconds since there’s no video encoding involved.

Best Practices for Production Pipelines

Use webhooks, not polling. For production workloads, always set a webhook_url on your jobs. Polling works for testing, but webhooks eliminate unnecessary API calls and give you near-instant completion notifications.

response = client.post("/jobs", json={
    "input": "https://my-bucket.s3.amazonaws.com/raw/video-001.mov",
    "output": "https://my-bucket.s3.amazonaws.com/processed/video-001.mp4",
    "pipeline": [
        {"operation": "transcode", "codec": "h264", "resolution": "1280x720", "bitrate": "2500k"}
    ],
    "webhook_url": "https://myapp.com/webhooks/ffpipe",
    "metadata": {"user_id": "usr_42", "upload_id": "upl_99"}
})

The metadata field passes through to the webhook payload — use it to correlate jobs with your application state without maintaining a separate lookup table.
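
A webhook receiver can stay framework-agnostic: parse the payload, read back the metadata you attached, and route on status. The payload shape below (job_id, status, metadata, error) is assumed from the fields shown in this post — verify it against the official webhook schema:

```python
def handle_ffpipe_webhook(payload, on_complete, on_failure):
    """Dispatch a parsed ffpipe webhook payload to application callbacks.

    Payload fields (job_id, status, metadata, error) are assumed from the
    examples in this post; check the webhook docs for the exact schema.
    """
    metadata = payload.get("metadata", {})
    if payload["status"] == "completed":
        on_complete(payload["job_id"], metadata)
    elif payload["status"] == "failed":
        on_failure(payload["job_id"], metadata, payload.get("error"))
```

In a web framework you would call this from your POST route after JSON-decoding the request body, with callbacks that update the upload record identified by the metadata.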

Handle failures gracefully. Check for failed status and inspect the error field:

status = client.get(f"/jobs/{job['job_id']}").json()

if status["status"] == "failed":
    print(f"Job failed: {status['error']['message']}")
    # Common: unsupported codec in source, corrupt file, invalid output URL
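
Since failed jobs can be retried by re-submitting the same payload, a small wrapper captures that pattern. A sketch with an injected submit callable (in practice, something that submits the job and waits for its terminal status):

```python
def submit_with_retry(submit, payload, max_attempts=3):
    """Re-submit a job payload until it succeeds or attempts run out.

    submit(payload) is any callable that runs the job to completion and
    returns the final status dict, e.g. an ffpipe submit-and-wait wrapper.
    """
    last_error = None
    for _attempt in range(max_attempts):
        result = submit(payload)
        if result["status"] != "failed":
            return result
        last_error = result.get("error", {}).get("message", "unknown error")
    raise RuntimeError(f"Job failed after {max_attempts} attempts: {last_error}")
```

Only retry failures that can plausibly be transient; a corrupt source file or unsupported codec will fail the same way every time, so surface those to the user instead.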

Batch wisely. If you’re processing hundreds of videos, submit jobs in parallel. Each ffpipe job runs in its own container — there’s no shared resource contention. Fire off 50 jobs simultaneously and let webhooks handle completion.
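
Parallel submission needs only a thread pool, since each call is a short HTTP request. A sketch with an injected submit callable (in practice, a wrapper around client.post("/jobs", json=payload).json()):

```python
from concurrent.futures import ThreadPoolExecutor

def submit_batch(submit, payloads, max_workers=50):
    """Submit many job payloads concurrently; return responses in order.

    submit(payload) is any callable that POSTs one job and returns the
    response dict, e.g. a thin wrapper around client.post("/jobs", ...).
    """
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(submit, payloads))
```

Because the jobs complete via webhooks, this function only needs to fan out the submissions; there is nothing to wait on afterward.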

Next Steps

ffpipe turns FFmpeg into a production-grade video processing pipeline you can call from anywhere. No servers to manage, no encoding queues to build, no FFmpeg versions to track.

  1. Sign up for ffpipe — free tier includes 100 minutes of processing per month
  2. Read the API docs — full reference for all pipeline operations, codecs, and output formats
  3. Explore pipeline templates — pre-built JSON configs for HLS, DASH, social media formats, and more

Stop wrestling with FFmpeg on your servers. Ship the video feature your users are waiting for.


Frequently asked questions

What is an FFmpeg API?

An FFmpeg API is a cloud service that wraps the FFmpeg media processing tool into a REST API. Instead of installing FFmpeg locally and running CLI commands, you send HTTP requests with JSON payloads describing your processing pipeline, and the API returns processed video URLs.

How does ffpipe differ from running FFmpeg directly?

ffpipe runs FFmpeg in isolated cloud containers at scale. You don’t manage servers, install binaries, or write queue logic. Each job is stateless and async — submit via POST, get results via webhook. ffpipe handles parallelization, retries, and codec updates automatically.

Can I chain multiple operations in one API call?

Yes. ffpipe’s pipeline mode lets you chain operations like transcode → watermark → thumbnail in a single JSON payload. The API handles the sequencing internally, with no temp files or intermediate storage on your end.

What happens if a job fails?

Failed jobs return a failed status with an error field describing the issue (unsupported codec, corrupt file, invalid output URL). You can implement retry logic by re-submitting the same payload. For production workloads, use webhooks to receive failure notifications immediately.


Glossary

  • Transcoding: Converting video from one codec/format to another (e.g., MOV to H.264 MP4).
  • HLS (HTTP Live Streaming): Apple’s adaptive bitrate streaming protocol that segments video into small chunks with a .m3u8 playlist.
  • Pipeline mode: Chaining multiple video operations in a single API request (e.g., transcode + watermark + thumbnail).
  • Webhook: A callback URL that receives an HTTP POST when a job completes, eliminating the need for status polling.
  • Presigned URL: A temporary S3 URL that grants time-limited read/write access to a specific file.