A lightweight CLI tool for installing webhook triggers in the ODK Central database.
Install PostgreSQL triggers that automatically call remote APIs on ODK Central database events:
- New submission (XML).
- Update entity (entity properties).
- Submission review (approved, hasIssues, rejected).
The centralwebhook tool is a simple CLI that installs or uninstalls database triggers. Once installed, the triggers use the pgsql-http extension to send HTTP requests directly from the database.
- ODK Central running, connected to an accessible PostgreSQL database.
- A POST webhook endpoint on your service API, to call when the selected event occurs.
- The pgsql-http extension installed and enabled in your PostgreSQL database: `CREATE EXTENSION http;`
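If the extension is not yet enabled, a one-off command along these lines can do it (connection details are placeholders, and superuser rights are typically required):

```bash
# Enable the pgsql-http extension; IF NOT EXISTS makes this safe to re-run.
psql 'postgresql://{user}:{password}@{hostname}/{db}' \
  -c 'CREATE EXTENSION IF NOT EXISTS http;'
```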
Note
Using our helper images: We provide PostgreSQL images with the pgsql-http extension pre-installed:
- `ghcr.io/hotosm/postgres:18-http` (based on vanilla PostgreSQL 18 images)
These images are drop-in replacements for standard PostgreSQL images and simply add the extension.
Installing manually: If you don't wish to use these images, you must install the pgsql-http extension yourself. The extension may require superuser privileges to install. If you cannot install it yourself, ask your database administrator.
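As a rough sketch, running the helper image standalone works the same as running a stock PostgreSQL image; the container name, credentials, and database name below are illustrative:

```bash
# Drop-in replacement for a standard postgres image, with pgsql-http available.
docker run -d --name central-db \
  -e POSTGRES_USER=odk \
  -e POSTGRES_PASSWORD=odk \
  -e POSTGRES_DB=odk_central \
  ghcr.io/hotosm/postgres:18-http
```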
The centralwebhook tool is a CLI that installs or uninstalls database triggers. After installation, the triggers run automatically whenever audit events occur in the database.
Install webhook triggers in your database:
```bash
./centralwebhook install \
    -db 'postgresql://{user}:{password}@{hostname}/{db}?sslmode=disable' \
    -updateEntityUrl 'https://your.domain.com/some/webhook' \
    -newSubmissionUrl 'https://your.domain.com/some/webhook' \
    -reviewSubmissionUrl 'https://your.domain.com/some/webhook'
```

Tip
Omit a webhook URL flag if you do not wish to use that particular webhook.
The `-apiKey` flag is optional; see the APIs With Authentication section.
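For example, to install only the entity-update trigger, pass just that URL flag (placeholders as above):

```bash
./centralwebhook install \
    -db 'postgresql://{user}:{password}@{hostname}/{db}?sslmode=disable' \
    -updateEntityUrl 'https://your.domain.com/some/webhook'
```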
Remove webhook triggers from your database:
```bash
./centralwebhook uninstall \
    -db 'postgresql://{user}:{password}@{hostname}/{db}?sslmode=disable'
```

All flags can also be provided via environment variables:
```bash
CENTRAL_WEBHOOK_DB_URI=postgresql://user:pass@localhost:5432/db_name?sslmode=disable
CENTRAL_WEBHOOK_UPDATE_ENTITY_URL=https://your.domain.com/some/webhook
CENTRAL_WEBHOOK_REVIEW_SUBMISSION_URL=https://your.domain.com/some/webhook
CENTRAL_WEBHOOK_NEW_SUBMISSION_URL=https://your.domain.com/some/webhook
CENTRAL_WEBHOOK_API_KEY=ksdhfiushfiosehf98e3hrih39r8hy439rh389r3hy983y
CENTRAL_WEBHOOK_LOG_LEVEL=DEBUG
```
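For example, you might export the variables in your shell (or source them from an env file) and then run the tool without flags; the values below are placeholders:

```bash
# Export the configuration, then invoke the CLI without passing flags.
export CENTRAL_WEBHOOK_DB_URI='postgresql://user:pass@localhost:5432/db_name?sslmode=disable'
export CENTRAL_WEBHOOK_UPDATE_ENTITY_URL='https://your.domain.com/some/webhook'
./centralwebhook install
```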
You can run the CLI tool via Docker:

```bash
docker run --rm ghcr.io/hotosm/central-webhook:latest install \
    -db 'postgresql://{user}:{password}@{hostname}/{db}?sslmode=disable' \
    -updateEntityUrl 'https://your.domain.com/some/webhook' \
    -newSubmissionUrl 'https://your.domain.com/some/webhook' \
    -reviewSubmissionUrl 'https://your.domain.com/some/webhook'
```

Download the binary for your platform from the releases page.
Example webhook payloads sent to your endpoint:

Entity update:

```json
{
    "type": "entity.update.version",
    "id": "uuid:3c142a0d-37b9-4d37-baf0-e58876428181",
    "data": {
        "entityProperty1": "someStringValue",
        "entityProperty2": "someStringValue",
        "entityProperty3": "someStringValue"
    }
}
```

New submission:

```json
{
    "type": "submission.create",
    "id": "uuid:3c142a0d-37b9-4d37-baf0-e58876428181",
    "data": {"xml": "<?xml version='1.0' encoding='UTF-8' ?><data ...."}
}
```

Submission review:

```json
{
    "type": "submission.update",
    "id": "uuid:5ed3b610-a18a-46a2-90a7-8c80c82ebbe9",
    "data": {"reviewState": "hasIssues"}
}
```
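To check your receiving endpoint before installing the triggers, you can replay one of these payloads by hand; the endpoint URL and API key below are placeholders:

```bash
# Simulate a submission-review webhook call against your own endpoint.
curl -X POST 'https://your.domain.com/some/webhook' \
  -H 'Content-Type: application/json' \
  -H 'X-API-Key: your-api-key' \
  -d '{"type": "submission.update", "id": "uuid:5ed3b610-a18a-46a2-90a7-8c80c82ebbe9", "data": {"reviewState": "hasIssues"}}'
```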
Many APIs will not be public and require some form of authentication. There is an optional `-apiKey` flag that can be used to pass an API key / token provided by the application. This value is inserted into the `X-API-Key` request header when the trigger sends HTTP requests. No other authentication methods are supported for now, but feel free to open an issue (or PR!) with a proposal to support other auth methods.
Example:

```bash
./centralwebhook install \
    -db 'postgresql://{user}:{password}@{hostname}/{db}?sslmode=disable' \
    -updateEntityUrl 'https://your.domain.com/some/webhook' \
    -apiKey 'ksdhfiushfiosehf98e3hrih39r8hy439rh389r3hy983y'
```

Here is a minimal FastAPI example for receiving the webhook data:
```python
import logging
from enum import Enum
from typing import Annotated, Optional

from fastapi import APIRouter, Depends, Header
from fastapi.exceptions import HTTPException
from pydantic import BaseModel

log = logging.getLogger(__name__)
router = APIRouter()


class OdkWebhookEvents(str, Enum):
    """Event types sent by the central webhook service (matching the payload 'type' field)."""

    UPDATE_ENTITY = "entity.update.version"
    NEW_SUBMISSION = "submission.create"
    REVIEW_SUBMISSION = "submission.update"


class OdkCentralWebhookRequest(BaseModel):
    """The POST data from the central webhook service."""

    type: OdkWebhookEvents
    # NOTE we cannot use UUID validation, as Central often passes uuid as 'uuid:xxx-xxx'
    id: str
    # NOTE we use a dict to allow flexible passing of the data based on event type
    data: dict


async def valid_api_token(
    x_api_key: Annotated[Optional[str], Header()] = None,
):
    """Check the API token is present for an active database user.

    A header X-API-Key must be provided in the request.
    """
    # Logic to validate the api key here
    return


@router.post("/webhooks/entity-status")
async def update_entity_status_in_fmtm(
    # In the full application this dependency resolves to the authenticated user;
    # here it simply enforces that the X-API-Key header is valid.
    current_user: Annotated[None, Depends(valid_api_token)],
    odk_event: OdkCentralWebhookRequest,
):
    """Update the status for an Entity in our app db."""
    log.debug(f"Webhook called with event ({odk_event.type.value})")
    if odk_event.type == OdkWebhookEvents.UPDATE_ENTITY:
        # insert state into db
        pass
    elif odk_event.type == OdkWebhookEvents.REVIEW_SUBMISSION:
        # update entity status in odk to match review state
        pass
    elif odk_event.type == OdkWebhookEvents.NEW_SUBMISSION:
        # unsupported for now
        log.debug("The handling of new submissions via webhook is not implemented yet.")
    else:
        msg = f"Webhook was called for an unsupported event type ({odk_event.type.value})"
        log.warning(msg)
        raise HTTPException(status_code=400, detail=msg)
```
The tool installs PostgreSQL triggers on the `audits` table that:

- Detect when audit events occur (entity updates, submission creates/updates)
- Format the event data into a JSON payload with `type`, `id`, and `data` fields
- Use the pgsql-http extension to send an HTTP POST request directly from the database
- Include the `X-API-Key` header if provided during installation
The triggers run automatically after installation - no long-running service is needed.
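If you want to confirm the triggers are in place, a query along these lines against the ODK Central database should list them (trigger names depend on the tool's version and are not specified here; this assumes the audits table is on the default search path):

```bash
# List the non-internal triggers currently attached to the audits table.
psql 'postgresql://{user}:{password}@{hostname}/{db}' \
  -c "SELECT tgname, tgenabled FROM pg_trigger WHERE tgrelid = 'audits'::regclass AND NOT tgisinternal;"
```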
- This package uses the standard library and a Postgres driver.
- Binary and container image distribution is automated on each new release.
The test suite depends on a database with the pgsql-http extension installed. The most convenient way to run the tests is via Docker:

```bash
docker compose run --rm webhook
```