
Migrating From Temporal

This guide explains how to migrate a Temporal application to DBOS, with a focus on how each major Temporal feature translates to DBOS.

info: For a high-level comparison of DBOS and Temporal's architectures, see Comparing DBOS and Temporal.

Workflows

The core feature of both DBOS and Temporal is durably executed workflows. Both systems automatically recover workflows from the last completed step (activity, in Temporal's terms) after any failure, and both support extremely long-running workflows, including workflows that run for weeks or months.

Temporal:

@workflow.defn
class OrderWorkflow:
    @workflow.run
    async def run(self, order: Order) -> str:
        result = await workflow.execute_activity(
            validate_order,
            order,
            start_to_close_timeout=timedelta(seconds=30),
        )
        confirmation = await workflow.execute_activity(
            process_payment,
            result,
            start_to_close_timeout=timedelta(seconds=60),
        )
        return confirmation

DBOS:

@DBOS.workflow()
def order_workflow(order: Order) -> str:
    result = validate_order(order)
    confirmation = process_payment(result)
    return confirmation

Learn more in the workflows tutorial.

Starting Workflows

In Temporal, workflows are started through a client connected to the Temporal server. The workflow task is then picked up by a worker, which executes the workflow logic.

In DBOS, workflows can be started directly within your application process. Alternatively, you can enqueue workflows from a separate process using the DBOS Client, which connects directly to the DBOS system database.

Temporal:

client = await Client.connect("localhost:7233")
handle = await client.start_workflow(
    OrderWorkflow.run,
    order,
    id="order-123",
    task_queue="orders",
)
result = await handle.result()

DBOS:

# Starting a workflow from within your application
with SetWorkflowID("order-123"):
    handle = DBOS.start_workflow(order_workflow, order)
result = handle.get_result()

# Starting a workflow from another application using the DBOS Client
client = DBOSClient(system_database_url=os.environ["DBOS_SYSTEM_DATABASE_URL"])
handle = client.enqueue({"workflow_name": "order_workflow", "queue_name": "orders"}, order)
result = handle.get_result()

Learn more in the workflows tutorial.

Workflow IDs and Idempotency

Both systems support workflow IDs to provide idempotent execution.

In Temporal, the workflow ID is passed when starting a workflow. In DBOS, you set the workflow ID before invoking the workflow. If a workflow with the same ID has already executed, DBOS returns the previously recorded result instead of running the workflow again.

One important difference is how workflow executions are identified. Temporal uniquely identifies an execution using a combination of workflow ID and run ID, so a workflow may have multiple run instances over time. DBOS, by contrast, treats each execution as uniquely identified by its workflow ID, so a workflow ID corresponds to exactly one execution.

with SetWorkflowID("payment-idempotency-key"):
    order_workflow(order)
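The idempotency guarantee can be pictured with a toy sketch. This is not the DBOS implementation (which checkpoints results in Postgres); it is a plain-Python illustration of the rule stated above: a workflow ID that has already executed returns the recorded result instead of running again.

```python
# Toy illustration (not DBOS itself): workflow IDs make execution idempotent.
# A real system records results durably in the database; this sketch uses an
# in-memory dict keyed by workflow ID.
results: dict[str, str] = {}
executions = 0

def run_workflow(workflow_id: str, order: str) -> str:
    global executions
    if workflow_id in results:       # this ID already executed:
        return results[workflow_id]  # return the recorded result
    executions += 1                  # otherwise run the workflow once
    result = f"confirmed-{order}"
    results[workflow_id] = result    # and record its result
    return result

first = run_workflow("payment-idempotency-key", "order-123")
second = run_workflow("payment-idempotency-key", "order-123")
assert first == second and executions == 1  # the workflow body ran only once
```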

Learn more in the workflows tutorial.

Determinism

Both DBOS and Temporal require workflows to be deterministic. Non-deterministic operations (API calls, random numbers, current time) must happen inside activities/steps, not directly in the workflow function.
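To see why this rule matters, consider what happens during recovery: the workflow function re-executes from the top, but each completed step returns its checkpointed result instead of running again, so the replay must take the same path as the original execution. The following toy sketch (not the DBOS runtime) simulates that replay with an in-memory checkpoint list:

```python
import random

# Toy sketch of replay-from-checkpoints (not the DBOS runtime). Completed
# steps return their recorded results during replay, so non-determinism
# inside a step is safe, while non-determinism in the workflow body is not.
checkpoints: list[int] = []

def run_step(fn, cursor: list[int]):
    """Run a step, or return its checkpointed result when replaying."""
    if cursor:
        return cursor.pop(0)   # replaying: reuse the recorded result
    result = fn()              # first execution: run and checkpoint
    checkpoints.append(result)
    return result

def workflow(cursor: list[int]) -> int:
    # random.randint is non-deterministic, so it runs inside a step;
    # the workflow logic only consumes the step's recorded result.
    discount = run_step(lambda: random.randint(1, 100), cursor)
    return discount * 2

original = workflow([])                  # first execution, checkpointing
recovered = workflow(list(checkpoints))  # simulated crash + replay
assert original == recovered             # deterministic despite randomness
```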

Durable Timers

Temporal's workflow.sleep() maps directly to DBOS.sleep(). Both are durable and persist across restarts.

Temporal:

await workflow.sleep(timedelta(hours=24))

DBOS:

DBOS.sleep(86400)  # seconds

Learn more in the workflows tutorial.

Continue-as-New

A common pattern in Temporal is to use an extremely long-running workflow as a durable object. Applications interact with it via signals and queries and periodically refresh its state with continue_as_new to avoid Temporal's workflow size limits.

In DBOS, there are no workflow history limits beyond the underlying database column and storage limits. However, instead of maintaining extremely long-lived workflows, which can slow down replay during recovery, we generally recommend storing long-lived objects directly in your database and interacting with them through shorter-lived workflows.

To coordinate those interactions, you can use DBOS durable queues, especially partitioned queues, to control concurrency and ordering. This approach provides similar guarantees while avoiding the complexity of managing extremely long-running workflows.
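The per-tenant ordering that partitioned queues provide can be sketched in plain Python. This is an illustration of the idea, not the DBOS queue API: each tenant gets its own sub-queue, so one tenant's work is processed in order and a busy tenant cannot starve the others.

```python
from collections import defaultdict, deque

# Toy sketch of the partitioned-queue idea (not the DBOS API): one FIFO
# sub-queue per tenant, drained round-robin so tenants progress fairly.
partitions: dict[str, deque] = defaultdict(deque)

def enqueue(tenant: str, task: str) -> None:
    partitions[tenant].append(task)

def drain_round_robin() -> list[str]:
    """Take one task per tenant per round, preserving per-tenant order."""
    processed = []
    while any(partitions.values()):
        for tenant, q in partitions.items():
            if q:
                processed.append(f"{tenant}:{q.popleft()}")
    return processed

enqueue("acme", "t1"); enqueue("acme", "t2"); enqueue("globex", "t1")
order = drain_round_robin()
assert order.index("acme:t1") < order.index("acme:t2")  # per-tenant FIFO
```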

If you have workflows with many steps, another useful pattern is to use an outer control workflow that orchestrates smaller sub-workflows (child workflows). This improves observability (because you can easily isolate each sub-workflow) and can speed up recovery and replay.

Activities → Steps

Temporal activities map to DBOS steps. Both are where side effects and non-deterministic operations happen.

The key architectural difference is how they are executed. In Temporal, activities are dispatched to workers, often running in separate processes, through the Temporal server. This introduces a network round trip between the workflow and the worker executing the activity. Temporal also supports local activities that run in the same process as the workflow, but they come with several limitations.

In DBOS, steps run in the same process as the workflow and are invoked like regular function calls. DBOS automatically checkpoints the step's result to your database, guaranteeing durability without requiring a separate worker process. Because execution happens in place, steps typically have lower latency and less overhead compared to remotely dispatched activities.

Temporal:

@activity.defn
async def send_email(to: str, body: str) -> bool:
    response = requests.post(EMAIL_API, json={"to": to, "body": body})
    return response.ok

DBOS:

@DBOS.step()
def send_email(to: str, body: str) -> bool:
    response = requests.post(EMAIL_API, json={"to": to, "body": body})
    return response.ok

Learn more in the steps tutorial.

Retries

Both systems support configurable retries with exponential backoff.

Temporal:

result = await workflow.execute_activity(
    send_email,
    args=[to, body],
    start_to_close_timeout=timedelta(seconds=30),
    retry_policy=RetryPolicy(
        initial_interval=timedelta(seconds=1),
        backoff_coefficient=2.0,
        maximum_attempts=5,
    ),
)

DBOS:

@DBOS.step(retries_allowed=True, max_attempts=5, interval_seconds=1.0, backoff_rate=2.0)
def send_email(to: str, body: str) -> bool:
    response = requests.post(EMAIL_API, json={"to": to, "body": body})
    return response.ok

Learn more in the steps tutorial.

Heartbeats

Temporal activities support heartbeats for long-running operations so the server knows the activity is still alive. DBOS does not require heartbeats because there is no central orchestrator monitoring activity execution; instead, steps run directly in your application process.

Database Operations

DBOS provides a special type of step called a transaction that executes database operations in a single database transaction, co-committed with the DBOS checkpoint. This provides exactly-once semantics for database writes, which is stronger than the at-least-once semantics offered by Temporal.

Learn more in the transactions tutorial.

@DBOS.transaction()
def update_order_status(order_id: str, status: str) -> None:
    DBOS.sql_session.execute(
        text("UPDATE orders SET status = :status WHERE id = :id"),
        {"status": status, "id": order_id},
    )

Signals → Messages

Temporal signals allow external processes to send data to a running workflow. In DBOS, the equivalent mechanism is messages (notifications), which external processes send using send() and workflows read using recv().

Temporal:

# In the workflow
@workflow.defn
class OrderWorkflow:
    def __init__(self):
        self.payment_status = None

    @workflow.signal
    async def payment_received(self, status: str):
        self.payment_status = status

    @workflow.run
    async def run(self, order: Order):
        # ... start order processing ...
        await workflow.wait_condition(lambda: self.payment_status is not None)
        if self.payment_status == "paid":
            ...  # handle success
        else:
            ...  # handle failure

# Sending the signal
handle = client.get_workflow_handle("order-123")
await handle.signal(OrderWorkflow.payment_received, "paid")

DBOS:

# In the workflow
@DBOS.workflow()
def order_workflow(order: Order):
    # ... start order processing ...
    payment_status = DBOS.recv("payment_status", timeout_seconds=3600)
    if payment_status == "paid":
        ...  # handle success
    else:
        ...  # handle failure

# Sending the message
DBOS.send("order-123", "paid", topic="payment_status")

Learn more in the workflow communication tutorial.

Messages are persisted to the database, so they remain available even after the workflow completes.

Queries → Events

Temporal queries allow external code to synchronously read the state of a workflow. In DBOS, the equivalent mechanism is events, which workflows publish using set_event() and external processes read using get_event().

Temporal:

@workflow.defn
class OrderWorkflow:
    def __init__(self):
        self.progress = 0

    @workflow.query
    def get_progress(self) -> int:
        return self.progress

    @workflow.run
    async def run(self, order: Order):
        self.progress = 25
        await workflow.execute_activity(validate_order, order, ...)
        self.progress = 50
        # ...

# Querying workflow state
handle = client.get_workflow_handle("order-123")
progress = await handle.query(OrderWorkflow.get_progress)

DBOS:

@DBOS.workflow()
def order_workflow(order: Order):
    DBOS.set_event("progress", 25)
    validate_order(order)
    DBOS.set_event("progress", 50)
    # ...

# Reading workflow state
progress = DBOS.get_event("order-123", "progress")

Learn more in the workflow communication tutorial.

Events are persisted to the database, so they remain available even after the workflow completes.

Task Queues → Queues

Temporal task queues control which workers execute which workflows. DBOS queues serve a similar purpose but also provide built-in advanced concurrency control and rate limiting.

Temporal:

# Worker listens to a task queue
worker = Worker(
    client,
    task_queue="order-processing",
    workflows=[OrderWorkflow],
    activities=[validate_order, process_payment],
)
await worker.run()

# Start workflow on a specific queue
handle = await client.start_workflow(
    OrderWorkflow.run, order, id="order-123", task_queue="order-processing"
)

DBOS:

# Define a queue with concurrency limits
order_queue = Queue("order-processing", concurrency=10)

# Enqueue a workflow
handle = order_queue.enqueue(order_workflow, order)
result = handle.get_result()

Learn more in the queues tutorial.

DBOS queues provide features that Temporal task queues don't have out of the box:

  • Global concurrency limits: Limit total concurrent executions across all workers.
  • Per-worker concurrency: Limit concurrent executions per process.
  • Global rate limiting: Limit executions per time period across all workers.
  • Partitioned queues: Create per-tenant sub-queues with independent concurrency limits.
  • Priority: Process higher-priority workflows first.
  • Deduplication: Prevent duplicate workflows in the queue.
  • Debouncing: Delay a workflow's execution until some time has passed since it was last called.
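Two of the controls above, global concurrency limits and global rate limiting, can be sketched together in plain Python. This is an illustration of the semantics, not the DBOS API; see the queues tutorial for the actual queue parameters.

```python
from collections import deque

# Toy sketch (not the DBOS API) of two queue controls: a global concurrency
# limit (max tasks running at once) and a sliding-window rate limit
# (max task starts per period).
class FlowControlledQueue:
    def __init__(self, concurrency: int, limit: int, period_s: float):
        self.concurrency = concurrency       # max tasks running at once
        self.limit = limit                   # max starts per period
        self.period_s = period_s
        self.running = 0
        self.starts: deque = deque()         # timestamps of recent starts

    def try_start(self, now: float) -> bool:
        while self.starts and now - self.starts[0] >= self.period_s:
            self.starts.popleft()            # drop starts outside the window
        if self.running >= self.concurrency or len(self.starts) >= self.limit:
            return False
        self.running += 1
        self.starts.append(now)
        return True

    def finish(self) -> None:
        self.running -= 1

q = FlowControlledQueue(concurrency=2, limit=3, period_s=60.0)
assert q.try_start(0.0) and q.try_start(0.0)  # two tasks start
assert not q.try_start(0.0)                   # blocked: concurrency limit
q.finish()
assert q.try_start(0.0)                       # third start fits the rate limit
q.finish()
assert not q.try_start(1.0)                   # blocked: rate limit (3 per 60s)
```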

Scheduled Workflows

Both DBOS and Temporal let you run workflows on a cron schedule:

Temporal:

await client.create_schedule(
    "daily-report",
    Schedule(
        action=ScheduleActionStartWorkflow(
            DailyReportWorkflow.run,
            id="daily-report",
            task_queue="reports",
        ),
        spec=ScheduleSpec(cron_expressions=["0 9 * * *"]),
    ),
)

DBOS:

DBOS.create_schedule("daily-report", daily_report_workflow, schedule="0 9 * * *")

DBOS schedules also support pausing, resuming, backfilling missed runs, and triggering immediate execution. Learn more in the scheduling tutorial.

Child Workflows

Both Temporal and DBOS support calling a child workflow from within another workflow.

Temporal:

@workflow.defn
class ParentWorkflow:
    @workflow.run
    async def run(self):
        result = await workflow.execute_child_workflow(
            ChildWorkflow.run, args=[data]
        )

DBOS:

@DBOS.workflow()
def parent_workflow():
    # Call directly (runs inline)
    result = child_workflow(data)

    # Or start in background
    handle = DBOS.start_workflow(child_workflow, data)
    result = handle.get_result()

Learn more in the workflows tutorial.

Codecs and Encryption

In Temporal, you can define a codec to encrypt workflow data before it is stored on the Temporal server, limiting the server's access to sensitive information. In DBOS, this is rarely necessary because data is stored only in your own database. However, if you do need to store sensitive data encrypted, you can use a custom serializer (Python, TypeScript) to encrypt your data before storing it and decrypt it on retrieval.
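As a sketch of the idea, a custom serializer wraps serialization with an encrypt step and deserialization with a decrypt step. The two-method shape and the base64 "cipher" below are illustrative stand-ins only: a real implementation would use an actual cipher (e.g. AES via the cryptography package) and must match the serializer interface documented for your DBOS SDK.

```python
import base64
import json

# Sketch of a serializer that encrypts workflow data before checkpointing.
# The class shape and the base64 stand-in "cipher" are assumptions for
# illustration; substitute real encryption and the documented interface.
class EncryptingSerializer:
    def serialize(self, obj) -> str:
        plaintext = json.dumps(obj).encode()
        return base64.b64encode(plaintext).decode()  # encrypt here instead

    def deserialize(self, data: str):
        plaintext = base64.b64decode(data.encode())  # decrypt here instead
        return json.loads(plaintext)

s = EncryptingSerializer()
order = {"id": "order-123", "card": "4111-xxxx"}
stored = s.serialize(order)
assert "order-123" not in stored       # sensitive fields not stored in the clear
assert s.deserialize(stored) == order  # round-trips losslessly
```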

What's Different in DBOS

No Orchestration Server

DBOS has no central server to manage, operate, or scale. Your workflows run in your application process and checkpoint directly to your database. This eliminates a major source of operational complexity and latency.

Fork

DBOS can fork a workflow from a specific step, re-executing it from that point. This is powerful for recovering from failures: for example, restarting thousands of failed workflows from a specific step after an outage is resolved.
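The mechanics can be pictured with a toy sketch (this is not the DBOS fork API): keep the checkpoints recorded before the fork point, discard the rest, and re-execute the workflow so the later steps run again.

```python
# Toy sketch (not the DBOS API) of forking a workflow from a step: steps
# before the fork point replay from checkpoints; steps from the fork point
# onward re-execute.
checkpoints = ["validated", "charged", "emailed"]  # results of steps 0-2

def fork_from(step: int, checkpoints: list) -> list:
    log = list(checkpoints[:step])   # steps before the fork point replay
    for i in range(step, 3):         # steps from the fork point re-execute
        log.append(f"re-ran step {i}")
    return log

assert fork_from(2, checkpoints) == ["validated", "charged", "re-ran step 2"]
```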

Workflow Streaming

DBOS provides streaming, an append-only stream that workflows can write to and clients can read from in real time. This is useful for streaming LLM outputs, progress updates, or real-time data from long-running workflows.
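The shape of this pattern can be sketched with a plain in-memory queue (this is not the DBOS streaming API): the workflow appends values as it produces them, and a client reads them in order until the stream is closed.

```python
from queue import Queue

# Toy sketch (not the DBOS API) of an append-only workflow stream: the
# workflow writes chunks as it produces them; a client reads them in
# order until a closing sentinel.
stream: Queue = Queue()
CLOSE = object()

def workflow_writer() -> None:
    for chunk in ("analyzing", "drafting", "done"):
        stream.put(chunk)  # append progress as it happens
    stream.put(CLOSE)      # mark the stream closed

def client_reader() -> list:
    chunks = []
    while True:
        item = stream.get()
        if item is CLOSE:
            return chunks
        chunks.append(item)

workflow_writer()
assert client_reader() == ["analyzing", "drafting", "done"]
```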

Database Integration & SQL-Based Introspection

DBOS integrates deeply with your database. For example, you can enqueue workflows directly from Postgres PL/pgSQL functions. You can also use transactional steps to perform database operations in workflows with exactly-once semantics.

Moreover, because all workflow state is stored in your database, you can query it with SQL. DBOS also provides programmatic APIs to list, search, and manage workflows by status, name, time, queue, or custom properties.

Queue Flow Control

Using DBOS queues, you can manage how many workflows can execute concurrently (globally, per-worker, and per-tenant) as well as which workers can execute which workflows. Temporal does not have comparable queueing or flow control abstractions, making it harder to control when and where workflows execute.

Automating Temporal → DBOS Migration

With coding agents, you can largely automate a migration from Temporal to DBOS. To do this, we recommend using DBOS skills and prompts to give your coding agent access to the latest information on DBOS: