Cross-Language Interaction
DBOS supports multiple languages—Python, TypeScript, Go, and Java—each with its own SDK. A client in one language can connect to the system database of an application written in another language to exchange data through workflows, messages, events, and streams. However, each language has a native serialization format that the other languages can't read. The portable JSON serialization format solves this by providing a common data representation that all SDKs can read and write, and can even be read and written from the database without any DBOS code at all.
Cross-language interoperability is currently supported in Python, TypeScript, and Java.
Default Serialization Is Language-Specific
By default, each DBOS SDK serializes data using its language's default format. These default formats are chosen for their fidelity to the wide range of data structures and objects available in each language:
| Language | Default Format | Format Name |
|---|---|---|
| Python | pickle | py_pickle |
| TypeScript | SuperJSON | js_superjson |
| Java | Jackson | java_jackson |
As the set of data structures and classes varies from language to language, data written in one language's default format cannot be read by the other languages. For example, a Python workflow that writes an event using pickle produces a binary blob that TypeScript and Java can't deserialize.
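To make the incompatibility concrete, this stdlib-only Python sketch (no DBOS code involved) serializes the same value both ways: pickle produces an opaque binary blob, while JSON stays readable by any language or SQL query.

```python
import json
import pickle

payload = {"status": "complete", "count": 42}

# Python's native format: a binary blob other languages cannot parse.
blob = pickle.dumps(payload)
print(blob[:4])  # e.g. b'\x80\x04...' -- not valid JSON

# Portable JSON: plain text any SDK can read.
text = json.dumps(payload)
print(text)  # {"status": "complete", "count": 42}
```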
Portable JSON Format
The portable_json format is plain JSON that every SDK can read and write. It supports a smaller set of language constructs than the native formats, but any DBOS application in any language can read or write it.
Supported types:
- JSON primitives: null, booleans, numbers, and strings
- JSON arrays (ordered lists of JSON values)
- JSON objects (maps with string keys and JSON values)
Type conversions: Some language built-in and library types are mapped to equivalent JSON constructs. When these values are decoded, the recipient must restore them to the appropriate language equivalent.
- Date/time values are converted to RFC 3339 UTC strings (e.g., "2025-06-15T14:30:00.000Z")
| Language | Type | Portable Representation |
|---|---|---|
| Python | datetime | RFC 3339 UTC string |
| Python | date | ISO 8601 string |
| Python | Decimal | Numeric string |
| Python | set, tuple | JSON array |
| Java | Instant | RFC 3339 UTC string |
| Java | BigDecimal | Numeric string |
| TypeScript | Date | RFC 3339 UTC string |
| TypeScript | BigInt | Numeric string |
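As an illustration of these conversions, here is a stdlib-only Python sketch (not a DBOS API) that encodes values the way the table describes: datetimes as RFC 3339 UTC strings with millisecond precision, Decimals as numeric strings, and sets/tuples as JSON arrays.

```python
import json
from datetime import datetime, timezone
from decimal import Decimal

def to_portable(value):
    """Map Python types to their portable JSON representations."""
    if isinstance(value, datetime):
        # RFC 3339 UTC string with millisecond precision
        # (assumes an aware datetime; astimezone normalizes to UTC).
        return value.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%S.%f")[:-3] + "Z"
    if isinstance(value, Decimal):
        return str(value)  # numeric string
    if isinstance(value, (set, tuple)):
        return [to_portable(v) for v in value]  # JSON array
    return value

record = {
    "when": datetime(2025, 6, 15, 14, 30, tzinfo=timezone.utc),
    "total": Decimal("19.99"),
    "tags": ("a", "b"),
}
print(json.dumps({k: to_portable(v) for k, v in record.items()}))
# {"when": "2025-06-15T14:30:00.000Z", "total": "19.99", "tags": ["a", "b"]}
```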
Using Portable Serialization
You can opt in to portable serialization at the workflow or operation level. Workflows started with portable serialization return their results or exceptions in portable format. Workflows started with portable serialization also write their events and streams in portable JSON by default, but this can be overridden for each operation.
Per-Workflow (Enqueue)
When enqueuing or starting a workflow from a DBOSClient, set the serialization format in the enqueue options. This ensures the workflow's arguments are serialized in the portable format, so the target language can read them.
- Python
- TypeScript
- Java
```python
from dbos import DBOSClient, WorkflowSerializationFormat

client = DBOSClient(system_database_url=db_url)
handle = client.enqueue(
    {
        "workflow_name": "process_order",
        "queue_name": "orders",
        "serialization_type": WorkflowSerializationFormat.PORTABLE,
    },
    "order-123",
)
```
```typescript
import { DBOSClient } from "@dbos-inc/dbos-sdk";

const client = await DBOSClient.create({ systemDatabaseUrl: process.env.DBOS_SYSTEM_DATABASE_URL });
const handle = await client.enqueue(
  {
    workflowName: "process_order",
    queueName: "orders",
    serializationType: "portable",
  },
  "order-123",
);
```
```java
import dev.dbos.transact.client.DBOSClient;
import dev.dbos.transact.client.EnqueueOptions;
import dev.dbos.transact.workflow.SerializationStrategy;

DBOSClient client = new DBOSClient(dbUrl, dbUser, dbPassword);
var options = new EnqueueOptions("OrderProcessor", "processOrder", "orders")
    .withSerialization(SerializationStrategy.PORTABLE);
var handle = client.enqueue(options, "order-123");
```
Per-Workflow (via Annotation or Decorator)
You can set the serialization strategy directly on the workflow annotation or decorator so that the workflow uses portable serialization by default when started:
- Python
- TypeScript
- Java
```python
from dbos import DBOS, WorkflowSerializationFormat

@DBOS.workflow(serialization_type=WorkflowSerializationFormat.PORTABLE)
def process_order(order_id: str):
    # All inputs, outputs, events, and streams for this workflow
    # use portable JSON serialization by default
    return f"processed: {order_id}"
```
Using a decorator:
```typescript
import { DBOS } from "@dbos-inc/dbos-sdk";

export class Orders {
  @DBOS.workflow({ serializationType: "portable" })
  static async processOrder(orderId: string): Promise<string> {
    // All inputs, outputs, events, and streams for this workflow
    // use portable JSON serialization by default
    return `processed: ${orderId}`;
  }
}
```
Or using registerWorkflow:
```typescript
async function processOrder(orderId: string): Promise<string> {
  return `processed: ${orderId}`;
}

const processOrderWorkflow = DBOS.registerWorkflow(processOrder, {
  name: "processOrder",
  serializationType: "portable",
});
```
```java
import dev.dbos.transact.workflow.SerializationStrategy;
import dev.dbos.transact.workflow.Workflow;

@Workflow(serializationStrategy = SerializationStrategy.PORTABLE)
public String processOrder(String orderId) {
    // All inputs, outputs, events, and streams for this workflow
    // use portable JSON serialization by default
    return "processed: " + orderId;
}
```
The default serialization strategy only affects invocations that are aware of the annotation or decorator. This makes the default useful for unit testing, but the serialization strategy actually used depends on how the client enqueues the workflow.
For Workflow Communication
Setting the serialization format at the workflow level affects the default for setEvent and writeStream.
However, individual operations can override this—for example, a workflow running with native serialization may want to publish a specific event in portable format for cross-language consumption, or a portable workflow may need to record an event with the greater flexibility afforded by the native serializer.
Each language's setEvent and writeStream methods accept a serialization parameter for this purpose.
Note that send is not affected by the current workflow's serialization strategy, because messages target a different workflow and the sender does not know what serialization that workflow expects.
You should always set the serialization format explicitly on send when communicating cross-language.
Step outputs always use the native serializer regardless of the workflow's serialization strategy. Steps are internal to a workflow and are not read by other languages, so the native serializer's greater flexibility is preferred.
- Python
- TypeScript
- Java
```python
from dbos import DBOS, WorkflowSerializationFormat

# Send a message readable by any language
DBOS.send(
    destination_id="workflow-123",
    message={"status": "complete", "count": 42},
    topic="updates",
    serialization_type=WorkflowSerializationFormat.PORTABLE,
)

# Set an event readable by any language
DBOS.set_event(
    "progress",
    {"percent": 75},
    serialization_type=WorkflowSerializationFormat.PORTABLE,
)

# Write to a stream readable by any language
DBOS.write_stream(
    "results",
    {"item": "processed"},
    serialization_type=WorkflowSerializationFormat.PORTABLE,
)
```
```typescript
import { DBOS } from "@dbos-inc/dbos-sdk";

// Send a message readable by any language
await DBOS.send(
  "workflow-123",
  { status: "complete", count: 42 },
  "updates",
  undefined, // idempotencyKey
  { serializationType: "portable" }
);

// Set an event readable by any language
await DBOS.setEvent(
  "progress",
  { percent: 75 },
  { serializationType: "portable" }
);

// Write to a stream readable by any language
await DBOS.writeStream(
  "results",
  { item: "processed" },
  { serializationType: "portable" }
);
```
```java
import java.util.Map;

import dev.dbos.transact.DBOS;
import dev.dbos.transact.workflow.SerializationStrategy;

// Send a message readable by any language
DBOS.send(
    "workflow-123",
    Map.of("status", "complete", "count", 42),
    "updates",
    null, // idempotencyKey
    SerializationStrategy.PORTABLE
);

// Set an event readable by any language
DBOS.setEvent(
    "progress",
    Map.of("percent", 75),
    SerializationStrategy.PORTABLE
);
```
Portable Errors
When a workflow using portable serialization fails, its error is serialized in a standard JSON structure that all languages can inspect:
```json
{
  "name": "ValueError",
  "message": "Order not found",
  "code": 404,
  "data": {"orderId": "order-123"}
}
```
| Field | Type | Description |
|---|---|---|
| name | string | The error type/class name |
| message | string | Human-readable error message |
| code | number, string, or null | Optional application-specific error code |
| data | any JSON value, or null | Optional structured error details |
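To make the structure concrete, this sketch (plain Python with no DBOS dependency; the PortableError class is a hypothetical local stand-in) parses such an error payload and surfaces its fields:

```python
import json

class PortableError(Exception):
    """Hypothetical local stand-in for a decoded portable error."""
    def __init__(self, name, message, code=None, data=None):
        super().__init__(message)
        self.name, self.code, self.data = name, code, data

raw = '{"name": "ValueError", "message": "Order not found", "code": 404, "data": {"orderId": "order-123"}}'
fields = json.loads(raw)
err = PortableError(
    fields["name"],
    fields["message"],
    fields.get("code"),  # code and data are optional
    fields.get("data"),
)
print(err.name, err.code)  # ValueError 404
```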
Raising Portable Errors
You can explicitly raise a portable error from a workflow:
- Python
- TypeScript
- Java
```python
from dbos import PortableWorkflowError

raise PortableWorkflowError(
    message="Order not found",
    name="NotFoundError",
    code=404,
    data={"orderId": "order-123"},
)
```
```typescript
import { PortableWorkflowError } from "@dbos-inc/dbos-sdk";

throw new PortableWorkflowError(
  "Order not found",
  "NotFoundError",
  404,
  { orderId: "order-123" },
);
```
```java
import java.util.Map;

import dev.dbos.transact.json.PortableWorkflowException;

throw new PortableWorkflowException(
    "Order not found",
    "NotFoundError",
    404,
    Map.of("orderId", "order-123")
);
```
Reading Portable Errors
When a workflow that used portable serialization fails, other languages receive the error as a PortableWorkflowError (Python/TS) or PortableWorkflowException (Java) with the name, message, code, and data fields populated.
If a workflow fails with a non-portable exception while using portable serialization, DBOS automatically converts it to the portable error format on a best-effort basis, extracting the error type name, message, and any common error code attributes.
Input Validation and Coercion
When a workflow is started via portable JSON—whether from another language, a DBOSClient, or a direct database insert—the arguments arrive as plain JSON values.
JSON has a limited type system: numbers are untyped (no distinction between int, long, double, or other language-specific numeric types), there is no native date type (dates arrive as strings), and collection types may not match the target language's expectations (e.g., a JSON array becomes a generic ArrayList in Java, not a typed list or object array).
Each SDK provides a way to validate these arguments so that the workflow function receives the types it expects. Note that while workflow argument validation is possible, return values, messages, and events are not automatically coerced, as the expected types are not known at runtime. These must be validated and coerced manually.
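For example, a value received as portable JSON can be coerced back by hand. This stdlib-only sketch (not a DBOS API) restores an RFC 3339 string to a datetime and a numeric string to a Decimal:

```python
from datetime import datetime, timezone
from decimal import Decimal

# A portable JSON payload as another language might have written it.
payload = {"when": "2025-06-15T14:30:00.000Z", "total": "19.99"}

# datetime.fromisoformat accepts a trailing "Z" only on Python 3.11+,
# so replace it with an explicit UTC offset for older versions.
when = datetime.fromisoformat(payload["when"].replace("Z", "+00:00"))
total = Decimal(payload["total"])

assert when == datetime(2025, 6, 15, 14, 30, tzinfo=timezone.utc)
assert total == Decimal("19.99")
```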
Each SDK's approach is documented in its language-specific reference:
- Java — Automatic Coercion: Java automatically coerces portable JSON arguments to match the workflow method's parameter types (e.g., Integer → long, ISO 8601 strings → Instant). No opt-in is required.
- TypeScript — Input Schema (Zod): TypeScript workflows can specify an inputSchema (compatible with Zod) that validates and optionally transforms arguments before the workflow runs.
- Python — Argument Validator (Pydantic): Python workflows can specify validate_args=pydantic_args_validator to validate arguments against the function's type hints using Pydantic.
Further Reading
- Serialization strategy reference:
- Custom serialization configuration:
- System tables: The serialization column in system tables records which format was used for each piece of serialized data.