Introduction
Do you want your code to run at a certain time, either once or on a repeating schedule? You wouldn’t be the first application developer to need such a utility; Unix systems have offered `cron` for 45 years, and `cron` was far from the first tool with similar scheduling features. Subsequently, on-premises solutions such as `Quartz` for Java or `CronJob` for Kubernetes provided deeper integration with their respective application development environments.
But what if you don’t want to manage the server that executes the jobs, and would prefer a “free `cron` for the cloud”? Better still, what if this scheduler ensured that your jobs completed exactly once? (Not just started exactly once, but completed exactly once?)
Scheduled Workflows are a simple solution, combining the familiarity of `cron`-like scheduling with the execution guarantees of DBOS Transact and the easy, serverless deployment of DBOS Cloud. It’s like a free `cron` for the cloud, but with unmatched execution guarantees.
In this blog post, we’ll:
- Use the DBOS Transact open source durable execution library (Python and TypeScript) to write and schedule a workflow
- Review the DBOS Transact processing guarantees, which ensure that scheduled operations execute exactly one time, regardless of interruptions, crashes, or failures
- Show how to deploy the workflow to DBOS Cloud, a reliable, serverless environment that can be used for free
Writing a Scheduled Workflow
For illustrative purposes, let’s write a workflow that sends out a nightly summary email to the manager of a widget store. We will use a simple workflow that first counts up sales from the day before, and then sends an email using Amazon SES.
Our DBOS code may be written as:
```typescript
const reportSes = configureInstance(SendEmailCommunicator, 'reportSES', {awscfgname: 'aws_config'});

interface SalesSummary {
  order_count: number;
  product_count: number;
  total_price: number;
}

class ShopUtilities {
  @Workflow()
  static async nightlyReport(ctx: WorkflowContext) {
    // We will get the real date once we start scheduling...
    const yesterday = new Date();
    yesterday.setDate(yesterday.getDate() - 1);
    const sales = await ctx.invoke(ShopUtilities).getDailySales(yesterday);
    await ShopUtilities.sendStatusEmail(ctx, yesterday, sales);
  }

  @Transaction({readOnly: true})
  static async getDailySales(ctx: KnexTransactionContext, day: Date) {
    const startOfDay = new Date(day.setHours(0, 0, 0, 0));
    const endOfDay = new Date(day.setHours(23, 59, 59, 999));
    const result = await ctx.client('orders')
      .join('products', 'orders.product_id', 'products.product_id')
      .whereBetween('orders.last_update_time', [startOfDay, endOfDay])
      .select(ctx.client.raw('COUNT(DISTINCT orders.order_id) as order_count'))
      .select(ctx.client.raw('COUNT(orders.product_id) as product_count'))
      .select(ctx.client.raw('SUM(products.price) as total_price'));
    return result[0] as SalesSummary;
  }

  static async sendStatusEmail(ctx: WorkflowContext, yd: Date, sales: SalesSummary) {
    await ctx.invoke(reportSes).sendEmail({
      to: ['manager@widgetstore.dbos.dev'],
      from: 'reportbot@widgetstore.dbos.dev',
      subject: `Daily report for ${yd.toDateString()}`,
      bodyText: `Yesterday we had ${sales.order_count} orders, selling ${sales.product_count} units, for a total of ${sales.total_price} dollars`,
    });
  }
}
```
Note that in the above code, we’ve temporarily used the current time (via `new Date()`) as the basis for the report. We will fix that as we schedule the reporting workflow.
Scheduling the report to be generated at midnight is straightforward: we just apply the `@Scheduled` decorator to the workflow. The signature of the scheduled workflow function must be adjusted to accept two date parameters: the scheduled time and the current time. We will pass the date provided by the scheduler to the reporting query, ensuring that the correct report is generated even in the unlikely event that the workflow is significantly delayed.
The completed workflow function is shown below:
```typescript
@Scheduled({crontab: '0 0 * * *'}) // Every midnight
@Workflow()
static async nightlyReport(ctx: WorkflowContext, schedDate: Date, _curdate: Date) {
  const yesterday = schedDate;
  yesterday.setDate(yesterday.getDate() - 1);
  const sales = await ctx.invoke(ShopUtilities).getDailySales(yesterday);
  await ShopUtilities.sendStatusEmail(ctx, yesterday, sales);
}
```
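The “yesterday” computation above can be illustrated standalone. One subtlety worth noting: `Date.prototype.setDate` mutates the `Date` in place, so a sketch that must leave the scheduler-provided date untouched would copy it first (the function name here is hypothetical, for illustration only):

```typescript
// Hypothetical helper: compute the report day (one day before the scheduled
// firing time) without mutating the caller's Date object.
function reportDayFor(schedDate: Date): Date {
  const yesterday = new Date(schedDate); // copy, so schedDate is not mutated
  yesterday.setDate(yesterday.getDate() - 1); // setDate handles month/year rollover
  return yesterday;
}

// A midnight firing on 2024-06-26 reports on 2024-06-25:
const reportDay = reportDayFor(new Date("2024-06-26T00:00:00"));
console.log(reportDay.getFullYear(), reportDay.getMonth() + 1, reportDay.getDate());
```

Because `setDate` understands calendar rollover, this also works across month and year boundaries (e.g., a firing on January 1 reports on December 31).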
Execution Guarantees
DBOS Transact will guarantee that a report is generated and forwarded to the email service exactly once for each night:
- DBOS Transact uses its system database to keep durable records of the workflow schedule, and it uses these records to ensure that workflows are initiated at least once. The scheduler assigns each workflow run a unique key based on the function name and the scheduled date and time. This key is used to deduplicate workflow runs, turning at-least-once initiation into exactly-once initiation.
- DBOS Workflows (and associated Transactions) run to completion exactly once. Workflows that are interrupted (for example due to a hardware failure) are restarted transparently.
- DBOS Communicators can easily calculate an idempotency key so that their work is completed exactly once.
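To make the deduplication idea concrete, here is an illustrative sketch (not DBOS Transact’s actual internal code) of how a deterministic key could be derived from the workflow function name and the scheduled firing time; any two attempts to start the same scheduled run compute the same key, so the second attempt can be recognized and suppressed:

```typescript
import { createHash } from "node:crypto";

// Hypothetical sketch of a scheduler deduplication key: deterministic in the
// function name and scheduled time, so retries of the same run collide.
function scheduledWorkflowKey(funcName: string, schedDate: Date): string {
  const material = `${funcName}@${schedDate.toISOString()}`;
  return createHash("sha256").update(material).digest("hex");
}

// Two initiation attempts for the same scheduled run yield the same key...
const k1 = scheduledWorkflowKey("nightlyReport", new Date("2024-06-26T00:00:00Z"));
const k2 = scheduledWorkflowKey("nightlyReport", new Date("2024-06-26T00:00:00Z"));
// ...while a different scheduled time yields a different key.
const k3 = scheduledWorkflowKey("nightlyReport", new Date("2024-06-27T00:00:00Z"));
```

The same pattern applies to communicators: a key derived from stable inputs lets an external service detect and drop duplicate requests.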
Deploying to DBOS Cloud
The open source DBOS Transact durable execution library can only provide its execution guarantees when running in a deployment environment that:
- Provides a reliable system database
- Automatically restarts the cron job and initiates workflow recovery in the event of any system failures
DBOS Cloud provides that environment for you (for free) if you prefer not to set up the environment on your own. Here’s how:
- Create a free DBOS Cloud account and log in
- Deploy your cron job to DBOS Cloud
If you are using the code above, you’ll need to be set up to use the AWS SES service:
- Get an access key and secret from the AWS console
- Put an AWS configuration in `dbos-config.yaml` as follows:

```yaml
application:
  aws_config:
    aws_region: ${AWS_REGION}
    aws_access_key_id: ${AWS_ACCESS_KEY_ID}
    aws_secret_access_key: ${AWS_SECRET_ACCESS_KEY}
```
- Ensure the referenced environment variables (`AWS_REGION`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`) are set during application deployment
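For example, the variables can be exported in the shell before deploying (the values below are placeholders; substitute the credentials you obtained from the AWS console, and see the DBOS Cloud docs for the deployment command itself):

```shell
# Hypothetical placeholder values: substitute your own AWS credentials
export AWS_REGION="us-east-2"
export AWS_ACCESS_KEY_ID="AKIA..."
export AWS_SECRET_ACCESS_KEY="..."
```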
Summary
DBOS Transact provides an elegant TypeScript development environment with exactly-once execution guarantees, and applications can be quickly deployed to DBOS Cloud. Scheduled Workflows add one more tool to the platform: a straightforward scheduling service that ensures jobs run to completion exactly once.
Further Information
- Ask questions in the DBOS community on Discord.
- Check out DBOS Transact open source code, star our repository, or contribute on GitHub.
- To learn more about DBOS Transact, check out the quickstart and docs.
- Using Scheduled Workflows in DBOS Transact
- More information on sending emails: AWS SES service.
- Watch an in-depth explanation of DBOS Transact’s execution guarantees from DBOS co-founder Mike Stonebraker.