Building A Serverless Post Scheduling Backend
🙌 Hello there! Welcome to The Serverless Spotlight!
In this week's edition, I have an exciting challenge to build - one you're all familiar with if you use LinkedIn. We're going to design a solution for scheduling posts on a blogging application.
What we are building
The challenge is to design, as efficiently as possible, a backend microservice that lets users write a post and schedule it ahead of time, choosing the publish time in one-hour increments.
Let's see what AWS services we can make use of to design this solution.
Architecture Overview
Here's a high-level architecture overview:
Next, let's look at implementing this solution most efficiently with the AWS services at our disposal.
Implementation
Create A DynamoDB Table
Let's start by creating a new table called "posts" to store all of our post items.
Click the Create table button to start.
Name your table "posts" and use "postID" for the partition key (no need for a sort key).
Then, below in Table settings, select Customize settings.
Then scroll down to the Secondary Indexes section and create a Global Secondary Index.
For the GSI, use "scheduledDay" as the partition key and "scheduledTime" as the sort key.
The "scheduledDay" partition key groups posts scheduled on a given day, and the sort key lets us filter those posts by the hour of the day they are scheduled for.
With this key design, we can query all posts on a given day, and we can also filter for posts by the hour.
If you aren't familiar with this design pattern, or are unsure how partition and sort keys work in DynamoDB, I encourage you to read this article first.
We'll take a look at this data design again later on.
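If you prefer working with the SDK over the console, the same table design can be sketched as a CreateTableCommand input object. This is just an illustration of the key schema described above; the index name shown here is the auto-generated one the console produces for these keys, which we'll reference later from the publishing function.

```javascript
// Sketch of the "posts" table definition: postID as the table's partition key,
// plus a GSI keyed on scheduledDay (partition) and scheduledTime (sort).
const createTableInput = {
  TableName: "posts",
  AttributeDefinitions: [
    { AttributeName: "postID", AttributeType: "S" },
    { AttributeName: "scheduledDay", AttributeType: "S" },
    { AttributeName: "scheduledTime", AttributeType: "S" },
  ],
  KeySchema: [{ AttributeName: "postID", KeyType: "HASH" }],
  BillingMode: "PAY_PER_REQUEST",
  GlobalSecondaryIndexes: [
    {
      IndexName: "scheduledDay-scheduledTime-index",
      KeySchema: [
        { AttributeName: "scheduledDay", KeyType: "HASH" },
        { AttributeName: "scheduledTime", KeyType: "RANGE" },
      ],
      // Project all attributes so queries on the index return full post items
      Projection: { ProjectionType: "ALL" },
    },
  ],
};
```

You would pass this object to a CreateTableCommand from @aws-sdk/client-dynamodb; in this guide we create the table in the console instead.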
Create A Lambda function to Store Posts to DynamoDB
In the AWS console, navigate to the Lambda service and create a new function.
Use the following configurations:
In the code editor below, add the following code:
import { DynamoDBClient, PutItemCommand } from "@aws-sdk/client-dynamodb";

const ddbClient = new DynamoDBClient();

export const handler = async (event) => {
  const { postID, content, scheduledDay, scheduledTime, userId } = event;

  const params = {
    TableName: "posts",
    Item: {
      postID: { S: postID },
      userId: { S: userId },
      content: { S: content },
      scheduledDay: { S: scheduledDay },
      // Append the user ID so the sort key is unique per user for a given hour
      scheduledTime: { S: `${scheduledTime}#${userId}` },
      status: { S: "scheduled" },
    },
  };

  await ddbClient.send(new PutItemCommand(params));

  return {
    statusCode: 200,
    body: JSON.stringify({ message: "Post scheduled successfully!" }),
  };
};
The code will accept a few parameters:
Save and deploy the function.
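To make the composite sort key concrete, here is a small sketch (plain JavaScript, assuming the same `time#userId` layout as the function above) of why querying with a bare hour prefix will still match these keys:

```javascript
// Build the composite sort key exactly as the scheduling function does:
// the scheduled hour, then "#", then the user ID.
function makeScheduledTime(scheduledTime, userId) {
  return `${scheduledTime}#${userId}`;
}

// DynamoDB's begins_with(sortKey, prefix) behaves like String.startsWith,
// so querying with just the hour matches every user's post for that hour.
function matchesHour(sortKey, hour) {
  return sortKey.startsWith(hour);
}

const sk = makeScheduledTime("14:00", "user-201"); // "14:00#user-201"
```

This is the property the publishing function will rely on when it queries the GSI by the current hour.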
We now need to create a function URL to be able to invoke this function.
Since we're only building the backend here, we won't focus on the endpoint URL, but I'll leave a link to this article, which explains how to create a function URL in a few simple steps.
Create an EventBridge rule
Let's now head over to the EventBridge service in AWS.
With the option "EventBridge Rule" selected, click on the Create rule button.
Use the following rule configurations:
Click the button "Continue in EventBridge scheduler".
On the new page, scroll down to the Schedule Pattern section.
Choose your timezone, then choose Cron-based schedule.
For the cron expression, use cron(0 * * * ? *).
This cron expression tells the rule to trigger whenever the minute is 0 - in other words, at the top of every hour, of every day, of every month, of every year.
Once you enter a valid cron expression, you will see the next 10 trigger dates displayed below.
If you need to validate your cron expression you can use this link.
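To see what this schedule means in code, here is a small helper (plain JavaScript, purely for illustration) that computes the next moment our hourly cron(0 * * * ? *) rule would fire after a given instant:

```javascript
// Returns the next time the minute hits 0 (strictly after "from"),
// i.e. the next top-of-hour moment when the hourly rule fires.
function nextTopOfHour(from) {
  const next = new Date(from.getTime());
  next.setUTCMinutes(0, 0, 0);              // snap back to the start of this hour
  next.setUTCHours(next.getUTCHours() + 1); // then move forward one full hour
  return next;
}
```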
Click on the Next button.
On the next page, we'll select a target and choose the Lambda (invoke) option from the list.
Once we select Lambda, the page will scroll down and ask us to choose a Lambda function.
Click on the Create a new Lambda function button.
Create another Lambda function to publish posts
Create a new Lambda function. This time call it "publishPost" and use the same configurations as the first function.
For the code, copy the following into the code editor.
import { DynamoDBClient, QueryCommand, UpdateItemCommand } from "@aws-sdk/client-dynamodb";

const ddbClient = new DynamoDBClient();

export const handler = async () => {
  try {
    // Get the current date and hour (in UTC)
    const now = new Date();
    const currentDay = now.toISOString().split("T")[0]; // Format: YYYY-MM-DD
    const currentHour = now.toISOString().split("T")[1].substring(0, 2) + ":00"; // Format: HH:00

    const queryParams = {
      TableName: "posts",
      IndexName: "scheduledDay-scheduledTime-index",
      KeyConditionExpression: "#pk = :currentDay AND begins_with(#sk, :currentHour)",
      // Only pick up posts still in "scheduled" status, so we skip already published items
      FilterExpression: "#status = :scheduled",
      ExpressionAttributeNames: {
        "#pk": "scheduledDay",
        "#sk": "scheduledTime",
        "#status": "status",
      },
      ExpressionAttributeValues: {
        ":currentDay": { S: currentDay },
        ":currentHour": { S: currentHour },
        ":scheduled": { S: "scheduled" },
      },
    };

    const queryResult = await ddbClient.send(new QueryCommand(queryParams));
    const postsToUpdate = queryResult.Items || [];

    // Update the status of each item to "published"
    for (const item of postsToUpdate) {
      const updateParams = {
        TableName: "posts",
        // Updates go against the base table, whose key is postID (not the GSI keys)
        Key: {
          postID: { S: item.postID.S },
        },
        UpdateExpression: "SET #status = :published",
        ExpressionAttributeNames: {
          "#status": "status",
        },
        ExpressionAttributeValues: {
          ":published": { S: "published" },
        },
      };

      await ddbClient.send(new UpdateItemCommand(updateParams));
    }

    return {
      statusCode: 200,
      body: JSON.stringify({ message: `${postsToUpdate.length} posts updated to 'published' status.` }),
    };
  } catch (error) {
    console.error("Error updating posts:", error);
    return {
      statusCode: 500,
      body: JSON.stringify({ message: "Error updating posts", error: error.message }),
    };
  }
};
This code queries the GSI we created earlier for items scheduled on the current day whose scheduled time begins with the current hour.
We also add a FilterExpression to keep only items whose status equals "scheduled" (so that we don't update already published items).
It will loop over the results and update the status of each item to "published".
(* Note that the server time is in UTC and may not match your local date and time, so you may have to convert it to your local time. For this guide, you can add/subtract hours to make it match - use console.log(currentHour) to find the server's current hour.)
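If you do need to account for the offset between the server's UTC clock and your local time, a tiny helper like this does the trick (the offset value, e.g. -5 for UTC-5, is yours to choose):

```javascript
// Shift a UTC timestamp by a whole number of hours and format the result
// as the "HH:00" label used by the publishing function.
function hourLabel(date, offsetHours) {
  const shifted = new Date(date.getTime() + offsetHours * 60 * 60 * 1000);
  return shifted.toISOString().split("T")[1].substring(0, 2) + ":00";
}
```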
Once you have created the Lambda function, return to EventBridge and choose the Lambda function (you will have to refresh the select field).
Click on the Next button. On the next page, you can leave the defaults as they are. Click Next again to review the configuration, then click Create schedule at the bottom of the page to activate the rule.
We are now ready to test our scheduled post microservice.
Let's use Lambda tests for this.
In Lambda, open the "schedulePost" function we created earlier and run a test with the following JSON payload:
{
  "postID": "post-101",
  "userId": "user-201",
  "content": "Hey all",
  "scheduledDay": "2024-11-04",
  "scheduledTime": "14:00",
  "status": "scheduled"
}
I've created a new test as you can see below and it has successfully written a post item to my DynamoDB posts table.
We can now wait for the EventBridge rule to trigger the Lambda function and publish our post item. To speed up the test, you can choose a more frequent schedule (every 5 minutes, for instance).
Once it runs you will see your post item has automatically updated its status to "published".
We have successfully created a scheduled post microservice.
Conclusion
In this article, we’ve built an efficient backend microservice that enables users to schedule posts in hourly increments, by using a few AWS services to automate the publishing process.
By integrating Lambda functions, DynamoDB, and EventBridge rules, we’ve created a scalable, reliable backend solution to manage scheduled posts with ease.
👋 My name is Uriel Bitton and I hope you learned something in this edition of The Serverless Spotlight
🔗 You can share the article with your network to help others learn as well.
📬 If you want to learn how to save money in the cloud you can subscribe to my brand new newsletter The Cloud Economist.
🙌 I hope to see you in next week's edition!