We are all used to the Request-Response way of communicating between a client and an API. The client sends an HTTP request to the server, awaits the response and proceeds with its life. This is all cool until we get into a situation where, let’s say, a single request involves synchronous calls to 5 microservices. What happens if something goes wrong in one of them? Maybe a request times out, or maybe our app just hangs.
To prevent this from happening, some smart developers (not me) have come up with the idea of the Event-Driven approach.
It is a concept of communicating between different services asynchronously using events instead of requests. There are 3 key components: event producers, event routers and event consumers.
EventBridge is AWS’ serverless pub/sub service made to connect different types of event sources (SaaS applications, internal AWS services or custom applications).
To get our terminology right:
The flow looks something like this:
To depict this with more context, let’s say we have an API for creating blog posts.
Use case: A user writes a blog post and submits it. Our API is called, the received blog post data is validated and saved to the DB, and a response is sent back to the user. Afterward, an event is triggered for further document processing; once that finishes, the additional document data is added to our DB as well and pushed to the user via WebSocket.
Let us slow down and explain the flow bit by bit :)
The initial action is the HTTP Request to our API Gateway which triggers our Lambda for creating a Blog Post - initiated by our User.
Let’s say our Blog Post object which is getting sent in the request looks like this:
{
  "author": "hunter s thompson",
  "title": "Title",
  "body": "Joe is going to Spain",
  "topic": "summer"
}
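Before writing anything to the DB, the Lambda should validate this payload. A minimal sketch of what that check could look like (the `validate_blog_post` helper and its rules are illustrative assumptions, not part of any AWS SDK):

```python
REQUIRED_FIELDS = ("author", "title", "body", "topic")

def validate_blog_post(payload: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the post is valid."""
    errors = []
    for field in REQUIRED_FIELDS:
        value = payload.get(field)
        if not isinstance(value, str) or not value.strip():
            errors.append(f"'{field}' must be a non-empty string")
    return errors

# Example payload from the request above
post = {
    "author": "hunter s thompson",
    "title": "Title",
    "body": "Joe is going to Spain",
    "topic": "summer",
}
```

If the list comes back non-empty, the Lambda can short-circuit with a 400 response before touching DynamoDB.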
We then receive that object and validate it before writing it to DynamoDB. This way we make sure that even if some of our later steps fail, the blog post is still saved in the DB. Afterwards, we trigger an event on our EventBus and a response with the basic BlogPost info is returned to our user.
Basic BlogPost info would look something like this:
{
  "id": "blogPostId",
  "createdAt": "timestamp",
  "author": "hunter s thompson",
  "title": "Title",
  "body": "Joe is going to Spain",
  "topic": "summer"
}
What does the flow so far mean for us? We have written our key BlogPost data to DynamoDB so it can be displayed on the client. Now we can proceed with the next steps using our Event Bus - sending an email notification to our Subscribers using SNS and enriching our BlogPost with AWS Comprehend entity recognition.
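Putting the event on the bus boils down to building a PutEvents entry. Here is a sketch of that entry; the bus name, source and detail-type strings are assumptions for this example, not fixed AWS values:

```python
import json

def build_blog_post_created_event(blog_post: dict) -> dict:
    """Build an entry in the shape expected by EventBridge's PutEvents API."""
    return {
        "EventBusName": "blogEventBus",   # hypothetical bus name
        "Source": "blogService",          # hypothetical source identifier
        "DetailType": "blogPostCreated",
        "Detail": json.dumps(blog_post),  # EventBridge expects Detail as a JSON string
    }

# In the Lambda, this entry would be sent with something like:
# boto3.client("events").put_events(Entries=[build_blog_post_created_event(post)])
```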
The Event Record received is then sent to 2 targets according to our blogPostCreation Rule:
These 2 steps of our flow will be executed asynchronously.
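A rule like blogPostCreation matches events with an event pattern. Assuming the source and detail-type names used earlier in this example, the pattern could look something like this:

```json
{
  "source": ["blogService"],
  "detail-type": ["blogPostCreated"]
}
```

Any event on the bus matching this pattern gets forwarded to both targets.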
SNS will simply send an email notifying the users who are subscribed to the topic our BlogPost is about; in our case - "summer".
The entity recognition Lambda will make a synchronous call to AWS Comprehend and update the BlogPost in DynamoDB with the newly acquired data.
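Comprehend's DetectEntities call returns a list of entities with a type and a confidence score. A sketch of mapping that response into the shape we store (the `min_score` threshold is an assumption for this example):

```python
def entities_from_comprehend(response: dict, min_score: float = 0.8) -> dict:
    """Map a Comprehend detect_entities response to the entities object stored in DynamoDB."""
    entities = {"locations": [], "persons": []}
    for entity in response.get("Entities", []):
        if entity["Score"] < min_score:
            continue  # skip low-confidence hits
        if entity["Type"] == "LOCATION":
            entities["locations"].append(entity["Text"])
        elif entity["Type"] == "PERSON":
            entities["persons"].append(entity["Text"])
    return entities

# The response itself would come from something like:
# boto3.client("comprehend").detect_entities(Text=blog_post["body"], LanguageCode="en")
```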
The BlogPost in the DB now looks like this:
{
  "id": "blogPostId",
  "createdAt": "timestamp",
  "author": "hunter s thompson",
  "title": "Title",
  "body": "Joe is going to Spain",
  "topic": "summer",
  "entities": {
    "locations": ["Spain"],
    "persons": ["Joe"]
  }
}
Now we have to deliver these new results to our client using DynamoDB Stream, EventBus and API Gateway WebSocket.
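A stream record carries the new item image in DynamoDB's AttributeValue format, so the consuming Lambda has to unmarshal it before pushing it over the WebSocket. A minimal sketch handling only the types this example uses (S, M, L); the `updated_blog_post` helper is hypothetical:

```python
def from_attribute_value(value: dict):
    """Convert a DynamoDB AttributeValue (the stream record format) to plain Python.
    Only the types appearing in this example (S, M, L) are handled."""
    if "S" in value:
        return value["S"]
    if "M" in value:
        return {k: from_attribute_value(v) for k, v in value["M"].items()}
    if "L" in value:
        return [from_attribute_value(v) for v in value["L"]]
    raise ValueError(f"unhandled AttributeValue: {value}")

def updated_blog_post(record: dict):
    """Return the new blog-post image for MODIFY stream events, else None."""
    if record.get("eventName") != "MODIFY":
        return None
    new_image = record["dynamodb"]["NewImage"]
    return {k: from_attribute_value(v) for k, v in new_image.items()}
```

The resulting plain dict is what would be serialized and sent to the connected client via the API Gateway WebSocket.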
DynamoDB Stream listens to the changes made in our BlogPost table, so it will catch th