EveryMundo’s monthly activity emails are a set of emails sent to customers once a month with informative data on how their EveryMundo products are performing. Each email includes tables, charts, quick links, customer resources, and more on a single page. The data is dynamically generated each month and compared to the previous month, providing insights into how customers’ products perform along with suggestions for improving them.
Goal
Aggregate our internal analytics into an informative automated email with a call to action to further promote user engagement.
Prerequisites
- Customer analytics is stored on various platforms, including Google Analytics, AWS Redshift, and AWS Athena. The Data Analytics team needed a way to aggregate all this data and include it in an email.
- The email needs to be fully automated and sent every month.
- The email needs to be linked to the customer database so that when a customer is added or removed, the recipient list updates automatically.
- The data on the page needs to be informative and captivating without cluttering the email with too much text.
Tech stack:
- SendGrid – Email platform (allows for custom email templates with injectable data via their API)
- Quickchart – Chart.js image generation
- Node.js – Runtime environment
- AWS Lambda – Serverless platform for email logic
- AWS SQS – To manage the queue of customers for email generation
Planning:
Because the team uses Lambdas, the process had to be split into two parts: a trigger lambda and a sender lambda. Lambdas have a 15-minute runtime limit, so it would be impossible to fit all the functionality into a single lambda. Instead, the team decided to use one lambda to gather the list of customers through an internal API and push that list into an SQS queue. The queue would then act as a trigger for the second lambda, which aggregates the data and sends off the emails individually.
Data Flow
However, the team soon realized that we also wanted internal emails sent more than once a month, so we could QA and revise the data with enough time to catch and fix any bugs or data discrepancies before the live emails go out. To address this, we added another CRON trigger to the trigger lambda and passed in a “test” value that carries over into the sender lambda. The diagram then looks like this:
Data Flow Revised
With the tech stack and data flow figured out, it was time to begin development.
Development
Trigger Lambda
Let’s take a look at the first lambda – the trigger. This lambda ensures that the process starts on a schedule and every customer is included in the emails.
This lambda runs on the two CRON schedules mentioned above. In the serverless config, they appear as:
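A minimal sketch of what those schedule events might look like in serverless.yml; the function name, handler path, and exact cron expressions are illustrative, but the times match the schedule described below.

```yaml
# Sketch only: function name and handler path are assumptions
functions:
  triggerMonthlyEmails:
    handler: src/trigger.handler
    events:
      # Internal test run: every day at 11 AM UTC, flagged as a test email
      - schedule:
          rate: cron(0 11 * * ? *)
          input:
            testMail: true
      # Live run: the 5th of every month at 3 PM UTC
      - schedule:
          rate: cron(0 15 5 * ? *)
          input:
            testMail: false
```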
As you can see, one is scheduled to run every day at 11 AM UTC with an input of testMail: true, and another is scheduled to run on the 5th of every month at 3 PM UTC (this allows a 5-day window to catch any bugs before the live emails are sent each month).
Within the function, we write our handler like so:
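A sketch of that handler, assuming an internal customers API client (fetchCustomers) and a QUEUE_URL environment variable; the filtering and queueing mirror the description that follows.

```javascript
const { SQSClient, SendMessageCommand } = require('@aws-sdk/client-sqs');
const { fetchCustomers } = require('./customersApi'); // hypothetical internal API client

const sqs = new SQSClient({});

module.exports.handler = async (event) => {
  const testMail = Boolean(event && event.testMail);
  const customers = await fetchCustomers();

  for (const customer of customers) {
    // Skip customers without a valid code or name, and any test accounts
    if (!customer.code || !customer.name) continue;
    if (customer.name.toLowerCase().includes('test')) continue;

    // Push the customer onto the queue with the relevant information,
    // carrying the testMail flag through to the sender lambda
    await sqs.send(new SendMessageCommand({
      QueueUrl: process.env.QUEUE_URL,
      MessageBody: JSON.stringify({ ...customer, testMail }),
    }));
  }
};
```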
In this function, we fetch our customers, and for each customer, check to make sure they have a valid code and name (excluding any with the word “test” in the name). Then we push that customer to the queue with all relevant information.
Sender Lambda
Now we move on to our second lambda, where all the data processing happens. For the serverless config, we have a single event to trigger the lambda:
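Something along these lines in serverless.yml, where the queue ARN and batch size are illustrative:

```yaml
# Sketch only: the queue ARN and batchSize are assumptions
functions:
  sendActivityEmail:
    handler: src/sender.handler
    events:
      - sqs:
          arn: arn:aws:sqs:us-east-1:123456789012:monthly-activity-emails
          batchSize: 1
```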
This means that when new events are pushed to the queue, this lambda is triggered. Let’s break down the lambda. It consists of four parts: extracting data from SQS, connecting to the clients, querying the data sources, and building the email.
Extracting Data
To extract the data, we use the following code:
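A minimal sketch of that extraction; the payload fields match what the trigger lambda pushed, and anything beyond code, name, and testMail is omitted here.

```javascript
// Each SQS record's body carries the JSON payload pushed by the trigger lambda
module.exports.handler = async (event) => {
  for (const record of event.Records) {
    const { code, name, testMail } = JSON.parse(record.body);

    if (!code || !name) {
      console.warn('Skipping record with missing customer data', record.messageId);
      continue;
    }

    // ...connect clients, collect data, and build the email for this customer
  }
};
```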
Connecting Clients
Once we’ve verified we have all the data, we connect to our clients:
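The exact client libraries aren’t specified here, so the sketch below assumes common choices: googleapis for Google Analytics, pg for Redshift (which speaks the Postgres protocol), the AWS SDK for Athena, and environment variables for credentials.

```javascript
const { google } = require('googleapis');
const { Client: RedshiftClient } = require('pg'); // Redshift is Postgres-compatible
const { AthenaClient } = require('@aws-sdk/client-athena');

async function connectClients() {
  // Google Analytics Reporting API
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/analytics.readonly'],
  });
  const analytics = google.analyticsreporting({ version: 'v4', auth });

  // AWS Redshift, via a standard Postgres connection string (assumed env var)
  const redshift = new RedshiftClient({ connectionString: process.env.REDSHIFT_URL });
  await redshift.connect();

  // AWS Athena
  const athena = new AthenaClient({ region: process.env.AWS_REGION });

  return { analytics, redshift, athena };
}
```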
Data Collection
Then we can begin collecting the data. First, we ensure that the customer has enough activity to warrant an email. Our current threshold is 100 page views per month.
Once we’ve verified the pageview count, the team collects data from various sources.
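A hedged sketch of that threshold check; getMonthlyPageViews is a hypothetical helper that queries the analytics client, while the 100-page-view threshold is the one mentioned above.

```javascript
// getMonthlyPageViews is a hypothetical helper that queries the analytics client
const MIN_MONTHLY_PAGE_VIEWS = 100;

async function shouldSendEmail(analytics, customer) {
  const pageViews = await getMonthlyPageViews(analytics, customer.code);
  if (pageViews < MIN_MONTHLY_PAGE_VIEWS) {
    console.info(`Skipping ${customer.name}: only ${pageViews} page views this month`);
    return false;
  }
  return true;
}
```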
Building the Email
Once the data is collected, building the email can begin. Chart.js configurations define the charts, and Quickchart generates images of them. Quickchart returns an image URL, which we then include in the emails.
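A sketch of that step using Quickchart’s /chart/create endpoint (Node 18+ global fetch); the chart type and datasets below are placeholder data.

```javascript
async function createChartUrl(labels, previousMonth, currentMonth) {
  // A standard Chart.js configuration, rendered to an image by Quickchart
  const chartConfig = {
    type: 'bar',
    data: {
      labels,
      datasets: [
        { label: 'Previous month', data: previousMonth },
        { label: 'This month', data: currentMonth },
      ],
    },
  };

  const response = await fetch('https://quickchart.io/chart/create', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ chart: chartConfig }),
  });

  const { url } = await response.json();
  return url; // hosted image URL to embed in the email
}
```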
The team uses a function that puts together an object with all the data, which is then sent off to the SendGrid API for processing.
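A sketch of that hand-off with the @sendgrid/mail client; the template ID, sender address, recipients field, and the shape of emailData are assumptions.

```javascript
const sgMail = require('@sendgrid/mail');
sgMail.setApiKey(process.env.SENDGRID_API_KEY);

async function sendActivityEmail(customer, emailData) {
  // Assumed fields: customer.recipients, FROM_EMAIL, and SENDGRID_TEMPLATE_ID
  await sgMail.send({
    to: customer.recipients,
    from: process.env.FROM_EMAIL,
    templateId: process.env.SENDGRID_TEMPLATE_ID,
    dynamicTemplateData: emailData, // tables, chart URLs, pre-built HTML blocks
  });
}
```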
Conclusion: Email Hurdles and Dynamic Styling
Because this report is delivered over email, there were a lot of limitations on how the data could be displayed to the user. The goal was for the emails to be dynamic and appealing, but without JavaScript, the team had to rely on plain HTML and CSS. To make things worse, CSS classes did not play well with some email providers (notably Gmail): the classes would be stripped away when forwarding or replying to the email, leaving most sections without any styling and breaking the arrangement put in place. So instead, the team had to put all CSS styling inside style="" attributes on the HTML tags. This may not seem like a big issue, but when you want colors and icons to change based on the data, it becomes a big problem.
Example of colors and icons changing based on data
To solve this, the team took advantage of Handlebars’ HTML escaping behavior and used an unescaped (triple-stash) block that is populated with pre-generated HTML containing the correct styling and icon.
Handlebars HTML block for page views
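A minimal sketch of such a block; pageViews and pageViewsChangeHtml are assumed field names.

```handlebars
{{!-- Sketch only: the triple-stash keeps Handlebars from escaping the
      pre-generated HTML passed in as pageViewsChangeHtml --}}
<td style="padding: 8px; font-size: 14px;">
  Page views: {{pageViews}}
  {{{pageViewsChangeHtml}}}
</td>
```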
HTML similar to the example below would then be generated and inserted into that block:
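For example (the color, icon URL, and percentage here are placeholders):

```html
<!-- Placeholder values: the color, icon, and percentage are chosen from the data -->
<span style="color: #2e7d32; font-weight: bold;">
  <img src="https://example.com/icons/arrow-up-green.png" width="12" height="12" alt="up" />
  +12% vs. last month
</span>
```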
Dynamic Tables
In our emails, a lot of tables are used to show data. However, these tables almost never contain the same number of rows across customers. Some should show four rows, others three, two, one, or even none (if there was no data). Without JavaScript, there is no way to dynamically add or remove <tr> elements from a table, and we didn’t want to generate entire tables in JavaScript because we wanted to keep flexibility in the templates (in case the design or arrangement needed to change). This took a while to solve but was ultimately resolved with Handlebars’ conditionals and the CSS display property.
In our email template, we used a Handlebars if/else block on a boolean value that determines whether to show a table or a message saying there is no data. If the boolean evaluates to true, the table is rendered. Each row then has an inline display style that evaluates to either none or table-row.
Handlebars HTML block for a dynamic table
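A sketch of that block; showTable and the per-row field names are assumed, and each row’s display value is pre-computed server-side as either table-row or none.

```handlebars
{{!-- Sketch only: showTable and the rowN fields are assumed names --}}
{{#if showTable}}
<table style="width: 100%; border-collapse: collapse;">
  <tr style="display: {{row1Display}};">
    <td style="padding: 4px;">{{row1Label}}</td>
    <td style="padding: 4px;">{{row1Value}}</td>
  </tr>
  <tr style="display: {{row2Display}};">
    <td style="padding: 4px;">{{row2Label}}</td>
    <td style="padding: 4px;">{{row2Value}}</td>
  </tr>
  {{!-- ...additional rows follow the same pattern --}}
</table>
{{else}}
<p style="font-size: 14px;">No data available for this period.</p>
{{/if}}
```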
With this, we could control which rows to show or hide or even hide the table completely and show a message instead.
Grid layouts
Lastly, we wanted to include a section where customers could view the top 3 tutorials from the tutorials page, with each thumbnail displayed next to its tutorial title. We tried different methods, including HTML tables and vanilla CSS styling. Unfortunately, some email clients kept breaking the layout, which would result in something like this when forwarding or replying to the email:
Broken Layout
That’s not the most eye-pleasing thing to see. While sticking to the most compatible, bare-bones styling, everything we tried resulted in a janky layout. However, while experimenting with various solutions, the (relatively newer) flexbox approach worked.
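A sketch of the flexbox markup that held up, with all styling inline; the thumbnail URL, link, and text are placeholders.

```html
<!-- Placeholder thumbnail, link, and text; all styling is inline -->
<div style="display: flex; align-items: center; margin-bottom: 12px;">
  <img src="https://example.com/tutorial-thumbnail.png" width="120" alt=""
       style="margin-right: 12px;" />
  <a href="https://example.com/tutorials/example" style="font-size: 16px; text-decoration: none;">
    Tutorial title goes here
  </a>
</div>
```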
After many obstacles, customers were ready to start receiving emails.
Final Thoughts
The journey of generating these emails has taught the team a lot. It has helped us spot data inconsistencies and discrepancies in our products and figure out what data is most meaningful to customers at a glance. We’re still learning more efficient ways to collect and query the data. For the next iteration, we plan to integrate with Salesforce to manage our email lists better and automate the emails even further. These emails are currently being sent in a beta phase, and there’s still a lot of feedback to gather from customers and learn from. For now, we’ll continue to monitor the feedback, iterate on designs, and continuously improve how we can make data analysis more tangible for our customers.