At TangoCode, we try to meet and exceed our clients' expectations. In everything we do, we strive to provide a pleasant experience with our applications. While we create customized UI/UX designs, we also believe our solutions must perform well. When working with serverless architectures, performance deserves special attention: you need a plan to mitigate cold-start delays from the outset.
The Challenge of the Cold Start
To ensure top performance, the first step is understanding the cold start and how it impacts your solution. When nobody is using a function, it sits in an idle state; when a new request arrives, it transitions to a warm state. That first request takes longer to respond because the runtime has to read and load the function's code. Once the function is warmed up, though, subsequent requests are handled faster, since the code is already loaded in memory.
Before digging into cold-start delays, it's worth remembering why serverless is attractive in the first place: on-demand infrastructure that auto-scales, with pay-per-use pricing.
At TangoCode, our tech vision is to develop strong backends, offer quick implementation, provide fast responses, and create easy-to-scale applications. For these reasons, Serverless fits our solutions perfectly. You can read more about this in our recent blog: “Serverless Architecture with Azure.”
Available Serverless Providers
Currently, we use a variety of platforms to create solutions for our clients. Some of them are configured in Amazon AWS. Recently, however, we had a project with Microsoft Azure, which represented a new challenge. Today, we want to share our experience using NodeJS in Azure Functions: how to implement them and how to improve their performance.
The architecture we use includes several Azure resources: App Services, Functions, Cosmos DB (formerly DocumentDB), API Management, and more.
Azure Architecture for a Standard Project
The graph above shows a typical pattern for microservices using Cosmos DB as the database and API Management in front. The endpoints this pattern exposes are what our App Services connect to.
The Main Problem of Working With NodeJS
After months of coding functions with NodeJS, we noticed significant cold-start delays. We knew there were different Azure Functions hosting plans, so our first approach was to change from the Consumption Plan to an App Service Plan. Unfortunately, this alone did not solve the problem. We were still experiencing cold-start delays of about 36 seconds, which hurt the user experience. For example, when posting a comment, users couldn't see it until the service returned a confirmation. Showing an infinite spinner was not an acceptable option, so we kept looking for better ways to improve the experience.
We tried different solutions until we found one that met our performance expectations. Here's a breakdown of our process:
- We changed from a Consumption Plan (pay only when your functions are running) to an App Service Plan (perpetually warm instances to avoid cold starts, unlimited execution duration). This didn't solve our problem: we noticed a moderate performance improvement, but we were still experiencing cold-start delays.
- After some research, we implemented a timer-trigger function to keep the apps consistently warm. This trigger was configured to execute every 5 to 20 minutes. Given that we have about 40 functions, though, this was not an optimal solution, since we needed to configure every function app and call each function to keep it awake. The many calls caused saturation, and sometimes this approach actually made things slower.
- The solution! We found the Funcpack library. Reading through its documentation, we found this: “Whenever an Azure Function App is recreated on demand (a so-called ‘cold start’) the node.js module cache for each Function is empty. The current Functions file system is sluggish in dealing with many small file accesses, so there is a significant delay as node reads the module files. Fortunately, node caches the modules in memory, so subsequent accesses are fast.” In other words, if you have many dependencies as node modules, the first request takes time because Node has to read all of them from disk and save them in its cache; subsequent requests read from the cache and are much faster. The fix is to avoid that file structure altogether and create a single bundle, as Webpack does. That's where the azure-functions-pack library enters the picture: it creates a single file containing all the code and dependencies. The results were excellent: cold starts that previously took ~36 seconds now take ~1.3 seconds.
How to Implement Funcpack
This library is easy to implement. First, install it (the pack step will make the necessary changes to each function.json for you):
npm install -g azure-functions-pack
Next, execute this command in the function app root:
funcpack pack ./
This creates the bundle in the path /dist/src/.funcpack
Finally, the bundle should be uploaded to the path: /site/wwwroot/
NOTE: This library is no longer actively maintained.
Run From Package
The azure-functions-pack library has since been replaced by the “Run from package” feature in Azure Functions. From the documentation: “Run From Package addresses this same problem by keeping files as one payload and using a virtual file system that is much faster than the slower file system used by default.”
Following the same logic, it lets you upload a single .zip file, which avoids reading every node dependency individually and improves cold-start times. You can activate this feature by setting ‘WEBSITE_RUN_FROM_PACKAGE’ to 1 in the function app configuration.
Here’s an example of how to activate it.
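As a sketch, assuming you have the Azure CLI installed (the setting can equally be added by hand in the portal under Application settings; the resource group and app names below are placeholders):

```shell
# Set WEBSITE_RUN_FROM_PACKAGE=1 so the runtime mounts the deployed .zip
# as a read-only virtual file system instead of expanding it on disk.
az functionapp config appsettings set \
  --resource-group <resource-group> \
  --name <function-app-name> \
  --settings WEBSITE_RUN_FROM_PACKAGE=1
```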
Today, working with NodeJS functions in Azure can be frustrating, especially for functions with many dependencies. Luckily, there are options to make them faster. Most importantly, it's no longer necessary to install an extra library, since Azure offers the “Run from package” option in the Application settings. Thanks to this, we saw a significant improvement in our functions, and everything now performs as expected.
Stay tuned! In our next blog, we will be talking about how to create a NodeJS function from scratch in VS Code and how to deploy it using the “Run from package” feature.