How I reduced my API’s monthly maintenance cost from ~$5 to $0
$5? It ain’t much, but still …
At the end of May 2022, I started an API as a Service with operational costs summing to between $4 and $5 a month. As for the cost breakup, it’s solely the running server, as the DB is MongoDB’s free 500 MB shared cluster.
The service is based on FastAPI, a popular Python web framework, and I chose railway.app (referral link) to host the server. Railway used to bill me around $5 a month in usage for this application.
I continued using Railway for 2–3 months, but I always wanted to reduce costs, and that’s when I thought of going serverless. Serverless perfectly suits my use case here, as the entire service can be ephemeral and invoked only when needed. So there’s no more need to run a server 24x7 that spends most of its time idle.
Choosing serverless didn’t end my quest entirely. Let’s go over the serverless providers I reviewed at the time:
> AWS Lambda
> Microsoft Azure Functions
> Google Cloud Functions
> Cloudflare Workers
> Vercel Functions
I wanted to choose between Cloudflare Workers and Vercel, as they don’t require a credit card for what I wanted to use, unlike AWS, Azure, or GCP.
I started checking the Cloudflare Workers documentation, and there came a bummer: it compiles Python to JS. I really thought Workers would suffice, but now I was left with only Vercel before I’d have to move to AWS or the other two.
Upon reading Vercel’s serverless documentation, it gave back whatever hope I had lost with CF Workers. The best parts: an app variable that exposes a WSGI or ASGI app, and the GitHub integration.
So I hardly needed to make any changes to my current codebase, as Vercel already supports the Asynchronous Server Gateway Interface (ASGI); with the GitHub integration, I just push changes to my project’s GitHub repository and Vercel triggers a new build on every push.
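For context, here is a minimal sketch of what Vercel expects from a Python entry point, assuming a FastAPI service; the file name and route are illustrative, not my project’s actual ones:

```python
# main.py — illustrative entry point (file name and route are assumptions)
from fastapi import FastAPI

# Vercel's Python builder looks for a module-level WSGI or ASGI
# application exposed under the name `app`.
app = FastAPI()

@app.get("/ping")
async def ping():
    # The real service returns JSON responses much like this one.
    return {"status": "ok"}
```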
I added a vercel.json with the builds and routes info in the root directory (a sketch is shown below). In no time, I had the service online through Vercel to test whether I could move off Railway completely.
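A vercel.json along these lines is enough; treat it as a sketch, with main.py standing in for the actual entry-point file:

```json
{
  "builds": [
    { "src": "main.py", "use": "@vercel/python" }
  ],
  "routes": [
    { "src": "/(.*)", "dest": "main.py" }
  ]
}
```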
After doing some tests, I felt it was functioning exactly as expected and linked my web domain to Vercel.
For that entire month, I kept an eye on the number of requests Vercel was handling and on the free-tier limits. Since the service serves everything as JSON and no other media, I was (and still am) able to keep bandwidth to just 1% of the usage limit, with function execution at 1% as well.
Happy with the whole thing, then? Let’s also look at the compromises I made along the way.
There is no perfect system; software engineering is all about choosing trade-offs.
One of the compromises, the only one in my case, is the cold start.
A cold start adds latency to a function invocation when the function hasn’t been invoked for a while. If the function is accessed frequently, or at least before the provider flushes it, it is still loaded and ready to serve quickly; we can call that a warm start. If not, the provider flushes the function to free up resources, and the next user has to bear a mild latency hit, because that will be a cold start.
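If you’re curious, the difference is easy to observe with a rough timing check; the URL below is a placeholder, and a single measurement can’t fully separate cold-start overhead from ordinary network latency, so treat this as an illustration only:

```python
import time
import requests  # third-party: pip install requests

# Placeholder URL; point it at your own deployment.
URL = "https://your-project.vercel.app/ping"

# The first request after a long idle period is likely a cold start;
# the immediate follow-up should hit a warm instance.
for label in ("first (possibly cold)", "second (warm)"):
    start = time.perf_counter()
    requests.get(URL, timeout=30)
    print(f"{label}: {time.perf_counter() - start:.2f} s")
```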
That’s the end.
This is how I shifted from Railway to Vercel and am able to save costs with serverless, accepting some trade-offs. But is it worth it? Ofc, totally in my case …