Since their release, we have seen an incredible reception and rapid adoption of our first-class static deployments. When you statically generate a website, deploying it is as simple as running now, and it is live instantly. And as of a few weeks ago, you can even share individual files just as easily.
Static deployments are 20 times more popular today than during the initial month following their launch. As such, we have made many important improvements to make sure they remain consistently snappy and scalable.
One of the most interesting properties of Now is that each deployment is immutable. In other words, once your static deployment completes, its contents won’t ever change.
We take full advantage of this property internally. When our load balancers receive a request, we first decode the Host header into a deployment id. This applies to any alias (like mysite.com) as well as to the Now URL of the deployment itself (my-site-pkfrapmbcl.now.sh).
If we then determine the deployment is static, our load balancers communicate with the internal servers in charge of delivering your files with the correct headers, applying your configuration, determining whether special files like index.html should be served, etc.
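To make that routing step concrete, here is a simplified sketch of the Host-to-deployment lookup in TypeScript. The in-memory map and function below are illustrative placeholders, not Now's internal API:

```typescript
// Hypothetical stand-in for Now's alias/deployment store.
interface Deployment {
  id: string;
  type: "static" | "dynamic";
}

// Both a custom alias and the deployment's own Now URL resolve to the same id.
const deploymentsByHost = new Map<string, Deployment>([
  ["mysite.com", { id: "pkfrapmbcl", type: "static" }],
  ["my-site-pkfrapmbcl.now.sh", { id: "pkfrapmbcl", type: "static" }],
]);

function resolveDeployment(hostHeader: string): Deployment {
  // Normalize the Host header (drop any port, lowercase) before the lookup.
  const host = hostHeader.split(":")[0].toLowerCase();
  const deployment = deploymentsByHost.get(host);
  if (!deployment) {
    throw new Error(`No deployment found for host: ${host}`);
  }
  return deployment;
}

console.log(resolveDeployment("mysite.com")); // { id: "pkfrapmbcl", type: "static" }
```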
When the response is returned by our internal services, we can cache it directly inside our load balancers, regardless of its HTTP headers. Subsequent requests are therefore served as fast as possible. Our new improvements have made this subsystem significantly better, as the numbers below show.
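To illustrate why immutability makes this safe, here is a minimal sketch of a cache keyed only by deployment id and path. Because a static deployment's contents never change, a cached entry never needs to be invalidated; the names below are illustrative and do not refer to our actual implementation:

```typescript
// Simplified model of a load-balancer-side response cache for static deployments.
interface CachedResponse {
  status: number;
  headers: Record<string, string>;
  body: string;
}

const cache = new Map<string, CachedResponse>();

async function serveStatic(
  deploymentId: string,
  path: string,
  fetchFromOrigin: () => Promise<CachedResponse> // the internal static file service
): Promise<CachedResponse> {
  const key = `${deploymentId}:${path}`;
  const hit = cache.get(key);
  if (hit) {
    return hit; // served straight from the load balancer, no origin round trip
  }
  const response = await fetchFromOrigin();
  // Deployments are immutable, so there is no need to inspect Cache-Control
  // or set an expiry: the entry can live for the deployment's lifetime.
  cache.set(key, response);
  return response;
}
```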
We set out to try a few different static deployment scenarios to evaluate their performance:
  • A single 100KB HTML file
  • A 1MB image file
  • An entire site comprised of multiple requests (HTML, CSS, JS, images)
Scenario      New        Old
HTML file     200ms      504ms
Image file    586ms      992ms
Full site     1000ms     1790ms

Benchmarks exclude DNS resolution time and the TCP and SSL handshakes.
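If you want to approximate this kind of measurement yourself, one approach is to reuse a keep-alive connection so that DNS resolution and the TCP/SSL handshake are paid for by a warm-up request, and only the subsequent request is timed. The sketch below is an assumed methodology rather than our exact harness, and the URL is a placeholder:

```typescript
import * as https from "https";

// Reusing the agent keeps the connection open between requests, so the
// measured request skips DNS, TCP, and TLS setup entirely.
const agent = new https.Agent({ keepAlive: true });
const url = new URL("https://my-site-pkfrapmbcl.now.sh/index.html"); // placeholder

function timeRequest(): Promise<number> {
  return new Promise((resolve, reject) => {
    const start = process.hrtime.bigint();
    const req = https.request(url, { agent }, (res) => {
      res.on("data", () => {}); // drain the body
      res.on("end", () => {
        resolve(Number(process.hrtime.bigint() - start) / 1e6); // milliseconds
      });
    });
    req.on("error", reject);
    req.end();
  });
}

async function main() {
  await timeRequest(); // warm-up: pays for DNS + TCP + TLS once
  const ms = await timeRequest(); // measured: connection is reused
  console.log(`response time: ${ms.toFixed(1)}ms`);
}

main();
```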

The best part? These improvements are already live, available to all static deployments, including existing ones.
These improvements are just the start. In the coming weeks, we will show you how you can reduce latency even further with enhanced capabilities, applicable to all deployment types.
We look forward to sharing more with you.