E-commerce websites present a unique set of challenges: they need to be extremely dynamic, respond well to massive traffic spikes, and load instantly to maximize conversion and customer happiness.
How can we enable high-performance e-commerce without provisioning expensive infrastructure round-the-clock?
We built an open source, serverless Virtual Reality Store, complete with a payment flow, global caching, and deployments close to its data source.

You can browse 3D models of products at serverless-vrs.now.sh, and experience a complete payment flow.

By adopting on-demand infrastructure, we were able to design a cost-efficient approach to delivering a high-performance e-commerce store capable of meeting modern-day requirements. Combined with a global caching layer, this ensures consistently fast load times without paying for expensive, high-performance infrastructure during quieter times.
While we had great fun building it with Next.js, MongoDB Atlas, and Passport, we did zero work on setting up the high-performance infrastructure behind this demo. In this post, we explain how.

Persistence with MongoDB

We used MongoDB as our persistence layer and deployed it with Atlas. This choice freed us from database administration and allowed us to focus on creating the store itself. For modelling MongoDB objects, we used mongoose.
// backend/models/Model.js

const mongoose = require("mongoose");

const ModelSchema = new mongoose.Schema({
  id: {
    type: Number,
    required: true
  },
  name: {
    type: String,
    required: true
  },
  description: {
    type: String,
    required: true
  },
  price: {
    type: Number,
    required: true
  }
});

module.exports = ModelSchema;

Mongoose schema properties for Model, which describes a product.

// backend/api/get-products/index.js

const mongoose = require("mongoose");
const app = require("../../utils/app");
const ModelSchema = require("../../models/Model");

app.get("*", async (req, res) => {
  try {
    /*
      Create a new mongoose connection to MongoDB Atlas with the connection
      string, which we will close before sending a response back to the client.
    */
    const connection = await mongoose.createConnection(
      process.env.MONGODB_ATLAS_URI,
      {
        /*
          Buffering allows Mongoose to queue up operations if MongoDB
          gets disconnected, and to send them upon reconnection.
          With serverless, it is better to fail fast when not connected.
        */
        bufferCommands: false,
        bufferMaxEntries: 0
      }
    );
    const Model = connection.model("Model", ModelSchema);
    const docs = await Model.find({});
    await connection.close();
    res.set("cache-control", "s-maxage=1, max-age=0, stale-while-revalidate");
    res.json({ docs });
  } catch (e) {
    res.status(500).json({ error: e.message || "uh oh" });
  }
});

module.exports = app;

A snippet of the code responsible for retrieving product information from MongoDB Atlas.

// frontend/pages/store.js

Store.getInitialProps = async ({ req }) => {
  const URL = "https://serverless-vrs.now.sh/api/get-products";

  const props = { products: [] };

  try {
    const response = await fetch(URL);
    const { docs } = await response.json();
    props.products = docs;
  } catch (e) {
    // On failure, fall back to an empty product list.
  }

  return props;
};

A simplified illustration of the API being consumed on the Next.js frontend.

Authentication with Passport

We took advantage of Passport for authentication, which supports strategies for all common login methods. To set up Passport, we need to specify the strategy for GitHub auth. The strategy involves specifying a CLIENT_ID and CLIENT_SECRET which we obtain from GitHub, a callbackURL which is a custom route on our app, and a function that handles the MongoDB user creation upon successful authentication.
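The shape of that last piece, the verify callback, is worth spelling out. Below is a minimal illustrative sketch of the find-or-create logic, with an in-memory Map standing in for the MongoDB users collection; the function name `findOrCreateUser` and the store are our own illustration, not code from the actual repository.

```javascript
// Illustrative sketch: the verify-callback logic behind GitHub auth,
// using an in-memory Map in place of the MongoDB users collection.
const users = new Map();

function findOrCreateUser(profile, done) {
  const existing = users.get(profile.id);
  if (existing) {
    // Known user: hand it straight back to Passport.
    return done(null, existing);
  }
  // First login: persist a new user record, then hand it to Passport.
  const user = { githubId: profile.id, username: profile.username };
  users.set(profile.id, user);
  done(null, user);
}

// The real strategy would be wired up roughly like:
//   passport.use(new GitHubStrategy(
//     { clientID, clientSecret, callbackURL },
//     (accessToken, refreshToken, profile, done) =>
//       findOrCreateUser(profile, done)
//   ));
```

In production this logic would query MongoDB instead of a Map, but the contract is the same: look the user up by their GitHub profile id, create them if absent, and pass the result to `done`.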

Payments with Stripe

For handling payments, we used Stripe for its elegance and simplicity. Stripe offers rich, pre-built UI components via Stripe Elements. Since our project is rendered server-side with Next.js, we use the react-stripe-elements-universal package. It is straightforward to use: we only need to declare the component at an appropriate place in our UI, passing in the Stripe API key.
To process the payment, we need to create a checkout lambda that accepts a token and amount. Once the card details are submitted, a Stripe token is created and sent to the checkout lambda along with the total amount to be charged.
// backend/api/checkout/index.js

const app = require("../../utils/app");
const stripe = require("stripe")(process.env.STRIPE_SECRET);

app.post("*", async (req, res) => {
  const { amount, token } = req.body;

  try {
    const { status } = await stripe.charges.create({
      amount: parseInt(amount, 10),
      currency: "usd",
      description: "VRS",
      source: token
    });
    res.json({ status });
  } catch ({ message }) {
    res.status(400).json({ error: message });
  }
});

module.exports = app;

The checkout lambda is responsible for creating a Stripe charge. It is invoked once a customer goes through the payment flow on the frontend, passing along amount and token.
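One detail worth noting: Stripe expects `amount` in the smallest currency unit, so cents for USD. If the frontend tracks prices in dollars, it must convert before invoking the lambda. A small illustrative helper (`toCents` is our own hypothetical name, not part of the codebase):

```javascript
// Convert a dollar price to the integer cent amount Stripe expects.
// Math.round guards against floating-point artifacts: in JavaScript,
// 19.99 * 100 evaluates to 1998.9999999999998, not 1999.
function toCents(dollars) {
  return Math.round(dollars * 100);
}
```

With this in place, a $19.99 product is charged as `amount: 1999`, and `parseInt` in the lambda is only a safety net rather than a conversion step.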

Static meets Dynamic

We care deeply about optimizing for performance because that ultimately leads to a great user experience.

Through the use of smart edge caching in combination with server-side rendering, we were able to achieve a perfect Lighthouse performance score.

It is not uncommon for e-commerce sites to experience thousands of requests a second during the festive season. During this time, it is inefficient (and expensive) to compute each of those requests separately when they could be served directly from the cache.
To improve this, we use the stale-while-revalidate header — this allows us to serve previously cached content even after the max-age expires, while the revalidation process takes place in the background. The revalidation itself involves a single request to the lambda, whose response is cached to serve subsequent requests.
  "s-maxage=1, maxage=0, stale-while-revalidate=31536000"

s-maxage sets the cache expiry on the edge, and max-age=0 instructs the browser not to cache the content.
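The behaviour these directives produce at the edge can be modeled as a small decision function over the age of a cached response. This is an illustrative sketch of the three resulting states, not the platform's actual implementation; the function name and defaults mirror the header values above.

```javascript
// Illustrative model of how an edge cache treats
// "s-maxage=1, max-age=0, stale-while-revalidate=31536000".
// ageSeconds is how long ago the cached response was stored.
function cacheDecision(ageSeconds, sMaxage = 1, staleWindow = 31536000) {
  if (ageSeconds <= sMaxage) {
    // Within s-maxage: serve from cache, no lambda invocation at all.
    return "fresh";
  }
  if (ageSeconds <= sMaxage + staleWindow) {
    // Past s-maxage but within the stale window: serve the stale copy
    // immediately and revalidate with a single background request.
    return "stale-while-revalidate";
  }
  // Outside both windows: the cache must fetch synchronously.
  return "expired";
}
```

The practical upshot is that even under a traffic spike, at most one request per second reaches the lambda; everyone else is served from the edge.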

Deploying and Infrastructure

The beauty of this demo is in the simplicity of its infrastructure. We deploy on the Now platform; once set up, deploying takes a single command: now. We don't have to worry about setting up or renewing SSL certificates, or about managing DNS records. Since every request is either served from the cache or invokes an independent serverless lambda, the infrastructure handles peak-time traffic efficiently, and the on-demand pricing model ensures that we never pay for idle time.

The layout of our e-commerce demo monorepo.

Deploying lambdas close to the data source is vital because it minimizes latency. We can specify any Origin that is part of the global ZEIT network within the regions property of now.json. In our case, when we originally set up our MongoDB cluster, we specified its region as California; therefore, we chose SFO as our Origin region.
// now.json

{
  "version": 2,
  "alias": ["serverless-vrs.now.sh"],
  "regions": ["sfo"]
}

The choice of Origin region can be specified with the regions property.
All configuration options for now.json can be found on its docs page.

Before we deploy, we need to set values for the environment variables.
// now.json

{
  "env": {
    "GITHUB_CLIENT_ID": "@vrs-github-client-id",
    "GITHUB_CLIENT_SECRET": "@vrs-github-client-secret",
    "MONGODB_ATLAS_URI": "@vrs-mongodb-atlas-uri",
    "STRIPE_SECRET": "@vrs-stripe-secret"
  }
}

Each @-prefixed value in now.json references a secret, which we set via Now CLI:

now secret add vrs-github-client-id "***"
now secret add vrs-github-client-secret "***"
now secret add vrs-mongodb-atlas-uri "***"
now secret add vrs-stripe-secret "***"

Using now secret add ensures that the data our apps need to function is stored in a secure manner.

Once configured, deployment is just one command.

For subsequent deployments, using now secret add is not necessary unless new secrets need to be introduced.

Furthermore, we have the Now for GitHub integration set up on our GitHub repository. This means that we can just push our code, and deployments are automatically taken care of.

Thanks to Now for GitHub, all our commits, pull requests and merges automatically get built and deployed.


Last year alone, e-commerce accounted for 11.9% of all retail sales worldwide.
Through our Virtual Reality Store demo, we have demonstrated how serverless is a perfect fit for a high-performance application such as e-commerce, with minimal configuration. Serverless allows us to dedicate time to the business logic that matters, without getting distracted by the complications of server and network administration.
We hope you enjoyed reading about this exploration as much as we enjoyed creating it. The project is entirely open source and the code is available at its GitHub repository.
We are very excited to see what this inspires you to create. There are further optimizations that could be made with the WebGL and 3D rendering, and we encourage you to take a stab at it. The 3D models for this demo were downloaded from Clara.
If you would like to share any feedback, or even simply say hello, we are reachable via Twitter or chat.