Using CDN caching with GraphQL


Hi, today we'll see how to set up CDN caching for GraphQL in a few quick, easy and dirty steps. The idea for this article came to me because the other day I was reading a blog post about reasons you should not use GraphQL in production. I've been using it for a while on production websites and I disagree with some of the arguments presented in the post.

One of them was that you cannot do CDN caching with GraphQL because almost no CDN supports caching based on the POST request body. That's why I'll show you in this article how you could (but shouldn't) implement GraphQL caching over GET requests to get (almost) the same behavior as a JSON REST API.
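To make the limitation concrete, here's a toy sketch (not any real CDN's code) of how a CDN typically builds its cache key from the method and URL, ignoring the request body:

```javascript
// Hypothetical CDN cache key: method + URL, never the body.
function cacheKey(req) {
  return `${req.method} ${req.url}`;
}

// Two different POSTed queries collapse onto the same key...
const postA = cacheKey({method: 'POST', url: '/graphql', body: '{hello}'});
const postB = cacheKey({method: 'POST', url: '/graphql', body: '{goodbye}'});
console.log(postA === postB); // true: the CDN cannot tell them apart

// ...while GET queries stay distinguishable, because the query lives in the URL.
const getA = cacheKey({method: 'GET', url: '/graphql?query={hello}'});
const getB = cacheKey({method: 'GET', url: '/graphql?query={goodbye}'});
console.log(getA === getB); // false
```

This is why caching POST-based GraphQL at the CDN layer is a dead end, and why the rest of this article forces everything through GET.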

Creating a simple GraphQL server

So let's begin by creating a GraphQL server with a really simple schema offering a single hello world field. I used Koa (because it is better than Express :troll:) and exposed a /graphql endpoint.

const Koa = require('koa');
const Router = require('koa-router');
const graphqlHTTP = require('koa-graphql');

const {buildSchema} = require('graphql');

const app = new Koa();
const router = new Router();

// A minimal schema with a single queryable field
const schema = buildSchema(`
  type Query {
    hello: String
  }
`);

// Resolver for the hello field
const root = {
  hello: () => 'Hello world!'
};

router.all('/graphql', graphqlHTTP({
  schema: schema,
  rootValue: root,
  graphiql: true
}));

app.use(router.routes());
app.listen(4000); // port is arbitrary


Then we run a GraphQL query to see if it works. Annnnnd yes.

curl -X GET ""
{"data":{"hello":"Hello world!"}}

As you can see, I'm passing a query parameter to define which GraphQL action I want to run on the server.


Actually, our caching could already work as is, since the GraphQL query is passed with the GET method and not POST. But because GraphQL queries can be very big, the default option is to use POST. The solution for still using GET requests is to compress the query using lz-string, so that we don't run into URL length limits when passing our queries.

So to do that, we'll create a fake client that adds query-string compression and forces a GET request.

const rp = require('request-promise');
const LZString = require('lz-string');

// Compress the query so it fits comfortably in a URL
function prepareQuery(query) {
  return LZString.compressToBase64(query);
}

async function sendQuery() {
  return rp({
    method: 'GET',
    uri: '',
    qs: {
      query: prepareQuery('{hello}')
    }
  });
}

sendQuery()
  .then(res => console.log(res));
So the query will now look like this:

REQUEST emitting complete
{"data":{"hello":"Hello world!"}}

Now, to handle this type of query, we need to add some reaaaally simple logic to our GraphQL middleware that decodes the query before sending it on to schema execution. This is done by using the function form of the options handler, which is executed at the very beginning of the Koa GraphQL middleware.

router.all('/graphql', graphqlHTTP((request, response, ctx) => {
  // Decompress the query back to plain GraphQL before it is executed
  const decodedQuery = LZString.decompressFromBase64(request.query.query);
  ctx.query = {query: decodedQuery};
  return {
    schema: schema,
    rootValue: root,
    graphiql: true
  };
}));
As you can see, it's very simple and straightforward to make it happen. So why isn't it the default GraphQL implementation?


Actually, GraphQL implementations (I'm mainly talking about Apollo, as it is the one I know best) implement caching in a very different way: by caching received data on the client side. Because clients manage their own caches locally, a lot of useless requests to the server can be avoided, which optimizes the whole requesting process.
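To illustrate the principle (this is a toy sketch, not Apollo's actual InMemoryCache), a client-side cache is essentially a map from query string to result, consulted before any network call:

```javascript
// Toy client-side query cache: identical queries skip the network entirely.
const cache = new Map();
let serverHits = 0;

// Stand-in for a real network request to the GraphQL server
async function fetchFromServer(query) {
  serverHits += 1;
  return {data: {hello: 'Hello world!'}};
}

async function cachedQuery(query) {
  if (!cache.has(query)) {
    cache.set(query, await fetchFromServer(query));
  }
  return cache.get(query);
}

(async () => {
  await cachedQuery('{hello}');
  await cachedQuery('{hello}'); // second call is served from the local cache
  console.log(serverHits); // 1
})();
```

Real client caches are much smarter (they normalize results by entity so different queries can share data), but the effect is the same: repeated queries never leave the client.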

The other drawback of this server-side CDN caching is that GraphQL queries may vary slightly from client to client, and therefore create a lot of cache entries for essentially the same data.
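For example, two queries that ask for exactly the same data but differ only in formatting produce different query strings, hence different URLs and different CDN cache entries (a hypothetical illustration):

```javascript
// Semantically identical queries, textually different: the CDN sees two URLs.
const clientA = '{hello}';
const clientB = '{ hello }'; // same field, extra whitespace

const urlA = '/graphql?query=' + encodeURIComponent(clientA);
const urlB = '/graphql?query=' + encodeURIComponent(clientB);

console.log(urlA === urlB); // false: two cache entries for the same data
```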

To conclude, I'd say this implementation could be useful in some specific cases where clients always issue the same requests and you cannot add a client-side caching system (on some legacy systems), but it should be avoided as much as possible, as it is not the most efficient way to cache GraphQL queries.

Go see my website for some more info about who I am:
