Building an API in Rust and Node

I thought I'd jot down some brief notes about a recent project replicating a Node API I wrote in Rust.

If you'd like to skip to the source instead, here's the GitHub repo.

The API

The API receives markdown and returns HTML via JSON strings. At runtime, the program should validate input, handle errors, perform the transformation and sanitize the output HTML.
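Concretely, the request/response contract looks something like this (a sketch — the exact HTML output, including trailing whitespace, depends on the markdown parser and sanitizer in use):

```
POST /markdown-to-html
Content-Type: application/json

{ "markdown": "# Hello" }

HTTP/1.1 200 OK
Content-Type: application/json

{ "html": "<h1>Hello</h1>" }
```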

Node implementation

For brevity, I'll condense parts of the Node API into a single snippet:

import createServer, { IPureHttpServer } from 'pure-http';
import { PureHttpRequest, PureHttpResponse } from './types/pure-http.js';
import { ResponseErrorStatus, ResponseErrorMessage } from './types/index.js';
import { parseBody } from './lib/parse-body.js';
import { parseJSON } from './lib/parse-json.js';
import { z } from 'zod';
import { marked } from 'marked';
import sanitize from 'xss';

// Server
const app: IPureHttpServer = createServer();

// Routes
app.post('/markdown-to-html', async (request: PureHttpRequest, response: PureHttpResponse) =>
  markdownToHtml(request, response)
);

app.listen(3000);

console.log('Listening on http://localhost:3000');

// Handler transforming markdown to HTML
async function markdownToHtml(request: PureHttpRequest, response: PureHttpResponse): Promise<void> {
  let body = await parseBody(request, response);
  let json = await parseJSON(body, response);

  // Construct schema for validation
  const inputSchema = z.object({
    markdown: z.string({
      required_error: 'markdown property is required',
      invalid_type_error: 'markdown property must be a string'
    })
  });

  try {
    // Validate markdown input
    const result = await inputSchema.safeParseAsync(json);

    if (!result.success || typeof result.data.markdown !== 'string') {
      response.json({ reason: ResponseErrorMessage.validateInput }, false, ResponseErrorStatus.validateInput);
      return;
    }

    const { markdown = '' } = result.data || {};

    // Transform markdown into HTML
    const html = marked.parse(markdown);

    // Sanitize output HTML
    const sanitizedHtml = sanitize(html);

    response.json({ html: sanitizedHtml }, false, 200);
  } catch (error) {
    console.error(error);
    response.json({ reason: ResponseErrorMessage.validateInput }, false, ResponseErrorStatus.validateInput);
  }
}

It's a fairly typical Node API, using ES modules with TypeScript and a few carefully selected dependencies:

- pure-http as a minimal server framework
- zod for input validation
- marked for the markdown-to-HTML transformation
- xss for sanitizing the output HTML
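For reference, here's a package.json sketch implied by the imports above. The version numbers are illustrative assumptions, not pinned from the original project; the "type": "module" field is what opts the project into ES modules:

```
{
  "type": "module",
  "dependencies": {
    "pure-http": "^3.0.0",
    "zod": "^3.0.0",
    "marked": "^4.0.0",
    "xss": "^1.0.0"
  },
  "devDependencies": {
    "typescript": "^4.0.0",
    "uvu": "^0.5.0"
  }
}
```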

All in all, I'm happy with the design tradeoffs and shared more on that in the repo. Finding and debugging ESM-compatible libraries was time consuming, which is to be expected. A particular sore spot was replacing Jest, which has too many dependencies and doesn't play well with TypeScript and ESM. I settled on uvu, which I found much more acceptable.

Rust implementation

I based the Rust implementation on examples in the Warp repo:

#[tokio::main]
async fn main() {
    let endpoints = api::routes();

    warp::serve(endpoints)
        .run(([127, 0, 0, 1], 3000))
        .await;
}

mod api {
    use super::filters;
    use super::handlers;
    use warp::Filter;

    pub fn routes() -> impl Filter<Extract = impl warp::Reply, Error = warp::Rejection> + Clone {
        transform_markdown()
    }

    pub fn transform_markdown() -> impl Filter<Extract = impl warp::Reply, Error = warp::Rejection> + Clone {
        warp::path!("markdown-to-html")
            .and(warp::post())
            .and(filters::json_body())
            .and_then(handlers::transform_markdown)
    }
}

mod filters {
    use super::models::{RequestBody};
    use warp::Filter;

    pub fn json_body() -> impl Filter<Extract = (RequestBody,), Error = warp::Rejection> + Clone {
        warp::body::content_length_limit(1024 * 16).and(warp::body::json())
    }
}

mod models {
    use serde_derive::{Deserialize, Serialize};

    #[derive(Serialize, Deserialize, Debug)]
    pub struct RequestBody {
        pub markdown: String
    }

    #[derive(Serialize, Deserialize, Debug)]
    pub struct ResponseBody {
        pub html: String
    }
}

mod handlers {
    use super::models::{RequestBody, ResponseBody};
    use std::convert::Infallible;
    use ammonia::clean;
    use pulldown_cmark::{Parser, Options, html::push_html};

    pub async fn transform_markdown(body: RequestBody) -> Result<impl warp::Reply, Infallible> {
        let options = Options::empty();
        let parsed_markdown = Parser::new_ext(&body.markdown, options);
        let mut unsafe_html = String::new();

        push_html(&mut unsafe_html, parsed_markdown);

        let safe_html = clean(&*unsafe_html);

        let response = ResponseBody {
            html: safe_html
        };
        
        Ok(warp::reply::json(&response))
    }
}

Dependencies include:

- tokio as the async runtime
- warp for the server and routing
- serde and serde_derive for JSON serialization and deserialization
- pulldown-cmark for the markdown-to-HTML transformation
- ammonia for sanitizing the output HTML
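A Cargo.toml sketch matching the snippet above — the version numbers here are illustrative assumptions, not pinned from the original project:

```
[dependencies]
tokio = { version = "1", features = ["full"] }
warp = "0.3"
serde = "1"
serde_derive = "1"
pulldown-cmark = "0.9"
ammonia = "3"
```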

I can't comment on the design tradeoffs since I'm not as familiar with the Rust ecosystem, but I can say that I found the crates well documented with useful examples. The Rust compiler was strict and offered helpful errors. Cargo did exactly what I wanted it to do, no more, no less.

Benchmarking

I used the ApacheBench tool to run benchmarks for each of the endpoints. Here's the bash script:

# Test concurrency with ApacheBench
# See https://httpd.apache.org/docs/2.4/programs/ab.html
 
# -p indicates POST
# -T sets Content-Type header
# -c is concurrent clients
# -n is the total number of requests

ab -p markdown.json -T application/json -c 20 -n 100  "http://127.0.0.1:3000/markdown-to-html"

And the JSON payload, which has the core Gruber markdown syntax:

{
  "markdown": "# An H1 header\n## An H2 header\n### An H3 Header\nSome **bold** and *italicized* text.\n> A block quote\n1. An\n2. Ordered\n3. List\n- An\n- Unordered\n- List\nSome `inline code`\n---\nA [link](https://tyhopp.com)\nAn ![image of a grapefruit](https://interactive-examples.mdn.mozilla.net/media/cc0-images/grapefruit-slice-332-332.jpg)"
}

The tool outputs a lot of information, but the requests per second metric stood out to me in particular:

That's a big difference.

Of course this is an imperfect test with many variables, but at the very least, it's delicious food for thought.

This performance outcome, combined with the pleasant Rust developer experience, is encouraging. Now on to the next experiment, web sockets in Rust!


Thanks for reading! Go home to see other notes.