HTTP

oRPC includes built-in HTTP support, making it easy to expose RPC endpoints in any environment that speaks HTTP.

Server Adapters

Adapter      Target
fetch        MDN Fetch API (Browser, Bun, Deno, Cloudflare Workers, etc.)
node         Node.js built-in http/http2
fastify      Fastify
aws-lambda   AWS Lambda
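The examples below all assume a `router` is already defined elsewhere. A minimal sketch, using a placeholder `ping` procedure and a Zod input schema purely for illustration:

ts
import { os } from '@orpc/server'
import { z } from 'zod'

// Placeholder procedure, only here to make the adapter examples self-contained
const ping = os
  .input(z.object({ name: z.string() }))
  .handler(async ({ input }) => `pong, ${input.name}`)

export const router = { ping }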
ts (Node.js)
import { createServer } from 'node:http' // or 'node:http2'
import { RPCHandler } from '@orpc/server/node'
import { CORSPlugin } from '@orpc/server/plugins'
import { onError } from '@orpc/server'

const handler = new RPCHandler(router, {
  plugins: [
    new CORSPlugin()
  ],
  interceptors: [
    onError((error) => {
      console.error(error)
    }),
  ],
})

const server = createServer(async (req, res) => {
  const { matched } = await handler.handle(req, res, {
    prefix: '/rpc',
    context: {} // Provide initial context if needed
  })

  if (matched) {
    return
  }

  res.statusCode = 404
  res.end('Not found')
})

server.listen(3000, '127.0.0.1', () => console.log('Listening on 127.0.0.1:3000'))
ts (Bun)
import { RPCHandler } from '@orpc/server/fetch'
import { CORSPlugin } from '@orpc/server/plugins'
import { onError } from '@orpc/server'

const handler = new RPCHandler(router, {
  plugins: [
    new CORSPlugin()
  ],
  interceptors: [
    onError((error) => {
      console.error(error)
    }),
  ],
})

Bun.serve({
  async fetch(request: Request) {
    const { matched, response } = await handler.handle(request, {
      prefix: '/rpc',
      context: {} // Provide initial context if needed
    })

    if (matched) {
      return response
    }

    return new Response('Not found', { status: 404 })
  }
})
ts (Cloudflare Workers)
import { RPCHandler } from '@orpc/server/fetch'
import { CORSPlugin } from '@orpc/server/plugins'
import { onError } from '@orpc/server'

const handler = new RPCHandler(router, {
  plugins: [
    new CORSPlugin()
  ],
  interceptors: [
    onError((error) => {
      console.error(error)
    }),
  ],
})

export default {
  async fetch(request: Request, env: any, ctx: ExecutionContext): Promise<Response> {
    const { matched, response } = await handler.handle(request, {
      prefix: '/rpc',
      context: {} // Provide initial context if needed
    })

    if (matched) {
      return response
    }

    return new Response('Not found', { status: 404 })
  }
}
ts (Deno)
import { RPCHandler } from '@orpc/server/fetch'
import { CORSPlugin } from '@orpc/server/plugins'
import { onError } from '@orpc/server'

const handler = new RPCHandler(router, {
  plugins: [
    new CORSPlugin()
  ],
  interceptors: [
    onError((error) => {
      console.error(error)
    }),
  ],
})

Deno.serve(async (request) => {
  const { matched, response } = await handler.handle(request, {
    prefix: '/rpc',
    context: {} // Provide initial context if needed
  })

  if (matched) {
    return response
  }

  return new Response('Not found', { status: 404 })
})
ts (Fastify)
import Fastify from 'fastify'
import { RPCHandler } from '@orpc/server/fastify'
import { onError } from '@orpc/server'

const rpcHandler = new RPCHandler(router, {
  interceptors: [
    onError((error) => {
      console.error(error)
    }),
  ],
})

const fastify = Fastify()

fastify.addContentTypeParser('*', (request, payload, done) => {
  // Accept any content type and pass `undefined` as the body
  // so oRPC can parse the request body itself
  done(null, undefined)
})

fastify.all('/rpc/*', async (req, reply) => {
  const { matched } = await rpcHandler.handle(req, reply, {
    prefix: '/rpc',
  })

  if (!matched) {
    reply.status(404).send('Not found')
  }
})

fastify.listen({ port: 3000 }).then(() => console.log('Listening on 127.0.0.1:3000'))
ts (AWS Lambda)
import { APIGatewayProxyEventV2 } from 'aws-lambda'
import { RPCHandler } from '@orpc/server/aws-lambda'
import { onError } from '@orpc/server'

const rpcHandler = new RPCHandler(router, {
  interceptors: [
    onError((error) => {
      console.error(error)
    }),
  ],
})

/**
 * oRPC only supports [AWS Lambda response streaming](https://aws.amazon.com/blogs/compute/introducing-aws-lambda-response-streaming/).
 * If you need to support chunked responses, use a combination of Hono's `aws-lambda` adapter and oRPC.
 */
export const handler = awslambda.streamifyResponse<APIGatewayProxyEventV2>(async (event, responseStream, context) => {
  const { matched } = await rpcHandler.handle(event, responseStream, {
    prefix: '/rpc',
    context: {} // Provide initial context if needed
  })

  if (matched) {
    return
  }

  responseStream = awslambda.HttpResponseStream.from(responseStream, {
    statusCode: 404,
  })

  responseStream.write('Not found')
  responseStream.end()
})

INFO

The handler can be any supported oRPC handler, such as RPCHandler, OpenAPIHandler, or a custom handler.
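For example, a sketch of the fetch adapter using OpenAPIHandler instead (assuming the @orpc/openapi package is installed; the plugins and router are used exactly as in the RPCHandler examples above):

ts
import { OpenAPIHandler } from '@orpc/openapi/fetch'
import { CORSPlugin } from '@orpc/server/plugins'

// Same router as before, served over OpenAPI-style HTTP requests instead of the RPC protocol
const handler = new OpenAPIHandler(router, {
  plugins: [
    new CORSPlugin()
  ],
})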

Client Adapters

Adapter      Target
fetch        MDN Fetch API (Browser, Node, Bun, Deno, Cloudflare Workers, etc.)
ts
import { RPCLink } from '@orpc/client/fetch'

const link = new RPCLink({
  url: 'http://localhost:3000/rpc',
  headers: () => ({
    'x-api-key': 'my-api-key'
  }),
  // fetch: <-- polyfill fetch if needed
})

INFO

The link can be any supported oRPC link, such as RPCLink, OpenAPILink, or a custom link.

INFO

This only shows how to configure the HTTP link. For full client examples, see Client-Side Clients.
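As a rough sketch of how a configured link is consumed, assuming createORPCClient from @orpc/client and access to the server router type (the ping procedure is the placeholder from earlier):

ts
import { createORPCClient } from '@orpc/client'
import type { RouterClient } from '@orpc/server'

// Wrap the link in a fully typed client
const client: RouterClient<typeof router> = createORPCClient(link)

// Placeholder call, assuming the router exposes a `ping` procedure
const pong = await client.ping({ name: 'orpc' })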

Event Iterator Options

HTTP adapters provide reliability features for streaming Event Iterators:

ts
const handler = new RPCHandler(router, {
  eventIteratorInitialCommentEnabled: true,
  eventIteratorInitialComment: 'start',
  eventIteratorKeepAliveEnabled: true,
  eventIteratorKeepAliveInterval: 5000,
  eventIteratorKeepAliveComment: '',
})

// `contract` and `url` are placeholders: OpenAPILink also needs a contract router and an endpoint URL
const link = new OpenAPILink(contract, {
  url: 'http://localhost:3000/api',
  eventIteratorInitialCommentEnabled: true,
  eventIteratorInitialComment: 'start',
  eventIteratorKeepAliveEnabled: true,
  eventIteratorKeepAliveInterval: 5000, // milliseconds
  eventIteratorKeepAliveComment: '',
})
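These options only matter for procedures that actually stream. A minimal sketch of such a procedure, assuming the usual os builder with an async generator handler (the clock procedure is a placeholder):

ts
import { os } from '@orpc/server'

// Placeholder streaming procedure: each `yield` becomes an event on the wire,
// with the initial/keep-alive comments above interleaved by the HTTP adapter
const clock = os.handler(async function* () {
  while (true) {
    yield { time: new Date().toISOString() }
    await new Promise(resolve => setTimeout(resolve, 1000))
  }
})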

INFO

These options are available for HTTP-based handlers and links only.

WARNING

Link options apply when the client streams an Event Iterator to the server (for example, as procedure input), not when the server streams to the client; that direction is controlled by the handler options. In most cases, you don't need to configure these on the link.

Initial Comment

Sends an initial comment immediately when the stream starts so that response headers are flushed early, allowing the receiving side to establish the connection without waiting for the first event.

Option                               Default   Description
eventIteratorInitialCommentEnabled   true      Enable/disable the initial comment
eventIteratorInitialComment          ''        Custom comment content

Keep-Alive Comments

Sends periodic comments during inactivity to prevent connection timeouts.

Option                               Default   Description
eventIteratorKeepAliveEnabled        true      Enable/disable keep-alive comments
eventIteratorKeepAliveInterval       5000      Interval in milliseconds
eventIteratorKeepAliveComment        ''        Custom comment content
