Vendor-Free Flexibility for Modern APIs
A minimal TypeScript framework for building web APIs as functions, deployable to serverless platforms or run locally with ease
The Pitch
This isn't another server, but rather a normalization layer on top of the many serverless and server frameworks available today. You define all your code as functions and can switch between running them in different environments with ease.
Explore New Environments
Stop wondering if the grass is just greener on the other side. Deploy your services to AWS Lambda, Cloudflare Workers, and other serverless platforms.
Improve Reliability
Deploy the same logic in multiple environments to ensure your services are always available.
Enterprise and Cloud Co-existence
Why choose between serverless and server-based deployments? Vramework lets you have the best of both worlds.
Reduce Costs and Your Carbon Footprint
Easily switch between serverless and server-based deployments to optimize costs and performance, potentially helping make the world a bit greener.
Code Examples
The examples below show the same function-first approach across three transports:
- HTTP routes
- Scheduled tasks
- WebSocket channels
// The HTTP Function
import { APIFunction } from './vramework/types'

const getTodo: APIFunction<{ todoId: string }, Todo> = async (
  services,
  data,
  userSession
) => {
  // getTodoFromDatabase is a placeholder helper, shown only to keep the example readable
  return await getTodoFromDatabase(services.database, data.todoId)
}
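The Todo type and the getTodoFromDatabase helper referenced above are not part of Vramework. A minimal sketch of what they could look like (all names and the database shape are illustrative assumptions):

// Illustrative only: a plain domain type and a placeholder database lookup.
interface Todo {
  todoId: string
  text: string
  createdBy: string
  expiresAt?: Date
}

// Hypothetical helper; a real service would call your actual database client here.
const getTodoFromDatabase = async (
  database: { query: (sql: string, params: unknown[]) => Promise<Todo[]> },
  todoId: string
): Promise<Todo> => {
  const [todo] = await database.query('SELECT * FROM todos WHERE todo_id = $1', [todoId])
  return todo
}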
// The HTTP Wiring
import { addRoute } from '@vramework/core'

addRoute({
  method: 'get',
  route: '/todo/:todoId',
  func: getTodo,
  permissions: {
    isTodoCreator: [isTodoCreator, withinAPILimits],
    isAdmin
  },
  auth: true,
  docs: {
    errors: [NotFoundError],
    description: 'Retrieves a todo',
    tags: ['todos']
  }
})
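Once deployed, the route behaves like any other HTTP endpoint regardless of the runtime. A minimal calling sketch, assuming a placeholder host; how the session is transported depends on your session service, so the bearer header here is only an assumption:

// Hypothetical client call; the base URL and auth header are placeholders.
const response = await fetch('https://api.example.com/todo/abc-123', {
  headers: { Authorization: 'Bearer <session-token>' },
})
const todo = (await response.json()) as Todo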
// Scheduled Task Function
import { APIFunctionSessionless } from './vramework/types'

const expireTodos: APIFunctionSessionless<void, void> = async (services) => {
  services.logger.info('Expiring all todos')
}
// Scheduled Task Wiring
import { addScheduledTask } from '@vramework/core'

addScheduledTask({
  name: 'expireTodos',
  schedule: '*/1 * * * *',
  func: expireTodos,
  docs: {
    tags: ['todos'],
  },
})
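The schedule field uses standard five-field cron syntax (minute, hour, day of month, month, day of week): '*/1 * * * *' fires once every minute, while something like '0 3 * * 1' would run at 03:00 every Monday.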
// The Channel Functions
import {
  ChannelConnection,
  ChannelDisconnection,
  ChannelMessage,
} from './vramework/types'

const onConnect: ChannelConnection<'hello!'> = async (services, channel) => {
  services.logger.info('A channel has been established')
  channel.send('hello!')
}

const onDisconnect: ChannelDisconnection = async (services, channel) => {
  services.logger.info('A channel has been ended')
}

export const authenticate: ChannelMessage<
  { token: string; userId: string },
  { authResult: boolean; action: 'auth' }
> = async (services, channel, data) => {
  const authResult = data.token === 'valid'
  if (authResult) {
    await channel.setUserSession({ userId: data.userId })
  }
  return { authResult, action: 'auth' }
}

const onMessage: ChannelMessage<'hello', 'hey'> = async (services, channel) => {
  services.logger.info('Received a generic hello message')
  channel.send('hey')
}
// The Channel Wiring
import { addChannel } from '@vramework/core'

addChannel({
  name: 'events',
  route: '/',
  onConnect,
  onDisconnect,
  auth: true,
  onMessage,
  onMessageRoute: {
    action: {
      auth: {
        func: authenticate,
        auth: false,
      },
    },
  },
})
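On the client side the channel is a plain WebSocket. Based on the onMessageRoute configuration above, messages carrying an action field would be dispatched to the matching handler, while everything else falls through to the generic onMessage handler. A hedged sketch; the URL is a placeholder and the exact message framing is an assumption:

// Hypothetical client; the URL depends on where the 'events' channel is deployed.
const socket = new WebSocket('wss://api.example.com/')

socket.addEventListener('open', () => {
  // Presumably routed via onMessageRoute: action 'auth' invokes authenticate (auth: false).
  socket.send(JSON.stringify({ action: 'auth', token: 'valid', userId: 'user-1' }))
  // No matching route key, so this should reach the generic onMessage handler.
  socket.send('hello')
})

socket.addEventListener('message', (event) => {
  console.log('Received from channel:', event.data)
})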
Supported Deployment Options
The following deployment options are currently supported by Vramework:
| Type | Service | HTTP | WebSocket | Cron |
| --- | --- | --- | --- | --- |
| Serverless | AWS | ✅ | ✅ | ✅ |
| Serverless | Cloudflare | ✅ | ⚠️ | ✅ |
| Server | uWS | ✅ | ✅ | ✅ |
| Server | WS | ❌ | ✅ | ✅ |
| Server | Express | ✅ | ❌ | ✅ |
| Server | Fastify | ✅ | ❌ | ✅ |
Legend: ✅ Supported | ⚠️ Beta Support | ❌ Not Supported
Questions & Answers
How production ready is this?
Right now we would suggest using it for all server deployments and for HTTP serverless deployments. Help with testing the serverless options would be much appreciated!
How can I provide feedback?
Right now, GitHub Discussions is the best place. We'll be setting up a Discord server in early 2025.
How can I contribute?
Everywhere! We are looking for help with example demos, testing the current starter workspaces, adding more deployment options, and expanding test coverage. It's pretty endless! Feel free to reach out via yasser.fadl@vlandor.com if you want some pointers.
How much CPU overhead does this add on top?
We don't have a performance testing framework in place yet, but performance is a top priority. The main overhead comes from serverless being async by nature, so where possible we provide local and serverless adaptors to minimize that penalty.
How big are the deployed bundle sizes?
The serverless bundle size of the workspace-starter is around 350 KB, including database and schema libraries. Lazy loading and tree-shaking only the required services is on the roadmap.
What about micro service deployment?
This is currently unofficially supported, but it requires manual work to avoid bundling everything. In the future, the CLI will be enhanced to import only the desired functions.
Get Involved with Vramework
Start building with Vramework, explore our insights, or join the community to help shape the future of function-driven frameworks.