Introducing fal Web Endpoints
Effortlessly serve your Python functions through a managed web server.
We are excited to introduce Web Endpoints. This feature allows you to serve your Python functions through a managed web server with just a few simple steps.
What are Web Endpoints?
fal-serverless Web Endpoints are an easy way to expose your isolated Python functions through a web server managed by fal-serverless. By marking a function with the @isolated decorator and using the fal-serverless CLI, you can effortlessly deploy your function and make it accessible via a REST API.
How to Serve a Function
To serve a function, follow these steps:
1. Mark the function with the @isolated decorator: use the serve=True option to indicate that the function should be served through a web server.
@isolated(serve=True)
def uppercase(text):
    return text.upper()
2. Use the fal-serverless CLI: run the fal-serverless function serve command, specifying the path to the file, the function's name, and an optional alias.
fal-serverless function serve ./path/to/file uppercase --alias uppercase
>> Registered a new revision for function 'uppercase' (revision='21847a72-93e6-4227-ae6f-56bf3a90142d').
>> URL: https://github_username-uppercase.gateway.alpha.fal.ai
After running the command, you will receive a unique revision ID and a URL to access the served function. The URL will either include the specified alias or the revision ID.
Accessing Served Functions via REST API
To call a served function, make a POST request to the generated URL, replacing <userid> and <alias> with the appropriate values, and include your fal key ID and key secret as headers. Here's an example cURL request:
curl -X POST "https://<userid>-<alias>.gateway.alpha.fal.ai" -H "Content-Type: application/json" -H "X-Fal-Key-Id:xxxx" -H "X-Fal-Key-Secret:xxxx" -d '{"str":"str to be returned"}'
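The same call can be made from application code. Here is a minimal sketch using Python's requests library (not part of fal-serverless); the URL, key values, and payload are the same placeholders used in the cURL example above:

import requests

url = "https://<userid>-<alias>.gateway.alpha.fal.ai"
headers = {
    "Content-Type": "application/json",
    "X-Fal-Key-Id": "xxxx",
    "X-Fal-Key-Secret": "xxxx",
}

# The JSON body carries the input for the served function
response = requests.post(url, headers=headers, json={"str": "str to be returned"})
print(response.json())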
Expose a Function Using a Python Web Framework
If you prefer using a Python web framework like Flask or FastAPI for more control over your function, you can do so by providing an exposed_port in the @isolated decorator. For example, here's a Flask app exposed on port 8080:
@isolated(requirements=["flask"], exposed_port=8080)
def flask_app():
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/", methods=["POST"])
    def call_str():
        # Echo back the "str" field of the posted JSON body
        return jsonify({"result": request.json["str"]})

    app.run(host="0.0.0.0", port=8080)
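FastAPI works the same way. The following is a minimal sketch using the same exposed_port approach, assuming fastapi and uvicorn are added to the requirements (the function name and route are illustrative):

@isolated(requirements=["fastapi", "uvicorn"], exposed_port=8080)
def fastapi_app():
    import uvicorn
    from fastapi import FastAPI

    app = FastAPI()

    @app.post("/")
    def echo(payload: dict):
        # Echo the posted JSON back to the caller
        return {"result": payload}

    # Bind to all interfaces on the exposed port
    uvicorn.run(app, host="0.0.0.0", port=8080)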
With Web Endpoints, deploying your Python functions has never been easier. Create your web endpoints today and experience the simplicity and power of fal-serverless!
How are we leveraging fal Web Endpoints internally?
We are already making use of fal Web Endpoints at Features and Labels. We have a Discord bot and a GitHub webhook that run on fal-serverless. Besides that, we have started collecting events about our service using Web Endpoints. We will go into more detail in another post, but here is a sneak peek to end with, in the form of a simplified example:
@isolated(requirements=["duckdb"], serve=True)
def save_event(event):
    import duckdb
    import json

    con = duckdb.connect("/data/duck.db")
    con.sql("CREATE TABLE IF NOT EXISTS events (j JSON);")
    # Parameterized insert avoids quoting issues in the JSON payload
    con.execute("INSERT INTO events VALUES (?);", [json.dumps(event)])
    return
Serve the function using the fal-serverless CLI:
fal-serverless function serve ./path/to/file save_event --alias save-event
>> Registered a new revision for function 'save_event' (revision='xxxx').
>> URL: https://<userid>-<alias>.gateway.alpha.fal.ai
Now the function is accessible through a web server and ready to receive events from a user onboarding pipeline or directly from an application.
curl -X POST "https://<userid>-<alias>.gateway.alpha.fal.ai" -H "Content-Type: application/json" -H "X-Fal-Key-Id:xxxx" -H "X-Fal-Key-Secret:xxxx" -d '{"event": "user_login", "user_id": 123}'
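And since the events land in a persistent DuckDB file under /data, another isolated function can read them back later. This is just an illustrative sketch, assuming the /data volume is shared across your isolated functions; read_events is a hypothetical helper, not part of the example above:

@isolated(requirements=["duckdb"])
def read_events(limit=10):
    import duckdb

    # Connect to the same database file the endpoint writes to
    con = duckdb.connect("/data/duck.db")
    # Each row stores one raw event payload as a JSON document
    return con.execute("SELECT j FROM events LIMIT ?;", [limit]).fetchall()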