An example project showing how to build a spec-compliant wasi:http/proxy server for WASI 0.2, written in Rust. The sample includes several routes that showcase different behaviors and can be run by any spec-compliant wasi:http/proxy host.
Each release of this sample is packaged up as a Wasm OCI image and published to the GitHub Packages Registry. See the "Deploying published artifacts" section for more on how to fetch and run the published artifacts.
The following HTTP routes are available from the component:
/               # Hello world
/wait           # Sleep for one second
/echo           # Echo the HTTP body
/echo-headers   # Echo the HTTP headers
/echo-trailers  # Echo the HTTP trailers
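For orientation, here is a minimal sketch of how a wasi:http/proxy component with this kind of routing can be written in Rust. It is not the code from this repository: it assumes the low-level bindings from the wasi crate (the sample may use a higher-level helper instead), and the Server type, the subset of routes, and the unwrap-based error handling are illustrative only.

use wasi::http::types::{
    Fields, IncomingRequest, OutgoingBody, OutgoingResponse, ResponseOutparam,
};

struct Server;

impl wasi::exports::http::incoming_handler::Guest for Server {
    fn handle(request: IncomingRequest, response_out: ResponseOutparam) {
        // Dispatch on the request path; unknown paths get a 404.
        let path = request.path_with_query().unwrap_or_default();
        let (status, body_bytes): (u16, Vec<u8>) = match path.as_str() {
            "/" => (200, b"Hello, world!\n".to_vec()),
            "/echo-headers" => {
                // Echo the request headers back in the response body.
                let mut out = Vec::new();
                for (name, value) in request.headers().entries() {
                    out.extend_from_slice(name.as_bytes());
                    out.extend_from_slice(b": ");
                    out.extend_from_slice(&value);
                    out.push(b'\n');
                }
                (200, out)
            }
            _ => (404, b"Not found\n".to_vec()),
        };

        // Build the outgoing response, hand it to the host, then stream the body.
        let response = OutgoingResponse::new(Fields::new());
        response.set_status_code(status).unwrap();
        let body = response.body().unwrap();
        ResponseOutparam::set(response_out, Ok(response));

        let stream = body.write().unwrap();
        stream.blocking_write_and_flush(&body_bytes).unwrap();
        drop(stream);
        OutgoingBody::finish(body, None).unwrap();
    }
}

// Export the component so any wasi:http/proxy host can invoke the handler.
wasi::http::proxy::export!(Server);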
The easiest way to try this project is by opening it in a GitHub Codespace. This will create a VS Code instance with all dependencies installed. If instead you would prefer to run this locally, you can run the following commands:
$ curl https://wasmtime.dev/install.sh -sSf | bash # install wasm runtime
$ cargo install wkg # install wasm OCI tooling

This project uses wkg to manage WIT dependencies. To fetch the required WIT packages, run:
$ wkg wit fetch

This will download the WIT dependencies specified in the project and populate the wit/deps directory. The wkg.lock file tracks the resolved versions.
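For context, the fetched WIT is what the Rust bindings are generated from at build time. The following is a minimal sketch that assumes the project generates bindings with wit-bindgen directly; the real project may instead use a crate that ships pregenerated WASI 0.2 bindings, and the world selection shown here is only an example.

// Build-time binding generation from the WIT fetched into wit/ (sketch only).
wit_bindgen::generate!({
    // Fully-qualified world to target; a project will often define its own
    // world in wit/ that includes wasi:http/proxy instead.
    world: "wasi:http/proxy",
    // Resolve the world and its dependencies from the directory populated
    // by `wkg wit fetch`.
    path: "wit",
});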
The HTTP server uses the wasi:http/proxy world. You can build and run it
locally using cargo and wasmtime:
$ cargo build --release --target wasm32-wasip2 # build the component
$ wasmtime serve -Scli -Shttp target/wasm32-wasip2/release/sample_wasi_http_rust.wasm

Launch and task configuration files are included if you want to debug in VS Code; if you prefer using GDB or LLDB directly, those configuration files should still be enough to get you up and running. Note that the GDB configuration requires an absolute path, so you will need to modify it for your machine.
This project automatically publishes compiled Wasm components as OCI artifacts to the GitHub Packages Registry. You can pull the artifact with any OCI-compliant tooling and run it in any Wasm runtime that supports the wasi:http/proxy world. To fetch the latest published version using wkg and run it in a local wasmtime instance, run the following commands:
$ wkg oci pull ghcr.io/bytecodealliance/sample-wasi-http-rust/sample-wasi-http-rust:latest
$ wasmtime serve -Scli -Shttp sample-wasi-http-rust.wasm

For production workloads, however, you may want to use other runtimes or platforms that provide their own OCI integrations. Deployment will vary depending on your provider, though at its core it will usually be a variation on the pull-and-serve pattern shown here.
Apache-2.0 with LLVM Exception