
High Performance

Fast forward to today. Inference is moving closer to the source of the signal, where insights can inform control systems without requiring a slow, expensive, or unreliable connection to the cloud.
On Prem focuses on Lua, JavaScript, and WASM: three language runtimes with a proven track record of being embedded and deployed reliably over the air, and well suited to orchestrating performance-sensitive code.
Getting started is free. Just download the agent, log into the web console, provision an API key, and work through the examples at docs.on-prem.net.
If you're planning to operationalize a Python inference workload and don't know where to begin, you could consider a one-week professional services engagement to explore porting it to Rust+WASM.
If you're an equipment manufacturer interested in a white-label license of the On Prem cloud console and its API endpoint, to bring cloud-managed, over-the-air extensibility to your customers' device fleets, ask about our paid pilot programs.
If you perform device management at scale, you might consider licensing the On Prem control plane and integrating it into your own. Used with the On Prem agent, it provides Prometheus and Redfish middleware, and its Lambda and Lambda Trigger design patterns (a k8s-style robotic control loop) can implement BMC device drivers and Redfish action hooks, and attach reactive, performance-sensitive logic such as inference or downsampling to high-frequency signals right at their source.
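To make that last point concrete, here is a minimal Rust sketch of the kind of reactive downsampling logic such a lambda might attach to a high-frequency signal. The averaging window, the synthetic sample source, and the println hand-off are illustrative assumptions, not the On Prem API.

```rust
/// Illustrative only: allocation-free decimation-by-averaging, the sort of
/// reactive logic you might attach to a high-frequency signal at its source.
struct Downsampler {
    window: usize, // number of raw samples folded into one output sample
    acc: f64,      // running sum for the current window
    count: usize,  // samples accumulated so far
}

impl Downsampler {
    fn new(window: usize) -> Self {
        Self { window, acc: 0.0, count: 0 }
    }

    /// Feed one raw sample; returns Some(average) each time a window completes.
    fn push(&mut self, sample: f64) -> Option<f64> {
        self.acc += sample;
        self.count += 1;
        if self.count == self.window {
            let avg = self.acc / self.window as f64;
            self.acc = 0.0;
            self.count = 0;
            Some(avg)
        } else {
            None
        }
    }
}

fn main() {
    // Reduce a fast stream by averaging 440-sample windows (window size is arbitrary here).
    let mut ds = Downsampler::new(440);
    for i in 0..4400 {
        let raw = (i as f64 * 0.001).sin(); // stand-in for an A/D reading
        if let Some(avg) = ds.push(raw) {
            println!("downsampled value: {avg:.6}"); // e.g. forward upstream at a lower rate
        }
    }
}
```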
The Rust ecosystem at crates.io contains a rich assortment of high-performance, memory-safe libraries, ranging from ONNX and TensorFlow runtimes to image, video, and audio processing libraries. Rust compiles to WASM, and On Prem can then deploy the result over the air to your edge software agent.
This keeps your workloads lean and fast, so you can avoid those unmanageable 10 GB Python Docker images.
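As one concrete example of that path, the sketch below uses the tract-onnx crate, one option among several ONNX runtimes on crates.io and not something On Prem prescribes, to load and run an ONNX model in pure Rust. The model path and the 1x3x224x224 input shape are placeholders.

```rust
// A minimal sketch with the tract-onnx crate; "model.onnx" and the input
// shape are placeholder assumptions for illustration.
use tract_onnx::prelude::*;

fn main() -> TractResult<()> {
    // Load the ONNX model, pin its input shape, and optimize it for execution.
    let model = tract_onnx::onnx()
        .model_for_path("model.onnx")?
        .with_input_fact(0, InferenceFact::dt_shape(f32::datum_type(), tvec!(1, 3, 224, 224)))?
        .into_optimized()?
        .into_runnable()?;

    // Dummy input standing in for a preprocessed image or sensor frame.
    let input: Tensor = tract_ndarray::Array4::<f32>::zeros((1, 3, 224, 224)).into();

    // Run inference and report the shape of the first output tensor.
    let outputs = model.run(tvec!(input.into()))?;
    println!("output shape: {:?}", outputs[0].shape());
    Ok(())
}
```

Since tract is written in pure Rust, code like this can in principle be compiled for a WASM target and shipped through the over-the-air path described above.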
Lambdas can be developed interactively via the cloud console and targeted to run on a software agent on your laptop, or on specialty equipment in your test lab designated as a development environment.
Assets can then be exported as script files plus JSON or YAML metadata, and checked into Git.
Continuous Integration tests can rebuild your environment from scratch for each merge request, using our CLI to import your assets.
I know of a Go-based software agent that tried to keep up with a 4-channel 24-bit A/D converter and topped out at around 22,000 samples/sec (almost 20x too slow). That's when I learned that low-latency signal processing is at odds with garbage-collected languages.
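For contrast, here is a minimal Rust sketch of an allocation-free ingest path: a ring buffer allocated once up front, so the per-sample hot loop never touches the allocator and there is no garbage collector to pause it. The 440,000 samples/sec figure is simply 20x the rate quoted above and is used only for sizing.

```rust
// Illustrative only: a fixed-capacity ring buffer, allocated once, so ingesting
// samples is O(1) with no allocation in the hot loop. The 24-bit readings are
// assumed to arrive already sign-extended into i32 values.
struct RingBuffer {
    buf: Vec<i32>,
    head: usize, // next write position
    len: usize,  // number of valid samples
}

impl RingBuffer {
    fn with_capacity(cap: usize) -> Self {
        Self { buf: vec![0; cap], head: 0, len: 0 }
    }

    /// Overwrite the oldest sample when full; never allocates.
    fn push(&mut self, sample: i32) {
        self.buf[self.head] = sample;
        self.head = (self.head + 1) % self.buf.len();
        self.len = (self.len + 1).min(self.buf.len());
    }
}

fn main() {
    // Room for one second at a hypothetical 440,000 samples/sec aggregate rate.
    let mut ring = RingBuffer::with_capacity(440_000);
    for i in 0..440_000i32 {
        ring.push(i); // stand-in for an A/D reading
    }
    println!("buffered {} samples with zero allocations in the hot loop", ring.len);
}
```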
Rust provides the agility to execute projects swiftly, delivers performance in the ballpark of C++, and offers best-in-class memory safety.
