
High Performance

Fast forward to today. Inference is moving closer to the source of the signal, where insights can inform control systems without requiring a slow, expensive, or unreliable connection to the cloud.
On Prem focuses on Lua, JavaScript, and WASM: three language runtimes with a proven track record of deploying reliably over the air or embedding into devices, and which are well suited to orchestrating performance-sensitive code.
I know of a Go-based software agent that tried to keep up with a 4-channel 24-bit A/D converter and topped out at around 22,000 samples/sec (almost 20x too slow). That's when I learned that low-latency signal processing is at odds with garbage-collected languages.
Rust provides the agility required to execute projects swiftly, delivering performance in the ballpark of C++ along with best-in-class memory safety.
Lambdas can be developed interactively via the cloud console and targeted at a software agent on your laptop, or at specialty equipment in your test lab designated as a development environment.
Assets can then be exported as script files plus JSON or YAML metadata, and checked into Git.
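As an illustration, an exported asset's metadata might look something like the following. The field names and layout here are hypothetical, invented for this sketch rather than taken from On Prem's actual export schema:

```yaml
# Hypothetical asset metadata -- field names are illustrative only,
# not On Prem's real export format.
lambda:
  name: vibration-monitor
  runtime: wasm
  entrypoint: vibration_monitor.wasm
  version: 0.1.0
  targets:
    - lab-bench-01    # agent designated as a dev environment
```

Because the script and its metadata are plain files, they diff cleanly in Git like any other source code.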
Continuous Integration tests can rebuild your environment from scratch for each merge request, using our CLI to import your assets.
Getting started is free. Just download the agent, log into the web console, and provision an API Key.
If you're planning to operationalize some Python code, then you might consider a one-week engagement to explore porting it to Rust+WASM.
If you're an equipment manufacturer interested in a white-label license of the On Prem cloud console, ask about our paid pilot programs.
The Rust ecosystem at crates.io contains a rich assortment of high-performance and memory-safe libraries, ranging from Onnx and Tensorflow runtimes, to image, video, and audio processing libraries. Rust compiles to WASM, and then On Prem is able to deploy it over the air to your edge software agent.
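To make that concrete, here is a minimal sketch of the kind of kernel you might write in Rust and compile for a WASM target such as `wasm32-unknown-unknown` (the function itself is illustrative, not part of On Prem's API):

```rust
/// Root-mean-square of a block of samples -- a typical
/// performance-sensitive DSP kernel. No allocation in the hot path,
/// and no garbage-collector pauses.
/// (Illustrative example; not an On Prem API.)
pub fn rms(samples: &[f32]) -> f32 {
    if samples.is_empty() {
        return 0.0;
    }
    // Sum of squares, then normalize by the sample count.
    let sum_sq: f32 = samples.iter().map(|x| x * x).sum();
    (sum_sq / samples.len() as f32).sqrt()
}

fn main() {
    // Two samples: sqrt((3^2 + 4^2) / 2) = sqrt(12.5)
    println!("{}", rms(&[3.0, 4.0]));
}
```

The same crate builds natively for unit testing on your workstation and cross-compiles to WASM for over-the-air deployment, so the code you test is the code you ship.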
This keeps your workloads lean and fast, so you can avoid shipping unwieldy 10 GB Python Docker images.
