Circadify

Hardware Integration FAQ

Common questions about embedding rPPG vitals into your devices

Frequently Asked Questions

Which hardware platforms does the engine support?

The Circadify engine runs on ARM and x86 architectures, covering Android tablets, Linux-based kiosks, embedded single-board computers, and custom device platforms. If your device has a camera and a processor capable of real-time video processing, it is likely compatible.

What camera hardware is required?

The engine works with standard USB, MIPI CSI, and integrated camera modules. A resolution of at least 640x480 at 15 fps is recommended; higher resolutions and frame rates improve signal quality. No proprietary sensors, IR illuminators, or specialized optics are needed.
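A pre-flight check against the recommended camera baseline can be sketched as follows. The 640x480 at 15 fps thresholds come from the answer above; the function and dataclass names are illustrative, not part of the Circadify SDK.

```python
# Pre-flight check of a camera mode against Circadify's recommended
# baseline (640x480 at 15 fps). CameraMode and meets_baseline are
# hypothetical helper names, not SDK API.
from dataclasses import dataclass

MIN_WIDTH, MIN_HEIGHT, MIN_FPS = 640, 480, 15

@dataclass
class CameraMode:
    width: int
    height: int
    fps: float

def meets_baseline(mode: CameraMode) -> bool:
    """Return True if the camera mode meets the recommended minimum."""
    return (mode.width >= MIN_WIDTH
            and mode.height >= MIN_HEIGHT
            and mode.fps >= MIN_FPS)

print(meets_baseline(CameraMode(1280, 720, 30)))  # True
print(meets_baseline(CameraMode(320, 240, 30)))   # False
```

Higher modes pass the check; signal quality generally improves with them, so prefer the best mode the device can sustain in real time.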

Does the engine require cloud connectivity?

No. All rPPG signal processing runs on-device at the edge. The engine operates fully offline, which makes it suitable for air-gapped clinical environments, locations with unreliable connectivity, and deployments where data must not leave the local network.

What vitals does the engine output?

The engine outputs heart rate, heart rate variability (HRV) with time-domain and frequency-domain metrics, and stress indicators. Data is delivered as structured JSON payloads that your application layer can route to kiosk UIs, triage workflows, EHR systems, or analytics pipelines.
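Consuming the JSON output might look like the sketch below. The field names (heart_rate_bpm, rmssd_ms, and so on) are assumptions for illustration, not the documented schema; consult the API documentation in the integration kit for the actual payload format.

```python
import json

# Illustrative vitals payload; field names are assumptions, not the
# documented Circadify schema.
payload = """{
  "heart_rate_bpm": 72,
  "hrv": {"rmssd_ms": 42.5, "lf_hf_ratio": 1.8},
  "stress_index": 0.31
}"""

vitals = json.loads(payload)

# Route the computed vitals wherever your application needs them,
# e.g. flag elevated heart rate for a triage workflow.
if vitals["heart_rate_bpm"] > 100:
    print("flag for triage review")
print(f"RMSSD: {vitals['hrv']['rmssd_ms']} ms")  # prints "RMSSD: 42.5 ms"
```

Because the payload is plain JSON, the same output can feed a kiosk UI, an EHR interface, or an analytics pipeline without format conversion.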

Is facial video stored or transmitted?

No. Video frames are processed in-memory and immediately discarded. The engine never stores facial video or transmits data externally; only the computed vitals output is made available to your application layer. This architecture supports data residency requirements and patient privacy regulations.

How do I get started?

Request the hardware integration kit, which includes the embedded engine, API documentation, and sample implementations. Connect the engine to your camera pipeline, configure it for your deployment environment, and route the vitals output to your application. Most teams have a working prototype within days.
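The camera-to-vitals flow described above can be sketched with a stand-in engine. CircadifyEngine, process_frame, and the returned vitals dictionary are hypothetical placeholders for the real SDK shipped in the integration kit; the sketch only illustrates the shape of the loop: feed frames in, receive a vitals payload once enough signal has accumulated.

```python
# Sketch of the integration loop, assuming a hypothetical on-device
# engine API. The real interface ships with the integration kit.
from typing import Optional

class CircadifyEngine:
    """Stand-in for the on-device rPPG engine (hypothetical API)."""

    def __init__(self, frames_needed: int = 150):
        # e.g. ~10 seconds of video at 15 fps before a reading is ready
        self.frames_needed = frames_needed
        self._count = 0

    def process_frame(self, frame: bytes) -> Optional[dict]:
        # Frames are consumed in-memory and discarded; nothing is
        # stored or transmitted, matching the privacy model above.
        self._count += 1
        if self._count >= self.frames_needed:
            self._count = 0
            return {"heart_rate_bpm": 71, "stress_index": 0.3}  # placeholder
        return None

engine = CircadifyEngine(frames_needed=3)
for frame in (b"frame1", b"frame2", b"frame3"):  # stand-in camera pipeline
    vitals = engine.process_frame(frame)
    if vitals is not None:
        print(vitals)  # route to kiosk UI, EHR, or analytics pipeline
```

The application layer owns everything after the vitals payload is emitted, which is why the same engine can sit behind a kiosk, a triage station, or a custom device.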

More Questions?

Our integrations team is ready to help you embed rPPG vitals into your hardware platform.

Get Integration Guide