Perfect vision. Clear sight.
HORUS (Holistic Operational Reality Unified Spatiality) is LupoTek’s next-generation mixed-reality platform, developed at the intersection of neuroergonomics, high-fidelity visual computing, and adaptive human–machine collaboration. Unlike traditional VR or AR systems that specialise in isolated modes, HORUS is engineered as an integrated, dynamic MR environment capable of transitioning seamlessly between full immersion, augmented overlays, and transparent reality-augmented perception.
At its core, HORUS operates as a situational cognition amplifier. It merges 6K-per-eye micro-OLED displays (480fps), full-spectrum sensor ingestion, environmental reconstruction, and real-time Companion-Intelligence (CI) feedback loops to support human performance in high-complexity operational settings. Its architecture is designed for dual-use adaptability, spanning drone oversight, remote piloting, and atmospheric and spatial analysis through to cockpit-embedded pilot support systems within LupoTek’s advanced flight programs.
The platform occupies the space where perception, computation, and embodied control converge, representing LupoTek’s broad push toward synthesizing rich information streams into instantly actionable human insight.
Visual System & Display Fidelity
HORUS is centered on a visual system engineered for extreme fidelity, latency minimisation, and perceptual stability. Each eye receives up to 6,000 horizontal pixels delivered through micro-OLED panels operating at 480 frames per second - a generational leap beyond conventional head-mounted displays.
Micro-OLED Panel Characteristics
Pixel pitch in the low micrometer range, enabling sub-arcminute visual precision
Near-zero persistence illumination at high refresh rates, reducing motion blur and vergence instability
HDR luminance control for high-contrast environments
Optical waveguides tuned to minimise chromatic aberration and pupil swim
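As a rough check on the sub-arcminute figure, the average angular size of one pixel follows directly from panel width and field of view. The 90-degree horizontal field of view in this sketch is an assumed nominal value, not a published HORUS specification:

```python
# Illustrative calculation: average angular width of one display pixel.
# The 90-degree horizontal FOV is an assumed nominal value, not a
# published HORUS specification.

def arcmin_per_pixel(fov_deg: float, pixels: int) -> float:
    """Average angular width of one pixel, in arcminutes."""
    return fov_deg * 60.0 / pixels

print(f"{arcmin_per_pixel(90.0, 6000):.2f} arcmin per pixel")  # 0.90 arcmin per pixel
```

Narrower fields of view push the per-pixel angle further below one arcminute; wider ones trade precision for coverage.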
The optical stack integrates aspherical hybrid lenses with software-driven distortion compensation, enabling a full MR field of view with uniform clarity. HORUS maintains photorealism not through graphic embellishment, but through physically plausible rendering pipelines, material-accurate shading, and time-synchronised exposure control based on eye-tracking telemetry.
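Software-driven distortion compensation of this kind is commonly modelled with a radial polynomial; the sketch below uses the standard Brown-Conrady form with hypothetical coefficients, not HORUS calibration data:

```python
# Sketch of radial lens-distortion compensation (Brown-Conrady model).
# Coefficients k1, k2 are hypothetical placeholders, not HORUS
# calibration values.

def undistort(x: float, y: float, k1: float = -0.12, k2: float = 0.03):
    """Apply a radial polynomial rescaling to a normalised image coordinate."""
    r2 = x * x + y * y
    scale = 1.0 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

print(undistort(0.0, 0.0))  # (0.0, 0.0): the optical centre is unaffected
```

In practice the coefficients come from the per-unit calibration pass, and the correction is folded into the render pipeline rather than applied point by point.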
Because the human perceptual system is highly sensitive to micro-latency in head-coupled displays, HORUS uses a closed-loop feedback cycle operating at sub-10-millisecond motion-to-photon latency. This allows for stable operation during high-acceleration movement in both ground-based and cockpit-integrated applications.
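A sub-10-millisecond motion-to-photon budget typically relies on pose prediction: the renderer extrapolates head motion forward by the expected pipeline delay. A minimal sketch, with illustrative numbers only:

```python
# Sketch of latency-compensating pose prediction: extrapolate head yaw
# forward by the motion-to-photon delay using the latest angular rate.
# All numbers are illustrative, not measured HORUS values.

def predict_yaw(yaw_deg: float, yaw_rate_dps: float, latency_s: float) -> float:
    """Dead-reckon yaw forward across the render-to-display delay."""
    return yaw_deg + yaw_rate_dps * latency_s

# At 200 deg/s head rotation, an 8 ms pipeline renders 1.6 degrees ahead:
print(predict_yaw(0.0, 200.0, 0.008))  # 1.6
```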
Cross-Platform Operational Integration
HORUS was designed to serve as a unified interface layer across LupoTek’s robotics, drone, and vehicle research programs, including the Lítillljós subsonic VTOL platform. Instead of switching between discrete software tools or screens, HORUS allows operators to transition fluidly between task domains:
Drone piloting using real-time 4K/120fps telemetry
Environmental reconstruction, including atmospheric layers, turbulence profiles, and particulate mapping
Companion-Intelligence overlays, highlighting anomalies, trajectories, or system-state predictions
Autonomous-system supervision, where HORUS acts as a high-bandwidth oversight portal rather than a manual controller
The system ingests environmental and sensor data through multimodal fusion, combining infrared, optical, acoustic, inertial, and RF inputs into a spatially coherent MR layer. This allows operators to maintain wide-angle awareness even when local visibility or environmental stability is limited.
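Multimodal fusion of this kind can be illustrated, in highly simplified form, as a confidence-weighted combination of per-sensor estimates. The modality names and weights below are assumptions for illustration, not the HORUS pipeline:

```python
# Highly simplified sensor fusion: combine per-modality position estimates
# weighted by confidence. Modalities and weights are illustrative only.

def fuse(estimates: list[tuple[str, float, float]]) -> float:
    """estimates: (modality, position, confidence) -> fused position."""
    total = sum(w for _, _, w in estimates)
    return sum(p * w for _, p, w in estimates) / total

readings = [("optical", 10.2, 0.6), ("infrared", 9.8, 0.3), ("rf", 11.0, 0.1)]
print(round(fuse(readings), 2))  # 10.16
```

A real pipeline would also align timestamps and reference frames before weighting; the point here is only that degraded channels can be down-weighted rather than dropped.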
Because HORUS interacts with Companion-Intelligence through a closed, encrypted LupoTek architecture, CI can assist by providing real-time annotations, risk scoring, and contextual cueing without overriding human judgment. This maintains human-in-control principles while still leveraging computational scale.
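The human-in-control principle can be made concrete: CI outputs are annotations, never actuation. A minimal sketch, with hypothetical fields and thresholds:

```python
# Advisory-only risk scoring: CI annotates tracked objects with a score and
# label but emits no control commands. Fields and thresholds are hypothetical.

def annotate(tracks: list[dict]) -> list[dict]:
    """Return tracks with an added advisory risk score and label."""
    out = []
    for t in tracks:
        # "proximity" is a 0-100 closeness index (higher = closer).
        score = min(1.0, t["closing_speed"] / 50.0 + t["proximity"] / 100.0)
        label = "caution" if score >= 0.5 else "nominal"
        out.append({**t, "risk": round(score, 2), "label": label})
    return out

print(annotate([{"id": 1, "closing_speed": 30.0, "proximity": 40.0}]))
```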
Cockpit & Helmet Integration
HORUS is designed to integrate directly into next-generation pilot helmets as a lightweight, high-temperature-resistant, vibration-stable MR module. Its features reflect the stringent requirements of high-performance aircraft:
Integrated HUD Projection
Instead of relying on traditional HUD glass, HORUS projects airspeed, altitude, system health, navigation vectors, and situational cues directly onto the visor using collimated MR overlays. This reduces the need for head repositioning and supports eyes-forward operation.
360-Degree Situational Reconstruction
HORUS consumes inputs from distributed sensors - including infrared hemispherical arrays - to produce a “see-through” environment. Pilots can look downward, around structural elements, or beyond occluded geometry by referencing sensor-fused reconstructions.
This enables continuous awareness while maintaining the cockpit enclosure.
Digital Night-Vision Integration
Low-light and thermal channels are fused into the visual feed, providing full-spectrum visibility without separate NVG hardware. Because the pipeline is digital, HORUS can apply noise reduction, edge sharpening, and dynamic range correction.
Eyes-On Interaction
Through high-speed eye-tracking, the platform recognises gaze vectors to pre-select points of interest or interface elements. This is not a “lock-on” system; rather it enhances human-machine interaction by reducing menu navigation or hand control delays.
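Gaze-based pre-selection is often implemented as a dwell test: an element is pre-selected once the gaze vector has rested on it for a threshold time. A minimal sketch, with assumed geometry and timing:

```python
# Sketch of gaze-dwell pre-selection: an element is pre-selected (never
# auto-activated) once gaze rests inside it for a threshold time. Geometry
# and timing values are assumptions for illustration.

def dwell_select(samples, element, threshold_s=0.3, dt=0.01):
    """samples: (x, y) gaze points at interval dt; element: (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = element
    dwell = 0.0
    for x, y in samples:
        dwell = dwell + dt if (x0 <= x <= x1 and y0 <= y <= y1) else 0.0
        if dwell >= threshold_s:
            return True  # pre-selected; activation remains a human action
    return False

print(dwell_select([(0.5, 0.5)] * 40, (0.0, 0.0, 1.0, 1.0)))  # True
```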
Custom Fit & Ergonomic Co-Mapping
Each HORUS unit is 3D-scanned, fitted, and calibrated to the user’s facial geometry, interpupillary distance, and head movement signatures - necessary to maintain distortion-free rendering under dynamic motion.
Data Architecture & Sensor Fusion
The HORUS data architecture is designed for high-bandwidth, multi-layered sensor fusion. In both ground and flight applications, it synthesises:
radar inputs
infrared and multispectral channels
inertial datasets
drone telemetry
atmospheric data extraction
optical flow
CI-generated predictive mapping
Using these datasets, HORUS produces a unified “cognitive map” to support decision-making. This is where the platform’s dual-use nature becomes clear: the same foundation that supports drone fleet oversight can also serve aviation situational enhancement, infrastructure inspection, emergency-response coordination, or environmental monitoring.
Data-Sharing Framework
HORUS supports distributed situational cognition: multiple operators or pilots can share certain visual layers or MR overlays, enabling collaborative awareness.
This is not an intelligence-sharing system; it is a perception-sharing layer designed to improve coordination between remote or cockpit operators.
CI Closed-Loop Learning
When connected to Companion-Intelligence, HORUS enables a feedback loop where operator behaviour informs predictive modelling.
CI does not command or control; instead it learns patterns such as:
environmental anomalies operators focus on
sub-optimal control sequences
regions of interest that correlate with risk
cognitive load inferred from eye-tracking, head motion, or control timing
This allows the system to adapt overlays, warnings, and environmental representations over time.
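One simple way such adaptation can work is to smooth an inferred cognitive-load signal and throttle overlay density against it. The load metric, smoothing factor, and thresholds below are hypothetical, not CI internals:

```python
# Sketch of load-adaptive overlays: smooth an inferred cognitive-load signal
# with an exponential moving average and throttle overlay density against it.
# The metric, alpha, and thresholds are hypothetical, not CI internals.

def overlay_density(load_samples: list[float], alpha: float = 0.2) -> str:
    """Map a smoothed load signal in [0, 1] to an overlay density tier."""
    ema = load_samples[0]
    for load in load_samples[1:]:
        ema = alpha * load + (1 - alpha) * ema  # exponential moving average
    if ema < 0.4:
        return "full"
    return "reduced" if ema < 0.7 else "minimal"

print(overlay_density([0.9, 0.95, 0.9]))  # minimal
```

Smoothing matters here: a raw load spike should not make the interface flicker between density tiers.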
Materials & Physical Design
HORUS is constructed from lightweight composite structures, including carbon fibre, impact-resistant polymers, and internal Kevlar reinforcement. These materials mirror the resilience standards of advanced cockpit systems; no restricted specifications are reproduced here. The mass distribution is tuned to minimise neck strain during extended operation.
Thermal & Vibration Stability
The platform includes:
passive and active cooling layers for micro-OLED thermal loads
vibration-absorptive mounting to remain stable during high-frequency aircraft motion
EMI shielding to ensure data integrity near high-output avionics
Modularity & Field Reconfigurability
HORUS can be fitted as:
a standalone MR headset for ground-based drone control,
a supervisory interface for autonomous-system testing, or
a cockpit-integrated module designed to function as a fused reality visor for high-performance aircraft.
Human-Centric Control Confidence
HORUS does not replace traditional controls - it augments them by improving clarity, reducing cognitive friction, and creating a perceptual environment where complex decisions can be made with higher precision and lower load.
