The Nerve is a dedicated physical command center for algorithmic video production. Instead of managing complex n8n/FFmpeg pipelines through a browser tab, I am building a tactile “cyberdeck” interface that puts the director back in control.
What it does:
It features a “Hype Dial” (rotary encoder) that physically adjusts the pacing and editing style of the generated video in real time, while an OLED screen displays system health and render status. It creates a bridge between low-level hardware (Rust on the RP2040) and high-level cloud automation.
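To make the dial idea concrete, here is a host-runnable Rust sketch of the decoding logic. On the real board the A/B signals would come from RP2040 GPIO pins inside an Embassy task; here they are plain samples so the logic can be tested on a desktop. The `HypeDial` name, the 0–100 range, and the starting level are illustrative assumptions, not the actual firmware.

```rust
/// Quadrature lookup table indexed by (previous AB state << 2) | current AB state.
/// +1 / -1 for valid transitions in each direction, 0 for no movement
/// or an invalid (bouncy) double-step transition.
const TRANSITIONS: [i8; 16] = [
    0, -1, 1, 0,
    1, 0, 0, -1,
    -1, 0, 0, 1,
    0, 1, -1, 0,
];

struct HypeDial {
    prev_state: u8, // last AB reading, 2 bits (bit 1 = A, bit 0 = B)
    level: i32,     // 0..=100, mapped to editing pace downstream
}

impl HypeDial {
    fn new(initial_state: u8) -> Self {
        Self { prev_state: initial_state & 0b11, level: 50 }
    }

    /// Feed one AB sample; returns the updated, clamped hype level.
    fn sample(&mut self, ab: u8) -> i32 {
        let ab = ab & 0b11;
        let idx = ((self.prev_state << 2) | ab) as usize;
        self.prev_state = ab;
        self.level = (self.level + TRANSITIONS[idx] as i32).clamp(0, 100);
        self.level
    }
}

fn main() {
    let mut dial = HypeDial::new(0b00);
    // One full quadrature cycle in the "+1" direction: 00 -> 10 -> 11 -> 01 -> 00
    for ab in [0b10u8, 0b11, 0b01, 0b00] {
        dial.sample(ab);
    }
    println!("level after one full cycle: {}", dial.level); // -> 54
}
```

The table-driven decoder is the usual defensive choice here: invalid transitions (contact bounce) map to 0 instead of being miscounted as steps.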
Who is working on it:
Just me. I am diving deep into PCB design with EasyEDA and learning Rust (Embassy framework) for embedded concurrency.
What’s Next:
I have finalized the schematic, including LiPo power management for portability. The next steps are routing the “Cyberpunk-style” PCB, 3D-printing the enclosure, and shipping the firmware.
I used Gemini essentially as a “senior engineering” mentor to guide me through the PCB design workflow in EasyEDA, specifically to validate the LiPo power-management circuit and to check my schematic for potential short circuits.
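As an example of the kind of sanity check involved: a full LiPo cell sits around 4.2 V, above the RP2040 ADC's 3.3 V range, so battery monitoring needs a resistor divider. The sketch below checks that math; the 100k/100k values and the 12-bit ADC full scale are illustrative assumptions, not my final schematic.

```rust
const R_TOP: f64 = 100_000.0;    // assumed divider resistor, battery -> ADC pin
const R_BOTTOM: f64 = 100_000.0; // assumed divider resistor, ADC pin -> ground
const V_REF: f64 = 3.3;          // RP2040 ADC reference voltage
const ADC_MAX: f64 = 4095.0;     // 12-bit ADC full-scale reading

/// Voltage seen at the ADC pin for a given battery voltage.
fn divider_out(v_batt: f64) -> f64 {
    v_batt * R_BOTTOM / (R_TOP + R_BOTTOM)
}

/// Recover the battery voltage from a raw ADC reading.
fn battery_voltage(raw: u16) -> f64 {
    let v_pin = raw as f64 / ADC_MAX * V_REF;
    v_pin * (R_TOP + R_BOTTOM) / R_BOTTOM
}

fn main() {
    // A fully charged 4.2 V cell must stay inside the 3.3 V ADC range.
    let v_full = divider_out(4.2);
    assert!(v_full <= V_REF, "divider output exceeds ADC range");
    println!("4.2 V cell -> {:.2} V at ADC pin", v_full); // -> 2.10 V
}
```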
I’ll also probably use AI to generate boilerplate Rust (Embassy framework) snippets so I can get familiar with the syntax (I still don’t know Rust; if I stick with it instead of C++ or MicroPython, I’ll have a lot of studying to do). The final logic, component selection, and physical routing, however, are being executed and verified manually by me, so that I fully understand the system architecture.
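To show where the firmware logic meets the cloud side, here is a host-runnable sketch of how a dial position might translate into editing parameters for the n8n/FFmpeg pipeline. The parameter names, ranges, and linear mapping are all hypothetical; the real payload schema is still undecided.

```rust
/// Editing parameters derived from the dial position (hypothetical schema).
#[derive(Debug, PartialEq)]
struct EditStyle {
    avg_cut_ms: u32, // average shot length in milliseconds
    crossfade: bool, // slow pacing gets soft transitions instead of hard cuts
}

/// Map a 0-100 "hype level" to an editing style.
fn style_for(level: u8) -> EditStyle {
    // Linear map: level 0 -> 3000 ms cuts, level 100 -> 400 ms cuts.
    let level = level.min(100) as u32;
    let avg_cut_ms = 3000 - (3000 - 400) * level / 100;
    EditStyle { avg_cut_ms, crossfade: level < 30 }
}

fn main() {
    println!("{:?}", style_for(0));   // chill: 3000 ms cuts, crossfades on
    println!("{:?}", style_for(100)); // max hype: 400 ms cuts, hard cuts
}
```

Keeping this mapping in one pure function means the same table can be unit-tested on a desktop before it ever runs on the RP2040.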