Activity

Josh

Made a basic chat UI. I haven’t used React in a while, so creating a new project and getting acquainted with all the dependencies I needed took some time. It looks a bit generic right now, and I will definitely change how it looks so that it isn’t so bland, but the main focus is on the audio visualizer (which acts as a button) in the navbar, as well as the floating interface it activates. The main feature of the app is its integration with your microphone and earbuds, so related features are easily accessible in the floating interface. The app specifics aren’t set in stone yet, but it should be easy to modify if anything changes.

The stack, if anyone is interested: React, Vite, Tailwind, daisyUI (component styles lib), TanStack Router (for making this a SPA), Zustand (state management). It’s the most NPC stack there is (and not really the most enjoyable to work with), but I’m trying to have this follow industry standards and stuff

Josh

ok, I was working on a ton of stuff in parallel, so even after 10 hours I only have partial progress on the touch driver (the hardware-side I2C communication sporadically breaks for reasons I cannot figure out for the life of me), on having multiple fonts/background images, and on the touch input method I plan to have. Toward the end of this time, though, I’ve been writing some absolutely generational Rust code which I wish I could finish, but I have to make this devlog, so welp. I do have an image of a new wallpaper and font though, so hopefully that’s fresh.

Attachment
Josh

Finished polishing the text and added RTC integration and Bluetooth support

I probably should have split this into multiple devlogs but oh well
I first picked up from my last devlog and fixed the text rendering problems. In the process, I found out that the CO5300 display driver only accepts even numbers for draw region dimensions, so I had a bit of fun figuring out why my 1px-high line buffer was bugging out. Speaking of line buffers, I changed my draw implementation from lazily evaluating every pixel to lazily evaluating multiple lines at once, which reduced the time it takes to draw a frame from ~250ms to ~150ms (still horrible at under 10fps; I’m looking into ways to optimize this further).

Once that was done, I pretty quickly managed to make the RTC driver (the watch has the PCF85063A, which has a good datasheet), and it tracks time beautifully, but I realized that synchronizing the time with the real world was going to be a bit annoying. I was originally going to use something that injects a timestamp at compile time and go from there, but I ultimately opted for Bluetooth (BLE) communication since it’s cleaner and future-proof. Learning about Bluetooth took some time, but the Rust library trouble_host made it pretty straightforward to implement.

Bluetooth does have some caveats, though. Native iOS and KDE Plasma (my daily driver on my laptop) don’t support connecting to it directly, so I have to use external programs to communicate with the watch. Also, the underlying Bluetooth drivers on my phone permanently cache service descriptions (the structure of the data being sent), so I have to do some goofy MAC address changing to fake being a new device whenever I need a new schema.

The attached video shows me sending a unix epoch timestamp through Bluetooth to my watch (connected through serial to my laptop), which the watch then interprets and uses to sync the RTC.
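For anyone curious, the epoch-to-calendar conversion the watch has to do when it receives the timestamp boils down to something like this. This is a sketch, not the actual driver code (packing the fields into the PCF85063A’s BCD registers is omitted); the date math is the standard "civil from days" algorithm.

```rust
/// Convert a unix timestamp (seconds since 1970-01-01 UTC) into the
/// calendar fields an RTC like the PCF85063A stores:
/// (year, month, day, hour, minute, second).
fn epoch_to_civil(ts: i64) -> (i32, u32, u32, u32, u32, u32) {
    let days = ts.div_euclid(86_400);
    let secs = ts.rem_euclid(86_400);
    let (hh, mm, ss) = (
        (secs / 3600) as u32,
        (secs % 3600 / 60) as u32,
        (secs % 60) as u32,
    );

    // Civil-from-days: 400-year eras of 146097 days each.
    let z = days + 719_468;
    let era = z.div_euclid(146_097);
    let doe = z.rem_euclid(146_097); // day of era [0, 146096]
    let yoe = (doe - doe / 1460 + doe / 36_524 - doe / 146_096) / 365;
    let y = yoe + era * 400;
    let doy = doe - (365 * yoe + yoe / 4 - yoe / 100); // day of year, Mar-based
    let mp = (5 * doy + 2) / 153; // month index, March = 0
    let d = (doy - (153 * mp + 2) / 5 + 1) as u32;
    let m = if mp < 10 { mp + 3 } else { mp - 9 } as u32;
    let year = (y + if m <= 2 { 1 } else { 0 }) as i32;
    (year, m, d, hh, mm, ss)
}
```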

Attachment
Josh

In the process of adding monospace text support
ImageMagick supports generating images of text from fonts, so I stuck with it for generating a font bitmap. I spent a couple hours trying to figure out why there were extra pixels when I generated the bitmap directly as one line of text, before realizing that I can just programmatically generate an image per character (each with a correct, constant number of pixels) and append them afterwards. There were also some complications with generating partitions (I kept messing up the hexadecimal math). Ultimately, the font does work, but my text drawing algorithm is a bit messed up (it’s supposed to just say “abc”); that should be a relatively simple fix. I am also trying to devise ways to compress the font, since even just 64 characters of 80px text (the font size in the image) takes up 400kb, and I do want bigger fonts eventually.
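The nice thing about a monospace atlas is that indexing into it is pure arithmetic. A sketch of the idea, where the cell width (40px), the ASCII-ordered layout starting at space, and the names are all my assumptions, not the real code; the size function also shows where the ~400kb figure comes from:

```rust
// Hypothetical layout: ASCII-ordered monospace atlas starting at ' ',
// each glyph a fixed GLYPH_W x GLYPH_H cell of RGB565 (2 bytes/pixel).
const GLYPH_W: usize = 40; // assumed cell width for an 80px-tall font
const GLYPH_H: usize = 80;
const BYTES_PER_PX: usize = 2; // RGB565
const FIRST_CHAR: u8 = b' ';

/// Byte offset of a glyph's bitmap inside the atlas.
fn glyph_offset(c: u8) -> usize {
    (c - FIRST_CHAR) as usize * GLYPH_W * GLYPH_H * BYTES_PER_PX
}

/// Total atlas size for n glyphs: 64 glyphs at these dimensions is
/// 409,600 bytes (~400kb), which is why compression looks attractive.
fn atlas_size(n: usize) -> usize {
    n * GLYPH_W * GLYPH_H * BYTES_PER_PX
}
```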

I was also testing color fidelity and conversion correctness (had to fix some bugs with my imagemagick command), so I uploaded some random wallpaper images, with the one shown in the photo being from https://safebooru.org/index.php?page=post&s=view&id=6558188

Attachment
Josh

Added images to the rendering
The ESP32-C6 only has 512kb of RAM, and a background image is 410x502 pixels; assuming 16-bit pixels (RGB565), that’s 3,293,120 bits of data, or 411kb. Although it does fit, having just a wallpaper take up 4/5ths of the RAM is not really ideal, especially when I have so much more I want to add, and the board also boasts a hefty 16mb of flash, so I chose to stream the image line by line directly from flash, so that only a 410-entry 16-bit line buffer has to be stored in RAM. This required some digging into partitions, but there were luckily some dependencies in the official esp32 libraries that helped with this.
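The streaming loop itself is tiny. A rough sketch of the shape (the closures stand in for whatever partition-read and display-push APIs the flash and display layers actually provide; these names are made up):

```rust
// Instead of holding the whole 410x502 RGB565 wallpaper (~411kb) in RAM,
// keep one row's worth of pixels and refill it from flash per line.
const WIDTH: usize = 410;
const HEIGHT: usize = 502;

fn stream_wallpaper(
    mut read_row: impl FnMut(usize, &mut [u16; WIDTH]), // flash partition read
    mut push_row: impl FnMut(&[u16; WIDTH]),            // hand off to display
) {
    // The only RAM the wallpaper needs: 410 * 2 = 820 bytes.
    let mut line = [0u16; WIDTH];
    for y in 0..HEIGHT {
        read_row(y, &mut line);
        push_row(&line);
    }
}
```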
Another part of this was that I wanted to be able to convert any image into a usable wallpaper, so I have a Python script that uses ImageMagick to stream the image’s pixels into a function that converts them into RGB565. I was originally going to do this with just ImageMagick, but after messing with ImageMagick’s fx language for a while, I just couldn’t get anything to work. It’s still a very interesting feature, so I might use it for some image automation later on.
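The per-pixel conversion itself is just bit packing. Here it is in Rust rather than the script’s Python, and assuming plain truncation rather than rounding (the actual script may differ):

```rust
/// Pack an 8-bit-per-channel pixel into RGB565:
/// 5 bits of red, 6 of green, 5 of blue.
fn rgb888_to_rgb565(r: u8, g: u8, b: u8) -> u16 {
    ((r as u16 >> 3) << 11) | ((g as u16 >> 2) << 5) | (b as u16 >> 3)
}
```

Green gets the extra bit because the eye is most sensitive to it, which is part of why RGB565 looks better than "64K colors" sounds.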
Overall, I’m actually quite surprised by the colors and image fidelity the watch has. The image is in RGB565, so there are only 64K colors available compared to normal RGB’s 16M, but it’s barely noticeable on a 2-inch watch, and the high resolution is just beautiful (it somehow has 324 pixels per inch; even my OLED monitor only has 90). It looks pretty blurry in the photo because of my shaky hands, but I swear it’s absolutely beautiful and vibrant in real life.

Attachment
Josh

Refactored the codebase so that there’s now a clean shader-like API for drawing to the display
Now all it takes to produce the display in the image is this:

co5300.draw_pixels(0, 0, 410, 502, |px, py| {
    RGB565::new(
        ((px as f32 / SCREEN_WIDTH) * 31.) as u16,
        ((py as f32 / SCREEN_HEIGHT) * 63.) as u16,
        31,
    )
});

I spent a couple hours debugging why it didn’t work with the abstractions in place (hardware is so fun when it produces zero errors when it fails) before I fixed the endianness of the commands (the ESP32 is little-endian, while the CO5300 expects big-endian data).
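The fix in miniature: on a little-endian host, naively casting a multi-byte value to bytes puts the low byte first, while the display expects the high byte on the wire first. Emitting bytes explicitly sidesteps host endianness entirely:

```rust
/// Serialize an RGB565 pixel in the big-endian order the CO5300 expects,
/// regardless of the host CPU's endianness.
fn pixel_bytes(px: u16) -> [u8; 2] {
    px.to_be_bytes()
}
```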

Attachment
Josh

I spent over a week trying to write a simple CO5300 driver in Rust for the watch’s display. Maybe it’s because of my inexperience in embedded development, but making it work was just pure agony:

  • When I received my watch, I really wanted to write the firmware in Rust, and I quickly found out that most libraries were in C++
  • This meant I couldn’t use esp-idf or any mainstream C++ library (esp-idf-sys, the esp-idf bindings for Rust, does exist, but I only found that out much later)
  • Waveshare did have docs that used Arduino, which uses a library for its display called Arduino_GFX (https://github.com/moononournation/Arduino_GFX)
  • The datasheet for the CO5300 (the driver for the display) is 226 pages, and I didn’t feel like reading all of that, so I spent a couple days scraping all the CO5300 commands that were scattered through Arduino_GFX until I got a working demo in Arduino
  • I then thought it would be relatively easy to port the Arduino code to Rust, but it turns out there’s much more low-level work to writing commands than repeating “bus->writeCommand(abc);” 10 times
  • I spent a couple days stuck until I realized that the command goes into the address phase instead of the phase literally named “command”??? I’m still malding at whoever thought that was a great idea
  • And I was stuck for about 5 more days trying to get anything to work (Wokwi won’t run, MicroPython/CircuitPython have no general QSPI libraries for some reason, and I scoured the datasheet and the Arduino_GFX implementation a few times) until I resorted to AI to point out the obvious: it runs in 1-1-4 instead of 4-4-4 wire mode, which the datasheet clearly states. I’m still blaming the fact that esp-idf names things a bit cryptically and that Arduino_GFX’s source code is pretty annoying to read (they left no explanation for half the choices they made, and left-shifting to multiply by 2 is just annoying to decipher)
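To make the address-phase weirdness concrete, here is my understanding of the framing on these QSPI AMOLED controllers, as a hedged sketch: a fixed single-lane SPI opcode goes in the command phase, and the actual CO5300 register command rides in the middle byte of the 24-bit address phase, with data following on one or four lanes depending on the mode. The opcode value and names here are assumptions, not verified against the datasheet:

```rust
// Hypothetical 1-1-4 write-path header: the CO5300 command (e.g. 0x2C,
// memory write) is embedded in the middle byte of the address phase,
// while the SPI-level opcode stays fixed.
const QSPI_WRITE_OPCODE: u8 = 0x02; // assumed single-lane opcode

/// Returns (spi_opcode, 24-bit address word = 0x00 | cmd | 0x00).
fn frame_header(co5300_cmd: u8) -> (u8, u32) {
    (QSPI_WRITE_OPCODE, (co5300_cmd as u32) << 8)
}
```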

at least it works now

Attachment