Today, I got the bulk of the project done - PARTICLE GEOMETRY!
Here’s how it works:
Instead of storing particle positions in CPU arrays and updating them each frame, the positions live in a texture, one texel per particle: the RGB channels hold the (x, y, z) position and the alpha channel holds a “life” value.
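To make the layout concrete, here's a small CPU sketch of that texture in NumPy. The function name and the grid-centering details are my own choices for illustration, not from the project:

```python
import numpy as np

def make_position_texture(width, height):
    """One RGBA texel per particle: RGB = (x, y, z), A = life."""
    tex = np.zeros((height, width, 4), dtype=np.float32)
    ys, xs = np.mgrid[0:height, 0:width]
    # Spawn particles on the image grid, centered at the origin.
    tex[..., 0] = xs - width / 2.0   # x
    tex[..., 1] = ys - height / 2.0  # y
    tex[..., 2] = 0.0                # z: flat plane to start
    # Give each particle a randomized starting life so resets stagger.
    tex[..., 3] = np.random.default_rng(0).uniform(0.2, 1.0, (height, width))
    return tex

tex = make_position_texture(64, 64)
print(tex.shape)  # → (64, 64, 4): one texel per particle
```

On the GPU this would be a floating-point texture; the key point is just that position and life travel together in a single texel.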
Each frame, a GPGPU pass (a fullscreen quad render) reads the current texture and writes the updated positions into a second texture, ping-ponging between the two. Each particle drifts by a Perlin-noise offset, which gives the organic wandering; if a particle’s life decays to zero, it resets to its original image-grid position. All of this runs in parallel on the GPU.
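A CPU sketch of one simulation step, assuming the ping-pong scheme above. A uniform random offset stands in for real Perlin noise here, and the tiny grid plus aggressive decay are demo values, not the project's:

```python
import numpy as np

# Spawn state: particles at rest on a tiny 4x4 grid, full life.
original = np.zeros((4, 4, 4), dtype=np.float32)
original[..., 0] = np.arange(4)[None, :]  # x from column
original[..., 1] = np.arange(4)[:, None]  # y from row
original[..., 3] = 1.0                    # life

def simulate_step(curr, original, decay, rng):
    """Read `curr`, return the next buffer (the GPU would render into
    the second texture, then the two swap roles)."""
    nxt = curr.copy()
    # Wander: nudge each position (a stand-in for a Perlin-noise offset).
    nxt[..., :3] += rng.uniform(-0.5, 0.5, curr[..., :3].shape).astype(np.float32)
    # Age the particle.
    nxt[..., 3] -= decay
    # Dead particles snap back to their original image-grid texel.
    dead = nxt[..., 3] <= 0.0
    nxt[dead] = original[dead]
    return nxt

rng = np.random.default_rng(1)
curr = original.copy()
curr = simulate_step(curr, original, 0.6, rng)  # life 0.4, positions wandered
curr = simulate_step(curr, original, 0.6, rng)  # life hits zero -> full reset
```

The shader version does exactly this per texel, which is why it parallelizes so well: each particle's update depends only on its own texel.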
When it’s time to draw, the particle geometry is simple: one vertex per pixel, with no position attribute at all. Instead, each vertex carries a UV coordinate that points into the simulation texture. The vertex shader samples that texture to fetch the particle’s current (x, y, z), transforms it to clip space, and scales the point size based on life. The fragment shader colors each point by sampling the original image using the same per-pixel UV.
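The draw-side indirection can be sketched on the CPU too: vertices carry only UVs, and "sampling" the simulation texture recovers the position. The helper name `sample_nearest` is mine; it mimics what a nearest-filtered texture fetch does in the vertex shader:

```python
import numpy as np

w, h = 8, 8
# Per-vertex attribute: the UV of this particle's texel (no positions!).
# UVs target texel centers, hence the +0.5.
uvs = np.stack(np.meshgrid((np.arange(w) + 0.5) / w,
                           (np.arange(h) + 0.5) / h), axis=-1).reshape(-1, 2)

# Stand-in for the simulation texture after some frames.
sim_tex = np.random.default_rng(2).random((h, w, 4)).astype(np.float32)

def sample_nearest(tex, uv):
    """Nearest-neighbor texture fetch, as the vertex shader would do."""
    x = int(uv[0] * tex.shape[1])
    y = int(uv[1] * tex.shape[0])
    return tex[y, x]

pos_life = sample_nearest(sim_tex, uvs[0])
xyz, life = pos_life[:3], pos_life[3]
# The point size would then scale with life, e.g. size = base_size * life,
# and the fragment shader would sample the source image at this same UV.
```

Because the UVs never change, the vertex buffer is static; all the motion comes from the texture the UVs index into.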
The result: thousands of points animate entirely on the GPU, each retaining its source pixel’s color, producing fluid, organic motion.
That’s it! ;)