Activity

gvrz

Shipped this project!

I built a data analysis platform that gives info and predictions for the current, past, and upcoming trending Wikipedia articles. The hardest part was the data management and user interaction code, but once I understood how Pyrogram works and how to properly use Pyrogram, async, and Flask together, it was really easy. I actually had a lot of fun making this; it was an idea I had together with a friend, and he's really happy about it too :)

gvrz

Another small devlog, but a good feature.
I added a "Long Version" page that lists more articles and has a longer analysis, so you can read whichever version you want, depending on your attention span or available time.

gvrz

A quick devlog.
I fixed the demo: it now shows off the features on demand instead of running daily, since a daily run would just be boring and cause problems.
Added a greeting message for new users who subscribe to the Telegram notifications, and fixed some general details.

gvrz

Huge devlog this time hehehehe
The entire thing is hosted 24/7 and refreshes at 8am UTC (if I'm not wrong), the same time at which the Telegram bot updates the subscribed users.
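A minimal sketch of how a daily 8am UTC refresh could be scheduled (names and structure are illustrative, not the project's actual code):

```python
from datetime import datetime, timedelta, timezone

def seconds_until(hour_utc, now=None):
    """Seconds from `now` until the next occurrence of hour_utc:00 UTC."""
    now = now or datetime.now(timezone.utc)
    target = now.replace(hour=hour_utc, minute=0, second=0, microsecond=0)
    if target <= now:
        # already past today's slot, wait for tomorrow's
        target += timedelta(days=1)
    return (target - now).total_seconds()
```

A scheduler loop would then just `time.sleep(seconds_until(8))`, run the analysis, and repeat.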

I added:

  • a nice dynamic frontend hosted on a web server
  • long and short versions of the analysis (the analysis changes daily)
  • a Telegram bot sharing the latest analysis daily (while fighting with rate limits)
  • a bot users database (while learning SQLite)
  • finally, a README
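A minimal sketch of what the bot users database could look like in SQLite (the table layout and function names are my own assumptions, not the project's schema):

```python
import sqlite3

def init_db(path="users.db"):
    con = sqlite3.connect(path)
    con.execute(
        "CREATE TABLE IF NOT EXISTS subscribers ("
        "chat_id INTEGER PRIMARY KEY, username TEXT)"
    )
    con.commit()
    return con

def subscribe(con, chat_id, username=None):
    # INSERT OR IGNORE keeps re-subscribing idempotent
    con.execute("INSERT OR IGNORE INTO subscribers VALUES (?, ?)",
                (chat_id, username))
    con.commit()

def all_subscribers(con):
    """Chat ids to send the daily analysis to."""
    return [row[0] for row in con.execute("SELECT chat_id FROM subscribers")]
```

Using `:memory:` as the path makes this easy to test without touching disk.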

Fixed A LOT of things, be it typos, errors, or other small issues.
I think I coded much more than this, but it took so long I forgot.
It took me a loooong untracked time to get my demo working on a VPS, Oracle's fault…

To-do:

  • test the bot with more than 30 subscribed users
  • possibly add a graph for the trends
  • make a mobile app or any alternative notification system
  • website notifications
  • custom telegram message time

Have fun! :3

gvrz

THIS DEVLOG CONTAINS A SHOWCASE!!!
Small devlog for a kinda quick job.

TL;DR: I managed to use venv and git correctly (I mess things up 99% of the time when using those), and it looks promising.

I set up the GitHub repo, got it working, and synced it on both my devices.
That's great because I was finally able to test my code on more than one platform, and it works perfectly. The only dependency not managed by Python and its libraries is Ollama, or whichever other API you want to use.

I fixed the requirements.txt I previously made: it was generated in a bad venv and full of libraries that are no longer needed. Now it's light and contains only what's strictly necessary for the project to work, as it should be.

I also made some small tweaks to my code and made sure all the references to "context" were removed or fixed, since that was breaking the code and wasn't needed at all.

The GitHub repo is almost complete, Telegram bot support is almost complete too, and the project might be close to a beta release suitable for shipping :)

gvrz

Some good updates this time!

As I wrote in the last devlog, the LLM I was using is no longer available through the API I was using (Cerebras).
I was using Cerebras because that's what I started learning LLMs in Python with, but now I needed something local to make it better.

I now use Ollama Cloud to talk to a crazy good DeepSeek model (v3.1, 671B). The results are excellent (better than they were with gpt-oss-120b), and the analysis and predictions actually make sense now. I fixed some minor things too, like memory, timestamps, and output.
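A minimal sketch of talking to a model through Ollama's `/api/chat` REST endpoint (the model tag and host here are placeholders, and this is my illustration, not the project's actual code):

```python
import json
import urllib.request

def chat_request(prompt, model="deepseek-v3.1", host="http://localhost:11434"):
    """Build a non-streaming request for Ollama's /api/chat endpoint.
    Model tag and host are placeholders - adjust to your setup."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }).encode()
    return urllib.request.Request(
        f"{host}/api/chat",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# To actually send it (needs a reachable Ollama server):
# resp = urllib.request.urlopen(chat_request("Summarize today's trends"))
# print(json.loads(resp.read())["message"]["content"])
```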

I also fixed a small thing in the memory section: the date given to the LLM no longer has to match the timestamp (the time the entry was written to memory), so it's now possible to fetch older data and still have the right date in the prompt.

I removed the context part; it was a useless file containing just a string, so I wrote the string into the code itself.

I removed a function that listed the models currently available on Cerebras, since I no longer use Cerebras.
I also added some functions that save the prompt to a file (so I know what's been sent to the LLM), write the output to a Markdown file, and convert it to a PDF so the latest analysis can be read easily.

My code is still kinda messy and incomplete; it's full of functions and still missing the user integration part.

To-do:

  • fetch custom date without changing the url
  • use external config file for API url, key etc.
  • telegram bot/web app or client part
  • graphs showing what’s been trending(?)
  • option to check daily, weekly, etc.

Thanks for reading, hope you like what I’m doing.

gvrz

NEW DEVLOG!!! I just came back from an event about startups and coding, placed third :)

I finally fixed the way memory works: now it stores all the data the Wikimedia API returns, together with the corresponding analysis and predictions, so the new analysis can be based not only on the latest entry but also on the last couple of entries, for better accuracy.

Both API responses and memory are saved to data.json and memory.json. If data.json is deleted, it will be recreated the next time the function is called; if memory.json is deleted, well, the analysis won't take the old ones into account, but the file will be recreated too.
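A minimal sketch of that append-and-recreate behaviour for memory.json (the entry format and function names are my assumptions; an entry keeps the data's date separate from when it was written):

```python
import json
from pathlib import Path

def append_memory(path, entry, keep=50):
    """Append one analysis entry; recreate the file if it's missing."""
    p = Path(path)
    memory = json.loads(p.read_text()) if p.exists() else []
    memory.append(entry)
    p.write_text(json.dumps(memory[-keep:], indent=2))

def last_entries(path, n=3):
    """Last n entries for the prompt; empty list if memory was deleted."""
    p = Path(path)
    if not p.exists():
        return []
    return json.loads(p.read_text())[-n:]
```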

The LLM provider I'm relying on is pretty unstable: one day they let you use GPT, another day only Llama, and so on. That's not really nice.

That's kind of a big problem, since I was using gpt-oss, which is not available anymore (for now). I coded this part testing it offline, as it's just a better version of the one in my first devlog, so there was nothing API- or AI-related to test.

I'll use a local LLM soon, even though I don't have a dedicated machine to run it on. I could also create some categories directly in my code and use those for the analysis, but that wouldn't be very accurate and would be way worse than it is now.

gvrz

FIRST DEVLOG, NEW PROJECT!! :D
A nice idea I had thanks to my bro; I had to try some "data analysis" sooner or later, so here it is.

Added:
- first attempt at using an LLM via API, not the best but it works
- a function to fetch the trends data from Wikimedia's API
- another function to filter the data so the main page and search page don't show up as trending
- a context instruction for the LLM, so it won't reply if the prompt or data is out of context
- a memory instruction for the LLM; not used yet, but it's there for future in-depth analysis based on data that was already analyzed
- collected data is exported to a JSON file
- final output includes the top 5 trending articles, a general overview, and predictions for the next time period's trends
- output is also saved to a Markdown file
- split the code into functions, simply because it doesn't need to run in a loop; it will be called only after certain events, which will be added soon
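A minimal sketch of the fetch-and-filter steps above, against the Wikimedia Pageviews REST API's `top` endpoint (function names and the top-5 cutoff are illustrative, not the project's actual code):

```python
import json
import urllib.request

# Service pages that show up in the raw data but aren't real trends
EXCLUDE = {"Main_Page", "Special:Search"}

def top_url(year, month, day, project="en.wikipedia"):
    """URL for the most-viewed articles of one day."""
    return (
        "https://wikimedia.org/api/rest_v1/metrics/pageviews/top/"
        f"{project}/all-access/{year:04d}/{month:02d}/{day:02d}"
    )

def filter_articles(payload, limit=5):
    """Drop service pages and keep the top `limit` real articles."""
    articles = payload["items"][0]["articles"]
    return [a for a in articles if a["article"] not in EXCLUDE][:limit]

def fetch_top(year, month, day):
    with urllib.request.urlopen(top_url(year, month, day)) as resp:
        return json.loads(resp.read())
```

Note that other `Special:` pages can also appear in the data, so the exclude set may need to grow.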

gvrz

Developed the first UI for what I've written so far:
ISS tracking with velocity, position, and crew info.

Note: some parts of the HTML, like the frames and some of the JavaScript, weren't made fully by me, since I'm really bad at those things and couldn't get them to work.

Added a daytime/nighttime indicator that depends on the time and the ISS position.
Added an additional embed with the official ISS tracking from ESA; might be useful.
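A rough sketch of how a day/night indicator can be derived from time and position, using an approximate subsolar point (this is a generic approximation of mine, good to a degree or two, not the project's implementation):

```python
import math
from datetime import datetime, timezone

def subsolar_point(dt):
    """Approximate subsolar latitude/longitude in degrees.
    Ignores the equation of time, so it's only roughly accurate."""
    day = dt.timetuple().tm_yday
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    hours = dt.hour + dt.minute / 60 + dt.second / 3600
    lon = (12.0 - hours) * 15.0  # sun roughly overhead at 12:00 UTC, lon 0
    return decl, ((lon + 180) % 360) - 180

def is_daytime(lat, lon, dt):
    """True if the sun is above the horizon at (lat, lon)."""
    slat, slon = subsolar_point(dt)
    # cosine of the great-circle angle between the point and the subsolar point
    cos_ang = (math.sin(math.radians(lat)) * math.sin(math.radians(slat))
               + math.cos(math.radians(lat)) * math.cos(math.radians(slat))
               * math.cos(math.radians(lon - slon)))
    return cos_ang > 0
```

Feeding in the ISS subpoint coordinates gives the day/night state of the ground directly beneath it.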

Coolest thing so far:
Added a simulator for the Weather Channel, with international forecast ability.
It can be self-hosted by following the instructions in the README, but you can use the one in the demo, which is the NoNode version (the guy I forked it from hosts it on his website; the NoNode version just embeds that one, while the self-hosted version has an embed pointing to the local port).

The weather forecast widget includes a smol WinAmp widget; scroll down to unmute the official Weather Channel music. (Man, I miss the Wii's Weather Channel, even though it still works lol)
Scroll to the bottom of the widget to change some settings, and scroll down in the ESA ISS tracking widget to change some more.

You can find the NoNode version here: https://github.com/giovannirizzello/WorldTracker_NoNode
I haven’t made the README or anything else yet.
Have fun!

Many thanks to https://github.com/mwood77 for making the international version of the weather channel!

gvrz

Switched to an API that provides a lot more info about the ISS.
That means I've now implemented faster lat/long tracking and, most importantly, altitude and velocity tracking.
Fun fact: the ISS is in space but still suffers slight friction from the atmosphere, which means velocity decreases and the ISS altitude drops by around 100 m every day. So altitude and velocity aren't constant and need to be tracked, even if they don't change too much.
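As a toy example of what that decay rate implies (the numbers here are illustrative; real decay varies with solar activity, and reboosts reset the clock):

```python
def days_until(alt_km, floor_km, decay_m_per_day=100.0):
    """Linear projection of how long until the altitude sinks to floor_km,
    assuming a constant ~100 m/day decay as mentioned above."""
    if alt_km <= floor_km:
        return 0.0
    return (alt_km - floor_km) * 1000.0 / decay_m_per_day
```

So from 410 km down to a hypothetical 400 km threshold would take about 100 days at that rate.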

gvrz

Quick update: added the number of people on the ISS and their names.
Still working on getting aircraft data.

gvrz

Heya guys, first devlog after the first development session.
If you have any suggestions, please tell me!! I'm gvrz on Slack; I still haven't made a GitHub repo.

Implemented the first features:

  • ISS latitude and longitude tracking via API.
  • Monitored sky section (coordinates of 4 points which correspond to the area that will be projected).
  • Aircraft tracking is done via ADS-B exchange API, not by scraping flightradar24 like many do.
  • Ability to set your own coordinates.
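A minimal sketch of the monitored-section check with four corner coordinates (my own illustration, assuming a simple lat/long box that doesn't cross the antimeridian):

```python
def in_section(lat, lon, corners):
    """True if (lat, lon) falls inside the rectangle spanned by the
    four corner coordinates of the monitored sky section."""
    lats = [c[0] for c in corners]
    lons = [c[1] for c in corners]
    return min(lats) <= lat <= max(lats) and min(lons) <= lon <= max(lons)
```

An aircraft or ISS position from the API can then be tested against the configured section before projecting it.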

To-do:

  • Finish aircraft tracking
  • Add option to make the monitored sky section the same as what you would actually see from your coordinates
  • Add ISS speed and crew info
  • Add flight speed, route, departure and arrival info.
  • Add ability to switch from ADS-B exchange API to tracking with own equipment such as RTL-SDR
  • Add weather info (would be cooler if from NOAA)
  • Add Starlink satellite tracker (with signal strength/orientation if possible)

That's it for now, I think.

gvrz

Shipped this project!

Hours: 0.53
Cookies: 🍪 3
Multiplier: 5.29 cookies/hr

Very short project… did this instead of a bigger one because I was too bored; I think it's pretty fun though.

gvrz

Added fun facts, and the actual code in the repo.

gvrz

I'm working on my first project! This is so exciting. I can't wait to share more updates as I build!
