
Poetron

7 devlogs
18h 56m 50s


An AI model that runs in the terminal and spins out randomly generated poems matching your mood, using a transformer model. PLEASE READ: I had to pull the plug on the project and head in a different direction after the AI went AWOL and produced some fairly graphic threats, included in the devlog. The AI is doing alright now after the shift in direction, and I have also added a rules-based fallback. This project mostly serves as an introduction to how creativity works in language models. In the beginning it actually had tokens for different poem lengths, but I decided to simplify it to ensure the model would only work within comfortable zones. TL;DR: you only get to choose and tune the creativity index/temperature; the higher the value, the more creative the model is, and the lower it is, the more closely it follows the examples it was trained on.
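The effect of the creativity index/temperature can be sketched with plain softmax math. This is a minimal illustration in Python, not the actual Poetron code; the function name and logit values are made up:

```python
import math

def token_distribution(logits, temperature):
    """Turn raw logits into a probability distribution,
    scaled by temperature (the 'creativity index')."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]              # hypothetical scores for 3 tokens
low = token_distribution(logits, 0.5)   # low temperature: sharper, favors the top token
high = token_distribution(logits, 2.0)  # high temperature: flatter, more "creative"
```

With temperature 0.5 the top token dominates the distribution; at 2.0 the probabilities flatten out, so sampling picks unlikely tokens more often.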

This project uses AI

AI was only used to decode the model's AWOL outputs and figure out how to fix them, as well as to write the README.

Demo Repository


yydscoder

Shipped this project!

Hours: 18.95
Cookies: 🍪 154
Multiplier: 13.22 cookies/hr

Project is unchanged; I am resubmitting it since the payout was way below average in my opinion. It was a project aimed at generating haikus to help people learn how differences in temperature lead to different AI outputs, haikus being a form that isn't meant to be interpreted literally.

yydscoder

Due to the updates, I was advised to ship it again since the payout was quite mediocre.

yydscoder

Shipped this project!

Hours: 18.55
Cookies: 🍪 97
Multiplier: 5.23 cookies/hr

I had to change the project’s direction and shut it down after it started generating obscene content. The model is now retrained with proper guardrails, but it no longer supports custom dataset training (this may return in the future).

It still generates haikus. The main goal of the project was to let users experiment with tokens and temperature to see how they affect output variation, and haikus were chosen because their abstract nature makes output differences more noticeable.

yydscoder

I think we can look to ship soon. I have added a rules-based fallback and a way for the model to check whether an output follows the 5-7-5 structure, falling back if it doesn't conform. Now I just need to rewrite the README and we can look to ship.
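A 5-7-5 check like the one described could look something like this. It is a naive sketch using a vowel-group syllable heuristic; `count_syllables` and `is_575` are hypothetical names, not Poetron's actual implementation:

```python
import re

def count_syllables(word):
    """Rough syllable count: number of consecutive-vowel groups.
    A crude heuristic (it miscounts words like 'silence'), but
    enough to gate a rules-based fallback."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def is_575(poem):
    """Check whether a three-line poem follows the 5-7-5 structure."""
    lines = [l for l in poem.strip().splitlines() if l.strip()]
    if len(lines) != 3:
        return False
    targets = [5, 7, 5]
    return all(
        sum(count_syllables(w) for w in line.split()) == t
        for line, t in zip(lines, targets)
    )
```

If `is_575` returns False for a generated poem, the caller can discard it and emit a rules-based fallback haiku instead.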

yydscoder

Wtf is my model doing? It was training for around 4 hours and generated an extremely obscene haiku. I will need to retrain the model and put in guardrails against this type of content.

yydscoder

Tuning for Windows and making it easier for the user to download externally required dependencies for PyTorch, such as the Visual C++ redistributable for Windows.

yydscoder

Finally finished model training; now I will download it and proceed with local testing.
Local testing is done, but the model is incoherent at times and will need further refinement. Maybe I can integrate it with the hackai endpoint to refine it based on the local model's output.

yydscoder

Today was a hassle. Most of the work was done on Kaggle, where I have started the model training; it will HOPEFULLY be done by tomorrow. It's a shame that the configuration work on Kaggle does not count toward the time for this project.

yydscoder

Alright, I am starting this project from scratch. I am currently writing code and importing poetry datasets from Kaggle (sourced from the Poetry Foundation), with GPT-2 as my lightweight model. But I am a bit confused as to whether it would be better to commit my trained dataset to git or to host it on Hugging Face and connect via an API call.
