
MediScribe AI

14 devlogs
30h 23m 49s


MediScribe AI is an offline clinical examination expert system for medical students and junior doctors. They often forget the steps they have to perform in history taking, as it is a long procedure, and they mostly don't have access to the internet! This app gives a full patient flow covering history taking, systemic review, vitals with clinical interpretation, lab entry, guided physical examination across the cardiovascular, respiratory, abdominal, and neurological systems, a diagnosis screen, and SOAP note generation.
Made this for my medical friend, as she keeps saying how difficult it is to remember all the details!!!

This project uses AI

Used AI to build the knowledge-base JSON file, with follow-up questions and all the symptoms taken from an OCR of the Kundu bedside clinics book!

Demo Repository


Oliver

Tagged your project as well cooked!

🔥 Oliver marked your project as well cooked! As a prize for your nicely cooked project, look out for a bonus prize in the mail :)

aneezakiran07

Shipped this project!

Hours: 30.4
Cookies: 🍪 424
Multiplier: 13.95 cookies/hr

Please read this, I summarize my project below in the easiest words possible!!!
HI!!!
So MediScribe AI is a clinical examination assistant for medical students and junior doctors.
The idea is simple. When a student is at the bedside (the clinical term for a doctor taking a patient's readings at their bed), they forget steps. They do not know what to examine next, what question to ask the patient, or whether a finding is significant. So this app helps them with that.

The user enters the patient's details, history, vitals, and lab results. Then the app guides the user through a physical examination, with instructions pulled from an actual clinical textbook, Kundu's Bedside Clinics. It flags critical findings, injects follow-up questions based on what the user finds, and generates a SOAP note at the end. Everything saves locally using the Hive DB.

I built it in Flutter with Dart. Patient data is stored offline using Hive. The clinical reasoning engine runs from a JSON knowledge base I made from the Kundu book. I designed the architecture for the JSON, then fed an OCR of the textbook to AI, and it built me 2000+ lines of JSON with questions, symptoms, and diagnoses! I also checked several times whether the AI was building the JSON correctly! I couldn't have done that without AI, because the book has a lot of follow-up questions; if I had made it all myself, I would have spent 20 days on it :")
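To give a feel for what one knowledge-base entry might look like, here is a hypothetical sketch in Python (the app itself is Dart). The exact schema of knowledge_base.json isn't shown on this page; the field names phase_title, instruction, and tip come from a later devlog, while the id, options, and follow_ups fields are my own illustrative guesses:

```python
import json

# Hypothetical knowledge-base entry -- the real schema in knowledge_base.json
# may differ. Only phase_title / instruction / tip are confirmed field names.
SAMPLE_ENTRY = """
{
  "id": "cvs_pulse_rhythm",
  "system": "cardiovascular",
  "phase_title": "Palpation",
  "instruction": "Palpate the radial pulse for 30 seconds.",
  "tip": "An irregularly irregular rhythm suggests atrial fibrillation.",
  "options": ["Regular", "Irregularly irregular", "Regularly irregular"],
  "follow_ups": {"Irregularly irregular": ["cvs_af_followup"]}
}
"""

def load_question(raw: str) -> dict:
    """Parse one entry and check that the fields an engine would rely on exist."""
    q = json.loads(raw)
    for field in ("id", "phase_title", "instruction", "options"):
        if field not in q:
            raise ValueError(f"missing field: {field}")
    return q
```

A validator like this is one way to "confirm the AI is building the JSON correctly" mechanically rather than by eye.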

Also, I had two medical students test it. They both said this would have saved them so much stress during their ward postings, and I'm happy it will help someone. You will find similar apps on the Play Store too, but most of them are paid and don't give as comprehensive a history taking as my app does! You can check that out as well!

Thanks for testing out my project!!! If you don't want to download the APK, watch the demo in the latest devlog!!

aneezakiran07

Hi!
In this devlog I rebuilt the diagnosis engine in MediScribe AI.
The system worked before, but the results were kind of unreliable: confidence scores were wrong and way too low.

First, I removed a penalty bug. Normal findings were reducing scores for real diagnoses.
Second, I added support for required facts. Now a diagnosis only appears if the relevant sections have data.
Third, I fixed the score calculation. All diagnoses now use one clean formula, and scores stay between 1% and 95%.
Fourth, I fixed deselection issues. Old answers from removed follow-ups no longer affect results.
Fifth, I unified the logic. The SOAP and diagnosis screens now use the same scoring system.
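As a rough illustration of points two and three, the unified scoring rule could be sketched like this (a hypothetical Python sketch, not the app's actual Dart code; the match-ratio formula is my assumption, only the 1%-95% clamp and the required-facts gate come from the devlog):

```python
def diagnosis_score(matched: int, total: int, required_present: bool) -> float:
    """Sketch of the unified scoring rule: a diagnosis is hidden unless its
    required sections have data, normal findings never penalise it, and the
    confidence is clamped to the 1-95% band described above."""
    if not required_present or total == 0:
        return 0.0                        # diagnosis not shown at all
    raw = 100.0 * matched / total         # assumed match-ratio formula
    return max(1.0, min(95.0, raw))       # never 0% once shown, never a false 100%
```

The clamp is a nice touch: the engine never claims certainty it can't justify, and a barely-matching diagnosis still shows at 1% instead of vanishing.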

Please check the whole README and demo before voting. I know many of you won't get this project, but if you research it, it's really a cool thing to make :")

NOTE: I rushed a lot in the demo because FT doesn't allow long videos :")
you can watch full demo on this link:

https://drive.google.com/file/d/13VzI_spU9x2eqakv_oEsoGAXvFYAF1iM/view?usp=sharing

aneezakiran07

hi!!
In this devlog I did nothing much but fix a UI mishap!
The user could only save the records once he was on the last screen of history taking,
so I made a bottom draft widget that is shown on every screen of history taking and says "Save draft and exit" (meaning go back to the home screen).
This way the user won't have to fill in every step and can just save a draft!
Then I also made my code modular by adding a patient repository service!

Attachment
aneezakiran07

Offline Storage & Full Edit Flow
Hi! In this devlog I added offline storage and a complete edit flow to MediScribe AI.
I integrated Hive as the local database; every patient session is now saved as a single object containing all seven data layers: patient info, history, systemic review, vitals, labs, examination findings, and the generated SOAP note.
The Patient Records screen now reads live from Hive using a ValueListenableBuilder… so the list refreshes instantly after saving or deleting.
Tapping any patient record opens a full detail view showing everything entered across all screens. The edit flow now pre-fills every screen. Re-saving overwrites the original record cleanly.
The SOAP screen’s bottom button now saves and navigates home in one tap.
The next feature will be integrating neural network models to give a more precise diagnosis based on all the history taking.

aneezakiran07

HI!!
In this devlog, I made the settings screen. Here the user fills in his clinician details (Name, Designation, and Hospital). He can also enable dark mode, which will be implemented later. Moreover, it also shows the version and AI engine details of the app.
I did it using the SettingsService singleton and custom Flutter widgets like _SettingsCard and _Field.
Also, one thing I love about Flutter/Android dev is that we can just import Material Icons and make it work!

aneezakiran07

Hi!!!
So I built the patient_records_screen. For now it has dummy data, because we will integrate the Hive database later on.
It will hold all the patient records so far; the user can also delete, edit, or copy the patient details.
Moreover, I also made the navbar work, linked the patient records screen with the navbar, and cleaned up the dashboard by removing the buttons I added before but didn't actually need :")
Also, the navbar will not be shown while entering a patient record, because those are steps the user has to perform. What if, in between, he accidentally clicks the home screen? So :)

aneezakiran07

Hi!!
In this devlog, I made the final diagnosis screen that shows the diagnosis in a detailed manner. This diagnosis is derived from the physical examination done by the doctor and the symptoms entered!
Moreover, I made my code clean and modular: I separated the data models from the screens' Dart code and saved them in a models folder, making my code easier to extend in the future.
This way my code won't have to import big screens.dart files and can just import the data models. Now my architecture is pretty clean.

Attachment
aneezakiran07

HI!!!!
In this devlog, i made the SOAP screen
for context, SOAP is literally how doctors write clinical notes.
S = Subjective (what the patient tells you), O = Objective (what YOU find, vitals, labs, examination), A = Assessment (differential diagnosis), P = Plan (what to do next). every doctor writes these.

so what I built is: after you go through the ENTIRE flow (patient info → history → systemic review → vitals → labs → physical examination), the app auto-generates a full SOAP note from ALL of that data.
the S tab pulls the patient’s name, MR number, chief complaints with duration and severity, past medical history, drug history, allergies, family history, social history.
the O tab shows vitals with abnormal flags, and the full lab panel
the A tab is the differential diagnosis section
the P tab is the plan: investigations, treatment, follow-up. It's still in the making.
also everything is EDITABLE. so the doctor can tap any section and just fix whatever the AI got wrong.
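The assembly of the four tabs from the saved session layers could look roughly like this (a hypothetical Python sketch, not the app's actual Dart models; every field name here is illustrative):

```python
def build_soap(session: dict) -> dict:
    """Sketch: assemble the four SOAP tabs from the session's data layers.
    All keys ("patient", "mr", "complaints", ...) are assumed names."""
    return {
        "S": {  # Subjective: what the patient tells you
            "name": session["patient"]["name"],
            "mr_number": session["patient"]["mr"],
            "complaints": session["history"]["complaints"],
        },
        "O": {  # Objective: vitals with abnormal flags, plus the lab panel
            "vitals": session["vitals"],
            "abnormal_flags": [k for k, v in session["vitals"].items()
                               if v.get("abnormal")],
            "labs": session["labs"],
        },
        "A": session.get("differentials", []),  # Assessment
        "P": session.get("plan", []),           # Plan -- still in the making
    }
```

Because the note is plain data, making "everything editable" is then just a matter of binding each section to a text field.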

aneezakiran07

Lab Screen Integration
Hi, in this devlog I made the lab screen where the user will enter the lab data.

I made various lab panels in it: Complete Blood Count, Liver Function Tests, Renal Function Tests, Electrolytes, Glucose & HbA1c, Lipid Profile, Thyroid Function, Cardiac Enzymes, Coagulation Profile, Arterial Blood Gas, Urine Analysis, Iron Studies, and a free-text Cultures section.
I made it according to official WHO and standard clinical reference ranges for all the normal, warning, and critical values.
This data will be stored in Hive later on.

This was difficult to make because instead of hardcoding every panel I had to design a LabTest and LabPanel config system so the UI renders itself automatically from the data. Moreover, each panel is collapsible, there's a live abnormal counter in the app bar, and a save summary sheet lists all flagged values before proceeding.
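The data-driven idea can be sketched like this (a minimal Python sketch rather than the app's actual Dart widgets; the names LabTest/LabPanel come from the devlog, but the fields and the example reference ranges are illustrative assumptions):

```python
from dataclasses import dataclass

@dataclass
class LabTest:
    name: str
    unit: str
    normal: tuple  # (low, high), inclusive -- example range, not clinical advice

    def flag(self, value: float) -> str:
        low, high = self.normal
        return "normal" if low <= value <= high else "abnormal"

@dataclass
class LabPanel:
    title: str
    tests: list

# One panel defined purely as config -- adding a panel means adding data, not UI code.
CBC = LabPanel("Complete Blood Count", [
    LabTest("Hemoglobin", "g/dL", (12.0, 16.0)),
    LabTest("WBC", "x10^9/L", (4.0, 11.0)),
])

def render(panel: LabPanel, values: dict) -> list:
    """One generic renderer handles every panel -- no per-panel hardcoding."""
    return [f"{t.name}: {values[t.name]} {t.unit} [{t.flag(values[t.name])}]"
            for t in panel.tests if t.name in values]
```

The live abnormal counter then falls out for free: count the tests whose `flag()` isn't "normal".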

aneezakiran07

Patient Info Screen Feature
Hi
so in this devlog I made the patient_info screen that takes the patient information like age, gender, full name, date of birth, address, marital status, religion, date of admission, and mode of admission (OPD or Emergency). It also auto-generates a unique MR number for each patient.
I built it using Flutter with a Form widget. The date of birth uses three linked dropdowns (Day / Month / Year). There’s also an animated OPD / Emergency toggle pill.
In the future I will implement Hive DB in it so all patient data will be stored locally on the device.
I also spent time linking all the screens together so the final navigation flow is:
Home -> PatientInfo -> HistoryTaking (3 pages) -> SystemicHistory -> Vitals -> Examination

aneezakiran07

BUILT THE ENTIRE EXAMINATION ENGINE!!! (also reusing the code I wrote in the first devlog)
HI!!!
SO in this devlog, I built the physical examination screen. It was the most complex screen in the whole app :") Every question used to have its phase title, instruction, and clinical tip hardcoded in THREE separate giant const maps (like 32 entries each, 96 lines of pure hardcoding). So I deleted all of that and moved it into knowledge_base.json as fields directly on each question object (phase_title, instruction, tip). Now KBQuestion.fromJson() just reads them directly.
ALSO the constraint system was a whole thing: you cannot select "Tachycardia" AND "Bradycardia" at the same time (because, you know, physics). So I built a contradiction checker that rolls back any impossible selection and shows a red banner. There are 40+ mutually exclusive pairs across CVS, RESP, ABD, and NEURO. None of them were hard individually, but mapping ALL of them out at once was a little (NOT LITTLE) pain :"))
ALSO the rule engine injects follow-up questions dynamically, so if you select "Irregularly irregular rhythm" it unlocks an AF-specific question mid-flow. And the results sheet shows diagnoses with certainty percentages calculated from a scoring system.
I ALMOST gave up and considered just making it a static checklist :") Also, it's not perfect right now; I will spend more time on it :")
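The contradiction checker described above could be sketched like this (a hypothetical Python sketch rather than the app's actual Dart; the pair list shows just two of the 40+ mutually exclusive pairs):

```python
# Two example pairs -- the real app maps 40+ across CVS, RESP, ABD, and NEURO.
EXCLUSIVE_PAIRS = [
    ("Tachycardia", "Bradycardia"),
    ("Hypertension", "Hypotension"),
]

def toggle_finding(selected: set, finding: str):
    """Try to add a finding. If it contradicts a current selection, roll back
    (return the selection unchanged) plus a message for the red banner."""
    for a, b in EXCLUSIVE_PAIRS:
        other = b if finding == a else a if finding == b else None
        if other is not None and other in selected:
            return set(selected), f"Cannot select '{finding}' together with '{other}'"
    return set(selected) | {finding}, None
```

Checking on every toggle (rather than on submit) is what makes the rollback feel instant in the UI.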

aneezakiran07

HI!!!!
so before doing a physical examination, doctors actually have to record a TON of stuff first: vitals!! Blood pressure, pulse, temperature, respiratory rate, oxygen saturation, blood glucose, BMI. And each one has clinical ranges. Like, if BP is 145/90, that's not just a bit high; that's specifically Hypertension Stage 1, and the doctor needs to know that immediately.
so I built the vitals screen. As you type in any value, the app runs it through a clinical engine I wrote and flags it in real time. The card turns red with a little warning strip telling you exactly what's wrong: "Hypertensive Crisis, urgent evaluation required." That kind of thing.
it also auto-generates a flags list at the bottom. so if a patient comes in with high BP AND low SpO2 AND tachycardia, you see all three flagged together in one place before you even move to the next screen. those flags get passed forward through the whole app flow.
also added a custom vitals section for when doctors need to record something outside the standard list, things like CVP or ICP or ETCO2 that you’d see in ICU settings.
the full app flow is: patient info → medical history → systemic history → vitals → lab reports → physical examination → SOAP note → diagnosis (training a neural network for this part!!)
slowly getting there :)
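The real-time interpretation idea can be sketched for blood pressure like this (a hypothetical Python sketch, not the app's Dart engine; the thresholds roughly follow the common European grading and are illustrative only, with the app taking its actual ranges from its clinical knowledge base):

```python
def classify_bp(systolic: int, diastolic: int) -> str:
    """Sketch: map a BP reading to a clinical flag. Checks are ordered from
    most to least severe so the worst applicable label wins."""
    if systolic >= 180 or diastolic >= 110:
        return "Hypertensive Crisis - urgent evaluation required"
    if systolic >= 160 or diastolic >= 100:
        return "Hypertension Stage 2"
    if systolic >= 140 or diastolic >= 90:
        return "Hypertension Stage 1"
    if systolic < 90 or diastolic < 60:
        return "Hypotension"
    return "Normal"
```

Run the same kind of classifier on each vital as the user types, collect every non-"Normal" result, and you get the combined flags list at the bottom of the screen.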

aneezakiran07

HI!!!!
SO, I found out that before doing a physical examination, doctors first have to take the patient’s history. This includes asking about previous medications, past symptoms, current complaints, checking vitals, and also asking if there is any medical history in the family. Basically, the doctor first tries to understand the patient’s background before actually examining them.

Because of this, I designed the overall flow of my application around how a real clinical process works. The flow goes like this:
take patient history → take vitals → take lab reports → perform physical examination → generate SOAP → give diagnosis.

For the diagnosis part, I am planning to train a neural network model so that the system can suggest possible diagnoses based on all the collected information.

So in this devlog, I worked on building the UI for taking patient history. The process starts with the user entering the basic details of the patient, like name, age, gender, and other identification information. After that, the user can record the patient’s medical history, such as past illnesses, medications they are currently taking, and any relevant family medical history that might affect the diagnosis.

Next comes the systemic history section. In this part, the user records information related to different body systems, for example if the patient has any issues related to the respiratory system, cardiovascular system, digestive system, and so on. This helps doctors quickly understand which body system might be affected and gives better context before moving on to the physical examination stage.

So overall, this part of the application focuses on collecting structured patient history in a clear and organized way, which will later help the system generate a more accurate SOAP summary and diagnosis.

aneezakiran07

Physical Examination IPPA-style JSON building (CARDIOVASCULAR EXAMINATION ONLY) devlog
HI!!!
SO in this devlog, I implemented the JSON file with all the facts and rules!!! What it will do is match the facts against the rules and ask the user questions about what he is observing during the four types of medical examination, which are IPPA. So it will first ask about the general examination, then IPPA (inspection, palpation, percussion, and auscultation)!!
so my app will help medical students/doctors in taking the physical examination or history of the patient!! My friend is a medical student, and she said she often forgets what steps to perform in each examination, so my application will help remind her of all the questions/steps performed in the examination and then, in the end, will also give a diagnosis!
You can see it working for cardiovascular for now in the video attached!
Also, I used the medical book KUNDU'S BEDSIDE CLINICS, 4th EDITION, for building my JSON, so all the data used by my application comes from this book!

aneezakiran07

HI!!
So, this is my first devlog. I'm making an application for junior doctors and medical students: they can run it offline and find out what questions to ask the patients based on specific examinations, and in the end it also suggests what all these symptoms point to. Let's say breathlessness and high BP point to heart failure (it doesn't always, but it's a hypothesis lol).
This is all based on the Kundu bedside clinics book!!! SO in this 1h 50m devlog, all I did was figure out what our Dart/Flutter logic will be, how the UI will work, and what main features the user will see,
and figure out how to make our JSON with all the facts (defined terms/questions used in medical books).
SO, I figured it out and then made a minimal working UI for it. Right now it has no questions in it, as I will be adding them later using the JSON I will build. I will make a forward chaining inference engine (see the facts first and use them to lead towards the goal) that will look at our JSON and, based on each examination, suggest to the user what he should examine in the patient!!!
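A forward chaining engine is simple at its core: start from the observed facts, fire any rule whose conditions are all satisfied, add its conclusion as a new fact, and repeat until nothing new fires. A toy Python sketch (the rules here are invented examples, not entries from the real knowledge base):

```python
# Each rule: (set of required facts, fact/action it concludes). Invented examples.
RULES = [
    ({"breathlessness", "high_bp"}, "suspect_heart_failure"),
    ({"suspect_heart_failure"}, "ask_about_orthopnea"),
]

def forward_chain(facts: set) -> set:
    """Repeatedly fire satisfied rules until the fact set stops growing."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in RULES:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts
```

Note how the second rule chains off the first one's conclusion: one observation can unlock a whole line of follow-up questions, which is exactly the bedside behaviour described above.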

NOTE: I will be providing the examinations in IPPA medical styling!!!
I am not a medical student, but I will make sure all of the stuff I make is accurate. I will only use the Kundu book for this and make it work!!! Also making this for a medical friend who needs this :"))

Attachment