Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- `ytc_UgwZwhkCV…` — 8:30 " I think there will be only 20 or 30 companies around the world building A…
- `ytc_Ugw2pvFpz…` — He's saying a person using AI as tool is gonna replace you ... That simply means…
- `ytc_Ugyzrnpa8…` — Thank you for such a clear, concise, impassioned video. AI is so new, many of th…
- `ytc_Ugzhxpe9w…` — It's going to be like...all people will together fight against these ceos...ceos…
- `ytc_UgyGyaj_y…` — AI is getting better and better. I knew something was off and was questioning if…
- `ytc_UgyvnvUo4…` — SORRY BUT WHY'S THE ROBOT WHO SIDE EYE IN THE FIRST CLIP LOOK LIKE DOJA CAT 😓😓😓…
- `ytc_UgyVG-74W…` — At this rate we should destroy these general intelligence AI’s now before this g…
- `ytc_UgweDRuAZ…` — So we are already in Stage 4 - Next is Stage 5 : AGI -someone suggested coding "…
Comment

> The wrong way round - you build in watertight safeguards first. Idiots. AI is a locked-in mentality being force-fed vast amounts of info with all the craziness you can expect from that. Already legal AIs are making up laws in order to win cases as they are programmed to win & make money. They don't care how because stupid humans didn't set the terms of the deal to start, just monetise, monetise. The most successful humans also tend to be psychopaths - how can you expect anything better from machine intelligence? You trust anyone or anything with that much power at your peril.

Source: youtube · AI Moral Status · 2025-06-05T13:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxlVzHIh-JM09L8lXJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw72PxZSOYkc38z0DV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwfAr9KCEfeewHb_DF4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugwr5QWImYbpAolGmrh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_UgxbTtpAs60UMHCTF7p4AaABAg","responsibility":"government","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxlYKASODPJZZ00e7l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_Ugy9KzLh_WvQXbxwYTx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwrcuE_PrDCCZazRd94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzYVohzwdk55pWx5v94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzJJJG6fY7Uu0K_jDZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"mixed"}
]
```
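A raw response like the one above is only usable downstream if every record carries the four coding dimensions with expected values. The sketch below shows one way to validate a batch; it is a minimal illustration, and the allowed-value sets are assumptions inferred solely from the values observed in this sample, not the project's actual codebook.

```python
import json

# Values observed in this sample's output; the real codebook may define more.
OBSERVED = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    return a list of problems: any record whose dimension is missing
    or holds a value outside the observed set."""
    problems = []
    for rec in json.loads(raw):
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                problems.append({"id": rec.get("id"), "dimension": dim, "value": rec.get(dim)})
    return problems
```

Running this over the response above returns an empty list; a record with a typo such as `"emotion": "anger"` would be flagged so the comment can be re-coded rather than silently dropped.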