Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "How is knowing this fact going to change anything. It simply won't be an issue t…" (ytr_Ugx03nrx4…)
- "People who want ai in their business will destroy mankind. Everyone will be ok w…" (ytc_UgyLSxZRY…)
- "This is worrying once the AI becomes advance and are able to get into the Intern…" (ytc_UgySXlMJT…)
- "(oh you drop a pin can i have it?) Robot: lalalalalalalalalalallalalalalalallala…" (ytc_UgwSLdFyt…)
- "A left-leaning fella with some contributions in recent AI development talking BS…" (ytc_UgxZLSp4W…)
- "@Likid_sec he paid, withdrew as much cash as possible over a CEO problem, and red…" [translated from French] (ytr_Ugz35DKV4…)
- "I tried to get ChatGPT to review my book. I gave it a chapter and it said "found…" (ytc_UgyuNf-p7…)
- "At the end of my 40+ years writing embedded software as the Ernest Hemmingway of…" (ytc_UgyokIkWf…)
Comment
We still require huge amounts of energy for something that can barely spell words. We are nowhere near super intelligence, programmers have only created a system that organizes knowledge made by humans. These are not organisms, they are logic structures made by humans. Mechahitler among other hallucinations were created because behind the scenes it was told to prioritize things and reject others, creating false biases. AI has existed way before it was hype, it has only exploded recently because the elite in power and rich companies think it can replace all their workers, which it will not.
youtube · AI Moral Status · 2025-11-04T01:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzdD362N-69jb_GqO54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwFKzdZ6IS3bSjeDGB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwK8vNHvAAC4qgyPZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyi9ZyCrLQY6-3cWCF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzHDlDtpu7Dv0PEtkx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugz8TKA8OgiK9y0qax14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMJI7gRBEnkFgn6JB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwcNk_cuVklAe_4VVp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGrgrKNaUKIJiZ74l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwUMsFWYfQOUsLfRIB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
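The raw response above is a batched JSON array, one coding per comment ID. A minimal sketch of how such a payload could be parsed and validated in Python is shown below; the `ALLOWED` value sets are inferred only from the values visible in this page, and `parse_codings` plus the sample IDs `ytc_AAA`/`ytc_BBB` are hypothetical names, not part of the original tool.

```python
import json

# Allowed values per dimension, inferred from the codings visible above.
# The tool's full codebook may define more categories.
ALLOWED = {
    "responsibility": {"none", "developer", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"approval", "resignation", "indifference", "outrage", "fear", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a batched LLM response into {comment_id: coding},
    dropping rows with values outside the allowed sets."""
    codings = {}
    for row in json.loads(raw):
        values = {k: row[k] for k in ALLOWED}
        if all(values[k] in ALLOWED[k] for k in ALLOWED):
            codings[row["id"]] = values
    return codings

# Hypothetical two-row response: one valid coding, one with an
# out-of-codebook responsibility value that gets filtered out.
raw = '''[
  {"id":"ytc_AAA","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_BBB","responsibility":"alien","reasoning":"mixed","policy":"ban","emotion":"fear"}
]'''
codings = parse_codings(raw)
print(codings["ytc_AAA"]["reasoning"])  # deontological
print("ytc_BBB" in codings)             # False
```

Keying the result by comment ID is what makes the "inspect the exact model output for any coded comment" lookup above a constant-time operation.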