Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgyO4GXei…`: Looks like "Self-Driving Cars" equals Waymo. In a growing, improving industry, …
- `ytc_UgzIBDNvw…`: I think it’s cool and has some neat potential uses / But in its current incarnati…
- `ytc_Ugz99W3Gc…`: Idk man I feel like as long as you don’t market it as real it doesn’t matter if …
- `ytc_UgxVkWKId…`: She really said :” WHERES MY BOX 😡” / The worker : “yo chill i’ll got some” / she : …
- `ytc_Ugy2Pp1OJ…`: BUB…..HUMANS WILL NEVER TRAVEL TO ANOTHER WORLD!!! WE’RE TOO FRAIL!! BUT…..WE SH…
- `rdc_nuflr1p`: It’s another incredibly shortsighted plan though. Say that the AI answer is awes…
- `ytr_UgwZjXa39…`: This is what I call full denial. AI will wipe out more than half of the digital-…
- `ytr_UgzIuA0Xy…`: @rodjacksonx Yeah, i don't like censorship in any form and i don't want an AI on…
Comment (youtube · AI Moral Status · 2025-07-25T19:3…)

> First stop the wars, al weapons and bullets must be disappear from planet earth-when there are so many problems AI will deep the problems and not help.........People will become dissocialized, fat and stupid (just check how kids in schools are already using AI to solve assignments and exam tests - all knowledge is written in books and not on the internet) all in one fell swoop - all in Alian intelligence...ops Artificial intelligence...Tokens bla bla bla when war starts first thing they will disconnect the internet...AHA moment (Internet was invented for military purposes)
Coding Result
| Field | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxbzFzUcLviNTFmK3V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx2vLP3y4OOoOaNPah4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"skepticism"},
{"id":"ytc_UgxhoR5UHK6THIMTaSF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZhKhJ4iVVt2hAfQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxgQLOVOtt98fyt7lR4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwWxPJKhJVI0MzAic94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzDbssCuyZV4i4D89N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgzD6hIZA8zXmdwa9gN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxWmftk2HugRn7na3t4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxGm-uZtJ_u4Pe3TGp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
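The Coding Result table above is derived from a raw JSON batch like the one shown. As a minimal sketch, the response could be parsed and sanity-checked against per-dimension label vocabularies; the sets below contain only the labels observed in this sample, and the real codebook may define additional values (assumption):

```python
import json

# Allowed labels per coding dimension, as observed in the sample response
# above. This is an assumption: the full codebook may include more labels.
ALLOWED = {
    "responsibility": {"none", "company", "government", "developer",
                       "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological",
                  "contractualist", "virtue"},
    "policy": {"none", "unclear", "liability", "regulate", "ban"},
    "emotion": {"indifference", "skepticism", "outrage", "fear",
                "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records that carry
    a comment id and in-vocabulary labels for every dimension."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not rec.get("id"):
            continue  # drop records the model emitted without an id
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items()):
            valid.append(rec)
    return valid

# Hypothetical single-record batch for illustration:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"fear"}]')
print(validate_batch(raw))
```

Records with out-of-vocabulary labels are dropped rather than corrected here; a production coder might instead route them back to the model for re-coding.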