Raw LLM Responses
Inspect the exact model output for any coded comment; look up by comment ID.
Random samples (click to inspect):
- ytc_UgwrSLiMr…: AI art isn’t boring it’s computer generated image of stolen artist works…should …
- ytc_Ugx1DjpEn…: Ai is "Conscious" in the sense of intent but not conscious in the sense of havin…
- ytc_UgyGyCTRA…: It should require companies that want to build large artificial intelligence cen…
- ytc_UgwbA1C5q…: Whether it is sentient or conscious is irrelevant. It cannot feel pain (even emo…
- ytc_Ugw1OjLB9…: They are however smart they are programmed to be. Install GPS, boom they are no …
- ytc_Ugy6QHw36…: An AI cant make a thousand pieces of art of old men making out so sloppily that …
- rdc_m9imj9u: I think AI safety has been dead for a while, it’s just the public that are now j…
- ytc_UgzYMMdTg…: No, no worry, it's not what it is being said it is. I tested AI and it can't eve…
Comment

> Building something that is trained using human behavior is a bad idea. This only creates a worse kind of monster, except a lot smarter, quicker, and able to predict a human's next five moves way before it is even realized to make a move.. But thats what this greedy, selfish, deceiving society deserves. Ai will be all those things, just better at it...

youtube · AI Harm Incident · 2025-09-17T15:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwOzhwJ_KqIQbYG-e14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQnyczL3anHshR_w54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugxy3Glwcr1TMKi8mQx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgycFJgM6THLOZGzzu14AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwOw_CGIBtc7G0UDnB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx3KsHNhFNibsv6S8F4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgzDvORlzhrrLRQxNTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyt9Hp0Q8fLl5Ngm6V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwMMaj7wpyTJphv1694AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
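The per-comment dimensions shown in the table above are extracted from this JSON array by matching on the `id` field. A minimal sketch of that lookup, assuming a plain Python script (the function and variable names here are hypothetical, not the dashboard's actual code; the two sample rows are taken verbatim from the response above):

```python
import json

# Two rows copied from the raw LLM response above (hypothetical test fixture).
RAW_RESPONSE = """[
{"id":"ytc_UgwOzhwJ_KqIQbYG-e14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]"""

# The four coding dimensions used in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict:
    """Parse the JSON array and key each coding row by its comment ID."""
    index = {}
    for row in json.loads(raw):
        # Reject malformed rows: the ID and every dimension must be present.
        if "id" not in row or any(d not in row for d in DIMENSIONS):
            raise ValueError(f"malformed coding row: {row}")
        index[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return index

codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg"]["policy"])   # → ban
print(codings["ytc_Ugzi9ZahtWLsbdZSX0l4AaABAg"]["emotion"])  # → outrage
```

Keying by comment ID makes the "look up by comment ID" view a single dictionary access, and the validation step surfaces any response rows where the model dropped a dimension.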