Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect:

- "I love how everyone comes out about how dangerous ai is after they've worked on …" (ytc_Ugygqjz7Z…)
- "Plot twist...the deepfake is so good, we cannot see they are all wearing facemas…" (ytr_UgyAEsiUT…)
- "I havent checked but if you ask ai itself if ai art has a soul. Its gonna say it…" (ytc_UgyrM4j0f…)
- "I disagree with anyone who states that ai gave them the D&D character they alwa…" (ytc_UgxFYj1QT…)
- "Dont try to convince me. I know better. One human cell, and AI is conscious. You…" (ytc_UgwBxxHp8…)
- "You’re gonna let a self driving megalomaniac designed by purple haired and blue …" (ytc_Ugyz0N5Rx…)
- "I second this. A lot of AI stands have no idea how any of the AI actually works.…" (ytc_UgxX1Z-IY…)
- "if someone saw my ai chats well i would be probably be dead by now💀…" (ytc_Ugz8Bmt0L…)
Comment
We need laws that force self driving companies to forfeit their business to accident victims at the first injury accident. This would insure better safety rather than the company weighing safety costs against damage liability. Anything less will produce accident victims as companies exercise cost avoidance when safety costs more. This is why Tesla has had so many self driving accidents.
Source: youtube · AI Jobs · 2025-09-11T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugxu6OJgVQ9MRi95P3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgwtOn2TbJf7pUZXJUB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzFPhfHLRLhHFVFS8d4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx4AfEiuw9tR3-cADp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwgCtgFyHv_K_HvJN14AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgxJIkyjI4rUpORif9x4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxpLmLhwHL5Mcf6PAR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx7Km5WTnVAsf9s0jt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0noKyPsRMb_WgDf54AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"resignation"}
]
```
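The raw response is a JSON array with one coding record per comment, keyed by comment ID. A minimal sketch of how such an array could be parsed and indexed for the lookup-by-ID view above (field names are taken from the sample response; the shortened sample data here is illustrative):

```python
import json

# Two records in the same shape as the raw LLM response above
# (illustrative subset; full IDs copied from the sample output).
raw_response = """
[
  {"id": "ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_Ugxu6OJgVQ9MRi95P3V4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_Ugz-su48OGZiP_5Wm0Z4AaABAg"]["policy"])  # liability
```

Indexing once and looking up by ID keeps the inspector's "look up by comment ID" operation O(1) per query rather than a linear scan of the array.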