Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up any coding by its comment ID. Random samples:
- @thewannabecritic7490 All these analogies are logical fallacy. All these same ar… (`ytr_UgymsRXhB…`)
- Generally, AI companies just put an algorithm somewhere, like twitter, bluesky o… (`ytr_UgxnnmUha…`)
- 0/10 ragebait / art isn't about being easy or taking less time but ai puppets may… (`ytr_Ugxl2po9l…`)
- Elon says AI should be ultimately truth seeking. He's the one that changed grok … (`ytc_Ugx_qYgjV…`)
- It sounds like you're feeling optimistic about your future! Just like Sophia in … (`ytr_Ugy_cqkxe…`)
- Robot all the humans are going to see how accurate I am with this machine gun… (`ytc_UgxnhTDM2…`)
- i was chatting in spicy with my fictional crush on character ai and my friend ga… (`ytc_UgyiHzFkz…`)
- "Imagination". Entering words to make a machine pop out a picture ISN'T animatio… (`ytr_UgztfQU5p…`)
Comment

> Stupid humans just want to show off and to make money by making AI robots but they don’t care about the future generations on this earth. They want to base everything on science, but they don’t even look at the Bible when God already predicted the future. Most humans are like dummies compared to future robots. The dummy humans create them for them to dominate and destroy humanity. Some dumb guy said “all you have to do is turn off the switch “ but they’re going to be so smart , so much smarter than the smartest humans on this planet, because they have all the information saved and programmed into their computer (AI) , they’re not going to allow you to turn off the which, what are you gonna do about it????

Source: youtube · AI Governance · 2023-08-22T07:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw0QDSJlRQIerImH8d4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxusdu9kf0-l1oqNV94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxcIa9NlAucx5Js-HZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGfn_d2GVUunCOlZJ4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyahnMx8ymoVRZpDfR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyPFwM-mrgBjO7SYap4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugxh2jfme18ztVbDRb94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwQbGfIlv2poUQ-3Kh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyOYEOjr31ISR8kLOd4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyPAZjxiO7gBrTvYD94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
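The look-up step above (raw batch JSON → per-comment coding) can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the function name `index_by_id` is hypothetical, and the allowed values per dimension are inferred only from the codes visible in this sample output, so the sets may be incomplete.

```python
import json

# A one-record excerpt of a raw LLM batch response in the format shown above.
raw_response = """[
  {"id": "ytc_UgyahnMx8ymoVRZpDfR4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Allowed values per dimension, inferred from the sample output above
# (assumption: these sets may be incomplete).
DIMENSIONS = {
    "responsibility": {"developer", "ai_itself", "government", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "liability", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def index_by_id(raw: str) -> dict:
    """Parse a raw batch response and index the codings by comment ID,
    rejecting any record whose dimension value is outside the known codes."""
    coded = {}
    for rec in json.loads(raw):
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in DIMENSIONS}
    return coded

coded = index_by_id(raw_response)
print(coded["ytc_UgyahnMx8ymoVRZpDfR4AaABAg"]["policy"])  # regulate
```

Indexing by the `id` field is what lets the comment shown above be matched to its row in the Coding Result table even though the model returns codings for a whole batch at once.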