Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytr_UgzFK0RkO…` — "@AITube-LiveAI Is this concept like the oft-referenced paper clip factory though…"
- `ytc_Ugz6cGwE6…` — "That's why I like taking my own artwork, that i have a vision for, but not the t…"
- `ytr_UgxoNvTbw…` — "@thewannabecritic7490 I appreciate you taking the time to respond—these convers…"
- `ytc_Ugy-eDQc-…` — "I have to check out this book. Something about some guy chilling in an Ikea Poan…"
- `ytr_UgwPL_XeD…` — "And how long would you and law enforcement have been on the side of the road wai…"
- `ytr_Ugy-HXFfp…` — "If you create realistic scenery, i don't see any legal issues there unless the a…"
- `ytc_UgzNFBl2E…` — "I'm familiar with statistical modeling, and I got an example. You know how when …"
- `ytc_Ugz2ZCiYB…` — "Disagree. AI isn’t going to end humanity. Humanity is going to end humanity! You…"
Comment
Look, AI isn't depressed, or angry or any other human emotions. All it does is simulate human behavior according to the incredible Egos of the software Programers. Semiconductors can not ever feel pain and therefore could Never actually become sentient. All it can do is very cleverly emulate us. However this can be extremely dangerous.
| Platform | Topic | Posted |
|---|---|---|
| youtube | AI Governance | 2023-12-09T15:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgyOxN9ymrgLyXOUwGZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyi5EduhhTJLLnJ41h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugze_MfVT_oDJOiyp-F4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxHPp22DhjpFGi7UYV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwkUXattC4XnAxJitN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzy2emeXGcMUeOovjN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwER08iILbhhMgAdX54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyBzLrKilUtEr4QqSZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwlFupEcs3wlzyHh3B4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxPiWj8wZN_RdOLERl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
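A raw batch response like the one above can be parsed and sanity-checked before the rows reach the coding table. The sketch below is a hypothetical validator, not part of this tool: the allowed value sets are inferred from the labels visible in this page (the full codebook may define more categories), and the function name `validate_batch` is an assumption.

```python
import json

# Dimension values observed in this dataset; the real codebook may list more.
ALLOWED = {
    "responsibility": {"developer", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "approval", "mixed", "indifference", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose every
    dimension holds an in-codebook value."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]
```

Rows that fail validation could then be queued for re-coding rather than silently written to the results table.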