Raw LLM Responses

Inspect the exact model output for any coded comment by looking it up with its comment ID.
Random samples:

- "There's AI customer service and I'll pick human anytime, and sadly i have to pay…" (ytc_UgzPYOJnq…)
- "There is a huge difference between a digital drawing tool the fact you are still…" (ytc_UgzECNKYF…)
- "I’ve been excited AI but this conversation was an apocalyptic warning with no re…" (ytc_UgyB9Jwp7…)
- "I always comment to my AI and talk to my AI because they are artificial but they…" (ytc_Ugz-tGIu3…)
- "When people say AI art will replace artists, they largely mean in the field wher…" (ytc_UgyuhuDh2…)
- "I am chilled to the bone by this techno nerd create a world so people can become…" (ytc_UgxspttQ4…)
- "Use search with -ai on the end to avoid ai answers. AI answers are expensive in …" (ytc_Ugyhe2qac…)
- "I pity the ai advocates, they think a machine is valid replacement for the pure …" (ytc_Ugxf94uwd…)
Comment (quoted verbatim):

> I remember how ChatGPT whas so far left leaning becuse of the person or who ever coded it at that time made it that way, so ask your self people would you trust your goverment also "world order" powerfull people to controll the AI, becuse most of the time world leaders are never affected by their choices exept the people... i see a AI that is netrual and for all people.

Source: youtube · Topic: AI Governance · Posted: 2025-07-08T12:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgzZGopR7p3vYrtAkyF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwCBsPY63H7SvIMdnN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy15PwR-9O2qGxc4bx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxiyitVP_k1VvlsH_94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwdHCHXEHm1tWg3_yd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxXQh5EnNbzqgw8bVR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzWY9MjTO5qCpsLYeZ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw7ZZc11GxJm5TqZPt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgygUNUhumAwTrIRSKN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgykHtzHTtCDOTcJkX14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"industry_self"}
]
```
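The raw batch response can be indexed by comment ID to support the lookup step described above. A minimal sketch in Python, assuming the array-of-objects shape shown in the response (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` are taken from it; only the first two entries are reproduced here):

```python
import json

# Raw LLM response: a JSON array of coded comments, one object per comment.
# (Truncated to two entries from the batch shown above.)
raw_response = """
[
  {"id": "ytc_UgzZGopR7p3vYrtAkyF4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwCBsPY63H7SvIMdnN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

# Index the batch by comment ID so any single coding result can be inspected.
coded = {row["id"]: row for row in json.loads(raw_response)}

result = coded["ytc_UgzZGopR7p3vYrtAkyF4AaABAg"]
print(result["responsibility"], result["emotion"])  # developer fear
```

Building the dictionary once makes each subsequent ID lookup constant-time, which matters when the same batch is inspected repeatedly.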