Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

| Comment excerpt | ID |
|---|---|
| Uh several problems here. One, if someone takes a reference image and generates … | ytc_Ugw8zZ9qz… |
| Can anyone tell me who the f**k is buy any goods and services if everyone has no… | ytc_Ugxtnwx14… |
| @llmcoder What did you want to hire a jr for? What tasks would you give them?… | ytr_Ugz7Kt2T_… |
| So, what you’re saying is this AI is like the AI on Person of Interest in its ab… | ytc_UgwQRokMN… |
| Please tell the robot to draw some creative art or tell a story. i want to know… | ytc_UgxJ1vC3t… |
| Guys guys people used to say ai can't copy creativity it just for tec and cal … | ytc_UgzHQAhfG… |
| Nah that robot had enough he was like BRO I CANT ANYMORE. DESTROY THIS DUDE… | ytc_UgzG3mDr1… |
| Per million miles driven, autonomous vehicles are more dangerous than traditiona… | ytr_UgxR8D-pM… |
Comment

> The problem in most companies is the CEO mindset. These people are not engineers or have any level of ethics and only see profit and getting there first as the primary goal. That is in itself will lead to our destruction. As we progress AI is just one area to worry about as we solve more complex problems particularly in physics. Governments are attracting crackpots as a result of the lack of control of social media. As the clip states the US government became concerned -- I wouldn't have a lot of faith in the present US government when it comes to making rational decisions. They seem to have an obsession about China even if their technology leads to our destruction first. They also seem anti-science.

youtube · AI Governance · 2025-08-03T11:4… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxsCVjOVllqD-BRJh94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyOp2la_E8qr9kMbJV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugx2on1mYPQLvhcHTNZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwVZRor-_DVoJ3RR1V4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxw_0KKCrt9xoCGJNB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw-9myUbulr5PZTyeN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzxKnKLjd40kq9A3-B4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugw12C61kiuvOzffFux4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxqeo45IOkdX-wVHtN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzkt7IdhYRjZe56oFp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
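A response in this shape can be parsed and validated before the codings are stored. The sketch below is a minimal example, not the pipeline's actual code: the per-dimension vocabularies are inferred from the values visible on this page (the real codebook may define more categories), and the function name and sample ID are hypothetical.

```python
import json

# Allowed values per coding dimension, inferred from the dashboard output
# above; the actual codebook may define additional categories (assumption).
SCHEMA = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "approval", "mixed"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse a raw LLM batch response into {comment_id: coding}.

    Rows without an id, or with a value outside the known vocabulary
    for any dimension, are silently dropped.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            coded[cid] = {dim: row[dim] for dim in SCHEMA}
    return coded

# Hypothetical one-row response, mirroring the format above.
raw = ('[{"id":"ytc_abc","responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
print(parse_llm_response(raw)["ytc_abc"]["policy"])  # → regulate
```

Dropping malformed rows (rather than raising) keeps one bad generation from failing a whole batch; the dropped IDs can then be re-queued for recoding.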