Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "The person behind the wheel was reportedly a "human safety driver" who was PAID …" (ytc_UgzQJpD_S…)
- "@fogone1 because it replaces the printed word of newspapers in how people get t…" (ytr_Ugy9vRS-N…)
- "People wonder why we never met aliens. Perhaps it's because of a great filter ca…" (ytc_UgzWlH6wR…)
- "No one gonna mention that Korea has had test like this for a few weeks now? and …" (rdc_fjztlap)
- "There is an obvious difference between human made creativity and a million AI it…" (ytc_UgzWKvGIx…)
- "I mean let's be real. It's the "HiGhly EduCaTed" people with "DegReEEs" calling …" (ytc_UgxiW2Luw…)
- "*Automated - not autonomous. Autonomy, in actual sense, requires decision-making…" (ytc_UgxXAQENE…)
- "AI 2 years ago compared to today is like when we first heard about the 2 weeks o…" (ytc_UgyLQhb3D…)
Comment
The differences between this tech and nuclear power are:
1. Nuclear bombs don't have a brain, they can't decide what do to with themselves
2. AI is not as destructive as a nuclear bomb, but rather disruptive, causing changes in the basements of civilization and culture, as opposed to sheer direct destruction, but:
3. AI could manipulate and trigger nuclear bombs if it decides to do so
youtube · AI Governance · 2023-05-10T16:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugwt_9L9s1cMQUynBS54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UgyrnILZIDheb5StTxZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwirKaD5DeF4wOgyix4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgznWHPo76axNk8mq3Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx7HVGwgSwBNqF2vg54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKvDsRLjYiqlACyCl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgydmufFqaE_TWOpTjJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugy03Q6WrXntB6U3Jvh4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyuS5CJK6qFISPm8wZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugw526q5I6bZRoqu6px4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
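The raw response above is a JSON array of per-comment codes, one object per comment, which the coding-result table renders for a single ID. A minimal sketch of how such a batch could be parsed and validated is below; the function name `parse_batch` is hypothetical, and the allowed vocabularies are inferred only from the values visible on this page, so the real codebook may define more categories.

```python
import json

# Allowed values per dimension, inferred from the responses shown above.
# This is an assumption: the actual codebook may include other categories.
ALLOWED = {
    "responsibility": {"ai_itself", "company", "government", "user", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "indifference",
                "resignation", "mixed", "unclear"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded rows by comment ID.

    Raises ValueError on malformed JSON or out-of-vocabulary codes, so a
    bad model response fails loudly instead of silently entering the data.
    """
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        cid = row["id"]
        for dim, vocab in ALLOWED.items():
            if row.get(dim) not in vocab:
                raise ValueError(f"{cid}: bad {dim!r} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded
```

Indexing by ID this way makes the "Look up by comment ID" display a simple dictionary access, e.g. `parse_batch(raw)["ytc_Ugx7HVGwgSwBNqF2vg54AaABAg"]["emotion"]` would return `"fear"` for the batch shown above.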