Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- ytc_Ugz_KhxKh…: See you seem to forget a big thing here Charlie... nobody says its being sold. N…
- ytc_Ugykf89Wk…: "AI: You're not sick enough / ME: Buut I'm just as sick as your other patient / AI: …"
- ytr_UgxpHqg1Z…: @L2-Finale Regardless, the point of that part of the video was about the AI bein…
- ytc_UgzRXhPdw…: This is narrow minded. The worse AI right now is not Chat GPT, take a guess.…
- ytc_UgzooNi5N…: The fact that the NYT could force ChatGPT to generate complete paragraphs verbat…
- ytr_Ugyf4Ukhn…: AI might be a better Doctors and lawyers as well or anyother knowledge bassed jo…
- ytc_UgwI6wwYc…: No cause the fact that the art was actually good… but it’s sad. It’s done with A…
- ytc_Ugyvwqi2a…: Liberals want them to have a gender switch class and read some sex book instead.…
Comment
Soldiers, not matter how smart or not, are still primates. An advanced automated combat system, made at this level of technology we have now, wouldn't be even a cockroach. It simply wouldn't be efficient - letting something like that in an urban area - well, you better carpet bomb the place. For reaction, you need processing power. The dumbest soldier you'll find will still have several billions of calculations extra ahead your gaming PC.
youtube
2012-11-24T00:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwiBO59xpLPCkecBqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy_uEjQogI-wb-bmPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx7nuMjJl6i0N6vTmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugys3yBMDKkxZIDT7cd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgxNKwh_r49bvXQyVH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxH2eMZsO_x_AjYfy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxsRrYrWDDcgzPrJQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwcPYYDaK_--Y12hiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxXAQENEamBTwM_Aht4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugykgw-hR28QCN8z1ep4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
```
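The lookup-by-comment-ID flow above can be sketched in a few lines of Python: parse the raw batch response, index each coding record by its `id`, and pull out the dimensions for one comment. This is a minimal sketch, not the tool's actual implementation; the `raw_response` string and the ID used for the lookup are taken from the batch shown above.

```python
import json

# Two records copied from the raw LLM batch response above
raw_response = """[
  {"id":"ytc_UgwiBO59xpLPCkecBqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugys3yBMDKkxZIDT7cd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]"""

# Index each coding record by comment ID for O(1) lookup
records = {row["id"]: row for row in json.loads(raw_response)}

# Look up the coded dimensions for a single comment
coding = records["ytc_Ugys3yBMDKkxZIDT7cd4AaABAg"]
print(coding["reasoning"], coding["emotion"])  # deontological approval
```

Indexing by `id` matters because the model returns codings in batch order, which need not match the order comments were submitted in.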