Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "These are the dumbest people in the world, man what if that robot say to you is …" (ytc_UgzNJics6…)
- "Any computer program cannot be “forced” to answer what you want it to. The guy w…" (ytc_UgyWEp7-u…)
- "@chriscurry2496I’m just gonna say this. The way you word yourself and even the e…" (ytr_UgzGc2rCt…)
- "Wont the world turn communist the moment Ai replaces most of the jobs? Like, nob…" (ytc_UgxITLe0R…)
- "Even I’m smart enough to not invent something that can kill me. Turn that crap o…" (ytc_UgwdErKH9…)
- "if deepseek "stole" from OpenAI, then OpenAI and every other company ever also "…" (rdc_m9h20x2)
- "1. what distance did the self driving car allow between the truck and vehicle? …" (ytc_UgiXr1C50…)
- "Imagine waking up to find out an ai said you would commit a crime and then you g…" (ytc_UgwZY1WN5…)
Comment
Instead of using real humans, real equipment, and real money, we could create a sort of virtual AI world where every nation has the same part of the world and the same amount of equipment as in real life. So every consequence in the real world affects the virtual world, and likewise the virtual world affects the real world. Ex. Ukraine and Russia: they would not kill people and civilians because they are fighting in the virtual world, but if someone gains a territory, in real life they would gain that specific land.
I tried to explain myself, but it is easier to think it than express it.
Source: youtube · 2025-02-15T20:2… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgwD5BconxlaZrw_tDt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxZ8N_o43LFIREyFeF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyhCApX07IRqIa2mH94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKA1efE_RoY2OklRF4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyWjzPg-gnICOX8ejl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzYk7tlTPx8TlgwyYd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyF1D8gHD4ECTepa654AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugzj7Gaov4zBVWhwoRJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_Ugziql6DiZx3rGbUuV54AaABAg","responsibility":"government","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgyDaqWjD2iDrwKCKfR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
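The raw response above is a flat JSON array of per-comment codings, which is the shape the "Look up by comment ID" view needs. A minimal sketch of parsing and indexing such a response, assuming only the field names visible in the output above (the `index_codings` helper is hypothetical, not part of the tool; `raw` reuses the first and last entries shown):

```python
import json

# Raw model output: a JSON array of per-comment codings
# (first and last entries from the response above).
raw = '''[
{"id":"ytc_UgwD5BconxlaZrw_tDt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyDaqWjD2iDrwKCKfR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions plus the comment ID, as seen in the output above.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codings(raw_json: str) -> dict:
    """Parse a raw response and index codings by comment ID,
    skipping any entry that is missing a required field."""
    codings = {}
    for entry in json.loads(raw_json):
        if REQUIRED_FIELDS <= entry.keys():  # all required keys present
            codings[entry["id"]] = {k: entry[k] for k in REQUIRED_FIELDS - {"id"}}
    return codings

by_id = index_codings(raw)
print(by_id["ytc_UgyDaqWjD2iDrwKCKfR4AaABAg"]["emotion"])  # approval
```

Validating entries before indexing matters here because model output is not guaranteed to follow the schema; a malformed entry is dropped rather than crashing the lookup.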