Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzIgKVRu…: "Instead of automating things like taxes, or data computation, or making tools to…"
- ytc_UgzNMhImp…: "Well yeah, AI is a very fancy autocomplete and it's ok. What's not ok - is that …"
- ytc_Ugwl8bLjp…: "I warn all of you AI will be not good for humanity. We build up our own killers…"
- ytc_UgwBLWZOi…: "Now just imagine like in i.robot, a robot with a quantum computer for a process…"
- rdc_o50gdg6: "This smacks of marketing BS. "Bioterrorism" isn't something an LLM is even remot…"
- ytr_UgybrDbv8…: "We have no other choice than continue with A.I. otherwise we can’t progress as a…"
- ytr_Ugyb7vJPr…: "I think the biggest problrm is the ai companies going all "Don't think for yours…"
- ytc_UgzsbLScZ…: "I use ai as a creative tool and the enjoyment I get from doing this to express m…"
Comment (youtube, 2025-11-29T23:3…):

> I would genuinely feel safer if every car on the road was an automated Tesla compared to actual human drivers in a city like Los Angeles. You should obviously have the option to take over control in any unplanned situation... but these cars are genuinely so much more reliable and intelligent than the basic human driver. (Plus, if the majority of cars were autodriving, then they would all be talking to each other, making collisions less and less likely when there's fewer and fewer human drivers doing random stupid things.)
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgzFpLDVB1cpNavKLz94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwvQTI-CllD4WMUzvJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXv79hkiNPKergW2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwYLFpxg7CmAyvAWKV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy5eBqpoaD5kber8OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVs97RCpw9RjWhBxx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwz-_EWCT_vfJoZRLt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwRsy_sMbzzxrEz61d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwkipaSXGRIlwqdg6Z4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx3J-dOjBT46x7JfNB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}]
```
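A raw response like the one above can be parsed into per-comment codes with a short validation step. The sketch below is a minimal example, not the tool's actual ingestion code; the allowed values are only those observed in this page's samples, and the real code book may include more.

```python
import json

# Vocabulary per dimension, assumed from the values visible in this batch;
# the actual coding scheme may define additional labels.
ALLOWED = {
    "responsibility": {"none", "developer", "user", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "mixed", "fear", "outrage"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments) into
    a mapping from comment ID to its four dimension values.

    Raises ValueError on a missing ID, a missing dimension, or a value
    outside the known vocabulary, so bad model output fails loudly.
    """
    coded = {}
    for record in json.loads(raw):
        cid = record.get("id")
        if not cid:
            raise ValueError(f"record without id: {record!r}")
        dims = {}
        for dim, vocab in ALLOWED.items():
            value = record.get(dim)
            if value not in vocab:
                raise ValueError(f"{cid}: unexpected {dim} value {value!r}")
            dims[dim] = value
        coded[cid] = dims
    return coded
```

Keeping validation separate from lookup means an out-of-vocabulary label surfaces at ingestion time rather than silently skewing downstream counts.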