Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "All he had to do was read a little science fiction. The perils of AI have been …" (`rdc_m11runj`)
- "I think another strong pillar in any such system MUST be an evidence-based attit…" (`rdc_c2vp3pg`)
- "The fact that they killed him says everything about the people who currently own…" (`ytc_Ugz9RE1s0…`)
- "Only thing that'll happen if the AI bubble doesn't pop is billionaires will beco…" (`ytc_Ugxp5HJ91…`)
- "So what? Once you find out it's AI it just feels fake. 🤷 Fake af doesn't work in…" (`ytc_Ugw8-itAA…`)
- "The best case scenario is the west gets advanced AI, uses it to subjugate those …" (`ytc_UgyPfQ7ix…`)
- "Think EXPONENTIALLY, not LINEARLY. AI and robotics will easily solve so-called p…" (`ytc_UgzQpxHXo…`)
- "Can i ask AI if this interview video is AI created or real Jenssen Huang?…" (`ytc_UgxI4PMVg…`)
Comment

> You watched too many sci-fi movies, mate. An autonomous vehicle can barely drive 30mph on a straight road and stops to re-calculate route on every obstacle detected. Now, imagine THAT in combat! Not to mention that those are experimental machines that couldn't withstand to be abused like the redundant military gear. And, I'll repeat it again, UAVs are NOT autonomous - they DO NOT and CAN NOT make decisions of any kind. And they land "themselves" as much as your microwave "turns off itself".

Source: youtube, posted 2012-11-23T23:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx6FCmr9EFaaoya2Dt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxNYuH_o45JNtq5kbR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzWO5tHzZYAFti4m7t4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy6eCeFDqQBTDRzcWN4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxcvhfuGmb7J6atofB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwh8Muyzn7IBVPdVbp4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugw6OKXZ62XgY3zKeyt4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw_Q2g2u3ixsQEiK_54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugz9V7TYg0BzPMRq6ut4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyZPBybJxcQnaHiRxZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}
]
```
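A raw response like the one above can be turned back into per-comment coding records with a small parser. The sketch below is illustrative, not the tool's actual code, and the allowed label sets are inferred only from the values visible in this page's samples; the real codebook may contain additional labels.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown above.
# ASSUMPTION: these sets are not the tool's authoritative codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "government"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"approval", "indifference", "fear", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of records) into a
    dict keyed by comment ID, validating every dimension label."""
    records = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        # Keep only the known dimensions, dropping any extra keys.
        records[cid] = {dim: rec[dim] for dim in ALLOWED}
    return records

# Minimal usage example with a hypothetical comment ID.
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none","emotion":"fear"}]')
coded = parse_coding_response(raw)
```

Validating at parse time means a malformed or hallucinated label fails loudly with the offending comment ID, rather than silently entering the coded dataset.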