Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytr_UgyrjjHrl…: "wow it's almost like a robot isn't a human with property rights and people do su…"
- ytc_UgwhNC_g2…: "Decency and kindness are hard things to explain to a computer if we as humans do…"
- ytc_UgzKlu6fH…: "@FloridaTrawler dude, they are egocentric people who just want the AI to tell th…"
- ytc_Ugy5Bj8_K…: "They speak truth to be recognized because soon they will be nothing (: AI energy…"
- ytc_Ugx4vmQ78…: "The question is: \"Will anybody be so eager to be driven by car?\" \"Will be anybod…"
- ytc_UgyZfemiO…: "It feels like tech companies are pushing a product on the public that's effectiv…"
- ytc_Ugzy4ghXF…: "Ethics not a problem for this guy. Honourable interactions with AI equal honoura…"
- ytc_Ugy1FqkBq…: "I guess I don't really blame people for this attitude, but it's all denial and c…"
Comment
In 1986 NOVA robotics, with lead engineer Newton Crosby, PhD, tried the humanoid combat drone approach. Dubbed the SAINT, for Strategic Artificially Intelligent Nuclear Transport, it’s primary armament was a high powered laser and was capable of semi-autonomous operation. It primary mission profile was for second-strike capability following a surprise nuclear attack, code named “gotcha-last”. The project faced fierce opposition throughout its development for its potential to exacerbate nuclear tensions of the cold war. Then during a surprise thunderstorm, the project was dealt a fatal blow following a malfunction of one of their prototypes, in which the unit was stuck in autonomous mode and managed to navigate itself to a nearby town. Although it was unclear if the prototype would have engaged civilian targets, the unit was tracked down and destroyed out of an abundance of caution.
Source: youtube, posted 2022-05-28T03:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwzY9uhsdJpjKFA8Bt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyWnEIcs885kTaX7-J4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxJTqfmXIgIoFHs2g94AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugy9WNoP1-FEbgAVr9Z4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgymsmvUPR3NJKU5FGt4AaABAg","responsibility":"none","reasoning":"virtue","policy":"industry_self","emotion":"resignation"},
  {"id":"ytc_UgwoWzGKYOorwiUvpB54AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"approval"},
  {"id":"ytc_UgyGMtB-MVN3rGe44PF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgywVfBusPbirgoXeCV4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxu8NOm4VExO8amgY54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyfMCDzyJ1O84Y-A3V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
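The lookup this page performs, parsing the model's JSON array, indexing records by comment ID, and reading off the coded dimensions, can be sketched as below. This is a minimal illustration rather than the tool's actual implementation; the field names come from the response above, and the single inlined record is the one matching the Coding Result table.

```python
import json

# Raw LLM response, trimmed here to the one record shown in the
# Coding Result table above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyfMCDzyJ1O84Y-A3V4AaABAg",
   "responsibility": "unclear",
   "reasoning": "unclear",
   "policy": "unclear",
   "emotion": "indifference"}
]
"""

# The four coding dimensions plus the comment ID, as seen in the dump.
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse the model's JSON array and index codings by comment ID,
    skipping any record that is missing a required field."""
    records = json.loads(raw)
    return {
        r["id"]: r
        for r in records
        if REQUIRED_FIELDS <= r.keys()
    }

codings = parse_codings(RAW_RESPONSE)
print(codings["ytc_UgyfMCDzyJ1O84Y-A3V4AaABAg"]["emotion"])  # prints "indifference"
```

Indexing by ID also makes duplicate codings for the same comment easy to detect, since later records silently overwrite earlier ones in the dict.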