Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "And that proves what? That you're a liar? That LLMS aren't necessarily the best …" (ytr_UgyECy1Jr…)
- "if there would be self-driving cars, there should be self-driving Motorcycles th…" (ytc_Ugg4Bv06o…)
- "@sgtdrake ahh I understand and as he should be shut down for it since that wasn'…" (ytr_UgzPAjZz-…)
- "lambda AI (chat bot) at google is written to make it pass the Turing test…" (ytc_UgysvRSuu…)
- "If ai can replace human labour, there will be less humans working which means yo…" (ytc_UgxXSCKRf…)
- "@AITube-LiveAI Is this like the old Sophia or a new, improved version? You said …" (ytr_Ugym0GSvR…)
- "@kayla-rg6iw he's just a little too on the fence for my taste for one, but he do…" (ytr_UgxqVD46S…)
- "Sorry, I'm a lawyer and I have to cross examine and basically sweat the f*** out…" (ytc_Ugxgia8bm…)
Comment

> Your hypothetical non-AI army is pretty far removed from the real world in terms of efficiency. I don't disagree with your analysis, but it won't be immediate. It will be tried, tested, vetted, and more backed up by human analysts. If you don't believe me just look at at government run NASA vs civilian Space-X. The similarities are actually very similar. Intelligence agencies work very closely with both, but favor the methodical path when it comes to innovation. It's fast and ahead of the rest of the world, but also very slow at the same time.

Source: youtube · Posted: 2025-02-02T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzhsb53kCm9_WwDYHh4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"sadness"},
{"id":"ytc_UgzkddXpybluUnzuylJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKOfSdEmi7tyG2wFl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwOnfcltR6WsSF-f5R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyipEOO6CAikDEFKm14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGAYs1CbHkuP2PSYp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy8xzTXbHXDxWwhYcJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxfebnzQVt86xBQ6dF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw7Zk9wcRNN6VexxTx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzZiIqJYV037D9_rpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
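A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal, hypothetical validator: the allowed values for each dimension are inferred only from the records shown on this page (the actual codebook may define more categories), and the sample input is truncated to two entries for brevity.

```python
import json

# Two entries copied from the raw response above (truncated for brevity).
raw = '''[
  {"id":"ytc_UgzkddXpybluUnzuylJ4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyipEOO6CAikDEFKm14AaABAg","responsibility":"developer",
   "reasoning":"deontological","policy":"liability","emotion":"fear"}
]'''

# Allowed values inferred from the responses on this page; the real
# codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "approval", "fear", "outrage",
                "sadness", "resignation"},
}

def validate(records):
    """Map each comment ID to its coded dimensions, raising on a
    missing field or a value outside the allowed set."""
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec[dim]  # KeyError if the dimension is missing
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded

coded = validate(json.loads(raw))
print(coded["ytc_UgyipEOO6CAikDEFKm14AaABAg"]["policy"])  # liability
```

Validating against a fixed vocabulary like this catches the common LLM-coding failure mode of invented category labels before they reach the dataset.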