Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID

Random samples — click to inspect

- `rdc_oi3g3oj`: "Can you write a script to have the AI write a ton of code all day? Are they trac…"
- `ytr_Ugy5BX6_T…`: "Easy solution: Let musicians and other types of artists/creators use whatever …"
- `ytr_UgwvOk_Ju…`: "AI bros will use big words and big talk and still be so insanely stupid and funn…"
- `ytr_UgyMoPixp…`: "@yueguifan As a fellow software engineer, I have to say I can't agree. Even jun…"
- `ytc_UgwZ-KnVD…`: "They are wiring our brains ai knows our thoughts when they are impure we get the…"
- `ytc_Ugzw58j3c…`: "That's got to be to fake u can't put a real human against a machine AI but could…"
- `ytr_UgyS1ahr3…`: "Train em on CCP PROPAGANDA, get one result or FAR RIGHT PROPAGANDA, you get anot…"
- `ytc_UgxPQRvJW…`: "yes, Salim, AI personhood? can AI have person hood without ever having a Mom? …"
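The lookup view above retrieves a coded record by its full comment ID. A minimal sketch of that lookup, assuming coded records are stored as a list of dicts keyed by `"id"` (as in the raw LLM responses shown at the bottom of this page); `find_by_comment_id` and the toy records are illustrative, not the tool's actual implementation:

```python
from typing import Optional

def find_by_comment_id(records: list[dict], comment_id: str) -> Optional[dict]:
    """Return the first coded record whose "id" matches, or None if absent."""
    return next((r for r in records if r.get("id") == comment_id), None)

# Toy records for illustration; real records carry all four coded dimensions.
records = [
    {"id": "ytc_Ugxe3i3-I84L1v6WkTt4AaABAg", "emotion": "unclear"},
    {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "emotion": "fear"},
]
hit = find_by_comment_id(records, "ytc_UgyNe47VIeo0z32NiuJ4AaABAg")
print(hit["emotion"])  # fear
```

A linear scan is fine for small batches; a tool indexing many comments would likely keep a `dict` mapping ID to record instead.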
Comment
Would you consider that the robot troops depicted in the Star Wars prequels were just advanced calculators, or did these fictitious characters exhibit consciousness?
And what is the difference between a robot machine that is less aware and the bio-mechanical "robots" depicted in the Blade Runner movies?
What about a combination of the two, as in some sci-fi stories where biological and synthetic systems are combined?
youtube · AI Governance · 2025-07-16T17:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugxe3i3-I84L1v6WkTt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyNe47VIeo0z32NiuJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw-5BCPmTL8DWyYNYp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz8He1xAsHAo7lJgxp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyK13VTVE6tRJwxfJ54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyE5wX3cFfw6_X6IL94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgyZlF8dluKheFFNkeh4AaABAg","responsibility":"none","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgyeJLaFMWyhNas_XaR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_Ugzz8WYjxBOPN_6YBIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxWXmn27NBJ1sAWi5p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
```
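Each raw response is a JSON array where every record carries the four coded dimensions from the table above (`responsibility`, `reasoning`, `policy`, `emotion`) plus the comment `id`. A minimal sketch of parsing and summarizing such a batch; `parse_coded_batch` is a hypothetical helper (not part of the tool), and the two inline records are abbreviated copies of the first two entries above:

```python
import json
from collections import Counter

# Abbreviated raw coding response: the first two records from the batch above.
raw = '''[
  {"id": "ytc_Ugxe3i3-I84L1v6WkTt4AaABAg", "responsibility": "unclear",
   "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyNe47VIeo0z32NiuJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]'''

REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coded_batch(raw_json: str) -> list[dict]:
    """Parse one raw coding response, keeping only well-formed records."""
    records = json.loads(raw_json)
    return [r for r in records if REQUIRED_KEYS <= r.keys()]

records = parse_coded_batch(raw)
emotions = Counter(r["emotion"] for r in records)
print(dict(emotions))  # {'unclear': 1, 'fear': 1}
```

Dropping records that are missing a key (rather than raising) keeps one malformed entry in a model response from discarding the whole batch; a stricter pipeline might log or re-request those records instead.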