Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "I appreciate the critique! I probably could have gone more in-depth, but my expl…" (ytr_Ugxg1StcQ…)
- "But.. is ANYONE really Satisfied with the AI Editing, Security Measures, or Cust…" (ytc_Ugzq4JuaQ…)
- "Remarkable, that he confuses the simulation of subjective experience, confuses a…" (ytc_UgxnSDrWA…)
- "AI is a mind, not a manual worker at least yet. So, machines can replace all kin…" (ytc_Ugy0z0BR4…)
- "@LrkyLst I know that and that doesn’t help the situation. Almost all car manufac…" (ytr_UgxDKoOSe…)
- "Having to come up with those prompts is a lot of work for me, so I have a differ…" (ytc_UgyIrgHiK…)
- "Right in line with wef's ag-nda that all remaining people to be gmo'd and transh…" (ytc_Ugz5KRBSw…)
- "its not the feible AI system its you don't know how to use AI - AI tells you wha…" (ytc_UgynHcYzJ…)
Comment
Yud makes intuitively persuasive appeals, but they can be countered by making the opposite: "AI can kill us all and take all our stuff", but why not "AI can save us all and give us more stuff"?
Honestly, why not? "Humans are conscious and can have fun", but why not "AI are conscious and can have fun?".
"ASI told to make paperclips will turn us all into paperclips".... Nah brah... Just nah... AI smart enough to turn us all into paperclips, *intuitively*, would be capable of being interested in other things; perhaps the full gamut of emotions we're capable of experiencing... Can AI care? Why not?
youtube · AI Governance · 2024-11-12T05:4… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:53.388235 |
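The coding result above assigns one value per dimension. As a minimal sketch, the record can be validated against the category values that actually appear on this page (the full codebook may define additional values; the sets below are only what is observed here):

```python
# Dimension values observed on this page; the real codebook may be larger.
OBSERVED = {
    "responsibility": {"ai_itself", "developer", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "industry_self", "unclear"},
    "emotion": {"approval", "outrage", "fear", "mixed", "indifference"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record looks valid."""
    problems = []
    for dim, allowed in OBSERVED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in observed set")
    return problems

# The coding result shown in the table above.
coded = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "approval",
}
print(validate(coded))  # → []
```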
Raw LLM Response
```json
[
{"id":"ytc_UgzhlDX1csR8XkjK9iJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx01-JRoygImPi2oB94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-B4NOFCx3uGYQj8l4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy1XS_weEDEdybQWnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzBjl1hpXUD7IOFfKp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwBmtcWIE08QHMHQCd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwXAnBZ0P_QQPV-kR54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy34WB0Kv3W8h45zpx4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyzG9twp3oIzLyBuHp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxJglRewQqd0ucvVEp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
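The raw LLM response is a JSON array of coded comments, each keyed by a comment `id`. A minimal sketch of the "look up by comment ID" step, using one record copied verbatim from the response above:

```python
import json

# One record copied from the raw LLM response shown above.
raw = '''[
 {"id": "ytc_Ugx01-JRoygImPi2oB94AaABAg",
  "responsibility": "ai_itself",
  "reasoning": "consequentialist",
  "policy": "none",
  "emotion": "approval"}
]'''

# Index the array by comment ID so a single coding is an O(1) lookup.
by_id = {row["id"]: row for row in json.loads(raw)}

coding = by_id["ytc_Ugx01-JRoygImPi2oB94AaABAg"]
print(coding["emotion"])  # → approval
```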