Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Not that I agree with the AI artist, but Charlie's counterpoint to AI being the …" (ytc_Ugx2LpR_w…)
- "Damn. They really need to learn the ''work to live, not live to work'' attitude.…" (rdc_dv0ekzr)
- "Art is HUMAN. That’s litterally in the definition. So it’s impossible to AI to m…" (ytc_UgxpBH_Tl…)
- "Humanity seems to work on principles. Certainty, progress, hope for a better fu…" (ytc_Ugy9iuvKu…)
- "There was a big push to do this like 12 years ago, a bunch of green opposition p…" (rdc_ibe8cam)
- "This is also related to the problem of Overfitting, a common machine learning er…" (rdc_e7j0k83)
- "about the AI responding back and getting weird, most likely you influenced that…" (ytc_UgzWp1MMF…)
- "If one robot learns how to kill someone.... it will save in AI cloud and other R…" (ytc_UgzYfIxvW…)
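The "look up by comment ID" flow above can be sketched as a plain dictionary keyed by the platform-prefixed comment ID (`ytc_` for YouTube, `rdc_` for Reddit). The storage shape and function name here are assumptions for illustration; the sample coding values mirror one row of the raw LLM response shown further down in this dump.

```python
# Minimal sketch of "Look up by comment ID", assuming coded results are
# held in an in-memory dict keyed by the platform-prefixed comment ID
# (ytc_ = YouTube, rdc_ = Reddit). Names and storage are illustrative.

coded_comments = {
    "ytc_UgyIc232zuEiwu86G2l4AaABAg": {
        "responsibility": "distributed",
        "reasoning": "consequentialist",
        "policy": "regulate",
        "emotion": "fear",
    },
}

def lookup(comment_id):
    """Return the coding for a comment ID, or None if it is not coded yet."""
    return coded_comments.get(comment_id)
```

A real deployment would back this with the comment store rather than a literal dict; the point is only that the ID is the join key between the sample list, the detail view, and the raw response.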
Comment
The underlying problem is that when AI replaces most human jobs, and humans no longer earn income they cease to be tax payers. Without tax dollars local, state and federal governments will not be able to function. All social programs will end, emergency services will end (police departments) civilized society will end. If you make humans irrelevant, humans will cease to exist.
Platform: youtube · Topic: AI Governance · Posted: 2026-02-19T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxG6f60ZlObElHlfIZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyIc232zuEiwu86G2l4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx8RIZz4E92pviwkpR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwGJXSZZJjCFC8v01Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgytHqr5HIoRG7i3L3Z4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyfSQWnZTGDaM5z3J94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzMBmgz_v8tTP2c2RV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwuAxbSkPRNHLhDC3N4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugz1JlfLfINJ0LgC3aF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwN-KahCxlWMYhlkd14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
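When ingesting a raw batch response like the one above, it is worth validating each row before storing it, since a model can emit malformed or off-vocabulary labels. A minimal sketch follows; the allowed label sets are inferred from the values visible in this dump, not from a documented schema, and the function name is illustrative.

```python
import json

# Hedged sketch: parse one raw batch response into {comment_id: coding}.
# Label vocabularies below are inferred from this sample, not a real schema.
ALLOWED = {
    "responsibility": {"distributed", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"regulate", "liability", "ban", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval"},
}

def parse_batch(raw):
    """Keep only rows that carry an ID and whose labels are all known."""
    results = {}
    for row in json.loads(raw):
        coding = {dim: row.get(dim) for dim in ALLOWED}
        if row.get("id") and all(coding[d] in ALLOWED[d] for d in ALLOWED):
            results[row["id"]] = coding
    return results
```

Rows that fail validation are silently dropped here; a production coder would more likely re-queue them for a retry pass or log them for review.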