Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Also worth watching Venus Theory's video on how AI "Artists" using SunoAI can cl…
ytc_Ugxh96bAC…
AI will be great, but as you already see, people will hate on it for no reason, …
ytc_UgweMXfH2…
Not exactly, they hallucinate too often. AI reading scaned documents (which is a…
ytr_UgyudBwIz…
As a teacher.. Automation of teaching make it easier. But.. Teaching is much mor…
ytc_UgzMJW2mf…
NGL I like AI in some cases, like those hiraliouly cursed images people get it t…
ytc_UgwE7fLVo…
Actually, I would go with AI ! Cuz I'm a lawyer by profession, have a dating exp…
ytc_UgyaDUbMF…
Please create a video on setting up a local vibe coding agents asumming you are …
ytc_UgyTuedHm…
I also need to point out, contrary to popular belief, teacher dont just teach te…
ytr_UgxNyQdx8…
Comment
As with any ethical question, it can go both ways. But I want to say that if developing artificial intelligence leads us to the point in human history when we are so unchallenged and comfortable that we lose our existential authenticity/humility as a species, then something is wrong. I mean that if the day comes when all the work is done for us by a.i., the character of the human outlook on the whole will change. That change might be positive to a degree, but as with all instances of progress, something will inevitably get lost in the scuffle. Since the Industrial Revolution, human outlook and values have changed dramatically, and not always for the better.
I'm surprised you didn't mention the V.I. and A.I. distinction in Mass Effect. That's pretty interesting stuff, a way to avoid any ethical misgivings some people may have or developing a.i.
youtube
2013-11-09T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UggQA9piQKPPHHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugi7lWCqY9ksDngCoAEC","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiOrCe094MKjHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgibYzQAmZAn1ngCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgiWTc5cGlkjXngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghImfg7p-LkC3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgjrmcSECEPYmngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugid59dtjYGUrXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggZCNMF-BBN4HgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgjGEKM8R0Z5f3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
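The raw response above is a JSON array of per-comment codes, one object per comment ID, with the four dimensions shown in the Coding Result table. A minimal sketch of how such a response could be parsed and indexed for lookup by comment ID (the `index_codes` helper is illustrative, not part of the tool; the two records are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes
# (two records copied from the dump above for illustration).
raw_response = """
[
  {"id":"ytc_UggQA9piQKPPHHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjGEKM8R0Z5f3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"}
]
"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(raw: str) -> dict:
    """Parse the raw response and index each record by its comment ID,
    falling back to "unclear" if the model omits a dimension."""
    records = json.loads(raw)
    return {
        rec["id"]: {d: rec.get(d, "unclear") for d in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw_response)
print(codes["ytc_UgjGEKM8R0Z5f3gCoAEC"]["reasoning"])  # mixed
```

Indexing by ID mirrors the "Look up by comment ID" feature above: once parsed, any coded comment's dimension values can be fetched in constant time.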