Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Is this the difference between human general intelligence vs artificial intellig… (ytc_UgxXi6OOY…)
- AI should has been created for assistance not replacement, but those greedy CEOs… (ytc_Ugwqr9YyI…)
- Whily is the interviewer smiling so much if humanity is cooked. Does he think he… (ytc_UgzcM-bAG…)
- 26:36 WE are not using it. Some companies are using it. And without a worldwide … (ytc_UgyfEsbhg…)
- Who ever of truck drivers are helping them, I mean this corporation to bring mor… (ytc_Ugy-1-0_k…)
- So, I get it "AI BAD" ... but how many kids have killed themselves over Facebook… (ytc_UgyAttTBE…)
- i can assure you: the difference in quality between AI writing and good human wr… (ytr_UgwiSgV0W…)
- I dont even understand why so many humans seem to think AI will even be a slave … (ytc_UgzdYm604…)
Comment
I think the real question is 'how will you feel if one day AI creates music that you think is perfect?'. e.g. 'take my favourite album and create ten new songs which are just as good'. What if this music costs peanuts, the artist gets suitably compensated and you think it's the best music released in years? New film clips based on the artist's likeness in the prime of their youth, pay per view fake concerts, the list goes on. We can moralise all we want but I don't think anyone knows exactly how they will feel if the content is cheap, high quality and endorsed by their favourite artist.
Source: youtube · Posted: 2025-08-20T07:1… · ♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugzd8TLlWCsdBGXGIhx4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyWg6jqiudqVUHj0Kd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw3MrPx1xiupyZJRWZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgyIaaWC3_rf8GtImFp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugzzrzqy0MGhU5i8f_R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgyTqcXLIlSgyNYoRZF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYl3osU2J8PbJQ-at4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw5Xxt7GMIBkNdzR4Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIj48kMHZrWvFJkut4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyV1RBL2xwa2vkUbRB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
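The coding result above is recovered from the raw response by parsing the model's JSON array and looking up the matching comment ID. A minimal sketch of that step, assuming the response is valid JSON with the field names shown above (`index_by_id` is a hypothetical helper, not part of the actual pipeline; only two of the ten entries are reproduced for brevity):

```python
import json

# Raw LLM response, abridged to two of the entries shown above.
raw_response = '''
[
  {"id": "ytc_UgyIaaWC3_rf8GtImFp4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_Ugzzrzqy0MGhU5i8f_R4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
'''

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and index each coding row by its comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_by_id(raw_response)
coded = codings["ytc_UgyIaaWC3_rf8GtImFp4AaABAg"]
print(coded["policy"], coded["emotion"])  # industry_self approval
```

This is how the "Coding Result" table maps onto the raw output: the selected comment's ID is matched against the `id` field, and the four dimension values are read directly from that row.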