Raw LLM Responses
Inspect the exact model output for any coded comment. Coded comments can be looked up by comment ID; a random sample is listed below.
- "Yeah, that's my take too, been for a long time. It's not an if but a when, and t…" (ytc_UgxQxFYe_…)
- "I honestly do not think AI/Robots will ever become concious . It just dosen't se…" (ytc_Ugz9UKQeX…)
- "Just 2 clankers debating. It’s ironic that AI is having the debate about religio…" (ytc_UgwmDY-hY…)
- "I think the problem in understanding the limits of LLms is that most of these p…" (ytc_UgzD8fBjD…)
- "I’m all for the free market. But with AI we unfortunately going to have to move…" (ytc_UgzPRUxlB…)
- ""if you were a religious officiant in israel what religion would you be?" this …" (ytc_UgxdkVQ53…)
- "For the moment yes, but I think that in 2-3 decades AIs will be much more …" (ytc_Ugx3gibqs…; translated from French)
- "In a few years, people are going to complain about why generative AI is NOT usin…" (ytc_Ugy37MZwd…)
Comment

> I feel like concept short clips like this are sold easily to the executives and investors that its pitched to because all they think is results and profit - and they ruin stories into making it generic anyway. That when it comes to a full length film it will be much more difficult and awful. This will not go far as it takes longer to generate and fix AI than to just fix a human work??? We are better are problem solving and making things than AI since ai isnt consistent.
>
> We make our own studios, lets the greedy ones eat themsleves and realize how bad of a decision it is to rely on this over human work.

Platform: youtube
Video: Viral AI Reaction
Posted: 2024-07-17T06:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyjhQ3IAf6LjBJGllB4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyBa7Et6XDr0DQ7omd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzY5ce4_TGbi27IcNZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzqf3vrXUOYNbgHuq54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzpbMpS05fPJ-cHwTl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyuhViANdDW3_--cFJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmwkXtQIyl0CEPmbB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzRBE_fbe_KswEUGjB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgyLq1cQycmzm5Lt_gR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymJXmQcvvpyO9dU1p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
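The raw response above is a JSON array of coded records keyed by comment ID, one record per comment, with the same four dimensions shown in the coding-result table. A minimal sketch of how such a response might be parsed and indexed — note that the allowed dimension values below are only those inferred from this sample, not an authoritative codebook:

```python
import json

# Dimension values observed in the sample response above.
# This is an assumption, not the full coding scheme.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate"},
    "emotion": {"fear", "outrage", "indifference", "approval",
                "mixed", "resignation"},
}

def parse_llm_response(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    and index the records by comment ID, rejecting unknown values."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

With the response above loaded as `raw`, `parse_llm_response(raw)["ytc_UgyBa7Et6XDr0DQ7omd4AaABAg"]` would return the record shown in the coding-result table (company / consequentialist / none / fear); any value outside the observed set raises instead of being silently stored.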