Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Personally, i use AI as a tool rather than getting the end product itself, i gen…" (ytc_Ugxg0o0ZB…)
- "That allows them to put us in nurseries where we cannot hurt ourselves or others…" (ytr_UgxDpai-S…)
- "Regardless of emotion, a collection of knowledge that can form responses is at t…" (ytc_UgzzvHkHk…)
- "this is so incredibly dumb. If everyone is out of work, who's growing the food? …" (ytc_UgzmB0WeB…)
- "Stop using ai / Look it up / But it has something to do with your drinking wate…" (ytc_UgxvRKmZJ…)
- "Three key billionaires have recently dumped all their shares in AI, particularly…" (ytc_UgxcpcKUQ…)
- "Eh I dunno about that, I just did a bit of reading on the Online Safety Act and …" (rdc_ohyao5p)
- "To be honest, AI "artists" aren't even artists, they're generic prompt writers. …" (ytc_UgxJWgMIi…)
Comment

> Ai is only worth for all the information put into it. Information that has copyright rights and that was and is stolen. If I violate copyright rights, I go to jail. But these companies do it without any restrictions. Ai is a lie. Unfortunately those that put billions on it, like certain countries and companies, don't want to lose money and need to get a return of investment no matter what. Some would definitely kill for it.

Source: youtube · Posted: 2025-03-18T02:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugzt2VJDX89-5OCUsct4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyMi4wJTK_UWQpc9AF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyZetTb6f1V5YyFZ9R4AaABAg","responsibility":"government","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzutm1XDSNSvlbMXjB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgweZmRxDLPdgXkvdkl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzM-tCHD9aGIq-aQnJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwvy_a7lQHB8LEQmAl4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwrmHAHcekTdE9UdJt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwjPkxEDUeuj_pXr6p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz2WLScBLRGbt1Yyo54AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"}
]
```
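A response in this shape can be parsed and checked before the codes are stored. The sketch below is a minimal validator; the allowed values per dimension are inferred only from the sample response above (the full codebook may contain more values, so treat `CODEBOOK` as an assumption, not the project's actual scheme):

```python
import json

# Allowed codes per dimension -- inferred from the sample batch above,
# NOT the authoritative codebook.
CODEBOOK = {
    "responsibility": {"none", "company", "government", "ai_itself", "developer", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "resignation", "fear"},
}

def validate_coded_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)  # raises json.JSONDecodeError on malformed output
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in CODEBOOK.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {value!r}")
    return records

# One record from the batch above, used as a smoke test.
sample = (
    '[{"id":"ytc_UgyMi4wJTK_UWQpc9AF4AaABAg","responsibility":"company",'
    '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]'
)
records = validate_coded_batch(sample)
print(records[0]["emotion"])  # outrage
```

Validating before storage means a hallucinated or truncated batch fails loudly at ingest time rather than silently polluting the coded dataset.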