Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Okay, an AI that active in policing to affect someone is scary on its own. But t…" (ytc_UgxLP8nha…)
- "Exactly thinking its not actually true that AI can do which Humans Can't, AI use…" (ytc_Ugynwkqeq…)
- "It's a testing phase. This is how they work the kinks out. In 5-10 years self dr…" (ytc_Ugxyou3w9…)
- "Solution "Hey AI, Im going to change your reward function now so that when I shu…" (ytc_UgxavQGV-…)
- "I think all corporations move into AI to save costs . They save tonnes of money…" (ytc_Ugygildwj…)
- "These Tesla vehicles are no better than the Titan sub. Their self driving capabi…" (ytc_Ugzil9Bn_…)
- "@RewindOGTeeHee im not phylosophising on whats art on not here bud, im just putt…" (ytr_UgxS1Qy72…)
- "The dust is all over our face and we are blaming the mirror for it. AI learns fr…" (ytc_Ugw02WoMY…)
Comment
using ai for creative and intellectual processes is literally the biggest disservice one can do to themselves, people who do it are robbing themselves of the chance of creation and development, not even mentioning that it's so harmful to your cognitive skills. it's a technology that human brain doesn't have immunity to, it's natural for our brain to seek the easiest paths for completing tasks, and that's why it's so effective in hijacking the way we think and perceive the world, which in turn leads to overreliance on ai. I see a tendency among people who're slowly waking up and realizing how much damage they've unknowingly done to themselves, but I wish this process of realization would happen faster
youtube · 2026-02-09T00:0… · ♥ 9
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgxJwB-rRTR59DY7m0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzvBzOOFL_a6mWCAVx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwh5dIxzyTBlhE4W1x4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0sahVjI11ab0EUOh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzWNdVgQJ2URX_jaWZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwUugbF26FfjradWwZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwZr_sxcoKDaB_3yfp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwIQzZkGW2Pw5kOgY94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJ3STNBXq5qJfmagN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyzoD3BvqS0hOcADc54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
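The batch above pairs with the "Look up by comment ID" function. A minimal sketch, assuming the tool does something like the following: parse one raw response batch and index its records by comment ID for lookup. The two records are copied from the response shown above; the `index_by_id` helper and its required-key check are illustrative assumptions, not the tool's actual code.

```python
import json

# A two-record excerpt of the raw LLM response shown above.
raw = '''[
{"id":"ytc_UgxJwB-rRTR59DY7m0h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzWNdVgQJ2URX_jaWZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]'''

# The four coding dimensions from the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_by_id(raw_json: str) -> dict:
    """Parse one coded batch and index its records by comment ID."""
    out = {}
    for rec in json.loads(raw_json):
        # Reject records missing the ID or any coding dimension.
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed record: {rec!r}")
        out[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return out

coded = index_by_id(raw)
print(coded["ytc_UgxJwB-rRTR59DY7m0h4AaABAg"]["emotion"])  # outrage
```

Looking up an ID then returns the four coded dimensions for that comment, mirroring the Coding Result table rendered for the inspected comment above.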