Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up its comment ID or by browsing the random samples below.
- `ytc_Ugz4z1hCg…`: "Stop trying to condition people to believe that the false AI invasion is not a E…"
- `ytc_UgxVs4nMF…`: "Hi everyone my opinion and my thoughts or I really reallly are. Worried it woul…"
- `ytc_UgyZBg52I…`: "Kinda makes me wonder if some of those commenters themselves just so happen to b…"
- `ytc_UgzP6ueg0…`: "When he said “the silicon substrate is more energy efficient than the brain” I r…"
- `ytc_UgzUfz4JO…`: "If humans are so worried about AI destroying us then why are they making them? 🤷…"
- `ytc_UgwpfFbLQ…`: "Yes, in the year 2027 on January 12th, Chat GPT will be brought to life as 1 Bil…"
- `ytr_UgzlNbA13…`: "I like to say that a key difference between generative AI models and human learn…"
- `ytr_UgxLFxdBM…`: "Watch videos of people in 3rd world areas trying to scratch out survival on noth…"
Comment
> AI in its current iteration won’t be able to replace people, companies have tried and it’s gone wrong.
> AI hallucinates and makes mistakes. You can point out a mistake, it agrees and repeats the same mistake.
> This will become worse as AI becomes more “intelligent”
> Then you look at the real world impact of data centres and you’ll realise that it’ll harm the environment.
> So long term, there’ll be a move away from AI doing everything. There have been technological revolutions in the past, and people are still working

Source: youtube, 2026-02-06T08:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwZedKEidDHCZeYU_p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyIVPbPm7tsC8MoH_R4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzf1GhXnX2RQTyUlq94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw8ZB7yYMJWmyqE65V4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyttVCyZ7Wn5wAZTep4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugxw_9RIL6dOtx9DyKl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxX_kt03CyhIcjt_914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz6YGkSo1BATxmFzfF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXkJ2-T5k_v7tNzO14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgzMvsRMPiLvHptufXN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
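A batch response like the one above has to be parsed and checked before the codes are stored, since the model can emit malformed records or invent category labels. The sketch below shows one way to do that; the allowed values per dimension are inferred from this sample output and the table above, and the real codebook may define additional categories.

```python
import json

# Allowed codes per dimension, inferred from the sample response above
# (assumption: the actual codebook may include more categories).
SCHEMA = {
    "responsibility": {"company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban", "liability", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only well-formed records
    whose value in every dimension is a known code."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop records with no comment ID
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

# Hypothetical two-record batch: the second uses an unknown code.
raw = (
    '[{"id":"ytc_a","responsibility":"company","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)
coded = validate_batch(raw)
print(len(coded))  # → 1: "robot" is not a known responsibility code
```

Rejected records can then be re-queued for the model rather than silently coerced, which keeps the coded dataset clean.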