Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "This is bull shit I am a dev currently using AI in my job. Today only I complete…" (ytc_UgwPTLBx8…)
- "hey, maybe instead of trying to align AI with human values, or in other words pu…" (ytc_UgxlTPCsc…)
- "I heard that upper management all dislike AI in my company even if it saves them…" (ytc_Ugzmgpumh…)
- "This dude Jason is a hypocrite. The only way I would have accept him doing this …" (ytc_Ugz3tb2F5…)
- "Not even NVIDIA in the long run if things go that way / Eventually no one needs t…" (rdc_ohwklpq)
- "Democratizing knowledge? I don’t think so, this is just piling up money on peopl…" (ytc_UgwTYOR98…)
- "Soon AI will be able to fool AI that it is a human basically beating the Turning…" (ytc_Ugx1WGblq…)
- "AIs backbone is energy & cpu power, no energy - no AI. So whats all the hype abo…" (ytc_Ugy39gwK0…)
Comment
> People will still be needed because AI ultimately has no clue what it’s doing which is why occasionally AI makes literally epic mistakes that no human would make. And work will still pile up. Our organisation makes a certain amount of work. But if each of us could make 5x more we’d definitely choose to do 25x more work and work force would still stay the same.
youtube · Cross-Cultural · 2025-10-06T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgwJJtJc2d5QmFHwiMx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8ELXlRA6b3sf8Q1l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwuMoz6IW907w2KIM14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyocWHgULI0ftcH3Wx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwv-Mu33flRgqOJ-AR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxrj-7O9cGtJLihZaR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzUbrYprUADRDjoDsh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzWe6sicbNfMT_-lXN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxX_kXN49iIgCj7CMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgwWFt47WYSpFSbP3oJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
```
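A raw response like the one above can be checked before its codes are stored. The sketch below parses the JSON array and verifies that every record carries the four coding dimensions from the table (responsibility, reasoning, policy, emotion). The allowed value sets are an assumption inferred only from the values visible on this page; the actual codebook may define additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the samples shown
# above -- the real codebook may permit more categories (assumption).
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "indifference",
                "resignation", "unclear"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse one raw LLM response and return a list of problems found."""
    problems = []
    for rec in json.loads(raw):
        rid = rec.get("id", "<missing id>")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)  # None if the dimension is absent
            if value not in allowed:
                problems.append(f"{rid}: bad {dim}={value!r}")
    return problems

raw = ('[{"id":"ytc_Ugwv-Mu33flRgqOJ-AR4AaABAg",'
       '"responsibility":"ai_itself","reasoning":"consequentialist",'
       '"policy":"none","emotion":"approval"}]')
print(validate_batch(raw))  # an empty list means the batch is clean
```

A record that uses an unknown category (or drops a dimension) shows up as one problem string per bad field, which makes it easy to send the batch back for re-coding.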