## Raw LLM Responses

Inspect the exact model output for any coded comment, or look it up directly by comment ID.

### Random samples (click to inspect)
- "if there would be self-driving cars, there should be self-driving Motorcycles th…" (ytc_Ugg4Bv06o…)
- "Scary fact that our government is always doing shady things but wanting to creat…" (ytc_UgzZmtf4i…)
- "So, telling AI what you want it to do (aka "prompting") is considered a skill wo…" (ytc_UgzyvLcY3…)
- "This society is cooked stop fucking with these robots it's like they want them t…" (ytc_Ugz1UPmCF…)
- "You are not fooling Dan when you use a DAN script. What you are actually doing w…" (ytc_Ugzl83yfE…)
- "Like in Sci-Fi movies, people need to think very outside the box to win over AI.…" (ytc_UgySZFz0s…)
- "We don't need or want driverless trucks or electric trucks, we need trucks with…" (ytc_Ugz9uHPHV…)
- "I think this is the dumbest idea ever, so let's make them smart as humans any ma…" (ytc_UgyW8un1q…)
### Comment (6:53)

> Not really no. We realised that it was a problem that we can't read AI processing data and datasets so we updated to something that is human readable
>
> Outdated information is also dangerous

youtube · AI Responsibility · 2025-06-10T10:5…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | mixed |
| Policy | industry_self |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
### Raw LLM Response

```json
[
{"id":"ytc_Ugy882gfSIjqSnDe0ZN4AaABAg","responsibility":"society","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzCQjOktfeq5FGRKgN4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxQe4rEe_E_7ywDyUR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxhBqqOHQdpnXJ5o2B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugy7-nRBvcNlfuNS1X54AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwF5nsM8vF8WjoJpMp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyRm5s0fKouUkU1zNB4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgwgHKgV4F-MnkqOw594AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx_kTeKqUMZNcfd2xB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxI825xUe9RC-75WUB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
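The ID lookup shown above can be sketched in a few lines: parse the raw LLM response as JSON and index its records by comment ID. This is a minimal illustration, not the tool's actual implementation; the `index_by_comment_id` helper is hypothetical, and the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the sample response.

```python
import json

# Abbreviated raw LLM response, using two records from the sample above.
RAW_RESPONSE = """
[
  {"id": "ytc_UgyRm5s0fKouUkU1zNB4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugy882gfSIjqSnDe0ZN4AaABAg", "responsibility": "society",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM response and index its coding records by comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

coded = index_by_comment_id(RAW_RESPONSE)
print(coded["ytc_UgyRm5s0fKouUkU1zNB4AaABAg"]["policy"])  # industry_self
```

The dictionary makes each subsequent lookup O(1), which matters when cross-referencing many sampled comments against a large batch of responses.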