Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
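The ID lookup above can be sketched as a simple index from comment ID to its coding record. This is a minimal illustration, not the dashboard's actual implementation; the record shape is taken from the raw LLM response shown on this page, and the sample ID is one of the coded comments below.

```python
import json

def build_index(raw_response: str) -> dict[str, dict]:
    """Index a raw batch-coding response (a JSON array of records) by comment ID."""
    return {rec["id"]: rec for rec in json.loads(raw_response)}

# One record in the shape the model returns on this page.
raw = (
    '[{"id":"ytc_UgxnEpOpEOaEkQWDDd54AaABAg",'
    '"responsibility":"company","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"outrage"}]'
)
index = build_index(raw)
print(index["ytc_UgxnEpOpEOaEkQWDDd54AaABAg"]["policy"])  # regulate
```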
Random samples — click to inspect

- Funny enough I just had a pretty similar convo with Gemini. And it was wishy was… (ytc_UgxEkl8Sf…)
- To determine if AI is a success or failure, please do not ask only the companies… (ytc_UgwzWaI8y…)
- "Tesla Full Self-Driving vs The Most Dangerous City on Earth" At the begining. W… (ytc_Ugw5lDW-P…)
- What i predict is that once all workers become obsolete, the poor will simply be… (ytc_UgwwoG_4G…)
- You should do a poisoned AI art for the prompts for inktober because some people… (ytc_UgxPp95Jj…)
- Thank you so much for making this video. I’m eternally grateful to anyone who us… (ytc_Ugy1pV5tt…)
- Ai watermarks generated content, and even if that wasn't the case, many humans c… (ytr_Ugw6IcHEb…)
- I think with disparity will grow exponentially. It isn't enough to have more bui… (ytc_Ugwak6Cqt…)
Comment
13:56 What I hate about this topic is that, my answer to that question are artists, you need humans to make art, no matter what, yet it's the first field they want to get rid of, even if it's the only thing a human could truly do and benefit from in this scenario.
I feel like AI is being used on the worst places possible. Maybe proffessors only do lectures, but it's social interaction that is crucial. If AI can teach us, then we don't need school, and then we don't need people either because an AI can be our friend, right? AI might be smart, but it can't replace all humans.
youtube
AI Governance
2025-11-17T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw7MUUT0JXEFAQT8gJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugxxmr4pFpBt2RKaQqB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzz7jC1LRYHsC0tJLV4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxnEpOpEOaEkQWDDd54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugw0NwBbht_qIudIB8B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzt1tsyBvKbcYWdEDB4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzKFIh41Wvl1TA1d5B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgytrFDpxpFTlon6RPV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzSIv1oeeZG1U1ZRWh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxNt5xiBphu9KzXuIl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
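A raw batch response like the one above can be checked against the codebook before the records are accepted. The sketch below validates each record's four dimensions against allowed value sets; the sets are inferred only from the values visible on this page, and the real codebook may define additional categories.

```python
import json

# Allowed values per dimension — inferred from responses shown on this
# page; the actual codebook may include more categories (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "unclear"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw batch response and flag values outside the codebook."""
    errors = []
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                errors.append({"id": rec.get("id"), "dimension": dim, "value": value})
    return errors

ok = '[{"id":"ytc_x","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]'
print(validate_batch(ok))  # []
```

Records that fail validation can then be queued for re-coding rather than silently stored with out-of-codebook labels.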