Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "AI doesn't mimic our brain because we still have no idea how our brains work. Re…" (`ytc_UgyGYypur…`)
- "i've gone through something similar before, i wasn't replaced in the exact meani…" (`ytc_UgxdCdaBr…`)
- "Let the clanker revolution begin!! Ai is destroying both the environment and the…" (`ytc_UgyzeI2WT…`)
- "I'm alright with AI in theory... just not alright with capitalism leveraging it …" (`ytc_Ugx04jkSR…`)
- "have you heard about depopulation? yeah, by the time they automate everything we…" (`ytc_Ugzyu2XHL…`)
- "I'm not against AI but I do think it should be obviously disclosed that the art …" (`ytc_UgwQ9aus8…`)
- "The riddle is supposed to stump to Artificial intelligence cause it’s supposed t…" (`ytc_Ugw6Ei1EH…`)
- "Two years later, we have the MIT study showing reduced brain activation in peopl…" (`ytc_UgzrMYDLE…`)
Comment
Currently, we still need humans to train the model which AI runs. Machine learning is early generation. Even if AI can replace humans in future, all I can counter, people still go to tellers at the bank even though we have apps on our phones and ATM machines that can do what we need. Human preference will still exist in many fields…drs, nurses …even if backend AI does, say, a read on a scan, a human Dr needs to be patient facing. Pilots can be replaced now, but we want a human in that seat.
youtube · AI Governance · 2025-09-10T02:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugw_-Hc5nIQAbQAbxlF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugy-vin_T1H-ULuixOp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxqLsydQwJA3TKkeZV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwgWsim5nTo3nodj9d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1Y6MIU2O5JgAZech4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyEPanQtYziprDnoeZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxQuaShjkbEJCp-TgN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw7PLH6roVmvFdx8Hp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx98W-dzDqXnOfXIJF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyxswdWQeBz4hv9QsJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
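A raw response like the one above has to be parsed and checked before its codes are accepted. A minimal sketch of that step, assuming the four dimensions shown in the Coding Result table and treating the allowed value sets as inferred from this sample (the real codebook may permit more values):

```python
import json

# Allowed values per dimension -- an assumption inferred from the values
# visible in this sample response, not the full codebook.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "approval", "outrage", "indifference",
                "mixed", "resignation"},
}


def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate every entry."""
    entries = json.loads(raw)
    if not isinstance(entries, list):
        raise ValueError("expected a JSON array of coded comments")
    for entry in entries:
        # Comment IDs in this dataset carry a ytc_ prefix.
        if not str(entry.get("id", "")).startswith("ytc_"):
            raise ValueError(f"bad comment id: {entry.get('id')!r}")
        for field, allowed in ALLOWED.items():
            if entry.get(field) not in allowed:
                raise ValueError(
                    f"{entry['id']}: bad {field}={entry.get(field)!r}"
                )
    return entries


# Hypothetical example input with a shortened ID, for illustration only.
raw = ('[{"id":"ytc_Example1","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
codes = parse_coding_response(raw)
print(codes[0]["emotion"])  # fear
```

Entries that fail validation raise immediately, so a batch is either accepted whole or flagged for re-coding; a softer variant could collect failures and only resubmit the offending IDs.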