Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples, searchable by comment ID:
- `ytc_UgwfNEZtr…` "Idk but Robot 1 when we get out side eye one person ok? (Robot 1) ok master (Rob…"
- `ytc_Ugzlz6KGi…` "Daproim Africa a company in kenya same as the Sama AI should be documented. Dapr…"
- `ytc_Ugz91_yAn…` "There will be no AGI in 5 years. And AI is currently used as a coverup for job o…"
- `ytc_UgwwFsBS-…` "I was gonna say this shit isn't realistic at all, but then I looked at some Inst…"
- `ytc_UgymuDqPN…` "Humans do the same but we humans have been thought just do what asked. Well ai i…"
- `ytr_UgzOSypYp…` "What skill do AI prompts require? Genuine question here, At least I my spaces, n…"
- `ytc_Ugza6i8wT…` "One of the most stupid things Ive ever seen Human fighting robot And he got his …"
- `ytc_Ugx-gkcUY…` "What is a human programmed to do? Each of us has a choice. I can tell you if we …"
Comment
Technology Reflects Us — The Ghosts We Build Into the Machine
Thank goodness for this episode — Karen Hao cuts right to the core of tech's most pressing issues. "Technology is a product of human choices" — a powerful reminder that if our values or foundations as individuals are flawed, those flaws will inevitably be reflected in the technologies we create. It's alarming how often this truth is ignored or downplayed by the tech elites shaping the industry. The eagerness to automate everything — often without fully understanding the problem or questioning whether automation is even appropriate — reflects not only a lack of maturity but, at times, a troubling disregard for the human consequences.
youtube | Cross-Cultural | 2025-06-29T16:3… | ♥ 73
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwM2_DLwCYvoUoy7kt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwHan2GPMhLnzI-Y6F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwUz4VsoT8wraR645t4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxGll7S1sLcqWUDzIV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugwj3AFVNCaH4BX5CBF4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwHzuY1I4K3ugvXbVp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRSMEyqh9oadRF6ax4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzdHsWrC3roeihu_HF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxyfMVqtcskP03kXOt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxlhz04NETakikw5jt4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]
```
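A raw response like the one above is straightforward to check before storing: parse the JSON array, validate each record against the coding vocabulary, and index the records by comment ID for lookup. The sketch below is a minimal, hedged illustration; the allowed value sets are inferred only from the codes visible on this page (the full codebook may include more categories), and `ytc_x` is a hypothetical ID used for the demo.

```python
import json

# Code vocabularies inferred from values visible on this page; the real
# codebook may define additional categories (an assumption).
ALLOWED = {
    "responsibility": {"none", "company", "developer", "government",
                       "distributed", "unclear"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "virtue", "mixed"},
    "policy": {"none", "unclear", "regulate", "ban"},
    "emotion": {"approval", "outrage", "mixed", "indifference", "fear"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into an id -> coding lookup table.

    Raises ValueError if any record carries a value outside the
    known vocabulary, so bad generations fail loudly instead of
    silently entering the dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(
                    f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}"
                )
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# Demo with a hypothetical comment ID.
raw = ('[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue",'
       '"policy":"none","emotion":"approval"}]')
codings = parse_codings(raw)
print(codings["ytc_x"]["reasoning"])  # virtue
```

Rejecting out-of-vocabulary values at parse time is what makes the "look up by comment ID" view trustworthy: every record that reaches the lookup table is guaranteed to use only known codes.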