Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- I wonder if in the future, AI will be so smart it thinks about being human.… (ytc_Ugy2zOkwa…)
- While robots and artificial intelligence can perform many tasks efficiently, the… (ytr_Ugz769PzB…)
- Shouldn't that really read "Google's Anti-Bullying AI Misidentifies Incivility" … (rdc_dluyjs3)
- Depends which religious system controls the ai programming doesnt it. Then th… (ytc_Ugwe2dNic…)
- User: Hey chatgpt, do <programming task>. LLM: (produce this code that doesn't w… (ytc_UgzzwpSbA…)
- Predictive policing sounds like bull s##t science fiction at best and completely… (ytc_UgyajQlei…)
- "Don't cry little AI 'artist', Uncle Computer got enough AI slop to feed you you… (ytr_UgwcHMjeC…)
- Having knowledge doesn't automatically make a person smart. Newton, Tesla, and E… (ytc_UgzpdFFUk…)
Comment
Abundance requires demand. That thing you have in your right hand was invented 140 years ago. We have typewriters, recorders, word processors, PIMs, etc., yet none of them replaced the ballpoint pen. I use one almost every day.
I'd like to travel instantaneously without turning into a pile of goo like a certain movie (I won't spoil it by naming it). I'd like a pill that keeps me from aging. I'd like a lot of things that aren't feasible now. But having a different way to quickly sketch something or jot something down is not on my radar. And it's the same for a ton of current products. Yes, AI will propose things we didn't know we wanted, but to say the demand will be there to support abundance is just wishful thinking.
Source: youtube · AI Jobs · 2025-06-27T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgxC2IghGI16Fa1VGXl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwl9cWYx-D-SI2GqqF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyJRq9DAPGCoBoquOV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzZYSFj2ohSSjTVadh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyJwrZgBa_71I3z4ud4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxYagHfkt3p321biSx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxctfSLNOjphJ5DIPN4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwsOUHFtjrt3oFWG2R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwwOyIPTqLrNTcacoN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz-FXD-xuOdVl9jLDV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
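A raw batch response like the one above has to be parsed and validated before the per-comment rows reach the coding table. Below is a minimal sketch of that step. The allowed category sets are assumed from the values visible on this page (the project's real codebook may define more categories), and `parse_raw_response` is a hypothetical helper name, not part of any actual pipeline.

```python
import json

# Allowed values per coding dimension. ASSUMPTION: inferred from the values
# shown in the raw response and the Coding Result table; the full codebook
# may include additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"approval", "outrage", "indifference", "fear"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and keep only rows that validate.

    A row is kept when it is a dict with an "id" field and every coding
    dimension holds a value from SCHEMA; anything else is dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

# Hypothetical input: one well-formed row and one with an out-of-schema value.
raw = '''[
 {"id":"ytc_UgxC2IghGI16Fa1VGXl4AaABAg","responsibility":"none",
  "reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"bad_row","responsibility":"martians",
  "reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''
coded = parse_raw_response(raw)
print(len(coded))  # the out-of-schema "martians" row is dropped
```

Validating against a fixed schema at ingest is what makes a dimension summary like the table above trustworthy: any hallucinated category from the model is rejected rather than silently counted.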