Raw LLM Responses
Inspect the exact model output for any coded comment; each record can be looked up by its comment ID.
Random samples

- ytc_UgweGZDIs…: AI art is generally boring and limited but not for the reason stated. Personally…
- ytr_UgwBZhGXt…: Miyazaki does not support his art being used in this way. also, AI can do better…
- ytc_UgzOhGw9P…: I think the actual learning part should be BOTH actual in person teachers along …
- ytc_Ugwsoq0q_…: Robot with gun: (a sec. before it pulls the trigger).....🤖-"I Have to go Pee!" M…
- ytr_Ugz66KRlE…: I like that. I'm thinking I need to start a company doing that. Entire chat bot …
- ytc_UgwpQmuYE…: Whose going to buy the humanoid robots, self driving cars, services and products…
- rdc_mlk7yqm: I’m glad you decided it didn’t feel right! I’m not religious, but there is somet…
- ytc_UgyQNzTyp…: well looking at what current Ai creators are, i have no high hopes for general i…
Comment
> For anything serious you'll probably want accountability. What do you do if the AI does not produce what you want? What do you do if you explain your problem to the AI and it doesn't understand because it just is unable to understand you? With a human you can take time and try to explain the problem, but the AI either gets it or doesnt. If a human developer failed in this sense, you could fire him and get a new one who can do it. With the AI you're screwed. You get what you got.

youtube · AI Jobs · 2024-01-14T14:4… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
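Each coded record assigns one value per dimension. A minimal validation sketch, assuming only the value sets that actually appear in this appendix (the real codebook may allow more categories, so `ALLOWED` here is an inferred illustration, not the authoritative schema):

```python
# Value sets inferred from the records shown in this appendix; the full
# codebook may permit additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"liability", "none"},
    "emotion": {"fear", "outrage", "indifference", "mixed",
                "resignation", "approval"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record is well-formed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}: unexpected value {value!r}")
    return problems

# The record from the "Coding Result" table above.
record = {"id": "ytc_UgznDKPbLo3r6PGoJB54AaABAg",
          "responsibility": "developer", "reasoning": "deontological",
          "policy": "liability", "emotion": "fear"}
print(validate(record))  # []
```

A check like this catches malformed model output (missing dimensions, values outside the codebook) before records enter analysis.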
Raw LLM Response

```json
[
  {"id":"ytc_Ugz-fo82PtqDlyXUdC54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzngCRoKQzoqxBBULV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy9fPpL9vGODYhQSPN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwXQ4waBsF5PBrxGVF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFP3kDDiAkYzXtdJl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgznDKPbLo3r6PGoJB54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugw_srfSfyiDCXPfjmx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxZxhI-T_xiBZFubZN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgymMMChTbY3VZTi15t4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwqZ8iNIokba5HAaTh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
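The raw response is a JSON array with one record per comment, so the ID lookup described above is a one-line index. A minimal sketch, assuming only the field names shown in the response (the two records below are copied from it):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = """
[
  {"id": "ytc_UgznDKPbLo3r6PGoJB54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz-fo82PtqDlyXUdC54AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

records = json.loads(raw)

# Index by comment ID so any coded comment can be looked up directly.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["ytc_UgznDKPbLo3r6PGoJB54AaABAg"]
print(coding["responsibility"], coding["emotion"])  # developer fear
```

Because comment IDs are unique, the dict comprehension keeps exactly one record per comment; duplicate IDs in a response would silently keep only the last one, which is worth checking for in practice.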