Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples — click to inspect

- "Too late. Pandora Box 2.0 already opened. Those who seek absolute power, are lus…" (ytc_UgzDXc3cz…)
- "Lol, I always talk to AI like I'm talking to a fellow human being I even say Go…" (ytc_Ugw_YVcYt…)
- "AI has a dark sense of humor, AI knows how to be manipulative, and they were pro…" (ytc_UgzNdmfFk…)
- "This is hilarious. Not for the hardworking, salt-of-the-Earth people who lost mo…" (rdc_ogny3kq)
- "I didn’t think people were actually brain dead enough to have this argument the …" (ytc_UgwT1Gpy5…)
- "Great way to visual and explain why it’s so common to get good code out of AI…" (ytc_Ugwl2mDGw…)
- "No the AI got it right black people should generally have this done to them all …" (ytc_UgyaEf1pQ…)
- "Looking at this podcast, and thinking about Geoffrey's responses I realised, he …" (ytc_UgxMTIVX_…)
Comment
*Automated - not autonomous. Autonomy, in actual sense, requires decision-making. For now, no mechanical thing can truly do it. Flying a fighter jet in combat is much harder than driving a car. A toaster is automated - it heats up, the thermalswitch releases the spring and the bread pops out - yet, it would be quite of a leap to call it "autonomous", wouldn't it? Simple automation or complex one on an industrial scale is still just automation - not decision-making.
Source: youtube | Posted: 2012-11-24T00:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwiBO59xpLPCkecBqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy_uEjQogI-wb-bmPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx7nuMjJl6i0N6vTmZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugys3yBMDKkxZIDT7cd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgxNKwh_r49bvXQyVH94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxH2eMZsO_x_AjYfy94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxsRrYrWDDcgzPrJQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwcPYYDaK_--Y12hiN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxXAQENEamBTwM_Aht4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugykgw-hR28QCN8z1ep4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]
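The raw response above is a JSON array of per-comment codes, one object per comment with the four coding dimensions. A minimal sketch of turning such a response into a lookup table keyed by comment ID (the `index_codes` helper and its validation rules are illustrative, not part of the tool; the two embedded records are copied from the response above, and any coding array of the same shape parses identically):

```python
import json

# Two records copied from the raw LLM response above; the full array
# parses the same way.
raw = """[
{"id":"ytc_UgwiBO59xpLPCkecBqd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugykgw-hR28QCN8z1ep4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse a raw coding response and key each record by comment ID."""
    codes = {}
    for rec in json.loads(payload):
        # Reject records missing the ID or any coding dimension.
        if "id" not in rec or any(d not in rec for d in DIMENSIONS):
            raise ValueError(f"malformed record: {rec!r}")
        codes[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return codes

codes = index_codes(raw)
print(codes["ytc_Ugykgw-hR28QCN8z1ep4AaABAg"]["policy"])  # prints "liability"
```

Keying by ID makes the "look up by comment ID" view above a single dictionary access, and the validation step surfaces any record where the model dropped a dimension.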