Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Chatpt doesn’t “ try” to do anything, it doesn’t know “ try” it’s just an algori… (`ytc_UgyfT_MdO…`)
- This explanation sort of allows transformative AI art usage. AI art is boring as… (`ytc_UgwtpSJ8k…`)
- With AI everywhere. We need money or get rid of bills. Everything should be free… (`ytc_UgxtvuYAx…`)
- We can all live without AI. It is a choice, not a necessary thing. I block it al… (`ytc_UgzOFC8iV…`)
- Ai already convinced the billionaires that was all it needed to do to become uns… (`ytc_UgzhujBYZ…`)
- I mean, the dude does have a point. Remember the banana taped to wall "art" that… (`ytc_Ugya77Gg1…`)
- Displace politicians first. I bet when politicians fear of loosing their jobs, t… (`ytc_UgxY7bWnH…`)
- i want the car thoroughly tested more than it is. I was not fond of the free sel… (`ytc_Ugz8dm3JW…`)
Comment
> robots will be designed in all sorts of shapes and sizes, fit for purpose (so not even the plumbers are safe) however, if none of us have jobs we won't be able to afford whatever the goods or services the bots are providing so its all a bit of a moot point. I think the one thing that might save us goes something like this..... We have created this AI in order for it to serve us and help us solve problems, create things etc. It will take away our reason to live, but because of that, it will have no purpose, hopefully it sees this as a big enough problem to its own existence and it works to optimise our lives, not end them or make them redundant.
youtube · AI Governance · 2025-12-04T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx68Xp57MkZcWQhtMd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw9N9EU2ECr2BBkSoN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgweiMNjLEVvTlitmUJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy3iGtMeTwW_p4xAb54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgweG2tUuUoox0ZcbyF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxfzyIAQ3d4VKtOYR94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzgRpT8ZghF6p5N0L94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgymW6uvFy5iHDwnI-Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxKbY26j5cjDE3pSYF4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyBGf_sUEITCUiU4Hp4AaABAg","responsibility":"user","reasoning":"unclear","policy":"ban","emotion":"outrage"}
]
```
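Each record in the raw response can be parsed and sanity-checked before it is written into the coding table. A minimal Python sketch, assuming the allowed category values are limited to those observed in the responses above (the actual codebook may define more; the `SCHEMA` dict here is an assumption, not the tool's real schema):

```python
import json

# Allowed values per dimension, inferred from observed responses only.
# The real codebook (not shown in this dump) may permit additional values.
SCHEMA = {
    "responsibility": {"none", "ai_itself", "distributed", "developer",
                       "company", "government", "user"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "fear", "mixed", "indifference",
                "outrage", "approval"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM coding response and validate each record."""
    records = json.loads(raw)
    for rec in records:
        if not rec["id"].startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec['id']!r}")
        for dim, allowed in SCHEMA.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec[dim]!r}")
    return records

raw = ('[{"id":"ytc_Ugx68Xp57MkZcWQhtMd4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"resignation"}]')
records = parse_coding_response(raw)
print(records[0]["emotion"])  # resignation
```

Validating before storage catches the common failure mode where the model invents an off-schema label; rejected batches can then be re-prompted rather than silently polluting the coded dataset.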