Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by ID, or browse the random samples below.
- "All this is due to the potential for a decline in profits in the forthcoming qua…" (rdc_m28grba)
- "Technology IS AND ALWAYS a double-sided sword. Beneficial or not is subject to …" (ytc_Ugwm29lzD…)
- "im convinced that shad is genuinely just jealous of his brother jazza (draw with…" (ytc_UgyVbwKWA…)
- "I think the robot in the middle with the funny hat needs to be sent back to the …" (ytc_Ugx7VSlbS…)
- "I AS AN IT GUY will never trust AI not to kill people out on the road and I will…" (ytc_UgyiHOZv-…)
- "I don't think this guy factors in the phenomenon of escalation. 100 years? Bro w…" (ytc_UgwbnKg6K…)
- "I use ai to write a book to read that book after, we are built different…" (ytc_UgywBFEBR…)
- "You also have to keep in mind, it was humans who wrote the algorithm that establ…" (rdc_e7imrxm)
Comment

> How about just don't give AI the rope to hang us with? Limit memory, limit, reach, don't make it self sustaining. All it would take is the cooling units in the building the AI is housed in to go down and it would burn up. It can't go skynet on you if it can't build it's own copies physically. AI is not the juggernaut we make it out to be as long as we control it, and we can.

Source: youtube · Topic: AI Governance · Posted: 2023-07-07T18:5… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxaKlX6vKwzW1AYkbJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugy64p3829WCbPu6RGx4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_UgyXjgvm0pQ6HBtls6N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz-fntdkApdFUzj4cZ4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzFwnITx6hAr8NBdKB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwvigxNvI-EDQKMAbZ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyGWsM6yCUvbRPn5VV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzSDnKPGDw_p17pOE94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgztWTvn69-fCAsUwQ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzAZGiLH2JamuufDVJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"}
]
```
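A raw response like the one above has to be parsed and sanity-checked before its codes are stored against comment IDs. The following is a minimal sketch of that step; the allowed value sets are inferred from the sample output shown here (the full codebook may define more values), and the function name is hypothetical, not part of any real pipeline shown in this document.

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample response above; the actual codebook may be larger.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue"},
    "policy": {"regulate", "liability", "industry_self", "none"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and keep only well-formed rows.

    A row is kept if it has an "id" and every dimension holds a value
    from the schema; malformed rows are silently dropped.
    """
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"}]'
print(len(parse_coding_response(raw)))  # → 1
```

Dropping malformed rows (rather than raising) keeps a single hallucinated code from failing a whole batch; dropped IDs can simply be re-queued for recoding.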