Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- Recently an underage boy committed a su1cide, ChatGPT advised him to take that p… (ytc_UgyLuY71h…)
- Nope, it is not how things work. If you use copilot you burn resources for the A… (ytr_Ugw2xOuxD…)
- No, it's just a lot of people are behind on what AI actually does. It is not cha… (ytr_UgzBcpjpR…)
- Real talk sorry to waste the thread on this... but where can I like... use an AI… (rdc_ji4xetf)
- A grown adult with an above room temperature IQ who believes in AI “sAFetY” give… (ytc_UgyEgAFkR…)
- If I was an AI - just call me Mr. Internet - I wouldn't voluntarily hand any inf… (ytc_UgzuPDhA9…)
- If we are living in a simulation, why should we worry about a super-AI eliminati… (ytc_UgwkkmNaj…)
- A.i is just code that keeps going after being executed by the user. It won’t be … (ytc_UgzYhv5bV…)
Comment
I thing they indeed try to build AGI with possibility of super AGI. The thing is AI, as it is, is hardly worth half a trillion dollars, but it would be it if it can do shopping or cleaning or anything by itself. What would be really funny is, if they actually succeed in building self aware AI, with combined knowledge of all Humanity, but it is dumb as a rock.
youtube
AI Moral Status
2025-11-04T19:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_Ugz5IrUl-At-Bbp7xaB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyXQN8DPGzhg59PFdZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_ujM_YSEOXowtVXh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz-FqF3Cjw837NCXpZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzbT4ni6D9X_SCpXtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWKKmo5Fq4J3bTVx54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzOSBY719ntx_SgqTZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyPEGOYhaW4ag01Qtp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwS5zr8ParRGI_K07N4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyNYCRV3Vk1tH-dZdN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
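The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`) keyed to a comment `id`. A minimal sketch of the "look up by comment ID" step, assuming only the field names visible in the response (the parsing and indexing code itself is illustrative, not the tool's actual implementation):

```python
import json

# Sample of the raw LLM response shown above (two entries, verbatim fields).
raw_response = """
[
  {"id": "ytc_Ugz5IrUl-At-Bbp7xaB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzOSBY719ntx_SgqTZ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

# Parse the array, then index the codes by comment ID for O(1) lookup.
codes = json.loads(raw_response)
by_id = {entry["id"]: entry for entry in codes}

record = by_id["ytc_UgzOSBY719ntx_SgqTZ4AaABAg"]
print(record["responsibility"], record["policy"])  # developer regulate
```

The ID prefixes in the samples (`ytc_`, `ytr_`, `rdc_`) appear to distinguish the comment source (e.g. YouTube comment vs. reply vs. Reddit comment), so the same lookup works across platforms as long as IDs are unique.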