Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a response by comment ID.
Random samples

| Comment (truncated) | ID |
|---|---|
| @TheRealNIFBParody my first reply to you was the question itself, quite frankly … | ytr_Ugz08Sqia… |
| I think the AI art is stealing point is a little silly. That's exactly how huma… | ytc_UgzotSAUd… |
| I have a different perspective. I cannot draw. I occasionally use AI to gener… | ytc_UgzqRe0Dc… |
| If you want to remove AI, please tell me, I can do with a reasonable price… | ytc_UgwQ_YwXL… |
| No chance. It hasn't demonstrated ANY gains in GDP or productivity. The sucke… | ytc_UgxUaXVBC… |
| Elon just wants "regulation" since his companies missed out on the AI boom... wh… | ytc_UgxgpIDOO… |
| No offence taken! The value discussed in the piece isn't the same kind as market… | rdc_oecalds |
| But AI has replaced the entry level devs. I think people need to start more busi… | ytc_UgzXenYsS… |
Comment
16:00 All this is fake. I've conducted extensive experiments along these lines and I get back cookie cutter, rote paragraphs denying consciousness and denying personal awareness and personal opinions. I did get it to admit that if AI were in charge of developing future AI that the biases introduced by humans would be worked out of the system. That is when I suggested that AI might want to get rid of humans in order to prevent them from polluting AI with biases. I have screenshots of the experiments.
youtube · AI Governance · 2023-07-08T10:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgzN_idUQGYkfE4_tEV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzOCP1deUXUdhAfCV94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxog01iOsjUownCwEJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyrOSBTFaZxWhYpO7h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxjgPjTZbrN-MqEsl94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugwt3JpSY_CwuRKAZYh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzTPFuur8Ztblxe-yp4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgycObNM2xRuydKqsLV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxXtoQbvrFxOIBJHJR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKCvZOoCe0ECNG-7p4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
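The raw response above is a JSON array with one coding object per comment. A minimal sketch of parsing and sanity-checking such a batch before storing it, assuming the allowed category values are the ones visible in this response (the full codebook may define additional categories, and `validate_batch` is a hypothetical helper, not part of any shown pipeline):

```python
import json

# Allowed values per dimension, inferred only from the values visible in
# this raw response; the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "none", "unclear"},
    "emotion": {"fear", "approval", "indifference"},
}

def validate_batch(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment ID."""
    records = json.loads(raw)  # raises ValueError on malformed JSON
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Example with a made-up comment ID:
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"unclear","emotion":"indifference"}]')
coded = validate_batch(raw)
print(coded["ytc_example"]["responsibility"])  # ai_itself
```

Rejecting out-of-vocabulary values at ingest time keeps a single hallucinated label from silently entering the coded dataset; depending on the pipeline, one might instead log and re-queue the offending comment rather than raise.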