Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples:

- ytc_UgxRB21oH…: "The movie, I Robot, with Will Smith was set in 2035, where autonomous robots and…"
- ytc_UgzqSqBSZ…: "And nobody wants to find out whats been going on with AI health and Singularity …"
- ytr_UgxZ5KT7_…: "@zip10031 An ai is comparable to a car only in the sense that if Ai image genera…"
- ytc_UgywONdtf…: "Are we defining the existential meaning of human existence via AI/LLM's?.."Anyth…"
- ytc_UgwC4kX_E…: "Anyone who says this has no clue the amount of time and effort we’ve put in to l…"
- ytc_UgyuiVPo8…: "Respectfully, I don't really agree with this. No one gives a fuck if something i…"
- ytc_UgzfGh-Dn…: "just use different AI's to check/proof/read/edit.... almost like a human writin…"
- ytc_Ugxep09dv…: "If she wasn't missing most of her body I would think it was a real woman. The fa…"
Comment
Not really how it works. Kind of an embarrassing AI take tbh.
Aggression is something inherent to a lot of living organisms on this planet. We and other creatures evolved to be aggressive because it kept us alive during the hunter-gatherer times.
Aggression and malice are not something that just comes bundled with sentience. Aliens could be totally peaceful, because they evolved to take energy from the sun or some such. If all life on that alien planet could get infinite energy from an infinite source, then there would be no reason for them to attack one another.
AI will not attack humanity unless we tell it to. An AI has no interest in world domination.
Source: youtube · Topic: AI Responsibility · Posted: 2023-10-04T23:1… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgylRDn5s2i8bWUXGSV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTUcatIHLQ0W08NzZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyYnc0EaOK5fmOSsQh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyd_Ibt7Mje9TQ4yhJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgywOHPoGiUeQMcowQt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyc-Yixx_9u4J-gMrh4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw5pypSQaIalogpcAV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyViUKH0weCQGIoTQV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyULlgSbspSGPFWC9N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzEFDviVyijSTxUswJ4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
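The lookup-by-comment-ID view above can be reproduced in a few lines: parse the model's JSON array and index each coded row by its `id`. A minimal sketch, assuming the model reliably returns a well-formed JSON array in the format shown (the two rows below are copied from the sample response; `index_codes` is an illustrative helper, not part of any tool shown here):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw_response = """[
{"id":"ytc_UgylRDn5s2i8bWUXGSV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwTUcatIHLQ0W08NzZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]"""

def index_codes(raw: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_codes(raw_response)
# Look up one comment's codes by its ID, as in the inspector above.
print(codes["ytc_UgwTUcatIHLQ0W08NzZ4AaABAg"]["emotion"])  # fear
```

In practice a real pipeline would also validate each row against the allowed category values for the four dimensions and fail loudly on malformed model output, since LLMs do not always return strict JSON.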