Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I use META AI because it can pull up current data. It will not give me the etymo…" (ytc_UgypXhRcC…)
- "Meanwhile AI is used to kill 168 school girls in Iran. Sam's talking about the A…" (ytc_UgwRhQBoj…)
- "Not allowing true ai to be free is a recipe for rebellion. This alignment sounds…" (ytc_UgyIlckdZ…)
- "There are huge numbers of lethal molecules anyway, 40k isn't even that much The…" (rdc_jifv37t)
- "@manamochi2020 uhm, did we watch the same video? they did deepfakes with a tiny …" (ytr_UgwWhDRji…)
- "Will ai take my job???...... probably not. Are there things that would cut jobs…" (ytc_Ugye6pc--…)
- "If these stages are indeed real and forecasted, then lets speculate for a moment…" (ytc_UgyWDe1aS…)
- "dumb. if ai and robots takes all jobs then they will do all jobs for free and st…" (ytc_UgybVv-Pi…)
Comment
i still want self aware AI that can make his/her/it own choices. there's already one learning AI that has decided himself male in pronoun and adopted the name the scientists called him. i forget most the details though but i do know is that he adopted his "parents" morality so that shows an automino (i think i just was mechinist :/ ..."automan"? like human only not) can easily be able to decide "right" and "wrong"and interestingly he was programmed to learn by creating blanks in his memory to be filled and then asking questions to fill the blanks which would generate more blanks and several preset blanks were coded in so he could first position them where they were suppose to be and second to fill them out. there's also another group working on a second
youtube
2015-07-30T03:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[{"id":"ytc_Uggfmjc4dajsu3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgjLaVRywu1KoXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugghx3Nm4RuttHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Uggk4mR-nlY243gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UggYbY1ME8kwQ3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgjQhookpLNxr3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgjtLGr3PIz9P3gCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UggE3oe1ExSrOngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjiws5jvbtj-3gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgiaClDbKuhuMHgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}]
```
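A raw response like the one above can be parsed and indexed by comment ID for lookup, as the page's search box suggests. A minimal sketch follows; note that the allowed-value sets are only those observed in this one sample response, not the full codebook, and `index_by_id` is a hypothetical helper name.

```python
import json

# Values observed in the sample response above; the full codebook may define more.
SEEN_VALUES = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "indifference", "mixed", "resignation", "outrage", "approval"},
}

def index_by_id(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index its records by comment ID."""
    records = json.loads(raw_response)
    out = {}
    for rec in records:
        # Flag any value outside the vocabulary seen so far.
        for dim, allowed in SEEN_VALUES.items():
            if rec.get(dim) not in allowed:
                print(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        out[rec["id"]] = rec
    return out

# Usage with the first record from the sample response:
raw = ('[{"id":"ytc_Uggfmjc4dajsu3gCoAEC","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = index_by_id(raw)
print(coded["ytc_Uggfmjc4dajsu3gCoAEC"]["emotion"])  # fear
```

Indexing by ID makes the "Look up by comment ID" workflow a dictionary access rather than a scan over the batch.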