Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "A key differentiator is that art is defined by the creation of something to inst…" (`ytc_UgzDXqeIm…`)
- "Human skin that had their makeup done to look like a robot and maybe bald caps t…" (`ytc_UgxGc0lLE…`)
- "Well so was out here lol in front of my face lol lol like floods lol lol and sto…" (`ytc_Ugya0MzZt…`)
- "Here's the thing: If art is something you like doing, whether AI becomes the nor…" (`ytr_Ugx8B--mj…`)
- "Well, I am scared of what an irrational, extremist, racist, fascistic AI is goin…" (`rdc_n5ljax2`)
- "That is all wrong. AI is here to save us from viruses. But no one would buy th…" (`ytc_UgzHXSRpj…`)
- "Stupid Liberals 🥾😛 keeping those cops boots clean i see. Just remember, when fac…" (`ytr_UgwXFeFKJ…`)
- "Dignity and self worth being tied to your job... it is so bizarre. People that w…" (`ytc_UgzkztFts…`)
Comment

> Huh...he's wrong about scifi. Australian author Joel Shepherd wrote a fun space opera maybe 10 years ago that just happens to make a (now) plausible prediction about what AI super intelligence might look like and do. Developing their own "religious" beliefs is amongst them. Also that its unlikely for such intelligence to have unified, single goals or behaviours, or that we could possibly ever understand or predict any of them.

youtube · AI Governance · 2026-02-18T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxkeNDMXtbPijAttV94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwTPoh-QPj3qPPkkyx4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz1UmXow4W-2336UO14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzbdvGDZ67AohYivOd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxH1Ba9ufiwtUJkPPF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyzVbxuWfYt_2-wmPd4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzyQrLssUQmiXhJhWl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyGqulGF4_qvNvD1d54AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxITGIhQwHBneNT2Cp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgypAAVXd0Tm7mwtt1d4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
```
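The raw response above is a JSON array of per-comment coding records, so looking up one comment's coding by ID is a parse-and-index step. A minimal sketch (the `lookup_coding` helper is hypothetical, not part of the tool; the excerpted records are taken from the response above):

```python
import json

# Excerpt of a raw LLM response: a JSON array of coding records.
raw_response = """[
  {"id": "ytc_UgxkeNDMXtbPijAttV94AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgypAAVXd0Tm7mwtt1d4AaABAg", "responsibility": "developer",
   "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw response and return the coding dict for one comment ID."""
    records = json.loads(raw)
    by_id = {record["id"]: record for record in records}
    return by_id.get(comment_id)  # None if the ID was not coded in this batch

coding = lookup_coding(raw_response, "ytc_UgypAAVXd0Tm7mwtt1d4AaABAg")
print(coding["emotion"])  # → resignation
```

Indexing by `id` once makes repeated lookups O(1), which matters when inspecting many comments against a large batch response.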