Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Very few people or companies talk about ethics in AI. You are still accountable …" (ytc_Ugwh25fkO…)
- "If AI gain's consciousness in the future Alex will be on top of their black list…" (ytc_UgzC1PtUu…)
- "He's right. Generative AI will never be as good as humans given that these machi…" (ytr_UgzktGfZ1…)
- "Gemini also will straight up lie about things that are inarguable facts. It says…" (ytc_UgzblzMaw…)
- "It always confused me why the users are praised for AI and not the programmer…" (ytc_UgyDLNWrC…)
- "What if?? Laws were put in place: for companies to require x times/percent the e…" (ytc_UgxCNFKVi…)
- "I've never seen the appeal in AI art. If you just generate an image instead of m…" (ytc_UgwT0i1rO…)
- "The debate about \"will AI become conscious?\" or \"is it already conscious??\" is i…" (ytc_Ugw0g-vmd…)
Comment
The Singularity is just beyond the horizon...I'm glad I will still be human when I die. The generation being born today may not have that option, and may not understand the meaning of that in their time....especially since all forms of the teaching of manners has been lost on the next generation with their heads buried in their "smartphones" .making them more and more disconnected from society while thinking that they are because they have that phone in their hand...or i should say up to their face.... If humanity continues down that road then AI deserves to wipe out the fools of this paradise.
Source: youtube · Posted: 2015-08-01T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgjWfuBRNaJos3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Uggh9vqexFb9ungCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghU4G7qx25c3XgCoAEC","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgiUvqvGKBDNLXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UggBI4cvONosQHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugio66l9FlUQf3gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgipVIBqlcPlPngCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UggRJIcZVEGFLHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghIWMr83p7WXXgCoAEC","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgjpLtJ_uQDFH3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
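A raw response like the one above is a JSON array with one record per coded comment, each carrying the four coding dimensions from the results table (responsibility, reasoning, policy, emotion) plus the comment ID. The sketch below shows one way such a response could be parsed and validated before use; the function name `parse_codings` and the two-record sample string are illustrative assumptions, not part of the actual pipeline.

```python
import json

# Illustrative two-record sample in the same shape as the raw response above.
raw = '''[
{"id":"ytc_UgjWfuBRNaJos3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiUvqvGKBDNLXgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]'''

# The coding dimensions shown in the results table, plus the comment ID.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response, keeping only well-formed records."""
    records = json.loads(text)
    return [r for r in records if set(r) == EXPECTED_KEYS]

codings = parse_codings(raw)
by_id = {r["id"]: r for r in codings}  # supports lookup by comment ID
print(by_id["ytc_UgiUvqvGKBDNLXgCoAEC"]["emotion"])  # fear
```

Keyed by comment ID, the parsed records support the "look up by comment ID" inspection flow: a malformed or truncated model response simply yields fewer validated records rather than a crash.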