Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "Well the American jobs reports of only adding 22,000 jobs is concerning. Of cou…" (`ytc_Ugy7336yT…`)
- "Does anyone even have a clue what China is secretly working on as far as AI?…" (`ytc_UgxeRzO26…`)
- "I do wonder if these megalomaniacs are gunning to broaden their territories in a…" (`rdc_mcs2s0a`)
- "What gets me and everyone knows it some people just don't want to admit it or th…" (`ytc_UgztiyrCb…`)
- "people r stupid this MACHINE can't have a family its a MACHINE and only that wha…" (`ytc_Uggpnfgkk…`)
- "@oxfordbambooshootify yes, but the example here is that Tesla are selling it as …" (`ytr_UgzgjJVEU…`)
- "Elon. AI will never attain empathy. It's a machine. AI is currently failing at t…" (`ytc_Ugzn_MRx8…`)
- "I totally stand with you! >:3 But I also have a question. is this on all social…" (`ytc_UgxtR7Bot…`)
Comment
Will super AGI have intelligent morality because humans certainly don't. If they do then give me an acre of land and a specialized intelligence robot. She would grow organic food on it for me, cook for me, provide me with mind blowing sex, and clean house. I would use AI transportation to take me to my AI boat to go fishing. When I feel for mental stimulation I would let super extreme AGI debate the deep mysteries of the Universe with me. Who needs a job, those are for chumps, I kind of live like the above lifestyle now, and have for the last 30 years, but AI would make the lifestyle more efficient. Now if super AGI gets an ego like Trump and goes Luciferic we could have a problem.
| Field | Value |
|---|---|
| Source | youtube |
| Topic | AI Governance |
| Posted | 2026-04-06T01:2… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxpqCwIQhWRTRDOrEt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwe10-Yadr3zGKpJHl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwV7cVTyhSV9uEZZx94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz-DC1VbOi6wMioEgx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
  {"id":"ytc_Ugy1qoWGmU3mcuhPLap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz6kxMl_JDVcZ6oOzd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy05IOjH71FcdA2Xsd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwNRyvrvXl8IRc7oIR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzxGFEhB9dQ72HBmD14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz-LszmuDkECx7ZA0d4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}
]
```
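The raw response above is a plain JSON array, so looking up the coding for a single comment ID can be done with a few lines of standard-library Python. This is a minimal sketch, not the tool's actual implementation; the `raw_response` string below is a shortened, hypothetical excerpt of the array shown above.

```python
import json

# Hypothetical excerpt of a raw LLM coding response: a JSON array in which
# each object carries one comment ID plus its four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgwV7cVTyhSV9uEZZx94AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugwe10-Yadr3zGKpJHl4AaABAg", "responsibility": "none",
   "reasoning": "mixed", "policy": "none", "emotion": "approval"}
]
"""

# Index the array by comment ID so any coded comment can be inspected directly.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgwV7cVTyhSV9uEZZx94AaABAg"]
print(code["policy"], code["emotion"])  # ban outrage
```

Indexing by ID up front makes each subsequent lookup a constant-time dictionary access, which matches the "look up by comment ID" workflow the page offers.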