Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a coding directly by its comment ID.
Random samples

- "Frank Herbert wrote Dune in the early 1960's. He said publicly that the "Spice" …" (`ytr_Ugw8xFSKd…`)
- "HMM!! really!! At what cost such as subservience to the State providing the sus…" (`ytc_UgyFmmvEo…`)
- "It’s possible to apologies and be incapable of a genuine apology. That’s what c…" (`ytc_UgyCw2kya…`)
- "this is so shortsighted, the result of the AI and robot revolution will be new j…" (`ytc_UgxWa7DuA…`)
- "I don’t remember if it was MPU or another channel that covered OpenAi’s updates …" (`ytc_UgzZhqkNc…`)
- "@Ratoons_wasTakenThe AI does similar, it stores mathematical patters of the ima…" (`ytr_UgwLPBozL…`)
- "This argument assumes the elites will still need or want buyers. If robots are s…" (`ytr_UgzSzgmHJ…`)
- "This video might be the most true one on the AI "art" topic . Thank you.…" (`ytc_Ugyj4KHsU…`)
Comment

> This is the problem in society today. All the socialist hippies of the 1960's and 70's have now become the scientists, professors, politicians and captains in industry. They've gradually poisoned the minds of the GenX and Millennial generations through the education system into thinking that they have to be led by a higher Marxist power. A.I is inevitable, and there will be no stopping it, and it will be the undoing of society. I would not be surprised if mankind ends within 75 years. We humans will only get in the way of progress and will be eliminated. Thankfully I won't be around to witness it.

Source: youtube | Video: AI Moral Status | Posted: 2020-11-09T18:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugy0VbLsC7q0aDNkq3x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw6Oj8IQEcs8QzkKtV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx79b6geBW8jLw2xT14AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugyar8k9Xn8B-en6uB14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugy2JPTH3wQbqJ0nFD94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzHO_tAFIR6_8i_CVl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwMmYHK6JSgXEtvVLN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxI3g4QvaUr5PWmFUN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwubwE_6YTwgEMTjOF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw9POov7-qPDxPfDHB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
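The lookup described above can be sketched in a few lines: parse the raw response (a JSON array of per-comment codings, matching the shape shown here) and index it by comment ID. This is a minimal illustration, not the tool's actual implementation; the single-row payload below reuses one ID and its coded values from the response above.

```python
import json

# Raw LLM response: a JSON array of per-comment codings, in the same
# shape as the response shown above (one row kept for brevity).
raw_response = """
[
  {"id": "ytc_Ugx79b6geBW8jLw2xT14AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "ban", "emotion": "outrage"}
]
"""

# Index the codings by comment ID so any coded comment can be looked up.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up one comment's coding by its ID.
coding = codings["ytc_Ugx79b6geBW8jLw2xT14AaABAg"]
print(coding["emotion"])  # outrage
```

A real pipeline would also validate that each dimension takes one of its allowed values (e.g. `emotion` in outrage/fear/approval/indifference/unclear) before storing the row.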